The latest News and Information on Data Security including privacy, protection, and encryption.

Credential Theft Protection: Defending Your Organization's Data

Cyber attacks often begin with reconnaissance. Before they launch an attack, threat actors poke and prod at an organization’s defenses, looking for vulnerabilities. If you’ve invested in robust cybersecurity solutions, you may feel you’re protected against that threat. But what if your attackers don’t target your corporate network? What if, instead, they target your employees? And what if your employees don’t even know they’re being targeted?

Pseudonymization vs Anonymization: Key Differences, Benefits, & Examples

When it comes to protecting personally identifiable information (PII), organizations have two main options: pseudonymization and anonymization. Both methods aim to prevent unauthorized disclosure of sensitive PII, but they differ in their implementation, advantages, and regulatory implications. In this blog, we’ll explore the key differences between pseudonymization and anonymization, their benefits, practical examples, and how to choose the best method for your organization’s needs.
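To make the distinction concrete, here is a minimal Python sketch (the key, field names, and bucketing scheme are illustrative assumptions, not a production design): pseudonymization replaces an identifier with a consistent token that a key holder could re-link, while anonymization strips or generalizes the data so no one can link it back.

```python
import hmac
import hashlib

SECRET_KEY = b"demo-key"  # hypothetical key; keep real keys in a KMS


def pseudonymize(value: str) -> str:
    """Replace an identifier with a consistent token (HMAC-SHA256).

    The mapping is re-linkable by anyone holding the key, so the output
    is still personal data under regulations like the GDPR.
    """
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]


def anonymize(record: dict) -> dict:
    """Drop direct identifiers and generalize quasi-identifiers so the
    record can no longer be linked back to an individual."""
    return {
        "age_band": f"{(record['age'] // 10) * 10}s",  # e.g. 34 -> "30s"
        "country": record["country"],
    }


record = {"name": "Ada Lovelace", "age": 34, "country": "UK"}
token = pseudonymize(record["name"])  # same input always yields the same token
anon = anonymize(record)              # irreversibly generalized
```

The regulatory consequence follows from the code: because `pseudonymize` is reversible with the key, pseudonymized data generally remains in scope for privacy law, whereas properly anonymized data may fall outside it.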

How CASB and DLP Work Together to Safeguard Data

Cloud computing has changed the way we work, and mostly for the better. Widely available cloud applications let us create new documents, access our existing files, and communicate with our coworkers from just about anywhere. However, cloud computing has also created new data security and privacy concerns. A comprehensive CASB DLP policy can help address these concerns and keep your organization’s data exactly where it belongs.
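At its core, the DLP half of such a policy is content inspection applied before data leaves for the cloud. The sketch below shows the idea with two illustrative regex rules (the rule names and patterns are assumptions for demonstration; real CASB/DLP products use far richer detectors):

```python
import re

# Hypothetical DLP rules a CASB policy might enforce on a cloud upload.
DLP_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}


def scan_for_sensitive_data(text: str) -> list[str]:
    """Return the names of the DLP rules that the text matches."""
    return [name for name, pattern in DLP_PATTERNS.items() if pattern.search(text)]


def allow_upload(text: str) -> bool:
    """A CASB-style gate: permit the upload only if no DLP rule matches."""
    return not scan_for_sensitive_data(text)
```

In a real deployment the CASB supplies the enforcement point (inline proxy or API integration with the cloud app) while the DLP engine supplies the detection logic; the gate above stands in for both.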

Volatile Data Acquisition on Linux Systems Using fmem

The content of this post is solely the responsibility of the author. LevelBlue does not adopt or endorse any of the views, positions, or information provided by the author in this article.

Memory forensics is a critical aspect of digital forensics, allowing investigators to analyze the volatile memory of a system to uncover evidence of malicious activity, detect hidden malware, and reconstruct system events.

Data Destruction: The Final Line of Defense Against Cyber Attacks

Data is the lifeblood of modern organizations, and while watertight data protection policies are undeniably crucial, the need for robust data destruction methods has never been more pressing. Ultimately, all parties and vendors in your supply chain trust you to maintain the integrity of their data. Once that data is no longer needed, transparency about its whereabouts is vital.
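As a small illustration of the software side of data destruction, here is a hedged Python sketch of an overwrite-then-delete routine. Note the caveat in the docstring: on SSDs and copy-on-write or journaling filesystems, overwriting in place does not guarantee the old blocks are destroyed, which is why mature policies lean on full-disk encryption with key destruction or physical media destruction.

```python
import os
import secrets


def overwrite_and_delete(path: str, passes: int = 3) -> None:
    """Overwrite a file with random bytes before unlinking it.

    Caveat: on SSDs (wear leveling) and copy-on-write/journaling
    filesystems, old blocks may survive this overwrite. Treat this as a
    best-effort sketch, not certified media sanitization.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(secrets.token_bytes(size))  # fill with random data
            f.flush()
            os.fsync(f.fileno())  # push each pass to the device
    os.remove(path)
```

The `fsync` call matters: without it, successive passes may only rewrite the page cache rather than the underlying storage.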

Nightfall's Firewall for AI

From customer service chatbots to enterprise search tools, it’s essential to protect your sensitive data while building or using AI. Enter: Nightfall’s Firewall for AI, which connects seamlessly via APIs and SDKs to detect sensitive data exposure in your AI apps and data pipelines. With Nightfall’s Firewall for AI, you can intercept prompts containing sensitive data before they’re sent to third-party LLMs or included in your training data.
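The interception pattern itself is straightforward. The sketch below is a generic illustration of an "AI firewall" gate, not Nightfall’s actual SDK; the detector names and patterns are assumptions chosen for the example.

```python
import re

# Generic sketch of prompt redaction -- NOT Nightfall's actual API.
SENSITIVE = {
    "api_key": re.compile(r"\bsk-[A-Za-z0-9]{20,}\b"),  # OpenAI-style secret keys
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}


def redact_prompt(prompt: str) -> str:
    """Replace sensitive matches with placeholders before the prompt
    leaves your infrastructure for a third-party LLM."""
    for name, pattern in SENSITIVE.items():
        prompt = pattern.sub(f"[REDACTED:{name}]", prompt)
    return prompt
```

A production firewall would sit in the request path (via API or SDK) and apply the same transformation to both outbound prompts and any data collected for training.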

DFPM and DSPM: Two Steps Towards Modernizing Data Security

Data security is evolving, and that evolution makes understanding what is happening with your data more critical than ever. Teams need to be able to answer questions like: Where is data being stored? Which vendor or team is using it? When is sensitive data being used? Where is data being sent?

DFARS 7012 Class Deviation and NIST 800-171 Rev 3 Guidance for DIBs

NIST 800-171 revision 3 was released on May 14, 2024, prompting DoD to issue an indefinite class deviation for DFARS 252.204-7012, Safeguarding Covered Defense Information and Cyber Incident Reporting (DFARS 7012). US Defense Industrial Base (DIB) contractors must now comply with NIST SP 800-171 revision 2 rather than the version in effect at the time the solicitation is issued, as was previously required.

Is Slack using your data to train their AI models? Here's what you need to know.

AI is everywhere—but how can you be sure that your data isn’t being used to train the AI models that power your favorite SaaS apps like Slack? This topic reached a fever pitch on Hacker News last week, when a flurry of Slack users vented their frustrations about the messaging app’s opaque privacy policy. The main issue?