
Insider Leaks: Why They Happen and How to Shut Them Down

Written by Salvatore Stolfo

Earlier this year, a study by the Ponemon Institute found that all types of insider threats are on the upswing. An insider threat can be an external attacker who has stolen credentials and masquerades as an employee, gaining whatever access that employee has been granted. An internal user can likewise steal or “borrow” another user’s credentials and masquerade as a fellow employee. And an employee who has gone rogue or simply violates security policy, whether using his or her own credentials or someone else’s, is no less a challenge to detect and prevent.

Since 2016, the average number of incidents involving employee or contractor negligence has increased by 26 percent. Further, Ponemon noted a 53 percent increase in incidents involving criminal and malicious insiders, and the average number of credential theft incidents has more than doubled over the past two years, increasing by an alarming 170 percent. Additionally, 90 percent of those who participated in an ESG survey completed in September 2018 reported that managing data security processes and technologies has become more difficult over the past two years, which supports the Ponemon Institute findings.

CISOs and other security professionals need to understand the behaviors of insiders who leak data and use that understanding to inform their strategies for stopping data loss. Some leaks can be attributed to ignorance of security policies or a lack of awareness of the risks of certain behaviors. Other motivators include:

  • Convenience: Behaviors such as sharing a password with a co-worker instead of creating a dedicated account for that individual (exactly how Edward Snowden gained access to NSA files)
  • Productivity: Behaviors such as bypassing a VPN or secure collaboration space because of deadlines that need to be met, traveling for work, etc.
  • Revenge: Behaviors such as the exfiltration of volumes of files by a disgruntled worker who wants to “get back” at the company (the motivation in a high-profile hacking case involving Georgia-Pacific)
  • Financial gain: Behaviors such as sharing sensitive intellectual property for monetary gain
  • Whistleblowing: Behaviors such as an employee who shares sensitive documents with the media to bring to light corruption, poor working conditions, etc.

A majority of employees break corporate cybersecurity rules not out of malice, but because they are unaware they are violating policy, or because they are trying to get their jobs done and find enterprise security controls too cumbersome. However, as the Ponemon Institute report shows, the malicious insider motivated by revenge and the sneaky insider using another employee’s credentials are both becoming greater threats to the enterprise.

Some companies hope to solve this problem by making employees sign non-disclosure agreements. Tesla CEO Elon Musk recently unveiled stricter new confidentiality agreements for his staff to sign after multiple high-profile leaks. Others use security tools such as data encryption or password managers in an attempt to stop employees from leaking information. Google recently blocked remote workers from attending weekly “all hands” meetings in an effort to clamp down on media leaks.

The problem is, these responses and policies can easily be worked around. If an insider leaks information, but the company cannot prove who was responsible, even the strictest NDA is useless. A remote employee who lacks credentials to log into a corporate meeting could easily “borrow” a co-worker’s login and password. And once an insider gains the credentials of another employee with higher security permission levels, they are essentially masquerading as that employee within the corporate systems.

Employees and third-party contractors are regularly given access to critical systems, files, and data to do their jobs. But without visibility into how they interact with that data, namely enterprise documents, it can be difficult to determine whether their activity is putting your organization at risk. In fact, most organizations assume employees and third parties are following policies because they lack the visibility to confirm otherwise. The ESG survey reported that 91 percent of respondents believe all employees follow data security policies and processes, and 86 percent believe the same of third parties.

How can enterprises gain awareness of how confidential data is being handled and who within the company is accessing sensitive corporate data outside of approved policies and processes? This is where Allure Decoy Documents can help: by managing the risks associated with employees and third parties, and by identifying leakers and bringing their bad behaviors to light.

Insider leak story: Snooping to manipulate stock performance

A large enterprise was trying to trace the source of stock tampering and financial fraud, and called in Allure Security to help. The company first noticed confidential information that only employees could know appearing in public sources such as social media. This aroused suspicion of a rogue insider who was illegally benefiting from insider knowledge of an impending acquisition. The insider was accessing press release drafts that hadn’t been made public and leaking the information to affect the company’s market valuation during the M&A process.

To gain a clearer picture of how the information was being accessed and by whom, the company strategically placed patented Allure Decoy Documents, containing compelling information about the target company, in its file shares. Then the security team waited for the insider to start accessing the documents. Sure enough, one of the documents was opened externally at the home of the alleged inside attacker, triggering an alert to the company’s security team. Using the proprietary geofencing and telemetry technology built into Allure Decoy Documents, the security team was able to surface the insider’s identity and provide proof for law enforcement. The FBI took it from there.
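For readers curious about the mechanics behind this kind of deception, the sketch below illustrates the general decoy-document “beacon” pattern: a planted file carries a remote reference that calls home when the document is opened, and a listener records the document ID, the source address, and the time of the open. This is a minimal, hypothetical illustration only; the endpoint, the query parameter, and the network-range “geofence” check are assumptions made for the example and do not describe Allure’s proprietary technology.

    # Hypothetical sketch of the generic decoy-document "beacon" pattern.
    # Endpoint names, fields, and the geofence check are illustrative
    # assumptions, not Allure's implementation.
    import ipaddress
    from datetime import datetime, timezone
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.parse import urlparse, parse_qs

    # Corporate network ranges treated as "inside the geofence" (assumed values).
    ALLOWED_NETWORKS = [ipaddress.ip_network("203.0.113.0/24")]

    def inside_geofence(client_ip: str) -> bool:
        """Crude stand-in for geofencing: is the request from a corporate range?"""
        addr = ipaddress.ip_address(client_ip)
        return any(addr in net for net in ALLOWED_NETWORKS)

    class BeaconHandler(BaseHTTPRequestHandler):
        """Receives the callback a decoy document fires when it is opened."""

        def do_GET(self):
            query = parse_qs(urlparse(self.path).query)
            doc_id = query.get("doc", ["unknown"])[0]
            client_ip = self.client_address[0]
            opened_at = datetime.now(timezone.utc).isoformat()

            if not inside_geofence(client_ip):
                # A decoy opened outside the corporate network is the alert case,
                # analogous to the external open described above.
                print(f"ALERT: decoy {doc_id} opened from {client_ip} at {opened_at}")
            else:
                print(f"INFO: decoy {doc_id} opened internally from {client_ip} at {opened_at}")

            # Return an unremarkable empty response so the open looks normal.
            self.send_response(200)
            self.end_headers()

    if __name__ == "__main__":
        # The decoy document would embed a remote reference such as:
        #   http://beacon.example.com:8080/open?doc=press-release-draft-decoy
        HTTPServer(("0.0.0.0", 8080), BeaconHandler).serve_forever()

In practice, each decoy’s callback can be tied to the specific file share and account that touched it, which is what lets an investigation narrow down who opened what, and from where.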

On average, it takes an organization 72 days to contain an insider threat, according to the Ponemon report. Think about how many documents can be exfiltrated and shared outside the company in that amount of time. Allure Decoy Documents send an alert, shown to be 98 percent accurate in DARPA research, within minutes of a user opening and viewing a document, reducing time-to-detection and helping incident response teams conduct investigations that ultimately reveal the identity of the leaker.

Tags: decoy documents, deception, detection and response, stolen credentials, third-party risk, data leak