+1 (415) 529-5225
info@definisec.com

Defeating Application Data Disclosure

The Parade of Silver Bullets

The assault on Application Data continues with the proliferation of Ransomware, Doxing, Intellectual Property Theft, and related attacks. The past decade, however, has trended toward cloud migration and back-end protection, leaving sizable gaps in Application Data coverage.

During this time, thousands of companies popped up claiming to have a Silver Bullet to stop all woes. These were often newcomers with success in other industries, and it didn’t take long to realize that such success was no substitute for the years needed to master the fundamentals of delivering a Cybersecurity solution.

This created industry burnout, leaving good tech on the sidelines.

A Tool for This, A Tool for That

This dynamic contributed to yet another problem – technique-based tools that focus on very narrow aspects of the greater problem. Today, the corporate security umbrella includes an arsenal of tools spread across teams of dozens of players, each specializing in distinct areas.

While the depth and breadth of corporate security challenges motivate some of this need, haste and market pressures have reinforced the trend. The result is islands of protection that are costly to maintain and administer. Attempts to integrate the pieces into a manageable whole sometimes introduce too much complexity, creating new opportunities for attackers.

Informal and Shadow IT

Though there are real differences between the two, the end result is largely the same. Somewhere, if not in many “somewheres”, there is someone tasked, or self-appointed, to manage enabling services and associated resources. This person isn’t part of the formal IT organization, even when there is one. Perhaps he or she is in a small organization and has volunteered to administer Google Docs for everyone. Maybe he or she is in a large corporation, facilitating constant third-party connectivity and data exchange.

In these cases, systems are administered to get things done, without formality. Activities aren’t necessarily documented, and nobody’s creating and managing change review boards. Things are done on the fly, to solve specific problems and keep others productive. The person administering these facilities understands their peculiar nature, approaching matters differently on odd-numbered Tuesdays after the Winter Solstice, things like that.

If you’re that person, you know exactly what we mean. Sometimes IT doesn’t know about it; other times they fully support it because they don’t have the time or the resources. Either way, it contributes to the segmented nature of today’s protections, even though such “spot protections” can often be quite good.

Sensitive Application Data

None of this helps manage today’s reality: attackers are having their way with corporate data, especially end-user application data. Though the data you use on your desktop every day may not seem all that sensitive, that simply isn’t the case. Corporate documents and email messages carry high-value information that often isn’t perceived as such, even by those authoring and managing the very same content.

In today’s segmented realities, much of this information is at the mercy of host intrusion detection and prevention software, sometimes supplemented with a barrage of other applications that each do a few specific things. And whatever their effectiveness, they are all very good at one thing: slowing host performance to a barely usable crawl.

Disparate Systems, Unmanaged Content

From Captain Obvious: Corporate protection is only useful while content remains within reach of corporate resources and controls. All bets are off when data moves through third-party cloud services or gets shared with partners, vendors, customers, and others. Protection is then at the mercy of those managing the external systems.

This isn’t at all what we’re looking for.

This contributes to a justification that lowers the priority given to application data protection: If content travels to external systems so frequently and so readily, and the proposed solution (or tool, as it more likely would be) maintains control only while data is stored on a company computer, what then is the point of further complicating matters for a tiny incremental amount of control? And even if some continuity can be maintained when third parties deploy the same technology, such dependencies don’t usually work out well.

This makes perfect sense. With limited resources and budget, IT is going to bypass such considerations rather quickly (and probably should).

Whatever the reasons – and there are a lot of good ones – application data and email are, for the most part, barely managed and protected. The most effective approach would protect content at the source, ensuring protection “follows” content as it travels through disparate systems managed (or perhaps not managed at all) by other parties.

Protecting Data at the Source

Desktop data encryption has been ineffective in addressing these gaps. Not only has it been difficult to deploy and maintain, it has also been intrusive to end-users and only marginally suitable, sometimes even against basic threats. Though newer offerings have improved over the years, many haven’t proven justified for one simple reason: basic encryption doesn’t inhibit advanced techniques.

Until recently, advanced techniques were available only to the select few with the technical savvy to combine multiple penetration techniques with 0-days to work their way into almost anything. Given that many systems rely on keeping attackers out of host resources, the game is lost when matched against these adversaries.

The problem is, advanced techniques are no longer in the hands of the few. Today, many of these same techniques are offered as services, available for purchase (or rent), which has driven further expansion of Ransomware, for example. Host-based intrusion detection and prevention doesn’t make the grade, and many forms of data protection also fail rather quickly.

Practitioners know this, and such considerations are met with the same (justified) groans we got during the Parade of Silver Bullets.

Marking Data for Special Handling

But what if you could, in File Explorer, click on a data file or document and choose “Special Handling” from the resulting context menu? Let’s say File Explorer would, in response, display an icon “overlay” to signify that the content would henceforth be handled differently.

And what if, from that point forward, use of marked items required only that end-users respond to a couple of prompts: one at the beginning of each day, perhaps, for identifying credentials, and one specific to each access request that asks the user to insert a USB token and touch a mounted sensor as proof of physical presence…

If we told you that was the “cost” of addressing the gap in host application data protection, whether content resided on company-managed resources or in third-party systems, would it be worth a few minutes of your time to investigate – especially if we said we could show you protection specially designed to work against aggressive, complex threats?
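Conceptually, marking an item for Special Handling need be nothing more than adding it to a managed-content registry that the protection client consults on every access. A toy sketch of that idea (the function names are hypothetical, purely for illustration; the real client integrates with the shell and a persistent store):

```python
# Hypothetical managed-content registry; the real client persists this state.
managed_items: set = set()

def mark_for_special_handling(path: str) -> None:
    """Invoked from the context menu; the shell then shows an icon overlay."""
    managed_items.add(path)

def is_managed(path: str) -> bool:
    """Consulted on each file access, to route I/O through protection."""
    return path in managed_items

mark_for_special_handling(r"C:\docs\plan.docx")
assert is_managed(r"C:\docs\plan.docx")
assert not is_managed(r"C:\docs\notes.txt")   # unmarked files are untouched
```

The point of the sketch is the asymmetry: marking is a one-time, one-click action, while the per-access check happens invisibly, which is what keeps the end-user cost down to the two prompts described above.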

Obviously, third-party access to managed content would require something, though there is a distinct difference between a full-scale, IT-managed technology deployment and a simple endpoint application that coordinates SaaS participation: The latter can be deployed and used by end-users with little training, while the former requires complex interoperability testing and maintenance, roadmap coordination, and assigned central administration resources to maintain the system on a continuous basis.

At the very least, lack of third-party end-user participation denies plaintext disclosure, which is a suitable default condition.

Nevertheless, we understand the past hasn’t left many eager to revisit those of us who claim to have addressed these problems with new means of protecting application data at the source, so let’s talk a little about proper protection and management.

Hint: It goes far beyond simple data encryption, requiring effective encryption that isn’t trivial to get right – and a lot more.

Managing Application Data in Today’s Landscape

Note that we’ve referred to this approach as “managing” rather than “encrypting”. Data encryption alone, in today’s threat landscape, isn’t sufficient to provide suitable protection given the variety of threats. An effective set of controls requires us to, at the very least:

  1. Enforce Access Control for sensitive resources, at all times
  2. Maintain effective data privacy to inhibit unauthorized disclosure, even on a compromised host
  3. Provide Data Integrity assurances when content is accessed
  4. Ensure that data is highly available, so we can access content at all times
  5. Survive host data loss and sabotage, whether through deliberate attacks or hardware theft, loss, or failure
  6. Recover from Ransomware to continue working even when attacks bypass other controls
  7. Store precise usage records that aren’t readily accessible when the host is compromised
  8. Maintain controls no matter where the data travels – within and outside corporate resources

These facilities must be deployed such that:

  1. End-users are minimally bothered
  2. Applications and infrastructure do not preclude the required oversight
  3. Deployment must be straightforward and compatible with a variety of Enterprise Management systems
  4. Administration and maintenance must be straightforward and simple, without the need for constant attention

These requirements aren’t necessarily specific to any one attack, though of course Ransomware is a specific concern, as are Doxing, human error, and malicious insider threats. This set of needs is instead designed to limit the extent of a breach while providing teams the tools necessary to maintain Business Continuity while prioritizing Incident Response and Recovery tasks.

This proposed “solution” doesn’t by any means represent a Silver Bullet, but it’s an important set of steps toward addressing common challenges from a single source. At the very least, it would afford administrators some improved semblance of control even when attacks bypass protective intentions. This factor, by itself, can drastically change the overall cost and impact of a security incident or data breach.

SSProtect and KODiAC

We deliver the aforementioned capabilities with SSProtect, a SaaS system composed of a host application – the :Foundation Client – and KODiAC Cloud Services.

The :Foundation Client is small – 7.5 MB installed – and is responsible for managing application workflow integration, isolating plaintext while sensitive content is used in native application software, and coordinating our patented cryptographic offloading with KODiAC Cloud Services. This is all carried out while ensuring KODiAC cannot access end-user plaintext content.

SSProtect implements a multi-party consent trust model that precludes one-sided surveillance of end-user data. Because KODiAC cannot access your plaintext content, a subpoena of KODiAC decryption keys isn’t sufficient to disclose it either.

The :Foundation Client uses a self-service deployment model that can be automated with most any common Enterprise delivery system. Users can be created through Import from any host computer. SSProtect doesn’t use a browser-based interface, which reduces the cloud’s attack surface; management facilities are instead exposed through context-based UI components within the :Foundation Client.

Deployment is, as a result, very simple, taking an end-user a minute or two, at most, to provision.

Effective Controls

Protections aren’t all created equal. The effectiveness of every technology relies on highly specialized details, which creates opportunity for “convenient” misunderstandings. Though most vendors mean well, some flat out mislead, hand-waving away esoteric details as if they were irrelevant. Purchasing decisions are made on perception, and that can backfire.

For example, many host-based encryption technologies store keys directly on the host computer, which is insufficient when faced with advanced threats. SSProtect was designed, from the start, to inhibit advanced threats while maintaining protection even when a host computer is compromised.

Our protection combines encryption, integrity protection, and access control through cloud-authorized coordination. Our patented cryptographic offloading utilizes multiple keys that are separately isolated until plaintext access is authorized. This implements a multi-party consent trust model that requires an attacker to breach a host computer and KODiAC Cloud Services at the same time to gain access to plaintext materials. As noted, this also provides protection against plaintext disclosure to cloud service components.
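To illustrate the key-splitting principle behind this multi-party consent model (this is a generic secret-sharing sketch, not DefiniSec’s patented implementation), a per-item content key can be divided into a host share and a cloud share such that either share alone is indistinguishable from random noise:

```python
import secrets

def split_key(content_key: bytes):
    """Split a key into two XOR shares; each share alone reveals nothing."""
    host_share = secrets.token_bytes(len(content_key))
    cloud_share = bytes(a ^ b for a, b in zip(content_key, host_share))
    return host_share, cloud_share

def combine_shares(host_share: bytes, cloud_share: bytes) -> bytes:
    """Both shares are required to reconstruct the original key."""
    return bytes(a ^ b for a, b in zip(host_share, cloud_share))

key = secrets.token_bytes(32)               # per-item content key
host, cloud = split_key(key)                # stored on host / in the cloud
assert combine_shares(host, cloud) == key   # plaintext access needs both
```

An attacker who compromises only the host (or only the cloud) holds uniformly random bytes; both systems must be breached at the same time, which is the property the paragraph above describes.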

Attackers can of course breach a host computer, wait for authorized users to work with plaintext, then offload content. The :Foundation Client, however, includes driver-level isolation to ensure that only the managing application can work with plaintext data (rendered through authorization using offloaded 2FA – the USB token noted above). This doesn’t make things impossible for an attacker, but he or she must steal content through means other than simple data copy operations or theft of in-memory keys (the latter being what makes on-the-fly encryption, OTFE, unsuitable). With SSProtect, attackers instead have to handle different types of data by finding weaknesses in application data handling – which of course includes the potential for finding weaknesses in the :Foundation Client as well.

Ultimately, KODiAC records data use activity for every transaction. Attackers cannot cover their tracks, then, unless of course they break into the cloud infrastructure. This means KODiAC holds reliable, precise oversight that can be used to generate Objective Disclosure Risk Reports presenting the “worst-case scenario” after a known breach. If your Enterprise has 10,000 protected items and an attacker gained entry for 6 weeks, that 6-week period would show which of the 10,000 files were accessed. The remainder – say 9,000 items – remain secured, since complete access to all content on a host computer, at any time, is theoretically insufficient to recover plaintext data. Of course, prior use – even managed, protected use – can leave residual plaintext behind by the managing application. This is taken into account in associated Disclosure Risk Reports.
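Conceptually, a worst-case report of this kind reduces to filtering the cloud-held access log by the breach window. A minimal sketch (the log entries and field layout are hypothetical, not KODiAC’s actual schema):

```python
from datetime import datetime, timedelta

# Hypothetical access-log entries: (item, timestamp of authorized access)
access_log = [
    ("budget.xlsx",  datetime(2023, 3, 10)),
    ("roadmap.docx", datetime(2023, 4, 2)),
    ("contract.pdf", datetime(2023, 5, 20)),
]

def disclosure_risk(log, breach_start, breach_end):
    """Items accessed during the breach window: worst-case exposure."""
    return sorted({item for item, ts in log if breach_start <= ts <= breach_end})

breach_start = datetime(2023, 3, 25)
at_risk = disclosure_risk(access_log, breach_start,
                          breach_start + timedelta(weeks=6))
print(at_risk)   # only items accessed within the 6-week window
```

Everything not in `at_risk` was, per the trust model above, never rendered as plaintext during the breach, which is what shrinks the incident from “everything on the host” to a precise, defensible list.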

The point here isn’t to tell you everything SSProtect can do – we’ll cover details in follow-up articles (though we offer some more teasers below). The idea is to note that this integrated system of innovations is specifically designed to minimize the impact of a breach. Breaches will happen – nothing is perfect. Maintaining control and oversight affords the potential to keep end-users productive while Recovering from a breach, and that, as noted, can have a profound impact on the overall cost of events that are today far too common.

Key Protective Measures

As promised, here are some of the protective and management facilities we’ll revisit, in technical detail, in subsequent articles.

Protect Application Data with Native Use
Protected content can be stored virtually anywhere (no “drop” folders) while maintaining application and infrastructure compatibility. To use protected content, one must have the desktop :Foundation Client and a valid KODiAC Cloud Services account. Overall, provisioning takes a minute or two, tops, start to finish. Subsequent use of managed content requires only the optional fine-grained 2FA; SSProtect handles encryption, decryption, and in-use isolation so that attackers lying in wait can’t simply offload plaintext data.

Fine-Grained 2FA With a Physical Presence, Authenticated in the Cloud
Access Control is authenticated and authorized in the cloud using a host-connected USB token that requires a physical touch. This inhibits remote access impersonation, keeping nation-states and other advanced attackers from assuming the desktop user’s identity and using their credentials to carry out authorized actions. Without fine-grained 2FA and a physical presence qualifier, an attacker who gains remote host access tends to get what he or she is after (though protective systems sometimes detect and stop offloading).
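The cloud-authenticated, physical-presence step can be pictured as a challenge-response exchange: the service issues a fresh challenge, and only a token that is physically present (and touched) can answer it. A simplified sketch using an HMAC-style response (real hardware tokens typically use asymmetric attestation; the names here are illustrative only):

```python
import hashlib
import hmac
import secrets

TOKEN_SECRET = secrets.token_bytes(32)   # provisioned into the USB token

def cloud_issue_challenge() -> bytes:
    return secrets.token_bytes(16)       # fresh nonce defeats replay

def token_sign(challenge: bytes, touched: bool):
    """The token responds only after a physical touch on its sensor."""
    if not touched:
        return None                      # remote attackers can't fake presence
    return hmac.new(TOKEN_SECRET, challenge, hashlib.sha256).digest()

def cloud_verify(challenge: bytes, response) -> bool:
    if response is None:
        return False
    expected = hmac.new(TOKEN_SECRET, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

chal = cloud_issue_challenge()
assert cloud_verify(chal, token_sign(chal, touched=True))       # touch present
assert not cloud_verify(chal, token_sign(chal, touched=False))  # no touch, no access
```

The `touched` gate is the crux: stolen credentials alone never satisfy the exchange, because the response can only originate from hardware under the user’s hand.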

Cloud-Stored, Secure Content for On-Demand Restoration
Due to the nature of cryptographic offloading, the cloud can (optionally) maintain a copy of each individual version of a managed entity. Versions are updated only on authorized change, which can require the fine-grained 2FA noted above (it’s optional). As such, Ransomware and other types of data corruption don’t make it into cloud version instances; they instead overwrite host-local encrypted content, which is flagged on next access thanks to built-in Integrity Protection. Version instances can therefore be Restored on-demand by any User authorized to access the given content (owners, i.e. the person protecting the first instance, are authorized by default). Backup/ Restore is thus built directly into the system, in fact using the same codepath as managed access, for reliability. Additional service components find and repair corruption, replicate managed User Archives, or deliver Organization content in a secured format for offline access.
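The interplay between Integrity Protection and on-demand Restoration can be sketched as a version store keyed by content digest: a tampered local copy fails verification on next access, while every cloud-held version remains restorable. (A simplified illustration; the real system operates on encrypted version instances, not plaintext.)

```python
import hashlib

class VersionStore:
    """Cloud-side sketch: each authorized save appends an immutable version."""
    def __init__(self):
        self.versions = []   # list of (sha256 hex digest, content bytes)

    def save(self, content: bytes) -> None:
        self.versions.append((hashlib.sha256(content).hexdigest(), content))

    def verify(self, local_copy: bytes) -> bool:
        """Integrity check: local copy must match the latest saved digest."""
        digest, _ = self.versions[-1]
        return hashlib.sha256(local_copy).hexdigest() == digest

    def restore(self, index: int = -1) -> bytes:
        return self.versions[index][1]

store = VersionStore()
store.save(b"quarterly report v1")
store.save(b"quarterly report v2")

ransomed = b"\x00\x00 encrypted by attacker"       # host-local overwrite
assert not store.verify(ransomed)                  # flagged on next access
assert store.restore() == b"quarterly report v2"   # on-demand restoration
```

Because saves occur only on authorized change, the attacker’s overwrite never becomes a version; it can only damage the host-local copy, which the digest check immediately exposes.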

Automatic Peer Collaboration, Policy-Based Third Party Trusts
Sharing is automatic for members of a managed Organization – a logical collection of users managed together. To share data with external users, Privileged Users adjust Policy to provide explicit permission. This is done “on the fly”, and as such doesn’t require end-users to “encrypt for a target audience” (in fact end-users don’t encrypt data; they mark it for management and the rest is automatic). Third Party Trusts, as they are called, can be adjusted at any time, with immediate effect.

Distributed Central Administration
An oxymoron for sure: Administration is carried out through context-based UI elements in the :Foundation Client. This means Privileged Users can, from any host with the :Foundation Client, log in to their Account and work with Administrative controls to effect change across the Organization they manage. As expected, changes (optionally) require the fine-grained 2FA with physical presence, elevating protection (and auditing changes in the cloud for reporting).

IMPORTANT: SSProtect isn’t a sync and sharing solution: Data is never “automatically” shared with anyone. Once content is marked for SSProtect management, you continue to use all pre-existing and available facilities for such activities. You can and should continue to use file sync and sharing infrastructure, and with SSProtect data isolation, you won’t expose plaintext to the cloud even if you accidentally open a protected file in a “drop” folder: In-use plaintext isolation keeps the sync and sharing stack from updating the cloud until after the file is saved, closed, and re-encrypted (with new keys, automatically). Ultimately, the sync and sharing solution only sees the next protected/ encrypted version instance. This is one way to protect content across disparate sync and sharing solutions in an Enterprise, with a single point of Reporting on usage activity.

Administrator Services and Additional Details

There are many more detailed features, and additional, optional service components for Administrators. These will be topics of further discussion in follow-up articles. A summary of Continuity and Incident Response/ Recovery facilities can be found within the complete list on our public Knowledge Base: Service Content.

Check back with us for technical articles that explain the underlying innovations that bind these facilities together into a single protective system as a Defense In Depth complement to Host Intrusion Detection and Prevention.