"Data Loss Prevention is not just about having a software or hardware solution," says Benjamin Brooks, Vice President at Beryllium InfoSec Collaborative.
You must, he says, have DLP policies the organization is aware of.
"If you don’t have a process in place, anything goes. If we don’t establish what is right, then anything is right. And when it comes to protecting our organization’s sensitive data, we have ideas about what is right, but those need to be put on paper and promulgated throughout the entire organization."
Brooks shared his insights on how to approach data loss during the SecureWorld web conference, Time Is Up: Rethink DLP, which is available on demand.
At a high level, Brooks says a DLP program should follow a four-step process, which he summarizes with the acronym MDMP: Manage, Discover, Monitor, Protect.
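The MDMP breakdown can be sketched, loosely, as four pipeline stages. This is an illustrative toy, not any real DLP product's API; the class and method names, policies, and file names are all hypothetical:

```python
# Hypothetical sketch of the MDMP cycle (Manage, Discover, Monitor, Protect).
# Every name here is illustrative; no real DLP tooling is assumed.
from dataclasses import dataclass, field


@dataclass
class DLPProgram:
    policies: list = field(default_factory=list)    # Manage: rules on paper
    inventory: dict = field(default_factory=dict)   # Discover: asset -> classification
    events: list = field(default_factory=list)      # Monitor: observed data movement

    def manage(self, policy):
        """Write the policy down and promulgate it."""
        self.policies.append(policy)

    def discover(self, asset, classification):
        """Locate sensitive data and classify it."""
        self.inventory[asset] = classification

    def monitor(self, event):
        """Record how and when data moves."""
        self.events.append(event)

    def protect(self):
        """Flag monitored events that touch a classified asset."""
        return [e for e in self.events
                if any(asset in e for asset in self.inventory)]


program = DLPProgram()
program.manage("No customer PII leaves approved storage")
program.discover("customers.csv", "confidential")
program.monitor("customers.csv uploaded to personal drive")
program.monitor("readme.txt emailed externally")
print(program.protect())  # prints ['customers.csv uploaded to personal drive']
```

The point of the sketch is the ordering Brooks describes: the written policy (Manage) comes first, and protection is only as good as the discovery and monitoring stages that feed it.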
Ryan Manship, CEO of RedTeam Security, says DLP can be difficult to do well. He told the web conference audience this difficulty is especially evident in large, robust environments.
"There’s often not a clear understanding of how the organization wants its data to be managed. Are we standardizing internally on what that looks like?
If we have multiple applications that are getting data from different sources, how are we dealing with it?
And how are we leveraging metadata, what kind of opportunities do we have in place to understand our data and actually get some control around it?"
He adds that it can be difficult to get your organization to fully understand the insider threat concept. Since an insider is already within your security perimeter, you should view your data as potentially porous from the inside out.
Manship also mentioned to web conference moderator Roy Wattanasin that he expects regulation to be a bigger driver of DLP programs and practices in the future.
He also believes data privacy will continue to have a major impact on Data Loss Prevention requirements.
Abhik Mitra of Code42, which provides next-gen DLP solutions, says he regularly hears from clients that traditional or legacy DLP programs are a problem at nearly every turn. This chart essentially sums up the difficulties:
"With Legacy DLP, you generally find yourself having to hire extra folks to administer, manage, and write the policies that drive your DLP solution. And ultimately, DLP blocks employee productivity.
More and more organizations are viewing DLP as a program that impedes the productivity and effectiveness of their employees.
That’s driving a conversation: should I invest in something that slows my business down?"
That conversation is happening at a critical time, because data collection and usage are exploding, and data is no longer restricted to endpoints: it lives in the cloud, on Google Drive, Microsoft OneDrive, and many other places.
These shifts, Mitra says, led Code42 to develop next-gen DLP that eliminates productivity stumbling blocks and creates data loss protection that factors in all of the data and all of the users.
It also helps security leaders answer critical questions at critical times.
"When is that data leaving and how is it leaving? You couple that with a security incident of some sort, and then being able to have those answers, readily, becomes paramount."
Those are the kinds of answers that can help cybersecurity teams and the organization at the same time.
Speaking of answers, if you’re looking for more information on Data Loss Prevention, check out the SecureWorld web conference, Time Is Up: Rethink DLP, which is complimentary and available on demand.
[RELATED: 2019 cybersecurity conference calendar]