
The 3 Major Shortcomings of Traditional DLP

Feb 07 2023

As digital transformation blossoms and cloud adoption accelerates, challenges keep cropping up for traditional DLP solutions.

Setting aside the architectural and operational complexity and high cost that come with traditional DLP, practitioners recognize that existing tools can't keep up. Modern hybrid work practices, coupled with an ever-increasing number of SaaS apps (a 35% increase in the number of apps in use in 2022) and data that no longer sits in an on-premises data center, make it clear that the paradigm has shifted. Legacy DLP solutions, designed for a perimeter-centric world, are quickly going blind.

To find a way forward, let’s take a deeper look at some of the major shortcomings of legacy DLP solutions and how DLP needs to evolve to keep organizations and their sensitive data safe.

Difficulties supporting cloud and hybrid work

Because traditional DLP solutions were architected as on-premises products and anchored by their on-premises infrastructure, they don't naturally extend to cloud channels. DLP vendors initially found a workaround for data discovery in the cloud through clumsy ICAP integrations with CASB solutions, but this workaround introduced the first big architectural limitations, including:

  • Disjointed technological environments 
  • Hard-to-reconcile policies 
  • Different enforcements 
  • Separate consoles 
  • Considerable latency to enforce protections 

Cloud detection services with REST API connectors offered another approach to connect the on-premises DLP solutions and CASB. But this method only patched some of the problems as opposed to providing a real long-term solution.
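The limits of that connector pattern are easier to see in code. The sketch below illustrates the out-of-band flow: poll the SaaS app for changed files, pull each file down to the on-premises DLP engine, and record an incident only after the data has already reached the cloud. All function and field names here are illustrative assumptions, not any vendor's actual API; the callables stand in for what would be REST/ICAP round trips.

```python
# Illustrative sketch of the out-of-band REST connector pattern: the CASB
# polls the SaaS app, routes file content to the on-prem DLP engine, and
# only then can a verdict be recorded -- enforcement is retroactive.
# All names are hypothetical, not any vendor's real API.

def scan_changed_files(list_changed, fetch_content, dlp_scan):
    """Poll for changed files and route each through the on-prem DLP scanner.

    list_changed  -- callable returning metadata for recently changed files
    fetch_content -- callable(file_id) returning the file bytes
    dlp_scan      -- callable(bytes) returning a verdict dict
    """
    incidents = []
    for meta in list_changed():
        verdict = dlp_scan(fetch_content(meta["id"]))
        if verdict.get("sensitive"):
            # The file already reached the cloud before we saw it.
            incidents.append({"file": meta["name"], "policy": verdict.get("policy")})
    return incidents

# Stub integration (in practice these would be REST calls adding real latency):
files = [{"id": 1, "name": "report.docx"}, {"id": 2, "name": "ssn_list.csv"}]
found = scan_changed_files(
    list_changed=lambda: files,
    fetch_content=lambda fid: b"123-45-6789" if fid == 2 else b"ok",
    dlp_scan=lambda data: {"sensitive": b"-" in data, "policy": "PII"},
)
```

Every file makes a round trip from the cloud to the on-premises scanner, which is exactly where the "considerable latency to enforce protections" comes from.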

What's more, the risks to data have only grown as we've entered an increasingly hybrid work environment.

Hybrid work has left organizations highly distributed, with branch offices popping up around the globe as businesses expand. This growth saddles organizations with a sprawling DLP infrastructure tied to on-premises dependencies and hardware components like proxies, databases, and servers. Providing coverage with legacy DLP tools has become a nightmare for many practitioners, as the on-premises architecture must typically be replicated for every branch.

To top all of this off, a legacy approach lacks the appropriate coverage for remote employees who connect directly to corporate resources on-premises and in the cloud, as well as for risky SaaS apps, unmanaged personal BYOD devices that can reach corporate assets, and even IoT devices accessing sensitive data. It also requires that remote users keep their VPN connection on when working outside the office. Without the proper controls and tools in place, organizations put their sensitive data at significant risk of undetected exfiltration.

Unsustainable sprawl and inability to scale

In addition to the difficulties around supporting the cloud and hybrid work, data itself has evolved significantly, booming not just in volume but also in variety and velocity. Sensitive information can be embedded in more unstructured formats, like images and screenshots (often of poor quality), that are stored and shared in the cloud, flow through email messages and asynchronous conversations on collaboration apps like Slack and Teams, or are uploaded to personal instances of corporate SaaS apps (e.g., a personal OneDrive vs. a corporate OneDrive). As sensitive data becomes harder to identify, it becomes harder to protect.
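A tiny example shows why unstructured formats raise the bar. Classic DLP detection is pattern matching over text, as in this sketch of a US Social Security number detector; the same number embedded in a screenshot is invisible to it until an OCR or ML stage extracts text first. The regex and sample string are illustrative only.

```python
import re

# Simple pattern-based detector for US SSNs in plain text. This catches
# structured text, but the same digits inside a screenshot are invisible
# to it -- image-borne data needs OCR/ML before patterns can apply.
SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def find_ssns(text: str) -> list[str]:
    """Return all SSN-shaped substrings found in the text."""
    return SSN_RE.findall(text)

hits = find_ssns("Employee record: SSN 123-45-6789, start date 2023-02-07")
```

Note that the word-boundary anchors keep the date from producing a false match; tuning such details per identifier is a large part of what makes legacy detection brittle at scale.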

In a hybrid world, legacy solutions can't scale at cloud speed and have difficulties keeping up with new use cases, new data privacy legislation, and regulatory requirements. They aren't equipped to ingest and process growing amounts of information or leverage sophisticated machine learning and AI models, at least not without adding more computing power via additional detection servers, larger databases, and voluminous endpoint agents. This approach, besides being very costly, ultimately slows down other computational processes. Therefore, many use cases, such as advanced image recognition, correlation of context-based information across many risk vectors, advanced endpoint-based detection, and fingerprinting of large files and datasets, remain unsolvable for organizations still trying to make legacy DLP solutions work.
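To make the fingerprinting use case concrete, here is a minimal sketch of partial-document matching: hash chunks of a protected file, then measure how many of those chunk hashes reappear verbatim in outbound content. Production DLP engines use rolling or content-defined chunking over very large datasets, which is precisely the compute-heavy part legacy architectures struggle with; the fixed-size chunks and sample data below are simplifying assumptions.

```python
import hashlib

# Minimal sketch of partial-document fingerprinting: hash fixed-size chunks
# of a protected file, then flag outbound content that shares enough chunk
# hashes with the original. Real engines use rolling/content-defined
# chunking; fixed chunks here just keep the idea readable.
CHUNK = 16

def fingerprint(data: bytes) -> set[str]:
    """SHA-256 hash of each fixed-size chunk of the data."""
    return {hashlib.sha256(data[i:i + CHUNK]).hexdigest()
            for i in range(0, len(data), CHUNK)}

def overlap(protected: set[str], outbound: bytes) -> float:
    """Fraction of protected chunks that appear verbatim in outbound data."""
    if not protected:
        return 0.0
    return len(protected & fingerprint(outbound)) / len(protected)

secret = b"customer list: alice, bob, carol, dave, erin, frank"
prints = fingerprint(secret)
# An excerpt of the protected file leaks inside other content:
score = overlap(prints, secret[:32] + b" ...leaked excerpt")
```

Even this toy version hints at the scaling problem: fingerprinting terabytes of files means storing and comparing enormous hash sets, which is why it remains out of reach without cloud-scale compute.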

In addition to this, software updates for traditional DLP solutions are their own nightmare to deal with. These updates notoriously take months or even years and a lot of manual work to go from one version to the next, to say nothing of possible system errors and potential loss of data and configurations. As a result, organizations are often behind on DLP version upgrades and aren't using newer protections (such as newer data identifiers, detection methods, and compliance policies) because of the lengthy, resource-intensive updates they have to go through.

Overwhelming false positives without proper context 

With sensitive data residing in and moving to more environments outside the managed data center network, and the amount of data constantly growing, the number of incidents has reached a point where it is nearly impossible for the incident response team to triage and remediate every incident with the right level of analysis. False positives flood incident response teams with thousands, or even hundreds of thousands, of alerts per day; each demands direct attention, but most have to be overlooked for lack of time and bandwidth. As a result, incident response teams have expanded accordingly, at high cost.

Automation and orchestration tools like UEBA have come to assist, helping ingest alerts and remediate them in bulk in a more optimized way. UEBA is an effective tool in symbiosis with DLP, but the UEBA model alone is not sustainable if DLP grows more inaccurate and its gaps widen. The detection layer itself needs more.
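The bulk-remediation idea can be sketched in a few lines: group raw alerts by user and policy so a flood of near-duplicates collapses into a reviewable queue, with one representative per group. The grouping key and field names are assumptions for illustration; real UEBA products add behavioral baselines and risk scoring on top of this kind of aggregation.

```python
from collections import defaultdict

# Illustrative sketch of bulk triage: group raw DLP alerts by (user, policy)
# and keep one representative per group, so thousands of near-duplicate
# alerts collapse into a short queue. Field names are assumptions.

def bulk_triage(alerts: list[dict]) -> list[dict]:
    """Collapse alerts into one queue entry per (user, policy) pair."""
    groups = defaultdict(list)
    for a in alerts:
        groups[(a["user"], a["policy"])].append(a)
    return [{"user": u, "policy": p, "count": len(g), "sample": g[0]["file"]}
            for (u, p), g in groups.items()]

alerts = [
    {"user": "alice", "policy": "PII", "file": "a.csv"},
    {"user": "alice", "policy": "PII", "file": "b.csv"},
    {"user": "bob",   "policy": "PCI", "file": "c.xlsx"},
]
queue = bulk_triage(alerts)
```

The catch the article points at: if the underlying detections are mostly false positives, aggregation only produces a shorter list of equally noisy groups.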

In an effort to offer better context, DLP must evolve into a fully integrated zero trust data protection platform, able to ingest information from any security source and translate it into actionable policy recommendations and intelligent incident response rules. Continually verifying contextual factors about the user, whether that's device trust, behavioral trust, app trust, or geolocation, allows for an adaptive, precise response when it comes to trusting a user.
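One way to picture that adaptive response is as a context score mapped to a graduated action rather than a binary allow/block. The signals, weights, and thresholds below are illustrative assumptions, not Netskope's actual policy model; the point is that the same user gets different treatment as device, behavior, app, and location context change.

```python
# Hedged sketch of adaptive, context-aware policy: combine trust signals
# about the user into a score, then map the score to a graduated action.
# Signal names, weights, and thresholds are illustrative assumptions.

WEIGHTS = {"device_managed": 3, "behavior_normal": 3,
           "app_sanctioned": 2, "location_expected": 2}  # out of 10

def trust_score(context: dict) -> int:
    """Sum the weights of every trust signal that currently holds."""
    return sum(w for signal, w in WEIGHTS.items() if context.get(signal))

def policy_action(context: dict) -> str:
    score = trust_score(context)
    if score >= 8:
        return "allow"            # high trust: full access
    if score >= 5:
        return "allow_readonly"   # medium trust: block sensitive downloads
    return "block_and_alert"      # low trust: deny and raise an incident

# Normal behavior from an expected location, but unmanaged device, risky app:
risky = {"device_managed": False, "behavior_normal": True,
         "app_sanctioned": False, "location_expected": True}
action = policy_action(risky)
```

Because every request is re-scored, a user who switches to an unmanaged device or an unsanctioned app instance is automatically stepped down to a safer action without a new static rule.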

While these shortcomings are undoubtedly causing issues for security practitioners, there is a way forward. The new white paper “Why You Should and How You Can Move Away from Existing DLP Programs” offers actionable strategies for how you can evolve your legacy DLP program to better secure your organization’s sensitive data while reducing costs and improving efficiency.

Carmine Clementelli
Carmine Clementelli is a security expert and technology leader for data security, CASB, and zero trust at Netskope.