Organizations everywhere are racing to adopt the cloud. From a business perspective, the top objectives corporations should focus on are business enablement coupled with scalability and agility.
This requires a slightly different mindset: thinking about how to create a business architecture that can scale vertically or horizontally and still be nimble enough to adjust to small changes when required; call it a course correction.
How can organizations execute this course correction?
From a technology standpoint, the objectives that enable the business to execute at the desired speed include reducing complexity and achieving gap-free protection.
Organizations carry a great deal of technical debt and redundancy. We need to think about how to align a new, future-proof technical architecture with business goals, focusing on simplifying the current tech stack through vendor consolidation and building efficient, automated processes. Of course, a good data protection program is key, because data is your top priority in the cloud.
The most common problem I have seen in my discussions with clients is applying the same controls to every type of data, combined with a lack of data classification. Some view data protection as an IT problem and do not involve the business at all. Others address prevention controls but neglect monitoring, detection, response, and reporting capabilities.
When applying Zero Trust principles, a good data protection program is one of the main pillars of success.
Data should be protected at rest, in use, and in transit, using capabilities such as encryption, especially in the cloud. But there are many methods of protecting data, depending on its form and the need: dynamic or static masking, tokenization, format-preserving encryption (FPE), anonymization, and so on.
Data governance with a consistent data security policy must exist and be followed. The specific control applied at a given time will be determined by other factors, such as regulatory requirements, data residency, whether the data is structured or unstructured, and the business's risk appetite.
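As a minimal sketch of two of these methods, static masking and tokenization, the following toy Python code illustrates the idea. All names here are hypothetical for illustration; a real deployment would use a hardened token vault and a vetted cryptographic scheme, not an in-memory dictionary.

```python
import secrets

def mask_static(card_number: str) -> str:
    """Static masking: irreversibly replace all but the last four digits."""
    return "*" * (len(card_number) - 4) + card_number[-4:]

class TokenVault:
    """Toy token vault: swaps sensitive values for random surrogate tokens.

    Tokenization is reversible only via the vault's lookup table, which is
    what distinguishes it from masking.
    """
    def __init__(self):
        self._forward = {}   # value -> token
        self._reverse = {}   # token -> value

    def tokenize(self, value: str) -> str:
        if value not in self._forward:
            token = secrets.token_hex(8)
            self._forward[value] = token
            self._reverse[token] = value
        return self._forward[value]

    def detokenize(self, token: str) -> str:
        return self._reverse[token]

vault = TokenVault()
masked = mask_static("4111111111111111")   # -> "************1111"
token = vault.tokenize("4111111111111111") # random surrogate, no card digits
original = vault.detokenize(token)         # round-trips back to the value
```

The key contrast: masked data can safely leave the organization but cannot be recovered, while tokenized data can be restored, so the vault itself becomes the asset to protect.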
What capabilities are often forgotten?
The most commonly forgotten capability is integration. This typically happens when teams do not think holistically and instead focus on one specific outcome. It is imperative to address flexible integration, scalability, and performance, especially in analysis and processing. We are lucky, because cloud can deliver exactly that.
Many enterprises are starting to realize the importance of classifying data and understanding what must exist, programmatically as well as administratively, to protect those intangible assets. Not only does that make data programs such as DLP more effective and efficient, it also helps avoid the false positives that challenge many security teams. Those false positives arise when too much focus is placed on a specific tool or capability without considering the end-to-end sequence of events required to make a truly quality process.
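One way to picture classification driving protection is a simple mapping from labels to required controls. This is an illustrative sketch only; the label names and control names are hypothetical, not the API of any specific DLP product.

```python
# Illustrative mapping from classification label to the controls it requires.
CONTROLS_BY_LABEL = {
    "public":       set(),
    "internal":     {"encrypt_at_rest"},
    "confidential": {"encrypt_at_rest", "encrypt_in_transit", "dlp_scan"},
    "restricted":   {"encrypt_at_rest", "encrypt_in_transit",
                     "dlp_scan", "tokenize"},
}

def required_controls(label: str) -> set:
    """Controls a label demands; unknown labels fail closed to the strictest set."""
    return CONTROLS_BY_LABEL.get(label, CONTROLS_BY_LABEL["restricted"])

def gaps(label: str, applied: set) -> set:
    """Controls still missing for an asset, given what is already applied."""
    return required_controls(label) - applied

# A confidential asset with only at-rest encryption still has two gaps.
missing = gaps("confidential", {"encrypt_at_rest"})
```

Because unclassified data falls back to the strictest tier, the scheme fails closed: forgetting to label an asset over-protects it rather than exposing it.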
Another step that is usually forgotten is an understanding of MDM (master data management) and the effect of proper planning on data reuse and integration. We usually dump everything into a pool and expect that some algorithm will solve all of our concerns, but as the saying goes, "garbage in, garbage out." Having the right format will help not only with accuracy of analysis but also with reusing the same data, while minimizing organizational exposure to certain types of business risk, such as critical information loss and access to unauthorized data.
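A small sketch of what "the right format" means in practice: normalizing records from different systems onto one canonical schema before they enter the shared pool. The field names and source formats below are assumptions for illustration, not any particular MDM product's model.

```python
from datetime import datetime

def normalize(record: dict) -> dict:
    """Map a raw record onto a canonical schema (field names are illustrative).

    Different source systems may use different keys ("id" vs "cust_id",
    "created" vs "signup_date"); normalization makes the pooled data
    comparable instead of garbage-in, garbage-out.
    """
    return {
        "customer_id": str(record.get("id") or record.get("cust_id")).strip(),
        "email": (record.get("email") or "").strip().lower(),
        "created": datetime.strptime(
            record.get("created") or record.get("signup_date"), "%Y-%m-%d"
        ).date().isoformat(),
    }

# The same customer, exported by two different systems:
raw_a = {"id": " 42 ", "email": "Alice@Example.com ", "created": "2021-06-01"}
raw_b = {"cust_id": "42", "email": "alice@example.com",
         "signup_date": "2021-06-01"}

# After normalization the records match, so the pool holds one clean copy.
same_customer = normalize(raw_a) == normalize(raw_b)
```

Deduplicating at ingestion like this is also what makes the "protect it once, reuse it many times" model in the next section workable.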
Now take that into consideration when planning to migrate or move to the cloud.
Because of cloud scalability, you will be able to gain higher speed of analysis, integrate with other services by reusing the same pool of data, and even protect it once and consume it many times ("one to many"), but that requires a proper governance model. A well-built, holistic data program will even give your Zero Trust model better context for its adaptive policies.