Those of us in the data security industry, practitioners and vendors alike, have been conditioned to think of data protection in terms analogous to physical security: blocking devices and sensors are akin to locks and security systems. This is why, for years, we have invested in technologies that block out unauthorized connections while making information more and more accessible. There is, however, a new world order at hand. Data creation rates now far outpace the ability of IT managers to write security rules, and data breaches and threats originating from network insiders have proven far more frequent and insidious than even our most dire predictions of five years ago.
To protect data in this new order, we need a system more fundamental than padlocks and alarms; we need a new infrastructure, or a new house, if you will. This new infrastructure is a system of data governance general enough to be applied in the networks of businesses across all industries. It is a combination of technologies and practices for protecting data that can scale to match the complexity of the security challenges created by the explosion in digital information.
To implement a system of data governance, there are some "musts," or fundamentals, that need to be adhered to. First, 80 percent of enterprise data is unstructured, meaning it exists outside of a database or document management system. This data needs to be protected where it lives, and that means the aggregation points: shared file systems and network attached storage (NAS). Locking down file shares first does not mean that laptops and other portable devices with memory shouldn't be protected, but the first point of order is to control who can access what information and who has the privilege to download it to such devices. Placing proper access controls on file systems means that the proliferation of data to highly mobile form factors can be controlled centrally, and therefore efficiently.
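As a minimal illustration of that first control point, the sketch below assumes a POSIX-style NAS export mounted at a hypothetical path and walks the share looking for directories whose permission bits grant write access too broadly; these overly permissive spots are the place to start tightening. Windows shares with NTFS ACLs would require different tooling, but the principle of inventorying permissions at the aggregation point is the same.

```python
import os
import stat

SHARE_ROOT = "/mnt/shared"  # hypothetical mount point for the file share

def find_open_directories(root):
    """Walk the share and report directories writable by overly broad audiences."""
    findings = []
    for dirpath, dirnames, filenames in os.walk(root):
        mode = os.stat(dirpath).st_mode
        if mode & stat.S_IWOTH:        # world-writable: any user can add or alter files
            findings.append((dirpath, "world-writable"))
        elif mode & stat.S_IWGRP:      # group-writable: confirm the group is still the right audience
            findings.append((dirpath, "group-writable"))
    return findings

if __name__ == "__main__":
    for path, issue in find_open_directories(SHARE_ROOT):
        print(f"{issue:15} {path}")
```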
The second point of order is to clean up unused data, stale data permissions, and inactive user accounts. One of the best places to mitigate security risk is to eliminate outdated security settings and stale information, all of which contribute to an environment where access is overly permissive and unwarranted. In some instances, company insiders with knowledge of and access to unused accounts have been able to surreptitiously garner information to which they had no business-driven entitlement. Whether the organization has 50 employees or 50,000, the potential risk is the same: outdated access controls and inactive accounts mean a higher probability of data loss and misuse, and in some industries they can mean a violation of regulations that mandate segregation of duties.
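A hedged sketch of what that cleanup can look like in practice: assuming the directory service can export a CSV of accounts and last-login timestamps (the file name and column names below are hypothetical), a short script can flag accounts that have sat dormant past a review threshold so they can be disabled or removed.

```python
import csv
from datetime import datetime, timedelta

EXPORT_FILE = "account_last_login.csv"   # hypothetical export with columns: username,last_login (ISO dates)
INACTIVITY_THRESHOLD = timedelta(days=90)

def find_stale_accounts(path, threshold):
    """Return accounts whose last recorded login is older than the threshold."""
    cutoff = datetime.now() - threshold
    stale = []
    with open(path, newline="") as handle:
        for row in csv.DictReader(handle):
            last_login = datetime.fromisoformat(row["last_login"])
            if last_login < cutoff:
                stale.append((row["username"], row["last_login"]))
    return stale

if __name__ == "__main__":
    for user, seen in find_stale_accounts(EXPORT_FILE, INACTIVITY_THRESHOLD):
        print(f"Review for deactivation: {user} (last login {seen})")
```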
With the first two steps fully addressed, organizations must next put in place the basic pillars of good governance: visibility, oversight, and accountability. Organizations need the means to inventory their security controls at any given time. That is, a data administrator, auditor, or corporate officer should be able to quickly produce and review a list of which individuals and groups in the company have access privileges to important information and what level of privilege they hold (full, restricted, etc.). These privileges constitute the potential for access, and therefore the security posture of the data; data stakeholders need visibility into that posture at any given time.
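One way to make that inventory routinely producible, sketched under the assumption that group memberships and share permissions can be exported as CSVs (the file names and columns below are illustrative), is to resolve group-based permissions down to a per-person entitlement list that an administrator, auditor, or officer can pull on demand.

```python
import csv
from collections import defaultdict

# Hypothetical exports from the directory service and the storage platform.
GROUP_MEMBERS_FILE = "group_members.csv"           # columns: group,user
SHARE_PERMISSIONS_FILE = "share_permissions.csv"   # columns: share,group,privilege (full/restricted)

def build_entitlement_report(members_path, permissions_path):
    """Resolve group-based share permissions down to individual users."""
    members = defaultdict(list)
    with open(members_path, newline="") as handle:
        for row in csv.DictReader(handle):
            members[row["group"]].append(row["user"])

    report = defaultdict(list)  # user -> list of (share, privilege)
    with open(permissions_path, newline="") as handle:
        for row in csv.DictReader(handle):
            for user in members.get(row["group"], []):
                report[user].append((row["share"], row["privilege"]))
    return report

if __name__ == "__main__":
    entitlements_by_user = build_entitlement_report(GROUP_MEMBERS_FILE, SHARE_PERMISSIONS_FILE)
    for user, entitlements in sorted(entitlements_by_user.items()):
        for share, privilege in entitlements:
            print(f"{user:20} {privilege:12} {share}")
```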
Of equal importance, data owners or stewards must be identified for each data set in the organization. If this is not possible, or would be a lengthy undertaking given the organization's size, then the company should at least assign owners to data sets known to contain sensitive and valuable information (e.g., HR, customer, and financial data). Identifying business owners for data and making them accountable to the business for its security posture is key to ensuring that access controls are accurate and that data is properly stored (i.e., deleted or archived when appropriate). Without accountable data owners, IT personnel are forced to spend countless and often fruitless hours searching for the people responsible or, worse yet, must make decisions about the security and archival of the data without the stakeholders' input. This situation is error prone and consequently fraught with risk.
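Accountability is an organizational practice more than a technical one, but even a simple, machine-readable registry of data sets, classifications, and owners makes the gaps visible. The sketch below assumes such a registry exists as a CSV (a hypothetical format) and flags sensitive data sets that have no accountable owner on record.

```python
import csv

# Hypothetical ownership registry maintained alongside the data sets themselves.
REGISTRY_FILE = "data_owner_registry.csv"  # columns: data_set,classification,owner

def find_unowned_sensitive_sets(path):
    """List sensitive data sets with no accountable business owner on record."""
    unowned = []
    with open(path, newline="") as handle:
        for row in csv.DictReader(handle):
            if row["classification"].lower() == "sensitive" and not row["owner"].strip():
                unowned.append(row["data_set"])
    return unowned

if __name__ == "__main__":
    for data_set in find_unowned_sensitive_sets(REGISTRY_FILE):
        print(f"No accountable owner assigned: {data_set}")
```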
Finally, any framework for governance and protection of data must be robust enough to provide continuous oversight of how all data is being used in the organization. Data access environments are highly dynamic: digital information is created at meteoric rates, and business needs for access to data change as often as job functions and project scopes do. Even people who use their access privileges properly most of the time may periodically engage in inappropriate behavior, triggered perhaps by a job change or simple curiosity. Hence, even if all access controls are up to date and data stewards are in place, it is important to continuously review how individuals are using their data privileges and whether the types of privileges and access activity are consistent with each individual's role in the organization. This sort of frequent oversight is a "must" for mitigating the risk of data loss and misuse, because it allows organizations to investigate and swiftly respond to anomalous access activity.
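Continuous oversight ultimately calls for purpose-built auditing, but the underlying idea can be sketched simply: given a chronological log of file-access events (the log format below is assumed for illustration), flag the first time a user touches a share they have never accessed before, as a prompt for the data owner to confirm the activity fits the person's role.

```python
import csv
from collections import defaultdict

ACCESS_LOG_FILE = "file_access_log.csv"  # hypothetical log with columns: timestamp,user,share (sorted by time)

def flag_first_time_access(path):
    """Flag events where a user touches a share they have never accessed before.

    A first-time access is not proof of misuse, only a prompt for the data
    owner to confirm the activity is consistent with the person's role.
    """
    seen = defaultdict(set)   # user -> shares accessed so far in the log
    alerts = []
    with open(path, newline="") as handle:
        for row in csv.DictReader(handle):
            user, share = row["user"], row["share"]
            if share not in seen[user]:
                if seen[user]:            # skip the very first event for a brand-new user
                    alerts.append((row["timestamp"], user, share))
                seen[user].add(share)
    return alerts

if __name__ == "__main__":
    for timestamp, user, share in flag_first_time_access(ACCESS_LOG_FILE):
        print(f"{timestamp}  {user} accessed {share} for the first time")
```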
While there is no secret formula for preventing data breaches, the "musts" described above represent significant steps toward reducing the probability of their occurrence. Should a breach occur at a company that has applied these "musts" consistently and frequently, that preparedness will allow it to identify those responsible and introduce corrective measures.