Outsourcing development can provide many benefits, but an outsourced testing process can be a force multiplier for security lapses. Given that many organizations rely on outsourcers for application development and testing, test data should always be created and properly masked before it is handed over to any third party. The onus is typically on the organization that owns the data, not the outsourcer, to ensure the data is protected and individual identities are concealed.
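One common masking approach is to replace sensitive fields with deterministic, irreversible tokens before the data leaves the organization. The sketch below is a minimal illustration of that idea; the field names, salt, and record shape are assumptions for the example, not a prescription for any particular masking product.

```python
import hashlib

def mask_value(value: str, salt: str = "test-env-salt") -> str:
    """Replace a sensitive value with a deterministic, irreversible token.

    Hashing with a salt keeps the output stable across runs (so joins in
    test data still work) while concealing the original value.
    """
    digest = hashlib.sha256((salt + value).encode("utf-8")).hexdigest()
    return digest[:12]

def mask_record(record: dict, sensitive_fields: set) -> dict:
    """Return a copy of the record with only the sensitive fields masked."""
    return {
        key: mask_value(str(val)) if key in sensitive_fields else val
        for key, val in record.items()
    }

# Hypothetical customer record for illustration.
customer = {"id": 1001, "name": "Jane Doe", "email": "jane@example.com", "plan": "gold"}
masked = mask_record(customer, {"name", "email"})
```

Because the tokens are deterministic, the same customer maps to the same masked value everywhere, which preserves referential integrity across test tables without exposing identities.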
Get a Handle on Test Data Access and Handling
Using copies of real customer data for application testing requires a simultaneous focus on making sure the processes for accessing and pulling test data are secure—which means knowing who is accessing test data and what exactly they are doing with it. Individuals who wrongfully access and handle this data often aren’t aware of just how hazardous their behavior may be, or the magnitude of the risk their actions create.
In security, the principle of least privilege is the practice of limiting access rights for users to the bare minimum permissions they need to perform their jobs. Unfortunately, poor credentials management is prevalent within many organizations. It can take several forms, including failure to terminate credentials as an employee’s job role changes (meaning employees accumulate and retain access to many more systems than necessary to do their current jobs), or credentials sharing, often with lower-level employees. Forrester recently found that 80% of security breaches involve privileged credentials.
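In code, least privilege often boils down to an explicit role-to-permission mapping that is checked before every sensitive operation, so an employee whose role changes loses access the moment the mapping changes. The role and permission names below are illustrative assumptions, not any real product’s API.

```python
# Minimal least-privilege check: a user may perform an action only if their
# current role explicitly grants it. Unknown roles get an empty set, so
# access is denied by default rather than granted by accident.
ROLE_PERMISSIONS = {
    "qa_engineer": {"read_masked_test_data"},
    "dba": {"read_masked_test_data", "extract_production_data"},
}

def is_allowed(role: str, action: str) -> bool:
    """Return True only if the role's permission set includes the action."""
    return action in ROLE_PERMISSIONS.get(role, set())
```

Keeping the check deny-by-default means a stale or shared credential tied to a removed role simply stops working, rather than silently retaining access.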
Combine this with the fact that DevOps teams are moving at a rapid clip, which translates to more teams (and more individuals) creating more applications and updates, and thus testing at much higher frequencies. Together, these factors create a perfect storm for risky, careless and sloppy behavior, even when it’s inadvertent. Research from 451 Alliance shows that user behavior is the top information security pain point within many organizations.
Consider the example of an authorized internal user who takes unmasked test data extracted from an in-production mainframe database and sends it to an unvetted and/or unauthorized external third-party organization. User behavior auditing can be the key to proactively identifying and addressing such potentially risky behaviors, ultimately helping DevOps teams achieve the elusive balance between the need to move quickly and the need for superior information security. Some may view this type of auditing as a form of unwelcome surveillance, but it is really about protecting individuals and organizations as the pace of DevOps accelerates and compliance demands intensify.
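At its simplest, user behavior auditing means recording every test-data access in an append-only log and periodically scanning it for the risky pattern described above: unmasked data leaving the organization. The sketch below assumes hypothetical dataset names and a simple in-memory log purely for illustration.

```python
from datetime import datetime, timezone

AUDIT_LOG = []

def record_access(user: str, dataset: str, action: str, destination: str = "internal"):
    """Append an audit entry for a test-data access event."""
    AUDIT_LOG.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "dataset": dataset,
        "action": action,
        "destination": destination,
    })

def flag_risky(log: list) -> list:
    """Surface entries where unmasked data was sent outside the organization."""
    return [
        entry for entry in log
        if entry["destination"] == "external" and "unmasked" in entry["dataset"]
    ]
```

A real deployment would write to tamper-evident storage and feed alerts into a review workflow, but the core pattern—log every access, then query for risky combinations—is the same.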
The Google fine serves as yet another reminder of how seriously European regulators are taking the issue of customer data privacy and security. While it was the biggest fine on record, it wasn’t the first, and it won’t be the last.
British Airways and Marriott have also received large fines under GDPR, CNBC reported. The U.K. Information Commissioner’s Office (ICO) has proposed that British Airways pay a fine of $230 million for an incident that took place from June to September 2018 and compromised the data of 500,000 customers. The ICO also hit Marriott with a potential $123 million penalty for the loss of 339 million guest records, reported in November 2018. Both companies have already indicated they will appeal the decisions.
Other violations over the past few months range from the egregious (such as the case of a Portuguese hospital using bogus accounts to access patient data), to those that may have little-to-no malicious intent (such as the October incident of a small Austrian business being fined for security camera filming in a public place).
Some GDPR infractions are obvious; others are less so, but that won’t make the pain of a fine any less severe. As GDPR sets stricter precedents for consent mandates, organizations will face an ongoing challenge in securing consent for all the ways they plan to leverage customer data, including technical activities such as application testing. Such areas must be navigated carefully, and in some instances organizations will be better served by implementing steps that avoid the need to secure consent in the first place.