Personally identifiable information (PII) and classified information stored within an Oracle database is quite secure, because the capabilities of the Oracle database ensure that it is accessible only by authenticated and authorized users.
But the definition of authentication and authorization can change, in a sense.
In a production application, authentication means verifying that one is a valid application user and authorization means giving that application user the proper privileges to access and manipulate data.
But now let’s clone the data from that production application to non-production systems (e.g. development, QA, testing, training), so that we can develop new application functionality, fix existing functionality, test it, and deliver user training.
By doing so, we have in effect changed the community of valid users: instead of production application users, the valid users are now developers and testers, or newly hired employees being trained on “live” information from production.
And those developers, testers, and trainees are indeed properly authenticated, and thus authorized to access and manipulate this same data in their non-production systems — the same information that could be used to defraud, steal identities, and cause all manner of mayhem, if one so chose. This honor system is how things have been secured in the information technology (IT) industry for decades.
Now of course, I can hear security officers from every point of the compass screaming and wildly gesticulating, “NO! That’s NOT true! The definition of authentication and authorization does NOT change, just because systems and data are copied from production to non-production!” OK, then explain what has been happening for the past 30 years.
In the case of classified data, these developers and testers go through a security clearance process to ensure that they can indeed be trusted with this information and that they will never misuse it. Violating one’s security clearance in any respect is grounds for criminal prosecution, and for most people that is a scary enough prospect to wipe out any possible temptation.
Outside of government, organizations have simply relied on professionalism and trust to prevent this information from being used and abused. And for the most part, for all these many decades, that has been effective. I know that I have never even been tempted to share PII or classified data with which I’ve come into contact. I’m not bragging; it is just part of the job.
Now, that doesn’t mean that I haven’t been tempted to look up my own information. This is essentially the same as googling oneself, except here it is using non-public information.
I recall a discussion with the CFO of an energy company, who was upset that the Sarbanes-Oxley Act of 2002 held her and her CEO criminally liable for any information breaches within her company. She snarled, “I’ll be damned if I’m going to jail because some programmer gets greedy.” I felt this was an accurate analysis of the situation, though I had scant sympathy (“That’s why you’re paid the big bucks“). Since that time we have all seen efforts to rely less on honor and trust, and more on securing data in non-production. Because I think everyone realizes that the bucks just aren’t big enough for that kind of liability.
This has given rise to products and features which use encryption for data at rest and data in flight. But as pointed out earlier, encryption is no defense against a change in the definition of authentication and authorization. I mean, if you authenticate correctly and are authorized, then encryption is reversible obfuscation, right?
To deal with this reality, it is necessary to irreversibly obfuscate PII and classified data, a practice also known as data masking. There are many vendors of data masking software, and it is a robust and growing industry. If you irreversibly obfuscate PII and classified information within production data as it is cloned to non-production, then you have completely eliminated the risk.
After all, from a practical standpoint, it is extremely difficult as well as conceptually questionable to completely artificially generate life-like data from scratch. It’s a whole lot easier to start with real, live production data, then just mask the PII and classified data out of it.
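For illustration only, here is a minimal sketch of what such a masking job might look like in SQL. The CUSTOMERS table and its SSN, LAST_NAME, EMAIL, and CUSTOMER_ID columns are hypothetical, and commercial masking products perform far more sophisticated, format-preserving substitutions than this crude approach:

```sql
-- Minimal masking sketch (hypothetical CUSTOMERS table): irreversibly
-- scramble PII in the non-production clone using ordinary SQL.
UPDATE customers
   SET ssn       = LPAD(TRUNC(DBMS_RANDOM.VALUE(0, 999999999)), 9, '0'),
       last_name = INITCAP(DBMS_RANDOM.STRING('l', 8)),
       email     = 'user' || customer_id || '@example.com';

COMMIT;
```

Note that this is just an ordinary UPDATE statement, a point that becomes important below.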
Problem solved!
Or is it?
Unfortunately, there is one more problem, and it has nothing to do with the concept of masking and irreversible obfuscation. Instead, it has to do with the capabilities of the databases in which data is stored.
Most (if not all) of this PII and classified data is stored within databases, which have mechanisms built in for the purpose of data recovery in the event of a mishap or mistake. In Oracle Database, such mechanisms include LogMiner and Flashback Database.
Because data masking jobs execute SQL within Oracle databases, those data recovery mechanisms can be used to recover the PII and classified data that existed before the masking jobs were run. This is not a flaw in the masking mechanism; rather, it is the perversion of a database feature for an unintended purpose.
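To make that concrete, here is a sketch (not a recipe) of how the pre-masking values could be retrieved, assuming the hypothetical CUSTOMERS table above, sufficient undo retention and archived redo, and the appropriate privileges. Flashback Query, a simpler relative of the Flashback Database feature mentioned above, is shown first:

```sql
-- Flashback Query: read column values as they existed before the masking
-- UPDATE ran (works as long as the undo data is still retained).
SELECT ssn, email
  FROM customers
  AS OF TIMESTAMP (SYSTIMESTAMP - INTERVAL '2' HOUR);

-- LogMiner (simplified): reconstruct the UNDO for the masking UPDATEs
-- from an archived redo log. The log file path is hypothetical.
EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/arch/1_1234_987654321.dbf');
EXECUTE DBMS_LOGMNR.START_LOGMNR(OPTIONS => DBMS_LOGMNR.DICT_FROM_ONLINE_CATALOG);

SELECT sql_undo
  FROM v$logmnr_contents
 WHERE table_name = 'CUSTOMERS'
   AND operation  = 'UPDATE';

EXECUTE DBMS_LOGMNR.END_LOGMNR;
```

The point is that none of this requires defeating the masking itself; these are documented recovery features doing exactly what they were designed to do.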
Ouch.
This topic became a 10-minute “lightning talk” presentation, “How Oracle Data Recovery Mechanisms Complicate Data Security, and What To Do About It”, on Wednesday, 13 April 2016, at the Collaborate 2016 conference in Las Vegas. Please read the slide deck for my notes on how to deal with the situation presented here.
You can download the slide deck here.