One of the biggest and most frustrating events is when Delta, Equifax, or anybody else in the business world gets hacked. I used to look at this and say, “Idiots, you did not protect my data,” and stay mad at them for weeks. In the cases of the Office of Personnel Management, where security clearance data was stolen, and Equifax, where financial data was taken, I was angry for a lot longer than that.
Yes, these technical geniuses were not very careful with my data, and in the case of OPM deserved to be fired all the way up to the top. But they have a problem that is not easy to correct: the software in their organizations is full of holes that either have not been patched (their fault) or whose flaws have not been identified (the vendor’s fault). I have told a whole stream of stories in the past demonstrating that vendors with known problems wait until they can work the fixes into a development cycle that runs on for years before those problems get corrected. In the meantime, our data is at risk.
The vendors know it is at risk, and they hide those flaws from anybody who looks into them. The reason they can do that is that our laws do not hold a software vendor responsible for errors made in producing that software, even when it is used on the Internet, where they should reasonably expect any flaws to be exploited by hackers. Years ago, I wrote several policy papers that outlined the problem. The vendors responded with justifications for maintaining a policy that shields software from this kind of liability. I know them all by now, and with some of them I would even agree:
* Many times it is the integration of software from many vendors that produces errors that were not present in any single product on its own.
* Patches are issued for known exploits, and new exploits are found every day. Millions of changes are made and released every month, even though some IT shops never apply them.
* IT departments keep old software around to avoid license fees for newer software with fewer flaws.
Yes, it is complicated. However, software vendors have released versions (sometimes whole products) that were never security tested before they went out the door. In one case, an on-line data storage company had not tested its entire system until a venture capital investor decided it might be a good idea. We have to have some laws in this area, or the software vulnerability lists, already too long, are just going to keep getting longer. IT shops can’t keep up.
Vendors need some incentive to do more security testing before releasing a new product or updating an old one. They need time limits on fixing known vulnerabilities; many of those are buried in the software libraries they use for development, and some of them are intentionally introduced. They need some liability for the products they produce, and maybe that would give them an incentive to clean up their act.
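To make the point concrete: checking a shipped dependency against a public vulnerability database is cheap. The sketch below is only an illustration, assuming the public OSV.dev query API; the package name and version in the example are chosen for demonstration, not taken from any vendor mentioned here.

```python
import json
import urllib.request

# Query the public OSV.dev vulnerability database for one pinned dependency.
# This is a minimal sketch; a real release pipeline would walk the whole
# dependency list and fail the build when advisories come back.
OSV_QUERY_URL = "https://api.osv.dev/v1/query"


def known_vulnerabilities(name: str, version: str, ecosystem: str = "PyPI") -> list:
    """Return the published advisories for a single package version."""
    payload = json.dumps({
        "version": version,
        "package": {"name": name, "ecosystem": ecosystem},
    }).encode("utf-8")
    request = urllib.request.Request(
        OSV_QUERY_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request, timeout=10) as response:
        body = json.load(response)
    return body.get("vulns", [])


if __name__ == "__main__":
    # Illustrative example: an old library version with known advisories.
    for vuln in known_vulnerabilities("requests", "2.19.1"):
        print(vuln["id"], "-", vuln.get("summary", "(no summary)"))
```

A check like this could run on every build before anything ships. The hard part is not the tooling; it is giving vendors a reason to act on what the check finds.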