Cybersecurity is rightly seen as an ongoing battle. New attacks are being engineered all the time, and new defenses are created to combat them. Protecting against attacks and accidental breaches is something that requires an ever-evolving effort.
In practice, however, it’s easy for cybersecurity to grow static and stale. An anecdote recently reported by security expert Kalev Leetaru helps to illustrate the problem.
Leetaru was called multiple times from an unlisted number by a company claiming to conduct surveys on behalf of the National Science Foundation (NSF). The surveys requested a wealth of personal information, including addresses, birth dates, and the last four digits of respondents' Social Security numbers.
Leetaru was initially shocked that a major government agency would resort to such untrustworthy methods to collect survey data. When he dug further, he uncovered troubling holes in the agency's data security strategy. There were very real reasons to worry that personal survey data could be compromised by hackers.
Where and why those security holes exist is important, but it's not the takeaway of Leetaru's story. His point is to illustrate how organizations, even at the highest levels, rely on security solutions that are out of step with today's needs.
The Problem with Security by Obscurity
When Leetaru contacted the NSF to inquire about its security protocols, he was denied specific information on multiple occasions. Agency officials told him that releasing details about the cybersecurity infrastructure could put it at risk.
This is a classic example of "security by obscurity": an old approach to cybersecurity based on the idea that if hackers don't know what defenses are in place, they won't be able to breach them. That idea has been discredited in the wake of nearly every major cyber incident, and there is little reason to believe that transparency about defenses compromises security.
Cybersecurity is an issue that the NSF must take seriously for a number of reasons. Yet, as Leetaru points out, it relies on outdated philosophies, incomplete defenses, and inappropriate surveys. That leads to an unavoidable question: if even the government's practices are outdated, how many private-sector practices are outdated as well?
Reconsidering the Data Lifecycle
Companies are collecting more data than ever, from more individuals and touchpoints than ever. Unfortunately, the policies that dictate data lifecycle management are often years old. They may prove adequate in most instances. But because they can't prove adequate in all instances, they create liabilities.
It's time for the private sector (and the NSF) to embrace a new approach to the data lifecycle: one that acknowledges the speed and scope of data are growing exponentially, that incorporates both protections and disaster-response strategies, and that includes non-technical measures such as a coherent US cyber policy.
Leetaru was not inspired to write about the issues he uncovered because they are a major threat. He wrote about them because they're a common threat. Cybersecurity moves fast, and existing approaches become obsolete quickly. Companies must take a proactive approach and cover themselves from all angles.