The Quest for Absolute Security

As security professionals, we tend to look at security as black or white: either a system, network, dataset, or application is secure or it isn’t. There’s a vulnerability in this piece of software, so it’s insecure. Employee X uses a weak password, and that’s going to allow an exploit. An attacker was found on our network, so we’ve “failed.”

Is this really a practical way to evaluate security? In moments when we’re not being overly self-critical, I think most security practitioners would admit that achieving a 100% secure system, network, or dataset is unrealistic. The desire to protect everything in absolute terms seems to be part of most cybersecurity practitioners’ DNA, however, so any compromise is regarded as a “fail.”

How much of the rest of business operates in absolute terms? If new product sales underperform one quarter, should the business immediately discontinue the product? If a 1.0 product design lacks certain functionality, do businesses scrap all future updates? Of course not. Every business unit is subject to goals and expectations, but it takes more than one imperfection to result in failure. Heck, IBM reported 22 consecutive quarters of declining revenue and yet it’s still considered an industry leader by most estimates. The point is, there are degrees of success. It might be easy to measure a salesperson on the percentage of quota achieved over the course of a few quarters, but there is no equivalent yardstick for security, and achieving 100% security is an impossible dream. As industry expert Kevin Johnson of Secure Ideas says, “I can make your systems 100% secure, but they won’t be functional — at all. Is that really the point of security?”

Dealing with cyber vulnerabilities

The fact is, vulnerabilities will always exist, whether the cause is a coding error, a successful phish, or simply an advance in technology that invalidates old ways of securing the network. Firewalls are a perfect example: as they were created back in the 1980s, they were extremely effective for their intended purpose, keeping outsiders out of a discrete, on-premises network. Over the years, as networks grew, B2B partnerships increased, and mobile and virtual technologies expanded the network attack surface, a perimeter-based model with a single point of authentication no longer sufficed.

Today we live in a complex world of hardware, software, and firmware. Much like the comfortable pair of jeans from college you just can’t bear to throw out, legacy equipment and ideas hang around past their “effectiveness date” because change is hard and technology moves at such a blistering pace that updating every time something “better” comes along would be impossible. But with technology advancement come new vulnerabilities. And as hard as white hats are working to protect data, networks, and applications, cyber criminals are working equally hard to inflict damage. Once attackers find a way onto your network, they can exfiltrate data, install malicious code, and create additional vulnerabilities that can be used to exploit systems at a time of their choosing. With all this in mind, it’s easy to see why security pros view these scenarios as “game over,” and why it’s so easy to fall into an all-or-nothing mindset.

Network protection with realistic expectations

When experiencing a moment of “I must protect All The Things!!!” it’s helpful to think about how other fields approach win-loss scenarios. For instance, in the world of retail, “shrinkage” is a term used to describe how much inventory loss due to theft or errors can be expected and absorbed by the business. No successful retail organization would ever estimate 0% shrinkage because that’s unrealistic. According to the National Retail Federation, U.S. retail organizations experience an average of 1.33% shrinkage (as a percentage of total sales) every year. For retail organizations generating billions of dollars of revenue, those numbers really start to add up, but there is always some level of expected and acceptable loss.
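The shrinkage figure translates into a concrete planning number. As a rough sketch (the revenue figure below is invented for illustration; only the 1.33% rate comes from the NRF statistic above), the same expected-loss arithmetic a retailer uses could inform a security team’s acceptable-loss budget:

```python
# Illustrative only: applying the retail "shrinkage" expected-loss
# arithmetic. The $2B revenue figure is hypothetical.

SHRINKAGE_RATE = 0.0133  # NRF-reported average: 1.33% of total sales


def expected_loss(annual_sales: float, rate: float = SHRINKAGE_RATE) -> float:
    """Loss the business expects, plans for, and absorbs each year."""
    return annual_sales * rate


# A retailer with $2B in annual sales budgets roughly $26.6M for shrinkage.
loss = expected_loss(2_000_000_000)
print(f"Expected annual shrinkage: ${loss:,.0f}")  # $26,600,000
```

The point isn’t the number itself; it’s that the loss is estimated in advance and treated as a cost of doing business rather than a failure.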

Looking at non-cyber crime statistics, local, state, and federal law enforcement track year-on-year crime rates to gauge progress. As with retail shrinkage, achieving a 0% crime rate in any given town, city, or state is unrealistic, so rather than aiming for perfection, agencies set goals to decrease crime in their jurisdictions over time. If the crime rate increases during a specified period, agencies reevaluate, reset, and improve in areas they can control, such as evidence gathering and interview techniques.

Taking a page out of these playbooks, security teams should embrace the idea that protecting data, networks, systems, etc. is an ever-evolving process instead of a point-in-time, black or white situation. As such, the concept of “absolute” becomes obsolete. Rather than, “How do I secure the entirety of my organization’s digital data and technology,” the security strategy can become, “What is the most sensitive data my company collects? Where is it stored and how is it used? What protections are already in place? Are they adequate? Are there methods to place better controls around access to data without overhauling the whole infrastructure?”
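Those prioritizing questions can be made operational. A minimal sketch, assuming a simple scoring heuristic of our own invention (the asset names, weights, and 1–5 scales below are all hypothetical, not any standard methodology), might rank data assets by residual risk so the team works the list top-down instead of trying to secure everything at once:

```python
# Hypothetical sketch: rank data assets by residual risk rather than
# attempting "100% security." All names, scales, and weights are invented.

from dataclasses import dataclass


@dataclass
class Asset:
    name: str
    sensitivity: int  # 1-5: business impact if compromised
    exposure: int     # 1-5: how widely the data is reachable/used
    controls: int     # 1-5: strength of protections already in place

    def risk(self) -> int:
        # Simple heuristic: impact x reachability, discounted by controls.
        return self.sensitivity * self.exposure - self.controls


assets = [
    Asset("customer PII database", sensitivity=5, exposure=4, controls=3),
    Asset("public marketing site", sensitivity=1, exposure=5, controls=4),
    Asset("payroll system", sensitivity=4, exposure=2, controls=2),
]

# Work the list top-down: highest residual risk first.
for a in sorted(assets, key=Asset.risk, reverse=True):
    print(f"{a.risk():>3}  {a.name}")
```

However crude the scoring, writing it down forces the conversation the article calls for: which data matters most, where it lives, and whether existing controls are adequate.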

Asking these kinds of questions, looking at security in small, prioritized chunks, is a lot more manageable than, “Secure everything to 100%.” Beyond breaking down security into workable pieces, the most important aspect of improvement is accepting that there are no absolutes. Perfect security is not viable and should therefore not be a goal. Though there will be high-profile data breaches for which the CISO (and perhaps some direct reports) will be held solely accountable, in most cases security teams will survive security incidents provided they can demonstrate due diligence. This requires the security team to understand and properly convey realistic information about the cybersecurity program to business stakeholders. If security sets an expectation that 100% of data breaches can be prevented (because that’s the dream), any incident will be considered a strike against the program. If, however, we accept the reality that systems can’t be both 100% secure and functional, security can be measured in shades of probability, where no one incident becomes an automatic “fail.”

Katherine Teitler, Director of Content

Katherine Teitler leads content strategy and development for Edgewise Networks. In her role as Director of Content she is a storyteller; a translator; and liaison between sales, marketing, and the customer. Prior to Edgewise, Katherine was the Director of Content for MISTI, a global training and events company, where she was in charge of digital content strategy and programming for the company's cybersecurity events, and the Director of Content at IANS, where she built, managed, and contributed to the company's research portal.