There’s no other way to state it: existing vulnerability management processes are broken. Current vulnerability management paradigms are not keeping up with threats. Attacks similar to WannaCry and Petya, which exploited the EternalBlue vulnerability, could happen again at any time.
You don’t have to look far back for an example. On April 23 of this year, open source CMS provider Drupal notified users of an impending patch to address a critical security issue. On Wednesday, April 25, the patch was released – in the middle of a workday. Attackers wasted no time: long before most organizations had patched their systems, even though the patch had been available for only a couple of days, attackers were exploiting the flaw. In this case, it was a remote code execution vulnerability, which enabled attackers to plant cryptomining software, mostly through Coinhive injections, in multiple Drupal-based sites.
Why is This Happening?
There has been a sea change in digital dependence, concurrent with the adoption of a broader enterprise software stack that includes open source software (OSS) and third-party apps alongside the large vendors. Combined with agile development methodologies, this allows organizations to deploy new – and more vulnerable – software continuously, with only minimal security controls in place.
“Even advocates of continuous vulnerability assessments must admit that there is no way to keep pace with attackers in a vulnerability management program when the time to exploit is measured in hours,” noted Rendition Infosec. The solution? Effective vulnerability remediation that offers integrated, intelligent and weighted insights derived from technical context based on multiple IT systems, credible threat intelligence and insights from the greater business ecosystem.
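To make the idea of "integrated, intelligent and weighted insights" concrete, here is a minimal sketch of how such prioritization might work: technical severity is weighted by business context and threat intelligence so that the riskiest findings surface first. The field names, weights, and vulnerability IDs are all illustrative assumptions, not any real product's scoring model.

```python
# Hypothetical sketch of weighted vulnerability prioritization.
# Weights and thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Finding:
    vuln_id: str
    cvss: float               # technical severity, 0-10
    asset_criticality: float  # business value of the affected asset, 0-1
    exploit_seen: bool        # threat intel: active exploitation in the wild

def priority(f: Finding) -> float:
    """Blend technical severity with business and threat context."""
    score = (f.cvss / 10) * 0.5 + f.asset_criticality * 0.3
    if f.exploit_seen:
        score += 0.2  # actively exploited vulns jump the queue
    return round(score, 2)

findings = [
    Finding("VULN-A", cvss=9.8, asset_criticality=0.9, exploit_seen=True),
    Finding("VULN-B", cvss=5.0, asset_criticality=0.2, exploit_seen=False),
]
for f in sorted(findings, key=priority, reverse=True):
    print(f.vuln_id, priority(f))
```

The point of the sketch is the ordering, not the exact numbers: a medium-severity flaw on a low-value asset can legitimately rank below a critical, actively exploited one, which is exactly the insight raw scanner output fails to provide.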
But what led to this spike in vulnerability exposure? Consider that:
- Almost all enterprises are partially or fully cloud-based. This means that exploitable assets (like Drupal) are exposed to anyone; they are no longer hidden behind corporate security perimeters.
- Current development paradigms demand constant change. This means less inspection and testing – and more vulnerabilities that slip into production versions.
- More software, more vendors. Enterprises are working with more – and smaller – vendors, along with multi-platform solutions. As more vendors conduct vulnerability research, more vulnerabilities are surfacing.
- Too many tools in vulnerability assessment. Today’s average organization has up to 20 vulnerability assessment and patch management tools.
- More digital, more vulnerabilities. Growing digital complexity means more disclosed vulnerabilities:
o In 2000, there were 1,020
o In 2010, there were 4,652
o In 2016, there were 6,447
o In 2017, there were 14,704
But What About Patching?
Patches eventually solve vulnerabilities. And experts like Tim Erlin, Vice President at Tripwire, agree that speed matters: “Organizations should really be aiming to fix vulnerabilities on their systems as rapidly as is feasible. Any gap in applying a patch to a vulnerability provides an opportunity for hackers to access systems and steal confidential data.”
There are three main problems with patching alone as a vulnerability management strategy:
- In IT, nothing exists in a vacuum. Patches can have an unpredictable impact, even causing downtime and interrupting crucial business processes.
- Patching can take time. As we saw in the Drupal example above, patching can’t always provide an immediate, safe fix.
- Patching isn’t always clear-cut. Patching specific software on PCs is a no-brainer, but patching servers and networks is far more complex.
Cross-organizational vulnerability management processes often complicate and confuse rather than solve vulnerability challenges. When multifaceted teams comprising IT, security, DevOps, and R&D use diverse detection tools to discover thousands of potential vulnerabilities, it’s a recipe for organizational chaos. The brutal truth is that existing processes create a lot of extra work and friction, increasing remediation times.
The problem is that each potential vulnerability discovered needs to be analyzed, categorized, and possibly remediated. This translates into a lot of data for manual review – and all this effort produces few insights into the business impact of vulnerabilities, how remediation should happen, or who should do the remediating.
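Part of the manual-review burden comes from the same flaw being reported by several tools at once. The following sketch shows one plausible first step toward taming that data volume: normalizing findings from multiple scanners and merging duplicates that describe the same host-and-vulnerability pair. The tool names, record fields, and dedup key are illustrative assumptions.

```python
# Hypothetical sketch: consolidating overlapping findings from multiple
# assessment tools into one deduplicated worklist. Tool names and field
# names are illustrative assumptions.
from collections import defaultdict

raw_findings = [
    {"tool": "scanner_a", "host": "web-01", "cve": "CVE-2017-0144"},
    {"tool": "scanner_b", "host": "web-01", "cve": "CVE-2017-0144"},
    {"tool": "scanner_a", "host": "db-01",  "cve": "CVE-2014-0160"},
]

def deduplicate(findings):
    """Merge findings that describe the same (host, CVE) pair,
    recording which tools reported each one."""
    merged = defaultdict(set)
    for f in findings:
        merged[(f["host"], f["cve"])].add(f["tool"])
    return merged

worklist = deduplicate(raw_findings)
print(len(worklist))  # three raw findings collapse into two unique items
```

Even this trivial merge cuts the review queue, and tracking which tools agreed on a finding is itself a useful confidence signal for triage.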
Today’s Default: Ex Post Facto
Since neither patching nor existing processes provide timely, focused responses to vulnerabilities, most organizations have given up on prevention. They rely, by default, on after-the-fact incident response and forensics.
This ex post facto methodology aims to reduce malicious vulnerability exploitation by identifying incidents quickly to mitigate damage, then forensically examining each event in the hope of ensuring that similar attacks don’t recur. The problem is that the threat environment evolves so rapidly that the potential cost of every incident is prohibitively high. Enterprises can no longer afford to wait until after the fact.
Where to go from here?
A new vulnerability management paradigm is long overdue. And Vulcan has created one. Download our new eBook ‘Why Continuous Software Exposure Demands Continuous Protection’ to learn three steps to fix what’s broken in modern software vulnerability management.