Sunday, May 29, 2022

Notes from the Field - Center for Internet Security Control 07 - Continuous Vulnerability Management

This is the seventh in a series of posts I'm writing on the Center for Internet Security (CIS) Controls Version 8. The CIS Controls are 18 critical information security controls that all organizations and information security professionals should be familiar with and implement to protect their networks and data. In this post I discuss what I see in my work as an information security auditor with clients regarding Control 07 - Continuous Vulnerability Management.

Acadia National Park with view of Sand Beach
The clients I work with vary a great deal in terms of their information security maturity levels. Some have advanced programs with good policies and strong controls in place. Others are just getting started on their information security journey. They don't know what they need to do or where to go to find out. Vendors attempt to sell them expensive solutions that they don't understand and do not have enough staff to successfully implement. I recommend my clients start by reading the CIS Controls document. It's a concise document that executives can understand and information security professionals can run with.  

The overview for Control 07 - Continuous Vulnerability Management is: "Develop a plan to continuously assess and track vulnerabilities on all enterprise assets within the enterprise’s infrastructure, in order to remediate, and minimize, the window of opportunity for attackers. Monitor public and private industry sources for new threat and vulnerability information."

Control 07 includes seven sub-controls or safeguards, as the CIS Document refers to them. They are: 

7.1 Establish and Maintain a Vulnerability Management Process

7.2 Establish and Maintain a Remediation Process 

7.3 Perform Automated Operating System Patch Management

7.4 Perform Automated Application Patch Management

7.5 Perform Automated Vulnerability Scans of Internal Enterprise Assets

7.6 Perform Automated Vulnerability Scans of Externally-Exposed Enterprise Assets

7.7 Remediate Detected Vulnerabilities

Why is this control critical? Attackers are continuously scanning corporate networks from the outside, looking for vulnerabilities to exploit. Some of their many goals include compromising those networks to exfiltrate data or install ransomware, both of which can be profitable for them. They generally look for easy targets, companies with insecure practices. Don't let your company be an easy target. 

Information Security teams must continually scan their networks for vulnerabilities to remediate them before attackers find them. Attackers have the same access to vulnerability information that infosec pros do. They also have sophisticated tools to quickly exploit those vulnerabilities. We can't wait to remediate vulnerabilities until there's a convenient time. There never is one in the information technology and security fields. We are always juggling many competing priorities. And hackers don't work from 9 to 5. We must prioritize vulnerability management as the consequences of neglecting it are catastrophic. Companies that are victims of attackers have paid millions of dollars to ransomware gangs. The companies later paid tens or hundreds of millions of dollars to clean up their networks after attacks and pay claims in lawsuits from customers and shareholders. 

The clients I work with usually have some type of vulnerability scanning tool in place, such as Nessus, Qualys, Rapid7, or OpenVAS, scanning their internal and external networks. Sometimes they scan both the hosts and the applications, but usually just the hosts. Unfortunately, the vulnerability scans are often misconfigured, resulting in vulnerabilities that can go unremediated for months or years. 

During a recent gap assessment, the client I worked with had implemented Nessus Professional, which is a paid product. The employee who had set it up months before left the company shortly thereafter. Nevertheless, the information technology team continued to receive the weekly vulnerability scan reports showing that there were no vulnerabilities. In a conference room, the IT manager shared his screen to the large wall monitor and displayed the four most recent reports. None of them listed any vulnerabilities. The manager went back further, still no vulnerabilities. He was proud to show how secure their network was. Maybe...but every time I see vulnerability scans that consistently find no vulnerabilities, I know the scans are not running properly. 

I pointed out that, given the window between when vulnerabilities are disclosed and when they are patched, scans over the past several months should have found something. We looked a little closer, and it became very clear what was going on. The report simply showed open ports on systems: the scan the departed employee had set up was a port scan. Those results don't provide much useful information for identifying vulnerabilities, beyond flagging unsupported operating systems. We discussed the importance of credentialed vulnerability scans, and they reconfigured the scans accordingly. When we reviewed the scan results a few hours later, there were pages of vulnerabilities, many several years old. The organization was not as secure as the manager thought, but it was heading in the right direction. 
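To make the distinction concrete, here is a minimal, purely illustrative Python sketch of what a bare port scan actually learns about a host: whether a TCP port accepts a connection, and nothing else. There is no patch-level, registry, or package information in the result, which is why a port scan can run for months and report "no vulnerabilities."

```python
import socket

def port_scan(host: str, ports: range, timeout: float = 0.5) -> list[int]:
    """Report which TCP ports accept a connection -- and nothing more."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            # connect_ex returns 0 when the TCP connect succeeds
            if s.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports

# A port scan tells you a service is listening on, say, 22 or 443, but nothing
# about missing patches or misconfigurations -- that information requires a
# credentialed (authenticated) scan that logs in to the host.
print(port_scan("127.0.0.1", range(20, 100)))
```

A credentialed scanner, by contrast, authenticates to each host and inventories installed software and settings, which is where the real findings come from.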

Many clients that I've worked with have set up scans but are not scanning all of their systems. Or they are scanning systems without entering credentials into the scan configuration, which is necessary to conduct authenticated scans - deep scans in which the scanning tool can identify missing security and application patches as well as registry and configuration vulnerabilities. In one recent case, a client was conducting proper authenticated scans against its production network but not its User Acceptance Testing (UAT) environment. This is common, but what was different in this case is that their UAT systems were public facing: customers used them to test web applications before the company promoted the applications into Staging and later into Production. Since UAT and Production sat on the same subnet, an attacker who compromised the UAT environment could potentially pivot into Production. 
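A simple coverage check can catch the "scanning, but not scanning everything" problem. Below is an illustrative Python sketch that compares an asset inventory against the CIDR ranges configured as scan targets; the inventory and target ranges here are made up for the example:

```python
import ipaddress

def uncovered_assets(inventory: set[str], scan_targets: list[str]) -> set[str]:
    """Return inventory IPs that fall outside every configured scan range."""
    networks = [ipaddress.ip_network(t, strict=False) for t in scan_targets]
    return {
        ip for ip in inventory
        if not any(ipaddress.ip_address(ip) in net for net in networks)
    }

# Hypothetical inventory and scan config: the UAT subnet (10.0.2.x) is
# missing from the scan targets, so its hosts are flagged.
inventory = {"10.0.1.10", "10.0.1.11", "10.0.2.20"}
scan_targets = ["10.0.1.0/24"]  # production subnet only
print(uncovered_assets(inventory, scan_targets))  # → {'10.0.2.20'}
```

Running a check like this against an up-to-date asset inventory (Control 01 territory) is a cheap way to surface forgotten environments like the UAT subnet above.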

Finally, there are clients that are not scanning their systems at all. They don't think they need to. They don't have any public-facing systems. Or they do, but they believe no one would target the data hosted on those servers. That's not a good approach to protecting your network. Even if a company hosted only public data on its websites, a compromise of a web server would cause reputational damage at a minimum. From the web servers, attackers could make their way into the internal networks, where there is more desirable data to exfiltrate or encrypt for ransom. People often look at situations through a very narrow lens. It's important to look at the big picture, in both life and information security.  

All organizations must have a proper vulnerability management program in place. That means implementing automated vulnerability scanning and patch remediation processes. It also means regularly verifying that those automated scans are configured and running properly. This is critical to protecting your company and customer data from attackers. 
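One way to keep that verification honest is a periodic sanity check over scan results. Here is a minimal Python sketch assuming a simplified report structure; the field names are hypothetical, so map them to whatever your scanner's export or API actually provides:

```python
from datetime import datetime, timedelta

def scan_health_warnings(reports: list[dict], max_age_days: int = 7) -> list[str]:
    """Flag results suggesting the scanner, not the network, is the problem."""
    warnings = []
    now = datetime.now()
    for r in reports:
        if now - r["finished"] > timedelta(days=max_age_days):
            warnings.append(f"{r['name']}: last scan is stale")
        if not r["credentialed"]:
            warnings.append(f"{r['name']}: scan ran without credentials")
        if r["vuln_count"] == 0:
            warnings.append(f"{r['name']}: zero findings -- verify the scan config")
    return warnings

# Hypothetical sample data: the weekly external scan trips all three checks.
reports = [
    {"name": "weekly-external", "finished": datetime.now() - timedelta(days=30),
     "vuln_count": 0, "credentialed": False},
    {"name": "daily-internal", "finished": datetime.now(),
     "vuln_count": 12, "credentialed": True},
]
for warning in scan_health_warnings(reports):
    print(warning)
```

The "zero findings" check is the one that would have caught the port-scan story above months earlier: a scan that never finds anything is almost always a broken scan, not a clean network.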

Next month I discuss Center for Internet Security Control 08 - Audit Log Management.  
