The decline of DAST analysis



Many organizations collect data from a multitude of sources, such as their users, and store it in data lakes. Nowadays, data is everything, and so is the ability to process large amounts of it in a short time. 

That data is almost always consumed through APIs. APIs hold all of an application's components together and expose them as a single unit to any type of client, such as a user-facing front end or another service. 

Of course, APIs did not arrive alone. The traditional monolithic architecture has given way to alternatives such as microservices, and today everything is modular. Microservices make an application modular, which brings better scalability, greater flexibility, and a shorter time to market. 

Microservices combined with APIs are widely considered the future of software development, so much so that OWASP launched an API-focused version of its well-known OWASP Top 10 in 2019. Application security must therefore adapt to this new way of building software and stay up to date. 

In the last decade, application security has also evolved in the form of various automated and specialized security tests. However, this evolution of security controls seems to be lagging behind, as is the case with dynamic application security testing (DAST).


DAST, a cool guy 

DAST analysis emerged as a "fairly" automated black-box tool for finding specific vulnerabilities in web applications at a time when almost the only alternatives were SAST analysis and manual pentesting. DAST tools arrived with the promise of helping the pentester and filling some of the gaps left by SAST tools, such as high false-positive rates and long scan times. 

The execution process is very simple: driven by predefined rule sets, the tool sends HTTP requests to the web application and checks the responses for certain text strings that indicate a vulnerability. Simply put, DAST analysis attempts to simulate a pentester. 
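
To make this concrete, here is a minimal sketch of that request-and-match loop. It is illustrative only: the rule set is tiny, the evidence strings are simplistic, and the target URL and parameter name are hypothetical.

```python
import requests

# A toy "rule set": each rule pairs a probe payload with response
# strings whose presence suggests a vulnerability. Real DAST rule
# sets are far larger and more nuanced; these entries are illustrative.
RULES = [
    {
        "name": "Reflected XSS probe",
        "payload": "<script>alert(1)</script>",
        "evidence": ["<script>alert(1)</script>"],
    },
    {
        "name": "SQL error disclosure",
        "payload": "' OR '1'='1",
        "evidence": ["SQL syntax", "sqlite3.OperationalError", "ORA-"],
    },
]

def scan_parameter(base_url: str, param: str) -> list[str]:
    """Send each probe and flag responses containing evidence strings."""
    findings = []
    for rule in RULES:
        resp = requests.get(base_url, params={param: rule["payload"]}, timeout=10)
        if any(marker in resp.text for marker in rule["evidence"]):
            findings.append(f"{rule['name']} via parameter '{param}'")
    return findings

# Hypothetical target; any URL with a query parameter would do.
print(scan_parameter("https://example.test/search", "q"))
```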

Needless to say, there are a couple of main drawbacks:  

  • False positives. Every scan produces findings that require review afterwards, the so-called triage. Triage often requires the expert eye of a cybersecurity analyst and, even for an expert, takes precious time that can end up slowing down the post-scan steps. 
  • Time. Scans usually take 30 minutes to 2 hours to run, and sometimes days, depending on the size of the application. Run time also depends on how good the configuration is: the better the configuration, the less time the scan takes. But producing a good pre-scan configuration is not easy, and most of the time it also requires a cybersecurity expert. 


DAST is far behind software development 

The way of developing software has changed. DAST is now often integrated into the CI/CD pipeline and, in CI/CD environments, agility and speed are key. Any build and deployment should take no more than a couple of minutes. 

This does not happen with DAST analyses. As we have already mentioned, DAST scans take time to run, time to configure, and time to review. And these are not the only obstacles to agility and speed: 

  • Discovery and crawling. One of the main features of DAST tools is their ability to search and navigate almost every nook and cranny of the application during scanning. By applying heuristic rules to rewrite URLs and follow links, DAST tools can detect and crawl numerous subdomains and sections of web applications. Another version of this discovery process is to use proxies and collect all the endpoints to scan. But today, as security shifts left in the SDLC, these features can be considered a liability because they take time. Fortunately, over the years most DAST tools have added the ability to specify, in various formats, exactly which application endpoints to scan (see the sketch after this list). 
  • Developers use DAST. Under the new DevSecOps paradigm, developers operate some of the tools present in the pipelines on their own, including security tools, with the aim of making the software development process more agile. If developers themselves can review the findings of a DAST analysis, this should in principle speed up the entire development process. In theory. In practice, developers cannot always tell a false positive from a real finding, or they need more time to do so than a security expert would. This dramatically reduces DevSecOps teams' tolerance for false positives and undermines confidence in security controls. 
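
As an illustration of that endpoint-feeding approach, the following sketch pulls the method/path pairs out of an OpenAPI 3.x document so they can be handed to a scanner instead of relying on crawling. The file name is a placeholder for whatever specification your pipeline produces.

```python
import json

def endpoints_from_openapi(spec_path: str) -> list[tuple[str, str]]:
    """Collect (method, path) pairs from an OpenAPI 3.x document."""
    with open(spec_path) as f:
        spec = json.load(f)
    pairs = []
    for path, operations in spec.get("paths", {}).items():
        for method in ("get", "post", "put", "patch", "delete"):
            if method in operations:
                pairs.append((method.upper(), path))
    return pairs

# 'openapi.json' is a placeholder for whatever spec your pipeline exports.
for method, path in endpoints_from_openapi("openapi.json"):
    print(method, path)
```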

Finally, the software itself has also changed. As we said at the beginning, APIs and microservices are the present and the future, and DAST is not well suited to them: 

  • DAST does not identify dynamically generated front-end content, which is very common nowadays due to the widespread use of JavaScript frameworks and libraries such as Angular, React, Next.js, and jQuery. 
  • DAST does not detect some common types of API vulnerabilities, such as IDOR/BOLA, because doing so requires context from the application's business logic, such as user roles and privileges (see the sketch after this list). 
  • DAST also has difficulty getting past certain protection walls, such as anti-CSRF tokens and the authentication/authorization mechanisms typical of APIs, such as OAuth2, SSO, and multi-factor authentication. Although some of these barriers can be overcome, doing so increases the time needed to prepare the scan, and each application needs its own custom configuration. 
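
To see why IDOR/BOLA detection needs business-logic context, consider this hypothetical check. A generic scanner has no idea that an order belongs to one user and not another; the test only works because we supply two authenticated identities ourselves. The tokens and URL below are placeholders.

```python
import requests

# Hypothetical setup: two authenticated users of the same application.
# A generic DAST scanner has no notion of "Alice's order" vs "Bob's order",
# which is exactly the context a BOLA/IDOR test needs.
ALICE = {"Authorization": "Bearer <alice-token>"}   # placeholder token
BOB = {"Authorization": "Bearer <bob-token>"}       # placeholder token

def check_bola(resource_url: str) -> bool:
    """Return True if Bob can read a resource that belongs to Alice."""
    owner = requests.get(resource_url, headers=ALICE, timeout=10)
    intruder = requests.get(resource_url, headers=BOB, timeout=10)
    # If the owner can read it AND an unrelated user gets the same 200,
    # object-level authorization is likely missing.
    return owner.status_code == 200 and intruder.status_code == 200

# Example: an order that only Alice should be able to see.
if check_bola("https://api.example.test/orders/1001"):
    print("Possible BOLA: another user's resource is readable")
```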


How to use DAST in 2023 and not die trying 

At this point, it is quite tempting to think that DAST is useless, but it is not. Many of the above deficiencies can be overcome by using DAST in other ways: 

  • Reuse DAST to find easy results. Some vulnerabilities are quick and easy for any DAST tool to find and have a reasonably low false-positive rate. Some examples are insecure or missing HTTP headers, cross-site scripting, and even some types of SQL injection. 
  • Test specific security requirements. If a catalog of security requirements exists, DAST analysis could be used to run a very specific set of tests to verify those high-value requirements across many applications. 
  • Create configuration templates beforehand. As we have already mentioned, the better the configuration, the less time it will take to execute. It would be a good idea to invest time in preparing configurations that can be used to scan multiple applications with similar features or architecture. By doing this, with just one well-done configuration, execution time and false positives would be greatly reduced in future scans. 
  • Avoid full scans. Scanning the entire application can take a long time, while each step in the CI/CD pipeline should last only a few seconds or minutes. Instead, limit the scope of the scan to the latest changes made to the application. 
  • Feed the DAST with the exact API routes to scan. If your tool supports it (recommended), test only the API endpoints you want to analyze, such as those with changes. This allows 100% scan coverage to be reached gradually without slowing down the CI/CD process. 
  • Run the DAST asynchronously. If the scan starts in a phase of the CI/CD cycle and is going to take a while, a good option is simply to launch it without waiting for it to finish. Later, when it completes, the responsible team can review the findings or perform triage (a minimal sketch combining the last two ideas follows this list). 
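
Here is a minimal sketch combining those last two ideas: scan only the endpoints that changed, and do it in the background so the pipeline keeps moving. Note that dast-cli and its flags are stand-ins, not a real scanner's interface.

```python
import subprocess

# Endpoints touched by the latest commit; in a real pipeline this list
# would come from an OpenAPI diff or the changed route handlers.
changed_endpoints = ["POST /orders", "GET /orders/{id}"]

# 'dast-cli' is a stand-in for whatever scanner your pipeline uses;
# the flags shown are illustrative, not a real tool's interface.
cmd = ["dast-cli", "scan", "--targets", ",".join(changed_endpoints),
       "--report", "scan-report.json"]

# Popen returns immediately, so the pipeline moves on while the scan
# runs; the responsible team reviews scan-report.json once it exists.
subprocess.Popen(cmd)
print("Scan started in the background; pipeline continues.")
```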

Apart from this, DAST remains a genuinely useful tool for any pentester, since it can fuzz a large number of input parameters across numerous applications in a short time, using predefined rule sets that uncover many types of vulnerabilities, such as injections and misconfigurations. 
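
For the pentesting use case, that fuzzing boils down to something like the loop below: every payload from a wordlist is thrown at every parameter, and reflected payloads are reported. The payload list, URL, and parameter names are illustrative.

```python
import requests

# A tiny payload list; real fuzzing wordlists contain thousands of entries.
PAYLOADS = ["'", "\"><svg onload=alert(1)>", "../../etc/passwd", "{{7*7}}"]

def fuzz(url: str, params: list[str]) -> None:
    """Throw every payload at every parameter and report reflections."""
    for param in params:
        for payload in PAYLOADS:
            resp = requests.get(url, params={param: payload}, timeout=10)
            if payload in resp.text:
                print(f"[!] {param} reflects {payload!r} (status {resp.status_code})")

# Hypothetical endpoint and parameter names.
fuzz("https://example.test/search", ["q", "page", "sort"])
```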


What a DAST tool should have to perform security testing on APIs 

When evaluating DAST or similar tools for API security testing, it can be difficult to know which tool is the best option, so below are some criteria to use: 

  • Easy to integrate into a CI/CD pipeline 
  • Allows you to choose what type of application to scan: API or web with front-end 
  • Supports several API specification formats to specify the exact API paths to scan: Postman collections, OpenAPI/Swagger, GraphQL introspection, WADL, WSDL, etc. 
  • Allows you to select the specific type of API to test: REST, GraphQL, SOAP 
  • Provides the ability to define pre- and post-scan scripts to build highly precise configurations capable of detecting business logic vulnerabilities (a typical pre-scan script is sketched below) 
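
As an example of such a pre-scan script, the sketch below obtains an OAuth2 access token with the client-credentials grant so the scanner can authenticate its requests. The token endpoint and client credentials are placeholders.

```python
import requests

# A minimal pre-scan script: obtain an OAuth2 access token with the
# client-credentials grant and hand it to the scanner's configuration.
# The URL, client_id, and client_secret below are placeholders.
def fetch_token() -> str:
    resp = requests.post(
        "https://auth.example.test/oauth/token",
        data={
            "grant_type": "client_credentials",
            "client_id": "dast-scanner",
            "client_secret": "<secret>",
        },
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

# The scanner would then send this header with every request,
# letting it get past the API's authentication wall.
auth_header = {"Authorization": f"Bearer {fetch_token()}"}
```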


In summary 

The way software is developed has changed, as has the software itself. Agility and speed are now key features of any SDLC thanks to the benefits of using CI/CD pipelines. APIs have become the core of any new software component, so the generation of secure applications depends on the absence of vulnerabilities in the APIs underneath. 

APIs need to be tested and secured at a very fast pace and, although DAST analysis is not ideal for that, it is possible to modify the configuration of the scans and the way they are integrated into the pipelines to analyze APIs better and faster. 


AI: a new hope 

This post cannot end without mentioning artificial intelligence. 

The truth is that any DAST solution could leverage AI to solve many of the drawbacks mentioned in this article, and then some. For example, it could be used to improve the discovery and crawling process through intelligent URL rewriting, avoid duplicate or redundant requests, reduce false positives, and detect complex business logic vulnerabilities. 

Could AI be the fire that revives the DAST phoenix? 


Ernesto Rubio, Cybersecurity Consultant
