The SATE VI Report is out!


Delaitre, Aurelien M. (Assoc)

Jun 28, 2023, 11:53:04 AM
to samate-...@list.nist.gov

Dear Software Assurance Community,


After a long delay, the SATE VI report has finally been published!


SATE VI focused on testing static analysis tools on datasets containing existing and injected bugs. The report provides insights on creating and using such datasets and a methodology to assess static analysis tools for the end-user’s specific needs. The report also contains a section dedicated to sound static code analysis.


We deeply thank the SATE VI participants, some of whom have taken part in SATE since 2008. We recognize and appreciate their contributions to the ongoing effort to improve software assurance.


Publication link: https://www.nist.gov/publications/sate-vi-report-bug-injection-and-collection


Abstract:


The Static Analysis Tool Exposition (SATE) VI report presents the results of a security-focused bug-finding evaluation exercise carried out from 2018 to 2023 on various code bases using static analysis tools. Existing bugs were extracted from bug tracker reports and the National Vulnerability Database (NVD), and additional bugs were injected using automated tools and manual analysis. The results showed significant variability in tool effectiveness, depending on the test cases, bug classes, and bug complexity involved. The report discusses the shortcomings and difficulties encountered during the bug injection process, which marginally impeded the efficiency of the evaluation.


The report emphasizes the correlation between high code complexity and tool difficulty in identifying bugs. Recall and discrimination rates were lower for the convoluted C Track than for the considerably less complex Java Track. Across all languages and code bases, tools found bugs with lower complexity more readily than bugs with higher complexity. Finding rates varied across bug classes, in line with the inherent complexity of each class (e.g., recall for simpler initialization errors was greater than for more intricate buffer errors).
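For readers unfamiliar with these metrics, the following is a minimal sketch of how recall and a SATE-style discrimination rate might be computed for one tool on one test case. The exact SATE VI definitions and counts are in the report itself; the function names and the illustrative numbers below are assumptions, not values from the study.

```python
# Hedged sketch of two tool-evaluation metrics (illustrative, not the
# report's exact methodology or data).

def recall(true_positives: int, false_negatives: int) -> float:
    """Fraction of known bugs the tool reported."""
    known_bugs = true_positives + false_negatives
    return true_positives / known_bugs if known_bugs else 0.0

def discrimination_rate(discriminated: int, reported: int) -> float:
    """Fraction of reported bugs the tool 'discriminated': one common
    definition is that the tool flags the bug in the buggy version of
    the code but stays silent on the fixed version."""
    return discriminated / reported if reported else 0.0

# Illustrative counts only (not taken from SATE VI):
print(recall(30, 70))               # found 30 of 100 known bugs -> 0.3
print(discrimination_rate(12, 30))  # 12 of 30 findings discriminated -> 0.4
```

Under this reading, discrimination is a stricter signal than recall: a warning that appears in both the buggy and fixed versions is likely noise rather than a genuine detection.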


The report discusses the shortcomings of the bug injection process. Regardless of the test case, tools did not find injected bugs at the same rate as existing bugs, implying that the quality of injected bugs needs to improve.


The report also includes a summary of the Ockham Sound Analysis Criteria track, which focused on tools that do not report false positives or false negatives.


The SATE VI report concludes that static analysis is a useful technique to find real security bugs in large code bases. The right set of tools, used properly, can help increase code quality and security. Potential users should test a tool or set of tools on their own code base before using them in production. The metrics presented in SATE VI are suitable for assessing tool fitness for such a use case.


Keywords: Static Analysis; Cybersecurity; Bug Injection; Software Vulnerability; Software Assurance

