Ounjai, J., Wüstholz, V., & Christakis, M. (2023). Green Fuzzer Benchmarking. In <i>ISSTA 2023: Proceedings of the 32nd ACM SIGSOFT International Symposium on Software Testing and Analysis</i> (pp. 1396–1406). Association for Computing Machinery. https://doi.org/10.1145/3597926.3598144
-
dc.identifier.uri
http://hdl.handle.net/20.500.12708/188021
-
dc.description.abstract
Over the last decade, fuzzing has been increasingly gaining traction due to its effectiveness in finding bugs. Nevertheless, fuzzer evaluations have been challenging during this time, mainly due to a lack of standardized benchmarking. Aiming to alleviate this issue, in 2020, Google released FuzzBench, an open-source benchmarking platform that is widely used for accurate fuzzer benchmarking.
However, a typical FuzzBench experiment takes CPU years to run. If we additionally consider that actively developed fuzzers evaluate any changes empirically, benchmarking becomes prohibitive both in terms of computational resources and time. In this paper, we propose GreenBench, a greener benchmarking platform that, compared to FuzzBench, significantly speeds up fuzzer evaluations while maintaining very high accuracy.
In contrast to FuzzBench, GreenBench drastically increases the number of benchmarks while drastically decreasing the duration of fuzzing campaigns. As a result, the fuzzer rankings generated by GreenBench are almost as accurate as those by FuzzBench (with very high correlation), but GreenBench is 18 to 61 times faster. We discuss the implications of these findings for the fuzzing community.
en
dc.description.sponsorship
European Commission
-
dc.language.iso
en
-
dc.rights.uri
http://creativecommons.org/licenses/by/4.0/
-
dc.subject
fuzzing
en
dc.subject
testing
en
dc.subject
benchmarking
en
dc.title
Green Fuzzer Benchmarking
en
dc.type
Inproceedings
en
dc.type
Konferenzbeitrag
de
dc.rights.license
Creative Commons Namensnennung 4.0 International
de
dc.rights.license
Creative Commons Attribution 4.0 International
en
dc.contributor.affiliation
Max Planck Institute for Software Systems, Germany
-
dc.contributor.affiliation
ConsenSys, Austria
-
dc.relation.isbn
979-8-4007-0221-1
-
dc.relation.doi
10.1145/3597926
-
dc.description.startpage
1396
-
dc.description.endpage
1406
-
dc.relation.grantno
101076510
-
dc.type.category
Full-Paper Contribution
-
tuw.booktitle
ISSTA 2023: Proceedings of the 32nd ACM SIGSOFT International Symposium on Software Testing and Analysis