There's no comprehensive paper I can find on the history of all the big bug classes. I mean, there's probably not even a real definition of "bug class" that would survive a drunken conversation between two CTF teams. But like, screw it: "format strings" are a bug class. "Deserialization bugs" are a bug class. "strcpy and friends" is a bug class. You know what I mean if you're the type of person who subscribed to this list.

One thing you know if you internally describe yourself as "hacker" and not "security professional" is that the bug classes of the future are here now, but the commercial world doesn't see them yet or sees them as one-off issues.

For example, check out this amazing slide from a GoSecure presentation on the history of deserialization (https://gosecure.github.io/presentations/2019-04-29_atlseccon/History_of_Deserialization_v2.2.pdf):

[slide: timeline of the deserialization bug class, from first public instances to mass exploitation]

If you can't see the picture: it was nine years from "first public instances" to the bell curve of mass public use of the bug class. The nightmare for the commercial industry, of course, is that this ignores private research, which, as you all know, looks like a much bigger bell curve superimposed on the whole thing.

If you've been working on writing automated tools to find bugs, then you also know the hardest thing to find is a new bug class. We can't all be Rain Forest Puppy! But even building something that replicates how a hacker thinks when they just want to find Deserialization or strcpy bugs is hard.
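
To see what I mean, here's the naive "tool" version of that hacker instinct: a lexical scanner, roughly what ITS4 and RATS were doing twenty years ago. Everything below is a sketch I made up for this email, not anyone's shipping product:

    #include <stdio.h>
    #include <string.h>

    int main(int argc, char **argv) {
        if (argc < 2) {
            fprintf(stderr, "usage: %s file.c\n", argv[0]);
            return 1;
        }
        FILE *f = fopen(argv[1], "r");
        if (!f) {
            perror("fopen");
            return 1;
        }

        /* "strcpy and friends": flag any line that mentions them.
         * (This also flags fgets() because it contains "gets(",
         * and false positives are the genre.) */
        const char *bad[] = { "strcpy(", "strcat(", "sprintf(", "gets(" };
        char line[4096];
        for (int n = 1; fgets(line, sizeof line, f); n++)
            for (size_t i = 0; i < sizeof bad / sizeof bad[0]; i++)
                if (strstr(line, bad[i])) {
                    printf("%s:%d: %s", argv[1], n, line);
                    break;
                }
        fclose(f);
        return 0;
    }

And it "works", in the sense that it finds lines. It does not find bugs.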

Like, what does it mean to even understand a bug? Bugs are not on one line! They are not even one concept! Can you inject a bug into a C program without also injecting another bug you never intended to inject?
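
Here's my answer to that last question, as a toy. The struct and the function are invented for this email; the point is general:

    #include <string.h>

    struct user {
        char name[16];    /* the size decision lives here... */
        int  is_admin;
    };

    void set_name(struct user *u, const char *input) {
        /* ...and the "strcpy bug" lives here, which in real code is a
         * whole file away. The bug you meant to inject: overflow of
         * u->name when strlen(input) >= 16. The bug you didn't: on a
         * typical struct layout the overflow writes straight into
         * is_admin, so your memory-safety bug is also a
         * privilege-escalation bug. */
        strcpy(u->name, input);
    }

One strcpy, two declarations, at least two bugs, and no single line you can point at and say "there it is."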

Even though humans are terrible at logic, we somehow think that stepping logically through a problem is how reasoning works. You have to get over this! Stare at an ant farm and chant the neo-Buddhist mantra "This is reasoning. This is cognition," and you will quickly realize that precision is not the answer when you try to go from building a calculating machine to building a reasoning machine. Precision and reasoning are opposites! In other words, the closest thing we have to reasoning over program behavior is fuzzing. Once you understand that, a lot of other stuff starts to make sense.
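
If "fuzzing is reasoning" sounds too cute, here's the entire epistemology as a toy mutational fuzzer against a toy target, both invented here (think of it as AFL with everything clever removed). No model of the input, no logic, just probes and observed behavior:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <time.h>

    /* Toy target: simulates a crash (via abort) exactly when the input
     * would overflow a 16-byte name field, i.e., a planted strcpy bug. */
    static void target(const char *input) {
        char name[16];
        if (strlen(input) >= sizeof name)
            abort();              /* stands in for the real overflow crash */
        strcpy(name, input);      /* safe here: length was checked above */
    }

    int main(void) {
        char buf[64];
        srand((unsigned)time(NULL));
        for (unsigned long iter = 0; ; iter++) {
            /* no grammar, no proof: a random-length, random-content probe */
            size_t len = (size_t)(rand() % (sizeof buf - 1));
            for (size_t i = 0; i < len; i++)
                buf[i] = (char)('a' + rand() % 26);
            buf[len] = '\0';
            fprintf(stderr, "iteration %lu, len %zu\n", iter, len);
            target(buf);          /* an abort here is knowledge gained */
        }
    }

No theorem prover ever touches the program, and after a handful of executions you still know something true about it that you didn't know before. That's reasoning, ant-farm style.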

-dave