So one thing I have as a "lessons learned" from the past 20 years is that
security is not a proactive sport. In fact, we are all experts at running
to where the ball _was_ as opposed to where it is _going_.
Like, if you listen to Risky Biz this week, Patrick asks Metlstorm whether
it's time to go out and proactively replace all the old enterprise file
sharing systems organizations have lying around. And the answer, from Metl, who's hacked into
every org in Oceania for the past 20 years, is "yeah, this is generating
huge return on investment for the ransomware crews so they're just going to
keep doing it, and being proactive might be a great idea." But what he
didn't say, but clearly had in his head was "but lol, nobody is going to
actually do that. So good luck out there chooms!"
At some level, STIX and TAXII and the whole CTI market are about passing
around information on what someone _might_ have used to hack something, at
some point in the _distant past_. It's a paleontology of hackers past - XML
schemas about huge ancient reptiles swimming in the tropical seas of
your networks, the taxonomies of extinct orders we now know only through a
delicate finger-like flipper bone or a clever piece of shellcode.
So my first thought is that performance measurement tools seem exactly
aimed at a lot of security problems but performance people are extremely
reluctant <https://aus.social/@brendangregg/110276319669838295> to admit
that because of the drama involved in the security market. Which is very
smart of them! :)
Secondly, I wanted to re-link to Halvar's QCon keynote.
He has a section on the difficulty of getting good performance
benchmarks, which you would typically do as part of your build chain. In
theory, you have a lot of compilation flags you can twiddle: you change
their values, compile your program, and get a number for how fast it is.
But this turns out to be basically impossible in the real world, for
reasons I'll let him explain in his presentation (see below).
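The naive version of that loop is easy to sketch. Here's a toy illustration in Python, where a plain function stands in for executing a compiled binary (in a real build chain you'd compile with one flag set and exec the result); the variant names and workloads are invented. The "spread" figure is the point: even on one machine, repeated runs of the same code disagree, which is exactly the noise problem Halvar describes.

```python
import time
import statistics

def time_variant(run, repeats=7):
    """Time one 'compiled variant' several times and summarize.

    `run` stands in for executing a binary built with one particular
    set of compilation flags.
    """
    samples = []
    for _ in range(repeats):
        start = time.perf_counter()
        run()
        samples.append(time.perf_counter() - start)
    return {"median": statistics.median(samples),
            "spread": max(samples) - min(samples)}  # run-to-run noise

# Two toy "variants" of the same workload (stand-ins for two flag sets).
variant_a = lambda: sum(i * i for i in range(50_000))
variant_b = lambda: sum(map(lambda i: i * i, range(50_000)))

stats = {name: time_variant(fn)
         for name, fn in [("a", variant_a), ("b", variant_b)]}
```

On shared cloud hardware, with caches, frequency scaling, and noisy neighbors, that spread routinely swamps the difference you were trying to measure.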
A lot of these problems with performance seem only solvable by a continuous
process of evolutionary algorithms - where you have a population of
different compilation variables, and you probably introduce new ones over
time, and you kill off the cloud VMs where you're getting terrible
performance under real-world situations and let the ones getting good or
average performance thrive.
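That kill-the-slow-half loop can be sketched in a few lines. This is a toy model, not anyone's production system: the flag pool, the mutation step, and especially the `fitness` function (a deterministic stand-in for observed real-world performance of a live VM) are all invented for illustration.

```python
import random

# Invented pool of compilation variables a "VM" might carry.
FLAG_POOL = ["-O2", "-O3", "-flto", "-funroll-loops", "-fomit-frame-pointer"]

def fitness(flags):
    # Stand-in for measured real-world performance of a VM built with
    # these flags; a real system would observe live traffic instead.
    rng = random.Random(hash(tuple(sorted(flags))))
    return len(set(flags)) + rng.random()

def mutate(flags):
    # Introduce new variables over time: randomly add or drop one flag.
    flags = set(flags)
    if flags and random.random() < 0.5:
        flags.discard(random.choice(sorted(flags)))
    else:
        flags.add(random.choice(FLAG_POOL))
    return sorted(flags)

def evolve(pop_size=8, generations=20):
    population = [mutate(["-O2"]) for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=fitness, reverse=True)
        survivors = scored[: pop_size // 2]        # kill the slow half
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(pop_size - len(survivors))]
    return max(population, key=fitness)
```

The continuous part is the whole trick: fitness here is whatever the VM experiences under real-world load, so the population keeps adapting as the workload drifts.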
I'm sure this is being done, and probably if I listened to more of Dino
Dai Zovi's talks I'd know where and how, but aside from having performance
implications, it also has security implications because it will tend
towards offering offensive implants value for becoming less parasitic and