Secure Coding mailing list archives

SANS Institute - CWE/SANS TOP 25 Most Dangerous Programming Errors


From: coley at linus.mitre.org (Steven M. Christey)
Date: Wed, 14 Jan 2009 14:45:03 -0500 (EST)


To all, I'll ask a more strategic question - assuming we're agreed that
the Top 25 is a non-optimal means to an end, what can the software
security community do better to raise awareness and see real-world change?

Gary and others who've broken into mainstream media - which aspects of your
message resonate with everyday people, and which don't?

I personally feel that we need to do a better job of educating the general
consumer, not just the ones who hire consultants.

On Wed, 14 Jan 2009, Gary McGraw wrote:

gem> Using bug parade lists for training leads to awareness but does not educate.
s> Yep - which is why we want universities to get cracking, and if the Top 25
s> helps to prod them on, then so be it.

gem> Good lord I hope that the CS curriculum does not embrace the bug parade.

I think what I meant was, if there is more awareness that "hey, security
is a problem and this Top 25 thing can help fix it, and the authors are
saying that education needs to catch up" - then that may develop into
indirect pressure on universities to make changes.

gem> Sadly, I have little faith that software security will work its way
gem> into the university curriculum in a meaningful way (and will be
gem> ecstatically psyched to be completely wrong).

I'm not going to say I'm overly optimistic either; maybe it's too much
wishful thinking on my end.

One ray of hope is that a lot of the press has reflected our statements
that security is not part of the curriculum at most universities.  Much of
the press has also reflected the apparently-surprising fact that lots of
hacking is possible because programmers make errors with security
consequences.

gem> Top ten lists mix levels.
s> Regarding Seven Pernicious Kingdoms - how does the Top 25 map to them?

gem> Good question.  You tell me!

I can't at this point - though I know it's not a clean mix.

gem> Automated tools can find bugs---let them.
s> Yes, and a lesson of the Top 25 (that we all already know) is that when
s> people start to apply it, they'll see how a tool won't be a silver bullet.

gem> A tool is so much superior to a list that I simply have to say... huh?!

Sorry, I should have been more specific.

1) To oversimplify, tools don't find design flaws, so people who rely
exclusively on tools for Top 25 compliance will realize that they're not a
silver bullet.  Or if they don't, maybe their auditors will.

2) Tools generate so many results that it's hard to prioritize which bugs
to fix first.  Top 25 will provide consumers with one cut at that.
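To make point 1 concrete (my illustration, not from the thread): a Top 25
implementation bug like SQL injection (CWE-89) is exactly the kind of
pattern static tools flag reliably, while a design flaw in the same code -
say, the function never checking whether the caller is authorized to read
the secret at all - is invisible to them. A minimal sketch:

```python
import sqlite3

# Throwaway in-memory database for the demo.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def lookup_vulnerable(name):
    # CWE-89: building the query by string concatenation lets input like
    # "' OR '1'='1" change the query's structure.  Tools flag this pattern.
    query = "SELECT secret FROM users WHERE name = '" + name + "'"
    return conn.execute(query).fetchall()

def lookup_fixed(name):
    # Parameterized query: the driver treats the input purely as data.
    # Note the design flaw a tool won't see: neither version asks whether
    # the caller should be allowed to read anyone's secret in the first place.
    return conn.execute(
        "SELECT secret FROM users WHERE name = ?", (name,)
    ).fetchall()

payload = "' OR '1'='1"
print(lookup_vulnerable(payload))  # every row comes back: injection succeeded
print(lookup_fixed(payload))       # [] : no user has that literal name
```

A scanner will happily report the concatenation in `lookup_vulnerable` and
stay silent on the missing authorization check, which is the gap auditors
(or attackers) end up finding.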

gem> When it comes to testing, security requirements are more important
gem> than vulnerability lists.

s> New York State has put up draft text that mentions the Top 25 as
s> part of a condition for acquisition.

gem> In my view this is not a helpful development.

I encourage people to look at the New York State contract text.  The Top
25 is a relatively small part of it.

In that context, I'm viewing it as "something of substance [having been
reviewed by diverse people] that's better than nothing."

Have people on this list written or used contract language that translates
(somehow) into real software security?  If the Top 25 isn't the right
answer, then what is?

- Steve
