Secure Coding mailing list archives

Tools: Evaluation Criteria


From: coley at linus.mitre.org (Steven M. Christey)
Date: Tue, 22 May 2007 12:52:46 -0400 (EDT)


On Tue, 22 May 2007, McGovern, James F (HTSC, IT) wrote:

> We will shortly be starting an evaluation of tools to assist in the
> secure coding practices initiative and have been wildly successful in
> finding lots of consultants who can assist us in evaluating but
> absolutely zero in terms of finding RFI/RFPs of others who have
> travelled this path before us. Would especially love to understand
> stretch goals that we should be looking for beyond simple stuff like
> finding buffer overflows in C, OWASP checklists, etc.

semi-spam: With over 600 nodes in draft 6, the Common Weakness Enumeration
(CWE) at http://cwe.mitre.org is the most comprehensive list of
vulnerability issues out there, and it's not just implementation bugs.
That might help you find other areas you want to test.  In addition, many
code analysis tool vendors are participating in CWE.

> In my travels, it "feels" as if folks are simply choosing tools in this
> space because they are the market leader, incumbent vendor or simply
> asking an industry analyst but none seem to have any "deep" criteria. I
> guess at some level, choosing any tool will move the needle, but
> investments really should be longer term.

Preliminary CWE analyses have shown a lot less overlap across the tools
than expected, so even in terms of which vulnerabilities are tested,
this is an important consideration.

You might also want to check out the SAMATE project (samate.nist.gov),
which is working towards evaluation and understanding of tools, although
it's a multi-year program.

Finally, Network Computing did a tool comparison:


http://www.networkcomputing.com/article/printFullArticle.jhtml?articleID=198900460

- Steve

