Bugtraq mailing list archives

RE: Oracle - the last word


From: Iggy E <iggy_e () yahoo com>
Date: Fri, 12 May 2006 20:00:13 -0700 (PDT)


I politely disagree... if there are no measurements then there can be
no metrics (or is that the other way around? :-) There has to be a
starting point somewhere; e.g., in your examples, David's time could
be recorded to the hour, and the researcher/analyst could even be
given a rating to compensate for skill differences.
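
For instance, a minimal sketch in Python (the hours, skill scale, and
names below are made up purely to illustrate) of recording the effort
per bug and normalizing it by a skill rating:

def normalized_effort(hours_spent, skill_rating):
    # skill_rating: 1.0 = baseline analyst, 2.0 = roughly twice as
    # effective, 0.5 = half as effective.  Multiplying converts raw
    # hours into "baseline-analyst-equivalent" hours so that skill
    # differences don't distort the comparison.
    return hours_spent * skill_rating

# Hypothetical log entries: (researcher, hours to find one bug, rating)
findings = [
    ("expert",  8.0, 2.0),   # a working day by a seasoned researcher
    ("novice", 40.0, 0.5),   # a full week by someone new to the product
]

for who, hours, skill in findings:
    print(who, hours, "h ->", normalized_effort(hours, skill),
          "baseline-equivalent hours")

In baseline terms the two bugs took comparable effort (16 vs. 20
hours), even though the raw wall-clock times differ by a factor of
five.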

The suggestions/ideas put forth in this thread are very interesting,
IMO. Rather than rating a software package solely on the number of
vulnerabilities found, it would be more accurate to also include the
time the vendor takes to patch a vulnerability and the time the
vendor takes to respond to a vulnerability report. These two factors
could each get a weighted rating and be combined with the "# of
vulnerabilities" rating, which would give a more accurate assessment
of "how safe is software X?"
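
As a rough sketch of how that combination might look in Python (the
weights and figures are assumptions for illustration, not an agreed
standard):

def risk_score(num_vulns, avg_days_to_patch, avg_days_to_respond,
               w_vulns=0.5, w_patch=0.3, w_respond=0.2):
    # Lower is better.  Each factor is weighted and summed; in practice
    # the inputs would need normalizing to comparable scales first.
    return (w_vulns * num_vulns
            + w_patch * avg_days_to_patch
            + w_respond * avg_days_to_respond)

# Hypothetical packages: A has more reported vulnerabilities but
# patches and responds far faster than B.
print("A:", risk_score(num_vulns=30, avg_days_to_patch=14,
                       avg_days_to_respond=2))    # 19.6
print("B:", risk_score(num_vulns=12, avg_days_to_patch=180,
                       avg_days_to_respond=30))   # 66.0

Even with crude weights, the slow-to-patch vendor comes out worse
despite reporting fewer vulnerabilities, which is exactly the point
of folding response and patch times into the rating.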

I can think of Windows vs. Linux, and IE vs. Firefox, as perfect
examples. Microsoft and its supporters will cite the number (and
perhaps the criticality) of vulnerabilities while not taking the
patch and response times into account.

Of course, this leads to other possible factors, such as a vendor's
patch delivery mechanism, but we shouldn't refuse to take on the task
just because there isn't an immediately clear solution.

Stephen Evans, CISSP


--- Lee Kelly <robert.kelly () verizonbusiness com> wrote:

Actually, I would think this information would be only as good as the
person doing the testing, and in fact it may lead to a false
timeline. To continue using Mr. Litchfield's example, consider the
following:

- The bugs (regardless of number) found in a day could have been
blatantly obvious;
- The bugs that took two weeks to find may have been more technically
obscure, or it may be that Mr. Litchfield had other things to do
rather than spend all his time looking for bugs;
- From this, and previous postings, I am going to take for granted
that Mr. Litchfield is an Oracle expert, although we have never met
to my knowledge. That being said, how long would it take a novice (or
someone less skilled) to find these same bugs? I think even Mr.
Litchfield would agree that there are malicious people out there just
as expert as he is regarding Oracle products, maybe even more so.
- Level of effort also has to take into account when the research
started versus when the application/patch/upgrade was released. For
example, let's say that 10gR2 was released on April 1st (I don't
actually know, I'm just picking a date) and Mr. Litchfield was on
vacation or travel until April 8th. If it then took him two weeks to
find these bugs, the 'bad guys' would have had a week's head start
over his research. I understand that more people than Mr. Litchfield
are doing this research, but this would need to be factored into the
equation.

All this being said -- I am not taking the position that this
information would not be 'interesting', but I don't think it would
"provide a more concrete answer to the question 'how secure is
software X?'"

Thank You,
 
Lee Kelly, CISSP

-----Original Message-----
From: Steven M. Christey [mailto:coley () mitre org] 
Sent: Wednesday, May 10, 2006 6:29 PM
To: davidl () ngssoftware com
Cc: bugtraq () securityfocus com
Subject: Re: Oracle - the last word


David Litchfield said:

When Oracle 10g Release 1 was released you could spend a day looking
for bugs and find thirty. When 10g Release 2 was released I had to
spend two weeks looking to find the same number.

This increasing level of effort is likely happening for other major
widely audited software products, too. It would be a very useful data
point if researchers could publicly quantify how much time and effort
they needed to find the issues (note: this is not my idea, it came
out of various other discussions.) Level of effort might provide a
more concrete answer to the question "how secure is software X?"
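
(A back-of-the-envelope illustration in Python, using the thirty-bugs
figures above and assuming "two weeks" means ten working days, an
assumption made only for this example:

def bugs_per_researcher_day(bugs_found, researcher_days):
    # Effort-normalized find rate: higher means bugs were easier to find.
    return bugs_found / researcher_days

print("10g Release 1:", bugs_per_researcher_day(30, 1), "bugs/day")   # 30.0
print("10g Release 2:", bugs_per_researcher_day(30, 10), "bugs/day")  # 3.0

A ten-fold drop in the find rate says something that a raw count of
thirty bugs per release does not.)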

Some researchers might not want to publicize this kind of
information, but this would be one great way to help us move away
from the primitive practice of counting the number of reported
vulnerabilities. (and while I'm talking about quantifying researcher
effort, it might be highly illustrative to measure how much time is
spent in dealing with vendors during disclosure.)

- Steve





