IDS mailing list archives

Re: OSEC [WAS: Re: Intrusion Prevention]


From: "Marcus J. Ranum" <mjr () ranum com>
Date: Mon, 30 Dec 2002 23:34:13 -0500

Greg Shipley writes:
> However, with that said, we've noticed over time that a) various industry
> "certification" efforts were watered down, b) many of those efforts were
> more or less irrelevant, and c) there was a definite need in the industry
> for a trusted 3rd party to validate vendor claims with REAL testing.

Testing is a fundamental problem with all products, and always
has been. What customers want is someone to tell them what
sucks and what doesn't - while still providing enough facts
that they can have at least a minimal understanding of what is
going on. As you know, establishing a truly deep understanding
requires a huge investment in time - more than virtually anyone
is willing to make. That's why even some testing groups have
been fooled in the past. For example, witness Miercom's snafu-ed
test of Intrusion.com's product. A lot of customers and industry
analysts (and probably some people at Miercom) were fooled by
that rigged benchmark.

With respect to industry certification efforts - that's a trickier
matter. The objective is to set a bar and continually raise it.
It flat-out doesn't work if you start with the bar too high. For
example, in 1998, if you'd made a certification program for NIDS that
required that they correctly handle TCP sequencing, IP fragmentation,
packet reordering, and TCP start/stop semantics, you'd have had
trouble finding anyone who could even participate in such a
program. So all the others would have sat back and thrown rocks
at your program as being "biased" or whatever until they were
comfortable playing - and by then everyone would be so smeared
with mud that nobody'd trust any of you.
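
(To make concrete why that 1998 bar would have been so high: here's a
tiny sketch - mine, purely illustrative, with a made-up signature and
made-up segments - of the difference between matching per-packet and
matching after TCP reassembly. Getting the reassembly RIGHT, with
overlaps, retransmits, and host-specific quirks, is the hard part.)

    # Illustrative sketch only: a NIDS that inspects TCP segments one
    # at a time misses a signature split across out-of-order segments;
    # one that reassembles the stream first catches it.
    SIGNATURE = b"/bin/sh"

    def match_per_packet(segments):
        """Naive NIDS: inspect each segment's payload in isolation."""
        return any(SIGNATURE in payload for _, payload in segments)

    def match_reassembled(segments):
        """Order segments by sequence number, concatenate, then
        inspect the whole stream."""
        stream = b"".join(payload for _, payload in sorted(segments))
        return SIGNATURE in stream

    # (seq, payload) pairs: the signature is split mid-string and the
    # segments arrive out of order.
    segments = [(7, b"n/sh HTTP/1.0"), (0, b"GET /bi")]

    print(match_per_packet(segments))   # False - the evasion works
    print(match_reassembled(segments))  # True  - reassembly catches it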

> Finally, IMNSHO comparing OSEC criteria to ICSA criteria is akin to
> comparing a Formula-1 racer to, say, a garbage truck.

Yeah. It's all about carrying capacity, and those darned F-1
cars don't even have a TRUNK for cryin' out loud!!! :)

Joking aside - I'm not sure if you're trying to say "ICSA sucks
and OSEC doesn't" or if you're trying to say "they're different
things built for different purposes and have different results."
I assume the latter.

What I gather you're trying to do with OSEC is test stuff and
find it lacking or not. Basically you want to say what products
you think are good or bad - based on your idea (with input from
customers and vendors) of good and bad. Of course, if I were a
vendor, I'd skewer you as publicly and often as possible for
any bias I could assign you. Because your approach is inherently
confrontational. Back when I worked at an IDS vendor, I tried to
talk our marketing department out of participating in your
reviews because, frankly, the vendors are forced to live or
die based on your _opinion_. That's nice, but we've seen before
that opinions of how to design a product may vary. Many industry
expert types "Don't Get This" important aspect: products are
often the way they are because their designers believe that's how
they should be. Later the designers defend and support those
aspects of their designs because that's how they believe their
products should be - not simply out of convenience. The egg
really DOES sometimes come before the chicken. :)

About a million years ago I was designing and coding firewalls.
I wrote pure proxy firewalls. OK, actually, I _invented_ pure
proxy firewalls. You know what? I still think that, for security,
it's The Way To Do It and everything else sucks. But the industry
appears to disagree. That's OK, it's customer choice. But if I
were reviewing firewall products, guess which ones I'd say sucked
and which didn't? If I developed a firewall testing methodology,
NONE of the packet screens would have cut it. And people would
have been able to accuse me of trying to promote my own product
because my _beliefs_ and my _implementation_ were inseparable.
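
(For anyone who hasn't stared at one: the essence of a pure proxy, in
a deliberately minimal sketch - illustrative only, NOT anyone's actual
firewall code, and the addresses are placeholders - is that the
client's TCP connection terminates AT the proxy, which opens its own,
separate connection to the server. No packet ever travels end to end,
and the proxy can inspect the application data in between:)

    import socket
    import threading

    LISTEN_ADDR = ("0.0.0.0", 8080)  # where clients connect (placeholder)
    SERVER_ADDR = ("10.0.0.2", 80)   # the protected server (placeholder)

    def relay(src, dst):
        """Copy bytes one way until EOF, then pass the close along."""
        while data := src.recv(4096):
            dst.sendall(data)
        try:
            dst.shutdown(socket.SHUT_WR)
        except OSError:
            pass

    listener = socket.socket()
    listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    listener.bind(LISTEN_ADDR)
    listener.listen()
    while True:
        client, _ = listener.accept()
        upstream = socket.create_connection(SERVER_ADDR)
        # Two independent TCP connections; a real proxy firewall would
        # validate the application protocol here before relaying.
        threading.Thread(target=relay, args=(client, upstream), daemon=True).start()
        threading.Thread(target=relay, args=(upstream, client), daemon=True).start()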

So here's the problem: how do you test a product without building
your subjective beliefs into the criteria? Man, that's hard. I wish
you luck. (By the way, I've noted a very strong subjective preference
on your part for Open Source solutions, most notably Snort. You've
consistently cast their failures in a kinder light than anyone
else's, etc... So be careful...)

I think this is why a lot of folks want to move away from
testing anything other than the simple crud they can understand:
"DUH! PACKETS PER SECOND!"  since it's easier to keep from
getting subjective. I've always enjoyed reading your reviews
even when they skewered my products, because I could imagine
the painful writhing contortions you had to go through trying
to get all the various products to work. :)

Anyhow - don't bash ICSA. I did, once, a long time ago. In fact,
I wrote an article about it that I regret, but I have kept it on my
web site:
http://www.ranum.com/pubs/fwtest/index.htm
that may be germane to this discussion. About a year ago, some
of the ICSA folks lured me up to Pennsylvania (now, for unrelated
reasons, I am moving to Pa in the spring...) with promises of
beer, and I got a chance to look at the lab, talk to their
people, and learn a bit more about some of the stuff behind the
scenes. I was pretty impressed. The thing that impressed me the
most was getting a bit of the inside skinny on how few
vendors passed the test the first time (many have failed DOZENS
of times) and I thought that was cool. Obviously, it'd be best
if all products were perfect going into the test. But
I'd be happy, as a vendor or a customer, if they were BETTER
coming out. Your test makes vendors take a bigger gamble: one
that many will be reluctant to take. If I were still a
vendor - honestly - I'd offer my product up to your tender
mercies only after I'd gotten it past the ICSA folks and even
then I'd probably try to talk my marketing folks out of even
returning your phone calls. :) You had them quaking in their
boots, which was always fun to see. :)

So, I think there's a place for *ALL* these different tests
and it's a bad idea to throw mud at any of them. Honestly,
I think that a smart customer should do their own. It's
flat-out INSANE to spend more than $100,000 on a product
without running an operational pilot of it and two competitors.
Yet, many companies do exactly that. They get what they
deserve.

mjr. 
---
Marcus J. Ranum                         http://www.ranum.com
Computer and Communications Security    mjr () ranum com

