Vulnerability Development mailing list archives

RE: coding (was: Re: CodeGreen beta release (idq-patcher/antiCodeRed/etc.))


From: "David Schwartz" <davids () webmaster com>
Date: Fri, 7 Sep 2001 15:01:25 -0700


In the profound words of David Schwartz:

Malicious code and exploit code, on the other hand, are more like a
cigarette that kills you instantly or a gun that blows up when you
squeeze the trigger. They're interesting to talk about and look at,
but there is no moral application for them.

      Bullshit!  There are PLENTY of "moral applications" for exploit code...

        Okay, what are they?

Just to name a few: testing your own servers to see if they are
vulnerable;

        That requires nothing malicious, nor does it require exploiting
anything. It simply requires detecting the presence of a vulnerability.
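The distinction being drawn here -- detecting a vulnerability versus
exploiting it -- can be sketched concretely. The following is a minimal,
hypothetical version check (the server name and version strings are
invented for illustration): it flags a vulnerable build from the banner
in an HTTP response, and no exploit payload is ever involved.

```python
# Hypothetical sketch: detection by banner inspection alone.
# "ExampleHTTPd" and its version numbers are invented placeholders.
VULNERABLE_VERSIONS = {"ExampleHTTPd/1.0", "ExampleHTTPd/1.1"}

def banner_is_vulnerable(response: str) -> bool:
    """Scan an HTTP response for a Server header on the known-bad list."""
    for line in response.splitlines():
        if line.lower().startswith("server:"):
            banner = line.split(":", 1)[1].strip()
            return banner in VULNERABLE_VERSIONS
    return False  # no Server header: nothing to flag

sample = "HTTP/1.0 200 OK\r\nServer: ExampleHTTPd/1.0\r\n\r\n"
print(banner_is_vulnerable(sample))  # True -- flagged, yet nothing was exploited
```

A real scanner would of course do more (banners can lie, and patched
builds may keep old version strings), but the point stands: the check
never has to trigger the flaw it is looking for.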

testing your servers after patching to verify the patch actually
worked as advertised;

        That requires nothing malicious nor does anything need to be exploited.
Please state specifically what malicious or exploitative act is required.

using the exploit in an authorized penetration test type of scenario;

        That's malicious? Arguably that does require the vulnerability to be
exploited.

demonstrating to clueless higher management at your place of
employment the need for applying that patch that they are so
reluctant to apply;

        That's not malicious, nor does it require the vulnerability to be
exploited. In any event, the moral value of responding to irrational
requests or demands as if they were rational is questionable.

studying the code for educational purposes, to learn how it works,
possibly for the purpose of developing something to guard against
it; etc...

        I said they have no moral _application_. Studying a gun is not an
application of a gun. In any event, the studied code need not be malicious
nor need it exploit anything.

There are many, many legitimate,
"moral" uses for exploit code...  Code is just like any other tool:
it can be used for either good or bad purposes...  It's not inherent
in its design which you use it for...  There is no "good" or "bad"
code; only code...  Plenty of so-called "good" programs have been
used for very bad purposes...  And, plenty of so-called "bad" programs
have been used for very good purposes...

        I could not disagree more with this assertion. It's a great cop out -- 'I
only built it, I have no control over what people do with it'. But it's not
true at all.

        To cite a recent real-world example of this, there's a discussion on
alt.irc about operator invisibility, which is a piece of code. The only
purpose for operator invisibility is to intercept the communications of
third parties without their knowledge. It has no other application.

        If you agree that it is immoral or unethical to intercept the
communications of third parties without their knowledge, then how can
you escape the conclusion that it is immoral or unethical to provide
for use code that provides only this functionality?

        Would you argue that it's okay to produce the perfect poison (quick,
undetectable, etcetera), an item clearly and inarguably optimized as an
effective silent killer, because you just make the perfect poison and
have no idea of, or control over, what it's going to be used for?

        No, humans make tools. They make them for a purpose and the purpose is
reflected in the design of the tool.

        DS
