WebApp Sec mailing list archives

Re: Oracle CSO's Response to InfoSec Magazine's Secure Coding Bah!


From: "Maty SIMAN" <matysiman () hotmail com>
Date: Tue, 10 Feb 2004 22:54:14 +0200

I am sorry to disagree with some of your opinions.

First of all, we all agree that applications should be more robust and
secure - and that this can be achieved through secure coding.

However, the bridge analogy is somewhat unfair.
To complete the analogy, you must assume that every single day hundreds
of hooligans try to wreck your bridge, huge trucks try to block the
road, all those vandals around the world share information with each
other, and, above all, you still have to make the bridge affordable.

Each object has its use. You (as a consumer) assume that some
precautions have been taken (your oven won't electrocute you...), but
still, a determined tinkerer (or a baby...) can do harm (or be
harmed...). If you want to make an appliance more secure, you have to
buy and pay for a special safety device. You don't just assume that
everything is bullet-proof.

Software is no different.
It should be secure against REASONABLE attempts to hack it.
I agree - the definition of "reasonable" is somewhat obscure.
Just as any appliance that uses a laser beam must follow some strict
safety guidelines, so should the software industry follow some best
practices and rules of thumb, so that each piece of software is
"reasonably" secure.

If you want the application you just bought to be more secure, you
should pay for it.

Do you really think that Windows' integrated firewall would be as
secure as, let's say, NG?

BTW:
Does anyone have a suggestion for how much security is reasonable?
I think 75% is just fine, and the remaining 25% should be paid for if
needed.

That's all,
Maty SIMAN, CISSP

----- Original Message -----
From: "Mark Curphey" <mark () curphey com>
To: <webappsec () securityfocus com>
Sent: Monday, February 09, 2004 11:21 AM
Subject: Oracle CSO's Response to InfoSec Magazine's Secure Coding Bah!


SOUND BYTES

*SECURE CODING? ABSOLUTELY!
By Mary Ann Davidson, CSO, Oracle

Andrew Briney's "Secure Coding? Bah!" article struck a chord, as it
should have been titled "Secure Coding? Absolutely!" Given that the
software industry as a whole has never made a concerted effort to
write better code, it's far too early to throw in the towel.

Many are convinced that because we can't have perfect code, we
shouldn't even try for good code. It's nonsense to give up on writing
better code, especially when we appear to have plenty of time to
invent new technologies that don't solve our problems.

Briney said, "Risk reduction is all about reducing vulnerabilities,
mitigating threats and lowering event costs." However, most customers
have almost no information on the security-worthiness of the products
they buy, and some risks can't be mitigated. The single best thing
the industry can do to mitigate users' risk is to write better
software.

Software development must improve because software has become part of
our critical infrastructure. As such, software development should be
held to the same standards as other facets of critical
infrastructure. Imagine if civil engineers built bridges with the
same inattention to fundamental engineering practices as many
software developers. Would it be acceptable to hear:

"I can't be bothered to figure out how to make the bridge secure. I'm
only interested in using the latest cool building materials and
having a sexy facade."

"Time to market is crucial. If I can't get my bridge up this month,
my competitor will."

"It's not my fault if the bridge fails. I didn't expect so many heavy
trucks on it."

There are no perfect bridges, but engineers are keenly aware of the
ramifications of poor engineering practice. If they were as
unschooled in secure design practice as the average software
developer, we would have collapsing bridges and severe loss of life.

Briney said, "But it's even faster and cheaper to build crappy
software to get the project rolled out immediately, please your boss
and help the company make its quarterly number." Actually, it isn't.
Much of secure coding practice is just good coding practice.
Observing a good development process actually gets better quality
products out the door faster.

"Secure programming is an oxymoron because none of the parties who
could make it happen on a broad scale are properly 'incentivized,'"
Briney said. There are many things that can and have happened to
refute this. For one, customers can demand more secure software. The
Department of Defense has made a good start by requiring formal,
third-party security evaluations for products used in national
security systems. This requirement may be extended to other agencies.

Security evaluations don't result in perfect software, but they do
force vendors to follow a secure development process. Oracle has
invested more than $17 million in security evaluations over the last
12 years, and I can categorically state that this has resulted in
better products and more "security awareness" among our developers.
Avoiding even one significant security fault would more than pay for
the cost of an evaluation.

More than 50% of security faults are a result of buffer overflows. If
we, as an industry, merely stamped out buffer overflows in the next
two years, we would reduce security faults by half and would
significantly decrease our customers' risk exposure. Checking
boundary conditions is measurable, achievable and something that
every developer should have learned in their first programming class.
If they didn't, they shouldn't be working in the industry.
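
To make "checking boundary conditions" concrete, here is a minimal C
sketch (an illustration, not code from the article) contrasting an
unchecked strcpy() with a length-checked copy:

#include <stdio.h>
#include <string.h>

/* Vulnerable: copies caller-controlled input into a fixed-size
   buffer with no length check -- the classic buffer overflow. */
void greet_unsafe(const char *name)
{
    char buf[16];
    strcpy(buf, name);   /* writes past buf if name is >= 16 chars */
    printf("Hello, %s\n", buf);
}

/* Checked: verify the input fits before copying. This is the
   boundary-condition check every developer should have learned. */
void greet_safe(const char *name)
{
    char buf[16];
    if (strlen(name) >= sizeof buf) {
        fprintf(stderr, "input too long, rejected\n");
        return;
    }
    strcpy(buf, name);   /* safe: length was verified above */
    printf("Hello, %s\n", buf);
}

int main(void)
{
    greet_unsafe("short");                            /* fine */
    greet_safe("a-name-far-too-long-for-the-buffer"); /* rejected */
    return 0;
}

The checked version costs one strlen() call and rejects oversized
input instead of silently writing past the end of the buffer.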

In summary, my response to Briney's editorial is "Secure coding? Yes,
absolutely."


