Firewall Wizards mailing list archives

Re: Security policy and risk analysis questions


From: Mark (Mookie) <mark () zang com>
Date: Sat, 1 May 1999 08:50:41 -0700 (PDT)

asking ends up reducing to "what are the odds that someone will write and
distribute an easy-to-use exploit", and the like.

Possibility is usually assumed to be on the high side.  We must assume that
there are people with more time on their hands than is productive, and that
small fact forces us to raise the number automatically.  In security, as
you know, we always assume that there is someone who can write whatever
exploit needs to be written to help 'educate' the public about security.
On the other hand, specific applications and/or environments have less
exposure to the public, and the lower likelihood that someone will attack
something they have never seen decreases your Ps factor.

The goal of the formula is not to produce statistics, but to help you
visualize the real need for certain INFOSEC steps in a particular
environment.
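
To make that concrete, here is a minimal sketch in Python of the kind of
expected-loss calculation I mean.  The exact formula from earlier in the
thread isn't quoted above, so treat the risk_score() function, its
Ps-times-value form, and the numbers as illustrative assumptions only:

    # Assumed Ps * value sketch -- not necessarily the thread's exact formula.
    def risk_score(ps, asset_value):
        """ps: estimated probability (0.0-1.0) that someone will write and
        use an exploit against this environment; asset_value: what the
        target is worth, to you and to the attacker."""
        return ps * asset_value

    # Widely exposed, off-the-shelf web stack guarding something valuable:
    print(risk_score(0.9, 100000))  # 90000.0 -> worth serious INFOSEC effort

    # Obscure in-house application with little of value behind it:
    print(risk_score(0.2, 1000))    # 200.0 -> tighten up, don't over-spend

The point isn't the numbers themselves; it's that the output forces you to
weigh exposure and value side by side before deciding where the effort goes.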

This argument gains credence when you take into account the practices of
those who have the ability to infiltrate your infrastructure. If you possess
something they value highly (and usually it's NOT CC's), then you can assume
with some conviction that you will be targeted.

My experience has shown me that a site with a "product" which is known to
be stored there, whether through public or other means, is often identified
as the site-of-the-week. I've seen teams replicate the visible environment
of a site to investigate the vulnerabilities that are probable: a machine
will be set up as a target running the same web software and, where
possible, the same CGI programs. Firewalls will be bypassed to step inside
the juicy innards. Eventually a decision will be made based on the results.
After that, the outcome of the breach usually depends on what isn't known
about the site: reporting mechanisms, maybe a little obscurity, non-standard
binaries; all that sort of stuff.

Because security is always a case-by-case situation, it can be crucial to
your integrity to customize as much as possible. Standards are great if you
know them in depth or know nothing about them; you believe in them either
way.  If you DON'T know the application to the nth degree but don't want to
be like the next guy, then it makes good sense to add extra protections to
the code and its environment.

An example might be the design of the att.com gateway: they didn't settle
for off-the-shelf stuff, they built their own from the ground up to suit
their needs. Of course a lot of it is now outdated and retired, but it's an
indication of the success of their project that almost no-one was able to
breach it.

Virtually every safe site these days is a custom design. It's the big
places that have the ability to hire in the expertise to keep everyone
out. For the man in the electronic street it's getting better; more and more
tools are appearing on a web site near you each day. The danger lies in
trying to do too much and not taking a minimalistic approach to what you
offer. (Though if you're fc.net you get done out of spite.) :)

When securing your castle, removing the pretty wooden bits that can burn
the place down and strengthening the core communication channels with
triple-reinforced concrete will keep the pillagers out and wake you up when
they start a bangin' on the door.

Going back to the formula: if some of your sites are worth very little
due to their content, then it's understandable to tighten up but not spend
too much time on them. Value judgements are what it's all about.
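
In terms of the rough sketch above (again an assumption, not necessarily
the thread's exact formula), a low-value site scores low even when Ps is
high, so it earns a quick tighten-up and nothing more:

    print(risk_score(0.8, 500))     # 400.0 -> tighten up, then move on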

Cheers,
Mark
mark () zang com
mark () metalab unc edu


