Bugtraq mailing list archives

Re: PRNGs (was Re: machine independent protection from stack-smashing attack)


From: John Viega <viega () LIST ORG>
Date: Tue, 22 Aug 2000 15:48:09 -0700

This message summarizes a rather long offline discussion between
Crispin and myself.  

On Fri, Aug 18, 2000 at 02:25:23PM -0700, Crispin Cowan wrote:
John Viega wrote:

As for what I think the world needs in this area... I think there
needs to be a good end-to-end solution for high-quality random numbers
in software that is widely ported and easily installed.  It should
deal with entropy accumulation, and with actually emitting a string of
random numbers from that entropy.

Software randomness is intrinsically impossible.  Software is
deterministic.  You need an external source of entropy, and that means
you need access to I/O devices.  If you do it at a user level, it can be
spoofed.  Therefore, the only really good source of randomness is an OS
device.  /dev/random is a fine OS random source device in Linux.  If it
hasn't been ported to your favorite OS, then it should be.

The notion of "random" that exists in a quantum world certainly cannot
be captured by a computer.  However, computers can collect entropy
that can be used to produce cryptographically strong (pseudo-)random
numbers.  For example, the system clock can often provide a very small
amount of entropy in a real application (not nearly enough on its own
to produce good quality numbers).  Another example is running a lot of
threads on a busy machine and performing certain timing comparisons;
the hope is that the load of the machine will be enough to make the
scheduling effectively unpredictable.

As for "doing it at a user level": as long as you rely on information
obtained directly from the kernel, there is no reason why you can't
collect and output randomness from user space, provided you do it
from a static library.  I believe that Crispin was thinking about
using DLL interposition to spoof either the entropy collection
routines or the output of the PRNG itself.  Static linking basically
allows you to thwart that kind of attack.

In each of those two areas, I'd like to see a solution that people
will have faith in, much like people seem to have in /dev/random (but
perhaps even more faith than that).  I'd also like to see the
limitations of /dev/random overcome.  In particular, the biggest
limitation is the fact that /dev/random doesn't have reasonable
bandwidth; you have to use /dev/urandom for that, and I don't know
that people are willing to trust it as much as /dev/random.

/dev/random gives you as much bandwidth as it can; it blocks when the
entropy pool goes dry.  To get more bandwidth, you need a higher
bandwidth source of external entropy.
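For concreteness, the two devices are read the same way; only the
blocking behavior differs (a sketch, with a function name of my own
choosing):

```python
import os

def read_kernel_random(nbytes, blocking=True):
    # Sketch (function name is mine): /dev/random historically blocks
    # when the kernel's entropy estimate runs dry, while /dev/urandom
    # keeps producing output from the same pool without blocking.
    path = "/dev/random" if blocking else "/dev/urandom"
    with open(path, "rb") as dev:
        return dev.read(nbytes)
```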

This turns out not to be true.  See below.

Having talked about this issue at great length with John Kelsey, I'm
pretty confident that once your algorithm reaches a state you believe
to be secure, then you should be able to output random numbers as fast
as the software can crank them out, as long as the internal state of
your algorithm isn't compromised.  I know many crypto types who aren't
willing to put that kind of faith into /dev/urandom.

I find this claim difficult to believe.  If you start with a fixed-size
random seed and a fixed algorithm, and you output an arbitrary amount of
bits, then eventually you give the attacker enough bits to be able to
infer the initial random seed.  Once the seed has been disclosed, the
generator becomes a fixed pseudorandom number generator, and the
attacker can predict all future bits.

One-way hash functions can provide this sort of functionality; once
you have reached a secure state, you can generate a large
number of pseudo-random values that cannot be guessed (assuming no
compromise of the internal PRNG state).  Furthermore, statistical
analysis won't reveal a significant amount of state in the internal
generator, assuming the non-invertibility of the hash function.
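The output stage of such a generator can be sketched as counter-mode
expansion of a secret state through a one-way hash (an illustration of
the idea, not Yarrow itself; the function name is mine):

```python
import hashlib

def hash_prng_output(state, nbytes):
    # Expand a secret internal state into pseudo-random output by
    # hashing state-plus-counter; given outputs alone, recovering
    # `state` requires inverting SHA-256.  A sketch of the principle,
    # not a complete PRNG (no reseeding, no state rollover).
    out = bytearray()
    counter = 0
    while len(out) < nbytes:
        out += hashlib.sha256(state + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:nbytes])
```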

We're currently building a reference implementation of Kelsey &
Schneier's Yarrow algorithm (work done with John Kelsey).  That
algorithm takes an extremely conservative approach: using a 128-bit
cipher and a 160-bit hash function as components within the PRNG, the
generator is allowed to produce approximately 2^46 outputs without any
further entropy.  The right algorithm could go on the order of 2^128
outputs without requiring new entropy, which would be more random
outputs than any program could feasibly use.
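To put that bound in perspective, here is the simple arithmetic on the
figures above (the function name is mine):

```python
def total_output_bytes(outputs=2 ** 46, block_bits=128):
    # Back-of-the-envelope arithmetic on the text's figures: roughly
    # 2^46 outputs from a 128-bit (16-byte) block cipher before the
    # generator wants fresh entropy.
    return outputs * (block_bits // 8)

# 2^46 outputs * 2^4 bytes per block = 2^50 bytes, about one pebibyte.
```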

Note that you have to use a fairly good hash function.  MD5 probably
isn't sufficient; there are rumors spread by respected cryptographers
that a top industry cryptographer has broken MD5, but is bound by
employee agreement to keep silent.

One worry I have is whether the easily ported techniques alone (mainly
things that don't require code in the OS) are going to provide people
in the security and cryptography community with a good enough sense of
well-being.  For example, I think everybody would feel a lot more
comfortable if the OS were also feeding in keyboard and mouse entropy.

/dev/random does do this.

Granted.  There was some misunderstanding of the original post here;
the intended context was about easily ported sources of randomness,
_NOT_ /dev/random.  Sorry for being unclear in the original mail.

John

