Re: Intel announcements at RSA '99


bram (bram@gawth.com)
Sat, 23 Jan 1999 17:39:35 -0800 (PST)


On Sat, 23 Jan 1999, Enzo Michelangeli wrote:

> The definition of entropy of a data source (a stochastic process) was
> supplied by Shannon in 1948, as the minimum number of bits necessary to
> describe its output. That's the catch: if you don't know in detail the inner
> workings of the source, you can never be sure that the output couldn't be
> compressed further by using some weird algorithm.
>
> Of course statistical tools help to see how bad the situation is, but they
> are fundamentally limited by the fact that they only test against SOME forms
> of data interdependency, if nothing else because they must run in a limited
> time. Generally they address issues of frequency of each symbol (first-order
> statistics) and frequency of pairs of symbols separated by a given interval
> (second-order statistics, which, under some restrictive conditions
> (linearity, stationarity, ergodicity etc.), can be reduced to power
> spectra). Incidentally, those are also the areas exploited by most
> compressors, based on a first step of identification of repeated patterns or
> spectral analysis, followed by entropy coding (e.g., Huffman) of the symbols
> produced. However, no common compressor will squeeze sequences produced by
> highly non-linear algorithms, even though they are absolutely deterministic
> (PRNGs).
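
That last point is easy to check for yourself. Here's a rough Python sketch,
using zlib to stand in for a "common compressor" and the standard random
module as the deterministic PRNG (purely an illustration, nothing specific to
any real RNG design):

    # A deterministic PRNG's output still defeats a general-purpose
    # compressor, while a heavily biased source of the same length
    # compresses easily.
    import random
    import zlib

    random.seed(12345)   # entirely deterministic from here on

    prng_bytes = bytes(random.getrandbits(8) for _ in range(100000))
    biased_bytes = bytes(0 if random.random() < 0.95 else 1
                         for _ in range(100000))

    for name, data in (("PRNG output", prng_bytes),
                       ("biased source", biased_bytes)):
        compressed = zlib.compress(data, 9)
        print("%s: %d -> %d bytes" % (name, len(data), len(compressed)))

The PRNG stream comes out slightly larger after compression, even though a
program a few lines long describes it completely.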

Thankfully, these days there are some good CSPRNGs (Continuously Seeded
Pseudo Random Number Generators) which fix the 'low entropy source'
problem, Yarrow being an example.

A CSPRNG works by having a pool of random bits, along with an algorithm for
mixing entropy into the pool and pulling pseudorandom bits out of it. The
mixing algorithm is designed so that even if the data being mixed in is
nonrandom (or, even worse, malicious), the amount of entropy in the pool
will at worst remain steady. Data from entropy sources is then mixed in
from time to time, and if the amount of entropy in the output is less than
you'd like it to be, just mix in more.
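
In code, the pool idea looks roughly like this - a minimal sketch assuming
SHA-256 as the mixing function and simple counter-mode extraction, not
Yarrow's actual construction:

    # Minimal sketch of a pool-based CSPRNG. Assumptions: SHA-256 as the
    # mixing function, counter-mode extraction; not Yarrow itself.
    import hashlib

    class EntropyPool:
        def __init__(self):
            self.pool = b"\x00" * 32   # 256-bit pool state
            self.counter = 0

        def mix_in(self, data):
            # Hashing the old pool together with the new data can't reduce
            # the entropy already in the pool, even if data is nonrandom or
            # attacker-controlled.
            self.pool = hashlib.sha256(self.pool + data).digest()

        def read(self, n):
            # Pull n pseudorandom bytes out of the pool.
            out = b""
            while len(out) < n:
                self.counter += 1
                out += hashlib.sha256(
                    self.pool + self.counter.to_bytes(8, "big")).digest()
            # Ratchet the pool forward so a later compromise of the pool
            # state doesn't expose outputs that were already handed out.
            self.pool = hashlib.sha256(self.pool + b"ratchet").digest()
            return out[:n]

    # Example: mix in some (possibly low-quality) samples, then draw output.
    pool = EntropyPool()
    pool.mix_in(b"interrupt timings, mouse movements, etc.")
    pool.mix_in(b"another sample from a different source")
    print(pool.read(16).hex())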

CSPRNGs, by the way, are a distinct cryptographic primitive, with their own
sort of threat model. Unfortunately, some of the more subtle points about
them don't seem to be well documented anywhere. For example, adding more
entropy to a CSPRNG once it's got more than around 160 bits of entropy in
it doesn't really do much of anything - it's already got more entropy than
you'll ever need. What's important is the amount of time between additions
of entropy (which determines the success of continuation attacks) and the
number of distinct sources its entropy comes from (for redundancy).
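
Here's the shape that kind of reseed policy might take on top of a pool like
the one above. The thresholds, the two-condition rule, and the per-source
bookkeeping are just illustrative assumptions, not Yarrow's exact reseed
mechanism:

    # Reseed only when enough estimated entropy has arrived from enough
    # distinct sources, or when too much time has passed since the last
    # reseed (bounding the window for continuation attacks).
    import time

    class ReseedScheduler:
        def __init__(self, pool, min_sources=2, min_bits=160,
                     max_interval=60.0):
            self.pool = pool                  # e.g. the EntropyPool above
            self.min_sources = min_sources    # redundancy across sources
            self.min_bits = min_bits          # ~160 bits is already plenty
            self.max_interval = max_interval  # seconds between reseeds
            self.pending = {}                 # source -> (est. bits, samples)
            self.last_reseed = time.monotonic()

        def add_sample(self, source, data, est_bits):
            bits, samples = self.pending.get(source, (0.0, []))
            self.pending[source] = (bits + est_bits, samples + [data])
            self._maybe_reseed()

        def _maybe_reseed(self):
            rich = [s for s, (bits, _) in self.pending.items()
                    if bits >= self.min_bits]
            overdue = time.monotonic() - self.last_reseed > self.max_interval
            if len(rich) >= self.min_sources or (overdue and self.pending):
                # Dump everything gathered so far into the pool at once,
                # rather than dribbling it in a little at a time, which
                # makes continuation (iterative-guessing) attacks easier.
                for _, samples in self.pending.values():
                    for sample in samples:
                        self.pool.mix_in(sample)
                self.pending.clear()
                self.last_reseed = time.monotonic()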

Information on Yarrow is at:

http://www.counterpane.com/yarrow.html

-Bram

