Re: Intel announcements at RSA '99


Enzo Michelangeli (em@who.net)
Sat, 23 Jan 1999 08:40:18 +0800


-----Original Message-----
From: Mike Rosing <eresrch@msn.fullfeed.com>
Date: Saturday, January 23, 1999 7:55 AM

>On Fri, 22 Jan 1999, James A. Donald wrote:

>> [...]
>> Knowledge of the underlying hardware, knowledge that shows it
>> derives its randomness from the fundamental randomness of the
>> universe, either thermal entropy, (Johnson noise) or quantum
>> indeterminacy (shot noise), knowledge that enables us to
>> determine the good functioning of the underlying noise
>> amplification circuits from the character of the output.

>>
>> A good circuit would simply directly amplify the underlying
>> noise source, so that the entropy of the output would be
>> somewhat less than one entropy bit per signal bit, thus
>> ensuring that any malfunction of the underlying circuit would
>> be obvious.
>
>Cool, I can easily do all that. I'm still not sure how you convert
>a real signal into whatever the definition of "digital entropy" is,
>but at least I can pass DIEHARD and Diaphony.

The definition of the entropy of a data source (a stochastic process) was
supplied by Shannon in 1948, as the minimum number of bits necessary to
describe its output. That's the catch: if you don't know the inner workings
of the source in detail, you can never be sure that the output couldn't be
compressed further by some weird algorithm.
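To make the catch concrete, here is a minimal sketch (mine, not from the original post) of a first-order empirical entropy estimate. Note that it only bounds the true Shannon entropy from above: hidden structure invisible to a symbol-frequency model could still allow further compression.

```python
import math
from collections import Counter

def first_order_entropy(data: bytes) -> float:
    """Bits per symbol under an i.i.d. model of the observed frequencies."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(first_order_entropy(b"aaaa"))            # 0.0: fully predictable
print(first_order_entropy(bytes(range(256))))  # 8.0: flat byte histogram
```

A sequence can score a full 8.0 bits/byte here and still be perfectly predictable to anyone who knows the generating algorithm.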

Of course, statistical tools help to show how bad the situation is, but they
are fundamentally limited by the fact that they only test against SOME forms
of data interdependency, if nothing else because they must run in a limited
time. Generally they address the frequency of each symbol (first-order
statistics) and the frequency of pairs of symbols separated by a given
interval (second-order statistics, which under some restrictive conditions
such as linearity, stationarity and ergodicity can be reduced to power
spectra). Incidentally, those are also the areas exploited by most
compressors, which are based on a first step of identifying repeated
patterns or performing spectral analysis, followed by entropy coding (e.g.,
Huffman) of the symbols produced. However, no common compressor will
squeeze sequences produced by highly non-linear algorithms, even though
they are absolutely deterministic (PRNGs).
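The point about compressors can be demonstrated directly. This sketch (an illustration of mine, using zlib, i.e. LZ77 plus Huffman coding, and Python's seeded Mersenne Twister standing in for "a highly non-linear deterministic algorithm") shows the compressor finding nothing to squeeze in PRNG output, while trivially collapsing a patterned sequence:

```python
import random
import zlib

rng = random.Random(1234)  # fully deterministic: same seed, same bytes
prng_bytes = bytes(rng.getrandbits(8) for _ in range(65536))
patterned = bytes(i % 4 for i in range(65536))

# zlib cannot compress the PRNG stream (output is slightly LARGER
# than the input, due to framing overhead)...
print(len(zlib.compress(prng_bytes, 9)))
# ...but crushes the repetitive sequence to a handful of bytes.
print(len(zlib.compress(patterned, 9)))
```

Yet the PRNG stream contains no entropy beyond its 32-bit-ish seed: an adversary who knows the algorithm and seed can reproduce it exactly.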

> It will be interesting to
>see how well the P3 does with its RNG.

Up to a point: a PRNG would pass the tests with flying colours, yet would
still be useless as a source of entropy.
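A quick sketch of that last point (my illustration, using a simple monobit frequency test rather than the full DIEHARD battery): a seeded PRNG sails through the test even though the whole sequence is determined by its seed.

```python
import math
import random

def monobit_ok(bits, alpha=4.0):
    """Pass if the ones-count is within alpha standard deviations of n/2."""
    n = len(bits)
    ones = sum(bits)
    return abs(ones - n / 2) <= alpha * math.sqrt(n) / 2

rng = random.Random(42)
bits = [rng.getrandbits(1) for _ in range(100000)]
# Passes for virtually any seed, yet the sequence carries no entropy
# beyond the seed itself.
print(monobit_ok(bits))
```

Passing such a test says the output LOOKS balanced; it says nothing about whether anyone else can predict it.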

Enzo




The following archive was created by hippie-mail 7.98617-22 on Sat Apr 10 1999 - 01:18:05