What is entropy?


Mike Rosing (eresrch@msn.fullfeed.com)
Sat, 11 Jul 1998 22:15:09 -0500 (CDT)


On Sat, 11 Jul 1998, Bill Frantz wrote:

> To toss my hat into the current discussion of randomness, and to get a
> public vetting, let me describe a recent secure random generator I
> implemented.

Both brave and honorable. My questions are meant to enhance discussion,
although I may be taking off on a tangent.

> Assumptions and ground rules:
>
> (1) If we can get 160 bits of entropy, and keep it secret, we can maintain
> security on the DSA and DES keys we generate for the life of the
> application (20 years).
>
> (2) Since we probably can't keep a secret for 20 years, we continuously
> stir new entropy into the generator.
>
> (5) We aren't permitted to bother the user. I.e. no, "wave the mouse
> around while we gather entropy."
>
> (6) We assume that mixing bad entropy with good entropy results in good
> entropy.
>
>
> Basic entropy mixing logic:
[...]
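
As a concrete illustration of the kind of stirring that (2) and (6)
describe, here is a minimal sketch of hash-based pool mixing. To be
clear, this is not Bill's implementation (his mixing logic is snipped
above); SHA-1 is chosen only because its 160-bit output matches the
figure in (1), and the sample source is hypothetical.

    import hashlib

    POOL_BYTES = 20  # 160 bits, the secret size assumed in (1)

    def stir(pool: bytes, sample: bytes) -> bytes:
        # Fold a new (possibly low-entropy) sample into the pool.
        # Hashing pool || sample means a worthless sample leaves the
        # pool no weaker than before, while a good sample improves it:
        # the property assumption (6) relies on.
        return hashlib.sha1(pool + sample).digest()

    # Hypothetical usage: stir in whatever timing jitter the machine
    # offers, without bothering the user (assumption 5).
    pool = bytes(POOL_BYTES)
    pool = stir(pool, b"interrupt/disk timing sample")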

Let's first define entropy. According to my reference (Reif,
Fundamentals of Statistical and Thermal Physics, McGraw-Hill 1965):
if P is the number of states accessible to a system, then the entropy S
is given by S = k ln(P), where k is a constant. Entropy is a measure of
the number of states accessible to a system. Shannon tosses in a minus
sign, but is measuring the same thing: the total number of possible
characters is the number of accessible states of a communications system.
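
To make the connection concrete, here is a small sketch (mine, purely
illustrative) computing Shannon's H = -sum p_i log2(p_i) over a byte
string. When all P symbols are equally likely it reduces to log2(P),
which is Reif's S = k ln(P) with k = 1/ln(2).

    import math
    from collections import Counter

    def shannon_entropy_bits(data: bytes) -> float:
        # H = -sum p_i * log2(p_i), in bits per symbol,
        # where p_i is the observed frequency of symbol i.
        counts = Counter(data)
        n = len(data)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    # With all P = 256 byte values equally likely, H = log2(256) = 8:
    print(shannon_entropy_bits(bytes(range(256))))  # -> 8.0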

In the above assumptions, what do you mean by entropy?

Patience, persistence, truth,
Dr. mike

