Re: What is entropy?


Bill Frantz (frantz@netcom.com)
Sat, 11 Jul 1998 23:14:54 -0800


At 7:15 PM -0800 7/11/98, Mike Rosing asks:
>Let's first define entropy. According to my reference (Reif,
>Fundamentals of Statistical and Thermal Physics, McGraw-Hill 1965):
>if P is the number of states accessible to a system, then the entropy S
>is given by S = k ln(P), where k is a constant. Entropy is a measure of
>the number of states accessible to a system. Shannon tosses in a minus
>sign, but is measuring the same thing: the total number of characters is
>the number of accessible states for a communications system.
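As a concrete check (a minimal Python sketch; the function name and the
small P = 256 are illustrative, not from Reif): for P equally likely
states, Shannon's H = -sum p log2(p) collapses to log2(P), which is
S = k ln(P) with k = 1/ln(2) so the units come out in bits.

    import math

    def shannon_entropy(probs):
        # Shannon entropy in bits: H = -sum(p * log2(p)).
        return -sum(p * math.log2(p) for p in probs if p > 0)

    P = 256                           # illustrative number of states
    uniform = [1.0 / P] * P           # all states equally likely
    print(shannon_entropy(uniform))   # 8.0 == log2(256)
    print(math.log(P) / math.log(2))  # same value via S = k ln(P), k = 1/ln 2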

I think my much more intuitive view is similar to Shannon's. If I have
160 bits of entropy in my system, then the probability of guessing its
current state is 1 in 2**160, assuming the guesser has only external
information about the state.
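Put as arithmetic (a tiny sketch; 160 is just the figure above): n bits
of entropy means 2**n equally likely internal states, so a single
external guess succeeds with probability 2**-n.

    n = 160
    states = 2**n           # equally likely internal states
    p_guess = 2.0**-n       # chance one external guess is right
    print(states, p_guess)  # 2**160, about 6.8e-49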

In the case of a random number generator, we certainly hope that the
information used to seed it is not part of the externally accessible
state. Furthermore, the outputs of the generator should not reveal its
internal state.
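One common way to get that second property (a toy sketch only, not a
vetted CSPRNG; the class name and the domain-separation tags are made
up for illustration) is to emit a one-way hash of the state and then
ratchet the state forward, so observed outputs reveal neither the seed
nor the current state:

    import hashlib

    class ToyGenerator:
        # Toy sketch: outputs are one-way hashes of the internal state,
        # and the state is re-hashed (ratcheted) after every output, so
        # seeing outputs does not directly expose the seed or the state.
        def __init__(self, seed: bytes):
            self.state = hashlib.sha256(b"seed|" + seed).digest()

        def next_block(self) -> bytes:
            out = hashlib.sha256(b"out|" + self.state).digest()
            self.state = hashlib.sha256(b"next|" + self.state).digest()
            return out

    g = ToyGenerator(b"entropy gathered from somewhere external")
    print(g.next_block().hex())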

-------------------------------------------------------------------------
Bill Frantz | If hate must be my prison | Periwinkle -- Consulting
(408)356-8506 | lock, then love must be | 16345 Englewood Ave.
frantz@netcom.com | the key. - Phil Ochs | Los Gatos, CA 95032, USA

