Re: What is entropy?


Michael F. Reusch (reusch@home.com)
Sun, 12 Jul 1998 20:37:43 -0400


At 10:15 PM 7/11/98 -0500, Mike Rosing wrote:
>
>On Sat, 11 Jul 1998, Bill Frantz wrote:
>
[...]
>>
>> (6) We assume that mixing bad entropy with good entropy results in good
>> entropy.
>>
>>
>> Basic entropy mixing logic:
>[...]
>
>Let's first define entropy. According to my reference (Reif,
>Fundamentals of Statistical and Thermal Physics, McGraw-Hill 1965),
>if P is the number of states accessible to a system, then the entropy S
>is given by S = k ln(P), where k is a constant. Entropy is a measure of
>the number of states accessible to a system. Shannon tosses in a minus
>sign, but is measuring the same thing: the total number of characters is
>the number of accessible states for a communications system.

Entropy is conveniently defined as negative information, and is
proportional to - Sum over i of Pi log(Pi), where Pi is the probability
of the ith "event". For one digital bit, the entropy would be
S = - P0 log(P0) - P1 log(P1), where P0 is the probability of a zero,
P1 the probability of a one, and P0 + P1 = 1.
[Khinchin, "The Mathematical Foundations of Information Theory",
Dover, NY, 1957].
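
To be concrete, here is a little Python sketch of that one-bit formula
(the function name is mine; logs are base 2 so the answer comes out in bits):

from math import log2

def binary_entropy(p0):
    """Entropy in bits of one binary digit with P(0) = p0, P(1) = 1 - p0."""
    p1 = 1.0 - p0
    s = 0.0
    for p in (p0, p1):
        if p > 0:               # by convention, 0 * log(0) is taken as 0
            s -= p * log2(p)
    return s

print(binary_entropy(0.5))      # 1.0 bit -- a fair coin is maximally uncertain
print(binary_entropy(0.9))      # ~0.47 bits -- a biased bit carries less entropy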

In Boltzmann's kinetic theory, entropy has a very similar form,
i.e., S ~ - Integral f(x,v,t) log(f(x,v,t)) d^3v d^3x.

In Mike Rosing's case, S = k ln(D), where k is a constant and D is the
"density of states". One statistical mechanics problem for which log(D)
can be found in closed form is the ideal gas, for which S is again roughly
S = k ln(D) ~ - Sum Pi log(Pi), with Pi ~ exp(-Ei/T) (the Boltzmann
factor). Here Ei is the ith particle's kinetic energy and T the gas
temperature, measured in energy units.
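
As a toy illustration of that last formula (my own, not from Reif: the
energy levels are invented and I set k = 1), one can normalize the
Boltzmann weights and feed them through the same sum:

from math import exp, log

def gibbs_entropy(energies, T):
    """S/k = -sum_i Pi ln(Pi), with Pi proportional to exp(-Ei/T)."""
    weights = [exp(-e / T) for e in energies]
    Z = sum(weights)                        # partition function, normalizes the Pi
    probs = [w / Z for w in weights]
    return -sum(p * log(p) for p in probs)

print(gibbs_entropy([0.0, 1.0, 2.0], T=0.1))    # ~0: the lowest level dominates
print(gibbs_entropy([0.0, 1.0, 2.0], T=100.0))  # ~ln(3): all states nearly equiprobable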

This form, S = - Sum over i of Pi log(Pi), has all sorts of nice properties
that one expects of entropy, like additivity:
S( A and B ) = S(A) + S(B) if A and B are independent, and in general
S( A and B ) = S(A) + S(B given A), where
S(B given A) <= S(B), i.e., information decreases entropy.
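
Those identities are easy to check numerically. The sketch below (my own
construction, with an invented joint distribution) verifies the chain rule
and the fact that conditioning never increases entropy:

from math import log2

joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def H(dist):
    """Shannon entropy in bits of a dict of probabilities."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

pa = {a: sum(p for (x, _), p in joint.items() if x == a) for a in (0, 1)}
pb = {b: sum(p for (_, y), p in joint.items() if y == b) for b in (0, 1)}
# S(B given A) = sum over a of P(a) * S(B | A = a)
h_b_given_a = sum(
    pa[a] * H({b: joint[(a, b)] / pa[a] for b in (0, 1)}) for a in (0, 1)
)

print(H(joint))             # S(A and B), ~1.72 bits
print(H(pa) + h_b_given_a)  # the same, by the chain rule
print(h_b_given_a, H(pb))   # ~0.72 <= 1.0: conditioning decreased the entropy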

I was wondering when "mixing bad entropy with good entropy results in good
entropy" actually holds. If an attacker can feed you things to mix in,
doesn't this depend on the mixing function? Mixing bits is not like mixing
gases, and the entropy can decrease! It is pretty clear that for simple
bitwise mixing functions, AND and OR are very bad choices, and with XOR the
attacker can force you to simply flip your bits between uses, which does
not sound good.
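
To make the bitwise point concrete, here is a small sketch (mine, one bit
at a time) of the entropy left in mix(good, evil) when the good bit is
uniform and the attacker picks the evil bit as badly as possible:

from math import log2

def out_entropy(mix):
    """Worst-case output entropy over the attacker's choice of evil bit."""
    worst = 1.0
    for evil in (0, 1):
        p1 = sum(mix(good, evil) for good in (0, 1)) / 2.0  # P(output = 1)
        h = 0.0
        for p in (p1, 1.0 - p1):
            if p > 0:
                h -= p * log2(p)
        worst = min(worst, h)
    return worst

print(out_entropy(lambda g, e: g & e))  # 0.0 -- attacker sends 0, output is always 0
print(out_entropy(lambda g, e: g | e))  # 0.0 -- attacker sends 1, output is always 1
print(out_entropy(lambda g, e: g ^ e))  # 1.0 -- a full bit, but the attacker controls flips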

I know nothing about more complex mixing functions, like concatenating the
good hash with the evil hash and hashing again. It seems like a pretty
intractable problem, even empirically, for enough bits. There should be
someone out there with information concerning mixing functions that resist
entropy decrease.
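
For what it's worth, the concatenate-and-hash construction I have in mind
would look something like the sketch below. This is an illustration, not a
vetted design, and SHA-256 merely stands in for whatever hash function one
actually trusts:

import hashlib

def mix(pool, new_input):
    """Fold new (possibly attacker-supplied) input into the entropy pool.
    The hope: if either input was unpredictable, so is the output, to the
    extent the hash behaves like a random function."""
    return hashlib.sha256(pool + new_input).digest()

pool = hashlib.sha256(b"good entropy from a trusted source").digest()
pool = mix(pool, b"evil input chosen by the attacker")
print(pool.hex())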

Heat death of the universe v. cool death of the cryptographer.

Not even one goal for Brazil? What a shame!

