Re: Entropy loss


David Wagner (daw@CS.Berkeley.EDU)
Wed, 22 Jul 1998 16:43:09 -0700 (PDT)


In article <Pine.LNX.3.96.980716010232.24035B-100000@blackbox> you write:
> This raises an issue I've been wondering about for a while: how much
> entropy is lost by recursive hashing?

Suppose we've got a random function F : {0,1}^n -> {0,1}^n.
Let f_j denote the fraction of n-bit values which can appear as the
result of iterating F j times. Then f_1 ~ 1 - 1/e, and
        f_{j+1} ~ 1 - e^{-f_j}.
The first few numbers in the sequence are .632, .469, .374, .312,
.268, .235, .210, etc. Asymptotically, f_j ~ 2/(j + log j) is a
pretty good approximation.
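The recursion is easy to check numerically. Here's a quick sketch (my own illustration, not from the post) that builds a random function on a small 16-bit domain, iterates it, and compares the measured image fraction against the recursion f_{j+1} ~ 1 - e^{-f_j}:

```python
import math
import random

# Sanity check of the image-fraction recursion for a random function.
# n = 16 bits is an illustrative choice small enough to enumerate.
N = 1 << 16
random.seed(1)
F = [random.randrange(N) for _ in range(N)]  # a random function on {0,1}^16

image = set(range(N))   # image of F iterated 0 times: the whole domain
predicted = 1.0         # f_0 = 1
for j in range(1, 8):
    image = {F[x] for x in image}          # image after j iterations of F
    predicted = 1 - math.exp(-predicted)   # f_{j+1} ~ 1 - e^{-f_j}
    print(j, round(len(image) / N, 3), round(predicted, 3))
```

The measured fractions should track the sequence .632, .469, .374, .312, .268, ... above to within sampling noise, even though the recursion is only a heuristic for a single fixed random function.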

Moreover, I'd guess that the lost entropy is going to be on the
order of -lg f_j bits, which is not too large as long as j isn't too
big. Even asymptotically, this is -lg f_j ~ lg(j + log j) - 1, so
if you hash it 2^k times, you'll only lose about k-1 bits of entropy
(just a guess -- I have no proof).
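Taking the lost entropy to be lg(1/f_j) bits and plugging in the asymptotic form f_j ~ 2/(j + log j) gives the k-1 figure directly. A small sketch of that arithmetic (`entropy_loss_bits` is my name for it, and I read the post's "log j" as a natural log):

```python
import math

# Estimated entropy loss after j iterations, per the post's guess:
# loss ~ lg(1/f_j) ~ lg(j + ln j) - 1 bits, where lg is log base 2.
def entropy_loss_bits(j):
    return math.log2(j + math.log(j)) - 1

for k in (4, 8, 16):
    j = 1 << k  # hash iterated 2^k times
    print(k, round(entropy_loss_bits(j), 3))  # close to k - 1
```

For j = 2^16 this evaluates to almost exactly 15 bits, matching the "hash 2^k times, lose about k-1 bits" estimate; for small k the ln j term still contributes a noticeable fraction of a bit.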




The following archive was created by hippie-mail 7.98617-22 on Fri Aug 21 1998 - 17:20:48 ADT