Validity of hashing, stuff about entropy


Anonymous (nobody@replay.com)
Sun, 12 Jul 1998 19:26:45 -0400


Before I get to explaining what that nonsensical "validity of hashing"
post meant, I'd like to sincerely apologize for polluting the list by
attempting to write a post in 45 seconds -- sorry, everybody!

Anyhow, what I was saying was that hashing to extract entropy is a good
process. I don't think Perry was claiming that it wasn't in his post on
"seat-of-the-pants-ism," but the way the post was written, someone could
easily (and mistakenly) infer that hashing to gather entropy was voodoo.

With pretty much any source of randomness, if you know what the biases
are, you can write code to simulate the source -- so do so. Write a
program that takes 160 bits of input and produces output statistically
indistinguishable from the real source's output, different for each input.
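Here's a minimal sketch of such a simulator in Python. The specific bias
(each bit comes up 1 with probability 0.7) is purely illustrative -- a real
simulation would model whatever biases the real source is known to have:

```python
import random

def simulate_source(seed: bytes, nbytes: int = 32) -> bytes:
    """Deterministically mimic a biased randomness source.

    `seed` is the 160-bit input; the bias here (each bit is 1 with
    probability 0.7) is a made-up stand-in for the real source's
    known biases. Different seeds give different outputs.
    """
    assert len(seed) == 20  # 160 bits
    rng = random.Random(int.from_bytes(seed, "big"))
    out = 0
    for _ in range(nbytes * 8):
        out = (out << 1) | (1 if rng.random() < 0.7 else 0)
    return out.to_bytes(nbytes, "big")
```

Hashing this simulator's output (say, with SHA-1) then stands in for
hashing the real source's output in the argument below.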

If you have some method that can guess the hash of your simulation's
output correctly with probability greater than about
1-((2^160-1)/(2^160))^((simulation time/hash time)+1)
(don't trust my math; it will betray you if you do), then simply
watching the hashed outputs of the simulation under different inputs
will turn up a collision faster than a birthday attack would.

With any good hash function, nothing can find a collision faster than a
birthday attack can. Therefore, your best guess at the hashed output can't
have probability greater than
1-((2^160-1)/(2^160))^((simulation time/hash time)+1), and there are,
for cryptographic purposes, at least
log(1/(1-((2^160-1)/(2^160))^((simulation time/hash time)+1)))/log(2)
bits of entropy in the result.
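As a quick numeric sanity check on that expression -- computed stably,
since (2^160-1)/2^160 rounds to exactly 1.0 in floating point -- here is a
small Python helper; any particular time ratio plugged in is a made-up
figure, not one from the post:

```python
from math import expm1, log1p, log2

HASH_BITS = 160  # SHA-1-sized output, as in the text

def entropy_bound(time_ratio: float) -> float:
    """Bits of entropy implied by the collision argument:
    log2(1 / (1 - ((2^160 - 1)/2^160)^(time_ratio + 1))),
    computed via log1p/expm1 to avoid catastrophic rounding.
    """
    p_best_guess = -expm1((time_ratio + 1) * log1p(-(2.0 ** -HASH_BITS)))
    return -log2(p_best_guess)

# With a ratio of 0 the bound is the full 160 bits; roughly one bit
# is lost for each doubling of the simulation-time/hash-time ratio.
```

So even an attacker whose simulation runs 2^20 times faster than a hash
only pushes the bound down to around 140 bits under this argument.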

TONAP -- That's Obviously Not A Proof -- but I think it makes it
sufficiently clear that hashing isn't simply "voodoo."

The number of bits of entropy in what everybody seems to call a "system"
is the sum, over each possible state, of the product of its probability
and the base-two logarithm of the reciprocal of that probability (that
is, the negated sum of p*log2(p)).
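In Python, that definition is just:

```python
from math import log2

def shannon_entropy(probs) -> float:
    """Bits of entropy: -sum(p * log2(p)), over states with p > 0."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of entropy; a certain
# outcome carries none; a uniform byte carries eight.
```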

For cryptographic purposes, you can use the same formula, except taking
into account that the attacker, by observing and/or interfering with
your system, can learn probabilities that allow a better prediction of
the state at hand (f'rinstance, if you're using the system clock at the
time of a keypress, the attacker knows there is only a very slight
probability it was outside a certain range).

It might be more convenient to say that a "system" has n bits of entropy
when that same observing, interfering attacker has to make an average of
2^(n-1) guesses at its state before getting it right. When there are only
2^n possible states, this means that the most likely state has probability
no greater than 2^-n.
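That guess-counting measure is easy to compute directly: an optimal
attacker tries states in order of decreasing probability. A small sketch
(note that a uniform distribution over 2^n states actually gives an
average of (2^n + 1)/2 guesses, which matches the 2^(n-1) above up to
rounding):

```python
def average_guesses(probs) -> float:
    """Expected number of guesses for an attacker who tries states
    in decreasing order of probability: sum of i * p_i, 1-indexed.
    """
    ordered = sorted(probs, reverse=True)
    return sum(i * p for i, p in enumerate(ordered, start=1))

# Uniform over 4 states (n = 2): (4 + 1)/2 = 2.5 guesses on average.
# A heavily skewed distribution is guessed far sooner, which is why
# this measure can be a better fit for cryptographic purposes than
# the plain Shannon sum.
```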



The following archive was created by hippie-mail 7.98617-22 on Fri Aug 21 1998 - 17:20:18 ADT