Re: Random Data from Geiger Counter


John Kelsey (kelsey@plnet.net)
Fri, 17 Jul 1998 05:42:54 -0500


-----BEGIN PGP SIGNED MESSAGE-----

[ To: CodherPlunks ## Date: 07/17/98 ##
  Subject: Re: Random Data from Geiger Counter ]

>Date: 16 Jul 1998 10:07:31 -0000
>From: Cicero <cicero@redneck.efga.org>
>Subject: Re: Random Data from Geiger Counter
>To: "John Kelsey" <kelsey@plnet.net>
>Cc: "William H. Geiger III" <whgiii@invweb.net>,
> <CodherPlunks@toad.com>,
> "Bill Stewart" <bill.stewart@pobox.com>

The original proposal was:

a. Hash buffer full of data, yielding key K.
b. Encrypt buffer in CBC-mode, under key K.
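For concreteness, here is a minimal sketch of that construction
as I read it, in Python. The particular choices (SHA1 truncated
to 128 bits, AES-CBC from the pycryptodome package, a zero IV,
and the name whiten) are mine, purely for illustration; the
proposal itself just says "hash" and "CBC-mode cipher."

import hashlib
from Crypto.Cipher import AES

def whiten(buffer):
    # a. Hash the buffer full of data, yielding key K (128 bits here).
    k = hashlib.sha1(buffer).digest()[:16]
    # b. Encrypt the buffer in CBC-mode under key K (zero IV for
    #    illustration; buffer length must be a multiple of 16 bytes).
    return AES.new(k, AES.MODE_CBC, iv=bytes(16)).encrypt(buffer)

# e.g., whiten 64 bytes of sampler output:
print(whiten(bytes(64)).hex())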

I commented:
>>Right. I will raise one point with this scheme, though:
>>You actually lose a small amount of entropy here, since you
>>can't use the output from this to go back to the input. I
>>don't see this as being important with any real buffer size,
>>but it's at least a theoretical complaint.

Cicero said:

>If I were to retain the value of the hash, I could later
>decrypt the CBC-encrypted data, returning it to its original
>state. This invertibility proves conservation of entropy.

>Perhaps you meant something else than what I am presuming
>you did?

Hmmm. This isn't as obvious as I thought it was, but I am
still convinced that the mapping from X to X' is not
invertible, based on a counting argument.

Imagine a block of data, X, and a 128-bit block cipher key
generated by hashing that block of data. We form K =
hash(X), and then form X' = E_K(X). The question is: are
all X' values reachable from some X?

If we were just randomly generating K, then the answer would
clearly be 'yes.' Similarly, if we were accumulating
entropy in our hash function, and then using it to derive K,
and finally encrypting our seed buffer with K, the answer
would be 'yes.' This is true because, in that case, the key
choice would be independent of the value of X, and under any
fixed, randomly generated key the cipher is a permutation, so
each X leads to a distinct X'.
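To see this concretely, here is a tiny Python experiment with
an 8-bit toy "cipher" (a random permutation selected by the
key, which is my own stand-in for a real block cipher): with a
key chosen independently of X, every one of the 256 possible
X' values shows up exactly once.

import random

N = 256  # 8-bit toy blocks, small enough to enumerate

def toy_encrypt(key, x):
    # Model E_K as a random permutation of {0,...,N-1} selected by the key.
    rng = random.Random(key)
    perm = list(range(N))
    rng.shuffle(perm)
    return perm[x]

k = random.randrange(2**128)                   # key independent of X
outputs = set(toy_encrypt(k, x) for x in range(N))
print(len(outputs))                            # 256: no X' values are lost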

However, in Cicero's scheme, as I have understood it, this
isn't what happens. Instead, K is determined by X.

Let's look at an extreme case of this: X is one bit, K is
the hash of that one bit using SHA1, and the low-order bit
of the resulting hash is used to encrypt X by being XORed
into it. In this case, there are two possibilities: If
hash(0) = hash(1) in that low bit, then we don't lose any
entropy, since we're just XORing in a constant. If
hash(0)!=hash(1), though, then we always get the same value
for X', regardless of X. (Imagine hash(0) = 0, hash(1) = 1.
X' = 0 for all X. Similarly, hash(0)=1, hash(1)=0 leads to
X' = 1 for all X.)
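That one-bit case is small enough to just run. A couple of
lines of Python (encoding the single bit as one byte is my own
choice; the argument doesn't depend on it):

import hashlib

for x in (0, 1):
    k = hashlib.sha1(bytes([x])).digest()[-1] & 1   # low-order bit of hash(X)
    print("X =", x, " K =", k, " X' =", x ^ k)

Whichever way those two low-order bits come out, one of the two
cases above applies.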

Let's try to extend the result to a 64-bit X with a 128-bit
K and a real block cipher. Again, the question is whether
there are any X' values that cannot be reached by any X
value. Another way of phrasing the same question is, does
any pair of X values yield the same X' value? If so, then
we must lose some possible X' values, because X' has the
same number of bits as X. Now, if the cipher works as we
would expect it to (like a family of random permutations),
each time hash(X_0) != hash(X_1), there is a 2^{-64}
probability that (X_0,X_1) will get the same X' value. Over
all 2^{127} possible pairs of X values, we expect about
2^{63} collisions. (We don't have to worry about collisions
in the key yet, since we only expect about one.) Any
collisions are enough to cause us to start losing entropy.
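Those counts are easy to sanity-check numerically (just the
arithmetic, for 64-bit X and 128-bit K):

from math import comb, log2

pairs = comb(2**64, 2)          # number of (X_0, X_1) pairs
print(log2(pairs))              # ~127, i.e. about 2^{127} pairs
print(log2(pairs) - 64)         # ~63: about 2^{63} expected X' collisions
print(pairs / 2**128)           # ~0.5 expected key collisions, no more than about one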

Let's extend this to an X of N bits, for any N, and a block
cipher that works on N-bit blocks. In that case, we would
expect each pair of X values whose keys (hashes) aren't the
same to have about a 2^{-N} probability of yielding the same
output value. Assuming K has at least 2N bits, so that key
collisions are negligible, we can use the above calculation.
Otherwise, we have to worry about collisions in K. When N is
much larger than |K|, we expect about 2^{N-|K|} X values per
key value. For |K|=128 and N=512, we would expect each key
to occur about 2^{384} times. There are no collisions among
X values that share the same key, since under any one key
the cipher is a permutation.
However, each X_0 has 2^{512}-2^{384}, or almost 2^{512},
values against which it can still collide, since every X_1
that gets a different key than this X_0 is a potential
candidate for colliding with it. We would expect a
collision after about 2^{256} X values that were eligible to
collide; in this case, we're virtually certain of many
collisions, if the encryption function works at all like we
expect it to.
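Here is the whole counting argument run exhaustively at toy
scale: an 8-bit X, K = SHA1(X), and the same toy
random-permutation model of E_K as above (my stand-in for "a
family of random permutations," not a real cipher). A
noticeable fraction of the X' values never occurs, which is
exactly the entropy loss I am claiming.

import hashlib, random

N = 256

def toy_encrypt(key, x):
    rng = random.Random(key)          # the key selects one permutation
    perm = list(range(N))
    rng.shuffle(perm)
    return perm[x]

reachable = set()
for x in range(N):
    k = hashlib.sha1(bytes([x])).digest()   # K = hash(X)
    reachable.add(toy_encrypt(k, x))        # X' = E_K(X)

print(len(reachable), "of", N, "X' values are reachable")

A random function covers only about a 1 - 1/e (roughly 63%)
fraction of its range, so we should expect to see only around
160 of the 256 values here.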

This result appears to extend in a natural way to using
CBC-mode encryption instead of a block cipher. Thus, I
claim that Cicero's construction does, in fact, lose entropy.
(However, I admit that I hadn't done this analysis when I
posted my initial comment--I had just noted that it was
using X for both the key and the plaintext, which generally gives
you a random function rather than a random permutation.)
Am I missing something?

>Cicero

- --John Kelsey, kelsey@counterpane.com / kelsey@plnet.net
NEW PGP print = 5D91 6F57 2646 83F9 6D7F 9C87 886D 88AF

-----BEGIN PGP SIGNATURE-----
Version: 2.6.2

iQCVAwUBNa9FnCZv+/Ry/LrBAQGS2QP+OgZHJbQmVgRcdYbtqtPLkLpnJxupYIP2
k6RLghOsdjlHl+Pj94sSji7/VxpuXpubzIAAZbvFp/TxyzgFegxHptyMWY68LSOZ
yTsOmzSfQOfUSYc0U8/zkec7qNj2NvkMkHJZ70A+Ie6FUcnoMOQLJrCMESw6Kx+X
XkX3tYTcXhw=
=rtgO
-----END PGP SIGNATURE-----

