Subject: Re: [PATCH RFC] random: Account for entropy loss due to overwrites
On Mon, 2012-08-13 at 10:26 -0700, H. Peter Anvin wrote:
> From: "H. Peter Anvin" <hpa@linux.intel.com>
>
> When we write entropy into a non-empty pool, we currently don't
> account at all for the fact that we will probabilistically overwrite
> some of the entropy in that pool.

Technically, no, nothing is overwritten. The key fact is that the mixing
function is -reversible-: for any fixed input, it maps old pool states
to new pool states bijectively. Thus, even if you mix in known data,
two distinct states can never collapse into one; you can't learn
anything about the state and you can't destroy any of the existing
entropy.
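
To make that concrete, here's a toy sketch (plain userspace C, *not*
the kernel's actual twisted-LFSR mix from drivers/char/random.c; all
the identifiers below are made up). Because the step only XORs a
function of the input into the pool, it is a bijection on pool states
for any fixed input -- in this toy case it is even its own inverse:

#include <stdint.h>
#include <inttypes.h>
#include <stdio.h>

#define POOL_WORDS 4

/* Rotate left; the s &= 31 guard avoids an undefined 32-bit shift. */
static uint32_t rol32(uint32_t w, unsigned int s)
{
	s &= 31;
	return s ? (w << s) | (w >> (32 - s)) : w;
}

/*
 * Toy mixing step: XOR a rotated copy of the input into every pool
 * word.  For any fixed input this is a bijection on pool states, so
 * feeding it attacker-known data permutes the state but can never
 * merge two states -- no existing entropy is lost.
 */
static void mix(uint32_t pool[POOL_WORDS], uint32_t in)
{
	int i;

	for (i = 0; i < POOL_WORDS; i++)
		pool[i] ^= rol32(in, 7 * i);
}

int main(void)
{
	uint32_t pool[POOL_WORDS] = { 0xdeadbeef, 0x8badf00d,
				      0xfeedface, 0xcafebabe };

	mix(pool, 0x41414141);	/* attacker-known input */
	mix(pool, 0x41414141);	/* XOR-only step is an involution */

	/* Prints deadbeef: the original state is fully recoverable. */
	printf("pool[0] = %08" PRIx32 "\n", pool[0]);
	return 0;
}

The real mix has far better diffusion, but it shares the structural
property that matters here: the new state is the old state XORed with
something, so the step is invertible.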

But you are correct: mixing in new actual entropy is not purely
additive (with saturation). For that to happen, we'd need an input
mixing function with perfect maximal cascading, where every input bit
affects every pool bit. Instead we effectively cascade across somewhere
between 6 and 64 bits. So the truth lies somewhere between linear and
your exponential estimate (which would be the case for mixing a single
bit into the pool with XOR), but much closer to linear due to
combinatoric expansion.
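
For reference, here's a back-of-the-envelope sketch of the two
accounting extremes (my own toy model, not the patch under
discussion). The exponential curve comes from modeling each input bit
as XORed into a uniformly random pool position, so it only adds
entropy when it lands on a not-yet-full position, i.e. with
probability 1 - e/s; solving de/dn = 1 - e/s gives an asymptotic
approach to a full pool:

#include <math.h>
#include <stdio.h>

/* Purely additive with saturation: every input bit counts until full. */
static double credit_linear(double e, double n, double s)
{
	return (e + n < s) ? e + n : s;
}

/*
 * Exponential estimate: solve de/dn = 1 - e/s for n input bits.
 * The pool approaches full asymptotically and never quite saturates.
 */
static double credit_exponential(double e, double n, double s)
{
	return s - (s - e) * exp(-n / s);
}

int main(void)
{
	const double s = 4096.0;	/* toy pool size in bits */
	const double n = 512.0;		/* fresh entropy mixed in */
	double e;

	for (e = 0.0; e <= s; e += 1024.0)
		printf("e=%4.0f: linear -> %6.1f, exponential -> %6.1f\n",
		       e, credit_linear(e, n, s), credit_exponential(e, n, s));
	return 0;
}

(Build with -lm.) With wide cascading, the real pool behaves much
closer to credit_linear(), which is the point above.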

On the other hand, I don't think this sort of thing matters at all.
There is so much more fundamentally wrong with even trying to do entropy
accounting in the first place that these sorts of details don't even
matter. Instead we should stop fooling ourselves and just drop the
pretense of accounting entirely. Now that we've got a much richer set of
inputs, I think the time is ripe... but of course, I'm no longer the
maintainer.

--
Mathematics is the supreme nostalgia of our time.



