Information Theory and Noisy Computation
by
William S. Evans, UC Berkeley PhD Thesis, TR-94-057 (1994).
Abstract: The information carried by a signal unavoidably decays when the signal is corrupted by
random noise. This occurs when a noisy channel transmits a message as well as when a noisy
component performs computation. We first study this signal decay in the context of communication
and obtain a tight bound on the decay of the information carried by a signal as it crosses a noisy
channel. We then use this information-theoretic result to obtain depth lower bounds in the noisy
circuit model of computation defined by von Neumann. In this model, each component fails
(produces 1 instead of 0 or vice versa) independently with a fixed probability, and yet the output
of the circuit should be correct with high probability. Von Neumann showed how to construct
circuits in this model that reliably compute a function and are no more than a constant factor deeper
than noiseless circuits for the function. Our result implies that such a multiplicative increase in
depth is necessary for reliable computation. The result also indicates that above a certain level of
component noise, reliable computation is impossible.
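As a numerical illustration (not part of the abstract), the decay of mutual information across a noisy channel can be checked in the simplest setting: a binary symmetric channel that flips each bit with probability eps. The contraction factor (1 - 2*eps)**2 used below is the form such a decay bound takes for this particular channel; the sketch only verifies the special case numerically, under that assumption, and is not the general theorem.

```python
import math

def h(p):
    """Binary entropy in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_mutual_info(p, eps):
    """I(X;Y) for X ~ Bernoulli(p) sent through a binary symmetric
    channel that flips the bit with probability eps."""
    q = p * (1 - eps) + (1 - p) * eps   # P(Y = 1)
    return h(q) - h(eps)                # H(Y) - H(Y|X)

if __name__ == "__main__":
    p, eps = 0.5, 0.1
    i_one = bsc_mutual_info(p, eps)
    # Two identical channels in cascade act as one BSC with a larger
    # effective flip probability.
    eps_two = eps * (1 - eps) + (1 - eps) * eps
    i_two = bsc_mutual_info(p, eps_two)
    # Information strictly decays across the second channel, and the loss
    # is consistent with a (1 - 2*eps)**2 contraction per channel use.
    print(i_one, i_two, (1 - 2 * eps) ** 2 * i_one)
```

Running the sketch with eps = 0.1 shows the mutual information dropping from roughly 0.53 bits to roughly 0.32 bits after the second channel, within the (1 - 2*eps)**2 = 0.64 contraction factor.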
We use a similar technique to lower bound the size of reliable circuits in terms of the
noise and complexity of their components, and the sensitivity of the function they compute.
Our bound is asymptotically equivalent to previous bounds as a function of sensitivity, but unlike
previous bounds, its dependence on component noise implies that as this noise increases to 1/2,
the size of reliable circuits must increase unboundedly. In all cases, the bound is strictly stronger
than previous results.
Using different techniques, we obtain the exact threshold for component noise, above
which noisy formulas cannot reliably compute all functions. Our study of the depth of noisy
circuits already yields an upper bound on this threshold. That this bound is only slightly larger
than the true threshold attests to the precision of our information-theoretic techniques.