In [[information theory]], '''Fano's inequality''' (also known as the '''Fano converse''' and the '''Fano lemma''') relates the average information lost in a noisy channel to the probability of the categorization error.  It was derived by [[Robert Fano]] in the early 1950s while teaching a [[Ph.D.]] seminar in information theory at [[MIT]], and later recorded in his 1961 textbook.
 
It is used to find a lower bound on the error probability of any [[decoder]], as well as lower bounds for [[minimax risk]]s in [[density estimation]].
 
==Fano's inequality==
Let the [[random variables]] ''X'' and ''Y'' represent input and output messages with a [[joint probability]] <math>P(x,y)</math>. Let ''e'' represent an occurrence of error; i.e., that <math>X\neq Y</math>. Fano's inequality is
:<math>H(X|Y)\leq H(P(e))+P(e)\log(|\mathcal{X}|-1),</math>
 
where <math>\mathcal{X}</math> denotes the support of ''X'',
:<math>H\left(X|Y\right)=-\sum_{i,j} P(x_i,y_j)\log P\left(x_i|y_j\right)</math>
 
is the [[conditional entropy]],
:<math>P(e)=P(X\neq Y)</math>
 
is the probability of the communication error, and
:<math>H(P(e))=-P(e)\log P(e)-(1-P(e))\log(1-P(e))</math>
 
is the corresponding [[Binary entropy function|binary entropy]].
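As a purely illustrative numerical example (the specific values are chosen only for concreteness), suppose ''X'' is drawn from an alphabet of four symbols, so <math>|\mathcal{X}|=4</math>, and <math>P(e)=0.1</math>. Using base-2 logarithms,
:<math>H(X|Y)\leq H(0.1)+0.1\log_2 3\approx 0.47+0.16\approx 0.63\ \text{bits},</math>
so the equivocation of ''X'' given ''Y'' is at most about 0.63 bits. Conversely, since <math>H(P(e))\leq 1</math> bit, the inequality can be rearranged (for <math>|\mathcal{X}|\geq 3</math>) into the lower bound <math>P(e)\geq \frac{H(X|Y)-1}{\log_2(|\mathcal{X}|-1)}</math> on the error probability of any decoder.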
 
==Alternative formulation==
Let ''X'' be a [[random variable]] with [[Probability density function|density]] equal to one of <math>r+1</math> possible densities <math>f_1,\ldots,f_{r+1}</math>. Assume furthermore that the [[Kullback-Leibler divergence]] between any pair of these densities is bounded:
:<math> D_{KL}(f_i\|f_j)\leq \beta</math> for all <math>i\not = j.</math>
 
Let <math>\psi(X)\in\{1,\ldots, r+1\}</math> be an estimate of the index. Then
 
:<math>\sup_i P_i(\psi(X)\not = i) \geq 1-\frac{\beta+\log 2}{\log r},</math>
where <math>P_i</math> is the [[probability measure]] induced by <math>f_i</math>.
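As an illustration (the numbers below are arbitrary and serve only to show how the bound is evaluated), take <math>r=16</math>, so that there are 17 candidate densities, and suppose the pairwise divergences satisfy <math>\beta=1</math> (in nats, i.e., with natural logarithms). Then
:<math>\sup_i P_i(\psi(X)\neq i)\geq 1-\frac{1+\log 2}{\log 16}\approx 1-\frac{1.69}{2.77}\approx 0.39,</math>
so no estimator of the index can achieve a worst-case error probability below roughly 0.39.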
 
==Generalization==
The following generalization is due to Ibragimov and Khasminskii (1979) and to Assouad and Birge (1983).
 
Let '''F''' be a class of densities with a subclass of ''r''&nbsp;+&nbsp;1 densities ''&fnof;''<sub>''&theta;''</sub> such that for any ''&theta;''&nbsp;≠&nbsp;''&theta;''&prime;
 
:<math>\|f_{\theta}-f_{\theta'}\|_{L_1}\geq \alpha,\,</math>
:<math>D_{KL}(f_\theta\|f_{\theta'})\leq \beta.\,</math>
 
Then, in the worst case, the [[expected value]] of the estimation error is bounded from below:
 
:<math>\sup_{f\in \mathbf{F}} E \|f_n-f\|_{L_1}\geq \frac{\alpha}{2}\left(1-\frac{n\beta+\log 2}{\log r}\right)</math>
 
where ''&fnof;''<sub>''n''</sub> is any [[density estimation|density estimator]] based on a [[sample (statistics)|sample]] of size ''n''.
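The factor <math>n\beta</math> appears because, for a sample of ''n'' independent observations, the Kullback–Leibler divergence between the corresponding product densities is ''n'' times the divergence between the single-observation densities. As a schematic illustration of how the bound is used (not a result taken from the references), suppose the subclass can be chosen so that <math>n\beta\leq\tfrac{1}{2}\log r</math> with <math>r\geq 16</math>; then <math>\tfrac{n\beta+\log 2}{\log r}\leq\tfrac{1}{2}+\tfrac{1}{4}=\tfrac{3}{4}</math>, and the bound gives
:<math>\sup_{f\in \mathbf{F}} E \|f_n-f\|_{L_1}\geq \frac{\alpha}{8},</math>
so the minimax <math>L_1</math> risk over '''F''' is at least of the order of the separation <math>\alpha</math> attainable under the divergence constraint.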
 
==References==
* P. Assouad, "Deux remarques sur l'estimation," ''Comptes Rendus de L'Academie des Sciences de Paris'', Vol. 296, pp.&nbsp;1021–1024, 1983.
* L. Birge, "Estimating a density under order restrictions: nonasymptotic minimax risk," Technical report, UER de Sciences Economiques, Universite Paris X, Nanterre, France, 1983.
* T. Cover, J. Thomas, ''Elements of Information Theory''. p.&nbsp;43.
* L. Devroye, ''A Course in Density Estimation''. Progress in probability and statistics, Vol 14. Boston, Birkhauser, 1987. ISBN 0-8176-3365-0, ISBN 3-7643-3365-0.
* R. Fano, ''Transmission of information; a statistical theory of communications.'' Cambridge, Mass., M.I.T. Press, 1961. ISBN 0-262-06001-9
* R. Fano, ''[http://www.scholarpedia.org/article/Fano_inequality Fano inequality]'', [[Scholarpedia]], 2008.
* I. A. Ibragimov, R. Z. Has′minskii, ''Statistical estimation, asymptotic theory''. Applications of Mathematics, vol. 16, Springer-Verlag, New York, 1981. ISBN 0-387-90523-5
 
[[Category:Information theory]]
[[Category:Inequalities]]
