Krichevsky–Trofimov estimator

Revision as of 03:03, 21 March 2013 by en>Addbot (Bot: Migrating 2 interwiki links, now provided by Wikidata on d:q3257974)

In information theory, given an unknown stationary source π with alphabet A, and a sample w from π, the Krichevsky–Trofimov (KT) estimator produces an estimate πi(w) of the probabilities of each symbol i ∈ A. This estimator is optimal in the sense that it minimizes the worst-case regret asymptotically.

For a binary alphabet and a string w containing m zeros and n ones, the KT estimator can be defined recursively[1] as:

$$\begin{aligned}
P(0,0) &= 1,\\[6pt]
P(m,n+1) &= P(m,n)\,\frac{n+\tfrac{1}{2}}{m+n+1},\\[6pt]
P(m+1,n) &= P(m,n)\,\frac{m+\tfrac{1}{2}}{m+n+1}.
\end{aligned}$$
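The recursion above can be sketched as a short Python function (the function name `kt_probability` is my own; the source only gives the recursion). Each symbol multiplies the running probability by the KT "add-1/2" next-symbol estimate, so the final value is P(m, n):

```python
def kt_probability(m, n):
    """KT probability P(m, n) of a binary string with m zeros and n ones.

    Built up sequentially: after seeing a zeros and b ones, the KT
    estimate for the next symbol being a zero is (a + 1/2) / (a + b + 1),
    which is exactly the recursion P(m+1, n) = P(m, n) * (m + 1/2) / (m + n + 1).
    """
    p = 1.0  # base case: P(0, 0) = 1
    zeros = ones = 0
    for _ in range(m):  # append the zeros one at a time
        p *= (zeros + 0.5) / (zeros + ones + 1)
        zeros += 1
    for _ in range(n):  # then append the ones
        p *= (ones + 0.5) / (zeros + ones + 1)
        ones += 1
    return p
```

Because each factor depends only on the counts so far, the order in which zeros and ones are interleaved does not change the result, matching the fact that P depends only on (m, n).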

References

  1. Krichevsky, R. E. and Trofimov, V. K. (1981), "The Performance of Universal Encoding", IEEE Transactions on Information Theory, Vol. IT-27, No. 2, pp. 199–207.
