Markov information source
In [[mathematics]], a '''Markov information source''', or simply, a '''Markov source''', is an [[information source (mathematics)|information source]] whose underlying dynamics are given by a stationary finite [[Markov chain]].

==Formal definition==
An '''information source''' is a sequence of [[random variable]]s ranging over a finite alphabet Γ, having a [[stationary distribution]].
A Markov information source is then a (stationary) Markov chain ''M'', together with a function
:<math>f:S\to \Gamma</math>
that maps each state in the state space ''S'' of the Markov chain to a letter of the alphabet Γ.
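For illustration, the following sketch samples an output sequence from a small Markov source. The two-state chain, its transition probabilities, and the labeling function ''f'' used here are hypothetical choices made for the example, not part of the definition.

<syntaxhighlight lang="python">
import random

# Hypothetical two-state Markov source used only for illustration.
# transition[s] holds the one-step transition probabilities out of state s.
transition = {
    "s0": {"s0": 0.9, "s1": 0.1},
    "s1": {"s0": 0.5, "s1": 0.5},
}

# The function f maps each state of the chain to a letter of the alphabet Γ.
f = {"s0": "a", "s1": "b"}

def emit(state, n):
    """Walk the chain for n steps, emitting the letter f(state) at each step."""
    letters = []
    for _ in range(n):
        letters.append(f[state])
        successors = list(transition[state])
        weights = list(transition[state].values())
        state = random.choices(successors, weights=weights)[0]
    return "".join(letters)

print(emit("s0", 20))  # e.g. "aaaaabaaaaaaabaaabaa" (output is random)
</syntaxhighlight>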
A '''unifilar Markov source''' is a Markov source for which the values <math>f(s_k)</math> are distinct whenever the states <math>s_k</math> are reachable, in one step, from a common prior state. Unifilar sources are notable because many of their properties are far easier to analyze than those of general Markov sources.
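Under this definition, unifilarity can be checked directly from the transition structure: for each state, the letters that ''f'' assigns to its one-step successors must be pairwise distinct. A minimal check of this property, reusing the hypothetical <code>transition</code> and <code>f</code> tables from the sketch above:

<syntaxhighlight lang="python">
def is_unifilar(transition, f):
    """Return True if, for every state, the letters f(s_k) of the states
    s_k reachable from it in one step are pairwise distinct."""
    for probs in transition.values():
        successors = [s for s, p in probs.items() if p > 0]
        letters = [f[s] for s in successors]
        if len(letters) != len(set(letters)):
            return False
    return True

print(is_unifilar(transition, f))  # True: f is injective in the example,
                                   # so the source is trivially unifilar.
</syntaxhighlight>

The check fails only when two one-step successors of some state carry the same letter; for instance, adding a third state labeled "a" that is also reachable from "s0" would make the example source non-unifilar.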
==Applications==
Markov sources are commonly used in [[communication theory]] as a model of a [[transmitter]]. They also occur in [[natural language processing]], where they are used to represent the hidden meaning of a text. Given the output of a Markov source whose underlying Markov chain is unknown, recovering that chain is the task addressed by the techniques of [[hidden Markov model]]s, such as the [[Viterbi algorithm]].
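The following is a minimal sketch of the Viterbi algorithm in this setting. The state names, probabilities, and observation sequence are hypothetical, and the deterministic labeling ''f'' of a Markov source is generalized here to the emission probabilities of a hidden Markov model; practical implementations also work with log-probabilities to avoid numerical underflow.

<syntaxhighlight lang="python">
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Return the most probable hidden state sequence for obs."""
    # best[s] = (probability, path) of the best path ending in state s.
    best = {s: (start_p[s] * emit_p[s][obs[0]], [s]) for s in states}
    for o in obs[1:]:
        best = {
            s: max(
                (p * trans_p[r][s] * emit_p[s][o], path + [s])
                for r, (p, path) in best.items()
            )
            for s in states
        }
    return max(best.values())[1]

# Hypothetical two-state source whose letters "a" and "b" are observed noisily.
states = ["s0", "s1"]
start_p = {"s0": 0.5, "s1": 0.5}
trans_p = {"s0": {"s0": 0.9, "s1": 0.1}, "s1": {"s0": 0.5, "s1": 0.5}}
emit_p = {"s0": {"a": 0.8, "b": 0.2}, "s1": {"a": 0.3, "b": 0.7}}

print(viterbi("aabba", states, start_p, trans_p, emit_p))
# ['s0', 's0', 's0', 's0', 's0'] -- the self-loop on s0 is strong enough
# that staying in s0 beats explaining the two "b"s with a visit to s1.
</syntaxhighlight>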
==See also==
*[[Entropy rate]]

==References==
*Robert B. Ash, ''Information Theory'', Dover Publications, 1965. ISBN 0-486-66521-6.

{{probability-stub}}

[[Category:Probability theory]]
[[Category:Stochastic processes]]
[[Category:Statistical natural language processing]]