In [[mathematics]], a '''Markov information source''', or simply, a '''Markov source''', is an [[information source (mathematics)|information source]] whose underlying dynamics are given by a stationary finite [[Markov chain]].
 
==Formal definition==
An '''information source''' is a sequence of [[random variable]]s ranging over a finite alphabet Γ, having a [[stationary distribution]].   
 
A Markov information source is then a (stationary) Markov chain ''M'', together with a function
 
:<math>f:S\to \Gamma</math>
 
that maps states in ''S'', the state space of the Markov chain, to letters in the alphabet Γ.
 
A '''unifilar Markov source''' is a Markov source for which the values <math>f(s_k)</math> are distinct whenever each of the states <math>s_k</math> is reachable, in one step, from a common prior state.  Unifilar sources are notable because many of their properties are far more easily analyzed than in the general case.
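
As an illustration (not part of the formal definition), the following Python sketch samples letters from a small Markov source: the transition structure of the chain ''M'' and the output map ''f'' are assumptions chosen for the example, and the final check tests the unifilar condition stated above.

<syntaxhighlight lang="python">
import random

# Illustrative Markov source: states S = {0, 1, 2}, alphabet Gamma = {'a', 'b'}.
# The transition structure of the underlying chain M is assumed for this example.
TRANSITIONS = {
    0: [(0, 0.5), (1, 0.5)],
    1: [(1, 0.3), (2, 0.7)],
    2: [(1, 0.4), (2, 0.6)],
}

# The map f : S -> Gamma that turns states into letters.
OUTPUT_MAP = {0: 'a', 1: 'b', 2: 'a'}

def step(state):
    """Sample the next state of the underlying Markov chain."""
    r = random.random()
    cumulative = 0.0
    for next_state, p in TRANSITIONS[state]:
        cumulative += p
        if r < cumulative:
            return next_state
    return TRANSITIONS[state][-1][0]

def emit(start_state, length):
    """Emit `length` letters from the source by applying f along the chain."""
    state, letters = start_state, []
    for _ in range(length):
        letters.append(OUTPUT_MAP[state])
        state = step(state)
    return ''.join(letters)

def is_unifilar():
    """Check that the one-step successors of every state emit distinct letters."""
    for successors in TRANSITIONS.values():
        letters = [OUTPUT_MAP[s] for s, _ in successors]
        if len(letters) != len(set(letters)):
            return False
    return True

print(emit(0, 20))    # a random string of letters from the source
print(is_unifilar())  # True for this choice of f and transitions
</syntaxhighlight>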
 
==Applications==
Markov sources are commonly used in [[communication theory]], as a model of a [[transmitter]]. Markov sources also occur in [[natural language processing]], where they are used to represent hidden meaning in a text. Given the output of a Markov source whose underlying Markov chain is unknown, the problem of inferring the underlying chain is addressed with the techniques of [[hidden Markov model]]s, such as the [[Viterbi algorithm]].
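
As an illustration, the following Python sketch recovers the most likely hidden state sequence from the observed letters of such a source using the Viterbi dynamic program; the chain, the deterministic emission probabilities standing in for ''f'', and the observed string are assumptions chosen for the example.

<syntaxhighlight lang="python">
# Illustrative parameters: a three-state chain whose emissions are deterministic,
# mirroring the map f of a Markov source (state 0 and 2 emit 'a', state 1 emits 'b').
STATES = (0, 1, 2)
START_P = {0: 1.0, 1: 0.0, 2: 0.0}
TRANS_P = {
    0: {0: 0.5, 1: 0.5, 2: 0.0},
    1: {0: 0.0, 1: 0.3, 2: 0.7},
    2: {0: 0.0, 1: 0.4, 2: 0.6},
}
EMIT_P = {
    0: {'a': 1.0, 'b': 0.0},
    1: {'a': 0.0, 'b': 1.0},
    2: {'a': 1.0, 'b': 0.0},
}

def viterbi(observed):
    """Return the most probable hidden state path for the observed letters."""
    # V[t][s] = (probability of the best path ending in state s at time t, predecessor)
    V = [{s: (START_P[s] * EMIT_P[s][observed[0]], None) for s in STATES}]
    for t in range(1, len(observed)):
        V.append({})
        for s in STATES:
            prob, prev = max(
                (V[t - 1][p][0] * TRANS_P[p][s] * EMIT_P[s][observed[t]], p)
                for p in STATES
            )
            V[t][s] = (prob, prev)
    # Backtrack from the most probable final state.
    state = max(STATES, key=lambda s: V[-1][s][0])
    path = [state]
    for t in range(len(observed) - 1, 0, -1):
        state = V[t][state][1]
        path.append(state)
    return list(reversed(path))

print(viterbi("abba"))  # [0, 1, 1, 2] for these illustrative parameters
</syntaxhighlight>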
 
==See also==
*[[Entropy rate]]
 
==References==
*Robert B. Ash, ''Information Theory'', (1965) Dover Publications. ISBN 0-486-66521-6
 
{{probability-stub}}
 
[[Category:Probability theory]]
[[Category:Stochastic processes]]
[[Category:Statistical natural language processing]]
