# Hyper-exponential distribution

In probability theory, a hyper-exponential distribution is a continuous distribution such that the probability density function of the random variable ${\displaystyle X}$ is given by

${\displaystyle f_{X}(x)=\sum _{i=1}^{n}f_{Y_{i}}(x)\;p_{i},}$

where ${\displaystyle Y_{i}}$ is an exponentially distributed random variable with rate parameter ${\displaystyle \lambda _{i}}$, and ${\displaystyle p_{i}}$ is the probability that ${\displaystyle X}$ takes the form of the exponential distribution with rate ${\displaystyle \lambda _{i}}$.[1] It is called hyper-exponential because its coefficient of variation is greater than that of the exponential distribution, whose coefficient of variation is exactly 1; by contrast, the hypoexponential distribution has a coefficient of variation less than 1. While the exponential distribution is the continuous analogue of the geometric distribution, the hyper-exponential distribution is not analogous to the hypergeometric distribution. The hyper-exponential distribution is an example of a mixture density.

An example of a hyper-exponential random variable arises in telephony. If someone has both a modem and a phone, their phone-line usage could be modeled as hyper-exponential: with probability ${\displaystyle p}$ they are talking on the phone, with holding time governed by rate ${\displaystyle \lambda _{1}}$, and with probability ${\displaystyle q=1-p}$ they are using their internet connection, with rate ${\displaystyle \lambda _{2}}$.
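Sampling from a hyper-exponential distribution follows the mixture definition directly: first pick a branch according to the probabilities, then draw from that branch's exponential. Below is a minimal sketch in Python; the phone/modem parameters (p = 0.6, rates 1.0 and 0.1) are illustrative values, not taken from the article.

```python
import random

random.seed(0)  # reproducible draws

def sample_hyperexponential(probs, rates, rng=random):
    """Draw one hyper-exponential sample: choose branch i with
    probability probs[i], then draw an Exponential(rates[i]) variate."""
    u = rng.random()
    cumulative = 0.0
    for p, lam in zip(probs, rates):
        cumulative += p
        if u <= cumulative:
            return rng.expovariate(lam)
    return rng.expovariate(rates[-1])  # guard against floating-point rounding

# Phone/modem example with illustrative parameters: talking on the phone
# with probability 0.6 at rate 1.0, using the internet with probability
# 0.4 at rate 0.1.
samples = [sample_hyperexponential([0.6, 0.4], [1.0, 0.1])
           for _ in range(100_000)]
```

The sample mean should be close to the theoretical mean Σ p_i/λ_i = 0.6/1.0 + 0.4/0.1 = 4.6 for these parameters.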

## Properties of the hyper-exponential distribution

Since the expected value of a mixture is the probability-weighted sum of the component expectations (by linearity of the integral), the expected value of a hyper-exponential random variable can be written as

${\displaystyle E(X)=\int _{-\infty }^{\infty }xf(x)\,dx=p_{1}\int _{0}^{\infty }x\lambda _{1}e^{-\lambda _{1}x}\,dx+p_{2}\int _{0}^{\infty }x\lambda _{2}e^{-\lambda _{2}x}\,dx+\cdots +p_{n}\int _{0}^{\infty }x\lambda _{n}e^{-\lambda _{n}x}\,dx}$
${\displaystyle =\sum _{i=1}^{n}{\frac {p_{i}}{\lambda _{i}}}}$

and

${\displaystyle E(X^{2})=\int _{-\infty }^{\infty }x^{2}f(x)\,dx=p_{1}\int _{0}^{\infty }x^{2}\lambda _{1}e^{-\lambda _{1}x}\,dx+p_{2}\int _{0}^{\infty }x^{2}\lambda _{2}e^{-\lambda _{2}x}\,dx+\cdots +p_{n}\int _{0}^{\infty }x^{2}\lambda _{n}e^{-\lambda _{n}x}\,dx}$
${\displaystyle =\sum _{i=1}^{n}{\frac {2}{\lambda _{i}^{2}}}p_{i},}$

from which the variance follows as

${\displaystyle \operatorname {Var} (X)=E(X^{2})-[E(X)]^{2}=\sum _{i=1}^{n}{\frac {2p_{i}}{\lambda _{i}^{2}}}-\left(\sum _{i=1}^{n}{\frac {p_{i}}{\lambda _{i}}}\right)^{2}.}$
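The closed-form moments are straightforward to evaluate numerically. The sketch below computes the mean, second moment, and variance for illustrative parameters p = (0.6, 0.4) and λ = (1.0, 0.1); these values are not from the article.

```python
# Closed-form first two moments of a hyper-exponential distribution,
# using the illustrative parameters p = (0.6, 0.4), lambda = (1.0, 0.1).
probs = [0.6, 0.4]
rates = [1.0, 0.1]

mean = sum(p / lam for p, lam in zip(probs, rates))                  # E(X) = sum p_i/lambda_i
second_moment = sum(2 * p / lam**2 for p, lam in zip(probs, rates))  # E(X^2) = sum 2 p_i/lambda_i^2
variance = second_moment - mean**2                                   # Var(X) = E(X^2) - E(X)^2

print(mean, second_moment, variance)  # ≈ 4.6, 81.2, 60.04
```

Note that the variance (60.04) is far larger than the squared mean (21.16), reflecting the coefficient of variation greater than 1 that gives the distribution its name.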

The moment-generating function, which converges for ${\displaystyle t<\min _{i}\lambda _{i}}$, is given by

${\displaystyle E(e^{tx})=\int _{-\infty }^{\infty }e^{tx}f(x)\,dx=p_{1}\int _{0}^{\infty }e^{tx}\lambda _{1}e^{-\lambda _{1}x}\,dx+p_{2}\int _{0}^{\infty }e^{tx}\lambda _{2}e^{-\lambda _{2}x}\,dx+\cdots +p_{n}\int _{0}^{\infty }e^{tx}\lambda _{n}e^{-\lambda _{n}x}\,dx}$
${\displaystyle =\sum _{i=1}^{n}{\frac {\lambda _{i}}{\lambda _{i}-t}}p_{i}.}$
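The MGF can be sanity-checked numerically: M(0) = Σ p_i = 1, and M′(0) equals the mean Σ p_i/λ_i. The sketch below uses the same illustrative parameters as before (not from the article), and guards against the divergence at t ≥ min λ_i.

```python
probs = [0.6, 0.4]   # illustrative mixing probabilities
rates = [1.0, 0.1]   # illustrative rate parameters

def mgf(t):
    """Hyper-exponential MGF: sum of p_i * lambda_i/(lambda_i - t),
    which converges only for t < min(rates)."""
    if t >= min(rates):
        raise ValueError("MGF diverges for t >= min(rates)")
    return sum(p * lam / (lam - t) for p, lam in zip(probs, rates))

# M(0) recovers total probability 1; a central difference at 0
# approximates M'(0), which equals the mean sum p_i/lambda_i = 4.6 here.
h = 1e-6
print(mgf(0.0))                       # ≈ 1.0
print((mgf(h) - mgf(-h)) / (2 * h))   # ≈ 4.6
```

Higher moments follow the same way: the k-th derivative of the MGF at t = 0 gives E(X^k), matching the moment formulas derived above.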