In the [[mathematics|mathematical]] discipline of [[graph theory]], the '''expander walk sampling theorem''' states that [[Sampling (statistics)|sampling]] [[vertex (graph theory)|vertices]] in an [[expander graph]] by doing a [[Random walk#Random walk on graphs|random walk]] is almost as good as sampling the vertices [[statistical independence|independently]] from a [[uniform distribution (discrete)|uniform distribution]].

The earliest version of this theorem is due to {{harvtxt|Ajtai|Komlós|Szemerédi|1987}}, and the more general version is typically attributed to {{harvtxt|Gillman|1998}}.
==Statement==

Let <math>G = (V, E)</math> be an expander graph with [[Expander graph#Spectral expansion|normalized second-largest eigenvalue]] <math>\lambda</math>. Let <math>n</math> denote the number of vertices in <math>G</math>. Let <math>f : V \rightarrow [0, 1]</math> be a function on the vertices of <math>G</math>. Let <math>\mu = E[f]</math> denote the true mean of <math>f</math>, i.e. <math>\mu = \frac{1}{n} \sum_{v \in V} f(v)</math>. Then, if we let <math>Y_0, Y_1, \ldots, Y_k</math> denote the vertices encountered in a <math>k</math>-step random walk on <math>G</math> starting at a random vertex <math>Y_0</math>, we have the following for all <math>\gamma > 0</math>:

:<math>\Pr\left[\frac{1}{k} \sum_{i=0}^k f(Y_i) - \mu > \gamma\right] \leq e^{-\Omega (\gamma^2 (1-\lambda) k)}.</math>

Here the <math>\Omega</math> hides an absolute constant <math>\geq 1/10</math>. An identical bound holds in the other direction:

:<math>\Pr\left[\frac{1}{k} \sum_{i=0}^k f(Y_i) - \mu < -\gamma\right] \leq e^{-\Omega (\gamma^2 (1-\lambda) k)}.</math>
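For illustration, the quantity bounded by the theorem, namely the average of <math>f</math> along the walk, can be computed by a very short procedure. The following is a minimal Python sketch rather than part of any cited source; the function name and the adjacency-list representation of a regular expander are assumptions made purely for the example.

<syntaxhighlight lang="python">
import random

def walk_average(neighbors, f, k, rng=random):
    """Empirical mean of f along a k-step random walk.

    neighbors -- adjacency lists of a d-regular expander: neighbors[v] lists
                 the neighbours of vertex v (vertices are 0, ..., n-1).
    f         -- function mapping a vertex to a value in [0, 1].
    k         -- number of walk steps; k + 1 vertices are visited in total.
    """
    v = rng.randrange(len(neighbors))   # Y_0: a uniformly random start vertex
    total = f(v)
    for _ in range(k):                  # Y_1, ..., Y_k: the k walk steps
        v = rng.choice(neighbors[v])    # move to a uniformly random neighbour
        total += f(v)
    return total / (k + 1)              # average of f over the visited vertices
</syntaxhighlight>

When <math>\lambda</math> is bounded away from 1, the theorem guarantees that this estimate deviates from <math>\mu</math> by more than <math>\gamma</math> with probability at most <math>e^{-\Omega(\gamma^2 (1-\lambda) k)}</math>.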
==Uses==

This theorem is useful in randomness reduction in the study of [[derandomization]]. Sampling from an expander walk is an example of a randomness-efficient [[sample (statistics)|sampler]]. Note that the number of [[bit]]s used in sampling <math>k</math> independent samples from <math>f</math> is <math>k \log n</math>, whereas if we sample from an infinite family of constant-degree expanders this costs only <math>\log n + O(k)</math>. Such families exist and are efficiently constructible, e.g. the [[Ramanujan graph]]s of [[Alexander Lubotzky|Lubotzky]]–Phillips–Sarnak.
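The bit counts quoted above can be made concrete with a small illustrative calculation. The helper names and the parameters <math>n = 2^{20}</math>, <math>d = 8</math>, <math>k = 1000</math> below are hypothetical choices made for the example, not values from the cited sources.

<syntaxhighlight lang="python">
import math

def bits_independent(n, k):
    # k independent uniform vertices: about ceil(log2 n) random bits each
    return k * math.ceil(math.log2(n))

def bits_expander_walk(n, k, d):
    # one uniform start vertex (log2 n bits) plus k steps on a d-regular
    # expander (log2 d bits per step): log n + O(k) bits in total
    return math.ceil(math.log2(n)) + k * math.ceil(math.log2(d))

# Illustrative values: n = 2**20 vertices, degree d = 8, k = 1000 samples.
# bits_independent(2**20, 1000)      -> 20000 bits
# bits_expander_walk(2**20, 1000, 8) -> 3020 bits
</syntaxhighlight>

With these illustrative numbers the independent sampler uses 20,000 random bits while the expander walk uses 3,020, reflecting the <math>\log n + O(k)</math> cost.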
==Notes==

{{reflist}}

==References==

{{refbegin}}
* {{citation
 | first1=M. | last1=Ajtai
 | first2=J. | last2=Komlós
 | first3=E. | last3=Szemerédi
 | contribution=Deterministic simulation in LOGSPACE
 | title=Proceedings of the Nineteenth Annual ACM Symposium on Theory of Computing
 | pages=132–140
 | year=1987
 | publisher=ACM
 | doi=10.1145/28395.28410
}}
* {{citation
 | first=D. | last=Gillman
 | title=A Chernoff Bound for Random Walks on Expander Graphs
 | journal=SIAM Journal on Computing
 | volume=27
 | issue=4
 | pages=1203–1220
 | year=1998
 | publisher=Society for Industrial and Applied Mathematics
 | doi=10.1137/S0097539794268765
}}
{{refend}}
==External links==

* Proofs of the expander walk sampling theorem. [http://citeseer.ist.psu.edu/gillman98chernoff.html] [http://projecteuclid.org/Dienst/UI/1.0/Summarize/euclid.aoap/1028903453]

[[Category:Sampling (statistics)]]