Expectation propagation


Expectation propagation (EP) is a technique in Bayesian machine learning.[1]

EP finds an approximation to a probability distribution.[1] It uses an iterative approach that exploits the factorization structure of the target distribution.[1] It differs from other Bayesian approximation approaches, such as variational Bayesian methods, in the direction of the divergence it minimizes.[1]

More specifically, suppose we wish to approximate an intractable probability distribution <math>p(\mathbf{x})</math> with a tractable distribution <math>q(\mathbf{x})</math>. Expectation propagation achieves this approximation by minimizing the Kullback-Leibler divergence <math>\mathrm{KL}(p||q)</math>.[1] Variational Bayesian methods minimize <math>\mathrm{KL}(q||p)</math> instead.[1]

If <math>q(\mathbf{x})</math> is a Gaussian <math>\mathcal{N}(\mathbf{x}|\mu, \Sigma)</math>, then <math>\mathrm{KL}(p||q)</math> is minimized with <math>\mu</math> and <math>\Sigma</math> being equal to the mean of <math>p(\mathbf{x})</math> and the covariance of <math>p(\mathbf{x})</math>, respectively; this is called moment matching.[1]
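The moment-matching property above can be illustrated with a small numerical sketch. Here the intractable <math>p(\mathbf{x})</math> is stood in for by a two-component Gaussian mixture (an illustrative choice, not from the article), whose mean and variance can be computed in closed form; the Gaussian <math>q</math> minimizing <math>\mathrm{KL}(p||q)</math> simply copies those two moments.

```python
import numpy as np

# Illustrative target p(x): a two-component Gaussian mixture,
# standing in for a distribution that is awkward to use directly.
weights = np.array([0.5, 0.5])
means   = np.array([-2.0, 2.0])
vars_   = np.array([1.0, 1.0])

# Moment matching: the Gaussian q(x) = N(mu, var) that minimizes
# KL(p||q) takes mu = E_p[x] and var = Var_p[x].
mu  = np.sum(weights * means)               # E_p[x]
ex2 = np.sum(weights * (vars_ + means**2))  # E_p[x^2]
var = ex2 - mu**2                           # Var_p[x]

print(mu, var)  # 0.0 5.0
```

Note that the mixture is bimodal while the matched Gaussian is not: minimizing <math>\mathrm{KL}(p||q)</math> spreads <math>q</math> to cover all of the mass of <math>p</math>, in contrast to the mode-seeking behaviour of <math>\mathrm{KL}(q||p)</math>.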

Applications

Expectation propagation via moment matching plays a vital role in approximating the indicator functions that appear when deriving the message-passing equations for TrueSkill.
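The moment-matching step applied to such an indicator factor can be sketched as follows. For a standard normal variable restricted by the indicator <math>\mathbb{1}[x > t]</math>, the truncated distribution's mean and variance have closed forms via the inverse Mills ratio, and EP replaces the indicator's effect with a Gaussian carrying those moments. This is a standalone sketch under that assumption, not code from any TrueSkill implementation.

```python
import math

def phi(x):
    """Standard normal probability density function."""
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def Phi(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def truncated_moments(t):
    """Mean and variance of a standard normal restricted to x > t.

    These are the moments a Gaussian must carry to moment-match the
    indicator factor 1[x > t]; the function name is illustrative.
    """
    v = phi(t) / (1.0 - Phi(t))  # inverse Mills ratio
    mean = v                     # E[x | x > t]
    var = 1.0 - v * (v - t)      # Var[x | x > t]
    return mean, var

# Truncation at t = 0: mean sqrt(2/pi) ~ 0.798, variance 1 - 2/pi ~ 0.363.
print(truncated_moments(0.0))
```

The shrunken variance reflects how observing the outcome of a comparison (the indicator) sharpens the Gaussian belief passed along in the message-passing schedule.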

References

1. Minka, Thomas P. (2001). "Expectation Propagation for Approximate Bayesian Inference". Proceedings of the Seventeenth Conference on Uncertainty in Artificial Intelligence (UAI 2001).
