Expectation propagation



Expectation propagation (EP) is a technique in Bayesian machine learning.[1]

EP finds approximations to a probability distribution.[1] It uses an iterative approach that exploits the factorization structure of the target distribution.[1] It differs from other Bayesian approximation approaches such as variational Bayesian methods.[1]

More specifically, suppose we wish to approximate an intractable probability distribution <math>p(\mathbf{x})</math> with a tractable distribution <math>q(\mathbf{x})</math>. Expectation propagation achieves this approximation by minimizing the Kullback–Leibler divergence <math>\mathrm{KL}(p||q)</math>.[1] Variational Bayesian methods minimize <math>\mathrm{KL}(q||p)</math> instead.[1]
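The asymmetry between the two divergence directions can be seen numerically. The following sketch uses a small, purely illustrative pair of discrete distributions (the values are assumptions, not from the article): a bimodal target p and a unimodal approximation q concentrated on one mode.

```python
import math

# Illustrative distributions over 5 states (not from the article):
# p is bimodal, q puts most of its mass on one mode of p.
p = [0.45, 0.04, 0.02, 0.04, 0.45]
q = [0.85, 0.10, 0.03, 0.01, 0.01]

def kl(a, b):
    """KL(a||b) = sum_i a_i * log(a_i / b_i), with 0 * log 0 = 0."""
    return sum(ai * math.log(ai / bi) for ai, bi in zip(a, b) if ai > 0)

# KL(p||q) heavily penalizes q being small where p has mass
# (mass-covering); KL(q||p) penalizes q placing mass where p is
# small (mode-seeking). The two values differ markedly.
print(kl(p, q))
print(kl(q, p))
```

Here KL(p||q) is much larger than KL(q||p), because q assigns almost no mass to p's second mode; minimizing each direction therefore favors qualitatively different approximations.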

If <math>q(\mathbf{x})</math> is a Gaussian <math>\mathcal{N}(\mathbf{x}|\mu, \Sigma)</math>, then <math>\mathrm{KL}(p||q)</math> is minimized with <math>\mu</math> and <math>\Sigma</math> being equal to the mean of <math>p(\mathbf{x})</math> and the covariance of <math>p(\mathbf{x})</math>, respectively; this is called moment matching.[1]
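Moment matching with a Gaussian can be sketched in a few lines. The example below approximates a two-component Gaussian mixture p(x) by a single Gaussian q(x) = N(μ, σ²); the mixture weights, means, and variances are illustrative assumptions, not taken from the article.

```python
# Illustrative mixture p(x) = sum_k w_k * N(x | m_k, v_k)
# (parameters are assumptions for the sketch).
weights = [0.5, 0.5]
means = [-2.0, 2.0]
variances = [1.0, 1.0]

# Moment matching sets mu and sigma2 of q to the mean and
# variance of p.
# Mean of the mixture: E[x] = sum_k w_k * m_k
mu = sum(w * m for w, m in zip(weights, means))
# Second moment: E[x^2] = sum_k w_k * (v_k + m_k^2)
second = sum(w * (v + m * m) for w, m, v in zip(weights, means, variances))
sigma2 = second - mu * mu  # Var[x] = E[x^2] - (E[x])^2

print(mu, sigma2)  # → 0.0 5.0
```

Note that the matched Gaussian N(0, 5) covers both modes of the mixture rather than locking onto one of them, which is the characteristic mass-covering behavior of minimizing KL(p||q).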

Applications

Expectation propagation via moment matching plays a vital role in approximating the indicator functions that appear when deriving the message-passing equations for TrueSkill.
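The core moment-matching step behind such indicator factors can be sketched as follows: multiplying a Gaussian belief N(μ, σ²) by an indicator like 1[x > 0] yields a truncated Gaussian, whose mean and variance have closed forms via the inverse Mills ratio. This is a generic sketch of that computation, not TrueSkill's actual implementation.

```python
import math

def phi(x):
    """Standard normal pdf."""
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def Phi(x):
    """Standard normal cdf, via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def truncated_moments(mu, sigma2, lower=0.0):
    """Mean and variance of N(mu, sigma2) restricted to x > lower.

    These are the moments EP matches when a factor is an indicator
    such as 1[x > 0] (e.g. a win/lose constraint).
    """
    sigma = math.sqrt(sigma2)
    alpha = (lower - mu) / sigma
    lam = phi(alpha) / (1.0 - Phi(alpha))  # inverse Mills ratio
    new_mu = mu + sigma * lam
    new_var = sigma2 * (1.0 - lam * (lam - alpha))
    return new_mu, new_var

# Standard normal truncated at 0: mean sqrt(2/pi), variance 1 - 2/pi.
m, v = truncated_moments(0.0, 1.0)
print(m, v)
```

The Gaussian with these matched moments then replaces the intractable truncated distribution in the message-passing updates.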

References

1. Bishop, Christopher (2007). Pattern Recognition and Machine Learning. New York: Springer-Verlag New York Inc. ISBN 978-0387310732.
