Durante, D., Rigon, T. (2019). Conditionally conjugate mean-field variational Bayes for logistic models. STATISTICAL SCIENCE, 34(3), 472-485 [10.1214/19-STS712].

Conditionally conjugate mean-field variational Bayes for logistic models

Rigon T.
Second (author position)
2019

Abstract

Variational Bayes (VB) is a common strategy for approximate Bayesian inference, but simple methods are only available for specific classes of models, including, in particular, representations having conditionally conjugate constructions within an exponential family. Models with logit components are an apparently notable exception to this class, due to the absence of conjugacy between the logistic likelihood and the Gaussian priors for the coefficients in the linear predictor. To facilitate approximate inference within this widely used class of models, Jaakkola and Jordan (Stat. Comput. 10 (2000) 25-37) proposed a simple variational approach that relies on a family of tangent quadratic lower bounds of the logistic log-likelihood, thus restoring conjugacy between these approximate bounds and the Gaussian priors. This strategy is still implemented successfully, but few attempts have been made to formally understand the reasons underlying its excellent performance. Following a review of VB for logistic models, we fill this gap by providing a formal connection between the above bound and a recent Pólya-gamma data augmentation for logistic regression. This result places the computational methods associated with the aforementioned bounds within the framework of variational inference for conditionally conjugate exponential family models, thereby allowing recent advances for this class to also be inherited by the methods relying on Jaakkola and Jordan (Stat. Comput. 10 (2000) 25-37).
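For orientation, a minimal LaTeX sketch of the two ingredients the abstract refers to, as they are commonly stated in the literature: the Jaakkola and Jordan (2000) tangent quadratic lower bound on the logistic log-likelihood, and the Pólya-gamma integral identity underlying the data augmentation of Polson, Scott and Windle (2013). The notation (eta, xi, omega, lambda) is illustrative and not taken from the paper; the final moment identity is a standard fact that hints at how the two constructions are linked, not the paper's exact statement.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Jaakkola--Jordan tangent quadratic lower bound on the logistic log-likelihood,
% valid for every eta and every tuning point xi (illustrative notation):
\[
\log\sigma(\eta) \;\ge\; \log\sigma(\xi) + \frac{\eta-\xi}{2} - \lambda(\xi)\,(\eta^{2}-\xi^{2}),
\qquad
\lambda(\xi) = \frac{1}{4\xi}\tanh\!\left(\frac{\xi}{2}\right).
\]
% Polya-gamma integral identity (Polson, Scott and Windle, 2013), with kappa = a - b/2:
\[
\frac{(e^{\eta})^{a}}{(1+e^{\eta})^{b}}
= 2^{-b}\, e^{\kappa\eta}\int_{0}^{\infty} e^{-\omega\eta^{2}/2}\, p(\omega)\,\mathrm{d}\omega,
\qquad \omega \sim \mathrm{PG}(b,0).
\]
% Standard moment identity suggesting the connection between the two:
\[
\mathbb{E}_{\omega \sim \mathrm{PG}(1,\xi)}[\omega]
= \frac{1}{2\xi}\tanh\!\left(\frac{\xi}{2}\right)
= 2\,\lambda(\xi).
\]
\end{document}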
Journal article - Scientific article
EM; Logistic regression; Pólya-gamma data augmentation; Quadratic approximation; Variational Bayes
English
2019
34
3
472
485
none
Files in this item:
There are no files associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10281/289199
Citations
  • Scopus 14
  • Web of Science (ISI) 15