Monash University

Updating Variational Bayes: Fast Sequential Posterior Inference

journal contribution
posted on 2022-11-10, 01:53 authored by Nathaniel Tomasetti, Catherine Forbes, Anastasios Panagiotelis
Variational Bayes (VB) methods produce posterior inference in a time frame considerably smaller than traditional Markov chain Monte Carlo approaches. Although the VB posterior is an approximation, it has been shown to produce good parameter estimates and predicted values when a rich class of approximating distributions is considered. In this paper we propose the use of recursive algorithms to update a sequence of VB posterior approximations in an online, time series setting, with each posterior update requiring only the data observed since the previous update. We show how importance sampling can be incorporated into online variational inference, allowing the user to trade accuracy for a substantial increase in computational speed. The proposed methods and their properties are detailed in two separate simulation studies. Two empirical illustrations of the methods are provided, including one in which a Dirichlet Process Mixture model with a novel posterior dependence structure is repeatedly updated in the context of predicting the future behaviour of vehicles on a stretch of US Highway 101.
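The recursive idea in the abstract — each update treats the previous (approximate) posterior as the prior and processes only the data observed since the last update — can be illustrated with a deliberately simplified sketch. The model below (a Gaussian mean with known noise variance, Gaussian approximating family) is an assumption chosen because the update is available in closed form; it is not the paper's general algorithm, and the function and parameter names are hypothetical.

```python
import numpy as np

def update_posterior(prior_mean, prior_var, batch, noise_var=1.0):
    """One recursive update for the mean of a Normal(theta, noise_var) model.

    The previous approximate posterior N(prior_mean, prior_var) plays the
    role of the prior; only the newly observed batch is processed, so the
    cost of each update does not grow with the full data history.
    """
    n = len(batch)
    post_prec = 1.0 / prior_var + n / noise_var          # precisions add
    post_var = 1.0 / post_prec
    post_mean = post_var * (prior_mean / prior_var + batch.sum() / noise_var)
    return post_mean, post_var

# Simulated online setting: data arrive in batches from theta = 2.0.
rng = np.random.default_rng(0)
theta = 2.0
mean, var = 0.0, 10.0  # diffuse initial prior
for _ in range(20):
    batch = rng.normal(theta, 1.0, size=50)  # data since the previous update
    mean, var = update_posterior(mean, var, batch)
```

After 20 batches the sequence of posteriors concentrates around the true mean, with each step having touched only 50 observations. In the paper's non-conjugate settings this closed-form step would be replaced by a variational optimisation (optionally importance-weighted, trading accuracy for speed), but the batch-by-batch recursion has the same shape.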

History

JEL Classification

C11, G18, G39

Creation date

2020-08-16

Working Paper Series Number

27/20

Length

36

File Format

application/pdf

Handle

RePEc:msh:ebswps:2020-27
