Approximate Bayesian inference is a statistical learning methodology with wide-ranging applications in sequential information collection problems, particularly those where a decision maker must use incomplete or censored information to maintain and update a set of beliefs about one or more unknown population parameters. Approximate Bayesian models are attractive for their computational tractability and often lead to compact belief representations that can interface with simple and interpretable policies for related decision problems.
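To make the idea concrete, here is a minimal, hypothetical sketch of an approximate Bayesian update under censored information. It assumes a normal belief about an unknown mean and a right-censored normal observation; because the exact posterior is non-Gaussian, the update projects it back onto a compact normal belief by grid-based moment matching. The function name, the censoring model, and the grid scheme are illustrative choices, not taken from the paper.

```python
import math

def censored_update(mu, sigma, y, c, obs_sigma=1.0, n=2001):
    """Approximate Bayesian update of a N(mu, sigma^2) belief about an
    unknown mean theta, given a right-censored observation y = min(X, c)
    with X ~ N(theta, obs_sigma^2).  The exact posterior is non-Gaussian,
    so we project it back to a normal by moment matching on a grid."""
    lo, hi = mu - 6.0 * sigma, mu + 6.0 * sigma
    grid = [lo + (hi - lo) * i / (n - 1) for i in range(n)]
    prior = lambda t: math.exp(-0.5 * ((t - mu) / sigma) ** 2)
    if y < c:
        # uncensored: ordinary normal likelihood of the observed value
        lik = lambda t: math.exp(-0.5 * ((y - t) / obs_sigma) ** 2)
    else:
        # censored: we only learn X >= c, so use the normal tail probability
        lik = lambda t: 0.5 * math.erfc((c - t) / (obs_sigma * math.sqrt(2)))
    w = [prior(t) * lik(t) for t in grid]
    z = sum(w)
    new_mu = sum(t * wi for t, wi in zip(grid, w)) / z
    new_var = sum((t - new_mu) ** 2 * wi for t, wi in zip(grid, w)) / z
    return new_mu, math.sqrt(new_var)

mu1, s1 = censored_update(0.0, 1.0, y=1.0, c=1.0)   # censored: learn only X >= 1
mu2, s2 = censored_update(0.0, 1.0, y=0.5, c=10.0)  # ordinary observation
```

A censored observation shifts the belief upward and still tightens it, while an uncensored observation reproduces the usual conjugate normal update; the key point is that the belief stays in the same compact two-parameter family after every update.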

Approximate Bayesian methods have repeatedly demonstrated significant practical benefits. They have been successfully used in market design, posted-price auctions, online gaming, big-data analytics, approximate dynamic programming, and ranking and selection.

However, virtually all of the existing work on sequential approximate Bayesian learning is computational or algorithmic in nature, and the consistency of approximate Bayesian estimators has remained a largely open problem, resistant to the usual forms of consistency analysis.

A new paper in the journal Operations Research, "Consistency Analysis of Sequential Learning Under Approximate Bayesian Inference," develops a theoretical framework that can be leveraged to produce new consistency proofs.

Associate Professor Ilya Ryzhov (BMG/ISR) and his former student Ye Chen (Math Ph.D. 2018) introduce a new consistency theory that interprets approximate Bayesian inference as a form of stochastic approximation (SA) with an additional “bias” term. They prove the convergence of a general SA algorithm of this form and leverage this analysis to derive the first consistency proofs for a suite of approximate Bayesian models from the recent literature.
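The "stochastic approximation with a bias term" viewpoint can be illustrated with a toy example. The sketch below runs a Robbins-Monro recursion whose gradient estimate carries an extra deterministic bias that decays over time, standing in for the approximation error of an approximate Bayesian update; the objective, step sizes, and decay rate are all illustrative assumptions, not the authors' actual algorithm or proof conditions.

```python
import random

def biased_sa(noisy_grad, bias, theta0=0.0, n_iters=20000, seed=1):
    """Robbins-Monro stochastic approximation with an added bias term:
        theta_{n+1} = theta_n - a_n * (noisy gradient + bias_n),
    with step sizes a_n = 1/n.  When the bias decays to zero, the
    iterates can still converge to the root of the mean gradient."""
    rng = random.Random(seed)
    theta = theta0
    for n in range(1, n_iters + 1):
        a_n = 1.0 / n
        theta -= a_n * (noisy_grad(theta, rng) + bias(n))
    return theta

# toy target: root of E[g(theta)] = theta - 3, i.e. theta* = 3
grad = lambda theta, rng: (theta - 3.0) + rng.gauss(0.0, 1.0)
bias = lambda n: 1.0 / n  # decaying "approximation" bias
est = biased_sa(grad, bias)
```

Despite the perturbation, the iterates settle near the true root; the paper's contribution is a general convergence theory for recursions of this form, which then yields consistency proofs for specific approximate Bayesian models.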

This paper is among the first to provide broad theoretical support for approximate Bayesian procedures. The authors prove, for the first time, the statistical consistency of a wide variety of previously proposed approximate Bayesian estimators, providing insight into their good empirical performance. They also develop theoretical tools that may be used by researchers to develop similar proofs for other problems and applications.




January 7, 2020



 
 