Summary
In this chapter, we introduced several Bayesian machine learning methods designed to address the challenges posed by wide and tall data. However, the field of Bayesian machine learning is rapidly evolving, and the material presented here should be viewed as an introductory overview.
Many highly relevant topics were not covered, such as Bayesian neural networks (Neal 2012) and neural posterior estimation (Papamakarios and Murray 2016; Lueckmann et al. 2017; Greenberg, Nonnenmacher, and Macke 2019).
Other key approaches are introduced in Chapter 14, notably Variational Bayes, particularly in its stochastic implementations, which is rooted in machine learning and offers scalable solutions for tall data (Wainwright and Jordan 2008).
Greenberg, David S., Marcel Nonnenmacher, and Jakob H. Macke. 2019. “Automatic Posterior Transformation for Likelihood-Free Inference.” In International Conference on Machine Learning, 2404–14.
Lueckmann, Jan-Matthis, Pedro J. Goncalves, Gabriele Bassetto, Kaan Öcal, Marcel Nonnenmacher, and Jakob H. Macke. 2017. “Flexible Statistical Inference for Mechanistic Models of Neural Dynamics.” In Advances in Neural Information Processing Systems. Vol. 30.
Neal, Radford M. 2012. Bayesian Learning for Neural Networks. Vol. 118. Springer Science & Business Media.
Papamakarios, George, and Iain Murray. 2016. “Fast ε-Free Inference of Simulation Models with Bayesian Conditional Density Estimation.” In Advances in Neural Information Processing Systems. Vol. 29.
Wainwright, Martin J., and Michael I. Jordan. 2008. “Graphical Models, Exponential Families, and Variational Inference.” Foundations and Trends in Machine Learning 1 (1–2): 1–305.