Bayesian methods in Machine Learning
Bayesian methods have recently regained significant attention in the machine learning community due to the development of scalable approximate Bayesian inference techniques. The Bayesian approach offers several advantages: parameter and prediction uncertainty become readily available, facilitating rigorous statistical analysis, and prior knowledge can be incorporated in a principled way.
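As a minimal illustration of these two advantages (this is a generic textbook sketch, not one of the project's methods), a conjugate Beta-Bernoulli model shows how a prior is incorporated and how the posterior delivers parameter uncertainty in closed form:

```python
import numpy as np

# Observed binary outcomes (e.g. successes/failures in repeated trials).
data = np.array([1, 0, 1, 1, 0, 1, 1, 1, 0, 1])

# Prior knowledge enters through a Beta(a0, b0) prior on the success
# probability; Beta(2, 2) mildly favours values near 0.5.
a0, b0 = 2.0, 2.0

# Conjugacy gives the posterior in closed form:
# Beta(a0 + #successes, b0 + #failures).
a_post = a0 + data.sum()
b_post = b0 + len(data) - data.sum()

# Posterior mean and a 95% credible interval quantify parameter uncertainty.
post_mean = a_post / (a_post + b_post)
rng = np.random.default_rng(0)
samples = rng.beta(a_post, b_post, size=100_000)
ci_lo, ci_hi = np.percentile(samples, [2.5, 97.5])
```

For models without conjugate structure, the posterior is intractable, which is exactly where the scalable approximate inference techniques mentioned above (e.g. variational approximations and MCMC) come in.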
About the project
In this project we develop new models, methods and algorithms for Bayesian machine learning, in particular related to deep neural networks and computational causal inference.
- Bayesian estimation of causal effects using directed graphical models. PhD project for Vera Kvisgaard.
- A Bayesian model-averaging toolkit for causal inference with observational data under nonlinear structural equations: An application to the effect of ADHD treatment on school performance by Norwegian children. PhD project for Johan de Aguas.
- Normalizing flows as variational inference approximations in latent binary Bayesian neural network models. Master project by Lars Skaaret-Lund.
- A fully probabilistic methodology for providing diverse, personalised recommendations from clicking data, using a variational Bayesian approach for fast computation. Master project by Haakon Muggerud.
- Subsampling strategies for Bayesian variable selection and model averaging in GLM and BGNLM. Master project by Jon Lachmann (Stockholm University).
- Identification of non-linear models with a Bayesian model selection tool. Master project by Elke Bruns (University of Vienna).
- Aliaksandr Hubin is hired as a postdoc through the "Akademia-avtale" (Akademia agreement) with Equinor.
- Part of the activity is financed through BigInsight.
Publications
- Lachmann, Storvik, Frommlet and Hubin: A subsampling approach for Bayesian model selection. arXiv preprint arXiv …, 2022
- Hubin, Storvik and Frommlet: Flexible Bayesian Nonlinear Model Configuration, Journal of Artificial Intelligence Research, 2021
- Hubin, Frommlet and Storvik: Reversible Genetically Modified Mode Jumping MCMC, Proceedings of 22nd European Young Statisticians Meeting, 2021
- Pensar, Talvitie, Hyttinen, and Koivisto: A Bayesian approach for estimating causal effects from observational data, Proceedings of the 34th AAAI Conference on Artificial Intelligence, 2020
- Viinikka, Hyttinen, Pensar, and Koivisto: Towards scalable Bayesian learning of causal DAGs, Proceedings of the 34th Conference on Neural Information Processing Systems (NeurIPS), 2020
- Hubin, Storvik and Frommlet: A Novel Algorithmic Approach to Bayesian Logic Regression (with Discussion), Bayesian Analysis, 2020
- Hubin and Storvik: Combining Model and Parameter Uncertainty in Bayesian Neural Networks, arXiv preprint arXiv:1903.07594, 2019
- Hubin and Storvik: Mode jumping MCMC for Bayesian variable selection in GLMM, Computational Statistics & Data Analysis, 2018
- Hubin and Storvik: Estimating the marginal likelihood with Integrated nested Laplace approximation (INLA), arXiv preprint arXiv:1611.01450, 2016