-
Storvik, Geir Olve & Diz-Lois Palomares, Alfonso
(2023).
Sequential Monte Carlo for infectious disease models - the COVID-19 case.
-
Hubin, Aliaksandr & Storvik, Geir Olve
(2022).
Variational Bayes for inference on model and parameter uncertainty in Bayesian neural networks.
Bayesian neural networks (BNNs) have recently regained a significant amount of attention in the deep learning community due to the development of scalable approximate Bayesian inference techniques [1]. There are several advantages of using a Bayesian approach: parameter and prediction uncertainties become easily available, facilitating rigorous statistical analysis, and prior knowledge can be incorporated. However, so far there have been no scalable techniques capable of combining both model (structural) and parameter uncertainty. In the presented piece of research [2] we introduce the concept of model uncertainty in BNNs and hence make inference in the joint space of models and parameters. Moreover, we suggest an adaptation of a scalable variational inference approach with reparametrization of marginal inclusion probabilities to incorporate the model space constraints. Experimental results on a range of benchmark data sets show that we obtain accuracy comparable to that of competing models, but with methods that are much more sparse than ordinary BNNs. This is particularly the case in model selection settings, but considerable sparsification is also achieved within a Bayesian model averaging setting. As expected, model uncertainties give higher but more reliable uncertainty measures.
References:
[1] Blundell, C., Cornebise, J., Kavukcuoglu, K. & Wierstra, D. (2015). Weight uncertainty in neural networks. International Conference on Machine Learning, 1613–1622.
[2] Hubin, A. & Storvik, G. (2019). Combining model and parameter uncertainty in Bayesian neural networks. arXiv preprint arXiv:1903.07594.
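As a rough illustration of the approach described in the abstract, here is a minimal Python sketch (hypothetical code, not the authors' implementation; all shapes and variational parameters are invented) of a BNN layer carrying both parameter uncertainty, via a Gaussian variational posterior over each weight, and model uncertainty, via Bernoulli inclusion indicators with variational inclusion probabilities:

```python
# Hypothetical sketch: a fully connected layer with Gaussian variational
# posteriors q(w) = N(mu, sigma^2) and Bernoulli inclusion indicators
# q(gamma) = Bernoulli(alpha); gamma = 0 prunes the weight.
import numpy as np

rng = np.random.default_rng(0)

def sample_layer(x, mu, sigma, alpha, rng):
    """One forward pass with a joint draw of (structure, weights)."""
    gamma = rng.random(mu.shape) < alpha             # sample the structure
    w = mu + sigma * rng.standard_normal(mu.shape)   # reparametrized weights
    return x @ (gamma * w)                           # pruned linear map

# Toy variational parameters for a 4-input, 3-unit layer.
mu = rng.standard_normal((4, 3))
sigma = 0.1 * np.ones((4, 3))
alpha = rng.uniform(0.05, 0.95, size=(4, 3))

x = rng.standard_normal((2, 4))
# Bayesian model averaging: average predictions over joint draws.
preds = np.mean([sample_layer(x, mu, sigma, alpha, rng) for _ in range(100)], axis=0)
# Model selection: keep only weights with inclusion probability above 0.5.
kept = int((alpha > 0.5).sum())
print(preds.shape, kept, "weights kept of", alpha.size)
```

Averaging forward passes over joint draws of structure and weights corresponds to the Bayesian model averaging setting, while thresholding the inclusion probabilities at 0.5 yields the sparse model-selection variant referred to above.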
-
Hubin, Aliaksandr & Storvik, Geir Olve
(2022).
Variational Inference for Bayesian Neural Networks under Model and Parameter Uncertainty.
-
Lachmann, Jon; Storvik, Geir Olve; Frommlet, Florian & Hubin, Aliaksandr
(2022).
A subsampling approach for Bayesian model selection.
-
Hubin, Aliaksandr; Frommlet, Florian & Storvik, Geir Olve
(2021).
Reversible Genetically Modified MCMCs.
In this work, we introduce a reversible version of the genetically modified Markov chain Monte Carlo algorithm (GMJMCMC) for inference on posterior model probabilities in complex functional spaces, where the number of explanatory variables, or functions of explanatory variables, is prohibitively large for simple Markov chain Monte Carlo methods. GMJMCMC was introduced in [5, 4, 2] for Bayesian model selection/averaging problems in which the total number of functions of covariates is prohibitively large.
More specifically, these applications include GWAS studies with Bayesian generalized linear models [2] as well as Bayesian logic regressions [5] and Bayesian generalized nonlinear models [4]. If its regularity conditions are met, the GMJMCMC algorithm can asymptotically explore all models in the defined model spaces. At the same time, GMJMCMC is not a proper MCMC in the sense that its limiting distribution does not correspond to the marginal posterior model probabilities, so only renormalized estimates of these probabilities [3, 1] can be obtained. Unlike the standard GMJMCMC algorithm, the introduced algorithm is a proper MCMC, and under reasonable regularity conditions its limiting distribution corresponds to the posterior marginal model probabilities in the explored model spaces.
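To make the contrast concrete, the following minimal Python sketch (hypothetical: a made-up three-covariate model space with invented unnormalized posterior scores, not the authors' implementation) compares the renormalized estimator over visited models with the visit frequencies of a simple reversible Metropolis-Hastings chain, which is the property the introduced algorithm provides:

```python
# Hypothetical illustration of the two estimators: renormalized posterior
# model probabilities over visited models vs. visit frequencies of a
# proper (reversible) Metropolis-Hastings chain over model space.
import itertools
import numpy as np

rng = np.random.default_rng(1)

# Toy model space: all subsets of 3 covariates, p(m | y) proportional to score(m).
models = list(itertools.product([0, 1], repeat=3))
score = {m: np.exp(rng.normal()) for m in models}

def renormalized(visited):
    """Normalize the unnormalized scores over the set of visited models."""
    z = sum(score[m] for m in visited)
    return {m: score[m] / z for m in visited}

# Reversible MH: flip one inclusion indicator (symmetric proposal) and
# accept with min(1, score ratio); visit frequencies converge to the posterior.
m = models[0]
counts = {mm: 0 for mm in models}
n_iter = 50_000
for _ in range(n_iter):
    j = rng.integers(3)
    prop = tuple(b ^ (i == j) for i, b in enumerate(m))
    if rng.random() < score[prop] / score[m]:
        m = prop
    counts[m] += 1

freq = {mm: c / n_iter for mm, c in counts.items()}
exact = renormalized(set(models))   # all models visited, so this is exact here
for mm in models:
    print(mm, f"renormalized={exact[mm]:.3f}", f"MH frequency={freq[mm]:.3f}")
```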
-
Hubin, Aliaksandr & Storvik, Geir Olve
(2021).
Variational Inference for Bayesian Neural Networks under Model and Parameter Uncertainty.
Bayesian neural networks (BNNs) have recently regained a significant amount of attention in the deep learning community due to the development of scalable approximate Bayesian inference techniques. There are several advantages of using a Bayesian approach: parameter and prediction uncertainties become easily available, facilitating rigorous statistical analysis, and prior knowledge can be incorporated. However, so far there have been no scalable techniques capable of combining both model (structural) and parameter uncertainty. In this paper we introduce the concept of model uncertainty in BNNs and hence make inference in the joint space of models and parameters. Moreover, we suggest an adaptation of a scalable variational inference approach with reparametrization of marginal inclusion probabilities to incorporate the model space constraints. Experimental results on a range of benchmark data sets show that we obtain accuracy comparable to that of competing models, but with methods that are much more sparse than ordinary BNNs. This is particularly the case in model selection settings, but considerable sparsification is also achieved within a Bayesian model averaging setting. As expected, model uncertainties give higher but more reliable uncertainty measures.
-
Hubin, Aliaksandr & Storvik, Geir Olve
(2021).
Variational Bayes for inference on model and parameter uncertainty in Bayesian neural networks.
Bayesian neural networks (BNNs) have recently regained a significant amount of attention in the deep learning community due to the development of scalable approximate Bayesian inference techniques [1]. There are several advantages of using a Bayesian approach: parameter and prediction uncertainties become easily available, facilitating rigorous statistical analysis, and prior knowledge can be incorporated. However, so far there have been no scalable techniques capable of combining both model (structural) and parameter uncertainty. In the presented piece of research [2] we introduce the concept of model uncertainty in BNNs and hence make inference in the joint space of models and parameters. Moreover, we suggest an adaptation of a scalable variational inference approach with reparametrization of marginal inclusion probabilities to incorporate the model space constraints. Experimental results on a range of benchmark data sets show that we obtain accuracy comparable to that of competing models, but with methods that are much more sparse than ordinary BNNs. This is particularly the case in model selection settings, but considerable sparsification is also achieved within a Bayesian model averaging setting. As expected, model uncertainties give higher but more reliable uncertainty measures.
-
Hubin, Aliaksandr; Storvik, Geir Olve & Frommlet, Florian
(2020).
A novel algorithmic approach to Bayesian Logic Regression.
Logic regression was developed more than a decade ago as a tool to construct predictors from Boolean combinations of binary covariates. It has mainly been used to model epistatic effects in genetic association studies, which is very appealing due to the intuitive interpretation of logic expressions for describing the interaction between genetic variations. Nevertheless, logic regression has (partly due to computational challenges) remained less well known than other approaches to epistatic association mapping. Here we adapt an advanced evolutionary algorithm called GMJMCMC (Genetically modified Mode Jumping Markov Chain Monte Carlo) to perform Bayesian model selection in the space of logic regression models. After describing the algorithmic details of GMJMCMC we perform a comprehensive simulation study that illustrates its performance given logic regression terms of various complexity. Specifically, GMJMCMC is shown to be able to identify three-way and even four-way interactions with relatively large power, a level of complexity which has not been achieved by previous implementations of logic regression. We apply GMJMCMC to reanalyze QTL (quantitative trait locus) mapping data for Recombinant Inbred Lines in Arabidopsis thaliana and from a backcross population in Drosophila, where we identify several interesting epistatic effects. The method is implemented in an R package which is available on GitHub.
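For readers unfamiliar with logic regression, the following small Python sketch (purely illustrative; the data and effect sizes are invented) shows the kind of Boolean predictor the method searches over and why recovering such a term matters:

```python
# Hypothetical example of a logic-regression predictor: a Boolean tree of
# binary covariates, L = (X1 AND NOT X2) OR X3, used as a regressor.
import numpy as np

rng = np.random.default_rng(2)
n = 1000
X = rng.integers(0, 2, size=(n, 4))   # binary covariates, e.g. SNP indicators

# A three-way logic term of the kind GMJMCMC is reported to recover.
L = (((X[:, 0] == 1) & (X[:, 1] == 0)) | (X[:, 2] == 1)).astype(int)

# Simulate a response driven by the logic term, then check that regressing
# on the correct Boolean feature recovers its effect.
y = 0.5 + 1.5 * L + rng.normal(scale=1.0, size=n)
slope, intercept = np.polyfit(L, y, 1)
print("estimated effect of logic term:", round(slope, 2))   # close to 1.5
```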
-
Hubin, Aliaksandr; Storvik, Geir Olve & Frommlet, Florian
(2020).
Rejoinder for the discussion of the paper "A Novel Algorithmic Approach to Bayesian Logic Regression".
Bayesian Analysis.
ISSN 1936-0975.
15(1),
p. 312–333.
doi: 10.1214/18-ba1141.
Logic regression was developed more than a decade ago as a tool to construct predictors from Boolean combinations of binary covariates. It has mainly been used to model epistatic effects in genetic association studies, which is very appealing due to the intuitive interpretation of logic expressions for describing the interaction between genetic variations. Nevertheless, logic regression has (partly due to computational challenges) remained less well known than other approaches to epistatic association mapping. Here we adapt an advanced evolutionary algorithm called GMJMCMC (Genetically modified Mode Jumping Markov Chain Monte Carlo) to perform Bayesian model selection in the space of logic regression models. After describing the algorithmic details of GMJMCMC we perform a comprehensive simulation study that illustrates its performance given logic regression terms of various complexity. Specifically, GMJMCMC is shown to be able to identify three-way and even four-way interactions with relatively large power, a level of complexity which has not been achieved by previous implementations of logic regression. We apply GMJMCMC to reanalyze QTL (quantitative trait locus) mapping data for Recombinant Inbred Lines in Arabidopsis thaliana and from a backcross population in Drosophila, where we identify several interesting epistatic effects. The method is implemented in an R package which is available on GitHub.
-
Storvik, Geir Olve
(2020).
"Preliminaries for Deep Neural Networks: Recapture of Linear Algebra, Gradient Descents and Generalized Linear Models".
-
Storvik, Geir Olve
(2020).
Neural networks vs generalized linear models.
-
Hubin, Aliaksandr & Storvik, Geir Olve
(2019).
Combining Model and Parameter Uncertainty in Bayesian Neural Networks.
Bayesian neural networks (BNNs) have recently regained a significant amount of attention in the deep learning community due to the development of scalable approximate Bayesian inference techniques. There are several advantages of using a Bayesian approach: parameter and prediction uncertainty become easily available, facilitating rigorous statistical analysis, and prior knowledge can be incorporated. However, so far there have been no scalable techniques capable of combining both model (structural) and parameter uncertainty. In this paper we introduce the concept of model uncertainty in BNNs and hence make inference in the joint space of models and parameters. Moreover, we suggest an adaptation of a scalable variational inference approach with reparametrization of marginal inclusion probabilities to incorporate the model space constraints. Finally, we show that incorporating model uncertainty via Bayesian model averaging and Bayesian model selection makes it possible to drastically sparsify the structure of BNNs without significant loss of predictive power.
-
Hubin, Aliaksandr; Storvik, Geir Olve; Grini, Paul Eivind & Butenko, Melinka Alonso
(2019).
Bayesian binomial regression model with a latent Gaussian field for analysis of epigenetic data.
-
Storvik, Geir Olve & Hubin, Aliaksandr
(2019).
Combining model and parameter uncertainty in Bayesian neural networks.
-
Storvik, Geir Olve
(2019).
Flexible Bayesian Nonlinear Model Configuration.
-
Storvik, Geir Olve
(2019).
Bayesian approaches to neural networks.
-
Storvik, Geir Olve
(2019).
On the use of Bayesian methods in machine learning.
-
Storvik, Geir Olve
(2019).
Education in Statistics and Data Science at the University of Oslo.
-
Hubin, Aliaksandr; Storvik, Geir Olve & Frommlet, Florian
(2018).
Deep Bayesian regression models.
Regression models are used for inference and prediction in a wide range of applications, providing a powerful scientific tool for researchers and analysts from different fields. In many research fields the amount of available data as well as the number of potential explanatory variables is rapidly increasing, and variable selection and model averaging have become extremely important tools for improving inference and prediction. However, linear models are often not sufficient, and the complex relationship between input variables and a response is better described by introducing non-linearities and complex functional interactions. Deep learning models have been extremely successful in terms of prediction, although they are often difficult to specify and potentially suffer from overfitting. The aim of this paper is to bring the ideas of deep learning into a statistical framework which yields more parsimonious models and allows us to quantify model uncertainty. To this end we introduce the class of deep Bayesian regression models (DBRM), consisting of a generalized linear model combined with a comprehensive non-linear feature space, where non-linear features are generated just like in deep learning but paired with variable selection in order to include only important features. DBRM can easily be extended to include latent Gaussian variables to model complex correlation structures between observations, which does not seem easily possible with existing deep learning approaches. Two different algorithms based on MCMC are introduced to fit DBRM and to perform Bayesian inference. The predictive performance of these algorithms is compared with a large number of state-of-the-art algorithms. Furthermore, we illustrate how DBRM can be used for model inference in various applications.
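As a toy illustration of the feature-space idea, here is a hypothetical Python sketch (not the DBRM implementation; a BIC-based greedy forward search stands in for the MCMC-based Bayesian model selection the paper actually uses, and all data are simulated):

```python
# Hypothetical sketch: generate deep-learning-style non-linear features
# (transformed inputs and their pairwise products), then keep only the
# important ones via simple greedy forward selection by BIC.
import numpy as np

rng = np.random.default_rng(3)
n, p = 500, 3
X = rng.standard_normal((n, p))
y = np.sin(X[:, 0]) * X[:, 1] + rng.normal(scale=0.3, size=n)   # true signal

base = {f"{name}(x{j})": f(X[:, j])
        for name, f in {"id": lambda v: v, "sin": np.sin, "tanh": np.tanh}.items()
        for j in range(p)}
features = dict(base)
for a in base:
    for b in base:
        if a < b:
            features[f"{a}*{b}"] = base[a] * base[b]

def bic(cols):
    """BIC of a Gaussian linear model with intercept plus the given features."""
    A = np.column_stack([np.ones(n)] + [features[c] for c in cols])
    resid = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
    return n * np.log(resid @ resid / n) + A.shape[1] * np.log(n)

# Greedy forward selection over the generated feature space.
chosen, current = [], bic([])
while True:
    remaining = [c for c in features if c not in chosen]
    if not remaining:
        break
    best = min(remaining, key=lambda c: bic(chosen + [c]))
    if bic(chosen + [best]) >= current:
        break
    chosen.append(best)
    current = bic(chosen)
print("selected features:", chosen)   # should include the id(x1)*sin(x0) product
```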
-
Storvik, Geir Olve
(2018).
Preliminaries for Deep Neural Networks: Recapture of Linear Algebra, Gradient Descents and Generalized Linear Models.
-
Storvik, Geir Olve
(2018).
Statistical analysis for time series.
-
Storvik, Geir Olve; Vigeland, Magnus Dehli; Caliebe, Amke & Egeland, Thore
(2018).
Specification of mutation probabilities through Metropolis-Hastings steps.
-
Hubin, Aliaksandr; Storvik, Geir Olve & Frommlet, Florian
(2017).
Deep non-linear regression models in a Bayesian framework.
-
Hubin, Aliaksandr; Storvik, Geir Olve & Grini, Paul Eivind
(2017).
Variable selection in binomial regression with latent Gaussian field models for analysis of epigenetic data.
-
Hubin, Aliaksandr; Storvik, Geir Olve & Frommlet, Florian
(2017).
A novel algorithmic approach to Bayesian Logic Regression.
-
Hubin, Aliaksandr; Storvik, Geir Olve & Frommlet, Florian
(2017).
A novel GMJMCMC algorithm for Bayesian Logic Regression models.
-
Hubin, Aliaksandr & Storvik, Geir Olve
(2017).
Efficient mode jumping MCMC for Bayesian variable selection and model averaging in GLMM.
-
Hubin, Aliaksandr; Storvik, Geir Olve & Frommlet, Florian
(2017).
A novel algorithmic approach to Bayesian Logic Regression.
-
Eikeset, Anne Maria; Dunlop, Erin; Heino, Mikko Petteri; Storvik, Geir Olve; Stenseth, Nils Christian & Dieckmann, Ulf
(2017).
Reply to Enberg and Jørgensen: Ecology and evolution both matter for explaining stock dynamics.
Proceedings of the National Academy of Sciences of the United States of America.
ISSN 0027-8424.
114(22),
p. E4322–E4323.
doi: 10.1073/pnas.1703865114.
-
Hubin, Aliaksandr & Storvik, Geir Olve
(2016).
Efficient mode jumping MCMC for Bayesian variable selection in GLM with random effects models.
-
Hubin, Aliaksandr & Storvik, Geir Olve
(2016).
On Mode Jumping in MCMC for Bayesian Variable Selection within GLMM.
-
Hubin, Aliaksandr & Storvik, Geir Olve
(2016).
Variable selection in binomial regression with latent Gaussian field models for analysis of epigenetic data.
-
Hubin, Aliaksandr & Storvik, Geir Olve
(2016).
Variable selection in logistic regression with a latent Gaussian field model, with an application in epigenomics.
-
Vanem, Erik; Glad, Ingrid Kristine & Storvik, Geir Olve
(2016).
Dynamical Linear Models for Condition Monitoring with Multivariate Sensor Data.
-
Hubin, Aliaksandr & Storvik, Geir Olve
(2015).
On model selection in Hidden Markov Models with covariates.
-
Hubin, Aliaksandr & Storvik, Geir Olve
(2015).
Variable selection in binomial regression with latent Gaussian field models for analysis of epigenetic data.
-
Storvik, Geir Olve; Aanes, Sondre & Maisha, Peter Nyangweso
(2015).
Estimation of fish abundance and demography in the Barents Sea.
-
Storvik, Geir Olve
(2015).
Estimation of fish abundance and demography in the Barents Sea.
-
Storvik, Geir Olve & Marques, Reinaldo
(2014).
Estimation of static parameters using particle filters and a block independence approximation.
-
Storvik, Geir Olve
(2013).
Computational tools for analysis of models with latent structures.
-
Storvik, Geir Olve
(2012).
Bayesian approaches to neural networks.
-
Rognebakke, Hanne Therese Wist; Hirst, David; Aldrin, Magne & Storvik, Geir
(2011).
Modelling catch at age for multiple stocks.
-
Villar, Jaime Otero; Antonsson, Thorulfur; Armstrong, John; Arnason, Fridthjofur; Arnekleiv, Jo Vegar; Baglinière, Jean-Luc et al. (26 contributors)
(2010).
Environmental effects on ocean entry of Atlantic salmon (Salmo salar) smolt across its range of distribution.
-
Storvik, Geir Olve & Marques, Reinaldo
(2018).
On Monte Carlo Contributions for Real-time Probabilistic Inference.
Universitetet i Oslo.
-
Storvik, Geir Olve & Hubin, Aliaksandr
(2018).
Bayesian model configuration, selection and averaging in complex regression contexts.
Universitetet i Oslo.
-
Rognebakke, Hanne; Hirst, David; Aanes, Sondre & Storvik, Geir Olve
(2016).
Catch-at-age - Version 4.0: Technical Report.
Norsk Regnesentral.
-
Storvik, Geir Olve; Løland, Anders; Lykkja, Ola Martin & Gjevestad, Jon Glenn Omholt
(2016).
SAVE – tracking vehicle movements for toll object detection using particle filter.
Norsk Regnesentral.
-
Maisha, Peter Nyangweso; Storvik, Geir Olve & Aanes, Sondre
(2015).
A State-Space Model for Abundance Estimation from Bottom Trawl Data with Applications to Norwegian Winter Survey.
Matematisk Institutt, UiO.
We study a hierarchical dynamic state-space model for abundance estimation. A generic data fusion approach is presented for combining computer-simulated posterior samples of catch output data with observed research survey indices using sequential importance sampling. Posterior samples of catch generated by computer software are used as the primary source of input data through which fisheries-dependent information is mediated. Direct total stock abundance estimates are obtained without the need to estimate any intermediate parameters such as catchability and mortality. Numerical results of a simulation study show that our method provides a useful alternative to existing methods. We apply the method to data from the Barents Sea Winter survey for Northeast Arctic cod (Gadus morhua). The results based on our method are comparable to results based on current methods.
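A heavily simplified, hypothetical Python sketch of the sequential importance sampling idea follows; the survival dynamics, catchability q, and noise levels below are invented for illustration and are not taken from the report:

```python
# Hypothetical sketch: propagate particles of stock abundance through a
# survival dynamic and weight them by how well they explain observed
# survey indices (a bootstrap particle filter).
import numpy as np

rng = np.random.default_rng(4)
T, n_part = 20, 5000
q, Z = 0.6, 0.4                        # assumed catchability and total mortality

# Simulate "true" abundance and survey indices.
N_true = np.empty(T)
N_true[0] = 1000.0
for t in range(1, T):
    N_true[t] = N_true[t - 1] * np.exp(-Z) * np.exp(rng.normal(0, 0.05))
idx = q * N_true * np.exp(rng.normal(0, 0.2, size=T))     # observed indices

# Particle filter: propagate, weight by survey likelihood, resample.
part = rng.lognormal(np.log(1000.0), 0.5, size=n_part)
est = np.empty(T)
for t in range(T):
    if t > 0:
        part = part * np.exp(-Z) * np.exp(rng.normal(0, 0.05, size=n_part))
    logw = -0.5 * ((np.log(idx[t]) - np.log(q * part)) / 0.2) ** 2
    w = np.exp(logw - logw.max())
    w /= w.sum()
    est[t] = np.sum(w * part)                     # filtered abundance estimate
    part = rng.choice(part, size=n_part, p=w)     # multinomial resampling

print("relative error:", np.round(np.abs(est - N_true) / N_true, 2))
```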
-
Rognebakke, Hanne Therese Wist; Hirst, David & Storvik, Geir Olve
(2011).
Catch-at-age - Version 2.0: Technical Report.
Norsk Regnesentral.
-
Rognebakke, Hanne; Hirst, David; Storvik, Geir & Aldrin, Magne
(2011).
Catch-at-age for multiple stocks: Modelling Skrei and Coastal Cod simultaneously.
Norsk Regnesentral.