1 code implementation • 5 May 2023 • Lars Skaaret-Lund, Geir Storvik, Aliaksandr Hubin
In this paper, we consider two extensions of the LBBNN method. First, by using the local reparametrization trick (LRT) to sample the hidden units directly, we obtain a more computationally efficient algorithm.
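The local reparametrization trick mentioned above can be illustrated with a minimal sketch: for a layer with independent Gaussian weights, the pre-activations are themselves Gaussian, so one can sample them directly instead of sampling the full weight matrix. The dimensions and variational parameters below are arbitrary placeholders, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer sizes and mean-field Gaussian variational parameters
# over the weights (assumptions for illustration only).
n, d_in, d_out = 4, 3, 2
x = rng.normal(size=(n, d_in))
w_mu = rng.normal(size=(d_in, d_out))      # variational means
w_sigma = 0.1 * np.ones((d_in, d_out))     # variational standard deviations

# Local reparametrization trick: for h = x @ W with independent Gaussian
# weights, each pre-activation is Gaussian with
#   mean = x @ w_mu   and   var = (x**2) @ (w_sigma**2),
# so we sample the activations directly rather than the weight matrix.
act_mu = x @ w_mu
act_var = (x ** 2) @ (w_sigma ** 2)
h = act_mu + np.sqrt(act_var) * rng.normal(size=act_mu.shape)
```

Sampling one noise draw per activation rather than per weight both reduces the variance of stochastic gradients and avoids materializing a sampled weight matrix per data point.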
1 code implementation • 1 May 2023 • Aliaksandr Hubin, Geir Storvik
Bayesian neural networks (BNNs) have recently regained a significant amount of attention in the deep learning community due to the development of scalable approximate Bayesian inference techniques.
1 code implementation • 19 Jan 2022 • Geir Storvik, Alfonso Diz-Louis Palomares, Solveig Engebretsen, Gunnar Øyvind Isaksson Rø, Kenth Engø-Monsen, Aja Bråthen Kristoffersen, Birgitte Freiesleben de Blasio, Arnoldo Frigessi
During its first months, the Covid-19 pandemic required most countries to implement complex sequences of non-pharmaceutical interventions aimed at controlling the transmission of the virus in the population.
no code implementations • 11 Oct 2021 • Aliaksandr Hubin, Florian Frommlet, Geir Storvik
In this paper, we introduce a reversible version of a genetically modified mode jumping Markov chain Monte Carlo algorithm (GMJMCMC) for inference on posterior model probabilities in complex model spaces, where the number of explanatory variables is prohibitively large for classical Markov chain Monte Carlo methods.
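The mode-jumping idea behind the algorithm can be sketched in a toy setting: a Metropolis-Hastings sampler over binary variable-inclusion vectors that mixes frequent local single-flip moves with occasional large multi-flip jumps. The scoring function below is a stand-in for a model's marginal likelihood, and this is not the GMJMCMC algorithm itself, only the mode-jumping proposal pattern.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model space: binary inclusion vectors over p candidate covariates.
p = 10
target = np.array([1, 0, 1, 0, 1, 0, 1, 0, 1, 0])

def log_score(gamma):
    # Placeholder score: reward agreement with a fixed "true" model and
    # penalize model size (an assumption, not a real marginal likelihood).
    return 3.0 * np.sum(gamma == target) - 0.5 * np.sum(gamma)

gamma = np.zeros(p, dtype=int)
best = gamma.copy()
for it in range(2000):
    prop = gamma.copy()
    if rng.random() < 0.1:
        # Mode jump: flip several coordinates at once (symmetric proposal).
        idx = rng.choice(p, size=4, replace=False)
    else:
        # Local move: flip a single coordinate.
        idx = rng.choice(p, size=1)
    prop[idx] = 1 - prop[idx]
    # Metropolis-Hastings acceptance; the ratio simplifies because both
    # proposal kernels are symmetric.
    if np.log(rng.random()) < log_score(prop) - log_score(gamma):
        gamma = prop
    if log_score(gamma) > log_score(best):
        best = gamma.copy()
```

The large jumps are what let the chain escape local modes that single-flip moves would take exponentially long to leave; the reversible construction in the paper makes such jumps valid proposals with the correct invariant distribution.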
no code implementations • 1 May 2020 • Aliaksandr Hubin, Geir Storvik, Florian Frommlet
In this rejoinder we summarize the comments, questions and remarks on the paper "A novel algorithmic approach to Bayesian Logic Regression" from the discussants.
1 code implementation • 5 Mar 2020 • Aliaksandr Hubin, Geir Storvik, Florian Frommlet
In this paper, we introduce a flexible approach for constructing and selecting highly nonlinear parametric regression models.
1 code implementation • 18 Mar 2019 • Aliaksandr Hubin, Geir Storvik
Bayesian neural networks (BNNs) have recently regained a significant amount of attention in the deep learning community due to the development of scalable approximate Bayesian inference techniques.
2 code implementations • 6 Jun 2018 • Aliaksandr Hubin, Geir Storvik, Florian Frommlet
DBRM can easily be extended to include latent Gaussian variables to model complex correlation structures between observations, which does not seem easily achievable with existing deep learning approaches.
Methodology 62-02, 62-09, 62F07, 62F15, 62J12, 62J05, 62J99, 62M05, 05A16, 60J22, 92D20, 90C27, 90C59 G.1.2; G.1.6; G.2.1; G.3; I.2.0; I.2.6; I.2.8; I.5.1; I.6; I.6.4
1 code implementation • 22 May 2017 • Aliaksandr Hubin, Geir Storvik, Florian Frommlet
Logic regression was developed more than a decade ago as a tool to construct predictors from Boolean combinations of binary covariates.
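The core idea of logic regression, building predictors from Boolean combinations of binary covariates, can be sketched as follows. This toy example enumerates a small pool of depth-two logic expressions and picks the best-scoring one by raw agreement with the response; the data, the 0.9 noise level, and the exhaustive search are illustrative assumptions (real logic regression searches the space of logic trees stochastically, e.g. by simulated annealing or, in the paper's approach, MCMC).

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated binary design matrix (placeholder data, not from the paper).
n, p = 200, 4
X = rng.integers(0, 2, size=(n, p))
# Ground truth used only to simulate y: (x0 AND x1) OR (NOT x2), with noise.
logic_true = (X[:, 0] & X[:, 1]) | (1 - X[:, 2])
y = np.where(rng.random(n) < 0.9, logic_true, 1 - logic_true)

def candidates(X):
    # Enumerate all depth-two Boolean predictors built from two covariates,
    # allowing negation of either input: a&b and a|b for a in {xi, !xi}, etc.
    n, p = X.shape
    for i in range(p):
        for j in range(i + 1, p):
            for a in (X[:, i], 1 - X[:, i]):
                for b in (X[:, j], 1 - X[:, j]):
                    yield a & b
                    yield a | b

# Pick the single logic predictor that best matches the response.
best_feat = max(candidates(X), key=lambda f: np.mean(f == y))
```

In a full logic regression model, several such logic trees enter a (generalized) linear predictor and are grown, pruned, and swapped during the search rather than enumerated exhaustively.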
1 code implementation • 21 Apr 2016 • Aliaksandr Hubin, Geir Storvik
An increasing number of data sources are becoming available, introducing a variety of candidate explanatory variables for such models.