Search Results for author: Mikkel N. Schmidt

Found 19 papers, 7 papers with code

Coherent energy and force uncertainty in deep learning force fields

no code implementations • 7 Dec 2023 • Peter Bjørn Jørgensen, Jonas Busk, Ole Winther, Mikkel N. Schmidt

In machine learning energy potentials for atomic systems, forces are commonly obtained as the negative derivative of the energy function with respect to atomic positions.
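As a minimal sketch of that relationship, $F = -\nabla_x E(x)$, the snippet below computes forces from a toy pairwise energy with PyTorch autograd; the energy function is a hypothetical stand-in for a learned potential, not the model from the paper.

import torch

def energy(positions):
    # Toy pairwise energy (hypothetical stand-in for a learned potential);
    # positions has shape (n_atoms, 3).
    dists = torch.cdist(positions, positions)
    off_diag = ~torch.eye(len(positions), dtype=torch.bool)
    return (1.0 / dists[off_diag]).sum()

positions = torch.randn(4, 3, requires_grad=True)
E = energy(positions)
# Forces are the negative derivative of the energy w.r.t. atomic positions.
forces = -torch.autograd.grad(E, positions)[0]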

Multi-view self-supervised learning for multivariate variable-channel time series

1 code implementation • 13 Jul 2023 • Thea Brüsch, Mikkel N. Schmidt, Tommy S. Alstrøm

However, for multivariate time series data, the set of input channels often varies between applications, and most existing work does not allow for transfer between datasets with different sets of input channels.

Contrastive Learning, EEG, +2

Synthetic data shuffling accelerates the convergence of federated learning under data heterogeneity

1 code implementation • 23 Jun 2023 • Bo Li, Yasin Esfandiari, Mikkel N. Schmidt, Tommy S. Alstrøm, Sebastian U. Stich

In this paper, we establish a precise and quantifiable correspondence between data heterogeneity and parameters in the convergence rate when a fraction of data is shuffled across clients.

Federated Learning
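A minimal sketch of the shuffling operation, assuming each client holds a list of samples; the pool-and-redistribute logic below is illustrative, not the paper's implementation.

import random

def shuffle_fraction(client_data, fraction, seed=0):
    # Move `fraction` of each client's samples into a shared pool, then
    # redistribute the pool round-robin across clients (illustrative only).
    rng = random.Random(seed)
    pool = []
    for data in client_data:
        rng.shuffle(data)
        k = int(len(data) * fraction)
        pool.extend(data[:k])
        del data[:k]
    rng.shuffle(pool)
    for i, sample in enumerate(pool):
        client_data[i % len(client_data)].append(sample)
    return client_data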

Graph Neural Network Interatomic Potential Ensembles with Calibrated Aleatoric and Epistemic Uncertainty on Energy and Forces

no code implementations • 10 May 2023 • Jonas Busk, Mikkel N. Schmidt, Ole Winther, Tejs Vegge, Peter Bjørn Jørgensen

The proposed method considers both epistemic and aleatoric uncertainty, and the total uncertainties are recalibrated post hoc using a nonlinear scaling function to achieve good calibration on previously unseen data without loss of predictive accuracy.
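A minimal sketch of post-hoc variance recalibration, assuming a power-law scaling $\sigma_{\text{cal}}^2 = a\,\sigma^{2b}$ fitted on held-out errors; the scaling form is an assumption, and the paper's actual recalibration function may differ.

import numpy as np
from scipy.optimize import minimize

def fit_recalibration(sigma2, errors):
    # Fit sigma2_cal = a * sigma2**b by minimizing the Gaussian negative
    # log-likelihood of held-out errors (power-law form is an assumption).
    def nll(log_params):
        a, b = np.exp(log_params)  # keep a, b positive
        s2 = a * sigma2 ** b
        return 0.5 * np.sum(np.log(s2) + errors ** 2 / s2)
    a, b = np.exp(minimize(nll, x0=np.zeros(2)).x)
    return lambda s2: a * s2 ** b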

On the effectiveness of partial variance reduction in federated learning with heterogeneous data

2 code implementations • CVPR 2023 • Bo Li, Mikkel N. Schmidt, Tommy S. Alstrøm, Sebastian U. Stich

In this paper, we first revisit the widely used FedAvg algorithm in a deep neural network to understand how data heterogeneity influences the gradient updates across the neural network layers.

Federated Learning
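For reference, the FedAvg aggregation step the paper starts from: a size-weighted average of client parameters. A schematic sketch, with each model's parameters represented as a list of NumPy arrays.

import numpy as np

def fedavg(client_params, client_sizes):
    # Size-weighted average of per-client model parameters.
    total = sum(client_sizes)
    avg = [np.zeros_like(p) for p in client_params[0]]
    for params, size in zip(client_params, client_sizes):
        for a, p in zip(avg, params):
            a += (size / total) * p
    return avg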

Raman Spectrum Matching with Contrastive Representation Learning

no code implementations • 25 Feb 2022 • Bo Li, Mikkel N. Schmidt, Tommy S. Alstrøm

We propose a new machine learning technique for Raman spectrum matching, based on contrastive representation learning, that requires no preprocessing and works with as little as a single reference spectrum from each class.

BIG-bench Machine Learning, Conformal Prediction, +1
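Once the contrastive encoder is trained, matching reduces to a nearest-reference lookup in embedding space; a minimal sketch, assuming precomputed embeddings and cosine similarity (the similarity choice is an assumption).

import numpy as np

def match_spectrum(query_emb, reference_embs, labels):
    # Assign the query spectrum to the class of its most similar reference
    # embedding; embeddings come from the trained encoder (assumed here).
    q = query_emb / np.linalg.norm(query_emb)
    refs = reference_embs / np.linalg.norm(reference_embs, axis=1, keepdims=True)
    return labels[int(np.argmax(refs @ q))]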

Calibrated Uncertainty for Molecular Property Prediction using Ensembles of Message Passing Neural Networks

no code implementations • 13 Jul 2021 • Jonas Busk, Peter Bjørn Jørgensen, Arghya Bhowmik, Mikkel N. Schmidt, Ole Winther, Tejs Vegge

In this work we extend a message passing neural network designed specifically for predicting properties of molecules and materials with a calibrated probabilistic predictive distribution.

BIG-bench Machine Learning, Decision Making, +2
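A minimal sketch of the standard ensemble decomposition of predictive uncertainty, with aleatoric uncertainty as the mean of the per-model predicted variances and epistemic uncertainty as the variance of the per-model means; whether this matches the paper's exact construction is an assumption.

import numpy as np

def ensemble_predictive(means, variances):
    # means, variances: shape (n_models, n_samples), one Gaussian per model.
    mu = means.mean(axis=0)             # ensemble mean
    aleatoric = variances.mean(axis=0)  # average predicted noise
    epistemic = means.var(axis=0)       # disagreement between models
    return mu, aleatoric + epistemic    # total predictive variance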

Materials property prediction using symmetry-labeled graphs as atomic-position independent descriptors

2 code implementations • 15 May 2019 • Peter Bjørn Jørgensen, Estefanía Garijo del Río, Mikkel N. Schmidt, Karsten Wedel Jacobsen

The possibilities for prediction in a realistic computational screening setting are investigated on a dataset of 5976 ABSe$_3$ selenides with very limited overlap with the OQMD training set.

BIG-bench Machine Learning, Formation Energy, +3

Probabilistic PARAFAC2

1 code implementation • 21 Jun 2018 • Philip J. H. Jørgensen, Søren F. V. Nielsen, Jesper L. Hinrich, Mikkel N. Schmidt, Kristoffer H. Madsen, Morten Mørup

PARAFAC2 is a multimodal factor analysis model suitable for analyzing multi-way data when one of the modes has incomparable observation units, for example because of differences in signal sampling or batch sizes.
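For reference, the underlying (non-probabilistic) PARAFAC2 model factorizes each slice $X_k$ of the multi-way array as

$$ X_k \approx F_k D_k A^\top, \qquad F_k^\top F_k = \Phi \ \text{for all } k, $$

where $D_k$ is diagonal and the constant cross-product $\Phi$ lets the $F_k$ differ in size across slices while keeping the factorization identifiable.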

Neural Message Passing with Edge Updates for Predicting Properties of Molecules and Materials

5 code implementations • 8 Jun 2018 • Peter Bjørn Jørgensen, Karsten Wedel Jacobsen, Mikkel N. Schmidt

Neural message passing on molecular graphs is one of the most promising methods for predicting formation energy and other properties of molecules and materials.

Drug Discovery, Formation Energy
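A minimal sketch of one message passing round with edge updates, the mechanism this paper adds; the linear maps and tanh nonlinearities below are hypothetical stand-ins for the learned update functions.

import numpy as np

def message_passing_step(h, e, edges, W_msg, W_node, W_edge):
    # h: (n_nodes, d) node states; e: (n_edges, d) edge states;
    # edges: list of (sender, receiver) index pairs.
    # W_msg: (d, 2d), W_node: (d, 2d), W_edge: (d, 3d) weight matrices.
    messages = np.zeros_like(h)
    for k, (s, r) in enumerate(edges):
        messages[r] += np.tanh(W_msg @ np.concatenate([h[s], e[k]]))
    h = np.tanh(np.concatenate([h, messages], axis=1) @ W_node.T)
    e = np.array([np.tanh(W_edge @ np.concatenate([h[s], h[r], e[k]]))
                  for k, (s, r) in enumerate(edges)])
    return h, e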

Scalable Group Level Probabilistic Sparse Factor Analysis

no code implementations • 14 Dec 2016 • Jesper L. Hinrich, Søren F. V. Nielsen, Nicolai A. B. Riis, Casper T. Eriksen, Jacob Frøsig, Marco D. F. Kristensen, Mikkel N. Schmidt, Kristoffer H. Madsen, Morten Mørup

Many data-driven approaches exist to extract neural representations of functional magnetic resonance imaging (fMRI) data, but most of them lack a proper probabilistic formulation.

Experimental Design

Completely random measures for modelling block-structured sparse networks

no code implementations • NeurIPS 2016 • Tue Herlau, Mikkel N. Schmidt, Morten Mørup

Statistical methods for network data often parameterize the edge-probability by attributing latent traits such as block structure to the vertices and assume exchangeability in the sense of the Aldous-Hoover representation theorem.
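In the block-structured case the latent trait is a cluster assignment $z_i$, and the edge probability takes the familiar stochastic block model form

$$ P(A_{ij} = 1 \mid z, \eta) = \eta_{z_i z_j}, $$

so two vertices in the same pair of blocks share the same connection probability.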

Nonparametric Modeling of Dynamic Functional Connectivity in fMRI Data

1 code implementation • 4 Jan 2016 • Søren F. V. Nielsen, Kristoffer H. Madsen, Rasmus Røge, Mikkel N. Schmidt, Morten Mørup

We further investigate what drives the dynamic states by applying the model to the entire dataset collated across subjects and task/rest conditions.

Clustering, EEG

Bayesian Dropout

no code implementations • 12 Aug 2015 • Tue Herlau, Morten Mørup, Mikkel N. Schmidt

Dropout has recently emerged as a powerful and simple method for training neural networks, preventing co-adaptation by stochastically omitting neurons.

regression
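For reference, the standard (non-Bayesian) dropout operation that the paper reinterprets; a minimal sketch using inverted dropout, which rescales at training time so the expected activation is unchanged.

import numpy as np

def dropout(activations, p, rng, train=True):
    # Each unit is omitted independently with probability p during training.
    if not train:
        return activations
    mask = rng.random(activations.shape) >= p
    return activations * mask / (1.0 - p)

# Example: rng = np.random.default_rng(0); dropout(np.ones((2, 4)), 0.5, rng)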

Completely random measures for modelling block-structured networks

no code implementations • 10 Jul 2015 • Tue Herlau, Mikkel N. Schmidt, Morten Mørup

Recently, Caron and Fox (2014) proposed the use of a different notion of exchangeability due to Kallenberg (2009) and obtained a network model which admits power-law behaviour while retaining desirable statistical properties; however, this model does not capture latent vertex traits such as block structure.

Adaptive Reconfiguration Moves for Dirichlet Mixtures

no code implementations • 31 May 2014 • Tue Herlau, Morten Mørup, Yee Whye Teh, Mikkel N. Schmidt

Bayesian mixture models are widely applied for unsupervised learning and exploratory data analysis.

Non-parametric Bayesian modeling of complex networks

no code implementations • 20 Dec 2013 • Mikkel N. Schmidt, Morten Mørup

Modeling structure in complex networks using Bayesian non-parametrics makes it possible to specify flexible model structures and infer the adequate model complexity from the observed data.

The Infinite Degree Corrected Stochastic Block Model

no code implementations • 11 Nov 2013 • Tue Herlau, Mikkel N. Schmidt, Morten Mørup

On synthetic data we demonstrate that including the degree correction yields better performance both on recovering the true group structure and predicting missing links when degree heterogeneity is present, whereas performance is on par for data with no degree heterogeneity within clusters.

Stochastic Block Model
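Schematically, degree correction augments the block model with per-vertex rate parameters $\theta_i$, as in Karrer and Newman's formulation (the paper's exact parameterization may differ):

$$ \lambda_{ij} = \theta_i \, \theta_j \, \eta_{z_i z_j}, $$

so vertices within the same block can have heterogeneous expected degrees.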

Nonparametric Bayesian models of hierarchical structure in complex networks

no code implementations • 5 Nov 2013 • Mikkel N. Schmidt, Tue Herlau, Morten Mørup

Analyzing and understanding the structure of complex relational data is important in many applications including analysis of the connectivity in the human brain.
