Search Results for author: Kim A. Nicoli

Found 9 papers, 2 papers with code

Flow-based Sampling for Entanglement Entropy and the Machine Learning of Defects

no code implementations • 18 Oct 2024 • Andrea Bulgarelli, Elia Cellini, Karl Jansen, Stefan Kühn, Alessandro Nada, Shinichi Nakajima, Kim A. Nicoli, Marco Panero

We introduce a novel technique to numerically calculate Rényi entanglement entropies in lattice quantum field theory using generative models.

Physics-Informed Bayesian Optimization of Variational Quantum Circuits

1 code implementation • NeurIPS 2023 • Kim A. Nicoli, Christopher J. Anders, Lena Funcke, Tobias Hartung, Karl Jansen, Stefan Kühn, Klaus-Robert Müller, Paolo Stornati, Pan Kessel, Shinichi Nakajima

In this paper, we propose a novel and powerful method to harness Bayesian optimization for Variational Quantum Eigensolvers (VQEs) -- a hybrid quantum-classical protocol used to approximate the ground state of a quantum Hamiltonian.

Tasks: Bayesian Optimization, Inductive Bias
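The paper builds a physics-informed kernel for the VQE energy landscape; as a much simpler illustration of the underlying idea, the following sketch runs plain Gaussian-process Bayesian optimization with an RBF kernel and a lower-confidence-bound acquisition on a toy one-parameter "energy" function. The kernel, the acquisition rule, and the toy landscape are all my assumptions, not the paper's method.

```python
import numpy as np

def rbf(x1, x2, ls=0.5):
    """Squared-exponential kernel between two 1-D point sets."""
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def gp_posterior(X, y, Xs, jitter=1e-6):
    """GP posterior mean and variance at query points Xs."""
    K = rbf(X, X) + jitter * np.eye(len(X))
    Ks = rbf(X, Xs)
    Kinv = np.linalg.inv(K)
    mu = Ks.T @ Kinv @ y
    var = np.diag(rbf(Xs, Xs) - Ks.T @ Kinv @ Ks)
    return mu, np.maximum(var, 1e-12)

def energy(theta):
    # Toy stand-in for a 1-parameter VQE energy landscape (hypothetical).
    return -np.cos(theta) - 0.3 * np.cos(3 * theta)

rng = np.random.default_rng(0)
X = rng.uniform(0, 2 * np.pi, 3)        # initial random evaluations
y = energy(X)
grid = np.linspace(0, 2 * np.pi, 200)   # candidate parameter values

for _ in range(15):
    mu, var = gp_posterior(X, y, grid)
    # Lower confidence bound: evaluate where mean - 2*std is smallest.
    lcb = mu - 2.0 * np.sqrt(var)
    x_next = grid[np.argmin(lcb)]
    X = np.append(X, x_next)
    y = np.append(y, energy(x_next))

best = X[np.argmin(y)]  # best parameter found so far
```

In a real VQE loop, `energy` would be a noisy expectation value measured on quantum hardware, which is exactly why a sample-efficient surrogate such as a GP is attractive.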

Detecting and Mitigating Mode-Collapse for Flow-based Sampling of Lattice Field Theories

no code implementations • 27 Feb 2023 • Kim A. Nicoli, Christopher J. Anders, Tobias Hartung, Karl Jansen, Pan Kessel, Shinichi Nakajima

In this work, we first point out that the tunneling problem is also present for normalizing flows but is shifted from the sampling to the training phase of the algorithm.

Gradients should stay on Path: Better Estimators of the Reverse- and Forward KL Divergence for Normalizing Flows

no code implementations • 17 Jul 2022 • Lorenz Vaitl, Kim A. Nicoli, Shinichi Nakajima, Pan Kessel

We propose an algorithm to estimate the path-gradient of both the reverse and forward Kullback-Leibler divergence for an arbitrary manifestly invertible normalizing flow.

Tasks: Variational Inference
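The paper derives path-gradient estimators for general invertible normalizing flows; the idea can be illustrated on the smallest possible "flow", a 1-D affine map x = a·z + b (my stand-in, not the paper's architecture). The path gradient of the reverse KL pushes the score difference ∇ₓ[log q(x) − log p(x)] back through ∂x/∂θ, and for Gaussian q and p the result can be checked against the analytic KL gradient:

```python
import numpy as np

rng = np.random.default_rng(0)

# Affine "flow": x = a*z + b pushes z ~ N(0,1) forward to q = N(b, a^2).
a, b = 1.5, 0.5          # flow parameters (toy stand-in for a trained flow)
mu, sigma = 0.0, 1.0     # target density p = N(mu, sigma^2)

z = rng.normal(size=100_000)
x = a * z + b

# Path gradient of KL(q || p): score difference at the pushed samples,
# multiplied by the parameter sensitivities dx/da = z and dx/db = 1.
score_diff = -(x - b) / a**2 + (x - mu) / sigma**2
grad_a = np.mean(z * score_diff)
grad_b = np.mean(score_diff)

# Analytic gradients of KL(N(b, a^2) || N(mu, sigma^2)) for comparison.
exact_a = -1.0 / a + a / sigma**2
exact_b = (b - mu) / sigma**2
```

The Monte Carlo estimates match the closed-form gradients; the practical appeal shown in the paper is that this estimator's variance vanishes as q approaches p, unlike the standard reparameterized gradient.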

Path-Gradient Estimators for Continuous Normalizing Flows

1 code implementation • 17 Jun 2022 • Lorenz Vaitl, Kim A. Nicoli, Shinichi Nakajima, Pan Kessel

Recent work has established a path-gradient estimator for simple variational Gaussian distributions and has argued that the path-gradient is particularly beneficial in the regime in which the variational distribution approaches the exact target distribution.

Estimation of Thermodynamic Observables in Lattice Field Theories with Deep Generative Models

no code implementations • 14 Jul 2020 • Kim A. Nicoli, Christopher J. Anders, Lena Funcke, Tobias Hartung, Karl Jansen, Pan Kessel, Shinichi Nakajima, Paolo Stornati

In this work, we demonstrate that applying deep generative machine learning models to lattice field theory is a promising route for solving problems where Markov Chain Monte Carlo (MCMC) methods are problematic.

Tasks: BIG-bench Machine Learning

Asymptotically unbiased estimation of physical observables with neural samplers

no code implementations • 29 Oct 2019 • Kim A. Nicoli, Shinichi Nakajima, Nils Strodthoff, Wojciech Samek, Klaus-Robert Müller, Pan Kessel

We propose a general framework for the estimation of observables with generative neural samplers focusing on modern deep generative neural networks that provide an exact sampling probability.
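The key ingredient here is that the sampler provides an exact sampling probability, which allows self-normalized importance weighting of observables. The sketch below uses a fixed Gaussian as a stand-in for the neural sampler (an assumption for brevity); any generative model with a tractable density q(x) plugs in the same way:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "neural sampler": N(0, 1.5^2) with exact log-density log q(x).
n = 200_000
x = rng.normal(0.0, 1.5, size=n)
log_q = -0.5 * (x / 1.5) ** 2 - np.log(1.5 * np.sqrt(2 * np.pi))

# Unnormalized target Boltzmann weight: p(x) ∝ exp(-(x - 1)^2 / 2), i.e. N(1, 1).
log_p = -0.5 * (x - 1.0) ** 2

# Self-normalized importance weights correct for the sampler's mismatch,
# yielding an asymptotically unbiased observable estimate.
log_w = log_p - log_q
w = np.exp(log_w - log_w.max())      # subtract the max for numerical stability
mean_x = np.sum(w * x) / np.sum(w)   # estimate of <x> under the target
```

For this toy target the true value is ⟨x⟩ = 1, and the weighted estimate recovers it even though the sampler itself is centered at 0; that reweighting is what removes the bias of a mismatched generative sampler.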

Analysis of Atomistic Representations Using Weighted Skip-Connections

no code implementations • 23 Oct 2018 • Kim A. Nicoli, Pan Kessel, Michael Gastegger, Kristof T. Schütt

In this work, we extend the SchNet architecture by using weighted skip connections to assemble the final representation.

Tasks: BIG-bench Machine Learning, Property Prediction
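The mechanics of a weighted skip connection are simple: instead of taking only the last interaction block's output, the final atomistic representation mixes all intermediate layer outputs with learned weights. A minimal sketch with random features (the softmax normalization of the layer weights is my assumption, not necessarily the paper's exact parameterization):

```python
import numpy as np

rng = np.random.default_rng(1)
n_atoms, feat, n_layers = 5, 8, 4

# Hypothetical per-layer atom representations from a SchNet-style
# stack of interaction blocks (random here for illustration).
h = [rng.normal(size=(n_atoms, feat)) for _ in range(n_layers)]

w = rng.normal(size=n_layers)            # learnable per-layer weights
alpha = np.exp(w) / np.exp(w).sum()      # normalize across layers (softmax)

# Weighted skip connection: mix all intermediate representations
# into the final atom-wise representation.
final = sum(a * h_l for a, h_l in zip(alpha, h))
```

Because the weights are learned, the network can decide per task how much early (more local) versus late (more context-aware) features should contribute to the final representation.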
