Search Results for author: Satoshi Hayakawa

Found 9 papers, 6 papers with code

A Quadrature Approach for General-Purpose Batch Bayesian Optimization via Probabilistic Lifting

1 code implementation • 18 Apr 2024 • Masaki Adachi, Satoshi Hayakawa, Martin Jørgensen, Saad Hamid, Harald Oberhauser, Michael A. Osborne

Parallelisation in Bayesian optimisation is a common strategy but faces several challenges: the need for flexibility in acquisition functions and kernel choices, flexibility in handling discrete and continuous variables simultaneously, model misspecification, and, lastly, fast massive parallelisation.

Policy Gradient with Kernel Quadrature

no code implementations • 23 Oct 2023 • Satoshi Hayakawa, Tetsuro Morimura

Reward evaluation of episodes becomes a bottleneck in a broad range of reinforcement learning tasks.

Causal Discovery

Quantum Ridgelet Transform: Winning Lottery Ticket of Neural Networks with Quantum Computation

no code implementations • 27 Jan 2023 • Hayata Yamasaki, Sathyawageeswar Subramanian, Satoshi Hayakawa, Sho Sonoda

To address this problem, we develop a quantum ridgelet transform (QRT), which implements the ridgelet transform of a quantum state within a linear runtime $O(D)$ of quantum computation.

Quantum Machine Learning

SOBER: Highly Parallel Bayesian Optimization and Bayesian Quadrature over Discrete and Mixed Spaces

1 code implementation • 27 Jan 2023 • Masaki Adachi, Satoshi Hayakawa, Saad Hamid, Martin Jørgensen, Harald Oberhauser, Michael A. Osborne

Batch Bayesian optimisation and Bayesian quadrature have been shown to be sample-efficient methods of performing optimisation and quadrature where expensive-to-evaluate objective functions can be queried in parallel.

Drug Discovery

Sampling-based Nyström Approximation and Kernel Quadrature

1 code implementation • 23 Jan 2023 • Satoshi Hayakawa, Harald Oberhauser, Terry Lyons

We analyze the Nyström approximation of a positive definite kernel associated with a probability measure (a generic sketch of the Nyström method appears below).

Learning Theory
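
As context for this entry, here is a minimal NumPy sketch of the generic Nyström method: approximating a large kernel matrix from a subsample of its columns. The landmark choice here is plain uniform sampling, not the sampling schemes analyzed in the paper, and all names (functions, parameters) are hypothetical illustration.

```python
# Minimal Nyström sketch: approximate an n x n Gaussian kernel matrix
# from m uniformly sampled "landmark" columns. Generic method only;
# not the sampling schemes studied in the paper.
import numpy as np

def gaussian_kernel(X, Y, lengthscale=1.0):
    # Pairwise squared distances -> Gaussian (RBF) kernel matrix.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * lengthscale ** 2))

rng = np.random.default_rng(0)
n, m = 500, 30                               # data size, number of landmarks
X = rng.normal(size=(n, 2))

idx = rng.choice(n, size=m, replace=False)   # uniform landmark sampling
K_nm = gaussian_kernel(X, X[idx])            # n x m cross-kernel block
K_mm = gaussian_kernel(X[idx], X[idx])       # m x m landmark kernel block

# Nyström approximation: K ≈ K_nm @ pinv(K_mm) @ K_nm.T
K_approx = K_nm @ np.linalg.pinv(K_mm) @ K_nm.T

K_full = gaussian_kernel(X, X)
rel_err = np.linalg.norm(K_full - K_approx) / np.linalg.norm(K_full)
print(f"relative Frobenius error: {rel_err:.3f}")
```

The low-rank factorization costs O(nm² + m³) instead of the O(n³) needed to work with the full matrix, which is what makes landmark selection (the subject of the paper's analysis) matter.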

Fast Bayesian Inference with Batch Bayesian Quadrature via Kernel Recombination

2 code implementations • 9 Jun 2022 • Masaki Adachi, Satoshi Hayakawa, Martin Jørgensen, Harald Oberhauser, Michael A. Osborne

Empirically, we find that our approach significantly outperforms the sampling efficiency of both state-of-the-art BQ techniques and Nested Sampling in various real-world datasets, including lithium-ion battery analytics.

Bayesian Inference • Numerical Integration

On the minimax optimality and superiority of deep neural network learning over sparse parameter spaces

no code implementations • 22 May 2019 • Satoshi Hayakawa, Taiji Suzuki

Whereas existing theoretical studies of deep learning have been based mainly on mathematical theories of well-known function classes such as Hölder and Besov classes, we focus on function classes with discontinuity and sparsity, which are naturally assumed in practice.
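
For context on what the "well-known" classes look like, here is a standard textbook definition of the Hölder class on $[0,1]^d$ (the notation is not taken from the paper): for smoothness $\beta > 0$ and radius $L > 0$, with $m = \lceil \beta \rceil - 1$,

```latex
% Standard Hölder class definition (textbook material, for context only).
\mathcal{H}^{\beta}(L) = \biggl\{ f : [0,1]^d \to \mathbb{R} \;\bigg|\;
  \max_{|\alpha| \le m} \|\partial^{\alpha} f\|_{\infty} \le L,\;
  \max_{|\alpha| = m} \sup_{x \neq y}
    \frac{|\partial^{\alpha} f(x) - \partial^{\alpha} f(y)|}{\|x - y\|^{\beta - m}}
    \le L \biggr\}
```

Functions with jumps lie outside every such class (Hölder functions are continuous), which is why the discontinuous and sparse function classes studied in the paper call for a separate analysis.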
