Search Results for author: Kohei Hayashi

Found 29 papers, 8 papers with code

Weighted Likelihood Policy Search with Model Selection

no code implementations NeurIPS 2012 Tsuyoshi Ueno, Kohei Hayashi, Takashi Washio, Yoshinobu Kawahara

Reinforcement learning (RL) methods based on direct policy search (DPS) have been actively studied as an efficient approach to complicated Markov decision processes (MDPs).

Model Selection reinforcement-learning +1

Rebuilding Factorized Information Criterion: Asymptotically Accurate Marginal Likelihood

no code implementations22 Apr 2015 Kohei Hayashi, Shin-ichi Maeda, Ryohei Fujimaki

Our analysis provides a formal justification of FIC as a model selection criterion for LVMs and also a systematic procedure for pruning redundant latent variables that have been removed heuristically in previous studies.

Model Selection

Bayesian Masking: Sparse Bayesian Estimation with Weaker Shrinkage Bias

no code implementations3 Sep 2015 Yohei Kondo, Kohei Hayashi, Shin-ichi Maeda

A common strategy for sparse linear regression is to introduce regularization, which eliminates irrelevant features by driving the corresponding weights to zero.

Bayesian Inference feature selection
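
For reference, the regularization strategy described above can be made concrete with a minimal scikit-learn sketch of L1-regularized (Lasso) regression: the penalty zeroes out irrelevant weights, but it also shrinks the relevant ones, which is exactly the shrinkage bias the paper aims to weaken. This is the baseline behaviour, not the paper's Bayesian masking method.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, d = 100, 20
X = rng.normal(size=(n, d))
true_w = np.zeros(d)
true_w[:3] = [2.0, -1.5, 1.0]          # only the first 3 features are relevant
y = X @ true_w + 0.1 * rng.normal(size=n)

# L1 regularization drives irrelevant weights exactly to zero,
# but it also biases the relevant weights toward zero.
model = Lasso(alpha=0.1).fit(X, y)
print("selected features:", np.flatnonzero(model.coef_))
print("estimated weights:", model.coef_[:3])   # visibly shrunk vs. [2.0, -1.5, 1.0]
```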

Making Tree Ensembles Interpretable

no code implementations17 Jun 2016 Satoshi Hara, Kohei Hayashi

Tree ensembles, such as random forests and boosted trees, are renowned for their high prediction performance, but their interpretability is critically limited.

Making Tree Ensembles Interpretable: A Bayesian Model Selection Approach

1 code implementation29 Jun 2016 Satoshi Hara, Kohei Hayashi

In this study, we present a method to make a complex tree ensemble interpretable by simplifying the model.

Model Selection
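
The paper chooses a simplified model by Bayesian model selection; purely as an illustration of the "simplify the ensemble" idea, the sketch below distills a random forest into one shallow surrogate tree fitted to the forest's own predictions. The surrogate approach is a common stand-in for interpretability, not the authors' algorithm.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = make_classification(n_samples=2000, n_features=8, random_state=0)
forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# Fit a small, readable surrogate to the ensemble's own predictions.
surrogate = DecisionTreeClassifier(max_depth=3, random_state=0)
surrogate.fit(X, forest.predict(X))

fidelity = (surrogate.predict(X) == forest.predict(X)).mean()
print(f"surrogate/ensemble agreement: {fidelity:.2%}")
print(export_text(surrogate))   # the simplified model is a handful of rules
```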

Minimizing Quadratic Functions in Constant Time

1 code implementation NeurIPS 2016 Kohei Hayashi, Yuichi Yoshida

A sampling-based optimization method for quadratic functions is proposed.
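
A hedged sketch of the sampling idea as I read it: normalize the quadratic so that its minimum value is scale-free, solve the problem induced on a constant-size random subset of coordinates, and use that small minimum as the estimate. The particular normalization below (quadratic term divided by n², linear terms by n) and the graphon-like test instance are assumptions for illustration, not the paper's exact statement.

```python
import numpy as np

rng = np.random.default_rng(0)

def min_quadratic(A, d, b):
    """Exact minimum of f(v) = v@A@v/n**2 + (d*v)@v/n + b@v/n (d > 0 => strongly convex)."""
    n = len(b)
    v = np.linalg.solve(2.0 * A / n + 2.0 * np.diag(d), -b)
    return v @ A @ v / n**2 + (d * v) @ v / n + b @ v / n

# A dense, graphon-like instance, so a random principal submatrix is representative.
n = 2000
x = rng.uniform(size=n)
A = np.minimum.outer(x, x)
d = np.ones(n)
b = np.sin(2 * np.pi * x)

k = 100                                   # constant-size sample, independent of n
S = rng.choice(n, size=k, replace=False)
approx = min_quadratic(A[np.ix_(S, S)], d[S], b[S])
print(f"sampled (k={k}): {approx:.4f}   full (n={n}): {min_quadratic(A, d, b):.4f}")
```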

On Tensor Train Rank Minimization: Statistical Efficiency and Scalable Algorithm

no code implementations1 Aug 2017 Masaaki Imaizumi, Takanori Maehara, Kohei Hayashi

Tensor train (TT) decomposition provides a space-efficient representation for higher-order tensors.
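
For concreteness, here is a minimal NumPy sketch of the standard TT-SVD procedure, which builds such a representation with a sweep of reshapes and truncated SVDs. It illustrates TT decomposition in general, not this paper's rank-minimization algorithm.

```python
import numpy as np

def tt_svd(X, max_rank):
    """TT-SVD sketch: factor a d-way tensor into a chain of 3-way cores."""
    dims = X.shape
    cores, r_prev = [], 1
    M = X.reshape(dims[0], -1)
    for k in range(len(dims) - 1):
        U, s, Vt = np.linalg.svd(M, full_matrices=False)
        r = min(max_rank, len(s))
        cores.append(U[:, :r].reshape(r_prev, dims[k], r))
        M = (s[:r, None] * Vt[:r]).reshape(r * dims[k + 1], -1)
        r_prev = r
    cores.append(M.reshape(r_prev, dims[-1], 1))
    return cores

def contract(cores):
    """Rebuild the full tensor by contracting the chain of cores."""
    Y = cores[0]
    for G in cores[1:]:
        Y = np.tensordot(Y, G, axes=([Y.ndim - 1], [0]))
    return Y.reshape([G.shape[1] for G in cores])

# A tensor with planted TT-rank 3 is recovered essentially exactly.
rng = np.random.default_rng(0)
true_cores = [rng.normal(size=s) for s in [(1, 4, 3), (3, 5, 3), (3, 6, 3), (3, 7, 1)]]
X = contract(true_cores)
Y = contract(tt_svd(X, max_rank=3))
print("relative error:", np.linalg.norm(Y - X) / np.linalg.norm(X))
```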

Tensor Decomposition with Smoothness

no code implementations ICML 2017 Masaaki Imaizumi, Kohei Hayashi

Real data tensors are usually high dimensional but their intrinsic information is preserved in low-dimensional space, which motivates the use of tensor decompositions such as the Tucker decomposition.

Tensor Decomposition
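
As background for the Tucker decomposition mentioned above, a minimal truncated higher-order SVD (HOSVD) sketch in NumPy; the smoothness-regularized decomposition that is the paper's actual contribution is not implemented here.

```python
import numpy as np

def hosvd(X, ranks):
    """Truncated HOSVD: a standard way to compute a Tucker decomposition."""
    factors = []
    for mode, r in enumerate(ranks):
        # Unfold X along `mode` and keep the top-r left singular vectors.
        unfolding = np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)
        factors.append(np.linalg.svd(unfolding, full_matrices=False)[0][:, :r])
    core = X
    for U in factors:
        # Contracting axis 0 each time cycles the modes, so the order works out.
        core = np.tensordot(core, U, axes=([0], [0]))
    return core, factors

X = np.random.default_rng(1).normal(size=(10, 12, 14))
core, factors = hosvd(X, ranks=(3, 4, 5))
Y = core
for U in factors:
    Y = np.tensordot(Y, U, axes=([0], [1]))   # reconstruct: core x_k U_k
print("relative residual:", np.linalg.norm(Y - X) / np.linalg.norm(X))
```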

Fitting Low-Rank Tensors in Constant Time

1 code implementation NeurIPS 2017 Kohei Hayashi, Yuichi Yoshida

We show that the residual error of the Tucker decomposition of a constant-size randomly subsampled tensor $\tilde{X}$ is sufficiently close to that of the original tensor $X$ with high probability.

Tensor Decomposition
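
A rough numerical illustration of the claim, under my reading of the setup: draw a constant number of indices per mode, form the mini-tensor, and compare its relative Tucker (truncated-HOSVD) residual with that of the full tensor. The instance and the sample size are illustrative assumptions.

```python
import numpy as np

def tucker_residual(X, ranks):
    """Relative residual of a truncated-HOSVD Tucker fit."""
    Y = X
    for mode, r in enumerate(ranks):
        M = np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)
        U = np.linalg.svd(M, full_matrices=False)[0][:, :r]
        Y = np.tensordot(Y, U @ U.T, axes=([0], [1]))   # project one mode per pass
    return np.linalg.norm(Y - X) / np.linalg.norm(X)

rng = np.random.default_rng(0)
n, k, ranks = 120, 30, (5, 5, 5)
# Low-rank signal plus noise, so both residuals estimate the same quantity.
U = [rng.normal(size=(n, 5)) for _ in range(3)]
X = np.einsum('ia,jb,kc,abc->ijk', *U, rng.normal(size=(5, 5, 5)))
X += 0.1 * rng.normal(size=X.shape)

idx = [rng.choice(n, size=k, replace=False) for _ in range(3)]
print("full tensor residual:", tucker_residual(X, ranks))
print("mini-tensor residual:", tucker_residual(X[np.ix_(*idx)], ranks))
```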

Why does PairDiff work? - A Mathematical Analysis of Bilinear Relational Compositional Operators for Analogy Detection

no code implementations COLING 2018 Huda Hakami, Kohei Hayashi, Danushka Bollegala

We show that, if the word embeddings are standardised and uncorrelated, such an operator will be independent of bilinear terms and can be simplified to a linear form, with PairDiff as a special case.

Information Retrieval Knowledge Base Completion +2
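
PairDiff represents the relation of a word pair (a, b) by the vector difference b - a, and analogy detection then compares relation vectors, e.g. by cosine similarity. A toy NumPy sketch with a planted relational direction; the embeddings are fabricated purely for illustration.

```python
import numpy as np

def pair_diff(a, b):
    """PairDiff: the relation of word pair (a, b) is the difference b - a."""
    return b - a

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

rng = np.random.default_rng(0)
emb = {w: rng.normal(size=50) for w in ["man", "woman", "king", "queen"]}
shift = np.zeros(50)
shift[0] = 1.0                                # plant a shared relational direction
emb["woman"] = emb["man"] + shift
emb["queen"] = emb["king"] + shift

r1 = pair_diff(emb["man"], emb["woman"])
r2 = pair_diff(emb["king"], emb["queen"])
print("relational similarity:", cosine(r1, r2))   # 1.0 => analogous pairs
```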

On Random Subsampling of Gaussian Process Regression: A Graphon-Based Analysis

no code implementations28 Jan 2019 Kohei Hayashi, Masaaki Imaizumi, Yuichi Yoshida

In this paper, we study random subsampling of Gaussian process regression, one of the simplest approximation baselines, from a theoretical perspective.

regression
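
The approximation baseline under study is simple to state in code: fit the GP to a uniformly random subsample of the training set rather than all n points, avoiding the O(n³) cost of exact inference. A minimal scikit-learn sketch on synthetic 1-D data; the dataset and kernel choices are illustrative.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
n, m = 3000, 200                          # full training set vs. random subsample
X = rng.uniform(-3, 3, size=(n, 1))
y = np.sin(2 * X[:, 0]) + 0.1 * rng.normal(size=n)

sub = rng.choice(n, size=m, replace=False)
kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.01)
gp = GaussianProcessRegressor(kernel=kernel).fit(X[sub], y[sub])

X_test = np.linspace(-3, 3, 200)[:, None]
rmse = np.sqrt(np.mean((gp.predict(X_test) - np.sin(2 * X_test[:, 0])) ** 2))
print(f"subsampled GP (m={m}) RMSE vs. noiseless truth: {rmse:.3f}")
```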

Weisfeiler-Lehman Embedding for Molecular Graph Neural Networks

no code implementations12 Jun 2020 Katsuhiko Ishiguro, Kenta Oono, Kohei Hayashi

A graph neural network (GNN) is a good choice for predicting the chemical properties of molecules.

Feature Engineering Link Prediction
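
For background, the Weisfeiler-Lehman relabeling that the proposed embedding builds on: each node's label is repeatedly hashed together with the sorted multiset of its neighbors' labels, so labels come to encode larger and larger neighborhoods. A dependency-free toy sketch; the paper embeds such labels inside a GNN, which is not shown here.

```python
import hashlib

def wl_labels(adj, labels, iterations=3):
    """One WL pass per iteration: hash (own label, sorted neighbor labels)."""
    labels = list(labels)
    for _ in range(iterations):
        labels = [
            hashlib.sha1(
                (labels[v] + "|" + ",".join(sorted(labels[u] for u in adj[v]))).encode()
            ).hexdigest()[:8]
            for v in range(len(adj))
        ]
    return labels

# Toy molecular graph: the heavy atoms of ethanol, C-C-O, labeled by atom type.
adj = [[1], [0, 2], [1]]
print(wl_labels(adj, ["C", "C", "O"], iterations=2))
```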

A Scaling Law for Synthetic-to-Real Transfer: How Much Is Your Pre-training Effective?

1 code implementation25 Aug 2021 Hiroaki Mikami, Kenji Fukumizu, Shogo Murai, Shuji Suzuki, Yuta Kikuchi, Taiji Suzuki, Shin-ichi Maeda, Kohei Hayashi

Synthetic-to-real transfer learning is a framework in which a synthetically generated dataset is used to pre-train a model to improve its performance on real vision tasks.

Image Generation Transfer Learning
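
A sketch of how such a scaling law can be fit in practice, assuming the generic power-law-plus-floor form loss(n) ≈ a·n^(-α) + c; both the functional form and the data points below are hypothetical stand-ins, not the paper's results.

```python
import numpy as np
from scipy.optimize import curve_fit

def scaling_law(n, a, alpha, c):
    # Fine-tuned test loss as a function of synthetic pre-training set size n.
    return a * n ** (-alpha) + c

# Hypothetical (pre-training size, test loss after fine-tuning) measurements.
n = np.array([1e3, 3e3, 1e4, 3e4, 1e5, 3e5])
loss = np.array([0.92, 0.71, 0.55, 0.46, 0.41, 0.385])

(a, alpha, c), _ = curve_fit(scaling_law, n, loss, p0=(10.0, 0.5, 0.3))
print(f"decay exponent alpha = {alpha:.2f}, irreducible loss floor c = {c:.2f}")
```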

Fractional SDE-Net: Generation of Time Series Data with Long-term Memory

no code implementations16 Jan 2022 Kohei Hayashi, Kei Nakagawa

The proposed model generalizes the neural stochastic differential equation model by using fractional Brownian motion with a Hurst index larger than one half, which exhibits the long-range dependence (LRD) property.

Time Series Time Series Analysis
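
A minimal NumPy sketch of the driving noise: sample fractional Brownian motion from the Cholesky factor of its covariance 0.5(s^{2H} + t^{2H} - |s - t|^{2H}). With Hurst index H > 1/2 the increments are positively correlated, which is the long-range dependence the model exploits. This illustrates fBm only, not the Fractional SDE-Net itself.

```python
import numpy as np

def fbm_path(n_steps, hurst, T=1.0, seed=0):
    """Sample fBm on (0, T] via the Cholesky factor of its covariance matrix."""
    rng = np.random.default_rng(seed)
    t = np.linspace(T / n_steps, T, n_steps)
    s, u = np.meshgrid(t, t)
    cov = 0.5 * (s ** (2 * hurst) + u ** (2 * hurst) - np.abs(s - u) ** (2 * hurst))
    L = np.linalg.cholesky(cov + 1e-12 * np.eye(n_steps))   # jitter for stability
    return t, L @ rng.normal(size=n_steps)

t, path = fbm_path(500, hurst=0.7)
inc = np.diff(path)
# Positive lag-1 increment correlation is the signature of H > 1/2.
print("lag-1 increment correlation:", np.corrcoef(inc[:-1], inc[1:])[0, 1])
```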

TabRet: Pre-training Transformer-based Tabular Models for Unseen Columns

1 code implementation28 Mar 2023 Soma Onishi, Kenta Oono, Kohei Hayashi

We present TabRet, a pre-trainable Transformer-based model for tabular data.
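
TabRet's defining mechanism, retokenizing to cope with columns unseen during pre-training, is not reproduced here; as a generic sketch of the masked-modeling pretext task for tabular data, the PyTorch snippet below hides random feature cells and trains an encoder-decoder to reconstruct them. All architectural choices are illustrative.

```python
import torch
import torch.nn as nn

class MaskedTabularPretrainer(nn.Module):
    """Masked pre-training for tabular rows: reconstruct randomly hidden cells."""
    def __init__(self, n_features, d_hidden=64):
        super().__init__()
        self.mask_token = nn.Parameter(torch.zeros(1))
        self.encoder = nn.Sequential(
            nn.Linear(n_features, d_hidden), nn.ReLU(),
            nn.Linear(d_hidden, d_hidden), nn.ReLU(),
        )
        self.decoder = nn.Linear(d_hidden, n_features)

    def forward(self, x, mask_ratio=0.3):
        mask = torch.rand_like(x) < mask_ratio
        x_masked = torch.where(mask, self.mask_token.expand_as(x), x)
        recon = self.decoder(self.encoder(x_masked))
        return ((recon - x)[mask] ** 2).mean()   # loss only on hidden cells

model = MaskedTabularPretrainer(n_features=10)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(256, 10)                          # unlabeled pre-training batch
for _ in range(100):
    opt.zero_grad()
    loss = model(x)
    loss.backward()
    opt.step()
print("pre-training loss:", loss.item())
```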

Neural Fourier Transform: A General Approach to Equivariant Representation Learning

no code implementations29 May 2023 Masanori Koyama, Kenji Fukumizu, Kohei Hayashi, Takeru Miyato

Symmetry learning has proven to be an effective approach for extracting the hidden structure of data, with the concept of equivariance playing a central role.

Representation Learning

Virtual Human Generative Model: Masked Modeling Approach for Learning Human Characteristics

no code implementations19 Jun 2023 Kenta Oono, Nontawat Charoenphakdee, Kotatsu Bito, Zhengyan Gao, Yoshiaki Ota, Shoichiro Yamaguchi, Yohei Sugawara, Shin-ichi Maeda, Kunihiko Miyoshi, Yuki Saito, Koki Tsuda, Hiroshi Maruyama, Kohei Hayashi

In this paper, we propose the Virtual Human Generative Model (VHGM), a machine learning model for estimating healthcare, lifestyle, and personality attributes.

CFTM: Continuous time fractional topic model

no code implementations29 Jan 2024 Kei Nakagawa, Kohei Hayashi, Yugo Fujimoto

This approach incorporates fractional Brownian motion (fBm) to effectively identify positive or negative correlations in topic and word distributions over time, revealing long-term dependency or roughness.

Dynamic Topic Modeling

Extended Flow Matching: a Method of Conditional Generation with Generalized Continuity Equation

no code implementations29 Feb 2024 Noboru Isobe, Masanori Koyama, Kohei Hayashi, Kenji Fukumizu

In this paper, we develop the theory of conditional generation based on Flow Matching, a current strong contender to diffusion methods.
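
For orientation, a self-contained PyTorch sketch of plain Flow Matching with linear interpolation paths: regress a velocity field onto x1 - x0 along the path, then sample by integrating the learned ODE. The paper's extension to conditional generation via a generalized continuity equation is not implemented here.

```python
import torch
import torch.nn as nn

# Velocity field v(x, t); Flow Matching regresses it onto the path's velocity.
net = nn.Sequential(nn.Linear(3, 128), nn.SiLU(), nn.Linear(128, 2))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

data = torch.randn(4096, 2) * 0.3 + torch.tensor([2.0, 0.0])   # toy target

for step in range(2000):
    x1 = data[torch.randint(len(data), (256,))]
    x0 = torch.randn(256, 2)                  # source: standard Gaussian
    t = torch.rand(256, 1)
    xt = (1 - t) * x0 + t * x1                # linear interpolation path
    loss = ((net(torch.cat([xt, t], dim=1)) - (x1 - x0)) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# Sampling: integrate dx/dt = v(x, t) from t = 0 to 1 with Euler steps.
x = torch.randn(1000, 2)
with torch.no_grad():
    for i in range(100):
        t = torch.full((1000, 1), i / 100)
        x = x + 0.01 * net(torch.cat([x, t], dim=1))
print("sample mean (target is about [2, 0]):", x.mean(0))
```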
