Search Results for author: Kenta Oono

Found 16 papers, 6 papers with code

Virtual Human Generative Model: Masked Modeling Approach for Learning Human Characteristics

no code implementations 19 Jun 2023 Kenta Oono, Nontawat Charoenphakdee, Kotatsu Bito, Zhengyan Gao, Yoshiaki Ota, Shoichiro Yamaguchi, Yohei Sugawara, Shin-ichi Maeda, Kunihiko Miyoshi, Yuki Saito, Koki Tsuda, Hiroshi Maruyama, Kohei Hayashi

In this paper, we propose the Virtual Human Generative Model (VHGM), a machine learning model for estimating healthcare, lifestyle, and personality attributes.

Controlling Posterior Collapse by an Inverse Lipschitz Constraint on the Decoder Network

no code implementations 25 Apr 2023 Yuri Kinoshita, Kenta Oono, Kenji Fukumizu, Yuichi Yoshida, Shin-ichi Maeda

Variational autoencoders (VAEs) are among the deep generative models that have experienced enormous success over the past decade.
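
For background only (standard VAE material, not this paper's contribution), the usual objective and the meaning of "posterior collapse" are:

```latex
% Standard VAE evidence lower bound (generic background, not taken from this paper):
\mathcal{L}(x;\theta,\phi)
  = \mathbb{E}_{q_\phi(z \mid x)}\!\left[\log p_\theta(x \mid z)\right]
  - D_{\mathrm{KL}}\!\left(q_\phi(z \mid x)\,\|\,p(z)\right)
% Posterior collapse: q_\phi(z \mid x) \approx p(z) for (almost) all x, so the KL term is
% near zero and the latent code z carries almost no information about the input x.
```

The title indicates that the paper mitigates this by imposing an inverse Lipschitz constraint on the decoder network; the details of that constraint are in the paper itself.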

TabRet: Pre-training Transformer-based Tabular Models for Unseen Columns

1 code implementation 28 Mar 2023 Soma Onishi, Kenta Oono, Kohei Hayashi

We present TabRet, a pre-trainable Transformer-based model for tabular data.

Fast Estimation Method for the Stability of Ensemble Feature Selectors

no code implementations 3 Aug 2021 Rina Onda, Zhengyan Gao, Masaaki Kotera, Kenta Oono

It is preferred that feature selectors be stable for better interpretability and robust prediction.
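
As a concrete (and generic) reading of "stable", the sketch below scores a toy feature selector by the average pairwise Jaccard similarity of the feature subsets it picks across bootstrap resamples. This is a standard stability measure, not the fast estimation method proposed in the paper, and every name and constant in it is illustrative.

```python
# Generic feature-selection stability check (not the paper's fast estimation method):
# average pairwise Jaccard similarity of selected feature sets across bootstrap resamples.
import numpy as np

def select_top_k(X, y, k):
    """Toy selector: keep the k features with the largest absolute correlation to y."""
    scores = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
    return set(np.argsort(scores)[-k:])

def stability(selected_sets):
    """Average pairwise Jaccard similarity between the selected feature sets."""
    sims = []
    for i in range(len(selected_sets)):
        for j in range(i + 1, len(selected_sets)):
            a, b = selected_sets[i], selected_sets[j]
            sims.append(len(a & b) / len(a | b))
    return float(np.mean(sims))

rng = np.random.default_rng(0)
n, d, k = 200, 30, 5
X = rng.normal(size=(n, d))
y = X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n)  # only 2 informative features

subsets = []
for _ in range(20):                       # bootstrap resamples of the data
    idx = rng.integers(0, n, size=n)
    subsets.append(select_top_k(X[idx], y[idx], k))
print("stability (mean pairwise Jaccard):", stability(subsets))
```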

Universal Approximation Property of Neural Ordinary Differential Equations

no code implementations 4 Dec 2020 Takeshi Teshima, Koichi Tojo, Masahiro Ikeda, Isao Ishikawa, Kenta Oono

Neural ordinary differential equations (NODEs) are an invertible neural network architecture, promising for their free-form Jacobian and the availability of a tractable Jacobian determinant estimator.
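
The "tractable Jacobian determinant estimator" refers to standard continuous-normalizing-flow machinery (background, not this paper's result): along a NODE trajectory dz/dt = f(z(t), t), the log-density evolves with the negative trace of the Jacobian, which can be estimated stochastically with a Hutchinson-style probe vector v:

```latex
% Instantaneous change of variables for a NODE dz/dt = f(z(t), t) (standard CNF background),
% together with the Hutchinson trace estimator:
\frac{\partial \log p(z(t))}{\partial t}
  = -\operatorname{tr}\!\left(\frac{\partial f}{\partial z(t)}\right)
  \approx -\,\mathbb{E}_{v}\!\left[ v^{\top} \frac{\partial f}{\partial z(t)}\, v \right],
\qquad \mathbb{E}[v] = 0,\ \operatorname{Cov}(v) = I
```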

Coupling-based Invertible Neural Networks Are Universal Diffeomorphism Approximators

no code implementations NeurIPS 2020 Takeshi Teshima, Isao Ishikawa, Koichi Tojo, Kenta Oono, Masahiro Ikeda, Masashi Sugiyama

We answer this question by showing a convenient criterion: a CF-INN is universal if its layers contain affine coupling and invertible linear functions as special cases.

Image Generation, Representation Learning
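
For reference, a minimal affine coupling layer, the generic building block named in the criterion above; the scale and shift "conditioners" below are arbitrary placeholder functions, not the paper's construction:

```python
# Minimal affine coupling layer (generic flow building block; the scale/shift "networks"
# below are arbitrary placeholder functions, not anything from this paper).
import numpy as np

def scale_fn(x):   # placeholder conditioner s(x1)
    return np.tanh(x)

def shift_fn(x):   # placeholder conditioner t(x1)
    return 0.5 * x

def coupling_forward(x):
    d = x.shape[-1] // 2
    x1, x2 = x[..., :d], x[..., d:]
    y2 = x2 * np.exp(scale_fn(x1)) + shift_fn(x1)     # x1 is passed through unchanged
    return np.concatenate([x1, y2], axis=-1)

def coupling_inverse(y):
    d = y.shape[-1] // 2
    y1, y2 = y[..., :d], y[..., d:]
    x2 = (y2 - shift_fn(y1)) * np.exp(-scale_fn(y1))  # exact inverse, no iteration needed
    return np.concatenate([y1, x2], axis=-1)

x = np.random.default_rng(0).normal(size=(4, 6))
assert np.allclose(coupling_inverse(coupling_forward(x)), x)
```

The inverse is available in closed form because the first half of the input passes through unchanged, which is what makes coupling layers convenient invertible building blocks.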

Optimization and Generalization Analysis of Transduction through Gradient Boosting and Application to Multi-scale Graph Neural Networks

1 code implementation NeurIPS 2020 Kenta Oono, Taiji Suzuki

By combining it with generalization gap bounds in terms of transductive Rademacher complexity, we derive a test error bound for a specific type of multi-scale GNN that decreases with the number of node aggregations under some conditions.

Learning Theory, Transductive Learning
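
For context, the transductive Rademacher complexity referred to above is usually defined following El-Yaniv and Pechyony; the display below is that background definition for m labeled and u unlabeled examples and a set of output vectors V over the m+u points, not the test error bound derived in the paper:

```latex
% Transductive Rademacher complexity (El-Yaniv & Pechyony); background definition only.
\mathfrak{R}_{m+u}(V)
  := \left(\frac{1}{m} + \frac{1}{u}\right)
     \mathbb{E}_{\sigma}\!\left[\, \sup_{v \in V} \sigma^{\top} v \,\right],
\qquad
\sigma_i =
\begin{cases}
 +1 & \text{with probability } p,\\
 -1 & \text{with probability } p,\\
 \phantom{+}0 & \text{with probability } 1 - 2p,
\end{cases}
\quad p = \frac{mu}{(m+u)^2}
```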

Weisfeiler-Lehman Embedding for Molecular Graph Neural Networks

no code implementations 12 Jun 2020 Katsuhiko Ishiguro, Kenta Oono, Kohei Hayashi

A graph neural network (GNN) is a good choice for predicting the chemical properties of molecules.

Feature Engineering, Link Prediction
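
As background for the "Weisfeiler-Lehman" part of the title, the classical 1-WL relabeling step is sketched below on a hypothetical toy graph; this is the textbook procedure, not the embedding proposed in the paper:

```python
# Classical 1-WL (Weisfeiler-Lehman) relabeling, shown only as background for the title.
def wl_iteration(labels, adjacency):
    """labels: dict node -> label; adjacency: dict node -> list of neighbor nodes."""
    new_labels = {}
    for node, neighbors in adjacency.items():
        signature = (labels[node], tuple(sorted(labels[n] for n in neighbors)))
        new_labels[node] = hash(signature)   # compress the signature into a new label
    return new_labels

# Toy "molecule": a triangle of carbons with one oxygen attached (hypothetical example).
adjacency = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
labels = {0: "C", 1: "C", 2: "C", 3: "O"}
for _ in range(3):                           # a few refinement rounds
    labels = wl_iteration(labels, adjacency)
print(labels)                                # nodes with different neighborhoods end up with different labels
```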

Graph Residual Flow for Molecular Graph Generation

no code implementations 30 Sep 2019 Shion Honda, Hirotaka Akita, Katsuhiko Ishiguro, Toshiki Nakanishi, Kenta Oono

Statistical generative models for molecular graphs attract attention from many researchers from the fields of bio- and chemo-informatics.

Graph Generation, Molecular Graph Generation

Graph Neural Networks Exponentially Lose Expressive Power for Node Classification

1 code implementation ICLR 2020 Kenta Oono, Taiji Suzuki

We show that when the Erdős–Rényi graph is sufficiently dense and large, a broad range of GCNs on it suffers from the "information loss" in the limit of infinite layers with high probability.

Classification, General Classification +1
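
A toy numerical illustration of that "information loss" claim (a demo under illustrative parameters, not the paper's analysis): repeatedly applying the augmented normalized adjacency of a dense Erdős–Rényi graph collapses random node features toward a one-dimensional subspace, visible as a shrinking second-to-first singular-value ratio.

```python
# Toy over-smoothing demo (illustration only, not the paper's analysis): linear GCN-style
# propagation on a dense Erdos-Renyi graph collapses features toward a rank-1 matrix.
import numpy as np

rng = np.random.default_rng(0)
n, d, p = 200, 16, 0.3
A = (rng.random((n, n)) < p).astype(float)
A = np.triu(A, 1)
A = A + A.T                                        # symmetric adjacency, no self-loops yet
A_hat = A + np.eye(n)                              # add self-loops
deg_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
P = deg_inv_sqrt[:, None] * A_hat * deg_inv_sqrt[None, :]   # D^{-1/2}(A+I)D^{-1/2}

X = rng.normal(size=(n, d))
for k in range(1, 33):
    X = P @ X                                      # one linear propagation step
    s = np.linalg.svd(X, compute_uv=False)
    if k in (1, 4, 16, 32):
        print(f"layers={k:2d}  sigma_2/sigma_1 = {s[1] / s[0]:.2e}")
```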

Approximation and non-parametric estimation of ResNet-type convolutional neural networks via block-sparse fully-connected neural networks

no code implementations ICLR 2019 Kenta Oono, Taiji Suzuki

We develop new approximation and statistical learning theories of convolutional neural networks (CNNs) via the ResNet-type structure where the channel size, filter size, and width are fixed.

Approximation and Non-parametric Estimation of ResNet-type Convolutional Neural Networks

no code implementations 24 Mar 2019 Kenta Oono, Taiji Suzuki

The key idea is that we can replicate the learning ability of fully-connected neural networks (FNNs) by tailored CNNs, as long as the FNNs have block-sparse structures.

Vocal Bursts Type Prediction
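
A rough sketch of a "block-sparse" fully-connected network under one common reading of the term, a weighted sum of M narrow, independent sub-networks; the paper's precise definition and parameter regime may differ, and every constant below is illustrative.

```python
# Rough sketch of a "block-sparse" fully-connected network: a weighted sum of M narrow,
# independent sub-networks (blocks). One common reading of the term; the paper's precise
# definition may differ. All sizes and weights here are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def make_block(d_in, width, depth):
    """One narrow sub-network: random weights and ReLU activations (illustration only)."""
    dims = [d_in] + [width] * depth
    return [(rng.normal(size=(dims[i], dims[i + 1])), rng.normal(size=dims[i + 1]))
            for i in range(depth)]

def run_block(block, x):
    h = x
    for W, b in block:
        h = np.maximum(h @ W + b, 0.0)             # ReLU layer
    return h.sum(axis=-1)                          # scalar output per example

d_in, width, depth, M = 8, 4, 3, 5
blocks = [make_block(d_in, width, depth) for _ in range(M)]
w = rng.normal(size=M)                             # linear combination of block outputs

x = rng.normal(size=(10, d_in))
y = sum(w[m] * run_block(blocks[m], x) for m in range(M))
print(y.shape)                                     # (10,) -- one prediction per example
```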

Population-based de novo molecule generation, using grammatical evolution

1 code implementation 6 Apr 2018 Naruki Yoshikawa, Kei Terayama, Teruki Honma, Kenta Oono, Koji Tsuda

Automatic design with machine learning and molecular simulations has shown a remarkable ability to generate new and promising drug candidates.

Chemical Physics, Biomolecules

Semi-supervised learning of hierarchical representations of molecules using neural message passing

1 code implementation 28 Nov 2017 Hai Nguyen, Shin-ichi Maeda, Kenta Oono

With the rapid increase of compound databases available in medicinal and material science, there is a growing need for learning representations of molecules in a semi-supervised manner.
