no code implementations • 19 Jun 2023 • Kenta Oono, Nontawat Charoenphakdee, Kotatsu Bito, Zhengyan Gao, Yoshiaki Ota, Shoichiro Yamaguchi, Yohei Sugawara, Shin-ichi Maeda, Kunihiko Miyoshi, Yuki Saito, Koki Tsuda, Hiroshi Maruyama, Kohei Hayashi
In this paper, we propose the Virtual Human Generative Model (VHGM), a machine learning model for estimating attributes related to healthcare, lifestyle, and personality.
no code implementations • 25 Apr 2023 • Yuri Kinoshita, Kenta Oono, Kenji Fukumizu, Yuichi Yoshida, Shin-ichi Maeda
Variational autoencoders (VAEs) are among the deep generative models that have enjoyed enormous success over the past decade.
1 code implementation • 28 Mar 2023 • Soma Onishi, Kenta Oono, Kohei Hayashi
We present TabRet, a pre-trainable Transformer-based model for tabular data.
no code implementations • 15 Apr 2022 • Isao Ishikawa, Takeshi Teshima, Koichi Tojo, Kenta Oono, Masahiro Ikeda, Masashi Sugiyama
Invertible neural networks (INNs) are neural network architectures with invertibility by design.
no code implementations • 3 Aug 2021 • Rina Onda, Zhengyan Gao, Masaaki Kotera, Kenta Oono
It is preferred that feature selectors be stable for better interpretability and robust prediction.
no code implementations • 4 Dec 2020 • Takeshi Teshima, Koichi Tojo, Masahiro Ikeda, Isao Ishikawa, Kenta Oono
Neural ordinary differential equations (NODEs) are an invertible neural network architecture that is promising for its free-form Jacobian and the availability of a tractable Jacobian determinant estimator.
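As a minimal sketch of the tractable Jacobian-determinant machinery, here is a Hutchinson trace estimator (my assumption: the kind of stochastic estimator used in FFJORD-style NODEs), demonstrated on a linear map whose Jacobian trace is known exactly. The function names are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 5
J = rng.standard_normal((d, d))          # Jacobian of the toy map f(x) = J x
f = lambda v: J @ v                      # Jacobian-vector product for f

def hutchinson_trace(jvp, d, n_samples=20000, rng=rng):
    """Estimate tr(J) as E[v^T J v] over random Rademacher probes v."""
    total = 0.0
    for _ in range(n_samples):
        v = rng.choice([-1.0, 1.0], size=d)  # Rademacher probe vector
        total += v @ jvp(v)                  # one sample of v^T J v
    return total / n_samples

est = hutchinson_trace(f, d)
print(est, np.trace(J))                  # the estimate is close to tr(J)
```

In a continuous normalizing flow, the same probe trick estimates the divergence term in the log-density ODE without ever materializing the full Jacobian.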
no code implementations • NeurIPS 2020 • Takeshi Teshima, Isao Ishikawa, Koichi Tojo, Kenta Oono, Masahiro Ikeda, Masashi Sugiyama
We answer this question by showing a convenient criterion: a CF-INN is universal if its layers contain affine coupling and invertible linear functions as special cases.
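To make the criterion's ingredients concrete, here is a toy affine coupling layer (my own sketch, not the paper's construction): one half of the input is left untouched and parameterizes a scale/shift of the other half, so the inverse is available in closed form.

```python
import numpy as np

def coupling_forward(x, w):
    """Affine coupling: transform x2 with scale/shift computed from x1."""
    x1, x2 = x[: len(x) // 2], x[len(x) // 2 :]
    s, t = np.tanh(w @ x1), w @ x1       # log-scale and shift from x1 only
    y2 = x2 * np.exp(s) + t              # elementwise affine map of x2
    return np.concatenate([x1, y2])

def coupling_inverse(y, w):
    """Exact inverse: recompute s, t from the untouched half and undo."""
    y1, y2 = y[: len(y) // 2], y[len(y) // 2 :]
    s, t = np.tanh(w @ y1), w @ y1
    x2 = (y2 - t) * np.exp(-s)           # no iterative solver needed
    return np.concatenate([y1, x2])

rng = np.random.default_rng(0)
w = rng.standard_normal((2, 2))          # toy conditioner (a single matrix)
x = rng.standard_normal(4)
assert np.allclose(coupling_inverse(coupling_forward(x, w), w), x)
```

Invertibility holds regardless of the conditioner network, which is what makes coupling layers a natural building block for the universality result.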
1 code implementation • NeurIPS 2020 • Kenta Oono, Taiji Suzuki
By combining it with generalization gap bounds in terms of transductive Rademacher complexity, we derive a test error bound for a specific type of multi-scale GNN that decreases with the number of node aggregations under some conditions.
no code implementations • 12 Jun 2020 • Katsuhiko Ishiguro, Kenta Oono, Kohei Hayashi
A graph neural network (GNN) is a good choice for predicting the chemical properties of molecules.
no code implementations • 30 Sep 2019 • Shion Honda, Hirotaka Akita, Katsuhiko Ishiguro, Toshiki Nakanishi, Kenta Oono
Statistical generative models for molecular graphs attract attention from many researchers in the fields of bio- and chemo-informatics.
1 code implementation • ICLR 2020 • Kenta Oono, Taiji Suzuki
We show that when the Erdős–Rényi graph is sufficiently dense and large, a broad range of GCNs on it suffers from the "information loss" in the limit of infinite layers with high probability.
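This information loss can be illustrated with a small simulation (my own toy sketch, not the paper's experiment): repeatedly aggregating features with a row-normalized adjacency of a dense Erdős–Rényi graph collapses node features toward a common vector as depth grows.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 0.5                                  # dense Erdos-Renyi graph
A = (rng.random((n, n)) < p).astype(float)
A = np.triu(A, 1); A = A + A.T + np.eye(n)       # symmetrize, add self-loops
P = A / A.sum(axis=1, keepdims=True)             # row-normalized aggregation

X = rng.standard_normal((n, 8))                  # random node features
spread = lambda X: np.abs(X - X.mean(axis=0)).max()
before = spread(X)
for _ in range(20):                              # 20 feature-aggregation steps
    X = P @ X
after = spread(X)
print(before, after)                             # after is far smaller
```

With aggregation alone (no learned weights or nonlinearities), the node-wise spread shrinks geometrically with depth, which is the intuition behind the asymptotic result.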
no code implementations • ICLR 2019 • Kenta Oono, Taiji Suzuki
We develop new approximation and statistical learning theories of convolutional neural networks (CNNs) via the ResNet-type structure where the channel size, filter size, and width are fixed.
no code implementations • 24 Mar 2019 • Kenta Oono, Taiji Suzuki
The key idea is that we can replicate the learning ability of fully-connected neural networks (FNNs) by tailored CNNs, as long as the FNNs have block-sparse structures.
1 code implementation • 6 Apr 2018 • Naruki Yoshikawa, Kei Terayama, Teruki Honma, Kenta Oono, Koji Tsuda
Automatic design with machine learning and molecular simulations has shown a remarkable ability to generate new and promising drug candidates.
1 code implementation • 28 Nov 2017 • Hai Nguyen, Shin-ichi Maeda, Kenta Oono
With the rapid increase of compound databases available in medicinal and material science, there is a growing need for learning representations of molecules in a semi-supervised manner.
1 code implementation • NIPS 2015 • Seiya Tokui, Kenta Oono, Shohei Hido, Justin Clayton
Software frameworks for neural networks play key roles in the development and application of deep learning methods.