no code implementations • 29 May 2024 • Yuta Tarumi, Keisuke Fukuda, Shin-ichi Maeda

State estimation for nonlinear state space models is a challenging task.

no code implementations • 2 Oct 2023 • Tsukasa Takagi, Shinya Ishizaki, Shin-ichi Maeda

Image denoising is a representative image restoration task in computer vision.

no code implementations • 19 Jun 2023 • Kenta Oono, Nontawat Charoenphakdee, Kotatsu Bito, Zhengyan Gao, Yoshiaki Ota, Shoichiro Yamaguchi, Yohei Sugawara, Shin-ichi Maeda, Kunihiko Miyoshi, Yuki Saito, Koki Tsuda, Hiroshi Maruyama, Kohei Hayashi

In this paper, we propose the Virtual Human Generative Model (VHGM), a machine learning model for estimating healthcare, lifestyle, and personality attributes.

no code implementations • 25 Apr 2023 • Yuri Kinoshita, Kenta Oono, Kenji Fukumizu, Yuichi Yoshida, Shin-ichi Maeda

Variational autoencoders (VAEs) are among the deep generative models that have seen enormous success over the past decade.

no code implementations • 29 Sep 2021 • Hiroaki Mikami, Kenji Fukumizu, Shogo Murai, Shuji Suzuki, Yuta Kikuchi, Taiji Suzuki, Shin-ichi Maeda, Kohei Hayashi

Synthetic-to-real transfer learning is a framework in which a synthetically generated dataset is used to pre-train a model to improve its performance on real vision tasks.

no code implementations • ICCV 2021 • Aditya Ganeshan, Alexis Vallet, Yasunori Kudo, Shin-ichi Maeda, Tommi Kerola, Rares Ambrus, Dennis Park, Adrien Gaidon

Deep learning models for semantic segmentation rely on expensive, large-scale, manually annotated datasets.

Ranked #38 on Semantic Segmentation on NYU Depth v2

1 code implementation • 25 Aug 2021 • Hiroaki Mikami, Kenji Fukumizu, Shogo Murai, Shuji Suzuki, Yuta Kikuchi, Taiji Suzuki, Shin-ichi Maeda, Kohei Hayashi

Synthetic-to-real transfer learning is a framework in which a synthetically generated dataset is used to pre-train a model to improve its performance on real vision tasks.

no code implementations • 1 Jan 2021 • Masanori Koyama, Toshiki Nakanishi, Shin-ichi Maeda, Vitor Campagnolo Guizilini, Adrien Gaidon

Estimating the 3D shape of real-world objects is a key perceptual challenge.

no code implementations • 1 Jan 2021 • Shin-ichi Maeda, Hayato Watahiki, Yi Ouyang, Shintarou Okada, Masanori Koyama

In this study, we consider a setting in which the agent has access to a generative model that provides a next-state sample for any given state-action pair, and we propose solving a CMDP problem by decomposing the CMDP into a pair of MDPs: a reconnaissance MDP (R-MDP) and a planning MDP (P-MDP).

no code implementations • 2 Jun 2020 • Shin-ichi Maeda, Toshiki Nakanishi, Masanori Koyama

However, the posterior distribution in the Neural Process does not change with the context dataset in the way a true posterior should.

1 code implementation • NeurIPS 2019 • Kohei Hayashi, Taiki Yamaguchi, Yohei Sugawara, Shin-ichi Maeda

Tensor decomposition methods are widely used for model compression and fast inference in convolutional neural networks (CNNs).
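A minimal illustration of the underlying idea, using plain truncated SVD on a dense weight matrix (the paper explores a much richer space of tensor-network decompositions; the layer size and rank here are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((256, 256))   # weight matrix of a dense layer

# Truncated SVD: keep only the top-r singular components.
U, s, Vt = np.linalg.svd(W, full_matrices=False)
r = 32
A = U[:, :r] * s[:r]                  # shape (256, r)
B = Vt[:r, :]                         # shape (r, 256)

# Two small factors replace one large matrix: 65536 -> 16384 parameters,
# and the forward pass becomes two cheap matmuls, x @ A @ B.
params_full = W.size
params_low = A.size + B.size
```

The same principle, applied to the 4-D kernels of convolutional layers via CP or Tucker decompositions, is what makes these methods useful for CNN compression.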

no code implementations • 19 Nov 2019 • Homanga Bharadhwaj, Shoichiro Yamaguchi, Shin-ichi Maeda

Efficiently transferring learned policies to an unknown environment with changed dynamics, in the presence of motor noise, is essential for operating robots in the real world; our work is a novel attempt in that direction.

no code implementations • 20 Sep 2019 • Shin-ichi Maeda, Hayato Watahiki, Shintarou Okada, Masanori Koyama

Practical reinforcement learning problems are often formulated as constrained Markov decision process (CMDP) problems, in which the agent has to maximize the expected return while satisfying a set of prescribed safety constraints.
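For reference, the standard CMDP objective (a textbook formulation, not notation from this paper) is:

```latex
\max_{\pi} \; \mathbb{E}_{\pi}\!\left[\sum_{t=0}^{\infty} \gamma^{t} r(s_t, a_t)\right]
\quad \text{s.t.} \quad
\mathbb{E}_{\pi}\!\left[\sum_{t=0}^{\infty} \gamma^{t} c_i(s_t, a_t)\right] \le d_i,
\qquad i = 1, \dots, m
```

where each $c_i$ is a safety cost and $d_i$ its prescribed budget.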

1 code implementation • 13 Aug 2019 • Kohei Hayashi, Taiki Yamaguchi, Yohei Sugawara, Shin-ichi Maeda

Tensor decomposition methods are widely used for model compression and fast inference in convolutional neural networks (CNNs).

no code implementations • NeurIPS 2019 • Amir Najafi, Shin-ichi Maeda, Masanori Koyama, Takeru Miyato

What is the role of unlabeled data in an inference problem, when the presumed underlying distribution is adversarially perturbed?

no code implementations • ICLR 2019 • Ken Nakanishi, Shin-ichi Maeda, Takeru Miyato, Masanori Koyama

We propose Adaptive Sample-space & Adaptive Probability (ASAP) coding, an efficient neural-network based method for lossy data compression.

1 code implementation • 4 Feb 2019 • Katsuhiko Ishiguro, Shin-ichi Maeda, Masanori Koyama

Graph Neural Network (GNN) is a popular architecture for the analysis of chemical molecules, and it has numerous applications in material and medicinal science.

1 code implementation • 28 Oct 2018 • Riku Arakawa, Sosuke Kobayashi, Yuya Unno, Yuta Tsuboi, Shin-ichi Maeda

A remedy for this is to train an agent with real-time feedback from a human observer who immediately gives rewards for some actions.

1 code implementation • 4 Jul 2018 • Hirotaka Akita, Kosuke Nakago, Tomoki Komatsu, Yohei Sugawara, Shin-ichi Maeda, Yukino Baba, Hisashi Kashima

A possible approach to answer this question is to visualize evidence substructures responsible for the predictions.

no code implementations • 16 May 2018 • Ken Nakanishi, Shin-ichi Maeda, Takeru Miyato, Daisuke Okanohara

This study presents a new lossy image compression method that utilizes the multi-scale features of natural images.

1 code implementation • ICML 2018 • Yasuhiro Fujita, Shin-ichi Maeda

We propose a policy gradient estimator that exploits the knowledge of actions being clipped to reduce the variance in estimation.
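A minimal sketch of the underlying observation, assuming a 1-D Gaussian policy with action bounds `low` and `high`; the function names and interface are illustrative, not the paper's API:

```python
import math

def normal_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def clipped_action_log_prob(u, mu, sigma, low, high):
    """Log-probability of the clipped action a = clip(u, low, high)
    under a Gaussian policy N(mu, sigma^2).

    All probability mass below `low` (or above `high`) is lumped onto
    the boundary, so boundary actions get a log-CDF term instead of a
    log-density -- the structure this kind of estimator exploits.
    """
    if u <= low:
        return math.log(normal_cdf((low - mu) / sigma))
    if u >= high:
        return math.log(1.0 - normal_cdf((high - mu) / sigma))
    return (-0.5 * ((u - mu) / sigma) ** 2
            - math.log(sigma * math.sqrt(2.0 * math.pi)))
```

Because the environment only ever sees the clipped action, scoring the clipped distribution rather than the pre-clip sample removes a source of needless gradient noise at the boundaries.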

no code implementations • ICLR 2018 • Sotetsu Koyamada, Yuta Kikuchi, Atsunori Kanemura, Shin-ichi Maeda, Shin Ishii

Neural sequence generation is commonly approached by using maximum-likelihood (ML) estimation or reinforcement learning (RL).

1 code implementation • 28 Nov 2017 • Hai Nguyen, Shin-ichi Maeda, Kenta Oono

With the rapid increase of compound databases available in medicinal and material science, there is a growing need for learning representations of molecules in a semi-supervised manner.

1 code implementation • 30 Jun 2017 • Sotetsu Koyamada, Yuta Kikuchi, Atsunori Kanemura, Shin-ichi Maeda, Shin Ishii

We propose a new neural sequence model training method in which the objective function is defined by $\alpha$-divergence.
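As a rough sketch, one common (Amari-style) parameterization of the $\alpha$-divergence between discrete distributions can be computed as follows; the paper's exact convention may differ:

```python
def alpha_divergence(p, q, alpha):
    # Amari-style alpha-divergence between two discrete distributions
    # p and q (sequences of probabilities summing to 1). It interpolates
    # between the two directions of the KL divergence as alpha -> +/-1.
    assert abs(alpha) != 1.0, "alpha = +/-1 are the KL limits"
    s = sum(pi ** ((1.0 - alpha) / 2.0) * qi ** ((1.0 + alpha) / 2.0)
            for pi, qi in zip(p, q))
    return 4.0 / (1.0 - alpha ** 2) * (1.0 - s)
```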

13 code implementations • 13 Apr 2017 • Takeru Miyato, Shin-ichi Maeda, Masanori Koyama, Shin Ishii

In our experiments, we applied VAT to supervised and semi-supervised learning tasks on multiple benchmark datasets.

no code implementations • 3 Sep 2015 • Yohei Kondo, Kohei Hayashi, Shin-ichi Maeda

A common strategy for sparse linear regression is to introduce regularization, which eliminates irrelevant features by driving the corresponding weights to exactly zero.
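The mechanism can be seen in the soft-thresholding (proximal) step of L1 regularization, sketched here in plain NumPy with illustrative values:

```python
import numpy as np

def soft_threshold(w, lam):
    # Proximal operator of the L1 penalty: shrinks every weight toward
    # zero and sets weights with magnitude below lam exactly to zero.
    return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)

w = np.array([0.9, -0.05, 0.3, 0.02])
sparse_w = soft_threshold(w, 0.1)  # the two small weights become exactly 0
```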

5 code implementations • 2 Jul 2015 • Takeru Miyato, Shin-ichi Maeda, Masanori Koyama, Ken Nakae, Shin Ishii

We propose local distributional smoothness (LDS), a new notion of smoothness for statistical models that can be used as a regularization term to promote the smoothness of the model distribution.
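In symbols, LDS at an input $x$ is the negative divergence against the worst-case perturbation within a radius $\epsilon$ (a sketch of the definition as commonly stated):

```latex
\mathrm{LDS}(x; \theta) = -D_{\mathrm{KL}}\!\left(p(y \mid x; \theta) \,\big\|\, p(y \mid x + r_{\mathrm{vadv}}; \theta)\right),
\quad
r_{\mathrm{vadv}} = \operatorname*{arg\,max}_{\|r\|_2 \le \epsilon} D_{\mathrm{KL}}\!\left(p(y \mid x; \theta) \,\big\|\, p(y \mid x + r; \theta)\right)
```

Because the "virtual adversarial" direction is defined by the model's own output distribution rather than labels, the term can be evaluated on unlabeled data as well.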

no code implementations • 22 Apr 2015 • Kohei Hayashi, Shin-ichi Maeda, Ryohei Fujimaki

Our analysis provides a formal justification of FIC as a model selection criterion for LVMs and also a systematic procedure for pruning redundant latent variables that have been removed heuristically in previous studies.

no code implementations • 22 Dec 2014 • Shin-ichi Maeda

Dropout is one of the key techniques for preventing overfitting.

Papers With Code is a free resource with all data licensed under CC-BY-SA.