1 code implementation • 25 Aug 2021 • Hiroaki Mikami, Kenji Fukumizu, Shogo Murai, Shuji Suzuki, Yuta Kikuchi, Taiji Suzuki, Shin-ichi Maeda, Kohei Hayashi
Synthetic-to-real transfer learning is a framework in which a synthetically generated dataset is used to pre-train a model to improve its performance on real vision tasks.
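Synthetic-to-real transfer amounts to two training phases: pre-train on cheap synthetic data, then fine-tune on the real target task. Below is a minimal PyTorch sketch of that pipeline, with random tensors standing in for the synthetic and real datasets; the backbone, class counts, and schedules are illustrative assumptions, not the paper's setup (the paper studies how pre-training scale affects transfer rather than prescribing this recipe).

```python
# A minimal synthetic-to-real transfer sketch (assumed setup, not the
# paper's exact recipe): pre-train on synthetic images, then fine-tune
# the same backbone on the real task.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset
from torchvision.models import resnet18

# Random tensors stand in for rendered synthetic images and real photos.
synthetic = TensorDataset(torch.randn(64, 3, 224, 224), torch.randint(0, 100, (64,)))
real = TensorDataset(torch.randn(32, 3, 224, 224), torch.randint(0, 10, (32,)))

def train(model, dataset, epochs, lr):
    loader = DataLoader(dataset, batch_size=16, shuffle=True)
    opt = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()

model = resnet18(weights=None, num_classes=100)
train(model, synthetic, epochs=1, lr=0.1)        # pre-train on synthetic data

model.fc = nn.Linear(model.fc.in_features, 10)   # new head for the real task
train(model, real, epochs=1, lr=0.01)            # fine-tune on real data
```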
no code implementations • 5 Dec 2020 • Kazuki Ikeda, Dmitri E. Kharzeev, Yuta Kikuchi
We interpret this maximum in terms of the growth of critical fluctuations near the critical point, and draw analogies between the massive Schwinger model, QCD near the critical point, and ferroelectrics near the Curie point.
High Energy Physics - Phenomenology • High Energy Physics - Lattice • Nuclear Theory • Quantum Physics
1 code implementation • 28 Sep 2020 • He Huang, Shunta Saito, Yuta Kikuchi, Eiichi Matsumoto, Wei Tang, Philip S. Yu
Motivated by the fact that detecting these rare relations can be critical in real-world applications, this paper introduces a novel integrated framework of classification and ranking to resolve the class imbalance problem in scene graph parsing.
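One generic way to integrate classification and ranking is to add a margin-based ranking term to the usual cross-entropy, so that rare relation classes are not drowned out by frequent ones. The sketch below is only a minimal instance of that idea; the shapes, margin, and equal loss weighting are illustrative assumptions, not the paper's actual framework.

```python
# A minimal classification-plus-ranking sketch (generic instance of the
# idea, not the paper's architecture).
import torch
import torch.nn.functional as F

logits = torch.randn(8, 50, requires_grad=True)  # 8 relation proposals, 50 predicate classes
labels = torch.randint(0, 50, (8,))

# Standard classification term.
cls_loss = F.cross_entropy(logits, labels)

# Pairwise ranking term: the true class score should beat the hardest
# wrong class by a margin, which penalizes rare classes being outranked.
true_scores = logits.gather(1, labels.unsqueeze(1)).squeeze(1)
masked = logits.scatter(1, labels.unsqueeze(1), float("-inf"))
hardest_wrong = masked.max(dim=1).values
rank_loss = F.relu(1.0 - (true_scores - hardest_wrong)).mean()  # margin = 1.0 (assumed)

loss = cls_loss + rank_loss  # equal weighting is an assumption
```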
no code implementations • WS 2018 • Ryo Nagata, Tomoya Mizumoto, Yuta Kikuchi, Yoshifumi Kawasaki, Kotaro Funakoshi
Based on a discussion of possible causes of POS-tagging errors in learner English, we show that deep neural models are particularly well suited to this task.
no code implementations • ICLR 2018 • Sotetsu Koyamada, Yuta Kikuchi, Atsunori Kanemura, Shin-ichi Maeda, Shin Ishii
Neural sequence generation is commonly approached by using maximum-likelihood (ML) estimation or reinforcement learning (RL).
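The two objectives differ in what the gradient pushes toward: ML matches the reference token, while RL weights sampled tokens by a task reward. A minimal single-step sketch, assuming a vocabulary of 1,000 tokens and a hypothetical BLEU-like `reward`:

```python
# Contrasting ML and RL objectives for one decoding step; `logits` would
# come from a decoder, `target` is the reference token, and `reward` is
# a hypothetical sequence-level score (e.g. BLEU).
import torch
import torch.nn.functional as F

logits = torch.randn(1, 1000, requires_grad=True)
target = torch.tensor([42])

# Maximum-likelihood: cross-entropy against the reference token.
ml_loss = F.cross_entropy(logits, target)

# Reinforcement learning (REINFORCE): sample a token and weight its
# log-probability by a task reward instead of matching the reference.
probs = F.softmax(logits, dim=-1)
sample = torch.multinomial(probs, 1).squeeze(1)
reward = 0.7                                     # hypothetical reward value
rl_loss = (-reward * F.log_softmax(logits, dim=-1)[0, sample]).squeeze()
```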
1 code implementation • 17 Oct 2017 • Jun Hatori, Yuta Kikuchi, Sosuke Kobayashi, Kuniyuki Takahashi, Yuta Tsuboi, Yuya Unno, Wilson Ko, Jethro Tan
In this paper, we propose the first comprehensive system that can handle unconstrained spoken language and is able to effectively resolve ambiguity in spoken instructions.
no code implementations • ACL 2017 • Shun Hasegawa, Yuta Kikuchi, Hiroya Takamura, Manabu Okumura
For English, high-quality sentence compression models that operate by deleting words have been trained on large, automatically created training datasets.
1 code implementation • 30 Jun 2017 • Sotetsu Koyamada, Yuta Kikuchi, Atsunori Kanemura, Shin-ichi Maeda, Shin Ishii
We propose a new neural sequence model training method in which the objective function is defined by $\alpha$-divergence.
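For reference, a standard form of Amari's $\alpha$-divergence between distributions $p$ and $q$ is

$$ D_\alpha(p \,\|\, q) = \frac{1}{\alpha(1-\alpha)} \left( 1 - \int p(x)^{\alpha}\, q(x)^{1-\alpha}\, dx \right), $$

which tends to $\mathrm{KL}(p \,\|\, q)$ as $\alpha \to 1$ and to $\mathrm{KL}(q \,\|\, p)$ as $\alpha \to 0$. The paper's exact parameterization may differ, but this interpolation between the two KL directions is what allows a single objective to bridge ML- and RL-style sequence training.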
1 code implementation • EMNLP 2016 • Yuta Kikuchi, Graham Neubig, Ryohei Sasano, Hiroya Takamura, Manabu Okumura
Neural encoder-decoder models have shown great success in many sequence generation tasks.