no code implementations • RANLP 2021 • Ying Zhang, Hidetaka Kamigaito, Tatsuya Aoki, Hiroya Takamura, Manabu Okumura
Encoder-decoder models have been commonly used for many tasks such as machine translation and response generation.
no code implementations • COLING 2020 • Riku Kawamura, Tatsuya Aoki, Hidetaka Kamigaito, Hiroya Takamura, Manabu Okumura
We propose neural models that can normalize text by considering the similarities of word strings and sounds.
no code implementations • 15 Apr 2020 • Kazuki Miyazawa, Tatsuya Aoki, Takato Horii, Takayuki Nagai
Recently, the Bidirectional Encoder Representations from Transformers (BERT) model has attracted much attention in the field of natural language processing, owing to its high performance in language understanding-related tasks.
no code implementations • WS 2019 • Kasumi Aoki, Akira Miyazawa, Tatsuya Ishigaki, Tatsuya Aoki, Hiroshi Noji, Keiichi Goshima, Ichiro Kobayashi, Hiroya Takamura, Yusuke Miyao
We propose a data-to-document generator that can easily control the contents of output texts based on a neural language model.
1 code implementation • WS 2018 • Tatsuya Aoki, Akira Miyazawa, Tatsuya Ishigaki, Keiichi Goshima, Kasumi Aoki, Ichiro Kobayashi, Hiroya Takamura, Yusuke Miyao
Comments on the stock market often include the reason or cause of changes in stock prices, such as "Nikkei turns lower as yen's rise hits exporters."
no code implementations • EMNLP 2017 • Tatsuya Aoki, Ryohei Sasano, Hiroya Takamura, Manabu Okumura
Our experimental results show that the model leveraging the context embedding outperforms other methods and provides findings on, for example, how to construct context embeddings and which corpus to use.