Search Results for author: Jin Sakuma

Found 7 papers, 2 papers with code

Vocabulary Adaptation for Domain Adaptation in Neural Machine Translation

1 code implementation • Findings of the Association for Computational Linguistics 2020 • Shoetsu Sato, Jin Sakuma, Naoki Yoshinaga, Masashi Toyoda, Masaru Kitsuregawa

Prior to fine-tuning, our method replaces the embedding layers of the NMT model by projecting general word embeddings induced from monolingual data in a target domain onto a source-domain embedding space.

Domain Adaptation • Machine Translation • +3
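The projection step described in the abstract above (and in the arXiv version listed next) can be pictured as fitting a linear map between two word2vec spaces. The following is a minimal sketch of that general idea, not the authors' exact procedure: fitting the map by least squares on words shared between the two vocabularies, and all function names and toy data, are assumptions for illustration.

```python
import numpy as np

def fit_projection(src_emb, tgt_emb, shared_words):
    """Least-squares linear map W taking target-domain vectors into the
    source-domain embedding space, fit on words both vocabularies share."""
    X = np.stack([tgt_emb[w] for w in shared_words])  # target-domain vectors
    Y = np.stack([src_emb[w] for w in shared_words])  # source-domain vectors
    W, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return W

def project_vocab(tgt_emb, W):
    """Project every target-domain word vector into the source-domain space;
    the resulting embeddings could then stand in for the NMT model's
    embedding layer before fine-tuning."""
    return {w: v @ W for w, v in tgt_emb.items()}

# Toy usage with random vectors standing in for word2vec embeddings.
rng = np.random.default_rng(0)
src = {w: rng.normal(size=8) for w in ["the", "cell", "protein"]}
tgt = {w: rng.normal(size=8) for w in ["the", "cell", "genome"]}
W = fit_projection(src, tgt, shared_words=["the", "cell"])
adapted = project_vocab(tgt, W)
```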

Vocabulary Adaptation for Distant Domain Adaptation in Neural Machine Translation

no code implementations • 30 Apr 2020 • Shoetsu Sato, Jin Sakuma, Naoki Yoshinaga, Masashi Toyoda, Masaru Kitsuregawa

Prior to fine-tuning, our method replaces the embedding layers of the NMT model by projecting general word embeddings induced from monolingual data in a target domain onto a source-domain embedding space.

Domain Adaptation • Machine Translation • +3

Multilingual Model Using Cross-Task Embedding Projection

no code implementations • CoNLL 2019 • Jin Sakuma, Naoki Yoshinaga

We present a method for applying a neural network trained for a given task on one (resource-rich) language to other (resource-poor) languages.

Cross-Lingual Word Embeddings • Sentiment Analysis • +2
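As a rough illustration of reusing a model trained in one language for another, here is a minimal sketch that maps target-language embeddings into the source-language space with an orthogonal Procrustes fit before applying a source-trained classifier. This shows generic cross-lingual embedding projection, not the paper's cross-task procedure; the seed dictionary, stand-in classifier, and all names are assumptions.

```python
import numpy as np

def fit_cross_lingual_map(tgt_vecs, src_vecs):
    """Orthogonal (Procrustes) map W with tgt_vecs @ W ~ src_vecs, fit on a
    seed dictionary whose i-th rows are embeddings of a translation pair."""
    u, _, vt = np.linalg.svd(tgt_vecs.T @ src_vecs)
    return u @ vt

# Toy usage: a "classifier" trained on source-language vectors is reused on
# target-language inputs after projecting them into the source space.
rng = np.random.default_rng(0)
dict_tgt, dict_src = rng.normal(size=(50, 8)), rng.normal(size=(50, 8))
W = fit_cross_lingual_map(dict_tgt, dict_src)

classifier = lambda X: (X.sum(axis=1) > 0).astype(int)  # stand-in model
foreign_inputs = rng.normal(size=(4, 8))
predictions = classifier(foreign_inputs @ W)
```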

Wikipedia2Vec: An Efficient Toolkit for Learning and Visualizing the Embeddings of Words and Entities from Wikipedia

no code implementations • EMNLP 2020 • Ikuya Yamada, Akari Asai, Jin Sakuma, Hiroyuki Shindo, Hideaki Takeda, Yoshiyasu Takefuji, Yuji Matsumoto

The embeddings of entities in a large knowledge base (e.g., Wikipedia) are highly beneficial for solving various natural language tasks that involve real-world knowledge.

World Knowledge
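The toolkit above is distributed as the wikipedia2vec Python package. Below is a minimal usage sketch assuming that package and one of the authors' pretrained model files; the exact filename is illustrative.

```python
# Minimal usage sketch for the wikipedia2vec toolkit (pip install wikipedia2vec);
# the model filename below is an example pretrained file, not taken from this page.
from wikipedia2vec import Wikipedia2Vec

model = Wikipedia2Vec.load("enwiki_20180420_100d.pkl")

# Words and Wikipedia entities share a single embedding space.
word_vec = model.get_word_vector("tokyo")       # vector for the word "tokyo"
entity_vec = model.get_entity_vector("Tokyo")   # vector for the entity Tokyo

# Nearest neighbors of an entity; results can mix words and entities.
for item, score in model.most_similar(model.get_entity("Tokyo"), 5):
    print(item, score)
```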
