no code implementations • 4 Sep 2024 • Ryotaro Shimizu, Yu Wang, Masanari Kimura, Yuki Hirakawa, Takashi Wada, Yuki Saito, Julian McAuley
In this work, we propose a fashion item recommendation model that incorporates hyperbolic geometry into user and item representations.
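The abstract does not specify which hyperbolic model is used, but a common choice in hyperbolic recommenders is the Poincaré ball, where a user-item pair is scored by the (negative) geodesic distance between their embeddings. A minimal sketch of that distance, with toy embeddings (the function name and example vectors are mine, not from the paper):

```python
import numpy as np

def poincare_distance(u, v, eps=1e-9):
    """Geodesic distance between two points inside the unit Poincare ball.

    Hyperbolic recommenders typically rank items for a user by this
    distance: smaller distance -> stronger recommendation.
    """
    u, v = np.asarray(u, dtype=float), np.asarray(v, dtype=float)
    sq = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return np.arccosh(1.0 + 2.0 * sq / max(denom, eps))

# Toy user and item embeddings (norms must stay below 1).
user   = np.array([0.10, 0.20])
item_a = np.array([0.15, 0.25])   # near the user
item_b = np.array([0.80, -0.50])  # far from the user

assert poincare_distance(user, item_a) < poincare_distance(user, item_b)
```

The ball model is attractive for recommendation because distances grow exponentially toward the boundary, giving popular "hub" items room near the origin and niche items space near the edge.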
1 code implementation • 1 Nov 2023 • Takashi Wada, Timothy Baldwin, Jey Han Lau
We propose a new unsupervised lexical simplification method that uses only monolingual data and pre-trained language models.
1 code implementation • 2 Jun 2023 • Takashi Wada, Yuji Matsumoto, Timothy Baldwin, Jey Han Lau
We propose an unsupervised approach to paraphrasing multiword expressions (MWEs) in context.
1 code implementation • COLING 2022 • Takashi Wada, Timothy Baldwin, Yuji Matsumoto, Jey Han Lau
We propose a new unsupervised method for lexical substitution using pre-trained language models.
no code implementations • COLING 2020 • Yuya Sawada, Takashi Wada, Takayoshi Shibahara, Hiroki Teranishi, Shuhei Kondo, Hiroyuki Shindo, Taro Watanabe, Yuji Matsumoto
We propose a simple method for nominal coordination boundary identification.
1 code implementation • EMNLP (MRL) 2021 • Takashi Wada, Tomoharu Iwata, Yuji Matsumoto, Timothy Baldwin, Jey Han Lau
We propose a new approach for learning contextualised cross-lingual word embeddings based on a small parallel corpus (e.g. a few hundred sentence pairs).
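For context, a classic supervised baseline in this small-supervision setting is orthogonal Procrustes: given a handful of paired word vectors, solve for the rotation mapping one language's space onto the other's. This is a standard baseline, not the contextualised method proposed in the paper; the function name and toy data below are mine:

```python
import numpy as np

def procrustes_align(X, Y):
    """Orthogonal map W minimizing ||X @ W - Y||_F (orthogonal Procrustes).

    X, Y: (n, d) arrays of paired source/target word vectors,
    e.g. from a small seed lexicon. Solution: W = U @ Vt where
    X.T @ Y = U @ S @ Vt.
    """
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5))                  # toy "source-language" vectors
R, _ = np.linalg.qr(rng.normal(size=(5, 5)))   # hidden ground-truth rotation
Y = X @ R                                      # toy "target-language" vectors

W = procrustes_align(X, Y)
assert np.allclose(X @ W, Y, atol=1e-6)        # rotation recovered
```

Because the mapping is constrained to be orthogonal, it preserves distances and angles in the source space, which is why it remains usable with only a few hundred supervision pairs.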
1 code implementation • ACL 2019 • Takashi Wada, Tomoharu Iwata, Yuji Matsumoto
Recently, a variety of unsupervised methods have been proposed that map pre-trained word embeddings of different languages into the same space without any parallel data.
no code implementations • 7 May 2019 • Takashi Wada, Hideitsu Hino
Maximizing the acquisition function analytically is difficult, and its computational cost is prohibitive even with approximations such as sampling; we therefore propose an accurate and computationally efficient method for estimating the gradient of the acquisition function, and develop an algorithm for Bayesian optimization with multi-objective and multi-point search.
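The core idea of gradient-based acquisition optimization can be illustrated with a reparameterized Monte Carlo estimate: sample the posterior as mu + sigma * z with fixed base samples z, so the gradient passes through mu and sigma. A minimal single-objective sketch with expected improvement and an assumed toy posterior (the paper addresses the harder multi-objective, multi-point case; all function forms below are my illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy GP-like posterior at a scalar input x (assumed forms, for illustration).
mu   = lambda x: x ** 2              # posterior mean
dmu  = lambda x: 2.0 * x             # its derivative
sig  = lambda x: 1.0 + 0.1 * x ** 2  # posterior std (kept positive)
dsig = lambda x: 0.2 * x             # its derivative

f_best = 0.5                          # incumbent best (minimization)
z = rng.standard_normal(200_000)      # shared base samples (reparameterization)

def ei_mc(x):
    """Monte Carlo estimate of expected improvement at x."""
    return np.mean(np.maximum(f_best - mu(x) - sig(x) * z, 0.0))

def ei_grad_mc(x):
    """Reparameterized MC estimate of d EI / dx: differentiate through
    mu and sig with the base samples z held fixed."""
    improve = f_best - mu(x) - sig(x) * z > 0.0
    return np.mean(np.where(improve, -(dmu(x) + z * dsig(x)), 0.0))
```

Because the same base samples z are reused, the gradient estimate is low-variance and agrees closely with a finite difference of `ei_mc` itself, which is what makes gradient-based maximization of a sampled acquisition practical.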
no code implementations • 7 Sep 2018 • Takashi Wada, Tomoharu Iwata
The proposed model contains bidirectional LSTMs that serve as forward and backward language models, and these networks are shared across all languages.
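The weight-sharing idea can be sketched in a few lines: each language keeps its own embedding table, but one recurrent cell is shared across languages and run in both directions. For brevity a plain tanh RNN cell stands in for the LSTMs, and all sizes, names, and initializations below are my illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
d = 8  # hidden / embedding size

# Language-specific embedding tables (toy vocabulary of 10 ids per language).
emb = {"en": rng.normal(size=(10, d)), "ja": rng.normal(size=(10, d))}

# One recurrent cell shared by ALL languages -- the cross-lingual tie.
W_h = rng.normal(size=(d, d)) * 0.1
W_x = rng.normal(size=(d, d)) * 0.1

def encode(lang, token_ids, reverse=False):
    """Run the shared cell over one sentence, forward or backward."""
    seq = token_ids[::-1] if reverse else token_ids
    h = np.zeros(d)
    for t in seq:
        h = np.tanh(W_h @ h + W_x @ emb[lang][t])
    return h

fwd = encode("en", [1, 4, 2])                 # forward language-model state
bwd = encode("en", [1, 4, 2], reverse=True)   # backward language-model state
```

Sharing the recurrent weights forces every language to express its sentences through the same dynamics, so their representations end up in a common space even though each language retains its own embeddings.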