no code implementations • 30 Nov 2023 • Qi Cao, Takeshi Kojima, Yutaka Matsuo, Yusuke Iwasawa
While Large Language Models (LLMs) have achieved remarkable performance in many tasks, much about their inner workings remains unclear.
1 code implementation • 28 Jun 2022 • Takeshi Kojima, Yutaka Matsuo, Yusuke Iwasawa
Vision Transformer (ViT) is becoming increasingly popular in image processing.
2 code implementations • 24 May 2022 • Takeshi Kojima, Shixiang Shane Gu, Machel Reid, Yutaka Matsuo, Yusuke Iwasawa
Pretrained large language models (LLMs) are widely used in many sub-fields of natural language processing (NLP) and are generally regarded as excellent few-shot learners when given task-specific exemplars.
Ranked #5 on Math Word Problem Solving on SVAMP
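The few-shot setting mentioned above can be sketched as prompt construction: a handful of (question, answer) exemplars are concatenated before the new question, and the model completes the final answer. This is a minimal illustrative sketch; the exemplar texts and Q/A format here are assumptions, not taken from the paper.

```python
# Minimal sketch of few-shot prompting with task-specific exemplars.
# The exemplars and the "Q:/A:" template are illustrative assumptions.

def build_few_shot_prompt(exemplars, question):
    """Concatenate (question, answer) exemplars, then append the new question."""
    parts = [f"Q: {q}\nA: {a}" for q, a in exemplars]
    parts.append(f"Q: {question}\nA:")  # model completes after the final "A:"
    return "\n\n".join(parts)

exemplars = [
    ("There are 3 apples and 2 oranges. How many fruits are there?", "5"),
    ("Tom has 4 pens and buys 3 more. How many pens does he have?", "7"),
]
prompt = build_few_shot_prompt(
    exemplars, "A box holds 6 eggs. How many eggs are in 2 boxes?"
)
print(prompt)
```

The resulting string would then be sent to an LLM as a single prompt; the exemplars steer the model toward the task's input-output format.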
no code implementations • EACL 2021 • Takeshi Kojima, Yusuke Iwasawa, Yutaka Matsuo
In this paper, we propose a GAN model that improves the generation of diverse texts conditioned on a latent space.