no code implementations • 6 Jun 2021 • Katsuma Inoue, Soh Ohara, Yasuo Kuniyoshi, Kohei Nakajima
A Lite BERT (ALBERT) is, as its name suggests, a lightweight version of BERT in which the number of parameters is reduced by repeatedly applying the same neural network module, the Transformer encoder layer.
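The parameter-sharing idea described above can be sketched as follows. This is an illustrative toy, not the actual ALBERT implementation; the layer sizes and the helper names (`count_params`, `shared_encoder`) are hypothetical:

```python
def count_params(num_layers, params_per_layer, shared):
    """Total encoder parameters with and without cross-layer sharing.

    With sharing (ALBERT-style), one set of layer weights is reused at
    every depth, so the count does not grow with the number of layers.
    """
    return params_per_layer if shared else num_layers * params_per_layer


def shared_encoder(x, layer_fn, num_layers):
    """Apply the SAME layer function (same weights) num_layers times."""
    for _ in range(num_layers):
        x = layer_fn(x)
    return x


# Hypothetical sizes: 12 layers, 7M parameters per encoder layer.
bert_like = count_params(12, 7_000_000, shared=False)   # grows with depth
albert_like = count_params(12, 7_000_000, shared=True)  # constant in depth
print(bert_like, albert_like)  # 84000000 7000000
```

The saving comes purely from reuse: depth still contributes compute, but only one layer's worth of weights is stored.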
no code implementations • 15 Jan 2021 • Tomohiro Nakamura, Tomoya Miyashita, Soh Ohara
With the advent of FrameNet and PropBank, many semantic role labeling (SRL) systems have been proposed for English.