no code implementations • 16 May 2019 • Alon Rozental, Zohar Kelrich, Daniel Fleischer
This paper describes a language representation model that combines the Bidirectional Encoder Representations from Transformers (BERT) learning mechanism of Devlin et al. (2018) with a generalization of the Universal Transformer model of Dehghani et al. (2018).
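The key architectural idea being generalized here is weight tying across depth: a Universal Transformer reuses one set of layer parameters at every depth step, whereas vanilla BERT gives each layer its own weights. A minimal sketch of that recurrence (illustrative only; the matrix `W`, hidden size, and step count are hypothetical and not from the paper) might look like:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8          # hidden size (hypothetical)
steps = 4      # number of recurrent depth steps (hypothetical)

# One shared weight matrix, reused at every depth step --
# this is the weight tying that distinguishes a Universal
# Transformer from a stack of independently parameterized layers.
W = rng.standard_normal((d, d)) * 0.1

def shared_layer(x):
    # A single weight-tied "layer": linear map + ReLU stands in
    # for the real self-attention + feed-forward block.
    return np.maximum(x @ W, 0.0)

x = rng.standard_normal((3, d))  # embeddings for 3 tokens
for _ in range(steps):
    x = shared_layer(x)          # same parameters applied each step

print(x.shape)
```

A generalization in the spirit of the paper could relax the strict tying, e.g. cycling through a small pool of layers rather than a single one, but the recurrence over a shared block is the core mechanism.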
no code implementations • WS 2018 • Alon Rozental, Daniel Fleischer, Zohar Kelrich
This paper describes the system developed at Amobee for the WASSA 2018 Implicit Emotion Shared Task (IEST).