no code implementations • EACL 2021 • Lutfi Kerem Senel, Hinrich Schütze
Recent progress in pretraining language models on large corpora has resulted in significant performance gains on many NLP tasks.
no code implementations • 6 Feb 2021 • Lutfi Kerem Senel, Hinrich Schütze
Recent progress in pretraining language models on large corpora has resulted in large performance gains on many NLP tasks.
2 code implementations • 19 Jul 2018 • Lutfi Kerem Senel, Ihsan Utlu, Furkan Şahinuç, Haldun M. Ozaktas, Aykut Koç
In other words, we align words that are already determined to be related, along predefined concepts.
2 code implementations • 1 Nov 2017 • Lutfi Kerem Senel, Ihsan Utlu, Veysel Yucesoy, Aykut Koc, Tolga Cukur
Dense word embeddings, which encode the semantic meanings of words in low-dimensional vector spaces, have become very popular in natural language processing (NLP) research due to their state-of-the-art performance on many NLP tasks.