no code implementations • EMNLP (BlackboxNLP) 2021 • Mohsen Fayyaz, Ehsan Aghazadeh, Ali Modarressi, Hosein Mohebbi, Mohammad Taher Pilehvar
Most recent work on probing representations has focused on BERT, with the presumption that the findings may generalize to other models.
1 code implementation • 5 Jun 2023 • Ali Modarressi, Mohsen Fayyaz, Ehsan Aghazadeh, Yadollah Yaghoobzadeh, Mohammad Taher Pilehvar
An emerging approach to explaining Transformer-based models is vector-based analysis of how their representations are formed.
no code implementations • 10 Nov 2022 • Mohsen Fayyaz, Ehsan Aghazadeh, Ali Modarressi, Mohammad Taher Pilehvar, Yadollah Yaghoobzadeh, Samira Ebrahimi Kahou
In this work, we employ these two metrics for the first time in NLP.
1 code implementation • ACL 2022 • Ehsan Aghazadeh, Mohsen Fayyaz, Yadollah Yaghoobzadeh
Large pre-trained language models (PLMs) are assumed to encode metaphorical knowledge useful for NLP systems.