no code implementations • EACL (AdaptNLP) 2021 • Jezabel Garcia, Federica Freddi, Jamie McGowan, Tim Nieradzik, Feng-Ting Liao, Ye Tian, Da-Shan Shiu, Alberto Bernacchia
In meta-learning, the knowledge learned from previous tasks is transferred to new ones, but this transfer only works if tasks are related.
Tasks: Cross-Lingual Natural Language Inference • Cross-Lingual Transfer +1
no code implementations • 27 Feb 2023 • Jamie McGowan, Elizabeth Guest, Ziyang Yan, Cong Zheng, Neha Patel, Mason Cusack, Charlie Donaldson, Sofie de Cnudde, Gabriel Facini, Fabon Dzogang
We first explore the structure of this dataset, applying Graph Representation Learning to exploit its natural graph structure and to provide statistical insights into particular features of the data.
no code implementations • 8 Mar 2021 • Jezabel R. Garcia, Federica Freddi, Feng-Ting Liao, Jamie McGowan, Tim Nieradzik, Da-Shan Shiu, Ye Tian, Alberto Bernacchia
We show that TreeMAML improves on state-of-the-art results for cross-lingual Natural Language Inference.
Tasks: Cross-Lingual Natural Language Inference • Cross-Lingual Transfer +2
no code implementations • 1 Jan 2021 • Jezabel Garcia, Federica Freddi, Jamie McGowan, Tim Nieradzik, Da-Shan Shiu, Ye Tian, Alberto Bernacchia
In meta-learning, the knowledge learned from previous tasks is transferred to new ones, but this transfer only works if the tasks are related; sharing information between unrelated tasks can hurt performance.