1 code implementation • 2 Feb 2024 • Paul Youssef, Jörg Schlötterer, Christin Seifert
In this work, we consider a complementary aspect, namely the coherency of factual knowledge in PLMs, i.e., how often PLMs can predict the subject entity given their initial prediction of the object entity.
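The coherency notion described above can be sketched with a toy stand-in for a PLM: a model is coherent on a fact if, after predicting the object for a (subject, relation) query, the inverse query on that predicted object recovers the original subject. All triples and the `has_capital` inverse relation below are hypothetical illustration data, not from the paper.

```python
# Toy stand-in for a PLM's factual predictions (hypothetical data).
forward = {   # (subject, relation) -> model's predicted object
    ("Paris", "capital_of"): "France",
    ("Berlin", "capital_of"): "Germany",
    ("Oslo", "capital_of"): "Sweden",   # deliberately wrong prediction
}
backward = {  # (object, inverse relation) -> model's predicted subject
    ("France", "has_capital"): "Paris",
    ("Germany", "has_capital"): "Berlin",
    ("Sweden", "has_capital"): "Stockholm",
}

def coherency(forward, backward):
    """Fraction of facts where the inverse query recovers the subject."""
    hits = 0
    for (subj, _rel), obj in forward.items():
        if backward.get((obj, "has_capital")) == subj:
            hits += 1
    return hits / len(forward)

print(coherency(forward, backward))  # 2 of 3 facts are coherent
```

In practice both directions would be answered by the same PLM via prompting; the dictionaries here merely fix its answers so the coherency computation itself is visible.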
no code implementations • 25 Oct 2023 • Paul Youssef, Osman Alperen Koraş, Meijie Li, Jörg Schlötterer, Christin Seifert
Our contributions are: (1) We propose a categorization scheme for factual probing methods that is based on how their inputs, outputs and the probed PLMs are adapted; (2) We provide an overview of the datasets used for factual probing; (3) We synthesize insights about knowledge retention and prompt optimization in PLMs, analyze obstacles to adopting PLMs as knowledge bases and outline directions for future work.
1 code implementation • 24 Jul 2023 • Jan Trienes, Paul Youssef, Jörg Schlötterer, Christin Seifert
Automatically summarizing radiology reports into a concise impression can reduce the manual burden of clinicians and improve the consistency of reporting.
no code implementations • 31 Jul 2022 • Angan Mitra, Nguyen Kim Thang, Tuan-Anh Nguyen, Denis Trystram, Paul Youssef
The design of decentralized learning algorithms is increasingly important as data become distributed across participants with limited local computation and communication resources.
no code implementations • 5 Jul 2022 • Evripidis Bampis, Bruno Escoffier, Paul Youssef
There is a first round where some students apply, and a first (stable) matching $M_1$ has to be computed.
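The first-round matching $M_1$ mentioned above is a standard stable matching, which can be computed with the Gale–Shapley deferred-acceptance algorithm. The sketch below is a minimal student-proposing version with unit-capacity colleges and hypothetical preference lists; it does not reproduce the paper's multi-round mechanics.

```python
def gale_shapley(student_prefs, college_prefs):
    """Students propose; returns a stable matching as {student: college}."""
    free = list(student_prefs)                 # students not yet matched
    next_choice = {s: 0 for s in student_prefs}
    match = {}                                 # college -> current student
    rank = {c: {s: i for i, s in enumerate(p)}
            for c, p in college_prefs.items()}
    while free:
        s = free.pop()
        c = student_prefs[s][next_choice[s]]   # s's best college not yet tried
        next_choice[s] += 1
        if c not in match:
            match[c] = s                       # college accepts tentatively
        elif rank[c][s] < rank[c][match[c]]:
            free.append(match[c])              # college trades up, bumps old student
            match[c] = s
        else:
            free.append(s)                     # college rejects the proposal
    return {s: c for c, s in match.items()}

# Hypothetical first-round applicants and preferences.
students = {"a": ["X", "Y"], "b": ["X", "Y"]}
colleges = {"X": ["b", "a"], "Y": ["a", "b"]}
M1 = gale_shapley(students, colleges)
print(M1)  # {'b': 'X', 'a': 'Y'}
```

Deferred acceptance guarantees stability: no student–college pair would both prefer each other over their assigned partners in $M_1$.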
no code implementations • 23 Nov 2019 • Mohsen Mesgar, Paul Youssef, Lin Li, Dominik Bierwirth, Yihao Li, Christian M. Meyer, Iryna Gurevych
Our conversational agent UKP-ATHENA assists NLP researchers in finding and exploring scientific literature, identifying relevant authors, planning or post-processing conference visits, and preparing paper submissions using a unified interface based on natural language inputs and responses.
1 code implementation • EMNLP 2018 • Steffen Eger, Paul Youssef, Iryna Gurevych
Activation functions play a crucial role in neural networks because they introduce the nonlinearity that has been credited with much of the success of deep learning.
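The point about nonlinearity can be made concrete: composing linear (affine) layers without an activation yields just another affine map, so depth adds no expressive power until an activation such as ReLU breaks the collapse. A minimal sketch with hypothetical one-dimensional "layers":

```python
# Two affine "layers" and a ReLU activation (toy 1-D example).
def linear1(x):
    return 2 * x + 1

def linear2(x):
    return 3 * x - 2

def relu(x):
    return max(0.0, x)

# Without an activation, the composition collapses to one affine map: 6x + 1.
assert all(linear2(linear1(x)) == 6 * x + 1 for x in (-2, 0, 3))

# Any affine map h satisfies h(a) + h(b) - h(0) == h(a + b).
# Inserting ReLU between the layers breaks this identity, so the
# composition is genuinely nonlinear (piecewise linear).
def h(x):
    return linear2(relu(linear1(x)))

print(h(-2) + h(2) - h(0), h(0))  # 10 vs 1 -> not affine
```

The same collapse argument holds for matrix-valued layers: a product of weight matrices is a single weight matrix.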