Interestingly, we find that prompting combined with knowledge distillation (KD) can reduce compute and data cost at the same time.
Given an incomplete narrative that specifies a main event and a context, we aim to retrieve news articles discussing relevant events that would enable the continuation of the narrative.
In simple open-domain question answering (QA), dense retrieval has become one of the standard approaches for retrieving the relevant passages to infer an answer.
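The dense retrieval setup described above can be sketched minimally: encode the query and each passage into vectors and rank passages by inner-product similarity. The encoder below is a toy hashed bag-of-words stand-in (an assumption for illustration), not the learned bi-encoder a real system would use.

```python
import numpy as np

def embed(text, dim=64):
    """Toy stand-in for a learned dense encoder (e.g., a BERT bi-encoder):
    hashed bag-of-words vectors, L2-normalized."""
    v = np.zeros(dim)
    for tok in text.lower().split():
        v[hash(tok) % dim] += 1.0
    n = np.linalg.norm(v)
    return v / n if n > 0 else v

def retrieve(query, passages, k=2):
    """Rank passages by inner product with the query embedding."""
    q = embed(query)
    scores = [(float(q @ embed(p)), p) for p in passages]
    return sorted(scores, reverse=True)[:k]

passages = [
    "The Eiffel Tower is located in Paris, France.",
    "Dense retrieval encodes queries and passages into vectors.",
    "Bananas are rich in potassium.",
]
top = retrieve("where is the eiffel tower", passages, k=1)
```

In a real pipeline the passage embeddings are precomputed and indexed (e.g., with an approximate nearest-neighbor index) so only the query is encoded at inference time.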
This paper describes the participation of the UvA.ILPS group at the TREC CAsT 2020 track.
The 1st edition of the workshop on Mixed-Initiative ConveRsatiOnal Systems (MICROS@ECIR2021) aims at investigating and collecting novel ideas and contributions in the field of conversational systems.
Conversational passage retrieval relies on question rewriting to modify the original question so that it no longer depends on the conversation history.
Recent research on conversational search highlights the importance of mixed-initiative in conversations.
As recent learning to match methods have made important advances in bridging the vocabulary gap for traditional IR tasks, we investigate their potential in the context of product search.
Knowledge graph simple question answering (KGSQA), in its standard form, does not take into account that human-curated question answering training data cover only a small subset of the relations that exist in a Knowledge Graph (KG), or, even worse, that new domains may be added to the KG whose relations are unseen and quite different from those of existing domains.
Context from the conversation history can be used to arrive at a better expression of the current turn query, a task defined as query resolution.
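A minimal sketch of query resolution, assuming a simple term-expansion heuristic rather than a learned model: expand the current turn with content terms from earlier turns that it is missing. The stopword list and example turns are illustrative assumptions.

```python
# Hypothetical stopword list for the sketch; a real system would use a
# learned term-selection model over the conversation history.
STOPWORDS = {"what", "is", "the", "a", "an", "of", "in", "about",
             "tell", "me", "more", "how", "it", "its"}

def resolve(history, current):
    """Toy query resolution: append content terms from the conversation
    history that the current turn query is missing."""
    cur = set(current.lower().split())
    extra = []
    for turn in history:
        for tok in turn.lower().split():
            if tok not in STOPWORDS and tok not in cur and tok not in extra:
                extra.append(tok)
    return current + " " + " ".join(extra) if extra else current

history = ["what is dense retrieval", "how is it trained"]
resolved = resolve(history, "what about its limitations")
```

Here the underspecified turn "what about its limitations" is expanded with the history terms "dense retrieval trained", making it self-contained for a downstream retriever.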
KG fact contextualization is the task of augmenting a given KG fact with additional and useful KG facts.
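The task above can be illustrated with a toy heuristic, assuming the KG is a list of (subject, predicate, object) triples: contextualize a fact by collecting other facts that share an entity with it. A real system would rank candidates with a learned model; the KG contents here are made up for illustration.

```python
def contextualize(fact, kg, k=3):
    """Toy KG fact contextualization: return up to k other facts that
    share an entity with the given fact."""
    s, p, o = fact
    return [t for t in kg
            if t != fact and (s in (t[0], t[2]) or o in (t[0], t[2]))][:k]

# Illustrative toy KG of (subject, predicate, object) triples.
kg = [
    ("Amsterdam", "capitalOf", "Netherlands"),
    ("Amsterdam", "locatedIn", "Europe"),
    ("Netherlands", "memberOf", "EU"),
    ("Berlin", "capitalOf", "Germany"),
]
ctx = contextualize(("Amsterdam", "capitalOf", "Netherlands"), kg)
```

The fact about Berlin shares no entity with the input fact, so it is excluded; the two remaining facts provide useful context about the input fact's subject and object.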