The use of neural network models puts multiple objectives in conflict with each other: ideally, we would like a neural model that is effective, efficient, and interpretable at the same time.
Exploring and analyzing potential data sources is a significant challenge when applying NLP techniques to novel information domains.
With the capability of modeling bidirectional contexts, denoising-autoencoding-based pretraining such as BERT achieves better performance than pretraining approaches based on autoregressive language modeling.
We present a context-aware neural ranking model to exploit users' on-task search activities and enhance retrieval performance.
A cascaded ranking architecture turns ranking into a pipeline of multiple stages, and has been shown to be a powerful approach to balancing efficiency and effectiveness trade-offs in large-scale search systems.
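The staged pipeline described above can be sketched as follows; this is a minimal toy illustration, not any particular system's implementation, and the two scorers, the example documents, and the cutoff `k` are all hypothetical stand-ins.

```python
# Toy two-stage cascaded ranker: a cheap scorer prunes the candidate
# pool, then an expensive scorer reranks only the survivors.
# All scorers and data here are hypothetical illustrations.

QUERY_TERMS = {"neural", "ranking"}

def cheap_score(doc):
    # Stage 1: fast, feature-light scorer (term-overlap count).
    return len(QUERY_TERMS & set(doc.split()))

def expensive_score(doc):
    # Stage 2: slower, "more accurate" scorer; here overlap divided by
    # document length stands in for a learned reranking model.
    return cheap_score(doc) / (1 + len(doc.split()))

def cascade_rank(docs, k=3):
    # Stage 1 keeps the top-k candidates cheaply...
    shortlist = sorted(docs, key=cheap_score, reverse=True)[:k]
    # ...so the expensive model runs on k documents instead of all of them.
    return sorted(shortlist, key=expensive_score, reverse=True)

docs = [
    "neural ranking model",
    "a very long document about neural ranking with many extra words",
    "classical retrieval",
    "neural networks",
]
print(cascade_rank(docs, k=2))  # the two overlap-2 docs, short one first
```

The efficiency/effectiveness balance comes from the cutoff: a smaller `k` means less work for the expensive stage but a higher risk of pruning relevant documents early.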
We propose a multi-task learning framework to jointly learn document ranking and query suggestion for web search.
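The joint-learning idea can be illustrated with a shared encoder feeding two task heads whose errors are combined into one objective; this is a hedged toy sketch, and the feature vectors, weights, and loss weighting `alpha` are invented for illustration, not the paper's actual model.

```python
# Toy multi-task setup: one shared representation, two task heads
# (document ranking and query suggestion), one joint loss.
# All weights and targets are hypothetical illustrations.

def encode(features, w_shared):
    # Shared representation used by both tasks.
    return [f * w for f, w in zip(features, w_shared)]

def ranking_head(rep, w_rank):
    return sum(r * w for r, w in zip(rep, w_rank))

def suggestion_head(rep, w_sugg):
    return sum(r * w for r, w in zip(rep, w_sugg))

def joint_loss(features, rank_target, sugg_target,
               w_shared, w_rank, w_sugg, alpha=0.5):
    rep = encode(features, w_shared)
    rank_err = (ranking_head(rep, w_rank) - rank_target) ** 2
    sugg_err = (suggestion_head(rep, w_sugg) - sugg_target) ** 2
    # The shared encoder receives training signal from both tasks;
    # alpha trades off the two objectives.
    return alpha * rank_err + (1 - alpha) * sugg_err
```

The point of the shared encoder is that gradients from both task losses update the same representation, so signal from query suggestion can improve document ranking and vice versa.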