Deeper Text Understanding for IR with Contextual Neural Language Modeling

22 May 2019 · Zhuyun Dai, Jamie Callan

Neural networks provide new possibilities to automatically learn complex language patterns and query-document relations. Neural IR models have achieved promising results in learning query-document relevance patterns, but little exploration has been done into understanding the text content of a query or a document. This paper studies leveraging a contextual neural language model, BERT, to provide deeper text understanding for IR.
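As a concrete illustration of this setup (a hedged sketch, not the authors' released code): a BERT cross-encoder reads the query and the document text as one sequence and outputs a relevance score. The checkpoint name and the two-label classification head below are illustrative assumptions; in practice the model would be fine-tuned on relevance judgments.

```python
# Hedged sketch of BERT as a query-document relevance scorer (cross-encoder).
# "bert-base-uncased" and the 2-label head are assumptions for illustration;
# a real ranker would be fine-tuned on query-document relevance labels.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_NAME = "bert-base-uncased"  # assumed checkpoint, not the paper's trained model
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)
model.eval()

def relevance_score(query: str, text: str) -> float:
    """Encode "[CLS] query [SEP] text [SEP]" jointly and return the relevance logit."""
    inputs = tokenizer(query, text, truncation=True, max_length=512, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits          # shape: (1, 2)
    return logits[0, 1].item()                   # logit of the "relevant" class

print(relevance_score("prognosis of third political parties",
                      "Third parties have struggled to gain traction in recent elections."))
```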


Datasets

TREC Robust04
TASK                          DATASET        MODEL        METRIC    VALUE   GLOBAL RANK
Ad-Hoc Information Retrieval  TREC Robust04  BERT-MaxP    nDCG@20   0.469   #5
Ad-Hoc Information Retrieval  TREC Robust04  BERT-SumP    nDCG@20   0.467   #6
Ad-Hoc Information Retrieval  TREC Robust04  BERT-FirstP  nDCG@20   0.444   #11
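
The FirstP/MaxP/SumP suffixes refer to how per-passage BERT scores are combined into a document score, since documents longer than BERT's 512-token input limit must be split into passages. Below is a minimal sketch of that aggregation; the passage length and stride are illustrative assumptions, not the paper's exact settings.

```python
# Sketch of splitting a long document into overlapping passages and combining
# per-passage relevance scores, as the model names FirstP / MaxP / SumP suggest.
# Passage length and stride below are assumptions for illustration only.
from typing import List

def split_into_passages(doc_tokens: List[str], length: int = 150, stride: int = 75) -> List[List[str]]:
    """Sliding-window split of a tokenized document into overlapping passages."""
    last_start = max(len(doc_tokens) - length, 0)
    return [doc_tokens[i:i + length] for i in range(0, last_start + 1, stride)]

def aggregate(passage_scores: List[float], mode: str) -> float:
    """Combine per-passage relevance scores into one document score."""
    if mode == "FirstP":   # judge the document by its first passage only
        return passage_scores[0]
    if mode == "MaxP":     # judge the document by its best-scoring passage
        return max(passage_scores)
    if mode == "SumP":     # sum evidence over all passages
        return sum(passage_scores)
    raise ValueError(f"unknown aggregation mode: {mode}")

passage_scores = [0.2, 1.3, 0.7]          # hypothetical scores from a BERT ranker
print(aggregate(passage_scores, "MaxP"))  # 1.3
```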

Methods used in the Paper


METHOD                           TYPE
Residual Connection              Skip Connections
Attention Dropout                Regularization
Linear Warmup With Linear Decay  Learning Rate Schedules
Weight Decay                     Regularization
GELU                             Activation Functions
Dense Connections                Feedforward Networks
Adam                             Stochastic Optimization
WordPiece                        Subword Segmentation
Softmax                          Output Functions
Dropout                          Regularization
Multi-Head Attention             Attention Modules
Layer Normalization              Normalization
Scaled Dot-Product Attention     Attention Mechanisms
BERT                             Language Models
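
The components listed above are the standard building blocks of the BERT/Transformer architecture. As one concrete example, here is a minimal sketch of scaled dot-product attention, softmax(Q K^T / sqrt(d_k)) V, which multi-head attention applies in parallel over several projected subspaces (generic textbook code, not taken from the paper).

```python
# Generic sketch of scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V.
# Textbook Transformer code, not code from this paper.
import math
from typing import Optional
import torch

def scaled_dot_product_attention(q: torch.Tensor, k: torch.Tensor, v: torch.Tensor,
                                 mask: Optional[torch.Tensor] = None) -> torch.Tensor:
    """q, k, v: (batch, seq_len, d_k); mask broadcasts to (batch, seq_len, seq_len)."""
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)          # similarity of every query-key pair
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))  # hide padded positions
    weights = torch.softmax(scores, dim=-1)                    # attention distribution per token
    return weights @ v                                         # weighted sum of value vectors

q = k = v = torch.randn(2, 8, 64)   # toy batch: 2 sequences, 8 tokens, 64-dim per head
print(scaled_dot_product_attention(q, k, v).shape)  # torch.Size([2, 8, 64])
```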