Rank | Method | EM | F1 | Paper title | Year | Paper | Code |
---|---|---|---|---|---|---|---|
1 | | 87.433 | 93.160 | | | | |
2 | nlnet (ensemble) | 85.954 | 91.677 | | | | |
3 | nlnet (ensemble) | 85.356 | 91.202 | | | | |
4 | | 85.083 | 91.835 | | | | |
5 | QANet (ensemble) | 84.454 | 90.490 | | | | |
6 | r-net (ensemble) | 84.003 | 90.147 | | | | |
7 | MARS (ensemble) | 83.982 | 89.796 | | | | |
8 | QANet (ensemble) | 83.877 | 89.737 | | | | |
9 | nlnet (single model) | 83.468 | 90.133 | | | | |
10 | MARS (single model) | 83.185 | 89.547 | | | | |
11 | MARS (single model) | 83.122 | 89.224 | | | | |
12 | Reinforced Mnemonic Reader + A2D (ensemble model) | 82.849 | 88.764 | | | | |
13 | QANet (ensemble) | 82.744 | 89.045 | | | | |
14 | r-net+ (ensemble) | 82.650 | 88.493 | | | | |
15 | MARS (single model) | 82.587 | 88.880 | | | | |
16 | Hybrid AoA Reader (ensemble) | 82.482 | 89.281 | | | | |
17 | QANet (single) | 82.471 | 89.306 | | | | |
18 | SLQA+ (ensemble) | 82.440 | 88.607 | | | | |
19 | | 82.283 | 88.533 | | | | |
20 | QANet (single model) | 82.209 | 88.608 | | | | |
21 | r-net (ensemble) | 82.136 | 88.126 | | | | |
22 | AttentionReader+ (ensemble) | 81.790 | 88.163 | | | | |
23 | MMIPN | 81.580 | 88.948 | | | | |
24 | Reinforced Mnemonic Reader + A2D (single model) | 81.538 | 88.130 | | | | |
25 | KACTEIL-MRC(GF-Net+) (ensemble) | 81.496 | 87.557 | | | | |
26 | Reinforced Mnemonic Reader + A2D + DA (single model) | 81.401 | 88.122 | | | | |
27 | r-net (single model) | 81.391 | 88.170 | | | | |
28 | | 81.003 | 87.432 | | | | |
29 | QANet (single model) | 80.929 | 87.773 | | | | |
30 | Reinforced Mnemonic Reader + A2D (single model) | 80.919 | 87.492 | | | | |
31 | AVIQA+ (ensemble) | 80.615 | 87.311 | | | | |
32 | Reinforced Mnemonic Reader + A2D (single model) | 80.489 | 87.454 | | | | |
33 | SLQA+ | 80.436 | 87.021 | | | | |
34 | {EAZI} (ensemble) | 80.436 | 86.912 | | | | |
35 | EAZI+ (ensemble) | 80.426 | 86.912 | | | | |
36 | DNET (ensemble) | 80.164 | 86.721 | | | | |
37 | Hybrid AoA Reader (single model) | 80.027 | 87.288 | | | | |
38 | BiDAF + Self Attention + ELMo + A2D (single model) | 79.996 | 86.711 | | | | |
39 | r-net+ (single model) | 79.901 | 86.536 | | | | |
40 | MAMCN+ (single model) | 79.692 | 86.727 | | | | |
41 | | 79.608 | 86.496 | | | | |
42 | | 79.545 | 86.654 | | | | |
43 | SLQA+ (single model) | 79.199 | 86.590 | | | | |
44 | Interactive AoA Reader+ (ensemble) | 79.083 | 86.450 | | | | |
45 | MIR-MRC(F-Net) (single model) | 79.083 | 86.288 | | | | |
46 | MDReader | 79.031 | 86.006 | | | | |
47 | | 78.978 | 86.016 | | | | |
48 | | 78.852 | 85.996 | | | | |
49 | KACTEIL-MRC(GF-Net+) (single model) | 78.664 | 85.780 | | | | |
50 | | 78.580 | 85.833 | | | | |
51 | aviqa (ensemble) | 78.496 | 85.469 | | | | |
52 | | 78.433 | 85.517 | | | | |
53 | KakaoNet (single model) | 78.401 | 85.724 | | | | |
54 | SLQA (ensemble) | 78.328 | 85.682 | | | | |
55 | | 78.234 | 85.344 | | | | |
56 | BiDAF++ with pair2vec (single model) | 78.223 | 85.535 | | | | |
57 | MDReader0 | 78.171 | 85.543 | | | | |
58 | test | 78.087 | 85.348 | | | | |
59 | Interactive AoA Reader (ensemble) | 77.845 | 85.297 | | | | |
60 | DNET (single model) | 77.646 | 84.905 | | | | |
61 | | 77.583 | 84.163 | | | | |
62 | BiDAF++ (single model) | 77.573 | 84.858 | | | | |
63 | AttentionReader+ (single) | 77.342 | 84.925 | | | | |
64 | Jenga (ensemble) | 77.237 | 84.466 | | | | |
65 | {gqa} (single model) | 77.090 | 83.931 | | | | |
66 | | 76.996 | 84.630 | | | | |
67 | MARS (single model) | 76.859 | 84.739 | | | | |
68 | | 76.828 | 84.396 | | | | |
69 | VS^3-NET (single model) | 76.775 | 84.491 | | | | |
70 | | 76.461 | 84.265 | | | | |
71 | FRC (single model) | 76.240 | 84.599 | | | | |
72 | | 76.2 | 84.6 | | | | |
73 | Conductor-net (ensemble) | 76.146 | 83.991 | | | | |
74 | | 76.125 | 83.538 | | | | |
75 | smarnet (ensemble) | 75.989 | 83.475 | | | | |
76 | | 75.968 | 83.900 | | | | |
77 | AVIQA-v2 (single model) | 75.926 | 83.305 | | | | |
78 | Interactive AoA Reader+ (single model) | 75.821 | 83.843 | | | | |
79 | | 75.789 | 83.261 | | | | |
80 | | 75.370 | 82.658 | | | | |
81 | Mixed model (ensemble) | 75.265 | 82.769 | | | | |
82 | two-attention-self-attention (ensemble) | 75.223 | 82.716 | | | | |
83 | | 75.087 | 83.081 | | | | |
84 | | 75.034 | 82.552 | | | | |
85 | | 74.866 | 82.806 | | | | |
86 | eeAttNet (single model) | 74.604 | 82.501 | | | | |
87 | SSR-BiDAF | 74.541 | 82.477 | | | | |
88 | SLQA (single model) | 74.489 | 82.815 | | | | |
89 | | 74.405 | 82.742 | | | | |
90 | Jenga (single model) | 74.373 | 82.845 | | | | |
91 | | 74.268 | 82.371 | | | | |
92 | S^3-Net (ensemble) | 74.121 | 82.342 | | | | |
93 | | 74.090 | 81.761 | | | | |
94 | SSAE (ensemble) | 74.080 | 81.665 | | | | |
95 | | 73.765 | 81.257 | | | | |
96 | | 73.744 | 81.525 | | | | |
97 | | 73.723 | 81.530 | | | | |
98 | Interactive AoA Reader (single model) | 73.639 | 81.931 | | | | |
99 | Jenga (single model) | 73.303 | 81.754 | | | | |
100 | | 73.240 | 81.933 | | | | |
101 | | 73.010 | 81.517 | | | | |
102 | T-gating (ensemble) | 72.758 | 81.001 | | | | |
103 | two-attention-self-attention (single model) | 72.600 | 81.011 | | | | |
104 | Conductor-net (single) | 72.590 | 81.415 | | | | |
105 | AVIQA (single model) | 72.485 | 80.550 | | | | |
106 | | 72.139 | 81.048 | | | | |
107 | S^3-Net (single model) | 71.908 | 81.023 | | | | |
108 | QFASE | 71.898 | 79.989 | | | | |
109 | attention+self-attention (single model) | 71.698 | 80.462 | | | | |
110 | | 71.625 | 80.383 | | | | |
111 | | 71.415 | 80.160 | | | | |
112 | | 71.4 | 80.2 | | | | |
113 | AttReader (single) | 71.373 | 79.725 | | | | |
114 | | 71.3 | 79.9 | | | | |
115 | M-NET (single) | 71.016 | 79.835 | | | | |
116 | | 70.995 | 80.146 | | | | |
117 | MAMCN (single model) | 70.985 | 79.939 | | | | |
118 | | 70.849 | 78.741 | | | | |
119 | | 70.849 | 78.857 | | | | |
120 | | 70.733 | 79.353 | | | | |
121 | | 70.639 | 79.456 | | | | |
122 | | 70.607 | 79.821 | | | | |
123 | | 70.555 | 79.364 | | | | |
124 | | 70.387 | 78.784 | | | | |
125 | SimpleBaseline (single model) | 69.600 | 78.236 | | | | |
126 | SSR-BiDAF | 69.443 | 78.358 | | | | |
127 | | 68.478 | 77.971 | | | | |
128 | | 68.436 | 77.070 | | | | |
129 | PQMN (single model) | 68.331 | 77.783 | | | | |
130 | | 68.163 | 77.527 | | | | |
131 | T-gating (single model) | 68.132 | 77.569 | | | | |
132 | | 67.974 | 77.323 | | | | |
133 | | 67.901 | 77.022 | | | | |
134 | | 67.744 | 77.605 | | | | |
135 | AllenNLP BiDAF (single model) | 67.618 | 77.151 | | | | |
136 | Iterative Co-attention Network | 67.502 | 76.786 | | | | |
137 | newtest | 66.527 | 75.787 | | | | |
138 | | 66.233 | 75.896 | | | | |
139 | | 64.744 | 73.743 | | | | |
140 | Unnamed submission by ravioncodalab | 64.439 | 73.921 | | | | |
141 | | 64.083 | 73.056 | | | | |
142 | Attentive CNN context with LSTM | 63.306 | 73.463 | | | | |
143 | | 62.897 | 72.016 | | | | |
144 | | 62.604 | 71.968 | | | | |
145 | | 62.499 | 70.956 | | | | |
146 | | 62.446 | 73.327 | | | | |
147 | | 60.474 | 70.695 | | | | |
148 | Unnamed submission by Will_Wu | 59.058 | 69.436 | | | | |
149 | | 54.505 | 67.748 | | | | |
150 | Unnamed submission by jinhyuklee | 52.544 | 62.780 | | | | |
151 | Unnamed submission by minjoon | 52.533 | 62.757 | | | | |
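
The EM and F1 columns above follow SQuAD-style extractive QA scoring: EM is the percentage of predictions that exactly match a gold answer after normalization, and F1 is the token-level overlap between prediction and gold answer, taking the maximum over all gold answers. The snippet below is a minimal sketch of that computation, not the official evaluation script; the function names (`normalize_answer`, `exact_match`, `f1_score`) are illustrative.

```python
import re
import string
from collections import Counter

def normalize_answer(s):
    """Lowercase, drop punctuation and articles, collapse whitespace (SQuAD-style)."""
    s = s.lower()
    s = "".join(ch for ch in s if ch not in set(string.punctuation))
    s = re.sub(r"\b(a|an|the)\b", " ", s)
    return " ".join(s.split())

def exact_match(prediction, gold):
    """1.0 if normalized strings are identical, else 0.0."""
    return float(normalize_answer(prediction) == normalize_answer(gold))

def f1_score(prediction, gold):
    """Token-level F1 between normalized prediction and gold answer."""
    pred_tokens = normalize_answer(prediction).split()
    gold_tokens = normalize_answer(gold).split()
    common = Counter(pred_tokens) & Counter(gold_tokens)
    num_same = sum(common.values())
    if num_same == 0:
        return 0.0
    precision = num_same / len(pred_tokens)
    recall = num_same / len(gold_tokens)
    return 2 * precision * recall / (precision + recall)

# Example: a question may have several gold answers; take the best score.
golds = ["Denver Broncos", "the Denver Broncos"]
pred = "Denver Broncos"
em = max(exact_match(pred, g) for g in golds)
f1 = max(f1_score(pred, g) for g in golds)
print(em, f1)  # 1.0 1.0
```

Leaderboard numbers are these per-question scores averaged over the hidden test set and reported as percentages.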