HUJI-KU at MRP 2020: Two Transition-based Neural Parsers

12 Oct 2020 · Ofir Arviv, Ruixiang Cui, Daniel Hershcovich

This paper describes the HUJI-KU system submission to the shared task on Cross-Framework Meaning Representation Parsing (MRP) at the 2020 Conference on Computational Natural Language Learning (CoNLL), employing TUPA and the HIT-SCIR parser, which were, respectively, the baseline system and the winning system in the 2019 MRP shared task. Both are transition-based parsers using BERT contextualized embeddings. We generalized TUPA to support the newly added MRP frameworks and languages, and experimented with multitask learning with the HIT-SCIR parser. We reached 4th place in both the cross-framework and cross-lingual tracks.
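To make the architecture named in the abstract more concrete, the minimal sketch below shows what a greedy transition-based parsing loop over BERT contextualized embeddings can look like. The transition inventory, the arc-standard-style configuration updates, the stack-top features, and the untrained linear scorer are all illustrative assumptions, not the actual TUPA or HIT-SCIR implementations, which handle full MRP graphs rather than simple dependency edges.

```python
# Minimal sketch, assuming a greedy arc-standard-style transition system and an
# untrained linear scorer over mBERT wordpiece embeddings. This is NOT the
# actual TUPA or HIT-SCIR code; all names below are illustrative assumptions.
import torch
from transformers import AutoModel, AutoTokenizer

TRANSITIONS = ["SHIFT", "LEFT-EDGE", "RIGHT-EDGE"]  # assumed inventory

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
encoder = AutoModel.from_pretrained("bert-base-multilingual-cased")
# Hypothetical scorer over the concatenated embeddings of the top two stack items.
scorer = torch.nn.Linear(2 * encoder.config.hidden_size, len(TRANSITIONS))


def valid_transitions(stack, buffer):
    """Transitions permitted in the current configuration (simplified rules)."""
    valid = ["SHIFT"] if buffer else []
    if len(stack) >= 2:
        valid += ["LEFT-EDGE", "RIGHT-EDGE"]
    return valid


def apply_transition(action, stack, buffer, edges):
    """Update the configuration; edges are (head, dependent) wordpiece index pairs."""
    if action == "SHIFT":
        stack.append(buffer.pop(0))
    elif action == "LEFT-EDGE":
        edges.append((stack[-1], stack.pop(-2)))
    elif action == "RIGHT-EDGE":
        edges.append((stack[-2], stack.pop()))


def parse(sentence: str):
    """Greedy transition-based parsing over BERT contextualized embeddings."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = encoder(**enc).last_hidden_state.squeeze(0)  # (wordpieces, dim)

    buffer = list(range(hidden.size(0)))  # wordpiece indices awaiting processing
    stack, edges = [], []
    while buffer or len(stack) > 1:
        # Features: embeddings of the top two stack items (zeros if missing).
        feats = [hidden[i] if i is not None else torch.zeros(hidden.size(1))
                 for i in (stack[-1] if stack else None,
                           stack[-2] if len(stack) >= 2 else None)]
        scores = scorer(torch.cat(feats))
        # Pick the highest-scoring transition among those valid in this state.
        action = max(valid_transitions(stack, buffer),
                     key=lambda a: scores[TRANSITIONS.index(a)].item())
        apply_transition(action, stack, buffer, edges)
    return edges


print(parse("Parsers build graphs from text."))
```

In the sketch, only valid transitions are ever selected, so the loop terminates even with an untrained scorer; a trained system would instead learn the scorer (and richer state features) from oracle transition sequences.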


Results
Task               Dataset                     Model     Metric   Value   Global Rank
Semantic Parsing   AMR (Chinese, MRP 2020)     HUJI-KU   F1       45      #2
Semantic Parsing   AMR (English, MRP 2020)     HUJI-KU   F1       52      #2
Semantic Parsing   DRG (English, MRP 2020)     HUJI-KU   F1       63      #2
Semantic Parsing   DRG (German, MRP 2020)      HUJI-KU   F1       62      #2
Semantic Parsing   EDS (English, MRP 2020)     HUJI-KU   F1       80      #2
Semantic Parsing   PTG (Czech, MRP 2020)       HUJI-KU   F1       58      #3
Semantic Parsing   PTG (English, MRP 2020)     HUJI-KU   F1       54      #2
Semantic Parsing   UCCA (English, MRP 2020)    HUJI-KU   F1       73      #2
Semantic Parsing   UCCA (German, MRP 2020)     HUJI-KU   F1       75      #2
