Search Results for author: Ruixiang Cui

Found 12 papers, 5 papers with code

HUJI-KU at MRP 2020: Two Transition-based Neural Parsers

no code implementations 12 Oct 2020 Ofir Arviv, Ruixiang Cui, Daniel Hershcovich

This paper describes the HUJI-KU system submission to the shared task on Cross-Framework Meaning Representation Parsing (MRP) at the 2020 Conference on Computational Natural Language Learning (CoNLL), employing TUPA and the HIT-SCIR parser, which were, respectively, the baseline system and the winning system in the 2019 MRP shared task.

Semantic Parsing · Vocal Bursts Valence Prediction

Meaning Representation of Numeric Fused-Heads in UCCA

no code implementations 4 Jun 2021 Ruixiang Cui, Daniel Hershcovich

We show that the implicit UCCA parser does not handle numeric fused-heads (NFHs) consistently, which could result from inconsistent annotation, insufficient training data, or a modelling limitation.

Machine Translation · Natural Language Inference · +2

Great Service! Fine-grained Parsing of Implicit Arguments

1 code implementation ACL (IWPT) 2021 Ruixiang Cui, Daniel Hershcovich

Broad-coverage meaning representations in NLP mostly focus on explicitly expressed content.

Compositional Generalization in Multilingual Semantic Parsing over Wikidata

1 code implementation 7 Aug 2021 Ruixiang Cui, Rahul Aralikatte, Heather Lent, Daniel Hershcovich

We introduce such a dataset, which we call Multilingual Compositional Wikidata Questions (MCWQ), and use it to analyze the compositional generalization of semantic parsers in Hebrew, Kannada, Chinese and English.

Semantic Parsing · Zero-Shot Cross-Lingual Transfer

How Conservative are Language Models? Adapting to the Introduction of Gender-Neutral Pronouns

1 code implementation NAACL 2022 Stephanie Brandl, Ruixiang Cui, Anders Søgaard

Gender-neutral pronouns have recently been introduced in many languages, both a) to include non-binary people and b) to serve as a generic singular.

Generalized Quantifiers as a Source of Error in Multilingual NLU Benchmarks

1 code implementation NAACL (DADC) 2022 Ruixiang Cui, Daniel Hershcovich, Anders Søgaard

Logical approaches to representing language have developed and evaluated computational models of quantifier words since the 19th century, but today's NLU models still struggle to capture their semantics.

AGIEval: A Human-Centric Benchmark for Evaluating Foundation Models

2 code implementations 13 Apr 2023 Wanjun Zhong, Ruixiang Cui, Yiduo Guo, Yaobo Liang, Shuai Lu, Yanlin Wang, Amin Saied, Weizhu Chen, Nan Duan

Impressively, GPT-4 surpasses average human performance on the SAT, LSAT, and math competitions, attaining 95% accuracy on the SAT Math test and 92.5% accuracy on the English test of the Chinese national college entrance exam.

Decision Making · Math

What does the Failure to Reason with "Respectively" in Zero/Few-Shot Settings Tell Us about Language Models?

no code implementations 31 May 2023 Ruixiang Cui, Seolhwa Lee, Daniel Hershcovich, Anders Søgaard

Humans can effortlessly understand the coordinate structure of sentences such as "Niels Bohr and Kurt Cobain were born in Copenhagen and Seattle, respectively".

Common Sense Reasoning · Few-Shot Learning · +2

Cultural Adaptation of Recipes

no code implementations 26 Oct 2023 Yong Cao, Yova Kementchedjhieva, Ruixiang Cui, Antonia Karamolegkou, Li Zhou, Megan Dare, Lucia Donatelli, Daniel Hershcovich

We introduce a new task involving the translation and cultural adaptation of recipes between Chinese and English-speaking cuisines.

Information Retrieval · Machine Translation · +1
