1 code implementation • 20 Sep 2023 • Alberto Muñoz-Ortiz, David Vilares, Carlos Gómez-Rodríguez
We present an approach for assessing how multilingual large language models (LLMs) learn syntax in terms of multi-formalism syntactic structures.
no code implementations • 17 Aug 2023 • Alberto Muñoz-Ortiz, Carlos Gómez-Rodríguez, David Vilares
We conduct a quantitative analysis contrasting human-written English news text with comparable output from four large language models (LLMs) of the LLaMa family.
1 code implementation • 24 May 2023 • Alberto Muñoz-Ortiz, David Vilares
The usefulness of part-of-speech tags for parsing has been heavily questioned due to the success of word-contextualized parsers.
no code implementations • 27 Oct 2022 • Alberto Muñoz-Ortiz, Mark Anderson, David Vilares, Carlos Gómez-Rodríguez
PoS tags, once taken for granted as a useful resource for syntactic parsing, have become more situational with the popularization of deep learning.
no code implementations • insights (ACL) 2022 • Alberto Muñoz-Ortiz, Carlos Gómez-Rodríguez, David Vilares
We propose a morphology-based method for low-resource (LR) dependency parsing.
no code implementations • RANLP 2021 • Alberto Muñoz-Ortiz, Michalina Strzyz, David Vilares
Different linearizations have been proposed to cast dependency parsing as sequence labeling, solving the task by: (i) treating it as a head selection problem, (ii) representing token arcs as bracket strings, or (iii) associating partial transition sequences of a transition-based parser with words.
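As a minimal sketch of linearization (i), the head selection encoding can be illustrated by mapping each token to a label holding the signed offset to its head (this is a hypothetical illustration, not the authors' implementation; label conventions such as the `root` tag are assumptions):

```python
def encode_heads(heads):
    """Encode a dependency tree, given as a list of 1-based head indices
    (0 marks the root), into one relative-offset label per token."""
    labels = []
    for i, head in enumerate(heads, start=1):
        if head == 0:
            labels.append("root")
        else:
            labels.append(str(head - i))  # signed offset from token i to its head
    return labels

def decode_heads(labels):
    """Invert the encoding: recover head indices from offset labels."""
    return [0 if lab == "root" else i + int(lab)
            for i, lab in enumerate(labels, start=1)]

# "She reads books": heads = [2, 0, 2] ("reads" is the root)
heads = [2, 0, 2]
labels = encode_heads(heads)        # ['1', 'root', '-1']
assert decode_heads(labels) == heads  # the encoding is lossless
```

Because each label is predicted independently per token, a standard sequence labeling model can produce the full tree in one pass; decoding is a trivial inversion of the offsets.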