Search Results for author: Paul Nulty

Found 6 papers, 3 papers with code

Can Domain Pre-training Help Interdisciplinary Researchers from Data Annotation Poverty? A Case Study of Legal Argument Mining with BERT-based Transformers

no code implementations · NLP4DH (ICON) 2021 · Gechuan Zhang, David Lillis, Paul Nulty

Our case study focuses on: the comparison of general pre-training and domain pre-training; the generalisability of different domain pre-trained transformers; and the potential of merging general pre-training with domain pre-training.

Argument Mining
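
The listing above does not specify the checkpoints or dataset used; as a rough, hedged illustration of what domain pre-training for legal argument mining can look like in practice, the sketch below fine-tunes a publicly available domain pre-trained legal BERT (nlpaueb/legal-bert-base-uncased, an assumed stand-in, not necessarily the model used in the paper) for sentence-level argument classification with Hugging Face Transformers.

```python
# Hedged sketch: fine-tuning a domain pre-trained legal BERT for argument
# classification. The checkpoint, label set, and example sentences are
# illustrative assumptions, not taken from the paper above.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "nlpaueb/legal-bert-base-uncased"  # BERT pre-trained on legal text
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(
    model_name, num_labels=2  # e.g. argumentative vs. non-argumentative sentence
)

sentences = [
    "The applicant complains under Article 6 of the Convention.",
    "The hearing took place on 3 May.",
]
labels = torch.tensor([1, 0])  # toy labels for the illustration

batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
outputs = model(**batch, labels=labels)
outputs.loss.backward()  # an optimizer step would follow in a real training loop
print(outputs.logits.argmax(dim=-1))
```

The same loop works with a general-purpose checkpoint (e.g. bert-base-uncased), which is how a comparison of general versus domain pre-training could be run under these assumptions.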

Enhancing Legal Argument Mining with Domain Pre-training and Neural Networks

1 code implementation · 27 Feb 2022 · Gechuan Zhang, Paul Nulty, David Lillis

In this paper, we provide a broad study of both classic and contextual embedding models and their performance on practical case law from the European Court of Human Rights (ECHR).

Argument Mining

Crisis Domain Adaptation Using Sequence-to-sequence Transformers

1 code implementation · 15 Oct 2021 · Congcong Wang, Paul Nulty, David Lillis

In this paper, we investigate how this prior knowledge can be best leveraged for new crises by examining the extent to which crisis events of a similar type are more suitable for adaptation to new events (cross-domain adaptation).

Domain Adaptation · Language Modelling
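
The abstract snippet above describes adapting knowledge from past crisis events to a new one with sequence-to-sequence transformers. As a minimal sketch of that idea, and assuming a t5-small checkpoint and a text-to-text label format that are not necessarily the paper's setup, continued fine-tuning on a small sample from the new event could look like this:

```python
# Hedged sketch of cross-domain adaptation with a sequence-to-sequence
# transformer: a model already fine-tuned on tweets from past crises is
# further adapted on a few examples from a new event. Checkpoint, category
# name, and example tweet are illustrative assumptions.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "t5-small"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# A few labelled tweets from the new (target) crisis event used for adaptation:
new_event_tweets = ["Roads flooded near the bridge, families need evacuation help"]
new_event_labels = ["request_help"]  # hypothetical category name

inputs = tokenizer(new_event_tweets, return_tensors="pt", padding=True)
targets = tokenizer(new_event_labels, return_tensors="pt", padding=True)
loss = model(**inputs, labels=targets.input_ids).loss
loss.backward()  # adaptation continues with an optimizer step per batch
```

Whether the source model was trained on same-type events (cross-domain) or on all past events would be varied outside this loop, which is the comparison the abstract describes.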

Transformer-based Multi-task Learning for Disaster Tweet Categorisation

1 code implementation · 15 Oct 2021 · Congcong Wang, Paul Nulty, David Lillis

Social media has enabled people to circulate information in a timely fashion, motivating those affected by crisis situations to post messages seeking help.

Multi-Task Learning

The UCD-Net System at SemEval-2020 Task 1: Temporal Referencing with Semantic Network Distances

no code implementations · SemEval 2020 · Paul Nulty, David Lillis

This paper describes the UCD system entered for SemEval 2020 Task 1: Unsupervised Lexical Semantic Change Detection.

Change Detection
