Search Results for author: Lalita Lowphansirikul

Found 4 papers, 3 papers with code

WangchanBERTa: Pretraining transformer-based Thai Language Models

2 code implementations • 24 Jan 2021 • Lalita Lowphansirikul, Charin Polpanumas, Nawat Jantrakulchai, Sarana Nutanong

However, for a relatively low-resource language such as Thai, the choices of models are limited to training a BERT-based model based on a much smaller dataset or finetuning multi-lingual models, both of which yield suboptimal downstream performance.
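
A minimal sketch of how a pretrained WangchanBERTa checkpoint could be loaded for masked-token prediction with Hugging Face transformers. The model ID below is the one commonly published on the Hugging Face Hub for this paper; treat it as an assumption and verify it before use.

```python
# Sketch: masked-language-model inference with a WangchanBERTa checkpoint.
# Assumes the Hub model ID "airesearch/wangchanberta-base-att-spm-uncased";
# swap in the checkpoint you actually intend to use.
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="airesearch/wangchanberta-base-att-spm-uncased",
)

# Predict the masked token in a Thai sentence ("I like to eat <mask>").
for prediction in fill_mask("ฉันชอบกิน<mask>"):
    print(prediction["token_str"], prediction["score"])
```

The same checkpoint can also be fine-tuned on downstream Thai classification or token-labeling tasks, which is the usage the abstract contrasts with finetuning multilingual models.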

Tasks: Language Modelling, Sentence +2