no code implementations • RANLP 2021 • John Lee, Chak Yan Yeung
When the user makes at least half of the expected updates to the open learner model, simulation results show that it outperforms the graded approach in retrieving texts that fit user preference for new-word density.
no code implementations • EMNLP (LaTeCH-CLfL) 2021 • Wenxiu Xie, John Lee, Fangqiong Zhan, Xiao Han, Chi-Yin Chow
In Chinese, the derivation may be marked either with the standard adverbial marker DI, or the non-standard marker DE.
no code implementations • EMNLP 2021 • John Lee, Ho Hung Lim, Carol Webster
A nominalization uses a deverbal noun to describe an event associated with its underlying verb.
no code implementations • ACL (NLP4PosImpact) 2021 • John Lee, Baikun Liang, Haley Fong
We obtained the best performance in both restatement and question generation by fine-tuning BertSum, a state-of-the-art summarization model, with the in-domain manual dataset augmented with a large-scale, automatically mined open-domain dataset.
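The augmentation step can be pictured with a minimal sketch: the small, manually annotated in-domain dataset is simply concatenated with the large, automatically mined open-domain dataset before fine-tuning. The file names, the PairDataset class and the loader settings below are illustrative assumptions, not the authors' BertSum configuration.

```python
import csv
import torch
from torch.utils.data import ConcatDataset, DataLoader, Dataset

def load_pairs(path):
    # Each row: input_text <TAB> target_text (a restatement or a question).
    with open(path, encoding="utf-8") as f:
        return [tuple(row) for row in csv.reader(f, delimiter="\t")]

class PairDataset(Dataset):
    def __init__(self, pairs):
        self.pairs = pairs
    def __len__(self):
        return len(self.pairs)
    def __getitem__(self, idx):
        return self.pairs[idx]

in_domain = PairDataset(load_pairs("manual_counselling_pairs.tsv"))   # small, hand-built
open_domain = PairDataset(load_pairs("mined_open_domain_pairs.tsv"))  # large, auto-mined
train_data = ConcatDataset([in_domain, open_domain])                  # the augmentation step
loader = DataLoader(train_data, batch_size=16, shuffle=True)          # input to fine-tuning
```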
no code implementations • EACL (BEA) 2021 • Chak Yan Yeung, John Lee
To promote efficient learning of Chinese characters, pedagogical materials may present not only a single character, but also a set of characters that are related in meaning and in written form.
1 code implementation • 24 Feb 2022 • Pierre Lambert, Cyril de Bodt, Michel Verleysen, John Lee
Multidimensional scaling is a statistical process that aims to embed high-dimensional data into a lower-dimensional space; it is often used for data visualisation.
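For readers unfamiliar with the technique, a minimal illustration of this use case with scikit-learn's MDS implementation (the random data and parameter values are placeholders, not the paper's setup):

```python
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))                 # 200 samples living in a 50-D space

mds = MDS(n_components=2, dissimilarity="euclidean", random_state=0)
X_2d = mds.fit_transform(X)                    # stress-minimising 2-D embedding
print(X_2d.shape)                              # (200, 2), ready for a scatter plot
```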
no code implementations • 7 Jan 2021 • Yufang Huang, Kelly M. Axsom, John Lee, Lakshminarayanan Subramanian, Yiye Zhang
Following the representation learning and clustering steps, we embed in the DICE objective function a constraint that requires a statistically significant association between the outcome and the cluster membership of the learned representations.
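The constraint can be pictured with a small sketch (an illustration of the idea only, not DICE's actual implementation): after clustering the learned representations, test whether cluster membership and the outcome are statistically associated, e.g. with a chi-squared test of independence.

```python
import numpy as np
from scipy.stats import chi2_contingency

def significant_association(cluster_ids, outcomes, alpha=0.01):
    """True if cluster membership and outcome are dependent at level alpha."""
    clusters, classes = np.unique(cluster_ids), np.unique(outcomes)
    # Contingency table: counts of each (cluster, outcome) combination.
    table = np.array([[np.sum((cluster_ids == c) & (outcomes == o))
                       for o in classes] for c in clusters])
    _, p_value, _, _ = chi2_contingency(table)
    return p_value < alpha
```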
no code implementations • COLING 2020 • John Lee, Benjamin Tsou, Tianyuan Cai
While bilingual corpora have been instrumental for machine translation, their utility for training translators has been less explored.
no code implementations • COLING 2020 • Dariush Saberi, John Lee, Jonathan James Webster
This paper describes a writing assistance system that helps students improve their academic writing.
no code implementations • LREC 2020 • John Lee, Meichun Liu, Tianyuan Cai
This paper presents the first investigation on using semantic frames to assess text difficulty.
no code implementations • LREC 2020 • Ildiko Pilan, John Lee, Chak Yan Yeung, Jonathan Webster
The dataset consists of student-written sentences in their original and revised versions with teacher feedback provided for the errors.
no code implementations • LREC 2020 • John Lee, Tianyuan Cai, Wenxiu Xie, Lam Xing
In a case study, we created a chatbot with a domain-specific subcorpus that addressed 25 issues in test anxiety, with 436 inputs solicited from native speakers of Cantonese and 150 chatbot replies harvested from mental health websites.
no code implementations • WS 2019 • John Lee, Chak Yan Yeung
In the typical LS pipeline, the Substitution Ranking step determines the best substitution out of a set of candidates.
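As a concrete, if simplistic, illustration of the Substitution Ranking step, here is a frequency baseline that prefers the candidate most common in a reference corpus; this is a standard heuristic rather than the ranking model proposed in this paper.

```python
from collections import Counter

def rank_substitutions(candidates, corpus_tokens):
    freq = Counter(corpus_tokens)
    # Higher corpus frequency is taken as a proxy for simplicity.
    return sorted(candidates, key=lambda w: freq[w], reverse=True)

corpus = "the big dog saw the big cat near the large tree".split()
print(rank_substitutions(["gigantic", "big", "large"], corpus))
# ['big', 'large', 'gigantic']
```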
no code implementations • 31 Aug 2019 • John Lee, Nicholas P. Bertrand, Christopher J. Rozell
The modeling of phenomenological structure is a crucial aspect of inverse imaging problems.
2 code implementations • NeurIPS 2019 • John Lee, Max Dabagia, Eva L. Dyer, Christopher J. Rozell
Our results demonstrate that when clustered structure exists in datasets, and is consistent across trials or time points, a hierarchical alignment strategy that leverages such structure can provide significant improvements in cross-domain alignment.
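A minimal sketch of the hierarchical idea (the cluster counts and centroid-matching step are assumptions, and the paper's optimal-transport formulation is not reproduced here): cluster each dataset, pair up clusters across domains, and then solve a finer alignment within each matched pair.

```python
import numpy as np
from sklearn.cluster import KMeans
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

def match_clusters(X_src, X_tgt, k=5, seed=0):
    src = KMeans(n_clusters=k, random_state=seed, n_init=10).fit(X_src)
    tgt = KMeans(n_clusters=k, random_state=seed, n_init=10).fit(X_tgt)
    # Cost = distance between cluster centroids; the Hungarian algorithm pairs them up.
    cost = cdist(src.cluster_centers_, tgt.cluster_centers_)
    src_idx, tgt_idx = linear_sum_assignment(cost)
    # A fine-grained alignment (e.g. Procrustes or transport) can then be
    # solved separately within each matched pair of clusters.
    return list(zip(src_idx, tgt_idx)), src.labels_, tgt.labels_
```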
no code implementations • COLING 2018 • John Lee, Chak Yan Yeung
A lexical simplification (LS) system aims to substitute complex words with simple words in a text, while preserving its meaning and grammaticality.
no code implementations • COLING 2018 • Chak Yan Yeung, John Lee
This paper describes a personalized text retrieval algorithm that helps language learners select the most suitable reading material in terms of vocabulary complexity.
no code implementations • 12 Jun 2018 • Nicholas P. Bertrand, Adam S. Charles, John Lee, Pavel B. Dunn, Christopher J. Rozell
Tracking algorithms such as the Kalman filter aim to improve inference performance by leveraging the temporal dynamics in streaming observations.
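As a reminder of the mechanism, a minimal one-dimensional Kalman filter with a random-walk state model (the noise variances are arbitrary placeholders):

```python
import numpy as np

def kalman_1d(observations, q=1e-3, r=1e-1):
    """q: process noise variance, r: observation noise variance."""
    x, p = 0.0, 1.0                     # initial state estimate and its variance
    estimates = []
    for z in observations:
        p = p + q                       # predict: the state follows a random walk
        k = p / (p + r)                 # Kalman gain
        x = x + k * (z - x)             # correct with the new observation
        p = (1 - k) * p
        estimates.append(x)
    return np.array(estimates)

noisy = 1.0 + 0.3 * np.random.default_rng(0).standard_normal(100)
print(kalman_1d(noisy)[-1])             # converges toward the true value of 1.0
```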
no code implementations • WS 2017 • Shu Jiang, John Lee
Fill-in-the-blank items are a common form of exercise in computer-assisted language learning systems.
no code implementations • IJCNLP 2017 • John Lee, Meichun Liu, Chun Yin Lam, Tak On Lau, Bing Li, Keying Li
We present a web-based interface that automatically assesses reading difficulty of Chinese texts.
no code implementations • IJCNLP 2017 • Lis Pereira, Xiaodong Liu, John Lee
We explore the application of a Deep Structured Similarity Model (DSSM) to ranking in lexical simplification.
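A small twin-tower sketch in the spirit of DSSM (the feature dimensions, layer sizes and random inputs are assumptions, not the paper's configuration): the target word in its context and each candidate substitution are mapped to dense vectors, and candidates are ranked by cosine similarity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Tower(nn.Module):
    def __init__(self, in_dim=300, hidden=128, out_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, out_dim), nn.Tanh(),
        )
    def forward(self, x):
        return self.net(x)

context_tower, candidate_tower = Tower(), Tower()
context_vec = context_tower(torch.randn(1, 300))         # target word in its sentence
candidates = candidate_tower(torch.randn(5, 300))        # five substitution candidates
scores = F.cosine_similarity(context_vec, candidates)    # one score per candidate
ranking = scores.argsort(descending=True)                # best-ranked substitution first
```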
no code implementations • IJCNLP 2017 • Chak Yan Yeung, John Lee
We present the first study that evaluates both speaker and listener identification for direct speech in literary texts.
no code implementations • 25 Oct 2017 • Eric Laloy, Romain Hérault, John Lee, Diederik Jacques, Niklas Linde
Here, we use a deep neural network of the variational autoencoder type to construct a parametric low-dimensional base model parameterization of complex binary geological media.
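A minimal VAE sketch in PyTorch shows how a handful of latent variables can parameterize binary images; the 64x64 grid, layer sizes and 20-dimensional latent space are assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class VAE(nn.Module):
    def __init__(self, n_pixels=64 * 64, latent_dim=20):
        super().__init__()
        self.enc = nn.Linear(n_pixels, 400)
        self.mu = nn.Linear(400, latent_dim)
        self.logvar = nn.Linear(400, latent_dim)
        self.dec1 = nn.Linear(latent_dim, 400)
        self.dec2 = nn.Linear(400, n_pixels)

    def encode(self, x):
        h = F.relu(self.enc(x))
        return self.mu(h), self.logvar(h)

    def decode(self, z):
        # Sigmoid output: each pixel is the probability of one facies vs the other.
        return torch.sigmoid(self.dec2(F.relu(self.dec1(z))))

    def forward(self, x):
        mu, logvar = self.encode(x)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)   # reparameterization trick
        return self.decode(z), mu, logvar

def vae_loss(recon, x, mu, logvar):
    bce = F.binary_cross_entropy(recon, x, reduction="sum")       # reconstruction term
    kld = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp()) # KL regulariser
    return bce + kld
```

Once trained, the decoder maps any low-dimensional latent vector to a binary-looking image, which is the sense in which it can serve as a base parameterization for inversion.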
no code implementations • WS 2017 • Shu Jiang, John Lee
This paper reports the first study on automatic generation of distractors for fill-in-the-blank items for learning Chinese vocabulary.
no code implementations • WS 2017 • John Lee, J. Buddhika K. Pathirage Don
This paper applies parsing technology to the task of syntactic simplification of English sentences, focusing on the identification of text spans that can be removed from a complex sentence.
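A hedged sketch of the span-identification idea using an off-the-shelf dependency parser (spaCy) rather than the parser developed in the paper: subtrees headed by a relative-clause modifier are flagged as candidate removable spans.

```python
import spacy

nlp = spacy.load("en_core_web_sm")

def removable_spans(sentence):
    doc = nlp(sentence)
    spans = []
    for token in doc:
        if token.dep_ == "relcl":                        # relative-clause modifier
            subtree = list(token.subtree)                # tokens in document order
            spans.append(doc[subtree[0].i : subtree[-1].i + 1])
    return spans

sent = "The report, which was released yesterday, criticises the plan."
print(removable_spans(sent))   # typically flags the clause starting with "which"
```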
no code implementations • WS 2017 • John Lee, Keying Li, Herman Leung
This opinion paper proposes the use of a parallel treebank as a learner corpus.
no code implementations • 1 Feb 2017 • Eric Eaton, Sven Koenig, Claudia Schulz, Francesco Maurelli, John Lee, Joshua Eckroth, Mark Crowley, Richard G. Freedman, Rogelio E. Cardona-Rivera, Tiago Machado, Tom Williams
The 7th Symposium on Educational Advances in Artificial Intelligence (EAAI'17, co-chaired by Sven Koenig and Eric Eaton) launched the EAAI New and Future AI Educator Program to support the training of early-career university faculty, secondary school faculty, and future educators (PhD candidates or postdocs who intend a career in academia).
no code implementations • WS 2016 • Herman Leung, Rafaël Poiret, Tak-sum Wong, Xinying Chen, Kim Gerdes, John Lee
This article proposes a Universal Dependency Annotation Scheme for Mandarin Chinese, including POS tags and dependency analysis.
no code implementations • COLING 2016 • John Lee, Wenlong Zhao, Wenxiu Xie
We present a browser-based editor for simplifying English text.
no code implementations • COLING 2016 • John Lee, Chun Yin Lam, Shu Jiang
We present a mobile app that provides a reading environment for learners of Chinese as a foreign language.
no code implementations • LREC 2016 • Tak-sum Wong, John Lee
We present a dependency treebank of the Chinese Buddhist Canon, which contains 1,514 texts with about 50 million Chinese characters.
no code implementations • LREC 2016 • John Lee, Chak Yan Yeung
We propose a scheme for annotating direct speech in literary texts, based on the Text Encoding Initiative (TEI) and the coreference annotation guidelines from the Message Understanding Conference (MUC).