no code implementations • RANLP 2021 • Yukun Feng, Chenlong Hu, Hidetaka Kamigaito, Hiroya Takamura, Manabu Okumura
Character-aware neural language models can capture the relationship between words by exploiting character-level information and are particularly effective for languages with rich morphology.
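The abstract gives no code; as a rough illustration of the general character-level recipe behind such models (a char-CNN word encoder in the style of Kim et al., 2016, not necessarily this paper's architecture), a minimal sketch:

```python
# Minimal sketch (not the paper's model): compose a word vector from its
# characters with a CNN + max-over-time pooling, the common recipe in
# character-aware language models.
import torch
import torch.nn as nn

class CharWordEncoder(nn.Module):
    def __init__(self, n_chars=100, char_dim=16, word_dim=64, kernel_size=3):
        super().__init__()
        self.char_emb = nn.Embedding(n_chars, char_dim, padding_idx=0)
        self.conv = nn.Conv1d(char_dim, word_dim, kernel_size, padding=1)

    def forward(self, char_ids):
        # char_ids: (batch_of_words, max_word_len) character ids, 0 = padding
        x = self.char_emb(char_ids)       # (batch, len, char_dim)
        x = x.transpose(1, 2)             # (batch, char_dim, len)
        x = torch.relu(self.conv(x))      # (batch, word_dim, len)
        return x.max(dim=2).values        # max-over-time -> (batch, word_dim)

# The resulting word vectors can replace a plain word-embedding lookup as
# input to an LSTM or Transformer language model.
enc = CharWordEncoder()
word_vecs = enc(torch.randint(1, 100, (8, 12)))  # 8 words, up to 12 chars each
print(word_vecs.shape)                           # torch.Size([8, 64])
```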
no code implementations • RANLP 2021 • Jingyi You, Chenlong Hu, Hidetaka Kamigaito, Hiroya Takamura, Manabu Okumura
Neural sequence-to-sequence (Seq2Seq) models and BERT have achieved substantial improvements in abstractive document summarization (ADS) without and with pre-training, respectively.
no code implementations • 21 Jan 2024 • Yanhong Peng, Ceng Zhang, Chenlong Hu, Zebing Mao
This paper presents an innovative approach to integrating Large Language Models (LLMs) with Arduino-controlled Electrohydrodynamic (EHD) pumps for precise color synthesis in automation systems.
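No code accompanies the entry; the sketch below only illustrates the general integration pattern (LLM output translated into pump commands sent to an Arduino over serial). `query_llm` is a placeholder rather than a real API, and the serial command format is invented for illustration.

```python
# Rough sketch of the integration pattern only (not the paper's code).
import json
import serial  # pyserial

def query_llm(prompt: str) -> str:
    """Placeholder: should return JSON such as '{"r": 40, "g": 10, "b": 75}'."""
    raise NotImplementedError("plug in an actual LLM client here")

def set_color(target_color: str, port: str = "/dev/ttyACM0") -> None:
    # Ask the LLM to translate a natural-language color request into
    # per-channel pump duty cycles (0-100).
    reply = query_llm(f"Give EHD pump duty cycles (0-100) as JSON for: {target_color}")
    duty = json.loads(reply)
    with serial.Serial(port, 9600, timeout=1) as ard:
        # Hypothetical line protocol the Arduino sketch would parse, e.g. "40,10,75\n"
        ard.write(f"{duty['r']},{duty['g']},{duty['b']}\n".encode())
```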
no code implementations • 24 Oct 2022 • Wei Wang, Gang Wang, Chenlong Hu, K. C. Ho
For single ellipse fitting, we formulate a non-convex optimization problem to estimate the kernel bandwidth and center, and divide it into two subproblems, each estimating one parameter.
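The abstract only states the decomposition strategy; the sketch below illustrates that general "two subproblems, one parameter each" alternation on a stand-in objective (a toy kernel-response least-squares fit), not the paper's actual formulation.

```python
# Illustration of alternating between a center subproblem and a bandwidth
# subproblem; the objective is a stand-in, not the paper's formulation.
import numpy as np
from scipy.optimize import minimize, minimize_scalar

def objective(center, bandwidth, pts, targets):
    # Toy fit: how well a Gaussian bump at `center` with width `bandwidth`
    # reproduces observed responses at the sample points.
    d2 = np.sum((pts - center) ** 2, axis=1)
    pred = np.exp(-d2 / (2.0 * bandwidth ** 2))
    return np.mean((pred - targets) ** 2)

def alternate_fit(pts, targets, n_iters=20):
    center = pts.mean(axis=0)   # initial center guess
    bandwidth = 1.0             # initial bandwidth guess
    for _ in range(n_iters):
        # Subproblem 1: fix the bandwidth, estimate the center.
        res_c = minimize(lambda c: objective(c, bandwidth, pts, targets), center)
        center = res_c.x
        # Subproblem 2: fix the center, estimate the bandwidth (1-D search).
        res_h = minimize_scalar(lambda h: objective(center, h, pts, targets),
                                bounds=(1e-3, 10.0), method="bounded")
        bandwidth = res_h.x
    return center, bandwidth
```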
no code implementations • EACL 2021 • Chenlong Hu, Yukun Feng, Hidetaka Kamigaito, Hiroya Takamura, Manabu Okumura
This work presents multi-modal deep SVDD (mSVDD) for one-class text classification.
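Deep SVDD (Ruff et al., 2018) trains an encoder so that in-class representations cluster around a hypersphere center; one plausible reading of the multi-modal extension is several centers, with each sample pulled toward its nearest one. The sketch below shows only that loss, not necessarily the exact mSVDD objective.

```python
# Sketch of a multi-center Deep SVDD-style loss (one plausible reading of
# "multi-modal"); not necessarily the paper's exact mSVDD objective.
import torch

def msvdd_loss(reps: torch.Tensor, centers: torch.Tensor) -> torch.Tensor:
    """reps: (batch, dim) encoded texts; centers: (K, dim) hypersphere centers."""
    d2 = torch.cdist(reps, centers) ** 2   # squared distances, (batch, K)
    # Each sample is pulled toward its nearest center (its "mode").
    return d2.min(dim=1).values.mean()

# Usage: `reps` would come from a text encoder trained on one-class data;
# at test time, a large distance to every center flags an outlier.
reps = torch.randn(32, 128)
centers = torch.randn(4, 128)
print(msvdd_loss(reps, centers))
```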
1 code implementation • AACL 2020 • Yukun Feng, Chenlong Hu, Hidetaka Kamigaito, Hiroya Takamura, Manabu Okumura
We propose a simple and effective method for incorporating word clusters into the Continuous Bag-of-Words (CBOW) model.
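The abstract does not specify how the clusters enter the model; the sketch below shows one simple way consistent with the description (averaging cluster embeddings of the context words alongside the usual word-embedding average), which may differ from the paper's actual method.

```python
# Sketch of one simple way to add word-cluster information to CBOW;
# the paper's actual mechanism may differ.
import torch
import torch.nn as nn

class ClusterCBOW(nn.Module):
    def __init__(self, vocab_size, n_clusters, dim, word2cluster):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, dim)
        self.cluster_emb = nn.Embedding(n_clusters, dim)
        self.out = nn.Linear(dim, vocab_size)
        # word2cluster: LongTensor of length vocab_size, word id -> cluster id
        self.register_buffer("word2cluster", word2cluster)

    def forward(self, context_ids):
        # context_ids: (batch, window) ids of the surrounding words
        w = self.word_emb(context_ids).mean(dim=1)                      # (batch, dim)
        c = self.cluster_emb(self.word2cluster[context_ids]).mean(dim=1)
        return self.out(w + c)   # logits over the vocabulary for the center word
```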