no code implementations • LT4HALA (LREC) 2022 • Yu Chang, Peng Zhu, Chaoping Wang, Chaofan Wang
In recent years, new deep learning methods and pre-trained language models have emerged in the field of natural language processing (NLP).
no code implementations • CCL 2020 • Chaofan Wang, Shenggen Ju, Jieping Sun, Run Chen
In recent years, capsule networks (CapsNets) have been applied to text classification tasks owing to their strong text-feature learning ability. Most existing work treats the extracted multi-gram (n-gram) features of a text as equally important, ignoring that the importance of each n-gram feature associated with a word should be determined by its specific context; this directly affects the model's semantic understanding of the whole text. To address this problem, this paper proposes the multi-scale feature partially connected capsule network (MulPart-Capsnets). The method integrates multi-scale feature attention into CapsNets: the attention automatically selects n-gram features at different scales and, by computing their weighted sum, precisely captures rich n-gram features for each word. Meanwhile, to reduce redundant information transfer between child and parent capsules, the routing algorithm is also improved. The proposed algorithm is validated on seven well-known text-classification datasets; compared with existing work, its performance improves significantly, showing that it captures richer n-gram features from text and has a stronger text-feature learning ability.
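The attention-weighted sum over n-gram scales described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the scoring vector `w`, the softmax scoring scheme, and all shapes are assumptions made for the sketch.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_scale_attention(ngram_feats, w):
    """Weighted sum of n-gram features across scales, per word.

    ngram_feats: (num_scales, seq_len, dim) features per n-gram scale
                 (e.g. uni-/bi-/tri-gram convolutions).
    w: (dim,) hypothetical attention parameter vector (an assumption;
       the paper's actual attention parameterization may differ).
    Returns (seq_len, dim): for each word, a convex combination of its
    features across scales, so context decides each scale's importance.
    """
    scores = ngram_feats @ w                  # (num_scales, seq_len)
    alpha = softmax(scores, axis=0)           # attention over scales per word
    return (alpha[..., None] * ngram_feats).sum(axis=0)

rng = np.random.default_rng(0)
feats = rng.normal(size=(3, 5, 8))  # 3 scales, 5 words, feature dim 8
out = multi_scale_attention(feats, rng.normal(size=8))
print(out.shape)  # (5, 8)
```

Because the attention weights are non-negative and sum to one over scales, each word's output stays within the range spanned by its per-scale features.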
no code implementations • 30 Sep 2024 • Samuel Kernan Freire, Tianhao He, Chaofan Wang, Evangelos Niforatos, Alessandro Bozzon
In the shift towards human-centered manufacturing, our two-year longitudinal study investigates the real-world impact of deploying Cognitive Assistants (CAs) in factories.
no code implementations • 7 Feb 2024 • Samuel Kernan Freire, Chaofan Wang, Evangelos Niforatos
Conversational Assistants (CA) are increasingly supporting human workers in knowledge management.
no code implementations • 10 Jan 2024 • Samuel Kernan Freire, Chaofan Wang, Mina Foosherian, Stefan Wellsandt, Santiago Ruiz-Arenas, Evangelos Niforatos
This paper introduces a Large Language Model (LLM)-based system designed to retrieve information from the extensive knowledge contained in factory documentation and knowledge shared by expert operators.
1 code implementation • 1 Sep 2022 • Nan Wu, Chaofan Wang
Heterogeneous graph neural networks (HGNNs) have attracted increasing research interest over the past three years.
1 code implementation • 27 Apr 2022 • Nan Wu, Chaofan Wang
We formulate a general propagation rule following the nature of message passing in the tree to update a node's feature by aggregating its initial feature and its neighbor nodes' updated features.
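The propagation rule described above can be sketched as a bottom-up pass over a tree, where each node's updated feature aggregates its own initial feature with its children's already-updated features. This is an illustrative sketch only: the tree encoding, the recursion order, and the mean aggregator are assumptions, not the paper's exact rule.

```python
import numpy as np

def tree_propagate(children, feats, root=0):
    """Bottom-up message passing on a tree.

    children: dict mapping node id -> list of child ids.
    feats: dict mapping node id -> initial feature vector (np.ndarray).
    Returns a dict of updated features: each node aggregates its initial
    feature with its children's updated features (mean aggregation here
    is an assumed choice of aggregator).
    """
    updated = {}

    def visit(u):
        child_msgs = [visit(c) for c in children.get(u, [])]
        # Aggregate the node's own initial feature with updated child features.
        updated[u] = np.mean([feats[u]] + child_msgs, axis=0)
        return updated[u]

    visit(root)
    return updated

# Tiny example: root 0 with leaves 1 and 2.
children = {0: [1, 2]}
feats = {0: np.array([1.0, 0.0]),
         1: np.array([0.0, 2.0]),
         2: np.array([2.0, 2.0])}
out = tree_propagate(children, feats)
print(out[0])  # mean of the three vectors: [1.0, 1.3333...]
```

Leaves keep their initial features (they have no children), and each internal node's update depends only on already-finalized child messages, which is what lets such models stack updates without revisiting nodes.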
Ranked #50 on Node Property Prediction on ogbn-arxiv
no code implementations • 29 Sep 2021 • Nan Wu, Chaofan Wang
Models adopting this scheme have the capability of going deep.