no code implementations • 17 Dec 2024 • Xiangxiang Gao, Weisheng Xie, Yiwei Xiang, Feng Ji
Striking an optimal balance between minimal drafting latency and high speculation accuracy to enhance the inference speed of Large Language Models remains a significant challenge in speculative decoding.
1 code implementation • 8 Nov 2024 • Kai Zhao, Xuhao Li, Qiyu Kang, Feng Ji, Qinxu Ding, Yanan Zhao, Wenfei Liang, Wee Peng Tay
We introduce the Distributed-order fRActional Graph Operating Network (DRAGON), a novel continuous Graph Neural Network (GNN) framework that incorporates distributed-order fractional calculus.
no code implementations • 1 Nov 2024 • Feng Ji
This short note is a supplement to [1], in which the total variation of graph distributional signals is introduced and studied.
no code implementations • 13 Oct 2024 • Jielong Yang, Rui Ding, Feng Ji, Hongbin Wang, Linbo Xie
However, these approaches lack theoretical analysis of the proximity between predictions and ground truth at test time.
no code implementations • 6 Sep 2024 • Yanan Zhao, Xingchao Jian, Feng Ji, Wee Peng Tay, Antonio Ortega
We introduce a novel uncertainty principle for generalized graph signals that extends classical time-frequency and graph uncertainty principles into a unified framework.
no code implementations • 30 Aug 2024 • Qinji Shu, Hang Sheng, Feng Ji, Hui Feng, Bo Hu
By incorporating both sparsity and topological similarity into the model, this study establishes an upper bound on the transfer error for downsampling in the training of large-scale sparse graphs and provides insight into the influence of topological structure on transfer performance.
no code implementations • 6 Aug 2024 • Xingchao Jian, Martin Gölz, Feng Ji, Wee Peng Tay, Abdelhak M. Zoubir
We consider a multiple hypothesis testing problem in a sensor network over the joint spatial-time domain.
2 code implementations • 26 Apr 2024 • Qiyu Kang, Kai Zhao, Qinxu Ding, Feng Ji, Xuhao Li, Wenfei Liang, Yang Song, Wee Peng Tay
We introduce the FRactional-Order graph Neural Dynamical network (FROND), a new continuous graph neural network (GNN) framework.
no code implementations • 16 Feb 2024 • Wenwei Liu, Hui Feng, Feng Ji, Bo Hu
In this paper, we are interested in how to sample based on sign information in an online manner, so that the direction of the original graph signal can be estimated.
no code implementations • 30 Jan 2024 • Yang Liu, Yanshan Chen, Yuexi Yang, Xiangyu Pei, Feng Ji
In order to limit the short-circuit current of inverters, a logic-based bang-bang funnel control (LBFC) is designed to control the switches of inverter bridges when over-current is detected.
no code implementations • 30 Dec 2023 • Feng Ji
In this paper, we present a signal processing framework for directed graphs.
no code implementations • 13 Dec 2023 • Feng Ji, Xingchao Jian, Wee Peng Tay
Our signal processing framework provides a comprehensive approach to analyzing and processing signals on graph sequences, even if they are sparse.
no code implementations • 7 Nov 2023 • Feng Ji, Lu Gao, Chang Lin, Yang Liu
This paper proposes to analyze the motion stability of synchronous generator power systems using a Lagrangian model derived in the configuration space of generalized position and speed.
no code implementations • 25 Oct 2023 • See Hian Lee, Feng Ji, Kelin Xia, Wee Peng Tay
Traditionally, graph neural networks have been trained using a single observed graph.
no code implementations • 23 Oct 2023 • Xingchao Jian, Feng Ji, Wee Peng Tay
The note also contains errata for its previous version.
no code implementations • 12 Sep 2023 • Purui Zhang, Xingchao Jian, Feng Ji, Wee Peng Tay, Bihan Wen
We recall the notion of a complexon as the limit of a simplicial complex sequence [1].
no code implementations • 11 Sep 2023 • Xingchao Jian, Feng Ji, Wee Peng Tay
This random graph process converges to the generalized graphon in stretched cut distance.
no code implementations • 18 Aug 2023 • Rui Ding, Jielong Yang, Feng Ji, Xionghu Zhong, Linbo Xie
To address this challenge, we propose FR-GNN, a general framework for GNNs to conduct feature reconstruction.
no code implementations • 16 Jul 2023 • Feng Ji, Wee Peng Tay, Antonio Ortega
In this expository article, we provide a self-contained overview of the notion of convolution embedded in different theories: from the classical Fourier theory to the theory of algebraic signal processing.
no code implementations • 13 May 2023 • Qianglong Chen, Feng Ji, Feng-Lin Li, Guohai Xu, Ming Yan, Ji Zhang, Yin Zhang
To support cost-effective language inference in multilingual settings, we propose AMTSS, an adaptive multi-teacher single-student distillation framework, which allows distilling knowledge from multiple teachers to a single student.
no code implementations • 11 May 2023 • Feng Ji, Xingchao Jian, Wee Peng Tay, Maosheng Yang
Topological signal processing (TSP) over simplicial complexes typically assumes observations associated with the simplicial complexes are real scalars.
1 code implementation • 29 Apr 2023 • Feng Ji, See Hian Lee, Hanyang Meng, Kai Zhao, Jielong Yang, Wee Peng Tay
We introduce the key notion of label non-uniformity, which is derived from the Wasserstein distance between the softmax distribution of the logits and the uniform distribution.
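For intuition, here is a minimal sketch of one way to compute such a quantity: the Wasserstein distance between the softmax of a node's logits and the uniform distribution over classes, assuming the 1-D distance between class indices as the ground metric. The paper's exact construction may differ, and the function name `label_non_uniformity` is illustrative only.

```python
import numpy as np
from scipy.stats import wasserstein_distance

def label_non_uniformity(logits):
    """Wasserstein distance between softmax(logits) and the uniform distribution.

    Illustrative sketch: the ground metric is assumed to be the 1-D distance
    between class indices, which need not match the paper's choice.
    """
    z = logits - logits.max()            # numerical stability
    p = np.exp(z) / np.exp(z).sum()      # softmax distribution of the logits
    num_classes = p.shape[0]
    support = np.arange(num_classes)
    uniform = np.full(num_classes, 1.0 / num_classes)
    return wasserstein_distance(support, support, u_weights=p, v_weights=uniform)

# A confident (peaked) prediction is far from uniform; a flat one is not.
print(label_non_uniformity(np.array([5.0, 0.0, 0.0, 0.0])))   # large
print(label_non_uniformity(np.array([1.0, 1.0, 1.0, 1.0])))   # ~0
```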
no code implementations • 7 Apr 2023 • Feng Ji, See Hian Lee, Kai Zhao, Wee Peng Tay, Jielong Yang
In graph neural networks (GNNs), both node features and labels are examples of graph signals, a key notion in graph signal processing (GSP).
no code implementations • 3 Mar 2023 • See Hian Lee, Feng Ji, Wee Peng Tay
However, a graph can have hyperbolic and Euclidean geometries at different regions of the graph.
no code implementations • 24 Feb 2023 • Feng Ji, Xingchao Jian, Wee Peng Tay
In this paper, we propose a framework for graph signal processing using category theory.
no code implementations • 22 Feb 2023 • Feng Ji, Xingchao Jian, Wee Peng Tay
We develop signal processing tools to study the new notion of distributional graph signals.
no code implementations • 10 Oct 2022 • Zhongyi Ni, Feng Ji, Hang Sheng, Hui Feng, Bo Hu
When sampling multiple signals, the correlation between the signals can be exploited to reduce the overall number of samples.
no code implementations • 28 Sep 2022 • Feng Ji, See Hian Lee, Wee Peng Tay
In graph signal processing, one of the most important subjects is the study of filters, i.e., linear transformations that capture relations between graph signals.
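As a reminder of the classical notion this entry builds on, the sketch below applies a polynomial graph filter h(S)x = h_0 x + h_1 Sx + h_2 S^2 x, with the adjacency matrix of a small path graph as the shift operator S. The graph and filter taps are illustrative; this is not the paper's generalized filter construction.

```python
import numpy as np

# Adjacency matrix of a 4-node path graph, used as the shift operator S.
S = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

def polynomial_filter(S, x, taps):
    """Apply the polynomial graph filter h(S) x = sum_k taps[k] * S^k x."""
    y = np.zeros_like(x)
    Skx = x.copy()                # S^0 x
    for h_k in taps:
        y += h_k * Skx
        Skx = S @ Skx             # advance to the next power of the shift
    return y

x = np.array([1.0, 0.0, 0.0, 0.0])            # an impulse on the first vertex
print(polynomial_filter(S, x, taps=[0.5, 0.3, 0.2]))
```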
1 code implementation • 24 Jul 2022 • See Hian Lee, Feng Ji, Wee Peng Tay
In this paper, we present Simplicial Graph Attention Network (SGAT), a simplicial complex approach to represent such high-order interactions by placing features from non-target nodes on the simplices.
no code implementations • 9 Jun 2022 • Feng Ji, Yiqi Lu, Wee Peng Tay, Edwin Chong
Graph signal processing is a framework to handle graph-structured data.
no code implementations • 2 Mar 2022 • Feng Ji, Wee Peng Tay
Graph signal processing (GSP) is a framework to analyze and process graph-structured data.
no code implementations • 26 Sep 2021 • Wenwei Liu, Hui Feng, Kaixuan Wang, Feng Ji, Bo Hu
Sampling and interpolation have been extensively studied in order to reconstruct or estimate an entire graph signal from its values on a subset of vertices, with most existing results focusing on continuous signals.
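For context, here is a minimal sketch of the standard bandlimited least-squares reconstruction of a graph signal from samples on a vertex subset, using the K lowest-frequency Laplacian eigenvectors. The cycle graph, bandwidth K, and sample set are illustrative assumptions; this sketches the classical setting the entry refers to, not necessarily the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Laplacian of a small cycle graph.
N = 6
A = np.roll(np.eye(N), 1, axis=1) + np.roll(np.eye(N), -1, axis=1)
L = np.diag(A.sum(axis=1)) - A

# Bandlimited signal: combination of the K lowest-frequency eigenvectors.
K = 3
_, U = np.linalg.eigh(L)
x = U[:, :K] @ rng.standard_normal(K)

# Sample on a vertex subset and reconstruct by least squares.
sampled = [0, 2, 4]
coeffs, *_ = np.linalg.lstsq(U[sampled, :K], x[sampled], rcond=None)
x_hat = U[:, :K] @ coeffs
print(np.allclose(x, x_hat))   # True: these samples determine the K-bandlimited signal
```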
no code implementations • 20 Aug 2021 • Feng Ji, Wee Peng Tay, Antonio Ortega
Each graph topology gives rise to a different shift operator.
no code implementations • ACL 2021 • Qianglong Chen, Feng Ji, Xiangji Zeng, Feng-Lin Li, Ji Zhang, Haiqing Chen, Yin Zhang
In order to better understand the reason behind model behaviors (i.e., making predictions), most recent works have exploited generative models to provide complementary explanations.
1 code implementation • Findings (ACL) 2021 • Fangkai Jiao, Yangyang Guo, Yilin Niu, Feng Ji, Feng-Lin Li, Liqiang Nie
Pre-trained Language Models (PLMs) have achieved great success on Machine Reading Comprehension (MRC) over the past few years.
1 code implementation • 5 May 2021 • Yangyang Guo, Liqiang Nie, Zhiyong Cheng, Feng Ji, Ji Zhang, Alberto del Bimbo
Experimental results demonstrate that our adapted margin cosine loss can greatly enhance the baseline models with an absolute performance gain of 15% on average, strongly verifying the potential of tackling the language prior problem in VQA from the perspective of answer feature space learning.
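As a reference point, the sketch below implements a CosFace-style large-margin cosine loss of the kind this entry adapts: the cosine similarity of the ground-truth class is reduced by a margin before a scaled softmax cross-entropy. The scale s, margin m, and toy dimensions are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def margin_cosine_loss(features, weights, labels, s=30.0, m=0.35):
    """Large-margin cosine loss on L2-normalized features and class weights."""
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    w = weights / np.linalg.norm(weights, axis=1, keepdims=True)
    cos = f @ w.T                                   # (batch, num_classes)
    rows = np.arange(len(labels))
    logits = s * cos
    logits[rows, labels] = s * (cos[rows, labels] - m)   # subtract margin for true class
    logits -= logits.max(axis=1, keepdims=True)          # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_probs[rows, labels].mean()

# Toy usage: 2 samples, 3-dim features, 4 answer classes.
rng = np.random.default_rng(0)
print(margin_cosine_loss(rng.standard_normal((2, 3)),
                         rng.standard_normal((4, 3)),
                         labels=np.array([1, 3])))
```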
no code implementations • 29 Mar 2021 • See Hian Lee, Feng Ji, Wee Peng Tay
A heterogeneous graph consists of multiple types of vertices and edges.
no code implementations • 11 Dec 2020 • Feng Ji, Wee Peng Tay
In this paper, we develop a signal processing framework of a network without explicit knowledge of the network topology.
no code implementations • 25 Nov 2020 • Haojie Pan, Cen Chen, Chengyu Wang, Minghui Qiu, Liu Yang, Feng Ji, Jun Huang
More specifically, we propose a reinforced selector to extract useful PRF terms to enhance response candidates and a BERT-based response ranker to rank the PRF-enhanced responses.
no code implementations • COLING 2020 • Qianglong Chen, Feng Ji, Haiqing Chen, Yin Zhang
More concretely, we first introduce a novel graph-based iterative knowledge retrieval module, which iteratively retrieves concepts and entities related to the given question and its choices from multiple knowledge sources.
no code implementations • 20 Oct 2020 • Feng Ji, Hui Feng, Hang Sheng, Wee Peng Tay
A continuous-time graph signal can be viewed as a time series of graph signals.
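To make the "time series of graph signals" picture concrete, here is a minimal sketch of the standard time-vertex representation: an N x T matrix transformed by a graph Fourier transform along vertices and a DFT along time. The path graph and random signal are illustrative assumptions; the paper's continuous-time treatment is more general than this discretized sketch.

```python
import numpy as np

N, T = 5, 8
rng = np.random.default_rng(0)

# Laplacian of a path graph on N vertices.
A = np.diag(np.ones(N - 1), 1) + np.diag(np.ones(N - 1), -1)
L = np.diag(A.sum(axis=1)) - A
_, U = np.linalg.eigh(L)          # graph Fourier basis (columns)

X = rng.standard_normal((N, T))   # one graph signal per time sample

# Joint time-vertex spectrum: GFT along vertices, DFT along time.
X_hat = np.fft.fft(U.T @ X, axis=1)
print(X_hat.shape)                # (N, T)
```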
no code implementations • 24 Sep 2020 • Feng-Lin Li, Hehong Chen, Guohai Xu, Tian Qiu, Feng Ji, Ji Zhang, Haiqing Chen
Pre-sales customer service is of importance to E-commerce platforms as it contributes to optimizing customers' buying process.
no code implementations • 27 May 2020 • Zehao Lin, Shaobo Cui, Guodun Li, Xiaoming Kang, Feng Ji, Feng-Lin Li, Zhongzhou Zhao, Haiqing Chen, Yin Zhang
More specifically, we take advantage of a decision model to help the dialogue system decide whether to wait or answer.
no code implementations • 21 May 2020 • Shuke Peng, Feng Ji, Zehao Lin, Shaobo Cui, Haiqing Chen, Yin Zhang
Building a high-quality multi-domain dialogue system is challenging because the dialogue state space is complicated and entangled across domains, which seriously limits the quality of the dialogue policy and in turn affects the generated responses.
no code implementations • 11 May 2020 • Feng Ji, Wee Peng Tay, Giacomo Kahn
Graph signal processing tools, such as the graph Fourier transform, require the full graph signal at every vertex of the graph.
no code implementations • 6 Apr 2020 • Feng Ji, Giacomo Kahn, Wee Peng Tay
In this paper, we develop a signal processing framework on simplicial complexes, such that we recover the traditional GSP theory when restricted to signals on graphs.
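For readers unfamiliar with the objects involved, the sketch below builds the standard Hodge 1-Laplacian of a filled triangle from its boundary (incidence) matrices and applies it to an edge signal. This illustrates conventional topological signal processing on simplicial complexes; it is not the framework proposed in the paper, and the edge orientations are an arbitrary choice.

```python
import numpy as np

# Filled triangle: 3 vertices, 3 oriented edges (0->1, 1->2, 0->2), 1 triangle.
B1 = np.array([[-1,  0, -1],      # vertex-to-edge incidence (boundary of edges)
               [ 1, -1,  0],
               [ 0,  1,  1]], dtype=float)
B2 = np.array([[ 1],              # edge-to-triangle incidence (boundary of the
               [ 1],              # 2-simplex expressed in the edge basis)
               [-1]], dtype=float)

# Hodge 1-Laplacian acting on edge signals; its kernel gives harmonic flows.
L1 = B1.T @ B1 + B2 @ B2.T
edge_signal = np.array([1.0, 2.0, 3.0])
print(L1 @ edge_signal)
```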
no code implementations • 22 Feb 2020 • Zehao Lin, Shaobo Cui, Guodun Li, Xiaoming Kang, Feng Ji, Feng-Lin Li, Zhongzhou Zhao, Haiqing Chen, Yin Zhang
And the arbitrator decides whether to wait or to make a response to the user directly.
1 code implementation • COLING 2020 • Wenpeng Hu, Mengyu Wang, Bing Liu, Feng Ji, Haiqing Chen, Dongyan Zhao, Jinwen Ma, Rui Yan
The key idea of the proposed approach is to use a Forward Transformation to transform dense representations to sparse representations.
1 code implementation • 7 Nov 2019 • Zhenxin Fu, Feng Ji, Wenpeng Hu, Wei Zhou, Dongyan Zhao, Haiqing Chen, Rui Yan
Information-seeking conversation systems aim to satisfy the information needs of users through conversations.
no code implementations • IJCNLP 2019 • Zehao Lin, Xinjing Huang, Feng Ji, Haiqing Chen, Ying Zhang
How to incorporate external knowledge into a neural dialogue model is critically important for dialogue systems to behave like real humans.
no code implementations • 25 Sep 2019 • Wenpeng Hu, Ran Le, Bing Liu, Feng Ji, Haiqing Chen, Dongyan Zhao, Jinwen Ma, Rui Yan
Positive-unlabeled (PU) learning learns a binary classifier using only positive and unlabeled examples without labeled negative examples.
no code implementations • 20 Aug 2019 • Shuke Peng, Xinjing Huang, Zehao Lin, Feng Ji, Haiqing Chen, Yin Zhang
Dialogue systems that deal with multi-domain tasks are in high demand.
3 code implementations • ACL 2019 • Runqi Yang, Jianhai Zhang, Xing Gao, Feng Ji, Haiqing Chen
In this paper, we present a fast and strong neural approach for general purpose text matching applications.
Ranked #5 on Natural Language Inference on SciTail
1 code implementation • 27 Apr 2019 • Shiqian Chen, Chenliang Li, Feng Ji, Wei Zhou, Haiqing Chen
Then, we devise a mechanism to identify the relevant information from the noise-prone review snippets and incorporate this information to guide the answer generation.
no code implementations • 25 Feb 2019 • Feng Ji, Jielong Yang, Qiang Zhang, Wee Peng Tay
In view of the huge success of convolutional neural networks (CNNs) for image classification and object recognition, there have been attempts to generalize the method to general graph-structured data.
no code implementations • 30 Dec 2018 • Chen Qu, Feng Ji, Minghui Qiu, Liu Yang, Zhiyu Min, Haiqing Chen, Jun Huang, W. Bruce Croft
Specifically, the data selector "acts" on the source domain data to find a subset for optimization of the TL model, and the performance of the TL model can provide "rewards" in turn to update the selector.
no code implementations • 20 Oct 2018 • Xin Tang, Shanbo Cheng, Loc Do, Zhiyu Min, Feng Ji, Heng Yu, Ji Zhang, Haiqin Chen
Our approach extends a basic monolingual STS framework to a shared multilingual encoder pretrained on a translation task, in order to incorporate rich-resource language data.
1 code implementation • ACL 2018 • Chenliang Li, Wei Zhou, Feng Ji, Yu Duan, Haiqing Chen
In the era of big data, focused analysis of diverse topics with a short response time has become an urgent demand.
no code implementations • ACL 2018 • Minghui Qiu, Liu Yang, Feng Ji, Weipeng Zhao, Wei Zhou, Jun Huang, Haiqing Chen, W. Bruce Croft, Wei. Lin
Building multi-turn information-seeking conversation systems is an important and challenging research topic.
no code implementations • 1 May 2018 • Zheng Zhang, Minlie Huang, Zhongzhou Zhao, Feng Ji, Haiqing Chen, Xiaoyan Zhu
Dialogue management (DM) decides the next action of a dialogue system according to the current dialogue state, and thus plays a central role in task-oriented dialogue systems.