no code implementations • ICML 2020 • Cheng Zheng, Bo Zong, Wei Cheng, Dongjin Song, Jingchao Ni, Wenchao Yu, Haifeng Chen, Wei Wang
Graph representation learning serves as the core of important prediction tasks, ranging from product recommendation to fraud detection.
1 code implementation • 19 Nov 2022 • Chen Ling, Tanmoy Chowdhury, Junji Jiang, Junxiang Wang, Xuchao Zhang, Haifeng Chen, Liang Zhao
As the most well-known computational method of analogical reasoning, Structure-Mapping Theory (SMT) abstracts both target and base subjects into relational graphs and forms the cognitive process of analogical reasoning by finding a corresponding subgraph (i.e., correspondence) in the target graph that is aligned with the base graph.
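To make the correspondence-finding step concrete, here is a hypothetical sketch using networkx's subgraph-isomorphism matcher; the example graphs and the "rel" edge attribute are invented for illustration and are not the paper's method or data.

```python
# Illustrative sketch only: find a subgraph of the target graph whose relation
# structure is aligned with the base graph, as in Structure-Mapping.
import networkx as nx
from networkx.algorithms import isomorphism

# Hypothetical relational graphs; edge attribute "rel" encodes the relation type.
base = nx.DiGraph()
base.add_edge("sun", "planet", rel="attracts")
base.add_edge("planet", "sun", rel="revolves_around")

target = nx.DiGraph()
target.add_edge("nucleus", "electron", rel="attracts")
target.add_edge("electron", "nucleus", rel="revolves_around")
target.add_edge("electron", "photon", rel="emits")

# A correspondence is a target subgraph whose relations match the base graph
# (node identities are ignored; relation labels must agree).
matcher = isomorphism.DiGraphMatcher(
    target, base,
    edge_match=lambda e_target, e_base: e_target["rel"] == e_base["rel"],
)
for mapping in matcher.subgraph_isomorphisms_iter():
    print(mapping)  # e.g. {'nucleus': 'sun', 'electron': 'planet'}
```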
1 code implementation • 26 Oct 2022 • Tianchun Wang, Wei Cheng, Dongsheng Luo, Wenchao Yu, Jingchao Ni, Liang Tong, Haifeng Chen, Xiang Zhang
Personalized Federated Learning (PFL), which collaboratively trains a federated model while considering local clients under privacy constraints, has attracted much attention.
no code implementations • 9 May 2022 • Wei Zhu, Dongjin Song, Yuncong Chen, Wei Cheng, Bo Zong, Takehiko Mizoguchi, Cristian Lumezanu, Haifeng Chen, Jiebo Luo
Specifically, we first design an Exemplar-based Deep Neural network (ExDNN) to learn local time series representations based on their compatibility with an exemplar module, which consists of hidden parameters learned to capture a variety of normal patterns on each edge device.
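A minimal PyTorch sketch of an exemplar-compatibility encoder in this spirit, assuming a GRU window encoder and cosine similarity as the compatibility measure; the actual ExDNN design may differ.

```python
# Minimal sketch (assumptions, not the paper's exact ExDNN): encode a local
# window with a GRU and score its compatibility with learned exemplar vectors.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ExemplarEncoder(nn.Module):
    def __init__(self, n_features, hidden_dim=64, n_exemplars=16):
        super().__init__()
        self.rnn = nn.GRU(n_features, hidden_dim, batch_first=True)
        # Hidden parameters acting as exemplars of normal patterns.
        self.exemplars = nn.Parameter(torch.randn(n_exemplars, hidden_dim))

    def forward(self, x):                      # x: (batch, time, n_features)
        _, h = self.rnn(x)                     # h: (1, batch, hidden_dim)
        h = h.squeeze(0)
        # Compatibility of the window with each exemplar (cosine similarity).
        sim = F.cosine_similarity(h.unsqueeze(1), self.exemplars.unsqueeze(0), dim=-1)
        return h, sim                          # sim: (batch, n_exemplars)

enc = ExemplarEncoder(n_features=3)
hidden, compat = enc(torch.randn(8, 50, 3))
anomaly_score = 1.0 - compat.max(dim=1).values  # low compatibility -> anomalous
```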
no code implementations • 5 Feb 2022 • Xujiang Zhao, Xuchao Zhang, Wei Cheng, Wenchao Yu, Yuncong Chen, Haifeng Chen, Feng Chen
Sound Event Early Detection (SEED) is an essential task in recognizing the acoustic environments and soundscapes.
no code implementations • 24 Jan 2022 • Jurijs Nazarovs, Cristian Lumezanu, Qianying Ren, Yuncong Chen, Takehiko Mizoguchi, Dongjin Song, Haifeng Chen
In this paper, we propose an ordered time series classification framework that is robust against missing classes in the training data, i.e., during testing we can prescribe classes that are missing during training.
no code implementations • 23 Dec 2021 • Junxiang Wang, Xuchao Zhang, Bo Zong, Yanchi Liu, Wei Cheng, Jingchao Ni, Haifeng Chen, Liang Zhao
During the past several years, a surge of multi-lingual Pre-trained Language Models (PLMs) has been proposed to achieve state-of-the-art performance in many cross-lingual downstream tasks.
no code implementations • 1 Dec 2021 • Denghui Zhang, Zixuan Yuan, Yanchi Liu, Hao Liu, Fuzhen Zhuang, Hui Xiong, Haifeng Chen
Also, the word co-occurrence-guided semantic learning of pre-trained models can be greatly augmented by entity-level association knowledge.
1 code implementation • 1 Dec 2021 • Liyan Xu, Xuchao Zhang, Bo Zong, Yanchi Liu, Wei Cheng, Jingchao Ni, Haifeng Chen, Liang Zhao, Jinho D. Choi
We target the task of cross-lingual Machine Reading Comprehension (MRC) in the direct zero-shot setting, by incorporating syntactic features from Universal Dependencies (UD), and the key features we use are the syntactic relations within each sentence.
no code implementations • NeurIPS 2021 • Dongkuan Xu, Wei Cheng, Dongsheng Luo, Haifeng Chen, Xiang Zhang
The key idea of this framework is to follow the Information Bottleneck principle: reduce the mutual information between contrastive parts while keeping task-relevant information intact, at both the level of the individual module and that of the entire framework, so that the information loss during graph representation learning is minimized.
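For reference, the generic Information Bottleneck objective underlying this principle can be written as follows, where $X$ is the input, $Z$ the learned representation, $Y$ the task-relevant signal, and $\beta$ a trade-off weight; the paper's exact instantiation over contrastive parts may differ.

```latex
\min_{p(z \mid x)} \; I(X; Z) \;-\; \beta \, I(Z; Y)
```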
no code implementations • 29 Sep 2021 • Dongsheng Luo, Wei Cheng, Yingheng Wang, Dongkuan Xu, Jingchao Ni, Wenchao Yu, Xuchao Zhang, Yanchi Liu, Haifeng Chen, Xiang Zhang
How to find the desired augmentations of time series data that are meaningful for given contrastive learning tasks and datasets remains an open question.
1 code implementation • ICLR 2022 • Jingchao Ni, Wei Cheng, Zhengzhang Chen, Takayoshi Asakura, Tomoya Soma, Sho Kato, Haifeng Chen
The dilemma necessitates the adaptation of a "coarsely" pretrained model to new tasks with a few unseen "finer-grained" training labels.
no code implementations • 29 Sep 2021 • Peizhao Li, Xuchao Zhang, Ziyu Yao, Wei Cheng, Haifeng Chen, Hongfu Liu
To achieve this, we propose a machine learning approach to adapt the editorial style derived from a few exemplars to a query code snippet.
no code implementations • Findings (EMNLP) 2021 • Zeyu Li, Wei Cheng, Reema Kshetramade, John Houser, Haifeng Chen, Wei Wang
Compliments and concerns in reviews are valuable for understanding users' shopping interests and their opinions with respect to specific aspects of certain items.
1 code implementation • EMNLP 2021 • Liyan Xu, Xuchao Zhang, Xujiang Zhao, Haifeng Chen, Feng Chen, Jinho D. Choi
Recent multilingual pre-trained language models have achieved remarkable zero-shot performance, where the model is only finetuned on one source language and directly evaluated on target languages.
no code implementations • 29 Jul 2021 • Xinyang Feng, Dongjin Song, Yuncong Chen, Zhengzhang Chen, Jingchao Ni, Haifeng Chen
Next, a dual-discriminator adversarial training procedure is employed to enhance future frame prediction: an image discriminator maintains local consistency at the frame level, while a video discriminator enforces the global coherence of temporal dynamics.
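A small sketch of a dual-discriminator generator loss of this flavor; the tensor shapes and the stub critics below are assumptions for illustration, not the paper's architecture.

```python
# Illustrative sketch only: an image discriminator scores individual predicted
# frames for local consistency, while a video discriminator scores the whole
# clip for temporal coherence.
import torch
import torch.nn as nn
import torch.nn.functional as F

d_img = nn.Sequential(nn.Flatten(), nn.LazyLinear(1))   # frame-level critic (stub)
d_vid = nn.Sequential(nn.Flatten(), nn.LazyLinear(1))   # clip-level critic (stub)

def generator_adv_loss(pred):
    # pred: (batch, time, channels, H, W) predicted future frames.
    b, t, c, h, w = pred.shape
    frame_logits = d_img(pred.reshape(b * t, c, h, w))
    video_logits = d_vid(pred)
    # Non-saturating GAN loss: the generator tries to fool both critics.
    return (F.binary_cross_entropy_with_logits(frame_logits, torch.ones_like(frame_logits))
            + F.binary_cross_entropy_with_logits(video_logits, torch.ones_like(video_logits)))

loss = generator_adv_loss(torch.randn(2, 4, 3, 32, 32))
```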
no code implementations • NAACL 2021 • Xuchao Zhang, Bo Zong, Wei Cheng, Jingchao Ni, Yanchi Liu, Haifeng Chen
Measuring document similarity plays an important role in natural language processing tasks.
no code implementations • 17 May 2021 • Yuening Li, Zhengzhang Chen, Daochen Zha, Mengnan Du, Denghui Zhang, Haifeng Chen, Xia Hu
Motivated by the success of disentangled representation learning in computer vision, we study the possibility of learning semantic-rich time-series representations, which remains unexplored due to three main challenges: 1) the sequential data structure introduces complex temporal correlations and makes the latent representations hard to interpret, 2) sequential models suffer from the KL vanishing problem, and 3) interpretable semantic concepts for time series often rely on multiple factors rather than individual ones.
1 code implementation • CVPR 2021 • Liang Tong, Zhengzhang Chen, Jingchao Ni, Wei Cheng, Dongjin Song, Haifeng Chen, Yevgeniy Vorobeychik
Moreover, we observe that open-set face recognition systems are more vulnerable than closed-set systems under different types of attacks.
1 code implementation • 26 Mar 2021 • Dongsheng Luo, Wei Cheng, Jingchao Ni, Wenchao Yu, Xuchao Zhang, Bo Zong, Yanchi Liu, Zhengzhang Chen, Dongjin Song, Haifeng Chen, Xiang Zhang
We present a contrastive learning approach with data augmentation techniques to learn document representations in an unsupervised manner.
1 code implementation • 3 Mar 2021 • Yinjun Wu, Jingchao Ni, Wei Cheng, Bo Zong, Dongjin Song, Zhengzhang Chen, Yanchi Liu, Xuchao Zhang, Haifeng Chen, Susan Davidson
Forecasting on sparse multivariate time series (MTS) aims to model the predictors of future values of time series given their incomplete past, which is important for many emerging applications.
no code implementations • 1 Jan 2021 • Lichen Wang, Bo Zong, Yunyu Liu, Can Qin, Wei Cheng, Wenchao Yu, Xuchao Zhang, Haifeng Chen, Yun Fu
As texts always contain a large proportion of task-irrelevant words, accurate alignment between aspects and their sentimental descriptions is the most crucial and challenging step.
1 code implementation • 13 Nov 2020 • Dongsheng Luo, Wei Cheng, Wenchao Yu, Bo Zong, Jingchao Ni, Haifeng Chen, Xiang Zhang
Graph Neural Networks (GNNs) have been shown to be powerful tools for graph analytics.
3 code implementations • NeurIPS 2020 • Dongsheng Luo, Wei Cheng, Dongkuan Xu, Wenchao Yu, Bo Zong, Haifeng Chen, Xiang Zhang
An explanation that interprets each instance independently is not sufficient to provide a global understanding of the learned GNN model; it also limits generalizability and hinders use in the inductive setting.
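One way such a shared, inductive explainer can be realized is a single mask predictor applied to every edge; the sketch below assumes an MLP over the concatenated endpoint embeddings and is an illustration, not the released implementation.

```python
# Minimal sketch (assumed design): a shared MLP predicts an importance score for
# every edge from its endpoint embeddings, so the explainer generalizes across
# instances and can be applied to unseen graphs.
import torch
import torch.nn as nn

class SharedEdgeMaskPredictor(nn.Module):
    def __init__(self, emb_dim, hidden=64):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(2 * emb_dim, hidden),
                                 nn.ReLU(),
                                 nn.Linear(hidden, 1))

    def forward(self, node_emb, edge_index):
        # node_emb: (num_nodes, emb_dim); edge_index: (2, num_edges)
        src, dst = edge_index
        edge_feat = torch.cat([node_emb[src], node_emb[dst]], dim=-1)
        return torch.sigmoid(self.mlp(edge_feat)).squeeze(-1)   # per-edge mask in (0, 1)

pred = SharedEdgeMaskPredictor(emb_dim=16)
mask = pred(torch.randn(5, 16), torch.tensor([[0, 1, 2, 3], [1, 2, 3, 4]]))
```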
no code implementations • 26 Oct 2020 • Denghui Zhang, Yanchi Liu, Wei Cheng, Bo Zong, Jingchao Ni, Zhengzhang Chen, Haifeng Chen, Hui Xiong
Accurate air turbulence forecasting can help airlines avoid hazardous turbulence, guide the routes that keep passengers safe, maximize efficiency, and reduce costs.
no code implementations • 7 Sep 2020 • Denghui Zhang, Zixuan Yuan, Yanchi Liu, Fuzhen Zhuang, Haifeng Chen, Hui Xiong
Pre-trained language models such as BERT have achieved great success in a broad range of natural language processing tasks.
1 code implementation • 31 Aug 2020 • Zhiwei Wang, Zhengzhang Chen, Jingchao Ni, Hui Liu, Haifeng Chen, Jiliang Tang
To address these challenges, in this paper, we propose OC4Seq, a multi-scale one-class recurrent neural network for detecting anomalies in discrete event sequences.
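A single-scale sketch of a one-class recurrent objective in this spirit, assuming a GRU encoder and a Deep-SVDD-style fixed center; the actual OC4Seq operates at multiple scales and has its own design.

```python
# Illustrative single-scale sketch: embed an event sequence with a GRU and pull
# normal sequences toward a fixed center; the distance to the center then
# serves as the anomaly score at test time.
import torch
import torch.nn as nn

class OneClassSeq(nn.Module):
    def __init__(self, vocab_size, emb_dim=32, hidden_dim=64):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.register_buffer("center", torch.randn(hidden_dim))  # fixed hypersphere center

    def forward(self, events):                 # events: (batch, seq_len) of event ids
        _, h = self.rnn(self.emb(events))
        return ((h.squeeze(0) - self.center) ** 2).sum(dim=-1)   # anomaly score per sequence

model = OneClassSeq(vocab_size=100)
score = model(torch.randint(0, 100, (8, 20)))
loss = score.mean()   # training objective on normal-only sequences
```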
no code implementations • 26 Aug 2020 • Xueyuan Han, Xiao Yu, Thomas Pasquier, Ding Li, Junghwan Rhee, James Mickens, Margo Seltzer, Haifeng Chen
We introduce SIGL, a new tool for detecting malicious behavior during software installation.
no code implementations • 19 Jun 2020 • Yuening Li, Zhengzhang Chen, Daochen Zha, Kaixiong Zhou, Haifeng Jin, Haifeng Chen, Xia Hu
Outlier detection is an important data mining task with numerous practical applications such as intrusion detection, credit card fraud detection, and video surveillance.
1 code implementation • 15 May 2020 • Lei Cai, Zhengzhang Chen, Chen Luo, Jiaping Gui, Jingchao Ni, Ding Li, Haifeng Chen
Detecting anomalies in dynamic graphs is a vital task, with numerous practical applications in areas such as security, finance, and social media.
no code implementations • ICLR 2020 • Lichen Wang, Bo Zong, Qianqian Ma, Wei Cheng, Jingchao Ni, Wenchao Yu, Yanchi Liu, Dongjin Song, Haifeng Chen, Yun Fu
Inductive and unsupervised graph learning is a critical technique for predictive or information retrieval tasks where label information is difficult to obtain.
2 code implementations • International Conference on Web Search and Data Mining 2020 • Zeyu Li, Wei Cheng, Yang Chen, Haifeng Chen, Wei Wang
For this problem, existing approaches, with shallow or deep architectures, have three major drawbacks.
no code implementations • 18 Dec 2019 • Xin Dong, Jingchao Ni, Wei Cheng, Zhengzhang Chen, Bo Zong, Dongjin Song, Yanchi Liu, Haifeng Chen, Gerard de Melo
In practice, however, these two sets of reviews are notably different: users' reviews reflect a variety of items that they have bought and are hence very heterogeneous in their topics, while an item's reviews pertain only to that single item and are thus topically homogeneous.
no code implementations • 17 Oct 2019 • Shen Wang, Zhengzhang Chen, Xiao Yu, Ding Li, Jingchao Ni, Lu-An Tang, Jiaping Gui, Zhichun Li, Haifeng Chen, Philip S. Yu
Information systems have widely been the target of malware attacks.
no code implementations • 4 Oct 2019 • Lu Wang, Wenchao Yu, Wei Wang, Wei Cheng, Wei Zhang, Hongyuan Zha, Xiaofeng He, Haifeng Chen
Graph representation learning, aiming to learn low-dimensional representations which capture the geometric dependencies between nodes in the original graph, has gained increasing popularity in a variety of graph analysis tasks, including node classification and link prediction.
no code implementations • 9 May 2019 • Shen Wang, Zhengzhang Chen, Jingchao Ni, Xiao Yu, Zhichun Li, Haifeng Chen, Philip S. Yu
How can we address these vulnerabilities and defend GNNs against adversarial attacks?
no code implementations • 10 Dec 2018 • Shen Wang, Zhengzhang Chen, Ding Li, Lu-An Tang, Jingchao Ni, Zhichun Li, Junghwan Rhee, Haifeng Chen, Philip S. Yu
The key idea is to leverage the representation learning of the heterogeneous program behavior graph to guide the reidentification process.
5 code implementations • 20 Nov 2018 • Chuxu Zhang, Dongjin Song, Yuncong Chen, Xinyang Feng, Cristian Lumezanu, Wei Cheng, Jingchao Ni, Bo Zong, Haifeng Chen, Nitesh V. Chawla
Subsequently, given the signature matrices, a convolutional encoder is employed to encode the inter-sensor (time series) correlations, and an attention-based Convolutional Long Short-Term Memory (ConvLSTM) network is developed to capture the temporal patterns.
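A short NumPy sketch of the signature-matrix construction described above; the window lengths here are assumptions for illustration.

```python
# Sketch: entry (i, j) of a signature matrix summarizes the correlation between
# sensors i and j over the most recent w steps (pairwise inner products / w).
import numpy as np

def signature_matrix(x, w):
    # x: (n_sensors, T) multivariate time series; use the last w steps.
    seg = x[:, -w:]
    return seg @ seg.T / w          # (n_sensors, n_sensors)

x = np.random.randn(30, 500)
# Multi-scale signatures: one matrix per window length, stacked as channels.
sigs = np.stack([signature_matrix(x, w) for w in (10, 30, 60)], axis=0)
```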
1 code implementation • ACM SIGKDD International Conference on Knowledge Discovery & Data Mining 2018 • Wenchao Yu, Cheng Zheng, Wei Cheng, Charu C. Aggarwal, Dongjin Song, Bo Zong, Haifeng Chen, Wei Wang
The problem of network representation learning, also known as network embedding, arises in many machine learning tasks assuming that there exist a small number of variabilities in the vertex representations which can capture the "semantics" of the original network structure.
2 code implementations • ICLR 2018 • Bo Zong, Qi Song, Martin Renqiang Min, Wei Cheng, Cristian Lumezanu, Daeki Cho, Haifeng Chen
In this paper, we present a Deep Autoencoding Gaussian Mixture Model (DAGMM) for unsupervised anomaly detection.
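For reference, DAGMM scores samples with the standard GMM sample energy over the latent representation $\mathbf{z}$, with mixture weights $\hat\phi_k$, means $\hat{\boldsymbol\mu}_k$, and covariances $\hat{\boldsymbol\Sigma}_k$ produced by the estimation network; the exact regularization terms in the paper are omitted here.

```latex
E(\mathbf{z}) = -\log\!\left(\sum_{k=1}^{K}
  \hat\phi_k \,
  \frac{\exp\!\big(-\tfrac{1}{2}(\mathbf{z}-\hat{\boldsymbol\mu}_k)^{\top}\hat{\boldsymbol\Sigma}_k^{-1}(\mathbf{z}-\hat{\boldsymbol\mu}_k)\big)}
       {\sqrt{\lvert 2\pi\hat{\boldsymbol\Sigma}_k \rvert}}\right)
```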
14 code implementations • 7 Apr 2017 • Yao Qin, Dongjin Song, Haifeng Chen, Wei Cheng, Guofei Jiang, Garrison Cottrell
The nonlinear autoregressive exogenous (NARX) model, which predicts the current value of a time series based upon its previous values as well as the current and past values of multiple driving (exogenous) series, has been studied for decades.
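In its generic form (with lag order $d$, driving series $\mathbf{x}_t$, and noise $\varepsilon_t$; the paper's attention-based instantiation learns $f$ with recurrent networks), a NARX model can be written as:

```latex
y_t = f\big(y_{t-1}, \ldots, y_{t-d},\; \mathbf{x}_t, \mathbf{x}_{t-1}, \ldots, \mathbf{x}_{t-d}\big) + \varepsilon_t
```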
no code implementations • 26 Dec 2016 • Tian Guo, Zhao Xu, Xin Yao, Haifeng Chen, Karl Aberer, Koichi Funaya
Time series forecasting for streaming data plays an important role in many real applications, ranging from IoT systems and cyber-networks to industrial systems and healthcare.
2 code implementations • ACM SIGKDD International Conference on Knowledge Discovery & Data Mining 2016 • Wei Cheng, Kai Zhang, Haifeng Chen, Guofei Jiang, Zhengzhang Chen, Wei Wang
The structure and evolution of the invariance network, in particular the vanishing correlations, can shed important light on locating causal anomalies and performing diagnosis.
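An illustrative sketch of the underlying idea, not the paper's ranking algorithm: build an invariance network from strongly correlated metric pairs on reference data, then flag the edges whose correlation collapses at test time; the thresholds below are arbitrary assumptions.

```python
# Sketch: detect "vanishing correlations" in a pairwise invariance network.
import numpy as np

def invariance_edges(ref, thresh=0.8):
    # ref: (n_metrics, T_ref) reference (normal-period) measurements.
    c = np.corrcoef(ref)
    i, j = np.where(np.triu(np.abs(c) > thresh, k=1))
    return list(zip(i, j))            # strongly correlated pairs = invariants

def vanished(edges, cur, drop=0.4):
    # cur: (n_metrics, T_cur) current-window measurements.
    c = np.corrcoef(cur)
    return [(i, j) for i, j in edges if abs(c[i, j]) < drop]

ref = np.random.randn(10, 400)
cur = np.random.randn(10, 100)
broken = vanished(invariance_edges(ref), cur)   # candidate edges near causal anomalies
```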