no code implementations • 20 May 2013 • Xiangnan Kong, Bokai Cao, Philip S. Yu, Ying Ding, David J. Wild
Moreover, by considering different linkage paths in the network, one can capture the subtlety of different types of dependencies among objects.
no code implementations • CVPR 2013 • Mingsheng Long, Guiguang Ding, Jian-Min Wang, Jiaguang Sun, Yuchen Guo, Philip S. Yu
In this paper, we propose a Transfer Sparse Coding (TSC) approach to construct robust sparse representations for classifying cross-distribution images accurately.
no code implementations • 26 Sep 2013 • Zhung-Xun Liao, Shou-Chung Li, Wen-Chih Peng, Philip S. Yu
By analyzing real App usage log data, we discover two kinds of features: the Explicit Feature (EF) from the readings of built-in sensors, and the Implicit Feature (IF) from App usage relations.
no code implementations • 28 Sep 2013 • Chuan Shi, Xiangnan Kong, Yue Huang, Philip S. Yu, Bin Wu
Similarity search, which usually focuses on measuring the similarity between objects of the same type, is an important function in many applications.
no code implementations • 4 Oct 2013 • Weixiang Shao, Xiaoxiao Shi, Philip S. Yu
The idea is to collectively complete the kernel matrices of incomplete datasets by optimizing the alignment of the shared instances of the datasets.
no code implementations • 13 Oct 2013 • Jiawei Zhang, Xiangnan Kong, Philip S. Yu
We propose a link prediction method called SCAN-PS (Supervised Cross Aligned Networks link prediction with Personalized Sampling), to solve the link prediction problem for new users with information transferred from both the existing active users in the target network and other source networks through aligned accounts.
no code implementations • 16 Oct 2013 • Sihong Xie, Xiangnan Kong, Jing Gao, Wei Fan, Philip S. Yu
Nonetheless, data nowadays are usually multi-labeled, such that more than one label has to be predicted at the same time.
no code implementations • CVPR 2014 • Mingsheng Long, Jian-Min Wang, Guiguang Ding, Jiaguang Sun, Philip S. Yu
Visual domain adaptation, which learns an accurate classifier for a new domain using labeled images from an old domain, has shown promising value in computer vision yet remains a challenging problem.
no code implementations • 6 Jul 2014 • Xiangnan Kong, Zhaoming Wu, Li-Jia Li, Ruofei Zhang, Philip S. Yu, Hang Wu, Wei Fan
Unlike prior works, our method can effectively and efficiently consider missing labels and label correlations simultaneously, and is highly scalable, with time complexity linear in the size of the data.
no code implementations • 31 Jul 2014 • Lifang He, Xiangnan Kong, Philip S. Yu, Ann B. Ragin, Zhifeng Hao, Xiaowei Yang
The dual-tensorial mapping function can map each tensor instance in the input space to another tensor in the feature space while preserving the tensorial structure.
1 code implementation • 3 Jun 2015 • Bokai Cao, Hucheng Zhou, Guoqiang Li, Philip S. Yu
In this paper, we propose a general predictor, named multi-view machines (MVMs), that can effectively include all the possible interactions between features from multiple views.
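The full-order cross-view interaction at the heart of MVMs can be sketched compactly: appending a constant 1 to each view lets a single factorized product cover interactions of every order between features from different views. This is an illustrative sketch, with shapes and names assumed rather than taken from the released implementation.

```python
def mvm_predict(views, factors):
    """Multi-view machines sketch.

    views:   list of feature vectors, one per view.
    factors: per-view factor matrices U_v of shape (d_v + 1, k); the extra
             row corresponds to the appended constant-1 feature, which is
             what makes lower-order interactions fall out of the product.
    Score = sum over factors f of the product over views of <[x_v; 1], U_v[:, f]>.
    """
    k = len(factors[0][0])
    score = 0.0
    for f in range(k):
        prod = 1.0
        for x, U in zip(views, factors):
            z = list(x) + [1.0]  # append constant 1 to the view
            prod *= sum(zi * U[i][f] for i, zi in enumerate(z))
        score += prod
    return score
```

With two one-feature views and a single factor, the score reduces to the product of the per-view inner products, e.g. `mvm_predict([[2.0], [3.0]], [[[1.0], [0.0]], [[1.0], [0.0]]])` gives `6.0`.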
no code implementations • NeurIPS 2017 • Mingsheng Long, Zhangjie Cao, Jian-Min Wang, Philip S. Yu
Deep networks trained on large-scale data can learn transferable features to promote learning multiple tasks.
no code implementations • 5 Aug 2015 • Bokai Cao, Xiangnan Kong, Philip S. Yu
Brain disorder data poses many unique challenges for data mining research.
no code implementations • 19 Aug 2015 • Bokai Cao, Xiangnan Kong, Jingyuan Zhang, Philip S. Yu, Ann B. Ragin
In this paper, we study the problem of discriminative subgraph selection using multiple side views and propose a novel solution to find an optimal set of subgraph features for graph classification by exploring a plurality of side views.
no code implementations • 22 Feb 2016 • Yue Cao, Mingsheng Long, Jian-Min Wang, Philip S. Yu
This paper presents a Correlation Hashing Network (CHN) approach to cross-modal hashing, which jointly learns good data representation tailored to hash coding and formally controls the quantization error.
no code implementations • 3 Apr 2016 • Jiawei Zhang, Xiao Pan, Moyin Li, Philip S. Yu
In bicycle-sharing systems, people can borrow and return bikes at any station in the service region very conveniently.
no code implementations • 14 Apr 2016 • Weixiang Shao, Jiawei Zhang, Lifang He, Philip S. Yu
In many real-world applications, information can be gathered from multiple sources, while each source can contain multiple views, which are more cohesive for learning.
no code implementations • 11 Aug 2016 • Chenwei Zhang, Sihong Xie, Yaliang Li, Jing Gao, Wei Fan, Philip S. Yu
We propose a novel multi-source hierarchical prediction consolidation method to effectively exploit the complicated hierarchical label structures to resolve the noisy and conflicting information that inherently originates from multiple imperfect sources.
no code implementations • 27 Sep 2016 • Weixiang Shao, Lifang He, Chun-Ta Lu, Xiaokai Wei, Philip S. Yu
Third, how to leverage the consistent and complementary information from different views to improve the feature selection in the situation when the data are too big or come in as streams?
no code implementations • 2 Nov 2016 • Weixiang Shao, Lifang He, Chun-Ta Lu, Philip S. Yu
We model the multi-view clustering problem as a joint weighted nonnegative matrix factorization problem and process the multi-view data chunk by chunk to reduce the memory requirement.
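The factorization component underlying this approach is standard nonnegative matrix factorization; a minimal sketch of Lee-Seung multiplicative updates (without the joint weighting or chunk-by-chunk processing the paper adds) might look like:

```python
import random

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def transpose(A):
    return [list(r) for r in zip(*A)]

def nmf(X, k, iters=200, seed=0, eps=1e-9):
    """Multiplicative-update NMF: X (n x m) ~ W (n x k) @ H (k x m), all >= 0."""
    rng = random.Random(seed)
    n, m = len(X), len(X[0])
    W = [[rng.random() + 0.1 for _ in range(k)] for _ in range(n)]
    H = [[rng.random() + 0.1 for _ in range(m)] for _ in range(k)]
    for _ in range(iters):
        # H <- H * (W^T X) / (W^T W H): keeps H nonnegative by construction
        WtX = matmul(transpose(W), X)
        WtWH = matmul(matmul(transpose(W), W), H)
        H = [[H[i][j] * WtX[i][j] / (WtWH[i][j] + eps) for j in range(m)] for i in range(k)]
        # W <- W * (X H^T) / (W H H^T)
        XHt = matmul(X, transpose(H))
        WHHt = matmul(W, matmul(H, transpose(H)))
        W = [[W[i][j] * XHt[i][j] / (WHHt[i][j] + eps) for j in range(k)] for i in range(n)]
    return W, H
```

On an exactly rank-1 nonnegative matrix such as `[[1, 3], [2, 6]]` with `k=1`, the reconstruction `W @ H` recovers the input almost exactly.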
no code implementations • 4 Dec 2016 • Hu Xu, Sihong Xie, Lei Shu, Philip S. Yu
One important product feature is the complementary entity (products) that may potentially work together with the reviewed product.
no code implementations • 14 Dec 2016 • Hu Xu, Lei Shu, Jingyuan Zhang, Philip S. Yu
In this paper, we address the problem of extracting compatible and incompatible products from yes/no questions in PCQA.
5 code implementations • 17 Jan 2017 • Lei Zheng, Vahid Noroozi, Philip S. Yu
One of the networks focuses on learning user behaviors exploiting reviews written by the user, and the other one learns item properties from the reviews written for the item.
2 code implementations • ICCV 2017 • Zhangjie Cao, Mingsheng Long, Jian-Min Wang, Philip S. Yu
Learning to hash has been widely applied to approximate nearest neighbor search for large-scale multimedia retrieval, due to its computation efficiency and retrieval quality.
no code implementations • 10 Apr 2017 • Chun-Ta Lu, Lifang He, Hao Ding, Bokai Cao, Philip S. Yu
Real-world relations among entities can often be observed and determined by different perspectives/views.
no code implementations • 2 May 2017 • Xiaokai Wei, Bokai Cao, Philip S. Yu
In this paper, we study unsupervised feature selection for multi-view data, as class labels are usually expensive to obtain.
no code implementations • 29 May 2017 • Hu Xu, Lei Shu, Philip S. Yu
Extracting opinion targets is an important task in sentiment analysis on product reviews and complementary entities (products) are one important type of opinion targets that may work together with the reviewed product.
1 code implementation • 5 Jun 2017 • Wei Wu, Bin Li, Ling Chen, Chengqi Zhang, Philip S. Yu
Min-Hash is a popular technique for efficiently estimating the Jaccard similarity of binary sets.
Data Structures and Algorithms
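As a refresher on the baseline this paper builds on, here is a minimal MinHash sketch; the seeded linear hash is an illustrative stand-in for the random permutations the theory assumes.

```python
import random

def minhash_signature(items, hash_seeds):
    """MinHash signature: for each seeded hash function, keep the minimum
    hash value over the set's elements (a proxy for a random permutation)."""
    p = 2**31 - 1
    sig = []
    for seed in hash_seeds:
        rnd = random.Random(seed)
        a, b = rnd.randrange(1, p), rnd.randrange(0, p)
        sig.append(min((a * hash(x) + b) % p for x in items))
    return sig

def estimate_jaccard(sig_a, sig_b):
    """The fraction of hash functions whose minima collide estimates J(A, B)."""
    return sum(x == y for x, y in zip(sig_a, sig_b)) / len(sig_a)
```

For example, with `A = set(range(100))` and `B = set(range(50, 150))`, the true Jaccard similarity is 1/3, and a 256-hash signature yields an estimate close to that. Integer elements are used here because Python string hashes are salted per process.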
1 code implementation • Proceedings of the VLDB Endowment 2017 • Aoqian Zhang, Shaoxu Song, Jian-Min Wang, Philip S. Yu
Instead of simply discarding anomalies, we propose to (iteratively) repair them in time series data, by combining the temporal nature of anomaly detection with the widely considered minimum change principle in data repairing.
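As a toy illustration of the minimum change principle (not the paper's algorithm), a greedy speed-constraint repair clamps each point into the range reachable from its repaired predecessor and leaves in-range points untouched:

```python
def repair_speed(values, times, smax):
    """Greedy minimal-change repair under a maximum-speed constraint.

    Each point must lie within smax * dt of the previous (repaired) point;
    points already satisfying the constraint are kept unchanged, so only
    anomalous spikes are altered, and only by the minimum needed amount.
    """
    repaired = [values[0]]
    for i in range(1, len(values)):
        dt = times[i] - times[i - 1]
        lo = repaired[-1] - smax * dt
        hi = repaired[-1] + smax * dt
        repaired.append(min(max(values[i], lo), hi))
    return repaired
```

For instance, with `smax=1.0` the spike in `[10, 10.5, 50, 11.5, 12]` at unit time steps is pulled down to the nearest feasible value (11.5) while every clean point survives verbatim.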
no code implementations • 12 Jun 2017 • Vahid Noroozi, Lei Zheng, Sara Bahaadini, Sihong Xie, Philip S. Yu
The model consists of two complementary components.
no code implementations • CVPR 2017 • Lifang He, Chun-Ta Lu, Hao Ding, Shen Wang, Linlin Shen, Philip S. Yu, Ann B. Ragin
Owing to its prominence as a diagnostic tool for probing the neural correlates of cognition, neuroimaging tensor data has been the focus of intense investigation.
no code implementations • ICML 2017 • Lifang He, Chun-Ta Lu, Guixiang Ma, Shen Wang, Linlin Shen, Philip S. Yu, Ann B. Ragin
In the context of supervised tensor learning, preserving the structural information and exploiting the discriminative nonlinear relationships of tensor data are crucial for improving the performance of learning tasks.
no code implementations • 12 Sep 2017 • Guixiang Ma, Chun-Ta Lu, Lifang He, Philip S. Yu, Ann B. Ragin
Specifically, we propose an auto-weighted framework of Multi-view Graph Embedding with Hub Detection (MVGE-HD) for brain network analysis.
no code implementations • 13 Sep 2017 • Bokai Cao, Mia Mao, Siim Viidu, Philip S. Yu
On electronic game platforms, different payment transactions have different levels of risk.
no code implementations • 22 Oct 2017 • Chenwei Zhang, Nan Du, Wei Fan, Yaliang Li, Chun-Ta Lu, Philip S. Yu
The healthcare status and complex medical information needs of patients are expressed diversely and implicitly in their medical text queries.
no code implementations • 7 Nov 2017 • Lichao Sun, Xiaokai Wei, Jiawei Zhang, Lifang He, Philip S. Yu, Witawas Srisa-an
The results indicate that once we remove contaminants from the datasets, we can significantly improve both malware detection rate and detection accuracy.
Cryptography and Security
no code implementations • 26 Nov 2017 • Jiawei Zhang, Congying Xia, Chenwei Zhang, Limeng Cui, Yanjie Fu, Philip S. Yu
The closeness among users in the networks is defined via meta proximity scores, which will be fed into DIME to learn the embedding vectors of users in the emerging network.
Social and Information Networks • Databases
1 code implementation • 29 Nov 2017 • Chuan Shi, Binbin Hu, Wayne Xin Zhao, Philip S. Yu
In this paper, we propose a novel heterogeneous network embedding based approach for HIN based recommendation, called HERec.
Social and Information Networks
no code implementations • NeurIPS 2017 • Yunbo Wang, Mingsheng Long, Jian-Min Wang, Zhifeng Gao, Philip S. Yu
The core of this network is a new Spatiotemporal LSTM (ST-LSTM) unit that extracts and memorizes spatial and temporal representations simultaneously.
Ranked #6 on Video Prediction on Human3.6M
no code implementations • 6 Dec 2017 • Hu Xu, Sihong Xie, Lei Shu, Philip S. Yu
Functionality is of utmost importance to customers when they purchase products.
no code implementations • 6 Dec 2017 • Hu Xu, Sihong Xie, Lei Shu, Philip S. Yu
Product compatibility and their functionality are of utmost importance to customers when they purchase products, and to sellers and manufacturers when they sell products.
no code implementations • 25 Dec 2017 • Jindong Wang, Yiqiang Chen, Lisha Hu, Xiaohui Peng, Philip S. Yu
The proposed framework, referred to as Stratified Transfer Learning (STL), can dramatically improve the classification accuracy for cross-domain activity recognition.
no code implementations • ICLR 2018 • Chenwei Zhang, Yaliang Li, Nan Du, Wei Fan, Philip S. Yu
Online healthcare services can provide the general public with ubiquitous access to medical knowledge and reduce the information access cost for both individuals and societies.
no code implementations • ICLR 2018 • Hu Xu, Bing Liu, Lei Shu, Philip S. Yu
We observe that domains are not isolated and a small domain corpus can leverage the learned knowledge from many past domains to augment that corpus in order to generate high-quality embeddings.
no code implementations • 1 Jan 2018 • Mehrnaz Najafi, Lifang He, Philip S. Yu
Various types of errors behave differently and inconsistently in each view.
no code implementations • 19 Jan 2018 • Shaika Chowdhury, Chenwei Zhang, Philip S. Yu
Social media has grown to be a crucial information source for pharmacovigilance studies where an increasing number of people post adverse reactions to medical drugs that are previously unreported.
1 code implementation • 12 Feb 2018 • Lichao Sun, Weiran Huang, Philip S. Yu, Wei Chen
In this paper, we study the Multi-Round Influence Maximization (MRIM) problem, where influence propagates in multiple rounds independently from possibly different seed sets, and the goal is to select seeds for each round to maximize the expected number of nodes that are activated in at least one round.
Social and Information Networks
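The single-round building block of MRIM is the classic independent cascade model; a hedged simulation sketch follows (the multi-round variant would union activations across independently seeded rounds):

```python
import random

def independent_cascade(graph, seeds, p, rng):
    """One run of the Independent Cascade model.

    graph: dict mapping node -> list of out-neighbors. Each newly activated
    node gets exactly one chance to activate each inactive neighbor,
    succeeding independently with probability p.
    """
    active = set(seeds)
    frontier = list(seeds)
    while frontier:
        next_frontier = []
        for u in frontier:
            for v in graph.get(u, []):
                if v not in active and rng.random() < p:
                    active.add(v)
                    next_frontier.append(v)
        frontier = next_frontier
    return active

def expected_spread(graph, seeds, p, runs=1000, seed=0):
    """Monte Carlo estimate of the expected number of activated nodes."""
    rng = random.Random(seed)
    return sum(len(independent_cascade(graph, seeds, p, rng)) for _ in range(runs)) / runs
```

On the chain 0 -> 1 -> 2, seeding node 0 with p=1 always activates all three nodes, with p=0 only the seed, and with p=0.5 the expected spread is 1 + 0.5 + 0.25 = 1.75.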
no code implementations • 12 Mar 2018 • He Huang, Philip S. Yu, Changhu Wang
There has been a drastic growth of research in Generative Adversarial Nets (GANs) in the past few years.
no code implementations • 23 Mar 2018 • Bokai Cao, Lei Zheng, Chenwei Zhang, Philip S. Yu, Andrea Piscitello, John Zulueta, Olu Ajilore, Kelly Ryan, Alex D. Leow
The increasing use of electronic forms of communication presents new opportunities in the study of mental health, including the ability to investigate the manifestations of psychiatric diseases unobtrusively and in the setting of patients' daily lives.
11 code implementations • ICML 2018 • Yunbo Wang, Zhifeng Gao, Mingsheng Long, Jian-Min Wang, Philip S. Yu
We present PredRNN++, an improved recurrent network for video predictive learning.
Ranked #1 on Video Prediction on KTH (Cond metric)
2 code implementations • ACL 2018 • Hu Xu, Bing Liu, Lei Shu, Philip S. Yu
Unlike other highly sophisticated supervised deep learning models, this paper proposes a novel and yet simple CNN model employing two types of pre-trained embeddings for aspect extraction: general-purpose embeddings and domain-specific embeddings.
2 code implementations • 22 May 2018 • Jiawei Zhang, Bowen Dong, Philip S. Yu
This paper aims at investigating the principles, methodologies and algorithms for detecting fake news articles, creators and subjects from online social networks and evaluating the corresponding performance.
1 code implementation • 25 May 2018 • Hu Xu, Bing Liu, Lei Shu, Philip S. Yu
Learning high-quality domain word embeddings is important for achieving good performance in many NLP tasks.
no code implementations • 28 May 2018 • Yang Yang, Haoyan Liu, Xia Hu, Jiawei Zhang, Xiao-Ming Zhang, Zhoujun Li, Philip S. Yu
The number of missing people (i.e., people who get lost) has greatly increased in recent years.
2 code implementations • 3 Jun 2018 • Yang Yang, Lei Zheng, Jiawei Zhang, Qingcai Cui, Zhoujun Li, Philip S. Yu
By projecting the explicit and latent features into a unified feature space, TI-CNN is trained with both the text and image information simultaneously.
no code implementations • 19 Jun 2018 • Ye Liu, Lifang He, Bokai Cao, Philip S. Yu, Ann B. Ragin, Alex D. Leow
Network analysis of human brain connectivity is critically important for understanding brain function and disease states.
1 code implementation • 19 Jul 2018 • Jindong Wang, Wenjie Feng, Yiqiang Chen, Han Yu, Meiyu Huang, Philip S. Yu
Existing methods either attempt to align the cross-domain distributions, or perform manifold subspace learning.
Ranked #1 on Domain Adaptation on Office-Caltech-10
1 code implementation • 29 Aug 2018 • He Huang, Bokai Cao, Philip S. Yu, Chang-Dong Wang, Alex D. Leow
Mood disorders are common and associated with significant morbidity and mortality.
Human-Computer Interaction Computers and Society
1 code implementation • 30 Aug 2018 • Lei Zheng, Chun-Ta Lu, Fei Jiang, Jiawei Zhang, Philip S. Yu
Benefiting from the rich information of connectivity existing in the \textit{spectral domain}, SpectralCF is capable of discovering deep connections between users and items and therefore, alleviates the \textit{cold-start} problem for CF.
no code implementations • 2 Sep 2018 • Xi Zhang, Yixuan Li, Senzhang Wang, Binxing Fang, Philip S. Yu
In this work, we study how to explore multiple data sources to improve the performance of stock prediction.
6 code implementations • EMNLP 2018 • Congying Xia, Chenwei Zhang, Xiaohui Yan, Yi Chang, Philip S. Yu
User intent detection plays a critical role in question-answering and dialog systems.
1 code implementation • 4 Sep 2018 • Zhangjie Cao, Ziping Sun, Mingsheng Long, Jian-Min Wang, Philip S. Yu
Deep hashing enables image retrieval by end-to-end learning of deep representations and hash codes from training data with pairwise similarity information.
no code implementations • 10 Sep 2018 • Ji Wang, Jian-Guo Zhang, Weidong Bao, Xiaomin Zhu, Bokai Cao, Philip S. Yu
To benefit from the cloud data center without the privacy risk, we design, evaluate, and implement a cloud-based framework ARDEN which partitions the DNN across mobile devices and cloud data centers.
no code implementations • 10 Sep 2018 • Ji Wang, Bokai Cao, Philip S. Yu, Lichao Sun, Weidong Bao, Xiaomin Zhu
In this paper, we provide an overview of the current challenges and representative achievements about pushing deep learning on mobile devices from three aspects: training with mobile data, efficient inference on mobile devices, and applications of mobile deep learning.
no code implementations • 11 Sep 2018 • Jian-Guo Zhang, Ji Wang, Lifang He, Zhao Li, Philip S. Yu
Then, it is possible to utilize unlabeled data that have a potential of failure to further improve the performance of the model.
no code implementations • 11 Sep 2018 • Lichao Sun, Lifang He, Zhipeng Huang, Bokai Cao, Congying Xia, Xiaokai Wei, Philip S. Yu
Meta-graph is currently the most powerful tool for similarity search on heterogeneous information networks, where a meta-graph is a composition of meta-paths that captures the complex structural information.
no code implementations • 27 Sep 2018 • Chenwei Zhang, Yaliang Li, Nan Du, Wei Fan, Philip S. Yu
Being able to automatically discover synonymous entities from a large free-text corpus has transformative effects on structured knowledge discovery.
no code implementations • 27 Sep 2018 • Congying Xia, Chenwei Zhang, Tao Yang, Yaliang Li, Nan Du, Xian Wu, Wei Fan, Fenglong Ma, Philip S. Yu
In this paper, we focus on a new Named Entity Recognition (NER) task, i.e., the Multi-grained NER task.
1 code implementation • 14 Oct 2018 • Chen Li, Xutan Peng, Shanghang Zhang, Hao Peng, Philip S. Yu, Min He, Linfeng Du, Lihong Wang
By treating relations and multi-hop paths as two different input sources, we use a feature extractor, which is shared by two downstream components (i.e., relation classifier and source discriminator), to capture shared/similar information between them.
no code implementations • 17 Oct 2018 • Jianguo Chen, Kenli Li, Kashif Bilal, Xu Zhou, Keqin Li, Philip S. Yu
In this paper, we focus on the time-consuming training process of large-scale CNNs and propose a Bi-layered Parallel Training (BPT-CNN) architecture in distributed computing environments.
no code implementations • 17 Oct 2018 • Jianguo Chen, Kenli Li, Huigui Rong, Kashif Bilal, Keqin Li, Philip S. Yu
In this paper, a Periodicity-based Parallel Time Series Prediction (PPTSP) algorithm for large-scale time-series data is proposed and implemented in the Apache Spark cloud computing environment.
no code implementations • 18 Oct 2018 • Lifang He, Chun-Ta Lu, Yong Chen, Jiawei Zhang, Linlin Shen, Philip S. Yu, Fei Wang
In many real-world applications, data are often unlabeled and comprised of different representations/views which often provide information complementary to each other.
no code implementations • 24 Oct 2018 • Ye Liu, Jiawei Zhang, Chenwei Zhang, Philip S. Yu
After a thorough investigation of an online movie knowledge library, a novel movie planning framework "Blockbuster Planning with Maximized Movie Configuration Acquaintance" (BigMovie) is introduced in this paper.
no code implementations • 2 Nov 2018 • Guixiang Ma, Nesreen K. Ahmed, Ted Willke, Dipanjan Sengupta, Michael W. Cole, Nicholas B. Turk-Browne, Philip S. Yu
We propose an end-to-end similarity learning framework called Higher-order Siamese GCN for multi-subject fMRI data analysis.
no code implementations • 9 Nov 2018 • Shuaijun Ge, Guixiang Ma, Sihong Xie, Philip S. Yu
In terms of security, DETER is versatile enough to be vaccinated against diverse and unexpected evasions, is agnostic about evasion strategy and can be released without privacy concern.
no code implementations • 11 Nov 2018 • Jian-Guo Zhang, Pengcheng Zou, Zhao Li, Yao Wan, Ye Liu, Xiuming Pan, Yu Gong, Philip S. Yu
Nowadays, an increasing number of customers are in favor of using E-commerce Apps to browse and purchase products.
no code implementations • 11 Nov 2018 • Vahid Noroozi, Sara Bahaadini, Lei Zheng, Sihong Xie, Weixiang Shao, Philip S. Yu
While neural networks for learning representation of multi-view data have been previously proposed as one of the state-of-the-art multi-view dimension reduction techniques, how to make the representation discriminative with only a small amount of labeled data is not well-studied.
Dimensionality Reduction • Learning Representation Of Multi-View Data
no code implementations • 11 Nov 2018 • Hao Peng, Jian-Xin Li, Qiran Gong, Senzhang Wang, Yuanxing Ning, Philip S. Yu
Different from previous convolutional neural networks on graphs, we first design a motif-matching guided subgraph normalization method to capture neighborhood information.
no code implementations • 12 Nov 2018 • Yao Wan, Wenqiang Yan, Jianwei Gao, Zhou Zhao, Jian Wu, Philip S. Yu
Dialogue Act (DA) classification is a challenging problem in dialogue interpretation, which aims to attach semantic labels to utterances and characterize the speaker's intention.
Ranked #5 on Dialogue Act Classification on Switchboard corpus
1 code implementation • CVPR 2019 • He Huang, Changhu Wang, Philip S. Yu, Chang-Dong Wang
Most previous models try to learn a fixed one-directional mapping between visual and semantic space, while some recently proposed generative methods try to generate image features for unseen classes so that the zero-shot learning problem becomes a traditional fully-supervised classification problem.
no code implementations • 13 Nov 2018 • Ji Wang, Weidong Bao, Lichao Sun, Xiaomin Zhu, Bokai Cao, Philip S. Yu
To benefit from the on-device deep learning without the capacity and privacy concerns, we design a private model compression framework RONA.
2 code implementations • 17 Nov 2018 • Yao Wan, Zhou Zhao, Min Yang, Guandong Xu, Haochao Ying, Jian Wu, Philip S. Yu
To the best of our knowledge, most state-of-the-art approaches follow an encoder-decoder framework which encodes the code into a hidden space and then decodes it into natural language space, suffering from two major drawbacks: a) their encoders only consider the sequential content of code, ignoring the tree structure which is also critical for the task of code summarization; b) their decoders are typically trained to predict the next word by maximizing the likelihood of the next ground-truth word given the previous ground-truth words.
4 code implementations • CVPR 2019 • Yunbo Wang, Jianjin Zhang, Hongyu Zhu, Mingsheng Long, Jian-Min Wang, Philip S. Yu
Natural spatiotemporal processes can be highly non-stationary in many ways, e.g., the low-level non-stationarity such as spatial correlations or temporal dependencies of local pixel values; and the high-level variations such as the accumulation, deformation or dissipation of radar echoes in precipitation forecasting.
Ranked #5 on Video Prediction on Human3.6M
no code implementations • 20 Nov 2018 • Zhiyu Yao, Yunbo Wang, Mingsheng Long, Jian-Min Wang, Philip S. Yu, Jiaguang Sun
Rev2Net is shown to be effective on the classic action recognition task.
no code implementations • 10 Dec 2018 • Shen Wang, Zhengzhang Chen, Ding Li, Lu-An Tang, Jingchao Ni, Zhichun Li, Junghwan Rhee, Haifeng Chen, Philip S. Yu
The key idea is to leverage the representation learning of the heterogeneous program behavior graph to guide the reidentification process.
3 code implementations • ACL 2019 • Chenwei Zhang, Yaliang Li, Nan Du, Wei Fan, Philip S. Yu
Being able to recognize words as slots and detect the intent of an utterance has been a keen issue in natural language understanding.
Ranked #7 on Slot Filling on ATIS
1 code implementation • 26 Dec 2018 • Lichao Sun, Yingtong Dou, Carl Yang, Ji Wang, Yixin Liu, Philip S. Yu, Lifang He, Bo Li
Therefore, this review is intended to provide an overall landscape of more than 100 papers on adversarial attack and defense strategies for graph data, and establish a unified formulation encompassing most graph adversarial learning models.
1 code implementation • 31 Dec 2018 • Chenwei Zhang, Yaliang Li, Nan Du, Wei Fan, Philip S. Yu
Being able to automatically discover synonymous entities in an open-world setting benefits various tasks such as entity disambiguation or knowledge graph canonicalization.
5 code implementations • 3 Jan 2019 • Zonghan Wu, Shirui Pan, Fengwen Chen, Guodong Long, Chengqi Zhang, Philip S. Yu
In this survey, we provide a comprehensive overview of graph neural networks (GNNs) in data mining and machine learning fields.
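A representative model family covered by such a survey is the graph convolutional network; one layer of the standard propagation rule H' = ReLU(D^{-1/2}(A+I)D^{-1/2} H W) can be sketched in plain Python (dense adjacency for clarity, not efficiency):

```python
import math

def gcn_layer(adj, feats, weight):
    """One GCN layer: H' = ReLU(D^-1/2 (A + I) D^-1/2 @ H @ W)."""
    n = len(adj)
    # add self-loops: A_hat = A + I
    a_hat = [[adj[i][j] + (1.0 if i == j else 0.0) for j in range(n)] for i in range(n)]
    deg = [sum(row) for row in a_hat]
    # symmetric normalization: A_hat[i][j] / sqrt(deg[i] * deg[j])
    norm = [[a_hat[i][j] / math.sqrt(deg[i] * deg[j]) for j in range(n)] for i in range(n)]
    # aggregate neighbor (and own) features: agg = norm @ H
    d_in = len(feats[0])
    agg = [[sum(norm[i][k] * feats[k][j] for k in range(n)) for j in range(d_in)] for i in range(n)]
    # linear transform + ReLU: H' = relu(agg @ W)
    d_out = len(weight[0])
    return [[max(0.0, sum(agg[i][k] * weight[k][j] for k in range(len(weight))))
             for j in range(d_out)] for i in range(n)]
```

With no edges and an identity weight matrix, the layer reduces to ReLU applied to the input features, which makes the normalization easy to sanity-check.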
2 code implementations • 15 Jan 2019 • Zhi-Hong Deng, Ling Huang, Chang-Dong Wang, Jian-Huang Lai, Philip S. Yu
To solve this problem, many methods have been studied, which can be generally categorized into two types, i.e., representation learning-based CF methods and matching function learning-based CF methods.
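The two CF families can be contrasted in a few lines: representation-learning methods score a user-item pair by an inner product of embeddings, while matching-function methods learn the scoring function itself, e.g. a small MLP over the concatenated embeddings. The weights below are illustrative and untrained, not from any released model.

```python
def dot_score(u_emb, i_emb):
    """Representation-learning CF: score = inner product of the embeddings."""
    return sum(a * b for a, b in zip(u_emb, i_emb))

def mlp_score(u_emb, i_emb, w1, b1, w2, b2):
    """Matching-function CF: a one-hidden-layer MLP on concatenated embeddings."""
    x = u_emb + i_emb  # concatenate user and item embeddings
    hidden = [max(0.0, sum(wi * xi for wi, xi in zip(w, x)) + b)  # ReLU
              for w, b in zip(w1, b1)]
    return sum(wi * hi for wi, hi in zip(w2, hidden)) + b2
```

With `u = [1, 2]` and `i = [3, 4]`, the dot-product score is fixed at 11, whereas the MLP score depends entirely on the learned parameters, which is exactly the flexibility the matching-function family trades for.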
1 code implementation • 3 Feb 2019 • Hu Xu, Bing Liu, Lei Shu, Philip S. Yu
Inspired by conversational reading comprehension (CRC), this paper studies a novel task of leveraging reviews as a source to build an agent that can answer multi-turn questions from potential consumers of online businesses.
1 code implementation • 14 Feb 2019 • Binhang Yuan, Chen Wang, Chen Luo, Fei Jiang, Mingsheng Long, Philip S. Yu, Yu-An Liu
Quick detection of blade ice accretion is crucial for the maintenance of wind farms.
no code implementations • 26 Feb 2019 • Lu Bai, Lixin Cui, Yue Wang, Philip S. Yu, Edwin R. Hancock
To overcome these issues, we propose a new feature selection method using structural correlation between pairwise samples.
no code implementations • CVPR 2017 • Yunbo Wang, Mingsheng Long, Jian-Min Wang, Philip S. Yu
From the technical perspective, we introduce the spatiotemporal compact bilinear operator into video analysis tasks.
no code implementations • 7 Mar 2019 • Chaozhuo Li, Senzhang Wang, Philip S. Yu, Zhoujun Li
Specifically, we propose a MCNE model to learn compact embeddings from pre-learned node features.
no code implementations • 9 Mar 2019 • Jianping Cao, Senzhang Wang, Danyan Wen, Zhaohui Peng, Philip S. Yu, Fei-Yue Wang
HINT first models multi-sourced texts (e.g., news and tweets) as heterogeneous information networks by introducing the shared "anchor texts" to connect the comparative texts.
no code implementations • NAACL 2019 • Jian-Guo Zhang, Pengcheng Zou, Zhao Li, Yao Wan, Xiuming Pan, Yu Gong, Philip S. Yu
To address this discrepancy, previous studies mainly consider the textual information of long product titles and lack a human-like view during the training and evaluation process.
1 code implementation • NAACL 2019 • Hu Xu, Bing Liu, Lei Shu, Philip S. Yu
Since ReviewRC has limited training examples for RRC (and also for aspect-based sentiment analysis), we then explore a novel post-training approach on the popular language model BERT to enhance the performance of fine-tuning of BERT for RRC.
no code implementations • 12 Apr 2019 • Jianguo Chen, Kenli Li, Qingying Deng, Keqin Li, Philip S. Yu
We implement the proposed DIVS system and address the problems of parallel training, model synchronization, and workload balancing.
no code implementations • 9 May 2019 • Shen Wang, Zhengzhang Chen, Jingchao Ni, Xiao Yu, Zhichun Li, Haifeng Chen, Philip S. Yu
How can we address these vulnerabilities and defend GNNs against adversarial attacks?
no code implementations • 15 May 2019 • Bowen Dong, Jiawei Zhang, Chenwei Zhang, Yang Yang, Philip S. Yu
Online knowledge libraries refer to the online data warehouses that systematically organize and categorize the knowledge-based information about different kinds of concepts and entities.
no code implementations • 5 Jun 2019 • Lichao Sun, Yingbo Zhou, Ji Wang, Jia Li, Richard Sochar, Philip S. Yu, Caiming Xiong
Privacy-preserving deep learning is crucial for deploying deep neural network based solutions, especially when the model works on data that contains sensitive information.
no code implementations • 5 Jun 2019 • Lichao Sun, Albert Chen, Philip S. Yu, Wei Chen
We incorporate self activation into influence propagation and propose the self-activation independent cascade (SAIC) model: nodes may be self activated besides being selected as seeds, and influence propagates from both selected seeds and self activated nodes.
Social and Information Networks
1 code implementation • 9 Jun 2019 • Hao Peng, Jian-Xin Li, Qiran Gong, Yangqiu Song, Yuanxing Ning, Kunfeng Lai, Philip S. Yu
In this paper, we design an event meta-schema to characterize the semantic relatedness of social events and build an event-based heterogeneous information network (HIN) integrating information from external knowledge base, and propose a novel Pair-wise Popularity Graph Convolutional Network (PP-GCN) based fine-grained social event categorization model.
1 code implementation • 9 Jun 2019 • Hao Peng, Jian-Xin Li, Qiran Gong, Senzhang Wang, Lifang He, Bo Li, Lihong Wang, Philip S. Yu
In this paper, we propose a novel hierarchical taxonomy-aware and attentional graph capsule recurrent CNNs framework for large-scale multi-label text classification.
no code implementations • 11 Jun 2019 • Senzhang Wang, Jiannong Cao, Philip S. Yu
Next, we classify the existing literature based on the types of ST data, the data mining tasks, and the deep learning models, followed by the applications of deep learning for STDM in different domains including transportation, climate science, human mobility, location-based social networks, crime analysis, and neuroscience.
no code implementations • 5 Jul 2019 • Yingtong Dou, Weijian Li, Zhirong Liu, Zhenhua Dong, Jiebo Luo, Philip S. Yu
To the best of our knowledge, this is the first work that investigates the download fraud problem in mobile App markets.
no code implementations • 13 Aug 2019 • Yue Wang, Yao Wan, Chenwei Zhang, Lixin Cui, Lu Bai, Philip S. Yu
During the iterations, our model updates the parallel policies and the corresponding scenario-based regrets for agents simultaneously.
1 code implementation • 13 Aug 2019 • Ye Liu, Chenwei Zhang, Xiaohui Yan, Yi Chang, Philip S. Yu
To improve the quality and retrieval performance of the generated questions, we make two major improvements: 1) To better encode the semantics of ill-formed questions, we enrich the representation of questions with character embedding and the recent proposed contextual word embedding such as BERT, besides the traditional context-free word embeddings; 2) To make it capable to generate desired questions, we train the model with deep reinforcement learning techniques that considers an appropriate wording of the generation as an immediate reward and the correlation between generated question and answer as time-delayed long-term rewards.
1 code implementation • 10 Sep 2019 • Yuanfu Lu, Xiao Wang, Chuan Shi, Philip S. Yu, Yanfang Ye
The micro-dynamics describe the formation process of network structures in a detailed manner, while the macro-dynamics refer to the evolution pattern of the network scale.
no code implementations • 14 Sep 2019 • Chuan Shi, Xiaotian Han, Li Song, Xiao Wang, Senzhang Wang, Junping Du, Philip S. Yu
However, the characteristics of users and the properties of items may stem from different aspects, e.g., the brand-aspect and category-aspect of items.
no code implementations • 25 Sep 2019 • Lichao Sun, Yingbo Zhou, Jia Li, Richard Socher, Philip S. Yu, Caiming Xiong
Ensuring the privacy of sensitive data used to train modern machine learning models is of paramount importance in many areas of practice.
1 code implementation • Joint Conference on Lexical and Computational Semantics 2020 • Jian-Guo Zhang, Kazuma Hashimoto, Chien-Sheng Wu, Yao Wan, Philip S. Yu, Richard Socher, Caiming Xiong
Dialog state tracking (DST) is a core component in task-oriented dialog systems.
Ranked #4 on Multi-domain Dialogue State Tracking on MULTIWOZ 2.0
no code implementations • 14 Oct 2019 • Shaika Chowdhury, Chenwei Zhang, Philip S. Yu, Yuan Luo
Distributed representations have been used to support downstream tasks in healthcare recently.
no code implementations • 15 Oct 2019 • Shaika Chowdhury, Chenwei Zhang, Philip S. Yu, Yuan Luo
Predicting patient mortality is an important and challenging problem in the healthcare domain, especially for intensive care unit (ICU) patients.
no code implementations • 17 Oct 2019 • Shen Wang, Zhengzhang Chen, Xiao Yu, Ding Li, Jingchao Ni, Lu-An Tang, Jiaping Gui, Zhichun Li, Haifeng Chen, Philip S. Yu
Information systems have widely been the target of malware attacks.
1 code implementation • 18 Oct 2019 • Zhiwei Liu, Lei Zheng, Jiawei Zhang, Jiayu Han, Philip S. Yu
JSCN will simultaneously operate multi-layer spectral convolutions on different graphs, and jointly learn a domain-invariant user representation with a domain adaptive user mapping module.
1 code implementation • 4 Nov 2019 • Hu Xu, Bing Liu, Lei Shu, Philip S. Yu
Aspect-based sentiment classification (ASC) is an important task in fine-grained sentiment analysis. Deep supervised ASC approaches typically model this task as a pair-wise classification task that takes an aspect and a sentence containing the aspect and outputs the polarity of the aspect in that sentence.
no code implementations • 8 Nov 2019 • Jiahao Liu, Guixiang Ma, Fei Jiang, Chun-Ta Lu, Philip S. Yu, Ann B. Ragin
Specifically, we use graph convolutions to learn the structural and functional joint embedding, where the graph structure is defined with structural connectivity and node features are from the functional connectivity.
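The structural/functional joint-embedding idea above can be hinted at with a single graph-convolution layer: the adjacency matrix comes from structural connectivity and the node features from functional connectivity. This is a minimal sketch under those assumptions, not the authors' implementation.

```python
import numpy as np

def gcn_layer(A_struct, X_func, W):
    """One symmetric-normalized GCN layer: relu(D^-1/2 (A+I) D^-1/2 X W).

    A_struct: structural-connectivity adjacency matrix (defines the graph)
    X_func:   functional-connectivity node features
    """
    A_hat = A_struct + np.eye(A_struct.shape[0])     # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))           # symmetric normalization
    return np.maximum(0, D_inv_sqrt @ A_hat @ D_inv_sqrt @ X_func @ W)

A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)  # structural edges
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])            # functional features
W = np.ones((2, 2))
H = gcn_layer(A, X, W)
```

Each node's embedding thus mixes its functional profile with those of its structurally connected neighbors.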
1 code implementation • 23 Nov 2019 • Jianguo Chen, Philip S. Yu
However, clustering algorithms based on density peak have limited clustering effect on data with varying density distribution (VDD), equilibrium distribution (ED), and multiple domain-density maximums (MDDM), leading to the problems of sparse cluster loss and cluster fragmentation.
no code implementations • 26 Nov 2019 • Yue Wang, Chenwei Zhang, Shen Wang, Philip S. Yu, Lu Bai, Lixin Cui, Guandong Xu
We formalize networks with evolving structures as temporal networks and propose a generative link prediction model, Generative Link Sequence Modeling (GLSM), to predict future links for temporal networks.
no code implementations • 6 Dec 2019 • Shaika Chowdhury, Chenwei Zhang, Philip S. Yu, Yuan Luo
Distributed representations of medical concepts have been used to support downstream clinical tasks recently.
1 code implementation • 8 Dec 2019 • Zhiyu Yao, Yunbo Wang, Jianmin Wang, Philip S. Yu, Mingsheng Long
This paper introduces video domain generalization where most video classification networks degenerate due to the lack of exposure to the target domains of divergent distributions.
no code implementations • 25 Dec 2019 • Guixiang Ma, Nesreen K. Ahmed, Theodore L. Willke, Philip S. Yu
In many domains where data are represented as graphs, learning a similarity metric among graphs is considered a key problem, which can further facilitate various learning tasks, such as classification, clustering, and similarity search.
no code implementations • 31 Dec 2019 • Vahid Noroozi, Sara Bahaadini, Samira Sheikhi, Nooshin Mojab, Philip S. Yu
There has been a growing concern about the fairness of decision-making systems based on machine learning.
1 code implementation • 14 Jan 2020 • Zhiwei Liu, Mengting Wan, Stephen Guo, Kannan Achan, Philip S. Yu
By defining a basket entity to represent the basket intent, we can model this problem as a basket-item link prediction task in the User-Basket-Item (UBI) graph.
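The UBI framing can be illustrated with a toy graph: one basket node per session, linked to its user and its items, with candidate basket-item links scored for recommendation. The co-occurrence score below is a hypothetical stand-in for the learned embeddings the paper actually uses.

```python
def build_ubi(baskets):
    """baskets: list of (user, [items]); returns user and item adjacency of each basket node."""
    user_of, items_of = {}, {}
    for b, (user, items) in enumerate(baskets):
        user_of[b] = user
        items_of[b] = set(items)
    return user_of, items_of

def score_item(basket_id, candidate, items_of):
    """Score a candidate basket-item link by counting other baskets that
    contain the candidate together with some item of this basket."""
    score = 0
    for b, items in items_of.items():
        if b != basket_id and candidate in items and items_of[basket_id] & items:
            score += 1
    return score

user_of, items_of = build_ubi([("u1", ["milk", "bread"]),
                               ("u2", ["milk", "butter"]),
                               ("u3", ["bread", "butter"])])
s = score_item(0, "butter", items_of)
```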
1 code implementation • 18 Jan 2020 • Huanrui Luo, Ning Yang, Philip S. Yu
Particularly, as the aspect preference/quality of users/items is learned automatically, HDE is able to capture the impact of aspects that are not mentioned in reviews of a user or an item.
no code implementations • 18 Jan 2020 • Yuhui Zhao, Ning Yang, Tao Lin, Philip S. Yu
First, the existing works often assume an underlying information diffusion model, which is impractical in real world due to the complexity of information diffusion.
1 code implementation • 2 Feb 2020 • Shaoxiong Ji, Shirui Pan, Erik Cambria, Pekka Marttinen, Philip S. Yu
In this survey, we provide a comprehensive review of knowledge graphs, covering research topics on 1) knowledge graph representation learning, 2) knowledge acquisition and completion, 3) temporal knowledge graphs, and 4) knowledge-aware applications, and we summarize recent breakthroughs and prospective directions to facilitate future research.
1 code implementation • 11 Feb 2020 • Shao-Heng Ko, Hsu-Chao Lai, Hong-Han Shuai, De-Nian Yang, Wang-Chien Lee, Philip S. Yu
Shopping in VR malls has been regarded as a paradigm shift for E-commerce, but most of the conventional VR shopping platforms are designed for a single user.
no code implementations • 1 Mar 2020 • Lichao Sun, Yingbo Zhou, Philip S. Yu, Caiming Xiong
Ensuring the privacy of sensitive data used to train modern machine learning models is of paramount importance in many areas of practice.
1 code implementation • EMNLP 2020 • Xuming Hu, Chenwei Zhang, Yusong Xu, Lijie Wen, Philip S. Yu
Open relation extraction is the task of extracting open-domain relation facts from natural language sentences.
no code implementations • 22 Apr 2020 • Shoujin Wang, Liang Hu, Yan Wang, Xiangnan He, Quan Z. Sheng, Mehmet Orgun, Longbing Cao, Nan Wang, Francesco Ricci, Philip S. Yu
Recent years have witnessed the fast development of the emerging topic of Graph Learning based Recommender Systems (GLRS).
1 code implementation • Findings of the Association for Computational Linguistics 2020 • Hu Xu, Bing Liu, Lei Shu, Philip S. Yu
This paper focuses on learning domain-oriented language models driven by end tasks, which aims to combine the worlds of both general-purpose language models (such as ELMo and BERT) and domain-specific language understanding.
1 code implementation • 1 May 2020 • Zhiwei Liu, Yingtong Dou, Philip S. Yu, Yutong Deng, Hao Peng
In this paper, we introduce these inconsistencies and design a new GNN framework, $\mathsf{GraphConsis}$, to tackle the inconsistency problem: (1) for the context inconsistency, we propose to combine the context embeddings with node features, (2) for the feature inconsistency, we design a consistency score to filter the inconsistent neighbors and generate corresponding sampling probability, and (3) for the relation inconsistency, we learn a relation attention weights associated with the sampled nodes.
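The consistency-score mechanism sketched in point (2) above can be illustrated as follows: score each neighbor by embedding similarity, filter inconsistent ones, and normalize the rest into sampling probabilities. The exact score form and threshold here are assumptions, not the paper's definitions.

```python
import math

def consistency_scores(center, neighbors):
    """exp(-||h_v - h_u||^2) for each neighbor embedding: near 1 for
    consistent neighbors, near 0 for inconsistent ones."""
    return [math.exp(-sum((c - n) ** 2 for c, n in zip(center, nb)))
            for nb in neighbors]

def sampling_probs(scores, threshold=0.2):
    """Zero out inconsistent neighbors, then normalize into probabilities."""
    kept = [s if s >= threshold else 0.0 for s in scores]
    total = sum(kept)
    return [s / total for s in kept] if total > 0 else kept

scores = consistency_scores([1.0, 0.0], [[1.0, 0.1], [3.0, 3.0]])
probs = sampling_probs(scores)
```

The distant second neighbor falls below the threshold and is never sampled, which is the intended defense against camouflaged fraudsters connecting to benign nodes.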
no code implementations • SIGDIAL (ACL) 2020 • Ye Liu, Tao Yang, Zeyu You, Wei Fan, Philip S. Yu
Humans tackle reading comprehension not only based on the given context itself, but often also rely on commonsense beyond it.
1 code implementation • 17 May 2020 • Fanzhen Liu, Shan Xue, Jia Wu, Chuan Zhou, Wenbin Hu, Cecile Paris, Surya Nepal, Jian Yang, Philip S. Yu
As communities represent similar opinions, functions, and purposes, community detection is an important and extremely useful tool in both scientific inquiry and data analytics.
no code implementations • 23 May 2020 • Ting-Ting Liang, Congying Xia, Yuyu Yin, Philip S. Yu
This paper proposes a novel neural network, joint training capsule network (JTCN), for the cold start recommendation task.
no code implementations • COLING 2020 • Hu Xu, Seungwhan Moon, Honglei Liu, Pararth Shah, Bing Liu, Philip S. Yu
We study a conversational recommendation model which dynamically manages users' past (offline) preferences and current (online) requests through a structured and cumulative user memory knowledge graph, to allow for natural interactions and accurate recommendations.
1 code implementation • 10 Jun 2020 • Yingtong Dou, Guixiang Ma, Philip S. Yu, Sihong Xie
We experiment on three large review datasets using various state-of-the-art spamming and detection strategies and show that the optimization algorithm can reliably find an equilibrial detector that can robustly and effectively prevent spammers with any mixed spamming strategies from attaining their practical goal.
1 code implementation • 10 Jun 2020 • Chen Li, Xutan Peng, Hao Peng, Jian-Xin Li, Lihong Wang, Philip S. Yu, Lifang He
Recently, graph-based algorithms have drawn much attention because of their impressive success in semi-supervised setups.
2 code implementations • 23 Jun 2020 • Shen Wang, Jibing Gong, Jinlong Wang, Wenzheng Feng, Hao Peng, Jie Tang, Philip S. Yu
To address this issue, we leverage both content information and context information to learn the representation of entities via graph convolution network.
no code implementations • 4 Jul 2020 • Jianguo Chen, Kenli Li, Zhaolei Zhang, Keqin Li, Philip S. Yu
The COVID-19 pandemic caused by the SARS-CoV-2 virus has spread rapidly worldwide, leading to a global outbreak.
no code implementations • 6 Jul 2020 • Di Jin, Zhizhi Yu, Dongxiao He, Carl Yang, Philip S. Yu, Jiawei Han
Graph neural networks for HIN embeddings typically adopt a hierarchical attention (including node-level and meta-path-level attentions) to capture the information from meta-path-based neighbors.
2 code implementations • IEEE Transactions on Knowledge and Data Engineering 2020 • Zheng Wang, Xiaojun Ye, Chaokun Wang, Jian Cui, Philip S. Yu
Network embedding, aiming to project a network into a low-dimensional space, is increasingly becoming a focus of network research.
no code implementations • 10 Jul 2020 • Longbing Cao, Qiang Yang, Philip S. Yu
Financial technology (FinTech) has been playing an increasingly critical role in driving modern economies, society, technology, and many other areas.
2 code implementations • 2 Aug 2020 • Qian Li, Hao Peng, Jian-Xin Li, Congying Xia, Renyu Yang, Lichao Sun, Philip S. Yu, Lifang He
The last decade has seen a surge of research in this area due to the unprecedented success of deep learning.
no code implementations • 5 Aug 2020 • Tianqing Zhu, Dayong Ye, Wei Wang, Wanlei Zhou, Philip S. Yu
Artificial Intelligence (AI) has attracted a great deal of attention in recent years.
no code implementations • 6 Aug 2020 • Ye Liu, Shaika Chowdhury, Chenwei Zhang, Cornelia Caragea, Philip S. Yu
Unlike most other QA tasks that focus on linguistic understanding, HeadQA requires deeper reasoning involving not only knowledge extraction, but also complex reasoning with healthcare knowledge.
1 code implementation • 12 Aug 2020 • Hao Peng, Jian-Xin Li, Zheng Wang, Renyu Yang, Mingzhe Liu, Mingming Zhang, Philip S. Yu, Lifang He
As a departure from prior work, Luce organizes the house data in a heterogeneous information network (HIN) where graph nodes are house entities and attributes that are important for house price valuation.
no code implementations • 16 Aug 2020 • Dayong Ye, Tianqing Zhu, Sheng Shen, Wanlei Zhou, Philip S. Yu
To the best of our knowledge, this paper is the first to apply differential privacy to the field of multi-agent planning as a means of preserving the privacy of agents for logistic-like problems.
6 code implementations • 19 Aug 2020 • Yingtong Dou, Zhiwei Liu, Li Sun, Yutong Deng, Hao Peng, Philip S. Yu
Finally, the selected neighbors across different relations are aggregated together.
Ranked #5 on Fraud Detection on Amazon-Fraud
2 code implementations • 24 Aug 2020 • Youwei Liang, Dong Huang, Chang-Dong Wang, Philip S. Yu
To overcome this limitation, we propose a new multi-view graph learning framework, which for the first time simultaneously and explicitly models multi-view consistency and multi-view inconsistency in a unified objective function, through which the consistent and inconsistent parts of each single-view graph as well as the unified graph that fuses the consistent parts can be iteratively learned.
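The consistency/inconsistency decomposition above can be pictured with a toy example: each single-view similarity graph splits into a part shared across views and a view-specific residual, and the unified graph fuses the shared parts. The elementwise-minimum consensus below is an illustrative choice, not the paper's unified objective function.

```python
import numpy as np

def split_views(view_graphs):
    """Split each view graph into a cross-view consistent part and a
    view-specific (inconsistent) residual."""
    consensus = np.minimum.reduce(view_graphs)        # edge weight all views agree on
    residuals = [g - consensus for g in view_graphs]  # what each view adds on top
    return consensus, residuals

g1 = np.array([[0.0, 0.9], [0.9, 0.0]])  # view 1 similarity graph
g2 = np.array([[0.0, 0.8], [0.8, 0.0]])  # view 2 similarity graph
unified, inconsistent = split_views([g1, g2])
```

In the actual framework both parts are learned jointly and iteratively rather than computed in closed form as here.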
no code implementations • 30 Aug 2020 • Qingyun Sun, Hao Peng, Jian-Xin Li, Senzhang Wang, Xiangyu Dong, Liangxuan Zhao, Philip S. Yu, Lifang He
Although these attributes may change, an author's co-authors and research topics do not change frequently with time, which means that papers within a period have similar text and relation information in the academic network.
no code implementations • 14 Sep 2020 • Tao Zhang, Tianqing Zhu, Mengde Han, Jing Li, Wanlei Zhou, Philip S. Yu
Extensive experiments show that our method is able to achieve fair semi-supervised learning, and reach a better trade-off between accuracy and fairness than fair supervised learning.
no code implementations • 25 Sep 2020 • Tao Zhang, Tianqing Zhu, Jing Li, Mengde Han, Wanlei Zhou, Philip S. Yu
A set of experiments on real-world and synthetic datasets show that our method is able to use unlabeled data to achieve a better trade-off between accuracy and discrimination.
1 code implementation • 26 Sep 2020 • Ye Liu, Yao Wan, Lifang He, Hao Peng, Philip S. Yu
To promote the ability of commonsense reasoning for text generation, we propose a novel knowledge graph augmented pre-trained language generation model KG-BART, which encompasses the complex relations of concepts through the knowledge graph and produces more logical and natural sentences as output.
1 code implementation • 28 Sep 2020 • He Huang, Shunta Saito, Yuta Kikuchi, Eiichi Matsumoto, Wei Tang, Philip S. Yu
Motivated by the fact that detecting these rare relations can be critical in real-world applications, this paper introduces a novel integrated framework of classification and ranking to resolve the class imbalance problem in scene graph parsing.
no code implementations • COLING 2020 • Lichao Sun, Congying Xia, Wenpeng Yin, TingTing Liang, Philip S. Yu, Lifang He
Our studies show that mixup is a domain-independent data augmentation technique for pre-trained language models, resulting in significant performance improvement for transformer-based models.
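The mixup idea referenced above is a convex combination of example pairs and their labels. A minimal sketch on plain feature vectors (the paper's setting applies this to transformer representations; the interface here is illustrative):

```python
import random

def mixup(x1, y1, x2, y2, alpha=0.4, lam=None):
    """Mix two examples; lam ~ Beta(alpha, alpha) when not supplied."""
    if lam is None:
        lam = random.betavariate(alpha, alpha)
    x = [lam * a + (1 - lam) * b for a, b in zip(x1, x2)]  # mixed features
    y = [lam * a + (1 - lam) * b for a, b in zip(y1, y2)]  # mixed (soft) labels
    return x, y, lam

x_mix, y_mix, lam = mixup([1.0, 0.0], [1, 0], [0.0, 1.0], [0, 1], lam=0.7)
```

Training on such interpolated pairs acts as a regularizer that encourages linear behavior between training examples.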
1 code implementation • Findings of the Association for Computational Linguistics 2020 • Hoang Nguyen, Chenwei Zhang, Congying Xia, Philip S. Yu
Although recent works demonstrate that multi-level matching plays an important role in transferring learned knowledge from seen training classes to novel testing classes, they rely on a static similarity measure and overly fine-grained matching components.
1 code implementation • Findings (EMNLP) 2021 • Xuming Hu, Chenwei Zhang, Fukun Ma, Chenyao Liu, Lijie Wen, Philip S. Yu
To alleviate human efforts from obtaining large-scale annotations, Semi-Supervised Relation Extraction methods aim to leverage unlabeled data in addition to learning from limited samples.
no code implementations • 13 Oct 2020 • Yue Wang, Zhuo Xu, Lu Bai, Yao Wan, Lixin Cui, Qian Zhao, Edwin R. Hancock, Philip S. Yu
To verify the effectiveness of our proposed method, we conduct extensive experiments on four real-world datasets as well as compare our method with state-of-the-art methods.
1 code implementation • 22 Oct 2020 • Zhiwei Liu, Xiaohan Li, Ziwei Fan, Stephen Guo, Kannan Achan, Philip S. Yu
The problem of basket recommendation (BR) is to recommend a ranking list of items to the current basket.
1 code implementation • EMNLP 2020 • Jian-Guo Zhang, Kazuma Hashimoto, Wenhao Liu, Chien-Sheng Wu, Yao Wan, Philip S. Yu, Richard Socher, Caiming Xiong
Intent detection is one of the core components of goal-oriented dialog systems, and detecting out-of-scope (OOS) intents is also a practically important skill.
2 code implementations • COLING 2020 • Hu Xu, Lei Shu, Philip S. Yu, Bing Liu
Most features in the representation of an aspect are dedicated to the fine-grained semantics of the domain (or product category) and the aspect itself, instead of carrying summarized opinions from its context.
1 code implementation • COLING 2020 • Zhongfen Deng, Hao Peng, Congying Xia, JianXin Li, Lifang He, Philip S. Yu
Review rating prediction of text reviews is a rapidly growing technology with a wide range of applications in natural language processing.
1 code implementation • 4 Nov 2020 • Zhiwei Liu, Lin Meng, Fei Jiang, Jiawei Zhang, Philip S. Yu
Stacking multiple cross-hop propagation layers and locality layers constitutes the DGCF model, which models high-order CF signals adaptively to the locality of nodes and layers.
no code implementations • 30 Nov 2020 • Xiao Wang, Deyu Bo, Chuan Shi, Shaohua Fan, Yanfang Ye, Philip S. Yu
Heterogeneous graphs (HGs), also known as heterogeneous information networks, have become ubiquitous in real-world scenarios; therefore, HG embedding, which aims to learn representations in a lower-dimensional space while preserving the heterogeneous structures and semantics for downstream tasks (e.g., node/graph classification, node clustering, link prediction), has drawn considerable attention in recent years.
no code implementations • 7 Dec 2020 • Lingjuan Lyu, Han Yu, Xingjun Ma, Chen Chen, Lichao Sun, Jun Zhao, Qiang Yang, Philip S. Yu
Besides training powerful global models, it is of paramount importance to design FL systems that have privacy guarantees and are resistant to different types of adversaries.
no code implementations • 3 Jan 2021 • Di Jin, Zhizhi Yu, Pengfei Jiao, Shirui Pan, Dongxiao He, Jia Wu, Philip S. Yu, Weixiong Zhang
We conclude with discussions of the challenges of the field and suggestions of possible directions for future research.
1 code implementation • 8 Jan 2021 • Xiaohan Li, Mengqi Zhang, Shu Wu, Zheng Liu, Liang Wang, Philip S. Yu
Here we propose Dynamic Graph Collaborative Filtering (DGCF), a novel framework leveraging dynamic graphs to capture collaborative and sequential relations of both items and users at the same time.
no code implementations • 17 Jan 2021 • Zheng Liu, Xiaohan Li, Hao Peng, Lifang He, Philip S. Yu
EHRs contain multiple entities and relations and can be viewed as a heterogeneous graph.
no code implementations • 19 Jan 2021 • Jianguo Chen, Kenli Li, Keqin Li, Philip S. Yu, Zeng Zeng
The BSDP system contains four modules: bicycle drop-off location clustering, bicycle-station graph modeling, bicycle-station location prediction, and bicycle-station layout recommendation.
no code implementations • 19 Jan 2021 • Jianguo Chen, Kenli Li, Keqin Li, Philip S. Yu, Zeng Zeng
We model the DL-PBS system from the perspective of CPS and use deep learning to predict the layout of bicycle parking spots and the dynamic demand of bicycle dispatching.
2 code implementations • 21 Jan 2021 • Yuwei Cao, Hao Peng, Jia Wu, Yingtong Dou, JianXin Li, Philip S. Yu
The complexity and streaming nature of social messages make it appealing to address social event detection in an incremental learning setting, where acquiring, preserving, and extending knowledge are major concerns.
no code implementations • 22 Jan 2021 • Ye Liu, Yao Wan, Jian-Guo Zhang, Wenting Zhao, Philip S. Yu
In this paper, we claim that the syntactic and semantic structures among natural language are critical for non-autoregressive machine translation and can further improve the performance.
3 code implementations • 27 Feb 2021 • Yixin Liu, Ming Jin, Shirui Pan, Chuan Zhou, Yu Zheng, Feng Xia, Philip S. Yu
Deep learning on graphs has attracted significant interests recently.
1 code implementation • 2 Mar 2021 • Jindong Wang, Cuiling Lan, Chang Liu, Yidong Ouyang, Tao Qin, Wang Lu, Yiqiang Chen, Wenjun Zeng, Philip S. Yu
Domain generalization deals with a challenging setting where one or several different but related domain(s) are given, and the goal is to learn a model that can generalize to an unseen test domain.
1 code implementation • 4 Mar 2021 • Fanjin Zhang, Jie Tang, Xueyi Liu, Zhenyu Hou, Yuxiao Dong, Jing Zhang, Xiao Liu, Ruobing Xie, Kai Zhuang, Xu Zhang, Leyu Lin, Philip S. Yu
"Top Stories" is a novel friend-enhanced recommendation engine in WeChat, in which users can read articles based on preferences of both their own and their friends.
1 code implementation • 10 Mar 2021 • Zi-Yuan Hu, Jin Huang, Zhi-Hong Deng, Chang-Dong Wang, Ling Huang, Jian-Huang Lai, Philip S. Yu
Representation learning tries to learn a common low-dimensional space for the representations of users and items.
2 code implementations • 14 Mar 2021 • Hongsheng Hu, Zoran Salcic, Lichao Sun, Gillian Dobbie, Philip S. Yu, Xuyun Zhang
In recent years, MIAs have been shown to be effective on various ML models, e.g., classification models and generative models.
3 code implementations • 17 Mar 2021 • Yunbo Wang, Haixu Wu, Jianjin Zhang, Zhifeng Gao, Jianmin Wang, Philip S. Yu, Mingsheng Long
This paper models these structures by presenting PredRNN, a new recurrent network, in which a pair of memory cells are explicitly decoupled, operate in nearly independent transition manners, and finally form unified representations of the complex environment.
Ranked #1 on Video Prediction on KTH (Cond metric)
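The "explicitly decoupled" memory cells can be hinted at with the form of a decoupling regularizer: penalize the cosine similarity between the increments of the two memory cells so they learn non-redundant transitions. This is a sketch of the idea on plain vectors; the real model applies it per channel inside the recurrent unit.

```python
import math

def decouple_loss(delta_c, delta_m, eps=1e-8):
    """|cosine similarity| between the two memory increments; zero when the
    increments are orthogonal, i.e. the cells carry non-overlapping dynamics."""
    dot = sum(a * b for a, b in zip(delta_c, delta_m))
    nc = math.sqrt(sum(a * a for a in delta_c)) + eps
    nm = math.sqrt(sum(b * b for b in delta_m)) + eps
    return abs(dot) / (nc * nm)

orthogonal = decouple_loss([1.0, 0.0], [0.0, 1.0])  # decoupled increments
parallel = decouple_loss([1.0, 0.0], [2.0, 0.0])    # redundant increments
```

Minimizing this term pushes the pair of cells toward the "nearly independent transition manners" described above.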
no code implementations • 27 Mar 2021 • Mehrnaz Najafi, Philip S. Yu
In this paper, we propose a novel Robust Graph Convolutional Neural Network for possibly erroneous single-view or multi-view data, where data may come from multiple sources.
no code implementations • 30 Mar 2021 • Nooshin Mojab, Vahid Noroozi, Abdullah Aleem, Manoj P. Nallabothula, Joseph Baker, Dimitri T. Azar, Mark Rosenblatt, RV Paul Chan, Darvin Yi, Philip S. Yu, Joelle A. Hallak
In this paper, we present a new multi-modal longitudinal ophthalmic imaging dataset, the Illinois Ophthalmic Database Atlas (I-ODA), with the goal of advancing state-of-the-art computer vision applications in ophthalmology, and improving upon the translatable capacity of AI based applications across different clinical settings.
1 code implementation • 2 Apr 2021 • Hao Peng, JianXin Li, Yangqiu Song, Renyu Yang, Rajiv Ranjan, Philip S. Yu, Lifang He
Third, we propose a streaming social event detection and evolution discovery framework for HINs based on meta-path similarity search, historical information about meta-paths, and heterogeneous DBSCAN clustering method.
no code implementations • 6 Apr 2021 • Li Sun, Zhongbao Zhang, Jiawei Zhang, Feiyang Wang, Hao Peng, Sen Su, Philip S. Yu
To model the uncertainty, we devise a hyperbolic graph variational autoencoder built upon the proposed TGNN to generate stochastic node representations of hyperbolic normal distributions.
1 code implementation • NAACL 2021 • Zhongfen Deng, Hao Peng, Dongxiao He, JianXin Li, Philip S. Yu
The second one encourages the structure encoder to learn better representations with desired characteristics for all labels which can better handle label imbalance in hierarchical text classification.
1 code implementation • 14 Apr 2021 • Chaoyang He, Keshav Balasubramanian, Emir Ceyani, Carl Yang, Han Xie, Lichao Sun, Lifang He, Liangwei Yang, Philip S. Yu, Yu Rong, Peilin Zhao, Junzhou Huang, Murali Annavaram, Salman Avestimehr
FedGraphNN is built on a unified formulation of graph FL and contains a wide range of datasets from different domains, popular GNN models, and FL algorithms, with secure and efficient system support.
no code implementations • 16 Apr 2021 • Yu Wang, Lifu Huang, Philip S. Yu, Lichao Sun
Membership inference attacks (MIAs) infer whether a specific data record is used for target model training.
1 code implementation • 16 Apr 2021 • JianXin Li, Hao Peng, Yuwei Cao, Yingtong Dou, Hekai Zhang, Philip S. Yu, Lifang He
Furthermore, they cannot fully capture the content-based correlations between nodes, as they either do not use the self-attention mechanism or only use it to consider the immediate neighbors of each node, ignoring the higher-order neighbors.
1 code implementation • 16 Apr 2021 • Hao Peng, Ruitong Zhang, Yingtong Dou, Renyu Yang, Jingyi Zhang, Philip S. Yu
To avoid the embedding over-assimilation among different types of nodes, we employ a label-aware neural similarity measure to ascertain the most similar neighbors based on node attributes.
Ranked #3 on Node Classification on Amazon-Fraud
2 code implementations • 25 Apr 2021 • Yingtong Dou, Kai Shu, Congying Xia, Philip S. Yu, Lichao Sun
The majority of existing fake news detection algorithms focus on mining news content and/or the surrounding exogenous context for discovering deceptive signals; while the endogenous preference of a user when he/she decides to spread a piece of fake news or not is ignored.
Ranked #1 on Graph Classification on UPFD-GOS
1 code implementation • 2 May 2021 • Zhiwei Liu, Ziwei Fan, Yu Wang, Philip S. Yu
We first pre-train a transformer with sequences in a reverse direction to predict prior items.
1 code implementation • 7 May 2021 • Gongxu Luo, JianXin Li, Jianlin Su, Hao Peng, Carl Yang, Lichao Sun, Philip S. Yu, Lifang He
Based on them, we design MinGE to directly calculate the ideal node embedding dimension for any graph.
no code implementations • 7 May 2021 • Mehrnaz Najafi, Lifang He, Philip S. Yu
Due to inevitable sensor failures, data in each view may contain errors.
1 code implementation • 13 May 2021 • Shoujin Wang, Liang Hu, Yan Wang, Xiangnan He, Quan Z. Sheng, Mehmet A. Orgun, Longbing Cao, Francesco Ricci, Philip S. Yu
Recent years have witnessed the fast development of the emerging topic of Graph Learning based Recommender Systems (GLRS).
1 code implementation • 22 May 2021 • JianXin Li, Xingcheng Fu, Hao Peng, Senzhang Wang, Shijie Zhu, Qingyun Sun, Philip S. Yu, Lifang He
With the prevalence of graph data in real-world applications, many methods have been proposed in recent years to learn high-quality graph embedding vectors for various types of graphs.
no code implementations • 26 May 2021 • Xing Su, Shan Xue, Fanzhen Liu, Jia Wu, Jian Yang, Chuan Zhou, Wenbin Hu, Cecile Paris, Surya Nepal, Di Jin, Quan Z. Sheng, Philip S. Yu
A community reveals the features and connections of its members that are different from those in other communities in a network.
1 code implementation • 8 Jun 2021 • Siddharth Bhatia, Mohit Wadhwa, Kenji Kawaguchi, Neil Shah, Philip S. Yu, Bryan Hooi
This higher-order sketch has the useful property of preserving the dense subgraph structure (dense subgraphs in the input turn into dense submatrices in the data structure).
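Why hashing both endpoints of each edge into a small count matrix preserves dense subgraphs can be shown with a toy version of the idea (our own illustration, not the paper's data structure): nodes of a dense cluster collide into a few buckets, so their edges concentrate in a dense submatrix of the sketch.

```python
def edge_sketch(edges, num_buckets=4, seed=7):
    """Hash each edge's endpoints into buckets and count edges per bucket pair."""
    bucket = lambda v: hash((seed, v)) % num_buckets
    counts = [[0] * num_buckets for _ in range(num_buckets)]
    for u, v in edges:
        counts[bucket(u)][bucket(v)] += 1  # dense subgraph -> dense submatrix
    return counts

# A 3-clique plus one stray edge: the clique's edges land among few bucket pairs
edges = [(0, 1), (1, 2), (0, 2), (5, 9)]
sketch = edge_sketch(edges)
```

The full method would use pairwise-independent hash functions and multiple sketch copies for accuracy guarantees; a single Python `hash` is used here only for brevity.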
1 code implementation • 8 Jun 2021 • JianGuo Zhang, Kazuma Hashimoto, Yao Wan, Zhiwei Liu, Ye Liu, Caiming Xiong, Philip S. Yu
Pre-trained Transformer-based models were reported to be robust in intent classification.