no code implementations • 27 Aug 2023 • Zhining Liu, Zhichen Zeng, Ruizhong Qiu, Hyunsik Yoo, David Zhou, Zhe Xu, Yada Zhu, Kommy Weldemariam, Jingrui He, Hanghang Tong
Class imbalance is prevalent in real-world node classification tasks and often biases graph learning models toward majority classes.
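The majority-class bias described above is commonly countered with class-reweighted losses. A minimal sketch under that generic assumption (inverse-frequency weighting; this is an illustration, not this paper's method):

```python
from collections import Counter
import math

# Hypothetical illustration, not this paper's method: inverse-frequency class
# weighting is one generic way to counter majority-class bias in a
# node-classification loss.
labels = [0, 0, 0, 0, 1, 1, 2]                      # imbalanced node labels
counts = Counter(labels)
n, k = len(labels), len(counts)
# Rarer classes receive larger weights: w_c = n / (k * count_c).
weights = {c: n / (k * counts[c]) for c in counts}

def weighted_nll(probs, label):
    """Class-weighted negative log-likelihood for a single node."""
    return -weights[label] * math.log(probs[label])
```

With this weighting, a mistake on the rare class 2 costs four times as much as one on the majority class 0, pushing the model away from always predicting the majority label.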
no code implementations • 29 May 2023 • Dingsu Wang, Yuchen Yan, Ruizhong Qiu, Yada Zhu, Kaiyu Guan, Andrew J Margenot, Hanghang Tong
First, we define the problem of imputation over NTS, which contains missing values in both node time series features and graph structures.
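As a point of reference for the imputation problem, a naive baseline fills a node's missing value with the average of its observed graph neighbors at the same timestep. The function below is an illustrative sketch of that baseline, not the paper's model:

```python
def impute_step(values, adj):
    """Naive one-step imputation: replace each missing (None) node value with
    the mean of its observed neighbors.
    values: {node: float or None}, adj: {node: list of neighbor nodes}."""
    out = dict(values)
    for node, v in values.items():
        if v is None:
            observed = [values[u] for u in adj[node] if values[u] is not None]
            if observed:
                out[node] = sum(observed) / len(observed)
    return out

# Node "a" is missing; its neighbors "b" and "c" are observed.
filled = impute_step({"a": None, "b": 2.0, "c": 4.0},
                     {"a": ["b", "c"], "b": ["a"], "c": ["a"]})
```

A baseline like this handles missing node features but not missing graph structure, which is part of what makes the NTS setting harder.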
no code implementations • 17 May 2023 • Haohui Wang, Baoyu Jing, Kaize Ding, Yada Zhu, Liqing Zhang, Dawei Zhou
However, limited literature provides theoretical tools to characterize the behavior of long-tail categories on graphs or to understand their generalization performance in real scenarios.
no code implementations • 30 Mar 2023 • Lecheng Zheng, Dawei Zhou, Hanghang Tong, Jiejun Xu, Yada Zhu, Jingrui He
In addition, we propose a generic context sampling strategy for graph generative models, which is proven to be capable of fairly capturing the contextual information of each group with a high probability.
1 code implementation • 11 Feb 2023 • Lecheng Zheng, Yada Zhu, Jingrui He
We also derive insights regarding the relative performance of the proposed regularizers in various scenarios.
no code implementations • 25 Jan 2023 • Baoyu Jing, Yuchen Yan, Kaize Ding, Chanyoung Park, Yada Zhu, Huan Liu, Hanghang Tong
Self-Supervised Learning (SSL) is a promising paradigm to address this challenge.
no code implementations • 27 Sep 2022 • Baoyu Jing, Si Zhang, Yada Zhu, Bin Peng, Kaiyu Guan, Andrew Margenot, Hanghang Tong
In this paper, we show both theoretically and empirically that the uncertainty could be effectively reduced by retrieving relevant time series as references.
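To make the retrieval idea concrete, one simple instantiation retrieves the reference series most similar to the query and pools their forecasts, so that noise in any single series averages out. The sketch below is illustrative only (cosine retrieval, last-delta forecasts); the paper's model is more sophisticated:

```python
import math

# Illustrative sketch only, not the paper's architecture: retrieve similar
# reference series, then average a naive forecast across query and references.

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def retrieve(query, pool, k=2):
    """Return the k reference series most similar to the query."""
    return sorted(pool, key=lambda s: cosine(query, s), reverse=True)[:k]

query = [1.0, 2.0, 3.0, 4.0]
pool = [[1.1, 2.1, 2.9, 4.2],   # similar trend
        [4.0, 3.0, 2.0, 1.0],   # opposite trend
        [0.9, 1.8, 3.2, 3.9]]   # similar trend
refs = retrieve(query, pool)

# Naive forecast: average the last step's change across query and references.
deltas = [s[-1] - s[-2] for s in [query] + refs]
forecast = query[-1] + sum(deltas) / len(deltas)
```

Averaging over several relevant references shrinks the variance of the forecast relative to using the query series alone, which is the intuition behind the uncertainty reduction claimed above.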
no code implementations • 15 Aug 2022 • Shengyu Feng, Baoyu Jing, Yada Zhu, Hanghang Tong
In this work, by introducing an adversarial graph view for data augmentation, we propose a simple but effective method, Adversarial Graph Contrastive Learning (ARIEL), to extract informative contrastive samples within reasonable constraints.
no code implementations • 31 May 2022 • Baoyu Jing, Yuchen Yan, Yada Zhu, Hanghang Tong
We theoretically prove that COIN effectively increases the mutual information of node embeddings and that COIN is upper-bounded by the prior distributions of nodes.
no code implementations • 14 Feb 2022 • Shengyu Feng, Baoyu Jing, Yada Zhu, Hanghang Tong
Contrastive learning is an effective unsupervised method in graph representation learning.
no code implementations • 2 Dec 2021 • Zixuan Yuan, Yada Zhu, Wei Zhang, Ziming Huang, Guangnan Ye, Hui Xiong
An earnings call (EC), a periodic teleconference held by a publicly traded company, has been extensively studied as an essential market indicator because of its high analytical value for corporate fundamentals.
no code implementations • ACL 2021 • Wei Zhang, Ziming Huang, Yada Zhu, Guangnan Ye, Xiaodong Cui, Fan Zhang
With recent advances in natural language processing, state-of-the-art models and datasets have grown extensive in scale, which challenges the application of sample-based explanation methods in many aspects, such as explanation interpretability, efficiency, and faithfulness.
1 code implementation • 19 May 2021 • Lecheng Zheng, JinJun Xiong, Yada Zhu, Jingrui He
We first provide a theoretical analysis showing that the vanilla contrastive learning loss easily leads to a sub-optimal solution in the presence of false negative pairs. The proposed weighted loss, in contrast, can automatically adjust each pair's weight based on the similarity of the learned representations, mitigating this issue.
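The general idea of similarity-based reweighting can be sketched as follows: in an InfoNCE-style loss, negatives whose representations are highly similar to the anchor are likely false negatives, so their contribution is down-weighted. This is an illustrative sketch of that idea, not the paper's exact loss:

```python
import math

# Hedged sketch, not the paper's exact formulation: down-weight negative
# pairs whose representations are highly similar to the anchor, since such
# pairs are likely false negatives.

def cos_sim(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def weighted_infonce(anchor, positive, negatives, tau=0.5):
    """InfoNCE with similarity-based down-weighting of negatives."""
    pos = math.exp(cos_sim(anchor, positive) / tau)
    neg = 0.0
    for z in negatives:
        s = cos_sim(anchor, z)
        w = 1.0 - max(s, 0.0)          # near-duplicates get weight close to 0
        neg += w * math.exp(s / tau)
    return -math.log(pos / (pos + neg))
```

Under this weighting, a negative that is a near-duplicate of the anchor contributes almost nothing to the loss, while genuinely dissimilar negatives are penalized as usual.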
1 code implementation • 15 Feb 2021 • Baoyu Jing, Hanghang Tong, Yada Zhu
We propose a novel model called Network of Tensor Time Series, which comprises two modules: a Tensor Graph Convolutional Network (TGCN) and a Tensor Recurrent Neural Network (TRNN).
1 code implementation • 9 Feb 2020 • Yunan Ye, Hengzhi Pei, Boxin Wang, Pin-Yu Chen, Yada Zhu, Jun Xiao, Bo Li
Our framework aims to address two unique challenges in financial PM: (1) data heterogeneity -- the collected information for each asset is usually diverse, noisy, and imbalanced (e.g., news articles); and (2) environment uncertainty -- the financial market is versatile and non-stationary.
no code implementations • 17 Oct 2019 • Di Chen, Yada Zhu, Xiaodong Cui, Carla P. Gomes
Real-world applications often involve domain-specific and task-based performance objectives that are not captured by the standard machine learning losses, but are critical for decision making.
no code implementations • 19 Sep 2019 • Giovanni Mariani, Yada Zhu, Jianbo Li, Florian Scheidegger, Roxana Istrate, Costas Bekas, A. Cristiano I. Malossi
Sound financial theories demonstrate that in an efficient marketplace, all information available today, including expectations about future events, is reflected in today's prices, whereas future price trends are driven by uncertainty.
Computational Finance • Statistical Finance