1 code implementation • 8 Oct 2024 • Zhuopeng Xu, Yujie Li, Cheng Liu, Ning Gui
Identifying causal relations from purely observational data typically requires additional assumptions on relations and/or noise.
1 code implementation • 30 Sep 2024 • Weiwei Ye, Songgaojun Deng, Qiaosha Zou, Ning Gui
Time series forecasting typically needs to address non-stationary data with evolving trend and seasonal patterns.
1 code implementation • 20 Sep 2023 • Haoyu Wang, Guozheng Ma, Cong Yu, Ning Gui, Linrui Zhang, Zhiqi Huang, Suwei Ma, Yongzhe Chang, Sen Zhang, Li Shen, Xueqian Wang, Peilin Zhao, DaCheng Tao
Notably, we are surprised to find that robustness tends to decrease as fine-tuning (SFT and RLHF) progresses.
1 code implementation • 15 Apr 2023 • Bei Lin, You Li, Ning Gui, Zhuopeng Xu, Zhiwu Yu
However, partly due to the irregular, non-Euclidean nature of graph data, pretext tasks are generally designed under homophily assumptions and confined to low-frequency signals, resulting in a significant loss of other signals, especially the high-frequency signals widespread in graphs with heterophily.
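For context on the low-/high-frequency distinction: propagation with a normalized adjacency (the typical homophily-oriented pretext design) acts as a low-pass graph filter, while the normalized Laplacian acts as a high-pass one. A small illustrative sketch, not the paper's model:

```python
import numpy as np

# Toy undirected graph: 4 nodes, adjacency A
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
deg = A.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
A_norm = D_inv_sqrt @ A @ D_inv_sqrt   # normalized adjacency
L_norm = np.eye(4) - A_norm            # normalized Laplacian

x = np.array([1.0, 0.9, 0.1, -1.0])    # a node signal
low_pass = A_norm @ x                  # smooths the signal (keeps low frequencies)
high_pass = L_norm @ x                 # emphasizes differences across edges (high frequencies)
print(low_pass, high_pass)
```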
1 code implementation • 6 Dec 2022 • Jiajun Zhong, Weiwei Ye, Ning Gui
Instead of treating all samples equally, we introduce the concept of "friend networks" to represent different relations among samples.
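As one concrete illustration, a "friend network" could be realized as a similarity graph over samples, e.g. linking each sample to its k most similar samples; the sketch below is a hypothetical construction, not necessarily the paper's:

```python
import numpy as np

def build_friend_network(X: np.ndarray, k: int = 5) -> np.ndarray:
    """Return a binary adjacency matrix linking each sample to its
    k nearest samples (Euclidean distance), excluding itself."""
    n = X.shape[0]
    dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(dists, np.inf)          # a sample is not its own friend
    adj = np.zeros((n, n), dtype=int)
    for i in range(n):
        friends = np.argsort(dists[i])[:k]   # indices of the k nearest samples
        adj[i, friends] = 1
    return adj
```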
no code implementations • 19 Jul 2022 • Zhifeng Qiu, Wanxin Zeng, Dahua Liao, Ning Gui
Guided by the integrated information from the multi-self-supervised learning model, a batch-attention mechanism is designed to generate feature weights according to batch-based feature selection patterns, alleviating the impact of a handful of noisy samples.
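One reading of this design is that per-sample attention scores are pooled over each mini-batch, so feature weights reflect batch-level selection patterns rather than any single noisy sample. A minimal sketch under that assumption (layer names and sizes are hypothetical, not the paper's code):

```python
import torch
import torch.nn as nn

class BatchAttention(nn.Module):
    """Produces one feature-weight vector per mini-batch by pooling
    per-sample attention scores, damping the effect of noisy samples."""
    def __init__(self, n_features: int, hidden: int = 32):
        super().__init__()
        self.score = nn.Sequential(
            nn.Linear(n_features, hidden),
            nn.ReLU(),
            nn.Linear(hidden, n_features),
        )

    def forward(self, x):                       # x: (batch, n_features)
        per_sample = self.score(x)              # (batch, n_features)
        batch_logits = per_sample.mean(dim=0)   # pool scores over the batch
        weights = torch.sigmoid(batch_logits)   # (n_features,) weights in [0, 1]
        return x * weights, weights             # re-weighted batch and its weights
```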
1 code implementation • 19 Jun 2022 • Jiawen Wei, Fangyuan Wang, Wanxin Zeng, Wenwei Lin, Ning Gui
Reducing sensor requirements while maintaining optimal control performance is crucial for many industrial control applications to achieve robust, low-cost, and computation-efficient controllers.
1 code implementation • 3 Mar 2022 • You Li, Bei Lin, Binli Luo, Ning Gui
Unsupervised graph representation learning aims to distill diverse graph information into downstream task-agnostic dense vector embeddings.
no code implementations • 11 Dec 2020 • You Li, Binli Luo, Ning Gui
Low-dimensional graph embeddings have proved extremely useful in various downstream tasks on large graphs, e.g., link-related content recommendation and node classification.
1 code implementation • 28 Feb 2019 • Ning Gui, Danni Ge, Ziyin Hu
AFS consists of two detachable modules: an attention module for feature weight generation and a learning module for problem modeling.
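A minimal sketch of this two-module layout, assuming a PyTorch-style implementation (class names, layer sizes, and the sigmoid weight head are illustrative, not the paper's code):

```python
import torch
import torch.nn as nn

class AttentionModule(nn.Module):
    """Generates a soft weight in [0, 1] for each input feature."""
    def __init__(self, n_features: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, hidden),
            nn.Tanh(),
            nn.Linear(hidden, n_features),
            nn.Sigmoid(),                 # per-feature weights in [0, 1]
        )

    def forward(self, x):
        return self.net(x)

class LearningModule(nn.Module):
    """Downstream predictor that consumes the re-weighted features."""
    def __init__(self, n_features: int, n_outputs: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 64),
            nn.ReLU(),
            nn.Linear(64, n_outputs),
        )

    def forward(self, x):
        return self.net(x)

class AFS(nn.Module):
    """The two modules stay detachable: after training, the attention
    weights alone can be used to rank or select features."""
    def __init__(self, n_features: int, n_outputs: int):
        super().__init__()
        self.attention = AttentionModule(n_features)
        self.learner = LearningModule(n_features, n_outputs)

    def forward(self, x):
        w = self.attention(x)             # per-feature weights
        return self.learner(x * w), w     # prediction and weights
```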
no code implementations • 12 Jan 2014 • Hong Sun, Vincenzo De Florio, Ning Gui, Chris Blondia
This paper discusses the challenges of increasing human participation in ambient assisted living and proposes solutions to meet them.