Search Results for author: Jun Xia

Found 51 papers, 27 papers with code

A Graph is Worth $K$ Words: Euclideanizing Graph using Pure Transformer

no code implementations • 4 Feb 2024 Zhangyang Gao, Daize Dong, Cheng Tan, Jun Xia, Bozhen Hu, Stan Z. Li

Despite recent GNN and Graphformer efforts encoding graphs as Euclidean vectors, recovering the original graph from the vectors remains a challenge.

Graph Classification Graph Generation +1

Deep Manifold Graph Auto-Encoder for Attributed Graph Embedding

no code implementations • 12 Jan 2024 Bozhen Hu, Zelin Zang, Jun Xia, Lirong Wu, Cheng Tan, Stan Z. Li

Representing graph data in a low-dimensional space for subsequent tasks is the purpose of attributed graph embedding.

Graph Embedding

End-to-end Learnable Clustering for Intent Learning in Recommendation

1 code implementation • 11 Jan 2024 Yue Liu, Shihao Zhu, Jun Xia, Yingwei Ma, Jian Ma, Wenliang Zhong, Xinwang Liu, Guannan Zhang, Kejun Zhang

Concretely, we encode users' behavior sequences and initialize the cluster centers (latent intents) as learnable neurons.

Clustering Contrastive Learning +2
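The snippet above describes encoding behavior sequences and treating cluster centers (latent intents) as learnable parameters. A minimal, illustrative sketch of the assignment step, assuming cosine similarity to the nearest center (not the paper's exact formulation):

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def assign_intent(user_embedding, centers):
    """Assign a user's behavior embedding to the most similar cluster
    center; `centers` stands in for the learnable neurons."""
    sims = [cosine(user_embedding, c) for c in centers]
    return max(range(len(centers)), key=lambda k: sims[k])

# Toy usage: two latent intents, a user embedding close to the first.
centers = [[1.0, 0.0], [0.0, 1.0]]
print(assign_intent([0.9, 0.1], centers))  # → 0
```

In an end-to-end setting the centers would be updated by gradient descent along with the encoder, rather than fixed as here.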

Graph-level Protein Representation Learning by Structure Knowledge Refinement

no code implementations • 5 Jan 2024 Ge Wang, Zelin Zang, Jiangbin Zheng, Jun Xia, Stan Z. Li

The mainstream method is utilizing contrastive learning to facilitate graph feature extraction, known as Graph Contrastive Learning (GCL).

Contrastive Learning Property Prediction +1

Masked Modeling for Self-supervised Representation Learning on Vision and Beyond

1 code implementation • 31 Dec 2023 Siyuan Li, Luyuan Zhang, Zedong Wang, Di Wu, Lirong Wu, Zicheng Liu, Jun Xia, Cheng Tan, Yang Liu, Baigui Sun, Stan Z. Li

As the deep learning revolution marches on, self-supervised learning has garnered increasing attention in recent years thanks to its remarkable representation learning ability and the low dependence on labeled data.

Representation Learning Self-Supervised Learning
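The core pretext task surveyed here is masked modeling: hide part of the input and train the model to reconstruct it. A tiny, illustrative sketch of the masking step (the 75% ratio follows common practice in masked image modeling; it is an assumption, not a claim about this survey):

```python
import random

def mask_patches(num_patches, mask_ratio=0.75, seed=0):
    """Masked-modeling pretext in miniature: hide a random subset of
    patch indices; a model would then be trained to reconstruct the
    masked patches from the visible ones."""
    rng = random.Random(seed)
    n_mask = int(num_patches * mask_ratio)
    masked = set(rng.sample(range(num_patches), n_mask))
    visible = [i for i in range(num_patches) if i not in masked]
    return sorted(masked), visible

masked, visible = mask_patches(16)  # 12 masked, 4 visible patches
```

The reconstruction target (raw pixels, discrete tokens, or features) is what differentiates the methods the survey covers.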

AdapterFL: Adaptive Heterogeneous Federated Learning for Resource-constrained Mobile Computing Systems

no code implementations • 23 Nov 2023 Ruixuan Liu, Ming Hu, Zeke Xia, Jun Xia, Pengyu Zhang, Yihao Huang, Yang Liu, Mingsong Chen

On the one hand, to achieve model training across all the diverse clients, mobile computing systems can only use small low-performance models for collaborative learning.

Federated Learning

Have Your Cake and Eat It Too: Toward Efficient and Accurate Split Federated Learning

no code implementations • 22 Nov 2023 Dengke Yan, Ming Hu, Zeke Xia, Yanxin Yang, Jun Xia, Xiaofei Xie, Mingsong Chen

However, due to data heterogeneity and stragglers, SFL suffers from the challenges of low inference accuracy and low efficiency.

Federated Learning

Enabling On-Device Large Language Model Personalization with Self-Supervised Data Selection and Synthesis

no code implementations • 21 Nov 2023 Ruiyang Qin, Jun Xia, Zhenge Jia, Meng Jiang, Ahmed Abbasi, Peipei Zhou, Jingtong Hu, Yiyu Shi

While it is possible to obtain annotation locally by directly asking users to provide preferred responses, such annotations have to be sparse to not affect user experience.

Language Modelling Large Language Model

WaveAttack: Asymmetric Frequency Obfuscation-based Backdoor Attacks Against Deep Neural Networks

no code implementations • 17 Oct 2023 Jun Xia, Zhihao Yue, Yingbo Zhou, Zhiwei Ling, Xian Wei, Mingsong Chen

Due to the popularity of Artificial Intelligence (AI) technology, numerous backdoor attacks are designed by adversaries to mislead deep neural network predictions by manipulating training samples and training processes.

Backdoor Attack SSIM

Revisiting the Temporal Modeling in Spatio-Temporal Predictive Learning under A Unified View

no code implementations • 9 Oct 2023 Cheng Tan, Jue Wang, Zhangyang Gao, Siyuan Li, Lirong Wu, Jun Xia, Stan Z. Li

In this paper, we re-examine the two dominant temporal modeling approaches within the realm of spatio-temporal predictive learning, offering a unified perspective.

Self-Supervised Learning

CONVERT: Contrastive Graph Clustering with Reliable Augmentation

2 code implementations • 17 Aug 2023 Xihong Yang, Cheng Tan, Yue Liu, Ke Liang, Siwei Wang, Sihang Zhou, Jun Xia, Stan Z. Li, Xinwang Liu, En Zhu

To address these problems, we propose a novel CONtrastiVe Graph ClustEring network with Reliable AugmenTation (CONVERT).

Clustering Contrastive Learning +4

Reinforcement Graph Clustering with Unknown Cluster Number

2 code implementations • 13 Aug 2023 Yue Liu, Ke Liang, Jun Xia, Xihong Yang, Sihang Zhou, Meng Liu, Xinwang Liu, Stan Z. Li

To enable the deep graph clustering algorithms to work without the guidance of the predefined cluster number, we propose a new deep graph clustering method termed Reinforcement Graph Clustering (RGC).

Clustering Graph Clustering +1

Why Deep Models Often cannot Beat Non-deep Counterparts on Molecular Property Prediction?

no code implementations • 30 Jun 2023 Jun Xia, Lecheng Zhang, Xiao Zhu, Stan Z. Li

Molecular property prediction (MPP) is a crucial task in the drug discovery pipeline, which has recently gained considerable attention thanks to advances in deep neural networks.

Drug Discovery Molecular Property Prediction +1

Dink-Net: Neural Clustering on Large Graphs

2 code implementations • 28 May 2023 Yue Liu, Ke Liang, Jun Xia, Sihang Zhou, Xihong Yang, Xinwang Liu, Stan Z. Li

Subsequently, the clustering distribution is optimized by minimizing the proposed cluster dilation loss and cluster shrink loss in an adversarial manner.

Clustering Graph Clustering +1

Cross-Gate MLP with Protein Complex Invariant Embedding is A One-Shot Antibody Designer

1 code implementation • 21 Apr 2023 Cheng Tan, Zhangyang Gao, Lirong Wu, Jun Xia, Jiangbin Zheng, Xihong Yang, Yue Liu, Bozhen Hu, Stan Z. Li

In this paper, we propose a \textit{simple yet effective} model that can co-design 1D sequences and 3D structures of CDRs in a one-shot manner.

Specificity

CVT-SLR: Contrastive Visual-Textual Transformation for Sign Language Recognition with Variational Alignment

1 code implementation • CVPR 2023 Jiangbin Zheng, Yile Wang, Cheng Tan, Siyuan Li, Ge Wang, Jun Xia, Yidong Chen, Stan Z. Li

In this work, we propose a novel contrastive visual-textual transformation for SLR, CVT-SLR, to fully explore the pretrained knowledge of both the visual and language modalities.

Sign Language Recognition

HierarchyFL: Heterogeneous Federated Learning via Hierarchical Self-Distillation

no code implementations • 5 Dec 2022 Jun Xia, Yi Zhang, Zhihao Yue, Ming Hu, Xian Wei, Mingsong Chen

Federated learning (FL) has been recognized as a privacy-preserving distributed machine learning paradigm that enables knowledge sharing among various heterogeneous Artificial Intelligence of Things (AIoT) devices through centralized global model aggregation.

Federated Learning Privacy Preserving

Protein Language Models and Structure Prediction: Connection and Progression

1 code implementation • 30 Nov 2022 Bozhen Hu, Jun Xia, Jiangbin Zheng, Cheng Tan, Yufei Huang, Yongjie Xu, Stan Z. Li

The prediction of protein structures from sequences is an important task for function prediction, drug design, and the understanding of related biological processes.

Protein Folding Protein Language Model +1

GitFL: Adaptive Asynchronous Federated Learning using Version Control

no code implementations • 22 Nov 2022 Ming Hu, Zeke Xia, Zhihao Yue, Jun Xia, Yihao Huang, Yang Liu, Mingsong Chen

Unlike traditional FL, the cloud server of GitFL maintains a master model (i.e., the global model) together with a set of branch models indicating the trained local models committed by selected devices. The master model is updated based on all the pushed branch models and their version information, and only the branch models after the pull operation are dispatched to devices.

Federated Learning Reinforcement Learning (RL)

Using Context-to-Vector with Graph Retrofitting to Improve Word Embeddings

1 code implementation • ACL 2022 Jiangbin Zheng, Yile Wang, Ge Wang, Jun Xia, Yufei Huang, Guojiang Zhao, Yue Zhang, Stan Z. Li

Although contextualized embeddings generated from large-scale pre-trained models perform well in many tasks, traditional static embeddings (e.g., Skip-gram, Word2Vec) still play an important role in low-resource and lightweight settings due to their low computational cost, ease of deployment, and stability.

Word Embeddings

A Systematic Survey of Chemical Pre-trained Models

2 code implementations • 29 Oct 2022 Jun Xia, Yanqiao Zhu, Yuanqi Du, Stan Z. Li

Deep learning has achieved remarkable success in learning representations for molecules, which is crucial for various biochemical applications, ranging from property prediction to drug design.

molecular representation Property Prediction

Teaching Yourself: Graph Self-Distillation on Neighborhood for Node Classification

no code implementations • 5 Oct 2022 Lirong Wu, Jun Xia, Haitao Lin, Zhangyang Gao, Zicheng Liu, Guojiang Zhao, Stan Z. Li

Despite the great academic success of GNNs, Multi-Layer Perceptrons (MLPs) remain the primary workhorse for practical industrial applications.

Classification Node Classification

FedEntropy: Efficient Device Grouping for Federated Learning Using Maximum Entropy Judgment

1 code implementation • 24 May 2022 Zhiwei Ling, Zhihao Yue, Jun Xia, Ming Hu, Ting Wang, Mingsong Chen

Along with the popularity of Artificial Intelligence (AI) and Internet-of-Things (IoT), Federated Learning (FL) has attracted steadily increasing attention as a promising distributed machine learning paradigm, which enables the training of a central model across numerous decentralized devices without exposing their privacy.

Federated Learning

Model-Contrastive Learning for Backdoor Defense

1 code implementation • 9 May 2022 Zhihao Yue, Jun Xia, Zhiwei Ling, Ming Hu, Ting Wang, Xian Wei, Mingsong Chen

Due to the popularity of Artificial Intelligence (AI) techniques, we are witnessing an increasing number of backdoor injection attacks that are designed to maliciously threaten Deep Neural Networks (DNNs) causing misclassification.

Backdoor Attack backdoor defense +1

Generative De Novo Protein Design with Global Context

1 code implementation • 21 Apr 2022 Cheng Tan, Zhangyang Gao, Jun Xia, Bozhen Hu, Stan Z. Li

Thus, we propose the Global-Context Aware generative de novo protein design method (GCA), consisting of local and global modules.

Protein Design Protein Structure Prediction

Eliminating Backdoor Triggers for Deep Neural Networks Using Attention Relation Graph Distillation

1 code implementation • 21 Apr 2022 Jun Xia, Ting Wang, Jiepin Ding, Xian Wei, Mingsong Chen

Due to the prosperity of Artificial Intelligence (AI) techniques, more and more backdoors are designed by adversaries to attack Deep Neural Networks (DNNs). Although the state-of-the-art method Neural Attention Distillation (NAD) can effectively erase backdoor triggers from DNNs, it still suffers from non-negligible Attack Success Rate (ASR) together with lowered classification ACCuracy (ACC), since NAD focuses on backdoor defense using attention features (i.e., attention maps) of the same order.

backdoor defense Knowledge Distillation +1

BIOS: An Algorithmically Generated Biomedical Knowledge Graph

no code implementations • 18 Mar 2022 Sheng Yu, Zheng Yuan, Jun Xia, Shengxuan Luo, Huaiyuan Ying, Sihang Zeng, Jingyi Ren, Hongyi Yuan, Zhengyun Zhao, Yucong Lin, Keming Lu, Jing Wang, Yutao Xie, Heung-Yeung Shum

For decades, these knowledge graphs have been developed via expert curation; however, this method can no longer keep up with today's AI development, and a transition to algorithmically generated BioMedKGs is necessary.

BIG-bench Machine Learning Knowledge Graphs +3

HDL: Hybrid Deep Learning for the Synthesis of Myocardial Velocity Maps in Digital Twins for Cardiac Analysis

1 code implementation • 9 Mar 2022 Xiaodan Xing, Javier Del Ser, Yinzhe Wu, Yang Li, Jun Xia, Lei Xu, David Firmin, Peter Gatehouse, Guang Yang

A core part of digital healthcare twins is model-based data synthesis, which permits the generation of realistic medical signals without having to cope with the modelling complexity of the anatomical and biochemical phenomena that produce them in reality.

Decision Making Generative Adversarial Network +1

A Survey of Pretraining on Graphs: Taxonomy, Methods, and Applications

3 code implementations • 16 Feb 2022 Jun Xia, Yanqiao Zhu, Yuanqi Du, Stan Z. Li

Pretrained Language Models (PLMs) such as BERT have revolutionized the landscape of Natural Language Processing (NLP).

Drug Discovery Graph Representation Learning

Explainable COVID-19 Infections Identification and Delineation Using Calibrated Pseudo Labels

1 code implementation • 11 Feb 2022 Ming Li, Yingying Fang, Zeyu Tang, Chibudom Onuorah, Jun Xia, Javier Del Ser, Simon Walsh, Guang Yang

We demonstrate the effectiveness of our model with the combination of limited labelled data and sufficient unlabelled data or weakly-labelled data.

Computed Tomography (CT) Decision Making +1

SimGRACE: A Simple Framework for Graph Contrastive Learning without Data Augmentation

1 code implementation • 7 Feb 2022 Jun Xia, Lirong Wu, Jintao Chen, Bozhen Hu, Stan Z. Li

Furthermore, we devise an adversarial training scheme, dubbed \textbf{AT-SimGRACE}, to enhance the robustness of graph contrastive learning and theoretically explain the reasons.

Contrastive Learning Data Augmentation +1

Swin Transformer for Fast MRI

2 code implementations • 10 Jan 2022 Jiahao Huang, Yingying Fang, Yinzhe Wu, Huanjun Wu, Zhifan Gao, Yang Li, Javier Del Ser, Jun Xia, Guang Yang

The IM and OM were 2D convolutional layers, and the FEM was composed of a cascade of residual Swin transformer blocks (RSTBs) and 2D convolutional layers.

MRI Reconstruction

Robust Weakly Supervised Learning for COVID-19 Recognition Using Multi-Center CT Images

no code implementations • 9 Dec 2021 Qinghao Ye, Yuan Gao, Weiping Ding, Zhangming Niu, Chengjia Wang, Yinghui Jiang, Minhao Wang, Evandro Fei Fang, Wade Menpes-Smith, Jun Xia, Guang Yang

The multi-domain shift problem in multi-center, multi-scanner studies is therefore nontrivial; addressing it is crucial for dependable recognition and for reproducible, objective diagnosis and prognosis.

Computed Tomography (CT) Weakly-supervised Learning

Efficient Federated Learning for AIoT Applications Using Knowledge Distillation

no code implementations • 29 Nov 2021 Tian Liu, Zhiwei Ling, Jun Xia, Xin Fu, Shui Yu, Mingsong Chen

Inspired by Knowledge Distillation (KD) that can increase the model accuracy, our approach adds the soft targets used by KD to the FL model training, which occupies negligible network resources.

Federated Learning Knowledge Distillation
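The snippet describes adding KD soft targets to FL model training. A minimal sketch of a standard soft-target distillation loss of the kind referenced here; the blend weight `alpha` and temperature `T` are illustrative hyperparameters, not values from the paper:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    exps = [math.exp(z / temperature) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def distillation_loss(student_logits, teacher_logits, label, alpha=0.5, T=2.0):
    """Blend of cross-entropy on the hard label and KL divergence to
    the teacher's temperature-softened targets (standard KD form)."""
    p_student = softmax(student_logits)
    ce = -math.log(p_student[label])           # hard-label cross-entropy
    ps_T = softmax(student_logits, T)           # softened student
    pt_T = softmax(teacher_logits, T)           # softened teacher targets
    kl = sum(pt * math.log(pt / ps) for pt, ps in zip(pt_T, ps_T))
    return alpha * ce + (1 - alpha) * (T * T) * kl
```

When student and teacher logits agree, the KL term vanishes and only the hard-label term remains, which is why the soft targets add negligible cost on top of ordinary training.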

ProGCL: Rethinking Hard Negative Mining in Graph Contrastive Learning

1 code implementation • 5 Oct 2021 Jun Xia, Lirong Wu, Ge Wang, Jintao Chen, Stan Z. Li

Contrastive Learning (CL) has emerged as a dominant technique for unsupervised representation learning which embeds augmented versions of the anchor close to each other (positive samples) and pushes the embeddings of other samples (negatives) apart.

Contrastive Learning Representation Learning
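The abstract describes the standard contrastive setup: embed the anchor close to its augmented positive and push negatives apart. A minimal sketch of the InfoNCE-style objective such methods build on (illustrative of the general technique, not ProGCL's exact loss):

```python
import math

def info_nce(anchor, positive, negatives, temperature=0.5):
    """InfoNCE-style contrastive loss: the loss is small when the
    anchor is similar to the positive and dissimilar to negatives."""
    def sim(u, v):  # cosine similarity
        dot = sum(a * b for a, b in zip(u, v))
        nu = math.sqrt(sum(a * a for a in u))
        nv = math.sqrt(sum(b * b for b in v))
        return dot / (nu * nv)

    logits = [sim(anchor, positive)] + [sim(anchor, n) for n in negatives]
    exps = [math.exp(l / temperature) for l in logits]
    return -math.log(exps[0] / sum(exps))  # softmax CE on the positive
```

ProGCL's contribution concerns which negatives are truly "hard" rather than false negatives; this sketch only shows the base objective being reweighted.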

Co-learning: Learning from Noisy Labels with Self-supervision

1 code implementation • 5 Aug 2021 Cheng Tan, Jun Xia, Lirong Wu, Stan Z. Li

Noisy labels, resulting from mistakes in manual labeling or webly data collecting for supervised learning, can cause neural networks to overfit the misleading information and degrade the generalization performance.

Learning with noisy labels Self-Supervised Learning

Explainable AI For COVID-19 CT Classifiers: An Initial Comparison Study

no code implementations • 25 Apr 2021 Qinghao Ye, Jun Xia, Guang Yang

XAI refers to AI models that are programmed to explain their goals, logic, and decision making so that end users can understand them.

Decision Making Explainable Artificial Intelligence (XAI) +1

Unbox the Black-box for the Medical Explainable AI via Multi-modal and Multi-centre Data Fusion: A Mini-Review, Two Showcases and Beyond

no code implementations • 3 Feb 2021 Guang Yang, Qinghao Ye, Jun Xia

Explainable Artificial Intelligence (XAI) is an emerging research topic of machine learning aimed at unboxing how AI systems' black-box choices are made.

BIG-bench Machine Learning Decision Making +2

Towards Robust Graph Neural Networks against Label Noise

no code implementations • 1 Jan 2021 Jun Xia, Haitao Lin, Yongjie Xu, Lirong Wu, Zhangyang Gao, Siyuan Li, Stan Z. Li

A pseudo label is computed from the neighboring labels for each node in the training set using LP; meta learning is utilized to learn a proper aggregation of the original and pseudo label as the final label.

Attribute Learning with noisy labels +3
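The snippet above describes computing a pseudo label from neighboring labels via label propagation (LP) and combining it with the original label. A toy sketch under simplifying assumptions: one LP step as a neighbor class histogram, and a single scalar in place of the meta-learned aggregation:

```python
def propagate_label(neighbor_labels, num_classes):
    """One label-propagation step in miniature: the pseudo label is
    the normalized class histogram of a node's neighbors."""
    counts = [0] * num_classes
    for y in neighbor_labels:
        counts[y] += 1
    total = sum(counts)
    return [c / total for c in counts]

def aggregate(original_label, pseudo_label, weight):
    """Convex combination of the (possibly noisy) original one-hot
    label and the propagated pseudo label; `weight` stands in for the
    meta-learned aggregation coefficient."""
    return [weight * o + (1 - weight) * p
            for o, p in zip(original_label, pseudo_label)]

# Node labeled class 0, but 3 of 4 neighbors say class 1:
pseudo = propagate_label([1, 1, 1, 0], 2)   # [0.25, 0.75]
final = aggregate([1.0, 0.0], pseudo, 0.5)  # [0.625, 0.375]
```

The aggregated label softens a potentially mislabeled node toward the consensus of its neighborhood, which is the noise-robustness mechanism the snippet alludes to.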

Invertible Manifold Learning for Dimension Reduction

1 code implementation • 7 Oct 2020 Siyuan Li, Haitao Lin, Zelin Zang, Lirong Wu, Jun Xia, Stan Z. Li

Dimension reduction (DR) aims to learn low-dimensional representations of high-dimensional data with the preservation of essential information.

Dimensionality Reduction

Deep Clustering and Representation Learning that Preserves Geometric Structures

no code implementations • 28 Sep 2020 Lirong Wu, Zicheng Liu, Zelin Zang, Jun Xia, Siyuan Li, Stan Z. Li

To overcome the problem that clustering-oriented losses may deteriorate the geometric structure of embeddings in the latent space, an isometric loss is proposed for preserving intra-manifold structure locally and a ranking loss for inter-manifold structure globally.

Clustering Deep Clustering +1

Generalized Clustering and Multi-Manifold Learning with Geometric Structure Preservation

1 code implementation • 21 Sep 2020 Lirong Wu, Zicheng Liu, Zelin Zang, Jun Xia, Siyuan Li, Stan Z. Li

Though manifold-based clustering has become a popular research topic, we observe that one important factor has been omitted by these works, namely that the defined clustering loss may corrupt the local and global structure of the latent space.

Clustering Deep Clustering +1

Weakly Supervised Deep Learning for COVID-19 Infection Detection and Classification from CT Images

no code implementations • 14 Apr 2020 Shaoping Hu, Yuan Gao, Zhangming Niu, Yinghui Jiang, Lao Li, Xianglu Xiao, Minhao Wang, Evandro Fei Fang, Wade Menpes-Smith, Jun Xia, Hui Ye, Guang Yang

An outbreak of a novel coronavirus disease (i.e., COVID-19) has been recorded in Wuhan, China since late December 2019, which subsequently became pandemic around the world.

General Classification Respiratory Failure

Artificial Intelligence Distinguishes COVID-19 from Community Acquired Pneumonia on Chest CT

1 code implementation • Radiology 2020 Lin Li, Lixin Qin, Zeguo Xu, Youbing Yin, Xin Wang, Bin Kong, Junjie Bai, Yi Lu, Zhenghan Fang, Qi Song, Kunlin Cao, Daliang Liu, Guisheng Wang, Qizhong Xu, Xisheng Fang, Shiqin Zhang, Juan Xia, Jun Xia

Materials and Methods: In this retrospective and multi-center study, a deep learning model, COVID-19 detection neural network (COVNet), was developed to extract visual features from volumetric chest CT exams for the detection of COVID-19.

COVID-19 Image Segmentation Specificity +1
