Search Results for author: Qingsong Wei

Found 7 papers, 1 paper with code

AiRacleX: Automated Detection of Price Oracle Manipulations via LLM-Driven Knowledge Mining and Prompt Generation

no code implementations • 10 Feb 2025 • Bo Gao, Yuan Wang, Qingsong Wei, Yong Liu, Rick Siow Mong Goh, David Lo

Decentralized finance (DeFi) applications depend on accurate price oracles to ensure secure transactions, yet these oracles are highly vulnerable to manipulation, enabling attackers to exploit smart contract vulnerabilities for unfair asset valuation and financial gain.
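One common way such manipulations are caught (a generic illustration, not the AiRacleX pipeline itself) is to compare a reported spot price against a time-weighted average of recent observations and flag large deviations. The `OracleGuard` class, window size, and threshold below are hypothetical names chosen for the sketch:

```python
# Hypothetical sketch: flag a suspicious spot price by comparing it
# against a time-weighted average price (TWAP) of recent observations.
from collections import deque

class OracleGuard:
    def __init__(self, window=10, max_deviation=0.05):
        self.prices = deque(maxlen=window)   # recent price observations
        self.max_deviation = max_deviation   # e.g. 5% tolerance

    def observe(self, price):
        self.prices.append(price)

    def is_suspicious(self, spot_price):
        if not self.prices:
            return False
        twap = sum(self.prices) / len(self.prices)
        return abs(spot_price - twap) / twap > self.max_deviation

guard = OracleGuard()
for p in [100.0, 101.0, 99.5, 100.5]:
    guard.observe(p)
print(guard.is_suspicious(100.2))  # small drift -> False
print(guard.is_suspicious(130.0))  # ~30% jump -> True
```

A flash-loan-driven manipulation typically produces exactly this kind of single-block price spike relative to the recent average, which is why TWAP-style checks are a standard baseline defense.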

Maximizing Uncertainty for Federated learning via Bayesian Optimisation-based Model Poisoning

no code implementations • 14 Jan 2025 • Marios Aristodemou, Xiaolan Liu, Yuan Wang, Konstantinos G. Kyriakopoulos, Sangarapillai Lambotharan, Qingsong Wei

As we transition from Narrow Artificial Intelligence towards Artificial Super Intelligence, users are increasingly concerned about their privacy and the trustworthiness of machine learning (ML) technology.

Bayesian Optimisation • Federated Learning +2

Look Back for More: Harnessing Historical Sequential Updates for Personalized Federated Adapter Tuning

no code implementations • 3 Jan 2025 • Danni Peng, Yuan Wang, Huazhu Fu, Jinpeng Jiang, Yong Liu, Rick Siow Mong Goh, Qingsong Wei

In pFedSeq, the server maintains and trains a sequential learner, which processes a sequence of past adapter updates from clients and generates calibrations for personalized adapters.
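The server-side mechanism described above can be sketched as follows. This is a hedged illustration, not the paper's implementation: pFedSeq trains a real sequence model over adapter updates, whereas this stub replaces it with a simple exponentially-weighted combination, and the class and method names are invented for the example:

```python
# Sketch of a server that keeps each client's history of adapter updates
# and derives a calibration vector from that sequence. The "sequential
# learner" is stubbed with an exponentially-weighted sum; the paper
# trains an actual sequence model for this role.
class SequentialCalibrator:
    def __init__(self, decay=0.5):
        self.decay = decay
        self.history = {}  # client_id -> list of adapter-update vectors

    def record_update(self, client_id, update):
        self.history.setdefault(client_id, []).append(update)

    def calibrate(self, client_id):
        updates = self.history.get(client_id, [])
        if not updates:
            return None
        # Newer updates receive exponentially larger weight.
        n = len(updates)
        weights = [self.decay ** (n - 1 - i) for i in range(n)]
        total = sum(weights)
        dim = len(updates[0])
        return [sum(w * u[d] for w, u in zip(weights, updates)) / total
                for d in range(dim)]

cal = SequentialCalibrator(decay=0.5)
cal.record_update("client_a", [1.0, 0.0])
cal.record_update("client_a", [0.0, 1.0])
print(cal.calibrate("client_a"))
```

The key design point carried over from the description is that calibration is a function of the whole update sequence per client, so the server can exploit temporal structure that a single round's update does not expose.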

Personalized Federated Learning

Blockchain Data Analysis in the Era of Large-Language Models

no code implementations • 9 Dec 2024 • Kentaroh Toyoda, Xiao Wang, Mingzhe Li, Bo Gao, Yuan Wang, Qingsong Wei

Blockchain data analysis is essential for deriving insights, tracking transactions, identifying patterns, and ensuring the integrity and security of decentralized networks.

Fraud Detection

An Aggregation-Free Federated Learning for Tackling Data Heterogeneity

no code implementations • CVPR 2024 • Yuan Wang, Huazhu Fu, Renuga Kanagavelu, Qingsong Wei, Yong Liu, Rick Siow Mong Goh

FedAF inherently avoids the issue of client drift, enhances the quality of condensed data amid notable data heterogeneity, and improves the global model performance.
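The aggregation-free idea can be illustrated with a toy sketch. This is not FedAF's algorithm: condensation is stubbed as per-class feature means and the server fits a nearest-mean classifier, but it shows the structural point that clients share condensed data rather than model weights, so there is no weight averaging to drift:

```python
# Toy aggregation-free setup: clients condense local data and send the
# condensed set; the server trains directly on the pooled condensed data.
def condense(samples):
    """samples: list of (feature_vector, label) -> mean vector per label."""
    by_label = {}
    for x, y in samples:
        by_label.setdefault(y, []).append(x)
    return {y: [sum(col) / len(xs) for col in zip(*xs)]
            for y, xs in by_label.items()}

def server_train(condensed_sets):
    """Pool condensed data from all clients into per-class prototypes."""
    pooled = {}
    for cset in condensed_sets:
        for y, mean in cset.items():
            pooled.setdefault(y, []).append(mean)
    return {y: [sum(col) / len(ms) for col in zip(*ms)]
            for y, ms in pooled.items()}

def predict(prototypes, x):
    return min(prototypes,
               key=lambda y: sum((a - b) ** 2
                                 for a, b in zip(x, prototypes[y])))

client1 = condense([([0.0, 0.0], "neg"), ([1.0, 1.0], "pos")])
client2 = condense([([0.2, 0.1], "neg"), ([0.9, 1.1], "pos")])
model = server_train([client1, client2])
print(predict(model, [0.05, 0.05]))  # lands near the "neg" prototypes
```

Because the server never averages client model weights, the classic client-drift failure mode of weight aggregation under heterogeneous data simply does not arise in this scheme.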

Federated Learning

FedUKD: Federated UNet Model with Knowledge Distillation for Land Use Classification from Satellite and Street Views

2 code implementations • 5 Dec 2022 • Renuga Kanagavelu, Kinshuk Dua, Pratik Garai, Susan Elias, Neha Thomas, Simon Elias, Qingsong Wei, Rick Siow Mong Goh, Yong Liu

A federated approach suits this application domain because it avoids transferring data from distributed locations, saving network bandwidth and reducing communication cost.

Knowledge Distillation • Model Compression +1

NV-Tree: Reducing Consistency Cost for NVM-based Single Level Systems

no code implementations • 16 Apr 2015 • Jun Yang, Qingsong Wei, Cheng Chen, Chundong Wang, Khai Leong Yong (Data Storage Institute, A*STAR); Bingsheng He (Nanyang Technological University)

Although the memory fence and CPU cacheline flush instructions can order memory writes to achieve data consistency, they introduce significant overhead (more than a 10x performance slowdown).
