Search Results for author: Mi Zhang

Found 54 papers, 19 papers with code

Convolution over Hierarchical Syntactic and Lexical Graphs for Aspect Level Sentiment Analysis

no code implementations EMNLP 2020 Mi Zhang, Tieyun Qian

Moreover, we build a concept hierarchy on both the syntactic and lexical graphs for differentiating various types of dependency relations or lexical word pairs.

Relation Sentence +2

SVD-LLM: Truncation-aware Singular Value Decomposition for Large Language Model Compression

1 code implementation 12 Mar 2024 Xin Wang, Yu Zheng, Zhongwei Wan, Mi Zhang

However, state-of-the-art SVD-based LLM compression methods have two key limitations: truncating smaller singular values may lead to higher compression loss, and the remaining model parameters are not updated after SVD truncation.

Language Modelling Large Language Model +1
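
As a rough illustration of the truncation step referred to above, below is a minimal sketch of plain truncated-SVD compression of one weight matrix (not SVD-LLM's truncation-aware method); every discarded singular value adds to the reconstruction error, which is the compression loss the paper targets. Sizes and rank are arbitrary placeholders.

```python
# Minimal sketch of plain truncated-SVD weight compression (not SVD-LLM itself).
import numpy as np

def truncated_svd_compress(W: np.ndarray, rank: int):
    """Return low-rank factors (A, B) with W ~= A @ B."""
    U, S, Vt = np.linalg.svd(W, full_matrices=False)
    A = U[:, :rank] * S[:rank]      # fold the kept singular values into the left factor
    B = Vt[:rank, :]
    return A, B

rng = np.random.default_rng(0)
W = rng.standard_normal((512, 512))
A, B = truncated_svd_compress(W, rank=64)
print("relative reconstruction error:", np.linalg.norm(W - A @ B) / np.linalg.norm(W))
```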

Electrocardiogram Instruction Tuning for Report Generation

no code implementations 7 Mar 2024 Zhongwei Wan, Che Liu, Xin Wang, Chaofan Tao, Hui Shen, Zhenwu Peng, Jie Fu, Rossella Arcucci, Huaxiu Yao, Mi Zhang

The electrocardiogram (ECG) serves as the primary non-invasive diagnostic tool for monitoring cardiac conditions and is crucial in assisting clinicians.

Search Intention Network for Personalized Query Auto-Completion in E-Commerce

no code implementations 5 Mar 2024 Wei Bao, Mi Zhang, Tao Zhang, Chengfu Huo

Query Auto-Completion (QAC), as an important part of the modern search engine, plays a key role in complementing user queries and helping them refine their search intentions. Today's QAC systems in real-world scenarios face two major challenges: 1) intention equivocality (IE): during the user's typing process, the prefix often contains a combination of characters and subwords, which makes the current intention ambiguous and difficult to model; 2) intention transfer (IT): previous works make personalized recommendations based on users' historical sequences but ignore the transfer of search intention, even though the current intention extracted from the prefix may be contrary to the historical preferences.

IoT in the Era of Generative AI: Vision and Challenges

no code implementations 3 Jan 2024 Xin Wang, Zhongwei Wan, Arvin Hekmati, Mingyu Zong, Samiul Alam, Mi Zhang, Bhaskar Krishnamachari

Equipped with sensing, networking, and computing capabilities, Internet of Things (IoT) devices such as smartphones, wearables, smart speakers, and household robots have been seamlessly woven into our daily lives.

Federated Learning Prompt Engineering

No-Skim: Towards Efficiency Robustness Evaluation on Skimming-based Language Models

no code implementations 15 Dec 2023 Shengyao Zhang, Mi Zhang, Xudong Pan, Min Yang

To reduce the computation cost and energy consumption of large language models (LLMs), skimming-based acceleration progressively drops unimportant tokens of the input sequence along the layers of the LLM while preserving the tokens of semantic importance.
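
The skimming mechanism described above can be sketched roughly as follows; the importance score used here (the L2 norm of each token's hidden state) is a stand-in for illustration, not the scoring used by real skimming-based models, and the layers are plain linear maps.

```python
# Hypothetical sketch of skimming-based acceleration: at each layer, keep only the
# tokens whose (stand-in) importance score is among the top keep_ratio fraction,
# so later layers process progressively shorter sequences.
import torch

def skim_layer(hidden, layer, keep_ratio=0.7):
    """hidden: (seq_len, dim). Importance here is just the L2 norm, as a placeholder."""
    scores = hidden.norm(dim=-1)                    # stand-in importance estimate
    k = max(1, int(keep_ratio * hidden.size(0)))
    keep = scores.topk(k).indices.sort().values     # preserve original token order
    return layer(hidden[keep])

layers = [torch.nn.Linear(64, 64) for _ in range(4)]
h = torch.randn(128, 64)
for layer in layers:
    h = skim_layer(h, layer)
print(h.shape)   # sequence length shrinks layer by layer: 128 -> 89 -> 62 -> 43 -> 30
```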

BELT: Old-School Backdoor Attacks can Evade the State-of-the-Art Defense with Backdoor Exclusivity Lifting

no code implementations 8 Dec 2023 Huming Qiu, Junjie Sun, Mi Zhang, Xudong Pan, Min Yang

Deep neural networks (DNNs) are susceptible to backdoor attacks, where malicious functionality is embedded to allow attackers to trigger incorrect classifications.

Efficient Large Language Models: A Survey

3 code implementations 6 Dec 2023 Zhongwei Wan, Xin Wang, Che Liu, Samiul Alam, Yu Zheng, Jiachen Liu, Zhongnan Qu, Shen Yan, Yi Zhu, Quanlu Zhang, Mosharaf Chowdhury, Mi Zhang

Large Language Models (LLMs) have demonstrated remarkable capabilities in important tasks such as natural language understanding, language generation, and complex reasoning and have the potential to make a substantial impact on our society.

Natural Language Understanding Text Generation

JADE: A Linguistics-based Safety Evaluation Platform for Large Language Models

1 code implementation 1 Nov 2023 Mi Zhang, Xudong Pan, Min Yang

In this paper, we present JADE, a targeted linguistic fuzzing platform which strengthens the linguistic complexity of seed questions to simultaneously and consistently break a wide range of widely-used LLMs categorized in three groups: eight open-sourced Chinese, six commercial Chinese and four commercial English LLMs.

Natural Questions

FedAIoT: A Federated Learning Benchmark for Artificial Intelligence of Things

1 code implementation 29 Sep 2023 Samiul Alam, Tuo Zhang, Tiantian Feng, Hui Shen, Zhichao Cao, Dong Zhao, JeongGil Ko, Kiran Somasundaram, Shrikanth S. Narayanan, Salman Avestimehr, Mi Zhang

However, most existing FL works are not conducted on datasets collected from authentic IoT devices that capture unique modalities and inherent challenges of IoT data.

Benchmarking Federated Learning

MIRA: Cracking Black-box Watermarking on Deep Neural Networks via Model Inversion-based Removal Attacks

no code implementations 7 Sep 2023 Yifan Lu, Wenxuan Li, Mi Zhang, Xudong Pan, Min Yang

In this paper, we propose a novel Model Inversion-based Removal Attack (MIRA), which is watermark-agnostic and effective against most mainstream black-box DNN watermarking schemes.

ETP: Learning Transferable ECG Representations via ECG-Text Pre-training

no code implementations 6 Sep 2023 Che Liu, Zhongwei Wan, Sibo Cheng, Mi Zhang, Rossella Arcucci

In the domain of cardiovascular healthcare, the Electrocardiogram (ECG) serves as a critical, non-invasive diagnostic tool.

Language Modelling Representation Learning +2

FedMultimodal: A Benchmark For Multimodal Federated Learning

no code implementations 15 Jun 2023 Tiantian Feng, Digbalay Bose, Tuo Zhang, Rajat Hebbar, Anil Ramakrishna, Rahul Gupta, Mi Zhang, Salman Avestimehr, Shrikanth Narayanan

In order to facilitate the research in multimodal FL, we introduce FedMultimodal, the first FL benchmark for multimodal learning covering five representative multimodal applications from ten commonly used datasets with a total of eight unique modalities.

Emotion Recognition Federated Learning +1

GPT-FL: Generative Pre-trained Model-Assisted Federated Learning

1 code implementation 3 Jun 2023 Tuo Zhang, Tiantian Feng, Samiul Alam, Dimitrios Dimitriadis, Mi Zhang, Shrikanth S. Narayanan, Salman Avestimehr

Through comprehensive ablation analysis, we discover that the downstream model generated by synthetic data plays a crucial role in controlling the direction of gradient diversity during FL training, which enhances convergence speed and contributes to the notable accuracy boost observed with GPT-FL.

Federated Learning

Med-UniC: Unifying Cross-Lingual Medical Vision-Language Pre-Training by Diminishing Bias

1 code implementation NeurIPS 2023 Zhongwei Wan, Che Liu, Mi Zhang, Jie Fu, Benyou Wang, Sibo Cheng, Lei Ma, César Quilodrán-Casas, Rossella Arcucci

Med-UniC reaches superior performance across 5 medical image tasks and 10 datasets encompassing over 30 diseases, offering a versatile framework for unifying multi-modal medical data within diverse linguistic communities.

Disentanglement

NELoRa-Bench: A Benchmark for Neural-enhanced LoRa Demodulation

1 code implementation 20 Apr 2023 Jialuo Du, Yidong Ren, Mi Zhang, Yunhao Liu, Zhichao Cao

The dataset shows that NELoRa can achieve 1.84-2.35 dB SNR gain over the standard LoRa decoder.

TimelyFL: Heterogeneity-aware Asynchronous Federated Learning with Adaptive Partial Training

no code implementations 14 Apr 2023 Tuo Zhang, Lei Gao, Sunwoo Lee, Mi Zhang, Salman Avestimehr

However, we show empirically that this method can lead to a substantial drop in training accuracy as well as a slower convergence rate.

Federated Learning

Rethinking White-Box Watermarks on Deep Learning Models under Neural Structural Obfuscation

no code implementations 17 Mar 2023 Yifan Yan, Xudong Pan, Mi Zhang, Min Yang

Copyright protection for deep neural networks (DNNs) is an urgent need for AI corporations.

Exorcising "Wraith": Protecting LiDAR-based Object Detector in Automated Driving System from Appearing Attacks

no code implementations 17 Mar 2023 Qifan Xiao, Xudong Pan, Yifan Lu, Mi Zhang, Jiarun Dai, Min Yang

In this paper, we propose a novel plug-and-play defensive module which works alongside a trained LiDAR-based object detector to eliminate forged obstacles in which a major proportion of local parts have low objectness, i.e., a low degree of belonging to a real object.
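
A toy sketch of the defensive intuition described above: reject a detected obstacle if too many of its local parts carry low objectness. The per-point scores, thresholds, and decision rule here are illustrative assumptions, not the paper's module.

```python
# Spoofed (forged) obstacles injected by appearing attacks tend to be assembled from
# sparse points with low objectness, so a high low-objectness fraction flags them.
import numpy as np

def is_forged(point_objectness: np.ndarray,
              low_score: float = 0.3,
              max_low_fraction: float = 0.5) -> bool:
    """point_objectness: per-point scores for the points inside one predicted box."""
    low_fraction = np.mean(point_objectness < low_score)
    return bool(low_fraction > max_low_fraction)

print(is_forged(np.array([0.10, 0.20, 0.15, 0.80])))   # True  -> drop the detection
print(is_forged(np.array([0.90, 0.70, 0.85, 0.60])))   # False -> keep the detection
```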

CAP: Robust Point Cloud Classification via Semantic and Structural Modeling

no code implementations CVPR 2023 Daizong Ding, Erling Jiang, Yuanmin Huang, Mi Zhang, Wenxuan Li, Min Yang

Recently, deep neural networks have shown great success on 3D point cloud classification tasks, which simultaneously raises the concern of adversarial attacks that cause severe damage to real-world applications.

3D Point Cloud Classification Classification +2

TopDiG: Class-Agnostic Topological Directional Graph Extraction From Remote Sensing Images

no code implementations CVPR 2023 Bingnan Yang, Mi Zhang, Zhan Zhang, Zhili Zhang, Xiangyun Hu

In this work, we propose an innovative class-agnostic model, namely TopDiG, to directly extract topological directional graphs from remote sensing images and solve these issues.

Adaptive Risk-Aware Bidding with Budget Constraint in Display Advertising

1 code implementation 6 Dec 2022 Zhimeng Jiang, Kaixiong Zhou, Mi Zhang, Rui Chen, Xia Hu, Soo-Hyun Choi

In this work, we explicitly factor in the uncertainty of estimated ad impression values and model the risk preference of a DSP under a specific state and market environment via a sequential decision process.

reinforcement-learning Reinforcement Learning (RL)

FedRolex: Model-Heterogeneous Federated Learning with Rolling Sub-Model Extraction

3 code implementations 3 Dec 2022 Samiul Alam, Luyang Liu, Ming Yan, Mi Zhang

Most cross-device federated learning (FL) studies focus on the model-homogeneous setting where the global server model and local client models are identical.

Federated Learning Model extraction
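
The rolling sub-model extraction in the title can be sketched as follows, under the assumption that a client's sub-model is a contiguous window of the global layer's neurons that rolls forward by one position per round (a simplification for illustration, not necessarily the paper's exact scheme).

```python
# Rough sketch of rolling sub-model extraction: each round a small client trains a
# rolling window of the global layer's output neurons, so all neurons get updated
# over time even by clients that cannot host the full model.
import numpy as np

def rolling_indices(global_width: int, client_width: int, round_idx: int) -> np.ndarray:
    start = round_idx % global_width
    return (start + np.arange(client_width)) % global_width   # wrap around the layer

def extract_submodel(global_weight: np.ndarray, client_width: int, round_idx: int):
    idx = rolling_indices(global_weight.shape[0], client_width, round_idx)
    return global_weight[idx], idx          # idx is needed to scatter the update back

W_global = np.random.randn(16, 8)           # 16 output neurons in the global layer
for r in range(3):
    W_sub, idx = extract_submodel(W_global, client_width=4, round_idx=r)
    print(r, idx)                            # window rolls: [0 1 2 3], [1 2 3 4], [2 3 4 5]
```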

Federated Learning Hyper-Parameter Tuning from a System Perspective

1 code implementation 24 Nov 2022 Huanle Zhang, Lei Fu, Mi Zhang, Pengfei Hu, Xiuzhen Cheng, Prasant Mohapatra, Xin Liu

In this paper, we propose FedTune, an automatic FL hyper-parameter tuning algorithm tailored to applications' diverse system requirements in FL training.

Federated Learning

Client Selection in Federated Learning: Principles, Challenges, and Opportunities

no code implementations 3 Nov 2022 Lei Fu, Huanle Zhang, Ge Gao, Mi Zhang, Xin Liu

As a privacy-preserving paradigm for training Machine Learning (ML) models, Federated Learning (FL) has received tremendous attention from both industry and academia.

Fairness Federated Learning +1

A Certifiable Security Patch for Object Tracking in Self-Driving Systems via Historical Deviation Modeling

no code implementations 18 Jul 2022 Xudong Pan, Qifan Xiao, Mi Zhang, Min Yang

To address this design flaw, we propose a simple yet effective security patch for KF-based MOT, the core of which is an adaptive strategy that balances the focus of the KF on observations and predictions according to the anomaly index of the observation-prediction deviation, and which has certified effectiveness against a generalized hijacking attack model.

Decision Making Object +3
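
A one-dimensional toy version of the adaptive strategy described above: when the observation-prediction deviation is anomalously large, the measurement noise used in the Kalman gain is inflated so the filter leans on its own prediction. The anomaly index and the inflation rule here are assumptions for illustration, not the paper's patch.

```python
# 1-D Kalman update whose measurement noise grows with the normalized residual, so a
# hijacking-style jump in the observation is largely ignored.
import numpy as np

def adaptive_update(x_pred, P_pred, z, H, R_base, anomaly_threshold=3.0):
    deviation = z - H * x_pred
    anomaly = abs(deviation) / np.sqrt(H * P_pred * H + R_base)   # normalized residual
    R = R_base * (1.0 + max(0.0, anomaly - anomaly_threshold))    # inflate if anomalous
    K = P_pred * H / (H * P_pred * H + R)                         # Kalman gain
    x_new = x_pred + K * deviation
    P_new = (1 - K * H) * P_pred
    return x_new, P_new

# With the patch, a sudden jump from 10 to 40 moves the estimate only slightly.
print(adaptive_update(x_pred=10.0, P_pred=1.0, z=40.0, H=1.0, R_base=1.0))
```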

Matryoshka: Stealing Functionality of Private ML Data by Hiding Models in Model

no code implementations 29 Jun 2022 Xudong Pan, Yifan Yan, Shengyao Zhang, Mi Zhang, Min Yang

In this paper, we present a novel insider attack called Matryoshka, which employs an irrelevant scheduled-to-publish DNN model as a carrier model for covert transmission of multiple secret models which memorize the functionality of private ML data stored in local data centers.

Cracking White-box DNN Watermarks via Invariant Neuron Transforms

no code implementations 30 Apr 2022 Yifan Yan, Xudong Pan, Yining Wang, Mi Zhang, Min Yang

On 9 state-of-the-art white-box watermarking schemes and a broad set of industry-level DNN architectures, our attack for the first time reduces the embedded identity message in the protected models to be almost random.
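
One family of invariant neuron transforms such an attack can build on is hidden-unit permutation, sketched below for a two-layer MLP: the network's function is unchanged while the raw weight tensors a white-box watermark verifier inspects are rearranged. This is only the generic trick, not the paper's full attack.

```python
# Permuting the hidden units of one dense layer and un-permuting the next layer is
# functionality-invariant: y = W2 @ relu(W1 @ x + b1) is preserved exactly.
import numpy as np

def permute_hidden_units(W1, b1, W2, perm):
    return W1[perm, :], b1[perm], W2[:, perm]

rng = np.random.default_rng(1)
W1, b1, W2 = rng.normal(size=(32, 16)), rng.normal(size=32), rng.normal(size=(8, 32))
perm = rng.permutation(32)
W1p, b1p, W2p = permute_hidden_units(W1, b1, W2, perm)

x = rng.normal(size=16)
relu = lambda v: np.maximum(v, 0)
assert np.allclose(W2 @ relu(W1 @ x + b1), W2p @ relu(W1p @ x + b1p))
print("outputs identical, weight layout scrambled")
```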

Deep AutoAugment

1 code implementation 11 Mar 2022 Yu Zheng, Zhi Zhang, Shen Yan, Mi Zhang

In this work, instead of fixing a set of hand-picked default augmentations alongside the searched data augmentations, we propose a fully automated approach for data augmentation search named Deep AutoAugment (DeepAA).

AutoML Data Augmentation +1

Automatically Generating Counterfactuals for Relation Classification

no code implementations 22 Feb 2022 Mi Zhang, Tieyun Qian, Ting Zhang

In this paper, we formulate the problem of automatically generating CAD for RC tasks from an entity-centric viewpoint, and develop a novel approach to derive contextual counterfactuals for entities.

Classification Relation +1

Multiview Transformers for Video Recognition

1 code implementation CVPR 2022 Shen Yan, Xuehan Xiong, Anurag Arnab, Zhichao Lu, Mi Zhang, Chen Sun, Cordelia Schmid

Video understanding requires reasoning at multiple spatiotemporal resolutions -- from short fine-grained motions to events taking place over longer durations.

Ranked #5 on Action Recognition on EPIC-KITCHENS-100 (using extra training data)

Action Classification Action Recognition +1

Federated Learning for Internet of Things: Applications, Challenges, and Opportunities

no code implementations 15 Nov 2021 Tuo Zhang, Lei Gao, Chaoyang He, Mi Zhang, Bhaskar Krishnamachari, Salman Avestimehr

In this paper, we will discuss the opportunities and challenges of FL in IoT platforms, as well as how it can enable diverse IoT applications.

Federated Learning

FedTune: Automatic Tuning of Federated Learning Hyper-Parameters from System Perspective

1 code implementation 6 Oct 2021 Huanle Zhang, Mi Zhang, Xin Liu, Prasant Mohapatra, Michael DeLucia

Federated learning (FL) hyper-parameters significantly affect the training overheads in terms of computation time, transmission time, computation load, and transmission load.

Federated Learning

Automatic Tuning of Federated Learning Hyper-Parameters from System Perspective

no code implementations 29 Sep 2021 Huanle Zhang, Mi Zhang, Xin Liu, Prasant Mohapatra, Michael DeLucia

Federated Learning (FL) is a distributed model training paradigm that preserves clients' data privacy.

Federated Learning

Multi-Scale Feature and Metric Learning for Relation Extraction

no code implementations 28 Jul 2021 Mi Zhang, Tieyun Qian

Specifically, we first develop a multi-scale convolutional neural network to aggregate the non-successive mainstays in the lexical sequence.

Metric Learning Relation +1
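
A generic multi-scale 1-D convolution over token embeddings, in the spirit of the description above, might look like the sketch below; the kernel sizes, pooling, and dimensions are placeholders rather than the paper's architecture.

```python
# Parallel 1-D convolutions with different kernel sizes capture cues at several
# receptive-field scales; max-pooling over time aggregates non-adjacent signals.
import torch
import torch.nn as nn

class MultiScaleConv(nn.Module):
    def __init__(self, emb_dim=128, channels=64, kernel_sizes=(3, 5, 7)):
        super().__init__()
        self.convs = nn.ModuleList(
            nn.Conv1d(emb_dim, channels, k, padding=k // 2) for k in kernel_sizes
        )

    def forward(self, x):                     # x: (batch, seq_len, emb_dim)
        x = x.transpose(1, 2)                 # Conv1d expects (batch, emb_dim, seq_len)
        feats = [conv(x).amax(dim=-1) for conv in self.convs]   # max over time
        return torch.cat(feats, dim=-1)       # (batch, channels * len(kernel_sizes))

print(MultiScaleConv()(torch.randn(2, 40, 128)).shape)   # torch.Size([2, 192])
```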

Exploring the Security Boundary of Data Reconstruction via Neuron Exclusivity Analysis

no code implementations 26 Oct 2020 Xudong Pan, Mi Zhang, Yifan Yan, Jiaming Zhu, Min Yang

Among existing privacy attacks on the gradient of neural networks, the data reconstruction attack, which reverse engineers the training batch from the gradient, poses a severe threat to private training data.

Face Recognition Reconstruction Attack
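
For context, a minimal gradient-matching reconstruction attack (in the style of deep leakage from gradients, with the label assumed known) is sketched below; the paper's neuron exclusivity analysis studies when such attacks can or cannot succeed, which this toy example does not capture.

```python
# The attacker optimizes a dummy input so that the gradient it produces matches the
# gradient shared by the victim, thereby recovering the private training example.
import torch

model = torch.nn.Linear(16, 4)
loss_fn = torch.nn.CrossEntropyLoss()

# Victim's private example and the gradient it would share.
x_true, y_true = torch.randn(1, 16), torch.tensor([2])
true_grads = torch.autograd.grad(loss_fn(model(x_true), y_true), model.parameters())

x_dummy = torch.randn(1, 16, requires_grad=True)
opt = torch.optim.Adam([x_dummy], lr=0.1)
for _ in range(300):
    opt.zero_grad()
    dummy_grads = torch.autograd.grad(
        loss_fn(model(x_dummy), y_true), model.parameters(), create_graph=True
    )
    grad_diff = sum(((dg - tg) ** 2).sum() for dg, tg in zip(dummy_grads, true_grads))
    grad_diff.backward()
    opt.step()
print("reconstruction error:", (x_dummy - x_true).norm().item())
```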

Does Unsupervised Architecture Representation Learning Help Neural Architecture Search?

1 code implementation NeurIPS 2020 Shen Yan, Yu Zheng, Wei Ao, Xiao Zeng, Mi Zhang

Existing Neural Architecture Search (NAS) methods either encode neural architectures using discrete encodings that do not scale well, or adopt supervised learning-based methods to jointly learn architecture representations and optimize architecture search on such representations, which incurs search bias.

Neural Architecture Search

Dance Revolution: Long-Term Dance Generation with Music via Curriculum Learning

no code implementations ICLR 2021 Ruozi Huang, Huang Hu, Wei Wu, Kei Sawada, Mi Zhang, Daxin Jiang

In this paper, we formalize the music-conditioned dance generation as a sequence-to-sequence learning problem and devise a novel seq2seq architecture to efficiently process long sequences of music features and capture the fine-grained correspondence between music and dance.

Motion Synthesis Pose Estimation

Transfer learning in large-scale ocean bottom seismic wavefield reconstruction

1 code implementation 15 Apr 2020 Mi Zhang, Ali Siahkoohi, Felix J. Herrmann

Because different frequency slices share information, we propose to use transfer training to make our approach computationally more efficient by warm starting the training with CNN weights obtained from a neighboring frequency slice.

Transfer Learning
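
In code, the warm-starting idea reads roughly as follows; the tiny CNN, synthetic data, and step counts are placeholders rather than the paper's wavefield-reconstruction network.

```python
# Train fully on one frequency slice, then initialize (warm start) the next slice's
# network with those weights so it needs far fewer training steps.
import copy
import torch

def train_slice(model, get_batch, steps):
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(steps):
        inputs, targets = get_batch()
        loss = torch.nn.functional.mse_loss(model(inputs), targets)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return model

make_cnn = lambda: torch.nn.Sequential(
    torch.nn.Conv2d(1, 8, 3, padding=1), torch.nn.ReLU(),
    torch.nn.Conv2d(8, 1, 3, padding=1),
)
fake_batch = lambda: (torch.randn(4, 1, 32, 32), torch.randn(4, 1, 32, 32))

model_f1 = train_slice(make_cnn(), fake_batch, steps=200)   # full training on slice f1
model_f2 = copy.deepcopy(model_f1)                          # warm start from slice f1
model_f2 = train_slice(model_f2, fake_batch, steps=20)      # short fine-tuning on slice f2
```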

MutualNet: Adaptive ConvNet via Mutual Learning from Network Width and Resolution

2 code implementations ECCV 2020 Taojiannan Yang, Sijie Zhu, Chen Chen, Shen Yan, Mi Zhang, Andrew Willis

We propose the width-resolution mutual learning method (MutualNet) to train a network that is executable at dynamic resource constraints to achieve adaptive accuracy-efficiency trade-offs at runtime.

Instance Segmentation object-detection +3
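
A simplified sketch of one mutual-learning training step, varying only the input resolution, is shown below; supporting multiple network widths would additionally require a slimmable backbone, which is omitted here, and the model and sizes are placeholders.

```python
# Accumulate gradients from forward passes at several input resolutions in a single
# optimizer step, so one set of weights learns to work across resolutions.
import torch
import torch.nn.functional as F

model = torch.nn.Sequential(
    torch.nn.Conv2d(3, 16, 3, padding=1), torch.nn.ReLU(),
    torch.nn.AdaptiveAvgPool2d(1), torch.nn.Flatten(), torch.nn.Linear(16, 10),
)
opt = torch.optim.SGD(model.parameters(), lr=0.01)

images, labels = torch.randn(8, 3, 224, 224), torch.randint(0, 10, (8,))
opt.zero_grad()
total_loss = 0.0
for size in (224, 192, 160, 128):            # one forward/backward pass per resolution
    x = F.interpolate(images, size=size, mode="bilinear", align_corners=False)
    total_loss = total_loss + F.cross_entropy(model(x), labels)
total_loss.backward()                         # gradients from all resolutions accumulate
opt.step()
```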

HM-NAS: Efficient Neural Architecture Search via Hierarchical Masking

no code implementations 31 Aug 2019 Shen Yan, Biyi Fang, Faen Zhang, Yu Zheng, Xiao Zeng, Hui Xu, Mi Zhang

Without the constraint imposed by the hand-designed heuristics, our searched networks contain more flexible and meaningful architectures that existing weight sharing based NAS approaches are not able to discover.

Neural Architecture Search

How Sequence-to-Sequence Models Perceive Language Styles?

no code implementations 16 Aug 2019 Ruozi Huang, Mi Zhang, Xudong Pan, Beina Sheng

Style is ubiquitous in our daily language use, but what is language style to learning machines?

Informativeness Style Transfer +1

NestDNN: Resource-Aware Multi-Tenant On-Device Deep Learning for Continuous Mobile Vision

no code implementations 23 Oct 2018 Biyi Fang, Xiao Zeng, Mi Zhang

These systems usually run multiple applications concurrently and their available resources at runtime are dynamic due to events such as starting new applications, closing existing applications, and application priority changes.

Theoretical Analysis of Image-to-Image Translation with Adversarial Learning

no code implementations ICML 2018 Xudong Pan, Mi Zhang, Daizong Ding

Recently, a unified model for image-to-image translation tasks within adversarial learning framework has aroused widespread research interests in computer vision practitioners.

Image-to-Image Translation Translation

Line-Based Multi-Label Energy Optimization for Fisheye Image Rectification and Calibration

no code implementations CVPR 2015 Mi Zhang, Jian Yao, Menghan Xia, Kai Li, Yi Zhang, Yaping Liu

Fisheye image rectification and estimation of intrinsic parameters for real scenes have been addressed in the literature by using line information on the distorted images.
