Search Results for author: Shihe Wang

Found 7 papers, 5 papers with code

DroidCall: A Dataset for LLM-powered Android Intent Invocation

1 code implementation • 30 Nov 2024 • Weikai Xie, Li Zhang, Shihe Wang, Rongjie Yi, Mengwei Xu

Given a task instruction in natural language, small language models such as Qwen2.5-3B and Gemma2-2B fine-tuned with DroidCall can approach or even surpass the capabilities of GPT-4o for accurate Android intent invocation.

Natural Language Understanding
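
For context, a minimal sketch of what LLM-driven intent invocation can look like in practice: the model emits a structured call that is translated into an adb command. The JSON fields and helper below are illustrative assumptions, not DroidCall's actual schema.

```python
# Hedged sketch: turning a model-emitted "intent call" (JSON) into an adb command.
# The schema and helper name are illustrative assumptions, not DroidCall's format.
import json
import shlex

def intent_call_to_adb(call_json: str) -> str:
    """Map a structured intent invocation to an `adb shell am start` command."""
    call = json.loads(call_json)
    parts = ["adb", "shell", "am", "start", "-a", call["action"]]
    if "data" in call:
        parts += ["-d", call["data"]]
    for key, value in call.get("extras", {}).items():
        parts += ["--es", key, str(value)]  # string extras
    return " ".join(shlex.quote(p) for p in parts)

# Example output a fine-tuned small model might produce for
# "Send an email to alice@example.com about the meeting":
model_output = json.dumps({
    "action": "android.intent.action.SENDTO",
    "data": "mailto:alice@example.com",
    "extras": {"android.intent.extra.SUBJECT": "Meeting"},
})
print(intent_call_to_adb(model_output))
```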

LlamaTouch: A Faithful and Scalable Testbed for Mobile UI Task Automation

1 code implementation • 12 Apr 2024 • Li Zhang, Shihe Wang, Xianqing Jia, Zhihan Zheng, Yunhe Yan, Longxi Gao, Yuanchun Li, Mengwei Xu

LlamaTouch comprises three key techniques: (1) on-device task execution, which enables mobile agents to interact with realistic mobile environments.
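
The snippet above describes on-device execution; a rough sketch of such an agent-device loop over plain adb is shown below. The `propose_action` callback is a hypothetical stand-in for a mobile agent and does not reflect LlamaTouch's actual API.

```python
# Hedged sketch of an agent-device loop: capture the screen, ask the agent for
# an action, execute it via adb. Not LlamaTouch's API; `propose_action` is hypothetical.
import subprocess

def adb(*args: str) -> bytes:
    return subprocess.run(["adb", *args], check=True, capture_output=True).stdout

def run_task(task: str, propose_action, max_steps: int = 10) -> None:
    for _ in range(max_steps):
        screenshot_png = adb("exec-out", "screencap", "-p")  # current UI state
        action = propose_action(task, screenshot_png)        # e.g. {"type": "tap", "x": 540, "y": 960}
        if action["type"] == "tap":
            adb("shell", "input", "tap", str(action["x"]), str(action["y"]))
        elif action["type"] == "type":
            adb("shell", "input", "text", action["text"].replace(" ", "%s"))
        elif action["type"] == "done":
            break
```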

A Survey of Resource-efficient LLM and Multimodal Foundation Models

1 code implementation • 16 Jan 2024 • Mengwei Xu, Wangsong Yin, Dongqi Cai, Rongjie Yi, Daliang Xu, QiPeng Wang, Bingyang Wu, Yihao Zhao, Chen Yang, Shihe Wang, Qiyang Zhang, Zhenyan Lu, Li Zhang, Shangguang Wang, Yuanchun Li, Yunxin Liu, Xin Jin, Xuanzhe Liu

Large foundation models, including large language models (LLMs), vision transformers (ViTs), diffusion models, and LLM-based multimodal models, are revolutionizing the entire machine learning lifecycle, from training to deployment.

Survey

Mobile Foundation Model as Firmware

1 code implementation • 28 Aug 2023 • Jinliang Yuan, Chen Yang, Dongqi Cai, Shihe Wang, Xin Yuan, Zeling Zhang, Xiang Li, Dingge Zhang, Hanzi Mei, Xianqing Jia, Shangguang Wang, Mengwei Xu

Concurrently, each app contributes a concise, offline fine-tuned "adapter" tailored to distinct downstream tasks.

model
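
A minimal sketch of the base-model-plus-per-app-adapter pattern the snippet describes, assuming Hugging Face PEFT with LoRA adapters; the model name and adapter paths are placeholders rather than the paper's actual runtime.

```python
# Hedged sketch: one shared base model, per-app adapters swapped in per request.
# Model name and adapter paths are placeholders; the paper's runtime differs.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_name = "Qwen/Qwen2.5-3B"  # placeholder shared "firmware" model
tokenizer = AutoTokenizer.from_pretrained(base_name)
base = AutoModelForCausalLM.from_pretrained(base_name)

# Each app ships a small fine-tuned adapter for its own downstream task.
model = PeftModel.from_pretrained(base, "adapters/email_app", adapter_name="email_app")
model.load_adapter("adapters/calendar_app", adapter_name="calendar_app")

def run(app: str, prompt: str) -> str:
    model.set_adapter(app)  # activate only the requesting app's adapter
    inputs = tokenizer(prompt, return_tensors="pt")
    out = model.generate(**inputs, max_new_tokens=64)
    return tokenizer.decode(out[0], skip_special_tokens=True)
```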

Boosting the Discriminant Power of Naive Bayes

no code implementations • 20 Sep 2022 • Shihe Wang, Jianfeng Ren, Xiaoyu Lian, Ruibin Bai, Xudong Jiang

In this paper, we propose a feature augmentation method that employs a stacked auto-encoder to reduce noise in the data and boost the discriminant power of naive Bayes.
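
A loose sketch of the general idea only, not the paper's architecture or training scheme: train an auto-encoder on the features, append its bottleneck codes to the original features, and fit Gaussian naive Bayes on the augmented representation.

```python
# Hedged sketch of feature augmentation for naive Bayes with an auto-encoder.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
scaler = StandardScaler().fit(X_tr)
X_tr, X_te = scaler.transform(X_tr), scaler.transform(X_te)

# Auto-encoder: reconstruct the inputs; the middle layer is the bottleneck code.
ae = MLPRegressor(hidden_layer_sizes=(16, 8, 16), max_iter=2000, random_state=0)
ae.fit(X_tr, X_tr)

def encode(X):
    """Forward pass up to the 8-unit bottleneck (ReLU hidden layers)."""
    h = np.maximum(X @ ae.coefs_[0] + ae.intercepts_[0], 0)
    return np.maximum(h @ ae.coefs_[1] + ae.intercepts_[1], 0)

# Augment the original features with the learned codes, then fit naive Bayes.
gnb = GaussianNB().fit(np.hstack([X_tr, encode(X_tr)]), y_tr)
print("accuracy:", gnb.score(np.hstack([X_te, encode(X_te)]), y_te))
```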

A Max-relevance-min-divergence Criterion for Data Discretization with Applications on Naive Bayes

no code implementations • 20 Sep 2022 • Shihe Wang, Jianfeng Ren, Ruibin Bai, Yuan YAO, Xudong Jiang

We therefore propose a Max-Dependency-Min-Divergence (MDmD) criterion that maximizes both the discriminant information and the generalization ability of the discretized data.

Attribute Classification
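
A loose, illustrative scorer in the spirit of the snippet, not the paper's exact MDmD/MRmD formulation: reward the mutual information between a candidate discretization and the class labels, and penalize the divergence between the bin distributions on training and held-out data as a rough proxy for generalization. The trade-off weight `lam` is an assumption.

```python
# Illustrative scorer for a candidate discretization (NOT the paper's formulation):
# higher mutual information with the class is better; train/held-out divergence is penalized.
import numpy as np
from scipy.spatial.distance import jensenshannon
from sklearn.metrics import mutual_info_score

def bin_distribution(codes: np.ndarray, n_bins: int) -> np.ndarray:
    counts = np.bincount(codes, minlength=n_bins).astype(float)
    return counts / counts.sum()

def discretization_score(x_train, y_train, x_holdout, bin_edges, lam=1.0) -> float:
    codes_tr = np.digitize(x_train, bin_edges)
    codes_ho = np.digitize(x_holdout, bin_edges)
    n_bins = len(bin_edges) + 1
    relevance = mutual_info_score(codes_tr, y_train)  # discriminant information
    divergence = jensenshannon(bin_distribution(codes_tr, n_bins),
                               bin_distribution(codes_ho, n_bins)) ** 2
    return relevance - lam * divergence  # higher is better

# Compare two candidate binnings of one synthetic feature:
rng = np.random.default_rng(0)
x = rng.normal(size=400)
y = (x + 0.5 * rng.normal(size=400) > 0).astype(int)
x_tr, x_ho, y_tr = x[:300], x[300:], y[:300]
for edges in (np.quantile(x_tr, [0.5]), np.quantile(x_tr, [0.2, 0.4, 0.6, 0.8])):
    print(len(edges) + 1, "bins:", round(discretization_score(x_tr, y_tr, x_ho, edges), 4))
```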
