Search Results for author: Yang Yong

Found 5 papers, 3 papers with code

Reducing Events to Augment Log-based Anomaly Detection Models: An Empirical Study

no code implementations7 Sep 2024 Lingzhe Zhang, Tong Jia, Kangjin Wang, Mengxi Jia, Yang Yong, Ying Li

Experimental results show that LogCleaner can remove over 70% of the log events used in anomaly detection, speed up model inference by roughly 300%, and consistently improve the performance of anomaly detection models.
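
The snippet above does not spell out how events are reduced, so here is a minimal, hedged sketch of one plausible reduction step: dropping events that appear in nearly every log sequence, on the assumption that such events carry little anomaly signal. The function name, threshold, and document-frequency heuristic are all illustrative assumptions, not LogCleaner's actual algorithm.

```python
from collections import Counter

def reduce_events(sequences, df_threshold=0.9):
    """Hypothetical event filter: drop events occurring in nearly every
    sequence, since they carry little signal for anomaly detection."""
    n = len(sequences)
    doc_freq = Counter()
    for seq in sequences:
        for event in set(seq):           # count each event once per sequence
            doc_freq[event] += 1
    noisy = {e for e, c in doc_freq.items() if c / n >= df_threshold}
    return [[e for e in seq if e not in noisy] for seq in sequences]

# The ubiquitous "heartbeat" event is filtered out of every sequence.
seqs = [["login", "heartbeat", "read"], ["heartbeat", "write"], ["heartbeat"]]
print(reduce_events(seqs))  # [['login', 'read'], ['write'], []]
```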

LLMC: Benchmarking Large Language Model Quantization with a Versatile Compression Toolkit

1 code implementation9 May 2024 Ruihao Gong, Yang Yong, Shiqiao Gu, Yushi Huang, Chentao Lv, Yunchen Zhang, Xianglong Liu, DaCheng Tao

Recent advancements in large language models (LLMs) are propelling us toward artificial general intelligence with their remarkable emergent abilities and reasoning capabilities.

Benchmarking, Computational Efficiency, +3
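
For readers unfamiliar with what such a toolkit benchmarks, below is a minimal, generic sketch of symmetric per-channel weight quantization, one of the basic techniques an LLM quantization benchmark covers. This is not LLMC's API; the function name and parameters are illustrative assumptions.

```python
import numpy as np

def quantize_per_channel(w, bits=8):
    """Generic symmetric per-channel quantization sketch (not LLMC's API):
    scale each output channel so its max magnitude maps to the INT range."""
    qmax = 2 ** (bits - 1) - 1                        # e.g. 127 for INT8
    scale = np.abs(w).max(axis=1, keepdims=True) / qmax
    scale[scale == 0] = 1.0                           # avoid division by zero
    q = np.clip(np.round(w / scale), -qmax - 1, qmax).astype(np.int8)
    return q, scale

w = np.random.randn(4, 16).astype(np.float32)         # a toy weight matrix
q, s = quantize_per_channel(w)
print(np.abs(w - q * s).max())                        # small dequantization error
```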

Fast and Controllable Post-training Sparsity: Learning Optimal Sparsity Allocation with Global Constraint in Minutes

1 code implementation9 May 2024 Ruihao Gong, Yang Yong, Zining Wang, Jinyang Guo, Xiuying Wei, Yuqing Ma, Xianglong Liu

Previous methods for finding sparsity rates mainly target the training-aware scenario, and they usually fail to converge stably in the post-training sparsity (PTS) setting, where data is limited and the training budget is far smaller.
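
To make the "global constraint" concrete, the sketch below shows a common global magnitude-pruning baseline: a single magnitude threshold, chosen to meet an overall sparsity budget, implicitly assigns each layer its own sparsity rate. This baseline is an assumption for illustration, not the paper's learned allocation method.

```python
import numpy as np

def global_magnitude_prune(layers, sparsity=0.5):
    """Baseline sketch: one global magnitude threshold satisfies the overall
    sparsity budget and yields a different sparsity rate per layer."""
    all_w = np.concatenate([w.ravel() for w in layers])
    threshold = np.quantile(np.abs(all_w), sparsity)
    pruned = [np.where(np.abs(w) > threshold, w, 0.0) for w in layers]
    rates = [float((w == 0).mean()) for w in pruned]
    return pruned, rates

layers = [np.random.randn(64, 64), np.random.randn(128, 64)]
_, per_layer = global_magnitude_prune(layers, sparsity=0.5)
print(per_layer)  # per-layer rates differ, overall budget stays near 50%
```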

Compressing Models with Few Samples: Mimicking then Replacing

1 code implementation CVPR 2022 Huanyu Wang, Junjie Liu, Xin Ma, Yang Yong, Zhenhua Chai, Jianxin Wu

Hence, previous methods optimize the compressed model layer-by-layer and try to make every layer have the same outputs as the corresponding layer in the teacher model, which is cumbersome.
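
As a hedged PyTorch sketch of the layer-by-layer mimicking described above (the prior approach the snippet criticizes, not the paper's own mimicking-then-replacing method), a compressed layer can be trained to reproduce its teacher layer's outputs on a few samples; all names and sizes here are illustrative.

```python
import torch
import torch.nn as nn

teacher = nn.Linear(32, 32)   # one layer of the (frozen) teacher model
student = nn.Linear(32, 32)   # stand-in for the corresponding compressed layer
opt = torch.optim.SGD(student.parameters(), lr=0.1)

for _ in range(100):
    x = torch.randn(64, 32)   # a small batch, mimicking the few-sample setting
    loss = nn.functional.mse_loss(student(x), teacher(x).detach())
    opt.zero_grad()
    loss.backward()
    opt.step()

print(loss.item())  # the student layer now closely tracks the teacher layer
```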

Design and Implementation of Smart Cooking Based on Amazon Echo

no code implementations4 Dec 2018 Lin Xiaoguang, Yang Yong, Zhang Ju

Smart cooking based on Amazon Echo uses the Internet of Things and cloud computing to assist with cooking food.

Cloud Computing
