2 code implementations • 19 Feb 2024 • Hanling Yi, Feng Lin, Hongbin Li, Peiyang Ning, Xiaotian Yu, Rong Xiao
This research aims to accelerate inference for large language models (LLMs) with billions of parameters.
1 code implementation • 23 Jan 2024 • Feng Lin, Hanling Yi, Hongbin Li, Yifan Yang, Xiaotian Yu, Guangming Lu, Rong Xiao
Large language models (LLMs) commonly employ autoregressive generation during inference, leading to high memory bandwidth demand and consequently extended latency.
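To illustrate the bottleneck this paper addresses (not its proposed method), here is a minimal greedy autoregressive decoding loop using Hugging Face `transformers`, with `gpt2` as a stand-in for the billion-parameter models the work targets. Each generated token requires a full forward pass, and hence another read of the model weights, which is why generation is memory-bandwidth bound.

```python
# Illustrative sketch only: plain greedy autoregressive decoding.
# One forward pass (one sweep over the weights) per generated token.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder; the paper targets much larger LLMs
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name).eval()

input_ids = tokenizer("The quick brown fox", return_tensors="pt").input_ids
past_key_values = None
for _ in range(20):
    with torch.no_grad():
        out = model(
            input_ids[:, -1:] if past_key_values is not None else input_ids,
            past_key_values=past_key_values,
            use_cache=True,
        )
    past_key_values = out.past_key_values
    # Greedy choice of the next token; sampling would work the same way.
    next_token = out.logits[:, -1, :].argmax(dim=-1, keepdim=True)
    input_ids = torch.cat([input_ids, next_token], dim=-1)

print(tokenizer.decode(input_ids[0]))
```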
1 code implementation • 21 Mar 2022 • Xiaotian Yu, Yifan Yang, Aibo Wang, Ling Xing, Hanling Yi, Guangming Lu, Xiaoyu Wang
Face clustering is an essential task in computer vision due to the explosion of related applications such as augmented reality or photo album management.
no code implementations • 30 Jul 2021 • Xiaotian Yu, Hanling Yi, Yi Yu, Ling Xing, Shiliang Zhang, Xiaoyu Wang
There has been a recent surge of research interest in attacking the problem of social relation inference based on images.
no code implementations • 17 Sep 2020 • Marcus Kalander, Min Zhou, Chengzhi Zhang, Hanling Yi, Lujia Pan
We conduct extensive experiments on real-world traffic datasets collected from telecommunication networks.
no code implementations • 12 Sep 2019 • Tianyu Zhao, Hanling Yi, Minghua Chen, Chenye Wu, Yunjian Xu
We consider the scenario where $N$ utilities strategically bid for electricity in the day-ahead market and balance the mismatch between the committed supply and actual demand in the real-time market, taking into account uncertainty in demand and local renewable generation.
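A schematic two-settlement cost for utility $i$ (an illustrative formulation, not necessarily the paper's exact model), with day-ahead commitment $q_i$, day-ahead price $p^{\mathrm{DA}}$, real-time price $p^{\mathrm{RT}}$, realized demand $d_i$, and local renewable output $r_i$:

$$C_i(q_i) = p^{\mathrm{DA}}\, q_i + p^{\mathrm{RT}}\,\big(d_i - r_i - q_i\big),$$

where the second term settles the real-time mismatch between the committed supply $q_i$ and the actual net demand $d_i - r_i$.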
no code implementations • 17 Nov 2017 • Palma London, Shai Vardi, Adam Wierman, Hanling Yi
This paper presents an acceleration framework for packing linear programming problems where the amount of data available is limited, i.e., where the number of constraints $m$ is small compared to the variable dimension $n$. The framework can be used as a black box to speed up linear programming solvers dramatically, by two orders of magnitude in our experiments.
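For context, a packing LP in the regime described above (maximize $c^\top x$ subject to $Ax \le b$, $x \ge 0$, with nonnegative data and $m \ll n$) can be set up and solved with an off-the-shelf solver as in the sketch below. This is not the paper's acceleration framework; the random problem sizes and the use of SciPy's `linprog` are assumptions for illustration, showing the kind of solver such a framework could wrap as a black box.

```python
# Illustrative sketch only: a random packing LP with far fewer
# constraints (m) than variables (n), solved by a generic LP solver.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
m, n = 10, 10_000          # m << n, the regime the paper targets
A = rng.random((m, n))     # nonnegative constraint matrix
b = np.ones(m)
c = rng.random(n)          # nonnegative objective coefficients

# linprog minimizes, so negate c to maximize c^T x subject to Ax <= b, x >= 0.
res = linprog(-c, A_ub=A, b_ub=b, bounds=(0, None), method="highs")
print("optimal packing value:", -res.fun)
```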