Search Results for author: Pengyun Yue

Found 2 papers, 0 papers with code

Accelerated Gradient Algorithms with Adaptive Subspace Search for Instance-Faster Optimization

no code implementations6 Dec 2023 Yuanshi Liu, Hanzhen Zhao, Yang Xu, Pengyun Yue, Cong Fang

In this paper, we open up a new way to design and analyze gradient-based algorithms with direct applications in machine learning, including linear regression and beyond.
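The abstract names linear regression as a direct application of the gradient-based algorithms it studies. As context for what such methods accelerate, here is a minimal sketch of the plain gradient-descent baseline on a least-squares linear regression problem; all names (`A`, `b`, `x`, `lr`) and the step-size choice are our own illustrative assumptions, not details from the paper.

```python
import numpy as np

# Baseline: plain gradient descent on 0.5 * ||A x - b||^2, the kind of
# quadratic objective the paper's accelerated methods improve on.
# This is an illustrative sketch, not the paper's algorithm.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 5))
x_true = rng.standard_normal(5)
b = A @ x_true

x = np.zeros(5)
lr = 1.0 / np.linalg.norm(A, 2) ** 2   # step size 1/L, L = smoothness constant
for _ in range(500):
    grad = A.T @ (A @ x - b)           # gradient of the least-squares objective
    x -= lr * grad

print(np.allclose(x, x_true, atol=1e-3))
```

Accelerated and subspace-adaptive variants target exactly this loop: fewer iterations by exploiting curvature structure of the problem instance.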

CORE: Common Random Reconstruction for Distributed Optimization with Provable Low Communication Complexity

no code implementations23 Sep 2023 Pengyun Yue, Hanzhen Zhao, Cong Fang, Di He, LiWei Wang, Zhouchen Lin, Song-Chun Zhu

With distributed machine learning being a prominent technique for large-scale machine learning tasks, communication complexity has become a major bottleneck for speeding up training and scaling up the number of machines.

Distributed Optimization
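Reading only the title, "common random reconstruction" suggests compressing messages with a random projection whose seed is shared by sender and receiver, so only a low-dimensional vector is communicated. The sketch below illustrates that generic idea; the function names, the projection scaling, and the transpose-based reconstruction rule are our assumptions for illustration, not the paper's scheme.

```python
import numpy as np

d, m = 1000, 100  # full gradient dimension vs. compressed dimension sent

def project(grad, seed):
    # Common seed lets every machine regenerate the same random matrix S,
    # so S itself never needs to be communicated.
    rng = np.random.default_rng(seed)
    S = rng.standard_normal((m, d)) / np.sqrt(m)
    return S @ grad                    # send m numbers instead of d

def reconstruct(compressed, seed):
    rng = np.random.default_rng(seed)  # regenerate the identical S
    S = rng.standard_normal((m, d)) / np.sqrt(m)
    return S.T @ compressed            # E[S^T S] = I, so this is unbiased

rng = np.random.default_rng(1)
g = rng.standard_normal(d)
g_hat = reconstruct(project(g, seed=42), seed=42)
print(g_hat.shape)                     # full-dimensional, noisy estimate of g
```

The communication saving is the ratio `m / d` per round; the cost is variance in the reconstructed vector, which provable schemes control across iterations.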
