Proximal SCOPE for Distributed Sparse Learning

NeurIPS 2018. Shenyi Zhao, Gong-Duo Zhang, Ming-Wei Li, Wu-Jun Li

Distributed sparse learning with a cluster of multiple machines has attracted much attention in machine learning, especially for large-scale applications with high-dimensional data. One popular way to implement sparse learning is to use L1 regularization...
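The abstract points to L1 regularization as a standard route to sparsity. The proximal operator of the L1 norm is the soft-thresholding operator, which is the basic building block of proximal methods like the one named in the paper's title. A minimal sketch of a single proximal gradient step for L1-regularized least squares (an illustration of the general technique, not the paper's distributed algorithm; `prox_grad_step` and its parameters are hypothetical names):

```python
import numpy as np

def soft_threshold(w, threshold):
    """Proximal operator of threshold * ||w||_1: shrinks each
    coordinate toward zero, producing exact zeros (sparsity)."""
    return np.sign(w) * np.maximum(np.abs(w) - threshold, 0.0)

def prox_grad_step(w, X, y, lam, eta):
    """One proximal gradient step for
    f(w) = 0.5/n * ||Xw - y||^2 + lam * ||w||_1:
    a gradient step on the smooth part, then soft-thresholding."""
    grad = X.T @ (X @ w - y) / len(y)  # gradient of the smooth loss
    return soft_threshold(w - eta * grad, eta * lam)
```

Coordinates whose magnitude falls below the threshold are set exactly to zero, which is why proximal methods recover sparse solutions rather than merely small weights.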
