no code implementations • 3 Mar 2023 • Kazusato Oko, Shunta Akiyama, Taiji Suzuki
While efficient distribution learning is no doubt behind the groundbreaking success of diffusion modeling, its theoretical guarantees are quite limited.
no code implementations • 1 Sep 2022 • Kazusato Oko, Shunta Akiyama, Tomoya Murata, Taiji Suzuki
While variance reduction methods have shown great success in solving large-scale optimization problems, many of them suffer from accumulated errors and therefore must periodically recompute the full gradient.
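The periodic full-gradient computation described here is the hallmark of SVRG-style variance reduction. Below is a minimal, hypothetical sketch (on a toy least-squares problem, not the paper's method): at each anchor point the full gradient is computed once, and subsequent stochastic steps are corrected by the difference between the current and anchor per-sample gradients.

```python
# Hypothetical SVRG sketch on least squares: min_w (1/2n) * ||A w - b||^2.
# Illustrates why variance-reduced methods periodically need the full gradient.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 5))
b = rng.standard_normal(100)
n = len(b)

def full_grad(w):
    # Full gradient over all n samples -- the periodic, expensive step.
    return A.T @ (A @ w - b) / n

def stoch_grad(w, i):
    # Gradient of a single sample i.
    return A[i] * (A[i] @ w - b[i])

w = np.zeros(5)
lr = 0.01
for epoch in range(20):
    w_anchor = w.copy()
    mu = full_grad(w_anchor)  # recompute full gradient at the anchor
    for _ in range(n):
        i = rng.integers(n)
        # Variance-reduced estimate: unbiased, with vanishing variance
        # as w approaches w_anchor and both approach the optimum.
        g = stoch_grad(w, i) - stoch_grad(w_anchor, i) + mu
        w -= lr * g
```

Without the periodic `full_grad` recomputation, the correction term `mu` becomes stale and the accumulated error mentioned in the abstract grows; recomputing it resets that error at the cost of a full pass over the data.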
no code implementations • 30 May 2022 • Shunta Akiyama, Taiji Suzuki
While deep learning has outperformed other methods on various tasks, theoretical frameworks that explain why it succeeds have not been fully established.
no code implementations • 11 Jun 2021 • Shunta Akiyama, Taiji Suzuki
Deep learning empirically achieves high performance in many applications, but its training dynamics have not been fully understood theoretically.
no code implementations • ICLR 2021 • Taiji Suzuki, Shunta Akiyama
Establishing a theoretical analysis that explains why deep learning can outperform shallow learning such as kernel methods is one of the biggest issues in the deep learning literature.