Search Results for author: Yijun Dong

Found 4 papers, 0 papers with code

Randomized Dimension Reduction with Statistical Guarantees

no code implementations · 3 Oct 2023 · Yijun Dong

From the computational efficiency perspective, we design and analyze fast randomized low-rank decomposition algorithms for large matrices based on "matrix sketching", which can be regarded as a dimension reduction strategy in the data space.

Computational Efficiency, Data Augmentation, +4
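No code accompanies this paper, but the "matrix sketching" idea the abstract describes — compressing a large matrix with a random test matrix before factorizing it — can be illustrated with a generic randomized range finder. This is a minimal sketch of the general technique, not the paper's specific algorithms:

```python
import numpy as np

def sketched_low_rank(A, rank, oversample=10, seed=0):
    """Randomized low-rank approximation via Gaussian matrix sketching.

    Sketches the range of A with a random test matrix, then computes a
    small SVD in the compressed space (standard randomized-SVD recipe).
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    # Sketch: a random Gaussian test matrix compresses the data space.
    Omega = rng.standard_normal((n, rank + oversample))
    Y = A @ Omega                        # (m, rank+p) sketch of range(A)
    Q, _ = np.linalg.qr(Y)               # orthonormal basis for the sketch
    # Project A onto the low-dimensional subspace; SVD there is cheap.
    B = Q.T @ A                          # (rank+p, n)
    U_small, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ U_small
    return U[:, :rank], s[:rank], Vt[:rank]

# Usage: approximate a matrix that is exactly rank 20.
rng = np.random.default_rng(1)
A = rng.standard_normal((500, 20)) @ rng.standard_normal((20, 300))
U, s, Vt = sketched_low_rank(A, rank=20)
rel_err = np.linalg.norm(A - (U * s) @ Vt) / np.linalg.norm(A)
```

Because the sketch captures the full column space of a rank-20 matrix, the relative reconstruction error here is near machine precision; the paper's contribution concerns statistical guarantees for such reductions, which this sketch does not reproduce.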

Adaptively Weighted Data Augmentation Consistency Regularization for Robust Optimization under Concept Shift

no code implementations · 4 Oct 2022 · Yijun Dong, Yuege Xie, Rachel Ward

At the saddle point of the underlying objective, the weights assign label-dense samples to the supervised loss and label-sparse samples to the unsupervised consistency regularization.

Data Augmentation, Image Segmentation, +3

Sample Efficiency of Data Augmentation Consistency Regularization

no code implementations · 24 Feb 2022 · Shuo Yang, Yijun Dong, Rachel Ward, Inderjit S. Dhillon, Sujay Sanghavi, Qi Lei

Data augmentation is popular in the training of large neural networks; currently, however, there is no clear theoretical comparison between different algorithmic choices on how to use augmented data.

Data Augmentation, Generalization Bounds

Theoretical Analysis of Consistency Regularization with Limited Augmented Data

no code implementations · 29 Sep 2021 · Shuo Yang, Yijun Dong, Rachel Ward, Inderjit S. Dhillon, Sujay Sanghavi, Qi Lei

Data augmentation is popular in the training of large neural networks; currently, however, there is no clear theoretical comparison between different algorithmic choices on how to use augmented data.

Data Augmentation, Generalization Bounds, +1
