Search Results for author: Jingling Li

Found 7 papers, 2 papers with code

How does a Neural Network's Architecture Impact its Robustness to Noisy Labels?

no code implementations • NeurIPS 2021 • Jingling Li, Mozhi Zhang, Keyulu Xu, John Dickerson, Jimmy Ba

Our framework measures a network's robustness via the predictive power in its representations: the test performance of a linear model trained on the learned representations using a small set of clean labels.
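
The probe the abstract describes is straightforward to sketch. Below is a minimal, hypothetical version using PyTorch and scikit-learn: `encoder`, `clean_loader`, and `test_loader` are assumed stand-ins for a noisily trained feature extractor, a small clean-label split, and a test split, not names from the paper. The returned score is the "predictive power" used as the robustness measure.

```python
# Minimal sketch of the linear-probe robustness measure described above.
# `encoder`, `clean_loader`, and `test_loader` are hypothetical stand-ins.
import numpy as np
import torch
from sklearn.linear_model import LogisticRegression

@torch.no_grad()
def extract(encoder, loader, device="cpu"):
    """Run the (noisily trained) encoder and collect representations."""
    feats, labels = [], []
    for x, y in loader:
        feats.append(encoder(x.to(device)).cpu().numpy())
        labels.append(y.numpy())
    return np.concatenate(feats), np.concatenate(labels)

def predictive_power(encoder, clean_loader, test_loader):
    """Test accuracy of a linear model trained on the learned
    representations with a small set of clean labels: the robustness proxy."""
    x_tr, y_tr = extract(encoder, clean_loader)
    x_te, y_te = extract(encoder, test_loader)
    probe = LogisticRegression(max_iter=1000).fit(x_tr, y_tr)
    return probe.score(x_te, y_te)
```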

Noisy Labels Can Induce Good Representations

no code implementations • 23 Dec 2020 • Jingling Li, Mozhi Zhang, Keyulu Xu, John P. Dickerson, Jimmy Ba

We observe that if an architecture "suits" the task, training with noisy labels can induce useful hidden representations even when the model generalizes poorly; i.e., the last few layers of the model are more negatively affected by noisy labels (see the probing sketch below).

Learning with noisy labels
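
To make the layer-wise claim concrete, here is a hypothetical extension of the probing idea above: fit one linear probe per layer and watch where accuracy degrades. `features_at` is an assumed helper (not from the paper) that returns the activations after layer `k`.

```python
# Hypothetical sketch (names are assumptions, not the paper's code):
# if label noise mainly corrupts the last few layers, probe accuracy
# should stay high for early/middle representations and drop near the output.
from sklearn.linear_model import LogisticRegression

def layerwise_probe(model, features_at, n_layers, x_clean, y_clean, x_test, y_test):
    """`features_at(model, x, k)` is an assumed helper returning the
    activations after layer k as a 2-D numpy array."""
    accs = []
    for k in range(n_layers):
        probe = LogisticRegression(max_iter=1000)
        probe.fit(features_at(model, x_clean, k), y_clean)
        accs.append(probe.score(features_at(model, x_test, k), y_test))
    return accs
```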

How Neural Networks Extrapolate: From Feedforward to Graph Neural Networks

3 code implementations • ICLR 2021 • Keyulu Xu, Mozhi Zhang, Jingling Li, Simon S. Du, Ken-ichi Kawarabayashi, Stefanie Jegelka

Second, in connection with analyzing the successes and limitations of GNNs, these results suggest a hypothesis for which we provide theoretical and empirical evidence: the success of GNNs in extrapolating algorithmic tasks to new data (e.g., larger graphs or edge weights) relies on encoding task-specific non-linearities in the architecture or features.
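
One concrete instance of "encoding task-specific non-linearities" discussed in the paper is min-aggregation for shortest-path-style tasks, where the aggregation mirrors the min in the Bellman-Ford update so the learned components only need to fit near-linear pieces. The layer below is an illustrative PyTorch sketch, not the authors' code; tensor shapes and names are assumptions.

```python
# Illustrative message-passing layer with min-aggregation (Bellman-Ford-like).
# Shapes: h (N, dim), edge_index (2, E), edge_weight (E,). All names assumed.
import torch
import torch.nn as nn

class MinAggregationLayer(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.msg = nn.Linear(2 * dim + 1, dim)    # message from (h_u, h_v, w_uv)
        self.update = nn.Linear(2 * dim, dim)

    def forward(self, h, edge_index, edge_weight):
        src, dst = edge_index
        m = self.msg(torch.cat([h[src], h[dst], edge_weight.unsqueeze(-1)], dim=-1))
        # min-aggregation over incoming messages (the task-specific non-linearity)
        agg = torch.full_like(h, float("inf"))
        agg = agg.scatter_reduce(0, dst.unsqueeze(-1).expand_as(m), m,
                                 reduce="amin", include_self=True)
        agg = torch.where(torch.isinf(agg), torch.zeros_like(agg), agg)  # no in-edges
        return self.update(torch.cat([h, agg], dim=-1))
```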

Understanding Generalization in Deep Learning via Tensor Methods

no code implementations • 14 Jan 2020 • Jingling Li, Yanchao Sun, Jiahao Su, Taiji Suzuki, Furong Huang

Recently proposed complexity measures have provided insights into the generalizability of neural networks from the perspectives of PAC-Bayes, robustness, overparametrization, compression, and so on.

Tensorial Neural Networks: Generalization of Neural Networks and Application to Model Compression

no code implementations • 25 May 2018 • Jiahao Su, Jingling Li, Bobby Bhattacharjee, Furong Huang

We propose tensorial neural networks (TNNs), a generalization of existing neural networks that extends tensor operations from low-order operands to high-order ones (see the sketch below).

Model Compression • Tensor Decomposition
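
As a rough illustration of moving from low-order to high-order tensor operations, the hypothetical layer below replaces a dense weight matrix with a factorized 4th-order contraction. This is one possible factorization chosen for intuition, not the paper's exact TNN construction; all names and shapes are assumptions.

```python
# Illustrative sketch (not the paper's implementation): a dense layer whose
# (d1*d2) x (o1*o2) weight matrix is replaced by two low-order cores,
# W[i,j,k,l] = sum_r A[i,k,r] * B[j,l,r], contracted via einsum.
import torch
import torch.nn as nn

class FactorizedTensorLayer(nn.Module):
    def __init__(self, d1, d2, o1, o2, rank):
        super().__init__()
        self.A = nn.Parameter(torch.randn(d1, o1, rank) * 0.02)
        self.B = nn.Parameter(torch.randn(d2, o2, rank) * 0.02)
        self.d1, self.d2 = d1, d2

    def forward(self, x):                      # x: (batch, d1*d2)
        x = x.view(-1, self.d1, self.d2)
        # contract the input grid against both cores without forming W explicitly
        y = torch.einsum("bij,ikr,jlr->bkl", x, self.A, self.B)
        return y.flatten(1)                    # (batch, o1*o2)
```

With d1 = d2 = o1 = o2 = 32 and rank 8, the two cores hold 2 · 32 · 32 · 8 = 16,384 parameters versus 1,048,576 for the equivalent dense matrix, which is where the compression comes from.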
