Search Results for author: Lifu Wang

Found 4 papers, 0 papers with code

Linear RNNs Provably Learn Linear Dynamic Systems

no code implementations19 Nov 2022 Lifu Wang, Tianyu Wang, Shengwei Yi, Bo Shen, Bo Hu, Xing Cao

We study the ability of linear recurrent neural networks, trained with gradient descent, to learn linear dynamical systems.
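
As a purely illustrative sketch of the setting (not the paper's construction: the dimensions, initialization, learning rate, and function names below are my own assumptions), one can generate trajectories from a linear dynamical system and fit a linear RNN with the same parameterization by plain gradient descent.

```python
# Hypothetical sketch: fit a linear RNN to trajectories from an unknown
# linear dynamical system using plain gradient descent.
import torch

torch.manual_seed(0)
d_state, d_in, T, n_seq = 4, 2, 20, 256

# Ground-truth system: h_{t+1} = A* h_t + B* x_t,  y_t = C* h_t
A_true = 0.9 * torch.linalg.qr(torch.randn(d_state, d_state))[0]  # spectral radius <= 0.9
B_true = 0.5 * torch.randn(d_state, d_in)
C_true = 0.5 * torch.randn(1, d_state)

def rollout(A, B, C, x):
    """Run the linear recurrence over a batch of input sequences x: (batch, T, d_in)."""
    h = torch.zeros(x.shape[0], A.shape[0])
    ys = []
    for t in range(x.shape[1]):
        h = h @ A.T + x[:, t] @ B.T
        ys.append(h @ C.T)
    return torch.stack(ys, dim=1)          # (batch, T, 1)

x = torch.randn(n_seq, T, d_in)
y = rollout(A_true, B_true, C_true, x)

# Learner: a linear RNN with the same parameterization, trained by gradient descent.
A = (0.1 * torch.randn(d_state, d_state)).requires_grad_()
B = (0.1 * torch.randn(d_state, d_in)).requires_grad_()
C = (0.1 * torch.randn(1, d_state)).requires_grad_()
opt = torch.optim.SGD([A, B, C], lr=0.01)

for step in range(2000):
    opt.zero_grad()
    loss = ((rollout(A, B, C, x) - y) ** 2).mean()
    loss.backward()
    opt.step()
print(f"final training loss: {loss.item():.5f}")
```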

On the Provable Generalization of Recurrent Neural Networks

no code implementations NeurIPS 2021 Lifu Wang, Bo Shen, Bo Hu, Xing Cao

In this paper, using a detailed analysis of the neural tangent kernel matrix, we prove a generalization error bound for learning such functions without normalization conditions, and we show that some notable concept classes are learnable with the number of iterations and samples scaling almost polynomially in the input length $L$.
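
The central object in this kind of analysis is the neural tangent kernel of the recurrent network. Below is an illustrative sketch only (the Elman-style tanh RNN, sizes, and function names are my assumptions, not the paper's construction) of how the empirical NTK Gram matrix of a sequence model can be computed from per-example parameter gradients.

```python
# Illustrative sketch: empirical neural tangent kernel of a small RNN,
# K[i, j] = <grad_theta f(x_i), grad_theta f(x_j)>.
import torch
import torch.nn as nn

torch.manual_seed(0)

class TinyRNN(nn.Module):
    def __init__(self, d_in=2, d_hidden=64):
        super().__init__()
        self.rnn = nn.RNN(d_in, d_hidden, batch_first=True)   # tanh recurrence
        self.head = nn.Linear(d_hidden, 1)

    def forward(self, x):                                      # x: (batch, L, d_in)
        out, _ = self.rnn(x)
        return self.head(out[:, -1]).squeeze(-1)               # one scalar per sequence

def empirical_ntk(model, xs):
    """Gram matrix of per-example parameter gradients (the empirical NTK)."""
    grads = []
    for x in xs:
        y = model(x.unsqueeze(0)).sum()                         # scalar output
        g = torch.autograd.grad(y, tuple(model.parameters()))
        grads.append(torch.cat([gi.reshape(-1) for gi in g]))
    G = torch.stack(grads)                                      # (n, num_params)
    return G @ G.T                                              # (n, n)

L = 10                                                          # input length
xs = torch.randn(8, L, 2)
K = empirical_ntk(TinyRNN(), xs)
print(K.shape, torch.linalg.eigvalsh(K).min().item())           # smallest eigenvalue
```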

Is the Skip Connection Provable to Reform the Neural Network Loss Landscape?

no code implementations10 Jun 2020 Lifu Wang, Bo Shen, Ning Zhao, Zhiyuan Zhang

In this paper, we follow this line of work and study the topology (sub-level sets) of the loss landscape of deep ReLU neural networks with a skip connection. We prove that the skip-connected network inherits the good properties of the two-layer network and that skip connections help control the connectedness of the sub-level sets, so that any local minimum worse than the global minimum of some two-layer ReLU network is very "shallow".
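
For concreteness, the architecture class in question can be sketched as a deep ReLU stack plus a two-layer ReLU shortcut path from the input; the layer sizes and class name below are my own illustrative choices, not the paper's.

```python
# Illustrative sketch of the architecture only: a deep ReLU network plus a
# two-layer ReLU shortcut from the input. The combined model can always realize
# the shortcut path alone, which is the intuition behind the claim that local
# minima worse than a two-layer network's global minimum are "shallow".
import torch
import torch.nn as nn

class SkipReLUNet(nn.Module):
    def __init__(self, d_in=16, d_hidden=64, depth=6):
        super().__init__()
        dims = [d_in] + [d_hidden] * depth
        deep = []
        for a, b in zip(dims[:-1], dims[1:]):
            deep += [nn.Linear(a, b), nn.ReLU()]
        self.deep = nn.Sequential(*deep, nn.Linear(d_hidden, 1))
        # Skip path: a plain two-layer ReLU network from the input.
        self.skip = nn.Sequential(nn.Linear(d_in, d_hidden), nn.ReLU(),
                                  nn.Linear(d_hidden, 1))

    def forward(self, x):
        return self.deep(x) + self.skip(x)   # deep path + shortcut path

x = torch.randn(4, 16)
print(SkipReLUNet()(x).shape)                 # torch.Size([4, 1])
```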
