1 code implementation • 18 Apr 2024 • Jingmin Sun, Yuxuan Liu, Zecheng Zhang, Hayden Schaeffer
More importantly, we provide three extrapolation studies to demonstrate that PROSE-PDE can generalize physical features through the robust training of multiple operators, and that the proposed model can extrapolate to predict PDE solutions whose models or data were unseen during training.
no code implementations • 29 Oct 2023 • Zecheng Zhang, Christian Moya, Lu Lu, Guang Lin, Hayden Schaeffer
Neural operators have been applied in various scientific fields, such as solving parametric partial differential equations, dynamical systems with control, and inverse problems.
1 code implementation • 28 Sep 2023 • Yuxuan Liu, Zecheng Zhang, Hayden Schaeffer
Approximating nonlinear differential equations using a neural network provides a robust and efficient tool for various scientific computing tasks, including real-time prediction, inverse problems, optimal control, and surrogate modeling.
1 code implementation • 11 Dec 2022 • Yuxuan Liu, Scott G. McCalla, Hayden Schaeffer
Particle dynamics and multi-agent systems provide accurate dynamical models for studying and forecasting the behavior of complex interacting systems.
no code implementations • 14 Apr 2022 • Zhijun Chen, Hayden Schaeffer, Rachel Ward
The spectra of random feature matrices provide essential information on the conditioning of the linear system used in random feature regression problems and are thus connected to the consistency and generalization of random feature models.
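For context, here is a minimal sketch of forming a random feature matrix and inspecting the conditioning of the associated regression system; the Gaussian weights, cosine features, and problem sizes are illustrative assumptions rather than the paper's specific setup.

```python
import numpy as np

rng = np.random.default_rng(0)

m, d, N = 500, 5, 100                     # samples, input dimension, random features
X = rng.uniform(-1, 1, size=(m, d))       # data samples
W = rng.normal(size=(d, N))               # random feature weights (assumed Gaussian)
b = rng.uniform(0, 2 * np.pi, size=N)     # random phases

A = np.cos(X @ W + b)                     # m x N random feature matrix

# The conditioning of A controls how stably the linear system A c ≈ y
# can be solved in random feature regression.
print("cond(A) =", np.linalg.cond(A))

# Ridge-regularized random feature regression for a sample target y.
y = np.sin(X @ rng.normal(size=d))
lam = 1e-6
c = np.linalg.solve(A.T @ A + lam * np.eye(N), A.T @ y)
```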
1 code implementation • 12 Apr 2022 • Nicholas Richardson, Hayden Schaeffer, Giang Tran
Signal decomposition and multiscale signal analysis provide many useful tools for time-frequency analysis.
1 code implementation • 6 Feb 2022 • Esha Saha, Hayden Schaeffer, Giang Tran
We prove that the HARFE method is guaranteed to converge with a given error bound depending on the noise and the parameters of the sparse ridge regression model.
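As a rough sketch of the general setting, sparse ridge regression over a random feature basis can be illustrated with a generic hard-thresholded gradient iteration; the loop below is an illustration of that idea, not necessarily the paper's exact HARFE algorithm.

```python
import numpy as np

def hard_threshold(c, s):
    """Keep the s largest-magnitude entries of c and zero out the rest."""
    out = np.zeros_like(c)
    idx = np.argsort(np.abs(c))[-s:]
    out[idx] = c[idx]
    return out

def sparse_ridge_random_features(A, y, s, lam=1e-3, iters=100):
    """Iterative hard thresholding on the ridge objective
    0.5 * ||A c - y||^2 + 0.5 * lam * ||c||^2, keeping at most s coefficients."""
    N = A.shape[1]
    step = 1.0 / (np.linalg.norm(A, 2) ** 2 + lam)   # conservative step size
    c = np.zeros(N)
    for _ in range(iters):
        grad = A.T @ (A @ c - y) + lam * c
        c = hard_threshold(c - step * grad, s)
    return c
```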
1 code implementation • 7 Dec 2021 • Yuege Xie, Bobby Shi, Hayden Schaeffer, Rachel Ward
Inspired by the success of the iterative magnitude pruning technique in finding lottery tickets of neural networks, we propose a new method -- Sparser Random Feature Models via IMP (ShRIMP) -- to efficiently fit high-dimensional data with inherent low-dimensional structure in the form of sparse variable dependencies.
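The iterative magnitude pruning idea applied to random feature coefficients can be sketched as follows; the pruning fraction, number of rounds, and ridge refit are illustrative assumptions, not the exact ShRIMP procedure.

```python
import numpy as np

def imp_random_feature_fit(A, y, lam=1e-6, prune_frac=0.5, rounds=4):
    """Fit a ridge model on the random feature matrix A, then repeatedly
    prune the smallest-magnitude coefficients and refit on the surviving
    features (an iterative-magnitude-pruning loop with an assumed schedule)."""
    N = A.shape[1]
    active = np.arange(N)
    for _ in range(rounds):
        A_sub = A[:, active]
        c = np.linalg.solve(A_sub.T @ A_sub + lam * np.eye(len(active)),
                            A_sub.T @ y)
        keep = max(1, int(len(active) * prune_frac))
        active = active[np.argsort(np.abs(c))[-keep:]]
    # Final refit on the surviving sparse set of features.
    A_sub = A[:, active]
    c = np.linalg.solve(A_sub.T @ A_sub + lam * np.eye(len(active)), A_sub.T @ y)
    c_full = np.zeros(N)
    c_full[active] = c
    return c_full
```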
no code implementations • 21 Oct 2021 • Zhijun Chen, Hayden Schaeffer
In particular, we show that if the complexity ratio $\frac{N}{m}$, where $N$ is the number of neurons and $m$ is the number of data samples, scales like $\log^{-1}(N)$ or $\log(m)$, then the random feature matrix is well-conditioned.
2 code implementations • 4 Mar 2021 • Abolfazl Hashemi, Hayden Schaeffer, Robert Shi, Ufuk Topcu, Giang Tran, Rachel Ward
In particular, we provide generalization bounds for functions in a certain class (that is dense in a reproducing kernel Hilbert space) depending on the number of samples and the distribution of features.
no code implementations • 17 Dec 2020 • Kayla Bollinger, Hayden Schaeffer
The neural network takes the form of a "three-layer" network with the first layer constrained to lie on the Grassmann manifold and the first activation function set to identity, while the remaining network is a standard two-layer ReLU neural network.
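A minimal sketch of that architecture (the dimensions, initialization, and QR retraction below are assumptions for illustration, not the paper's exact setup):

```python
import torch
import torch.nn as nn

class GrassmannThreeLayerNet(nn.Module):
    """Sketch: the first layer is an orthonormal-column matrix U (a
    representative of a point on the Grassmann manifold) with identity
    activation, followed by a standard two-layer ReLU network."""

    def __init__(self, d_in, d_sub, d_hidden, d_out):
        super().__init__()
        U, _ = torch.linalg.qr(torch.randn(d_in, d_sub))
        self.U = nn.Parameter(U)                       # constrained first layer
        self.relu_net = nn.Sequential(
            nn.Linear(d_sub, d_hidden), nn.ReLU(), nn.Linear(d_hidden, d_out)
        )

    def forward(self, x):
        return self.relu_net(x @ self.U)               # identity activation here

    @torch.no_grad()
    def retract(self):
        # Re-orthonormalize U after a gradient step to stay on the manifold.
        Q, _ = torch.linalg.qr(self.U)
        self.U.copy_(Q)
```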
no code implementations • 8 Aug 2019 • Yifan Sun, Linan Zhang, Hayden Schaeffer
We propose a neural network based approach for extracting models from dynamic data using ordinary and partial differential equations.
no code implementations • 5 Aug 2019 • Hayden Schaeffer, Scott G. McCalla
We provide larger step-size restrictions for which gradient descent based algorithms (almost surely) avoid strict saddle points.
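As a toy illustration only (not the paper's analysis), gradient descent on the classic strict-saddle function $f(x, y) = \frac{1}{2}(x^2 - y^2)$ escapes the saddle at the origin from a generic initialization; the step size below is chosen purely for the demo.

```python
import numpy as np

# f(x, y) = 0.5 * (x**2 - y**2) has a strict saddle at the origin:
# its Hessian has a strictly negative eigenvalue in the y-direction.
def grad_f(z):
    x, y = z
    return np.array([x, -y])

rng = np.random.default_rng(1)
z = rng.normal(scale=1e-3, size=2)   # random initialization near the saddle
h = 0.5                              # step size, chosen only for the demo

for _ in range(80):
    z = z - h * grad_f(z)

print(z)   # the y-component grows, so the iterates escape the saddle
```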
no code implementations • 25 Nov 2018 • Lam Si Tung Ho, Hayden Schaeffer, Giang Tran, Rachel Ward
In this work, we study the problem of learning nonlinear functions from corrupted and dependent data.
no code implementations • 24 Nov 2018 • Linan Zhang, Hayden Schaeffer
In this work, we show that the post-activation ResNet is related to an optimal control problem with differential inclusions, and provide continuous-time stability results for the differential inclusion associated with ResNet.
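For reference, in the post-activation form the nonlinearity is applied after the skip connection, which is what motivates the differential-inclusion viewpoint as the step size tends to zero. The sketch below illustrates such a block; the single-matrix residual branch, sizes, and step size are illustrative assumptions, not the paper's exact setting.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def post_activation_block(x, W, b, h=1.0):
    # Post-activation residual update: the ReLU is applied *after* the skip
    # connection, x_{k+1} = relu(x_k + h * (W x_k + b)).
    return relu(x + h * (W @ x + b))

# Illustrative forward pass through a few stacked blocks.
rng = np.random.default_rng(0)
d, L = 5, 4
x = rng.normal(size=d)
for _ in range(L):
    W = rng.normal(scale=0.1, size=(d, d))
    b = rng.normal(scale=0.1, size=d)
    x = post_activation_block(x, W, b)
print(x)
```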
1 code implementation • 16 May 2018 • Linan Zhang, Hayden Schaeffer
In this work, we provide some theoretical results on the behavior and convergence of the algorithm proposed in [6].
Optimization and Control • Information Theory
no code implementations • Physical Review E 96, 2017 • Hayden Schaeffer, Scott G. McCalla
Model selection and parameter estimation are important for the effective integration of experimental data, scientific theory, and precise simulations.