no code implementations • ICLR 2019 • Qianxiao Li, Bo Lin, Weiqing Ren

The committor function is a central object of study in understanding transitions between metastable states in complex systems.

1 code implementation • 5 May 2024 • Sohei Arisaka, Qianxiao Li

Scientific computing is an essential tool for scientific discovery and engineering design, and its computational cost is always a main concern in practice.

no code implementations • 4 May 2024 • Fusheng Liu, Qianxiao Li

The State Space Model (SSM) is a foundation model in time series analysis and has recently been shown to be an alternative to transformers in sequence modeling.

1 code implementation • 31 Mar 2024 • Zhuotong Chen, Zihu Wang, Yifan Yang, Qianxiao Li, Zheng Zhang

This approach reduces the computational cost to that of using just the P controller, instead of the full PID control.
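For context on the cost comparison above, here is a hedged sketch (not the paper's method) of a discrete PID step, showing why the integral and derivative terms add per-step state and computation that a P-only controller avoids. All gains and values are illustrative assumptions.

```python
def pid_step(error, state, kp=1.0, ki=0.1, kd=0.5, dt=0.01):
    """One step of a discrete PID controller.

    `state` carries the running integral and the previous error; the
    proportional (P) term alone needs neither, which is why a P-only
    controller is cheaper per step.
    """
    integral, prev_error = state
    integral += error * dt                      # I term: accumulate error
    derivative = (error - prev_error) / dt      # D term: finite difference
    u = kp * error + ki * integral + kd * derivative
    return u, (integral, error)

# One illustrative update with a current error of 0.5 and previous error 0.4.
u, state = pid_step(0.5, (0.0, 0.4))
```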

1 code implementation • 22 Feb 2024 • Aiqing Zhu, Qianxiao Li

Benefiting from the robust density approximation, our method exhibits superior accuracy compared to baseline methods in learning the fully unknown drift and diffusion functions and computing the invariant distribution from trajectory data.
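As a minimal stand-in for learning drift functions from trajectory data (an illustration only, not the paper's density-based method), one can simulate a linear-drift SDE with Euler–Maruyama and recover the drift coefficient by least-squares regression of increments on the state. All parameters below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate trajectories of dX = -theta * X dt + sigma dW (Euler-Maruyama),
# then estimate theta by regressing the increments dX on the state X,
# since E[dX | X] = -theta * X * dt for this linear drift.
theta, sigma, dt, n_steps, n_paths = 1.5, 0.3, 0.01, 1000, 50
x = np.full(n_paths, 1.0)
xs, dxs = [], []
for _ in range(n_steps):
    dw = rng.normal(0.0, np.sqrt(dt), n_paths)
    dx = -theta * x * dt + sigma * dw
    xs.append(x.copy())
    dxs.append(dx)
    x = x + dx

X = np.concatenate(xs)
dX = np.concatenate(dxs)
slope = (X @ dX) / (X @ X)   # least-squares slope of dX on X
theta_hat = -slope / dt      # estimate of the drift coefficient
```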

1 code implementation • 17 Jan 2024 • Jiaxi Zhao, Qianxiao Li

We study the problem of distribution shift generally arising in machine-learning augmented hybrid simulation, where parts of simulation algorithms are replaced by data-driven surrogates.

1 code implementation • 24 Nov 2023 • Shida Wang, Qianxiao Li

In this paper, we investigate the long-term memory learning capabilities of state-space models (SSMs) from the perspective of parameterization.

no code implementations • 16 Nov 2023 • Zhuotong Chen, Qianxiao Li, Zheng Zhang

Moreover, we design a surrogate retention system based on existing literature on evolutionary population dynamics to approximate the dynamics of distribution shifts on active user counts, from which the objective of achieving asymptotically fair participation is formulated as an optimal control problem, and the control variables are considered as the model parameters.

no code implementations • 12 Sep 2023 • Jingpu Cheng, Qianxiao Li, Ting Lin, Zuowei Shen

We investigate the expressive power of deep residual neural networks idealized as continuous dynamical systems through control theory.

1 code implementation • 8 Aug 2023 • Xiaoli Chen, Beatrice W. Soh, Zi-En Ooi, Eleonore Vissol-Gaudin, Haijun Yu, Kostya S. Novoselov, Kedar Hippalgaonkar, Qianxiao Li

Specifically, we learn three interpretable thermodynamic coordinates and build a dynamical landscape of polymer stretching, including the identification of stable and transition states and the control of the stretching rate.

1 code implementation • 30 May 2023 • Shida Wang, Zhong Li, Qianxiao Li

We prove an inverse approximation theorem for the approximation of nonlinear sequence-to-sequence relationships using recurrent neural networks (RNNs).
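For readers unfamiliar with the object being approximated, here is a hedged sketch of the functional form a tanh RNN with linear readout computes on a sequence (illustrative only; not the paper's construction, and the weights here are arbitrary).

```python
import numpy as np

def rnn_seq2seq(inputs, W_h, W_x, W_o):
    """Map an input sequence to an output sequence with a tanh RNN:
    h_t = tanh(W_h h_{t-1} + W_x x_t), y_t = W_o h_t."""
    h = np.zeros(W_h.shape[0])
    outputs = []
    for x in inputs:
        h = np.tanh(W_h @ h + W_x @ x)  # evolve the hidden state
        outputs.append(W_o @ h)         # linear readout at each step
    return np.array(outputs)

# Random weights and a length-6 input sequence, purely for illustration.
rng = np.random.default_rng(0)
W_h = rng.normal(size=(4, 4)) * 0.5
W_x = rng.normal(size=(4, 2))
W_o = rng.normal(size=(1, 4))
ys = rnn_seq2seq(rng.normal(size=(6, 2)), W_h, W_x, W_o)
```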

no code implementations • 29 May 2023 • Haotian Jiang, Qianxiao Li

The Transformer architecture is widely applied in sequence modeling applications, yet the theoretical understanding of its working principles remains limited.

no code implementations • 29 May 2023 • Haotian Jiang, Qianxiao Li

We present a theoretical analysis of the approximation properties of convolutional architectures when applied to the modeling of temporal sequences.

no code implementations • 27 Feb 2023 • Haotian Jiang, Qianxiao Li, Zhong Li, Shida Wang

We survey current developments in the approximation theory of sequence modelling in machine learning.

no code implementations • 25 Nov 2022 • Ting Lin, Zuowei Shen, Qianxiao Li

We study the approximation of shift-invariant or equivariant functions by deep fully convolutional networks from the dynamical systems perspective.

no code implementations • 22 Nov 2022 • Danimir T. Doncevic, Alexander Mitsos, Yue Guo, Qianxiao Li, Felix Dietrich, Manuel Dahmen, Ioannis G. Kevrekidis

Meta-learning of numerical algorithms for a given task consists of the data-driven identification and adaptation of an algorithmic structure and the associated hyperparameters.

1 code implementation • 26 Aug 2022 • Alexander E. Siemenn, Zekun Ren, Qianxiao Li, Tonio Buonassisi

Needle-in-a-Haystack problems exist across a wide range of applications including rare disease prediction, ecological resource management, fraud detection, and material property optimization.

no code implementations • 18 Aug 2022 • Qianxiao Li, Ting Lin, Zuowei Shen

We study the approximation of functions which are invariant with respect to certain permutations of the input indices using flow maps of dynamical systems.

1 code implementation • 26 Jun 2022 • Zhuotong Chen, Qianxiao Li, Zheng Zhang

While numerous attack and defense techniques have been developed, this work investigates the robustness issue from a new angle: can we design a self-healing neural network that can automatically detect and fix the vulnerability issue by itself?

no code implementations • 17 Jun 2022 • Sohei Arisaka, Qianxiao Li

Iterative methods are ubiquitous in large-scale scientific computing applications, and a number of approaches based on meta-learning have been recently proposed to accelerate them.

no code implementations • 14 Jun 2022 • Siyu Isaac Parker Tian, Zekun Ren, Selvaraj Venkataraj, Yuanhang Cheng, Daniil Bash, Felipe Oviedo, J. Senthilnath, Vijila Chellappan, Yee-Fun Lim, Armin G. Aberle, Benjamin P MacLeod, Fraser G. L. Parlane, Curtis P. Berlinguette, Qianxiao Li, Tonio Buonassisi, Zhe Liu

Transfer learning is increasingly becoming an important tool for handling the data scarcity often encountered in machine learning.

no code implementations • 22 Feb 2022 • Fusheng Liu, Haizhao Yang, Soufiane Hayou, Qianxiao Li

Optimization and generalization are two essential aspects of statistical machine learning.

no code implementations • 22 Oct 2021 • Bo Lin, Qianxiao Li, Weiqing Ren

The potential component of the decomposition gives the generalized potential.

no code implementations • 29 Sep 2021 • Fusheng Liu, Haizhao Yang, Qianxiao Li

Through our approach, we show that, with a proper initialization, gradient flow converges following a short path with an explicit length estimate.

no code implementations • ICLR 2022 • Zhong Li, Haotian Jiang, Qianxiao Li

Our results provide the theoretical understanding of approximation properties of the recurrent encoder-decoder architecture, which characterises, in the considered setting, the types of temporal relationships that can be efficiently learned.

no code implementations • ICLR 2022 • Yingtian Zou, Fusheng Liu, Qianxiao Li

In this paper, we study the effect of the adaptation learning rate in meta-learning with mixed linear regression.

no code implementations • 29 Sep 2021 • Sohei Arisaka, Qianxiao Li

In science and engineering applications, it is often required to solve similar computational problems repeatedly.

no code implementations • 20 Jul 2021 • Haotian Jiang, Zhong Li, Qianxiao Li

We study the approximation properties of convolutional architectures applied to time series modelling, which can be formulated mathematically as a functional approximation problem.
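The basic building block in such convolutional sequence models is the causal 1-D convolution, where the output at time t depends only on present and past inputs. A minimal sketch (illustrative; not the paper's formulation):

```python
import numpy as np

def causal_conv(x, kernel):
    """Causal 1-D convolution: y[t] = sum_i kernel[i] * x[t - i],
    with zero padding on the left so no future inputs are used."""
    k = len(kernel)
    padded = np.concatenate([np.zeros(k - 1), x])
    return np.array([padded[t:t + k] @ kernel[::-1] for t in range(len(x))])

y = causal_conv(np.array([1.0, 2.0, 3.0]), np.array([0.5, 0.25]))
```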

1 code implementation • CVPR 2021 • Nanyang Ye, Jingxuan Tang, Huayu Deng, Xiao-Yun Zhou, Qianxiao Li, Zhenguo Li, Guang-Zhong Yang, Zhanxing Zhu

To the best of our knowledge, this is one of the first works to adopt a differentiable environment-splitting method to enable stable predictions across environments without environment index information, achieving state-of-the-art performance on datasets with strong spurious correlations, such as Colored MNIST.

2 code implementations • 4 May 2021 • Yue Guo, Felix Dietrich, Tom Bertalan, Danimir T. Doncevic, Manuel Dahmen, Ioannis G. Kevrekidis, Qianxiao Li

As a case study, we develop a machine learning approach that automatically learns effective solvers for initial value problems in the form of ordinary differential equations (ODEs), based on the Runge-Kutta (RK) integrator architecture.
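The Runge–Kutta integrator architecture referenced above builds on the classical RK schemes; as a baseline illustration (not the learned solver itself), here is a plain fourth-order RK step applied to an ODE initial value problem.

```python
import math

def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h * k1 / 2)
    k3 = f(t + h / 2, y + h * k2 / 2)
    k4 = f(t + h, y + h * k3)
    return y + (h / 6) * (k1 + 2 * k2 + 2 * k3 + k4)

# Integrate y' = -y, y(0) = 1 on [0, 1]; the exact solution is exp(-t).
y, t, h = 1.0, 0.0, 0.1
for _ in range(10):
    y = rk4_step(lambda t, y: -y, t, y, h)
    t += h
```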

no code implementations • 19 Mar 2021 • Tian Huang, Siong Thye Goh, Sabrish Gopalakrishnan, Tao Luo, Qianxiao Li, Hoong Chuin Lau

In this way, we are able to capture the common structure of the instances and their interactions with the solver, and produce good choices of penalty parameters with fewer calls to the QUBO solver.

1 code implementation • ICLR 2021 • Zhuotong Chen, Qianxiao Li, Zheng Zhang

We connect the robustness of neural networks with optimal control using the geometrical information of underlying data to design the control objective.

no code implementations • 15 Dec 2020 • Nanyang Ye, Qianxiao Li, Xiao-Yun Zhou, Zhanxing Zhu

However, conducting adversarial training brings much computational overhead compared with standard training.

no code implementations • 13 Dec 2020 • Bo Lin, Qianxiao Li, Weiqing Ren

The quasipotential is a natural generalization of the concept of energy functions to non-equilibrium systems.

no code implementations • 22 Oct 2020 • Shen Ren, Qianxiao Li, Liye Zhang, Zheng Qin, Bo Yang

The future of Mobility-as-a-Service (MaaS) should embrace an integrated system of ride-hailing, street-hailing and ride-sharing with optimised intelligent vehicle routing in response to a real-time, stochastic demand pattern.

no code implementations • ICLR 2021 • Zhong Li, Jiequn Han, Weinan E, Qianxiao Li

We study the approximation properties and optimization dynamics of recurrent neural networks (RNNs) when applied to learn input-output relationships in temporal data.

1 code implementation • 6 Sep 2020 • Haijun Yu, Xinyuan Tian, Weinan E, Qianxiao Li

We further apply this method to study Rayleigh–Bénard convection and learn Lorenz-like low-dimensional autonomous reduced-order models that capture both qualitative and quantitative properties of the underlying dynamics.

1 code implementation • 15 May 2020 • Zekun Ren, Siyu Isaac Parker Tian, Juhwan Noh, Felipe Oviedo, Guangzong Xing, Jiali Li, Qiaohao Liang, Ruiming Zhu, Armin G. Aberle, Shijing Sun, Xiaonan Wang, Yi Liu, Qianxiao Li, Senthilnath Jayavelu, Kedar Hippalgaonkar, Yousung Jung, Tonio Buonassisi

Realizing general inverse design could greatly accelerate the discovery of new materials with user-defined properties.

no code implementations • 18 Apr 2020 • Yongqiang Cai, Qianxiao Li, Zuowei Shen

We present the viewpoint that optimization problems encountered in machine learning can often be interpreted as minimizing a convex functional over a function space, but with a non-convex constraint set introduced by model parameterization.

no code implementations • 12 Feb 2020 • Chi Zhang, Yong Sheng Soh, Ling Feng, Tianyi Zhou, Qianxiao Li

While current machine learning models have impressive performance over a wide range of applications, their large size and complexity render them unsuitable for tasks such as remote monitoring on edge devices with limited storage and computational power.

1 code implementation • 31 Jan 2020 • Zekun Ren, Felipe Oviedo, Maung Thway, Siyu I. P. Tian, Yue Wang, Hansong Xue, Jose Dario Perea, Mariya Layurova, Thomas Heumueller, Erik Birgersson, Armin G. Aberle, Christoph J. Brabec, Rolf Stangl, Qianxiao Li, Shijing Sun, Fen Lin, Ian Marius Peters & Tonio Buonassisi

Process optimization of photovoltaic devices is a time-intensive, trial-and-error endeavor, which lacks full transparency of the underlying physics and relies on user-imposed constraints that may or may not lead to a global optimum.

no code implementations • 22 Dec 2019 • Qianxiao Li, Ting Lin, Zuowei Shen

We build on the dynamical systems approach to deep learning, where deep residual networks are idealized as continuous-time dynamical systems, from the approximation perspective.

no code implementations • 14 Jun 2019 • Chi Zhang, Qianxiao Li

Moreover, we show that more local updating can reduce the overall communication, even for an infinite number of steps where each node is free to update its local model to near-optimality before exchanging information.

no code implementations • 14 Jun 2019 • Qianxiao Li, Bo Lin, Weiqing Ren

The committor function is a central object of study in understanding transitions between metastable states in complex systems.

no code implementations • ICLR 2019 • Yongqiang Cai, Qianxiao Li, Zuowei Shen

Despite its empirical success, the theoretical underpinnings of the stability, convergence and acceleration properties of batch normalization (BN) remain elusive.

no code implementations • 5 Nov 2018 • Qianxiao Li, Cheng Tai, Weinan E

We develop the mathematical foundations of the stochastic modified equations (SME) framework for analyzing the dynamics of stochastic gradient algorithms, where the latter is approximated by a class of stochastic differential equations with small noise parameters.

no code implementations • ICLR 2019 • Yongqiang Cai, Qianxiao Li, Zuowei Shen

Despite its empirical success and recent theoretical progress, there generally lacks a quantitative analysis of the effect of batch normalization (BN) on the convergence and stability of gradient descent.

no code implementations • 3 Jul 2018 • Weinan E, Jiequn Han, Qianxiao Li

This paper introduces the mathematical formulation of the population risk minimization problem in deep learning as a mean-field optimal control problem.

1 code implementation • ICML 2018 • Qianxiao Li, Shuji Hao

Deep learning is formulated as a discrete-time optimal control problem.

2 code implementations • 26 Oct 2017 • Qianxiao Li, Long Chen, Cheng Tai, Weinan E

The continuous dynamical system approach to deep learning is explored in order to devise alternative frameworks for training algorithms.

no code implementations • ICML 2017 • Qianxiao Li, Cheng Tai, Weinan E

We develop the method of stochastic modified equations (SME), in which stochastic gradient algorithms are approximated in the weak sense by continuous-time stochastic differential equations.
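One consequence of the weak SDE approximation is a quantitative prediction about SGD's stationary fluctuations. The sketch below (an illustration under assumed parameters, not the paper's derivation) runs SGD on f(x) = x²/2 with gradient noise of standard deviation s and checks the stationary variance η·s²/2 predicted by the approximating SDE dX = −X dt + √η · s dW.

```python
import numpy as np

rng = np.random.default_rng(1)

# SGD with learning rate eta on f(x) = x^2/2, gradients corrupted by
# Gaussian noise of std s.  The SME framework approximates the iterates
# in the weak sense by dX = -X dt + sqrt(eta) * s dW, whose stationary
# variance is eta * s^2 / 2.  We check that empirically over many
# independent runs after the transient has decayed.
eta, s = 0.1, 0.5
x = np.full(20000, 2.0)           # 20000 independent SGD runs
for _ in range(500):
    x = x - eta * (x + s * rng.normal(size=x.shape))

predicted_var = eta * s**2 / 2    # 0.0125 from the approximating SDE
empirical_var = x.var()
```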
