no code implementations • CSRNLP (LREC) 2022 • Lu Lu, Jinghang Gu, Chu-Ren Huang
Inclusion, as one of the foundations of the diversity, equity, and inclusion (DEI) initiative, concerns the degree to which one is treated as an ingroup member in a workplace.
no code implementations • 8 Mar 2023 • Zhongyi Jiang, Min Zhu, Dongzhuo Li, Qiuzi Li, Yanhua O. Yuan, Lu Lu
Here, we develop a Fourier-enhanced multiple-input neural operator (Fourier-MIONet) to learn the solution operator of the problem of multiphase flow in porous media.
1 code implementation • 13 Dec 2022 • Min Zhu, Handi Zhang, Anran Jiao, George Em Karniadakis, Lu Lu
Deep neural operators can learn nonlinear mappings between infinite-dimensional function spaces via deep neural networks.
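The snippet above refers to the branch–trunk construction used by DeepONet-style operators: a branch network encodes the input function sampled at sensor points, a trunk network encodes the query location, and the operator output is their inner product. A minimal numpy sketch, with random weights standing in for trained networks (all names and sizes here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, W1, b1, W2, b2):
    """Tiny two-layer perceptron with tanh activation."""
    return np.tanh(x @ W1 + b1) @ W2 + b2

# Branch net encodes the input function u sampled at m sensor points;
# trunk net encodes the query location y; p is the shared latent width.
m, p = 20, 10
Wb1, bb1 = rng.normal(size=(m, 32)), np.zeros(32)
Wb2, bb2 = rng.normal(size=(32, p)), np.zeros(p)
Wt1, bt1 = rng.normal(size=(1, 32)), np.zeros(32)
Wt2, bt2 = rng.normal(size=(32, p)), np.zeros(p)

def deeponet(u_sensors, y):
    """G(u)(y) ~ <branch(u), trunk(y)> -- the DeepONet dot-product form."""
    b = mlp(u_sensors[None, :], Wb1, bb1, Wb2, bb2)  # (1, p)
    t = mlp(np.atleast_2d(y), Wt1, bt1, Wt2, bt2)    # (n, p)
    return t @ b.T                                   # (n, 1)

u = np.sin(np.linspace(0, np.pi, m))  # input function at sensor points
ys = np.linspace(0, 1, 5)[:, None]    # query locations
out = deeponet(u, ys)
print(out.shape)  # (5, 1): one scalar output per query point
```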
2 code implementations • 21 Jul 2022 • Chenxi Wu, Min Zhu, Qinyang Tan, Yadhu Kartha, Lu Lu
Hence, we have considered a total of 10 different sampling methods, including six non-adaptive uniform sampling methods, uniform sampling with resampling, two proposed adaptive sampling methods, and an existing adaptive sampling method.
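Residual-based adaptive schemes in this line of work draw new collocation points with probability that grows with the PDE residual. A toy numpy sketch of the idea — the residual function is a synthetic stand-in for the network's residual, and `k`, `c` are illustrative hyperparameters, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def residual(x):
    # Stand-in for |PDE residual| of the current network; peaked near x = 0.5.
    return np.exp(-100 * (x - 0.5) ** 2) + 0.01

# Dense pool of candidate points in the domain [0, 1].
pool = rng.uniform(0.0, 1.0, size=10_000)
r = residual(pool)

# Sampling density proportional to r^k / mean(r^k) + c (illustrative form).
k, c = 2.0, 1.0
weights = r**k / np.mean(r**k) + c
prob = weights / weights.sum()

# Draw a new set of collocation points biased toward high-residual regions.
new_points = rng.choice(pool, size=1_000, replace=False, p=prob)
frac_near_peak = np.mean(np.abs(new_points - 0.5) < 0.1)
print(frac_near_peak > np.mean(np.abs(pool - 0.5) < 0.1))  # True
```

Uniform resampling is the special case where every candidate gets equal probability; the residual weighting concentrates training effort where the PDE is currently solved worst.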
1 code implementation • 9 May 2022 • Xin-Yang Liu, Hao Sun, Min Zhu, Lu Lu, Jian-Xun Wang
A more promising way is to leverage our prior physics knowledge in scientific deep learning models, known as physics-informed deep learning (PiDL).
no code implementations • 19 Apr 2022 • Christopher Hazard, Akshay Bhagat, Balarama Raju Buddharaju, Zhongtao Liu, Yunming Shao, Lu Lu, Sammy Omari, Henggang Cui
Trajectory prediction is an important task in autonomous driving.
2 code implementations • 14 Apr 2022 • Lu Lu, Raphael Pestourie, Steven G. Johnson, Giuseppe Romano
Deep neural operators can learn operators mapping between infinite-dimensional function spaces via deep neural networks and have become an emerging paradigm of scientific machine learning.
no code implementations • 19 Mar 2022 • Lu Lu, Yi Yu, Rodrigo C. de Lamare, Xiaomin Yang
We propose a novel M-estimate conjugate gradient (CG) algorithm, termed Tukey's biweight M-estimate CG (TbMCG), for system identification in impulsive noise environments.
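Tukey's biweight M-estimate is robust to impulsive noise because the weight it assigns to an error sample tapers smoothly to zero beyond a threshold, so outliers stop driving the update. A small numpy sketch of the weight function (the constant `c` is an illustrative tuning parameter):

```python
import numpy as np

def tukey_biweight_weight(e, c=2.0):
    """Weight for error sample e: smooth taper, exactly zero for |e| > c."""
    e = np.asarray(e, dtype=float)
    w = (1.0 - (e / c) ** 2) ** 2
    return np.where(np.abs(e) <= c, w, 0.0)

errors = np.array([0.0, 0.5, 1.9, 10.0])  # last entry mimics an impulse
print(tukey_biweight_weight(errors))
# Small errors keep weight near 1; the impulsive sample is rejected outright.
```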
2 code implementations • 12 Feb 2022 • Pengzhan Jin, Shuai Meng, Lu Lu
Based on our theory and a low-rank approximation, we propose a novel neural operator, MIONet, to learn multiple-input operators.
2 code implementations • 3 Feb 2022 • Mitchell Daneker, Zhen Zhang, George Em Karniadakis, Lu Lu
The dynamics of systems biological processes are usually modeled by a system of ordinary differential equations (ODEs) with many unknown parameters that need to be inferred from noisy and sparse measurements.
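The paper addresses this with systems-biology-informed deep learning; purely as an illustration of the underlying inference problem, here is a much simpler least-squares fit of one unknown rate constant in dy/dt = -k·y from noisy samples (a toy grid search, not the paper's method):

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground truth: y(t) = y0 * exp(-k t) with unknown decay rate k.
k_true, y0 = 1.5, 2.0
t = np.linspace(0.0, 3.0, 30)
y_obs = y0 * np.exp(-k_true * t) + 0.02 * rng.normal(size=t.size)  # noisy data

# Infer k by minimizing the squared data misfit over a parameter grid.
k_grid = np.linspace(0.1, 5.0, 491)
losses = [np.mean((y0 * np.exp(-k * t) - y_obs) ** 2) for k in k_grid]
k_hat = k_grid[int(np.argmin(losses))]
print(abs(k_hat - k_true) < 0.1)  # True
```

Real systems-biology models replace this one-parameter grid with many coupled ODEs and many unknowns, which is what makes gradient-based, network-parameterized inference necessary.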
2 code implementations • 1 Nov 2021 • Jeremy Yu, Lu Lu, Xuhui Meng, George Em Karniadakis
We tested gPINNs extensively and demonstrated the effectiveness of gPINNs in both forward and inverse PDE problems.
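gPINN augments the standard PINN loss with a penalty on the gradient of the PDE residual, since the residual of the exact solution vanishes identically and hence so do its derivatives. A toy numpy illustration on a 1-D ODE, with finite differences standing in for automatic differentiation and an illustrative weight `w`:

```python
import numpy as np

# Toy ODE: u'(x) = cos(x); exact solution u(x) = sin(x).
x = np.linspace(0.0, 2 * np.pi, 200)
dx = x[1] - x[0]

def gpinn_loss(u_vals, w=0.1):
    """Residual loss plus gPINN's extra penalty on the residual's gradient.
    Derivatives via finite differences here; a PINN would use autodiff."""
    r = np.gradient(u_vals, dx) - np.cos(x)  # PDE residual r(x)
    dr = np.gradient(r, dx)                  # gradient of the residual
    return np.mean(r**2) + w * np.mean(dr**2)

exact = np.sin(x)
perturbed = np.sin(x) + 0.1 * np.sin(5 * x)  # candidate with oscillatory error
print(gpinn_loss(exact) < gpinn_loss(perturbed))  # True
```

The gradient term penalizes high-frequency residual wiggles especially hard, which is one intuition for why it helps in both forward and inverse problems.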
no code implementations • 19 Oct 2021 • Lu Lu, Kai-Li Yin, Rodrigo C. de Lamare, Zongsheng Zheng, Yi Yu, Xiaomin Yang, Badong Chen
Most of the literature focuses on the development of linear active noise control (ANC) techniques.
no code implementations • 1 Oct 2021 • Lu Lu, Kai-Li Yin, Rodrigo C. de Lamare, Zongsheng Zheng, Yi Yu, Xiaomin Yang, Badong Chen
Active noise control (ANC) is an effective way for reducing the noise level in electroacoustic or electromechanical systems.
no code implementations • 14 Aug 2021 • Gang Guo, Yi Yu, Rodrigo C. de Lamare, Zongsheng Zheng, Lu Lu, Qiangming Cai
In addition, an adaptive approach for the choice of the thresholding parameter in the proximal step is also proposed based on the minimization of the mean square deviation.
no code implementations • 6 Apr 2021 • Anran Jiao, Haiyang He, Rishikesh Ranade, Jay Pathak, Lu Lu
Discovering governing equations of a physical system, represented by partial differential equations (PDEs), from data is a central challenge in a variety of areas of science and engineering.
3 code implementations • 9 Feb 2021 • Lu Lu, Raphael Pestourie, Wenjie Yao, Zhicheng Wang, Francesc Verdugo, Steven G. Johnson
We achieve the same objective as conventional PDE-constrained optimization methods based on adjoint methods and numerical PDE solvers, but find that the design obtained from hPINN is often simpler and smoother for problems whose solution is not unique.
no code implementations • 24 Dec 2020 • Lu Lu, ZhenZhen Lou
The classical problem of characterizing the graphs with bounded eigenvalues may date back to the work of Smith in 1970.
Combinatorics 05C50
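The classical bound in question is on the adjacency spectral radius: for example, cycles attain the bound of 2 exactly, while paths stay strictly below it. A small numpy check of these two facts (illustrative, not taken from the paper):

```python
import numpy as np

def adjacency_cycle(n):
    """Adjacency matrix of the cycle C_n."""
    A = np.zeros((n, n))
    idx = np.arange(n)
    A[idx, (idx + 1) % n] = 1
    A[(idx + 1) % n, idx] = 1
    return A

def adjacency_path(n):
    """Adjacency matrix of the path P_n."""
    A = np.zeros((n, n))
    idx = np.arange(n - 1)
    A[idx, idx + 1] = 1
    A[idx + 1, idx] = 1
    return A

def spectral_radius(A):
    return float(np.max(np.abs(np.linalg.eigvalsh(A))))

# Cycles sit exactly at the classical bound of 2; paths stay strictly below.
print(round(spectral_radius(adjacency_cycle(8)), 6))  # 2.0
print(spectral_radius(adjacency_path(8)) < 2.0)       # True
```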
no code implementations • 23 Dec 2020 • Chensen Lin, Zhen Li, Lu Lu, Shengze Cai, Martin Maxey, George Em Karniadakis
Simulating and predicting multiscale problems that couple multiple physics and dynamics across many orders of spatiotemporal scales is a great challenge that has not been investigated systematically by deep neural networks (DNNs).
Computational Physics
no code implementations • 26 Aug 2020 • Ping Liu, Yuewei Lin, Zibo Meng, Lu Lu, Weihong Deng, Joey Tianyi Zhou, Yi Yang
In this paper, we propose a simple yet effective approach, named Point Adversarial Self Mining (PASM), to improve the recognition accuracy in facial expression recognition.
1 code implementation • 2 Dec 2019 • Yuyao Chen, Lu Lu, George Em Karniadakis, Luca Dal Negro
In this paper, we employ the emerging paradigm of physics-informed neural networks (PINNs) for the solution of representative inverse scattering problems in photonic metamaterials and nano-optics technologies.
Computational Physics Optics
1 code implementation • 27 Oct 2019 • Andreagiovanni Reina, Viktor Ioannou, Junjin Chen, Lu Lu, Charles Kent, James A. R. Marshall
Will the Third World War be fought by robots?
2 code implementations • 8 Oct 2019 • Lu Lu, Pengzhan Jin, George Em Karniadakis
This universal approximation theorem is suggestive of the potential application of neural networks in learning nonlinear operators from data.
5 code implementations • 10 Jul 2019 • Lu Lu, Xuhui Meng, Zhiping Mao, George E. Karniadakis
We also present a Python library for PINNs, DeepXDE, which is designed to serve both as an education tool for the classroom and as a research tool for solving problems in computational science and engineering.
1 code implementation • 9 Jul 2019 • Qilei Li, Zhen Li, Lu Lu, Gwanggil Jeon, Kai Liu, Xiaomin Yang
The rapid development of deep learning (DL) has driven single image super-resolution (SR) into a new era.
Ranked #14 on Image Super-Resolution on Manga109 - 4x upscaling
1 code implementation • 27 May 2019 • Pengzhan Jin, Lu Lu, Yifa Tang, George Em Karniadakis
To derive a meaningful bound, we study the generalization error of neural networks for classification problems in terms of data distribution and neural network smoothness.
no code implementations • 15 Mar 2019 • Lu Lu, Yeonjong Shin, Yanhui Su, George Em Karniadakis
Numerical examples are provided to demonstrate the effectiveness of the new initialization procedure.
no code implementations • 16 Jan 2019 • Christine M. Anderson-Cook, Kary L. Myers, Lu Lu, Michael L. Fugate, Kevin R. Quinlan, Norma Pawley
It also describes a post-competition analysis that enables robust and efficient assessment of the strengths and weaknesses of solutions from different competitors, as well as greater understanding of the regions of the input space that are well-solved.
no code implementations • 21 Sep 2018 • Dongkun Zhang, Lu Lu, Ling Guo, George Em Karniadakis
Here, we propose a new method with the objective of endowing the DNN with uncertainty quantification for both sources of uncertainty, i.e., the parametric uncertainty and the approximation uncertainty.
1 code implementation • ICLR 2019 • Lu Lu, Yanhui Su, George Em Karniadakis
However, here we show that even for such activation, deep and narrow neural networks (NNs) will converge to erroneous mean or median states of the target function depending on the loss with high probability.
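The mean/median dichotomy in that statement traces back to the loss: the constant minimizing squared error is the mean of the targets, while the constant minimizing absolute error is the median. A quick numpy check of this standard fact:

```python
import numpy as np

data = np.array([0.0, 0.0, 0.0, 10.0])  # skewed targets: mean 2.5, median 0.0

cands = np.linspace(-1.0, 11.0, 2401)
mse = [np.mean((c - data) ** 2) for c in cands]
mae = [np.mean(np.abs(c - data)) for c in cands]

best_mse = cands[int(np.argmin(mse))]  # constant minimizing squared error
best_mae = cands[int(np.argmin(mae))]  # constant minimizing absolute error
print(best_mse, best_mae)  # about 2.5 (the mean) and 0.0 (the median)
```

So a network that collapses to a near-constant function will sit at the mean under an L2 loss and at the median under an L1 loss, which is the erroneous state the paper describes.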