Search Results for author: Ziyue Liu

Found 9 papers, 4 papers with code

Separable Operator Networks

1 code implementation · 15 Jul 2024 · Xinling Yu, Sean Hooten, Ziyue Liu, Yequan Zhao, Marco Fiorentino, Thomas Van Vaerenbergh, Zheng Zhang

We provide a universal approximation theorem for SepONet proving the existence of a separable approximation to any nonlinear continuous operator.

Benchmarking · Operator learning
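The SepONet abstract above asserts that any nonlinear continuous operator admits a separable approximation. A minimal numeric sketch of the separability idea (not the paper's network architecture): a bivariate function is approximated as a sum of products of univariate factors, `u(x, y) ≈ Σ_r f_r(x) g_r(y)`, recovered here from grid samples via truncated SVD. The function `separable_approx` and the target `exp(-x*y)` are illustrative choices, not from the paper.

```python
import numpy as np

# Sample a bivariate target u(x, y) = exp(-x * y) on a grid.
x = np.linspace(0.0, 1.0, 64)
y = np.linspace(0.0, 1.0, 64)
U = np.exp(-np.outer(x, y))

def separable_approx(U, rank):
    """Best rank-`rank` separable approximation of the sampled grid:
    a sum of `rank` products of one univariate factor per dimension."""
    P, s, Qt = np.linalg.svd(U, full_matrices=False)
    return (P[:, :rank] * s[:rank]) @ Qt[:rank, :]

err = np.linalg.norm(U - separable_approx(U, 3)) / np.linalg.norm(U)
print(f"relative error at rank 3: {err:.2e}")
```

Because the factors along each dimension are evaluated independently, a separable representation avoids the combinatorial cost of sampling the full product grid, which is the efficiency argument behind separable operator networks.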

Leveraging Latent Diffusion Models for Training-Free In-Distribution Data Augmentation for Surface Defect Detection

1 code implementation · 4 Jul 2024 · Federico Girella, Ziyue Liu, Franco Fummi, Francesco Setti, Marco Cristani, Luigi Capogrosso

Usually, defect detection classifiers are trained on ground-truth data formed by normal samples (negative data) and samples with defects (positive data), where the latter are consistently fewer than normal samples.

Data Augmentation · Defect Detection · +1

CoMERA: Computing- and Memory-Efficient Training via Rank-Adaptive Tensor Optimization

1 code implementation · 23 May 2024 · Zi Yang, Ziyue Liu, Samridhi Choudhary, Xinfeng Xie, Cao Gao, Siegfried Kunzmann, Zheng Zhang

Our method also shows a $\sim 2\times$ speedup over standard pre-training on a BERT-like code-generation LLM, while achieving a $4.23\times$ compression ratio in pre-training.

Code Generation · Recommendation Systems
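The CoMERA abstract reports a compression ratio from rank-adaptive tensor optimization. A simplified stand-in for the rank-adaptive idea (the paper uses tensor decompositions during training; this sketch uses a plain post-hoc low-rank matrix factorization, with the rank chosen from the singular-value spectrum — `adaptive_rank_factor` and the `energy` threshold are illustrative, not the paper's method):

```python
import numpy as np

rng = np.random.default_rng(0)
# A weight matrix with hidden low-rank structure (rank <= 64).
W = rng.standard_normal((512, 64)) @ rng.standard_normal((64, 512))

def adaptive_rank_factor(W, energy=0.99):
    """Factor W ~= A @ B, keeping the smallest rank that captures
    `energy` of the squared singular-value mass."""
    P, s, Qt = np.linalg.svd(W, full_matrices=False)
    cum = np.cumsum(s**2) / np.sum(s**2)
    r = int(np.searchsorted(cum, energy)) + 1
    return P[:, :r] * s[:r], Qt[:r, :]

A, B = adaptive_rank_factor(W)
ratio = W.size / (A.size + B.size)
print(f"kept rank {A.shape[1]}, compression ratio {ratio:.1f}x")
```

Choosing the rank from the spectrum rather than fixing it in advance is what makes the compression "rank-adaptive": well-structured layers compress aggressively while full-rank layers keep their capacity.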

Real-Time FJ/MAC PDE Solvers via Tensorized, Back-Propagation-Free Optical PINN Training

no code implementations · 31 Dec 2023 · Yequan Zhao, Xian Xiao, Xinling Yu, Ziyue Liu, Zhixiong Chen, Geza Kurczveil, Raymond G. Beausoleil, Zheng Zhang

Despite the ultra-high speed of optical neural networks, training a PINN on an optical chip is hard due to (1) the large size of photonic devices, and (2) the lack of scalable optical memory devices to store the intermediate results of back-propagation (BP).

Tensor-Compressed Back-Propagation-Free Training for (Physics-Informed) Neural Networks

no code implementations · 18 Aug 2023 · Yequan Zhao, Xinling Yu, Zhixiong Chen, Ziyue Liu, Sijia Liu, Zheng Zhang

Backward propagation (BP) is widely used to compute the gradients in neural network training.
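The abstract above motivates avoiding BP entirely. One family of BP-free approaches estimates gradients from forward evaluations only (zeroth-order optimization); the sketch below shows that flavor on a toy objective. The estimator, step size, and objective are illustrative assumptions, not the paper's tensor-compressed method.

```python
import numpy as np

rng = np.random.default_rng(1)

def loss(theta):
    """Toy objective with minimum at theta = 1."""
    return float(np.sum((theta - 1.0) ** 2))

def zo_gradient(loss, theta, mu=1e-3, samples=64):
    """Gaussian-smoothing gradient estimate from forward evals only:
    no reverse pass, hence no intermediate activations to store."""
    g = np.zeros_like(theta)
    for _ in range(samples):
        u = rng.standard_normal(theta.shape)
        g += (loss(theta + mu * u) - loss(theta)) / mu * u
    return g / samples

theta = np.zeros(8)
for _ in range(200):
    theta -= 0.05 * zo_gradient(loss, theta)
print(f"final loss: {loss(theta):.4f}")
```

The appeal for optical or other analog hardware is that only the forward pass must run on-device; the trade-off is gradient-estimate variance, which the tensor-compressed parameterization in the paper is designed to tame by shrinking the number of trainable parameters.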

DeepOHeat: Operator Learning-based Ultra-fast Thermal Simulation in 3D-IC Design

1 code implementation · 25 Feb 2023 · Ziyue Liu, Yixing Li, Jing Hu, Xinling Yu, Shinyu Shiau, Xin Ai, Zhiyu Zeng, Zheng Zhang

In this paper, for the first time, we propose DeepOHeat, a physics-aware operator learning framework to predict the temperature field of a family of heat equations with multiple parametric or non-parametric design configurations.

Operator learning

PIFON-EPT: MR-Based Electrical Property Tomography Using Physics-Informed Fourier Networks

no code implementations · 23 Feb 2023 · Xinling Yu, José E. C. Serrallés, Ilias I. Giannakopoulos, Ziyue Liu, Luca Daniel, Riccardo Lattanzi, Zheng Zhang

PIFON-EPT is the first method that can simultaneously reconstruct EP and transmit fields from incomplete noisy MR measurements, providing new opportunities for EPT research.

Denoising

MR-Based Electrical Property Reconstruction Using Physics-Informed Neural Networks

no code implementations · 23 Oct 2022 · Xinling Yu, José E. C. Serrallés, Ilias I. Giannakopoulos, Ziyue Liu, Luca Daniel, Riccardo Lattanzi, Zheng Zhang

Electrical properties (EP), namely permittivity and electric conductivity, dictate the interactions between electromagnetic waves and biological tissue.

TT-PINN: A Tensor-Compressed Neural PDE Solver for Edge Computing

no code implementations · 4 Jul 2022 · Ziyue Liu, Xinling Yu, Zheng Zhang

Physics-informed neural networks (PINNs) have been increasingly employed due to their capability of modeling complex physics systems.

Edge-computing
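The TT-PINN abstract describes PINNs modeling physics systems; the core mechanism is a loss that penalizes the PDE residual of the network output. A minimal sketch of such a loss for the toy ODE u'(x) = -u(x), u(0) = 1, using a tiny random network and a finite-difference residual (real PINNs use automatic differentiation, and the paper's contribution is a tensor-train compression of the network, not shown here):

```python
import numpy as np

rng = np.random.default_rng(2)
# A tiny one-hidden-layer network u(x) = W2 @ tanh(W1 @ x + b1).
W1 = rng.standard_normal((16, 1)) * 0.5
b1 = np.zeros((16, 1))
W2 = rng.standard_normal((1, 16)) * 0.5

def u(x, params):
    W1, b1, W2 = params
    return W2 @ np.tanh(W1 @ x + b1)

def pinn_loss(params, x, h=1e-4):
    """Physics-informed loss: PDE residual + boundary-condition term."""
    du = (u(x + h, params) - u(x - h, params)) / (2 * h)  # central diff u'(x)
    residual = du + u(x, params)             # enforce u' = -u
    bc = u(np.zeros((1, 1)), params) - 1.0   # enforce u(0) = 1
    return float(np.mean(residual**2) + bc.ravel()[0] ** 2)

x = np.linspace(0.0, 1.0, 32).reshape(1, -1)  # collocation points
print(f"loss at random init: {pinn_loss((W1, b1, W2), x):.3f}")
```

Minimizing this loss over the network parameters drives the output toward the ODE solution without any labeled data; replacing the dense weights with tensor-train factors is what makes the approach fit on edge devices.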
