Search Results for author: Xitong Zhang

Found 12 papers, 3 papers with code

MagNet: A Neural Network for Directed Graphs

1 code implementation • NeurIPS 2021 • Xitong Zhang, Yixuan He, Nathan Brugnone, Michael Perlmutter, Matthew Hirn

In this paper, we propose MagNet, a spectral GNN for directed graphs based on a complex Hermitian matrix known as the magnetic Laplacian.

Link Prediction • Node Classification
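To make the core object concrete, here is a minimal NumPy sketch of the magnetic Laplacian construction that MagNet builds on. The charge parameter q, the symmetrization, and the normalization follow the standard definition; this is an illustration, not the paper's implementation.

```python
import numpy as np

def magnetic_laplacian(A, q=0.25, normalized=True):
    """Magnetic Laplacian of a directed graph with charge parameter q."""
    A = np.asarray(A, dtype=float)
    A_s = 0.5 * (A + A.T)                      # symmetrized adjacency
    theta = 2.0 * np.pi * q * (A - A.T)        # antisymmetric phase matrix
    H = A_s * np.exp(1j * theta)               # complex Hermitian adjacency
    d = A_s.sum(axis=1)
    if not normalized:
        return np.diag(d) - H                  # unnormalized L = D_s - H
    d_inv_sqrt = np.where(d > 0, 1.0 / np.sqrt(d), 0.0)
    return np.eye(len(A)) - (d_inv_sqrt[:, None] * H) * d_inv_sqrt[None, :]

# Hermitian check on a directed 3-cycle: 0 -> 1 -> 2 -> 0
A = np.array([[0, 1, 0], [0, 0, 1], [1, 0, 0]])
L = magnetic_laplacian(A)
assert np.allclose(L, L.conj().T)              # Hermitian, so eigenvalues are real
```

Because L is Hermitian, its eigenvalues are real and a spectral GNN can be defined on it just as on the usual graph Laplacian, while the complex phases retain the edge directions.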

Connect the Dots: In Situ 4D Seismic Monitoring of CO2 Storage with Spatio-temporal CNNs

no code implementations • 25 May 2021 • Shihang Feng, Xitong Zhang, Brendt Wohlberg, Neill Symons, Youzuo Lin

Via both numerical and expert evaluation, we conclude that our models can produce high-quality 2D/3D seismic imaging data at a reasonable cost, offering the possibility of real-time monitoring or even near-future forecasting of the CO$_2$ storage reservoir.

Optical Flow Estimation • Seismic Imaging

Making Invisible Visible: Data-Driven Seismic Inversion with Spatio-temporally Constrained Data Augmentation

no code implementations • 22 Jun 2021 • Yuxin Yang, Xitong Zhang, Qiang Guan, Youzuo Lin

To validate the effectiveness of our data augmentation techniques, we apply them to solve a subsurface seismic full-waveform inversion using simulated CO$_2$ leakage data.

Data Augmentation • Seismic Imaging +1

Unsupervised Learning of Full-Waveform Inversion: Connecting CNN and Partial Differential Equation in a Loop

no code implementations • ICLR 2022 • Peng Jin, Xitong Zhang, Yinpeng Chen, Sharon Xiaolei Huang, Zicheng Liu, Youzuo Lin

In particular, we use finite difference to approximate the forward modeling of PDE as a differentiable operator (from velocity map to seismic data) and model its inversion by CNN (from seismic data to velocity map).

Geophysics
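The loop described above pairs a differentiable PDE forward operator with a CNN inverse, so a data-domain reconstruction loss trains the network without velocity-map labels. Below is a toy PyTorch sketch of that training loop; InversionCNN and forward_modeling are hypothetical stand-ins, and the real forward operator is a finite-difference wave-equation solver, not the placeholder used here.

```python
import torch
import torch.nn as nn

class InversionCNN(nn.Module):
    """Hypothetical CNN mapping seismic data to a velocity map."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 3, padding=1),
        )
    def forward(self, seismic):
        return self.net(seismic)

def forward_modeling(velocity):
    """Placeholder for the differentiable finite-difference wave
    propagator (velocity map -> seismic data). Any differentiable
    torch op keeps the loop end-to-end trainable; the paper's
    operator time-steps the acoustic wave equation instead."""
    return torch.tanh(velocity)  # stand-in, NOT the paper's solver

model = InversionCNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
seismic = torch.randn(4, 1, 32, 32)            # observed data (toy)

for _ in range(10):
    v_pred = model(seismic)                    # CNN inversion
    seismic_rec = forward_modeling(v_pred)     # PDE forward closes the loop
    loss = nn.functional.mse_loss(seismic_rec, seismic)  # data-domain loss
    opt.zero_grad(); loss.backward(); opt.step()
```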

OpenFWI: Large-Scale Multi-Structural Benchmark Datasets for Seismic Full Waveform Inversion

2 code implementations • 4 Nov 2021 • Chengyuan Deng, Shihang Feng, Hanchen Wang, Xitong Zhang, Peng Jin, Yinan Feng, Qili Zeng, Yinpeng Chen, Youzuo Lin

The recent success of data-driven FWI methods results in a rapidly increasing demand for open datasets to serve the geophysics community.

Benchmarking • Geophysics +1

Extremely Weak Supervision Inversion of Multi-physical Properties

no code implementations • 3 Feb 2022 • Shihang Feng, Peng Jin, Xitong Zhang, Yinpeng Chen, David Alumbaugh, Michael Commer, Youzuo Lin

We explore a multi-physics inversion problem from two distinct measurements (seismic and EM data) to three geophysical properties (velocity, conductivity, and CO$_2$ saturation).

Geophysics
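As an illustration of the two-measurements-to-three-properties mapping, here is a hypothetical multi-branch network sketch. The module names, fusion scheme, and shapes are assumptions for illustration, not the paper's architecture.

```python
import torch
import torch.nn as nn

class MultiPhysicsInversion(nn.Module):
    """Hypothetical sketch: encode seismic and EM data separately,
    fuse the features, then decode velocity, conductivity, and
    CO2 saturation from the shared representation."""
    def __init__(self, hidden=32):
        super().__init__()
        self.seismic_enc = nn.Sequential(nn.Conv2d(1, hidden, 3, padding=1), nn.ReLU())
        self.em_enc = nn.Sequential(nn.Conv2d(1, hidden, 3, padding=1), nn.ReLU())
        self.heads = nn.ModuleDict({
            name: nn.Conv2d(2 * hidden, 1, 3, padding=1)
            for name in ("velocity", "conductivity", "co2_saturation")
        })
    def forward(self, seismic, em):
        z = torch.cat([self.seismic_enc(seismic), self.em_enc(em)], dim=1)
        return {name: head(z) for name, head in self.heads.items()}

model = MultiPhysicsInversion()
out = model(torch.randn(2, 1, 32, 32), torch.randn(2, 1, 32, 32))
```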

PyTorch Geometric Signed Directed: A Software Package on Graph Neural Networks for Signed and Directed Graphs

1 code implementation • 22 Feb 2022 • Yixuan He, Xitong Zhang, JunJie Huang, Benedek Rozemberczki, Mihai Cucuringu, Gesine Reinert

While many networks are signed or directed, or both, there is a lack of unified software packages on graph neural networks (GNNs) specially designed for signed and directed networks.

Time Series • Time Series Analysis

Implicit regularization in Heavy-ball momentum accelerated stochastic gradient descent

no code implementations • 2 Feb 2023 • Avrajit Ghosh, He Lyu, Xitong Zhang, Rongrong Wang

It is well known that the finite step-size ($h$) in Gradient Descent (GD) implicitly regularizes solutions to flatter minima.
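For reference, the heavy-ball update named in the title, in a minimal NumPy sketch; the step size $h$ and momentum coefficient follow the standard formulation, and the toy objective at the end is an assumption for demonstration only.

```python
import numpy as np

def heavy_ball_gd(grad, x0, h=0.1, beta=0.9, steps=100):
    """Heavy-ball momentum gradient descent:
        v_{t+1} = beta * v_t - h * grad(x_t)
        x_{t+1} = x_t + v_{t+1}
    The finite step size h (together with beta) sets the strength
    of the implicit regularization this line of work analyzes."""
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)
    for _ in range(steps):
        v = beta * v - h * grad(x)
        x = x + v
    return x

# Toy quadratic f(x) = 0.5 * ||x||^2, whose gradient is x
x_final = heavy_ball_gd(lambda x: x, x0=[3.0, -2.0])
```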

Unlocking Tuning-free Generalization: Minimizing the PAC-Bayes Bound with Trainable Priors

no code implementations • 30 May 2023 • Xitong Zhang, Avrajit Ghosh, Guangliang Liu, Rongrong Wang

It is widely recognized that the generalization ability of neural networks can be greatly enhanced through carefully designing the training procedure.
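For context, one common (McAllester-style) form of the PAC-Bayes bound such methods minimize: for a prior $\pi$ over hypotheses fixed before seeing the $n$ training samples and any posterior $\rho$, with probability at least $1-\delta$, $\mathbb{E}_{h\sim\rho}[L(h)] \le \mathbb{E}_{h\sim\rho}[\hat{L}(h)] + \sqrt{(\mathrm{KL}(\rho\,\|\,\pi) + \ln(n/\delta))/(2(n-1))}$, where $L$ and $\hat{L}$ are the population and empirical losses. This classical statement assumes a fixed prior; the paper's tuning-free approach instead makes the prior trainable, which the bound above does not capture.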

Can Directed Graph Neural Networks be Adversarially Robust?

no code implementations • 3 Jun 2023 • Zhichao Hou, Xitong Zhang, Wei Wang, Charu C. Aggarwal, Xiaorui Liu

This work presents the first investigation into the robustness of GNNs in the context of directed graphs, aiming to harness the profound trust implications offered by directed graphs to bolster the robustness and resilience of GNNs.

PAC-tuning: Fine-tuning Pretrained Language Models with PAC-driven Perturbed Gradient Descent

no code implementations • 26 Oct 2023 • Guangliang Liu, Zhiyu Xue, Xitong Zhang, Kristen Marie Johnson, Rongrong Wang

Fine-tuning pretrained language models (PLMs) for downstream tasks is a large-scale optimization problem, in which the choice of the training algorithm critically determines how well the trained model can generalize to unseen test data, especially in the context of few-shot learning.

Data Augmentation • Few-Shot Learning
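To make "perturbed gradient descent" concrete, here is a generic single-step sketch in PyTorch. The fixed noise scale sigma and the loss_fn signature are stand-ins; PAC-tuning itself learns the perturbation variance from a PAC-Bayes bound during fine-tuning rather than fixing it.

```python
import torch

def perturbed_gradient_step(model, loss_fn, batch, lr=1e-3, sigma=0.01):
    """One step of generic perturbed gradient descent: evaluate the
    gradient at Gaussian-perturbed weights, then apply that gradient
    to the clean (unperturbed) weights."""
    originals = [p.detach().clone() for p in model.parameters()]
    with torch.no_grad():
        for p in model.parameters():
            p.add_(sigma * torch.randn_like(p))      # perturb weights
    model.zero_grad()
    loss_fn(model, batch).backward()                 # gradient at perturbed point
    with torch.no_grad():
        for p, p0 in zip(model.parameters(), originals):
            p.copy_(p0 - lr * p.grad)                # update clean weights
    return model
```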
