Search Results for author: Ramzi Idoughi

Found 5 papers, 3 papers with code

Stereo Event-based Particle Tracking Velocimetry for 3D Fluid Flow Reconstruction

1 code implementation • ECCV 2020 • Yuanhao Wang, Ramzi Idoughi, Wolfgang Heidrich

Existing Particle Imaging Velocimetry techniques require the use of high-speed cameras to reconstruct time-resolved fluid flows.

Stereo Matching

Neural Adaptive Scene Tracing

no code implementations • 28 Feb 2022 • Rui Li, Darius Rückert, Yuanhao Wang, Ramzi Idoughi, Wolfgang Heidrich

Neural rendering with implicit neural networks has recently emerged as an attractive proposition for scene reconstruction, achieving excellent quality albeit at high computational cost.

Neural Rendering

NeAT: Neural Adaptive Tomography

1 code implementation • 4 Feb 2022 • Darius Rückert, Yuanhao Wang, Rui Li, Ramzi Idoughi, Wolfgang Heidrich

Through a combination of neural features with an adaptive explicit representation, we achieve reconstruction times far superior to existing neural inverse rendering methods.

3D Reconstruction • Inverse Rendering • +2

IntraTomo: Self-Supervised Learning-Based Tomography via Sinogram Synthesis and Prediction

1 code implementation • ICCV 2021 • Guangming Zang, Ramzi Idoughi, Rui Li, Peter Wonka, Wolfgang Heidrich

After being estimated by the sinogram prediction module, the density field is consistently refined in the second module using local and non-local geometrical priors.

Computed Tomography (CT) • Low-Dose X-Ray CT Reconstruction • +3

Super-Resolution and Sparse View CT Reconstruction

no code implementations • ECCV 2018 • Guangming Zang, Mohamed Aly, Ramzi Idoughi, Peter Wonka, Wolfgang Heidrich

As a second, smaller contribution, we also show that when using such a proximal reconstruction framework, it is beneficial to employ the Simultaneous Algebraic Reconstruction Technique (SART) instead of the commonly used Conjugate Gradient (CG) method in the solution of the data term proximal operator.

Computed Tomography (CT) • Super-Resolution
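The SART scheme favored in the abstract above (Andersen & Kak, 1984) replaces the conjugate-gradient solve of the data-term proximal operator with a relaxed, row- and column-normalized back-projection of the residual. A minimal pure-Python sketch of the classic SART iteration follows; it is illustrative only (the relaxation factor `lam` and the zero-sum guards are assumptions, and the paper's actual proximal solver is more involved):

```python
def sart(A, b, iters=100, lam=1.0):
    """Simultaneous Algebraic Reconstruction Technique (SART).

    Solves A x ~= b with the relaxed update
        x <- x + lam * C^-1 A^T R^-1 (b - A x),
    where R and C are diagonal matrices holding the row and
    column sums of the (nonnegative) system matrix A.
    """
    m, n = len(A), len(A[0])
    row_sums = [sum(row) or 1.0 for row in A]           # diagonal of R
    col_sums = [sum(A[i][j] for i in range(m)) or 1.0   # diagonal of C
                for j in range(n)]
    x = [0.0] * n
    for _ in range(iters):
        # Residual of each measurement (ray), normalized by its row sum.
        resid = [(b[i] - sum(A[i][j] * x[j] for j in range(n))) / row_sums[i]
                 for i in range(m)]
        # Back-project the residuals and normalize by the column sums.
        x = [x[j] + lam * sum(A[i][j] * resid[i] for i in range(m)) / col_sums[j]
             for j in range(n)]
    return x
```

For a consistent system with nonnegative entries, the iteration converges for relaxation factors in (0, 2); in a full proximal framework the same update would simply be warm-started from the previous outer iterate.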
