Search Results for author: Praneeth Chakravarthula

Found 9 papers, 3 papers with code

Spatially Varying Nanophotonic Neural Networks

no code implementations • 7 Aug 2023 Kaixuan Wei, Xiao Li, Johannes Froech, Praneeth Chakravarthula, James Whitehead, Ethan Tseng, Arka Majumdar, Felix Heide

The explosive growth in the computation and energy cost of artificial intelligence has spurred strong interest in new computing modalities as potential alternatives to conventional electronic processors.


Thin On-Sensor Nanophotonic Array Cameras

no code implementations • 5 Aug 2023 Praneeth Chakravarthula, Jipeng Sun, Xiao Li, Chenyang Lei, Gene Chou, Mario Bijelic, Johannes Froesch, Arka Majumdar, Felix Heide

The optical array is embedded on a metasurface that, at 700 nm height, is flat and sits on the sensor cover glass at a 2.5 mm focal distance from the sensor.

Stochastic Light Field Holography

no code implementations • 12 Jul 2023 Florian Schiffers, Praneeth Chakravarthula, Nathan Matsuda, Grace Kuo, Ethan Tseng, Douglas Lanman, Felix Heide, Oliver Cossairt

The Visual Turing Test is the ultimate goal for evaluating the realism of holographic displays.

Seeing With Sound: Long-range Acoustic Beamforming for Multimodal Scene Understanding

no code implementations • CVPR 2023 Praneeth Chakravarthula, Jim Aldon D’Souza, Ethan Tseng, Joe Bartusek, Felix Heide

We validate the benefit of adding sound detections to existing RGB cameras in challenging automotive scenarios, where camera-only approaches fail or do not deliver the ultra-fast rates of pressure sensors.

Autonomous Vehicles • Object Detection +2

OmniHorizon: In-the-Wild Outdoors Depth and Normal Estimation from Synthetic Omnidirectional Dataset

no code implementations • 9 Dec 2022 Jay Bhanushali, Praneeth Chakravarthula, Manivannan Muniyandi

Finally, we demonstrate in-the-wild depth and normal estimation on real-world images with UBotNet trained purely on our OmniHorizon dataset, showing the promise of the proposed dataset and network for scene understanding.

Autonomous Driving • Scene Understanding

ChromaCorrect: Prescription Correction in Virtual Reality Headsets through Perceptual Guidance

no code implementations • 8 Dec 2022 Ahmet Güzel, Jeanne Beyazian, Praneeth Chakravarthula, Kaan Akşit

In this work, we eliminate the need for prescription eyeglasses in Virtual Reality (VR) headsets by shifting the optical complexity entirely into software, and propose a prescription-aware rendering approach that provides sharper and more immersive VR imagery.

FoV-NeRF: Foveated Neural Radiance Fields for Virtual Reality

1 code implementation • 30 Mar 2021 Nianchen Deng, Zhenyi He, Jiannan Ye, Budmonde Duinkharjav, Praneeth Chakravarthula, Xubo Yang, Qi Sun

To tackle these problems toward six-degrees-of-freedom, egocentric, and stereo NeRF in VR, we present the first gaze-contingent 3D neural representation and view synthesis method.

Distributed Computing • Neural Rendering

DeepCGH: 3D computer-generated holography using deep learning

1 code implementation • 31 Aug 2020 M. Hossein Eybposh, Nicholas W. Caira, Mathew Atisa, Praneeth Chakravarthula, Nicolas C. Pégard

The goal of computer-generated holography (CGH) is to synthesize custom illumination patterns by modulating a coherent light beam.
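The DeepCGH snippet above frames CGH as an inverse problem: find a phase pattern that, when imposed on a coherent beam, produces a desired illumination pattern after propagation. As an illustration only, and not the paper's DeepCGH network, the sketch below implements the classical Gerchberg-Saxton iteration in NumPy under an assumed far-field (single-FFT) propagation model; the target image and parameter names are placeholders.

```python
import numpy as np

def gerchberg_saxton(target_amplitude, num_iters=50, seed=0):
    """Classical Gerchberg-Saxton phase retrieval (illustrative baseline,
    not the DeepCGH network): find a phase-only hologram whose far-field
    intensity approximates the target image under an FFT propagation model."""
    rng = np.random.default_rng(seed)
    # Start from a random phase on the hologram (SLM) plane.
    phase = rng.uniform(0.0, 2.0 * np.pi, target_amplitude.shape)
    for _ in range(num_iters):
        # Propagate a unit-amplitude, phase-only field to the image plane.
        field = np.exp(1j * phase)
        image_field = np.fft.fftshift(np.fft.fft2(field))
        # Enforce the target amplitude, keep the propagated phase.
        image_field = target_amplitude * np.exp(1j * np.angle(image_field))
        # Propagate back and enforce the phase-only constraint on the SLM plane.
        field = np.fft.ifft2(np.fft.ifftshift(image_field))
        phase = np.angle(field)
    return phase

# Illustrative usage: a simple bright-square target pattern.
target = np.zeros((256, 256))
target[96:160, 96:160] = 1.0
slm_phase = gerchberg_saxton(target)
```

DeepCGH itself replaces this kind of per-pattern iterative optimization with a fast learned model; the loop above is only meant to make the underlying synthesis problem concrete.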
