Search Results for author: Akshay Gadi Patil

Found 5 papers, 1 paper with code

LayoutGMN: Neural Graph Matching for Structural Layout Similarity

1 code implementation · CVPR 2021 · Akshay Gadi Patil, Manyi Li, Matthew Fisher, Manolis Savva, Hao Zhang

In particular, retrieval results from our network better match human judgements of structural layout similarity than both IoU-based measures and other baselines, including a state-of-the-art method based on graph neural networks and image convolution.

Tasks: Graph Matching, Metric Learning
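The IoU baseline mentioned in the abstract can be illustrated with a minimal sketch in plain Python. The helper names here are hypothetical, and real layout-IoU baselines typically rasterize and compare whole layouts rather than matching elements by index; this only shows the box-level computation:

```python
def iou(box_a, box_b):
    """Intersection-over-Union of two axis-aligned boxes (x0, y0, x1, y1)."""
    x0 = max(box_a[0], box_b[0])
    y0 = max(box_a[1], box_b[1])
    x1 = min(box_a[2], box_b[2])
    y1 = min(box_a[3], box_b[3])
    inter = max(0, x1 - x0) * max(0, y1 - y0)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def layout_iou(layout_a, layout_b):
    """Hypothetical stand-in: mean IoU over element boxes paired by index."""
    scores = [iou(a, b) for a, b in zip(layout_a, layout_b)]
    return sum(scores) / len(scores) if scores else 0.0
```

Such box-overlap scores ignore the relational structure between elements, which is the gap graph-matching approaches like LayoutGMN aim to close.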

DR-KFS: A Differentiable Visual Similarity Metric for 3D Shape Reconstruction

no code implementations · ECCV 2020 · Jiongchao Jin, Akshay Gadi Patil, Zhang Xiong, Hao Zhang

We introduce a differentiable visual similarity metric to train deep neural networks for 3D reconstruction, aimed at improving reconstruction quality.

Tasks: 3D Reconstruction, 3D Shape Reconstruction
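To illustrate what a differentiable shape-similarity term looks like in general, here is a minimal sketch of the standard symmetric Chamfer distance between point sets. This is a common training loss for 3D reconstruction, not the paper's DR-KFS metric:

```python
def chamfer_distance(pts_a, pts_b):
    """Symmetric Chamfer distance between two point sets (tuples of coords).

    For each point, find the squared distance to its nearest neighbor in the
    other set, average per direction, and sum both directions. Every step is
    differentiable almost everywhere, so it can serve as a training loss.
    """
    def sq_dist(p, q):
        return sum((pi - qi) ** 2 for pi, qi in zip(p, q))

    def one_way(src, dst):
        return sum(min(sq_dist(p, q) for q in dst) for p in src) / len(src)

    return one_way(pts_a, pts_b) + one_way(pts_b, pts_a)
```

DR-KFS argues that such standard metrics can correlate poorly with perceived visual quality, which motivates learning a differentiable metric instead.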

READ: Recursive Autoencoders for Document Layout Generation

no code implementations · 1 Sep 2019 · Akshay Gadi Patil, Omri Ben-Eliezer, Or Perel, Hadar Averbuch-Elor

Creating large varieties of plausible document layouts can be a tedious task, requiring numerous constraints to be satisfied, including local constraints relating different semantic elements and global constraints on overall appearance and spacing.

GRAINS: Generative Recursive Autoencoders for INdoor Scenes

no code implementations · 24 Jul 2018 · Manyi Li, Akshay Gadi Patil, Kai Xu, Siddhartha Chaudhuri, Owais Khan, Ariel Shamir, Changhe Tu, Baoquan Chen, Daniel Cohen-Or, Hao Zhang

We present a generative neural network that enables us to generate plausible 3D indoor scenes in large quantities and varieties, easily and highly efficiently.


Automatic Content-aware Non-Photorealistic Rendering of Images

no code implementations · 7 Apr 2016 · Akshay Gadi Patil, Shanmuganathan Raman

Non-photorealistic rendering techniques work on image features and often manipulate a set of characteristics such as edges and texture to achieve a desired depiction of the scene.
