Search Results for author: Akshay Kulkarni

Found 6 papers, 4 papers with code

Concurrent Subsidiary Supervision for Unsupervised Source-Free Domain Adaptation

1 code implementation • 27 Jul 2022 • Jogendra Nath Kundu, Suvaansh Bhambri, Akshay Kulkarni, Hiran Sarkar, Varun Jampani, R. Venkatesh Babu

The prime challenge in unsupervised domain adaptation (DA) is to mitigate the domain shift between the source and target domains.

Unsupervised Domain Adaptation

Balancing Discriminability and Transferability for Source-Free Domain Adaptation

no code implementations • 16 Jun 2022 • Jogendra Nath Kundu, Akshay Kulkarni, Suvaansh Bhambri, Deepesh Mehta, Shreyas Kulkarni, Varun Jampani, R. Venkatesh Babu

Conventional domain adaptation (DA) techniques aim to improve domain transferability by learning domain-invariant representations, while concurrently preserving the task-discriminability knowledge gathered from the labeled source data.

Domain Adaptation • Semantic Segmentation
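
A minimal, generic sketch of the trade-off the abstract above describes: a supervised task loss on labeled source data (discriminability) combined with a domain-alignment penalty between source and target features (transferability). This is not the paper's method; the model interface, the linear-kernel MMD penalty, and the trade-off weight are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def linear_mmd(source_feats: torch.Tensor, target_feats: torch.Tensor) -> torch.Tensor:
    """Linear-kernel MMD: squared distance between mean feature embeddings."""
    delta = source_feats.mean(dim=0) - target_feats.mean(dim=0)
    return (delta * delta).sum()

def da_loss(model, src_x, src_y, tgt_x, trade_off: float = 0.1) -> torch.Tensor:
    """Illustrative DA objective balancing discriminability and transferability.

    Assumes `model(x)` returns a (features, logits) pair; this interface is
    hypothetical, chosen only to keep the sketch self-contained.
    """
    src_feats, src_logits = model(src_x)
    tgt_feats, _ = model(tgt_x)
    task_loss = F.cross_entropy(src_logits, src_y)    # task discriminability on source labels
    align_loss = linear_mmd(src_feats, tgt_feats)     # pushes features toward domain invariance
    return task_loss + trade_off * align_loss
```

A larger `trade_off` favors domain-invariant (transferable) features at the risk of eroding source-task discriminability, which is exactly the tension the paper's title refers to.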

Amplitude Spectrum Transformation for Open Compound Domain Adaptive Semantic Segmentation

no code implementations • 9 Feb 2022 • Jogendra Nath Kundu, Akshay Kulkarni, Suvaansh Bhambri, Varun Jampani, R. Venkatesh Babu

However, we find that latent features derived from the Fourier-based amplitude spectrum of deep CNN features hold a more tractable mapping with domain discrimination.

Disentanglement • Domain Adaptation +1
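
A minimal sketch of the basic operation referenced in the abstract above: taking the Fourier-based amplitude spectrum of deep CNN feature maps. The use of a 2-D FFT over the spatial dimensions is an assumption for illustration; the paper's actual amplitude spectrum transformation is more involved.

```python
import torch

def amplitude_spectrum(feats: torch.Tensor) -> torch.Tensor:
    """Amplitude (magnitude) spectrum of CNN feature maps.

    feats: (batch, channels, height, width) activations from a CNN layer.
    Returns a real tensor of the same shape containing |FFT(feats)| per channel.
    """
    freq = torch.fft.fft2(feats, dim=(-2, -1))  # 2-D FFT over the spatial dimensions
    return freq.abs()                           # keep the amplitude, discard the phase

# Illustrative usage on random "features"
feats = torch.randn(8, 64, 32, 32)
amp = amplitude_spectrum(feats)
print(amp.shape)  # torch.Size([8, 64, 32, 32])
```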

Design and Development of Autonomous Delivery Robot

1 code implementation • 16 Mar 2021 • Aniket Gujarathi, Akshay Kulkarni, Unmesh Patil, Yogesh Phalak, Rajeshree Deotalu, Aman Jain, Navid Panchi, Ashwin Dhabale, Shital Chiddarwar

Autonomous robots are developed to be robust enough to work alongside humans and to carry out jobs efficiently.

Data Efficient Stagewise Knowledge Distillation

1 code implementation • 15 Nov 2019 • Akshay Kulkarni, Navid Panchi, Sharath Chandra Raparthy, Shital Chiddarwar

We show, across the tested tasks, significant performance gains even with a fraction of the data used in distillation, without compromising on the evaluation metric.

Knowledge Distillation • Model Compression +2
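
For context on the entry above, here is a minimal sketch of a standard temperature-scaled knowledge distillation loss (the generic Hinton-style formulation, not the paper's stagewise, data-efficient procedure); the temperature and mixing weight are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature: float = 4.0, alpha: float = 0.7) -> torch.Tensor:
    """Blend of soft-target KL (student mimics teacher) and hard-label cross-entropy."""
    t = temperature
    soft = F.kl_div(
        F.log_softmax(student_logits / t, dim=1),
        F.softmax(teacher_logits / t, dim=1),
        reduction="batchmean",
    ) * (t * t)                                    # rescale gradients per Hinton et al.
    hard = F.cross_entropy(student_logits, labels) # ground-truth supervision
    return alpha * soft + (1.0 - alpha) * hard
```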
