no code implementations • 9 Feb 2024 • Sachin Chhabra, Hemanth Venkateswara, Baoxin Li
In the absence of labeled target data, unsupervised domain adaptation approaches seek to align the marginal distributions of the source and target domains in order to train a classifier for the target.
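Marginal alignment is commonly quantified with a discrepancy statistic such as the maximum mean discrepancy (MMD). The sketch below is illustrative only, assuming an RBF kernel, and is not the specific alignment objective of this paper:

```python
import numpy as np

def rbf_kernel(x, y, gamma=1.0):
    """RBF kernel matrix between the rows of x and the rows of y."""
    sq_dists = np.sum(x**2, 1)[:, None] + np.sum(y**2, 1)[None, :] - 2 * x @ y.T
    return np.exp(-gamma * sq_dists)

def mmd2(source, target, gamma=1.0):
    """Squared MMD between source and target feature sets; larger
    values indicate a larger gap between the marginal distributions."""
    k_ss = rbf_kernel(source, source, gamma)
    k_tt = rbf_kernel(target, target, gamma)
    k_st = rbf_kernel(source, target, gamma)
    return k_ss.mean() + k_tt.mean() - 2 * k_st.mean()

# Synthetic check: a small domain shift yields a smaller MMD than a large one.
rng = np.random.default_rng(0)
src = rng.normal(0.0, 1.0, size=(200, 16))
tgt_near = rng.normal(0.1, 1.0, size=(200, 16))  # mild shift
tgt_far = rng.normal(2.0, 1.0, size=(200, 16))   # strong shift
assert mmd2(src, tgt_near) < mmd2(src, tgt_far)
```

In adversarial or deep-alignment methods this statistic (or a learned discriminator) is minimized over the feature extractor so that the classifier trained on source features transfers to the target.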
no code implementations • 3 Dec 2022 • Sandipan Choudhuri, Suli Adeniye, Arunabha Sen, Hemanth Venkateswara
The standard closed-set domain adaptation approaches seek to mitigate distribution discrepancies between two domains under the constraint that both domains share an identical label set.
1 code implementation • 27 Oct 2022 • Sachin Chhabra, Prabal Bijoy Dutta, Hemanth Venkateswara, Baoxin Li
Vision transformers require a huge amount of labeled data to outperform convolutional neural networks.
no code implementations • 17 Jul 2022 • Sandipan Choudhuri, Hemanth Venkateswara, Arunabha Sen
In contrast to the standard closed-set domain adaptation task, the partial domain adaptation setup caters to a more realistic scenario by relaxing the identical-label-set assumption.
no code implementations • 26 Jan 2022 • Aksheshkumar Ajaykumar Shah, Hemanth Venkateswara
Recently, Generative Adversarial Networks (GANs) have been applied to the problem of Cold-Start Recommendation, but the training performance of these models is hampered by the extreme sparsity in warm user purchase behavior.
no code implementations • 6 Jan 2021 • Sandipan Choudhuri, Riti Paul, Arunabha Sen, Baoxin Li, Hemanth Venkateswara
Driven by the motivation that image styles are private to each domain, in this work we develop a method that identifies outlier classes exclusively from image content information and trains a label classifier exclusively on the class content of source images.
1 code implementation • ECCV 2020 • Maunil R Vyas, Hemanth Venkateswara, Sethuraman Panchanathan
The SR-loss guides the LsrGAN to generate visual features that mirror the semantic relationships between seen and unseen classes.
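One plausible form of such a semantic-relationship constraint is to keep the pairwise similarities of generated visual features within a tolerance of the corresponding class-embedding similarities. This is a hedged sketch of that idea using an epsilon-insensitive penalty, not the exact LsrGAN loss:

```python
import numpy as np

def cosine_sim(a, b):
    """Cosine-similarity matrix between the rows of a and the rows of b."""
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    return a @ b.T

def semantic_regularized_loss(gen_feats, ref_feats, sem_gen, sem_ref, eps=0.1):
    """Penalize visual similarities that drift more than eps away from the
    corresponding semantic similarities (epsilon-insensitive squared hinge)."""
    vis = cosine_sim(gen_feats, ref_feats)
    sem = cosine_sim(sem_gen, sem_ref)
    return float(np.mean(np.maximum(np.abs(vis - sem) - eps, 0.0) ** 2))
```

When the generated features reproduce the semantic similarity structure the loss is zero; orthogonal visual features for semantically identical classes are penalized.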
no code implementations • 1 Jul 2019 • Piyush Papreja, Hemanth Venkateswara, Sethuraman Panchanathan
Playlists have become a significant part of our listening experience because of digital cloud-based services such as Spotify, Pandora, and Apple Music.
no code implementations • 23 Jun 2017 • Hemanth Venkateswara, Prasanth Lade, Binbin Lin, Jieping Ye, Sethuraman Panchanathan
Estimating the MI for a subset of features is often intractable.
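Because the mutual information of a feature subset with the label is intractable to estimate directly, a standard workaround (illustrative here, not necessarily the estimator used in this paper) is a greedy mRMR-style selection that approximates the subset MI with pairwise relevance-minus-redundancy terms:

```python
import numpy as np

def mutual_info(x, y):
    """MI (in nats) between two discrete 1-D arrays via their joint histogram."""
    joint = np.histogram2d(x, y, bins=(len(set(x)), len(set(y))))[0]
    pxy = joint / joint.sum()
    px = pxy.sum(1, keepdims=True)
    py = pxy.sum(0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def greedy_mrmr(X, y, k):
    """Greedily pick k features: maximize MI with the label minus the
    average MI with the features already selected."""
    selected = []
    for _ in range(k):
        best, best_score = None, -np.inf
        for j in range(X.shape[1]):
            if j in selected:
                continue
            relevance = mutual_info(X[:, j], y)
            redundancy = (np.mean([mutual_info(X[:, j], X[:, s]) for s in selected])
                          if selected else 0.0)
            if relevance - redundancy > best_score:
                best, best_score = j, relevance - redundancy
        selected.append(best)
    return selected
```

On synthetic data where feature 0 copies the label, the greedy pass ranks it first, while a redundant copy of feature 0 is discounted in later rounds.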
no code implementations • 23 Jun 2017 • Hemanth Venkateswara, Vineeth N. Balasubramanian, Prasanth Lade, Sethuraman Panchanathan
The emergence of depth imaging technologies like the Microsoft Kinect has renewed interest in computational methods for gesture classification based on videos.
no code implementations • 23 Jun 2017 • Hemanth Venkateswara, Shayok Chakraborty, Troy McDaniel, Sethuraman Panchanathan
To determine the parameters in the NET model (and in other unsupervised domain adaptation models), we introduce a validation procedure by sampling source data points that are similar in distribution to the target data.
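A distance-based variant of that sampling idea can be sketched as follows. This is a hypothetical simplification (source points closest to the target centroid form a proxy validation set); the paper's actual selection criterion may differ:

```python
import numpy as np

def proxy_validation_split(source_x, source_y, target_x, frac=0.2):
    """Hold out the source points closest to the target centroid as a
    proxy validation set for tuning unsupervised adaptation models."""
    centroid = target_x.mean(axis=0)
    dists = np.linalg.norm(source_x - centroid, axis=1)
    n_val = max(1, int(frac * len(source_x)))
    mask = np.zeros(len(source_x), dtype=bool)
    mask[np.argsort(dists)[:n_val]] = True  # n_val nearest source points
    return (source_x[~mask], source_y[~mask]), (source_x[mask], source_y[mask])
```

Model selection then proceeds by evaluating candidate hyperparameters on this target-like source subset, for which labels are available.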
no code implementations • 22 Jun 2017 • Hemanth Venkateswara, Prasanth Lade, Jieping Ye, Sethuraman Panchanathan
Popular domain adaptation (DA) techniques learn a classifier for the target domain by sampling relevant data points from the source and combining them with the target data.
7 code implementations • CVPR 2017 • Hemanth Venkateswara, Jose Eusebio, Shayok Chakraborty, Sethuraman Panchanathan
Domain adaptation or transfer learning algorithms address this challenge by leveraging labeled data in a different, but related source domain, to develop a model for the target domain.
no code implementations • 22 Jun 2017 • Hemanth Venkateswara, Shayok Chakraborty, Sethuraman Panchanathan
The problem of domain adaptation (DA) deals with adapting classifier models trained on one data distribution to different data distributions.
1 code implementation • 2 May 2017 • Ragav Venkatesan, Hemanth Venkateswara, Sethuraman Panchanathan, Baoxin Li
Using an implementation based on deep neural networks, we demonstrate that phantom sampling largely avoids catastrophic forgetting.