1 code implementation • 22 May 2023 • Kowshik Thopalli, Rakshith Subramanyam, Pavan Turaga, Jayaraman J. Thiagarajan
We argue that the augmentations used by existing methods are insufficient to handle large distribution shifts, and hence propose a new approach, SiSTA, which first fine-tunes a generative model from the source domain using a single-shot target example, and then employs novel sampling strategies to curate synthetic target data.
1 code implementation • 29 Oct 2022 • Rakshith Subramanyam, Kowshik Thopalli, Spring Berman, Pavan Turaga, Jayaraman J. Thiagarajan
The problem of adapting models from a source domain using data from any target domain of interest has gained prominence, owing to the brittle generalization of deep neural networks.
1 code implementation • 9 Jul 2022 • Kowshik Thopalli, Pavan Turaga, Jayaraman J. Thiagarajan
With a minimal overhead of storing the subspace basis set for the source data, CATTAn enables unsupervised alignment between source and target data during adaptation.
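The key idea, storing only a subspace basis of the source features and projecting target features onto it, can be illustrated with a minimal NumPy sketch. This is not CATTAn's actual implementation; the function name, the feature matrices, and the choice of SVD for the basis are illustrative assumptions.

```python
import numpy as np

def subspace_basis(features, k):
    # Top-k right singular vectors of the centered features
    # span a k-dimensional subspace of the feature space.
    _, _, vt = np.linalg.svd(features - features.mean(0), full_matrices=False)
    return vt[:k].T  # shape (d, k), columns are orthonormal

# Hypothetical source/target feature matrices (n_samples x d).
rng = np.random.default_rng(0)
src = rng.normal(size=(100, 16))
tgt = rng.normal(size=(80, 16))

basis = subspace_basis(src, k=8)     # the only source artifact kept around
tgt_aligned = tgt @ basis @ basis.T  # project target onto the source subspace
```

The storage cost is just the d-by-k basis matrix, which is the "minimal overhead" the abstract refers to; no raw source data needs to be retained at adaptation time.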
no code implementations • 5 Jan 2022 • Kowshik Thopalli, Jayaraman J. Thiagarajan, Rushil Anirudh, Pavan K. Turaga
In contrast to existing adversarial training-based DA methods, our approach isolates feature learning and distribution alignment steps, and utilizes a primary-auxiliary optimization strategy to effectively balance the objectives of domain invariance and model fidelity.
1 code implementation • 17 Dec 2021 • Kowshik Thopalli, Sameeksha Katoch, Pavan Turaga, Jayaraman J. Thiagarajan
In this paper, we focus on the challenging problem of multi-source zero-shot DG (MDG), where labeled training data from multiple source domains is available but there is no access to data from the target domain.
Ranked #19 for Domain Generalization on the TerraIncognita benchmark.
no code implementations • 21 Mar 2021 • Zachary Seymour, Kowshik Thopalli, Niluthpol Mithun, Han-Pang Chiu, Supun Samarasekera, Rakesh Kumar
Visual navigation for autonomous agents is a core task in the fields of computer vision and robotics.
no code implementations • 10 Feb 2020 • Bindya Venkatesh, Jayaraman J. Thiagarajan, Kowshik Thopalli, Prasanna Sattigeri
The hypothesis that over-parameterized networks contain sub-network initializations ("lottery tickets") which, when trained in isolation, produce highly generalizable models has led to crucial insights into network initialization and has enabled efficient inference.
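A common way to identify such a sub-network is one-shot magnitude pruning: keep the largest-magnitude weights and rewind the survivors to their saved initialization. The sketch below illustrates that generic procedure, not this paper's method; the function name and the toy weight matrix are assumptions.

```python
import numpy as np

def ticket_mask(weights, sparsity=0.75):
    # Keep the largest-magnitude weights; zero out the given fraction.
    thresh = np.quantile(np.abs(weights), sparsity)
    return (np.abs(weights) >= thresh).astype(weights.dtype)

rng = np.random.default_rng(1)
w_init = rng.normal(size=(4, 4))  # saved initialization of a layer
mask = ticket_mask(w_init)        # binary mask selecting the sub-network
w_ticket = w_init * mask          # "winning ticket": surviving weights
                                  # at their original initialization
```

Training only the masked weights from `w_init` (rather than from a fresh random draw) is the step that distinguishes the lottery-ticket procedure from ordinary pruning.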
no code implementations • 24 Nov 2019 • Sameeksha Katoch, Kowshik Thopalli, Jayaraman J. Thiagarajan, Pavan Turaga, Andreas Spanias
Exploiting known semantic relationships between fine-grained tasks is critical to the success of recent model agnostic approaches.
no code implementations • 11 Jun 2019 • Kowshik Thopalli, Jayaraman J. Thiagarajan, Rushil Anirudh, Pavan Turaga
This paper presents a hybrid approach, in which we assume simplified data geometry in the form of subspaces and treat alignment as an auxiliary task to the primary task of maximizing performance on the source domain.
no code implementations • 11 Nov 2018 • Kowshik Thopalli, Rushil Anirudh, Jayaraman J. Thiagarajan, Pavan Turaga
We present a novel unsupervised domain adaptation (DA) method for cross-domain visual recognition.
1 code implementation • ECCV 2018 • Anirudh Som, Kowshik Thopalli, Karthikeyan Natesan Ramamurthy, Vinay Venkataraman, Ankita Shukla, Pavan Turaga
However, persistence diagrams are multisets of points, and hence it is not straightforward to fuse them with the features used by contemporary machine learning tools such as deep networks.
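One standard way around this mismatch is to rasterize the diagram into a fixed-length vector by placing a persistence-weighted Gaussian at each point, in the spirit of persistence images. The sketch below shows that generic idea, not the paper's specific representation; the grid size, bandwidth, and function name are assumptions.

```python
import numpy as np

def diagram_to_vector(diagram, grid=8, sigma=0.1):
    """Rasterize a persistence diagram (list of (birth, death) pairs in
    [0, 1]) onto a grid in (birth, persistence) coordinates, weighting
    each point's Gaussian bump by its persistence, so the variable-size
    multiset becomes a fixed-length vector a deep net can consume."""
    xs = np.linspace(0.0, 1.0, grid)
    img = np.zeros((grid, grid))
    for birth, death in diagram:
        pers = death - birth
        gx = np.exp(-((xs - birth) ** 2) / (2 * sigma ** 2))
        gy = np.exp(-((xs - pers) ** 2) / (2 * sigma ** 2))
        img += pers * np.outer(gy, gx)
    return img.ravel()

vec = diagram_to_vector([(0.1, 0.6), (0.2, 0.3)])
```

Because the output length depends only on the grid, diagrams with different numbers of points map to comparable vectors, and points of low persistence (often noise) contribute little weight.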