Search Results for author: Giacomo Zara

Found 5 papers, 5 papers with code

Simplifying Open-Set Video Domain Adaptation with Contrastive Learning

1 code implementation • 9 Jan 2023 • Giacomo Zara, Victor Guilherme Turrisi da Costa, Subhankar Roy, Paolo Rota, Elisa Ricci

In this work, we address a more realistic scenario, called open-set unsupervised video domain adaptation (OUVDA), where the target dataset contains "unknown" semantic categories that are not shared with the source.

Tasks: Action Recognition, Contrastive Learning, +1
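For context, here is a minimal sketch of a standard InfoNCE-style contrastive objective of the kind the title refers to; it is a generic illustration, not the exact loss used in the paper.

```python
# Minimal InfoNCE-style contrastive loss (PyTorch). Generic illustration
# of contrastive learning, not the paper's exact objective.
import torch
import torch.nn.functional as F

def info_nce(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.1) -> torch.Tensor:
    """z1, z2: (N, D) embeddings of two views of the same N video clips."""
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature          # (N, N) pairwise similarities
    targets = torch.arange(z1.size(0), device=z1.device)
    # Matching view pairs (the diagonal) are the positives; all others are negatives.
    return F.cross_entropy(logits, targets)

# Usage with random features standing in for clip embeddings.
loss = info_nce(torch.randn(8, 128), torch.randn(8, 128))
```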

AutoLabel: CLIP-based framework for Open-set Video Domain Adaptation

1 code implementation • CVPR 2023 • Giacomo Zara, Subhankar Roy, Paolo Rota, Elisa Ricci

Open-set Unsupervised Video Domain Adaptation (OUVDA) deals with the task of adapting an action recognition model from a labelled source domain to an unlabelled target domain that contains "target-private" categories, which are present in the target but absent in the source.

Tasks: Action Recognition, Domain Adaptation, +1
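To make the CLIP-based open-set setting concrete, here is a sketch of zero-shot matching against the shared label set with a confidence threshold for flagging target-private samples. The checkpoint, prompt template, and threshold are illustrative assumptions; this is not the AutoLabel method itself.

```python
# Sketch: CLIP zero-shot scoring with a confidence threshold to flag
# potential "target-private" (unknown) samples. Model name, prompt, and
# threshold are assumptions for illustration, not the AutoLabel method.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

source_classes = ["archery", "biking", "fencing"]   # shared (source) label set
prompts = [f"a video frame of a person doing {c}" for c in source_classes]

def classify_frame(image: Image.Image, threshold: float = 0.5) -> str:
    inputs = processor(text=prompts, images=image, return_tensors="pt", padding=True)
    with torch.no_grad():
        probs = model(**inputs).logits_per_image.softmax(dim=-1)[0]
    conf, idx = probs.max(dim=0)
    # Low similarity to every shared class suggests a target-private sample.
    return source_classes[idx] if conf >= threshold else "unknown"
```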

Rotation Synchronization via Deep Matrix Factorization

1 code implementation • 9 May 2023 • Gk Tejus, Giacomo Zara, Paolo Rota, Andrea Fusiello, Elisa Ricci, Federica Arrigoni

In this paper we address the rotation synchronization problem, whose objective is to recover absolute rotations from pairwise relative ones; the unknowns and the measurements are represented as nodes and edges of a graph, respectively.

Tasks: Matrix Completion, Simultaneous Localization and Mapping
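To ground the problem, below is a compact NumPy sketch of the classical spectral baseline for rotation synchronization: with all pairwise rotations observed, the block matrix of measurements has rank 3 and its top eigenvectors recover the absolute rotations up to a global rotation. This only illustrates the task; the paper's contribution is a deep matrix-factorization formulation, not this solver.

```python
# Classical spectral baseline for rotation synchronization (NumPy sketch).
# With noiseless pairwise rotations R_ij = R_i R_j^T for all pairs, the
# 3n x 3n block matrix G = [R_ij] equals R R^T, where R stacks the absolute
# rotations, so its top-3 eigenvectors recover them up to a global rotation.
# Illustration only; the paper instead uses deep matrix factorization.
import numpy as np

def project_to_so3(M: np.ndarray) -> np.ndarray:
    """Project a 3x3 matrix onto the nearest rotation via SVD."""
    U, _, Vt = np.linalg.svd(M)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    return U @ D @ Vt

def spectral_sync(G: np.ndarray, n: int) -> list:
    """G: (3n, 3n) symmetric block matrix of pairwise rotations R_ij."""
    _, eigvecs = np.linalg.eigh(G)
    V = eigvecs[:, -3:] * np.sqrt(n)        # top-3 eigenspace, rescaled
    return [project_to_so3(V[3 * i: 3 * i + 3]) for i in range(n)]
```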

The Unreasonable Effectiveness of Large Language-Vision Models for Source-free Video Domain Adaptation

1 code implementation • ICCV 2023 • Giacomo Zara, Alessandro Conti, Subhankar Roy, Stéphane Lathuilière, Paolo Rota, Elisa Ricci

The Source-Free Video Unsupervised Domain Adaptation (SFVUDA) task consists of adapting an action recognition model, trained on a labelled source dataset, to an unlabelled target dataset, without accessing the actual source data.

Tasks: Action Recognition, Unsupervised Domain Adaptation
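A sketch of the underlying idea follows: a language-vision model can pseudo-label unlabelled target videos with no source data at all, e.g. by averaging per-frame image features and matching them to class-name text features. The checkpoint and prompt template are assumptions; the paper builds on this capability but this is not its full method.

```python
# Sketch: zero-shot pseudo-labels for unlabelled target videos with a
# language-vision model, averaging per-frame CLIP image features and
# matching them to class-name text features. Checkpoint and prompt are
# assumptions for illustration; not the paper's full method.
import torch
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

def pseudo_label(frames, class_names):
    """frames: list of PIL images sampled from one video."""
    img_in = processor(images=frames, return_tensors="pt")
    txt_in = processor(text=[f"a video of {c}" for c in class_names],
                       return_tensors="pt", padding=True)
    with torch.no_grad():
        v = model.get_image_features(**img_in).mean(dim=0)   # video-level feature
        t = model.get_text_features(**txt_in)
    v = v / v.norm()
    t = t / t.norm(dim=-1, keepdim=True)
    return class_names[(t @ v).argmax().item()]
```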
