Search Results for author: Idan Achituve

Found 13 papers, 10 papers with code

Bayesian Uncertainty for Gradient Aggregation in Multi-Task Learning

1 code implementation • 6 Feb 2024 • Idan Achituve, Idit Diamant, Arnon Netzer, Gal Chechik, Ethan Fetaya

Running a dedicated model for each task is computationally expensive, so there is great interest in multi-task learning (MTL).

Bayesian Inference · Multi-Task Learning

De-Confusing Pseudo-Labels in Source-Free Domain Adaptation

no code implementations • 3 Jan 2024 • Idit Diamant, Amir Rosenfeld, Idan Achituve, Jacob Goldberger, Arnon Netzer

In this paper, we introduce a novel noise-learning approach tailored to the noise distribution that arises in domain adaptation settings, and learn to de-confuse the pseudo-labels.

Source-Free Domain Adaptation

Data Augmentations in Deep Weight Spaces

no code implementations • 15 Nov 2023 • Aviv Shamsian, David W. Zhang, Aviv Navon, Yan Zhang, Miltiadis Kofinas, Idan Achituve, Riccardo Valperga, Gertjan J. Burghouts, Efstratios Gavves, Cees G. M. Snoek, Ethan Fetaya, Gal Chechik, Haggai Maron

Learning in weight spaces, where neural networks process the weights of other deep neural networks, has emerged as a promising research direction with applications in various fields, from analyzing and editing neural fields and implicit neural representations, to network pruning and quantization.

Data Augmentation · Network Pruning · +1
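
One concrete, function-preserving augmentation in weight space is hidden-neuron permutation: reordering the hidden units of an MLP changes the weights but not the network's input-output map. A minimal sketch of the idea (illustrative code, not the paper's implementation):

    import numpy as np

    def permute_hidden_units(W1, b1, W2, rng):
        # Reorder the hidden units of a 2-layer MLP x -> W2 @ tanh(W1 @ x + b1).
        # Permuting the rows of W1/b1 together with the matching columns of W2
        # leaves the computed function unchanged, so this is a label-preserving
        # augmentation for models whose *input* is the weight tuple (W1, b1, W2).
        perm = rng.permutation(W1.shape[0])
        return W1[perm], b1[perm], W2[:, perm]

    rng = np.random.default_rng(0)
    W1, b1, W2 = rng.normal(size=(16, 8)), rng.normal(size=16), rng.normal(size=(4, 16))
    x = rng.normal(size=8)
    f = lambda A, b, C: C @ np.tanh(A @ x + b)
    W1p, b1p, W2p = permute_hidden_units(W1, b1, W2, rng)
    assert np.allclose(f(W1, b1, W2), f(W1p, b1p, W2p))  # same function, new weights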

GD-VDM: Generated Depth for better Diffusion-based Video Generation

1 code implementation • 19 Jun 2023 • Ariel Lapid, Idan Achituve, Lior Bracha, Ethan Fetaya

GD-VDM is based on a two-phase generation process: a depth video is generated first and then passed to a novel diffusion Vid2Vid model that generates a coherent real-world video.

Image Generation · Video Generation

Guided Deep Kernel Learning

1 code implementation • 19 Feb 2023 • Idan Achituve, Gal Chechik, Ethan Fetaya

Combining Gaussian processes with the expressive power of deep neural networks is now commonly done through deep kernel learning (DKL).

Gaussian Processes
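
For context, a deep kernel applies a standard base kernel on top of features produced by a neural network, with the two trained jointly. A minimal numpy sketch of the construction (the paper's guided training procedure is not shown):

    import numpy as np

    def feature_net(X, W1, W2):
        # A small MLP feature extractor phi(x); in DKL its weights are
        # learned jointly with the GP hyperparameters.
        return np.tanh(X @ W1) @ W2

    def deep_rbf_kernel(X1, X2, net_params, lengthscale=1.0):
        # Deep kernel: k(x, x') = rbf(phi(x), phi(x')).
        Z1, Z2 = feature_net(X1, *net_params), feature_net(X2, *net_params)
        sq_dists = ((Z1[:, None, :] - Z2[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * sq_dists / lengthscale**2)

    rng = np.random.default_rng(0)
    net_params = (0.3 * rng.normal(size=(5, 32)), 0.3 * rng.normal(size=(32, 4)))
    X = rng.normal(size=(10, 5))
    K = deep_rbf_kernel(X, X, net_params)  # 10x10 GP covariance matrix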

Equivariant Architectures for Learning in Deep Weight Spaces

1 code implementation • 30 Jan 2023 • Aviv Navon, Aviv Shamsian, Idan Achituve, Ethan Fetaya, Gal Chechik, Haggai Maron

Designing machine learning architectures for processing neural networks in their raw weight matrix form is a newly introduced research direction.

Communication Efficient Distributed Learning over Wireless Channels

no code implementations • 4 Sep 2022 • Idan Achituve, Wenbo Wang, Ethan Fetaya, Amir Leshem

Vertical distributed learning exploits the local features collected by multiple learning workers to form a better global model.
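
In the vertical setting, each worker observes a different block of features for the same samples and contributes a local score to a shared prediction. A toy illustration of this split (not the paper's wireless aggregation scheme):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100
    # Two workers hold disjoint feature blocks of the *same* n samples.
    X_a, X_b = rng.normal(size=(n, 3)), rng.normal(size=(n, 2))
    w_a, w_b = rng.normal(size=3), rng.normal(size=2)

    # Each worker transmits only its local score, never the raw features.
    score_a, score_b = X_a @ w_a, X_b @ w_b

    # A central aggregator sums the contributions into the global prediction.
    predictions = (score_a + score_b > 0).astype(int)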

Functional Ensemble Distillation

1 code implementation • 5 Jun 2022 • Coby Penso, Idan Achituve, Ethan Fetaya

Bayesian models have many desirable properties, most notably their ability to generalize from limited data and to properly estimate the uncertainty in their predictions.

Bayesian Inference

Multi-Task Learning as a Bargaining Game

2 code implementations • 2 Feb 2022 • Aviv Navon, Aviv Shamsian, Idan Achituve, Haggai Maron, Kenji Kawaguchi, Gal Chechik, Ethan Fetaya

In this paper, we propose viewing the gradient combination step as a bargaining game, where tasks negotiate to reach an agreement on a joint direction of parameter update.

Multi-Task Learning
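
The Nash bargaining solution maximizes the product of the tasks' utilities, here each task's gain along the shared update direction. A toy sketch of that objective (an illustration of the framing, not the paper's solver):

    import numpy as np
    from scipy.optimize import minimize

    def bargained_direction(G):
        # Maximize sum_i log(g_i . d) over the unit ball, i.e. the product
        # of per-task gains g_i . d -- the Nash bargaining objective.
        # Rows of G are per-task gradients.
        d0 = G.mean(axis=0)
        d0 /= np.linalg.norm(d0)
        objective = lambda d: -np.sum(np.log(np.clip(G @ d, 1e-9, None)))
        unit_ball = {"type": "ineq", "fun": lambda d: 1.0 - d @ d}
        return minimize(objective, d0, constraints=[unit_ball]).x

    # Two partially conflicting task gradients agree on a joint direction.
    G = np.array([[1.0, 0.2], [0.2, 1.0]])
    d = bargained_direction(G)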

Auxiliary Learning by Implicit Differentiation

1 code implementation • ICLR 2021 • Aviv Navon, Idan Achituve, Haggai Maron, Gal Chechik, Ethan Fetaya

Two main challenges arise in this multi-task learning setting: (i) designing useful auxiliary tasks; and (ii) combining auxiliary tasks into a single coherent loss.

Auxiliary Learning · Image Segmentation · +3
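
Challenge (ii) amounts to weighting auxiliary losses so that they help the main task; the paper tunes such combination parameters via implicit differentiation. A minimal sketch of a learnable combination (illustrative only, not the paper's optimizer):

    import torch

    log_w = torch.zeros(3, requires_grad=True)  # one weight per auxiliary task

    def combined_loss(main_loss, aux_losses):
        # softplus keeps the learned per-task weights positive
        w = torch.nn.functional.softplus(log_w)
        return main_loss + (w * torch.stack(aux_losses)).sum()

    main = torch.tensor(1.0, requires_grad=True)
    auxs = [torch.tensor(v, requires_grad=True) for v in (0.5, 0.2, 0.9)]
    loss = combined_loss(main, auxs)
    loss.backward()  # gradients reach both the model and the combination weights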

Self-Supervised Learning for Domain Adaptation on Point-Clouds

3 code implementations • 29 Mar 2020 • Idan Achituve, Haggai Maron, Gal Chechik

Self-supervised learning (SSL) is a technique for learning useful representations from unlabeled data.

Domain Adaptation · Self-Supervised Learning
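
A typical SSL recipe constructs a pretext task whose labels come from a transformation of the data itself. A generic point-cloud example (rotation prediction; the paper's own pretext task differs):

    import numpy as np

    def rotate_z(points, angle):
        # Rotate an (N, 3) point cloud about the z-axis.
        c, s = np.cos(angle), np.sin(angle)
        R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
        return points @ R.T

    def make_pretext_batch(clouds, rng):
        # Rotate each cloud by one of four known angles and ask the network
        # to classify which angle was applied -- the labels come from the
        # transformation itself, so no manual annotation is needed.
        angles = np.array([0.0, 0.5, 1.0, 1.5]) * np.pi
        labels = rng.integers(0, 4, size=len(clouds))
        rotated = [rotate_z(pc, angles[y]) for pc, y in zip(clouds, labels)]
        return np.stack(rotated), labels

    rng = np.random.default_rng(0)
    clouds = rng.normal(size=(8, 1024, 3))  # a batch of 8 point clouds
    batch, labels = make_pretext_batch(clouds, rng)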
