no code implementations • ECCV 2020 • Van Nhan Nguyen, Sigurd Løkse, Kristoffer Wickstrøm, Michael Kampffmeyer, Davide Roverso, Robert Jenssen
In this paper, we equip Prototypical Networks (PNs) with a novel dissimilarity measure to enable discriminative feature normalization for few-shot learning.
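The paper's novel dissimilarity measure is not detailed in this snippet; as background only, a standard Prototypical Network (Snell et al., 2017) classifies queries by distance to class-mean prototypes in embedding space. A minimal numpy sketch of that baseline (all names here are illustrative, not the paper's code):

```python
import numpy as np

def prototypes(support, labels, n_classes):
    """Class prototypes = mean embedding of each class's support examples."""
    return np.stack([support[labels == c].mean(axis=0) for c in range(n_classes)])

def classify(queries, protos):
    """Assign each query to the nearest prototype
    (squared Euclidean distance, as in standard Prototypical Networks)."""
    d = ((queries[:, None, :] - protos[None, :, :]) ** 2).sum(-1)
    return d.argmin(axis=1)

# Toy 2-D embeddings: two well-separated classes with 5 support points each.
rng = np.random.default_rng(0)
support = np.concatenate([rng.normal(0, 0.1, (5, 2)),
                          rng.normal(3, 0.1, (5, 2))])
labels = np.array([0] * 5 + [1] * 5)
protos = prototypes(support, labels, n_classes=2)
queries = np.array([[0.05, -0.02], [2.9, 3.1]])
print(classify(queries, protos))  # -> [0 1]
```

The cited work replaces the plain Euclidean distance with a dissimilarity that enables discriminative feature normalization; the structure above is only the common starting point.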
1 code implementation • 27 Apr 2024 • Shujian Yu, Xi Yu, Sigurd Løkse, Robert Jenssen, Jose C. Principe
The information bottleneck (IB) approach is a popular means of improving the generalization, robustness, and explainability of deep neural networks.
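For reference, the classical IB objective trades compression of the input $X$ against preservation of information about the target $Y$, minimized over a stochastic encoder $p(t \mid x)$:

```latex
\min_{p(t \mid x)} \; I(X; T) \;-\; \beta \, I(T; Y)
```

where $T$ is the learned representation and $\beta > 0$ controls the compression-prediction trade-off.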
1 code implementation • CVPR 2023 • Daniel J. Trosten, Sigurd Løkse, Robert Jenssen, Michael C. Kampffmeyer
To address this, we present DeepMVC, a unified framework for deep MVC that includes many recent methods as instances.
1 code implementation • CVPR 2023 • Daniel J. Trosten, Rwiddhi Chakraborty, Sigurd Løkse, Kristoffer Knutsen Wickstrøm, Robert Jenssen, Michael C. Kampffmeyer
Distance-based classification is frequently used in transductive few-shot learning (FSL).
no code implementations • 21 Jan 2023 • Shujian Yu, Hongming Li, Sigurd Løkse, Robert Jenssen, José C. Príncipe
The Cauchy-Schwarz (CS) divergence was developed by Príncipe et al. in 2000.
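For reference, the CS divergence between two densities $p$ and $q$ follows from applying the Cauchy-Schwarz inequality to their inner product, and vanishes if and only if $p = q$:

```latex
D_{\mathrm{CS}}(p, q) \;=\; -\log
\frac{\left( \int p(x)\, q(x)\, dx \right)^{2}}
     {\int p(x)^{2}\, dx \, \int q(x)^{2}\, dx}
```

A practical appeal of this divergence is that the three integrals can be estimated directly from samples with Parzen windowing, without estimating the densities explicitly.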
1 code implementation • 18 May 2022 • Kristoffer Wickstrøm, J. Emmanuel Johnson, Sigurd Løkse, Gustau Camps-Valls, Karl Øyvind Mikalsen, Michael Kampffmeyer, Robert Jenssen
Our proposed kernelized Taylor diagram can visualize similarities between populations with minimal assumptions about the data distributions.
1 code implementation • 19 Dec 2021 • Kristoffer K. Wickstrøm, Daniel J. Trosten, Sigurd Løkse, Ahcène Boubekki, Karl Øyvind Mikalsen, Michael C. Kampffmeyer, Robert Jenssen
Our approach can also model the uncertainty in its explanations, which is essential to produce trustworthy explanations.
1 code implementation • CVPR 2021 • Daniel J. Trosten, Sigurd Løkse, Robert Jenssen, Michael Kampffmeyer
Aligning the distributions of view representations is a core component of today's state-of-the-art models for deep multi-view clustering.
2 code implementations • 20 Jan 2020 • Daniel J. Trosten, Sigurd Løkse, Robert Jenssen, Michael Kampffmeyer
In this work we study OFM in deep clustering, and find that the popular autoencoder-based approach to deep clustering can lead to both reduced clustering performance and a significant amount of OFM between the reconstruction and clustering objectives.
no code implementations • 25 Sep 2019 • Kristoffer Wickstrøm, Sigurd Løkse, Michael Kampffmeyer, Shujian Yu, Jose Principe, Robert Jenssen
In this paper, we propose an IP analysis using the new matrix-based Rényi's entropy coupled with tensor kernels over convolutional layers, leveraging the power of kernel methods to represent properties of the probability distribution independently of the dimensionality of the data.
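As background, the matrix-based Rényi entropy (Sánchez Giraldo et al.) is computed from the eigenvalues of a trace-normalized Gram matrix, so no density estimation is required. A minimal sketch with an RBF kernel (parameter choices here are illustrative):

```python
import numpy as np

def matrix_renyi_entropy(X, alpha=2.0, sigma=1.0):
    """Matrix-based Renyi alpha-entropy:
    S_alpha(A) = 1/(1-alpha) * log2( sum_i lambda_i(A)^alpha ),
    where A is the trace-normalized RBF Gram matrix of the samples."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-sq / (2 * sigma ** 2))
    A = K / np.trace(K)                 # normalize so trace(A) = 1
    lam = np.linalg.eigvalsh(A)
    lam = np.clip(lam, 0, None)         # guard against tiny negative eigenvalues
    return np.log2((lam ** alpha).sum()) / (1 - alpha)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
H = matrix_renyi_entropy(X, alpha=2.0)
print(H)  # entropy in bits; bounded above by log2(50) for 50 samples
```

The entropy depends only on the $n \times n$ Gram matrix, which is what makes the measure independent of the dimensionality of the data; the cited work extends this with tensor kernels over convolutional layers.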
no code implementations • 13 Feb 2019 • Michael Kampffmeyer, Sigurd Løkse, Filippo M. Bianchi, Lorenzo Livi, Arnt-Børre Salberg, Robert Jenssen
A promising direction in deep learning research is to learn representations while simultaneously discovering cluster structure in unlabeled data by optimizing a discriminative loss function.
no code implementations • 19 Jul 2018 • Michael Kampffmeyer, Sigurd Løkse, Filippo M. Bianchi, Robert Jenssen, Lorenzo Livi
Autoencoders learn data representations (codes) in such a way that the input is reproduced at the output of the network.
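The reconstruction principle can be sketched in a few lines: train encoder and decoder weights so that the output reproduces the input. A minimal linear autoencoder with hand-derived gradients (a toy illustration, not the paper's model):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 8))            # toy data: 100 samples, 8 features

# One hidden layer of 3 units: encoder W1, decoder W2 (linear, for brevity).
W1 = rng.normal(scale=0.1, size=(8, 3))
W2 = rng.normal(scale=0.1, size=(3, 8))

def loss(W1, W2):
    R = X @ W1 @ W2                      # reconstruction of the input
    return ((R - X) ** 2).mean()

lr = 0.01
initial = loss(W1, W2)
for _ in range(500):
    H = X @ W1                           # codes (learned representations)
    R = H @ W2                           # reconstruction
    G = 2 * (R - X) / X.size             # dL/dR
    gW2 = H.T @ G                        # dL/dW2
    gW1 = X.T @ (G @ W2.T)               # dL/dW1
    W1 -= lr * gW1
    W2 -= lr * gW2
print(initial, loss(W1, W2))             # reconstruction error decreases
```

With linear layers this converges toward the PCA subspace of the data; nonlinear activations and deeper stacks give the richer codes the cited works build on.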
3 code implementations • 21 Mar 2018 • Filippo Maria Bianchi, Simone Scardapane, Sigurd Løkse, Robert Jenssen
The architectures are compared to other MTS classifiers, including deep learning models and time series kernels.
2 code implementations • 17 Nov 2017 • Filippo Maria Bianchi, Simone Scardapane, Sigurd Løkse, Robert Jenssen
We propose a deep architecture for the classification of multivariate time series.
no code implementations • 23 Feb 2017 • Sigurd Løkse, Filippo Maria Bianchi, Arnt-Børre Salberg, Robert Jenssen
In this paper, we propose PCKID, a novel, robust kernel function for spectral clustering, specifically designed to handle incomplete data.
no code implementations • 8 Feb 2017 • Michael Kampffmeyer, Sigurd Løkse, Filippo Maria Bianchi, Robert Jenssen, Lorenzo Livi
In this paper we introduce the deep kernelized autoencoder, a neural network model that allows an explicit approximation of (i) the mapping from an input space to an arbitrary, user-specified kernel space and (ii) the back-projection from such a kernel space to input space.
no code implementations • 16 Aug 2016 • Sigurd Løkse, Filippo Maria Bianchi, Robert Jenssen
In this paper we introduce a new framework for training an Echo State Network to predict real-valued time series.
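As background, a basic Echo State Network keeps a fixed random recurrent reservoir and trains only a linear readout, typically by ridge regression. A minimal numpy sketch for one-step-ahead prediction of a scalar series (hyperparameters here are illustrative, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)
n_res = 100                                   # reservoir size
u = np.sin(0.2 * np.arange(600))              # toy real-valued time series

# Random reservoir, rescaled to spectral radius 0.9 (echo state property).
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()
W_in = rng.uniform(-0.5, 0.5, size=n_res)

# Drive the reservoir with the input sequence and record its states.
states = np.zeros((len(u), n_res))
x = np.zeros(n_res)
for t in range(len(u)):
    x = np.tanh(W @ x + W_in * u[t])
    states[t] = x

# Ridge-regression readout: predict u[t+1] from the state at time t.
washout = 100                                 # discard the initial transient
S, y = states[washout:-1], u[washout + 1:]
W_out = np.linalg.solve(S.T @ S + 1e-6 * np.eye(n_res), S.T @ y)

pred = states[:-1] @ W_out
mse = ((pred[washout:] - u[washout + 1:]) ** 2).mean()
print(mse)  # small for this smooth, periodic signal
```

Only `W_out` is learned; the reservoir stays fixed, which is what makes ESN training a cheap linear problem.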