Search Results for author: Michael Georgiopoulos

Found 11 papers, 3 papers with code

A Multi-criteria Approach for Fast and Outlier-aware Representative Selection from Manifolds

no code implementations • 12 Mar 2020 • Mahlagha Sedghi, George Atia, Michael Georgiopoulos

The problem of representative selection amounts to sampling a few informative exemplars from large datasets.
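As a point of reference for the representative-selection problem this abstract describes, the sketch below uses greedy farthest-point sampling to pick a small set of exemplars that cover a dataset. This is a generic baseline, not the multi-criteria, outlier-aware method proposed in the paper; the function name and parameters are illustrative.

```python
import numpy as np

def farthest_point_selection(X, k, seed=0):
    """Greedy farthest-point sampling: pick k exemplars from X
    (n_samples x n_features). Illustrative baseline only; not the
    paper's multi-criteria method."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    selected = [int(rng.integers(n))]
    # distance of every point to its nearest selected exemplar
    dists = np.linalg.norm(X - X[selected[0]], axis=1)
    while len(selected) < k:
        nxt = int(np.argmax(dists))  # the point currently worst-covered
        selected.append(nxt)
        dists = np.minimum(dists, np.linalg.norm(X - X[nxt], axis=1))
    return selected
```

Each round adds the point farthest from all chosen exemplars, so coverage of the dataset improves monotonically; note that, unlike the paper's approach, this heuristic is not robust to outliers, which are exactly the points farthest from everything else.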

Learning Hash Function through Codewords

no code implementations • 22 Feb 2019 • Yinjie Huang, Michael Georgiopoulos, Georgios C. Anagnostopoulos

In this paper, we propose a novel hash learning approach that has the following main distinguishing features when compared to past frameworks.

Content-Based Image Retrieval • Retrieval

Multi-Task Learning Using Neighborhood Kernels

no code implementations • 11 Jul 2017 • Niloofar Yousefi, Cong Li, Mansooreh Mollaghasemi, Georgios Anagnostopoulos, Michael Georgiopoulos

As shown by our empirical results, our algorithm consistently outperforms traditional kernel learning algorithms, such as the uniform combination solution, convex combinations of base kernels, and kernel alignment-based models, which have been shown to give promising results in the past.

Multi-Task Learning

Multi-Task Learning with Group-Specific Feature Space Sharing

1 code implementation • 13 Aug 2015 • Niloofar Yousefi, Michael Georgiopoulos, Georgios C. Anagnostopoulos

When faced with learning a set of inter-related tasks from a limited amount of usable data, learning each task independently may lead to poor generalization performance.

Binary Classification • Multi-Task Learning

Hash Function Learning via Codewords

1 code implementation • 13 Aug 2015 • Yinjie Huang, Michael Georgiopoulos, Georgios C. Anagnostopoulos

In this paper we introduce a novel hash learning framework that has two main distinguishing features when compared to past approaches.

Content-Based Image Retrieval • Retrieval

Conic Multi-Task Classification

1 code implementation • 20 Aug 2014 • Cong Li, Michael Georgiopoulos, Georgios C. Anagnostopoulos

Traditionally, Multi-task Learning (MTL) models optimize the average of task-related objective functions, an intuitive approach that we will refer to as Average MTL.
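To make the Average MTL baseline mentioned in this abstract concrete, here is a toy objective that averages per-task squared losses over a shared parameter matrix. The function name, the squared-loss choice, and the regularizer are illustrative assumptions; this is the traditional baseline the paper contrasts against, not its conic formulation.

```python
import numpy as np

def average_mtl_loss(W, tasks, lam=0.1):
    """Average MTL objective: the mean of per-task losses plus a
    regularizer on the shared parameters. W has shape
    (n_tasks, n_features); tasks is a list of (X_t, y_t) pairs.
    Toy illustration of averaging task objectives, not the paper's
    conic multi-task formulation."""
    per_task = [np.mean((X @ w - y) ** 2) for w, (X, y) in zip(W, tasks)]
    return np.mean(per_task) + lam * np.sum(W ** 2)
```

Because every task contributes with equal weight, minimizing this average can favor easy tasks at the expense of hard ones; weighting schemes beyond the plain average are one motivation for alternative MTL formulations.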

Classification • General Classification • +1

Pareto-Path Multi-Task Multiple Kernel Learning

no code implementations • 11 Apr 2014 • Cong Li, Michael Georgiopoulos, Georgios C. Anagnostopoulos

A traditional and intuitively appealing Multi-Task Multiple Kernel Learning (MT-MKL) method is to optimize the sum (thus, the average) of objective functions with a (partially) shared kernel function, which allows information sharing among tasks.

Multi-Task Learning

A Unifying Framework for Typical Multi-Task Multiple Kernel Learning Problems

no code implementations • 21 Jan 2014 • Cong Li, Michael Georgiopoulos, Georgios C. Anagnostopoulos

Over the past few years, Multi-Kernel Learning (MKL) has received significant attention among data-driven feature selection techniques in the context of kernel-based learning.

feature selection • Multi-Task Learning

Multi-Task Classification Hypothesis Space with Improved Generalization Bounds

no code implementations • 9 Dec 2013 • Cong Li, Michael Georgiopoulos, Georgios C. Anagnostopoulos

This paper presents an RKHS, in general of vector-valued functions, intended to be used as a hypothesis space for multi-task classification.

Classification • General Classification • +2
