no code implementations • 6 Nov 2024 • Stephen Pasteris, Chris Hicks, Vasilios Mavroudis
At the start of each trial we are given a probability distribution over the context set and are required, on that trial, to be fair with respect to that distribution: had the context been drawn from the distribution, our choice of action would be unbiased towards any protected group.
no code implementations • 31 May 2024 • Stephen Pasteris, Chris Hicks, Vasilios Mavroudis, Mark Herbster
A switching regret is defined relative to any segmentation of the trial sequence, and is equal to the sum of the static regrets of each segment.
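The switching-regret definition above can be sketched directly from the abstract (a toy illustration with hypothetical loss data, not the paper's algorithm):

```python
def static_regret(our_losses, expert_losses, start, end):
    """Static regret on trials [start, end): our cumulative loss minus
    the cumulative loss of the single best expert on that segment."""
    ours = sum(our_losses[start:end])
    best = min(sum(l[start:end]) for l in expert_losses)
    return ours - best

def switching_regret(our_losses, expert_losses, boundaries):
    """Sum of the static regrets of each segment of the trial sequence,
    e.g. boundaries=[0, 2, 4] splits trials 0..3 into [0,2) and [2,4)."""
    return sum(
        static_regret(our_losses, expert_losses, s, e)
        for s, e in zip(boundaries, boundaries[1:])
    )

# Toy example: 2 experts over 4 trials; expert 0 is best early, expert 1 late.
ours = [1.0, 1.0, 0.0, 1.0]
experts = [[0.0, 1.0, 1.0, 1.0],
           [1.0, 1.0, 0.0, 0.0]]
print(switching_regret(ours, experts, [0, 2, 4]))  # → 2.0
```

A finer segmentation can only lower the comparator's loss, so the switching regret against many segments is a stronger benchmark than static regret against one.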
no code implementations • 24 Feb 2024 • Stephen Pasteris, Chris Hicks, Vasilios Mavroudis
Running backpropagation end to end on large neural networks is fraught with difficulties like vanishing gradients and degradation.
1 code implementation • 22 Feb 2024 • Stephen Pasteris, Alberto Rumi, Maximilian Thiessen, Shota Saito, Atsushi Miyauchi, Fabio Vitale, Mark Herbster
We study the classic problem of prediction with expert advice under bandit feedback.
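The classic baseline for this setting is exponential weights with importance-weighted loss estimates; a minimal EXP3-style sketch (illustrative only, not the algorithm of the paper) looks like this:

```python
import math
import random

def exp3(n_arms, T, loss_fn, eta, seed=0):
    """Minimal EXP3 sketch for adversarial bandits: exponential weights
    over arms, updated with importance-weighted loss estimates.
    loss_fn(t, arm) returns the loss in [0, 1] of the pulled arm only
    (bandit feedback: the other arms' losses are never observed)."""
    rng = random.Random(seed)
    weights = [1.0] * n_arms
    total_loss = 0.0
    for t in range(T):
        z = sum(weights)
        probs = [w / z for w in weights]
        arm = rng.choices(range(n_arms), weights=probs)[0]
        loss = loss_fn(t, arm)
        total_loss += loss
        est = loss / probs[arm]              # unbiased estimate of the loss
        weights[arm] *= math.exp(-eta * est)
    return total_loss

# Toy run: arm 0 always incurs zero loss, the others loss 1.
total = exp3(3, 200, lambda t, a: 0.0 if a == 0 else 1.0, eta=0.1)
```

The importance weighting (dividing by the pull probability) is what makes the estimate unbiased despite seeing only one arm's loss per trial.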
no code implementations • 14 Dec 2023 • Stephen Pasteris, Chris Hicks, Vasilios Mavroudis
In this paper we consider the adversarial contextual bandit problem in metric spaces.
no code implementations • 10 Nov 2023 • Stephen Pasteris, Alberto Rumi, Fabio Vitale, Nicolò Cesa-Bianchi
Many online decision-making problems correspond to maximizing a sequence of submodular functions.
no code implementations • NeurIPS 2023 • Stephen Pasteris, Chris Hicks, Vasilios Mavroudis
In this paper we adapt the nearest neighbour rule to the contextual bandit problem.
no code implementations • 11 Feb 2023 • Stephen Pasteris, Fabio Vitale, Mark Herbster, Claudio Gentile, Andre' Panisson
We investigate the problem of online collaborative filtering under no-repetition constraints, whereby users need to be served content in an online fashion and a given user cannot be recommended the same content item more than once.
no code implementations • 13 Apr 2022 • Hanlin Lu, Changchang Liu, Shiqiang Wang, Ting He, Vijay Narayanan, Kevin S. Chan, Stephen Pasteris
Coresets are small, weighted summaries of larger datasets, aiming at providing provable error bounds for machine learning (ML) tasks while significantly reducing the communication and computation costs.
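The idea of a weighted summary can be illustrated with a deliberately naive sketch (uniform sampling on 1-D points; the paper's coresets use more careful constructions to obtain provable error bounds):

```python
import random

def uniform_coreset(points, m, seed=0):
    """Naive illustrative coreset: sample m points uniformly at random and
    give each weight n/m, so that weighted sums over the coreset
    approximate sums over the full dataset."""
    rng = random.Random(seed)
    n = len(points)
    sample = rng.sample(points, m)
    return [(p, n / m) for p in sample]

def weighted_cost(coreset, center):
    """Weighted 1-means cost of a candidate center on 1-D points."""
    return sum(w * (p - center) ** 2 for p, w in coreset)

# The coreset's total weight equals the full dataset size, and its
# weighted cost approximates the full cost at a fraction of the size.
data = list(range(100))
cs = uniform_coreset(data, 10)
full_cost = sum((p - 50) ** 2 for p in data)
approx_cost = weighted_cost(cs, 50)
```

Transmitting the 10 weighted points instead of all 100 is what yields the communication savings the abstract refers to.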
no code implementations • NeurIPS 2021 • Mark Herbster, Stephen Pasteris, Fabio Vitale, Massimiliano Pontil
Users are in a social network and the learner is aided by a priori knowledge of the strengths of the social links between all pairs of users.
no code implementations • NeurIPS 2021 • Lin Yang, Yu-Zhen Janice Chen, Stephen Pasteris, Mohammad Hajiesmaili, John C. S. Lui, Don Towsley
This paper studies a cooperative multi-armed bandit problem in which $M$ agents cooperate to solve the same instance of a $K$-armed stochastic bandit problem, with the goal of maximizing the cumulative reward of the agents.
no code implementations • 8 Feb 2021 • Hanlin Lu, Ting He, Shiqiang Wang, Changchang Liu, Mehrdad Mahdavi, Vijaykrishnan Narayanan, Kevin S. Chan, Stephen Pasteris
We consider the problem of computing the k-means centers for a large high-dimensional dataset in the context of edge-based machine learning, where data sources offload machine learning computation to nearby edge servers.
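For reference, the centralized k-means computation that the edge servers would offload can be sketched as plain Lloyd's iteration (an illustrative sketch with a naive first-k-points initialization; the paper's contribution is the coreset-based distributed setting, not this routine):

```python
def kmeans(points, k, iters=20):
    """Plain Lloyd's iteration on tuples of floats.
    Naive deterministic init: the first k points."""
    centers = [tuple(p) for p in points[:k]]
    for _ in range(iters):
        # Assign each point to its nearest center (squared Euclidean).
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[j].append(p)
        # Recompute each center as its cluster mean (keep old if empty).
        centers = [
            tuple(sum(xs) / len(cl) for xs in zip(*cl)) if cl else centers[j]
            for j, cl in enumerate(clusters)
        ]
    return centers

# Two well-separated blobs recover their means.
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
centers = kmeans(pts, 2)
```

Running this over a small coreset instead of the raw data is what reduces the computation and communication cost in the edge setting.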
no code implementations • NeurIPS 2020 • Mark Herbster, Stephen Pasteris, Lisa Tse
We provide an algorithm that predicts on each trial in time linear in the number of hypotheses when the hypothesis class is finite.
no code implementations • 6 Jul 2020 • Stephen Pasteris, Ting He, Fabio Vitale, Shiqiang Wang, Mark Herbster
In this paper, we provide a rigorous theoretical investigation of an online learning version of the Facility Location problem which is motivated by emerging problems in real-world applications.
no code implementations • NeurIPS 2020 • Mark Herbster, Stephen Pasteris, Lisa Tse
In this setting, we provide an example where the side information is not directly specified in advance.
no code implementations • 28 Oct 2018 • Stephen Pasteris, Fabio Vitale, Kevin Chan, Shiqiang Wang, Mark Herbster
We introduce a new online learning framework where, at each trial, the learner is required to select a subset of actions from a given known action set.
no code implementations • 19 Jun 2017 • Stephen Pasteris, Fabio Vitale, Claudio Gentile, Mark Herbster
We measure performance not based on the recovery of the hidden similarity function, but instead on how well we classify each item.
no code implementations • NeurIPS 2016 • Mark Herbster, Stephen Pasteris, Massimiliano Pontil
We study the problem of completing a binary matrix in an online learning setting.
no code implementations • NeurIPS 2015 • Mark Herbster, Stephen Pasteris, Shaona Ghosh
We design an online algorithm to classify the vertices of a graph.
no code implementations • 31 Jul 2013 • Stephen Pasteris
The junction tree algorithm first constructs a tree, called a junction tree, whose vertices are sets of random variables.
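Given the maximal cliques of a triangulated graph, the standard way to connect them into a junction tree is a maximum-weight spanning tree of the clique graph, weighting each edge by the separator size |C_i ∩ C_j| (a generic sketch using Kruskal's algorithm, not code from the paper):

```python
from itertools import combinations

def junction_tree(cliques):
    """Connect cliques (given as sets of variable names) into a junction
    tree: run Kruskal's algorithm on the clique graph, taking edges in
    decreasing order of separator size |C_i & C_j|."""
    edges = sorted(
        ((len(a & b), i, j)
         for (i, a), (j, b) in combinations(enumerate(cliques), 2)),
        reverse=True,
    )
    parent = list(range(len(cliques)))
    def find(x):                      # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    tree = []
    for w, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj and w > 0:        # only join cliques that share variables
            parent[ri] = rj
            tree.append((i, j))
    return tree

# Chain of cliques {A,B}-{B,C}-{C,D}: the tree links neighbours sharing a variable.
cliques = [{"A", "B"}, {"B", "C"}, {"C", "D"}]
tree = junction_tree(cliques)
```

Maximizing the separator sizes is what guarantees the running-intersection property that the downstream message-passing relies on.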