no code implementations • 11 Sep 2024 • Hongyan Chang, Ali Shahin Shamsabadi, Kleomenis Katevas, Hamed Haddadi, Reza Shokri
Prior Membership Inference Attacks (MIAs) on pre-trained Large Language Models (LLMs), adapted from classification-model attacks, fail because they ignore the generative process by which LLMs produce token sequences.
1 code implementation • 19 Mar 2024 • Stefanos Laskaridis, Kleomenis Katevas, Lorenzo Minto, Hamed Haddadi
Transformers have revolutionized the machine learning landscape, gradually making their way into everyday tasks and equipping our computers with "sparks of intelligence".
no code implementations • 26 Feb 2023 • Ioannis Arapakis, Panagiotis Papadopoulos, Kleomenis Katevas, Diego Perino
Distributed (or Federated) learning enables users to train machine learning models on their own devices, sharing only their models' gradients, usually in a differentially private way (at some cost in utility).
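The gradient-sharing step described above is commonly protected with the Gaussian mechanism: each user clips their gradient to a fixed norm, then adds calibrated noise before upload. A minimal sketch of that pattern (the function name, norms, and noise scale here are illustrative assumptions, not from the paper):

```python
import numpy as np

def privatize_gradient(grad, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    """Clip a user's gradient to a fixed L2 norm, then add Gaussian noise.

    Clipping bounds each user's contribution; the noise scale is
    calibrated to that bound, as in standard DP federated learning.
    """
    rng = rng or np.random.default_rng(0)
    norm = np.linalg.norm(grad)
    # Scale down (never up) so the gradient's L2 norm is at most clip_norm.
    clipped = grad / max(1.0, norm / clip_norm)
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=grad.shape)
    return clipped + noise

# The server averages the noisy, clipped gradients from all users.
grads = [np.array([3.0, 4.0]), np.array([0.1, -0.2])]
update = np.mean([privatize_gradient(g) for g in grads], axis=0)
```

The "utility loss" the snippet mentions comes directly from the clipping bias and the added noise.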
no code implementations • 28 Oct 2021 • Dusan Stamenkovic, Alexandros Karatzoglou, Ioannis Arapakis, Xin Xin, Kleomenis Katevas
The proposed SMORL agent augments a standard recommendation model with additional RL layers that drive it to simultaneously satisfy three principal objectives: accuracy, diversity, and novelty of recommendations.
1 code implementation • 29 Apr 2021 • Fan Mo, Hamed Haddadi, Kleomenis Katevas, Eduard Marin, Diego Perino, Nicolas Kourtellis
We propose and implement a Privacy-preserving Federated Learning ($PPFL$) framework for mobile systems to limit privacy leakages in federated learning.
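At the heart of any such federated learning system is the server-side aggregation step, typically FedAvg: clients train locally and upload only parameters, which the server averages weighted by local dataset size. A minimal sketch of that aggregation (not the $PPFL$ implementation itself, which additionally runs inside TEEs):

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """Weighted average of client model parameters (FedAvg).

    Each client uploads only its trained parameters; the server combines
    them weighted by local dataset size, so raw data stays on-device.
    """
    coeffs = np.array(client_sizes, dtype=float) / sum(client_sizes)
    stacked = np.stack(client_weights)
    # Contract the client axis against the weights: sum_i coeffs[i] * w_i.
    return np.tensordot(coeffs, stacked, axes=1)

# Two clients with different amounts of local data.
w = fedavg([np.array([1.0, 1.0]), np.array([3.0, 5.0])], [1, 3])
# -> 0.25*[1, 1] + 0.75*[3, 5] = [2.5, 4.0]
```

$PPFL$'s contribution is to shield this pipeline (local training and aggregation) from privacy leakage, not to change the averaging rule.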
no code implementations • 18 Nov 2020 • Nicolas Kourtellis, Kleomenis Katevas, Diego Perino
Indeed, FL enables local training on user devices, avoiding the transfer of user data to centralized servers, and can be enhanced with differential privacy mechanisms.
2 code implementations • 12 Apr 2020 • Fan Mo, Ali Shahin Shamsabadi, Kleomenis Katevas, Soteris Demetriou, Ilias Leontiadis, Andrea Cavallaro, Hamed Haddadi
We present DarkneTZ, a framework that uses an edge device's Trusted Execution Environment (TEE) in conjunction with model partitioning to limit the attack surface against Deep Neural Networks (DNNs).
2 code implementations • 14 Mar 2020 • Kleomenis Katevas, Eugene Bagdasaryan, Jason Waterman, Mohamad Mounir Safadieh, Eleanor Birrell, Hamed Haddadi, Deborah Estrin
In this paper we present PoliFL, a decentralized, edge-based framework that supports heterogeneous privacy policies for federated learning.
no code implementations • 13 Jul 2019 • Fan Mo, Ali Shahin Shamsabadi, Kleomenis Katevas, Andrea Cavallaro, Hamed Haddadi
Pre-trained Deep Neural Network (DNN) models are increasingly used in smartphones and other user devices to enable prediction services, leading to potential disclosures of (sensitive) information from training data captured inside these models.
no code implementations • 30 Aug 2018 • Kleomenis Katevas, Katrin Hänsel, Richard Clegg, Ilias Leontiadis, Hamed Haddadi, Laurissa Tokarchuk
Remembering our day-to-day social interactions is challenging even if you aren't a memory-challenged blue fish.
1 code implementation • 9 Feb 2018 • Seyed Ali Osia, Ali Taheri, Ali Shahin Shamsabadi, Kleomenis Katevas, Hamed Haddadi, Hamid R. Rabiee
We present and evaluate Deep Private-Feature Extractor (DPFE), a deep model which is trained and evaluated based on information theoretic constraints.
no code implementations • 19 Dec 2017 • Kleomenis Katevas, Ilias Leontiadis, Martin Pielot, Joan Serrà
Besides using classical gradient-boosted trees, we demonstrate how to make continual predictions using a recurrent neural network (RNN).
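Continual prediction with an RNN means carrying the hidden state across the sensor stream and reading out a prediction after every step, rather than re-processing fixed windows. A sketch of that loop with a vanilla (Elman) cell (the dimensions and random weights are hypothetical, for illustration only):

```python
import numpy as np

def rnn_step(x, h, Wx, Wh, b):
    """One step of a vanilla RNN cell: new hidden state from the
    current sensor reading x and the previous hidden state h."""
    return np.tanh(x @ Wx + h @ Wh + b)

# Hypothetical dimensions: 3 sensor channels, 4 hidden units.
rng = np.random.default_rng(0)
Wx, Wh, b = rng.normal(size=(3, 4)), rng.normal(size=(4, 4)), np.zeros(4)

h = np.zeros(4)                    # state carried across the stream
stream = rng.normal(size=(10, 3))  # ten incoming sensor readings
for x in stream:
    h = rnn_step(x, h, Wx, Wh, b)
    # a prediction can be read out from h after every step,
    # with no need to wait for a fixed-length window
```

The carried state is what makes the prediction "continual": each new reading updates it incrementally.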
1 code implementation • 4 Oct 2017 • Seyed Ali Osia, Ali Shahin Shamsabadi, Ali Taheri, Kleomenis Katevas, Hamid R. Rabiee, Nicholas D. Lane, Hamed Haddadi
Our evaluations show that, by using certain fine-tuning and embedding techniques at a small processing cost, we can greatly reduce the level of information available to unintended tasks applied to the data features on the cloud, thus achieving the desired trade-off between privacy and performance.
no code implementations • 17 May 2017 • Kleomenis Katevas, Ilias Leontiadis, Martin Pielot, Joan Serrà
We present a practical approach for processing mobile sensor time series data for continual deep learning predictions.
1 code implementation • 8 Mar 2017 • Seyed Ali Osia, Ali Shahin Shamsabadi, Sina Sajadmanesh, Ali Taheri, Kleomenis Katevas, Hamid R. Rabiee, Nicholas D. Lane, Hamed Haddadi
To this end, instead of performing the whole operation on the cloud, we let the IoT device run the initial layers of the neural network and then send the output to the cloud, where the remaining layers produce the final result.
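The device/cloud split described here can be sketched as partitioning a network's layers into a local part and a remote part, with only the intermediate features crossing the network (the tiny MLP and its dimensions below are hypothetical, for illustration):

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

# Hypothetical two-layer MLP: layer 1 on the IoT device, layer 2 on the cloud.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(8, 16))   # device-side layer
W2 = rng.normal(size=(16, 4))   # cloud-side layer

def device_part(x):
    """Runs locally: the raw input never leaves the device."""
    return relu(x @ W1)

def cloud_part(features):
    """Runs remotely on the uploaded intermediate features."""
    return features @ W2

x = rng.normal(size=(8,))
features = device_part(x)   # this is all that gets transmitted
y = cloud_part(features)

# Running the whole network in one place gives the same result:
assert np.allclose(y, cloud_part(relu(x @ W1)))
```

The privacy argument is that the transmitted features reveal less about the raw input than the input itself would, while the final prediction is unchanged.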