Search Results for author: Kennedy Edemacu

Found 3 papers, 0 papers with code

Privacy Preserving Prompt Engineering: A Survey

no code implementations · 9 Apr 2024 · Kennedy Edemacu, Xintao Wu

As a result, the sizes of these models have grown notably in recent years, prompting researchers to adopt the term large language models (LLMs) to characterize the larger-sized PLMs.

In-Context Learning · Privacy Preserving · +1

DP-TabICL: In-Context Learning with Differentially Private Tabular Data

no code implementations · 8 Mar 2024 · Alycia N. Carey, Karuna Bhaila, Kennedy Edemacu, Xintao Wu

In-context learning (ICL) enables large language models (LLMs) to adapt to new tasks by conditioning on demonstrations of question-answer pairs. It has been shown to perform comparably to costly model retraining and fine-tuning.
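The conditioning step described in the abstract can be sketched as simple prompt construction: demonstration question-answer pairs are concatenated ahead of the test question, and the LLM is conditioned on the whole string. The function name and prompt format below are illustrative assumptions, not the paper's protocol.

```python
# Hedged sketch of in-context learning prompt construction.
# Demonstrations (question-answer pairs) are concatenated before the
# test question; the model then completes the final "A:" slot.

def build_icl_prompt(demonstrations, question):
    """Assemble a few-shot prompt from (question, answer) demonstration pairs."""
    parts = [f"Q: {q}\nA: {a}" for q, a in demonstrations]
    parts.append(f"Q: {question}\nA:")  # test question, answer left open
    return "\n\n".join(parts)

demos = [
    ("Is the capital of France Paris?", "Yes"),
    ("Is 7 an even number?", "No"),
]
prompt = build_icl_prompt(demos, "Is water wet?")
```

Because no weights are updated, adapting to a new task only requires swapping the demonstration set, which is what makes ICL a cheap alternative to retraining or fine-tuning.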

In-Context Learning

Reliability Check via Weight Similarity in Privacy-Preserving Multi-Party Machine Learning

no code implementations · 14 Jan 2021 · Kennedy Edemacu, Beakcheol Jang, Jong Wook Kim

Multi-party machine learning is a paradigm in which multiple participants collaboratively train a machine learning model to achieve a common learning objective without sharing their privately owned data.
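The collaboration pattern in this abstract can be sketched as a round of local training followed by server-side weight averaging (a FedAvg-style loop). The linear model, update rule, and round count below are assumptions for illustration, not the reliability-check mechanism the paper proposes.

```python
# Minimal sketch of multi-party learning via weight averaging.
# Each party takes a gradient step on its own private data; only the
# resulting weights (never the raw data) are sent for aggregation.
import numpy as np

def local_update(weights, X, y, lr=0.1):
    """One gradient step of least-squares regression on a party's private data."""
    grad = 2 * X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad

def aggregate(party_weights):
    """Server averages the locally trained weights into a new global model."""
    return np.mean(party_weights, axis=0)

rng = np.random.default_rng(0)
global_w = np.zeros(3)
# Three parties, each holding 20 private samples with 3 features.
parties = [(rng.normal(size=(20, 3)), rng.normal(size=20)) for _ in range(3)]
for _ in range(10):  # collaborative training rounds
    global_w = aggregate([local_update(global_w, X, y) for X, y in parties])
```

Since only model weights cross party boundaries, each participant's data stays private; the paper's contribution is checking, via weight similarity, whether those contributed updates are reliable.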

BIG-bench Machine Learning · Privacy Preserving
