no code implementations • ACL (MetaNLP) 2021 • Cyprien de Lichy, Hadrien Glaude, William Campbell
Meta-learning has recently been proposed to learn models and algorithms that can generalize from a handful of examples.
30 Mar 2024 • Subigya Nepal, Arvind Pillai, William Campbell, Talie Massachi, Eunsol Soul Choi, Orson Xu, Joanna Kuc, Jeremy Huckins, Jason Holden, Colin Depp, Nicholas Jacobson, Mary Czerwinski, Eric Granholm, Andrew T. Campbell
MindScape aims to study the benefits of integrating time series behavioral patterns (e.g., conversational engagement, sleep, location) with Large Language Models (LLMs) to create a new form of contextual AI journaling, promoting self-reflection and well-being.
24 Jan 2022 • Ninareh Mehrabi, Cyprien de Lichy, John McKay, Cynthia He, William Campbell
With this goal in mind, we conduct studies to show that FL is able to satisfy different fairness metrics under different data regimes consisting of different types of clients.
WS 2019 • Zimeng Qiu, Eunah Cho, Xiaochun Ma, William Campbell
Semi-supervised learning is an efficient method to augment training data automatically from unlabeled data.
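One common way to realize this idea is self-training: a model trained on the labeled set pseudo-labels unlabeled examples and keeps only the confident ones. The entry does not specify the method, so the following is a hedged sketch of a generic pseudo-labeling step (function names and threshold are illustrative, not the paper's code):

```python
import numpy as np

def pseudo_label(predict_proba, x_unlabeled, threshold=0.9):
    """Self-training step: keep unlabeled examples whose top predicted
    class probability exceeds a confidence threshold, and use that
    prediction as the training label (illustrative helper)."""
    proba = predict_proba(x_unlabeled)
    conf = proba.max(axis=1)          # confidence of the top class
    labels = proba.argmax(axis=1)     # pseudo-label = most likely class
    mask = conf >= threshold
    return x_unlabeled[mask], labels[mask]

# Toy two-class classifier: sigmoid of a fixed linear score.
def toy_proba(x):
    score = x @ np.array([1.0, -1.0])
    p1 = 1.0 / (1.0 + np.exp(-score))
    return np.stack([1 - p1, p1], axis=1)

x_u = np.array([[3.0, 0.0], [0.1, 0.0], [0.0, 3.0]])
x_kept, y_kept = pseudo_label(toy_proba, x_u)
# The low-confidence middle example is filtered out.
```

The confident examples are then appended to the labeled training set and the model is retrained, which is the usual loop this one-step sketch would sit inside.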
WS 2019 • Varun Kumar, Hadrien Glaude, Cyprien de Lichy, William Campbell
In particular, we show that (a) upsampling in latent space is a competitive baseline for feature-space augmentation, and (b) adding the difference between two examples to a new example is a simple yet effective data augmentation method.
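The difference-based augmentation in (b) can be sketched in a few lines: given two feature vectors from the same class, their difference is added to a third example to synthesize a new feature vector. This is a minimal illustration under that reading; the names and the `scale` parameter are hypothetical, not the paper's API:

```python
import numpy as np

def delta_augment(x_new, x_a, x_b, scale=1.0):
    """Feature-space augmentation: add the difference between two
    examples (x_a - x_b) to a new example, yielding a synthetic
    feature vector (illustrative sketch, not the authors' code)."""
    return np.asarray(x_new) + scale * (np.asarray(x_a) - np.asarray(x_b))

# Toy feature vectors (e.g., utterance embeddings from one intent class).
x_new = np.array([1.0, 2.0])
x_a = np.array([3.0, 4.0])
x_b = np.array([2.0, 1.0])
aug = delta_augment(x_new, x_a, x_b)
# aug == [2.0, 5.0]
```

Because it operates on fixed-length feature vectors rather than raw text, the same transform applies to any embedding space.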
17 May 2016 • Youngjune Gwon, William Campbell, Kevin Brady, Douglas Sturim, Miriam Cha, H. T. Kung
Unsupervised feature learning methods have proven effective for classification tasks based on a single modality.