2 code implementations • 26 Nov 2019 • Maxim Ziatdinov, Dohyung Kim, Sabine Neumayer, Rama K. Vasudevan, Liam Collins, Stephen Jesse, Mahshid Ahmadi, Sergei V. Kalinin
We investigate the ability to reconstruct and derive spatial structure from sparsely sampled 3D piezoresponse force microscopy data, captured using the band-excitation (BE) technique, via Gaussian Process (GP) methods (a minimal illustrative sketch follows this entry).
Computational Physics · Materials Science
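Since the entry above centers on GP-based reconstruction of sparsely sampled data, here is a minimal sketch of the idea in a simplified setting: reconstructing a sparsely sampled 2D field with scikit-learn's GP regressor. The synthetic field, kernel choice, and 15% sampling rate are illustrative assumptions, not the authors' pipeline (which operates on 3D BE-PFM data).

```python
# Minimal sketch of GP-based reconstruction of a sparsely sampled image.
# Toy 2D analogue of the 3D BE-PFM setting; data and kernel are assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Synthetic "ground truth" response on a 32x32 grid.
xx, yy = np.meshgrid(np.linspace(0, 1, 32), np.linspace(0, 1, 32))
truth = np.sin(6 * xx) * np.cos(4 * yy)

# Keep only ~15% of pixels to mimic sparse sampling.
coords = np.column_stack([xx.ravel(), yy.ravel()])
mask = rng.random(coords.shape[0]) < 0.15
X_train, y_train = coords[mask], truth.ravel()[mask]

# Fit a GP with an RBF kernel plus a noise term, then predict the full grid.
gp = GaussianProcessRegressor(RBF(length_scale=0.1) + WhiteKernel(1e-3),
                              normalize_y=True)
gp.fit(X_train, y_train)
mean, std = gp.predict(coords, return_std=True)  # std = pixelwise uncertainty
print("reconstruction RMSE:", np.sqrt(np.mean((mean - truth.ravel()) ** 2)))
```

A useful byproduct of the GP approach is the per-pixel predictive standard deviation, which indicates where additional sampling would be most informative.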
1 code implementation • 10 Feb 2020 • Maxim Ziatdinov, Dohyung Kim, Sabine Neumayer, Liam Collins, Mahshid Ahmadi, Rama K. Vasudevan, Stephen Jesse, Myung Hyun Ann, Jong H. Kim, Sergei V. Kalinin
Imaging mechanisms in contact Kelvin Probe Force Microscopy (cKPFM) are explored via information theory-based methods (a toy illustration follows this entry).
Applied Physics · Materials Science
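As a loose illustration of an information theory-based analysis (the paper's actual cKPFM methodology is more involved), the sketch below estimates histogram-based mutual information between two synthetic imaging channels; the data and bin count are made up.

```python
# Hedged illustration: histogram-based mutual information between two
# imaging channels. Synthetic stand-ins, not the paper's cKPFM pipeline.
import numpy as np
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(1)
channel_a = rng.normal(size=10_000)                           # e.g. response at one bias
channel_b = 0.7 * channel_a + 0.3 * rng.normal(size=10_000)   # correlated channel

# Discretize both channels and estimate MI from the joint histogram (in nats).
bins_a = np.digitize(channel_a, np.histogram_bin_edges(channel_a, bins=32))
bins_b = np.digitize(channel_b, np.histogram_bin_edges(channel_b, bins=32))
print("estimated MI (nats):", mutual_info_score(bins_a, bins_b))
```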
no code implementations • NeurIPS 2020 • Liam Collins, Aryan Mokhtari, Sanjay Shakkottai
Meta-learning methods have shown an impressive ability to train models that rapidly learn new tasks.
no code implementations • 27 Oct 2020 • Liam Collins, Aryan Mokhtari, Sanjay Shakkottai
Model-Agnostic Meta-Learning (MAML) has become increasingly popular for training models that can quickly adapt to new tasks via one or few stochastic gradient descent steps.
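The one-or-few-gradient-steps adaptation that defines MAML is easy to see in code. Below is a minimal second-order MAML sketch on synthetic sine-regression tasks; the model, task distribution, and step sizes are illustrative assumptions, not this paper's experimental setup.

```python
import torch

# Tiny MAML sketch: one inner SGD step per task, differentiated through
# for the outer (meta) update. All task/data choices here are made up.
torch.manual_seed(0)
w = torch.zeros(2, requires_grad=True)   # model: y_hat = w0*sin(x) + w1*cos(x)
alpha, beta = 0.05, 0.01                 # inner / outer step sizes

def model(w, x):
    return w[0] * torch.sin(x) + w[1] * torch.cos(x)

def sample_task():
    amp, phase = torch.rand(1) * 4 + 1, torch.rand(1) * 3.14
    x = torch.rand(20) * 10 - 5
    return x, amp * torch.sin(x + phase)

for step in range(1000):
    meta_loss = 0.0
    for _ in range(4):                   # meta-batch of 4 tasks
        x, y = sample_task()
        loss = ((model(w, x[:10]) - y[:10]) ** 2).mean()   # support set
        (g,) = torch.autograd.grad(loss, w, create_graph=True)
        w_adapted = w - alpha * g                          # one inner SGD step
        meta_loss = meta_loss + ((model(w_adapted, x[10:]) - y[10:]) ** 2).mean()
    (meta_grad,) = torch.autograd.grad(meta_loss / 4, w)
    with torch.no_grad():
        w -= beta * meta_grad                              # outer (meta) update
```

The `create_graph=True` flag is what makes this second-order: the outer gradient flows back through the inner adaptation step.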
3 code implementations • 14 Feb 2021 • Liam Collins, Hamed Hassani, Aryan Mokhtari, Sanjay Shakkottai
Based on this intuition, we propose a novel federated learning framework and algorithm for learning a shared data representation across clients and unique local heads for each client.
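In the spirit of the framework described above (shared representation, unique local heads), here is a toy linear sketch: clients alternately fit personal heads on frozen features and take a gradient step on a shared representation, which the server then averages. Dimensions, data, and step sizes are assumptions for illustration, not the authors' algorithm verbatim.

```python
import numpy as np

# Toy shared-representation / local-heads federated sketch.
rng = np.random.default_rng(0)
d, k, n_clients, n = 20, 5, 8, 100
B = rng.normal(size=(d, k)) / np.sqrt(d)          # shared representation
heads = [rng.normal(size=k) for _ in range(n_clients)]

# Synthetic clients sharing a ground-truth representation B_true.
B_true = np.linalg.qr(rng.normal(size=(d, k)))[0]
data = []
for _ in range(n_clients):
    X = rng.normal(size=(n, d))
    data.append((X, X @ B_true @ rng.normal(size=k)))

for rnd in range(50):                             # communication rounds
    B_updates = []
    for i, (X, y) in enumerate(data):
        Z = X @ B                                 # client-side features
        # Head step: fit the personal head on frozen features.
        heads[i] = np.linalg.lstsq(Z, y, rcond=None)[0]
        # Representation step: one gradient step on B with the head frozen.
        resid = Z @ heads[i] - y
        B_updates.append(B - 0.01 * (X.T @ np.outer(resid, heads[i])) / n)
    B = np.mean(B_updates, axis=0)                # server averages B
```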
no code implementations • 7 Feb 2022 • Liam Collins, Aryan Mokhtari, Sewoong Oh, Sanjay Shakkottai
Recent empirical evidence has led to the conventional wisdom that gradient-based meta-learning (GBML) methods perform well at few-shot learning because they learn an expressive data representation that is shared across tasks.
no code implementations • 27 May 2022 • Liam Collins, Hamed Hassani, Aryan Mokhtari, Sanjay Shakkottai
We show that the generalizability of FedAvg's output stems from its power to learn the common data representation across the clients' tasks by leveraging the diversity among client data distributions via local updates.
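For reference, the local-update-then-average mechanism at the heart of FedAvg fits in a few lines. The sketch below uses linear-regression clients with synthetic heterogeneous data; it illustrates the algorithm itself, not this paper's analysis.

```python
import numpy as np

# Bare-bones FedAvg: local GD steps on each client, then server averaging.
rng = np.random.default_rng(0)
d, n_clients = 10, 5
clients = []
for _ in range(n_clients):
    X = rng.normal(size=(50, d))
    w_star = rng.normal(size=d)                  # heterogeneous client tasks
    clients.append((X, X @ w_star))

w = np.zeros(d)                                  # global model
for rnd in range(100):                           # communication rounds
    local_models = []
    for X, y in clients:
        w_local = w.copy()
        for _ in range(5):                       # tau = 5 local GD steps
            w_local -= 0.01 * X.T @ (X @ w_local - y) / len(y)
        local_models.append(w_local)
    w = np.mean(local_models, axis=0)            # FedAvg aggregation
```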
no code implementations • 15 Feb 2023 • Advait Parulekar, Liam Collins, Karthikeyan Shanmugam, Aryan Mokhtari, Sanjay Shakkottai
The goal of contrastive learning is to learn a representation that preserves underlying clusters by keeping samples with similar content, e.g. the "dogness" of a dog, close to each other in the space generated by the representation.
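A common concrete instance of this objective is the InfoNCE loss, sketched below: embeddings of two views of the same sample are treated as positives, and all other samples in the batch serve as negatives. This is a generic contrastive loss for illustration, not necessarily the exact objective analyzed in the paper.

```python
import torch
import torch.nn.functional as F

# Standard InfoNCE-style contrastive loss over a batch of paired views.
def info_nce(z1, z2, temperature=0.1):
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.T / temperature             # (B, B) similarity matrix
    labels = torch.arange(z1.shape[0])           # positives on the diagonal
    return F.cross_entropy(logits, labels)

# Made-up embeddings of two augmented views of the same 32 samples.
z1, z2 = torch.randn(32, 128), torch.randn(32, 128)
print(info_nce(z1, z2))
```

Minimizing this loss pulls each positive pair together while pushing apart every other pair in the batch, which is what "preserving clusters" means operationally.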
no code implementations • 13 Jul 2023 • Liam Collins, Hamed Hassani, Mahdi Soltanolkotabi, Aryan Mokhtari, Sanjay Shakkottai
An increasingly popular machine learning paradigm is to pretrain a neural network (NN) on many tasks offline, then adapt it to downstream tasks, often by re-training only the last linear layer of the network.
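The pretrain-then-retrain-the-last-layer recipe (often called linear probing) looks like this in PyTorch. The stand-in backbone and downstream data below are assumptions for illustration, not a specific pretrained network.

```python
import torch
import torch.nn as nn

# Last-layer retraining: freeze a pretrained body, fit only a new linear head.
backbone = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 32))
for p in backbone.parameters():
    p.requires_grad = False                      # freeze pretrained features
head = nn.Linear(32, 10)                         # new task-specific layer
opt = torch.optim.SGD(head.parameters(), lr=0.1)

X, y = torch.randn(256, 64), torch.randint(0, 10, (256,))  # downstream data
for _ in range(100):
    loss = nn.functional.cross_entropy(head(backbone(X)), y)
    opt.zero_grad(); loss.backward(); opt.step()
```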
no code implementations • 6 Oct 2023 • Liam Collins, Shanshan Wu, Sewoong Oh, Khe Chai Sim
In many applications of federated learning (FL), clients desire models that are personalized using their local data, yet are also robust in the sense that they retain general global knowledge.
no code implementations • 18 Feb 2024 • Liam Collins, Advait Parulekar, Aryan Mokhtari, Sujay Sanghavi, Sanjay Shakkottai
We show that an attention unit learns a window that it uses to implement a nearest-neighbors predictor adapted to the landscape of the pretraining tasks.
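The nearest-neighbors intuition can be made concrete in a few lines of NumPy: a softmax attention unit over in-context (x_i, y_i) pairs returns a weighted average of labels, and as the score scale grows the softmax window narrows toward the single closest context point. The data and scale values below are illustrative, not the paper's setup.

```python
import numpy as np

# Softmax attention over in-context examples as a soft nearest-neighbors
# predictor; the score scale controls the width of the attention window.
rng = np.random.default_rng(0)
xs = rng.uniform(-1, 1, size=(50, 4))            # in-context inputs
ys = np.sin(xs.sum(axis=1))                      # in-context labels
x_query = rng.uniform(-1, 1, size=4)

def attention_predict(x_query, xs, ys, scale):
    scores = scale * xs @ x_query                # dot-product attention scores
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                     # softmax over the context
    return weights @ ys                          # weighted label average

for scale in [0.1, 1.0, 10.0, 100.0]:            # narrowing the window
    print(scale, attention_predict(x_query, xs, ys, scale))
```

At small scale the prediction averages over the whole context; at large scale it collapses onto the nearest neighbor (in inner-product similarity).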