no code implementations • 21 Mar 2024 • Sayanton V. Dibbo, Adam Breuer, Juston Moore, Michael Teti
Recent model inversion attack algorithms permit adversaries to reconstruct a neural network's private training data just by repeatedly querying the network and inspecting its outputs.
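The query-and-inspect loop described above can be sketched in miniature. The snippet below is an illustrative toy, not the attack from the paper: it uses a hypothetical linear-softmax "network" with secret weights and estimates gradients by finite differences, so the adversary touches only the model's outputs.

```python
import numpy as np

# Hypothetical setup: a tiny linear-softmax "network" whose weights stand in
# for a model trained on private data. All names here are illustrative.
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 8))          # secret weights, hidden from the attacker

def query(x):
    """Black-box access: return class probabilities for input x."""
    logits = W @ x
    e = np.exp(logits - logits.max())
    return e / e.sum()

def invert(target, steps=300, eps=1e-3, lr=0.5):
    """Climb the target class's confidence using only repeated queries
    (finite-difference gradient estimate; no access to model internals)."""
    x = np.zeros(8)
    for _ in range(steps):
        g = np.zeros_like(x)
        for i in range(len(x)):
            d = np.zeros_like(x)
            d[i] = eps
            g[i] = (query(x + d)[target] - query(x - d)[target]) / (2 * eps)
        x += lr * g                  # ascend the confidence surface
    return x

recon = invert(target=1)
print(query(recon)[1])               # confidence the model assigns to class 1
```

The recovered `recon` is an input the model finds maximally class-1-like; in a realistic attack on an image classifier, such confidence-maximizing inputs can resemble private training examples.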
no code implementations • 23 Aug 2023 • Sayanton V. Dibbo, Juston S. Moore, Garrett T. Kenyon, Michael A. Teti
Audio classification aims to recognize audio signals, such as speech commands or sound events.
no code implementations • 23 Jan 2022 • Shagufta Mehnaz, Sayanton V. Dibbo, Ehsanul Kabir, Ninghui Li, Elisa Bertino
The increasing use of machine learning (ML) in privacy-sensitive domains such as medical diagnosis, lifestyle prediction, and business decision-making highlights the need to better understand whether these technologies leak sensitive or proprietary training data.
no code implementations • 28 Sep 2021 • Alexa Muratyan, William Cheung, Sayanton V. Dibbo, Sudip Vhaduri
Most of these external authentication techniques suffer from limitations such as recall burden, human error, or bias. Researchers have therefore begun using physiological and behavioral data collected by wearables, such as gait and heart rate, to authenticate a wearable user implicitly, though only with limited accuracy due to the sensing and computing constraints of wearables.