We propose a novel approach to multimodal sensor fusion for Ambient Assisted Living (AAL) which takes advantage of learning using privileged information (LUPI).
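The core LUPI idea — a privileged modality is available at training time but not at test time, so a model trained on it teaches a student that only sees the regular modality — can be sketched with generalized distillation. This is a minimal illustrative sketch in plain NumPy, not the paper's fusion method: the data, the logistic-regression models, and the imitation weight `lam` are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30, 30)))

def fit_logreg(X, y, lr=0.5, steps=2000):
    """Gradient descent on logistic loss; y may be soft labels in [0, 1]."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = sigmoid(X @ w)
        w -= lr * X.T @ (p - y) / len(y)
    return w

# Toy data: x_priv is a privileged modality available only at training
# time; x is a noisier view of it that is available at test time.
n = 400
x_priv = rng.normal(size=(n, 2))
y = (x_priv[:, 0] + 0.3 * x_priv[:, 1] > 0).astype(float)
x = x_priv + rng.normal(scale=1.0, size=(n, 2))

# Teacher: trained on the privileged modality, produces soft targets.
w_teacher = fit_logreg(x_priv, y)
soft = sigmoid(x_priv @ w_teacher)

# Student (LUPI): trained on regular features against a mixture of the
# hard labels and the teacher's soft targets (imitation weight lam).
lam = 0.5
w_student = fit_logreg(x, lam * y + (1 - lam) * soft)

# Baseline: same features, hard labels only.
w_plain = fit_logreg(x, y)

acc_lupi = np.mean((sigmoid(x @ w_student) > 0.5) == y)
acc_plain = np.mean((sigmoid(x @ w_plain) > 0.5) == y)
print(acc_lupi, acc_plain)
```

The soft targets carry graded information from the privileged modality (how confidently each example was classified), which is the channel through which privileged information reaches the student.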
We use data collected from 10 people with Parkinson's and 10 controls, each of whom lived for five days in a smart home equipped with various sensors.
The experimental results point to promising healthcare applications of passive Wi-Fi sensing in the home: monitoring daily activities, gathering health data, and detecting emergency situations.
1 code implementation • 8 Oct 2021 • Mohammud J. Bocus, Wenda Li, Shelly Vishwakarma, Roget Kou, Chong Tang, Karl Woodbridge, Ian Craddock, Ryan McConville, Raul Santos-Rodriguez, Kevin Chetty, Robert Piechocki
This dataset can be exploited to advance WiFi and vision-based HAR, for example, using pattern recognition, skeletal representation, deep learning algorithms or other novel approaches to accurately recognize human activities.
There is a pressing need to automatically understand the state and progression of chronic neurological diseases such as dementia.
We study a number of local and global manifold learning methods on both the raw data and the autoencoded embedding, concluding that, within our framework, UMAP best finds the most clusterable manifold in the embedding. This suggests that local manifold learning on an autoencoded embedding is effective for discovering higher-quality clusters.
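The pipeline described above — compress the raw data into an embedding, run a manifold learner on that embedding, then cluster and score clusterability — can be sketched as follows. This is a dependency-light stand-in, not the paper's implementation: PCA substitutes for both the autoencoder and the UMAP stages (in practice one would swap in a trained autoencoder and `umap-learn`'s `UMAP`), and silhouette score is used as one common proxy for clusterability.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.decomposition import PCA
from sklearn.metrics import silhouette_score

# Toy high-dimensional data standing in for windows of sensor readings.
X, _ = make_blobs(n_samples=300, n_features=50, centers=4, random_state=0)

# Stage 1: compress to a low-dimensional embedding
# (an autoencoder in the paper; linear PCA stands in here).
emb = PCA(n_components=10, random_state=0).fit_transform(X)

# Stage 2: manifold learning on the embedding
# (UMAP in the paper; PCA again stands in to stay dependency-free).
manifold = PCA(n_components=2, random_state=0).fit_transform(emb)

# Stage 3: cluster the manifold and score how clusterable it is.
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(manifold)
score = silhouette_score(manifold, labels)
print(f"silhouette on manifold: {score:.2f}")
```

The separation of stages is the point: the manifold learner operates on the compressed embedding rather than the raw windows, which is what the abstract reports as most effective.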
Ranked #1 on Image Clustering on HAR
In this paper, we study the prediction of heart rate from acceleration using a wrist-worn wearable.
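A minimal regression baseline for this task — window the accelerometer stream, extract simple per-window statistics, and fit a linear model to heart rate — might look like the sketch below. The synthetic data, the three features, and the Ridge model are all assumptions for illustration, not the paper's method.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in: 500 ten-second windows of wrist acceleration,
# with heart rate loosely driven by movement intensity.
n_windows, win_len = 500, 320  # e.g. 32 Hz * 10 s
intensity = rng.uniform(0.0, 2.0, size=n_windows)
acc = rng.normal(scale=(0.1 + intensity)[:, None], size=(n_windows, win_len))
hr = 60 + 30 * intensity + rng.normal(scale=3.0, size=n_windows)  # bpm

# Simple per-window features (assumed, not the paper's feature set).
feats = np.column_stack([
    acc.mean(axis=1),
    acc.std(axis=1),
    np.abs(np.diff(acc, axis=1)).mean(axis=1),  # first-difference magnitude
])

X_tr, X_te, y_tr, y_te = train_test_split(feats, hr, random_state=0)
model = Ridge(alpha=1.0).fit(X_tr, y_tr)
mae = np.abs(model.predict(X_te) - y_te).mean()
print(f"MAE: {mae:.1f} bpm")
```

On real wearable data the relationship is far noisier and person-dependent, which is precisely what makes the prediction problem studied here non-trivial.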
There is a widely accepted need to revise current forms of healthcare provision, with particular interest in sensing systems in the home.
We present a new framework for vision-based estimation of calorific expenditure from RGB-D data - the first that is validated on physical gas exchange measurements and applied to daily living scenarios.