no code implementations • 26 Sep 2023 • Gowtham Premananth, Yashish M. Siriwardena, Philip Resnik, Carol Espy-Wilson
This study focuses on how different modalities of human communication can be used to distinguish between healthy controls and subjects with schizophrenia who exhibit strong positive symptoms.
no code implementations • 17 Sep 2023 • Ahmed Adel Attia, Yashish M. Siriwardena, Carol Espy-Wilson
The performance of deep learning models depends significantly on their capacity to encode input features efficiently and decode them into meaningful outputs.
1 code implementation • 31 May 2023 • Yashish M. Siriwardena, Carol Espy-Wilson, Suzanne Boyce, Mark K. Tiede, Liran Oren
Nasalance is an objective measure, derived from the oral and nasal acoustic signals, that correlates with nasality.
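The standard nasalance measure is the ratio of nasal acoustic energy to total (nasal plus oral) energy, expressed as a percentage. A minimal sketch, assuming two synchronized amplitude arrays from a nasometer's nasal and oral microphones (signal names are illustrative, not the paper's pipeline):

```python
import numpy as np

def nasalance(nasal, oral):
    """Percent nasalance: nasal RMS energy over total (nasal + oral) RMS energy.

    `nasal` and `oral` are time-aligned amplitude arrays from the nasal and
    oral channels of a nasometer (hypothetical inputs for illustration).
    """
    nasal_rms = np.sqrt(np.mean(np.square(nasal)))
    oral_rms = np.sqrt(np.mean(np.square(oral)))
    return 100.0 * nasal_rms / (nasal_rms + oral_rms)
```

With equal energy in both channels the score is 50%; a purely oral signal scores 0%.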
no code implementations • 29 Oct 2022 • Yashish M. Siriwardena, Carol Espy-Wilson, Shihab Shamma
Most organisms including humans function by coordinating and integrating sensory signals with motor actions to survive and accomplish desired tasks.
no code implementations • 29 Oct 2022 • Yashish M. Siriwardena, Carol Espy-Wilson
The proposed speech inversion (SI) system with the HPRC dataset gains an improvement of close to 28% when the source features are used as additional targets.
no code implementations • 27 May 2022 • Yashish M. Siriwardena, Ganesh Sivaraman, Carol Espy-Wilson
Multi-task learning (MTL) frameworks have proven to be effective in diverse speech-related tasks like automatic speech recognition (ASR) and speech emotion recognition.
Automatic Speech Recognition (ASR) +3
no code implementations • 25 May 2022 • Yashish M. Siriwardena, Ahmed Adel Attia, Ganesh Sivaraman, Carol Espy-Wilson
In this work, we compare and contrast different ways of doing data augmentation and show how this technique improves the performance of articulatory speech inversion not only on noisy speech, but also on clean speech data.
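A common form of such augmentation is mixing additive noise into clean speech at a controlled signal-to-noise ratio. A minimal sketch, assuming simple additive-noise mixing (the paper's actual augmentation pipeline may differ):

```python
import numpy as np

def add_noise_at_snr(speech, noise, snr_db):
    """Mix `noise` into `speech` so the result has the requested SNR in dB.

    Hypothetical helper for illustration: the noise is tiled/cropped to the
    speech length, then scaled so that
    10*log10(P_speech / P_scaled_noise) == snr_db.
    """
    reps = int(np.ceil(len(speech) / len(noise)))
    noise = np.tile(noise, reps)[: len(speech)]
    p_speech = np.mean(np.square(speech))
    p_noise = np.mean(np.square(noise))
    scale = np.sqrt(p_speech / (p_noise * 10.0 ** (snr_db / 10.0)))
    return speech + scale * noise
```

Training on mixtures at several SNRs exposes the model to degraded inputs while the clean targets stay fixed, which is what makes this useful for robustness.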
no code implementations • 12 Oct 2021 • Yashish M. Siriwardena, Guilhem Marion, Shihab Shamma
Experiments to understand the sensorimotor neural interactions in the human cortical speech system support the existence of a bidirectional flow of interactions between the auditory and motor regions.
no code implementations • 9 Oct 2021 • Yashish M. Siriwardena, Chris Kitchen, Deanna L. Kelly, Carol Espy-Wilson
This study investigates the speech articulatory coordination in schizophrenia subjects exhibiting strong positive symptoms (e.g., hallucinations and delusions), using two distinct channel-delay correlation methods.
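Channel-delay correlation analyses of this kind typically correlate each articulatory channel with time-delayed copies of every channel and summarize the coordination structure via the eigenspectrum of the resulting matrix. A minimal sketch under those assumptions (the delay range, circular shifting via `np.roll`, and normalization here are illustrative choices, not the paper's exact formulation):

```python
import numpy as np

def channel_delay_eigenspectrum(signals, max_delay=5):
    """Eigenspectrum of a channel-delay correlation matrix.

    `signals` is a (n_channels, n_samples) array of articulatory time
    series. Each channel is stacked with delayed copies of itself
    (circularly shifted, an illustrative simplification), the full
    correlation matrix over all channel-delay rows is computed, and its
    eigenvalues are returned in descending order.
    """
    n_ch, _ = signals.shape
    rows = []
    for d in range(max_delay + 1):
        for c in range(n_ch):
            rows.append(np.roll(signals[c], d))
    corr = np.corrcoef(np.vstack(rows))        # square matrix of size n_ch*(max_delay+1)
    return np.linalg.eigvalsh(corr)[::-1]      # descending eigenvalues
```

A more concentrated eigenspectrum (a few large eigenvalues) indicates simpler, more tightly coupled coordination across channels, which is the kind of feature such studies compare between groups.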