Search Results for author: Kangwook Jang

Found 3 papers, 3 papers with code

STaR: Distilling Speech Temporal Relation for Lightweight Speech Self-Supervised Learning Models

1 code implementation • 14 Dec 2023 • Kangwook Jang, Sungnyun Kim, Hoirin Kim

Despite the strong performance of Transformer-based speech self-supervised learning (SSL) models, their large parameter size and computational cost make them difficult to deploy.

Relation Self-Supervised Learning

Recycle-and-Distill: Universal Compression Strategy for Transformer-based Speech SSL Models with Attention Map Reusing and Masking Distillation

1 code implementation • 19 May 2023 • Kangwook Jang, Sungnyun Kim, Se-Young Yun, Hoirin Kim

Transformer-based speech self-supervised learning (SSL) models, such as HuBERT, show surprising performance in various speech processing tasks.

Self-Supervised Learning
