no code implementations • 13 Mar 2024 • Amit Meghanani, Thomas Hain
The HuBERT-based CAE model achieves the best results for word discrimination in all languages, despite HuBERT being pre-trained on English only.
no code implementations • 10 Mar 2024 • Amit Meghanani, Thomas Hain
These task-specific representations enable robust performance on various downstream tasks once the model is fine-tuned on labelled data.