Search Results for author: Rendi Chevi

Found 3 papers, 1 paper with code

Daisy-TTS: Simulating Wider Spectrum of Emotions via Prosody Embedding Decomposition

no code implementations • 22 Feb 2024 • Rendi Chevi, Alham Fikri Aji

This wide spectrum of emotions is well-studied in the structural model of emotions, which represents a variety of emotions as derivative products of primary emotions with varying degrees of intensity.
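The structural-model idea summarized above (secondary emotions derived by mixing primary emotions and scaling their intensity) can be sketched in a few lines. The sketch below only illustrates that idea; the emotion names follow Plutchik's primaries, while the embedding dimension, random stand-in embeddings, and mixing rule are assumptions, not the paper's actual prosody decomposition.

```python
import numpy as np

# Hypothetical primary-emotion prosody embeddings. Daisy-TTS decomposes
# learned prosody embeddings; random vectors stand in for them here.
rng = np.random.default_rng(0)
DIM = 64
primary = {name: rng.normal(size=DIM) for name in
           ["joy", "trust", "fear", "surprise",
            "sadness", "disgust", "anger", "anticipation"]}

def secondary(a: str, b: str) -> np.ndarray:
    """A secondary emotion as a mixture of two primaries (e.g. love = joy + trust)."""
    return 0.5 * (primary[a] + primary[b])

def with_intensity(emb: np.ndarray, intensity: float) -> np.ndarray:
    """Scale an emotion embedding to vary its intensity (0 = neutral)."""
    return intensity * emb

# A milder "love" embedding, mixed from joy and trust at 0.8 intensity.
love = with_intensity(secondary("joy", "trust"), 0.8)
```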

Nix-TTS: Lightweight and End-to-End Text-to-Speech via Module-wise Distillation

1 code implementation • 29 Mar 2022 • Rendi Chevi, Radityo Eko Prasojo, Alham Fikri Aji, Andros Tjandra, Sakriani Sakti

We present Nix-TTS, a lightweight TTS achieved via knowledge distillation from a high-quality yet large-sized, non-autoregressive, and end-to-end (vocoder-free) TTS teacher model; a sketch of the module-wise distillation idea follows the task tags below.

Knowledge Distillation • Neural Architecture Search
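As a rough illustration of module-wise distillation, the sketch below trains a small student module to match a frozen teacher module's intermediate output with an MSE loss. The module architecture, shapes, learning rate, and loss choice are assumptions for demonstration, not the actual Nix-TTS modules or objectives.

```python
import torch
import torch.nn as nn

# Module-wise distillation sketch: the student module is trained to match
# the corresponding (frozen) teacher module's output, module by module.
class TinyEncoder(nn.Module):
    def __init__(self, dim=80, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, dim))

    def forward(self, x):
        return self.net(x)

teacher_enc = TinyEncoder(hidden=512).eval()   # stands in for the large teacher
student_enc = TinyEncoder(hidden=64)           # lightweight student
for p in teacher_enc.parameters():
    p.requires_grad_(False)

opt = torch.optim.Adam(student_enc.parameters(), lr=1e-3)
x = torch.randn(8, 100, 80)                    # dummy frame-level features

for step in range(3):
    with torch.no_grad():
        target = teacher_enc(x)                # teacher's intermediate output
    loss = nn.functional.mse_loss(student_enc(x), target)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In practice this per-module matching lets a much smaller student inherit each stage of the teacher's pipeline instead of imitating only its final output.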

Which Student is Best? A Comprehensive Knowledge Distillation Exam for Task-Specific BERT Models

no code implementations • 3 Jan 2022 • Made Nindyatama Nityasya, Haryo Akbarianto Wibowo, Rendi Chevi, Radityo Eko Prasojo, Alham Fikri Aji

We perform a knowledge distillation (KD) benchmark from task-specific BERT-base teacher models to various student models: BiLSTM, CNN, BERT-Tiny, BERT-Mini, and BERT-Small.

Data Augmentation • Knowledge Distillation • +3
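For readers unfamiliar with the KD setup benchmarked here, a minimal sketch of the standard soft-target distillation objective (temperature-scaled KL divergence plus cross-entropy on gold labels) follows. The hyperparameters, class count, and dummy logits are illustrative, not the paper's reported settings.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Hinton-style KD objective: temperature-softened KL term against the
    teacher's distribution, plus cross-entropy on the gold labels."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)                      # rescale gradients for the temperature
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Dummy logits for a 3-class task, batch of 4.
s = torch.randn(4, 3, requires_grad=True)     # student outputs
t = torch.randn(4, 3)                         # frozen teacher outputs
loss = kd_loss(s, t, torch.tensor([0, 2, 1, 0]))
loss.backward()
```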
