Search Results for author: Moacir Antonelli Ponti

Found 11 papers, 6 papers with code

A single speaker is almost all you need for automatic speech recognition

1 code implementation • 29 Mar 2022 • Edresson Casanova, Christopher Shulby, Alexander Korolev, Arnaldo Candido Junior, Anderson da Silva Soares, Sandra Aluísio, Moacir Antonelli Ponti

We explore the use of speech synthesis and voice conversion applied to augment datasets for automatic speech recognition (ASR) systems, in scenarios with only one speaker available for the target language.

Automatic Speech Recognition • Data Augmentation • +2
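A minimal sketch of the augmentation recipe described above, in Python: sentences from a text corpus are synthesized (and optionally voice-converted) and mixed with the single real speaker's recordings to form the ASR training set. The synthesize and voice_convert helpers and the sample size are hypothetical placeholders for illustration, not code from the paper.

import random

def augment_with_synthetic_speech(real_utterances, text_corpus, synthesize,
                                  voice_convert=None, n_synthetic=1000):
    """Build an augmented ASR training set from one real speaker plus synthetic audio."""
    # real_utterances: list of (audio, transcript) pairs from the single available speaker
    augmented = list(real_utterances)
    for text in random.sample(text_corpus, min(n_synthetic, len(text_corpus))):
        audio = synthesize(text)              # TTS voice built from the single speaker
        if voice_convert is not None:
            audio = voice_convert(audio)      # optionally map to other voices for diversity
        augmented.append((audio, text))
    return augmented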

SC-GlowTTS: an Efficient Zero-Shot Multi-Speaker Text-To-Speech Model

2 code implementations • 2 Apr 2021 • Edresson Casanova, Christopher Shulby, Eren Gölge, Nicolas Michael Müller, Frederico Santos de Oliveira, Arnaldo Candido Junior, Anderson da Silva Soares, Sandra Maria Aluisio, Moacir Antonelli Ponti

In this paper, we propose SC-GlowTTS: an efficient zero-shot multi-speaker text-to-speech model that improves similarity for speakers unseen during training.
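The zero-shot setting can be pictured as conditioning the synthesis model on an embedding produced by an external speaker encoder from a short reference clip of an unseen speaker. The sketch below assumes hypothetical speaker_encoder and tts_model callables passed in by the caller; it is an illustration of the idea, not the actual SC-GlowTTS implementation.

import torch

def zero_shot_tts(text, reference_wav, speaker_encoder, tts_model):
    """Synthesize `text` in the voice of the (unseen) speaker heard in `reference_wav`."""
    with torch.no_grad():
        speaker_embedding = speaker_encoder(reference_wav)  # fixed-size speaker vector
        mel = tts_model.infer(text, speaker_embedding)      # decoder conditioned on the embedding
    return mel  # mel-spectrogram; a separate vocoder turns this into a waveform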

Generalization of feature embeddings transferred from different video anomaly detection domains

no code implementations • 28 Jan 2019 • Fernando Pereira dos Santos, Leonardo Sampaio Ferraz Ribeiro, Moacir Antonelli Ponti

By proposing novel cross-domain generalization measures, we study how source features can generalize for different target video domains, as well as analyze unsupervised transfer learning.

Anomaly Detection • Domain Generalization • +1
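One way to picture the transfer setup described above: features from a backbone trained on a source domain are reused, without fine-tuning, to score anomalies in a different target video domain, and a detection metric on the target acts as a rough proxy for how well the source features generalize. In the sketch below, extract_embeddings is a hypothetical placeholder for the pretrained source model; the paper's specific cross-domain generalization measures are not reproduced here.

import numpy as np
from sklearn.svm import OneClassSVM
from sklearn.metrics import roc_auc_score

def cross_domain_anomaly_auc(extract_embeddings, target_train_frames,
                             target_test_frames, test_labels):
    """Fit a one-class model on normal target frames using source-domain features."""
    train_feats = np.stack([extract_embeddings(f) for f in target_train_frames])  # normal frames only
    test_feats = np.stack([extract_embeddings(f) for f in target_test_frames])
    detector = OneClassSVM(kernel="rbf", gamma="scale", nu=0.1).fit(train_feats)
    scores = -detector.decision_function(test_feats)  # higher score = more anomalous
    return roc_auc_score(test_labels, scores)         # AUC as a simple generalization proxy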

Como funciona o Deep Learning (How Deep Learning works)

no code implementations • 20 Jun 2018 • Moacir Antonelli Ponti, Gabriel B. Paranhos da Costa

Deep Learning methods are currently the state of the art for many problems that can be tackled via machine learning, in particular classification problems.

General Classification

Computing the Shattering Coefficient of Supervised Learning Algorithms

no code implementations • 7 May 2018 • Rodrigo Fernandes de Mello, Moacir Antonelli Ponti, Carlos Henrique Grossi Ferreira

Statistical Learning Theory (SLT) provides the theoretical guarantees for supervised machine learning based on the Empirical Risk Minimization Principle (ERMP).

Learning Theory
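For context, the shattering coefficient \(\mathcal{N}(\mathcal{F}, n)\) counts the maximum number of distinct labelings the hypothesis space \(\mathcal{F}\) can realize on n points, and it enters the ERMP guarantees through bounds of the following kind (one common Vapnik-style formulation, not necessarily the exact statement used in the paper):

P\left( \sup_{f \in \mathcal{F}} \big| R_{\mathrm{emp}}(f) - R(f) \big| > \epsilon \right) \;\leq\; 2\, \mathcal{N}(\mathcal{F}, 2n)\, \exp\!\left( -\frac{n \epsilon^2}{4} \right)

Whenever \(\mathcal{N}(\mathcal{F}, 2n)\) grows only polynomially in n, the right-hand side vanishes as n grows, which is the kind of guarantee that computing the shattering coefficient of a concrete algorithm makes explicit.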

Providing theoretical learning guarantees to Deep Learning Networks

no code implementations • 28 Nov 2017 • Rodrigo Fernandes de Mello, Martha Dais Ferreira, Moacir Antonelli Ponti

Deep Learning (DL) is one of the most common subjects in Machine Learning and Data Science.

Learning Theory
