Search Results for author: Timo Lohrenz

Found 3 papers, 2 papers with code

Relaxed Attention for Transformer Models

1 code implementation • 20 Sep 2022 • Timo Lohrenz, Björn Möller, Zhengyang Li, Tim Fingscheidt

The powerful modeling capabilities of all-attention-based transformer architectures often cause overfitting and, for natural language processing tasks, lead to an implicitly learned internal language model in the autoregressive transformer decoder, complicating the integration of external language models.

Ranked #3 on Lipreading on LRS3-TED (using extra training data)

Tasks: Image Classification, Language Modelling, +3
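The entry above names the technique but the excerpt does not show it. A minimal sketch of the general relaxed-attention idea, smoothing attention weights toward a uniform distribution as a regularizer; the function name and the smoothing coefficient `gamma` are illustrative assumptions, not the paper's exact formulation:

```python
import math

def relaxed_attention(scores, gamma=0.1):
    """Softmax attention weights, blended with a uniform distribution.

    `scores` is one row of raw attention logits; `gamma` (hypothetical
    name) controls how far the weights are relaxed toward uniform.
    """
    # numerically stable softmax over the key dimension
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    weights = [e / z for e in exps]
    # relax each weight toward the uniform value 1/n
    n = len(weights)
    return [(1.0 - gamma) * w + gamma / n for w in weights]
```

With `gamma = 0` this reduces to ordinary softmax attention; with `gamma = 1` the weights are fully uniform, which is the limiting case of the smoothing.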

Multi-Encoder Learning and Stream Fusion for Transformer-Based End-to-End Automatic Speech Recognition

no code implementations • 31 Mar 2021 • Timo Lohrenz, Zhengyang Li, Tim Fingscheidt

Stream fusion, also known as system combination, is a common technique in automatic speech recognition for traditional hybrid hidden Markov model approaches, yet it remains mostly unexplored for modern deep neural network end-to-end model architectures.

Tasks: Automatic Speech Recognition (ASR), +2
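As a sketch of what stream fusion means in this setting: one common scheme (not necessarily the paper's method) is a log-linear combination of the per-frame posteriors produced by two encoder streams, followed by renormalization. The function name and the `weight` parameter are illustrative assumptions:

```python
import math

def fuse_streams(logp_a, logp_b, weight=0.5):
    """Log-linear fusion of two streams' log-posteriors over one frame.

    `logp_a` and `logp_b` are log-probability vectors over the same
    output vocabulary; `weight` (hypothetical name) balances the streams.
    """
    # weighted sum in the log domain = weighted product of probabilities
    fused = [weight * a + (1.0 - weight) * b for a, b in zip(logp_a, logp_b)]
    # renormalize so the result is again a valid log-probability vector
    logz = math.log(sum(math.exp(f) for f in fused))
    return [f - logz for f in fused]
```

When both streams agree, fusion leaves the distribution unchanged; when they disagree, the fused posterior sits between them, which is the intuition behind system combination.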
