Search Results for author: Erhardt Barth

Found 9 papers, 5 papers with code

TEVR: Improving Speech Recognition by Token Entropy Variance Reduction

2 code implementations • 25 Jun 2022 • Hajo Nils Krabbenhöft, Erhardt Barth

This paper presents TEVR, a speech recognition model designed to minimize the variation in token entropy w.r.t. […]

Ranked #1 on Speech Recognition on Common Voice German (using extra training data)

Automatic Speech Recognition (ASR) • Language Modelling • +2
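The quantity in the title can be illustrated with a short, hedged sketch: given a language model's predictive distribution over the vocabulary at each token position, compute the per-token entropy and its variance across the sequence. The PyTorch code below uses random logits as a stand-in for a real language model and does not reproduce the TEVR pipeline itself.

```python
import torch
import torch.nn.functional as F

def token_entropy_variance(logits):
    """Per-position entropy of a language model's predictive distribution
    and its variance across the sequence.
    logits: (seq_len, vocab_size) tensor of unnormalized scores."""
    log_probs = F.log_softmax(logits, dim=-1)      # (T, V)
    probs = log_probs.exp()
    entropy = -(probs * log_probs).sum(dim=-1)     # (T,) entropy per token position
    return entropy, entropy.var(unbiased=False)

# Random logits stand in for a real language model's outputs.
logits = torch.randn(12, 32_000)
entropy, variance = token_entropy_variance(logits)
print(entropy.shape, variance.item())
```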

Bio-inspired Min-Nets Improve the Performance and Robustness of Deep Networks

1 code implementation • NeurIPS Workshop SVRHM 2021 • Philipp Grüning, Erhardt Barth

Min-Nets are inspired by end-stopped cortical cells with units that output the minimum of two learned filters.
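As a hedged illustration of the unit described above, a Min-unit can be sketched as two learned convolution filters whose responses are combined by an elementwise minimum. The layer shapes below are illustrative and not taken from the paper's architectures.

```python
import torch
import torch.nn as nn

class MinUnit(nn.Module):
    """Two learned filters combined by an elementwise minimum,
    loosely modelling an end-stopped cell. Sizes are illustrative."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.f1 = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1)
        self.f2 = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1)

    def forward(self, x):
        return torch.minimum(self.f1(x), self.f2(x))

x = torch.randn(1, 3, 32, 32)
print(MinUnit(3, 16)(x).shape)  # torch.Size([1, 16, 32, 32])
```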

Explainable COVID-19 Detection Using Chest CT Scans and Deep Learning

no code implementations • 9 Nov 2020 • Hammam Alshazly, Christoph Linse, Erhardt Barth, Thomas Martinetz

This paper explores how well deep learning models trained on chest CT images can diagnose COVID-19 infected people in a fast and automated process.

Specificity • Transfer Learning

Feature Products Yield Efficient Networks

no code implementations • 18 Aug 2020 • Philipp Grüning, Thomas Martinetz, Erhardt Barth

Such feature-product (FP) blocks are inspired by models of end-stopped neurons, which are common in cortical areas V1 and especially V2.
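Read together with the title, a feature-product block can be sketched as the elementwise product of two learned filter responses. The PyTorch block below is an illustration under that assumption (arbitrary sizes, no normalization), not the paper's specification.

```python
import torch
import torch.nn as nn

class FPBlock(nn.Module):
    """Elementwise product of two learned convolution responses,
    loosely modelling end-stopped behaviour. Details are illustrative."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.f1 = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1)
        self.f2 = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1)

    def forward(self, x):
        return self.f1(x) * self.f2(x)

x = torch.randn(1, 3, 32, 32)
print(FPBlock(3, 16)(x).shape)
```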

Deep Convolutional Neural Networks as Generic Feature Extractors

no code implementations • 6 Oct 2017 • Lars Hertel, Erhardt Barth, Thomas Käster, Thomas Martinetz

Deep convolutional neural networks, trained on large datasets, achieve convincing results and are currently the state-of-the-art approach for this task.

General Classification • Image Classification
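The general recipe, reusing a CNN trained on a large dataset as a fixed feature extractor for a new task, can be sketched as follows. A torchvision ResNet stands in for the networks studied in the paper; the choice of backbone and classifier is an assumption for illustration only.

```python
import torch
import torch.nn as nn
import torchvision.models as models

# A pretrained backbone reused as a fixed, generic feature extractor.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = nn.Identity()   # drop the ImageNet classification head
backbone.eval()

with torch.no_grad():
    images = torch.randn(4, 3, 224, 224)   # placeholder batch
    features = backbone(images)            # (4, 512) generic features

# A simple classifier (e.g. nn.Linear(512, num_classes) or an SVM)
# is then trained on these features for the target task.
print(features.shape)
```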

A Hybrid Convolutional Variational Autoencoder for Text Generation

3 code implementations • EMNLP 2017 • Stanislau Semeniuta, Aliaksei Severyn, Erhardt Barth

In this paper we explore the effect of architectural choices on learning a Variational Autoencoder (VAE) for text generation.

Language Modelling • Text Generation
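A minimal sketch of a VAE over token sequences with a convolutional encoder is given below. It is a simplification: the paper's hybrid architecture also involves a recurrent component in the decoder, and all sizes here are arbitrary assumptions.

```python
import torch
import torch.nn as nn

class TextVAE(nn.Module):
    """Toy text VAE: convolutional encoder, Gaussian latent,
    deconvolutional decoder. A simplification of the hybrid design."""
    def __init__(self, vocab=1000, emb=64, latent=32, seq_len=40):
        super().__init__()
        self.seq_len = seq_len
        self.emb = nn.Embedding(vocab, emb)
        self.enc = nn.Sequential(
            nn.Conv1d(emb, 128, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv1d(128, 128, 3, stride=2, padding=1), nn.ReLU())
        self.to_mu = nn.Linear(128 * (seq_len // 4), latent)
        self.to_logvar = nn.Linear(128 * (seq_len // 4), latent)
        self.from_z = nn.Linear(latent, 128 * (seq_len // 4))
        self.dec = nn.Sequential(
            nn.ConvTranspose1d(128, 128, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose1d(128, vocab, 4, stride=2, padding=1))

    def forward(self, tokens):                         # tokens: (B, seq_len)
        h = self.enc(self.emb(tokens).transpose(1, 2)).flatten(1)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()   # reparameterize
        h = self.from_z(z).view(-1, 128, self.seq_len // 4)
        return self.dec(h), mu, logvar                 # logits: (B, vocab, seq_len)

model = TextVAE()
logits, mu, logvar = model(torch.randint(0, 1000, (4, 40)))
print(logits.shape)   # torch.Size([4, 1000, 40])
```

Training such a model would minimize the token-level reconstruction cross-entropy plus the KL divergence of the approximate posterior from the prior, as in any VAE.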

Recursive Autoconvolution for Unsupervised Learning of Convolutional Neural Networks

2 code implementations • 2 Jun 2016 • Boris Knyazev, Erhardt Barth, Thomas Martinetz

In visual recognition tasks, such as image classification, unsupervised learning exploits cheap unlabeled data and can help to solve such tasks more efficiently.

Classification • General Classification • +1
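The operation in the title can be sketched, under simplifying assumptions, as repeatedly convolving a zero-mean patch with itself and rescaling it back to the original size. The normalization and resizing details below are illustrative, not the paper's exact procedure.

```python
import numpy as np
from scipy.signal import fftconvolve
from scipy.ndimage import zoom

def recursive_autoconvolution(patch, n=2):
    """Convolve a zero-mean patch with itself, resize back, repeat n times."""
    x = patch.astype(float)
    for _ in range(n):
        x = x - x.mean()
        x = fftconvolve(x, x, mode="full")                        # self-convolution
        x = zoom(x, np.array(patch.shape) / np.array(x.shape))    # back to original size
        x = (x - x.min()) / (x.max() - x.min() + 1e-8)            # rescale to [0, 1]
    return x

patch = np.random.rand(13, 13)
print(recursive_autoconvolution(patch, n=2).shape)   # (13, 13)
```

In an unsupervised pipeline, patches transformed this way could serve as additional inputs when learning convolutional filters, e.g. with k-means.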

Recurrent Dropout without Memory Loss

2 code implementations • COLING 2016 • Stanislau Semeniuta, Aliaksei Severyn, Erhardt Barth

This paper presents a novel approach to recurrent neural network (RNN) regularization.
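One way the title's idea is commonly summarized, per-step dropout that never erases the recurrent memory, is sketched below for an LSTM cell: dropout masks only the candidate update g_t, not the cell state c_t. The gate wiring is a standard LSTM; sizes and the dropout rate are illustrative, not the paper's settings.

```python
import torch
import torch.nn as nn

class RecurrentDropoutLSTMCell(nn.Module):
    """LSTM cell with dropout applied only to the candidate update g_t,
    so the memory c_t is never zeroed out. Illustrative sketch."""
    def __init__(self, input_size, hidden_size, p=0.25):
        super().__init__()
        self.lin = nn.Linear(input_size + hidden_size, 4 * hidden_size)
        self.drop = nn.Dropout(p)

    def forward(self, x, state):
        h, c = state
        i, f, g, o = self.lin(torch.cat([x, h], dim=-1)).chunk(4, dim=-1)
        i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
        g = self.drop(torch.tanh(g))   # dropout only on the candidate update
        c = f * c + i * g              # the memory itself is not dropped
        h = o * torch.tanh(c)
        return h, c

cell = RecurrentDropoutLSTMCell(16, 32)
h = c = torch.zeros(4, 32)
h, c = cell(torch.randn(4, 16), (h, c))
print(h.shape)   # torch.Size([4, 32])
```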
