Search Results for author: Tobias Christian Nauen

Found 2 papers, 2 papers with code

TaylorShift: Shifting the Complexity of Self-Attention from Squared to Linear (and Back) using Taylor-Softmax

1 code implementation • 5 Mar 2024 • Tobias Christian Nauen, Sebastian Palacio, Andreas Dengel

The quadratic complexity of the attention mechanism represents one of the biggest hurdles for processing long sequences using Transformers.

Tasks: Classification
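To illustrate the complexity shift the title refers to, here is a minimal sketch contrasting standard O(N²) softmax attention with a kernelized variant that reorders the matrix products to get O(N) in sequence length. The feature map `phi(x) = 1 + x` (a first-order Taylor term of exp) is a toy choice for illustration, not the paper's exact Taylor-Softmax construction:

```python
import numpy as np

def quadratic_attention(Q, K, V):
    # Standard softmax attention: materializes the N x N score matrix,
    # costing O(N^2 * d) time and memory.
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

def linearized_attention(Q, K, V):
    # Kernelized variant: approximating exp(q . k) with phi(q) . phi(k)
    # lets us compute phi(K)^T V once (a d x d_v matrix, independent of N),
    # so the whole pass is O(N * d * d_v) instead of O(N^2 * d).
    # phi(x) = 1 + x is an illustrative stand-in, not the paper's method.
    phi = lambda X: 1.0 + X
    KV = phi(K).T @ V                    # d x d_v summary of keys/values
    norm = phi(Q) @ phi(K).sum(axis=0)   # per-query normalizer, length N
    return (phi(Q) @ KV) / norm[:, None]
```

Both functions return an output of the same shape as `V`; only the order of the matrix products, and hence the asymptotic cost, differs.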

Which Transformer to Favor: A Comparative Analysis of Efficiency in Vision Transformers

1 code implementation • 18 Aug 2023 • Tobias Christian Nauen, Sebastian Palacio, Andreas Dengel

This benchmark provides a standardized baseline across the landscape of efficiency-oriented transformers, and our framework of analysis, based on Pareto optimality, reveals surprising insights.

Tasks: Image Classification, Model Selection
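The abstract's notion of Pareto optimality can be sketched concretely: a model is on the Pareto front if no other model is at least as good on every metric and strictly better on one. The metric names and values below are hypothetical placeholders, not results from the benchmark:

```python
def pareto_front(models):
    """Return the names of models not dominated on (accuracy up, cost down).

    `models` is a list of (name, accuracy, cost) tuples; accuracy is to be
    maximized and cost (e.g. latency or FLOPs) minimized. A model is
    dominated if some other model is >= on accuracy, <= on cost, and
    strictly better on at least one of the two.
    """
    front = []
    for name, acc, cost in models:
        dominated = any(
            a >= acc and c <= cost and (a > acc or c < cost)
            for _, a, c in models
        )
        if not dominated:
            front.append(name)
    return front
```

For example, with hypothetical entries `[("A", 0.90, 10), ("B", 0.80, 5), ("C", 0.70, 6)]`, model C is dominated by B (lower accuracy at higher cost), so only A and B lie on the front.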
