Search Results for author: Gamze İslamoğlu

Found 1 paper, 0 papers with code

ITA: An Energy-Efficient Attention and Softmax Accelerator for Quantized Transformers

no code implementations · 7 Jul 2023 · Gamze İslamoğlu, Moritz Scherer, Gianna Paulin, Tim Fischer, Victor J. B. Jung, Angelo Garofalo, Luca Benini

Transformer networks have emerged as the state-of-the-art approach for natural language processing tasks and are gaining popularity in other domains such as computer vision and audio processing.

Quantization
