Publications / Sebastian Palacio

Sebastian Palacio.

ABB AG

Co-authored Publications: 2

Which Transformer to Favor: A Comparative Analysis of Efficiency in Vision Transformers
WACV 2025 · 2025

Tobias Christian Nauen, Sebastian Palacio, Federico Raue, Andreas Dengel

A comprehensive benchmark and analysis of more than 45 transformer models for image classification, evaluating their efficiency across various performance metrics. We identify the optimal architectures to use and find that scaling the model is more efficient than scaling the image resolution.

TaylorShift: Shifting the Complexity of Self-Attention from Squared to Linear (and Back) using Taylor-Softmax
ICPR 2024 (oral) · 2024

Tobias Christian Nauen, Sebastian Palacio, Andreas Dengel

This paper introduces TaylorShift, a novel reformulation of the attention mechanism using Taylor softmax that enables computing full token-to-token interactions in linear time. We analytically and empirically determine the crossover points beyond which TaylorShift becomes more efficient than traditional attention. TaylorShift outperforms the traditional transformer architecture on 4 out of 5 tasks.
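The core idea can be illustrated with a minimal sketch (not the paper's exact implementation): replacing `exp(x)` in the softmax with its second-order Taylor expansion `f(x) = 1 + x + x²/2` makes the attention scores polynomial in the query–key dot products, so each polynomial term factors through (outer) products of queries and keys and the full N×N score matrix never has to be materialized, giving cost linear in the sequence length N. Function names and shapes here are illustrative assumptions:

```python
import numpy as np

def taylor_attention_direct(Q, K, V):
    """Quadratic-time reference: materialize the full (N, N) score matrix."""
    S = Q @ K.T                        # pairwise dot-product scores
    F = 1.0 + S + 0.5 * S**2           # 2nd-order Taylor expansion of exp
    return (F @ V) / F.sum(axis=1, keepdims=True)

def taylor_attention_linear(Q, K, V):
    """Linear-in-N version: factor each polynomial term separately."""
    N, d = Q.shape
    # degree-0 term: plain sum over the values
    v0 = V.sum(axis=0)                                    # (d,)
    # degree-1 term: (Q K^T) V regrouped as Q (K^T V)
    kv1 = K.T @ V                                         # (d, d)
    # degree-2 term: (q . k)^2 = <q (x) q, k (x) k> via outer products
    Q2 = np.einsum('nd,ne->nde', Q, Q).reshape(N, d * d)  # (N, d^2)
    K2 = np.einsum('nd,ne->nde', K, K).reshape(N, d * d)
    kv2 = K2.T @ V                                        # (d^2, d)
    num = v0 + Q @ kv1 + 0.5 * (Q2 @ kv2)
    den = N + Q @ K.sum(axis=0) + 0.5 * (Q2 @ K2.sum(axis=0))
    return num / den[:, None]

rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((8, 4)) for _ in range(3))
# both routes give the same attention output, without the N x N matrix
assert np.allclose(taylor_attention_direct(Q, K, V),
                   taylor_attention_linear(Q, K, V))
```

The factored path trades the O(N²·d) score matrix for O(N·d²) feature maps, which is why (as the paper analyzes) the linear formulation only wins beyond a crossover sequence length relative to the embedding dimension.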