Preprint, working paper. Year: 2024

Activations in Low Precision with High Accuracy

Abstract

As machine learning hardware uses ever smaller number formats, this article surveys simple and effective techniques for implementing activation functions in low precision (fewer than 16 bits) with high accuracy. The implementation combines a fixed-point-centric approach, efficient function-specific range reduction techniques, and state-of-the-art polynomial approximation. The resulting trade-offs are studied on both FPGA and ASIC. The functions considered in this article include tanh, sigmoid, ReLU variants such as GELU, ELU, and SiLU, and exp for stable softmax, but the methodology can be applied to other functions. These techniques are implemented in an open-source hardware generator that produces readable, synthesizable VHDL.
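
To make the recipe concrete, here is a minimal Python sketch (not the paper's VHDL generator) of the general approach applied to tanh: exploit the symmetry tanh(-x) = -tanh(x), saturate to 1 once tanh is indistinguishable from 1 at the target precision, evaluate a low-degree polynomial on the reduced interval, and round the result to a fixed-point grid. The 8-bit fraction format, the saturation threshold of 4, and the Chebyshev fit are illustrative assumptions; the paper's generator relies on function-specific range reduction and state-of-the-art polynomial approximation.

    # Illustrative model of the general recipe: fixed-point output format,
    # symmetry-based range reduction, saturation, low-degree polynomial.
    # All numeric choices below are assumptions made for this sketch.
    import numpy as np

    FRAC_BITS = 8                  # hypothetical output format: 8 fraction bits
    SCALE = 1 << FRAC_BITS
    SAT = 4.0                      # for |x| >= 4, tanh rounds to 1 at 8 fraction bits

    # One polynomial fitted on [0, SAT]; a real generator would use a carefully
    # constructed (e.g. piecewise) approximation instead of this Chebyshev fit.
    _xs = np.linspace(0.0, SAT, 512)
    _poly = np.polynomial.Chebyshev.fit(_xs, np.tanh(_xs), deg=6)

    def to_fix(y):
        """Round to the nearest representable fixed-point value."""
        return np.round(np.asarray(y) * SCALE) / SCALE

    def tanh_lowprec(x):
        """Low-precision tanh: sign symmetry + saturation + polynomial."""
        x = np.asarray(x, dtype=float)
        s = np.sign(x)
        a = np.abs(x)                      # tanh(-x) = -tanh(x)
        y = np.where(a >= SAT, 1.0, _poly(np.minimum(a, SAT)))
        return s * to_fix(y)

    if __name__ == "__main__":
        pts = np.array([-5.0, -1.0, -0.25, 0.0, 0.25, 1.0, 5.0])
        print(tanh_lowprec(pts))           # compare against np.tanh(pts)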

Main file
2025-ALPHA.pdf (384.5 KB)
Origin: files produced by the author(s)

Dates and versions

hal-04776745, version 1 (11-11-2024)

Identifiers

  • HAL Id: hal-04776745, version 1

Cite

Tom Hubrecht, Orégane Desrentes, Florent de Dinechin. Activations in Low Precision with High Accuracy. 2024. ⟨hal-04776745⟩
17 views
39 downloads
