Journal article in IEEE Transactions on Neural Networks and Learning Systems, 2024

On-Device Deep Learning: Survey on Techniques Improving Energy Efficiency of DNNs

Abstract

Providing high-quality predictions is no longer the sole goal for neural networks. As we live in an increasingly interconnected world, these models need to meet the constraints of the resource-limited devices powering the Internet of Things and embedded systems. Moreover, in the era of climate change, reducing the carbon footprint of neural networks is a critical step toward green artificial intelligence, which is no longer an aspiration but a pressing need. Enhancing the energy efficiency of neural networks, in both the training and inference phases, has become a predominant research topic in the field. Training optimization has attracted growing interest recently but remains challenging, as it involves changes to the learning procedure that can significantly affect prediction quality. This paper presents a study of the most popular techniques aiming to reduce the energy consumption of neural network training. We first propose a classification of the methods before discussing and comparing the different categories. Additionally, we outline some energy measurement techniques. We discuss the limitations identified during our study as well as some interesting directions, such as neuromorphic and reservoir computing.
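As an illustration of the kind of energy measurement the abstract refers to, the sketch below estimates the GPU energy consumed by a training run by sampling instantaneous power draw and integrating it over time. This is a minimal example assuming an NVIDIA GPU and the pynvml bindings; the sampling loop, the 0.1 s interval, and the run_training placeholder are illustrative choices, not techniques prescribed by the paper.

```python
# Illustrative sketch: estimate GPU energy for a training run by sampling
# power draw with NVIDIA's NVML bindings (pynvml) and integrating over time.
import time
import threading
import pynvml

def measure_energy(run_training, device_index=0, interval_s=0.1):
    """Run `run_training()` while sampling GPU power; return energy in joules."""
    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(device_index)
    samples = []          # list of (timestamp, watts)
    stop = threading.Event()

    def sampler():
        while not stop.is_set():
            power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # mW -> W
            samples.append((time.time(), power_w))
            time.sleep(interval_s)

    thread = threading.Thread(target=sampler, daemon=True)
    thread.start()
    try:
        run_training()    # placeholder: e.g. one training epoch
    finally:
        stop.set()
        thread.join()
        pynvml.nvmlShutdown()

    # Trapezoidal integration of power over time gives energy (J = W * s).
    return sum(
        0.5 * (p0 + p1) * (t1 - t0)
        for (t0, p0), (t1, p1) in zip(samples, samples[1:])
    )
```

Hardware counters (e.g., RAPL for CPUs) or external power meters can replace the NVML readings when GPU-side telemetry is unavailable; the integration step stays the same.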
Main file: survey.pdf (245.47 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04662357, version 1 (25-07-2024)

Cite

Anais Boumendil, Walid Bechkit, Karima Benatchba. On-Device Deep Learning: Survey on Techniques Improving Energy Efficiency of DNNs. IEEE Transactions on Neural Networks and Learning Systems, 2024, ⟨10.1109/TNNLS.2024.3430028⟩. ⟨hal-04662357⟩