Energy-Efficient Task Offloading using Reinforcement Learning for Dependent Tasks in Cloud-Edge-Device Systems
Abstract
The advancement of 5G networks has enabled latency-sensitive, computation-intensive applications that demand high Quality of Service but incur significant energy consumption. Mobile devices, with their limited computational and energy resources, struggle to meet these demands. While cloud computing offers powerful processing, its high latency makes it unsuitable for delay-sensitive applications. Mobile-Edge Computing (MEC) provides a solution by offloading tasks to nearby servers, but energy efficiency remains a critical challenge, especially in dynamic environments characterized by device mobility and task dependencies. This paper proposes an energy-efficient task offloading mechanism for MEC environments that leverages collaboration among cloud, edge, and mobile devices. We model the problem as a Markov Decision Process and apply Double Deep Reinforcement Learning to optimize task offloading while minimizing energy consumption. Simulation results demonstrate that our approach converges and reduces energy consumption by 36% compared to other methods, while adapting to dynamic network conditions.
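The abstract does not spell out the learning procedure, so the following is only a minimal sketch of the generic Double DQN update that the "Double Deep Reinforcement Learning" formulation suggests: the online network selects the next action and the target network evaluates it. The state features, the three-way action set (local / edge / cloud), the energy-based reward, and all hyperparameters below are illustrative assumptions, not the authors' actual model.

```python
import random
from collections import deque

import numpy as np
import torch
import torch.nn as nn

# Assumed state: [task size, deadline, channel gain, edge load, device battery]
STATE_DIM, N_ACTIONS = 5, 3          # actions: 0 = local, 1 = edge, 2 = cloud
GAMMA, LR, BATCH = 0.99, 1e-3, 64


class QNet(nn.Module):
    """Small MLP mapping an offloading state to one Q-value per action."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, N_ACTIONS),
        )

    def forward(self, x):
        return self.net(x)


online, target = QNet(), QNet()
target.load_state_dict(online.state_dict())
opt = torch.optim.Adam(online.parameters(), lr=LR)
buffer = deque(maxlen=50_000)


def act(state, eps):
    """Epsilon-greedy offloading decision."""
    if random.random() < eps:
        return random.randrange(N_ACTIONS)
    with torch.no_grad():
        q = online(torch.as_tensor(state, dtype=torch.float32))
    return int(q.argmax())


def train_step():
    """One Double DQN update: online net selects the next action,
    target net evaluates it (reduces Q-value overestimation)."""
    if len(buffer) < BATCH:
        return
    s, a, r, s2, done = map(np.array, zip(*random.sample(buffer, BATCH)))
    s = torch.as_tensor(s, dtype=torch.float32)
    s2 = torch.as_tensor(s2, dtype=torch.float32)
    a = torch.as_tensor(a, dtype=torch.int64).unsqueeze(1)
    r = torch.as_tensor(r, dtype=torch.float32)
    done = torch.as_tensor(done, dtype=torch.float32)

    q_sa = online(s).gather(1, a).squeeze(1)
    with torch.no_grad():
        next_a = online(s2).argmax(dim=1, keepdim=True)    # action selection
        q_next = target(s2).gather(1, next_a).squeeze(1)   # action evaluation
        y = r + GAMMA * (1.0 - done) * q_next
    loss = nn.functional.mse_loss(q_sa, y)
    opt.zero_grad()
    loss.backward()
    opt.step()


def energy_cost(state, action):
    """Placeholder per-task energy model, scaled by task size (assumption)."""
    return [5.0, 2.0, 3.0][action] * state[0]


# Toy interaction loop with random stand-in states; a real MEC simulator
# would supply states, task dependencies, and transitions instead.
for step in range(1000):
    s = np.random.rand(STATE_DIM)
    a = act(s, eps=max(0.05, 1.0 - step / 500))
    r = -energy_cost(s, a)                                 # reward = negative energy
    s2, done = np.random.rand(STATE_DIM), False
    buffer.append((s, a, r, s2, done))
    train_step()
    if step % 100 == 0:                                    # periodic target-network sync
        target.load_state_dict(online.state_dict())
```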