Research Article, January 2023

Energy-Aware Scheduling in Cloud Data Centers Using Reinforcement Learning: Reducing Carbon Footprint and Operational Cost

Abstract

The exponential growth of cloud computing has driven up energy consumption in data centers, contributing significantly to operational costs and environmental degradation. To address this, our research introduces a reinforcement learning (RL)-based energy-aware scheduling model aimed at reducing both carbon footprint and operational expenses. We develop a dynamic scheduler that learns optimal task allocation policies using deep reinforcement learning in heterogeneous cloud environments. The model adapts to fluctuating workloads and resource availability, making it suitable for real-world scenarios. In simulations using real workload traces and energy consumption profiles, our RL-based approach demonstrates superior energy efficiency and service level agreement (SLA) compliance compared to traditional heuristic and rule-based algorithms.
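To give a concrete flavor of the kind of energy-aware scheduling policy the abstract describes, the following is a minimal hypothetical sketch. All of it is an assumption for illustration, not the paper's method: the host power profiles, the tabular Q-learning formulation (the paper uses deep RL), and the reward that combines marginal energy cost with an SLA-style overload penalty are invented here.

```python
import random

random.seed(0)

# Two hypothetical hosts with linear power models (numbers are assumptions,
# not from the paper): host 0 is more energy-efficient than host 1.
HOSTS = [
    {"idle_w": 100.0, "peak_w": 250.0, "capacity": 4},  # efficient host
    {"idle_w": 150.0, "peak_w": 400.0, "capacity": 4},  # less efficient host
]

def energy_cost(host, load):
    """Idle draw plus utilization-proportional draw, in watts."""
    util = load / host["capacity"]
    return host["idle_w"] + (host["peak_w"] - host["idle_w"]) * util

Q = {}                              # Q-table: (state, action) -> value
alpha, gamma, eps = 0.5, 0.9, 0.2   # learning rate, discount, exploration

def greedy(state):
    """Pick the host with the highest learned value for this state."""
    return max(range(len(HOSTS)), key=lambda a: Q.get((state, a), 0.0))

for episode in range(2000):
    loads = [0, 0]                  # tasks currently placed on each host
    for _ in range(6):              # six incoming tasks per episode
        s = tuple(loads)
        a = random.randrange(len(HOSTS)) if random.random() < eps else greedy(s)
        if loads[a] < HOSTS[a]["capacity"]:
            before = energy_cost(HOSTS[a], loads[a])
            loads[a] += 1
            # Reward is the negative marginal energy of the placement.
            reward = -(energy_cost(HOSTS[a], loads[a]) - before)
        else:
            reward = -100.0         # SLA-style penalty for overloading a host
        s2 = tuple(loads)
        best_next = max(Q.get((s2, b), 0.0) for b in range(len(HOSTS)))
        old = Q.get((s, a), 0.0)
        Q[(s, a)] = old + alpha * (reward + gamma * best_next - old)

# After training, the greedy policy routes the first task to the efficient host.
print(greedy((0, 0)))
```

The sketch captures the abstract's core idea at toy scale: the scheduler is never told which host is efficient, yet the learned policy favors it while avoiding overload penalties, trading off energy against SLA violations through the reward alone.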

Keywords

cloud computing; energy-aware scheduling; reinforcement learning; carbon emissions; green computing; operational cost reduction; data center efficiency; sustainable cloud architecture
Details
Volume 4
Issue 1
Pages 1-7
ISSN 3232-4536