Energy-Aware Scheduling in Cloud Data Centers Using Reinforcement Learning: Reducing Carbon Footprint and Operational Cost
Abstract
The exponential growth of cloud computing has sharply increased energy consumption in data centers, contributing significantly to operational costs and environmental degradation. To address this, we introduce a reinforcement learning (RL)-based energy-aware scheduling model aimed at reducing both carbon footprint and operational expenses. We develop a dynamic scheduler that learns optimal task-allocation policies via deep reinforcement learning in heterogeneous cloud environments. The model adapts to fluctuating workloads and resource availability, making it suitable for real-world deployments. In simulations using real workload traces and energy consumption profiles, our RL-based approach achieves higher energy efficiency and better service level agreement (SLA) compliance than traditional heuristic and rule-based algorithms.