Cost-aware job scheduling for cloud instances using deep reinforcement learning |
| |
Authors: | Feng Cheng, Yifeng Huang, Bhavana Tanpure, Pawan Sawalani, Long Cheng, Cong Liu |
| |
Affiliation: | 1. School of Mathematics, Southwest Jiaotong University, Chengdu, China; 2. School of Control and Computer Engineering, North China Electric Power University, Beijing, China; 3. School of Computing, Dublin City University, Dublin, Ireland; 4. The Insight SFI Research Centre for Data Analytics, Dublin City University, Dublin, Ireland; 5. School of Computer Science and Technology, Shandong University of Technology, Zibo, China |
| |
Abstract: | As cloud vendors offer better performance, auto-scaling, load balancing, and optimized execution with low infrastructure maintenance, more and more companies are migrating their services to the cloud. Since cloud workloads are dynamic and complex, scheduling the jobs submitted by users effectively is a challenging task. Although many advanced job scheduling approaches have been proposed in recent years, almost all of them are designed to handle batch jobs rather than real-time workloads, in which user requests can arrive at any time and in any quantity. In this work, we propose a Deep Reinforcement Learning (DRL) based job scheduler that dispatches jobs in real time to tackle this problem. Specifically, we focus on scheduling user requests so as to provide quality of service (QoS) to end users while significantly reducing the cost of executing jobs on virtual instances. We implement our method with a Deep Q-learning Network (DQN) model, and our experimental results demonstrate that our approach significantly outperforms commonly used real-time scheduling algorithms. |
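The abstract describes an RL scheduler that maps incoming jobs to virtual instances while trading off execution cost against QoS. The sketch below illustrates that decision loop with tabular Q-learning as a simplified stand-in for the paper's DQN (a real DQN would replace the Q-table with a neural network over a richer state). The instance types, costs, state bucketing, and reward weights are all hypothetical and chosen only for illustration; they are not taken from the paper.

```python
import random
from collections import defaultdict

# Hypothetical instance types as (hourly cost, speed factor) pairs —
# illustrative values only, not from the paper.
INSTANCE_TYPES = [(0.10, 1.0), (0.20, 2.0), (0.40, 4.0)]

def schedule_jobs(jobs, episodes=200, alpha=0.1, eps=0.2, seed=0):
    """Learn a cost/QoS-aware dispatch policy with epsilon-greedy Q-learning.

    State: a coarse job-size bucket; action: which instance type to use.
    Reward: negative of (monetary cost + a QoS penalty on runtime).
    """
    rng = random.Random(seed)
    Q = defaultdict(float)  # (state, action) -> estimated value
    for _ in range(episodes):
        for size in jobs:
            state = min(size // 2, 3)  # bucket job size into a small state space
            if rng.random() < eps:     # explore
                action = rng.randrange(len(INSTANCE_TYPES))
            else:                      # exploit the current estimates
                action = max(range(len(INSTANCE_TYPES)),
                             key=lambda a: Q[(state, a)])
            cost_rate, speed = INSTANCE_TYPES[action]
            runtime = size / speed
            reward = -(cost_rate * runtime + 0.5 * runtime)  # cost + QoS penalty
            # One-step update; jobs are episodic here, so no successor state term.
            Q[(state, action)] += alpha * (reward - Q[(state, action)])
    return Q

def dispatch(Q, size):
    """Greedy real-time dispatch of a newly arrived job using the learned Q."""
    state = min(size // 2, 3)
    return max(range(len(INSTANCE_TYPES)), key=lambda a: Q[(state, a)])
```

With these toy numbers the penalty weighting makes the fastest instance the cheapest overall choice, so the learned policy collapses to one action; in a realistic setting the cost/QoS trade-off varies per job and the policy becomes non-trivial.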
| |
Keywords: | |
This article has been indexed by SpringerLink and other databases.
|