The effect of time delays on the stability of load balancing algorithms for parallel computations
Institute of Electrical and Electronics Engineers (IEEE)
IEEE Transactions on Control Systems Technology
A deterministic, dynamic, nonlinear time-delay system is developed to model load balancing in a cluster of computer nodes used for parallel computations. The model is shown to be self-consistent in that queue lengths cannot go negative and the total number of tasks in all the queues and in the network is conserved (i.e., load balancing can neither create nor lose tasks). Further, it is shown that under the proposed load balancing algorithms, the system is stable in the sense of Lyapunov. Experimental results are presented and compared with the predictions of the analytical model. In particular, simulations of the models are compared with an experimental implementation of the load balancing algorithm on a distributed computing network.
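The paper's model is a continuous-time nonlinear delay-differential system, which is not reproduced here. Purely as an illustration of the two self-consistency properties the abstract states (queues never go negative; the total task count is conserved), the following is a minimal discrete-time toy sketch, not the authors' model: each node periodically sees the other nodes' queue lengths `delay` steps old and transfers an assumed fraction `gain` of its excess over that stale average. All parameter names and the transfer rule are illustrative assumptions.

```python
import random

def simulate(n=4, steps=200, delay=3, gain=0.3, seed=0):
    """Toy discrete-time load balancing with a communication delay.

    NOT the paper's delay-differential model: a hypothetical sketch in
    which each node compares its current queue to a `delay`-steps-old
    network average and ships a fraction `gain` of any excess, one task
    at a time, round-robin to the other nodes. Because every transfer
    moves exactly one existing task, the total is conserved and no
    queue can be driven negative.
    """
    rng = random.Random(seed)
    q = [rng.randint(0, 100) for _ in range(n)]
    history = [list(q)]          # history[t][i] = queue of node i at step t
    total = sum(q)
    for _ in range(steps):
        old = history[max(0, len(history) - 1 - delay)]  # stale snapshot
        avg = sum(old) / n
        new_q = list(q)
        for i in range(n):
            excess = q[i] - avg
            if excess > 0:
                send = min(q[i], int(gain * excess))
                for k in range(send):
                    # round-robin over the other n-1 nodes (never i itself)
                    j = (i + 1 + k % (n - 1)) % n
                    new_q[i] -= 1
                    new_q[j] += 1
        q = new_q
        history.append(list(q))
        assert sum(q) == total   # tasks are neither created nor lost
        assert min(q) >= 0       # queues stay non-negative
    return q, total
```

Running the sketch, the queue lengths settle near the common average while the invariants hold at every step; increasing `delay` relative to `gain` makes the trajectories oscillate before settling, which is the qualitative effect of delay on stability that the paper analyzes rigorously.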
Chiasson, John; Tang, Zhong; Ghanem, Jean; Abdallah, Chaouki T.; Birdwell, J. Douglas; Hayat, Majeed M.; and Jérez, Henry, "The effect of time delays on the stability of load balancing algorithms for parallel computations" (2005). Electrical and Computer Engineering Faculty Research and Publications. 538.
Accepted version. IEEE Transactions on Control Systems Technology, Vol. 13, No. 6 (2005): 932-942. DOI. © 2005 Institute of Electrical and Electronics Engineers (IEEE). Used with permission.