Problem 91


Load balancing in a network

Due: April 29
Points: 3

Imagine the following scenario: You are running the servers for a large website that handles many connections and requests every minute. Your setup is a single "master" server that receives all the incoming requests, plus \(s\) processing servers that actually handle them.

Since the requests come in quickly, your master server must very rapidly decide which of the \(s\) processing servers each incoming request will be assigned to. Every request might take a different amount of time to process, and there is no way to know in advance how long a given request will take.

The only tool your master server has for deciding where to send an incoming request is to ask a single processing server what its current "load" is. Ideally, every incoming request would go to the server with the lightest load, but finding that server is impractical: it would require the master server to constantly poll every processing server for its current load.
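To make the setup concrete, here is a minimal sketch of the interface the master server is assumed to have. The names ProcessingServer, query_load, and assign are hypothetical, invented just for illustration; in reality each server would be a separate machine reached over the network, not an object in memory.

    class ProcessingServer:
        """Stands in for one of the s processing servers."""
        def __init__(self):
            self.pending = []                # requests currently assigned to this server

        def query_load(self):
            """The one question the master is allowed to ask: what is your current load?"""
            return len(self.pending)

        def assign(self, request):
            """Hand this server a new request to work on."""
            self.pending.append(request)

    # The master only keeps references to the servers; it never polls all of them at once.
    servers = [ProcessingServer() for _ in range(16)]    # s = 16, just as an example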

Come up with a way of deciding how to handle each incoming request by making a small number of "load queries" to the processing servers. Specifically, for each incoming request, your algorithm should only need to ask a constant \(O(1)\) number of processing servers what their current load is.

Briefly describe your algorithm and why it should work well. Hint: you should use some ideas from hashing, although no hash functions will actually be used here!
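Purely to illustrate the kind of dispatcher being asked for (not necessarily the intended answer), here is a sketch of one well-known strategy in this spirit, often called the "power of two choices": pick two processing servers uniformly at random, query only those two, and send the request to the less loaded one. It reuses the hypothetical ProcessingServer interface sketched above.

    import random

    def dispatch(request, servers):
        """Assign one incoming request using exactly two load queries (a constant number per request)."""
        a, b = random.sample(servers, 2)     # two distinct servers, chosen uniformly at random
        target = a if a.query_load() <= b.query_load() else b
        target.assign(request)
        return target

The reason a scheme like this keeps the loads well balanced comes from the same balls-into-bins analysis used to study hashing, which is presumably what the hint is gesturing at; explaining why it works is the part you are asked to write up.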