This is the archived version of this course from the Spring 2013 semester. You might find something more recent by visiting my teaching page.
In class and in the notes we analyzed the cost of randomized game tree evaluation with branching factor 2, and determined that it takes \(\Theta(n^{\log_4 3})\) expected time, which is approximately \(\Theta(n^{0.79})\).
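For concreteness, here is a minimal sketch of the kind of randomized short-circuit evaluation being analyzed, written in Python for a NOR-gate formulation of the game tree (an alternating AND/OR formulation is equivalent); the function name `eval_nor` and the convention of returning the number of leaves read alongside the value are illustrative choices, not part of the course notes.

```python
import random

def eval_nor(node):
    """Evaluate a NOR game tree, probing subtrees in uniformly random order
    and stopping as soon as the node's value is forced.

    A leaf is a bool; an internal node is a list of its subtrees (so the
    branching factor is len(node)).  Returns (value, leaves_read), where
    leaves_read is the cost measure used in the analysis.
    """
    if isinstance(node, bool):
        return node, 1                 # reading a leaf costs 1
    order = list(node)
    random.shuffle(order)              # the only randomness: evaluation order
    cost = 0
    for child in order:
        value, child_cost = eval_nor(child)
        cost += child_cost
        if value:                      # one child equal to 1 forces NOR to 0
            return False, cost         # short-circuit: skip remaining children
    return True, cost                  # every child evaluated to 0

# Tiny usage example: depth 2, branching factor 3, all leaves False.
# Each inner NOR of three 0s is 1, so the root short-circuits after its
# first subtree and only 3 of the 9 leaves are read.
tree = [[False, False, False] for _ in range(3)]
print(eval_nor(tree))                  # (False, 3)
```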
What if the branching factor is an arbitrary value \(k\) instead of 2 or 3? Come up with a formula for the expected cost in terms of \(n\) and \(k\). Simplify as much as possible to get a big-\(\Theta\) bound.
As \(k\) grows, does the randomized algorithm perform better or worse relative to the deterministic one?