This is the archived website of IC 312 from the Fall 2015 semester. Feel free to browse around; you may also find more recent offerings at my teaching page.

Unit 2: Recursion

Credit: Gavin Taylor for the original version of these notes.

Beginning of Class 5

Beginning of Class 6

1 Recursion Review

Recursion!

Yes, it's back! You all learned about recursion in IC210, and you might have thought it was pretty useless, since you could do all the same things with iteration, and since recursion is so confusing anyway.

Well, Data Structures (and most especially, trees) is where recursion becomes really, really useful. Once we have more sophisticated data structures and algorithms, recursion lets us write simple code that works and that we can understand.

The simple fact is this: with many, many data structures, recursion is WAY easier to cope with than iteration is. Embrace it, because it's about to make your life easier. Today, then, is an IC210-level reminder of what recursion is, and how it works.

Recursion is a function calling itself. So, if we understand how function calls work, we'll understand recursion a lot better. Imagine the following set of functions (in pseudocode):

f1() {
  print "inside f1"
}
 
f2() {
  print "starting f2"
  f1()
  print "leaving f2"
}
 
main() {
  print "starting main"
  f2()
  print "leaving main"
}

If we were to run this program, the output would be:

starting main
starting f2
inside f1
leaving f2
leaving main
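If you'd like to run this yourself, here's a direct Java translation (the class name CallDemo is just my own choice; everything else matches the pseudocode):

public class CallDemo {
  static void f1() {
    System.out.println("inside f1");
  }

  static void f2() {
    System.out.println("starting f2");
    f1();
    System.out.println("leaving f2");
  }

  public static void main(String[] args) {
    System.out.println("starting main");
    f2();
    System.out.println("leaving main");
  }
}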

To understand why, let's think about the call stack. The call stack works like a stack of plates: plates can only be placed on the top, and can only be removed from the top. For this reason, stacks are often referred to as LIFO (last-in, first-out). Instead of plates, though, the call stack holds the functions that are running; when a function is called, it is placed on the top of the stack. A function ONLY runs when it is at the top of the stack; the rest of the time, it sits and waits for the one above it to return.
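If you want to play with LIFO behavior directly, Java's ArrayDeque works nicely as a stack of plates; this little sketch is just an illustration, not anything specific to the call stack:

import java.util.ArrayDeque;

public class PlateStack {
  public static void main(String[] args) {
    ArrayDeque<String> plates = new ArrayDeque<>();
    plates.push("blue plate");  // first in (bottom)
    plates.push("red plate");
    plates.push("green plate"); // last in (top)
    // Plates come off in the reverse of the order they went on:
    System.out.println(plates.pop()); // green plate
    System.out.println(plates.pop()); // red plate
    System.out.println(plates.pop()); // blue plate
  }
}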

So, we start running the program. Obviously, it prints "starting main." At this point, our call stack looks like this:

\[ \array{ \boxed{\texttt{main()}} } \]

Our next line is to call the function f2. So, f2 gets put on the top of the stack:

\[ \array{ \boxed{\texttt{f2()}} \\ \boxed{\texttt{main()}} } \]

We've done enough programming by now to know that f2 is running, not main. The reason is that f2 is now on the top of the call stack, so main just sits and waits until f2 returns.

So, f2 runs, printing "starting f2." It then calls f1, leading to this stack:

\[ \array{ \boxed{\texttt{f1()}} \\ \boxed{\texttt{f2()}} \\ \boxed{\texttt{main()}} } \]

So, f1 runs, while f2 (and main) sit and wait. f1 prints "inside f1," then returns, "popping" off the stack and taking us back to this:

\[ \array{ \boxed{\texttt{f2()}} \\ \boxed{\texttt{main()}} } \]

f2, back on the top of the stack, gladly continues where it left off, printing "leaving f2," before returning and popping off the stack itself:

\[ \array{ \boxed{\texttt{main()}} } \]

Back on the top of the stack, main now finishes, printing "leaving main," and returning, ending the program.

OK! So, recursion. Let's consider the standard first example of recursion: computing a factorial.

int fac(int n) {
  if (n == 0)          // base case: 0! = 1
    return 1;
  return n * fac(n-1); // recursive case
}

If fac(3) is run, it runs through happily until it gets to fac(n-1), at which point fac(2) is placed on top of the stack. fac(2) begins to run, and fac(3) sits and waits for it to return. fac(1) is then placed on top, and fac(2) and fac(3) sit and wait. Finally, briefly, fac(0) is placed on the stack, just long enough to return 1.

The story so far looks like this:

\[ \array{ \phantom{\boxed{\texttt{f}}} \\ \phantom{\boxed{\texttt{f}}} \\ \phantom{\boxed{\texttt{f}}} \\ \boxed{\texttt{fac(3)}} } \longrightarrow \array{ \phantom{\boxed{\texttt{f}}} \\ \phantom{\boxed{\texttt{f}}} \\ \boxed{\texttt{fac(2)}} \\ \boxed{\texttt{fac(3)}} } \longrightarrow \array{ \phantom{\boxed{\texttt{f}}} \\ \boxed{\texttt{fac(1)}} \\ \boxed{\texttt{fac(2)}} \\ \boxed{\texttt{fac(3)}} } \longrightarrow \array{ \boxed{\texttt{fac(0)}} \\ \boxed{\texttt{fac(1)}} \\ \boxed{\texttt{fac(2)}} \\ \boxed{\texttt{fac(3)}} } \longrightarrow \array{ \texttt{return 1} \\ \boxed{\texttt{fac(1)}} \\ \boxed{\texttt{fac(2)}} \\ \boxed{\texttt{fac(3)}} } \]

At this point, fac(1) picks up where it left off, multiplying its n (1) by what fac(0) returned (also 1), and returning that result. fac(2) then multiplies that returned value of 1 by its n (2), and returns the result. fac(3) then multiplies that 2 by its n (3), returning a final answer of 6.

The second half of the story looks like this:

\[ \array{ \texttt{return 1} \\ \boxed{\texttt{fac(1)}} \\ \boxed{\texttt{fac(2)}} \\ \boxed{\texttt{fac(3)}} } \longrightarrow \array{ \phantom{\boxed{\texttt{f}}} \\ \texttt{return 1} \\ \boxed{\texttt{fac(2)}} \\ \boxed{\texttt{fac(3)}} } \longrightarrow \array{ \phantom{\boxed{\texttt{f}}} \\ \phantom{\boxed{\texttt{f}}} \\ \texttt{return 2} \\ \boxed{\texttt{fac(3)}} } \longrightarrow \array{ \phantom{\boxed{\texttt{f}}} \\ \phantom{\boxed{\texttt{f}}} \\ \phantom{\boxed{\texttt{f}}} \\ \texttt{return 6} \\ } \]
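If you want to watch this happen yourself, one trick is to make fac print its progress, indenting by the depth of the call stack. This is just a sketch (the extra String parameter is my own addition for illustration):

static int fac(int n, String indent) {
  System.out.println(indent + "calling fac(" + n + ")");
  int result;
  if (n == 0)
    result = 1;                             // base case
  else
    result = n * fac(n - 1, indent + "  "); // recursive case: deeper stack, deeper indent
  System.out.println(indent + "fac(" + n + ") returns " + result);
  return result;
}

Calling fac(3, "") prints the "calling" lines marching to the right as the stack grows, then the "returns" lines marching back to the left as it shrinks.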

Tips for writing recursive methods

We're going to need a base case (the thing that makes everything stop running; in the above factorial example, it's if n==0), and at least one recursive case. The base case is usually easier... what's an input that makes this problem really easy? Write that first, then make the recursive case build towards it.
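For instance, suppose we want to reverse a String recursively. An empty string is an input that makes the problem really easy (it's already reversed!), so that's our base case; the recursive case peels off one character and builds towards it. A sketch, assuming nothing beyond the standard String methods:

static String reverse(String s) {
  if (s.isEmpty())   // base case: nothing to reverse
    return s;
  // recursive case: reverse the rest, then put the first character on the end
  return reverse(s.substring(1)) + s.charAt(0);
}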

Linked List traversal

Here's linked list traversal with iteration:

traverse(Node head) {
  Node tempPtr = head;
  while (tempPtr != null) {
    // do some stuff to this node
    tempPtr = tempPtr.getNext();
  }
}

and here it is with recursion:

traverse(Node head) {
  if (head == null) // empty lists make this easy, so it's a good base case
    return;
  // do some stuff to this node
  traverse(head.getNext());
}

Make sure you understand how this works. Soon, there will be data structures where the iterative version is pages long, and the recursive version is five lines. You want to know and understand this.

In class, we wrote two other methods, one of which calculated the length of a linked list:

public int length(Node n) {
  if (n == null)
    return 0;
  return 1+length(n.getNext());
}

The other added an element to a sorted linked list. For example, if we have a linked list holding the integers 1, 2, and 4, after running addInOrder(3), we would have a linked list with the integers 1, 2, 3, and 4.

public void addInOrder(int element) {
  first = addInOrder(element, first);
  if (last == null)
    last = first;
  else if (last.getNext() != null) // the new node went on the very end
    last = last.getNext();
}
 
//This is a helper method, so our user doesn't have to know or care about
//nodes.
private Node addInOrder(int element, Node n) {
  if (n == null)               // base case: off the end; element is the largest
    return new Node(element);
  if (n.getData() > element) { // element belongs right before n
    Node t = new Node(element);
    t.setNext(n);
    return t;
  }
  // otherwise, element belongs somewhere in the rest of the list
  n.setNext(addInOrder(element, n.getNext()));
  return n;
}

Make some examples to try the above on, and understand why it works! Your homework will go better.
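If you'd like something concrete to run, here's a minimal sketch of the surrounding classes (the Node fields and the SortedList name are my own assumptions; your class from lab may differ in the details):

class Node {
  private int data;
  private Node next;
  Node(int data) { this.data = data; }
  int getData() { return data; }
  Node getNext() { return next; }
  void setNext(Node next) { this.next = next; }
}

public class SortedList {
  private Node first, last;

  public void addInOrder(int element) {
    first = addInOrder(element, first);
    if (last == null)
      last = first;
    else if (last.getNext() != null) // the new node went on the very end
      last = last.getNext();
  }

  private Node addInOrder(int element, Node n) {
    if (n == null)
      return new Node(element);
    if (n.getData() > element) {
      Node t = new Node(element);
      t.setNext(n);
      return t;
    }
    n.setNext(addInOrder(element, n.getNext()));
    return n;
  }

  public static void main(String[] args) {
    SortedList list = new SortedList();
    list.addInOrder(1);
    list.addInOrder(2);
    list.addInOrder(4);
    list.addInOrder(3); // goes between 2 and 4
    for (Node n = list.first; n != null; n = n.getNext())
      System.out.print(n.getData() + " ");
    System.out.println(); // prints: 1 2 3 4
  }
}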

Beginning of Class 7

2 Big-O with Recursion

Big-O and Recursion

We have some practice identifying the Big-O of iterative algorithms. However, now that we're (again) experts on recursion, it's important that we be able to identify the Big-O of recursive algorithms. We're not going to spend a huge amount of time on this; that's Algorithms' job (note for IT people: consider taking Algorithms. It's an Important Class). However, it is important that you can identify some of the most common patterns.

Let's start with an example: linked-list traversal.

traverse(Node n) {
  if (n==null) return;
  //Do some stuff to this node
  traverse(n.getNext());
}

Obviously, we have two cases: a base case and a recursive case. We're going to write a function \(T\) that describes the number of steps taken in each case, as a function of the length of the list, \(n\). The base case is easy:

\[T(0) = 2\]

This is because when \(n=0\), there is a constant amount of work to do (namely, do the comparison in the if-statement, and perform the return statement). Ultimately, this constant 2 won't really matter in the big-O, but when we're dealing with recursion it's safer not to do big-O until the end.

Now what about the other case, when \(n\ge 1\)?

\[T(n) = 2 + T(n-1)\]

This is because when \(n\ge 1\), we do some constant amount of work (if statement comparison, function call to traverse, etc), then run it all again on a list of size \(n-1\).

These two equations together are known as a recurrence relation. So the recurrence relation of this algorithm is:

\[\begin{align*} T(0) &= 2 \\ T(n\ge 1) &= 2 + T(n-1) \end{align*} \]

The same exact thing can also be written as

\[ T(n) = \begin{cases} 2,& n=0 \\ 2 + T(n-1), & n\ge 1 \end{cases} \]

Solving Recurrence Relations

Our five-step process for solving a recurrence relation is:

  1. Write down the recurrence relation.
  2. Expand the recurrence relation, for several lines.
  3. Figure out what you would write for the i-th line.
  4. Figure out what value of i would result in the base case.
  5. In your equation from (3), replace i with the solution from (4).

OK, so we've written it. Let's do step 2.

\(T(n) = 2 + T(n-1)\)

\(T(n) = 2 + \underbrace{2 + T(n-2)}_{T(n-1)}\)

\(T(n) = 2 + 2 + \underbrace{2 + T(n-3)}_{T(n-2)}\)

And so on. But this is probably enough to recognize the pattern. Let's do step 3. Hopefully you agree that on the i-th line, we'd have:

\(T(n) = 2i + T(n-i)\)

So when does this stop? Well, that's where the other part of our recurrence relation comes in: our base case equation tells us that when \(n=0\), it's just \(2\). So, we figure out what value of i makes the T(n-i) equal to our base case of T(0). In this case, that happens when \(i=n\), because when \(i=n, n-i=0\).

Finally, we do step 5: take our equation \(T(n) = 2i + T(n-i)\) and replace i with our solution from step 4 (\(i=n\)), to get \(T(n) = 2n + T(n-n)\). Because \(T(n-n) = T(0) = 2\), the total runtime is \(2n+2\), which is \(O(n)\).
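If you don't quite trust the algebra, you can check it empirically. Here's a sketch that counts the number of calls to traverse instead of individual steps; for a list of length \(n\) it reports exactly \(n+1\) calls, which grows linearly, just like our \(2n+2\):

public class CountTraverse {
  static int calls = 0;

  static class Node {
    Node next;
    Node(Node next) { this.next = next; }
  }

  static void traverse(Node n) {
    calls++;               // one increment per stack frame
    if (n == null) return;
    traverse(n.next);
  }

  public static void main(String[] args) {
    for (int len : new int[]{10, 100, 1000}) {
      Node head = null;
      for (int i = 0; i < len; i++)
        head = new Node(head); // build a list of the given length
      calls = 0;
      traverse(head);
      System.out.println("length " + len + ": " + calls + " calls");
    }
    // prints 11, 101, and 1001 calls: linear growth
  }
}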

Another Example!

A couple days ago we saw the iterative version of binary search. Here's the recursive version:

boolean binarySearch(int[] arr, int needle) {
  return binarySearch(arr, 0, arr.length-1, needle);
}
 
boolean binarySearch(int[] arr, int left, int right, int needle) {
  if (left > right) return false;
  else if (left == right) return arr[left] == needle;
 
  int middle = (left + right) / 2;
  if (needle <= arr[middle])
    return binarySearch(arr, left, middle, needle);
  else // (needle > arr[middle])
    return binarySearch(arr, middle+1, right, needle);
}
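A quick sanity check, assuming the two methods above are declared static in some class (and remember the array must already be sorted):

public static void main(String[] args) {
  int[] arr = {1, 3, 5, 7, 11};
  System.out.println(binarySearch(arr, 7));  // true
  System.out.println(binarySearch(arr, 4));  // false
  System.out.println(binarySearch(arr, 12)); // false
}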

This program has two base cases (when \(n=0\) or \(n=1\)) and two recursive cases too! But the worst-case cost in both base cases, and in both recursive cases, is more or less the same, so we use the following recurrence relation:

\[T(n) = \begin{cases} 5,& n \le 1 \\ 8 + T\left(\tfrac{n}{2}\right),& n \ge 2 \end{cases}\]

(*If you disagree or come up with different values for 5 and 8, remember that those constants do not actually matter at all for the big-O, as long as they're constant! If you were solving this for a HW or exam and put down 3 and 9 instead of 5 and 8, no problem!)

So let's figure out the Big-O. Steps 2 and 3 are to expand the recurrence and find the pattern for the \(i\)th line:

\[\begin{align*} T(n) &= 8 + T(n/2) \\ &= 8 + \underbrace{8 + T(n/4)}_{T(n/2)} \\ &= 8 + 8 + \underbrace{8 + T(n/8)}_{T(n/4)} \\ &= 8i + T(n/2^i) \end{align*}\]

The next step is to determine how big \(i\) has to be to hit the base case:

\[\begin{align*} n/(2^i) & \le 1 \\ 2^i &\ge n \\ i &\ge \log_2 n \end{align*}\]

Finally, we substitute our discovered value of \(i\) into the formula to get the non-recursive total cost:

\[\begin{align*} T(n) &= 8(\log_2 n) + T(n/2^{\log_2 n}) \\ &= 8\lg n + T(1) \\ & = 8\lg n + 5 \end{align*}\]

which of course is \(O(\log n)\).

(* From now on, I'm going to write \(\lg n\) instead of \(\log_2 n\), because I'm lazy. You should feel free to do the same.)

Hopefully, even before the last line, it was clear to you that, at a minimum, this is sublinear; that is, faster than linear time.