## Section 9.8 Discussion

Yolanda took a sip of coffee. “I'm glad I paid attention when we were studying vector spaces, bases, and dimension. All this stuff about solutions for recurrence equations made complete sense. And I can really understand why the professor was making a big deal out of factoring. We saw it our first semester when we were learning about partial fractions in calculus. And we saw it again with the differential equations stuff. Isn't it really neat to see how it all fits together?” All this enthusiasm was too much for Alice, who was not having a good day. Bob was more sympathetic, saying “Except for the detail about zero as a root of an advancement operator polynomial, I was OK with this chapter.” Xing said “Here we learned a precise approach that depended only on factoring. I've been reading on the web, and I see that there have been some recent breakthroughs on factoring.” Bob jumped back in: “But even if you can factor like crazy, if you have a large-degree polynomial in the advancement operator equation, then you will have lots of initial conditions. This might be a second major hurdle.” Dave mumbled “Just do the factoring. The rest is easy.” Carlos was again quiet, but he knew that Dave was right. Solving big systems of linear equations is relatively easy. The challenge is in the factoring stage.

Despite thinking the material of this chapter was interesting, Bob also wondered whether they really needed all of this machinery. “Defining a recursive function is easy in almost all programming languages, so why not just use a computer to calculate the values you need?” Xing started to remark that the techniques of this chapter could provide a good way to understand the growth rate of recursive functions in terms of the big-Oh notation of Chapter 4, but Alice interrupted to propose a programming experiment as something that would raise her spirits. (The chance to prove Bob wrong was probably more motivational than the chance to do some coding, but she didn't want to be too mean.)

The group decided to take a look at the recurrence in Example 9.25, which they immediately wrote as a recursive function defined on the nonnegative integers by

\begin{equation*} r(n) =\begin{cases} 2^n + r(n-1) + 2r(n-2) & \text{if } n\geq 2;\\ 1 & \text{if } n=1;\\ 2 & \text{if } n=0.\end{cases} \end{equation*}

Alice grabbed her computer, implemented this in SageMath, and computed a few test values.

```
def r(n):
    if n == 0:
        return 2
    elif n == 1:
        return 1
    elif n >= 2:
        return 2^n + r(n-1) + 2*r(n-2)

print(r(1))
print(r(4))
print(r(10))
```


She then defined a second function s that was the explicit (nonrecursive) solution from Example 9.25 and checked that the values matched.

```
s(n) = (5/9)*2^n + (2/3)*n*2^n + (13/9)*(-1)^n
print(s(1))
print(s(4))
print(s(10))
```
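Beyond spot-checking a few values, the match can be confirmed exactly. Here is a quick sketch in plain Python (so `**` replaces SageMath's `^` for exponentiation, and `Fraction` keeps the coefficients 5/9, 2/3, and 13/9 exact); the function name `s_exact` is ours:

```python
from fractions import Fraction

def s_exact(n):
    # Closed form from Example 9.25, in exact rational arithmetic
    return (Fraction(5, 9) * 2**n
            + Fraction(2, 3) * n * 2**n
            + Fraction(13, 9) * (-1)**n)

# Initial conditions
assert s_exact(0) == 2
assert s_exact(1) == 1

# The recurrence r(n) = 2^n + r(n-1) + 2 r(n-2) holds for n >= 2
for n in range(2, 40):
    assert s_exact(n) == 2**n + s_exact(n - 1) + 2 * s_exact(n - 2)

print("closed form verified for n < 40")
```

Using exact rationals matters here: with floating-point coefficients, rounding error would eventually make the equality test fail even though the formula is correct.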


“Is this going somewhere?” Bob asked impatiently. For these values, both r and s seemed to be giving them answers equally quickly. Dave said he'd heard something about a timeit command in SageMath that would allow them to compare run times and commandeered Alice's keyboard to type:

```
for n in range(31):
    if n % 5 == 0:
        print("For n = {}:".format(n))
        timeit("r(n)", number=5)
        timeit("s(n)", number=5)
```


This finally got Bob's attention, since s seems to take a relatively constant time to run even as $n$ increases, while r seems to take about 10 times as long to run for each increase of 5 in the value of $n$. As a final test, they executed the SageMath code below, which calculates s(100) almost instantly. On the other hand, it seemed like getting a refill on their coffee would be a good way to pass the time while waiting on r(40).

```
print(s(100))
print(r(40))
```
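Where does that factor of roughly 10 come from? Here is a back-of-the-envelope estimate (our own, not from the example): the naive r makes two recursive calls, so its total call count obeys $C(n) = 1 + C(n-1) + C(n-2)$, which grows like $\varphi^n$, where $\varphi \approx 1.618$ is the dominant root of $x^2 = x + 1$. Each step of 5 in $n$ therefore multiplies the work by about $\varphi^5 \approx 11$, consistent with the slowdown the timeit experiment shows. A plain-Python sketch (again with `**` in place of Sage's `^`; the name `call_count` is ours):

```python
def call_count(n):
    # Total calls made by the naive recursive r: one for r(n) itself,
    # plus everything done by the r(n-1) and r(n-2) subcalls.
    if n <= 1:
        return 1
    return 1 + call_count(n - 1) + call_count(n - 2)

phi = (1 + 5**0.5) / 2  # dominant root of x^2 = x + 1

# The ratio of call counts five steps apart approaches phi^5 ~ 11.09.
for n in range(10, 26, 5):
    print(n, call_count(n), call_count(n) / call_count(n - 5))
```

The closed form s, by contrast, does a fixed amount of arithmetic regardless of $n$, which is why its timings stay essentially flat.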

The history of how recursion made its way into ALGOL (and therefore most modern programming languages) involved some intrigue. Maarten van Emden recounts this in a blog post entitled “How recursion got into programming: a tale of intrigue, betrayal, and advanced programming-language semantics,” available at vanemden.wordpress.com/2014/06/18/how-recursion-got-into-programming-a-comedy-of-errors-3/.