Now we're going to take a look at the use of generating functions to address the

important tasks that we brought up in the last lecture.

Programs, many of which can be cast as recursive programs or algorithms,

immediately lead to mathematical models of their behavior called recurrence

relations and so we need to be able to solve recurrence relations in order to be

able to analyze algorithms. And so now, I'll take a look at how you

use generating functions to solve recurrence relations.

And it's pretty much an algorithmic process. I hate to use a meat grinder as the analogy, but it gives the idea.

We want to put in a recurrence relation and we want to turn the crank and we

want to get out a sequence, a simple expression for what it represents. And it's pretty much a general procedure

that we can always follow. I'll give many examples.

So the first thing is we're going to make the recurrence valid for all values of n. It's easy to do that, and I'll show you how in the examples.

Then, multiply both sides of the recurrence by z to the n and sum on n.

Since it's an equation that's valid for all n, we can do that. And usually, we make it just for nonnegative n.

That'll give us some sums; some of those sums will involve an unknown generating function, and maybe some well-known generating functions. The end result will be an equation that the OGF corresponding to the recurrence has to satisfy. So then, what we need to do is solve

that equation to get an explicit formula for the OGF.

A lot of times we can go ahead and do that, and we'll see plenty of examples. The initial conditions play a role. And then we'll expand the OGF to find the

coefficients. So, the recurrence corresponds to a

sequence. The OGF is a way to represent the sequence. We'll use the recurrence to find out what the OGF is, and then we'll expand to get the coefficients, which is the goal. That's what we're trying to find.

So let's look at an example. Now, in the case of linear recurrences with constant coefficients, the procedure really is an algorithm. We always can get a solution, and actually, people have implemented this in symbolic math systems. So here's an example from the previous

lecture. It's a recurrence that is defined for n greater than or equal to 2, with the initial conditions a0 = 0 and a1 = 1. So the first thing is to make it valid for all n greater than or equal to 0. And all we do is use Kronecker

delta notation, where the delta is 1 when n = 1 and 0 otherwise, and you assume that a is 0 for negative indices. Then a0 is expressed in terms of a's with negative indices, which are 0, so a0 = 0, as required. For a1, we'd have to add 1 when n is 1, and that's what that Kronecker delta on the right is for. So now the recurrence is valid for all n greater than or equal to 0.
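To see the adjustment concretely, here's a quick check of my own (not from the lecture), assuming the adjusted recurrence is a_n = 5a_{n-1} - 6a_{n-2} + delta_{n,1}, the one this example builds toward:

```python
def a(n, memo={}):
    """Adjusted recurrence a_n = 5*a_{n-1} - 6*a_{n-2} + [n == 1],
    with a_n = 0 for all negative indices n."""
    if n < 0:
        return 0
    if n not in memo:
        memo[n] = 5 * a(n - 1) - 6 * a(n - 2) + (1 if n == 1 else 0)
    return memo[n]

# The original initial conditions a_0 = 0, a_1 = 1 come out automatically.
print([a(n) for n in range(6)])  # → [0, 1, 5, 19, 65, 211]
```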

So now, we multiply by z to the n and sum on n. On the left, that's the sum for n greater than or equal to 0 of a_n z^n. That's our generating function, A(z), that we're going to use to represent the sequence a_n. For the first term on the right-hand side, it's the sum of 5a_{n-1} z^n. Changing n to n + 1 in the sum throws out a factor of z and leaves A(z).

So that's 5zA(z). And for the second term, we change n to n + 2 in the sum and throw out a factor of z^2; that's -6z^2 A(z). And for the Kronecker delta term, that's a sum over all n, but the delta is only 1 when n = 1, so z^n in that case is just z.

So that's an equation that the generating function has to satisfy, and it's an easy equation to solve with some algebra: A(z) = z / (1 - 5z + 6z^2). So now we have the generating function explicitly, and we want to extract coefficients, because our goal is to find an expression for a_n. So how are we going to extract coefficients?

Well, in the case of a ratio of two polynomials, it's not difficult; it's a technique known as partial fractions, where we factor the polynomial in the denominator. And we know that our solution must be of this form, because if you cross multiply, then in the denominator you get the right polynomial, because you've factored it.

And then in the numerator, you've got two equations in two unknowns: you have to have c0 + c1 = 0, and you have to have 2c0 + 3c1 = -1. That is, those unknowns have to satisfy those two simultaneous equations, and that's just what we got before; the solution is c0 = 1 and c1 = -1.
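If you'd rather not solve the simultaneous equations by hand, a symbolic system can do the partial-fraction step. Here's a sketch of mine using sympy (an illustration, not part of the lecture); the exact printed form may vary by version, so the script checks equivalence instead:

```python
import sympy as sp

z = sp.symbols('z')
A = z / (1 - 5*z + 6*z**2)     # the OGF derived from the recurrence
expanded = sp.apart(A, z)      # partial-fraction decomposition
print(expanded)

# The decomposition is equivalent to 1/(1 - 3z) - 1/(1 - 2z),
# i.e., c0 = 1 and c1 = -1 in the lecture's notation.
check = 1/(1 - 3*z) - 1/(1 - 2*z)
print(sp.simplify(expanded - check))  # → 0
```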

And so that now expresses the generating function as a difference between two geometric series, and those we know how to expand: a_n = 3^n - 2^n. So, step by step, given a recurrence, we can get the solution, that is, a simple expression for the coefficients, and that's going to work in every case. Now, there are complications that can arise.
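Before the complications, it's worth sanity-checking the result we just derived. This little script (my addition, not from the lecture) compares the recurrence with the closed form 3^n - 2^n:

```python
# Recurrence from the example: a_n = 5*a_{n-1} - 6*a_{n-2}, a_0 = 0, a_1 = 1.
seq = [0, 1]
for n in range(2, 12):
    seq.append(5 * seq[n - 1] - 6 * seq[n - 2])

# Closed form obtained by partial fractions.
closed = [3**n - 2**n for n in range(12)]
print(seq == closed)  # → True
```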

Let's look at a more complicated example. This works for any linear recurrence with constant coefficients, because you get a polynomial, and you can always factor a polynomial. So let's look at this one: a_n = 5a_{n-1} - 8a_{n-2} + 4a_{n-3}. Same procedure to make the initial conditions satisfied: I start at 0 and work up, and find that I have to add a delta for n = 1 and subtract a delta for n = 2 in order to make it valid for all n. Then, when you multiply by z to the n and sum on n, you get this equation for the generating function: A(z) = 5zA(z) - 8z^2 A(z) + 4z^3 A(z) + z - z^2.

And again, just using algebra, we now have an explicit expression for the generating function. It's a ratio of two polynomials, so partial fractions works, but now you can have multiple terms.

In this case, the root 1/2 has multiplicity 2, so the polynomial has three roots, and actually, in this case, one of the factors cancels with the numerator, so we have A(z) = z / (1-2z)^2. But that's one that we know; it's a generating function that we already derived by differentiating the geometric series and scaling, and it says that a_n = n 2^(n-1).
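The multiplicity-two case can be checked the same way. This is my own sketch, assuming the initial values a_0 = 0, a_1 = 1, a_2 = 4 that are consistent with the generating function z/(1-2z)^2:

```python
# Recurrence: a_n = 5*a_{n-1} - 8*a_{n-2} + 4*a_{n-3},
# with a_0 = 0, a_1 = 1, a_2 = 4 (values the deltas enforce).
seq = [0, 1, 4]
for n in range(3, 12):
    seq.append(5 * seq[n - 1] - 8 * seq[n - 2] + 4 * seq[n - 3])

# Closed form from the repeated root 1/2: a_n = n * 2^(n-1).
closed = [0] + [n * 2**(n - 1) for n in range(1, 12)]
print(seq == closed)  # → True
```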

Again, it's just algebra to find an explicit representation for the generating function, and then expand. Now, it turns out that if you have roots of multiplicity 3, you'll get terms of the form n^2 times something to the n, and so forth. And there are other things that can happen, but it's all just properties of polynomials. So, for example, you could get complex

roots. So here's an example where the roots are complex. Again, this is the same setup: we have a third-order linear recurrence with constant coefficients, and we've got three initial conditions. Do the deltas to make it valid for all n, multiply by z^n and sum on n, then do the algebra, and that gives again a ratio of two polynomials.

The degree of the polynomial is equal to the order of the recurrence.

In this case, what happens is that 1 + z^2 is one of the factors of the denominator. And again the numerator cancels, so what do we do with 1 + z^2? Well, we can factor it over the complex numbers and find that A(z) = (1/2)(1 / (1 - iz) + 1 / (1 + iz)), and we can go ahead and expand that to get this representation. It's a_n = (i^n + (-i)^n) / 2, or

you can factor out an i^n. And even though i appears in this solution, when you do the math, i never appears as a member of the sequence, because when n is odd the two terms cancel to 0, and when n is even you alternate between 1 and -1, because i^2 = -1 and i^4 = 1.
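Here's a tiny check of my own (not from the lecture) that the complex expression really produces a real oscillating sequence:

```python
# The expansion from the complex roots: a_n = (i^n + (-i)^n) / 2.
seq = [((1j) ** n + (-1j) ** n) / 2 for n in range(8)]

# The imaginary parts cancel exactly; only 1, 0, -1, 0, ... remains.
print([int(x.real) for x in seq])  # → [1, 0, -1, 0, 1, 0, -1, 0]
```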

So that's a rather strange oscillating sequence, but it comes out immediately from our algorithmic process of finding the sequence corresponding to a given recurrence. It's a nice example, because it shows the

origin of the oscillations that we often see when we're studying algorithms or

combinatorial structures. Those oscillations are modeled in the mathematics, in this case, by just the square root of -1: the square root of -1, squared, is -1, and to the fourth power it's +1, and that's really reflected in this sequence. And it may not be so easy to uncover oscillations without using complex numbers; as we'll see, complex analysis plays a fundamental role in analytic combinatorics when we get into advanced methods. Okay.

So here's a summary, and this is just the math, with unknowns, so that we can state a theorem. It's a kind of complicated-looking theorem, but next time we'll show that we don't really need all this detail. Still, it's worthwhile to fully state the theorem, because the method that we've talked about leads right to a proof of it that's not that abstract. It's just a matter of turning the crank.

So what we know is that if you've got a t-th order linear recurrence with constant coefficients, so that you just have t terms on the right-hand side, the solution is going to be a linear combination of t terms. And it depends on the roots of the polynomial that's induced when you multiply by z to the n and do the algebra: you're always going to get a polynomial, 1 - x1 z - x2 z^2 and so on, in the denominator. And it depends on the multiplicities of those roots. So if you've got r roots, where the i-th root has multiplicity m_i, and the multiplicities add up to t, then for every root the solution has a term with that root to the nth power, plus n times that root to the nth power, and so on, up to n^(m_i - 1) times that root to the nth power, for a total of t terms.

And again, I'm not expecting people to follow every detail of this, but you can get the idea that we can actually write down a full solution to any such linear recurrence, and generating functions give us the proof. And the constants involved can always be determined from the initial conditions, using partial fractions and solving simultaneous equations.

And again, this is all automatic, and people have implemented this process in symbolic algebra systems, so actually, nowadays, you can type in recurrences and get the solution. And don't forget, your solution might involve periodic behavior that's introduced by the complex roots, so it might not even be possible to prove that a recurrence like this converges to a limit. But it's all very straightforward and well understood mathematically.
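For example, sympy's rsolve will do the whole crank-turning for the first recurrence we solved. This is my own sketch; the printed form of the answer may differ by version, so the script checks equivalence symbolically:

```python
import sympy as sp

n = sp.symbols('n', integer=True)
a = sp.Function('a')

# a_n = 5*a_{n-1} - 6*a_{n-2}, with a_0 = 0 and a_1 = 1.
sol = sp.rsolve(a(n) - 5*a(n - 1) + 6*a(n - 2), a(n), {a(0): 0, a(1): 1})
print(sol)  # equivalent to 3**n - 2**n
```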

Now, what about the analysis of algorithms?

For quicksort, we had this rather complex recurrence that was our starting point for the analysis of quicksort. Can we use generating functions to solve the quicksort recurrence? And the answer, of course, is that we can. To make life easiest, we'll first

multiply both sides by n, although you can get it done without doing that.

So now we have a recurrence where we're not dividing by N. Then multiply by z to the N and sum; now we sum for N greater than or equal to 1, and that's just because of the C_{k-1} term, to save us a few terms. So that's multiplying by z to the N and summing, and now we've got to look at each one of the terms and see what we have.

What do we have on the left there? Well, if we define the generating function for the sequence of interest to be C(z) = sum of C_N z^N, then the left-hand side is what you get if you differentiate that and multiply by z. So that term is C'(z), once a factor of z all the way through is divided out. The next one is 2z / (1-z)^3, and again that's right out of the table; the factor of z divides out. And the last one is a convolution: it's (1 / (1-z)) times C(z), times 2. I skipped just a very few steps here, involving indices that go to 0 and dividing by z, but you can convince yourself quite easily that the ordinary differential equation C'(z) = 2/(1-z)^3 + 2C(z)/(1-z) is the result of simply evaluating the sums to get the equation that the generating function has to satisfy.

So that's an ordinary differential equation, and it's completely well-defined. Now we need to know about solving differential equations to get this solved, and I don't want to give a course on solving differential equations.

I just point this out as an example for people who do know the subject: it's possible to solve it by first considering the equation without the extra term. What you need to do is solve that simpler problem, and in that case the solution is 1 / (1-z)^2; if we didn't have the extra term, the solution would be 1 / (1-z)^2. If you differentiate that, it's the same thing as multiplying it by 1/(1-z) and by 2. And then, what you do is divide through by that factor, which winds up multiplying by (1-z)^2. And

it is totally analogous to what we did when solving first-order linear recurrences, where we try to find something to multiply by to make the recurrence telescope; this is kind of similar. And this comes out to be a simple equation in terms of the function (1-z)^2 C(z). If you differentiate that, you get this

thing, and that's then equivalent to our equation: the derivative of (1-z)^2 C(z) is 2/(1-z). Now we can just integrate that simple thing, and that shows that C(z) times (1-z)^2 is equal to 2 log(1/(1-z)), and that's the solution. So that's solving the differential equation to get an explicit representation for the generating function. Not too bad.
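For readers who want to check the calculus, here's a sympy verification of my own (not from the lecture) that C(z) = 2 log(1/(1-z)) / (1-z)^2 satisfies the differential equation C'(z) = 2/(1-z)^3 + 2C(z)/(1-z):

```python
import sympy as sp

z = sp.symbols('z')
C = 2 * sp.log(1 / (1 - z)) / (1 - z)**2   # the solution just derived

# The residual of the ODE should simplify to zero.
ode_residual = sp.diff(C, z) - 2/(1 - z)**3 - 2*C/(1 - z)
print(sp.simplify(ode_residual))  # → 0
```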

It's a standard differential equation. And now we want to extract coefficients to find the number of compares taken by quicksort, but that's easily done; we've seen that one, it's the example that we did as an exercise. The answer is C_N = 2(N + 1)(H_{N+1} - 1).
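As a final check (my own, not from the lecture), compare the quicksort recurrence, assuming the standard form C_N = N + 1 + (2/N) * sum of C_{k-1} for k = 1..N with C_0 = 0, against that closed form, using exact rational arithmetic:

```python
from fractions import Fraction

# Quicksort compares: C_N = N + 1 + (2/N) * sum of C_{k-1}, k = 1..N.
C = [Fraction(0)]
for N in range(1, 20):
    C.append(N + 1 + Fraction(2, N) * sum(C[k - 1] for k in range(1, N + 1)))

def H(m):
    """Harmonic number H_m = 1 + 1/2 + ... + 1/m, exactly."""
    return sum(Fraction(1, j) for j in range(1, m + 1))

# Closed form from the coefficient extraction: C_N = 2(N+1)(H_{N+1} - 1).
closed = [2 * (N + 1) * (H(N + 1) - 1) for N in range(20)]
print(C == closed)  # → True
```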

So, OGFs can solve recurrences even as complicated as the quicksort recurrence. There are many other examples of solutions of recurrences using OGFs in the book.