0:00

So, now we have described the merge sort algorithm, and again we illustrated it on this list. We took the list and kept dividing it until we got to the leaves. For each one of those there was nothing to do; they were already sorted. Then we go back up and do the merges. As I said before we started describing the algorithm, we are going to get an O(n log n) algorithm.

So what is the efficiency? Why is the time complexity of this algorithm O(n log n)? This is what we need to analyse now.

Now, notice that this algorithm is different from the algorithms we have seen so far. This algorithm is a recursive algorithm. I say: I want to sort this list. I don't know how to do that directly, so I divide it into two sub-lists and sort them. So, to sort the big list, divide it into two halves and then sort the two halves.

But I'm using the same term again. I'm saying: to sort this big list, sort the two halves. This is not really a separate kind of sorting, because I just invoke the same function again, and to sort each half, it in turn sorts its two halves. So this is what we call recursion, right? I'm calling the function multiple times, but every time I call it, I'm calling it on a smaller instance of the problem.

So, I do merge sort on this list; it will do merge sort on each half; each of those does merge sort on its two halves, and so on. This function, merge sort, is going to be called once on the whole list, once on each half, once on each quarter, and again on each one of those. So it's going to be called many times. At the end, when it is called on a list of one item, it will look at the list and say: okay, it has one item, done, it is sorted. And then after that, the merge comes into the picture and starts doing the merging. So no more sorting now, it's just merging them. Right?
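The recursive structure just described can be sketched as follows (a minimal Python sketch; the names `merge_sort` and the in-place merge loop are illustrative, not from the lecture itself):

```python
def merge_sort(L):
    """Sort list L by recursively sorting its two halves and merging them."""
    if len(L) <= 1:                  # base case: one item (or none) is already sorted
        return L
    mid = len(L) // 2
    left = merge_sort(L[:mid])       # recursively sort the first half
    right = merge_sort(L[mid:])      # recursively sort the second half
    # merge the two sorted halves into one sorted list
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])          # one of these two is empty by now
    merged.extend(right[j:])
    return merged
```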

So what is the running time? Since this is a recursive algorithm, the technique for analysing its running time is a bit different now, right? So let's look at it. When we are doing merge sort of a list of n items, say the list L has n items in positions 0 to n minus 1, merge sort is going to be doing three things. It's going to do merge sort on the first half of L, and merge sort on the second half.

Â 3:06

And then it's going to do the merge of these two. Once I call merge sort on the first half, I get it sorted; once I call merge sort on the second half, I get it sorted; then I do the merge. So to do merge sort on a list of size n, I'm doing merge sort on a list of size n over 2, merge sort on another list of size n over 2, and then I'm going to merge the two.

Â 3:43

Remember, these were A and B, and I copied them into C, which is of size n. Right? The merge is going to look at the first two items, compare them, copy the smaller, then compare again, copy, and so on. This list C is going to end up with n items: when we began it was empty, and when the merge ended it had n items in it. Since n items are copied into it, we know it does at least a linear amount of work. And in fact it does O(n) amount of work in total, because for each item I'm doing just one comparison; for each element, a constant amount of work. So merging these two lists into C is going to take O(n). So I know that this merge is going to take O(n) amount of work.
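That merge step can be sketched on its own (assuming two already-sorted Python lists `a` and `b`; the function name is illustrative):

```python
def merge(a, b):
    """Merge two sorted lists into one sorted list of len(a) + len(b) items."""
    c = []                           # C starts empty, ends with n items
    i = j = 0
    while i < len(a) and j < len(b):
        # one comparison per item copied: constant work per element, O(n) total
        if a[i] <= b[j]:
            c.append(a[i]); i += 1
        else:
            c.append(b[j]); j += 1
    c.extend(a[i:])                  # whatever remains is already sorted
    c.extend(b[j:])
    return c
```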

The question is, how long does the entire function take? So suppose the running time of my merge sort is T(n): this is a function giving the number of steps that merge sort takes on a list of size n. So what is T(n)? T(n) equals what? I know T(n) equals O(n) for the merge, plus something, because besides the merge we are doing two other things. Right? So it's O(n) for merging the two halves, but how long does it take to do merge sort on one half, and merge sort on the other?

For a recursive algorithm, your first impression was probably: okay, let's repeat the same exercise and ask how long this takes, and this, and keep repeating, adding up these kinds of terms. But for a recursive algorithm there is a much simpler and more elegant technique for analysing this. Which is: I know that the running time of my sort is T(n). I don't know what T(n) is yet, but I know it is some function, and I'm calling it T(n).

Â 5:47

Now, if it takes T(n) to do my sort on a list of size n, each recursive call is exactly the same function, not modified at all, but applied to a list that is half the original size. It is the same function applied to a different instance of the problem: there it was applied to the big list, here it is applied to half. So if sorting the whole list takes T(n), sorting one half is going to take T(n/2). And the other half is the same story, T(n/2). So what we get is 2T(n/2), plus the merge: this is the running time of merge sort.

Now, when I write something like this, it is probably the first time you have seen something like it. It doesn't look like anything we have seen so far. So far we have been talking about running times like O(n squared), O(2 to the n), O(n factorial). But what is this all of a sudden? How does this function grow? Is this good? Is this bad? I said in the beginning that it is going to be O(n log n), but you don't see n log n here, right? So what we have here, this function T(n) = 2T(n/2) + O(n), is called a recurrence.

And recurrences arise when we analyse the running time of recursive algorithms. Because when we want to describe the running time T(n) of a recursive algorithm, the algorithm is calling the same function, itself, so we will see the same function T, but on different parameters, okay? So this is a recurrence. And when we deal with recursive algorithms, we are going to see these recurrences arising to describe the running time.

Now, while this does not give us the sort of function we are used to seeing, one that gives us an idea of how good or bad the algorithm is, there is a technique for taking this recurrence and writing it as an explicit formula. So I can take this and show that it in fact evaluates to O(n log n). But when we have a recurrence like this, before we are able to get the explicit formula for T(n), we also have to put in a rule for the base case. Because look at this: T(n) is 2T(n/2) plus O(n).

Â 8:30

Then I can say this is 2T(n/2) plus O(n). But what is T(n/2)? It is 2T(n/4) plus O(n/2). And I can keep repeating this exercise and getting all these copies. But where do I stop? Where do I stop?
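Written out, the repeated expansion just described looks like this (writing the O(n) term as cn for some constant c, purely for illustration):

```latex
\begin{aligned}
T(n) &= 2\,T\!\left(\tfrac{n}{2}\right) + cn \\
     &= 2\left(2\,T\!\left(\tfrac{n}{4}\right) + c\tfrac{n}{2}\right) + cn
      = 4\,T\!\left(\tfrac{n}{4}\right) + 2cn \\
     &= 8\,T\!\left(\tfrac{n}{8}\right) + 3cn \\
     &\;\;\vdots
\end{aligned}
```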

So, where I stop is exactly where the algorithm stops. How did we define merge sort? We defined merge sort to stop when it gets to a list of size one. When it gets a list of size one, all it does is return, because it doesn't do any more work. So this is why we write a base case here, and I have to define it. In our case it's for a list of size one: on a list of size one, this algorithm does a constant amount of work, which is O(1).
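Putting the recursive step and the base case together, the full recurrence for merge sort reads:

```latex
T(n) = 2\,T\!\left(\tfrac{n}{2}\right) + O(n), \qquad T(1) = O(1)
```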

Â 9:23

All right. So, when we write our recurrence, we have to write the recurrence formula, and we have to write the formula for the base case. The base case should not be a recurrence; it should again be a specific, explicit formula for how long the algorithm takes when we reach the simplest instance of the problem. So for merge sort, this is really what we get; this is the running time of merge sort in terms of the recurrence. The question again is how this function grows and why it is O(n log n), and this is what we are going to talk about next.
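As a quick empirical sanity check of that claim (a sketch, not part of the lecture): we can count the comparisons merge sort actually performs and check they stay within n times ceil(log2 n), the shape of the O(n log n) bound.

```python
import math

def merge_sort_count(L):
    """Return (sorted copy of L, number of comparisons performed)."""
    if len(L) <= 1:
        return list(L), 0                 # base case: constant work, no comparisons
    mid = len(L) // 2
    left, cl = merge_sort_count(L[:mid])
    right, cr = merge_sort_count(L[mid:])
    merged, i, j, comps = [], 0, 0, 0
    while i < len(left) and j < len(right):
        comps += 1                        # one comparison per copied item: at most n per merge
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged, cl + cr + comps

# The comparison count stays below n * ceil(log2 n) for each size tried.
for n in (2, 16, 100, 1000):
    data = list(range(n))[::-1]           # reversed input
    result, comps = merge_sort_count(data)
    assert result == sorted(data)
    assert comps <= n * math.ceil(math.log2(n))
```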
