This, too, goes to zero, which means that linear growth is faster than logarithmic

growth, even logarithmic growth raised to a power.

Now, putting these together: exponential growth beats polynomial growth, which beats logarithmic or even polylogarithmic growth.

What else is there? Well, there's an entire world, a hierarchy, of functional growth as x goes to infinity. It begins with the constant functions, which don't grow at all, moving up to

logarithmic growth, to linear growth, quadratic growth, cubic growth, all the

way to exponential growth. One question remains.

Is there anything beyond that? What can grow faster than exponential

growth? Well, there are indeed worlds beyond that.

One simple example being factorial growth.

For now, what we want to do is consider the monomials, x to the n, for n bigger

than 0, and look at their growth, both as x goes to infinity and as x

goes to 0. Well, it's not exactly growing there.

It's shrinking. But this is extremely interesting,

because if you look at the hierarchy of functions, as x is going to zero, we see

that linear is bigger than quadratic, is bigger than cubic, is bigger than

quartic, et cetera. That is, x to the n plus 1 is less than x

to the n, as x is going to 0. That is in contradistinction to what

happens as x goes to infinity, where we see that linear growth is much slower

than quadratic growth, is much slower than cubic, et cetera.

That is, x to the n plus 1 is bigger than x to the n, as x is getting larger and

larger. All right.
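This duality can be summarized in a single pair of inequalities. A compact restatement (the derivation is the one-line observation in the note below):

```latex
x^{n+1} < x^{n} \quad \text{as } x \to 0^{+},
\qquad\qquad
x^{n+1} > x^{n} \quad \text{as } x \to \infty .
```

Both follow from writing $x^{n+1} = x \cdot x^n$: the extra factor of $x$ shrinks the function when $0 < x < 1$ and magnifies it when $x > 1$.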

What are we going to do with that? Well, let's take a look at an example.

The limit of 1 minus 2x plus 3x squared over 4 minus 5x plus 6x squared.

Let's first consider the limit as x goes to zero. What does that become?

Well, this one's really simple. As x is going to 0, all of the higher order terms in x are small, negligible, which means that we simply evaluate this limit and get one fourth: the ratio of the lowest order terms in x.

On the other hand, if we look at this same limit as x goes to infinity, we

can't do the same thing. What we have to do is switch things

around a little bit. Factor out the highest order power of x.

In this case, x squared. Cancel that.

And what we see left over is 3 minus something small over 6 minus something small, giving a value of one half in the limit.
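As a quick numerical sanity check (a minimal Python sketch; the sample points are arbitrary), we can evaluate the ratio near 0 and at a large value of x:

```python
# Evaluate (1 - 2x + 3x^2) / (4 - 5x + 6x^2) near both limits.
def f(x):
    return (1 - 2*x + 3*x**2) / (4 - 5*x + 6*x**2)

# Near x = 0 the lowest-order terms dominate: the ratio approaches 1/4.
print(f(1e-8))
# As x grows, the highest-order terms dominate: the ratio approaches 3/6 = 1/2.
print(f(1e8))
```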

Now, if we compare these two, we see that as x is getting small, it's the lowest

order terms that dominate. Whereas, if x is going to infinity, it's

the highest order terms that dominate. This duality is extremely important and

can cause confusion, so always pay attention to which way the limit is

going. What we really need is a good language

for controlling and describing this sort of growth.

This is going to be called big O and it has two forms.

One as x goes to infinity, the other as x is going to 0.

This is an extremely important and subtle definition.

You're going to want to pay attention to this.

We say that a function f is in big O of x to the n if, in absolute value, f of x is

less than some constant times x to the n. Now, we're not saying exactly what that

constant is, it's just some constant. And we're just getting an upper bound, we

don't actually care about the value of C; we just want to control the growth.
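Written out symbolically, one common phrasing of the definition is:

```latex
f \in O(x^{n})
\quad\Longleftrightarrow\quad
|f(x)| \le C\,x^{n}
\quad \text{for some constant } C > 0,
```

with the inequality required to hold for all $x > 0$ sufficiently close to the limit in question: all small $x$ in one version, all large $x$ in the other.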

Now, there are two versions of this. One, we can say that this inequality has

to hold as x goes to zero, in which case, big O of x to the n consists of all

functions that approach zero, at least as fast as x to the n.

On the other hand, if we look at growth as x goes to infinity, then big O of x to

the n consists of those functions that approach infinity no faster than x to the

n. In both cases, this is an upper bound.

This language replaces the "higher order terms" that we have used in the past.

For example, if we look at the expansion of arctan of x.

What is that? The first term in the Taylor series is x.

All of the other terms are of higher order in x, specifically, we could say,

big O of x squared. Or, more specifically, we could say big O

of x cubed. That is a stronger statement.

It allows us to be more precise. Or, if we want to expand further, we

could say that arctan of x is x minus x cubed over 3 plus big O of x to the 5th.
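Numerically, membership in big O shows up as a bounded ratio. A minimal Python sketch (the test point x = 0.01 is arbitrary):

```python
import math

x = 0.01
# arctan x - x is the remainder after one term; it is in O(x^3),
# so the ratio |arctan x - x| / x^3 stays bounded (near 1/3 for small x).
r1 = abs(math.atan(x) - x) / x**3
# The remainder after two terms is in O(x^5); this ratio is near 1/5.
r2 = abs(math.atan(x) - (x - x**3 / 3)) / x**5
print(r1, r2)
```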

Likewise, as x goes to infinity, we could look at a function like x times square

root of 5 plus 3x plus x squared. This is a complicated looking function.

But really it's just x squared plus some other stuff in the limit as x goes to

infinity. That other stuff is itself going to

infinity, but only at a linear rate. It is in big O of x.

If we wanted to be more specific, we could use the binomial series, to say

that this function is x squared plus 3 halves x plus something in big O of 1.

That is, something bounded, something in big O of x to the zero.
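A quick numerical check of that claim (a Python sketch; sample points arbitrary): the difference between the function and x squared plus 3 halves x should remain bounded as x grows.

```python
import math

def f(x):
    return x * math.sqrt(5 + 3*x + x**2)

# f(x) - x^2 - (3/2)x stays bounded (it is in big O of 1) as x grows,
# consistent with the binomial-series expansion described above.
for x in [1e2, 1e4, 1e6]:
    print(x, f(x) - x**2 - 1.5*x)
```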

More generally, instead of talking about big O of x to the n, we could talk about

big O of some other function, g of x, if this same inequality holds for some

constant C. Let's look at a few examples.

In the limit, as x goes to zero, the function 5x plus 3x squared is in big O

of x, but it's not in big O of x squared. Sine of x is in big O of x, but not in

big O of x squared. Log of 1 plus x minus x, as you could see

from Taylor expanding, is in big O of x squared, but not in big O of x cubed.
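These memberships can be checked numerically as well. A Python sketch (using log1p for accuracy near 0; the sample points are arbitrary):

```python
import math

# As x -> 0, f in O(x^n) means the ratio |f(x)| / x^n stays bounded.
for x in [1e-2, 1e-4, 1e-6]:
    print(abs(math.sin(x)) / x)               # bounded near 1: sin x is in O(x)
    print(abs(math.log1p(x) - x) / x**2)      # bounded near 1/2: in O(x^2)
    print(abs(math.log1p(x) - x) / x**3)      # grows without bound: not in O(x^3)
```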

1 minus cosine of x squared, that is, 1 minus cos(x²), is in big O of x to the fourth, but not in big O of x to the fifth. Square root of x is not in big O of x to the n for any n bigger than one half. It does not go to zero very quickly at

all. On the other hand, e to the minus 1 over

x squared goes to 0 very quickly; it's in big O of x to the n for all

positive n. As x is going to infinity, we have a

number of interesting examples as well. Arctan of x is in big O of 1.

It's bounded. That means it's also in big O of x to the

n for any positive n. x times square root of 1 plus x squared

grows quadratically at most, but it is not in big O of x to the three halves.

The log of the hyperbolic sine of x, well, is really in big O of x.

It's got linear growth at most. It's not in big O of log of x.

It grows too fast for that. Hyperbolic cosine of x is in big O of e

to the x. It has exponential growth, but not

polynomial growth. Log of x to the 5th can be simplified,

and we see that it's really in big O of log x, as well as big O of x to the n,

for any positive n. Lastly, x to the x is not in big O of

constant to the x for any constant. And it's not in big O of x factorial.

This is even super-factorial growth. Now, there is a huge subject of asymptotic

analysis associated with big O. It has certain rules.

The ones that you need to know for this course are the following.

If you take something in big O of x to the m and multiply it by something in big O of x to the n, you're going to get something in big O of x to the m plus n.

Just like multiplying monomials together. This is true whether you're going to zero or to infinity. On the other hand, if you add two functions together, something that has x to the m growth and something that has x to the n growth, then there's a difference. As x goes to 0, your resulting function is in big O of x to the minimum of m and n.
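In symbols, these rules read as follows (the x going to infinity counterpart of the sum rule is stated as well; it follows from the same duality, since at infinity the highest order term dominates):

```latex
O(x^{m}) \cdot O(x^{n}) \subseteq O(x^{m+n})
\qquad (x \to 0 \text{ or } x \to \infty),
```

```latex
O(x^{m}) + O(x^{n}) \subseteq O\!\left(x^{\min(m,n)}\right) \;\; (x \to 0),
\qquad
O(x^{m}) + O(x^{n}) \subseteq O\!\left(x^{\max(m,n)}\right) \;\; (x \to \infty).
```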