"Calculus Two: Sequences and Series" introduces sequences, infinite series, convergence tests, and Taylor series. This course is not content with merely getting answers; it aims to know not only that things are true, but why they are true. Note: enrollment in this course closes on March 30, 2018. If you enroll before that date, you will have access to the course until September 2018.


From the course by The Ohio State University

Calculus Two: Sequences and Series (Chinese version)



From the lesson

Taylor Series

In this final module, we introduce Taylor series. Rather than starting with a power series and finding a nicer description of the function it represents, we will start with a function and try to find a power series for it. There is no guarantee of success! But incredibly, many of our favorite functions do have power series representations. Sometimes, dreams come true. And as with many dreams, the less said the better. I hope this brief introduction to Taylor series whets your appetite for learning more calculus.

- Jim Fowler, PhD, Professor of Mathematics

Approximation.

[SOUND] Way back in Calculus 1,

we were finding linear approximations for

a function near a point a.

Well, how did that go?

Well, it looked like this.

f(x) was approximately f(a),

writing down a linear approximation around the point a,

+ the derivative of a times how much the input has to

change to go from a to x, so times (x-a).

[LAUGH] Why does this make sense?

Well, the derivative is recording infinitesimally how the output changes

as the input changes.

So if I take the ratio of output change to input change and

multiply by the input change, this is approximately the output change.

And I'm adding it to the output at the point a, and

that will give me about the value of the function at x.

But let me put a little different spin on this.

Instead of thinking about what the derivative means and some ratio between

output values and input values, let's just think a little more naively.

Let's just think that I'm trying to write down a function, which has the same value

as my function f at a, and which has the same derivative as my function f at a.

We'll talk about this a little more precisely.

Let me give a name to this linear approximation, let's call that g(x).

So g(x) will be f(a) + f'(a) times (x-a).

Now what do I know about this function g?

Well, what's the value of g(a)?

Right, what happens if I plug in a for x?

Well then I've got (a-a), this is 0 in that case, so this whole term is 0.

So g(a) is just this, f(a).

And what's g'(a)?

Well, it's just the derivative of g, right?

Well, it's the derivative of this constant,

which is 0, + the derivative of this.

Well, this is a constant, so it's f'(a) times the derivative of this,

but what's the derivative of this with respect to x?

It is just 1.

So the derivative of g regardless of what x is, it's just f'(a).

So look, the function g here

is a function whose value at a is equal to the value of f(a) and

whose derivative at a is equal to f'(a).
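This matching of value and derivative at a can be checked numerically. A minimal Python sketch, using f(x) = e^x and a = 1 as a hypothetical example (neither appears in the lecture):

```python
import math

a = 1.0

def f(x):
    return math.exp(x)  # sample function; any differentiable f works

fa = f(a)          # f(a)
fpa = math.exp(a)  # f'(a); the derivative of e^x is e^x

def g(x):
    # linear approximation: g(x) = f(a) + f'(a) * (x - a)
    return fa + fpa * (x - a)

print(g(a) == f(a))                 # True: g matches f's value at a
print(abs(g(1.1) - f(1.1)) < 0.02)  # True: g approximates f nearby
```

The same sketch works for any f: only the two numbers f(a) and f'(a) are needed to build g.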

I can do better.

So I wanted to find a new function, I'm also going to call it g, but

it's going to be defined this way.

So g(x) will be defined as follows,

g(x) = f(a) + f'(a) times (x-a).

So this is just a linear approximation to f(a),

but I'm going to add an extra term,

+ f''(a)/2, times (x-a) squared.

Why is that a good idea?

When I write it out like this,

this new function g has three really great properties.

First of all, g(a) = f(a).

If I plug in an a for x, that kills this term and this term.

All that I'm left with is just f(a).

Also, g'(a) = f'(a).

Now, that would have been true even if I didn't have this extra term.

But by including this extra term,

it also turns out that g''(a) = f''(a).

I mean, actually, the second derivative of g at any point is equal to f''(a).

But in particular, this is true.

So, I've written down a function that not only has the same value as f does at a,

not only has the same derivative as f does at a, but

also has the same second derivative as f does at a.
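One way to check the second-derivative claim without redoing the algebra is with finite differences. A Python sketch, using f(x) = cos(x) and a = 0.5 as a hypothetical example:

```python
import math

a = 0.5
# for f(x) = cos(x): f(a), f'(a), f''(a)
fa, fpa, fppa = math.cos(a), -math.sin(a), -math.cos(a)

def g(x):
    # quadratic approximation: f(a) + f'(a)(x-a) + f''(a)/2 (x-a)^2
    return fa + fpa * (x - a) + (fppa / 2) * (x - a) ** 2

h = 1e-4
g1 = (g(a + h) - g(a - h)) / (2 * h)            # central difference for g'(a)
g2 = (g(a + h) - 2 * g(a) + g(a - h)) / h ** 2  # central difference for g''(a)

print(abs(g(a) - fa) < 1e-12)  # value matches f(a)
print(abs(g1 - fpa) < 1e-8)    # g'(a) matches f'(a)
print(abs(g2 - fppa) < 1e-4)   # g''(a) matches f''(a)
```

Because g is a quadratic, both central differences are exact up to floating-point roundoff, so the tolerances are generous.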

And I can just keep on going.

So here we go.

I'm going to define yet another function that I will again call g(x),

and it's going to start out the exact same way.

It's going to start out with f(a)

+ f'(a) times (x-a).

So this is just the linear approximation to f that we're used to.

Then last time, I added just one more term,

going to add that term again, f''(a)/2 times (x-a) squared.

But now I'll add another term, I'm going to add this term,

f'''(a)/6 times (x-a) cubed.

Now the idea here is that this function g is going to turn out to

agree with the function f at a.

It's also going to have the same derivative as the function f at a.

It's also going to have the same second derivative as the function f at a.

And it's going to have the same third derivative as the function f at a.

To see this, all I have to do is differentiate.

Before I differentiate, let me just point out that g(a) is equal to what?

Well, if I plug in a for x, that kills this term, this term,

and this term, these are all 0.

So the only thing that's left is f(a).

Now let's try to differentiate, let's differentiate this once.

What's the derivative of g?

Well, to differentiate g, I'm going to differentiate this.

So the derivative of this constant is 0.

The derivative of this constant times (x-a), well, that's the constant, f'(a),

times the derivative of (x-a), which is 1, plus, what's the derivative of this term?

Well, that's this constant,

f''(a)/2 times the derivative of (x-a) squared,

which is 2 times (x-a) times the derivative of the inside, which is just 1.

All right, and then I gotta add the derivative of this term, well,

that's this constant f'''(a)/6 times the derivative of this,

which is 3 (x-a) squared times the derivative of the inside, which is just 1.

And now, what happens if I plug in a?

What is g'(a)?

Well this term and this term both have an (x-a), so they die.

The only thing that survives is this.

So g'(a) is the same as f'(a).

Now what about the second derivative?

I just gotta differentiate this again.

So what's g''(x)?

Well, I don't need to differentiate the 0, that's just 0.

This is a constant, so that's also 0.

What's the derivative of this?

Well, it's this, which is just f''(a) times the derivative of (x-a), which is 1.

And then what's the derivative of this?

Well, it's this constant, which is f'''(a)/2,

Times the derivative of (x-a) squared,

which is 2 (x-a) times the derivative of the inside, which is just 1.

Now [LAUGH] what happens when I plug in a for x, right?

What's g''(a)?

Well, when I plug in a for x, that kills this term, and

the only thing that survives is this.

So g''(a) is the same as f''(a).

Now what about the third derivative?

Well, then I just gotta differentiate this.

So what's g'''(x)?

Well, it's 0 + 0 + the derivative of this constant is 0, + the derivative

of this term, well, that's this constant times the derivative of (x-a).

Well, this constant is just f'''(a) times the derivative of (x-a), which is just 1.

So g'''(a) =

f'''(a).

So this cubic polynomial not only matches the function's value, but

it also matches the first, second, and third derivatives of the function f at a.
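The pattern above — divide the n-th derivative by n factorial — can be packaged as a small function. A Python sketch (the helper name `taylor_cubic` is mine, not the lecture's):

```python
import math

def taylor_cubic(f0, f1, f2, f3, a):
    """Cubic polynomial matching the given value and first three
    derivatives at a: f0 + f1 (x-a) + f2/2 (x-a)^2 + f3/6 (x-a)^3."""
    def g(x):
        dx = x - a
        return f0 + f1 * dx + (f2 / 2) * dx ** 2 + (f3 / 6) * dx ** 3
    return g

# hypothetical example: f(x) = e^x at a = 0, where every derivative is 1
g = taylor_cubic(1, 1, 1, 1, 0.0)
print(abs(g(0.1) - math.exp(0.1)) < 1e-5)  # True: close to e^0.1 near 0
```

Feeding in a function's value and first three derivatives at a produces the cubic approximation in one step.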

Let me make this more concrete.

Let me apply this to a specific function.

Here's a specific example.

Let's apply this to the function f(x) = sine x.

I'm going to differentiate sine a bunch of times.

So the derivative of sine is cosine.

The second derivative of sine, the derivative of cosine,

is -sine, it's a bit of a joke.

What's the third derivative of sine?

Well, it's the derivative of -sine, that's -cosine x.

Now let's evaluate those at 0.

Well the function's value at 0, right, what's sine of 0?

It's 0.

What's the derivative at 0?

Well, that's cosine of 0, that's 1.

What's the second derivative at 0?

Well, that's -sine of 0, that's 0.

What's the third derivative at 0?

Well, that's -cosine of 0, that's -1.

Now I've got all the pieces ready.

I'm ready to write down a polynomial that has the same value,

the same derivative, the same second derivative, and

the same third derivative as sine does at the point 0.

Well in this case, g(x) is what?

Well it's the function's value at 0, which is 0,

+ the derivative at 0, which is 1, times (x-a),

which is just x, + the second derivative at 0,

which is 0, divided by 2 times (x-a) squared

+ the third derivative of the function at 0,

which is -1, divided by 6 times (x-a) cubed,

and the a is 0 in this case, so it's just x cubed.

I can simplify that a bit.

Well, the 0, I don't have to write that down.

I just get x, this term is just 0, minus x cubed/6.

So there is my cubic polynomial.
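As a sanity check on that arithmetic, here is the cubic x - x cubed/6 compared against math.sin in Python:

```python
import math

def g(x):
    return x - x ** 3 / 6  # cubic Taylor polynomial of sine at 0

# near 0, the cubic hugs the sine curve
for x in (0.0, 0.25, 0.5):
    print(abs(g(x) - math.sin(x)) < 1e-3)  # True each time
```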

So this is pretty great.

I've got this polynomial, and

it's supposed to be a pretty good approximation to sine.

Well let's take a look at a graph of sine to get an idea of

just how good of an approximation this polynomial really is.

Well here's a graph of the function y = sine of x.

It's got that familiar sinusoidal shape.

Let me superimpose on this the graph of the cubic polynomial

that we've been studying.

Here's that cubic polynomial.

This thick green curve is the graph of y = x - x cubed/6.

And remember how I got this polynomial.

This polynomial is rigged, so

that the value of this polynomial at 0 is the same as the value of sine at 0.

The derivative of this polynomial at 0 is the same as the derivative of sine at 0.

Same goes for the second derivative and the third derivative, right,

if I differentiate this thing three times,

I get the same thing as this differentiated three times at 0.

But it's interesting to note that even though I've rigged this

polynomial only to match up with sine at 0, in value and in first,

second, and third derivative, this graph sort of resembles sine.

I mean, this green curve, especially around here,

is a pretty good approximation to the graph of the sine function.

I mean, it's obviously not great out here, but at least in here,

it's looking pretty good.

We can also look at this numerically.

For example, sine (1/2) is approximately,

to four decimal places, 0.4794.

[LAUGH] But what is this approximating function at 1/2?

What's g(1/2)?

Well, that's 1/2 - (1/2) cubed/6.

Well that's 1/2 -, what is this?

That's 1/2 to the 3rd, that's 1/8, over 6, that's 1/2 - 1/48.

Well, instead of writing 1/2, I could write

24/48 - 1/48, well, that's 23/48.

Well, what's 23/48?

That's approximately 0.4792.

That is awfully close to [LAUGH] sine of 1/2.
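The arithmetic with 23/48 can be reproduced exactly with Python's fractions module:

```python
import math
from fractions import Fraction

g_half = Fraction(1, 2) - Fraction(1, 2) ** 3 / 6  # 1/2 - 1/48
print(g_half)                   # 23/48
print(round(float(g_half), 4))  # 0.4792
print(round(math.sin(0.5), 4))  # 0.4794
```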

And whether we're thinking graphically or numerically,

we're seeing something significant here.

Way back in Calculus I, we thought about approximating a function like this,

with linear approximations.

But now we're getting the idea that it'd be even better to approximate f,

not just by a degree 1 polynomial, but by a higher degree polynomial.

And even better than polynomials are power series.

Well, exactly, a power series is a lot like a polynomial that just

keeps on going.

So if a degree 10 polynomial is doing a better job than just a linear

approximation and if a degree 1,000 polynomial would do an even better job,

maybe the power series, if we just kept on going forever, would do such a good job

that the function would actually be equal to some power series.

That is a huge thing to hope for, but sometimes, dreams come true.

[SOUND]
