And if we plot this function as a function of z, what you find is that you get this

curve shown on the lower left of the slide.

And thus, we also see that when z is large, that is,

when theta transpose x is large, that corresponds to a value of z that gives us

a fairly small value, a very, very small contribution to the cost function.

And this kinda explains why, when logistic regression sees a positive example,

with y=1, it tries to set theta transpose x to be very large,

because that corresponds to this term, in the cost function, being small.

Now, to build the support vector machine, here's what we're going to do.

We're gonna take this cost function, this minus log 1 over 1 plus e to negative z,

and modify it a little bit.

Let me take this point 1 over here, and

let me draw the cost function we're going to use.

The new cost function is going to be flat from here on out, and

then we draw something that grows as a straight line,

similar to logistic regression.

But this is going to be a straight line in this portion.

So this curve that I just drew, in purple and

magenta, is a pretty close approximation to

the cost function used by logistic regression.

Except it is now made up of two line segments, there's this flat portion on

the right, and then there's this straight line portion on the left.

And don't worry too much about the slope of the straight line portion.

It doesn't matter that much.

But that's the new cost function we're going to use for when y is equal to one,

and you can imagine it should do something pretty similar to logistic regression.

But it turns out that this will give the support vector machine

computational advantages and give us, later on, an easier optimization problem.
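As a rough sketch of the two curves being compared (not part of the lecture itself; the function names and the slope of the linear piece are assumptions, since, as noted above, the exact slope doesn't matter much), the logistic-regression cost for a positive example and its two-line-segment replacement could be written as:

```python
import math

def logistic_cost_y1(z):
    # Cost contribution of a positive (y = 1) example in logistic regression:
    # -log(1 / (1 + e^{-z})) = log(1 + e^{-z}), where z = theta transpose x.
    return math.log(1.0 + math.exp(-z))

def svm_cost_y1(z, slope=1.0):
    # The SVM-style replacement: flat (zero) from z = 1 on out,
    # then a straight line for z < 1. The slope is a free choice here.
    return max(0.0, slope * (1.0 - z))

# For large z both costs are essentially zero; for very negative z
# both grow roughly linearly in -z.
for z in [-2.0, 0.0, 1.0, 3.0]:
    print(z, round(logistic_cost_y1(z), 4), svm_cost_y1(z))
```

This makes the two properties from the lecture easy to check: both costs become very small when theta transpose x is large, and the replacement is an exact flat line (zero) rather than merely approaching zero.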