parameters, right?

Where data matrix is

this thing here, and parameters

is this thing here, and this

times is a matrix vector multiplication.

And if you just do this then

this variable prediction - sorry

for my bad handwriting - is

just this one line of code,

assuming you have

an appropriate library to do matrix-vector multiplication.

If you just do this,

then prediction becomes this

4 by 1 dimensional vector, on

the right, that just gives you all the predicted prices.
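As a sketch of that one line in Python with NumPy (the house sizes and parameter values below are illustrative, not taken from the lecture):

```python
import numpy as np

# Data matrix: one row per house, a column of ones for the
# intercept term, then the house size (illustrative values).
X = np.array([[1.0, 2104.0],
              [1.0, 1416.0],
              [1.0, 1534.0],
              [1.0,  852.0]])

# Parameter vector: [intercept, slope] (illustrative values).
theta = np.array([-40.0, 0.25])

# One line: the matrix-vector product gives all four predictions.
prediction = X @ theta
print(prediction)
```

The result is the 4 by 1 vector of predicted prices, one entry per house.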

And your alternative to doing

this as a matrix vector multiplication

would be to write something like,

you know, for i equals 1 to 4, right?

And if you have, say, a thousand houses,

it would be for i equals 1 to a thousand, or whatever.

And then you have to write the

prediction, you know, prediction of i equals,

and then do a bunch

more work over there. And it

turns out that when you

have a large number of houses,

if you're trying to predict the prices

of not just four but maybe

of a thousand houses then

it turns out that when

you implement this in the

computer, it matters how you write it, in any of the various languages.

This is not only true for

Octave, but for C++,

Java, or Python, and other high-level languages as well.

It turns out that, by writing

code in this style on the

left, it allows you to

not only simplify the

code, because now you're just

writing one line of code

rather than a for loop with a bunch of things inside.

But, for subtle reasons, that we

will see later, it turns

out to be much more computationally

efficient to make predictions

on all of the prices of

all of your houses doing it

the way on the left than the

way on the right than if you were to write your own formula.
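A sketch of what that for-loop version might look like, again with illustrative sizes and parameter values (none of these numbers come from the lecture itself):

```python
import numpy as np

# Illustrative data: column of ones for the intercept, then house sizes.
X = np.array([[1.0, 2104.0],
              [1.0, 1416.0],
              [1.0, 1534.0],
              [1.0,  852.0]])
theta = np.array([-40.0, 0.25])  # illustrative parameters

# Explicit for loop: compute each house's prediction one at a time.
prediction = np.zeros(X.shape[0])
for i in range(X.shape[0]):
    prediction[i] = theta[0] + theta[1] * X[i, 1]

# Same answers as the one-line version, X @ theta,
# but spread across several lines of hand-written looping.
```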

I'll say more about this

later when we talk about

vectorization. But, by

posing a prediction this way, you

get not only a simpler piece

of code, but a more efficient one.
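As a rough sketch of the efficiency point with a thousand houses (hypothetical random data; the exact speedup depends on the language and library):

```python
import time
import numpy as np

rng = np.random.default_rng(0)
n = 1000  # a thousand houses
# Column of ones for the intercept, then random house sizes.
X = np.column_stack([np.ones(n), rng.uniform(500.0, 3000.0, n)])
theta = np.array([-40.0, 0.25])  # illustrative parameters

t0 = time.perf_counter()
vectorized = X @ theta            # one matrix-vector product
t1 = time.perf_counter()

looped = np.zeros(n)
for i in range(n):                # explicit loop over every house
    looped[i] = theta[0] + theta[1] * X[i, 1]
t2 = time.perf_counter()

# Both compute identical predictions; the vectorized call is typically
# much faster because the multiplication runs in optimized library code.
print(np.allclose(vectorized, looped))
```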

So, that's it for

matrix vector multiplication and we'll

make good use of these sorts

of operations as we develop

linear regression and other models further.

But, in the next video we're

going to take this and generalize this

to the case of matrix matrix multiplication.