In the previous videos, we looked at properties of inner products to compute lengths, angles, and distances. We focused on inner products of finite-dimensional vector spaces. In this video, we will look at two examples of inner products of other types of vectors: inner products of functions and inner products of random variables. The inner products we discussed so far were defined for vectors with a finite number of entries, and we can think of these vectors as discrete functions with a finite number of function values. The concept of an inner product can be generalized to continuous-valued functions as well. The sum over the individual components of a vector then turns into an integral, and the inner product between two functions u and v is defined as the integral over an interval from a to b of u(x) times v(x) dx. As with our usual inner product, we can define norms and orthogonality using this inner product: if that integral evaluates to zero, the functions u and v are orthogonal.

Let's have a look at an example. If we choose u(x) to be sine of x and v(x) to be cosine of x, and we define f(x) to be u(x) times v(x), which is sine of x times cosine of x, then we end up with this function, sine of x times cosine of x. We see that this function is odd, which means that f(-x) = -f(x). If we choose the integration limits to be minus pi and plus pi, then the integral of this product, sine of x times cosine of x, evaluates to zero, and that means that sine and cosine are orthogonal (a quick numerical check of this is sketched below). It actually holds that, if we look at a set of functions, say 1, cosine of x, cosine of 2x, cosine of 3x, and so on, all of these functions are orthogonal to each other if we integrate from minus pi to plus pi.

Another example of defining an inner product between unusual types of objects is random variables, or random vectors. If we have two random variables which are uncorrelated, then we know the following relationship: the variance of x plus y is the variance of x plus the variance of y, where x and y are random variables. Remember that variances are measured in squared units. This looks very much like the Pythagorean theorem for right triangles, which states that c squared equals a squared plus b squared, where a and b are the legs of the triangle and c is its hypotenuse. Let's see whether we can find a geometric interpretation of this variance relation for uncorrelated random variables. Random variables can be considered vectors in a vector space, and we can define inner products to obtain geometric properties of these random variables. If we define the inner product between two random variables x and y to be the covariance between x and y, we see that the covariance is symmetric, positive definite, and linear. Linearity means that the covariance of lambda times x plus y and z, where x, y, and z are random variables and lambda is a real number, is lambda times the covariance between x and z plus the covariance between y and z. The length of a random variable x is then the square root of the covariance of x with itself, which is the square root of the variance of x, and that is the standard deviation of the random variable x. So the standard deviation is the length of a random variable. Therefore, the zero vector is a vector that has no uncertainty, which means its standard deviation is zero.
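Before we turn to angles between random variables, here is the promised numerical check of the function example above. It is a minimal sketch, assuming NumPy and SciPy are available; the helper name inner_product and the default integration limits of minus pi to plus pi are just illustrative choices.

```python
import numpy as np
from scipy.integrate import quad

def inner_product(u, v, a=-np.pi, b=np.pi):
    """Inner product of two functions: the integral of u(x) * v(x) over [a, b]."""
    value, _error = quad(lambda x: u(x) * v(x), a, b)
    return value

# sine and cosine are orthogonal on [-pi, pi]: the integral of sin(x) * cos(x) is 0
print(inner_product(np.sin, np.cos))  # ~0, up to numerical error

# the functions 1, cos(x), cos(2x), cos(3x) are pairwise orthogonal on [-pi, pi]
fs = [lambda x: 1.0] + [lambda x, k=k: np.cos(k * x) for k in range(1, 4)]
for i in range(len(fs)):
    for j in range(i + 1, len(fs)):
        print(i, j, round(inner_product(fs[i], fs[j]), 10))  # all ~0
```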
If we now look at the angle between two random variables, we get the following relationship. The cosine of theta, where theta is the angle between the two random variables, is by definition the inner product between the two random variables divided by the length of the first random variable times the length of the second random variable. If we write this out using the definition of our inner product, we get the covariance between x and y divided by the square root of the product of the variance of x and the variance of y. This evaluates to zero if the covariance between x and y is zero, and that is the case when x and y are uncorrelated. Coming back now to our geometric interpretation, we can replace a with the standard deviation of x, b with the standard deviation of y, and c with the square root of the variance of x plus the variance of y, and this is how we get our geometric interpretation of random variables.

In this video, we looked at inner products of rather unusual objects: functions and random variables. However, even with functions and random variables, the inner product allows us to think about lengths of and angles between these objects. In the case of random variables, we saw that the variance of the sum of two uncorrelated random variables can be geometrically interpreted using the Pythagorean theorem.
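To make the random-variable picture concrete, here is a minimal sketch, assuming NumPy is available and using two independently sampled random variables as stand-ins for uncorrelated x and y; the helper names inner and length are just illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(0.0, 2.0, size=1_000_000)  # random variable x, standard deviation ~2
y = rng.normal(0.0, 3.0, size=1_000_000)  # random variable y, sampled independently of x

def inner(a, b):
    """Inner product of two random variables: their covariance (estimated from samples)."""
    return np.cov(a, b)[0, 1]

def length(a):
    """Length of a random variable: the square root of its variance, i.e. its standard deviation."""
    return np.sqrt(inner(a, a))

# cosine of the angle between x and y: covariance / (std(x) * std(y)), i.e. the correlation
cos_theta = inner(x, y) / (length(x) * length(y))
print(cos_theta)  # ~0, so x and y are (nearly) orthogonal

# Pythagorean relation for uncorrelated random variables: var(x + y) ~ var(x) + var(y)
print(np.var(x + y), np.var(x) + np.var(y))  # both ~13, i.e. 2 squared plus 3 squared
```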