# Course Diary

• On July 2 we covered the chi-squared random variable in more detail. I worked through the example of how to use it for "goodness of fit" of a model, and then used it in population sampling (though I didn't go through in detail how to derive the chi-square test statistic); here is an old note I wrote on these examples. I also mentioned that a chi-squared random variable with 2 degrees of freedom has an exponential distribution. First I showed this via the formula relating the chi-square to the gamma distribution; later in the lecture I showed it again using integration over 2D random variables. Along the way I defined the joint pdf of a 2D random variable, the marginal pdfs, and independent random variables; I went through a few examples and presented a puzzle about a 2D random variable whose pdf is positive on a triangle. Then I used some integration to show directly that a chi-squared random variable with 2 degrees of freedom has an exponential distribution (this involves a 2D integral and a change to polar coordinates). I also explained how to find the distribution of a function U of a 2D random variable (X, Y) -- U itself is 1D, while the variables it depends on form a 2D r.v. Lastly, I talked about the expectation and variance of a function of a 2D r.v. Be sure to do the relevant homework in the book!
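The chi-squared-with-2-degrees-of-freedom fact can also be checked numerically. Here is a minimal sketch of my own (not part of the course materials): it simulates Z1² + Z2² for independent standard normals, which is chi-squared with 2 df, and compares empirical tail frequencies against the Exponential survival function e^(-x/2).

```python
import math
import random

random.seed(0)
n = 100_000

# Z1^2 + Z2^2 for independent standard normals Z1, Z2
# is chi-squared with 2 degrees of freedom.
samples = [random.gauss(0, 1) ** 2 + random.gauss(0, 1) ** 2 for _ in range(n)]

# If chi-squared(2) really is exponential with mean 2, then
# P(X > x) = exp(-x/2); compare that to the simulated tail frequencies.
for x in (0.5, 1.0, 2.0, 4.0):
    empirical = sum(s > x for s in samples) / n
    print(f"x={x}: empirical tail {empirical:.4f} vs exp(-x/2) = {math.exp(-x / 2):.4f}")
```

With 100,000 samples the empirical tails should match e^(-x/2) to within a few thousandths, which is consistent with the polar-coordinates derivation from the lecture.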

• On July 9 I started the lecture with conditional probability density functions, conditional expectation, and conditional variance. I then stated and proved the tower property of conditional expectation. Next, I talked about the independence of two or more random variables, and showed that if X1, ..., Xk are independent random variables and u1, ..., uk are any functions, then E(u1(X1) ... uk(Xk)) = E(u1(X1)) ... E(uk(Xk)) -- that is, the expectation of a product is the product of the expectations, WHENEVER THE RANDOM VARIABLES ARE INDEPENDENT. I used this to show that the moment generating function of a sum of independent random variables X1 + ... + Xk is the product of the moment generating functions of the individual variables. I mentioned in passing that this will be useful when we try to prove the "Central Limit Theorem", one of the most important theorems in probability theory (and mathematics, for that matter). After that I showed that if X1, ..., Xk are independent, then V(X1 + ... + Xk) = V(X1) + ... + V(Xk); that is, the variance of a sum of INDEPENDENT r.v.'s is the sum of the variances. Finally, I used this to prove the "Weak Law of Large Numbers".
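The MGF product property can be spot-checked by Monte Carlo. Below is a small sketch of my own (the choice of Uniform(0,1) variables and of t = 0.3 is mine, not from the lecture): it estimates E(e^(t(X+Y))) for independent X, Y and compares it to the product of the individual MGF estimates, and to the exact value ((e^t - 1)/t)² for Uniform(0,1).

```python
import math
import random

random.seed(2)
t = 0.3
n = 200_000

# Independent X, Y ~ Uniform(0,1).
xs = [random.random() for _ in range(n)]
ys = [random.random() for _ in range(n)]

# Estimate the individual MGFs M_X(t) = E(e^{tX}) and M_Y(t).
m_x = sum(math.exp(t * x) for x in xs) / n
m_y = sum(math.exp(t * y) for y in ys) / n

# Estimate the MGF of the sum, M_{X+Y}(t) = E(e^{t(X+Y)}).
m_sum = sum(math.exp(t * (x + y)) for x, y in zip(xs, ys)) / n

# For independent variables these should agree: M_{X+Y}(t) = M_X(t) M_Y(t).
# Exact check: M_U(t) = (e^t - 1)/t for Uniform(0,1), so the product squares it.
exact = ((math.exp(t) - 1) / t) ** 2
print(f"M_X * M_Y  = {m_x * m_y:.5f}")
print(f"M_(X+Y)    = {m_sum:.5f}")
print(f"exact      = {exact:.5f}")
```

The two Monte Carlo estimates and the exact value should agree to two or three decimal places; with dependent X and Y the product formula would generally fail.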
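The variance-additivity fact and the Weak Law of Large Numbers can be seen together in a short simulation. This is my own illustrative sketch (Uniform(0,1) draws and the tolerance eps = 0.05 are assumptions, not from the lecture): since the variance of a sum of independent variables is the sum of the variances, the sample mean of n Uniform(0,1) draws has variance 1/(12n), so the fraction of trials where the mean strays from 1/2 should shrink as n grows.

```python
import random

random.seed(1)

def sample_mean(n):
    # Average of n independent Uniform(0,1) draws; E = 1/2 and,
    # by variance additivity for independent r.v.'s, V = 1/(12 n).
    return sum(random.random() for _ in range(n)) / n

# Weak law of large numbers: P(|mean_n - 1/2| > eps) -> 0 as n -> infinity.
eps = 0.05
far = {}
for n in (10, 100, 10_000):
    trials = [sample_mean(n) for _ in range(1_000)]
    far[n] = sum(abs(m - 0.5) > eps for m in trials) / len(trials)
    print(f"n={n}: fraction of trials with |mean - 1/2| > {eps}: {far[n]:.3f}")
```

The printed fractions drop sharply with n (for n = 10,000 essentially no trial misses by more than 0.05), which is exactly the convergence in probability the theorem asserts.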