Probability – L07.1 Lecture Overview – Conditioning of Random Variables; Independence of r.v.’s

In this last lecture of the unit, we continue with some of our earlier themes, and then introduce one new notion, the notion of independence of random variables.

We will start by elaborating a bit more on the subject of conditional probability mass functions.

We have already discussed the case where we condition a random variable on an event.

Here we will talk about conditioning a random variable on another random variable, and we will develop yet another version of the total probability and total expectation theorems.

There are no new concepts here, just new notation.

I should say, however, that notation is important, because it guides you on how to think about problems in the most economical way.
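As a preview of that notation (a standard form, spelled out here rather than in the lecture itself): conditioning a discrete random variable X on another random variable Y gives a conditional PMF, and averaging over the values of Y yields the new version of the total expectation theorem:

\[
p_{X \mid Y}(x \mid y) = \mathbf{P}(X = x \mid Y = y), \qquad
\mathbf{E}[X] = \sum_{y} p_Y(y)\, \mathbf{E}[X \mid Y = y].
\]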

The one new concept that we will introduce is the notion of independence of random variables.

It is actually not an entirely new concept.

It is defined more or less the same way as independence of events, and has a similar intuitive interpretation.

Two random variables are independent if information about the value of one of them does not change your model or beliefs about the other.
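In terms of PMFs, this intuition translates into the standard formal definition: X and Y are independent if

\[
p_{X,Y}(x, y) = p_X(x)\, p_Y(y), \qquad \text{for all } x \text{ and } y.
\]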

On the mathematical side, we will see that independence leads to some additional nice properties of means and variances.
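For example, for independent random variables X and Y,

\[
\mathbf{E}[XY] = \mathbf{E}[X]\, \mathbf{E}[Y], \qquad
\mathrm{var}(X + Y) = \mathrm{var}(X) + \mathrm{var}(Y).
\]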

We will conclude this lecture and this unit on discrete random variables by considering a rather difficult problem, the hat problem.

We will see that by being systematic and using some of the tricks that we have learned, we can calculate the mean and variance of a rather complicated random variable.
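For a quick numerical sanity check, here is a minimal simulation sketch (not part of the lecture), assuming the standard formulation of the hat problem: n people throw their hats in a box, each draws one uniformly at random, and we count how many people recover their own hat. For n at least 2, both the mean and the variance of this count come out to 1.

import random

def hat_matches(n):
    """Simulate one round: n people draw hats uniformly at random;
    return how many people get their own hat back."""
    hats = list(range(n))
    random.shuffle(hats)  # a uniformly random assignment of hats to people
    return sum(1 for person, hat in enumerate(hats) if person == hat)

def estimate_mean_var(n, trials=100_000):
    """Monte Carlo estimates of the mean and variance of the match count."""
    samples = [hat_matches(n) for _ in range(trials)]
    mean = sum(samples) / trials
    var = sum((s - mean) ** 2 for s in samples) / trials
    return mean, var

mean, var = estimate_mean_var(n=10)
print(mean, var)  # both should be close to 1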
