Probability – L05.9 Elementary Properties of Expectation

We now note some elementary properties of expectations.

These are properties that are extremely natural and intuitive, but even so, they are worth recording.

The first property is the following.

If you have a random variable which is non-negative, then its expected value is also non-negative.

What does it mean that the random variable is non-negative? What it means is that for all possible outcomes of the experiment, no matter what the outcome is, the associated numerical value of the random variable is a non-negative number.

What’s the implication of this? When we calculate an expectation we’re adding over all the possible numerical values of the random variable.

All the possible numerical values of the random variable under this assumption are non-negative.

Probabilities are also non-negative.

So we have a sum of non-negative entries and therefore, the expected value is also going to be non-negative.
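The argument above can be written out symbolically; here we use the standard PMF notation p_X(x), which is an assumption since the transcript refers to formulas shown on the slides:

```latex
\text{If } X \ge 0 \text{, then } \quad
\mathbf{E}[X] \;=\; \sum_x x\, p_X(x) \;\ge\; 0 ,
```

because every term x p_X(x) is a product of a non-negative value x and a non-negative probability p_X(x).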

The next property is a generalization of this.

Consider now a random variable that has the property that no matter what the outcome of the experiment is, the value of this random variable lies in the range between two constants, a and b.

In this case, we argue as follows.

The expected value, by definition, is a sum, over all possible values of the random variable, of each value times its probability.

Now, the possible numerical values of the random variable are all of them at least as large as a, so replacing each value by a gives us an inequality of this type.

Then, we pull a factor of a outside of the summation.

And finally, we recall that the sum of a PMF over all possible values of little x is equal to 1.

Why is that the case? Well, these are the probabilities for the different numerical values of the random variable.

The sum of the probabilities of all the possible numerical values has to be equal to 1, because that exhausts all the possibilities.

So we obtain a times 1, which is a.

So, what we have proved is that the expected value is at least as large as a.

You can use a symmetrical argument where the inequalities will go the opposite way and where a’s will be replaced by b’s, to prove the second inequality, as well.
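The chain of steps just described can be sketched as follows, again writing the PMF as p_X(x) (an assumption, matching standard notation rather than the slides themselves):

```latex
\mathbf{E}[X]
\;=\; \sum_x x\, p_X(x)
\;\ge\; \sum_x a\, p_X(x)
\;=\; a \sum_x p_X(x)
\;=\; a \cdot 1
\;=\; a ,
```

and the symmetrical argument, with the inequalities reversed and a replaced by b, yields E[X] ≤ b.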

The last fact we want to take note of is the following.

If we have a constant and we take its expected value, we obtain the same constant.

What does that mean? We have only been talking about expected values of random variables.

What does it mean to take the expected value of a constant? Well, as we discussed earlier, we can think of a constant as being a random variable of a very special type.

A random variable whose PMF takes this form.

This random variable can take only a single value and the probability of that single value is equal to 1.

This means that in the formula for the expected value there’s going to be only one term in this summation, and that term is going to be c times the probability that our random variable takes the value c.

Now, that probability is equal to 1, and we’re left with c.
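In the same notation, the degenerate PMF and the resulting one-term sum look like this (a sketch, assuming the standard p_X(x) notation):

```latex
p_X(c) = 1
\quad\Longrightarrow\quad
\mathbf{E}[X] \;=\; \sum_x x\, p_X(x) \;=\; c \cdot p_X(c) \;=\; c \cdot 1 \;=\; c .
```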

So this equality makes sense, of course, as long as you understand that a constant can also be viewed as a random variable of a very degenerate type.

Now, intuitively, of course, it’s certainly clear what this is saying.

That if a certain quantity is always equal to c, then on the average, it will also be equal to c.
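As a small numerical sketch, not part of the lecture itself, all three properties can be checked on a concrete PMF in Python; the particular values and probabilities below are hypothetical, chosen only for illustration:

```python
def expectation(pmf):
    """Expected value of a discrete random variable,
    given its PMF as a {value: probability} dict."""
    return sum(x * p for x, p in pmf.items())

# A hypothetical PMF on the non-negative values {2, 3, 5};
# the probabilities sum to 1.
pmf = {2: 0.5, 3: 0.3, 5: 0.2}

mean = expectation(pmf)  # 2*0.5 + 3*0.3 + 5*0.2 = 2.9

# Property 1: X >= 0 implies E[X] >= 0.
assert mean >= 0

# Property 2: a <= X <= b implies a <= E[X] <= b (here a = 2, b = 5).
assert min(pmf) <= mean <= max(pmf)

# Property 3: a constant c, viewed as a degenerate random variable
# taking the single value c with probability 1, has E[c] = c.
assert expectation({7: 1.0}) == 7
```

Running the script raises no assertion errors, confirming the three properties on this example.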
