Probability – L02.1 Lecture Overview

Suppose I look at the registry of residents of my town and pick a person at random. What is the probability that this person is under 18 years of age?

The answer is about 25%. Suppose now that I tell you that this person is married. Will you give the same answer? Of course not.

The probability of being less than 18 years old is now much smaller. What happened here?

We started with some initial probabilities that reflect what we know or believe about the world. But we then acquired some additional knowledge, some new evidence, for example, about this person's family situation.

This new knowledge should cause our beliefs to change, and the original probabilities must be replaced with new probabilities that take into account the new information. These revised probabilities are what we call conditional probabilities.
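The belief revision described above can be sketched numerically. Below is a minimal illustration with made-up registry counts (the lecture only states the 25% figure; the other numbers here are hypothetical), using the counting form of the conditional probability: out of those who are married, what fraction is under 18?

```python
# Hypothetical town registry counts (illustrative numbers only;
# the lecture gives only the 25% initial figure).
total = 10_000
under_18 = 2_500
married = 4_000
married_and_under_18 = 10

# Initial (unconditional) probability of being under 18.
p_under_18 = under_18 / total  # 0.25

# Revised probability given the new evidence "married":
# restrict attention to the married residents and count again.
p_under_18_given_married = married_and_under_18 / married  # 0.0025

print(p_under_18)                # 0.25
print(p_under_18_given_married)  # 0.0025
```

As the lecture says, the revised probability is much smaller: conditioning simply replaces the whole population by the subpopulation consistent with the evidence.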

And this is the subject of this lecture. We will start with a formal definition of conditional probabilities together with the motivation behind this particular definition.

We will then proceed to develop three tools that rely on conditional probabilities, including the Bayes rule, which provides a systematic way for incorporating new evidence into a probability model.

The three tools that we introduce in this lecture involve very simple and elementary mathematical formulas, yet they encapsulate some very powerful ideas.

It is not an exaggeration to say that much of this class will revolve around the repeated application of variations of these three tools to increasingly complicated situations.

In particular, the Bayes rule is the foundation for the field of inference. It is a guide on how to process data and make inferences about unobserved quantities or phenomena.
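To make the inference idea concrete, here is a minimal sketch of the Bayes rule on a toy problem of my own choosing (not from the lecture): an unobserved quantity, which of two coins was picked, is inferred from observed data, a single flip that came up heads. All numbers are assumed for illustration.

```python
# Prior beliefs about the two hypotheses (assumed equally likely).
prior = {"fair": 0.5, "biased": 0.5}

# Model: probability of observing heads under each hypothesis.
likelihood = {"fair": 0.5, "biased": 0.9}

# Bayes rule: posterior(H) is proportional to prior(H) * P(data | H),
# normalized so the posterior probabilities sum to 1.
unnormalized = {h: prior[h] * likelihood[h] for h in prior}
total = sum(unnormalized.values())
posterior = {h: unnormalized[h] / total for h in prior}

print(posterior["biased"])  # 0.45 / 0.70 ≈ 0.643
```

Observing heads shifts belief toward the biased coin, exactly the kind of systematic evidence-incorporation the lecture attributes to the Bayes rule.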

As such, it is a tool that is used all the time, all over science and engineering.
