2.1 Equally Likely Results

Last updated: March 8th, 2020


Now let's look at more theoretical examples and learn how to calculate some theoretical probabilities.

We have said in a previous lesson that the probability of getting heads when we toss a fair coin is $\dfrac{1}{2}$. And this was because getting heads is one of the two possible outcomes (heads or tails).

But the key here is that the coin is fair, meaning that both outcomes have the same chance of coming up.

These are called experiments with equally likely results, which simply means that every outcome (or result) in the sample space has the same probability of coming up.

More examples of this kind of experiment:

  • Throwing a fair die: in this case there are $6$ outcomes, and they all have the same probability of being rolled.
  • Picking a card from a deck: for example if the deck has $52$ cards, there will be $52$ outcomes, all equally likely.
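The deck example can be sketched in a few lines of Python. This is only an illustration: the suit and rank names are my own labels, not anything defined in the lesson.

```python
from itertools import product

# Build a standard 52-card deck: 4 suits x 13 ranks
# (the names below are illustrative labels, not from the lesson)
suits = ["hearts", "diamonds", "clubs", "spades"]
ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
deck = [f"{rank} of {suit}" for suit, rank in product(suits, ranks)]

# With equally likely outcomes, each card's probability is 1 over the deck size
p_card = 1 / len(deck)
print(len(deck))   # 52 outcomes in the sample space
print(p_card)      # each card has probability 1/52
```

Each of the $52$ cards gets the same probability, $\dfrac{1}{52}$, exactly as the bullet point states.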

Let's dig deeper into the die example:

The sample space will be $\Omega = \{ 1, 2, 3, 4, 5, 6 \}$

We know from a previous lesson that the probabilities of all outcomes must add up to $1$. Since there are six outcomes, all equally likely, we can divide $1$ by $6$ to get each individual probability.

This means: $P(1)=1/6$, $P(2)=1/6$, $P(3)=1/6$, $P(4)=1/6$, $P(5)=1/6$, $P(6)=1/6$.

We can clearly see that $P(1)+P(2)+P(3)+P(4)+P(5)+P(6)=\dfrac{6}{6}=1$.
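We can verify this sum with a short Python sketch, using exact fractions so no rounding gets in the way:

```python
from fractions import Fraction

# Sample space of a fair six-sided die
omega = [1, 2, 3, 4, 5, 6]

# Equally likely results: each outcome gets probability 1/6
p = {outcome: Fraction(1, 6) for outcome in omega}

# The six probabilities add up to 6/6 = 1
total = sum(p.values())
print(total)  # prints 1
```

Using `Fraction` instead of floating-point keeps the arithmetic exact, so the sum is literally $1$ rather than something like $0.9999\ldots$.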


Now let's think of the general case:

If $\Omega$ has equally likely results and $|\Omega|=n$, then $P(\omega)=\dfrac{1}{n}$ $\forall\omega\in\Omega$.

This just means that the probability of each individual outcome is $1$ over the total number of outcomes in our sample space.

And this is because if there are $n$ equally likely outcomes whose probabilities add up to $1$, each one must be $\dfrac{1}{n}$.
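The general rule can be captured in one small helper function. This is a sketch of my own, not code from the lesson:

```python
def outcome_probabilities(sample_space):
    """Assign probability 1/n to each outcome of an equally-likely sample space."""
    n = len(sample_space)
    return {outcome: 1 / n for outcome in sample_space}

# The die example as a special case: n = 6, so each outcome gets 1/6
probs = outcome_probabilities([1, 2, 3, 4, 5, 6])
print(probs[1])  # prints 0.16666666666666666, i.e. 1/6
```

The same function covers the coin ($n = 2$, each outcome $\dfrac{1}{2}$) and the deck of cards ($n = 52$, each outcome $\dfrac{1}{52}$).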
