Probability
- Tags
- math
A probability is a value between 0 and 1 that measures the likelihood of an event occurring. A probability of \( 0 \) means the event will never occur, and a probability of \( 1 \) means the event will definitely occur.
We define the sample space as the set of all possible outcomes of an experiment. We define an event as a group (or set) of outcomes from an experiment. For example, if the experiment is rolling a die, an event could be the die showing a value greater than 4.
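To make this concrete (a worked illustration of my own, assuming a standard fair six-sided die), the sample space \( S \) and the event \( E \) described above can be written out explicitly: \[ S = \{1, 2, 3, 4, 5, 6\}, \qquad E = \{5, 6\} \] The event \( E \) occurs on any roll that shows a 5 or a 6.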
Measuring the Probability of an Event
If the outcomes are equally likely, then the probability of an event can be written as \[ p(\text{Event}) = \frac{\text{Number of favourable outcomes}}{\text{Number of possible outcomes}} \]
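Applying this to the die example above (still assuming all six faces are equally likely): \[ p(\text{value} > 4) = \frac{|\{5, 6\}|}{|\{1, 2, 3, 4, 5, 6\}|} = \frac{2}{6} = \frac{1}{3} \]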
Alternatively, we can use a relative-frequency approach: repeat an experiment a number of times, then estimate the probability of an event as the number of trials in which the event occurred divided by the total number of trials.
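As a hypothetical illustration (the counts below are invented purely for the example), if the die is rolled 600 times and a value greater than 4 comes up 197 times, the relative-frequency estimate is \[ p(\text{value} > 4) \approx \frac{197}{600} \approx 0.33, \] which should get closer to the theoretical value of \( \frac{1}{3} \) as the number of rolls increases.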
Rules
For any two events \( A \) and \( B \):
\begin{align} P(A \cup B) = P(A) + P(B) - P(A \cap B) \label{eq:prob} \end{align}
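As a quick check of this rule (my own example, using the fair six-sided die from above), let \( A \) be "the die shows an even number" and \( B \) be "the die shows a value greater than 4", so that \( A \cap B = \{6\} \): \[ P(A \cup B) = \frac{3}{6} + \frac{2}{6} - \frac{1}{6} = \frac{4}{6} = \frac{2}{3} \] Subtracting \( P(A \cap B) \) prevents the outcome 6 from being counted twice.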
For two mutually exclusive events \( A \) and \( B \):
- \( P(A \cap B) = 0 \), meaning both events cannot happen simultaneously.
- \( P(A \cup B) = P(A) + P(B) \), which follows from the previous point together with \eqref{eq:prob} (see the example below).
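For example (an illustration of my own, again assuming a fair die), the events "the die shows a 1" and "the die shows a 6" cannot happen on the same roll, so \[ P(\{1\} \cup \{6\}) = P(\{1\}) + P(\{6\}) = \frac{1}{6} + \frac{1}{6} = \frac{1}{3} \]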
For two independent events \( A \) and \( B \), meaning two events where the outcome of one has no effect on the outcome of the other, we define (see the example after this list):
- \( P(A \cap B) = P(A) \times P(B) \)
- \( P(A \mid B) = P(A \mid \neg B) = P(A) \)
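For example (my own illustration, assuming a fair coin), two separate flips of a coin are independent, so the probability of getting heads on both flips is \[ P(H_1 \cap H_2) = P(H_1) \times P(H_2) = \frac{1}{2} \times \frac{1}{2} = \frac{1}{4}, \] where \( H_1 \) and \( H_2 \) (notation introduced here, not in the original note) denote heads on the first and second flip respectively.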
We define the conditional probability of \( B \) given \( A \), written \( P(B \mid A) \), as the probability of the event \( B \) occurring given that we know the event \( A \) has occurred: \[ P(B \mid A) = \frac{P(B \cap A)}{P(A)} \] This can be rearranged into: \[ P(A \cap B) = P(A) \times P(B \mid A) \]
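As a worked example (my own, reusing the fair-die assumption), let \( A \) be "the die shows a value greater than 4" and \( B \) be "the die shows a 6": \[ P(B \mid A) = \frac{P(B \cap A)}{P(A)} = \frac{1/6}{2/6} = \frac{1}{2} \] Knowing the roll exceeded 4 narrows the outcomes to \( \{5, 6\} \), within which a 6 is one of two equally likely results.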