The concept of conditional probability is very important for understanding Bayesian networks. An excellent introduction to these concepts is available in this video - https://youtu.be/5s7XdGacztw
As we know, probability is calculated as the number of desired outcomes divided by the total number of possible outcomes. Hence if we roll a die, the probability of getting a 4 is 1/6 (P ≈ 0.167, or about 16.7%).
Similarly, the probability of an event not occurring is called the complement, 1 - P. Hence the probability of not rolling a 4 is 1 - 0.167 = 0.833, or about 83.3%.
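To make the arithmetic concrete, here is a minimal Python sketch of the die example above. The variable names are purely illustrative:

```python
# Probability of rolling a 4 on a fair six-sided die, and its complement.
from fractions import Fraction

desired_outcomes = 1   # only one face shows a 4
total_outcomes = 6     # six equally likely faces

p_four = Fraction(desired_outcomes, total_outcomes)   # 1/6
p_not_four = 1 - p_four                               # complement, 5/6

print(f"P(4)     = {float(p_four):.3f} (~{float(p_four):.1%})")
print(f"P(not 4) = {float(p_not_four):.3f} (~{float(p_not_four):.1%})")
```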
While the above covers a single variable, we also need to understand how to calculate probabilities involving two or more variables - e.g. the probability of lightning and thunder occurring together when it rains.
When two or more variables are involved, we need to consider three types of probability:
1) Joint probability is the likelihood of two events occurring together, i.e. at the same point in time. The joint probability of events A and B is written formally as P(A and B), P(A ∧ B), or P(A, B).
2) Conditional probability measures the probability of one event given that another event has occurred. It is typically denoted P(A given B) or P(A | B). For complex problems involving many variables, it is difficult to calculate the joint probability of every possible combination of outcomes, so conditional probability becomes a practical way to break such problems down; see the short sketch after this list, and the video linked above for a detailed walkthrough.
3) Marginal probability is the probability of an event irrespective of the outcomes of the other variables.
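To tie the three definitions together, here is a minimal Python sketch built on the rain/thunder example. The joint distribution below is made-up toy data, chosen only to show how each of the three quantities is computed:

```python
# Toy joint distribution over two variables: weather (rain / no rain)
# and thunder (thunder / no thunder). Probabilities sum to 1.
joint = {
    ("rain", "thunder"):       0.20,   # P(Rain, Thunder)
    ("rain", "no thunder"):    0.10,
    ("no rain", "thunder"):    0.05,
    ("no rain", "no thunder"): 0.65,
}

# Joint probability: read directly from the table.
p_rain_and_thunder = joint[("rain", "thunder")]

# Marginal probability: sum over all outcomes of the other variable.
p_rain = sum(p for (weather, _), p in joint.items() if weather == "rain")

# Conditional probability: P(Thunder | Rain) = P(Rain, Thunder) / P(Rain)
p_thunder_given_rain = p_rain_and_thunder / p_rain

print(f"P(Rain, Thunder)  = {p_rain_and_thunder:.2f}")
print(f"P(Rain)           = {p_rain:.2f}")
print(f"P(Thunder | Rain) = {p_thunder_given_rain:.2f}")
```

With these illustrative numbers, P(Rain) = 0.30 and P(Thunder | Rain) ≈ 0.67, i.e. knowing that it is raining changes how likely thunder is - which is exactly the idea Bayesian networks build on.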
Another excellent article explaining conditional probability with real life examples is here - https://www.investopedia.com/terms/c/conditional_probability.asp