Bayes Theorem
Bayes theorem is a theorem in probability and statistics, named after the Reverend Thomas Bayes, that determines the probability of an event based on another event that has already occurred. Bayes theorem has many applications, such as Bayesian inference and, in the healthcare sector, determining the chances of developing health problems with increasing age. Here, we will aim at understanding the use of the Bayes theorem in determining the probability of events, its statement, formula, and derivation with the help of examples.
1.  What is Bayes Theorem? 
2.  Proof of Bayes Theorem 
3.  Bayes Theorem Formula 
4.  Difference between Conditional Probability and Bayes Theorem 
5.  Terms Related to Bayes Theorem 
6.  FAQs on Bayes Theorem 
What is Bayes Theorem?
Bayes theorem, in simple words, determines the conditional probability of an event A given that event B has already occurred. Bayes theorem is also known as the Bayes Rule or Bayes Law. It is a method to determine the probability of an event based on the occurrences of prior events. It is used to calculate conditional probability. Bayes theorem calculates the probability based on the hypothesis. Now, let us state the theorem and its proof. Bayes theorem states that the conditional probability of an event A, given the occurrence of another event B, is equal to the product of the probability of B given A and the probability of A, divided by the probability of B. It is given as:
\(P(A|B) = \dfrac{P(B|A)P(A)}{P(B)}\)
Here, P(A) = how likely A happens (prior): the probability that the hypothesis is true before any evidence is seen.
P(B) = how likely B happens (marginalization): the probability of observing the evidence.
P(A|B) = how likely A happens given that B has happened (posterior): the probability that the hypothesis is true given the evidence.
P(B|A) = how likely B happens given that A has happened (likelihood): the probability of seeing the evidence if the hypothesis is true.
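The four quantities above fit together in one line of code. The following is a minimal sketch (the function name and the numeric values are illustrative, not from the text):

```python
def bayes_posterior(prior_a, likelihood_b_given_a, evidence_b):
    """Posterior P(A|B) = P(B|A) * P(A) / P(B)."""
    if evidence_b == 0:
        raise ValueError("P(B) must be nonzero")
    return likelihood_b_given_a * prior_a / evidence_b

# Illustrative numbers: P(A) = 0.3, P(B|A) = 0.5, P(B) = 0.25
print(bayes_posterior(0.3, 0.5, 0.25))  # 0.6
```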
Bayes Theorem Statement
The statement of Bayes Theorem is as follows: Let \(E_{1}, E_{2}, E_{3}, ..., E_{n}\) be a set of events associated with a sample space S, where all events \(E_{1}, E_{2}, E_{3}, ..., E_{n}\) have nonzero probability of occurrence and they form a partition of S. Let A be any event which occurs with \(E_{1}\) or \(E_{2}\) or \(E_{3}\) ... or \(E_{n}\), then according to Bayes Theorem,
\(P(E_{i}|A) = \dfrac{P(E_{i})P(A|E_{i})}{\sum_{k=1}^{n}P(E_{k})P(A|E_{k})} , i=1,2,3,...,n\)
 Here E\(_i\) ∩ E\(_j\) = ∅, where i ≠ j, i.e., the events are mutually exclusive.
 The union of all the events of the partition gives the sample space S.
 0 < P(E\(_{i}\)) ≤ 1, since each event of the partition has nonzero probability.
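The partition form of the theorem can also be sketched directly in code. Here is a small illustrative helper (the function name and the example numbers are assumptions, not from the text):

```python
def bayes_partition(priors, likelihoods, i):
    """P(E_i|A) = P(E_i) P(A|E_i) / sum over k of P(E_k) P(A|E_k)."""
    # The denominator is the total probability of A over the partition.
    evidence = sum(p * l for p, l in zip(priors, likelihoods))
    return priors[i] * likelihoods[i] / evidence

# Hypothetical partition E_1, E_2 with equal priors and different likelihoods
priors = [0.5, 0.5]        # P(E_1), P(E_2)
likelihoods = [0.75, 0.25] # P(A|E_1), P(A|E_2)
print(bayes_partition(priors, likelihoods, 0))  # 0.75
```

Note that the posteriors over all events of the partition sum to 1, as they must for a probability distribution.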
Proof of Bayes Theorem
To prove the Bayes Theorem, we will use the total probability and conditional probability formulas. The total probability formula is used when not enough is known about an event A directly; we then use other events related to A to determine its probability. Conditional probability is the probability of event A given that another related event has already occurred.
Let \(E_{1}, E_{2}, E_{3}, ..., E_{n}\) be a partition of the sample space S, and let A be an event that has occurred. Let us express A in terms of the \(E_{i}\):
A = A ∩ S
= A ∩ (\(E_{1} \cup E_{2} \cup E_{3} \cup ... \cup E_{n}\))
A = (A ∩ \(E_{1}\)) ∪ (A ∩ \(E_{2}\)) ∪ (A ∩ \(E_{3}\)) ∪ ... ∪ (A ∩ \(E_{n}\))
P(A) = P[(A ∩ \(E_{1}\)) ∪ (A ∩ \(E_{2}\)) ∪ (A ∩ \(E_{3}\)) ∪ ... ∪ (A ∩ \(E_{n}\))]
We know that when A and B are disjoint sets, then P(A ∪ B) = P(A) + P(B). Since the \(E_{i}\) are pairwise disjoint, so are the sets A ∩ \(E_{i}\).
Thus here, P(A) = P(A ∩ \(E_{1}\)) + P(A ∩ \(E_{2}\)) + P(A ∩ \(E_{3}\)) + ... + P(A ∩ \(E_{n}\))
According to the multiplication theorem for dependent events, we have
P(A) = P(\(E_{1}\))P(A|\(E_{1}\)) + P(\(E_{2}\))P(A|\(E_{2}\)) + P(\(E_{3}\))P(A|\(E_{3}\)) + ... + P(\(E_{n}\))P(A|\(E_{n}\))
Thus, the total probability is P(A) = \(\sum_{i=1}^{n}P(E_{i})P(A|E_{i})\)  (1)
Recalling the conditional probability, we get
\(P(E_{i}|A) = \dfrac{P(E_{i}\cap A)}{P(A)} , i=1,2,3,...,n\) (2)
Using the formula for the conditional probability \(P(A|E_{i})\), we have
\(P(E_{i}\cap A) = P(A|E_{i}) P(E_{i})\)  (3)
Substituting equations (1) and (3) in equation (2), we get
\(P(E_{i}|A) = \dfrac{P(A|E_{i}) P(E_{i})}{\sum_{k=1}^{n}P(E_{k})P(A|E_{k})}, i=1,2,3,...,n\)
Hence, Bayes Theorem is proved.
Bayes Theorem Formula
The Bayes theorem formula exists for both events and random variables, and it is derived from the definition of conditional probability. It can be derived for events A and B, as well as for continuous random variables X and Y. Let us first see the formula for events.
Bayes Theorem Formula for Events
The formula for events derived from the definition of conditional probability is:
\(P(A|B) = \dfrac{P(B|A)P(A)}{P(B)}, P(B) \neq 0\)
Derivation:
According to the definition of conditional probability, \(P(A|B) = \dfrac{P(A \cap B)}{P(B)}, P(B) \neq 0\), and we know that \(P(A \cap B) = P(B \cap A) = P(B|A)P(A)\), which implies,
\(P(A|B) = \dfrac{P(B|A)P(A)}{P(B)}\)
Hence, the Bayes theorem formula for events is derived.
Bayes Theorem for Continuous Random Variables
The formula for continuous random variables X and Y derived from the definition of the conditional probability of continuous variables is:
\(f_{X|Y=y}(x) = \dfrac{f_{Y|X=x}(y)f_{X}(x)}{f_{Y}(y)}\)
Derivation:
According to the definition of the conditional density or conditional probability of continuous random variables, we know that \(f_{X|Y=y}(x)=\dfrac{f_{X,Y}(x,y)}{f_{Y}(y)}\) and \(f_{Y|X=x}(y)=\dfrac{f_{X,Y}(x,y)}{f_{X}(x)}\), which implies,
\(f_{X|Y=y}(x) = \dfrac{f_{Y|X=x}(y)f_{X}(x)}{f_{Y}(y)}\)
Hence, the Bayes Theorem formula for continuous random variables is derived.
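As a numerical sanity check of the continuous formula, take the hypothetical joint density f(x, y) = x + y on the unit square, whose marginals are f_X(x) = x + 1/2 and f_Y(y) = y + 1/2 (this example is an assumption for illustration, not from the text). The conditional density computed from the definition should match the one computed via the Bayes formula:

```python
def f_joint(x, y):   # hypothetical joint density on [0, 1] x [0, 1]
    return x + y

def f_X(x):          # marginal of X: integral of (x + y) dy over [0, 1]
    return x + 0.5

def f_Y(y):          # marginal of Y: integral of (x + y) dx over [0, 1]
    return y + 0.5

x, y = 0.3, 0.8
direct = f_joint(x, y) / f_Y(y)                         # f_{X|Y=y}(x) by definition
via_bayes = (f_joint(x, y) / f_X(x)) * f_X(x) / f_Y(y)  # f_{Y|X=x}(y) f_X(x) / f_Y(y)
print(abs(direct - via_bayes) < 1e-12)  # True
```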
Difference Between Conditional Probability and Bayes Theorem
Conditional Probability: the probability of an event A based on the occurrence of another event B. Formula: \(P(A|B) = \dfrac{P(A \cap B)}{P(B)}\)
Bayes Theorem: derived using the definition of conditional probability; its formula includes two conditional probabilities. Formula: \(P(A|B) = \dfrac{P(B|A)P(A)}{P(B)}\)
Terms Related to Bayes Theorem
As we have studied Bayes theorem in detail, let us understand the meanings of a few terms related to the concept that are used in the Bayes theorem formula and derivation:
 Conditional Probability  Conditional probability is the probability of an event A based on the occurrence of another event B. It is denoted by P(A|B) and represents the probability of A given that event B has already happened.
 Joint Probability  Joint probability measures the probability of two or more events occurring together. For two events A and B, it is denoted by \(P(A \cap B)\).
 Random Variables  A random variable is a real-valued variable whose possible values are determined by a random experiment. The probabilities of such variables, as estimated from repeated trials, are also called experimental probabilities.
 Posterior Probability  Posterior probability is the probability of an event that is calculated after all the information related to the event has been accounted for. It is also known as conditional probability.
 Prior Probability  Prior probability is the probability of an event that is calculated before considering the new information obtained. It is the probability of an outcome that is determined based on current knowledge before the experiment is performed.
Important Notes on Bayes Theorem
 Bayes theorem is used to determine conditional probability.
 When two events A and B are independent, P(A|B) = P(A) and P(B|A) = P(B)
 Bayes theorem can also be stated for continuous random variables, using conditional densities in place of conditional probabilities.
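The independence property in the notes above can be checked on a tiny sample space. The example below (two fair coin flips, an illustration not taken from the text) shows that conditioning on B leaves P(A) unchanged:

```python
# Sample space of two fair coin flips; each outcome has probability 1/4
outcomes = [(a, b) for a in "HT" for b in "HT"]
p = 1 / len(outcomes)

P_A = sum(p for a, b in outcomes if a == "H")             # first flip is heads
P_B = sum(p for a, b in outcomes if b == "H")             # second flip is heads
P_A_and_B = sum(p for a, b in outcomes if a == b == "H")  # both heads

P_A_given_B = P_A_and_B / P_B
print(P_A_given_B == P_A)  # True: B tells us nothing about A
```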
Bayes Theorem Examples

Example 1: Amy has two bags. Bag I has 7 red and 4 blue balls and bag II has 5 red and 9 blue balls. Amy draws a ball at random and it turns out to be red. Determine the probability that the ball was from bag I using the Bayes theorem.
Solution: Let X and Y be the events that the ball is from the bag I and bag II, respectively. Assume A to be the event of drawing a red ball. We know that the probability of choosing a bag for drawing a ball is 1/2, that is,
P(X) = P(Y) = 1/2
Since there are 7 red balls out of a total of 11 balls in bag I, P(drawing a red ball from bag I) = P(A|X) = 7/11
Similarly, P(drawing a red ball from bag II) = P(A|Y) = 5/14
We need to determine the value of P(the ball drawn is from bag I given that it is a red ball), that is, P(X|A). To determine this, we will use the Bayes theorem. Using Bayes theorem, we have the following:
\(P(X|A) = \dfrac{P(A|X)P(X)}{P(A|X)P(X)+P(A|Y)P(Y)}\)
= [(7/11)(1/2)] / [(7/11)(1/2) + (5/14)(1/2)]
= 0.64
Answer: Hence, the probability that the drawn ball is from bag I is 0.64.
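The arithmetic in Example 1 can be replicated in a few lines of Python (variable names are illustrative):

```python
p_bag1 = p_bag2 = 1 / 2
p_red_given_bag1 = 7 / 11   # 7 red out of 11 balls in bag I
p_red_given_bag2 = 5 / 14   # 5 red out of 14 balls in bag II

# Bayes theorem with the denominator expanded by total probability
posterior = (p_red_given_bag1 * p_bag1) / (
    p_red_given_bag1 * p_bag1 + p_red_given_bag2 * p_bag2)
print(round(posterior, 2))  # 0.64
```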

Example 2: Assume that the chances of a person having a skin disease are 40%. Using skin creams and drinking enough water reduces the risk of skin disease by 30%, while a prescription of a certain drug reduces its chance by 20%. At a time, a patient can choose any one of the two options with equal probabilities. It is given that after picking one of the options, the patient selected at random has the skin disease. Find the probability that the patient picked the option of skin creams and drinking enough water using the Bayes theorem.
Solution: Assume E1: The patient uses skin creams and drinks enough water; E2: The patient uses the drug; A: The selected patient has the skin disease
P(E1) = P(E2) = 1/2
Using the probabilities known to us, we have
P(A|E1) = 0.4 × (1 − 0.3) = 0.28
P(A|E2) = 0.4 × (1 − 0.2) = 0.32
Using Bayes Theorem, the probability that the selected patient uses skin creams and drinks enough water is given by,
\(P(E1|A) = \dfrac{P(A|E1)P(E1)}{P(A|E1)P(E1)+P(A|E2)P(E2)}\)
= (0.28 × 0.5)/(0.28 × 0.5 + 0.32 × 0.5)
= 0.14/(0.14 + 0.16)
= 0.47
Answer: The probability that the patient picked the first option is 0.47
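Example 2's numbers can likewise be checked in Python (variable names are illustrative):

```python
p_e1 = p_e2 = 0.5
p_disease_given_e1 = 0.4 * (1 - 0.3)  # creams and water: 0.28
p_disease_given_e2 = 0.4 * (1 - 0.2)  # drug: 0.32

posterior = (p_disease_given_e1 * p_e1) / (
    p_disease_given_e1 * p_e1 + p_disease_given_e2 * p_e2)
print(round(posterior, 2))  # 0.47
```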

Example 3: A man is known to speak the truth 3 out of 4 times. He draws a card from a well-shuffled pack of 52 cards and reports that it is a king. Find the probability that it is actually a king.
Solution:
Let E be the event that the man reports that a king is drawn from the pack of cards,
A be the event that a king is drawn, and
B be the event that a king is not drawn.
Then we have P(A) = probability that a king is drawn = 4/52 = 1/13
P(B) = probability that a king is not drawn = 12/13
P(E|A) = probability that the man reports a king when a king is actually drawn = P(truth) = 3/4
P(E|B) = probability that the man reports a king when a king is not actually drawn = P(lie) = 1/4
Then according to Bayes theorem, the probability that it is actually a king = P(A|E)
= \(\dfrac{P(A)P(E|A)}{P(A)P(E|A)+P(B)P(E|B)}\)
= [1/13 × 3/4] ÷ [(1/13 × 3/4) + (12/13 × 1/4)]
= (3/52) ÷ (15/52)
= 3/15
= 1/5 = 0.2
Answer: Thus the probability that the drawn card is actually a king = 0.2
FAQs on Bayes Theorem
What Is Bayes Theorem in Statistics?
Bayes theorem is a statistical formula to determine the conditional probability of an event. It describes the probability of an event based on prior knowledge of events that have already happened. Bayes Theorem is named after the Reverend Thomas Bayes and its formula for random events is \(P(A|B) = \dfrac{P(B|A)P(A)}{P(B)}\)
Here, P(A) = how likely A happens
P(B) = how likely B happens
P(A|B) = how likely A happens given that B has happened
P(B|A) = how likely B happens given that A has happened
What Does the Bayes Theorem State?
Let \(E_{1}, E_{2}, E_{3}, ..., E_{n}\) be a set of events associated with a sample space S, where all events \(E_{1}, E_{2}, E_{3}, ..., E_{n}\) have nonzero probability of occurrence and they form a partition of S. Let A be any event associated with S, then according to Bayes Theorem,
\(P(E_{i}|A) = \dfrac{P(E_{i})P(A|E_{i})}{\sum_{k=1}^{n}P(E_{k})P(A|E_{k})} , i=1,2,3,...,n\)
How to Use Bayes Theorem?
To determine the probability of an event A given that the related event B has already occurred, that is, P(A|B), using the Bayes theorem, we calculate the probability of event B, that is, P(B); the probability of event B given that event A has occurred, that is, P(B|A); and the probability of event A individually, that is, P(A). Then, we substitute these values into the formula \(P(A|B) = \dfrac{P(B|A)P(A)}{P(B)}\) to determine the probability using the Bayes theorem.
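The steps above can be sketched in code. In practice P(B) is often not known directly; it can then be expanded by the total probability over A and its complement. The function name and the medical-test numbers below are illustrative assumptions, not from the text:

```python
def bayes_with_complement(p_a, p_b_given_a, p_b_given_not_a):
    # When P(B) is not known directly, expand it by total probability:
    # P(B) = P(B|A) P(A) + P(B|A') (1 - P(A))
    p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)
    return p_b_given_a * p_a / p_b

# Hypothetical screening-test numbers: P(disease) = 0.01,
# P(positive|disease) = 0.95, P(positive|no disease) = 0.05
print(round(bayes_with_complement(0.01, 0.95, 0.05), 3))  # 0.161
```

Note how a rare condition (prior 0.01) keeps the posterior low even for a fairly accurate test; this base-rate effect is exactly what the prior term in the formula captures.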
Is Bayes Theorem for Independent Events?
If two events A and B are independent, then P(A|B) = P(A) and P(B|A) = P(B), so applying Bayes theorem is unnecessary: the occurrence of B provides no information about A, and the conditional probability reduces to the unconditional one.
Is Conditional Probability the Same as Bayes Theorem?
Conditional probability is the probability of the occurrence of an event based on the occurrence of other events, whereas the Bayes theorem is derived from the definition of conditional probability. The Bayes theorem formula involves two conditional probabilities.
What Is the Bayes Theorem in Machine Learning?
Bayes theorem provides a method to determine the probability of a hypothesis based on its prior probability, the probabilities of observing various data given the hypothesis, and the observed data itself. In machine learning, it underlies methods such as the naive Bayes classifier and Bayesian inference, where a prior belief about a hypothesis is updated as data is observed.