
Bayes Theorem Introduction

The Bayes theorem, often known as the Bayes rule, is a mathematical formula used in statistics and probability theory to compute the conditional probability of events.

In probability theory and statistics, Bayes’ Theorem (also called Bayes’ Law or Bayes’ Rule) gives the probability of an event based on prior knowledge of conditions that might be related to it. Suppose cancer risk is linked to age, for example. In that case, Bayes’ theorem can be used to assess the chance that a person has cancer more precisely by taking the person’s age into account, rather than estimating that probability without knowing the age.

In Elementary Statistics, the notion of conditional probability is introduced. The conditional probability of an event is calculated with the knowledge that another event has already happened. Given that event A has already occurred, the conditional probability of event B happening is denoted by P(B|A). 
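
As a quick numerical illustration (the die experiment below is not from the text above; it is only a sketch), the following Python snippet estimates P(B|A) as a relative frequency, where A is "the roll of a fair die is even" and B is "the roll is greater than 3".

```python
import random

# Illustrative simulation: estimate P(B|A) for a fair six-sided die, where
# A = "the roll is even" and B = "the roll is greater than 3".
random.seed(0)
rolls = [random.randint(1, 6) for _ in range(100_000)]

count_a = sum(1 for r in rolls if r % 2 == 0)                  # times A occurred
count_a_and_b = sum(1 for r in rolls if r % 2 == 0 and r > 3)  # times A and B both occurred

# Conditional probability as a relative frequency: P(B|A) ≈ #(A and B) / #(A)
print(count_a_and_b / count_a)  # close to the exact value 2/3
```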

Bayes’ theorem meaning

The Bayes theorem, or the Bayes rule, is a useful mathematical formula used in statistics and probability theory to compute the conditional probability of events. The Bayes theorem expresses the likelihood of an event based on prior knowledge of the circumstances.

Thomas Bayes introduced the theorem by proposing an equation that allows fresh evidence to be used to update previous beliefs. If the conditional probability P(B|A) is known, the Bayes rule lets us obtain the reverse probability P(A|B).

When a random experiment or previous data offers new or additional information, we can revise probabilities. To arrive at a sound conclusion under uncertainty, business and management leaders must be able to update current (given) probabilities in light of new information.

The theorem states that:

P(A|B) = P(B|A) × P(A) / P(B), where P(B) ≠ 0

For a partition of events A1, A2, …, An of the sample space, the above formula can be written in the following general form:

P(Ai|B) = P(B|Ai) × P(Ai) / [P(B|A1) × P(A1) + P(B|A2) × P(A2) + … + P(B|An) × P(An)]

Here, P(Ai) is the probability of the ith event Ai, and the events A1, A2, …, An are mutually exclusive and together make up the whole sample space.
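
As a minimal sketch of this general form (the function name bayes_posteriors and the machine example below are hypothetical, not from the source), the snippet takes the priors P(Ai) and the likelihoods P(B|Ai) over a partition and returns the posteriors P(Ai|B):

```python
def bayes_posteriors(priors, likelihoods):
    """Return P(Ai|B) for each event Ai in a partition.

    priors      -- list of P(Ai); these should sum to 1
    likelihoods -- list of P(B|Ai), same length as priors
    """
    # Denominator: total probability P(B) = sum over i of P(B|Ai) * P(Ai)
    p_b = sum(p * l for p, l in zip(priors, likelihoods))
    # Numerators P(B|Ai) * P(Ai), normalised by P(B)
    return [p * l / p_b for p, l in zip(priors, likelihoods)]

# Hypothetical example: three machines produce 50%, 30%, and 20% of all items,
# with defect rates 1%, 2%, and 3%. Given a defective item (event B), what is
# the probability it came from each machine (events A1, A2, A3)?
print(bayes_posteriors([0.5, 0.3, 0.2], [0.01, 0.02, 0.03]))
# -> approximately [0.294, 0.353, 0.353]
```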

Derivation

According to the definition of conditional probability, P(A|B) = P(A ∩ B)/P(B), where P(B) ≠ 0. We also know that P(A ∩ B) = P(B ∩ A) = P(B|A)P(A), which implies

P(A|B) = P(B|A)P(A)/P(B)

Hence, the Bayes theorem formula for events is derived.
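
As a quick sanity check of this derivation (with illustrative numbers that are not part of the text), the snippet below confirms that P(B|A)P(A)/P(B) agrees with P(A|B) computed directly from the definition P(A ∩ B)/P(B):

```python
# Illustrative numbers only: a consistent pair of events A and B with known
# joint and marginal probabilities.
p_a_and_b = 0.12   # P(A ∩ B)
p_a = 0.30         # P(A)
p_b = 0.40         # P(B)

p_a_given_b_direct = p_a_and_b / p_b         # definition: P(A|B) = P(A ∩ B)/P(B)
p_b_given_a = p_a_and_b / p_a                # definition: P(B|A) = P(A ∩ B)/P(A)
p_a_given_b_bayes = p_b_given_a * p_a / p_b  # Bayes: P(A|B) = P(B|A)P(A)/P(B)

print(p_a_given_b_direct, p_a_given_b_bayes)  # both are 0.3 (up to rounding)
```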

Proof of Bayes’ theorem

The conditional probability and total probability formulas will be used to prove the Bayes Theorem. 

When there is not enough information to compute the probability of an event A directly, other events connected to A are used to estimate it. The probability of event A, given that one of these related events has already occurred, is known as its conditional probability.

Let E1, E2, …, En be a partition of the sample space S, and let A be an event that has occurred. Let us express A in terms of the events Ei.

A = A ∩ S

= A ∩ (E1 ∪ E2 ∪ E3 ∪ … ∪ En)

A = (A ∩ E1) ∪ (A ∩ E2) ∪ (A ∩ E3) ∪ … ∪ (A ∩ En)

P(A) = P[(A ∩ E1) ∪ (A ∩ E2) ∪ (A ∩ E3) ∪ … ∪ (A ∩ En)]

Since the events Ei are mutually exclusive, the sets (A ∩ E1), (A ∩ E2), …, (A ∩ En) are pairwise disjoint, and for disjoint events X and Y we have P(X ∪ Y) = P(X) + P(Y).

So, P(A) = P(A ∩ E1) + P(A ∩ E2) + P(A ∩ E3) + … + P(A ∩ En)

As per the multiplication theorem for dependent events, P(A ∩ Ei) = P(Ei)P(A|Ei), so

P(A) = P(E1)P(A|E1) + P(E2)P(A|E2) + P(E3)P(A|E3) + … + P(En)P(A|En)

So, the total probability is P(A) = ∑ P(Ei)P(A|Ei), where the sum runs over i = 1, 2, 3, …, n — (1)

Now, recall the definition of conditional probability:

P(Ei|A) = P(Ei ∩ A)/P(A), i = 1, 2, 3, …, n — (2)

Using the formula for the conditional probability P(A|Ei), we can write

P(Ei∩A) = P(A|Ei)P(Ei) — (3)

Substituting equations (1) and (3) into equation (2), we get

P(Ei|A) = P(A|Ei)P(Ei) / ∑ P(Ek)P(A|Ek), i = 1, 2, 3, …, n, where the sum in the denominator runs over k = 1, 2, 3, …, n

Hence, Bayes’ theorem is proved.
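
As a numerical check on the two steps above (the three-event partition and its probabilities are invented for this sketch), the snippet computes P(A) using the total probability formula (1) and then the posteriors P(Ei|A), confirming that they sum to 1:

```python
# Hypothetical partition E1, E2, E3 of the sample space, with priors P(Ei) and
# conditional probabilities P(A|Ei); the numbers are invented for illustration.
p_e = [0.2, 0.5, 0.3]          # P(E1), P(E2), P(E3); they sum to 1
p_a_given_e = [0.6, 0.1, 0.4]  # P(A|E1), P(A|E2), P(A|E3)

# Equation (1): total probability P(A) = sum over i of P(Ei) * P(A|Ei)
p_a = sum(pe * pa for pe, pa in zip(p_e, p_a_given_e))

# Bayes' theorem: P(Ei|A) = P(A|Ei) * P(Ei) / P(A)
posteriors = [pa * pe / p_a for pe, pa in zip(p_e, p_a_given_e)]

print(p_a)              # 0.29 (up to floating-point rounding)
print(posteriors)       # the posterior probabilities over the partition...
print(sum(posteriors))  # ...which sum to 1, as expected
```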

Terms related to Bayes’ theorem 

Let us look at the definitions of a few terms used in the Bayes theorem formula and its derivation:

  • Conditional Probability – It is the probability of an event A occurring based on another event B. P(A|B) denotes the likelihood of A occurring given that event B has already occurred.
  • Joint Probability – The probability of two or more occurrences co-occurring is measured by joint probability, denoted by P(A∩B) for two events, A and B.
  • Random Variables – A random variable is a real-valued variable whose values are determined by chance; the probabilities of its values can be estimated experimentally through repeated trials.
  • Posterior Probability – It is the probability of an event calculated after the relevant evidence has been taken into account; it is also known as a conditional probability.
  • Prior Probability – It is the probability of an event calculated before new information is taken into account. Before the experiment, the probability of a particular outcome is calculated based on current information.

Bayes’ theorem notes

  • Conditional probability is calculated using Bayes’ theorem.
  • If A and B are independent events, then P(A|B) = P(A) and P(B|A) = P(B); a quick numerical check follows this list.
  • Bayes’ theorem can also be used to compute conditional probabilities for continuous random variables.
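
As a small check of the independence note in the list above (the probabilities are invented for this sketch), the snippet builds two independent events and confirms that conditioning leaves their probabilities unchanged:

```python
import math

# Hypothetical independent events A and B; by independence, P(A ∩ B) = P(A) * P(B).
p_a, p_b = 0.25, 0.6
p_a_and_b = p_a * p_b

p_a_given_b = p_a_and_b / p_b  # should equal P(A): conditioning on B changes nothing
p_b_given_a = p_a_and_b / p_a  # should equal P(B)

print(math.isclose(p_a_given_b, p_a), math.isclose(p_b_given_a, p_b))  # True True
```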

Conclusion 

Bayes’ Theorem has a wide range of applications that aren’t confined to the financial world. Bayes’ theorem, for example, can be used to estimate the accuracy of medical test findings by taking into account how probable any specific person is to have a condition as well as the test’s overall accuracy.
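
To make the medical-test example concrete (the prevalence, sensitivity, and specificity values below are assumed purely for illustration), a short sketch of the calculation might look like this:

```python
def posterior_given_positive(prevalence, sensitivity, specificity):
    """P(condition | positive test) via Bayes' theorem.

    prevalence  -- prior probability of having the condition, P(D)
    sensitivity -- P(positive | D), the true-positive rate
    specificity -- P(negative | not D), the true-negative rate
    """
    # Total probability of a positive test over the partition {D, not D}
    p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
    return sensitivity * prevalence / p_positive

# Hypothetical numbers: 1% prevalence, 95% sensitivity, 90% specificity.
# Even after a positive result, the posterior probability is only about 8.8%.
print(posterior_given_positive(0.01, 0.95, 0.90))
```

The low posterior reflects the small prior: when a condition is rare, most positive results come from the healthy majority, which is exactly the effect Bayes’ theorem quantifies.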


Frequently Asked Questions

Get answers to the most common queries related to the JEE Examination Preparation.

What is the use of Bayes’ theorem?

Ans. The Bayes rule can be applied to probabilistic questions based on a single piece of evidence…

Is the Bayes theorem based on the assumption of independence?

Ans. The Bayes theorem is founded on fundamental statistical principles…

Who invented the Bayes theorem?

Ans. Thomas Bayes, an English mathematician, introduced the Bayes theorem.

How is the Bayes theorem used in machine learning?

Ans. The Bayes Theorem is also commonly employed in machine learning, where it is a quick and easy approach to…