Bayes’ theorem is a result in probability and statistics named after the Reverend Thomas Bayes. It helps estimate the likelihood of an event based on the probability of a related earlier event, for instance a natural catastrophe. Bayes’ theorem has several uses, including Bayesian inference in the healthcare business, such as predicting the chance of developing health problems with advancing age. We shall try to understand how Bayes’ theorem is applied to estimate the probability of events, along with its statement, formula and derivation, using examples.
Conditional probability may be calculated using Bayes’ formula, named after the 18th-century British mathematician Thomas Bayes. The idea is that a certain outcome becomes more likely if a related outcome has already occurred under comparable conditions. Bayes’ theorem gives a mechanism to incorporate new information or evidence into current predictions or hypotheses (that is, to update a probability). If you are considering lending money to someone, for example, Bayes’ theorem may be used to assess the risk. Bayes’ theorem, also known as Bayes’ Rule or Bayes’ Law, is the core of Bayesian statistics.
IMPORTANT THINGS TO KEEP IN MIND
- The Bayes theorem lets you update the estimated probability of an event as new knowledge becomes available.
- Thomas Bayes, an 18th-century mathematician, inspired the name of Bayes’ theorem.
- It is often used in the financial sector to keep up-to-date risk assessments.
Recognising and Applying Bayes’ Law
The Bayes theorem has several applications outside of finance. For example, the accuracy of a medical test result may be judged using Bayes’ theorem, which considers both the likelihood of a given individual developing an illness and the overall accuracy of the test. Bayes’ theorem derives posterior probabilities by updating prior probabilities with new evidence.
The prior probability is the chance of an event happening before fresh evidence is obtained; in other words, it is calculated from existing information before an experiment is carried out. The posterior probability is the updated likelihood of an event once new information is taken into account, and it is computed by applying Bayes’ theorem to the prior probability. If event B has happened, the posterior probability is the likelihood of event A happening given B.
By using new information that may be connected to a certain event, the Bayes theorem recalculates the likelihood of that event occurring. Assuming the new information is correct, the method also shows how the probability of the event changes in the light of that knowledge.
Consider the case of a single card drawn from a full 52-card deck.
There is a 1/13 chance that the card is a king, which works out to 7.69 percent. Keep in mind that the deck contains four kings. Let’s say the picked card turns out to be a face card. Given that the picked card is a face card, the likelihood of a king is four divided by 12, or about 33.3 percent.
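The card calculation above can be checked with a short computation. This is a minimal sketch in Python; the standard-library fractions module keeps the probabilities exact.

```python
from fractions import Fraction

# A standard 52-card deck: 4 kings, 12 face cards (J, Q, K in each of 4 suits).
p_king = Fraction(4, 52)            # prior: P(king) = 1/13, about 7.69%
p_face = Fraction(12, 52)           # evidence: P(face card)
p_face_given_king = Fraction(1)     # every king is a face card

# Bayes' theorem: P(king | face) = P(face | king) * P(king) / P(face)
p_king_given_face = p_face_given_king * p_king / p_face

print(p_king)             # 1/13
print(p_king_given_face)  # 1/3, about 33.3%
```

Knowing the card is a face card shrinks the sample space from 52 cards to 12, which is why the probability rises from 1/13 to 1/3.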
Statement and Proof of Bayes’ Theorem
The Bayes Theorem may be proved using the formulae for total probability and conditional probability. To calculate the total probability of an event A, we use other events connected to A to estimate its likelihood. The conditional probability of A is the likelihood of A occurring given that another related event has already happened.
Let E1, E2, …, En be mutually exclusive and exhaustive events (a partition of the sample space), and let A be any event. For disjoint sets A and B, P(A∪B) = P(A) + P(B). Since the events Ei are exhaustive, A = (A∩E1) ∪ (A∩E2) ∪ … ∪ (A∩En), and since they are mutually exclusive,

P(A) = P(A∩E1) + P(A∩E2) + … + P(A∩En)

Applying the multiplication theorem for dependent events, P(A∩Ei) = P(Ei)·P(A|Ei), gives

P(A) = P(E1)·P(A|E1) + P(E2)·P(A|E2) + … + P(En)·P(A|En)

As a result, the total probability of A is

P(A) = ∑ (i = 1 to n) P(Ei)·P(A|Ei) … (1)

By the definition of conditional probability used earlier,

P(Ei|A) = P(Ei∩A) / P(A), i = 1, 2, …, n … (2)

and by the multiplication theorem,

P(Ei∩A) = P(A|Ei)·P(Ei) … (3)

Substituting (1) and (3) into (2) gives

P(Ei|A) = P(A|Ei)·P(Ei) / ∑ (k = 1 to n) P(Ek)·P(A|Ek), i = 1, 2, …, n
Thus, Bayes’ Theorem is established.
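The derivation above can be verified numerically. The sketch below picks an arbitrary three-event partition with illustrative numbers (not taken from the text) and checks that the total-probability sum and the resulting posteriors behave as the theorem requires.

```python
# A partition E1, E2, E3 (mutually exclusive, exhaustive) with illustrative
# prior probabilities, and conditional probabilities P(A|Ei) for some event A.
priors = [0.5, 0.3, 0.2]          # P(Ei); must sum to 1
likelihoods = [0.9, 0.5, 0.1]     # P(A|Ei)

# Total probability: P(A) = sum over i of P(Ei) * P(A|Ei)
p_a = sum(p * l for p, l in zip(priors, likelihoods))

# Bayes' theorem: P(Ei|A) = P(A|Ei) * P(Ei) / P(A)
posteriors = [l * p / p_a for p, l in zip(priors, likelihoods)]

print(p_a)               # 0.62
print(sum(posteriors))   # the posteriors P(Ei|A) sum to 1
```

Because the Ei partition the sample space, the posteriors always sum to 1, whatever priors and likelihoods are chosen.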
The Bayes Theorem Formula
There is a formula for the Bayes theorem that may be applied both to events and to random variables, and it originates from the notion of conditional probability. For two events A and B with P(B) ≠ 0,

P(A|B) = P(B|A)·P(A) / P(B)

An analogous form holds for continuous random variables X and Y, with probability density functions in place of probabilities.
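The event form of the formula can be illustrated with the medical-test scenario mentioned earlier. All the numbers below are assumed for illustration; they are not real test statistics.

```python
# Hypothetical medical-test illustration of P(A|B) = P(B|A) * P(A) / P(B).
# A = "has the condition", B = "tests positive". Numbers are illustrative.
p_disease = 0.01                  # prior P(A): 1% of people have the condition
p_pos_given_disease = 0.95        # sensitivity, P(B|A)
p_pos_given_healthy = 0.05        # false-positive rate, P(B|not A)

# Total probability: P(B) = P(B|A)*P(A) + P(B|not A)*P(not A)
p_pos = p_pos_given_disease * p_disease + p_pos_given_healthy * (1 - p_disease)

# Posterior P(A|B): probability of the condition given a positive test
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 3))  # 0.161
```

Even with an accurate test, the posterior stays modest here because the prior is small: most positives come from the large healthy population.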
Bayes Theorem-Related Terms
- Here are a few concepts connected to Bayes’ theorem that are used in the formula and derivation of the Bayes formula:
- Conditional probability is the probability of an event A occurring given another event B. If event B has already occurred, then P(A|B) is the likelihood that A will occur.
- The joint probability of two separate events is the probability that both occur together.
- Random variables are real-valued variables whose possible values are determined through chance experiments. The associated probability is also called the experimental probability.
- The posterior probability is the probability of an event occurring after all relevant data has been considered. It is a conditional probability.
- An event’s prior probability is computed before taking any new information into account. It is the likelihood of a certain result calculated from the data available before an experiment is conducted.
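The terms above are linked by the identity P(A∩B) = P(A|B)·P(B), which connects joint, conditional and marginal probability. A minimal sketch using the card example:

```python
from fractions import Fraction

# Events in a 52-card deck: A = "king", B = "face card".
p_b = Fraction(12, 52)            # marginal probability of a face card
p_a_given_b = Fraction(4, 12)     # conditional probability: king, given face card

# Joint probability: the card is a king AND a face card
p_a_and_b = p_a_given_b * p_b

print(p_a_and_b)  # 1/13 -- every king is a face card, so P(A∩B) = P(A)
```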
Notes about Bayes’ Theorem that are worth remembering
- The Bayes theorem is used to revise the probability of an event in the light of new evidence.
- When the two events are independent, P(A|B) = P(A) and P(B|A) = P(B).
- For continuous random variables, the Bayes theorem may be used to compute the conditional probability via density functions.
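The independence point in the notes above can be checked directly. In a standard deck a card’s rank and suit are independent, so conditioning on the suit leaves the rank probability unchanged; a small sketch:

```python
from fractions import Fraction

# In a 52-card deck, A = "king" and B = "heart" are independent events.
p_a = Fraction(4, 52)            # P(king) = 1/13
p_b = Fraction(13, 52)           # P(heart) = 1/4
p_a_and_b = Fraction(1, 52)      # exactly one king of hearts

# Independence check: P(A|B) = P(A∩B)/P(B) should equal P(A)
p_a_given_b = p_a_and_b / p_b
print(p_a_given_b == p_a)  # True
```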
Conclusion
Bayes’ Theorem is a basic mathematical method for determining conditional probabilities. It strongly influences subjectivist, or Bayesian, approaches to epistemology, statistics, and inductive reasoning. Subjectivists, who hold that rational belief is governed by the rules of probability, rely extensively on conditional probabilities in their theories of evidence and their models of practical learning.

Bayes’ Theorem is crucial to these endeavours because it facilitates the computation of conditional probabilities and illuminates fundamental characteristics of the subjectivist perspective. Indeed, the Theorem’s basic insight, that a hypothesis is confirmed by any body of evidence that its truth renders probable, is the cornerstone of subjectivist research. Named after the Reverend Thomas Bayes, the theorem helps predict the likelihood of an event based on the probability of a previous occurrence.