Chapter 9

Introduction to Bayesian Statistics: The Process of Experience Becoming Knowledge

#Bayes' Theorem #Prior Probability #Likelihood #Posterior Probability

Bayesian Statistics: Updating Probabilities

While traditional (frequentist) statistics focuses on questions like "how many times would heads appear if we flipped a coin 1,000 times?", Bayesian statistics asks "how accurate is my belief?" We move closer to the truth by continually refining our existing knowledge whenever new information becomes available.

1. Formula and Meaning of Bayes’ Theorem

P(A|B) = \frac{P(B|A)P(A)}{P(B)}

  • P(A) (Prior): my belief before seeing the new data.
  • P(B|A) (Likelihood): the probability that this data would appear if my hypothesis is correct.
  • P(A|B) (Posterior): my updated belief after confirming the data.
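
As a quick worked example (the numbers here are purely illustrative and not taken from the chapter): suppose the prior is P(A) = 0.30, the likelihood is P(B|A) = 0.80, and the overall probability of the data is P(B) = 0.50. Then

P(A|B) = \frac{0.80 \times 0.30}{0.50} = 0.48

so observing the data B raises the belief in A from 30% to 48%.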

2. The Knowledge Update Process

The process by which a Bayesian makes judgments in the real world is as follows:

1. Establish a Subjective Prior: Assign initial probabilities based on past experience or intuition.

2. Observe New Data: Collect actual data (evidence) occurring in the field.

3. Execute the Bayesian Update: Convert the prior probability into a posterior probability using the formula.

4. Cyclical Application: Today's posterior probability becomes tomorrow's new prior probability (a short code sketch of this cycle follows).
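
A minimal Python sketch of this cycle, assuming a simple yes/no hypothesis and made-up likelihood values (none of the numbers below come from the chapter):

```python
def bayes_update(prior, p_data_given_h, p_data_given_not_h):
    """One Bayesian update for a binary hypothesis H."""
    # P(data) by the law of total probability
    evidence = p_data_given_h * prior + p_data_given_not_h * (1 - prior)
    # Posterior P(H | data) from Bayes' theorem
    return p_data_given_h * prior / evidence

# Step 1: subjective prior belief in the hypothesis H (assumed value)
belief = 0.30

# Steps 2-4: each day's observation updates the belief,
# and today's posterior becomes tomorrow's prior.
daily_observations = [True, True, False, True]  # hypothetical evidence stream
for supports_h in daily_observations:
    if supports_h:
        belief = bayes_update(belief, p_data_given_h=0.8, p_data_given_not_h=0.3)
    else:
        belief = bayes_update(belief, p_data_given_h=0.2, p_data_given_not_h=0.7)
    print(f"updated belief in H: {belief:.3f}")
```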

3. The Crucial Difference: Frequentist vs. Bayesian

Comparison of Two Statistical Perspectives

| Category | Frequentist | Bayesian |
|---|---|---|
| Definition of Probability | Frequency occurring in infinite repetition | Degree of personal belief (uncertainty) |
| Parameter | A fixed but unknown value | A variable with a probability distribution |
| Role of Data | The sole basis for judgment | A means to update existing beliefs |
| Pros | Objective and standardized procedures | Can begin analysis with small data; highly flexible |
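
The "Parameter" row is easiest to see in code. Below is a minimal sketch of a conjugate Beta-Binomial update for a coin's heads probability, assuming a uniform Beta(1, 1) prior and a made-up five-flip sample; it also illustrates why a Bayesian can begin with very little data:

```python
# The heads probability p is treated as a random variable with a
# Beta(alpha, beta) distribution, not as a single fixed number.
alpha, beta = 1, 1  # Beta(1, 1) = uniform prior over p (an assumption)

flips = [1, 1, 0, 1, 0]  # hypothetical small sample: 1 = heads, 0 = tails
heads = sum(flips)
tails = len(flips) - heads

# Conjugate update: simply add the observed counts to the prior parameters.
alpha += heads
beta += tails

posterior_mean = alpha / (alpha + beta)
print(f"posterior for p is Beta({alpha}, {beta}); mean estimate = {posterior_mean:.2f}")
```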

4. Practical Case: Updating a Virtual Cancer Diagnostic Test

An example of updating the result of a rare-disease diagnostic kit from a Bayesian perspective, starting from a 1% prior probability of having the disease.

[Chart] Change in Posterior Probability Based on Diagnostic Results: confidence (the posterior probability) increases dramatically after two consecutive positive results compared to just one.
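
A minimal numerical sketch of this update in Python. The 1% prior comes from the text; the test's sensitivity (90%) and false-positive rate (5%) are assumed values chosen only for illustration:

```python
def posterior_after_positive(prior, sensitivity=0.90, false_positive_rate=0.05):
    """P(disease | positive result), for an assumed test accuracy."""
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_positive

first = posterior_after_positive(0.01)    # 1% prior from the example
second = posterior_after_positive(first)  # yesterday's posterior becomes today's prior
print(f"after one positive:  {first:.1%}")   # roughly 15%
print(f"after two positives: {second:.1%}")  # roughly 77%
```

With these assumed accuracy figures, one positive result lifts the probability to only about 15%, while a second consecutive positive pushes it to roughly 77%, which is the jump the chart above describes.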


💡 Professor’s Tip

Spam filters and the decision-making algorithms in self-driving cars are all built on Bayes' Theorem. A self-driving car, for example, is essentially under a standing instruction to "constantly update the probability that the object captured by the sensor is a person." Bayesian statistics is also the branch of statistics that most closely resembles the way our brains learn.
