Introduction to Bayesian Statistics: The Process of Experience Becoming Knowledge
Bayesian Statistics: Updating Probabilities
While traditional (Frequentist) statistics asks, “How many times would heads appear if we flipped a coin 1,000 times?”, Bayesian statistics asks, “How accurate is my belief?” We move closer to the truth by continually refining our existing knowledge whenever new information becomes available.
1. Formula and Meaning of Bayes’ Theorem

P(H | D) = P(D | H) × P(H) / P(D)

- P(H) (Prior): My belief before seeing any new data.
- P(D | H) (Likelihood): The probability that this data would be observed if my hypothesis is correct.
- P(H | D) (Posterior): My updated belief after confirming the data.
2. The Knowledge Update Process
The process by which a Bayesian makes judgments in the real world is as follows:
1. Assign an initial probability (prior) based on past experience or intuition.
2. Collect actual data (evidence) occurring in the field.
3. Convert the prior probability into a posterior probability using Bayes’ theorem.
4. Today’s posterior probability becomes tomorrow’s new prior probability.
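The update cycle above can be sketched in a few lines of Python. The starting prior of 0.5 and the likelihoods 0.8 / 0.4 are illustrative assumptions, not values from the text; they describe evidence that is twice as likely if the hypothesis H is true.

```python
def bayes_update(prior, p_data_given_h, p_data_given_not_h):
    """Apply Bayes' theorem for a binary hypothesis H vs. not-H.

    The evidence term P(D) averages the likelihood over both hypotheses.
    """
    evidence = p_data_given_h * prior + p_data_given_not_h * (1 - prior)
    return p_data_given_h * prior / evidence

# Start from an intuition-based prior of 0.5, then fold in three
# pieces of evidence, each twice as likely under H as under not-H.
belief = 0.5
for _ in range(3):
    belief = bayes_update(belief, p_data_given_h=0.8, p_data_given_not_h=0.4)
    # today's posterior becomes tomorrow's prior

print(round(belief, 3))  # → 0.889
```

Note how the loop never recomputes from scratch: each posterior is fed back in as the next prior, which is exactly step 4 of the process above.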
3. The Crucial Difference: Frequentist vs. Bayesian
Comparison of Two Statistical Perspectives
| Category | Frequentist | Bayesian |
|---|---|---|
| Definition of Probability | Frequency occurring in infinite repetition | Degree of personal belief (Uncertainty) |
| Parameter | A fixed but unknown value | A variable with a probability distribution |
| Role of Data | The sole basis for judgment | A means to update existing beliefs |
| Pros | Objective and standardized procedures | Can begin analysis with small data; highly flexible |
4. Practical Case: Updating a Virtual Cancer Diagnostic Test
An example of how a Bayesian updates beliefs for a rare-disease diagnostic kit, starting from a 1% prior probability (the disease’s prevalence).
Change in Posterior Probability Based on Diagnostic Results
Confidence (the posterior probability) increases dramatically with two consecutive positive results compared to just one.
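This behavior can be reproduced numerically. Since the text does not state the kit’s accuracy, the 90% sensitivity and 95% specificity below are illustrative assumptions:

```python
def posterior_after_positive(prior, sensitivity, specificity):
    """P(disease | positive result), by Bayes' theorem.

    The evidence P(positive) mixes true positives among the sick
    and false positives among the healthy.
    """
    false_positive_rate = 1 - specificity
    evidence = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / evidence

# Assumed kit accuracy (not given in the text): 90% sensitivity, 95% specificity.
p = 0.01                                     # 1% prior (prevalence)
p = posterior_after_positive(p, 0.90, 0.95)  # after one positive: ~0.154
p = posterior_after_positive(p, 0.90, 0.95)  # after a second positive: ~0.766
```

Even with a fairly accurate test, a single positive result on a rare disease yields only about a 15% posterior, because false positives among the many healthy people dominate; it is the second consecutive positive that pushes belief above 75%, which is the dramatic jump described above.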
💡 Professor’s Tip
Spam filters and the decision-making algorithms of self-driving cars are all based on Bayes’ Theorem. A self-driving car is, in effect, instructed to “constantly update the probability that the object captured by the sensor is a person.” It is also the branch of statistics that most closely resembles the way our brains learn.