A version of this article appears in Twisted Logic: Puzzles, Paradoxes, and Big Questions. By Leighton Vaughan Williams. Chapman & Hall/CRC Press. 2024.
UNDERSTANDING EVIDENCE THROUGH A STORY
Imagine a situation where your friend, known for her upstanding character, is accused of vandalising a shop window. The only evidence against her is a police officer’s identification. You know her well and find it hard to believe she would commit such an act.
SETTING THE STAGE FOR BAYESIAN ANALYSIS
In Bayesian terms, the ‘condition’ is your friend being accused, while the ‘hypothesis’ is that she’s guilty. To apply Bayes’ theorem, we consider three probabilities:
Prior Probability: Based on your knowledge of your friend, you might initially think there’s a low chance of her guilt, say 5%. This is your ‘prior’—your belief before considering the new evidence.
Likelihood of Evidence If Guilty: Consider how reliable the officer’s identification is. If your friend were guilty, what’s the chance that the officer would identify her correctly? Let’s estimate this at 80%.
Likelihood of Evidence If Innocent: What’s the chance of the officer mistakenly identifying your friend if she’s innocent? Factors like similar appearances or biases could play a role. Let’s estimate this at 15%.
THE ITERATIVE NATURE OF BAYESIAN UPDATING
Bayes’ theorem allows for continual updates. If new evidence arises, you can recalculate, using your updated belief as the new ‘prior’. This process offers a dynamic way to assess situations as they evolve.
WHEN EVIDENCE DOESN’T ADD UP
In cases where evidence is equally likely whether the hypothesis is true or false, it doesn’t change our belief. It’s crucial to evaluate the quality of evidence, not just its existence.
CHALLENGES IN ASSIGNING PROBABILITIES
While assigning precise probabilities to real-life situations can be challenging, the exercise is invaluable. It forces us to think critically and systematically about our beliefs and how new information affects them.
The Unfolding Story
Now let’s consider the story in a little more detail. You’ve received a phone call from your local police station. An officer tells you that your friend, someone you’ve known for years, is currently assisting the police in their investigation into a case of vandalism. The crime in question involves a shop window that was smashed on a quiet street, close to where she resides. Furthermore, the incident took place at noon that day, which happens to be her day off work.
You had heard about the incident, but had no reason to believe your friend was involved. After all, she’s not a person known for reckless or unlawful behaviour.
However, this is where the narrative takes a twist. Your friend comes to the phone and tells you that she’s been charged with the crime. The accusation primarily stems from the assertion of a police officer who has positively identified her as the offender. There’s no other evidence, such as CCTV footage or eyewitness testimonies, to substantiate the officer’s claim.
She vehemently maintains her innocence, insisting it’s a case of mistaken identity.
The Challenge
Now, as a follower of Bayes as well as a close friend, you find yourself in a position where you need to evaluate the probability that she has committed the crime before deciding how to advise her. This challenge leads us to the central theme of this section: the application of Bayes’ theorem to real-life situations.
Before we proceed, let’s clarify our terms. The ‘condition’ in this context is that your friend has been accused of causing the criminal damage. The ‘hypothesis’ we aim to assess is the probability that she is indeed guilty.
Bayes’ Theorem and Its Application
So, how does Bayes’ theorem help us answer this question? Well, Bayes’ theorem is a formula that describes how to update the probabilities of hypotheses being true when given new evidence. It follows the logic of probability theory, adjusting initial beliefs based on the weight of evidence.
To apply Bayes’ theorem, we need to estimate three crucial probabilities:
Prior probability (‘a’)
The prior probability refers to the initial assessment of the hypothesis being true, independent of the new evidence. In this scenario, it equates to the likelihood you assign to your friend being guilty before you hear the evidence.
Considering you’ve known her for years and her involvement in such an act would be uncharacteristic, you might deem this probability low. After thoughtful consideration of your friend’s past actions and character, allowing for the fact that she was off work that day and in the neighbourhood, let’s say you assign a 5% chance (0.05) to her being guilty.
Assigning this prior probability requires an honest evaluation of your initial beliefs, unaffected by the newly received information.
Conditional probability of evidence given hypothesis is true (‘b’)
Next, you need to estimate the likelihood that the new evidence (officer’s identification) would have arisen if your friend were indeed guilty.
This estimate might be guided by factors such as the officer’s reliability, credibility, and proximity to the crime scene. For the sake of argument, let’s estimate this probability to be 80% (0.8).
Conditional probability of evidence given hypothesis is false (‘c’)
The third estimate involves figuring out the probability that the new evidence would surface if your friend is innocent. This entails gauging the chance that the officer identifies your friend as the offender when she isn’t guilty.
The probability could be influenced by several factors—perhaps the officer saw someone of similar age and appearance, jumped to conclusions, or has other motivations. For the purposes of our discussion, let’s estimate this probability to be 15% (0.15).
Probabilities Adding Up
An interesting point to note is that the sum of probabilities ‘b’ and ‘c’ doesn’t have to equal 1. For example, the police officer might have a reason to identify your friend either way, whether she’s guilty or innocent, in which case the sum of ‘b’ and ‘c’ could exceed 1. Alternatively, the officer may be reluctant to identify a suspect in any circumstance unless he is absolutely certain, in which case ‘b’ plus ‘c’ may well sum to rather less than 1. In this particular narrative, ‘b’ plus ‘c’ add up to 0.95.
Calculation and Interpretation
With these estimates in hand, we can now apply Bayes’ theorem, which calculates the posterior probability (the updated probability of the hypothesis being true after considering new evidence) using the formula: ab / [ab + c(1 − a)].
In our case, substituting the values gives 0.05 × 0.8 / [0.05 × 0.8 + 0.15 × 0.95] = 0.04 / 0.1825 ≈ 21.9%. What does this mean? Despite the officer’s confident identification (a seemingly strong piece of evidence), there’s only about a 21.9% probability that your friend is guilty given the current information.
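The arithmetic can be checked with a few lines of Python. This is a minimal sketch of the formula above; the function name `posterior` is ours, chosen for illustration:

```python
def posterior(a, b, c):
    """Bayes' theorem for a single piece of evidence: ab / [ab + c(1 - a)].

    a: prior probability the hypothesis is true
    b: P(evidence | hypothesis true)
    c: P(evidence | hypothesis false)
    """
    return (a * b) / (a * b + c * (1 - a))

# The officer's identification, with the estimates from the text.
p = posterior(a=0.05, b=0.8, c=0.15)
print(round(p, 3))  # 0.219, i.e. roughly 21.9%
```

Note that the low prior dominates: even fairly reliable evidence (80% versus 15%) leaves guilt well below an even chance.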
This result may seem counterintuitive. However, this discrepancy arises from our understanding of prior probability and the weight we assign to the new evidence. We must remember that the officer’s identification is only one piece of the puzzle, and its strength as evidence is balanced against the prior probability and the potential for a false identification.
Updating the Probability
The beauty of Bayes’ theorem lies in its iterative nature. Let’s suppose that another piece of evidence emerges—say, a second witness identifies your friend as the culprit. You can reapply Bayes’ theorem, using the posterior probability from the previous calculation as the new prior probability. This iterative process allows you to incorporate additional pieces of evidence, each of which updates the probability you assign to your friend’s guilt or innocence.
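This chaining of updates can be sketched in a few lines. The second witness’s reliability figures here (70% chance of identifying her if guilty, 20% if innocent) are purely illustrative assumptions, not taken from the text:

```python
def posterior(a, b, c):
    """Bayes' theorem for a single piece of evidence: ab / [ab + c(1 - a)]."""
    return (a * b) / (a * b + c * (1 - a))

belief = 0.05                            # original prior probability of guilt
belief = posterior(belief, 0.8, 0.15)    # officer's identification: ~0.219
# Hypothetical second witness (illustrative reliability estimates):
belief = posterior(belief, 0.7, 0.2)
print(round(belief, 2))  # roughly 0.50 with these assumed numbers
```

Each update treats the previous posterior as the new prior, so the order in which independent pieces of evidence arrive doesn’t change the final answer.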
Cases Where Evidence Adds No Value
Consider a situation where ‘b’ equals 1 and ‘c’ also equals 1. This would imply that the officer would identify your friend as the offender whether she was guilty or not. In such cases, the identification fails to update the prior probability, and the posterior probability remains the same as the prior. More generally, whenever ‘b’ equals ‘c’, the evidence is exactly as likely under guilt as under innocence, so it carries no information and leaves your belief unchanged.
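This point can be verified numerically. A short sketch, again using an illustrative `posterior` helper for the formula in the text:

```python
def posterior(a, b, c):
    """Bayes' theorem for a single piece of evidence: ab / [ab + c(1 - a)]."""
    return (a * b) / (a * b + c * (1 - a))

# If the officer would identify your friend regardless of guilt (b = c = 1),
# the identification carries no information: posterior equals prior.
assert posterior(0.05, 1.0, 1.0) == 0.05

# The same holds whenever b equals c, not just when both are 1.
assert abs(posterior(0.05, 0.3, 0.3) - 0.05) < 1e-12
print("uninformative evidence leaves the prior unchanged")
```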
The Imperfections of Assigning Probabilities
Now, it’s worth recognising the potential difficulty in assigning precise probabilities to real-life situations. After all, our scenario involves complex human behaviour and a unique event.
However, our inability to determine precise probabilities shouldn’t lead us to dismiss the process. In fact, this process of estimation is what we’re doing implicitly when we evaluate situations in our everyday lives.
While the results might not be perfect, Bayes’ theorem provides a systematic approach to updating our beliefs in the face of new evidence.
CONCLUSION: BAYESIAN REASONING IN REAL LIFE
Bayes’ theorem provides a structured approach to incorporating new evidence into our beliefs. It’s a tool that enhances our decision-making, offering a mathematical framework to navigate uncertainties, from everyday dilemmas to complex legal and medical decisions.
As we grapple with uncertainty, the application of Bayes’ theorem allows us to transition from ignorance to knowledge, systematically and rationally. Thus, whether we’re faced with a shattered shop window or any other challenging situation, we have a powerful tool to help us navigate our path towards truth.

