Sharks, Dogs and Biases
As a data professional, I am expected to interpret data and provide stakeholders with insights in the form of a story.
Many refer to this skill as "critical thinking." While that is valuable, you must also question how you arrived at your conclusions in the first place.
According to Wikipedia, a bias is a disproportionate weight in favor of or against an idea or thing, usually in a way that is inaccurate, closed-minded, prejudicial, or unfair.
I gained a deeper understanding of biases during an Organizational Behavior course I took in university, and I briefly mentioned it in Blog Post #8, which discusses data-driven decision-making.
Beyond numbers in a database, biases can lead to poor decisions in the real world.
For example, you are statistically far more likely to be bitten by a dog than by a shark; yet availability bias led authorities in Tobago to place a bounty on the shark that bit a tourist.
Being aware of different types of biases can encourage you to ask more questions before accepting the final results of your analysis.
With that in mind, let's look at the different types of biases and ways in which they can be avoided:
Availability Bias
This is the tendency to judge how likely an event is by how easily examples of it come to mind. Vivid, recent, or heavily reported events, such as a shark attack on a tourist, feel far more common than they actually are, while routine risks like dog bites fade into the background.
To counter it, check the actual base rates before reacting to a dramatic event, as in the sketch below.
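Putting rough numbers side by side is often enough to deflate a vivid anecdote. Here is a minimal Python sketch of that base-rate check; the counts and population figure are hypothetical placeholders, not real bite statistics.

```python
# A minimal base-rate sanity check. All numbers below are
# hypothetical placeholders, not real statistics.
incidents = {
    "dog bites": 4_500_000,   # hypothetical annual count
    "shark bites": 70,        # hypothetical annual count
}
population = 330_000_000      # hypothetical population at risk

for event, count in incidents.items():
    # Express each risk as "1 in N people per year" for easy comparison.
    print(f"{event}: roughly 1 in {population // count:,} people per year")
```

Even with made-up inputs, the structure of the check is the point: the dramatic event is orders of magnitude rarer than the mundane one, which the availability heuristic hides.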
Confirmation Bias
This is the tendency to seek out or give more weight to information that confirms our existing beliefs, while disregarding evidence that contradicts them. If someone already fears sharks, they might focus on every instance of a shark attack while ignoring statistics that show how rare such incidents are.
In data analysis, countering this bias means actively seeking contradictory evidence and being willing to revise your interpretation in light of the full spectrum of available information, as the sketch below illustrates.
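A small Python sketch of the difference between a confirmation-biased query and a full-spectrum one. The incident log and the 1-in-1,000 shark proportion are invented for illustration.

```python
import random

random.seed(42)

# Hypothetical incident log: each record is "shark" or "dog".
# The 0.1% shark proportion is made up for illustration.
incidents = ["shark" if random.random() < 0.001 else "dog"
             for _ in range(100_000)]

# Confirmation-biased query: count only the records that confirm the fear.
shark_count = sum(1 for i in incidents if i == "shark")
print(f"Shark incidents: {shark_count}")  # alarming when read in isolation

# Full-spectrum view: the same number in the context of all incidents.
print(f"Share of all incidents: {shark_count / len(incidents):.2%}")
```

The first print statement is technically true and emotionally persuasive; only the second, which uses all of the data, tells you how much weight the evidence deserves.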
Anchoring Bias
This refers to the tendency to rely too heavily on the first piece of information encountered when making decisions. This initial information "anchors" subsequent judgments, even if it is irrelevant or misleading. In the case of the shark attack, the immediate focus on the danger posed by the shark could have anchored the decision-making process, causing officials to fixate on this one-off event rather than considering a broader context.
It’s useful to regularly ask yourself: Is the first piece of information I encountered still influencing my decision, and should it?
Hindsight Bias
Another bias worth considering is hindsight bias, which is the tendency to see past events as having been predictable, even when they were not. After a shark attack, it’s easy to think, "I knew something like this would happen," especially if there were prior warnings or isolated reports of shark sightings. In reality, predicting such rare events is nearly impossible.
To combat this bias, remember that rare events do not become predictable just because they seem obvious in retrospect.
Overconfidence Bias
This occurs when someone has excessive belief in their own ability to make accurate judgments or predictions. In the shark example, authorities may have been overconfident in their ability to assess the situation without fully consulting data or experts on marine behavior.
To avoid overconfidence bias, consult multiple sources, review the data critically, and stay aware of the limits of your knowledge. Quantifying the uncertainty in your estimates is one concrete guard, as the sketch below shows.
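One practical habit is to report an honest uncertainty range instead of a bare point estimate. A minimal Python sketch, using hypothetical sample values and a normal approximation:

```python
import math

# A minimal sketch: report uncertainty instead of a bare point estimate.
# The sample values below are hypothetical.
sample = [12, 15, 9, 14, 11, 13, 10, 16, 12, 14]

n = len(sample)
mean = sum(sample) / n
variance = sum((x - mean) ** 2 for x in sample) / (n - 1)
stderr = math.sqrt(variance / n)

# Rough 95% interval via a normal approximation (a simplifying assumption;
# a t-distribution would be more appropriate for a sample this small).
low, high = mean - 1.96 * stderr, mean + 1.96 * stderr
print(f"estimate: {mean:.1f} (95% CI roughly {low:.1f} to {high:.1f})")
```

A wide interval is a useful, humbling signal: it tells you and your stakeholders exactly how much confidence the data actually supports.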