Good and Bad Reasoning in Decision Making: The Rise of Misinformation and Bias

The decisions we make define our lives, shaping not only our individual paths but also the course of history itself. At the heart of this uniquely human experience is the power of reason, a cognitive capacity that distinguishes us from other species.
While a tiger acts on pure instinct, our minds are capable of a different kind of foresight. This allows us to accept things that seem harmful at first, such as the stab of a vaccine needle, because reason tells us they will prove valuable. Yet it would be foolish to think we're really all that different from our animal counterparts. We are just as capable of being fooled into making small errors that have massively harmful outcomes.
That is why it's so important to know how to reason well and to spot bad reasoning in others and, more importantly, in ourselves. By doing so, we can make properly informed decisions or, at the very least, avoid poorly informed ones. Many arguments from salesmen, politicians and advertisers are riddled with logical fallacies designed to grab our attention and change our minds. It's in our own best interest to be aware of others' shortcomings in reasoning and to avoid being manipulated into doing someone else's bidding (even if by accident).
There are hundreds of common logical fallacies. Here, we'll focus on a few key ones that are prevalent in today's misinformation landscape.
Related: The Psychological Strategies of Influencers
The Problem of the “Non-Expert Expert”
One of the most significant challenges today is the rise of the non-expert expert. This phenomenon occurs when an individual, highly credible and respected in their specific field, leverages that authority to speak on topics outside their area of expertise. For example, a successful tech entrepreneur might offer advice on nutrition, or a popular actor might weigh in on public health policy.
This situation is particularly dangerous because it exploits the white coat effect, a form of the appeal to authority fallacy. We are conditioned to trust people in positions of power or with perceived expertise. When a non-expert expert speaks, their existing authority acts as a powerful halo, making their opinions on unrelated subjects seem more credible than they are. This cognitive bias can lead to the widespread acceptance of inaccurate or misleading information simply because of who said it, not because of what was said.
Logical Fallacies and Their Role in Misinformation
Misinformation often thrives by exploiting common logical fallacies. These aren't just academic concepts; they are the building blocks of persuasive yet flawed arguments that can easily sway public opinion, which makes them incredibly dangerous.
The Appeal to Nature Fallacy
This fallacy argues that something is good, right, or healthy simply because it is "natural" or "unprocessed." For example, an article might claim a certain diet is superior because it's based on "foods our ancestors ate," or that a specific herbal remedy is better than a pharmaceutical drug because it comes "from the earth." Yet our ancestors, who ate only what the earth naturally provided, typically lived to only around 30 years of age and died from horrible diseases we can now prevent.
This line of reasoning is flawed because it ignores that "natural" doesn't automatically equate to "safe" or "effective." Many poisonous plants are natural, many life-saving medicines are synthetic, and the reverse is just as often true. By appealing to a romanticised idea of nature, this fallacy can lead people to make decisions that are not based on scientific evidence.
Related: Labubu, Stanley, and Matcha Hype: Lessons on Consumer Behavior
The Part-to-Whole Fallacy
Also known as the fallacy of composition, the part-to-whole fallacy incorrectly assumes that what is true of a part must also be true of the whole. A common example is arguing that because a single component of a complex system has a certain property, the entire system must share that property.
In the context of misinformation, this might manifest as a headline claiming a new product is completely safe because one of its ingredients is a well-known, harmless substance. It ignores the fact that the combination of different components, or their interaction within a larger system, can create entirely new effects. This fallacy oversimplifies complex realities and can lead to a false sense of security or understanding.
Navigating the Information Landscape
To combat the rising tide of misinformation, we must become more critical consumers of information. We need to look beyond the reputation of the person speaking and evaluate the evidence they present. By recognising these logical fallacies and biases, we can better protect ourselves from manipulative and misleading narratives.
Communication is a two-way street: we need to be aware of errors in reasoning coming not only from others but also from ourselves. A strong communicator can form stronger arguments that hold up to scrutiny. Conversely, being able to identify weaknesses in others' arguments allows you to respond more effectively and engage in more productive discussions. This not only enhances your ability to express yourself but also helps you better understand the perspectives of others. Ultimately, this practice isn't about shaming others, but about collaboratively working toward a clearer, more accurate understanding of the world so we can all make better decisions.
Ethan is currently the Director of Production at Leaderonomics. He hails from New Zealand and loves to travel. He is currently in Malaysia, having fun making cool content and inspiring millions through his work at Leaderonomics.