Rigor Brings Truth
May Contain Lies

Alex Edmans, a finance professor at the London Business School, defines the psychological tendencies that undercut rational analysis to help you avoid drawing mistaken conclusions.

Edmans, author of Grow the Pie: How Great Companies Deliver Both Purpose and Profit, a 2020 Financial Times Book of the Year, identifies the psychological tendencies, or biases, that can lead you to draw erroneous conclusions.

Misinformation

Misinformation increases with social polarization and people’s dependence on social media as a source of information. Edmans says you can do two main things to thwart misinformation: First, check your facts. And, second, be aware of and compensate for your cognitive biases and the biases of the people who are communicating to you.

“In today’s post-truth world, it’s more important than ever to separate myth from reality.” (Alex Edmans)

Distinguishing fact from fiction requires understanding the source of errors, the problems they cause, and what to do about them. You can find opinions and reports that support almost any position, but even sources you consider reliable might be inaccurate. Accuracy requires getting the facts right plus using them correctly and in context.

Confirmation bias is a tendency to believe ideas that align with your desired reality. It results in decision-making that stems from emotion, not evidence. It ignores or misinterprets facts. Confirmation bias is a frequent culprit in unjust criminal convictions.

Such thinking occurs because a challenge to something you believe activates the amygdala in your brain, creating dissonance between its automatic fight, flight, or freeze response and the rational part of your brain. Resolving this discomfort – especially by giving in to confirmation bias and accepting inaccurate input – releases the rewarding, feel-good hormone dopamine.

Edmans warns that confirmation bias causes you to dig in your heels to support your existing beliefs, even when you are gaining more and better information to the contrary. This process – called “belief polarization” – suggests, counterintuitively, that giving people more information does not necessarily help persuade them to change their beliefs. One research study even found that people with more knowledge showed more bias than those with less knowledge.

Consider researching data that opposes your views. Actively engaging with information that runs contrary to your current position opens you to new insights, even if you ultimately do not change your views.

Binary Thinking

Humans tend to think in binary terms — all or nothing. People grow accustomed to binary contrasts: friend or enemy, this candidate or that one. This tendency is ancestral. Generalizations enabled early humans to make quick, survival-enhancing judgment calls.

“Evidence is not proof, because it may not be universal.” (Alex Edmans)

Binary thinking leads to rigid, black-and-white judgments. When a concept or plan is neither all good nor all bad – when it is positive in some situations and negative in others – treating it as wholly one or the other produces false conclusions about its inherent qualities. People who encounter something with both positive and negative attributes and ignore that duality also avoid useful analysis in order to preserve their all-or-nothing point of view.

Embrace Critical Thinking

Giving in to a narrative fallacy – such as assuming a causal relationship between two unrelated things – leads to false conclusions. To avoid this, Edmans says, consider whether you are measuring the right variables and whether a causal relationship makes sense.

Favor sources such as high-quality scientific journals, which subject potential articles to a peer-review process. When an article survives such expert inspection, you can reasonably conclude that its research is trustworthy. Even experts can fail to discern falsehood from truth, but the process of replicating studies and reviewing past research helps scientists uncover errors.

Be wary of hyperbole. Consider the source’s credentials. Ask whether the source has incentives to mislead. Determine whether other experts agree with the source’s findings or opinions.

Groupthink – the tendency to go along with the crowd rather than to dissent – also leads to erroneous conclusions. However, Edmans notes that diverse members of a group can unite to seek common ground and put their differences into perspective. 

Before you lead a discussion, distribute background materials so no one gets locked into your point of view. Avoid setting up a default position. Remove the sense of rank among decision-makers. Use private voting to bring out minority positions. Value criticism and dissent.

Edmans explains that companies should teach their employees to think critically because that reduces the organization’s vulnerability to misinformation and misinterpretation. This includes learning to consider the counterfactual and grasp basic statistics. Thinking critically requires people to be curious and to assess arguments with intellectual rigor.

People tend to align their thinking with their own group or tribe. Societies should help people separate their group identity from the process of assessing evidence: Avoid deriding other groups, and invite politically or philosophically diverse spokespeople to join the discussion. Instead of emphasizing problems, Edmans says, emphasize solutions and the diverse values they support.

“Contrasting opinions needn’t be incorrect; they’re like looking at a landscape from a different perspective.” (Alex Edmans)

If you find yourself in a heated disagreement, take deep breaths to calm yourself. Identify the overarching common goal and address your concerns about achieving it. Listen to others. Acknowledge to yourself that your opinions will color your judgment and that you may not be objective.

Remember that evidence has limits. Consider your unique context and trust your own judgment.

Worthwhile Assessment Tools

Professor Alex Edmans details how false assumptions and cognitive biases can lead you astray. He uncovers the logical fallacies that occur when people draw erroneous links among facts, data, evidence, and proof. Edmans offers worthwhile tools for assessing the validity of information and distinguishing fact from fiction. In this, he provides a service for anyone who is attempting to negotiate today’s news environment, in which so many sources or platforms seek only to inflame your emotions and convert you to their views. Edmans helpfully discusses the difficulties of filtering evidence and its sources as you reach decisions and conclusions you can trust.
