The ability to think clearly is essential to making better decisions – whether you’re launching a new product, defining a legal strategy, or managing a pandemic. But cognitive biases affect the way we frame issues, interpret data, and come up with ideas.
The human brain is naturally wired to find simple answers, not seek objective truth. It looks for patterns to automate repetitive tasks and decisions. It settles on data points that fit with current perceptions, which may be flawed. While this is fine for routine matters, it can lead to serious errors in high-stakes situations.
It’s not enough to be aware of your blind spots and how they shape beliefs, drive emotions, and influence actions. You also need to safeguard against default thinking on complex issues.
No one is entirely objective in processing information. Cognitive biases are a part of human nature. One of the most common and damaging types is confirmation bias. It leads us to seek out and recall facts that confirm what we already believe, while disregarding or overlooking competing facts that support a different view.
Confirmation bias contributes to division and polarization, where right-wrong, good-evil, dichotomous thinking prevails. Bias, by itself, doesn’t necessarily make you wrong or more intolerant of those who disagree with you. But when you don’t see the whole picture (i.e. objective facts and evidence), and see only what you want to see (i.e. facts and evidence confirming your beliefs), you end up with suboptimal solutions.
The first step to thinking clearly and making better decisions is to get out of your echo chambers.
In echo chambers, you are conditioned to distrust contrary views, arguments and information from the “other side.” Competing ideas, beliefs and data points are blocked or ignored. This keeps you in an epistemic bubble, where you are simply never exposed to them.
Virtual echo chambers
Echo chambers distort reality and manipulate your impression of others. Social media, social networks, and online discussion sites like Facebook, Twitter, and Reddit amplify and reinforce biases. Through AI algorithms, search engines (e.g. Google and Yahoo) and content-delivery websites (e.g. YouTube) give us more of what we want and less of what we don’t want. They can also suppress certain types of information.
In today’s attention economy, the most extreme viewpoints and sensationalized headlines get the most clicks. Individuals with neutral or moderate positions tend to avoid the debate, get sucked into the extremes, or be dismissed as clueless.
While modern technology like the Internet allows us to connect regardless of physical distance, it also has its drawbacks. Compared to face-to-face interactions, online communication makes it easier to express outrage over things you oppose and show less tolerance of competing beliefs.
Trolling, cyberbullying and other hostile online behavior are not uncommon, particularly in anonymous postings and through anonymous social media accounts. A faux pas that would be shrugged off or easily resolved in person becomes a major disagreement on the Internet.
Determining intent and understanding context in Twitter feeds, social media content, and even email is difficult. Words have different meanings and may affect each person differently. When we agree with the author’s viewpoint, we tend to assume their values, beliefs and perceptions are in line with our own. When we don’t, we might stereotype and label them negatively.
Communicating on a public medium often makes people more defensive and less likely to concede on valid points or point out areas of agreement. Use private online chats and direct messaging if you really want to encourage healthy discourse.
There is excess information, misinformation, filtered information, and conflicting information. For example, on issues of political controversy, check out both liberal-leaning media (e.g. New York Times, MSNBC, The American Independent) and conservative-leaning media (e.g. Wall Street Journal, Fox News, Breitbart).
Then search for unmoderated and less exciting information from more independent sources like C-SPAN. You could also tune into YouTube talk shows and podcasts covering in-depth and nuanced conversations on topics of interest.
Being exposed to opposing arguments will burst your epistemic bubble. But it won’t necessarily change your views or shift your positions — especially if you’ve learned, within your echo chamber, to distrust outside sources. Access to more information creates more opportunities to confirm biases. You might feel vindicated (when you receive confirming information) or disgusted (when you receive disconfirming information).
For concessions and agreements to be reached, each side has to earn the other’s trust, which is hard to do. Stay curious. Stay humble. Take time and make space for a well thought-out response, instead of giving in to an immediate reaction.
Sometimes you need a digital detox. Read a book instead. Unplug from your devices such as smartphones, televisions, computers, and tablets. Limit your use of social media and get the apps off your phone. If you feel compelled to stay informed, a one-time check at the end of the day is usually more than enough.
Get a good night’s sleep, move, eat well, and practice mindfulness to help you avoid snap judgments. Self-care helps you activate the prefrontal cortex of your brain, which is responsible for self-control, critical thinking and decision-making. Go for a long walk or sit in a quiet spot to reflect on information when you’re not emotionally charged and can think more calmly and objectively.
Real-life echo chambers
If you want to detect your blind spots, invite a free exchange of ideas and active discourse with friends, family members and colleagues who hold different beliefs and opinions.
Each must act in good faith, i.e. listen with curiosity, respect divergent viewpoints, refrain from trying to prove the other side wrong or change the other person’s mind, and find points of agreement. Some individuals are capable of having civil discussions; others are not. Choose wisely.
Discuss the strongest version (steel man) instead of the weakest version (straw man) of the person’s opposing viewpoint. Question what exactly they mean when they use vague terms or loose labels. Engaging in dialectic discourse sharpens your critical thinking skills. It allows you to poke holes in your own biases and recognize that either side could be right or wrong. No one has a monopoly on truth, especially on complex and evolving issues.
Ward off first impressions. Avoid making assumptions about the person’s reasons for a particular position. It could be entirely apolitical and based on a common set of ideals. Perhaps your hierarchy of priorities and values simply do not match exactly with theirs. Maybe you just disagree on the approach in reaching a shared objective.
When we consider others’ perspectives and how they came to hold the beliefs and opinions they do (whether through logic or emotion), we can develop more compassion, engage in productive discourse, stop fighting, and learn from each other.
Silo thinking leads us to ignore objective facts, see only what confirms our biases, and overlook important data that do not match preconceptions. Merely getting out of your echo chamber is not enough for you to adapt to new information or change preexisting beliefs. But it will help you understand multiple perspectives, strengthen your thinking, and make fully informed decisions.
* * *
Dyan Williams is a productivity coach who helps working parents, lawyers, small business owners and other busy people turn their ideas into action, reduce overwhelm, and focus on what truly matters. She is also a solo lawyer who practices U.S. immigration law and legal ethics at Dyan Williams Law PLLC. She is the author of The Incrementalist: A Simple Productivity System to Create Big Results in Small Steps, an e-book at http://leanpub.com/incrementalist.