How the Internet Amplifies Cognitive Errors

The Internet has brought us many great things. But as author Sarah Tarkoff reveals, it also magnifies some common cognitive errors that are all too human.

You’ve heard it before: technology is changing everything – even the way we think. Indeed, it’s even rewiring our brains, in a thousand ways we haven’t come to grips with yet.

But what if the real problem is that our brains aren’t changing enough to adapt to this new digital world? Human brains are wildly stubborn things, and they still process information more or less the way they evolved to, tens of thousands of years ago.

Now, brains that evolved for hunter-gatherer life are communicating not just in small tribal groups, but with billions of people around the world – allowing the internet to magnify all of our flaws. We’re making the same mistakes we always have… but now they’re multiplied on a global scale.

Below are four common cognitive errors that can be amplified by the internet. 

Confirmation bias

We instinctively seek out information that reinforces our existing beliefs, and avoid information that challenges them. There’s been a ton of fascinating research into confirmation bias, but the gist is this: we are all absolutely terrified of being wrong. Our brains rebel against it, to the point that we will outright reject information that contradicts our own point of view. And the more central a belief is to our worldview, the harder it is to shake.

Photo credit: Omar Prestwich / Unsplash

Which is why it’s so terrifying that, rather than making this problem better by exposing us to a plethora of new ideas, the internet can actually make it worse. For example, you’ve probably heard that your Facebook feed is curated by an algorithm to show you more of what you already like and agree with. Though Facebook has recently taken some steps to promote higher-quality news sources, the news marketplace itself has been permanently changed by social media.

Newspaper and magazine revenue used to come from subscribers who purchased the whole paper, meaning it was beneficial to include articles that would appeal to a wide audience. Now, rather than flipping through our local paper in the morning, we’re scrolling through social media… which means news ad revenue is now driven by individual articles being shared widely. And part of what makes something “shareable” is driven by confirmation bias: we like to amplify points of view that match our own. Which means rather than play to a wide audience, news organizations are now incentivized to play straight to our own confirmation bias – giving us exactly what we want.

And what are the results? Since the dawn of the internet age, American politics has polarized drastically, with people moving farther to one extreme or the other – partly because we’re no longer reading the same news as those from the opposite party. In fact, this is the one thing Obama and Hannity have agreed on lately: right now, Republicans and Democrats are living on different planets.

Groupthink/bandwagon effect

This one’s pretty straightforward: we’re more likely to think something is true if everyone around us seems to believe it. If you want to see what this looks like in real life, the Asch conformity experiments conducted in the 1950s are absolutely fascinating. The gist: if all the people around you state a belief that is obviously false, you might pretend to believe that falsehood too. Or – and this is where it gets terrifying – actually convince yourself that everyone else must be right. Human beings are conditioned to trust the group, to go along to get along. So when everyone on your social media feed is posting the same kinds of opinions, you’re primed to buy into them, too – whether they’re right or not. Even when some of the people on your social media feed aren’t people at all.

The Asch experiments also point to one very simple way to rise above groupthink: once one member of the group is willing to voice a dissenting opinion, it becomes easier for everyone else to speak their own truth. But social media doesn’t always make that easy.

Here’s an example: Let’s say that I have opinion A. But everyone on my social media feed is posting opinion B, and liking and rewarding each other for having opinion B. How does it benefit me to post something everyone will disagree with (especially right now when I have a book to promote)? No one will like my post about opinion A… in fact, they might even write mean replies. And because of confirmation bias (see above), most of my friends will continue believing opinion B whether I post anything or not. 

So Facebook is already showing me more articles from sites I agree with. And because of the bandwagon effect, the more of the same opinion I see, the more likely it is I’ll think it’s the right one. And anyone left with dissenting opinions is discouraged from sharing them in a public forum.

So where do those people with dissenting opinions go?

Ingroup bias

Photo credit: John Schnobrich / Unsplash

The internet is full of strangers ready to reinforce whatever it is you believe, even if everyone on your social media feed disagrees with you.

In some instances, this can be great! Are you the only LGBT+ kid in a small, conservative town? The internet is full of people experiencing exactly what you’re going through, and you can easily find a community of like-minded peers for support.

Unfortunately, there are also plenty of people on the internet sharing ideas that are downright false, or even harmful. Want to find someone who believes the earth is flat? Who denies the existence of global warming, or the efficacy of vaccines? Or, as is horrifically common, a forum that supports racism and sexism? The internet has the power to normalize all kinds of interests and beliefs, even the most heinous. Alt-right trolls, for example, often cloak their opinions in “humor” – using online meme culture to hide just how serious they are about the racism and sexism behind their jokes. But the jokes spread that racism and sexism all the same.

These online forums can be a breeding ground for one of the most dangerous cognitive errors: ingroup bias, a.k.a. favoring members of one’s own group and distrusting outsiders. And the more we connect with these groups online, the more our ingroup bias – our prejudice and favoritism – can grow. This is actually how ISIS recruits new members online: by making them feel like part of a community. An ingroup, ready to target anyone who challenges their membership. Many U.S. citizens have tried to join ISIS in Syria (and some have succeeded), connecting with their online community in the real world.

Okay, so we’ve gotten to ISIS recruiting. That’s pretty bad – realistically, as bad as it’s going to get in this article. But I’ve got one more scenario left for you: what if the internet leaves you totally alone?

Fundamental attribution error

Before the age of the internet, if you did something that caused everyone you knew to hate you, your life wasn’t over. You could move to a new city, make new friends, start over. But now, unless you change your name, your past comes with you in ones and zeroes. 

For some, maybe it should – for instance, the #MeToo movement has been using the public forum to out serial predators and harassers. As a woman working in Hollywood, I’ve found this movement to be an unexpected godsend, one that has already changed my own workplace for the better. Where courts and HR departments failed women, the court of public opinion has stepped in to protect us.

Photo credit: Jay Wennington / Unsplash

But what power that court has. And any power that can be used for good can also be used for ill. While our online jury is subject to all the cognitive errors described above (and a million others I didn’t have room for), it’s also driven by one simple but inescapable bias: the fundamental attribution error (FAE).

It sounds complicated, but it’s actually super simple: we chalk other people’s mistakes up to their character, and our own mistakes up to our circumstances. When I make a mistake, I understand all the factors that contributed to it – I was upset, I was stressed, I was running late: “I wouldn’t be so rude normally.” When others make mistakes, we aren’t so generous: “God, he’s so rude.”

In our day-to-day lives… that’s just life. We aren’t as sympathetic to others as we should be. But when we get online, and our fundamental attribution errors meet up with everyone else’s, their power is magnified. Because it’s so difficult for us to assess the root causes of other people’s mistakes, it’s easy to find fault, and to assume those faults are the result of a defect in character. It’s even easy to be cruel – without being forced to look someone in the face, it’s easy to say anything, to deny the other person any humanity at all.

There’s been a ton of interesting writing on public shaming in the internet age, but long story short: the sting of online humiliation and harassment is real, permanent, and pervasive. It can affect people’s jobs, their relationships, their mental health. And while we may feel they deserve it as we’re angrily typing from behind our keyboards… if we could step into their shoes, would we feel the same?

As the internet magnifies our confirmation and ingroup biases, and as algorithms and groupthink whittle down the opinions we even get to hear, it’s worth keeping an eye on what we believe, why we believe it, and how those beliefs are causing us to treat each other.

Because like it or not, the internet is the new public square, and our actions there have consequences that extend into the real world. And when our cognitive errors are magnified, those consequences can be bigger than we expect, and harder than ever to rectify.

SARAH TARKOFF currently writes for the CW series Arrow. Other TV writing credits include ABC’s Mistresses, Lifetime’s Witches of East End, and the animated series Vixen and The Ray. She graduated from USC with a degree in screenwriting (hence all the screenwriting), and currently lives in Los Angeles. Sinless is her debut novel.

Featured photo: Wikimedia Commons
