During the early days of the coronavirus pandemic, the World Health Organization identified another troubling outbreak: a massive “infodemic,” defined as an overabundance of information that makes it difficult for people to find trustworthy, reliable guidance when they need it most.

While some of the information in our social-media ecosystem is verifiably true, plenty of it is not. Unfortunately, research shows that lies tend to spread farther and faster than accurate information, particularly online. Some psychologists suggest that even after misinformation is corrected, we can go on believing it anyway, because our brains have trouble letting it go — a concept known as the “continued influence effect.”

Although misinformation may primarily spread on digital platforms, it has real-world consequences. Researchers estimate that between January and March 2020, nearly 6,000 people around the globe were hospitalized because of coronavirus misinformation: Whether they had believed that the virus was no worse than the flu or that it could be cured by gargling vinegar or drinking bleach, exposure to rumors and conspiracy theories had a direct effect on their health.

We’re also contending with disinformation, which is different from misinformation in its intent. If you share an inaccurate article on Facebook without knowing that it’s false, you’re spreading misinformation. Disinformation, on the other hand, means knowingly spreading falsehoods, often with the intent of manipulating a public narrative or influencing politics.

To make matters more complicated, this monster has a third head. Some researchers call it “gray-area misinformation,” and bad actors are using it to outsmart recent efforts by social-media platforms to crack down on falsehoods. Gray-area misinformation isn’t outright untrue — which means it’s harder to moderate — but it gets shared in a way that drives a misleading narrative.

Take the example of the U.S. women’s soccer team and its final pre-Olympic match on July 5, 2021. Before the game, World War II veteran Pete DuPré performed the national anthem on his harmonica, and during his performance, some of the players turned to face the American flag at one end of the field. Posts to a Facebook page called Hold the Line mischaracterized the players’ actions, stating that they “turned their backs” on DuPré in a “DISGRACE TO AMERICA!” Similar posts proliferated on Facebook and Twitter, referring to the team as “shameful” and “disrespectful,” with some posters calling for their disqualification from the Tokyo Olympics. The underlying facts are true: Some players did turn to face the American flag, in accordance with national-anthem etiquette recommended for civilians. But the implication that they did so as a gesture of disrespect toward DuPré is false.

In the end, the age of social media means that anyone can have a platform, and anyone with a platform can amplify content. That’s a tremendous responsibility, and learning to be more cognizant of what we read and share online is one way to be part of the solution to our misinformation problem.

The next time you come across a post that doesn’t seem quite right, take a moment to ask yourself some of these questions, which can help you learn to spot misinformation in the wild.

1. Who said it?

Whenever possible, trace information back to its source. Then ask yourself some questions: Who are they? Who do they work for? What’s their goal in sharing this? What expertise do they have? Thinking critically about these things can help you identify whether a source is trustworthy or has an ulterior motive.

Often, simply figuring out who’s behind the post can help you know whether to take it seriously. The Center for Countering Digital Hate has found that just 12 anti-vaccine activists are responsible for almost two-thirds of anti-vaccine content on social media. The “Disinformation Dozen” have massive social-media followings, which they’ve used to cast doubt on the safety of coronavirus vaccines.

If the post itself doesn’t offer many clues about the source, see if you can trace the claims to a news article or another website. Then, consider whether that site is a reliable source of information: Who’s behind the project? Does the article link to primary or secondary sources? Is it riddled with spelling errors or biased language? Can you verify the information elsewhere?

Depending on what you’re reading about, you may have to look at several different sources to get a sense of the whole picture. Whatever you do, don’t just read a headline and assume you get the gist. Also, be mindful of satirical news sites, many of which mimic the tone and appearance of actual news — research has shown that many Americans have trouble differentiating one from the other.

2. What’s the evidence?

If the post makes a particular claim — for example, that eating carrots will make your hair fall out — try to evaluate whether you see enough evidence to support that claim. Does the post link to a study or a news article with more information? Don’t believe it just because your neighbor says that her brother used to love carrots until he went bald. That’s called anecdotal evidence, and it’s usually not reliable, especially if it isn’t supported by scientific data.

Considering the evidence is especially important in this age of social media, when it’s easy for anyone with Photoshop to create a meme with a misleading statistic or two. Don’t believe every meme you see — they’re usually intended for laughs or virality, not for sharing important, reliable info.

3. Does it confirm a belief you already hold?

We’re wired to seek out content that reinforces our beliefs, a principle known as confirmation bias. “You notice the things you agree with,” media psychologist Pamela Rutledge, PhD, told Experience Life in 2020. “You share them because you’re reassuring yourself that your way is the right way.”

In psychology, confirmation bias is sometimes called “selective collection of evidence,” because we tend to hold on to information that supports our beliefs and reject the evidence we don’t like. One way to counter this is to make a conscious effort to diversify your media diet, including information from outlets outside of your usual bubble.

4. Does it appeal to your emotions?

Misinformation often plays on our feelings to circumvent our critical thinking. The next time you see a headline that is especially emotionally resonant — whether it makes you feel angry, or scared, or excited, or sad — consider that it’s likely an attempt to grab your attention or to get you to share without thinking.

One example of this is #SaveTheChildren, which went viral in August 2020. At first glance, many of these posts may have seemed emotionally resonant or even morally righteous — the movement purported to be about raising awareness around child sex trafficking. But lurking behind the hashtag? QAnon, a baseless conspiracy movement that leverages false statistics and emotional appeals about child sex trafficking to draw unwitting social-media users into its broader movement.

5. Is it hateful?

Plenty of people tweet angry, but if you’re seeing posts that seem especially cruel or violent — particularly if they target marginalized groups — consider that a red flag. Extremists often use social media to spread disinformation, conspiracy theories, and hate speech, and they’re often rewarded with increased exposure and new followers.

Some evidence suggests that social-media algorithms also amplify harmful or hateful content — because polarizing posts are more likely to go viral. Some of these algorithms drive users toward more extremist content by recommending more violent or hateful posts, a concept dubbed “algorithmic radicalization.”

6. Is it too soon?

Social media tends to move faster than traditional media. In the immediate aftermath of a tragic event or a novel discovery — like the coronavirus — falsehoods often pour in to fill the void left by experts and organizations that are hesitant to make quick, clear, declarative statements.

If you’re reading a post about a developing topic with lots of unanswered questions, take a beat. Check out WNYC Studios’ Breaking News Consumer’s Handbook for more guidance on how to interpret breaking news.

7. Is it already viral?

Fake news is more likely to go viral than a true story, no matter the subject — though, unsurprisingly, false statements about politics frequently perform best, according to a 2018 MIT study of Twitter published in Science. “Whereas the truth rarely diffused to more than 1,000 people, the top 1 percent of false-news cascades routinely diffused to between 1,000 and 100,000 people,” the study’s authors wrote. They concluded that, on average, a false story reaches 1,500 people about six times faster than a true one does.

Kaelyn Riley

Kaelyn Riley is an Experience Life senior editor.


