A few weeks ago, back when we were all still doing virtual pub quizzes and going out for our daily hour of exercise (#peaklockdown), I was queuing outside my local Sainsbury’s and overheard an “argument” between the couple two metres behind me:
Queue male: You know, this is how it’s going to be for the next two to three years, queueing up for food until we’re all begging for Bill Gates’ vaccine.
Queue female: Yup. Just so they can put chips in us. All over this fake plandemic.
Queue male: They ain’t chipping me, don’t care what the government says.
Queue female: Pfft. You take the vaccine, you’re getting chipped…
Even though conspiracy theories have been a mainstay of our society for as long as… well, society, the Covid-19 pandemic seems to have supercharged their prevalence. Everywhere I go, I hear or meet someone who believes in at least a few Covid conspiracies (or “coviracies” – I’m trying to make it a thing).
From the idea that Covid-19 was made in a lab or triggered by 5G masts, to theories claiming fatality rates are being manipulated or that the virus is all just a hoax, suffice it to say, the tinfoil hats have gone mainstream.
However, unlike simply believing that we all live on a flat disk surrounded by a wall of ice, believing an ongoing pandemic is fake has very real consequences.
A recent study published in the journal Psychological Medicine found that people who got their news from social media were less likely to adhere to government guidelines, take antibody tests, or get vaccinated.
And there have been more than a few stories of people suffering tragic repercussions because they didn’t take coronavirus seriously.
So, as is usually the question these days – is social media ruining our lives?
The main culprits
Well, much like stages at Glastonbury, different platforms cater to different audiences when it comes to the misinformation-industrial complex.
For example, the study above found that among the individuals who straight up don’t believe coronavirus even exists, 56% get their news from Facebook.
YouTube, on the other hand, cornered the 5G falsehoods niche, with 60% of those who erroneously believe in the link between 5G and the virus getting most of their info from the video sharing platform.
Twitter has played a different role in this dance of deception, with conspiratorial hashtags such as #covid19hoax and #plandemic helping spread the theories to a global audience.
And that’s before trying to untangle the murky mess that is dark social media apps like WhatsApp. These platforms allow groups and individuals to spread misinformation with absolutely no oversight whatsoever.
I know you’ve read some crazy stuff on WhatsApp.
How have social platforms reacted?
Social media platforms have traditionally been quite resistant to censoring what they consider “free speech”. However, they agree that believing in an alternate reality during a global pandemic can be harmful, so they’ve all taken some action to try and stem the spread of misinformation.
Facebook stuck warning labels to “90 million pieces of misinformation” between March and April. It also took down “hundreds of thousands” of Covid-19 posts it believed could lead to harm and redirected over two billion people to WHO resources by May.
Twitter has also started labelling misinformation around Covid-19, even promising to retroactively apply those labels to tweets predating the decision. YouTube has taken a twofold approach, “making authoritative information more prominent and aggressively removing policy-violating content.”
WhatsApp has approached its challenge differently. One of the main ways misinformation spreads on the platform is through forwarded messages. So it’s simply limited the number of times a single message can be forwarded in order to introduce a bit of “friction” into the process.
And in one last move to prove that they’re serious about the issue, the big social platforms announced they’d be collaborating in the fight against the scourge of misinformation. While the details are still a bit vague, they released this tweet together:
— Facebook Newsroom (@fbnewsroom) March 17, 2020
So, things have happened. The question is…
Has any of it worked?
While Twitter and Facebook insist they’ve taken “aggressive steps to remove harmful misinformation” from their platforms, a recent study shows the scope of the problem.
Volunteers from the UK, Ireland and Romania searched for and reported misinformation promoting the falsehood that aspirin dissolved in hot water could cure Covid-19. Of all the posts they reported over the course of a month, Twitter responded to only 3% and Facebook to 10–12%.
The problem is, the platforms use automated algorithms to spread content, while labelling and removing it is largely a manual task. And while moderators are deciding, posts can be shared millions of times. Plus, labelling can actually be psychologically counter-productive, as it may instead pique interest in said misinformation.
And that’s before we even talk about disinformation – which is misinformation promulgated knowingly by bad actors. Social platforms have become disinformation battlegrounds for nation-states, and it’s becoming harder and harder to know what’s real every day.
As far as coviracies are concerned, the damage has been done. The lies are out there and have taken on a life of their own.
And there’s little the platforms can do to stop that now.
What can we do?
Conspiracy theories aren’t going anywhere anytime soon. We’re simply too connected and there are too many incentives, from ignorance to espionage, to share fake news.
But even though tackling the problem as a whole can feel futile, as individuals, all we can do is try not to be part of it by not sharing things that may be fake.
And if you have family or friends who’ve fallen down the proverbial rabbit hole, it can feel overwhelming and frustrating.
But, according to psychologists, simply shaking them over and over won’t work. All you can do is keep the conversation calm and fact-based.
And just hope for the best.