Tuesday, July 20, 2021

Covid-19, vaccine hesitancy and the misinformation conundrum

Opinion by Columnist
July 19, 2021

Cardboard cutouts of Facebook founder and CEO Mark Zuckerberg outside the U.S. Capitol in 2018. (Saul Loeb/AFP via Getty Images)


It’s hard to teach an algorithm to identify misinformation when humans themselves can’t agree on what misinformation is — and when political leaders can’t decide whether we should have more or less of whatever it entails.


Lately, vaccine hesitancy has been calcifying into outright vaccine refusal. That’s partly because so many Americans have been fed a steady diet of misinformation and conspiracy theories about vaccine risks. Roughly 90 percent of Americans who don’t plan to get vaccinated say they fear possible side effects from the shot more than they fear covid-19 itself, a recent YouGov poll found. Roughly half of those who reject the vaccine believe the U.S. government is using it to microchip the population. (Hey, that would at least explain the global chip shortage.)


Where are Americans getting these kooky ideas? Politicians and pundits have been quick to blame social media platforms.


That’s understandable. Misinformation has flourished on Facebook and other sites for many years. Unlike truths, lies are unconstrained by reality, which means they can be crafted to be maximally interesting, sexy, terrifying. In other words, they’re optimized to generate traffic, which happens to be good for tech companies’ bottom lines. “Fake news” — whether fashioned by enterprising Macedonian teenagers, malicious state actors, U.S. political groups, snake-oil salesmen or your standard-issue tinfoil-hatters — drove tons of engagement on these sites in the lead-up to the 2016 election and has continued to do so.


Whether out of principle or financial self-interest, tech executives initially said they weren’t in the business of taking down content simply because it was false. (This included, infamously, Holocaust-denial claims.) Intense blowback followed, along with pressure for tech companies to recognize how their tools were being exploited to undermine democracy, stoke violence and generally poison people’s brains; the firms have since ramped up fact-checking and content moderation.


During the pandemic, Facebook has removed “over 18 million instances of COVID-19 misinformation” and made less visible “more than 167 million pieces of COVID-19 content debunked by our network of fact-checking partners,” the company wrote in a blog post over the weekend. This was in response to President Biden’s comments Friday that social media platforms were “killing people” by allowing vaccine misinformation to flourish.


On the one hand, yes, social media companies absolutely still can and must do more to scrub misinformation from their platforms. Case in point: A recent report on the “Disinformation Dozen” estimated that 12 accounts are responsible for 65 percent of anti-vaccine content on Facebook and Twitter. Their claims include that vaccines have killed more people than covid and are a conspiracy to “wipe out” Black people. All 12 remain active on at least Facebook or Twitter.


But on the other hand: Actually doing more to stamp out this misinformation is challenging. Not because these firms lack the workers or technology to identify problematic content; the real obstacle is political.


Politicians of both parties hate Big Tech’s approach to content moderation and think it should change — but they want it changed in diametrically opposite directions.


Democrats are mad that the companies suppress too little speech, allowing conspiracy theories to proliferate. Republicans are mad that they suppress too much speech, since it’s often right-wing content that gets (rightly) flagged as fake. Absent political consensus on which of these is the real failing, or regulation that tells Facebook and other platforms what content is acceptable (a move that would likely face First Amendment challenges), the firms will always be nervous about censoring too aggressively.


The fact that a lot of this same disinformation is being disseminated on prime-time cable also makes it politically harder for tech companies to justify taking it down.


It’s not merely a few no-name Facebook accounts promoting anti-vaccine nonsense; it’s also the most influential media personalities on TV. Fox News host Tucker Carlson, for instance, recently gave an entire monologue linking the government’s coronavirus vaccination effort to historical forced sterilization campaigns. His show then posted the clip on Facebook, which flagged it with a generic note about how coronavirus vaccines have been tested for safety.


Should Carlson’s insinuations have been removed entirely? That’s risky. As conspiracy-theorizing becomes more mainstream, and gobbles up an entire political party and the media ecosystem that sustains it, policing those conspiracy theories and the conservative leaders who promote them appears more politically motivated. Not coincidentally, the White House has reserved its harshest criticism about anti-vaccine content for social media companies rather than conservative news organizations parroting similar messages. Already despised by both parties, Big Tech is a safer target.


Now, one could argue that these tech firms should step up and impose the moderation policies they think are right, political (and perhaps financial) fallout be damned. Perhaps these companies could more forcefully rebut Republicans’ claims of politically motivated censorship and “shadow-banning” by pointing out that right-wing content still dominates the most popular posts every day on Facebook.


But if even White House officials appear tentative about picking fights with the right-wing industrial complex, it’s not surprising that tech firms would follow suit.



