Libs of TikTok is an anonymous Twitter account that posts TikTok videos of bad or extreme left-wing takes so conservatives can get mad.
It’s a new twist on a classic form of internet content called “nutpicking,” where instead of engaging with the strongest arguments and best evidence for views you’re skeptical of, you hold up examples of dumbasses and make fun of them (see also Julian Sanchez on the “weak man argument”). And it’s not hard to find this content. America has more than three million teachers working in over 100,000 K-12 schools, so even if “saying something really dumb” is objectively rare behavior (if just one teacher in a thousand does it in a given year, that’s still over 3,000 incidents), it is nonetheless easy to find thousands of examples of it happening.
The account has become quite popular and has been featured frequently on Fox News and cited by Republicans to bolster the case for new laws clamping down on discussion of LGBT issues in schools.
This led Taylor Lorenz to do some classic journalistic sleuthing, looking into who is behind the account. If that person had turned out to be the CEO of a national fast-food franchise or an Air Force Brigadier General or some other noteworthy person, this would’ve been a super interesting story. But the person turned out to be an Orthodox Jewish real estate agent in Brooklyn with crank right-wing political opinions, which is actually not that interesting. So some people are mad at Lorenz and other people are mad at the people who are mad at Lorenz, and I don’t think we’re any closer to reaching any kind of social consensus about how schools should handle these issues — something that, as Elizabeth Bruenig writes, is actually a profound and difficult problem for liberal concepts of state neutrality between different conceptions of the good life.
So we see once again that the context collapse dynamic of social media is an excellent tool for tearing American society apart and not so great for having constructive political conversations. You can have someone who may or may not be a teacher venting her anger at her own parents in a not very constructive way, and then a Brooklyn realtor making a wild accusation that she is grooming children for child abuse.
And this is all quite bad, but I don’t think it’s the biggest issue with TikTok.
The issue with TikTok is its control over the flow of content
TikTok is a service owned by ByteDance, a company located in the People’s Republic of China — an increasingly dystopian dictatorship that maintains a network of concentration camps across Xinjiang and recently had the entire population of Shanghai locked in their homes for weeks.
Back in August of 2020, Donald Trump rather suddenly announced a plan to ban TikTok from the American market unless ByteDance sold it to an American owner. The Biden administration dropped Trump’s order on this (which, as I understand it, was viewed as unlikely to hold up in court due to Trumpian sloppiness) but continued a Commerce Department investigation into the issues. Now they seem to be getting close to a deal where TikTok’s user data will be stored on Oracle servers in the United States, which in the eyes of many people will resolve the issue.
Those people are, I think, wrong.
From the beginning, there has been a very serious concern about TikTok and there has also been this sideshow about user data. I’m a bit of a skeptic of privacy as an issue, but the data storage probably has always had a relatively straightforward solution along the lines of what Oracle is apparently proposing now. And as Tim Lee argued a few weeks ago, the Chinese government has plenty of other ways to get its hands on our data.
The much more serious issue is about TikTok itself.
Social media started with the idea of following friends. The major networks (led by Facebook) have evolved away from “show users what their friends have posted” toward “use machine learning to predict which posts will get engagement from specific users, and show those users those posts.” There is still a core social graph at the bottom of how Facebook, LinkedIn, Twitter, and Pinterest work. But TikTok is video-first rather than text-first, and it abstracts away the concept of friends and followers entirely. People just upload videos to TikTok, and then TikTok shows the videos to people it thinks will like them.
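To make the difference concrete, here is a minimal sketch of the two ranking philosophies. This is a toy illustration, not anything resembling TikTok’s or Facebook’s actual systems; all the names (Video, User, predict_engagement, and so on) are invented for the example:

```python
# A toy illustration of the two ranking models, not TikTok's or Facebook's
# actual systems. All names here are invented for the example.

from dataclasses import dataclass, field

@dataclass
class Video:
    creator: str
    topic: str

@dataclass
class User:
    follows: set = field(default_factory=set)           # creators the user follows
    topic_affinity: dict = field(default_factory=dict)  # topic -> affinity score

def follow_graph_feed(user, videos):
    # Classic social-graph model: you see what the people you follow posted.
    return [v for v in videos if v.creator in user.follows]

def predict_engagement(user, video):
    # Stand-in for an ML model trained on watch time, likes, shares, etc.
    return user.topic_affinity.get(video.topic, 0.0)

def for_you_feed(user, videos, k=10):
    # TikTok-style model: rank the entire candidate pool by predicted
    # engagement. Who the user follows plays no role at all.
    return sorted(videos, key=lambda v: predict_engagement(user, v), reverse=True)[:k]
```

The contrast is the whole point: in the first model, distribution is constrained by who follows whom; in the second, whoever controls the scoring function controls what everyone sees.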
A different (and quite good) Taylor Lorenz article shed some light on how this works in practice as a dance between objective engagement metrics and top-down efforts by TikTok to manipulate its own distribution choices:
Unlike other mainstream social platforms, the primary way content is distributed on TikTok is through an algorithmically curated “For You” page; having followers doesn’t guarantee people will see your content. This shift has led average users to tailor their videos primarily toward the algorithm, rather than a following, which means abiding by content moderation rules is more crucial than ever.
When the pandemic broke out, people on TikTok and other apps began referring to it as the “Backstreet Boys reunion tour” or calling it the “panini” or “panda express” as platforms down-ranked videos mentioning the pandemic by name in an effort to combat misinformation. When young people began to discuss struggling with mental health, they talked about “becoming unalive” in order to have frank conversations about suicide without algorithmic punishment. Sex workers, who have long been censored by moderation systems, refer to themselves on TikTok as “accountants” and use the corn emoji as a substitute for the word “porn.”
As discussions of major events are filtered through algorithmic content delivery systems, more users are bending their language. Recently, in discussing the invasion of Ukraine, people on YouTube and TikTok have used the sunflower emoji to signify the country. When encouraging fans to follow them elsewhere, users will say “blink in lio” for “link in bio.”
Sex workers playing cat-and-mouse games with the authorities is a tale as old as time. The question of how to discuss self-harm in the context of depression in a way that makes things better rather than worse sounds like a difficult one.
But it’s Ukraine that should really make you think. Here’s an app that is very popular with young people. On it, they get shown videos based on an engagement algorithm, but one that is also subject to deliberate intervention by the people who control the platform. They might, for example, subject discussions of the war in Ukraine to some kind of special treatment — after all, it’s a Chinese company, and the Chinese government sometimes throws prominent tech founders in prison if it gets mad at their decisions.
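It is worth being concrete about how cheap such an intervention would be. Here is a hypothetical sketch of a top-down adjustment layer sitting on top of an engagement ranker like the one above; the topic names and multipliers are invented for illustration, and nothing here describes TikTok’s actual code:

```python
# A hypothetical intervention layer, invented for illustration; it implies
# nothing about TikTok's actual code. It shows how one small multiplier
# table, set by the platform operator and invisible to users, could tilt
# an engagement-driven feed.

TOPIC_ADJUSTMENTS = {
    "tiananmen": 0.0,               # bury a topic entirely
    "hong-kong-protests": 0.1,      # quietly demote
    "pro-russia-war-content": 1.5,  # quietly boost
}

def adjusted_score(topic, predicted_engagement):
    # Users only ever see the final ordering, so this intervention is
    # indistinguishable from organic ranking.
    return predicted_engagement * TOPIC_ADJUSTMENTS.get(topic, 1.0)
```

Note that this is the same machinery the “algospeak” quoted above is trying to evade; moderation down-ranking and propaganda up-ranking are the same operation with different signs.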
We wouldn’t have let the USSR buy a TV network
No analogies are perfect, but the closest analogy I can think of is to imagine if the Brezhnev-era Soviet Union had decided to plow some of its oil export profits into buying up broadcast television stations across the U.S.
The FCC wouldn’t have let them. And if the FCC for some reason did let them, the Commerce Department would have blocked it. And if a judge said the Commerce Department was wrong and control over the information ecosystem didn’t meet the relevant national security standard, Congress would have passed a new law. We know that partisan conservative media outlets like Fox News and Sinclair Broadcasting have a meaningful impact on American politics. But in a free society, you can’t have a rule that says something like “people with a partisan agenda can’t buy television networks.” What you can do is have rules that are designed to prevent concentrated ownership, over and above what an antitrust consumer welfare standard would require.
But you absolutely can have a rule that says a hostile foreign government can’t buy up your broadcast television stations. Letting them do it would be totally unthinkable!
And yet here we are with China. Here’s what Ben Thompson wrote back in 2020:
The service censored #BlackLivesMatter and #GeorgeFloyd, blocked a teenager discussing China’s genocide in Xinjiang, and blocked a video of Tank Man. The Guardian published TikTok guidelines that censored Tiananmen Square, Tibetan independence, and the Falun Gong, and I myself demonstrated that TikTok appeared to be censoring the Hong Kong protests and Houston Rockets basketball team.
The point, though, is not just censorship, but its inverse: propaganda. TikTok’s algorithm, unmoored from the constraints of your social network or professional content creators, is free to promote whatever videos it likes, without anyone knowing the difference. TikTok could promote a particular candidate or a particular issue in a particular geography, without anyone — except perhaps the candidate, now indebted to a Chinese company — knowing. You may be skeptical this might happen, but again, China has already demonstrated a willingness to censor speech on a platform banned in China; how much of a leap is it to think that a Party committed to ideological dominance will forever leave a route directly into the hearts and minds of millions of Americans untouched?
So how is TikTok covering the war in Ukraine? With tons of misinformation:
Among the false claims shown to the researchers was the myth that the US has bioweapon laboratories in Ukraine, and the accusation that Putin was “photoshopped” on to footage of a press conference he gave in early March.
Videos also claimed that fake footage was real, and that real footage was fake: videos purportedly of the “Ghost of Kyiv” shooting down Russian jets were taken from a video game, while real videos from the war were decried as fake by pro-Russian accounts.
Some of the myths in the videos TikTok’s algorithm fed to analysts “have previously been identified as Kremlin propaganda” by the organisation’s Russia-Ukraine Disinformation Tracking Center, the researchers said.
There is always misinformation in war, including in the mainstream media (see the credulous coverage of Snake Island), so the mere fact that this is happening is hardly proof of nefarious intent. But the war is not a matter of indifference to the Chinese government. Would it be crazy to think they are putting their thumbs on the scale in favor of pro-Russian content? Wouldn’t it be crazy for them not to be?
A YouGov poll taken in late March showed that people under 30 are much more likely to say they sympathize with the Russian side in the conflict and much more likely to deny that Russia is targeting civilians for violence. Is TikTok encouraging that? If I were the Chinese government, that’s what I’d be doing with it.
TikTok is growing very fast
TikTok has over twice as many users as Twitter. It’s probably less influential than Twitter at this point because the Twitter user base is more oriented toward people who are themselves influential.
But this is also why I analogize it to broadcast television. Local television news is on one level a very low-influence medium because sophisticated and important people don’t get their information from local TV news. At the same time, precisely because this is where unsophisticated people get their news, it can actually be very potent in terms of changing minds. When I was an intern in Chuck Schumer’s communications office, the mantra was to remember that while congressional staffers cared a lot about Roll Call and CNN’s Inside Politics, what really mattered for Schumer’s political standing was local television in Buffalo and Binghamton. People who watch political shows on CNN care a lot about politics and already know which side they’re on. People who don’t care at all about politics, don’t read about politics, and don’t watch shows about politics can be easily persuaded by a random politics story that pops up between sports scores and the weather.
TikTok, per this chart from The Economist, is growing at incredible speed, and I think it is something akin to local TV news for young people.
In other words, this is almost certainly not where someone who is very interested in following and understanding political debates would go for information. But by that same token, it means that to the extent that users do end up getting political information on there, it is likely to be fairly effective in swaying their views.
To bring this back to the Libs of TikTok account, tons of people have done pieces on the ways in which new media technologies have unsettled existing hierarchies and institutions and in some ways poisoned public debate. Jon Haidt’s recent piece in The Atlantic was a great summation of the pessimistic view of all this — more pessimistic than I would be (he left out the considerable upsides), but trenchant and important. Yet consider applying all these points to a network that isn’t just amorally seeking ad revenue, but actually under the thumb of a foreign power that has an interest in making American society as dysfunctional as possible.
It seems really bad. Trump had all kinds of absurd ideas about taking a cut of the sale and kept introducing nonsense into the debate. But the core idea of saying to ByteDance “we’re going to shut this app down unless you sell it within some reasonable time period” made a lot of sense. I don’t think you want to see a partisan crusade against TikTok, but precisely because this was originally a Trump effort, it seems like the White House should be able to reach out to Republicans in Congress and come up with a way to do this that is bipartisan and non-polarized. And the sooner the better.