by Adam Lee on February 6, 2023

[Previous: Is social media the cigarette of the 21st century?]

Is TikTok a Chinese plot to undermine America?

The popular video-sharing site is already banned on U.S. government devices, and some politicians are proposing to outlaw it completely in the United States as a national security threat.

At the heart of the debate is which videos TikTok recommends and why. Like other social media sites, TikTok lets you follow accounts and watch only their videos. However, most people rely on its personalized algorithm, which learns what you like based on which videos you watch and which you skip, then shows you more of what you're interested in.
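To make that concrete, here's a toy sketch of how watch-and-skip signals could steer a feed. It's purely illustrative: the topics, numbers, and update rules are my own invention, not TikTok's actual (and secret) system.

```python
# Toy model of a feed that learns from watch/skip signals.
# Illustrative only -- real recommendation systems are vastly more complex.
from collections import defaultdict
import random

class ToyFeed:
    def __init__(self, topics):
        self.topics = topics
        self.scores = defaultdict(lambda: 1.0)  # every topic starts equal

    def next_video(self):
        # Pick a topic with probability proportional to its learned score,
        # so favored topics appear more often but others still surface.
        total = sum(self.scores[t] for t in self.topics)
        r = random.uniform(0, total)
        for t in self.topics:
            r -= self.scores[t]
            if r <= 0:
                return t
        return self.topics[-1]

    def record(self, topic, watched_fraction):
        # Watching most of a video boosts its topic; skipping early demotes it.
        if watched_fraction > 0.5:
            self.scores[topic] *= 1.5
        else:
            self.scores[topic] *= 0.8

random.seed(0)  # make the simulation repeatable
feed = ToyFeed(["science", "travel", "cooking", "memes"])
for _ in range(200):
    topic = feed.next_video()
    # Simulate a user who watches science videos and skips everything else.
    feed.record(topic, 0.9 if topic == "science" else 0.1)

print(max(feed.scores, key=feed.scores.get))  # the favored topic wins out
```

After a couple hundred simulated videos, the favored topic dominates the feed, which is roughly the "day or two" of training that real users report.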

I have a TikTok account myself, and I can confirm it works uncannily well. At first my feed was chaos, but within a day or two the algorithm homed in on my interests. I mostly watch science and nature videos, travel, gardening and cooking, and a few accounts that feature deconversion stories and religious critiques. I share educational videos with my son, and I've learned things from it myself.

But TikTok is owned by a Chinese company, ByteDance, and like all Chinese companies, it’s subject to the decrees of the Chinese government. This might be less worrisome if its inner workings were more transparent, but TikTok is a black box. There’s no way to know why it’s showing you a video. What’s more, journalists have confirmed that it has a secret internal feature called “heating”: a button employees can push to make any video go viral.

American politicians have raised the concern, and it’s not an irrational fear, that the Chinese government could exploit this to interfere in elections, sow chaos, or spread propaganda. (We have reason to believe they’ve already tried, albeit at smaller scale and less effectively.)

It could suppress stories that reflect badly on China, like the government’s treatment of the Uyghur minority or its handling of COVID. It could force TikTok to push videos of its choosing to Western audiences, amplifying the reach of politicians who serve Chinese interests and stifling those who don’t. Because the algorithm is so personalized, this meddling would be almost undetectable.

Is social media rewiring our brains?

Even without the fear of Chinese propaganda, there are reasons for concern about TikTok itself, and about social media more generally. Even if we don’t stop using it completely, we should be asking harder questions about what excessive use does to our brains.

First is the “audience capture” problem. Social media has a tendency to reshape us into more extreme versions of ourselves. Influencers are motivated to perform more foolhardy and dangerous stunts, and commentators are pushed into espousing more radical and uncompromising philosophies, all in a bid to stand out in a crowded field and win the favor of the almighty algorithm.

As one example, Maajid Nawaz—once a moderate Muslim who criticized radical Islam and a friend of Sam Harris—has become a full-blown anti-vaxxer and conspiracy theorist ranting about the New World Order. That was the kind of content his audience rewarded him for, and as he chased their approval, it took over his personality.

It goes the other way, too. Studies suggest that greater time spent on social media correlates with more mental health problems, like depression and self-harm. Instagram in particular has been linked to eating disorders and suicide, according to leaked Facebook internal research.

There are multiple causes for this. Social media can become a channel for cyberbullying, stalking and harassment that feels impossible to escape. Celebrities and influencers fill our screens with images of glamor and affluence. Even though this is an illusion, based on carefully selected and even manipulated images, it can fill us with envy and inadequacy. (As the saying goes, don’t compare your cutting-room floor to someone else’s highlight reel.)

Douyin, the version of TikTok that’s available in China, has major differences from the U.S. version. For users under 14, it only shows science and educational videos and limits total usage to 40 minutes per day. The version of the app released internationally has no such restrictions.

Is this Chinese nanny-statism, or do they know something we don’t?

American social media is no better

However you feel about TikTok, banning it would be treating the symptom and not the disease. Everything we fear the Chinese government might do, American social media companies are already doing.

One big example is Facebook’s Cambridge Analytica scandal. A data-mining app that ran on Facebook harvested personal data on tens of millions of users, building detailed psychological profiles that were supposedly for academic research. Instead, the data was sold to political campaigns, including Donald Trump’s 2016 presidential campaign and the Brexit Leave campaign, for use in targeting voters.

YouTube has also been faulted as a contributor to polarization and radicalization. Like TikTok, its algorithm recommends videos to its users that are similar to ones they’ve already watched. This often results in people being shown more extreme, inflammatory and conspiratorial versions of ideas they already have some sympathy for. There are many stories about the YouTube algorithm leading disaffected men down a rabbit hole of far-right content. (Less seriously, but more humorously, there’s also “YouTube face”.)

And as for Twitter… well, nothing more needs to be said about Twitter. It was the single biggest factor in Trump’s rise to power. Since Elon Musk took over, it’s abandoned even the most minimal pretenses of safety or fairness, banning journalists for doing journalism while rolling out the welcome mat for anti-vaccine propaganda and white supremacists.

Are any of these problems less concerning because they stem from American social media owned by American billionaires?

Social media’s original sin

All these social media goliaths have incentives that don’t coincide with what’s good for society in general. Their motive is to maximize “engagement”—which means, to keep users glued to the screen as long as possible, by whatever means—because that means they can show us more ads and make more money.

It’s this incentive that’s at the root of everything toxic about social media. It’s the reason why they create algorithms to push content on their users, rather than letting us choose for ourselves what we want to see. It’s the reason they’ve poured so much effort into data mining, vacuuming up more and more information about us so we can be more effectively targeted. All the harms of social media spring from that original sin.
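The gap between an engagement-maximizing feed and the alternative is easy to sketch. The posts and scores below are made up, and real platforms use machine-learned engagement predictions rather than a stored number, but the contrast in what each ordering surfaces is the point:

```python
# Contrast an engagement-ranked feed with a chronological one.
# Hypothetical data; real platforms predict engagement with ML models.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    posted_at: int               # minutes since some epoch (higher = newer)
    predicted_engagement: float  # platform's guess at how long you'll linger

posts = [
    Post("Friend's vacation photos", posted_at=100, predicted_engagement=0.2),
    Post("Outrage-bait hot take",    posted_at=50,  predicted_engagement=0.9),
    Post("Local news update",        posted_at=90,  predicted_engagement=0.4),
]

# An engagement-maximizing feed surfaces whatever keeps you scrolling.
engagement_feed = sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

# A chronological feed just shows the newest posts first.
chronological_feed = sorted(posts, key=lambda p: p.posted_at, reverse=True)

print(engagement_feed[0].title)     # the hot take wins
print(chronological_feed[0].title)  # the newest post wins
```

Same posts, different orderings: one rewards whatever provokes the strongest reaction, the other rewards nothing in particular.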

Non-algorithmic social media could be the solution to many of these problems. If there were no algorithms constantly pushing us to consume more, constantly trying to dig up more information about us, social media would be a very different place.

It would be slower-paced, more peaceful. It wouldn’t inundate us with clickbait. It wouldn’t have a built-in accelerant for disinformation, conspiracy theories and hate. There would be less repetitive content, less trend-chasing, and less incentive for people to become the worst versions of themselves. Genuinely thoughtful and interesting ideas could still go viral, but they would be amplified only by word of mouth.

Of course, the problem is that this might also mean people would use social media less. That would arguably be a good thing for society, but no for-profit tech company would create a site like that.

This hints, as I’ve previously suggested, that not-for-profit social media could be the solution. This could take the form of non-profits supported by users (as sites like Wikipedia and the Internet Archive have done successfully), or decentralized networks with a shared protocol (like Mastodon, or Usenet before it). Or it could be run by governments as a public good, the same as libraries and post offices. Any of these approaches would pose its own risks and its own challenges, but if we tried one, we might find it to be a change worth making.