Algospeak: How Social Media Is Transforming the Future of Language, by Adam Aleksic, Knopf, 256 pages, $29
The kids these days have a lot of silly euphemisms. Porn becomes corn. Sex becomes seggs. Nipples are nip nops and a picture of an eggplant can stand in for a penis. Killing someone becomes unaliving them, and people kermit sewerslide instead of committing suicide. Everything slightly risqué or unpleasant becomes baby talk. But not because teens are overgrown infants—it’s a bottom-up response to top-down censorship.
As social media has become a bigger part of modern life, platforms have adopted elaborate policies to appease advertisers and politicians who might not be happy with the content that people organically share. Besides simply deleting content and banning creators, sites can subtly nudge users, algorithmically promoting certain sorts of content while demoting others. The policies are often frustratingly opaque, but many users have worked out what will or won't anger the invisible censor.
That doesn’t stop them from talking about taboo topics. For the younger generation, social media is often the first and only place to learn about various elements of the world they live in, and sometimes those elements include sex, drugs, violence, and politics. So teenagers have come up with an elaborate system of cheeky substitutes for words that would otherwise get their content shadow banned. Emojis and wordplay coalesce into a language of their own.
Adam Aleksic explores this “algospeak” in a book of the same name. Aleksic runs Etymology Nerd, a popular TikTok channel that explores linguistics and the broader social issues around language. He learned through trial and error what the algorithm would reward and punish.
For example, when he made a video analyzing the controversy around the slogan From the river to the sea, Palestine will be free, he found his view count shrank by 90 percent. (So much for politicians’ claims that China was using TikTok to push anti-Israel incitement.) In another video, Aleksic discussed how the word fascism and a slur for gay men came from the same Latin root, fasces, meaning a bundle of sticks. Even though he was careful to self-censor by spelling the slur as f*ggot, the app still caught on and threatened to suspend him for homophobic language. Thus the need for creative workarounds.
As Aleksic notes, “minced oaths” and rhyming slang are not new. God damn it to Hell became gosh darn it to heck in the vocabulary of religious Christians, and ass became bottle and glass in the Cockney dialect. But the new euphemism treadmill is far more artificially driven. Someone might use a minced oath because they believe it is impolite or impious to curse in public; kids today say unalive because a faceless institution will punish them for saying kill.
On one hand, algospeak could be interpreted as a straightforward example of censorship failing. Neither outright repression nor subtle incentives can get people to stop talking about the things they care about. Teenagers in particular are pretty savvy at getting around rules that they don’t respect. Speech always finds a way.
On the other hand, isn’t it a little disturbing that American kids are being trained to talk like subjects of a dictatorship? That teens have recreated from first principles the kinds of veiled metaphors and hushed tones familiar to people hiding from the secret police?
Worse still, this self-censorship is largely voluntary. For the most part, Americans are not in danger of police dragging them out of their houses for violating social media guidelines. Stimulation, attention, and the prospect of making some money from that attention are enough to get users to care about the demands of the algorithm. Then again, social media is where the hip conversations happen now. People who want to be a part of those conversations have to be on social media, and that means working around the arbitrary and ever-changing demands of its custodians.
At the end of the day, the algorithm is not primarily a political tool. It is a formula for figuring out which users are most likely to engage with which content, allowing social media companies to sell hypertargeted ads. Algospeak documents the symbiotic relationship between advertising and the proliferation of tiny online subcultures. Would you describe your fashion sense as “goblincore”? How about “cybergoth”? The algorithm can find you both like-minded people and products to buy.
Algospeak sometimes veers into woke clichés that feel about five years out of date. The concept of “cool” began as “a concealed act of resistance against the straight white norms of the English language,” Aleksic writes in a chapter titled “It’s Giving Appropriation.” The chapter gives a fairly value-neutral assessment of how slang develops in minority subcultures and enters mainstream culture, but Aleksic declares this “harmful” to minorities. Paradoxically, he goes on to complain that mainstream society is not accepting enough of slang and minority dialects.
Despite his political leanings, Aleksic documents well how both the censorship and the subcultures trying to escape it cut across political lines. Incels and “body-positivity” advocates both play the same game. So do LGBT activists and homophobes. Users replace white with YT and a slur for black people with ninja. Aleksic notes the paradox in trying to censor sensitive words, which may be used as either self-descriptors or slurs, and the ways trolls exploit the inflexible nature of content moderation. “Rather than making hateful content that’ll get immediately removed,” he writes, bigots will “often opt to mass-report videos of creators they dislike.”
Some users contend with competing censorship regimes. Much has been said about the ways that TikTok, founded by a Chinese company, has to answer to the Chinese authorities—a problem that ironically became a justification for banning TikTok in America. Aleksic addresses cases of TikTok censoring topics that are sensitive in China, such as the Hong Kong protests and the repression of Uyghurs. He also mentions censorship in the other direction. Instagram, for example, once shadow banned the journalist Azmat Khan, whose New York Times coverage of U.S. military cover-ups won her a Pulitzer Prize. (Meta, the parent company, claims it was an accident.)
These problems, though many of them have political causes, don’t have a clear political solution. Often, political pressure just causes censorship to swing in the other direction. Republicans who once railed against government intervention in social media moderation now pressure companies to help them root out the “cancer” of political radicalism; Democrats now waving the flag of free speech may soon return to demanding that dangerous “misinformation” be stomped out. While public pressure may cause a few figures to be unbanned, risk-averse companies generally find it easier to censor everything than to allow everything.
But the overall message of Algospeak should be reassuring from a libertarian perspective. Just let things be, the book seems to say. If people want to talk about something, they will find a way. Social media only accelerates the game of Whac-A-Mole.