Sex Educators Are Self-Censoring Online to Avoid Content Bans. Does it Work?

It's called "algospeak," and it looks like "s3x," "le$bian," and "c00chie." Some creators swear by it, but experts say it may increase stigma.


“Did you know sx isn’t supposed to hurt?” read the words on an Instagram reel posted by the sexual health and wellness company The Pelvic People in early 2025.

At first glance, the missing “e” looks like a typo. But it’s not. A scroll through the Pelvic People’s Instagram page and others like it on the social media platform reveals videos littered with words like “s*x,” “lub3,” and “c00chie.” 

Commonly referred to as “algospeak,” this popular form of online lingo exists to sidestep algorithms employed by tech companies like Meta, TikTok, and X that penalize or even remove posts for discussing sensitive topics like sex, mental health, or substance use. 

Posts perform better with “a little bit of censorship,” said Emily Tran, social media manager at the Pelvic Health and Rehabilitation Center, who uses the center’s Instagram, @pelvichealth, to share evidence-based information about sexual health. 

“If I were to write something about female anatomy and use the correct terms, [the algorithm] might flag it and say that it’s inappropriate,” Tran added. 

The algorithm would then reduce the post’s visibility, leading to a decline in engagement—something Tran said happened to the account before she started using algospeak. 

Creators who discuss sensitive topics often see this kind of self-censorship as essential to posting freely without losing engagement. But, experts said in interviews with Rewire News Group, censored language like algospeak can feed the stigma around these topics and even pose threats to sexual health. And recent research has cast doubt on whether using algospeak actually improves a post's visibility on social media.

Censorship begets censorship

Researchers at the University of Ss. Cyril and Methodius in Slovakia gathered the most recent 50 Instagram posts from nine sex education accounts that post in English, Slovak, or Czech. They analyzed the likes, comments, and shares on a total of 450 posts to understand overall engagement.

The resulting study, published in the journal Media Literacy and Academic Research in June 2025, found no statistically significant difference in likes and shares between posts that used algospeak and those that did not.

Researchers also found slightly fewer comments on posts that did not use algospeak, a gap they attributed to other potential factors, such as uncensored videos having a more academic tone.

Another limiting factor, the study's lead author, Michal Kabát, said in an interview with Rewire News Group, was that "the accounts don't use the algospeak consistently." Still, because a given account often posts content both with and without algospeak, it's unlikely that his study's findings are simply explained by more popular accounts using less algospeak.

Ultimately, algospeak is probably an unnecessary precaution for sex education accounts, Kabát said. 

“[People] feel this obligation to fit in some unwritten rules somehow,” Kabát said. 

Promoting censorship of sexual language reinforces taboos about sex and frames sexual organs as inappropriate to acknowledge, Kabát said. 

“This is not helping to establish open, clear, and taboo-less communication,” Kabát added. 

If a topic is portrayed online as forbidden to talk about openly, Kabát said, “[viewers] won’t talk about it either, they won’t use the words, or they’ll also start auto-censoring themselves.” 

It’s a phenomenon he said he’s already observed in comment sections, where users, despite having no need to worry about engagement, parrot back the same algospeak used in a post.


Why language matters

Sex education is lacking in many school systems in the United States and around the world. Often, the curriculum centers on teaching people to avoid sexually transmitted infections and unplanned pregnancies without much discussion of anatomy and sexual health. 

Social media can fill in these gaps for many young people by making reliable sexual health information accessible. However, censoring language around these topics online can obstruct clear communication and lead to feelings of shame around uncensored sexual language.

This practice may be especially harmful to those assigned female at birth. People with female anatomy more commonly report feelings of shame about their genitalia than those with male anatomy, and studies show that parents often use euphemisms like "down there" and "private parts" when talking to their children about female genitalia. As a result, even young children are more likely to know the actual names for male genitalia than female genitalia.

This linguistic discomfort with the female anatomy is visible even on TV. For example, a 2023 review of the censorship of the word “vagina” found that network broadcasting officials required the show Grey’s Anatomy to replace the word “vagina” with “vajayjay.” Meanwhile, the word “penis” was said 17 times in a single episode. 

That kind of censorship can have real-world impacts on sexual health: “Cryptic language can unintentionally reinforce the idea that sexuality and genital anatomy is shameful,” said Taylor Roebotham, a gynecologist at London Health Sciences Centre in Ontario, Canada. Many of her patients already struggle to explain their symptoms, she added, often because patients lack the language to do so. 

“I’ve had patients say that they had a problem with their vagina, using the only word that they know for that area of the body,” Roebotham said. “But on exam, they actually had a more musculoskeletal problem with their pubic bone.” 

Communication issues like this can delay care because clinicians may refer patients to the wrong specialist or investigate the incorrect area of the body.

Sometimes, stigma can stop patients from seeking care at all. 

A 2024 study using interviews with patients with vulvar lichen sclerosus, a chronic skin condition that causes pain, itching, and discoloration to the genitals, found that many participants experienced diagnostic delays because they were uncomfortable talking about their genitals with providers or didn’t even notice there was an issue in the first place. 

“Some of the women in my study said that they didn’t even think they were supposed to … acknowledge the existence of their vulva,” said Sophie Rees, a social scientist at the University of Bristol who co-authored the 2024 study. 

How to avoid algospeak

It might seem like Kabát’s study points to ending algospeak altogether as the obvious solution.

But for many sex education creators, getting rid of algospeak completely may not feel like a real possibility. Social media companies are often vague about their content restriction policies, and it’s never quite clear what kind of posts will get them flagged, downvoted, or even banned. For those who earn a living creating content, the threat of financial loss can make uncensored language too great of a risk without clearer content guidelines. 

Meta, the company that owns Instagram and Facebook, has also been criticized for enforcing its policies inconsistently and failing to provide explanations when posts are restricted. The company has promised to communicate the reasons for content restrictions, Kabát said, but users claim this doesn't always happen. Research shows that "shadowbanning"—where a platform reduces the visibility of a user's posts without notifying them, leaving them aware only of the resulting drop in engagement—is prevalent on Instagram.

Tran, whose job it is to get evidence-based information about sexual health to the public, tries to use algospeak that looks as similar as possible to the language it’s replacing, she said, to avoid making the censorship distracting. “We just want to push out information that we think should be accessible to all,” Tran said, “while creating a community that wants to engage and have conversations.”

Even so, experts warn that in the long run, the risks of using algospeak may outweigh the benefits. 

“If creators collectively returned to medically accurate terminology, would sexual health content disappear entirely? Probably not,” Roebotham said. “Even if algorithms prefer censored language, they still need content, and we should be flooding them with thoughtful information.”
