TikTok is Pushing Incel and Suicide Videos to 13-Year-Olds

EMS-FORSTER-PRODUCTIONS/GETTY IMAGES

Within minutes of TikTok’s youngest users signing up for a new account, the platform’s powerful algorithm is bombarding teens with extremist content, including videos promoting suicide and the virulently misogynistic incel subculture, according to new research published today by corporate accountability group Ekō and shared with VICE News.

Despite TikTok’s promises to crack down on this kind of content, it remains easily discoverable by new users who want to seek it out. And TikTok’s recommendation algorithm is so advanced that it begins pushing increasingly extreme content into the feeds of new users after they’ve used the app for just 10 minutes, the new research states.

The researchers set up nine new accounts on TikTok, which has 150 million active users in the U.S. and has replaced Instagram and Facebook as the de facto social media platform for American teenagers. They registered each account as belonging to a 13-year-old, the youngest age at which users can join the platform, and then mimicked users who were curious about topics like incel content.

The researchers found that after viewing just 10 videos related to these topics, the “For You” pages of the new accounts were filled with similar, and often more extreme, content.

One test account was shown a post that included a clip of Jake Gyllenhaal, whose films have been popular among incels. The video shows the actor with a rifle in his mouth saying, “Shoot me. Shoot me in the fucking face,” alongside text that reads: “Get shot or see her with someone else?”
