TikTok Feeds Teens a Diet of Darkness

Calls to ban TikTok in the U.S. are growing louder. Government leaders are trying to keep the popular China-owned social video platform away from schools, public workers, even entire states, on the grounds that users’ data could wind up in the wrong hands.

Data privacy, though, might be less worrisome than the power of TikTok’s algorithm. Especially if you’re a parent.

A recent study found that when researchers created accounts belonging to fictitious 13-year-olds, they were quickly inundated with videos about eating disorders, body image, self-harm and suicide.

If that sounds familiar, a Wall Street Journal investigation in 2021 found that TikTok steers viewers to dangerous content. TikTok has since strengthened parental controls and promised a more even-keeled algorithm, but the new study suggests the app experience for young teens has changed little.

What teens see on social media can negatively affect them psychologically. Plenty of research backs this up. The simplest evidence may be found in my earlier column about teens who developed physical tics after repeatedly watching TikTok videos of people exhibiting Tourette-syndrome-like behavior.

A TikTok spokeswoman said the company has a team of more than 40,000 people moderating content. In the last three months of 2022, TikTok said it removed about 85 million posts deemed in violation of its community guidelines; 2.8% of those were suicide, self-harm and eating-disorder content. It also reviews content that users flag for possible removal. “We are open to feedback and scrutiny, and we seek to engage constructively with partners,” the spokeswoman added.

Two-thirds of U.S. teens use TikTok, and 16% of all U.S. teens say they’re on it near constantly, according to Pew Research Center. Kids’ frequent social-media use—along with the potential for algorithms to lure teens down dangerous rabbit holes—is a factor in the American Psychological Association’s new recommendations for adolescent social-media use.

The group this week said parents should monitor their younger kids’ social-media scrolling and keep watch for troublesome use. The APA also urges parents and tech companies to be extra vigilant about content that encourages kids to do themselves harm.

‘Every 39 seconds’

The Center for Countering Digital Hate, a nonprofit that works to stop the spread of online hate and disinformation, tested what teens see on TikTok. Last August, researchers set up eight TikTok accounts to look like they belonged to 13-year-olds in the U.S., the U.K., Canada and Australia. For 30 minutes, researchers behind the accounts paused briefly on any videos the platform’s For You page showed them about body image and mental health, and tapped the heart to like them.

TikTok almost immediately recommended videos about suicide and eating disorders, the researchers said. Videos about body image and mental health popped up on the accounts’ For You pages every 39 seconds, or roughly 46 times over each 30-minute session, they added.

After the researchers published their findings, many of the videos they flagged disappeared from TikTok. Many of the accounts that posted them remain, however, and still host other videos that promote restrictive diets and discuss self-harm and suicide.

TikTok does take down content that clearly violates its guidelines by, for instance, referring directly to suicide. Videos where people describe their own suicidal feelings, however, might not be considered a violation—and wouldn’t fall under moderator scrutiny. They could even be helpful to some people. Yet child psychologists say these too can have a harmful effect.

TikTok executives have said the platform can be a place for sharing feelings about tough experiences, and cite experts who support the idea that actively coping with difficult emotions can be helpful for viewers and posters alike. They said TikTok aims to remove videos that promote or glorify self-harm while allowing educational or recovery content.

The company said it continually adjusts its algorithm to avoid repeatedly recommending a narrow range of content to viewers.

‘Sad and lonely’

The Center for Countering Digital Hate shared its full research with me, including links to 595 videos that TikTok recommended to the fake teen accounts. It also provided reels containing all of the videos, some of which are no longer on the site. I also looked at other content on the accounts with flagged videos.

After a few hours, I had to stop. If the rapid string of sad videos made me feel bad, how would a 14-year-old feel after watching this kind of content day after day?

One account is dedicated to “sad and lonely” music. Another features a teenage girl crying in every video, with statements about suicide. One is full of videos filmed in a hospital room. Each of the hospital videos contains text expressing suicidal thoughts, including, “For my final trick I shall turn into a disappointment.”

Users have developed creative ways to skirt TikTok’s content filters. For instance, since TikTok won’t allow content referencing suicide, people use a sound-alike such as “sewerslide,” or just write “attempt” and leave the rest to the viewer’s imagination. Creators of videos about disordered eating have also evaded TikTok’s filters.
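To see why such workarounds are effective, consider a minimal sketch in Python. It assumes a naive substring blocklist, which is purely hypothetical and is not TikTok’s actual moderation system; the terms and function below are made up for illustration.

```python
# Toy illustration of why simple keyword blocklists are easy to evade.
# Hypothetical example only -- NOT TikTok's actual moderation system.

BLOCKLIST = {"suicide", "self-harm"}

def is_flagged(caption: str) -> bool:
    """Flag a caption if it contains any blocklisted term as a substring."""
    text = caption.lower()
    return any(term in text for term in BLOCKLIST)

print(is_flagged("thinking about suicide"))     # True  -- exact term caught
print(is_flagged("thinking about sewerslide"))  # False -- sound-alike slips through
print(is_flagged("my last attempt"))            # False -- euphemism slips through
```

Catching sound-alikes and euphemisms requires fuzzier matching, trained classifiers or human review, which is part of why keyword filters alone tend to lag behind users’ inventiveness.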

Policing all the content on a service used by more than one billion monthly users is no easy task. Yet there is a difference between stamping out harmful content and promoting it.

“If tech companies can’t eliminate this from their platforms, don’t create algorithms that will point kids to that information,” said Arthur C. Evans Jr., chief executive of the American Psychological Association.

What parents can do

Watch what your kids are watching. Ariana Hoet, a pediatric psychologist at Nationwide Children’s Hospital, recommends asking your teens to show you their For You page. If you spot harmful content there, it’s a sign they have likely been engaging with that kind of material, and it can give you an opening to start a conversation about it.

Set up Family Pairing. Parents can set up their own TikTok account and use the app’s Family Pairing to restrict age-inappropriate content and limit the time their teens spend on the app.

Filter the feed. People can filter out videos containing words or hashtags they don’t want to see. If content is still slipping through, teens can tap “not interested.”

Refresh the feed. Some teens have told me their feeds became so problematic they closed their accounts and started over. Teens can now refresh their feed without creating a new account. Once again, they must be careful what content they like or linger on, because new rabbit holes are forming all the time.

Write to Julie Jargon at [email protected]


