Instagram Is No Place for Kids

Internal research at Facebook found that its photo-sharing app is having a terrible effect on teens’ mental health. So why is the company creating a new version for children?

By The Editors
September 22, 2021 | 10:38 AM

Bloomberg Opinion — Social media is a minefield of adolescent anxieties, as any parent can attest. Numerous studies have suggested a connection between excessive use of online platforms (and the devices used to access them) and worrying trends in teenage mental health, including higher rates of depressive symptoms, reduced happiness and an increase in suicidal thoughts.

Even in this grim context, Instagram, the wildly popular photo-sharing app owned by Facebook Inc., stands out. Its star-studded milieu — glossy, hedonistic, relentlessly sexualized — seems finely tuned to destabilize the teenage mind. Studies have linked the service to eating disorders, reduced self-esteem and more.

So perhaps it isn’t surprising that an internal research effort at the company, revealed last week, found that teens associate the service with a host of mental-health problems. “Thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse,” said one slide. “Teens blame Instagram for increases in the rate of anxiety and depression,” said another. “This reaction was unprompted and consistent across all groups.”

If Facebook was concerned about these findings before they became public, it didn’t do much. In July, Instagram rolled out several policy changes it said were intended to protect teens, such as limiting how advertisers can target them and setting their accounts to private by default. “Instagram has been on a journey to really think thoughtfully about the experience that young people have,” a company rep said at the time.

Unfortunately, all that thoughtful thinking yielded an incoherent result. In the very same post in which Facebook announced the changes, it also conceded that it was moving ahead with a new version of Instagram intended for children under 13. Dubbed Instagram Youth, the concept was so obviously distasteful that it earned the opprobrium of health experts and consumer advocates, lawmakers of both parties, and nearly every state attorney general in the country.

A letter from health experts could hardly have been blunter. “The platform’s relentless focus on appearance, self-presentation, and branding presents challenges to adolescents’ privacy and wellbeing,” it said. “Younger children are even less developmentally equipped to deal with these challenges, as they are learning to navigate social interactions, friendships, and their inner sense of strengths and challenges during this crucial window of development.”

Facebook justifies this plan on the (rather shameless) theory that, since it has largely failed to keep children off of adult Instagram, the kids’ version will “reduce the incentive for people under the age of 13 to lie about their age.”

One might ascribe all this to Facebook’s standard-issue tactlessness. Yet the company’s treatment of young people has been especially irresponsible. For years, it refused to make changes that would prevent children from running up credit-card bills on its platform. In 2016, it started paying young people — including minors — $20 a month to use an app that gave the company total access to their web and phone activity. Its Messenger Kids app is targeted at users as young as 6, even though experts have warned that it’s highly likely to “undermine children’s healthy development.” That these schemes keep going horribly awry doesn’t seem to be much of a deterrent.

One wonders what would be. As a start, lawmakers should pressure Facebook to scrap Instagram Youth entirely and make a more earnest effort to protect teenagers across its services. Congress should consider extending existing online protections for children to all users up to age 15, for example, and create a legal expectation that platforms do more to prevent minors from lying about their ages. Down the road, more stringent regulations — perhaps modeled on the U.K.’s age-appropriate design code — may be needed if platform companies refuse to take this problem more seriously.

Social media is hard enough on consenting adults. It’s no place for kids.

Editorials are written by the Bloomberg Opinion editorial board.