Instagram is touting safety features for teens. Mental health advocates aren't buying it
Instagram will start offering "take a break" reminders, starting today, if you've been scrolling on the social media app too long.
The prompt is one of a cluster of features that Instagram, owned by Facebook parent company Meta, is rolling out to keep teens safer and healthier online, it said on Tuesday.
The proposed features include unspecified restrictions on what content is algorithmically recommended to teens and soon-to-come tools for parents to view and manage the time their teens spend on the app.
"We're releasing this feature because we want people to be able to take a break and have their time on Instagram be intentional and meaningful – irrespective of whether that means seeing less ads or not," Meta spokesperson Liza Crenshaw said.
The announcement comes after former Facebook employee Frances Haugen leaked internal company research suggesting that Instagram harms the mental health of young women and girls – and just a day before Instagram chief executive Adam Mosseri is set to testify before Congress about the company's impact on young people.
Policymakers have criticised Facebook for failing to share its findings about teens, while Facebook shot back that the research was taken out of context.
The break prompt will ping people after 10, 20 or 30 minutes of scrolling and suggest they switch activities. The nudges will come with "expert-backed tips" for how to step away, according to a blog post from Mosseri.
The feature isn't on by default, but teens will get notifications encouraging them to set the reminders, the company says.
Other additions are in the works, Instagram said. In January, all accounts will see a hub where users can mass-delete past posts, comments and likes to better manage their online presence, Instagram said.
A feature the company says is coming in early 2022 will limit unwanted contact from strangers by preventing them from tagging teens in comments or posts.
The company also said it would roll out parental controls in March that let guardians see how much time teens spent on the app and set limits. Teens would also get the option to notify their parents when they reported someone for inappropriate behaviour on the app.
In addition, Instagram said it was experimenting with alerts for teens who spent too much time "dwelling" on posts about a single topic.
The company has been criticised for serving to teens potentially harmful posts including diet and weight loss information and content promoting eating disorders.
Meta's Crenshaw did not say how the nudges would work or how much time would be considered dwelling.
And the company said it might also expand its "sensitive content control" for teens to places beyond the explore tab – a personalised landing page with posts from accounts you don't follow.
Historically, there's been little transparency around what content counts as sensitive and how much exposure to potentially harmful content the company considers too much. Crenshaw declined to provide further details.
Experts said they weren't sure how the tool launched Tuesday squares with Instagram's business model, which relies at least in part on keeping teens engaged on the social platform.
"If you can't be certain that your algorithms aren't promoting and recommending harmful content to teenagers, then you shouldn't use your algorithms on teenagers," said Josh Golin, the executive director at Fairplay, a non-profit organisation that aims to end marketing targeted at children.
Whistle-blower Haugen's revelations about Instagram's alleged harm to teen girls drew much attention and criticism, but researchers have been uncovering similar findings for years, according to Linda Charmaraman, the director of the Youth, Media & Wellbeing Research Lab at the Wellesley Centers for Women.
The visual nature of apps such as Instagram created a focus on physical appearance and a tendency among teens to compare themselves to others, Charmaraman said.
Couple that with a business model that depends on getting brands in front of ever-younger customers – and that often relies on "influencer marketers" to do so – and you've got a recipe for mental health struggles, experts say.
The new reminder about breaks does address concerns about the "doom scroll", or the habit of passively looking at content with little active participation, says Vicki Harrison, the programme director for the Center for Youth Mental Health and Wellbeing at Stanford University.
But as long as Instagram, Facebook and Meta refused to let outside researchers and auditors review their data and algorithms, it was impossible to know the extent of the problem and how it was best addressed, Charmaraman said.
For instance, the company could have data showing that teens typically scroll in five-minute chunks many times throughout the day, Charmaraman said. That type of engagement wouldn't set off the break reminder, so teens could remain a reliable audience for advertisers.
Crenshaw said the average length of a short session on the app was 10 minutes. She noted that Meta wanted to be more transparent with its employees and outside researchers and that it was looking for partners willing to collaborate on independent studies, although the company was unsure how to share data with third-party researchers without violating people's privacy.
Wednesday will be the first time Mosseri goes before lawmakers on behalf of the company.
Critics of Instagram, including one of the senators in charge of the hearing at which Mosseri will testify, were swift to question the company's motives in rolling out safety features for teens the day before his appearance.
"We'd like to be hopeful that this shows Instagram's commitment to making its products safer for kids and teens. But with their track record, it seems like their 'big announcement' in the dead of night is more likely to be a smokescreen to try to draw attention away from things they don't want highlighted at Wednesday's hearing," a spokesman for Senator Marsha Blackburn said.
Some advocates say they hope the company is planning more than extra health and safety features in response to what many think is a threat to the mental health of children and teens.
"Facebook is not getting the message. It can't engage in piecemeal efforts. It can't promise to do better. It needs to publicly say what changes it's going to make and turn those changes over to policymakers and independent experts around the world," said Jeff Chester, the executive director of the Center for Digital Democracy, which advocates for privacy, civil and human rights.
He and others called for Meta to release its research and algorithms for an independent audit, as well as scrap its plans for an Instagram app for users younger than 13.
Crenshaw said Instagram for Kids remained "on pause".
As it stood, the new features were a move in line with Facebook's long-time strategy, said Jim Steyer, the chief executive of family advocacy group Common Sense Media: Wait until things blow up, then promise to make things better.
"It's a public relations stunt," Steyer said. "And it isn't going to work."