Instagram may soon require your ID to verify your age

The latest move by Instagram comes at a time when concerns over the safety of minors have become rife across internet-based social platforms. Picture: Souvik Banerjee/Unsplash

Published Jun 24, 2022

Amid a push by social networks to foster a safer digital environment for minors, Instagram has announced plans for an age verification feature, which could require your ID.

The Meta-owned photo-sharing platform has said that it is testing new options for people to verify their age to “provide age-appropriate experiences” for its users.

In addition to uploading an identity document, users will be able to ask mutual friends on the platform to vouch for their age.

Another technology the company could introduce is age verification based on a video selfie.

“We’re testing new options for people on Instagram to verify their age, starting with people based in the US. If someone attempts to edit their date of birth on Instagram from under the age of 18 to 18 or over, we’ll require them to verify their age using one of three options: upload their ID, record a video selfie or ask mutual friends to verify their age,” the company said in a statement.

Instagram has also said it is partnering with digital ID platform Yoti to make age verification possible.

The company also said it has asked users to provide their age since 2019, when it became a sign-up requirement. Knowing users' ages allows it to "provide appropriate experiences to different age groups, specifically teens".

“We require people to be at least 13 years old to sign up for Instagram. In some countries, our minimum age is higher. When we know if someone is a teen (13-17), we provide them with age-appropriate experiences like defaulting them into private accounts, preventing unwanted contact from adults they don’t know, and limiting the options advertisers have to reach them with ads,” the company said.

Earlier this month, Instagram announced its Sensitive Content Control feature, which lets users choose "Standard", "More", or "Less" exposure to sensitive and explicit content while browsing the app.

In March, the company made a push toward more safety for children on the platform through supervision tools for parents and guardians.

The latest move by the social network comes at a time when concerns over the safety of minors have become rife across internet-based social platforms.

Recently, IOL Technology reported that microblogging platform Twitter could become a hotbed for easily accessible pornography if it adopts a freer algorithm, the platform having already been widely used to distribute and share adult content over the years.

Meanwhile, in the latest outcry over the safety of minors online, MindGeek CEO Feras Antoon and COO David Tassillo resigned amid scathing allegations that non-consensual videos had been posted to the company's most popular subsidiary, Pornhub.