(San Francisco) Meta announced Tuesday the creation of “Teen Accounts,” intended to better protect minor users from the dangers linked to Instagram, an application that many associations and authorities accuse of harming the mental health of young people.
“This is an important update, designed to give parents peace of mind,” Antigone Davis, the Californian group’s vice president responsible for safety issues, told AFP.
In practice, users aged 13 to 15 will now have private accounts by default, with safeguards on who can contact them and what content they can see.
Teens who want a public profile and fewer restrictions—because they want to become influencers, for example—will need to get their parents’ permission, whether they’re already registered or new to the platform.
“This is a fundamental change […] to make sure we’re really doing things right,” the executive emphasized. Adults will be able to supervise their children’s activity on the social network and act accordingly, including blocking the application.
The parent company of Facebook, Instagram, WhatsApp and Messenger is also tightening its age rules.
“We know that teens can lie about their age, particularly to try to get around these protections,” Davis notes. Now, if a teen tries to change their date of birth, “we’re going to ask them to prove their age.”
Age
Pressure has been mounting for a year against the world’s number two in digital advertising and its competitors.
Last October, some forty American states filed a complaint against Meta’s platforms, accusing them of harming the “mental and physical health of young people” due to the risks of addiction, cyberbullying and eating disorders.
From Washington to Canberra, elected officials are working on bills to better protect children online. Australia is expected to soon set the minimum age for using social media at between 14 and 16.
Meta currently refuses to check the age of all its users, in the name of respect for privacy.
“If we detect that someone has definitely lied about their age, we intervene,” says Davis, “but we don’t want to force 3 billion people to provide an ID.”
According to the executive, it would be simpler and more effective if age verification took place at the level of the smartphone’s operating system, i.e. Android (Google) or iOS (Apple).
“They have significant information about the age of users,” she argues, and could therefore “share it with all the apps used by teenagers.”
Victims
It is far from certain, however, that this reinforcement of existing protections will be enough to reassure worried governments and organizations.
“Instagram is addictive. It leads kids into vicious circles, showing them not what they want to see, but what they can’t look away from,” said Matthew Bergman.
In 2021, the lawyer founded an organization to defend “victims of social media” in court. Among others, it represents 200 parents whose children died by suicide “after being encouraged to do so by videos recommended by Instagram or TikTok.”
Bergman also cites the many cases in which young girls have developed serious eating disorders. Meta already prohibits the promotion of extreme diets on its platforms, among other measures taken in recent years.
“These are small steps in the right direction, but there is so much more to do,” the lawyer said.
According to him, it would be enough for these companies to make their platforms less addictive – “and therefore a little less profitable” – without losing what makes them valuable to users, whether for communicating or exploring interests.
In June, the U.S. surgeon general called for requiring social media platforms to display warnings about the dangers they pose to minors, similar to the warning messages on cigarette packages.
Testifying before Congress at the end of January, Meta boss Mark Zuckerberg offered a rare apology to the parents of victims, saying he was “sorry for everything you have experienced.”