Meta announced today new restrictions on teenagers' access to certain content on Facebook and Instagram, in order to promote use that is "safe and appropriate" for their age.
The Menlo Park (California) group will now place all teen accounts, by default, in the strictest settings on both social networks, which "make it more difficult to come into contact with potentially sensitive content."
The measure applies to accounts identified as belonging to a teenager between the ages of 13 (the minimum required for registration) and 15 (18 in certain countries).
This configuration also restricts other users' access to the teen's "friends" list and followed accounts, as well as their ability to comment on the teen's posts.
Searching for specific words such as "self-harm", "suicide", "eating disorders" or "bulimia" will not return any results for adolescents.
Instead, the user will be shown a prevention message suggesting they contact a professional or a friend, or consult a list of tips that may help them, according to a message posted on the Meta website.
However, this will not prevent a teenager from discussing their problems or difficulties on Instagram or Facebook with some of their contacts.
Meta has been regularly criticized in recent years over its handling of young users' accounts on Facebook and Instagram.
After announcing the launch of a version of Instagram dedicated to young people under 13, Meta ultimately abandoned the project in the fall of 2021.
In late October, 41 US states filed a civil lawsuit against Meta, accusing Facebook and Instagram of harming the "mental and physical health of young people."
Meta said at the time that it had already introduced a number of tools to improve the protection of young users on its platforms.