Instagram is overhauling how its platform works for teenagers, promising more “built-in protections” for children and greater control and reassurance for parents.
Tuesday marks the launch of the new “teen accounts” in the US, UK, Canada, and Australia.
For all under-18s, the accounts will turn on several privacy settings by default, including preventing non-followers from seeing their content and requiring teens to explicitly approve each new follower.
However, 13- to 15-year-olds will only be able to change these settings by adding a parent or guardian to their account.
Social media companies around the world are under pressure to make their platforms safer, amid concerns that not enough is being done to protect children from harmful content.
The move from Instagram was hailed as a “step in the right direction” by the UK children’s charity, the NSPCC.
However, it also noted that account settings can “highlight the necessity for parents and kids to protect themselves.”
Such settings “must be backed up by proactive measures that prevent harmful content and sexual abuse from proliferating on Instagram in the first place,” said Rani Govender, the NSPCC’s online child safety policy manager.
Meta describes the changes as a “new experience for teens, guided by parents.”
It states that they will “better support parents, and give them peace of mind that their teens are safe with the right protections in place.”
Ian Russell, whose 14-year-old daughter Molly viewed content promoting self-harm and suicide on Instagram before taking her own life, told the BBC it was important to wait and see how the new policy was implemented.
“We won’t know if it works or not until the measures are put in place,” he said.
“Meta is very good at generating publicity and making these big announcements, but they also need to be good at sharing the effectiveness of their measures and being transparent.”
How will it work?
Teen accounts will mostly change how Instagram works for users aged 13 to 15, with several settings switched on by default.
These include strict controls on sensitive content to prevent recommendations of potentially harmful material, and notifications muted overnight.
Accounts will also be set to private rather than public, meaning teenagers will have to actively approve new followers, and non-followers will not be able to view their content.
Parents who choose to supervise their child’s account will be able to see who their child messages and the topics they have said they are interested in, though they will not be able to view the content of those messages.
But in April, the media watchdog Ofcom voiced concerns about parents’ readiness to step in and protect their kids online.
“One of the things we do find… is that even when we build these controls, parents don’t use them,” senior Meta executive Sir Nick Clegg said in a talk last week.
Age verification
The system will rely mainly on users being honest about their age, although Instagram already has tools to verify a user’s age if it suspects they are lying.
From January, Instagram will begin using artificial intelligence (AI) tools in the US to identify teenagers who are using adult accounts and move them into teen accounts.
The UK’s Online Safety Act, passed in 2023, requires online platforms to take steps to keep children safe or face substantial fines.
In May, Ofcom warned social media platforms they could be named and shamed, and banned for under-18s, if they failed to comply with its new rules.
Social media industry analyst Matt Navarra said the changes were significant, but would hinge on how they are implemented.
“As we’ve seen with teens throughout history, in these sorts of scenarios, they will find a way around the blocks, if they can,” he told the BBC.
Questions for Meta
Instagram is not the first platform to offer these kinds of tools for parents, and it already claims to have more than 50 features aimed at keeping teens safe.
In 2022 it introduced a family center and supervision tools that, among other things, let parents see which accounts their child follows and who follows them.
Snapchat also launched its own family center, letting parents over the age of 25 see who their children are messaging and restrict the kinds of content they can view.
In September, YouTube said it would stop recommending certain health and fitness videos to teenagers, such as those that “idealize” particular body shapes.
The latest moves raise the question of why, despite the platform’s many safeguards, young people are still exposed to harmful content on Instagram.
An Ofcom study earlier this year found that every child it surveyed had seen violent material online, with Instagram, WhatsApp, and Snapchat the platforms where it was most often encountered.
To comply with the Online Safety Act, platforms will need to show they are committed to removing illegal content, including child sexual abuse material (CSAM) and content that promotes self-harm or suicide.
However, those rules will not fully take effect until 2025.
Meanwhile, Australian Prime Minister Anthony Albanese has announced plans to ban children from social media by introducing a new minimum age for platform use.
Instagram’s latest tools hand parents even more control over their child’s use of the platform: they will be directly responsible for supervising their child’s activity and interactions, and for deciding whether to allow them greater freedom.
They will also need an Instagram account of their own.
But parents have no control over the content shared by the platform’s billions of users worldwide, or over the algorithms that push content towards their children.
Social media expert Paolo Pescatore said it was a “significant step in protecting children’s access to the world of social media and fake news.”
“The smartphone has opened up to a world of disinformation, inappropriate content fuelling a change in behavior among children,” he said.
“More needs to be done to improve children’s digital wellbeing and it starts by giving control back to parents.”