Meta’s more restrictive Instagram accounts for teens are a good start in building a safer experience for young users, but experts say the suite of new rules also raises further questions about what more social media companies will do to protect youth.
With Tuesday’s announcement, Meta is following through with demands many parents have made for more than a decade, according to Philip Mai, a senior researcher and co-director of the Social Media Lab at Toronto Metropolitan University.
The teen account features — including accounts set to private by default, limits on “sensitive content” in their feeds and notifications turned off overnight — appear to answer long-running parental concerns like who can communicate with their kids, what topics teens get exposed to and overuse of the app.
“This is a long time coming and I’m glad they’re rolling out something like this. But as always, with a new feature like this, the devil is in the details,” Mai said.
Anyone under 18 signing up for Instagram will get a teen account by default; those under 16 will also require parental permission to change protective settings. (Teens with existing accounts in the U.S., Canada, the U.K. and Australia will fall under the new restrictions in the next 60 days.)
How Instagram verifies age and identity in the first place is something Mai is interested in.
Currently, new account holders can ask others to vouch for them, upload a piece of identification or record a video selfie for analysis by a third-party AI facial recognition service. According to Meta, uploaded ID is deleted within 30 days and video selfies are deleted after the analysis.
Mai pointed out that each method can be easily circumvented by teens. The methods also raise further questions, he said, including about the consequences of identity data being stored online for any length of time and the potential for mistakes made by facial recognition.
“Over the next 12 months, I expect to hear a lot of growing pains and a few scandals here and there, where things didn’t go as planned or things that pop up that [Meta] didn’t anticipate,” he said.
Given ongoing lawsuits accusing social media companies of harming young people, as well as new online safety legislation being proposed in the U.S., the U.K., Europe and Canada, Mai predicts other social media platforms will soon join Instagram in rolling out new measures aimed at protecting young users.
Onus still on parents for monitoring
Instagram’s teen accounts will automatically be set to private, meaning young people can only exchange messages with and have their content seen by users they approve or have previously connected with — a move aimed at blocking attempts by strangers to communicate with them.
“We see a lot of situations where not only are adults connecting with kids on the platform, but individuals are using the platform to promote sexualized images of children,” noted Stephen Sauer, director of cyber security at the Canadian Centre for Child Protection in Winnipeg.
He thinks the new restrictions will help stem incoming messages from unknown individuals. But with so many of the changes linked to parental control and monitoring, he believes Meta is still putting the lion’s share of responsibility on parents versus taking stronger action itself against bad actors — such as a suspicious account sending a thousand friend requests daily.
“They will move to make these Band-Aid solutions and not look at a full-scale redevelopment of the tool to make sure that it meets … the protections that need to be in place for kids,” Sauer said.
Social media safety educator Paul Davies supports any increase in protections for teen users, since countless kids have circumvented what was previously in place and are on Instagram already. That’s why he’s a big advocate for ongoing education and parental guidance to teach kids to really understand the platforms they’re using.
One important lesson: the value of approving “real human followers” and not “people you think you know [or] you kind of might know.”
If your kid currently has hundreds of followers, Davies pointed out, even when Instagram eventually switches existing teen accounts to private ones, “it’s like having an open account,” because any of those hundreds of followers can still connect with them.
He encourages parents to “establish a healthy, transparent relationship with your child” on the topic of social media, where kids feel comfortable seeking support and asking questions without feeling judged or embarrassed.
New restrictions met with frustration
Some Toronto students bristled at Tuesday’s news of the new restrictions coming to their Instagram accounts.
Teens today are well-informed about the dangers of the internet, according to 16-year-old Hoang Banghia. “They are already aware enough to know what they should and shouldn’t do online,” he said.
Fellow student Kyidon Nornang, 17, is concerned about the restrictions on content, since social media is where so many young people turn to get informed. “Withholding information is never OK … It’s concerning that they think it’s OK to pick and choose what teens are allowed to see.”
Matt Hatfield, executive director of non-profit advocacy group OpenMedia, says he sympathizes with youth frustrated with Instagram’s new measures. While some might be capable of handling a restriction-free account, he noted that many others aren’t, so efforts at reducing risk are a step in the right direction.
“No set of changes is going to fix every problem. And some of these problems, it’s more about mitigating risk than preventing risk of harm,” Hatfield acknowledged.
“These are positive changes in terms of making it less likely that young people will be contacted and groomed by adults.”
Kari Hollend limits her kids’ use of social media: parental restrictions are turned on and phones must go off at certain times. The Toronto parent admits, however, “there’s only so much monitoring you can do” and welcomes more automatic safety measures.
“Having some rules and regulations in place to protect minors is a really good thing,” she said.