Instagram Debuts New Safety Settings for Teenagers

San Francisco / Web Desk:

Instagram is introducing new safety features to protect young users from unwanted contact with adults. Facebook, which owns Instagram, announced the changes in a press release.

It’s making new accounts private by default for kids under 16, blocking some adults from interacting with teens on its platform, and restricting how advertisers can target teenagers.

The changes come as the Facebook-owned photo-sharing app is under pressure from lawmakers, regulators, parents and child-safety advocates worried about the impact of social media on kids’ safety, privacy, and mental health.

Karina Newton, Instagram’s head of public policy, told NPR that the changes announced on Tuesday are aimed at creating “age-appropriate experiences” and helping younger users navigate the social network.

Starting this week, when kids under 16 join Instagram, their accounts will be made private automatically, meaning their posts will be visible only to people they allow to follow them. Teens who already have public Instagram accounts will see notifications about the benefits of private accounts and how to switch.

Instagram is also taking steps to prevent what it calls “unwanted contact from adults.” It says adults who, while not breaking Instagram’s rules, have shown “potentially suspicious behavior,” such as being blocked or reported by young people, will have a limited ability to interact with and follow teens.

“We want to ensure that teens have an extra barrier of protection around them out of an abundance of caution,” Newton said.

Facebook is also changing the rules for advertisers on Instagram, its namesake app, and its Messenger app. Starting in a few weeks, advertisers will be able to target people under 18 based only on age, gender, and location, not on other information the company tracks, such as users’ interests or their activity on Facebook’s own apps or other apps that share data with the company.

Facebook and Instagram also say they are working on better methods of verifying users’ ages, both to determine when policies for teens should apply and to do a better job of keeping kids under 13 off the apps.

This post was originally published on VOSA.