A broad-based coalition of child safety advocates has sent a letter to Facebook chief Mark Zuckerberg urging him to abandon plans to launch a version of Instagram for children under 13. The coalition argues that such an app is not the right fix for the problem of kids lying about their age to get on the platform and would “put young users at great risk.”
“Launching a version of Instagram for children under 13 is not the right remedy” for the problem of kids dodging age controls, wrote the coalition, which consists of 35 organizations and 64 individual experts, coordinated by the nonprofit Campaign for a Commercial-Free Childhood.
In the letter (pdf), the coalition said Facebook’s plans to develop a kids’ version of Instagram “will subject young children to a number of serious risks and will offer few benefits for families.”
“Instagram, in particular, exploits young people’s fear of missing out and desire for peer approval to encourage children and teens to constantly check their devices and share photos with their followers,” the letter says. “The platform’s relentless focus on appearance, self-presentation, and branding presents challenges to adolescents’ privacy and wellbeing.”
“Younger children are even less developmentally equipped to deal with these challenges, as they are learning to navigate social interactions, friendships, and their inner sense of strengths and challenges during this crucial window of development,” the letter says, noting studies that link excessive screen media and social media use by children and adolescents to issues like obesity, decreased quality of sleep, increased risk of depression, and increases in suicide-related outcomes.
The coalition pointed to reports that Facebook, which owns Instagram, was exploring a version of the popular photo-sharing app for kids under 13, including a report of a leaked internal memo that described such plans.
Stephanie Otway, a Facebook spokeswoman, confirmed to media outlets that the company was in the early stages of developing a version of Instagram for kids under 13, adding that it would consult with experts as it moves forward.
“We agree that any experience we develop must prioritize their safety and privacy, and we will consult with experts in child development, child safety and mental health, and privacy advocates to inform it,” Otway said. “The reality is that kids are online. They want to connect with their family and friends, have fun and learn, and we want to help them do that in a way that is safe and age-appropriate. We also want to find practical solutions to the ongoing industry problem of kids lying about their age to access apps.”
Otway added that Instagram-for-kids would be free of ads and have parental controls.
But the child advocacy coalition is unconvinced, writing, “Facebook’s long track record of exploiting young people and putting them at risk makes the company particularly unsuitable as the custodian of a photo sharing and social messaging site for children.”
The coalition cited several incidents in support of this view, including a design flaw in Facebook’s Messenger Kids, a version of its Messenger app aimed at children aged 6 to 12, that let young children circumvent parental controls and enter group chats with strangers.
The coalition argued that a for-kids version of Instagram would not only fail to solve the problem of children lying about their age to get on the platform, but would also groom even younger children for social media use.
“Children between the ages of 10 and 12 who have existing Instagram accounts are unlikely to migrate to a ‘babyish’ version of the platform after they have experienced the real thing,” they wrote. “The true audience for a kids’ version of Instagram will be much younger children who do not currently have accounts on the platform.”
“While collecting valuable family data and cultivating a new generation of Instagram users may be good for Facebook’s bottom line, it will likely increase the use of Instagram by young children who are particularly vulnerable to the platform’s manipulative and exploitative features,” they added.
Instagram, in a recent blog post, described enforcement of its 13-year minimum age requirement as a “long-standing, industry-wide challenge” due to the complexity of age verification, while noting that the company was rolling out new safety features, such as preventing adults from sending messages to people under 18 who don’t follow them.
Other steps Instagram has taken to protect young people on its platform include encouraging teens to make their accounts private, making it harder for adults to find and follow teens, and displaying safety notices to young users, prompting them to be more cautious about interactions in direct messages.