Facebook has a demographic problem. Interest in the company’s flagship app was falling off a cliff even before studies revealed that its products were harming teenagers’ mental health. Teen usage of the app has dropped 13% since 2019 and is expected to fall another 45% over the next two years.
In an internal memo released last week, a researcher stated, “Aging up is a real issue.” According to documents recently turned over to Congress by whistleblower Frances Haugen, Facebook was exploring new products aimed at children as young as six years old.
“Our company is making a major investment in youth and has spun up a cross-company virtual team to make safer, more private, experiences for youth that improve their and their household’s well-being,” the internal post from April 9 said.
“For many of our products, we historically haven’t designed for under 13 (with the exception of Messenger Kids) and the experiences built for those over 13 didn’t recognize distinctive maturity levels across the age spectrum.”
The Children’s Online Privacy Protection Act (COPPA) restricts how companies can target children and collect and share their data, which is largely why Facebook has avoided targeting children under the age of 13. For example, companies cannot disclose a child’s personal information to third parties without parental consent.
A quick aside: Facebook just announced that its corporate entity would be renamed Meta. The Facebook moniker will continue to be used for the company’s main app and platform.
The rebranding comes as the company faces mounting scrutiny from governments and regulators over its role in ethnic cleansing, hate speech, insurgencies, mental health crises, and other issues. Because the documents discussed in this post were created while the company was still called Facebook, we’ll refer to it by that name throughout.
Facebook does have one product for children under the age of 13, Messenger Kids, and its terms of service state that the company does not sell its users’ information to third parties. However, as Common Sense Media points out, the terms do not prohibit showing targeted advertising to children. Facebook claims that the app complies with COPPA.
Two years ago, a flaw in the Messenger Kids app allowed users to create group chats that included unauthorized people. Facebook took nearly a year to discover the flaw; once found, it was fixed the next day. Parents, however, were not notified for another month.
Senators Ed Markey (D-Mass.) and Richard Blumenthal (D-Conn.) pressed Facebook on whether it had violated COPPA. Kevin Martin, Facebook’s vice president of public policy, responded that the company values children’s privacy and believes the app complies with the law.
Yet the senators weren’t entirely convinced by Martin’s letter. “Facebook’s response gives little reassurance to parents that Messenger Kids is a safe place for children today,” they wrote back. “We are particularly disappointed that Facebook did not commit to undertaking a comprehensive review of Messenger Kids to identify additional bugs or privacy issues.”
Facebook, however, appears unfazed and continues its push to attract younger users to its platforms.
In addition to potential COPPA violations, social media apps can harm children in other ways. According to Facebook’s own data, 7% of kids on Instagram have been bullied, with 40% of the bullying occurring through private messaging. To put it another way, limiting children’s social media relationships to their “friends” could still put them in danger.
Facebook had seven job listings at the time of the internal post, including three for Instagram Youth, a “paused” product that Facebook says was aimed at kids aged 10 to 12. Another posting was for an unnamed role that would cover Messenger Kids as well as a promised “Youth Platform.”
The new team was tasked with developing “experience strategies for a spectrum of age groups” spanning from six to sixteen years old, according to an internal memo.
Facebook didn’t appear to be limiting its focus on children to a single app. In addition to developing new ways to attract kids, the company said it was working on “redefin[ing] existing products to take into account cognitive and social development needs that different stages of maturity have.” In other words, Facebook wanted to bring children into every app in its portfolio.