US regulators have said Facebook misled parents and failed to protect the privacy of children using its Messenger Kids app, including by misrepresenting the access it gave app developers to private user data.
As a result, the Federal Trade Commission on Wednesday proposed sweeping changes to a 2020 privacy order with Facebook — now called Meta — which would prohibit it from profiting from data it collects on users under 18. This would include data collected through its virtual reality products.
The FTC said the company has failed to fully comply with the 2020 order.
Meta would also face other limitations, including restrictions on its use of facial recognition technology, and would be required to provide additional privacy protections for users.
“Facebook has repeatedly violated its privacy promises,” said Samuel Levine, director of the FTC’s Bureau of Consumer Protection.
“The company’s recklessness has put young users at risk, and Facebook needs to answer for its failures.”
Facebook launched Messenger Kids in 2017, pitching it as a way for children to chat with family members and friends approved by their parents.
The app does not give children separate Facebook or Messenger accounts, but works as an extension of a parent’s account, and parents get controls such as the ability to decide who their children can chat with.
At the time, Facebook said Messenger Kids would not show ads or collect data for marketing, though it would collect some data it said was necessary to run the service.
But child development experts raised immediate concerns.
In early 2018, a group of 100 experts, advocates and parenting organisations contested Facebook’s claims that the app was filling a need for a children’s messaging service.
The group included non-profits, psychiatrists, paediatricians, educators and children’s music singer Raffi Cavoukian.
“Messenger Kids is not responding to a need — it is creating one,” the letter said. “It appeals primarily to children who otherwise would not have their own social media accounts.”
Another passage criticised Facebook for “targeting younger children with a new product”.
Facebook said at the time that the app “helps parents and children to chat in a safer way”, and emphasised that parents are “always in control” of their children’s activity.