The chief executives of Meta, TikTok, X and other social media companies are giving evidence before the Senate Judiciary Committee about child safety on their platforms.
The hearing comes as US politicians are growing increasingly concerned about the effects of social media on young people’s lives.
While Meta chief executive Mark Zuckerberg is a veteran of congressional hearings since his first one over the Cambridge Analytica privacy debacle in 2018, it will only be the second time for TikTok chief executive Shou Zi Chew and the first for Linda Yaccarino, the chief executive of the former Twitter.
The hearing began with recorded testimony from youngsters and parents who said they or their children were exploited on social media.
“They’re responsible for many of the dangers our children face online,” US Senate Majority Whip Dick Durbin, who chairs the committee, said in opening remarks.
“Their design choices, their failures to adequately invest in trust and safety, their constant pursuit of engagement and profit over basic safety have all put our kids and grandkids at risk.”
In a heated question and answer session with Mr Zuckerberg, Missouri Senator Josh Hawley asked the Meta CEO if he had personally compensated any of the victims and their families for what they had been through.
“I don’t think so,” Mr Zuckerberg replied.
“There’s families of victims here,” Mr Hawley said. “Would you like to apologise to them?”
As parents rose and held up their children’s pictures, Mr Zuckerberg turned to face them and apologised for what they had been through.
Mr Hawley continued to press Mr Zuckerberg, asking if he had taken personal responsibility for the harms his company has caused.
Mr Zuckerberg stayed on message and repeated that Meta’s job is to “build industry-leading tools” and empower parents.
“To make money,” Mr Hawley cut in.
South Carolina Senator Lindsey Graham, the top Republican on the Judiciary panel, echoed Mr Durbin’s sentiments and said he is prepared to work with Democrats to solve the issue.
“After years of working on this issue with you and others, I’ve come to conclude the following: social media companies as they’re currently designed and operate are dangerous products,” Mr Graham said.
He told the executives their platforms have enriched lives but that it is time to deal with “the dark side.”
Beginning with Discord’s Jason Citron, the executives touted existing safety tools on their platforms and the work they have done with nonprofits and law enforcement to protect minors.
Snapchat broke ranks ahead of the hearing, backing a federal bill that would create a legal liability for apps and social platforms that recommend harmful content to minors.
Snap CEO Evan Spiegel reiterated the company’s support on Wednesday and asked the industry to back the bill.
Mr Chew said TikTok is vigilant about enforcing its policy barring children under 13 from using the app. Ms Yaccarino said X, formerly Twitter, does not cater to children.
“We do not have a line of business dedicated to children,” Ms Yaccarino said. She said the company will also support the Stop CSAM Act, a federal bill that would make it easier for victims of child exploitation to sue tech companies.
Yet child health advocates say social media companies have failed repeatedly to protect minors.
“When you’re faced with really important safety and privacy decisions, the revenue in the bottom line should not be the first factor that these companies are considering,” said Zamaan Qureshi, co-chair of Design It For Us, a youth-led coalition advocating for safer social media.
“These companies have had opportunities to do this before they failed to do that. So independent regulation needs to step in.”
Meta is likely to be a central focus of the hearing. The Menlo Park, California, tech giant is being sued by dozens of states that say it deliberately designs features on Instagram and Facebook that addict children to its platforms and has failed to protect them from online predators.
New internal emails between Meta executives released by Senator Richard Blumenthal’s office show Sir Nick Clegg, president of global affairs, and others asking Mr Zuckerberg to hire more people to strengthen “wellbeing across the company” as concerns grew about effects on youth mental health.
“From a policy perspective, this work has become increasingly urgent over recent months. Politicians in the US, UK, EU and Australia are publicly and privately expressing concerns about the impact of our products on young people’s mental health,” Sir Nick wrote in an August 2021 email.
The emails released by Mr Blumenthal’s office do not appear to include a response, if there was any, from Mr Zuckerberg. In September 2021, The Wall Street Journal released the Facebook Files, its report based on internal documents from whistleblower Frances Haugen, who later testified before the Senate.
Meta has beefed up its child safety features in recent weeks, announcing earlier this month that it will start hiding inappropriate content from teenagers’ accounts on Instagram and Facebook, including posts about suicide, self-harm and eating disorders.
It also restricted minors’ ability to receive messages from anyone they do not follow or are not connected to on Instagram and on Messenger and added new “nudges” to try to discourage teens from browsing Instagram videos or messages late at night.
The nudges encourage children to close the app, though they do not force them to do so.
But child safety advocates say the companies’ actions have fallen short.
“Looking back at each time there has been a Facebook or Instagram scandal in the last few years, they run the same playbook. Meta cherry-picks their statistics and talks about features that don’t address the harms in question,” said Arturo Bejar, a former engineering director at the social media giant known for his expertise in curbing online harassment, who recently testified before Congress about child safety on Meta’s platforms.
“Instagram promises features that end up hidden in settings that few people use. Why is ‘quiet mode’ not the default for all kids?” Mr Bejar added.
“Meta says that some of the new work will help with unwanted advances. It is still not possible for a teen to tell Instagram when they’re experiencing an unwanted advance. Without that information how can they make it safer?”
Google’s YouTube is notably missing from the list of companies called to the Senate on Wednesday even though more children use YouTube than any other platform, according to the Pew Research Centre.
Pew found that 93% of US teenagers use YouTube, with TikTok a distant second at 63%.