Sexual predators. Addictive features. Suicide and eating disorders. Unrealistic beauty standards. Harassment. These are just some of the problems young people face on social media, and children’s advocates and lawmakers say companies aren’t doing enough to protect them.
On Wednesday, the CEOs of Meta, TikTok, X, Snap and Discord testified before the Senate Judiciary Committee about the dangers their platforms pose to youths.
The hearing began with recorded testimony from children and parents who said they or their children were exploited on social media. During the hours-long event, parents who lost their children to suicide silently held photographs of their dead children.
“They are responsible for many of the dangers our children face online,” US Senate Majority Whip Dick Durbin, who chairs the committee, said in his opening remarks. “Their design choices, their failure to adequately invest in trust and safety, their constant pursuit of engagement and profit over basic safety have put our children and grandchildren at risk.”
In a heated question-and-answer session with Mark Zuckerberg, Missouri Republican Sen. Josh Hawley asked Meta’s CEO if he had personally compensated any of the victims and their families for what they had gone through.
“I don’t think so,” Zuckerberg responded.
“There are families of victims here,” Hawley said. “Would you like to apologize to them?”
Parents who attended the hearing stood up and held photographs of their children. Zuckerberg also stood and, turning away from his microphone and the senators, addressed the parents directly.
“I’m sorry for everything you’ve been through. No one should have to go through the things your families have suffered,” he said, adding that Meta continues to invest in “industry-wide efforts” to protect children.
But time and time again, children’s advocates and parents have stressed that none of the companies are doing enough.
“Meta’s overall approach is ‘trust us, we will do the right thing,’ but how can we trust Meta? The way they talk about these issues seems as if they are trying to enlighten the world,” said Arturo Béjar, a former engineering director at the social media giant known for his expertise in curbing online harassment, who recently testified before Congress about child safety on Meta’s platforms. “Every parent I’ve met with a child under 13 is afraid of their child being old enough to be on social media.”
Hawley continued to press Zuckerberg, asking if he would take personal responsibility for the harm his company has caused. Zuckerberg stayed on message, repeating that Meta’s job is to “create industry-leading tools” and empower parents.
“To make money,” Hawley interrupted.
South Carolina Sen. Lindsey Graham, the top Republican on the judiciary panel, echoed Durbin’s sentiments and said he is prepared to work with Democrats to resolve the issue.
“After years of working on this issue with you and others, I have come to the following conclusion: social media companies, as they are currently designed and operated, are dangerous products,” Graham said.
He told the executives that their platforms have enriched lives, but that it is time to deal with “the dark side.”
Starting with Discord’s Jason Citron, executives touted the existing safety tools on their platforms and the work they’ve done with nonprofits and authorities to protect minors.
Snapchat broke ranks ahead of the hearing and began supporting a federal bill that would create legal liability for apps and social platforms that recommend harmful content to minors. Snap CEO Evan Spiegel on Wednesday reiterated the company’s support and called on the industry to back the bill.
TikTok CEO Shou Zi Chew said TikTok is vigilant about enforcing its policy barring children under 13 from using the app. X CEO Linda Yaccarino said the platform, formerly known as Twitter, does not cater to children.
“We don’t have a line of business dedicated to children,” Yaccarino said. She said the company also supports the Stop CSAM Act, a federal bill that would make it easier for victims of child exploitation to sue technology companies.
However, child health advocates say social media companies have repeatedly failed to protect minors.
“When you’re faced with really important safety and privacy decisions, bottom-line revenue shouldn’t be the first factor these companies consider,” said Zamaan Qureshi, co-president of Design It For Us, a youth-led coalition advocating for safer social media. “These companies have had opportunities to do this before, and they didn’t. So independent regulation needs to step in.”
Republican and Democratic senators came together in a rare show of agreement during the hearing, although it is not yet clear whether this will be enough to pass legislation such as the Kids Online Safety Act, proposed in 2022 by Democratic Sen. Richard Blumenthal of Connecticut and Republican Sen. Marsha Blackburn of Tennessee.
Meta is being sued by dozens of states that say it deliberately designs features on Instagram and Facebook that addict children to its platforms, and that it has failed to protect them from online predators.
New internal emails between Meta executives released by Blumenthal’s office show Nick Clegg, president of global affairs, and others asking Zuckerberg to hire more staff to strengthen “wellbeing across the company” as concerns grew about the effects of the platforms on young people’s mental health.
“From a policy perspective, this work has become increasingly urgent in recent months. Politicians in the US, UK, EU and Australia are publicly and privately expressing concerns about the impact of our products on young people’s mental health,” Clegg wrote in an August 2021 email.
The emails released by Blumenthal’s office do not appear to include a response, if any, from Zuckerberg. In September 2021, The Wall Street Journal published “The Facebook Files,” a series of reports based on internal documents from whistleblower Frances Haugen, who later testified before the Senate.
Meta has beefed up its child safety features in recent weeks, announcing earlier this month that it will begin hiding inappropriate content from teens’ Instagram and Facebook accounts, including posts about suicide, self-harm and eating disorders. It also restricted minors’ ability to receive messages from anyone they don’t follow or aren’t connected to on Instagram and Messenger, and added new “nudges” to try to discourage teens from browsing Instagram videos or messages late at night. The nudges encourage teens to close the app, although they do not force them to do so.
Google’s YouTube is notably absent from the list of companies called to the Senate on Wednesday, even though more children use YouTube than any other platform, according to the Pew Research Center. Pew found that 93% of American teens use YouTube, with TikTok a distant second at 63%.
Associated Press writer Mary Clare Jalonick contributed to this article.