- Executives from five leading social media companies testified Wednesday before the US Senate Judiciary Committee about child-safety measures on their respective platforms.1
- With the Chief Executive Officers of Meta, X, TikTok, Snap, and Discord before his committee, Chairman Sen. Dick Durbin (D-Ill.) said in his opening remarks that the companies are 'responsible for many of the dangers our children face online' and criticized their 'failures to adequately invest in trust and safety.'2
- Recorded testimony showed children and parents speaking about exploitation on social media, and parents who had lost children to suicide silently held up pictures of their deceased children during the hearing.3
- Meta CEO Mark Zuckerberg turned to address the parents at the hearing after Sen. Josh Hawley (R-Mo.) asked whether he wanted to apologize to the victims of social media dangers. Zuckerberg told the parents, 'No one should go through the things that your families have suffered,' and pledged a commitment to improving child safety measures.3
- Meanwhile, TikTok CEO Shou Zi Chew was questioned about his ByteDance-owned company's ties to the Chinese Communist Party. Chew repeatedly denied any affiliation with the CCP, insisting that he serves only his home country, Singapore, and that only ByteDance companies based in China are required to share data with the government.4
- The hearing comes as Meta faces multiple lawsuits related to child safety on Facebook and Instagram, with more than 40 attorneys general suing the company for allegedly contributing to mental health issues, including teenage eating disorders.5
- Narrative A, as provided by The Gateway Pundit. Congratulations are due to the bipartisan group of senators taking on this issue and confronting CEOs who profit from the exploitation of children. Wednesday's hearing was a step in the right direction, but much more must be done to protect children from the dangers of social media exploitation, because too often these platforms have placed profit above safety.
- Narrative B, as provided by Guardian. These CEOs are truly sorry for the harm their platforms have caused. Many families have endured unthinkable suffering, but as research on mitigating these harms has emerged, the platforms are doing what they can to implement controls and tools to make their sites as safe as possible. More will be done going forward to provide security for kids and improve industry standards.