Fiery Senate hearing with Big Tech CEOs defending defective products

Yesterday was a big day for online safety for kids.

Five Big Tech executives appeared in front of Congress to talk about kids and the harms and dangers currently on their social media platforms.

In the Senate Judiciary hearing, the official topics were why sexual exploitation of children online continues and how Big Tech is going to remove illegal content from their platforms, but the four-hour hearing ended up being more about protecting kids on social media in general.

These are things that EVERY parent wants to know! We all want common-sense safeguards so our kids can safely browse the internet without unwanted sexually explicit content and drug dealers peddling their wares openly.

As Senator Graham said to all five CEOs (Meta, Discord, X, TikTok, Snap), "You have blood on your hands."

Testifying were five Big Tech CEOs

A few CEOs accepted Congress' invitation to testify and others had to be subpoenaed. We heard from Mark Zuckerberg, CEO of Meta; Linda Yaccarino, CEO of X (formerly Twitter); Shou Zi Chew, CEO of TikTok; Evan Spiegel, CEO of Snap; and Jason Citron, CEO of Discord.

In the audience were grieving parents

In the audience were parents who are bound together by the unimaginable - the loss of a child to social media harms like cyberbullying, dangerous challenges, drug dealers, eating disorders, sexual exploitation, and sextortion. To learn more about their kids and their stories, visit Parents for Safe Online Spaces.

As Senator Graham said at the beginning of the hearing - "You families holding photos. Don't quit! You're making a difference." Several other Senators recognized the parents in the room.

At one point in the hearing, Mark Zuckerberg was asked if he would like to apologize to the parents in the audience. He stood, turned from his mic to address the room and said “I’m sorry for everything you have all been through. No one should go through the things that your families have suffered and this is why we invest so much and we are going to continue doing industry wide efforts to make sure no one has to go through the things your families have had to suffer."

A true apology from all platforms would be to take action NOW, work with parents and teens to fix your defective products (or don’t allow teens on the platforms) and take responsibility for the harm you’ve caused to youth mental health.

Parent advocates holding up their child’s photo and demanding Congress take action to protect kids online.

Protecting kids online is one issue that gets bipartisan support, although so far, that support hasn’t led to any legislation being passed.

While the hearing mostly focused on the protection of children from online sexual exploitation, the questions varied widely as senators took advantage of having five powerful executives there under oath.

Senators acknowledged there is currently no recourse for social media users who are harmed:

  • Section 230 protects social media companies and gives the platforms immunity from civil liability lawsuits

  • There are no consumer protections through a regulatory body that could fine a company or take away a license

When something goes wrong, law enforcement is left trying to unravel encrypted messages and families are left with no recourse. And we’re not even beginning to address Artificial Intelligence (AI) and the dangers headed our kids’ way.

Introductions to the five CEOs

Zuckerberg (Meta) tried to push the safety problem onto parents, mentioning the 30 tools Meta has created for safety, and saying he wants to “empower parents” (tech-speak for blaming them if things go wrong). The underlying message was - if parents would just use these tools, their kids would be safe. That’s simply not true. Many of the tools are brand new, created to stave off Congressional regulation in 2024; others are obscure or not useful. He also disputed that a child’s mental health worsens with excessive social media use.

Spiegel (Snap) boasted that Snapchat has no popularity metrics (“likes”) unlike the other platforms, but forgot to mention the addictive nature of “Streaks” that send kids back to the app daily to message each friend or risk losing their status.

Yaccarino (X, formerly Twitter) tried to say X really doesn’t have many kids on the platform and has “no line of business dedicated to children”. She stated that less than 1% of X users are 13-17 years old. But when asked how many users that 1% represents, she said 1% of the total US user base of 90 million would be about 900,000 teens. And that’s just the kids who are telling the truth about their age! She also stated X was only 14 months old and had an increase in its Trust & Safety team. (Didn’t Elon Musk fire most of them last year, so of course there would be an increase?)

Chew (TikTok) disclosed that TikTok has 1 billion users worldwide and 170 million monthly American users. He talked about the 40,000 Trust & Safety professionals it has worldwide and its commitment to spend $2 billion on T&S in 2024.

Citron (Discord) disclosed that Discord is small compared to the giants in the room, at 150 million active users. (Some people were wondering - what even is Discord?) It’s a communications platform that lets you direct message, chat in chatrooms, send photos, and make video calls. Learn more here.

What I know for sure: we can’t push this problem onto parents. Parents didn’t create the problem and cannot be expected to know how to supervise each app with its own tools, family features, and loopholes. That’s right - for many of these safety features, kids can simply toggle the safety settings off and head right back into danger.

Good news - a few in the industry support current legislation

Despite all the doublespeak about supporting the basic ideas behind various bills “but we still need to discuss the details,” a few companies stepped forward to say they actually support bills already in existence in 2024.

In a surprise move last week that goes against the entire industry’s stance, Snapchat announced it supports the Kids Online Safety Act (KOSA), which creates a “duty of care” for social media companies to put their young users’ well-being before other goals, to prevent and mitigate harms to minors.

Then Microsoft agreed to support KOSA. And during the hearing, X (formerly Twitter) said it supported both KOSA and the STOP CSAM Act.

Highlights from the hearing

“Why are you letting kids choose to see CSAM?”

Senator Cruz started by acknowledging that “for kids, phones are portals to predators, bullying and self-harm.”

He was outraged to see this popup and questioned Mark Zuckerberg: "In what universe is it okay to flag images of child sexual abuse material (CSAM) and then ask users if they still want to see them?"

He asked Zuckerberg to quantify how many times this message has been displayed and how many times young users clicked on “see results anyway” but didn’t get an answer.

“Who did you fire after learning about these statistics?”

Senator Hawley confronted Meta's Mark Zuckerberg with horrifying statistics brought to light by whistleblower Arturo Bejar’s testimony last November.

  • 37% of teenage girls 13-15 years old encountered nudity in the last SEVEN DAYS

  • 24% received unwanted sexual advances

He asked “Who did you fire after learning about these statistics?”
He continued: “Have you compensated the victims' families? Started a victims' compensation fund? Provided counseling to those harmed? Apologized to the families who are in the room and have lost a child or had a child severely injured?”

Senator Klobuchar had a lot of statistics and data to put things in perspective.

“Early in January, Boeing grounded a fleet of hundreds of aircraft after ONE problem that didn’t even kill anyone.” So why aren’t we doing something when kids are dying from exposure to social media harms?

She also mentioned a 2023 study across six major social media platforms showing that 2022 annual advertising revenue from youth users ages 0-17 was nearly $11 billion, including $2 billion from kids ages 0-12.

Approximately 30–40% of the advertising revenue generated from three social media platforms (Snapchat, TikTok, YouTube) is attributable to young people. (Read the study here.)

Regarding the fentanyl epidemic, she stated that one-third of fentanyl cases had direct ties to social media.

I agree with Zuckerberg on this

Zuckerberg wondered if social media platforms were the right place to focus and brought up the idea that the app stores (Apple and Google) could be an easier place to set up parental consent, safety features and controls.

I agree - anytime you can centralize features in one spot instead of every app, you’re going to make it easier to implement. Parents will actually DO one thing but if they have to do eight things with 57 steps each that change regularly, forget about it.

Senator Hirono asked Zuckerberg why the safety settings for teens weren’t mandatory and why kids could reset them to more public settings. He responded that teens sometimes want to be creators and that it’s not his job to decide for others.

That is the problem! He’s not putting kids’ mental health first! Of course it’s his job to make sure his product provides a safe experience for kids, if he’s going to invite kids to use it.

Senator Padilla asked for specifics from all five CEOs on these two questions:

  • How many teens are on your platform?

  • How many parents are using your family center/parental controls?

Only Spiegel was able to answer, with an appalling ratio of 20 million teen users and 200,000 parents using Snapchat’s Family Center. A ONE PERCENT ADOPTION RATE.

Any business person can see that something isn’t working when only 1% are using the safety features. More time and money need to be spent creating one easy tool that is worth it for parents to use - one that lets parents know when their kids come into contact with harmful content.

Senator Butler showed a photo of a woman using a plastic surgery filter, telling Zuckerberg, “You give teens the tools to affirm self-hate!”

A photo of a woman using an Instagram filter to simulate plastic surgery

Why this matters

Online safety for kids is critical! The U.S. is behind when it comes to safeguards to protect children and teens on social media and we need to make progress. Kids are getting harmed every day. Some even lose their lives to harms on these defective products. Big Tech CEOs are still in denial that they’re part of the problem when it comes to mental health.

We need Congress to listen to safety advocates and act to protect kids today!

  • Support the Kids Online Safety Act (KOSA), which creates a “duty of care” for social media companies to put their young users’ well-being before other goals, to prevent and mitigate harms to minors.

  • Require companies to collect and publicly disclose numbers and metrics (not percentages that can be made to look good), like the number of children on their platforms, the number of teens who experienced harms, etc. (Arturo Bejar has the tech background and lots of ideas on the metrics and transparency needed.)

  • Remove encryption and disappearing messages/photos on teen accounts.

  • Ask parents and teens what tools would be helpful.

  • Verify ages to keep kids under 13 off the platforms.

You can watch all four hours of the testimony here at the U.S. Senate Committee on the Judiciary.

The five bills related to this hearing on CSAM are:

  • The STOP CSAM Act, which supports victims and increases accountability and transparency for online platforms;

  • The EARN IT Act, which removes tech’s blanket immunity from civil and criminal liability under child sexual abuse material laws and establishes a National Commission on Online Child Sexual Exploitation Prevention;

  • The SHIELD Act, which ensures that federal prosecutors have appropriate and effective tools to address the nonconsensual distribution of sexual imagery;

  • The Project Safe Childhood Act, which modernizes the investigation and prosecution of online child exploitation crimes; and,

  • The REPORT Act, which combats the rise in online child sexual exploitation by establishing new measures to help strengthen reporting of those crimes to the CyberTipline.

(All photos are from the Senate Judiciary livestream.)
