Senate Hearing on Protecting Kids Online: TikTok, Snapchat and YouTube
Today the Senate hearing “Protecting Kids Online” turned the spotlight from Facebook and Instagram onto TikTok, Snapchat and YouTube and the content on their platforms. Senators displayed a sense of urgency and impatience with the platforms.
Senate Subcommittee members came prepared: several had set up teen accounts on these platforms to experience what the algorithms would deliver, and were served shocking content about suicide, self-harm and body image, along with sexual material. They saw first-hand how dangerous these platforms can be.
The three Big Tech leaders started strong, saying they welcomed stronger regulation, transparency and accountability, but gave non-committal answers when pressed on specifics and tried to distance themselves from the policies of Facebook and Instagram. It was Snapchat’s and TikTok’s first appearance at a hearing, and at times it was comical to hear them explain how much “less bad” they were than Facebook.
The tech leaders pointed to new safety measures designed to keep teens safer, many of which were only rolled out over the summer, as it became apparent that regulation was a real possibility.
All three had a hard time committing to the currently proposed legislation, repeatedly saying they liked the goals but needed more discussion of the details. And all three overstated the tools parents actually have at their disposal today to keep their kids safe.
Senator Blumenthal’s closing remark was strong: “I leave it to the parents of America and the world whether or not they find your answers sufficient.”
Next steps: our recommendations
1. Make social media supervisable by regular parents without an IT degree
Social media platforms need to give parents an easy way to “see” what’s going on online and to parent in the online space. Right now it is effectively impossible for a parent to get involved. Parents can’t see who their kids are interacting with, what they are viewing, what the algorithm is delivering to them, where they are exploring, or where they are hurting or confused.
Give parents tools to make it easy to supervise their teens online.
2. Create regulations to protect teens’ privacy
Look to the UK’s Age Appropriate Design Code, which provides strong privacy protections on any platform that teens and children are likely to use. These companies already follow those rules in the UK. Let’s make them apply in the US.
3. Focus on Youth - Take away the Section 230 liability shield for these platforms
Section 230 grants platforms legal immunity for the content posted by their users. The problem is that social media companies are not passive hosts: their algorithms actively elevate certain posts, and those algorithmic and business decisions have caused massive tragedy for US families.
Let’s focus Section 230 reform on youth safety and protection. Teens and children have died by suicide, have died accidentally through online challenges and drugs made available online, have been harmed by predators, have learned to self-harm, have learned to develop eating disorders, and more.
Social media platforms need to be held accountable for promoting dangerous and deadly content to children and for the algorithms that spread it.
4. Require platforms to ask for only the bare minimum permissions
Several senators asked about TikTok’s current permission to collect biometric data, including faceprints and voiceprints, as well as geolocation, the objects that appear in your videos and even smart speaker audio. Mr. Beckerman assured them that TikTok isn’t currently using biometric data and would ask users’ permission before actually collecting it.
No!
TikTok shouldn’t be allowed to even ask for this information.
We need to reassess what data platforms really NEED from any of us, especially our kids.
5. Teens shouldn’t be able to turn off safety features inside the apps
We heard that some safety settings now default to “on”. That’s a step in the right direction.
But kids can easily turn safety features off. Why is this allowed?
6. Get kids under 13 off all social media platforms
No, we don’t mean creating more platforms for kids under 13, like the Facebook proposal for an Instagram for Kids.
We mean age-verifying the kids on current platforms. Kids under 13 should be allowed to be children, without the pressure to be online socially.
Some of the legislation discussed today at the Senate hearing:
Kids Internet Design and Safety Act (KIDS Act): prohibits amplifying harmful content, bans “auto-play” features, bans “nudges” and push alerts, and removes “like” buttons, which quantify popularity and can read as a sign of rejection.
Platform Accountability and Consumer Transparency Act (PACT Act): holds social media companies accountable for content that violates their own policies or is illegal.
Children and Media Research Advancement Act (CAMRA Act): requires the National Institutes of Health to fund research on the effects of media on infants, children and adolescents.
Protecting the Information of our Vulnerable Children and Youth Act (Kids PRIVCY Act): bans targeted ads to teens and children, requires opt-in consent for minors, requires platforms to consider the best interests of kids, protects biometric data and other private information, and more.
2021 COPPA update: prohibits internet companies from collecting personal information from anyone 13 to 15 years old without the user’s consent; creates an online “Eraser Button” by requiring companies to let users delete a child’s or teen’s personal information; and implements a “Digital Marketing Bill of Rights for Minors” that limits the collection of personal information from teens.
Eliminating Abusive and Rampant Neglect of Interactive Technologies Act of 2020 (EARN IT Act): revises the framework governing the prevention of online sexual exploitation of children.
Protect Americans from Dangerous Algorithms Act (PADAA): holds social media companies accountable for algorithms that promote harmful and dangerous content leading to offline violence.