How can we get social media platforms to take responsibility?
Grieving parents all over the world are asking how they can get social media companies to take some responsibility for the deaths of their children.
These are parents whose children died by suicide or died accidentally because of harmful content on popular social media platforms like Instagram, YouTube and TikTok.
It’s difficult to get social media companies to take even partial responsibility for the death of a child. In the U.S. we have Section 230 of the Communications Decency Act of 1996, which gives online platforms immunity from civil liability for third-party content, and for removing content in certain circumstances. In other words, it’s a shield that lets social media platforms say they’re not responsible for the abuse and harms on their sites.
This is a problem when kids go deeper and deeper into online topics like depression, anxiety, self-harm and suicide: the algorithm sees the searches and delivers darker and darker material. Kids get stuck in a loop of horrendous ideas, and some end up taking their lives, in part because of what surrounds them on social media platforms.
To get around Section 230 protections, parents are starting to use product liability arguments. They argue that social media sites are liable for harms because they designed, manufactured or marketed a “dangerously defective product” — one they know hurts minors, based on independent research and on internal documents disclosed by people like Frances Haugen, the Facebook whistleblower.
In the UK
It’s not just parents in the U.S. who want social media companies to be accountable for what they allow and promote online to kids.
In the UK, the inquest into the death of 14-year-old Molly Russell ended in September 2022. The coroner concluded that social media companies were partially to blame for Molly’s death; in this case, content on Pinterest and Instagram contributed to her depression and death. The coroner will issue further reports on how the UK will protect its children and teens from harmful content on social media in the future.
In the News
This week on KOAT, we talk about a 13-year-old New Mexico boy who accidentally took his life in 2020 while attempting the “blackout challenge” that was circulating on TikTok. In September 2022 his family filed a lawsuit against TikTok for marketing a dangerously defective product, saying, “Despite knowing their product encourages dangerous behaviors, TikTok didn’t have warnings or safeguards that could have prevented this boy’s death.”
What should social media companies do to protect kids?
Age verification - Make sure all users in the U.S. are at least 13 years old, as required
A mirror account - Give parents of teens setting up new accounts the ability to “see” what their kids are watching
Safeguards in the settings - Turn on strong protections for minors by default, and make them impossible to turn off
Block dangerous content - Remove harmful content or block it behind “Over 18” walls
Stop recommendations - End algorithmic recommendations of harmful material
Create an effective way to report harms - Build a reporting system that provides updates and access to a real person who can help take down content
Be transparent about the risks to kids on their platforms and how they are addressing them - Issue independently reviewed quarterly reports on harms to kids, safety features that address the gaps, and action items
What can parents do?
Support legislation that protects kids online - both their privacy and their safety. The Kids Online Safety Act, introduced in 2022, is a national bill with bipartisan support that would give kids more protection on social media. Connect with your Senators about it.
Talk to your kids about challenges, online and offline. Just because a friend did the choking challenge and survived doesn’t mean it’s safe or smart to do. Whether they see it in a video posted on TikTok or hear friends laughing about it afterward, let your child know that these challenges are accidentally killing tweens and teens, and that no one can know in advance who will survive and who will die.
Follow our Five Circles of Cyber Safety
Install a monitoring app like Bark
The National Suicide Prevention Lifeline is 800-273-8255, and its online chat is also available for support. You can also text HOME to 741741 to connect with a Crisis Text Line counselor.