Facebook is imploding. What you need to know about teen safety.
It has been quite a week for Facebook.
First, it had to postpone Instagram for Kids, its plan for a "safe" platform for youth, in the face of extreme public outrage. (We are celebrating this postponement--9-year-olds don't need to be on a social media platform--they need to be playing outside!) If you haven't had a chance to sign the Fairplay petition against Instagram for Kids, here it is. Please add your voice to those of over 200,000 parents!
Then there was a Senate hearing about protecting kids online, titled “Protecting Kids Online: Facebook, Instagram, and Mental Health Harms”.
Then the Wall Street Journal released the Facebook Files, asserting that Facebook's and Instagram's platforms are riddled with flaws that cause harm, often in ways only the company fully understands.
We learned of an anonymous whistleblower inside Facebook who released tens of thousands of internal documents proving what we've long suspected--Facebook has amplified hate speech, political unrest, and misinformation around the world, and Instagram (owned by Facebook) has harmed teens' mental health. (See this Fairplay summary focused on youth.)
Just a few facts from the internal studies:
1 in 5 teens say that Instagram made them feel worse about themselves.
Harm on Instagram generally falls into 3 categories: social comparison, social pressure, and negative interactions with others.
6% of US teens said that their desire to kill themselves started on Instagram.
9% of US teens said that their desire to hurt themselves started on Instagram.
Then the whistleblower came forward during 60 Minutes (it's a must-watch segment), and we learned her name is Frances Haugen and that she worked in Facebook's Civic Integrity department.
Strangely, Facebook, Instagram and WhatsApp (all owned by Facebook) were offline for 6 hours the day after the 60 Minutes whistleblower piece.
And on Tuesday, there was another Senate hearing with Ms. Haugen, titled “Protecting Kids Online: Testimony from a Facebook Whistleblower”.
Her message:
Facebook and Instagram are choosing the information we see, out of millions of possible options, based on an algorithm that prioritizes user engagement (how long we stay inside the app) and company profit over truth, mental health, safety, and user experience.
The algorithm has learned that we stay on Facebook longer when we are outraged or angry, so that's what it delivers.
Facebook wants us to believe this problem is unsolvable, that we have to choose between public oversight and personal privacy, but we don't. We can make Facebook safer and kinder while respecting free speech. But it's not going to happen from inside Facebook.
It's time to regulate social media companies and insist on transparency in how they set up and deliver content using algorithms.
A standout statement from Ms. Haugen:
“When we realized tobacco companies were hiding the harms they caused, the government took action. When we figured out cars were safer with seatbelts, the government took action,” Ms. Haugen said in her written testimony. “I implore you to do the same here.”
THIS is our big tobacco moment. This is the moment where I see us headed toward consensus that social media platforms should have safety standards for youth (and adults) and that is exciting.
Stay tuned!