Just do it: It’s not that hard to make social media safer for kids.
At a recent event with teachers and doctors from across the country, a pediatric psychiatrist told me that kids have started showing up for kindergarten unable to throw a ball or hold a pencil. Their hands lack those skills in part because they spend so much time in front of screens. It seems kids are losing the ability to participate in childhood.
Last year, I disclosed to the federal government more than 20,000 pages of internal documents from my former employer, Facebook (now Meta). Probably the most shocking disclosure was the extent to which Facebook knew its products, Instagram in particular, were harming our children and chose to do nothing about it.
The products children spend so much time with from the youngest of ages are not safe — by design. And it is at the product-design level, rather than tacked-on screen-time features, that products for our children can be made meaningfully safer.
Instagram’s own studies show that the platform worsens body image for 1 in 3 teen girls. More than 13% of teen girls say the app contributes to their suicidal or self-harm thoughts.
In response to these horrific revelations, Facebook sent Instagram CEO Adam Mosseri to defend the platform. “We know that more people die than would otherwise because of car accidents, but by and large, cars create way more value in the world than they destroy … And I think social media is similar,” he said on a podcast with Recode.
I would remind Mosseri that cars have seat belts. They have airbags. There are speed limits, and lower speeds are required in school zones. We must have infant car seats before we can take our babies home from the hospital. Those measures are in place because when we realized just how many lives could be saved with such simple yet effective changes, we acted.
Did the auto industry fight against them? Absolutely. And we are seeing the same fight today from Big Tech — with millions of dollars spent on lobbying and misleading advertising.
It is ironic that this resistance to change and innovation is coming from progenitors of innovation. An industry that has moved fast and broken things should be obligated to fix what’s broken.
There are known technological fixes that would improve safety on the platform, particularly for the most vulnerable, such as children. But company executives are unwilling to implement these solutions because doing so would shave slivers from their billion-dollar profits.
Instead of thinking creatively and designing with safety in mind from the start, Big Tech has relied on censoring our speech and entrapping our children with predatory tactics to ensure they are scrolling for as long as possible. Worse, it has placed the blame on parents when their kids are addicted and depressed as a result.
I am a technologist, but also a pragmatist. I have worked at Facebook, Google, Pinterest and other tech companies, and the truth is that we will never regulate at the pace at which they innovate. Instead of trying to regulate the latest algorithmic innovation or clamp down on free speech, we should require safety standards in the design of the products.
The California Age-Appropriate Design Code Act moving through the Legislature is a step toward creating “seat belts” for our kids on social media. The legislation turns off features such as location tracking and prohibits the sale of kids’ personal data.
Similar legislation is now law in the United Kingdom, so we know Big Tech is already familiar with such standards.
These measures aren’t about banning social media for our kids. They are about creating social media that promotes the best in humanity and allows our kids to be safe, to connect with one another and learn together. That kind of social media is possible, but we have to design it that way from the start.
California is the birthplace of many of these technologies, and it is appropriate that California take the lead in designing systems that honor our freedom of speech, respect our children’s privacy and enshrine their rights to be safe online.
Frances Haugen is a former Facebook product manager and an advocate for accountability and transparency in social media. She wrote this piece for CalMatters.