“Attention youth: We interrupt your social media experience to bring you this government-compelled, state-sanctioned message to help you ‘understand the impact of social media on the developing brain and the mental and physical health of youth users.’ This message will be repeated.”
That sounds like a far-fetched way to educate minors about alleged and vehemently contested problems with using social media, but something similar might soon become reality in Colorado unless a judge issues an injunction in NetChoice v. Weiser. Compelling private entities to host disputed messages they disagree with––NetChoice calls Colorado’s statements “state-compelled opinions”––raises profound First Amendment problems, but that’s what the statute in Weiser seemingly does.
The US Supreme Court has “held time and again that freedom of speech ‘includes both the right to speak freely and the right to refrain from speaking at all.’” It recently added that the government generally “may not compel a person to speak its own preferred messages.” Given this principle against government-compelled expression, the better––and constitutional––method for addressing Colorado’s concerns about minors’ social media usage is for the state to run its own educational and advertising campaigns––ones that don’t co-opt platforms to do the hard work for it.
Colorado Governor Jared Polis signed House Bill 24-1136 into law in June 2024. Part of the bill––codified as Colorado Revised Statute Section 6-1-1601––forces social media platforms to adopt one of two compelled-speech “functions.” NetChoice argues that both “unconstitutionally compel speech” and are intended to “discourage minors from using [platforms’] services.”
Under one option, platforms must adopt a function—one comporting with government-imposed standards and using government-gathered evidence—that provides minors who use the platforms with information to help them “understand the impact of social media on the developing brain and the mental and physical health of youth users.” Colorado will establish these notifications’ frequency later.
NetChoice emphasizes that
“the impact of social media” on young users is the subject of ongoing and vigorous debate, in addition to active litigation. Academics, policymakers, thought leaders, and parents all disagree on various issues regarding social media—from what websites qualify as “social media” to the precise balance of benefits and purported drawbacks they offer users.
The second option requires minors to receive a “pop-up or full screen notification” when they spend “one cumulative hour on [a] social media platform during a twenty-four-hour period” or use a “platform between the hours of 10 p.m. and 6 a.m.” This notice must be “repeat[ed] at least every thirty minutes.” Exactly what these too-much, wrong-time usage warnings must say is unclear; NetChoice argues one possibility is that platforms “must provide the same information required by the first option.”
The law’s sponsors contend its purpose is purely educational. Representative Rose Pugliese claims the statute “provides young people and their parents with the knowledge and support they need to make informed decisions about safe social media usage.” Another sponsor, Senator Judy Amabile, adds that “it’s time we help teens make informed choices by providing them with the evidence-based information as well as the support and guidance they deserve to use social media safely.”
Education and information are wonderful things when factually accurate. States may freely conduct educational campaigns—ones without platforms’ conscripted aid—to provide parents and minors with facts to make informed decisions about social media usage.
Federal District Judge Mark Walker recently highlighted this option when issuing a preliminary injunction blocking parts of a Florida statute that bars anyone under age 14 from holding an account on platforms including Facebook, Instagram, YouTube, and Snapchat. Walker wrote that “the appropriate response from the State is a public education campaign, either to inform parents about the risks of social media or to equip them with the knowledge they need to employ the [parental control] tools they have available,” including those at the device, operating-system, and platform levels. He added that “if parents . . . find the available tools confusing or difficult to use, the appropriate role for the state is to provide educational tools and resources for parents to learn how to do so.”
What states cannot do, however, is force platforms to provide government-endorsed messages on contested issues that conflict with the platforms’ viewpoints. In 2018, the Supreme Court ruled that California violated the First Amendment speech rights of pro-life, state-licensed crisis pregnancy centers by forcing them to notify low-income women that California provides free and low-cost abortion services. In striking down the law, the Court reasoned that “California could inform low-income women about its services ‘without burdening a speaker with unwanted speech.’ . . . Most obviously, it could inform the women itself with a public-information campaign.” The Court added that “California cannot co-opt the licensed facilities to deliver its message for it.” Nor can Colorado now co-opt social media platforms to deliver its message.