It is deeply unfortunate that the vast wave of information, education, interaction, commerce, and creativity borne by the internet comes with costs like greater access to child sexual abuse material (CSAM). It is customary in the policy world to decry CSAM in overwrought terms, but it is important to be hardheaded about the implications of policies meant to counter it. So let the record reflect my abhorrence of child abuse of all kinds as I write to reject one genre of response to CSAM: making information technology platforms liable for it.
The idea is not new. In 2005, I argued against making internet service providers (ISPs) liable for the distribution of computer viruses and worms. The logic of that proposal was simple: ISPs are well positioned to stop internet pathogens; if they were liable for failing to do so, that would clean up the internet.
“Holding ISPs liable sounds like a way to reduce Internet wrongs,” I wrote. “But is it right?”
Our traditional notions of justice hold that people are responsible for their own acts. They are not responsible for others’ acts, given free will and competence all around.
A lawsuit recently filed against Apple attacks these basic moral premises, seeking to make the company liable for the wrongs of people using its products. Because Apple developed ways to combat the distribution of CSAM, says the suit, it now must be liable to child abuse victims for declining to deploy such technology.
Apple announced its NeuralHash technology in 2021 as a way to detect and report CSAM on customers’ devices. It never deployed the system, citing privacy and security concerns (well placed, in my opinion), and perhaps because the technology wasn’t as good as advertised. My critique at the time was that Apple might make itself into a government actor for constitutional purposes if it reviewed our content and communications on behalf of law enforcement as a matter of course.
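For readers unfamiliar with how such scanning works: systems in NeuralHash’s family reduce each image to a compact “perceptual hash” that changes little when the image is resized or recompressed, then compare that hash against a database of hashes of known abuse imagery. The toy sketch below illustrates the general technique with a simple average hash; the hash function, blocklist, and match threshold are hypothetical stand-ins for illustration, not Apple’s actual algorithm.

```python
# Purely illustrative sketch of hash-based image matching, the general
# technique behind systems like NeuralHash. The hash function, blocklist,
# and threshold here are hypothetical; Apple's system used a learned hash.

def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set when that pixel is
    brighter than the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    h = 0
    for p in flat:
        h = (h << 1) | (1 if p > mean else 0)
    return h

def hamming(a, b):
    """Count of bits that differ; a small distance means near-duplicates."""
    return bin(a ^ b).count("1")

# Hypothetical blocklist: hashes of known prohibited images
# (here, a single toy 2x2 grayscale "image").
blocklist = {average_hash([[10, 200], [220, 15]])}

# A slightly altered copy still matches, because perceptual hashes are
# designed to tolerate small changes that would defeat an exact checksum.
candidate = [[12, 198], [225, 14]]
is_match = any(hamming(average_hash(candidate), h) <= 1 for h in blocklist)
print(is_match)  # True; a deployed system would escalate this to review
```

The design’s selling point is that the provider never looks at the photo itself, only at whether its hash lands near a known-bad hash. The privacy objection is that the same pipeline can match against whatever hash list the operator, or a government, supplies.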
To win, the plaintiffs will have to blow conceptual walls off the federal anti–child porn statutes. The language criminalizing sending and receiving CSAM, they must argue, also applies to being the means of sending and receiving CSAM. The plaintiffs will have to redefine knowing participation in such acts as “having the means to know and not using them,” which is quite a bit different (and not the same as “constructive knowledge”).
There are already too many examples in which liability makes people responsible for others’ acts: dram shop laws, the doctrine of negligent entrustment, parts of copyright enforcement, some landlord-tenant laws, and the foreseeable misuse doctrine in product liability law. Extending liability to internet platforms would be orders of magnitude worse.
Just think what happens if any actor having the means to know what happens online is required to monitor for wrongdoing. Your cell phone provider can transcribe calls and use AI to search for discussions of criminality. Your email service provider can do the same. If your web searches are consistent with criminal planning, liability rules may require your search engine to report you. Our privacy does not fare well under these conditions.
Professor Eugene Volokh wrote a decade ago in “Tort Law vs. Privacy” about how liability and advancing technology intersect to threaten privacy. But his thesis doesn’t concern me much. If advancing technology or changed business practices require people and businesses to provide more protection for those to whom they owe a duty, that is a natural migration of mores and practices, even if there is some loss of privacy in the process.
Here, we are talking about exploding the legal concept of duty so that someone providing an online service, some neutral communications tool, is responsible for all the behaviors of those who use it.
Australia is in the process of adopting regulations that explode duty. The EU Cyber Resilience Act is imposing distended obligations on open-source developers that seem tailored to suppressing open-source activity on the Continent. (One wonders whether Europe, especially, is playing 4D chess with a plan to suddenly flourish through tech-sector repression.)
Much of my work is to oppose “light-switch totalitarianism.” We should not build technologies, such as a national ID, that can be easily repurposed by authoritarians. Power on the internet must remain at the edges. In the name of future freedom, we must resist the forces that centralize power, including liability laid on platforms, because such liability promotes mass surveillance and control.
We often speak of CSAM in overwrought terms, and it is something all decent people reject and detest. But this trend away from natural justice toward liability without limits carries the seeds of different, broader, and more thoroughgoing losses to society. This is a terrible, awful, disgusting lawsuit against Apple.