WEIRD Reactions to Privacy Regulation

AEIdeas

March 12, 2025

In my last blog, I discussed the effects of WEIRD (Western, educated, industrialized, rich, democratic) psychology on attitudes toward Artificial Intelligence (AI). WEIRD societies have been shown to demonstrate very different approaches to trust than non-WEIRD societies. WEIRD societies have substituted trust in friends, families, and communities with trust in institutions that have enabled widespread trading with strangers.

Ipsos surveys and subsequent research show that WEIRD societies are more fearful of AI and less optimistic about its prospects than their non-WEIRD counterparts, despite having some of the most sophisticated industrialization and institutions, highly educated populations, and substantially higher wealth. Importantly, the differences did not arise from a lack of AI awareness in non-WEIRD societies, as the non-WEIRD samples came from populations most likely to have encountered and be using AI.

Subsequent research confirmed that the most significant factor explaining the lower levels of trust, confidence, and understanding in WEIRD countries was the sophistication of their democratic institutions. This is consistent with the hypothesis that, in WEIRD countries, reliance on institutions as a substitute for interpersonal observation and trust extends beyond classical commercial instruments, such as warranties, guarantees, and corporate governance rules, to include legislative provisions and regulatory institutions. On the one hand, these institutions provide assurance that strangers transacting with each other must abide by a set of government-enforced rules. On the other hand, this raises the question of where, in democratic environments, the motivation to create those rules originates.

In democratic societies, calls for legislation arguably come from the governed when it becomes clear that commercial and interpersonal social arrangements are insufficient to provide the confidence necessary to engage and exchange. Yet if WEIRD psychology has engendered a shift in trust from social and commercial to legislative instruments, then it would seem that, in a feedback effect, regulations would come to be demanded before new technologies are adopted. The lower levels of trust in AI in WEIRD countries could stem from the fact that, at the time the survey was conducted, no regulations governing AI had been enacted in any of the developed economies. Though none had been enacted in non-WEIRD countries either, their citizens would not have been looking for such assurances before engaging with AI.

Are the calls for AI regulation—which are most prominent in the countries where AI has been developed and most extensively used—a feature of democracy itself, rather than a rational response to evidence of harm? It is telling that in 2023, the year following the exponential takeoff of large language models such as ChatGPT, the number of incidents recorded in Stanford’s HAI AI Incident Database increased by only a third over the preceding year. This is hardly compelling evidence of significant harms requiring legal constraint. Is regulation being demanded in more democratic countries primarily to assuage citizens’ fears of an unknown technology? Such calls for regulation are not prominent in non-WEIRD countries where technological optimism abounds.

Conveniently, the Ipsos data included one question that enabled a deeper examination of trust and legislation in both WEIRD and non-WEIRD countries. Respondents were asked to express their confidence that AI firms would protect their personal data. While democracy was negatively correlated with trust in AI firms to protect personal data, the coefficient's effect in a regression equation was small, especially compared with other questions in the survey. This could be because, in Europe at least, privacy protections enforced under the 2018 General Data Protection Regulation (GDPR) may have been influencing responses. Subsequent analysis controlling for European Union membership found that membership was positively, not negatively, correlated with confidence in privacy. EU membership increased a country's trust in the protection of personal data by around 6 percent, whereas a 1 percent increase in a country's Economist Intelligence Unit Democracy Index score decreased it by 3 percent. Notably, EU members in the sample included the non-WEIRD countries Hungary, Poland, and Romania in addition to the WEIRD members.
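The structure of such a regression can be sketched in a few lines. The data below are synthetic and the variable names hypothetical; this is not the Ipsos dataset or the authors' model, only an illustration of how an EU-membership dummy and a democracy score would enter a country-level regression on trust, with effect directions mirroring those described above.

```python
import numpy as np

# Synthetic country-level data (invented for illustration only).
rng = np.random.default_rng(0)
n = 40                                  # hypothetical number of countries
eu = rng.integers(0, 2, n)              # 1 if EU member, else 0
democracy = rng.uniform(3, 10, n)       # EIU-style Democracy Index score
# Simulated trust outcome: EU membership raises it, democracy lowers it.
trust = 50 + 6 * eu - 3 * democracy + rng.normal(0, 2, n)

# Ordinary least squares: design matrix with an intercept column.
X = np.column_stack([np.ones(n), eu, democracy])
beta, *_ = np.linalg.lstsq(X, trust, rcond=None)
print(dict(zip(["intercept", "eu_member", "democracy"], beta.round(2))))
```

On data like this, the fitted coefficient on the EU dummy comes out positive and the coefficient on the democracy score negative, the same pattern of opposing signs the survey analysis reports.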

Given the ongoing debate about the effectiveness of the GDPR, the analysis of the Ipsos data appears to confirm that the mere presence of regulation alters perceptions of trust in both WEIRD and non-WEIRD jurisdictions. It may be tempting for WEIRD legislators to regulate new technologies to allay public fears of uncertainty, even before the technology's effects are known. However, this would put the regulatory cart before the technological horse. Regulation should address real harms, not merely assuage fears.

Learn more: WEIRD Attitudes Toward Artificial Intelligence—And Its Regulation? | Connecting the Dots on the Chips | Practical Steps Towards Data and Software Resilience | Resilience: The New Challenge for Digital Systems Policy