
Examining Manipulated Media and Platform Accountability

AEIdeas

January 9, 2024

On October 24, 2023, AEI hosted a panel to discuss a case facing Meta’s Oversight Board, which concerns an altered video posted by a Facebook user of President Joe Biden. The video raises questions about social media platforms’ responsibilities and their ability to influence public perceptions of political figures. The panel included AEI’s Clay Calvert and Shane Tews, Meta Oversight Board’s John Samples and Pamela San Martín, and Anchor Change’s Katie Harbath.

Below is an edited and abridged transcript of key highlights from that panel. You can watch the full event on AEI.org and read the full transcript here.

Shane Tews: John, can you explain the case the Oversight Board is currently facing?

John Samples: A video was posted to Facebook during in-person voting for the 2022 midterms in which Joe Biden places an “I Voted” sticker on his adult granddaughter, and kisses her on the cheek. The Facebook post features an altered version of that video, with the footage looping, so that it repeats the moment when the President’s hand brushes against his granddaughter’s chest. The post’s accompanying caption states that “Biden is a sick pedophile” and questions the mental wellness of people who voted for him.

A user reported the content to Meta, seeking its removal, but the company did not take the post down. The reporting user appealed, and a human reviewer at Meta upheld the decision to leave the content up. Meta's reasoning was that the post did not violate its rules. The manipulated media community standard applies only to videos generated by artificial intelligence (this one wasn't) or to videos that depict a subject saying words they did not say (this one didn't). Meta also determined that the post did not violate its broader misinformation, hate speech, or harassment rules.

I think the reason the Oversight Board took this appeal against Meta is that it touches on a much broader issue: how manipulated media might affect elections here in the United States, but also throughout the world.

In the United States, we’re used to courts addressing issues like this under the First Amendment. But the First Amendment does not cover the roughly three billion people who are on Meta’s platforms every day, so Meta and the Oversight Board have looked to international human rights norms to play a similar role. Chief among these is the International Covenant on Civil and Political Rights, which has been ratified by many countries, including the United States in the 1990s. That body of international human rights law has a First Amendment analogue, Article 19, whose language is protective of free speech. But Article 19 also allows for limits on speech to protect other rights.

Shane Tews: Katie, you were at Meta at the beginning of many of these discussions about platform accountability. Walk us through some of the background on what actually created the Oversight Board.

Katie Harbath: A lot of this is rooted in the period immediately after the November 2016 election. Coming out of that election, there were questions about Macedonian teenagers spreading fake news to make money, and about how President Trump had won his election and used Facebook in doing so. I go back to that period because it was in December 2016 that Facebook started thinking about how to deal with content that doesn’t violate community standards but is problematic in other ways. That’s when Facebook launched its first fact-checking program, started putting labels on content, and began partnering with the International Fact-Checking Network.

Fast forward to May of 2019, when a video of then-Speaker Nancy Pelosi was posted on Facebook, slowed down to make her sound like she was slurring her words or might be drunk. This caused a huge outcry from Speaker Pelosi’s office and others who thought the video should be taken down. Facebook chose to leave it up and let fact-checkers review it. I believe they labeled it partially false, which caused it to be demoted so that fewer people on the platform saw it, and it carried a label at the bottom.

As we went through situations like this, we were sitting inside the company thinking, “We should not be the only ones deciding this.” We would go to many experts and they’d have all sorts of different opinions. One purpose of the Oversight Board is to create a body of people who can serve as a check on these really tough policy decisions.

Shane Tews: There has been a lot of engagement on this by the courts recently. We’ve also seen some engagement by government, questioning whether or not taking down or keeping up content is the right thing to do.

Clay Calvert: First of all, it’s important to remember that if Meta takes down content, it is not a First Amendment issue, because the First Amendment only protects us from government censorship, not censorship by private entities or private individuals. Instead, a combination of international principles, some informed by First Amendment principles, governs this case.

More broadly, the Oversight Board’s case relates to other cases we are seeing pop up. The Supreme Court will soon hear a case called Murthy v. Missouri, which deals with the concept of jawboning: how far the government can go in verbally arm-twisting platforms such as Meta, Google, and X to take down content that may be misinformation. That’s going to be a very important case. Generally, conservatives tend to feel that their views are taken down unfairly, while liberals see this as a battle against misinformation.

Shane Tews: Pamela, I realize that a lot of the cases that the Oversight Board has taken are not US-based. It was also brought up that we have 65 elections going on in 54 countries in 2024. That gives you many areas of potential mischief to be watching. Give us a sense of how this is going, and things that we need to understand from a human rights perspective.

Pamela San Martín: When the Oversight Board takes any case, there are a few things it has to acknowledge, one being that social media and traditional media and traditional ways of communicating are not the same. The capacity, the spread, the reach, and the virality that can be achieved through social media are things the board has to consider. The board also has to consider the fact that the same policies will apply worldwide, so there have to be nuances for the problems and difficulties the platform can face in applying the same policies everywhere.

Billions of posts spread through Meta’s platforms daily, and enforcing these policies at that scale poses challenges that force the platform to make tough decisions about which side to err on: whether to err on the side of protecting its paramount value of voice, or, in some cases, on the side of safety, privacy, or dignity.

This conversation is so relevant internationally because the board has taken many cases that address election-related issues in specific countries. We have the Trump case, which was US-based, along with the Cambodia case and the Brazil case, which are the ones that most clearly address election-related issues or that can have very important impacts on elections.

This is the first time we’ve taken a case on manipulated media. We are going to have to study how this works in an electoral framework, in a global setting. That is, how it can affect different countries, and how it can affect the political and electoral debate. When we talk about manipulated media in elections, it’s because when you have manipulated media, manipulated information, and disinformation with the intent to deceive or mislead the people, you’re affecting the public debate that people use to decide on who will represent them or who will govern them in their own countries.

See also: Of Meta and Minors, Filters and Filings: An Uncertain Path Forward | A Blaring Wake-Up Call: Social Media Companies Should Prepare Better for Harm-to-Minors Lawsuits | AI-Generated Content, Fake News, and Credible Signals | The Supreme Court Must Protect Businesses from Bullying Governmental Efforts to Censor Citizens’ Dissenting Viewpoints