Analysis

Supreme Court Case Could Be Disastrous for Detecting Election Misinformation

The Court will hear a case that’s part of a legal and political effort to silence those working to counter election falsehoods.

Sen. Mark Warner (D-VA), chair of the Senate Select Committee on Intelligence, made an extraordinary statement at a Tuesday hearing on election administration: for roughly eight months, from July 2023 “until about two weeks ago,” there had been zero communication between social media companies and the federal government about foreign election interference. At the start of a critical election year, that unprecedented breakdown in communication should, as Warner has noted, “scare the hell out of all of us.”

The United States needs to be on alert and ready to protect the 2024 election against foreign interference and domestic election falsehoods, both of which can confuse voters or contribute to threats and harassment of election officials. That means federal agencies and others with expert knowledge should be sharing relevant information with social media companies, which can draw on it as they apply and update their policies for handling falsehoods. But fallout from a little-noticed lawsuit now before the Supreme Court has stood in the way, and the wrong decision from the Court this term could make things even worse.

Federal agencies opened regular communication with social media companies in the wake of Russia’s election interference in 2016. Among other things, they passed along messages from state officials identifying inaccurate information about elections circulating on the platforms, including false information about how, when, and where to vote. The government and the companies also shared information they collected about foreign interference operations playing out on the platforms. Continued contact along these lines seems like common sense in the current information climate. Not only are U.S. intelligence agencies warning that Russia and other foreign adversaries plan to try to intervene this year, but domestic actors seeking to undermine confidence in American elections appear increasingly robust and coordinated. And their efforts may be more widespread and harder to detect thanks to generative AI tools. As Warner put it, Russia’s efforts in 2016 might look like “child’s play” compared to 2024.

But that important connection between social media companies and federal agencies with specialized knowledge about elections, foreign interference, and disinformation has been strained, in significant part due to Murthy v. Missouri. The Supreme Court will hear oral arguments in the case on March 18.

The case began in 2022, when the attorneys general of Missouri and Louisiana, joined by several private plaintiffs, brought a First Amendment suit against a host of Biden administration agencies and officials. The plaintiffs alleged that the government censored Americans’ speech by informing social media companies about falsehoods on their platforms and coercing the companies into correcting those falsehoods, including lies about voting processes, elections, and the workers who run them.

The case is part of a larger legal and political effort to silence those working to detect or counter election rumors and falsehoods. If allowed to spread, disinformation about elections can undermine democracy in myriad ways, from confusing voters about when, where, or how to vote to inciting harassment and violence against voters, election workers, and election officials. The wrong ruling in Murthy could deal a crippling blow to the government’s ability to identify election disinformation campaigns for social media companies, or even to provide them with accurate election information. The case has already done real damage that could leave the United States less prepared to respond to falsehoods in an election year when the information environment for voters may be worse than ever.

The plaintiffs argue that by flagging false election information on the platforms or answering questions about whether content was accurate, the federal government “coerced” or “significantly encouraged” social media companies to put warning labels on posts, deprioritize them in users’ feeds, or make posts with accurate information more prominent. The plaintiffs claim that these content moderation steps taken by the platforms amounted to “state action” attributable to the government and a violation of the First Amendment.

None of the communications about voting, election processes, or election security cited by the lower courts was coercive in a way that should have converted the platforms into state actors under the law. The communications were almost entirely informative and did not attempt to dictate how the companies should handle users’ posts. Nevertheless, as the suit has proceeded through the courts, its twists and turns have created confusion and cast a shadow over the relationship between the government and the social media companies.

The initial July 2023 injunction halted communications between the government and social media companies about content on the platforms, carving out a difficult-to-apply exception for communications related to foreign disinformation campaigns. That September, an appeals court narrowed the injunction so that some federal agencies were no longer covered by its restrictions, only to add several of those agencies back the following month.

All of these orders were put on hold when the Court agreed to take up the case, but the impact of the legal whiplash — and that of the broader campaign to silence those who would name or push back on election falsehoods — lives on. The federal government remains extremely cautious about interaction with social media companies, at least until the case is resolved.

It’s not just the federal government that has been affected. The case also seems to have helped shut down communications between social media companies and election officials at the state level. According to an amicus brief filed on behalf of several secretaries of state in Murthy, both Meta and X appear to have no plans to “facilitate direct communications between state officials and the platforms,” as the companies did in 2020.

That’s a problem for voters. Election officials are generally the most authoritative source a social media company could go to when seeking to correct the record about an election process, or to clarify information about voting. The decision to stop communicating, the secretaries of state said, “will increase the risk that dangerous, and even illegal, falsehoods about elections and voting will spread unchecked.”

Those seeking to undermine the fight against election disinformation haven’t limited themselves to the courts. In Congress, they are using the House Select Subcommittee on the Weaponization of the Federal Government.

Rep. Jim Jordan (R-OH) has used his powers as chair of that subcommittee to target not only federal agencies but also election officials, researchers, and non-governmental organizations working to identify and combat disinformation. And the subcommittee’s work has spurred additional litigation. The New York Times reported last spring that an advocacy group led by former Trump advisor Stephen Miller filed a class-action lawsuit against several researchers, including at Stanford, the University of Washington, and the Atlantic Council, echoing many of the committee’s most spurious claims. Several researchers have noted the chilling effect these onslaughts have had on their work, making it less likely that we will understand foreign and domestic efforts to spread lies about our elections on social media this year.

A good ruling in Murthy will not undo all the damage discussed in this piece. But it should make clear to the Biden administration that the government, social media companies, and others remain free to take the basic steps they have taken in the past to make American elections and democracy more secure. The Court should make clear that they may communicate to share accurate information about elections, identify falsehoods about election processes, and respond to suspected foreign interference operations.

Norden and Ramachandran served as counsel for a bipartisan group of election officials on their friend-of-the-court brief in support of neither party in Murthy v. Missouri.