High Court’s Murthy Ruling Accepts Social Media Companies’ Autonomy

The US Supreme Court’s ruling in Murthy v. Missouri shouldn’t have surprised anyone who understands social media. The plaintiffs couldn’t find evidence of government-caused censorship because companies moderate content in their own interests, not solely because of government “jawboning.” It just so happens that the companies’ interests here are aligned with the government’s.

In Murthy, the court ruled that the plaintiffs lacked standing to enjoin executive branch officials who allegedly bullied social media platform companies into “censoring” their speech about topics such as Covid-19 and elections.

The Supreme Court emphasized the absence of evidence that companies such as Meta Platforms Inc. censored plaintiffs’ content due to pressure from the Biden administration. Likewise, the court saw no evidence that companies would behave any differently with respect to plaintiffs’ posts if the government were enjoined from asking them to control misinformation.

Meta doesn’t want to be Truth Social. That’s not because Mark Zuckerberg is a progressive activist—it’s because dangerous misinformation is at least partly against his economic interests.

The economics of big social media companies require them to be palatable to mass consumers and mainstream advertisers. Major companies have repeatedly experienced painful financial consequences from letting offensive content run unchecked. X Corp. lost over half of its advertising revenue in the year after Elon Musk took over and began announcing changes to its content moderation policy.

In 2017 and 2019, companies such as AT&T Inc., Walt Disney Co., Nestle SA, Coca-Cola Co., Walmart Inc., and Starbucks Corp. pulled their advertisements from YouTube because of the brand safety risk posed by objectionable material. Unsurprisingly, Disney didn’t want its commercials algorithmically inserted into videos of child abuse.

The content at issue in Murthy is likewise poisonous to mainstream users and advertisers. Eli Lilly & Co. isn’t going to want its pharmaceutical ads next to a post urging followers to take a horse dewormer to cure Covid-19. Zuckerberg doesn’t want users fleeing his platforms because they see them as forums where people organized the Capitol riot on Jan. 6, 2021.

As the Supreme Court repeatedly observed, Facebook acted on such content before the Biden administration started calling, and in some cases during the Trump administration, when any White House pressure would have come from the other direction.

There’s some evidence that government pressure can impact content moderation decisions. In 2021, a whistleblower claimed that Facebook soft-pedaled the enforcement of its policies against right-wing misinformation during the Trump administration because it feared retaliation.

But the very same report also alleged facts illustrating how weak the impact of that pressure can be. One company employee allegedly said in response to 2017 worries about misinformation that it “will be a flash in the pan.”

That’s an accurate description of what the government has actually done. Politicians from both parties have been threatening various sanctions, from antitrust action to the repeal of Section 230, against social media companies for almost a decade. Nothing has happened on the federal level so far.

This could be because companies have succumbed to political pressure. However, such pressure comes from both sides: Democrats want more supposed misinformation removed, and Republicans want less content moderation. Anything that mollifies one party will enrage the other. Companies know that knuckling under to one side (even if they happen to occupy the White House at the moment) wouldn’t take the pressure off.

Chinmayi Arun wrote a fantastic essay two years ago in the Harvard Law Review Forum explaining how conflicting interests get entangled in big companies’ content moderation. In a nutshell, different teams within a company balance multiple revenue interests in often contradictory ways. Government pressure is an input to their decisions, but it doesn’t control them.

The Supreme Court in Murthy rightly recognized that First Amendment injunctions are wholly unsuitable as a way to tinker with that dynamic.

The case is Murthy v. Missouri, U.S., No. 23-411, decided 6/26/24.

This article does not necessarily reflect the opinion of Bloomberg Industry Group, Inc., the publisher of Bloomberg Law and Bloomberg Tax, or its owners.

Author Information

Paul A. Gowder is professor of law and associate dean of research and intellectual life at Northwestern Pritzker School of Law.
