Stonewalling, threats and lawsuits: that’s what social media researchers face

Ideas | 54:00 | Platforms, Power and Democracy: Understanding the Influence of Social Media

This week, executives from TikTok, Snap, Discord, X (Twitter) and Meta (Facebook) testified at a congressional hearing in Washington, D.C. Mark Zuckerberg of Meta apologized to parents in attendance who believe their children died as a result of using social media.

The hearing followed a May 2023 advisory from U.S. Surgeon General Vivek Murthy, who warned of his "growing concerns about the effects of social media on youth mental health." His warning confirmed for many what they had already intuited: that social media has been bad for society.

And now there is a growing body of research suggesting that platforms like Facebook, Instagram and TikTok are also encouraging social and political polarization, breeding radicalization, and eroding democratic institutions. 

But despite the startling headlines, none of these critiques is settled science.

For every paper that suggests social media has made us more polarized, there’s another that indicates it hasn’t had any effect at all. Given what’s at stake, finding answers to these questions is paramount. Yet conducting research on social media companies has become exceedingly difficult. 

In November 2023, dozens of social media researchers gathered in Montreal for the inaugural Attention conference to discuss how they can continue to study Big Tech in the face of stifling corporate and political resistance.

One of those researchers was Joan Donovan. Prior to her current post at Boston University, Donovan led a project at Harvard examining the effect of Facebook on American politics and civil discourse. But after a $500 million donation was made to the university from a foundation run by Mark Zuckerberg and his wife, Priscilla Chan, Donovan’s research project was abruptly terminated. She now alleges that Harvard tried to “destroy” her career, because she was “making trouble for the donors.”

Renée DiResta, research manager at the Stanford Internet Observatory, has also faced backlash for her work on social media. After collaborating with the Biden administration to track misinformation around COVID-19, conservative organizations accused DiResta of being a member of the "censorship industrial complex." She is currently under investigation by House Republicans for her scholarship, and has become the target of a lawsuit from a prominent conservative legal group helmed by former Trump staffer Stephen Miller.

DiResta and Donovan were joined by Michael Wagner, a professor in the University of Wisconsin-Madison's School of Journalism and Mass Communication, and another scholar under scrutiny by House Republicans for his work on disinformation.

Here’s an excerpt from their conversation, moderated by IDEAS host Nahlah Ayed.

I think most people understand that there is a problem with social platforms. What’s the most urgent problem with the platforms that requires attention? 

Joan Donovan: Over the last decade of my research into Internet technologies, we’ve gone from seeing social movements being the first to adapt to social media, to now states and governments and, frankly, military operations being carried out on those same platforms. The word platform is very intentional here because they didn’t want you to think about social media as a product. They didn’t want you to think about it like a gun or like tobacco or like pharmaceuticals. See what I’m getting at?

At the Senate hearing on Jan. 31, 2024, Republican Sen. Josh Hawley asked Meta CEO Mark Zuckerberg if he has personally compensated any of the victims and their families for what they have been through. (Jose Luis Magana/The Associated Press)

All of these other big industries that make tons of money are regulated in different ways to protect who? The consumer. To protect national security. So we’ve entered this moment where not just social media, but all of these digital technologies, have gotten away with it. And they’ve gotten away with it primarily because of some U.S. laws which suggest that platform companies are merely information conduits and they’re not businesses like other businesses.

And in some ways, we buy into the myth, we buy into the hype because we ourselves benefit from being able to stalk our exes. And so I’ll stop there. But it’s a problem of regulation, it’s a problem of lobbying, it’s a problem of the companies getting away with it, primarily because we as a society misunderstand the product itself as magic rather than understanding it like we understand other harmful products. 

Renée DiResta:  My succinct answer would be unaccountable private power. That’s the challenge. I think one of the key gaps right now, and the reason that it remains such an unaccountable private power, is that there’s very little visibility into a lot of the harmful dynamics on the platforms.

The area that I've been most captivated by lately has been around child exploitation content. We've looked at how you detect and disrupt networks devoted to that — but also you have the questions of how do platforms do that detection work themselves? Is there oversight that is ensuring that they do it? Who is looking on the outside to provide an oversight function when there has been very little actual government oversight, particularly within the United States? So the question, I think, is how we gain visibility, and then how we think about accountability.

Former Facebook employee and whistleblower Frances Haugen testified at a Senate Committee hearing about protecting kids online, Oct. 5, 2021. She left Facebook in May 2021 and provided internal company documents about Facebook to journalists and others, alleging that Facebook consistently chose profit over safety. (Jabin Botsford-Pool/Getty Images)

Michael Wagner: These are both really good answers, so I'm going to do something a little stranger. I'll talk about this in two ways, one of which relates to how we understand how humans behave on social media. We tend to think of it as people doing things on social media. But that is not how most people live their lives. They live their lives in an overall information ecology, where we talk to human beings face to face about things that matter to us. We watch news on television, we read news on paper. Yet one thing we tend to ask is: 'Is social media causing this problem? Is social media to blame for this or that?' The real question is how our use of social media interacts with all of the other things we get information from. And we spend too little time thinking about that. So the behaviour answer, I think, is that we misunderstand social media's power when we think about what it does.

The other problem is that one thing that researchers are supposed to be able to do in this space is to act as referees. What can we verifiably tell you is true or not about the way different platforms operate and how they influence what people believe to be true, what they want, how they behave, how they organize, all those kinds of things? And in the current environment, the things that both Joan and Renée were talking about have become deeply politicized. And people like us, people who are doing this kind of work, are now finding that they themselves, rather than their research, are the subject of attention. It makes it really hard for us to understand what's happening with respect to platforms, power and democracy, when the people who are supposed to be doing the verifiably accurate and deep research about it are being attacked at every turn.

 If the idea is — what can you verifiably tell us about what these platforms do to our lives — what is the effect of the politicization? What does that prevent you from being able to do? 

Renée DiResta: We were part of the Twitter Moderation Research Consortium, a research partnership that Twitter put together. Most of the work that I did in the context of the relationship with Twitter looked at state actor takedowns. We looked at influence operations from Egypt and Saudi Arabia targeting Iran, and Iran targeting them back. We looked at China and Taiwan, China and Hong Kong. And this work only happens when there is an open channel of communication. And when you create the perception that an open channel of communication is some sort of collusion or some sort of nefarious cabal to prevent people from speaking freely, you create an opportunity by which that [platform/researcher] relationship becomes chilled.

When state actors are undertaking these operations, it’s not because they are just doing something for fun, right? They’re aiming to either maintain or obtain political power. And so when we do work that disrupts that, a lot of the response is actually to try to discredit the researcher. I can’t tell you how many times Sputnik and RT have written about us over the years. There have been multiple occasions where people who have family in China or in India or in Turkey receive visits from governments. So there are very real costs associated with this. And it is in the interests of power to prevent those kinds of relationships from being transparent and effective. And that chilling effect has extended into the United States now as well. 

Joan Donovan: I've had a very particular experience with power and influence. When you're, in my case, a non-tenured academic, you lack what tenure provides: protection from outsiders encroaching on your work and trying to influence you in different ways. The kind of power that I've experienced is more so a soft power. It's not a good thing for a senator to call your boss about research that you're doing. No matter who your boss is and no matter who the senator is.

Nahlah Ayed: We're speaking hypothetically.

Joan Donovan: Hypothetically, say someone from a company joins the board of your school and maligns your reputation at that advisory meeting. Hypothetically. The point is that there are all of these soft power moments where people in power who you’re investigating — their benefits decrease the more truth you uncover. Our job is fundamentally about telling the truth — and finding the truth in places where people in power don’t want you to be looking. 

U.S. Surgeon General Vivek Murthy spoke to reporters at the White House, July 15, 2021, calling on social media companies to do more to combat false information about the coronavirus vaccine and other health care topics. (Chip Somodevilla/Getty Images)

So, for members of the public who keep hearing that social media platforms cause society harm — that they might be polarizing, that they might be affecting our political landscapes — how much confidence can we have that the right questions are being asked about how these platforms affect our society?

Michael Wagner: I think social media is lighter fluid in some ways. But it's also the case that humans have done awful things and have been deeply divided. There was a four-year-long civil war in the United States that wasn't started by Facebook. There are partisan entrenchments that lead to violence or other anti-democratic behaviours. And the platforms make some of these things easier. They make some of them easier to spot. They make some of them easier to coordinate. [But] let's not just study Facebook — let's look at how people actually use these things and how the affordances of one might lead to what happens on another. That's one path forward to help us try to understand how to think about solving some of these larger problems.

Joan Donovan: There is something going on with the children, and the kind of content that young people are being exposed to. It's not like when we were kids, when you happened to find a Hustler ripped apart in the woods. Pornography is part of their daily diet online: that kind of hypersexualization, social comparison. This is happening over and over and over again.

The design of Instagram itself is made to make teenagers feel inadequate so that they keep staying on the platform. And there are now 42 U.S. states where attorneys general are suing social media companies, alleging they knew their product design was harmful to teenagers and chose to do nothing. They chose not to implement anything that would tamp down the impact on younger folks.

I'm not making the argument from the 90s that we need to put stickers on CDs. I'm making the argument that this is the product. This is the product, and the product could be different. I mean, it's the biggest tell — if Facebook can get rid of the news in Canada, why can't they get rid of hate? Why can't they get rid of the racism? Why can't they get rid of the ghost guns? There are all kinds of things that happen on that website; once you start to dig into it, you realize there's no such thing as the dark web. It's just Facebook.

Why can’t they get rid of all that? 

Joan Donovan: They could. They choose not to because that’s what gets you to go back. 
 


*Q&A was edited for clarity and length. This episode was produced by Mitchell Stuart.

Listen to the full conversation wherever you download your favourite podcasts.
