Popular social media platforms were once a place to connect with old and new friends, post life updates, and share photos of your kids. But they have become fraught with misinformation and an algorithm-delivered onslaught of content aimed at fueling discord, all in service of business models that prioritize drawing eyeballs, experts on digital life said last Thursday during a discussion on how to improve online discourse and spaces.
But that’s not how they started.
“I think it’s important to remember why people were originally going to a lot of these services, these online environments. They were looking for connection and were looking for community,” said danah boyd, founder of research nonprofit Data & Society. “It was self-identified geeks, freaks, and queers — which I identify as all three — that were very, very happily finding their people in these online environments, and they were investing in those relationships.”
boyd, who is also a distinguished visiting professor at Georgetown University, was part of a keynote panel at an afternoon-long series of discussions, titled “Beyond Discourse Dumpster Fires.” boyd’s panel was moderated by Berkman Klein Center Faculty Director Jonathan Zittrain, the George Bemis Professor of International Law at Harvard Law School.
“I’ve been thinking a little bit about what I call the three laws of digital governance,” said Zittrain, who is also a professor of public policy at the Kennedy School and a professor of computer science at the School of Engineering and Applied Sciences. “One, we don’t know what we want. Two, we don’t trust anybody to give it to us. Three, we want it now. And the optional fourth is can AI scale it.” The crowd chuckled. He asked the panel how they felt these laws might be helping or hurting public discourse.
Deb Roy, a professor of media arts and sciences at MIT where he directs the MIT Center for Constructive Communication, said that when thinking about all of these technologies, it’s important to think about their intended goals.
Roy served as Twitter’s chief media scientist from 2013 to 2017. During that time, he was surprised by the lack of clarity within the company about its larger vision. Social networks’ decision to follow a broadcast media model for monetization, he said, led to many unanticipated challenges.
“The point of this must be to build an audience and to have a modern-day version of broadcast reach. It’s not clear that’s the primary kind of social network we were seeking when we talked about connection and community,” Roy said.
His team at MIT has been experimenting with a new social network that puts dialogue — both in person and virtual — at the center of its platform. It is currently being tested voluntarily by students and is slowly being rolled out to faculty.
It goes back to connection, boyd said.
boyd, who focuses some of her research on youth and the current mental health crisis, said that when young people turn to social networks or interventions for help, what they’re seeking is human interaction.
Before the pandemic, crisis workers would focus on connecting struggling youth with the “noncustodial adults” in their lives, she said. These are the teachers, pastors, coaches, and mentors who don’t have direct power over young people’s lives the way parents do, and who can offer a critical level of support. But coming out of the pandemic, crisis workers found that the number of youths who had even one such relationship had plummeted.
Could connecting them with an AI chatbot help? Probably not, she suggested.
“When somebody’s in crisis, what they’re grateful for is that a human has spent time listening to them,” boyd said. “Somebody, even a stranger, is willing to dedicate time. And that will never be replaced.”
But there may be other ways AI can lend a hand. Gordon Pennycook, associate professor and Himan Brown Faculty Fellow at Cornell University, recently published a study in Science that showed the value of using AI tools to address conspiracy theories.
Pennycook’s team used DebunkBot, an AI tool, to have short conversations with people about conspiracy theories they embraced. The bot presented participants with counterarguments using evidence-based research. The team found that nearly a quarter of participants no longer held those beliefs by the end of the conversation, which typically lasted only eight minutes.
“Even for this group of people who everyone thinks is down the rabbit hole — you can never change their mind — you see effects. Evidence matters,” he said of the findings. “It’s benefiting people … After the conversation, people trusted the AI more. Most people have really enjoyed the conversation, and they learn a lot from us.”
The event also featured other discussions around anonymous discourse on the online communication platform Nymspace and the role of technology in participatory democracy.
Toward the end of the keynote conversation, the panelists discussed the blurring of private and public spaces and the unique challenges this creates online.
Roy mentioned the book “A Pattern Language” by Christopher Alexander, which identifies design patterns that correlate with human flourishing. One such pattern, the intimacy gradient, considers the architectural experience of moving from more private spaces to more public ones, such as the difference between a bedroom and an entryway.
“What we’ve done with a lot of the design of social media [is] imagine your child in their bedroom, and they open the door and just step into Times Square. There is no gradient,” Roy said. “How do we bring that kind of thinking and those principles into the spaces that we all together can create?”