Excerpted from “Superbloom: How Technologies of Connection Tear Us Apart” by Nicholas Carr, M.A. ’84.
It was a Sunday night, Oct. 19, 1952, and Frank Walsh, a Long Island electrician who moonlighted as a security guard, was worn out. He headed upstairs to bed while his wife, mother-in-law and five kids stayed down in the living room watching TV. They were engrossed in the latest episode of the new hit comedy “The Abbott and Costello Show.” Walsh tossed and turned but couldn’t fall asleep. The television was too loud, the laughter jarring. His irritation mounted, then turned to rage. He got up and grabbed the .38 Special he used in his guard job. Halfway down the stairs, the offending set came into view. He paused, took aim, and fired a bullet through the screen.
Walsh’s wife, furious, called the police. Officers arrived and confiscated the revolver, but they made no arrest. There’s no law, they explained, against shooting one’s own television. Two days later, The New York Times ran a brief, tongue-in-cheek notice about the incident, under the headline “Obviously Self-Defense.” The day after that, a Times columnist, Jack Gould, praised Walsh’s “public-spirited act.” He called on the authorities to give the man his gun back. “His work has barely started.” The paper’s coverage turned Walsh into a celebrity. Within a week, he appeared as a contestant on the popular prime-time game show “Strike It Rich.” He won a TV.
To shoot a television set, Frank Walsh discovered, is not to strike a blow against media and its dominion. It’s to merge into the televisual. It’s to act as someone on TV would act. As the producers of “Strike It Rich,” not to mention the editors of The New York Times, immediately recognized, Walsh’s shooting of his television was a made-for-media event — outrageous, funny, violent, relatable. Flattened into a figure of amusement and funneled into the media flow, Walsh succeeded only in turning himself into content. His act lived on, though. Firing a gun at a television would become a cultural trope, replayed endlessly in books, movies, songs, cartoons, and, of course, television shows. Elvis Presley made a habit of shooting his TVs and burying the carcasses in a “television graveyard” behind Graceland. He would then go out and buy more sets. He kept upwards of a dozen televisions in various locations around his mansion, plugged in and broadcasting. In surrounding himself with screens, the King was a trailblazer. We all live in Graceland now.
***
Thanks to its lack of attachments, its promiscuous flexibility, mass media has always been resilient. It absorbs the criticisms directed at it (even when they take the form of projectiles), turns them into programming, airs them, then distracts us from them with the next spectacle. Social media goes a step further. By encouraging an overheated style of rhetoric that breeds political polarization and governmental paralysis, it reduces the chances that it will be subjected to meaningful regulations or other legal controls. It’s protected by the conditions of distraction and dysfunction that it fosters. Politicians go on social media to express their disdain for social media, then eye the like count.
That’s not to say reform is impossible. The European Union, which has been much less sanguine than the United States about jettisoning the secrecy-of-correspondence doctrine, regularly passes laws and regulations aimed at restraining social media platforms. The rules provide citizens with more control over the information they share and the information they receive. Europeans are able to opt out of data-collection regimes, targeted advertising programs, and even, as of the summer of 2023, personalized news feeds. But the controls, however salutary, haven’t really changed the way social media operates. The reason is simple: they haven’t changed the behavior of most users. As surveys show, consumers have grown accustomed to trading personal information for tailored products and services. Few of them at this point are going to opt out of receiving content geared to their desires. Personalization has become central to people’s experience of media and to the enjoyment they derive from it. For avid TikTokers, taking the For You out of the For You page would be tantamount to switching off a pleasure center in the brain. Strong engagement isn’t only good for the platforms; users like it, too.
Antitrust actions against companies such as Google and Meta, which may be justified in economic terms, are also unlikely to change social media’s workings. Technological progress has an inertial force that rolls on independently of the maneuverings of the companies making money off it. While breaking up the tech giants or curbing their ability to enter into oligopolistic alliances might well intensify competition and innovation in the internet industry, it’s unlikely to push media off the technological path it’s already on — a path that has been and will continue to be appealing to consumers and lucrative for companies. The point of antitrust prosecutions, argues Tim Wu, the Columbia law professor, is not to punish the big platforms but to force them “to make way for the next generation of technologists and their dreams.” That sounds stirring — until we remember that it’s the dreams of technologists that got us into our current fix. The next wave of innovations — larger language models, more convincing chatbots, more efficient content generation and censorship systems, more precise eye trackers and body sensors, more immersive virtual worlds, faster everything — will only drive us further into the emptiness of hyperreality.
The boldest and most creative of social media’s would-be reformers, a small group of legal scholars and other academics, joined by a handful of rebel programmers, have a more radical plan. They call it frictional design. They believe the existing technological system needs to be dismantled and rebuilt in a more humanistic form. Pursuing an approach reminiscent of the machine-breaking strategy of the 19th-century British Luddites, if without the violence, they seek, in effect, to sabotage existing social media platforms by reintroducing friction into their operations — throwing virtual sand into the virtual works.
“The relentless push to eliminate friction in the digital networked environment for the sake of efficiency,” explain two of the movement’s leading thinkers, Villanova’s Brett Frischmann and Harvard’s Susan Benesch, in a 2023 article in the Yale Journal of Law & Technology, has imposed large, hidden costs on society. “A general course correction is needed.” Invoking the “time, place, and manner” restrictions that have long been imposed on public speech — the prohibition on using a megaphone on a neighborhood street in the middle of the night, say, or the requirement that protesters get a permit before marching through a city — Frischmann and Benesch argue that legal restrictions can in a similar way be imposed on media software to encourage civil behavior and protect the general public interest. Unlike antitrust actions, privacy regulations, and opt-in requirements, which fail to address “the rampant techno-social engineering of humans by digital networked technologies,” government-mandated design constraints would, they write, transform the “digital architectures [and] interfaces that shape human interactions and behavior.” The constraints would change social relations by, to once again draw on sociologist Charles Horton Cooley’s terms, altering the mechanisms that determine how information flows and associations form.
Many kinds of “desirable inefficiencies” have been proposed. Limits could be set on the number of times a message can be forwarded or the number of people it can be forwarded to. The limits might become more stringent the more a message is shared. A delay of a few minutes could be introduced before a post appears on a platform, giving the person doing the posting time to reconsider its content and tone and slowing down the pace of exchanges. A similar delay or a few added clicks could be imposed before a person is allowed to like or reply to someone else’s post. A small fee might be required to broadcast a post or message to, say, more than 1,000 recipients. The fee might be increased for 10,000 recipients and again for 100,000. A broadcasting license might be required for any account with more than a quarter million followers or subscribers. Pop-up alerts could remind users of the number of people who might see a post or a message. Infinite scrolls, autoplay functions, and personalized feeds and advertisements could be banned outright.
There’s much to be said for the frictional design approach. It introduces values other than efficiency into media technology, and it would promote the construction of networks that, like the analog systems of old, encourage more deliberation and discretion on the part of viewers and listeners. If “code is law,” as Harvard Law School professor Lawrence Lessig argued years ago, then shouldn’t the public’s values and interests be taken into account in the formulation of software that shapes how society works? We have speed bumps on roads to slow people down and safeguard the public; why not on the net? The approach also has precedents in recent experiments undertaken by the platforms themselves. In 2020, some Twitter users began seeing a pop-up asking “Want to read the article first?” when they were about to retweet an article they hadn’t read. The pop-ups stirred some irritation — “Who made you god?” one user tweeted — but they did seem to have an effect, increasing the likelihood that people would at least glance at an article before sharing it. Two years later, Twitter tested a similar pop-up to deter “abusive language” in tweets. It, too, seemed to have an effect, with users canceling or revising about a third of the flagged messages. Apple and Instagram have introduced algorithmic interventions aimed at curbing the exchange of nude photos among minors. Teenaged users of Apple’s Messages and Instagram’s direct-messaging service are warned before sending or receiving messages that include nude images, and the images themselves are sometimes automatically blurred.
But while frictional design may help curb certain well-defined types of undesirable online behavior, it is likely to prove as futile as Frank Walsh’s gunplay when it comes to changing how social media operates. Unlike traditional time, place, and manner laws, which don’t affect the day-to-day lives of most people, changes to the basic workings of social media would affect pretty much everyone all at once. Although the frictional design proposals focus on regulating how technological systems work rather than on what people say, they would still raise free-speech and free-press concerns. Many people, even among the growing number who would like to see stiffer controls placed on platform companies, would rebel against what they’d see as paternalistic overreach or nanny-state meddling. Others would object to the government imposing a single set of values on the general public’s means of communication and entertainment. Many would ask whether politicians and bureaucrats can be trusted to meddle with software without mucking everything up. Would every shift in the political winds bring sudden and confusing alterations to the way apps work?
The biggest obstacle to adding friction to communication, though, is likely to be the habits of social media users themselves. The history of technological progress shows that once people adapt to greater efficiency in any practice or process, reductions in efficiency, whatever the rationale, feel intolerable. The public is rarely willing to suffer delays and nuisances once it has been relieved of them. In a culture programmed for ease, speed, and diversion, friction is the hardest of all sells.
The distinguished technology historian Thomas Hughes, having spent decades studying electric utilities, manufacturing plants, and transportation and communication networks, argued that complex technological systems are difficult if not impossible to change once they become established. In a system’s early, formative days, the public has an opportunity to influence how it’s designed, run, and regulated. But as it becomes entwined in society’s workings and people’s lives — as the technology gains “momentum,” in Hughes’s formulation — it resists alteration. Changing the system in any far-reaching way causes too many disruptions for too many people. Society shapes itself to the system rather than the other way around.
In the 1990s, when the internet was just beginning its transition from an academic to a commercial network, we could have passed laws and imposed regulations that would have shaped the course of its development and, years later, influenced how social media works.
We could have updated the secrecy-of-correspondence doctrine for a new era of online communication. We could have applied the public-interest standard to internet companies. We could have made the companies legally responsible for the information they transmit. We could have drawn technological and regulatory distinctions between private and public communication. But none of that happened. It was hardly even talked about. The public’s enthusiasm for the web and its apparent democratizing power, an enthusiasm that swept through Congress, the White House, and the Supreme Court, was too strong. Our faith in the benefits of ever more efficient communication overrode any concerns about risks or unintended consequences. Now, it’s too late to rethink the system. It has burrowed its way too deeply into society and the social mind.
But maybe it’s not too late to change ourselves.
Copyright (c) 2025 by Nicholas Carr. Used with permission of the publisher, W. W. Norton & Company, Inc. All rights reserved.