A Rutgers-Camden researcher warns against creating ‘bubbles of agreement’ online

As the presidential races heat up, chances are the rhetoric online is heating up too. Perhaps this scenario sounds familiar: You click – or even hover over – a campaign ad. Suddenly, on every webpage you visit, you are inundated with similar, reappearing ads.

Later, you log onto Facebook, where several of your so-called “friends” are once again railing against your candidate of choice. Enough is enough, you say, but unfriending these antagonists seems too extreme – so you promptly mute them.


“What you see is that we are algorithmically making our own ‘bubbles of agreement,’ based on rules that we are making, as well as those that computers and software programs are making for us,” says Jim Brown, an assistant professor of English and director of the Digital Studies Center at Rutgers University-Camden. “When I make that decision – this person is too far away from my politics so I am going to mute them – I am applying my own algorithm while the computer is applying its own as well. It’s fascinating to consider how this plays out over the course of an election.”
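Brown's point can be made concrete: a reader's mute rule and a platform's ranking rule are both filters, and together they narrow what appears in a feed. The sketch below is purely illustrative – the post structure, the engagement threshold, and both rules are hypothetical, not any platform's actual logic:

```python
def my_mute_rule(post, muted_users):
    # The reader's own "algorithm": hide posts from muted accounts.
    return post["author"] not in muted_users

def platform_rule(post, engagement_threshold=10):
    # A hypothetical platform algorithm: surface only high-engagement posts.
    return post["likes"] >= engagement_threshold

feed = [
    {"author": "alice", "likes": 25},
    {"author": "bob", "likes": 3},
    {"author": "carol", "likes": 40},
]
muted = {"carol"}

# The two rules compose into a single "bubble of agreement":
# only posts that satisfy both filters are ever seen.
visible = [p for p in feed if my_mute_rule(p, muted) and platform_rule(p)]
print([p["author"] for p in visible])  # ['alice']
```

Neither filter alone determines what the reader sees; it is their composition – human rule stacked on machine rule – that produces the bubble.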

In his illuminating new book, Ethical Programs: Hospitality and the Rhetorics of Software (University of Michigan Press), Brown examines the kinds of rhetorical spaces that networked software establishes and the access it permits, prevents, and molds. A major contribution to the emerging discourse of software studies, the book explores the rhetorical potential and problems of a hospitality ethos suited to a new era of hosts and guests.

“The argument that I make is that, by allowing us to do certain things and not others, software poses both a rhetorical and an ethical problem – deciding who or what can or can’t participate, and how these experiences shape us,” explains Brown. “The software is making arguments about how we should and shouldn’t use it.”

Brown explains that by thinking about software as another participant or audience, individuals are in constant interaction, and thus experience a constant push and pull, with these arguments and shifting ground rules.

“They are almost like conversations, and in those conversations, we struggle to find out what rules are in play,” says Brown.

Throughout Ethical Programs, Brown introduces a series of stories and explores what they illustrate about software rhetoric and ethics. In one chapter, he considers both the positive and negative consequences of hacking. As the Rutgers-Camden researcher recounts, several years ago, a Twitter user found a flaw in the platform’s software: a user could post JavaScript code into a tweet and execute it, wreaking havoc on the machine of anyone who read the tweet. The user notified Twitter and was told that the problem had been fixed.

Several months later, the flaw reemerged. Only this time, instead of notifying the company, the user exploited the platform and showed the company the dire consequences of not taking the issue seriously.
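The flaw Brown describes was a classic cross-site scripting (XSS) vulnerability: user-supplied text rendered as live HTML, so a tweet containing script could run in other users' browsers. The sketch below shows the general pattern and the standard defense of escaping user content before display; it is a minimal illustration, not Twitter's actual code:

```python
import html

def render_tweet_unsafe(text):
    # Vulnerable: user text is interpolated directly into HTML,
    # so a tweet containing <script>...</script> would execute
    # in the browser of anyone viewing it.
    return f"<div class='tweet'>{text}</div>"

def render_tweet_safe(text):
    # Escaping converts <, >, &, and quotes into HTML entities,
    # so the payload is displayed as inert text instead of executed.
    return f"<div class='tweet'>{html.escape(text)}</div>"

payload = "<script>alert('hijacked')</script>"
print(render_tweet_safe(payload))
```

The fix is a single escaping step – which is part of what makes the episode Brown recounts so pointed: the company had been warned, and the safeguard was not exotic.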


“So was this hacker a villain or a hero?” asks Brown. “We tend to blame the hackers, but how much blame should go to the company that doesn’t institute the proper security measures in order to safeguard against these issues?”

In another chapter, Brown explores the ramifications of increasingly complex software programs having the ability to generate news stories on their own. He recalls a research project conducted several years ago wherein scientists created a program that algorithmically wrote recaps of baseball games.

Rather than sift through every available fact, the program analyzed high-level statistics to locate significant moments in a game. For instance, the software tracked a statistic called win probability added, which flags the turning points at which one team’s chance of winning spikes.
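The underlying idea is simple: score each play by how much it moved the team's win probability, then build the recap around the biggest swing. The sketch below illustrates this under assumed data – the play-by-play log and win-probability numbers are hypothetical, and this is not the researchers' actual system:

```python
# Hypothetical play-by-play log: (description, home team's win probability after the play).
plays = [
    ("Leadoff single in the 1st", 0.54),
    ("Double play ends the 3rd", 0.48),
    ("Three-run homer in the 7th", 0.81),
    ("1-2-3 ninth inning", 1.00),
]

def biggest_swing(plays, start_wp=0.50):
    # Find the play with the largest change in win probability --
    # the "win probability added" turning point a recap would feature.
    best, best_delta, prev = None, 0.0, start_wp
    for desc, wp in plays:
        delta = abs(wp - prev)
        if delta > best_delta:
            best, best_delta = desc, delta
        prev = wp
    return best, best_delta

key_play, swing = biggest_swing(plays)
print(f"Turning point: {key_play} (WPA swing {swing:.2f})")
# → Turning point: Three-run homer in the 7th (WPA swing 0.33)
```

A statistic like this finds the quantifiable drama – which is exactly Brown's point about what it cannot find.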

Although highly analytical, says Brown, such programs fail to detect storylines that only a human could see, such as the unquantifiable actions or contributions of a particular player, or storylines that go beyond one particular game.

“What if the first-round draft pick is making his major league debut?” he asks. “He goes hitless on the day, but it doesn’t show up in the story.”

Ultimately, he notes, such programs come at a cost.

“It means that, once you release these machines into the wild, they are making decisions,” says Brown. “We have to come to grips with that.”

Brown also argues that, just as he is analyzing software through the lens of an English researcher, experts across a wide range of disciplines should be conducting close analyses of software.

“If we just leave it to the programmers or computer scientists, then they are only going to answer certain kinds of questions,” he says.

To take it even a step further, he adds, since software permeates everyday life, anyone should be able to look at a page of computer code and know basically what is happening.

“We talk about the three Rs in school – reading, writing, and arithmetic – but one of the primary things that we should be teaching is computational thinking,” says Brown.

Moreover, cautions Brown, some human issues predate computers and have only been exacerbated by the emergence and proliferation of technology.

“Political disagreements are a prime example,” he says. “We have to learn to disagree with one another and be open to differing points of view. If it’s ‘mute, mute, mute,’ and these are the only people I want to talk to, we need to ask ‘Why?’ and ‘Is this the best way to live our lives?’ We are increasingly forced to recognize how we interact and deal with one another, and what we do when someone shows up in our lives.”