We should all be grateful to Siva Vaidhyanathan. He has endured great pain and suffering to explore a dangerous new landscape, and he now offers to be our guide to this strange ecosystem full of threatening creatures. The landscape I’m referring to, of course, is the media ecosystem that Facebook has wrought. And the pain and suffering Vaidhyanathan endured consisted not merely of the academic research required to write a scholarly book, but of “hundreds of hours reading Mark Zuckerberg’s interviews, watching his television appearances, and listening to his public addresses.” It is difficult to imagine a more mind-numbing research agenda. So, as I said, we should be grateful that he has suffered for us and that instead of reading Zuckerberg, we can just read Vaidhyanathan’s new book, Antisocial Media: How Facebook Disconnects Us and Undermines Democracy.

Perhaps the best way to get oriented within this strange landscape is to define some key terms. So in lieu of a more traditional summary, I will organize some of Vaidhyanathan’s key points in a kind of guidebook that defines aspects of our contemporary media ecology. As a former colleague of Neil Postman, Vaidhyanathan follows Postman in imagining media as an ecosystem (though he differs from Postman in other respects), so I’d like to think he’d approve of this approach to his book.

A Guidebook to Facebook’s Media Ecosystem

Asymmetric Information

While Vaidhyanathan doesn’t use this economic term to describe users’ relationship with Facebook, the term fits the facts he lays out. In essence, Facebook knows a lot more about its users (or should I say victims?) than users realize. And thanks to “Open Graph partnerships and the use of tracking cookies that it implants in users’ Web browsers, [Facebook] is able to gather immense amounts of personal data from people who hardly ever log in to their Facebook accounts. Basically, there’s no way to opt out fully from Facebook’s ability to track you” (54). Since Facebook even maintains a “shadow profile” of individuals who aren’t registered users, it tracks pretty much everyone. Facebook is the most widespread and intrusive surveillance apparatus ever designed. And like the prisoners in Bentham’s Panopticon, users don’t know the extent to which they are being watched.

The Cryptopticon

Facebook’s remarkable ability to extract data from its users leads to what Vaidhyanathan terms “the Cryptopticon: An inscrutable information ecosystem of massive corporate and state surveillance.” Yet this isn’t an Orwellian (or even Foucauldian) Panopticon because users freely feed their data into the Cryptopticon in exchange for pleasure.

Facebook, Google, and Amazon want us to relax and be ourselves. They have an interest in exploiting niche markets that our consumer choices have generated. These companies are devoted to tracking our eccentricities because they understand that the ways we set ourselves apart from others are the things about which we are most passionate. Our passions, predilections, fancies, and fetishes drive and shape our discretionary spending; they are what makes us easy targets for precise marketing. (63)

Be your authentic you. Pursue your pleasures. The Cryptopticon is watching and will sell you what you need before you know you need it.

Context Collapse

Facebook flattens all social relationships into one type, the misnamed “Friend.” By doing so, it collapses together the various groups to which we belong: “school, church, the public sphere, a place of employment, or a family. Each of these contexts shifts and overlaps with others. Borders change. Contexts blend. So configuring a ‘self’ in the twenty-first century is a lot more work than it used to be” (60).

Vaidhyanathan is right to criticize the way that Facebook “scrambles our social contexts,” but he frames the problem as a loss of privacy (which he defines as “the combination of autonomy and dignity”). Such privacy is one of the core goods of liberalism, which holds that each individual should have the freedom to achieve autonomous self-definition. The more fundamental problem, I think, is the erosion of membership or intimacy. Facebook tends to reduce all social relationships to tepid, flat “Friends” because that is the only kind of connection it makes available.

Filter Bubble

Sometimes also labeled an “echo chamber,” the filter bubble describes how Facebook’s algorithms exacerbate our human tendencies to listen to people with whom we already agree. Vaidhyanathan points out that “homophily—a sociological term used to describe our urge to cavort with those similar to ourselves”—was present long before Facebook, but Facebook, and the broader media landscape shaped by Facebook, amplifies and reinforces this tendency (85). Its algorithms feed us more of the content we’ve interacted with previously (think of the way recommendations on Netflix or Amazon work). In the past, Facebook experimented with prioritizing certain news sites in users’ news feeds. While this had its own set of problems, Facebook was at least acknowledging it might have an educative or editorial responsibility. But Facebook’s algorithms have doubled down on simply maximizing engagement. Long live clickbait.

The Operating System of our Lives

Ultimately, Facebook is vying (along with Alphabet, Microsoft, Amazon, and Apple) to dominate “the data streams that would monitor, monetize, and govern our automobiles, homes, appliances, and bodies” (93). Each of these companies, and countless smaller ones, wants to monopolize the invaluable data generated not only by personal computers and smartphones, but by the ever-growing Internet of Things. The result? “Attention would be optional. Power would be more concentrated. And manipulation constant” (99). Sounds like fun, right?

Corporate Social Responsibility

Not to worry; Facebook is part of a broader movement within the business world that believes corporations have a responsibility to improve society. As Zuckerberg claims, Facebook “‘was built to accomplish a social mission—to make the world more open and connected’” (101). While this seems like an improvement upon the “shareholder primacy” school of thought, in which companies exist solely to maximize shareholder value, it has insidious effects, particularly since, as Vaidhyanathan points out, Zuckerberg manages to combine “self-righteousness with naïveté.” I am reminded of Thoreau’s diatribe against the growing class of philanthropists: “If I knew for a certainty that a man was coming to my house with the conscious design of doing me good, I should run for my life.” (If you wonder why anyone would criticize philanthropists, read Jeremy Beer.)

Techno-Narcissism

One manifestation of this naïveté is the techno-narcissism that dominates Silicon Valley (or, as the Bar Jester aptly names it, Silly Con Valley). “Techno-Narcissism,” Vaidhyanathan writes, “is both ethnocentric and imperialistic. It assumes that if only people had our tools, toys, and techniques their lives would improve almost instantly” (133). This assumption fuels wild optimism and utopian thinking. It leads Western observers to see a “Twitter Revolution” behind the “Arab Spring” when in fact, as Vaidhyanathan demonstrates, social media and the internet do not magically bring about democracy and liberty.

Hypermedia

Vaidhyanathan borrows this term from Philip Howard, and it describes a narrowing of political and cultural messaging. As Facebook and other platforms “harves[t] consumer data [and] profil[e] voters in narrow tranches based on issue interest,” we increasingly see messages that “pande[r] to [our] pet concerns” (155). The result is a tragedy of the commons 2.0: “The culture of politics, therefore, has become customized so that we are each asked whether a candidate or platform is good for us and our immediate gratification rather than good for our community, nation, or world.” We feel like we’re getting personal, direct contact with elected representatives or important institutions, but we’re being manipulated rather than empowered.

Disinformation

Rather than “fake news,” Vaidhyanathan identifies the core problem of the Facebook ecosystem as the amplification of disinformation. Pollution spreads and drowns out meaningful conversation. The kinds of stories that are most “engaging,” the ones most likely to be shared, are those that generate a strong emotional response. Sites like the Huffington Post and Breitbart have honed the art of crafting content that is “irresistible to Facebook News Feeds. Those who buy into the claims of the disinformation share it gleefully. Those who are appalled by the existence of the disinformation share it out of disgust. The effect is the same. Chaos reigns” (174). Trump, of course, is a master at provoking this cycle of “mediated cacophony,” but he’s simply using the system as it was designed to work (12).

Moving from Suspicion to Trust

More than anything, Vaidhyanathan’s analysis confirms that Facebook’s media ecosystem amplifies disinformation and makes it increasingly difficult to think well in public. To use a distinction that Alan Jacobs borrows from Mike Caulfield, Facebook is a tool to publish with, not to think with. Facebook has restructured our entire media ecosystem to make it easier than ever to publish and spread ideas, but it is increasingly difficult to listen to others and then think and deliberate with them. As Vaidhyanathan writes, “Facebook has amplified some of our best and worst collective habits. But one thing has surely suffered: Our ability to think through problems together” (203).

And this is in spite of—or maybe because of—Mark Zuckerberg’s idealistic vision of “connecting the world.” As Vaidhyanathan demonstrates, Facebook has connected us in ways that damage us and our places:

By removing friction from so much of our lives, by lifting our gaze from the local to the virtual, these systems have made many lives easier (ignoring, for a moment, those poisoned by runoff from heavy metal mining in Africa or those injured in electronics factories in China). But the constant, daily effect of these technologies is narcotic and neurotic. (194)

Perhaps the greatest loss has been an erosion of trust. We no longer know which institutions and voices are reliable and trustworthy. Vaidhyanathan cites a disturbing study which found that “Facebook users judge the trustworthiness of information that comes across their News Feed based on who posted it rather than the source of the post itself” (14). In the absence of trustworthy institutions, we rely on our friends. Depending on whom we follow, this can be okay, but in the aggregate, crowd-sourced knowledge contributes to disinformation more than it does to responsible wisdom.

Is there any hope that we might restore the trust needed to think and deliberate honestly together? Vaidhyanathan argues that we shouldn’t try to restore this trust through individual choices like deleting our Facebook accounts. He claims that “Facebook makes it hard to think. . . . As individuals we can deploy strategies and tactics to cope. We can delete an app or turn off a mobile phone. There are no such strategies for the harm Facebook does to our ability to think collectively” (190). Thus he concludes that “a mass boycott of Facebook would be trivial” and instead challenges us to “strengthen other institutions such as libraries, schools, universities, and civil society organizations that might offer richer engagement with knowledge and community” (16). This is an intriguing line of thought, but given the state of civil society and its lack of strong, trustworthy institutions, it’s not surprising that Vaidhyanathan spends more time pondering how national politics might solve the mess that Facebook has generated.

Still, it seems odd to put one’s hope in pulling on the levers of political power that, as he so ably demonstrates, Facebook has rigged. Vaidhyanathan’s commitment to national, political solutions appears to stem from his nostalgia for some halcyon time when Enlightenment values reigned and the liberal democratic order promoted vibrant political discourse. He writes about his own youthful belief that, in the wake of the Cold War, the liberal order would prevail across the globe (127). And he mourns the way that the state, “as the aggregator and mediator of disinterested information and the forger of public interest” has “sloughed away” (119). But was the state ever such an institution? Is there such a thing as “disinterested information”? I have my doubts. Before Facebook, TV dominated election cycles, and further back there was the penny press; Facebook is bad in unique ways, but American media have long been attention-grabbing, superficial, and profit-oriented.

After chronicling the ways that Facebook influenced the 2016 presidential race, Vaidhyanathan asks, “What does it say about the fate of American democracy that national elections would be decided based on motivation rather than deliberation?” (143). Nothing good, but again, when exactly was this golden era of deliberation to which Vaidhyanathan hopes to return? While he acknowledges that state-orchestrated projects have gone badly wrong, he concludes that “somewhere between the tiny vision of innovation [e.g. private companies like Facebook entrepreneurially fixing problems] and the arrogance of grand [state-led] progress lies a vision of collective destiny and confidence that with the right investments, a strong consensus, and patience we can generate a more just and stable world” (198). That still sounds rather utopian and abstract to me. I can’t imagine a global system—whether backed by political institutions or private capital—that could earn and honor our trust. Trust can’t be sustained on a global scale.

Vaidhyanathan’s best idea is, unfortunately, unlikely to come to fruition. He urges antitrust regulators to simply dismantle Facebook: “The United States should break up Facebook” (207). A diversified ecosystem of smaller media companies would almost certainly be a real improvement over Facebook’s dominance. And maybe without the lure of going viral or the possibility of becoming a global thought leader, we’d be more likely to invest our energies closer to home, working to strengthen the civic institutions that remain.

In any case, the long, difficult work of restoring trust begins even closer to home, in our marriages, families, and friendships. While reading Vaidhyanathan’s description of the ways in which Facebook has eroded trust, I was reminded of Wendell Berry’s short novel Remembering. After Andy Catlett’s soul-searching on his early morning walk through the streets of San Francisco, he goes to the airport to board his flight home. Even in the pre-9/11 days, the airport is a place of abstraction and distrust: “He passes through the Gate of Universal Suspicion and is reduced to one two-hundred-millionth of his nation.” Each individual must pass through security: “none may be trusted, not one. Where one may be dangerous, and none is known, all must be mistrusted.” These precautions are necessary when we interact on large scales: We can’t know or trust or love 200 million people (much less the 325 million who now live in the United States or the 2 billion who use Facebook).

As Andy looks out his airplane window, he contrasts the patchwork landscape far below with the home to which he is returning, a place Andy “knows as his tongue knows the inside of his mouth.” For him and his wife, Flora, to make a life from the abused and marginal farm that is their home “required trust.” And “as Flora seems to have known and never doubted, as he sees, one cannot know enough to trust. To trust is simply to give oneself.” Such giving becomes nearly impossible in an ecosystem polluted with disinformation and warped by universal suspicion.

Facebook is simply not designed to connect us in ways that foster trust. We need to cultivate genuine friendships that grow from shared interests, shared history, and shared service. We need a literal common ground. When we are abstracted from our places and communities, we become commodified bits of data: “Andy began to foresee a time when everything in the country would be marketable and everything marketable would be sold, when not one freestanding tree or household or man or woman would remain.” Andy’s dystopian vision remains prescient. Big data has indeed found ways to market every household and man and woman. Yet his prescription also remains practicable: “Something needed to be done, and he did not know what. He turned to his own place then—the Harford Place, as diminished by its history as any other—and began to ask what might be the best use of it. How might a family live there without reducing it?” While we wait for Facebook’s damaging monopoly to be broken up, we can all set to work asking Andy’s questions about our own places. These questions remind us that to restore trust, we will need to be less connected and more rooted.
