Several years ago I followed an exchange on Twitter between two academics. Both were lamenting the (in their view) low-quality work done by young writers, as well as the way those writers all seemed addicted to Twitter. After following the exchange for a few minutes, I asked them both, “So aside from ‘doing your PhD in the 80s and 90s before the academic job market fell apart and then finding a tenure-track gig at a liberal arts school that will allow you to write whatever you want and provide you with a job that allows you that time,’ what is your advice to young people wishing to make a living as a writer?”

Neither of these august academics replied.

I was reminded of that exchange while reading a recent piece at Front Porch Republic. According to Matt Stewart, Twitter is no place for a localist. Citing Wendell Berry’s “Why I Am Not Going to Buy a Computer,” as well as the obvious problems with how the platform forms thinkers, he argued that no self-respecting localist should be active on Twitter–or Facebook.

To be sure, if you are a PhD candidate chasing work in the academy, there is a strong case for abstaining from social media. Most hiring committees will, at best, not care about a large Twitter following; many will be turned off by it. And Twitter provides a ready platform for saying something publicly, to a large audience, that will render you unemployable at most universities or colleges. If I were an academic (or a pastor), I wouldn’t be on Twitter.

But, of course, what I have just outlined is a contextualized prudential case for not using the platform, which is quite different from the more far-reaching critique offered by Stewart. Stewart’s argument hinges on two things, it would seem: First, Twitter use undermines one’s ability to be useful on a local level, in one’s own small place. Second, Twitter use inevitably deforms one intellectually by exchanging the proper criteria by which intellectual work is judged–rigor, precision, clarity–for criteria that inherently devalue and degrade intellectual work–convenience, speed, ease, and so on.

Stewart’s argument is, in short, an application of McLuhan’s idea that the medium is the message, that we cannot separate the brute fact of a tool’s existence from that tool’s power to form and shape everything it touches. Yet it seems to me that while we should recognize the formative power of tools, we should not simply collapse a tool and its formative effects into one thing, which is clearly what Stewart is doing.

Consider, first, the argument that Twitter undermines one’s ability to be useful locally. That is easily dispensed with: I have made friends on Twitter who have taught me a number of useful things that I have been able to use to bless my local place. Some of those things are profound–Twitter friends have passed on articles or books that helped shape my mind at a deep level–while others are far more banal–I have found excellent recipes and help with home repair on Twitter and Facebook. Indeed, when I have a home repair problem, the two places I go for help first are YouTube and Facebook. That could potentially draw me away from my local community, but it does not necessarily do so. After all, if checking YouTube for help with such a thing makes me a bad localist, then so would consulting Home Plumbing for Dummies.

The point about the formational effect of Twitter is stronger, yet even here it does not work nearly so well as Stewart seems to think. In the first place, he does not seem to entertain the idea that learning to express things concisely can be a worthy goal and that Twitter may be helpful toward that end. Second, one can, of course, simply quit or take a break from the platform. If you follow Catholic Twitter at all, then you are familiar with the quiet interludes that come with Lent and sometimes Advent, when many devout Catholics step away from the platform for a time.

Twitter is, ultimately, a tool. It is something that can bring writers together, allow for constructive conversation (I have seen and participated in such discussions), and help hone certain skills a writer needs. It can, of course, do many other things that are far less helpful. But, then, that is precisely my point: There are many prudential reasons that a person might not use Facebook or Twitter. I’ve stepped away from both for periods of time so that I could more easily focus on other work. Being wise about the tools one uses is good and reasonable. But Stewart’s argument is not that we should be prudent; it is that we are somehow bad localists who are deforming ourselves intellectually if we join a social media network.

There is, of course, a far stronger argument for not using the platforms, and Stewart hints at it: Because social media networks are free to use and subsidized by ads, to use a social network is essentially to consent to making a product of oneself. And, as the Cambridge Analytica scandal showed, social networks have the ability to make us into extremely specific sorts of products. This is a much stronger point and, frankly, if I were not preparing to publish a book and did not work as an editor with two non-profits, it probably would have been sufficient to drive me off Facebook, at least.

Yet even here we should be careful: The problem is still not that simply using some particular tool automatically removes one from one’s local context and guarantees that one will gradually morph into a Matt-Walsh-style machine that belches out hot takes. The problem is with the business model being used to subsidize a service. It is possible, of course, that there is no viable business model for building a large-scale social network other than commodifying the users in increasingly creepy ways. But, of course, that too was not Stewart’s argument.

There is one final point to make: We live in a time when the much-discussed social fabric is unraveling. In such a day, we have great need for careful, creative writers who can call upon a vast store of knowledge to advise us in the practical questions of citizenship, household life, piety, and a host of other matters. But being that kind of writer requires a great deal of time, which is to say it (usually) requires making one’s living as a writer or editor. And if you wish to do that in today’s writing economy, then you pretty much have to have a Twitter account.

Is that norm stupid? Probably! I edit a site whose articles are shared, not infrequently, by accounts with 100,000+ followers, and the traffic effect of those shares is non-existent. Derek Thompson has observed something similar over at The Atlantic. In a more reasonable world, we’d recognize that one’s Twitter follower count is an almost entirely meaningless vanity metric that doesn’t actually reflect one’s relative prominence, or lack thereof, as a writer. But we don’t live in such a world. And as long as publishers judge one’s “platform” largely on the basis of Twitter follower counts, writers will have to play that game to some degree. This is all frustrating and probably foolish on the part of publishers, if my experience at Mere Orthodoxy and Thompson’s at The Atlantic are anything to go on. But it’s where the industry is right now. And if we want to have writers useful for our moment, which is to say writers who have the time and freedom required to do good reading and reflecting on the world, we need writers who work full time as writers. And to get there, you pretty much have to play the social media game.
