“Almost everybody who is a writer these days,” he observes, “gets, at some point, a lecture on the necessity of being ‘on’ Twitter and Facebook. It’s a tool of selling and career building. It is, for writers of all ages and stages, not so much required reading as required writing. The whole thing seems stupid at first: you ignore whoever is giving you this lecture, until one day you decide, O.K., let’s try it out, and then discover that it’s kind of fun. And, as long as it’s done in moderation, it is kind of interesting. But could Twitter possibly be productive, beyond the basic act of publicizing what you have written and/or proving that you still exist?”
As a new (and somewhat skeptical) Twitter user, I find all this particularly resonant. Have we really hit the cultural tipping point where we need to prove our existence, in bursts of 140-character inspiration, two, three, 10, 12, 15 times a day?
Beller’s account charts my own Twitter experience with uncanny accuracy: resistance, capitulation, the discovery that, perhaps, there might be something to the enterprise, and yet a lingering confusion about what it all means. Is it possible to do something interesting on Twitter? Or is it just an elaborate way of saying, “Look at me”?
“Most great writers,” Beller notes, “could, if they wanted to, be very good at Twitter, because it is a medium of words and also of form. Its built-in limitation corresponds to the sense of rhythm and proportion that writers apply to each line.” That reminds me of the poet David Trinidad, who once told me that he liked writing in traditional forms (the sonnet, especially) because of the stylistic boundaries: The challenge was to be creative while coloring within the lines.
I haven’t been on Twitter long, but I find myself gravitating to the feeds of writers such as Margaret Atwood and Joyce Carol Oates, whom Beller calls “a prolific and often ingenious tweeter,” citing one recent example: “If an action is not recorded on a smart phone, does it, did it, exist?”
What Oates is doing here is neither self-promoting nor time-wasting nor navel-gazing but, rather, something more ontological. Hers is a highly self-conscious feed, often reflecting on the mechanics of the medium: “Development in human consciousness,” she tweeted a few weeks ago, “people speak into the (social media) void as a way of speaking to themselves, like keeping a journal?”
That’s a vivid observation, but what I admire most about it is that it requires us to be engaged. Oates is asking, in other words, for us to think about what we’re doing on Twitter, what we’re reading and what we’re writing, which is, of course, the whole point of literary (or any) culture, to encourage consciousness.
For Beller, Oates provokes a series of significant, and troubling, questions: “Does a thought need to be shared to exist? What happens to the stray thought that drifts into view, is pondered, and then drifts away? Perhaps you jot it down in a note before it vanishes, so that you can mull it over in the future. It’s like a seed that, when you return to it, may have grown into something visible. Or perhaps you put it in a tweet, making the note public. But does the fact that it is public diminish the chances that it will grow into something sturdy and lasting? Does articulating a thought in public freeze it in place somehow, making it not part of a thought process but rather a tiny little finished sculpture?”
I wonder much the same thing. Is Twitter the problem or the solution? How do we use it in compelling ways? Beller closes his piece by describing two essays he first “assembled” as a set of tweets, a process he regards as akin to “being a juggler or a three-card-monte dealer: I drew a little crowd.” Is this “talking it out before you write it,” he wonders, “or part of a process?” And what does it mean for our writing if we are now expected to take our notes in public, to play out in full view the very private life of the mind?
“We live in a transparent age,” Beller concludes, “and yet there is much of value that happens in the opaque quarters of our own ambivalent minds, seen by no one else, and seen by us only after a long period of concentration and looking.”
Copyright © 2014, Los Angeles Times