By Caitlin Roper, Special to the Los Angeles Times
June 27, 2010
Cognitive Surplus: Creativity and Generosity in a Connected Age
Clay Shirky
The Penguin Press: 256 pp., $25.95

The Shallows: What the Internet Is Doing to Our Brains
Nicholas Carr
W.W. Norton: 276 pp., $26.95
When I was looking for a job after college, I bought the Sunday New York Times and scanned the classifieds, circling promising descriptions with a red pen. I mailed in my resume and waited for a call on my landline. That was 11 years ago — the last time I looked for a job without using the Internet.
I now have close relations with several devices unimaginable in 1999. I spend a large part of every day at a computer terminal, and I conduct my personal and professional lives with the indispensable aid of new technologies. Am I dumber than I was before I had an iPhone and a 24-hour Internet connection? I don't feel dumber. I feel faster, but more distracted than I used to be. I don't know anyone who doesn't struggle, as I do, with the question of how much to let technology aid daily life, and how much to let it encroach.
This question, of the effect of technology in contemporary culture, is at the center of two new books that explore the impact of the Internet. Clay Shirky's "Cognitive Surplus: Creativity and Generosity in a Connected Age" and Nicholas Carr's "The Shallows: What the Internet Is Doing to Our Brains" come marketed as oppositional. Shirky says the Internet is making us smarter; Carr maintains it's making us more stupid. Though this reductive schema may work to sell books, it's a poor representation of both authors' arguments.
Shirky's perspective is macro, societal. He makes the case that the time we used to spend passively entertained by television has been largely replaced by time on the Internet, where we engage with each other, producing something. "This book is about the novel resource that has appeared as the world's cumulative free time is addressed in aggregate," he writes.
Of course, when it comes to what we are producing, Shirky's examples vary wildly, including lolcats (photos of cats with ungrammatical captions) and Ushahidi, an Internet service created to help citizens of Kenya track ethnic violence via cellphone, now with applications all over the world. Yet he isn't claiming that the Internet inevitably makes us more productive, civically engaged people. Instead, he argues, it taps into our innate desire to connect and share by enabling us "to speak publicly and to pool our capabilities," a process "so different from what we're used to that we have to rethink the basic concept of media: it's not just something we consume, it's something we use."
Shirky's best example is Patientslikeme.com, a site where those with chronic conditions share information and support. Patientslikeme allows people with rare diagnoses to form communities and discuss treatments, challenges, side effects — to commiserate. Before the Internet, there was no way for these patients to find one another and no way for doctors to find groups large enough to conduct significant clinical trials. Now individuals benefit from the support their online communities generate, researchers benefit from the largest groups of patients with rare diseases ever assembled and, with luck, innovations in treatment will follow.
If Shirky comes down in support of unbridled innovation — let's use the Internet in every way we can, he says, because that's how important developments, such as Wikipedia, Patientslikeme and Ushahidi, come about — Carr is more worried about what the Web is doing to each of us individually, to our poor, beleaguered brains. He fears that, even as we may be developing finer motor skills through constant Internet navigation, we're losing the ability to focus for the significant periods of time necessary for deep thinking. His is a stance of reasoned alarm: "[T]he news is even more disturbing than I had suspected," he writes. "Dozens of studies by psychologists, neurobiologists, educators, and Web designers point to the same conclusion: when we go online, we enter an environment that promotes cursory reading, hurried and distracted thinking, and superficial learning." Even more, he continues, "the Net delivers precisely the kind of sensory and cognitive stimuli — repetitive, intensive, interactive, addictive — that have been shown to result in strong and rapid alterations in brain circuits and functions."
Carr has synthesized a wealth of cognitive research to illustrate how the Internet is changing the way we process information. "The Net is, by design, an interruption system, a machine geared for dividing attention," he points out. He is particularly disturbed by the Internet's effect on our relationship with reading: "[I]n the choices we have made, consciously or not, about how we use our computers," he argues, "we have rejected the intellectual tradition of solitary, single-minded concentration, the ethic that the book bestowed on us." He's got a point; as someone who has loved books since before I could make sense of them, I'll admit that it's harder for me to carve out space to read than ever before. Without question, I spend more time on the Internet than I do reading bound volumes. Yet once I begin a book, I'm still able to focus on its contents.
Be wary of the Internet's effects, Carr warns us. He makes a convincing case that we are altering our brains with every ping and click-through. Shirky, on the other hand, celebrates the possibilities the Internet affords, for civic engagement, for collaboration, for emotional support, for innovation. Who is right? I'd suggest that both of them are. If these books represent an equation that must be solved, we have to ask ourselves which scale of intelligence is more critical to evaluate: the Internet's effect on us as individuals, or its effect on humanity as a whole.
We are all part of a bridge generation. We're also part of a massive experiment, and we don't yet know the outcome. But there's enough promise to all this for me to feel comfortable letting it play out, with maximum openness and flexibility, as Shirky suggests. At the same time, I have no doubt Carr is onto something too. We are altering how our minds work, and we are only at the beginning of a profound and unpredictable evolution.
Roper is managing editor of the Paris Review.