Whether our laptops, tablets and smartphones have made us smarter or dumber is a matter of endless debate and of scant but growing research. A new study grabs hold of an important corner of that question, finding that we have adapted the way we remember things to a world in which virtually everything is available on the Web.
The upshot: We actually remember facts better when we know we won’t have ready access to them by computer, smartphone or tablet. When we are confident that a bit of information will be available to us after seeing or hearing it, we are more likely to commit to memory how to access it again -- what keyword will lead us to it, or which folder on our desktop we stowed it in -- than to remember much about its actual content.
The study, published Thursday in the journal Science, describes four experiments designed to explore two things in particular: whether a random demand for a piece of information makes us think to do as our mothers always admonished and “look it up” on the Web; and, once we’ve found that piece of information, whether and how we store it away for later retrieval.
In three of those experiments, subjects--several dozen undergraduates from Harvard or Columbia University--were set before a computer and asked to read, type and then try to remember a set of intriguing factoids (such as, “an ostrich’s eye is bigger than its brain”). In some cases, the subjects were told they’d have later access to these odd nuggets of information, and if so, how. In other cases, subjects were led to believe that the factoids they were asked to remember would be erased after they typed them, making them unavailable for later reference.
One of the experiments clearly showed that those who believed the factoids would be unavailable to them later were more likely to commit them to memory than those who believed they could call them up at any time. Another experiment showed that, when subjects were told a factoid would be stored away for later reference in a folder, they were far more likely to remember the folder’s name when asked, say, to recall the factoid about the ostrich, than to remember the substance of the fact itself.
The three psychologists who authored the study (hailing from Columbia University, the University of Wisconsin and Harvard University) see its results as evidence not only that we have adapted our memory processes to the omnipresence of data readily available on the Web, but also of how we’ve adapted them: we’ve come to use our laptops, tablets and smartphones as a “form of external or transactive memory, where information is stored collectively outside of ourselves.”
They add, “We are becoming symbiotic with our computer tools, growing into interconnected systems that remember less by knowing information than by knowing where information can be found.”
So is this computer-induced pattern of behavior making our memories poorer? Is this why we’re less likely to pepper our speech with passages from Shakespeare or Voltaire--or why many no longer know their best friend’s phone number?
Joshua Foer, winner of the annual U.S. Memory Championship and author of the book “Moonwalking with Einstein,” notes that ancient scholars such as Cicero committed huge tracts of text to memory--and devised methods for doing so, such as the “memory palace”--because books were a rarity, usually savored only once. Researchers have established that present-day memory champions are no smarter than the average person, nor do they have better natural memories, Foer reports. But by using the techniques devised by the ancients, they can recall with perfect accuracy, say, the order of more than 1,500 digits after an hour’s preparation.
Foer, too, says we’ve “outsourced” our memories to our external devices, with the result that we no longer trust our own memories. “We’ve forgotten to remember,” he adds, but his own story of training to become a memory champion suggests that those skills can be relearned.
Columbia psychology professor Betsy Sparrow, lead author of the study, isn’t too worried either. Sparrow asserts that if her cellphone were to disappear forever today, she’d probably learn many of her friends’ and family’s phone numbers in short order. In some sense, she adds, humans have long exercised this type of information management: in the workplace and within families, certain individuals have always been the repositories of specific categories of information, allowing others to know they need not commit all those facts to memory because they can retrieve them just by asking the family or workplace oracle. And, of course, there were books and libraries.
“We’re still accessing other people as external memories: we’re just doing it on the Internet,” says Sparrow.
Now, all the world’s knowledge is stored somewhere on the Web. It’s a far vaster landscape of facts and information than we had at our fingertips before. With so much to retrieve, her study suggests, we may have to shift our memory strategies away from the facts themselves and toward remembering how to find them: what folder we filed them in, or what keywords led us to them in the first place.
Sparrow suggests that rather than making us stupider, the knowledge that everything we learn can be retrieved for later reference might make us smarter. Having a vast external memory at our fingertips, she says, might free us to spend more of our time and brainpower discerning patterns and interlocking themes in the facts presented to us, and less memorizing those facts. “Once you take away the effort it takes to remember the specific details, then we might be getting more out of the contextual stuff we read and look up,” says Sparrow.
Tell that to your mother the next time you lose your cellphone and have to explain that you can no longer remember her phone number.
How’s your memory, and can you make it better? See my recent blog post.