Virtual pioneer Jaron Lanier warns: Machines make bad masters

"People have to be able to make money off their brains and their hearts," Jaron Lanier was telling me. "Or else we're all going to starve, and it's the machines that'll get good."

It sounded a little bit like Dickens, and a little bit like a line from the "Terminator" movies. But it was all reality, coming from a true computing pioneer and one of modern technology's most insightful critics.

Lanier, 49, has spent years pondering the effect that the World Wide Web -- its ideology as well as its design -- has had on creativity, society and commerce.

The inquiry resulted in a book, or "manifesto" according to his own label, published this year titled "You Are Not a Gadget." The title should provide a clue that he hasn't found much to like.

Lanier is an imposing man with dark brown hair wound into long dreadlocks, and a way of throwing off incisive observations about technology and the world in a soft, almost apologetic tone of voice. His resume places him squarely in what he might himself term nerd royalty: He was an early pioneer in virtual reality, a term he "either coined or popularized," according to the bio on his home page.

A college dropout, he has been a visiting faculty member at Columbia University, the University of Pennsylvania and UC Berkeley. He composes music and has performed with Philip Glass and Ornette Coleman. He has founded or served as an executive of companies later acquired by Google, Sun Microsystems and Adobe; at the moment, his work is financially supported by Microsoft.

That background may be the reason his critique has the feel of something that could come only from a disenchanted ex-believer in the destiny of personal computing to improve our lives.

Almost everywhere he looks, he finds the dead hand of a Web culture increasingly dominated by advertising and aimed at imposing conformity.

The craze for social networking, for instance. Lanier's critique of sites such as Facebook has attracted wide notice in the online world, much of it uncomplimentary.

He told me when we met for an interview recently that he appreciates the way Facebook allows older users -- "people who already have lives" -- to connect with trends and people with similar interests.

But he fears that its tendency to inflict peer rule on teenagers will interfere with their natural inclination to find themselves by trying new and different selves on for size.

"America's Facebook generation shows a submission to standardization that I haven't seen before," he says. "The American adventure has always been about people forgetting their former selves -- Samuel Clemens became Mark Twain, Jack Kerouac went on the road. If they had a Facebook page, they wouldn't have been able to forget their former selves."

One of Lanier's main themes is our misplaced faith in the intelligence of machines and networks. There are the supposedly foolproof computer algorithms that lead Wall Street into one disastrous love affair after another with complex investment instruments we can't really control. We evaluate schoolteachers on the basis of their success in training pupils to pass standardized, machine-gradable tests.

Even the gold standard of artificial intelligence, the defeat of chess world champion Garry Kasparov by IBM's "Deep Blue" in 1997, is not the AI watershed it was once touted to be. The victory actually was the product of Deep Blue's gargantuan computing power, as even AI researchers acknowledge today. "Instead of a computer that thought and played chess like a human," Kasparov wrote recently, "they got one that played like a machine ... winning with brute number-crunching force."

Our enchantment with the device instead of with the human imagination that invests it with life is the focus of some of Lanier's most penetrating insights.

The Web, he writes, has fostered "a new kind of social contract" in which "authors, journalists, musicians, and artists are encouraged to treat the fruits of their intellects and imaginations as fragments to be given without pay" to Web aggregators like Google and YouTube.

Those companies collect advertising dollars by pushing these fragments out to the consumer, but none of the money goes back to the creators. In fact, the more fragmentary the content, the more it's appreciated -- think of the re-edited mash-ups of movie clips, often accompanied by amusing voice-overs, that attract millions of YouTube viewers.

Who pays the creators of the original clips for this usage? Not YouTube, and not you, the viewer. In return for the appropriation of their creativity, the creators are supposed to be pleased to receive "self-promotion," Lanier observes, not money.

"If it were only journalists and only musicians who were affected," Lanier told me, "we could say that's a loss, but we could correct for it by making new institutions, like nonprofits, to support the work. But it eventually affects everybody, because if there isn't more opportunity to make money off our brains, people will get poorer and poorer."

The solution, he argues, is to give everybody equal entry to the content world -- that is, let anyone post creative work online and find a way for them to collect a payment, even a micropayment, any time it's accessed by a user. "Unless everybody believes they at least have a shot at making money from content, they'll never buy into it," he says. "The first time somebody posts a comment to a news story and earns 50 bucks, they're going to be sold and they're not going to be a pirate anymore."

As it happens, digital content is trending the other way at the moment. Books, music and video are increasingly purveyed by merchants jealously guarding their own "walled gardens." Think about the Kindle and the iPhone. You can't purchase a book for the former or software for the latter that isn't approved by Amazon or Apple, respectively, and you can't transfer your purchase to an unapproved device.

Apple’s new iPad is another troubling iteration of this trend, Lanier suggests. It's a device that looks like it will be terrific for displaying purchased content but terrible for creating it.

"That represents letting go of one of the most beautiful parts of the original dream of personal computation," he says, nostalgically -- the dream that the computer would enhance creativity and make it instantly accessible to millions.

Is there hope for that dream yet? Lanier sees only one glimmer, which is that walled gardens eventually run out of room for growth. The current model, he says, "isn't sustainable in the long term, and it's not good for civilization."

Michael Hiltzik's column appears Sundays and Wednesdays. Reach him at michael.hiltzik@latimes.com, and follow @latimeshiltzik on Twitter.

Copyright © 2019, Los Angeles Times