The setting: a gloomy metropolis in the not-too-distant future. Endless rows of identically clad proles sit before a giant screen, watching Big Brother rant about the virtues of group think. "We are one people with one will, one resolve, one cause!" he rails.
Then a blond female jock, hotly pursued by helmeted guards, races in, winds up and hurls a sledgehammer at the monstrous image. The Dear Leader's face explodes, sending a blast of illumination across the newly liberated masses.
"On Jan. 24, Apple Computer will introduce Macintosh," a stentorian voice intones. "And you'll see why 1984 won't be like '1984.'"
The minute-long spot, which ran during the 1984 Super Bowl, is of course one of the most influential in advertising's history. Created by L.A.'s Chiat/Day and directed by Ridley Scott, it transmitted an irresistible ideology for the dawning digital age. Technology would break the chains around human creativity. Information wanted to be free. Individual self-expression would trump mindless conformity.
For the next three decades, U.S. political and popular culture broadly embraced that message and rushed to embellish the mythos behind it. Software designers became the new American folk heroes. Silicon Valley entrepreneurs were hailed as prophets. Their techno-utopian allies in academia and journalism supplied the hallelujah chorus.
Then last May, documents leaked to the Guardian newspaper by Edward Snowden, a 29-year-old former CIA employee and National Security Agency contractor, revealed that the U.S. government was engaged in a massive data-gathering project, directed not only at potential terrorists and foreign heads of state but at its own citizens. This enterprise now faces judicial and congressional challenges.
Vacuuming up phone records, emails and much else besides, our national security establishment was using some of the same technology that delivers 24/7 celebrity trash and adorable pet videos for less beguiling purposes. And the government's spy partners, in many cases, were the same companies that had been selling us the giddy promise of a digitally leveled playing field and the sum of all human knowledge with a simple mouse click.
It was a sobering reminder that, as Orwell might've put it, some people's technology is more equal than others.
In the last six months, Snowden — like Julian Assange, Mark Zuckerberg and Steve Jobs before him — has become the personification of our fears and desires about the awesome power and potential downside of digital technology. Like Ralph Nader, who rattled America's love affair with the automobile when he published "Unsafe at Any Speed" in 1965, he has been lionized and demonized for bringing to light inconvenient truths about a prized technology.
His disclosures put the Obama White House and the national security establishment on the defensive and shook governments from Berlin to Brasília. They also added weight to critics' warnings about the dangers of an omniscient digital higher intelligence, which writers like Evgeny Morozov have likened to a postmodern deity, assembled by Silicon Valley's engineering priesthood and worshiped by Wall Street.
This counter-narrative had been building for years. As the initial phase of dot-com infatuation passed and memories of the Sept. 11 attacks faded, popular views of digital culture began to cloud with ambivalence. The shift was reflected in red-flag treatises like Jaron Lanier's "You Are Not a Gadget" and white-hot manifestoes such as Morozov's "To Save Everything, Click Here: The Folly of Technological Solutionism."
These assessments were put forward not by neo-Luddites but by cautious advocates of technology's many blessings. Their critiques differed from the serial whingeing of novelist Jonathan Franzen, who cranks out despairing essays calling Amazon founder Jeff Bezos "one of the four horsemen" of the apocalypse and berating Salman Rushdie for tweeting. (Rushdie's tweeted reply: "Enjoy your ivory tower.")
The noir view of an all-knowing, all-seeing technology was somewhat anticipated by popular culture. Movies as disparate as Tony Scott's spy thriller "Enemy of the State" (1998), about a cabal of renegade NSA agents; Steven Spielberg's "Minority Report" (2002), based on Philip K. Dick's sci-fi story about a techno-totalitarian society; and David Fincher's "The Social Network" (2010), which depicted Facebook's origins as a spiteful and sophomoric hacker stunt targeting Harvard women, showed how technology can be a seductive come-on that conceals the abuses of raw power.
This year's film that best expresses technology's tangled associations may be "Her," in which the protagonist played by Joaquin Phoenix falls in love with a computer operating system voiced by Scarlett Johansson.
"What makes 'Her' so potent is that it does to us what Samantha does to Theodore," critic Anthony Lane wrote in the New Yorker. "We are informed, cosseted, and entertained, and yet we are never more than a breath away from being creeped out."
Television also has tracked Americans' evolving feelings about the trade-off (some would call it a false binary) between security and privacy. Fox's espionage thriller "24," with its ticking time-bomb plot structure, implicitly justified the government's right to thwart bad guys by any means necessary. Showtime's "Homeland," which just concluded its third season on a characteristically chiaroscuro note, offered a more nuanced and troubled vision of how Uncle Sam acts on our behalf.
Snowden's revelations reverberated with other alarming global developments. The all-seeing, all-hearing Chinese government extended its crackdown on dissent, blocking Western websites as ruthlessly as it locks up the regime's domestic critics. Facebook, Google and their rivals found new ways of changing the rules of the social media game, making it harder for users to know which parts of their lives will stay private and which will be monetized to the highest bidder.
So-called native ads and branded content, derived from supposedly anonymous consumer profiles, further muddy the line between news and propaganda. Many Americans approved of computerized aerial drones when they were being used to wipe out Middle Eastern adversaries. Now they're not so sure they want them delivering holiday gifts to the family doorstep via Amazon.
Even online video games are being colonized for prying and profit. According to a 2008 NSA document, leaked by Snowden, the U.S. military trolls games like "World of Warcraft" and "Second Life," searching for potential evildoers hiding in plain sight among the pixies and skeleton avatars.
Author Rebecca Solnit recently summarized the emerging noir perspective in the London Review of Books: "We are moving into a world of unaccountable and secretive corporations that manage all our communications and work hand in hand with governments to make us visible to them. Our privacy is being strip-mined and hoarded."
Americans aren't likely to start building bonfires out of their iPods and smartphones in the sort of reflexively puritanical purge that Nathaniel Hawthorne satirized in his story "Earth's Holocaust." What some are calling for, more urgently, is a rethinking of the relationship between the powers and responsibilities of society and the rights of individuals.
Apple's "1984" ad suggested that the individual and the state were fundamentally at odds, in tension, but that digital technology could mediate between them. Over time, however, that aspiration has been whittled to a dangerously limited and naive ideal of digital culture as a mirror of the autonomous "I": my music downloads, my videos, my blog posts, my vacation photos, my selfies, My Self.
Now we're faced with the realization that cyber world never has been a private sanctuary, a free-enterprise idyll of the self and one's select "friends." It's a hive swarming with billions, among them eavesdroppers and voyeurs, spooks and con men. And its contents are never solely mine or yours but ours and theirs.

Copyright © 2014, Los Angeles Times