From the Archives: ‘Blade Runner’ went from Harrison Ford’s ‘miserable’ production to Ridley Scott’s unicorn scene, ending as a cult classic

“Blade Runner 2049” features Ryan Gosling, Jared Leto and Harrison Ford.

Upon its initial release in 1982, Ridley Scott’s “Blade Runner” was a critical and commercial disappointment. Over time the film amassed a devoted cult following, and in 1992, upon the release of Scott’s director’s cut, Times film critic Kenneth Turan wrote a deep dive into the making of the film and its rediscovery. Twenty-five years later a sequel, “Blade Runner 2049,” will open in theaters nationwide. This article was originally published on Sept. 13, 1992.

Elegant cars gliding through a decaying infrastructure, the dispossessed huddling in the shadow of bright skyscrapers, the sensation of a dystopian, multiethnic civilization that has managed to simultaneously advance and regress — these are scenes of modern urban decline, and if they make you think of a movie (chances are they will), it can have only one name: “Blade Runner.”

Few, if any, motion pictures have the gift of predicting the future as well as crystallizing an indelible image of it, but that is the key to “Blade Runner’s” accomplishments. One of the most enduringly popular science-fiction films, it revived the career of a celebrated writer, helped launch a literary movement and set a standard for the artistic use of special effects that many people feel has never been equaled. And, until now, it has never been seen in anything like the form intended by the people who created it.

Starting this weekend, a full decade later than anyone anticipated, Ridley Scott’s original director’s cut of this moody, brilliant film is having its premiere engagement, opening in 60 cities nationwide, with another 90 to follow in three weeks. While classic revivals have become commonplace, the usual re-released versions offer either a technical improvement (Orson Welles’ “Othello”) or else a sprinkle of new footage (“Lawrence of Arabia”). This “Blade Runner” is a very different version, a cut that until two years ago no one even knew existed, and because of the film’s reputation and power it is intended by Warner Bros. to make some serious money.

Yet if this seems like a simplistic tale of good finally triumphing over evil, be aware that absolutely nothing about “Blade Runner” is as simple as it first seems. For this was a film that was awful to make, even by normal Hollywood standards of trauma, agonizing to restructure and rediscovered by a total fluke. The people who worked on it called it “Blood Runner,” a sardonic tribute to the amount of personal grief and broken relationships it caused, and they recall it with horror and awe.

Production designer Lawrence G. Paull remembers it as “a dream and a nightmare all at once.” Art director David L. Snyder, whose personal life was one of those that broke under the strain, remembers that, psychologically, “Ridley beat the hell out of me; beatings were in order all the time.” But he now looks on that time as the most intoxicating of his career, calling it “keeping up with the genius, like working with Orson Welles.” While star Harrison Ford considers “Blade Runner” his worst movie experience, co-star Sean Young calls it “still my favorite film.” Director Scott finds himself “progressively amazed” as interest in the film “gets bigger and bigger and bigger.” Yet it took veteran producer Michael Deeley, whose previous picture was the Oscar-winning “The Deer Hunter,” 10 years to find the enthusiasm to produce another theatrical film.

More than anything else, “Blade Runner’s” saga is, as the best Hollywood stories invariably are, a microcosm for the industry, starkly underlining how irredeemably deep the classic split between aesthetics and commerce is and also how painfully inevitable. As with an etching by Escher, the final decision on who the villains are here, or even if there are any villains at all, depends on your point of view.

The man who benefited the most, albeit posthumously, from “Blade Runner” was the man who started it all. When he died at the age of 53 in March, 1982, less than four months before the film’s premiere, he was, according to his agent, Russell Galen, looking forward to that event “like a kid on Christmas Eve.”

Philip K. Dick was one of the architects of modern science fiction. A passionate, emotionally unstable visionary, author of dozens of books and hundreds of short stories, he was, according to critic John Clute, “the first writer of genre science fiction to become an important literary figure.” As Richard Bernstein noted in a recent front-page New York Times Book Review piece (an august location the writer never expected to inhabit while he was alive), Dick articulated “our deepest fears and most persistent fantasies about technology and its potential to destroy us.”

These themes come out quite vividly in his 1968 novel, “Do Androids Dream of Electric Sheep?” The main character, Rick Deckard, a futuristic bounty hunter with an unhappy marriage, is offered the job of hunting down half a dozen Nexus 6 androids, or “andys”: synthetic human beings with four-year life spans who’ve escaped from Mars and are trying to pass as authentic humans on a bleak planet Earth.

Deckard agrees to the task because he wants the money to buy the environmentally plundered future’s ultimate status symbol, a live animal — specifically, a sheep. He meets Tyrell, the maker of the Nexus 6 robots, as well as his nominal niece, Rachael, who, Deckard discovers, is an android. He has a brief affair with Rachael, terminates the six andys and, after much philosophical speculation about how androids and humans differ and ruminations about a futuristic religion called Mercerism, returns to a somehow strengthened relationship with his wife.

When Hampton Fancher, an actor turned screenwriter in search of a project, called on Dick in his Santa Ana apartment in 1975, he wasn’t concerned about the writer’s place in literature. He didn’t care that “Androids” had been optioned three times before or that Dick thought Victoria Principal would make a perfect Rachael.

“Phil was crazy, wonderful; he’d stop and look at his hands for five minutes straight, like he was getting messages from Mars,” remembers Fancher, a striking man with a casually bohemian air. “But he didn’t like me. He kept insulting me, acting like I was Hollywood, some emissary from people with cigars.” No deal resulted, but five years later, when Brian Kelly, an actor friend of Fancher, was looking for a property to produce, Fancher, “just to put him off, knowing he’d be going up a blind alley,” sent him off to Phil Dick. But the two got on; Kelly got an option on “Androids,” and Fancher eventually became screenwriter.

Fancher’s drafts (he ultimately did eight) eliminated both Mercerism and the wife, upgraded Rachael to girlfriend and placed the “Androids” story in the dark, fatalistic world of film noir. “I wrote it for Robert Mitchum,” he says, “a wiped-out guy with scars and hangovers who got up and did his job. But there was no love in his life. He was missing part of himself, and he found it through contact with this woman. He found his heart by falling in love with the Tin Man.”

These drafts concluded with Deckard taking Rachael out of the city, letting her see nature for the first time, and then, because she has only a few days to live, shooting her in the snow.

While Fancher was writing, Kelly brought the project to the attention of the more experienced Michael Deeley, who in addition to “The Deer Hunter” had overseen dozens of films, including Sam Peckinpah’s “Convoy,” and had run British Lion and Thorn EMI. Deeley, a polished Briton, liked the novel, seeing it as “a thriller and a romance, like the Nazi commandant falling in love with the Jewish girl who’s supposed to be his victim.”

Deeley immediately thought of Ridley Scott, a filmmaker he’d known for a number of years. But Scott, a successful director of commercials whose only released film was the little-seen “The Duellists,” was in post-production with something called “Alien” and was not ready to commit to another science-fiction project. So the script, whose name kept changing from “Android” to “Animal” to “Mechanismo” to “Dangerous Days,” made the well-traveled Hollywood rounds.

Director Robert Mulligan, best known for the sweetly sentimental “To Kill a Mockingbird,” briefly became involved. “The romantic element was a lot softer then,” says Deeley by way of explaining what now seems like a curious choice. Mulligan never got beyond preliminary discussions, but by then, Scott, who had become an A-list director with the success of “Alien,” decided he was interested after all. In April, 1980, Filmways Pictures announced a $13-million budget for an as-yet-untitled tale of “technological terror.”

Scott liked Fancher’s dark take on the script. In fact, both men found their collaboration energizing. “For a writer it was awesome, really inspiring, a creative fun house,” remembers Fancher. “And Scott had a way of speaking in shorthand. ‘What’s out the window?’ he said one day. I told him I didn’t know. ‘Well, think about it,’ he said,” a brief dialogue that led eventually to the elaborately imagined future world that would become the film’s trademark.

It was Fancher who uncovered the name “Blade Runner,” taken from the title of an obscure work by William Burroughs. It was during his tenure that Dustin Hoffman was seriously considered for the role of Deckard. But Hoffman pulled out, and Fancher, after all those drafts, was replaced. “Ridley and I had had disagreements, but I thought I’d won the arguments,” he says with bemused irony. “I was so naive, I didn’t know that writers did what they’re told.”

David Peoples, the second writer on the project, had a background in the documentary field, including co-writing the moving Oscar-nominated study of J. Robert Oppenheimer, “The Day After Trinity.” He had also written seven or eight dark, futuristic spec scripts, none of them produced, that had come to the attention of director Tony Scott, Ridley’s brother (Peoples went on to write Clint Eastwood’s current “Unforgiven”).

Though excited by the opportunity, Peoples remembers being “totally bummed out” when he read Fancher’s last draft, telling Ridley Scott, “This is brilliant; there is nothing I can do to make it better.” But Scott, not for the last time, persevered. “He’s very demanding,” says Peoples. “He has something in mind and he goes after it.”

Scott had Peoples, in the writer’s words, “move away from Deckard in a lot of jeopardy to a plot involving clues, like ‘Chinatown.’ ” Peoples also worked on the humanity of Deckard’s adversaries, and, in fact, helped by his daughter, who told him about the biological term replicate, he came up with the androids’ new name: replicants. The change was necessary because Scott thought the sturdy science-fiction term android was a cliché and half-seriously decreed that anybody who used it on the set would get his head broken with a baseball bat.

Just as Peoples was starting to work, he was informed that “a bit of a hiccup” had developed. After having invested more than $2 million in the project, Filmways abruptly pulled out. This set off a frantic scramble to secure financing and distribution for the project, then slated to cost in the neighborhood of $20 million.

“For two weeks, Larry Paull and I did presentations to every studio in town,” remembers art director Snyder. “Ridley and Michael Deeley kept making the point that they weren’t trying to do ‘Star Wars,’ they were trying something else, and the distributors kept saying, ‘You should be so lucky as to do “Star Wars.” ’ “

Finally a complex, three-cornered deal was announced early in 1981. Though the Ladd Co. would release the film (through Warner Bros.), their financial stake would be fixed. According to Deeley, the Ladd Co. put in $8.5 million while foreign rights were sold to Hong Kong film mogul Run Run Shaw for another $8.5 million. To cover the rest, Deeley sent the script over to the three partners at Tandem Productions — Norman Lear, Jerry Perenchio and Bud Yorkin — to see if they were interested in the video and other ancillary rights.

“Jerry was a player; he’d been an agent and a boxing promoter, and he had excellent timing in buying and selling,” remembers Deeley. Yorkin had directed such features as “Divorce, American Style” and “Come Blow Your Horn.” Lear passed on the project, but Perenchio and Yorkin were interested. The pair decided, in Yorkin’s words, “Let’s take a flyer.”

Though they differ on the amount of money initially involved (Yorkin says it was $1.5 million, Deeley $4 million) both men agree on two points. First, without those dollars, however many there were, “Blade Runner” would never have been made. Second, Perenchio and Yorkin, in industry parlance, took the place of a completion bond company: If “Blade Runner” went over budget, they agreed to pay whatever it took to finish the picture. And that agreement gave them, not the Ladd Co. or Warner Bros. or even Ridley Scott, effective final cut of the movie.

Next, the casting fell into place. Harrison Ford, star of “Star Wars” and the as-yet-unreleased “Raiders of the Lost Ark,” was signed as Deckard. Sean Young, a 20-year-old actress with what cinematographer Jordan Cronenweth described as “wonderful, light, creamy, highly reflective skin,” had exactly the look Scott wanted; she was signed as Rachael. International star Rutger Hauer became Roy Batty, the leader of the replicant band Deckard was to track down.

Shooting was scheduled to begin on March 9, 1981, and last for 15 weeks, and in the beginning, all was sweetness and light. That first day, Yorkin sent Deeley a note: “I know that we are embarking upon a project that you have worked a long time on and that is going to be everything you have dreamed of.”

That Ridley Scott did not work in a way anyone on the crew had ever experienced became obvious the very first day of shooting. The elaborate set for the Tyrell Corp. office, complete with nearly 6,000 square feet of polished black marble and six enormous columns, was to be used first. “It was a very pristine set. Everyone was standing around in their socks,” production designer Paull remembers, “and Ridley walked in, took a look at the middle columns and said, ‘Let’s turn them upside down,’ ” a decision that meant a major delay.

“Ridley literally changed everything. I can’t think of one set we went into and shot the way we found it,” Snyder says. “It was brutal.” Adds Paull: “Working with him was the first time in my career as a designer that the paint was still wet as the cameras were rolling.”

Trained at London’s prestigious Royal College of Art, with extensive experience as a set designer, Scott directed thousands of commercials (including Chanel’s haunting “Share the Fantasy” spots). Even then, he had a reputation for possessing what production executive Katherine Haber describes as “an eye that was totally and utterly brilliant.”

“Most directors are hyphenates,” explains Snyder. “They can be actor-directors or editor-directors. Because Ridley was an art director-director, he spent the majority of his time with the art department.” In fact, when Snyder was first introduced as the film’s art director, Scott, in a hint of things to come, shot a look at the man and said simply, “Too bad for you, chap.”

A man who’d rather sketch than write, Scott was tireless in pursuit of ideas for the film’s look. The alternative comics magazine Heavy Metal was a major inspiration, and Paull and Scott screened everything from “Metropolis” to “Eraserhead,” “Citizen Kane” and “The Blues Brothers,” which inspired “Blade Runner’s” flaring gas fires.

As he had on “Alien,” where he’d worked with artist H.R. Giger, the director decided to bring in a conceptual illustrator to, as he put it in advertising terminology, “spitball with.” The man chosen was Syd Mead, an industrial designer with a futuristic bent who had created visuals for such companies as U.S. Steel and the Ford Motor Co. Originally hired merely to design the film’s cars, Mead put backgrounds in his sketches that intrigued Scott, and soon Mead, the director, Paull and Snyder were involved in conceptualizing the future.

Though Dick’s novel was set in 1992, the script had updated things to 2020 (finally changed to 2019 so it didn’t sound so much like an eye chart). Scott, who’d been attracted to the film because of a chance to design a city-oriented future, knew he wanted to avoid “the diagonal zipper and silver-hair syndrome” a la “Logan’s Run.” Based on his experiences with urban excess in New York and the Orient, “Blade Runner” was going to be the present only much more so, “Hong Kong on a bad day,” Scott says, a massive, teeming, on-the-verge-of-collapse city that the director at one point was going to call “San Angeles.”

“This was not a science-fiction film so much as a period piece,” Paull explains. “But it would be 40 years from now, not 40 years ago.”

The key design concept came to be called retrofitting, the idea being that once cities start to seriously break down, no one would bother to start new construction from scratch. Rather, such essentials as electrical and ventilation systems would simply be added onto the exteriors of older buildings, giving them a clunky, somehow menacing look. Progress and decay would exist hand in hand, and the city’s major buildings, like the massive, Mayan-inspired pyramid that houses the Tyrell Corp., would tower miles above the squalor below.

Though Mead and the director were involved in conceptualizing the future, the task of actually building it fell to Paull and Snyder. “Our job was not just design or dreaming,” recalls Snyder, “it was to stand something up for principal photography.” Which meant, among other tasks, renting some of the neon signs from the recently completed “One From the Heart” and salvaging spare parts from an Air Force base in Tucson. “Syd would do these illustrations, but you had to do it, you had to finesse it if things didn’t work,” Paull adds. “That was the tough, tough part.”

The New York street, on the back lot of what was then the Burbank Studios, was built for Warner Bros. in 1929. Once populated by Humphrey Bogart and James Cagney, it became the arena for Scott’s painstaking artistry. Someone who sees a film as “a 700-layer cake,” Scott is, in producer Deeley’s apt phrase, “a pointillist, creating things out of masses of tiny dots, like Seurat.”

Almost immediately, Scott began the time-consuming, very gradual piling of precise detail upon precise detail. The never-to-be-seen magazines on the barely glimpsed newsstands had futuristic headlines such as “Guard Dogs You Never Feed.” Hundreds of highball glasses were examined before a single one was selected as a minor prop. Screenwriter David Peoples remembers sinking into a chair in Deckard’s apartment and realizing with a jolt that “this was not like a movie set, this was like somebody’s apartment, like somebody lived there. It was stunning that way.”

Because much of the film was shot at night (partly for the look, partly to save the expense of hiding the Burbank hills), a 24-hour art department had to be maintained. “Ridley and I would walk the street every morning at dawn,” Paull recalls with something like a shudder. “He’d say, ‘Larry, why don’t we take this and do that with it.’ Then he’d leave with a twinkle in his eye, and I’d pull together 15 or 20 guys so that by the time he walked on the set that night, it would be done.”

With so much attention paid to the visuals, it was inevitable that the actors would get shorter shrift. Edward James Olmos, who played a policeman, welcomed the opportunity to be left on his own to create a street language for his character, and Rutger Hauer was happy to be allowed to improvise some of his dialogue, resulting in his wonderful closing line about memories being “lost in time, like tears in the rain.” But M. Emmet Walsh, who also played a policeman, complained to Snyder that “by the time you guys get finished lighting, we’re lucky if we have time for three takes.” Ford was, by several accounts, frustrated to be dealing with a director who was, as one observer put it, “happier to be sitting on a crane looking through the camera than talking to him.”

Invariably, though, dealing with Scott was hardest on “Blade Runner’s” crew. “His view of what was finished work,” Deeley explains, “was different than everyone else’s. If there was something not right in the top right-hand corner, the crew would say, ‘No one’s looking up there.’ But Ridley was looking up there.”

And, says Snyder, “When you didn’t get it with Ridley, you were gone.” The original physical-effects people were fired just before principal photography commenced; the original set decorator was dismissed because Scott didn’t like the look of some crucial department-store windows. “To get the detail I wanted to get,” Scott said in a post-shooting interview, “you do become a relatively unpopular fellow.”

What brought to a head all this pressure, compounded by the threat of a directors’ strike, was an impolitic interview Scott gave to the Manchester Guardian. “He said how much more he enjoyed working with English crews; they all called him ‘Guv’nor,’ and did what he wanted,” remembers Katherine Haber. “A copy of the story was left in Ridley’s trailer, and by the next morning 150 copies had been Xeroxed and distributed.” Almost immediately, the crew declared a T-shirt war.

Though everyone involved remembers the shirts slightly differently, the likeliest scenario seems to be that the challenge “Yes Guv’nor My Ass” appeared on the front, with the sentiment “Will Rogers Never Met Ridley Scott” emblazoned on the back. To retaliate, and to open lines of communication, the British members of the production team (Scott, Deeley and Haber) came back with shirts that insisted, “Xenophobia Sucks.”

Over at Entertainment Effects Group, the special-effects house then run by Douglas Trumbull (who’d become a legend through his work on “2001: A Space Odyssey”) and Richard Yuricich, Scott’s perfectionism was also taking its toll. “He drove the effects people crazy. At the end they were ready to lynch him,” reports Don Shay, editor of Cinefex magazine, who devoted an entire issue to the effects created by Trumbull, Yuricich and David Dryer. “Not only did he beat them to death, it didn’t bother him to take a shot that cost a quarter of a million dollars and say, cavalierly, ‘It didn’t work as well as I thought it would. I’m cutting it.’ ”

Though “Blade Runner’s” special effects are dazzling, they were hardly, Trumbull says today, a result of extraordinary expenditure of money. “It was the lowest budget we had ever seen, less than a third of that allocated for ‘Star Wars’ or ‘Close Encounters,’ ” he says. “We had no money to invent new gizmos, so we took a very conservative approach,” doing things like reusing the mold for the spaceship from “Close Encounters” as the landing dock on the roof of “Blade Runner’s” police station.

The effects were memorable for two reasons. One was the unusually close coordination between the effects and the live-action photography, what Trumbull calls “one of the most seamless linkups ever,” ensuring a unified look for the entire production. The other was simply Scott’s eye. “It’s almost trite to say these days, but Ridley Scott had a vision, and ‘Blade Runner’ is probably the first and only science-fiction art film,” Shay says.

Scott refused to be rushed. Haber recalls that “he’d fiddle and diddle until it was perfect.” But Scott defends his actions to this day. “In a way, directors ought to get tunnel vision when they’re doing a film, or they shouldn’t be doing the job.”

Hardly that philosophical about the situation were Bud Yorkin and Jerry Perenchio, the men who had contracted to pay whatever it took to finish the film. Deeley describes what he considered “panic from the front office, from Bud Yorkin, who somehow felt as a filmmaker himself that there should be a way to restrain the costs. But he was a meat-and-potatoes, ‘a-picture-is-a-picture’ guy, not on the same creative wavelength as Ridley.”

Yorkin, a silver-haired man with an air of melancholy, sees it differently. “Jerry and I didn’t go into this naively. We knew it would be a very difficult shoot, and we left ourselves a pad of $1.5 million to $2 million,” he says. But as the amount of money the pair had to put into the picture rose to something like $5 million, the frustration level escalated.

“We’re not a studio, but unfortunately we were placed in the position of the heavy that a studio would take,” Yorkin says, still irritated. “We were two guys taking it out of our own pockets or going to the bank and borrowing it ourselves. Going on the set and watching someone take five hours longer to set up a shot, seeing a lot of money go out of your pocket, that kind of thing one doesn’t need unless you have a very good heart.”

The time is early 1982, the cities Denver and Dallas, and the feeling is one of happiness and anticipation as movie fans open their newspapers and see advertisements announcing the sneak preview of a science-fiction epic starring Harrison Ford. The mood inside the theaters is cheerful and expectant, for both of Ford’s previous films have conditioned audiences to expect a lighthearted, action-oriented romp. Instead, the lights go down and what appears is something entirely different.

According to one source, the preview cards filled out after both screenings told the same story: “This was a film that made demands on an audience that wasn’t expecting a movie that made demands on them, an audience somewhat befuddled by the film and very disappointed by the ending.” It wasn’t so much that people actively disliked “Blade Runner,” they were simply unprepared for it. Another crisis had arrived.

Though one participant emphasizes that overall “the cards were good, but not through the roof,” Yorkin saw it differently. “After so much talk, so much anticipation about the film, the cards were very disappointing. We all were in a state of shock.”

It was at this point that changes in the film’s structure were decreed. And though Yorkin says the changes made to the film were a group decision involving the Ladd Co., Warner Bros., Perenchio and the filmmakers, writer Fancher is not alone when he says angrily, “Perenchio and Yorkin came in and shoved people around. They brought in the money that was missing at the end, but they took more than their pound of flesh.”

First, an extensive voice-over was added to help people relate to Harrison Ford’s character and make following the plot easier. According to Haber, after a draft by novelist-screenwriter Darryl Ponicsan was discarded, a TV veteran named Roland Kibbee got the job. As finally written, the voice-over met with universal scorn from the filmmakers, mostly for what Scott characterized as its “Irving the Explainer” quality.

“You’re looking at a red door, and it’s telling you it’s a red door,” says film editor Terry Rawlings. “It was ludicrous.” It sounded so tinny and ersatz that, in a curious bit of film folklore, many members of the team believe to this day that Harrison Ford, consciously or not, did an uninspired reading of it in the hopes it wouldn’t be used. And when co-writers Fancher and Peoples, now friends, saw it together, they were so afraid the other had written it that they refrained from any negative comments until months later.

The film’s ending was equally troublesome. Scott had wanted the film to end on the nicely enigmatic line, “It’s too bad she won’t live. But then again, who does?” as an elevator door closed in front of a fleeing Deckard and Rachael. Scott had also decided he wanted to leave the viewer with a hint that Deckard himself was a replicant. So he had Deckard notice a small origami unicorn on the floor, a unicorn that would hark back to a unicorn dream that he had earlier in the film, making him realize that his very thoughts were programmed.

None of the Ladd Co. executives or Yorkin were impressed. “You try and explain to some executive what thoughts are,” growls Rawlings. “They don’t have any.”

“Is he or isn’t he a replicant? You can’t cheat an audience that way. It’s another confusing moment,” Yorkin says. And so the unicorn dream was never used, and a new, more positive ending line, revealing that Rachael was a replicant without a termination date, was written. To indicate the joy the happy couple had in store for them, scenes of glorious nature were to be shot and added on, but attempts to get proper footage in Utah were foiled by bad weather. Instead, contact was made with Stanley Kubrick and, remembers Rawlings, they ended up with outtakes from “The Shining”: “Helicopter shots of mountain roads, the pieces that are in all the ‘Blade Runner’ prints you see everywhere.”

Though he was far from happy with the changes, especially the loss of his beloved unicorn scene, Scott, surprisingly, did not kick up a major fuss. “It was the first time I’d experienced the heavy-duty preview process,” Scott recalls, “and I was so daunted by the negative or puzzled reaction, I didn’t fight it. I thought, ‘My God, maybe I’ve gone too far. Maybe I ought to clarify it.’ I got sucked into the process of thinking, ‘Let’s explain it all.’ ”

With the voice-over and new ending, “Blade Runner” tested better, and the Ladd Co. planned to open it on June 25, the company’s “lucky day” when both “Star Wars” and “Alien” had debuted. True, another science-fiction film, a little picture from Universal, was opening a month earlier, but, says Deeley with a wan smile, “we all thought that ‘E.T.’ would be out of business in a few weeks, that people would be sick of that sentimental rubbish and be looking for something a little harder-edged. It didn’t quite work out that way.”

Although “Blade Runner” opened strongly, it was not embraced by the critics, who took special offense at that voice-over (“Should Be Seen Not Heard” read one headline). And while “E.T.” went on to become the highest-grossing film of all time, earning more than $300 million, “Blade Runner” returned only $14.8 million in rentals. Viewers were reluctant to embrace the film’s dark genius, and it gradually disappeared. “It was painful to see it happen,” Rutger Hauer says. “A film that unique pulled out of theaters.” Even “Blade Runner’s” legendary visuals could not stand up against the elfin, feel-good lure of its strongest competitor, losing the visual-effects Oscar to “E.T.”

When a film dies in Hollywood, no one expects it to come back to life, and the people around “Blade Runner” were resigned to its demise. Michael Deeley felt “so depressed I don’t think I ever saw it, never sat through it until the end.” Having successfully fought a bitter battle with Warner Bros. and the Ladd Co. to release Dick’s original novel instead of a quickie novelization as the official tie-in book, agent Russell Galen felt “that was the end of that.”

But there were still more twists to this plot. “Blade Runner’s” availability on video kept it alive in the eyes of the always loyal science-fiction crowd, and gradually, over time, the film’s visual qualities and the uncanniness with which it had seemed to see the future began to outweigh its narrative flaws. Scott says he saw the interest rise, “And I thought, ‘My God, we must have misfired somewhere; a lot of people like this movie.’ ” And not just in this country. In Japan, where the film had always been successful, “I was treated like a king,” art director Snyder reports. “The fans would be too in awe to even look at you.” The film’s look began to show up in art direction and design: Terry Gilliam’s “Brazil” and the stage design for the Rolling Stones’ Steel Wheels tour were influenced by “Blade Runner.” And when laser discs appeared on the market, “Blade Runner” was one of the films that everyone just had to get. It became Voyager’s top-selling disc immediately upon its release in 1989, never losing the No. 1 spot.

“Blade Runner’s” influence has been literary as well. Many people who saw the film ended up reading the novel, making it Dick’s top seller and, according to Galen, sparking the Philip K. Dick renaissance of the 1980s. Dick’s “We Can Remember It for You Wholesale” became the basis for Arnold Schwarzenegger’s “Total Recall,” and “Blade Runner,” along with “The Road Warrior” and “Escape From New York,” is considered a key progenitor of the latest wrinkle in written science fiction, the darkly futuristic cyberpunk movement.

Yet, if anyone thought about Scott’s original cut of “Blade Runner” while all this was going on (and some people did), the accepted wisdom was that it no longer existed. That’s what Michael Arick thought when he took over as Warner Bros. director of asset management in 1989, in charge of recovering and restoring material on the studio’s films. Then, in October of that year, there occurred the first of a series of fluky events that would rearrange fate.

“I was in the vault at Todd-AO’s screening room, looking for footage from ‘Gypsy,’ when I stumbled on a 70-millimeter print of ‘Blade Runner,’ ” Arick remembers. “What had probably happened was that no one had remembered to have it picked up after a screening. In order to save it from collectors, I hid it on the lot.”

Several months later, the management at the Cineplex Odeon Fairfax theater was in the midst of a classic-film festival featuring 70-millimeter prints. Having heard through the print grapevine that a 70-millimeter version of “Blade Runner” had been spotted, the Fairfax asked for it from Warner Bros. Arick, a supporter of revival theaters, agreed. But neither the Fairfax nor Arick (who had never screened the print in its entirety) knew what they had on their hands.

All this changed dramatically one morning in May. “Anyone who gets up for a 10 a.m. Sunday screening of ‘Blade Runner’ really knows the film,” Arick says, “and everyone knew immediately what they were watching. The audience was very rapt from the beginning; the atmosphere was incredible.” The print, almost devoid of the voice-over and lacking the tacked-on ending, was closer to Scott’s original version than anyone ever thought they’d see again.

Though there was an immediate stir in the film-buff community, Warner Bros. wasn’t sure what to do with this new-old version. Scott came over to see it and told Arick that it was in fact not his final cut: The unicorn scene that he had come to love was still missing, and the music over the climactic fight scene was not film composer Vangelis’ work but temporary music lifted from Jerry Goldsmith’s score for “Planet of the Apes.” The two talked about the possibility of adding the unicorn footage, technically known as a “trim,” which was languishing in a film-storage facility in London.

What happened instead was that Arick and Warner Bros. parted company (although he continued to advise Scott), and the studio contacted Gary Meyer, executive vice president of the Landmark theater chain, which had earlier expressed interest in “Blade Runner,” and asked if he still wanted to show it. Meyer was enthusiastic; 15 theaters nationwide were booked, including the Nuart in West Los Angeles, and without knowing it wasn’t quite true, Warner Bros. created a campaign advertising “The Original Director’s Version of the Movie That Was Light Years Ahead of Its Time.”

Scott was not pleased. “As I understand it, he said, ‘This is not my version,’ which left Warner Bros. in a real dilemma,” reports Meyer. “My intuition is that the studio, which might want to hire him in the future, didn’t want to alienate him over some two-week repertory booking.” So a compromise was reached. The newly discovered version of “Blade Runner” would play in the Nuart and at the Castro in San Francisco, but nowhere else.

With little publicity, “Blade Runner” opened at the Nuart last September, and immediately attendance went through the roof. The first week set a house record, and the second week bettered the first. When Hampton Fancher, whose screenplay had started it all, tried to get in, he even showed his passport at the box office to prove who he was. But there was absolutely no room at the inn.

The same pattern of success repeated at the Castro, where its $94,000-plus box-office take in one week made it the top-grossing theater in the country. Encouraged by this, and by lucrative showings of the old voice-over version of “Blade Runner” in Houston and Washington, Warner Bros. agreed to pay for the technicians and editing rooms so that Scott could put the film back just the way he wanted it. Which is why, on a weak telephone connection from London a few months ago, there was quiet satisfaction in the director’s voice when he said, “I finally got me unicorn scene. Ten years later, but I got it.”

The tale of “Blade Runner” turns out to be a curious one. No one went bankrupt, no one’s life was ruined beyond repair, no one never worked in this town again. But the experience illuminated the oldest of Hollywood battles, the one about how much tribute must be paid to art in a multimillion-dollar business where money is always the bottom line. Movie executives have always tried to reshape films, often destroying whatever artistic merit was on the screen — and in the end, ironically, the mutilated versions don’t make any more money than the originals would have. So the dispute remains contentious, even though everyone agrees that “Blade Runner” was so far ahead of its time that it wouldn’t have been a major hit even if not a frame had been altered.

Still, this was for almost all involved the project of projects, the one that no one has forgotten and that everyone sighs the deepest of sighs over. “Everything on ‘Blade Runner’ was a little bigger, a little better,” says Rutger Hauer wistfully. “You can only be a genius so many times in your life.”
