MUSIC, COMPUTERS MERGE

Times Staff Writer

Marshall Otwell huddled over a computer keyboard in his tiny Laguna Beach office and peered intently at a lengthy list of commands displayed on his video display terminal.

His two office mates were heading out the door for lunch, and they asked Otwell to join them. “I’ve got a few bugs with this program I want to work out. I’ll meet you there in a little while,” Otwell said, scratching his head and continuing to stare thoughtfully at the screen.

A few days later in Santa Ana, John Moriarty anxiously flipped through the programmer’s manual for his Apple IIe computer, trying to discover why he couldn’t get a certain piece of data to print out. “I don’t understand why it’s not working,” he said, obviously embarrassed by the temperamental machine that refused to do his bidding. “Usually it prints fine.”

Welcome to the Brave New World of making music--1980s style.

Otwell and Moriarty are both accomplished pianists who are also working with relatively new technology merging computers and music. They are just two in the rapidly growing ranks of professional and amateur musicians who spend nearly as much time studying computer programming and debugging as they did in college learning music theory and piano technique.

This new technology is giving the individual musician unprecedented power and versatility to write, record and perform music.

“I shudder to think what the great composers of the past would have done with equipment like this,” said Bob Frye, a product specialist for Buena Park-based Yamaha International, one of the world’s largest synthesizer manufacturers.

The crux of all the excitement is an invention allowing electronic synthesizers to communicate with and control other instruments as well as share musical information with computers. Known as MIDI--an acronym for Musical Instrument Digital Interface--the technology was developed jointly by three synthesizer manufacturers and introduced in 1983.
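
In practical terms, MIDI carries no sound at all, only compact digital messages describing what was played. A note struck on one keyboard travels as three bytes: a status byte naming the message type and channel, then the note number and how hard the key was hit. The sketch below, written in Python purely as an illustration (the instruments themselves exchange these bytes over a dedicated 31,250-baud serial cable, not through a desktop script), assembles a “note on” and a matching “note off” message.

    # Minimal illustrative sketch of the 3-byte MIDI "note on" / "note off" messages.
    # Real gear sends these bytes over a 31.25-kbaud serial line.

    NOTE_ON = 0x90   # status nibble for "note on"
    NOTE_OFF = 0x80  # status nibble for "note off"

    def note_on(channel, note, velocity):
        """Build a note-on message for one of the 16 MIDI channels (0-15)."""
        return bytes([NOTE_ON | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

    def note_off(channel, note):
        """The matching note-off uses the 0x80 status byte and zero velocity."""
        return bytes([NOTE_OFF | (channel & 0x0F), note & 0x7F, 0])

    # Middle C (note 60) struck firmly on channel 1 (numbered 0 here):
    print(note_on(0, 60, 100).hex())   # -> 903c64
    print(note_off(0, 60).hex())       # -> 803c00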

MIDI has been accepted so rapidly--and almost universally--within the music industry that it’s almost impossible now to listen to a record, go to a concert or watch a movie or television program without hearing sounds created or processed by computers.

Although synthesizers have been in use since the 1960s, MIDI has added a new dimension by coupling them with the speed and programming power of computers. Much as word processing revolutionized typing, MIDI programs are giving musicians new ways to write, edit, shape, record and perform music.

“Throughout the history of music, we’ve seen new developments that gave musicians more power,” Frye said. “With MIDI, the improvement is geometrical rather than linear.”

For John Moriarty, the marriage of computers and synthesizers is allowing him to become a one-man music production company. At night, he still plays an acoustic piano in local hotels and nightclubs. By day, Moriarty can single-handedly record, electronically edit and score his compositions for full orchestra, both for himself and as a service to other musicians. And he does it in a fraction of the time it would have taken only a few years ago.

Marshall Otwell, who has performed in Southern California jazz clubs for years, is putting his musical knowledge to work at Third Street Software in Laguna Beach, one of about 20 or 30 firms worldwide creating MIDI software. Along with partners Len Sasso and Tim Ryan, Otwell is writing programs he hopes will help musicians and computers become comrades rather than adversaries.

“Music will be limited only by imagination rather than technical capacities,” said Len Sasso, “and people who are technically limited will be able to make better music.”

For the average musician, the union of computers and synthesizers is opening up new musical frontiers. And with increasingly sophisticated keyboards that have built-in microprocessors to store pre-programmed sounds, a performer no longer has to be an electronics wizard to use synthesizers.

One manufacturer introduced a new system this year that allows one musician to control up to 16 pre-programmed synthesizers from a single keyboard to create complex sounds and effects that previously would have required a small army of musicians.
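
The figure of 16 is not arbitrary: every MIDI message carries a 4-bit channel number, so one master keyboard can address as many as 16 receiving instruments on a single chain of cables. A hypothetical Python sketch of a single keystroke being fanned out to several channels:

    # Hypothetical sketch: one keystroke fanned out to several synthesizers,
    # each listening on its own MIDI channel (the 4-bit field allows 16).
    def layered_note_on(note, velocity, channels=(0, 1, 2, 3)):
        messages = []
        for ch in channels:
            # 0x90 is "note on"; the low nibble selects the channel.
            messages.append(bytes([0x90 | ch, note & 0x7F, velocity & 0x7F]))
        return b"".join(messages)

    # One middle C becomes four note-on messages, one per instrument.
    print(layered_note_on(60, 100).hex())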

Another boon MIDI technology holds in store for the average musician is a significant reduction in recording studio costs. Because keyboard parts can be recorded and edited on a home computer, musicians can enter the studio with a floppy disk containing finished basic tracks for their songs, leaving only vocals to work on while the recording studio’s meter is running. And because prices are dropping as the technology improves, the equipment is within reach of an increasing number of people.
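
What actually travels on that floppy disk is not audio but a compact list of timed note events that a sequencer program can edit and play back through the synthesizers. A toy Python illustration of the idea, using a file layout invented here for the example (commercial sequencers of the day each used their own formats):

    # Toy illustration of a sequencer track: timed note events, not recorded sound.
    # The CSV layout is invented for this example, not a real sequencer format.
    import csv

    track = [
        # (beat, channel, note, velocity, duration_in_beats)
        (0.0, 0, 60, 100, 1.0),   # C
        (1.0, 0, 64, 100, 1.0),   # E
        (2.0, 0, 67, 100, 2.0),   # G, held for two beats
    ]

    with open("basic_track.csv", "w", newline="") as f:
        csv.writer(f).writerows(track)   # a few bytes per note, easily floppy-sized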

“For under $600 you can buy a cheap computer, software and a MIDI synthesizer and have a complete setup,” said Lachland Westfall, managing editor of an informational bulletin published by the International MIDI Assn., a users group with about 2,400 members worldwide. At the other end of the scale are the extremely sophisticated instruments used by Stevie Wonder and other top recording artists costing $200,000 or more.

In addition to writing and editing songs electronically, musicians are able to control an increasing number of synthesizer functions via computer.

Third Street Software’s Tim Ryan demonstrated one program that provides a graphic representation on a computer video screen of all the switches and dials for a particular synthesizer. From a computer keyboard, Ryan moved the cursor to different controls displayed on the video monitor and punched various commands, altering the synthesizer’s sound, volume, pitch and virtually anything else that would normally be done with a mechanical dial on the instrument.

“Since the (synthesizer) manufacturers don’t have to devote all that space inside for controls, they can make the synthesizers much more sophisticated,” Ryan said. While computerized controls are still less convenient than keyboard-mounted dials, he believes the extra versatility more than compensates for the inconvenience.
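
Behind those on-screen dials, a program like the one Ryan demonstrated can drive many routine adjustments with ordinary MIDI “control change” messages; deeper patch editing relies on manufacturer-specific system-exclusive data. A minimal Python sketch of the control change portion, with channel volume chosen here purely as an example:

    # Sketch of a MIDI "control change" message, the kind an on-screen dial
    # might send; controller number 7 is the standard channel-volume control.
    def control_change(channel, controller, value):
        return bytes([0xB0 | (channel & 0x0F), controller & 0x7F, value & 0x7F])

    print(control_change(0, 7, 90).hex())   # set channel 1's volume to 90 of 127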

Among the other recent developments in MIDI-related equipment are “digital samplers” that permit synthesizers to record and subsequently emulate and electronically manipulate any sound--musical or non-musical.

“The advantage of this is that it allows you to do the impossible,” said musician Jimmy Cox, who uses synthesizers extensively in creating music for CBS-TV’s “Airwolf” series. “Take the chime of Big Ben--it’s now tunable, bendable at will. Anything that makes a sound becomes a chromatic instrument.”
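
The “tunable” chime Cox describes comes down to replaying a stored recording at a different rate: speed it up and the pitch rises, slow it down and it falls. A rough Python sketch of that idea using linear interpolation (samplers of the period did this in dedicated hardware, and the short waveform below is only a stand-in):

    # Rough sketch of repitching a stored sample by resampling it.
    def repitch(samples, semitones):
        ratio = 2 ** (semitones / 12)          # frequency ratio for the shift
        length = int(len(samples) / ratio)
        out = []
        for i in range(length):
            pos = i * ratio                    # read position in the original
            lo = int(pos)
            hi = min(lo + 1, len(samples) - 1)
            frac = pos - lo
            out.append(samples[lo] * (1 - frac) + samples[hi] * frac)
        return out

    chime = [0.0, 0.5, 1.0, 0.5, 0.0, -0.5, -1.0, -0.5]   # stand-in waveform
    print(repitch(chime, 12))   # one octave up: half as many samples, double speed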

MIDI equipment also has tremendous potential for applications in music education.

“For creative hobbyists or people who want to make music but haven’t spent 15 years studying music, this can help them learn music quickly,” said Nancy Kerwin, communications director at Roland Corp., another major synthesizer manufacturer.

Westfall described a device known as a “pitch-to-MIDI converter” that permits non-electronic, or “pitched,” instruments such as guitars, saxophones and trumpets to control a synthesizer keyboard. “A guy who is a great guitar player can now write parts for keyboard, without having to take time to master keyboard technique,” Westfall said.
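
The conversion Westfall describes rests on a simple relationship between frequency and MIDI note number: each semitone multiplies the frequency by the twelfth root of two, and the A above middle C (440 Hz) is fixed at note 69. A brief Python sketch of that mapping, leaving aside the genuinely hard part, detecting the pitch in the incoming audio:

    # Map a detected frequency to the nearest MIDI note number.
    # A440 is MIDI note 69; each semitone is a factor of 2 ** (1/12).
    import math

    def freq_to_midi_note(freq_hz):
        return round(69 + 12 * math.log2(freq_hz / 440.0))

    print(freq_to_midi_note(440.0))   # 69, the A above middle C
    print(freq_to_midi_note(196.0))   # 55, a guitar's open G string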

Other non-musical uses of MIDI are just beginning to be explored.

“We are taking MIDI out of the music business and applying it to the sound business,” said Tom Kobayashi, president of Glen Glenn Sound, one of the most prominent Hollywood sound production facilities. “Instead of the old method of doing sound effects editing on film, we’ll be able to do it electronically via computer. For sound editors there will be just as much work, but they’ll have to change their tools and have the ability to work with computers. But this is a new field even for us. We haven’t even scratched the surface yet.”

Some touted advantages of such equipment, however, stir up the age-old machine vs. man debate. “A machine is always there; it doesn’t get drunk; it isn’t late and it doesn’t complain about pay,” said Yamaha technical representative Kevin Bierl.

“This is probably the central issue of our time,” said Bernie Fleischer, president of the Los Angeles local of the American Federation of Musicians. “It cuts out a tremendous number of people who would normally be employed in the industry--not just musicians, but also copyists, orchestrators, setup people and on down the line.”

Added veteran composer Jerry Goldsmith, who often uses computers and synthesizers in creating his television and film scores: “It is never said, but the attitude is that it’s cheaper to do music electronically. Well, I did one all-electronic score (for the 1984 film ‘Runaway’), and it certainly wasn’t cheaper.

“I have the fear that people will get hold of electronic instruments, program a few chords and think that instantly they are composers. That approach is not improving the quality of music. They’re making a mistake if they think you need less of a musical foundation to work with computers. It shouldn’t be used as a crutch.”

The musicians’ union isn’t, however, trying to eliminate synthesizers and computers from the world of music, since the union’s membership also includes many musicians who play them.

“We are realistic enough to know that we can’t legislate progress out of existence,” Fleischer said. “As an adjunct to music in general, this is a powerful new tool and no musician or union is going to stand in the way of that legitimate purpose. But when it is used only for an economic advantage, that just results in junk food music.”

Producer-keyboardist George Duke uses synthesizers and computers regularly in studio work but objects to relying on pre-programmed material extensively in a concert setting.

“I’m a champion of live performance,” Duke said. “I think that’s what separates the men from the boys. I put in a lot of time developing the ability to play accurately and rhythmically correctly. I don’t want to play sloppy and let the computer clean it up. Computers can enhance a show, but if they become the show, it will become a serious problem.”

Representatives of equipment makers were generally cautious when discussing the use of synthesizers to replace human musicians. Still, most agree that the issue of cost savings cannot be dismissed.

“We never say that our equipment should be used instead of live musicians. It is made to be used as a supplement, not a replacement,” said Brad Naples, president of New England Digital, manufacturer of the Synclavier, one of the most sophisticated--and expensive--synthesizers in the world. “But the record and broadcast industries have got to make money. They’re not someone’s playground. As a business, they are always looking for ways to make a profit.”

Like most musicians, pianist and programmer Marshall Otwell is simultaneously enthusiastic about the creative potential offered by combining computers and music and wary about the price to be paid for the new technology.

“The thing that bothers me is when I think that 25 or 30 years ago the electric bass was introduced--now no one is learning how to play the acoustic bass,” Otwell said. “It’s big and bulky and it doesn’t have frets, but it just happens to have a big beautiful sound.

“The real issue whenever you add technology is that there is something gained and something lost. The musician has to be conscious of what is lost and control the up side and the down side. There’s always a trade-off.”
