Can artificial intelligence become a choreographer? Wayne McGregor brings AI to L.A.
From his early years choreographing in the 1990s, Wayne McGregor has been fascinated by the intersection of dance with science and technology.
The British choreographer based his 2008 work “Entity,” for his ensemble Company Wayne McGregor, on collaborative research with psychologists, neuroscientists and software engineers. McGregor tapped into a full sequence of his own genetic code for 2017’s “Autobiography,” and he has choreographed with drones — spherical orbs programmed by an algorithm — for an installation featuring his company and members of the Royal Ballet, where he’s the resident choreographer.
For one of his latest works, McGregor collaborated with Google Arts & Culture to develop an artificial intelligence-powered tool that creates original dance movement. The work, “Living Archive: An AI Performance Experiment,” makes its world premiere Friday at the Music Center’s Dorothy Chandler Pavilion.
It’s part of “Adès & McGregor: A Dance Collaboration” featuring composer-conductor Thomas Adès, the L.A. Phil, England’s Royal Ballet and Company Wayne McGregor.
McGregor choreographed the three works on the program, including 2010’s “Outlier,” a collaborative performance with the Royal Ballet and Company Wayne McGregor, and the world premiere of “The Dante Project (Inferno),” enacted by the Royal Ballet and inspired by Dante’s 14th century epic, the “Divine Comedy.”
But the structure of the AI-assisted work is more nebulous.
Set to Adès’ “In Seven Days,” a 2008 piece for piano and orchestra based on the biblical creation myth, the deliberately untitled work is “a philosophical meditation on how a dance is made,” McGregor said by phone. It explores questions like: What does it mean to choreograph? And can interesting choreography be made with help from artificial intelligence?
“That’s why I called it a performance experiment,” McGregor said. “We’re going to see how that plays out on stage.”
Creating an artificial intelligence system that could not only understand McGregor’s movement vocabulary but also create new choreography based on his style was a two-year process.
McGregor teamed up with Google engineers and creative technologists to train the algorithm, called “Living Archive,” using thousands of hours of video from the choreographer’s previous works over 25 years. It was a way of “activating the archives” and “hijacking its past,” McGregor said.
The technology also learned the distinct way each of McGregor’s 10 company dancers moved. Cameras captured dancers’ solos, detecting the forms of their individual poses, and the system then provided suggestions for the next choreographic sequences, displaying them — in the form of constellation-like stick-figure avatars — on a screen in real time.
McGregor compared the tool to predictive text, a technology that suggests words while typing on a phone. This choreographic catalyst is more sophisticated though, he said. “It takes the essence of what that dancer is doing — the shape, the position, the dynamic, the articulation of that body. Then it uses that information to develop the next potential phrase.”
Presented with options for possible sequences of movement, the dancer could then either use the phrase, interpret it in their own way or use it to inspire improvisation. “It’s a real recursive process between the dancer and the AI system,” the choreographer said.
McGregor gravitated toward less obvious choreographic choices from the tool — “the unusualness, the things that you don’t recognize,” he said. “We’re looking for surprise, we’re looking for the body misbehaving, we’re looking for errors, we’re looking for anomalies.”
The AI collaboration was like adding another dancer to his company, McGregor said. “It’s exploiting opportunities in the data you can never see yourself.”
The work is part of the long tradition of modern and contemporary choreographers turning to technology to create.
Throughout his 70-year career, Merce Cunningham embraced technology, using the software DanceForms as a choreographic tool in the 1990s. In 2005, Trisha Brown developed a 30-minute work using an artist-designed artificial intelligence program that responded instantly to dancers’ movements with animated graphics. And this year, Bill T. Jones partnered with Google’s Creative Lab to experiment with the company’s PoseNet program, a machine-learning model that can recognize the positioning of human figures in real time.
For some, the thought of artificial intelligence creating a dance work conjures a dystopian future where machines have replaced human artists. But McGregor didn’t seem too worried.
“It’s not a question about whether or not the AI is creative,” he said. “Because firstly, creative people have made it and [the tool] is creating really interesting solutions to physical problems.”
At a time when choreographers are planning how to carry on their life’s work long after they’re gone, McGregor envisioned a future in which a machine could sustain the legacy of his work 100 years from now.
“But is there a moment where the dances that the AI system makes are more interesting than the dances the humans make?” McGregor wondered. “I don’t know yet. But there is a very interesting potential.”
Adès & McGregor: A Dance Collaboration
When: 7:30 p.m. Friday-Saturday
Where: Dorothy Chandler Pavilion, 135 N. Grand Avenue, L.A.