
You’ll never dance alone with this Artificial Intelligence project

29 April 2016

This project allows people to get jiggy with a computer-controlled dancer, which “watches” the person and improvises its own moves based on prior experiences.

Dancers perform in Georgia Tech's geodesic dome. (Credit: Georgia Tech)

When the human responds, the computerised figure or “virtual character” reacts again, creating an impromptu dance couple based on artificial intelligence (AI).

The LuminAI project is housed inside a 15ft-tall geodesic dome, designed and constructed by Georgia Tech digital media master’s student Jessica Anderson and lined with custom-made panels for dome projection mapping. The surfaces allow people to watch their own shadowy avatar as it struts with a virtual character named VAI, which learns how to dance by paying attention to which moves the current user (and everyone before them) performs, and when. The more moves it sees, the broader the computer’s dance vocabulary. It then uses this vocabulary as a basis for future improvisation.

“Co-creative artificial intelligence, or using AI as a creative collaborator, is rare,” said Brian Magerko, the Georgia Tech digital media associate professor who leads the project. “As computers become more ubiquitous, we must understand how they can co-exist with humans. Part of that is creating things together.”

The system uses Kinect devices to capture the person’s movement, which is then projected as a digitally enhanced silhouette on the dome’s screens. The computer analyses the dance moves being performed and leans on its memory to choose its next move.

“This episodic memory is filled with experiences of how people have danced with it in the past,” said Mikhail Jacob, a computer science Ph.D. student and lead developer of the LuminAI technology. “For example, the computer learns to predict that when one person pumps their arms into the air, their partner is likely to do something similar. So on seeing that movement, the avatar might pump its arms sideways at the same pace or use that as the basis for its response.”
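LuminAI’s source isn’t public, so the following is only a minimal sketch of the episodic-memory idea Jacob describes, under the simplifying assumption that moves are discrete labels (the names `EpisodicMemory`, `observe`, `respond` and the move labels are hypothetical). The memory records which responses have followed each observed move, then answers a new observation by sampling from those past responses, so frequent pairings are predicted more often:

```python
import random
from collections import defaultdict

class EpisodicMemory:
    """Toy episodic memory: records which response moves have followed
    each observed move, then predicts by sampling from those episodes."""

    def __init__(self):
        # observed move -> list of responses seen after it
        self.episodes = defaultdict(list)

    def observe(self, seen_move, response_move):
        """Store one experience: a partner answered seen_move with response_move."""
        self.episodes[seen_move].append(response_move)

    def respond(self, seen_move):
        """Pick a response; moves observed more often are chosen more often."""
        responses = self.episodes.get(seen_move)
        if not responses:
            return "idle"  # no experience with this move yet
        return random.choice(responses)

memory = EpisodicMemory()
memory.observe("arm_pump_up", "arm_pump_side")
memory.observe("arm_pump_up", "arm_pump_up")
memory.observe("spin", "spin")

print(memory.respond("arm_pump_up"))  # one of the learned arm-pump responses
```

In the real system the “moves” would of course be continuous Kinect skeleton trajectories rather than labels, but the lookup-and-sample structure is the same idea in miniature.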

The team says this improvisation is one of the most important parts of the project. The avatar recognises patterns, but doesn’t always react the same way every time. That means that the person must improvise too, which leads to greater creativity all around. All the while, the computer is capturing these new experiences and storing the information to use as a basis for future dance sessions.
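One way to picture that variability (again a hypothetical sketch, not LuminAI’s actual code): instead of replaying a remembered move exactly, the avatar could apply a small random transformation to it, such as mirroring the direction or nudging the tempo, so the same stimulus rarely produces an identical response:

```python
import random

# Hypothetical move representation: (name, direction, tempo multiplier)
def vary_move(move, rng=None):
    """Return a variation of a learned move rather than an exact repeat:
    randomly mirror the direction and nudge the tempo by up to 20%."""
    rng = rng or random.Random()
    name, direction, tempo = move
    if rng.random() < 0.5:  # half the time, mirror left/right
        direction = {"left": "right", "right": "left"}.get(direction, direction)
    tempo = round(tempo * rng.uniform(0.8, 1.2), 2)  # slight speed change
    return (name, direction, tempo)

base = ("arm_pump", "left", 1.0)
print(vary_move(base))  # same move name, possibly mirrored and re-timed
```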

“Humans aren’t fully in the driver’s seat anymore. The process gives autonomy back to the computer,” said Jacob. “LuminAI forces a person to create something new – potentially something better – with their partner because they’re forced to take their (virtual) partner’s actions into consideration.”

The technology has broader implications than art. As Magerko explains it, these days AI mostly relies on instructions fed to it by humans, and programming a computer with every possible instruction is impossible.

“That’s because humans are so unpredictable,” says Magerko. “Let’s say a computer and a person are going to write a story together about a family conversation at a restaurant. The story could go in a typical fashion or veer wildly into novel territory. The computer won’t do well unless it has been programmed with all of the pieces of knowledge that the story could possibly contain. However, if it can learn that knowledge from people and prior experiences, its improvisation can become somewhat consistent and accurate and the AI learning new story content (or dance moves) becomes part of the user experience.”

LuminAI was unveiled this past weekend in Atlanta at the Hambidge Art Auction in partnership with the Goat Farm Arts Centre. It was featured within a dance and technology performance, in a work called “Post,” as a finalist for the Field Experiment ATL grant. T. Lang Dance performed set choreography with avatars and virtual characters within the dome. Post is the fourth and final instalment of Lang’s Post Up series, which focuses on the stark realities and situational complexities after an emotional reunion between long lost souls.

Video courtesy of Georgia Tech.

