
Puppeteering AI

Daniel Bisig - Coventry University, UK - ad5041@coventry.ac.uk

Abstract

Puppeteering AI is a system that creates an artificial dancer whose movements are generated through a combination of interactive control and machine learning. The system explores interactive applications of an autoencoder trained on motion capture recordings of a human dancer. These explorations have led to the discovery that autoencoders, when applied iteratively to their own decoded output, can produce interesting synthetic motions, in particular when some values of this output are removed from the autoencoder's influence and instead controlled interactively. This approach is reminiscent of puppeteering, in which a puppeteer controls only a subset of a puppet's joints while the remaining joints are handled by another process. In traditional puppeteering, this other process is physics; here, it is autoencoding. Whereas physics ensures that the puppet performs physically plausible motions, autoencoding ensures that the artificial dancer exhibits motions that are stylistically plausible, in the sense that they resemble movements recorded from a human dancer.
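The core loop described above can be sketched in a few lines. This is a minimal illustrative sketch, not the project's actual implementation: the linear encoder/decoder weights, the pose dimensions, and the choice of controlled joints are all hypothetical stand-ins for a network trained on motion capture data. The point is the control structure: each step re-autoencodes the previous pose, then overwrites the interactively controlled values so they are removed from the autoencoder's influence.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a trained autoencoder: random linear
# encoder/decoder weights over a small pose vector (illustrative only).
POSE_DIM, LATENT_DIM = 12, 4
W_enc = rng.normal(scale=0.3, size=(LATENT_DIM, POSE_DIM))
W_dec = rng.normal(scale=0.3, size=(POSE_DIM, LATENT_DIM))

def autoencode(pose):
    """Encode a pose into the latent space and decode it back."""
    return W_dec @ (W_enc @ pose)

def puppeteer_step(pose, controlled_idx, controlled_values):
    """One iteration: re-autoencode the previous pose, then overwrite
    the interactively controlled joints, removing them from the
    autoencoder's influence."""
    next_pose = autoencode(pose)
    next_pose[controlled_idx] = controlled_values
    return next_pose

pose = rng.normal(size=POSE_DIM)
controlled_idx = [0, 1]  # joints held by the "puppeteer" (hypothetical choice)
for t in range(10):
    # Interactive input, here faked as a slow oscillation.
    target = np.array([np.sin(0.2 * t), np.cos(0.2 * t)])
    pose = puppeteer_step(pose, controlled_idx, target)
```

In the real system the autoencoder would be a nonlinear network trained on motion capture poses, and the uncontrolled joints would drift toward the learned motion style under repeated autoencoding while the controlled joints follow the puppeteer.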

This project has been realised in collaboration with Ephraim Wegner, teacher and researcher at Offenburg University, Germany. A detailed description of the project has been published.

Resources
