Puppeteering AI

Daniel Bisig - Coventry University, UK - ad5041@coventry.ac.uk

Abstract

Puppeteering AI is a system that creates an artificial dancer whose movements are generated through a combination of interactive control and machine learning. Puppeteering AI explores interactive applications of an autoencoder that has been trained on motion capture recordings of a human dancer. These explorations have led to the discovery that autoencoders, when applied iteratively to their own decoded output, can produce interesting synthetic motions, in particular when some of the values of this output are removed from the autoencoder's influence and instead controlled interactively. This approach is reminiscent of puppeteering, in which a puppeteer controls only a subset of a puppet's joints while the remaining joints are taken care of by another process. In traditional puppeteering, this other process is physics, whereas in the system presented here it is autoencoding. While physics ensures that the puppet performs physically plausible motions, autoencoding ensures that the artificial dancer exhibits motions that are stylistically plausible, in the sense that they resemble movements recorded from a human dancer.
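
The loop described above can be sketched roughly as follows. This is a minimal illustration, not the repository's actual code: it assumes a PyTorch-style autoencoder with separate `encode`/`decode` methods, poses stored as flat tensors of joint values, and a hypothetical `get_user_pose()` callback supplying the interactively controlled joint values.

```python
import torch

def puppeteer_step(autoencoder, pose, controlled_idx, controlled_values):
    """One iteration: autoencode the current pose, then override the
    interactively controlled joint values before the pose is fed back in."""
    with torch.no_grad():
        latent = autoencoder.encode(pose)       # compress the current pose
        next_pose = autoencoder.decode(latent)  # reconstruct a stylistically plausible pose
    # Remove the selected values from the autoencoder's influence:
    # they are set by the puppeteer instead of by the decoder.
    next_pose[controlled_idx] = controlled_values
    return next_pose

# Iterative application: each decoded output becomes the next input,
# so the motion unfolds as a feedback loop steered by a few joints.
# (model, initial_pose, hand_joint_idx, get_user_pose are placeholders.)
#
# pose = initial_pose
# for frame in range(num_frames):
#     pose = puppeteer_step(model, pose, hand_joint_idx, get_user_pose())
```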

This project has been realised in collaboration with Ephraim Wegner, a teacher and researcher at Offenburg University, Germany. A detailed description of the project has been published.

Resources
