Creating “moonshot: solos for your home”

moonshot: solos for your home is an Augmented Reality (AR) app that places a more-than-human entity (a cyborg body) in the home of the user. The concept models for the piece were created through embodied practice in Tilt Brush, drawing with transparent prisms to build the 3D model for moonshot's posthuman representation.

Concept Image

I choreographed and performed a solo in a motion capture session, connected that movement to a fluctuating polygon 3D model in Maya, and then brought it into Unity to create the app's functionality on a device.
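At its core, the mocap-to-model step moves per-joint transform data from frame to frame. As a rough illustration only (the frame format and function name are hypothetical, not the project's actual pipeline), a minimal sketch of blending between two captured frames:

```python
def lerp_frame(frame_a, frame_b, t):
    """Linearly interpolate two motion capture frames.

    Each frame is assumed to be a dict mapping joint name -> (x, y, z)
    position; t = 0 returns frame_a, t = 1 returns frame_b.
    """
    return {
        joint: tuple(a + (b - a) * t for a, b in zip(frame_a[joint], frame_b[joint]))
        for joint in frame_a
    }
```

In practice, engines like Unity and tools like Maya do this interpolation internally when playing back mocap animation; the sketch just shows the shape of the data being handed between them.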

Motion capture data collection excerpt

Transposing the concept model between Tilt Brush and Maya had some flaws. It is possible, but the poly count of the Tilt Brush model was too high: rendering it would demand too much processing power for the application to stay visible and run smoothly on a device. I am still interested in trying this and believe it can be done if the Tilt Brush model were smaller and built from fewer elements. Once the Tilt Brush model is converted to an FBX file and imported into Maya, I would “paint” the model to assign the area of the skin I wanted to be activated by each portion of the model: https://youtu.be/_m5M0SD1BBk. However, for this version, with advice from Vita Berezina-Blackburn, I came up with an alternate method of creating a 3D model with similar visuals using MASH networks in Maya.
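The poly-count problem can be made concrete with a back-of-the-envelope budget check. The budget figure and function below are illustrative assumptions, not measured values from this project:

```python
# Assumed triangle budget for a mobile AR scene (illustrative figure,
# not a measured limit from this project).
MOBILE_TRI_BUDGET = 100_000

def fits_mobile_budget(stroke_tri_counts, budget=MOBILE_TRI_BUDGET):
    """Sum per-stroke triangle counts from an exported Tilt Brush sketch
    and report the total and whether it fits the assumed mobile budget."""
    total = sum(stroke_tri_counts)
    return total, total <= budget
```

A dense Tilt Brush sketch can easily blow past such a budget, which is why a smaller model with fewer strokes would make the direct Tilt Brush-to-Maya route viable.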

Motion capture data, 3D humanoid model, and Maya MASH

In creating the AR app using motion capture data, I did the following:

Further work in Unity included creating an interface and instructions for use, using LeanTouch to scale and zoom the entity on a touch screen, and connecting the soundtrack I created from distortions of the brushes in Tilt Brush.
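LeanTouch handles the gesture plumbing inside Unity, but the underlying pinch-to-scale math is simple: scale by the ratio of finger distances between frames, then clamp. A hedged sketch of that idea (the function name and clamp range are assumptions for illustration, not LeanTouch's API):

```python
import math

def pinch_scale(prev_a, prev_b, now_a, now_b, current_scale,
                min_scale=0.25, max_scale=4.0):
    """Scale an object by the ratio of the two fingers' separation
    between frames, clamped to an assumed min/max range."""
    d_prev = math.dist(prev_a, prev_b)
    d_now = math.dist(now_a, now_b)
    if d_prev == 0:
        return current_scale  # degenerate gesture; leave scale unchanged
    return max(min_scale, min(max_scale, current_scale * d_now / d_prev))
```

Clamping matters in AR: without it, a user can shrink the entity to invisibility or scale it far beyond the room.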

Makeshift composition of audio in Audacity while playing the animation in Unity

This application places the 3D model and its connected motion capture solo in the user's space. In encouraging the user to interface with the entity, I am exploring audience as performer and archive as performance. The audience becomes the performer by engaging with an AR entity in the space: watching from a distance, moving around it 360º, or sitting immersed in the body of the soloist. The dancer's choreography and movement were archived through motion capture. By placing this danced archive in the hands of the user, I am questioning the essence of performance and exploring alternative performance venues.

Moving forward, the app will be modified to house multiple solos. Dancer/choreographers Kylee Smith, Yukina Sato, Menghang Wu, and Jacqueline Courchene are currently in the process of performing their choreography for motion capture recording. I will collaborate with each of these choreographers to continue this investigation of artifacts as performance and the application as a performance platform, bringing in new bodies and perspectives.