Presented as my Master's thesis, afloat is an immersive, collaborative moving meditation, guided by improvisational aerial dance and transformed by audience participation. Live movement data generates dynamic scene- and soundscapes, allowing our motion and stillness, our individual ebbs and flows, to be amplified and experienced together.
In seeking outlets for self-expression, I often find myself wanting for more when creating through a single medium. Dance is the "language" that feels most organic to me, and yet, I yearn to say more, to translate my movement expression into other mediums so that I can understand myself and be understood more fully. It's from this desire that afloat was born.
afloat was staged to reflect my curiosity about both interactive live performance and interactive, immersive public experiences, combining the challenges of choreographing detailed multimedia systems with those of providing accessible opportunities for community-building and connection. It begins with a 15-minute meditative performance by four aerial dancers, then transitions into an audience participation experience: viewers are invited to wear the sensors, play with the motion-capture lanterns, and feel liberated to move in their own bodies, witnessing how those movements contribute to creating beauty and wonder.
How It Works:
Movement data is primarily captured by SOMI-1 wearable accelerometer sensors worn by the dancers and transmitted via Bluetooth as MIDI. These values trigger paint bursts in a real-time fluid simulation programmed in TouchDesigner (projected via Resolume) and complementary generative sounds in Ableton Live, all tuned to harmonize with an underlying soundscape. Additional motion data comes from the theater's motion capture cameras: the custom hanging lanterns function as interactive motion capture objects, their relative amplitudes modulating overarching qualities of both the projection and the sound environment.
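To make the triggering step concrete, below is a minimal sketch of the kind of logic that could sit between the incoming sensor stream and the paint bursts. It assumes (hypothetically — the actual SOMI-1 mapping and the production's tuned thresholds are not specified here) that motion intensity arrives as MIDI control-change values from 0 to 127, and uses a simple threshold with re-arming so that a sustained pose fires a single burst rather than a continuous stream of them:

```python
# Sketch only: assumes motion intensity arrives as MIDI CC values (0-127).
# Threshold and release levels are illustrative, not the production values.

BURST_THRESHOLD = 90   # intensity above which a paint burst fires
RELEASE_LEVEL = 60     # signal must fall below this before re-arming

def detect_bursts(cc_values):
    """Return indices in the CC stream where a paint burst should trigger."""
    bursts = []
    armed = True
    for i, value in enumerate(cc_values):
        if armed and value >= BURST_THRESHOLD:
            bursts.append(i)
            armed = False          # ignore further high values until release
        elif not armed and value <= RELEASE_LEVEL:
            armed = True           # motion settled; re-arm the trigger
    return bursts

# Example: a swell, a sustained hold, a release, then a second swell
stream = [10, 40, 95, 120, 110, 70, 30, 100, 50]
print(detect_bursts(stream))  # -> [2, 7]
```

This kind of hysteresis keeps slow, meditative movement from spamming the fluid simulation, while still letting a deliberate gesture land as a single visible and audible event.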