For the final project of our Experimental Practices course at ATLAS, my classmate Elyza Bleau and I collaborated to write, compose, choreograph, and perform an original song and dance piece titled Light That Fire. With backgrounds in theater, music, and dance, we were curious about how performance artists could incorporate interactive technology in ways that enhance the story being told, rather than detract from it.
In Light That Fire, Elyza and I perform a light-hearted tale about the social disharmony between two villages in a faraway galaxy, while our backdrop, a dynamic particle system projection, visualizes and emphasizes the story's emotional tension and release.
How It Works:
The performers wear motion capture sensors that allow the addressable spotlights to follow them programmatically. The same motion capture data drives the particle system projection, keeping the system centered between the two performers at all times. Built in TouchDesigner and routed to Resolume via Syphon, the particle system has several modes that correspond to scene changes. Our technology crew triggers these modes from backstage using preset keyboard macros mapped to the show's stage cues, giving them a quick way to adjust the particles' repel/attract forces and day/night color gradients at the right moments.
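To make the control flow concrete, here is a minimal sketch of the two ideas described above: centering the particle emitter between the two tracked performers, and switching particle modes from a preset table keyed by macro keys. All names and values are illustrative assumptions, not the actual show code (Python is used since it is TouchDesigner's scripting language).

```python
from dataclasses import dataclass

@dataclass
class Preset:
    """One scene mode: particle force and color gradient (hypothetical)."""
    repel_attract: float   # negative = attract, positive = repel
    gradient: str          # e.g. "day" or "night"

# Hypothetical preset table keyed by the macro keys the crew presses backstage.
PRESETS = {
    "1": Preset(repel_attract=-0.5, gradient="day"),    # calm village scene
    "2": Preset(repel_attract=1.0, gradient="night"),   # tension scene
}

def emitter_center(p1, p2):
    """Midpoint of the two performers' mocap positions (stage x, y)."""
    return ((p1[0] + p2[0]) / 2, (p1[1] + p2[1]) / 2)

# Example: performers at stage-left and stage-right; crew hits macro "2".
center = emitter_center((-2.0, 0.5), (2.0, 1.5))
preset = PRESETS["2"]
print(center)                # (0.0, 1.0)
print(preset.gradient)       # night
```

In the real system this logic would run once per frame, with `p1` and `p2` replaced by live motion capture channels and the preset parameters fed into the particle operators.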
