Tuesday, February 21, 2012

Things are getting heavy

The performance for this project is happening March 8th in the Mills College Littlefield Concert Hall at 8pm.  It's part of a four-day annual festival held by the music graduate students of the school, and usually involves a ton of experimental, bold new approaches to music and performance.


Since the last posting, we've spent most of our time exploring the narrative, concepts, and movements of the piece.  After weeks of elaborating, experimenting, and simplifying, I have submitted the program notes for the piece:


Title:


Paralysis Of The Will


Description:


This performance abstracts the deterioration of the human spirit, or a yielding into a self-appointed guardian of the status quo, brought on by the stifling, outdated institutions we all so helplessly live in.


Performers and instruments:


Patrice Scanlon and Edan Mason, both using the Kinect visual interface and computer




OSCeleton doesn't like chairs!


In our last rehearsal, we were unable to get consistent tracking of two users at the same time: OSCeleton would either lose a user, discover a new user and try to calibrate them (an infrared ghost in the concert hall?), swap our user numbers with other numbers, or do all of the above at once.  There's no consistency!  The only time it IS consistent is when we are both standing upright and relatively still.  The moment I sit down in a chair, or lie on the floor, OSCeleton misbehaves.


This matters mainly because we'd like to utilize the Kinect's ability to track multiple users at the same time.  That means one Kinect, one computer, clean and simple.  But if the Kinect cannot keep two users straight because one of us is sitting in a chair, then I might suddenly control Patrice's sounds mid-performance.  No good.
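
For anyone curious what those user numbers look like on the wire: OSCeleton reports each joint as an OSC message carrying a user ID (roughly /joint <name> <user> <x> <y> <z>, plus /new_user and /lost_user events, on port 7110 by default).  Here's a little Python sketch of routing joints by that ID, using the python-osc library rather than our actual Max patch; the routing table here is exactly what falls apart when the IDs get reshuffled:

from pythonosc import dispatcher, osc_server

# Which performer each OSCeleton user ID is supposed to drive.
USER_TO_PERFORMER = {1: "edan", 2: "patrice"}

def on_joint(address, name, user_id, x, y, z):
    performer = USER_TO_PERFORMER.get(user_id)
    if performer is None:
        return  # an uncalibrated "ghost" user: ignore it
    print(f"{performer}: {name} at ({x:.2f}, {y:.2f}, {z:.2f})")

def on_user_event(address, user_id):
    # /new_user and /lost_user fire as tracking drops and recalibrates.
    print(f"{address} -> user {user_id}")

disp = dispatcher.Dispatcher()
disp.map("/joint", on_joint)
disp.map("/new_user", on_user_event)
disp.map("/lost_user", on_user_event)

# OSCeleton sends to port 7110 by default.
server = osc_server.BlockingOSCUDPServer(("127.0.0.1", 7110), disp)
server.serve_forever()

The moment OSCeleton decides I'm user 2 instead of user 1, that table silently hands my joints to Patrice's sounds.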


If we have two separate systems, then it doesn't matter which user assignment OSCeleton gives us, because each of us is the only one moving within the frame of our own Kinect anyway.  This is an unfortunate discovery, but we are reshaping the way the performance will be blocked in order to accommodate the issue.  If there were more time, or if we had discovered this sooner, we would have liked to find out whether the problem lies within OSCeleton and its associated software, or the Kinect itself.


On mapping cabinet samples to movement


As I've been exploring how I'd like to map these samples to my body, I've been using MSP objects as a form of sketching.  The perfect configuration would be this: a gesture triggered by falling into the chair activates an audio sample associated with one of my joints.  As I move the joint, we hear the sample as if it's being scrubbed.  No pitch shifting, no looping, no snaps and pops as samples shift, just smooth scrubbing.  Each relative joint position would be associated with a tiny portion of the sample, and as my joint moves, playback moves through the sound file.
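
To sketch the mapping itself outside of Max, here's the logic in plain Python (the joint range and smoothing value below are made-up placeholders, not anything we've measured): normalize the joint coordinate, smooth it so skeleton jitter doesn't rattle the playhead, then use it as an index into the sample buffer.

# Hypothetical joint range in meters; in practice you'd measure the
# actual reach of the gesture during calibration.
JOINT_MIN, JOINT_MAX = -0.4, 0.4

class JointScrubber:
    """Map a joint coordinate to a playback position in a sample,
    with one-pole smoothing so jittery skeleton data doesn't make
    the playhead jump around."""

    def __init__(self, sample_len, smoothing=0.9):
        self.sample_len = sample_len
        self.smoothing = smoothing   # 0 = no smoothing, ~0.99 = very sluggish
        self.pos = 0.0               # smoothed position, 0..1

    def update(self, joint_value):
        # Normalize the raw joint coordinate into 0..1 and clamp it.
        raw = (joint_value - JOINT_MIN) / (JOINT_MAX - JOINT_MIN)
        raw = min(max(raw, 0.0), 1.0)
        # One-pole lowpass: ease toward the new target.
        self.pos = self.smoothing * self.pos + (1 - self.smoothing) * raw
        # Return an index into the sample buffer.
        return int(self.pos * (self.sample_len - 1))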


I've been unable to think of ways to do this using basic MSP sampling objects.  The closest I can get makes a tiny snapping sound as my body moves through different points in the sample, presumably because jumping the playhead lands mid-waveform and leaves a discontinuity.  So at this point, I'm leaning towards granular synthesis.


Granular synthesis would allow me to play through a specific range, or window, of the sample, scrub through that range as I move my body, AND avoid the snapping sound as the scrubbing takes place, since each grain is faded in and out by its window.  So now I'm going to dive into Max/MSP's granular synthesis potential.  But if there's a way to accomplish my ideal setup, any suggestions?? Let me know please!!!
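
Here's a rough sketch of the grain-scheduling idea in Python with numpy, just to convince myself the math works (the real thing would be a Max patch; the test tone and the 60 ms grain size are stand-ins, not measured values):

import numpy as np

SR = 44100

def hann(n):
    # Hann window: fades each grain in and out, which is what kills the clicks.
    return 0.5 - 0.5 * np.cos(2 * np.pi * np.arange(n) / n)

def granular_scrub(sample, positions, grain_ms=60, overlap=4):
    # `positions` is one scrub position (0..1) per grain hop, e.g. the
    # smoothed joint position sampled at the hop rate.  Each grain is read
    # from the file at its scrub position and windowed, so neighboring
    # grains crossfade instead of snapping.
    grain = int(SR * grain_ms / 1000)
    hop = grain // overlap
    win = hann(grain)
    out = np.zeros(len(positions) * hop + grain)
    for i, pos in enumerate(positions):
        read = int(pos * (len(sample) - grain))   # grain start within the file
        out[i * hop : i * hop + grain] += sample[read : read + grain] * win
    return out / overlap   # rough normalization for the overlap-add

# Toy run: a 2-second test tone stands in for a cabinet sample, and a
# slow ramp stands in for a joint drifting across its range.
tone = np.sin(2 * np.pi * 220 * np.arange(2 * SR) / SR)
joint_path = np.linspace(0.1, 0.9, 400)
audio = granular_scrub(tone, joint_path)

Even when the scrub position jumps, every grain starts and ends at zero amplitude, so the worst case is a fast crossfade rather than a pop.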


-edantron


