Tuesday, February 21, 2012

Things are getting heavy

The performance for this project is happening March 8th at 8pm in the Mills College Littlefield Concert Hall.  It's part of a four-day annual festival held by the music graduate students of the school, and it usually involves a ton of experimental, bold new approaches to music and performance.  

Since the last posting, we've spent most of our time exploring the narrative, concepts, and movements of the piece.  After weeks of elaborating, experimenting, and simplifying, I have submitted the program notes for the piece:


Paralysis Of The Will


This performance abstracts the deterioration of the human spirit, or a yielding into a self-appointed guardian of the status quo, brought on by the stifling, outdated institutions we all so helplessly live in.  

Performers and instruments:

Patrice Scanlon and Edan Mason both using the Kinect visual interface and computer

OSCeleton doesn't like chairs!

In our last rehearsal, we were unable to get consistent tracking of two users at the same time without OSCeleton either losing a user, discovering a new user and trying to calibrate them (an infrared ghost in the concert hall?), switching our user numbers to other numbers, or all of the above happening at once.  There's no consistency!  The only time it IS consistent is when we are both standing upright and relatively still.  The moment I sit down on a chair or lie on the floor, OSCeleton misbehaves.  

The main reason this is important is that we'd like to utilize the Kinect's ability to track multiple users at the same time.  That means one Kinect, one computer: clean and simple.  But if the Kinect cannot distinguish between two users because one is sitting on a chair, then I might suddenly control Patrice's sounds mid-performance.  No good.  

If we have two separate systems, then it doesn't matter which user assignment OSCeleton gives us, because each of us is the only one moving within the frame of our own Kinect anyway.  This is an unfortunate discovery, but we are reshaping the way the performance will be blocked in order to accommodate the issue.  If there were more time, or if we had discovered this sooner, we would have liked to find out whether the issue lies within OSCeleton and its associated software, or with the Kinect itself.  
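For what it's worth, one software-side defense we could try is to latch onto the first user ID OSCeleton reports and drop everything else, so a phantom user can't hijack the mapping mid-performance.  Here's a rough sketch in Python (assuming the commonly described layout of OSCeleton's /joint messages, i.e. joint name, user ID, then x/y/z; the class and names are my own invention, not part of OSCeleton):

```python
# Sketch: latch onto the first user OSCeleton reports and ignore joint
# messages from any other (possibly phantom) user ID.
# Assumed message layout: ("/joint", joint_name, user_id, x, y, z).

class UserFilter:
    def __init__(self):
        self.locked_user = None  # first user seen wins

    def accept(self, address, joint, user_id, x, y, z):
        """Return (joint, x, y, z) if the message belongs to our user, else None."""
        if address != "/joint":
            return None
        if self.locked_user is None:
            self.locked_user = user_id      # latch onto the first user seen
        if user_id != self.locked_user:
            return None                     # phantom or swapped user: drop it
        return (joint, x, y, z)

f = UserFilter()
f.accept("/joint", "r_hand", 1, 0.1, 0.2, 0.3)       # locks onto user 1
f.accept("/joint", "r_hand", 2, 0.9, 0.9, 0.9)       # ignored
```

This wouldn't fix the calibration churn itself, but it would at least keep a stray user ID from steering the wrong sounds.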

On mapping cabinet samples to movement

As I've been exploring how I'd like to map these samples to my body, I've been using MSP objects as a form of sketching.  The perfect configuration would be like this: a gesture activated by falling into the chair triggers an audio sample associated with one of my joints.  As I move the joint, we hear the sample as if it's being scrubbed.  No pitch shifting, no looping, no snaps & pops as samples shift, just smooth scrubbing.  There would be a relative joint position associated with tiny portions of the sample, and as my joint moves, the playback moves through the sound file.  
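To make the mapping concrete, here's a Python sketch of what I mean: a joint coordinate, normalized over its expected travel range, indexes into the sample, with a little smoothing so the playhead glides instead of jumping.  The travel range and smoothing coefficient below are made-up numbers, not measured ones:

```python
# Sketch of the scrubbing mapping: joint position -> playhead position.
# Y_MIN/Y_MAX and the smoothing coefficient are illustrative assumptions.

SAMPLE_LEN = 44100          # 1 second of audio at 44.1 kHz
Y_MIN, Y_MAX = 0.4, 1.6     # assumed joint travel range, in meters

class Scrubber:
    def __init__(self, smooth=0.9):
        self.smooth = smooth    # closer to 1.0 = slower, smoother glide
        self.playhead = 0.0

    def update(self, joint_y):
        # clamp and normalize the joint position to 0..1
        norm = (joint_y - Y_MIN) / (Y_MAX - Y_MIN)
        norm = min(1.0, max(0.0, norm))
        target = norm * (SAMPLE_LEN - 1)
        # one-pole lowpass toward the target sample index
        self.playhead += (1.0 - self.smooth) * (target - self.playhead)
        return int(self.playhead)
```

The smoothing tames jitter from the tracker, but on its own it doesn't remove the click when the playhead jumps through the audio, which is exactly the problem below.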

I've been unable to think of ways to do this using basic MSP sampling objects.  The closest I can get makes a tiny snapping sound as my body moves through different times of the sample.  So at this point, I'm leaning towards granular synthesis.  

Granular synthesis would allow me to play through a specific range, or window, of the sample, scrub through it as I move my body, AND avoid the snapping sound as the scrubbing takes place.  So now I'm going to dive into Max/MSP's granular synthesis potential.  But if there's a way to accomplish my ideal setup, any suggestions?? Let me know please!!!
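In the meantime, here's a toy Python sketch of why granular scrubbing avoids the snap: each output frame overlap-adds short Hann-windowed grains read from around the playhead, so every grain fades in and out from zero and the playhead can sit still (or crawl with a joint) without a discontinuity.  Grain and hop sizes here are arbitrary, not tuned values:

```python
import math

GRAIN = 1024   # grain length in samples
HOP = 256      # spacing between grain onsets in the output

def hann(n, size):
    # raised-cosine window: zero at both grain edges, so no clicks
    return 0.5 - 0.5 * math.cos(2.0 * math.pi * n / (size - 1))

def scrub_frame(sample, playhead, n_grains=4):
    """Overlap-add n_grains windowed grains read from near `playhead`."""
    out = [0.0] * (HOP * (n_grains - 1) + GRAIN)
    start = max(0, min(len(sample) - GRAIN, int(playhead)))
    for g in range(n_grains):
        for n in range(GRAIN):
            out[g * HOP + n] += sample[start + n] * hann(n, GRAIN)
    return out
```

In Max terms, I gather this is the sort of thing people build from several overlapping windowed play~ voices, but I haven't settled on an approach yet.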


Monday, January 30, 2012

Like a rock.

The aesthetic and compositional concepts of this performance are taking shape now.  Here is a rough treatment:

The aesthetic is the deterioration of the human spirit, or a yielding into a self-appointed guardian of the status quo, brought on by stifling, outdated institutions, mass media outlets, and the monetary system.  The performance involves two characters.  One represents modern western man; the other represents the stifling institutions that aim to domesticate the individual.  The performance progresses through various human gestures made by the man, which in turn are pacified by the institutions, yielding back to a sedative state.  Each gesture the man makes represents an opportunity to break through the barriers instilled by the institutions, but each is then reshaped back.  As the reshaping happens, a rusty sound emerges at the joints of the man’s body.  An example of one of these gestures would be a fist raised above the head, symbolizing protest.  Another might be palms brought together against the chest, symbolizing spirituality.  Once these gestures form, they are short-lived, and the body returns to a neutral state under the influence of the institution character.  With each passing gesture, a new joint sounds with rusty squeaks, symbolizing the deterioration and petrification of the man.  Eventually all the joints are rusty, and the man falls into a chair and sits in front of a television, or screen, or some representation of hypnotic sedation to the liking of the institution character. 


At our last rehearsal, Patrice and I began to visualize how this would be choreographed and what props, if any, will exist.  For now, I've begun to categorize the gestures by institution: Healthcare, Speech, Consumerism, Religion, Politics, etc.  Each category will have a gesture that represents it.  I'm thinking about how each gesture will transition to the next, and about the best ways to express these categories through the body and movement.  I'm encouraging any readers of this site to comment.  But I think Patrice and I are the only ones who are ever on here!  

This project is my graduate thesis performance, which will be held March 8th, 2012 as part of Mills College's annual Signal Flow festival.  More on that later.  


Tuesday, January 24, 2012


The semester started up again, and I now have my weekly schedule down and steady.  The winter break was awesome, but I didn't spend nearly as much time on this project as I expected.  That's alright, though; I still feel like we've got plenty of time to work.  And working on this is now a priority in my life at the moment.  This and cooking some new dishes :)

There haven't been any updates on Mr. Kuperberg's Lab website, and I've received no response to my questions in the comments and forum sections on how to get it going.  So for now, I'm going to get good with OSCeleton and plan on using it for the show.  

There have been some compositional updates that I'm excited to try out.  I've recorded my mom's kitchen cabinets with a contact microphone.  These cabinets are the loudest, squeakiest metallic cabinets I've ever heard!  So at the moment, the idea is to map refined samples of these cabinets to individual OSCeleton joints and create a rusty android type of creature.  Each cabinet I recorded has its own pattern and timbre, which makes me want to associate each one with a different joint.  The challenge will lie in how to utilize the incoming floating-point values in a way that triggers these samples and seems like a natural response to my movement.  Here are a few ideas:

I can map the rate of change of a joint's values to the speed of playback.  This is cool because the samples would stop when the corresponding joint stops.  But it might sound like a DJ scratching a record, which is sweet, but not what I'm going for.
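As a back-of-the-envelope version of that idea, in Python (the scale factor, rate cap, and frame interval are all made-up values I'd tune by ear):

```python
# Sketch of idea 1: joint speed -> playback rate, so the sample
# stops when the joint stops. `scale` and `max_rate` are assumptions.

def playback_rate(prev_pos, curr_pos, dt, scale=2.0, max_rate=2.0):
    """Map 3-D joint speed to a playback rate; 0.0 when the joint is still.

    prev_pos / curr_pos are (x, y, z) tuples from successive frames,
    dt is the time between them in seconds (~1/30 for the Kinect).
    """
    dist = sum((c - p) ** 2 for p, c in zip(prev_pos, curr_pos)) ** 0.5
    return min(max_rate, (dist / dt) * scale)
```

Feeding this into a variable-rate sample player would give the record-scratch behavior described above, which is why I'm treating it as a fallback rather than the plan.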

I can trigger a sample when a joint begins to move, and if I keep moving past the length of the sample, it can loop from someplace in the middle until I stop moving.  This might take more time to create in Max, but it could allow more freedom of movement.  On the other hand, the looping might come off as obvious and bring down the production quality.  

Another way, and this might be the most ambitious yet highest quality, is to have tons of samples, each triggered when a specific relational distance between two or more joints occurs.  This would allow me to have a sample triggered whenever I stomp on the ground, turn my head, move an arm, or even sit down.  
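A quick sketch of what such a relational trigger could look like in Python, with a rising-edge latch so each gesture fires its sample only once per occurrence (the joint names, the y-up axis convention, and the predicate itself are all my assumptions):

```python
# Sketch of idea 3: fire a sample when a relation between joints
# crosses a threshold, latched so each gesture triggers exactly once.

class GestureTrigger:
    def __init__(self, predicate):
        self.predicate = predicate   # function: joints dict -> bool
        self.active = False

    def update(self, joints):
        """joints: dict of name -> (x, y, z).  True only on a new trigger."""
        hit = self.predicate(joints)
        fired = hit and not self.active   # rising edge only
        self.active = hit
        return fired

# e.g. a "fist raised above the head": right hand higher than the head
# (assumes y increases upward, which may not match OSCeleton's frame)
raised_fist = GestureTrigger(lambda j: j["r_hand"][1] > j["head"][1])
```

One of these per gesture, each pointed at its own cabinet sample, would cover stomps, head turns, and sitting down with the same mechanism.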

There are probably many other ways I can map audio samples to the OSCeleton joints; these are just the first ones that come to mind.  I will talk about other approaches with Patrice and James.  There's also the matter of processing the samples, either live or beforehand, in order to further produce the realistic effect of a rusty metallic android.  

Meeting Patrice at my place for supper and a breakdown.  Then our first concert hall rehearsal is on the 29th.  YES!!!



Monday, January 2, 2012

Got OSCeleton running

OSCeleton is running great, as are most of the OpenNI and NITE sample scripts.  I still can't get Mappinect running properly.  Only a person or two have posted a successful load of Mappinect, and I'm working on figuring out what I need to do to get it working.  

I have a lot of faith that Mappinect will be our savior for this project and will allow us to map controls in a way that is more like a dance and less like obvious gestures.  I will post any progress, and I have asked a few people on forums what they did to get it working.  



New Developments

Sheesh, it's been longer than it should have been since I've posted news.

The latest development is that http://zigfu.com/ created an installation package for OpenNI.  All that effort of installing and getting everything up and running is now really easy.

Also, a blog site called The Lab, http://benjamin.kuperberg.fr/lab/, run by Ben Kuperberg, is developing software that would make Kinect mapping and routing a lot easier than it has been so far.

And lastly, Kinect Star Wars: http://lucasarts.com/games/kinectstarwars/ I don't know how I feel about this.  I think I'm embarrassed.

I finally found a Kinect being sold by a guy in Venice Beach and have been running example programs bundled with the suite of OpenNI tools.  So far so good, but now I will be testing out the alpha version of MappInect, which apparently will soon include a GUI builder for easy data routing!



Monday, December 19, 2011

Patrice Intro!


My name is Patrice Scanlon, and I started ballet lessons when I was 6 and clarinet lessons at the age of 8. I love being on stage even though the moments or even months before being on stage are filled with self-doubt as well as narcissistic glory and guilt.

These two worlds of music and dance were segregated for too long in my life. There was dance and then there was music. However, it wasn’t until I went to undergrad at Stetson University that I realized that these two worlds could be combined.

To be more specific, it was Eric Singer, http://ericsinger.com/old/, who visited my undergrad as a guest lecturer for a master class. He talked about the VideoIn object that he created for Max. He did a demo where his hands were controlling MIDI data. It finally clicked for me! This is what I WANT TO DO!!!

Of course! Let’s reverse the relationship of pre-recorded music generating the movement and let the movement generate the music! At the ripe age of 20, I thought I had come up with the most genius idea ever. Three years later, I went to grad school at Mills College for Electronic Music and learned that this idea had been floating around forever.

During grad school, Eric Singer upgraded his free VideoIn object and sold it for a modest price as the Cyclops object. This new object could track colors. I found the latency to be much improved, since it no longer had to convert the color video from a mini-DV camcorder to grayscale, which was all the VideoIn object was capable of.

Eric Singer ignited the fire for the idea, but it was Les Stuck, http://stuckfootage.com/, who helped me actualize the concept into a working piece of technology! I am so grateful for his energy and expertise in the world of Max/MSP/Jitter as well as music and dance. I could not have accomplished this successfully without him!

Still, it was never as precise as I wanted it to be. Then I found the answer to a more precise device for motion tracking. It started by attending a collaboration meeting for the dance and music students, mainly grad students but a few undergrads as well, at Mills College. There, I learned that Edan was a motion tracking aficionado as well. After that meeting, Edan and I started a collaboration, which led to the purchase of a Kinect controller, which then led to installing bad scripts via Terminal that totally crippled my computer. I had to back up important files and re-image the machine. I tried again, using Thom Judson’s instructions found here:


I have had success with this install. I believe I did have some workarounds throughout the process. All in all, it is working great for me!

I can’t believe how precise this Kinect controller is! The way I have been working for the past 13 years has been an X/Y, columns-and-rows approach. Now there is a Z-plane! Now there are 13 points on the skeleton that can be tracked. I have to completely rethink the way I approach this data and the process of motion tracking.

At the same time, I have another person to help me through the process and that is Edan. We have done months of research, an improvised recording session (which I will post before the New Year!), and have conducted many tests to watch data and attach it to sounds to see what happens. I believe we have found out enough about what the data is doing to start creating the software to actualize the performance of the piece.

Happy Holidays!


Tuesday, December 13, 2011

Patrice's test run

Kinect > OSC > Max > Logic
