Monday, January 30, 2012

Like a rock.

The aesthetic and compositional concepts of this performance are taking shape now.  Here is a rough treatment:



The aesthetic is the deterioration of the human spirit, or its yielding into a self-appointed guardian of the status quo, brought on by stifling, outdated institutions, mass media outlets, and the monetary system.  The performance involves two characters.  One represents modern western man; the other represents the stifling institutions that aim to domesticate the individual.

The performance progresses through various human gestures made by the man, which in turn are pacified by the institutions, yielding back to a sedative state.  Each gesture the man makes represents an opportunity to break through the barriers instilled by the institutions, but each is then reshaped back.  As the reshaping happens, a rusty sound emerges from the joints of the man's body.  An example of one of these gestures would be a fist raised above the head, symbolizing protest.  Another might be palms brought together and placed against the chest, symbolizing spirituality.

Once these gestures form, they are short-lived, and the body returns to a neutral state under the influence of the institution character.  With each passing gesture, a new joint sounds with rusty squeaks, symbolizing the deterioration and petrification of the man.  Eventually all the joints are rusty, and the man falls into a chair and sits in front of a television, or screen, or some representation of hypnotic sedation to the liking of the institution character.

::::


At our last rehearsal, Patrice and I began to visualize how this would be choreographed, and what props, if any, will exist.  For now, I've begun to categorize the gestures by institution: Healthcare, Speech, Consumerism, Religion, Politics, etc.  Each category will have a gesture that represents it.  I'm thinking about how each gesture will transition into the next, and what the best ways are to express these categories through the body and movement.  I'm encouraging any readers of this site to comment.  But I think Patrice and I are the only ones who are ever on here!




This project is my graduate thesis performance, which will be held March 8th, 2012 as part of Mills College's annual Signal Flow festival.  More on that later.  



-edantron 

Tuesday, January 24, 2012

Updates

        The semester started up again, and I now have my weekly schedule down and steady.  The winter break was awesome, but I didn't spend nearly as much time on this project as I expected.  That's alright, though; I still feel like we've got plenty of time to work.  And working on this is now my top priority.  This and cooking some new dishes :)


        There haven't been any updates on Mr. Kuperberg's Lab website, and I've received no response in the comments and forum sections on how to get it going.  So for now, I'm going to get good with OSCeleton and plan on using it for the show.
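        For anyone else experimenting, here's a minimal Python sketch of how to watch OSCeleton's output before wiring anything into Max (I'm using the python-osc package here purely as a stand-in for whatever OSC receiver you prefer).  The /joint message layout and the default port 7110 are from the OSCeleton docs; double-check them against your own build's flags.

    # A minimal sketch of listening to OSCeleton's joint stream in Python,
    # using the python-osc package. OSCeleton sends one message per tracked
    # joint per frame: /joint <joint_name> <user_id> <x> <y> <z>, by default
    # to localhost port 7110 (verify against your build's settings).
    from pythonosc.dispatcher import Dispatcher
    from pythonosc.osc_server import BlockingOSCUDPServer

    def on_joint(address, name, user_id, x, y, z):
        # Print each joint's position so I can watch the floats move.
        print(f"user {user_id} {name}: ({x:.3f}, {y:.3f}, {z:.3f})")

    dispatcher = Dispatcher()
    dispatcher.map("/joint", on_joint)
    BlockingOSCUDPServer(("127.0.0.1", 7110), dispatcher).serve_forever()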


        There have been some compositional updates that I'm excited to try out.  I've recorded my mom's kitchen cabinets with a contact microphone.  These cabinets are the loudest, squeakiest metallic cabinets I've ever heard!  So at the moment, the idea is to map refined samples of these cabinets to individual OSCeleton joints, and create a rusty android type of creature.  Each cabinet I recorded has its own pattern and timbre, which makes me want to associate each one with a different joint.  The challenge will lie in how to utilize the incoming floating-point values so that they trigger these samples and seem like a natural response to my movement.  Here are a few ideas:


        I can map the rate of change of the incoming numbers to the speed of playback.  This is cool because the samples would stop when the corresponding joint stops.  But it might sound like a DJ scratching on a record, which is sweet, but not what I'm going for.
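        Here's a rough Python sketch of that mapping, just to pin down the math before building it in Max.  The /rust/rate address, the port 8000 for the Max patch, and the scaling constants are placeholders I made up; in practice Max would receive these values and drive a variable-rate sample player.

    # Sketch of idea 1: joint speed -> playback rate, forwarded to Max over
    # OSC. Builds on the /joint listener above. All addresses, ports, and
    # scaling constants here are guesses to be tuned.
    import math, time
    from pythonosc.dispatcher import Dispatcher
    from pythonosc.osc_server import BlockingOSCUDPServer
    from pythonosc.udp_client import SimpleUDPClient

    max_patch = SimpleUDPClient("127.0.0.1", 8000)  # assumed Max listen port
    last = {}  # joint name -> (x, y, z, timestamp of previous frame)

    def on_joint(address, name, user_id, x, y, z):
        now = time.time()
        if name in last:
            px, py, pz, pt = last[name]
            dt = now - pt
            if dt > 0:
                speed = math.dist((x, y, z), (px, py, pz)) / dt
                # Still joint -> rate 0 (sample halts); fast joint -> up to
                # double speed. The 4.0 scale factor is an arbitrary guess.
                rate = min(speed * 4.0, 2.0)
                max_patch.send_message(f"/rust/rate/{name}", rate)
        last[name] = (x, y, z, now)

    dispatcher = Dispatcher()
    dispatcher.map("/joint", on_joint)
    BlockingOSCUDPServer(("127.0.0.1", 7110), dispatcher).serve_forever()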


        I can trigger a sample when a joint begins to move, and if I keep moving past the length of the sample, it can loop from someplace in the middle of the sample until I stop moving.  This might take more time to create in Max, but could allow more freedom of movement.  On the other hand, the looping aspect might come off as obvious and bring down the production quality.
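        The gating logic for this idea is simple enough to sketch.  Reusing the speed calculation from the previous sketch, something like the following would send start/stop messages, with the Max patch owning the actual mid-sample loop points.  The threshold value and the OSC addresses are, again, invented placeholders.

    # Sketch of idea 2: edge-trigger on still -> moving, and tell Max to
    # loop a middle section of the sample until the joint stops again.
    SPEED_THRESHOLD = 0.15  # meters/second, to be tuned per joint
    moving = {}  # joint name -> bool

    def update_gate(name, speed, client):
        was_moving = moving.get(name, False)
        is_moving = speed > SPEED_THRESHOLD
        if is_moving and not was_moving:
            client.send_message(f"/rust/start/{name}", 1)  # fire the sample
        elif was_moving and not is_moving:
            client.send_message(f"/rust/stop/{name}", 1)   # exit the loop
        moving[name] = is_moving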


        Another way, and this might be the most ambitious yet highest quality, is to have tons of samples, each triggered when a specific relational distance between two or more joints occurs.  This would allow me to have a sample triggered whenever I stomp on the ground, turn my head, move an arm, or even sit down.
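        As a sketch, this could be a table of joint pairs and distance thresholds, checked each frame against the positions coming out of the /joint listener above.  The pairs, thresholds, and sample names below are made-up examples standing in for the stomp/head-turn/sit-down cases.

    # Sketch of idea 3: fire a sample when the distance between a pair of
    # joints drops below a threshold, edge-triggered so it fires once per
    # approach rather than continuously.
    import math

    RULES = [
        # (joint_a, joint_b, trigger distance in meters, sample name)
        ("r_hand", "head", 0.25, "cabinet_hinge_3"),
        ("l_foot", "r_foot", 0.15, "cabinet_slam_1"),
    ]
    positions = {}  # joint name -> (x, y, z), updated from /joint messages
    fired = set()   # rules currently inside their threshold

    def check_rules(client):
        for a, b, threshold, sample in RULES:
            if a in positions and b in positions:
                close = math.dist(positions[a], positions[b]) < threshold
                key = (a, b)
                if close and key not in fired:
                    client.send_message("/rust/play", sample)
                    fired.add(key)
                elif not close:
                    fired.discard(key)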


        There are probably many other ways I can map audio samples to the OSCeleton joints.  These are just the first ones that come to mind.  I will talk about other approaches with Patrice and James.  There's also the question of processing the samples, either live or beforehand, in order to further produce the realistic effect of a rusty metallic android.


        I'm meeting Patrice at my place for supper and a breakdown session.  Then our first concert hall rehearsal is on the 29th.  YES!!!






This project is my graduate thesis performance, which will be held March 8th, 2012 as part of Mills College's annual Signal Flow festival.  More on that later.  


-edantron

Monday, January 2, 2012

Got OSCeleton running

OSCeleton is running great, as are most of the OpenNI and NITE sample scripts.  I still can't get Mappinect running properly.  Only a person or two have posted about successfully loading Mappinect, and I'm working on figuring out what I need to do to get it to work.


I have a lot of faith that Mappinect will be our savior for this project, and will allow us to map controls in a way that is more like a dance and less like obvious gestures.  I will post any progress, and have asked a few people on the forums what they did to get it working.



This project is my graduate thesis performance, which will be held March 8th, 2012 as part of Mills College's annual Signal Flow festival.  More on that later.  


-edantron

New Developments

Sheesh, it's been longer than it should have been since I've posted news.


The latest development is that Zigfu (http://zigfu.com/) has created an installation package for OpenNI.  All that effort of installing and getting everything up and running is unnecessary now; setup is really easy.


Also, The Lab (http://benjamin.kuperberg.fr/lab/), a blog run by Ben Kuperberg, is developing software that would make Kinect mapping and routing a lot easier than it has been so far.





And lastly, Kinect Star Wars (http://lucasarts.com/games/kinectstarwars/).  I don't know how I feel about this.  I think I'm embarrassed.


I finally found a Kinect being sold by a guy in Venice Beach and have been running the example programs bundled with the suite of OpenNI tools.  So far so good, but now I will be testing out the alpha version of Mappinect, which apparently will soon include a GUI builder for easy data routing!

This project is my graduate thesis performance, which will be held March 8th, 2012 as part of Mills College's annual Signal Flow festival.  More on that later.  


-edantron