Monday, December 19, 2011

Patrice Intro!

Hello!


My name is Patrice Scanlon, and I started ballet lessons at age 6 and clarinet lessons at age 8. I love being on stage, even though the moments (or even months) before being on stage are filled with self-doubt as well as narcissistic glory and guilt.


These two worlds of music and dance were segregated for too long in my life. There was dance, and then there was music. It wasn’t until I went to undergrad at Stetson University that I realized these two worlds could be combined.


To be more specific, it was Eric Singer, http://ericsinger.com/old/, who visited my undergrad as a guest lecturer for a master class. He talked about the VideoIn object that he created for Max. He did a demo where his hands were controlling MIDI data. It finally clicked for me! This is what I WANT TO DO!!!


Of course! Let’s reverse the relationship: instead of pre-recorded music generating the movement, let the movement generate the music! At the ripe age of 20, I thought I had come up with the most genius idea ever. Three years later, I went to grad school at Mills College for Electronic Music and learned that this idea had been floating around forever.


During grad school, Eric Singer upgraded his free VideoIn object and sold it for a modest price as the Cyclops object. This new object could track colors. I found the latency to be much improved, since I no longer had to convert the color video from a mini-DV camcorder to grayscale, which was all the VideoIn object was capable of handling.
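(For anyone curious what color tracking actually involves, here is a rough sketch of the general idea in Python with OpenCV. It is only an illustration of color-blob tracking, not the Cyclops object itself or anything from my Max patches, and the camera index and hue range below are made up for the example.)

import cv2
import numpy as np

# Illustrative color-blob tracking: keep one hue range in HSV space and
# report the blob's centroid, a position you could then map to sound.
cap = cv2.VideoCapture(0)                  # default camera
lower = np.array([100, 120, 80])           # rough "blue" range in HSV
upper = np.array([130, 255, 255])

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, lower, upper)  # keep only the chosen color
    m = cv2.moments(mask)
    if m["m00"] > 0:                       # blob found: compute its centroid
        cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
        print("color blob at x=%.0f, y=%.0f" % (cx, cy))
    cv2.imshow("mask", mask)
    if cv2.waitKey(1) == 27:               # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()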


Eric Singer ignited the fire for the idea, but it was Les Stuck, http://stuckfootage.com/, who helped me actualize the concept into a working piece of technology! I am so grateful for his energy and expertise in the world of Max/MSP/Jitter as well as music and dance. I could not have accomplished this successfully without him!


Still, it was never as precise as I wanted it to be. Then I found the answer to a more precise device for motion tracking. It started when I attended a collaboration meeting for the dance and music students (mainly grad students, but a few undergrads as well) at Mills College. There, I learned that Edan was a motion tracking aficionado as well. After that meeting, Edan and I started a collaboration, which led to the purchase of a Kinect controller. That, in turn, led to installing bad scripts via Terminal that totally crippled my computer. I had to back up important files and re-image the machine. I tried again, using Tohm Judson’s instructions found here:

http://tohmjudson.com/?p=30

I have had success with this install. I believe I did have some workarounds throughout the process. All in all, it is working great for me!


I can’t believe how precise this Kinect controller is! The way I have been working for the past 13 years has been an X/Y, columns-and-rows approach. Now there is a Z plane! Now there are 13 points on the skeleton that can be tracked. I have to completely rethink the way I approach this data and the process of motion tracking.
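(To give a concrete picture of what this data looks like, here is a tiny listener sketch in Python. It assumes OSCeleton’s default /joint messages — joint name, user ID, x, y, z — arriving on UDP port 7110; the python-osc library, the port, and the script itself are just for watching the raw numbers, not part of the actual Max setup.)

from pythonosc import dispatcher, osc_server

# Print skeleton joints as OSCeleton sends them: /joint <name> <user> <x> <y> <z>
def on_joint(address, name, user_id, x, y, z):
    # x and y roughly match the old columns-and-rows thinking; z is the new depth axis
    print("user %s %s: x=%.3f y=%.3f z=%.3f" % (user_id, name, x, y, z))

d = dispatcher.Dispatcher()
d.map("/joint", on_joint)

server = osc_server.BlockingOSCUDPServer(("127.0.0.1", 7110), d)
print("listening for OSCeleton joints on port 7110...")
server.serve_forever()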


At the same time, I have another person to help me through the process, and that is Edan. We have done months of research, an improvised recording session (which I will post before the New Year!), and many tests where we watch the data and attach it to sounds to see what happens. I believe we have found out enough about what the data is doing to start creating the software to actualize the performance of the piece.
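(As one small, hypothetical example of the kind of mapping we have been testing — the joint, the coordinate range, and the numbers below are made up for illustration, not taken from our patch — you can scale a single joint coordinate into a 0–127 value, the sort of number that could drive a controller in Max or Logic.)

# Clamp a joint coordinate to [lo, hi] and map it onto the 0-127 controller range.
def scale_to_cc(value, lo, hi):
    value = max(lo, min(hi, value))
    return int(round((value - lo) / (hi - lo) * 127.0))

# Example: a right-hand height (y) that happens to move between 0.2 and 0.9.
for y in (0.2, 0.55, 0.9):
    print(y, "->", scale_to_cc(y, 0.2, 0.9))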


Happy Holidays!

Patrice

Tuesday, December 13, 2011

Patrice's test run

Kinect > OSC > Max > Logic



This project is my graduate thesis performance, which will be held March 8th, 2012 as part of Mills College's annual Signal Flow festival.


-edantron

........and GO!


This blog will be the posting location of all research, documentation, and development for my performance study utilizing the Kinect.  The project is a collaboration between me and Patrice Scanlon.

 


Although I really enjoy explaining things in great detail, especially to those who are hungry for it, I'm not going to spend the limited amount of time available writing on things which I already know.  In other words, this blog is not a place for instruction (though I will be posting useful links) as much as it is a place for me and colleagues to assess our progress in this project.  



So far!



At this moment, Patrice has the Kinect sending data out as OSC, which means we're able to route OSC messages to whatever programs can take them.  This has been done using OpenNI and OSCeleton.  The setup process for getting OSC messages out of the Kinect is a bit lengthy.  Outdated instructions are found at Tohm Judson's site, which was extremely helpful, but as a user with little to no development skills, I had to fill in the outdated pieces by searching forums, trial and error, and improvising. 
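(As a sketch of what "route OSC messages to whatever programs can take them" can look like, here is an illustration in Python with the python-osc library — not our actual setup; the port numbers and joint names are assumptions.  It listens for OSCeleton's /joint messages and forwards only the joints you care about to another port, such as a Max patch listening with [udpreceive 9000].)

from pythonosc import dispatcher, osc_server
from pythonosc.udp_client import SimpleUDPClient

forward = SimpleUDPClient("127.0.0.1", 9000)   # destination program, e.g. [udpreceive 9000] in Max
WANTED = {"head", "l_hand", "r_hand"}          # joints to pass along (names assumed)

# Re-send selected /joint messages (name, user ID, x, y, z) to the destination.
def on_joint(address, name, user_id, x, y, z):
    if name in WANTED:
        forward.send_message("/joint", [name, user_id, x, y, z])

d = dispatcher.Dispatcher()
d.map("/joint", on_joint)
osc_server.BlockingOSCUDPServer(("127.0.0.1", 7110), d).serve_forever()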



edantron's Notes on installing OSCeleton for the Kinect:




When following the installation instructions from Tohm Judson's site, everything is fine until you need to install OpenNI, because the version that is now available is later than the one used for the instructions.  The difference is a couple of extra steps, which involve installing 'doxygen' and 'graphviz' using the previously installed MacPorts program.  You do this by typing this into Terminal:


sudo port install doxygen graphviz


If this is not done, you won't be able to install OpenNI using the readme instructions downloaded with the OpenNI zip file (which is what you need to follow, NOT Tohm's instructions, which point to an install.sh file that doesn't exist anymore).


At the point where the instructions say to COPY FILES, the directory mentioned also doesn't exist anymore; instead, follow the instructions on adding the license key to 3 XML files.  Those instructions are on Kan Yang Li's blog.   


Phew!  At this point, I'm trying to scrounge some money to own a Kinect.  HA!  I don't even have one yet.  Craigslist will be my glory on that task.  



In conclusion to this post:

 


There will be lots of details throughout the project as Patrice and I mold an interface between our bodies and the sounds and controls we wish to have.  The interface development and the artistic development will happen at the same time, and both will have a strong presence in this blog.  


This project is my graduate thesis performance, which will be held March 8th, 2012 as part of Mills College's annual Signal Flow festival.  More on that later.  


-edantron