Monday, December 19, 2011

Patrice Intro!


My name is Patrice Scanlon, and I started ballet lessons when I was 6 and clarinet lessons at the age of 8. I love being on stage even though the moments or even months before being on stage are filled with self-doubt as well as narcissistic glory and guilt.

These two worlds of music and dance were segregated for too long in my life. There was dance and then there was music. However, it wasn’t until I went to undergrad at Stetson University that I realized that these two worlds could be combined.

To be more specific, it was Eric Singer, who visited my undergrad as a guest lecturer for a master class. He talked about the VideoIn object that he created for Max. He did a demo where his hands were controlling MIDI data. It finally clicked for me! This is what I WANT TO DO!!!

Of course! Let’s reverse the relationship: instead of pre-recorded music generating the movement, let the movement generate the music! At the ripe age of 20, I thought I had come up with the most genius idea ever. Three years later, I went to grad school at Mills College for Electronic Music and learned that this idea had been floating around forever.

During grad school, Eric Singer upgraded his free VideoIn object and sold it for a modest price as the Cyclops object. This new object could track colors. I found the latency much improved, since Cyclops could work with the color video from a mini-DV camcorder directly instead of first converting it to grayscale, which was all the VideoIn object was capable of.
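To give a flavor of what color tracking does under the hood (this is just a generic sketch in Python, not how Cyclops or Max actually implements it): given an RGB frame, find the centroid of the pixels close to a target color, and that (x, y) point becomes your control signal.

```python
import numpy as np

def track_color(frame, target, tol=30):
    """Return the (x, y) centroid of pixels whose summed RGB distance
    from `target` is below `tol`, or None if nothing matches.
    `frame` is an H x W x 3 uint8 array."""
    dist = np.abs(frame.astype(int) - np.array(target, dtype=int)).sum(axis=2)
    ys, xs = np.nonzero(dist < tol)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# Toy frame: a black image with a red patch; the tracker finds its center.
frame = np.zeros((24, 32, 3), dtype=np.uint8)
frame[6:11, 10:15] = (255, 0, 0)
print(track_color(frame, (255, 0, 0)))  # -> (12.0, 8.0)
```

Skipping the grayscale conversion step is exactly where the latency savings come from: you threshold the color frame once instead of transcoding it first.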

Eric Singer ignited the fire for the idea, but it was Les Stuck who helped me actualize the concept into a working piece of technology! I am so grateful for his energy and expertise in the world of MaxMSP/Jitter as well as music and dance. I could not have accomplished this successfully without him!

Still, it was never as precise as I wanted it to be. Then I found the answer to a more precise device for motion tracking. It started by attending a collaboration meeting for the dance and music students, mainly grad students but a few undergrads as well, at Mills College. There, I learned that Edan was a motion tracking aficionado as well. After that meeting, Edan and I started a collaboration, which led to the purchase of a Kinect controller, which in turn led to installing bad scripts via Terminal that totally crippled my computer. I had to back up important files and re-image the machine. I tried again, using Thom Judson’s instructions found here:

I have had success with this install, though I did need some workarounds throughout the process. All in all, it is working great for me!

I can’t believe how precise this Kinect controller is! The way I have been working for the past 13 years has been an X/Y, columns-and-rows approach. Now there is a Z-plane! Now there are 13 points on the skeleton that can be tracked. I have to completely re-think the way I approach this data and the process of motion tracking.
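To show what that re-thinking looks like in practice, here is a minimal Python sketch (the real work happens in Max, and the joint ranges here are assumptions) of mapping one skeleton joint's (x, y, z) position to MIDI-style values: left/right position to pitch, height to loudness, and the new depth axis to a continuous controller.

```python
def scale(value, lo, hi, out_lo, out_hi):
    """Linearly map `value` from [lo, hi] to [out_lo, out_hi], clamped."""
    t = max(0.0, min(1.0, (value - lo) / (hi - lo)))
    return out_lo + t * (out_hi - out_lo)

def joint_to_midi(joint):
    """Map a joint position to (pitch, velocity, cc).
    x and y are assumed normalized 0..1 across the camera frame;
    z is assumed to be distance from the sensor in meters."""
    x, y, z = joint
    pitch = int(round(scale(x, 0.0, 1.0, 36, 84)))      # left-right -> pitch
    velocity = int(round(scale(y, 0.0, 1.0, 0, 127)))   # height -> loudness
    cc = int(round(scale(z, 0.5, 4.0, 127, 0)))         # closer -> brighter
    return pitch, velocity, cc

# A hand centered in the frame, raised high, 2 m from the sensor:
print(joint_to_midi((0.5, 1.0, 2.0)))  # -> (60, 127, 73)
```

With 13 tracked points, the same mapping can run per joint, which is exactly why the old single-blob, columns-and-rows mindset no longer fits.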

At the same time, I have another person to help me through the process and that is Edan. We have done months of research, an improvised recording session (which I will post before the New Year!), and have conducted many tests to watch data and attach it to sounds to see what happens. I believe we have found out enough about what the data is doing to start creating the software to actualize the performance of the piece.

Happy Holidays!

