The semester has started up again, and I now have my weekly schedule down and steady. Winter break was awesome, but I didn't spend nearly as much time on this project as I expected. That's alright, though; I still feel like we've got plenty of time to work, and this project is now a top priority in my life. This and cooking some new dishes :)
There haven't been any updates on Mr. Kuperberg's lab website, and I've received no response in the comments and forum sections about how to get it going. So for now, I'm going to get comfortable with OSCeleton and plan on using it for the show.
There have been some compositional updates that I'm excited to try out. I've recorded my mom's kitchen cabinets with a contact microphone. These cabinets are the loudest, squeakiest metallic cabinets I've ever heard! So at the moment, the idea is to map refined samples of these cabinets to individual OSCeleton joints and create a rusty android type of creature. Each cabinet I recorded has its own pattern and timbre, which makes me want to associate each one with a different joint. The challenge will lie in how to use the incoming floating-point joint values to trigger these samples so that the sound seems like a natural response to my movement. Here are a few ideas:
I can map the rate of change of a joint's values to the playback speed. This is cool because the samples would stop when the corresponding joint stops. But it might sound like a DJ scratching a record, which is sweet, but not what I'm going for.
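The actual mapping will live in a Max patch, but the math behind this first idea is simple enough to sketch in Python. Everything here is assumed for illustration: the joint coordinates, the frame interval, and the `max_speed` normalization constant are all made-up values, not anything OSCeleton actually sends.

```python
import math

def joint_speed(prev, curr, dt):
    """Euclidean speed of a joint between two successive frames (units/sec)."""
    return math.dist(prev, curr) / dt

def speed_to_rate(speed, max_speed=2.0):
    """Map joint speed to a playback rate in [0, 1]; 0 means the sample stops."""
    return min(speed / max_speed, 1.0)

# Hypothetical frames: a hand joint moving 0.3 units over one 30 fps frame
prev = (0.0, 1.2, 2.0)
curr = (0.3, 1.2, 2.0)
rate = speed_to_rate(joint_speed(prev, curr, 1 / 30))
```

Because the rate falls to zero the instant the joint stops, this is also where the record-scratch character comes from: any back-and-forth jitter in the joint data shows up directly as speed wobble in the sample.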
I can trigger samples when a joint begins to move, and if I keep moving past the length of the sample, it can loop over from someplace in the middle of the sample until I stop moving. This might take more time to create in Max, but could allow more freedom of movement. On the other hand, the looping aspect might come off as obvious and bring down the production quality.
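The trigger-then-loop idea boils down to a small state machine, which might be worth prototyping before building it in Max. This is only a sketch of that logic; the sample length, loop point, and motion threshold are placeholder numbers I've invented, and real playback would happen in the patch, not here.

```python
class LoopingTrigger:
    """Start a sample when a joint starts moving; once playback runs past the
    end, wrap back to a mid-sample loop point for as long as motion continues."""

    def __init__(self, sample_len, loop_start, threshold=0.05):
        self.sample_len = sample_len  # sample length in seconds (assumed)
        self.loop_start = loop_start  # mid-sample loop point in seconds (assumed)
        self.threshold = threshold    # minimum speed that counts as "moving"
        self.pos = None               # playback position; None means stopped

    def update(self, speed, dt):
        """Advance one frame; return the playback position, or None if stopped."""
        moving = speed > self.threshold
        if moving and self.pos is None:
            self.pos = 0.0                   # onset: trigger from the top
        elif not moving:
            self.pos = None                  # joint stopped: stop playback
        else:
            self.pos += dt
            if self.pos >= self.sample_len:
                self.pos = self.loop_start   # wrap to the mid-sample loop
        return self.pos

# Hypothetical use: a 1-second sample looping from its midpoint
lt = LoopingTrigger(sample_len=1.0, loop_start=0.5)
```

Seeing the wrap happen as a bare position jump also makes the production worry concrete: unless the loop point is chosen at a seam where the cabinet squeak repeats naturally, that jump will be audible.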
Another way, and this might be the most ambitious yet highest quality, is to have tons of samples, each triggered when a specific relational distance between two or more joints occurs. This would let me trigger a sample whenever I stomp on the ground, turn my head, move an arm, or even sit down.
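The relational-distance idea can also be sketched as a threshold with re-arming, so a gesture fires its sample once rather than on every frame the joints stay close. Again, this is just an illustration: the joint names, the threshold, the sample filename, and the idea that a head turn can be approximated by head-to-shoulder distance are all my own assumptions, not part of OSCeleton.

```python
import math

def make_distance_trigger(joint_a, joint_b, threshold, sample):
    """Fire `sample` once when two named joints come within `threshold` of
    each other; re-arm only after they separate again."""
    armed = {"ok": True}

    def check(joints):
        d = math.dist(joints[joint_a], joints[joint_b])
        if d < threshold and armed["ok"]:
            armed["ok"] = False
            return sample          # gesture detected: trigger once
        if d >= threshold:
            armed["ok"] = True     # joints separated: ready for next gesture
        return None

    return check

# Hypothetical gesture: head dipping toward the right shoulder
head_turn = make_distance_trigger("head", "r_shoulder", 0.3, "cabinet3.wav")
frame = {"head": (0.0, 1.6, 2.0), "r_shoulder": (0.2, 1.4, 2.0)}
```

One of these per gesture-sample pair is what makes this the most ambitious option: every stomp, turn, and sit needs its own tuned pair of joints and threshold.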
There are probably many other ways I can map audio samples to the OSCeleton joints; these are just the first ones that come to mind. I'll talk over other approaches with Patrice and James. There's also the question of processing the samples, either live or beforehand, to further the effect of a rusty metallic android.
Meeting Patrice at my place for supper and a breakdown. Then our first concert hall rehearsal is on the 29th. YES!!!
This project is my graduate thesis performance, which will be held March 8th, 2012 as part of Mills College's annual Signal Flow festival. More on that later.