Kinect Control Of Sample Triggers And Effects

Hello all, thanks in advance. I've got the Kinect portion all figured out and have been using it to create fun video effects… but I don't know much about audio. I'd like to activate a physical room by triggering samples and tweaking effects. I already have the ability to send OSC, and I'm starting playback of a song (that was easy, since it's built into Renoise in the form of "/renoise/transport/start").
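For anyone following along, sending that transport message needs no special library. Here is a minimal sketch using only the Python standard library, assuming Renoise's OSC server is enabled and listening on its default UDP port 8000 on the local machine (both the host and port are configurable in Renoise's preferences):

```python
import socket
import struct

def osc_message(address, *int_args):
    """Encode a minimal OSC message: a string address plus int32 arguments."""
    def pad(b):
        # OSC strings are null-terminated, then padded to a 4-byte boundary
        b += b"\x00"
        return b + b"\x00" * (-len(b) % 4)
    msg = pad(address.encode("ascii"))
    msg += pad(("," + "i" * len(int_args)).encode("ascii"))  # type tag string
    for arg in int_args:
        msg += struct.pack(">i", arg)  # big-endian int32
    return msg

# Renoise's OSC server listens on UDP port 8000 by default
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(osc_message("/renoise/transport/start"), ("127.0.0.1", 8000))
```

OSC over UDP is fire-and-forget, which is exactly what you want for low-latency triggering from a Kinect loop.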

I looked at the scripting API and it seemed like a smart idea to ask for a bit of help. What would be the proper way for me to script the triggering of samples? And is it possible to dynamically tweak the audio with effects in real time? Sorry for my newbness.

I'm reading now; it seems like it might be the correct path. Thanks for the help.

Alpay Kasal

You mean that you'd like to create a virtual 3D-space MIDI controller?

Yeah you can do quite a bunch of cool things with OSC and Kinect…

For triggering samples, I guess you can mess around with the Instrument Chainer tool; it reroutes through the Renoise OSC server to trigger instruments.
You then have to map specific positions to specific notes (which will let you trigger specific samples).
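To sketch what that rerouting boils down to: Renoise's default OSC patterns include note triggering, e.g. "/renoise/trigger/note_on" with instrument, track, note, and velocity as integer arguments (check the OSC section of Renoise's preferences for the exact pattern list on your version). A self-contained example, again assuming the default server at 127.0.0.1:8000:

```python
import socket
import struct

def send_osc(address, *int_args, host="127.0.0.1", port=8000):
    """Build and send one OSC message with int32 arguments; returns the bytes sent."""
    def pad(b):
        return b + b"\x00" * (4 - len(b) % 4)  # null-terminate + pad to 4 bytes
    msg = pad(address.encode("ascii"))
    msg += pad(("," + "i" * len(int_args)).encode("ascii"))
    msg += b"".join(struct.pack(">i", a) for a in int_args)
    socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(msg, (host, port))
    return msg

# Play note C-4 (48) at velocity 127 on instrument 1, track 1, then release it
send_osc("/renoise/trigger/note_on", 1, 1, 48, 127)
send_osc("/renoise/trigger/note_off", 1, 1, 48)
```

Each Kinect trigger zone can then simply map to a different note (and therefore a different sample in the instrument's key mapping).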

Interesting video examples, with good and bad things.

Personally, I would not have coded a 3D piano-like keyboard. The keys of a real piano lie horizontally, not vertically, which requires the controller to re-create in OpenGL a virtual 3D space in which to place a 3D keyboard…

Simply put, a vertical instrument, like a classical harp, with vertical strings that you can "hit", would be better and easier to code. And something could have been simpler with the "webcam view" (the position of the hands). The coders have understood the notion of a "threshold" on the Z plane: the last video example shows that the color of the hands changes depending on whether they are closer to the Kinect or not. That means you can "isolate" the hands, auto-mask them, and place them directly on the front webcam view, on top of a background picture representing the instrument. Auto-masking the hands makes them usable as a 2D "pointing device": they define an X-axis position on a scale (corresponding to a note), and if the Z position <= a defined threshold, the note is ON, otherwise the note is OFF.
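That X-to-note, Z-threshold gating is only a few lines. A sketch, with a made-up threshold value and scale (the hand coordinates would come from whatever Kinect tracking you use, normalized so X runs 0.0–1.0 across the image and Z is the distance from the sensor in metres):

```python
Z_THRESHOLD = 0.8  # arbitrary "string plane" distance from the Kinect, in metres
C_MAJOR = [48, 50, 52, 53, 55, 57, 59, 60]  # one octave of MIDI note numbers

def hand_to_note(hand_x, hand_z):
    """Map the hand's X position to a note on the scale; gate it with Z.

    Returns (note, is_on): is_on is True when the hand has pushed past
    the virtual string plane.
    """
    idx = min(int(hand_x * len(C_MAJOR)), len(C_MAJOR) - 1)
    return C_MAJOR[idx], hand_z <= Z_THRESHOLD
```

Per frame you would compare the result against the previous frame's state and only send a note-on or note-off when is_on actually changes, so a hand held past the threshold doesn't retrigger the sample every frame.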

I'll try to install the Microsoft beta SDK at home and see if I can make something like that with my own Kinect device… something like a virtual laser harp, à la Jean-Michel Jarre.

Some inspiration:

vV, haha! The second video you posted is one of mine; I was the one holding the camera… that's the office I'm sitting in right now. That's so funny. We made that video within the same week the Kinect was released.

Kurtz, not really like a harp or keyboard hanging in the air, more like triggers throughout the room… I'm doing it now using the Unity game engine and its built-in collision detection, but I wanted a better way to deal with sound and effects.

Conner, Kurtz, vV, I'll take a look at the links you guys provided, thanks; very much appreciated. Also, would you say that perhaps Renoise is not a good tool for my needs? Should I look elsewhere?

It depends on what exactly you want. Triggering samples in instruments is no problem through OSC. Controlling effects probably requires additions to GlobalOscActions.lua (you can add your own scripts that make Renoise do specific things in response to OSC messages, user-defined OSC messages if you want to).
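If I remember the format right, a custom action in GlobalOscActions.lua looks roughly like the sketch below. The pattern name and the track/device/parameter indices here are made up for illustration; check the comments at the top of the shipped file for the exact helper names and argument syntax in your Renoise version.

```lua
-- Hypothetical addition to GlobalOscActions.lua: map a custom OSC message
-- to an effect device parameter. Indices and pattern are examples only.
add_action {
  pattern = "/custom/fx_wet",
  description = "Set the first parameter of device 2 on track 1",
  arguments = { argument("value", "number") },
  handler = function(value)
    -- song() is the helper available inside GlobalOscActions.lua
    song().tracks[1].devices[2].parameters[1].value = value
  end
}
```

With something like that in place, your Kinect app can sweep an effect (a filter cutoff, a delay send, whatever the parameter is) just by streaming values to "/custom/fx_wet".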

I'll look into that too, thanks for the help. I also wondered whether Reason would give me direct access to its effects through OSC. For now I'll prototype with vvvv to make sure I'm getting no latency, then add Renoise to the mix.

FWIW, this is another video I worked on… it shows a hint of what I'm after, and one thing not seen is how I turn the lights on and off by reaching for them with the Kinect. Ideally I want to associate a sound with objects throughout the room.