How to use your Kinect like a MIDI controller with Renoise [Windows only]

This trick is based on several free programs: MIDI Yoke, Kinectar, Ethno Tracker, and specific Kinect drivers (they’re all free and easy to install).


After a few hesitations I’ve found a way to make it work with Renoise:

  1. Download Kinectar for Windows. The package contains two programs: the first one, Ethno Tracker, tracks body movements and converts them to data sent via the OSC protocol; the second one, Kinectar, is a kind of OSC-to-MIDI converter with lots of mid-level parameters and extra possibilities.

  2. Uninstall any previously installed OpenNI/PrimeSense drivers, then reinstall all the drivers, but only through the recommended ZigJSOpenNI_v1.0.1 installer package.

  3. If it’s not already done, install MIDI Yoke, a free virtual MIDI I/O driver and a must-have (that you should never uninstall). Once that’s done, restart your computer.

  4. Then close your internet connection (this can be a quick fix in situations where the OSC protocol tries to use a wrong IP by default from the start, or when your PrimeSense driver is slow or crashing).

  5. Launch Kinectar. At first it won’t work; that’s normal. Something must “feed” it: it waits for another program to launch.

  6. Launch Ethno Tracker and define a large view window (1600x900, for example). Your Kinect’s red light will turn on and you’ll be able to perform the T-pose. Ethno Tracker looks very fast at detecting gestures, which is a good thing.

  7. Ethno Tracker first sends its data to Kinectar via the OSC protocol on port 8001 (Kinectar autodetects Ethno Tracker), and after a few gestures (approx. 10 seconds are needed) Kinectar tracks Ethno Tracker’s joints. If you want to inspect this OSC stream yourself, see the sketch after this list.

  8. And… you’ll probably ask yourself: “how the f…k does this thing act like a virtual MIDI controller???”

  9. Go into the Kinectar Instrument Editor, then into the Control Changes section. In mode A, check the A box on the left side, select “coords(body)” in the list view, define an axis (x, for example), define a realistic min/max value, check the MIDI box, and for channel 1 select your MIDI Yoke 1 output device. When you move your arms, the data moves the CC slider in Kinectar and is sent to the MIDI Yoke virtual device.

  10. Finally (it’s the last step), launch Renoise. It will enumerate your MIDI devices at start and find the MIDI Yoke 1 device; in the Preferences tab, define it as your default MIDI In device.

  11. Then click the Renoise MIDI Map button, select what you want to map, and run some tests.

  12. It works: I’ve mapped 3 sliders with no effort.
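By the way, if you want to check what Ethno Tracker is actually sending on port 8001 (as mentioned in step 7), here’s a quick, untested plain-Java sketch with no external libraries. Close Kinectar first, since only one program can listen on that UDP port. An OSC message starts with its address pattern as a null-terminated string, so printing that prefix is enough to spot the patterns:

```java
// Minimal OSC monitor: dumps the address pattern of every UDP packet
// arriving on port 8001, where Ethno Tracker sends its OSC data.
import java.net.DatagramPacket;
import java.net.DatagramSocket;

public class OscDump {
  public static void main(String[] args) throws Exception {
    DatagramSocket socket = new DatagramSocket(8001); // where Ethno Tracker sends
    byte[] buf = new byte[1024];
    while (true) {
      DatagramPacket packet = new DatagramPacket(buf, buf.length);
      socket.receive(packet);
      // The address pattern is the bytes up to the first zero byte.
      int end = 0;
      while (end < packet.getLength() && buf[end] != 0) end++;
      String address = new String(buf, 0, end, "US-ASCII");
      System.out.println(address + "  (" + packet.getLength() + " bytes)");
    }
  }
}
```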

Would it be possible to write a script that deals “directly” with the OSC messages from Ethno Tracker, without needing Kinectar?

Does this also work with an ordinary webcam?

Unfortunately no: Ethno Tracker really needs a Kinect (or similar) device to send its data via OSC.

For webcam-based motion tracking, I think that’s something that can be done with Pure Data instead; check here.

I gave a talk a year or so ago about using the Kinect for assorted audio and visual applications. I put the code up on GitHub: Neurogami/KinectForArtists.

I’ve been expanding this into an ebook on the topic, Kinect Hacking for Artists: http://kinect.justthebestparts.com/

It feeds off another, more complete book, OSC for Artists: http://osc.justthebestparts.com/

For the Kinect book I have more code for controlling Renoise using the Kinect via Processing. It will have the option of sending either MIDI or OSC.
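To give a flavor of the shape such a bridge takes (this isn’t the book’s code, just a minimal untested sketch, and it also suggests one answer to the earlier question about skipping Kinectar), here’s a Processing sketch using the oscP5 and The MidiBus libraries. The address pattern "/lefthand/x", the 0..1 value range, and the MIDI Yoke port name are assumptions you’d adjust for your setup:

```java
// Forward one OSC value straight to MIDI Yoke as a MIDI CC,
// replacing Kinectar as the OSC-to-MIDI bridge.
// Needs the oscP5 and The MidiBus libraries for Processing.
import oscP5.*;
import themidibus.*;

OscP5 osc;
MidiBus midi;

void setup() {
  size(200, 200);
  osc = new OscP5(this, 8001);  // listen where Ethno Tracker sends
  MidiBus.list();               // prints available MIDI devices to the console
  // The port name is a guess -- pick the right one from the list above.
  midi = new MidiBus(this, -1, "Out To MIDI Yoke:  1");
}

void draw() { }

// oscP5 calls this for every received OSC message
void oscEvent(OscMessage msg) {
  // "/lefthand/x" and the 0..1 range are assumptions; check the real
  // address patterns with an OSC monitor first.
  if (msg.checkAddrPattern("/lefthand/x") && msg.checkTypetag("f")) {
    float x = msg.get(0).floatValue();
    int cc = int(constrain(x, 0, 1) * 127);
    midi.sendControllerChange(0, 1, cc); // MIDI channel 1, CC #1
  }
}
```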

It’s ostensibly cross-platform, though depending on the OS I needed different tools for virtual MIDI devices.

I didn’t know Processing. Thanks a lot, I’m going to check all this out!

Processing (http://processing.org) is a language created to help artists and musicians create fun and interesting code projects without having to know all the gnarly details of drawing lines and circles or making sounds. It’s kinda sorta a simplified version of Java. PBS has a nice intro video to the general field of creative coding.

It’s fairly easy to pick up, but the very cool thing is that, secretly under the hood, it really is Java. This means that your code can use libraries meant for Java. People have created wrappers around assorted Kinect libraries so that using a Kinect from Processing is pretty simple (more or less). There are also OSC and MIDI libraries for Processing.
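For example, Renoise has its own OSC server built in (enable it in the preferences; the default is UDP port 8000), so a Processing sketch can trigger notes in Renoise without any MIDI in between. A minimal, untested example with oscP5, using a message from Renoise’s documented default OSC implementation:

```java
// Trigger a note in Renoise over OSC (no MIDI needed).
// Enable Renoise's OSC server in its preferences first (default: UDP port 8000).
import oscP5.*;
import netP5.*;

OscP5 osc;
NetAddress renoise;

void setup() {
  size(200, 200);
  osc = new OscP5(this, 9000);                 // our own listening port, unused here
  renoise = new NetAddress("127.0.0.1", 8000); // Renoise's OSC server
}

void draw() { }

void mousePressed() {
  // /renoise/trigger/note_on takes instrument, track, note, velocity;
  // -1 means "currently selected" for instrument and track.
  OscMessage msg = new OscMessage("/renoise/trigger/note_on");
  msg.add(-1);
  msg.add(-1);
  msg.add(48);  // C-4
  msg.add(100); // velocity
  osc.send(msg, renoise);
}
```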

The “new” SynthMaker (since renamed to FlowStone) can perform webcam analysis and convert motion-detection data to MIDI info. Check here.