Osc Output

Hi,

Is there a way, either with Duplex or by modifying “GlobalOscActions.lua” in the “Scripts” folder, to output OSC (Open Sound Control) messages from Renoise?

I would like to trigger video events in Pure Data (or VDMX, Modul8, or any other OSC application that can do video) in sync with events played from my Renoise patterns. I was hoping this could be done in Renoise 2.6, but I don’t see an obvious way to do it for now.

Hi,

I don’t know if OSC messages can be sent from Renoise (I am a beginner).
It would also be useful in my case: I want to generate simple video game events (with the PyGame library) in response to the sounds played in Renoise.

In the meantime, I am trying to read Renoise’s MIDI output from my game (using the PyPortMidi library bundled with PyGame) through JACK. You should normally be able to read Renoise’s MIDI output from Pure Data the same way.

The answer is yes, Renoise can send and receive OSC quite well, on par with everything else.
There is no easy way or ultimate method to explain how to do it,
except to review and familiarize yourself with everything in the tech subforum and on http://code.google.com/p/xrnx/

Some docs to read:

Cheers.

Ok, I will check it out… doesn’t look too hard even for a Lua newbie.

First of all, is there a specific forum for questions about scripting?

Ok, I read through the documentation and examples but I still have some questions.
What I want is to output the contents of the note and effect columns over OSC as they are reached by the playback_pos.

Ideally, I would observe the playback_pos, read the line data with renoise.song().patterns[].tracks[].lines[] and output it with renoise.Osc.Bundle().
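
Roughly, I picture the sending side like this (just a sketch, not tested: the 127.0.0.1:8000 target and the "/renoise/line" address are placeholders for whatever the receiving patch expects, and it assumes a normal sequencer track):

-- open a UDP OSC connection to the receiver (Pure Data, VDMX, ...)
local client, socket_error = renoise.Socket.create_client(
  "127.0.0.1", 8000, renoise.Socket.PROTOCOL_UDP)
assert(not socket_error, socket_error)

-- read one pattern line and send its non-empty note columns as an OSC bundle
local function send_line(pattern_index, track_index, line_index)
  local song = renoise.song()
  local line = song.patterns[pattern_index].tracks[track_index].lines[line_index]

  local messages = {}
  for column = 1, song.tracks[track_index].visible_note_columns do
    local note_column = line.note_columns[column]
    if not note_column.is_empty then
      messages[#messages + 1] = renoise.Osc.Message("/renoise/line", {
        { tag = "i", value = track_index },
        { tag = "i", value = note_column.note_value },
        { tag = "i", value = note_column.instrument_value },
      })
    end
  end

  if #messages > 0 then
    client:send(renoise.Osc.Bundle(os.clock(), messages))
  end
end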

For this purpose, I checked renoise.song().transport.playback_pos but it is not observable.
Is this done on purpose to avoid overloading Renoise and creating glitches?

So the other solution is to use an on_idle() method, during which I would check if the playback_pos has changed and output the OSC bundles if so.
Is this the proper way to do this? Isn’t there a chance I might miss note or effect events?
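
For the idle route I picture something like this (again only a sketch; it reuses the send_line() helper from above and only watches the currently selected track):

local last_pos = nil

renoise.tool().app_idle_observable:add_notifier(function()
  local song = renoise.song()
  if not song.transport.playing then
    last_pos = nil
    return
  end

  local pos = song.transport.playback_pos
  if last_pos and pos.sequence == last_pos.sequence and pos.line == last_pos.line then
    return -- still on the same line, nothing to do
  end
  last_pos = pos

  -- map the sequence slot to its pattern and send the current line
  local pattern_index = song.sequencer.pattern_sequence[pos.sequence]
  send_line(pattern_index, song.selected_track_index, pos.line)
end)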

Yes there certainly is:

Realtime changes introduce latency, as the Lua scripting engine is not fast enough to constantly synchronize with realtime events, so yes, there is a big chance you will miss note or effect events.
The sequence position has an observable you can work with, but the song_pos.line element has none. Indeed, polling the line state (from the idle handler or a coroutine) is the only way to do something, and not in the sense of “if line == position x then do y” but only in the sense of “if line >= position x then do y”.
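
In practice that means the idle check should catch up over the lines it skipped instead of only looking at the exact current line. A rough variation on the sketch above (it still reuses send_line() and, for brevity, ignores pattern and sequence wrap-around):

local last_line = 0

renoise.tool().app_idle_observable:add_notifier(function()
  local song = renoise.song()
  if not song.transport.playing then
    last_line = 0
    return
  end

  local pos = song.transport.playback_pos
  local pattern_index = song.sequencer.pattern_sequence[pos.sequence]

  -- handle every line passed since the last idle call, not just pos.line
  if pos.line > last_line then
    for line = last_line + 1, pos.line do
      send_line(pattern_index, song.selected_track_index, line)
    end
  end
  last_line = pos.line
end)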

Ok, I started my plugin. It can be found here: Osc Output Initial Version

Sorry to bump an old thread.

This is really an important point. I’ve been trying to use the OSC Output tool to trigger precise events in Processing and Pure Data, and it gave me headaches because it was losing certain triggers.

It’s important to know that there’s a chance of missing note or effect events.

But anyway:

  • has anything changed about this since 2010, e.g. in 3.0? Is triggering video (for example) better done via MIDI then (or MIDI -> Pure Data -> OSC)?

L.

I am the original developer and I gave up development because there was too much latency. MIDI OUTPUT IS WAY BETTER THAN OSC OUTPUT in Renoise.

Yes, it seems like the culprit is the Lua engine.

True, I’ve managed to create a simple bridge using Pure Data to send OSC (to Processing, or whatever). In the screenshot, all controller data is forwarded as OSC… I guess I could also use MIDI input in Processing, but that means sticking with one machine. With OSC I can connect to another machine (yes, I know there is MIDI over LAN too).

[Screenshot attachment: Screenshot - 06022014 - 08:30:49 AM.png]

Yes, there needs to be an actual OSC output for instruments, or a scriptable realtime thread in Lua, before this is possible.
But for all intents and purposes, MIDI is still a good solution (remember, you can enter many different MIDI commands in the pattern, not just notes).

Another route to go down would be to preprocess the timing. So, not realtime, but you end up with a bag of data that you can use for control.
An example: https://github.com/furusystems/rnssync
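
A very rough sketch of that idea (not how rnssync itself works; it assumes a constant BPM and LPB and only looks at note columns): walk the pattern sequence once and turn every non-empty note column into a time-stamped record that you can export or play back later.

-- collect { time (seconds), track, note } records for the whole song
local function collect_note_events()
  local song = renoise.song()
  local secs_per_line = 60 / (song.transport.bpm * song.transport.lpb)

  local events = {}
  local lines_elapsed = 0

  for _, pattern_index in ipairs(song.sequencer.pattern_sequence) do
    local pattern = song.patterns[pattern_index]
    for track_index = 1, song.sequencer_track_count do
      local pattern_track = pattern.tracks[track_index]
      for line_index = 1, pattern.number_of_lines do
        local line = pattern_track.lines[line_index]
        for column = 1, song.tracks[track_index].visible_note_columns do
          local note_column = line.note_columns[column]
          if not note_column.is_empty then
            events[#events + 1] = {
              time = (lines_elapsed + line_index - 1) * secs_per_line,
              track = track_index,
              note = note_column.note_value,
            }
          end
        end
      end
    end
    lines_elapsed = lines_elapsed + pattern.number_of_lines
  end

  return events
end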

How about an OSC meta device, much the same as the MIDI Control one?
The same way you choose a MIDI output, you could choose a host/IP and port,
and write the mount points each controller value would be sent to.

just an idea.

But a good one!
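
In the meantime something close can be scripted by hand: hang a notifier on a device parameter and forward every value change as OSC. A bare sketch (the track/device/parameter indices and the OSC address are arbitrary placeholders, and it assumes an open OSC client like the one in the earlier sketch):

-- forward one parameter of one device as an OSC float whenever it changes
local parameter = renoise.song().tracks[1].devices[2].parameters[1]

parameter.value_observable:add_notifier(function()
  client:send(renoise.Osc.Message("/renoise/track1/device2/param1", {
    { tag = "f", value = parameter.value }
  }))
end)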