Automating Non-Automatable Renoise Features

Attached is an example tool that sets up an Instrument Automation device and links it to the song’s groove sliders.
You could also pick your own device, as long as its display name matches what the tool looks for, the device sits in the master track, and its first four parameters are changeable sliders. (A minimal sketch of the linking idea follows the list below.)
The tool is not meant to be released as a public-use tool, but for other tool developers to use the idea for creating new tools that allow automation of non-automatable processes, like:
- Instrument envelopes
- Sample loop points
- Pattern breaks to arbitrary positions in the sequencer
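
Below is a minimal sketch of the linking idea. The device path and display name here are assumptions on my part (the attached tool may set things up differently): a dummy *Instr. Automation device is inserted in the master track, and its first parameter is mirrored onto the first groove slider whenever automation moves it.

    -- Minimal sketch: insert a dummy device in the master track and mirror
    -- its first parameter onto groove slider 1 (path/name are assumptions).
    local song = renoise.song()
    local master = song.tracks[song.sequencer_track_count + 1]

    local device = master:insert_device_at(
      "Audio/Effects/Native/*Instr. Automation", 2)
    device.display_name = "Groove Link" -- so the tool can find it again

    local p = device.parameters[1]
    p.value_observable:add_notifier(function()
      -- normalize the parameter to 0..1 and copy it to groove slider 1;
      -- groove_amounts is read as a copy, so modify and assign it back
      local normalized = (p.value - p.value_min) / (p.value_max - p.value_min)
      local grooves = song.transport.groove_amounts
      grooves[1] = normalized
      song.transport.groove_amounts = grooves
    end)
    song.transport.groove_enabled = true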

I don’t guarantee that it works with everything, or that such automation is also applied when rendering the song, but one can at least try, as it opens up a new set of perspectives.

It must be noted that such things will always be completely unreliable timing-wise. So this is not suitable for realtime automation; it can only be used when it doesn’t matter whether the automation gets applied a bit earlier or later.

Lua scripts are running in the Renoise UI. Automation happens in realtime (the player’s/soundcard’s thread). How long it takes to get from the realtime thread to the UI thread and back (which is what’s happening here) depends on a lot of things, and even on the very same setup this time will vary. In other words: songs with such automation will always sound different every time you play them back.

This is really cool. I love how you use a dummy device to make the pattern playback observable.

I agree with what taktik says about timing. But I think this tool raises some questions about the Renoise Lua API.

Why doesn’t the Renoise Lua API support observing the pattern playback at all?

It is possible to do exactly that already with any one of these three hacks:

  1. Using a dummy device that is automated (as with this tool).
  2. Monitoring transport.playback_pos through the idle notifier and parsing the pattern track lines as we go along (see the sketch after this list).
  3. Route midi from renoise back to renoise and add midi bindings to the lua actions in a tool.
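
For hack 2, a minimal sketch could look like the following (the line-change callback is a placeholder; note that the idle notifier only fires a handful of times per second, so fast songs will skip lines):

    -- Sketch of hack 2: poll the play position from the idle notifier and
    -- fire a callback whenever playback has moved to a new pattern line.
    local last_pos = nil

    local function on_line_changed(pos)
      -- placeholder: parse the pattern track lines at this position
      print(("sequence %d, line %d"):format(pos.sequence, pos.line))
    end

    renoise.tool().app_idle_observable:add_notifier(function()
      local transport = renoise.song().transport
      if not transport.playing then return end
      local pos = transport.playback_pos
      if last_pos == nil or pos.line ~= last_pos.line
          or pos.sequence ~= last_pos.sequence then
        last_pos = pos
        on_line_changed(pos)
      end
    end)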

As taktik pointed out: “Lua scripts are running in the Renoise UI. Automation happens in realtime (the player’s/soundcard’s thread).” So all these hacks operate within the UI thread and do not allow us to do realtime automation. What do the hacks allow for, then? They allow for event-based actions. You could think of them as a messaging service that can be subject to latency (such as MIDI).

All of these methods are hacks and add extra CPU cycles and possibly extra latency. If there were more observables that fired during playback, it would both ease CPU processing and ease human production (coding) of playback-related tools. I would prefer any native solution rather than using hacks, so pretty plz consider.

The only proper solution would be realtime Lua (like LuaJIT). Currently you get a certain number of chances per second to execute something, depending on what your GPU and CPU can handle. If the GUI frequency is set to 60 Hz, execution can at best be precise to one 60th of a second; if it is 30 Hz, then only to one 30th of a second. Some people even lower the GUI update rate to 15 or 10 fps to save CPU time for the audio thread. These settings differ on each computer Renoise runs on, so the outcome of having such a control also differs (it is not reliable).
It is only an advantage if the delays are hardly noticeable or don’t matter, and you are the only one involved in the project and do the rendering yourself.

LuaJIT looks nice. Looking forward to seeing it in action with Renoise.

Do we think realtime Lua functionality will make it into the Renoise API at any stage?

Well, I don’t know; probably not into the existing API, as it runs in the GUI thread.
That is what I know for sure right now. The rest is still a mystery for the team as well.

Here is another earlier example of making such a device: http://www.youtube.com/watch?v=brBhjArLsEo

This seems to work fine without JIT, but it’s not manipulating the sound in any way.

Cool.

Is it possible to script automation of different DSPs’ active/inactive states? And have it mapped/automated with MIDI even though the DSP isn’t selected? :panic:
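
For what it’s worth, the API does expose a device’s active state as a scriptable property (device.is_active), and MIDI mappings registered by a tool work no matter which DSP is selected. A minimal sketch, where the mapping name and track/device indices are made-up placeholders:

    -- Hedged sketch: toggle a device's active/bypassed state from a
    -- tool-provided MIDI mapping (name and indices are placeholders).
    renoise.tool():add_midi_mapping {
      name = "Tools:Example:Toggle Track 1 Device 2",
      invoke = function(message)
        if message:is_trigger() then
          local device = renoise.song().tracks[1].devices[2]
          device.is_active = not device.is_active
        end
      end
    }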

Scripts run in the GUI thread, so it is no surprise that it works well and apparently in time. With sound manipulation the situation is much worse; I can assure you of this because I tried.

What method do you use to update the background blend value of the tracks?
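
For reference, the track background blend is exposed in the API as track.color_blend (a 0..100 percentage), assuming a Renoise version whose API includes it:

    -- Sketch: set track 1's background blend (assumes the API exposes
    -- track.color_blend as a 0..100 percentage).
    renoise.song().tracks[1].color_blend = 50 -- 0 = plain, 100 = full color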

EDIT1: Awesome idea and execution!

EDIT2: And how do you get the RMS value of the track output in Lua? I can think of adding a Signal Follower and reading from it via Lua, but that seems very hack-a-delic.
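
One hack-a-delic way to make that level readable from a script (an assumption on my part, not necessarily what this tool does): route a *Signal Follower to a parameter of a dummy device, for example a Gainer’s gain on an otherwise unused track, then poll that parameter from the idle notifier:

    -- Sketch: read the level that a *Signal Follower writes into a dummy
    -- Gainer's first parameter. The follower-to-parameter routing is set
    -- up by hand in the GUI; track/device indices are placeholders.
    local gain = renoise.song().tracks[2].devices[2].parameters[1]

    renoise.tool().app_idle_observable:add_notifier(function()
      -- the follower continuously moves this parameter; read it back
      print(("follower level: %.3f"):format(gain.value))
    end)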