I’m working on a simple tool right now. The basic idea is that it can be assigned to a track, and whenever that track plays a note, the note value and the effect-column value are sent to an external server where I do other stuff. My impression is that there’s a way to set a callback for a track’s notes that achieves this, but I’m having a little trouble figuring out how to actually do it.
Right now I’m trying something like this:
local function monitor(track_index)
  local song = renoise.song()
  -- lines_in_track yields (pos, line) pairs; each line holds several
  -- note columns, not a single column as the old loop variable suggested
  for pos, line in song.pattern_iterator:lines_in_track(track_index) do
    -- The idea is to add a notifier to every note in the selected track so
    -- that when a note plays the on_note callback is called with that note
    -- as an arg. Note, though, that iterating the pattern like this only
    -- reads the stored data once; it does not register any callback.
    for _, note_column in ipairs(line.note_columns) do
      if not note_column.is_empty then
        do_something_with_column(note_column) -- ???
      end
    end
  end
end
I’m also not 100% sure of the best way to get an integer channel number in the range 0…8 and a 32/64-bit float from a note, and then (this is more me not being great at Lua) convert them to their byte representation.
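Since the receiving end mentioned later in this thread is a Rust server, here is a minimal sketch of one possible wire format: a single channel byte (0…8) followed by a little-endian 32-bit float. The 5-byte layout, the function names, and the choice of `f32` are all assumptions for illustration, not anything Renoise prescribes; whatever you pack on the Lua side just has to match this byte order.

```rust
// Hypothetical 5-byte message: 1 channel byte (0-8) + little-endian f32.
fn encode(channel: u8, value: f32) -> [u8; 5] {
    let mut buf = [0u8; 5];
    buf[0] = channel;
    // to_le_bytes gives the float's little-endian byte representation
    buf[1..5].copy_from_slice(&value.to_le_bytes());
    buf
}

fn decode(buf: &[u8; 5]) -> (u8, f32) {
    (buf[0], f32::from_le_bytes(buf[1..5].try_into().unwrap()))
}
```

On the Lua side, builds of Renoise that ship Lua 5.4 can produce the same bytes with `string.pack("<Bf", channel, value)`; older Lua 5.1-based builds have no `string.pack` and would need a manual fallback.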
Would anyone be willing to take a look at my script? It would be deeply appreciated
I don’t have an answer to your question about what might be wrong. I am curious, though, how this is supposed to work in practice.
Is the ES9 track supposed to also trigger a Renoise instrument, or does it exist only to get data to a remote Rust process?
In cases where I want to have Renoise interact with an external process, and that process does not already accept MIDI directly, I use a proxy (my own) that will listen for MIDI and in turn convert it to some other format (e.g. OSC) and send it to the server process.
In Renoise I set the track to send MIDI to the proxy. The advantage here for me is that I can then write code using a more robust language and libraries, rather than trying to get Renoise to do the data conversion and network transmission. (I’ve done this using Ruby and JRuby, where I find it easier to do such things than in Renoise Lua.)
Also, the Renoise notifier might be imprecise, so the timing will be off when the data are sent.
The thought was: MIDI is low resolution, my output device (ES9) is capable of high resolution digital-to-analog conversion. So I’ll just send data derived from a note event to the Rust server, which will in turn fill the device’s output buffer and those values will be output to my synthesizers as voltage.
In general, when interfacing with modular synths that can send and receive continuous control messages via voltage, 7-bit MIDI values are far from ideal IMO.
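To make the resolution point concrete, here is a sketch (in Rust, since that’s the server language mentioned above) of the common 1 V/octave convention: a 7-bit note number only ever lands on discrete 1/12 V steps, while a float with a fractional detune term can sit anywhere in between. Treating MIDI note 60 (C4) as 0 V is an assumption here, one common convention rather than anything the ES-9 requires.

```rust
// Map a MIDI note number (0-127) to a 1 V/octave control voltage,
// taking MIDI note 60 (C4) as 0 V. With only the 7-bit note number
// the output is quantized to 1/12 V steps; a fractional detune term
// (in semitones) reaches the in-between values MIDI alone cannot.
fn note_to_volts(note: u8, detune_semitones: f64) -> f64 {
    (f64::from(note) - 60.0 + detune_semitones) / 12.0
}
```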
Regarding your last point: is there something preferable to notifiers, then? What’s most accurate?
Did I understand right that you want to make a CV stream that reflects the note data in a track? What’s the difference between analyzing the note data and writing audio, vs. using MIDI directly? And do I understand correctly that you’re using a sound card / MIDI-to-CV device to drive modular synths with this?
My idea for working with this would be to try to generate an audio stream in Renoise tracks directly, with key and velocity trackers and gainers modulating a DC Offset device that runs through an empty autoseek sample. Then you have a constant audio stream, you can modulate it with key/velocity trackers in a track with note data, and you can route the track output directly to the CV ports of the ES-9 device. You can even use Formula devices in between to create effects like portamento/glide/glissando, vibrato, detuning, pitch bend, or nonstandard scales.

For a simple CV modulator, you could just use an empty autoseek sample track and modulate a DC Offset device in it. Don’t forget to route the control audio tracks to the physical DC-preserving ports, and to mute them from your mix, because DC clicks are rough on ears and speakers…
I’d be happy to help you with the formulas if you need it and if they can be shared. The Formula device will act differently than a Renoise tool - it is executed directly in the system, together with the key and velocity trackers. The tool would have to rely on non-realtime notifier callbacks, and you’d probably just mess up your timing with it.
P.S. Renoise maybe sucks a little in this regard: the DC Offset device will mute the track and output nothing if there is no input for a while. So with an empty autoseek sample you have the modulation while the song is running, but what if it is stopped and you want to tweak it? I have no better idea than to trigger a looped empty sample into the modulation tracks now and then, so the DC devices keep spitting out data. This could be a good feature request: being able to put devices like the DC Offset device into a constant-operation mode where they will not suspend audio. Let’s suggest that feature, and others that help with audio CV data in Renoise, in case you want to use it that way…
The note event is low-resolution to begin with. No matter what you end up doing, that’s what you are starting with. I wasn’t suggesting you send MIDI to the ES9. I was suggesting that you use an intermediary process (i.e. a proxy server), external to Renoise, where you have (ideally) better options for what software to use. That intermediary process then sends data to the ES9.
This avoids the issue with Renoise timing for notifiers.
The effect column is 32-bit though, right? I’m getting back into Renoise after a long hiatus, so there are some things I’m unfamiliar with, but I figured I could map the 0000–FFFF range of the effect column onto -1…1.
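Whatever the internal width turns out to be, the linear mapping itself is simple. A sketch in Rust, assuming the value you read really is a 16-bit 0000…FFFF quantity as described above:

```rust
// Map a 16-bit effect value (0x0000..=0xFFFF) linearly onto -1.0..=1.0.
// 32767.5 is half of 65535, so the endpoints land exactly on -1 and +1.
fn effect_to_bipolar(raw: u16) -> f64 {
    (f64::from(raw) / 32767.5) - 1.0
}
```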
Is there not a low/no latency way to trigger a function when a note plays using pattern iterators? I guess I assumed there would be.
Currently there is no low latency way to trigger functions in scripts.
You might want to look into using VCVRack or the Cardinal VST. You could then use plugin automation, notes, and velocity to get high-resolution output to the ES-9.
Here is a thread on the Rack forum discussing this, but you can probably find more.