Leverage the EEL2/WDL library

Would it make sense to try and leverage the WDL library (specifically the EEL2/JSFX part) to incorporate the “missing” audio processing side of the Renoise API?

Maybe this doesn’t interface well with the Lua side of things, but I was thinking of the EEL2/JSFX environment sitting as a layer on top of the Renoise audio engine, with hooks into the Renoise API.

Or does it just make sense to keep it all in Lua(jit)?

Interested to hear thoughts on this.

Certainly, JSFX effects are written as plain-text scripts, which makes them open source by nature and helps with the long-term preservation of plug-ins.
VSTs and other binary formats are not well suited to long-term preservation, as they are fragile for various reasons.

It would be useful if the JSFX effects used in REAPER could be made available as-is.


You could try using GitHub - JoepVanlier/ysfx: Hosting library for JSFX, a VST3 plugin that loads JSFX scripts.


A similar older project using LuaJIT for Audio DSPs:


Unfortunately, ysfx is VST3 and does not appear to work with Renoise v3.4.4.
EDIT: This was a mistake; it does work. I corrected it two posts down.

Works fine here. What’s the problem?

Sorry, I reloaded it and it is now usable.
This is useful!

You seem to be using an older version of ysfx.

Try using this fork here:

It was recently updated with a bunch of fixes.

@taktik I am using the Arch Linux ysfx-git package (AUR (en) - ysfx-git), so it should be the latest HEAD version.
I was able to use it by reloading the plugin cache.
Thanks kindly.

Ah, but that one is the fork, which is newer.
I have asked the maintainer of the package to switch to that one.

EDIT:
A kind maintainer has created a package for the fork:
https://aur.archlinux.org/packages/ysfx-saike-mod-git

The effects that were giving me xruns earlier also now seem stable and usable.
Thanks for letting me know.

Thanks for the input here - very interesting stuff. I had seen ysfx before.

I was leaning more towards its use in developing Renoise tools - for example, integration with the audio engine/player. Specifically (in my case) I’d like to be able to tap into Renoise’s mixer and routing to create something like a multitrack recording tool, but I suspect something like this needs to be closer to the metal in the API, not in a VST/AU. This kind of tool would touch many areas of the application.

I guess what I was trying to hint at is: would leveraging WDL help plug a gap in the Renoise codebase and help bring about the audio processing part of the API?

So, I think I’m starting to answer my own questions around this… unless there is something I’m missing?

It would not help. If we were to do this, we’d use our own API and codebase for it.

You can already alter the routing in Renoise tools:

https://renoise.github.io/xrnx/API/renoise/renoise.Track.html?highlight=output_routing#output_routing

https://renoise.github.io/xrnx/API/renoise/renoise.SampleDeviceChain.html?highlight=routing#available_output_routings
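
For example, here is a minimal tool sketch using those two properties. This is just an illustration; the actual routing names depend entirely on your song and audio device setup, and assigning index 1 below is arbitrary:

```lua
-- Minimal sketch: inspect and change a track's output routing from a tool.
-- Assumes this runs inside a Renoise tool, where renoise.song() is available.
local song = renoise.song()

for index, track in ipairs(song.tracks) do
  print(("Track %d (%s) routes to: %s"):format(
    index, track.name, track.output_routing))

  -- List every routing target Renoise exposes for this track: master,
  -- group tracks, and hardware output buses, depending on your setup.
  for _, routing in ipairs(track.available_output_routings) do
    print("  available: " .. routing)
  end
end

-- Re-route the first track. The assigned string must exactly match
-- one of the entries in available_output_routings.
local track = song.tracks[1]
track.output_routing = track.available_output_routings[1]
```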

You can’t change the DSP stream itself, only the routing. I guess that’s what you’re after. The use case for such tools is pretty limited. Writing new DSP FX is IMHO far more likely and usable, which is exactly what the existing tools posted above allow.


I guess my idea would be that you could route arbitrary tracks to said ysfx plugin and access the inputs separately in the code. But currently I think the routing possibilities in Renoise aren’t granular enough for what I’m intending.

In the future - if a DSP FX API comes into existence - it could be applied to the Master channel, and you’d be able to iterate through all the tracks routing into it and handle the recording buffers inside of that…
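
Purely to illustrate that idea, here is a sketch of what such a callback might look like. To be clear: no such API exists in Renoise today, and every name below is invented.

```lua
-- HYPOTHETICAL: none of this exists in the Renoise API today.
-- Imagined per-block audio callback on a master-channel tool device,
-- receiving one buffer pair per track routed into the master.
local recorder = {}

function recorder.process(inputs, num_frames)
  for track_index, buffers in ipairs(inputs) do
    -- buffers.left / buffers.right: imagined arrays of num_frames samples
    recorder.append_to_wav(track_index, buffers, num_frames)
  end
end

function recorder.append_to_wav(track_index, buffers, num_frames)
  -- Imagined: append the block to a per-track WAV file on disk.
end
```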

I am not sure I follow: do you want to do multi-channel recording?
If so, it sounds like you could separate the outputs of each track and route them as multichannel audio into REAPER, etc.

Mmn thanks @tkna!

OK so my actual use case is that I’m tracking a bunch of hardware, and I want to record it all into separate tracks in one pass.

I was doing this in Renoise using an instance of the Bidule plugin on each track, but the admin of setting the filenames and managing the instances etc. was quite cumbersome. I also have a Digitakt with Overbridge, which doesn’t use the Line Input device but instead uses a multichannel plugin to separate the outputs to individual tracks.

Now I’ve started using REAPER instead for this part of the process because, frankly, it handles it all well and it’s super flexible. So what I end up doing is recording the pass and chopping up the results to import into Renoise. Oftentimes I will just stay in REAPER to work on the track. But… it’s a different process. I miss the production flow in Renoise, and ideally I would be able to handle this all in one place.

I need to explore the realtime recording in Renoise a bit more thoroughly, along with bouncing individual tracks to files. To wrap up: in the past, I would occasionally use some hardware with Renoise and just record passages here and there (remember, the Sample Recorder only records one input at a time!). But now that the hardware has become more of its own entity, managing it in Renoise as part of a hybrid setup is somehow more onerous than I remember it being.

Hmmm, I still don’t quite understand, but it is possible to send MIDI from Renoise to REAPER, then use a plug-in in REAPER for multi-channel output and record it multi-channel onto a REAPER track, right?
Personally, I feel it would be reasonable to give up on processing everything in one place: where REAPER, third-party plug-ins, or Tools can substitute, let them handle as much as possible, and work modularly by connecting each of them.

This is not quite clear to me, but if, for example, there is a need to record the multi-channel output of a VCV Rack running inside Renoise with REAPER, etc., does that mean developing some kind of intermediate plug-in would solve it?
How much easier would that be than recording directly in REAPER?

What other possibilities are there?

Exactly.

Personally, I feel it would be reasonable to give up on processing everything in one place: where REAPER, third-party plug-ins, or Tools can substitute, let them handle as much as possible, and work modularly by connecting each of them.

Yes, on the whole I agree with this sentiment, but in practice I personally haven’t found a setup that works in a way I can stick with.

Looks like you’re using Linux with an audio routing layer like JACK(?), which has the benefit of sample-accurate MIDI communication, am I right? I explored the potential of this on macOS, or OS X as it was called then, and still found it more trouble than it’s worth. Maybe that’s different now. But IAC MIDI is not the silver bullet, I find, as it’s not treated as a first-class citizen by the OS. It suffers from jitter - it’s OK, not great. (It was much better when ReWire existed, but that’s been canned!)

I feel like I’m going off topic here, but I will have a think on your suggestions. I think just piping audio from Renoise to REAPER is something to explore further.

Thanks!

Yes, I am using JACK on Linux to route audio and MIDI.

Are you unable to connect to REAPER via ReWire?
https://wiki.cockos.com/wiki/index.php/ReWire

The kind author has made it available in all of the following contexts in Renoise:

  • Track Device
  • Sampler DSP
  • Plugins

It can coexist with the previous ysfx version.

That’s great! He makes some fantastic plugins - really opened my eyes to what JSFX are capable of. Dusk Verb, with its animated UI :star_struck: