How Do I Use External Effects When Rendering?

Greetings,

My setup is this:

  • Renoise 2.7.2 for Windows
  • Focusrite Liquid Saffire56
  • Blackstar guitar amp

I want to render a part that spans multiple tracks and uses external effects. In this case, Renoise sends a signal out the Saffire’s AnlgOut3 to the guitar amp’s input. Then the amp’s effects are sent back to the Saffire’s AnlgIn3.

Here is a screen capture of Renoise playing and passing audio out of and back into the Saffire56. The sound sent out of Renoise’s Master track (“DAW 1 & 2” in the picture) sounds as expected.


When I render, however, the audio has clearly not passed through the circuit shown in the screen capture. It sounds as if only the “M-Theory Out” track has been rendered.

I did some Google and forum searches, but there didn’t seem to be much information about this, or perhaps I did not query correctly.

Any ideas?

Thanks in advance,
Aaron

same poster as above, new preferred account.

I’ve rearranged some things, but the problem is the same. I’ve posted a video of the problem on Picasa.

Please help me!

Thank you,
Aaron

Maybe you should follow the “from the Left to the Right” signal routing rule in Renoise.
(Renoise always processes signals from left to right internally.)
So I think the routing from [M-Theory SND1] to [M-Theory in 1 & 2] is not good in this case (because it goes from right to left).

So maybe you should move the contents of [M-Theory in 1 & 2] to new send tracks placed between [M-Theory SND 1] and [SND 2].
In other words, you need to rearrange the order of the send tracks to: SND1 >> in1 >> in2 >> SND2.
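The left-to-right rule can be sketched as a toy model (conceptual only, nothing like Renoise’s actual engine; the track and send names are just the ones from this thread):

```python
# Toy model of strictly left-to-right track processing: a send that
# targets a track to its LEFT delivers audio after that track has
# already been mixed this pass, so the signal misses the render pass.
def process_pass(tracks, sends):
    """tracks: track names in left-to-right order.
    sends: dict mapping source track -> target track.
    Returns the set of (source, target) sends that arrive in time."""
    processed = set()
    on_time = set()
    for name in tracks:
        processed.add(name)
        target = sends.get(name)
        if target is not None and target not in processed:
            on_time.add((name, target))  # target is still to the right: OK
    return on_time

tracks = ["M-Theory SND1", "M-Theory in 1 & 2", "SND 2"]
# Right-to-left routing: the return feeding back to SND1 misses the pass
print(process_pass(tracks, {"M-Theory in 1 & 2": "M-Theory SND1"}))  # set()
# Left-to-right routing arrives in time
print(process_pass(tracks, {"M-Theory SND1": "SND 2"}))
```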

But I am not so confident about external effects, so sorry if I’m wrong. :rolleyes:

[s]Silly question, but you are using Realtime mode rendering, right? Offline mode shuts off the sound card and obviously won’t render anything coming into it.

Can you see audio going into your effects at all?[/s] Realtime mode is a fairly recent addition, and its main purpose was to allow rendering of external devices along with the main song, rather than having to record them all first and render them as audio stems.

Clearly, if you want to use external effects, or render while you play a live instrument (rather than a MIDI-controlled one), then you need the soundcard’s outputs, as well as its inputs, to be active.

It appears from a very simple test (just entering a note and trying to render in Realtime) that this does not happen! I.e., I don’t hear anything from the computer when rendering in Realtime. :(

realtime render will only work for line input devices and will ignore routings out of renoise via send tracks or soundcard output channels. correct me if i’m wrong.

if you want to render a whole song (or individual stems) that has various routings in and out of renoise, there aren’t a whole lot of options on the renoise side (AFAIK).

i have a motu ultralite soundcard that allows me to return a “main out” mix (or another specified channel) directly into the sample recorder. so when i need to bounce a song with external fx routings, i simply record this stream as i tweak in realtime. i guess this would work on individual stems too, but i usually just record the whole song in one take.

not sure if your soundcard has such options though…

Ahh, kazakore and oootini are right. My reply above was wrong, sorry for my misunderstanding.
I just tested such routings here, and I confirmed that rendering does not include them. I just learned that too. ;)

In this case oootini’s solution is appropriate indeed.

i presume it’ll be rectified eventually, but the audio routing/recording (or lack thereof) in renoise is a major annoyance for me. i even tried to move to ableton! (didn’t last).

Thanks for the replies. This confirms what i’ve found: it seems that realtime rendering is completely disconnected from the audio interface, unlike when composing.

I agree about the limitations of sample recording within renoise. Don’t get me wrong… i love the renoise interface… but sample recording could use the same sort of love the pattern editor and sequence pane have seen.

@kazakore: version 1.9, i believe, was when realtime became an option

@oootini: it sounds like you’ve seen, then, how recording off any in/out of your audio interface, each to its own audio track, would be very useful. I’m going to get an iLok and evaluate Pro Tools. I’m afraid i may prefer it in some regards to renoise, but i think they’ll play well together. Renoise is renoise. It can’t be replaced, and it would certainly be a complement to Pro Tools.

Come to think of it, this would be a rather simple fix. A check box in preferences: Realtime render uses audio interface routing.

Seems codewise to be easily implemented.

not for me to say unfortunately! but yeah, something like that and recorders for each track would be great.

I still don’t see why Line In Devices can’t have Arm/Record options…

I see where it gets complicated… you basically need to know which output pair is the master. For most purposes, it would be safe to assume the song’s master out setting will work. My mixer panel has the Liquid Saffire’s DAW 1&2 as the out device. At render time, this gets redirected to a .wav file. There is the potential to change the sound if the audio interface’s mixes are set incorrectly.

The asio capabilities in renoise could be expanded. There are a few things i’d like to see:

  • support for up to 192 kHz sampling rate
  • bit-depth selection (24, 32, float, etc…) … if it’s appropriate for asio input, it seems it would be
  • a robust routing interface, accessed right next to the mixer tab in renoise, with pattern-synchronized toggle buttons for disk-writing the audio of each input and output

If this happened, i don’t know that i’d ever need anything else in a daw.

::bump::

is anyone else wanting to use the audio interface while rendering?

taktik, sorry if i assumed this is easier than it really is

I think this is quite hard to achieve, because the rendered audio, which has dynamic delays (delays that vary depending on how quickly your CPUs can cope with whatever has to be processed), would somehow have to be recorded back through the in-device. Perhaps an estimate of the in-device’s delay could be made if you could explicitly tell Renoise that the output is coming from a specific track (not even mentioning what would happen if you rerouted a track’s output back into itself), but otherwise I don’t see any other way to get such audio rendered along properly.
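The “estimate the delay” idea can be illustrated with a toy sketch (a hypothetical helper, nothing like Renoise’s internals): if the loop’s latency were known and constant, the recorded external return could simply be shifted back by that many samples before being mixed into the render.

```python
# Toy illustration of fixed-latency compensation (hypothetical helper,
# not a Renoise API): shift the recorded external return back by a known
# number of samples so it lines up with the internally rendered tracks.
def align_recorded(recorded, latency_samples):
    """Drop the first `latency_samples` samples and pad the tail with
    silence, so the late-arriving external signal lines up again."""
    return recorded[latency_samples:] + [0.0] * latency_samples

dry = [0.0, 1.0, 0.5, 0.25, 0.0, 0.0]
wet = [0.0, 0.0] + dry[:-2]           # external return arrives 2 samples late
print(align_recorded(wet, 2) == dry)  # True
```

The catch, as noted above, is that a dynamic delay breaks the constant-offset assumption, which is exactly why this is hard to do automatically.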

(let’s assume we all know how not to set up an infinite feedback loop)

i compose and render at 96 kHz, the maximum that renoise will allow… i don’t think that varying latency is a big issue. i normally have 20 or 30 dsps across 10 tracks and a handful of virtual instruments, and my cpu doesn’t choke during normal renoise playback.

i suppose i assumed the real-time render works like renoise playing live, except that it is essentially disk-writing the song as it would play in renoise. by render time, you have already compensated for the latency in the various places where one can compensate for it.

also, yes, there is a slight but noticeable (if you’re listening for it) delay going out to the hardware and back in. this can be minimized by reducing the asio buffer size and also by adjusting track delay compensation (the track going out to your effect could be set a few ms earlier).
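As a rough back-of-envelope for that delay, assuming a simple model of one output buffer plus one input buffer of ASIO latency and a fixed converter delay (real interfaces report their exact figures):

```python
# Rough round-trip latency model for an external FX loop: one output
# buffer + one input buffer, plus a fixed A/D-D/A converter delay.
# (Assumed model; actual interfaces report their real latencies.)
def round_trip_ms(buffer_size, sample_rate, converter_ms=1.0):
    buffer_ms = buffer_size / sample_rate * 1000.0
    return 2 * buffer_ms + converter_ms

# e.g. a 256-sample buffer at 96 kHz:
print(round(round_trip_ms(256, 96000), 2))  # 6.33
```

This is the few-ms offset that track delay compensation would absorb; halving the buffer size roughly halves the buffer-dependent part of the delay.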