Rendering realtime vs offline

When mixing this track, I had a long sustained/feedbacked VSTi sound that sounded really good to me in Renoise. When I rendered it offline with Arguru’s sinc, it turned into a mosquito:

Go to 2:30

http://soundcloud.com/dangayle/guilty-112-bpm#t=2:30

Re-rendering it in realtime, that part sounds much thicker:

http://soundcloud.com/dangayle/guilty-112-bpm-v2#t=2:30

Can someone explain to me what happened and why it does that?

What VSTi is it?

The interpolation method (i.e. Cubic vs. Sinc) only affects sample-based instruments. It has no direct influence on the output of VST synths, since their output is rendered directly by the synth itself, and not from any sample data that exists within Renoise.

Therefore, if Arguru’s Sinc is affecting your sound negatively, then I have to assume that you’ve sampled the synth’s output into Renoise using “render selection to sample”, or perhaps using the plugin instrument grabber?

If you did not sample the synth’s output, then it could be another sample which is being negatively affected by the sinc interpolation, which is then somehow interacting/interfering with the synth sound in this bizarre manner. It’s difficult to say for sure without having your song’s .XRNS to examine more closely.

The interpolation mode you hear during real-time playback is Cubic. Therefore, if you want to ensure that your render sounds exactly the same as the real-time version, you should always render using Cubic interpolation.
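To make this concrete, here is a minimal sketch (in Python; not Renoise's actual code) of where the interpolation setting lives. A 4-point cubic Hermite interpolator like this is a common choice for a tracker's "Cubic" mode, and it only ever runs inside the sample-playback path; a VSTi's buffers never pass through it:

```python
def hermite(y0, y1, y2, y3, frac):
    """4-point, 3rd-order Hermite interpolation between y1 and y2."""
    c1 = 0.5 * (y2 - y0)
    c2 = y0 - 2.5 * y1 + 2.0 * y2 - 0.5 * y3
    c3 = 0.5 * (y3 - y0) + 1.5 * (y1 - y2)
    return ((c3 * frac + c2) * frac + c1) * frac + y1

def resample_cubic(sample, step, n_out):
    """Play back `sample` at a fractional rate `step` (i.e. repitched)."""
    out = []
    pos = 1.0                          # keep the 4-point window in bounds
    while len(out) < n_out and int(pos) + 2 < len(sample):
        i, frac = int(pos), pos - int(pos)
        out.append(hermite(sample[i - 1], sample[i],
                           sample[i + 1], sample[i + 2], frac))
        pos += step
    return out
```

For example, `resample_cubic(wave, 1.5, 256)` plays a sample back a fifth up; swapping `hermite()` for a sinc kernel is essentially all the render option changes, which is why it cannot directly touch a synth's output.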

Sinc interpolation is technically “perfect” from a mathematical point of view, but it also relies on a mathematically perfect band-limited input signal in order to function correctly. Using sinc by itself does not automatically guarantee that the end result will sound perfect.

Minor distortions or other imperfections in the sampled material — which may not even be audible under normal conditions — can sometimes become problematic with sinc interpolation, and manifest themselves much more dramatically after the interpolation takes place. This is especially true on material that has a lot of high frequency content.
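A hedged sketch of that last point, assuming Python with NumPy/SciPy available (and a generic FFT/sinc resampler, not Renoise's exact one): resampling a non-band-limited signal such as a hard step produces Gibbs ringing that spreads far from the discontinuity, whereas a cubic interpolant's error stays local to it.

```python
import numpy as np
from scipy.interpolate import interp1d
from scipy.signal import resample

x = np.zeros(64)
x[32:] = 1.0                      # a hard step: not band-limited

up = 8
t_new = np.linspace(0, 63, 64 * up)
sinc_y = resample(x, 64 * up)                       # FFT-based (sinc) resampling
cubic_y = interp1d(np.arange(64), x, kind="cubic")(t_new)

# Inspect a region well away from the step (and from the FFT wrap point):
# sinc ringing is still clearly measurable there, the cubic error is not.
region = slice(10 * up, 22 * up)
print("sinc  ringing:", np.abs(sinc_y[region]).max())
print("cubic ringing:", np.abs(cubic_y[region]).max())
```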

We have a small note about it in our user manual:
http://tutorials.renoise.com/wiki/Render_Song_to_Audio_File#Sinc_Interpolation

Generally speaking, I would personally recommend using Cubic interpolation to avoid any surprises. If what you hear during real-time playback sounds good to you, then just go with that :]


Not true in my case though, read this.

I was only talking about the sample interpolation method, and how to ensure you get the same sound that you hear during real-time playback.

The problem you’re experiencing with Spire is indeed quite strange, but it has nothing to do with the interpolation mode used when rendering. Better to discuss it further in your own thread.

I’m using the free Kairatune (Audio Unit) on that track, so it wasn’t sampled. I do have other synths running as well, so I’ll poke around to see what I can find.

Thanks for the other advice about rendering with Cubic. What is the difference between rendering realtime and offline+Cubic?

Disregarding the cubic interpolation question, if that has no influence, there is something else that dBlue did not explain about realtime rendering:

During realtime rendering, VST output buffers are processed completely, whereas with offline rendering Renoise tries to process the whole song as fast as possible.
For sample-based instruments this works fine, because Renoise knows the limits of all its samples and its own FX tails. For plugins that apply dynamic buffering internally, however, this can end up in a mess, with Renoise missing parts of the plugin's output during the render.
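To illustrate the mechanism (a deliberately simplified toy model, not Renoise internals): imagine a plugin that produces its audio on a background worker thread. A realtime host is paced by the soundcard, so the worker keeps up; an offline render pulls buffers as fast as the CPU allows, can outrun the worker, and captures silence where audio should be.

```python
import threading
import time

class AsyncToyPlugin:
    """Toy plugin whose audio is produced by a background worker thread."""
    def __init__(self):
        self.ready = []                       # finished 64-sample blocks
        self.lock = threading.Lock()
        threading.Thread(target=self._worker, daemon=True).start()

    def _worker(self):
        while True:
            time.sleep(0.002)                 # worker needs ~2 ms per block
            with self.lock:
                self.ready.append([1.0] * 64)

    def process(self):
        with self.lock:                       # no block ready -> silence
            return self.ready.pop(0) if self.ready else [0.0] * 64

plugin = AsyncToyPlugin()
time.sleep(0.01)                              # give the worker a head start

offline = [plugin.process() for _ in range(16)]   # pull as fast as possible
realtime = []
for _ in range(16):
    time.sleep(0.003)                         # paced like a real audio stream
    realtime.append(plugin.process())

print("offline dropouts :", sum(b[0] == 0.0 for b in offline))
print("realtime dropouts:", sum(b[0] == 0.0 for b in realtime))
```

The offline loop here reports dropouts while the paced loop does not, which is the same kind of artifact described above.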

So a general rule of thumb is to simply use realtime rendering whenever you are using external plugins or MIDI, to prevent buffering artifacts.