This would be absolutely amazing and stellar.
Is this even feasible?
Renoise is such a powerful sampling tool that it’s a shame to lose its sampling functions when one uses it in ReWire mode as a ReWire slave.
That’s the limitation of ReWire, alas. Audio streams go out and MIDI goes in.
Aww. I use a setup of Ableton Live (Master) & Renoise (Slave) at gigs and would completely flip if I could also sample from a line-input device with Renoise while it’s in Slave mode, especially with “Record to Current Track” working the way it does now.
In general, the ReWire master is the only host that controls the audio card, and this also applies to the recording devices. All slaves submit to the master.
However, imagine a Renoise which functions like this:
Started in ReWire Slave mode, Renoise has the capability of sending audio to the ReWire Master (already present in Renoise)
Started in ReWire Slave mode, Renoise has the capability of sending and receiving MIDI to/from the ReWire Master (I guess this is the ReWire MIDI-In or something? I’ve never really used it, I just use IAC. Is MIDI over ReWire snappier than IAC Bus/virtual MIDI cable-type MIDI?)
Started in ReWire Slave mode, Renoise has the capability of recording audio from an audio-input, independent of the ReWire Master. ← This is where Renoise flies the pirate flag and brings something truly amazing into the ReWire mix.
It occurs to me, however, to ask this: can a ReWire Master send some form of audio to a ReWire Slave? With that, I could just run a template with 4 inputs enabled in the ReWire Master, plug them into Renoise… and still not be able to sample, since sampling is disabled in ReWire Slave mode. And I guess there’s no such thing as Master->Slave audio anyway. I’m currently not in a state where I can figure out how to actually pass wave files between Live & Renoise in a way where, if I write something in Live, it gets loaded into Renoise. It would be massively amazing though.
No, because in ReWire mode the slave doesn’t directly process audio; your ReWire master does. Audio processing is always done in the ReWire master, no matter whether it’s (external) audio input or output. The ReWire slave sends its audio to the master in the form of digital data streams. The ReWire master then processes these digital data streams and converts them back into an audio signal. These audio signals can be further processed in the ReWire master before they go to the ReWire master’s output. Only the ReWire master is able to use audio in/out from/to your audio interface. So you have to record your external sources with the ReWire master, export the recorded audio to a file and then import it into the slave. This is the only way.
A ReWire slave can receive MIDI from and send MIDI to the ReWire master. But the ReWire slave can only send audio to the ReWire master; it can’t receive audio from the ReWire master or from any other external sound source.
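If it helps to picture the routing, here’s a very rough Python sketch of the idea. This is not the real ReWire API (that’s proprietary and licensed by Reason Studios); all class and method names here are made up purely for illustration:

```python
# Toy mental model of ReWire routing -- NOT the real ReWire API.
# All names here are invented for illustration only.

class ToySlave:
    """A ReWire-slave-like app: renders audio, talks MIDI, owns no hardware."""
    def render_block(self, num_frames):
        # The slave only *produces* a digital stream for the master.
        return [0.0] * num_frames          # silence, for the sketch

    def receive_midi(self, message):
        print("slave got MIDI:", message)  # MIDI flows in both directions

    def send_midi(self):
        return [(0x90, 60, 100)]           # e.g. a note-on sent back to the master


class ToyMaster:
    """A ReWire-master-like app: the only one talking to the sound card."""
    def __init__(self, slaves):
        self.slaves = slaves

    def audio_callback(self, hardware_input_block, num_frames):
        # Hardware input arrives *here*, in the master -- a slave never sees it.
        mix = list(hardware_input_block)
        for slave in self.slaves:
            block = slave.render_block(num_frames)       # slave -> master audio
            mix = [a + b for a, b in zip(mix, block)]
            for msg in slave.send_midi():                # slave -> master MIDI
                print("master got MIDI:", msg)
            slave.receive_midi((0xF8,))                  # master -> slave MIDI (clock tick)
        return mix                                       # only the master hands audio to the interface


master = ToyMaster([ToySlave()])
out = master.audio_callback([0.1] * 4, 4)
print(out)   # the mixed block that only the master can send to the sound card
```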
That’s not a Renoise thing. Only the developer and license holder of the ReWire protocol (Reason Studios, formerly Propellerheads) could change that, if it’s possible at all.
Why does it work with MIDI in both directions, but not with audio?
Very simple: a MIDI signal is a pure data stream that needs just a few bytes per message, consumes hardly any CPU and adds practically no latency between ReWire master and ReWire slave.
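Just to give a rough sense of scale (assuming typical values: a 3-byte note-on message versus one second of CD-quality audio):

```python
# Rough back-of-the-envelope comparison, assumed typical values.
midi_note_on = bytes([0x90, 60, 100])          # status, note, velocity = 3 bytes
print(len(midi_note_on), "bytes for one MIDI note-on")

sample_rate = 44100     # Hz
channels = 2            # stereo
bytes_per_sample = 2    # 16-bit
one_second_of_audio = sample_rate * channels * bytes_per_sample
print(one_second_of_audio, "bytes for one second of CD-quality audio")  # 176400
```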
But audio processing is much heavier. Within a DAW, all audio processing happens on digital data streams. Audio from external sources, however, is still an analog signal. Your audio interface converts the analog audio signal with its A/D converters into a digital data stream so that your DAW can work with it, because a DAW works with digital data streams. That’s why it’s called a DAW (Digital Audio Workstation).
Same with audio output: your audio interface turns the digital data that comes from the DAW back into an analog audio signal to make it audible on your speakers. That’s why an audio interface and its drivers work with buffers. The digital data stream is cached in the buffers, where the cached data is converted by D/A converters into an analog audio signal.
This kind of processing is called digital signal processing, also known as DSP.
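Here’s a tiny, purely illustrative Python sketch of the A/D → D/A idea, with an assumed 440 Hz sine standing in for the analog signal. Real converters are hardware with filtering and far more going on, but the point is that the DAW only ever sees a stream of numbers:

```python
import math

sample_rate = 44100
freq = 440.0   # an "analog" 440 Hz sine we pretend to measure

def a_d(num_samples):
    """Sample + quantize to 16-bit integers, like an A/D converter."""
    return [int(round(32767 * math.sin(2 * math.pi * freq * n / sample_rate)))
            for n in range(num_samples)]

def d_a(samples):
    """Turn the integer stream back into signal values, like a D/A converter."""
    return [s / 32767.0 for s in samples]

digital_stream = a_d(8)          # what the DAW actually sees and processes
print(digital_stream)
print(d_a(digital_stream))       # what eventually drives the speakers
```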
Digital signal processing can consume a lot of CPU in some circumstances, for example when using a bunch of CPU-heavy VSTs, because the sound processing of a VST has to be computed by the CPU and then pushed through the sound buffers of your audio card/interface.
This is also the reason why a DAW displays much higher CPU consumption than your operating system does. A DAW doesn’t only show raw CPU consumption; its meter reflects a combination of the processing done by the CPU and by the audio buffers of your audio interface. Now, if you have a sound card with weak buffers, your DAW will show a much higher load. A weak audio interface with cheap, too slow or too small buffers can quickly reach its limits, and you’ll get those famous artifacts, gaps and crackles when your DAW is playing. This is what’s commonly called a buffer overrun (or, on the playback side, an underrun).
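A quick back-of-the-envelope calculation (with assumed but typical sample rate and buffer sizes) shows why the DAW meter behaves like this: every buffer gives the CPU a fixed time budget, and the meter is basically processing time versus that budget:

```python
# Rough numbers showing why the DAW's "CPU" meter is really a real-time
# budget tied to the audio buffer, not the OS CPU percentage.

sample_rate = 44100        # Hz
buffer_size = 256          # frames per audio callback (assumed setting)

budget_ms = 1000.0 * buffer_size / sample_rate
print(f"time budget per buffer: {budget_ms:.2f} ms")   # ~5.8 ms

processing_ms = 4.0        # pretend our VST chain needs 4 ms per buffer
print(f"DAW load meter: {100.0 * processing_ms / budget_ms:.0f}%")  # ~69%

# With a smaller buffer the budget shrinks, so the same plugin chain
# shows a higher load and runs out of time (dropouts) sooner:
small_budget_ms = 1000.0 * 64 / sample_rate
print(f"64-frame budget: {small_budget_ms:.2f} ms -> "
      f"{100.0 * processing_ms / small_budget_ms:.0f}% load")
```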
How buffers work:
The digital data stream sent by the DAW to the audio interface is cached in the interface’s buffers so the digital data can be converted into an analog audio stream. This needs computing time. But if the buffers are set too small or are too slow, they will overrun: the excess data that arrives is dropped by the buffers and gets lost. This is what you then hear as gaps, crackles or artifacts in the processed sound.
It’s like taking an empty glass and filling it with water: the glass fills until it overflows, and the excess water is lost. The same happens with the excess data that reaches the audio buffers.
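In code, the glass analogy looks roughly like this toy sketch (nothing like how a real driver is written, just the overflow idea):

```python
# A fixed-size buffer that simply drops whatever doesn't fit;
# the dropped samples are your gaps and crackles.

CAPACITY = 8                     # a deliberately tiny buffer
buffer = []
dropped = []

incoming = list(range(12))       # 12 samples arrive, but only 8 fit

for sample in incoming:
    if len(buffer) < CAPACITY:
        buffer.append(sample)    # fits in the "glass"
    else:
        dropped.append(sample)   # overflows and is lost

print("buffered:", buffer)       # [0, 1, 2, 3, 4, 5, 6, 7]
print("dropped: ", dropped)      # [8, 9, 10, 11] -> audible as a glitch
```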
Now consider: if both the ReWire master and the ReWire slave could process audio input and output, this would need twice as much CPU and twice the audio buffers.