[done b4 & b5] Mute groups

We’re doing this for two reasons. First, you can play the same drum kit on multiple tracks at once without the notes cutting each other off, if you really want to. Second, to avoid overhead: otherwise we’d need to look up every other track’s playback state each time a sample with a mute group setting is triggered. In big songs with many tracks this would very likely cause far too much overhead for nothing.
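The overhead argument can be made concrete with a small sketch. The data model below is hypothetical (invented names, not Renoise's actual engine); it only illustrates why a per-track mute group cut stays cheap while a cross-track cut has to touch every track's voice list on every trigger:

```python
# Illustrative sketch of the overhead argument above; the data model is
# hypothetical and not Renoise's actual engine.

def cut_per_track(voices_by_track, track, group):
    """Per-track mute group: only the triggering track's own voice list
    is scanned, so the cost is independent of the number of tracks."""
    voices_by_track[track] = [
        v for v in voices_by_track.get(track, []) if v["group"] != group
    ]

def cut_cross_track(voices_by_track, group):
    """Cross-track mute group: every track's voice list must be scanned
    on every note trigger, so the cost grows with the song's track count."""
    for track, voices in voices_by_track.items():
        voices_by_track[track] = [v for v in voices if v["group"] != group]
```

With hundreds of tracks, the cross-track variant is run once per note trigger over all of them, which is the overhead taktik is pointing at.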

taktik, I can see the problem (well, at least to some degree), but this is absolutely needed for a proper mute group feature! The mute group’s purpose is to allow the recording of natural-sounding drums. Right now we can record them, but we cannot mix them properly, nor can we export them properly to mix in another DAW. So the recorded material is useless until we dig in and heavily edit it. But if we have to edit it anyway, we could just as well have done that from the start. So the current mute group feature doesn’t add anything here, I’m afraid.
Also, for Redux, you will eventually have to make mute groups work across multiple outputs anyway, so I think this should work in Renoise natively as well.

Could you please let us know why spreading/playing an instrument in Renoise to multiple tracks is so essential for you? I think this is the real problem here.

I think for the sake of mixing: adding separate effects, leveling, and EQ’ing for different drum kit elements.

1) For Mixing
2) For better overview (you know what C4 in track Snare is…)
3) For faster editing via Pattern Matrix

For somebody who is used to the old-school tracker workflow it can look unimportant, but for others it is standard (every drum plugin has individual outputs)…
When I figured this limitation out, I stopped making any drum kits :unsure:

And it is a problem in another scenario:
R3 still doesn’t have a proper freezing function, and rendering works only for one pattern; you can’t render all patterns of the Snare track, for example.
There is a solution (even if it is not updated for 3.0 yet): the Freeze tool. But it doesn’t work for columns, only for tracks. So with no multi-output and just one track, you still can’t render all patterns of just one drum sound.
“Render column in whole track” would solve it, though… Then you could have the drums in one track and render them separately for further mixing.

@taktik: Sure, it’s about mixing of course. As I said before, an analogue drum kit, for instance, is recorded with several mics. I don’t think you will find any commercial song from the last 40 years where this is not the case. A snare drum needs very different treatment than a kick, for instance, to fit into a mix. That said, the current implementation of mute groups might work for the specific example of a hi-hat, since you could argue that open and closed hi-hats are usually recorded with one mic, so putting those two in one track while separating the other drums might be fine for some people. In general, though, I think such a forced mixing decision is an unnecessary limitation.

@ taktik …
Currently there are two suggestions running in this thread, so it can be a little bit confusing:

  1. People would like individual outs when effects are tied to instruments, so we can spread notes across multiple tracks and different outputs… As you said earlier, this can be more difficult to implement due to the voice routing.
  2. Spreading notes across tracks just as data while keeping the instrument’s main output, with effects tied to instruments with just one output. This would not require a rewrite of the voice handling, but it would give us the advantage of using multiple tracks for note data, with notes routed to the master instrument track. But since there is no possibility to assign an output track for an instrument, spreading notes across tracks currently results in no cut-offs.
    With VSTis we can define the output, so it doesn’t matter on which track the notes are inserted, as they will all be routed to the defined output. This comes in very handy when creating drum tracks.

Not what you want, but you could:

…render a VSTi like Drumatic through the ‘render plugin to instrument’ feature (right-click the instrument list, or choose render to samples in the plugin tab of the instrument editor).

Then use Fladd’s own “Split into separate Tracks” Renoise tool, with which you can split up a track into a group, for example.

Happy mixing.

I’m not talking about VSTi, I am talking about XRNI drums. So Fladd’s tool and render-plugin-to-instrument aren’t a solution, but thanks for the suggestion.
I would love to be able to spread drum notes across several tracks, mainly for better editing. Also, if I solo one track and render/freeze it, muted notes are ignored, so no other sound gets rendered (then I would welcome the expanded rendering mentioned above, because I mainly use short patterns). It would be a workaround for the further mixing that Fladd mentioned: you render that sequence and place it on an individual track.

Different sound outputs aren’t so important for me as long as we can do basic mixing inside the instrument. So I would choose the easier way Gentleclockdriver suggests for now.
But multiple outputs should be solved, because Redux will need them.

I honestly think that 2. is not a proper solution in the long run. Every other sampler out there has multiple outputs, and that is for a good reason. Not only for drums (where it is absolutely essential!), but also for other instruments. Again, the micing analogy is a good way to exemplify this: some recording engineers might choose to record an acoustic guitar with two mics, one for the body and one for the fretboard. Those two will be mixed entirely differently into the mix (depending on all the other instruments in that mix).
So I think the mute group feature should just work across tracks, as everyone would expect, with the outputs of those tracks being separate audio streams. (The same goes for the instrument FX, but that is another discussion.)

The FX in the instrument itself are NOT supposed to be used for mixing! That is what the mixer is there for! Mixing involves making changes all the time. The internal mixing of a drum kit, for instance, changes dynamically with the rest of the mix, so diving from the mixer deep into the single instruments all the time while mixing is far from optimal and just not how it is supposed to be. The FX in the sampler should be considered part of sound design. This is an entirely different purpose from how you use FX during mixing! I think keeping those conceptually different things actually separate is crucial!

Totally agree with you.
This instrument-internal mixing is just a workaround (like my idea about comfortable rendering and post-mixing), nothing that should be preferred over multiple outputs. But I think that will take some time and won’t get into R3.0, so my suggestions (which are easier to implement, IMHO) can be handy and useful for our purposes for now.

I don’t see why it wouldn’t make it into 3.0. I thought Redux would be released simultaneously with 3.0 (and based on 3.0’s sampler), and they certainly won’t release that without multi-outs.

But even leaving multi-outs aside, the current mute group behaviour is inconsistent with how the rest of Renoise works! I CAN put notes onto individual tracks and they will be routed to those tracks, even when they are part of the same instrument. This has always been the case, and I think a new feature like mute groups (but also instrument FX) certainly should not break this standard behaviour!

Redux will certainly be released after 3.0, as its beta will start once 3.0 has been released.

But it will just be 3.0’s sampler, won’t it?

Well, maybe 2. is not the proper solution in the long run, but it can satisfy our needs for a while, e.g. building drum kits and using separate lanes etc., until proper individual routing is implemented.

I just wanted taktik to know that there are two requests: one being the voice output routing, and the other one just multiple track lanes with one output (which I think is easy to fix).

I suspect that if you allowed the instrument’s internal effect chains to be output to dedicated tracks, the problem would be solved. But in that case the notes for an effect chain should always be sent to that specific track as well during the recording process (regardless of where the pattern cursor is!) so that rendering still works.
It is more or less using a multi-out structure in the XRNI, which Redux can then benefit from as well (you then have a multi-out channel system ready).

Taktik, thanks for the hard work getting mute groups into Renoise. The mute group feature is there, but to complete it there needs to be support for routing key ranges to different tracks, so that different parts of the kit can be mixed separately.

This is how most people use a realistic multisample, multi-out virtual drum kit in a DAW:

Kick drum, snare, toms, hi-hats, ride, and crash outputs are all routed to their own tracks. Often room and overhead microphones get separate tracks too, along with other instruments that might be part of the kit. During mixing, the levels and panning of each instrument in the kit can be adjusted (and automated), and different EQ, compression, and effect settings can be applied to emphasize certain frequencies in each part of the kit and give it its own space in the mix. Not being able to do this with XRNI in Renoise is a severe limitation, and consistent multi-out support for XRNI (as per VSTi) would really make the XRNI format shine.

Well, let’s be a bit more clear about that here. There are two ways things would work correctly and properly:
A. The instrument does not know the concept of tracks, and notes, regardless of which instrument they come from, are routed to the tracks they are played on.
B. The instrument does know about the concept of tracks and hence needs to take care of the routing.

Renoise has always worked as in A., which was totally fine! Mixing was easily achievable by just separating the notes from each single drum into separate tracks. The instrument did not have to be concerned with output routing.

However, for whatever reason, the devs decided to break out of this system when they implemented the instrument-internal FX, and now again with the mute groups, apparently. This leaves us somewhere in between A. and B., unfortunately!

As I said, I think either A. or B. would work fine. A mixture of both, however, does not! It is inconsistent and will lead to more conceptual problems (see mute groups now as a perfect example!).

So, I am afraid a decision has to be made for either A. OR B.!
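To make the A/B distinction concrete, the two models can be sketched roughly like this (hypothetical names and data model, purely illustrative of the two routing concepts described above):

```python
# Hypothetical sketch of the two routing models described above.

def output_track_model_a(note):
    """Model A: the instrument is track-agnostic; a note's audio simply
    goes to whatever track the note was entered on."""
    return note["track"]

def output_track_model_b(note, instrument):
    """Model B: the instrument owns the routing; the instrument's own
    mapping (e.g. per keyzone or FX chain) decides the output track,
    regardless of where the note was entered."""
    return instrument["output_by_key"].get(
        note["key"], instrument["default_output"]
    )
```

The inconsistency complained about above is precisely a mix of the two: notes mostly follow model A, while instrument FX and mute groups behave as if model B were in place.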

Disagree here, but of course I understand your point. Doing the routing within the instrument FX allows you to bundle and properly reuse the instrument, which is not possible when the FX processing and mixing are split between the instrument and a set of tracks. I personally also see it as an advantage to do the complete mixing !and! sound design of an instrument within the instrument itself.

In Redux you will be able to route instrument FX chains to dedicated outputs of the plugin, which then can be picked up and routed within the host. In Renoise you can also do this right now, but the outputs are dedicated channels of the soundcard only.

We had actually planned to allow routing instrument FX chains to tracks in Renoise, but had problems realizing PDC and a few other things for this. I can’t promise that we’ll be able to solve those problems for Renoise 3, but I’ll definitely have a look at this and give it another try.

Thanks for your thoughts on this taktik!

As I said before, if we talk about FX that shape the sound characteristics of the instrument (e.g. strong compression or gating on a snare to achieve a certain sound), then having that in the instrument is absolutely desirable, and I am certainly in favour of having the possibility to shape instruments like this! But as I said, mixing FX are a different thing (e.g. overall compression/EQ/send reverb on the snare to make it more prominent in a certain mix: this is something that cannot be set in stone in the instrument, as it changes dynamically depending on the song the instrument is used in!).

I respect your personal preference here, but would like to remind you that this is by far not the standard. The only place for mixing (as defined above) should be the mixer (that is why you put it there in the first place, right?). Of course, if you allow the instrument channels to show up in the mixer, then that’s a different thing, but at the moment this is not possible, and I also (personally) would favour a strict separation of the two FX usages (as explained above).

This is certainly good news. I hope this will make it into Renoise as well, because otherwise we will all end up using Redux inside Renoise since it is more powerful, which I think was not the main idea here.

In general, it seems that you are arguing FOR changing Renoise’s instrument concept from what I earlier referred to as A. to B. So, coming back to actually discussing mute groups: how, in this system, could we route samples to different outputs (tracks) while at the same time having them in the same mute group?
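One conceivable answer, purely as a sketch with hypothetical names (not a claim about how Renoise would implement it): let the instrument's mapping decide each sample's output track, but keep the mute-group cut global across all output tracks, so routing and muting remain independent decisions:

```python
# Hypothetical sketch: per-key output routing combined with a
# mute-group cut that spans all output tracks.

def trigger(active_voices, key, output_by_key, mute_group):
    """Cut every active voice in the same mute group, no matter which
    output track it plays on, then start the new voice on the track
    the instrument's mapping assigns to this key."""
    for v in active_voices:
        if v["group"] == mute_group:
            v["playing"] = False
    voice = {"track": output_by_key[key], "group": mute_group, "playing": True}
    active_voices.append(voice)
    return voice
```

Under this sketch, a closed and an open hi-hat could live on different output tracks and still silence each other, at the cost of the cross-track voice scan taktik described at the start of the thread.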