Brainstorming: Xrni Future

RNI future? My head is still in the dark ages… I'm still a 16-bit sample-wav monkey. Stuff like 'velocity' is for people who don't have a heap of live instruments they can record. Hence, for us live-instrument people the arranger is the big thing, or some sort of live sample recording feature, as previously discussed.

I’m dying for some more weird things, possibly already noted elsewhere:

  • sample loop points that can be automated/controlled via pattern effects.
  • and, as mentioned before, all aspects of the instrument editor to be controlled via automation/pattern effects.

It doesn’t take much to make my day! The rest of my demands lie with the GUI…

I think MickRip’s idea about macros is good.

And maybe some sort of button that I can press that makes the RNI file sound as good as Mike Oldfield.

I'm sorry for this looooooong post. But I had too much spare time and thought I could try to explain some of the previous ideas, and a few new ones, in more detail.
I'll try to explain the concept of multi-instruments, together with something I call 'Instrument-Patterns', in both a general and a more technical way.
So don't worry if this is confusing at first read ;) It's supposed to be easy on top, and more complex and flexible if you dig into it.

Picture 1
Picture 2
Edit: Picture 3

Multi-instruments:

Now… as most trackers know (or should know), multilayering instruments that play the same notes might be just what you need to make a sound more phat, deep, special, 'pro', good, whatever :)
Most synths use two or more wave generators to form the sound. Many new ones use up to 6.
Some combine this with samples.
Ok, point taken. It would be damn useful if you could trigger multiple instruments at once in Renoise. Some people would say that you lose control this way. You CAN do multilayering by just copy/pasting a track and changing the instrument in one of the tracks. Others will say that copy/pasting is not innovative or inspiring. I would say there is a time for both. It would be a damn hassle if you had to track each wave generator (osc) for each synth you inserted. Kills your inspiration. However, after/during the composing you might wanna go into more detail.
So why not have both things?

In the first picture you can see that you are in the advanced tab in Instrument Editor.
On the top left you can enable Multi-Instrument. Now if you take a look in the instrument list, you will see that there is a + sign when Multi-Instrument is enabled. Open it and you will get a - - - empty - - - slot.
Here you can insert another instrument (vsti or sample), and the - - - empty - - - slot will move one step down so you can add another one, etc.
This is the most basic use of multi-instruments. Hitting a note will trigger all the instruments inside the multi-instrument equally. Should be very useful in its simple way of use.

Later I’ll tell how ‘instrument-patterns’ and ‘scaling’ can change the multilayering concept into an even more dynamic and flexible synth/sampler.

Instrument level fx:

On the right side of the picture you can see an Instrument-Pattern (more about Instrument-Patterns later).
Even if the Instrument-Patterns are not activated, you can still use the two tracks all the way to the right (I Mst and I S00). They are 'Instrument master tracks'. If you put any fx on the 'I Mst' it will affect the entire instrument.
If you look at the Renoise internal DSP chain, you will realize that from the moment you put any fx into the instrument tracks (I Mst), your instrument will behave like a vsti (only one stream out).
You then have to use this instrument in one track only. This is also true if your Multi-Instrument contains one or more vsti.
Now… as discussed in another thread, this might be too limiting for some. It IS possible to duplicate everything for each voice of polyphony. But it CAN be very cpu intensive. But not always.
So maybe you should be able to set a 'True Polyphony', where the default poly is 1.
The 'Voices' setting I included in the picture was just meant as normal polyphony, just as you set in any vsti. A true polyphony will give multiple streams out of the instrument.

However, this is not an issue if you only use samples in your Multi-Instrument (and no instrument-level fx).
This is also one of the reasons I would wish for a ‘render instrument’ feature. (explained later)
Renoise will then render out each note and velocity layer into another instrument with only samples.

Instrument-Patterns:

Now we are getting into more advanced stuff.
The idea is that you have a pattern with its own notes, fx and instruments for each note and velocity layer in your instrument. If you for instance press a C-4 on your midi keyboard with a velocity of 30, then the pattern that covers that range will be played.
(see first picture) Above the pattern you can see its range: note = C-4 and velocity = 30-40.
You have another pattern for note C-4, velocity 20, and another one for D#4 40, etc. etc.
You decide the resolution and the boundaries of your velocity layers in the keyzone window. Theoretically you can have 10 (octaves) * 12 (notes) * 127 (velocity layers) = 15240 patterns.
But it's nuts to use all the velocity layers, and you don't need 10 octaves either. A more typical scenario could be 4 octaves and 4 layers = 192 patterns.
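The (note, velocity) to pattern mapping described above is essentially a zone lookup. A minimal sketch of how it could work (hypothetical illustration only, not Renoise code; all names are made up):

```python
# Hypothetical sketch of the (note, velocity) -> Instrument-Pattern lookup.
# Each pattern claims one note and an inclusive velocity range, e.g. C-4, 30-40.

def find_pattern(patterns, note, velocity):
    """Return the id of the first pattern whose zone contains the event."""
    for p in patterns:
        if p["note"] == note and p["vel_lo"] <= velocity <= p["vel_hi"]:
            return p["id"]
    return None  # no zone mapped for this event -> nothing plays

patterns = [
    {"id": "C4-soft", "note": "C-4", "vel_lo": 20, "vel_hi": 29},
    {"id": "C4-mid",  "note": "C-4", "vel_lo": 30, "vel_hi": 40},
]

find_pattern(patterns, "C-4", 30)  # -> "C4-mid"
```

So a C-4 at velocity 30 hits the 30-40 zone, velocity 25 hits the 20-29 zone, and a note with no mapped zone simply plays nothing.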

As you can guess, there is no limit to what you can do with a pattern :)
Automation, commands, multisamples/instruments, playing a sequence/arpeggio, etc. You simply can't get more flexibility than this. You can now, for instance, treat each sample in your instrument as an instrument in itself (adding envelopes/filters etc).

I also included a sustain/loop function in the pattern. Have a look at the line numbers in the pattern; there you can adjust this. The green part is the release part of the pattern.
So in this example you can see there is a release sample (C-5 82 on track 02). Release samples are very often used in GIGA, Kontakt and other big formats.
You can also synch the Instrument-Pattern BPM to the song BPM.

If you enable Instrument-Patterns, you will see that you get another [+] sign in the instrument list.
(Picture 2)
Open it and you will see all the Notes used in the instrument. You can then open each Note to see the velocity layers and finally the list of instruments/samples used for this single pattern only.

However, very often you wanna use the same instrument in several patterns (for instance, you usually don't wanna load a string or bass vsti for each single Instrument-Pattern when you are gonna use the same instrument across several of them). So I suggest that the first 80 (hex) instrument slots are dedicated to what I call 'common-instruments'. You load a common-instrument just like you load the Multi-Instrument; it will be at the top of the list under the [+]MultiInstrument. In other words, if you wanna load an instrument/sample that you are gonna use in ONE pattern only, then you have to browse to that pattern in the instrument list, press the [+], and you will get a - - - empty - - - slot (which will get an instrument number of 80 or higher). This way you won't get a conflict with the common-instrument numbers (slots 00 to 7F).
((You can't just have common-instruments, as that is limited to FF (hex) slots. Even if you had FFF slots they would not be sorted that well, so it would be damn hard to find the right sample/instrument. That's why each single Instrument-Pattern has its own unique 80-FF slots.
This also gives more room for instruments without increasing the instrument-number digits in a pattern. A GIGA file can easily contain several thousand samples. If that many samples were to be loaded as single common-instruments, then you would need more digits.))
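The slot split proposed above boils down to a two-table lookup. A tiny hypothetical sketch (slot boundary and names are my own illustration of the idea):

```python
# Hypothetical sketch of the proposed slot split: slots 0x00-0x7F are
# "common-instruments" shared by every Instrument-Pattern, while slots
# 0x80-0xFF are private to the single pattern that declared them.

def resolve_slot(slot, common, pattern_local):
    """Look a slot number up in the right table for the current pattern."""
    if slot < 0x80:
        return common.get(slot)        # shared across all Instrument-Patterns
    return pattern_local.get(slot)     # unique to this one pattern

common = {0x00: "pro53 bass"}          # loaded once, visible everywhere
local_c4 = {0x80: "c4_release.wav"}    # only exists inside the C-4 pattern

resolve_slot(0x00, common, local_c4)   # -> "pro53 bass"
resolve_slot(0x80, common, local_c4)   # -> "c4_release.wav"
```

This shows why the numbers never collide: the same byte range means something different per pattern only above 0x7F, so a GIGA-sized sample set fits without widening the instrument digits.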

Speaking of the instrument list… it might be a good idea to be able to expand the list like in the second picture. There should be plenty of room if you use something higher than 1024* resolution. Also, a small button at the bottom of the instrument frame to expand/shrink the list. The expansion could also work at smaller resolutions when you are in the Pattern Editor. When you go into other middle views it could auto-shrink the list.
Or, ultimately, you could undock the list into a separate window. Why not make the entire tracker dockable?
My dual monitor setup would really like that. :)
There would also be room for other lists like clips/patterns/RAM-samples/FX, tabbed into the bigger list.
Even customizable tabs to sort instruments into groups.

Back to the editing part of Instrument-Patterns:
You can see that you have different modes for editing the patterns. 'Edit All' means, of course, that you edit all patterns at once. If you type anything into one pattern, it will be the same in all the others.
The exception is if you have 'Auto Note' enabled. This means that if you, for instance, type the note F-4 into pattern C-4, then in pattern D-4 the note will be auto-adjusted to G-4, etc. So you don't have to go into each Instrument-Pattern and change things manually to get the right pitch.
'Edit Single' edits only the pattern you see.
'Edit Selection' means that you edit a selected range of patterns. You select this range on the scale keyboard, or select a group of patterns in the keyzone window.
'Edit Note' means that you edit all velocity layers of the current pattern note.
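The 'Auto Note' behaviour above is just interval transposition between pattern base notes. A minimal sketch (hypothetical, with my own note-name helpers; note names use the dashed form like "D#-4"):

```python
# Hypothetical sketch of 'Auto Note': a note typed into one
# Instrument-Pattern is transposed into the others by the interval
# between their base notes (F-4 typed into pattern C-4 becomes G-4
# in pattern D-4, since D-4 is two semitones above C-4).

NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def to_number(name):
    """'F-4' -> absolute semitone number."""
    note, octave = name.split("-")
    return int(octave) * 12 + NOTES.index(note)

def to_name(number):
    """Absolute semitone number -> 'G-4' style name."""
    return f"{NOTES[number % 12]}-{number // 12}"

def auto_note(typed, edited_base, target_base):
    """Transpose `typed` by the interval between the two pattern bases."""
    shift = to_number(target_base) - to_number(edited_base)
    return to_name(to_number(typed) + shift)

auto_note("F-4", "C-4", "D-4")  # -> "G-4"
```

So one keystroke in 'Edit All' mode can land in every pattern at the musically right pitch.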

ADSR:

I also think you should be able to choose which type of envelope setting to use for the instrument Amp and Filter in the Instrument Editor.
Drawing envelope points is very cool but also very static. It's not that efficient if you're gonna automate or scale the envelopes.
We would need an option to choose either normal envelopes drawn with envelope points, or ADSR plus cutoff, reso and env amount.
You can hardly find any synth or sampler that does not have ADSR/cut/reso/amount… and for a good reason.
There are several ways to implement this. Just have a look at the different envelopes in vsti's like Rhino2 (which has both types), FM7 and Pro-53 to see what I mean.

Scaling:

(see first picture)
Scaling is one of the key features of any synth to make instruments sound real and dynamic. It makes it much easier to vary the sound and ‘live feeling’ when you play on the keyboard. It can be essential to program good instruments.

There are lots of things you can scale here.
As you see, on the x-axis you can scale over the keys or the velocity.
If you choose velocity, you will scale between the velocities within the note range you have selected on the keyboard.

On the y-axis you can have all kind of things.
Some universal, and some only for the Instrument-Patterns.
First, you can have a per-instrument midi-in <–> velocity-out scaling.
This will scale the velocity from your midi input keyboard.
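Such a midi-in to velocity-out scaling could be a curve of a few points with interpolation in between. A hypothetical sketch, assuming a piecewise-linear curve (the curve shape and names are my illustration, not anything from the mockups):

```python
# Hypothetical sketch of a midi-in -> velocity-out scaling curve,
# defined by a few (in, out) points and linearly interpolated between them.

def scale_velocity(curve, vel_in):
    """Map an incoming MIDI velocity (0-127) through a piecewise-linear curve."""
    pts = sorted(curve)
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= vel_in <= x1:
            t = (vel_in - x0) / (x1 - x0)
            return round(y0 + t * (y1 - y0))
    return vel_in  # outside the curve: pass through unchanged

# Soften the whole keyboard response: full-range input, compressed output.
curve = [(0, 0), (127, 90)]
scale_velocity(curve, 127)  # -> 90
scale_velocity(curve, 64)   # -> 45
```

Adding more points would give the kind of per-instrument response shaping the scaling window suggests.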

You can also scale the velocity of each instrument in a Multi-Instrument (common-instruments).
If you have a multisample instrument made of samples, then you can scale the volume of the instrument (not the same as velocity).
If the instrument is not a common-instrument or if it is just made of samples then you can freely scale the panning too.

If you have ADSR enabled for an instrument (amp and/or filter), then you can scale each parameter here.
And of course you can scale Cutoff, Reso and Env Amount. (You know… hitting the keys lightly will give a different cutoff than hitting the keys hard, etc.)

Now, if you enable 'Instrument-Patterns', you can scale all the fx parameters you have used across several zones.
In Unison mode (polyphony = 1) you can even scale single inserted fx.
In the first picture you can see that the Compressor ratio is scaled by velocity on the note C-4.
In this case that means that if you hit the key C-4 softly, it will compress with a higher ratio than if you hit the key C-4 hard. Can be very handy if you program drum banks etc.
Now, there are issues with this. If you take a technical view of the DSP routing, you will see that you need a unique compressor for each velocity layer. So you can end up using quite a lot of ram/cpu on this. But that is totally up to the creator of the instrument.
However, if the instrument is unison (polyphony 1), then only one compressor needs to be used.
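The compressor-ratio example above is just a parameter interpolated from the hit velocity. A minimal hypothetical sketch (the soft/hard ratio endpoints are made-up values for illustration):

```python
# Hypothetical sketch of fx-parameter scaling: the compressor ratio on
# C-4 is interpolated from the hit velocity, so soft hits compress
# harder (higher ratio) than hard hits, as described for the picture.

def scaled_ratio(velocity, ratio_soft=8.0, ratio_hard=2.0):
    """Linear ramp from ratio_soft at velocity 0 to ratio_hard at velocity 127."""
    t = velocity / 127.0
    return ratio_soft + t * (ratio_hard - ratio_soft)

scaled_ratio(10)   # soft hit  -> close to 8:1
scaled_ratio(120)  # hard hit  -> close to 2:1
```

Any scalable fx parameter (reverb length, cutoff, send amount) would follow the same shape, just with a different pair of endpoints per zone.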
Anyway… you can do lots and lots of things with fx scaling.
If you, for instance, put a reverb on the 'I S01' (instrument master send channel) and then put a send device in all the Instrument-Patterns, you can scale the amount of reverb you want on different keys or velocities.
As you are scaling a send device, you will only process one reverb.

If you've got the ram and cpu, and the plugin supports it, then you can also put a unique reverb on each Instrument-Pattern. This will let you scale all of the reverb's fx parameters, like the length of the reverb (hitting a key hard can make the reverb longer than hitting it softly, or C-5 can have more early reflections than C-4, etc. etc.).
The possibilities are endless. And it's all up to the programmer of the instrument to choose how heavy on the cpu it will be (in the year 2010, with 8 multi cpu cores, this might not be a problem :P ).
And also: you choose how many Instrument-Patterns to use (the resolution of the velocity layers and the note range).
But a huge warning should appear every time you are about to insert an fx/vsti into multiple patterns at once.

Instrument Presets:
I think you need instrument presets (programs) if you are gonna import other formats like AKAI/GIGA etc.
It would be nice if the .rni had its own internal bank. So if you make large and complicated instruments, you should be able to make several programs. I guess this belongs somewhere in the 'Instr. Settings' tab. Maybe a 'Multi-Instrument Properties', like you have 'Midi Properties', 'Sample Properties' and 'Vsti Properties'?

Keyzones:

Ok, I did not make a picture of the keyzones. You all should know how they work. Have a look at other vsti samplers.
There is one difference though. You should be able to switch between two windows: one where you insert samples directly into zones, and another where you insert Instrument-Patterns.
I agree with It-Alien in the first post in this thread. A list of all samples/(Instrument-Patterns) used in a selected key would be very nice, with boxes to type values into.

Oh… one more thing here could be resolution presets. Simply a set of presets to set up the zones.

Instrument rendering

Finally, when you have made that kick-ass instrument, you realize it uses 70% of your cpu :(
And you might not be as free as if it was a pure sample-based instrument (this has to do with internal DSP routing and polyphony). You can smell the limits the same way vsti's have limits.
Then you could render your heavy instrument into a simple sample-based instrument.
You know what I mean if you have seen the Arguru/discoDSP vsti HighLife, or if you have rendered vsti's into samples/sf2 using Chainer.
You can set up instruments from vsti's and add fx (vst) as you wish, and then render it all out to samples…
You choose the note length, the release length, the note range and how many velocity layers it will render out. Hit render and you are free.
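The render step above is essentially one bounce job per (note, velocity layer) pair. A hypothetical sketch of the plan the renderer would enumerate (field names are illustrative):

```python
# Hypothetical sketch of 'render instrument': enumerate one render job
# per (note, velocity layer), each holding the note length and release
# tail to capture; each job would then be bounced to a single sample.

def render_plan(note_lo, note_hi, layers, note_len, release_len):
    """List every (note, velocity) pair the renderer would bounce."""
    step = 128 // layers
    jobs = []
    for note in range(note_lo, note_hi + 1):
        for layer in range(layers):
            jobs.append({
                "note": note,
                # render at the top velocity of each layer's zone
                "velocity": min(127, (layer + 1) * step - 1),
                "length": note_len + release_len,
            })
    return jobs

# 4 octaves (48 notes), 4 layers -> 192 samples, matching the earlier count.
len(render_plan(48, 95, 4, note_len=2.0, release_len=1.0))  # -> 192
```

This also shows why render settings matter: doubling the layer count doubles the sample count and the instrument's size on disk.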

One nice trick would then be to render out the same instrument several times.
First you turn off all fx like reverb/chorus/delays etc. and render out the dry instrument. Then you render out 100% wet reverb, then 100% chorus, etc. Then you can set up the new instruments in a new Multi-Instrument and easily adjust/scale the amount of reverb/chorus you want. And it's all sample based, thus not using much cpu. With the reverb, you can easily change the length (shorten the tail) just by adding envelopes to the samples and fading out, etc. etc.

Some other examples of use:
If a vsti plugin is not too heavy on your cpu, you could insert several instances of it to get better control of each note, almost like samples. Let's say you insert the vsti Pro-53 in slot 00 (common-instrument). You find a cool setting, for instance a bass sound. You set voices in the Pro-53 to 1 and turn off fx you don't need. Now it's pretty easy on the cpu. Then duplicate this instrument over slots 01 to 0F. You now have 16 instances of the same Pro-53 plugin. You can then insert them into different Instrument-Patterns. For instance, you set instrument 00 on F-2 and all patterns (notes) below.
Then you set 01 on G-2, 02 on G#2, 03 on A-2 … … 0F on G#3 and all above.
You can then pan each note almost as you like, and use different commands that will only affect that single note. Almost like samples.

Ok, I'll stop now… now Taktik can come and say that none of this is possible to make, and it's just too complicated and confusing :P
I’m just trying to generate ideas :)

cheers
-pysj

Very nice!
Some really nice ideas.

However, have you tried using energyXT as an instrument container?

It lacks some of the things you have described here. However, I think it could be better if the instruments were modular?

With energyXT you can load several vsti's and apply patterns and/or chord automations to them. You can also apply envelopes/lfos to any parameter.

I haven't really tried the energyXT sampler yet. But I do believe it lacks the ability to control things with velocity.

Some other things I think could be improved with the Renoise Instrument Editor: once you open it, I think it could expand over the lower area for the Track DSP/Instr. Settings/Automation etc. tabs…
This would give room for bigger envelopes etc. and more options. Instead there could be a back/exit button, which brings back the pattern view and the lower area.

Another thing that could save space in your pic 1 example would be to have the edit all/edit single etc. as a dropdown menu.

Wow, pysj, you have really put a lot of thought into this. I like many ideas here, like the instrument pattern, which would give you endless possibilities. Then again I also think that it could be too complex plus it adds new limitations. Hmm… it’s not easy creating a feature that should be quick and easy to use yet complex and allow everything you wanna do :)

About the ADSR: I have not looked at the VSTs you wrote about, but how about 4 sliders that simply changes the envelope. Then you’ll also have the possibility to fine-tune your envelope after setting ADSR.

about ADSR:

the equivalent of FT2's Lxx command (set volume envelope offset) would already be a great thing, with not that great an effort in programming.

If you were not an FT2 user, or not an advanced one ( :rolleyes: ), I will explain what Lxx did:

Under normal conditions, when an instrument is played, its volume envelope starts playing from the beginning.

You probably know that the units of the envelopes are ticks: those vertical lines you see in Renoise envelopes (see picture) are actually ticks.

Lxx lets the envelope start from the xxth tick instead of the first one.

Also, you can use Lxx without retriggering the note (i.e. with a note, but without specifying the instrument number) to set the envelope to a new position.

Of course, since Renoise is not limited to 256 ticks of envelope length, the corresponding Renoise command should be something like Lxxx.
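The effect of Lxx/Lxxx is easy to picture as a playback offset into the envelope. A hypothetical sketch (my own envelope helpers, just to show the offset semantics):

```python
# Hypothetical sketch of FT2-style Lxx: instead of starting the volume
# envelope at tick 0, playback jumps straight to tick xx (or xxx, since
# Renoise envelopes can be longer than 256 ticks).

def envelope_value(points, tick):
    """Piecewise-linear envelope lookup; points are (tick, value) pairs."""
    pts = sorted(points)
    if tick <= pts[0][0]:
        return pts[0][1]
    for (t0, v0), (t1, v1) in zip(pts, pts[1:]):
        if t0 <= tick <= t1:
            return v0 + (tick - t0) * (v1 - v0) / (t1 - t0)
    return pts[-1][1]  # past the last point: hold the final value

def play_with_offset(points, offset, n_ticks):
    """Lxxx behaviour: the envelope runs from tick `offset` instead of 0."""
    return [envelope_value(points, offset + t) for t in range(n_ticks)]

env = [(0, 0.0), (100, 1.0), (200, 0.0)]  # attack to a peak, then decay
play_with_offset(env, 100, 3)             # starts right at the peak
```

With offset 100 the attack is skipped entirely, which is exactly the kind of effect the command was used for in FT2.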

Great effects can be achieved with this command: it was so in FT2, and with RNS it would be a lot better.

Another thing that could be nice is a command that lets you choose which envelope preset the instrument is going to play with.

Pysj: I totally agree with your ideas… Imagine Renoise in about a year, how many features it will have… can't wait…

Let's hibernate!! :w00t:

I might not have been too specific about the 'easy parts' of this system.
When it comes to setting up multilayered samples there is, of course, the easy/normal way of doing it. You can do all of this in the keyzone part.
Just for the record, I made a picture of a typical keyzone editor :rolleyes: . This one also has a table, for those who like that kind of editing.

Picture of Keyzones

You simply drag/drop samples from the instrument list or from the browser into zones. You drag them into the graph or into the table. Then you adjust the zones either graphically or by typing into/adjusting the boxes in the table.
You don't need to enable either Multi-Instrument or Instrument-Patterns to use any of this. Except that if you use Instrument-Patterns, you can adjust the pattern zones here.
Import of other formats like AKAI/SF2 etc. would also use this simple method. It's just like setting up a normal instrument as you do in Renoise today, only with several velocity layers.

Some functions in the keyzone:
I think the picture pretty much speaks for itself. But there are a few hidden things I can think of.
If you select a zone in the graphical window, you should have special options to move/resize the zones.
For instance, with alt + hold lmb you can change the velocity range by moving the mouse up/down, and the key range by moving the mouse left/right.

The table can be a bit overkill for some. But remember how impractical the graphics can be if you have 20-30 or more layers on a single key. Instruments today are rapidly increasing in size; I've seen several with more than 50 layers.

The table will list zones that are mapped. You can choose 3 modes:
All = will list all zones.
Key = will list all zones in the selected key(s) (like on the picture where D-4 is selected).
Sel. = will list all selected zones in the graph.

You should be able to sort the list by pressing the header name in the table (or via a right-click option). By default it will list the zone with the highest velocity on top.

You can select one or several zones in the table and move/replace/swap them up/down in the table.
The small arrows will change the values of all selected zones in the table.

Clicking a key on the keyboard will select it. And you will then see all zones on that key in the table (if ‘Key’ is selected in the table).
If you double click a key on the keyboard it will select all zones on that key.

'Generate Velocity Layers' will generate equally sized layers (zones) on top of each other. This is a good start if you, for instance, drag 20 samples onto a single key. This way you don't start with 20 overlapping zones, and it's then easy to adjust them further.
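The 'Generate Velocity Layers' idea is simple to pin down. A minimal hypothetical sketch of the zone math (my own illustration of how the 0-127 range could be split):

```python
# Hypothetical sketch of 'Generate Velocity Layers': split the 0-127
# velocity range into roughly equal, non-overlapping zones, one per
# sample dropped onto the key.

def generate_layers(n):
    """Return n (lo, hi) velocity ranges covering 0-127 without overlap."""
    step = 128 // n
    zones = []
    for i in range(n):
        lo = i * step
        hi = 127 if i == n - 1 else (i + 1) * step - 1  # last zone absorbs any remainder
        zones.append((lo, hi))
    return zones

generate_layers(4)  # -> [(0, 31), (32, 63), (64, 95), (96, 127)]
```

Dropping 20 samples onto one key would then start you with 20 adjacent zones instead of 20 stacked on top of each other.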

'Glue Zones' means that the zones stick together. If you adjust the range of a zone, the neighbouring zones also adjust.
Then it's easy to increase a range while, at the same time, the 'colliding' zones decrease so they don't overlap.

'Overlapping Zones' lets you choose whether zones are allowed to overlap. For most instruments you don't want this to happen.

The preset thing is mostly meant for Instrument-Patterns. It can be easier to set things up roughly with a preset and then start tracking the Instrument-Patterns.

Ok… thats all I could think of for now.
Hope some of this can be inspiring ;)

cheers
Pysj

@splajn
I have indeed tried energyXT. It's a very nice piece of software.
However… if you enable Instrument-Patterns, then 'my' system is also modular.
The instruments are the generators. The patterns are the note/automation sequencers. When you insert a note you are connecting them. The track/send/master channels contain the fx. The send devices connect/route them.
It's just a bit more tricky to see it visually in your head, I guess :)

@Johan
You are absolutely right. It's very hard to have it all, both easy and advanced/flexible.
I tried my best to make it as easy as possible on top. The easy sample handling, with just adding samples to different keys, is still there. You activate the more advanced parts in steps, as you wish, like Multi-Instruments and Instrument-Patterns.
Scaling is also totally optional.
However, I don't see any big limits here. Maybe you do? Please let me know. In that case there must be some better ways of doing this. I hope to keep the discussion up :)

ADSR:
I'm all in for the Lxx command. Still, it's not close to an ADSR/cut/res/env setting in many cases. I see no other way than having both :) You would need to draw endlessly many envelopes to emulate it perfectly with env points and Lxx alone.

-Pysj

:walkman:

I was referring to the one-output-stream limitation that instrument effects would cause. But of course, that’s already a limitation today with VSTi, so that’s not really a big deal… Other than that I can’t see any major limitations.

Pysj, thanks: with your JPG you perfectly visualized what I meant here :yeah:

So true…

I can see a few exceptions here though.
If you only use samples and no track-fx anywhere inside the instrument, then you have a generator for each sample = no limitations.

If you use track-fx or a vsti in a common-instrument slot (slots 00-7F, which are shared between all Instrument-Patterns), then you can't benefit from independent generators anymore.
As I understood Taktik's comments on how to implement fx at the instrument level (in another thread), he wants to have separate fx for each generator. This will give you multiple output streams, but will also use a lot more cpu. At least this should be optional IMO.
I use 90% of all instruments in the same track. It would waste a lot of my cpu if this was not optional.

That is why I also mentioned a 'True Polyphony' option.
If this is set to one, then you will have only one stream, like a vsti. If you increase the true polyphony to two, it will actually kinda double the entire instrument, this way giving you two outputs, etc.

cheers
-Pysj

You're welcome :)

You got my bank account number… :P

-Pysj

No, there is not. But this is not the right topic for such questions: this is a thread about the future of the Renoise instrument format.

Oh, sorry. I guess I was half asleep when I made the post.

I found a post over at wattm about DP 4.5:
"The coolest thing I've found so far is that instead of drawing normal automation with the pencil tool, you can have it draw a random quantized automation with a random square wave"

Wouldn't this be a nice feature?..

just a quick thought…

mlon

pitch shift and bend functions

record loop length and slider movement in the sampler… close up on the sample… à la Machinedrum

time stretch

voilà

oh and quantize!

more process options

like processing a single hit, but not the sample in its entirety

I had an interesting thought this morning while waking up. It's kinda similar to some of the ideas Pysj and MickRip have been talking about here earlier. It's also conceptually similar to some other stuff that's been posted about using Renoise as a VSTi, but that is not really what I am gonna be talking about here. What I have in mind is probably a little more elegant/simple than the nightmarish world of building a VST from Renoise. :D

Anyway…

I was thinking that rather than (or even in addition to) trying to add some kind of complex pattern sequencer/editor into the instrument format, which could potentially be months and months of work, what about the ability to simply load a .RNS song into the instrument itself? The instrument would basically function as an instance of the Renoise player engine, but obviously with some limitations.

I don’t think it would be necessary to include any major editing functions from within the instrument itself - we would simply prepare the .RNS files at some earlier point in time, within the normal Renoise environment. We would only have access to player-related functions when later using a song as an instrument.

I imagined that we would be able to use the 09xx command to trigger different patterns of the song, rather than affecting a sample offset for that instrument.

I also imagined a BPM-sync function, with the option to: (1) sync to the main environment BPM, (2) sync to another BPM you can specify manually, or (3) have no BPM change (the instrument would play with the original BPM of the .rns loaded within it).
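The three sync options above amount to a simple selection of which tempo drives the embedded song. A trivial hypothetical sketch (mode names are my own labels for the three options):

```python
# Hypothetical sketch of the three BPM-sync options for a .rns loaded
# as an instrument: follow the host song, use a fixed manual override,
# or keep the embedded song's original tempo.

def effective_bpm(mode, host_bpm, manual_bpm, song_bpm):
    """Pick the tempo the instrument's internal player should run at."""
    if mode == "host":
        return host_bpm      # (1) sync to the main environment BPM
    if mode == "manual":
        return manual_bpm    # (2) sync to a BPM specified by hand
    return song_bpm          # (3) no change: the .rns plays at its own BPM

effective_bpm("host", 140, 0, 125)  # -> 140
effective_bpm("none", 140, 0, 125)  # -> 125
```

Mode (1) is what makes the 09xx pattern-triggering idea stay in time with the main song.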

There could also be some realtime transposing options perhaps. For example, playing C-4 would trigger the instrument’s song normally. Playing D-4 would transpose the notes on the fly, allowing us to “tune” the instrument song to our main song environment. A base-note setting would also be available, working in the same manner as it currently does with samples. Any beat-synced samples within the instrument’s song should probably behave as they normally would - with their pitch unaffected by the note.

Additionally, things such as the instrument’s volume/panning envelopes should have their normal effect on the instrument, allowing us to fade or pan the song within the instrument. Maybe cutoff/resonance envelopes should also work, too.

Now… I don’t really know how totally insane this idea is, hehe. I guess it depends on how the Renoise player engine is structured, and whether it can be instantiated in this manner. In my mind I picture Taktik and crew as pretty crazy coders who would be thinking about things like this, planning for the future and structuring code in a very modular way. So I guess I’m imagining/hoping that the Renoise engine is modular enough to allow it.

But you can immediately begin to see how powerful something like this would be. We could create a simple .RNS song containing common drumkits (808, 909, etc) with common drum patterns we might use often, then trigger those different patterns with the 09xx command within other new songs. Or depending on how powerful your machine is, it could be as crazy as loading up a selection of entire songs as instruments, and playing/mixing them on the fly… could this also solve some of the live performance questions other people have?

Anyway. Something to think about. :)