Renoise Instruments Re-thought.

I’ve been reading what other people have to say about Renoise instruments.

I feel as though the existing instrument/sample philosophy needs to be forgotten and redesigned…

The existing sample editor works well and should remain largely untouched for the time being. We should have an instrument editor which can draw from a pool of samples, rather than the existing one-to-one relationship (this part is obvious).

There are a gazillion ways to make an instrument editor - from beatslicing to wavetables to synthesis - so we must resign ourselves to the fact that it is inevitably going to be impossible to please everyone by employing a single way of doing things.

I suggest we create multiple instrument editors which draw on the sample and VST pool. When you create a new instrument, you are prompted to choose an “instrument type” or “instrument plugin”. Each plugin could have a completely separate interface. A lot of ideas that have been suggested in the forums could be implemented transparently this way.

This paves the way for a Renoise instrument API.

“multiple instrument editors”

What I was thinking is that we should aim for a customizable controller page as the first ‘Instrument Tab’.
Just like with any synth out there, you don’t need to know what’s inside to use it. Tweaking a few controls up front should be enough.

I was thinking a good solution would be that we can right-click any element in the instrument structure (a device, a single slider, an envelope device etc…) and add a ‘shortcut’ to the element on the first ‘controller-tab’.

We could also choose whether it is shown as a slider or mini-slider (or a knob, for that matter).
We could have different kinds of presets there as quick starting points before you begin building your new instrument (just as the current instrument has envelope and LFO devices by default).

Then it’s up to you to keep it minimal or advanced with lots of things to tweak.

The instrument could also have an RNI-device (like a midiCC device). You set this up on another tab inside the instrument, where you can route different things from inside the instrument to outside controllers represented by this RNI-device. You could also route the same sliders directly to relative pattern-commands.
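To make that routing idea a bit more concrete, here is a minimal Python sketch of the kind of routing table such an RNI-device could hold. Every name in it (RniDevice, Route, the “fx:01xx” target syntax) is made up purely for illustration; nothing like this exists in Renoise today.

```python
# Hypothetical sketch only: RniDevice, Route and the target strings are invented
# names, just to illustrate the routing table the post describes.
from dataclasses import dataclass, field

@dataclass
class Route:
    source: str         # parameter inside the instrument, e.g. "filter.cutoff"
    target: str         # outside controller or pattern command, e.g. "midi_cc:74" or "fx:01xx"
    scale: float = 1.0  # simple linear scaling of the routed value

@dataclass
class RniDevice:
    routes: list[Route] = field(default_factory=list)

    def add_route(self, source: str, target: str, scale: float = 1.0) -> None:
        self.routes.append(Route(source, target, scale))

# Example: expose the filter cutoff to an external MIDI CC and to a pattern command.
rni = RniDevice()
rni.add_route("filter.cutoff", "midi_cc:74")
rni.add_route("filter.cutoff", "fx:01xx", scale=0.5)
```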

There are a lot more details to these things.
I’ll post more of my and others’ ideas in the new Feature Design forum when I get the time.
You are also welcome to contribute there if you want to really dig into this.

+1 … + Instrument Patterns + Metadevice SDK

… tbh though, Instrument Patterns are higher on my list than an Instrument API, as they provide instant gratification ;)

Instrument Patterns are the way to go. But an Instrument SDK/API (can anyone explain to me what the difference is?) could possibly solve the issue: an interface standard, access to all of Renoise’s tracker commands, DSP parameters etc. You could write an instrument pattern as a Renoise plugin.

It would be very exciting times! :D

I completely agree that the existing instrument/sample philosophy needs to be forgotten and redesigned.
And the same goes for the effects!

Every instrument slot should get:
A new, dedicated window where you build the new instrument. For example:
-Instrument type: MIDI, Sampler, Renoise Plug-in*, VST Plug-in, DXi Plug-in, Buzz Plug-in, ReWire
-Instrument option: Single Instrument, Combo*
-Instrument interface

*The ability to put more than one instrument per slot to make a combo (like the Combinator in Reason).
*A new Renoise API to develop instruments or machines (like Buzz or Psycle) :w00t:

And a new, dedicated window where you build the new effect. For example:
-Effect type: Renoise Plug-in, VST Plug-in, DXi Plug-in, Buzz Plug-in
-Effect option: Single Effect, Combo Effect
-Effect interface

Please, keep Renoise a simple and efficient piece of software… Maybe all that stuff sounds cool, but I don’t want Renoise to become as complete as Reason or Cubase.
I like Renoise for his minimalism!

I didn’t know Renoise was a guy =( I feel used

It’s like that moment in the film ‘The Crying Game’.
:o

I think the ability to trigger multiple instruments at once… ideally by dropping a number of them into some kind of meta instrument… would be great on its own. Then we get layering, and different envelopes on different samples or on groups of samples, with a small extension of the current RNIs rather than a ground-up rethink. Conceptually at least; I’m no programmer…

If we could do it with VSTs too, then great.

Instrument patterns… I’m not sure I’m sold on that… sell me! :)

You can track your instruments!
That should sell you, no? :)

Just like you can add envelopes and stuff now, using patterns you can do pretty much anything to your instrument. You can stack instruments/samples there (just use several columns/tracks in the instrument-pattern). Do all kinds of fancy tricky tracker wizardry. Use pattern commands, add FX etc.
In other words, anything you can do to make a song in Renoise now, you can also use to make an instrument.
You can read more about instrument-patterns and other cool suggestions in the pinned RNI Future thread.
The idea was to combine the instrument-pattern with traditional instrument/sample mapping. You just drag/drop the different elements to a key/velocity-mapping. So you are not forced to use the instrument pattern to assemble your instrument.
You can use plain samples (like now), or you can use entire instruments (choosing to copy them or to only link them to the new instrument), or you can use entire patterns (and later do the same with clips).
It’s a very simple system really, where it’s up to you to choose at what level you want to assemble an instrument. I can see a lot of creative use of such a system.
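As a rough illustration of that “assemble at any level” idea, here is a small Python sketch of the kind of data model I have in mind. All of these class and field names are invented just to show the structure; none of them are real Renoise types.

```python
# Rough sketch of the "assemble at any level" idea; every class/field name here is
# invented for illustration, not an actual Renoise structure.
from dataclasses import dataclass
from typing import Union

@dataclass
class KeyZone:
    low_note: int       # key range, as note numbers
    high_note: int
    low_vel: int = 0    # velocity range 0-127
    high_vel: int = 127

@dataclass
class SampleRef:
    name: str           # a plain sample, like now

@dataclass
class InstrumentRef:
    name: str
    linked: bool = True  # link to the original instrument, or copy it in

@dataclass
class PatternRef:
    name: str            # an instrument-pattern used as a building block

Element = Union[SampleRef, InstrumentRef, PatternRef]

@dataclass
class Mapping:
    zone: KeyZone
    element: Element

# A single instrument can mix all three levels in its key/velocity mapping:
mappings = [
    Mapping(KeyZone(36, 59), SampleRef("kick.wav")),
    Mapping(KeyZone(60, 71), InstrumentRef("old RNI pad", linked=True)),
    Mapping(KeyZone(72, 96), PatternRef("chord stab pattern")),
]
```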

Well, I can track my instruments now… that’s what I use Renoise for!

I also want to see more opportunity to modulate more parameters from more places. I also think the idea of having something like the envelopes in instruments, but with a pattern interface, is a good one. When I got the demo I also thought there should be tracker-style pattern modulation options in the instruments, since there were envelopes in the patterns.

These are two different points though. Instrument patterns wouldn’t allow me to layer, modulate more widely, or set multiple envelopes up in and of themselves. Those things need to exist anyway, at which point, yes, a tracker modulator in the instruments sounds useful.

On its own, it sounds like the other thing instrument patterns would be used for is triggering phrases… Very useful for those who use a lot of loops, I’m sure, as it makes working with phrases made from one-shot or synth-type samples in the sequencer similar to working with sampled phrases or beats.
I don’t really see the use in this for myself though; I’d vote against.

I’ll stop now rather than go into more detail, because I’d like to know if I’ve correctly understood what you mean.
Sorry if I’ve missed the point of your posts on this or the RNI thread, pysj. Please do explain further if so.
:)

I’ve been thinking about the whole “Tracked instrument” idea and looking at various case scenarios.

Let’s say I have a 4-note instrument.
| C-3 01 | G-3 01 | C-4 01 | G-6 02 |

I want the volume of G-6 to decrease as the root pitch goes up. How would I do that?

The best solution I can think of is to implement multiple instrument types. So “instrument 02” has an option allowing you to link the volume to the pitch.

What if I wanted some notes to transpose, and others to remain static? Easy, I define a “fixed frequency” instrument type, so C-4 sounds identical to C-1.
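To make those two instrument options concrete, here is a tiny Python sketch of what they could do. The function names, base note and scaling amount are all invented for illustration, and the note numbers assume Renoise-style numbering where C-4 = 48 (so G-6 = 79).

```python
# Hypothetical sketch only: a per-instrument option linking volume to the played
# key, plus a "fixed frequency" instrument type. Names and numbers are invented.

def key_scaled_volume(note: int, base_note: int = 48, amount: float = 0.01) -> float:
    """Lower the volume linearly as the played note rises above the base note."""
    volume = 1.0 - amount * max(0, note - base_note)
    return max(0.0, min(1.0, volume))

def playback_note(note: int, fixed_frequency: bool, fixed_note: int = 48) -> int:
    """A 'fixed frequency' instrument plays the same pitch regardless of the key."""
    return fixed_note if fixed_frequency else note

# Assuming Renoise-style numbering (C-4 = 48, G-6 = 79): the G-6 of instrument 02
# comes out quieter than notes near the root.
print(key_scaled_volume(48))   # 1.0
print(key_scaled_volume(79))   # ~0.69
print(playback_note(79, fixed_frequency=True))  # 48, i.e. sounds identical to C-4
```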

I believe it would be a better long-term route to develop an API so developers can create their own instruments, and then implement the next-generation instrument schema using that API (a speculative sketch of such an API follows after the list below).

  • The API should allow access to custom tracker commands.
  • Dblue could write a Renoise version of Glitch. (as an instrument, not an effect)
  • Someone could finally make the beatslicer of their dreams without bothering the Renoise devs.
  • An SF2 wrapper could be developed (which would allow the traditionally VST-unfriendly vibrato/pitch bend/portamento commands)
  • A native synth with transparent access to vibrato/pitch bend/portamento via the tracker. (instruments track like a sample, sound like a synth)
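Just to sketch what such an API might feel like (pure speculation, not anything the devs have announced), a plugin instrument interface could be as small as this; every class and method name here is invented:

```python
# Purely speculative sketch of what a Renoise instrument API *could* look like.
# None of these classes or methods exist; they only illustrate the idea of
# third-party instruments reacting to tracker commands.
from abc import ABC, abstractmethod

class InstrumentPlugin(ABC):
    """Base class a hypothetical beatslicer, SF2 wrapper or native synth would implement."""

    @abstractmethod
    def note_on(self, note: int, velocity: int) -> None:
        """Start a voice for the given note and velocity."""

    @abstractmethod
    def note_off(self, note: int) -> None:
        """Release the voice for the given note."""

    @abstractmethod
    def tracker_command(self, command: int, value: int) -> None:
        """Receive pattern commands such as 01xx (slide up) or vibrato/portamento."""

    @abstractmethod
    def render(self, num_frames: int) -> bytes:
        """Produce the next block of audio."""
```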

You should take a look at NI Massive, dude… I think it has a lot of the features you want in an instrument… although it’s not a sampler but a synth… but perhaps you could use it as a model for inspiration in this crusade. It’s got completely routable parameters with LFOs, step sequencers, enveloping, velocity parameters, etc. The main thing that makes Massive kick ass is its routing, though. It pretty much makes the need for scripting obsolete.

… Point here being, with proper parameter/metadata routing, anything is possible without the use of code.

There are many ways you could solve this.

Have a look at this picture.
Under scaling you will see a collapsed “[+]Instruments”. In there you will see all instruments added to your main instrument (the ones you put into the instrument-pattern). There you will see different parameters for each instrument (remember I talked about giving each instrument its own ‘output device’ called an RNI-device?). You would then just find instrument 02 and the instrument-volume parameter in the RNI-device for that instrument, and draw a scale line for it, either as a key(pitch) scale or as a velocity scale (you can see the velocity/note option to the right of the keyboard in the picture).

After this old picture was made, we have already got a velocity scale as a separate device. Going further in that direction would make that picture a little different, but the functions are the same. We would then just need to add a key/pitch device as well.
If everything is split into devices, it would look more like this picture. Have a look at the lower part of the list to the right, where you see Instrument devices. We would then add velocity and key/pitch devices to that list.

But there are other ways to solve your problem as well… also remember that you can add as many patterns as you want, overlapping each other.
That means you could add instrument 02 into its own pattern.
Each pattern would have settings to make it static or not. I was thinking there could be 3 or more static properties for each instrument-pattern:
-Lock note-pitch
-Lock velocity
-Lock FX

This could also be a per-track option, so you could make track00 static and track01 dynamic (it would change pitch/velocity, or even dynamically change any pattern editor FX value: you just type in the maximum value and it will be scaled linearly). Of course we might take this further and let you scale every single value you put into the pattern editor: you would mark a value in the instrument-pattern and then be able to scale that single value by drawing a pitch/velocity line. But that sounds like a hell of a job to code, and there are just as good workarounds for it. It is very possible to do, though.
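One possible reading of that “type in the maximum value and it scales linearly” idea, as a quick Python sketch; the function and its parameters are made up for illustration only:

```python
# Minimal sketch of linear scaling of a pattern-editor FX value on a dynamic
# track; function name and parameters are invented for illustration.

def scale_fx_value(pattern_value: int, max_value: int, control: float) -> int:
    """Scale a pattern-editor FX value linearly.

    pattern_value: the value typed into the instrument-pattern (0-255)
    max_value:     the maximum the user entered for this track
    control:       0.0-1.0, e.g. derived from velocity or key position
    """
    scaled = pattern_value * (max_value / 255) * control
    return int(round(min(scaled, max_value)))

# A 0x80 FX value on a dynamic track, played at half velocity:
print(scale_fx_value(0x80, max_value=0xFF, control=0.5))  # 64
```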

The third way is not to use an instrument pattern at all. (I’ll explain this one in detail.)
Remember that the instrument-patterns are optional. Very important to remember.
So, to solve this a third way:
Add an instrument-pattern. Type in your 01-instrument notes there, then drag/drop that pattern into the key mapper (and adjust which keys and velocities will trigger this pattern). Highlight the pattern in the instrument list and you will then see ‘pattern-properties’ at the bottom, just the same way you see instrument-properties if you highlight an instrument and sample-properties if you highlight a sample.
In these properties you can tick a box that will make the pattern static.

Then you drag/drop the 02-instrument directly to the key-mapper (02 is not using an instrument-pattern), where you overlap it with the instrument pattern. This will trigger both the pattern and the 02-instrument at the same time and place.

By default, instrument 02 will then of course not be static. But you can also make this one static in the ‘Instrument-Properties’ (like you suggested). As said before, each element inside an instrument has a device chain. That means you can add any type of device to any part of the instrument. So in this case you just highlight instrument 02 and then drag/drop a key/pitch device onto it. There you draw a line over a virtual keyboard (like in my first picture in this post, but now you see it as a device). Then you route the pitch-device output to the instrument volume (or the velocity slider… there is a difference!).
You draw the line so that on higher keys the velocity will be lower.

The fourth solution (if you are nuts and really, really want to do things in detail) is to add a pattern to each velocity value of your main instrument (all this is explained in detail in the RNI thread).
So… velocity 01 will trigger its own instrument-pattern, velocity 02 will trigger another, velocity 03 will trigger another one, etc… The resolution of this you set in the key-mapper.
Remember that you will have options to edit all patterns at once. So typing the FX command 0923 into one pattern can insert it into all patterns (you can choose whether it should only be inserted into patterns for that key, or into every pattern in the instrument, etc.). Again… all this is written in the RNI thread.

All this might look messy in my bad writing and unfinished sketches. But there is a lot more to this. And I believe it is really, really simple because it is based on the same principles that are already in Renoise. How advanced you make your tracked instruments is up to you. None of this complexity will sacrifice the simplicity we have in Renoise now.
So if you know renoise now, you will instantly know how to use this instrument-system when you see it and start using it.

I would not mind an API. That has been discussed many, many years ago as well.
But that is another discussion, I think (or perhaps the discussion you initially wanted, but that I turned into the RNI that has to be included by default in Renoise :))

No matter what, I’m pretty sure Renoise will get its own improved instruments.

An internal synth would be really great.
As for adding pattern commands, I was thinking about something similar as well (when taktik has made another 50 clones of himself -_-). Not for a complete API, but for the internal instrument structure. So if you, for instance, add a VSTi, then you could add an automation device inside the instrument and route any slider from the automation device onto pattern commands. You will still have the limitations of the VSTi itself, but it would still be very useful to manipulate them with the common commands that you already use on internal samples now.
This also implies that relative commands work: for instance, the 01xx command will not only pitch internal samples up, but also pitch a VSTi as well.
This pattern-command FX routing can also make, for instance, 01xx pitch all the 01 instruments (in your example) up, while the 02 instrument stays static, or is even pitched down! Or you can even connect it to an entirely different FX/parameter.
So one command can alter a lot of different things inside the instrument.
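Here is a tiny Python sketch of that fan-out, just to illustrate the routing-table idea; the command numbers, target names and multipliers are all invented for this example:

```python
# Sketch only: a made-up routing table showing how one pattern command (e.g. 01xx,
# pitch slide up) could be fanned out to several targets inside one instrument,
# some inverted, some ignored. Nothing here is an existing Renoise mechanism.

# command -> {target: multiplier applied to the command's value}
COMMAND_ROUTING = {
    0x01: {                            # 01xx: pitch slide up
        "instrument_01.pitch": +1.0,   # slides up as usual
        "instrument_02.pitch": 0.0,    # static: command ignored
        "vsti_pad.pitch_bend": -1.0,   # or even pitched down
    }
}

def apply_command(command: int, value: int) -> dict[str, float]:
    """Return the per-target amounts produced by a single pattern command."""
    routing = COMMAND_ROUTING.get(command, {})
    return {target: mult * value for target, mult in routing.items()}

print(apply_command(0x01, 0x20))
# {'instrument_01.pitch': 32.0, 'instrument_02.pitch': 0.0, 'vsti_pad.pitch_bend': -32.0}
```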

Another aspect of this is using Renoise live.
Let’s say you, for instance, mark all patterns in the pattern sequencer, drag/drop them to the key-mapper, and then press a ‘drumkit generator’. These could act as shortcuts to jump to positions in the sequencer.
So pressing C-4 can make Renoise jump to the 5th pattern in the sequencer, pressing D-4 will jump to the 6th pattern, etc.
Sync options for the jump (sketched below):

  • Immediately
  • Next beat
  • Pattern end
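A minimal sketch of how that key-to-sequencer-position jumping with the sync options above could behave; all names, note mappings and line numbers here are invented for illustration:

```python
# Hypothetical sketch of the live "jump to sequencer position" mapping with the
# three sync options; nothing here reflects an actual Renoise feature.
from enum import Enum

class Sync(Enum):
    IMMEDIATELY = "immediately"
    NEXT_BEAT = "next beat"
    PATTERN_END = "pattern end"

# Drumkit-generator style mapping: each key jumps to a sequencer position.
KEY_TO_SEQ_POS = {"C-4": 4, "D-4": 5}   # 5th and 6th pattern, counting from 1

def schedule_jump(key: str, current_line: int, lines_per_beat: int,
                  pattern_length: int, sync: Sync) -> tuple[int, int]:
    """Return (target sequencer position, line at which to perform the jump)."""
    target = KEY_TO_SEQ_POS[key]
    if sync is Sync.IMMEDIATELY:
        jump_line = current_line
    elif sync is Sync.NEXT_BEAT:
        jump_line = ((current_line // lines_per_beat) + 1) * lines_per_beat
    else:  # Sync.PATTERN_END
        jump_line = pattern_length
    return target, jump_line

print(schedule_jump("D-4", current_line=17, lines_per_beat=4,
                    pattern_length=64, sync=Sync.NEXT_BEAT))  # (5, 20)
```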

Another aspect is arranging many patterns at once. Dragging/dropping patterns from the list of patterns (not from the sequencer) will actually copy the entire pattern (with devices and everything) into a new instrument.
Every element of this new instrument is unique (FX/VST(i) etc. are not shared with anything outside the instrument).
This way you can arrange/play entire patterns, several at once.

Later on you can do the same with clips… then this feature starts to make more sense.
With clips there will only be shortcuts to the notes, which is another, totally different long story that I will come back to later. But it actually fits this scenario very well.

I love the idea of instrument patterns. Renoise becoming a phrase sampler and sequencer rolled into one :guitar:
So you’d have a limited subset of features from the pattern editor? A lot of thought needs to go into that one.
For instance, I would expect to be able to play an instrument with an arpeggiator (standard up/down/random-walk modes, and the ability to combine the notes being played). This is not possible in realtime, not even with the Live mode that pysj suggests (which is another great idea btw).

Finally, don’t forget the existing instrument annoyances/shortcomings (yes, I keep track of such things):

The ability to control an instrument through a specific MIDI channel

This is really not useful the way things work ATM. For live performance, one wants to be able to change an instrument while playing (without having to fiddle with mouse + keyboard, or even look at the screen). Another practical use is when using an external sequencer to feed MIDI to Renoise.

The ability to specify “auto note off”

See this thread for the description

Shared sample pool
The microtuner generates huge files :frowning:

Auto-select sample split
Forgot this one. Being able to select a split by hitting the corresponding note on the keyboard. Can’t remember who suggested it, but instrument-building would become a lot faster.

PS: I had an idea a while back about how advanced routing could be displayed, which could easily be applied to instruments: Complex routing in renoise.gif. This type of diagram has a flow that is pretty easy to interpret, and fits well with the Renoise aesthetic.

That looks really good.

My first thought was that instead of having multi-instruments as a + sign next to the instrument, Renoise should use the sample bank that is already there. But then I realised that it would not be such a good idea, because that should be saved for multi-sample instruments, so that design definitely looks feasible.

Yay for the stuff danoise just mentioned, including:

  • Instrument macro. I envision the macro/pattern editor as a pop-up edit window, containing a miniature single-track version of the Renoise tracker. Arpeggios, timed note-offs, pitch slides, other effect column stuff, and all the other usual capabilities could be implemented within the macro editor.

  • Auto split generator (for faster instrument building). However, even the ability to declare a split range would be fast, and even more powerful (a rough sketch follows after this list). For each sample in the instrument, there could be a spot next to the “basenote” menu for selecting a note range. Like this:
    Basenote [F#4] | Split bottom [F#4] | Split top [C#6]

  • Shared sample pool. I must admit I’m only into the idea assuming the possibility of tuning one sample to different pitches at different places in the instrument (for microtonal scales).
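Picking up the split-range idea from the second bullet above, here is a rough Python sketch of declaring a split range per sample. The class and function names are invented, and the note numbering assumes Renoise-style values where C-4 = 48.

```python
# Illustrative sketch only: declaring "Basenote [F#4] | Split bottom [F#4] | Split top [C#6]"
# per sample. Class/function names are invented; note numbering assumes C-4 = 48.
from dataclasses import dataclass

NOTE_NAMES = ["C-", "C#", "D-", "D#", "E-", "F-", "F#", "G-", "G#", "A-", "A#", "B-"]

def note_number(name: str) -> int:
    """Convert e.g. 'F#4' to a note number with C-4 = 48."""
    return NOTE_NAMES.index(name[:2]) + 12 * int(name[2])

@dataclass
class SampleSplit:
    basenote: int
    split_bottom: int
    split_top: int

    def contains(self, note: int) -> bool:
        return self.split_bottom <= note <= self.split_top

split = SampleSplit(note_number("F#4"), note_number("F#4"), note_number("C#6"))
print(split.contains(note_number("A-5")))  # True: inside the declared range
print(split.contains(note_number("C-4")))  # False: below the split bottom
```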

The idea of Renoise plugins that users could develop really intrigues me, but I can’t say I quite understand what its potential is. Can anybody mention some practical applications, or point to a thread if one has been started?

On a side note, yay for Renoise keeping “his minimalism,” as mat-weasel said… though I sincerely doubt that the developers intend to let the program get as bloated as other software.