I think one thing that’ll help streamline the feature request surge that’s sure to come - now that a wider pool of people are able to make adjustments to Renoise - is a thread where people can get a definitive YES or NO on whether their idea can be dealt with by tool scripting.
Some of my ideas in the past have quite rightly been answered with “hold on a bit longer, because the scripting engine could deal with this” but I have a few other ideas which I think maybe couldn’t be fixed with scripting… Anyway, if I know at the outset (even if my idea is half-baked but I can explain the gist of it), I can ask either the tool development community to have a look at it or the Renoise devs, if it’s something more fundamental. Does that make sense?
I think in time, people will start to get an idea of the sorts of problems that can be solved with scripting, and consequently become more judicious in their lobbying for new features or adjustments.
With that in mind… I’d like to kick things off with a request of mine from a while ago about being able to select single columns of pattern data (not just Note+Ins+Vol or FX - I actually often want to select only the instrument values, for instance). Is that too fundamental to Renoise’s GUI engine, or could it be fixed with scripting? If it could be done with scripting, no matter how complicated, I’ll know not to bother Taktik and the devs with it, and I’ll either learn to script for Renoise or I’ll beg tools devs to help me with it.
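For reference, the tool API can’t change how the pattern editor draws its selection, but it can read the current selection and touch just one column’s worth of data. A rough sketch, assuming the selection_in_pattern property is available in your API version and that the selection sits on a single track - this clears only the instrument values in the selected lines:

-- Sketch: clear only the instrument values inside the current pattern selection
local song = renoise.song()
local selection = song.selection_in_pattern  -- nil when nothing is selected
if selection ~= nil then
  local track = song.tracks[selection.start_track]
  local pattern_track = song.selected_pattern.tracks[selection.start_track]
  for line_index = selection.start_line, selection.end_line do
    local line = pattern_track:line(line_index)
    for column_index = 1, track.visible_note_columns do
      -- 255 is the API's "empty" value for the instrument column
      line:note_column(column_index).instrument_value = 255
    end
  end
end

A copy/paste variant would just read instrument_value into a table instead of overwriting it; the point is that per-column access is already there, even if the visual selection isn’t.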
No one will be able to tell you exactly what is and isn’t possible with scripting as we have it now. There are for sure limits right now (no “realtime” scripting like DSP or meta devices), but there’s a LOT you can access and change. What you are able to do with all that is something only you can answer. There are people who have built a vowel filter with Renoise’s native FX only; now try to imagine where the limit is in scripting when you’re a bit creative.
But hold your horses. Take a look through the scripting docs to see what’s made public and what isn’t. Take a look at the existing tools to see how they “do things” to get a better overview.
Even if you have an idea of what you want and how, but then stumble upon something that is not yet published by the API: give us some feedback about this, and we’ll try to add it to the API. All the stuff we have now is not written in stone. The API can be extended just like any other part of Renoise can…
And if you’re not a developer or not interested in writing tools, then try to convince someone who is.
Awesome, that makes it clear. So people should first talk to the community, then read the docs, then ask a scripter or learn to do it themselves - that makes total sense.
Yes thanks, Ledger, I’ve been loving that! It fulfils a similar requirement and I’ll use it a lot, but single-column selection (without having to use adv.edit->content mask) is a huge workflow issue for me and one that I think would speed up others’ workflow if it became available. Anyway, I’ll not bang on about it here as I’ve done so elsewhere on the forum. I shall study the docs…
Is it possible to make a custom instrument/preset loader which loads multiple VSTs/DSPs, adds tracks and sets up audio routing?
Let’s take for example Aria Player which I use a lot.
Almost every time I load it I also:
-Add several aliases for different MIDI channels
(-Add new tracks)
-Route outputs to different tracks
-Add a MIDI control device to all added tracks and set them to work correctly with each MIDI channel
-Add a new send track
-Add a Send device to all added tracks (and set them to keep the source)
-Add some reverb to the send track (and set it to 100% wet)
Would be really really really handy if I could just select a preset containing all this.
Is this possible with scripting, and if not, which part is the problem?
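For what it’s worth, several of those steps look reachable from the Song API: inserting tracks and send tracks, loading a plugin into a new instrument, and dropping a native Send device onto a track. A rough sketch of a few of the steps - the plugin path is a placeholder you’d have to match against your own plugin list, and the Send device’s parameter layout is an assumption worth checking against the docs:

local song = renoise.song()

-- Load a plugin instrument into a new instrument slot.
-- The path below is a placeholder; it must match an entry in
-- instrument.plugin_properties.available_plugins.
local instrument = song:insert_instrument_at(#song.instruments + 1)
local plugin_path = "Audio/Generators/VST/AriaPlayer"  -- placeholder
if instrument.plugin_properties:load_plugin(plugin_path) then
  instrument.name = "Aria Player"
end

-- Add a new sequencer track (assuming a sequencer track is currently
-- selected) and a new send track (inserting past the master creates a send track).
local new_track = song:insert_track_at(song.selected_track_index)
new_track.name = "Aria out"
local send_track = song:insert_track_at(#song.tracks + 1)
send_track.name = "Aria verb"

-- Put a Send device on the new track (device slot 1 is the track's mixer device)
local send_device = new_track:insert_device_at("Audio/Effects/Native/#Send", 2)

-- The Send device's parameter names and order are an assumption here:
-- print them first, then set the receiver and "keep source" behaviour by hand.
for _, parameter in ipairs(send_device.parameters) do
  print(parameter.name, parameter.value)
end

-- Add a reverb on the send track (check the exact path in send_track.available_devices)
send_track:insert_device_at("Audio/Effects/Native/Reverb", 2)

The plugin output routing is the one piece I wouldn’t assume is scriptable without checking the docs first.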
I’d murder somebody’s grandma for this, sauli. A single, catch-all preset for each of my hardware synths, a bundle of MIDI-controlled Kaoss-style effects to throw on the master channel of any song in a live situation - even one I hadn’t prepared/anticipated playing, just one-click to instantiate the whole shebang. Damn.
The things you list are, I think, mostly possible with scripting (excluding the output routing and the VST loading - I don’t know about those), but I’d be surprised if the whole lot weren’t. Can a wise person advise either way for each of sauli’s points? If it’s feasible, I’ll step up my RenoiseLua baby steps and try to solve this problem… unless a faster and wiser person sees the value in it and wants to do it themselves.
sauli - let’s see what people say, then maybe break this out into a standalone Tool Request thread.
Not trying to be smart, as I can see the benefit of your request, but wouldn’t it make sense to simply create a ‘preset song.xrns’ set up with the routing/sends/effects the way you want them most of the time, and work from that?
Appreciated, but the point of this is that halfway through writing a song, when you make a compositional decision, you can get the setup it needs dropped straight into the project. And if you’ve saved 10 variations on that setup, you can avoid that situation we all sometimes get into where our default template holds us prisoner to a certain instrument layout, or whatever, until we change it (at which point the template’s ubiquity and usefulness expires!).
Beyond that, suppose I grab a bundle of songs from my desktop machine and throw them onto a laptop for a live show. My portable soundcard has different MIDI interfaces and I’m using different controllers etc., so to save having to go through and tweak all of those songs (a huge amount of clicking), I could just fire my ‘live’ setup presets in each song and I’m ready to go (and maybe this tool could include a series of options for triggering find/replace on instruments, routings and so on, so you can say “this is my ‘home’ config and this is my ‘live’ config”). Currently, I have to hunt through DSP in various different tracks changing line-in device inputs, then go through instrument options changing MIDI outputs, mess around with controller mappings, blah.
And then my suggestion for a quick’n’dirty, “eep, this live set is going to be hella boring unless I throw my preset bundle of cheesy, artless, charlatan Kaoss-Pad style XY-controlled effects on the master channel so I can slap the touchpad all evening as if I know what the hell I’m doing” config, while the least dignified, is probably the one that most people would find a use for. Again, the point being that they didn’t know when they started - when they loaded up their default template song - what they’d want to do by the end…when it was too late to do it easily and efficiently.
Yeah, I have two preset .xrns files I sometimes use, but like Syphus pointed out, it’s not like you always know where the process will lead you.
Actually, now that I think about this problem, some sort of intelligent .xrns merge script could work as well. Then I could have, for example, sampletank_basic.xrns, aria_full.xrns and kontakt_2_channels.xrns combined with the song I’m working on at the moment. That way, .xrns files could work as preset files.
Yes, although you must understand the GUI API is a bit limited right now - no canvas-like object for custom graphics, for example. But I am sure someone can hack something together anyway.
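To make that limitation concrete: the GUI API currently gives you stock widgets (text, buttons, sliders and so on) composed with a ViewBuilder and shown in a custom dialog, but no free-form drawing surface. A minimal sketch of that style of dialog:

-- Minimal custom dialog built from the stock widgets the API provides
local vb = renoise.ViewBuilder()

local dialog_content = vb:column {
  margin = 10,
  vb:text { text = "No canvas yet, but sliders and buttons work:" },
  vb:slider {
    min = 0, max = 100, value = 50,
    notifier = function(value)
      renoise.app():show_status(("Slider: %.1f"):format(value))
    end
  },
  vb:button {
    text = "Hello",
    notifier = function()
      renoise.app():show_status("Button pressed")
    end
  }
}

renoise.app():show_custom_dialog("GUI API sketch", dialog_content)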
That is frankly what the arpeggiator is more or less a good starting point for. Playing concepts in real time is currently not yet possible, though. It would be nice - then you could perhaps indeed do your own piano-roll timeline design.
I guess I’m right when I say that the API (or even the way Renoise handles plugins [and their GUIs]) isn’t able to get the last touched slider in a plugin GUI. It’s only possible to write automation to already declared (automated) parameters?
I’d like to make an auto-link script. I think it’s tedious to grab that Automation device and search through the parameters every time you want to automate a single knob/slider.
I’ve looked through the docs and I think it isn’t there yet.
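For the part that is possible today - writing automation to a parameter you can already name - this is roughly what the API allows; the “last touched plugin slider” itself is indeed not exposed as far as I can tell. The device and parameter indices below are placeholders:

-- Create (or reuse) an automation envelope for one device parameter
-- in the currently selected pattern track, then drop a couple of points.
local song = renoise.song()

-- Placeholder indices: device 2 = first effect after the track's mixer device.
-- Only parameters with .is_automatable == true can be automated.
local device = song.selected_track.devices[2]
local parameter = device.parameters[1]

local pattern_track = song.selected_pattern_track
local automation = pattern_track:find_automation(parameter)
if automation == nil then
  automation = pattern_track:create_automation(parameter)
end

-- Points use pattern line numbers and normalized 0..1 values
automation:add_point_at(1, 0.0)
automation:add_point_at(song.selected_pattern.number_of_lines, 1.0)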