Actually, yes. I started out with a number of steps that detailed the process of linking a parameter. I’ve got them on my work PC, so unfortunately I can’t upload them right now. But it involves drawing a line between two devices by dragging the mouse, and - in the case of a Hydra device being the source - asking which output to use.
Anyway, with this modular routing business, I thought it would be a good start to attempt to visualize what we’ve already got. There’s nothing “new” happening in those pictures.
Yes, I totally agree. I think this implementation is very good, and it is indeed a good start for this idea. However, you have now raised great curiosity, and more information on your idea is being furiously demanded.
As for drawing lines between devices, I’m assuming you mean in Mixer View, as you would not be able to do this across tracks through the DSP tab. Seeing how it is possible to have both the Mixer View and the DSP tab visible at the same time, this seems to me like a proper way of doing this.
Hope to see your examples soon.
Dang. Buzztracker lives again.
Fun program, though it never was very stable…
Interesting idea to give Renoise a Buzz-style “machine view”, but it’d have to be very, very well designed or this would be a trainwreck. Buzz had the benefit of being developed from the ground up with its workflow in mind…
The idea has merit and it’s interesting, but there are LOTS of implications for the workflow that currently exists in Renoise… I for one actually enjoy the “simplicity” of Renoise’s current single “tracker view”… I would probably just stick with 2.7 forever if 2.8 gave each track its own “machine”, each with its own pattern view… forget it. I hated that about Buzz, especially when it came to collaborating or picking up a song after letting it sit for a few months.
Danoise - maybe I’m just exhausted and not reading you right… I’ll probably feel like an idiot in the morning for asking, but I don’t completely understand why I would want to route the effects of my kick track to the gainer in my hi-hat track and the gainer in my bass track?
Renoise already has routable effects, no?
So what’s missing, and how could an innovative method be implemented to route signals from one generator to anoth… …man, this is sounding an awful lot like Buzz tracker. Why not just use Buzztracker?
It just happens to be Hunz’ demosong that I used for the purpose of illustrating the idea. But the principle is valid enough - it’s a basic sidechaining “compression” based on the Kick track. I use a similar technique in my own songs.
If you mean modular routing with audio signals, then the answer would be no. Effects in Renoise are processed in sequence, starting with the first track. Each device feeds into the next one, etc. And seen from this signal flow perspective, even send tracks are processed in this way (think of them as coming after the sequencer tracks but before the master track).
What we do have is modular routing with parameters. We can basically control any parameter from our meta devices (Key Tracker, Hydra, LFO). This is also why the Signal Follower is a bit special - it kind of sits between the two types of routing, allowing an audio signal to control a parameter.
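To make the distinction between the two routing types concrete, here is a minimal sketch. None of this is actual Renoise code; the class and method names are made up for illustration. Audio flows device to device in sequence, while a meta device writes a control value into some other device’s parameter on the side:

```python
class Device:
    """A regular DSP device: takes an audio buffer, passes it on."""
    def __init__(self, name):
        self.name = name
        self.params = {}              # parameter name -> current value

    def process(self, buffer):
        return buffer                 # placeholder for actual DSP


class MetaDevice(Device):
    """A meta device: produces a control value instead of audio."""
    def __init__(self, name):
        super().__init__(name)
        self.target = None            # (device, parameter name) or None

    def control_value(self):
        return 0.5                    # placeholder modulation value


def process_chain(devices, buffer):
    # Audio routing is strictly sequential: each device feeds the next.
    for dev in devices:
        # Parameter routing is the side channel: a meta device writes
        # its control value into another device's parameter.
        if isinstance(dev, MetaDevice) and dev.target is not None:
            target, param = dev.target
            target.params[param] = dev.control_value()
        buffer = dev.process(buffer)
    return buffer
```

In this picture, a Signal Follower would be a `MetaDevice` whose `control_value` is derived from the audio passing through it, which is exactly why it straddles both worlds.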
This means that we currently have these devices as possible modulation sources:
Key Tracker
LFO
Signal Follower
Velocity Tracker
XY Pad
They can then be assigned to control any parameter, with the exception of the Signal Follower, which can only control parameters “after itself” (it will not recognize any device that comes before it as a valid target).
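That “after itself” rule can be stated as a simple check over device positions in the chain. This is just an illustration of the constraint described above, with made-up device names:

```python
def valid_targets(chain, source):
    """Return the devices that a modulation source may control.

    Per the rule above, a Signal Follower only sees devices after its
    own position in the chain; the other meta devices (Key Tracker,
    LFO, Velocity Tracker, XY Pad, Hydra) can target any other device.
    """
    i = chain.index(source)
    if source == "Signal Follower":
        return chain[i + 1:]          # only devices coming after it
    return [d for d in chain if d != source]


chain = ["Gainer A", "Signal Follower", "Gainer B", "Filter"]
print(valid_targets(chain, "Signal Follower"))  # Gainer A is excluded
```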
I’m catching on. I agree it’s a very solid and desirable principle, but how to implement this within the current structure is sticky, since effects are currently applied to tracks…
So say, hypothetically, Renoise implements what is being suggested - what would happen when I render a song with each track as a separate wav?
You might not have noticed it, but the sketch was actually intended to illustrate what we’ve already got. I carefully avoided new features: Everything in that image is doable right now, with the SFW and the Hydra and the gainers connected to each other.
I think that Renoise’s routing features ATM are great, but it’s sometimes tricky to figure out where a given signal is going. To me, the challenge is how to (better) visualize such a thing.
Edit: still no images, but basically try to imagine this workflow:
We got ourselves a song with some drums playing and a bass.
We want the sidechaining effect
(1) You go into the mixer
(2) You turn on routing in the panel on the right side
(3) Every device that supports routing (the meta devices) will show a connection that you can drag your “cable” from
(4) When you drag the cable around, each device will automatically expand to show all of its parameters (with name and/or tooltip to guide you)
(5) Release the mouse, and the connection is established
The only exception is the Hydra, which would ask you which output to use
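Steps (1)–(5) above could be sketched roughly like this. Everything here is hypothetical (the names, and the output count I gave the Hydra) - it’s only meant to show where the extra “which output?” question fits into the drag-and-drop flow:

```python
class MetaSource:
    """A routable meta device shown in the mixer's routing panel."""
    def __init__(self, name, outputs=1):
        self.name = name
        self.outputs = outputs        # most sources have one output


def connect(source, target_device, target_param, chosen_output=None):
    """Establish a routing 'cable' from a meta device to a parameter.

    Mirrors the workflow steps: for a single-output source the
    connection is made as soon as the mouse is released; a Hydra has
    several outputs, so the user must first be asked which one to use.
    """
    if source.outputs > 1 and chosen_output is None:
        raise ValueError(f"{source.name}: which of its "
                         f"{source.outputs} outputs should be used?")
    output = 0 if chosen_output is None else chosen_output
    return (source.name, output, target_device, target_param)


lfo = MetaSource("LFO")
hydra = MetaSource("Hydra", outputs=9)   # output count is an assumption
print(connect(lfo, "Gainer", "Gain"))           # connects immediately
print(connect(hydra, "Filter", "Cutoff", 3))    # output chosen explicitly
```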
I’m with you there - you clarified that several times… but weren’t you posting the mock-up as a method to implement what the thread was intended to suggest? Allowing the audio signal of one effect (Buzz machine) to be routed into the input of another… I’m completely with ya on implementing some method to visualize modular parameter routing. Hands down, it’s a great idea.
Not so sold on Renoise adopting Buzz’s routing system, as I’ve already stated - each generator in Buzz had its own tracker. Now, if an instance of Renoise could be treated like a Buzz generator, and we had audio routing capabilities from one xrns to others while also allowing users to throw routable effects in the chain - THAT would be pretty wild… but it would prolly be about eight versions away.
I’m not sure it was covered in the thread, but a way to see where each send device goes would be nice too.
Or maybe some colors on the send device and a matching color on the send track?
Something to avoid having to remember where each track is routed in the case of complex routing - this, and what was talked about in the thread, would be really useful features.
I don’t completely know what’s going on either. I just tried to “freeze” the inception, if you will, in a pictorial format instead of words.
That idea is loosely based on maps and motorcycle dashboards, while a big part of the idea is perspective.
I chose motorcycle dashboards because all the fat is cut, plus riding speeds offer a different perspective.
I’ve never taken a formal art lesson, so my current collection of to-read texts is in need of some organization before I can better translate ideas into visuals.
I may take a closer look at mechanical and engineering drawing texts.
The idea is quickly expressed in four tracks, five if you count “parameter collection”, but I’ll get to that later.
One track to visualize only “meta flow”.
One track to visualize all DSPs used, which I have not detailed much because I also wanted to visualize where sources are placed. By sources I mean VSTs or samples.
One track to visualize only “signal flow”.
One track to visualize whether the signal is going to one send, multi-send, or master.
That makes four tracks. The fifth track, “parameter collection”, is supposed to visualize parameters that are in constant or favored use per section of a song;
perhaps this could also be visualized like tag clouds, bigger = used a lot.
I wanted to visualize this “parameter collection” like a dashboard, whereas the others are like maps.
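The “bigger = used a lot” idea is easy to prototype. Here is a sketch (the parameter names and counts are made up) that scales each label linearly between a minimum and maximum point size based on how often the parameter is used:

```python
def label_sizes(use_counts, min_pt=10, max_pt=32):
    """Map usage counts to font sizes, linearly between min_pt and max_pt."""
    lo, hi = min(use_counts.values()), max(use_counts.values())
    span = hi - lo or 1               # avoid division by zero
    return {name: round(min_pt + (n - lo) / span * (max_pt - min_pt))
            for name, n in use_counts.items()}


# Hypothetical per-section usage counts for a few parameters:
counts = {"Cutoff": 42, "Resonance": 17, "Gain": 5}
print(label_sizes(counts))            # most-used parameter gets max_pt
```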
One more notable mention that had some visual influence: American football plays, because they’re full of perspective and visual “routes”.
Football player and coach Tom Landry changed the game in a big way by utilizing his industrial engineering background to design next level plays.
The following video is Vince Lombardi teaching the Power Sweep, a classic football play; anyhow, the visual is similar to this routing business.
This is awesome. If a canvas were added, I would make this. Although I would make the meta cables connect to the exact parameter if it’s shown in the Mixer view (preferably). It could be an overlay for the current track. Oh, and I’d make the faders just like normal ones, but indeed show I/O at the same time (another wish expressed by many).
Cool work!!
Because it’s all gonna take a lot of time before something like this is implemented (no disrespect intended), I’ve made a little beta tool so one can at least route meta (CV) signals using the keyboard: link!
Here’s a different version. I don’t know if it works visually; I’ve not really evaluated it. I’m off to do some pattern-zooming mock-ups, and if I have time, I’ll test out the new tool.