Send tracks could be placed next to a group or even a single track, which would make organization much clearer. E.g. when I want to process only the wet delay signal of a single track.
Please allow send devices in instruments to target external send tracks, so we can split the output of an instrument. / Allow instruments to play on multiple tracks, so each sample within an instrument plays on its target output channel.
Recorded graphical automation should be able to land between pattern lines, at high resolution.
Automation and pattern commands shouldn’t be quantized! That makes no sense, at least to me…
Some features for tracker beginners, like column info (e.g. small vol/pan/dly labels on the bottom line of each track), more mainstream demo songs, etc.
Improved pattern readability: an extra color for empty pattern lines, and some kind of filled rectangle instead of “OFF” for note-offs.
Editing beyond pattern boundaries: so a selection can start at the end of pattern 1 and finish in pattern 2, even when “wrap around pattern edges” is off.
Edit beyond automation boundaries.
Optional (or default) auto-focus on mouseover, so “set keyboard focus here” would no longer be needed, at least for Ctrl-C, Ctrl-V, Ctrl-X.
Double-clicking a point should always delete it, regardless of the mode.
Line mode: line drawing should be abortable by pressing Esc.
In draw mode, if I select a point with the mouse and move it horizontally, no new points should be drawn; the point should just be moved. New points should only be created when I click in an empty area.
Bug: if I move/drag a point in line mode, line mode still wants to start drawing a line after I have finished positioning the point.
A/B buttons also for VST/AU generators, within the GUI window.
Instead of A/B, please offer A/B/C; a labeling function in the context menu would also be nice (shown on mouse-over, to help remember what the preset was for, e.g. in combination with other devices).
Mono compatibility for the width slider and also the surround slider of the Stereo Expander device.
Automation of sample start, sample loop start, and sample loop end.
Graphical EQ changes could be shown in the main spectrum pane; that would make the EQ much more precise and let it interact with the visible live output.
“Hold” value / default note-off length per instrument: a default note-off value for samples, VSTi, and MIDI. If an explicit note-off is set, it overrides this value for that note (just like in OctaMED).
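A minimal sketch of the rule being requested, assuming a simplified pattern column where each line holds a note name, an explicit "OFF", or nothing; the per-instrument default inserts an implicit note-off N lines after a note unless an explicit OFF (or a new note) arrives first. This is an illustrative model only, not how Renoise stores patterns.

```python
def apply_default_hold(column, default_hold):
    """Return the column with implicit note-offs inserted.

    column: list where each entry is a note name, "OFF", or None.
    default_hold: lines a note sustains before an implicit "OFF".
    An explicit "OFF" or new note within that window overrides the
    default, OctaMED-style. (Illustrative model, not Renoise's format.)
    """
    out = list(column)
    for i, cell in enumerate(column):
        if cell in (None, "OFF"):
            continue
        cut = i + default_hold
        window = column[i + 1:cut]
        # only insert the implicit OFF if nothing overrides it first
        if cut < len(out) and all(c is None for c in window) and out[cut] is None:
            out[cut] = "OFF"
    return out

print(apply_default_hold(["C-4", None, None, None, None, None], 2))
# ['C-4', None, 'OFF', None, None, None]
```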
The GUI on OSX is still somewhat slow in some situations. E.g., take the EatMe-Mode song: in mixer view I can clearly see the framerate drop to maybe 20-30 fps, and switching tabs also lags. It’s not a show stopper, but it could be better, in my opinion.
Audio starts to stutter very often under high CPU usage, at least on OSX. Compared to Logic or Cubase, fluent work in Renoise breaks down a lot earlier, even in situations where you wonder what is actually so CPU-consuming. Increasing the audio buffer size doesn’t seem to help much, or at all. That’s why I suspect something other than the CPU load of audio processing causes these heavy show stoppers: maybe the GUI code? The scripting code? Or the way plugins are reactivated after a silence?
A per-track freeze option would help here.
A better CPU usage display would also help, e.g. in the mixer, for all plugins and generators, and maybe a sum per track?
Real sidechaining, since it’s better to be able to use the sophisticated compressor algorithms of plugins than Renoise’s limited Signal Follower. This would require VST3 support in Renoise on Windows; Renoise on Mac already supports it via Audio Units.
@gova: Native support would be better. I’m not sure why more pattern FX commands aren’t being added; this one is indeed easy, and perfect for humanizing piano pieces further, since you could change the overall song tempo without adjusting every slowdown in the patterns. Anyway, neat trick with your tool.
Yeah, thanks from me too… but native support for this would be completely trivial to implement; it’s just so frustrating that the devs “forgot” about it, and then ignored every request for it to be added.
This feature took us a few minutes to add to Schism Tracker when I was contributing code to that project.
I think the issue is that in Renoise, every time the BPM/LPB/TPL changes, the song’s end is recalculated during rendering, along with the time resolution of everything (notice the intense slowdowns when tempo is automated and rendering), and supposedly Renoise has some issues with that.
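The timing math behind that guess, roughly: in standard tracker timing, one pattern line lasts 60 / (BPM × LPB) seconds, so any mid-song tempo change shifts the absolute time of every later line, and the song length can only be recovered by walking the whole timeline again. A minimal sketch (the representation of tempo changes here is an assumption for illustration, not Renoise's internals):

```python
def line_duration(bpm: float, lpb: int) -> float:
    """Seconds per pattern line at the given tempo (standard tracker timing)."""
    return 60.0 / (bpm * lpb)

def song_length(lines: int, tempo_changes: dict) -> float:
    """Total song length in seconds.

    tempo_changes maps a line index to the (bpm, lpb) taking effect
    there; line 0 must be present. Any mid-song change forces this
    full walk to be redone, which would explain why automated tempo
    is costly during rendering.
    """
    bpm, lpb = tempo_changes[0]
    total = 0.0
    for line in range(lines):
        if line in tempo_changes:
            bpm, lpb = tempo_changes[line]
        total += line_duration(bpm, lpb)
    return total

# 64 lines at 120 BPM / 4 LPB = 8 s; doubling the tempo halfway shortens it.
print(song_length(64, {0: (120.0, 4)}))                   # 8.0
print(song_length(64, {0: (120.0, 4), 32: (240.0, 4)}))   # 6.0
```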
REX2 support would be nice, or at least better import options, so I could bounce out a REX2 loop as individual samples, then import them and merge them back into a single sliced sample (with the slice markers where the files join). That would save a lot of fiddling with slice markers in the editor.
EDIT: oh, and being able to import the MIDI data from REX2 files would also be tight.
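The merge step described above can be sketched outside Renoise: concatenate the bounced WAV slices into one file and record the frame offset where each slice begins, which is exactly where the slice markers would go. This is a hypothetical helper using only Python's standard `wave` module; it is not a Renoise feature or API.

```python
import wave

def merge_slices(paths, out_path):
    """Concatenate WAV slices into one file.

    Returns the frame offsets where each slice starts, i.e. where
    slice markers would be placed in the merged sample. Assumes all
    inputs share the same channel count, sample width, and rate.
    """
    offsets, frames, params = [], [], None
    pos = 0
    for path in paths:
        with wave.open(path, "rb") as w:
            if params is None:
                params = w.getparams()
            offsets.append(pos)
            frames.append(w.readframes(w.getnframes()))
            pos += w.getnframes()
    with wave.open(out_path, "wb") as out:
        out.setparams(params)
        for chunk in frames:
            out.writeframes(chunk)
    return offsets
```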
If there were wide demand for such a feature, there would be more solid ground to implement it.
But since you are more or less stating that you personally had to implement this feature in Schism Tracker to have it, that gives the feature a personal character, meaning that outside of you and perhaps a few others, the majority is not thrilled about having the option.
So it may perhaps not have been forgotten; rather, the low demand doesn’t make it worth adding, and when we speak of effect commands, these have to be reserved carefully for features that require time-specific control adjustments.
I personally am more interested in a dynamic, user-definable command set (ZUXX). But for that, we would more or less also need to be able to write Lua tools that run in real-time mode.
This would at least allow such personal requests to be added, and it would also open up a whole lot more opportunities (and perhaps a can of worms).
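A conceptual sketch of such a user-definable command set: tools claim a ZUxx command number and register a handler, and the player dispatches any ZUxx value in the effect column to it. All names here are hypothetical; this is not the Renoise Lua API, just an illustration of the dispatch idea.

```python
from typing import Callable, Dict

class CommandRegistry:
    """Hypothetical registry for tool-defined ZUxx pattern commands."""

    def __init__(self) -> None:
        self._handlers: Dict[int, Callable[[int], None]] = {}

    def register(self, command: int, handler: Callable[[int], None]) -> None:
        """Claim one ZUxx command number for a tool-defined handler."""
        if command in self._handlers:
            raise ValueError("ZU%02X already taken" % command)
        self._handlers[command] = handler

    def dispatch(self, command: int, value: int) -> bool:
        """Called by the player when a ZUxx command appears on a line."""
        handler = self._handlers.get(command)
        if handler is None:
            return False  # unknown command: silently ignored, as trackers do
        handler(value)
        return True

registry = CommandRegistry()
log = []
registry.register(0x01, lambda v: log.append(("humanize", v)))
registry.dispatch(0x01, 0x40)   # tool-defined command fires its handler
registry.dispatch(0x7F, 0x00)   # unregistered command is ignored
```

The point of the registry is precisely the "can of worms" mentioned above: once commands are user-defined, songs become dependent on which tools are installed.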
There is a technical reason this command was skipped for 3.0. Basically, there is a lot of pre-computation that relies on the tempo, and hence code that would need to change in a lot of places.
I’m sorry that I can’t be more specific than this (I am not a core developer), but taktik actually did attempt to explain to me why such a feature was not as simple as it might seem. All I can say is that it was seriously considered, among a few other suggestions that arrived at a late stage.
Cool idea - but quite a large subject. Has anyone started a topic about this?
That’s not at all the case. Schism Tracker was in its early alpha stage at the time, and many features such as relative tempo changes were missing from it. As an open-source “remake” of Impulse Tracker, our aim was to have it do pretty much everything that Impulse Tracker could do. That included of course all pattern commands.