I’ve been working on MIDI mapping my Maschine Mikro, and I realized that this is probably one of the only navigation functions that can’t be MIDI mapped. If it could be, I think you could do a TON of work just from your MIDI controller! I’ve got almost everything MIDI mapped for navigation except this one thing. You can move left to right by mapping under the Tracks and Columns tab, so why can’t you move up and down? Honestly, it just doesn’t make sense to me. Since you can move it with the QWERTY keyboard, it seems you should be able to MIDI map it. If I could, I could keep my keyboard away until I needed to enter effect commands. While I’m working, I like to use physical controls. The way I envision this is using an endless encoder for left to right, and another one (preferably notched) for up and down. Even though it’s such a simple mapping, I think it could add a lot of extra functionality.
Of course, I might be the only one who thinks this. And maybe I’m crazy! Maybe there’s a way to do it and I don’t even know it!
Most MIDI mappings in Renoise are implemented in plain Lua, in a file called GlobalMidiActions.lua.
You can modify this file to add your own mappings, too.
As an example, I added these mappings:
-- Navigation:Pattern Lines

add_action("Navigation:Pattern:Select Previous Line [Trigger]",
  function(message)
    if message:is_trigger() then
      local line_index = song().selected_line_index
      local patt_index = song().selected_pattern_index
      local patt = song().patterns[patt_index]
      if (line_index == 1) then
        -- wrap around to the last line of the pattern
        line_index = patt.number_of_lines
      else
        line_index = line_index - 1
      end
      song().selected_line_index = line_index
    end
  end)

add_action("Navigation:Pattern:Select Next Line [Trigger]",
  function(message)
    if message:is_trigger() then
      local line_index = song().selected_line_index
      local patt_index = song().selected_pattern_index
      local patt = song().patterns[patt_index]
      if (line_index == patt.number_of_lines) then
        -- wrap around to the first line of the pattern
        line_index = 1
      else
        line_index = line_index + 1
      end
      song().selected_line_index = line_index
    end
  end)

add_action("Navigation:Pattern:Current Line [Set]",
  function(message)
    local patt_index = song().selected_pattern_index
    local patt = song().patterns[patt_index]
    if message:is_abs_value() then
      -- absolute controller: jump directly to the given line
      song().selected_line_index = clamp_value(
        message.int_value, 1, patt.number_of_lines)
    elseif message:is_rel_value() then
      -- relative (endless) encoder: step up or down from the current line
      song().selected_line_index = clamp_value(
        song().selected_line_index + message.int_value,
        1, patt.number_of_lines)
    end
  end)
To add these mappings to Renoise, simply copy-paste this code into your copy of GlobalMidiActions.lua.
For detailed instructions, see the Google Code link above.
Thanks Danoise, that’s awesome! What are the limitations of this? Does this mean that, with a little scripting, everything can potentially be MIDI mapped in Renoise? I’ve got some ideas I want to try out. For example, I used to have a Launchpad, and there’s a Tool that allows you to use it as a step sequencer. I’d like to set up my Maschine controller to act as a step sequencer in Renoise. I was thinking I would create a series of templates in Maschine, each acting as a different point of reference for MIDI maps: one dedicated to live playing, one to general control and composing, and one as a step sequencer. The template editor in Maschine is pretty rad; you can set up a lot of different things.
Here’s a question: using this scripting stuff, could I MIDI map the start and end loop points in the Renoise sampler? If that were possible with an unnotched endless encoder, it would make editing loop points in samples super easy! It would be really neat to be able to completely MIDI map the sampler so I could make it more hands-on.
Pretty much, yeah. Everything that is exposed by the API, anyway. The limitation is that…
To make something like a step sequencer, you need it to light up buttons on the controller, which is a lot more involved than just listening and mapping to incoming messages. That is the reason we have something like Duplex, which already knows how to set a certain color on a wide range of controllers.
Of course, you could engineer a script that would allow you to talk to the Maschine. It’s just a question of how much work that would be in itself.
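To give an idea of what “talking to the Maschine” might look like, here is a minimal sketch that sends a note-on back to the controller to light a pad. This is an assumption-laden example, not working Maschine code: the device name and pad note number are placeholders, and whether a given controller lights pads from plain note-ons depends on the hardware. It uses the renoise.Midi output API, so it only runs inside Renoise’s scripting environment.

```lua
-- Hypothetical sketch: light a pad by sending MIDI back to the controller.
-- "Maschine Mikro Out" is a placeholder; list real device names with
-- renoise.Midi.available_output_devices().
local device_name = "Maschine Mikro Out"
local midi_out = renoise.Midi.create_output_device(device_name)

-- Many pad controllers light a pad when they receive a note-on for it.
-- Pad note numbers vary per controller; 36 is just an example.
midi_out:send({0x90, 36, 127})  -- note-on, note 36, full velocity

-- To turn the light off again, send a note-off (or note-on with velocity 0):
midi_out:send({0x80, 36, 0})

midi_out:release()
```

A full step sequencer would keep a table of pad states and resend these messages whenever the pattern or playback position changes, which is exactly the bookkeeping Duplex handles for you.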
This would not be hard to make. It could be done either with MIDI actions or by packaging it as a tool.
You won’t magically get automation recording for it, though.
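As a sketch of the loop-point idea, a relative mapping for the loop start could look like the following. This assumes it is added to GlobalMidiActions.lua, which provides add_action(), song(), and clamp_value(); the action name is invented for this example, while the loop_start and loop_end sample properties are part of the Renoise API.

```lua
-- Sketch: map a relative (endless) encoder to the selected sample's
-- loop start. Assumes this lives in GlobalMidiActions.lua. The action
-- name "Sample Editor:Loop:Loop Start [Set]" is made up for this example.
add_action("Sample Editor:Loop:Loop Start [Set]",
  function(message)
    local sample = song().selected_sample
    if sample and message:is_rel_value() then
      -- One frame per encoder tick would be very slow for long samples,
      -- so scale the step size as needed (100 frames here, as an example).
      local step = 100
      sample.loop_start = clamp_value(
        sample.loop_start + message.int_value * step,
        1, sample.loop_end)
    end
  end)
```

A matching action for loop_end would be the mirror image, clamping between loop_start and the sample’s frame count.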