Help: "Note Column Tracker" for MIDI Input and keyboard

Hi

I have been advancing with another small utility: “Note Column Tracker”. Basically, it writes each note into its corresponding note column, for all octaves, through a virtual piano driven by the mouse. It only needs a single track. Example:

7276 note_column_tracker.gif

Of course, in this situation, only one key is pressed at a time.

  1. My first doubt is: would it be possible to achieve the same with MIDI input? Instead of using the virtual piano with the mouse, use a MIDI keyboard, keeping the note-OFFs in mind as well.
  2. My second doubt is: would it be possible to achieve the same with a USB alphanumeric keyboard? Instead of using the virtual piano with the mouse, use the alphanumeric keyboard, again keeping the note-OFFs in mind.

Note: in both cases several keys can be pressed at the same time, and that can be a problem…

Any idea to achieve this?

Thanks!

EDIT: I attach the animated GIF inside a ZIP, because uploading the file directly replaces the animated GIF with a static image:

7277 note_column_tracker.zip (download to see the effect…)

  1. I’m pretty sure that in either case (entering notes via MIDI or via keyboard) the only option is to ‘quickly’ move the note after it has been entered. Such a scheme can be done with the help of line notifiers. The line notifier will report where something was entered, and you then delete that content and place it in the appropriate note column.
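As a minimal sketch of this “move after entry” idea (all function names here are hypothetical, not part of any existing tool; it assumes the track shows all 12 note columns and that Renoise recorded the new note into the first note column):

```lua
-- Hypothetical helper: called from a line notifier with the reported position.
-- Assumes track.visible_note_columns has been set to 12.
local function relocate_note(pos)
  local line = renoise.song().patterns[pos.pattern]
    .tracks[pos.track].lines[pos.line]
  local src = line.note_columns[1]
  -- Ignore empty columns and note-OFFs (note_value 120) for now
  if src.is_empty or src.note_value > 119 then return end
  local target = (src.note_value % 12) + 1  -- C=1, C#=2, ..., B=12
  if target ~= 1 then
    line.note_columns[target]:copy_from(src)  -- note, instrument, volume, ...
    src:clear()
    -- Caution: this edit fires the line notifier again, so the notifier
    -- itself needs a guard against re-entrancy.
  end
end
```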

  2. It seems to me this would require remapping the numeric keys at the OS level? I believe the numeric keys are already hardwired to selecting instruments in Renoise (?).

PS. Using line notifiers is one of the things us scripters have to be most responsible with :slight_smile: e.g. making sure it’s implemented in a way that minimizes performance bloat. A line notifier will trigger on any pattern-data change in a pattern, so things to consider when using it: 1) filter events at an early stage, disregarding “duplicate” or irrelevant ones; 2) maybe only hook up the line notifier when it’s really needed, e.g. when the GUI is activated and you’re in a specific track (?).
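A rough sketch of both points, attaching the notifier only while it is needed and filtering early (all names are illustrative assumptions, not an existing implementation):

```lua
-- Hypothetical attach/detach scheme for a line notifier.
local attached_pattern = nil
local busy = false  -- guard: our own pattern edits also trigger the notifier

local function on_line_changed(pos)
  if busy then return end
  -- Filter at an early stage: only react to changes in the selected track
  if pos.track ~= renoise.song().selected_track_index then return end
  busy = true
  -- ... move the freshly entered note to its target column here ...
  busy = false
end

local function attach()
  local pattern = renoise.song().selected_pattern
  if not pattern:has_line_notifier(on_line_changed) then
    pattern:add_line_notifier(on_line_changed)
    attached_pattern = pattern
  end
end

local function detach()
  if attached_pattern
      and attached_pattern:has_line_notifier(on_line_changed) then
    attached_pattern:remove_line_notifier(on_line_changed)
  end
  attached_pattern = nil
end
```

`attach()` could be called when the tool's dialog opens, and `detach()` when it closes, so the notifier never runs while the GUI is inactive.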

Thanks Joule!

A small clarification on point 2): I mean the complete USB keyboard, with its letters, not only the numbers. Maybe I did not explain myself well.

What I was thinking is that, in the Renoise USB keyboard configuration, each key is assigned to a note. So when a key on the USB keyboard is pressed, the “M” or “U” key for example, it is always “B-”, regardless of the octave. The note would then be written in note column 12. Something like:

  • If you press the “M” or “U” key, position the cursor in column 12 before writing, and then enter the note B-; and so on with all the letters. This looks a bit complicated to me, because the rest of the values should accompany the note too, and at first I do not know how it would turn out. With the virtual piano for the mouse I have done something similar, but of course only for one note at a time. I do not know whether the available API allows intercepting the keys of the USB keyboard. I have used this function to redirect:
function note_column_tracker()
  -- "vb.views['WMP_CTRL_NOT'].value" is a note number (range 0 to 119)
  -- Map each pitch class to its note column: C=1, C#=2, ..., B=12,
  -- regardless of octave
  return (vb.views['WMP_CTRL_NOT'].value % 12) + 1
end

-----

song.patterns[].tracks[].lines[].note_columns[note_column_tracker()] --"note_column_tracker()" returns a number (range 1 to 12)

I still do not have control over the “line notifiers” needed to carry this out. Somehow, this will also be related to illuminating each key on the virtual keyboard when a line contains a concrete note (illuminate a button when a note sounds in the pattern). Some time ago you told me that it is possible to do this, but with a little delay. It would be great to have an example with a single button associated with a particular note. Could you make an example tool that does this? Then I could multiply it for the 120 notes of my virtual piano for the mouse.

I am thinking over everything you have told me. I get the feeling that the API should provide other capabilities to do all this: functions that give us greater control over what will happen in the pattern, both before and after. In this case, instead of accumulating the notes in the first columns, as Renoise is programmed to do, we could assign each note to its note column directly, whether using buttons, a USB keyboard or a MIDI keyboard. So, instead of doing two things, I would do one.

The steps I have in mind, to see whether they are correct, and to see what you think:

  1. Create a checkbox to enable or disable the function that chases the notes. Enabled: sort the notes into note columns inside the same track. Disabled: Renoise works by default.
  2. Enable the checkbox.
  3. While live recording, regardless of the peripheral used (mouse, USB keyboard or MIDI keyboard), write any note in track 01.
  4. A notifier is then triggered to detect that a new note was written. Correct? My doubts are concentrated here: how to turn it on and off so that there is the least possible delay and, at the same time, the best performance. In theory, would this notifier work at any time?
  5. Then a function is needed that copies that note, with its associated values, into the corresponding note column, and then deletes the original note.
  6. And back to the same question: it would also be difficult to translate note-OFFs.

I understand that this function would be valid for all peripherals when entering notes. That is, it changes what Renoise has already written. If there were any way to change the writing position before the note is written during live recording, I think the note-OFF would be written directly where it belongs. That seems simpler at first. But I do not know whether the API allows conditional writing: if you press the “M” key (note B-), write the note in note column 12. This whole thing is very interesting.


By the way, I got the feeling that you were doing experiments with a possible piano roll in some of your GIFs. Are you developing something serious, or are these only tests? I read a comment from FFX recently on another forum saying that some internal Renoise things should be fixed rather than trying to optimize the available API for making tools. It is true that some things related to the GUI become very slow if the functions are a little heavy; for example, moving 30 tracks at a time within the index, or things like that. But I also realized that, depending on the function you use, it can work better. So maybe it is both.

A small clarification on point 2): I mean the complete USB keyboard, with its letters, not only the numbers. Maybe I did not explain myself well.

What I was thinking is that, in the Renoise USB keyboard configuration, each key is assigned to a note. So when a key on the USB keyboard is pressed, the “M” or “U” key for example, it is always “B-”, regardless of the octave. The note would then be written in note column 12. Something like:

  • If you press the “M” or “U” key, position the cursor in column 12 before writing, and then enter the note B-; and so on with all the letters. This looks a bit complicated to me, because the rest of the values should accompany the note too, and at first I do not know how it would turn out. With the virtual piano for the mouse I have done something similar, but of course only for one note at a time. I do not know whether the available API allows intercepting the keys of the USB keyboard. I have used this function to redirect:

The approach I suggested seems most feasible. Don’t try tracking what key the user presses, but instead let line notifiers handle it.

Keeping track of note-offs (MIDI recording) will require an additional voice scheme, cleverly caching some data to tie the note-off to the correct note (which has already been moved).

I still do not have control over the “line notifiers” needed to carry this out. Somehow, this will also be related to illuminating each key on the virtual keyboard when a line contains a concrete note (illuminate a button when a note sounds in the pattern). Some time ago you told me that it is possible to do this, but with a little delay. It would be great to have an example with a single button associated with a particular note. Could you make an example tool that does this? Then I could multiply it for the 120 notes of my virtual piano for the mouse.

It doesn’t work that way (probably). The implementation has to be made with several considerations.

I am thinking over everything you have told me. I get the feeling that the API should provide other capabilities to do all this: functions that give us greater control over what will happen in the pattern, both before and after. In this case, instead of accumulating the notes in the first columns, as Renoise is programmed to do, we could assign each note to its note column directly, whether using buttons, a USB keyboard or a MIDI keyboard. So, instead of doing two things, I would do one.

The steps I have in mind, to see whether they are correct, and to see what you think:

  • Create a checkbox to enable or disable the function that chases the notes. Enabled: sort the notes into note columns inside the same track. Disabled: Renoise works by default.
  • Enable the checkbox.
  • While live recording, regardless of the peripheral used (mouse, USB keyboard or MIDI keyboard), write any note in track 01.
  • A notifier is then triggered to detect that a new note was written. Correct? My doubts are concentrated here: how to turn it on and off so that there is the least possible delay and, at the same time, the best performance. In theory, would this notifier work at any time?

Yes. (I believe that line notifiers are ‘sequential’ and predictable. The only time you need to worry about things not happening in the correct order is when using idle notifiers or timers AFAIK.)

This approach would work for pattern data being entered in any way. And yes, you should definitely detach any line notifier when you don’t need it to trigger any action.

From your list, I think you’ve got the approach. Tracking note-OFFs can be done with some additional ‘voice handling’… (a note-off is written… check in a cache_table which note was last written there and where it was moved, then move the note-off accordingly. This relies heavily on the cache_table being updated properly.)
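One way to sketch that cache_table idea (all names here are hypothetical; it assumes the note-OFF appears in the same source column that the original note was recorded into):

```lua
-- Hypothetical voice cache: source column index -> destination column index.
local cache_table = {}

-- Call this whenever a note is moved out of the column Renoise wrote it to.
local function remember_move(src_column, dst_column)
  cache_table[src_column] = dst_column
end

-- Call this when a line notifier reports a note-OFF in a source column.
local function relocate_note_off(line, src_column)
  local dst = cache_table[src_column]
  local src = line.note_columns[src_column]
  if dst and src.note_string == "OFF" then
    line.note_columns[dst]:copy_from(src)
    src:clear()
    cache_table[src_column] = nil  -- this voice has ended
  end
end
```

The whole scheme stands or falls with `remember_move` being called for every relocated note, exactly as described above.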

  • Then a function is needed that copies that note, with its associated values, into the corresponding note column, and then deletes the original note.
  • And back to the same question: it would also be difficult to translate note-OFFs.
    I understand that this function would be valid for all peripherals when entering notes. That is, it changes what Renoise has already written. If there were any way to change the writing position before the note is written during live recording, I think the note-OFF would be written directly where it belongs. That seems simpler at first. But I do not know whether the API allows conditional writing: if you press the “M” key (note B-), write the note in note column 12. This whole thing is very interesting.

By the way, I got the feeling that you were doing experiments with a possible piano roll in some of your GIFs. Are you developing something serious, or are these only tests? I read a comment from FFX recently on another forum saying that some internal Renoise things should be fixed rather than trying to optimize the available API for making tools. It is true that some things related to the GUI become very slow if the functions are a little heavy; for example, moving 30 tracks at a time within the index, or things like that. But I also realized that, depending on the function you use, it can work better. So maybe it is both.

I only made the proofs-of-concept needed to verify that a decent piano roll can be scripted. It was mostly for fun, and I don’t see any big motivation for doing it. ViewBuilder also updates fast enough, as long as you keep from using too many or too big bitmaps :slight_smile:

Personally, I think Lua and the API are fast enough. It’s only natural to expect some overhead from a consistent and flexible API. I’ve heard it suggested that pattern data access could be faster, and that Renoise might have some extra internal overhead due to XML, but I don’t know if that’s true. Sample writing could probably be highly improved/optimized as well. What bothers me much more is some of the limitations in ViewBuilder, feature-wise :wink: Otherwise, I think we can theoretically do surprisingly much with the API - and fast enough.

Hi Joule. Finally I have a little time to talk about these things…

The approach I suggested seems most feasible. Don’t try tracking what key the user presses, but instead let line notifiers handle it.

I am currently considering the two ideas:

  1. Sort after writing. In this case I have already made a function that tries to sort the notes into different columns, but it is executed with a button, not through a notifier that reads directly from the pattern editor.
  2. Sort according to the pressed key. In this second case, I have already prepared my virtual piano for the mouse to sort into columns according to the keys (the notes). But it gives me a small problem with the OSC server and the speed (volume) when I record live (with the pattern playing). On the other hand, I have configured the USB keyboard keys (Z, S, X, D, C…) so that they can sort the written notes into the 12 note columns.

In the second case, the only problem I encounter is whether there is an identical equivalence between these three things: a button (triggered from the mouse), a key (triggered from the USB keyboard) and a key (triggered from the MIDI keyboard/MIDI pad). I am going to open another topic to address this issue in depth.

It doesn’t work that way (probably). The implementation has to be made with several considerations.

Is it very complicated to make an example with a single button that could then be multiplied for 120 notes? Could you help me here?

I only made the proofs-of-concept needed to verify that a decent piano roll can be scripted. It was mostly for fun, and I don’t see any big motivation for doing it. ViewBuilder also updates fast enough, as long as you keep from using too many or too big bitmaps :slight_smile:

Well, I would really like to see something that works. It would be very interesting, not only to use it but to see how the code works. A good motivation is the philosophy of finishing what you start. Sometimes it is complicated, or takes a lot of time, but a more or less complete piano roll would be a unique tool, downloaded by many people (provided it works well) and a relief for the many who complain about its absence. Actually, if it is possible to build a piano roll, the community ought to have one working. And no doubt it would be a source of inspiration for other tools.

Personally, I think Lua and the API are fast enough. It’s only natural to expect some overhead from a consistent and flexible API. I’ve heard it suggested that pattern data access could be faster, and that Renoise might have some extra internal overhead due to XML, but I don’t know if that’s true. Sample writing could probably be highly improved/optimized as well. What bothers me much more is some of the limitations in ViewBuilder, feature-wise :wink: Otherwise, I think we can theoretically do surprisingly much with the API - and fast enough.

For my part, I miss other things. For example, a simpler and equivalent handling of keys for the USB keyboard and the MIDI keyboard; specific code to read particular values from the pattern editor, in order to build functions around that information; a replacement for the OSC server to control the sound directly, and things like that…

Maybe I am still unaware of some things, but there are certain things that could be done more easily if the available API were better prepared. For example, to make a piano roll where the notes sound while composing, it would again be necessary to use the OSC server, and I do not think that is the ideal solution. In any case, there is still no piano roll tool.