There is a way, a rather hackish way, but still a way (:
You can set up a MIDI loopback like I do in ScaleFinder; then you can basically send all sorts of MIDI messages to the feedback device via scripting and receive them as standard MIDI input within Renoise. It will introduce a bit of latency, but one can work with it imho.
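To sketch the loopback idea outside Renoise: the snippet below only builds the raw 3-byte MIDI note messages you would push through a virtual loopback port (loopMIDI on Windows, an IAC bus on macOS). Actually opening the port needs a library such as python-rtmidi, which is not shown here; the function names are made up for illustration.

```python
NOTE_ON = 0x90   # status nibble for note-on
NOTE_OFF = 0x80  # status nibble for note-off

def note_on(channel, note, velocity):
    """Build a note-on message: channel 0-15, note and velocity 0-127."""
    return bytes([NOTE_ON | channel, note, velocity])

def note_off(channel, note):
    """Build a note-off message (velocity 0)."""
    return bytes([NOTE_OFF | channel, note, 0])
```

Whatever you send into the loopback port with these bytes shows up in Renoise as input from an ordinary MIDI device, which is the whole trick.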
Good luck with your project, looks sweet
Looks great!! I’m happy to see your project becoming a reality
There is a better/simpler way than using a MIDI feedback device. It’s possible to trigger notes natively in Renoise using the special “realtime” OSC messages (of which two kinds exist: one that triggers/releases notes, and another that triggers raw MIDI messages). See GlobalOscActions.lua for more information about this.
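As an illustration (not anyone's actual code from this thread), here is a minimal Python sketch that hand-encodes the `/renoise/trigger/note_on` message from GlobalOscActions.lua and sends it over UDP. The host and port assume Renoise's default OSC server settings (127.0.0.1, UDP port 8000); adjust to match your Preferences.

```python
import socket
import struct

def osc_message(address, *ints):
    """Encode a minimal OSC message whose arguments are all int32."""
    def pad(b):
        # OSC strings are NUL-terminated, then padded to a 4-byte boundary
        return b + b"\x00" * (4 - len(b) % 4)
    data = pad(address.encode("ascii"))
    data += pad(("," + "i" * len(ints)).encode("ascii"))
    for value in ints:
        data += struct.pack(">i", value)  # big-endian int32
    return data

def trigger_note(note, velocity, host="127.0.0.1", port=8000):
    # instrument = -1 and track = -1 mean "currently selected" in Renoise
    packet = osc_message("/renoise/trigger/note_on", -1, -1, note, velocity)
    socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(packet, (host, port))

# Example: trigger_note(48, 100) plays C-4 on the selected instrument,
# provided the OSC server is enabled under Preferences > OSC.
```

Since this talks to the built-in server directly, no loopback device or extra MIDI software is involved; the note-off counterpart is `/renoise/trigger/note_off` with the same instrument/track/note arguments.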
So it’s perfectly possible to trigger notes in Renoise without having to depend on external software, but it does not solve the problem of precision, or lack thereof.
The reason is that everything that isn’t triggered by an event (such as the user pressing a button, or the device transmitting some message) happens at GUI rate (the idle state), which is something like 20 times per second, depending on CPU load, etc. So it’s possible to trigger a sound instantly when you press a button, but if you want to schedule a note to play in 0.5 seconds, it might end up playing after 0.55, or even 0.6, seconds.
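To make the jitter concrete, here is a toy model (plain Python, nothing Renoise-specific) of idle-rate scheduling, assuming roughly 20 idle ticks per second; the real rate varies with load, which is exactly the problem.

```python
import math

IDLE_HZ = 20           # assumed GUI/idle rate; in practice it fluctuates
TICK = 1.0 / IDLE_HZ   # one idle tick every 50 ms

def actual_fire_time(scheduled):
    """A scheduled event is only noticed at the first idle tick at or
    after `scheduled`, so worst-case extra latency is one full tick."""
    return math.ceil(scheduled / TICK) * TICK
```

For instance, a note scheduled for 0.51 s fires at about 0.55 s in this model. An event triggered directly by a button press doesn't suffer from this, because it is handled inside the event callback itself rather than waiting for the next idle pass.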
The only solutions I see, apart from the Renoise API being expanded with support for realtime messages, are:
- Some sort of clever hack using the Formula device (the Formula device has realtime capabilities, and can be made to trigger an event, e.g. on each line).
- Adding some sort of clock sync to the hardware itself, so it can trigger those events by itself.
Hi danoise, good tips, but with OSC, from what I understand, the Renoise OSC server needs to be enabled and an OSC client needs to be running.
I have looked at the Formula device and I can’t imagine how you would hack it to do that. Not saying it’s not possible, I just don’t know where you would start.
I understand if there are limits or if it’s impossible to accomplish. What are the chances of having the API expanded to allow Lua to play or trigger notes? It would allow for arpeggiator-type devices and GUI pad-triggering devices, like a virtual sample trigger pad, while still keeping the pattern editor free of any recorded changes.
Anybody else interested? Should we start a new post to request this? How many people would be on board?