Ossia Score, a tool for scripting stuff via OSC

Just learned of this tool, Ossia Score:

Enables precise and flexible scripting of interactive scenarios.

Control and score any OSC-compliant software or hardware:

Max/MSP, PureData, openFrameworks, Processing…

https://ossia.io/

Wow, I hadn’t heard of this before either. Looks mightily impressive!!

Especially interested in the fact that they implemented (some kind of) OSCQuery - that, to me, is the most urgently needed piece in the whole OSC puzzle.

Looks like this could take away the pain of more complicated setups.

looks cool, i need to learn about osc. … any tips?

What a coincidence. I wrote a short book on it.

http://osc.justthebestparts.com/

Looks interesting… forgive my lack of knowledge (and perhaps imagination) but what are the possible use case scenarios with Renoise?

Would this need a Lua tool of some sort? Hardware controllers? I know little about OSC other than it’s a replacement for MIDI… I will be reading your book now, thanks!

Renoise has a built-in OSC server (the thing that listens for OSC messages) and a set of predefined message handlers (for things like start/stop transport).

http://tutorials.renoise.com/wiki/Open_Sound_Control

You can add custom OSC message handlers by adding code to your local copy of GlobalOscActions.lua.

You will need to know Lua, and the Renoise song API, for this. But there’s a lot of good example code to learn from.

With your own OSC handlers you can send messages to Renoise to have it mute/unmute tracks, jump to a pattern, set a loop of patterns, or play notes (or trigger samples): basically, whatever you can code with the Renoise API.
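For illustration, a custom handler might look something like the sketch below. The /my_tool/track_mute address is hypothetical, and the add_action / argument registration helpers are assumed to be the ones defined near the top of the stock GlobalOscActions.lua; check your local copy for the exact names before reusing this.

-- Hypothetical custom action: mute (1) or unmute (0) a track by index.
-- Assumes the add_action/argument helpers from the stock GlobalOscActions.lua;
-- like the built-in actions, it should then be reachable as /renoise/my_tool/track_mute.
add_action {
  pattern = "/my_tool/track_mute",
  description = "Mute (1) or unmute (0) the track with the given index",
  arguments = { argument("track_index", "number"), argument("muted", "number") },
  handler = function(track_index, muted)
    local track = renoise.song():track(track_index)
    if (muted ~= 0) then
      track:mute()
    else
      track:unmute()
    end
  end
}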

Interesting topic. From what I understand, if the OSC server is not running, no tool can control the playback of notes, and therefore it is not possible to listen to them. Yes, it is possible to edit the data in the pattern editor with specific functions, but internal recording does not work; that is apparently handled by:

---- Realtime Messages
 
/renoise/trigger/midi(message(u/int32/64))
/renoise/trigger/note_on(instr(int32/64), track(int32/64), note(int32/64), velocity(int32/64))
/renoise/trigger/note_off(instr(int32/64), track(int32/64), note(int32/64))

This is the only thing I see as really useful for using OSC in a tool. I find it hard to see a concrete scenario for wanting to control Renoise remotely, since you need to see the Renoise screen to know what is happening at all times. Maybe I need to learn more about the potential of OSC from a remote point of view.

Regarding the scripts for building tools: having specific code that defines the OSC client (protocol and port) and the handling of note_on and note_off is, I think, enough. I am using this code to define it:

-----------------------------------------------------------------------------------------------
-- OSC client (sends note messages to Renoise's built-in OSC server)
class "OscClient"

function OscClient:__init( osc_host, osc_port, protocol )
  self._connection = nil
  local client, socket_error = renoise.Socket.create_client( osc_host, osc_port, protocol )
  if ( socket_error ) then
    renoise.app():show_warning( "Warning: Failed to start the internal OSC client" )
    self._connection = nil
  else
    self._connection = client
  end
end

-- Trigger instrument-note
--- note_on (bool), true for note-on and false for note-off
--- instr (int), the Renoise instrument index 1-254
--- track (int), the Renoise track index
--- note (int), the desired pitch, 0-119
--- velocity (int), the desired velocity, 0-127
function OscClient:trigger_instrument( note_on, instr, track, note, velocity )
  if not self._connection then
    return false
  end
  local osc_vars = { }
  osc_vars[1] = { tag = "i", value = instr }
  osc_vars[2] = { tag = "i", value = track }
  osc_vars[3] = { tag = "i", value = note }

  local header = nil
  if ( note_on ) then
    header = "/renoise/trigger/note_on"
    osc_vars[4] = { tag = "i", value = velocity }
  else
    header = "/renoise/trigger/note_off"
  end
  self._connection:send( renoise.Osc.Message( header, osc_vars ) )
  return true
end

-- host, port and protocol must match the OSC server settings in Renoise's preferences
osc_client = OscClient( "127.0.0.1", 8000, renoise.Socket.PROTOCOL_UDP )
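For reference, a note could then be played and released from a tool roughly like this. It is just an illustrative sketch: the instrument/track/note/velocity values and the 500 ms hold are arbitrary, and stop_note is a made-up helper; only the renoise.tool() timer calls are standard API.

-- Play C-4 (note value 48) with instrument 1 on track 1 at velocity 100
osc_client:trigger_instrument( true, 1, 1, 48, 100 )

-- Release the same note 500 ms later using a one-shot tool timer
local function stop_note()
  osc_client:trigger_instrument( false, 1, 1, 48, 0 )
  renoise.tool():remove_timer( stop_note )  -- remove itself so it only fires once
end
renoise.tool():add_timer( stop_note, 500 )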

I do not know if it was Danoise, Joule or someone else who wrote this code, but it works perfectly for controlling the notes (on/off), and this allows us to make very interesting tools. The last one I have built is a pretty good step sequencer made of chained functions with timers, where steps can be selected or skipped. All this would not be possible without OSC. For the programmer, it is essential to be able to hear the notes in their tools.

I wonder if there is cleaner or simpler code for defining the OSC client to control the notes (on/off). Apart from all this, MIDI input can be defined through the properties of the objects (midi_mapping = “Tools:Name_Tool:operation_xxx…”). So you use 4 things:

  1. Lua code
  2. The Renoise API
  3. The OSC server (define a class for the OSC client and the handling of note_on/note_off, or more)
  4. MIDI input (through the properties of objects, midi_mapping = “…”; see the sketch after this list)
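As a sketch of point 4: a tool can register a MIDI mapping with renoise.tool():add_midi_mapping and reference the same name from a GUI control. The mapping name and the note values below are hypothetical; the message object should expose checks such as is_trigger() (see the MIDI mapping section of the API docs).

-- Hypothetical mapping name; the "Tools:" prefix groups it in Renoise's MIDI-map list (CTRL+M)
renoise.tool():add_midi_mapping {
  name = "Tools:My_Tool:Trigger Note",
  invoke = function(message)
    if message:is_trigger() then
      -- reuse the OSC client defined earlier to fire a (fixed, illustrative) note
      osc_client:trigger_instrument( true, 1, 1, 48, 100 )
    end
  end
}

The same "Tools:My_Tool:Trigger Note" string can then be set as the midi_mapping property of a ViewBuilder control, so the widget itself becomes MIDI-mappable.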

One thing I would like to point out is the live recording of parameters (note_on, note_off, delay and velocity) through the OSC server. It is something that can go unnoticed; you only see it in action when you have it set up in your tool. It will not record if the instrument editor is detached into the floating window. The pattern editor has to be selected, and this is a problem when building tools of this type. You cannot use a tool to record live via the OSC server if you have the instrument editor in the floating window.

I point all this out because, when using OSC in a similar way from another program, these problems may appear as well.

Another question is whether, when using the OSC server, it is correct to use the properties of the objects (midi_mapping = “…”), or whether another approach must be used. Apparently, OSC has this in real time:

---- Realtime Messages 
/renoise/trigger/midi(message(u/int32/64))

So, are there two ways to program MIDI input for scripting tools? I deduce that with midi_mapping you must define your own function, while with …/trigger/midi(number) the handler is already defined in the file GlobalOscActions.lua?

…/trigger/midi(number) accepts a number, but what number? I have never used it, so I do not know exactly how to use it yet.

What a coincidence. I wrote a short book on it.

http://osc.justthebestparts.com/

awesome! will buy today

I do not know if it was Danoise, Joule or someone else who wrote this code, but it works perfectly for controlling the notes (on/off).

Looks like it has been lifted from the Duplex code? That functionality has since been moved into xLib.

Btw: once you go deeper with OSC note triggering, you might also have to come up with a voice handler (a minimal sketch follows the list below).
I don’t know about your tool, but I’ve implemented steps to avoid stuck notes in the following scenarios:

  • When user shifts an octave while playing a note (released note needs to match the old octave offset)
  • When user changes instrument or track while playing a note (released note needs to match the old instr/track)
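Not the xLib implementation, just a minimal sketch of the bookkeeping idea, assuming an OSC client like the OscClient posted above: remember what was actually sent at note-on, and release exactly that, regardless of what is selected when the key is released.

-- key_id identifies the physical key/pad/button that was pressed
local active_voices = {}  -- key_id -> { instr, track, note }

local function voice_note_on(key_id, instr, track, note, velocity)
  osc_client:trigger_instrument(true, instr, track, note, velocity)
  active_voices[key_id] = { instr = instr, track = track, note = note }
end

local function voice_note_off(key_id)
  local v = active_voices[key_id]
  if v then
    -- release with the stored instrument/track/note, not the currently selected ones
    osc_client:trigger_instrument(false, v.instr, v.track, v.note, 0)
    active_voices[key_id] = nil
  end
end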

As for why there are two flavours of OSC note triggers - I can tell you the following:

Triggering via OSC over MIDI sends a “raw” midi message. Works just like regular MIDI input (using the currently selected track, instrument etc.).
You can also use this to map MIDI commands to actions in Renoise (CTRL+M) - both note and CC messages.

Triggering via OSC sends an “internal note”, which is routed to a specific track or instrument.

In short: it’s good to know both :grin:

Also: don’t waste your time trying to get tight timing out of a tool which programmatically triggers notes in realtime via Lua.
The Renoise API can receive live input from a controller in near-realtime, so it’s suitable for input from a keyboard, turning knobs, etc.

But if you want to write a sequencer that plays notes “by itself”, don’t do it this way:
reliable timing is only possible by writing the notes into a pattern and then playing them back.
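A minimal sketch of that approach: write the note into the currently selected pattern via the song API instead of triggering it live (all indices and values here are illustrative).

local song = renoise.song()
local patt = song:pattern(song.selected_pattern_index)
local line = patt:track(song.selected_track_index):line(1)  -- first line of the track
local col = line:note_column(1)

col.note_string = "C-4"     -- or col.note_value = 48
col.instrument_value = 0    -- instrument 00
col.volume_value = 100

-- playback of the pattern is then handled by the audio engine, so timing is reliable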

Thanks! I have created a tool that can basically be described as a multiple note trigger, able to play notes at the same time on different instruments and different tracks, all through OSC. Based on this, I have also built a step sequencer, which is basically chained functions with timers, also based on OSC. It is therefore possible to trigger notes of different instruments on different tracks to create rhythms as a preview, without editing anything in the pattern editor yet. This lets you test sounds quickly, see how they fit and find sound textures, although the timing is not extraordinarily precise. From the beginning, I was looking for something that would allow me to experiment with the sounds of various instruments without editing anything in the pattern editor: pre-listening to the sound, with results.

To use all this, it seems that reasonably powerful hardware is needed to ensure that all timers set to the same interval in milliseconds actually take the same time. It’s the first time I’ve built a real step sequencer and the result is quite amazing, because it allows you to modify the behavior while it is playing…
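For comparison, the timer-driven approach described above could be reduced to something like this sketch (step values, interval and instrument/track numbers are made up, and it assumes the OscClient posted earlier). As Danoise notes, a tool timer is not sample-accurate, which matches the drift observed on slower machines; note-offs are left out for brevity.

-- Tiny timer-driven step sequencer (illustrative only)
local steps = { 48, 51, 55, 48, false, 55, 51, false }  -- note values, false = rest
local step_ms = 125                                     -- 16th notes at 120 BPM
local current = 0

local function sequencer_tick()
  current = (current % #steps) + 1
  local note = steps[current]
  if note then
    osc_client:trigger_instrument(true, 1, 1, note, 100)
  end
end

function start_sequencer()
  if not renoise.tool():has_timer(sequencer_tick) then
    renoise.tool():add_timer(sequencer_tick, step_ms)
  end
end

function stop_sequencer()
  if renoise.tool():has_timer(sequencer_tick) then
    renoise.tool():remove_timer(sequencer_tick)
  end
end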

Regarding playing and stopping notes through OSC, I have also had to deal with several scenarios to prevent notes from getting stuck, so that the tool behaves consistently when handled by the user. I have been able to build several trigger modes, compatible with a mouse click (press and release), the alphanumeric keyboard (two presses, to start and stop) or the MIDI input, which can function like the mouse or like the alphanumeric keyboard. I intend to share this tool and discuss some details later.

So far, there are only 2 serious things that bother me:

  1. OSC does not seem to record the notes in the pattern editor when the instrument editor is in the floating window. If you touch the tool’s window with the mouse, for example, it will be impossible to record the parameters; it only works if the pattern editor is selected at all times. I do not understand why this happens. Edit mode on/off already exists to avoid recording into the pattern editor, and OSC does not seem to be usable for recording notes in the phrase editor either. Maybe it should work the same (record the notes in the pattern editor) even when the instrument editor is in the floating window.
  2. For some reason, the VSTi windows do not work well either. This issue is already discussed here.

If these two things were solved it would be much more comfortable to work with tools of this style. Perhaps these details are useful to know when using OSC from an external program.

I have to do some more tests, but the step sequencer I’ve built seems to work quite well and precisely on a PC with an i7 CPU. However, if I use a 15" laptop with an i3, it feels as if timers set to the same interval do not take the same amount of time. In theory, I believe the Lua code used is correct and well designed. Can the OSC signal pick up some delay depending on the local network? What route does it take? The scenario would be a PC with Renoise installed and the tool configured to use OSC with the UDP protocol and port 8000. I mean, does this signal leave the PC, go through some network cable and then come back, or does the whole signal stay inside the PC? Forgive my ignorance, but I still do not understand how OSC works internally.

Thanks Neuro for the information! I dabble in a lot of audio oriented “stuff” so I just added Ossia Score to that list. I think I’ll be buying your book as well. :slight_smile:

Cheers.

awesome! will buy today

Thanks! Hit me up if you have any questions.

Thanks Neuro for the information! I dabble in a lot of audio oriented “stuff” so I just added Ossia Score to that list. I think I’ll be buying your book as well. :slight_smile:

Cheers.

Sweet. Let me know if you have questions.

Hey people, ossia score dev here. So glad to see people finding out about it :stuck_out_tongue:

I just wanted to drop by to say that if you have any questions, a bunch of us are pretty much always available on the chat: https://gitter.im/OSSIA/score

I would be more than happy to think about nice ways to integrate Renoise and score / libossia / oscquery. If Renoise already supports OSC, it can be a simple matter of making a JSON preset that knows the “standard” Renoise addresses to send messages to.