The plugin can say which parameters are used and which aren't.
The plugin can dynamically add new parameters.
The plugin can expose multiple port configurations, including a repeatable sidechain input, which lets the host connect any number of inputs to the plugin and name them; this is useful for an analyzer, for example.
The plugin can inform the host that it does not need to process the next block, for example when all of its voices are off. This saves a lot of CPU on big projects where hundreds of plugin instances sit idle.
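To make that last point concrete, here is a minimal sketch with invented names (the actual CLAP declarations may differ):

    #include <stdbool.h>
    #include <stdint.h>

    /* Invented names below; only the idea comes from the list above. */
    struct my_plugin {
        uint32_t active_voices;
    };

    /* Asked by the host after each block: if this returns false, the host
     * may stop calling process() until a new event wakes the plugin up,
     * saving CPU across hundreds of idle instances. */
    static bool my_plugin_wants_next_block(const struct my_plugin *plugin)
    {
        return plugin->active_voices > 0;
    }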
The interface is very easy to understand and use; anyone can get started from the example synthesizer and host.
I hope that you’ll find it interesting and give it a chance. Thanks.
Looks like a complete alternative to the VST 2.x binary interface. So what will let it really stand out from something like LV2 (besides apparently not using cluttered, stitched-together interfaces based on markup languages, and not opening up unknown possibilities for unsupported expansion by anyone who'd like to do so…)?
I think this could be interesting, as LV2 seems like complexity overkill, and CLAP more like a slim old-school ABI. Is this spec some sort of "final draft", or will there still be changes and additions? Have you been in open discussion with coders and users regarding the details of the implementation and the possibilities offered by this interface? What about open-source friendliness? I think the original VST license is incompatible with some licenses per se; will this be friendlier towards e.g. strict GPL stuff?
Well, let's say things look good so far from a first loose reading. One thing that would interest me, for example, would be options for "dynamic" control ports at runtime: one could build a freely patched modular synthesis network in a GUI and expose selected parameters for automation, or a chosen number of input/output ports, but only those, during the session, without needing to reload the plugin to have all those ports registered, named, and given the right properties. Or have a mini scripting-language processor to build your own FX at runtime. Stuff like that.
OopsIFly, the whole CLAP interface, documentation, examples, and tools are distributed under the MIT license. So you're free to use it in any project (commercial, BSD, GPL, …).
It is not finished yet, but it is very close to the final design; if no one asks me for changes with valid arguments, then it may already be the final version. And yes, I have tried to discuss it with professional DSP and DAW coders.
Oh ok, I could’ve read more closely before asking shit.
As there's an extension system for providing additional functionality, are there already proposals for certain features available? The current documentation is a bit sparse: some functionality is listed as an extension, but there are no real docs on those yet.
I especially like the motivation of giving note pitches as frequencies, having per-note automation, and a quasi voice/note "alias by number" style; it will be very friendly for implementing tracker-style things like triggering the same note in the same instrument on multiple channels, individually controlled portamentos for different notes, and such. I guess there will be an extension available for emulating plain MIDI control? People will hesitate to port their existing plugins if it can't be done with a simple wrapper. Also, I found no dedicated work presented on bridging plugins or separating plugins into their own process space for safety, or on ways for inter-plugin communication. Another thing is event/parameter out-ports, for event processors (note filters, arpeggiators, …) or automation tools (i.e. the meta devices in Renoise). Stuff that has to be there nowadays.
Every feature which did not seem mandatory to me has been moved to an extension. So I think it is easier to get started, as the required core is very small, and you can discover step by step the extensions which fit your needs and implement them. Which feature seemed unclear or undocumented to you?
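For illustration, a rough sketch of that extension-lookup idea; the names here are invented, and the real entry point in the spec may differ:

    #include <string.h>

    /* Invented names: a plug-in exposing optional interfaces by string id. */
    struct my_gui_extension {
        void (*open)(void *plugin_data);
    };

    static struct my_gui_extension g_gui_ext; /* filled in elsewhere */

    /* The host asks for an extension; NULL means "not supported", so the
     * mandatory surface stays tiny and everything else is opt-in. */
    static const void *my_plugin_extension(const char *id)
    {
        if (!strcmp(id, "gui"))
            return &g_gui_ext;
        return NULL; /* unknown extension */
    }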
So for MIDI, there is already a MIDI event which gives you a raw MIDI buffer; you can parse it using the helper provided by CLAP or your own MIDI parser. Also, most MIDI events can be translated into CLAP data structures.
For now I'm waiting for some feedback on the spec before writing too much code. Once I get enough positive feedback, I may start to write a generic bridge and a VST adapter.
The plugin can already use host->events() to send events to the host, like note off, param set, etc. Maybe the event interface could be improved to better support multiplexing in some areas.
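For illustration, a sketch of pushing an event back to the host. Only the host->events() idea comes from the post above; every name and field below is invented:

    #include <stdint.h>

    enum my_event_type { MY_EVENT_NOTE_OFF, MY_EVENT_PARAM_SET };

    struct my_event {
        enum my_event_type type;
        uint32_t offset; /* sample offset within the current block */
        int32_t  key;    /* note key for MY_EVENT_NOTE_OFF */
        int32_t  param;  /* parameter index for MY_EVENT_PARAM_SET */
        double   value;  /* parameter value for MY_EVENT_PARAM_SET */
    };

    struct my_host {
        /* host-provided callback the plug-in uses to send events back */
        void (*events)(struct my_host *host,
                       const struct my_event *events, uint32_t count);
    };

    /* Example: a synth telling the host that a voice ended on its own. */
    static void notify_voice_ended(struct my_host *host, int32_t key,
                                   uint32_t offset)
    {
        struct my_event ev = { MY_EVENT_NOTE_OFF, offset, key, 0, 0.0 };
        host->events(host, &ev, 1);
    }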
I couldn't really find the part in the docs / sample code where notes are controlled or processed, so may I ask a question? Is the design also meant to support ramping the note frequency? Maybe even defining an LFO (type, phase, amplitude) for it on each processing call? Controlling notes from the DAW has been my main wish for plugins forever.
And why are keys used as note identifiers? Is that good? Why no opaque handle? I can't really think of any benefit to seeing MIDI keys anywhere in the basic concept of a plug-in interface.
Anyway, if I ever manage to do so, I want to support this.
Hi Mark2,
The design supports parameter automation (global and per note), so if your plugin has a pitch knob, you can automate it.
We only support parameter set and parameter ramp events. The ramp lets the host draw nice shapes without flooding the plugin with events; I think it is good enough for low-frequency modulation. But it is certainly not meant for FM synthesis.
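A sketch of how the ramp idea can work, with invented names: the host sends a start value plus a per-sample increment instead of one event per sample.

    #include <stdint.h>

    /* Invented names; only "set + ramp" comes from the post above. */
    struct my_param_ramp {
        uint32_t offset;    /* sample offset where the ramp begins */
        double   value;     /* parameter value at 'offset' */
        double   increment; /* added once per sample until the next event */
    };

    /* Render a block of gain, integrating the ramp sample by sample. */
    static void apply_gain_ramp(const struct my_param_ramp *ramp,
                                float *buf, uint32_t frames)
    {
        double gain = ramp->value;
        for (uint32_t i = ramp->offset; i < frames; ++i) {
            buf[i] *= (float)gain;
            gain += ramp->increment;
        }
    }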
We use a key as a note identifier because we can share the same interface between synthesizers which do not support tuning and those which do. Having a handle would make the interface more complex and subject to bugs. Using the key as an index is very efficient, as the synth can have an array of 128 pointers to voices and access them directly. It is a bit like Unix file descriptors, and it is safe because we do not share pointers.
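A minimal sketch of that lookup, with invented names:

    #include <stddef.h>

    #define MY_KEY_COUNT 128

    struct my_voice {
        double frequency; /* current pitch in Hz */
        int    playing;
    };

    struct my_synth {
        /* one slot per key; NULL while no voice is bound to that key */
        struct my_voice *voices[MY_KEY_COUNT];
    };

    /* O(1) and no shared pointers: the key in the event is the index,
     * much like a Unix file descriptor indexes the fd table. */
    static struct my_voice *my_voice_for_key(struct my_synth *synth, int key)
    {
        if (key < 0 || key >= MY_KEY_COUNT)
            return NULL;
        return synth->voices[key];
    }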
I try not to have a bad influence. But here, using the (7-bit) MIDI concept is the thing I'd suspect to be error-prone and hard to grasp (not really resembling the real world) and unnecessarily fast; so to speak, "data unhiding" and speed optimization at the wrong point. What will a drum computer with 200 drums do? Offer bank switching? I think it's hard to overcome MIDI here, but I personally would definitely forget about the MIDI key as soon as the host knows what it wants from the plug-in.
Like this pseudocode-like structure of some imaginary drum computer.
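Something in this direction, with invented names, just to illustrate named triggers instead of MIDI keys:

    /* All invented, only illustrating the idea: the plug-in publishes named
     * triggers, and the host addresses them by opaque id, not by MIDI key. */
    struct my_trigger {
        int         id;   /* opaque handle the host triggers with */
        const char *name; /* e.g. "kick", "snare", "tom 42" */
    };

    struct my_drum_computer {
        int                trigger_count; /* can be 200, no 7-bit ceiling */
        struct my_trigger *triggers;
    };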
It wouldn't stop the host from using it like a keyboard. I just think MIDI is the wrong abstraction for e.g. these two kinds of plug-ins. What the host wants is to trigger and control *something*, which *could* be notes.
I'm a bit skeptical about 200 drums. I think it is easier to work with many drum machine instances than with 200 drums in one instance. Also, if they're high-quality samples, they'll take a lot of memory even if you only use a few. And you often need to sidechain the kick, EQ the drums, put effects on them, so 200 is really a lot of work!!!
If it is a real use case and people can show that it is useful and necessary, we can still extend the interface by adding new events for triggering drum machine inputs.
Also, triggering a drum can be different from pressing a key on a keyboard, because the controller might want to describe its surface and where it was hit; so a new event type sounds good at some point in the future.
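Purely speculative, but such a future event could for example carry the hit position:

    #include <stdint.h>

    /* Nothing here is in the spec; a guess at a future drum-trigger event. */
    struct my_drum_hit_event {
        uint32_t offset;   /* sample offset within the block */
        int32_t  pad;      /* which drum input was hit (not a MIDI key) */
        double   velocity; /* hit strength, 0..1 */
        double   x, y;     /* normalized hit position on the pad surface */
    };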
Ah, OK. I see, you're building a more sophisticated structure than I initially thought. Good point about the surface and the hit point; impossible to represent that as a list. Yep.
All I can say is that what's really holding me back from making music sometimes is traditional concepts that nobody would invent like this from scratch today. Everything that ends at 15 or 127 and then grows all kinds of add-ons to overcome its initial limitations. Sometimes it's good to have such structures (matrices, more or less) in the GUI; it can help a lot. But just look at what a mess the Renoise patterns are. (sorry)