Waveform (song?) display, multiple windows (views)

Hi API gurus,

I looked through the Renoise API docs and still didn’t find what I was looking for: a way to display a waveform in my own window on the Renoise screen. In fact, I’d like to either display two waveforms in one window or have two windows with a different waveform in each. I’d also like to use my own labels for the x and y axes rather than whatever Renoise uses by default, because our project uses its own terminology.

Any clues where I should start reading, and has anyone seen this done already?

Thanks very much!

James Goss

There’s no draw function in renoise.ViewBuilder(). You can load images, though; see the Renoise.ViewBuilder.API.lua file in renoise 3.0.1\resources\scripts\documentation.
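Roughly, showing an image in your own dialog looks like this (a minimal sketch; "waveform.bmp" is just a placeholder for an image your tool would generate or ship itself):

local vb = renoise.ViewBuilder()

local content = vb:column {
  margin = 10,
  -- your own axis labels can simply be text views placed around the image
  vb:text { text = "my amplitude unit vs. my time unit" },
  vb:bitmap {
    -- the path is relative to the tool's own folder; "waveform.bmp" is a placeholder
    bitmap = "waveform.bmp",
    mode = "plain"
  }
}

renoise.app():show_custom_dialog("My Waveform", content)

You could open two such dialogs, or put two bitmaps in one column, to get the two-window / two-waveform layout you described.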

But you’ll have to give more details about what you’re doing.

If you’re proficient enough, you could build the GUI in an external program, such as a web browser, and talk to Renoise over sockets; a few people have written bridges like that.
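A very rough sketch of the Renoise side of such a bridge, using the documented renoise.Socket API (the port number and messages here are arbitrary):

-- start a small TCP server inside the tool; an external GUI
-- (browser, Python app, ...) can connect and request data
local server, socket_error =
  renoise.Socket.create_server("localhost", 8008, renoise.Socket.PROTOCOL_TCP)

if socket_error then
  renoise.app():show_warning("Could not start server: " .. socket_error)
else
  server:run {
    socket_accepted = function(socket)
      print("client connected from " .. socket.peer_address)
    end,
    socket_message = function(socket, message)
      -- reply with whatever waveform data the external display asked for
      socket:send("you sent: " .. message)
    end
  }
end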

Do you mean you want a scope? There are plenty of scope VSTs…

from the introduction.lua in the same folder:

What's *NOT* possible with Renoise tools:
- Change Renoise's existing behaviour. Like, you can't make all C-4s in the
 pattern editor yellow instead of white. You can write your own pattern
 editor, but not change the existing one.
- Realtime access. Except for OSC and MIDI IO, you can't write scripts that
 run in the audio player. In other words, you can not script new realtime
 DSPs - yet. But you can, for example, write a tool that creates samples or
 manipulates existing samples. This limitation might change in the future.
 For now you can make a VST, AudioUnit, or LADSPA/DSSI plug-in.
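For what it’s worth, reading the raw sample frames from a tool is allowed; a minimal sketch using the Song API (this only reads data, it does not run in the audio player):

-- read the currently selected sample's frames; these floats (-1.0 .. 1.0)
-- are the data you would plot in your own display
local sample = renoise.song().selected_sample
local buffer = sample and sample.sample_buffer

if buffer and buffer.has_sample_data then
  local frames = buffer.number_of_frames
  local step = math.max(1, math.floor(frames / 512)) -- ~512 display points
  local points = {}
  for frame = 1, frames, step do
    points[#points + 1] = buffer:sample_data(1, frame) -- channel 1
  end
  print(("collected %d points from %d frames"):format(#points, frames))
end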

Thanks, GrandDaddy, that does help.

What I wanted to do is something like a scope, but if I use a scope VST I would be limited by its design, since I doubt I would find support for changing the axis labels in a VST.

I’ll have to read the docs you pointed to and see what I find. Maybe what I want to do counts as a real-time DSP, which the intro text says is not supported. I read that intro before I posted, but it wasn’t clear to me what it all covered.

I think what I meant, in API terms, is to use a view to contain a waveform of my own choosing, such as one just loaded from a file.

If you just need to load a static image in a window in Renoise when something non-real-time-audio-related happens, that is possible.

I wanted something changing in time, just like an audio waveform. Static wouldn’t do me much good.

Maybe this just is not possible using the API.

It is, but only in awkward ways; see the rough sketch at the end of this post.

What you should do instead, since you want a scope (you should have said that clearly from the start :p), is find an open-source VST or use something like Protoplug or ReaScript.
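If you still want to try the awkward in-Renoise route, the usual trick is to poll on a tool timer and swap the image shown by a bitmap view. A rough sketch: render_waveform_bmp() is hypothetical (you would have to write the image file yourself), and I’m assuming that reassigning the bitmap property makes the view reload the file.

local vb = renoise.ViewBuilder()
local scope_bitmap = vb:bitmap { bitmap = "scope_frame.bmp", mode = "plain" }

local function update_scope()
  -- hypothetical helper: regenerate "scope_frame.bmp" from your own data
  -- render_waveform_bmp("scope_frame.bmp")
  scope_bitmap.bitmap = "scope_frame.bmp" -- assumption: reassigning reloads the file
end

renoise.app():show_custom_dialog("Scope-ish",
  vb:column { margin = 10, scope_bitmap })

-- repaint about 10 times a second; this is GUI-thread polling,
-- not real-time audio access
renoise.tool():add_timer(update_scope, 100)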