I think it would be very cool if Renoise had a native visualizer that could be coded in Lua or something else.
Without getting too lengthy, it could maybe work via an OpenGL script, Pure Data GEM, a Winamp-style video plug-in, or something similar to the scripts used for TouchDesigner or Wallpaper Engine.
Not a shameless self promo, but I use the stock Renoise spectrogram for creating visuals often. This can be seen on my Instagram, @.vifyrv.
Does anybody else have any thoughts on this? I'm curious whether anybody would be interested in or excited about seeing something like this become available in the future.
Weird question: is there anything wrong with piping Renoise through something like Processing for audio-reactive visuals? I'm only asking because it seems like a very small step away from essentially using GLSL without paying another dime or doing much else.
The FL Studio demo has a built-in GLSL compiler as well and will even export without paying a buck, if you don't mind dragging your Renoise track over to that DAW and learning a relatively easy scripting language (not too far from Lua, really, just a different syntax). People have also built scripts that convert sketches from Shadertoy, which is kind of hilarious and a decent way to get started with doing it yourself.
(Also, I should mention that the ZGameEditor Visualizer that FL uses allows full saving of your GLSL scripts anyway, so you really don't have to pay a penny to keep your code safe from impending doom.)
lol, I doubt this will ever become native in Renoise, but you could try making a tool for it: take a programmable OpenGL visual server (or whatever), then build an interface to it as a Renoise tool. From the tool you can try to stream the audio into the server and read automation, note, and pattern command data. You could also make a VST effect, hook/stream it into the visual server, and pipe the audio and parameter automation from Renoise tracks (kind of like a sidechain send) into the server for analysis and control. A rough sketch of the tool side is below.
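Just to make that concrete, here's a minimal, untested sketch of the Renoise-tool side. The visual server, its address/port (localhost:9000), and the "note …" message format are all made up for illustration; the API calls (renoise.tool(), renoise.song(), renoise.Socket, renoise.app()) are from the scripting docs, but double-check signatures against your Renoise version.

```lua
-- Sketch: forward the currently playing line's note data to an external
-- visual server over UDP. Host, port, and message format are hypothetical.
local HOST, PORT = "localhost", 9000

local client, err = renoise.Socket.create_client(
  HOST, PORT, renoise.Socket.PROTOCOL_UDP)
if err then
  renoise.app():show_warning("Could not reach visual server: " .. err)
  return
end

local last_line = -1

-- Poll on idle; for tighter timing you'd want something better than this.
renoise.tool().app_idle_observable:add_notifier(function()
  local song = renoise.song()
  if not song.transport.playing then return end

  local pos = song.transport.playback_pos
  if pos.line == last_line then return end
  last_line = pos.line

  -- Look up the pattern currently playing and read the selected track's line
  local pattern_index = song.sequencer.pattern_sequence[pos.sequence]
  local line = song.patterns[pattern_index]
    .tracks[song.selected_track_index].lines[pos.line]

  for _, col in ipairs(line.note_columns) do
    if not col.is_empty then
      -- e.g. "note 48 64": note value and volume for the visualizer to react to
      client:send(("note %d %d"):format(col.note_value, col.volume_value))
    end
  end
end)
```

In practice you'd probably settle on OSC instead of a homemade text protocol, since most visual tools (Processing, TouchDesigner, etc.) already speak it.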
I don't know if there are already (open source) projects doing this kind of thing, i.e. allowing straightforward scripting of OpenGL effects. If there are, you might be able to save some work. If not, you'd need to take a game engine or whatever and build your own scriptable server from scratch.
If such a project gets really good and stable, people would use it for their work: live sets, online videos, or art installations. Also consider how powerful/flexible you would want it to be. Music-player viz effects are often rather generic, but once you add the ability to code from building-block effects and custom image/3D model data, you get a tool with real power to script audio visualizations that accompany your Renoise songs.
IDK, find some demo coders and start going? It's a straightforward idea; the subtle part is how much power you let users have when scripting their visuals. Either you make it easy but it will lack power, or you make it powerful but then it will be too complicated for newbies. Trying to serve both at once can make it twice as powerful, but it also needs a dev with at least twice the brains, who doesn't care about money, to design that beast of a software.
Noticed that it is possible to change themes during playback… Going to assume some API tool, or a new pattern command, could be leveraged to change themes and create an interesting visual effect.
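A minimal sketch of that idea as a tool, assuming a list of .xrnc theme files you supply yourself; renoise.app():load_theme() is in the scripting API, but treat the paths and the idle-based polling as illustrative only.

```lua
-- Sketch: cycle through color themes while the song is playing.
-- THEME_FILES is a made-up list; point it at your own .xrnc files.
local THEME_FILES = {
  "/path/to/themes/one.xrnc",
  "/path/to/themes/two.xrnc",
  "/path/to/themes/three.xrnc",
}

local current = 0
local last_sequence = -1

renoise.tool().app_idle_observable:add_notifier(function()
  local song = renoise.song()
  if not song.transport.playing then return end

  -- Switch theme whenever playback moves to a new position in the sequence
  local seq_pos = song.transport.playback_pos.sequence
  if seq_pos ~= last_sequence then
    last_sequence = seq_pos
    current = (current % #THEME_FILES) + 1
    renoise.app():load_theme(THEME_FILES[current])
  end
end)
```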