Music visualizer for YouTube

I see a lot of music posted on YouTube with some kind of photo/effects/titles (with sometimes good, sometimes bad results).

But I am looking for a simple one, without too many effects. Could anyone recommend some software or web apps, free or not?

(except iMovie, which is not bad, but does not do music visualization)

Prettyscope can be as simple or as complex as you like. It’s a VST. Plug it in and it’ll bounce to your music right away. It draws Lissajous-style patterns. You can trigger it with Renoise LFOs, etc., to modulate parameters, and you can edit the .XML file to set up parameters that don’t have values. Pretty neat!

I find it works better in Reaper. Prettyscope sometimes causes Renoise to crash on my end; I don’t know why yet. It doesn’t crash until I close the window, so as long as you leave the VST window open, you’re fine. I use a screen recording app to capture the end result.

I use Unity3D… A little FFT coding action, sometimes some OSC/MIDI routing, and the world is open to whatever you can imagine =)

I didn’t think about a VST! That looks interesting, I’ll check out Prettyscope. In fact I was thinking more of a simple visualization with a background picture + audio spectrum.

:open_mouth: Unity3D! Sure, that could be quite nice, but it looks complex for 3D newbies like me on this kind of stuff!

Depends on what you’re doing. You could use iMovie and place a stationary image behind the video you record from Prettyscope, or another movie clip, and adjust their opacity to get a blend of both. Would be pretty neat!

I did this track with Prettyscope. I created the track in Renoise and recorded the video with QuickTime/Soundflower, with Prettyscope placed at the end of the master channel chain. Extra coloring/filters were done in Apple Motion, but they’re just light effects, not changing the original material of the video too much. The pauses came from automating Prettyscope via LFOs:

Yeah, the coding/3D side is a little overkill for a lot of people. Mind you, I started the visual side using applications such as Quartz Composer on Mac and VVVV on PC. Quartz was deprecated in favor of Vuo, which is no longer free, so I moved on to Unity3D.

If you need to work with the audio spectrum, Unity is much easier in a non-realtime environment. However, if you can get away with controlling your objects and scenes via MIDI or OSC output from your DAW, you can do some very interesting things.
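For anyone wondering what “OSC output from your DAW” looks like on the wire: an OSC 1.0 message is just a null-padded address, a type tag string, and big-endian arguments. A minimal hand-rolled sketch in Go (the `/vis/bass` address is invented for illustration; in practice you’d use an OSC library and send the bytes over UDP to wherever Unity is listening):

```go
package main

import (
	"encoding/binary"
	"fmt"
	"math"
)

// pad extends b with zero bytes to a multiple of 4, as OSC 1.0 requires
// for both the address and the type tag string.
func pad(b []byte) []byte {
	for len(b)%4 != 0 {
		b = append(b, 0)
	}
	return b
}

// oscMessage encodes a minimal OSC message carrying one float32
// argument, e.g. a band level you want to stream to a visualizer.
func oscMessage(address string, value float32) []byte {
	buf := pad(append([]byte(address), 0)) // null-terminated, 4-byte aligned
	buf = append(buf, pad([]byte{',', 'f', 0})...) // type tag: one float
	val := make([]byte, 4)
	binary.BigEndian.PutUint32(val, math.Float32bits(value))
	return append(buf, val...)
}

func main() {
	msg := oscMessage("/vis/bass", 0.75)
	fmt.Printf("% x\n", msg)
	// Send msg with net.DialUDP("udp", ...) to your Unity OSC receiver.
}
```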

Here is an example of one of my Unity projects (I’m currently working on a complete rewrite of this for the Universal Pipeline):


Sonic Candle is a pretty simple spectrum generator, which is quite neat.


I tried my hand at some super simple music visualizations in golang a while ago, but what really bothers me is that I’m writing a million images to the HDD (or worse, an SSD), just to turn them into a video with ffmpeg and then throw them away. That’s the main reason I can’t really get “into” it; I just couldn’t find a simple package for creating a video directly from golang.

Here are some classic Amiga modules, with one color per instrument:



Ah, simple and effective! That could work! I’ll try this to begin with. :slight_smile:

Not bad at all, the rendering fits well with tracker music! Oh, the Fremen theme! I soooo like Dune’s stuff, even today!

Quite beautiful!

Smart result, simple and it really fits the track.


So I am experimenting with hooking my slideshow up to my audio player… but I have a long way to go before this is anywhere near good; I’m totally fumbling around.

Next I want to keep a second copy of the image converted to HSL, so I can “smear” that around instead of just the RGB, or use it to smear the RGB, etc. The audio player already keeps historical FFT data in a texture, so using that should be easy, as should using videos or webcam input instead of photos. The hardest thing will be making it look good yet varied!
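The RGB-to-HSL step mentioned above is simple enough to do by hand per pixel. A sketch of the forward direction in Go (channels in 0..1, hue in degrees; this is the standard conversion, not the poster’s shader code):

```go
package main

import (
	"fmt"
	"math"
)

// rgbToHSL converts RGB channels in [0,1] to hue (degrees, 0..360),
// saturation and lightness (both 0..1). Smearing the H channel instead
// of raw RGB shifts colors around the wheel rather than toward gray.
func rgbToHSL(r, g, b float64) (h, s, l float64) {
	max := math.Max(r, math.Max(g, b))
	min := math.Min(r, math.Min(g, b))
	l = (max + min) / 2
	if max == min {
		return 0, 0, l // achromatic: no hue or saturation
	}
	d := max - min
	if l > 0.5 {
		s = d / (2 - max - min)
	} else {
		s = d / (max + min)
	}
	switch max {
	case r:
		h = (g - b) / d
	case g:
		h = (b-r)/d + 2
	default:
		h = (r-g)/d + 4
	}
	h *= 60
	if h < 0 {
		h += 360
	}
	return h, s, l
}

func main() {
	h, s, l := rgbToHSL(0.2, 0.6, 0.9)
	fmt.Printf("h=%.1f s=%.2f l=%.2f\n", h, s, l) // ≈ h=205.7 s=0.78 l=0.55
}
```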

Ideally I would want to analyze an audio file ahead of time, set the visualization parameters based on that, and then use the audio in turn to bounce some of those parameters around (which ones, and in what way, would be determined by the “fingerprint” of the track). But that sounds more like work than playing around :confused: