Fragment: Collaborative Spectral Synthesizer

Hello everyone,

I want to present an experiment of mine: Fragment, a free, collaborative spectral musical instrument.

Here is a playlist with some videos of Fragment; in most of them Fragment acts as a virtual instrument for Renoise:

https://www.youtube.com/playlist?list=PLYhyS2OKJmqe_PEimydWZN1KbvCzkjgeI

This is first and foremost a “live-coding” additive/spectral synthesizer which links visuals and audio through direct manipulation of the spectrum. The visuals are generated by a GLSL script (your GPU produces the visuals) which is shared between users on a per-session basis. The visuals represent a kind of possibility space from which you choose what to hear by “slicing” it: the spectrum slices are fed to a pure additive synthesis engine in real time. I believe some weird sounds can be made easily with this synthesizer; complex sounds are possible too, but they may require more knowledge.
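
To give an idea of what the GLSL script looks like, here is a minimal sketch in the spirit of what is described above: a few horizontal bands are painted, and any band crossed by a slice becomes an audible partial. The uniform names (`resolution`, `globalTime`) are assumptions for illustration, not necessarily the ones Fragment actually exposes.

```glsl
// Minimal sketch (assumed uniform names): paint three horizontal bands whose
// brightness pulses over time; rows crossed by a slice become partials.
precision mediump float;

uniform vec2  resolution;  // canvas size in pixels (assumed name)
uniform float globalTime;  // elapsed time in seconds (assumed name)

void main() {
    vec2 uv = gl_FragCoord.xy / resolution; // 0..1, uv.y maps to partial frequency

    float amp = 0.0;
    for (int h = 1; h <= 3; h++) {
        float center = 0.25 * float(h);                          // vertical position of band h
        float band   = 1.0 - smoothstep(0.0, 0.006, abs(uv.y - center));
        amp += band * (0.5 + 0.5 * sin(globalTime + float(h)));  // slow amplitude pulse
    }

    // the pixel brightness under a slice drives the additive engine;
    // here it is also what you see on screen
    gl_FragColor = vec4(vec3(amp), amp);
}
```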

Fragment can also produce visuals synchronized to the sounds it produces.

Visuals and sounds can be manipulated by tweaking the script variables directly or through MIDI-enabled control widgets (a Chrome and Opera only feature, because Firefox does not implement it right now). It is also possible to manipulate the spectrum with your camera or with images: you can draw a score on a piece of paper (or anything else) and play it back with your camera or by adding images.
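
As a rough illustration of the “tweak variables through MIDI widgets” idea, here is a hypothetical sketch where a user-declared uniform gates which partials are audible; the name `midiCutoff`, its 0..1 range and the way it would be bound to a widget are all assumptions, not Fragment’s actual API.

```glsl
// Hypothetical sketch: a user-declared uniform bound to a MIDI-enabled widget
// (the name midiCutoff and the 0..1 range are assumptions) decides how much of
// the spectrum is audible under the slices.
precision mediump float;

uniform vec2  resolution;  // assumed name
uniform float midiCutoff;  // assumed 0..1 control value from a MIDI widget

void main() {
    vec2 uv = gl_FragCoord.xy / resolution;

    // partials above the "cutoff" height are silenced, lower ones fade out gently
    float amp = uv.y < midiCutoff ? (1.0 - uv.y) : 0.0;

    gl_FragColor = vec4(vec3(amp), amp);
}
```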

Fragment is mostly web-based, but there is a port of the synthesis engine to Linux/Windows/Raspberry Pi which provides crackle-free performance in case you have a “slow” CPU. I will also soon release a standalone all-in-one executable for Linux and other platforms which will deliver solid performance out of the box.

This synth requires a beefy CPU and GPU, and some knowledge of GLSL, in order to use it. You can follow the example comments for hints and the help dialog for assistance; documentation is also available.

If you have any questions, I will be glad to answer them here. :slight_smile:

You can try it at: https://www.fsynth.com

Documentation can be found here: www.fsynth.com/documentation

A quick track made with Fragment and Renoise while I was testing things today:

https://soundcloud.com/fsynth/jam-session

Some more videos were uploaded: see the original post.

The standalone version (with much better performance) and the complete software documentation with examples and tutorials are also in progress.

Also, while jamming with it today, I found out that having multiple instances of the synthesizer play different sounds is currently nearly impossible. It is a major setback, so this will be improved in the next version by adding the possibility to assign different output channels to the spectrum slices, plus a way in the fragment shader to determine which “sound” is to be played for incoming MIDI note messages; that alone would solve the problem without any performance hit. After that will come render buffers, which will allow an “easy” way to add effects such as reverb to the sounds produced with Fragment; it is possible to do this right now, but it is either quite limited or very cumbersome.

Fragment was updated many times this month and is now feature-complete. Here is the list of new features:

  • The documentation is available
  • MIDI devices dialog with support for hot-plugging MIDI devices
  • support for multiple audio output channels per slice; this works best under Linux/JACK (I don’t know how you can have more than 2 outputs “virtually” under Windows…)
  • multitimbral (MIDI channels can be accessed in the fragment shader)
  • live-coding ready (does not stop on errors)
  • monaural mode (so that the full RGB output is available for synchronized visuals of all kinds while the alpha channel is used for the additive synthesizer; see the sketch after this list)
  • slices can be instructed to move left or right at a given speed, independently of each other
  • fullscreen editor
  • new website
  • new UI guide, many fixes and UI changes
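
To make the monaural mode item above more concrete, here is a rough sketch of the idea: the alpha channel feeds the additive engine while RGB stays free for synchronized visuals. Again, the uniform names are assumptions, not necessarily Fragment’s actual ones.

```glsl
// Rough illustration of monaural mode: alpha = sound, RGB = free visuals.
// Uniform names are assumptions.
precision mediump float;

uniform vec2  resolution;
uniform float globalTime;

void main() {
    vec2 uv = gl_FragCoord.xy / resolution;

    // one audible partial: a thin band in the middle of the canvas
    float partial = 1.0 - smoothstep(0.0, 0.01, abs(uv.y - 0.5));

    // visuals: a moving colour gradient, independent from what is heard
    vec3 visuals = 0.5 + 0.5 * cos(globalTime + uv.xyx + vec3(0.0, 2.0, 4.0));

    gl_FragColor = vec4(visuals, partial); // RGB drawn on screen, A fed to the synth
}
```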

Some more videos were uploaded (Renoise is used as a host on all of these videos):

Fragment generates very interesting sounds! Amazing to watch and listen to, but I guess it’s not for me. Maybe when I have some more free time… :blush:

Btw, the first time I saw a live-programmable interface was a presentation video of Sam Aaron playing with samples on Sonic Pi, just a few days before you posted this here. When I saw your post, I thought the user interface had a similar approach.

There is a high learning curve, yeah; it should be easier to get into now that there is documentation and some examples.

The documentation is not finished yet, especially the examples section, and I would also like to write a section on using Fragment with Renoise. Anyway, the current documentation should be sufficient to get started quickly and learn the principles; the new video (the first one) is actually based entirely on the examples from the documentation!
