In light of Lareux’s recent concept images, I’d like to bring to your attention these 9-month-old images:
Full Size
Full Size (with annotations)
There’s track grouping, maximise (not minimise, though that would be good too), labelled notes, volume and pitch-bend visualisation, interactive pitch-bend, volume “airbrush”, vertical zoom for fine-tuning timing…
Transcender, implementing this piano roll will have Cubase/Sonar/Logic devs rolling over in their newly established (by us) graves.
Piano Roll has always been about what note, at what time. These images approach the next generalisation of that idea: what we should be moving toward is a rich, detailed, interactive graph of musical activity over time.
Here we just see instrument (colour), volume (opacity) and pitch (with bending) on the graph… you can imagine how many more layers of musical information you could map onto this thing: fill with waveform, 0xx arpeggio visualisation, colour by pan (or two views of the same track, one set to Opacity by Left Pan, the other to Opacity by Right Pan, and the whole 3D soundscape becomes visible right there on the screen…).
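To make the “layers” idea concrete, here’s a minimal C++ sketch of one note event and one possible mapping onto the graph’s visual channels. Every name and field here is hypothetical; nothing mirrors the actual codebase:

```cpp
// Hypothetical sketch only: a note event carrying the musical data,
// and one possible mapping of that data onto the graph's visual channels.
#include <cstdint>

struct NoteEvent {
    float   startTick;   // high-precision start time
    float   pitch;       // fractional semitones, so bends fit the same scale
    float   volume;      // 0.0 .. 1.0
    float   pan;         // -1.0 (left) .. +1.0 (right)
    uint8_t instrument;  // index into the instrument list
};

struct VisualNote {
    uint32_t colour;     // layer 1: instrument (or pan, if configured that way)
    float    opacity;    // layer 2: volume (or one pan channel per view)
    float    x, y;       // layers 3 and 4: time and pitch on the graph
};

// Each assignment below is a swappable "layer" choice, not a fixed rule.
VisualNote MapToGraph(const NoteEvent &ev, float pxPerTick, float pxPerSemitone)
{
    VisualNote v;
    v.colour  = 0x336699u * (ev.instrument + 1u); // stand-in instrument palette
    v.opacity = ev.volume;                        // "Opacity by Volume"
    v.x = ev.startTick * pxPerTick;
    v.y = ev.pitch * pxPerSemitone;               // fractional y makes bends visible
    return v;
}
```

The two-panned-views trick above then amounts to rendering the track twice, once with opacity scaled by the left pan gain and once by the right.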
The important thing here is why a piano roll is proposed in the first place: more direct visual feedback, more direct control over the sound, and, most importantly, a more direct route from your brain to the final product.
Pattern data is currently considered the “spine” of the program.
It is important to know that pattern data is first and foremost musical data.
It’s pretty damn clear from the size and longevity of this thread that there is a need for more directly controllable musical data.
Piano Roll will not take over Pattern View. Pattern View will not be the only source of data for Piano Roll. Instead, both views should be based on an underlying set of musical data with high-precision timing and other parameters. It could be a 2D array like it is now, or an N-tree sorted by event time. Whatever. We already started down this path with the introduction of Automation curves: they are not part of the pattern data, but they are part of the musical data. It’s time to get all the musical data in one central location and to recognise Automation and Pattern View as two different angles on the same underlying data structure.
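For illustration only, here’s a minimal sketch of the “one central store, two angles” idea, with a time-sorted container standing in for the N-tree. All names here are made up:

```cpp
// Hypothetical sketch only: one central, time-sorted store of musical data,
// with Pattern View and Piano Roll as two queries over the same events.
#include <map>
#include <vector>

struct MusicalEvent {
    double tick;     // high-precision time, not locked to a row grid
    int    channel;
    float  pitch;    // fractional, so bends and automation share the model
    float  volume;
};

// A multimap keyed by time keeps events sorted: a concrete stand-in for
// the "N-tree sorted by event time" idea. Automation curves would live
// alongside the note events in the same store.
using SongData = std::multimap<double, MusicalEvent>;

// Pattern View: quantise events back onto rows for the grid display.
std::vector<MusicalEvent> EventsInRange(const SongData &song,
                                        double rowStart, double rowEnd)
{
    std::vector<MusicalEvent> out;
    for (auto it = song.lower_bound(rowStart); it != song.lower_bound(rowEnd); ++it)
        out.push_back(it->second);
    return out;
}

// Piano Roll: the very same query, rendered as a graph instead of a grid.
```

Neither view owns the data; each is just a different projection of the same store, which is exactly the relationship Automation already has to pattern data.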
“Compose without Wires burns Directly from Brain to DVD that is already in Store”
That’s the end-state we’re all aiming for, isn’t it?