Couldn’t really find the right topic for this, but it would be super awesome if the visuals of, say, the envelope (or any other modulation in use) could be shown overlaid on the visuals of the waveform.
If the overlay matches the waveform’s resolution and timing, you could use your eyes as well to judge when a certain modulation needs to start relative to the waveform.
Basically this as an example:
Thanks mate. That clarifies it quite a bit :’).
The waveform would need to be updated in real time, which is IMHO a waste of resources.
A workaround could be to add a user-defined update rate for the realtime scope (I think it’s around 20 ms at the moment); maybe allow updating it up to 1000 ms.
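To illustrate the workaround above, here’s a minimal sketch (not the plugin’s actual code; class and method names are made up) of throttling scope redraws to a user-defined interval instead of a fixed ~20 ms:

```python
import time

class ScopeUpdater:
    """Hypothetical sketch: redraw the realtime scope at most once per
    user-defined interval, skipping redraws that come too soon."""

    def __init__(self, interval_ms=1000):
        self.interval_ms = interval_ms
        self._last_update = None  # timestamp (ms) of the last redraw

    def maybe_redraw(self, draw_fn):
        """Call draw_fn only if the interval has elapsed; return True if drawn."""
        now = time.monotonic() * 1000.0
        if self._last_update is None or now - self._last_update >= self.interval_ms:
            draw_fn()
            self._last_update = now
            return True
        return False
```

With a 1000 ms interval, the audio thread can keep pushing samples while the GUI only pays the redraw cost once per second.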
Btw, what’s wrong with relying on your ears instead of your eyes?
Nothing wrong with relying on your ears. It’s just a workflow thing, I guess. Besides, IMO the waveform doesn’t need to be updated for visual feedback; it would just be nice to see from the source how the modulation would work over time.
Filtering might be OK to ignore for such a task, but I bet pitch modulation would turn implementing this feature into quite a nightmare…
Regarding the visual update, it would probably work best the way the envelope contour visualisation already does: on every note hit, or when parameters are updated…
Good point regarding the pitch bend…
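To make the pitch-bend problem concrete, here’s a small sketch (hypothetical function, not from any plugin) of the straightforward mapping that only works at constant pitch: envelope time maps linearly to a position in the waveform view.

```python
# Hypothetical sketch: map envelope breakpoints onto a waveform view.
# Assumes constant playback rate (no pitch modulation), so envelope time
# maps linearly to horizontal position.

def envelope_overlay_points(breakpoints, view_width_px, view_duration_s):
    """breakpoints: list of (time_s, level 0..1).
    Returns (x_px, level) pairs for drawing over the waveform."""
    points = []
    for t, level in breakpoints:
        x = min(view_width_px, int(t / view_duration_s * view_width_px))
        points.append((x, level))
    return points
```

Once pitch is modulated, a second of envelope time no longer corresponds to a fixed stretch of source audio, so this linear time-to-position mapping breaks down, which is exactly the nightmare mentioned above.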