yes, and if your sample frequency gets high enough, your sample rate will become irrelevant. there are any number of ways in which an idea can fail. you have successfully illustrated this point. congratulations! you’ve argued me into apathy. i give up. you’re right. sorry for being so stupid as to have an original idea. i’ll know better next time.
I’m pointing out problems that can arise, not trying to grind your nose in the dirt.
Stop being such a drama queen. I might be totally wrong – maybe this will be the feature of the century. I Do Not Know. But you have to be able to take criticism. Otherwise, don’t post ideas… Man. You’re actually making me feel bad for replying in the first place, seeing that you take everything I say as an insult…
Look at this:
Now… to also have an automation curve right there in the track could be very nice.
Although I know it’s not as simple as it may seem. (It may not work in all circumstances…) But you could implement it so it’s only there for those simple cases where it would work.
This is a great idea, but I think waveforms should be completely separated from patterns and fully linear, although visible at the same time as the tracker editor (on the left or right side of it, like the pattern sequencer). I don’t think this would benefit much from tracker commands anyway, so it could simply have envelopes for FX automation, overlaid on the waveform display. And it should be played as real audio tracks, not “MIDI” triggered, either from RAM or streamed from the hard drive.
HCYS’s pictures are exactly what I’ve been looking for for a long time. With this tech you would be able to sync events visually and actually see the music, and find things that would otherwise be elusive to the ear or would take countless relistenings to realize.
Another cool thing would be that if you have an LFOing/vibrating bass sound, you could get other tracks to vibrate EXACTLY in sync, effortlessly, with on-track automation curves. It could make for some schmoooving dance music.
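As a rough illustration of what that on-track LFO curve could be (this is my own sketch, not anything Renoise exposes today): a tempo-locked sine generated as normalized automation points, positioned in pattern lines so a second track stays in phase with the bass at any BPM.

```python
# Hypothetical sketch: generate a tempo-locked sine LFO as (line, value) points.
import math

def lfo_points(num_lines, lines_per_beat=4, cycles_per_beat=1.0,
               points_per_line=4, depth=0.5, center=0.5):
    """Return (line_position, value) pairs with values clamped to 0..1."""
    points = []
    for i in range(num_lines * points_per_line):
        line_pos = i / points_per_line                   # fractional pattern line
        beats = line_pos / lines_per_beat                # musical position in beats
        phase = 2.0 * math.pi * cycles_per_beat * beats  # phase locked to the beat
        value = center + 0.5 * depth * math.sin(phase)
        points.append((line_pos, max(0.0, min(1.0, value))))
    return points

# e.g. a two-cycles-per-beat volume wobble over a 16-line pattern:
curve = lfo_points(num_lines=16, lines_per_beat=4, cycles_per_beat=2.0)
```

Because the curve is defined in pattern lines rather than seconds, applying the same points to the bass track and any other track keeps them vibrating in lockstep no matter the tempo.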
A spectrogram/sonogram view as an alternative to the peak/waveform view is also a very powerful way to see the music (like when mixing and fitting that “spectral puzzle” where every voice fits into its own part of the spectrum). It would make Renoise visually very much the “SciFi” of music programs.
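For what it’s worth, here is roughly what such a view would have to compute (a minimal sketch assuming a mono float signal, not Renoise internals): a Hann-windowed STFT turned into dB magnitudes per frequency/time cell.

```python
# Hypothetical sketch: STFT spectrogram data for a per-track "spectral" view.
import numpy as np

def spectrogram(signal, sample_rate, fft_size=1024, hop=256):
    """Return (times, freqs, magnitude_db) for a Hann-windowed STFT."""
    window = np.hanning(fft_size)
    frames = []
    for start in range(0, len(signal) - fft_size + 1, hop):
        frame = signal[start:start + fft_size] * window
        frames.append(np.abs(np.fft.rfft(frame)))
    mag = np.array(frames).T                      # shape: (freq_bins, time_frames)
    mag_db = 20.0 * np.log10(mag + 1e-9)          # log scale, as a display would use
    freqs = np.fft.rfftfreq(fft_size, 1.0 / sample_rate)
    times = np.arange(mag.shape[1]) * hop / sample_rate
    return times, freqs, mag_db
```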
My points are:
Track “freezing” has been a suggested feature for a long time, and was rated as only “medium” difficulty in the 1.27 features poll if I remember correctly. A “track waveform” built from peak data would probably be the only way to work with frozen tracks effectively in pattern view (there’s a small peak-data sketch after these points).
Either the rendering could be done in the background and shown “when ready”, or with a “render track” button. I think the latter is better because it keeps the computer more responsive when entering notes. Renoise can already render a track, and does this quite speedily at top quality. Heck, even PATTERN rendering takes maybe 1 to 15 seconds on an average song, which isn’t terribly much.
The rendering wouldn’t necessarily be so CPU intensive. For example: a 44 kHz sampling frequency isn’t needed; even 8 kHz would be very much overkill at 125 BPM. 32 bits aren’t necessary either, nor is interpolation. Maybe some rendering optimizations/rationalisations could be made based on this (see the rough numbers sketched below).
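To make the first point a bit more concrete, here is the kind of reduction “peak data” implies (my own illustration, not how Renoise stores anything): one min/max pair per display bucket is all a waveform-under-the-pattern view needs to draw.

```python
# Hypothetical sketch: reduce a rendered track to min/max peaks per display bucket.
import numpy as np

def peak_data(samples, buckets):
    """Return (mins, maxs), one value per bucket of the display."""
    usable = len(samples) - (len(samples) % buckets)  # drop the ragged tail
    chunks = samples[:usable].reshape(buckets, -1)
    return chunks.min(axis=1), chunks.max(axis=1)

# e.g. one bucket per pixel row of the pattern display:
# mins, maxs = peak_data(rendered_track, buckets=pattern_height_in_pixels)
```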
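And the rough numbers behind the “8 kHz is overkill” claim (assuming 4 pattern lines per beat, which isn’t stated above): even at that low rate there are far more samples per line than pixels to draw them with.

```python
# Back-of-the-envelope check, assuming 4 lines per beat at 125 BPM.
bpm = 125
lines_per_beat = 4
render_rate = 8000                               # Hz, the deliberately low rate

seconds_per_line = 60.0 / bpm / lines_per_beat   # 0.12 s per pattern line
samples_per_line = render_rate * seconds_per_line

print(f"{seconds_per_line * 1000:.0f} ms per line")  # ~120 ms
print(f"{samples_per_line:.0f} samples per line")    # ~960 samples
# A pattern line occupies maybe 10-20 pixels on screen, so a cheap 8 kHz,
# low-bit-depth, non-interpolated render still carries ~50x more detail
# than the overlay could ever show.
```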
nice! but it’s more an “underlay” (no overlay! ) … but imho this needs a lot of gfx resources (if it should show not only the static wave of a note, but also the influence of the pattern commands on the wave) and is something like this needed to make music? nice visual fx, but it makes the whole pattern drawing less readable … if implemented in a future version, it should be an option. a wave display on the left or right of a track is maybe more useful (already suggested in the past)
Just to further justify this whole debate:
To me the oscillators are useful, but having a waveform overlay would make them completely obsolete. It’d be uncomfortably cool to be able to normalize a section with a vertical volume envelope (as sketched below), or to immediately see the impact a volume change would have on an instrument’s output.
I wouldn’t use this during production, but I WOULD use it to no end during final cleanup before the master render, and that’s a piece of the Renoise toolkit that’s been missing for a long time.
As a suggestion: this doesn’t necessarily have to be realtime. It could be a render function that, instead of rendering tracks to files, simply plays through the song and analyzes the waveforms. It could be part of the right-click context menu: “Render to waveform overlay”. It could work for a selection, a track, or a track through the whole song, and would probably be a lot faster than rendering to WAV, considering the fidelity of the waveform wouldn’t have to be so great to work as a visual cue (see the second sketch below).
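On the “normalize a section” idea: if the overlay’s peak data for the selection is available, the required gain is a one-liner, and applying it as a flat volume envelope would show up in the overlay immediately. A minimal sketch (my illustration, assuming peak data as min/max arrays):

```python
# Hypothetical sketch: gain needed to normalize a selection, from its peak data.
import numpy as np

def normalize_gain(section_mins, section_maxs, target_peak=1.0):
    """Gain factor that brings the selection's loudest peak to target_peak."""
    current_peak = max(np.abs(section_mins).max(), np.abs(section_maxs).max())
    if current_peak == 0.0:
        return 1.0                                # silent selection: leave untouched
    return target_peak / current_peak

# gain = normalize_gain(mins[a:b], maxs[a:b])    # then draw it as a flat envelope
```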
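And a hedged sketch of the offline “Render to waveform overlay” pass itself. render_block() is a hypothetical stand-in for whatever call would pull the next block of a track’s output offline; it is not a real Renoise function. The point is that nothing has to be written to a WAV or held in full: peaks are folded straight into the overlay buckets as the pass plays through.

```python
# Hypothetical sketch: accumulate overlay peaks while playing a track offline.
import numpy as np

def render_to_overlay(render_block, total_samples, buckets=512):
    """Fold an offline render into per-bucket min/max peaks, block by block."""
    samples_per_bucket = max(1, total_samples // buckets)
    mins = np.zeros(buckets)
    maxs = np.zeros(buckets)
    written = 0
    while written < total_samples:
        block = render_block()                    # hypothetical offline render call
        if len(block) == 0:
            break                                 # nothing left to render
        for i, s in enumerate(block):
            b = min((written + i) // samples_per_bucket, buckets - 1)
            mins[b] = min(mins[b], s)
            maxs[b] = max(maxs[b], s)
        written += len(block)
    return mins, maxs
```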
True. They’d just take up screen space. One thing I’d like is for most of the instantaneous indicators (scopes, VU meters, spectrum analyzer, etc.) for each track and the master out to be moved to a dedicated mixer or indicator view. How about having an integrated Inspector XL type thing in Renoise…
Yeah. For me it would probably be used in production too, because it’s sometimes easier to locate volumes graphically than it is to listen through, trying to notice where everything is. Also, if there was a toggle between pattern data and various forms of automation curves over the waveform, that’d be superb.
Exactly. As part of a track freeze function, there’d be more than enough resources to draw out and render the waveform graphics. Not having coded anything but lighthearted AS, I can always presume such things.
Also, it’d be nice to have another right-click context menu option that would put the frozen waveform on a new, pure wave track (i.e. one that can’t be unfrozen back to pattern data), which would allow for cutting, moving, crossfades, etc. Render to sample slot is already there, so it’s well under way!
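As an illustration of the kind of editing such a pure wave track would open up (my sketch, not an existing Renoise feature), a simple equal-power crossfade between two adjacent clips:

```python
# Hypothetical sketch: equal-power crossfade between the tail of one clip
# and the head of the next.
import numpy as np

def equal_power_crossfade(clip_a, clip_b, fade_samples):
    """Fade clip_a out into clip_b over fade_samples and return the joined clip."""
    t = np.linspace(0.0, 1.0, fade_samples)
    fade_out = np.cos(t * np.pi / 2.0)            # equal-power curves keep the
    fade_in = np.sin(t * np.pi / 2.0)             # perceived loudness steady
    overlap = clip_a[-fade_samples:] * fade_out + clip_b[:fade_samples] * fade_in
    return np.concatenate([clip_a[:-fade_samples], overlap, clip_b[fade_samples:]])
```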
bump
this is awesome!
Don’t bump a thread that has been inactive for half a decade when most of the ideas in it are also mentioned, discussed and elaborated upon in another thread that is permanently stickied to the top of the category. It’s just so… -__-