This is what the process might look like…
- User makes a selection (instrument, one or more tracks/groups, plus a vertical length selection in the pattern editor)
- User is presented with the option to render the FX offline or keep them running live
- Render the selection offline (CPU-based, not real-time)
- Replace/update the instrument with the rendered audio
- Insert a sample trigger in the pattern editor spanning the vertical length selection
- Enable autoseek on the rendered sample, so playback stays in sync when starting mid-pattern
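
The steps above could be sketched roughly like this. Every name here (`Selection`, `Instrument`, `apply_render`, and so on) is hypothetical, just a stand-in to make the flow concrete — a real implementation would go through the host's scripting API:

```python
from dataclasses import dataclass

@dataclass
class Selection:
    instrument: int   # instrument index chosen by the user
    tracks: list      # selected track/group indices
    start_line: int   # vertical selection start in the pattern editor
    end_line: int     # vertical selection end (exclusive)

@dataclass
class Instrument:
    name: str
    rendered_sample: bytes = b""  # rendered audio replaces the live-FX output
    autoseek: bool = False

def render_selection_offline(sel: Selection) -> bytes:
    """Offline (non-realtime, CPU-bound) render of the selected lines,
    so the heavy FX chain no longer has to run live afterwards."""
    n_lines = sel.end_line - sel.start_line
    return b"\x00" * (n_lines * 4)  # placeholder for the rendered audio

def apply_render(sel: Selection, inst: Instrument, pattern: dict) -> None:
    # 1. Render the selection offline
    audio = render_selection_offline(sel)
    # 2. Replace/update the instrument with the rendered audio
    inst.rendered_sample = audio
    # 3. Insert a sample trigger at the top of the selection,
    #    spanning the vertical length the user selected
    pattern[sel.start_line] = {
        "instrument": sel.instrument,
        "length": sel.end_line - sel.start_line,
    }
    # 4. Enable autoseek so playback from any line inside the span stays in sync
    inst.autoseek = True

# Example: a 16-line selection on track 1, using instrument 0
sel = Selection(instrument=0, tracks=[1], start_line=0, end_line=16)
inst = Instrument(name="LeadSynth")
pattern = {}
apply_render(sel, inst, pattern)
```

The point of the offline step is that the render happens once, as fast as the CPU allows, and the expensive live FX chain can then be bypassed entirely.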