“Today at its CES event, Intel made the claim that by the end of 2013, the market will enjoy the presence of $599, touch-enabled Ultrabook laptops. The company also stated that the devices will be required to support touch input.”
Windows 8 supports touch screens, and since Intel 4th-gen Ultrabooks will actually require touch, it might be a good idea to think about touch-friendly controls. Maybe Renoise could work as it does now, but activate special controls when it registers a touch event?
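As a minimal sketch of how an app could tell touch apart from regular mouse input on Windows: mouse messages synthesized from touch or pen carry a documented signature (`MI_WP_SIGNATURE`) in the message's "extra info" value, so a UI could check that and swap in bigger, touch-sized controls. The constants below come from Microsoft's documentation; the helper names are just made up for illustration.

```python
# Windows tags mouse messages synthesized from touch/pen with a
# signature in the upper 24 bits of the message's extra info value.
MI_WP_SIGNATURE = 0xFF515700
SIGNATURE_MASK = 0xFFFFFF00


def is_touch_or_pen(extra_info: int) -> bool:
    """True if a WM_MOUSE* message originated from touch or pen input."""
    return (extra_info & SIGNATURE_MASK) == MI_WP_SIGNATURE


def is_touch(extra_info: int) -> bool:
    """Per the docs, bit 0x80 in the low byte distinguishes touch from pen."""
    return is_touch_or_pen(extra_info) and bool(extra_info & 0x80)
```

In C you would fetch the value with `GetMessageExtraInfo()` inside the mouse-message handler; the masking logic is the same.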
Also, I don’t see what touch can do that a normal PC mouse can’t… so Renoise is actually quite ready, unless you’d like to recognize 3-, 4-, or 5-finger “taps” as specific actions, and I can’t figure out what one would want to do with that
Multitouch does give some extra options, though. I could see zooming, selecting, etc. in the sample editor working way more smoothly with pinching/spreading/swiping and so on.
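The pinch-zoom idea boils down to very little math: track two touch points and derive a zoom factor from how far apart they have moved. A hedged sketch (the function and the coordinate-pair convention are assumptions, not any particular touch API):

```python
import math


def pinch_zoom_factor(p1_start, p2_start, p1_now, p2_now):
    """Zoom factor implied by two touch points moving apart or together.

    Each point is an (x, y) tuple. A factor > 1 means zoom in,
    < 1 means zoom out.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    start = dist(p1_start, p2_start)
    now = dist(p1_now, p2_now)
    return now / start if start else 1.0  # guard against identical points
```

For example, two fingers starting 100 px apart and ending 200 px apart would double the sample editor's zoom level.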
I could see it implemented in the mixer view, for sliding multiple faders up and down at once.
Possibly re-do the XY device with an ‘expand’ view that can accept touch.
Could be quite fun in the Sample editor, possibly for making selections and using gestures to trim, cut, paste, apply effects, or change amplitude. Though I’m not sure how accurate it would be.
Trackers are keyboard-oriented anyway. I think most Renoise users barely touch the mouse at all. That’s just the way trackers are designed. So there’s not really much benefit to having touch support in there.
Keep in mind a touch version of renoise would be a huge redesign. All the controls would need to be bigger. You also can’t just scale controls up, you have to completely rethink how you fit the stuff on the screen and how you logically place things next to each other when they’re being used as a touch surface. Not saying this is impossible or shouldn’t be done, just that it would require an almost ground-up UI redesign.
This looks pretty interesting. I’m trying out Renoise on a Surface Pro right now, and it works like a charm, although I’m mostly using touch for muting channels, etc. I’m quite curious to try this little touch interface coming also for the Surface Pro. I don’t know what you could do with the scripting for Renoise, but I’m imagining using my Surface Pro connected to a secondary screen, and maybe a touch interface with mixing controls etc. as a plugin for Renoise… that would be really awesome…
But… isn’t that just a plain marketing trick to describe it as such? Or can someone confirm that software can actually receive pressure data from the keys?
I recall MS did a prototype keyboard that could pull this off a few years back, but somehow I’m skeptical that it actually made it to a real product.
Not that pressure-sensitive keys wouldn’t be a perfect match for a tracker - it would be the perfect way to input notes!!
They did a demonstration of it in their keynote for the first Surface. They didn’t say what kind of data it would output, but if someone could make a Lua script to take that data and convert it to velocity, that would be neat! (If possible.)
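The conversion itself would be trivial once the pressure data were available. Assuming (hypothetically, since Microsoft never documented the output format) that the keyboard reported a normalized pressure value in 0.0–1.0, mapping it to a MIDI velocity might look like this; the function name and range are illustrative only:

```python
def pressure_to_velocity(pressure: float) -> int:
    """Map a hypothetical normalized key pressure (0.0-1.0) to a
    MIDI velocity (1-127). Velocity 0 is avoided because in MIDI it
    usually means note-off."""
    pressure = max(0.0, min(1.0, pressure))  # clamp out-of-range input
    return max(1, round(pressure * 127))
```

In a real Renoise tool this would live in Lua rather than Python, but the scaling logic is the same.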
Indeed the video is interesting, but I think we are looking at driver-level code, specifically created for the presentation event.
At least, a quick search over at MSDN didn’t reveal anything like a pressure value associated with Windows keyboard events.
I’m not really speaking of redesigning Renoise for touch, but rather of making an app/tool using the scripting capabilities in Renoise to control some Renoise functions. Running Renoise on dual screens, with Renoise on the non-touch screen and the controls/app/tool on the touchscreen/Surface Pro, or any other combination.
Browse all Tools | Renoise ---- like making Renoise tools with touch capabilities. Like a touch mixer, touch keyboard (piano), touch drum machine, or whatever. I’d be interested in brainstorming more ideas and helping design something like that if anyone is familiar with the scripting part.
Maybe on the touch screen you’d need a little palette with all the notes (E-4, for example) floating around, bumping into each other and maybe turning into different notes based on some cellular automata rules… and you catch the little notes with your finger and drag them onto the tracker grid
For good solid touch response, you need a proper engine that handles everything properly in real time.
You can catch some stuff with the tools API, but the response of all these knobs is pretty poor if you need fast and instant application of specific effects.