Although the concept as a whole is pretty vague in my mind, I was wondering whether an application exists that would allow you to sequence / edit / mangle video inside a tracker environment?
I speedily (and thankfully) ditched the other (linear style) music apps after demoing Renoise mainly because of the interface. I don’t have much experience with video software, but the ones I have tried (Vegas, Premiere) have that clunky cluttered feel that puts me off wanting to explore them on a practical level.
I’d just like to sequence video clips with the rigid timing and precision that Renoise gives me. A few effects would be nice to have, but they’re not essential.
I imagine there are more important things on the devs’ to-do list, but a little video functionality inside Renzo would get a massive +1 from me.
Anyone know anything they could recommend? I’m on a PC.
He pitched the idea to the main programmer to create a VJ tool with keyboard control (after seeing me play with trackers for years). In the end it is nothing like a tracker, but pretty cool for realtime video manipulation.
Crikey… I wouldn’t know where to start with that… are you able to break that down a little further? I’ve not used either app (bit intimidated, if I’m being perfectly honest!). If not, I can scamper off down the net to work it out.
Not really explored Reaper - like I said, the whole linear setup really tests my patience now that I’ve discovered Rezzle. Do you think I can trigger video clips (and images, whilst we’re talking about it) on the beat in Reaper, in terms of beats and bars rather than minutes and seconds?
Again, it’s all very vague in my head at the moment, but the way I see it in my mind is akin to a bunch of video clips in the sample window of Renoise (much like audio samples): we could create instruments, use 09xx to offset clip start times, 0Bxx to reverse, use the sample editor as a clip editor to chop, etc… I’m sure you get the drift.
Thanks for the link, I had a look and it looks much slicker than the other editors I’ve looked at. If you reckon that programmer is up for hearing a few ideas then I’d be well up for pitching in
Live use sounds like a lot of fun, however for me at this stage it would be more so that I could ‘track’ video rather than ‘perform’, although I can see just how the two would eventually overlap in some way.
Lastly, and forgive my naivety, but is the code of Rennie open source? I say that as if I know about programming - quite the opposite - but if I could take the ‘framework’ of Reno and explain to a dev (an audio one or otherwise) what it is I’m looking for, would it be a worthwhile, practical and, I guess from a dev’s point of view, legal venture?
Nice one guys! Wouldn’t it be great if video tracking originated from this very board…one does like to dream…
With video sequencing you usually don’t do either - it all revolves around frames! Even if you wrap it up in something else so to speak, it all gets rendered down to the frame rate anyway, so IMHO it’s best to get used to that way of working. Unless rendering the video isn’t your main goal, of course, because for realtime all bets are off.
Let me give a simple example, let’s say you want to “flash an image on every snare hit”. Well, unless you tune the song speed to perfectly match the frame rate, this is IMPOSSIBLE. What you get instead is an approximation: sometimes the frame just before the snare hit gets picked, sometimes the frame after it.
But if you work on the thing at 25fps in Premiere or whatever, YOU decide, depending on the creative/emotional content, whether to pick the frame before the snare hit EVERY time, or the one on the beat, etc…
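To make the mismatch concrete, here’s a rough sketch (Python; 140 BPM and 25fps are just example values, not from any real project) of where snare hits on every beat land relative to frame boundaries:

```python
# Rough sketch: where beats land relative to frame boundaries.
# 140 BPM and 25fps are example values, not from any real project.
BPM = 140.0
FPS = 25.0

beat_len = 60.0 / BPM                 # seconds per beat
for beat in range(4):
    t = beat * beat_len               # time of the snare hit
    exact = t * FPS                   # fractional frame position
    nearest = int(exact + 0.5)        # frame that actually gets picked
    error_ms = (exact - nearest) / FPS * 1000.0
    print(f"beat {beat}: frame {exact:6.2f} -> picks {nearest} (off by {error_ms:+5.1f} ms)")
```

The error drifts from hit to hit: sometimes the picked frame is early, sometimes late, which is exactly the approximation described above.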
Premiere can be pretty intimidating, but once you’ve figured out the few things you need to cut clips up and put effects and transitions on them, you realize that you don’t need to understand or use 80% of the app, and it’s not really that bad… I guess it’s similar for other programs.
This is no different from scoring for film, though - you just have to make the decisions about what looks good and what doesn’t. Exact sync between music and video doesn’t work well anyway.
Then you could use pattern commands and/or a DSP device chain to send parameters to PD (Pure Data) or VVVV (see the sketch at the end of this post).
PD is just like Max/MSP, only it’s open source… so you don’t have to pirate Max/MSP!
And it works on Linux, OSX, and Windows.
Pd-extended comes with Gem (the Pure Data video library).
I’m not sure, but I would assume someone in the PD community has made a patch that would suit this adventure. It’s pretty much choose-your-own-adventure with PD and VVVV; once you learn one well enough, the other works similarly.
VVVV is very similar to PD and very clean; it is geared much more toward very high-quality visual arts. It’s not open source, but the license suits most everyone’s needs unless you’re doing things involving large sums of money. It’s very impressive, really.
If anyone makes anything with VVVV I would love to see you post the output and source patches here.
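If you go the OSC route, something this small already gets numbers into a patch (Python with the python-osc package; the port and OSC address names here are invented, so match them to whatever your patch actually listens for, e.g. [udpreceive 9000] into [unpackOSC] from the mrpeach externals in Pd-extended):

```python
# Tiny sketch: pushing parameter values into a PD (or VVVV) patch over OSC.
# Assumes the python-osc package; the port and OSC addresses are made up,
# so match them to whatever your patch actually listens for.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9000)    # patch listening on UDP 9000

client.send_message("/video/clip", 3)          # e.g. jump to clip slot 3
client.send_message("/video/brightness", 0.8)  # e.g. a 0.0-1.0 effect amount
```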
Thanks Johann, that was a very useful post. Rendering would be the final goal, yes.
In my mind I still see beat-synced triggering as possible, since video clips can presumably be chopped down to the exact frame I want each clip to begin from. So, if I had a 4/4 kick drum pattern in track 1 of Ren, then in track 2, once I had trimmed my video clip as desired, it too could be triggered at the same time as each kick drum, producing the result I’m after? I could be missing something (or more likely, a great deal!) here. I realise a lot of this is still talking in the abstract.
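Doing rough sums on that 4/4 idea (example numbers only: 120 BPM, 4 lines per beat, 25fps; none of these are anyone’s actual settings), each kick row maps to a frame like this:

```python
# Rough sums: mapping kick rows in a Renoise pattern to video frames.
# 120 BPM, 4 lines per beat, and 25fps are example values only.
BPM = 120.0
LPB = 4                      # lines per beat, as in the song settings
FPS = 25.0
kick_rows = [0, 4, 8, 12]    # a 4/4 kick in a 16-line pattern

line_len = 60.0 / BPM / LPB  # seconds per pattern line
for row in kick_rows:
    t = row * line_len
    exact = t * FPS
    print(f"row {row:02d}: {t:.3f}s -> frame {exact:5.1f}, picks {int(exact + 0.5)}")
```

So even at a nice round 120 BPM, every other kick lands halfway between frames, which is exactly the approximation Johann describes.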
thanks again
Ideally, everything would be contained within a visual tracker (ReView?), but I’ll definitely have a look at those ideas, many thanks!
Very close to what I had in mind, output-wise! Nice
This was one of my first ideas earlier this year. I routed the audio into Pure Data and turned it into numbers in order to manipulate OpenGL parameters. That idea turned into triggering random frames of video by simply applying a threshold to the master audio track. Blah blah, ideas, ideas.
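The threshold version is only a few lines if you do it offline on a rendered mix (a rough sketch: it assumes a 16-bit mono wav, and the filename, threshold, and hold time are arbitrary):

```python
# Rough offline take on the threshold idea: scan a rendered mix and
# emit a random video frame whenever the level crosses a threshold.
# Assumes a 16-bit mono wav; filename and constants are arbitrary.
import wave, struct, random

THRESHOLD = 0.6      # normalized level that counts as a hit
HOLD = 0.1           # seconds to wait before allowing another trigger

with wave.open("master.wav", "rb") as w:
    sr, n = w.getframerate(), w.getnframes()
    samples = struct.unpack(f"<{n}h", w.readframes(n))

last = -HOLD
for i, s in enumerate(samples):
    t = i / sr
    if abs(s) / 32768.0 > THRESHOLD and t - last >= HOLD:
        last = t
        print(f"{t:7.3f}s: jump to random frame {random.randrange(500)}")
```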
I have been very happy with Jitter for a while now, but never underestimate PD (so many amazing tutorials come with it)… but with either of these you are going to need a fairly fast computer and/or a solid grasp of what’s going on with buffers.
For your idea…
Try taking the MIDI output of Renoise and popping that into either PD or Jitter… you could turn those commands into anything. I mean, they’ll just be numbers, but you could use them to manipulate any video parameter you want, and it shouldn’t be too hard on the CPU. This is definitely on my to-do list.
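Something like this already gets you the raw numbers (Python with mido plus a backend such as python-rtmidi, and a virtual MIDI cable carrying Renoise’s output; the port name below is a placeholder, not a real device name):

```python
# Sketch: turn Renoise's MIDI output into plain numbers for a video app.
# Assumes mido (with e.g. the python-rtmidi backend) and a virtual MIDI
# cable; the port name below is a placeholder, not a real device name.
import mido

with mido.open_input("Renoise MIDI Out") as port:
    for msg in port:
        if msg.type == "note_on" and msg.velocity > 0:
            # e.g. note number picks a clip, velocity scales an effect 0..1
            print(f"trigger clip {msg.note} at {msg.velocity / 127.0:.2f}")
        elif msg.type == "control_change":
            # CCs map naturally onto continuous video parameters
            print(f"CC {msg.control} -> {msg.value / 127.0:.2f}")
```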
hope this helps.
I think it’s only a matter of time. I had a dream last year that I wrote about here in the off-topic subforum.
I don’t remember all of the details, but I woke into a scene of me editing a pattern in Renoise using DivX files. I was controlling it with an audio tone signal I was hearing and manipulating inside my head; as I manipulated the signal, I witnessed changes in the pattern.
Upon realizing what was happening, I lost awareness and was walking along a sidewalk path; the sky was orange and the clouds were moving very fast above.
The part where I was using Renoise like that was very vivid and clear.
…Renoise really needs Open Sound Control!