No-Gear Live Set

That sounds interesting, genfu. My only question is: how would you go about syncing two songs that aren't within an acceptable BPM range of each other?

To elaborate: say you have two songs with samples intrinsically linked to separate BPMs, say 137.26 and 175 (BPMs that can't be matched by simply multiplying or dividing one by the other). We would need these two songs playing together within a precise degree, but since the ReWire master is the master BPM control, it would throw the slave instances' timings out, from what I know about this, which is admittedly not much. Since we have no real method of pitch shifting or pitch bending, this is the only thing I can see standing in the way of the dev team implementing it.
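To make that concrete, here is a quick sketch of checking whether two tempos can be pulled together by simply changing one song's BPM, also allowing half- and double-time matches. The 6% tolerance is my assumption; real limits depend on the material.

```python
# A quick sanity check on tempo matching, assuming the only knob is
# changing one song's BPM to meet the other's. The 6% tolerance is an
# assumption; real-world limits depend on the material.

def stretch_ratio(bpm_a: float, bpm_b: float) -> float:
    """Factor by which bpm_a must change to match bpm_b,
    also allowing half- or double-time matches (e.g. 87.5 vs 175)."""
    candidates = [bpm_b, bpm_b / 2, bpm_b * 2]
    # Pick the target tempo that needs the smallest change.
    best = min(candidates, key=lambda t: abs(t / bpm_a - 1))
    return best / bpm_a

def matchable(bpm_a: float, bpm_b: float, tolerance: float = 0.06) -> bool:
    """True if the required change stays within +/- tolerance."""
    return abs(stretch_ratio(bpm_a, bpm_b) - 1) <= tolerance

# The example tempos from the post:
print(matchable(137.26, 175))  # False: even half-time (87.5) is ~36% away
print(matchable(135, 140))     # True: only a ~3.7% change
```

With 137.26 against 175 even the closest candidate (175 itself) needs a ~27% change, which is why these two really can't be blended in sync.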

I think this part is somewhat simple: each instance could declare itself something like this:
max/msp(master)
renoise(0),
renoise(1) and so forth.
So whenever another instance of Renoise is loaded, it sees the other instances being declared and follows accordingly, asking to add itself to the master. Once in: load songs, full ASIO, MIDI, and before long OSC! ;)
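As a rough illustration of that declare-and-follow idea, the bookkeeping could look like the sketch below. Nothing here is a real Renoise or ReWire API; the `Registry` class and names are purely hypothetical, mirroring the max/msp(master), renoise(0), renoise(1) convention above.

```python
# Hypothetical bookkeeping for the declare-and-follow scheme: a master
# holds a registry and each new instance takes the next free slot.
# This is NOT a real Renoise or ReWire API, just an illustration.

class Registry:
    def __init__(self, master):
        self.master = master
        self.slaves = []  # slave names in the order they joined

    def register(self):
        """Called by a newly launched instance to join the master."""
        name = f"renoise({len(self.slaves)})"
        self.slaves.append(name)
        return name

bus = Registry("max/msp(master)")
print(bus.register())  # renoise(0)
print(bus.register())  # renoise(1)
```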

I really don't think this would get in the way much, if at all, for those of us who use multiple Renoise instances separately and in different ways.

That is the purpose of the thread I started about rewiring renoise to renoise.

This is the basic initial idea of what I would put in the patch, if it were possible to ReWire two instances to Max:

DECK A
-play
-pause
-resume
-stop
-link bpm to B
-manual bpm
-play at interval (e.g. at the beginning of the next bar, 4 beats etc.)

DECK B
(the reverse of above)

Then you have a crossfader with selectable angles, perhaps a patchbay for MIDI routing, a soundfile recorder for capturing the set, etc. Also an option to route the output to four channels for use with a real DJ mixer, if you have the hardware.
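For the crossfader part, here is one reading of "selectable angles": a fader curve whose shape is adjustable, with the default giving the classic equal-power taper. The `shape` parameter is my assumption, not something taken from an actual Max patch.

```python
import math

# One reading of "crossfader with selectable angles": a fader curve whose
# shape is adjustable. shape=0.5 gives the classic equal-power
# (sine/cosine) taper; larger values fade out faster. The shape
# parameter is an assumption, not taken from any actual Max patch.

def crossfade_gains(position, shape=0.5):
    """Return (gain_a, gain_b) for fader position 0.0 (deck A) .. 1.0 (deck B)."""
    gain_a = math.cos(position * math.pi / 2) ** (2 * shape)
    gain_b = math.sin(position * math.pi / 2) ** (2 * shape)
    return gain_a, gain_b

# At the midpoint with the equal-power curve both gains are about 0.707,
# so the summed power (a^2 + b^2) stays at 1.0 across the whole travel.
print(crossfade_gains(0.5))
```

Equal-power rather than linear is the usual choice here, since a linear fade dips audibly in loudness at the midpoint.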

As far as changing the BPMs of songs goes, how well it works would be a case of trial and error, but for the most part I would expect that to synchronise songs you need to design them at the same BPM, or a similar enough BPM that you can adjust one to match the other. If someone is doing a set in a particular style, most tracks will be in a similar BPM range (e.g. 135-145 or so), in which case syncing tracks this way should work okay. It might be something you have to consider when writing your music, or you could reformat a track for live use so that it will sync up okay (you'd probably want to do this anyway to make a version with plenty of pre-rendered material, to save on CPU).

Note that the idea of the patch is that you can sync tracks in a similar tempo range, but you can also just start and stop songs from the beginning or a cue point, either instantaneously or at the beginning of the next beat/bar, etc. So if you have two songs at different speeds, you can do what a DJ might do: not blend them together in sync, but start the next track promptly at the right point. The patch was also meant to facilitate that.
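The "start at the beginning of the next bar" behaviour boils down to a small timing calculation: how long until the next bar line, given the current transport position. A sketch, assuming 4/4 and a position measured in beats:

```python
# "Start at the next bar" as a timing calculation: how long to wait so a
# start command lands on the next bar line. 4/4 bars are assumed here.

def seconds_until_next_bar(position_beats, bpm, beats_per_bar=4):
    """Seconds to wait so a start command lands on the next bar line."""
    beat_len = 60.0 / bpm                      # seconds per beat
    into_bar = position_beats % beats_per_bar  # beats elapsed in this bar
    remaining = beats_per_bar - into_bar       # beats left to the bar line
    return remaining * beat_len

# At beat 6.5 of a 140 BPM track, the next bar line (beat 8) is 1.5 beats
# away, i.e. about 0.643 seconds:
print(seconds_until_next_bar(6.5, 140))
```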

Finally, the patch is advantageous because it pipes the audio from both Renoise instances out through Max using only one audio driver.

There's nothing much in this idea that would be particularly difficult to build if I could slave TWO Renoise instances to Max.

yes… that’s exactly what i want to happen.

sorry, what thread are you referring to here?

That’s an interesting idea, I didn’t know the bpm could be disabled for certain slaves.

Here is the thread:

It’s pretty much the same thing we are talking about here and goes into a little more depth, I guess. Also some other ideas or rather interpretations too.

Ah okay… that's a pretty old one now, isn't it. Worth resurrecting or not? The thread seems to cover more confusing (to me) applications of the concept, but for the kind of DJ/live applications I'm talking about, it would seem to make perfect sense to implement what you propose: Renoise (1), Renoise (2), etc. I'm pretty sure (but I do need to check up on this) that it would then be possible to sync each Renoise independently or simultaneously from Max/MSP, which would make constructing a DJ mixer application a breeze.

Since you can already use a different audio driver to get audio out of both, and control the transport via MIDI with MIDI Yoke, I'm going to attempt something basic with what is already possible. But really, ReWire would be the ideal way to do this in terms of audio drivers and synchronisation…

now there’s a thought,…

What if, for the time being, you guys go back to the basic idea of two turntables and a mixer, i.e. you throw out all the requirements for syncing, BPM matching, etc., all the fancy stuff… because fundamentally all you need is a DJ mixer in between two Renoise instances.

DJs should be able to sync things manually, shouldn't they?

put the dj back into DJing dammit!

Disclaimer: I don't DJ, so I most definitely don't know what I'm talking about.

i’ve made a rather crude video of a basic version of the patch i’ve been talking about as a proof of concept. i’ve started a new thread over here:

Oh man I just figured something out in 2.5.

"Controlling Renoise… with Renoise!" :)

I don’t have my glasses with me so I won’t go into too much detail, just how to do this.

You must have some sort of MIDI router to begin with. OS X (IAC) and Linux (JACK & ALSA) have this built in; on Windows, use something like MIDI Yoke (it works perfectly).

-I'll try to make a recording of this once I find a decent screen capture tool for OS X.

You need at least 2 instances of Renoise open for this Renoise-on-Renoise fun.

you can either do this synced or non-synced.

Synced entails making one Renoise the master and the other a slave.

In the MIDI prefs of Renoise (master), in the middle under MIDI Clock Master, select the routing device, such as IAC Bus 1, Out To MIDI Yoke: 1, or MIDI Through Port 0.

On instrument 00, go to the Instrument Settings, then the MIDI Properties, and under Device select IAC Bus 1, Out To MIDI Yoke: 1, or MIDI Through Port 0.

Now that is set.

Now go to the Track DSPs.

Insert a MIDI Control meta device; as the instrument, make sure 00 MIDI (Cha 1) is showing.

Go to one of the "Untitled" sliders and give it a CC number like 22 (make sure CC is selected next to it).

Now insert an LFO meta device in the same track and link its destination to the slider you just set up.

The slider should be moving at this point, and this slider is what we will use to tell Renoise (slave) which CC controller we are using.

Now on to Renoise (Slave)

In Renoise (slave), go to the MIDI prefs, at the top under MIDI Master Keyboard / MIDI Mapping:

In Device A, select IAC Bus 1, Out To MIDI Yoke: 1, or MIDI Through Port 0; remember, that is whatever you chose in instrument 00 of Renoise (master).

Then at the bottom, under MIDI Clock Slave, set it to whatever you have your MIDI Clock Master set to in Renoise (master).

After all of that, we are ready to start mapping with the MIDI Map.

**make sure you save the files now.

Now you can do all kinds of very weird and interesting things from Renoise (master).

Open the MIDI Mapper with Ctrl + M (or Apple + M)

and select the BPM!!! (Beats / Min.)

You can close the MIDI Mapping at this point and watch the BPM shifting with the LFO from Renoise (master).

Now, everything you saw highlighted while the MIDI Mapping was open can be controlled from Renoise (master).

The true power of Renoise! :walkman:

Oh yeah, I need to add: on Renoise (slave), the button next to Beats / Min. should be turned off to get the proper BPM shifting.
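For anyone curious, what actually travels over the IAC / MIDI Yoke port in this setup is a stream of plain 3-byte Control Change messages. This sketch builds them by hand so you can see the wire format; to really send them you would pass the bytes to a MIDI library (e.g. python-rtmidi), which is left out here.

```python
# The MIDI Control meta device emits plain Control Change messages.
# This builds the same raw 3-byte messages by hand; actually sending
# them over a port would need a MIDI library, which is omitted here.

def cc_message(channel, control, value):
    """Raw MIDI Control Change: status byte 0xB0 | channel, then 2 data bytes."""
    return bytes([0xB0 | (channel & 0x0F), control & 0x7F, value & 0x7F])

def lfo_sweep(control=22, channel=0):
    """Yield CC messages ramping 0..127, like the LFO driving the slider."""
    for value in range(128):
        yield cc_message(channel, control, value)

# CC 22 on MIDI channel 1 (index 0), value 64, as in the walkthrough:
print(cc_message(0, 22, 64).hex())  # b01640
```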

Nice! Works. Syncs up perfectly when I initially hit play on the master. But it seems the button beside the BPM on the slave definitely needs to be activated when I slide the master's BPM up and down in order to catch up. Something keeps slightly offsetting the slave when this BPM button is on, but I can't figure out what… hmm. :( Maybe I messed up. I'll try setting it up again. Almost perfect, though. Thank you!

For screen capture on OS X I'd recommend ScreenFlow.

Cool, just got ScreenFlow-1.5.1 going to check it out, here shortly. Thanks! :)

Yeah, I too noticed that after I wrote the above.

What I ended up doing in Renoise (master) was putting an LFO reset in the pattern editor effect command column,
using the Renoise (master) pattern length as the master sync,
then unchecking that box in Renoise (slave).

Oh man, and using this on LPB is wild too!

Just noticed that in beta 9 we can use controller transports, so that makes things that much more interesting, I think. :)

looking forward to your video :panic:

I got a bunch of things squared away this morning and the past days since, and I hope to have it uploaded somewhere in about 2 days. :)

Nice idea! Since we're talking about live sets here (not really linked to BYTE's post, but) when I pull off a live set… I usually use standalone synths like Massive or Korg Legacy (any, really) and build a track in Renoise or Buzz (Renoise looks cooler up on a projector)… kinda like, I am the band… I end the track, I don't build it up like when I'm DJing and mix the tracks together… I keep producing and DJing separate… Think of it like this: when KJ Sawka pulls off his shiz he stops in between… takes a rest… takes in the joys and the talent of being a producer and the cheers of the people who appreciate his talent… his art. … Yeah, I dunno… we're more musicians than "DJs", eh… Save the DJing for the cats that spin our music when/if we hit the big one =D!! lol. But I guess you could count the software synths as "gear"; I was taking you to mean gear as hardware… You thinking of doing a live set, BYTE-Smasher? I'll come to ON to see ya, bro… you're like one province away =D haha! eZ.

…::niNja pWn::…

I forget how to get this embedded.

**link removed
**going to fix it.

I'm having trouble figuring out just how to mess with BPM and LPB, in a manner of speaking. Doing so can achieve great noise effects: chainsaws of drums and overly noisy stuff. A few days ago, when I began testing this idea/method, I had a great synced pad sound happening, but I forgot to write down what I was doing.

So this method is about using 3 renoise instances.
1 master, 2 slaved.

The master doesn't have a song; it's simply a device chain controlling the 2 slaved Renoise instances loaded with small songs.

Next, I want to work out how to load other songs into the 2 slaved instances and get them back in sync. It looks like I may need to wait a little, as I believe the actual method for doing this is to use "Send song position pointer" in the master instance, and I think there may be a bug in that.
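For reference, the Song Position Pointer itself is just a 3-byte MIDI System Common message: status 0xF2 followed by a 14-bit position counted in sixteenth notes, least significant 7 bits first. A sketch of building one:

```python
# MIDI Song Position Pointer: status 0xF2 plus a 14-bit position counted
# in sixteenth notes, sent least-significant 7 bits first.

def song_position(sixteenths):
    """MIDI SPP message for a transport position given in sixteenth notes."""
    if not 0 <= sixteenths <= 0x3FFF:
        raise ValueError("SPP position must fit in 14 bits")
    return bytes([0xF2, sixteenths & 0x7F, (sixteenths >> 7) & 0x7F])

# Jump to bar 5 in 4/4 (four full bars = 64 sixteenths):
print(song_position(64).hex())  # f24000
```

So resyncing a freshly loaded slave song would amount to the master sending one of these followed by a Continue message, which is presumably what Renoise's "Send song position pointer" option does under the hood.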

okay I’ll fix it. :)
link removed until I get timing straight.

Interesting you say that; I’ve been working on a PD patch for Linux for just that purpose.

I’m actually trying to get the whole using renoise live thing sorted out, myself.

Tried to do a live set with 2 instances of Renoise running, but when I started the song in the second instance my laptop horribly started to go mad ^^ I had nearly a minute of start-stop-start-stop of the first 10 seconds of the track… too bad, with just 512 MB RAM and 1.7 GHz, I think.
Having powered up my laptop with 2 GB RAM now, I could give it a new try, but I haven't yet. Don't want to mess around again.
I think the set I played there was my worst ever: chose wrong and unfinished tracks, repeated one, and so on… damn you, beer! ;)

Lesson #1: Experiment obsessively with your live setup in advance to ensure stability and usability.

Make beat. Press play. Scroll up and down.

At least when I’m bored. Can’t say that’s much of a live set. <_<

  1. Use rnsmerge to join all of your songs together
  2. Press play
  3. Look busy

That is all.