# Get the BPM value per line throughout the song

Given the complexity of Renoise, it is quite possible that there is no reasonable way to obtain the following:
The intention is to build, very quickly, a table holding the BPM value of each line across all the patterns of the whole song, following the sequence. That table would then be consulted by a cumulative time calculation function.

This is an experiment that I have done:

```lua
-- Busy-wait for the given number of seconds
local function sleep(seconds)
  local deadline = os.clock() + seconds
  while os.clock() < deadline do
    -- spin
  end
end

local function rec_bpm_lpb()
  local song = renoise.song()
  local trn = song.transport
  local tbl_bpm = {}
  local ssq = song.sequencer
  for seq = 1, #ssq.pattern_sequence do
    song.selected_sequence_index = seq
    tbl_bpm[seq] = {}
    local nol = song:pattern(ssq:pattern(seq)).number_of_lines
    song.selected_line_index = 1
    for l = 1, nol do
      -- give the transport time to update, then read the current BPM
      tbl_bpm[seq][l] = trn.bpm
      sleep(0.050)
      if l < nol then
        song.selected_line_index = l + 1
      end
    end
  end
  rprint(tbl_bpm)
end

rec_bpm_lpb()
```

This function is excessively slow, because the BPM value cannot be extracted per line instantly; the transport needs time to update, which is why the function sleeps between lines.
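For reference, the cumulative-time step itself is simple arithmetic once such a table exists: each line lasts 60 / (BPM × LPB) seconds. A minimal sketch of that calculation (in Python just to illustrate the math; the function name is mine, not part of the Renoise API):

```python
def accumulate_times(lines):
    """lines: list of (bpm, lpb) pairs, one per pattern line.
    Returns (times, total): the cumulative time in seconds at the
    START of each line, and the total duration."""
    times = []
    t = 0.0
    for bpm, lpb in lines:
        times.append(t)
        t += 60.0 / (bpm * lpb)  # one line lasts 60/(BPM*LPB) seconds
    return times, t

# Four lines at 120 BPM, 4 LPB: each line lasts 0.125 s
times, total = accumulate_times([(120, 4)] * 4)
# times == [0.0, 0.125, 0.25, 0.375], total == 0.5
```

The hard part, as described above, is filling in the `(bpm, lpb)` table in the first place.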

This implies tracking two things:

1. The BPM values from the automation editor (extremely complicated, because what matters is not the value of the points but the value of the ramps at each line).
2. The ZTxx effects on every line (the rightmost value on the line always takes priority, in the track that is furthest to the right).
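The priority rule in point 2 can be sketched as a simple scan: walk the tracks left to right and the effect columns left to right, letting later values overwrite earlier ones, so the rightmost command wins. A toy illustration (Python, with a made-up line representation; ZTxx amounts are hexadecimal BPM values, ZLxx hexadecimal LPB values):

```python
def resolve_line(tracks):
    """tracks: list (left to right) of tracks, each a list of effect
    columns, each column an (effect, hex_amount) pair.
    Returns the (bpm, lpb) commands that win on this line:
    the rightmost value in the rightmost track takes priority."""
    bpm = lpb = None
    for columns in tracks:              # later tracks override earlier ones
        for effect, amount in columns:  # later columns override earlier ones
            if effect == "ZT":
                bpm = int(amount, 16)   # ZTxx: set BPM (hex)
            elif effect == "ZL":
                lpb = int(amount, 16)   # ZLxx: set LPB (hex)
    return bpm, lpb

# Two tracks both set BPM; the rightmost one (ZT8C = 140 BPM) wins
bpm, lpb = resolve_line([[("ZT", "78")], [("ZT", "8C"), ("ZL", "04")]])
# bpm == 140, lpb == 4
```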

The objective is to create a tool that obtains all the durations of all the patterns, individual and accumulated. But I’m afraid this is impossible to do.

As far as I have come, I have built a reasonably fast tool, but it only considers the changes from the ZTxx effects (BPM value) throughout the song (it actually also tracks all the ZLxx effects (LPB value)).

But when it comes to involving the BPM values from the automation editor, there does not seem to be any reasonable way. The most unfavorable scenario is 1000 patterns of 512 lines each. The tool should calculate everything in a reasonably short time; 10 to 20 seconds would, I think, be quite reasonable.

I think all this is impossible to do. If someone wants to think about it, leave your comments here.

The only way this could be calculated is for the API to offer the exact BPM value of each line, a value that depends on its ZTxx effect and on the BPM ramps in the automation editor.

It would imply having a record of:

1. `renoise.song():pattern(p_idx):track(t_idx):line(l_idx).bpm_value`
2. `renoise.song():pattern(p_idx):track(t_idx):line(l_idx).lpb_value`
3. `renoise.song():pattern(p_idx):track(t_idx):line(l_idx).tpl_value`

Renoise does not have a vertical timeline in the sequence (left panel) where a cumulative time clock is shown per pattern (let alone per line). And unfortunately there is no way to create a tool for this purpose, as far as I know.

Still other parameters exist that influence the duration of the song, and therefore all of the accumulated times at each line. This complicates things even further.

Has anyone ever tried something like this (create a tool that calculates the accumulated times of the song with all these factors)?

The quickest way, without resorting to complex math and extensive song scanning, would be to render the song with all tracks muted except one (for speed). You could even render at the lowest bit depth/sample rate in case that is faster still. Put some simple generated sample at the start of each pattern to mark the pattern change.

The maths for analyzing the rendered sample would be quite simple. All this assumes the user presses some “calculate times” button; I wouldn’t run this kind of operation without the user knowing about it.
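Assuming the marker sample is a short, loud click at each pattern start, finding the pattern boundaries in the rendered audio amounts to a threshold scan. A sketch of that idea (Python; the threshold and the flat sample representation are my assumptions, not part of any Renoise API):

```python
def marker_times(samples, sample_rate, threshold=0.5):
    """Scan rendered audio for marker clicks: positions where the
    amplitude jumps above `threshold` after a quiet stretch.
    Returns the time in seconds of each marker (pattern start)."""
    times = []
    armed = True  # re-arm only after the signal drops back below threshold
    for i, s in enumerate(samples):
        if armed and abs(s) >= threshold:
            times.append(i / sample_rate)
            armed = False
        elif abs(s) < threshold:
            armed = True
    return times

# Toy signal at 10 Hz "sample rate": clicks at frames 0 and 5
# give pattern starts at 0.0 s and 0.5 s
starts = marker_times([1.0, 0, 0, 0, 0, 0.9, 0, 0, 0, 0], 10)
```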

In that scenario, do you think you would get the accumulated time for each line throughout the entire song with total precision?

Even so, the process would be very slow, wouldn’t it? If the song lasts 5 or 10 minutes and the process takes many seconds, the tool would no longer be worth it. The interesting thing would be to obtain this data quickly, because it would probably be useful precisely for changing the song duration during correction processes (time adjustment).

Yes, the idea is to press a “calculate times” button and wait for a reasonably fast process, but in a totally composed song.

The problem is the automation editor: obtaining its values to calculate the accumulated time is very difficult, if not impossible.

If the tool only depends on the ZTxx and ZLxx effects, the process of scanning the lines is very fast (3-4 seconds in a very long song (about 30 minutes) with a sixth generation i7).

It would give the same precision as song rendering does.

If you want to take automation into consideration I would use the approach I suggested for the reason you mention.

Rendering a silent track with 6 minute song duration takes about a second on my computer.

EDIT: If time per pattern is enough, rendering patterns separately and calculating the duration from sample size should be very fast.
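The arithmetic for that last step is just frame count divided by sample rate, accumulated per pattern. A small sketch (Python for illustration; the frame counts would come from the rendered files):

```python
def pattern_duration(frames, sample_rate):
    """Seconds a rendered pattern lasts, from its frame count."""
    return frames / sample_rate

def accumulated_times(frame_counts, sample_rate):
    """Cumulative time in seconds at the END of each pattern."""
    total, out = 0.0, []
    for frames in frame_counts:
        total += pattern_duration(frames, sample_rate)
        out.append(total)
    return out

# Three patterns rendered to 88200, 44100, and 22050 frames at 44100 Hz
# end at 2.0, 3.0, and 3.5 seconds respectively
ends = accumulated_times([88200, 44100, 22050], 44100)
```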

If rendering makes it possible to obtain the time each pattern lasts, and therefore the accumulated time at the end of each pattern, that would be enough. I would have to investigate.

That, and wait for @taktik to fix the clock (top right), because in certain cases it stops updating when you navigate between the lines.

Maybe an alternative approach, likewise ugly but a bit less obtrusive than rendering a sample:

1. temporarily set LPB to something very high
2. mute all tracks, disable any block looping, and make sure the number of sequences is greater than 1
3. play thru song
4. observe selected_sequence_index and store a timestamp each time it changes. How high can you set the LPB while still getting reliable results? Maybe it could be made very fast.

Calculate. Don’t forget to restore all settings and the edit song position.

Still, this would only give you time per pattern.