how to calibrate latency compensation for audio recording?

Anyone have tips on how to set the latency compensation setting for audio recording? I’m assuming it’s not as simple as setting it to the audio latency, since that doesn’t account for the input latency. Would be nice if there was an automatic way to calibrate this.

What is it you are trying to record? Latency is a really big problem, and afaik you are limited to things like a monitor mix, or if you’ve got the budget you can go for a Focusrite RedNet system or something like that.

When I record I try to set the ASIO buffer as low as possible… like 64 samples. Depending on the project, this is not always possible. Even at 64 samples you can really hear the latency… you have to play slightly ahead of the beat…

:slight_smile:

Not the greatest answer I know. Perhaps somebody will come along with something a little better.

Cheers

I think you can do this in Tracktion: you plug your output into your input, then run a test and it calibrates itself. Not sure about Renoise, though.

What most software reports as the “latency” isn’t really the true latency of the audio path (and definitely not the round-trip latency); it’s your audio output buffer size.

Audio output and input buffers will often be the same size, but they don’t strictly have to be. When playing to a pre-recorded track on the computer, the audio will go through the output buffer plus other delays (DAC conversion, signal propagation time, sound propagation from the source (speaker) to your ear), and then the audio recording will have similar delays on the way back in.

Out of these, the largest are probably going to be your buffer size and the time taken for the sound to travel from the speaker (unless you are using headphones). Remember sound waves travel at approximately one foot per millisecond (thanks atte). So twice the buffer size plus a millisecond per foot you are away from your sound source is probably a pretty decent starting point. DAC/ADC conversions and other propagation delays are likely to be close to negligible, but if you are using any external DSPs, an active crossover, a digital mixer etc. you might want to add a little extra on top of this value.
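Just to make that rule of thumb concrete, here’s a quick back-of-the-envelope sketch (not tied to Renoise or any driver, and the buffer size, sample rate and distance are only example values):

```python
# Rough starting point for latency compensation, as described above:
# twice the buffer size plus ~1 ms per foot of speaker-to-mic distance.
# All input values here are just examples -- plug in your own setup.

def estimate_compensation_ms(buffer_samples, sample_rate_hz, distance_feet=0.0):
    """Estimate a starting compensation value in milliseconds."""
    buffer_ms = buffer_samples / sample_rate_hz * 1000.0  # one buffer, in ms
    return 2.0 * buffer_ms + distance_feet * 1.0          # ~1 ms per foot of air

if __name__ == "__main__":
    # e.g. a 256-sample buffer at 44.1 kHz, mic 3 feet from the speaker
    ms = estimate_compensation_ms(256, 44100, distance_feet=3.0)
    print(f"Starting compensation: {ms:.1f} ms "
          f"(~{round(ms * 44100 / 1000)} samples at 44.1 kHz)")
```

With headphones you’d drop the distance term and just keep the two buffers, then nudge from there by ear.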

Thanks for the comments. One solution used in a few Android apps is to record the metronome playback through the microphone. Assuming it’s not too noisy, it should be pretty straightforward to infer/estimate the latency from the recorded signal. Would be nice if this could be done automatically, but in the meantime I’ll try doing it manually.
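For anyone wanting to do that measurement offline rather than by eye, here’s a rough sketch of the idea: cross-correlate the recording with the original click and take the lag of the peak as the latency. The file names and the numpy/scipy plumbing are assumptions on my part, not something any DAW provides:

```python
# Estimate latency by cross-correlating a mic recording with the click it captured.
# Assumes both files share a sample rate; resample first if they don't.
import numpy as np
from scipy.io import wavfile
from scipy.signal import correlate

rate_ref, reference = wavfile.read("metronome_click.wav")  # the click that was played back
rate_rec, recording = wavfile.read("mic_recording.wav")    # what the microphone captured
assert rate_ref == rate_rec, "resample first if the rates differ"

# Work with mono float signals
reference = reference.astype(np.float64)
recording = recording.astype(np.float64)
if reference.ndim > 1:
    reference = reference.mean(axis=1)
if recording.ndim > 1:
    recording = recording.mean(axis=1)

# The lag of the correlation peak is the delay of the recording vs. the reference
corr = correlate(recording, reference, mode="full")
lag_samples = int(np.argmax(corr)) - (len(reference) - 1)

print(f"Estimated latency: {lag_samples} samples "
      f"({lag_samples / rate_ref * 1000:.1f} ms)")
```

In a noisy room you’d want to average over several clicks, but for a one-off calibration a single clean click should get you close.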

One limitation of this is that I’ve noticed that for some apps the latency depends on the number of tracks in use. So the value you get from a clean experiment like this might not be applicable once your song starts getting more complex. I don’t know if this is an issue on OSX/Windows, but it might be difficult to get around if it is.

You obviously meant to say one foot per ms :blink:

Hahaha yeah, obviously. Thanks for the correction. I’ve not had a real full night’s sleep in almost two weeks now due to one nasty cough, and I think it’s starting to affect my brain function!! ;)