The audio buffers supplied by PortAudio should have timestamps. If the computer's clock is synchronized with NTP, these timestamps should be very accurate, more accurate than a typical quartz watch.
So it should be possible to use these timestamps to calibrate tg: dividing the number of samples received by the elapsed time between their timestamps gives the actual sampling rate.
This can be done continuously, so that as the sampling rate changes with temperature the calibration changes too.
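Here is a minimal sketch of that computation against PortAudio's C callback API. The `calib_state` struct and the all-time running average are illustrative assumptions, not tg's actual code; in practice one would use a windowed or exponentially weighted estimate so the rate can track temperature drift.

```c
#include <portaudio.h>

/* Illustrative calibration state; these names are hypothetical, not tg's. */
typedef struct {
    PaTime start_time;     /* timestamp of the first buffer, in seconds */
    double total_frames;   /* frames received before the current buffer */
    double measured_rate;  /* estimated actual sampling rate, in Hz */
} calib_state;

static int calib_callback(const void *input, void *output,
                          unsigned long frameCount,
                          const PaStreamCallbackTimeInfo *timeInfo,
                          PaStreamCallbackFlags statusFlags,
                          void *userData)
{
    calib_state *cs = (calib_state *)userData;
    PaTime now = timeInfo->inputBufferAdcTime;

    /* Some host APIs report 0 here, meaning no real timestamp is
     * available (the first requirement below). */
    if (now > 0) {
        if (cs->total_frames == 0) {
            cs->start_time = now;  /* first timestamped buffer: set epoch */
        } else if (now > cs->start_time) {
            /* Frames received so far, divided by the timestamped interval
             * they span, is the hardware's actual sampling rate. */
            cs->measured_rate = cs->total_frames / (now - cs->start_time);
        }
        cs->total_frames += frameCount;
    }

    /* ... hand `input` on to the normal beat-detection processing ... */
    (void)input; (void)output; (void)statusFlags;
    return paContinue;
}
```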
For this to work, it requires:

- PortAudio's timestamps come from the computer's system clock, rather than PortAudio synthesizing them from the number of samples received since the previous timestamp. On Linux, ALSA does provide real timestamps, generated in the kernel by the audio driver's interrupt code; PortAudio would need to use these (see the ALSA sketch after this list).
- The system clock is more accurate over the τ of interest than the audio hardware's clock. With NTP, even against an internet-based NTP server, this should be the case.
- There is no source of error (over the appropriate τ) upstream of the point where the audio is timestamped. An example would be a USB audio device's microcontroller buffering data between isochronous USB frames, which introduces delay that cannot be measured from the computer's side of the USB interface. USB audio devices shouldn't do this.
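For reference, the kernel-generated ALSA timestamps mentioned in the first point can be read directly with alsa-lib, roughly as below. This bypasses PortAudio to show what the layer underneath offers; whether PortAudio's ALSA host API actually propagates these into `PaStreamCallbackTimeInfo` is exactly what would need checking.

```c
#include <stdio.h>
#include <alsa/asoundlib.h>

/* Ask the driver to timestamp the stream (error handling omitted). */
static void enable_timestamps(snd_pcm_t *pcm)
{
    snd_pcm_sw_params_t *sw;
    snd_pcm_sw_params_alloca(&sw);
    snd_pcm_sw_params_current(pcm, sw);
    snd_pcm_sw_params_set_tstamp_mode(pcm, sw, SND_PCM_TSTAMP_ENABLE);
    snd_pcm_sw_params(pcm, sw);
}

/* Print the driver's high-resolution timestamp for the stream's
 * current position, as generated in the kernel. */
static void print_timestamp(snd_pcm_t *pcm)
{
    snd_pcm_status_t *status;
    snd_htimestamp_t ts;  /* a struct timespec */
    snd_pcm_status_alloca(&status);
    snd_pcm_status(pcm, status);
    snd_pcm_status_get_htstamp(status, &ts);
    printf("driver timestamp: %ld.%09ld s\n", (long)ts.tv_sec, ts.tv_nsec);
}
```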
xyzzy42 changed the title from "Feature Idea: Continuous auto-calibration using audio timestamps" to "[Feature] Continuous auto-calibration using audio timestamps" on Jan 26, 2021