Error in Calculating Lag Due to In-Band Noise

The pulse is assumed to have f0 = 100 MHz, df = 5 MHz, and a peak amplitude of unity. Random noise with the same bandwidth and an amplitude of 1/SNR is added.
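A minimal sketch, in Python with NumPy/SciPy (not the code used to produce these results), of how such a noisy pulse could be synthesized: a Gaussian-enveloped carrier at f0, with an envelope width chosen to give roughly the stated bandwidth, plus band-limited noise scaled so its peak amplitude is 1/SNR. The envelope definition, filter order, and noise scaling are assumptions made for illustration.

    import numpy as np
    from scipy.signal import butter, filtfilt

    def noisy_pulse(snr, f0=100e6, df=5e6, dt=1e-9, n=1024, t0=None, seed=None):
        """Gaussian-enveloped carrier at f0 plus band-limited noise of amplitude 1/snr."""
        rng = np.random.default_rng(seed)
        t = np.arange(n) * dt
        if t0 is None:
            t0 = t[n // 2]
        # envelope width ~ 1/bandwidth, so the pulse bandwidth is roughly df (an assumption)
        sigma = 1.0 / (2.0 * np.pi * df)
        pulse = np.exp(-0.5 * ((t - t0) / sigma) ** 2) * np.cos(2.0 * np.pi * f0 * (t - t0))
        # band-limit the noise to f0 +/- df and scale its peak to 1/snr (an assumption)
        b, a = butter(4, [(f0 - df) * 2.0 * dt, (f0 + df) * 2.0 * dt], btype="band")
        noise = filtfilt(b, a, rng.standard_normal(n))
        noise *= (1.0 / snr) / np.max(np.abs(noise))
        return t, pulse + noise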

Pulses with SNR of 100 (top) and infinity (bottom).

Pairs of these noisy pulses are constructed with a lag of exactly 10.25 ns and a sampling interval of 1 ns. The lag between them is calculated by cross-correlation (Menke's ah_timediff3 program, which determines the lag to better than one sample).
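ah_timediff3 itself is not reproduced here; the sketch below shows one common way of obtaining sub-sample lags from a cross-correlation, by fitting a parabola to the three samples around the correlation peak. The function name and the parabolic refinement are illustrative assumptions, not a description of ah_timediff3's algorithm.

    import numpy as np

    def xcorr_lag(x, y, dt):
        """Estimate the lag of y relative to x (positive if y is delayed) by
        cross-correlation, refined to sub-sample precision with a parabolic fit."""
        c = np.correlate(y, x, mode="full")          # peaks where y matches a delayed x
        lags = np.arange(-(len(x) - 1), len(y))      # integer lags, in samples
        k = int(np.argmax(c))
        # three-point parabolic interpolation around the discrete maximum
        if 0 < k < len(c) - 1:
            denom = c[k - 1] - 2.0 * c[k] + c[k + 1]
            frac = 0.5 * (c[k - 1] - c[k + 1]) / denom if denom != 0 else 0.0
        else:
            frac = 0.0
        return (lags[k] + frac) * dt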

The statistics of the lag for 1000 realizations of such pairs are examined for SNRs of 100, 10, and 5. The SNR=100 case has a mean lag of 10.249 ns and a standard deviation of 0.02 ns; SNR=10 has a mean lag of 10.248 ns and a standard deviation of 0.091 ns; and SNR=5 has a mean lag of 10.274 ns and a standard deviation of 0.32 ns. However, these standard deviations ignore "cycle errors", which are insignificant for SNR=100 but account for about 30% of the realizations for SNR=10.
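A hedged sketch of how such an ensemble could be assembled, reusing the noisy_pulse and xcorr_lag sketches above; the criterion used here to flag cycle errors (an error larger than half a carrier period) is an assumption and need not match the one behind the numbers quoted above.

    import numpy as np

    def lag_statistics(snr, true_lag=10.25e-9, dt=1e-9, ntrials=1000, f0=100e6):
        """Monte Carlo mean and standard deviation of the estimated lag,
        excluding realizations flagged as cycle errors (assumed threshold:
        half a carrier period)."""
        lags = np.empty(ntrials)
        for i in range(ntrials):
            t, p1 = noisy_pulse(snr, dt=dt, seed=2 * i)
            _, p2 = noisy_pulse(snr, dt=dt, t0=t[len(t) // 2] + true_lag, seed=2 * i + 1)
            lags[i] = xcorr_lag(p1, p2, dt)
        cycle_error = np.abs(lags - true_lag) > 0.5 / f0
        good = lags[~cycle_error]
        return good.mean(), good.std(), cycle_error.mean()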

Given that we want an accuracy of about 1.5 cm, and that the speed of the radio waves is about 15 cm/ns, we need 0.1 ns timing. That implies an SNR of about 10 (or better).
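As a quick check of this arithmetic against the standard deviations quoted above:

    # required timing precision for 1.5 cm accuracy at ~15 cm/ns wave speed
    dt_required = 1.5 / 15.0                 # cm / (cm/ns) = 0.1 ns
    # standard deviations of the estimated lag from the runs above (ns)
    sigma = {100: 0.02, 10: 0.091, 5: 0.32}
    meets_spec = {snr: s <= dt_required for snr, s in sigma.items()}
    # -> {100: True, 10: True, 5: False}, i.e. an SNR of about 10 or better is needed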