A test of water vapor radiometer-based troposphere calibration using very long baseline interferometry observations on a 21-km baseline


  • R. P. Linfield,

  • S. J. Keihm,

  • L. P. Teitelbaum,

  • S. J. Walter,

  • M. J. Mahoney,

  • R. N. Treuhaft,

  • L. J. Skjerve


Simultaneous very long baseline interferometry (VLBI) and water vapor radiometer (WVR) measurements on a 21-km baseline showed that calibration by WVRs removed a significant fraction of the effect of tropospheric delay fluctuations in these experiments. From a comparison of the residual delay variations within scans and between scans, the total tropospheric contribution to the delay residuals for each of the three 5- to 20-hour sessions was estimated as 1%, 17%, and 10%, with the first value being very uncertain. The observed improvement in rms residual delay from WVR calibration during these three sessions was 4%, 16%, and 2%, respectively. The improvement is consistent with the estimated 2–3 mm path delay precision of current WVRs. The VLBI measurements of natural radio sources were conducted in April and May 1993 at Goldstone, California. Dual-frequency (2.3 and 8.4 GHz) observations were employed to remove the effects of charged particles from the data. Measurements with copointed WVRs, located within 50 m of the axis of each antenna, were performed to test the ability of the WVRs to calibrate line-of-sight path delays. Two factors made WVR performance assessment difficult: (1) the level of tropospheric fluctuations during these experiments was smaller than is typical for Goldstone, and (2) VLBI delay variations on longer timescales (i.e., over multiple scans) contained uncalibrated instrumental effects (probably a result of slow temperature variations in the VLBI hardware) that were larger than the tropospheric effects.
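The dual-frequency correction mentioned above relies on the ionospheric (charged-particle) delay being dispersive, scaling as 1/f², while the tropospheric delay is frequency-independent at these bands. A minimal sketch of the standard S/X linear combination follows; the function name, variable names, and the numerical values in the synthetic check are illustrative assumptions, not taken from the paper.

```python
# Dual-frequency (S/X band) charged-particle correction: the observed delay
# at each band is tau_nd + k / f**2, where tau_nd is the nondispersive
# (tropospheric + geometric) delay and k encodes the line-of-sight electron
# content. A weighted difference of the two bands cancels the k/f**2 term.

F_S = 2.3e9  # S-band observing frequency, Hz
F_X = 8.4e9  # X-band observing frequency, Hz

def ionosphere_free_delay(tau_s, tau_x, f_s=F_S, f_x=F_X):
    """Return the charged-particle-free delay from S- and X-band delays."""
    return (f_x**2 * tau_x - f_s**2 * tau_s) / (f_x**2 - f_s**2)

# Synthetic check (illustrative values): a 10 ns nondispersive delay plus a
# dispersive term with an arbitrary constant k.
tau_nd = 10e-9          # nondispersive delay, s
k = 0.5e9               # dispersive constant, s*Hz^2 (arbitrary)
tau_s = tau_nd + k / F_S**2
tau_x = tau_nd + k / F_X**2
print(ionosphere_free_delay(tau_s, tau_x))  # recovers the 10 ns delay
```

Substituting the two observed delays into the combination shows the dispersive term cancels exactly: (f_x²·τ_x − f_s²·τ_s) = τ_nd·(f_x² − f_s²) + k − k, so the quotient is τ_nd regardless of k.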