Having an accurate method to estimate and remove ionospheric effects is a major issue for low-frequency radio astronomy arrays, as the ionosphere is one of their largest error terms. One way to estimate the ionosphere is to measure total electron content (TEC) using dual-frequency global positioning system (GPS) signals. This technique exploits the dispersive nature of the ionosphere: both group and phase velocities depend (to first order) on the inverse square of the frequency and on TEC. Using these properties, TEC can be measured to a high degree of accuracy by computing the delay difference between signals at GPS's two frequencies (L1 = 1575.42 MHz and L2 = 1227.6 MHz). Unfortunately, effects other than ionospheric dispersion also introduce differential delays. These additional delays, called biases, can be separated into those introduced by the satellite and those introduced by the receiver. Receiver biases show the most significant variations, sometimes over intervals of hours. Changing temperature conditions at the receiver antenna, along the cable, or in the internal receiver hardware are thought to be responsible for some of these variations. We report here on an investigation of the temperature dependence of the GPS receiver bias. Our results show that for our particular receiver, antenna, and cable set-up, a temperature-dependent bias is clearly evident, and that this temperature dependence varies from receiver to receiver. When the receiver bias temperature dependence is removed, a noise level of 1–3 TEC units still remains in the bias estimation.
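The dual-frequency relation described above can be sketched numerically. The snippet below (an illustration, not code from this work) uses the standard first-order dispersion relation, group delay = 40.3 TEC / (c f²), to convert an L2-minus-L1 differential group delay into slant TEC; the function names are ours, and any real processing chain would also have to handle the satellite and receiver biases discussed in the text.

```python
# Standard physical constants for the first-order ionospheric group delay.
C = 299_792_458.0   # speed of light, m/s
K = 40.3            # first-order ionospheric constant, m^3/s^2
F_L1 = 1575.42e6    # GPS L1 carrier frequency, Hz
F_L2 = 1227.60e6    # GPS L2 carrier frequency, Hz

def tec_from_differential_delay(dt_seconds: float) -> float:
    """Slant TEC (electrons/m^2) implied by an L2-minus-L1 group delay.

    Inverts dt = (K * TEC / C) * (1/F_L2**2 - 1/F_L1**2), i.e. the
    difference of the per-frequency delays K*TEC/(C*f^2).
    """
    return C * dt_seconds / (K * (1.0 / F_L2**2 - 1.0 / F_L1**2))

def to_tecu(tec_electrons_per_m2: float) -> float:
    """Convert electrons/m^2 to TEC units (1 TECU = 1e16 electrons/m^2)."""
    return tec_electrons_per_m2 / 1e16

# A differential delay of ~3.5 ns corresponds to roughly 10 TECU, so the
# 1-3 TECU residual noise quoted above is at the sub-nanosecond delay level.
```

This also illustrates why receiver biases matter: an uncorrected differential hardware delay of even a fraction of a nanosecond maps directly into an apparent TEC offset of order 1 TECU.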