Since the first satellites entered Earth orbit in the late 1950s and early 1960s, the influences of solar and geomagnetic variability on the satellite drag environment have been studied and parameterized in empirical density models with increasing sophistication. However, only within the past 5 years has the realization emerged that "troposphere weather" contributes significantly to the "space weather" of the thermosphere, especially during solar minimum conditions. Much of the attendant variability is attributable to upward-propagating solar tides excited by latent heating due to deep tropical convection and by solar radiation absorption, primarily by water vapor and ozone in the stratosphere and mesosphere, respectively. We know that this tidal spectrum significantly modifies the orbital (>200 km) and reentry (60–150 km) drag environments, and that these tidal components induce longitude variability not yet emulated in empirical density models. Yet, current requirements for improvements in orbital prediction make clear that further refinements to density models are needed. In this paper, the operational consequences of longitude-dependent tides are quantitatively assessed through a series of orbital and reentry predictions. We find that in-track prediction differences incurred by tidal effects are typically of order 200 ± 100 m for satellites in 400-km circular orbits and 15 ± 10 km for satellites in 200-km circular orbits for a 24-hour prediction. For an initial 200-km circular orbit, surface impact differences of order 15° ± 15° latitude are incurred. For operational problems with similar accuracy needs, a density model that includes a climatological representation of longitude-dependent tides should significantly reduce errors due to this source.
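The scaling behind in-track differences of this kind can be sketched with a back-of-the-envelope calculation: an unmodeled density error perturbs the drag decay rate of the semimajor axis, the resulting mean-motion error accumulates, and the along-track displacement grows quadratically with prediction time. The following minimal Python sketch uses the standard circular-orbit decay rate da/dt = -B ρ v a (B = C_d A/m); the ballistic coefficient, density, and 10% tidal density variation below are illustrative assumptions, not values taken from this paper.

```python
import math

MU = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6     # mean Earth radius, m

def in_track_error(alt_m, rho, drho_frac, bc, t_s):
    """Approximate along-track error (m) after t_s seconds for a circular
    orbit, given an unmodeled fractional density error drho_frac.

    The error in the decay rate is d(da/dt) = B * (drho_frac * rho) * v * a;
    the induced mean-motion error integrates to a phase error, giving an
    along-track displacement s(t) ~ (3/4) * n * |d(da/dt)| * t^2.
    """
    a = R_EARTH + alt_m
    n = math.sqrt(MU / a**3)                     # mean motion, rad/s
    v = math.sqrt(MU / a)                        # circular speed, m/s
    dadt_err = bc * (drho_frac * rho) * v * a    # decay-rate error, m/s
    return 0.75 * n * dadt_err * t_s**2

# Assumed values: B = 0.01 m^2/kg, 10% tidal density variation,
# rho(400 km) ~ 1e-12 kg/m^3 (quiet conditions) -- illustrative only.
err = in_track_error(400e3, 1e-12, 0.10, 0.01, 86400.0)
print(f"24-h in-track difference at 400 km: ~{err:.0f} m")
```

With these assumed inputs the sketch yields a few hundred meters over 24 hours, the same order of magnitude as the 200 ± 100 m figure quoted above; the actual study derives its numbers from full orbital and reentry predictions, not from this scaling.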