A global-scale analysis of detections made at all 36 currently operating International Monitoring System (IMS) infrasound arrays confirms that the primary factor controlling signal detectability is the seasonal variability of the stratospheric zonal wind. At most arrays, ∼80% of the detections in the 0.2- to 2-Hz bandpass are associated with propagation downwind of the dominant stratospheric wind direction. Previous IMS infrasound network performance models neglect the time- and site-dependent effects of both stratospheric meteorological variability and ambient noise. In this study both effects are incorporated; we compare empirical and improved specifications of the stratospheric wind and include station-dependent wind noise models. Using a deterministic approach, the influence of individual model parameters on the network performance is systematically assessed. At frequencies of interest for detecting atmospheric explosions (0.2–2 Hz), the simulations predict that explosions equivalent to ∼500 t of TNT would be detected by at least two stations at any time of the year. The detection capability is best around January and July, when stratospheric winds are strongest, and poorest around the equinox periods, when the zonal winds weaken and reverse. The model predicts that temporal fluctuations in the ground-to-stratosphere meteorological variables generate detection threshold variations on daily and seasonal timescales of ∼50 and ∼500 t, respectively. While the strong zonal winds lead to an improvement in detection capability, their highly directional nature increases the location uncertainty owing to the decreased azimuthal separation of the detecting stations.
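The deterministic two-station detection criterion described above can be sketched in code. The following is an illustrative toy model only: the attenuation law, source-amplitude scaling, station ranges, and noise floors are all hypothetical placeholders and do not reproduce the study's actual propagation or noise models. It captures only the qualitative ingredients: downwind (stratospherically ducted) paths lose less energy than upwind paths, each station detects when the predicted amplitude exceeds its local noise floor, and the network detects when at least two stations do.

```python
import math

def signal_amplitude_pa(yield_t, range_km, downwind):
    """Toy received amplitude (Pa): scales with source yield and decays
    with range; the stratospheric duct (downwind) attenuates less than
    upwind propagation. All coefficients are hypothetical."""
    atten_db_per_km = 0.01 if downwind else 0.05  # placeholder losses
    source_pa = 10.0 * math.sqrt(yield_t)         # placeholder source term
    return source_pa * 10 ** (-atten_db_per_km * range_km / 20)

def network_detects(yield_t, stations, min_stations=2):
    """A station detects if its predicted amplitude exceeds its noise
    floor; the network detects if >= min_stations stations detect."""
    hits = sum(
        signal_amplitude_pa(yield_t, s["range_km"], s["downwind"]) > s["noise_pa"]
        for s in stations
    )
    return hits >= min_stations

# Hypothetical station geometry and wind-noise floors, not real IMS values.
stations = [
    {"range_km": 1500, "downwind": True,  "noise_pa": 0.01},
    {"range_km": 2500, "downwind": True,  "noise_pa": 0.02},
    {"range_km": 1200, "downwind": False, "noise_pa": 0.01},
]
print(network_detects(500, stations))  # 500-t-equivalent source
```

Varying the wind direction flags or noise floors seasonally in such a framework is one way to see why detection thresholds oscillate between solstice and equinox conditions.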