Using observations from a space-borne radiometer and a ground-based precipitation profiling radar, the impact of cloud microphysics schemes in the WRF model on the simulation of microwave brightness temperature (Tb), radar reflectivity, and Doppler velocity (Vdop) is studied for a winter storm in California. The distinct assumptions about particle size distributions, number concentrations, shapes, and fall speeds in the different microphysics schemes are implemented into a satellite simulator, and customized radar calculations are performed, to ensure a consistent representation of precipitation properties between the microphysics schemes and the radiative transfer models.
Simulations with four different schemes in the WRF model, including the Goddard scheme (GSFC), the WRF single-moment 6-class scheme (WSM6), the Thompson scheme (THOM), and the Morrison double-moment scheme (MORR), are compared directly with the sensor measurements. Results show large variations in the simulated radiative properties. Biases of ~20 K or larger are found in the (polarization-corrected) Tb, and these are linked to an overestimate of the precipitating ice aloft. The simulated reflectivity with THOM agrees well with the observations, whereas high biases of ~5−10 dBZ are found for GSFC, WSM6, and MORR, with MORR producing the largest peak reflectivity. These biases are attributable to the assumed snow intercept parameters or snow number concentrations. Simulated Vdop values based on GSFC agree well with the observations, while the other schemes show a ~1 m s-1 high bias in the ice layer. In the rain layer, the model representation of Doppler velocity varies among the sites.
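The link between the snow intercept parameter and the reflectivity bias can be illustrated with a minimal sketch. Assuming an exponential snow size distribution N(D) = N0 exp(−λD), spherical particles of fixed bulk density, and Rayleigh scattering with no dielectric correction (all simplifying assumptions not taken from the study itself), fixing the snow water content ties the slope λ to the intercept N0, so Z scales as N0^(−3/4): a scheme with a ten-fold smaller intercept yields exactly 7.5 dB higher reflectivity at the same snow water content, comparable to the ~5−10 dBZ spread reported above.

```python
import math

def reflectivity_dbz(swc, n0, rho=100.0):
    """Rayleigh reflectivity factor (dBZ) for an exponential snow PSD.

    N(D) = n0 * exp(-lam * D), with the slope lam fixed by the snow
    water content swc (kg m^-3) for spheres of bulk density rho (kg m^-3):
        swc = pi * rho * n0 / lam**4
    The sixth PSD moment gives the reflectivity factor
        Z = Gamma(7) * n0 / lam**7 = 720 * n0 / lam**7   (m^6 m^-3).
    Dielectric and non-spherical effects are deliberately omitted.
    """
    lam = (math.pi * rho * n0 / swc) ** 0.25   # slope parameter (m^-1)
    z = 720.0 * n0 / lam ** 7                  # reflectivity (m^6 m^-3)
    return 10.0 * math.log10(z * 1e18)         # convert to mm^6 m^-3, then dBZ

# Same snow water content (0.1 g m^-3), two illustrative intercepts:
# reducing n0 ten-fold (fewer, larger flakes) adds exactly 7.5 dB,
# since Z is proportional to n0**(-3/4) at fixed water content.
for n0 in (2e7, 2e6):  # intercept parameter (m^-4), illustrative values
    print(f"n0 = {n0:.0e} m^-4 -> {reflectivity_dbz(1e-4, n0):.1f} dBZ")
```

This is why scheme-to-scheme differences in the prescribed (or diagnosed) snow intercept alone can shift simulated reflectivity by several dB even when the snow mass fields agree.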