The ability of six scan strategies for scanning cloud radar to reconstruct cumulus cloud fields for radiation studies is assessed. Using snapshots of clean and polluted cloud fields from large-eddy simulations, errors in both the liquid water path and the monochromatic downwelling surface irradiance at 870 nm of the reconstructed cloud fields are analysed. Errors introduced by radar sensitivity, the choice of radar scan strategy, the retrieval of liquid water content (LWC), and the reconstruction scheme are explored. Given an infinitely sensitive radar and a perfect LWC retrieval, domain-average surface irradiance biases are typically less than 3 W m⁻² µm⁻¹, corresponding to 5–10% of the cloud radiative effect (CRE). However, with a realistic radar sensitivity of −37.5 dBZ at 1 km, optically thin areas and cloud edges are difficult to detect because of their low radar reflectivity; in clean conditions, overestimates are of order 10 W m⁻² µm⁻¹ (~20% of the CRE), but in polluted conditions, where the droplets are smaller, this increases to 10–26 W m⁻² µm⁻¹ (~40–100% of the CRE). Drizzle drops are also problematic: if treated as cloud droplets, they lead to poor reconstructions and large underestimates of 20–46 W m⁻² µm⁻¹ in domain-average surface irradiance (~40–80% of the CRE). Nevertheless, a synergistic retrieval combining the detailed cloud structure obtained from scanning radar with the droplet-size information and cloud-base location gained from other instruments could make accurate solar radiative transfer calculations in broken cloud possible for the first time.
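As an aside on why faint cloud regions escape detection: a radar's minimum detectable reflectivity rises with range, so the quoted −37.5 dBZ sensitivity at 1 km is less favourable farther from the instrument. A minimal sketch, assuming the common range-squared sensitivity loss (the function name is illustrative; only the −37.5 dBZ at 1 km figure comes from the text):

```python
import math

def min_detectable_dbz(range_km, dbz_at_1km=-37.5):
    """Minimum detectable reflectivity at a given range, assuming
    sensitivity degrades with range squared (adds 20*log10(range) dB).
    The -37.5 dBZ calibration at 1 km is taken from the text."""
    return dbz_at_1km + 20.0 * math.log10(range_km)

# Optically thin cloud edges with reflectivity below this threshold
# at a given range go undetected by the scan:
for r in (1.0, 2.0, 5.0, 10.0):
    print(f"{r:5.1f} km -> {min_detectable_dbz(r):6.1f} dBZ")
```

Under this assumption the threshold worsens by 20 dB per decade of range, e.g. −17.5 dBZ at 10 km, which is one reason low-reflectivity cloud edges are missed in the reconstructions.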