We investigate the light-trapping effect of dielectric nanoparticles embedded within the active semiconductor layer of a thin-film solar cell. The baseline model consists of a 1.0 µm slab of crystalline silicon on an aluminum back contact, topped with a 75 nm Si3N4 anti-reflective coating. Using finite-difference time-domain simulations, we calculate the absorption gain produced by a periodic array of SiO2 nanospheres as a function of their depth, diameter, and pitch. Under optimal conditions, the absorption gain from embedded spheres reaches 23.4% relative to the baseline geometry; with Au-core/SiO2-shell nanoparticles it reaches 30%. From these data we infer a set of design principles, including trade-offs among broadband scattering efficiency, weak absorption at long wavelengths, and displacement of semiconductor material. We also find that the optimal particle spacing is approximately 400 nm: above this distance each scatterer acts nearly in isolation from its neighbors, and the absorption gain scales approximately linearly with area coverage. Comparable gains are therefore expected for disordered as well as ordered arrays. These results demonstrate the potential of embedded dielectric nanoparticles for enhancing carrier generation in thin silicon solar cells. Copyright © 2011 John Wiley & Sons, Ltd.
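To make the "absorption gain" metric concrete, the following is a minimal sketch (not the authors' code) of how a relative gain figure such as the 23.4% above can be computed from two simulated absorptance spectra. The spectra and the flat photon-flux weighting here are synthetic placeholders; in practice one would use the FDTD-derived absorptance A(λ) for each geometry and a tabulated AM1.5 photon flux.

```python
import numpy as np

# Wavelength grid spanning the useful c-Si absorption range (hypothetical).
wavelengths = np.linspace(300e-9, 1100e-9, 200)  # metres

# Placeholder absorptance curves A(lambda) in [0, 1]; in the paper these
# would come from FDTD simulations of the baseline and nanosphere geometries.
baseline = 0.55 + 0.25 * np.exp(-((wavelengths - 500e-9) / 150e-9) ** 2)
with_spheres = np.clip(baseline * 1.2, 0.0, 1.0)  # stand-in for the enhanced cell

# Weighting by the incident photon flux; a flat proxy is used here in place
# of the tabulated AM1.5 spectrum.
photon_flux = np.ones_like(wavelengths)

def weighted_absorption(absorptance):
    """Photon-flux-weighted absorbed fraction (proportional to photocurrent)."""
    return np.sum(absorptance * photon_flux)

gain = (weighted_absorption(with_spheres) /
        weighted_absorption(baseline) - 1.0) * 100.0
print(f"absorption gain: {gain:.1f}%")  # prints "absorption gain: 20.0%"
```

With the synthetic 1.2× enhancement used here the gain is 20.0% by construction; the point of the sketch is only the definition of the gain as a ratio of flux-weighted integrated absorption.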