Land surface temperature (LST), a key parameter for many environmental studies, can be most readily estimated using thermal infrared (TIR) sensors onboard satellites. Accurate LST retrievals are contingent upon simultaneously accurate estimates of land surface emissivity (ε), which depend on the sensor viewing angle and on the anisotropy of the optical and structural properties of the surface. In the case of inorganic bare soils (IBS), there are still few data quantifying emissivity angular effects. The present work deals with the angular variation of TIR emissivity for twelve IBS types, representative of nine of the twelve soil textures found on Earth according to the United States Department of Agriculture classification. Emissivity was measured with a maximum error of ±0.01, in several spectral ranges within the atmospheric window 7.7–14.3 μm, at different zenith (θ) and azimuth (φ) angles. Results showed that ε of all IBS studied is almost azimuthally isotropic, and also zenithally isotropic up to θ = 40°, beyond which ε decreases with increasing θ. This decrease is most pronounced for quartz-rich sandy IBS, reaching a maximum decrease from the nadir value of 0.101 at θ = 70°. On the other hand, clayey IBS did not show a significant decrease of ε up to θ = 60°. A parameterization of the relative-to-nadir emissivity in terms of θ and the sand and clay percentages was established. Finally, the impact of ignoring ε angular effects on retrievals of LST using split-window-type algorithms, and on estimates of outgoing longwave radiation, was analyzed. Results showed systematic errors ranging from ±0.4 K to ±1.3 K for atmospheres with water vapor content below 4 cm in the case of LST, and errors between 2% and 8% in the estimation of different terms of the surface energy balance.
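The qualitative angular behavior summarized above can be sketched in code. The quadratic functional form and the coefficients `a` and `b` below are hypothetical placeholders chosen only to reproduce the reported trends (near-isotropy up to θ = 40°, a stronger off-nadir decrease for sandy, quartz-rich soils, and little decrease for clayey soils); they are not the fitted parameterization from this study.

```python
def relative_emissivity(theta_deg, sand_pct, clay_pct, a=1.2e-6, b=1.0e-6):
    """Illustrative relative-to-nadir emissivity, eps(theta)/eps(0).

    theta_deg : viewing zenith angle in degrees
    sand_pct, clay_pct : soil texture fractions in percent
    a, b : hypothetical coefficients, NOT the study's fitted values;
           they only encode the qualitative behavior described in the
           abstract (sandy soils drop ~0.1 at theta = 70 deg, clayey
           soils stay near 1.0).
    """
    # Nearly isotropic up to theta = 40 degrees.
    if theta_deg <= 40.0:
        return 1.0
    # Quadratic decrease beyond 40 degrees, driven by sand content
    # and damped by clay content; clamped so clay-dominated soils
    # show no decrease, as reported up to theta = 60 degrees.
    excess = theta_deg - 40.0
    drop = (a * sand_pct - b * clay_pct) * excess ** 2
    return 1.0 - max(drop, 0.0)
```

Under these assumed coefficients, a sandy soil (90% sand, 5% clay) loses roughly 0.09 of its nadir emissivity at θ = 70°, while a clayey soil (10% sand, 80% clay) remains at its nadir value, mirroring the contrast described in the abstract.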