Surface solar radiation declined by an estimated 7 W/m², or 4%, at sites worldwide from 1961 to 1990. Here I find that the strongest declines occurred at sites in the United States, amounting to 19 W/m², or 10%. The clear-sky optical thickness effect accounts for −8 W/m² and the cloud optical thickness effect for −18 W/m² over the three decades. If the observed increases in cloud cover frequency are added to the clear-sky and cloud optical thickness effects, the larger all-sky reduction in solar radiation in the United States can be explained. It is shown that solar radiation declined under cloud-free skies because of the reduction of the cloud-free fraction of the sky itself and because of the reduction of clear-sky optical thickness. Solar radiation exhibits no significant change under cloud-covered skies because reduced cloud optical thickness is compensated by increased frequencies of hours with overcast skies.
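The attribution described above can be sketched as a simple weighted decomposition of the all-sky mean into clear-sky and cloudy-sky contributions. The −8 and −18 W/m² optical thickness effects are taken from the text; the cloud fractions and baseline radiation levels below are hypothetical placeholders chosen only to illustrate the arithmetic, not values from the study.

```python
# Illustrative decomposition of an all-sky solar radiation change into
# clear-sky, cloudy-sky, and cloud-fraction contributions.
# Only the -8 and -18 W/m^2 effects come from the text; all other
# numbers are hypothetical placeholders.

def all_sky_change(f_cloud_old, f_cloud_new,
                   s_clear_old, s_cloud_old,
                   d_clear, d_cloud):
    """Change in mean all-sky radiation (W/m^2) when the cloudy fraction
    shifts from f_cloud_old to f_cloud_new while clear-sky and cloudy-sky
    radiation change by d_clear and d_cloud, respectively."""
    old = (1 - f_cloud_old) * s_clear_old + f_cloud_old * s_cloud_old
    new = (1 - f_cloud_new) * (s_clear_old + d_clear) \
          + f_cloud_new * (s_cloud_old + d_cloud)
    return new - old

# Hypothetical example: cloudy fraction rises from 0.50 to 0.55,
# clear-sky radiation falls by 8 W/m^2, cloudy-sky by 18 W/m^2.
delta = all_sky_change(0.50, 0.55, 250.0, 120.0, -8.0, -18.0)
print(round(delta, 1))  # -> -20.0
```

The point of the sketch is that the optical thickness effects alone do not reproduce the full all-sky decline; the shift of hours from the (brighter) clear-sky category to the (darker) cloudy category supplies the remainder.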