Studies in regions of extensive irrigation have revealed a significant influence of evaporative cooling on regional temperatures, a result of the redistribution of surface energy during evaporation. In the U.S. High Plains, maximum temperatures decreased during the last quarter of the 20th century. We investigated trends in evapotranspiration (ET, or latent heat) fluxes arising from expanding irrigation in the High Plains region from 1981 to 2008. We estimated actual ET (ETc) over the entire High Plains from spatial crop coefficients (Kc) and spatial reference (potential) ET (ETref). We proposed and validated a global linear relation between Kc and the Advanced Very High Resolution Radiometer (AVHRR)-based normalized difference vegetation index (NDVI). Our results show an increasing trend in ETc over the region during the last three decades. The study shows that the increase in ETc flux did not arise primarily from increased atmospheric evaporative demand; rather, it was due to a significant increase in irrigated area. The increase in ETc fluxes is likely a manifestation of increased partitioning of surface energy into latent heat and less into sensible heat. We investigated the evolution of full canopy cover vegetation (NDVI > 0.70) in relation to maximum temperature anomalies during the study period, and the results revealed a significant negative correlation between the two variables. These results appear to demonstrate a regional evaporative cooling signal due to extensive irrigation, which affects regional temperatures during the summer season.
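The estimation chain the abstract describes — a linear Kc–NDVI relation feeding the standard crop-coefficient form ETc = Kc × ETref — can be sketched as below. The slope and intercept values are hypothetical placeholders for illustration only, not the coefficients fitted and validated in the study.

```python
# Minimal sketch of the ETc estimation approach described above, assuming
# the standard crop-coefficient form ETc = Kc * ETref (as in FAO-56) and a
# linear Kc-NDVI relation. Slope/intercept are hypothetical placeholders.

def kc_from_ndvi(ndvi, slope=1.25, intercept=-0.10):
    """Linear crop-coefficient model: Kc = slope * NDVI + intercept."""
    return slope * ndvi + intercept

def actual_et(ndvi, et_ref):
    """Actual evapotranspiration: ETc = Kc(NDVI) * ETref (same units as ETref)."""
    return kc_from_ndvi(ndvi) * et_ref

# Full-canopy irrigated surface (NDVI > 0.70) vs. sparse cover, for the
# same reference ET of 7.0 mm/day: the irrigated pixel partitions far more
# energy into latent heat.
print(round(actual_et(0.80, 7.0), 2))  # dense, irrigated canopy
print(round(actual_et(0.40, 7.0), 2))  # sparse cover
```

With these placeholder coefficients, the dense-canopy pixel yields roughly triple the latent-heat flux of the sparse one, which is the mechanism behind the evaporative-cooling signal discussed in the abstract.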