Many field experiments have observed significant temporal variations in the directionality of thermal infrared (TIR) emission, so this phenomenon must be explained quantitatively before the potential applications of directional remotely sensed TIR observations can be exploited. The main objective of this paper is to determine when and how the significant directional effect appears. Two models, TRGM and Cupid, are coupled to simulate the temporal variations of the directional brightness temperature TB(θ) of crop canopies, namely winter wheat and summer corn. Two indicators are defined: (1) ΔTB,AVG, the mean difference between the nadir TB(0) and off-nadir TB(55), and (2) ΔTB,STD, the standard deviation of TB(55) across view azimuth angles. Simulation results show that the highest ΔTB,AVG, up to 4°C, occurs mostly at midday (1200–1300 LT), whereas the lowest ΔTB,AVG occurs mostly in the early morning (0700–0800 LT) or late afternoon (1700–1800 LT). ΔTB,STD is about one third of ΔTB,AVG and should not be neglected, given its considerable value at around 1400 LT. This trend is confirmed by field measurements at both wheat and corn sites. The major factors driving the trend are identified through sensitivity analysis: soil water content, LAI, and solar radiation are the three most influential, while wind speed and air temperature affect ΔTB,AVG more strongly than air humidity does. Interestingly, ΔTB,AVG reaches its maximum when the LAI is around 0.8. Further analysis shows that ΔTB,AVG is related to the soil surface net radiation, a relationship that could be useful for net radiation estimation.
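The two indicators can be sketched in code. This is a minimal illustration of the definitions given above, not the paper's actual processing chain: the function name, the sample values, and the use of the sample standard deviation are all assumptions, and the real computation would average TB(0) − TB(55) over many observation times as well.

```python
import statistics

def directional_indicators(tb_nadir, tb55_by_azimuth):
    """Illustrative computation of the two directional-effect indicators.

    tb_nadir: nadir brightness temperature TB(0), in °C (hypothetical value).
    tb55_by_azimuth: TB(55) readings at several view azimuth angles, in °C.
    """
    # Mean off-nadir brightness temperature over the sampled view azimuths
    tb55_mean = statistics.mean(tb55_by_azimuth)
    # ΔTB,AVG: difference between the nadir and mean off-nadir TB
    delta_tb_avg = tb_nadir - tb55_mean
    # ΔTB,STD: azimuthal spread of TB(55) (sample standard deviation assumed)
    delta_tb_std = statistics.stdev(tb55_by_azimuth)
    return delta_tb_avg, delta_tb_std

# Hypothetical midday readings (°C) over a sparse canopy
avg, std = directional_indicators(32.0, [28.5, 27.8, 28.9, 27.3, 28.0])
```

For the sample values above, the hotter nadir view yields a positive ΔTB,AVG of a few °C, consistent with the midday maximum reported in the simulations.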