The radio loss between two small dipoles located in a forest is examined in the frequency range of 2–200 MHz by characterizing the forest in terms of a dissipative slab backed by an imperfectly conducting ground. A careful derivation of the total radio loss L shows that it consists of four distinct constituents: (1) a basic loss L0 associated with the forest-air interface, (2) a separation loss Ls due to the vegetation above the antenna, (3) a wave-interference loss Li due to field reflections at the ground plane, and (4) an antenna input resistance loss Lr produced by the ground proximity. The functional variation of these losses is consistent with previously obtained results that viewed the field in terms of a lateral wave, which now affords a simple physical interpretation for each of the separate factors enumerated above. By choosing parameters describing typical forests, we show that the variation of the four constituents leads to a total loss whose behavior agrees with available experimental data and with previous theoretical considerations pertinent to high antennas. In contrast to these, the present investigation predicts, for low antennas, that vertical polarization is preferable to the horizontal one and that communication conditions may be further improved if the operating frequency is increased rather than decreased. The antenna height gain effect is also examined in detail, and it is shown to be strongly affected by the ground proximity in a manner that had not been recognized hitherto.
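A minimal sketch of the decomposition described above, assuming the four constituents are expressed in decibels so that they combine additively (the standard convention for path-loss budgets; the abstract itself does not state the combination rule explicitly):

```latex
% Total radio loss between the two dipoles, with each constituent
% in dB; the additive decibel form is an assumption consistent with
% the abstract's description of L as four separate factors.
%   L_0 : basic loss at the forest-air interface
%   L_s : separation loss from vegetation above the antenna
%   L_i : wave-interference loss from ground-plane reflections
%   L_r : antenna input-resistance loss from ground proximity
L \,[\mathrm{dB}] = L_0 + L_s + L_i + L_r
```

Equivalently, on a linear (power-ratio) scale the same decomposition would appear as a product of the four factors.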