Polarization rotation losses in meteor burst communication systems are examined using a theoretical model developed for the purpose. The model accounts both for polarization changes due to ionospheric Faraday rotation and for the rotation of the wave polarization that occurs on scattering from underdense meteor trails. Two configurations are studied: linearly polarized systems, employing copolar transmitting and receiving antennas, and hybrid systems, employing a linearly polarized transmitting antenna and a circularly polarized receiving antenna. It is shown that, particularly in linearly polarized systems, polarization rotation may introduce unexpected diurnal performance variations in systems operating at frequencies near 40 MHz. For the two 40-MHz linearly polarized links investigated in detail, the model predicts that, under noon summer-solstice conditions during periods of high solar sunspot number, these losses reduce the data throughput to between 15 and 70% of the value expected had no polarization rotation occurred. Qualitative experimental confirmation of the predictions, obtained using a cross-polarization approach, is also presented.