Host resistance to parasites comes in two main forms: hosts may either reduce the probability of parasite infection (anti-infection resistance) or reduce parasite growth after infection has occurred (anti-growth resistance). Both resistance mechanisms are often imperfect, meaning that they do not fully prevent or clear infections. Theoretical work has suggested that imperfect anti-growth resistance can select for higher parasite virulence by favouring faster-growing and more virulent parasites that overcome this resistance. In contrast, imperfect anti-infection resistance is thought not to select for increased parasite virulence, because it is assumed to reduce the number of hosts that become infected without reducing the fitness of parasites in hosts that do become infected. Here, we develop a theoretical model to show that anti-infection resistance can in fact select for higher virulence when such resistance reduces the effective parasite dose that enters a host. Our model is based on a monarch butterfly–parasite system in which larval food plants confer resistance to the monarch host. We carried out an experiment showing that this environmental resistance is most likely a form of anti-infection resistance, through which toxic food plants reduce the effective dose of parasites that initiates an infection. We used these results to build a mathematical model to investigate the evolutionary consequences of food plant-induced resistance. Our model shows that when the effective infectious dose is reduced, parasites can compensate by evolving a higher per-parasite growth rate, and consequently a higher intrinsic virulence. Our results are relevant to many insect host–parasite systems, in which larval food plants often confer imperfect anti-infection resistance. Our results also suggest that – for parasites where the infectious dose affects the within-host dynamics – vaccines that reduce the effective infectious dose can select for increased parasite virulence.
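The compensation argument can be illustrated with a deliberately simple toy model. This is not the model developed in the paper; the logistic within-host growth, the linear virulence–growth-rate trade-off, and all parameter values below are illustrative assumptions. A parasite founding an infection with dose `d` grows logistically at per-parasite rate `r`; transmission is proportional to parasite load at time `T`, discounted by host survival under a virulence cost proportional to `r`. Lowering the founding dose shifts the fitness-maximising growth rate upward:

```python
import math

def fitness(r, dose, K=1000.0, T=10.0, c=0.05):
    """Toy parasite fitness: transmission (proportional to load) x host survival.

    Assumed forms (illustrative only):
      - logistic within-host growth from initial dose:
          n(T) = K / (1 + ((K - dose)/dose) * exp(-r*T))
      - virulence (host mortality rate) proportional to growth rate r,
        so survival to transmission time T is exp(-c*r*T)
    """
    load = K / (1.0 + ((K - dose) / dose) * math.exp(-r * T))
    survival = math.exp(-c * r * T)
    return load * survival

def optimal_growth_rate(dose):
    """Grid search for the growth rate r that maximises toy fitness."""
    r_grid = [i / 1000.0 for i in range(1, 2001)]  # r in (0, 2]
    return max(r_grid, key=lambda r: fitness(r, dose))

r_full = optimal_growth_rate(100.0)    # unprotected host: full dose
r_reduced = optimal_growth_rate(10.0)  # anti-infection resistance cuts the dose
print(r_reduced > r_full)              # reduced dose favours faster growth
```

Under these assumptions the optimal `r` for the reduced dose exceeds that for the full dose, mirroring the abstract's claim that dose-reducing resistance can select for higher intrinsic virulence.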