There is no doubt that late medieval or, indeed, pre-industrial European society was prone to annual vagaries and oscillations of nature. It was the age of the ‘organic economy’, whose levels of production, and success or failure, depended much on the fertility of the land and the health of animals (both humans and domesticates). Harvest failures inevitably meant inflated crop prices and famine. Food shortages could stem from either grain scarcity (food availability decline, or ‘FAD’) or disruptions in the grain supply brought about by purely institutional or endogenous factors (food entitlement decline, or ‘FED’).2 Though local food shortages seem to have been commonplace throughout the middle ages, none is comparable in its scale and implications to the infamous Great European Famine of 1314/15–22, arguably the single worst agrarian and food crisis in northern and central Europe in the last two millennia. The torrential flooding of 1314–16 led to three back-to-back harvest failures (in England and Wales, composite grain yields of 1315, 1316, and 1317 stood at about 40, 60, and 10 per cent of their average levels, respectively).3 Although much remains to be studied about the demographic aspects of this catastrophe, there is a scholarly consensus that about 10–15 per cent of the European population died of hunger and hunger-related disease in these fatal years.4
Humans, however, were not the only victims of the inclement weather and the subsequent famine. In 1319, when it seemed that the crisis was more or less over, a new ecological disaster of unheard-of proportions hit the British Isles: bovine mortality on a panzootic scale. The pestilence was not enzootic to the British Isles; its pathogen seems to have originated in the Far East, perhaps in or around Mongolia in the late thirteenth century. The disease spread westwards into Europe; central Europe was ravaged by 1315; western Germanic lands, France, and the Low Countries were attacked in 1317–18; Denmark fell victim in 1318; and a year later the pathogen finally arrived in the British Isles, around Easter 1319.5 Essex appears to have been affected first, which implies that the pathogen was spread into England from Holland.6 In England, the mortality spread quickly in multiple directions; by the late summer of 1319, it reached the Scottish border; in the summer of 1320 it penetrated into Wales; Ireland was spared until 1321.7
As with the Black Death, the identity of the Great Bovine Pestilence remains enigmatic. So far, three ‘diagnoses’ have been suggested: anthrax, foot-and-mouth disease (FMD), and rinderpest.8 In most cases, however, these diagnoses are highly speculative, because their makers tended to disregard both the wider context and factual details of the murrain. Most recently, however, Newfield has convincingly argued in favour of rinderpest and the present study tends to adhere to this verdict.9
Scholars have not neglected this Great Bovine Pestilence. Until very recently, however, they have tended to consider it only in conjunction with the Great European Famine.10 Thanks to recent contributions by Newfield, however, the bovine pestilence can now be regarded as a crisis in its own right.11 Newfield studied the event mostly in narrative sources, while Slavin has recently drawn upon some manorial data to provide an introductory study of the fourteenth-century bovine pandemic and its economic consequences.12 The present article adds to the growing corpus of publications on medieval livestock pandemics, by looking in depth at the scale of the bovine pestilence in England and its biological and economic implications. The pestilence is set here in the larger context of the early fourteenth-century crisis. It is regarded as a missing link between the two other great ecological crises of the fourteenth century, the Great Famine and the Black Death. This vision follows the lines of Jordan's argument regarding the likely connection between the famine and the human plague in that century.13
On account of the remarkably vast corpus of extant English evidence for the plague, both statistical and narrative, England presents us with an ideal case study of this disaster. The study is based on a wide range of contemporary statistical sources, including over 3,000 manorial accounts spanning the years 1310–50. These are annual financial and agricultural reports, rendered by demesne officials (reeves, bailiffs, serjeants, and so on) and usually running from Michaelmas to Michaelmas (29 September). The accounts recorded, in great detail, annual patterns of livestock incomings (births, purchases, dues from tenants and transfers) and outgoings (deaths, sales, butcheries, transfers), and differentiated between different sexes and age groups. These extraordinary documents are truly without peer in late medieval Europe. It is thanks to them that a quantitative study of the cattle pestilence and its impact is possible.
Manorial accounts can reveal a great deal about both the mortality rates and mortality patterns within bovine stocks. In order to attain a reliable and robust picture, however, two conditions must be met. First, a large sample of manors should be analysed. Second, one should rely only on accounts covering the years during which the pestilence persisted in England. In the vast majority of cases, death rates can be calculated from accounts running from Michaelmas 1319 to Michaelmas 1320. In some cases, however, the figures are indicated in the rolls running between Michaelmas 1318 and Michaelmas 1319, meaning that the panzootic hit before late September of 1319. Since far from every demesne has a surviving account from the years in which the pestilence spread through England, we may, in many instances, obtain a general idea of the extent of the mortality by juxtaposing the accounts from the year preceding the disaster with those coming after the pestilence. While this methodology may be valid for assessing the impact of the mortality, it cannot, however, be applied in calculating mortality rates. For instance, an account from 1318/19 may state that there were, say, 50 oxen at the end of the account year, while an account from 1320/1 may indicate that there were only 20 oxen at the beginning of the account year. This does not necessarily mean, however, that 30 oxen died in the course of the 1319/20 year; it could also mean that a certain number of oxen were transferred elsewhere. Therefore, for the purpose of calculating mortality rates, only the pestilence year accounts are taken into consideration.
Also, in order not to over-inflate the mortality rates, the ‘base’ population, against which pestilence deaths have been measured, is reckoned as the total number of the cattle remaining from the previous years plus those received from purchases, graduations, transfers, and other means.14 A systematic search for, and tabulation of, these accounts allowed for the construction of a national sample, based on 165 demesnes. It is unlikely that many more will emerge in the course of time. Although the sample covers several regions, its overall geographic distribution is pronouncedly uneven, with a clear predominance of the southern and eastern counties. Thus, the two most represented counties are Hampshire and Kent (with 25 and 26 demesnes respectively); these are followed by Norfolk and Surrey (15 and 12 demesnes respectively). Other counties are endowed with fewer represented demesnes. Two factors, both institutional in their nature, stand behind this uneven distribution. First, manorialism was never fully developed in northern and western parts of the country. Freeholders or farmers in these regions did not, as a rule, keep their own accounts. Second, the best geographic coverage is closely associated with a particular lordship. Thus, the Hampshire and Kentish demesnes were owned by the Bishop of Winchester and Canterbury Cathedral Priory respectively, whose surviving accounts have remarkable chronological coverage, with very few gaps. Similarly, Norwich Cathedral manors have almost equally good coverage.
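The accounting convention described above can be sketched as a small calculation. This is a minimal illustration only: the figures and field names are invented, not those of any manuscript roll.

```python
# Minimal sketch of the mortality-rate convention described above.
# All figures are invented for illustration; field names are hypothetical.

def base_population(opening_stock, purchases, graduations, transfers_in):
    """The 'base' herd against which pestilence deaths are measured:
    animals remaining from the previous year plus all incomings."""
    return opening_stock + purchases + graduations + transfers_in

def mortality_rate(deaths, base):
    return deaths / base

# A hypothetical demesne: 40 oxen carried over, 6 bought, 4 graduated
# from the young-cattle cohort, 2 received from a sister manor.
base = base_population(40, 6, 4, 2)   # 52 animals at risk
rate = mortality_rate(32, base)       # 32 lost to the murrain
print(f"{rate:.1%}")                  # 61.5%, close to the national average
```

Counting all incomings in the denominator, rather than the opening stock alone, is what prevents the over-inflation of the rates mentioned above.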
As table 1 indicates, between 1319 and 1320, England and Wales lost about 62 per cent of its bovine population.15 This is an astonishingly high figure, with seriously negative implications. This pronounced mortality meant a drastic fall in manure resources and the available ploughing force, and also a significant decline in dairy produce. It should be noted that our accounts clearly differentiate between diseased animals that perished by themselves, and diseased animals that were killed by humans. The sudden arrival of the pestilence and the mass mortality of the cattle prompted the reeves to get rid of the potentially contagious and moribund animals. This gave rise to mass ‘panic sales’ of cattle in 1319–20, which are clearly distinguished from ordinary sales of healthy animals in the accounts. For this reason, it is hardly surprising that cattle prices were abnormally low in 1320, as we discuss later. For instance, 34 out of 36 diseased oxen were sold en masse for £10 4s. 6d. at Wisbech Barton (Cambridgeshire).16 In some select cases, manorial reeves decided to slaughter diseased cattle, perhaps in a conscious attempt to avoid further casualties. For instance, at Westerham (Kent), 27 bovids were slaughtered, ‘because they were likely to die’ (eo quod uellet mori).17 Similar butcheries were undertaken at Eltham (Kent) and Hendon (Middlesex).18 The language of the accounts reveals that the reeves were undoubtedly aware of the risk of murrain.19 However, such slaughters were rather exceptional, and in the majority of cases, the reeves and tenants stood helpless in the face of the murrain. Indeed, in the vast majority of fatalities (some 90 per cent), animals died without human intervention. The ‘panic sales’ accounted for a further 9 per cent, while ‘panic butcheries’ did not account for more than 1 per cent of all deaths.
The mortality figures with these panic sales and butcheries included should not appear too controversial, as there is no evidence that indicates that diseased animals ever recovered. Moreover, if the rinderpest virus were the cause we could expect a near 100 per cent mortality rate in the infected.20 It is highly implausible that the infected animals would have survived the pathogen, had they not been culled.
Table 1. Bovine mortality rates in England and Wales, 1319–20
| Percentile group | Animals before mortality | Animals dead in pestilence | Mortality rates | Sampled demesnes |
| --- | --- | --- | --- | --- |
Sixty-two per cent is merely an average figure. The scale of the mortality varied from demesne to demesne. Some manors, including West Wycombe (Buckinghamshire) and Gravesend (Kent), lost all of their bovids, while others, such as Loose (Kent) and Steeple (Dorset), were spared altogether.21 What accounts for these differences? A run of regressions, based on several variables, may provide some clues.
Table 2 hints that there was little or no correlation between most explanatory variables and the levels of mortality. Geographically speaking, the pathogen seems to have been as active in East Anglia as in the north. The type of lordship (whether religious or lay, monastic or episcopal) seems not to have had an impact on mortality rates. The presence of markets and fairs, indicating the volume of trade in and frequent movement of cattle, seems to have had an insignificant effect, as well. Thus, the manors of Culham and Wargrave, both in Berkshire, lost most of their bovine stocks, though the former did not have a market and the latter did. On the other hand, the manor of Langley, a major market in Hertfordshire, was spared by the pestilence. The manor of Sedgeford (Norfolk), located at a considerable distance from a market, did not, likewise, suffer any fatalities.
Table 2. Correlation between mortality rates and other independent variables
| Variable | Bovine mortality rates |
| --- | --- |
| Bovine units per 100 sown acres | −0.011 |
| Oxen units per 100 sown acres | −0.042 |
| Livestock units per 100 sown acres | −0.054 |
| Wheat yields, 1315–17 | −0.223 |
| Oat yields, 1315–17 | −0.423*** |
| Barley yields, 1315–17 | −0.588*** |
Similarly, the size and structure of bovine stocks had a meagre, if any, impact on the mortality rates. The demesnes of Millow (Bedfordshire), Dalton and Pittington (both in Durham), Loose (Kent), and Upton Knoyle (Wiltshire), which reared oxen, but no other cattle, were spared altogether. This may create a false impression that neutered stocks, consisting of oxen (castrated bulls), were saved, whereas sexually active, or mixed stocks, consisting of both oxen and non-draught cattle, were more prone to the pathogen, and that the pathogen may have been spread through sexual contact. In reality, the situation was more complicated. Wisbech Barton (Cambridgeshire), Houghall (Durham), East Meon Church (Hampshire), Ponteland (Northumberland), and Oakham (Rutland), which only possessed neutered bovids, lost all or nearly all their oxen. On the other hand, some manors consisting of mixed or cattle-biased stocks, such as Langley (Hertfordshire), Ham, Lower Halstow, and Lydd (all in Kent), suffered few or no fatalities.
Also, the pathogen discriminated neither by the overall size of bovid stocks nor by their physical density (as measured in the number of animals per 100 sown acres of arable land). Thus, Birdbrook (Essex) and Knoyle (Wiltshire) both reared very large bovine stocks, consisting of over 110 animals each. The former lost only 35.4 per cent of its bovids, while the latter lost nearly all of its animals. On the other hand, Anstey (Hampshire) and Kennington (Berkshire) stocked fewer than 10 oxen each.22 The former lost all but one ox, while the latter suffered no losses at all. Similarly, the physical density of bovids had a meagre impact on the mortality rates. Wisbech Barton (Cambridgeshire) had 9.2 bovids per 100 sown acres and Billingbear (Berkshire) stocked 150.8 bovids per 100 sown acres, yet both demesnes suffered over 90 per cent mortality rates. Many more examples could be supplied to support this point. As table 2 demonstrates, there was no correlation between the two variables.23
It was the degree of crop failure during the famine years of 1315–17 that seems to have been the most significant determinant of mortality, as table 2 suggests (with correlation coefficients of –0.453 for oat yields and –0.588 for barley). Since crop yields varied from place to place in ‘normal’, that is, non-famine years, depending on land fertility and managerial arrangements, the harvest failures are measured here in relative, rather than absolute terms (namely, the ratio between crop yields of 1315–17 and their indexed average for 1270–1400). It appears that there was a fair degree of inverse correlation between the two variables. In other words, in many cases high mortality rates were found on the demesnes where crop yields were notoriously low in 1315–17. Thus, at Woodhay, Burghclere, Ecchinswell (all in Hampshire), Wargrave (Berkshire), and Adderbury (Oxfordshire) composite crop yields were around 50 per cent of their non-famine levels. All these demesnes lost all or almost all of their bovine stocks. Brightwell (Berkshire), Wield, Alresford (both in Hampshire), Chartham, and Meopham (both in Kent), whose yields were not nearly as bad during the famine years, escaped heavy mortalities.24 The negative correlation between famine year harvests and bovine mortality rates seems to have been especially pronounced within the oat and barley sectors. This is hardly surprising, since these were regarded as both ‘drinking’ and ‘fodder’ crops, namely grains consumed by both humans and animals. There is a good deal of evidence that the agrarian crisis forced manorial managers to cut down oat (and in some cases barley) allowances to oxen.
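The relative-yield measure and the inverse correlation it produces can be sketched as follows. The five demesnes and all of their figures are invented for illustration; only the method (yield ratios correlated against mortality rates) follows the text.

```python
# Sketch of the relative-yield measure and its correlation with mortality.
# All demesnes and numbers below are invented for illustration only.
from statistics import mean

def relative_yield(famine_yield, long_run_mean):
    """Harvest shortfall in relative terms: the 1315-17 yield as a
    fraction of the demesne's indexed 1270-1400 average."""
    return famine_yield / long_run_mean

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical demesnes: poor famine harvests pair with high mortality.
rel_yields = [0.50, 0.55, 0.70, 0.85, 0.95]   # share of non-famine yield level
mortality  = [0.98, 0.90, 0.60, 0.35, 0.10]   # share of the herd lost
print(round(pearson(rel_yields, mortality), 3))  # strongly negative coefficient
```

Measuring yields as ratios to each demesne's own long-run average, rather than in bushels per acre, is what allows demesnes on land of very different fertility to be compared in one correlation.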
While one should not overestimate the overall contribution of the fodder grains (oats in particular) to the daily intake of oxen and cattle (on average, the share of these grains did not exceed 5–10 per cent of their total fodder composition), one should also bear in mind that fodder grains provided some vital nutrients for bovids, in particular oxen, whose sole function was draught work, namely ploughing and carting. Although the main consumers of oats were horses, there is no evidence of increased horse mortality during the crisis years. This is not surprising as the pathogen affected cattle alone.
Other fodder components included pasturage, hay, and straw. These constituted the predominant share of bovine dietary intake. It is a pity that very few accounts contain data on these types of forage and even fewer indicate annual fluctuations in their availability, in actual figures.25 It was not possible to trace a single account from the famine years that contains a forage section. There can be little doubt, however, that the depressed crop harvests reflect the depression within the pasturage sector of agriculture in the early fourteenth century, and in the famine years in particular. Fortunately, some select manorial accounts contain brief references to non-grain fodder resources, mostly in conjunction with the torrential rain, which destroyed much grass, both raw and dried, both growing and mowed. These accounts also indicate that, during the famine years, manorial authorities purchased very limited quantities of mowed meadow to feed oxen. Between 1310 and 1315, the manorial authorities of Bolton Priory (Yorkshire) spent, on average, £5 7s. on meadow. During the famine, the expenditure fell to 18s. 9d., which is to be ascribed to meadow shortage. For instance, in 1314 and 1315 the Bolton authorities purchased 18 acres of meadow in Berwick, while in 1316 they bought only two acres there. In 1317, no meadow was purchased at all. The inability to purchase sufficient amounts of meadow was dictated by high prices during the famine years. In 1316 mowed meadow was selling for as much as 6s. 8d. an acre, compared to 2s. 3d. between 1310 and 1315. In 1318 meadow prices were still high, with one acre selling for 4s. 5d.26 Also, various accounts from the Bishop of Winchester report a shortage of herbage. In 1315, the demesnes of Waltham St Lawrence, Culham, Waltham, and Billingbere sold very little or no herbage, while Wargrave, Downton, Mardon, and East Meon had trouble mowing their grass, on account of heavy rain. 
In 1316 and 1317 the situation was even worse.27 When peasants succeeded in mowing meadows, they did not necessarily manage to dry the hay. The almost-biblical flooding during the summer and autumn prevented meadow from drying and, as a result, doomed it to rot and become infected with germs and parasitic fungi. Several contemporary chronicles vividly, yet gloomily, narrate how oxen and cattle suffered from putrefaction of herbage, as much as humans suffered from the ruining of food.28 Similarly, some manorial accounts, such as those of the Bishop of Winchester, report a shortage of pasturage ‘because of great inundation’.29 There is also evidence that some hay was consumed by starving humans, rather than given to starving animals during the Great Famine.30 Heavy rain returned in the winter of 1319, as some contemporary chronicles and manorial accounts report. Some Winchester rolls indicate that the demesne animals could not pasture on account of the abundant downpour. The torrential rain of the 1310s, however, was a short-term event, a part of much longer climatic and environmental movements and contexts. Towards the end of the thirteenth century, we witness the first signs of a long-term climatic deterioration in the North Atlantic region, the initial phase of the Little Ice Age.31 This climatic change affected vegetation growth by shortening the growing, and consequently, the grazing season by one month, chiefly in upland regions. A shorter growing season naturally meant a decline in the production of biomass for animal grazing.32
Bad weather, climatic deterioration, and malnutrition may have all stood behind the weakening of England's bovine population. Exposed to colder temperatures and deprived of their most basic kinds of fodder, cattle had to spend more energy to maintain body temperature. This, in turn, is likely to have decreased the animals' resistance to pathogens within a very short period of time.33 Malnutrition also tends to delay physical growth in young cattle, chiefly the development of muscles. As a result, younger bullocks and heifers were more likely to have grown into sterile bulls, weak oxen, and aborting cows. Indeed, some accounts mention calf abortions in conjunction with the bovine crisis.34 It should be remembered that the growth period in cattle is significantly shorter than in humans and that a short period of intense dearth could have far-reaching consequences for the well-being of bovine stocks. Normally, it takes about 18 months for calves to develop into sexually mature animals. As such, a six-month period of deprivation and malnutrition is undoubtedly much worse for a bullock than a year-long deprivation in a young human. Interestingly, no evidence was found for increased bovine mortality before 1319–20 on the manors that suffered significant crop failures. Does this indicate that fodder deprivation alone (that is, without the intervention of the pathogen) was not enough to kill the animals? The connection between the poor harvests of 1315–17 and the cattle mortality of 1319–20 needs to be studied in greater detail. At present, the connection remains speculative.
While bovine mortality rates varied from manor to manor, depending, most likely, on the extent of the agrarian crisis experienced by the same manors, they also varied across different age- and sex-cohorts within the bovine stocks, as table 3 indicates. For the sake of accuracy, a distinction is made here between the number of animals that fell from murrain without human intervention, the deaths through murrain and butcheries, and the total losses of animals through all three modes, namely the pathogen, butchery, and ‘panic sales’.
Table 3. Death rates across different sex- and age-cohorts
| Animal cohort | Deaths through murrain | Deaths through murrain and butcheries | Total losses (murrain + butchery + ‘panic sales’) | As % of all bovids |
| --- | --- | --- | --- | --- |
| Observations (demesnes) | 142 | | | |
| Observations (animals) | 7,214 | | | |
It appears that the pathogen was relatively discriminative among different bovine cohorts. Mature cattle, consisting predominantly of cows (an average cows-to-bulls ratio would be around 12:1), seem to have been the most vulnerable.35 This is hardly surprising, given the physical peculiarities and life-cycle of a mature and fertile cow. Having reached her sexual maturity around the age of 14 months, a heifer, and later cow, spends nine months each year in pregnancy and lactation. During that period, the immune system of cows tends to be weaker than that of male cattle, and hence cows are more likely to be overtaken by pathogens than bulls, oxen, or immature cattle. As table 3 suggests, nearly 80 per cent of cows and 66 per cent of bulls perished in the pestilence. In other cohorts, death rates were somewhat lower. They amounted to some 60 per cent among young cattle. The ox stocks suffered the least, with mortality rates of around 54 per cent. Their distinct physiology likely accounts for this. We may speculate, then, that oxen, kept exclusively as working animals, tended to be stronger and less susceptible than non-working cattle. What is known, however, is that oxen were usually better fed than non-draught cattle in terms of crop allowance and fodder intake. Indeed, mortality rates of oxen were lower than those of cows, bulls, and immature cattle in ‘normal’ (non-murrain) years. On Norfolk manors, for instance, about 1 per cent of demesne oxen, nearly 2 per cent of cows and bulls, and over 4 per cent of calves died on an annual basis between the 1260s and 1430s, mostly from non-pestilential disease.36
How was the pathogen disseminated? Three possible channels can be detected. The first channel may have been trade in cattle. This is a most important, yet much understudied, aspect of the late medieval English economy. Though sometimes stated otherwise, late medieval stocks were never isolated, and there is a great deal of information about ongoing trade in cattle found in manorial accounts.37 One particular sector to be considered here was the trade in calves, prized for their tender and tasty meat (veal). Thus, between 1280 and 1370 in Norfolk, as much as 70 per cent of the annual calf issue was sold at local markets, while the remainder (net of tithe and death) was kept as replacement animals.38 The role of trade became especially crucial in the pestilence year, when an increasing number of lords and their reeves attempted to get rid of potentially infected and contagious animals, through ‘panic sales’. One paradox about the trade in cattle around that time concerns the lack of correlation between the degree of commercialization and mortality rates. While one would expect to find a particularly high number of fatalities in more commercialized areas, this is not the case. Thus, in Norfolk, arguably one of the most commercialized regions, and situated in reasonable proximity to the Continent, mortality rates actually tended to be much lower than in other counties.
The second channel of transmission was the inter-manorial transfer of oxen and cattle. Like the trade in cattle, the transfer of animals from one manor to another was commonplace in late medieval England. But it was also, first and foremost, an important form of collaboration between officials of different demesnes, belonging to the same lords, without which it would have been impossible to control vast networks of manors. In Norfolk, about 25 per cent of oxen and 15 per cent of cows were transferred from their herds elsewhere, on an annual basis, throughout the fourteenth century.39 Moreover, bulls had to be either transferred or castrated after about two years of sexually active life, to prevent inbreeding. Although in most cases cattle movements were strictly regional (cattle were often driven from a manor to a nearby market or fair), the volume of trade and transfer was high enough to ensure contact between healthy animals and their diseased counterparts.
The third mode of dissemination was ongoing warfare between England and Scotland. Although the link between war and epidemic disease should not be taken for granted,40 it is obvious that the warfare played an important role in facilitating the spread of the pathogen. First, a large number of oxen and cattle were sent to Berwick-upon-Tweed to provision English garrisons in August 1319. Beef was an important component of an English soldier's diet during the Scottish War of Independence, as indicated in surviving provision and purveyance accounts.41 On the other hand, bands of Scottish raiders conducted large-scale raids into the northern counties, burning fields, abducting captives, and carrying away oxen and cattle, as various contemporary sources demonstrate.42 It should be borne in mind that Scotland, much devastated by the war, depended on pastoral husbandry more than most parts of England.
The generally low mortality rates of bovids on late medieval demesnes in non-pestilence years would have made the Great Cattle Pestilence all the more extraordinary in the eyes of contemporaries. This is not to say that there were no other local outbreaks of bovine murrain in late medieval England. For instance, in 1285–6, there was an outbreak of disease on the bishop of Exeter's manor of Exminster, in Devon, which killed about one-third of its bovids.43 Similarly, between 1324 and 1330, there was a series of minor outbreaks, documented in Dorset, Monmouthshire, and Worcestershire.44 Here the mortality rates fluctuated between 10 and 30 per cent. Similarly, there were large-scale weather-induced mortalities of animals between 1437 and 1440, some of the coldest years of the entire medieval period. However, it was not until the outbreak of a pan-European cattle plague in the 1740s that we hear about an outbreak of comparable proportions.45
While lords and peasants lost their stocks within weeks, it took years, if not decades, to replenish them.46 Restocking was a long and painful process, with negative economic and environmental implications. Reconstructing this process is a much trickier task than calculating death rates, mainly because there are far fewer demesnes with consecutive runs of accounts surviving from the 1320s, 1330s, and 1340s, than those with accounts for 1318/19 and 1319/20 only. As a result, the overall sample is somewhat smaller and is confined to 86 demesnes, mostly belonging to seven major ecclesiastical landlords: the Bishop of Winchester, Canterbury Cathedral Priory, Westminster Abbey, Norwich Cathedral Priory, Glastonbury Abbey, the Bishop of Ely, and Bury St Edmunds Abbey. Furthermore, this sample is confined to the Home Counties, the south, and East Anglia, while other regions are poorly or not at all represented. Finally, with the exception of the royal demesne of Kennington (Berkshire), all the represented demesnes are ecclesiastical. Still, this is probably as close to an exhaustive sample of extant accounts for the 1320s–40s as one could possibly achieve.
Generally speaking, figure 1 indicates that lords did their best to restock their oxen first. Fourteenth-century England was a society dominated by grain consumption, whose basic dietary requirements could not be satisfied without sufficient draught power. Around 1320, the ox was still the single most important and numerous ploughing force. A prolonged shortage of oxen meant a prolonged reduction in aggregate grain production. By 1331, ox stocks reached about 85 per cent of their pre-pestilence levels. The 1318 figures, however, were never seen again; between 1331 and 1350, the overall numbers fluctuated between 75 and 85 per cent of the pre-murrain ones. One may wonder if this has to do with the expansion of horses in lieu of oxen. However, as discussed below, the expansion of the horse into traction was a temporary phenomenon; by the late 1320s horse numbers returned to their pre-1319 levels. Cattle, both mature and immature, marched at a different pace. In the 1320s, their growth seems to have been slow, at the expense of the expanding ox cohorts. From c. 1330, however, their annual growth was fast enough to reach about 90 per cent of their 1318 levels by 1337. Annual growth rates within young cattle cohorts were more significant and by 1334 young cattle cohorts achieved their pre-murrain level. The final years of the 1340s mark a remarkable rise in both ox and cattle populations on the demesne. This was owing to animals passed to lords from their deceased tenants, as heriots (inheritance/entry fines), during the Black Death. The chronology of the bovid replenishment seems especially significant when examined against annual price fluctuations of these animals. With the exception of the crisis year (1319/20), there is a clear inverse relation in the movement of the two variables. When the bovine population was notoriously low in the earlier 1320s, cattle prices remained high.
The gradual rise in their numbers is reflected in a piecemeal fall in their prices, which reached their nadir during the Black Death years, when surplus animals were available for both work and consumption. The unprecedentedly low prices in 1319/20 are to be explained both by the debilitated and sickly state of the animals, and by the fear and risk aversion of their potential buyers.
Figure 1. Bovine population and inverted prices in England, 1318–50 (indexed on 1318). Sources: Manorial accounts database (see tab. 1 sources); Farmer, ‘Prices and wages’, pp. 804–6.
Figure 1 provides general demographic trends, without paying attention to regional or structural variances. In reality, the figures varied from manor to manor, and from cohort to cohort. In some cases, the recovery occurred within several years; in others, it was never complete. This unequal distribution is indicated in table 4, which shows the recovery rates of both individual cohorts and entire stocks. Given that England lost about 10 per cent of its human population during the Great Famine, and that the complete replenishment of the bovine stocks may have seemed somewhat unrealistic to some lords, we may equate a ‘recovery’ with achieving about 90 per cent of the 1318 bovine population. Because of their small numbers, the bull cohorts were restocked faster than the others: in the vast majority of cases, within five years. This meant that demesne managers were in a position to find suitable animals to inseminate cows and heifers fairly soon after the murrain (in theory, a healthy bull is capable of servicing up to 50 cows, though so high a cow-to-bull ratio is not encountered in the accounts). This had favourable implications for the reproduction of herds after the panzootic. Other groups recovered at a slower pace. About 18 per cent of the sampled demesnes managed to replenish their ox herds within a five-year span, while only 9 per cent were able to restock all of their cattle within the same period. Although only 7 per cent of manors never regained their pre-pestilence numbers of non-working cattle, as many as one-quarter of all demesnes never attained their pre-1319 bovine stocks as a whole.
Table 4. Length of bovine stocks recovery in England, 1318–50
| Cohort | 0 years | 1–5 years | 6–10 years | 11–15 years | 16–20 years | 21–25 years | 26–32 years | Never | TOTAL |
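The recovery criterion used in table 4, the first account year in which a stock series regains roughly 90 per cent of its 1318 baseline, can be sketched as a small function. This is an illustrative reconstruction, not the authors' actual procedure, and the ox series below is invented.

```python
# First year in which a stock series regains a share of its baseline.
def years_to_recovery(series, baseline_year=1318, threshold=0.9):
    baseline = series[baseline_year]
    for year in sorted(y for y in series if y > baseline_year):
        if series[year] >= threshold * baseline:
            return year - baseline_year
    return None  # never recovered within the observed window

# Invented ox counts for a single demesne, indexed by account year.
oxen = {1318: 40, 1319: 12, 1320: 18, 1321: 30, 1322: 37}
print(years_to_recovery(oxen))  # 37 >= 0.9 * 40, so recovery in year 4
```

Manors in the ‘Never’ column of table 4 are those for which such a scan returns nothing within the observed window.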
Again, several variables stood behind these variances; they are correlated here against the speed of recovery, measured as the number of years that it took lords to replenish their stocks to 90 per cent of their pre-pestilence levels (table 5). As table 5 shows, with the possible exception of Norwich Cathedral Priory, there is no institutional impact on recovery levels. The figures varied greatly across the estates of both the Bishop of Winchester and Canterbury Cathedral Priory. Thus, it took Bishopstone (Hampshire) 31 years to restore its herds, while Harwell managed it in just three; both demesnes belonged to the bishop of Winchester. Similarly, Hollinbourne and Chartham, both in Kent and both owned by Canterbury Cathedral Priory, replenished their bovids within 32 and two years, respectively. The positive coefficient for Norwich Cathedral Priory suggests a slow recovery, but this reflects a systematic replacement of oxen with horses, which had begun before the panzootic, in the closing years of the thirteenth century.47 The degree of horse expansion had a somewhat pronounced impact on recovery rates. In some cases, there was a negative relation between the expansion of the horse herds and the time the bovine population took to recover: the vaster the expansion of horses, the quicker the replenishment of pre-murrain ox cohorts. The expansion of sheep, as the most immediate dairy alternative to cows, did not have any visible impact on recovery speed. Finally, the extent of arable acreage did seem to have a clearer effect, but only if measured as the ratio of post- to pre-pestilence acreage. The higher the ratio (that is, the smaller the reduction), the slower the restocking. This is hardly surprising, since larger proportions of arable land demanded larger herds, especially of oxen.
Table 5. Correlation between recovery speed and other independent variables
| Variable | Correlation coefficient |
| Bishop of Winchester | 0.01 |
| Canterbury Cathedral Priory | −0.072 |
| Norwich Cathedral Priory | 0.241*** |
| Bishop of Ely | −0.14 |
| Size of oxen, c. 1318 | 0.181** |
| Size of adult cattle, c. 1318 | −0.018 |
| Size of young cattle, c. 1318 | 0.078 |
| Size of cattle stocks, c. 1318 | 0.018 |
| Size of bovid stocks, c. 1318 | 0.146* |
| Mortality rates, bovids | 0.045 |
| Expansion of horses, 1320–5 | −0.204** |
| Expansion of sheep, 1320–5 | −0.087 |
| Sown acreage, 1318 | −0.077 |
| Sown acreage, 1320–5 | 0.013 |
| 1320–5 acreage as % of 1318 acreage | 0.287*** |
The size and structure of bovine herds on the eve of the pestilence had only a limited impact on the pace of restocking. For instance, Bishop's Stoke (Hampshire), which herded 98 bovid units in 1318, fully recovered within nine years.48 On the other hand, Cheam (Surrey) had 18 animals on the eve of the pestilence, yet never again reached this figure.49 Nor did mortality rates have any impact on replenishment speed.
How were the bovine herds replenished? Four main channels can be identified: inter-manorial transfer, trade, biological reproduction, and tenant dues (mostly heriots). Inter-manorial transfers of cattle usually originated from demesnes that had either been spared altogether or borne only light losses during the pestilence. Thus, in 1320 the reeve of Sedgeford (Norfolk), a demesne entirely spared by the pestilence, sent eight cows and six calves to the adjacent demesnes of Gnatingdon and Thornham. In addition, one cow was given ‘to the poor’.50
It is not surprising that the volume of transactions in the cattle trade was unusually high in the immediate aftermath of the murrain.51 For instance, at Usk (Monmouthshire), 58 head of cattle, out of a total of 167, were sold in 1320–1.52 On the Kentish estates of Canterbury Cathedral, the ‘panic sales’ of diseased cattle were followed by ‘panic purchases’ of healthy animals, mostly oxen, in the years following the disaster, so that on these Kentish demesnes ox stocks stood at their pre-1319 level by 1328.53 Similarly, Wisbech Barton (Cambridgeshire) restocked its ox cohorts by 1331, through extensive purchases.54 Trade in bovids, however, was undertaken on a strictly local or regional level. In most cases, oxen and cattle were purchased at nearby markets and fairs. This is not to say, however, that trans-regional trade was unheard of. For instance, in 1319 a certain John de Brekehulle was paid 2s. for driving no fewer than 200 head of cattle from (an unspecified location in) Wales to Chartley (Staffordshire), presumably to compensate for the murrain.55 The distance from Chartley to the nearest point in Wales (say, Llanymynech) would have been around 60 miles. This, however, seems to have been an exception to the rule. With the exception of the Danish-Dutch cattle trade, based around the North Sea basin, large-scale trans-national cattle commerce was largely unknown until the late fifteenth century.56 Similarly, large-scale trade in cattle between England and Wales did not develop until the fifteenth century.57
As an alternative strategy, biological reproduction had several drawbacks. First, with some rare exceptions, cows and heifers bear only one calf at a time. Second, cow pregnancy is relatively long, lasting around nine months. Third, cows and heifers normally have only three opportunities each year to become pregnant. Fourth, only some three-quarters of all cows tended to be fertile each year.58 Finally, as we have seen above, most demesnes suffered from a dearth of bulls capable of inseminating cows for five years after the panzootic. All these facts made the restocking process especially challenging. Thus, at Martham (Norfolk), the pestilence reduced the number of cows from 28 to eight, while the immature cattle fell from 11 to one (a yearling bullock). In the following year (1320–1), however, the remaining cows produced seven calves, which constituted about 73 per cent of the pre-murrain stock. By Michaelmas 1323, there were 12 young cattle in stock, one more than the 1318–19 figure. At the same time, there were still only 13 cows, less than half of their 1318–19 population.59 All the same, Martham was fortunate to have had one inseminating bull during these years. It should be noted that despite the animal shortage, reeves were still able to ‘control’ younger bovine populations by selling, transferring, and butchering immature animals.
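The arithmetic of these constraints can be made concrete with a toy projection. The 75 per cent calving rate is from the text; the even sex ratio of calves, first calving at age three, and the absence of deaths, sales, and transfers are simplifying assumptions of mine, so the result is a best-case floor on recovery time, not a reconstruction of any actual herd.

```python
# Years for a cow herd to regain a target size by reproduction alone.
def years_to_rebuild(cows, target, calving_rate=0.75, maturity=3):
    heifers = [0.0] * maturity            # female calves, by age cohort
    years = 0
    while cows < target:
        born = cows * calving_rate * 0.5  # half of calves are female
        cows += heifers.pop()             # oldest cohort joins the cows
        heifers.insert(0, born)           # newest cohort enters the queue
        years += 1
    return years

# Martham's cows fell from 28 to 8; even under these generous
# assumptions, rebuilding the herd takes the best part of a decade.
print(years_to_rebuild(8, 28))
```

Any mortality, heriot, or sale would lengthen this further, which is consistent with Martham still holding only 13 cows four years after the murrain.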
Heriot dues, usually paid in ‘the best beast’ (horse, ox, or cow), represented another channel of replenishment on manors with servile or customary peasants. Thus, in 1320–1, 33 bovids joined the stocks of the bishop of Winchester's demesnes through heriots.60 Overall, though, heriots contributed only marginally to the process of restocking.
The cattle pestilence of 1319/20 was a major biological catastrophe, occurring on an unprecedented scale and carrying harsh, long-term socio-economic and environmental implications. The loss of 55 per cent of oxen meant massive losses of draught animals, both as ‘tractors’ and haulers. The fact that almost 80 per cent of mature cattle perished in the pestilence meant that the population was now deprived of vital protein sources, found both in dairy products and beef. Finally, the reduction of younger cattle and calves by about 60 per cent meant a drastic reduction not only in veal but also in potentially reproductive animals. Moreover, all bovids were prized for their manure, a most essential fertilizing agent, which restores nitrogen to the soil and improves its fertility. Once these factors are considered, one can understand how the cattle plague represented a sharp fall in production levels and, consequently, an economic crisis with far-reaching consequences. Lords and their reeves, however, tried their best to avert that crisis and to ensure a steady recovery.61 The expansion of the non-bovine sector of animal husbandry, to pick up the slack, was the most obvious solution to the sudden vacuum of domesticates. The ox was replaced with the horse, while the cow's place could be taken by the ewe.
When calculating the ratio between more than one livestock group, it is customary to convert each group, or each cohort within each group, into its ‘livestock unit’ equivalent, based on the relative price of each animal. The ratios are as follows: horses = 1.00; oxen and mature cattle = 1.20; immature cattle = 0.80; sheep and pigs = 0.10; poultry = 0.02.62 In order to appreciate the impact of the pestilence on production levels, however, it is also necessary to convert the relevant livestock sectors, namely horses, oxen, cattle, and sheep, into their ‘output units’: their productive capacities relative to one another. Thus, one ox could perform approximately 0.67 of the work of one plough-horse.63 Therefore, our unit equivalents within the working sector would be 1.00 for horses and 0.67 for oxen. As far as the dairy sector is concerned, a mature ewe and a gimmer (a young ewe) would yield approximately one-tenth and one-fourteenth of a cow's milk respectively, while the ratio of milk production between a cow and a heifer would be approximately 1.00:0.75.64 Hence, our ‘output units’ would be 1.00 for cows, 0.75 for heifers, 0.10 for ewes, and 0.07 for gimmers. Dams (mother goats) are found on only one demesne (Downton, Hampshire), and since their contribution to the aggregate lactage output was meagre, they were not included in the sample.65
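These two conversion scales amount to simple weighted sums over head counts. The weights below are those given above; the inventories are invented for illustration.

```python
# Price-based 'livestock units' and capacity-based 'output units'.
LIVESTOCK_UNITS = {'horse': 1.00, 'ox': 1.20, 'mature cattle': 1.20,
                   'immature cattle': 0.80, 'sheep': 0.10, 'pig': 0.10,
                   'poultry': 0.02}
DRAUGHT_UNITS = {'horse': 1.00, 'ox': 0.67}
DAIRY_UNITS = {'cow': 1.00, 'heifer': 0.75, 'ewe': 0.10, 'gimmer': 0.07}

def total_units(inventory, scale):
    # Head counts weighted by the relevant unit equivalents.
    return sum(heads * scale[kind] for kind, heads in inventory.items())

plough_team = {'horse': 4, 'ox': 12}
print(round(total_units(plough_team, DRAUGHT_UNITS), 2))  # 4*1.00 + 12*0.67

dairy_herd = {'cow': 10, 'heifer': 4, 'ewe': 40}
print(round(total_units(dairy_herd, DAIRY_UNITS), 2))     # 10 + 3.0 + 4.0
```

On these weights, 40 ewes contribute no more milk than four cows, which is why the dairy recovery discussed below was so much harder than the draught recovery.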
Let us consider the working sector first. As figure 2 shows, on the eve of the pestilence, approximately 34 per cent of draught work (carting and ploughing) was performed by horses and the remaining 66 per cent by oxen. After the murrain, the overall level of available working animals fell by 27 per cent, which reflects the loss of some 55 per cent of oxen in the pestilence and the expansion of the horse sector by some 40 per cent. The share of the horse within the working sector now rose from 34 to 68 per cent. By 1321, however, the aggregate level of working animal power had risen to 87 per cent of its 1318 level, with the horse occupying around 56 per cent of the working animal share. In other words, lords and their reeves reacted remarkably fast. Between 1321 and 1333, overall levels fluctuated between 85 and 90 per cent, while the share of the ox rose again from 44 to some 62 per cent, around which figure it would remain until the Black Death. While the ratio between horse and ox remained stable, the aggregate level of animal working power fluctuated a great deal. Between 1333 and 1335 it fell from 86 to 76 per cent of its 1318 level, reflecting a slight reduction of horse and ox numbers on the demesne through sales, most likely as a response to rising prices within the working livestock sector. Thus, the purchasing price of an ox rose from around 13s. to 15s. 6d., while that of a cart-horse rose from about 16s. to 24s.66 Once prices collapsed again in 1335/6, working animal levels returned to their 1333 levels. From that point until 1349, the year of the Black Death, they remained more or less stable. The sudden rise in the available animal force in 1349 undoubtedly reflects human mortality and the passing of animals from peasants to lords, both as heriots and as wandering, unyoked animals.
Figure 2. Levels of available working (ploughing and carting) animal power and arable acreage in England, 1318–50 (indexed on 1318). Source: Manorial accounts database (see tab. 1 sources).
Just as within the working animal sector, the decline within the arable sector seems to have been mostly insignificant, as figure 2 indicates. Because lords and their reeves acted efficiently and quickly in replacing oxen with horses and replenishing ox cohorts, the overall size, and hence aggregate production levels, of the arable did not fall much in the aftermath of the pestilence. There was a slight decline after 1326, but even at their lowest point in 1333, overall levels of arable acreage amounted to about 90 per cent of their 1318 levels. It was not until the Black Death that the decline in arable acreage was truly felt. Nor did seeding levels decline, with the minor exception of the first three post-pestilence years. Between 1322 and 1348 seeding levels generally matched 1318 levels (between 1322 and 1331 they exceeded them). To a large degree, the fluctuations of seeding levels mirror those of the total arable which, in turn, mirror the expansion and decline of the exploitation of horses for draught on the demesne.
In other words, lords and their reeves were able to maintain pre-pestilence levels of arable acreage and grain production. After all, lords did not have to reduce the arable portion of their demesnes, since the temporary shortage of oxen was solved by an immediate expansion of horses and a swift replenishment of oxen. What is striking here is that arable acreage and crop production levels did not fall after the pestilence, despite the fact that draught power declined by about 27 per cent shortly after the pestilence, and remained around 85–90 per cent of 1318 levels until the Black Death. This could well imply that the expansion of the horse sector raised per-acre ploughing output, offsetting the decline in available working animal power. Indeed, arable acreage levels corresponded, more or less, to the number of horses employed by demesne authorities: the rise in arable acreage in the immediate aftermath of the pestilence went hand in hand with the augmentation of horse power. Horses, after all, worked faster than oxen and were capable of working longer hours. On the other hand, it should be borne in mind that another important source of ploughing, harrowing, and hauling came in the form of tenant labour services. It is possible that the maintenance of arable acreage also reflects higher labour productivity rates among manorial tenants.
All these facts indicate that there were no signs of decline in overall arable production levels, and hence no signs of a food crisis within the grain sector resulting from the temporary shortage of oxen. This is not to say, however, that the situation was uniform everywhere and that there were no demesnes with contracting arable. Thus, between 1318 and 1350, at Downham-in-the-Isle, Great Shelford, and Wisbech Barton, three Cambridgeshire manors of the Bishop of Ely, the total arable acreage declined by 35, 20, and 40 per cent, respectively. This, however, had nothing to do with the bovine crisis, since all these demesnes managed to restock their oxen remarkably fast. Rather, the contraction was brought about by deteriorating weather in eastern England (in particular, in the coastal areas of Suffolk and Norfolk and the fenlands), characterized by flooding and coastal inundations in the 1330s and 1340s.67 Apart from these regions, there are no signs of arable decline elsewhere in the sampled demesnes.68 Developments on the demesne seem to have stood in sharp contrast with the situation on the tenancy. As some studies have shown, there were widespread signs of contracting arable within tenants' lands between c. 1300 and 1350.69 This contraction resulted from both exogenous and endogenous factors: while in Cambridgeshire it was brought about by inclement weather and natural events, in Buckinghamshire it reflected peasant poverty.70 Thus, at Ivinghoe (Buckinghamshire), more than 400 tenancy acres lay uncultivated in the 1340s, while the levels of the demesne acreage were as high as in 1318.71
Arable husbandry was arguably the single most important sector within demesne agriculture, since the late medieval English diet was dominated by grain products.72 At the same time, however, success within this sector should not be equated with success within the entire economy. Lords and their reeves still had to take care of the dairy sector, to make up for colossal losses of cows and heifers. As we have seen, ewes yield a tenth of a cow's milk, while dams are capable of producing even less than that. It is not surprising, then, that replacing dairy herds was a much longer and more challenging process than replenishing working animals. The negative impact of the cattle pestilence on productivity levels within the dairy sector is shown in figure 3.
Figure 3. Levels of available dairy power, milk output, inverted dairy prices, and legume acreage in England, 1318–50 (indexed on 1318). Notes: The figures for milk output are based on Winchester Bishopric demesnes only; the missing values for the vacancy years (1334 and 1345) have been adjusted, using linear interpolation. The figure for 1320 has been increased from 0.13 to 0.20, since the overall mortality rates on Winchester Bishopric demesnes were higher than on a national level. The other variables are based on the demesnes of Winchester Bishopric, Canterbury Cathedral Priory, Westminster Abbey, Ely Bishopric, Glastonbury Abbey, and Norwich Cathedral Priory. Dairy prices represent averaged butter and cheese prices indexed on 1318: 10 lbs of butter = 12d.; 10 lbs of cheese = 6.5d. Sources: Manorial accounts database (see tab. 1 sources); Munro, ‘Revisions’.
As figure 3 indicates, milk output levels were depressed until 1331. Unlike draught power, whose levels returned to normal shortly after the bovine crisis thanks to a swift and extensive expansion of the horse stocks, the recovery within the dairy sector was slow. First, despite considerable losses of cows and heifers, very few demesnes managed to take up the slack by augmenting their sheep flocks. Even those demesnes that did so, such as Wargrave (Berkshire), West Farleigh (Kent), Feering (Essex), Bourton-on-the-Hill (Gloucestershire), and Ashford (Middlesex), could not maintain pre-1319 levels of dairy output. Each dead cow had to be replaced with 10 living ewes, a highly problematic option for most lords. Generally, the ratio between the available bovine and ovine dairy force did not change as drastically as that within the draught sector. The share of ewes and gimmers among all sheep rose from about 35 per cent to about 50 per cent between 1319 and 1320. But this rise was insufficient to keep up with the pre-pestilence levels of the dairy force, which fell by almost 50 per cent; overall milk output, furthermore, fell by an overwhelming 80 per cent. The gap between the available dairy force and the total lactage output persisted until about 1330, when the cow stocks returned to their pre-pestilence levels and the pre-1319 ratio between cows and ewes was more or less restored. From 1330 onwards, dairy output levels fluctuated from year to year, depending much on annual lactage yields on each manor. Thus, there were four outstanding years during the 1330s (1332, 1337, 1338, and 1339), and several bad yields in the 1340s (1340, 1345–50). In some cases the chronology of milk yields marched in parallel with that of crop yields, which were indeed exceedingly high in 1332, 1337, and 1338, but low in 1346, 1349, and 1350.73 This likely reflects the fact that weather affected all vegetation growth, be it crops or grass.
It has long been established that there is a clear correlation between the available forage (pasturage, oats, hay, and straw, in our case) and the physical health, and hence lactage levels, of cows.74 Cold temperatures and excessive rainfall would have depressed both arable and pasturage, while ample sunshine and moderate weather would lead to generous growth in both sectors. The ideal combination, however, seems to have been a relatively wet spring followed by a relatively hot, dry summer. Butter and cheese prices, presented in table 3, reflect, to a certain degree, fluctuations within the dairy produce sector. Prices were relatively high between 1321 and 1323, but fell gradually into the 1330s. Prices were low, for the most part, in the course of the 1330s and did not rise again until 1345. In other words, there seems to be a clear correlation between weather, lactage yields, and dairy-product prices. This, however, invites a separate and much more thorough study.
Although dairy products do not seem to have contributed much to the overall diet, they appear to have been late medieval peasants' single most important source of protein.75 Massive losses of protein sources deriving from dairy products could have been partially compensated for by augmenting the legume acreage. Legume crops, such as peas, beans, vetches, and various legume-based mixtures, represent an alternative source of protein intake. However, as figure 3 indicates, there is no evidence for the augmentation of the legume share of the demesne arable up until the 1340s, with the exception of 1321, which is unlikely to have been related to the bovine crisis. For an early fourteenth-century demesne officer, legumes meant crops for both human and animal consumption, as well as a fertilizing agent (legumes are an important source of nitrogen), but not a source of vital nutrients akin to dairy products. Hence, it is hardly surprising that lords and their reeves made no attempt to augment the legume acreage.
So far the discussion has focused on one producing sector within the late medieval English economy: the demesne. The demesne's share, however, represented only about 14 per cent of the total national income and some 18 per cent of the total rural income c. 1300.76 Does the demesne reflect the entire economy? What about the peasant sector, which accounted for a far larger share of the total agricultural output?77 Surely, the pathogen did not discriminate between lords' and peasants' stocks, especially considering how frequent contact would have been between peasant and demesne animals. Unfortunately, information about the peasant sector is virtually non-existent. Peasants did not keep accounts; everything we know about their husbandry comes from seigniorial sources, such as tithe accounts, occasional court rolls, rentals, surveys, and tax assessments. The little we know about the peasant sector, however, may allow us to speculate about the impact of the bovine pestilence on the peasant economy and peasant well-being. A national sample of peasant livestock, based on nearly 4,000 observations deriving from these sources, indicates that c. 1319 an average peasant household held two bovids, normally including one cow and/or one younger heifer.78 Oxen were stocked more rarely and in smaller proportions than horses (on average, there were about 0.39 oxen per household).79 Bulls were almost never found among peasants' animals, and we may perhaps speculate that their cows were inseminated either by ‘common’ bulls, held by the entire village community, or by demesne bulls. Cattle numbers would certainly vary from place to place. Thus, in 1297 there were only 120 ox units and 1,013 cattle units per 1,051 households in three Bedfordshire hundreds. Similar ratios are found in three Kentish hundreds, surveyed in 1301.
In some places, peasant economies seem to have been more dairy-oriented: for instance, in Blackbourne Hundred (Suffolk), the mean number of cows per household was 3.1 in 1283, while in 1309 an average peasant family in Wistow (Huntingdonshire) possessed 2.9 cows. Intriguingly, there is no clear correlation between pasturage endowment and peasant livestock numbers. The figures for Kent and Bedfordshire appear low, despite the fact that both counties were abundant in pasture resources. On the other hand, Suffolk, one of the counties least endowed with grassland, had comparatively large numbers of peasant bovids.80 The impact of the pestilence is likely to have been particularly devastating in those areas where peasant bovids were kept in comparatively high numbers. Tenancies with denser bovine populations were potentially more prone to the pathogen, for three main reasons. First, close contact between animals tends to facilitate the spread of disease and increase mortality rates. This connection, however, is not to be taken at face value: as we have seen, some demesnes with dense bovine populations were spared by the mortality. Second, smaller stocks may imply a more limited degree of commercialization, or even a lack of it; that the cattle trade, which involved local movement of animals and contact with other bovids, could play a crucial role in spreading the disease has been illustrated above. Finally, smaller cattle numbers would also mean the closer management of each individual animal. As Stone has shown, there was a clear connection between efficient supervision and the health of demesne sheep in the same period.81 But the size and structure of the bovine stocks can hardly have been the only factor affecting the crisis within the peasant sector. One such factor to be considered here is human population density.
Particularly dense populations were found in Norfolk (with 137 taxpayers per 10 square miles), but also in Bedfordshire (91 taxpayers), Huntingdonshire (98 taxpayers), Oxfordshire (114 taxpayers), and Rutland (105 taxpayers). The denser the townships, in terms of the humans-to-land ratio, the more peasant animals, in absolute numbers, were likely to be found in them. Such a scenario may have increased the chances of disease transmission, through frequent contact between the animals of different households. The type of field-cultivation system should also be considered. With a degree of speculation, we may argue that the spread of the pathogen could have been particularly devastating in areas of ‘champion’ countryside, characterized by open-field cultivation and communal management of livestock. This system prevailed especially in the east and west midlands, as well as East Anglia, and the southern counties of Hampshire, Wiltshire, Dorset, Somerset, and parts of Sussex. The lack of individual, enclosed plots of land could be another factor facilitating the spread of the pathogen. Finally, the degree of peasants' involvement in the manorial system could also matter. Manorialism was weak or virtually non-existent in the Celtic fringe and adjacent regions, as well as the northern counties (in particular, the north-west). These areas were characterized by individual and often scattered farms, often leased from the lords, and a weak presence or complete absence of the demesne. In these circumstances, peasants' cattle were undoubtedly more isolated and less exposed to various transmittable diseases. In other words, it seems that, regardless of the size of each household's bovine stock, peasant animals were much better protected in areas with weak or no manorialism, low human population densities, and individual, enclosed farms.
The situation within the peasant sector can be deduced from the heriot sections of manorial accounts. Heriots were servile dues paid in ‘the best beast’ (horse, ox, or cow) upon inheritance of, or entry on, a new landed property. A somewhat narrow sample based on 53 accounts from manors with a servile population yields a rather ambiguous picture. It shows that of a total of 78 beasts surrendered in 1317/18, 29 were horses, 26 oxen, and 23 cows. Three years later, the total number of heriots was 54, comprising 19 horses, 18 oxen, 14 cows, and two bullocks.82 In other words, the overall ratio of horse-to-bovid heriots hardly changed. Animals were dying, but peasants continued to surrender them, despite the obvious shortage. This could be interpreted in two ways. First, it could indicate that rigid and conservative lords were still imposing animal-based heriots, and were not ready to commute them into cash dues. In that case, these numbers could mean that the situation within the peasant sector was even more dire than that on the demesne: in addition to the factors complicating restocking on the demesne discussed above, the replenishment of village bovids may have been hampered by unwavering seigniorial demands. On the other hand, these heriot numbers could signify that the situation was not necessarily as gloomy for peasants as it was for lords: the fact that villeins were still capable of paying heriots in bovids some two years after the catastrophe may indicate that the recovery of their bovine stocks was swifter and more efficient than on the demesne. This scenario, however, seems somewhat less likely: to purchase a young heifer costing about 4s., an early fourteenth-century English peasant would have had to work as a wage-earner for about 16 days.83 The main challenge, though, would not have been the acquisition of the necessary funds, but finding healthy animals at local markets.
That the draught-horse was more dominant than the ox within the peasant sector may imply that peasants were somewhat less exposed to the crisis, at least in terms of arable husbandry.84 The reliance on horses instead of oxen, however, could not make up for a deficiency of cows and milk-based products. Furthermore, there is no evidence that peasants attempted to take up the slack by expanding their sheep flocks. Even though it is highly plausible that ewe milk contributed to the dairy intake of a peasant household, its contribution should not be overstated: the ratio between sheep and cattle units within the tenancy sector was usually 0.29:1.00 (on average, each peasant stock had about 1.77 cows and 5.29 sheep).85 Bearing these figures in mind, and assuming that each ewe produced a tenth of a cow's yield, we may estimate that about 80 per cent of peasants' dairy intake derived from cows' milk, which, in turn, may have accounted for about 10 per cent of total food intake.86
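The arithmetic behind that estimate can be checked directly. The head counts and the one-tenth ewe yield are from the text; treating all of a holding's sheep as milking ewes is my simplification, which biases the cow share slightly downward.

```python
# Share of a peasant holding's milk supplied by cows, on the stated ratios.
cows, sheep = 1.77, 5.29    # mean head counts per holding (from the text)
cow_milk = cows * 1.00      # cow's yield as the unit of account
ewe_milk = sheep * 0.10     # one ewe yields a tenth of a cow's milk
share = cow_milk / (cow_milk + ewe_milk)
print(round(share, 2))      # close to the 'about 80 per cent' in the text
```

On these figures the cow share comes out just under four-fifths, so losing the household cow removed most of the family's milk supply at a stroke.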
If we side with the ‘pessimist scenario’ and speculate that the crisis was equally harsh on the demesne and on the tenancy (or at least in the areas whose settlement and institutional patterns were especially suitable for the successful spread of the pathogen), we cannot avoid one further crucial point. The Great Bovine Pestilence of the early fourteenth century stood right at the centre of a series of ecological, economic, social, military, and political crises, which may together be seen as the ‘general crisis of the fourteenth century’. In England, it stood between the two great ecological shocks of the Great Famine (1314/15–22) and the Black Death (1348–51), and it is plausible that its placement between them was no coincidence. The almost biblical flooding of 1314–16 depressed not only human food resources, but also animal fodder. As we have seen, there seems to be a pronounced negative correlation between crop yields in 1315–16 and bovine mortality rates in 1319–20. The bovine murrain claimed, inter alia, nearly 80 per cent of cows, the single most important producers of dairy. While arable production levels were hardly affected, and there is some evidence for a possible rise in per-acre ploughing output (judging from the fact that the arable acreage hardly changed, in spite of the decline in the available draught power), the dairy sector was gravely depressed by the crisis. This dairy crisis, in fact, seems to have created a prolonged ‘protein famine’ among humans, lasting for about a dozen years. As we have seen, though dairy products appear to have been one of the most important sources of protein in the peasant diet, no efforts were made to acquire alternative sources of protein, for instance by augmenting the legume acreage, perhaps because contemporaries did not think in modern dietary categories. Likewise, very few demesnes attempted to combat the dairy shortage by augmenting their ewe flocks.
This means that the majority of the English population, consisting of rustics, in all likelihood suffered from protein shortage for at least 12 years. Such a long-term deprivation of dairy-based protein sources, such as casein and whey protein, tends to have severe implications for human populations, and especially for child development.87 This protein shortage persisted long beyond the seven years of the Great Famine and is likely, as such, to have had more severe implications for human health. If the protein shortage indeed weakened the immune systems of developing adolescents, is it possible that it was also the ‘invisible beast’ that made them easily susceptible to the plague some 30 years later? In his now-classic study, Jordan suggests that the malnourishment of the 1310s may have made humans more prone to the Black Death.88 While this intuitive argument sounds plausible, one way to develop it is to regard the Great Cattle Plague as a missing link between the Great Famine and the Black Death. These three crises may have been connected; this hypothesis, however, requires more study before it can be regarded as a thesis.