Migration and the risk of disease in 17th- and 18th-century England
The fate of migrants moving to cities in 17th- and 18th-century England demonstrates how a single pathogen can dramatically alter the risks associated with migration and shape migratory patterns – a dynamic that still resonates today.
Cities have always been a magnet to migrants. In 2010, a tipping point was reached: for the first time, according to the World Health Organization, the majority of the world’s population lived in cities. By 2050, seven out of 10 people will have been born in – or migrated to – a city. One hundred years ago, that figure was two out of 10.
Today, cities are generally the safest places to live. If you live in one, you’re likely to be richer than someone living in a rural environment. If you’re richer, you’re likely to live longer. If you live in a city, you have better access to hospitals and healthcare, and you’re more likely to be immunised.
But that was not always the case. In 17th- and 18th-century England, city life was lethal – disproportionately so for those migrating from the countryside.
Dr Romola Davenport is studying the effects of migration on the health of those living in London and Manchester from 1750 to 1850, with a particular focus on the lethality of smallpox – the single most deadly disease in 18th-century England. In the century before 1750, England’s population had failed to grow. Cities and towns sucked in tens of thousands of migrant men, women and children – then killed them. It’s estimated that half of the natural growth of the English population was consumed by London deaths during this period. Burials often outstripped baptisms.
In 2013, cities are no longer the death traps they once were, even accounting for the millions of migrants who live in poor, often slum-like conditions. But will cities always be better places to live? What could eliminate the ‘urban advantage’ and what might the future of our cities look like if antibiotics stop working?
By looking at the past – and trying to make sense of the sudden, vast improvement in survival rates after 1750 – Davenport and Newcastle University’s Professor Jeremy Boulton hope to understand more about city life and mortality.
“For modern migrants to urban areas there is no necessary trade-off of health for wealth,” said Davenport. “Historically, however, migrants often took substantial risks in moving from rural to urban areas because cities were characterised by substantially higher death rates than rural areas, and wealth appears to have conferred little survival advantage.”
The intensity of the infectious disease environment overwhelmed any advantages of the wealthy – such as better housing, food and heating. Although cities and towns offered unparalleled economic opportunities for migrants, wealth could not compensate for the higher health risks exacted by urban living.
“Urban populations are large and dense, which facilitates the transmission of infectious diseases from person to person or via animals or sewage. Towns functioned as trading posts not only for ideas and goods but also for pathogens. Therefore, growing an urban population relied upon substantial immigration from rural areas,” explained Davenport.
“After 1750, cities no longer functioned as ‘demographic sinks’ because there was a rapid improvement in urban mortality rates in Britain. By the mid-19th century, even the most notorious industrial cities such as Liverpool and Manchester were capable of a natural increase, with the number of births exceeding deaths.”
Davenport has been studying the processes of urban mortality improvement and changing migrant risks using extremely rich source material from the large London parish of St Martin-in-the-Fields. The research, funded by the Wellcome Trust and the Economic and Social Research Council, is now being augmented with abundant demographic archives from Manchester, funded by the Leverhulme Trust.
For both cities, Davenport and colleagues have access to detailed records of the individual burials underlying the Bills of Mortality, the main source of urban mortality statistics in the 17th and 18th centuries. These give age at death, cause of death, street address and the fee paid for burial, enabling the researchers to study the age and sex distribution of deaths by disease. In addition, baptismal data allow them to ‘reconstitute’ families as well as to measure infant mortality rates by social status.
“The records themselves give only a bald account of death,” said Davenport. “But sometimes we can link them to workhouse records and personal accounts, especially among the migrant poor, which really bring home the realities of life and death in early modern London.
“Smallpox was deadly. At its height, it accounted for 10% of all burials in London and an astonishing 20% in Manchester. Children were worst affected, but 20% of London’s smallpox victims were adults – likely to be migrants who had never been exposed to, and survived, the disease in childhood. However, in Manchester – a town that grew from 20,000 to 250,000 inhabitants in a century – 95% of smallpox burials were children in the mid-18th century, implying a high level of endemicity not only in Manchester but also in the rural areas that supplied migrants to the city.
“So studying urban populations can tell us not only about conditions in cities but also about the circulation of diseases in the rest of the population.”
The greater lethality of smallpox in Manchester is, for the moment, still a mystery to researchers; but evidence suggests the potential importance of transmission via clothing or other means – as opposed to the person-to-person spread assumed in mathematical models of smallpox outbreaks in bioterrorism scenarios. Although smallpox was eradicated in the late 1970s, both the USA and Russia retain stockpiles of the virus – which has led to fears of its use by terrorists should it ever fall into the wrong hands. Data on smallpox epidemics before the introduction of vaccination in the late 1790s are very valuable to bioterrorism researchers because they provide insights into how the virus might spread in an unvaccinated population (only a small proportion of the world’s population is vaccinated against smallpox).
From 1770 onwards, there was a rapid decline in adult smallpox victims in both London and Manchester, which Davenport believes could be attributable either to a rapid upsurge in the use of smallpox inoculation (a precursor of vaccination) by would-be migrants, or to a change in the transmissibility and potency of the disease. By the mid-19th century, towns and cities appear to have been relatively healthy destinations for young adult migrants, although still deadly for children.
“Smallpox was probably the major cause of the peculiar lethality of even small urban settlements in the 17th and 18th centuries,” said Davenport, “and this highlights how a single pathogen, like plague or HIV, can dramatically alter the risks associated with migration and migratory patterns.”
“The close relationship between wealth and health that explains much of the current ‘urban advantage’ is not a constant but emerged in England in the 19th century,” added Davenport. “While wealth can now buy better access to medical treatment, as well as better food and housing, it remains an open question as to whether this relationship will persist indefinitely in the face of emerging threats such as microbial drug resistance.”
Header Image: An 1802 cartoon of the early controversy surrounding Edward Jenner’s vaccination theory, showing his cowpox-derived smallpox vaccine causing cattle to emerge from patients. Wikipedia
Contributing Source: University of Cambridge