Tuesday, October 10, 2023

PNAS: Frailty and Survival in the 1918 Influenza Pandemic

[Image: the W-shaped age-specific mortality curve of the 1918 pandemic]
#17,716


One of the striking features – and enduring mysteries – of the 1918 pandemic was that it appeared to largely spare those over the age of 65 in the United States.

The infamous `W' shaped curve of the 1918 pandemic (above) indicates that death rates among those in their teens, 20s, and 30s were much higher than normally seen with influenza. Those over 65, however, saw a reduction in mortality during the pandemic.

This unusual Epi curve is most often explained by suggesting that a similar `H1N1' virus may have circulated in the mid-to-late 1800s, leaving behind some degree of immunity for those who were alive then and survived.  Those born later would have had no such immunity. 

While the history books record several flu-like epidemics during this period (including the equine epizootic of 1872), the causes of those outbreaks remain unknown.

Another theory posits that younger people with stronger immune systems fell victim to a `cytokine storm' (see Cytokine Storm Chasers), but once again, this is supposition.

To be fair, we've seen other studies - from other countries - showing a different (or less pronounced) pattern (see Study: Impact Of 1918 Pandemic In Mexico).  In 2006, a Lancet journal article (below) cited as much as a 30-fold difference in mortality rates around the world: 

Estimation of potential global pandemic influenza mortality on the basis of vital registry data from the 1918—20 pandemic: a quantitative analysis 

Christopher JL Murray, Alan D Lopez, Brian Chin, Dennis Feehan, Kenneth H Hill

Excess mortality ranged from 0·2% in Denmark to 4·4% in India. Since there was some under-registration of mortality in India, total pandemic mortality could have been even higher.

Our understanding of what happened in 1918 is based largely on anecdotal accounts, and since average life expectancy in the United States leading up to the pandemic was just over 50 years, deaths among younger people likely drew greater attention. 

Add in that the world was at war when the pandemic began, and that in many countries (including the United States) news of the outbreak was treated as a national security issue. As a result, rumors and misinformation - even from `official' sources - were rife. 

Pity any historian a hundred years from now, sifting through the archives of the internet, trying to make sense of the events of 2020-2021. Just imagine trying to get 10 people in a room today to agree on a consensus account.  

The upshot is, our understanding of what happened in 1918 remains murky, and subject to change. 

All of which brings us to a new study, published in PNAS, which introduces a new factor: frailty.  By examining the skeletal remains of people who died both before and during the 1918 pandemic, the researchers determined that those with unhealed (active) skeletal lesions - a sign of frailty or poor health - were at greater risk of dying during the pandemic. 

This paper doesn't invalidate the `W' Epi curve or the pandemic's outsized impact on younger people, but it offers a new theory as to why it may have happened. 

The full study is behind a paywall, but the link, and Abstract follow.  I'll return with a postscript after the break.


Frailty and survival in the 1918 influenza pandemic
Amanda Wissler (wisslera@mcmaster.ca) and Sharon N. DeWitte 
Edited by Christopher Kuzawa, Northwestern University, Evanston, IL; received March 23, 2023; accepted August 21, 2023

Significance

The COVID-19 pandemic showed how social, environmental, and biological circumstances can shape the likelihood of disease and death, even with respect to a disease for which no one has preexisting adaptive immunity. The 1918 influenza pandemic killed an estimated 50 million people worldwide. So many people fell ill that doctors at the time believed that healthy people were equally likely to die as those who were already sick or frail. We analyze bioarchaeological data on age at death and skeletal lesions from 369 individuals who died prior to and during the 1918 influenza pandemic in the United States. The results further show that even in the past, people with evidence of prior environmental, social, and nutritional stress were most likely to die.

Abstract

One of the most well-known yet least understood aspects of the 1918 influenza pandemic is the disproportionately high mortality among young adults. Contemporary accounts further describe the victims as healthy young adults, which is contrary to the understanding of selective mortality, which posits that individuals with the highest frailty within a group are at the greatest risk of death.

We use a bioarchaeological approach, combining individual-level information on health and stress gleaned from the skeletal remains of individuals who died in 1918 to determine whether healthy individuals were dying during the 1918 pandemic or whether underlying frailty contributed to an increased risk of mortality. Skeletal data on tibial periosteal new bone formation were obtained from 369 individuals from the Hamann–Todd documented osteological collection in Cleveland, Ohio. Skeletal data were analyzed alongside known age at death using Kaplan–Meier survival and Cox proportional hazards analysis.

The results suggest that frail or unhealthy individuals were more likely to die during the pandemic than those who were not frail. During the flu, the estimated hazards for individuals with periosteal lesions that were active at the time of death were over two times higher compared to the control group.

The results contradict prior assumptions about selective mortality during the 1918 influenza pandemic. Even among young adults, not everyone was equally likely to die—those with evidence of systemic stress suffered greater mortality. These findings provide time depth to our understanding of how variation in life experiences can impact morbidity and mortality even during a pandemic caused by a novel pathogen.

         (Continue . . . )
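For readers curious about the statistical methods mentioned in the abstract, the following is a minimal sketch - not the authors' code, and using entirely made-up numbers rather than the 369 individuals in the Hamann–Todd collection - of how a Kaplan–Meier survival comparison and a Cox proportional hazards model might look in Python with the lifelines library, using lesion status as the covariate.

# Illustrative sketch only: hypothetical data, not the study's dataset or code.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Hypothetical records: age at death (years) and whether an active (unhealed)
# periosteal lesion was present at death. Every row is an observed death.
data = pd.DataFrame({
    "age_at_death":  [22, 31, 27, 45, 19, 60, 38, 29, 52, 24],
    "active_lesion": [1,  0,  1,  0,  1,  0,  1,  0,  0,  1],
    "event":         [1,  1,  1,  1,  1,  1,  1,  1,  1,  1],
})

# Kaplan-Meier survival curves, one per lesion group
kmf = KaplanMeierFitter()
for label, grp in data.groupby("active_lesion"):
    kmf.fit(grp["age_at_death"], event_observed=grp["event"],
            label=f"active_lesion={label}")
    print(label, kmf.median_survival_time_)

# Cox proportional hazards: hazard of death as a function of lesion status
cph = CoxPHFitter()
cph.fit(data, duration_col="age_at_death", event_col="event")
cph.print_summary()  # exp(coef) approximates the hazard ratio for active lesions

In the paper's actual analysis, the corresponding hazard ratio for individuals with active lesions during the pandemic was reported as more than two.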


Life expectancy in the United States had only just breached the 50-year mark in the early 1900s, and at that time the three leading causes of death here were pneumonia/influenza, tuberculosis, and gastrointestinal infections.

This was decades before the development of the first effective antibiotics, and before most large cities treated their water supply, meaning that many people lived with - and eventually died from - chronic infections like tuberculosis and dysentery.

Many suffered from nutritional deficiencies, as the following passage from a 1999 MMWR report recounts:

The discovery of essential nutrients and their roles in disease prevention has been instrumental in almost eliminating nutritional deficiency diseases such as goiter, rickets, and pellagra in the United States. During 1922-1927, with the implementation of a statewide prevention program, the goiter rate in Michigan fell from 38.6% to 9.0 % (21). In 1921, rickets was considered the most common nutritional disease of children, affecting approximately 75% of infants in New York City (22). In the 1940s, the fortification of milk with vitamin D was a critical step in rickets control.

While this doesn't explain why those over 65 appear to have suffered less mortality from the 1918 pandemic than younger cohorts, it does paint a picture of a population with many vulnerabilities, even among young adults. 

While today's study may not be the last word on what happened during the 1918 pandemic, it does offer up a new and intriguing theory.