Sunday, July 27, 2014

Updating The Cambridge Working Group

Credit CDC PHIL



# 8872

 


Two weeks ago in Reshuffling The NSABB & A New Biosecurity Working Group Emerges, we learned that the NIH had notified 11 of the 23 original members of the NSABB (the National Science Advisory Board for Biosecurity) - an under-utilized biosafety advisory committee formed in 2005 - that their services were no longer required.

 

Gone were such well known (and frequently outspoken) experts as Paul Keim from Northern Arizona University, Arturo Casadevall from Albert Einstein College of Medicine, Michael Imperiale of the University of Michigan, and Michael Osterholm from CIDRAP.

 

Given the recent serious lapses in lab biosecurity at both the CDC and FDA, and continued work on `Gain of Function’ research (designed to enhance the virulence, transmissibility, or host range of highly dangerous pathogens - see The Debate Over Gain Of Function Studies Continues), one might easily have assumed that the NSABB would have their plates full.

 

Although they made headlines in 2011-2012 over their cautionary stance regarding the publication of the Fouchier H5N1 ferret study (see The Furor Over H5N1 Research Continues), the NSABB has been idle for the past two years, with no requests from the NIH to reconvene.

 

The day following the NIH’s decision to release 11 of their members, a new initiative appeared online called the Cambridge Working Group, with a consensus statement supported by 18 internationally known experts and researchers (including several former NSABB members).

 

Cambridge Working Group Consensus Statement on the Creation of Potential Pandemic Pathogens (PPPs)

Recent incidents involving smallpox, anthrax and bird flu in some of the top US laboratories remind us of the fallibility of even the most secure laboratories, reinforcing the urgent need for a thorough reassessment of biosafety. Such incidents have been accelerating and have been occurring on average over twice a week with regulated pathogens in academic and government labs across the country. An accidental infection with any pathogen is concerning. But accident risks with newly created “potential pandemic pathogens” raise grave new concerns. Laboratory creation of highly transmissible, novel strains of dangerous viruses, especially but not limited to influenza, poses substantially increased risks. An accidental infection in such a setting could trigger outbreaks that would be difficult or impossible to control. Historically, new strains of influenza, once they establish transmission in the human population, have infected a quarter or more of the world’s population within two years.


For any experiment, the expected net benefits should outweigh the risks. Experiments involving the creation of potential pandemic pathogens should be curtailed until there has been a quantitative, objective and credible assessment of the risks, potential benefits, and opportunities for risk mitigation, as well as comparison against safer experimental approaches. A modern version of the Asilomar process, which engaged scientists in proposing rules to manage research on recombinant DNA, could be a starting point to identify the best approaches to achieve the global public health goals of defeating pandemic disease and assuring the highest level of safety. Whenever possible, safer approaches should be pursued in preference to any approach that risks an accidental pandemic.

Original Signatories & Founding Members:

(Founding Members met in Cambridge on July 14 and crafted the statement)

  • Amir Attaran, University of Ottawa
  • Barry Bloom, Harvard School of Public Health
  • Arturo Casadevall, Albert Einstein College of Medicine
  • Richard Ebright, Rutgers University
  • Nicholas G. Evans, University of Pennsylvania
  • David Fisman, University of Toronto Dalla Lana School of Public Health
  • Alison Galvani, Yale School of Public Health
  • Peter Hale, Foundation for Vaccine Research
  • Edward Hammond, Third World Network
  • Michael Imperiale, University of Michigan
  • Thomas Inglesby, UPMC Center for Health Security
  • Marc Lipsitch, Harvard School of Public Health
  • Michael Osterholm, University of Minnesota/CIDRAP
  • David Relman, Stanford University
  • Richard Roberts (Nobel Laureate '93), New England Biolabs
  • Marcel Salathé, Pennsylvania State University
  • Lone Simonsen, George Washington University
  • Silja Vöneky, University of Freiburg Institute of Public Law, Deutscher Ethikrat

In the two weeks since that initiative appeared online, more than 50 additional Charter Members have signed, including such familiar names as John S. Brownstein from Harvard Medical School, Neil M. Ferguson of Imperial College, W. Ian Lipkin of Columbia University, Andrew Rambaut of the University of Edinburgh, UK, and Klaus Stöhr, Novartis Vaccines and Diagnostics.


Follow this link to review this expanding roster of supporters.

 

As you might imagine, urging caution over this sort of research won’t win popularity contests in some circles of academia, as these (often government funded) research projects can bring in large grants, along with substantial publicity and prestige to labs, universities, and researchers.

 

Suggestions that this sort of work be confined to biosafety level 4 facilities are often met with stiff resistance, as that would exclude most of the university-based labs in this country.

 

The debate up until now over this type of research has been largely limited to academia, and has often been acrimonious.

 

The often promised `thorough public discussion of the risks and benefits involved’ never materialized as the issue exited the headlines, and GOF research resumed in early 2013 after Scientists Declare End To H5N1 Research Moratorium.

 

In what was undeniably bad timing for proponents of unfettered GOF research, three high-profile lab incidents this summer involving anthrax, smallpox, and avian flu have reawakened the public’s (and Congress’s) concern over lab safety and the wisdom of conducting certain types of research.

 

Despite researchers’ assurances of the relatively low risk of an accidental biological release, if we’ve learned nothing else this summer, it’s that accidents can happen even in the best labs in the country.

 

And when dealing with Potential Pandemic Pathogens (PPPs), even a small mistake could have serious public health ramifications.

 

The stated goal here isn’t to ban GOF or DURC (Dual Use Research of Concern) studies, but rather to ensure they are conducted only in appropriate high containment labs, are done in the safest and most responsible way possible, and are undertaken only when their benefits clearly outweigh the risks.

 

While additional regulation is unlikely to be popular among many engaged in GOF research, these are all issues that need to be discussed, mapped out, and agreed upon.

 

And preferably before the next high-profile lab accident makes the headlines . . . or worse.

 

For more on the issue of Lab Safety, GOF, and DURC research, you may wish to revisit:

 

ECDC Comment On Gain Of Function Research
CDC: Press Conference Transcript, Audio & Timelines For Lab Incidents
Cell Host & Microbe: 1918-like Avian Viruses Circulating In Birds Have Pandemic Potential
Lipsitch & Galvani: GOF Research Concerns