In a U.S. government lab in Bethesda, Maryland, virologists plan to equip the strain of the monkeypox virus that spread globally this year, causing mostly rash and flulike symptoms, with genes from a second monkeypox strain that causes more serious illness. Then they’ll see whether any of the changes make the virus more lethal to mice. The researchers hope that unraveling how specific genes make monkeypox more deadly will lead to better drugs and vaccines.
Some scientists are alarmed by the planned experiments, which were first reported by Science. If a more potent version of the outbreak strain accidentally escaped the high-containment, high-security lab at the National Institute of Allergy and Infectious Diseases (NIAID), it could spark an “epidemic with substantially more lethality,” fears epidemiologist Thomas Inglesby, director of the Center for Health Security at the Johns Hopkins University Bloomberg School of Public Health. That’s why he and others argue the experiments should undergo a special review required for especially risky U.S.-funded studies that might create a pathogen that could launch a catastrophic pandemic.
But it’s not clear that the rules apply to the proposed study. In 2018, a safety panel determined it was exempt from review. Monkeypox did not meet the definition of a “potential pandemic pathogen” (PPP), the panel decided, because it didn’t spread easily. Now, with monkeypox widespread, the National Institutes of Health (NIH) is planning to reexamine the work, but it still might not qualify as “enhancing” a PPP, the agency says. That’s because the study will swap natural mutations, not create new ones, so it is not expected to create a monkeypox strain more virulent than the two already known.
The monkeypox controversy marks just the latest flare-up in a decade-old debate over exactly when a study that alters a pathogen is too risky for the U.S. government to fund—and who should have the power to decide. That wrangling became especially ferocious over the past 2 years, as the COVID-19 pandemic spawned allegations, so far unproven, that SARS-CoV-2 escaped from a laboratory in China. Now, in the pandemic’s wake, the U.S. government appears poised to make sizable changes to how it manages so-called gain-of-function (GOF) studies that tweak pathogens in ways that could make them spread faster or become more dangerous to people.
Last month, an expert panel convened by NIH and its parent agency, the Department of Health and Human Services (HHS), released a draft report that recommends the GOF rules be broadened to include pathogens and experiments that are exempt from the current scheme. If the recommendation is adopted—which could come next year—the monkeypox study could come under tighter scrutiny. And other researchers working with viruses such as Ebola, seasonal flu strains, measles, and even common cold viruses could face new oversight and restrictions.
Some scientists are watching nervously, worried that an expanded definition could worsen what they already see as a murky, problematic oversight system. The existing rules, they say, have caused confusion and delays that have deterred scientists from pursuing studies critical to understanding emerging pathogens and finding ways to fight them. If not implemented carefully, the proposed changes could “greatly impede research into evolving or emerging viruses,” worries virologist Linda Saif of Ohio State University, Wooster. She and others say expanding the regulations could add costly red tape, potentially driving research overseas or into the private sector, where U.S. regulations don’t apply or are looser.
Others say the proposed changes don’t go far enough. They’d like to see the U.S. government create an entirely new independent body to oversee risky research, and for the public to get far more information about proposed experiments that could have fearsome consequences. Some have even called for curbing the now common practice of collecting viruses from wild animals and studying them in the lab, saying it only increases the risks that the viruses—or modified versions—will jump to humans.
“We really should be asking important questions about whether that work should continue,” Inglesby says. And virologist James LeDuc, who retired last year as director of the University of Texas Medical Branch’s Galveston National Laboratory, says, “It’s one thing to recognize that these viruses exist in nature. It’s another to modify them so that you can study them if in fact they could become human pathogens.”
All sides agree on one thing: The proposed rules represent a potential pivot point in the debate over the funding of high-risk GOF studies by the U.S. government, which is one of the world’s largest supporters of virology research. “There are significant potential risks to both under- and overregulation in this field,” says virologist Jesse Bloom of the Fred Hutchinson Cancer Center, who like LeDuc is part of a group of scientists pushing for the changes. “The goal,” he adds, “needs to be to find the right balance.”
THE CONTROVERSY over studies that enhance or alter pathogens ignited a decade ago, but such work goes back more than a century. To make vaccines, for example, virologists have long passaged, or repeatedly transferred, a virus between dishes of animal cells or whole animals, so that it loses its ability to harm people but grows better—a gain of function. Since the late 1990s, genetic engineering techniques have made these studies much more efficient by allowing virologists to assemble new viral strains from genomic sequences and to add specific mutations.
In 2011, two such NIH-funded experiments with H5N1 avian influenza set off alarm bells worldwide. Virologists Yoshihiro Kawaoka at the University of Wisconsin, Madison, and the University of Tokyo and Ronald Fouchier at Erasmus University Medical Center were interested in identifying mutations that could enable the virus, which normally infects birds, to also spread easily among mammals, including humans. Small but frightening outbreaks had shown H5N1 could spread from birds to people, killing 60% of those infected. By introducing mutations and passaging, Kawaoka and Fouchier managed to tweak the virus so it could spread between laboratory ferrets, a stand-in for humans.
Controversy erupted after Fouchier discussed the work at a scientific meeting prior to publication. Soon, worries that the information could land in the wrong hands or that the tweaked virus could escape the lab prompted journal editors and government officials to call for a review by an HHS panel called the National Science Advisory Board for Biosecurity (NSABB). HHS established NSABB after the 2001 anthrax attacks in the United States to consider so-called dual use research that could be used for both good and ill. During the review, flu researchers worldwide voluntarily halted their GOF experiments. Ultimately, NSABB concluded the scientific benefits of the studies outweighed the risks; the H5N1 papers were published and the work resumed.
Then in mid-2014, several accidents at U.S. labs working with pathogens, along with worries about some new GOF papers, prompted the White House to impose a second “pause” on U.S.-funded GOF research. It halted certain studies with influenza and the coronaviruses that cause Middle East respiratory syndrome (MERS) and severe acute respiratory syndrome (SARS), SARS-CoV-2 cousins that have caused small though deadly outbreaks. NIH ultimately identified 29 potential GOF projects in its funding portfolio. After reviews, the agency allowed 18 to resume because it determined they didn’t meet the risky GOF definition or were urgent to protect public health. Some, for example, adapted MERS to infect mice, a step that can help researchers develop treatments. The remaining 11 studies had GOF components that were removed or put on hold.
DURING THE SECOND PAUSE, U.S. officials promised to come up with a more comprehensive approach to identifying and potentially blocking risky studies before they began. Advocates of tighter rules also pushed for less-risky approaches for studying altered viruses, such as using weakened virus strains, computer models, or “pseudoviruses” that can’t replicate.
Many virologists, however, argued that only studies with live virus can accurately show the effect of a mutation. “There’s only so much you can learn [from alternative techniques],” says University of Michigan, Ann Arbor, virologist Michael Imperiale, who supported the H5N1 GOF studies. “Sometimes using intact virus is the best approach.”
In 2017, the debate culminated with the release of the current HHS policy, dubbed Potential Pandemic Pathogen Care and Oversight (P3CO). It requires that an HHS panel review any NIH-funded study “reasonably anticipated” to generate an enhanced version of a pathogen that is highly virulent, highly transmissible, and might cause a pandemic. But it exempts natural, unmodified viruses and GOF work done to develop vaccines or as part of surveillance efforts, such as tweaking a circulating flu virus to assess the risks of a newly observed variant.
The HHS committee charged with implementing the policy, which operates behind closed doors, has since reviewed only three projects, approving all of them. Two were continuations of Kawaoka’s and Fouchier’s H5N1 work. (Both grants are now expired.) The third involved work with H7N9 avian influenza, but the investigator later agreed to use a nonpathogenic flu strain.
Other concerning studies have been given a pass, critics say. As an example, they point to work led by coronavirus expert Ralph Baric of the University of North Carolina, Chapel Hill. In the 2000s, his team became interested in determining whether bat coronaviruses had the potential to infect humans. (COVID-19 has since shown the answer is emphatically yes.) But the researchers often could not grow the viruses in the laboratory or enable them to infect mice. So they created hybrid, or chimeric, viruses, grafting the gene encoding the surface protein, or “spike,” which the wild bat virus uses to enter a host cell, into a SARS strain that infects mice.
NIH let this work continue during the 2014 pause. The researchers had no intention of making the mouse-adapted SARS virus more risky to people, Baric has said. But something unexpected happened when his lab added spike from a bat coronavirus called SHC014: The chimeric virus sickened mice carrying a human lung cell receptor, Baric’s team reported in 2015 in Nature Medicine. The hybrid virus could not be stopped by existing SARS antibodies or vaccines. In essence, critics of the work assert, it created a potential pandemic pathogen.
A review panel might “deem similar studies building chimeric viruses based on circulating [bat coronavirus] strains too risky to pursue,” Baric acknowledged. Yet he has also called these chimeric viruses “absolutely essential” to efforts to test antiviral drugs and vaccines against coronaviruses, and many virologists agree. They also argue that Baric’s work and related experiments provided an early warning that, if heeded, might have helped the world prepare for the COVID-19 pandemic.
THE PANDEMIC HAS SUPERCHARGED the GOF debate, in large part because of unproven but high-profile allegations—including from former President Donald Trump—that SARS-CoV-2 emerged from a laboratory in Wuhan, China. One prominent advocate of the lab-leak theory, Senator Rand Paul (R–KY), a senior member of the Senate’s health panel, has sparred with NIAID Director Anthony Fauci over experiments in virologist Shi Zhengli’s lab at the Wuhan Institute of Virology (WIV). With money from an NIH grant to a U.S. nonprofit organization, the EcoHealth Alliance, Shi had created chimeras by adding spike proteins from wild bat coronaviruses to a SARS-related bat strain called WIV1. The WIV researchers used methods developed by Baric, who has collaborated with Shi.
Last year, documents obtained by the Intercept showed that—like Baric’s work during the 2014 pause—NIH had exempted the EcoHealth grant from the P3CO policy. (The agency later explained that the bat coronaviruses were not known to infect humans.) But NIH also said that if Shi’s lab observed a 10-fold increase in a chimeric virus’ growth compared with WIV1, it wanted to be informed, because the experiments could then require a P3CO review.
The documents show WIV did observe increased growth in the lungs of infected mice and more weight loss and death in some animals. NIH has said EcoHealth failed to report these “unexpected” results promptly as required, but EcoHealth disputes this. Paul and some proponents of the lab-leak theory have gone further, alleging that NIH actively conspired with EcoHealth to hide the risks of the study.