A vaccine that is intended to save countless lives. Parents suspicious of the shots, terrified of what the vaccine will do to their children. The government insisting on vaccinations for the general good.
Sounds like recent history, when a British doctor’s study linking autism to the three-in-one vaccine for measles, mumps and rubella panicked some parents into barring their children from being vaccinated. But long before actress Jenny McCarthy prominently stirred fears about MMR shots, the vaccination campaign to eradicate smallpox was met with similar trepidation. One significant difference: The study that caused the anti-MMR hysteria has been proved to be bogus, while the smallpox vaccine used a century ago carried genuine dangers.
When some states introduced mandatory smallpox vaccinations during the epidemic of 1898-1903, Americans resisted by the thousands. The ensuing battles produced medical conventions and case law that altered the balance between government authority and medical practice, in favor of federal control. The effects of the smallpox fight continue to this day: The Obama health-care law and the infrastructure required to administer it rely on some of these century-old precedents.
In “Pox: An American History,” Michael Willrich meticulously traces the story of how the smallpox vaccine was pressed into service during a major outbreak. Sometimes the shots were physically forced on people, outraging their sense of personal freedom and—when the vaccine sickened some and killed others—galvanizing suspicion of vaccination programs. The episode, Mr. Willrich says, prompted large swaths of Americans to insist that “the liberty protected by the Constitution also encompassed the right of a free people to take care of their own bodies and children according to their own medical beliefs and consciences.”
Historical records show that smallpox was a human scourge for thousands of years. The virus produces high fever, severe back pain and scarring eruptions of flat red spots on the skin that turn into pustules and then into scabs—a two-week process during which the disease is highly contagious. Smallpox can vary in its severity, with some strains killing many sufferers and others relatively few. In the late 18th century, the British scientist Edward Jenner discovered that scratching the arms of healthy children with a bit of pus from cowpox immunized them against smallpox. The revelation was jeered by skeptics, but soon many governments were encouraging smallpox shots—and cowpox-based vaccines would eventually rout the disease from the modern world.
The smallpox outbreak in the U.S. that began in 1898 was not as virulent as some earlier ones, but memories of past horrors and mounting deaths across the country stirred officials to action. Over the epidemic’s five-year course, an estimated 4,000 to 5,600 Americans died from smallpox, and tens of thousands suffered from nonfatal but often disfiguring infections.
Prior to the epidemic, public health was largely the province of state and local authorities. But many officials proved unable, or unwilling, to intervene when faced with the epidemic itself. In some cases, money was at issue. In other cases, racism. In the South and elsewhere, Mr. Willrich says, local officials refused to invest in stopping a virus that they saw as a blight of “dark people” who were forced to live in close quarters with poor sanitary conditions. Mostly, though, the failure was a matter of ineptitude: Many doctors and others who focused on the infectious smallpox pustules didn’t understand that the virus could be easily transmitted by a cough or a sneeze. And the vaccination programs were at best haphazard.
Walter Wyman, the U.S. surgeon general for two decades beginning in 1891, “railed against the short-sightedness of local and state officials who, he believed, had allowed smallpox to rage out of control,” Mr. Willrich writes. Whatever the cause of the anemic response to the epidemic, it prompted federal intervention in public-health matters—opening a door that has never closed.
Invoking what Mr. Willrich calls “a precedent in the American legal tradition of police power, which allowed for broad governmental intrusions into everyday lives of American citizens” when the public welfare was at stake, the feds stepped in. The government deployed “virus squads” of vaccinators who fanned out across the country, aided by Texas Rangers along the Mexico border and by billy-club-swinging policemen in New York. Health officials opened “pesthouses,” where the ill were sequestered; sometimes whole towns were quarantined. The vaccinators visited factories and schools and railroad stations. In some cases, people who had been exposed to the smallpox virus were vaccinated at police gunpoint.
The public’s reluctance was understandable: The smallpox vaccine was sometimes shoddily produced. In Camden, N.J., in 1901, the deaths of nine schoolchildren were linked by newspapers to a commercially produced vaccine tainted with tetanus. (Around that time, 13 children in St. Louis died of tetanus after receiving contaminated diphtheria antitoxin.) It wasn’t just civil libertarians who opposed the compulsory vaccinations; many practical people did the math and preferred to take a chance with the virus, not the vaccine. Nor was it only a fear of death that scared people: Reports of excruciating arm soreness caused by the vaccine made manual laborers, worried about having to miss work, avoid the shots. People produced fake vaccination certificates and sometimes injured their skin to simulate the telltale scar caused by the vaccine.
The feds realized that if they were going to mandate vaccination, the government would have to ensure that the shots were safe. In 1902, President Theodore Roosevelt signed the Biologics Control Act, the first federal law to regulate drug products. It was a precursor to today’s Food and Drug Administration.
This being America, the vaccine tempest also gave rise to litigation. Mr. Willrich, a history professor at Brandeis University, says that the most significant of what became a series of legal rulings was the 1905 Supreme Court decision in Jacobson v. Massachusetts. The complex opinion gave the high court’s blessing to compulsory-vaccination schemes. But the justices also established a set of standards for balancing governmental power and individual rights during public emergencies. The decision was invoked in 2004 by Justice Clarence Thomas in his dissent from the court’s ruling in Hamdi v. Rumsfeld, which granted certain rights to U.S. citizens detained as “illegal enemy combatants.”
In the end, vaccination methods were dramatically improved, the epidemic was stopped—and over the next few decades smallpox was wiped out of human circulation by concerted vaccination campaigns and the swift isolation of the infected. Today, the virus is confined to frozen storage inside a lab in Atlanta and one in Koltsovo, Russia. But the federal role in the practice of medicine remains very much alive.
Scott Gottlieb, M.D., is a resident fellow at AEI.
© 2016 American Enterprise Institute for Public Policy Research