Vaccines in America
by Alex Carter
In a recent presentation hosted by The Pitt Pulse and the American Medical Student Association, Dr. Amesh Adalja, a UPMC infectious disease specialist, provided a comprehensive overview of American vaccine policy and public perception.
On December 28, 2014, a California woman vacationing at Disneyland began to feel ill. Assuming she was suffering from an everyday viral infection, she carried on with her life, flying to Washington and then back to Orange County, CA, the following week.
It was not until more atypical symptoms appeared—namely a blotchy, red rash—that she realized she might have a more serious illness. On January 7, the California Department of Public Health announced that she had measles, a disease caused by the measles virus and one for which a vaccine is readily available. The woman, however, had never been vaccinated.
Measles is extremely contagious: the virus is transmitted through saliva and mucus that enter the air when an infected person coughs or sneezes, and it can survive on contaminated surfaces for up to two hours, spreading to anyone who comes into contact with them. Initial symptoms resemble those of many viral infections—fever, cough and runny nose. About four days after symptoms begin, however, a rash breaks out on the face and then spreads over the whole body. Although symptoms do not appear until seven to 21 days after infection, the victim becomes contagious just before the first symptoms appear. Treatment for measles is supportive, meaning that only the symptoms can be addressed. Healthy adults can recover in just over a week, but for young children or immunocompromised adults, the disease can lead to pneumonia or encephalitis, two potentially deadly complications.
According to the Centers for Disease Control and Prevention (CDC), measles was officially eliminated in the United States in 2000. In other nations, however, measles remains endemic. To prevent transmission from unvaccinated foreign travelers, most American children still receive the measles vaccine beginning at 12 months of age. Given as a set of two injections, the vaccine is about 97 percent effective, and it can be obtained for free under the preventive care provisions of the Affordable Care Act.
Despite having easy access to the vaccine, many Americans are electing not to immunize. As of March 2, 170 cases of measles had been confirmed in the U.S. in 2015. Of the first 34 victims in California, only five were fully vaccinated against the disease.
This measles outbreak is not unique. Outbreaks of other vaccine-preventable diseases, including varicella (chicken pox), pertussis (whooping cough) and mumps, have also occurred in the U.S. as a result of reduced vaccination rates. The American trend of forgoing vaccinations is startling and shows no signs of slowing, due in large part to the anti-vaccine movement.
To understand the emergence of the “vaccine debate,” it’s important to consider the history of vaccination policy in the United States. Every state has a law that mandates vaccinations for a number of standard diseases for children entering public school. At the same time, all states also permit exemptions for immunocompromised children, and 19 states (including California) allow parents to exempt their children due to “philosophical, moral, or religious beliefs.”
While these states vary in their requirements for claiming vaccine exemptions, there is usually a way to circumvent the law. The inevitable question, then: what deters parents from vaccinating their children in the first place?
Religion is one possible answer. Because vaccines are typically composed of inactivated or weakened viruses, they are produced most effectively in the virus's optimal environment: living tissue. Most are cultivated in animal tissues, but some have historically been developed in human cell strains originally derived from aborted fetuses. This could be problematic for religions that object to abortion. However, aside from a few small Christian denominations, none of the world's major religions officially condemns vaccines. The foundation of objection for most anti-vaccinators, therefore, is not religion. It is misinformation.
The anti-vaccine movement has been popularized in recent years by a number of celebrities, many of whom cite connections between vaccines and adverse side effects. Most of these critics' extreme claims (such as the suggested relationship between vaccines and autism) have been disproven by extensive studies. The researchers who first suggested the vaccine-autism connection in a 1998 paper have since retracted their support or had their medical licenses revoked.
It is true that vaccines can lead to complications. Some vaccine components can be allergenic, and cases of post-inoculation anaphylaxis (a potentially life-threatening allergic reaction) have been documented. However, such reactions are exceedingly rare (fewer than one in one million), and patients with allergies to constituents like egg, gelatin or latex are often advised by their physicians to forgo the vaccines in question.
The “risks” of vaccines are overwhelmingly dwarfed by their potential benefits. Simply put, vaccines save lives. The CDC estimates that between 1994 and 2014, vaccines prevented 732,000 childhood deaths and 332 million illnesses in the U.S. alone.
Anti-vaccinators commonly argue that vaccine-targeted diseases are relatively harmless. While it is true that under most circumstances a disease like chicken pox will simply cause an uncomfortable rash, not all people’s immune systems are robust enough to rapidly fight infection. Infants are especially susceptible to diseases, and infections at an early age (when major development is occurring) can cause permanent disability.
Vaccines are even more important for those who cannot receive them in the first place. Individuals who are very young, elderly or immunocompromised rely on the buffer created by the inoculated people around them. To prevent a contagious disease from spreading through a population, a certain percentage of that population must be immunized. For especially contagious diseases like measles, the threshold is about 95 percent. Once that rate is reached, "herd immunity" is established.
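A common epidemiological rule of thumb (offered here as an illustration, not a figure from the presentation) estimates this threshold from a disease's basic reproduction number, R0, the average number of people a single case infects in a fully susceptible population:

threshold ≈ 1 − 1/R0

For measles, R0 is commonly cited at roughly 12 to 18, which puts the threshold at about 92 to 94 percent of the population (before accounting for imperfect vaccine effectiveness), consistent with the 95 percent figure above.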
This type of community-level protection is the most effective defense against outbreaks. Thus, the decision to vaccinate oneself or a child is more than just personal.
Vaccines are equally important for the future. If high enough vaccination rates are achieved, entire diseases can be eradicated. Smallpox, a disease responsible for over 300 million deaths worldwide in the 20th century, saw its last naturally occurring case in 1977 and was officially declared eradicated in 1980, following a massive international vaccination effort. Future generations no longer have to fear this pathogen, and the same result could be achieved for many of today's comparable infectious diseases.
Herd immunity and eradication can only be achieved through widespread vaccination, and yet a recent Pew Research Center poll found that 41 percent of young adults believe childhood vaccinations should not be mandatory. A common perception, especially among youth, is that mandatory vaccination is just another example of government overreach into one's personal life. Yet there is essentially no comparable objection to laws that prohibit drunk driving, another public health issue that puts others at risk. The issue, therefore, appears to be one of perception.
In light of the recent measles outbreak, new mandatory vaccination laws are likely in the works. As responsible citizens, voters of all ages should be able to make an educated decision about vaccine policy. Whether a personal connection is apparent or not, we all must recognize that today’s health policies will inevitably affect our future.