Health Wanted Show Notes: Public Health Valentines
Public health is one of those areas where, if we do our job right, the general public is completely unaware.
It’s difficult to make someone appreciate the outbreak that was averted or the health complications they didn’t experience. These days, it feels like many have lost perspective about where we are today compared to 100 years ago.
The romanticization of a “simpler” time before vaccine schedules, or food regulations, or pharmaceutical ads seems to often forget details like how, in some cities in the early 1900s, up to 30% of children died before their first birthday, or a glass of milk could give you an incurable disease, or your cause of death could simply be listed as “teeth.”
So today, I want to make a little valentine for some of the greatest accomplishments in public health that you likely don’t even think about.
There are aspects of public health that you might never have considered were once not standard at all. One example is food safety.
While these days discussions of food safety tend to revolve around recalls or arguments about the harms or merits of certain ingredients, they used to sound more like “should we be letting people put formaldehyde in milk?”
Industrialization had more people moving into cities, which pushed the food industry to take unique approaches to maximizing its products and profits.
And since there were no regulations, there was no one to stop them from doing things like cutting milk with pond water and then re-coloring it with chalk, making it last longer by putting formaldehyde in it, using arsenic as green food dye or red lead to color cheddar cheese, or selling you a bag of coffee fortified with ground up bones.
Cinnamon was often just brick dust and pepper was dirt or charred rope.
Regulations only came in 1906 with the Pure Food and Drug Act, also called the Wiley Act.
Harvey Wiley became the chief chemist for the U.S. Department of Agriculture in 1883, and he decided to look into adulterated food and drink in order to make the argument that it was, in fact, bad.
And he did this by taking young, male volunteers and providing them three square meals a day—with the caveat that half the men must take capsules of food additives, so that they would become sick and he could make the case that the additives were unsafe and needed to be banned.
The men would come to be known as the poison squad.
The Pure Food and Drug Act was the start of food labeling. It forbade the adulteration of food to hide its inferiority and required labels on food and drugs not be false or misleading.
It was also the start of the formation of the FDA. And while the list of dyes we currently count as acceptable is still based on Wiley’s experiments from 120 years ago (indicating we have room for updating), it’s certainly an improvement over eating bread made with cement.
Going hand in hand with food safety is milk safety specifically.
It’s estimated that bovine tuberculosis, which is tuberculosis that comes from exposure to infected cows, was responsible for 15% of TB deaths in 1900. And given that TB was the second leading cause of death then, that’s a significant number of deaths!
In comes pasteurization. The process was invented by French chemist Louis Pasteur, who originally realized that heating wine stopped the bacterial growth that caused it to spoil.
There are many ways to pasteurize milk, but two of the most common are to heat it to 161°F for 15 seconds or to 280°F for 1-2 seconds, and then cool it rapidly. Both options keep the milk shelf-stable for longer, because there are no bacteria multiplying in it over time.
It’s not just TB that was making raw milk dangerous to health. Campylobacter, listeria, typhoid fever, and E. coli were also common pathogens found in raw milk. In 1938, it was estimated that 25% of all food or waterborne outbreaks were related to the consumption of milk. Today, milk is implicated in less than 1% of outbreaks.
Chicago was the first city to implement pasteurization in 1909 and within a decade the city’s infant mortality rate dropped by two-thirds.
Of course many other interventions likely contributed, but as industrialization of farming increased, farms began pooling milk from multiple sources into large tanks, which increased the risk of contamination, making pasteurization a no-brainer.
Another no-brainer today is hand hygiene.
Events over the last five years have drilled the concept of hand-washing and sanitizing even deeper into our brains than it was before, but it was not so long ago that the concept of clean hands didn’t even occur to people.
Ignaz Semmelweis was a Hungarian doctor in the mid-1800s who was concerned with puerperal, or childbed, fever—a condition in which women, shortly after giving birth, became sick with a fever and often died.
The hospital where Semmelweis worked had two maternity clinics, one staffed by doctors and medical students and one by midwives. Paradoxically, it was the clinic staffed by midwives that had significantly lower maternal mortality due to childbed fever.
The doctors and medical students would start their days conducting autopsies on women who had died the day before. Then they would stroll right over into the labor and delivery room and start helping evict those babies from their comfortable wombs…without washing their hands first.
The midwives did not perform autopsies, so their fingers were slightly cleaner.
This was about 13 years before Louis Pasteur would figure out that wine and milk were spoiling not from the spontaneous appearance of pathogens, but because pathogens that already existed in small quantities could multiply to disastrous effect.
This was a realization that led the way to germ theory—the concept that microscopic organisms are what make us sick, as opposed to the prevailing theory of the time that illness was caused by “bad air.”
Semmelweis is another great historical example of generally getting something right, but not totally understanding why.
He had a friend and colleague die from an infected wound after this friend cut himself while performing a childbed fever autopsy.
When Semmelweis did an autopsy on his friend, his findings were similar to what he saw in the women who died, leading him to theorize that certain “death particles” were coming from decomposing tissue, sticking to the hands of doctors, and being transferred to the women in labor.
Once he implemented a policy of mandatory hand-washing before doctors could enter the labor rooms, the cases of childbed fever dropped.
But no good deed goes unpunished, and many refused to believe that doctors could possibly be the source of death. Semmelweis even wrote a book on the subject that was dismissed and lambasted for its “unprofessional” writing style.
Sadly, the criticism was all too much for Semmelweis, who died in a mental asylum in 1865 and was not recognized for his immeasurable contribution to public health for another 20 years.
Another area of massive accomplishment in public health is improved workplace safety.
It might surprise you to learn that workplace safety is considered public health, but we spend so much time at work that improving conditions there leads to an improvement in overall well-being.
In 2022, it was estimated the total financial cost of workplace injuries was around $167 billion.
In the late 1800s, a doctor by the name of Alice Hamilton set up a well-baby clinic in a settlement neighborhood for the poor. Her daily interactions with these families got her interested in “industrial medicine,” or illnesses caused by certain jobs, as she saw the high number of widows and people suffering from partial paralysis due to lead exposure.
She researched things like carbon monoxide poisoning in steelworkers, mercury poisoning in hatters, and “dead fingers,” a condition in which the continuous use of jackhammers damages circulation in the fingers.
Her findings were so impressive they led to sweeping reforms, both voluntary and regulatory, and she was appointed as the first woman on the faculty at Harvard Medical School.
Research into occupational safety and the cost of workplace injury is what supports regulations and reforms.
OSHA, the Occupational Safety and Health Administration, was established in 1970, and since that time it has helped reduce workplace deaths by 63% and workplace injuries by 40%.
Even more basic than implementing safety laws that prevent workers from getting their arms ripped off in machines or buried in collapsed mines, public health research can contribute to best practices and trainings to reduce workplace injuries.
For example, in the 1990s, implementation of updated best practices for lifting patients, including the use of mechanical patient-lifting equipment, led to a 66% reduction in workers’ compensation claims for low back injuries among health care workers.
Crab fishermen used to have an occupational fatality rate of 770 per 100,000 full-time fishermen. That’s declined to about 114 per 100,000 after the Coast Guard implemented dockside safety checks to ensure boats weren’t at a greater risk of overturning due to being overloaded.
The thing about public health is that, when it works, it’s so boring that the lifesaving changes can even be seen as annoying or inconvenient.
Now you might be thinking “a lot of these innovations in public health happened a while ago. What has it done for me lately?” But the reality is, public health is responsible for making sure these basic standards of health continue to be met.
It’s monitoring your access to safe food and water. It’s inspecting farm safety and tracking disease outbreaks. It’s looking at how we can reduce the risk of infections. It’s writing protocols to quickly respond to emerging threats and keeping tabs on the risks you face at your job that could impact your well-being.
It’s constantly working behind the scenes to keep your daily life healthy without you needing to spend your time thinking about it.