When you hear the word fortification, what do you think of?
The strengthening of a city’s defenses? A steely mental demeanor? A commercial listing the benefits of a breakfast cereal?
Or maybe your mind goes to any number of videos you’ve seen on social media warning you against any food that has the word “fortified” anywhere on the label.
If pseudo-wellness influencers had their way, fortification would be seen as a dirty word, indicating the addition of dreaded “chemicals” (despite the fact everything on Earth, both good and bad, is made up of chemicals).
So, what’s the deal? What ARE those “essential vitamins and minerals” on the cereal box? And WHY are they being added to our foods in the first place?
There have been a few times on this show where I’ve said that public health rarely has a single solution to a problem, but when it comes to food fortification, it’s actually one of those occasions!
More than once, the existence of a debilitating disease has been diminished, if not eliminated, by the addition of simple micronutrients to food.
But first, let’s start with our glossary. You’re maybe used to hearing about macronutrients: the things we eat that make up the bulk of our calories and energy. These include things like carbohydrates, fats and proteins.
Micronutrients are things our body needs to function, but in small amounts. They’re those “essential vitamins and minerals” that food packaging is always talking about.
It’s estimated that 2 billion people (¼ of the world’s population) have a micronutrient deficiency.
And a lot of this deficiency can be addressed with fortification.
Food fortification is the process of adding micronutrients to foods to increase their nutritional value.
Food enrichment is the process of adding micronutrients back into foods that are lost during processing and production.
But don’t worry, you’re not going to public health language jail if you mix up the two; they’re sometimes used interchangeably.
So how do you even discover that someone is deficient in a nutrient they only need a small amount of to begin with?
Sailors, of course! Well, that’s at least true in one case.
If you think about it, long sea voyages were really the earliest version of randomized controlled trials. You can have two ships full of people of similar age and socio-economic status. You can then isolate them (at sea) and change one thing about their circumstances: their diet.
This is exactly how Dr. Kanehiro Takaki discovered that food could be the answer to a disease called “beriberi,” which plagued the Japanese navy in the 1800s.
Beriberi caused symptoms such as shortness of breath, difficulty walking, swelling or loss of sensation in the arms and legs, speech difficulty and confusion, and it killed a portion of the navy’s sailors every year.
Takaki was a British-trained doctor working for the Imperial Navy at the time. He noticed that cases of beriberi were frequent on the long training missions that took 9 months to travel between Hawaii and Japan, particularly among the low-ranking cadets, who often ate nothing but the polished rice they were given for free.
Takaki proposed the navy send out a ship on the same route but provide all crew with a varied diet. At the end of this trip, there were significantly fewer cases of beriberi and no deaths.
It wasn’t until a Dutch physician, Christiaan Eijkman, showed that feeding chickens unpolished rice could prevent beriberi that people began to understand that, in addition to needing macronutrients to survive, food likely also contained necessary micronutrients.
At the time they called these “accessory factors.”
In this case, the “accessory factor” was thiamine, also known as vitamin B1.
Polished rice (aka white rice) loses the natural thiamine that’s found in the rice bran when it’s processed, but unpolished rice (aka brown rice) retains it.
That’s why you now see the phrase “enriched” white rice. It has micronutrients added back in after processing.
And rice isn’t the only one! Have you ever wondered what the deal with “iodized salt” is?
The answer is goiters. At the base of your neck there is an undetectable, butterfly-shaped gland known as the thyroid.
Or at least it SHOULD be undetectable. When properly functioning, the thyroid uses iodine to produce two essential hormones that interact with nearly every part of your body. And because we can’t produce iodine ourselves, we rely on our diet and environment to provide it.
But if you’re not getting enough iodine from your environment, the thyroid can swell as it works to capture every last bit of iodine it can from the rest of the body.
The best-case scenario for iodine deficiency is a goiter, what we call the large lump that forms at the base of the neck.
Goiters can grow so large they press on the windpipe. Continue to be deprived of thyroid hormones and you become exhausted, lose your hair, your skin thins, your muscles ache and a thick brain fog sets in causing confusion and disorientation.
For babies in utero, the outcomes are even worse.
Fetuses rely on their mother’s thyroid to produce these hormones so they can grow. Without them, crucial stages of development can be missed, which can result in miscarriage or in children born with birth defects such as short stature, deafness, blindness, and physical abnormalities.
In places like Switzerland, these combinations of birth defects were so common they had a name: cretinism.
It’s a rather rude name (as medical terms at the turn of the century tended to be), and the condition is now called “congenital iodine deficiency syndrome.”
And before the 1920s, goiters were extremely prevalent not only in Switzerland, but also midwestern states of the U.S.
Naturally occurring iodine is concentrated in the ocean.
When prehistoric oceans receded, iodine was left behind in the soil, and it continues to be replenished as iodine vapor from the sea gets blown far inland.
But Midwestern states like Ohio and Michigan are really far inland. Too far for ocean vapors to reach, and the iodine that would be in their soil was washed out to the coast during the thawing and flooding of the ice ages.
Similarly, much of Switzerland was ALSO covered by a giant ice sheet which thawed over thousands of years, slowly but surely washing away all the rubble that would contain the prehistoric iodine.
David Marine (no relation to the iodine-containing ocean) was the one who recognized that goiters could be reduced or cured by adding saltwater fish to a person’s diet (remember, saltwater fish live in the ocean, full of iodine).
He went on to propose a trial in Ohio schoolgirls (who were at the age when goiters were most prevalent). He took 2,000 girls, gave them iodine supplements, and compared them to 2,000 girls who did not take the supplements.
Over time, none of the girls who received iodine developed goiters, and two-thirds of those who already had swelling of the throat returned to normal.
So why put iodine in salt?
Salt is also an essential mineral, but we need relatively little of it, and most of us know when we’re using too much.
Iodine (just like all things) can be toxic if you consume way too much.
Goiter surgeons were actually opposed to iodine treatment, not because they wanted to stay in business, but because they were worried about potential toxicity of iodine.
But putting iodine in salt (which easily masks the taste of the added iodine) means people not only self-regulate the amount they consume, but also don’t need to remember to take an additional supplement.
This is the basis of nearly all food fortification efforts: you add micronutrients to staple foods to supplement people’s diets without making them change their habits.
Sometimes those staple foods actually aid in the discovery of the micronutrient that people are missing. Take, for example, folic acid.
In the 1920s, newly graduated medical researcher Lucy Wills traveled to India to investigate why so many pregnant women were suffering from a debilitating, and sometimes deadly, form of anemia.
The existence of “germs” had only recently been discovered, and, as a result, people’s first assumption for any ailment was that it was caused by an infectious disease. Wills thought so too, until she noticed that this type of anemia was happening more frequently in poor textile workers.
She studied their diets and began to think the problem must be a deficiency in vitamins, but when she tried supplementing people’s diets with vitamin A or C, it failed to yield results.
Her biggest discovery came when she fed a monkey a diet similar to what the women were eating and saw the same signs of anemia in the animal.
But when she gave the monkey marmite, there was a marked improvement.
Marmite is made from brewer’s yeast, the leftover goop that remains after the fermentation process of beer is finished.
The B vitamins that are removed from wheat, barley, and hops when they’re processed into alcohol end up concentrated in that brewer’s yeast, making it a goldmine of micronutrients.
Among those nutrients is the vitamin B9, aka folic acid or folate, which is what these women were missing.
Folic acid is probably one of the most incredible discoveries of the 20th century.
Both in its ability to prevent disease, and the cost-effectiveness of implementing it. Because certain types of anemia aren’t the only thing it can prevent.
We now know that taking around 400 micrograms of folic acid a day before and during early pregnancy is essential to reduce the risk of neural tube defects, or NTDs, in infants.
NTDs occur when the structure of the neural tube fails to develop correctly as an embryo grows. What should become the spinal cord and brain can fail to properly close off at crucial points.
NTDs include spina bifida, a failure of the spinal column to close that leaves the spinal cord and nerves exposed, or anencephaly, in which a baby is born missing parts of their brain or skull.
Both defects can be fatal, and infants that survive often go on to have lifelong disabilities.
But 70% of these birth defects can be prevented with enough folic acid in the diet.
The evidence for folic acid to prevent NTDs is so strong that in 1998 the U.S. mandated that any cereal grain products (so things like flour, bread, pasta and rice) that were already labeled as “enriched” also be fortified with folic acid.
And research has found that, since beginning this fortification program, each year there are about 1,300 fewer cases of spina bifida.
But, like all things in public health, equity is still an issue. Not every country has food fortification programs, and not everyone in THIS country is eating foods fortified with life-saving micronutrients.