American society is inundated with food. There are thousands of options for what to eat and a myriad of ways to eat it. So why does food need defending? In In Defense of Food, Michael Pollan distinguishes between real food and processed food and argues that our reliance on the latter has produced a society plagued by Western diseases.
Since the mid-20th century, Americans have looked to the government and scientists to tell them what to eat and what not to eat. As we’ll see, a decision that was once successfully made by families guided by tradition was turned over to those who benefit most from consumer confusion.
The industrialization of food changed both how food was produced and its nutritional value. Farmers started using pesticides and chemical enhancements to fortify soil for faster production, tainting the plants grown in it. Reliance on three main staples—corn, soy, and wheat—spread through the industry in the form of refined carbohydrates, processed sweeteners, and hydrogenated fats. Nutritional science and industry successfully shifted the focus from real food to nutrients and implied that eating was solely intended to bring about physical health.
We have willingly handed the chef’s hat to the multi-million-dollar food and science industries, losing the culture of food along the way. We’ve become obsessed with health labels and eating by the numbers, but none of this has resulted in better health. In fact, an increase in the rate of heart disease, diabetes, cancer, and obesity is the result of the industrialization of food and the advent of the Western diet. Getting us back on track to healthier bodies and minds requires a paradigm shift about food production and food culture.
The only rule about eating that you need to understand is: eat mostly plant-based real food in moderation. This idea is simple enough and should be easy to follow. What has prevented us from following this simple rule is the rise of nutritionism. Nutritionism is an ideology about nutritional science, not an actual branch of that science. The focus of nutritionism is on isolating certain nutrients—proteins, carbohydrates, certain fats, and antioxidants—as the cause of either good or bad health. The concept is “eat more of the good nutrients to live longer.”
But this narrow view of health and eating ignores the other advantages of food. These advantages include the holistic benefits of whole foods and a more traditional food culture.
The focus on nutrients led to the lipid hypothesis, a theory developed in the 1960s that states that fat and cholesterol, mostly from meat and dairy, lead to increased rates of heart disease. Originally, the hypothesis was used to urge consumers to eat less animal protein and dairy, but the powerful and influential meat and dairy industries fought back.
They used the concept of nutritionism to shift the blame from the unhealthy whole foods to unhealthy nutrients. Their efforts changed government guideline language from “eat less animal proteins and dairy” to “eat proteins and dairy that help reduce saturated fat.” With this shift, nutritionism gained a massive hold over America’s food culture.
The science and food industries tightened their grip on the American diet after winning another major battle against real food. In 1938, federal law stipulated that “imitation” food—food made with artificial ingredients or chemically altered—must be labeled as such. With food manufacturers now touting the benefits of nutritionism and processing foods to match, they feared the negative response consumers might have to this label.
With the help of the American Heart Association, which recommended that foods be modified to reduce cholesterol and saturated fats, the food industry succeeded in repealing the mandate. Now, any food-like product that equaled or surpassed real food in nutrient value could be sold without a warning label. The age of processed food kicked into high gear.
The issues with nutritionism and processed foods would be nil if we were actually getting healthier, but we’re not, thanks to the lipid hypothesis. The lipid hypothesis made saturated fats the main enemy of health. People believed that removing saturated fats from food and replacing them with “good” fats, like hydrogenated seed oil, was the best solution to improving health. The low-fat food revolution began, with hydrogenated oil becoming the fat of choice to be added to all foods to make them healthier. This decision had far-reaching consequences. Many believe the current obesity and diabetes epidemics in America correspond with the rise of low-fat foods in the 1970s.
The process of solidifying vegetable oil using hydrogen creates trans fat. Scientists and food marketers urged Americans to eat more trans fat in low-fat foods. Unfortunately, the evidence supporting this recommendation was flimsy from the start.
American eating habits during and after WWII formed the assumptions that led to the lipid hypothesis. During the war, meat and dairy were rationed and heart disease was low. After the war, eating resumed as normal, and heart disease increased. But the researchers failed to acknowledge that people ate less during the war and exercised by walking more because of rationed gasoline. Also, post-war eating became increasingly industrial, with hydrogenated oil in foods like margarine already soaring in popularity during that time.
It wasn’t until the 1990s that scientists realized the deadly effects of trans fat. But by that time, it was too late. Low-fat foods had taken over the American diet, and Americans got fatter and sicker.
Most nutritional studies in America have one thing in common—none of them examine the types of food promoted by the Western diet. They only look at nutrients. But when you ignore the effects of processed foods, added fats, sugar, and scant amounts of fruits, vegetables, and whole grains on the body, you’re missing a big piece of the puzzle.
Scientists know that the Western diet is linked to killer diseases and that cultures that adopt the Western diet fall victim to these diseases shortly after. What was not known before was whether the negative effects could be reversed. However, studies now show that returning to traditional dietary patterns involving whole foods can mitigate the damage done. To create eating habits that support better health, we need to fix the relationship between humans and food as part of the overall food chain.
When one aspect of the food chain is disrupted, it affects the entire food chain. If the soil is unhealthy, the plants grown will be unhealthy, as will the animals who eat the plants and the humans who eat the animals. We once had a close familiarity with our food and used our senses and instincts to determine when food was good or bad. Our bodies also knew how to accept the food and what to do with it. But five shifts created distance between people and food and led to a host of health issues for the entire food chain.
The advent of refined carbohydrates is one of the most fundamental changes in the history of food and one of the most damaging to the human body. In the past, we ate whole wheat ground in stone mills. Stone grinding retained the germ in the wheat, which contains most of the nutritional value. But the invention of steel rollers during the Industrial Revolution changed how grains were processed. The rollers removed the germ, creating a fine white flour devoid of nutrients and more resistant to spoilage. Corn, rice, and sugar went through similar processes.
By removing the nutrient-rich germ, the internal synergistic interactions of the whole grain were lost. The refined flour now broke down into glucose quickly, without fiber to slow its release. The problem with this simplified digestion process is that the body reacts by producing excess insulin. The influx of insulin makes you think you’re hungry and leads to overeating. In the worst case, your body can’t produce enough insulin to match the excess sugar, and Type 2 diabetes results.
After WWII, farmers began adding nitrogen, phosphorus, and potassium to soil to increase the rate of plant growth. At the same time, farms that once raised a diverse mix of crops and livestock started growing mostly corn and soybeans. Both changes produced plants lacking in nutrients.
When plant growth is sped up, the time allotted for plants to soak up nutrients is shortened. And with the three growth elements readily available, plant roots have no need to dig deep to seek vital minerals. This lack of nutrients makes plants more susceptible to pests and disease, which farmers then counter with pesticides. The pesticides seep into the plants and are digested as toxins in the body.
Likewise, reducing the ecological diversity of farms to one or two crops means less diversity in the nutrients going back into the soil. The human body requires a myriad of minerals and nutrients to function properly, and the likelihood of two species providing anywhere close to the appropriate amount is small, especially when those species are industrially grown.
The efforts to simplify how food is grown, make food more durable, and reduce the number of food species have stripped most of the nutritional value from the whole foods on the market today. The result is a kind of food inflation, in which it takes more food today to gain the same nutritional benefits food provided in the past.
Corn, soy, and wheat are high-yield crops. They’re grown quickly and in abundance. Corn and soy are easily manipulated for use as sweeteners, fats, and proteins, and their use in most processed foods allows food to be cheap. Livestock are also fed these crops, reducing their value both in price and nutrition. Cheap and easy food is now the standard in the American diet, but it comes with the cost of poor nutrition.
The shift from green plants to the three staple crops creates a disadvantage for human health. Plants are high in micronutrients, such as antioxidants, vitamins, and omega-3 fatty acids. Grains and seeds are high in omega-6 fatty acids and macronutrients, or fats, carbohydrates, and protein. Macronutrients are high in calories, and an overabundance in the body is linked to many Western diseases.
Furthermore, omega-6s promote inflammation and reinforce cell walls, which makes metabolism difficult. Omega-3s, in contrast, support metabolism by making cell walls permeable, reduce inflammation, and seem to regulate heart rhythms. Both fatty acids compete for cells and enzymes in the body, so the ratio of the two is likely the significant determining factor for your health. A diet higher in green plants than in seeds and grains will help push the ratio in your favor.
The shift in focus from whole foods and traditional meals to nutrients and fast processed food has significant effects on our health and food culture. The problem is that we see food as a mere mechanism for nutrient consumption—the faster and easier we can get those nutrients, the better. The eat-on-the-go mentality is so ingrained in us that whole foods play a small part, if any, in our eating habits. In addition, Western diseases have become so common, they feel inevitable.
Our complacency with the state of food and culture makes it easy for science and industry to continue producing fortified food instead of addressing the underlying issues of diet and lifestyle. To stop the Western diet from continuing its rampage on society and health, we need to reclaim the lost quality and culture of our food.
To change your habits away from the Western diet, you must differentiate the theories from the problem and work to address the latter. Regardless of the theory, the problem remains the same—eating a Western diet leads to Western diseases. The “how” and “why” are less important than the “what,” meaning you have to treat the whole problem, not just the symptoms.
The following rules can help you cut ties with the Western diet.
What to eat:
What not to eat:
How to eat:
Where to get food:
When you understand where food comes from and prepare it yourself, you become part of a healthier food chain. When the whole food chain is respected and supported, your health and the health of the natural world will be, as well.
For decades, Americans have sought to determine which foods support optimal health. A multi-million-dollar food marketing and science industry has dedicated its efforts to finding this answer. But the solution to what to eat can be boiled down to a basic principle: eat mostly plant-based real food in moderation. Seems easy, right? So why all the fuss?
The tried and tested days of tradition are gone. In those days, fresh and unprocessed foods dominated the culture and family guided eating decisions. Now, we have grocery stores full of processed foods and a government food pyramid about as trustworthy as the money-making schemes of the same name.
Food products no longer look like whole food. They are modified for mass production, easy consumption, and low cost. They come with health claims stamped on their packages to guide the would-be savvy shopper. The existence of a health claim on a food package should be the first red flag to consumers. Real food doesn’t need a health claim. Processed foods do.
We have given the power to decide what to put in our bodies to science, government, and media. The need for humans to be told what to eat, something our ancestors were able to figure out for themselves for centuries, shows just how far we’ve fallen down the confusing dietary rabbit hole.
The dietary guidelines for optimal foods and eating habits change more than once per generation. These changes are motivated by information doled out by those who gain the most from consumer confusion. If people stopped needing dietary advice, the following consequences would occur:
With all of this at stake, why would these industries work to stabilize nutritional information or make deciding what to eat easy?
They wouldn’t. Instead, these industries have created an atmosphere that supports three prominent myths about food:
Through these myths, we’ve become victims of the oppressive message that science knows best when it comes to food. And we’re told that everything we need to know about what to eat can be found on dietary labels.
One problem with this mindset is that food is not merely an engine that fuels biological health. Eating is a cultural practice as much as it is an avenue toward health. Food is also meant to be enjoyed. It helps create cultural, community, and individual identity. Sharing meals strengthens family and social bonds.
These aspects are not represented by the mass-marketing of processed foods and nutrition science. Instead, they have gone by the wayside of a cultural obsession with the body. Unfortunately, this obsession hasn’t led to healthier bodies.
Americans are increasingly falling into the trap of orthorexia—an unhealthy obsession with healthy eating. No other culture in the world worries about food as much as we do. But for all the worrying and efforts to standardize the link between food and health, we are less healthy than ever before, and less healthy than any other culture.
The American paradox is the opposite of the French paradox. The French paradox explains how the French are less concerned about nutrition yet eat a richer diet and are healthier than Americans. By contrast, all the rules and concern regarding food in America add up to an increase in the rates of four “killer” diseases: heart disease, diabetes, stroke, and cancer.
Before the industrialization of food in the mid-20th century, these diseases were not as common. The introduction of and reliance on processed and refined foods, pesticides, growth hormones, modified fats, and three staple ingredients—corn, soy, and wheat—created the “Western diet” and set the health of Americans into a tailspin.
In other cultures that have not adopted the Western diet, these diseases are low in frequency. In contrast, cultures that have adopted the Western diet quickly see an increase in the number of people afflicted with obesity and killer diseases.
The Western diet is a well-oiled machine. It focuses more on refining a heavily processed diet for maximum benefit than on questioning why the food is not healthy to begin with. Non-industrialized fresh fruits, vegetables, and whole grains are not prominent in many mainstream dietary considerations because these foods take more effort to acquire. Short of cultivating a private farm, access to these foods has always been limited.
However, the increased popularity of farmer’s markets, organic farms, and ethical livestock rearing has made it possible to cut ties with industry-driven food sources. Additionally, these better food options are more advantageous for the environment and for the psychological health of consumers.
What you eat has actual consequences on physical and psychological health. Moving toward a new era of real food, or food-as-food, will help shift the cultural dietary paradigm away from the destructive Western diet.
In Part 1, you’ll learn about the ideology of nutritionism and the way science and government perpetuate it. In Part 2, you’ll learn what the Western diet is and how we came to rely on it. And in Part 3, you’ll learn how to move away from the Western diet and toward better health.
American eating habits took a sharp turn in the 20th century, when the focus on positive nutrition turned from whole food to nutrients (chemical and mineral compounds found in food). Although the theoretical aspects of this shift sprouted between the late-19th and mid-20th centuries, the effects didn’t become wholly obvious until the 1980s.
On grocery store shelves in the 1980s, food identified as food was being replaced by packages identified by their nutrient components, whether bad or good. Words like cholesterol, fiber, and saturated fats replaced eggs, grains, and meat. This new strategy suggested that the inclusion or exclusion of these substances equaled positive or negative health for consumers. Real food became antiquated. Nutrients took over as the shiny new, lab-tested giant of healthy eating—eat the good and avoid the bad to find physical Nirvana. The events that led to this shift and those that have followed created the Western diet.
The work of an English doctor and chemist would change the face of the American diet forever. William Prout was the first person to isolate and identify the three main compounds in food, called macronutrients: fat, protein, and carbohydrates. Following on Prout’s coattails was German scientist Justus von Liebig, one of the founders of organic chemistry. Liebig took these macronutrients and broke them down further into a handful of micronutrients he claimed were solely responsible for digestion and growth.
Following this logic, Liebig created bouillon and the first baby formula to concentrate these vital nutrients. However, a few observations challenged his assertion:
These observations led to further research, and from those efforts came vitamins. Vitamins became the new best thing in nutritional science. Deficiency conditions, like beriberi and scurvy, cleared up because of certain vitamins. Then vitamins were linked to growth and vitality. By the mid-20th century, vitamins took center stage as the answer to the question of what to eat, leaving whole foods further behind.
In the 1960s, the government sought answers to the growing health crisis in America, including the increase of killer diseases and malnutrition. The Senate Select Committee on Nutrition and Human Needs was formed to research the relationship between food and health. At the head of the committee was South Dakota Senator George McGovern. For several years, scientists studied food consumption and health in America and discovered some interesting patterns.
These findings created the “lipid hypothesis,” which stated that fat and cholesterol, mostly from meat and dairy, led to increased rates of heart disease. This hypothesis motivated the committee to create the Dietary Goals for the U.S. guidelines in 1977. This report advised Americans to reduce their consumption of red meat and dairy. It did not survive long in its original form.
The meat and dairy industries strongly criticized the committee’s findings, including cattle ranchers in South Dakota—Senator McGovern’s constituents. As a result, a retraction was printed that changed the advice from “eat less red meat and dairy” to “choose meat and dairy that reduce saturated fats.” With this change, a domino effect occurred in the industrial, political, and social spheres.
The problem with the changes made to the report was the lack of acknowledgement of how certain foods differ from one another. The report suggested that all meat was equal nutrition-wise. But beef, poultry, and fish are about as similar as cars, bikes, and skateboards. Saying these three meat items are similar vehicles for a simple nutrient, saturated fat, is like saying the three transportation devices are the same because they get you to your destination.
This flawed logic was perpetuated by the National Academy of Sciences (NAS) in their report on diet and cancer. Rather than report their findings in a food-by-food manner, they used a nutrient-by-nutrient manner. For instance, the report spoke of the benefit of antioxidants in vegetables, but not the benefit of any specific vegetable.
With the Senate’s guidelines and the NAS report, science had officially begun our current obsession with nutrients over food. With the help of the food industry, marketers, and media sources, America entered the Age of Nutritionism.
“Nutritionism” focuses on the benefits of foods as a sum of their parts, rather than the whole. The term was coined by a sociologist as a way to explain this new way of regarding food and health. To understand what nutritionism is, you must understand a few key aspects encompassed in the term.
The suffix “-ism” denotes a system, philosophy, or ideology. Therefore, nutritionism is an ideological concept regarding the science of nutrition, not a field of that science. Ideologies are grounded in loosely examined assumptions about observed relationships among pieces of information.
Regarding nutritionism, the assumption is that nutrients are the gateway to understanding food, or the parts compose the whole. Because nutrients are invisible and intangible, a subsequent assumption is that the public is not equipped to understand their nuanced details. People must rely on science to tell them what the value of food is.
The reductionist assumption that food is eaten solely for health benefits is understandable when food is valued only as a vehicle for nutrient transportation. This assumption is further supported when science becomes the authority for disseminating the value of nutrients. The resulting mindset is that certain nutrients are valuable and certain nutrients are harmful.
To ensure the public’s interest in the good nutrient, a bad nutrient is offered up as the enemy. The past decades have seen wars waged between carbohydrates and proteins, proteins and fats, and fats and carbohydrates.
In addition to the conflicts between these “super power” nutrients, smaller civil wars are waged within each: simple carbs vs. complex carbs, animal protein vs. plant-based protein, and saturated fats vs. polyunsaturated fats. Currently, the virtuous omega-3 fatty acids are being promoted over the devious trans fats.
This type of nutritional duality creates an environment ripe for a revolving door of fad diets and shifting loyalties among nutrients. These shifts happen in the blink of an eye based on whatever way the science winds blow.
When we only consider foods as valuable for their nutrient make-up, we ignore the internal workings of whole foods. An example of this short-sightedness is the current state of milk.
Milk has become nothing more than a vessel for delivering protein, lactose, fat, and calcium. For decades, food science has attempted to simulate the beneficial effects of these nutrients in milk-like substances. From Liebig’s first attempt to develop a suitable baby formula, several other iterations have entered the market. However, despite the changing compositions of ingredients and added nutrients, there has never been a time when babies fed formula have thrived as well as those given human milk.
If baby formula has the same chemical compounds as milk, it should react similarly in the body. But it doesn’t. The question must be asked: What inherent interactions are happening within the whole milk product to create this advantage?
The boundaries between whole foods and processed foods diminish when food is only valuable as the sum of its parts. Nutritionism would have you believe that a box of cheese-food product fortified with essential nutrients is as healthy or healthier than actual whole milk cheese straight from the farm. This suggestion is extremely beneficial for the nutritionism movement, but it may be extremely dangerous to your health.
The relationship between nutritionism and processed food manufacturers is akin to that of criminals and accomplices. Stating that any food, even synthetic food, can be made healthy or healthier than real food through scientific modifications justifies the value of processed foods. This justification reinforces their appeal to the public. Yet, the modifications may not always lead to healthier people. The history of margarine is a good example.
Margarine was created in the 1800s as a cheap alternative to butter, and by the 20th century it was made from hydrogenated vegetable oil. After the lipid hypothesis took hold, margarine manufacturers swept in with claims that their product was healthier than fat- and cholesterol-laden butter. The hydrogenated vegetable oil could be injected with all sorts of good nutrients to support optimal health.
Unfortunately, solidifying vegetable oil with hydrogen creates trans fats. At the time, no one knew that trans fats were even more dangerous than the saturated fats they were replacing. But once scientists saw the link between trans fats and killer diseases, margarine got a face-lift to reduce trans fats. A new war was waged against trans fats as though they’d never been promoted. This was a positive step for future consumers, but those who’d been eating margarine in good faith for decades may not have been so lucky.
How was it possible for this dangerous food source to rise to the status of shining star of nutritionism? Again, the U.S. government played a large role.
Americans were almost saved from the manipulative marketing of the food industry, but the power of the industry thwarted that effort. In 1906, the publication of Upton Sinclair’s novel The Jungle created a crisis for the food industry. The novel exposed the horrific conditions and unethical processes involved in meat and by-product manufacturing. The public urged the government to do something to protect them from these adulterated and unhygienic products.
In response, the Food, Drug, and Cosmetic Act of 1938 created mandates requiring all synthetic foods to be labeled “imitation.” The idea was simple: if a food was packaged to represent traditional real food but was modified in some way, it must be outed as such. Enter the food industry, which fought for decades with everything it had to repeal the mandate. It complained that the rule undermined innovation in food science aimed at addressing the lipid hypothesis and the relationship between food and killer diseases.
The efforts of the food industry were again bolstered by scientific organizations. The American Heart Association spearheaded a campaign against saturated fats in favor of vegetable oil. It recommended that foods be modified to remove the bad fats and cholesterol. To ensure the public would eat these healthier foods, they promoted the removal of the “imitation” label. The AHA believed the label placed a negative light on modified foods.
With this scientific backing, the Food and Drug Administration adjusted the mandate in 1973. The new rule stated that any “imitation” product equal or superior to the real item in nutrient value could be sold without labeling it as synthetic. With this step, the FDA became the newest member of nutritionism, inducting the American public by proxy.
Food manufacturers had a field day without the restrictive labeling requirements. Any food with “unhealthy” natural fat, such as yogurt and cream, could be modified with hydrogenated oil. Soy protein could now masquerade as animal protein in products like Bacon Bits. Egg substitution products appeared on the scene, joined by non-dairy creamers.
The assumption was that science could perfectly simulate the benefits of real foods without the negative natural components. But as the baby formula example illustrates, that claim has never been validated. Still, the passive public was none the wiser, and the food industry was off to the races. Enter the golden age of processed food.
Food processors, free of marketing restrictions and backed by the government, set out to make processed food the food of choice. They modified any food they could. Labels such as low-fat, fat-free, cholesterol-free, high-fiber, low-fiber, and low-carb became the norm. Foods that once consisted of two or three ingredients, such as mayonnaise and whipped cream, now had tumbling lists of additives.
If a nutrient once thought healthy was found to be otherwise, the food-like product was reengineered to reduce the bad and include the good. A new set of health claims was printed on new boxes. The public followed this advice until new evidence contradicted those claims. And because health claims couldn’t be printed on fruits and vegetables, healthy foods were often left in the dark.
In fact, the fate of real food in supermarkets is still dependent on what science dictates. If carrots are said to be healthy, people will buy them. If not, they won’t. A new subgroup of food science has formed to exploit the benefits of certain whole foods based on a particular nutrient. Walnuts, once deemed fatty, are now hailed for their omega-3 compounds. Pomegranates and blueberries are touted for their antioxidant properties. In fact, antioxidants are so popular now that the Mars Corporation is funding a scientist at the University of California, Davis, to study the antioxidant compounds in chocolate. In the near future, chocolate may be considered a healthy food because of those findings.
The Era of Atkins
In 2003, the Atkins diet took over the world after cardiologist Robert Atkins claimed that carbohydrates made people fat and unhealthy. He pushed the consumption of protein and fat to lose weight, and people everywhere stopped eating carbohydrates.
Bread and pasta manufacturers began to re-engineer their products to reduce carbs and increase protein. Low-carb breads and pastas flooded the market, some even stamped as “Atkins-friendly.” Unfortunately for vegetable and fruit farmers, there was no way to make a potato or banana low-carb. These whole foods were doomed under the fad.
Before 1973, a label of “imitation” would have been added to these breads and pastas. Without the label, these highly processed food-like products were consumed en masse.
With so much emphasis put on nutrients for health, you’d expect to see the healthiest population in the world. This expectation suggests that the science and policies surrounding food are sound and unquestionable, which has rarely been true. In fact, the most significant nutritionism movement of the last 30 years is based on conjecture and contributed to the current health epidemic—the lipid hypothesis.
The advent of the lipid hypothesis severely altered the way we thought about what to eat and how to eat it. These behaviors diminished the culture surrounding food and the overall health of the population.
The research used to support buy-in to the lipid hypothesis was scant. In fact, only a few studies ever found a positive correlation between saturated fats and cholesterol and coronary heart disease (CHD). A number of recent efforts have attempted to debunk the low-fat movement by questioning the validity of the research that led to the hypothesis.
During the period in question, eating habits were trending toward supposedly healthier choices, yet CHD rates were rising. Consumers were actually eating fewer sources of animal fats, such as lard and tallow. Instead, margarine had grown in popularity, outselling butter for the first time in 1957. In general, consumption of animal proteins decreased after the war, from 84 pounds per person to 71 pounds between 1946 and 1976. In this same time period, the use of seed oils doubled.
One explanation for these confounding results may be an oversimplification of the relationship between CHD and meat and dairy during the war. Sugar and gas were also rationed, and eating habits were different. People ate less, in general, and fewer refined carbohydrates during the war. People ate more fish and got more exercise from walking due to a lack of gasoline. Finally, regarding the lower CHD rates in other cultures, it was entirely likely that those cultures had different customs surrounding food, ate fewer calories, and led different lifestyles that included more exercise.
Despite these variables, the finger staunchly pointed at fat as the culprit. A shift in perspective about food swept through the entire population. People turned away from decades of traditional eating habits and toward science-guided standards. What followed was a diminished state of health for the American public.
The war waged against saturated fats dismissed some key aspects of the role of fat in the body. The brain is mostly fat, which is needed to protect its delicate cells and neurons. Fats are also the foundation of the body’s cell membranes. Different amounts of various fats help the body absorb vitamins and minerals on a cellular level.
But these concerns were not important to the booming food industry, which had much to gain from the lipid hypothesis. Once the hypothesis was embraced, all movement in the food industry became geared toward removing fats from food. Whatever nutrients or ingredients remained, such as refined sugar or high-fructose corn syrup (HFCS), were of little consequence.
What’s interesting is that although Americans reduced their total calories from fat from 42% in 1977 to 34% in 1995, fats were still a major part of the American diet. Saturated fats were removed, and polyunsaturated fat and trans fat took their place. Meat remained a strong staple in the diet. Now, the chunk of beef on the plate was replaced by a chunk of chicken breast with a hefty side of carbohydrates. These actions were justified by the guidelines to “eat more low-fat food.”
With free license to eat as much low-fat food as they wanted, people didn’t change the amount of food they consumed. They ate massive amounts of foods deemed “good” because of the nutrient value. You could now eat a whole box of low-fat cookies and call it healthy eating. Unfortunately, this thinking was based on a lie.
In contrast to the scant evidence linking saturated fat to heart disease, the evidence linking trans fat to heart disease is vast. Trans fats were the alternative fat used in most low-fat foods. It’s hard to know how much trans fat was consumed before the 1990s, when scientists learned that the product they’d been pushing was almost twice as likely as saturated fat to cause heart disease. (Shortform example: If saturated fat is the equivalent of the common cold, trans fat is the bubonic plague.)
Trans fat negatively affects the body in several ways, most notably by raising LDL (“bad”) cholesterol while lowering HDL (“good”) cholesterol.
Yet, no one in the health or science community has acknowledged or expressed regret for the error. If they did, the American public would know how wrong they were, and people would never trust their guidelines again.
All of the negative aspects of nutritionism might be worth it if people were actually getting healthier. But all the focus on low-fat foods led to a fatter America. Many believe that the current obesity and diabetes epidemics began in the 1970s with the rise of low-fat foods.
The promotion of carbs over fats muddied the important differences between complex and refined carbs. Refined carbs are believed to require more insulin to break down, which leads to a false sensation of hunger and overeating. The process also doesn’t allow fats to be broken down properly.
Also, the biggest point of justification for the low-fat diet—the decreased heart disease mortality rate since the 1960s—is misleading. The New England Journal of Medicine conducted a 10-year study on coronary heart disease and found that rates were not actually reduced. The smaller number of people dying from this disease has more to do with advances in medicine than diet. In fact, the number of CHD-related hospital admissions has not decreased at all.
Several factors make the study of nutrition difficult: 1) the focus on nutrients makes studying whole-food benefits difficult, and 2) the tools used to measure nutritional data are methodologically weak. These factors make it hard to truly understand food and eating habits. However, this problem doesn’t stop scientists from claiming their findings as fact.
When you take the nutrient out of the context of food, you miss influential relationships within whole foods and the numerous possible benefits beyond the one identified. Other important variables pertaining to health are also ignored, such as diet patterns, lifestyles, and human physiology.
No two humans are alike. There are specific inherent differences that either support or hinder nutritional health.
The human eater is not a robot that reacts to food in the same way every time. Your body is constantly adjusting to your environment, and the science on the physiological makeup of the digestive system is incomplete. (Shortform note: What this means is that human health statuses cannot be generalized any more than singing voices can.)
Likewise, no two whole foods are alike. Whole foods also have a complex internal structure in which various compounds work together depending on type and size. Whole foods behave differently in the body than nutrients do, and since people consume whole foods, not isolated nutrients, nutritional science should focus on food.
The reluctance to see the whole rather than the parts is ingrained in the field of study. The general paradigm of all scientific study is to isolate a variable, adding it to or removing it from a sample, to observe the effect. But reducing the scientific assessment of food to one variable leads to false assumptions. An example of the problem with this approach is the antioxidant revolution.
Scientists know there is a connection between eating fruits and vegetables and reduced cancer risk. To determine what this connection was, they isolated the various components and discovered the benefits of antioxidants. One benefit is that antioxidants seem to occupy cell receptors so free radicals—which destroy cells and cause cancer—can’t.
The nutritionism strategy is to isolate this nutrient in supplement form to recreate the same benefits. But antioxidant supplements have proven ineffective at providing those benefits. The following factors could explain why:
There are several antioxidants within each fruit and vegetable. For instance, in one leaf of thyme, there are dozens of antioxidants. There really is no way to determine which one, if any, helps the body and whether that help is an isolated or compound effect. The beauty is that you don’t need to know. The great thing about eating food instead of nutrients is that you don’t have to worry about what each nutrient is doing. You can just eat the food and get the benefit.
Nutrient-focused science is a zero-sum game: if you’re eating more of one thing, you’re eating less of something else. Increase trans fats to reduce saturated fats. The focus tends to be on eliminating one thing by adding another. But not eating enough of a good thing also has an impact. What you’re not eating is just as important as what you are. Another look at the lipid hypothesis helps illuminate this point.
Science says that saturated fat in animal proteins is the problem, but what if the whole animal protein is the culprit? Maybe the simple act of eating animal protein reduces the amount of fruits and vegetables consumed. Maybe a larger portion of fruits and vegetables in the diet helps manage the effect of saturated fat. If this were the case, lean beef technically would be no better for you than regular beef.
Nutritionism is so focused on removing the bad that it fails to acknowledge good foods that don’t require scrutiny.
One of the biggest problems with nutritionism is that the evidence supporting it is based on faulty tests. Several mechanisms are used to study both foods and eating habits, and all of them come with limitations.
It’s hard for nutritional scientists to study diet by isolating one factor. If they remove meat from a trial diet, they reduce the total calories of the diet. The question becomes which action created the results: the removal of meat or the fewer calories consumed? If they try to fix this by adding calories with healthy fats, carbs, or proteins, which variable is responsible for the outcome? When one food element is removed, the entire sample is altered.
Many scientific research studies use placebos to test variables among subjects. But adulterated foods don’t taste the same as foods in their natural state, making placebos an ineffective tool in nutrition research.
Also, food isn’t the only factor that affects health. Nutritional research is further confounded when lifestyle is considered. Two examples show the importance of lifestyle in health.
First, the Mediterranean diet, considered one of the healthiest in the world, consists of high amounts of olive oil and fish. The diet was observed among subjects in Crete, Greece, in the 1950s. This population undoubtedly lived different lifestyles than Americans.
The Mediterranean diet may, in fact, be sublime. But we don’t know if the benefits are from the types of food eaten, the Greek lifestyle, or a combination of these.
The second example is the role of supplements in healthy people’s daily routines. Supplement takers are likely to be healthier, with or without the supplements. They are typically more educated, wealthier, and more interested in their health in general. If the benefit of supplements is determined by the health of supplement takers, it’s a benefit based on faulty evidence.
There are three main study mechanisms used in nutrition research—case-control, long-term cohort, and large-scale intervention studies—and all of them contain flaws.
In case-control studies, researchers study the eating habits of people with specific illnesses to determine the cause. The problem is that people typically change their diets at the onset of illness. What they currently eat may not be what they were eating when the illness developed. Also, because of the ubiquitous nature of food marketing and media coverage of FDA guidelines, most people have a general knowledge of what foods are deemed good and bad. This information may cause people to falsely blame foods they know to be bad.
Long-term cohort studies track a group of subjects before illness occurs. The most famous is the Nurses’ Health Study, which tracks the effects of nutrition, environment, and lifestyle on thousands of nurses. The study was limited in scope from the beginning: the sample of nurses was not diverse enough. Many led similar lifestyles, and almost all ate a Western diet. Therefore, dietary differences among subjects were marginal, not pronounced enough to reveal meaningful contrasts.
The second issue with the study was the reliance on a food-frequency questionnaire to gauge eating habits. It should come as no surprise that people lie all the time about their food consumption. In fact, this behavior is so common, validation studies were created to account for it. Those studies showed that people tended to eat between one-fifth and one-third more food than reported.
This questionnaire was also used in the best-known large-scale intervention study, the Women’s Health Initiative. In intervention studies, one group serves as the control group, and another group serves as the experimental group receiving the intervention. In this case, approximately 49,000 women were studied, and 40% were asked to reduce their calories from fat to 20% of their total caloric intake. After eight years, no significant changes to their health were found.
One problem was the focus on dietary fat, not food. There was no distinction made among the different fats. Therefore, fats from olive oil or avocados were counted the same as fats from low-fat foods and animal protein. Also, the women in the control group were part of society, so they likely were aware of nutritional trends toward low-fat foods and ate accordingly. In the end, both groups could have been eating basically the same low-fat diet.
Beyond affecting our bodies, nutritionism has a negative effect on our mental health. Paul Rozin, a psychologist from the University of Pennsylvania, has conducted numerous studies on how American eaters think about food. His results showed that Americans worry more about what they eat, and enjoy eating less, than people in other food cultures.
Some scientists are promoting the addition of orthorexia nervosa to the Diagnostic and Statistical Manual of Mental Disorders as an official eating disorder. Therapists are seeing this unhealthy obsession with healthy eating come up more and more with their patients.
Thanks to nutritionism and its supporters, we are fatter, sicker, and more mentally unstable regarding food than ever before. It’s time for a change that actually leads to better physical and mental health for all.
To create eating habits that support better health, we need to fix the relationship between humans and food as part of the overall food chain. To do this, we need to leave the Western diet behind. But what about all those already experiencing poor health? Has the Western diet destroyed their chances for a healthy life?
The Western diet has spread throughout most of the civilized world and beyond. The decline in health of those who follow it is no secret. But until 1982, what was unclear was whether the damage could be reversed. A study of 10 Aborigines in Australia provided the first grain of hope that it could.
In summer 1982, a research scientist, Kerin O’Dea, asked 10 formerly bush-dwelling Aborigines to return to their previous lives as hunters and gatherers to see if their health improved. Since moving to civilization years before, all had developed Type 2 diabetes and were at high risk for heart disease. They’d also developed “metabolic syndrome,” a disorder in which the body is unable to metabolize carbohydrates and fats appropriately through insulin production.
When the Aborigines returned to the bush, they lived by the coast and consumed seafood, birds, grubs, and kangaroos. In search of more plants, they moved inland and ate freshwater fish, turtles, crocodiles, birds, kangaroos, yams, figs, and honey. This diet was vastly different from the one at the western settlements, which included mostly flour, sugar, carbonated drinks, alcohol, and cheap meat.
After 7 weeks, O’Dea tested their blood and found that every aspect of their health had improved.
The results revealed that the negative health outcomes experienced by western populations could be reduced by changes in diet. The beauty of O’Dea’s study was her lack of focus on nutrients. She didn’t break down the nutrients in the foods the Aborigines ate. She focused, rather, on their overall dietary patterns.
The focus on dietary patterns makes it difficult to pin down exactly what needs to change in the Western diet, but it raises questions about whole foods and health that are more important than ever.
But if this information was known in 1982, why are we still stuck in the Western diet culture?
Most nutritional studies focus on nutrients, not foods or dietary patterns. But these are two of the most important factors to consider if a shift in nutrition and health is to happen. Almost every nutritional study, from the Nurses’ Health Study to the Women’s Health Initiative, fails to address the main staples of the Western diet: processed foods, added fats, sugar, and scant amounts of fruits, vegetables, and whole grains. However, this was not always the case.
Today, cancer and heart disease are so common, there’s a feeling of inevitability surrounding them. People look to medicine to save them and can’t imagine a time when these illnesses weren’t prevalent. But before World War II, the rates of killer diseases were low. During the early 1900s, scientists started to notice a higher frequency of killer disease diagnoses. Reports from health professionals, who’d witnessed the entry of Western civilization into remote indigenous cultures and the domino effect of disease that followed, piqued their interest. Discussions about the relationship between industrialization and health began in the research and science communities.
One person heavily involved in this discussion was a Canadian dentist named Weston A. Price. Price moved to Ohio in the 1930s and started to see a rapid decline in formerly healthy teeth. He proffered that the new modern diet was responsible for the increased tooth decay.
In 1934, Price was one of many who debated whether hygiene or the modern diet was to blame for the increase in dental problems. Hygiene won the day, likely because it was easier to change and more profitable for dentists. Following this, Price closed his practice and traveled to isolated communities unaffected by the modern diet. He found that people eating traditional diets had dramatically healthier teeth, and better overall health, than those eating the modern diet.
Price saw the connections among soil, grass, animals, and humans. He reasoned that the industrialization of food broke the natural order of things in two ways: depleting soils of nutrients, and thereby the food grown in them, and destroying the surviving nutrients through processing. These actions interrupted the natural flow of nutrients through the food chain, and the physiological needs of the human body were no longer satisfied.
Although some scientists were interested in Price’s ideas before the war, everything changed with WWII. During the war, industry was helping save lives and protect the country. No one wanted to hear it criticized. With the war’s end and America’s victory, the industrial machine was firmly in place. Industrial agriculture saw a chance to take over and did. Everyone forgot about Price and his theories until the 1960s, when organic agriculture entered the scene.
One of the main downfalls of nutritionism is the way it shifts our focus from food as part of a relationship to food as merely an item. The relationship between humans and food is part of the food chain, and such relationships occur in every aspect of nature. Organisms along this chain tend to adapt to or with one another to survive.
For example, cow’s milk used to make people physically sick. Then, five thousand years ago, milk herders and the surrounding communities started to evolve: a mutation of the gene that produces the enzyme needed to digest milk spread through the population. The milk gave the herders the ability to have and raise more children, which increased the spread of the gene. Likewise, the larger population and improvements to health allowed for better care of the cows, which improved their lives and allowed more reproduction.
When one aspect of the food chain is disrupted, it disrupts the rest of the chain. If the soil is unhealthy, the grass growing from it will be unhealthy, and the cows that eat the grass will be unhealthy, and so on. Therefore, the health of humans is greatly dependent on the health of the food chain.
In the past, the familiar relationship between humans and their food guided decisions on what to eat. For example, you could tell by looking at a fruit whether it was ripe or spoiled. What the industrialization of food did was create food that looks like food but isn’t. The fake food doesn’t spoil. This makes it impossible to visually judge whether the food is good or bad.
When we apply this concept to nutritionism, the problem becomes clearer. You can’t see chemicals or nutrients. And because our relationship with processed foods is shallow, instinct can’t play a part. Further, the way the whole food is processed creates confusion in the body.
Corn, for example, breaks down in the body as sugar, but the fiber in it mediates the pace. The body is prepared to accept this process and absorb the sugars because of its long relationship with corn. High-fructose corn syrup, however, is a different story. The body has no idea what to do with it because it’s an artificial compound. In time, our bodies will learn how to adapt, but for the moment, that is not the case.
This example shows just how far removed we’ve become from our natural relationships with food in the Western diet. How did this relationship become so rocky? Five shifts in nutrition are responsible for the movement away from nature and toward the modern diet.
The advent of refined carbohydrates is one of the most fundamental changes in the history of food and one of the most damaging to the human body. Refined grains, rice, and sugar have many benefits, but none of them involve health. In fact, refined foods are almost empty of nutritional value without fortification and lead to massive influxes of glucose and fructose in the body. So why do we eat so much of them? Part of the answer is history.
Before the Industrial Revolution, grains were stone ground in small batches by mills run on water power. The flour was yellowish and pungent. The grinders could remove the bran from wheat but not the germ. When the germ was crushed, it released an oil, which created the color and smell. The oil also oxidized quickly, making the shelf life of the flour short.
When steel, iron, and porcelain rollers were invented, they were able to remove the germ and grind the rest into a fine white powder. This flour was more appealing and had a long shelf life without the oil. Because the rollers could be operated by steam engines, manufacturers could now grind flour year round and ship it distances without it spoiling.
The problem is that the germ in wheat holds most of the nutritional value, including B vitamins, protein, carotenes, and omega-3 fatty acids. Without the germ, refined flour is basically nutrient-free. The flour breaks down into glucose, which overwhelms the body’s insulin processes without the mitigating nutrients. During this time, corn and rice were undergoing their own processing to strip their nutrients and make them more shelf-stable.
What followed were deficiencies of certain nutrients that led to diseases, such as beriberi (vitamin B deficiency). In the 1930s, scientists figured out the connection between refined foods and disease. Mills started adding B vitamins back into the refined grain. In 1996, scientists discovered that Americans were low on folic acid, so that was also added back into refined grains. These steps helped with deficiency diseases, but they did nothing to stop the killer diseases spreading through the population.
Scientists noticed that people who ate whole grains were healthier, and the mission to identify which nutrients were responsible began. But even when these nutrients were supplemented from other sources, people still did not see the same results as those who ate the actual whole grain. Scientists reasoned that some internal synergistic relationship was causing the benefit.
This discovery should have been the end of the discussion. But processed foods are more marketable than whole foods. So the food industry surged on in its assertion that fortifying foods with nutrients was just as good as eating the whole food.
Sugar can be considered the pinnacle of refined carbohydrates. Refined sugar was introduced around the same time refined flours were taking off. By the end of the 19th century, one-sixth of all calories came from sugar. Fructose is the sweet element of sugar and is typically only seen in nature in ripe fruits. These fruits also include fiber that, like with corn, slows the release and intake of the fructose and includes other essential nutrients.
Refined sugar, like high-fructose corn syrup, is basically a mainline of fructose to the system. With sugar being added to the diet, people now had an influx of fructose to go along with the influx of glucose from refined flour. Today, 20% of total calories come from sugar and 40% from mostly refined grains, which means the American diet is approximately 50% sugar in one form or another. Another way to think about it is that 50% of the diet is pure energy and nothing else.
Metabolic syndrome and Type-2 diabetes are the results of this influx of sugar, as the body struggles to produce enough insulin to respond. Another health issue related to high sugar intake is obesity. Fiber not only mitigates the quick release of sugar, it also helps you feel satiated. When fiber is removed, you may still feel hungry even after eating a large meal of refined carbohydrates. Also, the quick release of sugar causes insulin production to spike then crash, which makes the body believe it needs to eat again. Both of these situations can lead to overeating.
The industrialization of food is founded on the concept of simplification. When we alter both soil and food to simplify natural processes, we create distance between humans and the natural food chain. We also create many disadvantages for the human body.
Remember Liebig’s discovery of the three basic elements—nitrogen, phosphorus, and potassium—needed in soil to grow food? That discovery, coupled with the ability to transform fossil fuels into nitrogen fertilizer after WWII, reshaped how food was grown throughout America.
It’s true that plants can thrive with only these chemicals, but it’s also true that the ability to thrive has little to do with nutritional value. Chemical fertilizers make plants grow faster. But when plants grow at a fast rate, there is less gestation time, meaning less time to absorb nutrients from the soil. Also, because the big three elements are readily available to the plants, their roots don’t dig as deep into the soil to find nutrients. Shallow roots mean a lack of access to deep soil minerals.
The result is plants with little nutrient value, which makes them more susceptible to pests and disease. Since chemical fertilizers were introduced in 1950, the nutritional value of fresh fruits and vegetables, as tracked by the USDA, has deteriorated.
Simplifying food means everything from making foods less perishable to removing less-desirable species from the market. Despite the seemingly endless variety of food available today, the base ingredients come from only a few food species, chosen for their high yields and for being easy to harvest and manipulate.
Where an Iowa farm once raised several types of livestock and a variety of fruits and vegetables, now almost all such farms exclusively raise corn and soybeans. This move is encouraged by government subsidies for corn and soybean farmers.
What’s so special about corn and soybeans? Both are proficient at transforming sunlight and chemical fertilizers into macronutrients (carbohydrates for corn; fat and protein for soybeans). They are cheap to grow and easy to manipulate into other processed foods. For instance, many people believe they don’t eat a lot of corn or soy, but 75% of all vegetable oils are soy-based, and 50% of all sweeteners are corn-based. In fact, 800 of the average American’s daily calories come from a combination of corn (550) and soy (250). When you add 750 calories from wheat and 90 from rice, that is two-thirds of the total daily caloric intake from just four species.
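As a quick sanity check, the calorie figures above can be tallied in a few lines of Python. This is an illustrative back-of-envelope sketch: the per-crop calories come from the text, while the roughly 2,500-calorie daily total is an assumed round number implied by the “two-thirds” claim, not a figure from the book.

```python
# Back-of-envelope check of the calorie figures cited above.
# Per-crop calories come from the text; the 2,500-calorie daily
# total is an assumed round number, not a figure from the book.
calories_from = {"corn": 550, "soy": 250, "wheat": 750, "rice": 90}

four_species_total = sum(calories_from.values())  # 1,640 calories
assumed_daily_total = 2500

share = four_species_total / assumed_daily_total
print(f"{four_species_total} of ~{assumed_daily_total} daily calories "
      f"({share:.0%}) come from just four plant species")
```

At 1,640 out of roughly 2,500 calories, the four species account for about 66% of intake, consistent with the “two-thirds” figure.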
The problem is that humans are omnivores. As such, we require approximately 50 to 100 different nutrient properties to function properly. When food species varied, we got different nutrients from hundreds to thousands of sources, which created a complex nutrient landscape. With only four species making up most of the food, and many of those stripped of their nutrient value, how likely is it that you’re getting everything you need?
The food industry’s focus on quantity over quality has stripped most of the nutritional value from whole foods on the market today. The result of efforts to simplify how food is grown, make food more durable, and reduce the number of food species is food inflation in the American diet. Food inflation means that more food must be eaten to gain the same level of nutrients that less food provided in the past.
According to USDA studies, since the mid-20th century, vitamin C has decreased by 20%, iron by 15%, riboflavin by 38%, and calcium by 16% in produce. This means you would need to eat three apples today to get the same amount of iron as one apple provided in 1940.
One way industry promotes quantity over quality is through genetics. Many producers breed plants and livestock for high yields. Wheat is bred to increase its yield, which has nearly tripled over the last century. Holstein cows have been bred since 1950 to produce more milk. But when you breed for a certain outcome, other traits go by the wayside. In the case of food, it’s nutrients. Today’s wheat contains 28% less iron than before, and milk from Holstein cows has less butterfat and fewer nutrients than milk from other breeds.
Another way quantity has pushed quality out of the way is the industry focus on foods high in calories and low in price. After a surge in food prices in the mid-70s caused housewives to revolt, the U.S. government created new agriculture policies to ensure the availability of cheap food. The subsidies for corn and soybean farmers are an example of these policies. Farmers were encouraged to grow large amounts of food in small amounts of time.
The policies worked. Food production increased and prices decreased. But this increase caused another increase: 300 added calories per day to the American diet since 1985. A fourth of those calories are from added sugars, another fourth from added fats, and 46% from mostly refined grains. That leaves only 8% of the daily caloric intake from fruits and vegetables. In fact, adults only eat 32% and children 20% of the recommended daily servings of fruits and vegetables.
This dietary focus on quantity over value has created a strange phenomenon in society—overfed but undernourished citizens. Old deficiency diseases are cropping back up, but now they’re being seen in obese people, especially children. Nutrient deficiency damages DNA, which can lead to cancer.
Our reliance on a handful of species means a reliance on durable seeds with long shelf lives: corn, soybeans, wheat, and rice are all seed crops. This shift away from leafy vegetables brought a shift away from micronutrients toward macronutrients. The amount of macronutrients found in most processed foods leads to an overabundance in our systems, which leads to obesity and other killer diseases. However, what may be more damaging is the lack of certain vital micronutrients, namely omega-3 fatty acids. Omega-3 and omega-6 fatty acids both occur naturally in different food sources, but they differ greatly in structure and in how they work in the body.
Omega-3 fatty acids are found in plant leaves and may be the most essential nutrient of all. They are important for neurological development and processing, vision, cell membrane permeability, glucose metabolism, and reductions in the inflammatory response. Omega-3s are swift and malleable, which also makes them more perishable and unsuitable for long-term storage.
The food industry does what it can to remove omega-3s from products to increase their shelf life. Breeders look for crops low in omega-3s, and nutritionism has helped eliminate omega-3s from the diet through its push of seed oils and hydrogenated fats.
Omega-6 fatty acids, on the other hand, are found in durable seeds. These fatty acids are dense and slow, which makes them more suitable for storage. They promote the storage of fat in the body, cell membrane density, clotting, and inflammation.
Both types of fatty acids take up residence in the cell membranes and require enzymes to help them perform their functions. Because of this, omega-3s and omega-6s compete for space at the cellular level and for enzymes. Therefore, the ratio of the two within the body may be more significant than the quantity of either alone.
In the Western diet, you get more omega-6s, both from grains and from animals fed grain instead of plants. The move from saturated fats to seed oils and margarine, prompted by the lipid hypothesis, was a wholesale shift toward omega-6s. These days, the typical ratio of omega-6s to omega-3s in the body is 10:1.
Omega-3s are negatively correlated with heart disease: as omega-3 levels increase, the risk of heart disease decreases. Studies show that adequate levels of omega-3s in the body can reduce the risk of heart disease by a third.
A few theories exist about why omega-3s are so beneficial.
The two omega fatty acids exist in a zero-sum relationship, so simply fortifying a diet with more omega-3s may be futile if omega-6 intake remains high. The current ratio favoring inflammation-inducing omega-6s helps explain the billions of dollars Americans spend each year on non-steroidal anti-inflammatory drugs. Omega-6s also stiffen cell membranes, making absorption more difficult and thereby slowing metabolism. The only way to combat these negative effects is to eat a diet higher in green plants and lower in seed-based foods.
With all the money and effort the food industry has poured into nutritionism, we should be in a better place than we are. If we were, this book wouldn't be needed. The Western diet is taking over the world and pushing traditional food cultures aside. In America, a $32-billion-a-year marketing machine glamorizes the myriad food products in the grocery stores. With such a significant presence in the culture, consumers are inundated with encouragement to buy processed foods over whole foods. This is how science came to overrun tradition, and how industry exploits that science for its own benefit.
You might think it’s time we simply accept that fast food is the new food culture and allow the body to adapt over time. But accepting this means sacrificing all those who are sick and dying from the current diet. Plus, evolutionary adaptation is a long process and could take generations before our genetic predispositions change.
The argument that medicine will eventually catch up to the detrimental effects of the Western diet is also popular, and to some extent it has. Doctors are getting better at mitigating the effects of killer diseases and obesity through surgical procedures and medications. But financially, we will not be able to keep up with the necessary care as our health worsens. Americans spend a total of $250 billion a year on diet-related medical care. Diabetes shortens life by 12 years and costs $13,000 a year to treat, compared with an average of $2,500 in annual medical expenses for someone without it.
Eighty percent of Type 2 diabetes cases could be reversed with a better diet and proper fitness. But the diabetes industry distracts from that strategy by continuing to adapt to the growing epidemic. New technologies, drugs, and lifestyle magazines are all geared toward helping someone live with diabetes rather than rid their life of it. Type 2 diabetes is becoming so common that it may soon be considered a mere lifestyle rather than a treatable condition.
What a boon it will be for the food and science industry when that happens. They will have another market to cater to. And because these industries are viewed as the authority on food and health, they will be successful. If we stand a chance of surviving killer diseases, we must find a way to shift away from the Western diet for good.
To change your habits away from the Western diet, you must differentiate the theories from the problem and work to address the latter. The different theories of nutritionism presented in this summary may create more confusion about what to eat and what not to eat. Many of them contradict or overlap with each other. For instance, is it a lack of omega-3s that leads to disease, or an influx of refined carbohydrates? If carbohydrates are the problem, what does that mean for the lipid hypothesis?
It’s natural for scientists and the general public to want a one-stop solution that is easily proven and disseminated. Yet, allowing the one-nutrient mentality to serve as the endpoint of the issue is narrow, short-sighted, and ineffective in helping guide you to good choices. Think of who is actually benefiting from these theories. Is it the consumer, who is yo-yoed between this nutrient and that nutrient? Not really. The main beneficiaries of nutritionism, again, are the food and science industries.
The good news is that theories are just that—ideas that attempt to explain a problem or phenomenon. Regardless of the theory, the problem remains the same—eating a Western diet leads to Western diseases. The “how” and “why” are less important than the “what,” meaning you have to treat the whole problem, not just the symptoms.
It seems like the obvious answer is “don’t eat the Western diet,” but that advice is not as simple as it sounds. Our culture is deeply embedded in the Western diet and nutritionism ideology, and many of us don’t live in environments where returning to ancestral traditions is truly possible. Whole foods are not what they used to be. The beef you eat is not the beef your ancestors ate.
Although still a whole food, today's beef has been adulterated by industry and has less nutritional value than before. Does it still count as a whole food? Perhaps not. This raises one of the main issues in the advice that follows on shifting your eating away from the Western diet: how a food is produced is just as important as what the food is and how you eat it.
The guidance that follows attempts to provide a new way of thinking about food that focuses on the food chain and the relationship between the health of the environment and human health. You have to expand your mind beyond nutrient-specific advantages and disadvantages when thinking about healthy eating. The suggestions in the next sections do not focus on specific foods, nutrients, or calories. They focus on ways to think about food and shopping that lead to a variety of healthy meals.
It takes work to change your eating habits and eat better. Cheap, convenient industrial food has spoiled our expectations for eating: we want food to be fast, easy, and inexpensive. Leaving the Western diet behind takes more effort, time, and resources, but that is the trade-off for better health.
Just because something is edible does not mean it is real food. Much of what grocery stores carry is food-like substances. In fact, some 17,000 new food-like substances are created and marketed every year. Real food, in contrast, is ordinary food—food-as-food—and ordinary food is all you should be eating. How can you tell the difference? Three rules can help you identify what you should and shouldn’t be eating.
If your great-grandma wouldn’t call it real food, it’s not real food. Using your great-grandmother gives you a good chance of reaching back to a time before industrialized food took over. If you're very young and even your great-grandmother lived in the industrial era, imagine an earlier ancestor, back to the hunter-gatherer days.
When you’re in the grocery store, imagine your great-grandma or ancestor standing next to you. When you pick up an item, imagine her picking it up and inspecting it. The example of Go-Gurt shows how this works.
Your great-grandma looks at the Go-Gurt and asks what it is. You tell her it’s yogurt in tube form. She looks at the list of ingredients and gets confused, and rightfully so. In her day, yogurt was cultured milk. What she sees are one or two ingredients reminiscent of yogurt and a bunch of others she has never heard of: high-fructose corn syrup, gelatin, corn starch, vitamins, artificial flavors, and many more. It turns out Go-Gurt is a yogurt-like substance, not real yogurt.
Many other items in the grocery store would confuse great-grandma just as much.
These foods are filled with additives created in labs and corn and soy byproducts. These foods confuse your eating brain, which is exactly what they are designed to do. When you eat, your body uses its senses to prepare itself for digestion. When foods contain fake flavors, textures, and sweeteners, they appear and taste like real food, but the body doesn’t know how to digest the foreign ingredients.
Furthermore, these foods are not merely adulterated to improve shelf life. They are geared to tempt the brain and body with salt, fat, and sugar, which the body needs certain amounts of to survive. In the natural environment, the avenues for these substances are narrow, and they are accompanied by water, fiber, and complex nutrient structures that help break them down. But industry adds salt, fat, and sugar to get you to eat more, hence buy more, without the added buffers. So you’re eating more and gaining nothing in return.
Some processed foods are so cleverly designed that even great-grandma might be fooled. Foods like bread look like bread despite being heavily processed. But because of the reversal of the imitation law, manufacturers are not required to label them as such, which leads us to the next rule.
The second rule, in essence: avoid foods with unfamiliar or unpronounceable ingredients, more than five ingredients, or high-fructose corn syrup on the label. When any of these is present, it’s a good bet the food is not real food.
When you make bread at home, you use flour, water, yeast, and a pinch of salt. Mix them up, knead the dough, let it rise, and bake. What you get is a nice loaf that looks much like the loaves in the store—even the whole-grain ones—until you check the ingredient list. The list for Sara Lee’s Whole-Grain White Bread exemplifies the difference. If you notice the contradiction of “whole-grain” and “white” describing the same bread, your work is done.
Sara Lee’s list includes eight main ingredients—among them high-fructose corn syrup and cellulose alongside the usual suspects of flour, yeast, and so on—plus sixteen “2% or less” ingredients, some with four or more sub-ingredients, including soy, vegetable oil, butter, corn starch, dough conditioner, and vinegar.
Many products are altered to compensate for adding a “good” ingredient or removing a “bad” one. Often, the compensations dilute the value of the good ingredients or are worse than what was removed.
An example is low-fat or non-fat dairy. To maintain the creamy texture of whole dairy after the fat is removed, manufacturers add various products. For instance, to make skim milk palatable, they add powdered milk, which contains oxidized cholesterol. Oxidized cholesterol may be worse than real cholesterol. Also, many of the vitamins in milk are fat-soluble, meaning they need fat to be absorbed in the body. If you remove the fat from milk, what’s it doing for you?
Food manufacturers make these compromises so they can slap a non-fat or whole-grain label on with an accompanying health claim, which brings us to Rule #3.
The package alone is a big clue as to whether it’s real food or processed food, but more so, real food doesn’t have to tout claims about what nutrients it contains or what it does for the body. Vegetables and fruits are full of antioxidants and vitamins, and meat and dairy have protein and other important nutrients. These things are commonly known or common sense. So if a food has to tell you its benefits, chances are it's not real food.
Usually, only major food manufacturers have the resources and desire to get FDA approval for a health claim. Big food conglomerates have a lot of resources and political power. They are able to fund research studies to support health claims and pressure the FDA to approve them. But the science behind these claims, as with most studies grounded in nutritionism, is often narrowly focused or poorly tested. Remember the health claims for margarine?
When a company gets permission to place a health claim on the box, it is also given license to design the claim as it wishes. It can display “Helps Reduce Saturated Fat” as prominently as it wants while relegating disclaimers like “The FDA concludes that there is little scientific evidence supporting this claim” to the fine print. The FDA allows this strategy to continue despite its own research suggesting that consumers don’t grasp the full meaning of health labels.
The American Heart Association also grants manufacturers the right to use its “Heart Healthy” stamp of approval, for a fee, based on nutrient-focused science. This is how sugary cereals like Cocoa Puffs can claim to be good for your heart despite recent studies showing that sugar is related to heart disease. Meanwhile, actual heart-healthy foods sit package-free and label-free over in the produce section. They receive little fanfare because they lack political clout and science claims to prove their worth. This fact provides two sub-rules for Rule #3.
Most grocery stores are organized similarly, with produce, dairy, the bakery, meat, and fish around the edges and aisles of processed food in the middle. To keep your cart weighted toward real food, stay out of the middle aisles.
Of course, issues with how produce and livestock are raised bring up the previous question of what constitutes real food and adulterated food. Also, additives and high-fructose corn syrup have wiggled their way into many dairy and bread products. It’s not an exact science to shop along the periphery, which is why the only true way to ensure you are getting real food is by following Rule #3b.
The best way to avoid the Western diet is to stay out of its home environments: grocery stores, quickie marts, and fast food chains. Independent growers and food manufacturers don’t have the resources or ability to process foods to the degree of big industry. When you shop at farmer’s markets, join community-sponsored agriculture (CSA) groups, or grow your own garden, you will always get real food.
Farmer’s markets and CSAs are more popular than ever and are cropping up all over the country. True, you’ll have to go without some foods you’re used to if they’re not in season. But getting a majority of your food through these avenues goes a long way to supporting your health and the health of the food chain. And eating seasonal foods increases food diversity.
Furthermore, independent farmers tend to grow a variety of foods and raise a number of animals, so the need for pesticides is reduced when the ecology is working together. This makes independent farms organic in nature. Even though organic foods in the grocery stores are not raised with chemical fertilizers or pesticides, they are likely being shipped from far-off locales. Shipping food from places like California or China increases the environmental impact.
Shopping at a farmer’s market or CSA means you know where your food comes from. You are actively participating in a short food chain. You can interact with the grower and ask questions about how the food is grown, which breaks down the wall of ignorance separating consumers from big manufacturers. When the grower and consumer are face to face, food safety and accountability become less of a concern.
You may scoff at the extra cost of shopping this way. But the money you spend on food is like casting a vote for what is important for your health and that of the food chain. You can cast a vote for manufacturers who cherish quantity over quality and health. Or you can support those who cherish the quality and health of you and the environment. If health and environment are important, this perspective may help mitigate the extra costs.
It can be hard to know what you’re eating simply by looking at the food. Distinguishing between real food and food-like substances requires knowledge and intention.
In your refrigerator, do you have at least two real food items? How do you know they’re real? What are they made of?
Name two food-like substances you currently own. What ingredients are you surprised to find listed?
What would your great-grandma say about the majority of the food you own?
What sections of the grocery store do you most gravitate toward? Why?
The evolutionary relationship between humans and plants goes a long way toward explaining why plants are so good for us and why a diet centered on plant-based foods is the healthier option. Even Thomas Jefferson said that meat should be eaten more as a flavoring for vegetables than as the main dish.
At the end of the day, a switch to real food of any type is going to be beneficial. In fact, other cultures who eat traditional diets still experience greater health than Western eaters simply because of the priority on whole foods. Still, not all whole foods are created equal, and a few rules can help you identify which ones provide the most benefit.
Much in the same way scientists can’t agree on what aspect of nutritionism is most helpful, they also can’t agree on what it is about plants (not seeds) that provides the benefit. What they do agree on, and maybe the only thing, is that plant-based foods are good for you.
One benefit of eating plants is the amount of antioxidants they provide the body, and vitamin C is one of the most important. Your ancestors were able to produce vitamin C internally. This helped neutralize free radicals—oxygen atoms with an unpaired electron—created whenever cells metabolize energy or the body fights inflammation. Free radicals are believed to be associated with cancer and other ailments of age. Vitamin C and other antioxidants stabilize free radicals before they can cause harm.
Vitamin C and other antioxidants also prompt the liver to produce enzymes that break down not only the antioxidants themselves but other compounds in the body, such as toxins. In this way, antioxidants act as detoxifiers, helping rid the body of carcinogens and other dangerous compounds. The greater the variety of antioxidants, the greater the variety of toxins targeted.
However, the heavily plant-based diet of your ancestors contained so much vitamin C that the body no longer needed to produce its own. Evolution weeded out that ability, leaving humans dependent on plants for these vital nutrients—one of the main reasons plants are so good for you.
There are various other benefits humans get from plants. Vegetarians tend to be healthier and live longer. Plants are also less energy-dense, so they carry fewer calories. And unlike meat, which you don’t need to survive, plants are a necessity. Still, meat does provide some essential amino acids, so a diet including small portions of meat is not necessarily a bad thing.
The problem with meat lies in how we consume it and how it is produced. Large portions of meat as main courses are extreme, and industrialized meat is not good for you. Meat sits at the top of the food chain, so it soaks up all the nutrients—and toxins—running up that chain. Meat is thus prime evidence for why the health of the food chain matters so much, which relates to the second rule.
As discussed, the importance of healthy soils for produce and plant-based diets for animals cannot be overemphasized. Eggs, meat, and dairy are only as healthy as the animals they come from. When animals are fed only grain, they get sick—especially cows and sheep—and require antibiotics, and those medications stay in the food. The same goes for chemical fertilizers and produce.
It’s worth the extra money to buy pastured animal products and fresh produce grown locally to ensure the quality of the food you will eat. An egg from a free-range chicken is not the same egg as one from an industrialized chicken. Humans are animals; if we live healthier lives on a diet of mostly plants, it stands to reason that the animals we eat do as well.
A separate freezer can be a great investment to help you eat the right kinds of food without spending a lot of money. The small investment can save money down the road if you find a good source of pastured meat and buy it in bulk. Produce can also be purchased in bulk at farmer’s markets at the height of the season and frozen. The benefit of a separate freezer is that you open it less often than your everyday one, and freezing vegetables preserves more nutrients than canning.
Whether you eat plants or plants and animals, diversifying the species of each helps to cast your nutritional net wide for greater benefits. Eating a wider variety of foods also helps support farm biodiversity, which reduces the need for pesticides and soil enhancements that trickle up the food chain. Both of these results support your health. A couple of tips can help you live a better omnivore life.
You’ve heard a lot about the health of soil and the health of the food chain already, but eating foods grown in good soil is not as simple as eating organic foods. There is a tendency to equate the word “organic” with health, but this is not always the case.
Processed organic food is not much different from processed conventionally grown food. Organic cookies and organic soda are still cookies and soda; your body must deal with the imitation ingredients the same way regardless of the soil they came from.
Real food is a different story. Organic real food shows higher levels of antioxidants, vitamins, and other nutrients than its conventional counterparts. However, organic food loses some of its nutritional quality when it travels great distances to reach your store. Many local farms and ranches don’t fit the government’s standards for certified organic yet still grow and raise quality food. And they likely offer different varieties than those found on commercial farms.
Foods existing in the wild are responsible for their own preservation, without human interference. Because of this, wild greens are some of the most nutritious foods there are. They contain more phytochemicals—the compounds that help plants ward off pests and diseases—many of which have antioxidant and anti-inflammatory properties.
Wild game is also more nutritious because it has more omega-3s and less saturated fat from a lifestyle of eating plants and roaming. Grass-fed beef comes close to resembling the nutritional value of wild game.
Wild-caught fish are also better than farm-raised fish because of their diet of algae and other fish that eat algae; farm fish tend to be fed more grain. However, many fish and animal species are becoming endangered in the wild, so you need to understand which wild foods are good for both you and the environment. Fish like salmon, sardines, and mackerel are good options and contain many essential nutrients.
Being a better omnivore means being more educated about your food and more health conscious in general. Although studies suggest that the effects of taking supplements are small to none, the lifestyle of supplement takers is still a good one to mimic. This population tends to be healthier, more educated, and health conscious than the average population.
One positive habit of supplement takers is taking a daily multivitamin, which becomes helpful as you age. A diet of mostly plants is sufficient to provide the micronutrients you need, but after the child-bearing years, the body loses some of its ability to absorb antioxidants and other nutrients efficiently. A multivitamin and fish oil can help make up for that loss of naturally absorbed nutrients.
Other cultures have traditional diets far more deeply ingrained than the Western diet and eat fewer processed foods. The advantage of a traditional diet is that its foods have been vetted through years of trial and error for how well they fulfill the body’s dietary needs. A long relationship between the people and the food exists, and the food chain tends to be more harmonious.
For instance, areas where the climate is hot tend to eat spicy foods. Spices help decrease body temperature. They also have antimicrobial properties that fight issues caused by rapidly decaying foods in hot climates. The hotter the climate, the spicier the food tends to be.
Other traditional foods have purely cultural origins. Foods are often tied to identity and cultural uniqueness, setting one culture apart from others. Kosher and halal foods are good examples of this type of cultural identity. These cuisines tend to eschew change and persist across generations.
Still, other forms of traditional diets are entwined with nature, both biologically and ecologically. In some cultures, people use their experience-based knowledge of food to decide what to eat and how to prepare it.
As opposed to industrialized processing, these traditional processing techniques have proven over time to keep people alive, nourished, and healthy. The long relationship with the whole foods led to an understanding of their properties and benefits. This knowledge created standardized dietary habits that have persisted. This idea points to an important sub-rule: stick with tradition.
Ingenuity is great in many industries, but food is not one of them. If healthy diets are the result of generational and evolutionary experience and growth, new and “improved” foods are outliers of tradition, likely for a reason.
The example of soy illustrates this. Americans eat a lot of soy, but in forms that differ greatly from how Asian cultures, with their long history with the crop, consume it. Here, soy shows up as oil, textured vegetable protein, soy protein isolate, and isoflavones—forms all implicated in various health concerns. For instance, isoflavones resemble estrogen and latch onto estrogen receptors, an action believed to affect cancer, menopause, and other bodily processes. However, research has not yet determined whether the effect is good or bad. If it’s in question, why gamble?
The Asians may or may not have the answer to this scientific question. But their use of soy over generations with positive health results seems more legitimate as evidence than a lack of evidence. There may also be other aspects of the Asian diet that support soy differently than in the Western diet. This idea points to another sub-rule: eat a traditional diet completely for its benefits.
In the same way that the benefit of whole foods has more to do with internal synergy than any specific nutrient, the benefit of dietary patterns is related to internal synergistic aspects, not a particular ingredient.
Countless studies have tried to identify the magical ingredient in the Mediterranean diet that makes it so healthy. Is it lemons, olive oil, fish, or nuts? Similar studies have attempted to understand the French paradox. Is it the red wine, olive oil, foie gras, cheese? But none of these studies ever finds one food that explains the health outcomes of the eaters, namely a lack of Western diseases and improved health overall.
Compounding the issue, many traditional diets include seemingly unhealthy foods. The French diet involves lots of saturated fat and wine. The Mediterranean diet involves more fat, like olive oil, than recommended (40% of calories rather than 30%).
Nutritionism wants to get down to the nitty-gritty and figure out which nutrient is responsible for good health. But this behavior is short-sighted and muddies the point of eating a traditional diet. Trying to isolate the magic nutrient is another reductionist attempt to find something to replicate through processing. But traditional diets don’t work because of one thing only; the pattern of eating is what creates the benefit.
Recent studies on dietary patterns suggest that certain complete patterns of eating do provide health benefits and can be transferred to different populations. To truly benefit from traditional diets, you can’t be hung up on nutritionism. It seems impossible to track down the magic ingredients that support these diets, and there likely isn’t one. When you approach a traditional diet, you must be willing to accept that synergy among the foods may be the most important factor.
One thing many traditional diets have in common, especially European and Mediterranean diets, is the consumption of wine with meals. American health officials are reticent to recommend alcohol in the diet due to high rates of alcoholism, but studies have shown benefits.
People who drink moderately and consistently seem to live longer and have fewer heart-related issues. Any type of alcohol seems to confer the benefit, though the polyphenols in red wine seem particularly positive. The benefit grows with moderate amounts, but excessive drinking brings other health problems, so one or two glasses a day is the recommendation.
It’s unknown if the benefits of alcohol are isolated or part of a dietary pattern. Drinking daily is better than big benders, and drinking with food helps to slow the absorption of the alcohol. Drinking in conjunction with a plant-based diet also seems helpful in that plants provide B vitamins, which alcohol diminishes. Again, the specific modality of advantage is unclear, but as part of a traditional diet, alcohol may support cardiovascular health and perhaps other issues yet unknown.
It’s evident from this chapter that foods encompass both advantages and disadvantages. These factors may greatly influence your life and health.
How often do you eat fruits and vegetables, and which do you mostly eat?
What information or reasoning guides those choices?
What is one aspect of the traditional diets you read about that intrigues you the most? Why?
What is one adjustment to your diet you could make based on this chapter?
If food is more than nutrients and diet is more than certain foods, it stands to reason that food culture is more than just diet. The sociology of food is a major component of eating and health. It involves habits, customs, manners, and the way we eat. Following traditional dietary patterns is just one step toward health. Following the customs of a culture is the other step.
When scientists ponder the French paradox, they notice only a diet of rich foods and wine and thinner, healthier eaters. What they don’t consider is the relationship those eaters have with their food and the culture surrounding how they eat. If they did, they’d see that the French don’t snack and eat most meals in social settings. Their portions are small, and they eat at a slower pace. They eat fewer calories and enjoy the experience of eating.
Psychologists refer to something called “unit bias,” meaning people tend to believe whatever portion they’re served is the right amount to eat. This is why portion size matters. Plus, eating smaller portions more slowly lets you savor the food rather than devour it and immediately want more. Americans could learn a lot from the habits of the French and other cultures regarding how to eat, and the following rules can serve as a guide.
To benefit your health and the health of the food chain, spend more money on food so you are buying better food both in nutritional value and quality. The French seem to accept the trade-off between quality of food/eating experience and the cost of eating. But we are geared toward quantity and cheap, accessible food.
For decades, the American mentality has been to deliver large amounts of food packaged inexpensively. This mindset serves many people well, as many can’t afford to spend a lot of money on food. But many of us can and choose not to. In fact, Americans spend less than 10% of their income on food, the lowest percentage of any industrialized nation.
Food often costs more because it required more care and energy to produce, rather than being mass-produced in factories, stockyards, or single-crop farms. These foods are better for you because they contain fewer toxins or offer improved nutritional quality. Also, you're more likely to eat less when food costs more.
The lure of high-calorie products is their low price and convenience. This food is easy to eat and, thanks to the invention of the microwave, requires little effort to prepare or clean up. These factors cause people to eat more. If we had to make our own junk food, such as baking our own Hostess cakes or frying our own potato chips, we would eat less. You'll eat less food if it is more expensive and requires more effort to prepare.
The advice to "eat less" has not been part of the American vocabulary since McGovern's revisions of the dietary guidelines in the 1970s. The culture is inundated with cheap calories, and no aspect of our eating customs points away from indulging, as in other cultures. For instance, second portions are taboo in France. In Okinawa, Japan, a region home to some of the oldest and healthiest people in the world, people eat by the principle of "hara hachi bu": eat until you are 80% full.
We, on the other hand, eat until an external cue tells us to stop, even when full: an empty chip bag, an empty plate, or the end of a movie or TV program. This behavior has led to a society with expanding waistlines and declining health, and food prices have tracked these trends. Since the changes in agricultural policy in 1980, obesity rates have grown steadily alongside an additional 300 calories per person per day. Most of these calories come from sweeteners and fats derived from corn and soy, which have decreased in price by 20% since that time. Conversely, the price of fresh produce has increased by 40%.
Ironically, as the money spent on food has decreased, the amount spent on health care has increased. In 1960, Americans spent 17.5% of their income on food and 5.2% on health care. Today, the 10% spent on food is paired with 16% spent on health care. Many believe the increase in calories is the main culprit and that reducing calories is the key to disease prevention.
When you spend more on food and eat less, you are more likely to savor it. Eating healthier foods is a better experience, and you'll be more satisfied with a smaller portion than if you eat until you're stuffed.
If you eat only three meals a day (breakfast, lunch, and dinner) and eat them with other people, you can reduce your caloric intake and promote a positive food culture. Americans now tend to eat more snacks than meals, and few of those meals are shared with others. In fact, researchers no longer limit their studies to the big three meals; they now include a fourth "meal occasion": the all-day continual snacking and drinking of processed beverages.
Why is this a big deal? Those of a certain age will remember the culture of family meals and the benefits gained.
At one point, American culture was defined by group and family meals. Not anymore. One study found that one-fifth of all eating by people aged 18 to 50 happens in the car. And although many families say they eat three or four meals together a week, those meals bear little resemblance to the old customs.
Family members now tend to forage for their own meals before sporadically making their way to the communal table, if they do at all. As soon as they’re finished eating, they leave the table. The food industry has conducted studies to observe the habits of families during meals, and seeing this new trend, they now market what they call “home meal replacements” to families. Each member can eat what they desire, whether low-carb, low-fat, low-cholesterol, or high-calorie. Portion sizes are dictated by whatever company packaged these ready-to-serve meals.
Social graces and culture have very little to do with the way American families eat these days. Still, nothing has threatened the Norman Rockwell culture more than snacks. Snacking now occurs in places that were once food-free environments. For instance, workplaces have well-stocked kitchens, and a box of pastries or bagels can often be found in work meetings. Times between meals are filled with foods packaged specifically for snacking.
Our snacking culture is so big that cars have been redesigned to support it. Cup holders are large enough to fit a bottle of soda or a Big Gulp. Glove compartments are refrigerated. And snack portions continue to grow. These snacks are rarely fruit slices or vegetable sticks. They are flavored, processed refined carbs, hydrogenated oils, salt, and sweeteners. This culture is so ingrained that changing it must be intentional. The following considerations can help.
1. Eat only at a table.
2. Don’t buy food from gas stations or convenience stores.
3. Eat with others when possible.
When external cues drive your eating habits, you're likely to eat more than you need. When you allow internal cues and your other senses to guide what and how you eat, you're likely to eat less.
Science suggests that it takes 20 minutes for the stomach to signal the brain that it is full, but Americans tend to eat meals faster than that. You can finish gorging before your body has time to react. When you eat slowly, you allow the body to feel satisfied and act accordingly. This slower pace is likely part of what makes the French paradox so compelling.
Until our culture shifts to slower rates of eating and internal cues for guidance, the following manipulations of external cues can help:
1. Serve smaller portions and use smaller plates.
2. Repackage snacks into smaller sizes.
3. Use tall glasses.
4. Make healthy foods available and visible; stick unhealthy foods in the back.
5. Serve food from cookware in the kitchen, rather than from serving dishes on the table.
Still, a complete shift to slower eating is the best option for many reasons, as the next rule suggests.
You gain more respect for the overall health of the food chain and promote food culture when healthy food is prepared and enjoyed intentionally. This belief is what the Slow Food movement (Slow Food) was founded on. The movement began in Italy as a response to what was seen as a hostile takeover by the Western diet when fast-food chains were introduced in Rome in the 1980s. The movement seeks to reclaim and introduce the traditional culture of eating healthier foods consciously grown, prepared, and eaten communally.
Quality is the priority of Slow Food, for the belief is that a focus on quality will awaken an appreciation for sensory pleasures and bridge the expanding gap between eaters and food producers. The suggestion is that reestablishing an appreciation for food is the proverbial straw needed to break the back of fast-paced, individually driven societies, like ours.
Slow Food is a hard sell in America because it would require a shift in national culture along with food culture. We value success and hard work over everything else, including time off and community bonding. Fast food fits this frenzied, distracted way of life. We don’t want to slow down to think about what to eat, let alone what is in what we’re eating, which is precisely what food manufacturers are counting on.
Take the fast-food hamburger. This hamburger is engineered with flavorings to make it more succulent. It is made to be devoured, thereby reducing the amount of time you have to think about where it comes from. You likely wouldn’t get past the first bite if you considered the conditions of the stockyards, slaughterhouses, or thousands of workers toiling at precarious factory jobs for minimum wage.
In contrast, a grass-fed burger made at home is more enjoyable to eat. First, it’s healthier for you. Second, with the knowledge of where the beef comes from, you can imagine pastured cows grazing and skilled workers humanely producing the meat. The food feels better intellectually and tastes better, giving you a greater sense of appreciation for what you’re eating.
Appreciation for food is found when eating becomes intentional, rather than habitual. Even something as simple as growing a few herbs at home can help you feel more connected to your food, which leads to the next rule.
When you feel ownership of your eating, you strengthen your relationship with food and help break your dependence on industry. You take responsibility for your health back from the industry and scientists and become the chef again in your kitchen. Your perspective is driven by nature, not nutritionism, and you help create stronger bonds with your food, community, and the earth.
The best way to reclaim this power is to start a garden. The direct line from garden to table makes you more aware of the larger food chain, including the health of the soil, plants, and yourself. You know what is going into your body because you know what has gone into the soil, and you won’t need to spend a lot of money on organic foods from the store or farmer’s market anymore.
Tending a garden requires physical labor and is a good form of exercise and recreation. Gardening provides a greater purpose than random fitness routines. You must also be educated about plants, soils, and techniques to garden, which expands your mind and strengthens mental faculties. This sounds like a lot of work, but most of this labor is easy, and when you go to “shop” in your garden for dinner and sit down to eat, the payoff is huge.
Gardening also brings you closer to understanding the synergistic power of the food chain. When you care for plants and help them reach their potential, they respond by signaling you when they're ready to be eaten. They change their colors, smells, sizes, tastes, and textures so you know when to harvest. Each time you harvest, you help them continue to grow.
For one reason or another, you may not get a vibrant, functioning garden the first time. When this happens, you will be forced to deepen your connection with the processes of life and gain more respect for farmers who consistently provide healthy produce to stores and markets. When your garden does work out, you’ll likely have so much produce, you’ll want to share it with others, which strengthens community and culture through food.
After harvesting your garden, you’ll be tasked with preparing the food, which creates a myriad of benefits for your health and the health of your culture. You’ll want to create variety in the dishes you make with your home-grown vegetables, so you’ll follow time-tested recipes or create your own. This trial and error gives you a familiarity with the food and brings you closer to your food heritage and those of other cultures.
This familiarity helps you gain an instinct about food that no amount of time spent in the grocery aisles reading labels can provide. And the last thing you'll be thinking about is your health or the type of nutrients a food contains. You won't wonder whether the food is healthy because you will know, simply by looking at and feeling the food, that it is alive and vital. You will learn what flavors pair well and cherish them for their taste, not their inherent health benefits. You will have confidence in what you're eating and how it was made because you made it.
At the moment, this type of behavior is subversive in our culture. In being aware of your food and enjoying it for more than just a pathway to health, you are sidestepping the hold nutritionism has on society. You are part of the food chain, not a bystander watching its depreciation. When the food chain is respected and supported, the result is the health of all things in nature, including you.
Food culture is about more than just what you eat, and understanding what makes up a food culture might provide a new perspective of your own habits.
Do you mostly eat three conventional meals a day, or do you eat more snacks and convenience meals? Why?
When you eat at home, do you cook or eat packaged meal replacements? Why is this your chosen behavior?
What is one thing you would like to change about your current eating habits?
What sort of benefit would eating meals with others add to your life?
If you had a garden, what would you grow? Why?
Now that you’ve heard Michael Pollan’s defense of food, do you think differently about the food you eat?
What is one surprising thing you learned about the food you typically eat?
Can you think of one or two ways the Western diet negatively affects your health?
How would you describe the Western diet to a friend?
What is one thing you will change about your eating habits based on what you’ve learned in this book?