Are the Foods We Eat Always Safe?

The Erosion of Food Integrity and Its Impact on Health


According to Webster’s New World College Dictionary, food is defined as:

  1. Any substance taken in and assimilated by a plant or animal to keep it alive and enable it to grow and repair tissue; nourishment; nutriment.

  2. Anything that nourishes or stimulates; whatever helps something to stay active, grow, etc.

But does modern food still fulfill these purposes, supporting growth and tissue repair and providing proper nourishment to both body and mind?

There was a time when the answer to that question was a resounding yes. Unfortunately, this is no longer true.

Take diabetes as a telling example. In 1880, approximately 2.8 people per 100,000 were diagnosed with diabetes. By 1949, that number had risen dramatically to 29.7 cases per 100,000. In that same year, however, the method of recording statistics changed, conveniently cutting the reported figure to 16.4 cases per 100,000 and obscuring the alarming rise in diabetic cases over that period. It's also worth noting that back then there was no distinction between Type I and Type II diabetes; it was all simply "diabetes."

Fast forward to today, and Type II diabetes alone affects 10% to 20% of the population—a massive leap from the mere 0.0028% seen in the 1880s. This surge appears to correlate directly with the reengineering of our food supply. Essential nutrients have been stripped from foods in the name of shelf life and profit, leading not only to diminished nutritional value but to widespread, chronic health issues.
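To put those two figures on the same scale, convert the 1880 rate into a percentage:

2.8 per 100,000 = 2.8 ÷ 100,000 = 0.000028, or 0.0028%

which is how a rate recorded in cases per 100,000 can be set directly against today's percentages.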

Looking across this span of more than a century, it's impossible to ignore the parallels between the corruption of our food supply and the explosion of diet-related diseases.

Artificial Food: Profit Over Public Health

Efforts to substitute artificial food for natural nutrition go back well over a century, at least to Napoleon III, who spurred innovation in this space for military and economic reasons. In response to a contest he held for an affordable butter alternative, Hippolyte Mège-Mouriès, a French chemist, invented what would become margarine; it was patented in England in 1869. By modern standards, this early margarine was scarcely palatable, containing ingredients such as hog fat, gelatin, bleach, mashed potatoes, gypsum, and casein. Nevertheless, it made its way into the American market in 1874.

In 1899, David Wesson developed a vacuum and high-temperature method to deodorize cottonseed oil, a waste product of the cotton industry. A year later, he introduced Wesson Oil to consumers. Around the same time, Wilhelm Normann patented the hydrogenation process, a technique for converting unsaturated fatty acids into saturated fats to prevent spoilage.

By 1911, the artificial fat industry was gaining serious traction. That year saw the debut of Crisco, which was marketed as a clean, shelf-stable alternative to animal fat and was even promoted as kosher to Jewish consumers.

Although margarine had already claimed around 40% of the market by the 1920s, it didn't become widely accepted in the U.S. until World War II, when wartime shortages eased restrictions and earlier bans were repealed. Margarine became a dietary staple, soon joined by Crisco and artificial lard. Refined oils also gained popularity during this period, primarily because they looked cleaner and lasted longer. Strangely enough, even insects wouldn't touch these refined oils when spilled, a red flag we ignored.

A Legacy of Misleading Science

By the 1930s, food science was increasingly driven by market interests rather than health concerns. A campaign began to wean consumers off traditional animal fats and cold-pressed vegetable oils, which had sustained generations. These were replaced with refined, artificial oils, promoted under the guise of science. Saturated fats were villainized, often by salespeople posing as scientists, and at times by real scientists who compromised their integrity for corporate gain.

Consider the Eskimo peoples of Alaska, whose traditional diet consisted of up to 60% animal fat. For generations, they showed no signs of diabetes. Once they embraced the typical American diet, supported by pipeline income and modern conveniences, their health deteriorated quickly; within a single generation, their disease rates mirrored those of the general U.S. population.

As studies multiplied, the food industry spun misleading narratives about artificial fats. Polyunsaturated and monounsaturated oils were marketed as heart-healthy, even though the processing they undergo converts some of their fatty acids into trans fats, known toxins to the human body. There is no legal requirement for companies to reveal these harmful processes, and so they don't. They'll tell you that polyunsaturated and monounsaturated oils are good for you, but omit that the oils have been chemically altered in ways that make them hazardous.

Cis fats, the form in which unsaturated fats occur naturally, are crucial for overall health. The terms "cis" and "trans" describe the geometry around a fat molecule's double bonds; hydrogenation flips the natural cis arrangement into the trans form. The difference is more than chemistry: it's life and death.

Where We Are Now

Given this history, it’s no surprise that our health has deteriorated alongside our food supply. Unless there is a significant shift away from artificial, processed foods toward whole, living foods, the diabetes epidemic—and the general decline in public health—will only continue.

It’s not just about diet anymore. It’s about awareness, accountability, and reclaiming the right to food that truly nourishes and sustains life.
