Archive for the ‘Nutrition’ Category

by Kris Gunnars

Eggs are among the most nutritious foods on the planet.

Just imagine… a whole egg contains all the nutrients needed to turn a single cell into an entire baby chicken.

However, eggs have gotten a bad reputation because the yolks are high in cholesterol.

In fact, a single medium-sized egg contains 186 mg of cholesterol, which is 62% of the recommended daily intake.
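The 62% figure implies the traditional 300 mg/day dietary cholesterol guideline (186 / 300 ≈ 0.62). A quick sketch of that arithmetic, assuming the 300 mg limit:

```python
# Share of the daily cholesterol guideline contributed by one egg.
# Assumes the traditional 300 mg/day limit implied by the article's 62% figure.
EGG_CHOLESTEROL_MG = 186
DAILY_LIMIT_MG = 300

share = EGG_CHOLESTEROL_MG / DAILY_LIMIT_MG
print(f"One egg supplies {share:.0%} of the daily limit")  # prints "One egg supplies 62% of the daily limit"
```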

People believed that if you ate cholesterol, it would raise cholesterol in the blood and contribute to heart disease.

But it turns out that it isn’t that simple. The more cholesterol you eat, the less your body produces.

Let me explain how that works…

How Your Body Regulates Cholesterol Levels

Cholesterol is often seen as a negative word.

When we hear it, we automatically start thinking of medication, heart attacks and early death.

But the truth is that cholesterol is a very important part of the body. It is a structural molecule that is an essential part of every single cell membrane.

It is also used to make steroid hormones like testosterone, estrogen and cortisol.

Without cholesterol, we wouldn’t even exist.

Given how incredibly important cholesterol is, the body has evolved elaborate ways to ensure that we always have enough of it available.

Because getting cholesterol from the diet isn’t always an option, the liver actually produces cholesterol.

But when we eat a lot of cholesterol-rich foods, the liver starts producing less (1, 2).

So the total amount of cholesterol in the body changes very little, if at all; it simply comes from the diet instead of from the liver (3, 4).

Bottom Line: The liver produces large amounts of cholesterol. When we eat a lot of eggs (high in cholesterol), the liver produces less instead.

What Happens When People Eat Several Whole Eggs Per Day?

For many decades, people have been advised to limit their consumption of eggs, or at least of egg yolks (the white is mostly protein and is low in cholesterol).

Common recommendations include a maximum of 2-6 yolks per week. However, there really isn’t much scientific support for these limitations (5).

Luckily, we do have a number of excellent studies that can put our minds at ease.

In these studies, people are split into two groups… one group eats several (1-3) whole eggs per day, the other group eats something else (like egg substitutes) instead. Then the researchers follow the people for a number of weeks/months.

These studies show that:

In almost all cases, HDL (the “good”) cholesterol goes up (6, 7, 8).
Total and LDL cholesterol levels usually don’t change, but sometimes they increase slightly (9, 10, 11, 12).
Eating Omega-3 enriched eggs can lower blood triglycerides, another important risk factor (13, 14).
Blood levels of carotenoid antioxidants like lutein and zeaxanthin increase significantly (15, 16, 17).

It appears that the response to whole egg consumption depends on the individual.

In 70% of people, it has no effect on Total or LDL cholesterol. However, in 30% of people (termed “hyper-responders”), these numbers do go up slightly (18).

That being said, I don’t think this is a problem. The studies show that eggs shift the LDL particles from small, dense LDL to large LDL (19, 20).

People who have predominantly large LDL particles have a lower risk of heart disease. So even if eggs cause mild increases in Total and LDL cholesterol levels, this is not a cause for concern (21, 22, 23).

The science is clear that up to 3 whole eggs per day are perfectly safe for healthy people.

Bottom Line: Eggs consistently raise HDL (the “good”) cholesterol. For 70% of people, there is no increase in Total or LDL cholesterol. There may be a mild increase in a benign subtype of LDL in some people.

Eggs and Heart Disease

Many studies have looked at egg consumption and the risk of heart disease.

All of these studies are so-called observational studies. In studies like these, large groups of people are followed for many years.

Then the researchers use statistical methods to figure out whether certain habits (like diet, smoking or exercise) are linked to either a decreased or increased risk of some disease.

These studies, some of which include hundreds of thousands of people, consistently show that people who eat whole eggs are no more likely to develop heart disease. Some of the studies even show a reduced risk of stroke (24, 25, 26).

However, one thing worth noting is that these studies show that diabetics who eat eggs are at an increased risk of heart disease (27).

Whether the eggs are causing the increased risk in diabetics is not known. These types of studies can only show a correlation and it is possible that the diabetics who eat eggs are, on average, less health conscious than those who don’t.

This may also depend on the rest of the diet. On a low-carb diet (by far the best diet for diabetics), eggs lead to improvements in heart disease risk factors (28, 29).

Bottom Line: Many observational studies show that people who eat eggs don’t have an increased risk of heart disease, but some of the studies do show an increased risk in diabetics.

For the rest of the article please go to:


The statements in this website or any of its links are for informational purposes only. They have not been evaluated by the US Food and Drug Administration and are not intended to diagnose, treat, cure or prevent any known or suspected disease. Any recommendations made are with the intent to support the normal psychological and biochemical processes of healing and good health.



Read Full Post »

MANINIS® Gluten Free uses 7 ancient grains in the certified gluten-free blending of our mixes. MANINIS Gluten Free Mixes contain at least four and as many as six of the following naturally gluten-free ancient grains, in various combinations: Organic Millet, Teff, Organic Quinoa, Certified Gluten Free Oats, Flax, Organic Amaranth, Organic Sorghum. Each grain in itself has amazing nutritional qualities. Most important of all, because MANINIS Gluten Free is a family living with celiac disease, we have carefully chosen the growers of these ancient grains after many years of testing and retesting to be sure they could provide us with consistent gluten-free results. The following information gives you an overview of the origin and nutritional content of each of these grains:


Read Full Post »

Gluten-free is hot these days. There are books and websites, restaurants with gluten-free menus, and grocery stores with hundreds of new gluten-free food products on the shelf. Is this a fad, or a response to a real problem?

Yes, gluten is a real problem. But the problem is not just gluten. In fact, there are three major hidden reasons that wheat products, not just gluten (along with sugar in all its forms), are a major contributor to obesity, diabetes, heart disease, cancer, dementia, depression and so many other modern ills.

This is why there are now 30 percent more obese than undernourished people in the world, and why chronic lifestyle- and diet-driven disease kills more than twice as many people as infectious disease globally. These non-communicable, chronic diseases will cost our global economy $47 trillion over the next 20 years.

Sadly, this tsunami of chronic illness is increasingly caused by eating our beloved diet staple, bread, the staff of life, and all the wheat products hidden in everything from soups to vodka to lipstick to envelope adhesive.

The biggest problem is wheat, the major source of gluten in our diet. But wheat weaves its misery through many mechanisms, not just the gluten! The history of wheat parallels the history of chronic disease and obesity across the world. Supermarkets today contain walls of wheat and corn disguised in literally hundreds of thousands of different food-like products, or FrankenFoods. Each American now consumes about 55 pounds of wheat flour every year.

It is not just the amount but also the hidden components of wheat that drive weight gain and disease. This is not the wheat your great-grandmother used to bake her bread. It is FrankenWheat — a scientifically engineered food product developed in the last 50 years.

How Wheat — and Gluten — Trigger Weight Gain, Prediabetes, Diabetes and More

This new modern wheat may look like wheat, but it is different in three important ways that all drive obesity, diabetes, heart disease, cancer, dementia and more.

  1. It contains a Super Starch, amylopectin A, that is super fattening.
  2. It contains a form of Super Gluten that is super-inflammatory.
  3. It contains forms of a Super Drug that is super-addictive and makes you crave and eat more.

The Super Starch

The Bible says, “Give us this day our daily bread.” Eating bread is nearly a religious commandment. But the Einkorn, heirloom, Biblical wheat of our ancestors is something modern humans never eat.

Instead, we eat dwarf wheat, the product of genetic manipulation and hybridization that created short, stubby, hardy, high-yielding wheat plants with much higher amounts of starch and gluten and many more chromosomes coding for all sorts of new odd proteins. The man who engineered this modern wheat won the Nobel Prize — it promised to feed millions of starving people around the world. Well, it has, and it has also made them fat and sick.

The first major difference of this dwarf wheat is that it contains very high levels of a super starch called amylopectin A. This is how we get big fluffy Wonder Bread and Cinnabons.

Here’s the downside. Two slices of whole wheat bread now raise your blood sugar more than two tablespoons of table sugar.

There is no difference between whole wheat and white flour here. The biggest scam perpetrated on the unsuspecting public is the inclusion of “whole grains” in many processed foods full of sugar and wheat, giving the food a virtuous glow. The best way to avoid foods that are bad for you is to stay away from foods with health claims on the labels. They are usually hiding something bad.

In people with diabetes, both white and whole grain bread raise blood sugar levels 70 to 120 mg/dl over starting levels. We know that foods with a high glycemic index make people store belly fat, trigger hidden fires of inflammation in the body and give you a fatty liver, leading to the whole cascade of obesity, pre-diabetes and diabetes. This problem now affects every other American and is the major driver of nearly all chronic disease and most of our health care costs. Diabetes now sucks up one in three Medicare dollars.

The Super Gluten

Not only does this dwarf FrankenWheat contain the super starch, but it also contains super gluten, which is much more likely to create inflammation in the body. And in addition to a host of inflammatory and chronic diseases caused by gluten, it causes obesity and diabetes.

Gluten is that sticky protein in wheat that holds bread together and makes it rise. The old fourteen-chromosome Einkorn wheat codes for a small number of gluten proteins, and those it does produce are the least likely to trigger celiac disease and inflammation. The new dwarf wheat contains twenty-eight chromosomes, twice as many, and produces a large variety of gluten proteins, including the ones most likely to cause celiac disease.

Five Ways Gluten Makes You Sick and Fat

Gluten can trigger inflammation, obesity and chronic disease in five major ways.

  1. Full-blown celiac disease is an autoimmune disease that triggers body-wide inflammation, which drives insulin resistance, which in turn causes weight gain and diabetes, as well as over 55 conditions including autoimmune diseases, irritable bowel, reflux, cancer, depression, osteoporosis and more.
  2. Low-level inflammatory reactions to gluten trigger the same problems even if you don’t have full-blown celiac disease but just have elevated antibodies (7 percent of the population, or 21 million Americans).
  3. There is also striking new research showing that adverse immune reactions to gluten may result from problems in very different parts of the immune system than those implicated in celiac disease. Most doctors dismiss gluten sensitivity if you don’t have a diagnosis of celiac disease, but this new research proves them wrong. Celiac disease results when the body creates antibodies against the wheat (adaptive immunity), but another kind of gluten sensitivity results from a generalized activated immune system (innate immunity). This means that people can be gluten-sensitive without having celiac disease or gluten antibodies and still have inflammation and many other symptoms.
  4. A NON-gluten glycoprotein or lectin (a combination of sugar and protein) in wheat called wheat germ agglutinin (WGA)[1], found in highest concentrations in whole wheat, increases whole-body inflammation as well. This is not an autoimmune reaction, but it can be just as dangerous and cause heart attacks.[2]
  5. Eating too much gluten-free junk food, like gluten-free cookies, cakes and processed food, causes the same problems. Processed food has a high glycemic load. Just because it is gluten-free doesn’t mean it is healthy. Gluten-free cakes and cookies are still cakes and cookies! Vegetables, fruits, beans, nuts and seeds and lean animal protein are all gluten-free — stick with those.

Let’s look at this a little more closely. Gluten, a protein found in wheat, barley, rye, spelt and oats, can cause celiac disease, which triggers severe inflammation throughout the body and has been linked to autoimmune diseases, mood disorders, autism, schizophrenia, dementia, digestive disorders, nutritional deficiencies, diabetes, cancer and more.

Celiac Disease: The First Problem

Celiac disease and gluten-related problems have been increasing, and now affect at least 21 million Americans and perhaps many millions more. And 99 percent of people who have problems with gluten or wheat are NOT currently diagnosed.

Ninety-eight percent of people with celiac have a genetic predisposition known as HLA DQ2 or DQ8, which occurs in 30 percent of the population. But even though our genes haven’t changed, we have seen a dramatic increase in celiac disease in the last 50 years because of some environmental trigger.

In a recent study that compared blood samples taken 50 years ago from 10,000 young Air Force recruits to samples taken recently from 10,000 people, researchers found something quite remarkable. There has been a real 400 percent increase in celiac disease over the last 50 years.[3] And that’s just the full-blown disease, affecting about one in 100 people, or about three million Americans. We used to think that this was only diagnosed in children with bloated bellies, weight loss and nutritional deficiencies. But now we know it can be triggered (based on a genetic susceptibility) at any age and without ANY digestive symptoms. The inflammation triggered by celiac disease can drive insulin resistance, weight gain and diabetes, just like any inflammatory trigger — and I have seen this over and over in my patients.

Gluten and Gut Inflammation: The Second Problem

But there are two ways other than celiac disease in which wheat appears to be a problem.

The second way that gluten causes inflammation is through a low-grade autoimmune reaction to gluten. Your immune system creates low-level antibodies to gluten, but doesn’t create full-blown celiac disease. In fact, 7 percent of the population, or 21 million Americans, have these anti-gliadin antibodies. These antibodies were also found in 18 percent of people with autism and 20 percent of those with schizophrenia.

A major study in the Journal of the American Medical Association reported that hidden gluten sensitivity (elevated antibodies without full-blown celiac disease) was shown to increase risk of death by 35 to 75 percent, mostly by causing heart disease and cancer.[4] Just by this mechanism alone, over 20 million Americans are at risk for heart attack, obesity, cancer and death.

How does eating gluten cause inflammation, heart disease, obesity, diabetes and cancer?

Most of the increased risk occurs when gluten triggers inflammation that spreads like a fire throughout your whole body. It damages the gut lining. Then all the bugs and partially-digested food particles inside your intestine get across the gut barrier and are exposed to your immune system, 60 percent of which lies right under the surface of the one-cell-thick layer of cells lining your gut or small intestine. If you spread out the lining of your gut, it would equal the surface area of a tennis court. Your immune system starts attacking these foreign proteins, leading to systemic inflammation that then causes heart disease, dementia, cancer, diabetes and more.

Dr. Alessio Fasano, a celiac expert from the University of Maryland School of Medicine, discovered a protein made in the intestine called “zonulin” that is increased by exposure to gluten.[5] Zonulin breaks up the tight junctions or cement between the intestinal cells that normally protect your immune system from bugs and foreign proteins in food leaking across the intestinal barrier. If you have a “leaky gut,” you will get inflammation throughout your whole body and a whole list of symptoms and diseases.

Why is there an increase in disease from gluten in the last 50 years?

It is because, as I described earlier, the dwarf wheat grown in this country has changed the quality and type of gluten proteins in wheat, creating much higher gluten content and many more of the gluten proteins that cause celiac disease and autoimmune antibodies.

Combine that with the damage our guts have suffered from our diet, environment, lifestyle and medication use, and you have the perfect storm for gluten intolerance. This super gluten crosses our leaky guts and gets exposed to our immune system. Our immune system reacts as if gluten were something foreign, and sets off the fires of inflammation in an attempt to eliminate it. However, this inflammation is not selective, so it begins to attack our cells — leading to diabesity and other inflammatory diseases.

Damage to the gastrointestinal tract from overuse of antibiotics, anti-inflammatory drugs like Advil or Aleve and acid-blocking drugs like Prilosec or Nexium, combined with our low-fiber, high-sugar diet, leads to the development of celiac disease and gluten intolerance or sensitivity and the resultant inflammation. That is why elimination of gluten and food allergens or sensitivities can be a powerful way to prevent and reverse diabesity and many other chronic diseases.

The Super Drug

Not only does wheat contain super starch and super gluten — making it super fattening and super inflammatory — but it also contains a super drug that makes you crazy, hungry and addicted.

When processed by your digestion, the proteins in wheat are converted into shorter proteins, “polypeptides,” called “exorphins.” Like the endorphins you get from a runner’s high, they bind to the opioid receptors in the brain, making you high and addicted, just like a heroin addict. These wheat polypeptides are absorbed into the bloodstream and get right across the blood-brain barrier. They are called “gluteomorphins,” after “gluten” and “morphine.”

These super drugs can cause multiple problems, including schizophrenia and autism. But they also cause addictive eating behavior, including cravings and bingeing. No one binges on broccoli, but they binge on cookies or cake. Even more alarming is the fact that you can block these food cravings and addictive eating behaviors and reduce calorie intake by giving the same drug we use in the emergency room to block heroin or morphine in an overdose, called naloxone. Binge eaters ate nearly 30 percent less food when given this drug.

Bottom line: wheat is an addictive appetite stimulant.

How to Beat the Wheat, and Lose the Weight

First, you should get tested to see if you have a more serious wheat or gluten problem.

If you meet any of these criteria, then you should do a six-week 100 percent gluten-free diet trial to see how you feel. If you have three out of five criteria, you should be gluten-free for life.

  1. You have symptoms of celiac (any digestive, allergic, autoimmune or inflammatory disease, including diabesity).
  2. You get better on a gluten-free diet.
  3. You have elevated antibodies to gluten (anti-gliadin, AGA, or tissue transglutaminase antibodies, TTG).
  4. You have a positive small intestinal biopsy.
  5. You have the genes that predispose you to gluten sensitivity (HLA DQ2/8).

Second, for the rest of you who don’t have gluten antibodies or some variety of celiac — the super starch and the super drug, both of which make you fat and sick, can still affect you. So go cold turkey for six weeks. And keep a journal of how you feel.

The problems with wheat are real, scientifically validated and ever-present. Getting off wheat may not only make you feel better and lose weight, it could save your life.

My personal hope is that together we can create a national conversation about a real, practical solution for the prevention, treatment, and reversal of our obesity, diabetes and chronic disease epidemic. Getting off wheat may just be an important step.

To learn more and to get a free sneak preview of The Blood Sugar Solution, where I explain exactly how to avoid wheat and what to eat instead, go to www.drhyman.com.

Please leave your thoughts by adding a comment below.

To your good health,

Mark Hyman, MD


[1] Saja K, Chatterjee U, Chatterjee BP, Sudhakaran PR. “Activation dependent expression of MMPs in peripheral blood mononuclear cells involves protein kinase A.” Mol Cell Biochem. 2007 Feb;296(1-2):185-92.

[2] Dalla Pellegrina C, Perbellini O, Scupoli MT, Tomelleri C, Zanetti C, Zoccatelli G, Fusi M, Peruffo A, Rizzi C, Chignola R. “Effects of wheat germ agglutinin on human gastrointestinal epithelium: insights from an experimental model of immune/epithelial cell interaction.” Toxicol Appl Pharmacol. 2009 Jun 1;237(2):146-53.

[3] Rubio-Tapia A, Kyle RA, Kaplan EL, Johnson DR, Page W, Erdtmann F, Brantner TL, Kim WR, Phelps TK, Lahr BD, Zinsmeister AR, Melton LJ 3rd, Murray JA. “Increased prevalence and mortality in undiagnosed celiac disease.” Gastroenterology. 2009 Jul;137(1):88-93.

[4] Ludvigsson JF, Montgomery SM, Ekbom A, Brandt L, Granath F. “Small-intestinal histopathology and mortality risk in celiac disease.” JAMA. 2009 Sep 16;302(11):1171-8.

[5] Fasano A. “Physiological, pathological, and therapeutic implications of zonulin-mediated intestinal barrier modulation: living life on the edge of the wall.” Am J Pathol. 2008 Nov;173(5):1243-52.

Mark Hyman, M.D. is a practicing physician, founder of The UltraWellness Center, a four-time New York Times bestselling author, and an international leader in the field of Functional Medicine. You can follow him on Twitter, connect with him on LinkedIn, watch his videos on YouTube, become a fan on Facebook, and subscribe to his newsletter.

For more by Mark Hyman, M.D., click here.

For more on diet and nutrition, click here.

For more on personal health, click here.

For more on weight loss, click here.


Read Full Post »

Balanced body chemistry is of utmost importance for the maintenance of health and correction of disease. Acidosis, or over-acidity in the body tissues, is one of the basic causes of many diseases, especially the arthritic and rheumatic diseases.

All foods are “burned” in the body, more commonly called “digested,” leaving an ash as the result of the “burning,” or the digestion. This food ash can be neutral, acid or alkaline, depending largely on the mineral composition of the foods. Some foods leave an acid residue or ash, some alkaline. The acid ash (acidosis) results when there is a depletion of the alkali reserve or a diminution in the reserve supply of fixed bases in the blood and the tissues of the body.

It is, therefore, vitally important that there is a proper ratio between acid and alkaline foods in the diet. The natural ratio in a normal healthy body is approximately 4 to 1 — four parts alkaline to one part acid, or 80% to 20%. When such an ideal ratio is maintained, the body has a strong resistance against disease. In the healing of disease, when the patient usually has acidosis, the higher the ratio of alkaline elements in the diet, the faster will be the recovery. Alkalis neutralize the acids. Therefore in the treatment of most diseases it is important that the patient’s diet includes plenty of alkaline-ash foods to offset the effects of acid-forming foods and leave a safe margin of alkalinity.

A healthy body usually keeps large alkaline reserves which are used to meet the emergency demands if too many acid-producing foods are consumed. But these normal reserves can be depleted. When the alkaline-acid ratio drops to 3 to 1, health can be seriously menaced. Your body can function normally and sustain health only in the presence of adequate alkaline reserves and the proper acid-alkaline ratio in all the body tissues and the blood.

For optimum health and maximum resistance to disease, it is imperative that your diet is slightly over-alkaline. The ideal ratio, according to Dr. Ragnar Berg, the world’s foremost authority on the relationship between the dietary acid-alkaline ratio and health and disease, is about 80% alkali-producing foods and 20% acid-producing foods.

Below are tables of common foods with an approximate potential acidity or alkalinity, as present in one ounce of food.

Alkali-Forming Foods

Figs 30.0
Soy Beans 12.0
Lima Beans 12.0
Apricots 9.5
Spinach 8.0
Turnip/Beet Tops 8.0
Raisins 7.0
Almonds 3.6
Carrots 3.5
Dates 3.0
Celery 2.5
Cucumber 2.5
Cantaloupe 2.5
Lettuce 2.2
Watercress 2.0
Potatoes 2.0
Pineapple 2.0
Cabbage 1.8
Grapefruit 1.7
Tomatoes 1.7
Peaches 1.5
Apples 1.0
Grapes 1.0
Bananas 1.0
Watermelon 1.0
Millet 0.5
Brazil Nuts 0.5
Coconuts 0.5
Buckwheat 0.5

Neutral (or Near-Neutral) Ash Foods

Milk
Butter
Vegetable oils
White sugar

Acid-Forming Foods

Oysters 5.0
Veal 3.5
Most fish 3.5
Organ meats 3.0
Liver 3.0
Chicken 3.0
Fowl 3.0
Eggs 3.0
Most grains 3.0
Rice 2.5
Whole wheat/rye bread 2.5
Most nuts (except almonds and Brazil nuts) 2.0
Natural cheese 1.5
Lentils 1.5
Peanuts 1.0

Most grains are acid-forming, except millet and buckwheat, which are considered to be alkaline. Sprouted seeds and grains become more alkaline in the process of sprouting. All vegetable and fruit juices are highly alkaline. The most alkali-forming juices are: fig juice, green juices of all green vegetables and tops, carrots, beet, celery, pineapple and citrus juices.
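As a rough illustration of how the 4-to-1 (80%/20%) target described above might be checked against these tables, here is a hypothetical sketch. The per-ounce ash values come from the tables; the sample day of foods is invented purely for illustration:

```python
# Hypothetical check of a day's alkaline-to-acid ash balance, using the
# per-ounce values from the tables above. Positive = alkali-forming,
# negative = acid-forming. The meal itself is an invented example.
ASH_PER_OUNCE = {
    "figs": 30.0, "spinach": 8.0, "carrots": 3.5, "apples": 1.0,
    "chicken": -3.0, "rice": -2.5, "eggs": -3.0,
}

# Invented sample day: food -> ounces eaten.
meal = {"figs": 1, "spinach": 4, "carrots": 3, "apples": 6,
        "chicken": 4, "rice": 3, "eggs": 2}

alkaline = sum(ASH_PER_OUNCE[f] * oz for f, oz in meal.items() if ASH_PER_OUNCE[f] > 0)
acid = -sum(ASH_PER_OUNCE[f] * oz for f, oz in meal.items() if ASH_PER_OUNCE[f] < 0)

ratio = alkaline / acid
print(f"alkaline {alkaline:.1f}, acid {acid:.1f}, ratio {ratio:.1f}:1")
```

For this invented day the ratio works out to roughly 3:1, below the 4-to-1 ideal, so by the article's reasoning more alkaline-ash foods would be called for.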

MANINIS Gluten Free mixes are made with millet.


The information contained in this site is presented for general education purposes only. It is not intended to be a substitute for specific medical advice, which the visitor can obtain only from a qualified health professional.

Neither the author nor the management of this site assumes responsibility or liability for any consequences of the reader failing to obtain such specific medical advice from a qualified health professional, or of the reader attempting to treat his or her own health problems using any or all of the information contained in this site or others managed by Maninis.

We strongly advise that in any case of ill health, the reader seek the advice of a qualified health professional.

Read Full Post »

Think “food allergy” and you might conjure the worst-case scenario, like a child going into anaphylactic shock after exposure to peanuts. No doubt, a severe food allergy is scary. But it’s also relatively rare. A much more common scenario is an adult with a low-grade food allergy to, say, gluten, who never pinpoints the cause of his misery. His symptoms are vague (bloating, constipation, weight gain) and his exposure is frequent (breakfast, lunch and dinner), so the connection is murky. And, over years, the hidden allergy takes a toll on the immune system. The result of an overworked immune system is everything from weight gain to irritable bowel syndrome (IBS) to arthritis.

That’s what happened to a patient of mine. John weighed 350 pounds and was facing diabetes. But his blood sugar problem was only the tip of the iceberg. He also had joint pain, asthma, crippling fatigue and a sleep disorder. To combat his lethargy, he craved diet soda and fast food for their starchy carbs, a false source of fast energy. What he didn’t know was that he had celiac disease, a serious autoimmune disease fed by his daily indulgence in bagels and donuts. Celiac disease causes the immune system to turn on itself, attacking the healthy lining of the digestive tract. And the major trigger is gluten, a sticky protein found in many grains, including John’s daily dose of bagels and donuts. Unchecked autoimmune diseases mean the gut is in a constant state of inflammation, a breeding ground for chronic illness.

Food Sensitivities and Inflammation

John’s story is not unique. Inflammation is one of the biggest drivers of weight gain and disease in America. While celiac afflicts roughly 1 percent of Americans, as many as 30 percent may have non-celiac gluten intolerance.[1] The key difference is that in people with celiac disease, the body attacks the small intestine. But in people with non-celiac gluten intolerance, the immune system attacks the gluten. A recent article in The New England Journal of Medicine listed 55 “diseases” that can be traced back to eating gluten.[2] Either way, the gut festers out of sight. And when the lining of the gut is inflamed, the body is even more prone to food reactions, so the problem spirals out of control.

When the lining of the gut is inflamed, small fissures open between the tightly-woven cells making up the gut walls. Called leaky gut syndrome, these chinks in the gut’s armor allow bacteria and partially-digested food molecules to slip out into the bloodstream, where they are considered foreign invaders. Once it spies a potential enemy, the body doesn’t hold back. The immune system attacks full throttle. White blood cells rush to surround the offending particle and systemic inflammation ensues. I’m not talking about a sore throat or infected finger. I’m talking about a hidden, smoldering fire created by the immune system as it tries to fend off a daily onslaught of food allergies.

The problem is that most people, like John, eat foods they are allergic to several times a day. This means that every time the food enters the body, the immune system whips itself into a frenzy. But because symptoms are delayed up to 72 hours after eating, a low-grade food allergy can be hard to spot. Without diagnosis or awareness, the damage is repeated over and over, meal after meal. Eventually, inflammation seeps throughout the body, establishing an environment ripe for weight gain and chronic disease.

Identifying and treating food allergies and food sensitivities is an important part of my practice. Six weeks after John went gluten-free on The Blood Sugar Solution, not only did he lose three notches on his belt, but his knees didn’t hurt, his asthma was gone, he wasn’t hungry and his energy was back. John’s response was not unusual. I have seen dramatic effects in weight loss, inflammatory conditions like autoimmune disease and even mood and behavioral disorders.

The problem is that most physicians, especially allergists, don’t see the value in uncovering hidden food allergies. That is unfortunate because there is a growing body of medical literature illuminating the intimate relationship between the gut, food and illness. Luckily, you don’t have to wait for your doctor to catch up with the times. Here are three ways to determine if food allergies are undermining your health.

Three Ways to Identify Food Allergies

  1. Get a blood test. Blood testing for IgG food allergens (Immuno Labs and other labs) can help you to identify hidden food allergies. While these tests do have limitations and need to be interpreted in the context of the rest of your health, they can be useful guides to what’s bothering YOU in particular. When considering blood tests for allergens, it’s always a good idea to work with a doctor or nutritionist trained in dealing with food allergies.
  2. Go dairy- and gluten-free for six weeks. Dairy and gluten are the most common triggers of food allergies. For patients who have trouble losing weight, I often recommend a short elimination as part of The Blood Sugar Solution. Both dairy (milk, cheese, butter and yogurt) and gluten (most often found in wheat, barley, rye, oats, spelt, triticale and kamut) are linked to insulin resistance and, therefore, weight gain. Temporarily cutting them out of the diet allows the inflamed gut to heal. This one move may be the single most important thing you can do to lose weight.


  3. Avoid the top food allergens. If you don’t feel a sense of relief from nixing dairy and gluten, you may need to take the elimination diet one step further by cutting out the top food allergens: gluten, dairy, corn, eggs, soy, nuts, nightshades (tomatoes, bell peppers, potatoes and eggplant), citrus and yeast (baker’s, brewer’s yeast and fermented products like vinegar). Try this for a full six weeks. That is enough time to feel better and notice a change. When you reintroduce a top food allergen, eat it at least two to three times a day for three days to see if you notice a reaction. If you do, note the food and eliminate it for 90 days.
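
The timeline in steps 2 and 3 can be laid out concretely. This is a minimal sketch of the schedule described above (six weeks of elimination, a three-day reintroduction per food, and a 90-day re-elimination for any food that triggers a reaction); the function name and data layout are hypothetical, invented for illustration, not part of any published protocol.

```python
from datetime import date, timedelta

def elimination_schedule(start, foods, reacted=()):
    """Sketch of the elimination-diet timeline described in the text."""
    elimination_end = start + timedelta(weeks=6)  # six weeks fully off the foods
    plan = {"eliminate_until": elimination_end}
    day = elimination_end
    for food in foods:
        # Reintroduce one food at a time, eating it two to three times a day
        # for three days while watching for a reaction.
        plan[food] = {"reintroduce": (day, day + timedelta(days=3))}
        if food in reacted:
            # A food that provokes a reaction goes back out of the diet for 90 days.
            plan[food]["avoid_until"] = day + timedelta(days=3 + 90)
        day += timedelta(days=3)
    return plan

plan = elimination_schedule(date(2024, 1, 1), ["gluten", "dairy"], reacted={"dairy"})
print(plan["eliminate_until"])  # 2024-02-12 (six weeks after the start date)
```

The point of writing it out is simply that the protocol is sequential: foods are tested one at a time, so reactions can be attributed to a single food.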


If you are overweight or if you suffer from inflammatory diseases, such as heart disease, diabetes and cancer, the potential health benefits of discovering and uprooting hidden food allergies cannot be overstated. Remember, food is your greatest ally in helping to prevent and treat illness. For more information see The Blood Sugar Solution to get a free sneak peek.

Now I’d like to hear from you…

Do you have food allergies?

Are you gluten intolerant?

Have you eliminated your food sensitivities and lost weight?

Please leave your thoughts by adding a comment below.

To your good health,

Mark Hyman, MD



Mark Hyman, M.D. is a practicing physician, founder of The UltraWellness Center, a four-time New York Times bestselling author, and an international leader in the field of Functional Medicine. You can follow him on Twitter, connect with him on LinkedIn, watch his videos on YouTube, become a fan on Facebook, and subscribe to his newsletter.






The origins of agriculture: a biological perspective and a new hypothesis

by Greg Wadley and Angus Martin

published in Australian Biologist volume 6: pp 96-105, June 1993

(re-published in Journal of ACNEM 2000)


What might head a list of the defining characteristics of the human species? While our view of ourselves could hardly avoid highlighting our accomplishments in engineering, art, medicine, space travel and the like, in a more dispassionate assessment agriculture would probably displace all other contenders for top billing. Most of the other achievements of humankind have followed from this one. Almost without exception, all people on earth today are sustained by agriculture. With a minute number of exceptions, no other species is a farmer. Essentially all of the arable land in the world is under cultivation. Yet agriculture began just a few thousand years ago, long after the appearance of anatomically modern humans.

Given the rate and the scope of this revolution in human biology, it is quite extraordinary that there is no generally accepted model accounting for the origin of agriculture. Indeed, an increasing array of arguments over recent years has suggested that agriculture, far from being a natural and upward step, in fact led commonly to a lower quality of life. Hunter-gatherers typically do less work for the same amount of food, are healthier, and are less prone to famine than primitive farmers (Lee & DeVore 1968, Cohen 1977, 1989). A biological assessment of what has been called the puzzle of agriculture might phrase it in simple ethological terms: why was this behaviour (agriculture) reinforced (and hence selected for) if it was not offering adaptive rewards surpassing those accruing to hunter-gathering or foraging economies?

This paradox is responsible for a profusion of models of the origin of agriculture. ‘Few topics in prehistory’, noted Hayden (1990) ‘have engendered as much discussion and resulted in so few satisfying answers as the attempt to explain why hunter/gatherers began to cultivate plants and raise animals. Climatic change, population pressure, sedentism, resource concentration from desertification, girls’ hormones, land ownership, geniuses, rituals, scheduling conflicts, random genetic kicks, natural selection, broad spectrum adaptation and multicausal retreats from explanation have all been proffered to explain domestication. All have major flaws … the data do not accord well with any one of these models.’

Recent discoveries of potentially psychoactive substances in certain agricultural products — cereals and milk — suggest an additional perspective on the adoption of agriculture and the behavioural changes (‘civilisation’) that followed it. In this paper we review the evidence for the drug-like properties of these foods, and then show how they can help to solve the biological puzzle just described.

The emergence of agriculture and civilisation in the Neolithic

The transition to agriculture

From about 10,000 years ago, groups of people in several areas around the world began to abandon the foraging lifestyle that had been successful, universal and largely unchanged for millennia (Lee & DeVore 1968). They began to gather, then cultivate and settle around, patches of cereal grasses and to domesticate animals for meat, labour, skins and other materials, and milk.

Farming, based predominantly on wheat and barley, first appeared in the Middle East, and spread quickly to western Asia, Egypt and Europe. The earliest civilisations all relied primarily on cereal agriculture. Cultivation of fruit trees began three thousand years later, again in the Middle East, and vegetables and other crops followed (Zohari 1986). Cultivation of rice began in Asia about 7000 years ago (Stark 1986).

To this day, for most people, two-thirds of protein and calorie intake is cereal-derived. (In the west, in the twentieth century, cereal consumption has decreased slightly in favour of meat, sugar, fats and so on.) The respective contributions of each cereal to current total world production are: wheat (28 per cent), corn/maize (27 per cent), rice (25 per cent), barley (10 per cent), others (10 per cent) (Pedersen et al. 1989).
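
The production shares quoted from Pedersen et al. can be checked with trivial arithmetic; this snippet simply encodes the figures from the paragraph above.

```python
# World cereal production shares as quoted in the text (Pedersen et al. 1989).
shares = {"wheat": 28, "maize": 27, "rice": 25, "barley": 10, "others": 10}

# The five categories account for all world cereal production.
print(sum(shares.values()))  # 100

# Wheat, maize and rice alone make up four-fifths of the total.
print(shares["wheat"] + shares["maize"] + shares["rice"])  # 80
```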

The change in the diet due to agriculture

The modern human diet is very different from that of closely related primates and, almost certainly, early hominids (Gordon 1987). Though there is controversy over what humans ate before the development of agriculture, the diet certainly did not include cereals and milk in appreciable quantities. The storage pits and processing tools necessary for significant consumption of cereals did not appear until the Neolithic (Washburn & Lancaster 1968). Dairy products were not available in quantity before the domestication of animals.

The early hominid diet (from about four million years ago), evolving as it did from that of primate ancestors, consisted primarily of fruits, nuts and other vegetable matter, and some meat — items that could be foraged for and eaten with little or no processing. Comparisons of primate and fossil-hominid anatomy, and of the types and distribution of plants eaten raw by modern chimpanzees, baboons and humans (Peters & O’Brien 1981, Kay 1985), as well as microscope analysis of wear patterns on fossil teeth (Walker 1981, Peuch et al. 1983) suggest that australopithecines were ‘mainly frugivorous omnivores with a dietary pattern similar to that of modern chimpanzees’ (Susman 1987:171).

The diet of pre-agricultural but anatomically modern humans (from 30,000 years ago) diversified somewhat, but still consisted of meat, fruits, nuts, legumes, edible roots and tubers, with consumption of cereal seeds only increasing towards the end of the Pleistocene (e.g. Constantini 1989 and subsequent chapters in Harris and Hillman 1989).

The rise of civilisation

Within a few thousand years of the adoption of cereal agriculture, the old hunter-gatherer style of social organisation began to decline. Large, hierarchically organised societies appeared, centred around villages and then cities. With the rise of civilisation and the state came socioeconomic classes, job specialisation, governments and armies.

The size of populations living as coordinated units rose dramatically above pre-agricultural norms. While hunter-gatherers lived in egalitarian, autonomous bands of about 20 closely related persons, with at most a tribal level of organisation above that, early agricultural villages had 50 to 200 inhabitants, and early cities 10,000 or more. People ‘had to learn to curb deep-rooted forces which worked for increasing conflict and violence in large groups’ (Pfeiffer 1977:438).

Agriculture and civilisation meant the end of foraging — a subsistence method with short-term goals and rewards — and the beginning (for most) of regular arduous work, oriented to future payoffs and the demands of superiors. ‘With the coming of large communities, families no longer cultivated the land for themselves and their immediate needs alone, but for strangers and for the future. They worked all day instead of a few hours a day, as hunter-gatherers had done. There were schedules, quotas, overseers, and punishments for slacking off’ (Pfeiffer 1977:21).

Explaining the origins of agriculture and civilisation

The phenomena of human agriculture and civilisation are ethologically interesting, because (1) virtually no other species lives this way, and (2) humans did not live this way until relatively recently. Why was this way of life adopted, and why has it become dominant in the human species?

Problems explaining agriculture

Until recent decades, the transition to farming was seen as an inherently progressive one: people learnt that planting seeds caused crops to grow, and this new improved food source led to larger populations, sedentary farm and town life, more leisure time and so to specialisation, writing, technological advances and civilisation. It is now clear that agriculture was adopted despite certain disadvantages of that lifestyle (e.g. Flannery 1973, Henry 1989). There is a substantial literature (e.g. Reed 1977), not only on how agriculture began, but why. Palaeopathological and comparative studies show that health deteriorated in populations that adopted cereal agriculture, returning to pre-agricultural levels only in modern times. This is in part attributable to the spread of infection in crowded cities, but is largely due to a decline in dietary quality that accompanied intensive cereal farming (Cohen 1989). People in many parts of the world remained hunter-gatherers until quite recently; though they were quite aware of the existence and methods of agriculture, they declined to undertake it (Lee & DeVore 1968, Harris 1977). Cohen (1977:141) summarised the problem by asking: ‘If agriculture provides neither better diet, nor greater dietary reliability, nor greater ease, but conversely appears to provide a poorer diet, less reliably, with greater labor costs, why does anyone become a farmer?’

Many explanations have been offered, usually centred around a particular factor that forced the adoption of agriculture, such as environmental or population pressure (for reviews see Rindos 1984, Pryor 1986, Redding 1988, Blumler & Byrne 1991). Each of these models has been criticised extensively, and there is at this time no generally accepted explanation of the origin of agriculture.

Problems explaining civilisation

A similar problem is posed by the post-agricultural appearance, all over the world, of cities and states, and again there is a large literature devoted to explaining it (e.g. Claessen & Skalnik 1978). The major behavioural changes made in adopting the civilised lifestyle beg explanation. Bledsoe (1987:136) summarised the situation thus:

‘There has never been and there is not now agreement on the nature and significance of the rise of civilisation. The questions posed by the problem are simple, yet fundamental. How did civilisation come about? What animus impelled man to forego the independence, intimacies, and invariability of tribal existence for the much larger and more impersonal political complexity we call the state? What forces fused to initiate the mutation that slowly transformed nomadic societies into populous cities with ethnic mixtures, stratified societies, diversified economies and unique cultural forms? Was the advent of civilisation the inevitable result of social evolution and natural laws of progress or was man the designer of his own destiny? Have technological innovations been the motivating force or was it some intangible factor such as religion or intellectual advancement?’

To a very good approximation, every civilisation that came into being had cereal agriculture as its subsistence base, and wherever cereals were cultivated, civilisation appeared. Some hypotheses have linked the two. For example, Wittfogel’s (1957) ‘hydraulic theory’ postulated that irrigation was needed for agriculture, and the state was in turn needed to organise irrigation. But not all civilisations used irrigation, and other possible factors (e.g. river valley placement, warfare, trade, technology, religion, and ecological and population pressure) have not led to a universally accepted model.

Pharmacological properties of cereals and milk

Recent research into the pharmacology of food presents a new perspective on these problems.

Exorphins: opioid substances in food

Prompted by a possible link between diet and mental illness, several researchers in the late 1970s began investigating the occurrence of drug-like substances in some common foodstuffs.

Dohan (1966, 1984) and Dohan et al. (1973, 1983) found that symptoms of schizophrenia were relieved somewhat when patients were fed a diet free of cereals and milk. Dohan also found that people with coeliac disease — those who are unable to eat wheat gluten because of higher than normal permeability of the gut — were statistically likely to suffer also from schizophrenia. Research in some Pacific communities showed that schizophrenia became prevalent in these populations only after they became ‘partially westernised and consumed wheat, barley beer, and rice’ (Dohan 1984).

Groups led by Zioudrou (1979) and Brantl (1979) found opioid activity in wheat, maize and barley (exorphins), and bovine and human milk (casomorphin), as well as stimulatory activity in these proteins, and in oats, rye and soy. Cereal exorphin is much stronger than bovine casomorphin, which in turn is stronger than human casomorphin. Mycroft et al. (1982, 1987) found an analogue of MIF-1, a naturally occurring dopaminergic peptide, in wheat and milk. It occurs in no other exogenous protein. (In subsequent sections we use the term exorphin to cover exorphins, casomorphin, and the MIF-1 analogue. Though opioid and dopaminergic substances work in different ways, they are both ‘rewarding’, and thus more or less equivalent for our purposes.)

Since then, researchers have measured the potency of exorphins, showing them to be comparable to morphine and enkephalin (Heubner et al. 1984), determined their amino acid sequences (Fukudome & Yoshikawa 1992), and shown that they are absorbed from the intestine (Svedburg et al. 1985) and can produce effects such as analgesia and reduction of anxiety which are usually associated with poppy-derived opioids (Greksch et al. 1981, Panksepp et al. 1984). Mycroft et al. estimated that 150 mg of the MIF-1 analogue could be produced by normal daily intake of cereals and milk, noting that such quantities are orally active, and half this amount ‘has induced mood alterations in clinically depressed subjects’ (Mycroft et al. 1982:895). (For detailed reviews see Gardner 1985 and Paroli 1988.)

Most common drugs of addiction are either opioid (e.g. heroin and morphine) or dopaminergic (e.g. cocaine and amphetamine), and work by activating reward centres in the brain. Hence we may ask, do these findings mean that cereals and milk are chemically rewarding? Are humans somehow ‘addicted’ to these foods?

Problems in interpreting these findings

Discussion of the possible behavioural effects of exorphins, in normal dietary amounts, has been cautious. Interpretations of their significance have been of two types:

where a pathological effect is proposed (usually by cereal researchers, and related to Dohan’s findings, though see also Ramabadran & Bansinath 1988), and

where a natural function is proposed (by milk researchers, who suggest that casomorphin may help in mother-infant bonding or otherwise regulate infant development).

We believe that there can be no natural function for ingestion of exorphins by adult humans. It may be that a desire to find a natural function has impeded interpretation (as well as causing attention to focus on milk, where a natural function is more plausible). It is unlikely that humans are adapted to a large intake of cereal exorphin, because the modern dominance of cereals in the diet is simply too new. If exorphin is found in cow’s milk, then it may have a natural function for cows; similarly, exorphins in human milk may have a function for infants. But whether this is so or not, adult humans do not naturally drink milk of any kind, so any natural function could not apply to them.

Our sympathies therefore lie with the pathological interpretation of exorphins, whereby substances found in cereals and milk are seen as modern dietary abnormalities which may cause schizophrenia, coeliac disease or whatever. But these are serious diseases found in a minority. Can exorphins be having an effect on humankind at large?

Other evidence for ‘drug-like’ effects of these foods

Research into food allergy has shown that normal quantities of some foods can have pharmacological, including behavioural, effects. Many people develop intolerances to particular foods. Various foods are implicated, and a variety of symptoms is produced. The term ‘intolerance’ rather than ‘allergy’ is often used, as in many cases the immune system may not be involved (Egger 1988:159). Some intolerance symptoms, such as anxiety, depression, epilepsy, hyperactivity, and schizophrenic episodes, involve brain function (Egger 1988, Scadding & Brostoff 1988).

Radcliffe (1982, quoted in 1987:808) listed the foods at fault, in descending order of frequency, in a trial involving 50 people: wheat (more than 70 per cent of subjects reacted in some way to it), milk (60 per cent), egg (35 per cent), corn, cheese, potato, coffee, rice, yeast, chocolate, tea, citrus, oats, pork, plaice, cane, and beef (10 per cent). This is virtually a list of foods that have become common in the diet following the adoption of agriculture, in order of prevalence. The symptoms most commonly alleviated by treatment were mood change (>50 per cent) followed by headache, musculoskeletal and respiratory ailments.

One of the most striking phenomena in these studies is that patients often exhibit cravings, addiction and withdrawal symptoms with regard to these foods (Egger 1988:170, citing Randolph 1978; see also Radcliffe 1987:808-10, 814, Kroker 1987:856, 864, Sprague & Milam 1987:949, 953, Wraith 1987:489, 491). Brostoff and Gamlin (1989:103) estimated that 50 per cent of intolerance patients crave the foods that cause them problems, and experience withdrawal symptoms when excluding those foods from their diet. Withdrawal symptoms are similar to those associated with drug addictions (Radcliffe 1987:808). The possibility that exorphins are involved has been noted (Bell 1987:715), and Brostoff and Gamlin conclude (1989:230):

‘… the results so far suggest that they might influence our mood. There is certainly no question of anyone getting ‘high’ on a glass of milk or a slice of bread – the amounts involved are too small for that – but these foods might induce a sense of comfort and wellbeing, as food-intolerant patients often say they do. There are also other hormone-like peptides in partial digests of food, which might have other effects on the body.’

There is no possibility that craving these foods has anything to do with the popular notion of the body telling the brain what it needs for nutritional purposes. These foods were not significant in the human diet before agriculture, and large quantities of them cannot be necessary for nutrition. In fact, the standard way to treat food intolerance is to remove the offending items from the patient’s diet.

A suggested interpretation of exorphin research

But what are the effects of these foods on normal people? Though exorphins cannot have a naturally selected physiological function in humans, this does not mean that they have no effect. Food intolerance research suggests that cereals and milk, in normal dietary quantities, are capable of affecting behaviour in many people. And if severe behavioural effects in schizophrenics and coeliacs can be caused by higher than normal absorption of peptides, then more subtle effects, which may not even be regarded as abnormal, could be produced in people generally.

The evidence presented so far suggests the following interpretation.

The ingestion of cereals and milk, in normal modern dietary amounts by normal humans, activates reward centres in the brain. Foods that were common in the diet before agriculture (fruits and so on) do not have this pharmacological property. The effects of exorphins are qualitatively the same as those produced by other opioid and/or dopaminergic drugs, that is, reward, motivation, reduction of anxiety, a sense of wellbeing, and perhaps even addiction. Though the effects of a typical meal are quantitatively less than those of doses of those drugs, most modern humans experience them several times a day, every day of their adult lives.

Hypothesis: exorphins and the origin of agriculture and civilisation

When this scenario of human dietary practices is viewed in the light of the problem of the origin of agriculture described earlier, it suggests an hypothesis that combines the results of these lines of enquiry.

Exorphin researchers, perhaps lacking a long-term historical perspective, have generally not investigated the possibility that these foods really are drug-like, and have instead searched without success for exorphin’s natural function. The adoption of cereal agriculture and the subsequent rise of civilisation have not been satisfactorily explained, because the behavioural changes underlying them have no obvious adaptive basis.

These unsolved and until-now unrelated problems may in fact solve each other. The answer, we suggest, is this: cereals and dairy foods are not natural human foods, but rather are preferred because they contain exorphins. This chemical reward was the incentive for the adoption of cereal agriculture in the Neolithic. Regular self-administration of these substances facilitated the behavioural changes that led to the subsequent appearance of civilisation.

This is the sequence of events that we envisage.

Climatic change at the end of the last glacial period led to an increase in the size and concentration of patches of wild cereals in certain areas (Wright 1977). The large quantities of cereals newly available provided an incentive to try to make a meal of them. People who succeeded in eating sizeable amounts of cereal seeds discovered the rewarding properties of the exorphins contained in them. Processing methods such as grinding and cooking were developed to make cereals more edible. The more palatable they could be made, the more they were consumed, and the more important the exorphin reward became for more people.

At first, patches of wild cereals were protected and harvested. Later, land was cleared and seeds were planted and tended, to increase quantity and reliability of supply. Exorphins attracted people to settle around cereal patches, abandoning their nomadic lifestyle, and allowed them to display tolerance instead of aggression as population densities rose in these new conditions.

Though it was, we suggest, the presence of exorphins that caused cereals (and not an alternative already prevalent in the diet) to be the major early cultigens, this does not mean that cereals are ‘just drugs’. They have been staples for thousands of years, and clearly have nutritional value. However, treating cereals as ‘just food’ leads to difficulties in explaining why anyone bothered to cultivate them. The fact that overall health declined when they were incorporated into the diet suggests that their rapid, almost total replacement of other foods was due more to chemical reward than to nutritional reasons.

It is noteworthy that the extent to which early groups became civilised correlates with the type of agriculture they practised. That is, major civilisations (in south-west Asia, Europe, India, and east and parts of South-East Asia; central and parts of north and south America; Egypt, Ethiopia and parts of tropical and west Africa) stemmed from groups which practised cereal, particularly wheat, agriculture (Bender 1975:12, Adams 1987:201, Thatcher 1987:212). (The rarer nomadic civilisations were based on dairy farming.)

Groups which practised vegeculture (of fruits, tubers etc.), or no agriculture (in tropical and south Africa, north and central Asia, Australia, New Guinea and the Pacific, and much of north and south America) did not become civilised to the same extent.

Thus major civilisations have in common that their populations were frequent ingesters of exorphins. We propose that large, hierarchical states were a natural consequence among such populations. Civilisation arose because reliable, on-demand availability of dietary opioids to individuals changed their behaviour, reducing aggression, and allowed them to become tolerant of sedentary life in crowded groups, to perform regular work, and to be more easily subjugated by rulers. Two socioeconomic classes emerged where before there had been only one (Johnson & Earle 1987:270), thus establishing a pattern which has been prevalent since that time.


The natural diet and genetic change

Some nutritionists deny the notion of a pre-agricultural natural human diet on the basis that humans are omnivorous, or have adapted to agricultural foods (e.g. Garn & Leonard 1989; for the contrary view see for example Eaton & Konner 1985). An omnivore, however, is simply an animal that eats both meat and plants: it can still be quite specialised in its preferences (chimpanzees are an appropriate example). A degree of omnivory in early humans might have preadapted them to some of the nutrients contained in cereals, but not to exorphins, which are unique to cereals.

The differential rates of lactase deficiency, coeliac disease and favism (the inability to metabolise fava beans) among modern racial groups are usually explained as the result of varying genetic adaptation to post-agricultural diets (Simopoulos 1990:27-9), and this could be thought of as implying some adaptation to exorphins as well. We argue that little or no such adaptation has occurred, for two reasons: first, allergy research indicates that these foods still cause abnormal reactions in many people, and that susceptibility is variable within as well as between populations, indicating that differential adaptation is not the only factor involved. Second, the function of the adaptations mentioned is to enable humans to digest those foods, and if they are adaptations, they arose because they conferred a survival advantage. But would susceptibility to the rewarding effects of exorphins lead to lower, or higher, reproductive success? One would expect in general that an animal with a supply of drugs would behave less adaptively and so lower its chances of survival. But our model shows how the widespread exorphin ingestion in humans has led to increased population. And once civilisation was the norm, non-susceptibility to exorphins would have meant not fitting in with society. Thus, though there may be adaptation to the nutritional content of cereals, there will be little or none to exorphins. In any case, while contemporary humans may enjoy the benefits of some adaptation to agricultural diets, those who actually made the change ten thousand years ago did not.

Other ‘non-nutritional’ origins of agriculture models

We are not the first to suggest a non-nutritional motive for early agriculture. Hayden (1990) argued that early cultigens and trade items had more prestige value than utility, and suggested that agriculture began because the powerful used its products for competitive feasting and accrual of wealth. Braidwood et al. (1953) and later Katz and Voigt (1986) suggested that the incentive for cereal cultivation was the production of alcoholic beer:

‘Under what conditions would the consumption of a wild plant resource be sufficiently important to lead to a change in behaviour (experiments with cultivation) in order to ensure an adequate supply of this resource? If wild cereals were in fact a minor part of the diet, any argument based on caloric need is weakened. It is our contention that the desire for alcohol would constitute a perceived psychological and social need that might easily prompt changes in subsistence behaviour’ (Katz & Voigt 1986:33).

This view is clearly compatible with ours. However there may be problems with an alcohol hypothesis: beer may have appeared after bread and other cereal products, and been consumed less widely or less frequently (Braidwood et al. 1953). Unlike alcohol, exorphins are present in all these products. This makes the case for chemical reward as the motive for agriculture much stronger. Opium poppies, too, were an early cultigen (Zohari 1986). Exorphin, alcohol, and opium are primarily rewarding (as opposed to the typically hallucinogenic drugs used by some hunter-gatherers) and it is the artificial reward which is necessary, we claim, for civilisation. Perhaps all three were instrumental in causing civilised behaviour to emerge.

Cereals have important qualities that differentiate them from most other drugs. They are a food source as well as a drug, and can be stored and transported easily. They are ingested in frequent small doses (not occasional large ones), and do not impede work performance in most people. A desire for the drug, even cravings or withdrawal, can be confused with hunger. These features make cereals the ideal facilitator of civilisation (and may also have contributed to the long delay in recognising their pharmacological properties).

Compatibility, limitations, more data needed

Our hypothesis is not a refutation of existing accounts of the origins of agriculture, but rather fits alongside them, explaining why cereal agriculture was adopted despite its apparent disadvantages and how it led to civilisation.

Gaps in our knowledge of exorphins limit the generality and strength of our claims. We do not know whether rice, millet and sorghum, or the grass species which were harvested by African and Australian hunter-gatherers, contain exorphins. We need to be sure that preagricultural staples do not contain exorphins in amounts similar to those in cereals. We do not know whether domestication has affected exorphin content or potency. A test of our hypothesis by correlation of diet and degree of civilisation in different populations will require quantitative knowledge of the behavioural effects of all these foods.

We do not comment on the origin of noncereal agriculture, nor why some groups used a combination of foraging and farming, reverted from farming to foraging, or did not farm at all. Cereal agriculture and civilisation have, during the past ten thousand years, become virtually universal. The question, then, is not why they happened here and not there, but why they took longer to become established in some places than in others. At all times and places, chemical reward and the influence of civilisations already using cereals weighed in favour of adopting this lifestyle, the disadvantages of agriculture weighed against it, and factors such as climate, geography, soil quality, and availability of cultigens influenced the outcome. There is a recent trend to multi-causal models of the origins of agriculture (e.g. Redding 1988, Henry 1989), and exorphins can be thought of as simply another factor in the list. Analysis of the relative importance of all the factors involved, at all times and places, is beyond the scope of this paper.

Conclusion


‘An animal is a survival machine for the genes that built it. We too are animals, and we too are survival machines for our genes. That is the theory. In practice it makes a lot of sense when we look at wild animals…. It is very different when we look at ourselves. We appear to be a serious exception to the Darwinian law…. It obviously just isn’t true that most of us spend our time working energetically for the preservation of our genes’ (Dawkins 1989:138).

Many ethologists have acknowledged difficulties in explaining civilised human behaviour on evolutionary grounds, in some cases suggesting that modern humans do not always behave adaptively. Yet since agriculture began, the human population has risen by a factor of 1000: Irons (1990) notes that ‘population growth is not the expected effect of maladaptive behaviour’.

We have reviewed evidence from several areas of research which shows that cereals and dairy foods have drug-like properties, and shown how these properties may have been the incentive for the initial adoption of agriculture. We suggested further that constant exorphin intake facilitated the behavioural changes and subsequent population growth of civilisation, by increasing people’s tolerance of (a) living in crowded sedentary conditions, (b) devoting effort to the benefit of non-kin, and (c) playing a subservient role in a vast hierarchical social structure.

Cereals are still staples, and methods of artificial reward have diversified since that time, including today a wide range of pharmacological and non-pharmacological cultural artifacts whose function, ethologically speaking, is to provide reward without adaptive benefit. It seems reasonable then to suggest that civilisation not only arose out of self-administration of artificial reward, but is maintained in this way among contemporary humans. Hence a step towards resolution of the problem of explaining civilised human behaviour may be to incorporate into ethological models this widespread distortion of behaviour by artificial reward.

References


Adams, W. M., 1987, Cereals before cities except after Jacobs, in M. Melko & L. R. Scott, eds, The boundaries of civilizations in space and time, University Press of America, Lanham.

Bell, I. R., 1987, Effects of food allergy on the central nervous system, in J. Brostoff and S. J. Challacombe, eds, Food allergy and intolerance, Bailliere Tindall, London.

Bender, B., 1975, Farming in prehistory: from hunter-gatherer to food producer, John Baker, London.

Bledsoe, W., 1987, Theories of the origins of civilization, in M. Melko and L. R. Scott, eds, The boundaries of civilizations in space and time, University Press of America, Lanham.

Blumler, M., & Byrne, R., 1991, The ecological genetics of domestication and the origins of agriculture, Current Anthropology 32: 2-35.

Braidwood, R. J., Sauer, J.D., Helbaek, H., Mangelsdorf, P.C., Cutler, H.C., Coon, C.S., Linton, R., Steward J. & Oppenheim, A.L., 1953, Symposium: did man once live by beer alone? American Anthropologist 55: 515-26.

Brantl, V., Teschemacher, H., Henschen, A. & Lottspeich, F., 1979, Novel opioid peptides derived from casein (beta-casomorphins), Hoppe-Seyler’s Zeitschrift für Physiologische Chemie 360:1211-6.

Brostoff, J., & Gamlin, L., 1989, The complete guide to food allergy and intolerance, Bloomsbury, London.

Chang, T. T., 1989, Domestication and the spread of the cultivated rices, in D.R. Harris and G.C. Hillman, eds, Foraging and farming: the evolution of plant exploitation, Unwin Hyman, London.

Claessen, H. J. M. & Skalnik P., eds, 1978, The early state, Mouton, The Hague.

Cohen, M. N., 1977, Population pressure and the origins of agriculture: an archaeological example from the coast of Peru, in Reed, C.A., ed., The origins of agriculture, Mouton, The Hague.

Cohen, M. N., 1989, Health and the rise of civilization, Yale University Press, New Haven.

Constantini, L., 1989, Plant exploitation at Grotta dell’Uzzo, Sicily: new evidence for the transition from Mesolithic to Neolithic subsistence in southern Europe, in Harris, D. R. & Hillman, G. C., eds, Foraging and farming: the evolution of plant exploitation, Unwin Hyman, London.

Dawkins, R., 1989, Darwinism and human purpose, in Durant, J. R., ed., Human origins, Clarendon Press, Oxford.

Dohan, F., 1966, Cereals and schizophrenia: data and hypothesis, Acta Psychiatrica Scandinavica 42:125-52.

Dohan, F., 1983, More on coeliac disease as a model of schizophrenia, Biological Psychiatry 18:561-4.

Dohan, F. & Grasberger, J., 1973, Relapsed schizophrenics: earlier discharge from the hospital after cereal-free, milk-free diet, American Journal of Psychiatry 130:685-8.

Dohan, F., Harper, E., Clark, M., Ratigue, R., & Zigos, V., 1984, Is schizophrenia rare if grain is rare? Biological Psychiatry 19: 385-99.

Eaton, S. B. & Konner, M., 1985, Paleolithic nutrition – a consideration of its nature and current implications, New England Journal of Medicine 312: 283-90.

Egger, J., 1988, Food allergy and the central nervous system, in Reinhardt, D. & Schmidt E., eds, Food allergy, Raven, New York.

Flannery, K. V., 1973, The origins of agriculture, Annual Review of Anthropology 2:271-310.

Fukudome, S., & Yoshikawa, M., 1992, Opioid peptides derived from wheat gluten: their isolation and characterization, FEBS Letters 296:107-11.

Gardner, M. L. G., 1985, Production of pharmacologically active peptides from foods in the gut. in Hunter, J. & Alun-Jones, V., eds, Food and the gut, Bailliere Tindall, London.

Garn, S. M. & Leonard, W. R., 1989, What did our ancestors eat? Nutrition Reviews 47:337-45.

Gordon, K. D., 1987, Evolutionary perspectives on human diet, in Johnston, F., ed, Nutritional Anthropology, Alan R. Liss, New York.

Greksch, G., Schweiger, C. & Matthies, H., 1981, Evidence for analgesic activity of beta-casomorphin in rats, Neuroscience Letters 27:325-8.

Harlan, J. R., 1986, Plant domestication: diffuse origins and diffusion, in Barigozzi, G., ed., The origin and domestication of cultivated plants, Elsevier, Amsterdam.

Harris, D. R., 1977, Alternative pathways towards agriculture, in Reed, C. A., ed., The origins of agriculture, Mouton, The Hague.

Harris, D. R. & Hillman, G. C., eds, 1989, Foraging and farming: the evolution of plant exploitation, Unwin Hyman, London.

Hayden, B., 1990, Nimrods, piscators, pluckers, and planters: the emergence of food production, Journal of Anthropological Archaeology 9:31-69.

Henry, D. O., 1989, From foraging to agriculture: the Levant at the end of the ice age, University of Pennsylvania Press, Philadelphia.

Heubner, F., Lieberman, K., Rubino, R. & Wall, J., 1984, Demonstration of high opioid-like activity in isolated peptides from wheat gluten hydrolysates, Peptides 5:1139-47.

Irons, W., 1990, Let’s make our perspective broader rather than narrower, Ethology and Sociobiology 11:361-74.

Johnson, A. W. & Earle, T., 1987, The evolution of human societies: from foraging group to agrarian state, Stanford University Press, Stanford.

Katz, S. H. & Voigt, M. M., 1986, Bread and beer: the early use of cereals in the human diet, Expedition 28:23-34.

Kay, R. F., 1985, Dental evidence for the diet of Australopithecus, Annual Review of Anthropology 14:315-41.

Kroker, G. F., 1987, Chronic candidiasis and allergy, in Brostoff, J. & Challacombe, S. J., eds, Food allergy and intolerance, Bailliere Tindall, London.

Lee, R. B. & DeVore, I., 1968, Problems in the study of hunters and gatherers, in Lee, R.B. & DeVore, I., eds, Man the hunter, Aldine, Chicago.

Mycroft, F. J., Wei, E. T., Bernardin, J. E. & Kasarda, D. D., 1982, MIF-like sequences in milk and wheat proteins, New England Journal of Medicine 301:895.

Mycroft, F. J., Bhargava, H. N. & Wei, E. T., 1987, Pharmacological activities of the MIF-1 analogues Pro-Leu-Gly, Tyr-Pro-Leu-Gly and pareptide, Peptides 8:1051-5.

Panksepp, J., Normansell, L., Siviy, S., Rossi, J. & Zolovick, A., 1984, Casomorphins reduce separation distress in chicks, Peptides 5:829-31.

Paroli, E., 1988, Opioid peptides from food (the exorphins), World review of nutrition and dietetics 55:58-97.

Pedersen, B., Knudsen, K. E. B. & Eggum, B. O., 1989, Nutritive value of cereal products with emphasis on the effect of milling, World review of nutrition and dietetics 60:1-91.

Peters, C. R. & O’Brien, E. M., 1981, The early hominid plant-food niche: insights from an analysis of plant exploitation by Homo, Pan, and Papio in eastern and southern Africa, Current Anthropology 22:127-40.

Peuch, P., Albertini, H. & Serratrice, C., 1983, Tooth microwear and dietary patterns in early hominids from Laetoli, Hadar, and Olduvai, Journal of Human Evolution 12:721-9.

Pfeiffer, J. E., 1977, The emergence of society: a prehistory of the establishment, McGraw Hill, New York.

Pryor, F. L., 1986, The adoption of agriculture: some theoretical and empirical evidence, American Anthropologist 88:879-97.

Radcliffe, M. J., 1987, Diagnostic use of dietary regimes, in Brostoff, J. & Challacombe, S. J., eds, Food allergy and intolerance, Bailliere Tindall, London.

Ramabadran, K. & Bansinath, M., 1988, Opioid peptides from milk as a possible cause of Sudden Infant Death Syndrome, Medical Hypotheses 27:181-7.

Randolph, T. G., 1978, Specific adaptation, Annals of Allergy 40:333-45.

Redding, R., 1988, A general explanation of subsistence change from hunting and gathering to food production, Journal of Anthropological Archaeology 7:56-97.

Reed, C. A., ed., 1977, The origins of agriculture, Mouton, The Hague.

Rindos, D., 1984, The origins of agriculture: an evolutionary perspective, Academic Press, Orlando.

Scadding, G. K. & Brostoff, J., 1988, The dietic treatment of food allergy, in Reinhardt, D. & Schmidt, E., eds, Food allergy, Raven, New York.

Simopoulos, A. P., 1990, Genetics and nutrition: or what your genes can tell you about nutrition, World review of nutrition and dietetics 63:25-34.

Sprague, D. E. & Milam, M. J., 1987, Concept of an environmental unit, in Brostoff, J. & Challacombe, S. J., eds, Food allergy and intolerance, Bailliere Tindall, London.

Stark, B. L., 1986, Origins of food production in the New World, in Meltzer, D. J., Fowler, D. D. & Sabloff, J. A., eds, American archaeology past and future, Smithsonian Institution Press, Washington.

Susman, R. L., 1987, Pygmy chimpanzees and common chimpanzees: models for the behavioural ecology of the earliest hominids, in Kinzey, W. G., ed., The evolution of human behaviour: primate models, State University of New York Press, Albany.

Svedberg, J., De Haas, J., Leimenstoll, G., Paul, F. & Teschemacher, H., 1985, Demonstration of beta-casomorphin immunoreactive materials in in-vitro digests of bovine milk and in small intestine contents after bovine milk ingestion in adult humans, Peptides 6:825-30.

Thatcher, J. P., 1987, The economic base for civilization in the New World, in Melko, M. & Scott, L. R., eds, The boundaries of civilizations in space and time, University Press of America, Lanham.

Walker, A., 1981, Dietary hypotheses and human evolution, Philosophical Transactions of the Royal Society of London B292:57-64.

Washburn, L. & Lancaster, C. S., 1968, The evolution of hunting, in Lee, R. B. & DeVore, I., eds, Man the hunter, Aldine, Chicago.

Wittfogel, K., 1957, Oriental Despotism, Yale University Press, New Haven.

Wraith, D. G., 1987, Asthma, in Brostoff, J. & Challacombe, S. J., eds, Food allergy and intolerance, Bailliere Tindall, London.

Wright, H. E., 1977, Environmental changes and the origin of agriculture in the Near East, in Reed, C. A., ed, The origins of agriculture, Mouton, The Hague.

Zioudrou, C., Streaty, R. & Klee, W., 1979, Opioid peptides derived from food proteins: the exorphins, Journal of Biological Chemistry 254:2446-9.

Zohari, D., 1986, The origin and early spread of agriculture in the Old World, in Barigozzi, G., ed., The origin and domestication of cultivated plants, Elsevier, Amsterdam.


Posted By Dr. Mercola | October 22, 2010

You’ve no doubt noticed that for about the last 60 years, the majority of health care officials and the media have been telling you that saturated fats are bad for your health and lead to a host of negative consequences, such as elevated cholesterol, obesity, heart disease and Alzheimer’s disease.

Meanwhile, over those same 60 years, American rates of heart disease, obesity, elevated serum cholesterol and Alzheimer’s have skyrocketed compared to our ancestors, and even compared to modern-day primitive societies that use saturated fat as a dietary staple.

Did you know that multiple studies on Pacific Island populations who get 30-60% of their total caloric intake from fully saturated coconut oil have all shown nearly non-existent rates of cardiovascular disease?[1]

Clearly, a lot of confusion and contradictory evidence exists on the subject of saturated fats, even among health care professionals.

But I’m going to tell you something that public health officials and the media aren’t telling you.

The fact is, not all saturated fats are created equal.

