Update on Vitamins A and D

Vitamin D Experts Defend Cod Liver Oil

In November of 2008, Dr. John Cannell of the Vitamin D Council published a commentary in the journal Annals of Otology, Rhinology & Laryngology attacking cod liver oil because of its high vitamin A content, claiming that vitamin A intakes above the most minimal levels would increase mortality rates, increase vulnerability to infections, cause osteoporosis, and antagonize the beneficial effects of vitamin D. Sixteen scientists signed on to the paper as co-authors. In response, Wise Traditions published my article, “The Cod Liver Oil Debate,” in the Spring 2009 issue, which defended cod liver oil as an important and balanced source of the fat-soluble vitamins and essential fatty acids. The following November, I expanded on this article in my lecture, “Cod Liver Oil: Our Number One Superfood,” at the Foundation’s annual conference.

We’re not the only ones who responded! In January 2010, Michael F. Holick, MD, PhD, a vitamin D researcher whose work I have cited in previous articles, Linda Linday, a medical doctor whose cod liver oil study formed the starting point for Cannell’s 2008 commentary, and several other colleagues, even including one researcher from the National Institutes of Health, made a direct response to Dr. Cannell and his colleagues in the pages of the same journal. What’s more, they even credited the Weston A. Price Foundation for raising concern about the balance between vitamins A and D!

“Cod liver oil,” they wrote, “available without a prescription for hundreds of years, is a valuable source of vitamins A and D, as well as long-chain omega-3 fatty acids, all of which may be important in the prevention of respiratory tract illnesses in children. In many populations around the world, cod liver oil continues to be a valuable source of these important nutrients. The across-the-board dismissal of cod liver oil as a supplement advocated by [Cannell and colleagues] ignores this reality.”

REDUCED RESPIRATORY INFECTIONS

The authors pointed out that in Dr. Linday’s randomized, controlled trials, cod liver oil supplementation cut doctor’s visits for upper respiratory infections by between one-third and one-half. Cannell’s paper called this “less than robust,” but most of us would consider such a reduction meaningful, especially if by taking cod liver oil we got sick less often! The authors, moreover, argued that retinol from animal foods is a more reliable source of vitamin A than carotenes from plant foods, since there is such wide variation in people’s ability to convert carotenes to vitamin A—an argument that has appeared in the pages of Wise Traditions many times in the past.

THE IDEAL RATIO

But now to the exciting part. The authors devoted a section of their paper to the ideal ratio of vitamin A to D. “In the responses to [Cannell and colleagues] from the on-line supplement and nutrition newsletter communities,” they wrote, “the issue of the proper ratio of vitamin A to vitamin D emerged as a major concern.” They gave three references, including one to the Weston A. Price Foundation’s “Cod Liver Oil Update” from December 2008. In fact, the importance of balance between vitamins A and D was raised in the pages of Wise Traditions even earlier than 2008.
In the spring of 2006, I discussed the issue in my article “Vitamin A on Trial: Does Vitamin A Cause Osteoporosis?” when I argued that vitamin A only contributes to osteoporosis when vitamin D levels are deficient or when the ratio of vitamin A to D is massively out of balance. The following fall, I raised the issue again in my article “From Seafood to Sunshine: A New Understanding of Vitamin D Toxicity,” wherein I presented research showing that vitamin A protects against vitamin D toxicity and introduced the possibility that vitamins A, D, and K2 may be cooperative factors that should all be consumed in proper balance. I more fully developed this concept in my spring 2007 article on vitamin K2, “On the Trail of the Elusive X-Factor: A Sixty-Two-Year-Old Mystery Finally Solved.” As a result of this research, in December of 2007, I published a hypothesis on the molecular mechanism of vitamin D toxicity in the journal Medical Hypotheses entitled “Vitamin D toxicity redefined: vitamin K and the molecular mechanism,” which emphasized interactions between vitamins A, D, and K2. The following year, researchers from Tufts University published a paper in the Journal of Nutrition supporting this hypothesis, showing that vitamin A protects against vitamin D toxicity in part by helping to properly regulate the production of vitamin K-dependent proteins.

One question I have never been able to answer in any of these articles is the one everyone wants an answer to: what, precisely, is the proper ratio of vitamins A and D? Dr. Linday and her colleagues offer a suggestion: poultry studies suggest optimal A-to-D ratios between four and eight. Similarly, in her own studies showing that cod liver oil protects against upper respiratory tract infections, Linday supplied her patients with A-to-D ratios between five and eight. They also point out that rat studies showing that vitamin A is toxic and antagonizes the effects of vitamin D used much higher ratios, ranging from 5,000 to 55,000! (A short worked example of this ratio arithmetic appears at the end of this article.)

It is refreshing to see a powerful defense of cod liver oil in the scientific literature, and especially refreshing to see the work of the Weston A. Price Foundation cited therein. We owe a big thank you to Dr. Linda Linday (MD) of St. Luke’s-Roosevelt Hospital Center in NY, NY, Dr. John C. Umhau (MD, MPH) of NIH in Bethesda, MD, Richard D. Shindledecker of New York Downtown Hospital in NY, NY, Dr. Jay N. Dolitsky (MD) of New York Eye and Ear Infirmary in NY, NY and Michael F. Holick (PhD, MD) of Boston University Medical Center in Boston, MA for helping to sort out these important questions about the fat-soluble vitamins.

OPTIMAL VITAMIN D LEVELS

Are some people pushing their vitamin D levels too high? Has science proven that the minimal acceptable blood level of vitamin D, in the form of 25(OH)D, is above 50 ng/mL (125 nmol/L)? The answer is “No.” If you’ve been trying to maintain your levels this high because you thought this was the case, I’m sorry to break the news. There is, on the contrary, good evidence that 25(OH)D levels should be at least 30-35 ng/mL (75-88 nmol/L). Much higher levels may be better, or they could start causing harm, especially in the absence of adequate vitamins A and K2. Once we leave the land of 30-35 ng/mL, however, we enter the land of speculation.

The idea that science has proven we need to maintain 50 ng/mL as a minimum comes from Dr. John Cannell of the Vitamin D Council.
In his article “Am I Vitamin D Deficient?” he writes the following:

“Thanks to Bruce Hollis, Robert Heaney, Neil Binkley, and others, we now know the minimal acceptable level. It is 50 ng/ml (125 nmol/L). In a recent study, Heaney, et al expanded on Bruce Hollis’s seminal work by analyzing five studies in which both the parent compound (cholecalciferol) and 25(OH)D levels were measured. They found that the body does not reliably begin storing cholecalciferol in fat and muscle tissue until 25(OH)D levels get above 50 ng/ml (125 nmol/L). The average person starts to store cholecalciferol at 40 ng/ml (100 nmol/L), but at 50 ng/ml (125 nmol/L) virtually everyone begins to store it for future use. That is, at levels below 50 ng/ml (125 nmol/L), the body uses up vitamin D as fast as you can make it, or take it, indicating chronic substrate starvation—not a good thing. 25(OH)D levels should be between 50–80 ng/ml (125–200 nmol/L), year-round.”

DIFFERENT CONCLUSIONS

There are a few problems with this argument. To begin with, Drs. Hollis, Heaney, Binkley, and the other authors of this study rightly drew very different conclusions from their own data. In the report they wrote for the American Journal of Clinical Nutrition, they wrote the following:

“One could plausibly postulate that the point at which hepatic 25(OH)D production becomes zero-order [this is the point at which the enzymes converting vitamin D to 25(OH)D are saturated with vitamin D] constitutes the definition of the low end of normal status. This value, as suggested in an equation shown in the article, is at a serum 25(OH)D concentration of 88 nmol/L (35.2 ng/mL). It is interesting that this estimate is very close to that produced by previous attempts to define the lower end of the normal range from the relations of serum 25(OH)D to calcium absorption and to serum parathyroid hormone concentration (ie, 75–85 nmol/L, or 30–34 ng/mL).”

According to the authors of this study, then, the point at which the vitamin D enzymes are saturated and vitamin D “accumulates within the body, both in serum and probably in body fat” is not 40 or 50 ng/mL (100 or 125 nmol/L) but rather 35 ng/mL (88 nmol/L).

The authors used a statistical approach that pooled together data from several studies. They presented most of their data in Figure 4, and the data from one other study in Figure 5 (see below). They did not determine the point at which vitamin D starts getting stored in body fat in particular individuals. On the contrary, they used a statistical approach to infer the point at which this occurs in their entire study population. Now, if you compare Figures 4 and 5, looking for the point at which the slope of the line dramatically changes, you will see that it changes at a higher level of 25(OH)D in Figure 5. Dr. Cannell seems to have used the data from Figure 5 to say when vitamin D gets stored in body fat in “virtually everyone” as opposed to “the average person,” but in fact the authors stated that they did not use the data from Figure 5 to determine this point because a different and apparently inferior method of measuring vitamin D levels was used in that data set. So, we are back to the authors’ original conclusions, that vitamin D saturates its activation enzymes and starts getting stored in body fat when 25(OH)D levels reach 35 ng/mL (88 nmol/L).

The second problem is that this study does not “prove” or “show” or “demonstrate” what the optimal or minimal blood level of vitamin D is.
The authors state that one could plausibly postulate that the minimum acceptable blood level is the point at which the enzymes are saturated and vitamin D is stored in body fat, but they never state that “we now know the minimal acceptable level.”

The most definitive way to determine the ideal 25(OH)D level would be to conduct a randomized, controlled trial with different levels of vitamin D supplementation targeted at reaching specific blood levels of 25(OH)D and to test the effects of the different levels of supplementation on clinical outcomes, such as bone mineral density, fracture rate, insulin resistance, glucose tolerance, cancer or heart disease. We do not yet have this type of data.

We do, however, have some strong support for raising 25(OH)D levels to at least 35 ng/mL (88 nmol/L). For example, as the authors of the study we have been looking at pointed out, similar attempts to use statistical approaches to define the 25(OH)D level that maximizes calcium absorption, maximally suppresses parathyroid hormone (which leaches calcium from bone), or maximizes bone mineral density have suggested similar results. A recent randomized, placebo-controlled trial showed that supplementing insulin-resistant women with 4,000 IU of vitamin D per day for six months reduced insulin resistance and had the most powerful effect in women whose 25(OH)D level was raised to over 32 ng/mL (80 nmol/L).

POSSIBLE HARM

What about higher levels? The evidence is conflicting, and some of it indicates possible harm. For example, a study in the American Journal of Medicine published in 2004 found that in Americans aged over fifty, the maximal bone mineral density (BMD) occurs around 32-40 ng/mL (80-100 nmol/L). Among Mexican Americans, BMD continues to rise a little after this point, but for whites it plateaus and begins dropping off around 45 ng/mL (110 nmol/L), and for blacks it begins dropping off even before 40 ng/mL (100 nmol/L). If 50 ng/mL (125 nmol/L) is our minimal acceptable level, this study would seem to suggest that those of us who have “acceptable” levels of 25(OH)D would have lower bone mineral density than those of us who are moderately deficient. And that premise just doesn’t make sense.

Another study, published in the European Journal of Epidemiology in 2001, found that South Indians with 25(OH)D levels higher than 89 ng/mL (223 nmol/L) were three times more likely to have suffered from ischemic heart disease than those with lower levels—and of course with such a dramatic elevation of heart disease risk, the risk may have begun increasing at levels substantially lower than 89 ng/mL.

Neither of these studies was designed to show that high levels of 25(OH)D cause decreases in bone mineral density or increases in heart disease risk, but it is possible. As I especially emphasized in my Wise Traditions and Medical Hypotheses articles on vitamin K2, bone resorption and blood vessel calcification are prominent symptoms of vitamin D toxicity in animal experiments. I also emphasized the role of vitamins A and K2 in protecting against vitamin D toxicity. So, even if these levels are in fact harmful, they may only be harmful or may be primarily harmful in the absence of adequate vitamins A and K2. The presence of the other fat-soluble vitamins could even turn these levels from harmful to beneficial.
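Because 25(OH)D levels are quoted in both ng/mL and nmol/L throughout this discussion, the conversion between the two units is worth spelling out. The short sketch below illustrates only the arithmetic; the factor of roughly 2.5 (more precisely about 2.496 nmol/L per ng/mL) is the one conventionally used for 25-hydroxyvitamin D.

```python
# Illustrative sketch only: converting 25(OH)D values between ng/mL and nmol/L.
# One ng/mL of 25-hydroxyvitamin D is approximately 2.496 nmol/L (often rounded to 2.5).

NMOL_PER_NG = 2.496  # nmol/L per ng/mL for 25(OH)D


def ng_to_nmol(ng_per_ml: float) -> float:
    """Convert a 25(OH)D level from ng/mL to nmol/L."""
    return ng_per_ml * NMOL_PER_NG


def nmol_to_ng(nmol_per_l: float) -> float:
    """Convert a 25(OH)D level from nmol/L to ng/mL."""
    return nmol_per_l / NMOL_PER_NG


if __name__ == "__main__":
    # Levels discussed above: 30 ng/mL (lower end of the supported range),
    # 35.2 ng/mL (the 88 nmol/L saturation point), 50 ng/mL (Cannell's proposed
    # minimum) and 80 ng/mL (the top of his proposed range).
    for ng in (30.0, 35.2, 50.0, 80.0):
        print(f"{ng:g} ng/mL is about {ng_to_nmol(ng):.0f} nmol/L")
```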
STILL NEEDED

Nevertheless, what we need in order to show that levels higher than 50 ng/mL are helpful or harmful are vitamin D supplementation trials comparing the effect of different doses resulting in different blood levels on clinical health outcomes, and similar studies examining the interactions between vitamin D and the other fat-soluble vitamins.

Lifeguards in the tropics can reach blood levels in the 50s and 60s naturally from sun exposure, suggesting these levels are “natural,” although lifeguards in Israel have twenty times the rate of kidney stones as the general population. Kidney stones may be the most sensitive indicator of vitamin D toxicity and are a symptom of vitamin A and K2 deficiency. Thus, I suspect these levels are healthful in the context of a diet rich in vitamins A and K2, and if my levels were to reach this high in the summer sun while I was eating such a diet, I certainly would not worry. But if you are trying desperately to maintain year-round 25(OH)D status between 50-80 ng/mL using vitamin D supplements, you have entered the land of speculation. Enter at your own risk.

SIDEBARS

THE FAT-SOLUBLE ACTIVATORS

The key finding of Dr. Weston Price was very high levels of “fat-soluble activators” in traditional diets. No matter what the particulars of the diet—whether in the frozen north, the Alpine highlands or the tropical South Seas—traditional peoples consumed plentiful amounts of vitamins A, D and what Dr. Price referred to as Activator X—now determined to be vitamin K2—from seafood, organ meats and the fat of grass-fed animals. It is difficult to obtain adequate amounts of these activators in Western diets, partly because government agencies have demonized the foods that contain these vitamins, and also because the industrialization of agriculture has taken most livestock off pasture.

Properly processed cod liver oil is an excellent source of vitamins A and D, and this is why we recommend it for Westerners, especially in preparation for conception, during pregnancy and lactation, and for growing children. Unfortunately, while the need for vitamin D has received considerable recognition in recent years, many researchers have spoken out against vitamin A and especially cod liver oil. Chief among the detractors are Dr. John Cannell of the Vitamin D Council, and Dr. Joseph Mercola of mercola.com. This article, which is necessarily technical in parts, serves as part of the ongoing debate on this subject. For background and more information, see www.westonaprice.org/cod-liver-oil/1622.html.

This article combines two recent blog postings by Chris Masterjohn. Visit his blog at www.westonaprice.org/blogs/.

FIGURE 4. Plot of the relation between serum concentrations of vitamin D3 and 25-hydroxyvitamin D after 18–20 weeks of treatment with various doses of vitamin D3. Triangles represent subjects from study B; circles, subjects from study C; squares, subjects from study F. The regression line is a least-squares fit of the data to a combination exponential and linear function.

FIGURE 5. Plot of the relation between serum vitamin D3 and 25-hydroxyvitamin D in study D only. As in Figure 4, the regression line is a least-squares fit of the data to a combination exponential and linear function.

From the American Journal of Clinical Nutrition, Vol. 87, No. 6, 1738-1742, June 2008. Used with permission.

This article appeared in Wise Traditions in Food, Farming and the Healing Arts, the quarterly magazine of the Weston A. Price Foundation, Summer 2010.
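As a worked illustration of the ratio arithmetic discussed in the article above, and nothing more, the sketch below checks a supplement label’s vitamin A and vitamin D contents against the roughly 4-to-8 range suggested by the poultry studies and used in Dr. Linday’s trials. The article does not specify the units of those ratios, so an IU-to-IU basis is assumed here, and the label values in the example are hypothetical placeholders rather than a product analysis or a recommendation.

```python
# Illustrative sketch only: comparing a label's A-to-D ratio (assumed IU basis)
# with the roughly 4-8 range cited above. All numbers below are hypothetical.

LOW, HIGH = 4.0, 8.0  # A-to-D range suggested by the poultry studies cited above


def a_to_d_ratio(vitamin_a_iu: float, vitamin_d_iu: float) -> float:
    """Return the vitamin A to vitamin D ratio on an IU basis."""
    if vitamin_d_iu <= 0:
        raise ValueError("vitamin D content must be a positive number of IU")
    return vitamin_a_iu / vitamin_d_iu


if __name__ == "__main__":
    examples = {
        "hypothetical label A (1,200 IU A / 200 IU D)": (1200.0, 200.0),
        "hypothetical label B (2,500 IU A / 250 IU D)": (2500.0, 250.0),
        "rat-study-style dosing (55,000 to 1)": (55000.0, 1.0),
    }
    for name, (a_iu, d_iu) in examples.items():
        ratio = a_to_d_ratio(a_iu, d_iu)
        verdict = "within" if LOW <= ratio <= HIGH else "outside"
        print(f"{name}: A:D = {ratio:,.1f} ({verdict} the {LOW:g}-{HIGH:g} range)")
```

Whether any particular ratio or dose is appropriate for a given person is, of course, a separate question that the arithmetic alone cannot answer.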
Christopher Masterjohn

31 Responses to Update on Vitamins A and D

Margeaux says: October 14, 2010 at 10:43 pm
Solgar Cod Liver Oil?
Hi, I’d like to know if the nutrient ratios of Solgar’s Norwegian Cod Liver Oil Softgels are acceptable. Each capsule contains 1250 IU vitamin A, 135 IU vitamin D, 28 mg EPA and 28 mg DHA. Thank you

Shary says: February 5, 2011 at 10:43 pm
Vitamin D
After suffering from low energy, musculoskeletal pain, poor balance, and depression, among other things, for a number of years, I began to feel better after taking large amounts of vitamin D. And I do mean significantly better, like the difference between night and day. Moreover, it only took a few weeks of supplementation at 8,000 IU per day of D3. I have dropped back to 5,000 IU a few times but find that I just feel better at 8,000 IU. I will get my blood level checked again soon, but I doubt that I’m anywhere near toxicity. I can only speak for myself, but vitamin D3 supplementation has literally given me back my life. Regarding cod liver oil, I would not take the stuff. I feel there’s a far greater chance of vitamin A toxicity than from vitamin D, and would feel safer eating the cod itself and forgetting about the oil from its liver.

Olga says: February 17, 2011 at 10:42 pm
I don’t understand how you can recommend taking cod liver oil during pregnancy when it is known that excessive amounts of vitamin A cause serious birth defects. Also, we have been supplementing with vitamin D3 this most cold and snowy winter (1,000 IU kids, 4,000 IU me and my husband) and there were only 2 minor colds. My kindergartener has not missed any school!

Whisperingsage says: September 22, 2014 at 4:11 pm
It doesn’t sound as if you have read the many other good articles on this site yet. Please do. This is the history of the vitamin A discoveries: http://www.westonaprice.org/health-topics/abcs-of-nutrition/vitamin-a-saga/ This is a good study on the importance of A and D together as it is found in nature: http://www.westonaprice.org/health-topics/abcs-of-nutrition/vitamin-a-on-trial-does-vitamin-a-cause-osteoporosis/ And this is the original study from the 1930s with the many tribes Price studied, their pics and their histories as recorded by Price: gutenberg.net.au/ebooks02/0200251h.html There is no need to fall for the fear of vitamin A. As shown in the articles posted above, when found in ratios with one another (in particular, 1000 IU of D for every 100,000 IU of A), both cancel out the other’s toxicity, and indeed promote beneficial values of both substances. A can be a phenomenal tool when used this way. For example, I lost a number of teeth because of retaking all my childhood shots to get into nursing school. This has caused me many years of health problems, not the least of which resulted from chronic diarrhea and tooth loss, yet I have been controlling pain and inflammation with high-dose vitamin C and magnesium. When these recently were not enough, I added 200,000 IU of vitamin A and the pain subsided. I was already on 5000 IU of D for years. And took daily doses of 25,000 IU of A.
I cannot take NSAIDs because of their propensity to cause me ulcers, or worsen one that was caused by ibuprofen, neither can I take narcotics, so upping the A was a wonderful relief. (I have been an unemployed licensed nurse for 5 years due to the Obama recession, believe it or not, so have not had the money to pay a dentist (non-toxic of course, look up the Smoking Teeth on Youtube) to get these worked on.)

Martin says: October 19, 2014 at 7:05 pm
“I lost a number of teeth because of retaking all my childhood shots to get into nursing school”
I don’t understand how taking shots caused you to lose teeth? Are you sure it wasn’t something else? There was a comment from a poster on another article (forget which one) who was talking about increased tooth sensitivity from increased vitamin A intake.

Laurel Blair, NTP says: May 30, 2011 at 10:42 pm
Reply to Olga
Hi Olga, There are a number of reasons why Chris Masterjohn and the WAPF recommend cod liver oil during pregnancy. For starters, Weston A. Price observed healthy people around the world consuming far more vitamin A than the 10,000 IU that is considered the safe limit, and these people showed no evidence of birth defects. The studies that claim to show increased birth defects from higher doses of vitamin A do not distinguish between synthetic and natural vitamin A, and the people whom Weston Price studied only consumed natural vitamin A from foods like liver and butter. The vast majority of the preformed vitamin A that people consume these days is synthetic, as the sources of natural vitamin A have been demonized (egg yolks, butter, liver, cod liver oil). People just don’t eat these foods much anymore, and so any study that claims to show birth defects from vitamin A is really showing defects from synthetic vitamin A. Synthetic vitamins are not the same and they affect the body in different ways. Cod liver oil is high in both vitamins A and D, which each protect against the toxicity of the other. While the doses of vitamin D you mentioned taking are probably OK in the short term if you started with a vitamin D deficiency, in the long run you’re better off increasing your vitamin A intake as well, to protect against vitamin D toxicity. Personally, I have seen nothing but benefits from taking cod liver oil.

Jon Doh says: October 16, 2011 at 10:41 pm
Soy, K2, A, D, and calcium
As I have been reading through the information available online, it appears that there is no satisfactory solution to the apparent dangers of soy consumption, if one is to attempt to improve bone density and combat hardening of the arteries. K2 is necessary to properly use calcium supplements to improve bone mass, and K2 is necessary to combat arterial calcification. From what I see, the only high-quantity source of K2, if I’ve understood what has been written, is soy. http://www.betterbones.com/bon…min-k.aspx There is a new synthetic K2, if you can actually find someone who sells it, but your articles describe problems with synthetic vitamins, so who wants to go there? So, what do you suggest to improve K2 intake to a sufficient level to combat hardening of the arteries without consuming any (or too much) soy? And in what ratios do you take the vitamins K2, A, and D3?

Linda says: December 4, 2011 at 10:41 pm
Update on Vitamins A and D??
Well, first of all, the long article was full of everything but dosages and was not written for people who aren’t scientists. What about IUs or mg? I also read the interview with Dr. Mercola, whom I have followed for many years. I am confused as to who to believe about the A and D issues! Cod liver oil is another question mark, to take yes or no?? Maybe I will just stick to Vit. D3 supplements and rely on our intake of Vit. A with our fresh eggs, eaten daily, and forget about supplementation. What a befuddlement!! LG, Dec. 4, 2011

Yolanda says: March 28, 2012 at 10:40 pm
What are the recommended proper supplementation ratios between vitamins D3, A and K2? Should dosages be equal to one another for each one of these vitamins? And I’m asking in layman’s terms of IU, g, mg, mcg. I do supplement, but if toxicity could be an issue in supplementing, in all the research that I’ve done between these vitamins I have never been offered a straight answer. I and so many people that I know discuss this issue and would really appreciate an answer to this question. Thank you

Whisperingsage says: September 22, 2014 at 4:27 pm
1000 IU for babies; so that tells us the piddly 400 mg commonly used is waaaaaay too low for the frail elderly who have a lot of catching up to do. And for the rest of us, 5000 IU is the average needed for adults. Keep in mind, when we go in the sun on a good UVB day (not possible for most of the year in my latitude) we make 20,000 IU in our skin, all other factors being present: proper cholesterol in our diets, as it is used to MAKE D in OUR SKIN! Statins probably ruin that function. A: 5000-20,000 IU of A, the lowest for normal adults, the higher for higher needs like me; I take 25,000 and have been urged to take 50,000 IU of A. K2 is made from natto, not at all fraught with the same dangers as regular soy providing it is organic and not GMO; the DRI is 120 mcg (12 mg) for an adult male. It can be found in 500 mcg (50 mg) doses at online vitamin stores. This is much better than a few years ago; we could only buy it commercially in 5 mg doses. So they are getting better at production, or perhaps being ALLOWED higher production by the all-knowing FDA. It can also be found under the names MK4 and MK7. It can be found naturally in aged cheeses. I recommend using your own home-grown sheep, cow or goat milk.

Jacquie says: November 20, 2014 at 1:45 pm
It looks as though the decimal point may have been dropped in the parenthetical milligram equivalents. 1000 micrograms = 1 milligram. So, 500 mcg is one half milligram (0.5 mg) and 120 mcg is slightly less than one eighth of a milligram (0.12 mg).

Ginger says: June 24, 2012 at 10:40 pm
Confusion: few readers know what nanograms/milliliter means in daily life
I have the same questions as the other commenters: what is the proper ratio of consumption of vitamins A, D, & K2 as expressed in IU, or mg? I’m only 30 and I already know my bone density is not what it should be. I’m drinking a half gallon of raw milk every week and occasionally taking a cal/mag supplement, which is a good start, but it seems very important to know good sources of K2, and the proper supplementation regimen I should be on! Otherwise, I could easily be shooting myself in the foot with the wrong supplements.

whisperingsage says: December 9, 2015 at 4:03 pm
How are you doing now, since it has been a few years since your post?
The Weston A. Price site here has an awesome article: http://www.westonaprice.org/health-topics/abcs-of-nutrition/vitamin-a-on-trial-does-vitamin-a-cause-osteoporosis/ And vitamin A can be taken safely up into 200,000 as long as there is at least 1000 IU of vitamin D. Now, a good couple of books on D are The Power of Vitamin D by Sarfraz Zaidi, MD, and Jeff Bowles’ The Miraculous Results of Extremely High Dose Vitamin D: My Experiment with Huge Doses. Dr. Zaidi has his practice in sunny S. California, and decided to check D3 status as a routine, and found only ONE, one! patient had optimal blood levels. She was a lifeguard, spending 5 days a week and at least 5 hours a day in the sun in a bathing suit. How’s that for 15 mins on arms and face? He found that on average he commonly recommended 10,000 IU a day. I myself have done the 100,000 IU a day experiment for 4 months and it straightened out my toenail fungus. Surprise, and nice. My usual dose is 10,000 IU. K2 is supposed to be 200 mcg per 5000 IU of D3. I take more, so I buy natto powder and capsulate it or take 1/2 to 1 teaspoon. As a powder, it is nutty flavored, mixes in yogurt OK. Also, when I can get my goats milked I feel better on that too. Don’t forget, bones also need boron (2-3 mg), zinc (15-50 mg, amino acid chelated if you can find it), copper (2 mg), vitamin C (2000-4000 mg), and we already get too much phosphorus. (300 mg is the dose, calcium 1200 mg, and now they say magnesium equal to calcium; I get at least 800 mg.) Phosphorus and potassium are so abundant in foods because NPK is what the main field farmers routinely use on crops and nothing else. This is why foods are deficient in calcium/magnesium. Cheap dolomite would do a lot to help that; this is true of your home plants too.

Twenty-Two Reasons Not to Go Vegetarian
Posted on April 8, 2009 by Sally Fallon Morell

Currently making the rounds on the internet is an article resurrected from a 1999 issue of Vegetarian Times, “22 Reasons to Go Vegetarian.” “Consider making this healthy choice as one of your new year’s resolutions. . .” says the teaser. “Stacks of studies confirm that a diet full of fresh fruits and vegetables and grains is your best bet for living a longer, healthier and more enjoyable life. There are literally hundreds of great reasons to switch to a plant-based diet; here are 22 of the best.”

Leaving aside for the moment the fact that a “plant-based diet” is not necessarily the same as a vegan diet, and that in the US a diet containing fresh fruits, vegetables and whole grains is a marker for prosperity and health consciousness (and therefore would naturally give better results than a diet lacking in these items), let’s look first at the American origins of the premise that a diet composed largely of fruits, vegetables and grains (presumably whole grains) is a passport to good health.

The American Vegetarian Society was founded in 1850 by Sylvester Graham (1794-1851), an early advocate of dietary reform in the United States and the inventor of Graham bread, made from chemical-free unsifted flour. Highly influential, Graham promoted vegetarianism and a high-fiber diet as a cure for alcoholism and lust. Graham preached that an unhealthy diet (one containing the confounding variables of meat and white flour) stimulated excessive sexual desire, which irritated the body and caused disease. John Harvey Kellogg (1852-1943) followed in Graham’s footsteps.
Inventor of corn flakes and a process for making peanut butter, Kellogg advocated a high-fiber vegetarian diet to combat the twin evils of constipation and “natural urges.” Kellogg preached against sexual activity even in marriage. Today we recognize the demonization and suppression of “natural urges” as a recipe for the pathological expression thereof; in fact we’d probably label Graham and Kellogg as nut cases suffering from serious insecurities. But the diet proposed to accomplish their goal of character building and social piety is still with us, enshrined, in fact, in the government-sanctioned food pyramid based on grains, vegetables and fruits with the addition of small amounts of lowfat animal foods. Lop off the top of the pyramid and you have the vegan diet, still promoted with religious fervor even though its original dogmatic basis has been forgotten. The language of moral rectitude still lurks in the vegetarian arguments of sexually liberated New Age youth. With these paradoxes in mind, let’s examine the 22 reasons given for adopting a vegan diet. 1. You’ll live a lot longer “Vegetarians live about seven years longer, and vegans (who eat no animal products) about 15 years longer than meat eaters, according to a study from Loma Linda University. These findings are backed up by the China Health Project (the largest population study on diet and health to date), which found that Chinese people who eat the least amount of fat and animal products have the lowest risks of cancer, heart attack and other chronic degenerative diseases.” Reference please? We haven’t found such statistics in a search of the medical database. In spite of claims to “stacks of studies,” there is actually very little scientific literature that carefully compares mortality and disease rates in vegetarians and nonvegetarians. In 1991, Dr. Russell Smith, a statistician, analyzed the existing studies on vegetariansim1 and discovered that while a number of studies show that vegetarian diets significantly decrease blood cholesterol levels, very few have evaluated the effects of vegetarian diets on overall mortality. His careful analysis (see sidebar below) revealed no benefit from vegetarianism in terms of overall mortality or longevity. In fact, Smith speculated on the possibility that the available data from the many existing prospective studies were left unpublished because they failed to reveal any benefits of the vegetarian diet. He notes, for example, mortality statistics are strangely absent from the Tromso Heart Study in Norway, which showed that vegetarians had slightly lower blood cholesterol levels than nonvegetarians.2 Since the publication of Russell Smith’s analysis, two significant reports on vegetarianism and mortality have appeared in the literature. One was a 2005 German paper that compared mortality in German vegetarians and health-conscious persons in a 21-year followup.7 By comparing vegetarians with health-conscious meat eaters, the German researchers eliminated the major problem in studies that claim to have found better mortality rates in vegetarians compared to the general population. Vegetarians tend not to smoke, drink alcohol or indulge in sugar and highly processed foods. To compare these individuals to meat-eaters on the typical western diet will naturally yield results that favor vegetarianism. 
But in the German study, both vegetarians and nonvegetarian health-conscious persons had reduced mortality compared with the general population, and it was other factors—low prevalence of smoking and moderate or high levels of physical activity—that were associated with reduced overall mortality, not the vegetarian diet. The other was a 2003 report that followed up on The Health Food Shoppers Study in the 1970s and the Oxford Vegetarians Study in the 1980s.8 The mortality of both the vegetarians and the nonvegetarians in these studies was low compared with national rates in the UK. Within the studies, mortality for major causes of death was not significantly different between vegetarians and nonvegetarians, although there was a non-significant reduction in mortality from ischemic heart disease among vegetarians. As for Colin Campbell’s China Study, often cited as proof that plant-based diets are healthier than those containing animal foods, the data on consumption and disease patterns collected by the Cornell University researchers in their massive dietary survey do not support such claims. What the researchers discovered was that meat eaters had lower triglycerides and less cirrhosis of the liver, but otherwise they found no strong correlation, either negative or positive, with meat eating and any disease.9 In his introduction to the research results, study director Campbell refers to “considerable contemporary evidence supporting the hypothesis that the lowest risk for cancer is generated by the consumption of a variety of fresh plant products.”10 Yet Cornell researchers found that the consumption of green vegetables, which ranged from almost 700 grams per day to zero, depending on the region, showed no correlation, either positive or negative, with any disease. Dietary fiber intake seemed to protect against esophageal cancer, but was positively correlated with higher levels of TB, neurological disorders and nasal cancer. Fiber intake did not confer any significant protection against heart disease or most cancers, including cancer of the bowel. In a 1999 article published in Spectrum, Campbell claimed the Cornell findings suggested “that a diet high in animal products produces disease, and a diet high in grains, vegetables and other plant matter produces health.”11 Such statements by the now-famous Campbell are misleading, to put it mildly, and have influenced many unsuspecting consumers to adopt a vegetarian lifestyle in the hopes of improving their health. 2. You’ll save your heart “Cardiovascular disease is still the number one killer in the United States, and the standard American diet (SAD) that’s laden with saturated fat and cholesterol from meat and dairy is largely to blame. Plus, produce contains no saturated fat or cholesterol. Incidentally, cholesterol levels for vegetarians are 14 percent lower than meat eaters” “Stacks of evidence” now exist to refute the notion that cholesterol levels and consumption of saturated fat have anything to do with heart disease, but this is a convenient theory for promoting vegetable oil consumption at the expense of animal fats. 
The International Atherosclerosis Project found that vegetarians had just as much atherosclerosis as meat eaters.12 Vegetarians also have higher levels of homocysteine, a risk marker for heart disease.13 The standard American diet is not, unfortunately, “laden with saturated fat and cholesterol.” It is, however, laden with trans fats and refined vegetable oils, both derived from plants, and it is these processed fats and oils that are associated with the increase in heart disease, not saturated animal fats. 3. You can put more money in your mutual fund “Replacing meat, chicken and fish with vegetables and fruits is estimated to cut food bills.” Some plant foods, such as nuts and breakfast cereals, are very expensive. And any analysis of your food budget must necessarily include medical and dental expenses, and also account for reduced income due to missed days at work, lack of energy and the behavioral difficulties that result from B12 deficiency. A lowcost vegetarian diet that renders you incapable of performing a well-paid, high-stress job—the kind that allows you to put money into a mutual fund—is a poor bargain in the long-term. 4. You’ll reduce your risk of cancer “Studies done at the German Cancer Research Center in Heidelberg suggest that this is because vegetarians’ immune systems are more effective in killing off tumour cells than meat eaters.’ Studies have also found a plant-based diet helps protect against prostate, colon and skin cancers.” The claim that vegetarians have lower rates of cancer compared to nonvegetarians has been squarely contradicted by a 1994 study comparing vegetarians with the general population.14 Researchers found that although vegetarian Seventh Day Adventists have the same or slightly lower cancer rates for some sites, for example 91 percent instead of 100 percent for breast cancer, the rates for numerous other cancers are much higher than the general US population standard, especially cancers of the reproductive tract. SDA females had more Hodgkins disease (131 percent), more brain cancer (118 percent), more malignant melanoma (171 percent), more uterine cancer (191 percent), more cervical cancer (180 percent) and more ovarian cancer (129 percent) on average. According to scientists at the Cancer Research UK Epidemiology Unit, University of Oxford, “Studies of cancer have not shown clear differences in cancer rates between vegetarians and non vegetarians.”15 5. You’ll add color to your plate “Meat, chicken and fish tend to come in boring shades of brown and beige, but fruits and vegetables come in all colors of the rainbow. Disease fighting phytochemicals are responsible for giving produce their rich, varied hues. So cooking by color is a good way to ensure you’re eating a variety of naturally occurring substances that boost immunity and prevent a range of illnesses” Salmon, eggs and butter have beautiful color. Nothing prevents meat-eaters from adding color to their plate by using a variety of vegetables and fruits. The nutrients from these plant foods will be more easily absorbed if you serve them with butter or cream. Animal foods provide an abundance of “naturally occurring substances that boost immunity and prevent a range of illnesses.” 6. You’ll fit into your old jeans “On average, vegetarians are slimmer than meat eaters, and when we diet, we keep the weight off up to seven years longer. That’s because diets that are higher in vegetable proteins are much lower in fat and calories than the SAD. 
Vegetarians are also less likely to fall victim to weight-related disorders like heart disease, stroke and diabetes” Studies do show that vegetarians on average have lower body mass than non-vegetarians, but vegetarianism does not confer protection from stroke and diabetes and provides only minimal protection against heart disease. Some people do gain weight—lots of weight—on a vegetarian diet and many vegetarians are far too thin. 7. You’ll give your body a spring cleaning “Giving up meat helps purge the body of toxins (pesticides, environmental pollutants, preservatives) that overload our systems and cause illness. When people begin formal detoxification programs, their first step is to replace meats and dairy products with fruits and vegetables and juices.” There are no studies showing that elimination of meat from the diet helps “purge the body of toxins.” The wording is interesting as it implies that vegetarianism will render a sinful body pure. Most plant foods today are loaded with pesticides and many components in animal products support the body’s detoxification system—such as iron in meat, amino acids in bone broths, vitamin A in liver and saturated fat in butter. No doubt about it, however, toxins are everywhere, in plant foods and animal foods. Health conscious consumers need to do their best to reduce the toxic load by choosing organic plant foods and pasture-raised animal foods. The Honolulu Heart Study found an interesting correlation of Parkinson’s disease with the consumption of fruit and fruit juices.16 Men who consumed one or more servings of fruit or fruit drinks per day were twice as likely to develop Parkinson’s as those who consumed less fruit. Commentators proposed either high levels of pesticides or natural nerve toxins called isoquinolones that occur in fruit as the cause. Salicylates are another component of fruit that can lead to problems. So even the consumption of “healthy” fruit is not necessarily safe. 8. You’ll make a strong political statement “It’s a wonderful thing to be able to finish a delicious meal, knowing that no beings have suffered to make it” Not a single bite of food reaches our mouths that has not involved the killing of animals. By some estimates, at least 300 animals per acre—including mice, rats, moles, groundhogs and birds—are killed for the production of vegetable and grain foods, often in gruesome ways. Only one animal per acre is killed for the production of grass-fed beef and no animal is killed for the production of grass-fed milk until the end of the life of the dairy cow. And what about the human beings, especially growing human beings, who are suffering from nutrient deficiencies and their concomitant health problems as a consequence of a vegetarian diet? Or does only animal suffering count? Of course, we should all work for the elimination of confinement animal facilities, which do cause a great deal of suffering in our animals, not to mention desecration of the environment. This will be more readily accomplished by the millions of meat eaters opting for grass-fed animal foods than by the smaller numbers of vegetarians boycotting meat. Vegetarians wishing to make a political statement should strive for consistency. Cows are slaughtered not only to put steak on the table, but to obtain components used in soaps, shampoos, cosmetics, plastics, pharmaceuticals, waxes (as in candles and crayons), modern building materials and hydraulic brake fluid for airplanes. The membrane that vibrates in your telephone contains beef gelatin. 
So to avoid hypocrisy, vegetarians need to also refrain from using anything made of plastic, talking on the telephone, flying in airplanes, letting their kids use crayons, and living or working in modern buildings. The ancestors of modern vegetarians would not have survived without using animal products like fur to keep warm, leather to make footwear, belts, straps and shelter, and bones for tools. In fact, the entire interactive network of life on earth, from the jellyfish to the judge, is based on the sacrifice of animals and the use of animal foods. There’s no escape from dependence on slaughtered animals, not even for really good vegan folks who feel wonderful about themselves as they finish their vegan meal. 9. Your meals will taste delicious “Vegetables are endlessly interesting to cook and a joy to eat. It’s an ever-changing parade of flavors and colors and textures and tastes.” To make processed vegetarian foods “taste delicious,” manufacturers load them up with MSG and artificial flavors that imitate the taste of meat. If you are cooking from scratch, it is difficult to satisfy all the taste buds with dishes lacking animal foods. The umami taste is designed to be satisfied with animal foods. In practice, very few people are satisfied with the flavors and tastes of a diet based exclusively on plant foods, even when these foods are loaded up with artificial flavors, which is why it is so difficult for most people to remain on a vegan diet. Vegetables are a lot more interesting and bring us a lot more joy when dressed with egg yolks and cream or cooked in butter or lard. But if you are a vegan, you’ll be using either liquid or partially hydrogenated vegetable oils, both extremely toxic. 10. You’ll help reduce waste and air pollution “Livestock farms create phenomenal amounts of waste, tons of manure, a substance that’s rated by the Environmental Protection Agency (EPA) as a top pollutant. And that’s not even counting the methane gas released by goats, pigs and poultry (which contributes to the greenhouse effect); the ammonia gases from urine; poison gases that emanate from manure lagoons; toxic chemicals from pesticides; and exhaust from farm equipment used to raise feed for animals.” The problem is not animals, which roamed the earth in huge numbers emitting methane, urine and manure long before humans came on the scene, but their concentration into confinement facilities. Only strong, committed, persistent and focused human effort will accomplish the goal of eliminating these abominations—the kind of strength, commitment, persistence and focus that only animal foods rich in cholesterol, zinc, good fats and vitamin B12 can sustain. In nature and on old-fashioned farms, the urine and manure from animals is not a pollutant but a critical input that nourishes plant life. As for methane, the theory that methane from animals contributes to global warming is just that—a theory, one that doesn’t even pass the test of common sense. Without urine and manure to nourish the soil, plant farmers need more pesticides, more chemicals. And there’s only one way to eliminate exhaust from farm equipment used to raise plant foods for vegan diets—pull those plows with horses and mules. 11. Your bones will last longer “The average bone loss for a vegetarian woman at age 65 is 18 percent; for non-vegetarian women, it’s double that. Researchers attribute this to the consumption of excess protein. 
Excess protein interferes with the absorption and retention of calcium and actually prompts the body to excrete calcium, laying the ground for the brittle bone disease osteoporosis. Animal proteins, including milk, make the blood acidic, and to balance that condition, the body pulls calcium from bones. So rather than rely on milk for calcium, vegetarians turn to dark green leafy vegetables, such as broccoli and legumes, which, calorie for calorie, are superior sources” References, please? The theory that excess protein causes bone loss was first presented in 196817 and followed up in 1972 with a study comparing bone density of vegetarians and meat eaters.18 Twenty-five British lacto-ovo vegetarians were matched for age and sex with an equal number of omnivores. Bone density, determined by reading X-rays of the third finger metacarpal, was found to be significantly higher in the vegetarians—these are lacto-ovo vegetarians, not vegans, so they will have good calcium intake. Dr. Herta Spencer, of the Veterans Administration Hospital in Hines, Illinois, explains that the animal and human studies that correlated calcium loss with high protein diets used isolated, fractionated amino acids from milk or eggs.19 Her studies show that when protein is given as meat, subjects do not show any increase in calcium excreted, or any significant change in serum calcium, even over a long period.20 Other investigators found that a high-protein intake increased calcium absorption when dietary calcium was adequate or high, but not when calcium intake was a low 500 mg per day.21 So meat alone will not help build strong bones. But meat plus dairy is an excellent combination. The chart below illustrates the difficulty of obtaining adequate calcium from green leafy vegetables or legumes and contradicts the claim made above that leafy green vegetables and legumes supply more calcium on a per-calorie basis. The opposite is the case. The RDA for calcium can be met for under 700 calories using cheese or milk, but requires 1200 calories for spinach and 5100 calories for lentils. And not even the most dedicated vegetarians could choke down 13 cups of spinach or 32 cups of lentils (that would be almost doubled once the lentils were cooked) per day (see sidebar, below). Leafy greens present additional problems because they contain calcium-binding oxalic acid. Calcium assimilation requires not only adequate protein but also fat-soluble vitamins A, D and K2, found only in animal fats. The lactoovo vegetarian consuming butter and full fat milk will take in the types of nutrients needed to maintain healthy bone mass, but not the vegan. 12. You’ll help reduce famine “It takes 15 pounds of feed to get one pound of meat. But if the grain were given directly to people, there’d be enough food to feed the entire planet. In addition, using land for animal agriculture is inefficient in terms of maximizing food production. According to the journal Soil and Water, one acre of land could produce 50,000 pounds of tomatoes, 40,000 pounds of potatoes, 30,000 pounds of carrots or just 250 pounds of beef.” No land anywhere in the world will produce 50,000 pounds of tomatoes, 40,000 pounds of potatoes or 30,000 pounds of carrots per acre year after year after year unless bolstered with fertilizer. 
Such land rotated with animal grazing will be fertilized naturally; without the manure and urine of animals, synthetics must be applied—synthetics that require large amounts of energy to produce and leave problematic pollutants, such as fluoride compounds, as a by-product. And much of the world’s land—mountainous, hillside, arid and marginal areas—is incapable of producing harvestable crops even with a large fertilizer input. But this land will support animal life very well. Eliminating the animals on this land in order to produce vegetable crops will indeed create famine for the people who live there. 13. You’ll avoid toxic chemicals “The EPA estimates that nearly 95 per cent of pesticide residue in our diet comes from meat, fish and dairy products. Fish, in particular, contain carcinogens (PCBs, DDT) and heavy metals (mercury, arsenic; lead, cadmium) that cannot be removed through cooking or freezing. Meat and dairy products are also laced with steroids and hormones.” Pesticides and heavy metals are found in animal foods only because they are applied to plant foods that feed the animals. Pasture-based livestock production and wild caught fish do not contribute to pesticide residue. Conventionally raised vegetables and grains are loaded with chemicals. Vitamin A obtained in adequate amounts from animal foods provides powerful protection against dioxins like PCBs and DDT.23 Vitamin B12 is also protective. Good gut flora prevents their absorption. Humans have always had to deal with environmental carcinogens—smoke is loaded with them—and heavy metals like mercury, which occur naturally in fish. We can deal with these challenges when we have adequate amounts of the nutrients supplied by animal foods. 14. You’ll protect yourself from foodborne illness “According to the Center for Science in the Public Interest, which has stringent food standards, 25 per cent of all chicken sold in the United States carries salmonella bacteria and, the CDC estimates, 70 percent to 90 percent of chickens contain the bacteria campylobacter (some strains of which are antibiotic-resistant), approximately 5 percent of cows carry the lethal strain of E. coli O157:H7 (which causes virulent diseases and death), and 30 percent of pigs slaughtered each year for food are infected with toxoplasmosis (caused by parasites).” The most common source of food-borne illness by a long shot is fruits and vegetables.24 Problems with animal foods stem from factory farming practices. Milk, meat and eggs raised naturally do not present problems of food-borne illness. 15. You may get rid of your back problems “Back pain appears to begin, not in the back, but in the arteries. The degeneration of discs, for instance, which leads to nerves being pinched, starts with the arteries leading to the back. Eating a plant-based diet keeps these arteries clear of cholesterol-causing blockages to help maintain a healthy back.” This item is pure speculation. One of the most common side effects of cholesterol-lowering is crippling back pain. The muscles that support our spine require animal foods to maintain their integrity. And the bones in our spine need a good source of calcium, namely dairy products or bone broth, to remain strong. 16. You’ll be more regular “Eating a lot of vegetables necessarily means consuming fiber, which pushes waste out of the body. Meat contains no fiber. Studies done at Harvard and Brigham Women’s Hospital found that people who ate a high-fiber diet had a 42 percent lower risk of diverticulitis. 
People who eat lower on the food chain also tend to have fewer incidences of constipation, hemorrhoids and spastic colon.” Konstantin Monastyrsky, author of Fiber Menace, begs to differ. He notes that because fiber indeed slows down the digestive process, it interferes with the digestion in the stomach and, later, clogs the intestines. The results of delayed indigestion (dyspepsia) include heartburn (GERD), gastritis (the inflammation of the stomach’s mucosal membrane), peptic ulcers, enteritis (inflammation of the intestinal mucosal membrane), and further down the tube, constipation, irritable bowel syndrome, ulcerative colitis, and Crohn’s disease. Hemorrhoids and diverticulitis are other likely results—scientific studies do not support the theory that fiber prevents these conditions.25 17. You’ll cool those hot flashes “Plants, grains and legumes contain phytoestrogens that are believed to balance fluctuating hormones, so vegetarian women tend to go through menopause with fewer complaints of sleep problems, hot flashes, fatigue, mood swings, weight gain, depression and a diminished sex drive.” Let’s see now, hormones in meat and milk are bad (see Item 13), but by tortured vegetarian logic, hormones in plant foods are good. Where is the research showing that vegetarian women go through menopause with fewer complaints? Numerous studies have shown that the phytoestrogens in soy foods have an inconsistent effect on hot flashes and other symptoms of menopause.26 The body needs cholesterol, vitamin A, vitamin D and other animal nutrients for hormone production. A vegetarian diet devoid of these nutrients is a recipe for menopausal problems, fatigue and diminished sex drive—the dietary proscriptions of the puritanical Graham and Kellogg work very well for their intended purpose, which is to wipe out libido in both men and women. Lack of cholesterol, vitamin D and vitamin B12 is a recipe for mood swings and depression. If you want to have a happy menopause, don’t be a vegetarian! 18. You’ll help to bring down the national debt “We spend large amounts annually to treat the heart disease, cancer, obesity, and food poisoning that are byproducts of a diet heavy on animal products.” We have commented on the link between vegetarianism and heart disease, cancer, obesity and food poisoning above. The main change in the American diet paralleling the huge increase in health problems is the substitution of vegetable oils for animal fats. A secondary change is the industrialization of agriculture. The solution to our health crisis is to return to pasture-based farming methods and the animal food-rich diets of our ancestors. 19. You’ll preserve our fish population “Because of our voracious appetite for fish, 39 per cent of the oceans’ fish species are over-harvested, and the Food & Agriculture Organization reports that 11 of 15 of the world’s major fishing grounds have become depleted.” Let’s pass laws against overfishing! And let’s provide the incentive to anti-overfishing activists by pointing out the important benefits of seafood in the diet. 20. You’ll help protect the purity of water “It takes 2,500 gallons of water to produce one pound of mutton, but just 25 gallons of water to produce a pound of wheat. Not only is this wasteful, but it contributes to rampant water pollution.” Reference please? If a sheep drinks one gallon of water per day— which is a lot—the animal would only need about 600 gallons of water to yield almost eighty pounds of meat. 
That’s less than eight gallons of water per pound, much less than the water required to produce a pound of wheat. 21. You’ll provide a great role model for your kids “If you set a good example and feed your children good food, chances are they’ll live a longer and healthier life. You’re also providing a market for vegetarian products and making it more likely that they’ll be available for the children.” You may not ever have any children if you follow a vegan diet, and if you do, you will be condemning your kids to a life of poor health and misery. Here’s what Dutch researcher P. C. Dagnelie has to say about the risks of a vegetarian diet: “A vegan diet . . . leads to strongly increased risk of deficiencies of vitamin B12, vitamin B2 and several minerals, such as calcium, iron and zinc . . . even a lacto-vegetarian diet produces an increased risk of deficiencies of vitamin B12 and possibly certain minerals such as iron.”27 These deficiencies can adversely affect not only physical growth but also neurological development. And following a vegan diet while pregnant is a recipe for disaster. You will, however, by embracing vegetarianism, provide a market for vegetarian products—the kind of highly processed, high-profit foods advertised in Vegetarian Times. 22. Going vegetarian is easy! “Vegetarian cooking has never been so simple. We live in a country that has been vegetarian by default. Our traditional dishes are loaded with the goodness of vegetarian food. Switching over is very simple indeed.” Going vegetarian is very difficult. The body needs animal foods and provides a powerful drive to eat them. Cravings and resentment are a natural byproduct of a vegetarian diet, not to mention separation from the majority of humankind by unnatural eating habits and a sense of moral rectitude. Sidebars Analysis of Vegetarian Studies by Russell Smith Russell Smith, PhD, was a statistician and critic of the lipid heart theory of heart disease. He is the author of the massive Diet, Blood Cholesterol and Coronary Heart Disease: A Critical Review of the Literature (1991, Vector Enterprises), as well as The Cholesterol Conspiracy (Warren H. Green, Inc., 1991). As part of his efforts to reveal the flimsiness of the theoretical basis for the lipid hypothesis, he also looked at studies on vegetarianism in the scientific literature. In a review of some 3,000 articles, Smith found only two that compared mortality data for vegetarians and nonvegetarians. One was a 1978 study of Seventh Day Adventists (SDAs) to which the above unreferenced claim probably refers. Two very poor analyses of the data were published in 1984, one by H. A. Kahn and one by D. A. Snowden.3 The publication by Kahn rather arbitrarily threw out most of the data and considered only subjects who indicated very infrequent or very frequent consumption of the various foods. The author then computed “odds ratios” which showed that mortality increased as meat or poultry consumption increased (but not for cheese, eggs, milk or fat attached to meat). When Smith analyzed total mortality rates from the study as a function of the frequencies of consuming cheese, meat, milk, eggs and fat attached to meat, he found that the total death rate decreased as the frequencies of consuming cheese, eggs, meat and milk increased.
He called the Kahn publication “yet another example of negative results which are massaged and misinterpreted to support the politically correct assertions that vegetarians live longer lives.” The Snowden analysis looked at mortality data for coronary heart disease (CHD), rather than total mortality data, for the 21-year SDA study. Since he did not eliminate the intermediate frequencies of consumption data on meat, but did so with eggs, cheese and milk, this analysis represents further evidence that both Kahn and Snowden based their results on arbitrary, after-the-fact analysis and not on pre-planned analyses contingent on the design of their questionnaire. Snowden computed relative risk ratios and concluded that CHD mortality increased as meat consumption increased. However, the rates of increase were trivial at 0.04 percent and 0.01 percent respectively for males and females. Snowden, like Kahn, also found no relationship between frequency of consumption of eggs, cheese and milk and CHD mortality “risk.” Citing the SDA study, other writers have claimed that nonvegetarians have higher all-cause mortality rates than vegetarians4 and that, “There seems little doubt that SDA men at least experience less total heart disease than do others. . .”5 The overpowering motivation to show that a diet low in animal products protects against CHD (and other diseases) is no better exemplified than in the SDA study and its subsequent analysis. While Kahn and Snowden both used the term “substantial” to describe the effects of meat consumption on mortalities, it is obvious that “trivial” is the appropriate descriptor. It is also interesting to note that throughout their analyses, they brushed aside their totally negative findings on foods which have much greater quantities of fat, saturated fat and cholesterol. The second study was published by Burr and Sweetnam in 1982.6 It was shown that the annual CHD death rate among vegetarians was only 0.01 percent lower than that of nonvegetarians, yet the authors indicated that the difference was “substantial.” The table below presents the annual death rates for vegetarians and nonvegetarians which Smith derived from the raw data in the seven-year Burr and Sweetnam study. As can be seen, the “marked” difference between vegetarian and nonvegetarian men in Ischemic Heart Disease (IHD) was only .11 percent. The difference in all-cause death rate was in the opposite direction, a fact that Burr and Sweetnam failed to mention. Moreover, among females the IHD death rate was actually slightly greater, and the all-cause death rate substantially greater, in vegetarians than in nonvegetarians.

Annual Death Rates of Vegetarians and Nonvegetarians
                          IHD      All-Cause
Male vegetarians          .22%     .93%
Male nonvegetarians       .33%     .88%
Female vegetarians        .14%     .86%
Female nonvegetarians     .10%     .54%

These results are absolutely not supportive of the proposition that vegetarianism protects against either heart disease or all-cause mortalities. They also indicate that vegetarianism is more dangerous for women than for men. How to Protect Yourself from Cancer with Food See our online brochure (printed version also available in our store) Vegetarianism: Variations on a Theme by Jim Earles VEGETARIANISM: In its simplest form, the abstinence from all flesh foods—those foods which inherently require the taking of an animal’s life—in favor of plant foods.
Without further qualifying terms, the term “vegetarian” does not specify whether or not a person might choose to eat animal products like milk and eggs, which do not inherently require the taking of an animal’s life. LACTO-VEGETARIANISM: A vegetarian diet with the inclusion of milk and/or dairy products. OVO-VEGETARIANISM: A vegetarian diet with the inclusion of eggs (usually eggs from chickens or other fowl, but presumably an ovo-vegetarian might also eat fish roe). PESCO-VEGETARIANISM (a.k.a. pescetarianism): A vegetarian diet with the exception of consuming fish and/or seafood. This is often viewed by adherents as being a voluntary abstention from eating land animals. This diet is similar to (and often overlaps with) the popular version of the Mediterranean Diet. POLLO-VEGETARIANISM (a.k.a. pollotarianism): A vegetarian diet with the exception of consuming chicken (and possibly other types of fowl). This is often viewed by adherents as being a voluntary abstention from red meats and from eating more highly-developed mammals such as cows, pigs, sheep, etc. NOTE: Many vegetarians do not feel that people who include seafoods or land fowl in their diets qualify as vegetarians at all. Indeed, many practicing pescetarians and pollotarians feel that their diet is a similar but entirely distinct dietary philosophy from vegetarianism. Some people prefer to use terms such as “semi-vegetarianism” or “flexitarianism” to refer to the primary (but not exclusive) practice of vegetarianism. ALSO NOTE: The above variants on vegetarianism may be combined in any way to describe an individual’s food choices. (e.g. lacto-ovo-vegetarianism, pollo-ovo-vegetarianism, etc.) VEGANISM: The more extreme end of the scale of vegetarianism. A vegan (both “vee-gan” and “vay-gan” are accepted pronunciations) abstains from all animal foods, including any meats, fish, eggs or dairy. Some vegans, but not all of them, also abstain from honey and other bee products, as well as clothing and materials made from animal products (e.g. silk, leather, fur, etc.). Many vegans view their dietary choices as being just a part of veganism, which is more fully viewed as a way of life and a socio-political stance. FREEGANISM: A subset of veganism which utilizes the same basic food choices but often lives out the socio-political aspects of veganism in an even more direct and radical way. Freegans seek to minimize or eliminate participation in the corporate food system by practices such as foraging for wild plant foods, community gardening, bartering for food instead of using money and dumpster diving (taking food that is still edible but past its expiration date out of supermarket, restaurant and bakery dumpsters). Dumpster diving especially is seen as a radical form of environmental stewardship—saving otherwise good food from going to a landfill. Getting food for free in this way also gives rise to the name—“free” plus “vegan” equals “freegan.” MEAGANISM: A further subset of freeganism! A meagan would dispense with the strict adherence to a vegan diet when their dumpster diving provides them with usable meat or other animal foods. (“Meat” plus “vegan” equals “meagan.”) Some meagans argue that all foods produced by the dominant corporate model are ethically-tainted, meatless or otherwise. Following this line, there is no moral high ground to be had when eating salvaged food. 
Other meagans believe that it is disrespectful to the spirit of an animal to allow its flesh or other products to be wasted, so it is better to eat these items and honor the loss of the animal’s life by keeping them in the food chain whenever possible. FRUITARIANISM: A subset of veganism wherein neither animals nor plants are allowed to be harmed or killed to feed human beings. This means that only the fruits of plants and trees are morally acceptable as human food, as these may be harvested without doing any harm to the plant. However, there is no strong consensus among fruitarians as to what exactly should constitute “fruit.” Botanically speaking, some common vegetables are actually classified as fruits (such as bell peppers, tomatoes and cucumbers), as are nuts and grains. Some fruitarians abide by the wider, botanical meaning of “fruit,” while others only eat the sweet, fleshy, more commonly-known fruits. Many fruitarians also include seeds in their diet, following the line of thought that anything that naturally falls from a plant (or would do so) is valid food. LIQUIDARIANISM / JUICEARIANISM: A rarely espoused dietary philosophy wherein adherents consume only liquids, including fruit and vegetable juices. More often than not, such a program would be undertaken only for a limited period of time, for the purposes of a cleansing fast. However, a relatively small number of people have attempted to maintain such a regime over an indefinite period of time. RAW FOODISM: While not necessarily falling under any of the above headings, many raw foodists base their food choices on some form of vegetarianism or veganism. A raw foodist consumes most or all of their foods in uncooked and unprocessed forms. (This may or may not include practices such as the soaking of nuts, seeds and grains.) While many raw foodists minimize or exclude animal products, some do consume raw meats, eggs and dairy products. MACROBIOTICS: Again not necessarily falling under any vegetarian category, but many macrobiotic adherents have strong overlap with vegetarianism and veganism. The macrobiotic diet emphasizes eating foods that are grown locally and (to the extent possible) when they are actually in season, placing an emphasis on eating grains, legumes, vegetables, fruits, nuts, seeds, fermented soy products and sometimes fish. Processed foods and animal products are typically excluded, as are vegetables of the nightshade family. VEGANGELICAL: Extreme veganism, where eating habits have become a highly intolerant, proselytizing religion! Products that Come from Cows Not only the steak on your plate, but a myriad of other products come from slaughtered cows, including components used in the manufacture of cosmetics, plastics, waxes (in crayons and candles), soaps, cleansers, shampoos, modern building materials and hydraulic brake fluid for airplanes. The membrane that vibrates to make a telephone work is made from beef gelatin. Epinephrine, a widely used drug for asthma and allergic reactions, is made from beef adrenal glands. No Such Thing as a Guilt-Free Lunch Letter published in the New Yorker, January 7, 2008 Bill Buford writes that nobody has a persuasive rejoinder to the vegan belief that sentient, warm-blooded creatures shouldn’t be sacrificed for our sustenance [An article on meat-eating called “Red, White, and Bleu,” December 3, 2007]. But if that’s your ethic, you should seriously consider fasting.
Countless millions of wee furry beasties, mice, moles and voles, as well as ground-nesting birds, are killed outright or die off from habitat destruction annually, when vast acreages are tilled by huge, mindless machines to grow “ethical” grains and vegetables. More are killed during the growing season by rodenticide grain baits, including zinc phosphide. Small mammals and birds are killed by machinery again at harvest time, and even more are killed by pest-control practices in granaries and processing plants before vegetables get to market. There’s no such thing as a guilt-free lunch. Rich Latimer, Falmouth, Massachusetts

Calcium in Dairy Products and Plant Foods
Food             Calories per 100 grams   Calcium (mg per 100 grams)   Calcium/Calorie Ratio   Amount needed for RDA (1200 mg)
Cheddar Cheese   402                      718                          1.8                     170 grams (about 6 ounces) = 680 calories
Whole Milk       66                       117                          1.7                     1000 grams (about 4 cups) = 660 calories
Spinach          91                       93                           1.02                    1300 grams (about 13 cups) = 1200 calories
Lentils          106                      25                           0.23                    4800 grams (about 32 cups) = 5100 calories

The Nutrient Density Stakes: Landslide Victory of Animal Foods over Fruits and Vegetables
Plant foods fail to match up to animal foods in almost every category. Note that liver contains more vitamin C than apples or carrots!
Per 100g     Phosphorus (mg)   Iron (mg)   Zinc (mg)   Copper (mg)   B2 (mg)   A (IU)    C (mg)   B6 (mg)   B12 (mcg)
Apple        0.6               0.1         0.05        0.04          0.02      0         7.0      0.03      0
Carrots      31.0              0.6         0.3         0.08          0.05      0         6.0      0.1       0
Red Meat     140               3.3         4.4         0.2           0.2       40        0        0.07      1.84
Liver        476               8.8         4.0         12.0          4.2       53,400    27       0.73      111.3

References
1. Smith, Russell L. Diet, Blood Cholesterol and Coronary Heart Disease: A Critical Review of the Literature, Vol 2. Vector Enterprises, November 1991.
2. Fonnebo V. The Tromso Heart Study: diet, religion and risk factors for coronary heart disease. American Journal of Clinical Nutrition, 1988, 48:739.
3. Kahn HA and others. Association between reported diet and all-cause mortality. American Journal of Epidemiology, 1984, 119:775; Snowden DA and others. Meat consumption and fatal ischemic heart disease. Preventive Medicine, 1984, 13:490.
4. Dwyer JT. Health aspects of vegetarian diets. American Journal of Clinical Nutrition, 1988, 48:712.
5. Fraser GE. Determinants of ischemic heart disease in Seventh-Day Adventists: a review. American Journal of Clinical Nutrition, 1988, 48:833.
6. Burr ML and Sweetnam PM. Vegetarianism, dietary fiber and mortality. American Journal of Clinical Nutrition, 1982, 36:873.
7. Chang-Claude J and others. Life style determinants and mortality in German vegetarians and health-conscious persons: results of a 21-year follow-up. Cancer Epidemiol Biomarkers Prev. 2005 Apr;14(4):963-8.
8. Key TJ and others. Mortality in British vegetarians: review and preliminary results from EPIC-Oxford. Am J Clin Nutr. 2003 Sep;78(3 Suppl):533S-538S.
9. T Colin Campbell and others. The Cornell Project in China.
10. T Colin Campbell and others. The Cornell Project in China, p 56.
11. The China Project: The Most Comprehensive Study Ever Undertaken on Diet and Health. Spectrum, Mar-Apr 1999, p 27.
12. Laboratory Investigations, 1968, 18:498.
13. Key TJ and others. Health effects of vegetarian and vegan diets. Proc Nutr Soc. 2006;65(1):35-41.
14. Mills PF and others. Cancer incidence among California Seventh-Day Adventists, 1976-1982. American Journal of Clinical Nutrition, 1994, Vol 59 (Supplement), Pages 1136S-1142S.
15. Key TJ and others. Health effects of vegetarian and vegan diets. Proc Nutr Soc. 2006 Feb;65(1):35-41.
16. http://www.webmd.com/parkinsons-disease/news/20030402/fruit-linked-to-parkinsons-disease.
17. Wachman A and Bernstein DS. Diet and osteoporosis. Lancet, 1968, 1:958.
18. Ellis FR and others. Incidence of osteoporosis in vegetarians and omnivores. American Journal of Clinical Nutrition, June 1972, 25:555-558.
19. Spencer H and Kramer L. Factors contributing to osteoporosis. Journal of Nutrition, 1986, 116:316-319.
20. Spencer H and Kramer L. Further studies of the effect of a high protein diet as meat on calcium metabolism. American Journal of Clinical Nutrition, June 1983, 37(6):924-929.
21. Linkswiler HM and others. Calcium retention of young adult males as affected by level of protein and of calcium intake. Trans. N. Y. Acad. Sci., 1974, 36:333.
22. Ensminger AH and others. The Concise Encyclopedia of Food and Nutrition. CRC Press, 1995.
23. Masterjohn, Chris. Dioxins in Animal Foods: A Case for Vegetarianism? Wise Traditions, Fall 2005.
24. MMWR Mar 2, 2000:49(SS01);1-51.
25. http://www.fibermenace.com.
26. Soy Alert! Update, Summer 2003, on westonaprice.org.
27. Dagnelie PC. Nutrition and health—potential health benefits and risks of vegetarianism and limited consumption of meat in the Netherlands. Ned Tijdschr Geneeskd. 2003 Jul 5;147(27):1308-13.

What Can the Diet of Gorillas Tell Us About a Healthy Diet for Humans? Posted on February 17, 2004 by H. Leon Abrams, Jr.
One of the arguments proffered by vegetarians is that our primate ancestors were vegetarians and, to be healthy, we should eat the same kind of diet. An article entitled “The Western Lowland Gorilla Diet Has Implications For the Health of Humans and Other Hominids,” which appeared in a recent issue of Human and Clinical Nutrition, makes this argument. With reference to the authors’ study of the vegetarian diet of gorillas, the research is sound, but to claim that humans would be better off with a vegetarian diet like that of the gorillas is spurious and equivocal. One misconception about the gorilla diet is that it contains no animal products. On the contrary, all of the great ape groups take in some animal protein, whether overtly or inadvertently, by consuming insects, insect eggs and the larvae that nest on the plants and fruits they eat. In her pioneering work on chimpanzees, Jane Goodall discovered to her amazement, and to the amazement of the rest of the world, that chimpanzees kill and eat monkeys and make a tool to extract termites from their hills (homes), and that they go to considerable effort to obtain these foods. It is also significant that meat is the only food they share with other chimpanzees. All monkeys, lemurs and apes are classified as vegetarians and/or frugivores, but they consume a small amount of animal protein by unconsciously eating the small insects, their eggs and larvae on the plant foods they select to eat. The National Zoo in Washington, D.C. tried to breed the near-extinct, frugivorous South American golden marmoset in captivity with no result, but when a little animal protein was added to their diet, they began to breed, which proves that they require a small amount of animal protein to be healthy and reproduce. With the exception of humans, the native habitat of all the primates is in the tropics. By contrast, for thousands of years, humans have inhabited all the land masses of the world, except for Antarctica. The first humans, the Australopithecines, circa 2 million years ago, were omnivorous.
Recently, some researchers, in examining their fossil teeth, have claimed that the Australopithecines were vegetarians; but the evidence indicates they were omnivorous. It is clear that by the time “humans” evolved, from Homo erectus through to what is now considered “modern” humans, such as Cro-Magnon man, humans were primarily meat eaters. According to J. Bronowski (The Ascent of Man), it was meat-eating that led to the rise of modern man. Homo erectus invented stone tools for hunting big game, which led to the invention of more advanced stone tools by Cro-Magnon to modern humans. It was the quest for meat that led Homo sapiens to colonize the world. They followed the herds of animals. When overpopulation caused the animal food supply to dwindle, many moved on, from tropical Africa to North Africa, Asia, Europe, the Americas and Australia. They walked and adapted to the cold climates and were able to do so because meat is compact energy, and one kill of a mammoth or other big game could feed many people and lasted for a long period of time; whereas gathering plants and fruits to eat was seasonal. Until the early part of the 20th century there were peoples who lived almost entirely on animal food. For example, the Eskimos of North America and Lapps of Scandinavia lived almost entirely on animal protein and were very healthy. However, when we refer to meat, remember that meat entails fats, which are necessary for sound health. The protein and minerals in the meat cannot be utilized without the nutrients in the fat. Both Stefansson and Brody, who spent time with the Eskimos and Indians of North America, reported that these people saved the fat from game animals and always ate their meat with fat. The Eskimos ate raw meat, which is very healthy, but there is a caveat for modern society: fresh meat often contains bacteria and parasites that can cause illness, and even death; therefore it is recommended by the government that all meat should be cooked well enough to kill all such pathogens. Humans only turned to plant foods as major food sources when, due to the ever-increasing human population, herds of animals became scarce. They learned to domesticate some animals and invented agriculture. Humans had learned to use fire, to some extent, as early as the Paleolithic age. Cooking certainly was necessary, because grains cannot be eaten raw. It is also interesting to note that when humans began eating a diet high in grains, the incidence of tooth decay increased considerably. Tooth decay increased dramatically when refined grains (wheat and rice) became staple diets for a large percentage of the world’s population. For normal growth and sound health throughout life, the human species requires eight amino acids which the body cannot manufacture, vitamin B12 and some essential minerals. The only viable source of these amino acids and of vitamin B12 is animal protein such as red meat, fish, shellfish, eggs, milk, insects and worms. The lack of these amino acids results in serious illnesses. For example, kwashiorkor is a deficiency disease which impedes the normal development of vital brain cells and stunts growth. People may be getting all they need to eat to satisfy their hunger from grains and other plant foods. They may even become plump on a diet of grains, but their normal growth and development is stunted. For instance, some Maya Indian peasant groups of Guatemala primarily have only corn, beans and squash to eat. They like meat, but are too poor to purchase meats or raise animals.
Feeding domesticated animals would sacrifice land needed to grow the grains on which they subsist. This condition is common over much of the world. Unlike that of humans, the digestive tract of gorillas is equipped to manufacture the essential amino acids and other vital nutrients. The human digestive system is not so equipped and we must rely on animal proteins. It is interesting to note that advocates of vegetarian diets who use the diet of apes as a rationale to support their food choice–asserting that the ape diet is more “natural”–fail to advocate eating a diet of all-raw plant foods as the apes do. The basic plant foods that humans eat must be cooked. Vegan advocates also say that by combining grains with legumes, one can get the essential amino acids. Though this may be theoretically possible, in practice it is scarcely viable and extremely difficult, if not impossible, to accomplish, particularly if robust health is to be achieved and maintained generation after generation. Of course, due to modern technology, many of the essential nutrients can be supplied by synthetic or processed products, but these merely duplicate what is naturally in animal protein and are often extracted from it. To be on the safe side, it is wise to procure essential nutrients from their best source–animal protein. Anthropologists have wondered why certain foods came to be prohibited by some religions. The anthropologist Dr. Marvin Harris, in his two extremely readable, informative and enjoyable books, Cannibals and Kings and Cows, Pigs, Wars and Witches, shows that the prohibition of pigs (pork) by the Jewish religion and cows by the Hindu religion came about due to the ever-increasing pressure of population growth. Pigs eat grain. It takes a great deal of land to grow grain, and that grain can feed more humans directly than it can by first feeding the pigs that require it to become meat on the human dinner table. Wheat was thus in competition with pigs, and wheat won out when human referees decided it was more efficient in feeding the growing population. Pork wasn’t worth the grain, and was prohibited by the religious leaders as a strategy to feed the population more efficiently. Likewise, in India where beef was widely eaten at an earlier time in history, the Hindu religion prohibited it because the cow was more valuable for its milk and dung than as edible beef. Milk from the cow provided animal protein and the dung provided fuel for the fires to cook food. Religious sanctions are a very powerful societal force of control. (In these books by Harris, only a few pages are devoted to this subject, but the books are highly recommended for gaining insight into human behavior.) In economically diverse societies where animal protein is scarce among the poorer classes and more abundant in the increasingly affluent sectors of society, it is interesting to note the differences in body height that seem to reflect the way people are forced to eat. The less affluent sectors subsist primarily on grains and a few vegetables and lack the height that is found among the more affluent ruling classes. This situation can develop as a result of overpopulation because too many humans inhabiting a region can deplete the carrying capacity of the land upon which the food is produced. The ancient Maya of the Classical Period used the slash-and-burn strategy to create more arable land as their population outgrew the surrounding forest.
In order to create fields in which to grow corn, squash, beans and chili peppers, forest land was cleared by the destructive method of cutting down trees and burning the debris. This is a very brutal strategy within a fragile ecosystem, one that rapidly exhausts the soil. The Mayan diet consisted chiefly of the vegetables they grew, a few fruits and game. But the game became scarce as the forest was cleared for farmland, and only the tiny ruling class had access to animal protein. (They had the domestic turkey and dog, but these animals ate the same food as humans.) This ecologically unstable situation led to the collapse of the Classical Maya civilization, when the Maya abandoned their great cities. The relevant point for this article is that the skeletons unearthed from the Mayan burial grounds reveal that the ruling class was taller than the masses. The nobility supplemented their basic diet of corn, beans and squash with what animal protein was available, whereas the masses had practically none. So what can the diet of gorillas tell us about what constitutes a healthy diet for humans? Little if anything. Humans are omnivores and need animal protein as well as plant foods to maintain sound health. The author of this article and Dr. Melvin E. Page recommend, as presented in their book, Your Body is Your Best Doctor, the following as a sound diet to help maintain optimal health: Eat a variety of fresh animal protein and fats, a wide variety of fresh vegetables, fruits and nuts and whole grain breads and cereals. For a complete bibliography on this subject, see “The Relevance of Paleolithic Diet in Determining Contemporary Nutritional Needs,” H. Leon Abrams, Jr. The Journal of Applied Nutrition. Vol. 31, Numbers 1 and 2. Editor’s Note: Many practitioners still recommend the use of raw meat for its health-building properties, pointing out that careful handling and protective factors in the diet can minimize the risks of parasite and microbial infection.
Copper-Zinc Imbalance: Unrecognized Consequence of Plant-Based Diets and a Contributor to Chronic Fatigue Posted on February 14, 2008 by Laurie Warner, MA, CNC
A commonly reported consequence of vegetarian or vegan diets, or even diets that rely too heavily on plant foods, is chronic fatigue. Many sufferers subsequently embrace the principles enumerated by Weston Price, adopting a diet containing more nutrient-dense animal foods and fat; however, the fatigue often persists, even after considerable time on the new diet. While Americans have been receiving a broad education on the nutritional value of plant foods, evidence has accumulated to indicate that diets that rely too heavily on plant food sources have special problems of their own. Those of us interested in traditional nutrition have become familiar with some of these, including fatty acid imbalances, B6 and B12 deficiencies, and untreated phytates in whole grains, legumes and nuts. As we continue to delve into these areas, the seriousness of these dietary imbalances continues to emerge. Disruption of the copper-zinc ratio is an overlooked contributor to intractable fatigue that follows excessive reliance on a plant-based diet. The result is toxic accumulation of copper in tissues and critical depletion of zinc through excretion. This condition usually goes unrecognized because copper levels in the blood can remain normal. Also, most doctors are unprepared to deal with extreme zinc deficiency and its baffling effects on many systems of the body.
Hair mineral analysis, competently used, is the tool which can unravel the complexities of this growing problem. In particular, it is becoming clear that plant-based diets, and lighter diets generally, cause serious nutrient imbalances and long-term damage to digestion and cellular metabolism that are not easily corrected. This is of consequence for us in the traditional foods movement because we are asking people to return to higher density foods they may not have eaten for many years. Proper physiologic balance can be restored, but the period of transition in some cases may be longer and more difficult than we have anticipated. An Unrecognized Danger This article explores a major hurdle to dietary recovery, which has remained little-known, although an accessible book by Ann Louise Gittleman, MS, introduced the topic in 1999.1 The fact is that the micronutrient copper is widely available in unrefined foods,2 but the mineral zinc, needed in larger amounts to balance copper, can only reliably be obtained in optimum amounts from land-based animal foods, in particular eggs and red meats.3 These of course are among the foods that have been most stubbornly attacked by mainstream nutrition authorities. They are also among the foods lacto-vegetarians and others who have conscientiously adopted light diets have the most difficulty in reintroducing. It is tragic that Americans who have been inspired to adopt healthier diets have been so harmfully misled by the anti-animal foods dogma, often against their better instincts. I myself was led into this trap in the mid-1970s, and have only found my way out of it in the last few years. Although I found the Weston A. Price Foundation material when it first appeared, and benefited from many of its suggestions, I was unable to consistently expand my diet, or even tolerate any fat, until I learned to recognize and apply the lessons of the copper-zinc imbalance. In fact, this imbalance could very well have killed me. Controlling Copper A brief survey of copper/zinc imbalance will show why this condition can be so serious. Copper is an essential trace mineral, but it is needed only in minute amounts. It works in a paired relationship with zinc, sometimes in complement and sometimes opposing. Copper is present in most foods, and is also absorbed from the environment.4,5 When zinc is present in abundance, and when there is enough quality protein available to bind it,6 copper can be handled freely, and the excess can be readily excreted through the bile.7,8 When the diet is lacking in zinc and protein, however—and in fats to promote bile production—use of high-copper foods, and environmental copper, primarily ingested through our water, promote buildup of copper in our tissues.9 The late Carl C. Pfeiffer, PhD, MD, formerly of the Brain-Bio Center in Princeton, New Jersey, has provided us with the most comprehensive overview of nutritional problems associated with copper and zinc in his classic study Mental and Elemental Nutrients.10 As he succinctly puts it, “Deficiency of zinc accentuates copper excess.”11 Here we have a classic dilemma of the medical flight from traditional diets. In lighter diets generally, and in heavily plant-based diets in particular, zinc is sharply reduced relative to copper,12 protein is curtailed, and fat is provided scantily at best. The excess copper that builds up in tissues is in an unbound, inorganic form,13 is highly immobile, and creates a low-level toxicity that interferes with many body systems.
Particularly affected are the liver and digestion,14 which are already hampered by increasing deficiency of zinc. As bile function and digestive vigor decline, difficulty with meat and fat develops. Legions of light-diet and vegetarian adherents feel justified in their choices because heavier food becomes unpalatable to them.15 The Grain Connection We can quickly recognize a connection here that is particularly relevant to traditional foods nutrition. The copper-zinc ratio in grains is disturbed by refining.16 This ratio tends to be low in plant foods anyway,17 and is shifted further in favor of copper by the refining process. In whole grains, as we know, phytates interfere with zinc absorption, so the net benefit from unrefined grains is always problematic, and probably very low in most cases, while copper, which is less affected by phytates than zinc,18 gains again in the copper-zinc ratio. This loss of nutrients in grains, though serious, seems to have had less effect in past generations when much of the country still lived rurally and meat and eggs were liberally used.19 Current ideology, however, has shifted the burden of the diet to grains and other phytate-bearing foods and most people concerned with nutritional values of their food today have come to believe that these foods are reliable sources of both protein20 and zinc,21 resulting in poor protein nutrition, zinc deficiencies and build up of excess copper. Modern Conditions Even in 1975, Pfeiffer considered zinc status in most Americans to be borderline at best.22 After twenty-five years of vegetarianism and plant-based diets, it is doubtful our status today is even that optimistic. Too many other factors also work to increase copper and work against zinc. Zinc galvanized pipes have been replaced by copper pipes in many areas, which can be etched by slightly acidic water supplies.23 Birth control pills and other medications increase the retention of copper.24 Blanching of vegetables before commercial freezing removes zinc and many trace minerals,25 while copper is added to many multivitamins.26 There are numerous other factors contributing to this imbalance, but most devastatingly zinc is lost from our bodies every day when we are under stress.27 The more stress, the higher the losses, and yet zinc is needed in large amounts by our stress-resisting adrenal glands.28 When we are zinc-deficient our innate coping resources can start to unravel, and the grind of everyday stress can seem overwhelming. Effects on the Personality I know now that I started life with a big zinc-deficiency liability. Four years ago, my acupuncturist put me on a copper-zinc balancing program, but it was only about a year ago that I learned about pyroluria from the Resource Tool Kit in The Mood Cure by Julia Ross, MA.29 Those of us with this condition, affecting 11 percent of the population, produce excessive amounts of a metabolic toxin called pyrroles, which requires vitamin B6 and zinc for detoxification.30 Significantly, this condition is found disproportionately in those with alcoholism,31 schizophrenia32 and mood disorders.33 It can also produce baffling physical symptoms due to heightened deficiency of these two nutrients, as well as manganese,34 a nutrient that is crucially needed to activate arginase,35 the enzyme that converts ammonia to urea for excretion from the body. 
Pyroluria, like copper-zinc imbalance, was first researched at the Brain-Bio Center.36 Pyroluria patients display a range of symptoms connected with severe zinc deficiency that are familiar to me from my work with Chronic Fatigue Immune Deficiency Syndrome (CFIDS), including nausea, loss of appetite, abdominal pains and headache—all of which can be associated with food intolerance and digestive problems—as well as nervous exhaustion, emotional fragility, palpitations, depression and insomnia.37 Other complications include abnormal EEG findings38 and cognitive difficulties ranging from misperceptions and hallucinations39 to amnesia.40 Cognitive deficits such as memory, attention and concentration disturbance are widely recognized in CFIDS patients41 and can occasionally take on more serious manifestations. These observations lead me to suspect that pyroluria may also be disproportionately represented among CFIDS patients. Certainly chronic fatigue of a baffling type is a hallmark of the copper-zinc imbalance more generally. Nutritionist Ann Louise Gittleman discovered the importance of copper overload in her practice when results of hair mineral analysis (sometimes referred to as tissue mineral analysis) helped explain the fatigue of patients who had not responded to treatment for suspected causes of the problem.42 Among a varied population, the only common factors were fatigue and high copper analysis.43 But as she also stresses, copper overload and its accompanying zinc deficiency, are usually “more than just fatigue.”44 In addition to problems already mentioned, she recognizes hypoglycemia,45 anxiety, racing mind and panic attacks,46 skin problems,47 and premenstrual syndrome.48 Racing mind, which I have experienced as a kind of desperate, circular chattering of my own thoughts that can go on for days, is a special case here because it is so specific to the copper overload problem. The cognitive deficits of chronic fatigue patients are often characterized as “brain fog,”49 and investigators have found a general slowing down of brain functions.50 For patients to complain of rushing, frantic thought processes is an anomaly that can complicate the diagnosis of chronic fatigue, unless its role as a tipoff of possible high copper is recognized. Michael Rosenbaum, MD, has credited Gittleman with recognizing “tired bodies with overactive minds”51 as the signature of the copper-zinc imbalance. Candida and Infection Two other serious conditions mentioned by Gittleman deserve special consideration, because they are often involved in the more critical CFIDS form of chronic fatigue. The first of these is yeast overgrowth, termed systemic candidiasis or candida by alternative practitioners. Copper, Gittleman informs us, is “the body’s natural yeast killer.”53 When it is bound up in tissues, however, blood copper may be low,54 resulting in reduced white blood cell activity. High levels of bioavailable copper can also be a problem, however, in exacerbating the condition.55 As in so much of mineral metabolism, balance is necessary to permit optimum function. Other infections also play their part in CFIDS and can lead to the immune dysfunction that characterizes it. 
Gittleman tells us that individuals affected by chronic bacterial infections are found to have copper that’s low or unavailable, while conditions of chronic viral infection are more typically connected with low zinc and high copper levels.56 Such patients often struggle on for years with little improvement, but may benefit from a copper-balancing program.57 Keynote of Poor Health Struggling on has pretty much been a keynote of my life. In childhood I was weak and shy, always underweight. I was diagnosed with anemia and also treated with thyroid medication in early adolescence. It may have helped: always subject to frequent strep infections and earaches, I was a chronic absentee from school, but about that time I resolved to maintain regular attendance and was able to do so. But new problems appeared. I sunburned severely and was subject to stretch marks, signs of zinc-related skin fragility.58 I had my first yeast infection when only thirteen. I also experienced characteristic late-onset menarche.59 This pattern of uncertain health only worsened as I grew older. I suffered serious depression and attention problems that I realize now were probably side effects of birth control pills.60 My use of these throughout my twenties was only the first of several major developments that would greatly aggravate my inborn copper-zinc imbalance. When I first became interested in natural foods, I turned to Adelle Davis and D. C. Jarvis of Folk Medicine fame. These authors represented the natural foods movement for me and I never believed vegetarianism was necessarily the right or best lifestyle. But when I moved into a household with a vegetarian requirement, I made a fatal mistake. I accepted the premise that it was not really necessary to include meat, fish or poultry in the diet to be healthy. When I then, after a year or so of vegetarian lifestyle, acquired HHV2 infection, I was really in serious trouble, and the ominous deterioration of my health during the next few years took me decades to climb out of. Like so many immersed in the vegetarian culture, I strove to deal with my new crises by moving to more and more rigorous regimes, rather than returning to the more nutritious foods I had eliminated. My chronic infections from childhood had never let up. I had suffered constant vaginal yeast infections since my early twenties. Now to this were added, more and more frequently, headaches, painful joints and burning pain over my body in a general misery. Studying natural health seriously now, I found that these bouts, which I called “acid attacks,” could be mitigated by the popular cleansing and alkalinizing regimes that so many vegetarians admire. Of course, there was no one to inform me of the vital role of high quality protein in maintaining proper pH in the body.61 Having gained some relief from my symptoms, however, I was able to sort out the pattern of these attacks more specifically, and they were centered on a cycle of liver and intestinal inflammation. It was this that I now sought to understand and unravel, and I was to pursue this quest doggedly year after year, in spite of blank looks, indifference and patronizing responses I received from practitioners across the spectrum of the healing arts. I had to do detective work on my own in those early years. Fats and Acid Attacks It was through careful self-testing that I first learned that fats were the source of my “acid attacks.” It was a relief to find a cause, but also alarming.
From my early studies, I knew very well the crucial role of fat-soluble vitamins. Was it only certain fats? I would experiment again and again over the years, trying to find ways to get a little fat into my system. All fats, even the highest quality, gave me these problems. For the present, I avoided all fats because of the price in pain and debility. It was truly ironic that everyone thought my diet was really healthy. One radio-talk nutrition expert asked me, “Why would you want to eat fats?” Even then, in the early eighties, I knew better. At that point, things had, if possible, gotten worse. I had done an internship in iridology with a raw foods expert. I learned a great deal from this man about cleansing and retracing (a condition where old health problems resurface during the cleansing process), and I respected his program because he valued fats. He used substantial amounts of avocado and seed sauces to give his vegan regime some density, and he was not the archetypal gaunt vegetarian. But I was trying to do his program without those foods, and living on raw sprouts gave me new intestinal problems. There was no way to get enough food, let alone nourishment, from such a program. That was only the first time I nearly starved. I had discovered I could not go back to even a more moderate vegetarian diet. I recognized that I had lost crucial digestive abilities. I now got acid attacks just trying to eat cooked food, even without fat. Dr. Paul Eck, a pioneering researcher in mineral metabolism and hair mineral analysis, clearly recognized this destructive aspect of the vegetarian diet. He asserts that the vegetarian does not act freely in his choice of diet. He is forced into it by the progressive collapse of his metabolism.62 This collapse is certainly what I experienced. Thankfully, I found two foods during this time that saved me. For years I used a seed sauce of my own design made with sunflower seeds and tofu—that is, I had finally found a substantial food I could rely on for protein and some fat. The home culturing of the sauce seemed to make it more digestible, and probably also reduced some of the problematic components of soy (of which I was unaware at the time). I was also receiving a basic source of zinc (from the sunflower seeds), a nutrient that had concerned me because I knew of its role in healing and the immune system. I did not know how extreme my zinc deficiency must have been, though I watched my nails for telltale signs. In all those years, I never developed the white-spotted or deformed fingernails linked to extreme zinc deprivation.63 Unfortunately, this sauce was also high in copper. Perhaps my second saving food helped me with this, however. Not tolerating commercial supplements, I turned to spirulina as a food supplement. I knew spirulina provided a broad range of nutrients. It was only years later that I learned how beneficial it is for the liver64 and realized it had probably helped me to reduce some of my copper load. It certainly aided my digestion, and in time I was able to return to cooked foods, though my diet was still extremely limited by my fat intolerance. Fungal Problems What I had not yet faced was a threat just becoming known.65 In my raw food years I had relied excessively on fruit and fruit juices for “alkalinizing,” and just to get enough food. As I read the emerging literature on candidiasis, I was horrified to realize I had built up a massive systemic yeast problem. And yeast, we remember, is a hallmark of the copper-zinc imbalance.
There would be no resolving one without dealing with the other. In 1988, I began treatment with my acupuncturist, Theresa Vernon, and benefited from Chinese tonic herbs. Chinese medicine is a godsend for cases like mine because it can work at once by strengthening and balancing the system based simply on presenting conditions. Its cumulative effects are slow, however, and I was by now very ill, and could not withstand further shocks to my system. I was then going through a prolonged relationship breakdown that I would have to call the most excruciating stress of my not unstressful life. And in 1990, I suffered a severe adverse reaction to Nizoral, an antifungal drug then being used for candida. This caused serious new liver damage and intestinal damage as well. I had now developed an acute colitis-like condition that would stay with me many more years, and the liver pain was back with a vengeance. Once again my diet collapsed to a handful of foods. Breakdown In the fall of that year I went into total breakdown, a process that is most devastating because it just keeps on getting worse. Gittleman talks about adrenal burnout in zinc deficiency as a total exhaustion of the adrenal capacity to respond to stress.66 Deep burnout produces a bone-shuddering, unrelenting fatigue that is beyond anything I would have imagined. I only hope that by sharing this information I can spare others that experience. Burnout was only part of what was going on. There were also waves of a kind of feverish delirium that made it very hard to focus on my surroundings or communicate with others. Pfeiffer might have called it an “intensifying of disperceptions.”67 In Chinese medical terms it is referred to as “deficiency fire.”68 In energetic form, it can be described as a fast-burning brushfire in dry grass; when the system becomes too depleted, it can only consume itself. It is a complete exhaustion of yin, the reserves and nourishing fluids of the body. The Chinese regime treats this with a “purge fire” and yin-restorative herbal tonic program. By pouring on these herbs for weeks, we cooled down to where the outbreaks of fire were less intense and less frequent. But I remained in a free-floating kind of fugue state for years. It is part of the disorientation of the condition that I don’t know now exactly when I came out of it. I see a lesser version of this frequently in the ill persons I assist through our support network; there is such a high level of confusion, distractibility and anxiety in certain people today that they frequently cannot focus on the information that could help them. Such observations lead me to look into the area of zinc deficiency and adrenal burnout in their situations. With all this we were trying to fight the candida too. We frequently had to go beyond the available information to make progress. In straightforward cases of flora imbalance, the basic programs generally presented may suffice. But attention must always be given to the problem of die-off. When antifungal supplements begin to kill yeast in the system, toxins are released which can aggravate symptoms unless care is taken.69 These symptoms can be mitigated by moderation in the approach, but I found I was struggling constantly with erratic and unpredictable flare-ups.
What we gradually realized was that in more severe cases, the body can be so saturated with the toxic by-products of candida that “die-off” responses can be triggered not only by anti-fungal products, but also by anything with a cleansing effect on the system, even salads and beverage teas in a case as severe as mine. I also felt that nourishing and strengthening agents, such as vitamins and tonic herbs, stirred up symptoms, perhaps simply because they aided my body in its own efforts. The toxicity of yeast and yeast byproducts is a serious concern and I have seen yeast-control efforts collapse again and again when this factor is not understood. The impulse is to throw everything available at the overgrowth, but we discovered that in many cases it can be eliminated only in minute increments, over an extended period of time. Chinese Medicine I believe Theresa’s treatment during those years saved my life. Using care in handling die-off, I was able to progress beyond the phase where nearly everything I did seemed to cause flare-ups. By the late 1990s, I had rebuilt my diet yet again and regained some strength, but I was living mostly on chicken soup and still rarely went out of the house. Theresa and I have both found Chinese herbs and food therapy always helpful for those with chronic fatigue. Neither of us personally knows anyone who has recovered from the condition without the help of Chinese medicine. In Chinese medicine, proper food is a major treatment modality. According to Michael Tierra in Planetary Herbology, “Deficiency conditions are regarded as the root or radical cause of most diseases.”70 Foods are analyzed according to the five flavors71 of sour, bitter, sweet, pungent and salty, and applied as a kind of supplement for the primary energies of yin, yang, chi and blood.72 In Chinese System of Food Cures, Dr. Henry C. Lu recommends chicken for underweight, poor appetite, diarrhea, edema, frequent urination, vaginal bleeding and discharge, shortage of milk secretion after childbirth and weakness after childbirth73—all symptoms of yin deficiency. He describes its characteristics as warm and sweet and its action as a tonic for the spleen. In Chinese medicine, digestion is a function governed by the spleen meridian system. Dr. Lu gives recipes helpful for fatigue, neurasthenia and memory. His remarks indicate it would be a food of choice for any case of malnutrition, burnout or digestive debility. Theresa describes chicken soup as “healing for everything!”74 She has nursed many patients through this chicken soup phase. My personal chicken soup was made with carrot, cabbage and potato. I had arrived at this combination by trial and error when the die-off reactions made everything problematic. I was able to eat it day after day and still find it delicious, strengthening and satisfying. As I learned about Chinese food therapy, I could see why it was so helpful. The root vegetables carrot and potato provided me some mild tonic benefit at a time when most herbal tonics were too strong for me. The cabbage had a cooling effect, promoted urination75 and nourished my digestion. I used only boneless, skinless chicken breasts at this time (only later was I able to tolerate soup made with bones), but the slight amount of fat they provided was a godsend. I was also able to make soups with lowfat whitefish some of the time. Ups And Downs During this time I would improve greatly with this protocol and be able to add more foods for a time, even butter, but then my old problems would return.
I understand now that my steady regimen was aiding my zinc deficiency and allowing me to eliminate copper. My tolerance for other foods would go up and I would improve again when I added eggs, but when I added other foods I would soon be in trouble again, and then the eggs would be too rich again. I realize now that when I could, I would go straight back to copper-rich foods. Ironically, copper excess can lead to a craving for copper in some individuals. “Although it’s a bit difficult to understand,” Gittleman writes, “many people who have high copper in their tissues have difficulty utilizing that stored copper. As a result, they become somewhat deficient in copper in their blood. Because of that deficiency, they often crave high-copper foods to give them a temporary energy high.”76 My copper fixes of choice were nuts, cereals and avocado. Thus we can find ourselves simultaneously in excess and deficiency of copper. This paradox can complicate any program of copper-zinc balancing. When in 2002 Theresa began incorporating hair mineral analysis into her practice, she recognized my problem with high-copper foods and urged me to begin to detoxify. I had avoided zinc supplements along with so much else when everything gave me a problem. When I reduced high-copper foods, my liver pain reappeared; when I tested zinc supplements, my liver pain also reappeared. I began to realize this copper thing could have been a part of my problem for some time. But I didn’t have a handle on it yet, and my efforts were erratic. When I read Gittleman’s book, all my years of struggle finally fell into place. The key point: copper is normally eliminated in the bile.77 The Bile Connection Liver pain is debilitating and frightening. When tested, my blood panels had been normal. The usual hepatic herbs gave me fits. Without knowing what I was doing, I had always opted for protecting myself and avoiding flare-ups. Now I set out to restore my gall bladder function. The more I learned, the more I was sure copper must be part of my problem. I came to understand that by reducing copper foods, I was allowing copper elimination. By beginning zinc supplementation, I was mobilizing copper elimination. I reasoned that my gall bladder function had shut down from years of nearly fat-free eating. I knew that my old mentor, Adelle Davis, had much to say on the subject, information I hadn’t been able to use until now. Ms. Davis won my heartfelt gratitude when she described the life of a gall bladder sufferer: “Individuals who have suffered acutely while passing a gall stone or when the gall bladder has been inflamed often become so fearful of food that they frequently live on self-imposed, severely restricted diets free from all fats without realizing that they are making their condition continually worse.”77 Here I read the only description I ever found of my plight. My suffering had been caused by the passing of copper, not gall stones, but I had repeatedly been given the same advice she criticizes—avoid fats to reduce digestive discomfort.78 Her program uses peanut oil79 to increase bile acids80 and recommends whole milk, cream and butter.81 I had long since recognized the epigastrium-gall bladder area just below the ribs on the right flank as the focus of my pain, and had for some time been using a Chinese formula “to clear damp heat”82 from the area. I got hold of some zinc supplements and some bile capsules containing 500 mg, and was ready to face peanut oil and butter.
There was considerable discomfort from the copper elimination, and some digestive upset to mark the transition, but, understanding now what I was doing, I was able to modulate the process. Within a few days I was eating soft-boiled eggs with butter and salad dressing with buttermilk and flax oil; by the end of the week, I was experimenting with chicken skin and bits of well-marbled roast beef. Talk about learning things the hard way! Digestive Recovery I have never again had to fall back on my frugal chicken soup diet, though I still make soups several times a week. Now I prefer turkey to chicken, because it’s richer, and I also make soups with beef, pork, lamb and seafood. Yes, seafood is very high in copper, but after a period of detox and after digestive recovery to restore a hearty appetite for red meat, copper just isn’t a major bugaboo any longer. In teaching traditional foods and working with chronic fatigue advocacy, I now frequently meet people who complain of fat intolerance, gall bladder pain or queasiness after rich meals. People are hearing the new information about good fats and are eager to enjoy salmon and butter, olive oil and coconut milk. It is startling to them to find they can’t easily go back to more traditional habits. I see this pattern in people who haven’t yet developed the multiple problems of lowfat plant-based diets and copper-zinc imbalance. After all, that was my first problem, too. We need to take this incipient digestive upset as a warning sign and find our way back to the foods of our ancestors. I feel that digestive recovery is the beginning, whether a person is coming from the standard American diet or some version of a light or fat-restricted diet. As in my case, the particular nutritional dilemmas a person has gotten into can tell a lot about the struggles developing in his or her body. Gittleman, who had studied the work of Paul Eck, develops the point made above: “Many people switch to a lighter diet because red meats and other types of animal protein feel ‘heavy’ in their system. Ironically, this feeling can develop from copper excess, or zinc deficiency, or adrenal insufficiency. Individuals with copper-zinc imbalance have trouble digesting and absorbing fat and protein in particular, so they often opt for diets that avoid foods rich in these nutrients.”83 Farther down the spiral from lighter diets to adrenal burnout, copper buildup becomes almost unavoidable. Adrenal burnout can lead to copper buildup in and of itself. Protein synthesis slows down, especially synthesis of the copper-binding protein ceruloplasmin, and liver detoxification falters.84 This can lead to, in Chinese terms, liver heat, or in more extreme form, liver fire, with symptoms of dizziness, headache and red eyes.85 Recall the headaches that marked my first problems, which were “cooled” by alkaline foods and cleansing herbs. In the full-on deficiency-fire state, waves of dizziness were constant, and my eyes were so sensitive I wore dark glasses in dimly lit rooms. The most easily available herb Tierra recommends for liver fire is yellow dock.86 Its energy is bitter and cool, it functions as an “alterative, cholagogue, astringent, aperient and blood tonic,” and he recommends it for skin disorders and as a purgative for bile congestion.87 With skin disorders, think zinc deficiency, and with skin disorders of liver fire, think of the widespread incidence of adult acne. 
Yellow dock’s action as a purgative, he tells us, is similar to that of rhubarb but milder.88 This combination of actions is especially valuable in our present context, since constipation and lower intestinal problems can be a direct consequence of reduced bile flow89 and low-fat diets.90 Tierra recommends 3-9 grams daily, in capsules.91 This is probably too high when copper is being cleansed. In these cases, I recommend any new herb be introduced carefully. Very small amounts may provoke a reaction, but the portion can be gradually increased as cleansing occurs. Yellow dock is recommended here as a digestive aid only; no single herb can clear such a complex condition as liver heat. It is because of this pervasive liver heat that many of our light dieters on the way down became avid salad and fruit eaters, or raw-food vegetarians, as I did. To go for all this cold food without understanding what it is doing can create new digestive problems. In Chinese medicine, the spleen meridian system, governing spleen, pancreas and stomach, is easily damaged by cold, a condition called deficiency of spleen yang. With this deficiency come poor digestion, fluid retention and a tendency towards mucus.92 A clear sign of this type of digestive damage is a tongue that is bloated and very pale, frequently heavily coated as well. This condition is frequently associated with candida overgrowth.93 The vegetarian with a cold spleen condition may be worried about “mucus-forming” foods and yet, because of liver heat or hot spots elsewhere in the system, crave cooling dairy products that worsen the spleen condition. The Chinese medical approach allows us to address liver heat with cooling liver herbs and the cold, damp spleen with herbs and foods that warm and protect the stomach. My chicken soup, although I did not fully understand it, fulfilled these functions very well. Tierra recommends mildly warming stomachic herbs like cardamom, caraway and dill, which, by aiding the circulatory function of the spleen system, benefit the liver as well.94 Dairy products can also worsen problems with zinc deficiency, according to Gittleman. Calcium can slow metabolism, already sluggish from poor digestion and excess copper, and if foods high in phytates are eaten with dairy foods, “. . . this combination of foods dramatically decreases the body’s absorption of copper-antagonistic zinc.”95 When I needed to improve digestion and transition to foods with higher nutrient density, I relied on non-gluten grains, vegetables, poultry and fish. Just by cutting out the gluten-containing grains, I avoided substantial amounts of copper as well as zinc-binding phytates. Although my program was very low-fat, it greatly improved my digestion and my very cold spleen condition. The way I ate then was very similar to the program Gittleman recommends, and I feel her guidelines can be helpful for those wishing to transition to richer traditional foods. Healing Foods This program is fairly neutral from the Chinese warm-cold point of view, avoiding an excess of cold foods. Fish is neutral from the copper-zinc viewpoint as well, as it does not contain great amounts of either copper or zinc. While fish is valuable for its rich nutritional profile, especially essential fatty acids, and is especially digestible for those first adding more protein to their diet,96 it is important to begin using small amounts of land-based proteins as one becomes able to do so. It is in part the mildly warming nature of chicken that makes it so good for digestion. 
Although eggs contain only 0.7 mg of zinc each, their zinc-to-copper ratio of 7 to 1 is nearly ideal,97 and properly raised eggs are rich in many accessory nutrients needed to aid detoxification. And red meats are among the most warming foods, with mutton and pork being especially recommended for the spleen.98 These land-based proteins are our richest and best-assimilated sources of zinc. Dark meat poultry and red meat contain the most fat in this group, and also significantly more zinc.99 Restoring Fat Metabolism We know that vitamins A and D in animal fats are essential for the absorption of minerals.100 Although Gittleman recommends use of enzymes and hydrochloric acid to aid digestion for those who have lost function,101 she does not provide an affirmative program for restoring fat digestion, such as the use of bile salts, nor does she recommend cod liver oil. She states that reversing copper overload will boost both fat digestion and fat metabolism,102 but I found I had to improve my fat digestion to begin to eliminate copper. Thus, her program stops short: to fully restore our mineral metabolism, we must get past the stage of careful fat restriction she advocates103 and embrace the full range of healthy natural fats, especially fats that will provide the all-important fat-soluble activators. In 1997, a significant article appeared in the Health Journal of the Price-Pottenger Nutrition Foundation, discussing “systematic acidosis resulting from glandular deficiencies that impair fat metabolism.”104 The author, a dentist, discussed how this acidosis was the cause of calculus (scale) deposited on teeth, and could be reversed by supplementation of bile salts. The article provides careful and detailed information on bile supplementation, which must be adjusted to individual need. Two tablets of 5-grain ox bile are to be taken with each meal, to be reduced to one if diarrhea occurs, and discontinued if diarrhea continues, which indicates that another source of fat disturbance is likely. Interestingly, while I benefited greatly from bile supplementation, I was never subject to dental calculus; a trial of bile salts therefore seems worthwhile in cases of liver or gall bladder congestion whether or not calculus is present. Some elimination of copper can begin as soon as a shift towards a more balanced diet is made, and is likely to cause some discomfort. As with the candida process, changes should be made slowly, backed up by digestive support. If copper is released faster than it can comfortably pass through the liver and gall bladder, copper levels in the blood can rise, with an increase in digestive discomfort, anxiety, headaches and other symptoms.105 To minimize these episodes of copper discharge, Gittleman recommends emphasizing nutrients that antagonize copper, that is, nutrients that reduce its absorption or aid in binding it for excretion from the body.106 The most important of these, of course, is zinc itself, as obtained from the land-based proteins mentioned above. Manganese and iron act to displace copper from the liver; vitamin B6 and niacin promote reversal of copper overload; molybdenum and sulfur, which act in the intestines, facilitate its excretion; and vitamin C, very importantly, chelates copper in the blood to facilitate its removal.107 By emphasizing food sources of these nutrients, we can mobilize inorganic copper and circulate it out of the system with minimum disruption. 
A diet providing ample animal protein, dark leafy greens, a variety of other vegetables and fruits, fish, small amounts of legumes and plentiful natural fats can meet these needs. If cold foods worsen your digestion, stick with soups, cooked vegetable dishes and stewed fruits, and take digestive enzyme supplements. Supplementation To perform this kind of metabolic work, supplementation is very helpful. For copper overload of long standing, or to obtain more immediate relief, it really becomes necessary. Readers of Wise Traditions are accustomed to using food-based supplements, and I always encourage these for the rich matrix of associated factors they provide, but to address serious conditions like copper toxicity, liver congestion, candidiasis and adrenal insufficiency, Theresa has found clinical supplementation to be essential.108 Gittleman recommends supplements in the following amounts, to be taken with a copper-free multiple vitamin: zinc, 10-25 mg; manganese, 5-15 mg; vitamin B6, 50-200 mg; and vitamin C, 500-3,000 mg.109 To this would be added pantothenic acid, 600 mg, to support the adrenals.110 Not mentioned by Gittleman, but of course very important, is a good quality cod liver oil. I have taken these supplements for years, and still do. I also take, and recommend, a natural trace mineral supplement (see the Resources section below) as a source of antagonists too often depleted from our soils. The product I use contains a mixture of sea bed and volcanic montmorillonite. The minute amounts of copper such products contain generally do not interfere with a copper-balancing program, perhaps because the copper is embedded in a mineral substrate. Hair Analysis In order to develop a more comprehensive program that matches your own metabolism, it is necessary to seek out hair mineral analysis and obtain a metabolic profile based on the mineral ratios it reveals. Unfortunately, most laboratories offering hair analysis services provide nutritional programs based simply on apparent deficiency of minerals in the hair, and perhaps levels of toxic metals. I had tried such a program early in my search for health, and found it offered little beyond supplementation I was already using. Pioneer mineral researcher Paul Eck, mentioned above, found that supplementation must be applied to correct critical mineral ratios, such as the ideal copper-to-zinc ratio in the tissues of 1:8. He had found that giving a particular mineral just because it showed up low on an analysis rarely succeeded in raising that mineral, but when he adjusted mineral ratios first, mineral levels would then rise.111 Gittleman’s work is based solidly on Paul Eck’s research, and Theresa is also seeing excellent results through her affiliation with a laboratory that uses his methods. This kind of metabolic rehabilitation is a long-term project, and requires using a group of supplements that are modularly interlocked to match each person’s pattern. In the case of copper overload, the copper found in hair tissue may not initially give a high reading, but telltale patterns of mineral ratios can reveal the likelihood of hidden copper.112 The laboratory we have used is listed in the Resources section below. I have been using a metabolic mineral balancing program since I began to address my own copper issues several years ago. My hair is retested several times a year and the supplement program adjusted accordingly. The program includes a supplement specifically intended to increase digestive elimination of copper. 
I also use zinc and B6 in higher-than-normal levels to address my pyroluria. Based on Pfeiffer’s research, I supplement manganese and zinc in a 1:20 ratio to facilitate urinary excretion of copper.113 Recovery When I was ill, my underweight condition at times approached emaciation, and for years all I could do was prepare my soups, eat them and return to bed. My digestive recovery five years ago has changed all that. With a steady diet of bone broths, meat, turkey, butter, eggs, cod liver oil and raw cheese since that time, I am today stout, active and happy for the first time in my life. I have the musculature to take regular exercise and—most astonishingly—have lost the frail frame I had struggled with all my life. Today at sixty years of age, I have the sturdy bones and rosy peasant cheeks of my Irish and German ancestors. And I have optimism and enthusiasm to bring a friendly word about real food to others who have been starving from the lack of it. Resources To find a practitioner in your area who utilizes hair mineral analysis (also referred to as tissue mineral analysis) according to the methods of Dr. Paul Eck, you may call Analytical Research Labs, Inc. at (602) 995-1580. Their website is www.arltma.com. They are located in Phoenix, Arizona. ARL is the provider of Endomet brand supplements. The copper-eliminative supplement Theresa recommended for me is called GB3, which contains 112 mg ox bile, hydrochloric acid, enzymes and black radish. Black radish is recommended by Gittleman for liver congestion and as a source of sulfur.114 Supplements provided directly to the consumer are generally lower in bile salts. When my stomach was still very cold, I could not use hydrochloric acid, and I searched for months to find a product that contained only ox bile and enzymes. I could only find one, designed by Dr. David Beaulieu of Kansas City, Missouri. It contains 65 mg ox bile and enzymes in a two-stage tablet. Multiple tablets can be used to give a comfortable bowel elimination. Since bile salts are resorbed from the small intestine,115 effects of bile supplementation are cumulative and the dose would change over time. Dr. Beaulieu’s company, called Preventics, also makes the Mont-Min 74 trace mineral supplement I use. Preventics can be contacted at (800) 888-4866 or www.askdrdavid.com. I generally recommend zinc be taken in chelated form. Ethical Nutrients, however, has a liquid zinc sulfate product called Zinc Status. Since zinc deficiency affects the sense of taste, you can test yourself at intervals with this product until your zinc-restoring efforts bring you up to speed. As long as your zinc status is deficient, the product will remain tasteless; if it takes on a characteristic obnoxious sulfur taste, you know that you are making progress.116 The product is available in health food stores or from www.ethicalnutrients.com. Herbalist Andrew Gaeddart is the genius (in Theresa’s opinion) behind Health Concerns Chinese herbal products. These products adapt traditional Chinese formulas for problems of American patients. They are at the heart of Theresa’s success with chronic illness. Quiet Digestion, containing magnolia bark, citrus peel and other herbs, promotes long-term digestive recovery and provides immediate aid for digestive distress. GB6 is the product which helped me reduce liver and gall bladder pain, used over time. 
Phellostatin is an outstanding candida regimen, which supports all affected systems as it eliminates yeast.117 Yin Chiao Jin aids all those “flu-like symptoms” from yeast or copper detox. Nine Flavor Tea is a superlative yin tonic formula of the old school, used to overcome the extreme weakness and insomnia that go with burnout and inflammatory conditions. To find a practitioner using Health Concerns products, call (800) 233-9355 or visit www.healthconcerns.com. Sidebar Copper and Zinc in Foods Copper-zinc imbalance, with its attendant digestive problems and danger of adrenal insufficiency, poses a major challenge to lowfat and other “light” diet systems which reduce or eliminate animal foods. Nutritionist Ann Louise Gittleman, in her book on this problem, Why Am I Always So Tired?, reminds us of the biological facts of the human diet: “Human beings evolved on animal protein and it’s virtually impossible to obtain adequate amounts of zinc any other way. Beef, for example, has a fourfold greater bioavailability of zinc than do high fiber cereals.”1 The ratio of copper to zinc in our tissues should be 1:8.2 Because stress, some medications (particularly oral contraceptives) and environmental copper can interfere with this balance, we need to maximize zinc in our diets to offset the copper found liberally in natural foods. Zinc cannot be stored,3 so we must rely on red meats, eggs and poultry as our optimum food sources.4 The zinc in these foods is not only more bioavailable than that in plant sources; the ratio of zinc to copper is also much higher, providing a buffer against other foods with a higher proportion of copper. The only plant food with an advantageous ratio of zinc over copper is pumpkin seeds.5 Once digestive vigor has been reduced and copper buildup has affected liver function, foods high in copper, or those that interfere with zinc, can be troublesome. Gittleman states that vegans, who often combine plant protein sources to increase protein intake, can be especially susceptible to copper toxicity.6 Soaking and sprouting of foods high in phytates should be a given, but while these methods make zinc more available, the ratio of zinc to copper is still low. Developing new sensitivities to high-copper foods like soy, yeast, nuts, mushrooms and shellfish can be a tip-off that copper is running high.7 Even low-copper foods such as dairy products can be problematic in excess; the calcium in these foods is a zinc antagonist, that is, it works against zinc in the body.8 Those looking to reduce stimulants now have another reason to do so: alcohol, coffee and sugar are all strong depletors of zinc, while chocolate and tea are problematically high in copper.9 Beyond foods already mentioned, many favorite health foods are strong copper contributors. Most grains and legumes, wheat germ, molasses, bran, dried fruit, sunflower seeds and organ meats carry copper ranging from 0.5 milligrams to several milligrams per serving.10 Needed in only trace amounts, copper has an RDA of only 0.6 milligrams for infants, 1 milligram for children under four and 2 milligrams for older children and adults.11 Given the many depletors and antagonists working against zinc, the RDA of 5 mg for infants, 8 mg for children under four and 15 mg for older children and adults12 is probably too low. At the root of our problems with copper and zinc is a generation of heedless nutritional guidelines which have produced widespread dietary imbalance and deficiency. 
We may have to avoid some nutritious high-copper foods while restoring digestion and reducing excess copper levels, but once we have succeeded in placing nutritious, high-density animal foods at the center of our food supply, the multiplying problems of copper-zinc imbalance can cease to be a cause for concern.
REFERENCES FOR SIDEBAR
1. Gittleman, Ann Louise, MS, CNS, Why Am I Always So Tired? Harper, San Francisco, 1999, p 35.
Meat, Organs, Bones and Skin
Posted on July 2, 2013 by Christopher Masterjohn
Nutrition for Mental Health
SUMMARY
• My anxiety disorders became seriously aggravated on a vegetarian diet but were resolved after including nutrient-dense animal foods in my diet.
• Consistent with my personal experience, seven out of eight studies have shown that vegetarians are more likely than their non-vegetarian counterparts to experience mental disorders.
• These studies cannot prove cause and effect, but vegetarian diets may induce a number of nutrient deficiencies that could contribute to the development of mental disorders.
• Vitamin B12, folate, methionine and glycine support the proper regulation of a biochemical process called methylation, which in turn regulates the neurotransmitter dopamine.
• This biochemical process contributes to the appropriate balance between mental stability and mental flexibility, which is needed for optimal mental health.
• Meat, bones, skin and organ meats such as liver provide a balance of the nutrients needed to support the proper regulation of methylation, and thus to support robust and vibrant mental health.
• Nutrient-dense plant foods are also beneficial.
When I look back on my life and consider my struggles with anxiety, nothing stands in sharper relief than the healing power of nutrient-dense animal foods such as meat, bones, organs and skin. In my late teens, I became a vegetarian, thinking I would save the environment, the animals and even my own health. Six months later I became a vegan, excluding all animal products from my diet. Rather than improving my health, however, I developed problems with digestion and lethargy, a mouth full of tooth decay, and a profound aggravation of the anxiety disorders I had struggled with since my mid-teens. After a year and a half, I slowly began including animal foods such as eggs, milk and eventually fish in my diet. Nothing seemed to help. After about two years, I caved in to strong cravings for red meat at Christmas dinner. I feasted luxuriously on such meats thereafter, and within two weeks my regular panic attacks had ceased. Nevertheless, I still suffered from the phobias and obsessive-compulsive disorder I had had prior to becoming a vegetarian. Several months later, I discovered the work of Weston A. Price. Aiming to cure my tooth decay, I began incorporating nutrient-dense animal foods such as cod liver oil, liver and other organ meats, bone broths, and animal skins into my diet. Not only did my tooth decay come to a crashing halt, but within months my anxiety disorders disappeared. I thus realized that my health, both physical and mental, had undergone a revolution. VEGETARIANISM AND MENTAL DISORDERS To understand why nutrient-dense animal foods seem to have cured my anxiety disorders, it makes sense to ask a simple question: was I alone? Or do others who exclude animal products from their diet also struggle with mental disorders? Prior to 2012, seven studies had addressed this question. 
Four found that vegetarians were more likely than non-vegetarians to have eating disorders,1,2,3,4 two found they were more likely to be depressed,5,6 one found they had lower self-esteem and more anxiety,3 and one found they were more likely to have contemplated or attempted suicide.1 One study conducted among Seventh-Day Adventists, however, found that vegetarians within this religious group had fewer negative emotions than their non-vegetarian counterparts.7 Although Seventh-Day Adventism does not require vegetarianism, it strongly encourages this way of eating. It is possible this study stands apart from the others because vegetarians within this group experience greater esteem among their peers, are more confident in their own spirituality, or are more conscientious in other areas of their lives just as they adhere more strongly to the teachings of their religion. Regardless of the precise reason for this one anomaly, six out of these seven studies found that vegetarians are more likely to experience mental disorders. Nevertheless, all of these studies have several limitations: they relied on self-reporting of mental disorders rather than on professional diagnosis; they were conducted in limited populations, most of them in adolescents, one in young women, and one in Seventh-Day Adventists; none of them were matched for sociodemographic characteristics, which are known to differ between vegetarians and their nonvegetarian counterparts; and none of them determined whether the subjects developed mental disorders before or after they became vegetarians. A study published in 2012 addressed each of these limitations.8 The study included over four thousand respondents to the German National Health Interview and Examination Survey and its Mental Health Supplement, reflecting the general population of Germany rather than a specific subgroup. Clinically trained psychologists and physicians assessed the prevalence of mental disorders by administering a diagnostic interview rather than relying on self-reporting. The investigators took into account socio-demographic characteristics such as age, education, sex, marital status and community size, which was important because vegetarians were younger, more educated, more likely to be female, less likely to be married, and more likely to come from an urban environment. Finally, the investigators determined whether vegetarians with mental disorders began their vegetarian diet before or after the estimated onset of their mental disorder. Compared to omnivores matched for socio-demographic characteristics, vegetarians were more than twice as likely to be depressed, more than 2.5 times as likely to suffer from an anxiety disorder, and over four times as likely to suffer from an eating disorder. We could interpret these data in three ways: vegetarianism might contribute to the development of anxiety disorders, a pre-existing mental disorder might make someone more likely to become a vegetarian, or an unknown factor might predispose someone both to become a vegetarian and to develop a mental disorder. 
For example, perfectionism is not a mental disorder and could be beneficial in certain contexts, but the trait could contribute to an anxiety disorder if it gets out of hand, and a perfectionist may see vegetarianism as a way of making their diet “perfect.” These interpretations are not mutually exclusive, however: someone might be more likely to become a vegetarian because of a particular psychological trait, but vegetarianism could then induce nutrient deficiencies that interact with that psychological trait to produce a disorder. In the German study, half of vegetarians with eating disorders, two-thirds of those with depression, and over 90 percent of those with anxiety disorders developed their mental disorder before becoming a vegetarian, suggesting that vegetarianism was not the singular “cause” of their mental disorders, at least in the large majority of cases. Nevertheless, as shown in Figure 1, vegetarianism could have made many of the subjects more likely to be diagnosed with a mental disorder by aggravating pre-existing negative psychological traits. In my own case, vegetarianism did not “cause” my anxiety disorders, but it seriously aggravated them, and including abundant amounts of nutrient-dense animal foods in my diet cured them. We should keep in mind that all eight studies examining the relation between vegetarianism and mental disorders are observational in design and therefore incapable of determining cause and effect, which would require an experimental design. Nevertheless, it is reasonable to suggest the possibility that seven out of eight of them found vegetarians are more likely to suffer from mental disorders at least in part because nutrient-dense animal foods are required for optimal mental health. SUPPORTING METHYLATION There are a number of potential deficiencies and imbalances that could develop on a diet devoid of nutrient-dense animal foods: some people may become deficient in cholesterol if they do not make enough of their own; plant goitrogens, some of which require vitamin B12 and sulfur amino acids for their detoxification, could contribute to thyroid problems; deficiencies of vitamin B6, long-chain omega-6 and omega-3 fatty acids, zinc, and fat-soluble vitamins A, D and K2 could also develop. This article, however, will focus on the role of vitamin B12, sulfur amino acids, and glycine in supporting and regulating a process known as methylation, which is critical for mental health. We can see how important these nutrients and the process of methylation are to mental health by considering the neurological and cognitive consequences of severe vitamin B12 deficiency. This condition involves nervous system degeneration, loss of sensation beginning in the toes and progressing to the feet and hands, stiffness and involuntary muscle spasms, disturbed gait, and mental disturbances ranging from mild personality changes and memory loss to psychosis and occasional delirium. Although we do not yet completely understand the exact mechanisms by which vitamin B12 deficiency causes these problems, the primary role of vitamin B12 within our bodies is to support the process of methylation, so a breakdown in this process is almost certainly an important part of the picture. Methylation is a fancy biochemistry term that simply means the addition of a carbon atom with a small assortment of hydrogen atoms (a “methyl group”) to a wide variety of molecules. 
Methylation is required for the synthesis of many compounds such as creatine, and the regulation of many others, such as dopamine. As such, it is critical for a broad range of biological processes including tissue growth and repair, cellular communication, and controlling cancer. Among the many molecules whose production or regulation is dependent on methylation, both creatine and dopamine are critical to mental health. This article, however, will focus on dopamine. TONIC AND PHASIC DOPAMINE In order to begin exploring the relationship between methylation, dopamine and mental health, we must first understand the difference between tonic and phasic dopamine.10 As shown in Figure 2, tonic dopamine is the modest amount of dopamine that has a constant presence in our brain. It is like a stable body of water, and is important for mental stability. Phasic dopamine is like a wave that comes crashing in, making an appearance for only fractions of a second, and is important for mental flexibility. Methylation regulates tonic dopamine, while our brains have other ways of regulating phasic dopamine. Nevertheless, as shown in Figure 3, our brains judge the size of the phasic dopamine “wave” by how high it stands above the background of tonic dopamine. A higher level of tonic dopamine makes the “wave” of phasic dopamine look a lot smaller, and our brains react to it accordingly. Thus, as shown in Figure 4, methylation regulates the balance between mental stability and mental flexibility: too much methylation will favor too much flexibility, not enough methylation will favor too much stability, and the level of methylation that is just right will provide the appropriate balance between the two. Thus, our goal is not to increase methylation or decrease methylation, but to provide our brains with the raw materials they need to regulate the process properly. MENTAL STABILITY AND FLEXIBILITY Two analogies should prove useful to help us understand the need to balance mental stability with mental flexibility. In the first, we could imagine a potter who makes clay flexible by moistening it before attempting to make something out of it. Too little moisture will lead to brittle clay: it is too dry to shape into anything, and applying enough force to change its shape will simply make it break, exposing rough and sharp edges. Too much moisture will make it easy to manipulate, but no shape given to it will hold. The right amount of moisture will make the clay malleable enough to manipulate into something useful or beautiful, and yet stable enough to retain the shape given it. Similarly, not enough methylation could lead to “brittle” mental states. Such states are difficult to change, but when they do change, the transitions are sudden and without warning. This brittleness could lead to dangerous situations. For example, ordinarily when we get angry, the process is gradual enough that we may realize what is happening to us and stop ourselves from acting out in our anger, or someone else may notice that we are becoming angry and intervene to diffuse the situation. If our mental states are too brittle, however, we may act violently without warning, giving neither ourselves nor those around us any opportunity to recognize what is happening and intervene. Alternatively, too much methylation could make our minds like a bowl of liquid clay: easy to make a mess with, but difficult to shape into something beautiful or useful. 
In the second analogy, we could consider our consciousness like a net through which thousands of thoughts fly every day. These thoughts could be about basic biological drives and needs like food, sex, and sleep; they could be about the multitude of things we need to get done; or they could be thoughts that motivate us, whether to do good things or to do things that would get us into trouble. To achieve mental health, our net of consciousness needs enough flexibility that we are able to manipulate it as each thought approaches, choosing either to let it pass through or to hold on to it. This net also needs enough stability, however, to hold onto beneficial thoughts for as long as they are needed. Without flexibility, we hold onto everything that comes our way indiscriminately. Without stability, we cannot hold onto anything at all. With a proper balance, we become masters of our thoughts rather than their captives. Evidence from genetic studies supports the role of methylation in maintaining this balance. Some of us have a high or low rate of methylating dopamine for genetic reasons. In those who methylate dopamine at a low rate, unpleasant pictures cause a dramatic stimulation of activity in emotional centers of the brain.11 These people are also much more likely to invest energy into cognitive activity when they are exposed to these emotionally stimulating pictures, and the more energy they invest in cognitive activity the less likely they are to become noticeably disturbed.11 This suggests that low methylation contributes to excessive mental stability. The unpleasant image gets “stuck” in the person’s mind instead of passing through uneventfully, and the person must invest a lot more mental energy to break free of the image’s grasp. Since being held captive by our own thoughts is a central problem in mental disorders such as depression and anxiety, some researchers have suggested that those who methylate dopamine at a low rate are “worriers,” while those who methylate dopamine at a high rate are “warriors.” While the distinction has some merit, making low methylation the “bad” trait and high methylation the “good” trait is too simplistic. Between the “worrier” and the person who indiscriminately engages in every battle, there lies the cautious person who picks and chooses her battles. Moreover, genetic studies show that while those who methylate dopamine at a low rate have more difficulty with emotional processing, those who methylate dopamine at a high rate have more difficulty with cognitive processing.12 Psychoses, such as those seen in schizophrenics10 or those suffering from severe vitamin B12 deficiency,13 can manifest in some people with symptoms of excessive mental stability and in other people with symptoms of excessive mental flexibility. Examples include “rigidity of thoughts” on the one hand, and “flight of ideas” on the other. Perhaps the clearest indication that balance is best is that those with genetically high or low rates of methylating dopamine constitute roughly equal proportions of the population, and the majority of us have the genetics for an intermediate rate. If one trait were the “bad” one and the other the “good” one, natural selection would have weeded the “bad” one out long ago. 
The question before us, then, is this: regardless of genetics, what kind of nutritional approach can we use to provide our brains with the raw materials they need to maintain the right amount of methylation to support the appropriate balance of mental stability and flexibility needed for optimal mental health? MEAT, ORGANS, BONES, AND SKIN As shown in Figure 5 and discussed in more detail in the Fall 2012 issue of Wise Traditions,14 the most basic nutrient we need for the process of methylation is the amino acid methionine. The “meth-” in the word “methionine” refers to this process. Animal proteins are about twice as rich in methionine as plant proteins as a proportion of total protein. Plant foods, moreover, tend to contain much less protein than meats. People who exclude all animal products from their diets thus likely consume three to five times less methionine than those who eat a diet rich in animal products, leading to a dramatic decrease in the raw materials needed for methylation. As also shown in Figure 5, consuming less methionine should generate less homocysteine. Paradoxically, however, compared to omnivores, vegetarians are twice as likely and vegans are three times as likely to have elevated homocysteine.15 Figure 5 provides a resolution to this paradox: while vegetarians and vegans may generate less homocysteine, they also have lower intakes of vitamin B12, which is needed to recycle homocysteine back to methionine. Indeed, using the highest quality markers of vitamin B12 status, investigators have estimated that up to 73 percent of vegetarians and up to 90 percent of vegans are deficient in B12.15 This should be unsurprising since vitamin B12 is found almost exclusively in animal products, and even that which occurs in eggs, a key vegetarian source of animal protein, is poorly bioavailable.16 Thus, methylation takes a double-whammy: less methionine is available to begin with, and what is available often gets trapped as homocysteine rather than being recycled. Figure 5 provides another key part of the balance. When methionine concentrations rise, for example after eating a protein-rich meal, the amino acid glycine acts as a buffer to prevent excessive methylation. Although animal foods are not richer in glycine than plant foods as a proportion of total protein, a diet that includes animal products provides more glycine than one that does not simply because it is richer in total protein. Vegetarians excrete almost twice the level of a unique marker of glycine deficiency in their urine as omnivores.17 This suggests that excluding animal products from the diet could not only lead to a generally inadequate level of methylation because of lower intakes of methionine and vitamin B12, but the lower intake of glycine could also lead to transient periods of excessive methylation. This could theoretically result in seesawing between excessive mental stability and excessive mental flexibility. The purpose of this article, however, is not to denigrate vegetarian diets but to emphasize the importance of nutrient-dense animal foods. A standard omnivorous diet is hardly the ideal. Even omnivores excrete substantial amounts of the marker of glycine deficiency discussed above in their urine.17 This could be because the typical omnivore fails to make use of skin and bones in their diet. 
Protein from skin is three times richer in glycine than meat, while protein from bones is six times richer.14 Thus, most omnivores may stand to gain substantial improvements in mental health by including glycine-rich skin and bones (in the form of bone broth) in their diets. Moreover, Figure 5 shows that folate assists vitamin B12 in its support of the methylation process. Folate is found primarily in legumes, leafy greens and liver. Vegetarians tend to consume more leafy greens and legumes than omnivores, and most omnivores fail to take advantage of liver or other organ meats. Many omnivores may thus improve their mental health even further by including folate-rich plant foods and liver in their diets. HARNESSING GOOD NUTRITION FROM ALL SOURCES Vegetarians and vegans may adhere more strongly than omnivores to other health-promoting habits as well. This is especially important to consider if we are interested in preventing all diseases rather than just mental disorders. For example, Figure 5 shows that glycine helps convert homocysteine to glutathione, the master antioxidant and detoxifier of the cell, and a key regulator of protein function. We might predict from this that vegetarians and vegans should have lower glutathione status than omnivores because of lower intakes of methionine and glycine. Some studies, however, have shown that while vegans have lower glutathione status than omnivores, vegetarians have slightly higher glutathione status.18 Unlike the vegans, the vegetarians in such studies may have been consuming plenty of milk and eggs. Thus, the vegetarians and omnivores may have had similar intakes of methionine and glycine. Both the vegetarians and vegans may have been consuming more fruits and vegetables. These provide vitamin C, which spares glutathione from oxidation, polyphenols, which increase the production of glutathione, and, especially in their raw state, glutathione itself. Adequate glutathione status protects against degenerative diseases of all kinds. The best way to support glutathione status would likely be to consume a traditional diet that includes plenty of nutrient-dense foods of all kinds: meat, organs, bones, skin, folate-rich legumes and leafy greens, and fresh fruits and vegetables rich in vitamin C, polyphenols, and glutathione. VIBRANT MENTAL HEALTH Overall the evidence supports a key role for nutrient-dense animal foods in mental health. Seven out of eight relevant studies show vegetarians have a higher risk of mental disorders than omnivores. These studies cannot demonstrate cause and effect, but both dietary and biochemical data suggest that vegetarians are less able than omnivores to support methylation, and are thus likely less able to support the appropriate balance between mental stability and flexibility needed for optimal mental health. Standard meat-inclusive diets are hardly ideal, however. We should emphasize a wide variety of nutrient-dense foods, including not only meat, but also many animal foods banished from our modern menus, especially bones (usually as bone broth), skin, and organs. Such a diet is the surest way to obtain the robust and vibrant mental health of our ancestors.
Dioxins in Animal Foods: A Case for Vegetarianism?
Posted on October 17, 2005 by Christopher Masterjohn
Table of Contents
• Introduction
◦ Sidebar: Dioxins: Some Background
• Exposure to Dioxins
◦ Sidebar: A Modern Threat? 
• Dioxin and Cancer: “Sufficient Evidence” Not Required
◦ Sidebar: Dioxins in Pastured Animal Products?
• Non-Cancer Effects of Dioxins
• Dioxins: It’s Not Just about the Meat
◦ Table 1: Dioxin Concentrations in Foodstuffs, United States 2003
◦ Table 2: Dioxin Concentrations in Foodstuffs, Finland 2004
◦ Table 3: Dioxin Concentrations in Foodstuffs, The Netherlands 1999
◦ Table 4: Dioxin Concentrations in Foodstuffs, Greece 2002
• Factors Affecting Dioxin Toxicity
• Dioxin Toxicity and Vitamin A
• Dioxins, Vitamin A, and Cancer
• Dioxins, Vitamin A, and Non-Cancer Toxicity
◦ Table 5. Effects Shared by Vitamin A Deficiency and Dioxin Toxicity
• Dioxin Toxicity, Free Radicals, and Antioxidants
• Dioxin Toxicity: Vegetarian Versus Traditional Diets
◦ Table 6. Vitamin A Content of Animal Foods vs. Plant Foods
• Dioxin Shmioxin: It All Comes Back to Weston Price
• Abbreviations Used in this Review
• References
• About the Author
The research of Dr. Weston A. Price, documented in his classic volume Nutrition and Physical Degeneration, demonstrated the absolute necessity of certain fatty animal foods for good health. However, a challenging argument against eating animal foods, especially animal fat, arises from vegetarian circles. This argument focuses on a class of chemicals called dioxins, and suggests that in the modern world, overburdened by pollutants, these fat-soluble chemicals accumulate specifically in the fatty tissue of animal products, making a vegetarian, even vegan, diet a necessity. For example, one vegetarian website argues that “nearly 95 percent of our dioxin exposure comes in the concentrated form of red meat, fish, and dairy products, because when we eat animal products, the dioxin that animals have built up in their bodies is absorbed into our own,” and that eating dioxin-laced animal products will make us vulnerable to “a wide range of effects, including cancer, depressed immune response, nervous system disorders, miscarriages, and birth deformities.”1 The same argument appears in environmentalist circles as well. For example, the Pennsylvania-based environmental organization ActionPA’s “Dioxin Homepage” argues that “[t]he best way to avoid dioxin exposure is to reduce or eliminate your consumption of meat and dairy products by adopting a vegan diet.”2 Thus, this argument for vegetarianism essentially builds on a series of three points:
• Dioxins are potent human carcinogens, endocrine disruptors, reproductive disruptors and immune disruptors;
• Animal products are uniquely high in dioxins;
• Avoiding the harmful effects of dioxins is primarily dependent upon minimizing dioxin intake, and therefore avoiding animal products.
The assertion that dioxins accumulate specifically in animal products is simplistic and inaccurate, and in fact a diet rich in pastured animal products provides protective nutrients, especially vitamin A, that directly oppose the toxic actions of dioxins in animal experiments, while a diet rich in most plant fats provides compounds that enhance the actions of dioxin. The argument that we should avoid animal products because of their dioxin concentration is thus no less flawed than the argument that we should avoid animal products because they contain saturated fat and cholesterol. 
Dioxins: Some Background The prototypical dioxin compound is 2,3,7,8 tetrachlorodibenzo-p-dioxin, abbreviated as “TCDD.” The word “dioxin,” however, refers more broadly to dioxin-like compounds from three classes: polychlorinated dibenzo-p-dioxins (PCDDs, including TCDD), polychlorinated dibenzofurans (PCDFs), and polychlorinated biphenyls (PCBs). Not all PCDDs, PCDFs, and PCBs are considered dioxins. Only 17 out of 210 PCDDs and PCDFs are considered dioxin-like, and only 11 out of 209 PCBs are considered dioxin-like. The precise positioning pattern of chlorine atoms on the molecule determines whether or not it is dioxin-like, and it is important not to confuse the PCBs classified as dioxins with other PCBs that are believed to be toxic through non-dioxin-like mechanisms.3 The relative toxicity of dioxins is expressed in relation to the toxicity of TCDD, the most potent dioxin. A “toxicity equivalency factor” (TEF) relates the degree of toxicity of a specific PCDD, PCDF or PCB to the toxicity of the prototypical TCDD, and the TEF is then multiplied by the amount of that particular dioxin compound in a food to yield a “toxicity equivalent quantity” (TEQ). The sum of TEQs from all dioxin compounds within a given foodstuff estimates the presumed degree of toxicity contained within that foodstuff.3 A higher amount of TEQs doesn’t necessarily mean that there is a greater absolute quantity of dioxins in the food, since the TEQ gives greater weight to the more potent dioxins. So, a food with a smaller total amount of dioxins but a more potent specific compound could have a higher TEQ value than a food with a higher quantity of dioxins but less potent specific types of dioxins. However, the TEQ is not a complete indicator of how toxic the food is, because some of these compounds, such as PCBs, also have toxicity that operates through non-dioxin-like mechanisms. Exposure to Dioxins Although 95 percent of human exposure to dioxins is believed to come from food,3 this figure deceptively overstates the impact of food-based dioxins, because industrially exposed populations have been exposed to 10-1000 times higher concentrations of dioxin than the general population.4 And even at these high exposures the evidence of dioxin-induced harm is inconclusive at best. Since the 1970s, after an historical peak in the 1950s and 1960s, sources of dioxins released into the environment have changed, and the levels have dramatically declined,4 due to government regulations and to the advancement of technology. The US and other countries have banned the use of pesticides and herbicides such as 2,4,5-trichlorophenoxyacetic acid and hexachlorophene, the production of which was once a primary source of dioxin contamination. Alternatives to the bleaching of paper with free chlorine have further reduced or eliminated dioxin production. The dioxin contribution of municipal and medical waste incineration has decreased by over 90 percent because of technological advances in waste disposal.5 Open barrel burning of trash is now the primary source of dioxin released through human agency, while modern incinerators make a comparatively negligible contribution. Certain metal refining processes also lead to dioxin generation. The other major contributors are natural, including volcanoes5 and forest fires.6 Human body burdens of TCDD, the most potent dioxin, in the US have decreased 10-fold, and total dioxin TEQs have decreased 4-fold to 5-fold between 1972 and 1999. 
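Since TEQs come up repeatedly in what follows, here is a minimal sketch, in Python, of the TEQ arithmetic described under “Dioxins: Some Background” above. The congener names, concentrations and TEF values below are hypothetical and chosen only for illustration, apart from TCDD’s defining TEF of 1.0; real assessments use the official WHO TEF tables and laboratory-measured concentrations.

```python
# Minimal, illustrative sketch of the TEQ arithmetic (not an official calculation).
# TCDD carries a TEF of 1.0 by definition; the other congener names, TEFs and
# concentrations are made up for this example.

# Hypothetical measured concentrations in a foodstuff (picograms per gram of fat)
concentrations_pg_per_g = {
    "2,3,7,8-TCDD": 0.05,                    # the prototypical, most potent dioxin
    "hypothetical PCDF congener": 0.50,
    "hypothetical dioxin-like PCB": 5.00,
}

# Toxicity equivalency factors relative to TCDD
tef = {
    "2,3,7,8-TCDD": 1.0,                     # 1.0 by definition
    "hypothetical PCDF congener": 0.1,       # assumed value for illustration
    "hypothetical dioxin-like PCB": 0.0001,  # assumed value for illustration
}

# Each congener contributes concentration x TEF; the food's total TEQ is the sum.
teq = sum(concentrations_pg_per_g[c] * tef[c] for c in concentrations_pg_per_g)
print(f"Total: {teq:.4f} pg TEQ per gram of fat")

# Here 0.05 pg of TCDD contributes as much TEQ as 0.50 pg of the hypothetical PCDF
# and 100 times more than 5.00 pg of the weak PCB, which is why a food with a
# smaller total quantity of dioxins can still carry a higher TEQ.
```

The weighting also illustrates the caveat above: a TEQ captures only dioxin-like toxicity, so it says nothing about whatever non-dioxin-like effects a PCB may have.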
Given the typical half-life of dioxins in the body, this decline in body burdens means that dioxin exposure during this period has decreased by a full 95 percent!6 Similar observations have been found in other countries. For example, dioxin concentration in the breast milk of Japanese mothers declined by 87 percent between 1974 and 1998.7 Dioxin intake declined about 90 percent in the Netherlands between 1978 and 1999,8 and in Finland dioxin exposure declined 50 percent over the course of the 1990s alone.9 We will never know exactly what level of dioxins Price’s healthy primitives or other premodern societies were exposed to. However, natural sources of dioxins like volcanoes and, more significantly, forest fires are now primary sources of dioxins. Pre-modern populations would also be expected to have had additional exposure through the direct inhalation of fumes from the incineration of heating and cooking materials (living, for example, in thatched houses without chimneys, as Price described the primitive Gaelics), as well as through the use of incinerated materials as soil fertilizer (such as slash-and-burn techniques or the use of smoke-impregnated thatch as a fertilizer, both described by Dr. Price). It is therefore not unreasonable to conclude that we are now approaching a level of dioxin exposure similar to that of pre-industrial populations. Even by conservative estimates, no one in the US is currently consuming a level of dioxins that would be expected to exert physiological harm. The World Health Organization (WHO) developed what is called a “tolerable daily intake” (TDI) for dioxins based on the intake levels that produce decreased sperm count, immune suppression and genital malformations in the offspring of exposed rats, and neurobehavioral effects and endometriosis in the offspring of exposed monkeys.4 However, since the WHO’s TDI is supposed to assume the greatest degree of sensitivity, in order to yield the safest and most conservative estimate, the harm done to male rats exposed during gestation is the primary basis for the TDI.6 Using this estimate, taken from the most sensitive individual rats, the WHO then added a “safety factor” of 10 to yield a TDI of 2 picograms (pg) TEQ per kilogram of body weight.4 (A picogram is a trillionth of a gram or a billionth of a milligram; for an 80-kilogram adult, the TDI works out to 160 pg TEQ per day.) This means that a human whose intake of dioxins meets the WHO’s TDI is consuming only one-tenth of the amount required to yield, after a lifetime of exposure, the body burden that produced the minimum physiological effect not in the most sensitive adult or juvenile rat, but in the most sensitive rat during gestation, the critical period when a developing organism is far more sensitive than at any other time. According to a 2005 study covering the years 1999 through 2002, only 1 percent of 2-year-olds in the United States exceeded the TDI in 1999 and 2000, and this excess of the TDI was very small. The risk to children is probably overestimated since the TDI is based on body weight alone and does not take into account the fact that children have higher fecal excretion rates of dioxin, nor does it take into account the fact that, since we are experiencing a decline in dioxin exposure, current exposures will overestimate the cumulative body burden that will be reached over time. In 2001 and 2002, no intakes at any age in the US were estimated to exceed the TDI.6 A Modern Threat? Dioxins are not merely a modern industrial phenomenon. 
Chlorinated organic compounds are produced naturally, by biological and abiotic means, have been found in coal samples dating back 300 million years, and are produced by cyanobacteria, which have existed for billions of years.a There are 4,519 known naturally occurring organohalogens, 2,320 of which are organochlorines. Bleach, chlorine gas, and organochlorines are naturally produced in the human body. Brominated dioxins are produced biologically by sponges as a defense mechanism, while chlorinated dioxins are naturally produced by the decay of plant matter in peat bogs, by the incineration of wood in forest fires, and by gases released from volcanoes.a The smoke of fires, to which primitive peoples, unlike moderns, were exposed on a daily basis, contains between 10 and 40 nanograms of chlorinated dioxins per gram of smoke.b A single gram of smoke thus contains between 125 and 500 times the amount of dioxin that an 80 kg adult consumes from food per day (10 nanograms is 10,000 picograms, so this comparison implies a typical dietary intake of roughly 80 pg TEQ per day, about half the WHO’s TDI for an adult of that weight). Wood naturally contains chloride compounds that are oxidized under high heat, producing chlorine that readily reacts with organic compounds to form organochlorines, including dioxins. Chlorine released into the atmosphere by industry makes a very small contribution to the chlorine available for this reaction: of the 4 million tons of methyl chloride, the most abundant atmospheric form of chlorine, produced each year, only 10,000 tons originate from industry.a Therefore the healthy pre-modern groups that Price studied, who thrived on diets rich in animal products, probably consumed some level of dioxins in their food, possibly rivaling our own consumption. What has changed in the modern era is not the introduction of chemical pollutants, but the disappearance from the modern menu of the protective factors abundant in traditional diets, factors that have protected us from pollutants throughout history. a. Gribble, Gordon W., “Amazing Organohalogens,” American Scientist Online, Vol 92, No. 621 (2004), 342-349. b. US EPA, “A Summary of the Emissions Characterizations and Noncancer Respiratory Effects of Wood Smoke.” EPA-453/R-93-036, as cited in Citizens for Safe Water Around Badger, “Fact Sheet: Open Burning at Ravenna Arsenal,” http://www.cswab.org/ravenna.html Accessed October 2, 2005. Dioxin and Cancer: “Sufficient Evidence” Not Required Although the World Health Organization’s (WHO) International Agency for Research on Cancer (IARC) designated TCDD (but not the other dioxins) as carcinogenic to humans (Group 1) in 1997, TCDD does not actually pass the test for carcinogenicity. For decades, sufficient evidence of carcinogenicity to humans was a necessary criterion for classification of a substance as a Group 1 carcinogen. TCDD was the second chemical whose classification utilized the IARC’s 1990 change of criteria, by which a substance could be judged carcinogenic to humans even if “. . . evidence in humans is less than sufficient but there is sufficient evidence of carcinogenicity in experimental animals and strong evidence in exposed humans that the agent . . . acts through a relevant mechanism of carcinogenicity.” [emphasis added]10 Dioxins do not initiate the transformation of a normal cell to a cancerous cell in any species. 
TCDD has been shown, however, to be a very powerful promoter of cancers that are first initiated by another carcinogen, and is thus considered a "non-genotoxic carcinogen." For example, one study using mouse fibroblasts in cell culture found TCDD to enhance the carcinogenic effect of N-methyl-N'-nitro-N-nitrosoguanidine over 3-fold, and to enhance the carcinogenic effect of 3-methylcholanthrene almost 4-fold. Yet in the absence of these carcinogens, TCDD could not initiate cancer foci even at doses 1000 times higher than those used for its promoting effect and 1000 times higher than the dose of the genotoxic carcinogens used for initiation of cancer.11 The cancer-promoting effects of dioxin are not consistent across species or across tissues. In fact, TCDD has been used to inhibit estrogen-dependent breast cancer in rodent models and cultured human cells, leading researchers to look into the development of anti-cancer drugs from this dioxin.12 The fact remains, however, that TCDD can be a powerful cancer promoter in certain tissues of certain species. The mechanism by which TCDD exerts its toxic effects is believed to be mediated by its binding to the aryl hydrocarbon receptor (AhR), a receptor that is also involved in mediating responses to polynuclear aromatic hydrocarbons, combustion products and numerous phytochemicals such as flavonoids and indole-3-carbinol. Once bound, the TCDD-AhR complex moves into the nucleus, where it binds to the aryl hydrocarbon receptor nuclear translocator (Arnt) protein. Finally, this TCDD-AhR-Arnt complex binds to DNA to induce the expression of the cytochrome P-450 1A1 gene.5 Less is known about precisely how this activation of the cytochrome P-450 system leads to toxicity and carcinogenesis, but the toxic effects of TCDD are usually correlated with its activation of this system and appear to be dependent upon it. The WHO's argument for TCDD's inclusion as a Group 1 carcinogen despite its failure to fulfill the criterion of sufficient evidence of carcinogenicity in humans relies on the following reasoning:
•TCDD is a multi-site carcinogen in animals acting through the AhR;
•The AhR is highly conserved across species and acts in a similar way in humans as it does in animals;
•Tissue concentrations in humans where epidemiological studies have observed increased risk of cancer are similar to those of rats exposed to carcinogenic doses in the laboratory.13
Yet, although the AhR is highly conserved across species, the carcinogenicity and toxicity of TCDD is not. For example, the lethal dose of TCDD varies 5000-fold across species.5 Activation of the AhR cannot be sufficient to induce cancer, because the effect of indole-3-carbinol, a substance found in cruciferous vegetables, is also mediated by the AhR, yet indole-3-carbinol is used to inhibit cancer.14 TCDD's inhibition of estrogen-dependent breast cancer in rodent and human mammary cells is also mediated through the AhR.12 In a major review on dioxins published in 2003, Phillip Cole and his co-workers point out that TCDD acts as a carcinogen in certain tissues of certain species, not in many tissues of any one species. The variation across species as to which tissues are vulnerable to carcinogenicity is not an argument for multi-site carcinogenicity in humans, but an argument against generalizing from species to species.
The types of cancer induced in animals by TCDD "bear little correspondence to those reported among humans exposed to TCDD," and, while the tissue concentrations of TCDD are similar in animals who develop cancer and humans observed to have an increased risk of cancer, "even the few positive epidemiological studies of TCDD-exposed populations generally report at most a minimal increase of total cancer, while in rats the increase is much greater."13 The fact that activation of the AhR is not consistently linked to cancer, that the response of animals to TCDD varies widely, and that the types of cancer and magnitude of increased risk observed in humans bear little resemblance to the types of cancer and magnitude of increased risk in rats is sufficient to refute the WHO's inclusion of TCDD as a Group 1 carcinogen by its own criteria. A question remains: Do humans exposed to high concentrations of dioxins really exhibit an increased risk for cancer? The Cole review refutes this hypothesis, showing that the WHO used its evidence selectively and that researchers failed to appropriately adjust for exposure to other carcinogens.13 The WHO's argument rests on epidemiological evidence from industrial and occupational exposure, populations that have been exposed to 10-1000 times the concentrations of TCDD compared to the general population.4 While admitting the absence of a strong case for the elevation of any specific cancer, the WHO compiled four major cohort studies and found a 40 percent increased risk for all cancers combined among "highly exposed" workers, a definition that differed between studies. Yet the WHO excluded from this compilation a study by Kogevinas that the very same monograph referred to as ". . . the largest overall cohort study of [TCDD]-exposed workers," and which included the data from the other four cohorts. The WHO argued that it had to be excluded because it included individuals with lower TCDD exposures; but, as Cole and his colleagues point out, the data were reported separately for those who were "highly exposed" and those with lower exposure. Therefore the Kogevinas study could, and should, have been included. The Kogevinas study found a 20 percent increased risk for all cancers with occupational dioxin exposure, but those who were most highly exposed (20 or more years of work experience) had an 8 percent decreased risk of all cancers.13 The Seveso cohort study described the highest exposure to TCDD ever documented in a population. Seveso was the site of a 1976 accident at a chemical manufacturing plant in which a dense cloud of TCDD was released from a reactor in quantities measured in kilograms over 10 square miles, necessitating the evacuation of 600 homes.15 Yet follow-up of the Seveso population reveals that "all-cause and all-cancer mortality did not differ significantly from those expected in any of the contaminated zones."13 Cole and his team also noted that in numerous studies, confounding factors were not taken into account:
•The NIOSH cohort study used smoking information from industrial plants 1 and 2, where there was no lung cancer elevation, but did not record smoking data from plants 8 and 10, where lung cancer was elevated and attributed to dioxin exposure.
•In the same study, two deaths from mesothelioma could have reflected exposure to asbestos, and workers were also exposed to the bladder carcinogen 4-amino-biphenyl; neither exposure was taken into account, and cancers in those exposed to these known carcinogens were attributed to dioxin.
•In the Zober study, 35 of 37 cancer cases were smokers, and 10 of 11 respiratory cancer cases were smokers, yet cancer risk was assumed attributable to TCDD.
•The one study (Ott and Zober) "with even minimally adequate information on smoking" found no statistically significant relationship between respiratory cancer incidence or mortality and TCDD.
•No attention was given to possible confounding by socioeconomic class, "even though the individuals most exposed to TCDD frequently are from the less privileged socio-economic groups that have high overall mortality, including mortality from all cancer."13
All of the epidemiologic studies published before 1997 that were not included in the WHO's IARC Monograph found no association between TCDD exposure and increased risk for cancer or mortality, including those by Dalager and team concerning non-Hodgkin's lymphoma, Watanabe and team concerning overall mortality, Bullman and team concerning testicular cancer, and Dalager and team concerning Hodgkin's disease.13 After the WHO's IARC classified TCDD as a Group 1 carcinogen in 1997, subsequent reviews and studies began to rely on the IARC's interpretation of earlier study results, rather than the study results themselves. Of the follow-up studies since 1997, the data of Steenland and team show that longer follow-up decreased the magnitude of associations previously found in the same cohort and caused loss of statistical significance; Pesatori and team found that non-cancer mortality in Seveso, the site of the highest exposure to dioxins ever documented, did not differ from that of the general population; and Ketchum and team found 30 percent fewer deaths from cancer in US Air Force veterans who were highly exposed to dioxins.13 Thus, while TCDD is claimed to be a non-genotoxic multi-site carcinogen, the evidence suggests that the wide variation in responses to dioxins across species prevents generalization to humans, and that the failure of dioxin exposure to act as an independent risk factor for cancer, even in human populations exposed to concentrations 1000 times greater than those of the general population, contradicts claims of human carcinogenicity.
Dioxins in Pastured Animal Products?
A review published in 1995 suggested that pastured animal products would probably contain higher dioxin concentrations because of a higher rate of soil ingestion;3 however, newer research has revealed that the primary sources of above-average dioxin concentration in beef samples are feeding troughs constructed with pentachlorophenol-treated wood and the inclusion of incinerator waste as a feed additive.6 Grass-fed beef is not exposed to these sources of dioxins.
Non-Cancer Effects of Dioxins
Dioxins are responsible for a wide range of different toxic effects in different species. The non-cancer effects observed in wildlife exposed to high concentrations of dioxins, induced experimentally in treated animals, and observed in humans exposed to industrial concentrations of dioxins vary between species and between types of exposure.
Like dioxins' carcinogenic effects, the non-cancer effects of dioxins are believed to be primarily mediated by their ability to bind to the aryl hydrocarbon receptor (AhR).5 Seals fed dioxin-contaminated fish had depressed blood levels of vitamin A and thyroid hormone, and depressed natural killer cell and T-cell activity (indicating immune suppression). Herring gulls exposed to dioxins have been found with decreased liver stores of vitamin A but increased egg yolk vitamin A levels, while great blue herons have lower levels of vitamin A in their egg yolks. Exposed cormorants experience decreased levels of free thyroid hormone, herring gulls experience decreased vitamin A, and common terns experience both decreased vitamin A and decreased thyroid hormone levels. In white suckerfish, the AhR-mediated dioxin-like activity of PCBs was associated with birth defects. Skin diseases (resembling vitamin A-deficiency skin diseases) and increased thyroid weight have been observed in response to organochlorines, which include, but are not limited to, dioxins.16 One interesting experiment demonstrating the variation of dioxin toxicity between species fed wild salmon, which bore goiters and had been exposed to high concentrations of dioxin-like and non-dioxin-like PCBs, to rats and to other salmon. Although every single one of the wild salmon (previously transferred from the Pacific to the Great Lakes) in the organochlorine-polluted Great Lakes had an enlarged thyroid gland and a high PCB body burden, the degree of thyroid enlargement had no relation to PCB burden. When these PCB-laden fish were fed to immature Coho salmon, the latter did not develop any thyroid enlargement. Yet, when the PCB-laden fish were fed to lab rodents, the rodents developed goiter in direct proportion to the dioxin-like activity induced by the dietary PCBs. It has been hypothesized that the Great Lakes salmon developed goiter because of a goitrogenic factor of bacterial metabolism in the Great Lakes, one that has also proved goitrogenic to humans, while for rodents the PCBs carried by the fish are themselves the goitrogenic factor.16 Reduced female fertility and reduced male sperm production, as well as genital deformations, have been induced by dioxin exposure in rodents. Dioxins can cause calcium uptake in neurons of the rat hippocampus, and have species-specific effects on gene expression in the nervous system of zebrafish and rats, though it is unknown how these effects may or may not result in any type of neurotoxicity.5 TCDD, the most potent dioxin, acts as an anti-estrogen in rodent mammary and uterine tissues, as well as in human mammary cells, where it exerts anti-carcinogenic effects.17 TCDD can induce wasting syndrome and death in chickens and rodents, though its lethal dose varies 5000-fold across species. TCDD can induce cleft palate and other deformities, reproductive failure and liver damage in birds,18 endometriosis in rhesus monkeys, growth of surgically induced endometriotic cysts in rats and mice, and various effects on metabolism and hormones in various species.4 In humans, the only conditions to which dioxins have been conclusively linked are a type of skin acne known as chloracne and a temporary increase in liver enzymes.
Other non-cancer phenomena have been associated with exposure to industrial concentrations of dioxins in some human epidemiological studies, but the evidence is inconclusive or contradictory.4 Effects on various thyroid-related hormones and proteins were found in the Ranch Hand cohort and the National Institute for Occupational Safety and Health (NIOSH) cohort, but they were mostly weak and non-significant, and not consistent between studies. In one sub-section of the NIOSH cohort, diabetes was not associated with TCDD, but the highest-exposed group did have the highest rate of diabetes. The NIOSH cohort as a whole found a negative correlation between TCDD exposure and diabetes mortality, while women, but not men, in zones A and B in Seveso had a greater risk of diabetes mortality with greater TCDD exposure.4 Exposure to dioxins in breast milk was associated with tooth enamel defects in one study, and alterations in thyroid hormone levels have been associated with prenatal dioxin exposure in children. In one study, exposure to dioxins in breast milk was found to have no effect on psychomotor outcome of infants between three and seven months or after 18 months, but was associated with depressed psychomotor skills between seven and 18 months. Few studies have identified statistically significant effects of industrial-level dioxin exposure on spontaneous abortions, birth weight, or birth defects. However, in the most TCDD-contaminated zone of Seveso, the site of the highest population exposure to TCDD ever documented, a shift in the sex ratio in favor of females was associated with the total dioxin exposure of both parents. This is an interesting finding, but there were only 13 couples and 15 children in this group, and the association was found only in the highest-exposed group and only between the years 1977 and 1984.4 The failure of dioxins to be consistently and conclusively linked with cancer in humans, or with any non-cancer effect in humans other than chloracne and temporary increases in liver enzymes, even at industrial exposures that exceed those of the general population by up to a factor of 1000, should give pause to those who advocate exchanging the proven health-promoting diets of our ancestors for modern vegetarian and vegan diets that do not provide the same type of nutrition. Although dioxins can experimentally induce a variety of endocrine-disrupting, immune-depressing or anti-reproductive effects in animals, the effects are generally species-specific and, in a minority of cases (such as the anti-estrogenic effects in mammary and uterine tissues), apparently beneficial. Even if we assume that the worst of these findings can be generalized to humans, the fact that dioxin exposure has declined 95 percent since the 1970s and continues to decline, and the fact that no one in the US is currently exposed to even one-tenth of the dosage that has produced an abnormality in the most sensitive gestational rat, should assure us of the safety of consuming animal products. Moreover, dioxins do not act in a vacuum; their effects are subject to the influence of many other physiological and dietary factors, and it is a diet rich in traditionally valued animal products that offers the most protection against their effects.
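The margin-of-safety reasoning summarized above can be made concrete with a short worked example. This is only a minimal sketch in Python: the 2 pg TEQ per kilogram TDI and the 10-fold safety factor come from the WHO discussion above, while the 80-kilogram body weight and the assumed typical intake of roughly 1 pg TEQ per kilogram per day are illustrative values, not measurements for any particular person.

```python
# Minimal sketch of the WHO tolerable daily intake (TDI) margin described above.
# The TDI and safety factor come from the text; the body weight and the
# "typical" intake are illustrative assumptions.

TDI_PG_PER_KG = 2.0              # WHO TDI, pg TEQ per kg body weight per day
SAFETY_FACTOR = 10.0             # applied by the WHO to the most sensitive rat finding
TYPICAL_INTAKE_PG_PER_KG = 1.0   # assumed current intake, pg TEQ per kg per day

body_weight_kg = 80.0

tdi_per_day = TDI_PG_PER_KG * body_weight_kg                  # 160 pg TEQ/day
effect_level_per_day = tdi_per_day * SAFETY_FACTOR            # intake matching the rat effect level
typical_per_day = TYPICAL_INTAKE_PG_PER_KG * body_weight_kg   # 80 pg TEQ/day

print(f"TDI for an {body_weight_kg:.0f} kg adult: {tdi_per_day:.0f} pg TEQ/day")
print(f"Intake matching the rat effect level: {effect_level_per_day:.0f} pg TEQ/day")
print(f"Assumed typical intake: {typical_per_day:.0f} pg TEQ/day "
      f"({typical_per_day / effect_level_per_day:.0%} of the effect level)")
```

Under these assumptions, typical intake works out to about 5 percent of the intake corresponding to the rat effect level, which is the margin described in the discussion of the TDI above.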
Dioxins: It's Not Just about the Meat
The first leg of the dioxin-based argument for a vegetarian or vegan diet, that dioxins are potent human carcinogens, endocrine disruptors, reproductive disruptors, and immune disruptors, has been shown above to be either false or irrelevant at the level of dioxins currently consumed in the US. The second leg of the argument, that animal products are uniquely high in dioxins, likewise fails to withstand scrutiny. While the most contaminated foods in some studies have been animal products, other studies cite animal products as among the least contaminated foods. Variation between samples is usually much greater than any variation between animal and vegetable products, making any supposed trend inconsistent at best. One 2003 study actually measured the intake of dioxins in humans: fourteen subjects ate an omnivorous diet for two weeks, then a vegan diet for two weeks. Although exposure to dioxins on a TEQ basis was higher during the omnivorous phase than the vegan phase, the diets of some subjects were actually comparable in total PCBs in the vegan and omnivorous phases. The TEQ measurement weights the relative dioxin-like toxicity of each dioxin-like compound against the toxicity of TCDD, so that compounds with less dioxin-like activity (meaning less toxicity mediated by the aryl hydrocarbon receptor) contribute less per unit weight to the total TEQ (a brief sketch of this calculation appears below). However, since PCBs have non-dioxin-like toxicity, and since the dioxin-like PCBs measured in the study could be indicators for the presence of non-dioxin-like PCBs, the total PCB-related toxicity of the vegan diets may have been comparable to that of the omnivorous diets. Most significantly, even on the higher-TEQ omnivorous diet, average TEQ intake was 1.09 pg per kilogram of body weight per day, which is only about half of the WHO's tolerable daily intake (TDI), itself a hyperconservative estimate of toxicity risk.19 A 1995 review of the significance of animal products as sources of human exposure to dioxins claimed that the "major food sources [of dioxins] seem to be fat-containing animal products and some seafoods."3 Since data from the US were not available at the time, the authors used data from Germany and the Netherlands. Table 2 of that review, which lists the contribution of selected food products in pg TEQ per day, shows no consistent role of animal products in exposure to dioxins. For example, in the Netherlands, leafy vegetables (4.4) contributed a quantity of TEQs roughly equivalent to pork (4.2) and to poultry and eggs (4.8). In Germany, vegetable oils (3.8) contributed only half as many TEQs as pork (7.6), and only 20 percent as many as beef and veal (19), while in the Netherlands, vegetable oils (14) contributed 3.3 times as many TEQs as pork (4.2) and 7 percent more than beef (13). Not only did the contribution of TEQs from the same type of food vary widely between the two countries, but the relative contribution of animal and vegetable products also varied widely.3 More recently, data from a wider range of countries have become available, and the FDA provides data from the years 2000-2003 on its website. These results consistently reaffirm the wide variation between regions and samples and the lack of any consistent trend between animal and vegetable products.
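Because the comparisons that follow are all expressed in TEQs, a minimal sketch of how a TEQ total is computed may be helpful. The congener names, toxic equivalency factors (TEFs) and concentrations below are illustrative placeholders, not figures from the WHO or from any of the studies cited here.

```python
# Minimal sketch of a TEQ calculation: each dioxin-like compound's concentration
# is weighted by a toxic equivalency factor (TEF) expressing its AhR-mediated
# potency relative to TCDD, and the weighted values are summed. All names and
# numbers below are illustrative, not an authoritative TEF table.

illustrative_tefs = {
    "TCDD": 1.0,          # reference compound, by definition
    "congener_B": 0.1,    # hypothetical compound with a tenth of TCDD's potency
    "congener_C": 0.001,  # hypothetical compound with very weak dioxin-like activity
}

# Hypothetical measured concentrations in a food sample, pg per gram of product.
sample_pg_per_g = {"TCDD": 0.01, "congener_B": 0.5, "congener_C": 40.0}

teq_pg_per_g = sum(sample_pg_per_g[c] * illustrative_tefs[c] for c in sample_pg_per_g)
print(f"Total: {teq_pg_per_g:.3f} pg TEQ/g product")

# Note that congener_C dominates the raw mass (40 pg/g) yet contributes little
# to the TEQ. This is why a diet can be comparable in total PCBs by weight while
# remaining lower in TEQ terms, as in the vegan/omnivore comparison above.
```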
For example, in Finland, fish accounted for 94 percent of dioxin TEQ intake,9 while in Canada fish accounted for only 3 percent of TEQ intake.20 A Japanese study found fish intake to be an independent predictor of blood dioxin levels, while for all other animal products except pork the correlations were insignificant. Eggs, butter, cheese and pork were actually negatively associated with dioxin levels. Yet the variation of blood levels between regions was large, ranging from 13 to 21 pg TEQ per gram of lipid. Despite the higher fish intake in Japan, the median blood levels of dioxin were still lower than those found in other industrialized countries, especially in earlier studies, probably reflecting both inter-regional differences and declining dioxin levels in the environment.21 Clearly, however, it would be more useful to look at the quantities in food rather than the contribution of foods to intake, since the former is independent of what the general population is eating. What we are interested in is whether a diet rich in traditional animal products is excessive in dioxins compared to a vegetarian diet, not which foods people eating a standard diet are getting their dioxins from. Unfortunately, most of the studies available have relatively small sample sizes, which, combined with the large variance between samples, make the data relatively useless for establishing trends among types of foods. For example, in Greece, five samples of fish oil varied by a factor of six (meaning the sample with the highest concentration had six times more dioxins than the sample with the lowest concentration); five samples of butter and seven samples of farmed fish both varied by a factor of five; eight samples of lamb varied by a factor of three; three samples of poultry, three samples of beef liver and three samples of rice all varied by a factor of two; and four samples of wild fish had concentrations ranging from zero to amounts nearly approaching the median for farmed fish.22 The FDA's data for the United States rely on only three samples for each item.23 Although the report does not give information indicating the variation between samples, we can infer that there is considerable variation by comparing different specific food items within the same type of food. For example, expressed in pg TEQ per gram, ground beef contains 0.0425 while a fast food quarter-pound burger contains 0.0, and whole milk contains 0.0087 while half and half, which should concentrate the dioxins in the butterfat, contains 0.0.24 The FDA's report offers us another way to ascertain the type of variation found among samples, because the FDA reports the data in three ways: the first, where a non-detect is assumed to be equal to zero; the second, where a non-detect is assumed to be equal to half the limit of detection; and the third, where a non-detect is assumed to be equal to the limit of detection. If there weren't any non-detects, we would expect all three figures to be the same. If there are one or more non-detects, we should expect a certain pattern: the second figure should be higher than the first, and the difference between the third and the first figures should be exactly double the difference between the second and the first figures. In fact, this pattern is nearly ubiquitous among the items that showed any detectable contamination with dioxins.
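The inference described in the preceding paragraph can be written as a simple check. This is only an illustrative sketch: the function and the example figures are made up to show the doubling pattern, and are not values taken from the FDA report.

```python
# Sketch of the non-detect inference described above. The FDA reports each item
# three ways: non-detects counted as zero, as half the limit of detection (LOD),
# and as the full LOD. If at least one sample is a non-detect, the second figure
# exceeds the first, and the gap from first to third is twice the gap from
# first to second.

def has_nondetect(nd_as_zero: float, nd_as_half_lod: float, nd_as_lod: float,
                  tol: float = 1e-9) -> bool:
    """Infer whether at least one sample fell below the detection limit."""
    low_gap = nd_as_half_lod - nd_as_zero
    high_gap = nd_as_lod - nd_as_zero
    return low_gap > tol and abs(high_gap - 2 * low_gap) <= tol

print(has_nondetect(0.0714, 0.0714, 0.0714))  # False: all three figures identical
print(has_nondetect(0.0714, 0.0730, 0.0746))  # True: the doubling pattern appears
```

When the doubling pattern appears for an item, at least one of its samples must have been a non-detect.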
This pattern allows us to conclude that some of the items contained at least one sample with no detected dioxin, including milk, beef, lamb, turkey, beef liver, butter and salmon. The FDA's data show that some of the highest concentrations of dioxin TEQs are found in animal products, but this finding hardly provides justification for the adoption of a vegan diet. Of the few whole foods actually measured, pork loin, eggs, and shrimp were animal products containing no detectable dioxin. Even the highest-ranking animal products, such as butter, salmon, lamb and beef, had at least one non-detect among three samples, which means that only one or two samples out of three are responsible for the high ranking, while the remaining one or two samples did not contain any detectable dioxin at all. Table 1 shows selected items from the FDA's report.24 Animal products are scattered among the highest and lowest concentrations of dioxin TEQs, with some plant foods containing significant quantities of dioxins. Since the FDA's report contained little in the way of vegan protein foods, I estimated the dioxin concentration of tofu using the FDA's figures for the dioxin concentration of vegetable oil (usually meaning soybean oil) and the USDA's25 data for the fat content of tofu. Every single item listed had at least one sample in which no dioxin was detected. Since the US FDA's data rely on only three samples, and since the variation between samples appears to be large, it is necessary to look at other data to establish or refute a trend, and these data destroy the argument that dioxins are found primarily in animal foods. Table 2 gives data from Finland.9 Although animal products have a tendency to be higher on the list, vegetables have higher concentrations than liquid milk products, and cereal products have more than four times the dioxin concentration of liquid milk products. Fish take the cake in Finland, with 163 times the dioxin concentration of their nearest competitor, fats. According to these data, avoiding meat in favor of cereals would have a negligible impact compared to avoiding fish in favor of meats. Unfortunately, the Finnish data give us no information on vegan sources of protein. Table 3 provides data from the Netherlands.8 Since animal products except fish were reported per gram of fat while the other data were reported per kilogram of product, I adjusted the animal product data by choosing at random a specific item from the USDA's database25 to represent each category and multiplying the grams of fat per gram of product by the pg TEQ per gram of fat. The data for fish and plant products were divided by 1000 to yield the adjusted figures. As in Finland, fish in the Netherlands are considerably higher in dioxins than any other food, having nearly 10 times the dioxin concentration of beef. Still, vegetables have about a 50 percent greater concentration of dioxins than whole milk, and roughly the same concentration as pork. The data from Greece,22 shown in Table 4, are particularly damning to the notion that dioxins exist primarily in animal products. In this compilation, since plant products were reported on the basis of product weight while animal products were reported on the basis of grams of fat, it was necessary to choose a specific food product to represent each category and adjust the animal product figures in the same way as in the previous table.
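For readers who want the adjustment spelled out, here is a minimal sketch of the two unit conversions just described. The fat fraction and the example values are illustrative assumptions, not figures taken from the USDA database or from the Dutch or Greek surveys.

```python
# Sketch of the unit adjustments described above: values reported per gram of
# fat are converted to a per-gram-of-product basis using the fat fraction of a
# representative food, while values reported per kilogram of product are simply
# divided by 1000. The numbers below are illustrative assumptions.

def per_g_fat_to_per_g_product(pg_teq_per_g_fat: float, fat_g_per_g_product: float) -> float:
    """Convert pg TEQ per gram of fat into pg TEQ per gram of whole product."""
    return pg_teq_per_g_fat * fat_g_per_g_product

def per_kg_to_per_g(pg_teq_per_kg_product: float) -> float:
    """Convert pg TEQ per kilogram of product into pg TEQ per gram of product."""
    return pg_teq_per_kg_product / 1000.0

# A hypothetical milk reported at 1.0 pg TEQ per gram of fat, assumed 4% fat:
print(per_g_fat_to_per_g_product(1.0, 0.04))  # 0.04 pg TEQ/g product
# Hypothetical vegetables reported at 60 pg TEQ per kilogram of product:
print(per_kg_to_per_g(60.0))                  # 0.06 pg TEQ/g product
```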
It is therefore possible that some of the items, such as "farmed sockeye salmon," were not actually sampled in Greece, but serve only as a model to adjust the figures for the purpose of comparison. Amazingly, the Greek study found that, with the exception of fish oil, which is not consumed in significant quantities, rice was the most concentrated source of dioxin TEQs! Like the FDA's data for the US, the Greek study found animal products distributed among both the highest and lowest sources of dioxins, but this study actually found plant products in general to be higher in concentration than most of the animal products. For example, vegetables had almost six times the dioxin concentration of beef liver. In all of these analyses, the variation of dioxin concentrations between regions and between individual samples is usually wider than the variation between types of foods. Animal products tend to be distributed randomly among the highest and lowest concentrations, yielding no consistent trend of dioxin accumulation in animal products. In some cases, such as the Greek data, vegetable products dominate the higher-concentration readings, and animal products dominate the lower-concentration readings. In the United States, certain animal products like butter are found to have the highest concentrations, but the presence of at least one sample of butter out of three with no detectable dioxins, and of three out of three samples of half and half with no detectable dioxins, makes it impossible to claim a consistent connection between butterfat and dioxin. Thus, the second leg of the dioxin-based argument for vegetarianism, that animal products are uniquely high in dioxins, crumbles to pieces when subjected to critical analysis.
Table 1. Dioxin Concentrations in Foodstuffs, United States, 2003. Data are reported assuming a non-detect is equal to zero. Values are in pg TEQ dioxin per gram of product.
Butter: 0.2847
Salmon Steak/Fillet: 0.1918
Lamb Chop: 0.0714
Beef Roast, Chuck: 0.0687
Beef Loin: 0.0606
Olive Oil: 0.0295
Tomato Sauce: 0.0232
Beef Liver: 0.0207
Roasted Turkey Breast: 0.0171
Vegetable Oil: 0.0150
Whole Milk: 0.0087
Margarine: 0.0015
Chicken Leg, fried with skin: 0.0014
Tofu, Firm: 0.0006
White Beans: 0.0001
Sunflower Seeds, roasted: 0.0000
Peanuts, dry roasted: 0.0000
Tuna, canned: 0.0000
Shrimp, boiled: 0.0000
Half and Half: 0.0000
Eggs, scrambled or boiled: 0.0000
Pork Loin: 0.0000
Table 2. Dioxin Concentration in Foodstuffs, Finland, 2004. Data are reported assuming a non-detect is equal to zero. Values are in pg dioxin TEQ per gram of product.
Fish: 1.80000
Fats: 0.01100
Meat and Eggs: 0.00820
Cereal Products: 0.00430
Solid Milk Products: 0.00270
Vegetables: 0.00120
Liquid Milk Products: 0.000930
Fruits and Berries: 0.000830
Beverages: 0.000170
Potato Products: 0.000031
Table 3. Dioxin Concentration in Foodstuffs, The Netherlands, 1999. Data are reported assuming a non-detect is equal to zero. Values are in pg dioxin TEQ per gram of product.
Fatty Fish: 3.1580
Crustaceans: 1.3540
Butter: 1.3305
Lean Fish: 0.5930
Cheese: 0.4253
80% Lean Ground Beef: 0.3654
Chicken, Dark Meat: 0.3230
Margarine: 0.3000
Prepared Fish: 0.2670
Vegetable Oil: 0.1800
Pork Chop Loin: 0.0614
Vegetables: 0.0600
Whole Milk: 0.0410
Nuts: 0.0130
Table 4. Dioxin Concentration in Foodstuffs, Greece, 2002. Data are reported assuming a non-detect is equal to the limit of detection. Values are in pg dioxin TEQ per gram of product.
Fish Oil: 1.010
Rice: 0.900
Butter: 0.640
Fruit: 0.470
Vegetable: 0.430
Olive Oil: 0.300
Lamb Shoulder: 0.110
80% Lean Hamburger: 0.098
Beef Liver: 0.077
Pork Chop: 0.051
Farmed Sockeye Salmon: 0.050
Boiled Egg: 0.039
Chicken Leg: 0.017
Wild Sockeye Salmon: 0.013
Factors Affecting Dioxin Toxicity
Although the first tenet of the dioxin-based argument for vegetarianism, that dioxins are potent human carcinogens, endocrine disruptors, reproductive disruptors and immune disruptors, appears to be false or irrelevant to humans at the levels to which they are exposed, it is still sensible for us to err on the side of caution and, ceteris paribus (all things being equal), opt for a lower dioxin intake over a higher one. However, the argument for vegetarianism does not use the ceteris paribus stipulation; rather, it argues that dioxin intake should be minimized regardless of other factors. The third and final leg of the dioxin-based argument for vegetarianism, that avoiding the harmful effects of dioxins is primarily dependent upon minimizing dioxin intake and therefore avoiding animal products, implicitly assumes that dioxin toxicity is merely a function of dioxin intake. On the contrary, a variety of dietary and other factors influence dioxin uptake from the intestines, excretion of dioxin, the half-life of dioxin in the body and the toxicity of dioxin at the cellular level. As it turns out, vegetarian diets tend to be lower in protective nutrients and higher in toxicity-enhancing compounds, whereas a traditional diet is highest in protective nutrients and lowest in toxicity-enhancing compounds. Not all dioxin consumed in a food is actually absorbed. One human study found widely varying intestinal absorption rates, with a maximum of 63 percent. The study did find that a higher-fat meal produced a higher absorption rate; however, since protective compounds are also fat-soluble, it should not be concluded that a lower-fat diet is preferable. The older individuals in this study actually had a net excretion of dioxins, excreting more dioxins in the feces than were present in the food. Apparently, dioxins are stored in the tissues when tissue levels are lower than blood levels, and released from the tissues when blood levels drop below tissue levels. Since dioxin levels have decreased so dramatically over the past few decades, older individuals who experienced the high peaks in environmental dioxin levels in earlier decades cannot eat high enough concentrations in food to prevent automatic tissue release and fecal excretion.26 Various vegetable fibers have been shown to increase fecal excretion of dioxin in animals. Chlorophyll compounds, especially copper chlorophyllin, were shown in one study to be the most effective compounds, increasing excretion rates by 144 percent over normal when fed as 1 percent of the diet by weight.27 This might indicate a modest benefit of chlorophyll-rich vegetables (which would supply a much lower concentration of chlorophyll than that used in the study), a benefit that could be obtained on either a vegetarian or a meat-based diet. Once absorbed from the diet or from other forms of exposure into the bloodstream, dioxins are stored in fatty tissues and slowly detoxified and excreted over long periods of time. The half-lives of dioxins are not consistent between individuals, however.
Investigations into the half-lives of dioxins in industrially exposed persons reveal that the variation between minimum and maximum half-lives in different individuals is often several times greater than the value of the median half-life. Kidney and thyroid disorders may inhibit detoxification of dioxins, though results for thyroid disorders are conflicting. Persons with a higher percentage of body fat have a slower dioxin decay rate, while intermediate weight loss can increase the decay rate by a factor of 2.5. For some unknown reason, smoking increases the decay rate significantly, although when adjusted for age and percent body fat the association becomes weaker, and for TCDD it becomes non-significant. For certain dioxins, however, smoking decreases the half-life by up to 25 percent.28 The most important variations in diet that affect the potential for toxic effects of dioxins are antioxidants and factors that increase oxidative damage in the body, such as polyunsaturated fatty acids. Among the antioxidants, vitamin A has many other roles independent of its antioxidant activity and deserves special attention, since depletion of vitamin A and interference with vitamin A metabolism are central to the toxicity of dioxins.
Dioxin Toxicity and Vitamin A
Although relatively little is known about what ties the ability of dioxins to bind to the aryl hydrocarbon receptor (AhR), and the subsequent activation of the cytochrome P-450 system, to their toxicity, it is clear that one of the missing links is vitamin A. Changes in vitamin A levels in wildlife are correlated with dioxin exposure, and TCDD can experimentally induce vitamin A depletion as well as resistance to vitamin A signaling, which is correlated with its toxic effects. Also, TCDD and vitamin A have opposing actions in certain tissues, and the addition of dietary vitamin A exerts a strong protective effect against a wide range of TCDD-induced effects. The most consistent effects observed in wildlife in response to dioxin exposure are changes in vitamin A and thyroid hormone levels. Changes in liver or plasma vitamin A concentrations have occurred in captive harbor seals eating polluted fish, Great Lakes herring gulls and tree swallows, great blue herons and lake sturgeon of the St. Lawrence River, common terns of Belgium and the Netherlands, and white suckerfish of Montreal. Typically, decreases in liver or plasma vitamin A are observed, or signs of increased mobilization of vitamin A from the liver. In several of these cases, decreased levels of thyroid hormone have also occurred, and in cormorants of the Netherlands, a decrease in free thyroid hormone was observed without changes in vitamin A.16 When rats were fed daily doses of dioxins roughly one million times greater than what humans typically consume, major impacts on vitamin A and thyroid hormone levels occurred. TCDD increased blood levels of vitamin A by 21 percent, while all the other dioxins decreased blood levels. All of the dioxins, including TCDD, depleted liver stores of vitamin A by 60-80 percent. This was considered a "very sensitive response" to dioxins, since even the lowest dose, only 70,000 times the equivalent of what humans typically consume, produced a statistically significant effect.
A dose-dependent reduction of thyroid hormone (T4) was also induced, yielding a 76 percent reduction at a dose roughly two million times what humans typically consume, and still yielding a significant 50 percent decrease even in the group fed only 70,000 times the typical human intake.29 The above study found TCDD to be much less potent at reducing thyroid hormone levels than the other dioxins, and TCDD actually raised blood levels of vitamin A rather than lowering them. This is probably because some of the other dioxins produce metabolites that bind to transthyretin, the protein that transports both vitamin A and thyroid hormone in the blood. TCDD, however, does not have this effect. The study found the WHO's TEQ concept to have no predictive value with respect to these effects. This calls into question whether vegan diets that are lower in dioxin TEQs but comparable in absolute quantities of dioxin-like PCBs are truly lower in toxic elements.19
Dioxins, Vitamin A, and Cancer
It appears that the capacity of dioxins to produce both cancer and non-cancer toxicity relates to their ability to deplete vitamin A reserves and oppose the actions of vitamin A in the body. In cultured human skin cells, TCDD induces the expression of transforming growth factor alpha (TGF-α) and decreases the expression of transforming growth factor beta-2 (TGF-β2), while incubation with retinoic acid, the hormone form of vitamin A, increases the expression of TGF-β2. (TGF-α increases cellular proliferation, while TGF-β2 has the opposite effect.) Since excessive cellular proliferation is a mechanism of cancer promotion, causing cells to multiply before they are able to repair DNA damage, this may explain part of the carcinogenic potential of dioxins and the protective effect of vitamin A.30 On the other hand, in human breast cancer cells, where dioxins inhibit cancer, vitamin A enhances the anti-estrogenic effect of dioxins. Both retinoic acid and TCDD inhibit breast cancer in rodents by opposing the effects of estrogen. In cultured human cells, TCDD and retinoic acid inhibit estrogen-induced cell proliferation and the synthesis of estrogen receptors, and the effectiveness of each is enhanced when they are used together.31 Both the carcinogenic and non-carcinogenic toxicity of dioxins are believed to stem from the ability of dioxins to bind to the AhR and induce the formation of the cytochrome P-450 system. A recent study showed that vitamin A fed to rodents reduced the TCDD-induced expression of cytochrome P-450 by 68 percent.32 Other studies also show vitamin A to be effective, along with other antioxidants, in inhibiting the free radical products that are induced by dioxins and are also believed to play a role in carcinogenesis, as well as in many other toxic effects, as discussed below.
Dioxins, Vitamin A, and Non-Cancer Toxicity
Many of the observed toxic effects of dioxins resemble those of vitamin A deficiency. Table 5 shows selected effects of dioxins in various species that are also widely accepted to be effects of vitamin A deficiency. Diseases such as cancer that are effectively treated with or prevented by vitamin A, but that are not considered vitamin A deficiency diseases in the standard literature, are not included. Many of the toxic effects induced by dioxins correlate with vitamin A depletion.
TCDD can result in impaired growth and wasting disease, and in the guinea pig, rat, mouse and hamster, a dose-response relationship has been demonstrated between the degree of vitamin A depletion and the degree of depressed weight gain.38 Decreased vitamin A stores have been found along with hyperkeratotic skin diseases in elephant seals, decreased fertility and suppressed immune function in harbor seals, suppressed immune function in herring gulls, and increased birth defects in white suckerfish and lake sturgeon, all of which resemble the effects of vitamin A deficiency and were associated with exposure to dioxins or organochlorines in general.16 Thus, dioxins deplete vitamin A stores and are associated with many effects that seem to mimic vitamin A deficiency. But there is more to the story. Although liver reserves are depleted when vitamin A deficiency-like symptoms induced by dioxins arise, these symptoms usually occur when there are still significant tissue reserves remaining, whereas in simple vitamin A deficiency, symptoms usually do not occur until tissue reserves are almost entirely depleted. Dioxins appear not only to deplete vitamin A, but also to induce cellular resistance to retinoic acid, the hormone form of vitamin A.39 Many effects of dioxins can be reversed by vitamin A. Supplementation with vitamin A enabled 25 percent of rats fed a lethal dose of TCDD to survive, while supplementation with vitamin E enabled only 10 percent to survive.40 Injection of vitamin A into a fertile egg largely protected against the increase in mortality of chicks caused by injection of TCDD, while other antioxidants had no effect, and vitamin A also reduced the increase in birth defects by half.18 In rodents, vitamin A by itself reduces the TCDD-induced reduction of body weight and thymus weight and reduces DNA damage following TCDD treatment by over 60 percent; in combination with vitamin E it reduces the TCDD-induced increase in liver weight.41 TCDD-induced hyperkeratosis (psoriasis) and chloracne (a type of acne) are reversed by topical application of vitamin A.39 Thus, vitamin A appears to play a unique role in protecting against the toxicity of dioxins, and has some protective effects that other antioxidants do not have. A large part of vitamin A's protective role is attributable to its antioxidant effect. Other antioxidants have also been shown to confer a large degree of protection against dioxin toxicity, a fact that also has implications regarding the types of fats we should consume.
Table 5. Effects Shared by Vitamin A Deficiency and Dioxin Toxicity. Information compiled from various sources.5,16,33,34,35,36,37
Decreased circulating androgens
Male and female infertility: decreased sperm production in males; fetal loss in females
Genital malformations
Cleft palate and various birth defects
Immune suppression
Hyperkeratosis and other skin diseases
Growth retardation
Increased mortality
Dioxin Toxicity, Free Radicals, and Antioxidants
The aryl hydrocarbon receptor (AhR) is involved in the detoxification of many compounds. However, once this process is begun, hydrogen peroxide and other chemicals capable of free radical damage are formed. Ideally, free radical formation is intercepted by antioxidant systems, but when free radical production exceeds antioxidant capacity, oxidative damage occurs.
Polyunsaturated fatty acids in the cell membrane are the preferred target of free radicals; when converted to lipid peroxides, these fatty acids initiate a chain reaction of damage to the membrane.42 A wide array of antioxidant compounds play specific roles in forming a protective network against this damage, including vitamin A, which protects the integrity of membranes; vitamin E, which intercepts the lipid peroxide chain reaction; glutathione peroxidase, a selenium-dependent enzyme that converts hydrogen peroxide to water;41 and coenzyme Q1043 and vitamin C,44 both of which act as antioxidants themselves and help to regenerate vitamin E. While dietary antioxidants protect against oxidative damage, consuming polyunsaturated fatty acids raises lipid peroxide levels, which will be discussed further below. TCDD treatments have been shown to increase lipid peroxidation up to 7-fold in rats, 2-fold in mice, and by 25 percent in chickens exposed during embryonic development.45 The powerful protection against carcinogenicity and toxicity conferred by supplemental antioxidants provides a compelling argument for a role for lipid peroxidation, and oxidative stress in general, in the mechanisms of dioxin-related toxicity. Vitamins A and E both offer roughly 61-66 percent protection to mice against free radical production and DNA damage induced by a single acute dose of TCDD roughly 50 million times what a human typically consumes in a day.41 A study of mouse fibroblasts in cell culture found TCDD to enhance the carcinogenic effect of two other carcinogens between 3.5- and 3.8-fold, but the addition of a mixture of vitamins E and C was found to considerably reduce the tumor-promoting effect of TCDD when tumors were initiated by N-methyl-N'-nitro-N-nitrosoguanidine, and to entirely abolish the tumor-promoting effect of TCDD when tumors were initiated by 3-methylcholanthrene. Amazingly, when the hydroxyl scavenger mannitol was used as the antioxidant, there were actually fewer tumors in the TCDD-mannitol-treated groups than in the controls for either initiator!11 A protective role for the selenium-dependent antioxidant enzyme glutathione peroxidase has also been demonstrated. TCDD has been shown to lower glutathione peroxidase levels by up to 68 percent. Vitamin A, but not vitamin E, inhibits the TCDD-induced reduction of glutathione peroxidase. Stohs and his team found vitamin A to be 2.5 times more effective than vitamin E at enabling the survival of rats exposed to a lethal dose of TCDD, and found protection to be associated with glutathione peroxidase levels.40 Some of the protective effects of vitamin A that are not shared by vitamin E, then, might be attributable to vitamin A's glutathione peroxidase-sparing activity. To date, no studies examining the effect of coenzyme Q10, an important protector against lipid peroxidation, on susceptibility to dioxin toxicity have been indexed for Medline.
Dioxin Toxicity: Vegetarian vs. Traditional Diets
Although the second leg of the dioxin-based argument for vegetarianism, that animal products are uniquely high in dioxins, has been shown to be false, even were it true, traditionally raised animal products provide important protective nutrients that vegetarian diets do not provide in comparable amounts.
The third leg of the argument, then, that avoiding the harmful effects of dioxins is primarily dependent upon minimizing dioxin intake, and therefore avoiding animal products, is independently false because dioxin toxicity is mediated by many other factors; a diet rich in traditional animal products is rich in protective factors, while a vegetarian and especially a vegan diet enhances the toxicity of dioxins. Children on vegetarian diets have been found to have lower blood levels of vitamins A and E.46 Although vitamin C levels tend to be higher in vegetarians, selenium and selenium-dependent glutathione peroxidase levels are lower in vegetarians.47 In some studies, vegetarians have also been found to have higher vitamin A and E levels48 than their non-vegetarian counterparts, but most meat-eaters do not consume traditional vitamin A-rich animal foods like organ meats and cod liver oil or traditionally pastured animal products rich in vitamin E. That meat-eaters tend to have lower levels of vitamin C is most likely a reflection of the fact that the average meat-eater does not consume sufficient quantities of fruits and vegetables. Vitamin C-, carotene-, and vitamin E-rich plant foods, however, are not exclusive to a vegetarian diet. A meat-inclusive diet can be rich in fruits and vegetables if it is low in refined foods, while vegetarian and vegan diets by definition restrict animal products. It is noteworthy that pasture-fed meats contain four times as much vitamin E as their grain-fed counterparts,49 which is likely to be at least as true for grass-fed milk and butter, but it is also true that adequate vitamin E can be easily obtained on a vegetarian diet. Vitamin A, on the other hand, is a nutrient that occurs only in animal foods. Although carotenes from plant foods can be converted to vitamin A, the conversion rate is low and is continually being revised downward. While the World Health Organization had considered six units of beta-carotene to be equal to one unit of vitamin A, the US Institute of Medicine revised this downward in 2002, considering 12 units of carotene in foods on a mixed diet to be equal to one unit of vitamin A. However, even this revision was criticized by a review in the Journal of Nutrition, which reported field studies suggesting that it took 21 units of beta-carotene to equal one unit of vitamin A.50 While the Institute of Medicine's figure considered half of the carotene in oils to be converted to vitamin A, a much higher conversion rate than that for solid foods, a more recent study found that even when carotene is provided as a concentrated dose in the form of an oil, conversion factors range from a minimum of 2.4 to a maximum of 20.2.51 Additionally, several medical conditions interfere with the conversion of carotenes to vitamin A, children have lower conversion rates than adults, and infants cannot make this conversion at all, requiring an animal source of vitamin A.36 Table 6 compares the four animal foods richest in vitamin A and the four plant foods richest in carotenes. For the plant foods, the amount of true vitamin A yielded, using the US Institute of Medicine's conversion factor of 12, is listed first, with the USDA's listing25 for "vitamin A" shown in parentheses. Even this figure is likely to overestimate the amount of vitamin A yielded by the plant foods. Considering the inefficiency of carotene conversion to vitamin A, it appears nearly impossible for a plant-based diet to supply the levels of vitamin A found in a traditional diet.
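As a rough check on the comparisons that follow, here is a minimal sketch of the conversion arithmetic using the figures from Table 6. The serving weights assumed here (112 grams for a quarter pound or a half cup, about 4.6 grams for a teaspoon of cod liver oil) are illustrative, so the results differ slightly from the rounded figures given in the text.

```python
# Sketch of the carotene-conversion arithmetic behind Table 6. IU-per-gram
# figures are taken from Table 6; serving weights are illustrative assumptions.

RETINOL_IU_PER_G = {"beef liver": 260.9, "high-vitamin cod liver oil": 2500.0}
CAROTENE_IU_PER_G = {"sweet potato": 192.2, "carrots": 172.0}  # USDA "vitamin A" listings

def true_vitamin_a_iu(carotene_iu: float, conversion_factor: float = 12.0) -> float:
    """Convert carotene 'vitamin A' IU into retinol-equivalent IU."""
    return carotene_iu / conversion_factor

liver_serving = 112.0 * RETINOL_IU_PER_G["beef liver"]                               # ~29,200 IU
sweet_potato_serving = true_vitamin_a_iu(112.0 * CAROTENE_IU_PER_G["sweet potato"])  # ~1,790 IU
teaspoon_clo = 4.6 * RETINOL_IU_PER_G["high-vitamin cod liver oil"]                  # ~11,500 IU

# Grams of carrots needed to match a teaspoon of cod liver oil at a 21:1 conversion.
grams_of_carrots = teaspoon_clo / true_vitamin_a_iu(CAROTENE_IU_PER_G["carrots"], 21.0)

print(f"Quarter pound of liver: {liver_serving:,.0f} IU; half cup of sweet potato: "
      f"{sweet_potato_serving:,.0f} IU")
print(f"Carrots needed to match one teaspoon of cod liver oil at 21:1: "
      f"{grams_of_carrots / 453.6:.1f} lb")
```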
Using the figures in Table 6, a quarter pound of liver supplies 29,210 IU of vitamin A, while a half-cup of sweet potatoes supplies only 1,792 IU. In order to meet the vitamin A content of a mere teaspoon of high-vitamin cod liver oil, one would have to consume 1.6 pounds or 3 cups of sweet potatoes. Using the conversion factor of 21 that some studies have suggested as more appropriate, it would take 3.1 pounds of carrots to equal the amount of vitamin A in one teaspoon of high-vitamin cod liver oil, and a full 13.4 pounds of spinach to match a tablespoon of high-vitamin cod liver oil. Although these calculations demonstrate the dramatic inferiority of plant foods as a source of vitamin A, the truth is that even such massive doses of carotenes could not match the vitamin A activity of a diet emphasizing vitamin A-rich animal products, because excessive doses of carotenes depress the vitamin A activity of the portion of the carotene that is absorbed.58 While antioxidants protect against lipid peroxidation, consumption of polyunsaturated fatty acids (PUFA) raises lipid peroxides. PUFA levels can be low on a vegetarian diet if oils like olive oil or saturated coconut oil are staples, but cod liver oil, an animal product, is the only polyunsaturated oil that has been shown to provide essential fatty acids without raising lipid peroxide levels. Polyunsaturated plant oils rich in essential fatty acids such as soybean oil,52 corn oil53 and the omega-3-rich perilla oil54 all raise lipid peroxide levels. It is not only heated polyunsaturated oils that raise lipid peroxides. Even fresh, unoxidized perilla oil stored at -20°C, and fresh, unoxidized, purified DHA and EPA (the omega-3 PUFAs found in fish oil and cod liver oil) stored at -80°C, mixed into the diets of rats immediately before feeding, raised lipid peroxide levels in tissues considerably, even when the rats were fed adequate vitamin E.54 Cod liver oil, on the other hand, has been shown to inhibit lipid peroxidation. One study found that cod liver oil depressed drug-induced lipid peroxidation in mice under the same conditions in which soybean oil increased it.52 Another study found that feeding cod liver oil entirely abolished the increased level of lipid peroxidation found in diabetic rats.55 In both studies, the depression of lipid peroxidation was related to a sparing effect on glutathione peroxidase activity, which was also the case in rats saved from a lethal dose of dioxin by vitamin A supplementation, suggesting that the protective effect of cod liver oil is due to its high vitamin A content. The omega-3 and omega-6 PUFA in plant oils must be desaturated and elongated by the body to form the important fatty acids that have structural and hormone-precursor value, such as DGLA and AA in the omega-6 family, and EPA and DHA in the omega-3 family. Since this conversion is relatively inefficient, especially on vegetarian diets that tend to be low in zinc, a larger amount of total PUFA must be consumed from plant oils to meet requirements for these fatty acids than would be needed from animal products that supply these fatty acids in their preformed state. The increase in total PUFA consumption directly raises the risk of lipid peroxidation above what is necessary on a diet that includes animal products. It is possible for vegetarian diets to be relatively low in PUFA and for meat-inclusive diets to be excessive in PUFA, but maximal protection against lipid peroxidation is only possible on a diet utilizing organ meats and cod liver oil.
Organ meats and butter can provide the omega-6 fatty acids DGLA and AA without an excess of total PUFA,56 while cod liver oil can supply the omega-3 fatty acids EPA and DHA without an excess of total PUFA. Liver and cod liver oil also provide the vitamin A required to protect these fatty acids from oxidation and to boost the level of the protective enzyme glutathione peroxidase. Coenzyme Q10's effect on susceptibility to dioxin toxicity has not been studied, but since it is a known inhibitor of lipid peroxidation and is necessary for vitamin E function,43 it is highly likely to offer considerable protection. Coenzyme Q10 is produced in the body, but synthesis begins declining at the age of 20, after which dietary sources become more important. Although there are no studies of coenzyme Q10 levels in vegetarians indexed for Medline at the time of writing, Dr. Al Sears, MD, director of the south Florida Center for Health and Wellness, reports in The Doctor's Heart Cure that strict vegans tend to have "extremely low" levels of coenzyme Q10, based on several hundred patients whose blood levels he has measured. Coenzyme Q10 is a heat-sensitive nutrient primarily found in traditional foods like organ meats. According to Dr. Sears, the organs of wild and grass-fed animals contain up to ten times the coenzyme Q10 found in the organs of grain-fed animals.57
Table 6. Vitamin A Content of Animal Foods vs. Plant Foods
ANIMAL FOODS (vitamin A, IU/g):
High-Vitamin Cod Liver Oil: 2,500
Turkey Giblets: 357.9
Beef Liver: 260.9
Chicken Liver: 133.3
PLANT FOODS (true vitamin A equivalents, IU/g, with the USDA "vitamin A" listing in parentheses):
Sweet Potato: 16.0 (192.2)
Carrots: 14.3 (172.0)
Canned Pumpkin: 13.0 (155.6)
Spinach: 10.1 (120.6)
Dioxin Shmioxin: It All Comes Back to Weston Price
The hysteria surrounding dioxins in some circles is difficult to understand considering that exposure to dioxins has declined by 95 percent over the past three decades, a decline verified both by major reductions in body burdens and by falling concentrations in human breast milk. The simple fact is that dioxins do not exist in the environment at concentrations that warrant making dietary changes. Modifying antioxidant and fatty acid intake can produce major changes in lipid peroxidation, a major mechanism of dioxin toxicity; these changes are due not to the presence of dioxins, but to the direct impact of compounds that are present in our diets at far more relevant concentrations than dioxins. The dioxin-based argument for vegetarianism stands upon three legs, each of which crumbles under analysis, and the failure of any one of them is sufficient for the argument to fall. Dioxins have not been shown to be potent human carcinogens, endocrine disruptors, reproductive inhibitors or immune toxicants; dioxins do not occur primarily in animal foods, and in some cases, as in Greece, occur primarily in plant foods; and intake is not the only or even the primary determinant of toxicity. Furthermore, vegetarian diets cannot provide the degree of protection conferred by a traditional diet compatible with the Weston A. Price Foundation's principles, and they require the consumption of polyunsaturated plant oils to provide essential fatty acids, which enhance the type of toxicity exemplified by dioxins. Ultimately, diets must be looked at in their entirety.
If the goal of minimizing dioxin intake were truly more important than all other dietary considerations, then it would make sense to eat a diet composed mostly of potatoes, which studies consistently show to be the lowest carriers of dioxins in all countries, and to use margarine as one’s staple fat, which has been found to be lower in dioxin concentration than vegetable oil, olive oil and butter. Using the tortured logic of the dioxin-dreaders, smoking cigarettes would also be advisable, to increase the detoxification and excretion of stored dioxins. The studies that compare vegetarians to meat-eaters on modern diets compare two relatively poor diets, both devalued by poor soil fertility and the absence of traditional foods like organ meats and cod liver oil. The reason Weston Price’s research remains persistently relevant and continues to trump a multitude of conflicting research findings is that Price was able to document truly healthy populations rather than only those who suffered from disease. Not all pre-modern peoples had the same robust health as those observed by Dr. Price. Price chose to document groups based on their immunity to degenerative disease and tooth decay, not merely their isolation from modern society, and Price did not note the presence or absence of any singular element to be responsible for the superior health he observed. A combination of numerous dietary factors, soil maintenance practices, and prenatal and lactational diets was required to confer superb health. It is highly likely that the populations Price studied had at least some exposure to dioxins, produced from natural sources such as forest fires and volcanoes. Yet Price found that, without exception, certain animal products were considered necessary, sacred, protective and health-promoting. Of vegetarian diets, he noted: “As yet I have not found a single group of primitive racial stock which was building and maintaining excellent bodies by living entirely on plant foods. I have found in many parts of the world most devout representatives of modern ethical systems advocating restriction of foods to the vegetable products. In every instance where the groups involved had been long under this teaching, I found evidence of degeneration in the form of dental caries, and in the new generation in the form of abnormal dental arches to an extent very much higher than in the primitive groups who were not under this influence.”59 Our focus should not be on any given compound which, when isolated and given to animals at thousands of times the concentration found in food, produces toxic effects, but on what type of diet as a whole is able to promote long, healthful and happy lives. Price demonstrated that the healthiest of humans have always included animal products as a valuable and important part of such a diet, a truth whose relevance persists today.

Be Kind to Your Grains…And Your Grains Will Be Kind To You
Posted on January 1, 2000 by Sally Fallon and Mary G. Enig, PhD

The science of nutrition seems to take a step backwards for every two steps it takes forward. When the study of vitamins was in its infancy, researchers realized that white flour lacked the nutrients that nature put into whole grains. One of these researchers was Dr.
Weston Price, who noted in his studies of isolated, so-called “primitive” peoples that when white flour and other devitalized foods were introduced into these communities, rampant tooth decay and disease of every sort soon followed. But defenders of the new refining process argued that phosphorus in whole grains was “too acid” and was the true cause of bone loss and tooth decay. Warnings against the use of white flour went largely ignored. Only in recent decades has Dr. Price been vindicated. Even orthodox nutritionists now recognize that white flour is an empty food, supplying calories for energy but none of the bodybuilding materials that abound in the germ and the bran of whole grains. We’ve taken two important steps forward—but unfortunately another step backward in that now whole grain and bran products are being promoted as health foods without adequate appreciation of their dangers. These show up not only as digestive problems, Crohn’s disease and colitis, but also as the mental disorders associated with celiac disease. One school of thought claims that both refined and whole grains should be avoided, arguing that they were absent from the Paleolithic diet and citing the obvious association of grains with celiac disease and studies linking grain consumption with heart disease. But many healthy societies consume products made from grains. In fact, it can be argued that the cultivation of grains made civilization possible and opened the door for mankind to live long and comfortable lives. Problems occur when we are cruel to our grains—when we fractionate them into bran, germ and naked starch; when we mill them at high temperatures; when we extrude them to make crunchy breakfast cereals; and when we consume them without careful preparation. Grains require careful preparation because they contain a number of antinutrients that can cause serious health problems. Phytic acid, for example, is an organic acid in which phosphorus is bound. It is mostly found in the bran or outer hull of seeds. Untreated phytic acid can combine with calcium, magnesium, copper, iron and especially zinc in the intestinal tract and block their absorption. This is why a diet high in improperly prepared whole grains may lead to serious mineral deficiencies and bone loss. The modern misguided practice of consuming large amounts of unprocessed bran often improves colon transit time at first but may lead to irritable bowel syndrome and, in the long term, many other adverse effects. Other antinutrients in whole grains include enzyme inhibitors which can inhibit digestion and put stress on the pancreas; irritating tannins; complex sugars which the body cannot break down; and gluten and related hard-to-digest proteins which may cause allergies, digestive disorders and even mental illness. Most of these antinutrients are part of the seed’s system of preservation—they prevent sprouting until the conditions are right. Plants need moisture, warmth, time and slight acidity in order to sprout. Proper preparation of grains is a kind and gentle process that imitates the process that occurs in nature. It involves soaking for a period in warm, acidulated water in the preparation of porridge, or long, slow sour dough fermentation in the making of bread. Such processes neutralize phytic acid and enzyme inhibitors. Vitamin content increases, particularly B vitamins. Tannins, complex sugars, gluten and other difficult-to-digest substances are partially broken down into simpler components that are more readily available for absorption.
Animals that nourish themselves primarily on grain and other plant matter have as many as four stomachs. Their intestines are longer, as is the entire digestion transit time. Man, on the other hand, has but one stomach and a much shorter intestine compared to herbivorous animals. These features of his anatomy allow him to pass animal products before they putrefy in the gut but make him less well adapted to a diet high in grains—unless, of course, he prepares them properly. When grains are properly prepared through soaking, sprouting or sour leavening, the friendly bacteria of the microscopic world do some of our digesting for us in a container, just as these same lactobacilli do their work in the first and second stomachs of the herbivores. So the well-meaning advice of many nutritionists, to consume whole grains as our ancestors did and not refined flours and polished rice, can be misleading and harmful in its consequences; for while our ancestors ate whole grains, they did not consume them as presented in our modern cookbooks in the form of quick-rise breads, granolas, bran preparations and other hastily prepared casseroles and concoctions. Our ancestors, and virtually all pre-industrialized peoples, soaked or fermented their grains before making them into porridge, breads, cakes and casseroles. A quick review of grain recipes from around the world will prove our point: In India, rice and lentils are fermented for at least two days before they are prepared as idli and dosas; in Africa the natives soak coarsely ground corn overnight before adding it to soups and stews and they ferment corn or millet for several days to produce a sour porridge called ogi; a similar dish made from oats was traditional among the Welsh; in some Oriental and Latin American countries rice receives a long fermentation before it is prepared; Ethiopians make their distinctive injera bread by fermenting a grain called teff for several days; Mexican corn cakes, called pozol, are fermented for several days and for as long as two weeks in banana leaves; before the introduction of commercial brewers’ yeast, Europeans made slow-rise breads from fermented starters; in America the pioneers were famous for their sourdough breads, pancakes and biscuits; and throughout Europe grains were soaked overnight, and for as long as several days, in water or soured milk before they were cooked and served as porridge or gruel. (Many of our senior citizens may remember that in earlier times the instructions on the oatmeal box called for an overnight soaking.) Bread can be the staff of life, but modern technology has turned our bread—even our whole grain bread—into a poison. Grains are laced with pesticides during the growing season and in storage; they are milled at high temperatures so that their fatty acids turn rancid. Rancidity increases when milled flours are stored for long periods of time, particularly in open bins. The bran and germ are often removed and sold separately, when Mother Nature intended that they be eaten together with the carbohydrate portion; they’re baked as quick-rise breads so that antinutrients remain; synthetic vitamins and an unabsorbable form of iron added to white flour can cause numerous imbalances; dough conditioners, stabilizers, preservatives and other additives add insult to injury. Cruelty to grains in the making of breakfast cereals is intense.
Slurries of grain are forced through tiny holes at high temperatures and pressures in giant extruders, a process that destroys nutrients and turns the proteins in grains into veritable poisons. Westerners pay a lot for expensive breakfast cereals that snap, crackle and pop, and they pay again in the rising toll of poor health. The final indignity to grains is that we treat them as loners, largely ignorant of the other dietary factors needed to make use of the nutrients they provide. Fat-soluble vitamins A and D found in animal fats like butter, lard and cream help us absorb calcium, phosphorus, iron, B vitamins and the many other vitamins that grains provide. Porridge eaten with cream will do us a thousand times more good than cold breakfast cereal consumed with skim milk; sourdough whole grain bread with butter or whole cheese is a combination that contributes to optimal health. Be kind to your grains. . . and your grains will deliver their promise as the staff of life. Buy only organic whole grains and soak them overnight to make porridge or casseroles; or grind them into flour with a home grinder and make your own sour dough bread and baked goods. For those who lack the time for breadmaking, kindly-made whole grain breads are now available. Look for organic, stone ground, sprouted or sour dough whole grain breads (we have many brands listed in our yearly Shopping Guide) and enjoy them with butter or cheese.

Copyright: From Nourishing Traditions: The Cookbook that Challenges Politically Correct Nutrition and the Diet Dictocrats by Sally Fallon with Mary G. Enig, PhD. © 1999. All Rights Reserved. To order Nourishing Traditions, go to www.newtrendspublishing.com.

SIDEBAR: THE DANGERS OF MODERN BREAKFAST CEREALS

Modern cold breakfast cereals are made by a process called extrusion. The grains are mixed or mashed with water to make a slurry and then forced out a tiny hole under very high temperatures and pressures. The shape of the die on the hole determines whether the final product will be a flake, a little O, a puffed grain or a shredded grain (for shredded wheat or Triscuits). Extrusion represents extreme cruelty to our grains. The industry has convinced the FDA that high-temperature, high-pressure extruded grains are no different from non-extruded grains and has contrived to ensure that no studies have been published on the effects of extruded foods on either humans or animals. However, two unpublished animal studies indicate that extruded grains are toxic, particularly to the nervous system. One study was described by Paul Stitt in his book Fighting the Food Giants: Stitt worked for a cereal company and found this study locked in a file cabinet. Four sets of rats were given special diets. One group received plain whole wheat, water, vitamins and minerals. Another group received Puffed Wheat, water and the same nutrient solution. A third set was given water and white sugar, and a fourth given nothing but water and the chemical nutrients. The rats that received the whole wheat lived over a year on the diet. The rats that got nothing but water and vitamins lived for about eight weeks, and the animals on a white sugar and water diet lived for a month. But the company’s own laboratory study showed that rats given vitamins, water and all the Puffed Wheat they wanted died in two weeks. It wasn’t a matter of the rats dying of malnutrition; results like these suggested that there was something actually toxic about the Puffed Wheat itself.
Wrote Stitt: “Proteins are very similar to certain toxins in molecular structure, and the puffing process of putting the grain under fifteen hundred pounds per square inch of pressure and then releasing it may produce chemical changes which turn a nutritious grain into a poisonous substance.” The other study, also not published but described over the phone to Sally Fallon Morell by the researcher, Loren Zanier, was performed in 1960 by researchers at the University of Michigan at Ann Arbor. Eighteen rats were divided into three groups. One group received cornflakes and water; a second group was given the cardboard box that the cornflakes came in and water; and the control group received rat chow and water. The rats in the control group remained in good health throughout the experiment and lived over a year. The rats receiving the box became lethargic and eventually died of malnutrition. But the rats receiving cornflakes and water died before the rats that were given the box – the last cornflake rat died on the day the first box rat died. Before death the cornflake rats developed schizophrenic behavior, threw fits, bit each other and finally went into convulsions. Autopsy revealed dysfunction of the pancreas, liver and kidneys and degeneration of the nerves in the spine – all signs of “insulin shock.” The startling conclusion of this study is that there is more nourishment in the box that cold breakfast cereals come in than in the cereals themselves. Millions of children begin their day with a bowl of extruded breakfast cereal. Do the toxic protein fragments in these cereals explain why so many of our children cannot concentrate at school? Although there are no published studies on the effects of breakfast cereals on the health of humans or animals, there is one published study which looked at the effects of extrusion on the proteins in grains (Cereal Chemistry, American Association of Cereal Chemists, 75(2): 217-221, Mar/Apr 1998). The study looked at zeins–grain proteins–which are located in spherical organelles called protein bodies, found in corn. The researchers found that during extrusion, the protein bodies are completely disrupted and the zeins dispersed. The results suggest that the zeins in cornflakes are not confined to rigid protein bodies but can interact with each other and other components of the system, forming new compounds that are foreign to the human body. The extrusion process breaks down the organelles and disperses the proteins, which then become toxic. When the proteins are disrupted in this way, they can adversely affect the nervous system, as indicated by the cornflake experiment. By the way, health food stores also carry extruded grain cereals. These cereals are made by the same process, and often in the same factories, as the cereals sold at the supermarket. Usually these cereals are made with organic grains. Organic grains contain more protein than non-organic grains. . . which means that these health food store cereals probably contain MORE toxic protein fragments than supermarket cereals. Breakfast cereals are a bad deal, all the way around. They are very costly in terms of food dollars spent and their effects on our health. So much better to have eggs and bacon for breakfast, or soaked and cooked porridges with butter and cream.

Wheaty Indiscretions: What Happens to Wheat, from Seed to Storage
Posted on June 30, 2003 by Jen Allbritton

Wheat–America’s grain of choice.
Its hardy, glutenous consistency makes it practical for a variety of foodstuffs–cakes, breads, pastas, cookies, bagels, pretzels and cereals that have been puffed, shredded and shaped. This ancient grain can actually be very nutritious when it is grown and prepared in the appropriate manner. Unfortunately, the indiscretions inflicted by our modern farming techniques and milling practices have dramatically reduced the quality of the commercial wheat berry and the flour it makes. You might think, “Wheat is wheat–what can they do that makes commercial varieties so bad?” Listen up, because you are in for a surprise! It was the cultivation of grains–members of the grass family–that made civilization possible.1 Since wheat is one of the oldest known grains, its cultivation is as old as civilization itself. Some accounts suggest that mankind has used this wholesome food since 10,000 to 15,000 BC.2 Upon opening Egyptian tombs, archeologists discovered large earthenware jars full of wheat to “sustain” the Pharaohs in the afterlife. Hippocrates, the father of medicine, was said to recommend stone-ground flour for its beneficial effects on the digestive tract. Once humans figured out how to grind wheat, they discovered that when water is added it can be naturally fermented and turned into beer and expandable dough.2 Botanists have identified almost 30,000 varieties of wheat, which are assigned to one of several classifications according to their planting schedule and nutrient composition3–hard red winter, hard red spring, soft red winter, durum, hard white and soft white. Spring wheat is planted in the spring, and winter wheat is planted in the fall and shoots up the next spring to mature that summer. Soft, hard, and durum (even harder) wheats are classified according to the strength of their kernel. This strength is a function of the protein-to-starch ratio in the endosperm (the starchy middle layer of the seed). Hard wheats contain less starch, leaving a stronger protein matrix.3 With the advent of modern farming, the number of varieties of wheat in common use has been drastically reduced. Today, just a few varieties account for 90 percent of the wheat grown in the world.1 When grown in well-nourished, fertile soil, whole wheat is rich in vitamin E and B complex and many minerals, including calcium and iron, as well as omega-3 fatty acids. Proper growing and milling methods are necessary to preserve these nutrients and prevent rancidity. Unfortunately, due to the indiscretions inflicted by contemporary farming and processing on modern wheat, many people have become intolerant or even allergic to this nourishing grain. These indiscretions include depletion of the soil through the use of chemical fertilizers, pesticides and other chemicals, high-heat milling, refining and improper preparation, such as extrusion.1 Rather than focus on soil fertility and careful selection of seed to produce varieties tailored to a particular micro-climate, modern farming practices use high-tech methods to deal with pests and disease, leading to overdependence on chemicals and other substances. IT STARTS WITH THE SEED Even before they are planted in the ground, wheat seeds receive an application of fungicides and insecticides.
Fungicides are used to control diseases of seeds and seedlings; insecticides are used to control insect pests, killing them as they feed on the seed or emerging seedling.7 Seed companies often use mixtures of different seed-treatment fungicides or insecticides to control a broader spectrum of seed pests.8 PESTICIDES AND FERTILIZERS Some of the main chemicals (insecticides, herbicides and fungicides) used on commercial wheat crops are disulfoton (Di-syston), methyl parathion, chlorpyrifos, dimethoate, dicamba and glyphosate.9 Although all these chemicals are approved for use and considered safe, consumers are wise to reduce their exposure as much as possible. Besides contributing to the overall toxic load in our bodies, these chemicals increase our susceptibility to neurotoxic diseases as well as to conditions like cancer.10 Many of these pesticides function as xenoestrogens, foreign estrogens that can wreak havoc with our hormone balance and may be a contributing factor to a number of health conditions. For example, researchers speculate these estrogen-mimicking chemicals are one of the contributing factors to boys and girls entering puberty at earlier and earlier ages. They have also been linked to abnormalities and hormone-related cancers including fibrocystic breast disease, breast cancer and endometriosis.13 HORMONES ON WHEAT? Sounds strange, but farmers apply hormone-like substances or “plant growth regulators” that affect wheat characteristics, such as time of germination and strength of stalk.11 These hormones are either “natural,” that is, extracted from other plants, or synthetic. Cycocel is a synthetic hormone that is commonly applied to wheat. Moreover, research is being conducted on how to manipulate the naturally occurring hormones in wheat and other grains to achieve “desirable” changes, such as regulated germination and an increased ability to survive in cold weather.12 No studies exist that isolate the health risks of eating hormone-manipulated wheat or varieties that have been exposed to hormone application. However, there is substantial evidence about the dangers of increasing our intake of hormone-like substances. CHEMICALS USED IN STORAGE Chemical offenses don’t stop after the growing process. The long storage of grains makes them vulnerable to a number of critters. Before commercial grain is even stored, the collection bins are sprayed with insecticide, inside and out. More chemicals are added while the bin is filled. These so-called “protectants” are then added to the upper surface of the grain as well as four inches deep into the grain to protect against damage from moths and other insects entering from the top of the bin. The list of various chemicals used includes chlorpyrifos-methyl, diatomaceous earth*, Bacillus thuringiensis, cyfluthrin, malathion and pyrethrins.14 Then there is the threshold test. If there is one live insect per quart of sample, fumigation is initiated. The goal of fumigation is to “maintain a toxic concentration of gas long enough to kill the target pest population.” The toxic chemicals penetrate the entire storage facility as well as the grains being treated. Two of the fumigants used are methyl bromide and phosphine-producing materials, such as magnesium phosphide or aluminum phosphide.14 GRAIN DRYING Heat damage is a serious problem that results from the artificial drying of damp grain at high temperatures.
Overheating causes denaturing of the protein26 and can also partially cook the protein, ruining the flour’s baking properties and nutritional value. According to Ed Lysenko, who tests grain by baking it into bread for the Canadian Grain Commission’s grain research laboratory, wheat can be dried without damage by using re-circulating batch dryers, which keep the wheat moving during drying. He suggests an optimal drying temperature of 60 degrees Celsius (140 degrees Fahrenheit).27 Unfortunately, grain processors do not always take these precautions. MODERN PROCESSING The damage inflicted on wheat does not end with cultivation and storage, but continues into milling and processing. A grain kernel is comprised of three layers: the bran, the germ and the endosperm. The bran is the outside layer where most of the fiber exists. The germ is the inside layer where many nutrients and essential fatty acids are found. The endosperm is the starchy middle layer. The high nutrient density associated with grains exists only when these three are intact. The term whole grain refers to the grain before it has been milled into flour. It was not until the late nineteenth century that white bread, biscuits, and cakes made from white flour and sugars became mainstays in the diets of industrialized nations, and these products were only made possible with the invention of high-speed milling machines.28 Dr. Price observed the unmistakable consequences of these dietary changes during his travels and documented their corresponding health effects. These changes not only resulted in tooth decay, but problems with fertility, mental health and disease progression.30 Flour was originally produced by grinding grains between large stones. The final product, 100 percent stone-ground whole-wheat flour, contained everything that was in the grain, including the germ, fiber, starch and a wide variety of vitamins and minerals. Without refrigeration or chemical preservatives, fresh stone-ground flour spoils quickly. After wheat has been ground, natural wheat-germ oil becomes rancid at about the same rate that milk becomes sour, so refrigeration of whole grain breads and flours is necessary. Technology’s answer to these issues has been to apply faster, hotter and more aggressive processing.28 Since grinding stones are not fast enough for mass-production, the industry uses high-speed, steel roller mills that eject the germ and the bran. Much of this “waste product”–the most nutritious part of the grain–is sold as “byproducts” for animals. The resulting white flour contains only a fraction of the nutrients of the original grain. Even whole wheat flour is compromised during the modern milling process. High-speed mills reach 400 degrees Fahrenheit, and this heat destroys vital nutrients and creates rancidity in the bran and the germ. Vitamin E in the germ is destroyed–a real tragedy because whole wheat used to be our most readily available source of vitamin E. Literally dozens of dough conditioners and preservatives go into modern bread, as well as toxic ingredients like partially hydrogenated vegetable oils and soy flour. Soy flour–loaded with antinutrients–is added to virtually all brand-name breads today to improve rise and prevent sticking. 
The extrusion process, used to make cold breakfast cereals and puffed grains, adds insult to injury with high temperatures and high pressures that create additional toxic components and further destroy nutrients–even the synthetic vitamins that are added to replace the ones destroyed by refinement and milling. People have become accustomed to the mass-produced, gooey, devitalized, and nutritionally deficient breads and baked goods and have little recollection of how real bread should taste. Chemical preservatives allow bread to be shipped long distances and to remain on the shelf for many days without spoiling and without refrigeration. HEALTHY WHOLE WHEAT PRODUCTS Ideally, one should buy whole wheat berries and grind them fresh to make homemade breads and other baked goods. Buy whole wheat berries that are grown organically or biodynamically–biodynamic farming involves higher standards than organic.34 Since these forms of farming do not allow synthetic, carcinogenic chemicals and fertilizers, purchasing organic or biodynamic wheat assures that you are getting the cleanest, most nutritious food possible. It also automatically eliminates the possibility of irradiation31 and genetically engineered seed. The second-best option is to buy organic 100 percent stone-ground whole-wheat flour at a natural food store. Slow-speed, steel hammer-mills are often used instead of stones, and flours made in this way can list “stone-ground” on the label. This method is equivalent to the stone-ground process and produces a product that is equally nutritious. Any process that renders the entire grain into usable flour without exposing it to high heat is acceptable. If you do not make your own bread, there are ready-made alternatives available. Look for organic sourdough or sprouted breads freshly baked or in the freezer compartment of your market or health food store. If bread is made entirely with 100 percent stone-ground whole grains, it will state so on the label. When bread is stone ground and then baked, the internal temperature does not usually exceed 170 degrees, so most of the nutrients are preserved.28 As they contain no preservatives, both whole wheat flour and its products should be kept in the refrigerator or freezer. Stone-ground flour will keep for several months frozen.28 Sprouting, soaking and genuine sourdough leavening “pre-digest” grains, allowing the nutrients to be more easily assimilated and metabolized. This is an age-old approach practiced in most traditional cultures. Sprouting begins germination, which increases the enzymatic activity in foods and inactivates substances called enzyme inhibitors.1 These enzyme inhibitors prevent the activation of the enzymes present in the food and, therefore, may hinder optimal digestion and absorption. Soaking neutralizes phytic acid, a component of plant fiber found in the bran and hulls of grains, legumes, nuts, and seeds that reduces mineral absorption.32 All of these benefits may explain why sprouted foods are less likely to produce allergic reactions in those who are sensitive.1 Sprouting also causes a beneficial modification of various nutritional elements. According to research undertaken at the University of Minnesota, sprouting increases the total nutrient density of a food.
For example, sprouted whole wheat was found to have 28 percent more thiamine (B1), 315 percent more riboflavin (B2), 66 percent more niacin (B3), 65 percent more pantothenic acid (B5), 111 percent more biotin, 278 percent more folic acid, and 300 percent more vitamin C than non-sprouted whole wheat. This phenomenon is not restricted to wheat. All grains undergo this type of quantitative and qualitative transformation. These studies also confirmed a significant increase in enzymes, which means the nutrients are easier to digest and absorb.33 You have several options for preparing your wheat. You can use a sour leavening method by mixing whey, buttermilk or yogurt with freshly ground wheat or quality pre-ground wheat from the store. Or, soak your berries whole for 8 to 22 hours, then drain and rinse. There are some recipes that use the whole berries while they are wet, such as cracker dough ground right in the food processor. Another option is to dry sprouted wheat berries in a low-temperature oven or dehydrator, then grind them in your grain mill and use the flour in a variety of recipes. Although our modern wheat suffers from a great number of indiscretions, there are steps we can take to find the quality choices that will nourish us today and for the long haul. Go out and make a difference for you and yours and turn your wheaty indiscretions into wheaty indulgences.

SIDEBAR ARTICLES

SPELT AND KAMUT® Spelt is a distant cousin to modern wheat and one of the oldest cultivated grains. Current research indicates few differences between hard red wheat and Canadian spelt. Researchers have also found evidence supporting the claim that spelt may be easier for humans to digest than wheat.4 Modern wheat has been altered over the years through breeding to simplify its growth and harvesting, increase its yield and raise its gluten content for the production of commercial baked goods–all of which has rendered modern wheat more difficult to digest. Spelt, on the other hand, has not been as popular in our food supply and has therefore retained many of its original traits.5 Kamut® is also an ancient relative of modern wheat, durum wheat to be exact. Actually, “kamut” is an ancient word for wheat. Similar to spelt, this grain has been untouched by modern plant-breeding techniques that have been imposed on wheat.6 IRRADIATION Wheat and wheat flour were some of the first foods the Food and Drug Administration (FDA) approved for irradiation.15 A 1963 ruling applied to imported grains. In 1968, the FDA approved irradiation for US wheat berries and flour to control insects.16 Irradiation is the practice of using either high-speed electron beams or high-energy radiation to break chemical bonds and ionize molecules that lie in their path.17 According to proponents of this technology, irradiation can provide more food security for the world by eradicating storage pests in grain, killing fruit flies in fruit, preventing mold growth, delaying ripening, preventing the sprouting of potatoes, onions and garlic, and extending the shelf life of meat, fish and shellfish – all without health consequences. However, research tells us something quite different. One particularly interesting study on the dangers of irradiation was published in The American Journal of Clinical Nutrition18 in 1975. Ten children were divided into two groups of five. Before the trial started, blood samples were taken and examined for each child.
The diets given to each group were identical except that the wheat for the experimental group had been irradiated two or three days earlier with a dose recommended for grain disinfestation. After four weeks, the examination of blood samples showed abnormal cell formation in four of the five children given irradiated wheat. No signs of abnormal cell development appeared in the control group. One particularly disturbing cell type found in the experimental group was polyploid lymph cells. Lymph cells (lymphocytes) are a vital component of the immune system, and these abnormal varieties occur routinely in patients undergoing radiation treatment. In fact, the level of these abnormal lymph cells is often used as a measure of radiation exposure for people accidentally exposed to radiation.19 After six weeks, blood samples were taken again and a sharp increase of polyploid lymph cells was seen when compared to the level at four weeks. Because of concern for the children’s health, the study was terminated. It was argued that the main culprit in the increase of cell abnormalities was the fact that the wheat was “freshly irradiated.” Therefore, a subsequent study looked at the effects of feeding wheat that had been irradiated and then stored for 12 weeks. The polyploid cells took a little longer to show up–six weeks instead of four. After the irradiated wheat had been withdrawn, it took 24 weeks before the blood of the test children reverted to normal. To verify their results, the researchers continued with experimental animals and found the same results in both monkeys and rats–a progressive increase of polyploid lymph cells and a gradual disappearance of these cells after withdrawal of the irradiated wheat.20,21,22,23 Thus, the dangers of irradiated foods are evident, whether the food has been freshly irradiated or stored for a period of time. Other long-term health implications from eating irradiated foods include lowered immune resistance, decreased fertility, damage to kidneys, depressed growth rates, as well as a reduction in vitamins A, B complex, C, E and K.24

NUTRIENT LOSS FROM REFINING OF WHEAT29
Thiamine (B1) 77%
Riboflavin (B2) 80%
Niacin 81%
Pyridoxine (B6) 72%
Pantothenic acid 50%
Vitamin E 86%
Calcium 60%
Phosphorus 71%
Magnesium 84%
Potassium 77%
Sodium 78%
Chromium 40%
Manganese 86%
Iron 76%
Cobalt 89%
Zinc 78%
Copper 68%
Selenium 16%
Molybdenum 48%

GENETICALLY ENGINEERED WHEAT Genetic Engineering (GE) is the process of altering or disrupting the genetic blueprints of living organisms–plants, trees, fish, animals and microorganisms. Genes are spliced to incorporate a new characteristic or function into an organism. For example, scientists can mix a gene from a cold-water fish into a strawberry plant’s DNA so it can withstand colder temperatures. So far, the most widely used GE foods are soy, cotton and corn. Monsanto hopes to commercialize Roundup Ready Wheat sometime between 2003 and 2005. This crop would join a number of other crops engineered to resist the Roundup herbicide. Proponents of GE claim that this “technology” will make agriculture sustainable, eliminate world hunger, cure disease and improve public health–but have they factored in the enormous risks? When surveyed, most consumers do not want to eat genetically modified foods, and even commercial farmers are wary. Wheat farmers are scared by the StarLink corn fiasco. Iowa farmers planted one percent of their 2000 corn crop as StarLink, a genetically engineered corn approved only for animal consumption.
By harvest time, almost 50 percent of the Iowa crop tested positive for StarLink. Product recalls, consumer outcry and export difficulties have ensued. This mistake resulted in the recall of hundreds of millions of dollars of food products and seeds. As for exporting, our overseas consumers say they will not accept any wheat that has been genetically engineered. For this reason, Monsanto has put the development of GE wheat on temporary hold.25 USING WHEAT IN BAKING When deciding which wheat berries to use for baking, the main categories to consider are hard and soft. Hard wheat is higher in protein, particularly gluten, making it more elastic and the best choice for making breads. Gluten traps carbon dioxide during the leavening process, allowing the dough to rise. Durum wheats, used mostly for pasta, are even harder. Soft wheats are lower in protein and are more appropriate for cookies, crackers, soda breads and other baked goods.

WAPF received the following letter from Lorraine Iverson of the EPA on August 10, 2004: I have thoroughly enjoyed your website over the past few years and have found it to be an authority on nutrition in America. Please keep up the good work. I am a chemist with the EPA, and I spend my days analyzing soil, water and fish samples for pesticides such as chlordane, DDT and chlorpyrifos, to name a few. I was reading the article “Wheaty Indiscretions” with great interest today, when I noticed that diatomaceous earth was listed as a chemical used in storage in the paragraph titled Chemicals Used in Storage. This material is not toxic, nor is it a manmade chemical. It is the residue left over from plankton, and it kills bugs by mechanical means, not chemical. To list this material alongside chlorpyrifos and malathion gives the wrong impression.

Our Daily Bread
Posted on July 12, 2003 by Katherine Czapp

My father Vasilii was diagnosed with celiac disease, or gluten intolerance, three years ago after nearly 30 years of suffering from chronic digestive and absorption problems that no one could explain or help relieve. Although complications from the disease had brought him to a very precarious state of health, he made steady improvement the moment all sources of gluten were banned from his diet and nutrient-dense foods–including plenty of gelatin-rich broth–were introduced instead. So dramatic was his recovery, in fact, that about a year and a half later, my father was feeling so good that he believed himself “cured,” and so when a local farm implement dealership sponsored a pancake breakfast which would be an opportunity to socialize with many neighbors, my father wanted to participate. My mother strongly advised against this move, but they ultimately compromised: Dad could have one pancake. Dad kept to the bargain, but less than 48 hours later, unpleasant symptoms of an allergic reaction began and did not fully disappear for almost 2 weeks. A bitter lesson, but one not likely to be soon forgotten. Bread–A Modern Curse? According to recent news articles, celiac disease–the inability to digest certain proteins in gluten-containing grains such as wheat, rye, barley and oats–afflicts at least one in 30 people. So common and so debilitating is this malady that many popular nutrition doctors and nutrition writers forbid the consumption of grains as a matter of course.
In The Paleo Diet, for example, author Loren Cordain blames the consumption of grains for our modern deficiency diseases, and the narrowing of the jaw so prevalent in modern humans. According to Barry Sears, PhD, author of The Zone Diet, the switch to a grain-based diet in Egypt was a chief factor in the emergence of the diseases of modern civilization. Dr. Joe Mercola tells his patients to avoid grain, period. Yet Weston Price studied several societies that enjoyed remarkably good health even though they consumed grains as a principal foodstuff. The primitive Swiss of the Loetschental Valley baked a sourdough bread in communal ovens, made of locally grown rye ground fresh in a stone mill. Rye bread plus rich dairy products–milk, butter and cheese–were the chief articles of the diet. Likewise, the primitive Gaelic peoples subsisted on seafood and oats. Both these groups exhibited beautiful facial structure and were free of deficiency diseases. Price also found healthy groups in Africa and South America that consumed large quantities of grain, usually as a sour fermented porridge or beverage. Bread–The Staff of Life? Back to Dad: during his recovery, my husband Garrick, originally from Russia, took an interest in genuine Russian sourdough breads. Eventually he perfected his sourdough bread recipe and had baked and frozen many loaves of bread for my mother, who had stopped baking bread at home in deference to my father’s inability to eat it. I might add here that the gluten-free breads commercially available are mostly pretty wretched. All sorts of odd things are thrown together in a sad attempt to mimic the real thing: rice flour, bean flour, xanthan gum, potato starch, and so on. The taste and texture of them bring to mind siege conditions. As we sat around my parents’ dining table discussing the mechanism of sourdough culture–yeast that is the leavener, and bacteria that develop the gluten and thereby the taste–we all had the same thought: perhaps the long, slow fermentation somehow digests the gluten? Perhaps it would be safe for my father to eat? Of course after his nasty pancake experience no one could expect him to become a guinea pig again, but in the interests of science, Dad decided he would try a piece of Garrick’s bread. To our great relief and cautious excitement, after a couple of days there was no reaction, so he ate the bread again. And again there was no reaction. He has continued to enjoy the bread for the last year and a half, although he does not eat it every day. He has never had any adverse reaction. In fact, he is now able to tolerate oats and corn and spelt (which he had been unable to do initially) and eats them in moderation as he does Garrick’s bread. He is also able to enjoy French and Belgian unpasteurized bitters and ale (made from barley malt and hops). Of course each person with celiac disease has a unique response to the condition and recovery varies greatly. I don’t know that every person with celiac disease could tolerate this bread, and in fact, several people to whom I told this story were horrified that my father would eat it. But he is strong and energetic and agile: last fall at age 75 he re-shingled their garage alone, and he and my mother baled hay which he stacked in their barn. He fells dry trees in their woods and chops them to heat their house. I only hope I have as much energy at the same age and can take the same pleasure in life.
According to a recent article in Science (September 27, 2002), gluten in grain is not fully broken down, even by all the digestive enzymes normally present in the digestive tract. What does break down gluten, according to the article, is a bacterial enzyme. . . just what the bacteria in a sour dough culture are likely to produce! The Science article stated sadly that it would be years before medicine would have a pill available for celiac sufferers–but why not just apply a little logic to the problem and go back to preparing bread with a long fermentation? This ancient method not only seems to digest or completely break down the gluten (as my father’s experience proves), but also neutralizes enzyme inhibitors (that interfere with digestion) and phytic acid (that blocks mineral absorption). Bread prepared in this old-fashioned way is truly the staff of life–a highly nutritious storage food that provides many nutrients in a form that is delicious and easy to digest. F.B.I. (Friendly But Important) warning: In our house, the daily bread is made weekly. Garrick has perfected the method and now it is routine. Before giving all the particulars, let me provide Garrick’s F.B.I., friendly but important, warning. I’ll let him put it in his own words: “When making the starter, do not use on yourself perfumes and be sure your kitchen is free of chemical cleaners. Also it is a good idea to use a wooden bowl and spoon. Keep fur of cats and dogs out of your bread-making area and avoid major fightings between spouses. Remember: the dough has a better memory than you. All sins of omission or commission will be revealed later in the bread. This is true karma essence.”

Tools You Will Need
1. Kitchen scale calibrated in grams
2. Covered clay pot, such as a Romertopf baker, about 7 1/2 inches high by 10 1/2 inches wide by 14 1/2 inches long
3. Oval basket or banneton, about 12 inches by 9 inches by 4 inches deep, in which dough will proof. These can be obtained from the Baker’s Catalog (800) 827-6836, www.BakersCatalogue.com
4. Baker’s mittens
5. Room thermometer
6. Probe-type thermometer (as for poultry)
7. Grain mill, hand-operated or electric
8. Large mixing bowl, preferably wooden

Making Your Starter
You will need 200 grams of organic hard red winter wheat berries and 200 grams organic rye berries.
Day One: Grind all the grain together and take from this flour mixture 120 grams and place in a clean vessel. Pour over 120 grams spring water and mix well. Cover loosely with a piece of unbleached parchment paper and a damp towel and allow to stand for 48 hours at 60-65 degrees.
Day Three: You now have 240 grams of not very active starter. Discard half of it and to the remaining portion add 60 grams of the wheat/rye flour and 60 grams spring water. Mix well, cover as before and allow to stand another 24 hours.
Day Four: Repeat the same process as the day before.
Day Five: By now, one hopes to have a visibly active starter: bubbling and with a good smell–something like a wine-y smell. (If it smells bad at this point, discard and start all over again.) Let the starter stand, covered, for another 2 or 3 days to ferment further. It should become light, airy and fragrant. At this point you have approximately 240 grams of starter ready to use for baking your first loaf. Note that for each subsequent loaf you will use a piece of the previous bread’s dough as your starter; this piece of dough is called the chef.
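For anyone who likes to see the bookkeeping, the little sketch below (in Python, purely illustrative) walks through the feeding schedule just described, tracking the grams of flour and water in the culture. It assumes nothing beyond the quantities given above; the “100 percent hydration” in the comment simply means equal weights of flour and water.

```python
def feed(culture, flour_g, water_g):
    """Add flour and water (grams) to the culture."""
    return {"flour": culture["flour"] + flour_g,
            "water": culture["water"] + water_g}

def discard_half(culture):
    """Throw away half of the culture, as on Days Three and Four."""
    return {part: grams / 2 for part, grams in culture.items()}

culture = feed({"flour": 0, "water": 0}, 120, 120)   # Day One
for _ in ("Day Three", "Day Four"):                  # two discard-and-feed rounds
    culture = feed(discard_half(culture), 60, 60)

total_grams = culture["flour"] + culture["water"]
hydration = 100 * culture["water"] / culture["flour"]
print(total_grams, hydration)   # 240.0 grams of starter at 100% hydration
```

Running it confirms what the instructions say: you end up with roughly 240 grams of active starter, always kept at equal parts flour and water.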
Making Your Bread
The following recipe is for one loaf of wheat bread of approximately 4 pounds, leavened over 4 days. The ideal room temperature for fermentation is about 65-70 degrees F. At temperatures above or below this ideal, the timing of procedures will vary. Experience will teach you to adjust according to the conditions of your own kitchen. The following times and hours of fermentation are given as examples and are approximate. We use all organic grains which we grind ourselves, spring water and Celtic sea salt. The technique of baking in the covered clay pot may seem at first cumbersome, but I highly recommend it for producing a loaf that most closely replicates one baked in a wood-fired brick oven in both beauty and flavor. The day before you begin, grind all your flour together: 800 grams of hard red winter wheat berries, 600 grams of hard red spring wheat berries, and 200 grams of rye berries for a total of 1600 grams of flour. Keep the flour in a covered bowl at room temperature until used.
Day One, 10 am: Place your starter in a large mixing bowl and add 300 grams water. Stir until you have a milky, homogenous mixture. Add 300 grams of your flour mixture and stir very well. Shape into a ball, dust with flour and loosely cover the bowl with parchment paper and a kitchen towel. Let it ferment about 20-24 hours.
Day Two, Morning: Add 400 grams of water to your dough and mix very thoroughly, until it has the consistency of soft butter. Now add 600 grams of your flour mix, stir very well and knead with a wooden spoon or spatula right in the bowl. Again shape into a ball, dust with flour and cover loosely with parchment paper and kitchen towel for another 20-24 hours.
Day Three, 9 am: Dissolve completely 20 grams of sea salt in 260 grams of water. Crush to powder 2 teaspoons of coriander seeds in a mortar with pestle. Add this to the salty water and pour all together over your dough. Stir very well until you have a homogenous mass again. Now add the remaining flour to this mass, stirring, and then kneading first in the bowl, then on a floured surface for approximately 30-50 turns of dough. You are now making the final dough, but not yet the final shaped loaf. The dough should be slightly sticky and not dryish. Form into a ball and let it rest, covered by parchment paper and towel, on the table for about a half hour. Next, knead again briefly for 1-2 minutes, form into a ball, return to original bowl, dust with flour and cover with parchment paper and a towel. Let the dough ferment for 10-12 hours.
Day Three, Evening: Prepare your banneton (oval basket) for proofing the loaf: line it with a thin kitchen towel and then parchment paper. The size of the banneton and your clay pot should accommodate your loaf without crowding. A little experimenting will show you what works. Now comes a step you mustn’t forget! Cut off a piece of your dough of approximately 280 grams. This is your chef, which you want to save and use for starting your next bread. Dust it with flour, wrap in a piece of parchment paper secured with masking tape, place it in a covered container and refrigerate. You can use it tomorrow or next week or even next month as the starter of your next bread. Now you have the major part of your dough remaining in the bowl. If the temperature in your kitchen is 70 degrees or more, you can shape your loaf and proof it overnight and bake it the next morning: take your dough out of the bowl and place on a floured surface.
Knead for several minutes, using a bit of flour, but do not allow the dough to become dry. Shape the dough into an oval close to the shape of the inside of the clay baker. Dust lightly with flour and place in the banneton. Cover with parchment paper and light towel and let proof overnight. However, if your kitchen is 62 degrees or less, as ours is in winter, you will proceed a little differently: cover the bowl with parchment paper and a towel and leave until the morning.
Day Four, Morning: The cool-kitchen loaf needs a boost: shape the loaf as described above and place, uncovered, into the oven with an electric pan of steaming water that raises the temperature to about 95 degrees. In the case of both rising methods, the dough is proofed when you press it with your finger and the indentation does not spring back, and the dough is soft and there are slight cracks in the surface–usually this takes about 3 hours with the oven-steam technique. Now prepare your clay pot for baking: follow manufacturer’s instructions regarding soaking the pot about a half hour before baking time. Wipe dry, and place covered pot in cold oven and set temperature to 450 degrees. When temperature is reached, use baker’s mittens to take out clay pot, open and wipe up any drops of water–areas where the bread would stick! Strew coarsely ground grain on pot’s bottom and then take banneton and carefully but decisively overturn the loaf into the hot pot. Slash the loaf’s top in several places, cover pot and place in oven. Immediately raise the temperature to 500 degrees F. When the oven reaches 500 degrees, hold it there for 5 minutes, then lower the temperature to 450 degrees and bake for one hour. Now prepare a shallow pan with about an inch of hot water. Open oven, place pan on bottom of oven and remove the top of the clay baker. Reset oven temperature to 425 degrees and bake 25 minutes more. Remove bread from oven and use your thermometer to check the inner temperature of the bread. It should be close to 205 degrees (plus or minus 5 degrees). Remove bread from clay baker. Thumping the bottom of the bread should elicit a hollow sound. Place on rack, dab hot water on top and sides to help soften the crust, and cover with several kitchen towels. Let your bread cool at least 3 or 4 hours before tasting–the bread will still be warm and your house will smell absolutely wonderful! The crust will be more pliant the next day. This bread is dense, but has a very nice crumb and can be sliced thinly for open-faced sandwiches. It keeps a long time (more than a week) if well-covered, and freezes beautifully. And nothing else comes close to the well-developed flavor of the grains in this bread.

Nostalgia
Virtually everybody who has had Garrick’s bread, and is from Eastern Europe, Lithuania or Germany, is immediately plunged into a nostalgia for the last time they had such bread, usually in a village, often before World War II. One woman remembered that her grandfather had been a village baker and made such bread before the War–she hadn’t thought of any of this for decades. For Garrick, too, who is 68, baking this bread has significance on many levels, and it has been, unpredictably, a healing avocation. Unpredictable and unlikely too: Garrick can boil potatoes and make tea, but that about covers his culinary experience. Until this bread! Who knew he had this talent lying hidden for so long? But now he gains so much satisfaction from baking bread and giving it to others to enjoy. Needless to say, he’s so pleased to share the recipe.
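As an aside for bakers who think in baker's percentages, here is a small, purely illustrative sketch that adds up the flour and water in Garrick's formula. The only assumption is that the 240 grams of starter is half flour and half water, which is how the starter instructions build it; every other number comes straight from the recipe.

```python
# Totting up the flour and water in the bread formula above. The starter is
# assumed to be equal parts flour and water (as built in "Making Your
# Starter"); all other quantities are taken directly from the recipe.

flour_mix_g = 800 + 600 + 200          # ground the day before = 1600 g
added_water_g = 300 + 400 + 260        # Day One + Day Two + Day Three
starter_flour_g, starter_water_g = 120, 120

total_flour_g = flour_mix_g + starter_flour_g
total_water_g = added_water_g + starter_water_g
hydration_pct = 100 * total_water_g / total_flour_g

print(total_flour_g, total_water_g, round(hydration_pct))   # 1720 1080 63
```

A whole grain dough at roughly 63 percent hydration is on the firm side, which is consistent with the dense, thinly sliceable crumb described above.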
Sidebars

Sourdough Blini
The same starter that goes on to become bread can be used in a diluted form to make pancakes, or better still, blini, which we find the most delicious. Blini are crepes or thin pancakes found in traditional Russian cuisine and are raised by yeast, in this case, by wild yeast. Blini batter is a very useful resource in your pantry when unexpected guests come. In a few moments you can serve an elegant snack of tender blini filled with grated Gruyere cheese and scallions with a glass of white wine, or sweet blini filled with sour cream and fruit preserves with tea–or even filled with cream cheese, onions and caviar. Versatile and delicious and quite easy! Once you have an active starter, you will use soft winter wheat berries that you grind as you need. (Sometimes I add some buckwheat flour to the batter for a different taste.) The basic formula is to “feed” your culture at least 8 hours before you plan to use it. You can make the batter rather thick at this stage, adding only flour and water; exact quantities are not important. When you are ready to use the batter you will thin it with a beaten egg and enough milk or cream to produce the consistency of thin cream. Prepare the blin in a 10-inch skillet or sauté pan–I use a well-seasoned cast iron pan. Let the pan become very hot on the stove and grease with a nub of coconut butter or ghee or swipe pan with a piece of pork fatback (butter will burn). Quickly add about 3 1/2 tablespoons of batter and swirl in the pan to cover the entire surface. The blin will cook quickly and is ready to turn when the top surface becomes dry. Flip over and cook for about 45 seconds more. Serve immediately and allow diner to add fillings: caviar and minced scallion, mushroom ragout, curried chicken–you get the idea! To serve with cheese, add grated cheese directly to the blin in the pan right before you would normally flip it. Fold blin in half and allow to cook for 30 seconds more (just enough to barely melt the cheese) and serve–this is like a Russian quesadilla! If you do not plan to use the batter every day, feed the batter modestly and refrigerate. Remember to feed it a few tablespoons of flour and water every 2-3 days as needed while refrigerated and not in use. The batter will still ferment, although slowly, in the refrigerator. You can use it directly out of the fridge, but it behaves better if you allow it to return to room temperature overnight. About the only thing that will harm this batter is high temperature, so plan to keep your culture cool during the summer. With a little care, your culture should last a very long time. Our culture has been in the same bowl for 2 years now!

My Weekly Bread Routine
By Tom Cowan, MD

Most people say they don’t have time to make bread. Yet with my full-time, busy medical practice, I still make sourdough bread every week. Here is the routine I have developed, along with a couple of tricks to ensure the bread turns out well every time. One is to let the bread rise in a “tupelo bread bowl,” available from Lehman’s (888) 342-2387. The bowl keeps the dough warm and ensures a good rise. Also, I add 1/8 teaspoon yeast plus 1 teaspoon Rapadura to the dough on the morning of baking–this ensures a good rise every time. The routine goes like this:
• Every day grind about 1 cup biodynamic rye berries. On the first day, place the flour into a bowl, add enough water to make a soupy consistency and cover with a cloth. On each subsequent day, add another 1 cup flour plus water to the mixture.
This is your starter. Keep at room temperature, but if you won’t be using the starter for a while, store in the refrigerator.

• The night before baking, grind about 8 cups flour–I like organic spelt berries or biodynamic nonhybrid wheat the best, but any flour will work. In a large bowl place 1 cup starter, 1 3/4 cups cool water and 2 cups freshly ground flour. Cover and allow to sit overnight.

• In the morning, soak 1/8 teaspoon baker’s yeast in 1/4 cup water. Stir in 2 teaspoons Celtic sea salt and 1 teaspoon Rapadura (dehydrated cane sugar juice). Add this mixture to the flour mixture, stir and begin adding the rest of the flour.

• Once it forms into a ball, place the ball onto the counter, cover with a damp cloth and allow to rest for 10 minutes. Then knead the dough for 10 minutes, adding new flour to keep it from sticking.

• Oil the tupelo bread bowl, shape the dough into a ball and place in the bread bowl. Cover with a damp cloth and allow to sit between 2 and 4 hours, until about double in size.

• Remove the dough, punch it once, knead again briefly, and put it back in the bread bowl for another 2-4 hours.

• Remove from bowl, cut in half, allow to sit for about 10 minutes, then shape and place into oiled bread pans (I use butter to grease the pans).

• Allow to sit for about 1 hour, then preheat the oven to 350 degrees, slash the tops of the bread with a knife, beat an egg and coat the top of the bread with the egg wash, sprinkle poppy seeds on top and bake for 55 minutes. The bread should sound hollow when you tap on the bottom.

• Place on wire rack to cool for about 1-2 hours. That’s all!

Healthy Yeasted Bread

Genuine sourdough bread requires dedication and time (although our own Dr. Cowan finds time in his busy schedule to make sourdough bread every week). A compromise, quicker bread–and one that may be more acceptable to Western palates–is soaked yeasted bread. The following recipe has been developed by Sonja Kepford, head of our Des Moines, Iowa chapter. The total fermentation time–rising and proofing–is 7 hours, which is the magic number for deactivation of phytic acid.

To prepare the bread, you will need to have on hand 1 cup kefir “starter” and 3 cups “sponge.” Starter is some soaked, fermented flour that you keep on hand. Sponge is freshly rejuvenated starter. The ingredients are the same for both. The only difference is that the “starter” has been sitting around for a while and is really sour, whereas the “sponge” is very fresh and active and has not become as sour.

How to make the kefir starter? Mix 1 cup flour with 1 cup kefir and enough water to make a mixture the consistency of pancake batter. Keep in a covered container where air can get to it, such as a bowl with a plate on top. Stir it once in a while if you remember. You can probably use it after one day, but wait three or four days the first time if you can. The night before you want to bake, mix 2 cups flour and 2 cups water with 1 cup starter to make about 3 cups sponge in a big bowl. Leave covered on the kitchen counter overnight to get bubbly.

Soaked Yeasted Bread
Makes three 9-inch by 5-inch loaves or one 9-inch by 5-inch loaf, two 8-inch pizza crusts and 8 rolls or hamburger buns

3 cups sponge
1 cup water or 3-4 eggs plus enough water to make 1 cup
2 teaspoons baker’s yeast
1/2 cup honey
1/4 cup warm water
1/2 cup softened butter
1 1/3 tablespoons sea salt
6 cups plus 4-6 tablespoons whole wheat flour, preferably freshly ground

Mix 2 teaspoons yeast into 1/4 cup warm water and let soften for 15 minutes.
Remove 1/4 cup sponge to keep for future starter. (Feed the starter with 1/2 cup flour and 1/2 cup water and put in covered container in the cupboard. The starter will keep for one week, until the next bread making, without anything being added, but it should be stirred occasionally.) Place 6 cups flour in a big bowl. Add 1 1/3 tablespoon sea salt and stir in. To the bowl containing the sponge, add the honey, egg-water mixture and yeast-water. Beat and stir into the flour. Add a small amount of additional water or several tablespoons additional flour until the dough feels right–it should be somewhat flabby. Knead in the bowl for 10 minutes, using water on your hands to keep from sticking. Toward the end of the kneading, smear 1/2 cup soft butter in your kneading bowl and work this in. Cover with a damp cloth and let rise for 3 hours. Deflate, round and let rise another 3 hours. Grease your pans–loaf pans, pizza rounds and a pie pan for the rolls–and divide, round, relax and shape the dough. You may use unbleached white flour to help with the forming. (If you are making hamburger buns, make balls of the dough, flatten and place on a greased cookie sheet.) Proof in warm place 45 minutes to 1 hour, covered with a damp cloth. Don’t overproof. Preheat oven halfway through the proofing. Bake at 415 degrees for 15 minutes, then at 315 degrees for 15-20 more minutes. To make cinnamon rolls, make a rectangle of dough, flattened to about 1/2 inch thick. Smear with soft butter and sprinkle with Rapadura or maple sugar, raisins, walnuts and lemon zest. Roll up from the side and cut into individual rolls by tightening a string around the roll at 1-inch intervals. If you want a VERY light loaf, follow the above recipe, except use part unbleached white flour, add the butter (melted) to the liquid ingredients, and use white flour to keep the dough from sticking during kneading. Tortillas: Mix 2 cups sponge, 1 teaspoon salt, 1 tablespoon honey and 1/2 cup melted butter or lard. Add about 3 cups fresh whole wheat flour. The dough should be soft but not too sticky to work with. Let proof 7 hours. Deflate and form into about 15 balls. Roll to about 1/8 inch thickness, using unbleached white flour to prevent sticking. Cook tortillas about 1-2 minutes per side on a dry, medium-hot cast iron skillet. Note: For meanings of terms such as “proof,” “feels right,” “round,” “deflate,” etc., and for many other tips on baking with whole wheat, such as why to smear butter in rather than melt it, please refer to the excellent book, Laurel’s Bread Book by Laurel Robertson. Bread Machine Bread Thank you for your article on bread, Spring 2003. I have been working on a recipe for soaked bread for a breadmaker and think I have finally achieved it. The first couple of times you try the recipe, don’t leave it unattended. You may need to make some slight adjustments to get it right for your particular breadmaker, climate, type of flour, etc. This recipe has been tested on a Panasonic SC-2000. It requires that the yeast, then the dry ingredients are put in first. If your machine has different instructions, alter the order given in the recipe. This has been tested using spelt flour. Wheat flour may give slightly different results and require slight adjustments. If you can’t tolerate any dairy products, try replacing the butter with olive oil and the yoghurt with some lemon juice or cider vinegar (but keep the total liquid the same). This has not been tested. 
150 ml (1/2 cup plus 2 tablespoons) yoghurt
200 ml (scant 3/4 cup) water
450 g (1 pound) wholemeal flour, less 3 tablespoons
1 3/4 teaspoons yeast granules
3 T arrowroot
1 t Celtic sea salt (fine)
1 T molasses
2 T butter

Weigh out 450 g (1 pound) flour, then take 3 tablespoons back out again. Mix the yoghurt and water together and add to the flour. Mix to form a dough. Cover and leave in a warmish place for 18-24 hours. Put the yeast, arrowroot, salt, dough, molasses and butter in the breadmaker. Set it to a wholemeal setting and begin. When it’s partway through the kneading section, check that all the ingredients have mixed together and check the consistency. If it is slimy, add some more arrowroot; if it’s too dry, add a few more drops of water, drop by drop. Proceed as directed and enjoy the bread! Deb Gully, Kilbirnie, New Zealand

Q&A

Question: I made a loaf of sourdough rye bread which turned out wonderful and I’m very pleased with it. However, the last stage of the bread-making calls for adding 360 g of flour to the dough, and then letting the bread rise for about one hour before cooking it. Based on the principles of Nourishing Traditions, I found the short time surprising. Do you think the phytates and other stuff in the grains would have time to be deactivated in so short a rising time?

Answer: First, you are right to question the short period of time that those final 360 grams of flour are processed before baking. My husband, Garrick Ginzburg-Voskov, the baker and originator of this recipe, has corrected the timing in his recipe (which will be up at the website soon) to extend the resting period after the addition of the 360 grams to three hours, and the proofing time to two hours.

But beyond this, the topic is an interesting one to think about for a moment. We have read a study that shows that all phytates in wheat and rye flour can be neutralized in 2 hours in conditions of 4.5 pH and 45 degrees C (about 113 degrees F). Now that’s rapid! These are lab conditions, though, and not baking conditions, but they illustrate two important factors at work. Endogenous phytase (that is, resident in the wheat and rye flours) is activated in acidic conditions (such as a sourdough culture) and works at maximum speed when the temperature is close to the point at which the phytase would be inactivated (which is about 115 degrees F). So we see that the phytates in these two grains can be pretty easily neutralized as long as we include the right acidity and enough time. At room temperature or a little warmer (as in the dough proofing stage) the phytase still works, just a little more slowly.

We also have to remember–and this is important–that the dough to which those final 360 grams of flour is added is teeming with active phytase from the preceding additions of flour to the sourdough. All the phytase acts cumulatively, and this increases its combined efficiency. (As an aside, one method to greatly boost phytase activity with oat flour, which is very high in phytates but low in phytase, is to culture it with wheat sourdough.)

Another question might regard the gluten development in the flour added in the last stages of dough preparation. We’ve read a lot about this, yet nothing (that we’ve seen, at any rate) that is absolutely definitive.
Evidence does suggest that as the proliferative activity of the culture increases in an exponential fashion (as does its consumption of nutrients in the flour, which can be measured), it might be transforming the gluten at a similar rate; that is, faster at the end than at the beginning of the recipe, when the first flour and starter are introduced to each other.

This leads us to one final “trick” in baking sourdough bread. Garrick was forced to do this when he was baking in warm weather, and the heat sped the culture along too fast, threatening to over-sour the dough. (This would result in a flattened, overly-sour, overly-dense loaf.) To forestall that outcome, he refrigerated the dough before the baking stage for about 16 hours in order to bake it in the cool of the next morning. (Bakers call this “retarding the dough.”) He was pleasantly surprised to note that the flavor of the resulting loaf was extremely good–obviously the culture kept working even in 40-50 degree temps. It seems natural to suppose that this extra time would also aid in further gluten development–no doubt the starches and sugars were transformed, too, as shown in the complex flavor.

My long discourse is to say that almost always, extra time works to your advantage! The only trick is to know under what conditions to extend the timing so that you are happy with the results of your labor. Thank you for your letter! Kind regards, Katherine Czapp

This article appeared in Wise Traditions in Food, Farming and the Healing Arts, the quarterly magazine of the Weston A. Price Foundation, Spring 2003.

Living With Phytic Acid
Posted on March 26, 2010 by Ramiel Nagel

Phytic acid is one of a number of “anti-nutrients” in grains and legumes. For an introduction to this subject, please see this article. Proper preparation of whole grains will neutralize a large portion of these problematic compounds. Studies on phytic acid reveal that for some people, the phytic acid in whole grains blocks calcium, zinc, magnesium, iron and copper; others seem immune to these adverse consequences, probably because of favorable gut flora, which in some cases can break down phytic acid. In addition, when animal fats providing vitamins A and D accompany dietary whole grains, the effects of phytic acid are mitigated. The author of the following article found that eliminating phytic acid in his diet and the diet of his family helped reverse serious tooth decay; not everyone will need to take such drastic steps. However, proper preparation of whole grains is a good idea for everyone, as it is a practice found almost universally among nonindustrialized peoples.

Preparing Grains, Nuts, Seeds and Beans for Maximum Nutrition

Phytic acid in grains, nuts, seeds and beans represents a serious problem in our diets. This problem exists because we have lost touch with our ancestral heritage of food preparation. Instead we listen to food gurus and ivory tower theorists who promote the consumption of raw and unprocessed “whole foods”; or, we eat a lot of high-phytate foods like commercial whole wheat bread and all-bran breakfast cereals. But raw is definitely not Nature’s way for grains, nuts, seeds and beans . . . and even some tubers, like yams; nor are quick cooking or rapid heat processes like extrusion. Phytic acid is the principal storage form of phosphorus in many plant tissues, especially the bran portion of grains and other seeds.
It contains the mineral phosphorus tightly bound in a snowflake-like molecule. In humans and animals with one stomach, the phosphorus is not readily bioavailable. In addition to blocking phosphorus availability, the “arms” of the phytic acid molecule readily bind with other minerals, such as calcium, magnesium, iron and zinc, making them unavailable as well. In this form, the compound is referred to as phytate. Phytic acid not only grabs on to or chelates important minerals, but also inhibits enzymes that we need to digest our food, including pepsin,1 needed for the breakdown of proteins in the stomach, and amylase,2 needed for the breakdown of starch into sugar. Trypsin, needed for protein digestion in the small intestine, is also inhibited by phytates.3

Through observation I have witnessed the powerful anti-nutritional effects of a diet high in phytate-rich grains on my family members, with many health problems as a result, including tooth decay, nutrient deficiencies, lack of appetite and digestive problems. The presence of phytic acid in so many enjoyable foods we regularly consume makes it imperative that we know how to prepare these foods to neutralize phytic acid content as much as possible, and also to consume them in the context of a diet containing factors that mitigate the harmful effects of phytic acid.

[Figure: Six-sided phytic acid molecule with a phosphorus atom in each arm.]

PHYTATES IN FOOD

Phytic acid is present in beans, seeds, nuts, grains—especially in the bran or outer hull; phytates are also found in tubers, and trace amounts occur in certain fruits and vegetables like berries and green beans. Up to 80 percent of the phosphorus—a vital mineral for bones and health—present in grains is locked into an unusable form as phytate.4 When a diet including more than small amounts of phytate is consumed, the body will bind calcium to phytic acid and form insoluble phytate complexes. The net result is you lose calcium, and don’t absorb phosphorus. Further, research suggests that we will absorb approximately 20 percent more zinc and 60 percent more magnesium from our food when phytate is absent.5

The amount of phytate in grains, nuts, legumes and seeds is highly variable; the levels that researchers find when they analyze a specific food probably depend on growing conditions, harvesting techniques, processing methods, testing methods and even the age of the food being tested. Phytic acid will be much higher in foods grown using modern high-phosphate fertilizers than in those grown in natural compost.6

Seeds and bran are the highest sources of phytates, containing as much as two to five times more phytate than even some varieties of soybeans, which we know are highly indigestible unless fermented for long periods. Remember the oat bran fad? The advice to eat bran, or high fiber foods containing different types of bran, is a recipe for severe bone loss and intestinal problems due to the high phytic acid content. Raw unfermented cocoa beans and normal cocoa powder are extremely high in phytates. Processed chocolates may also contain phytates. White chocolate or cocoa butter probably does not contain phytates. More evidence is needed as to the phytate content of prepared chocolates and white chocolate. Coffee beans also contain phytic acid. The chart in Figure 1 shows the variability of phytate levels in various common foods as a percentage of dry weight. Phytate levels in terms of milligrams per hundred grams are shown in Figure 2.
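For readers who want to turn the percent-of-dry-weight values in Figure 1 into an estimate for an actual portion, the arithmetic is simply the portion’s dry weight multiplied by the percentage. The short sketch below (in Python) shows the idea; it is a rough illustration only, and the 40-gram serving weight is my own assumption rather than a figure from the article.

# Rough conversion from "percent of dry weight" (Figure 1) to milligrams of
# phytic acid per portion. The serving weight is an illustrative assumption.

def phytate_mg(dry_weight_g, percent_of_dry_weight):
    """Milligrams of phytic acid in a portion of the given dry weight."""
    return dry_weight_g * 1000 * percent_of_dry_weight / 100

# Example: about 40 g of dry oatmeal at 0.89-2.40 percent phytic acid
low = phytate_mg(40, 0.89)    # about 356 mg
high = phytate_mg(40, 2.40)   # about 960 mg
print(f"One bowl of oatmeal: roughly {low:.0f}-{high:.0f} mg phytic acid")

The same arithmetic also shows why Figures 1 and 2 tell the same story in different units: 1 percent of dry weight is simply 1,000 milligrams per 100 grams.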
DETRIMENTAL EFFECTS

High-phytate diets result in mineral deficiencies. In populations where cereal grains provide a major source of calories, rickets and osteoporosis are common.10

Interestingly, the body has some ability to adapt to the effects of phytates in the diet. Several studies show that subjects given high levels of whole wheat at first excrete more calcium than they take in, but after several weeks on this diet, they reach a balance and do not excrete excess calcium.11 However, no studies of this phenomenon have been carried out over a long period; nor have researchers looked at whether human beings can adjust to phytate’s blocking of other important minerals, such as iron, magnesium and zinc.

The zinc- and iron-blocking effects of phytic acid can be just as serious as the calcium-blocking effects. For example, one study showed that a wheat roll containing 2 mg phytic acid inhibited zinc absorption by 18 percent; 25 mg phytic acid in the roll inhibited zinc absorption by 64 percent; and 250 mg inhibited zinc absorption by 82 percent.12 Nuts have a marked inhibitory action on the absorption of iron due to their phytic acid content.13

Over the long term, when the diet lacks minerals or contains high levels of phytates or both, the metabolism goes down, and the body goes into mineral-starvation mode. The body then sets itself up to use as little of these minerals as possible. Adults may get by for decades on a high-phytate diet, but growing children run into severe problems. On a phytate-rich diet, their bodies will suffer from the lack of calcium and phosphorus with poor bone growth, short stature, rickets, narrow jaws and tooth decay; and from the lack of zinc and iron with anemia and mental retardation.

THE EXPERIMENTS OF EDWARD MELLANBY

As early as 1949, the researcher Edward Mellanby demonstrated the demineralizing effects of phytic acid. By studying how grains with and without phytic acid affect dogs, Mellanby discovered that consumption of high-phytate cereal grain interferes with bone growth and interrupts vitamin D metabolism. High levels of phytic acid in the context of a diet low in calcium and vitamin D resulted in rickets and a severe lack of bone formation. His studies showed that excessive phytate consumption uses up vitamin D. Vitamin D can mitigate the harmful effects of phytates, but according to Mellanby, “When the diet is rich in phytate, perfect bone formation can only be procured if sufficient calcium is added to a diet containing vitamin D.”20

Mellanby’s studies showed that the rickets-producing effect of oatmeal is limited by calcium.21 Calcium salts such as calcium carbonate or calcium phosphate prevent oatmeal from exerting its rickets-producing effect. According to this view, the degree of active interference with calcification produced by a given cereal will depend on how much phytic acid and how little calcium it contains, or how little calcium the diet contains. Phosphorus in the diet (at least from grains) needs some type of calcium to bind to. This explains the synergistic combination of sourdough bread with cheese. Historically, the cultivation of grains usually accompanies the raising of dairy animals; high levels of calcium in the diet mitigate the mineral-depleting effects of phytic acid. In Mellanby’s experiments with dogs, increasing vitamin D made stronger bones regardless of the diet, but this increase did not have a significant impact on the amount of calcium excreted.
Those on diets high in phytate excreted lots of calcium; those on diets high in phosphorus from meat, or with phosphorus released from phytic acid through proper preparation, excreted small amounts of calcium. Based on Mellanby’s thorough experiments, one can conclude that the growth of healthy bones requires a diet high in vitamin D, absorbable calcium and absorbable phosphorus, and a diet low in unabsorbable calcium (supplements, pasteurized dairy) and unabsorbable phosphorus (phytates). Interestingly, his experiments showed that unbleached flour and white rice were less anti-calcifying than whole grains, which contain more minerals but are also higher in phytic acid. Other experiments have shown that while whole grains contain more minerals, in the end equal or lower amounts of minerals are absorbed compared to polished rice and white flour. This outcome is primarily a result of the blocking mechanism of phytic acid, but may be secondarily the result of other anti-nutrients in grains.

Thus, absorbable calcium from bone broths and raw dairy products, and vitamin D from certain animal fats, can reduce the adverse effects of phytic acid. Other studies show that adding ascorbic acid can significantly counteract the inhibition of iron assimilation by phytic acid.22 In one study of wheat, adding ascorbic acid significantly counteracted the inhibitory effect of the phytate present.23 Another study showed that the iron-blocking effect of phytate in rice was disabled by the vitamin C in collard greens.24 Research published in 2000 indicates that both vitamin A and beta-carotene form a complex with iron, keeping it soluble and preventing the inhibitory effect of phytates on iron absorption.25 Here we have another reason to consume phytate-rich foods in the context of a diet containing organ meat and animal fats rich in vitamin A, and fruits and vegetables rich in carotenes.

PHYTASE

Phytase is the enzyme that neutralizes phytic acid and liberates the phosphorus. This enzyme co-exists in plant foods that contain phytic acid. Ruminant animals such as cows, sheep and goats have no trouble with phytic acid because phytase is produced by rumen microorganisms; monogastric animals also produce phytase, although far less. Mice produce thirty times more phytase than humans,26 so they can be quite happy eating a raw whole grain. Data from experiments on phytic acid using mice and other rodents cannot be applied to humans. In general, humans do not produce enough phytase to safely consume large quantities of high-phytate foods on a regular basis. However, probiotic lactobacilli and other species of the endogenous digestive microflora can produce phytase.27 Thus, humans who have good intestinal flora will have an easier time with foods containing phytic acid. Increased production of phytase by the gut microflora explains why some volunteers can adjust to a high-phytate diet.

Sprouting activates phytase, thus reducing phytic acid.28 The use of sprouted grains will reduce the quantity of phytic acid in animal feed, with no significant reduction of nutritional value.29 Soaking grains and flour in an acid medium at very warm temperatures, as in the sourdough process, also activates phytase and reduces or even eliminates phytic acid. Before the advent of industrial agriculture, farmers typically soaked crushed grain in hot water before feeding it to poultry and hogs. Today, feed manufacturers add phytase to grain mixes to get better growth in animals. Commercial phytases are typically produced using recombinant DNA technology.
For example, a bacterial phytase gene has recently been inserted into yeast for commercial production. Not all grains contain enough phytase to eliminate the phytate, even when properly prepared. For example, corn, millet, oats and brown rice do not contain sufficient phytase to eliminate all the phytic acid they contain. On the other hand, wheat and rye contain high levels of phytase—wheat contains fourteen times more phytase than rice and rye contains over twice as much phytase as wheat.30 Soaking or souring these grains, when freshly ground, in a warm environment will destroy all phytic acid. The high levels of phytase in rye explain why this grain is preferred as a starter for sourdough breads. Phytase is destroyed by steam heat at about 176 degrees Fahrenheit in ten minutes or less. In a wet solution, phytase is destroyed at 131-149 degrees Fahrenheit.31 Thus heat processing, as in extrusion, will completely destroy phytase—think of extruded all-bran cereal, very high in phytic acid and all of its phytase destroyed by processing. Extruded cereals made of bran and whole grains are a recipe for digestive problems and mineral deficiencies! Phytase is present in small amounts in oats, but heat treating to produce commercial oatmeal renders it inactive. Even grinding a grain too quickly or at too high a temperature will destroy phytase, as will freezing and long storage times. Fresh flour has a higher content of phytase than does flour that has been stored.32 Traditional cultures generally grind their grain fresh before preparation. Weston Price found that mice fed whole grain flours that were not freshly ground did not grow properly.33 Cooking is not enough to reduce phytic acid—acid soaking before cooking is needed to activate phytase and let it do its work. For example, the elimination of phytic acid in quinoa requires fermenting or germinating plus cooking (see Figure 3). In general, a combination of acidic soaking for considerable time and then cooking will reduce a significant portion of phytate in grains and legumes. THE PHYTATE THRESHOLD It appears that once the phytate level has been reduced, such that there is more available phosphorus than phytate in the grain, we have passed a critical point and the food becomes more beneficial than harmful. Retention of phosphorus decreases when phytate in the diet is 30-40 percent or more of the total phosphorus.35 For best health, phytates should be lowered as much as possible, ideally to 25 milligrams or less per 100 grams or to about .03 percent of the phytate-containing food eaten. At this level, micronutrient losses are minimized. (For phytate content of common foods as a percentage of dry weight, see Figures 4 and 5.) White rice and white bread are low-phytate foods because their bran and germ have been removed; of course, they are also devitalized and empty of vitamins and minerals. But the low phytate content of refined carbohydrate foods may explain why someone whose family eats white flour or white rice food products may seem to be relatively healthy and immune to tooth cavities while those eating whole wheat bread and brown rice could suffer from cavities, bone loss and other health problems. PHYTATES AND GERMINATION Beer home brewers know that in order to make beer, they need malted (sprouted) grains. Soaking and germinating grains is a good idea, but it does not eliminate phytic acid completely. Significant amounts of phytic acid will remain in most sprouted grain products. 
For example, malting reduces the phytic acid in wheat, barley or green gram by 57 percent. However, malting reduces anti-nutrients more than roasting does.36 In another experiment, malting millet resulted in a decrease in phytic acid of 23.9 percent after 72 hours and 45.3 percent after 96 hours.37

In legumes, sprouting is the most effective way to reduce phytic acid, but this process does not get rid of all of it. Germinating peanuts led to a 25 percent reduction in phytates. After five days of sprouting, chick peas maintained about 60 percent of their phytate content and lentils retained about 50 percent of their original phytic acid content. Sprouting and boiling pigeon pea and bambara groundnut reduced phytic acid by 56 percent.38 Germinating black eyed beans resulted in 75 percent removal of phytate after five days of sprouting.

Germination is more effective at higher temperatures, probably because the heat encourages a fermentation-like condition. For pearled millet, sprouting at 92 degrees F for a minimum of 48 hours removed 92 percent of the phytate. At 82 degrees F, even after 60 hours, only 50 percent of the phytic acid was removed. Warm temperatures above 86 degrees F thus seem ideal for phytate removal, at least for millet.39

Sprouting releases vitamins and makes grains, beans and seeds more digestible. However, it is a pre-fermentation step, not a complete process for neutralizing phytic acid. Regularly consuming grains that are only sprouted will lead to excess intake of phytic acid. Sprouted grains should also be soaked and cooked.

ROASTING AND PHYTIC ACID

Roasting wheat, barley or green gram reduces phytic acid by about 40 percent.40 If you subsequently soak roasted grains, you should do so with a culture that supplies additional phytase, as phytase will be destroyed by the roasting process.

ACIDIC SOAKING AND PHYTIC ACID

For grains and legumes that are low in phytase, soaking does not usually sufficiently eliminate phytic acid. Soaking millet, soya bean, maize, sorghum and mung bean at 92 degrees F for 24 hours decreased the contents of phytic acid by 4-51 percent.43 With these same grains and beans, soaking at room temperature for 24 hours reduced phytic acid levels by 16-21 percent.44 However, soaking pounded maize for only one hour at room temperature already led to a reduction of phytic acid by 51 percent.45

Sourdough fermentation of grains containing high levels of phytase—such as wheat and rye—is the process that works best for phytate reduction. Sourdough fermentation of whole wheat flour for just four hours at 92 degrees F led to a 60 percent reduction in phytic acid. Phytic acid content of the bran samples was reduced to 44.9 percent after eight hours at 92 degrees F.46 The addition of malted grains and baker’s yeast increased this reduction to 92-98 percent. Another study showed almost complete elimination of phytic acid in whole wheat bread after eight hours of sourdough fermentation (see Figure 6).47 A study of phytates in recipes typically used by home bread bakers found that leavening with commercial yeast was much less effective at removing phytates: yeasted whole wheat breads lost only 22-58 percent of their phytic acid content from the start of the bread making process to the complete loaf.48

PHYTIC ACID AND YOU

The purpose of this article is not to make you afraid of foods containing phytic acid, only to urge caution in including grains, nuts and legumes in your diet.
It is not necessary to completely eliminate phytic acid from the diet, only to keep it to acceptable levels. An excess of 800 mg phytic acid per day is probably not a good idea. The average phytate intake in the U.S. and the U.K. ranges between 631 and 746 mg per day; the average in Finland is 370 mg; in Italy it is 219 mg; and in Sweden a mere 180 mg per day.49 In the context of a diet rich in calcium, vitamin D, vitamin A, vitamin C, good fats and lacto-fermented foods, most people will do fine on an estimated 400-800 mg per day. For those suffering from tooth decay, bone loss or mineral deficiencies, total estimated phytate content of 150-400 mg would be advised. For children under age six, pregnant women or those with serious illnesses, it is best to consume a diet as low in phytic acid as possible. In practical terms, this means properly preparing phytate-rich foods to reduce at least a portion of the phytate content, and restricting their consumption to two or three servings per day. Daily consumption of one or two slices of genuine sourdough bread, a handful of nuts, and one serving of properly prepared oatmeal, pancakes, brown rice or beans should not pose any problems in the context of a nutrient-dense diet. Problems arise when whole grains and beans become the major dietary sources of calories— when every meal contains more than one whole grain product or when over-reliance is placed on nuts or legumes. Unfermented soy products, extruded whole grain cereals, rice cakes, baked granola, raw muesli and other high-phytate foods should be strictly avoided. RICE Brown rice is high in phytates. One reference puts phytate content at 1.6 percent of dry weight, another at 1250 mg per 100 grams dry weight (probably about 400 mg per 100 grams cooked rice). Soaking brown rice will not effectively eliminate phytates because brown rice lacks the enzyme phytase; it thus requires a starter. Nevertheless, even an eight-hour soak will eliminate some of the phytic acid, reducing the amount in a serving to something like 300 mg or less. The ideal preparation of rice would start with home-milling, to remove a portion of the bran, and then would involve souring at a very warm temperature (90 degrees F) at least sixteen hours, preferably twenty-four hours. Using a starter would be ideal (see sidebar recipe). For those with less time, purchase brown rice in air-tight packages. Soak rice for at least eight hours in hot water plus a little fresh whey, lemon juice or vinegar. If you soak in a tightly closed mason jar, the rice will stay warm as it generates heat. Drain, rinse and cook in broth and butter. NUTS In general, nuts contain levels of phytic acid equal to or higher than those of grains. Therefore those consuming peanut butter, nut butters or nut flours, will take in phytate levels similar to those in unsoaked grains. Unfortunately, we have very little information on phytate reduction in nuts. Soaking for seven hours likely eliminates some phytate. Based on the accumulation of evidence, soaking nuts for eighteen hours, dehydrating at very low temperatures—a warm oven—and then roasting or cooking the nuts would likely eliminate a large portion of phytates. Nut consumption becomes problematic in situations where people on the GAPS diet and similar regimes are consuming lots of almonds and other nuts as a replacement for bread, potatoes and rice. The eighteen-hour soaking is highly recommended in these circumstances. 
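To get a feel for how the daily totals discussed above add up, here is a minimal sketch in the same back-of-the-envelope spirit as the oatmeal estimate earlier. The per-item milligram values are rough assumptions loosely based on Figures 2 and 5 and on the partial reductions from soaking described in this article; they are not measured data.

# Back-of-the-envelope tally for the sample day described above: one or two
# slices of genuine sourdough bread, a handful of nuts and one serving of a
# properly prepared grain. All values are rough assumptions, not measurements.

daily_menu_mg = {
    "2 slices genuine sourdough bread": 2 * 20,        # sourdough phytate largely degraded (Figure 5)
    "handful of soaked 'crispy' almonds (30 g)": 250,  # raw almonds roughly 340-420 mg; soaking assumed to remove some
    "1 serving of soaked oatmeal (40 g dry)": 300,     # raw oat flakes roughly 470 mg; soaking with rye assumed to help
}

total = sum(daily_menu_mg.values())
print(f"Estimated total: about {total} mg phytic acid for the day")
# Roughly 590 mg here, which falls within the 400-800 mg per day range that
# the article suggests most healthy people handle well on a nutrient-dense diet.

Swapping in unsoaked nuts, or adding a second or third whole grain meal, quickly pushes a tally like this past 800 mg, which is the practical point of the serving limits suggested above.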
It is best to avoid nut butters unless they have been made with soaked nuts—these are now available commercially. Likewise, it is best not to use nut flours—and also coconut flour—for cooking unless they have been soured by the soaking process.

It is instructive to look at Native American preparation techniques for the hickory nut, which they used for oils. To extract the oil they parched the nuts until they cracked to pieces and then pounded them until they were as fine as coffee grounds. They were then put into boiling water and boiled for an hour or longer, until they cooked down to a kind of soup from which the oil was strained out through a cloth. The rest was thrown away. The oil could be used at once or poured into a vessel where it would keep a long time.50 By contrast, the Indians of California consumed acorn meal after a long period of soaking and rinsing, then pounding and cooking. Nuts and seeds in Central America were prepared by salt water soaking and dehydration in the sun, after which they were ground and cooked.

BEANS

All beans contain phytic acid, and traditional cultures usually subjected legumes to a long preparation process. For example, according to one source, “Lima beans in Nigeria involve several painstaking processes to be consumed as a staple.”51 In Central America, beans are made into a sour porridge called chugo, which ferments for several days.

The best way of reducing phytates in beans is sprouting for several days, followed by cooking. An eighteen-hour fermentation of beans without a starter at 95 degrees F resulted in a 50 percent phytate reduction.52 Lentils fermented for 96 hours at 108 degrees F showed 70-75 percent phytate destruction.53 Lentils soaked for 12 hours, germinated for 3-4 days and then soured will likely lose virtually all of their phytates. Soaking beans at moderate temperatures, such as for 12 hours at 78 degrees F, results in an 8-20 percent reduction in phytates.54

When legumes comprise a large portion of the diet, one needs to take extra steps to make beans healthy to eat. Beans should usually have the hull and bran removed. Adding a phytase-rich medium to beans would help eliminate their phytic acid. Adding yeast, or effective microorganisms, or kombu seaweed may greatly enhance the predigestive process of the beans. One website suggests using a starter containing effective microorganisms and cultured molasses for soaking beans.55 At a minimum, beans should be soaked for twelve hours, drained and rinsed several times before cooking, for a total of thirty-six hours. Cooking with a handful of green weed leaves, such as dandelion or chickweed, can improve mineral assimilation.

TUBERS

Sweet potatoes and potatoes contain little phytic acid, but yams and other starchy staples contain levels of phytate that we cannot ignore. The phytic acid content of arrowroot is unknown, but it may contain a significant amount.56 These foods should be fermented—as they usually are in traditional cultures—if they are a staple in the diet. For occasional eating, cooking well and consuming with plenty of butter and vitamin C-rich foods should suffice.

BREAD

Bread can only be called the staff of life if it has undergone careful preparation; otherwise bread can be the road to an early grave. For starters, the flour used in bread should be stone ground. Wheat and rye contain high levels of phytase, but this is destroyed by the heat of industrial grinding, and also lessens over time.
Fresh grinding of wheat or rye berries before use will ensure that the original amount of phytase remains in the flour. Rye has the highest level of phytase in relation to phytates of any grain, so rye is the perfect grain to use as a sourdough starter. Phytates in wheat are greatly reduced during sourdough preparation, as wheat is also high in phytase. Yeast-risen bread may not fully reduce phytic acid levels.57 Phytate breakdown is significantly higher in sourdough bread than in yeasted bread.58 Yet even with highly fermentable rye, an ancient traditional French recipe calls for removal of 25 percent of the bran and coarse substances.59 As an example of this practice, one small bakery in Canada sifts the coarse bran out of the flour before making it into bread.62

OATS

Oats contain very little phytase, especially after commercial heat treatment, and require a very long preparation period to completely reduce phytic acid levels. Soaking oats at 77 degrees F for 16 hours resulted in no reduction of phytic acid, nor did germination for up to three days at this temperature.63 However, malting (sprouting) oats for five days at 52 degrees F and then soaking them for 17 hours at 120 degrees F removes 98 percent of phytates. Adding malted rye further enhances oat phytate reduction.64 Without initial germination, even a five-day soaking at a warm temperature in acidic liquid may result in an insignificant reduction in phytate, due to the low phytase content of oats. On the plus side, the process of rolling oats removes at least part of the bran, where a large portion of the phytic acid resides.

How do we square what we know about oats with the fact that oats were a staple in the diet of the Scots and Gaelic islanders, a people known for their robust good health and freedom from tooth decay? For one thing, high amounts of vitamin D from cod’s liver and other sources help prevent calcium losses from the high oat diet. Absorbable calcium from raw dairy products, consumed in abundance on mainland Scotland, provides additional protection. In addition, it is likely that a good part of the phytase remained in the oats of yore, which partially germinated in stacks left for a period in the field, were not heat treated and were hand rolled immediately prior to preparation. And some Scottish and Gaelic recipes do call for a long fermentation of oats before and even after they are cooked.

Unprocessed Irish or Scottish oats, which have not been heated to high temperatures, are available in some health food stores and on the internet. One study found that unheated oats had the same phytase activity as wheat.65 They should be soaked in acidulated water for as long as twenty-four hours on top of a hot plate to keep them at about 100 degrees F. This will reduce a part of the phytic acid as well as the levels of other anti-nutrients, and result in a more digestible product. Overnight fermenting of rolled oats using a rye starter—or even with the addition of a small amount of fresh rye flour—may result in a fairly decent reduction of phytate levels. It is unclear whether heat-treated oats are healthy to eat regularly.

SEEDS

Seeds—such as pumpkin seeds—are extremely high in phytic acid and require thorough processing to remove it. Some of the phytic acid may be removed by soaking and roasting. It is best to avoid consuming or snacking on raw seeds. By the way, cacao is a seed.
Cacao contains irritating tannins and is said to be extremely high in phytic acid, although studies verifying phytic acid levels in cacao could not be located. Some brands of raw cocoa and cocoa powder may be fermented; others may not be. Check with the manufacturer before indulging!

CORN

Corn is high in phytic acid and low in phytase. The Native Americans fermented cooked corn meal for two weeks, wrapped in corn husks, before preparing it as a flat bread or tortilla. In Africa, corn is fermented for long periods of time using a lactobacillus culture to produce foods like kishk, banku or mawe. No such care is given to corn products in the Western world! But you can prepare healthy corn products at home. As with oatmeal, the addition of a rye starter or rye flour to the soaking water may be particularly helpful in reducing phytate content—think of the colonial “Rye ’n’ Injun” bread made from rye and corn. In one research project, soaking ground corn with 10 percent whole rye flour resulted in a complete reduction of phytate in six hours.66 Again, more research—and more experimenting in the kitchen—is needed!

RYE TO THE RESCUE

For those who need to reduce phytic acid to minimum levels—those suffering from tooth decay, bone loss and nutrient deficiencies—the magic ingredient is rye. To bring the phytate content of your diet to the absolute minimum, add freshly ground rye flour or a sourdough rye culture to rolled or cut oats, cornmeal, rice and other low-phytase grains, then soak in an acidic medium—preferably water with whey, yogurt or sour milk added—on a hot plate to bring the temperature up to about 100 degrees F. This is a better solution than consuming white rice and white flour, which are relatively low in phytate but have a greatly reduced mineral content (see Figure 7).

The intention of the article is not to impose a decision about whether or not to consume grains, nuts, seeds and beans; rather it is to clarify how to consume them with awareness. This way you can maximize your health by making grain-based foods more digestible and absorbable. Now it is very clear which foods contain phytic acid and how much they contain, what the health effects of phytic acid are, and how to mitigate phytic acid in your diet with complementary foods rich in vitamin C, vitamin D and calcium. Methods for the preparation of grains, seeds and beans have been clarified, so that you can estimate how much phytic acid you are consuming. One meal high in phytic acid won’t cause a healthy person any harm. But high phytic acid levels over weeks and months can be very problematic. Fortunately, not only are properly prepared foods better for you, they also taste great. Now you can enjoy some well-fermented sourdough bread, together with a piece of raw milk cheese, lots of butter and a slice of meat of your choice, and taste the essence of life.

Note to readers: This article is a work in progress.
Please send additional information or comments to phytates@curetoothdecay.com

SIDEBARS

FIGURE 1: FOOD SOURCES OF PHYTIC ACID7
As a percentage of dry weight (minimum – maximum)

Sesame seed flour 5.36 – 5.36
Brazil nuts 1.97 – 6.34
Almonds 1.35 – 3.22
Tofu 1.46 – 2.90
Linseed 2.15 – 2.78
Oat meal 0.89 – 2.40
Beans, pinto 2.38 – 2.38
Soy protein concentrate 1.24 – 2.17
Soybeans 1.00 – 2.22
Corn 0.75 – 2.22
Peanuts 1.05 – 1.76
Wheat flour 0.25 – 1.37
Wheat 0.39 – 1.35
Soy beverage 1.24 – 1.24
Oats 0.42 – 1.16
Wheat germ 0.08 – 1.14
Whole wheat bread 0.43 – 1.05
Brown rice 0.84 – 0.99
Polished rice 0.14 – 0.60
Chickpeas 0.56 – 0.56
Lentils 0.44 – 0.50

FIGURE 2: PHYTIC ACID LEVELS8
In milligrams per 100 grams of dry weight

Brazil nuts 1719
Cocoa powder 1684 – 1796
Brown rice 1250
Oat flakes 1174
Almond 1138 – 1400
Walnut 982
Peanut, roasted 952
Peanut, ungerminated 821
Lentils 779
Peanut, germinated 610
Hazel nuts 648 – 1000
Wild rice flour 634 – 752.5
Yam meal 637
Refried beans 622
Corn tortillas 448
Coconut 357
Corn 367
Entire coconut meat 270
White flour 258
White flour tortillas 123
Polished rice 11.5 – 66
Strawberries 12

PHYTATES: A BENEFICIAL ROLE?

As evidence of the detrimental effects of phytates accumulates, reports on alleged beneficial effects have also emerged. In fact, a whole book, Food Phytates, published in 2001 by CRC Press, attempts to build a case for “phytates’ potential ability to lower blood glucose, reduce cholesterol and triacylglycerols, and reduce the risks of cancer and heart disease.”14

One argument for the beneficial effects of phytates is based on the premise that they act as anti-oxidants in the body. But recent studies indicate that an overabundance of anti-oxidants is not necessarily a good thing, as these compounds will inhibit the vital process of oxidation, not only in our cells but also in the process of digestion. Another theory holds that phytates bind to extra iron or toxic minerals and remove them from the body, thus acting as chelators and promoting detoxification.

As with all anti-nutrients, phytates may play a therapeutic role in certain cases. For example, researchers claim that phytic acid may help prevent colon cancer and other cancers.15 Phytic acid is one of few chelating therapies used for uranium removal.16 Phytic acid’s chelating effect may serve to prevent, inhibit, or even cure some cancers by depriving those cells of the minerals (especially iron) they need to reproduce.17 The deprivation of essential minerals like iron would, much like other broad treatments for cancer, also have negative effects on non-cancerous cells. For example, prolonged use of phytic acid to clear excess iron may deprive other cells in the body that require iron (such as red blood cells). One theory is that phytates can help patients with kidney stones by removing excess minerals from the body. However, a long-term study involving over forty-five thousand men found no correlation between kidney stone risk and dietary intake of phytic acid.18 Phytates also have the potential for use in soil remediation, to immobilize uranium, nickel and other inorganic contaminants.19

OTHER ANTI-NUTRIENTS

Phytates represent just one of many anti-nutrients in grains, nuts, tubers, seeds and beans. These include oxalates, tannins, trypsin inhibitors, enzyme inhibitors, lectins (hemagglutinins), protease inhibitors, gluten, alpha-amylase inhibitors and alkylresorcinols. Anti-nutrients exist in these plant foods because they are part of the process of life.
The natural world requires them in order to perform many important tasks, including protection against insects, maintaining freshness of seeds for germination, and protection against mold and fungus. In order to consume these foods on a regular basis we must remove the phytates and other anti-nutrients through processing in harmonious ways. Many people in the health field assure us that if something is from nature, then it doesn’t require processing. Phytates act as the seed’s system of preservatives, like the impossible-to-open plastic packaging of many consumer goods. To get to the item we need—namely, phosphorus—we need to unwrap the phytate-phosphorus package.

FIGURE 3: QUINOA PHYTATE REDUCTION34
Process and resulting phytate reduction

Cooked for 25 minutes at 212 degrees F: 15-20 percent
Soaked for 12-14 hours at 68 degrees F, then cooked: 60-77 percent
Fermented with whey 16-18 hours at 86 degrees F, then cooked: 82-88 percent
Soaked 12-14 hours, germinated 30 hours, lacto-fermented 16-18 hours, then cooked at 212 degrees F for 25 minutes: 97-98 percent

FIGURE 4: PHYTATE41
As a percentage of dry weight

Sesame seeds, dehulled 5.36
Wheat bran cereal 3.29
Soy beans 1.00 – 2.22
Pinto beans 0.60 – 2.38
Navy beans 0.74 – 1.78
Parboiled brown rice 1.60
Oats 1.37
Peanuts 1.05 – 1.76
Barley 1.19
Coconut meal 1.17
Whole corn 1.05
Rye 1.01
Wheat flour 0.96
Brown rice 0.84 – 0.94
Chickpeas 0.28 – 1.26
Lentils 0.27 – 1.05
Milled (white) rice 0.2

FIGURE 5: BREAD PHYTATES42
As a percentage of weight

Cornbread 1.36
Whole wheat bread 0.43 – 1.05
Wheat bran muffin 0.77 – 1.27
Popped corn 0.6
Rye 0.41
Pumpernickel 0.16
White bread 0.03 – 0.23
French bread 0.03
Sourdough rye 0.03
Soured buckwheat 0.03

FIGURE 6: REDUCTION OF PHYTIC ACID IN WHOLE WHEAT SOURDOUGH BREAD47
[Chart: percentage of phytic acid remaining over time, comparing yeast fermentation (dashed line) with sourdough fermentation (solid line).]

PREPARATION OF BROWN RICE

1. Soak brown rice in dechlorinated water for 24 hours at room temperature without changing the water. Reserve 10% of the soaking liquid (it should keep for a long time in the fridge). Discard the rest of the soaking liquid; cook the rice in fresh water.
2. The next time you make brown rice, use the same procedure as above, but add the soaking liquid you reserved from the last batch to the rest of the soaking water.
3. Repeat the cycle. The process will gradually improve until 96% or more of the phytic acid is degraded at 24 hours.

Source: Stephan Guyenet http://wholehealthsource.blogspot.com/2009/04/new-way-to-soak-brown-rice.html

PHYTATES IN BRAN

A survey of indigenous dishes shows that the bran is consistently removed from a variety of grains. The only exception seems to be beer. Traditional beer production—involving soaking, germination, cooking and fermentation—removes phytic acid and releases the vitamins from the bran and germ of grains. The traditional method for preparing brown rice is to pound it in a mortar and pestle in order to remove the bran. The pounding process results in milled rice, which contains a reduced amount of the bran and germ. Experiments have verified the fact that milled rice, rather than whole brown rice, results in the highest mineral absorption from rice.

The idea that we should eat bran is based on the idea of “not enough.” We somehow believe that grains without the bran do not provide enough nutrients. But solving the problem of a lack of bioavailable minerals in the diet may be more a question of soil fertility than of consuming every single part of the grain.
A study of the famous Deaf Smith County, Texas, the “town without a toothache”—because of its mineral-rich soil producing fabulous butterfat—found that its wheat contained six times the amount of phosphorus as normal wheat.60 In this case, wheat minus the bran grown in rich soil can have as much or even more phosphorus than wheat with the bran grown in poor soil. Low nutrient content in food seems to be better solved by focusing on soil fertility, rather than by trying to force something not digestible into a digestible form.

There are many studies in which researchers have tried to find out how to make the bran of different grains digestible and to provide additional nutrition. But small additions of phosphorus- and calcium-rich dairy products, such as milk and cheese, or phosphorus-rich meat will make up for the moderate reductions in mineral intake from grains without the bran. In one study, the calcium, magnesium, phosphorus and potassium in diets made up with 92 percent extraction flour (almost whole wheat) were less completely absorbed than the same minerals in diets made up with 69 percent extraction flour (with a significant amount of bran and germ removed).61 This study involved yeasted bread. With sourdough bread, the phytate content of bran will be largely reduced if a phytase-rich starter is used and the flour is fermented at least twenty-four hours.

FIGURE 7: NUTRIENTS IN GRAINS AND OTHER FOODS67
Calcium, phosphorus and iron in milligrams per 100 grams; calories per 100 grams
(Calcium / Phosphorus / Iron / Calories)

Whole grain wheat flour 34 / 346 / 3.9 / 339
Unenriched white flour 15 / 108 / 1.2 / 364
White rice 9 / 108 / 0.4 / 366
Milled rice 10-30 / 80-150 / 0.2-2.8 / 349-373
Brown rice 10-50 / 170-430 / 0.2-5.2 / 363-385
Blue corn mush (Navajo) 96 / 39 / 2.9 / 54
Acorn stew 62 / 14 / 1 / 95
Milk 169 / 117 / 0.1 / 97
Free range buffalo steak 4 / 246 / 3.8 / 146
Cheese, mozzarella 505 / 354 / 0.4 / 300

SOME FERMENTED GRAIN FOODS FROM AFRICA

KISHK, a fermented product prepared from parboiled wheat and milk, is consumed in Egypt and many Arabian countries. During the preparation of kishk, wheat grains are boiled until soft, dried, milled and sieved in order to remove the bran. Milk is separately soured in earthenware containers, concentrated and mixed with the moistened wheat flour thus prepared, resulting in the preparation of a paste called a hamma. The hamma is allowed to ferment for about 24 hours, following which it is kneaded. Soured salted milk is added prior to dilution with water. Fermentation is allowed to proceed for a further 24 hours. The mass is thoroughly mixed, formed into balls and dried.

BANKU is a popular staple consumed in Ghana. It is prepared from maize or a mixture of maize and cassava. The preparation involves steeping the raw material in water for 24 hours followed by wet milling and fermentation for three days. The dough is then mixed with water at a ratio of 4 parts dough to 2 parts water, or 4 parts dough to 1 part cassava and 2 parts water. Continuous stirring and kneading of the fermented dough is required to attain an appropriate consistency during subsequent cooking. Microbiological studies of the fermentation process revealed that the predominant microorganisms involved are lactic acid bacteria and moulds.

MAWE is a sour dough prepared from partially dehulled maize meal which has undergone natural fermentation for a one- to three-day period. Traditional mawe production involves cleaning maize by winnowing, washing in water and crushing in a plate disc mill.
The crushed maize is screened by sieving, whereby grits and hulls are separated by gravity and the fine endosperm fraction is collected in a bowl. The grits are not washed but home dehulled, following which they are mixed with the fine fraction, moistened over a 2- to 4-hour period and milled to a dough. The kneaded dough is then covered with a polyethylene sheet and allowed to ferment naturally to a sour dough in a fermentation bowl, or wrapped in paper or polyethylene. In the commercial process, which takes place entirely in a milling shop, the grits are washed by rubbing in water, following which the germ and remaining hulls are floated off and discarded along with the water. The sedimented endosperm grits are subsequently blended with the fine endosperm fraction. The dominant microorganisms in mawe preparation include lactic acid bacteria and yeasts.

INJERA is the most popular baked product in Ethiopia. It is a fermented sorghum bread with a very sour taste. The sorghum grains are dehulled manually or mechanically and milled to flour, which is subsequently used in the preparation of injera. On the basis of production procedures, three types of injera are distinguishable: thin injera, which results from mixing a portion of fermented sorghum paste with three parts of water and boiling to yield a product known as absit, which is, in turn, mixed with a portion of the original fermented flour; thick injera, which is reddish in color with a sweet taste, consisting of a paste that has undergone only minimal fermentation for 12-24 hours; and komtata-type injera, which is produced from over-fermented paste and has a sour taste. The paste is baked or grilled to give a bread-like product. Yeasts are the major microorganisms involved in the fermentation of the sweet type of injera.

Source: http://www.fao.org/docrep/x2184e/x2184e07.htm#pre

IRISH AND SCOTTISH OATMEAL

Commercial oats in the U.S. are heat treated to about 200 degrees F for four or five hours to prevent rancidity—oats are rich in polyunsaturated oils that can go rancid within three months, especially at warm temperatures, and oats are harvested only once a year. Heat treatment kills enzymes that accelerate oxidation and helps prevent a bitter taste, although it surely damages the fragile polyunsaturated oils as well. While Irish and Scottish oatmeal is said to be “unheated,” this is not exactly true; these oats are also heat treated—for the same reasons, to minimize rancidity—but usually at lower temperatures. McCann’s Irish steel cut oats are heated to 113-118 degrees F, but Hamlyn’s heats to 212 degrees F. Truly raw rolled oats are available from www.rawguru.com. The Alford brand, available only in the U.K., is kiln dried for four hours according to their website www.oatmealofalford.com; they do not provide temperatures. Hulless oats that have not been heat treated are available from www.sproutpeople.com; these can be ground or rolled at home before soaking and preparation as oatmeal.

UPDATE ON PHYTIC ACID
by Rami Nagel

The article on phytic acid (Spring, 2010) was written in response to reports of dental decay, especially in children, in families following the principles of traditional diets. Phytates become a problem when grains make up a large portion of the diet and calcium, vitamin C and fat-soluble vitamins, specifically vitamin D, are low. In the diet advocated by WAPF, occasional higher-phytate meals will not cause any noticeable health effects for people in good health.
Significantly more care is needed with whole grains when the diet is low in fat-soluble vitamins, and when two or more meals per day rely significantly on grains as a food source. Vitamin C reduces the iron losses, and perhaps other mineral losses, caused by phytic acid. Vitamin D can mitigate the harmful effects of phytates. Calcium (think raw milk, raw cheese, yogurt and kefir) balances out the negative effects of phytates. The best indicator of whether dietary phytic acid is causing problems is the dental health of the family. If dental decay is a recurrent problem, then more care with grain preparation and higher levels of animal foods will be needed.

ARTICLE CORRECTION: BROWN RICE PREPARATION

The article stated: “Soak brown rice in dechlorinated water for 24 hours at room temperature, without changing the water. Reserve 10 percent of the soaking liquid (which should keep for a long time in the fridge). Cook the rice in the remaining soaking liquid and eat. This will break down about 50 percent of the phytic acid.” The correction is that the soaking water should be discarded and the rice cooked in fresh water. Readers have noted that after the fourth cycle using the brown rice starter, the brown rice becomes significantly softer and more digestible.

PHYTIC ACID IN POTATOES, YAMS AND SWEET POTATOES

White potatoes contain 0.111-0.269 percent phytic acid by dry weight, a level approximately equivalent to the amount in white rice. Cooking does not significantly remove phytates in potatoes, but consumption of potatoes with plenty of butter or other animal fat in the context of a nutrient-dense diet should be enough to mitigate the effects of phytate. Yams contain an amount of phytate equal to or less than that in white potatoes, and sweet potatoes contain no phytate at all. One idea for corn would be to soak or sour it with wheat, as in the process of making corn bread. Corn is generally not prepared as the whole kernel, and removing part of the kernel reduces the phytate content somewhat. I don’t have further details on corn preparation; an entire article could be written on corn and its traditional preparation.

PREPARATION OF OATS AND CORN

When preparing these grains according to traditional methods, such as those provided in Nourishing Traditions, the best idea is to add one or more tablespoons of freshly ground rye flour. Rye flour contains high levels of phytase that will be activated during the soaking process. This method reflects new information obtained since the publication of Nourishing Traditions. Even without the rye flour, overnight soaking of oats and other low-phytase grains greatly improves digestibility but won’t eliminate much phytic acid. Another grain that benefits from added rye flour during soaking is sorghum, which is lower in phytic acid than wheat but lacking in phytase. (Buckwheat contains high levels of phytase and would not need added rye flour.) You can keep whole rye grains and grind a small amount in a mini grinder for adding to these grains during the soaking process.

PREPARATION OF BEANS

If beans are a staple of your diet, extra care is needed in their preparation, including soaking for twenty-four hours (changing the soaking water at least once) and very long cooking. In general, soaking beans and then cooking them removes about 50 percent of phytic acid. One report with peas and lentils shows that close to 80 percent of phytic acid can be removed by soaking and boiling. Boiling beans that haven’t been soaked may remove much less phytic acid.
Germinating and soaking, or germinating and souring, is the best way to deal with beans; dosas made from soaked and fermented lentils and rice are a good example from India. In Latin America, beans are often fermented after the cooking process to make a sour porridge, such as chugo.

PREPARATION OF NUTS

We still do not have adequate information on nut preparation to say with any certainty how much phytic acid is reduced by various preparation techniques. Soaking in salt water and then dehydrating to make “crispy nuts” makes the nuts more digestible and less likely to cause intestinal discomfort, but we don’t know whether this process significantly reduces phytic acid, although it is likely to reduce at least a portion of it. Roasting probably removes a significant portion of phytic acid: roasting removes 32-68 percent of phytic acid in chickpeas, and roasting grains removes about 40 percent of phytic acid. Germinated peanuts have 25 percent less phytic acid than ungerminated peanuts. Several indigenous groups cooked and/or roasted their nuts or seeds. I notice that I like the taste and smell of roasted nuts. The real problem with nuts comes when they are consumed in large amounts, such as almond flour as a replacement for grains in the GAPS diet. For example, an almond flour muffin contains almost seven hundred milligrams of phytic acid, so consumption should be limited to one per day. Eating peanut butter every day would also be problematic.

PREPARATION OF COCONUT FLOUR

We do not have enough information about the preparation of coconut flour to say whether soaking reduces phytic acid, but as with other phytic acid-containing foods, the likelihood is that it is at least partially reduced.

MORE UPDATES

COCONUT AND PHYTIC ACID

I’m writing in regard to the article written by Ramiel Nagel titled “Living with Phytic Acid” (Spring 2010). In the article there are references to the phytic acid content of coconut. Since the publication of this article, people have been asking me whether they should soak coconut or coconut flour to reduce the phytic acid. Phytic acid occurs in nuts and seeds in two forms—phytic acid and phytic acid salts [Reddy, NR and Sathe, SK (Eds.) Food Phytates. CRC Press, 2001]. Both are generally referred to as “phytates.” Together, these two compounds make up the total percentage of phytates reported in various foods. However, they do not possess the same chelating power. So the chelating effect of the phytates in corn, wheat or soy is not the same as that of the phytates in coconut. You cannot predict the chelating effect based on total phytate content alone. The mineral-binding effect of the phytates in coconut is essentially nonexistent. It is as if coconut has no phytic acid at all. In a study published in 2002, researchers tested the mineral-binding capacity of a variety of bakery products made with coconut flour. Mineral availability was determined by simulating conditions that prevail in the small intestine and colon. The researchers concluded that “coconut flour has little or no effect on mineral availability.” (Trinidad, TP and others. The effect of coconut flour on mineral availability from coconut flour supplemented foods. Philippine Journal of Nutrition 2002;49:48-57). In other words, coconut flour did not bind to the minerals. Therefore, soaking or other phytic acid-neutralizing processes are completely unnecessary. Soaking has been suggested as a means to reduce the phytic acid content in grains and nuts. Some suggest coconut flour should also be soaked.
Soaking coconut flour doesn’t make any sense. The coconut meat from which the flour is made is naturally soaked in water its entire life (12 months) as it grows on the tree. To remove the meat from the coconut and soak it again is totally redundant. After the coconut meat has been dried and ground into flour, soaking it would ruin the flour and make it unusable. You should never soak coconut flour. In the tropics, coconut has been consumed as a traditional food for thousands of years. Those people who use it as a food staple and regard it as a “sacred food” do not soak it or process it in any way to remove phytates. It is usually eaten raw. This is the traditional method of consumption. They apparently have not suffered any detrimental effects from it, even though in some populations it served as their primary source of food.

Bruce Fife, ND
Colorado Springs, Colorado

To Gluten or Not to Gluten?
Posted on July 8, 2014 by Maria Atwood

Rethinking the Gluten-Free Craze

“After a while the young man sat up and looked at the heavens, at the twinkling white stars, and then away across the shadows of round hills in the dusk. …The dreaming hills with their precious rustling wheat meant more than even a spirit could tell. Where had the wheat come from that had seeded these fields? Whence the first and original seeds, and where were the sowers? Back in the ages! The stars, the night, the dark blue of heaven hid the secret in their impenetrableness. Beyond them surely was the answer, and perhaps peace.” Zane Grey, The Desert of Wheat (1919). From the book Harvest Heritage by Richard D. Scheuerman & Alexander C. McGregor.

After some years of enjoying delicious grain recipes, I actually got to the point where I was about to toss the grain baby out with the bath water! Why? Well, judging from many of the friendly Weston A. Price Foundation discussion groups and blogs that I and another WAPF buddy of mine follow, it seems of late that the urgent message to go gluten-free is the greatest impetus since man landed on the moon! I began to see a deluge of recipes featuring alternative flours for baking. Almond flour was the most frequently suggested replacement for wheat flours. This new standard is not only a regular part of many WAPF-friendly blogs, but can be found virtually all over the Internet and is sadly becoming the norm. Additionally, there are the many affirmations that at last, we have finally come to realize (drum-roll) that it was the gluten that caused all those health problems! Give up the glutenous poison and a near-nirvana state of health would be ours! Who could argue with these claims? In addition to blogs and the Internet pointing us in the direction of gluten-free baking and cooking, all we need to do is visit any commercial grocery or health food store. The grand proof that we have at last found the latest health panacea of the moment lies in the fact that shelf after shelf groans under the weight of gluten-free foods, to appease even those of us who have never been bothered by eating gluten! Further, a sizable library of books, some written by good doctors newly crowned the superstars of the gluten-free craze, bolsters the faith of the recently converted. Unfortunately, much of this information has served to hurt the wheat industry as a whole, and has also scared the common sense out of anyone who would dare to put a slice of wheat bread in his mouth!
Sensing that there was something wrong with this picture and noting a definite departure from what I’d been used to eating, I, too, came under the spell and found myself at one of those stores looking for a sack of (gluten-free) almond flour! Of course I wanted to be sure I purchased the best organic almond flour I could find! This brazen act was perpetrated in spite of the fact that almonds and other gluten-free flours have some distinct disadvantages when compared to organic grains. So why was I looking to feed my family a wholesale diet of cookies, cakes, breads and numerous other foods made from gluten-free flours while choosing to ignore the nutritious grain recipes in books like Nourishing Traditions? Something inside me asked whether I, too, could perhaps be succumbing to the message. Just maybe I had not done my homework. It genuinely bothered me to see the proliferation of the gluten-free credo in a good many of the WAPF-friendly blogs and websites. I wondered how we could prevent the situation we recently experienced with the popular Paleo diets, which were misunderstood to be a re-interpretation of the WAPF and ancestral diets. My instant conclusion was, here we go again! Suddenly, I felt I absolutely must investigate this issue further, and a faint sadness settled on me to think that going gluten-free may now also be misinterpreted as being endorsed as part of the diet espoused by the Weston A. Price Foundation. For newcomers that fallacy could be a real disaster, as the diverse diet encouraged by the WAPF is the only diet I know of that does not condemn any food group—meats, fruits, complex carbohydrates, saturated fats and other foods, including those that contain gluten. WAPF encourages us to eat from all the food groups while focusing on the healing of our gastrointestinal system or, as I like to call it, the body’s “central processor,” so as to be able to enjoy and receive nourishment from all of it! Yes, I put the bag of almond flour back on the shelf and came home to dig into the gluten-free craze a little deeper. After nearly six months of meetings via email, reading many books, and numerous private phone conversations with some fantastic authors, business owners, and growers of ancient and heritage grains, I respectfully submit to you my findings. After a long absence due to my perceived fear of gluten, I now sit down to a delicious, warm slice of real sourdough bread and homemade butter. Ah, to eat what Grandmother served so frequently, and of which we relished every last bite, always eager to be right there when the fresh loaves were taken from the oven. This, in my opinion, is the way it should be, and I hope by the time you’ve read the rest of this article, it will be your opinion also.

THE TRUE GLUTEN-FREE CANDIDATE

Possibly the only true candidate for a totally gluten-free diet is a person who has damage to the tiny, fingerlike protrusions lining the small intestine called villi. Villi allow nutrients from food to be absorbed into the bloodstream. When the villi are damaged, the body cannot absorb nutrients properly, leading to malnutrition—regardless of the quantity or quality of food eaten. This is celiac disease, and those suffering from it must abstain from gluten in all forms. Unfortunately, celiac disease can be misdiagnosed as irritable bowel syndrome, Crohn’s disease, diverticulitis, intestinal infections, iron-deficiency anemia and even chronic fatigue syndrome. It is estimated that about one percent of the U.S. population has celiac disease.
Although this article is not meant to discuss or address celiac disease as such, I am including information about the necessary testing that may determine whether a complete gluten-free diet is even advisable. The results of a blood test can help detect celiac disease. If a blood test comes back positive for the appropriate antibodies, an upper endoscopy may be performed to assess possible damage to the small intestine, more specifically the duodenum. If there is flattening of the villi, those finger-like projections that absorb nutrients, the doctor will work with the patient to create a gluten-free diet. Genetic testing is also helpful for relatives of those with celiac disease, as the disease is hereditary and common among first-degree relatives.

NEWER APPROACH IN DIAGNOSTICS

A fairly new approach that seeks to provide more sensitive, complete and early screening is available from EnteroLab. Their test is based on earlier research which demonstrated that anti-gliadin antibodies appear in the contents of the intestines before they appear in the blood. EnteroLab utilizes stool samples to test for these antibodies in gluten-sensitive individuals, with the hope of positively identifying the condition before more extensive damage to the body has occurred. People with non-celiac gluten sensitivity generally have an unpleasant response to eating gluten. Symptoms can be similar to those of IBS (irritable bowel syndrome): bloating, diarrhea and flatulence following the consumption of gluten-containing foods. They may also experience headaches or fatigue following the consumption of gluten. Unlike a true celiac sufferer, these individuals may not react every time gluten is eaten; there is also no correlation with autoimmunity, making this very different from celiac disease. Wheat allergy—the third form of gluten intolerance—is a histamine response to any of several different forms of protein, including gluten, found in wheat.

GOING GLUTEN-FREE

Some of those going gluten-free may have decided to give up just baked goods like breads or cookies and other easily recognized wheat-based foods, failing to understand that they are still getting plenty of gluten from other sources. One of the reasons a diet completely free of gluten is so challenging is that gluten is present in many processed foods, not just those whose main ingredient is wheat, barley or rye. Just a few of these include frozen vegetables, sauces, soy sauce, many foods made with “natural flavorings,” vitamin and mineral supplements, some medications, and even toothpaste. Of course, this is one more in a long list of reasons to stay away from most processed foods and focus instead on those you prepare yourself! According to Dr. Leffler, director of clinical research at the Celiac Center at Beth Israel Deaconess Medical Center in Boston, a true gluten-free diet is time-consuming, expensive and restrictive. “It’s a gigantic burden for those who have to follow it,” says Dr. Leffler. “Many people with celiac disease are understandably frustrated when they hear in the lay press how wonderful this diet is.” The potential disadvantages of many gluten-free flours are similar to those of any refined flour: too much starch, too little fiber, and a lack of important vitamins and minerals. Just because something is gluten-free doesn’t mean it’s not refined. Many gluten-free bread and baking mixes have added sugar, and many recipes and mixes require the addition of xanthan gum or guar gum to provide the structure found in flours containing gluten.
Almond meal, which is currently one of the most widely used gluten-free flours, while rich in protein and other nutrients, is expensive, as well as high in phytates and omega-6. Coconut flour, which I personally love and use often for reasons other than avoiding gluten, is the only other flour I bake with, because, according to Dr. Bruce Fife, author of Cooking with Coconut Flour, it has several desirable characteristics, which “makes it a promising substitute for those who absolutely must avoid wheat flour. It is a good source of a variety of nutrients, including protein. It contains about 10 to 12 percent protein, which is the same as whole wheat flour. It is an excellent source of dietary fiber, reducing its digestible carbohydrate content thus making it the only truly low-carb flour. Another benefit of coconut flour is its mild taste. You would think that it might taste like coconut, but it doesn’t. It is nearly tasteless. When used in baking you cannot detect any coconut flavor. This is good because it takes on the flavor of the product being made. The primary benefit of coconut flour is its complete absence of gluten.” Coconut flour is also low in phytates and has a great fatty acid profile. It does take a lot of eggs to make satisfactory baked goods with it, so baking can get expensive, and there are a fair number of people with egg sensitivities, which may make using coconut flour, an otherwise healthy gluten-free flour, impractical for some.

DEALING WITH THE SYMPTOMS

I could write volumes on the dangers of making gluten-free flours the sole source for your baking and cooking needs; however, I wish to proceed to the substance of this article and introduce ancient and heritage grains and organic whole wheat flour, whose nutrient value far outweighs that of all gluten-free flours. (Some of these grains also happen to be low in gluten.) How many of the issues we so readily attribute to this recently identified poison called gluten are really issues with overall digestion or, as happens with many of us, are simply the result of being swept up in the gluten-free craze? Most important, do we understand that it is not necessarily an issue with wheat overall, but a misunderstanding about the types of wheat and grains that were used in a healthy ancestral diet and which very few of us now use? If you are experiencing symptoms that you believe may be attributable to gluten, but have not had your condition diagnosed via the tests mentioned earlier to determine whether you are truly a case of celiac disease, it may be time to deal with your symptoms by addressing the most problematic issue—that of healing the gut and slowly reintroducing one of the earliest and most healthful foods enjoyed by our ancestors. You may be pleasantly surprised that you, too, can finally leave the gluten-free craze behind. Remember that when we stop consuming a food, we naturally stop producing the enzymes that help digest that food. That fact by itself may explain why each time you try to go back to consuming wheat breads or other wheat products, they make you sick. You may be one of the unfortunates who have now lost the enzymatic capacity to digest almost any form of gluten. The sad part for me is that many parents are allowing this to happen to their children and other family members, not realizing that this may truly create a life-long inability to enjoy wheat products.
Katherine Czapp, in her article (http://preview.tinyurl.com/nn52agl) titled “Our Daily Bread,” notes that her father, Vasili, diagnosed with full-blown celiac disease, could eventually eat whole grain sourdough bread, slow-fermented in traditional Russian fashion, with no digestive issues. It is crucial to note that this occurred after nearly two years of very concerted effort to restore his gut health. While this success may not be possible for everyone, it certainly offers a more sensible direction than the challenging exercise of making gluten-free baked goods from substances like almond, potato, tapioca and bean flours—an exercise that may be necessary only for a short period of time while you reintroduce your digestive tract to the vitamin-, mineral- and fiber-endowed heritage grains such as einkorn, emmer and spelt, to heritage organic whole wheat, or to one of our more modern organic whole wheat varieties grown without chemical treatments.

ANCESTRAL GRAINS: POSSIBLE CURE TO THE GLUTEN-FREE CRAZE

Landrace, heritage and ancestral grains are best defined as those that originated in one of their native countries, such as Iran, Syria, Turkey or Russia, and have thankfully made their way through history without a complete change in their make-up. The three that are now slowly being brought back into use are einkorn, emmer and some forms of spelt. Organic whole wheat varieties may or may not be landrace grains; not every grain that has evolved from the landrace grains into our modern wheat varieties needs to be avoided. What is of critical importance for those of us following an ancestral diet is that our sources of whole wheat are grown organically, without the use of pesticides or chemical fertilizers.

A CLOSER LOOK AT ANCIENT GRAINS

Now that we all “know” that even heritage grains (which include einkorn, emmer, spelt and triticum landrace) and organic whole wheat varieties, rye, and barley (and maybe oats) contain gluten, let’s look a little closer at the actual structure of grains so we can better understand the differences that make these heritage grains so valuable to our health. Somewhere in our own history, we discovered the value of the concentrated source of nutrients in every ancestral grain or wheat. When the grains were properly prepared (by soaking, sprouting and dehydrating, or leavening with wild yeasts), the nutrients stored within them were freed to be used by our own bodies, and use them we did, supplementing the other foods that had nourished us for thousands of years: animal-sourced foods and other plants. Every kernel of grain has protein, fats, carbohydrate and fiber, stored neatly in a package protected by a fibrous outer layer called the hull. The bran is the outer layer of the grain. Next is the endosperm. The heart of the grain is the germ. The bran (14.5 percent of the kernel’s mass) is made of protein, fiber, starch, fat and many B vitamins (all of these are lost, of course, when grain is milled and the bran removed). The relatively high fat level in bran means that the grain—once hulled—can quickly go rancid (a good reason to store whole grains in a cool place and use them quickly once ground). The endosperm is the largest part of the kernel, with 83 percent of its total mass; in wheat, this is the part of the grain that, once the bran and germ have been removed, is milled into white flour.
The endosperm nourishes the germ it wraps around until the seed has taken root and started to grow into a new plant. While it is rich in starch, it also contains about 75 percent of the kernel’s protein, plus iron and B complex vitamins. Finally, the germ, the smallest part of the wheat kernel (2.5 percent of the whole), also contains numerous B complex vitamins and vitamin E. About 8 percent of the protein found in wheat is in the germ. Minerals found in grain can include calcium, iron, phosphorus (bound up as phytic acid), magnesium, potassium, manganese, copper, iodine, chlorine, sodium and silicon. One bonus supplied by wheat is betaine, a substance that protects our cells against stress and stimulates the body’s production of vitamin B12.

EINKORN (FARRO PICCOLO)

With its simple chromosomal structure, high lutein content (which supports eye health), and long history of cultivation (dating back nine thousand years or so), einkorn can be handled by many people who react badly to readily available commercial wheat, and is low (mellow) in gluten content. Emmer, also known as farro medio, is another grain known to history even longer than einkorn. Emmer also holds the distinction of having more protein than any other member of the wheat family, a whopping 28 percent. Spelt, the favorite grain of St. Hildegard of Bingen, a mystical healer of the 12th century, is only slightly younger (references to it date to around seven thousand years ago). It is the first of the hexaploid grains but can still be considered ancient. It too is often easier to digest than newer forms of wheat. (Though, in their defense, even bread wheats—soft or hard, red or white—date back at least six thousand years.) Rye, which does not contain true gluten, can be problematic for celiac sufferers because of the similarity of the protein structures, but often presents no problems for those with sensitivities to gluten.

BREAD WITH BUTTER

One of the mistakes we have made in the modern world is to eat our grains without any fat. Yet good quality fat makes bread more digestible and supplies fat-soluble vitamins so essential for gut health. Arachidonic acid, supplied by butter and other animal fats, is needed to make tight cell-to-cell junctions in the skin, including the “skin” lining the intestinal tract. So always look for teeth marks in the butter (or ghee, lard or bacon fat) that you are spreading on your bread. If you can’t see teeth marks, you are not putting enough on!

PARTING THOUGHTS

My main objective in writing this article is to remind myself and others that by going too far afield from our ancestral diets, we may unfortunately impart to others a fear of eating foods that have sustained mankind for thousands of years. It is unfortunately easier for some just to avoid ancestral foods that we incorrectly indict as the culprit than to do the detective work to resolve the real issues with our digestive imbalances. Finally, it is wise to remind ourselves that we are in real danger of losing the privilege of enjoying and benefiting from these ancient grains unless enough farmers grow them to supply the demand. My heart truly went out to the many small growers I spoke with who struggle to make a living supplying us with one of the most precious commodities known to mankind: ancient grains. Promise yourself some serious consideration of the subject, and try making your next step a commitment to heal your gut. Then you may be ready to step back into the time when ancient grains were considered the staff of life.
And indeed, they still are.

SIDEBARS

WHILE WE ARE AT IT, LET’S BUST SOME WHEAT MYTHS

Myth: I know I can’t digest wheat because I stayed away from it for a full year and then got sick when I ate a slice of bread.
“Use it or lose it” works for enzymes in our gut, too. Stay away from a food for a long time and your body will ramp down the production of enzymes needed to digest that food. So when you’re reintroducing any food you’ve avoided for a long time, start with small amounts and don’t eat them every day. Your gut will start producing the proper enzymes if you give it a chance. Disclaimer: There are many cases of people suffering from wheat allergies who do in time lose that allergy; however, gluten intolerance in the case of celiac disease is not reversible, although I’ve recently talked to a learned holistic practitioner who disputes that claim. For the present, always follow your medical doctor’s recommendations.

Myth: Wheat makes us fat and foggy-brained.
Ancestral grains and wheat have been part of the human diet for well over ten thousand years, and have supplied valuable nutrients to those who cultivated them. However, the wheat varieties developed since the 1950s as part of the Green Revolution (semi-dwarf wheat and, later, mutagenic wheat), now the most common forms of wheat available, are different from earlier forms and are much more likely to cause a whole host of problems. Please don’t blame thousands of years of perfectly healthy grains just because of problematic offshoots that are not being grown organically and which, for the most part, are only about fifty years old. In addition, although wheat is not genetically modified (not yet!), it is treated with the herbicide Roundup a few days before harvest. Only recently have we been learning that Roundup is associated with a host of problems, including digestive disorders, gluten intolerance and even autism. Consider using healthy grain and wheat sources without the modern wheat consequences by sourcing landrace, heritage or ancestral grains. In 2000, Monica Spiller founded the non-profit Whole Grain Connection to promote whole organic grain foods for everyone and particularly to supply farmers with locally appropriate organic wheat seed. She states that organically grown modern whole wheat varieties that have not been treated with pesticides and chemical fertilizers may also be a safe alternative for some. Certainly you would want to test this for yourself.

Myth: Modern wheat is bad because it has too many chromosomes.
Some of the oldest forms of wheat, starting with spelt (which dates back to 5000 BC), have the same number of chromosomes as modern wheat. The problem is with a specific fraction of the gluten and suspect proteins in modern semi-dwarf wheat, not the number of chromosomes it has. Cereal grains come in varying genetic complexity. If you believe in eating less complex grains, einkorn, barley and rye are diploid, with two sets of chromosomes; emmer and durum (including kamut) are a little more complex, being tetraploid with four sets of chromosomes; spelt and bread wheat varieties are hexaploid, with six sets of chromosomes. All of these are perfectly edible and need not be avoided when your source of ancestral grains and modern whole wheat is carefully chosen.

Myth: Wheat was never part of our healthy ancestors’ diets.
On the contrary, all grain has been prized in those cultures that grew it.
However, up until industrialization, wheat flour contained the bran and germ of the kernel; modern roller mills remove the healthiest parts of the kernel and make modern flours nutritionally deficient. Since neither commercial yeast nor mills that grind the grain without its bran and germ existed until the modern era, all grains were eaten in an unrefined state and prepared in ways that not only preserved them but enhanced their nutrition. Long-fermented wild yeast breads (sourdough); sour-leavened flatbreads (like pappadum or pita in India, for example); fermented sourdough noodles; and fermented porridges (kishk and nuruk are both wheat-based) all provided solid nutrition for our ancestors.

Myth: Grains are the problem; wheat is just the worst offender.
Grains—including wheat—have been part of traditional diets for thousands of years. Also note that we are part of the first generation or two in which chemically grown wheat treated with pesticides may be virtually the only wheat some of us have ever eaten. Add to this the fact that most people are eating foods damaged in one way or another by modern processing overall, such as extrusion to make breakfast cereals (not to mention eating completely new foods and additives). Further, our guts have often been more or less damaged not just by these foods but by courses of antibiotics and other gut-compromising pharmaceuticals. Is it any wonder that so many people have digestive issues that can be exacerbated by a form of wheat that is itself novel before it’s even ground into flour?

HEALING THE GUT: OTHERS HAVE DONE IT AND YOU CAN TOO!

The following are suggestions from Sally Fallon Morell:
1. Get off all improperly prepared grains initially, and then slowly re-introduce heritage grains and organic whole wheat, properly prepared.
2. Lots of bone broth is needed, as the villi rest on a layer of collagen that must be supported. Plus, bone broth has numerous other benefits: http://www.westonaprice.org/food-features/broth-is-beautiful
3. Learn to make and regularly consume fermented foods and beverages.
4. Take cod liver oil, high-vitamin butter and other good fats.
5. Always avoid all improperly prepared grains, such as granolas, muesli, and extruded breakfast cereals.
6. Work with a WAPF practitioner to help guide you.
7. Dr. Thomas Cowan, in a recent email, suggested the use of a Standard Process supplement called Okra-Pepsin E3 (which is gluten-free), along with our nourishing traditional diet recommendations, to assist in healing the gut.

From Nourishing Traditions, page 493: Weston Price’s studies convinced him that the best diet was one that combined nutrient-dense whole grains with animal products, particularly fish. The healthiest African tribe he studied was the Dinkas, a Sudanese tribe on the western bank of the Nile. They were not as tall as the cattle-herding Neurs, but they were physically better proportioned and had greater strength. Their diet consisted mainly of fish and cereal grains. This is one of the most important lessons of Price’s research—that a mixed diet of whole foods, one that avoids the extremes of the carnivorous Masai and the largely vegetarian Bantu, ensures optimum physical development.
From Nourishing Traditions, quoting Jacques Delangre, page 491: In books on baking and even in nutritional/medical writings, the two techniques (for making bread), natural leaven (sourdough) and baker’s yeast, are often mingled and confounded….Baking with natural leaven is in harmony with nature and maintains the integrity and nutrition of the cereal grains used…The process helps to increase and reinforce our body’s absorption of the cereal’s nutrients. Unlike yeasted bread that diminishes, even destroys, much of the grain’s nutritional value, naturally leavened bread does not go stale and, as it ages, maintains its original moisture much longer. A lot of that information was known pragmatically for centuries; and thus when yeast was first introduced in France at the court of Louis XIV in March 1668, it was strongly rejected because at the time the scientists already knew that the use of yeast would imperil the people’s health. Today, yeast is used almost universally, without any testing, and the recent scientific evidence and clinical findings are confirming the ancient taboos with valid biochemical and bioelectronic proofs that wholly support the age-old common sense.

Northern Roots of the Ancient Grains
Posted on January 21, 2014 by Natalie Adarova

“The legend of the North is deep and enchanting,” wrote the famous Russian painter and mystic Nicolas Roerich about northern Russia. “Northern winds are brisk and merry. Northern lakes are wistful. Northern rivers are silvery. Faded forests are sagacious. Green hills are worldly-wise. Grey stones laid in circles are full of magic. We are still looking for the Ancient Rus.” The word for the Russian North, which sounds like “sever” in Russian, has left its traces in the English language in the old Celtic name (of Slavic origin) Severina, meaning “from the north,” and in the adjective “severe,” as an impression of intensely harsh weather. Has this boreal land always been as inhospitable and seemingly disconnected from the world as we know it today? In fact, the word “boreal” is paradoxically rooted in the word “bor,” which means “oak grove” in Russian. The oak is a warm climate-loving tree.

CLIMATE CHANGE

The Earth has experienced several ice ages, with the last ending about ten thousand years ago. During the interglacial periods, Eurasia experienced substantial climate changes. During such warm cycles, median January temperatures of the Russian north reached 32 degrees Fahrenheit, which is comparable to the climate of present-day northern Italy. Under such conditions, the tundra vanishes, and deciduous forests dominated by oak, elm and linden trees spread as far as the sixty-fifth parallel north. Magnolia groves would cover the southern regions of Russia. The lands further south would become an inhospitable desert. The conditions of extreme drought over half the Earth that modern models predict have, in other words, already happened in the past. In light of recent global climate warming, which is natural for the interglacial period as the Earth’s axis shifts, the retreating permafrost in the Russian north reveals more and more archeological evidence of agriculture’s deeply ancient roots. The commonly accepted date for the first grain cultivation is ten thousand years ago; however, that time frame only holds true for the Near East region.
At that point in history, the expanding glaciers in Eurasia pushed the milder climate to the south and brought that region novel plants and a food called bread.

SOWN FIELDS

Grass family grains have been foraged and cultivated in Russia since time immemorial. The sacred ancient Slavic symbol is a “sown field,” a diamond-shaped figure filled with dots, which took many complex shapes and forms symbolizing the growing paleolithic philosophy of life and death, the sun and the moon, the movement of time and the change of the four seasons connected with agriculture. The sown field symbol can be found on all Russian folk costumes and on household and ritual objects. Its first primitive forms were found at the Kostenka paleolithic camp, along with other ritual objects connected with agriculture. The archeological excavation of this site uncovered a large habitation with eight hearths, a complex central heating system and grain storage pits with wild varieties of rye, barley, oats, wheat and flax. That means grains were already used in a very sophisticated manner some seventy thousand years ago, as it is thought that the Kostenka camp belongs to that period. In fact, grains have probably been foraged since the dawn of Eurasian man, thought to have appeared three hundred to four hundred thousand years ago on the Eastern European plain, which interestingly coincides with the warmest interglacial period in the history of the Earth. Grass family grains naturally grow in abundance in the Russian meadows and steppes, and the proximity of these grain fields has always been an important condition for ancient humans’ choice of habitat. It is hard to draw a line between foraging and deliberate cultivation, as grain cultivation most probably developed in Russia along the lines of permaculture, as a self-sustained system supported by nature. How ancient man first learned about grains and the sophisticated art of their cultivation and preparation is a great mystery and the subject of much debate. Ancient Slavs never took credit for this invention; rather, they held that they were taught to sow and forge metal by a deity named Kola-Ksais who, according to Herodotus, rode the skies in a flying wheeled cart. Kola-Ksais was kind enough to throw the plow down from his vehicle, along with other gadgets for the unassuming peoples of earth. Interestingly, the words kolos (grain head), koleso (wheel) and the mysterious but very thoughtful Kola-Ksais all derive from the same root word, and do have a deep connection. The connection becomes even deeper when we learn that Kola-Ksais is the name used by the Greeks, while ancient Slavs called their heavenly patron Svarog. Svarga is an old Slavic word for “heaven,” and this name is also rooted in the word svastika, completing the circle from deity to symbol. The “sown field” became known as the swastika (or svastika), a symbol now forbidden and so downtrodden by history that it has lost its original deeply sacred meaning, which can be interpreted as “a monotonous flow, movement of heaven” in the old Slavic language. “What a fresh, unshaken memory!” marveled Gorodtsov, a renowned Russian archeologist, as he compared the skillful swastika embroidery of the northern craftswomen in 1926 to the ornaments of his upper paleolithic findings.
“Recently we used to think that the swastika is a fruit of the ancient Indian culture and of the decorative border, the meander, found in ancient Greek culture; however that turned out to be incorrect, as there is now documented evidence that swastika, meander and ovum were favorite ornaments of the Paleolithic period. . . they were found in Russia on the objects of the Mezin paleolithic camp, which is set many tens of thousands of years back in time.” Gorodtsov died in 1945 fighting the Nazi swastika-turned-ominous, and his Paleolithic swastika works were buried in archives until the collapse of the Soviet Union.

BREAD IN RUSSIA

Russian culture revolves around bread. Endless songs, proverbs and legends are devoted to it as a sacred food. A loaf of bread named Kolobok even acts as the main character in fairytales, a plot similar to the Gingerbread Man in the English tradition. In modern Russia, important guests are still greeted with an ancient ceremony called “bread and salt.” Three women in Slavic folk costumes present a round loaf of bread and salt placed on an embroidered towel (in the past it would be embroidered with swastikas, the “sown fields”) as a symbolic offering to share the fruits of their labor, along with fertility and wealth; guests then eat a piece of it to symbolize accepting the generosity of their hosts. Numerous archaic songs provide evidence that only young unmarried women were responsible for harvesting, storage and preparation of the grain, a division of labor dating from Paleolithic times. That could explain why “flour” and “torment” are the same word in Russian. Anyone who has tried to make a truly stone-milled flour knows that this is an incredibly difficult physical task, especially for a young female. With the development of slash-and-burn agriculture, the Russian straw cult came into existence as another cultural phenomenon, again stemming from ancient agriculture. People worshipped hay as a totem, because they noticed from ancient times that burning hay on the fields yielded more abundant crops. Every year people would burn a straw-stuffed dummy during Maslenitsa, a festival week before Lent, symbolizing victory over the winter frost and the beginning of the new fertile agricultural season.

THE GOODNESS OF REAL BREAD

Modern bread sold at the stores can hardly be called “bread” at all. A quickly risen product of the instant gratification age, made from genetically altered grains bred to yield higher and faster crops, grown in poor soils, stripped of any nutrients and full of harmful additives, it is a far cry from the food that nurtured thousands of generations. Due to their immobility, plants have developed sophisticated safety measures in the form of various toxins in order to avoid being eaten. Whole grain bread touted as very healthy can present serious dangers if not properly prepared, as humans do not produce the phytase enzymes that aid in breaking down phytic acid. This organic substance, present in all grains, legumes, nuts and seeds, blocks the absorption of phosphorus, calcium, magnesium, zinc, iron, copper, silica and even some amino acids. Traditional ferments, however, readily release phytase, as well as other compounds that neutralize antinutrients such as lectins and enzyme inhibitors, which is why traditional bread has always been prepared as sourdough. Preparation of traditional Russian sourdough bread was a complicated art and science. Dough had to be fermented only in oak barrels using a triple leavening process.
The dough was considered a living substance, almost a creature; hence, during the leavening and baking it was prohibited to curse or act aggressively, actions thought to negatively affect the rising process. Russian ovens, built by the rules of the golden ratio, created a special heating environment, giving Russian bread its inimitable taste and nutritional value. Sprouting is another technique that reduces phytic acid. Before the invention of the combine harvester, grains stood in the field and sprouted naturally, making it easier to remove them from the stems. Both of these techniques largely remove the toxic matter from the grains and greatly boost the nutritional content of the bread. However, even with all these steps some people find grains difficult to digest. One of the problems might be poor gut health in general, as one needs powerful digestion, a strong gut lining, and a healthy microflora to be able to digest grains efficiently. Before modern times gluten intolerance was unknown, which indicates that gluten itself is not the problem. Plant foods are digested in the gut by bacteria, and if those bacteria are in poor health, problems will arise.

SHARED MICROFLORA

“An apple a day” is the new health recommendation picked up by the Russians, who in ancient times normally reserved apples for cattle and horses in the bad harvest years; the older recommendation was “a glass of kefir a day.” Besides genetics, which is an architectural blueprint, the second most important thing we inherit is our parents’ shared microflora. Since ancient times Slavic people considered the abdomen the epicenter of the mystery of life. The words for “abdomen” and “life” are synonyms in the Russian language. They both start with the Cyrillic letter Ж (as in zhivot, meaning both “life” and “abdomen”), an ancient symbol of the tree of life, which represents the complex paleolithic philosophy of the upper, middle and lower worlds and also reminds us of the human digestive system. In Chinese culture the character zhi portrays the notion of life force, or chi. Ancient Slavs knew that gut flora can be either your friend or your foe. They knew that flora could be transferred and could quickly turn pathogenic if handled incorrectly. Kissing strangers was prohibited and was never used as a greeting. If someone of a different faith happened to eat in an old Orthodox home, the plate, glass and utensils he or she used weren’t even washed; they were thrown away. Lechery and adultery were outlawed and strictly punished. Enemas are still viewed with suspicion as a rude interference with human nature, a deeply imprinted collective memory that the human soul resides in the gut.

BUTTER WITH YOUR BREAD

Another old rule for consuming grains was the generous addition of animal fat. “You cannot spoil kasha with too much butter” is an old Russian saying, hinting at the importance of this ingredient in grain consumption. Russian sourdough was always consumed with a thick layer of butter, a widespread tradition in other parts of Europe as well. Animal fats lubricate the gut, protecting it from fiber damage while maximizing the absorption of fat-soluble nutrients.

GRAIN AND CLAY

The most interesting digestive aid historically used in Russia was clay, considered a sacred food despite its non-food status. Geophagy, the eating of dirt, still puzzles many people and is considered an eating disorder. In fact, clay might be the earliest human medicine.
In ancient times, grains were stored in grain pits usually dug out in clay-rich soil, and the top of the pit was sealed with clay. Such pits could store grains for almost a century, and the grains would still be edible after all that time. Whether it was due to accidental consumption of soiled grains or to sheer instinct (also widespread among animals), ancient varieties of bread were often prepared with clay. Other cultures also used, and still use, clay in baking. In ancient Rome a recipe for bread called picentin called for clay. Traditional Swedish acorn bread preparation also uses clay. You can still buy edible clay “cookies” in bazaars in Asia Minor and Africa; in fact, Africa is notorious for its clay consumption. Clay has a tremendous ability to bind toxins, and if there is any toxic matter left after sprouting or leavening, clay will help to usher it out of the body. Another important aspect of consuming clay is the fact that it is usually very rich in silica. This mineral is now gaining more and more recognition. “No life can exist without silica,” proclaimed Vladimir Vernadsky, founder of geochemistry and pioneer of Russian cosmism. Silica is an essential element for proper growth, development and graceful aging. Among its myriad important functions, silica plays a crucial role in the formation of collagen. Collagen is a substance that forms us and holds us together, and our bodies start to disintegrate due to the rapid loss of silica as we age. Now, when science and religion agree that man was made out of clay, it is especially important to look back and listen to the wisdom of our ancient ancestors.

This article appeared in Wise Traditions in Food, Farming and the Healing Arts, the quarterly journal of the Weston A. Price Foundation, Winter 2013.

Heart of the Matter: Sulfur Deficits in Plant-Based Diets
Posted on February 2, 2012 by Kaayla Daniel

The World Health Organization (WHO) reports that over sixteen million deaths occur worldwide each year due to cardiovascular disease, and more than half of those deaths occur in developing countries where plant-based diets high in legumes and starches are eaten by the vast majority of the people. Yet “everyone knows” plant-based diets prevent heart disease. Indeed, this myth is repeated so often that massive numbers of educated, health-conscious individuals in first world countries are consciously adopting third-world-style diets in the hope of preventing disease, optimizing health and maximizing longevity. But if the WHO statistics are correct, plant-based diets might not be protective at all. And today’s fashionable experiment in veganism could end very badly indeed.

HOMOCYSTEINE AND HEART DISEASE

A study published in the August 26, 2001 issue of the journal Nutrition makes a strong case against plant-based diets for prevention of heart disease. The title alone, “Vegetarianism produces subclinical malnutrition, hyperhomocysteinemia and atherogenesis,” sounds a significant warning. The article establishes why subjects who eat mostly vegetarian diets develop morbidity and mortality from cardiovascular disease unrelated to vitamin B status and Framingham criteria. Co-author Kilmer S.
McCully, MD, “Father of the Homocysteine Theory of Heart Disease,” is familiar to WAPF members as a winner of the Linus Pauling Award, WAPF’s Integrity in Science Award, and author of numerous articles published in peer-reviewed journals as well as the popular books The Homocysteine Revolution and The Heart Revolution. In 2009 Dr. McCully was one of the signers of the Weston A. Price Foundation’s petition to the FDA in which we asked the agency to retract its unwarranted 1999 soy/heart disease health claim. (See www.westonaprice.org/soy-alert/soy-heart-health-claim.) Dr. McCully teamed up with Yves Ingenbleek, MD, of Université Louis Pasteur in Strasbourg, France, which funded the research. Dr. Ingenbleek is well known for his work on malnutrition, the essential role sulfur plays along with nitrogen in metabolism, and sulfur deficiency as a cause of hyperhomocysteinemia. The study took place in Chad, and involved twenty-four rural male subjects ages eighteen to thirty, and fifteen urban male controls, ages eighteen to twenty-nine. (Women in this region of Chad could not be studied because of their animistic beliefs and proscriptions against collecting their urine.) The rural men were apparently healthy, physically active farmers with good lipid profiles. Their staple foods included cassava, sweet potatoes, beans, millet and ground nuts. Cassava leaves, cabbages and carrots provided good levels of carotenes, folates and pyridoxine (B6). The rural Chadian diet is plant-based because of a shortage of grazing lands and livestock, but subjects occasionally consume some B12-containing foods, mostly poultry and eggs, though very little dairy or meat. Their diet could be described as high carb, high fiber, low in both protein and fat, and low in the sulfur-containing amino acids. In brief, this diet is the very one recommended by many of today’s nutritional “experts” for overall good health and heart disease prevention. The urban controls were likewise healthy and ate a similar diet, but with beef, smoked fish and canned or powdered milk regularly on the menus. Their diet was thus higher in protein, fat and the sulfur-containing amino acids, though roughly equivalent in calories. Dr. McCully’s research over the past forty years on the pathogenesis of atherosclerosis has shown the role of homocysteine in free radical damage and the protective effect of vitamins B6, B12 and folate. Indeed, many doctors today recommend taking this trio of B vitamins as an inexpensive heart disease “insurance policy.” In Chad, both groups showed adequate levels of B6 and folate. The B12 levels of the vegetarian group were lower, but the difference was only of “borderline significance.” However, as the researchers point out, “A previous study undertaken in the same Chadian area in a larger group of sixty rural participants did demonstrate a weak inverse correlation between B12 and homocysteine concentrations in the twenty subjects most severely protein depleted . . . It is therefore likely that the hyperhomocysteinemia status of some of our rural subjects in the present survey might have resulted from combined B12 and protein deficiencies. The correlation of B12 deficiency with hyperhomocysteinemia could well reach statistical significance if a larger group of subjects were studied.” ANIMAL PROTEIN AND SULFUR Clearly it’s wise for people on plant-based diets to supplement their diets with B12, but protein malnutrition must also be addressed. And the issue is not just getting enough protein to eat, but the right kind. 
The bottom line is we must eat protein rich in bioavailable, sulfur-containing amino acids—and that means animal products. Vegans at this point will surely claim the issue is insufficient protein and trot out soy as the solution. Soy is indeed a complete plant-based protein, but notoriously low in methionine. It does contain decent levels of cysteine, but the cysteine is bound up in protease inhibitors, making it largely biounavailable. (For more information, read my book The Whole Soy Story: The Dark Side of America’s Favorite Health Food, endorsed by Dr. McCully, as well as our petition to the FDA noted above.) So what did Drs. Ingenbleek and McCully find among the study group of protein-deficient people? Higher levels of homocysteine, of course. Also significant alterations in body composition, lean body mass, body mass index and plasma transthyretin levels. In plain English, the near-vegetarian subjects were thinner, with poorer muscle tone, and showed subclinical signs of protein malnutrition. (So much for popular ideas of extreme thinness being healthy.) The plant-based diet of the study group was low in all of the sulfur-containing amino acids. As would be expected, labwork on these men showed lower plasma cysteine and glutathione levels compared to the controls. Methionine levels, however, tested comparably. The explanation for this is “adaptive response.” In brief, mammals trying to function with insufficient sulfur-containing amino acids will do whatever is necessary to survive. Given the essential role of methionine in metabolic processes, that means deregulating the transsulfuration pathway, increasing homocysteine levels, and methylating homocysteine to make methionine. Ultimately, it all boils down to our need for sulfur. As Stephanie Seneff, PhD, and many others have written in Wise Traditions and on the WAPF website, sulfur is vital for disease prevention and maintenance of good health. In terms of heart disease, Drs. Ingenbleek and McCully have shown sulfur deficiency not only leads to high homocysteine levels, but is the likeliest reason some clinical trials using B6, B12 and folate interventions have proved ineffective for the prevention of cardiovascular and cerebrovascular diseases. Over the past few years, headlines from such studies have led to widespread dismissal of Dr. McCully’s “Homocysteine Theory of Heart Disease” and renewed media focus on cholesterol, C-reactive protein and other possible culprits that can be treated by statins and other profitable drugs. In contrast, the research of Drs. McCully and Ingenbleek suggests we can better prevent heart disease with three inexpensive B vitamins and traditional diets rich in the sulfur-containing amino acids found in animal foods. In the blaze of publicity surrounding the video Forks Over Knives and other blasts of vegan propaganda, few people are likely to hear about this study. That’s sad, for it provides an important missing piece in our knowledge of heart disease development, a strong argument against the plant-based dietary fad, and a bright new chapter in what the New York Times has called “The Fall and Rise of Kilmer McCully.” This article appeared in Wise Traditions in Food, Farming and the Healing Arts, the quarterly journal of the Weston A. Price Foundation, Winter 2011. 
Myths of Vegetarianism Posted on December 31, 2002 by Stephen Byrnes, ND, RNCP Myths & Truths About Vegetarianism Originally published in the Townsend Letter for Doctors & Patients, July 2000. Revised January 2002. “An unflinching determination to take the whole evidence into account is the only method of preservation against the fluctuating extremes of fashionable opinion.”—Alfred North Whitehead Bill and Tanya sat before me in my office in a somber mood: they had just lost their first baby in the second month of pregnancy. Tanya was particularly upset. “Why did this happen to me? Why did I miscarry my baby?” The young couple had come to see me mostly because of Tanya’s recurrent respiratory infections, but also wanted some advice as to how they could avoid the heartache of another failed pregnancy. Upon questioning Tanya about her diet, I quickly saw the cause of her infections, as well as her miscarriage: she had virtually no fat in her diet and was also mostly a vegetarian. Because of the plentiful media rhetoric about the supposed dangers of animal product consumption, as opposed to the alleged health benefits of the vegetarian lifestyle, Tanya had deliberately removed such things as cream, butter, meats and fish from her diet. Although she liked liver, she avoided it due to worries over “toxins.” Tanya and Bill left with a bottle of vitamin A, other supplements and a dietary prescription that included plentiful amounts of animal fats and meat. Just before leaving my office, Tanya looked at me and said ruefully: “I just don’t know what to believe sometimes. Everywhere I look there is all this low-fat, vegetarian stuff recommended. I followed it, and look what happened.” I assured her that if she and her husband changed their diets and allowed sufficient time for her weakened uterus to heal, they would be happy parents in due time. In November 2000, Bill and Tanya happily gave birth to their first child, a girl. The Evolution of a Myth Along with the unjustified and unscientific saturated fat and cholesterol scares of the past several decades has come the notion that vegetarianism is a healthier dietary option for people. It seems as if every health expert and government health agency is urging people to eat fewer animal products and consume more vegetables, grains, fruits and legumes. Along with these exhortations have come assertions and studies supposedly proving that vegetarianism is healthier for people and that meat consumption is associated with sickness and death. Several authorities, however, have questioned these data, but their objections have been largely ignored. As we shall see, many of the vegetarian claims cannot be substantiated and some are simply false and dangerous. There are benefits to vegetarian diets for certain health conditions, and some people function better on less fat and protein, but, as a practitioner who has dealt with several former vegetarians and vegans (total vegetarians), I know full well the dangerous effects of a diet devoid of healthful animal products. It is my hope that all readers will more carefully evaluate their position on vegetarianism after reading this paper. •Myth #1: Meat consumption contributes to famine and depletes the Earth’s natural resources. •Myth #2: Vitamin B12 can be obtained from plant sources. •Myth #3: Our needs for vitamin D can be met by sunlight. •Myth #4: The body’s needs for vitamin A can be entirely obtained from plant foods. 
•Myth #5: Meat-eating causes osteoporosis, kidney disease, heart disease, and cancer. •Myth #6: Saturated fats and dietary cholesterol cause heart disease, atherosclerosis, and/or cancer, and low-fat, low-cholesterol diets are healthier for people. •Myth #7: Vegetarians live longer and have more energy and endurance than meat-eaters. •Myth #8: The “cave man” diet was low-fat and/or vegetarian. Humans evolved as vegetarians. •Myth #9: Meat and saturated fat consumption have increased in the 20th century, with a corresponding increase in heart disease and cancer. •Myth #10: Soy products are adequate substitutes for meat and dairy products. •Myth #11: The human body is not designed for meat consumption. •Myth #12: Eating animal flesh causes violent, aggressive behavior in humans. •Myth #13: Animal products contain numerous, harmful toxins. •Myth #14: Eating meat or animal products is less “spiritual” than eating only plant foods. •Myth #15: Eating animal foods is inhumane. Myth #1: Meat consumption contributes to famine and depletes the Earth’s natural resources. Some vegetarians have claimed that livestock require pasturage that could be used to farm grains to feed starving people in Third World countries. It is also claimed that feeding animals contributes to world hunger because livestock are eating foods that could go to feed humans. The solution to world hunger, therefore, is for people to become vegetarians. These arguments are illogical and simplistic. The first argument ignores the fact that about 2/3 of our Earth’s dry land is unsuitable for farming. It is primarily the open range, desert and mountainous areas that provide food to grazing animals and that land is currently being put to good use (1). The second argument is faulty as well because it ignores the vital contributions that livestock animals make to humanity’s well-being. It is also misleading to think that the foods grown and given to feed livestock could be diverted to feed humans: Agricultural animals have always made a major contribution to the welfare of human societies by providing food, shelter, fuel, fertilizer and other products and services. They are a renewable resource, and utilize another renewable resource, plants, to produce these products and services. In addition, the manure produced by the animals helps improve soil fertility and, thus, aids the plants. In some developing countries the manure cannot be utilized as a fertilizer but is dried as a source of fuel. There are many who feel that because the world population is growing at a faster rate than is the food supply, we are becoming less and less able to afford animal foods because feeding plant products to animals is an inefficient use of potential human food. It is true that it is more efficient for humans to eat plant products directly rather than to allow animals to convert them to human food. At best, animals only produce one pound or less of human food for each three pounds of plants eaten. However, this inefficiency only applies to those plants and plant products that the human can utilize. The fact is that over two-thirds of the feed fed to animals consists of substances that are either undesirable or completely unsuited for human food. Thus, by their ability to convert inedible plant materials to human food, animals not only do not compete with the human rather they aid greatly in improving both the quantity and the quality of the diets of human societies. 
(2) Furthermore, at the present time, there is more than enough food grown in the world to feed all people on the planet. The problem is widespread poverty making it impossible for the starving poor to afford it. In a comprehensive report, the Population Reference Bureau attributed the world hunger problem to poverty, not meat-eating (3). It also did not consider mass vegetarianism to be a solution for world hunger. What would actually happen, however, if animal husbandry were abandoned in favor of mass agriculture, brought about by humanity turning towards vegetarianism? If a large number of people switched to vegetarianism, the demand for meat in the United States and Europe would fall, the supply of grain would dramatically increase, but the buying power of poor [starving] people in Africa and Asia wouldn’t change at all. The result would be very predictable — there would be a mass exodus from farming. Whereas today the total amount of grains produced could feed 10 billion people, the total amount of grain grown in this post-meat world would likely fall back to about 7 or 8 billion. The trend of farmers selling their land to developers and others would accelerate quickly. (4) In other words, there would be less food available for the world to eat. Furthermore, the monoculture of grains and legumes, which is what would happen if animal husbandry were abandoned and the world relied exclusively on plant foods for its food, would rapidly deplete the soil and require the heavy use of artificial fertilizers, one ton of which requires ten tons of crude oil to produce (5). As far as the impact to our environment, a closer look reveals the great damage that exclusive and mass farming would do. British organic dairy farmer and researcher Mark Purdey wisely points out that if “veganic agricultural systems were to gain a foothold on the soil, then agrochemical use, soil erosion, cash cropping, prairie-scapes and ill health would escalate.” (6) Neanderthin author Ray Audette concurs with this view: Since ancient times, the most destructive factor in the degradation of the environment has been monoculture agriculture. The production of wheat in ancient Sumeria transformed once-fertile plains into salt flats that remain sterile 5,000 years later. As well as depleting both the soil and water sources, monoculture agriculture also produces environmental damage by altering the delicate balance of natural ecosystems. World rice production in 1993, for instance, caused 155 million cases of malaria by providing breeding grounds for mosquitoes in the paddies. Human contact with ducks in the same rice paddies resulted in 500 million cases of influenza during the same year.(7) There is little doubt, though, that commercial farming methods, whether of plants or animals produce harm to the environment. With the heavy use of agrochemicals, pesticides, artificial fertilizers, hormones, steroids, and antibiotics common in modern agriculture, a better way of integrating animal husbandry with agriculture needs to be found. A possible solution might be a return to “mixed farming,” described below. The educated consumer and the enlightened farmer together can bring about a return of the mixed farm, where cultivation of fruits, vegetables and grains is combined with the raising of livestock and fowl in a manner that is efficient, economical and environmentally friendly. 
For example, chickens running free in garden areas eat insect pests, while providing high-quality eggs; sheep grazing in orchards obviate the need for herbicides; and cows grazing in woodlands and other marginal areas provide rich, pure milk, making these lands economically viable for the farmer. It is not animal cultivation that leads to hunger and famine, but unwise agricultural practices and monopolistic distribution systems. (8) The “mixed farm” is also healthier for the soil, which will yield more crops if managed according to traditional guidelines. Mark Purdey has accurately pointed out that a crop field on a mixed farm will yield up to five harvests a year, while a “mono-cropped” one will only yield one or two (9). Which farm is producing more food for the world’s peoples? Purdey well sums up the ecological horrors of “battery farming” and points to future solutions by saying: Our agricultural establishments could do very well to outlaw the business-besotted farmers running intensive livestock units, battery systems and beef-burger bureaucracies; with all their wastages, deplorable cruelty, anti-ozone slurry systems; drug/chemical induced immunotoxicity resulting in B.S.E. [see myth # 13] and salmonella, rain forest eradication, etc. Our future direction must strike the happy, healthy medium of mixed farms, resurrecting the old traditional extensive system as a basic framework, then bolstering up productivity to present day demands by incorporating a more updated application of biological science into farming systems. (10) It does not appear, then, that livestock farming, when properly practiced, damages the environment. Nor does it appear that world vegetarianism or exclusively relying on agriculture to supply the world with food are feasible or ecologically wise ideas. Myth #2: Vitamin B12 can be obtained from plant sources. Of all the myths, this is perhaps the most dangerous. While lacto and lacto-ovo vegetarians have sources of vitamin B12 in their diets (from dairy products and eggs), vegans (total vegetarians) do not. Vegans who do not supplement their diet with vitamin B12 will eventually get anemia (a fatal condition) as well as severe nervous and digestive system damage; most, if not all, vegans have impaired B12 metabolism, and every study of vegan groups has demonstrated low vitamin B12 concentrations in the majority of individuals (11). Several studies have documented B12 deficiencies in vegan children, often with dire consequences (12). Additionally, claims are made in vegan and vegetarian literature that B12 is present in certain algae, tempeh (a fermented soy product) and Brewer’s yeast. All of these claims are false, as vitamin B12 is only found in animal foods. Brewer’s and nutritional yeasts do not contain B12 naturally; they are always fortified from an outside source. There is no true B12 in plant sources, only B12 analogues; they are similar to true B12, but not exactly the same, and because of this they are not bioavailable (13). It should be noted here that these B12 analogues can impair absorption of true vitamin B12 in the body due to competitive absorption, placing vegans and vegetarians who consume lots of soy, algae, and yeast at a greater risk for a deficiency (14). Some vegetarian authorities claim that B12 is produced by certain fermenting bacteria in the lower intestines. This may be true, but it is in a form unusable by the body. B12 requires intrinsic factor from the stomach for proper absorption in the ileum. 
Since the bacterial product does not have intrinsic factor bound to it, it cannot be absorbed (15). It is true that Hindu vegans living in certain parts of India do not suffer from vitamin B12 deficiency. This has led some to conclude that plant foods do provide this vitamin. This conclusion, however, is erroneous as many small insects, their feces, eggs, larvae and/or residue, are left on the plant foods these people consume, due to non-use of pesticides and inefficient cleaning methods. This is how these people obtain their vitamin B12. This contention is borne out by the fact that when vegan Indian Hindus later migrated to England, they came down with megaloblastic anaemia within a few years. In England, the food supply is cleaner, and insect residues are completely removed from plant foods (16). The only reliable and absorbable sources of vitamin B12 are animal products, especially organ meats and eggs (17). Though present in lesser amounts than meat and eggs, dairy products do contain B12. Vegans, therefore, should consider adding dairy products into their diets. If dairy cannot be tolerated, eggs, preferably from free-run hens, are a virtual necessity. That vitamin B12 can only be obtained from animal foods is one of the strongest arguments against veganism being a “natural” way of human eating. Today, vegans can avoid anemia by taking supplemental vitamins or fortified foods. If those same people had lived just a few decades ago, when these products were unavailable, they would have died. Myth #3: Our needs for vitamin D can be met by sunlight. Though not really a vegetarian myth per se, it is widely believed that one’s vitamin D needs can be met simply by exposing one’s skin to the sun’s rays for 15-20 minutes a few times a week. Concerns about vitamin D deficiencies in vegetarians and vegans always exist as this nutrient, in its full-complex form, is only found in animal fats (18) which vegans do not consume and more moderate vegetarians only consume in limited quantities due to their meatless diets. It is true that a limited number of plant foods such as alfalfa, sunflower seeds, and avocado, contain the plant form of vitamin D (ergocalciferol, or vitamin D2). Although D2 can be used to prevent and treat the vitamin D deficiency disease, rickets, in humans, it is questionable, though, whether this form is as effective as animal-derived vitamin D3 (cholecalciferol). Some studies have shown that D2 is not utilized as well as D3 in animals (19) and clinicians have reported disappointing results using vitamin D2 to treat vitamin D-related conditions (20). Although vitamin D can be created by our bodies by the action of sunlight on our skin, it is very difficult to obtain an optimal amount of vitamin D by a brief foray into the sun. There are three ultraviolet bands of radiation that come from sunlight named A, B, and C. Only the “B” form is capable of catalyzing the conversion of cholesterol to vitamin D in our bodies (21) and UV-B rays are only present at certain times of day, at certain latitudes, and at certain times of the year (22). Furthermore, depending on one’s skin color, obtaining 200-400 IUs of vitamin D from the sun can take as long as two full hours of continual sunning (23). A dark-skinned vegan, therefore, will find it impossible to obtain optimal vitamin D intake by sunning himself for 20 minutes a few times a week, even if sunning occurs during those limited times of the day and year when UV-B rays are available. The current RDA for vitamin D is 400 IUs, but Dr. 
Weston Price’s seminal research into the diets of healthy native adults showed that their daily intake of vitamin D (from animal foods) was about 10 times that amount, or 4,000 IUs (24). Accordingly, Dr. Price placed a great emphasis on vitamin D in the diet. Without vitamin D, for example, it is impossible to utilize minerals like calcium, phosphorus, and magnesium. Recent research has confirmed Dr. Price’s higher recommendations for vitamin D in adults (24). Since rickets and/or low vitamin D levels have been well documented in many vegetarians and vegans (26), since animal fats are either lacking or deficient in vegetarian diets (as well as in those of the general Western public, who routinely try to cut their animal fat intake), since sunlight is only a source of vitamin D at certain times and at certain latitudes, and since current dietary recommendations for vitamin D are too low, reliable and abundant sources of this nutrient are clearly needed in our daily diets. Good sources include cod liver oil, lard from pigs that were exposed to sunlight, shrimp, wild salmon, sardines, butter, full-fat dairy products, and eggs from properly fed chickens. Myth #4: The body’s needs for vitamin A can be entirely obtained from plant foods. True vitamin A, or retinol and its associated esters, is only found in animal fats and organs like liver (27). Plants do contain beta-carotene, a substance that the body can convert into vitamin A if certain conditions are present (see below). Beta-carotene, however, is not vitamin A. It is typical for vegans and vegetarians (as well as most popular nutrition writers) to say that plant foods like carrots and spinach contain vitamin A and that beta-carotene is just as good as vitamin A. These things are not true, even though beta-carotene is an important nutritional factor for humans. The conversion from carotene to vitamin A in the intestines can only take place in the presence of bile salts. This means that fat must be eaten with the carotenes to stimulate bile secretion. Additionally, infants and people with hypothyroidism, gall bladder problems or diabetes (altogether, a significant portion of the population) either cannot make the conversion or do so very poorly. Lastly, the body’s conversion from carotene to vitamin A is not very efficient: it takes roughly 6 units of carotene to make one unit of vitamin A. What this means is that a sweet potato (containing about 25,000 units of beta-carotene) will only convert into about 4,000 units of vitamin A (assuming you ate it with fat, are not diabetic, are not an infant, and do not have a thyroid or gall bladder problem) (28). Relying on plant sources for vitamin A, then, is not a very wise idea. This provides yet another reason to include animal foods and fats in our diets. Butter and full-fat dairy foods, especially from pastured cows, are good vitamin A sources, as is cod liver oil. Vitamin A is all-important in our diets, for it enables the body to use proteins and minerals, ensures proper vision, enhances the immune system, enables reproduction, and fights infections (29). As with vitamin D, Dr. Price found that the diets of healthy primitive peoples supplied substantial amounts of vitamin A, again emphasizing the great need humans have for this nutrient in maintaining optimal health now and for future generations. 
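To make the arithmetic behind Myths #3 and #4 explicit, the short sketch below simply re-runs the round figures quoted above (roughly 200-400 IU of vitamin D per two hours of continual sunning, the approximately 4,000 IU daily intake Price observed, about 25,000 units of beta-carotene in a sweet potato, and a rough 6-to-1 carotene-to-vitamin-A conversion). These are the article's own approximations, used here only for illustration and not as clinical guidance.

```python
# Illustrative arithmetic only, using the approximate figures quoted in the text above.

# Myth #3: vitamin D from sunlight
iu_per_two_hours_low, iu_per_two_hours_high = 200, 400   # IU gained in roughly 2 hours of continual sunning
target_iu = 4000                                          # approximate daily intake Price observed
hours_low = target_iu / (iu_per_two_hours_high / 2)       # best case: 200 IU per hour
hours_high = target_iu / (iu_per_two_hours_low / 2)       # worst case: 100 IU per hour
print(f"Sunning needed for ~{target_iu} IU: about {hours_low:.0f} to {hours_high:.0f} hours per day")
# -> roughly 20 to 40 hours per day, far beyond what brief, occasional sunning can supply

# Myth #4: beta-carotene to vitamin A conversion
carotene_units = 25000      # approximate beta-carotene content of one sweet potato
conversion_ratio = 6        # roughly 6 units of carotene per unit of vitamin A
vitamin_a_units = carotene_units / conversion_ratio
print(f"~{carotene_units} units of carotene -> about {vitamin_a_units:.0f} units of vitamin A")
# -> about 4,200 units, in line with the article's ~4,000 estimate (under ideal conditions)
```

Even on the most favorable assumption, the sunlight route would require an impossible amount of daily exposure, and the sweet potato figure holds only when all of the conversion conditions described above are met.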
Myth #5: Meat-eating causes osteoporosis, kidney disease, heart disease, and cancer. Oftentimes, vegans and vegetarians will try to scare people into avoiding animal foods and fats by claiming that vegetarian diets offer protection from certain chronic diseases like the ones listed above. Such claims, however, are hard to reconcile with historical and anthropological facts. All of the diseases mentioned are primarily 20th century occurrences, yet people have been eating meat and animal fat for many thousands of years. Further, as Dr. Price’s research showed, there were/are several native peoples around the world (the Innuit, Maasai, Swiss, etc.) whose traditional diets were/are very rich in animal products, but who nevertheless did/do not suffer from the above-mentioned maladies (30). Dr. George Mann’s independent studies of the Maasai, done many years after Dr. Price’s, confirmed that the Maasai, despite being almost exclusively meat eaters, had little to no incidence of heart disease or other chronic ailments (31). This proves that other factors besides animal foods are at work in causing these diseases. Several studies have supposedly shown that meat consumption is the cause of various illnesses, but such studies, honestly evaluated, show no such thing, as the following discussion will demonstrate. Osteoporosis Dr. Herta Spencer’s research on protein intake and bone loss clearly showed that protein consumption in the form of real meat has no impact on bone density. Studies that supposedly proved that excessive protein consumption equaled more bone loss were not done with real meat but with fractionated protein powders and isolated amino acids (32). Recent studies have also shown that increased animal protein intake contributes to stronger bone density in men and women (33). Some recent studies on vegan and vegetarian diets, however, have shown them to predispose women to osteoporosis (34). Kidney Disease Although protein-restricted diets are helpful for people with kidney disease, there is no proof that eating meat causes it (35). Vegetarians will also typically claim that animal protein causes overly acidic conditions in the blood, resulting in calcium leaching from the bones and, hence, a greater tendency to form kidney stones. This opinion is false, however. Theoretically, the sulphur and phosphorus in meat can form an acid when placed in water, but that does not mean that is what happens in the body. Actually, meat contains complete proteins and vitamin D (if the skin and fat are eaten), both of which help maintain pH balance in the bloodstream. Furthermore, if one eats a diet that includes enough magnesium and vitamin B6, and restricts refined sugars, one has little to fear from kidney stones, whether one eats meat or not (36). Animal foods like beef, pork, fish, and lamb are good sources of magnesium and B6, as any food/nutrient table will show. Heart Disease The belief that animal protein contributes to heart disease is a popular one that has no foundation in nutritional science. Outside of questionable studies, there is little data to support the idea that meat-eating leads to heart disease. For example, the French have one of the highest rates of per capita meat consumption, yet low rates of heart disease. In Greece, meat consumption is higher than average, but rates of heart disease are low there as well. Finally, in Spain, an increase in meat eating (in conjunction with a reduction in sugar and high carbohydrate intake) led to a decrease in heart disease (37). 
Cancer The belief that meat, in particular red meat, contributes to cancer is, like the belief about heart disease, a popular idea that is not supported by the facts. Although it is true that some studies have shown a connection between meat eating and some types of cancer (38), it’s important to look at the studies carefully to determine what kind of meat is being discussed, as well as the preparation methods used. Since we only have one word for “meat” in English, it is often difficult to know which “meat” is under discussion in a study unless the authors of the study specifically say so. The study which began the meat=cancer theory was done by Dr. Ernst Wynder in the 1970s. Wynder claimed that there was a direct, causal connection between animal fat intake and incidence of colon cancer (39). Actually, his data on “animal fats” were really on vegetable fats (40). In other words, the meat=cancer theory is based on a phony study. If one looks closely at the research, however, one quickly sees that it is processed meats like cold cuts and sausages that are usually implicated in cancer causation (41) and not meat per se. Furthermore, cooking methods seem to play a part in whether or not a meat becomes carcinogenic (42). In other words, it is the chemicals added to the meat and the chosen cooking method that are at fault, and not the meat itself. In the end, although sometimes a connection between meat and cancer is found, the actual mechanism of how it happens has eluded scientists (43). This means that it is likely that other factors besides meat are playing roles in some cases of cancer. Remember: studies of meat-eating traditional peoples show that they have very little incidence of cancer. This demonstrates that other factors are at work when cancer appears in a modern meat-eating person. It is not scientifically fair to single out one dietary factor in placing blame, while ignoring other more likely candidates. It should be noted here that Seventh Day Adventists are often studied in population analyses to prove that a vegetarian diet is healthier and is associated with a lower risk for cancer (but see a later paragraph in this section). While it is true that most members of this Christian denomination do not eat meat, they also do not smoke or drink alcohol, coffee or tea, all of which are likely factors in promoting cancer (44). The Mormons are a religious group often overlooked in vegetarian studies. Although their Church urges moderation, Mormons do not abstain from meat. As with the Adventists, Mormons also avoid tobacco, alcohol, and caffeine. Despite being meat eaters, a study of Utah Mormons showed they had a 22% lower rate for cancer in general and a 34% lower mortality for colon cancer than the US average (45). A study of Puerto Ricans, who eat large amounts of fatty pork, nevertheless revealed very low rates of colon and breast cancer (46). Similar results can be adduced to demonstrate that meat and animal fat consumption do not correlate with cancer (47). Obviously, other factors are at work. It is usually claimed that vegetarians have lower cancer rates than meat-eaters, but a 1994 study of vegetarian California Seventh Day Adventists showed that, while they did have lower rates for some cancers (e.g., breast and lung), they had higher rates for several others (Hodgkin’s disease, malignant melanoma, brain, skin, uterine, prostate, endometrial, cervical and ovarian), some quite significantly. 
In that study the authors actually admitted that: Meat consumption, however, was not associated with a higher [cancer] risk. And that, No significant association between breast cancer and a high consumption of animal fats or animal products in general was noted. (48) Further, it is usually claimed that a diet rich in plant foods like whole grains and legumes will reduce one’s risks for cancer, but research going back to the last century demonstrates that carbohydrate-based diets are the prime dietary instigators of cancer, not diets based on minimally processed animal foods (49). The mainstream health and vegetarian media have done such an effective job of “beef bashing” that most people think there is nothing healthful about meat, especially red meat. In reality, however, animal flesh foods like beef and lamb are excellent sources of a variety of nutrients, as any food/nutrient table will show. Nutrients like vitamins A, D, several of the B-complex, essential fatty acids (in small amounts), magnesium, zinc, phosphorus, potassium, iron, taurine, and selenium are abundant in beef, lamb, pork, fish and shellfish, and poultry. Nutritional factors like coenzyme Q10, carnitine, and alpha-lipoic acid are also present. Some of these nutrients are only found in animal foods–plants do not supply them. Myth #6: Saturated fats and dietary cholesterol cause heart disease, atherosclerosis, and/or cancer, and low-fat, low-cholesterol diets are healthier for people. This, too, is not a specific vegetarian myth. Nevertheless, people are often urged to take up a vegetarian or vegan diet because it is believed that such diets offer protection against heart disease and cancer since they are lower or lacking in animal foods and fats. Although it is commonly believed that saturated fats and dietary cholesterol “clog arteries” and cause heart disease, such ideas have been shown to be false by such scientists as Linus Pauling, Russell Smith, George Mann, John Yudkin, Abram Hoffer, Mary Enig, Uffe Ravnskov and other prominent researchers (50). On the contrary, studies have shown that arterial plaque is primarily composed of unsaturated fats, particularly polyunsaturated ones, and not the saturated fat of animals, palm or coconut (51). Trans-fatty acids, as opposed to saturated fats, have been shown by researchers such as Enig, Mann and Fred Kummerow to be causative factors in accelerated atherosclerosis, coronary heart disease, cancer and other ailments (52). Trans-fatty acids are found in such modern foods as margarine and vegetable shortening and foods made with them. Enig and her colleagues have also shown that excessive omega-6 polyunsaturated fatty acid intake from refined vegetable oils is a major culprit behind cancer and heart disease, not animal fats. A recent study of thousands of Swedish women supported Enig’s conclusions and data, and showed no correlation between saturated fat consumption and increased risk for breast cancer. However, the study did show, as did Enig’s work, a strong link between vegetable oil intake and higher breast cancer rates (53). The major population studies that supposedly prove the theory that animal fats and cholesterol cause heart disease actually do not upon closer inspection. The Framingham Heart Study is often cited as proof that dietary cholesterol and saturated fat intake cause heart disease and ill health. Involving about 6,000 people, the study compared two groups over several years at five-year intervals. 
One group consumed little cholesterol and saturated fat, while the other consumed high amounts. Surprisingly, Dr William Castelli, the study’s director, said: In Framingham, Mass., the more saturated fat one ate, the more cholesterol one ate, the more calories one ate, the lower the person’s serum cholesterol … we found that the people who ate the most cholesterol, ate the most saturated fat, [and] ate the most calories, weighed the least and were the most physically active. (54) The Framingham data did show that subjects who had higher cholesterol levels and weighed more ran a slightly higher chance for coronary heart disease. But weight gain and serum cholesterol levels had an inverse correlation with dietary fat and cholesterol intake. In other words, there was no correlation at all (55). In a similar vein, the US Multiple Risk Factor Intervention Trial, sponsored by the National Heart and Lung Institute, compared mortality rates and eating habits of 12,000+ men. Those who ate less saturated fat and cholesterol showed a slightly reduced rate of heart disease, but had an overall mortality rate much higher than the other men in the study (56). Low-fat/cholesterol diets, therefore, are not healthier for people. Studies have shown repeatedly that such diets are associated with depression, cancer, psychological problems, fatigue, violence and suicide (57). Women with lower serum cholesterol live shorter lives than women with higher levels (58). Similar things have been found in men (59). Children on low-fat and/or vegan diets can suffer from growth problems, failure to thrive, and learning disabilities (60). Despite this, sources from Dr Benjamin Spock to the American Heart Association recommend low-fat diets for children! One can only lament the fate of those unfortunate youngsters who will be raised by unknowing parents taken in by such genocidal misinformation. There are many health benefits to saturated fats, depending on the fat in question. Coconut oil, for example, is rich in lauric acid, a potent antifungal and antimicrobial substance. Coconut also contains appreciable amounts of caprylic acid, also an effective antifungal (61). Butter from free-range cows is rich in trace minerals, especially selenium, as well as all of the fat-soluble vitamins and beneficial fatty acids that protect against cancer and fungal infections (62). In fact, the body needs saturated fats in order to properly utilize essential fatty acids (63). Saturated fats also lower the blood levels of the artery-damaging lipoprotein (a) (64); are needed for proper calcium utilization in the bones (65); stimulate the immune system (66); are the preferred food for the heart and other vital organs (67); and, along with cholesterol, add structural stability to the cell and intestinal wall (68). They are excellent for cooking, as they are chemically stable and do not break down under heat, unlike polyunsaturated vegetable oils. Omitting them from one’s diet, then, is ill-advised. With respect to atherosclerosis, it is always claimed that vegetarians have much lower rates of this condition than meat eaters. The International Atherosclerosis Project of 1968, however, which examined over 20,000 corpses from several countries, concluded that vegetarians had just as much atherosclerosis as meat eaters (69). Other population studies have revealed similar data. (70) This is because atherosclerosis is largely unrelated to diet; it is a consequence of aging. 
There are things that can accelerate the atherosclerotic process, such as excessive free radical damage to the arteries from antioxidant depletion (caused by such things as smoking, poor diet, excess polyunsaturated fatty acids in the diet, various nutritional deficiencies, drugs, etc.), but this is to be distinguished from the fatty streaking and hardening of arteries that occurs in all peoples over time. It also does not appear that vegetarian diets protect against heart disease. A study on vegans in 1970 showed that female vegans had higher rates of death from heart disease than non-vegan females (71). A recent study showed that Indians, despite being vegetarians, have very high rates of coronary artery disease (72). High-carbohydrate/low-fat diets (which is what vegetarian diets are) can also place one at a greater risk for heart disease, diabetes, and cancer due to their hyperinsulinemic effects on the body (73). Recent studies have also shown that vegetarians have higher homocysteine levels in their blood (74). Homocysteine is a known cause of heart disease. Lastly, low-fat/cholesterol diets, generally favored to either prevent or treat heart disease, do neither and may actually increase certain risk factors for this condition (75). Studies which conclude that vegetarians are at a lower risk for heart disease are typically based on the phony markers of lower saturated fat intake, lower serum cholesterol levels and HDL/LDL ratios. Since vegetarians tend to eat less saturated fat and usually have lower serum cholesterol levels, it is concluded that they are at less risk for heart disease. Once one realizes that these measurements are not accurate predictors of proneness to heart disease, however, the supposed protection of vegetarianism melts away (76). It should always be remembered that a number of things factor into a person getting heart disease or cancer. Instead of focusing on the phony issues of saturated fat, dietary cholesterol, and meat-eating, people should pay more attention to other more likely factors. These would be trans-fatty acids, excessive polyunsaturated fat intake, excessive sugar intake, excessive carbohydrate intake, smoking, certain vitamin and mineral deficiencies, and obesity. These things were all conspicuously absent in the healthy traditional peoples that Dr. Price studied. Myth #7: Vegetarians live longer and have more energy and endurance than meat-eaters. A vegetarian guidebook published in Great Britain made the following claim: You and your children don’t need to eat meat to stay healthy. In fact, vegetarians claim they are among the healthiest people around, and they can expect to live nine years longer than meat eaters (this is often because heart and circulatory diseases are rarer). These days almost half the population in Britain is trying to avoid meat, according to a survey by the Food Research Association in January 1990. (77) In commenting on this claim of extended lifespan, author Craig Fitzroy astutely points out that: The “nine-year advantage” is an oft-repeated but invariably unsourced piece of anecdotal evidence for vegetarianism. But anyone who believes that by snubbing mum’s Sunday roast they will be adding a decade to their years on the planet is almost certainly indulging in a bit of wishful thinking. (78) And that is what most of the claims for increased longevity in vegetarians are: anecdotal. There is no proof that a healthy vegetarian diet, when compared to a healthy omnivorous diet, will result in a longer life. 
Additionally, people who choose a vegetarian lifestyle typically also choose not to smoke, to exercise, in short, to live a healthier lifestyle. These things also factor into one’s longevity. In the scientific literature, there are surprisingly few studies done on vegetarian longevity. Russell Smith, PhD, in his massive review study on heart disease, showed that as animal product consumption increased among some study groups, death rates actually decreased! (79) Such results were not obtained among vegetarian subjects. For example, in a study published by Burr and Sweetnam in 1982, analysis of mortality data revealed that, although vegetarians had a slightly (.11%) lower rate of heart disease than non-vegetarians, the all-cause death rate was much higher for vegetarians (80). Despite claims that studies have shown that meat consumption increased the risk for heart disease and shortened lives, the authors of those studies actually found the opposite. For example, in a 1984 analysis of a 1978 study of vegetarian Seventh Day Adventists, HA Kahn concluded, Although our results add some substantial facts to the diet-disease question, we recognize how remote they are from establishing, for example, that men who frequently eat meat or women who rarely eat salad are thereby shortening their lives. (81) A similar conclusion was reached by D.A. Snowden (82). Despite these startling admissions, the studies nevertheless concluded the exact opposite and urged people to reduce animal foods from their diets. Further, both of these studies threw out certain dietary data that clearly showed no connection between eggs, cheese, whole milk, and fat attached to meat (all high fat and cholesterol foods) and heart disease. Dr. Smith commented, In effect the Kahn [and Snowden] study is yet another example of negative results which are massaged and misinterpreted to support the politically correct assertions that vegetarians live longer lives. (83) It is usually claimed that meat-eating peoples have a short life span, but the Aborigines of Australia, who traditionally eat a diet rich in animal products, are known for their longevity (at least before colonization by Europeans). Within Aboriginal society, there is a special caste of the elderly (84). Obviously, if no old people existed, no such group would have existed. In his book Nutrition and Physical Degeneration, Dr. Price has numerous photographs of elderly native peoples from around the world. Explorers such as Vilhjalmur Stefansson reported great longevity among the Innuit (again, before colonization). [85] Similarly, the Russians of the Caucasus mountains live to great ages on a diet of fatty pork and whole raw milk products. The Hunzas, also known for their robust health and longevity, eat substantial portions of goat’s milk which has a higher saturated fat content than cow’s milk (86). In contrast, the largely vegetarian Hindus of southern India have the shortest life-spans in the world, partly because of a lack of food, but also because of a distinct lack of animal protein in their diets (87). H. Leon Abrams’ comments are instructive here: Vegetarians often maintain that a diet of meat and animal fat leads to a pre-mature death. Anthropological data from primitive societies do not support such contentions. (88) With regards to endurance and energy levels, Dr Price traveled around the world in the 1920s and 1930s, investigating native diets. 
Without exception, he found a strong correlation between diets rich in animal fats, robust health and athletic ability. Special foods for Swiss athletes, for example, included bowls of fresh, raw cream. In Africa, Dr Price discovered that groups whose diets were rich in fatty meats and fish, and organ meats like liver, consistently carried off the prizes in athletic contests, and that meat-eating tribes always dominated tribes whose diets were largely vegetarian. (89) It is popular in sports nutrition to recommend “carb loading” for athletes to increase their endurance levels. But recent studies done in New York and South Africa show that the opposite is true: athletes who “carb loaded” had significantly less endurance than those who “fat loaded” before athletic events (90). Myth #8: The “cave man” diet was low-fat and/or vegetarian. Humans evolved as vegetarians. Our Paleolithic ancestors were hunter-gatherers, and three schools of thought have developed as to what their diet was like. One group argues for a high-fat and animal-based diet supplemented with seasonal fruits, berries, nuts, root vegetables and wild grasses. The second argues that primitive peoples consumed assorted lean meats and large amounts of plant foods. The third argues that our human ancestors evolved as vegetarians. The “lean” Paleolithic diet approach has been argued for quite vigorously by Drs. Loren Cordain and Boyd Eaton in a number of popular and professional publications (91). Cordain and Eaton are believers in the Lipid Hypothesis of heart disease–the belief (debunked in myth number six, above) that saturated fat and dietary cholesterol contribute to heart disease. Because of this, and the fact that Paleolithic peoples or their modern equivalents did/do not suffer from heart disease, Cordain and Eaton espouse the theory that Paleolithic peoples consumed most of their fat calories from monounsaturated and polyunsaturated sources and not saturated fats. Believing that saturated fats are dangerous to our arteries, Cordain and Eaton stay in step with current establishment nutritional thought and encourage modern peoples to eat a diet like that of our ancestors. This diet, they believe, was rich in lean meats and a variety of vegetables, but was low in saturated fat. The evidence they produce to support this theory is, however, very selective and misleading. (92) Saturated fats do not cause heart disease, as was shown above, and our Paleolithic ancestors ate quite a bit of saturated fat from a variety of plant and animal sources. From authoritative sources, we learn that prehistoric humans of the North American continent ate such animals as mammoth, camel, sloth, bison, mountain sheep, pronghorn antelope, beaver, elk, mule deer, and llama (93). “Mammoth, sloth, mountain sheep, bison, and beaver are fatty animals in the modern sense in that they have a thick layer of subcutaneous fat, as do the many species of bear and wild pig whose remains have been found at Paleolithic sites throughout the world.” (94) Analysis of many types of fat in game animals like antelope, bison, caribou, dog, elk, moose, seal, and mountain sheep shows that they are rich in saturates and monounsaturates, but relatively low in polyunsaturates. (95) Further, while buffalo and game animals may have lean, non-marbled muscle meats, it is a mistake to assume that only these parts were eaten by hunter-gatherer groups like the Native Americans, who often hunted animals selectively for their fat and fatty organs, as the following section will show. 
Anthropologists/explorers such as Vilhjalmur Stefansson reported that the Innuit and North American Indian tribes would worry when their catches of caribou were too lean: they knew sickness would follow if they did not consume enough fat (96). In other words, these primitive peoples did not like having to eat lean meat. Northern Canadian Indians would also deliberately hunt older male caribou and elk, for these animals carried a 50-pound slab of back fat on them which the Indians would eat with relish. This “back fat” is highly saturated. Native Americans would also refrain from hunting bison in the springtime (when the animals’ fat stores were low, due to scarce food supply during the winter), preferring to hunt, kill and consume them in the fall when they were fattened up (97). Explorer Samuel Hearne, writing in 1768, described how the Native American tribes he came in contact with would selectively hunt caribou just for the fatty parts: On the twenty-second of July, we met several strangers, whom we joined in pursuit of the caribou, which were at this time so plentiful that we got everyday a sufficient number for our support, and indeed too frequently killed several merely for the tongues, marrow, and fat. (98) While Cordain and Eaton are certainly correct in saying that our ancestors ate meat, their contentions about fat intake, as well as the type of fat consumed, are simply incorrect. While various vegetarian and vegan authorities like to think that we evolved as a species on a vegan or vegetarian diet, there exists little from the realm of nutritional anthropology to support these ideas. To begin with, in his journeys, Dr Price never once found a totally vegetarian culture. It should be remembered that Dr. Price visited and investigated several population groups who were, for all intents and purposes, the 20th century equivalents of our hunter-gatherer ancestors. Dr. Price was on the lookout for a vegetarian culture, but he came up empty. Price stated: As yet I have not found a single group of primitive racial stock which was building and maintaining excellent bodies by living entirely on plant foods. (99) Anthropological data support this: throughout the globe, all societies show a preference for animal foods and fats and our ancestors only turned to large scale farming when they had to due to increased population pressures (100). Abrams and other authorities have shown that prehistoric man’s quest for more animal foods was what spurred his expansion over the Earth, and that he apparently hunted certain species to extinction. (101) Price also found that those peoples who, out of necessity, consumed more grains and legumes, had higher rates of dental decay than those who consumed more animal products. In his papers on vegetarianism, Abrams presents archaeological evidence that supports this finding: skulls of ancient peoples who were largely vegetarian have teeth containing caries and abscesses and show evidence of tuberculosis and other infectious diseases (102). The appearance of farming and the increased dependence on plant foods for our subsistence was clearly harmful to our health. Finally, it is simply not possible for our prehistoric ancestors to have been vegetarian because they would not have been able to get enough calories or nutrients to survive on the plant foods that were available. 
The reason for this is that humans did not know how to cook or control fire at that time, and the great majority of plant foods, especially grains and legumes, must be cooked in order to render them edible to humans (103). Most people do not know that many of the plant foods we consume today are poisonous in their raw states (104). Based on all of this evidence, it is certain that our ancestors, the progenitors of humanity, ate a decidedly non-vegetarian diet that was rich in saturated fatty acids. Myth #9: Meat and saturated fat consumption have increased in the 20th century, with a corresponding increase in heart disease and cancer. Statistics do not bear out such fancies. Butter consumption has plummeted from 18 lb (8.165 kg) per person a year in 1900 to less than 5 lb (2.27 kg) per person a year today (105). Additionally, Westerners, urged on by government health agencies, have reduced their intake of eggs, cream, lard, and pork. Chicken consumption has risen in the past few decades, but chicken is lower in saturated fat than either beef or pork. Furthermore, a survey of cookbooks published in America in the last century shows that people of earlier times ate plenty of animal foods and saturated fats. For example, in the Baptist Ladies Cook Book (Monmouth, Illinois, 1895), virtually every recipe calls for butter, cream or lard. Recipes for creamed vegetables are numerous as well. A scan of the Searchlight Recipe Book (Capper Publications, 1931) turns up similar recipes: creamed liver, creamed cucumbers, hearts braised in buttermilk, etc. British Jews, as shown by the Jewish Housewives Cookbook (London, 1846), also had diets rich in cream, butter, eggs, and lamb and beef tallows. One recipe for German waffles, for example, calls for a dozen egg yolks and an entire pound of butter. A recipe for Oyster Pie from the Baptist cookbook calls for a quart of cream and a dozen eggs, and so forth and so on. It does not appear, then, that people ate leaner diets in the last century. It is true that beef consumption has risen in the last few decades, but what has also risen precipitously is consumption of margarine and other food products containing trans-fatty acids (106), lifeless packaged “foods,” processed vegetable oils (107), carbohydrates (108) and refined sugar (109). Since one does not see chronic diseases like cancer and heart disease in beef-eating native peoples like the Maasai and Samburu, it is not possible for beef to be the culprit behind these modern epidemics. This, of course, points the finger squarely at the other dietary factors as the most likely causes. Myth #10: Soy products are adequate substitutes for meat and dairy products. It is typical for vegans and vegetarians in the Western world to rely on various soy products for their protein needs. There is little doubt that the billion-dollar soy industry has profited immensely from the anti-cholesterol, anti-meat gospel of current nutritional thought. Whereas, not so long ago, soy was an Asian food primarily used as a condiment, now a variety of processed soy products proliferate in the North American market. While the traditionally fermented soy foods of miso, tamari, tempeh and natto are definitely healthful in measured amounts, the hyper-processed soy “foods” that most vegetarians consume are not. Non-fermented soybeans and foods made with them are high in phytic acid (110), an anti-nutrient that binds to minerals in the digestive tract and carries them out of the body. 
Vegetarians are known for their tendencies to mineral deficiencies, especially of zinc (111) and it is the high phytate content of grain and legume based diets that is to blame (112). Though several traditional food preparation techniques such as soaking, sprouting, and fermenting can significantly reduce the phytate content of grains and legumes (113), such methods are not commonly known about or used by modern peoples, including vegetarians. This places them (and others who eat a diet rich in whole grains) at a greater risk for mineral deficiencies. Processed soy foods are also rich in trypsin inhibitors, which hinder protein digestion. Textured vegetable protein (TVP), soy “milk” and soy protein powders, popular vegetarian meat and milk substitutes, are entirely fragmented foods made by treating soybeans with high heat and various alkaline washes to extract the beans’ fat content or to neutralize their potent enzyme inhibitors (110). These practices completely denature the beans’ protein content, rendering it very hard to digest. MSG, a neurotoxin, is routinely added to TVP to make it taste like the various foods it imitates (114). On a purely nutritional level, soybeans, like all legumes, are deficient in cysteine and methionine, vital sulphur-containing amino acids, as well as tryptophan, another essential amino acid. Furthermore, soybeans contain no vitamins A or D, required by the body to assimilate and utilize the beans’ proteins (115). It is probably for this reason that Asian cultures that do consume soybeans usually combine them with fish or fish broths (abundant in fat-soluble vitamins) or other fatty foods. Parents who feed their children soy-based formula should be aware of its extremely high phytoestrogen content. Some scientists have estimated a child being fed soy formula is ingesting the hormonal equivalent of five birth control pills a day (116). Such a high intake could have disastrous results. Soy formula also contains no cholesterol, vital for brain and nervous system development. Though research is still ongoing, some recent studies have indicated that soy’s phytoestrogens could be causative factors in some forms of breast cancer (117), penile birth defects (118), and infantile leukemia (119). Regardless, soy’s phytoestrogens, or isoflavones, have been definitely shown to depress thyroid function (120) and to cause infertility in every animal species studied so far (121). Clearly, modern soy products and isolated isoflavone supplements are not healthy foods for vegetarians, vegans, or anyone else, yet these are the very ones that are most consumed. Myth #11: The human body is not designed for meat consumption. Some vegetarian groups claim that since humans possess grinding teeth like herbivorous animals and longer intestines than carnivorous animals, this proves the human body is better suited for vegetarianism (122). This argument fails to note several human physiological features which clearly indicate a design for animal product consumption. First and foremost is our stomach’s production of hydrochloric acid, something not found in herbivores. HCL activates protein-splitting enzymes. Further, the human pancreas manufactures a full range of digestive enzymes to handle a wide variety of foods, both animal and vegetable. Further, Dr. Walter Voegtlin’s in-depth comparison of the human digestive system with that of the dog, a carnivore, and a sheep, a herbivore, clearly shows that we are closer in anatomy to the carnivorous dog than the herbivorous sheep. 
(123) While humans may have longer intestines than animal carnivores, they are not as long as herbivores; nor do we possess multiple stomachs like many herbivores, nor do we chew cud. Our physiology definitely indicates a mixed feeder, or an omnivore, much the same as our relatives, the mountain gorilla and chimpanzee who all have been observed eating small animals and, in some cases, other primates (124). Myth #12: Eating animal flesh causes violent, aggressive behavior in humans. Some authorities on vegetarian diet, such as Dr Rudolph Ballantine (125), claim that the fear and terror (if any, see myth #15) an animal experiences at death is somehow “transferred” into its flesh and organs and “becomes” a part of the person who eats it. In addition to the fact that no scientific studies exist to support such a theory, these thinkers would do well to remember the fact that a tendency to irrational anger is a symptom of low vitamin B12 levels which, as we have seen, are common in vegans and vegetarians. Furthermore, in his travels, Dr Price always noted the extreme happiness and ingratiating natures of the peoples he encountered, all of whom were meat-eaters. Myth #13: Animal products contain numerous, harmful toxins. A recent vegetarian newsletter claimed the following: Most people don’t realize that meat products are loaded with poisons and toxins! Meat, fish and eggs all decompose and putrefy extremely rapidly. As soon as an animal is killed, self-destruct enzymes are released, causing the formation of denatured substances called ptyloamines, which cause cancer. (126) This article then went on to mention “mad cow disease” (BSE), parasites, salmonella, hormones, nitrates and pesticides as toxins in animal products. If meat, fish and eggs do indeed generate cancerous “ptyloamines,” it is very strange that people have not been dying in droves from cancer for the past million years. Such sensationalistic and nonsensical claims cannot be supported by historical facts. Hormones, nitrates and pesticides are present in commercially raised animal products (as well as commercially raised fruits, grains and vegetables) and are definitely things to be concerned about. However, one can avoid these chemicals by taking care to consume range-fed, organic meats, eggs and dairy products which do not contain harmful, man-made toxins. Parasites are easily avoided by taking normal precautions in food preparations. Pickling or fermenting meats, as is custom in traditional societies, always protects against parasites. In his travels, Dr Price always found healthy, disease-free and parasite-free peoples eating raw meat and dairy products as part of their diets. Similarly, Dr Francis Pottenger, in his experiments with cats, demonstrated that the healthiest, happiest cats were the ones on the all-raw-food diet. The cats eating cooked meats and pasteurized milk sickened and died and had numerous parasites (127). Salmonella can be transmitted by plant products as well as animal. It is often claimed by vegetarians that meat is harmful to our bodies because ammonia is released from the breakdown of its proteins. Although it is true that ammonia production does result from meat digestion, our bodies quickly convert this substance into harmless urea. The alleged toxicity of meat is greatly exaggerated by vegetarians. “Mad Cow Disease,” or Bovine Spongiform Encephalopathy (BSE), is most likely not caused by cows eating animal parts with their food, a feeding practice that has been done for over 100 years. 
British organic farmer Mark Purdey has argued convincingly that cows that get Mad Cow Disease are the very ones that have had a particular organophosphate insecticide applied to their backs or have grazed on soils that lack magnesium but contain high levels of aluminum (128). Small outbreaks of “mad cow disease” have also occurred among people who reside near cement and chemical factories and in certain areas with volcanic soils (129). Purdey theorizes that the organophosphate pesticides got into the cows’ fat through a spraying program, and then were ingested by the cows again with the animal part feeding. Seen this way, it is the insecticides, via the parts feeding (and not the parts themselves or their associated “prions”), that have caused this outbreak. As noted before, cows have been eating ground-up animal parts in their feeds for over 100 years. It was never a problem before the introduction of these particular insecticides. Recently, Purdey has gained support from Dr. Donald Brown, a British biochemist who has also argued for a non-infectious cause of BSE. Brown attributes BSE to environmental toxins, specifically manganese overload (130). Myth #14: Eating meat or animal products is less “spiritual” than eating only plant foods. It is often claimed that those who eat meat or animal products are somehow less “spiritually evolved” than those who do not. Though this is not a nutritional or academic issue, those who do include animal products in their diet are often made to feel inferior in some way. This issue, therefore, is worth addressing. Several world religions place no restrictions on animal consumption; nor did their founders. The Jews eat lamb at their most holy festival, the Passover. Muslims also celebrate Ramadan with lamb before entering into their fast. Jesus Christ, like other Jews, partook of meat at the Last Supper (according to the canonical Gospels). It is true that some forms of Buddhism do place strictures on meat consumption, but dairy products are always allowed. Similar tenets are found in Hinduism. As part of the Samhain celebration, Celtic pagans would slaughter the weaker animals of the herds and cure their meat for the oncoming winter. It is not true, therefore, that eating animal foods is always connected with “spiritual inferiority”. Nevertheless, it is often claimed that, since eating meat involves the taking of a life, it is somehow tantamount to murder. Leaving aside the religious philosophies that often permeate this issue, what appears to be at hand is a misunderstanding of the life force and how it works. Modern peoples (vegetarian and non-vegetarian) have lost touch with what it takes to survive in our world–something native peoples never lose sight of. We do not necessarily hunt or clean our meats: we purchase steaks and chops at the supermarket. We do not necessarily toil in rice paddies: we buy bags of brown rice; and so forth, and so on. When Native Americans killed a game animal for food, they would routinely offer a prayer of thanks to the animal’s spirit for giving its life so that they could live. In our world, life feeds off life. Destruction is always balanced with generation. This is a good thing: unchecked, the life force becomes cancerous. If animal food consumption is viewed in this manner, it is hardly murder, but sacrifice. Modern peoples would do well to remember this. Myth #15: Eating animal foods is inhumane.
Without question, some commercially raised livestock live in deplorable conditions where sickness and suffering are common. In countries like Korea, food animals such as dogs are sometimes killed in horrific ways, e.g., beaten to death with a club. Our recommendations for animal foods consumption most definitely do not endorse such practices. As noted in our discussion of myth #1, commercial farming of livestock results in an unhealthy food product, whether that product be meat, milk, butter, cream or eggs. Our ancestors did not consume such substandard foodstuffs, and neither should we. It is possible to raise animals humanely. This is why organic, preferably Biodynamic, farming is to be encouraged: it is cleaner and more efficient, and produces healthier animals and foodstuffs from those animals. Each person should make every effort, then, to purchase organically raised livestock (and plant foods). Not only does this better support our bodies, as organic foods are more nutrient-dense (131) and are free from hormone and pesticide residues, but this also supports smaller farms and is therefore better for the economy (132). Nevertheless, many people have philosophical problems with eating animal flesh, and these sentiments must be respected. Dairy products and eggs, though, are not the result of an animal’s death and are fine alternatives for these people. It should also not be forgotten that agriculture, which involves both the clearance of land to plant crops and the protection and maintenance of those crops, results in many animal deaths (133). The belief, therefore, that “becoming vegetarians” will somehow spare animals from dying is one with no foundation in fact. The Value of Vegetarianism As a cleansing diet, vegetarianism is sometimes a good choice. Several health conditions (e.g., gout) can often be ameliorated by a temporary reduction in animal products with an increase of plant foods. But such measures must not be continuous throughout life: there are vital nutrients found only in animal foods that we must ingest for optimal health. Furthermore, there is no one diet that will work for every person. Some vegetarians and vegans, in their zeal to get converts, are blind to this biochemical fact. “Biochemical individuality” is a subject worth clarifying. Coined by nutritional biochemist Roger Williams, PhD, the term refers to the fact that different people require different nutrients based on their unique genetic make-up. Ethnic and racial background figure in this concept as well. A diet that works for one may not work as well for someone else. As a practitioner, I’ve seen several clients following a vegetarian diet with severe health problems: obesity, candidiasis, hypothyroidism, cancer, diabetes, leaky gut syndrome, anemia and chronic fatigue. Because of the widespread rhetoric that a vegetarian diet is “always healthier” than a diet that includes meat or animal products, these people saw no reason to change their diet, even though that was the cause of their problems. What these people actually needed for optimal health was more animal foods and fats and fewer carbohydrates. Further, due to peculiarities in genetics and individual biochemistry, some people simply cannot do a vegetarian diet because of such things as lectin intolerance and desaturating enzyme deficiencies. Lectins present in legumes, a prominent feature of vegetarian diets, are not tolerated by many people. Others have grain sensitivities, especially to gluten, or to grain proteins in general. 
Again, since grains are a major feature of vegetarian diets, such people cannot thrive on them (134). Desaturase enzyme deficiencies are usually present in people of Inuit, Scandinavian, Northern European, and seacoast ancestry. They lack the ability to convert alpha-linolenic acid into EPA and DHA, two omega-3 fatty acids intimately involved in the function of the immune and nervous systems. The reason is that these people’s ancestors got an abundance of EPA and DHA from the large amounts of cold-water fish they ate. Over time, because of non-use, they lost the ability to manufacture the necessary enzymes to create EPA and DHA in their bodies. For these people, vegetarianism is simply not possible. They MUST get their EPA and DHA from food, and EPA is found only in animal foods. DHA is present in some algae, but the amounts are much lower than in fish oils (135). It is also apparent that vegan diets are not suitable for people whose livers do not produce adequate cholesterol, since cholesterol is found only in animal foods. It is often said that the body makes enough cholesterol to get by and that there is no reason to consume foods that contain it (animal foods). Recent research, however, has shown otherwise. Singer’s work at the University of California, Berkeley, has shown that the cholesterol in eggs improves memory in older people (136). In other words, these elderly people’s own cholesterol was insufficient to improve their memory, but added dietary cholesterol from eggs was. Though it appears that some people do well on little or no meat and remain healthy as lacto-vegetarians or lacto-ovo-vegetarians, the reason is that these diets are healthier for those people, not because they’re healthier in general. However, a total absence of animal products, whether meat, fish, insects, eggs, butter or dairy, is to be avoided. Though it may take years, problems will eventually ensue under such dietary regimes and they will certainly show in future generations. Dr. Price’s seminal research unequivocally demonstrated this. The reason for this is simple evolution: humanity evolved eating animal foods and fats as part of its diet, and our bodies are suited and accustomed to them. One cannot change evolution in a few years. Dr. Abrams said it well when he wrote: Humans have always been meat-eaters. The fact that no human society is entirely vegetarian, and those that are almost entirely vegetarian suffer from debilitated conditions of health, seems unequivocally to prove that a plant diet must be supplemented with at least a minimum amount of animal protein to sustain health. Humans are meat-eaters and always have been. Humans are also vegetable eaters and always have been, but plant foods must be supplemented by an ample amount of animal protein to maintain optimal health. (137) Author’s Notes: The author would like to thank Sally Fallon, MA; Lee Clifford, MS, CCN; and Dr. H. Leon Abrams, Jr., for their gracious assistance in preparing and reviewing this paper. This paper was not sponsored or paid for by the meat or dairy industries. The Ethics of Eating Meat: A Radical View Posted on June 30, 2002 by Charles Eisenstein Most vegetarians I know are not primarily motivated by nutrition. Although they argue strenuously for the health benefits of a vegetarian diet, many see good health as a reward for the purity and virtue of a vegetarian diet, or as an added bonus.
In my experience, a far more potent motivator among vegetarians–ranging from idealistic college students, to social and environmental activists, to adherents of Eastern spiritual traditions like Buddhism and Yoga–is the moral or ethical case for not eating meat. Enunciated with great authority by such spiritual luminaries as Mahatma Gandhi, and by environmental crusaders such as Frances Moore Lappe, the moral case against eating meat seems at first glance to be overpowering. As a meat eater who cares deeply about living in harmony with the environment, and as an honest person trying to eliminate hypocrisy in the way I live, I feel compelled to take these arguments seriously. A typical argument goes like this: In order to feed modern society’s enormous appetite for meat, animals endure unimaginable suffering in conditions of extreme filth, crowding and confinement. Chickens are packed twenty to a cage, hogs are kept in concrete stalls so narrow they can never turn around. Arguing for the Environment The cruelty is appalling, but no less so than the environmental effects. Meat animals are fed anywhere from five to fifteen pounds of vegetable protein for each pound of meat produced–an unconscionable practice in a world where many go hungry. Whereas one-sixth an acre of land can feed a vegetarian for a year, over three acres are required to provide the grain needed to raise a year’s worth of meat for the average meat-eater. All too often, so the argument goes, those acres consist of clear-cut rain forests. The toll on water resources is equally grim: the meat industry accounts for half of US water consumption–2500 gallons per pound of beef, compared to 25 gallons per pound of wheat. Polluting fossil fuels are another major input into meat production. As for the output, 1.6 million tons of livestock manure pollutes our drinking water. And let’s not forget the residues of antibiotics and synthetic hormones that are increasingly showing up in municipal water supplies. Even without considering the question of taking life (I’ll get to that later), the above facts alone make it clear that it is immoral to aid and abet this system by eating meat. Factory or Farm? I will not contest any of the above statistics, except to say that they only describe the meat industry as it exists today. They constitute a compelling argument against the meat industry, not meat-eating. For in fact, there are other ways of raising animals for food, ways that make livestock an environmental asset rather than a liability, and in which animals do not lead lives of suffering. Consider, for example, a traditional mixed farm combining a variety of crops, pasture land and orchards. Here, manure is not a pollutant or a waste product; it is a valuable resource contributing to soil fertility. Instead of taking grain away from the starving millions, pastured animals actually generate food calories from land unsuited to tillage. When animals are used to do work–pulling plows, eating bugs and turning compost–they reduce fossil fuel consumption and the temptation to use pesticides. Nor do animals living outdoors require a huge input of water for sanitation. In a farm that is not just a production facility but an ecology, livestock has a beneficial role to play. The cycles, connections and relationships among crops, trees, insects, manure, birds, soil, water and people on a living farm form an intricate web, “organic” in its original sense, a thing of beauty not easily lumped into the same category as a 5000-animal concrete hog factory. 
Any natural environment is home to animals and plants, and it seems reasonable that an agriculture that seeks to be as close as possible to nature would incorporate both. Indeed, on a purely horticultural farm, wild animals can be a big problem, and artificial measures are required to keep them out. Nice rows of lettuce and carrots are an irresistible buffet for rabbits, woodchucks and deer, which can decimate whole fields overnight. Vegetable farmers must rely on electric fences, sprays, and–more than most people realize–guns and traps to protect their crops. If the farmer refrains from killing, raising vegetables at a profitable yield requires holding the land in a highly artificial state, cordoned off from nature. Yes, one might argue, but the idyllic farms of yesteryear are insufficient to meet the huge demand of our meat-addicted society. Even if you eat only organically raised meat, you are not being moral unless your consumption level is consistent with all of Earth’s six billion people sharing your diet. Production and Productivity Such an argument rests on the unwarranted assumption that our current meat industry seeks to maximize production. Actually it seeks to maximize profit, which means maximizing not “production” but “productivity”–units per dollar. In dollar terms it is more efficient to have a thousand cows in a high-density feedlot, eating corn monocultured on a chemically-dependent 5,000-acre farm, than it is to have fifty cows grazing on each of twenty 250-acre family farms. It is more efficient in dollar terms, and probably more efficient in terms of human labor too. Fewer farmers are needed, and in a society that belittles farming, that is considered a good thing. But in terms of beef per acre (or per unit of water, fossil fuel, or other natural capital) it is not more efficient. In an ideal world, meat would be just as plentiful perhaps, but it would be much more expensive. That is as it should be. Traditional societies understood that meat is a special food; they revered it as one of nature’s highest gifts. To the extent that our society translates high value into high price, meat should be expensive. The prevailing prices for meat (and other food) are extraordinarily low relative to total consumer spending, both by historical standards and in comparison to other countries. Ridiculously cheap food impoverishes farmers, demeans food itself, and makes less “efficient” modes of production uneconomical. If food, and meat in particular, were more expensive, then perhaps we wouldn’t waste so much–another factor to consider in evaluating whether current meat consumption is sustainable. Moral Imperative So far I have addressed issues of cruel conditions and environmental sustainability, important moral motivations for vegetarianism, to be sure. But vegetarianism existed before the days of factory farming, and it was inspired by a simple, primal conviction that killing is wrong. It is just plain wrong to take another animal’s life unnecessarily; it is bloody, brutal, and barbaric. Of course, plants are alive too, and most vegetarian diets involve the killing of plants. (The exception is the fruit-only “fruitarian” diet.) Most people don’t accept that killing an animal is the same as killing a plant though, and few would argue that animals are not a more highly organized form of life, with greater sentience and greater capacity for suffering.
Compassion extends more readily to animals that cry out in fear and pain, though personally, I do feel sorry for garden weeds as I pull them out by the roots. Nonetheless, the argument “plants are alive too” is unlikely to satisfy the moral impulse behind vegetarianism. It should also be noted that mechanized vegetable farming involves massive killing of soil organisms, insects, rodents and birds. Again, this does not address the central vegetarian motivation, because this killing is incidental and can in principle be minimized. The soil itself, the earth itself, may, for all we know, be a sentient being, and surely an agricultural system, even if plant-based, that kills soil, kills rivers, and kills the land, is as morally reprehensible as any meat-oriented system, but again this does not address the essential issue of intent: Isn’t it wrong to kill a sentient being unnecessarily? One might also question whether this killing is truly unnecessary. Although the nutritional establishment looks favorably on vegetarianism, a significant minority of researchers vigorously dispute its health claims. An evaluation of this debate is beyond the scope of this article, but after many years of dedicated self-experimentation, I am convinced that meat is quite “necessary” for me to enjoy health, strength and energy. Does my good health outweigh another being’s right to life? This question leads us back to the central issue of killing. It is time to drop all unstated assumptions and meet this issue head-on. The Central Question Let’s start with a very naïve and provocative question: “What, exactly, is wrong about killing?” And for that matter, “What is so bad about dying?” It is impossible to fully address the moral implications of eating meat without thinking about the significance of life and death. Otherwise one is in danger of hypocrisy, stemming from our separation from the fact of death behind each piece of meat we eat. The physical and social distance from slaughterhouse to dinner table insulates us from the fear and pain the animals feel as they are led to the slaughter, and turns a dead animal into just “a piece of meat.” Such distance is a luxury our ancestors did not have: in ancient hunting and farming societies, killing was up close and personal, and it was impossible to ignore the fact that this was recently a living, breathing animal. Our insulation from the fact of death extends far beyond the food industry. Accumulating worldly treasures–wealth, status, beauty, expertise, reputation–we ignore the truth that they are impermanent, and therefore, in the end, worthless. “You can’t take it with you,” the saying goes, yet the American system, fixated on worldly acquisition, depends on the pretense that we can, and that these things have real value. Often only a close brush with death helps people realize what’s really important. The reality of death reveals as arrant folly the goals and values of conventional modern life, both collective and individual. It is no wonder, then, that our society, unprecedented in its wealth, has also developed a fear of death equally unprecedented in history. Both on a personal and institutional level, prolonging and securing life has become more important than how that life is lived. This is most obvious in our medical system, of course, in which death is considered the ultimate “negative outcome,” to which even prolonged agony is preferable.
I see the same kind of thinking in Penn State students, who choose to suffer the “prolonged agony” of studying subjects they hate, in order to get a job they don’t really love, in order to have financial “security.” They are afraid to live right, afraid to claim their birthright, which is to do joyful and exciting work. The same fear underlies our society’s lunatic obsession with “safety.” The whole American program now is to insulate oneself as much as possible from death–to achieve “security.” It comes down to the ego trying to make permanent what can never be permanent. Modern Dualism Digging deeper, the root of this fear, I think, lies in our culture’s dualistic separation of body and soul, matter and spirit, man and nature. The scientific legacy of Newton and Descartes holds that we are finite, separate beings; that life and its events are accidental; that the workings of life and the universe may be wholly explained in terms of objective laws applied to inanimate, elemental parts; and therefore, that meaning is a delusion and God a projection of our wishful thinking. If materiality is all there is, and if life is without real purpose, then of course death is the ultimate calamity. Curiously, the religious legacy of Newton and Descartes is not all that different. When religion abdicated the explanation of “how the world works”–cosmology–to physics, it retreated to the realm of the non-worldly. Spirit became the opposite of matter, something elevated and separate. It did not matter too much what you did in the world of matter, it was unimportant, so long as your (immaterial) “soul” were saved. Under a dualistic view of spirituality, living right as a being of flesh and blood, in the world of matter, becomes less important. Human life becomes a temporary excursion, an inconsequential distraction from the eternal life of the spirit. Other cultures, more ancient and wiser cultures, did not see it like this. They believed in a sacred world, of matter infused with spirit. Animism, we call it, the belief that all things are possessed of a soul. Even this definition betrays our dualistic presumptions. Perhaps a better definition would be that all things are soul. If all things are soul, then life in the flesh, in the material world, is sacred. These cultures also believed in fate, the futility of trying to live past one’s time. To live rightly in the time allotted is then a matter of paramount importance, and life a sacred journey. When death itself, rather than a life wrongly lived, is the ultimate calamity, it is easy to see why an ethical person would choose vegetarianism. To deprive a creature of life is the ultimate crime, especially in the context of a society that values safety over fun and security over the inherent risk of creativity. When meaning is a delusion, then ego–the self’s internal representation of itself in relation to not-self–is all there is. Death is never right, part of a larger harmony, a larger purpose, a divine tapestry, because there is no divine tapestry; the universe is impersonal, mechanical and soulless. Obsolete Science Fortunately, the science of Newton and Descartes is now obsolete. 
Its pillars of reductionism and objectivity are crumbling under the weight of 20th century discoveries in quantum mechanics, thermodynamics and nonlinear systems, in which order arises out of chaos, simplicity out of complexity, and beauty out of nowhere and everywhere; in which all things are connected; and in which there is something about the whole that cannot be fully understood in terms of its parts. Be warned, my views would not be accepted by most professional scientists, but I think there is much in modern science pointing to an ensouled world, in which consciousness, order and cosmic purpose are written into the fabric of reality. In an animistic and holistic world view, the moral question to ask oneself about food is not “Was there killing?” but rather, “Is this food taken in rightness and harmony?” The cow is a soul, yes, and so is the land and the ecosystem, and the planet. Did that cow lead the life a cow ought to lead? Is the way it was raised beautiful, or ugly (according to my current understanding)? Allying intuition and factual knowledge, I ask whether eating this food contributes to that tiny shred of the divine tapestry that I can see. Divine Tapestry There is a time to live and a time to die. That is the way of nature. If you think about it, prolonged suffering is rare in nature. Our meat industry profits from the prolonged suffering of animals, people and the Earth, but that is not the only way. When a cow lives the life a cow ought to live, when its life and death are consistent with a beautiful world, then for me there is no ethical dilemma in killing that cow for food. Of course there is pain and fear when the cow is taken to the slaughter (and when the robin pulls up the worm, and when the wolves down the caribou, and when the hand uproots the weed), and that makes me sad. There is much to be sad about in life, but underneath the sadness is a joy that is dependent not on avoiding pain and maximizing pleasure, but on living rightly and well. It would indeed be hypocritical of me to apply this to a cow and not to myself. To live with integrity as a killer of animals and plants, it is necessary for me in my own life to live rightly and well, even and especially when such decisions seem to jeopardize my comfort, security, and rational self-interest, even if, someday, to live rightly is to risk death. Not just for animals, but for me too, there is a time to live and a time to die. I’m saying: What is good enough for any living creature is good enough for me. Eating meat need not be an act of arrogant speciesism, but consistent with a humble submission to the tides of life and death. If this sounds radical or unattainable, consider that all those calculations of what is “in my interest” and what will benefit me and what I can “afford” grow tiresome. When we live rightly, decision by decision, the heart sings even when the rational mind disagrees and the ego protests. Besides, human wisdom is limited. Despite our machinations, we are ultimately unsuccessful at avoiding pain, loss and death. For animals, plants, and humans alike, there is more to life than not dying. Splendid Specimens: The History of Nutrition in Bodybuilding Posted on December 14, 2004 by Randy Roach The sport called bodybuilding demands the extreme in body presentation. No other athletic endeavor requires such high levels of regimentation for muscle development and body fat reduction.
To outsiders, such efforts may appear vain and self-centered, even looming out there on the lunatic fringe. Nevertheless, the sport has had considerable influence on other fields of athletics, not to mention the general public. We must remember that the men (and women) who sweat it out in the gym year after year were using the low-carbohydrate diet long before Dr. Atkins made it popular. Many other dietary strategies of today such as all-raw diets, protein supplementation, eating multiple small meals a day, carbohydrate loading, meal replacement packages and macro-nutrient balancing all derived their initial popularity from the bodybuilding field. Physical Culture Credit for the Physical Culture movement in North America, the precursor to the bodybuilding movement, goes to Bernarr Macfadden, an extraordinary entrepreneur who published physical culture magazines, organized physique competitions, wrote 150 books and accumulated millions in the publishing industry. Macfadden preached clean living and whole natural foods. He ate vast quantities of raw carrots, beet juice, fruits, dates, raisins, grains and nuts. He abstained from meat but recommended copious amounts of raw milk. In fact he even recommended an exclusive raw milk diet for extended periods. The dominant star of the early years was Eugen Sandow, whose career spanned the late 1890s and the early part of the 20th century. He did not display the typical burly brute image, but a finely chiseled body, resembling those of Roman and Greek athletes. With the help of Florenz Ziegfeld, he marketed and displayed his physique in artistic fashion. In fact, it was through this artistic expression that Sandow inspired Macfadden in the mid 1890s. In an 1894 interview on his dietary habits, Sandow claimed to abstain from hard liquor, coffee and tea, but consumed the occasional beer. He ate mostly wholesome foods, but indulged at selected opportunities. Sandow, along with most of the other Physical Culturists of his day, placed more emphasis on the mechanical aspects of diet as opposed to the chemical. He believed in doing what was necessary to facilitate good digestion, including eating at regular intervals, selecting simple foods, applying thorough mastication, eating slowly and tying it all together with a good night’s sleep. He was critical of over-indulgence and recommended foods with a high nutrient value, although he admitted to eating what he wanted, when he wanted, and however much he wanted during his younger years. Earle Liederman, author and friend of Sandow, also advocated whole natural foods. Liederman pointed out the importance of a strong digestive system enhanced by proper food mastication for men of strength and large appetites. He described the popularity of “beef juice” or “beef extract” for rapid muscle recovery. Liederman also felt obliged to mention that ice cream was very popular, referring to one lifter who often felt it necessary to finish his meals with a quart of vanilla ice cream. Arthur Saxon of the famous Saxon brothers trio and a contemporary of Eugen Sandow, also recommended nutrient-dense foods for endurance athletes. He warned against the dangers of hard liquor, but condoned beer. In fact, Saxon had a reputation for hefty beer drinking as did many men of strength of the time. He warned against smoking while admitting to being a smoker himself. For gaining muscle, Saxon recommended milk mixed with raw egg after a workout, milk with oatmeal, cheese, beans, peas, and meat. He called milk the perfect food. 
According to his brother Kurt, all three of the Saxon brothers had very hearty appetites. Along with his participation in the strength act, Kurt was also the trio’s chef. Kurt’s list of food consumed by the three brothers each day indicates substantial daily intake, with little self-denial. Milk is largely absent from Kurt’s menus. Raw vs. Cooked A debate that has been ongoing since the early days of Physical Culture concerns the relative virtues of raw food versus cooked. Sandow referred to the eating of raw eggs and under-cooked meats as nonsense and a practice that was “passing away.” In the raw food corner was champion wrestler George Hackenschmidt, the “Russian Lion,” a man rivaling Sandow’s strength, and surpassing him in athletic ability. Like Sandow, he was small by today’s standards, standing just under 5’10” and weighing about 200 pounds. However, he was enormously strong. Both a gentleman and sportsman, George Hackenschmidt reflected a spiritually conservative philosophy towards nutrition. In his book The Way to Life, he stated: “I believe I am right in asserting that our creator has provided food and nutriment for every being for its own advantage. Man is born without frying-pan or stewpot. The purest natural food for human beings would, therefore, be fresh, uncooked food and nuts.” He stated that a diet of three quarters vegetable food and one quarter meat would appear to be most satisfactory for the people of central Europe, but conceded that he himself had a hearty appetite, which, in his early training years, was based on 11 pints of milk per day, presumably raw, along with the rest of his diet. A prophet before his time, he warned about the dangers of refined sugar and meat from artificially fed and confined animals. He believed that most people ate too much flesh food from these improperly raised animals and encouraged more emphasis on natural raw foods. Vegetarianism The early bodybuilders also debated the pros and cons of vegetarianism. Macfadden and Hackenschmidt inclined towards diets that excluded meat, or that at least derived a preponderance of calories from plant foods. Juicing was popular among some. In his book Remembering Muscle Beach, Harold Zinkin describes fellow beach comrade Relna Brewer. At 17, Brewer worked in one of California’s first health food stores, located in Santa Monica. Relna’s job was to run the juice press. Because the owners of the store could not afford to pay much, Relna took out her pay in the celery, watermelon, orange and carrot juice she made each day. Jack Lalanne was probably one of Relna’s customers. Jack began his career as a vegetarian, bringing his own food, such as apple or carrot juice and vegetables, to train at the beach during the 1930s. However, Lalanne later ate meat when focused on bodybuilding. In fact, Armand Tanny says that Jack would visit the local stockyards to acquire cow’s blood to drink while in training. Later Lalanne reverted to his vegetarian ways, though he allowed some fish and eggs. Lalanne opened one of the first health studios in Oakland in 1936. A colleague writes that Lalanne would work 14 hours a day and then drive 400 miles through the night so he could be with the gang at Muscle Beach to participate in all the activities. When it came to pure energy and vitality, Lalanne was, and at 90 today still is, unbridled. Another vegetarian was Lionel Strongfort, who promoted a system of raw foods based on fruits, vegetables, eggs and milk. He recommended very little meat and cooked fat.
Strongfort suggested eating only two meals a day, a strategy shared by Macfadden that would re-emerge in the 60s and 70s. Strongfort and Macfadden both advised against overconsumption of food. They claimed overconsumption created a negative stress on the body’s systems, sensible advice that bodybuilding publications would ignore in the coming years. Perhaps the most accepted food across all the early eating models for bodybuilders was milk. One of the most popular protocols for building size and strength was the combination of back squatting and drinking large quantities of milk. Joseph Curtis Hise was a pioneer of this system in the 1930s, and after 70 years this strategy is still going strong in the drug-free world of bodybuilding. Tony Sansone Another Physical Culturist who advised against over-consumption was Tony Sansone, but Sansone understood the importance of flesh foods, including animal fats and organ meats. He wrote extensively on nutrition for bodybuilders and recommended nutrient-dense “foundation” foods such as milk, eggs, butter, meat, vegetables, fruits, and some whole grains, in that order. He also stressed the importance of organ meats such as liver, kidney and heart, as well as cod liver oil, and recognized the need to drink whole raw milk instead of pasteurized and skimmed. He believed goat’s milk was more nutritious and more easily digested than cow’s milk. Fresh butter and cream were his preferred fats. He also recommended six to eight glasses of water per day. Tony Sansone wisely stressed the importance of generous amounts of fat in the diet to allow the complete utilization of nitrogenous (protein) foods in building muscle tissue–a fundamental and important fact that would be lost as the era of protein supplements took hold. He also knew that weight loss was not a matter of simple calorie counting, as cellular uptake or utilization of food varied on an individual basis. In anticipation of Dr. Atkins, Sansone recommended his foundation foods of milk, eggs, meat, vegetables and fruit for strength and health, and starchy foods as weight manipulators. His recipe for gaining weight was to add more high-carbohydrate foods such as bread and potatoes to the diet, and for losing weight to simply reduce or remove them. Tony Sansone’s caveat to lose no more than two pounds of fat per week is still the standard used in bodybuilding today. Muscle Beach Muscle Beach got its start in the 1930s as the meeting place of young athletes who lifted weights, built human pyramids, tumbled, juggled and engaged in any other athletic endeavor they could think of. That era gave us many recognizable names such as Harold Zinkin (creator of the Universal weight machine), Joe Gold (creator of Gold’s Gym), Jack Lalanne, Harry Smith, and the Tanny brothers, Armand and Vic (who created a popular gymnasium chain). In fact, it is safe to say that much of the fitness industry grew out of Muscle Beach–gyms, gym chains, TV exercise programs, fitness equipment, women lifting weights, even aspects of the natural organic food movement stemmed from this small stretch of sand. According to Harry Smith, long-time gym owner, ex-pro wrestler and Muscle Beach alumnus, bodybuilders didn’t think much about specialty food or supplements in those days. The emphasis was on training rather than eating and resting. Harry did state that many of them tried to keep their eating clean, and that on a number of occasions they would frequent a small deli about one-half block from the beach.
The deli offered freshly ground beef into which some of the guys would mix raw onions and a little salt and pepper. The meat was eaten raw along with raw milk. Harry said it was a cheap and easy way to eat hearty and keep out of the restaurants. One important Muscle Beach raw food enthusiast was Armand Tanny. Originally a weightlifter, Armand had a fantastic physique and the strength to qualify him for the wrestling circuit. He visited the Hawaiian Islands just after the Second World War and came away with a lasting impression of the Samoans. “They ate everything raw,” he noted. “You name it, fish, meat, beetles–everything! They were so strong and healthy.” On his return to the US, he became interested in the work of Weston A. Price, stating that Price’s book Nutrition And Physical Degeneration served as his Bible. In 1948 he shut off his stove and ate just about everything raw from then on–tuna, beef, liver, lobster, oysters, clams, nuts, seeds, fruits and vegetables. Armand recalls wading out into the surf along the Santa Monica Pier and using his feet to kick up 6- to 7-inch Pismo clams, smashing them together to get at the pink and white flesh. Armand also took brewer’s yeast, desiccated liver, yogurt, blackstrap molasses and wheat germ oil, all recommendations of Gaylord Hauser, a nutritional guru of the era. Hauser also recommended fish liver oil, but Tanny felt he was getting plenty from all the raw fish he was consuming. Armand credited his 1950 Mr. USA and the Pro Mr. America titles to his raw meat diet. In the 1950s, he helped his brother Vic in the gym business and appeared in a Mae West act. His bodybuilding articles appeared prominently in bodybuilding publications for the remainder of the century, thus providing a link to Weston Price during the decade of the 50s. Bulking Up with John Grimek The biggest influence on bodybuilding in the 1930s and 1940s was John Grimek, the second Amateur Athletic Union (AAU) Mr. America and the first to win back-to-back titles, in 1940 and 1941. Many commentators believe that Grimek represents the beginning of modern bodybuilding as we know it today, describing him as having the best physique of the mid-century. During the early 1930s, at the start of his career, Grimek came under the influence of Mark Berry, editor of Strength magazine and an advocate of an eating protocol in which an athlete would bulk up in bodyweight and then train it off. At one point, Berry had Grimek beef up his 5’8” frame to 250 pounds. The practice would become commonplace by the 1950s and maintain a foothold for several decades after. Grimek bulked up on whatever was put in front of him, reports his wife Angela in a 1956 Health and Strength article entitled “Life with John.” “John has an enormous appetite. . . John has yet to find a restaurant that can do justice to his appetite. . . . Sometimes he goes on a restricted diet–and it is surprising how little he can get by on then. But when he goes all out, he can never be filled. . . . but the ‘hog’ (our pet name for John) just eats and eats and still remains trim and muscular.” By the 1950s, Grimek’s diet included Hershey chocolate bars and hi-protein tablets manufactured and promoted by Bob Hoffman, publisher of Strength and Health, a magazine that provided a platform for Grimek along with the new-fangled supplements coming on the market. Hoffman used Hershey chocolate in his products, so Grimek and the rest of the York gang had easy access to some empty calories.
Protein Powders and Supplements In the late 1930s a young pharmacist named Eugene Schiff developed a method of processing whey from milk for human consumption. He created Schiff Bio-Foods, a whey packaging company. This was a half century before whey concentrates would emerge as a popular supplement in the bodybuilding scene. For a short time he sold his packaged whey to local drug stores, then sold his own store to enter into the manufacturing and packaging of health foods. Schiff focused on supplements made from natural products. He began to experiment with whole foods such as brewer’s yeast, wheat germ and liver. He found that these foods were naturally rich in vitamins and minerals. The Schiff company claims that he was first to discover that rose hips were a superior source of vitamin C. Along with the first rose hip vitamin C supplement, he also launched one of the first multi-vitamin products, called “V-Complete.” The demand during World War II for non-perishable foods allowed the food industry to expand and popularize the market for powdered or dehydrated foods, and bodybuilders would eventually find their way into this market. Powdered milk and eggs, and later powdered soy protein, were promoted as an easy way to get additional protein into the diet. Breakfast drinks based on protein powder found their way into the diet of the legendary Steve Reeves, who years later wrote about this practice in his book Building The Classic Physique. Reeves’ impressive natural physique landed him starring roles in the films Hercules and Hercules Unchained in the late 1950s and inspired thousands of young men to adopt weight training. His recipe for a breakfast drink included fresh orange juice, Knox gelatin, honey, banana, raw eggs and a blend of skim milk, egg white and soy protein. The first protein powders “tailored” specifically for athletes appeared around 1950. One of these was called 44, “The Supplemental Food Beverage,” produced in California by a company called Kevo Products. The principal ingredient was dehydrated powdered whole soybeans, along with kelp, wheat germ, dextrose, and various dehydrated plants, herbs and flavorings. The supplement was sold at health food stores, body-building studios, and health institutes. Another popular product was Hi-Protein, “a protein food supplement derived from soya flour, milk proteins, and wheat. The free amino acids which include natural tryptophan and the other natural essential amino acids were produced by an acid hydrolysis.” The product was developed by bodybuilder and nutrition guru Irvin Johnson and promoted with before-and-after photographs of weaklings turned into musclemen. Bob Hoffman quickly capitalized on Johnson’s success by following immediately with his own soy-based product marketed heavily in Strength and Health. Hoffman’s infamous protein claimed many a victim with hives or gym-clearing gas. The debates on raw versus cooked and vegetarianism versus meat eating that appeared in bodybuilding magazines during the 1940s gave way to numerous articles on protein supplements in the 1950s, including “Building Biceps Faster With Food Supplements (Iron Man, December 1950),” “More and Better Protein Will Keep you Well (Strength & Health, March 1953),” “The Magical Power Of Protein (Mr. America, February 1958),” “Food Supplements Build Rock Hard Definition (Muscle Builder, June 1958)” and “Everyone Needs More Protein (Strength & Health, July 1959).” Meal replacement products also appeared during the 1950s, with much hype.
One product, called B-FIT, was recommended as a replacement for two or three regular meals per day. According to its promoters, B-FIT “is scientifically formulated to contain all the needed vitamins and minerals, plus ample supplies of the effective proteins and yet is so low in calories that the fatty tissue literally melts away. . . . You will not suffer from any nutritional deficiencies because B-FIT is a complete food insofar as scientific experiment and research is possible to develop. Approved by dieticians.” Advocates for new diet theories–food combining, alkaline-forming diets, even strict vegetarianism–promoted their ideas throughout the 1950s, but the big emphasis was on protein powders and supplements. For the 1954 world weightlifting championships, team coach Bob Hoffman hauled more than 100 pounds of his Hi-Protein powder to Vienna, hailing it as the “secret weapon” for his athletes. But Russia, whose athletes finished no lower than second place, had a secret weapon of its own. The Secret Weapon It was John Ziegler, a doctor accompanying the American team to Vienna, who exposed just what this Soviet weapon was. Ziegler claimed that after a few drinks, a Russian doctor told him that the Soviet athletes were using–and abusing–testosterone. Ziegler was no stranger to testosterone. With his background in rehabilitation therapy and his connection with CIBA Pharmaceuticals, he was already experimenting with testosterone on himself, his patients and some novice athletes. In fact, author and historian John Fair writes that even the great John Grimek was cooperating with Ziegler and trying his drugs in the summer of 1954. Grimek reported disappointing results. Both American and German research scientists had identified testosterone and noted its effects as far back as the mid-1930s. CIBA Pharmaceuticals was already targeting bodybuilders with ads for synthetic testosterone in 1947. With Ziegler’s help, CIBA manufactured the most popular anabolic steroid of the 20th century. The drug was Dianabol, which came out in 1958. The acceptance of steroid drugs among bodybuilders got off to a slow start. Drinking a gallon of milk or swallowing 2000 protein pills seemed more logical to them than taking a tiny pill to do the job. Even those who did take them were slow in accepting or acknowledging the fact that it was the steroids that were giving them such tremendous gains in muscle mass. Out on the West Coast, bodybuilding great Bill Pearl was also curious as to what the Russians were doing, so he took it upon himself to do his own research. During a visit to the University of California at Davis in 1958, he learned from a veterinarian about the successful use of steroids in beefing up cattle. Bill figured that if it was good enough for a bull, then it was good enough for him. While continuing to train hard, he took 30 mg of the steroid drug Nilevar (three times the recommended dose for humans, but an absolute joke by today’s practices) for 12 weeks and brought his bodyweight up from 225 to 250 pounds. Steroid use among athletes paralleled the challenge to conservative moral standards that characterized the era of the 1960s. It was a time that seemed ripe for the liberation of one’s desires. Individual freedoms took precedence over the rules, morals and ethics dictated by a long-established culture–and by Mother Nature. If the new generation could take mind-altering drugs, it could take body-altering drugs as well.
Anabolic (“building-up”) steroids such as testosterone ushered in a new bodybuilding look that was larger and more muscularly pronounced than ever before. During the early 1960s, the magazines emphasized caution about steroids. They acknowledged the rumors concerning Bill Pearl and others but tried to steer their readers away by stating that the drugs didn’t work, wouldn’t produce what bodybuilders expected, or were outright dangerous. Both Iron Man and Muscle Builder magazines warned of side effects and published articles claiming much better results with high-protein products. But behind the scenes, the athletes knew that they worked. Pearl openly acknowledged that he used them for a final time to prepare for the 1961 National Amateur Bodybuilding Association (NABBA) Mr. Universe contest. He stated that the drugs by then were no longer underground but well known to the top bodybuilders. Steroids and Cream Still, most athletes relied on diet for strength-building, and protein occupied a large percentage of that diet. In the early 1960s, Irvin Johnson targeted elite bodybuilders with a milk-and-egg protein blend considered far superior to the competing soy-based products, including an earlier product of his own. By the mid-60s, ads for Johnson’s protein blend began appearing in the bodybuilding magazines. At that time he changed his name to Rheo H. Blair. Blair claimed that his protein powder was made from milk and eggs obtained from animals raised on the rich soil of Wisconsin and that the proteins were extracted at very low temperatures. Wary of the difficulty some might have digesting all that protein, he endorsed hydrochloric acid supplements, to be taken with any protein meal. He also sold supplements such as amino acids, liver extract, B-complex and soybro (a combination of wheat germ, rice germ and soy germ oils). In 1966 he introduced a new protein formula which he claimed had a biological value resembling that of mother’s milk. Blair promoted his products with skillful salesmanship but he also made an important suggestion that would ensure that his products actually worked–he insisted that his protein be taken with raw cream or half and half. He was smart enough to know that you must replace the fat removed from protein during processing. He also recognized the benefits of raw dairy products. Athletes of the 1960s used a variety of recipes, varying the proportions of Blair’s protein product with raw cream, raw milk and raw egg yolk. Weight-trainer Don Howorth remembers eating 3 dozen eggs, 1 quart of raw cream, and 2 pounds of ground sirloin along with 2-3 cups of Blair’s protein powder per day. Blair had a special method for cooking his eggs. He did not cook them in boiling water but recommended cooking many eggs at one time in water maintained at 181 degrees for 31 minutes. The eggs were then left in the water to cool down slowly. Blair claimed that putting the eggs under cold water “shocked” many of the nutrients, rendering them ineffective, and that cooking eggs in this fashion preserved much of their nutritional value. It is interesting to read Peary Rader’s “Reader Roundup” column in his Iron Man magazine during this time. He tries to explain the spectacular gains made by some of the popular bodybuilders who were using Blair’s products. Many of them were eating 6000 to 9000 calories a day in the same fashion as Don Howorth and gaining muscle while maintaining or even trimming their waist size. Rader published Blair’s response in a 1966 issue of Iron Man.
Blair claimed that his protein powders, along with all of his other supplements, were formulated in a special manner to metabolize fat more efficiently. He also warned that taking cream with any protein powder other than his own would result in fat accumulation. But Blair could not help knowing that these dramatic results were not achieved on food and protein powders alone. Bodybuilders knew that they could expect to build muscle consuming 8000 calories per day, but not lose fat at the same time. That required some additional anabolic assistance. Blair knew his guys were taking steroids. Don Howorth readily admitted his past use of Dianabol, but was adamant about the importance of diet along with it. In fact, some bodybuilders were quite open about drugs. When Larry Scott, two-time winner of Mr. Olympia, was asked about his steroid use, he said without hesitation, “Sure, doesn’t everyone?” However, the bodybuilding magazines continued the deception that the new, larger physiques were built on powders and supplements. Thus steroid use artificially inflated the already marketable commodities of bodybuilding. Vince Gironda One man who had definition dieting mastered and who never used drugs was the Iron Guru Vince Gironda. Pioneer of a technique involving intense abbreviated training routines rather than long workouts, Gironda began competing in the 1950s and then trained both athletes and movie stars for many decades after. So defined was his physique that he often found himself penalized by judges who seemed confused over his appearance. Says Gironda, “The men who judged physique contests at this time were puzzled by so much muscularity. Quotes from physique magazines stated I didn’t place higher in whatever contest because of too much muscularity. They thought that this type of cut-up physique was slightly repugnant so I lost most muscular titles to smoother men who had that type of definition for that day.” Gironda often stated that nutrition was 85-90 percent of bodybuilding. His alternative to drugs was eggs. Like Blair, he advocated up to 36 eggs a day for 6 to 8 weeks to produce muscle buildup. (He also took, among many other supplements, “orchic tissue tablets,” that is, dried testicles.) He recommended following this “anabolic phase” with a short-term vegetarian diet to “re-alkalize” the body. Similarly, he alternated a low-carbohydrate diet with periods of carbohydrate loading. He was careful to point out the difference between natural and refined carbohydrate foods. He presented research data that strongly indicted refined carbohydrates as the real culprit in much of the century’s degenerative disease. His articles went into surprising detail on the biochemical pathways through which sugar did its damage, pointing out the relation between sugar and atherosclerosis, abnormal increases in height and weight and skeletal anomalies. As for protein, he believed the average American could get along fine with just 45 grams of quality protein a day. However, he insisted that bodybuilders needed over 300 grams daily for several weeks to force the growth process. He believed in quality protein powders and used Blair’s milk-and-egg blend until he came out with his own product. When he used the powders, he blended 1/3 of a cup with a dozen eggs and 12 ounces of raw cream or half & half. He was also big on steak and often ate his meat raw. He also recommended germ oils, amino acids, vitamin and mineral supplements, and hydrochloric acid (HCL).
He recommended mineral-rich sea kelp for its iodine content and dried liver extract for blood building and oxygen capacity boosting. Many bodybuilders used desiccated liver after the early 1950s experiments of Dr. Benjamin Ershoff, who conducted the famous liver study in which rats fed 10 percent desiccated liver swam far longer than controls. Macronutrientland In his early years, Blair recommended a very low-carbohydrate diet. Later he advocated a diet consisting of 1/3 protein, 1/3 fat and 1/3 carbohydrates to build muscle; then he reversed himself and again urged avoidance of carbohydrate foods. But other bodybuilders included high levels of carbs in their diets. For example, teenage sensation Casey Viator, who became the youngest Mr. America ever at age 19, had his own special peanut butter pudding that consisted of 2 pounds of peanut butter, 1 jar of grape jelly and 3 or 4 bananas. The bananas were optional. This was part of a diet that also included 2 dozen eggs and 2 gallons of raw milk per day. Casey recalls his father not shedding too many tears when he finally moved out. A columnist in Strength & Health magazine recommended the following carbohydrate-rich concoction for “getting big” along with a diet that allowed unlimited meat and eggs:
A one-day supply of Hoffman’s Gain Weight formula (based on soy protein)
2 quarts milk
2 cups skim milk powder
2 raw eggs
4 tablespoons peanut butter
½ brick ice cream
1 banana
4 tablespoons malted milk powder
6 tablespoons corn syrup
By the 1960s, bodybuilders had figured out what they had to do to attain specific goals. Getting lean or “ripped” for a contest required stripping the diet of all carbohydrates, including milk and cream. Milk was a favorite for building muscle, but for losing fat, it contained too much carbohydrate and held water under the skin. Ketogenic diets consisting of meat and water were commonly used to prepare for the shows. During the 1950s, two English researchers–Professor Kekwick and Dr. Pawan–claimed to have isolated a fat-mobilizing substance that showed up in the urine along with ketone bodies after 24 hours on a no-carb diet. In spite of considerable scientific debate, the ketogenic diet remained a constant in the field of bodybuilding until the 1980s. Yet it was in the early 70s that the lipid hypothesis began to take hold. The result was a series of diets that emphasized carbohydrates over protein and fats. The pre-game meal of beef was giving way to one of lasagna or spaghetti. The magazines of 1970 mirrored this confusion. For example, in an issue of Strength & Health, publisher Hoffman praises the African Masai tribe for their reverence of whole milk, while in his other publication, Muscular Development, he recommends skim milk because it is lower in saturated fats. (The vast majority of the nation was now drinking pasteurized milk–long-time strength trainer Jim Bryan remembers avoiding raw milk because he was given the impression that it was dangerous.) MuscleMag publisher Bob Kennedy told his readers not to let anyone scare them away from eggs. Frank Zane, Mr. Olympia champion from 1977-79, was still eating the old way with plenty of eggs, lamb, beef, pork, heart, liver, raw milk, protein powder, vegetables, and fruit, with some potato and brown rice, educating his readers on the misconceptions about cholesterol and warning against over-consumption of polyunsaturated vegetable oils.
But in Iron Man, Sterri Larson was telling readers that the diet of the bodybuilder was not necessarily one to produce good health. He believed that eggs were best for both building muscle and losing fat, but that saturated fat and cholesterol could prove hazardous. According to bodybuilder Brian Horton, some of the athletes were now eating chicken and fish instead of beef and eggs.

Steroid Use

Meanwhile, by the end of the 1970s, professional bodybuilders were using a number of metabolism-enhancing substances such as amphetamines, Armour Thyroid, human and animal growth hormone, and multiple steroids (a method referred to as “stacking”). Some of the top pros worked with physicians to monitor their blood parameters as they prepared for their competitions. During the months before an event, these athletes would swallow and inject any substance that would facilitate tremendous muscularity. Very few, if any, bodybuilders could attain such condition without this assistance. Steroid use suffered a setback with the revelation that 1988 Olympic gold medal sprinter Ben Johnson had tested positive for anabolic steroids, which had been banned from the Olympic Games since 1975. In 1990, the federal government added anabolic steroids to Schedule III of the Controlled Substances Act. Since then, any athlete seeking to build muscle via anabolic steroids could just as easily find his next workout conducted in a federal prison gym–and several have, to the dismay of many in the legal, medical and sports arenas. The ban on steroid use was no surprise to the bodybuilding world, since abuse of the drugs, even at the high school level, was well known. Not only was the number of users growing, but so were the dosages and arsenals in professions where size and strength really made the difference. The magazines were not yet labeling heart disease as a side effect of steroid use. However, by 1970 they were starting to mention the fact that a number of strength athletes were succumbing in their prime. Columnist Bob Brown described his concern over losing friends at an early age to heart disease and wrote an article in Iron Man entitled “Will Weight Training Kill You?” Brown compiled death statistics on prominent men of the iron game throughout the century and compared them to mortality statistics supplied by an insurance company. He concluded that even though strength trainers were not immune to early death, they fared better than the average American and stood a much better chance of living a longer life. Others noted the shortened careers of top bodybuilders. The 1967 Mr. America Don Howorth considered a comeback, but stated he knew his body would not do well with what he would have to take at that stage of his life. Even the genetically blessed Casey Viator, who was a serious contender for the Mr. Olympia title, walked away from any further attempts in 1983, knowing that his body had had enough.

New Dietary Trends

In the early 1980s, bodybuilders became interested in the glycemic index of carbohydrate foods. A team of researchers at the University of Toronto, led by Dr. David Jenkins, demonstrated that different foods affected blood glucose levels at different rates. They developed the glycemic index, in which many carbohydrate foods were measured against selected reference foods according to how quickly they raised glucose levels. Many bodybuilders and other athletes used the glycemic index to plan their daily menu and carbohydrate selection.
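For readers curious about the arithmetic behind the index, the standard definition (a general textbook formulation, not a detail taken from the bodybuilding magazines discussed here) compares the blood glucose response to a test food with the response to an equal-carbohydrate portion of a reference food, usually pure glucose or white bread:

\[
\mathrm{GI}_{\text{food}} \;=\; \frac{\text{incremental blood glucose area after 50 g carbohydrate from the test food}}{\text{incremental blood glucose area after 50 g carbohydrate from the reference food}} \times 100
\]

By construction the reference food scores 100; foods commonly described as “low glycemic” score roughly 55 or below.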
With the resurgence of carbs in the diet, along with a well-established reverence for protein, bodybuilders discovered there wasn’t much room left for fat. In fact, by the end of the decade, many found themselves in a competition for who could get their dietary fat the lowest. Some even attempted a theoretical zero-fat diet. But not everyone was taken in. I interviewed bodybuilder Ron Kosloff, who said he didn’t change a thing. “I knew what I saw,” he told me. “My grandparents lived on a farm and ate whole milk, cream, eggs, butter, meat, potatoes and homemade bread. My grandfather often ate 6 eggs a day for years, many of them raw, along with lard sandwiches. He lived to 98 while my grandmother lived to 101. What astounded me most was their farmhand who went by the name of Indian Joe. When I first saw him he looked in his 40s and was incredibly cut and muscular. He looked like Conan. I was shocked when I found out he was well into his 70s. Indian Joe lived to 115 years of age and ate nothing but meat, glands and intestines!” Kosloff had consumed a minimum of 6 eggs daily for the previous 20 years with no ill effects. Ron also noted that bodybuilders like Gironda and Blair were warning him back in the late 60s about the real hazardous fats–hydrogenated oils! Armand Tanny, then in his 60s, was also writing articles contradicting this new trend. All through the 1980s he wrote articles for Joe Weider’s Muscle and Fitness magazine such as Caveman Diet (March 1986), Meat and the Bodybuilder (Dec 1986), Good Nutrition and Sex (June 1987), Streamline Meat (Oct 1987), Uncooked Delicacies (Dec 1986), and Those Beefs About Meat (Oct 1985). In the midst of the cholesterol scare in 1984, Vince Gironda released his book Unleashing The Wild Physique, still recommending 36 eggs a day to produce an anabolic effect. However, he also wrote an article defending carbohydrates and warning of the potential risks of high protein consumption.

Putting Those Carbs to Work

A major trend in the 80s and 90s was the concept of carbohydrate loading, first popularized by Vince Gironda back in the 50s and 60s: “I believe that every 3 to 5 days you need to get a ‘carbohydrate loading meal’ into your body . . . I feel that carbohydrate is necessary every third or fifth day in order to get the glycogen back into the liver.” Also back in the 1960s, cyclists were using a technique of loading their muscles with carbohydrates to give themselves an endurance edge. Bodybuilders were likewise loading their muscles just before a competition to give them a fuller look. By the 1980s, competitive bodybuilders had turned the practice into a science with their knowledge of the hormones vasopressin and aldosterone and how they control the sodium/water balance in the body. The challenge was to stand on stage on competition day with as much body fluid as possible drawn into the muscles along with the carbohydrates, rather than held under the skin. The effect of this technique was so dramatic that hit-or-miss timing could mean the difference between victory and looking terrible by bodybuilding standards. Often bodybuilders would be banging their heads against the wall one to three days after a big show, when all the fluids finally shifted into the right places–too late!
Similar diets followed, including Cyclical Ketogenic Dieting (CKD), variously known as the “Ultimate Diet,” the “High-Fat Diet,” the “Anabolic Diet,” “Bodyopus,” the “Metabolic Diet,” the “Anabolic Solution,” and the “Ultimate Diet 2.0.”

The Supplement Boom

Amino acids in their many forms (peptide-bonded, free-form, branched-chain, L-crystalline) were popular in the 80s, based on the notion that certain isolated amino acids could stimulate the pituitary gland to release growth hormone. Claims that the free-form amino acids arginine and ornithine could help bodybuilders lose fat and gain muscle actually led to a world-wide shortage of arginine and ornithine. I remember contributing to that shortage. Others touted the amino acid lysine as a growth hormone releaser. Lysine is plentiful in milk, which is what bodybuilders used in the days before amino acid supplements. Soy protein powder made a big comeback in the 1990s with enough market hype to force the bodybuilding community to take another look. However, soy has never been accepted as a quality protein by the bodybuilders who knew anything about protein. Blair had dumped it decades earlier for the higher quality protein of milk and eggs. Vince Gironda simply referred to soy as “that s***!” Carbohydrate loading was made easier with drinks like CarboPlex, containing maltodextrin. Other products contained medium-chain triglycerides (MCTs), derived from coconut oil, to provide energy while bypassing the normal fat-assimilating channels in the body. It was almost impossible to keep up with the new ergogenic and anabolic aids promoted in the magazines. They had bizarre names like Gamma Oryzanol, Osterol, Dibencozide and Inosine. A product called Metabolol, containing glucose polymers, MCTs and various ergogenic agents, became popular. Competing products–with names like “Ultimate Orange” and “Hot Stuff”–were promoted with clever and outlandish marketing tactics.

More Anabolic Aids

During the 1980s, the world of competitive bodybuilding could be summed up in one name–Lee Haney. Haney ruled the Mr. Olympia competition from 1984 to 1991. He was followed by Dorian Yates, winner for six straight years, and then Ron Coleman, the reigning Mr. Olympia in 2004. These two men ushered in a big jump in size and hardness. To put the size in perspective, Arnold Schwarzenegger was a huge athlete back in the 70s, competing at 235 pounds at 6 feet 2 inches. In the 2003 Mr. Olympia contest, Ron Coleman stood under 6 feet and weighed 287 pounds–and he was even leaner than Schwarzenegger! Were these men better bodybuilders than Schwarzenegger and Haney? Not necessarily, just more daring chemists. Two very anabolic compounds had muscled their way to prominence in the pro ranks in a much bigger way than ever before: insulin and growth hormone. Bodybuilders were using natural growth hormone from human cadavers and rhesus monkeys back in the 1970s. However, with the introduction of recombinant human growth hormone in 1985, this product became more widely available. Another anabolic compound was creatine monohydrate, a muscle-hydrating substance. Whey protein also came into prominence. Bodybuilders will ingest just about anything in the quest to build muscle–powders, pills, raw meat, blood, glands, and a whole assortment of esoteric concoctions that have been slam-dunked for the sake of the gain. Until the end of the 1980s, athletes sat on two distinct sides of the line–those who took steroids and those who did not.
As Nelson Montana once stated, “Steroids do what all bodybuilders want–they build muscle!” That distinct line became blurred in the 1990s with the fall of the Berlin Wall and the introduction of Eastern Bloc performance-enhancing compounds known as “pro-hormones.” In the mid-1990s, supplements of Androstenedione, Androstenediol, Norandrostenedione, Norandrostenediol and DHEA appeared in the magazines. Originally deemed safe alternatives to steroids, these compounds soon produced the same side effects–male pattern baldness, prostate enlargement, acne, reduced libido, liver and kidney toxicity, and–every bodybuilder’s favorite–gynecomastia (bitch tits). As more side effects revealed themselves, more precursors (pro-hormones) came on the scene to replace their predecessors. Baseball’s Mark McGwire helped the market in a big way. Bodybuilders started stacking these hormones like regular anabolic steroids, along with estrogen blockers, growth hormone enhancers, cortisone inhibitors, stimulants (ephedra), creatine, protein powders and, if there was any cash left, perhaps some vitamins. The recommended diet today is high-carb, high-protein, and low in fat–skim milk, egg whites, protein powders . . . anything but real whole foods. It’s no surprise that early natural bodybuilders, such as LaLanne, Tanny, Gironda and Grimek, enjoyed good longevity in the sport, while the health of today’s muscle stars is a huge question mark. As five-time Mr. Universe Bill Pearl recently remarked: “The guy left standing on the stage today at the end of a bodybuilding show is probably the guy in the arena who is closest to death.” It’s unfortunate that today’s young athletes who have the genetic potential to excel in bodybuilding really have no choice but to go down that pharmaceutical road if they want to achieve top honors at the shows. A friend of mine, longtime gym owner Marty Hodgson, put it to me this way: “We must remember it was in fact drugs that played a significant role in building those comic book characteristics that attracted us to the sport over the past 40 years.
But those very substances that helped make the sport are the same ones that are, without a doubt, destroying it.”

Sidebars

Daily Menu for the Three Saxon Brothers

Breakfast
24 eggs
3 pounds smoked bacon
Porridge with cream and honey
Tea with plenty of sugar

Dinner
10 pounds of meat
Vegetables
Sweet fruit (raw or cooked)
Sweet cakes
Salad
Tea
Sweet puddings
Cocoa and whipped cream

Supper
Cold meat
Smoked fish
Lots of butter and cheese
Beer

Sansone’s Weight Gain Diet

Breakfast
Fresh fruit
Medium serving of whole grain cereal with cream and sugar
2 eggs
2 pieces whole grain toast, buttered
1 glass of milk

Dinner
Steak, lamb, mutton or other meat
1 baked potato with butter
2 pieces whole wheat toast, buttered
1 large leafy green salad
1 large serving of berries or other fruit
1 small piece of plain cake

Supper
1 cup of bouillon or puree
1 medium serving of meat
1 large serving of cooked vegetables
2 pieces whole grain toast, buttered
Pudding or custard
1 glass of milk

Sansone’s Weight Loss Diet

Breakfast
Fresh fruit
2 pieces whole grain toast, buttered
1 egg
1 cup coffee or tea
½ cup hot milk

Dinner
Steak, roast beef, mutton or other meat
1 piece whole grain toast, buttered
1 large serving vegetables
Berries

Supper
1 cup of soup or tomato puree
1 small serving meat or fish
1 large serving vegetables
1 piece whole grain toast, buttered
1 glass milk

Steroid Side Effects

Anabolic steroids are synthetic derivatives or chemically altered versions of the hormone testosterone. Testosterone is the main androgenic or masculinizing hormone in males. In females, it plays a secondary role and occurs at about 1/20th the amount found in adult men. Testosterone has two primary characteristics that concern the athlete pursuing performance enhancement. The first and most sought-after attribute for the sport of bodybuilding is its anabolic effect, the ability to stimulate protein synthesis for muscle, bone and blood building. The second and less desired effect, especially for women, is the androgenic response, the stimulation of secondary male sexual characteristics. Synthetic steroids are designed to enhance the anabolic effects of testosterone while reducing the masculinizing properties. Unfortunately, the more you reduce the androgenic properties, the more you reduce the anabolic effect. Over the years, many different derivatives of the testosterone molecule have made their way through the sports arena. All of these synthetic versions have had varying degrees of androgenic and anabolic potencies. The more androgenic, the more anabolic, and therefore the more effective the drug for building muscle. Anabolic steroids can be taken orally, sublingually or via injection. Oral steroids usually act faster than their oil-based injectable counterparts. Injectable steroids such as Deca-Durabolin have been designed to reduce the androgenic attributes and can stay in the body much longer than oral steroids such as Dianabol. Dianabol travels quickly to the liver, where it is broken down to a large degree. This type of steroid places more stress on the liver. The side effects of steroids can vary depending on gender and individual physiological characteristics. Age, dosage and duration of time on steroids also affect the degree of adverse reactions. Some of the side effects are also surrounded by controversy. For example, much of the media attention to serious liver disease from steroid use comes from patients with preexisting illnesses under long-term treatment with steroid medication.
Nevertheless, steroid-using athletes need to have their liver function monitored by health practitioners, as the liver is definitely stressed by the practice. One common side effect of steroids is water retention, leading to elevated blood pressure in some athletes; another is kidney damage. The most feared reaction among male bodybuilders is the paradoxical feminizing effect known as gynecomastia, which involves the enlargement of the tissue around the nipples. For females, it is the masculinizing effects that do the most damage. Male pattern baldness, facial and body hair, deepening of the voice, and clitoral enlargement are all potential threats to the female taking androgenic steroids. Stimulation of the sebaceous glands may lead to acne in both male and female athletes. Behavioral changes are also tied to steroid use. Almost everyone has heard of “roid rage.” Steroid use does not typically turn a mild-mannered individual into a madman as the media would have us believe, but anabolic steroids can increase aggression to some degree. If you are already an S.O.B., then steroid use may make you a bigger S.O.B. Psychological dependency also occurs, mainly because some athletes cannot deal with the loss of muscle, strength and desired appearance when withdrawing from steroids. Other possible side effects that may occur during the use of anabolic and androgenic steroids include prolonged bleeding time, headaches, nausea, feeling poorly, increased risk of injury, abscesses resulting from injection, anaphylactic shock (a life-threatening reaction) and early death from heart disease.

Vince Gironda’s “Hormone Precursor Diet” for Muscle Build-Up

Gironda recommended this diet for four to six weeks, followed by a mostly vegetarian “alkalinizing” diet.

Breakfast
Vince’s special protein drink made of 12 oz half and half, 12 raw eggs, 1/3 cup milk-and-egg protein powder, 1 banana.
(Make one to three mixtures of this formula and drink throughout the day, between meals, and before retiring)

Supplements
1 multi-vitamin tablet
3 vitamin A and D tablets or 3 halibut oil capsules
1 vitamin B complex tablet
1 vitamin B-15 tablet
1 vitamin C complex (300 mg)
2 vitamin E capsules (800 iu)
1 zinc tablet
1 chelated mineral tablet
5 alfalfa tablets
10 kelp tablets
3 tri-germ and wheat germ oil capsules
1 RNA/DNA tablet
3 lysine tablets (400 mg)
1 hydrochloric acid tablet (before meal)
3 digestive enzyme tablets (after meal)
3 multi-glandular tablets (nucleo glan male or female)

Lunch
1 pound hamburger or other meat
Mixed green salad or raw vegetables

Supplements
1 iron tablet
4 calcium tablets
Repeat of breakfast vitamins, omitting vitamin E, tri-germ, wheat germ and halibut oil

Dinner
1 to 2 pound steak or roast meat
Raw or steamed vegetables or salad and cottage cheese

Supplements
Same as lunch

Special Supplements
10 amino acid and desiccated liver tablets (every 3 hours)
5 yeast tablets with the protein drink
4 raw orchic tissue tablets (before and after workouts)
6 each of the following before retiring: arginine, ornithine, tryptophan, calcium tablets

High-Carb Diet for Bodybuilders

Typical of the new carb-rich diets was the 1979 diet of Clarence Bass, known for his “ripped” appearance:

Breakfast
2 eggs
1 piece toast
Cereal consisting of:
2 tablespoons wheat germ
5 tablespoons bran
1 tablespoon sunflower seeds
1 tablespoon raisins
1 cup whole raw milk

Lunch
Peanut butter sandwich on whole grain bread
1 cup yogurt from whole raw milk
1 apple or pear

Supper
2 poached eggs
1 piece dry toast
Huge salad

Evening Meal
1 cup whole raw milk mixed with 1 cup water
1 tablespoon Fyblend fiber
1/2 grain saccharin
1/2 teaspoon decaffeinated coffee

Big Ron’s Confusing Nutrition Advice

Nowhere is confusion about what constitutes a healthy diet more evident than on the website of current bodybuilding champion Ron Coleman (bigroncoleman.com). His contradictory and watered-down nutritional advice:
1. Eat, eat and eat some more.
2. To add strength and mass, try to consume four to six meals a day. Choose from a variety of food groups at mealtime. Try to include lots of potatoes, rice, pasta, fruits and vegetables.
3. Make sure you are eating enough. A low-fat diet and avoiding refined foods are good, but they won’t help you build mass. On the same note, you don’t want to eat a high-fat diet all the time. Fat provides additional calories, the fat-soluble vitamins A, D, E and K, and raw materials for important hormones that stimulate muscle growth.
4. Monitor the amount of mass you are gaining. Measure your body parts and weigh yourself every week to see if you are going in the right direction.
5. Lastly, continue to train hard. And remember, gaining mass won’t happen overnight.

Copyright: This article is excerpted from Randy Roach’s book Muscle, Smoke & Mirrors, available at prfit.com. This article appeared in Wise Traditions in Food, Farming and the Healing Arts, the quarterly magazine of the Weston A. Price Foundation, Fall 2004. Randy Roach, info130@westonaprice.org

Bodybuilder and trainer Randy Roach has followed most of the bodybuilding diet trends over the past 30 years, including methods not so embraced in bodybuilding circles, such as complete vegan vegetarianism. During his protein-drink phase he ate egg whites and discarded the yolks. He has discovered that too many carbohydrates give him all sorts of problems. Over the past 3 years he has migrated to a total raw diet.
This includes raw meat, dairy, eggs (especially the yolks), honey, green juices, and some fruits with their seeds. Food for a typical day includes 1/4-1/2 pound raw chicken, 1/2 pound raw beef, 1/4 pound raw liver, 16-32 ounces of raw milk, 2-3 ounces raw cream, 6-8 tablespoons raw honey, 32 ounces raw green juice (celery, parsley, lemon, zucchini, honey, beets) and occasional fruit.