How's this for a mind-blower:
Schizophrenia, gluten, and low-carbohydrate, ketogenic diets: a case report and review of the literature
Synopsis: a 70-year-old schizophrenic experienced complete remission of symptoms after adopting a low-carbohydrate diet. Now, of course, this is just one case study, and needs to be replicated a LOT more times. But it really caught my eye, as I've been thinking since my last post about the mental effects of diet, particularly grains. Schizophrenia is one extreme case of neurological disturbance, but as with all things biological, disease expression is rarely binary. The manifestation of symptoms covers a spectrum when viewed across the population; we just tend to pay the most attention to the extreme cases. Suppose grains were implicated as causal in schizophrenia. It's a good bet they then contribute to other, less obvious forms of mental disturbance. Since grains are so widely consumed, this may actually be viewed as the "norm".
Indeed, other neurological conditions are known to benefit from the removal of dietary grains, including pediatric epilepsy (discussed in the paper), ADHD, autism, and multiple sclerosis. I've been doing some poking around on wheat germ agglutinin and the brain. It turns out WGA does indeed cross the blood-brain barrier, binds to insulin receptors in the brain, and probably does all kinds of other stuff as well. Google "WGA brain" and you'll find that WGA is actually used extensively to map out neuronal pathways, so it clearly has potential neurological effects beyond just binding to insulin receptors.
I propose that the sequel to "Fat Head" be "Wheat Head". That covers a lot of ground, from Weston A. Price's observations on cranial development (there definitely seem to be distinct "Wheat Head" phenotypes), to dental disease, to the neurological implications, and probably more.
Wouldn't it be fun if the food pyramid were making us fat, sick, deformed, and crazy all at once?
Wednesday, February 25, 2009
The Children of the Wheat
Hello everyone out there in blog world. Let me start by apologizing for my prolonged absence. The recent lack of posting is partly a result of being busy with other aspects of life, not the least of which was searching for a new job. I feel very fortunate to have found an opportunity given the current economic situation; even more fortunate that I will be helping to build the next generation of genetic sequencing machines. So I get a job not only on the cutting edge of science, but one that hopefully contributes to the health of our society. I expect to be pretty busy with the new gig, so I don't know that I'll be able to post much in the coming months either.
The other reason for the lack of posts is that I haven't really had much new to say. The one thing I'd like to get to is the rest of the energy regulation series, particularly some info about innate and learned food preferences, including why carbohydrates may be addictive (short answer: insulin tweaks an area of the brain called the insula, which is also lit up by drugs such as cocaine and opiates). I've started a few posts and abandoned them, mainly because they seemed to be covering the same ground. So let me throw out some of these random thoughts here, rambling in no particular order, sort of a brain dump before I disappear again.
First, I want to recommend a couple of DVDs. The first is Tom Naughton's Fat Head, which is both funny and educational. Fat Head provides a gentle and highly accessible introduction to some of the topics covered in Gary Taubes' Good Calories, Bad Calories. Even my kids (8 and 4) "got it", though I suppose it didn't hurt that I've been pumping them full of the background info for a few years :-). Watch for the moment when Tom's doctor sees the measurable effects of eating an all fast-food low-carb diet. The expression on his face is absolutely priceless (and to his credit, he didn't just blow it off like many in mainstream medicine would). Also watch the bonus interview footage, great stuff. I particularly liked a quote from Dr. Al Sears, something to the effect of "If you're not dead, you can still heal." Most doctors today seem resigned to mitigating the effects of metabolic syndrome through medication, rather than actually healing. I believe they're generally well-intentioned, just misinformed. But experience has shown that, given the opportunity, the body has an amazing ability to heal itself, IF you can remove or at least mitigate the factors reinforcing the underlying disease process. More on this later. Jimmy Moore has a great interview with Tom Naughton as well.
At one point in the bonus interviews, Dr. Sears discusses how rabbits came to be used as a model for heart disease. Apparently researchers started out by feeding dogs large quantities of lard, but the dogs would not develop atherosclerosis. Of course that should have been obvious from the outset: saturated fat and cholesterol-containing animal fat are a cornerstone of the evolutionary diet of canines. Since the researchers didn't get their preconceptions validated using dogs, they switched to rabbits, whose natural diet is grass, and who thus never evolved any mechanisms for handling large dietary quantities of either fat or cholesterol. Not surprisingly, the poor bunnies' metabolism went berserk, and the researchers extrapolated this result to humans. Talk about confirmation bias.
Here's a related personal anecdote. Our dog Picasso is about 12 years old now. He was getting pretty porky and arthritic, and had also begun drinking a ridiculous amount of water, so I suspected he was developing doggy diabetes. I was going to take him to the vet, but first checked the ingredients on his "healthy" doctor-recommended food. First ingredient: corn starch. I felt like a dope for not checking that earlier, and switched him to a diet of raw food, mainly patties made from ground-up whole chickens, supplemented with leftover bacon grease (we eat a lot of bacon here) and raw organ meats like heart, liver, kidneys, and tripe. The water-drinking issue disappeared almost immediately. Over time, Picasso has really trimmed up, and looks like a young dog now, with a nice shiny coat. He's become a lot more friendly and playful as well. People are always surprised to learn he's nearly 12. Hint hint: the evolutionary diet of humans is much closer to that of canines than bunny wabbits.
The other DVD is "My Big Fat Diet", which provides several examples of the healing powers of the human body. This documentary follows Dr. Jay Wortman as he treats metabolic syndrome in a group of Canadian Namgis First Nation people via a low-carb diet. The results: not only did they lose fat, they also reduced or eliminated many of the other symptoms of metabolic syndrome, along with the associated medications. Even more striking was how the Namgis' sense of community and family returned as their bodies healed. Every time I watch My Big Fat Diet, I wonder how many of our various societal ills are fueled by poor health resulting from bad nutrition. The Western diet promotes a situation where the body perceives itself to be in constant crisis: insulin resistance essentially implies starvation at the cellular level, high blood sugar and dietary polyunsaturated fat contribute to glycative/oxidative stress, and hyperinsulinemia probably leads to chronically high levels of stress hormones like cortisol. Is it any wonder we find our society populated by individuals focused on immediate benefits to themselves rather than the much greater long-term benefits of contributing to societal well-being?
Another fun fact from My Big Fat Diet is how the Namgis used fat rendered from the tiny Oolichan fish to supply fat-soluble vitamins, particularly Vitamin D in the winter. The Namgis traditionally made the association between the yellow color of the Oolichan grease and sunshine, which I thought was pretty insightful. I started a post on Vitamin D, but there's so much info out there already that I decided I had little to add. Vitamin D deficiency is quickly making its way into the mainstream medical consciousness as well, which is outstanding. The hormonal form of Vitamin D activates over 1000 genes (something I hope to learn more about in my new job), so it probably should not be surprising that Vitamin D deficiency can lead to a broad spectrum of health problems, particularly those like multiple sclerosis which are known to be influenced by genetic risk factors.
And it's very interesting to think about diseases like influenza, traditionally thought of as being primarily infectious and requiring immunization. But the influenza virus has certainly been around as long as humans, and it's hard to fathom how humanity could have survived if we were all getting knocked flat by the flu once a year. A whole tribe of hunter-gatherers on their backs with the flu seems like prime cave bear food. And the flu doesn't behave like a classic infectious disease in the way the common cold does. For instance, Google has a cool new resource estimating flu activity in the US based on search queries. I've been watching this thing all winter, and it definitely does not show any sort of epidemic pattern. You'd expect flu hotspots to spread geographically over time, but instead the map is pretty much random. This non-infectious pattern for influenza is generally observed: it pops up simultaneously in geographically separated locations rather than spreading.
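For what it's worth, that "spreads vs. pops up everywhere at once" distinction is testable in principle. Here's a minimal sketch of the idea in Python, using invented onset weeks for a handful of neighboring states (purely hypothetical numbers, not actual Google Flu Trends data): if flu spreads contagiously, neighboring states should show similar onset times; if it emerges simultaneously, onset should be roughly unrelated to geography.

```python
# Sketch: does flu onset look like geographic spread or simultaneous emergence?
# The onset weeks and neighbor pairs below are invented for illustration only;
# a real test would pull regional activity data (e.g. from Google Flu Trends).
from itertools import combinations

onset_week = {  # hypothetical week of flu-season onset per state
    "WA": 48, "OR": 50, "ID": 44, "CA": 51, "NV": 45, "AZ": 49,
}

# Adjacent-state pairs, each written in alphabetical order to match combinations()
neighbors = {("ID", "WA"), ("OR", "WA"), ("ID", "OR"), ("CA", "OR"),
             ("ID", "NV"), ("NV", "OR"), ("CA", "NV"), ("AZ", "NV"), ("AZ", "CA")}

def mean_onset_gap(pairs):
    """Average difference in onset week across the given state pairs."""
    gaps = [abs(onset_week[a] - onset_week[b]) for a, b in pairs]
    return sum(gaps) / len(gaps)

all_pairs = set(combinations(sorted(onset_week), 2))
non_neighbors = all_pairs - neighbors

# Contagious spread predicts neighbors peak closer together in time than
# distant pairs; simultaneous emergence predicts similar gaps for both.
print(f"mean onset gap, neighbors:     {mean_onset_gap(neighbors):.1f} weeks")
print(f"mean onset gap, non-neighbors: {mean_onset_gap(non_neighbors):.1f} weeks")
```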
A good hypothesis is that humans generally carry a given flu virus all year (and I wonder if anybody has bothered to test for influenza antibodies in the summer). We carry all kinds of viruses all the time; they're just suppressed by our immune systems. But if the immune system becomes weakened, say due to Vitamin D deficiency brought about by lack of sunlight exposure in the winter, the virus can take hold and make you sick. Further, it is known that the majority of symptoms from both the flu and the common cold are basically due to your own immune reaction, not the virus itself. The innate immune system uses cells like neutrophils, whose job is to seek out and destroy potentially infectious agents like viruses and bacteria using both physical and chemical means. But once ramped up, these hunter-killer cells will also destroy your own tissue, and so need to be moderated. If not controlled, your immune system will kill you, which is precisely what happened to victims of the Spanish flu epidemic (Epidemiol Infect 2006;134:1129–1140). What is the primary mechanism for moderating this immune response? Vitamin D.
Another personal anecdote: winter used to be medically difficult for our family, as it is for most people. We'd have our two kids at the doctor at least once a month for ear/sinus infections, strep throat, etc., and we adults would usually drag around some kind of virus for a few weeks. But that's just part of life, right? Wrong. Since we began supplementing with Vitamin D in the winter (about two years ago), zero doctor visits. I don't think the kids have even had a fever in this time, certainly not one high enough to cause any concern. We do get sick, but it's minor, never more than an annoyance, and short-lived. While the other kids at school are dropping left and right from strep throat and flu, our kids now sail through pretty much unscathed. I've seen it multiple times with friends and family as well. They have some drawn-out respiratory disease, like a persistent cough. When we finally get them on the Vitamin D train, it's gone, never to return. I also used to get bad hay fever attacks - no more. Yes, it's anecdotal, but this is what you'd expect from the well-established interaction between Vitamin D and the immune system.
So it's good to see the mainstream picking up on this. They seem to be "getting it" on other fronts as well, albeit slowly. MSNBC recently had a long article on omega-3s, which also discussed the general problems inherent in vegetable oils processed from seeds and soybeans. It's frustrating, because they get some of it "right" (in the sense that their conclusions follow logically from the evidence they consider), yet are still hung up on issues like dietary cholesterol and almost completely miss certain living elephants, as it were. I posted a longish comment on the article, but my essential criticism is one I've voiced before: if you start with bad or incomplete prior information, even the most rigorously logical analysis will lead to goofy conclusions, which in turn drive goofy choices about diet, supplements, medical treatments, etc. Read the article, and watch how a few key assumptions lead to all kinds of wild inferences and extrapolations.
Michael Pollan's book "In Defense of Food" is another example of this. Though Pollan's "The Omnivore's Dilemma" remains one of my favorites, I had avoided reading "In Defense of Food", mainly because it espouses the mantra "Eat food. Not too much. Mostly plants." I'm all for eating food vs. factory-produced foodlike substances, but the last two statements smack of dogma. But our local library decided to feature this book, so I thought I'd better read it in the interest of causing trouble.
Pollan gets some of it right (again, in terms of drawing logical inferences from the available evidence), but frustratingly misses the big stuff, like the total lack of evidence that dietary fiber contributes to health, or that red meat consumption causes disease. For instance, he discusses Good Calories, Bad Calories, but cherry-picks the evidence that apparently supports his preconceptions and ignores the rest. He rails against reductionism, apparently following in the footsteps of T. Colin Campbell (just about the worst possible choice), missing the point that studying isolated aspects of nutrition and metabolism is meant to inform the "big picture". While a complex system like the human body may be greater than the sum of its parts, you certainly have no hope of understanding the whole without at least understanding the parts.
Pollan then gives 24 rules we should follow when selecting and eating food. This is a great example of how starting from goofy assumptions leads to over-complication. Who is going to walk around a store or look at a menu and mentally check off 24 different things? There's no way the simple act of eating should require that level of mental effort; it certainly didn't for our hunter-gatherer ancestors. Some of his recommendations are good, like shopping at your farmers market or eating wild foods, but a lot of it is just nonsense. For instance, he wants us to "Eat slowly". What other animal consciously regulates its rate of food intake? Do lions devour their prey at a measured pace, and teach their young to do the same? Like most food writers, Pollan is enthused about a plant-based diet, all hopped up on the idea that we require thousands of different phytonutrients for health. The evidence for this? "In all my interviews with nutrition experts, the benefits of a plant-based diet provided the only point of consensus." Really - after extolling the evils of "nutritionism" for half the book, now you're going to follow the consensus of nutrition experts? Aren't these the same boobs who developed ideas like the food pyramid? Take a walk around your local mall, and you can see how well that's played out in the context of human health. Remember also that scientific consensus is rarely the result of critical examination of the evidence. Rather, it's a social phenomenon: the scientists keep repeating the same thing to each other, and eventually it becomes "truth". If you're lucky, somebody with half a brain gets the ball rolling by actually making a logical inference from evidence, but the usual situation is that of the diet-heart hypothesis, where an initial unproved hypothesis turned into scientific "fact" simply through social forces.
Pollan further wants us to focus on eating leaves. This recommendation certainly makes sense for a cow or a gorilla, both of which have vast digestive tracts built for the long process of breaking down cellulose and freeing the nutrition in leaves. Both animals also eat enormous quantities of leaves to meet caloric requirements, e.g. upwards of 60 lbs. daily for a gorilla. Of course humans can predigest leaves through cooking, and we can add fat to help assimilation of the micronutrients, acids to neutralize antinutrients like phytic acid, etc. But those are recent developments in the course of evolution. Humans almost certainly did not evolve getting any significant calories from raw leaves. Our digestive systems just can't handle it, and you would need to eat many pounds of leaves daily to have any hope of meeting caloric requirements. It seems far more likely that humans gravitated toward more calorically dense plant foods, like fruit, nuts, and starchy root vegetables. Of course the most nutritionally dense foods are of animal origin, including seafood, organs, and eggs.
Pollan's first rule is the worst: "Don't eat anything your great grandmother wouldn't recognize as food." First of all, I don't have the faintest idea what my great-grandmother would or would not consider food, and since she's long since passed on, I can't ask her. But given that margarine popped up over 100 years ago, I would guess it would have passed as food for her. Does that make it okay to eat margarine? And how about bread? Great grandma probably considered bread food, but bread is the end result of extensively processing wheat, which in its raw state would be toxic. So is bread "food", and should I eat bread? Evidence seems to be mounting that the answer is "no". So this guideline is pointless.
The mental processing required to figure out what to eat should be low. Humans evolved as omnivores, which means we can eat a very wide variety of foods. Any minimally processed food that doesn't taste bad or make you immediately ill is almost certainly going to be healthy (I'm trying to think of a counterexample, with no luck). If your food requires a lot of technology and industrial processing to render it edible, you might want to rethink whether it actually qualifies as "food". Dr. Sears makes this point during his bonus interview on the Fat Head DVD, noting that had our ancestors tried raw grains or soybeans, they would have found them distasteful and contracted a painful bellyache, not exactly a stimulus to eat more of those foods. Unlike fruit, which evolved to be desirable to eat in order to spread seeds, grains and legumes do not "want" their seeds to be eaten, as that of course prevents those seeds from ever growing into plants. So these plants developed a variety of chemical and physical defenses to discourage predation. Some animals, notably birds, responded with adaptations which allow them to flourish on grains (birds have a gizzard to grind up the grains, and enzymes to block anti-nutrients like protease inhibitors). Mammals (and humans in particular) generally lack these adaptations. Note, for example, the effort required to prevent grain-fed cattle from dropping dead inconveniently early.
The wonders of science allow us to turn wheat and soybeans into foodlike substances that at least won't immediately put you in the hospital, and of course these pseudo-foods are made more attractive by formulating them to appeal to innate flavor and texture triggers like fattiness, saltiness, and sweetness. But in our industrialized food environment, we can no longer rely on our senses to distinguish what is good to eat. By contrast, we see "primitive" peoples able to thrive on a wide variety of foods obtainable from their environment, from the flesh- and fat-laden diet of the Inuit to the largely carbohydrate-based diet of the Kitavans (supplemented with seafood for certain minerals and fat-soluble vitamins). The common thread is that they pretty much directly eat what they obtain from their environment, and any intervening processing is usually minimal, aimed either at making nutrients more accessible (cooking) or at preservation (drying/smoking). They don't need 24 rules to figure out what to eat, and neither should we.
Returning to the point about whether or not bread is "food": the role of grains in the modern diet deserves examination. Let me start by putting some context around this. It is, I think, increasingly clear that our "diseases of civilization" are strongly rooted in metabolic disturbances caused by food. Volek and Feinman have made a very strong argument that "metabolic syndrome" can be defined by the response of an individual to dietary carbohydrate, and that the cure is removal of such from the diet. This hypothesis is supported by many scientific studies, by both "Fat Head" and "My Big Fat Diet", and by the personal anecdotes of many thousands (including myself, having lost 100 lbs. and restored many aspects of health on a low-carb diet). But the cause of a disease is not necessarily the inverse of the cure; i.e. just eating too many carbohydrates doesn't necessarily cause metabolic syndrome. The traditional diets of the Kitavans and Tarahumara are carbohydrate-based, but neither group develops metabolic syndrome. So I'd venture that it's the type of carbohydrate that drives the development of metabolic syndrome. Once you've broken your metabolism, any significant quantity of dietary carbs will cause you problems, but what got you to that broken state? By analogy, your gasoline-powered car can run a long time on gasoline with no significant issues. Put diesel in the tank, though, and you've really screwed things up, and you can no longer use gasoline as fuel until you've fixed the damage done by the diesel.
Now I'll be the first to admit that this question is sort of academic. After all, if you just ate low-carb across the board, you'd avoid any subclass of carbohydrate food that could contribute to chronic metabolic problems. But the reality is that our food environment is flooded with refined-carbohydrate-rich (pseudo)foods. They are deeply ingrained in our culture, pushed on us from every direction. And there's also the issue of what to eat if your metabolism is not broken. I think about this particularly in terms of what to feed my children. Is it better to let the kids eat a slice of pizza or french fries? Chocolate-covered pretzels or a lollipop? You can't monitor their food intake 24/7, and you know they'll be offered crap from every direction when you're not around, so how can you teach them to make reasonable choices, e.g. choosing bad over worse?
I don't think the scientific evidence is really there to provide a definitive answer. But we can make some reasonable guesses based on what is known about various aspects of metabolism and physiology and how these may hypothetically respond to various inputs. I'm increasingly of the opinion that, in the spectrum of carbohydrates, wheat is particularly bad. Stephan at the Whole Health Source blog has many articles discussing potential antinutrients in grains, definitely worth reading. One interesting aspect of grains is their lectins, proteins with the ability to bind to cellular receptors. Lectins in food are often not broken down to amino acids in the gut, and thus can cause mischief in the digestive system and, if they cross over into circulation (which they do), in the rest of the body as well.
Wheat germ agglutinin (WGA) has been studied in the test tube. If you Google "wga insulin receptor" you'll find a lot of papers, probably because WGA's ability to bind insulin receptors makes it a handy tool for studying various aspects of receptor chemistry. From a physiological standpoint, WGA at least has the potential to be troublesome. For instance, not only does WGA bind to insulin receptors, it sticks there. When an insulin molecule binds to an insulin receptor, the whole complex is absorbed by the cell, so one insulin molecule generates a specific and discrete response in a cell. When WGA binds to an insulin receptor, the complex is not absorbed; it just sits there activating at least part of the insulin signaling chain until it is knocked off (certain sugars accomplish this, like N-acetyl glucosamine). That's potentially nasty. In test-tube experiments, at least, WGA is just as effective as insulin at stimulating glucose transport and blocking lipolysis in fat cells. Stephan also notes that WGA has the potential to block leptin receptors, and leptin resistance is one of the hallmarks of metabolic syndrome.
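To see why that "stickiness" might matter, here's a toy calculation in Python with completely made-up rate constants - a cartoon, not a calibrated kinetic model. The only assumed difference between the two ligands is how quickly an occupied receptor drops out of the signaling pool: fast for an internalized insulin/receptor complex, slow for a stuck WGA/receptor complex.

```python
# Cartoon comparison of cumulative receptor signaling for a ligand that is
# internalized soon after binding (insulin-like) vs. one that stays bound
# and keeps signaling (WGA-like). All rates are invented for illustration.

def total_signal(off_rate, n_receptors=100.0, steps=1000, dt=0.1):
    """Integrate signaling from receptors that each stop signaling
    (via internalization or dissociation) at `off_rate` per unit time."""
    bound = n_receptors
    signal = 0.0
    for _ in range(steps):
        signal += bound * dt            # occupied receptors signal while bound
        bound -= bound * off_rate * dt  # some drop out of the signaling pool
    return signal

insulin_like = total_signal(off_rate=0.5)   # complex absorbed quickly: brief pulse
wga_like = total_signal(off_rate=0.01)      # sticky: signaling persists

print(f"insulin-like cumulative signal: {insulin_like:.0f}")
print(f"WGA-like cumulative signal:     {wga_like:.0f}")
# With these assumed rates, the sticky ligand delivers roughly 30x the
# cumulative signal from the same initial number of occupied receptors.
```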
It is, of course, treacherous to infer effects of grain lectins on a whole organism from these test-tube experiments, and I don't know of any research that has examined such effects in detail. But certain anecdotal evidence is at least consistent with the idea that wheat may play a special role in causing metabolic problems. Take the recent much-ballyhooed study of obesity in China. If you look at the data, you'll note that while there is a trend for the more obese subjects to eat more carbohydrates than the less obese, the differences are fairly small, on the order of 10% or less. By contrast, the most obese quartile of men ate 5x more wheat flour than the least obese quartile. Hmmm.
The movie My Big Fat Diet provided another interesting bit of evidence. There's a big festival held by the Namgis each year. The chief (whose name escapes me), who had Type II diabetes and heart disease, had been on a low-carb diet for some time, which had allowed him to control his blood sugar entirely without medication. At the festival he had one piece of "traditional" bannock, a deep-fried bread made from white wheat flour. His blood sugar soared, and did not return to normal for a week. Now I very much doubt that such elevated blood sugar was simply the result of the carbohydrate in the bread. Even given his impaired carbohydrate metabolism, that should have been cleared in the course of a day or so. But what if WGA from the flour was instead binding to insulin receptors and sticking there, causing his liver to crank out more sugar? Complete speculation, but interesting to think about.
Finally, I can relate my own experience. After going low-carb, I generally found that even moderate "cheating" would lead to some obvious lingering effects, most notably that I'd break out in painful acne. But one night we were at our favorite local restaurant. They have a wonderful scallop dish with a fantastic sauce, and the whole thing sits on top of thinly sliced red potatoes. I decided to go ahead and eat the potatoes, mainly just to get at the sauce without the social unacceptability of drinking it straight from the bowl. Interestingly, though this represented a lot of carbohydrate for me, it had no noticeable effect. I've since reproduced this with potatoes a number of times; they don't seem to freak out my metabolism when eaten in moderation. On the flip side, even a small amount of bread will reliably trigger a breakout that lasts a week or more. A more dramatic example occurs with beer: if I drink even two or three premium beers now, I get quite ill, and then suffer a week or so of acne. It's not the alcohol, because an equivalent amount of red wine or vodka has little effect beyond the usual buzz.
One final crumb for thought: why have we as a society developed such a love for wheat? There are plenty of other places you can load up on starch, such as potatoes, corn, and rice. But we seem to have a special soft spot for wheat-based foods, and cultures seem quite willing to displace other forms of carbohydrate with wheat (e.g. the Chinese, as discussed above). Remember that near the beginning of the post I noted that insulin stimulates the area of the brain known as the insula. One function of the insula is to mediate food-related rewards: eat some nice sweet fruit, get a bit of an insulin spike, and the insula reinforces that behavior. That makes sense from an evolutionary standpoint: fruit is nutrient dense but doesn't last for long, so it's a good idea to load up on it while it's around. Certain addictive drugs like cocaine and opiates light up the same area, just with much greater intensity. Suppose WGA were able to get into the insula and bind to insulin receptors there? That would probably reinforce our desire for wheat, even more so if WGA shows the "stickiness" observed in the test tube. Totally unproven hypothesis, but it seems like one worth testing.
Well, I think I've pretty much emptied my head at this point. Let me just close with a quote from Sir William Drummond:
"He who will not reason is a bigot; he who cannot is a fool; and he who dares not is a slave."

I feel like this sums up the situation we currently face as a society. Too often, we rely on "experts" to think for us. In other words, we tend to be "slaves" (I personally believe that most people are not "fools": they have the capacity to think, they just don't bother). Yet these "experts" are far too often "bigots", driven by personal goals and desires rather than reason. Scientists generally don't engage in what constitutes actual science, in the sense of testing a hypothesis and objectively evaluating belief in that hypothesis, because there's no money in that. Instead they concentrate on confirming the beliefs of those who hold the purse-strings, and that makes them "bigots" in the sense described by Drummond.
So don't be a slave. All it takes to break free from the bonds the bigots would impose is to start using your own brain, and this intellectual freedom (and only this) will allow you to make choices which maximize your own health, wealth, and well-being.
Tuesday, February 24, 2009
Chili Verde Recipe
I had some friends ask me for this recipe, so I thought this would be a good place to share it. This is adapted from Dana Carpender's "Chicken Chili Verde" recipe, which can be found in her excellent cookbook "15 Minute Low-Carb Recipes".
- 6 lbs. pork shoulder, cut in 1/2" pieces (country/picnic style "boneless ribs" are a good choice)
- 2-3 medium yellow onions, diced
- 3 12-oz. bottles salsa verde
- 3 bay leaves
- 3 tsp. cumin
- 3 tbsp. minced garlic
- Fresh ground pepper to taste
- Sour cream or creme fraiche
- Shredded cheese
- Brown the pork over medium-high heat. You'll probably need to do this in batches; if you put too much pork in the pan at once, it just steams and turns gray. A good heavy cast iron or stainless pan is essential for this - a thin aluminum pan doesn't hold enough heat. Add the pork to the crockpot.
- Brown the onions in the same pan as the pork. Be sure to scrape up as much of the browned goodness from the bottom of the pan as you can. Add the onions to the crockpot.
- Add spices and salsa to crockpot. Stir together if you are so inclined.
- Cook for 6-8 hours on low. Serve hot topped with sour cream (I like creme fraiche better - more buttery) and shredded cheese.