tag:blogger.com,1999:blog-77210985683906365532024-03-13T12:48:10.898-07:00The Spark of Reason"In Science the authority embodied in the opinion of thousands is not worth a spark of reason in one man." - Galileo GalileiDavehttp://www.blogger.com/profile/18290594860469294453noreply@blogger.comBlogger51125tag:blogger.com,1999:blog-7721098568390636553.post-14293498085996641052011-08-12T09:16:00.000-07:002011-08-12T10:08:50.504-07:00Comment on Guyenet vs. Taubes; or Why I Don't Give a Crap What the Kitavans Eat<p>This post started as a comment to <a href="http://wholehealthsource.blogspot.com/2011/08/carbohydrate-hypothesis-of-obesity.html">Stephan Guyenet's excellent post on the carbohydrate hypothesis of obesity</a>, got too long, and so I'm putting it here. Do read Stephan's post, and keep an open mind. It's got loads of interesting and cutting-edge science, and this sort of debate and information exchange is how science progresses. If you find yourself experiencing cognitive dissonance, remember that absolute belief is antithetical to science. We always must update our beliefs as new information emerges.
<br /></p>Short summary of Stephan's blog post: the hypothesis that carbohydrates in general are fattening is probably over-simplified and does not reflect the most recent scientific understanding of metabolic regulation. It also leads one into a variety of paradoxes, à la the "French Paradox" of the diet-heart hypothesis.
<br /><p>I think part of what we're seeing here is the rather poor taxonomy of nutrition. We discuss things in terms of macronutrients, but those macronutrients come with (or without) all kinds of other metabolically relevant substances. And even within a given macronutrient group there can be significant metabolic differences, e.g. for fatty acids of different chain lengths, or between glucose and fructose (though Dr. Feinman might have something to say about the latter).
<br /></p>I've posted here before on my favorite example of this, and it seems like a good time to revisit it (working from memory and about 4 hours of sleep, so please correct me if necessary). The Aztecs had a corn-based diet. They did experience obesity, but, despite documenting a wide variety of health issues in detail, never described diabetes. The Egyptians ate a wheat-based diet, also experienced obesity (along with heart disease, cancer, and the whole host of other fun "diseases of civilization"), and did document diabetes. Two high-carb diets, both resulting in some level of obesity, but, from what we can tell thousands of years later, with radically different metabolic endpoints.
<br /><p>Two take-home points here. First is that we likely need to consider a broader dietary context than that imposed by our artificial macronutrient classification scheme, i.e. wheat and corn both provide primarily carbohydrates as energy, but probably do not have the same metabolic effects, particularly when considered over the timescale of a human life. Second, obesity is a symptom. A given symptom may result from multiple underlying conditions. We need to focus the discussion on more specific pathologies than just "obesity".
<br /></p>In the US and many other Westernized countries, one can take a look around and do a "liver check". How many people do you see with a protruding pot belly as opposed to a general body-wide distribution of fat? Most people I see have the big belly, sometimes even being very lean elsewhere (particularly in the arms and legs); there are a few "Rubenesque" figures as well, but the pot bellies seem to be running away with the obesity stakes. The big belly is indicative of fatty liver. Considering how central the liver is in metabolic regulation, it should come as no surprise that an inflamed fatty liver could lead to a whole host of metabolic disturbances: obesity, abnormal lipid profile, elevated blood sugar, elevated insulin, etc. In other words, metabolic syndrome.
<br /><p>I would argue that the rapidly growing health problem is not simply obesity, but metabolic syndrome (remember, obesity is only one symptom, and there are thin people with metabolic syndrome too). We want to understand both how we arrive at metabolic syndrome (so our children can avoid it), and also how to treat it for those who did not avoid it. It is clear that carbohydrates across the board are not causal in the development of metabolic syndrome. Stephan provides several counter-examples; another is the Tarahumara, who, like the old-school Pima, subsist largely on corn, beans, and squash, but who have one of the lowest rates of Type 2 diabetes in the world.
<br /></p>But the cure is not necessarily the reverse of the cause when it comes to disease. Metabolic syndrome brings a whole host of issues, not the least of which is broken carbohydrate metabolism. So while carbs in general may not lead to metabolic syndrome, once you've arrived dumping carbohydrates on your broken carbohydrate metabolism is tantamount to doing jumping jacks on two broken legs. I believe the science (along with a massive stack of anecdotal evidence) is pretty clear here, in that the most successful treatment for metabolic syndrome is carbohydrate restriction.
<br />
<br />So while yes, Virginia, the Kitavans eat a very high-carbohydrate diet and exhibit general metabolic health, for my personal dietary choices I don't really give a crap. The Kitavans have healthy carbohydrate metabolisms, but I don't (prior to going low carb I had a trophy beer gut, which in retrospect was my liver telling me "You're killing me slowly"). If you look down toward your feet and can see only your protruding liver, you might consider trading in the bagels for bacon (better yet, get yourself a blood glucose meter and check your post-bagel blood sugar - it might frighten you).<br /><br />It is important to remember that carbohydrate restriction is successful as a <span style="font-style: italic;">treatment </span>for a disease, but it doesn't necessarily follow that all carbs are bad for everybody. We have several examples of cultures who thrive on diets of lean protein and whole food sources of carbohydrate, like starchy tubers and fruit. We also have examples of cultures thriving largely on protein and fat. Humans appear to have a remarkable ability to survive as omnivores eating whole foods, which in no small part explains why we are one of the most widely spread species on the planet.<br /><br />So if you have a healthy metabolism, you probably can choose from a wide variety of whole foods (and by "whole food" I mean something you could plausibly obtain from Nature without the aid of much more technology than fire and a sharp stick). Once your metabolism is broken, you will likely need to make some choices to avoid those things which, due to your disease state, have become effectively toxic. In other words, make your nutritional choices based on actual knowledge of metabolism and your own state of health rather than picking a buzz-phrase and applying it blindly.
<br />
<br />And for God's sake, stop eating wheat ;-)
<br />Dave<br /><br /><span style="font-weight: bold;">On Taubes and Toilets</span> (2011-01-29)<br /><br />One of our toilets has been acting balky lately. Last night I went to flush it and nothing happened. I started pondering the possible causes of this, and had a brief vision of a bunch of Ph.D.s standing around, stroking their chins and sagely examining the toilet through glasses perched on the ends of their noses. After a few knowing glances at each other, they pronounced: "From the First Law of Thermodynamics, we know the problem with your toilet is that, at some point in the past, less water came in than left!"<br /><br />Maybe I should have skipped that last martini at dinner last night.<br /><br />Anyway, my imaginary colleagues were only acting as scientists often do, pronouncing "truth" without getting to the root cause of the issue. Or perhaps my subconscious has been imprinted from too many conversations like this with my children:<br /><br />Me: "How did you get so dirty?"<br />Child: "I was playing in dirt."<br /><br />Back to the toilet. My imaginary scientist friends, while technically correct, were (as scientists often are) totally unhelpful. If I were to fix my toilet, I would need to know how it works, particularly the possible failure modes. In other words, I need to get to the root cause of why it didn't flush. Once I know what's actually broken, I can fix it. Invoking the First Law of Thermodynamics might make one sound smart, but it doesn't get my toilet flushing again. And believe me, in this case it was of vital importance to identify and repair the root cause, posthaste.<br /><br />The toilet is actually an example of a self-regulating system, by which I mean that when it works correctly, I don't have to pay attention to it. 
If you're not familiar with the workings of the common toilet, check out <a href="http://home.howstuffworks.com/toilet.htm">this entry at HowStuffWorks.com</a>. Basically, there are some clever gadgets in there that make sure things go smoothly. When you push the handle down, it pulls up the flapper, basically a rubber stopper with a hole in the bottom. There's air inside the stopper, which causes it to float open as long as the water level is above the stopper. Once the water is gone, the stopper closes. That's the output side. The input side is controlled by a float-activated valve. When the water level falls, so does the float, which opens the valve and lets water into the tank. As the tank fills, the float rises until it hits the switch and shuts the valve, turning off the water. This whole setup is tuned to ensure that you have enough water leaving the tank at the proper rate to get a good flush, while not having too much water enter the tank and thus flooding your bathroom. The toilet has an additional fail-safe to avoid the latter fate, in the form of an overflow pipe. If your float switch doesn't work, then the water goes down the overflow pipe instead of all over your floor.<br /><br />My particular problem was too little water, not too much. Since I have confidence in the First Law of Thermodynamics (<a href="http://sparkofreason.blogspot.com/2008/07/energy-conservation-its-not-just-good.html">at least in approximately flat regions of spacetime</a>, like my bathroom), I know that something caused the tank not to fill. The water didn't fill the tank at some point and then magically vanish. One possibility was that my water main had broken. I checked the sink faucet: plenty of water there. Rather than stand around and be mystified by the inner workings of my toilet, I opened it up and took a look inside. No water, all right, and it looked like the float was stuck. A quick poke and the toilet started filling. 
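The fill cycle is a simple negative-feedback loop, and a stuck float is one of its failure modes. Here's a minimal sketch in Python (the tank capacity and fill rate are made-up numbers, purely for illustration):

```python
def refill(level, capacity=6.0, fill_rate=0.5, float_stuck=False, max_steps=100):
    """Simulate the tank refilling after a flush.

    The float-activated valve stays open while the water level is below
    capacity; a stuck float keeps the valve closed and the tank never fills.
    """
    for _ in range(max_steps):
        if float_stuck:          # stuck float: the valve never opens
            break
        if level >= capacity:    # float rises to the switch: valve shuts
            break
        level += fill_rate       # valve open: water enters the tank
    return min(level, capacity)

# A healthy toilet refills to capacity and stops; a stuck float leaves it empty.
print(refill(0.0))                    # 6.0
print(refill(0.0, float_stuck=True))  # 0.0
```

A healthy loop refills the tank and shuts itself off with no supervision; break the sensor (the float) and the whole system silently does nothing, which is exactly the failure I observed.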
The moral of the story is that the laws of physics don't tell you how things work, but rather the constraints under which they work (e.g. the amount of water leaving the toilet in a flush is the same as the amount that entered when it filled). To solve my problem I needed to understand the mechanism by which the toilet regulated water flow and level, and how that regulation could go wrong. In other words, if you know what causes the toilet to work correctly, then you can infer what might cause it to work incorrectly, and take appropriate action.<br /><br />If you've read Gary Taubes' most recent work, <a href="http://www.amazon.com/Why-We-Get-Fat-Borzoi/dp/0307272702">Why We Get Fat</a>, you probably have realized by now that my toilet story is a bit of a setup. Why We Get Fat (WWGF) is generally described as "Good Calories, Bad Calories" lite, but it is a bit more than that. Taubes has focused on obesity in particular, honed his arguments and presentation, and brought in some more recent research as well. It's an excellent and fast-paced read, and I highly recommend it.<br /><br />The key hypothesis of WWGF in terms of obesity is the same as GCBC's: that obesity is the result of a failure in the regulation of metabolism, specifically carbohydrate metabolism. A broad set of critics attack both GCBC and WWGF on various detailed points, while missing the big picture. For instance, there's much ado about the specifics of Taubes' hypothesis on <span style="font-weight: bold;">how </span>this failure in carbohydrate metabolism arises. Taubes posits that overconsumption of carbohydrates basically leads to insulin resistance (though he notes that the situation may not be so simple), while others point to various evidence that it may be specific carbohydrates (fructose), or vegetable oil, lack of physical activity, etc. These make for nice academic discussions, but if you're one of the millions of people with a broken metabolism, none of this is very helpful. 
Much as was the case with me and my toilet, if you're going to fix your metabolic machine, you need to have some idea of how it works and what might go wrong. WWGF is a great place to start educating yourself, provided you don't fall into the trap of running around screaming "I can't see the forest because I've got blood in my eyes from running into all of these damned trees!"<br /><br />The publication of WWGF has also revived the strident preaching from the members of the Holy Church of the First Law of Thermodynamics. Now, I'll give you a pass if you read GCBC and perhaps came away thinking that Taubes implied that low carbohydrate diets somehow got around energy conservation. GCBC was a dense book, and Taubes (who has a degree in physics) no doubt thought that the First Law was just generally held to be true and that nobody would question his belief in it, and so didn't focus on it much. Taubes clearly learned the hard way that you can't take these things for granted. WWGF has two chapters on this topic, and makes it very clear that 1) Yes, Virginia, the First Law of Thermodynamics is alive and kicking, but 2) the First Law adds no information as to the <span style="font-style: italic;">cause </span>of obesity, or what you might do to fix it. If you read WWGF and still think Taubes is claiming that thermodynamics doesn't apply to biological organisms, then you either didn't really read the book, weren't paying any attention, or have the logical faculties of a monkey on crack.<br /><br />The real lesson of WWGF is the same as my toilet story: just knowing the constraints on the workings of your body (e.g. conservation of energy) is not the same as knowing how the pieces actually fit together, the cause-effect relationships that make the whole machine go. You can't fix something without having some idea of how it works, whether it is a toilet or the human body. Like my toilet, your metabolism (and that of all living organisms) is self-regulating. 
Humans seem to be control-freaks in general, and we think that every aspect of life needs constant attention, much like driving a car (I wish people paid as much attention to driving as they do to other less consequential things, like whether or not their children poop enough times a day). But when my toilet works right, I don't have to sit in the bathroom and monitor it, waiting to shut off the water if there's an overflow, or fiddling with the float valve switch. It just does its thing. Metabolism is the same way. Energy regulation is the key aspect of life, from bacteria to humans, and most life doesn't have the capacity to fret about how many calories it ate or how much it exercised. If your metabolism is operating correctly, by definition it is impossible to eat too much. When you have too much energy stored, the body has ways of eliciting biochemistry and behavior (which is just complicated biochemistry) to bring things back into balance: appetite is decreased, thermogenesis is increased, you have the urge to move around, etc.<br /><br />If you're obese, you don't have a character or mental defect any more than my balky toilet does. You have a physical problem in metabolic regulation. Invoking the First Law of Thermodynamics and berating obese people as having a behavioral issue does not address the root cause of obesity, any more than a similar approach would have worked in fixing my toilet (I'm having visions of registered dieticians bitching out my commode for lack of self-control). 
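That kind of self-correction can be made concrete with a toy model. The sketch below is purely illustrative (the set point, gain, and units are invented, not physiology): when energy stores sit above a set point, "intake" falls and "expenditure" rises, and the system drifts back into balance with no conscious bookkeeping at all.

```python
def simulate(stores, setpoint=100.0, days=365, gain=0.1):
    """Toy negative-feedback energy regulation.

    Daily intake falls, and expenditure rises, in proportion to how far
    energy stores sit above the set point (and vice versa below it).
    """
    for _ in range(days):
        error = stores - setpoint
        intake = 20.0 - gain * error       # appetite drops when stores are high
        expenditure = 20.0 + gain * error  # thermogenesis/activity rise
        stores += intake - expenditure
    return stores

# Start 50 units over or under the set point; both converge back.
print(round(simulate(150.0)))  # 100
print(round(simulate(50.0)))   # 100
```

The interesting case, of course, is when the regulator itself breaks, i.e. the gain goes to zero or the set point drifts, which is the book's whole point.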
WWGF is a great place to start "opening the box" and empowering yourself to start giving the "experts" the finger, stop feeling like a failure because your experience doesn't agree with their beliefs, and get down to actually solving the problem.<br /><br />Dave<br /><br /><span style="font-weight: bold;">Have We Reached the Tipping Point?</span> (2010-11-16)<br /><br />A very quick post here. Take a look at this article: <a href="http://www.foodnavigator-usa.com/Science-Nutrition/Low-fat-diets-could-increase-heart-disease-risk-say-nutrition-experts/">http://www.foodnavigator-usa.com/Science-Nutrition/Low-fat-diets-could-increase-heart-disease-risk-say-nutrition-experts/</a><br /><br />If the American Dietetic Association is flipping on low-fat diets, I'd say that signals the beginning of the end (hat tip to the <a href="http://holdthetoast.com/content/took-em-long-enough">Hold the Toast blog</a>). Still waiting for Dean Ornish to jump out and tell us we've been punk'd.<br /><br />Also check out the <a href="http://www.fathead-movie.com/index.php/2010/11/16/the-twinkie-diet/">Fat Head take on the Twinkie diet</a>. Nice analysis of the food logs.<br /><br />Dave<br /><br /><span style="font-weight: bold;">If you are what you eat, what does that say about "The Twinkie Diet" professor?</span> (2010-11-10)<br /><br />I've had a few questions on the "<a href="http://www.cnn.com/2010/HEALTH/11/08/twinkie.diet.professor/index.html">Twinkie Diet</a>" that's been buzzing about the Internet, so here are a few thoughts...<br /><br />The gist of the Twinkie business is that professor of human nutrition Mark Haub lost 27 pounds over 10 weeks by eating largely "junk food", like Twinkies. The "secret" was that he cut calories from 2600/day to 1800/day. 
Haub's point was to show that "in weight loss, pure calorie counting is what matters most -- not the nutritional value of the food". This gets the "well DUH!" award for the month. Suppose you ate nothing at all. You'd be getting zero nutritional value. Do you think you might lose weight? Hmmmm, could be, doc.<br /><br />The deeper issue here is the apparent ignorance of people like professors of human nutrition about the basics of metabolic regulation. To first order, if you keep the macronutrient ratios about the same in your diet and reduce calories, you will also reduce the amount of insulin your body secretes in response to that food. <a href="http://sparkofreason.blogspot.com/2009/08/gut-feeling-about-insulin.html">As oft noted on this blog</a>, insulin is a major metabolic hormone, governing a wide variety of processes having to do with the utilization and storage of energy, not the least of which is driving fat storage. More insulin means more fat storage. Less insulin means less fat storage. Drop insulin enough and on average more fat leaves the fat cells than is stored. The root cause of fat loss under calorie restriction is NOT simply restricting calories, but the effect that calorie restriction has on your hormones, particularly insulin. For anecdotal evidence of this, you could ask a Type II diabetic who has to take insulin injections how hard it is to lose fat even by starving. More controlled experiments have been performed in animals. For instance, you can take an obese rat, keep its insulin levels artificially high, and starve it. Said rat will literally starve to death while obese, consuming its internal organs for energy, because the high insulin level effectively keeps fat locked up in fat cells.<br /><br />So yes, of course, you can eat a calorie-restricted diet of Twinkies and lose fat. But failing to understand how all of the metabolic dots are connected leads to several common backwards assertions made in the article, e.g. 
"Being overweight is the central problem that leads to complications like high blood pressure, diabetes and high cholesterol". Sure about that, doc? Or do obesity, high blood pressure, diabetes, and high (LDL) cholesterol have a common cause, like, say, excess insulin? After all, there are skinny people with high blood pressure, diabetes, and high cholesterol. There are obese people who otherwise tape out as very healthy. So obesity is clearly not a cause, at least not the root cause. Insulin modulates a large number of genes, and so the precise set of symptoms a person experiences from hyperinsulinemia is going to be a function of their specific genetic makeup. A key test of a scientific hypothesis is its predictive power. The hypothesis that obesity causes Type II diabetes misses by tens of percent. But 100% of Type II diabetics are hyperinsulinemic, whether or not they are obese. Where would you put your money?<br /><br />The key take-away here is that there is a large body of "health professionals" who essentially view the human body as a black box, and as such tend to come up with hand-waving and over-simplified "rules" linking various externally observable effects, like "calories in, calories out" (strictly true, but pointless because it makes no connection between cause and effect). As such, the recommendations of these people rarely rise above the level of old wives' tales, in terms of the strength of evidence supporting them. When we "open the box", and begin to understand how the inputs and outputs are connected, and further how the body maintains control over metabolism and behavior in an attempt to maintain "health", things become much clearer. If your health expert has this knowledge, you are very lucky. Most are ignorant, and likely will remain so, as once a person deems themselves an "expert", they no longer feel the need to learn anything new (particularly if it contradicts their "expertness"). 
So it is going to be up to you to gain some measure of knowledge, so that you can make informed decisions for yourself.<br /><br />If you are a person with any degree of scientific interest and background, then I hope you will have read "<a href="http://www.amazon.com/Good-Calories-Bad-Gary-Taubes/dp/1400040787/">Good Calories, Bad Calories</a>" (or a similar book) by now. If not, then shame on you for purposefully maintaining your ignorance. While no book (even a scientific textbook) has the whole story, GCBC does a fantastic job of delving into the very well-established metabolic science linking insulin and various health issues. As oft noted within, most of this stuff is not considered controversial at all. The processes by which insulin regulates fat storage have been established for decades. The gap is simply one of knowledge, where "professors of human nutrition", medical doctors, and the like either don't learn this stuff, or fail to connect the dots: what you eat affects your hormones, which affect biological processes like fat storage, which affect other hormones, which can ultimately affect what you eat. Behavior, after all, is just another manifestation of biology. So it's going to be up to you to educate yourself to some extent. If you're more of a right-brain person or otherwise find GCBC a daunting read, Gary Taubes' forthcoming book "<a href="http://www.amazon.com/Why-We-Get-Fat-About/dp/0307272702/">Why We Get Fat</a>" might be more up your alley.<br /><br />But if you choose to remain ignorant, and blindly follow "expert advice", you deserve exactly what you get.<br /><br />Dave<br /><br /><span style="font-weight: bold;">Stupid = Fat + Sick?</span> (2010-09-27)<br /><br />Just a quickie to point you to a great "tell it like it is" article at the Huffington Post. 
Link below:<br /><br /><a href="http://www.huffingtonpost.com/justin-stoneman/post_868_b_720398.html">http://www.huffingtonpost.com/justin-stoneman/post_868_b_720398.html</a><br /><br />The short version: before taking "expert" advice, check their shoes . . . and their motivations. I'll repeat it again: the only person who truly has your best interests at heart is YOU. Everyone else has some other axe to grind. Once in a while you might luck out and find an "expert" whose goals are aligned with yours, but don't hold your breath.<br /><br />Dave<br /><br /><span style="font-weight: bold;">Of Mice and Men, Meat and Wheat</span> (2010-09-04)<br /><br />Last week, I saw a very interesting show on the Science Channel (or one of the Discovery family), called "How Food Made Us Human". I don't see any more showings coming up, but recommend you keep an eye out; it's definitely worth watching. Much of the material will be review for those who follow recent ideas on metabolism and evolution, but it was well done: easy to follow, concise, with some nice hands-on demonstrations of the concepts. One I liked in particular was the "chewing machine". Some scientists rigged up a gizmo to simulate chewing, and then put in "teeth" taken from casts of <span style="font-style: italic;">Australopithecus </span>and (I think) <span style="font-style: italic;">Homo habilis</span> fossil jaws. <span style="font-style: italic;">Australopithecus </span>has flat teeth, hypothesized to be better for grinding tough fibrous vegetation, while <span style="font-style: italic;">Homo habilis</span>' teeth are smaller with more ridges, better for tearing. And this is exactly what was demonstrated: the <span style="font-style: italic;">Australopithecus </span>jaws made short work of a carrot in a single bite, but barely put a dent in a piece of raw meat. 
<span style="font-style: italic;">Homo habilis</span> nearly cut the meat in half with a single bite.<br /><br />In another fun experiment, several human subjects were "locked up" at a zoo, and fed something like the diet of our closest primate relatives (chimps and gorillas), consisting of raw fruits and vegetables. It was mostly vegetables, if I remember correctly, lots of clips of people gnawing on carrots, raw broccoli, and the like. Long story short, everybody hated it. They spent about half the day doing nothing but chewing, and were starving nonetheless. I believe the average weight loss quoted was something like 10 pounds in two weeks (I meant to watch the show again and take notes, but this is the first "free" time I've gotten, and it's only because I'm at the park with the kids). Subjects apparently spoke of the desire for meat quite a bit (when they weren't bitching about all of the chewing and frequent bathroom visits). It's a fun experiment you can try at home yourself!<br /><br />One part of the show which rather surprised me: one of the scientists visited a remote tribe in Africa who were still living a fairly primal hunter-gatherer existence. What struck me was that these people looked like crap, nothing like the sort of pictures taken by Weston A. Price, or the fossilized <span style="font-style: italic;">H. habilis</span> jaws shown in the chewing experiment. Their faces showed signs of nutritional stress, with small jaws and crowded teeth. One hint here may be the effort they went through to get meat. In the show, they had a porcupine cornered in its burrow (BTW, this thing was huge, the size of a really big dog). The tribesmen spent the better part of the day digging 6-foot deep holes, until they forced the porcupine into the Hobson's choice of which hole to get speared in. It was a tremendous amount of effort to get some meat, and one of the hunters basically said "porcupine sucks, but at least it's meat". 
They also discussed how socially important it was for hunters to bring back meat, that it brought them status in the tribe, etc. Clearly, meat is at the top of the menu for these people. Yet that they would go to such lengths to obtain it (particularly when they find porcupine distasteful) makes me wonder if hunting isn't so good in this region anymore. Perhaps game has become scarce, hence the appearance of nutritional stress? Yet they're hanging in there, if nothing else a testament to the tremendous adaptability of humanity, made possible by our big brains (and what do you suppose made those big brains possible?)<br /><br />Anyway, "How Food Made Us Human" has spawned a couple of trains of thought, which I want to share with you here. The first has to do with a mouse experiment demonstrated on the show; the second with the correlations between diet changes and physiological changes over the course of evolution.<br /><br />The mouse experiment was very interesting, intended to show the effect of cooking on caloric bioavailability. Take some mice, feed them raw sweet potatoes, and measure the change in body mass as well as activity (based on distance run on the exercise wheel). Now cook the sweet potato and do the same thing. If you're still stuck in the calories-in calories-out (CICO) paradigm, the results should spawn massive cognitive dissonance. The mice who ate the cooked food showed the following differences when compared to those eating the raw food:<br /><ul><li>They exercised significantly more, AND</li><li>They were heavier.</li></ul>That's worth thinking about for a moment, particularly if you think that obesity is caused by conscious choices favoring gluttony and sloth. When the mice ate cooked sweet potato, they exercised MORE than those eating the raw version. Does cooking food spur psychological changes that cause you to become less lazy? But despite exercising more, the mice <span style="font-weight: bold;">still got heavier</span>. 
Put that in your pipe and smoke it, CICOs.<br /><br />Of course this all makes perfect sense when considered from the standpoint of evolution and metabolic regulation. Cooking makes calories more available. Though they didn't explicitly say so in the show, one presumes that the quantity of potato was held constant between the two groups (since not doing so would void the entire point of the experiment). So the only difference (presumably) was raw vs. cooked. Mice didn't evolve eating cooked food. The higher caloric availability likely "fooled" their digestive systems into taking up calories too rapidly, faster than required to support normal metabolic operations. Rate of digestion is regulated by hormonal and nervous feedback mechanisms: when the brain and other internal sensory systems think there's enough energy around, gastrointestinal motility decreases, slowing the rate at which food leaves the stomach to be digested and absorbed in the small intestine; and of course when an energy deficit is detected, food moves more rapidly out of the stomach. When the stomach is empty AND your body senses an energy deficit, you get hungry, and are driven to find more food.<br /><br />That's how it's supposed to work. Like all feedback control systems, if you push it outside the "designed" range of stability, it starts failing. I expect this to be particularly the case with biological systems. Biological responses tend to follow <a href="http://en.wikipedia.org/wiki/S-curve">"S-curve"</a> shapes. There's nothing deep about this. It simply reflects the fact that biological responses are limited by available resources. At some point you run out of the capacity to make more hormones, neurotransmitters, receptors, etc. Insulin response is a great example. As a function of blood glucose, the secretion of insulin follows a shape much like that seen on the Wikipedia page. 
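To make the shape concrete, here's a toy logistic ("S-curve") response in Python. The parameters (midpoint, steepness, maximum) are invented for illustration; they are not physiological values:

```python
import math

def insulin_response(glucose, max_secretion=1.0, midpoint=100.0, steepness=0.1):
    """Toy logistic ("S-curve") response: rises steeply near the midpoint,
    then saturates at max_secretion as the system runs out of resources."""
    return max_secretion / (1.0 + math.exp(-steepness * (glucose - midpoint)))

# The curve flattens at the top: pushing the input ever higher buys
# almost no additional response once secretion saturates.
for g in (50, 100, 150, 300):
    print(g, round(insulin_response(g), 3))
```

Half the maximum response at the midpoint, near zero well below it, saturated well above it: that saturation is the whole point.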
At some point you either saturate the ability to detect glucose, or saturate the ability of the pancreas to crank out insulin, or both. The point is that it is possible to exceed your body's ability to effectively control blood sugar levels via the action of insulin, simply by changing the effective "sugar density" of the food you consume.<br /><br />Back to our mice: when faced with an excess of calories, how can the mouse's body respond? It can either store energy, or burn it off (or both). We know some of it got stored, as the mice got heavier. The show gave one example of "burning it off", in the spontaneous increase in activity. I don't know if they measured it, but I'd wager that the mice also gave off more heat, which I think is a more effective way of dumping energy. Muscles are remarkably efficient, and it is surprising how much mechanical work you can get out of a kilocalorie, when compared to the equivalent thermal energy (1 kilocalorie will raise 1 kilogram of water only 1 degree C in temperature, but will lift a 1 kg mass more than 400 meters against gravity). So the outcome of the mouse experiment is wildly inconsistent with the CICO paradigm, but precisely what one might predict from evolution and metabolic regulation. It would be very interesting to see what would happen if the mice were allowed to continue eating cooked sweet potatoes for a longer time period. I wonder if they would develop mouse metabolic syndrome?<br /><br />The second line of thought follows the main line of reasoning from the show, which runs as follows: by incorporating more nutrient-dense foods into their diets, our hominid ancestors set in motion a major evolutionary shift, where gut size was exchanged for brain size. The argument is elegant: if you eat food with greater bioavailable caloric density, you can spend less energy on digestion. 
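(A quick aside: the muscle-versus-heat comparison a couple of paragraphs back is easy to verify with basic physics. This is just a back-of-the-envelope check using the standard conversion 1 kcal = 4184 J — nothing from the show.)

```python
# One kilocalorie, expressed as heat vs. as mechanical work against gravity.
KCAL_IN_JOULES = 4184.0  # standard conversion: 1 kcal = 4184 J
G = 9.81                 # gravitational acceleration, m/s^2

# As heat: 1 kcal raises 1 kg of water by 1 degree C (the definition of a kcal).
# As work: the height h that 1 kcal can lift a mass m, from E = m * g * h.
mass_kg = 1.0
height_m = KCAL_IN_JOULES / (mass_kg * G)
print(round(height_m, 1))  # 426.5 -- comfortably "over 400 meters"
```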
That opens up an evolutionary pathway to increase brain size and energy expenditure at the expense of the gut, because you can extract the same energy from food with a smaller and less-demanding digestive system. And indeed, the fossil record seems to indicate that the major jumps in hominid brain sizes came at two critical nutritional junctures. The first was around the time we started eating meat. The second was when we started cooking food, and cooking is hypothesized to have led to modern humans.<br /><br />It's interesting to consider the evolutionary advantage brought about by our big brains. What advantage did they bring our hominid ancestors? The scientist visiting the African hunter-gatherers went on a bit about how hunting is a fairly complex behavior, particularly as practiced by humans. But there's a lot more to it than that. The brain remembers - it stores information. And an "advanced" brain not only remembers information, but can extrapolate it to the future, making choices now that create advantages later. For instance, it's good to know that when the rains come, wildebeest are going to show up at a certain place, and that there's an effective method for cutting one out of the herd, how to make the tools you need to kill and butcher it, which parts are good to eat, etc. In the context of hunting and gathering, there's a positive feedback loop between the increase in nutrient density and encephalization, each reinforcing the other.<br /><br />Of course, there's been another major shift in nutrient density since the advent of cooking: agriculture. Agriculture is an interesting case. At first blush it doesn't seem so hot. The main thrust of human agriculture has been domestication of annual grasses for their seeds, i.e. grains. Across the world, in geographically separate locations, populations growing different crops (wheat, corn, rice) uniformly appear to show significant increases in the chronic "diseases of civilization". 
But evolution doesn't care if your teeth fall out or you drop dead from cancer at age 40. All evolution cares about is reproductive fitness, and agricultural humans had an undeniable reproductive advantage over hunter-gatherers; otherwise we wouldn't be having this conversation.<br /><br />We had a hint from the mouse experiment above that an increase in dietary effective energy density could lead to "metabolic overload", exceeding the body's ability to balance and regulate the intake and expenditure of energy. And indeed, the evidence continues to mount that a good chunk of our modern epidemic of chronic diseases may be attributable to such metabolic malfunction. It makes me wonder what happened to our <span style="font-style: italic;">Australopithecus </span>ancestors when they started eating meat: did they suffer metabolic disease as well? It's an academic question, of course, as chronic disease or no, meat-eating proved to increase reproductive success. We might ask a similar question about what occurred as cooking gained popularity. These are hard questions to answer from the fossil record. Agriculture happened much more recently, and is further amenable to archaeology (agriculturists tend to gather in large numbers in one spot, as opposed to wandering all over the place looking for food).<br /><br />But here's a thought: we noted above that big brains are useful for remembering lots of stuff. This is important when living as a hunter-gatherer, because the dynamics of nature are complex. Maximizing reproductive potential in this context means being able to remember and extrapolate the myriad (and often subtle) cause-and-effect relationships of the natural world, along with whatever technological innovations are required to take advantage of this knowledge. This information does not get passed on genetically, but rather through communication, i.e. parents teaching children. 
Knowledge is power, in a very tangible sense, when talking about hunter-gatherer survival. Greater knowledge implies greater ability to obtain nutrient-dense food, hence greater reproductive fitness; it also means that it takes longer to get that knowledge into the brains of your offspring. It is often argued that diseases of civilization have little effect on evolution, because they generally kill you after the reproductive years. But that assumes the only information being passed along is genetic. If memories are also required for the reproductive success of your offspring, it pays to live long enough to pass along that information. And if you follow this line of reasoning, it's clear there's a volume of information at which the parent will not be able to effectively communicate the body of knowledge while still performing hunting and gathering activities required for survival. Enter Grandma and Grandpa. If the information volume for reproductive effectiveness is sufficiently large, it pays to live long enough to pass along that information to your offspring, their offspring, and so forth. Correspondingly, adoption of new dietary practices must either preserve this longevity, OR require less information to be effective.<br /><br />So where does agriculture fit in? Is the adoption of agriculture, which brings with it ever-increasing energy density in food, driving us toward the next phase of "big brain" evolution? Good question - but consider this: how much do you need to know to be an effective agriculturist? I would argue rather little, compared to hunting and gathering. A hunter-gatherer may have thousands of foods in their diet, and they have to know where and when to find them, how to prepare them, etc. Agriculturists have relatively narrow diets, and there's a relatively simple and fixed pattern to the whole business: plow the land, plant the seeds, keep out the weeds, harvest. Lather, rinse, repeat. 
So I think you can argue that agriculture has a much smaller information burden than hunting/gathering. The tremendous technological increases since the advent of agriculture are a testament to how relatively little brain-power is needed for obtaining food anymore, as we apparently had plenty of spare brain capacity to monkey around with things not directly related to getting fed.<br /><br />Now it is well known that brain volume decreased dramatically with the advent of agriculture. So did adult lifespan. Yet the agriculturists clearly laid the smack-down on the hunter-gatherers, evolutionarily speaking. So neither long-term health nor brain size is a reproductive advantage once you start growing your own food (or at least the foods that our ancestors chose to cultivate). There's no point in fueling a big brain if you've got nothing to put in it. And there's no point in keeping old people around if they are not able to contribute directly to the reproductive fitness of their offspring. If you can't work the fields, don't make babies, and we don't need your accumulated wisdom, then you're pretty much just eating food better used for making more genetic copies. So for an agriculturist, dropping dead at 35 may actually have been an advantage.<br /><br />It makes me wonder what direction agriculture (and more recently, industrial food processing) is driving our "humanness". Does the ever-growing energy density and general availability of food imply we'll evolve even smaller guts and bigger brains? I'll put my money on "No". After all, look around you: it's not like the smartest people are the most successful reproductively. You can damn near be vegetative, contribute nothing to society at all, yet we will ensure you've got all the Big Macs and Twinkies you can stuff in your face to fuel the generation of lots of babies to do the same thing. In our current environment, evolution favors being chronically ill and stupid. 
It doesn't really matter how much we wring our hands about ethics, culture, and society: reproductive success always wins. So if humanity wishes to achieve its stated long-term goals of giving people long and healthy lives while living sustainably on Earth, we'd better figure out how to align those goals with Nature's overriding law of reproductive fitness.Davehttp://www.blogger.com/profile/18290594860469294453noreply@blogger.com11tag:blogger.com,1999:blog-7721098568390636553.post-50107758971652593882010-07-23T06:27:00.000-07:002010-07-23T06:45:28.834-07:00Cognitive Dissonance: Not Just for the LaypersonI must admit, I had not carefully read Dr. Campbell's <a href="http://campbellcoalition.com/?p=142&cpage=1#comment-36">"last word" from the "discussion" of his reply to Denise Minger</a>. His refusal to engage in discussion told me everything I needed to know. But in spelunking around for something else on that page, I came across this quote:<br /><br /><blockquote>I had hoped to have had a civil discourse, but this is difficult when the questions come from uncivil people. I also don’t have time to answer superficial questions of others like ‘what is the detailed mechanism of protein induction of high cholesterol levels’ – that easily could become an entire but relatively useless dissertation when the “mechanism” most decidedly is a symphony of mechanisms, as I explained in our book. At this point, the far more important observation is the dramatic increase in serum cholesterol.</blockquote><br /><br />Hmmm, I wonder what Campbell's definition of "uncivil" is? Seems to have some conceptual overlap with the second sentence, i.e. those who ask "superficial" questions are being "uncivil". The question in question came from me, and I'm glad to see it had one of the desired effects. My preferred outcome would have been that Dr. Campbell actually answered the question. Then I would have learned something. 
It is unfortunate that he instead evaded the question as above, because then all we learn is that a) he doesn't have an answer, but b) thinks he does, and is thrown into painful cognitive dissonance when confronted by the truth of his ignorance. The nonsense about there being a "symphony of mechanisms" is, I believe, a subtle trick played on Dr. Campbell by his own mind. There are indeed many possible causes, and there may be several interacting processes. But he confuses "I don't know which of the many possibilities contributes to the effect" with "here is what we know, a complex process". Classic mental band-aid for cognitive dissonance.<br /><br />Anyway, I think my goal has been accomplished. I wanted to know if Dr. Campbell had any relevant information. If not, I wanted him to publicly torpedo his own credibility. Mission accomplished. Next time he wants to show up and bash a low-carb or paleo book on Amazon, you have ample material to demonstrate his irrationality.Davehttp://www.blogger.com/profile/18290594860469294453noreply@blogger.com20tag:blogger.com,1999:blog-7721098568390636553.post-83675774508350570002010-07-22T08:57:00.000-07:002010-07-22T09:13:25.290-07:00What T. Colin Campbell Didn't Want You to SeeT. Colin Campbell has chosen to not participate in any discussion of his own "scientific results". Take a look at his last word on the topic, note the metadiscussion of what "science is about" rather than any actual discussion of science, and check the shoes.<br /><br />The great thing about the Internet, of course, is that it is impossible to censor anything. I'm pasting the comments I submitted to Campbell's site below. These were not approved. Compare with the openness displayed by Denise Minger in publishing comments from all comers, and fostering open discussion. Draw your own conclusions. If you have submitted comments to campbellcoalition.com that were not published, feel free to post them in the comments here. 
I'll send through anything that isn't overt spam.<br /><br />To be fair, these comments may yet show up. There may be a perfectly acceptable explanation for why they haven't been published yet. I'm sure most bloggers have experienced "falling behind in comment moderation". If these comments are published, I partly retract my criticism. But the main portion remains valid: exchange of information is crucial to scientific progress. If you're not willing to exchange information, you're not interested in scientific progress.<br /><br />I posted this just because it seemed odd to be revising such a benign comment. Who does this, and why?<br /><p></p><blockquote><p>Uh, why did your answer to my original question change from “Dr. Campbell said he will be able to post comments now and then, although he cannot respond to every question.” to “Dr. Campbell said he will participate to the extent possible.”? Those seem like they say the same thing to me. </p> <p>At any rate, I expect Dr. Campbell will find it a better use of his time to respond to specific points here rather than having to write lengthy detailed work such as above.</p></blockquote><p></p><p>Here's a harder question:<br /></p><p></p><blockquote><p>From the response above:</p> <p>“First and foremost, our extensive work on the biochemical fundamentals of the casein effect on experimental cancer in laboratory animals (only partly described in our book) was prominent because these findings led to my suggestion of fundamental principles and concepts that apply to the broader effects of nutrition on cancer development.”</p> <p>Can you explain what these fundamental principles might be, or at least direct me to a detailed discussion? Proteins are broken down into amino acids in the gut (at least in healthy individuals). These amino acids are then transported throughout the body, where they may be used to build new proteins. How does a specific mixture of amino acids trigger cancer growth? 
And of course I doubt most free-living organisms eat large quantities of isolated casein. So if I eat a meal containing casein, the mixture of amino acids absorbed reflects that of the total protein content of the meal, not just the casein. </p> <p>It seems that in order for casein to have a specific role, it would need to trigger some other biological response beyond its simple amino acid content. For example, we know that most cancers have a very high glucose requirement, as they largely rely on anaerobic glucose metabolism for energy. We might then expect insulin to be required to stimulate glucose transport. Some cancers do indeed show higher expression of insulin receptors, see e.g.</p> <p><a href="http://cancerres.aacrjournals.org/content/52/14/3924.abstract" rel="nofollow">http://cancerres.aacrjournals.org/content/52/14/3924.abstract</a></p> <p>From this we might hypothesize that dietary carbohydrates would drive cancer growth by providing both a supply of glucose and increased insulin secretion. It can further encompass other observations, e.g. the association of dietary fat and cancer. When eaten in combination with carbohydrate, fat will amplify insulin secretion.</p> <p>Returning to your hypothesis that casein has a unique potential to stimulate cancer growth: what metabolic pathways are followed that create the “casein effect”? Is there some specific hormonal signal uniquely stimulated by casein?</p></blockquote><p></p><p>And a link to a multivariate analysis that would answer at least some of Dr. 
Campbell's objections:</p><p></p><blockquote><p>Here is an interesting blog on a multivariate analysis of China Study data:</p> <p><a href="http://healthcorrelator.blogspot.com/2010/07/china-study-again-multivariate-analysis.html" rel="nofollow">http://healthcorrelator.blogspot.com/2010/07/china-study-again-multivariate-analysis.html</a></p></blockquote><p>I put these comments under the post "The Challenge of Telling the Truth":</p><p></p><blockquote><p>Nelson,</p> <p>Your suggestion about keeping an “open attitude” is a good one. However, you need to keep an open attitude about scientific evidence as well. The way you talk about “truth of health” sounds a lot more like religion than science. Perhaps this is simply a communication gap. I sincerely hope that you and your father have the sort of open and inquisitive minds required for scientific progress. There is no absolute “truth” in science, as this would imply we have perfect information. I doubt even the staunchest supporter of any dietary dogma would claim that we have perfect understanding of the deep complexities of human biology.</p> <p>I will reiterate here what I have said elsewhere: scientific progress is about two-way communication. You and your father likely have information that supports your hypotheses, information that others do not have. However, I’m sure you’d agree that others have information that you do not as well. The only way to reach “agreement” is communication, so we’re all on the same page. This is why dialog is so fundamental to scientific progress. 
I hope you and your father will participate in this dialog.</p></blockquote><p></p><p>---------</p><p></p><blockquote><p>“Despite lacking an adequate understanding of statistics and causality, this person used her intelligence and writing skills to compose a critique that might seem persuasive to laypeople.”</p> <p>You might wish to expand on this a bit. It sounds like you’re saying she is both stupid (“lacking…understanding”) and intelligent in the same sentence. And I’m sure you would agree that “laypeople” need to have greater understanding of the issues so that they can make informed decisions, rather than simply picking an “expert” to blindly follow. Perhaps you can provide a little Statistics 101 discussion for us to better illustrate the shortcomings in Ms. Minger’s analysis for the lay public?</p></blockquote>Davehttp://www.blogger.com/profile/18290594860469294453noreply@blogger.com45tag:blogger.com,1999:blog-7721098568390636553.post-56165091587025165952010-07-22T05:48:00.000-07:002010-07-22T08:55:44.025-07:00Yes, We Have No BananasJust a very quick post. It's been almost a week since I submitted my registration for 30bananasaday.com. Prolific commenter "durianrider" is one of the principals at this site. You can read some of his "insightful" comments on <a href="http://sparkofreason.blogspot.com/2010/07/going-bananas.html">the last post</a>. Note, however, the one question I asked repeatedly, to which he gives no attention. Not only do I think I will never get approved to post on 30bananasaday.com, I doubt I'll even get a reason. Draw your own conclusions.<br /><br />Another interesting development is the new(ish) web site <a href="http://campbellcoalition.com/">campbellcoalition.com</a>. <a href="http://campbellcoalition.com/?p=142">Dr. Campbell's response</a> to <a href="http://rawfoodsos.com/2010/07/07/the-china-study-fact-or-fallac/">Denise Minger's critique</a> is featured prominently, and better yet, Dr. 
Campbell has indicated that he may participate in some discussion there. I've posted a few questions, and urge others to do the same. I recommend you focus the discussion on scientific topics, as opposed to his opinion of Denise Minger, etc. Come armed with some hard questions on the connections between nutrition and metabolism, particularly as they relate to Dr. Campbell's hypotheses. I believe this exercise has two realistic outcomes: either Dr. Campbell has some answers (which actually would be very cool), or he stonewalls. Either way we learn something interesting.<br /><br />*** UPDATE ***<br />Well, it didn't take long for us to learn something interesting. From the <a href="http://campbellcoalition.com/?p=142&cpage=1#comment-35">comments on Dr. Campbell's reply to Denise Minger</a>:<br /><br /><blockquote>Based on the response received thus far, we have determined that our prior idea of a reasoned and civil discourse, with participation by Dr. Campbell, is not feasible and have decided to discontinue this discussion thread. Before closing, however, Dr. Campbell wanted to respond to comments from Denise Minger. Her comments are posted above, and Dr. Campbell’s response follows.</blockquote><br /><br />In other words, Dr. Campbell is going to have the last word, like it or not. So much for scientific discourse. The Campbells certainly could have chosen the path taken by Denise Minger - posting all discussion, whether "civil" or not, choosing to reply to those questions or issues that are clearly intended to foster scientific discussion, and ignoring ad hominem attacks etc. Dr. Campbell's chosen course speaks to his true motivations.<br /><br />If you have questions you posted to Campbell's site which did not make it through moderation, I invite you to repost in the comments here. Others can then see exactly what offended Dr. 
Campbell so greatly that he opted out of the discussion.Davehttp://www.blogger.com/profile/18290594860469294453noreply@blogger.com6tag:blogger.com,1999:blog-7721098568390636553.post-14018342780869565822010-07-16T09:34:00.000-07:002010-07-16T10:07:17.361-07:00Going BananasI got an "interesting" comment on my <a href="http://sparkofreason.blogspot.com/2010/07/china-study-crushed-by-its-own-data.html">last post about Denise Minger's critique of "The China Study"</a>. It's the one from "durianrider" - check it out, particularly his "challenge". I've already answered the challenge, but invite others to also provide information about any elite non-vegan athletes they may know of. I have no illusions that we'll change durianrider's mind, or that of any "true believer", but the way to counter misinformation is with good information. Individuals need all the information they can get to make informed decisions, so let's make sure they get it, and support their right to informed choice. Personally, if you choose to be a vegan, that's fine with me. I have no stake in your personal lifestyle choice, but I do want to help people at least make that choice an informed one, rather than one based on the propaganda of zealots.<br /><br />And T. Colin Campbell, if you're out there, let's see if you have the courage of your convictions. I have a Ph.D. and was an academic research scientist for many years, so I should be "worthy" of scientific discourse with you. And discourse is at the root of scientific progress. How can you expect to educate people like me on your views if you are unwilling to discuss them with opponents in a public forum?<br /><br />Related note: <a href="http://www.durianrider.blogspot.com/">durianrider</a> is also one of the principals of the 30bananasaday.com site, along with "freelee". 
Some of the discussion on the post "<a href="http://www.30bananasaday.com/group/debunkingthechinastudycritics">Debunking the China Study Critics</a>" is pretty interesting, from a sociological point of view. I am going to try registering for the site, and see if they have any willingness to let in opposing views. The registration page and <a href="http://www.30bananasaday.com/forum/topics/please-read-our-forum">forum guidelines</a> make me suspect they are intolerant of those who might not agree with them, e.g. this quote:<br /><br /><blockquote>We will not tolerate "anti-fruit" posts or advice that recommends calorie restriction/or the suggestion that others are "overeating on fruit", also recommending others restrict their water intake will not be supported on 30BaD, these threads will be deleted and you will be given a warning. This advice is not only unproductive but dangerous to the health of our members.</blockquote><br /><br />One of the best signs of dogmatic belief is the intolerance of information which contradicts said belief. I'll reserve judgment on 30bananasaday.com until my application gets accepted or rejected, as I plan to make it quite clear that I will be providing evidence that runs counter to their mission.<br /><br />For my part, I welcome discussion from all corners, provided it is reasonably civil (i.e. contains actual information rather than emotional spewing). The definition of rationality is that two people with the same information will draw the same conclusions. But the only way those two people can achieve the same state of information is through <span style="font-weight: bold;">communication</span>. Even if you completely disagree with my views, there's a reasonable chance that I will learn something from you which may help me make better choices. 
So bring it on!Davehttp://www.blogger.com/profile/18290594860469294453noreply@blogger.com54tag:blogger.com,1999:blog-7721098568390636553.post-83379256010837366902010-07-09T03:39:00.000-07:002010-07-09T04:22:26.790-07:00The China Study: Crushed by its Own DataYou may have already seen <a href="http://rawfoodsos.com/2010/07/07/the-china-study-fact-or-fallac/">this outstanding analysis of the data from "The China Study"</a>. If you haven't, I highly recommend you give it a read. It's long, but well worth the effort. Readers of this blog know my opinion of T. Colin Campbell and his "scientific" work. Now somebody has taken the time to actually crunch the numbers, using Campbell's own data to demonstrate that his conclusions are baseless (at least when confined to this data), and probably the result of confirmation bias.<br /><br />I also love the observation that, despite his constant whining about the "dangers of reductionism" in science, Campbell's entire argument against animal protein really hinges on a strongly reductionist experiment, namely the isolated effect of casein fed to rats in large doses. Snap!<br /><br />Readers know of my criticisms of classical statistics, but it should be noted that I don't really have a problem with the mathematics, but the application. Math is what it is, either right or wrong. My issue is that classical statistics is used incorrectly, to draw inferences about hypotheses, when the underlying mathematical framework has nothing to do with inference. The key problem is that "statistics" are just numbers derived from data, like correlations. They don't say anything about a hypothesis: you will calculate the same correlation between two datasets, regardless of your hypothesis about what <span style="font-style: italic;">causes </span>that correlation. Anyway, I don't want to get off on a rant. 
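To make that last point concrete, here's a toy simulation (invented numbers, nothing to do with China Study data): a hidden third variable drives two others, the two end up strongly correlated despite having no direct causal link, and the correlation coefficient itself is completely indifferent to whichever causal story you prefer.

```python
# A correlation coefficient knows nothing about causation: here a lurking
# third variable (z) drives both x and y, yet x and y correlate strongly.
import random

random.seed(42)
z = [random.gauss(0, 1) for _ in range(1000)]       # hidden common cause
x = [zi + random.gauss(0, 0.3) for zi in z]         # e.g. "shaving frequency"
y = [zi + random.gauss(0, 0.3) for zi in z]         # e.g. "heart disease risk"

def pearson(a, b):
    """Sample Pearson correlation coefficient of two equal-length lists."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
    va = sum((ai - ma) ** 2 for ai in a)
    vb = sum((bi - mb) ** 2 for bi in b)
    return cov / (va * vb) ** 0.5

# Strong positive correlation, zero direct causal link between x and y:
print(round(pearson(x, y), 2))
```

The number that comes out is just a property of the two columns of data; the causal interpretation (x causes y, y causes x, or z causes both) is supplied entirely by the analyst.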
My point here is that the author, Denise Minger, does an excellent job of confining her analysis and conclusions within the bounds of what classical statistics can tell you. And along the way, she does a great job of demonstrating how easy it is to fool yourself (as T. Colin Campbell did - repeatedly) by over-interpreting these numbers which, in the end, cannot tell you anything more than what's in the data.<br /><br />Ms. Minger has also done a great service in providing a concrete example of the issues in observational studies. You've likely read often that epidemiological studies are of little use in distinguishing between competing hypotheses. Now you have an example, replete with numbers. Ms. Minger demonstrates in several cases how a seemingly "obvious" conclusion vanishes once you dig into the large number of uncontrolled variables inherent in all observational studies. It's easy to find correlations in large datasets with many uncontrolled variables. The problem is that people take these correlations to mean more (or less) than they really do in terms of supporting/undermining a particular hypothesis, and the conclusions they draw are essentially <span style="font-style: italic;">ad hoc</span>, not based on any rigorous mathematical analysis, but rather hand-waving about what is "obvious". An oft-quoted example is that men who shave daily have a higher incidence of heart disease. It is "obvious" that heart disease is not caused by shaving, right? Or is it? There's a whole lot of other information that goes into that judgment. We generally take this sort of thing for granted, especially when made in pronouncements from "esteemed" scientists like T. Colin Campbell. But if you dig into the reasoning behind these conclusions, you generally find a tangled web of assumptions, hypotheses assumed to be true, but which have varying (if any) actual evidence to support them. Ms. 
Minger does a great job of teasing these out of Campbell's reasoning, and demonstrating how the data itself provides little evidence one way or another, precisely because it cannot distinguish between the potential effects of the many intertwined and uncontrolled variables.<br /><br />Anyway, enough of my babbling. Go read the article, you'll be glad you did (unless you're an uncritical fan of T. Colin Campbell, in which case you've got bigger problems).Davehttp://www.blogger.com/profile/18290594860469294453noreply@blogger.com30tag:blogger.com,1999:blog-7721098568390636553.post-41037459305997771042010-05-21T07:25:00.001-07:002010-05-21T07:28:18.488-07:00Alzheimer's and RAGESomething I wrote in an email a while ago . . .<br /><br />Advanced glycation endproducts (AGEs) are the endpoints of some complicated chemistry that occurs when simple sugars (glucose, fructose, etc.) react with proteins (and apparently fats too). They’re toxic for a variety of reasons, and trigger an inflammatory response via the receptor for advanced glycation endproducts, or RAGE.<br /><br />It turns out that RAGE binds to a whole bunch of things, and amongst them is the amyloid beta peptide, which is implicated in the development of Alzheimer’s. Amyloid beta is apparently produced via neural activity. I can’t figure out if it has a function or is just a by-product. I suspect it has some function, because the body has a mechanism for achieving a balance in the central nervous system (CNS). One kind of receptor (LRP) causes active transport out of the CNS to the blood, while RAGE triggers transport from the blood to the CNS across the blood-brain barrier. More RAGEs means you’ll have more amyloid beta in your brain. I couldn’t verify this, but I would guess that insulin drives the formation of RAGE. It makes sense, as your body would be preparing for glycation damage (more AGEs) from increased blood sugar, whether the source was food or glucose released due to stress. 
And indeed, diabetics have higher concentrations of RAGE (as do the blood vessels in the brains of Alzheimer’s victims).<br /><br />We learned today that stress actually increases amyloid beta production in the brain, via the action of corticotrophin releasing factor, or CRF. I got in contact with one of the authors of that study and he was nice enough to send me a reprint of the paper. It’s a pretty solid piece of research. Amongst other things, they showed that the more you stress mice, the more amyloid beta is produced. They could introduce CRF directly into the brain, and observe increased amyloid beta production. They could block the action of CRF, stress the mice, and see that less amyloid beta was produced. And finally they could directly block neural activity, and either stress the mice or introduce CRF, and again would see reduced amyloid beta. So it was a pretty solid case, albeit in mice. It would be surprising if humans turned out to be much different, though it’s certainly possible. CRF is released as part of the stress response. It is also released as a result of insulin-induced hypoglycemia, i.e. insulin goes up, blood sugar crashes, CRF pumps out.<br /><br />One last piece of the puzzle: by itself, amyloid beta is soluble, and shouldn’t form solid plaques (or at least should do so slowly). But test-tube experiments show that formation of solid “fibrillar aggregates” of amyloid beta is accelerated if you provide seeds of altered amyloid beta. And what’s one form of the alteration? Glycation damage from sugar.<br /><br />So, less than surprisingly, my hypothesis is that the route to Alzheimer’s mirrors that of heart disease. 
A high-carbohydrate diet leads to the following effects:<br /><ol><li>Increase in density of receptors for advanced glycation endproducts, which leads to increased amyloid beta concentrations in the brain.</li><li>Release of CRF, which increases production of amyloid beta in the brain.</li><li>Damage to amyloid beta, which increases the formation rate of solid aggregates, which may contribute to the formation of the plaques associated with Alzheimer’s.</li></ol>And of course, there’s the usual feedback between stress and diet: psychosocial stress makes you want to eat more carbohydrates, which makes you more stressed, etc.Davehttp://www.blogger.com/profile/18290594860469294453noreply@blogger.com5tag:blogger.com,1999:blog-7721098568390636553.post-11494533077361168122010-05-10T08:30:00.000-07:002010-05-10T13:58:41.193-07:00When Listening to Scientists, Be Sure to Check Their ShoesDuring college, I worked on and off as an intern at IBM Boulder. I remember when I changed departments, to work on management software for the facilities group (whose job it was to keep track of the walls and such - seriously, not as simple as you'd think). One of the senior guys named Tom took me out for my inaugural trip to the coffee machine. Back then we didn't have nice coffee set-ups like many companies do now, just a machine that gave you a little paper cup of battery acid for a quarter. As we approached, there were some people ahead of us at the machine, including one of the managers I had just met. "You're gonna owe me a coffee," said Tom.<br /><br />"Uh, okay," I said, thinking it was some new guy tradition to buy coffee. "Why?"<br /><br />"See that guy?" asked Tom, indicating the manager.<br /><br />"Sure," I replied.<br /><br />"Check his shoes."<br /><br />I dutifully looked at the shoes.
Seeing nothing out of the ordinary, I asked "What about his shoes?"<br /><br />"They're full of shit."<br /><br />I bought the coffee.<br /><br />When you're getting information from scientists or other "experts", there are some good signs that indicate when a shoe check might be needed (to see what they're full of). One of the best is when scientists argue for/against a particular hypothesis by lecturing about the scientific method, rather than presenting actual evidence. Usually this is a bitch-fest about how opponents of their views are unscientific self-interested boobs, while casting themselves as Gandalf on the Bridge of Khazad-dûm (paraphrasing a bit):<br /><br /><blockquote>You cannot pass! I am a servant of the Secret Fire, wielder of the Flame of Science. The dark fire will not avail you, Flame of Dumb-Dumb! Go back to the shadow. You shall not pass!</blockquote><br /><br />Riiiiiight.<br /><br />(Of course, since I spend a good chunk of this blog lecturing about the scientific method, maybe I should check my own shoes :-)<br /><br />I recently came across a couple of excellent examples of exactly this phenomenon, and thought we'd all benefit (and maybe get a good laugh) from checking the shoes of those involved. The first is <a href="http://www.amazon.com/review/R2W7KWZKQY6BGJ/">T. Colin Campbell's "review" of the latest Atkins diet book</a>. I haven't read the book, and am no particular fan of Atkins over any other diet, beyond the fact that it applies well-understood metabolic principles to achieve predictable results. And I won't spend time dissecting Campbell's review. He doesn't say anything that amounts to much beyond the Gandalf quote above (I can't shake this mental image of Campbell on the bridge, wielding a carrot and a handful of wheat against a cow with a platter of bacon on its back).
<a href="http://livinlavidalowcarb.com/blog/t-colin-campbell-urges-action-on-new-atkins-book-says-it-is-very-misguided-on-science/7861">Jimmy Moore already did a great job of chewing up Campbell's argument</a>, so I'll direct you there and to the links within (definitely see also Chris Masterjohn's review of "The China Study", and Campbell's unintentionally humorous reply). I just find it funny that Campbell is lecturing anybody about the scientific method, when he seems to apply it selectively, if at all. For instance, see his discussion about his personal "scientific philosophy" and "holistic" approach in <a href="http://www.cathletics.com/articles/proteinDebate.pdf">The Protein Debate</a>. I think it's pretty clear that Campbell is a conditional fan of the "scientific method," as long as it leads you to conclusions that agree with his own.<br /><br />BTW, if you haven't read <a href="http://www.cathletics.com/articles/proteinDebate.pdf">The Protein Debate</a>, you should. For a long time you had to pay for access, but now it seems to be available for free. Loren Cordain provides a review of a lot of interesting evidence ranging from archaeological to biological, along with tons of references. Cordain has his own axe to grind, of course, so don't be fooled into thinking he's giving the whole picture. But he certainly provides a lot more background (164 references) than Campbell (0 references). Funny that Campbell complained in his Amazon review that Atkins never published a peer-reviewed paper and lectured on the requirement of peer review in "real" science (shoe check), yet provides no references when arguing his own position. Read Campbell's part in the debate for lots of "check his shoes" examples.
Plus it's great fun to see Campbell get handed his own ass - on a platter, with a side of bacon.<br /><br />The second example is a <a href="http://www.sciencemag.org/cgi/content/full/328/5979/689">letter to Science Magazine, entitled "Climate Change and the Integrity of Science"</a>. According to the <a href="http://www.guardian.co.uk/environment/2010/may/06/climate-science-open-letter-nas">guardian.co.uk</a>,<br /><br /><blockquote> <p>A group of 255 of the world's top scientists today wrote an open letter aimed at restoring public faith in the integrity of climate science.</p> <p>In a strongly worded condemnation of the recent escalation of political assaults on climatologists,<a href="http://www.guardian.co.uk/environment/2010/may/06/climate-science-open-letter"> the letter, published in the US Journal Science</a> and signed by 11 Nobel laureates, attacks critics driven by "special interests or dogma" and "McCarthy-like" threats against researchers. It also attempts to set the record straight on the process of rigorous scientific research.</p> </blockquote><br /><br />Wow, 255 scientists including 11 Nobel laureates? That's a lot of shoes to check. And we'll have to check those of Nobel winners twice.<br /><br />The letter actually gets off to a good start:<br /><br /><blockquote>We are deeply disturbed by the recent escalation of political assaults on scientists in general and on climate scientists in particular. All citizens should understand some basic scientific facts. There is always some uncertainty associated with scientific conclusions; science never absolutely proves anything. When someone says that society should wait until scientists are absolutely certain before taking any action, it is the same as saying society should never take action.
For a problem as potentially catastrophic as climate change, taking no action poses a dangerous risk for our planet.</blockquote><br /><br />Clearly you cannot wait until uncertainties are resolved before making choices about how to deal with the possible outcomes of those uncertainties. And in theory, science is all about performing inference in the face of uncertainty, understanding how incomplete information about the world informs beliefs about competing hypotheses. Alas, the letter ruins this excellent start by espousing the opposite course, demanding that we should agree with their "facts":<br /><br /><blockquote><p>Scientific conclusions derive from an understanding of basic laws supported by laboratory experiments, observations of nature, and mathematical and computer modeling. Like all human beings, scientists make mistakes, but the scientific process is designed to find and correct them. This process is inherently adversarial—scientists build reputations and gain recognition not only for supporting conventional wisdom, but even more so for demonstrating that the scientific consensus is wrong and that there is a better explanation. That's what Galileo, Pasteur, Darwin, and Einstein did. But when some conclusions have been thoroughly and deeply tested, questioned, and examined, they gain the status of "well-established theories" and are often spoken of as "facts."</p> For instance, there is compelling scientific evidence that our planet is about 4.5 billion years old (the theory of the origin of Earth), that our universe was born from a single event about 14 billion years ago (the Big Bang theory), and that today's organisms evolved from ones living in the past (the theory of evolution).
Even as these are overwhelmingly accepted by the scientific community, fame still awaits anyone who could show these theories to be wrong. Climate change now falls into this category: There is compelling, comprehensive, and consistent objective evidence that humans are changing the climate in ways that threaten our societies and the ecosystems on which we depend.</blockquote><br /><br />Oh brother, how much self-aggrandizing hyperbole can you pack into two paragraphs? Right off we get the lecture on the scientific method. The authors compare themselves to Galileo, Pasteur, Darwin, and Einstein (such name-dropping is another indication a shoe-check is required). The comparison with other "well-established" theories also needs some examination against the anthropogenic global warming (AGW) hypothesis:<br /><ul><li>The Big Bang (or whatever process created the Universe), formation of the Earth, and evolution have all occurred already. For that matter, so has significant climate change on Earth, without help from human beings. What we don't have is a way of testing specific predictions about the behavior of a very complex nonlinear system, namely that human behavior is the driving force behind the recently observed global temperature variations, and that changes in human behavior can alter the course of future climate change. Big difference.<br /></li><li>The Big Bang, while "well-established" in the minds of physicists, is really only well-established in a semi-dogmatic sense. There are fairly major holes in the theory, in terms of predictive power. The current hypothesis required for getting from a Big Bang event to the Universe observed today ("inflation") has no evidential support - at all. It may be the <span style="font-weight: bold;">best</span> hypothesis we have at this point, but there's plenty of room for it to be supplanted by new information (and it wouldn't require much).
This example is the most appropriate one for comparison to the AGW hypothesis, though for reasons opposite what the authors intended.<br /></li><li>Estimates of the age of the Earth leverage some other very basic "facts", amongst them that statistical behaviors of radioactive elements are observed to be the same every time we look. The nucleus of an atom on the Earth largely can be treated as an isolated system: it doesn't have a whole lot of complex interactions with the environment, in particular there really aren't any nonlinear feedback loops or other dynamical behavior to consider when doing radioactive dating. Inference of the age of the Earth can then be performed with some accuracy, as the relevant "givens" and observations don't admit much uncertainty. By contrast, global climate has many MANY interacting variables, most of which we probably don't even know about yet, and considerable uncertainty underlying the ones we do know about. It is difficult to see how any specific prediction of the <span style="font-weight: bold;">future </span>dynamic behavior of global climate could be as accurate as that for the <span style="font-weight: bold;">past</span> behavior of radioactive elements that have been sitting around in a rock for billions of years.</li><li>Evolution is about as close to a "fact" as you're going to get. First of all, it effectively follows from a combination of the "laws" of thermodynamics (mainly the first and second) and the ability of a system (whether it is a molecule or a complex organism) to a) maintain a relatively narrow set of states against environmental fluctuations, and b) reproduce itself at a rate greater than its destruction. Evolution is just math, in the end. And of course, it is observed repeatedly in the laboratory and Nature. There may be many specific models that predict different evolutionary endpoints, or routes by which currently observed endpoints were achieved.
But the fundamental phenomenon, that mutable self-reproducing systems will evolve, applies to all of these models, and all predictions are necessarily consistent with this "meta-behavior". By contrast, global climate is an instance of a specific system, which we model given what (very little) we know about the intertwined physical, chemical, and biological systems on the Earth, and continued warming is a specific prediction of that model. As climate is a system showing chaotic behavior across many timescales, it may be fundamentally unpredictable, for all practical purposes. So calling this prediction a "fact" is stretching thin even the approximate definition of "fact" made by the authors.</li></ul>The letter goes on to state a variety of "facts" or "conclusions" which the authors imply are more or less incontrovertible, which would seem to contradict their initial points about uncertainty and the scientific method. I think the key problem here (and in most science) is the idea that there is any "conclusion" in science. The only real conclusion is the relative belief in one hypothesis over competing hypotheses, as opposed to a specific identification of "truth". But standard statistics is completely backwards on this point, instead testing whether the observed data are likely <span style="font-weight: bold;">given </span>that a hypothesis is true. It's not the likelihood of the hypothesis being tested, but that of the data. The truth of the hypothesis is assumed in this analysis. So when a scientist finds that the observed data are strongly consistent with their hypothesis, they "conclude" the hypothesis is a "fact". But that ignores both any prior information (similar to "black box" diet studies which don't include knowledge of metabolism in assessing outcomes) as well as competing hypotheses.
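To make the point about weighing competing hypotheses concrete, here's a toy sketch (my own illustration, with invented numbers, nothing from the letter or any real climate analysis) of comparing two hypotheses by their posterior odds rather than testing one hypothesis in isolation:

```python
# Toy Bayesian model comparison. The likelihoods and priors below are
# invented for illustration only -- they don't come from any real study.

def posterior_odds(like_a, like_b, prior_a, prior_b):
    """Posterior odds of hypothesis A over B: (P(D|A) P(A)) / (P(D|B) P(B))."""
    return (like_a * prior_a) / (like_b * prior_b)

# Hypothesis B fits the data at the 99% level...
like_b = 0.99
# ...but hypothesis A fits at 99.9%, and prior information favors it 2:1.
like_a = 0.999
odds = posterior_odds(like_a, like_b, prior_a=2 / 3, prior_b=1 / 3)
print(f"Posterior odds (A vs. B): {odds:.2f}")
```

The point is simply that the comparison is relative: neither hypothesis is "concluded" to be true, one is just better supported than the other given the data and the priors.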
Your pet hypothesis might be consistent with the data at the 99% level, but if mine is consistent at the 99.9% level, and moreover better supported by other prior information, then it is more likely to be true. By not quantitatively assessing competing hypotheses, the authors of the letter are guilty of exactly the sort of "hiding heads in the sand" behavior of which they accuse their detractors:<br /><br /><blockquote>We also call for an end to McCarthy-like threats of criminal prosecution against our colleagues based on innuendo and guilt by association, the harassment of scientists by politicians seeking distractions to avoid taking action, and the outright lies being spread about them. Society has two choices: We can ignore the science and hide our heads in the sand and hope we are lucky, or we can act in the public interest to reduce the threat of global climate change quickly and substantively. The good news is that smart and effective actions are possible. But delay must not be an option.</blockquote><br /><br />I think everybody involved here is "ignoring the science" in one way or another. Threats of criminal prosecution are the sort of idiot knee-jerk response made by politicians, who, incapable of thinking for themselves, blindly follow the "expert du jour". When it turns out the politician made stupid and shortsighted decisions based on "expert" advice, they want to turn on the expert rather than accepting responsibility for acting like an idiot. Physician, heal thyself!<br /><br />But the authors of this letter are no better. AGW proponents seem to ignore the elephant in the living room: the climate is probably going to change at some point whether or not human activity has anything to do with it. If anything is going to doom humanity, it is our anthropocentric view, that we are the masters of the Earth, able to bend Nature to our will.
History shows that environmental conditions are largely unstable, requiring organisms to adapt or die. We clearly should not ignore the possibility of climate change and the effects it will have on human life. But should we focus our resources on trying to force Nature to behave as we wish (and probably failing over the long term)? Or is it better to learn from history, assume that change is coming, and figure out how we will adapt to Nature's whims? I'm guessing the personal goals of the "scientists" align strongly with one of these scenarios, and not so much with the other.<br /><br />And that's the real issue with both examples: the gap between the personal goals of those providing information and the goals of the receivers of that information. I've discussed this before, more in the context of organizations like pharmaceutical companies. But scientists are just as self-interested as any other organism or organization. The personal goals of academic scientists are centered around career advancement and getting funding for research. For both, you need to make some scientific hypothesis and be "right" about it, not necessarily in the sense of having actual evidence quantitatively weighting the hypothesis, but in getting some large chunk of the scientific community to buy in. Achieving said buy-in is the core goal of academic scientists, and whether or not "consensus" is obtained through actual evidence isn't really relevant to the practitioners. They generally think that the consensus so obtained is itself evidence that they're right, but there's circular reasoning and confirmation bias written all over that. When you are evaluating the evidence put forth by a scientist, you not only must evaluate the quality of that evidence, but also the context in which it is presented, because the presenter undoubtedly (and probably unconsciously) re-weights things based on their own beliefs and goals.
The scientist has a vested interest in being considered "right", which can be a lot different than actually being "right". The stronger those beliefs and goals relative to the actual evidence, the more likely you'll hear about "facts" and the "scientific method" as opposed to detailed evidence, both supportive and contradictory.<br /><br />So when a scientist speaks, be sure to check the shoes.Davehttp://www.blogger.com/profile/18290594860469294453noreply@blogger.com23tag:blogger.com,1999:blog-7721098568390636553.post-21254831235906346412010-05-08T13:53:00.000-07:002010-05-08T19:35:14.379-07:00Mother Nature (and Monsanto): Thriving on The Law of Unintended ConsequencesI loved this article: <a href="http://www.nytimes.com/2010/05/04/business/energy-environment/04weed.html?pagewanted=1">U.S. Farmers Cope with Roundup-Resistant Weeds</a>. Here's an excerpt:<br /><br /><blockquote><p>Roundup — originally made by <a href="http://topics.nytimes.com/top/news/business/companies/monsanto_company/index.html?inline=nyt-org" title="More information about Monsanto Co" class="meta-org">Monsanto</a> but now also sold by others under the generic name glyphosate — has been little short of a miracle chemical for farmers. It kills a broad spectrum of weeds, is easy and safe to work with, and breaks down quickly, reducing its environmental impact. </p><p> Sales took off in the late 1990s, after Monsanto created its brand of Roundup Ready crops that were genetically modified to tolerate the chemical, allowing farmers to spray their fields to kill the weeds while leaving the crop unharmed. Today, Roundup Ready crops account for about 90 percent of the soybeans and 70 percent of the corn and cotton grown in the United States. </p><p> But farmers sprayed so much Roundup that weeds quickly evolved to survive it.
“What we’re talking about here is Darwinian evolution in fast-forward,” Mike Owen, a weed scientist at <a href="http://topics.nytimes.com/top/reference/timestopics/organizations/i/iowa_state_university/index.html?inline=nyt-org" title="More articles about Iowa State University" class="meta-org">Iowa State University</a>, said. </p><p> Now, Roundup-resistant weeds like horseweed and giant ragweed are forcing farmers to go back to more expensive techniques that they had long ago abandoned. </p></blockquote><p><br /></p><p>My first reaction on reading this was that Monsanto obviously screwed up. I mean, what idiot couldn't see this coming? But on second thought I'll bet they did see it coming. The later portion of the article discusses how Monsanto and other chemical companies are developing genetically-modified food plants (wheat, corn, soy) to be resistant to other herbicides as well (including one using a component of Agent Orange - mmmmm, Agent Orangey tofu). So of course, farmers will now have to buy additional herbicides, and probably pony up more cash for the next generation of resistant seeds. And you can see that going indefinitely, with the cash register ringing the whole time for Monsanto etc.</p><p>And to be clear: I don't think that companies like Monsanto are doing something evil. They're behaving exactly the way we ask them to in a (more or less) free market economy. They are taking a strategy that maximizes their value (or at least their assessment of it). That strategy may or may not have anything to do with maximizing your health or minimizing environmental impact. If there's any evil here, it's that of complacency on the part of the consumers, who (as a group) hold the ultimate power to change how corporations value their strategy. Corporations are notoriously short-sighted, as demonstrated by how readily many major financial institutions drove their respective buses off a cliff recently.
<a href="http://provisdom.com/">The start-up I used to work for</a> developed a whole set of mathematical and software tools with the idea of allowing public corporations to value long-term strategy in the face of uncertainty. We spent some time studying how corporations actually make decisions vs. how they should, given a way of optimizing value based on whatever they knew (and knew they didn't know). The gap is typically quite large. Corporations, like people, are shortsighted, and much better at rationalizing why they did something after the fact than making a rational decision in the first place.</p><p>The good news is that corporate myopia gives consumers a fairly large lever. If you want corporations to "care" about your long-term health and well-being, be an informed consumer, and make your buying choices to reflect your own goals. It's the "informed" part that's important here.</p><p>I wonder how the course chosen by chemical/seed companies will play out. Maybe something like this:</p><ul><li>Continued increase in spectrum of pesticides, resistance of weeds, and genetic engineering of food crops. At some point, the weeds are basically resistant to anything that won't outright kill humans.</li><li>Companies introduce a genetically modified bug to eat the weeds. New food crops are engineered to produce chemicals that repel the bugs. The insects eventually kill off most of the weeds, but evolve to be resistant to the food crop insect toxins, and start eating our food.</li><li>The cycle continues, introducing ever-more genetically engineered species from higher in the food web.<br /></li><li>Eventually, genetically-engineered humans are produced to act as workers to contain all of the new pest species.
These "humans" are built to thrive on weeds, and as such prove to have considerably greater reproductive fitness than the old-school "natural" humans, whose fate as a species is basically sealed.</li></ul>Davehttp://www.blogger.com/profile/18290594860469294453noreply@blogger.com0tag:blogger.com,1999:blog-7721098568390636553.post-68892225621933187972010-05-07T16:05:00.000-07:002010-05-08T19:54:41.330-07:00Why do you eat grains?That question isn't as smart-assed as it sounds. Bear with me.<br /><br />I've blabbed before as to how I've often asked nutrition experts "What's so healthy about 'healthy whole grains'?" I've never gotten an actual answer, and as far as I can tell the best one could say is "nothing in particular." And while I have discussed the <span style="font-style: italic;">possible </span>ways that grain consumption could lead to disease, I would have to admit that the evidence that grains have some particular disease-causing properties (outside of those with obvious clinically-detectable problems, like celiac) seems more correlation than causation at this point.<br /><br />So I've started rethinking this question more as "why does anybody eat anything?" Clearly the need, at some level, to seek out and consume food has to be innate. And animals evolve amazingly complex behaviors around food. I remember giving my dog an egg for the first time, shell and all. As he does with any food, I expected him to swallow it more or less whole, maybe with a couple of crunches for good measure. Instead, he gently picked it up from his bowl, put it on the ground, and ever-so-delicately cracked it open with his front teeth, then licked out the inside and left the shell.
I'm pretty sure that wasn't a learned behavior, unless he's been climbing trees and getting into robins' nests behind my back.<br /><br />But in general, and probably particularly for omnivores, directed behavior associated with food (like "go find some more of those sweet orange spherical thingies") is learned. Babies put everything in their mouths for a reason: they're figuring out which things are worth seeking out and sticking in their mouths again. You may want to check out this <a href="http://wholehealthsource.blogspot.com/2010/05/traditional-preparation-methods-improve.html">fascinating paper on the topic</a>. The short version is this: there seem to be two main areas of the brain associated with taste. The primary taste cortex handles the innate sensing of taste: sweet, salt, bitter, sour, and umami, along with the texture and viscosity of food (to sense fat), temperature, capsaicin, etc. The response of the primary taste cortex is NOT attenuated by satiety. Something sweet tastes just as sweet whether you're hungry or full. But the primary taste cortex doesn't assign value to a particular taste, i.e. it does not decide whether something tastes "good" or "bad". That's the job of the secondary taste cortex. It is the secondary taste cortex that "decides" sweet things taste good when you're hungry, but not so much after eating a whole box of candy. Secondary taste cortex neurons learn what's good and what isn't, and are further tuned to specific foods. For instance, you can be fed to satiety with fat, and certain neurons will decrease their response to further fat. But the response of those same neurons to the taste of glucose does not decrease, regardless of whether or not you're full of butter.
In other words, "there's always room for dessert".<br /><br />Anyway, let me get to the punch-line from the closing paragraph:<br /><br /><blockquote>The outputs of the orbitofrontal cortex reach brain regions such as the striatum, cingulate cortex, and dorsolateral prefrontal cortex where behavioural responses to food may be elicited because these structures produce behaviour which makes the orbitofrontal cortex reward neurons fire, as they represent a goal for behaviour. At the same time, outputs from the orbitofrontal cortex and amygdala, in part via the hypothalamus, may provide for appropriate autonomic and endocrine responses to food to be produced, including the release of hormones such as insulin.</blockquote><br /><br />In other words, the external response to food (behavior) is a <span style="font-style: italic;">learned </span>response driven by the secondary taste cortex, while the internal response (e.g. hormonal) is innate, originating in the primary taste cortex. That means that you learn what things taste "good" by the secondary taste cortex integrating feedback (positive and negative) from the rest of the body (primary taste cortex, glucose sensors, etc.), reinforcing or weakening the association of that taste with the behavior that led to those stimuli. So the fact that you "like" potato chips is intimately tied up with the impulse to get off the couch at midnight and stumble into the kitchen to finish off the bag. And the only reason you "like" any food is because your brain learned to, associating the flavor with some feedback signals which it interprets as being associated with a net positive outcome.<br /><br />One other point which is probably obvious, but important: the shorter the time between the flavor stimulus and relevant physiological response, the stronger the change in association with the behavior. Thus, getting cancer 10 years after eating a poisonous plant is not very helpful in weakening that behavior.
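As a rough illustration of that last point (my own toy model, not anything from the paper), you can sketch the learned "value" of a flavor as an association that gets nudged by feedback whose impact decays with the delay between the taste and the physiological consequence:

```python
# Toy sketch of delay-discounted reward association. The learning rate
# and time constant are invented numbers for illustration only.
import math

def update_value(value, feedback, delay_hours, learning_rate=0.5, tau=1.0):
    """Nudge a flavor's learned value toward the feedback signal,
    weighted by an exponential discount on the feedback delay."""
    weight = learning_rate * math.exp(-delay_hours / tau)
    return value + weight * (feedback - value)

v = 0.0
# Blood sugar rises minutes after eating: strong positive reinforcement.
v = update_value(v, feedback=1.0, delay_hours=0.1)
# A negative outcome ten years later barely moves the association at all.
v = update_value(v, feedback=-1.0, delay_hours=10 * 365 * 24)
print(f"learned value: {v:.3f}")
```

In this sketch the prompt reward dominates, and the distant negative outcome is effectively invisible to the learning process, which is exactly the asymmetry described above.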
It is certainly possible to crave something that produces a strong short-term reward, but has a net negative outcome. The brain (both consciously and unconsciously) is notably short-sighted in its assessment of value.<br /><br />Which brings me back to the original question: why do people eat grains? And I don't mean that as implying there's some moral judgment to be made - food morality is just another religion. And there's obviously a spectrum of answers depending on the temporal proximity of the act of eating to a specific endpoint. On one end is "prepared properly, they taste good" (I like sourdough toast dripping in butter as much as the next guy, though I eat it rarely). On the other end is the <a href="http://www.paleonu.com/panu-weblog/2009/11/27/health-and-evolutionary-reasoning-the-panu-method.html">evolutionary argument so brilliantly put forth by Kurt Harris</a>, basically that the net effect of domesticating grains was an advantage in reproductive fitness over hunter-gatherers, regardless of the relative "health" of those doing the reproducing. Evolution cares about making babies, and doesn't care if you have bad teeth and a bum ticker, as long as you contribute genes to more babies than the guy still killing perfectly serviceable beasts of burden with a rock on a stick.<br /><br />No, I'm interested in the middle area (logarithmically speaking), which is why we <span style="font-style: italic;">learned </span>to like grains. And why do we like them so much that <a href="http://wholehealthsource.blogspot.com/2010/05/traditional-preparation-methods-improve.html">we're willing to go to some amount of trouble to eat them</a>?
Why do I so love sourdough toast and butter, even though it doinks my blood sugar and gives me acne?<br /><br />(Maybe it's the butter - New Zealand makes REALLY good butter.)<br /><br />I have nothing but vague guesses, and am hoping to get some interesting discussion in the comments.Davehttp://www.blogger.com/profile/18290594860469294453noreply@blogger.com49tag:blogger.com,1999:blog-7721098568390636553.post-84229059797411447122010-04-18T08:01:00.000-07:002010-04-20T12:09:17.402-07:00"The Myth of Green Beef": Pseudo-logic in actionCheck out this article: "<a href="http://www.theatlantic.com/food/archive/2010/04/the-myth-of-green-beef/38810/">The Myth of Green Beef</a>".<br /><br />The author, Helene York, provides a wonderfully clear example of "pseudo-logic", reasoning that is technically correct, but based on flawed or incomplete assumptions. Check out this quote:<br /><br /><blockquote>Linked to cardiovascular disease and maligned for its industry's dependence on federal corn subsidies, it now has a reputation as the Hummer of foods—an excessive contributor to environmental ills including climate change, nitrogen blooms, pollution, and depletion of Midwestern aquifers—not to mention E. coli contamination that has sickened and scared thousands.</blockquote><br /><br />Hmmm, sounds like the root of the problem here is the federal corn subsidies. Bon-Appetit Management, where Ms. York is the director for strategic initiatives, ran the cafe at my former company. I know from personal experience that a good chunk of the food provided by Bon-Appetit is made possible by federal subsidies for corn, wheat, and soy. And of course there is a tidal wave of scientific evidence emerging that said foods are more likely the culprit in cardiovascular disease, via the metabolic disturbances they create. The evidence that red meat per se causes any disease has, to my knowledge, never risen above association (the E.
coli issue is a problem with factory-farmed animals, and only then for people whose health is otherwise compromised, maybe from eating "healthy" soy goo and avoiding the sun).<br /><br />Here's another classic:<br /><br /><blockquote>Voluntary rancher fees from an industry association's advocacy program have underwritten pro-meat marketing campaigns, stipends for researchers to raise doubts (but not conclusive evidence) about scientific studies, and dissemination of talking points that are misleading at best. "Reducing intakes of meat and dairy would only lead to hunger," I read recently, and the headline of an industry newsletter stated, "Meat and dairy intakes not linked to climate change." These news items represent a disturbing trend: raise doubts, obfuscate the facts, and misinform.</blockquote><br /><br />Isn't that EXACTLY what Ms. York is doing here? What makes her "facts" better than those she criticizes? Why are her studies more "scientific" than those that contradict her "conclusive" evidence? Talk about confirmation bias. This is the fundamental problem we face when turning scientific information (or more precisely, the lack thereof) into decisions. Humans seem to have a psychological propensity to gravitate toward "absolute truths", and their absolute belief in those truths is motivated more by social and emotional factors than any sort of actual accounting of the evidence. Indeed, people like Ms. York seem to get wound around some sort of moral axle that drives their reasoning process. Beef is "bad" in her world. That's a "fact". Thus beef must be bad for your health and the environment, and at the root of the global economic meltdown, bad hair days, etc. And maybe I'm pessimistic, but I have a feeling that, more than anything, serving beef might be "bad" for Bon-Appetit's bottom line. 
I would guess it is cheaper to sling soy/corn/wheat processed food (where you can reap the benefit of less prep and less annoying middle men sucking off the teat of government subsidies).<br /><br />But let's be optimistic, and presume Ms. York's motives are altruistic, that she really wants to save our hearts and our planet from the evils of a nice juicy steak. Does her reasoning hold water? I believe you would need to take the following assumptions as "facts" to support her conclusions:<br /><ol><li>Human activity causes global warming.</li><li>This warming trend will continue.</li><li>Changes in human activity can reverse the trend.</li></ol><br />This is where we run into trouble. The implication is that we have both a great enough understanding of global climate to make reliable predictions, and further that, even with such detailed understanding, behavior could be reliably extrapolated decades into the future. I'm no expert in global climate, but I know a thing or two about modeling complex systems, particularly in the face of uncertainty about the details. I seriously doubt that global climate models even begin to approach anything beyond a coarse representation of reality. There are plenty of aspects to the problem that we know we don't know, like the response of aquatic life to increased CO2 concentrations. There are significant uncertainties as well, e.g. solar and volcanic activity. And no doubt there's plenty of stuff we don't even know about, the "don't know what you don't know" category.<br /><br />And it gets worse. Climate is basically just another word for weather. I don't know if you've noticed, but it's pretty hard to predict the weather even a week into the future, much less 50 years. And short-term weather modeling is much better understood for the simple reason that when examining a shorter time period, fewer variables are likely to have a large effect (e.g. 
large glaciers don't change enough in a week to affect your forecast significantly). Even so, the weather remains unpredictable, and this unpredictability is intrinsic. Weather is an example of a non-linear system, one which exhibits a phenomenon called <a href="http://en.wikipedia.org/wiki/Chaos#Scientific_and_mathematical_chaos">deterministic chaos</a>. A brief digression might be in order.<br /><br />Consider a simple experiment, say measuring the time it takes a marble to fall from a height of one meter. We call such a system "deterministic" because the equations used to model it have no uncertainty. Given a particular position and velocity for the marble, we can calculate the precise position and velocity an instant later. And this is a good approximation in our experiment. We might introduce a little uncertainty in how our hand releases the marble, some from the measurement of the height, maybe some from air currents, etc. But we can repeat this experiment and get pretty much the same results every time. In other words, small errors in our information about the marble's state translate into small errors in our predictions. The more accurate our information about the marble, the more accurate our prediction of the time to fall one meter.<br /><br />A system exhibiting deterministic chaos is deterministic in the strict sense of the term: given precise knowledge of its state, we can predict exactly what will happen next. But unlike our marble experiment, chaotic systems amplify uncertainty. In other words, even small inaccuracies in your information about the system quickly become large. Worse yet, this amplification is exponential in time, so getting more accurate information might make the system predictable for a slightly longer period, but it's still going to fall apart on you pretty quickly. Chaotic systems are predictable only in principle; in practice your information is never perfect, and predictability drops exponentially with time. 
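This exponential amplification of uncertainty is easy to see with the standard textbook example of chaos, the logistic map. A toy illustration I'm adding for concreteness (not a weather model, and not anything from Lorenz's actual work):

```python
# Toy demonstration of deterministic chaos: the logistic map x -> r*x*(1-x)
# at r=4. The rule is completely deterministic, yet two trajectories that
# start a millionth apart quickly diverge to macroscopic separation.
def logistic(x, r=4.0):
    return r * x * (1.0 - x)

x, y = 0.400000, 0.400001  # initial conditions differing by 1e-6
for step in range(1, 31):
    x, y = logistic(x), logistic(y)
    if step % 5 == 0:
        print(f"step {step:2d}: separation = {abs(x - y):.9f}")
```

Start two trajectories a millionth apart, and within a couple dozen iterations they bear no resemblance to each other. A long-range weather forecast faces exactly the same fate, just with vastly more variables.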
Deterministic chaos as we now think of it was "discovered" by <a href="http://en.wikipedia.org/wiki/Edward_Lorenz">Edward Lorenz</a>, who was modeling (you guessed it) global weather.<br /><br />So, even assuming that the East Anglia boobs, with their lost data and bogus statistical analyses, were "right" about there being a significant increase in mean global temperatures, how does that help us predict the future behavior of a complex chaotic system where our models are incomplete and full of uncertainties?<br /><br />Now when I drop this line of argument during discussions of global warming, the AGW crowd (after a bit of cognitive-dissonance-induced brain paralysis) comes up with something like the following argument: human activity MIGHT be causing global warming, and since the downside has a value which is essentially negative infinity (extinction of the human race), we have to do everything possible to avoid it. Such an argument is more pseudo-logic, in this case by excluding the most likely scenario. AGW arguments center around whether or not the (supposedly) observed warming trend is caused by humans, and extrapolate that to conclude that humans might be able to reverse said trend. But this ignores the most likely scenario, which is that the climate will undergo a significant shift regardless of anything humans have done or will do. Why do I say this is the most likely scenario? Because it has happened many times in the past, and given the chaotic nature of climate, it is unlikely to stay in the current meta-stable state for long (many argue that the rise of civilization was made possible by an unusual relative stability of climate). Arguments such as those put forth by Ms. York completely miss the point. We don't need to be worried about whether eating fewer hamburgers can affect the climate, we need to start hedging our risk that the climate <span style="font-style: italic;">will </span>change regardless of what we do. 
It's the short-sighted thinking and associated bad decision-making of individuals like Ms. York that will doom us, missing that the forest around them is burning down while hugging the tree right in front of their face.<br /><br />But back to the main thread. So the whole global warming argument is bogus. That's about as close to a fact as you're going to get, since it's really just mathematics (climate is chaotic, our knowledge of it is uncertain). Let's wander from math to the realm of science, where we consider evidence. The more detailed assumption underlying Ms. York's proposal is that cattle farming is particularly bad for the environment. She's basically equating cows with global destruction. This raises the question of how the Earth managed to survive millions of years of grazing animals, all of whom presumably had the same basic digestive strategy as modern plant eaters.<br /><ol><li>Possess large gut full of bacteria which can break down cellulose.</li><li>Eat plants, and lots of 'em.</li><li>Bacteria eat the cellulose, make CO2/methane/etc. as by-products.</li><li>Fart voluminously to avoid exploding.<br /></li></ol>I'll close by noting that there's nothing particularly "green" about any large-scale agriculture. Grain-fed cattle are no doubt the caboose on the train to ecological destruction, as cattle inefficiently convert grain into food (compared, say, to a chicken). But that whole process, like most modern agriculture, represents a massive perversion of a natural process, requiring considerable human intervention. And that perversion goes right back to the fields of corn/wheat/soy grown to feed both cows and people. The ecological issues have been described elsewhere (my favorite discussion is in <a href="http://www.amazon.com/Vegetarian-Myth-Food-Justice-Sustainability/dp/1604860804/">Lierre Keith's fantastic book The Vegetarian Myth</a>), but the bottom line is that the sort of large-scale monocultures we see today are an ecological dead end. 
Sooner or later you deplete the resources required to grow specific foods in high densities: soil, water, and petroleum for chemical fertilizers. Feeding people directly with that food instead of "wasting" it on cattle might delay the end a bit, but you still get there, and if history shows us anything it's that making more food just makes more people, who will wind up starving when the train crashes. Thanks a billion, <a href="http://en.wikipedia.org/wiki/Borlaug">Norm Borlaug</a>, for "saving a billion people from starvation" by setting the stage for multiple billions to starve. This is the poster child for why the sort of short-term thinking displayed by Ms. York is to be avoided.<br /><br />So if there's any "myth" to be dispelled here, it's that there's anything "green" about the food industry, which includes the aforementioned Ms. York and her employer, Bon-Appetit Management.<br /><br />And winding back to the usual topic of this blog, we shouldn't forget the health consequences of a diet consisting mostly of processed soy/wheat/corn. There is plenty of evidence from all corners indicating that "diseases of civilization" arise from said foods. I'm still waiting for someone to detail the metabolic pathways by which eating a steak leads to diabetes, cancer, and heart disease. Any takers?Davehttp://www.blogger.com/profile/18290594860469294453noreply@blogger.com10tag:blogger.com,1999:blog-7721098568390636553.post-66124652790321655052010-04-08T19:14:00.000-07:002010-04-08T19:17:30.900-07:00Interview on "Livin' La Vida Low-Carb Show"Just a quick post - Jimmy Moore was kind enough to <a href="http://www.thelivinlowcarbshow.com/shownotes/1770/dave-dixon-is-spreading-a-spark-of-reason-episode-249/">interview me on his "Livin' La Vida Low-Carb Show"</a>. It was tremendous fun, and we got to discuss some interesting stuff. Check it out.<br /><br />I've been trying to get a blog post done on some recent thoughts on the laws of thermodynamics and metabolic regulation. 
Hoped to have it up by the time Jimmy's show aired, but didn't quite make it. Hopefully I'll wrap it up by this weekend - stay tuned.Davehttp://www.blogger.com/profile/18290594860469294453noreply@blogger.com7tag:blogger.com,1999:blog-7721098568390636553.post-11860185980442151702009-12-24T17:28:00.000-08:002009-12-26T09:56:33.090-08:00Review of the SousVide SupremeWow, the last post was in August. Been pretty busy with the day job lately. Odd how it works out - my startup job (<a href="http://www.provisdom.com/">http://www.provisdom.com</a>, still close to my heart), for which I probably worked 10-12 hours a day, still seemed to leave me more time to do things like cook for the family. That's probably because I got to choose which 10-12 hours I worked, rather than having to spend an hour commuting each way and a solid 8+ hours sitting in a cubicle.<br /><br />I have to admit, when I first found out that <a href="http://www.sousvidesupreme.com/">the Eades's world-changing project was a home sous-vide unit</a>, I was a tad disappointed. I was familiar with the concept of sous-vide, being a fan of shows like Top Chef. I also have to admit that in hindsight I really didn't "get it". The brilliance of the SousVide Supreme is that it enables my food habit in the face of my new work regimen. Our family eats meat - lots of meat. And they've become accustomed to it being prepared to a certain standard which is not really possible to achieve on an 8-5 schedule, because I generally get home after 6. To prepare a good roast chicken or steak (accompanied by Bordelaise sauce, the omission of which will lead to family chilliness until remedied) takes at least 2 hours.<br /><br />So it was a lot of crockpot and takeout during the week - until the SousVide Supreme came along. I pre-ordered mine, and awaited it with great excitement. It predictably didn't arrive until I was away from home during the Thanksgiving holidays, which caused me a certain amount of childish angst. 
But I finally got my grubby paws on it, made some righteously tasty food, and am ready to share my initial experiences and impressions.<br /><br />The short version is this: if you're a meat-eater, get one. It's worth every penny.<br /><br />I won't go too much into describing the unit, which has been done many other places. It is a little on the large side - but part of the issue is that our cabinets seem to have been made before appliances were invented. And you can cook an awful lot of food for the size. I made two tri-tips a couple of days ago, total weight five pounds, in a device the size of a bread-maker, and almost certainly with far less electricity than would have been required to achieve the same in my oven (which now seems cavernously inefficient). And I think the success of sous-vide can best be described by one guest's comment after the first bite: "Holy crap".<br /><br />If you don't know, "sous vide" is French for "under vacuum". The sous vide technique involves sealing the food in a vacuum bag and cooking in a water bath with precisely controlled temperature. There are multiple advantages to this approach. First, because the food (typically meat, though other foods benefit as well) is sealed, there isn't much moisture loss. The vacuum seal also ensures the water contacts the entire surface of the food. Water has much higher heat capacity and conductivity than air, so it transfers heat to the food more effectively than the typical radiative/convective (air) transfer which occurs in a standard oven. Once up to temperature, the SousVide Supreme apparently draws about the same power as a 60-watt lightbulb, something like 10x less than a conventional oven, I would imagine.<br /><br />But the real winner is that you set the water temperature to be the same as the final desired food temperature. "Normal" cooking requires a certain amount of precision by the chef. 
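As an aside, the water-versus-air point above can be put into rough numbers with Newton's law of cooling. The heat transfer coefficients and dimensions below are generic ballpark values I'm assuming for illustration; they are not measurements of the SousVide Supreme or any particular oven:

```python
# Back-of-the-envelope: rate of heat flow into food, q = h * A * dT
# (Newton's law of cooling). The coefficients h are ASSUMED ballpark
# values for still air in an oven vs. a circulated water bath.
h_air = 15.0     # W/m^2/K, natural convection in air (assumed)
h_water = 600.0  # W/m^2/K, water bath (assumed)
area = 0.05      # m^2, rough surface area of a small roast (assumed)
dT = 30.0        # K, temperature difference between bath/oven and food surface

q_air = h_air * area * dT      # 22.5 W
q_water = h_water * area * dT  # 900 W
print(f"water delivers {q_water / q_air:.0f}x the heat flow of air")  # 40x
```

The exact ratio depends entirely on the assumed coefficients, but the conclusion is robust: water couples heat into food far more effectively than oven air does.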
One applies relatively higher heat to the meat in an attempt to get the inside "done" before the whole thing turns to jerky. Unless you have a meat thermometer, the whole business is more art than science, because two pieces of meat have different fat/moisture/salt/etc. contents, all of which affect the thermal conductivity and the rate at which "doneness" is achieved. For instance, grass-fed beef typically has much lower fat content than grain-fed, and as a result cooks much faster (and is more rapidly rendered inedible). A thermometer helps, but of course the thermometer only measures the temperature at the center of the meat, at the location inserted, which may be of different size/fat content/etc. than the rest of the meat. I used to have a whole arsenal of techniques and tricks depending on the cut of meat, what it ate, etc.<br /><br />With sous-vide, you just pick your final temperature. The aforementioned tri-tip was done at 128F. Imagine trying to cook a steak at 128F in your oven. Not only would it take forever, but you'd be left with something resembling the bottom of a shoe at the end of the process. Better yet, you can leave the meat in the thing for a considerable amount of time (I'm talking hours) without risking overcooking. For instance, when we had our guests a couple of nights ago, I took out one tri-tip, gave it a shot in the broiler to give it some color (more on this in a moment), and left the other one in while we fed the kids (who hammered a good chunk of the first steak). About an hour later I just pulled out tri-tip #2, browned it up, and served it hot. And it was outrageously good: tender, juicy, and brimming with flavor.<br /><br />And this has proved to be the real winner for me, with my new commuter lifestyle. 
I can drop in some steaks/chicken/chops before I leave for work in the morning, and have fabulous meat ready to eat 10-12 hours later when we all get home, plus a few minutes to heat a pan or the broiler and apply a tasty brown crust. And I'm not kidding about the fabulous. It does take a bit of experimentation with temperature and preparation to really nail it. I'll share a few things I've learned.<br /><br />First is that "doneness" of meat results from a non-trivial combination of time and temperature. If you really want to nerd it up on this topic, check out "<a href="http://amath.colorado.edu/%7Ebaldwind/sous-vide.html">A Practical Guide to Sous Vide Cooking</a>", originating from my alma mater (go Buffs!). The killing of any nasty bugs that might ruin your post-dining experience also results from a similar combination of time and temperature. Anyway, the first thing I tried was a London broil, which I cooked at the recommended 134F on a work day. So it cooked for about 10 or 11 hours at that temp. Was it the best steak ever? No (though my son claimed it was). But it was pretty darned good, a touch dry in texture (a sign it was starting to overcook), but nice and pink in color. Sous vide lesson 1: if you're going to leave your meat in for a long time, lower the temp a bit. The next try was with rib-eyes, done at about 130F, I believe; very nice, though they could have been done lower still. The tri-tips came out great after 6 hours at 128F, and I think I'd drop it to 126F if I were going to leave it in all day.<br /><br />The next couple of tries were with chicken. Both were done as work-day meals, using breasts, legs, and thighs cut up. These were sealed with butter, salt, and pepper, and again cooked for about 11 hours. The first batch I did at the temperature recommended in the SousVide Supreme manual, which I think was 141F. The breasts were a bit dry (I'm a dark meat person by a long shot), but still better than most chicken breasts I'd had. 
The thighs and legs really shined, though: juicy and very flavorful. The next batch I did at 136F, and they were dynamite. We all know the old saw about "tastes like chicken", which I thought was odd, since most chicken I'd had didn't taste like much of anything by itself. Not the sous-vide version, though. Tremendous flavor, and a big hit with the family. The downside: I tried to brown the skin in my stainless steel pan, but for the most part it just stuck, leaving all the tastiness behind. I'm going to try the broiler next, but I think sous-vide lesson 2 is to have a kitchen torch handy for browning. This allows high heat to be locally applied, to minimize the risk of drying out. Believe me, once you've had sous-vide chicken, you're not going to want to take that risk.<br /><br />We also tried pork chops, which I just sealed with salt. The chops themselves were just bulk-package center-cut, a little on the thin side. They came out fantastic, far more succulent and tasty than any pork chop I'd ever had. The bad news is that I again tried to brown in a pan, which dried them up pretty quickly. Sous-vide lesson 3: use thicker cuts of meat to prevent drying when you brown. I'll give it another whirl with some nice thick-cut chops.<br /><br />We've had some nice success with "contrary" cooking, using our new pressure cooker in conjunction with the SousVide Supreme. The "contrary" comes from the fact that things you usually cook fast with the stove or oven are cooked slowly by sous-vide, and things usually done slowly in a crockpot are done quickly (relatively) in the pressure cooker. One example is cheeseburgers topped with pulled pork. I did the burgers for 3 hours at 134F - pretty good, though again I think I could go lower, particularly considering that I'm going to brown them in a pan. The pulled pork takes about an hour under pressure, and I just let the pressure release naturally over another hour or so. 
The combination is fabulous.<br /><br />A better example was inspired by an interview with Heston Blumenthal while he was traveling on the SousVide Supreme tour. He stated that he always did stocks in the pressure cooker, since otherwise the flavors escape. This was a bit of a light-bulb moment for me (which ultimately led to the purchase of the pressure cooker). I would always make stock on the stove, cooking it for about 24 hours. My wife complained bitterly that the smell was driving her crazy because it made her hungry. I think she was having the same insight as Blumenthal. There were some additional issues with cooking stock on the stove. One was the time, which meant that stock had to be done in advance in large batches rather than cooking during the work day. If I didn't freeze the stock (which is something of a hassle), it had a tendency to grow interesting bacteria. The bacteria were at least nice enough to be fluorescent pink so I didn't put us all in the hospital.<br /><br />Now, with the pressure cooker, I can just make stock in parallel while the beef is cooking in the SousVide Supreme. Sauces are a fantastic way to bring variety to meat dishes, and further serve as a vehicle for nutrients that you might not otherwise consume. 
Here's my recipe for beef stock, followed by that for Bordelaise sauce, which I think is the perfect pairing with steak.<br /><br /><span style="font-weight: bold;">Beef Stock<br /></span><br />Ingredients<br /><ul><li>Two or three beef marrow bones, preferably the joint end with lots of cartilaginous goodness attached</li><li>One package of oxtails (about 0.5-1 lb usually).</li><li>0.5 lb sliced beef heart</li><li>3 large carrots, coarsely chopped</li><li>2 sticks celery, coarsely chopped</li><li>1-2 large yellow onions, coarsely chopped</li><li>One bunch thyme (I use one of those little plastic packages of fresh thyme)</li><li>One bunch parsley</li><li>One cup red wine<br /></li><li>4 cups water, plus any extra needed to cover<br /></li></ul>Procedure<br /><ol><li>Pre-heat the oven to 350F.<br /></li><li>In an oven-safe pan over high heat, brown the bones and oxtails on the stove. Throw in the veggies near the end (note that stores now often carry pre-made mirepoix, chopped carrots, celery, and onions, which saves some prep. I use about 4 cups of pre-made when I can get it).</li><li>Put all of this in the oven for 45 minutes.</li><li>Put the thyme, parsley, and beef heart in the pressure cooker. Add the browned meat and veggies on top, along with the water. Add extra water if needed to ensure everything is covered.<br /></li><li>Deglaze the pan with the red wine. I usually use Bordeaux, as (not surprisingly) it seems to match well with the other flavors in the Bordelaise (which originated in the French region of Bordeaux). Make sure to scrape all the brown goodies off the bottom of the pan, and add all of this to the pressure cooker.</li><li>Cook under high pressure for 1.5-3 hours. 3 hours gives the best flavor, but my pressure cooker only times up to 99 minutes. 
If I'm at home, I do two rounds of 99 minutes.</li></ol><span style="font-weight: bold;">Sauce Bordelaise</span><br /><br />Ingredients<br /><ul><li>4 cups beef stock</li><li>2-2/3 cups red wine (again, I like Bordeaux, and it doesn't need to be expensive)</li><li>6 large shallots, coarsely chopped<br /></li><li>One bunch thyme</li><li>8 oz. butter, cubed<br /></li><li>Salt and pepper to taste</li><li>Xanthan gum or other thickener<br /></li></ul>Procedure<br /><ol><li>Reduce the beef stock to 2-2/3 cup.</li><li>Combine wine, shallots, thyme, salt, and pepper in a sauce-pan. Cook until the liquid is reduced to about 1-1/3 cup.</li><li>Strain red wine reduction. A chinois works well for this, and allows you to mash some of the yum-yums out of the solids.</li><li>Combine reduced beef stock and red wine in a sauce pan and bring to a boil.</li><li>Melt in the butter.</li><li>Thicken. I use Xanthan gum, which works well, but is fairly touchy. I add a little at a time, give it a few minutes to cook and see how thick things are, repeating until I get the desired consistency. The traditional recipe uses a flour/butter roux as a thickener, which works fine. I try to avoid wheat, and don't really like the flour taste in the sauce anyway. But if you want to make a roux, you make it first and then add the liquid.<br /></li></ol>This is outrageously good on steak, even more so on sous-vide steak, which retains more beefy flavor and really matches well with the sauce. Use the left-overs for breakfast. Steak and eggs over easy smothered in Bordelaise is a little slice of heaven.<br /><br /><span style="font-weight: bold;">Sous-vide Ice Cream</span><br />Home-made ice cream is another of our favorite treats, often made to go along with our steak and Bordelaise. I use a modified version of Dr. Mary Dan Eades' sugar-free recipe. Making ice cream used to be something of a procedure, since the recipe is custard-based (technically a Creme Anglaise). 
When made on the stove the custard requires constant attention, and you have to temper the eggs, etc. With the SousVide Supreme, you can just mix everything, stick it in a bag, and cook it. Here's the recipe.<br /><br />Ingredients<br /><ul><li>1.5 cup half-and-half</li><li>1.5 cup heavy cream</li><li>2 whole eggs plus 4 egg yolks</li><li>0.25 cup Splenda</li><li>0.5 cup polydextrose</li><li>1 vanilla bean, split and scraped OR 4 T vanilla extract</li><li>Enough ice water to submerge the bag<br /></li></ul>Procedure<br /><ol><li>Preheat the SousVide Supreme to 82C.<br /></li><li>Combine all ingredients in a mixing bowl.</li><li>Pour mixture into a vacuum bag. Make sure you scrape out the polydextrose from the bottom. It doesn't dissolve very well in cold liquid, and has a tendency to congeal into a big clump.</li><li>Vacuum and seal the bag, and place in the SousVide Supreme (note you can do this with a zip-lock by zipping most of the way, submerging in the water bath to squeeze out the air, then zipping completely shut).</li><li>Cook for 20 minutes.</li><li>Remove the bag and squish the contents. It's hot, but I'm able to do this with my bare hands, though you can use oven mitts. Pay particular attention to the polydextrose, which settles to the bottom. It will incorporate better in the hot liquid.</li><li>Return the bag to the water bath for another five minutes.</li><li>Submerge in ice water and squish it around some more. At this point you can either leave it in the ice water to chill, or transfer to the refrigerator.</li><li>Dump in the ice cream maker (if you used vanilla beans, remove the pods first).</li></ol>This is very yummy, as well as easy and fast enough to do on a work night. Polydextrose is soluble fiber, basically polymerized glucose in a configuration that human digestive enzymes can't break down. 
It has some of the same chemical and flavor properties as sugar, and for the purposes of ice cream, lowers the freezing point of water, giving smaller ice crystals and a creamier texture. I get a kick out of telling people the ice cream they're eating is high fiber. It does make some people gassy, though, so adjust the amount as required.<br /><br /><span style="font-weight: bold;">Wrap-Up</span><br /><br />The SousVide Supreme really is revolutionary, particularly if you have a busy work week. Some of the high points:<br /><ul><li>Makes cooking of gourmet-quality meat nearly fool-proof.</li><li>Tremendously simplifies cooking of certain dishes (compare the ice cream procedure above with what is normally required for a Creme Anglaise).</li><li>Low electricity usage compared to an oven.</li><li>Very well engineered (the universal bag rack is something to marvel at, no doubt requiring spatial thinking skills that are well beyond my capability).</li><li>Food can be cooked in advance, shocked in ice, and frozen. Reheat to the perfect temperature in the SousVide Supreme.</li><li>Meat can be left in for an extended period without overcooking.<br /></li></ul>In theory, you should be able to use cheaper cuts of meat. I haven't had a chance to try this yet, but chuck roast sous-vide is next on the menu. I'll let you know how it turns out. In fact, what I really want to try is grass-fed chuck, which in theory would be downright inedible when cooked by normal means. The SousVide Supreme is a tad pricey on the face of it, but I found the benefits to be well worth the money.Davehttp://www.blogger.com/profile/18290594860469294453noreply@blogger.com11tag:blogger.com,1999:blog-7721098568390636553.post-22372895869333839862009-08-09T08:23:00.000-07:002009-08-10T07:14:54.659-07:00A GUT Feeling about InsulinAsk ten people how to lose weight (fat), and you'll likely get ten different answers. 
In fact, if you ask ten "experts" the same question, you'll probably also get ten answers (usually attached to some product or service requiring you to part with some money). Why all of the confusion? After all, it seems a fairly simple question at its base: how do you burn more fat than you store?<br /><br />I believe there are two key failures in critical thinking underlying the confusion. The first is the notion that obesity itself is a "disease", which needs to be "cured". Many other diseases (heart disease, cancer, etc.) are <span style="font-style: italic;">associated </span>with obesity, and the prevailing thought is that curing obesity reduces risk for these other diseases. However, this ignores the mountain of evidence that an organism's metabolism is self-regulating. In this view, obesity is a symptom of some underlying disease process which causes systemic failure of metabolic regulation. It is this underlying disease which needs to be fixed; further, it is possible that you can have this disease and not be obese (there are plenty of skinny Type II diabetics). Modern medicine is very skilled at treating symptoms and ignoring the root cause; indeed, this effect is rampant for obesity treatments. How many people do you know that have lost large amounts of fat, only to have it come back worse?<br /><br />The second failure comes from "black box" thinking. When hearing various prescriptions for curing obesity, I'm reminded of a <a href="http://star.psy.ohio-state.edu/coglab/Miracle.html">famous Sidney Harris cartoon</a>. For instance, a friend was recently telling me about a lemon juice diet. You drink lots of lemon juice, and the fat miraculously flows out of the fat cells. This supposedly had something to do with changing the acidity of your blood, but of course when prompted this person couldn't supply any actual physiological mechanism to explain this effect.<br /><br />To understand the problems with black-box thinking, we can use the example of, uh, a black box. 
It has a hole where you can put stuff in, and lots of different colored lights that blink in response to whatever you provide as input. Your job is to figure out the rules of how the input relates to the blinking lights. As we try different things we find many patterns of colored lights, but no obvious rules. For instance, we supply two different cube-shaped objects, but each elicits a different light pattern. So "cubiness" is apparently not relevant to the lights.<br /><br />The behavior of our black box may appear complex, but we don't really know if it's inherently complex, or if we just lack enough information to tease out the rules. We might crack the box open and examine how it actually works, and find that there really is a simple rule at the core, e.g. specific lights turn on depending on the molecular composition of the input. The rule turns out to be simple, but it's the variety of different inputs that results in apparently complex behavior. Once you know how the box works inside, it becomes relatively easy to predict its response to a given input.<br /><br />Notice that most studies on diet and health take the black box approach: they diddle some inputs, and observe how those inputs are associated with the outputs (e.g. fat loss). But if you don't have some understanding of what's going on inside the box, you just wind up with a mass of confusing observations and associations. So the lack of consensus and the mercurial nature of dietary recommendations should come as no surprise.<br /><br /><span style="font-weight: bold;font-size:130%;" >Unification and Symmetry</span><br />Science often faces such situations. The core difficulty is a lack of symmetry. Symmetry means "sameness in the face of change". A perfectly smooth cue ball will look the same no matter how you turn it. 
Paint some dots on the ball, and you break the symmetry.<br /><br />We often encounter cases where our observations seem to reflect a lack of symmetry, but if we look hard enough we find a deeper symmetry, one that unifies our observations under a common model. Such was the case in particle physics in the 20th century. Physicists had observed a vast zoo of different particles, first in cosmic rays (high-energy particles from space), then in "atom smashers". There were also four apparently disparate "forces" of nature: electromagnetic, weak nuclear, strong nuclear, and gravitational. The drive (which continues today) was to unify these different things by identifying the underlying symmetry. A "grand unified theory" (or GUT) would explain all subatomic phenomena with a single model. Some progress has been made, e.g. many of the different particles were found to be composed of a much smaller family of more fundamental particles called <span style="font-style: italic;">quarks</span>. The electromagnetic and weak nuclear forces (the latter causes radioactive decay) were discovered to actually be one and the same, the apparent difference occurring because the universe is relatively cold.<br /><br /><span style="font-weight: bold;"><span style="font-size:130%;">A Unified Theory of Fat Storage<br /></span></span>Can we find a corresponding unifying principle for how fat loss and gain are related to diet? I think the answer is a qualified "yes". We likely need to restrict the domain of our model to one where the observed effect (obesity) has a common cause. Metabolic regulation is complex, and excess fat storage can have multiple root causes. We'll focus here on one possible cause, because it appears to be common and becoming more so: too much insulin, and/or not enough sensitivity to that insulin. Insulin is arguably the boss hormone for metabolic regulation: it affects many systems, and is itself affected by many factors. 
By examining the effect of insulin both on the behavior of individual cells and at the level of global metabolic regulation, we can in effect "open the box": see how inputs affect insulin and insulin response, then follow the effects of insulin in the body, particularly on fat storage.<br /><br />I am going to make the bold claim that insulin is the unifying factor, tying together many different observations about fat gain/loss. I intentionally said "many" instead of "all", because there are other metabolic pathways influencing fat storage (e.g. increased adrenaline promotes release of fatty acids from fat cells). I'll make the further claim that just about any successful reducing strategy (one that results in fat loss) can be explained by its effects on insulin, whether that strategy involves diet, physical activity, drugs/supplements, or a combination. We should also be able to explain the relative efficacy of different strategies both in terms of rate of fat loss and final equilibrium fat mass (e.g. many diets result in fat loss, but all seem to "stall" at some point; we should be able to explain this stall via our model). Some examples are given below.<br /><br />Our Grand Unified Theory then provides a more solid foundation for discussing the relative merits of different reducing strategies, and more importantly for making decisions about which lifestyle modifications are most appropriate. Instead of sifting through piles of observational evidence and "expert" testimony, you simply ask two questions:<br /><ol><li>Is my obesity insulin related? (The answer is probably "Yes" for most, but not all. Those whose obesity has some other cause, like a genetic leptin disorder, will need to seek other avenues of treatment.)</li><li>How does X affect my insulin? 
From here you should be able to make a more informed decision about whether or not to pursue X for fat loss.</li></ol>Perhaps more importantly, by moving the focus from a symptom (obesity) to an underlying cause, we can begin to recognize that controlling insulin should have wide-ranging implications for health (insulin does many things beyond controlling blood sugar and fat storage).<br /><br /><span style="font-weight: bold;"><span style="font-size:130%;">A Brief Primer on Insulin</span></span><br />The effect of insulin on fat storage has been covered elsewhere in detail, most notably in Gary Taubes' book <a href="http://www.amazon.com/Good-Calories-Bad-Controversial-Science/dp/1400033462/"><span style="font-style: italic;">Good Calories, Bad Calories</span></a>. But it is probably worthwhile to hit the high points again. Note also that insulin does not act in isolation, but plays an intricate dance with other hormones and the nervous system. Some of these relationships are covered <a href="http://sparkofreason.blogspot.com/2008/07/energy-regulation-2-appetite.html">here</a>.<br /><br />Insulin is a protein (you can see a computer-generated representation <a href="http://en.wikipedia.org/wiki/File:InsulinHexamer.jpg">here</a>). Like all proteins, there is a gene that encodes the particular sequence of amino acids for manufacturing insulin. One of the interesting facts about insulin is that its structure is remarkably consistent across time and species. Thus, species which appear genetically divergent, like humans and <a href="http://oceanexplorer.noaa.gov/explorations/lewis_clark01/logs/jul08/media/r609hagfish_532.jpg">hagfish</a>, do make different forms of insulin and the insulin receptor, but they're more similar than different: human insulin has a large degree of cross-reactivity with hagfish insulin receptors, and vice-versa. 
So insulin has been around a long time, and the relative lack of cross-species mutation is an indication of its key role in the survival of an organism.<br /><br />The effects of insulin are initiated when an insulin molecule binds to an <a href="http://en.wikipedia.org/wiki/File:PBB_Protein_INSR_image.jpg">insulin receptor </a>at the surface of a cell membrane. This binding triggers a series of chemical reactions, generally culminating at the cell nucleus, where genes are either up-regulated (meaning they make more of some protein) or down-regulated. Most people are familiar with the role of insulin in controlling blood sugar. One major effect of insulin binding is the manufacture of glucose transport (GLUT) proteins, which move glucose out of the blood, across the cell membrane, and into the cell. But insulin has many other effects. It is mitogenic, which means that it promotes cell division (i.e. insulin is a growth hormone). Insulin plays a key role in the manufacture of cholesterol from glucose, both by up-regulating transport of glucose into the cell and by controlling manufacture of HMG-CoA reductase, an enzyme required to transform HMG-CoA into mevalonate, a cholesterol precursor (side note: statins work by inhibiting HMG-CoA reductase). And there's a pile of other functions as well.<br /><br />When insulin binds to an insulin receptor, it not only causes a chemical signal to be sent. The entire insulin/receptor complex is also absorbed by the cell (endocytosis), removing the insulin from circulation. A condition in which there is too much insulin in the blood (hyperinsulinemia) could thus result either from too much insulin being produced in the pancreas, or from a relative lack of insulin receptors. 
Correspondingly, insulin resistance (the failure of cells to respond to the insulin signal) could result from a lack of insulin receptors, a failure in the chemical signal chain, or from some other molecule (like a lectin) physically blocking the insulin receptor.<br /><br />We should also realize that insulin does its thing via its effect on <span style="font-style: italic;">genes</span>. Genetic differences can thus imply different responses to insulin. Genes carry the code to manufacture proteins, and a rather small difference in gene activation by insulin can result in large visible differences between individuals. This is particularly true for fat storage. We'll see below how insulin triggers manufacture of lipoprotein lipase (LPL), which is necessary for fat storage. A small difference in the amount of LPL made in response to insulin results in a small difference in the net amount of fat stored. But whether that small difference results in net negative or positive storage could determine whether or not an individual will become obese.<br /><br />On to the point. Insulin controls fat storage primarily through three pathways:<br /><ol><li>Up-regulation of lipoprotein lipase (LPL)</li><li>Down-regulation of hormone sensitive lipase (HSL)</li><li>Up-regulation of glucose transporters.</li></ol>The basic unit of fat is a fatty acid. Fatty acids are not water soluble, as anyone who has tried to mix oil and water knows. Blood is mostly water, and having fat droplets wandering around your blood vessels is not good. So fats need some other water soluble molecule to transport them around in the blood. Individual fatty acids can be transported bound to a molecule of albumin, but this mostly occurs for fatty acids released from fat cells. Dietary fats and those made in the liver are carried mostly as triglycerides in large molecules called lipoproteins. Triglycerides are also the storage form of fat in fat cells. 
A triglyceride is composed of three fatty acids stuck to a glycerol backbone.<br /><br />Triglycerides are too large to pass across the cell membrane. In order for fatty acids to get in/out of a fat cell, they must be freed from the triglycerides. Enzymes which perform this task are called <span style="font-style: italic;">lipases</span>. Lipoprotein lipase (LPL) acts on lipoproteins in the blood to free fatty acids for transport into the fat cells. Hormone sensitive lipase (HSL) acts on triglycerides inside the fat cell, freeing fatty acids for transport out of the fat cell. The precise mechanism by which the fats actually make it across the cell membrane isn't entirely clear. Cell membranes are largely made of fatty acids themselves (in the form of phospholipids), so it's likely that free fatty acids passively diffuse across the cell membrane (whereas water soluble substances, like glucose, generally require the help of a transport molecule). There is also evidence of fat transporter molecules, though these may be more important in cells like muscle that may need energy faster than can be supplied by passive diffusion.<br /><br />The fatty acids inside the fat cell, regardless of their origin, are candidates for <span style="font-style: italic;">esterification</span>, which just means they can be incorporated into triglycerides. This in turn requires a supply of glucose to manufacture the glycerol backbone (actually a molecule named glycerol-3-phosphate, or alpha glycerol phosphate; we'll use G3P). Insulin is necessary to effect transport of glucose from the blood to the inside of the fat cell, and also up-regulates a key enzyme (G3P dehydrogenase) required to form G3P from glucose.<br /><br />Insulin increases LPL and decreases HSL. The relative concentrations of fatty acids inside and outside of the fat cell are thus governed by insulin, as well as by the availability of lipoproteins in the blood. Fatty acids tend to move from high concentration to low. 
If insulin is low, HSL activity is increased, and fatty acids tend to build up in the cell and diffuse out into the blood. If insulin is high, LPL activity is increased, and fatty acids build up outside the cell and tend to move in. Once the fatty acids are inside the cell, insulin governs the relative rate at which fat is stored, not only through HSL, but also by affecting glucose transport and regulating G3P dehydrogenase.<br /><br />There are other metabolic pathways which affect this process. Some, like <span style="font-style: italic;">de novo lipogenesis</span>, are also regulated by insulin. Others, like <a href="http://sparkofreason.blogspot.com/2008/06/swift-kick-in-asp.html">acylation stimulation protein</a> (ASP), appear to be independent of insulin. There are ongoing arguments as to the relative importance of the various pathways, but I think the evidence is pretty clear that insulin is king of the hill when it comes to fat storage. For instance, Type I diabetics, who make little or no insulin, basically lack the ability to store fat. If ASP were important in humans, Type I diabetics should be able to store plenty of fat (since one of the symptoms of Type I diabetes is ravenous hunger, I think we would have observed this). Any Type I diabetic who injects insulin, however, is familiar with the "fat pad" that forms at the injection site, due to (ta da) the high concentration of insulin in that area.<br /><br />So, lots of concepts and big words in the above. The takeaway is simple: more insulin means fat cells store fat; less insulin means fat cells release fat. The equilibrium point (at which you're neither storing nor releasing) is thus largely determined by average insulin levels. We should then be able to predict the effect of various lifestyle changes from their effect on insulin. 
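<br /><br />To make the takeaway concrete, here is a toy numerical sketch (my own illustration, not from any study; the functional forms and constants are entirely invented): treat the fat cell's net storage rate as an insulin-driven uptake term minus an insulin-suppressed release term, and iterate until the two balance.

```python
# Toy model, purely illustrative: invented constants, arbitrary units.
# Storage rises with insulin (LPL and glucose transport up-regulated);
# release falls with insulin (HSL down-regulated).

def net_storage_rate(insulin, fat_mass):
    storage = 0.8 * insulin                      # LPL-mediated uptake
    release = 0.5 * fat_mass / (1.0 + insulin)   # HSL-mediated release
    return storage - release

def equilibrium_fat_mass(insulin, fat_mass=10.0, steps=10000, dt=0.01):
    # Simple Euler iteration until storage and release balance out.
    for _ in range(steps):
        fat_mass += dt * net_storage_rate(insulin, fat_mass)
    return fat_mass

for avg_insulin in (1.0, 2.0, 4.0):
    print(avg_insulin, round(equilibrium_fat_mass(avg_insulin), 2))
```

Whatever the particular numbers, the qualitative behavior is the point: raise average insulin and the balance point shifts toward more stored fat; lower it and the same model "releases" fat until a new, lower equilibrium is reached.<br /><br />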
Let's see how that works out for some commonly recommended reducing strategies.<br /><br /><span style="font-weight: bold;"><span style="font-size:130%;">Low Carbohydrate Diet<br /></span></span><br />This ought to be a no-brainer. Of all macronutrients, carbohydrates have the largest direct effect on insulin levels. Protein also stimulates a little insulin release, but nothing like the quantities of readily available carbohydrate (dietary protein also stimulates release of the hormone glucagon, which tends to counteract insulin's effect of driving glucose from the blood into fat cells, thus reducing fat storage). By itself, fat does not stimulate insulin release (in fact it seems to decrease it mildly). But fat does cause release of hormones like CCK, which amongst other things cause the pancreas to release more insulin for a given stimulus of glucose or amino acids (this is called the "incretin effect"). So eating fat and refined carbohydrates together (which is most food in the Western diet) ought to really crank your insulin. High average insulin means more fat storage - look around any public place if you want to see this in action.<br /><br />Conversely, removing carbohydrates from the diet should drastically reduce average insulin levels (unless you have some non-dietary problem, like an insulin-producing tumor, in which case you've got bigger problems than being fat). The decrease in insulin should move the body away from fat storage to fat release. Since this fat is now available for energy, appetite should decrease and/or activity should increase spontaneously. All of these effects have been observed repeatedly in both animal and human studies.<br /><br /><span style="font-weight: bold;"><span style="font-size:130%;">Low Calorie Diet (Starvation)<br /></span></span>Suppose we just cut calories across the board. Say your nominal caloric intake was 2400 kcal/day, including an average of 300g of carbohydrates. 
Leaving fructose out of the equation (fructose does not directly stimulate insulin release, but does cause the liver to become temporarily insulin resistant, the net effect of which may be to increase average insulin levels), that's equivalent to about a cup and a half of sugar each day (the gut rapidly breaks down "complex carbohydrates" into glucose for absorption into the blood). Since the total amount of glucose in a normal person's blood is about 1 tsp, this 1.5 cups should have a drastic effect on average insulin levels, as the body works very hard to keep blood glucose in a narrow range (too much or too little glucose in the blood will kill you in a hurry).<br /><br />Now, let's not change what we eat, just how much. We'll go from 2400 kcal/day down to 1600 kcal/day. Since that one-third cut applies proportionally across all macronutrients, we're now eating 200g of carbohydrate per day, so average insulin levels should drop significantly. Again, this should result in fat loss, since we've decreased insulin from the level that promoted our previous equilibrium. And that's precisely what's seen: starvation diets result in fat loss. However, that 200g of carbohydrate still promotes a fair amount of insulin secretion. We would thus expect initially rapid fat loss, tapering off over time, and finally stalling at the new equilibrium point. And once the fat stops coming out of the fat cells, your body is literally starving, and will likely make you fall off the wagon, so to speak. As your body has become used to lower levels of insulin (i.e. your insulin sensitivity has increased), resuming previous levels of carbohydrate and fat consumption should result in rapid weight gain, overshooting your previous equilibrium point. 
Which, again, is exactly what is seen.<br /><br /><span style="font-weight: bold;"><span style="font-size:130%;">Low Fat Diet<br /></span></span><br />The low-fat diet is an interesting case, and what is called "low-fat" often involves both calorie restriction and the trading out of refined carbohydrates for more whole food sources, which tend to have less effect on blood sugar and thus insulin. Both of these effects, of course, will drop your average insulin and result in some fat loss. The interesting thing here is that a reduction in dietary fat should also reduce secretion of incretin hormones like CCK, and thus further reduce insulin. So low-fat diets "work", as is often observed. In fact, I would predict that they work better than just generically cutting calories, though I don't know if this has been observed. The confusion most people have is the idea that eating fat makes you fat, from which they erroneously conclude that reducing fat makes you thin. But all of this action is ultimately effected by insulin.<br /><br />And that's the rub, because it means it is difficult (and probably unhealthy) to eat low-fat forever. If you don't eat much fat, then you need carbohydrates for energy (using too much protein for energy results in nitrogen poisoning). If you get those carbohydrates from the usual sources, like bread, rice, or pasta, your insulin will go up, and you'll get fat again, whether you eat fat or not (note that excess dietary carbohydrate is converted to fat by the liver). Successful maintenance of a low-fat diet means getting carbohydrates from sources which are slowly digested, and/or maintaining a high enough level of physical activity to burn off excess glucose and enhance insulin sensitivity (more on this below).<br /><br /><span style="font-weight: bold;"><span style="font-size:130%;">Physical Activity<br /></span></span><br />We've all heard the old chestnut that to effect fat loss you just need to "eat less and exercise more". 
We've seen above how calorie reduction can affect insulin levels. But does exercise do the same thing?<br /><br />Interestingly, the answer is a qualified "Yes". Let's start with an extreme case (which, as it turns out, forms the basis for the very successful "slow burn" type exercise regimens). Muscle stores glycogen, a form of starch, for use as quick energy. The glucose to make that glycogen gets into the muscle cells via the action of insulin. In the case of muscle cells, insulin stimulates the cell to move a preformed store of GLUT4 molecules to the surface, so glucose can be rapidly absorbed from the blood. Now suppose you completely exhaust the muscle of its glycogen stores. What do you suppose its response will be?<br /><br />Not surprisingly, the cell cranks out more insulin receptors in an effort to rebuild its energy stores. After all, you might need that quick energy to escape the next hungry lion that crosses your path. So exercise increases the insulin sensitivity of muscle, and we learned above that when insulin binds to an insulin receptor the cell absorbs the whole complex. So, independent of diet effects, we expect exercise to reduce average insulin levels; further, in doing so, the muscles also clear out some glucose. Both of these effects should lead to some degree of fat loss. Any increase in net physical activity should produce this effect to some degree. Your muscle cells will only make insulin receptors if they need to. If you start as a total couch potato, and then start walking a mile a day, your muscles need to adapt to even this small increase in activity (walking a mile burns about an extra 100 kcal).<br /><br />And of course, that's what people see. How many friends have you known who started a new exercise regime and rapidly lost some weight? This is often accompanied by pronouncements like "I can eat anything I want, as long as I exercise enough". 
That's true, at least to the point where the new fat storage/release equilibrium is reached, at which point fat loss stops. Since the individual is no longer getting the positive feedback of fat loss for their physical exertion, they usually cut back or quit, but continue eating "anything I want", and of course just get fat again.<br /><br />And all of this ignores the elephant in the living room, which is overall metabolic regulation. If you use more calories than are totally available to you from food and storage (remember that high insulin makes stored fat unavailable), you should get hungry. Further, the body knows what it wants, and will try very hard to make you eat it. If you burn up the muscles' store of carbohydrate, the resultant temporary increase in insulin sensitivity will drop your blood sugar. Your brain senses that drop, and tells you to go eat some carbohydrates. People often "reward" themselves with a food treat after a workout, or maybe have a sugary energy drink or similar. Of course, this tends to defeat whatever gain in insulin sensitivity your exercise created.<br /><br /><span style="font-weight: bold;"><span style="font-size:130%;">The Challenge<br /></span></span><br />The examples above, I believe, illustrate the explanatory power of the insulin hypothesis, bringing many approaches which seemed disparate or opposed (like low fat vs. low carb) under a single explanation. My challenge to you, O Gentle Reader, is to provide counter-examples. Are there fat-loss strategies that cannot be explained by the insulin model? 
Give it your best shot in the comments.Davehttp://www.blogger.com/profile/18290594860469294453noreply@blogger.com35tag:blogger.com,1999:blog-7721098568390636553.post-55810253857900183732009-06-06T09:07:00.000-07:002009-06-06T09:49:43.113-07:00Your Elephant Stepped on my Coffee TableTake a look at this press release: <a href="http://www.bidmc.org/News/InResearch/2009/June/POMCNeurons.aspx">http://www.bidmc.org/News/InResearch/2009/June/POMCNeurons.aspx</a><br /><br />The summary is this: genetically leptin-resistant mice will become obese and develop Type II diabetes. These researchers restored leptin sensitivity in the pro-opiomelanocortin (POMC) neurons in the arcuate nucleus (ARC), an <a href="http://sparkofreason.blogspot.com/2008/07/energy-regulation-2-appetite.html">area of the hypothalamus involved with energy regulation</a>, including appetite and blood sugar control. As a result, the mice both lost fat AND spontaneously increased their level of activity. They did not lose fat because they were exercising; they were exercising because they were losing fat.<br /><br />Now contrast that to the prevailing view of obesity and (supposedly) related health issues like diabetes: you're a lazy slob, sit on the couch, eat too much, and therefore become fat and diabetic. Gary Taubes' "Good Calories, Bad Calories" laid the foundation for challenging this hypothesis, drawing on decades of research showing that energy regulation is governed by an intricate dance of hormones and the central nervous system. In this view, people overeat because they're becoming fat as a result of some malfunction in this system; correspondingly, lean people are more active for the same reason.<br /><br />This latest piece of research supports the hormone hypothesis. Leptin plays a key role in energy regulation, and is manufactured by fat cells depending on how much fat they contain. More fat, more leptin. 
Amongst other things, leptin acts on the brain to turn off appetite, i.e., when you've stored up enough energy, stop eating. It is further hypothesized that the ARC may play a role in blood glucose control, e.g. providing CNS signals to the liver to regulate glucose manufacture. This role is certainly supported by the research linked above.<br /><br />The key question becomes what causes the ARC to become leptin-resistant. The authors seem to completely miss this, instead gushing about "novel drug targets" (i.e. $$$). There are plenty of clues lying about, however. Stephan at Whole Health Source notes that leptin resistance precedes insulin resistance in the development of Type II diabetes. So what causes leptin resistance? Apart from genetic defects, this is an open question, but a reasonable conjecture would be wheat germ agglutinin (WGA), a kind of protein called a <span style="font-style: italic;">lectin </span>which is found in grains. Lectins like WGA have the annoying capability of binding to hormone receptors. This is all the more annoying because they can avoid protease enzymes in the digestive system and pass into the blood intact (most proteins are broken into amino acids, as loading up your body with intact foreign proteins is bad juju).<br /><br />WGA is so effective at binding hormone receptors that scientists regularly use it to study them. For instance, they'll tag the WGA with a radioactive substance and then see where it winds up sticking on a cell. Neurotransmitters are basically just hormones released into the neuronal synapse, and scientists use WGA to study how things are transported in the brain. So, WGA a) binds to leptin receptors and b) wanders around your brain. And what does WGA do when it locks into your leptin receptors? Unknown, <a href="http://wholehealthsource.blogspot.com/2008/04/leptin-and-lectins-part-iii.html">but in the test tube, at least, it blocks the effects of leptin</a>. 
Hmmm, throw in insulin resistance of the liver from excess fructose, sounds like a recipe for Type II diabetes.Davehttp://www.blogger.com/profile/18290594860469294453noreply@blogger.com14tag:blogger.com,1999:blog-7721098568390636553.post-78805438059367488682009-05-20T07:00:00.000-07:002009-05-25T10:41:15.654-07:00The Paradox Paradox<blockquote><span style="font-style: italic; font-weight: bold;" class="body">By denying scientific principles, one may maintain any paradox.</span><br />Galileo Galilei<br /></blockquote><blockquote><br /><span style="font-weight: bold;">Paradox</span>: [Latin<i> paradoxum</i>, from Greek<i> paradoxon</i> from neuter sing. of paradoxos, conflicting with expectation, <i>para-</i>, beyond; see para–<sup>1</sup>, + <i>doxa</i>, opinion (from <i>dokein</i>, to think; see <a href="http://www.ask.com/web?q=dictionary%3A+dek-&content=ahdict%7C150509">dek-</a>).]<p></p>(noun)<ol><li> A seemingly contradictory statement that may nonetheless be true: <i> the paradox that standing is more tiring than walking.</i></li><li> One exhibiting inexplicable or contradictory aspects: <em> “The silence of midnight, to speak truly, though apparently a paradox, rung in my ears”</em> (Mary Shelley)</li><li> An assertion that is essentially self-contradictory, though based on a valid deduction from acceptable premises. </li><li> A statement contrary to received opinion. </li></ol></blockquote>This morning I ran across <a href="http://www.sciencedaily.com/releases/2009/05/090518172654.htm">an article discussing the "paradox" that obesity seems to play a protective role in heart disease</a>. We seem to be presented with a flood of paradoxes relating to health and nutrition - and indeed said paradoxes present equal confusion to (too) many scientists. Let's talk a bit about what a paradox really is, and then I'll show why the Galileo quote was right on the money. 
To say it another way, any scientist who cries "paradox" is being fundamentally unscientific. You'd never get them to admit it (because they probably don't believe it), but their use of paradox is in the sense of the 4th definition above, rather than indicating a true logical paradox. And we all know how well science and opinion mix.<br /><br />Most paradoxes are only superficially paradoxical, and can be resolved on deeper inspection. Real paradoxes are rare. Consider this example from the Wikipedia entry on "paradox":<br /><br /><i><blockquote>... consider a situation in which a father and his son are driving down the road. The car collides with a tree and the father is killed. The boy is rushed to the nearest hospital where he is prepared for emergency <a href="http://en.wikipedia.org/wiki/Surgery" title="Surgery">surgery</a>. On entering the surgery suite, the surgeon says, "I can't operate on this boy. He's my son."</blockquote></i>Sounds paradoxical, right? But the issue is simply a bad assumption: since most surgeons are men, one erroneously extrapolates that ALL surgeons are men. Obviously the surgeon must be the boy's mother. This is a common source of claimed paradoxes in science: extrapolating something that is believed at some level (e.g. obesity causes heart disease) to a statement of absolute truth.<br /><br />Let's consider mathematics, starting with simple Boolean logic. The point of logic is to reason deductively about the truth of a statement, given the truth of other statements. A paradox would imply you could get different answers depending on how you worked through the problem, i.e. two different sets of steps valid within the rules of logic would give different answers. If such paradoxes did exist, they would clearly render logic useless, since you could never consistently prove something true. The dictionary definition of "paradox" admits a subtly different situation, which is a statement like "I am a liar". 
The rules of logic can neither prove nor disprove this statement. But this is more an artifact of language and of technical aspects of formal mathematical systems than the sort of "scientific paradox" claimed by the authors of the heart disease/obesity paper.<br /><br />Generalizing the case of logic to all math leads to the same conclusion. A mathematical system which admits true paradoxes is pointless. A true paradox would indicate inconsistency in the rules and assumptions used to build the system. Problems labeled "paradoxical" in math are really counter-intuitive, like the <a href="http://en.wikipedia.org/wiki/Banach-Tarski_paradox">Banach-Tarski Paradox</a>, where one can prove that there is a way of dividing up a 3-dimensional ball, moving the pieces around without stretching them, and reassembling to get two balls of the same size as the original. Sounds pretty paradoxical, right? But it's really just counter-intuitive: the size of the set of points in the one ball (called the <span style="font-style: italic;">cardinality</span>) is actually the same as the size of the set of points in the two balls. The size of a set is different from its <span style="font-style: italic;">measure </span>(which in this case would be the volume). The result that we can double the volume of a set of points without changing the cardinality of that set violates our intuition, but is consistent within the mathematical definitions of measure and cardinality (this is roughly equivalent to realizing that the size of the set of even integers is the same as the size of the set of all integers: they're both infinite).<br /><br />Can we ever have a true scientific paradox? Mathematical truth is purely conceptual, and can thus be "absolute". We define the axioms and rules and mentally manipulate these to prove or disprove other statements. Science is messier. 
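<br /><br />As an aside, the even-integers comparison above is easy to make concrete: the map n → 2n pairs every integer with a distinct even integer, so neither set "runs out" before the other. A minimal sketch (my own illustration, checking the pairing on a finite window of integers):

```python
# The bijection n -> 2n pairs every integer with a unique even integer,
# which is why the two sets have the same cardinality even though the
# evens "look" half as numerous.

def to_even(n):
    return 2 * n

def from_even(m):
    assert m % 2 == 0
    return m // 2

window = list(range(-5, 6))
evens = [to_even(n) for n in window]

assert len(set(evens)) == len(window)           # one-to-one
assert [from_even(m) for m in evens] == window  # invertible
print(evens)
```

Nothing is left unpaired, and the same pairing extends to the whole infinite sets, which is exactly the sense in which they are the same "size".<br /><br />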
Nothing is absolute in science, because all scientific theories must be supported by observational evidence from the real world. Our observations are limited by various practical considerations. Our data is never 100% accurate, we can never be sure we've observed all of the relevant variables, etc. So our belief in a scientific hypothesis is always conditioned on the evidence, which itself is subject to the limitations of our ability to observe and collect information. Scientific belief thus exists on a continuum between absolute truth and falsehood, and is always conditioned on the available evidence. As new evidence is obtained, we update our beliefs accordingly, toward greater or lesser confidence, as the new evidence indicates.<br /><br />So you can never have a scientific paradox. Scientific honesty demands that observation of evidence contradicting a hypothesis causes you to lower your belief in that hypothesis. A paradox requires two statements which can be shown to be contradictory yet simultaneously true. But neither evidence nor hypotheses carry absolute truth, and our beliefs in either are always conditioned on the other. The scientifically relevant method evaluates belief in hypotheses conditioned on evidence.<br /><br />Science as most often practiced, using frequentist statistics, evaluates belief in data assuming the truth of a hypothesis, so it's no wonder scientists spend so much time confused about "paradoxes". Take a hypothesis and data that appears to contradict that hypothesis. Then test the hypothesis by quantifying your belief in the data, presuming the truth of that hypothesis. When the number comes back low, you basically have two choices: come up with a reason why the data is "wrong" (e.g. a mistake in experimental design, broken instrument, drunken graduate student), or realize that your hypothesis (again, whose truth was assumed as part of the analysis) is possibly not true. 
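To make the contrast concrete, here is a toy Bayesian update in Python. Every number below is invented purely for illustration; nothing here comes from any study mentioned in this post:

```python
# Toy Bayesian update: belief in a hypothesis H, given evidence E that
# is unlikely if H is true but ordinary if H is false.
# All probabilities are made up for illustration.

prior_h = 0.9            # strong initial belief in H
p_e_given_h = 0.05       # evidence is surprising if H is true
p_e_given_not_h = 0.60   # but quite ordinary if H is false

# Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E)
p_e = p_e_given_h * prior_h + p_e_given_not_h * (1 - prior_h)
posterior_h = p_e_given_h * prior_h / p_e

print(f"Belief in H before the evidence: {prior_h:.2f}")
print(f"Belief in H after the evidence:  {posterior_h:.2f}")
```

With these made-up numbers, belief in H drops from 0.90 to about 0.43: the surprising evidence lowers belief, and nothing paradoxical happens. The frequentist route computes only P(E|H) = 0.05 and stops there, which is exactly the point where people start crying "paradox".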
If you believe your data AND are 100% convinced of the hypothesis (which raises the question of why you did the experiment in the first place), you'll think you've got a paradox. The only real paradox is that people get paid to make this fundamental error in inference - over and over and over . . .<br /><br />Our friends who observed the apparently paradoxical protective effect of obesity in heart disease patients have fallen into this trap. The right thing to do upon observing this effect is to update belief in the hypothesis that obesity <span style="font-style: italic;">causes </span>heart disease. The new evidence lowers our belief in that hypothesis, and simultaneously signals that we should evaluate competing hypotheses in the light of all of the available evidence. Indeed, if one were to do a proper analysis of the evidence, it would be clear that the evidence no more supports the hypothesis that obesity causes heart disease than it does the hypothesis that heart disease causes obesity. Not all heart disease patients are obese, and not all obese people suffer from heart disease. Further, there's no strong metabolic evidence indicating the arrow of causality.<br /><br />The smart thing to do in such situations is to start looking at hypotheses where a third culprit is the underlying cause of the observed associated effects. So what might cause both obesity and heart disease, or in some people one but not the other?<br /><br />A growing body of evidence links poor blood sugar control to heart attack risk (see <a href="http://www.sciencedaily.com/releases/2009/05/090521200807.htm">this recent study</a>, for instance). The body maintains blood glucose in a narrow range, because both too little and too much are dangerous. Too little and the brain starves. Too much and you overwhelm the systems which repair the damage caused by sugar, in particular that to the arterial lining. 
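Some back-of-the-envelope arithmetic shows how small that buffer is. The figures below are rough textbook approximations of my own choosing, not numbers taken from the study linked above:

```python
# Back-of-the-envelope: how much glucose does the bloodstream hold,
# versus how much arrives in one carbohydrate-heavy meal?
# Both inputs are rough textbook approximations.

blood_volume_dl = 50           # ~5 liters of blood = 50 deciliters
normal_glucose_mg_per_dl = 90  # mid-normal fasting level

total_blood_glucose_g = blood_volume_dl * normal_glucose_mg_per_dl / 1000
meal_carbs_g = 100             # e.g. a large pasta dinner

print(f"Glucose circulating in blood: ~{total_blood_glucose_g:.1f} g")
print(f"Carbohydrate in one big meal: ~{meal_carbs_g} g")
print(f"Meal is ~{meal_carbs_g / total_blood_glucose_g:.0f}x the circulating pool")
```

Five liters of blood at ~90 mg/dl holds only about 4.5 grams of glucose, while a single carbohydrate-heavy meal can deliver 100 grams or more - twenty-odd times the entire circulating pool. Which brings us to the disposal problem.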
You cannot excrete excess blood glucose like you can excess water or salt (at least not without severely damaging the kidneys). So your options are to either store it, or turn it into something else. The muscles and liver have a limited capacity for storage of glucose. Once they're full, the liver, as directed by insulin, will turn the rest into fat, and your fat tissue, again as directed by insulin, will store that fat.<br /><br />At least that's how it's supposed to work. Insulin is a hormone, and hormones activate genes to manufacture proteins. The response to a hormonal stimulus is thus partially determined by genetics. Your genes will determine, for instance, the relative expression of lipoprotein lipase and hormone-sensitive lipase in response to insulin levels. This in turn governs the ability to take fat from the blood and store it, or release that fat from fat cells to be used as energy. Similarly, one guesses that insulin sensitivity of muscle and liver tissue has some genetic basis, and these may further be altered by disease, nutrition, etc. (overconsumption of either alcohol or fructose will make the liver insulin resistant, thus impeding its ability to store glucose, transform it to fat, or scale back manufacture of glucose from protein).<br /><br />In the framework of this hypothesis, a person with greater propensity towards fat storage has a potential advantage when it comes to heart disease, as it provides another "sink" for excess blood glucose. A perpetually skinny person may be at a disadvantage. If your fat cells don't respond to insulin signals, then the fat has nowhere to go and stacks up in your blood as "triglycerides". If your liver and/or muscle don't properly respond to insulin, glucose begins to build up in the blood. 
Neither situation bodes well for heart disease risk, and in reality both seem to occur simultaneously in susceptible individuals.<br /><br />The news blurb doesn't state whether blood glucose or triglycerides were tested, and the publisher of <em>Journal of the American College of Cardiology </em> doesn't provide free access to the publication. Perhaps a reader with access can post a comment as to whether blood glucose was tested and the results. Regardless, it is the unwillingness or inability of the authors to consider alternative hypotheses which leads them to cry "Paradox!" in such a public manner. Such individuals are clearly mired in irrational dogma and/or trying to drum up extra funding. From a broader view, any hypothesis (like diet-heart) which embraces paradoxes (like the "French paradox") is probably junk science. Treat such hypotheses accordingly lest you extinguish your own spark of reason.<br /><br />Dave | 8 comments<br /><br />Pearls Before H1N1 (2009-05-03)<br /><br />The swine flu is making me sick. I don't have the virus (at least no symptoms), but the whole panic over it annoys me on multiple fronts. Take this <a href="http://www.google.com/hostednews/ap/article/ALeqM5jRff2g62rwBC48p3LyUUPxCg9HFAD97V27180">recent AP story</a> as an example. 241 cases? Is this really worth worrying about? I find the following quote particularly telling:<br /><br /><blockquote><p>Even if the swine virus doesn't prove as potent as authorities first feared, Besser said that doesn't mean the U.S. and World Health Organization overreacted in racing to prevent a pandemic, or worldwide spread, of a virus never before seen.</p><p>With a new infectious disease, "you basically get one shot, you get one chance to try to reduce the impact," Besser said. 
"You take a very aggressive approach and as you learn more information you can tailor your response."</p><p>It was just over a week ago that authorities learned the new flu CDC had detected in a few people in California and Texas was causing a large outbreak and deaths in Mexico, triggering global alarm.</p><p>"We didn't know what its lethality was going to be. We had to move. Once you get behind flu, you can't catch up," Homeland Security Secretary Janet Napolitano said.</p></blockquote><p></p>Maybe an informed reader can help me out here. Is there any reason to believe that the "very aggressive approach" makes any difference at all? Do all of these countermeasures have any effect? I get the feeling there's a whole bunch of "virus nerds" at the CDC just waiting for the opportunity to do something, which more than anything is to feed public hysteria and justify their existence. Maybe I'm being overly pessimistic, but the track record of government science types is pretty abysmal. I do think that they think they're being helpful, but I really have a difficult time shaking the feeling that anything public health authorities do to try and stop a virus (which has evolved over billions of years to be very efficient at spreading infection) is roughly equivalent to piling up cheesecloth to protect yourself from a tsunami.<br /><br />Western culture seems to be developing increasingly extreme paranoia about all things health related. And of course, this is fueled by the media and other groups (like the CDC) who stand to benefit from spreading fear. Too many people spend too much time worrying about "silent killers": cancer, heart disease, viral diseases, you name it. Correspondingly, there exists as MASSIVE "industry" concerned with disseminating information and treatment. Just look at the amount of money spent helping us with that most deadly of conditions, "high cholesterol". 
Can you watch TV anymore without seeing at least one advertisement for statins (geez, there's one on now - GO TIVO) or a wonder food (like Ch**rios for crying out loud) that is going to save you from the "silent killer"? Ch**rios and statins: the delicious and healthy way to start the day.<br /><br />Fear is a complicated emotion, and that complication no doubt stems from the underlying complicated nature of trying to survive. I believe the major psychological source of fear is uncertainty, i.e. "was that sound Grog relieving himself due to over-consumption of cachonga root, or a bear coming to eat me?". I suspect the physiological source of fear is hormones, namely the stress hormones. Certainly stress brings about an increase in irrational fear (is there such a thing as rational fear?), and certain drugs can activate those same pathways and create tremendous fear. Our society seems now more than ever in the grip of fear inducers. Though science and technology have advanced human knowledge, the fact is that most of that knowledge is held by precious few. "Back in the day", when we all lived in the forest, you needed lots of knowledge to survive. The "unknown" was largely those aspects of Nature humans could not control, like the weather, hungry bears, and infection. <a href="http://sparkofreason.blogspot.com/2009/03/listening-to-experts-makes-you-stupid.html">Now we trust that to the "experts"</a>, tacitly ceding them control over our lives. And other aspects of lifestyle probably contribute to stress. Crappy nutrition certainly increases stress hormones, as does chronic illness. The diabetes "epidemic" is a pretty good sign that a major portion of the population is suffering from chronic illness due to poor nutrition (and a pronounced lack of sunshine).<br /><br />The combination is a real mess: a sick and fear-filled population driving a culture of experts to save them from their own ignorance. And of course the experts turn out to have little relevant expertise. 
Their major source of validation comes from the feedback that we give them. How many people do you know whose doctor fills them full of pills to no effect? The patient experiences little actual improvement in health, yet they keep going back for more. What if people started thinking for themselves and kicked their doctor to the curb in favor of self-informed care? I suspect a swift kick in the pocketbook would change MDs' opinion of statins in a big hurry. Similarly, let's have some fun and watch the shakeout if (as I think is highly probable) swine flu turns out to be a dud. Congress and the media will praise the CDC for quick and decisive action, and they'll wind up with a nice budget increase, which is all the validation they need. I suspect we won't see any sort of critical introspection as to whether or not all of this flopping about and general panic has a measurable benefit to public health. Nobody gets budget increases for that.<br /><br />In a <a href="http://www.proteinpower.com/drmike/low-carb-library/low-carbers-critical-thinkers-and-a-bulwark-against-illiteracy/">really nice post on critical thinking</a>, Dr. Mike Eades appropriated one of thousands of fabulous lines from George Carlin (who I personally think was probably smarter than everyone at the CDC - combined). I shall re-appropriate it here:<br /><br /><blockquote>Think of how stupid the average person is, and then realize that half of them are stupider than that.</blockquote>I spent about 10 years "in" science, 3 to get my M.S. and Ph.D., another 7 as a post-doc and researcher. I had the chance to interact with many scientists, mostly from physics, but also from other fields, and if there's one thing I can tell you with great certainty it is that the distribution of intelligence amongst scientists pretty much mirrors that of the population at large. By "intelligence", I mean the ability to rationally weight complex evidence as it relates to different hypotheses. 
My point here is not to say that scientists are dumb (though as we learned from Carlin, half of them <span style="font-style: italic;">are </span>dumb, by definition), but that you likely have the same reasoning capability as the average scientist. In fact, you're probably a little better than the average scientist. Scientists favor complex solutions, precisely because they are hard to understand. This validates their own self-perception of being smarter than the average bear. As a scientist, I've had more than one scientific discussion end with "That's too simple to be right." Not, mind you, "that idea contradicts this piece of evidence". It's just too simple for their taste. So I dressed up the same simple idea with complex-appearing math and verbiage, leading to acceptance.<br /><br />The swine flu situation presents us with the opportunity to watch this in action. I was just watching a clip from a news conference, where some "expert" was simultaneously back-pedaling on the severity of the present threat, while drumming up some more fear about the future. The thrust of it was that the Spanish Flu "took a summer break", and then re-awoke to slaughter millions. So keep on wiping down those door handles and pouring in the taxpayer dollars for stockpiling flu vaccines and anti-virals. Yet our expert likely missed the glaringly obvious simple hypothesis: people almost never get flu in the summer, an effect seen in both hemispheres. That applied also to the highly deadly Spanish Flu, which apparently spent the summer at the beach before coming back to clobber Western civilization. Or maybe something about the summer made people more resistant to a pre-existing infection. Gee, I wonder what that could be?<br /><br />(For those in the cheap seats, it's Vitamin D3.)<br /><br />Can I say with certainty that Vitamin D3 is the answer to the swine flu? No. 
But the CDC nerds also have no reason to say that it isn't, other than that it a) is too simple, and b) puts a fair dent in their raison d'être. Even WebMD, normally a bastion of medical orthodoxy, is at least <a href="http://www.webmd.com/cold-and-flu/news/20090223/low-vitamin-d-levels-linked-to-colds">considering the possibility</a>. I presume flu cases are being tested for H1N1 antibodies. I wonder if anybody is bothering to test for Vitamin D3 while they're at it?<br /><br />Nah, too simple. Ramp up that expensive anti-viral production. Far more tasteful.<br /><br />Dave | 0 comments<br /><br />Twitterpated (2009-05-03)<br /><br />I'm following the leads of Dr. Eades and Jimmy Moore onto Twitter. It's hard to find time to blog with my new job, and I wind up with stacks of links I want to talk about. Twitter seems like a good solution to at least push information I find interesting (and perhaps more often, incredibly dumb).<br /><br /><a href="http://twitter.com/sparkofreason">http://twitter.com/sparkofreason</a><br /><br />Dave | 7 comments<br /><br />Listening to Experts Makes You Stupid (2009-03-24)<br /><br />Got to work early this morning, and I thought this article deserved a quickie post:<br /><br /><a href="http://www.newscientist.com/article/dn16826-brain-quirk-could-help-explain-financial-crisis.html?DCMP=OTC-rss&nsref=online-news">http://www.newscientist.com/article/dn16826-brain-quirk-could-help-explain-financial-crisis.html?DCMP=OTC-rss&nsref=online-news</a><br /><br />I think you could replace "Financial Crisis" with "Health Crisis" in the headline and nicely sum up the current boom in metabolic diseases etc. 
Most of us have done it at one point or another: uncritically accept the advice given by experts, even when a little thought shows it makes little sense. Now we've learned that the brain has a specific mechanism where it essentially shuts off when given "expert advice". This perhaps explains why people seem to be thrown into such cognitive dissonance when presented with evidence which is rationally a slam dunk, but also contradicts what their doctors, the government, the media, and so forth have told them. I'm sure many of you have encountered irrational anger from friends and family when you question nutritional dogma. One of the weirdest things for me is how bent people get when I push them to justify why exactly "healthy whole grains" are so healthy. Still waiting (going on a couple of years now) for a response beyond "everybody knows that, so shut up."<br /><br />That's not to say expert advice is necessarily bad - you just need to use your own brain as well, and weigh the expert information appropriately. Tom Naughton makes this point very nicely in "<a href="http://www.fathead-movie.com/">Fat Head</a>" (see discussion of "functioning brain").<br /><br />BTW, I'm finding "<a href="http://www.fathead-movie.com/">Fat Head</a>" to be the most effective tool yet in overcoming the mental block created by "expert advice" (as opposed to my usual boring biochemistry lecture - maybe not so surprising). I suspect it's the humor that somehow breaks down the barriers of cognitive dissonance. 
It would be funny (in every sense of the word) if laughter made us more rational.<br /><br />Dave | 8 comments<br /><br />Fat Head: The Blog (2009-03-22)<br /><br />Tom Naughton of the fantastic <a href="http://www.fathead-movie.com/">"Fat Head" movie</a> has started his own blog at <a href="http://www.fathead-movie.com/">http://www.fathead-movie.com/</a>. Great stuff, and like the movie, informative and very funny. Be sure to check out his <a href="http://www.fathead-movie.com/?p=6">first post</a> and learn how to make your very own misleading study supporting ridiculous preconceptions.<br /><br />Dave | 0 comments