Tuesday, November 16, 2010
If the American Dietetic Association is flipping on low-fat diets, I'd say that signals the beginning of the end (hat tip to the Hold the Toast blog). Still waiting for Dean Ornish to jump out and tell us we've been punk'd.
Also check out the Fat Head take on the Twinkie diet. Nice analysis of the food logs.
Wednesday, November 10, 2010
The gist of the Twinkie business is that professor of human nutrition Mark Haub lost 27 pounds over 10 weeks by eating largely "junk food", like Twinkies. The "secret" was that he cut calories from 2600/day to 1800/day. Haub's point was to show "in weight loss, pure calorie counting is what matters most -- not the nutritional value of the food". This gets the "well DUH!" award for the month. Suppose you ate nothing at all. You'd be getting zero nutritional value. Do you think you might lose weight? Hmmmm, could be, doc.
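In fact, the naive calorie bookkeeping doesn't even fit Haub's own numbers. Here's a back-of-envelope check using the oft-quoted (and admittedly crude) rule of thumb that a pound of body fat stores about 3500 kcal:

```python
# Back-of-envelope check of the Twinkie-diet numbers, using the
# common (and admittedly crude) rule of thumb that a pound of
# body fat stores roughly 3500 kcal.
daily_deficit_kcal = 2600 - 1800    # reported cut in daily intake
days = 10 * 7                       # ten weeks
total_deficit_kcal = daily_deficit_kcal * days
predicted_loss_lb = total_deficit_kcal / 3500.0

print(f"Total deficit: {total_deficit_kcal} kcal")
print(f"Naive CICO prediction: {predicted_loss_lb:.0f} lb vs. 27 lb reported")
```

The bookkeeping predicts about 16 pounds, well short of the 27 reported. Which is itself a hint that something beyond simple calorie accounting (namely, the body's hormonal response to the deficit) is in play.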
The deeper issue here is the apparent ignorance of people like professors of human nutrition about the basics of metabolic regulation. To first order, if you keep the macronutrient ratios in your diet about the same and reduce calories, you will also reduce the amount of insulin your body secretes in response to that food. As oft noted on this blog, insulin is a major metabolic hormone, governing a wide variety of processes having to do with the utilization and storage of energy, not the least of which is driving fat storage. More insulin means more fat storage. Less insulin means less fat storage. Drop insulin enough and, on average, more fat leaves the fat cells than is stored. The root cause of fat loss under calorie restriction is NOT the restriction of calories per se, but the effect that calorie restriction has on your hormones, particularly insulin. For anecdotal evidence of this, you could ask a Type II diabetic who has to take insulin injections how hard it is to lose fat even by starving. More controlled experiments have been performed in animals. For instance, you can take an obese rat, keep its insulin levels artificially high, and starve it. Said rat will literally starve to death while obese, consuming its internal organs for energy, because the high insulin level effectively keeps fat locked up in the fat cells.
So yes, of course, you can eat a calorie-restricted diet of Twinkies and lose fat. But failing to understand how all of the metabolic dots are connected leads to several common backwards assertions made in the article, e.g. "Being overweight is the central problem that leads to complications like high blood pressure, diabetes and high cholesterol". Sure about that, doc? Or do obesity, high blood pressure, diabetes, and high (LDL) cholesterol have a common cause, like, say, excess insulin? After all, there are skinny people with high blood pressure, diabetes, and high cholesterol. There are obese people who otherwise tape out as very healthy. So obesity is clearly not a cause, at least not the root cause. Insulin modulates a large number of genes, so the precise set of symptoms a person experiences from hyperinsulinemia is going to be a function of their specific genetic makeup. A key test of a scientific hypothesis is its predictive power. The hypothesis that obesity causes Type II diabetes misses by tens of percent. But 100% of Type II diabetics are hyperinsulinemic, whether or not they are obese. Where would you put your money?
The key take-away here is that there is a large body of "health professionals" who essentially view the human body as a black box, and as such tend to come up with hand-waving and over-simplified "rules" linking various externally observable effects, like "calories in, calories out" (strictly true, but pointless because it makes no connection between cause and effect). Consequently, the recommendations of these people rarely rise above the level of old wives' tales, in terms of the strength of evidence supporting them. When we "open the box", and begin to understand how the inputs and outputs are connected, and further how the body maintains control over metabolism and behavior in an attempt to maintain "health", things become much clearer. If your health expert has this knowledge, you are very lucky. Most are ignorant, and likely will remain so, as once a person deems themselves an "expert", they no longer feel the need to learn anything new (particularly if it contradicts their "expertness"). So it is going to be up to you to gain some measure of knowledge, so that you can make informed decisions for yourself.
If you are a person with any degree of scientific interest and background, then I hope you will have read "Good Calories, Bad Calories" (or a similar book) by now. If not, then shame on you for purposefully maintaining your ignorance. While no book (even a scientific textbook) has the whole story, GCBC does a fantastic job of delving into the very well-established metabolic science linking insulin and various health issues. As oft noted within, most of this stuff is not considered controversial at all. The processes by which insulin regulates fat storage have been established for decades. The gap is simply one of knowledge, where "professors of human nutrition", medical doctors, and the like either don't learn this stuff, or fail to connect the dots: what you eat affects your hormones, which affect biological processes like fat storage, which affect other hormones, which can ultimately affect what you eat. Behavior, after all, is just another manifestation of biology. So it's going to be up to you to educate yourself to some extent. If you're more of a right-brain person or otherwise find GCBC a daunting read, Gary Taubes' forthcoming book "Why We Get Fat" might be more up your alley.
But if you choose to remain ignorant, and blindly follow "expert advice", you deserve exactly what you get.
Monday, September 27, 2010
The short version: before taking "expert" advice, check their shoes . . . and their motivations. I'll repeat it again: the only person who truly has your best interests at heart is YOU. Everyone else has some other axe to grind. Once in a while you might luck out and find an "expert" whose goals are aligned with yours, but don't hold your breath.
Saturday, September 4, 2010
In another fun experiment, several human subjects were "locked up" at a zoo, and fed something like the diet of our closest primate relatives (chimps and gorillas), consisting of raw fruits and vegetables. It was mostly vegetables, if I remember correctly, lots of clips of people gnawing on carrots, raw broccoli, and the like. Long story short, everybody hated it. They spent about half the day doing nothing but chewing, and were starving nonetheless. I believe the average weight loss quoted was something like 10 pounds in two weeks (I meant to watch the show again and take notes, but this is the first "free" time I've gotten, and it's only because I'm at the park with the kids). Subjects apparently spoke of the desire for meat quite a bit (when they weren't bitching about all of the chewing and frequent bathroom visits). It's a fun experiment you can try at home yourself!
One part of the show which rather surprised me: one of the scientists visited a remote tribe in Africa who were still living a fairly primal hunter-gatherer existence. What struck me was that these people looked like crap, nothing like the sort of pictures taken by Weston A. Price, or the fossilized H. habilis jaws shown in the chewing experiment. Their faces showed signs of nutritional stress, with small jaws and crowded teeth. One hint here may be the effort they went through to get meat. In the show, they had a porcupine cornered in its burrow (BTW, this thing was huge, the size of a really big dog). The tribesmen spent the better part of the day digging 6-foot deep holes, until they forced the porcupine into the Hobson's choice of which hole to get speared in. It was a tremendous amount of effort to get some meat, and one of the hunters basically said "porcupine sucks, but at least it's meat". They also discussed how socially important it was for hunters to bring back meat, that it brought them status in the tribe, etc. Clearly, meat is at the top of the menu for these people. Yet that they would go to such lengths to obtain it (particularly when they find porcupine distasteful) makes me wonder if hunting isn't so good in this region anymore. Perhaps game has become scarce, hence the appearance of nutritional stress? Yet they're hanging in there, if nothing else a testament to the tremendous adaptability of humanity, made possible by our big brains (and what do you suppose made those big brains possible?)
Anyway, "How Food Made Us Human" has spawned a couple of trains of thought, which I want to share with you here. The first has to do with a mouse experiment demonstrated on the show; the second with the correlations between diet changes and physiological changes over the course of evolution.
The mouse experiment was very interesting, intended to show the effect of cooking on caloric bioavailability. Take some mice, feed them raw sweet potatoes, and measure the change in body mass as well as activity (based on distance run on the exercise wheel). Now cook the sweet potato and do the same thing. If you're still stuck in the calories-in calories-out (CICO) paradigm, the results should spawn massive cognitive dissonance. The mice who ate the cooked food showed the following differences when compared to those eating the raw food:
- They exercised significantly more, AND
- They were heavier.
Of course this all makes perfect sense when considered from the standpoint of evolution and metabolic regulation. Cooking makes calories more available. Though they didn't explicitly say so in the show, one presumes that the quantity of potato was held constant between the two groups (since not doing so would void the entire point of the experiment). So the only difference (presumably) was raw vs. cooked. Mice didn't evolve eating cooked food. The higher caloric availability likely "fooled" their digestive systems into taking up calories too rapidly, faster than required to support normal metabolic operations. Rate of digestion is regulated by hormonal and nervous feedback mechanisms: when the brain and other internal sensory systems think there's enough energy around, gastrointestinal motility decreases, slowing the rate at which food leaves the stomach to be digested and absorbed in the small intestine; and of course when an energy deficit is detected, food moves more rapidly out of the stomach. When the stomach is empty AND your body senses an energy deficit, you get hungry, and are driven to find more food.
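The feedback just described behaves like a simple proportional controller: gastric emptying speeds up as the sensed energy deficit grows, and slows as the deficit shrinks. Here's a purely illustrative sketch of that loop (every constant is made up; this is a control-system cartoon, not physiology):

```python
# Toy model of digestion-rate feedback. All numbers are invented
# for illustration; this is a control-loop sketch, not physiology.
energy = 100.0       # sensed internal energy store (arbitrary units)
setpoint = 100.0     # level the body "wants" to maintain
stomach = 50.0       # food remaining in the stomach
burn_rate = 2.0      # energy expended per time step

for t in range(20):
    deficit = setpoint - energy
    # Emptying rate rises with the deficit (proportional control),
    # with a small baseline, limited by what's left in the stomach.
    emptying = min(max(0.5, 1.0 + 0.1 * deficit), stomach)
    stomach -= emptying
    energy += emptying - burn_rate

print(f"energy settles near {energy:.1f}, stomach at {stomach:.1f}")
```

The loop settles at whatever deficit makes emptying exactly match the burn rate. The catch, as with any such controller, is that the gains were "tuned" for a particular range of inputs.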
That's how it's supposed to work. Like all feedback control systems, if you push it outside the "designed" range of stability, it starts failing. I expect this to be particularly the case with biological systems. Biological responses tend to follow "S-curve" shapes. There's nothing deep about this. It simply reflects the fact that biological responses are limited by available resources. At some point you run out of the capacity to make more hormones, neurotransmitters, receptors, etc. Insulin response is a great example. As a function of blood glucose, the secretion of insulin follows a shape much like that seen on the Wikipedia page. At some point you either saturate the ability to detect glucose, or saturate the ability of the pancreas to crank out insulin, or both. The point is that it is possible to exceed your body's ability to effectively control blood sugar levels via the action of insulin, simply by changing the effective "sugar density" of the food you consume.
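A minimal sketch of such an S-curve is the logistic function (all parameters below are illustrative placeholders, not physiological measurements):

```python
import math

def insulin_response(glucose, half_max=140.0, steepness=0.08):
    """Toy S-curve for insulin secretion vs. blood glucose.
    half_max and steepness are illustrative, not measured values."""
    return 1.0 / (1.0 + math.exp(-steepness * (glucose - half_max)))

# Near the midpoint the response is sensitive; far above it,
# secretion saturates and large glucose swings barely register.
for g in (80, 140, 200, 400):
    print(f"glucose {g:3d} -> relative secretion {insulin_response(g):.3f}")
```

The flat top of the curve is the saturation regime: once you're there, the control signal can no longer track the input, no matter how much more "sugar density" you throw at it.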
Back to our mice: when faced with an excess of calories, how can the mouse's body respond? It can either store energy, or burn it off (or both). We know some of it got stored, as the mice got heavier. The show gave one example of "burning it off", in the spontaneous increase in activity. I don't know if they measured it, but I'd wager that the mice also gave off more heat, which I think is a more effective way of dumping energy. Muscles are remarkably efficient, and it is surprising how much mechanical work you can get out of a kilocalorie, when compared to the equivalent thermal energy (1 kilocalorie will raise 1 kilogram of water only 1 degree C in temperature, but will lift a 1 kg mass over 400 meters against gravity). So the outcome of the mouse experiment is wildly inconsistent with the CICO paradigm, but precisely what one might predict from evolution and metabolic regulation. It would be very interesting to see what would happen if the mice were allowed to continue eating cooked sweet potatoes for a longer time period. I wonder if they would develop mouse metabolic syndrome?
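The "over 400 meters" figure is easy to verify with one line of physics, using 1 kcal = 4184 J:

```python
# How high does 1 kcal of mechanical work lift a 1 kg mass?
KCAL_J = 4184.0   # joules per (thermochemical) kilocalorie
g = 9.81          # gravitational acceleration, m/s^2
mass_kg = 1.0

# E = m * g * h  =>  h = E / (m * g)
height_m = KCAL_J / (mass_kg * g)
print(f"1 kcal lifts 1 kg roughly {height_m:.0f} m")
```

About 427 meters: the same kilocalorie that warms a liter of water by a single degree will lift that liter nearly half a kilometer.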
The second line of thought follows the main line of reasoning from the show, which is thus: by incorporating more nutrient dense foods into their diets, our hominid ancestors set in path a major evolutionary shift, where gut size was exchanged for brain size. The argument is elegant: if you eat food with greater bioavailable caloric density, you can spend less energy on digestion. That opens up an evolutionary pathway to increase brain size and energy expenditure at the expense of the gut, because you can extract the same energy from food with a smaller and less-demanding digestive system. And indeed, the fossil record seems to indicate that the major jumps in hominid brain sizes came at two critical nutritional junctures. The first was around the time we started eating meat. The second was when we started cooking food, and cooking is hypothesized to have led to modern humans.
It's interesting to consider the evolutionary advantage brought about by our big brains. What advantage did they bring our hominid ancestors? The scientist visiting the African hunter-gatherers went on a bit about how hunting is a fairly complex behavior, particularly as practiced by humans. But there's a lot more to it than that. The brain remembers - it stores information. And an "advanced" brain not only remembers information, but can extrapolate it to the future, making choices now that create advantages later. For instance, it's good to know that when the rains come, wildebeest are going to show up at a certain place, and that there's an effective method for cutting one out of the herd, how to make the tools you need to kill and butcher it, which parts are good to eat, etc. In the context of hunting and gathering, there's a positive feedback loop between the increase in nutrient density and encephalization, each reinforcing the other.
Of course, there's been another major shift in nutrient density since the advent of cooking: agriculture. The advent of agriculture is an interesting case. At first blush it doesn't seem so hot. The main thrust of human agriculture has been domestication of annual grasses for their seeds, i.e. grains. Across the world, in geographically separate locations, populations growing different crops (wheat, corn, rice) uniformly appear to show significant increases in the chronic "diseases of civilization". But evolution doesn't care if your teeth fall out or you drop dead from cancer at age 40. All evolution cares about is reproductive fitness, and agricultural humans had an undeniable reproductive advantage over hunter-gatherers; otherwise we wouldn't be having this conversation.
We had a hint from the mouse experiment above that an increase in dietary effective energy density could lead to "metabolic overload", exceeding the body's ability to balance and regulate the intake and expenditure of energy. And indeed, the evidence continues to mount that a good chunk of our modern epidemic of chronic diseases may be attributable to such metabolic malfunction. It makes me wonder what happened to our Australopithecus ancestors when they started eating meat: did they suffer metabolic disease as well? It's an academic question, of course, as chronic disease or no, meat-eating proved to increase reproductive success. We might ask a similar question about what occurred as cooking gained popularity. These are hard questions to answer from the fossil record. Agriculture, by contrast, happened much more recently, and is further amenable to archaeology (agriculturists tend to gather in large numbers in one spot, as opposed to wandering all over the place looking for food).
But here's a thought: we noted above that big brains are useful for remembering lots of stuff. This is important when living as a hunter-gatherer, because the dynamics of nature are complex. Maximizing reproductive potential in this context means being able to remember and extrapolate the myriad (and often subtle) cause-and-effect relationships of the natural world, along with whatever technological innovations are required to take advantage of this knowledge. This information does not get passed on genetically, but rather through communication, i.e. parents teaching children. Knowledge is power, in a very tangible sense, when talking about hunter-gatherer survival. Greater knowledge implies greater ability to obtain nutrient-dense food, hence greater reproductive fitness; it also means that it takes longer to get that knowledge into the brains of your offspring. It is often argued that diseases of civilization have little effect on evolution, because they generally kill you after the reproductive years. But that assumes the only information being passed along is genetic. If memories are also required for the reproductive success of your offspring, it pays to live long enough to pass along that information. And if you follow this line of reasoning, it's clear there's a volume of information at which the parent will not be able to effectively communicate the body of knowledge while still performing hunting and gathering activities required for survival. Enter Grandma and Grandpa. If the information volume for reproductive effectiveness is sufficiently large, it pays to live long enough to pass along that information to your offspring, their offspring, and so forth. Correspondingly, adoption of new dietary practices must either preserve this longevity, OR require less information to be effective.
So where does agriculture fit in? Is the adoption of agriculture, which brings with it ever-increasing energy density in food, driving us toward the next phase of "big brain" evolution? Good question - but consider this: how much do you need to know to be an effective agriculturist? I would argue rather little, compared to hunting and gathering. A hunter-gatherer may have thousands of foods in their diet, and they have to know where and when to find them, how to prepare them, etc. Agriculturists have relatively narrow diets, and there's a relatively simple and fixed pattern to the whole business: plow the land, plant the seeds, keep out the weeds, harvest. Lather, rinse, repeat. So I think you can argue that agriculture has a much smaller information burden than hunting/gathering. The tremendous technological increases since the advent of agriculture are a testament to how relatively little brain-power is needed for obtaining food anymore, as we apparently had plenty of spare brain capacity to monkey around with things not directly related to getting fed.
Now it is well known that brain volume decreased dramatically with the advent of agriculture. So did adult lifespan. Yet the agriculturists clearly laid the smack-down on the hunter-gatherers, evolutionarily speaking. So neither long-term health nor brain size is a reproductive advantage once you start growing your own food (or at least the foods that our ancestors chose to cultivate). There's no point in fueling a big brain if you've got nothing to put in it. And there's no point in keeping old people around if they are not able to contribute directly to the reproductive fitness of their offspring. If you can't work the fields, don't make babies, and we don't need your accumulated wisdom, then you're pretty much just eating food better used for making more genetic copies. So for an agriculturist, dropping dead at 35 may actually have been an advantage.
It makes me wonder what direction agriculture (and more recently, industrial food processing) is driving our "humanness". Does the ever-growing energy density and general availability of food imply we'll evolve even smaller guts and bigger brains? I'll put my money on "No". After all, look around you: it's not like the smartest people are the most successful reproductively. You can damn near be vegetative, contribute nothing to society at all, yet we will ensure you've got all the Big Macs and Twinkies you can stuff in your face to fuel the generation of lots of babies to do the same thing. In our current environment, evolution favors being chronically ill and stupid. It doesn't really matter how much we wring our hands about ethics, culture, and society: reproductive success always wins. So if humanity wishes to achieve its stated long-term goals of giving people long and healthy lives while living sustainably on Earth, we'd better figure out how to align those goals with Nature's overriding law of reproductive fitness.
Friday, July 23, 2010
I had hoped to have had a civil discourse, but this is difficult when the questions come from uncivil people. I also don’t have time to answer superficial questions of others like ‘what is the detailed mechanism of protein induction of high cholesterol levels’ – that easily could become an entire but relatively useless dissertation when the “mechanism” most decidedly is a symphony of mechanisms, as I explained in our book. At this point, the far more important observation is the dramatic increase in serum cholesterol.
Hmmm, I wonder what Campbell's definition of "uncivil" is? Seems to have some conceptual overlap with the second sentence, i.e. those who ask "superficial" questions are being "uncivil". The question in question came from me, and I'm glad to see it had one of the desired effects. My preferred outcome would have been that Dr. Campbell actually answered the question. Then I would have learned something. It is unfortunate that he instead evaded the question as above, because then all we learn is that a) he doesn't have an answer, but b) thinks he does, and is thrown into painful cognitive dissonance when confronted by the truth of his ignorance. The nonsense about there being a "symphony of mechanisms" is, I believe, a subtle trick played on Dr. Campbell by his own mind. There are indeed many possible causes, and there may be several interacting processes. But he confuses "I don't know which of the many possibilities contributes to the effect" with "here is what we know: a complex process". Classic mental band-aid for cognitive dissonance.
Anyway, I think my goal has been accomplished. I wanted to know if Dr. Campbell had any relevant information. If not, I wanted him to publicly torpedo his own credibility. Mission accomplished. Next time he wants to show up and bash a low-carb or paleo book on Amazon, you have ample material to demonstrate his irrationality.
Thursday, July 22, 2010
The great thing about the Internet, of course, is that it is impossible to censor anything. I'm pasting the comments I submitted to Campbell's site below. These were not approved. Compare with the openness displayed by Denise Minger in publishing comments from all comers, and fostering open discussion. Draw your own conclusions. If you have submitted comments to campbellcoalition.com that were not published, feel free to post them in the comments here. I'll send through anything that isn't overt spam.
To be fair, these comments may yet show up. There may be a perfectly acceptable explanation for why they haven't been published yet. I'm sure most bloggers have experienced "falling behind in comment moderation". If these comments are published, I partly retract my criticism. But the main point remains valid: exchange of information is crucial to scientific progress. If you're not willing to exchange information, you're not interested in scientific progress.
I posted this just because it seemed odd to be revising such a benign comment. Who does this, and why?
Uh, why did your answer to my original question change from “Dr. Campbell said he will be able to post comments now and then, although he cannot respond to every question.” to “Dr. Campbell said he will participate to the extent possible.”? Those seem like they say the same thing to me.
At any rate, I expect Dr. Campbell will find it a better use of his time to respond to specific points here rather than having to write lengthy detailed work such as above.
Here's a harder question:
From the response above:
“First and foremost, our extensive work on the biochemical fundamentals of the casein effect on experimental cancer in laboratory animals (only partly described in our book) was prominent because these findings led to my suggestion of fundamental principles and concepts that apply to the broader effects of nutrition on cancer development.”
Can you explain what these fundamental principles might be, or at least direct me to a detailed discussion? Proteins are broken down into amino acids in the gut (at least in healthy individuals). These amino acids are then transported throughout the body, where they may be used to build new proteins. How does a specific mixture of amino acids trigger cancer growth? And of course I doubt most free-living organisms eat large quantities of isolated casein. So if I eat a meal containing casein, the mixture of amino acids absorbed reflects that of the total protein content of the meal, not just the casein.
It seems that in order for casein to have a specific role, it would need to trigger some other biological response beyond its simple amino acid content. For example, we know that most cancers have a very high glucose requirement, as they largely rely on anaerobic glucose metabolism for energy. We might then expect insulin to be required to stimulate glucose transport. Some cancers do indeed show higher expression of insulin receptors, see e.g.
From this we might hypothesize that dietary carbohydrates would drive cancer growth by providing both a supply of glucose and increased insulin secretion. It can further encompass other observations, e.g. the association of dietary fat and cancer. When eaten in combination with carbohydrate, fat will amplify insulin secretion.
Returning to your hypothesis that casein has a unique potential to stimulate cancer growth: what metabolic pathways are followed that create the “casein effect”? Is there some specific hormonal signal uniquely stimulated by casein?
And a link to a multivariate analysis that would answer at least some of Dr. Campbell's objections:
Here is an interesting blog on a multivariate analysis of China Study data:
I put these comments under the post "The Challenge of Telling the Truth":
Your suggestion about keeping an “open attitude” is a good one. However, you need to keep an open attitude about scientific evidence as well. The way you talk about “truth of health” sounds a lot more like religion than science. Perhaps this is simply a communication gap. I sincerely hope that you and your father have the sort of open and inquisitive minds required for scientific progress. There is no absolute “truth” in science, as this would imply we have perfect information. I doubt even the staunchest supporter of any dietary dogma would claim that we have perfect understanding of the deep complexities of human biology.
I will reiterate here what I have said elsewhere: scientific progress is about two-way communication. You and your father likely have information that supports your hypotheses, information that others do not have. However, I’m sure you’d agree that others have information that you do not as well. The only way to reach “agreement” is communication, so we’re all on the same page. This is why dialog is so fundamental to scientific progress. I hope you and your father will participate in this dialog.
“Despite lacking an adequate understanding of statistics and causality, this person used her intelligence and writing skills to compose a critique that might seem persuasive to laypeople.”
You might wish to expand on this a bit. It sounds like you’re saying she is both stupid (“lacking…understanding”) and intelligent in the same sentence. And I’m sure you would agree that “laypeople” need to have greater understanding of the issues so that they can make informed decisions, rather than simply picking an “expert” to blindly follow. Perhaps you can provide a little Statistics 101 discussion for us to better illustrate the shortcomings in Ms. Minger’s analysis for the lay public?
Another interesting development is the new(ish) web site campbellcoalition.com. Dr. Campbell's response to Denise Minger's critique is featured prominently, and better yet, Dr. Campbell has indicated that he may participate in some discussion there. I've posted a few questions, and urge others to do the same. I recommend you focus the discussion on scientific topics, as opposed to his opinion of Denise Minger, etc. Come armed with some hard questions on the connections between nutrition and metabolism, particularly as they relate to Dr. Campbell's hypotheses. I believe this exercise has two realistic outcomes: either Dr. Campbell has some answers (which actually would be very cool), or he stonewalls. Either way we learn something interesting.
*** UPDATE ***
Well, it didn't take long for us to learn something interesting. From the comments on Dr. Campbell's reply to Denise Minger:
Based on the response received thus far, we have determined that our prior idea of a reasoned and civil discourse, with participation by Dr. Campbell, is not feasible and have decided to discontinue this discussion thread. Before closing, however, Dr. Campbell wanted to respond to comments from Denise Minger. Her comments are posted above, and Dr. Campbell’s response follows.
In other words, Dr. Campbell is going to have the last word, like it or not. So much for scientific discourse. The Campbells certainly could have chosen the path taken by Denise Minger - posting all discussion, whether "civil" or not, choosing to reply to those questions or issues that are clearly intended to foster scientific discussion, and ignoring ad hominem attacks, etc. Dr. Campbell's chosen course speaks to his true motivations.
If you have questions you posted to Campbell's site which did not make it through moderation, I invite you to repost in the comments here. Others can then see exactly what offended Dr. Campbell so greatly that he opted out of the discussion.
Friday, July 16, 2010
And T. Colin Campbell, if you're out there, let's see if you have the courage of your convictions. I have a Ph.D. and was an academic research scientist for many years, so I should be "worthy" of scientific discourse with you. And discourse is at the root of scientific progress. How can you expect to educate people like me on your views if you are unwilling to discuss them with opponents in a public forum?
Related note: durianrider is also one of the principals of the 30bananasaday.com site, along with "freelee". Some of the discussion on the post "Debunking the China Study Critics" is pretty interesting, from a sociological point of view. I am going to try registering for the site, and see if they have any willingness to let in opposing views. The registration page and forum guidelines make me suspect they are intolerant of those who might not agree with them, e.g. this quote:
We will not tolerate "anti-fruit" posts or advice that recommends calorie restriction/or the suggestion that others are "overeating on fruit", also recommending others restrict their water intake will not be supported on 30BaD, these threads will be deleted and you will be given a warning. This advice is not only unproductive but dangerous to the health of our members.
One of the best signs of dogmatic belief is the intolerance of information which contradicts said belief. I'll reserve judgment on 30bananasaday.com until my application gets accepted or rejected, as I plan to make it quite clear that I will be providing evidence that runs counter to their mission.
For my part, I welcome discussion from all corners, provided it is reasonably civil (i.e. contains actual information rather than emotional spewing). The definition of rationality is that two people with the same information will draw the same conclusions. But the only way those two people can achieve the same state of information is through communication. Even if you completely disagree with my views, there's a reasonable chance that I will learn something from you which may help me make better choices. So bring it on!
Friday, July 9, 2010
I also love the observation that, despite his constant whining about the "dangers of reductionism" in science, Campbell's entire argument against animal protein really hinges on a strongly reductionist experiment, namely the isolated effect of casein fed to rats in large doses. Snap!
Readers know of my criticisms of classical statistics, but it should be noted that I don't really have a problem with the mathematics, only with how it is applied. Math is what it is, either right or wrong. My issue is that classical statistics is used incorrectly, to draw inferences about hypotheses, when the underlying mathematical framework has nothing to do with inference. The key problem is that "statistics" are just numbers derived from data, like correlations. They say nothing about a hypothesis: you will calculate the same correlation between two datasets regardless of your hypothesis about what causes that correlation. Anyway, I don't want to get off on a rant. My point here is that the author, Denise Minger, does an excellent job of confining her analysis and conclusions within the bounds of what classical statistics can tell you. And along the way, she does a great job of demonstrating how easy it is to fool yourself (as T. Colin Campbell did - repeatedly) by over-interpreting these numbers which, in the end, cannot tell you anything more than what's in the data.
Ms. Minger has also done a great service in providing a concrete example of the issues in observational studies. You've likely read often that epidemiological studies are of little use in distinguishing between competing hypotheses. Now you have an example, replete with numbers. Ms. Minger demonstrates in several cases how a seemingly "obvious" conclusion vanishes once you dig into the large number of uncontrolled variables inherent in all observational studies. It's easy to find correlations in large datasets with many uncontrolled variables. The problem is that people take these correlations to mean more (or less) than they really do in terms of supporting/undermining a particular hypothesis, and the conclusions they draw are essentially ad hoc, not based on any rigorous mathematical analysis, but rather hand-waving about what is "obvious". An oft-quoted example is that men who shave daily have a higher incidence of heart disease. It is "obvious" that heart disease is not caused by shaving, right? Or is it? There's a whole lot of other information that goes into that judgment. We generally take this sort of thing for granted, especially when made in pronouncements from "esteemed" scientists like T. Colin Campbell. But if you dig into the reasoning behind these conclusions, you generally find a tangled web of assumptions, hypotheses assumed to be true, but which have varying (if any) actual evidence to support them. Ms. Minger does a great job of teasing these out of Campbell's reasoning, and demonstrating how the data itself provides little evidence one way or another, precisely because it cannot distinguish between the potential effects of the many intertwined and uncontrolled variables.
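To make the shaving example concrete, here's a quick simulation in Python. Everything in it is made up by me purely for illustration: a lurking variable (age) drives both daily shaving and heart-disease risk, shaving itself does nothing, and yet a strong "shaving correlates with heart disease" number pops right out of the data. The correlation, being a pure function of the data, has no idea which causal story you believe.

```python
import random

def pearson_r(xs, ys):
    """Pearson correlation: a pure function of the data,
    with no knowledge of anyone's causal hypothesis."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(42)
# Entirely made-up model: age drives BOTH how often a man shaves
# and his heart-disease risk; shaving does nothing to the heart.
age = [random.uniform(20, 80) for _ in range(5000)]
shaves_daily = [a / 100 + random.gauss(0, 0.1) for a in age]
heart_risk = [a / 100 + random.gauss(0, 0.1) for a in age]

# A strong positive correlation appears anyway, courtesy of the
# uncontrolled variable (age), not of any causal link.
print(round(pearson_r(shaves_daily, heart_risk), 2))
```

Control for age and the "effect" evaporates; fail to control for it (or for the hundreds of variables like it in a dataset the size of the China Study's) and you can "discover" whatever your favorite hypothesis predicts.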
Anyway, enough of my babbling. Go read the article, you'll be glad you did (unless you're an uncritical fan of T. Colin Campbell, in which case you've got bigger problems).
Friday, May 21, 2010
Advanced glycation endproducts (AGEs) are the endpoints of some complicated chemistry that occurs when simple sugars (glucose, fructose, etc.) react with proteins (and apparently fats too). They’re toxic for a variety of reasons, and trigger an inflammatory response via the receptor for advanced glycation endproducts, or RAGE.
It turns out that RAGE binds to a whole bunch of things, and amongst them is the amyloid beta peptide, which is implicated in the development of Alzheimer’s. Amyloid beta is apparently produced via neural activity. I can’t figure out if it has a function or is just a by-product. I suspect it has some function, because the body has a mechanism for achieving a balance in the central nervous system (CNS). One kind of receptor (LRP) causes active transport out of the CNS to the blood, while RAGE triggers transport from the blood to the CNS across the blood-brain barrier. More RAGEs means you’ll have more amyloid beta in your brain. I couldn’t verify this, but I would guess that insulin drives the formation of RAGE. It makes sense, as your body would be preparing for glycation damage (more AGEs) from increased blood sugar, whether the source was food or glucose released due to stress. And indeed, diabetics have higher concentrations of RAGE (as do the blood vessels in the brains of Alzheimer’s victims).
We learned today that stress actually increases amyloid beta production in the brain, via the action of corticotrophin releasing factor, or CRF. I got in contact with one of the authors of that study and he was nice enough to send me a reprint of the paper. It’s a pretty solid piece of research. Amongst other things, they showed that the more you stress mice, the more amyloid beta is produced. They could introduce CRF directly into the brain, and observe increased amyloid beta production. They could block the action of CRF, stress the mice, and see that less amyloid beta was produced. And finally they could directly block neural activity, and either stress the mice or introduce CRF, and again would see reduced amyloid beta. So it was a pretty solid case, albeit in mice. It would be surprising if humans turned out to be much different, though it’s certainly possible. CRF is released as part of the stress response. It is also released as a result of insulin-induced hypoglycemia, i.e. insulin goes up, blood sugar crashes, CRF pumps out.
One last piece of the puzzle: by itself, amyloid beta is soluble, and shouldn’t form solid plaques (or at least do so slowly). But test-tube experiments show that formation of solid “fibrillar aggregates” of amyloid beta are accelerated if you provide seeds of altered amyloid beta. And what’s one form of the alteration? Glycation damage from sugar.
So, less than surprisingly, my hypothesis is that the route to Alzheimer’s mirrors that of heart disease. A high-carbohydrate diet leads to the following effects:
- Increase in density of receptors for advanced glycation endproducts, which leads to increased amyloid beta concentrations in the brain.
- Release of CRF, which increases production of amyloid beta in the brain.
- Damage to amyloid beta, which increases the formation rate of solid aggregates, which may be contributory toward the formation of the plaques associated with Alzheimer’s.
Monday, May 10, 2010
"Uh, okay," I said, thinking it was some new guy tradition to buy coffee. "Why?"
"See that guy?" asked Tom, indicating the manager.
"Sure," I replied.
"Check his shoes."
I dutifully looked at the shoes. Seeing nothing out of the ordinary, I asked "What about his shoes?"
"They're full of shit."
I bought the coffee.
When you're getting information from scientists or other "experts", there are some good signs that indicate when a shoe check might be needed (to see what they're full of). One of the best is when scientists argue for/against a particular hypothesis by lecturing about the scientific method, rather than actual evidence. Usually this is a bitch-fest about how opponents of their views are unscientific self-interested boobs, while casting themselves like Gandalf on the Bridge of Khazad-dûm (paraphrasing a bit):
You cannot pass! I am a servant of the Secret Fire, wielder of the Flame of Science. The dark fire will not avail you, Flame of Dumb-Dumb! Go back to the shadow. You shall not pass!
(Of course, since I spend a good chunk of this blog lecturing about scientific method, maybe I should check my own shoes :-)
I recently came across a couple of excellent examples of exactly this phenomenon, and thought we'd all benefit (and maybe get a good laugh) from checking the shoes of those involved. The first is T. Colin Campbell's "review" of the latest Atkins diet book. I haven't read the book, and am no particular fan of Atkins over any other diet, beyond the fact that it applies well-understood metabolic principles to achieve predictable results. And I won't spend time dissecting Campbell's review. He doesn't say anything that amounts to much beyond the Gandalf quote above (I can't shake this mental image of Campbell on the bridge, wielding a carrot and handful of wheat against a cow with a platter of bacon on its back). Jimmy Moore already did a great job of chewing up Campbell's argument, so I'll direct you there and to the links within (definitely see also Chris Masterjohn's review of "The China Study", and Campbell's unintentionally humorous reply). I just find it funny that Campbell is lecturing anybody about the scientific method, when he seems to apply it selectively, if at all. For instance, see his discussion about his personal "scientific philosophy" and "holistic" approach in The Protein Debate. I think it's pretty clear that Campbell is a conditional fan of the "scientific method," as long as it leads you to conclusions that agree with his own.
BTW, if you haven't read The Protein Debate, you should. For a long time you had to pay for access, but now it seems to be available for free. Loren Cordain provides a review of a lot of interesting evidence ranging from archaeological to biological, along with tons of references. Cordain has his own axe to grind, of course, so don't be fooled into thinking he's giving the whole picture. But he certainly provides a lot more background (164 references) than Campbell (0 references). Funny that Campbell complained in his Amazon review that Atkins never published a peer-reviewed paper and lectured on the requirement of peer review in "real" science (shoe check), yet provides no references of his own when arguing his position. Read Campbell's part in the debate for lots of "check his shoes" examples. Plus it's great fun to see Campbell get handed his own ass - on a platter, with a side of bacon.
The second example is a letter to Science Magazine, entitled "Climate Change and the Integrity of Science". According to the guardian.co.uk,
A group of 255 of the world's top scientists today wrote an open letter aimed at restoring public faith in the integrity of climate science.
In a strongly worded condemnation of the recent escalation of political assaults on climatologists, the letter, published in the US Journal Science and signed by 11 Nobel laureates, attacks critics driven by "special interests or dogma" and "McCarthy-like" threats against researchers. It also attempts to set the record straight on the process of rigorous scientific research.
Wow, 255 scientists including 11 Nobel laureates? That's a lot of shoes to check. And we'll have to check those of Nobel winners twice.
The letter actually gets off to a good start:
We are deeply disturbed by the recent escalation of political assaults on scientists in general and on climate scientists in particular. All citizens should understand some basic scientific facts. There is always some uncertainty associated with scientific conclusions; science never absolutely proves anything. When someone says that society should wait until scientists are absolutely certain before taking any action, it is the same as saying society should never take action. For a problem as potentially catastrophic as climate change, taking no action poses a dangerous risk for our planet.
Clearly you cannot wait until uncertainties are resolved before making choices about how to deal with the possible outcomes of those uncertainties. And in theory, science is all about performing inference in the face of uncertainty, understanding how incomplete information about the world informs beliefs about competing hypotheses. Alas, the letter ruins this excellent start by espousing the opposite course, demanding that we should agree with their "facts":
Scientific conclusions derive from an understanding of basic laws supported by laboratory experiments, observations of nature, and mathematical and computer modeling. Like all human beings, scientists make mistakes, but the scientific process is designed to find and correct them. This process is inherently adversarial—scientists build reputations and gain recognition not only for supporting conventional wisdom, but even more so for demonstrating that the scientific consensus is wrong and that there is a better explanation. That's what Galileo, Pasteur, Darwin, and Einstein did. But when some conclusions have been thoroughly and deeply tested, questioned, and examined, they gain the status of "well-established theories" and are often spoken of as "facts." For instance, there is compelling scientific evidence that our planet is about 4.5 billion years old (the theory of the origin of Earth), that our universe was born from a single event about 14 billion years ago (the Big Bang theory), and that today's organisms evolved from ones living in the past (the theory of evolution). Even as these are overwhelmingly accepted by the scientific community, fame still awaits anyone who could show these theories to be wrong. Climate change now falls into this category: There is compelling, comprehensive, and consistent objective evidence that humans are changing the climate in ways that threaten our societies and the ecosystems on which we depend.
Oh brother, how much self-aggrandizing hyperbole can you pack into two paragraphs? Right off we get the lecture on the scientific method. The authors compare themselves to Galileo, Pasteur, Darwin, and Einstein (such name-dropping is another indication a shoe-check is required). The comparison with other "well-established" theories also needs some examination in comparison with the anthropogenic global warming (AGW) hypothesis:
- The Big Bang (or whatever process created the Universe), formation of the Earth, and evolution have all occurred already. For that matter, so has significant climate change on Earth, without help from human beings. What we don't have is a way of testing specific predictions about the behavior of a very complex nonlinear system, namely that human behavior is the driving force behind the recently observed global temperature variations, and that changes in human behavior can alter the course of future climate change. Big difference.
- The Big Bang, while "well-established" in the minds of physicists, is really only well-established in a semi-dogmatic sense. There are fairly major holes in the theory, in terms of predictive power. The current hypothesis required for getting from a Big Bang event to the Universe observed today ("inflation") has no evidential support - at all. It may be the best hypothesis we have at this point, but there's plenty of room for it to be supplanted by new information (and it wouldn't require much). The example is the most appropriate one for comparison to the AGW hypothesis, though for reasons opposite what the authors intended.
- Estimates of the age of the Earth leverage some other very basic "facts", amongst them that statistical behaviors of radioactive elements are observed to be the same every time we look. The nucleus of an atom on the Earth largely can be treated as an isolated system: it doesn't have a whole lot of complex interactions with the environment, in particular there really aren't any nonlinear feedback loops or other dynamical behavior to consider when doing radioactive dating. Inference of the age of the Earth can then be performed with some accuracy, as the relevant "givens" and observations don't admit much uncertainty. By contrast, global climate has many MANY interacting variables, most of which we probably don't even know about yet, and considerable uncertainty underlying the ones we do know about. It is difficult to see how any specific prediction of the future dynamic behavior of global climate could be as accurate as that for the past behavior of radioactive elements that have been sitting around in a rock for billions of years.
- Evolution is about as close to a "fact" as you're going to get. First of all, it effectively follows from a combination of the "laws" of thermodynamics (mainly the first and second) and the ability of a system (whether it is a molecule or a complex organism) to a) maintain a relatively narrow set of states against environmental fluctuations, and b) reproduce itself at a rate greater than its destruction. Evolution is just math, in the end. And of course, it is observed repeatedly in the laboratory and Nature. There may be many specific models that predict different evolutionary endpoints, or routes by which currently observed endpoints were achieved. But the fundamental phenomenon, that mutable self-reproducing systems will evolve, applies to all of these models, and all predictions are necessarily consistent with this "meta-behavior". By contrast, global climate is an instance of a specific system, which we model given what (very little) we know about the intertwined physical, chemical, and biological systems on the Earth, and continued warming is a specific prediction of that model. As climate is a system showing chaotic behavior across many timescales, it may be fundamentally unpredictable, for all practical purposes. So calling this prediction a "fact" is stretching thin even the approximate definition of "fact" made by the authors.
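To illustrate the "evolution is just math" claim, here's the arithmetic in its most stripped-down toy form (all numbers are mine, purely illustrative, and this obviously ignores mutation, drift, and everything else interesting):

```python
# Toy replicator arithmetic: nothing is assumed beyond two
# self-reproducing types, with B copying itself slightly faster
# than A. Selection falls straight out of the exponentials.
def fraction_of_b(pop_a, pop_b, rate_a, rate_b, generations):
    for _ in range(generations):
        pop_a *= rate_a
        pop_b *= rate_b
    return pop_b / (pop_a + pop_b)

# Start even; give B just a 5% per-generation edge. After 200
# generations, B has all but taken over the population.
print(fraction_of_b(1.0, 1.0, 1.10, 1.155, 200))
```

That inevitability is the "meta-behavior" that any specific evolutionary model inherits, and it's exactly the kind of robust, model-independent prediction that chaotic climate dynamics don't offer.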
We also call for an end to McCarthy-like threats of criminal prosecution against our colleagues based on innuendo and guilt by association, the harassment of scientists by politicians seeking distractions to avoid taking action, and the outright lies being spread about them. Society has two choices: We can ignore the science and hide our heads in the sand and hope we are lucky, or we can act in the public interest to reduce the threat of global climate change quickly and substantively. The good news is that smart and effective actions are possible. But delay must not be an option.
I think everybody involved here is "ignoring the science" in one way or another. Threats of criminal prosecution are the sort of idiot knee-jerk response made by politicians, who, incapable of thinking for themselves, blindly follow the "expert du jour". When it turns out the politician made stupid and shortsighted decisions based on "expert" advice, they want to turn on the expert rather than accepting responsibility for acting like an idiot. Physician, heal thyself!
But the authors of this letter are no better. AGW proponents seem to ignore the elephant in the living room: the climate is probably going to change at some point whether or not human activity has anything to do with it. If anything is going to doom humanity, it is our anthropocentric view, that we are the masters of the Earth, able to bend Nature to our will. History shows that environmental conditions are largely unstable, requiring organisms to adapt or die. We clearly should not ignore the possibility of climate change and the effects it will have on human life. But should we focus our resources on trying to force Nature to behave as we wish (and probably failing over the long term)? Or is it better to learn from history, assume that change is coming, and figure out how we will adapt to Nature's whims? I'm guessing the personal goals of the "scientists" align strongly with one of these scenarios, not so much the other.
And that's the real issue with both examples: the gap between the personal goals of those providing information and the goals of the receivers of that information. I've discussed this before, more in the context of organizations like pharmaceutical companies. But scientists are just as self-interested as any other organism or organization. The personal goals of academic scientists are centered around career advancement and getting funding for research. For both, you need to make some scientific hypothesis and be "right" about it, not necessarily in the sense of having actual evidence quantitatively weighting the hypothesis, but in getting some large chunk of the scientific community to buy in. Achieving said buy-in is the core goal of academic scientists, and whether or not "consensus" is obtained through actual evidence isn't really relevant to the practitioners. They generally think that the consensus so obtained is itself evidence that they're right, but there's circular reasoning and confirmation bias written all over that. When you are evaluating the evidence put forth by a scientist, you not only must evaluate the quality of that evidence, but also the context in which it is presented, because the presenter undoubtedly (and probably unconsciously) re-weights things based on their own beliefs and goals. The scientist has a vested interest in being considered "right", which can be a lot different than actually being "right". The stronger those beliefs and goals relative to the actual evidence, the more likely you'll hear about "facts" and the "scientific method" as opposed to detailed evidence, both supportive and contradictory.
So when a scientist speaks, be sure to check the shoes.
Saturday, May 8, 2010
Roundup — originally made by Monsanto but now also sold by others under the generic name glyphosate — has been little short of a miracle chemical for farmers. It kills a broad spectrum of weeds, is easy and safe to work with, and breaks down quickly, reducing its environmental impact.
Sales took off in the late 1990s, after Monsanto created its brand of Roundup Ready crops that were genetically modified to tolerate the chemical, allowing farmers to spray their fields to kill the weeds while leaving the crop unharmed. Today, Roundup Ready crops account for about 90 percent of the soybeans and 70 percent of the corn and cotton grown in the United States.
But farmers sprayed so much Roundup that weeds quickly evolved to survive it. “What we’re talking about here is Darwinian evolution in fast-forward,” Mike Owen, a weed scientist at Iowa State University, said.
Now, Roundup-resistant weeds like horseweed and giant ragweed are forcing farmers to go back to more expensive techniques that they had long ago abandoned.
My first reaction on reading this was that Monsanto obviously screwed up. I mean, what idiot couldn't see this coming? But on second thought I'll bet they did see it coming. The later portion of the article discusses how Monsanto and other chemical companies are developing genetically-modified food plants (wheat, corn, soy) to be resistant to other herbicides as well (including one using a component of Agent Orange - mmmmm, Agent Orangey tofu). So of course, farmers will now have to buy additional herbicides, and probably pony up more cash for the next generation of resistant seeds. And you can see that going on indefinitely, with the cash register ringing the whole time for Monsanto etc.
And to be clear: I don't think that companies like Monsanto are doing something evil. They're behaving exactly the way we ask them to in a (more or less) free market economy. They are taking a strategy that maximizes their value (or at least their assessment of it). That strategy may or may not have anything to do with maximizing your health or minimizing environmental impact. If there's any evil here, it's that of complacency on the part of the consumers, who (as a group) hold the ultimate power to change how corporations value their strategy. Corporations are notoriously short-sighted, as demonstrated by how readily many major financial institutions drove their respective buses off of a cliff recently. The start-up I used to work for developed a whole set of mathematical and software tools with the idea of allowing public corporations to value long-term strategy in the face of uncertainty. We spent some time studying how corporations actually make decisions vs. how they should given a way of optimizing value given whatever they knew (and knew they didn't know). The gap is typically quite large. Corporations, like people, are shortsighted, and much better at rationalizing why they did something after the fact than making a rational decision in the first place.
The good news is that corporate myopia gives consumers a fairly large lever. If you want corporations to "care" about your long-term health and well-being, be an informed consumer, and make your buying choices to reflect your own goals. It's the "informed" part that's important here.
I wonder how the course chosen by chemical/seed companies will play out. Maybe something like this:
- Continued increase in the spectrum of herbicides, resistance of weeds, and genetic engineering of food crops. At some point, the weeds are basically resistant to anything that won't outright kill humans.
- Companies introduce a genetically modified bug to eat the weeds. New food crops are engineered to produce chemicals that repel the bugs. The insects eventually kill off most of the weeds, but evolve to be resistant to the food crop insect toxins, and start eating our food.
- The cycle continues, with ever-more genetically engineered species introduced from higher in the food web.
- Eventually, genetically engineered humans are produced to act as workers to contain all of the new pest species. These "humans" are built to thrive on weeds, and as such prove to have considerably greater reproductive fitness than the old-school "natural" humans, whose fate as a species is basically sealed.
Friday, May 7, 2010
I've blabbed before as to how I've often asked nutrition experts "What's so healthy about 'healthy whole grains'?" I've never gotten an actual answer, and as far as I can tell the best one could say is "nothing in particular." And while I have discussed the possible ways that grain consumption could lead to disease, I would have to admit that the evidence that grains have some particular disease-causing properties (outside of those with obvious clinically-detectable problems, like celiac) seems more correlation than causation at this point.
So I've started rethinking this question more as "why does anybody eat anything?" Clearly the need, at some level, to seek out and consume food has to be innate. And animals evolve amazingly complex behaviors around food. I remember giving my dog an egg for the first time, shell and all. As he does with any food, I expected him to swallow it more or less whole, maybe with a couple of crunches for good measure. Instead, he gently picked it up from his bowl, put it on the ground, and ever-so-delicately cracked it open with his front teeth, then licked out the inside and left the shell. I'm pretty sure that wasn't a learned behavior, unless he's been climbing trees and getting into robins' nests behind my back.
But in general, and probably particularly for omnivores, directed behavior associated with food (like "go find some more of those sweet orange spherical thingies") is learned. Babies put everything in their mouths for a reason: they're figuring out which things are worth seeking out and sticking in their mouths again. You may want to check out this fascinating paper on the topic. The short version is this: there seem to be two main areas of the brain associated with taste. The primary taste cortex handles the innate sensing of taste: sweet, salt, bitter, sour, and umami, along with the texture and viscosity of food (to sense fat), temperature, capsaicin, etc. The response of the primary taste cortex is NOT attenuated by satiety. Something sweet tastes just as sweet whether you're hungry or full. But the primary taste cortex doesn't assign value to a particular taste, i.e. it does not decide whether something tastes "good" or "bad". That's the job of the secondary taste cortex. It is the secondary taste cortex that "decides" sweet things taste good when you're hungry, but not so much after eating a whole box of candy. Secondary taste cortex neurons learn what's good and what isn't, and are further tuned to specific foods. For instance, you can be fed to satiety with fat, and certain neurons will decrease their response to further fat. But the response of those same neurons to the taste of glucose does not decrease, regardless of whether or not you're full of butter. In other words, "there's always room for dessert".
Anyway, let me get to the punch-line from the closing paragraph:
The outputs of the orbitofrontal cortex reach brain regions such as the striatum, cingulate cortex, and dorsolateral prefrontal cortex where behavioural responses to food may be elicited because these structures produce behaviour which makes the orbitofrontal cortex reward neurons fire, as they represent a goal for behaviour. At the same time, outputs from the orbitofrontal cortex and amygdala, in part via the hypothalamus, may provide for appropriate autonomic and endocrine responses to food to be produced, including the release of hormones such as insulin.
In other words, the external response to food (behavior) is a learned response driven by the secondary taste cortex, while the internal response (e.g. hormonal) is innate, originating in the primary taste cortex. That means that you learn what things taste "good" by the secondary taste cortex integrating feedback (positive and negative) from the rest of the body (primary taste cortex, glucose sensors, etc.), reinforcing or weakening the association of that taste with the behavior that led to those stimuli. So the fact that you "like" potato chips is intimately tied up with the impulse to get off the couch at midnight and stumble into the kitchen to finish off the bag. And the only reason you "like" any food is because your brain learned to, associating the flavor with some feedback signals which it interprets as being associated with a net positive outcome.
One other point which is probably obvious, but important: the shorter the time between the flavor stimulus and the relevant physiological response, the stronger the change in association with the behavior. Thus, getting cancer 10 years after eating a poisonous plant is not very helpful in weakening that behavior. It is certainly possible to crave something that produces a strong short-term reward, but has a net negative outcome. The brain (both consciously and unconsciously) is notably short-sighted in its assessment of value.
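As a rough sketch of that last point (a toy model with parameters I made up, not anything from the paper): update an associative strength toward an exponentially discounted reward, and the gap between immediate and delayed feedback is dramatic.

```python
# Toy model (illustrative parameters only): associative strength
# is nudged toward an exponentially discounted reward on each
# trial. The identical reward, delivered with more delay, barely
# moves the learned value at all.
def learned_association(reward, delay_steps, discount=0.9,
                        learning_rate=0.1, trials=100):
    value = 0.0
    for _ in range(trials):
        discounted = reward * discount ** delay_steps
        value += learning_rate * (discounted - value)
    return value

immediate = learned_association(reward=1.0, delay_steps=0)
delayed = learned_association(reward=1.0, delay_steps=30)
print(round(immediate, 2), round(delayed, 2))
```

With these (arbitrary) numbers the immediate reward produces an association roughly 20 times stronger than the same reward arriving 30 steps late, which is the brain's short-sightedness in miniature.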
Which brings me back to the original question: why do people eat grains? And I don't mean that as implying there's some moral judgment to be made - food morality is just another religion. And there's obviously a spectrum of answers depending on the temporal proximity of the act of eating to a specific endpoint. On one end is "prepared properly, they taste good" (I like sourdough toast dripping in butter as much as the next guy, though I eat it rarely). On the other end is the evolutionary argument so brilliantly put forth by Kurt Harris, basically that the net effect of domesticating grains was an advantage in reproductive fitness over hunter-gatherers, regardless of the relative "health" of those doing the reproducing. Evolution cares about making babies, and doesn't care if you have bad teeth and a bum ticker, as long as you contribute genes to more babies than the guy still killing perfectly serviceable beasts of burden with a rock on a stick.
No, I'm interested in the middle area (logarithmically speaking), which is why we learned to like grains. And why do we like them so much that we're willing to go to some amount of trouble to eat them? Why do I so love sourdough toast and butter, even though it doinks my blood sugar and gives me acne?
(Maybe it's the butter - New Zealand makes REALLY good butter.)
I have nothing but vague guesses, and am hoping to get some interesting discussion in the comments.
Sunday, April 18, 2010
The author, Helene York, provides a wonderfully clear example of "pseudo-logic", reasoning that is technically correct, but based on flawed or incomplete assumptions. Check out this quote:
Linked to cardiovascular disease and maligned for its industry's dependence on federal corn subsidies, it now has a reputation as the Hummer of foods—an excessive contributor to environmental ills including climate change, nitrogen blooms, pollution, and depletion of Midwestern aquifers—not to mention E. coli contamination that has sickened and scared thousands.
Hmmm, sounds like the root of the problem here is the federal corn subsidies. Bon-Appetit Management, where Ms. York is the director for strategic initiatives, ran the cafe at my former company. I know from personal experience that a good chunk of the food provided by Bon-Appetit is made possible by federal subsidies for corn, wheat, and soy. And of course there is a tidal wave of scientific evidence emerging that said foods are the more likely culprit in cardiovascular disease, via the metabolic disturbances they create. The evidence that red meat per se causes any disease has, to my knowledge, never risen above association (the E. coli issue is a problem with factory farmed animals, and only then for people whose health is otherwise compromised, maybe from eating "healthy" soy goo and avoiding the sun).
Here's another classic:
Voluntary rancher fees from an industry association's advocacy program have underwritten pro-meat marketing campaigns, stipends for researchers to raise doubts (but not conclusive evidence) about scientific studies, and dissemination of talking points that are misleading at best. "Reducing intakes of meat and dairy would only lead to hunger," I read recently, and the headline of an industry newsletter stated, "Meat and dairy intakes not linked to climate change." These news items represent a disturbing trend: raise doubts, obfuscate the facts, and misinform.
Isn't that EXACTLY what Ms. York is doing here? What makes her "facts" better than those she criticizes? Why are her studies more "scientific" than those that contradict her "conclusive" evidence? Talk about confirmation bias. This is the fundamental problem we face when turning scientific information (or more precisely, the lack thereof) into decisions. Humans seem to have a psychological propensity to gravitate toward "absolute truths", and their absolute belief in those truths is motivated more by social and emotional factors than any sort of actual accounting of the evidence. Indeed, people like Ms. York seem to get wound around some sort of moral axle that drives their reasoning process. Beef is "bad" in her world. That's a "fact". Thus beef must be bad for your health, the environment, at the root of the global economic meltdown, bad hair days, etc. And maybe I'm pessimistic, but I have a feeling that, more than anything, serving beef might be "bad" for Bon-Appetit's bottom line. I would guess it is cheaper to sling soy/corn/wheat processed food (where you can reap the benefit of less prep and fewer annoying middlemen sucking off the teat of government subsidies).
But let's be optimistic, and presume Ms. York's motives are altruistic, that she really wants to save our hearts and our planet from the evils of a nice juicy steak. Does her reasoning hold water? I believe you would need to take the following assumptions as "facts" to support her conclusions:
- Human activity causes global warming.
- This warming trend will continue.
- Changes in human activity can reverse the trend.
This is where we run into trouble. The implication is that we have a great enough understanding of global climate to make reliable predictions, and further that, even if we had such detailed understanding, the climate's behavior could be reliably extrapolated decades into the future. I'm no expert in global climate, but I know a thing or two about modeling complex systems, particularly in the face of uncertainty about the details. I seriously doubt that global climate models even begin to approach anything beyond a coarse representation of reality. There are plenty of aspects to the problem that we know we don't know, like the response of aquatic life to increased CO2 concentrations. There are significant uncertainties as well, e.g. solar and volcanic activity. And no doubt there's plenty of stuff we don't even know about, the "don't know what you don't know" category.
And it gets worse. Climate is basically just another word for weather. I don't know if you've noticed, but it's pretty hard to predict the weather even a week into the future, much less 50 years. And short-term weather modeling is much better understood for the simple reason that when examining a shorter time period, fewer variables are likely to have a large effect (e.g. large glaciers don't change enough in a week to affect your forecast significantly). Even so, the weather remains unpredictable, and this unpredictability is intrinsic. Weather is an example of a non-linear system, one which exhibits a phenomenon called deterministic chaos. A brief digression might be in order.
Consider a simple experiment, say measuring the time it takes a marble to fall from a height of one meter. We call such a system "deterministic" because the equations used to model it have no uncertainty. Given a particular position and velocity for the marble, we can calculate the precise position and velocity an instant later. And this is a good approximation in our experiment. There might be a little uncertainty in how our hand releases the marble, some from the measurement of the height, maybe some from air currents, etc. But we can repeat this experiment and get pretty much the same results every time. In other words, small errors in our information about the marble's state translate into small errors in our predictions. The more accurate our information about the marble, the more accurate our prediction of the time to fall one meter.
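To make the marble example concrete, here's a short sketch (my illustration, not part of the original experiment) using the elementary kinematics formula t = sqrt(2h/g): a small error in the measured height produces a proportionally small error in the predicted fall time.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def fall_time(h):
    """Predicted time (s) for a marble to fall height h (m) from rest,
    ignoring air resistance: t = sqrt(2h / g)."""
    return math.sqrt(2.0 * h / G)

t_exact = fall_time(1.000)      # the "true" one-meter drop
t_perturbed = fall_time(1.001)  # our height measurement is off by 1 mm

# A 0.1% error in height yields only a ~0.05% error in the prediction,
# because errors propagate through sqrt() at roughly half strength.
rel_error = abs(t_perturbed - t_exact) / t_exact
```

Small input errors stay small forever, which is exactly what makes this system predictable in practice.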
A system exhibiting deterministic chaos is deterministic in the strict sense of the term: given precise knowledge of its state, we can predict exactly what will happen next. But unlike our marble experiment, chaotic systems amplify uncertainty. In other words, even small inaccuracies in your information about the system quickly become large. Worse yet, this amplification is exponential in time, so getting more accurate information might make the system predictable for a slightly longer period, but it's still going to fall apart on you pretty quickly. Chaotic systems are thus predictable only in principle: in practice your information is never perfect, and predictability decays exponentially with time. Deterministic chaos as we now think of it was "discovered" by Edward Lorenz, who was modeling (you guessed it) global weather.
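The contrast with the marble can be seen in a few lines of code. Here's a minimal sketch (my illustration, using the logistic map rather than Lorenz's weather equations, since it's the simplest textbook example of deterministic chaos): a measurement error of one part in a billion is roughly doubled at every step, so after a few dozen steps the prediction is worthless.

```python
def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map x -> r*x*(1-x), a standard chaotic
    system at r=4, and return the full trajectory from x0."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.4)         # the "true" initial state
b = logistic_trajectory(0.4 + 1e-9)  # measurement off by one part in a billion

# Early on the two trajectories are indistinguishable...
early_gap = max(abs(x - y) for x, y in zip(a[:6], b[:6]))

# ...but the tiny initial error is amplified exponentially, so over the
# later steps the trajectories bear no resemblance to each other.
late_gap = max(abs(x - y) for x, y in zip(a[30:], b[30:]))
```

Both systems are fully deterministic; the difference is entirely in how they treat your ignorance. The marble forgives a sloppy measurement, the logistic map punishes it exponentially.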
So, even assuming that the East Anglia boobs, with their lost data and bogus statistical analyses, were "right" about there being a significant increase in mean global temperatures, how does that help us predict the future behavior of a complex chaotic system where our models are incomplete and full of uncertainties?
Now when I drop this line of argument during discussions of global warming, the AGW crowd (after a bit of cognitive-dissonance-induced brain paralysis) comes up with something like the following argument: human activity MIGHT be causing global warming, and since the downside has a value which is essentially negative infinity (extinction of the human race), we have to do everything possible to avoid it. Such an argument is more pseudo-logic, in this case by excluding the most likely scenario. AGW arguments center around whether or not the (supposedly) observed warming trend is caused by humans, and extrapolate that to conclude that humans might be able to reverse said trend. But this ignores the most likely scenario, which is that the climate will undergo a significant shift regardless of anything humans have done or will do. Why do I say this is the most likely scenario? Because it has happened many times in the past, and given the chaotic nature of climate, it is unlikely to stay in the current meta-stable state for long (many argue that the rise of civilization was made possible by an unusual relative stability of climate). Arguments such as those put forth by Ms. York completely miss the point. We don't need to worry about whether eating fewer hamburgers can affect the climate, we need to start hedging our risk that the climate will change regardless of what we do. It's the short-sighted thinking and associated bad decision-making of individuals like Ms. York that will doom us, missing that the forest around them is burning down while hugging the tree right in front of their face.
But back to the main thread. So the whole global warming argument is bogus. That's about as close to a fact as you're going to get, since it's really just mathematics (climate is chaotic, our knowledge of it is uncertain). Let's wander from math to the realm of science, where we consider evidence. The more detailed assumption underlying Ms. York's proposal is that cattle farming is particularly bad for the environment. She's basically equating cows with global destruction. This raises the question of how the Earth managed to survive millions of years of grazing animals, all of which presumably had the same basic digestive strategy as modern plant eaters:
- Possess large gut full of bacteria which can break down cellulose.
- Eat plants, and lots of 'em.
- Bacteria eat the cellulose, make CO2/methane/etc. as by-products.
- Fart voluminously to avoid exploding.
So if there's any "myth" to be dispelled here, it's that there's anything "green" about the food industry, which includes Ms. York and her employer, Bon-Appetit Management.
And winding back to the usual topic of this blog, we shouldn't forget the health consequences of a diet consisting mostly of processed soy/wheat/corn. There is plenty of evidence from all corners indicating that "diseases of civilization" arise from said foods. I'm still waiting for someone to detail the metabolic pathways by which eating a steak leads to diabetes, cancer, and heart disease. Any takers?
Thursday, April 8, 2010
I've been trying to get a blog post done on some recent thoughts on the laws of thermodynamics and metabolic regulation. I'd hoped to have it up by the time Jimmy's show aired, but didn't quite make it. Hopefully I'll wrap it up by this weekend - stay tuned.