Thursday, December 24, 2009

Review of the SousVide Supreme

Wow, the last post was in August. Been pretty busy with the day job lately. Odd how it works out - my startup job (still close to my heart), for which I probably worked 10-12 hours a day, still seemed to leave me more time to do things like cook for the family. That's probably because I got to choose which 10-12 hours I worked, rather than having to spend an hour commuting each way and a solid 8+ hours sitting in a cubicle.

I have to admit, when I first found out that the Eades' world-changing project was a home sous-vide unit, I was a tad disappointed. I was familiar with the concept of sous-vide, being a fan of shows like Top Chef. I also have to admit that in hindsight I really didn't "get it". The brilliance of the SousVide Supreme is that it enables my food habit in the face of my new work regimen. Our family eats meat - lots of meat. And they've become accustomed to it being prepared to a certain standard which is not really possible to achieve on an 8-5 schedule, because I generally get home after 6. To prepare a good roast chicken or steak (accompanied by Bordelaise sauce, the omission of which will lead to family chilliness until remedied) takes at least 2 hours.

So it was a lot of crockpot and takeout during the week - until the SousVide Supreme came along. I pre-ordered mine, and awaited it with great excitement. It predictably didn't arrive until I was away from home during the Thanksgiving holidays, which caused me a certain amount of childish angst. But I finally got my grubby paws on it, made some righteously tasty food, and am ready to share my initial experiences and impressions.

The short version is this: if you're a meat-eater, get one. It's worth every penny.

I won't go too much into describing the unit, which has been done many other places. It is a little on the large side - but part of the issue is that our cabinets seem to have been made before appliances were invented. And you can cook an awful lot of food for the size. I made two tri-tips a couple of days ago, total weight five pounds, in a device the size of a bread-maker, and almost certainly with far less electricity than would have been required to achieve the same in my oven (which now seems cavernously inefficient). And I think the success of sous-vide can best be described by one guest's comment after the first bite: "Holy crap".

If you don't know, "sous vide" is French for "under vacuum". The sous vide technique involves sealing the food in a vacuum bag and cooking it in a water bath with precisely controlled temperature. There are multiple advantages to this approach. First, because the food (typically meat, though other foods benefit as well) is sealed, there isn't much moisture loss. The vacuum seal also ensures the water contacts the entire surface of the food. Water has much higher heat capacity and conductivity than air, so it transfers heat to the food more effectively than the radiative/convective (air) transfer which occurs in a standard oven. Once up to temperature, the SousVide Supreme apparently requires about the same energy as a 60-watt lightbulb - something like 10x less than a conventional oven, I would imagine.
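Out of curiosity, here's the back-of-envelope math behind that guess. The wattages and duty cycle below are my own assumptions for illustration, not measured figures:

```python
# Rough energy comparison for a 10-hour cook (assumed figures):
# the water bath holding temperature draws about what a 60 W bulb does,
# while a conventional oven element is ~2400 W, cycling on maybe 30%
# of the time once preheated.
BATH_WATTS = 60      # assumed steady-state draw of the water bath
OVEN_WATTS = 2400    # assumed oven element rating
OVEN_DUTY = 0.3      # assumed fraction of time the element is on
HOURS = 10           # a full work-day cook

bath_kwh = BATH_WATTS * HOURS / 1000
oven_kwh = OVEN_WATTS * OVEN_DUTY * HOURS / 1000

print(f"water bath: {bath_kwh:.1f} kWh")           # 0.6 kWh
print(f"oven:       {oven_kwh:.1f} kWh")           # 7.2 kWh
print(f"ratio:      {oven_kwh / bath_kwh:.0f}x")   # 12x
```

So with these (admittedly made-up) numbers, the "10x less" guess is in the right ballpark.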

But the real winner is that you set the water temperature to be the same as the final desired food temperature. "Normal" cooking requires a certain amount of precision by the chef. One applies relatively higher heat to the meat in an attempt to get the inside "done" before the whole thing turns to jerky. Unless you have a meat thermometer, the whole business is more art than science, because two pieces of meat have different fat/moisture/salt/etc. contents, all of which affect the thermal conductivity and the rate at which "doneness" is achieved. For instance, grass-fed beef typically has much lower fat content than grain-fed, and as a result cooks much faster (and is more rapidly rendered inedible). A thermometer helps, but of course the thermometer only measures the temperature at the center of the meat, at the location inserted, which may be of different size/fat content/etc. than the rest of the meat. I used to have a whole arsenal of techniques and tricks depending on the cut of meat, what it ate, etc.

With sous-vide, you just pick your final temperature. The aforementioned tri-tip was done at 128F. Imagine trying to cook a steak at 128F in your oven. Not only would it take forever, but you'd be left with something resembling the bottom of a shoe at the end of the process. Better yet, you can leave the meat in the thing for a considerable amount of time (I'm talking hours) without risking overcooking. For instance, when we had our guests a couple of nights ago, I took out one tri-tip, gave it a shot in the broiler to give it some color (more on this in a moment), and left the other one in while we fed the kids (who hammered a good chunk of the first steak). About an hour later I just pulled out tri-tip #2, browned it up, and served hot. And it was outrageously good: tender, juicy, and brimming with flavor.

And this has proved to be the real winner for me, with my new commuter lifestyle. I can drop in some steaks/chicken/chops before I leave for work in the morning, and have fabulous meat ready to eat 10-12 hours later when we all get home, plus a few minutes to heat a pan or the broiler and apply a tasty brown crust. And I'm not kidding about the fabulous. It does take a bit of experimentation with temperature and preparation to really nail it. I'll share a few things I've learned.

First is that "doneness" of meat results from a non-trivial combination of time and temperature. If you really want to nerd it up on this topic, check out "A Practical Guide to Sous Vide Cooking", originating from my alma mater (go Buffs!). The killing of any nasty bugs that might ruin your post-dining experience also results from a similar combination of time and temperature. Anyway, the first thing I tried was a London broil, which I cooked at the recommended 134F on a work day. So it cooked for about 10 or 11 hours at that temp. Was it the best steak ever? No (though my son claimed it was). But it was pretty darned good, a touch dry in texture (a sign it was starting to overcook), but nice and pink in color. Sous vide lesson 1: if you're going to leave your meat in for a long time, lower the temp a bit. The next try was with rib-eyes, done at about 130F, I believe; very nice, though they could have been done lower still. The tri-tips came out great after 6 hours at 128F, and I think I'd drop it to 126F if I were going to leave it in all day.

The next couple of tries were with chicken. Both were done as work-day meals, using breasts, legs, and thighs cut up. These were sealed with butter, salt, and pepper, and again cooked for about 11 hours. The first batch I did at the temperature recommended in the SousVide Supreme manual, which I think was 141F. The breasts were a bit dry (I'm a dark meat person by a long shot), but still better than most chicken breasts I'd had. The thighs and legs really shone, though: juicy and very flavorful. The next batch I did at 136F, and they were dynamite. We all know the old saw about "tastes like chicken", which I thought was odd, since most chicken I'd had didn't taste like much of anything by itself. Not the sous-vide version, though. Tremendous flavor, and a big hit with the family. The downside: I tried to brown the skin in my stainless steel pan, but for the most part it just stuck, leaving all the tastiness behind. I'm going to try the broiler, but I think sous-vide lesson 2 is to have a kitchen torch handy for browning. This allows high heat to be locally applied, to minimize the risk of drying out. Believe me, once you've had sous-vide chicken, you're not going to want to take that risk.

We also tried pork chops, which I just sealed with salt. The chops themselves were just bulk-package center-cut, a little on the thin side. They came out fantastic, far more succulent and tasty than any pork chop I'd ever had. The bad news is that I again tried to brown in a pan, which dried them up pretty quickly. Sous-vide lesson 3: use thicker cuts of meat to prevent drying when you brown. I'll give it another whirl with some nice thick-cut chops.

We've had some nice success with "contrary" cooking, using our new pressure cooker in conjunction with the SousVide Supreme. The "contrary" comes from the fact that things you usually cook fast with the stove or oven are cooked slowly by sous-vide, and things usually done slowly in a crockpot are done quickly (relatively) in the pressure cooker. One example is cheeseburgers topped with pulled pork. I did the burgers for 3 hours at 134F - pretty good, though again I think I could go lower, particularly considering that I'm going to brown them in a pan. The pulled pork takes about an hour under pressure, and I just let the pressure release naturally over another hour or so. The combination is fabulous.

A better example was inspired by an interview with Heston Blumenthal while he was traveling on the SousVide Supreme tour. He stated that he always did stocks in the pressure cooker, since otherwise the flavors escape. This was a bit of a light-bulb moment for me (which ultimately led to the purchase of the pressure cooker). I would always make stock on the stove, cooking it for about 24 hours. My wife complained bitterly that the smell was driving her crazy because it made her hungry. I think she was having the same insight as Blumenthal. There were some additional issues with cooking stock on the stove. One was the time, which meant that stock had to be done in advance in large batches rather than cooking during the work day. If I didn't freeze the stock (which is something of a hassle), it had a tendency to grow interesting bacteria. The bacteria were at least nice enough to be fluorescent pink so I didn't put us all in the hospital.

Now, with the pressure cooker, I can just make stock in parallel while the beef is cooking in the SousVide Supreme. Sauces are a fantastic way to bring variety to meat dishes, and further serve as a vehicle for nutrients that you might not otherwise consume. Here's my recipe for beef stock, followed by that for Bordelaise sauce, which I think is the perfect pairing with steak.

Beef Stock

  • Two or three beef marrow bones, preferably the joint end with lots of cartilaginous goodness attached
  • One package of oxtails (about 0.5-1 lb usually)
  • 0.5 lb sliced beef heart
  • 3 large carrots, coarsely chopped
  • 2 sticks celery, coarsely chopped
  • 1-2 large yellow onions, coarsely chopped
  • One bunch thyme (I use one of those little plastic packages of fresh thyme)
  • One bunch parsley
  • One cup red wine
  • 4 cups water, plus any extra needed to cover
  1. Pre-heat the oven to 350F.
  2. In an oven-safe pan over high heat, brown the bones and oxtails on the stove. Throw in the veggies near the end (note that stores now often carry pre-made mirepoix, chopped carrots, celery, and onions, which saves some prep. I use about 4 cups of pre-made when I can get it).
  3. Put all of this in the oven for 45 minutes.
  4. Put the thyme, parsley, and beef heart in the pressure cooker. Add the browned meat and veggies on top, along with the water. Add extra water if needed to ensure everything is covered.
  5. Deglaze the pan with the red wine. I usually use Bordeaux, as (not surprisingly) it seems to match well with the other flavors in the Bordelaise (which originated in the French region of Bordeaux). Make sure to scrape all the brown goodies off the bottom of the pan, and add all of this to the pressure cooker.
  6. Cook under high pressure for 1.5-3 hours. 3 hours gives the best flavor, but my pressure cooker only times up to 99 minutes. If I'm at home, I do two rounds of 99 minutes.
Sauce Bordelaise

  • 4 cups beef stock
  • 2-2/3 cups red wine (again, I like Bordeaux, and it doesn't need to be expensive)
  • 6 large shallots, coarsely chopped
  • One bunch thyme
  • 8 oz. butter, cubed
  • Salt and pepper to taste
  • Xanthan gum or other thickener
  1. Reduce the beef stock to 2-2/3 cups.
  2. Combine wine, shallots, thyme, salt, and pepper in a sauce-pan. Cook until the liquid is reduced to about 1-1/3 cup.
  3. Strain red wine reduction. A chinois works well for this, and allows you to mash some of the yum-yums out of the solids.
  4. Combine reduced beef stock and red wine in a sauce pan and bring to a boil.
  5. Melt in the butter.
  6. Thicken. I use xanthan gum, which works well, but is fairly touchy. I add a little at a time, give it a few minutes to cook and see how thick things are, repeating until I get the desired consistency. The traditional recipe uses a flour/butter roux as a thickener, which works fine. I try to avoid wheat, and don't really like the flour taste in the sauce anyway. But if you want to make a roux, you make it first and then add the liquid.
This is outrageously good on steak, even more so on sous-vide steak, which retains more beefy flavor and really matches well with the sauce. Use the left-overs for breakfast. Steak and eggs over easy smothered in Bordelaise is a little slice of heaven.

Sous-vide Ice Cream
Home-made ice cream is another of our favorite treats, often made to go along with our steak and Bordelaise. I use a modified version of Dr. Mary Dan Eades' sugar-free recipe. Making ice cream used to be something of a procedure, since the recipe is custard-based (technically a Creme Anglaise). When made on the stove the custard requires constant attention, and you have to temper the eggs, etc. With the SousVide Supreme, you can just mix everything, stick it in a bag, and cook it. Here's the recipe.

  • 1.5 cup half-and-half
  • 1.5 cup heavy cream
  • 2 whole eggs plus 4 egg yolks
  • 0.25 cup Splenda
  • 0.5 cup polydextrose
  • 1 vanilla bean, split and scraped OR 4 T vanilla extract
  • Enough ice water to submerge the bag
  1. Preheat the SousVide Supreme to 82C.
  2. Combine all ingredients in a mixing bowl.
  3. Pour mixture into a vacuum bag. Make sure you scrape out the polydextrose from the bottom. It doesn't dissolve very well in cold liquid, and has a tendency to congeal into a big clump.
  4. Vacuum and seal the bag, and place in the SousVide Supreme (note you can do this with a zip-lock by zipping most of the way, submerging in the water bath to squeeze out the air, then zipping completely shut).
  5. Cook for 20 minutes.
  6. Remove the bag and squish the contents. It's hot, but I'm able to do this with my bare hands, though you can use oven mitts. Pay particular attention to the polydextrose, which settles to the bottom. It will incorporate better in the hot liquid.
  7. Return the bag to the water bath for another five minutes.
  8. Submerge in ice water and squish it around some more. At this point you can either leave it in the ice water to chill, or transfer to the refrigerator.
  9. Dump in the ice cream maker (if you used vanilla beans, remove the pods first).
This is very yummy, as well as easy and fast enough to do on a work night. Polydextrose is soluble fiber, basically polymerized glucose in a configuration that human digestive enzymes can't break down. It has some of the same chemical and flavor properties as sugar, and for the purposes of ice cream, lowers the freezing point of water, giving smaller ice crystals and a creamier texture. I get a kick out of telling people the ice cream they're eating is high fiber. It does make some people gassy, though, so adjust the amount as required.


The SousVide Supreme really is revolutionary, particularly if you have a busy work week. Some of the high points:
  • Makes cooking of gourmet-quality meat nearly fool-proof.
  • Tremendously simplifies cooking of certain dishes (compare the ice cream procedure above with what is normally required for a Creme Anglaise).
  • Low electricity usage compared to an oven.
  • Very well engineered (the universal bag rack is something to marvel at; it no doubt required spatial thinking skills that are well beyond my capability).
  • Food can be cooked in advance, shocked in ice, and frozen. Reheat to the perfect temperature in the SousVide Supreme.
  • Meat can be left in for an extended period without overcooking.
In theory, you should be able to use cheaper cuts of meat. I haven't had a chance to try this yet, but chuck roast sous-vide is next on the menu. I'll let you know how it turns out. In fact, what I really want to try is grass-fed chuck, which in theory would be downright inedible when cooked by normal means. The SousVide Supreme is a tad pricey on the face of it, but I found the benefits to be well worth the money.

Sunday, August 9, 2009

A GUT Feeling about Insulin

Ask ten people how to lose weight (fat), and you'll likely get ten different answers. In fact, if you ask ten "experts" the same question, you'll probably also get ten answers (usually attached to some product or service requiring you to part with some money). Why all of the confusion? After all, it seems a fairly simple question at its base: how do you burn more fat than you store?

I believe there are two key failures in critical thinking underlying the confusion. The first is treating obesity itself as a "disease" which needs to be "cured". Many other diseases (heart disease, cancer, etc.) are associated with obesity, and the prevailing thought is that curing obesity reduces risk for these other diseases. However, this ignores the mountain of evidence that an organism's metabolism is self-regulating. In this view, obesity is a symptom of some underlying disease process which causes systemic failure of metabolic regulation. It is this underlying disease which needs to be fixed; further, it is possible that you can have this disease and not be obese (there are plenty of skinny Type II diabetics). Modern medicine is very skilled at treating symptoms and ignoring the root cause; indeed, this effect is rampant for obesity treatments. How many people do you know that have lost large amounts of fat, only to have it come back worse?

The second failure comes from "black box" thinking. When hearing various prescriptions for curing obesity, I'm reminded of a famous Sidney Harris cartoon. For instance, a friend was recently telling me about a lemon juice diet. You drink lots of lemon juice, and the fat miraculously flows out of the fat cells. This supposedly had something to do with changing the acidity of your blood, but of course when prompted this person couldn't supply any actual physiological mechanism to explain this effect.

To understand the problems with black-box thinking, we can use the example of, uh, a black box. It has a hole where you can put stuff in, and lots of different colored lights that blink in response to whatever you provide as input. Your job is to figure out the rules of how the input relates to the blinking lights. As we try different things we find many combinations of colored lights, with no obvious pattern. For instance, we supply two different cube-shaped objects, but each elicits a different light pattern. So "cubiness" is not apparently relevant to the lights.

The behavior of our black box may appear complex, but we don't really know if it's inherently complex, or if we just lack enough information to tease out the rules. We might crack the box open and examine how it actually works, and find that there really is a simple rule at the core, i.e. specific lights turn on depending on the molecular composition. The rule turns out to be simple, but it's the variety of different inputs that result in apparently complex behavior. Once you know how the box works inside, it becomes relatively easy to predict its response to a given input.
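Here's a toy version of the box in a few lines of Python (the hidden rule, the element names, and the light colors are all invented for illustration). From the outside, two cubes behave differently; once you know that the lights depend only on composition, the behavior is trivially predictable:

```python
# Toy "black box": the internal rule is simple (one light per element
# present), but observed from outside, inputs that look alike behave
# differently and inputs that look different behave alike.
def black_box(obj):
    # The hidden rule: lights depend only on molecular composition.
    colors = {"C": "red", "Fe": "green", "Si": "blue", "O": "yellow"}
    return sorted(colors[e] for e in obj["composition"])

wooden_cube = {"shape": "cube", "composition": {"C", "O"}}
iron_cube   = {"shape": "cube", "composition": {"Fe"}}
iron_sphere = {"shape": "sphere", "composition": {"Fe"}}

print(black_box(wooden_cube))  # ['red', 'yellow']
print(black_box(iron_cube))    # ['green']
print(black_box(iron_sphere))  # ['green'] - shape is irrelevant
```

Two cubes give different lights, while a cube and a sphere of the same stuff give identical lights: "cubiness" was a red herring all along.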

If you notice, most studies on diet and health take the black box approach: they diddle some inputs, and observe how those inputs are associated with the outputs (e.g. fat loss). But if you don't have some understanding of what's going on inside the box, you just wind up with a mass of confusing observations and associations. So the lack of consensus and mercurial nature of dietary recommendations should come as no surprise.

Unification and Symmetry
Science often faces such situations. The core difficulty is a lack of symmetry. Symmetry means "sameness in the face of change". A perfectly smooth cue ball will look the same no matter how you turn it. Paint some dots on the ball, and you break the symmetry.

We often encounter cases where our observations seem to reflect a lack of symmetry, but if we look hard enough we find a deeper symmetry, one that unifies our observations under a common model. Such was the case in particle physics in the 20th century. Physicists had observed a vast zoo of different particles, first in cosmic rays (high-energy particles from space), then in "atom smashers". There were also four apparently disparate "forces" of nature: electromagnetic, weak nuclear, strong nuclear, and gravitation. The drive (which continues today) was to unify these different things by identifying the underlying symmetry. A "grand unified theory" (or GUT) would explain all subatomic phenomena with a single model. Some progress has been made, e.g. many of the different particles were found to be composed from a much smaller family of more fundamental particles called quarks. The electromagnetic and weak nuclear forces (the latter causes radioactive decay) were discovered to be one and the same, the apparent difference occurring because the universe is relatively cold.

A Unified Theory of Fat Storage
Can we find a corresponding unifying principle for how fat loss and gain are related to diet? I think the answer is a qualified "yes". We likely need to restrict the domain of our model to one where the observed effect (obesity) has a common cause. Metabolic regulation is complex, and excess fat storage can have multiple root causes. We'll focus here on one possible cause, because it appears to be common and becoming more so: too much insulin, and/or not enough sensitivity to that insulin. Insulin is arguably the boss hormone for metabolic regulation: it affects many systems, and itself is affected by many factors. By examining the effect of insulin both on the behavior of individual cells and at the level of global metabolic regulation, we can in effect "open the box": see how inputs affect insulin and insulin response, then follow the effects of insulin in the body, particularly on fat storage.

I am going to make the bold claim that insulin is the unifying factor, tying together many different observations about fat gain/loss. I intentionally said "many" instead of all, because there are other metabolic pathways influencing fat storage (e.g. increased adrenaline promotes release of fatty acids from fat cells). I'll make the further claim that just about any successful reducing strategy (one that results in fat loss) can be explained by its effects on insulin, whether that strategy involves diet, physical activity, drugs/supplements, or a combination. We should also be able to explain the relative efficacy of different strategies, both in terms of rate of fat loss and final equilibrium fat mass (e.g. many diets result in fat loss, but all seem to "stall" at some point; we should be able to explain this stall via our model). Some examples are given below.

Our Grand Unified Theory then provides a more solid foundation for discussing the relative merits of different reducing strategies, and more importantly for making decisions about which lifestyle modifications are most appropriate. Instead of sifting through piles of observational evidence and "expert" testimony, you simply ask two questions:
  1. Is my obesity insulin related? (The answer is probably "Yes" for most, but not all. Those whose obesity has some other cause, like a genetic leptin disorder, will need to seek other avenues of treatment).
  2. How does X affect my insulin? From here you should be able to make a more informed decision about whether or not to pursue X for fat loss.
Perhaps more importantly, by moving the focus from a symptom (obesity) to an underlying cause, we can begin to recognize that controlling insulin should have wide-ranging implications for health (insulin does many things beyond controlling blood sugar and fat storage).

A Brief Primer on Insulin
The effect of insulin on fat storage has been covered elsewhere in detail, most notably in Gary Taubes' book Good Calories, Bad Calories. But it is probably worthwhile to hit the high points again. Insulin does not act in isolation, but plays an intricate dance with other hormones and the nervous system. Some of these relationships are covered here.

Insulin is a protein (you can see a computer-generated representation here). Like all proteins, there is a gene that encodes the particular sequence of amino acids for manufacturing insulin. One of the interesting facts about insulin is that its structure is remarkably consistent across time and species. Thus, species which appear genetically divergent, like humans and hagfish, do make different forms of insulin and the insulin receptor, but they're more similar than different: human insulin has a large degree of cross-reactivity with hagfish insulin receptors, and vice-versa. So insulin has been around a long time, and the relative lack of cross-species mutation is an indication of its key role in the survival of an organism.

The effects of insulin are initiated when an insulin molecule binds to an insulin receptor at the surface of a cell membrane. This binding triggers a series of chemical reactions, generally culminating at the cell nucleus, where genes are either up-regulated (meaning they make more of some protein) or down-regulated. Most people are familiar with the role of insulin in controlling blood sugar. One major effect of insulin binding is the manufacture of glucose transport (GLUT) proteins, which move glucose out of the blood, across the cell membrane, and into the cell. But insulin has many other effects. It is mitogenic, which means that it promotes cell division (i.e. insulin is a growth hormone). Insulin plays a key role in the manufacture of cholesterol from glucose, both by up-regulating transport of glucose into the cell and controlling manufacture of HMG-CoA reductase, an enzyme required to convert HMG-CoA into mevalonate, a precursor of cholesterol (side note: statins work by inhibiting HMG-CoA reductase). And there's a pile of other functions as well.

When insulin binds to an insulin receptor, it not only causes a chemical signal to be sent. The entire insulin/receptor complex is also absorbed by the cell (endocytosis), removing the insulin from circulation. A condition in which there is too much insulin in the blood (hyperinsulinemia) could thus result either from too much insulin being produced in the pancreas, or from a relative lack of insulin receptors. Correspondingly, insulin resistance (the failure of cells to respond to the insulin signal) could result from a lack of insulin receptors, a failure in the chemical signal chain, or from some other molecule (like a lectin) physically blocking the insulin receptor.

We should also realize that insulin does its thing via its effect on genes. Genetic differences can thus imply different responses to insulin. Genes carry the code to manufacture proteins, and a rather small difference in gene activation by insulin can result in large visible differences between individuals. This is particularly true for fat storage. We'll see below how insulin triggers manufacture of lipoprotein lipase (LPL), which is necessary for fat storage. A small difference in the amount of LPL made in response to insulin results in a small difference in net amount of fat storage. But whether that small difference results in net negative or positive storage could determine whether or not an individual will become obese.

On to the point. Insulin controls fat storage primarily through three pathways:
  1. Up-regulation of lipoprotein lipase (LPL)
  2. Down-regulation of hormone sensitive lipase (HSL)
  3. Up-regulation of glucose transporters.
The basic unit of fat is a fatty acid. Fatty acids are not water soluble, as anyone who has tried to mix oil and water knows. Blood is mostly water, and having fat droplets wandering around your blood vessels is not good. So fats need some other water soluble molecule to transport them around in the blood. Individual fatty acids can be transported bound to a molecule of albumin, but this mostly occurs for fatty acids released from fat cells. Dietary fats and those made in the liver are carried mostly as triglycerides in large molecules called lipoproteins. Triglycerides are also the storage form of fat in fat cells. A triglyceride is composed of three fatty acids stuck to a glycerol backbone.

Triglycerides are too large to pass across the cell membrane. In order for fatty acids to get in/out of a fat cell, they must be freed from the triglycerides. Enzymes which perform this task are called lipases. Lipoprotein lipase (LPL) acts on lipoproteins in the blood to free fatty acids for transport into the fat cells. Hormone sensitive lipase (HSL) acts on triglycerides inside the fat cell, freeing fatty acids for transport out of the fat cell. The precise mechanism by which the fats actually make it across the cell membrane isn't entirely clear. Cell membranes are largely made of fatty acids themselves (in the form of phospholipids), so it's likely that free fatty acids passively diffuse across the cell membrane (whereas water soluble substances, like glucose, generally require the help of a transport molecule). There is also evidence of fat transporter molecules, though these may be more important in cells like muscle that may need energy faster than can be supplied by passive diffusion.

The fatty acids inside the fat cell, regardless of their origin, are candidates for esterification, which just means they can be incorporated into triglycerides. This in turn requires a supply of glucose to manufacture the glycerol backbone (actually a molecule named glycerol-3-phosphate, or alpha glycerol phosphate; we'll use G3P). Insulin is necessary to effect transport of glucose from the blood into the fat cell, and also up-regulates a key enzyme (G3P dehydrogenase) required to form G3P from glucose.

Insulin increases LPL and decreases HSL. The relative concentration of fatty acids inside and outside of the fat cell is thus governed by insulin, as well as by the availability of lipoproteins in the blood. Fatty acids tend to move from high concentration to low. If insulin is low, HSL activity is increased, and fatty acids tend to build up in the cell and diffuse out to the blood. If insulin is high, LPL activity is increased, and fatty acids build up outside the cell and tend to move in. Once inside the cell, insulin governs the relative rate at which fat is stored, not only through HSL, but also by affecting glucose transport and regulating G3P dehydrogenase.

There are other metabolic pathways which affect this process. Some, like de novo lipogenesis, are also regulated by insulin. Others, like acylation stimulating protein (ASP), appear to be independent of insulin. There are ongoing arguments as to the relative importance of the various pathways, but I think the evidence is pretty clear that insulin is king of the hill when it comes to fat storage. For instance, Type I diabetics, who make little or no insulin, basically lack the ability to store fat. If ASP were important in humans, Type I diabetics should be able to store plenty of fat (since one of the symptoms of Type I diabetes is ravenous hunger, I think we would have observed this). Any Type I diabetic who injects insulin, however, is familiar with the "fat pad" that forms at the injection site, due to (ta da) the high concentration of insulin in that area.

So, lots of concepts and big words in the above. The takeaway is simple: more insulin means fat cells store fat; less insulin means fat cells release fat. The equilibrium point (at which you're neither storing nor releasing) is thus largely determined by average insulin levels. We should then be able to predict the effect of various lifestyle changes from their effect on insulin. Let's see how that works out for some commonly recommended reducing strategies.
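That takeaway can be sketched as a deliberately crude toy model: storage (the LPL side) rises with insulin, release (the HSL side) falls with it, and net flux is the difference. Every number below is invented purely for illustration - this is the shape of the idea, not physiology:

```python
# Toy model: net fat flux into the fat cell as a function of insulin.
# The functional forms and constants are made up for illustration.
def net_fat_flux(insulin):
    storage = 1.0 * insulin          # LPL-driven storage, up with insulin
    release = 2.0 / (1.0 + insulin)  # HSL-driven release, down with insulin
    return storage - release         # positive means net storage

for level in [0.5, 1.0, 2.0]:
    flux = net_fat_flux(level)
    if flux > 1e-9:
        state = "storing"
    elif flux < -1e-9:
        state = "releasing"
    else:
        state = "equilibrium"
    print(f"insulin={level}: net flux {flux:+.2f} ({state})")
```

In this toy, low insulin means net release, high insulin means net storage, and the equilibrium point (here at insulin = 1.0) sits wherever average insulin makes storage and release balance - which is exactly the claim being tested against the reducing strategies below.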

Low Carbohydrate Diet

This ought to be a no-brainer. Of all macronutrients, carbohydrates have the largest direct effect on insulin levels. Protein also stimulates a little insulin release, but nothing like the quantities of readily available carbohydrate do (dietary protein also stimulates release of the hormone glucagon, which tends to counteract insulin's effect of driving glucose from the blood into fat cells, thus reducing fat storage). By itself, fat does not stimulate insulin release (in fact it seems to decrease it mildly). But fat does cause release of hormones like CCK, which amongst other things cause the pancreas to release more insulin for a given stimulus of glucose or amino acids (this is called the "incretin effect"). So eating fat and refined carbohydrates together (which describes most food in the Western diet) ought to really crank your insulin. High average insulin means more fat storage - look around any public place if you want to see this in action.

Conversely, removing carbohydrates from the diet should drastically reduce average insulin levels (unless you have some non-dietary problem, like an insulin-producing tumor, in which case you've got bigger problems than being fat). The decrease in insulin should move the body away from fat storage to fat release. Since this fat is now available for energy, appetite should decrease and/or activity should increase spontaneously. All of these effects have been observed repeatedly in both animal and human studies.

Low Calorie Diet (Starvation)

Suppose we just cut calories across the board. Say your nominal caloric intake was 2400 kcal/day, including an average of 300g of carbohydrates. Leaving fructose out of the equation (fructose does not directly stimulate insulin release, but does cause the liver to become temporarily insulin resistant, the net effect of which may be to increase average insulin levels), that's equivalent to about a cup and a half of sugar each day (the gut rapidly breaks down "complex carbohydrates" into glucose for absorption into the blood). Since the total amount of glucose in a normal person's blood is about 1 tsp, this 1.5 cups should have a drastic effect on average insulin levels, as the body works very hard to keep blood glucose in a narrow range (too much or too little glucose in the blood will kill you in a hurry).
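
The quantities above check out with rough kitchen conversions. The conversion factors are assumptions on my part (roughly 200 g of sugar per cup, and about 5 g of glucose circulating in the blood, which is about a teaspoon):

```python
# Back-of-the-envelope check of the figures above. Conversions are
# rough assumptions: ~200 g sugar per cup, ~5 g glucose in the blood.
daily_carbs_g = 300        # dietary carbohydrate, absorbed as glucose
blood_glucose_g = 5.0      # roughly 1 tsp in the entire bloodstream

cups_per_day = daily_carbs_g / 200
ratio = daily_carbs_g / blood_glucose_g

print(cups_per_day)  # 1.5 cups of sugar equivalent per day
print(ratio)         # 60.0 -- each day's intake is ~60x the circulating pool
```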

Now, let's not change what we eat, just how much. We'll go from 2400 kcal/day down to 1600 kcal/day. That implies we're now eating 200g of carbohydrate per day, implying that average insulin levels should drop significantly. Again, this should result in fat loss, since we've decreased insulin from the level that promoted our previous equilibrium. And that's precisely what's seen: starvation diets result in fat loss. However, that 200g of carbohydrate still promotes a fair amount of insulin secretion. We would thus expect initially rapid fat loss, tapering off over time, and finally stalling at the new equilibrium point. And once the fat stops coming out of the fat cells, your body is literally starving, and will likely make you fall off the wagon, so to speak. As your body has become used to lower levels of insulin (i.e. your insulin sensitivity has increased), resuming previous levels of carbohydrate and fat consumption should result in rapid weight gain, overshooting your previous equilibrium point. Which, again, is exactly what is seen.
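
The 200g figure is just proportional arithmetic, assuming the macronutrient mix stays the same as intake shrinks:

```python
# Cutting calories across the board cuts carbohydrate in proportion.
nominal_kcal, nominal_carbs_g = 2400, 300
reduced_kcal = 1600

reduced_carbs_g = nominal_carbs_g * reduced_kcal / nominal_kcal
print(reduced_carbs_g)  # 200.0 g/day -- lower, but still insulin-stimulating
```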

Low Fat Diet

The low-fat diet is an interesting case: what is called "low-fat" often involves both calorie restriction and trading refined carbohydrates for whole food sources, which tend to have less effect on blood sugar and thus insulin. Both of those changes will drop your average insulin and result in some fat loss. The interesting thing here is that reduction in dietary fat should also reduce secretion of incretin hormones like CCK, and thus further reduce insulin. So low-fat diets "work", as is often observed. In fact, I would predict they work better than just generically cutting calories, though I don't know if this has been observed. The confusion most people have is the idea that eating fat makes you fat, leading to the erroneous conclusion that reducing fat makes you thin. But all of this action is ultimately effected by insulin.

And that's the rub, because it means it is difficult (and probably unhealthy) to eat low-fat forever. If you don't eat much fat, then you need carbohydrates for energy (using too much protein for energy results in nitrogen poisoning). If you get those carbohydrates from the usual sources, like bread, rice, or pasta, your insulin will go up, and you'll get fat again, whether you eat fat or not (note that excess dietary carbohydrate is converted to fat by the liver). Successful maintenance of a low-fat diet means getting carbohydrates from sources which are slowly digested, and/or maintaining a high enough level of physical activity to burn off excess glucose and enhance insulin sensitivity (more on this below).

Physical Activity

We've all heard the old chestnut that to effect fat loss you just need to "eat less and exercise more". We've seen above how calorie reduction can affect insulin levels. But does exercise do the same thing?

Interestingly, the answer is a qualified "Yes". Let's start with an extreme case (which, as it turns out, forms the basis for the very successful "slow burn" type exercise regimens). Muscle stores glycogen, a form of starch, for use as quick energy. The glucose to make that glycogen gets into the muscle cells via the action of insulin. In the case of muscle cells, insulin stimulates the cell to move a preformed store of GLUT4 molecules to the surface, so glucose can be rapidly absorbed from the blood. Now suppose you completely exhaust the muscle of its glycogen stores. What do you suppose its response will be?

Not surprisingly, the cell cranks out more insulin receptors in an effort to rebuild its energy stores. After all, you might need that quick energy to escape the next hungry lion that crosses your path. So exercise increases insulin sensitivity of muscle, and we learned above that when insulin binds to an insulin receptor the cell absorbs the whole complex. So, independent of diet effects, we expect exercise to reduce average insulin levels; further, in doing so, the muscles also clear out some glucose. Both of these effects should lead to some degree of fat loss, and any increase in net physical activity should produce them to some degree. Your muscle cells will only make insulin receptors if they need to: if you start as a total couch potato and then start walking a mile a day, your muscles need to adapt to even this small increase in activity (walking a mile burns about an extra 100 kcal).

And of course, that's what people see. How many friends have you known who started a new exercise regime and rapidly lost some weight? This is often accompanied by pronouncements like "I can eat anything I want, as long as I exercise enough". That's true, at least to the point where the new fat storage/release equilibrium is reached, at which point fat loss stops. Since the individual is no longer getting positive feedback of fat loss for their physical exertion, they usually cut back or quit, but continue eating "anything I want", and of course just get fat again.

And all of this ignores the elephant in the living room, which is overall metabolic regulation. If you use more calories than are totally available to you from food and storage (remember that high insulin makes stored fat unavailable), you should get hungry. Further, the body knows what it wants, and will try very hard to make you eat it. If you burn up the muscles' store of carbohydrate, the resultant temporary increase in insulin sensitivity will drop your blood sugar. Your brain senses that drop, and tells you to go eat some carbohydrates. People often "reward" themselves with a food treat after a workout, or maybe have a sugary energy drink or similar. Of course, this tends to defeat whatever gain in insulin sensitivity your exercise created.

The Challenge

The examples above, I believe, illustrate the explanatory power of the insulin hypothesis, bringing many approaches which seemed disparate or opposed (like low-fat vs. low-carb) under a single explanation. My challenge to you, O Gentle Reader, is to provide counter-examples. Are there fat-loss strategies that cannot be explained by the insulin model? Give it your best shot in the comments.

Saturday, June 6, 2009

Your Elephant Stepped on my Coffee Table

Take a look at this press release:

The summary is this: genetically leptin-resistant mice will become obese and develop Type II diabetes. These researchers restored leptin-sensitivity for the pro-opiomelanocortin (POMC) neurons in the arcuate nucleus (ARC), an area of the hypothalamus involved with energy regulation, including appetite and blood sugar control. As a result, the mice both lost fat AND spontaneously increased their level of activity. They did not lose fat because they were exercising; they were exercising because they were losing fat.

Now contrast that to the prevailing view of obesity and (supposedly) related health issues like diabetes: you're a lazy slob, sit on the couch, eat too much, and therefore become fat and diabetic. Gary Taubes's "Good Calories, Bad Calories" laid the foundation for challenging this hypothesis, drawing on decades of research showing that energy regulation is governed by an intricate dance of hormones and the central nervous system. In this view, people overeat because they're becoming fat as a result of some malfunction in this system; correspondingly, lean people are more active for the same reason.

This latest piece of research supports the hormone hypothesis. Leptin plays a key role in energy regulation, and is manufactured by fat cells depending on how much fat they contain. More fat, more leptin. Amongst other things, leptin acts on the brain to turn off appetite, i.e., when you've stored up enough energy, stop eating. It is further hypothesized that the ARC may play a role in blood glucose control, e.g. providing CNS signals to the liver to regulate glucose manufacture. This role is certainly supported by the research linked above.

The key question becomes what causes the ARC to become leptin-resistant. The authors seem to completely miss this, instead gushing about "novel drug targets" (i.e. $$$). There are plenty of clues lying about, however. Stephan at Whole Health Source notes that leptin resistance precedes insulin resistance in the development of Type II diabetes. So what causes leptin resistance? Apart from genetic defects, this is an open question, but a reasonable conjecture would be wheat germ agglutinin (WGA), a kind of protein called a lectin which is found in grains. Lectins like WGA have the annoying capability of binding to hormone receptors. This is all the more annoying because they can avoid protease enzymes in the digestive system and pass into the blood intact (most proteins are broken into amino acids, as loading up your body with intact foreign proteins is bad juju).

WGA is so effective at binding hormone receptors that scientists regularly use it to study them. For instance, they'll tag WGA with a radioactive substance and then see where it winds up sticking on a cell. Neurotransmitters are basically just hormones released in the neuronal synapse, and scientists use WGA to study how things are transported in the brain. So, WGA a) binds to leptin receptors and b) wanders around your brain. And what does WGA do when it locks into your leptin receptors? Unknown, but in the test tube, at least, it blocks the effects of leptin. Hmmm, throw in insulin resistance of the liver from excess fructose, and it sounds like a recipe for Type II diabetes.

Wednesday, May 20, 2009

The Paradox Paradox

By denying scientific principles, one may maintain any paradox.
Galileo Galilei

Paradox: [Latin paradoxum, from Greek paradoxon from neuter sing. of paradoxos, conflicting with expectation, para-, beyond; see para–1, + doxa, opinion (from dokein, to think; see dek-).]

  1. A seemingly contradictory statement that may nonetheless be true: the paradox that standing is more tiring than walking.
  2. One exhibiting inexplicable or contradictory aspects: “The silence of midnight, to speak truly, though apparently a paradox, rung in my ears” (Mary Shelley)
  3. An assertion that is essentially self-contradictory, though based on a valid deduction from acceptable premises.
  4. A statement contrary to received opinion.

This morning I ran across an article discussing the "paradox" that obesity seems to play a protective role in heart disease. We seem to be presented with a flood of paradoxes relating to health and nutrition - and indeed said paradoxes present equal confusion to (too) many scientists. Let's talk a bit about what a paradox really is, and then I'll show why the Galileo quote was right on the money. To say it another way, any scientist who cries "paradox" is being fundamentally unscientific. You'd never get them to admit it (because they probably don't believe it), but their use of paradox is in the sense of the 4th definition above, rather than indicating a true logical paradox. And we all know how well science and opinion mix.

Most paradoxes are only superficially paradoxical, and can be resolved on deeper inspection. Real paradoxes are rare. Consider this example from the Wikipedia entry on "paradox":

... consider a situation in which a father and his son are driving down the road. The car collides with a tree and the father is killed. The boy is rushed to the nearest hospital where he is prepared for emergency surgery. On entering the surgery suite, the surgeon says, "I can't operate on this boy. He's my son."
Sounds paradoxical, right? But the issue is simply a bad assumption: since most surgeons are men, one erroneously extrapolates that ALL surgeons are men. Obviously the surgeon must be the boy's mother. This is a common source of claimed paradoxes in science: extrapolating something that is believed at some level (e.g. obesity causes heart disease) to a statement of absolute truth.

Let's consider mathematics, starting with simple Boolean logic. The point of logic is to reason deductively about the truth of a statement, given the truth of other statements. A paradox would imply you could get different answers depending on how you worked through the problem, i.e. two different sets of steps, each valid within the rules of logic, would give different answers. If such paradoxes did exist, they would clearly render logic useless, since you could never consistently prove something true. The dictionary definition of "paradox" admits a subtly different situation, which is a statement like "I am a liar". The rules of logic can neither prove nor disprove this statement. But this is more an artifact of language and technical aspects of formal mathematical systems than the sort of "scientific paradox" claimed by the authors of the heart disease/obesity paper.

Generalizing the case of logic to all math leads to the same conclusion. A mathematical system which admits true paradoxes is pointless. A true paradox would indicate inconsistency in the rules and assumptions used to build the system. Problems labeled "paradoxical" in math are really counter-intuitive, like the Banach-Tarski Paradox, where one can prove that there is a way of dividing up a 3-dimensional ball, moving the pieces around without stretching them, and reassembling to get two balls of the same size as the original. Sounds pretty paradoxical, right? But it's really just counter-intuitive: the size of the set of points in the one ball (called the cardinality) is actually the same as the size of the set of points in the two balls. The size of a set is different from its measure (which in this case would be the volume). The result that we can double the volume of a set of points without changing the cardinality of that set violates our intuition, but is consistent within the mathematical definitions of measure and cardinality (this is roughly equivalent to realizing that the size of the set of even integers is the same as the size of the set of all integers: they're both infinite).
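
The even-integers example is easy to make concrete. "Same size" for infinite sets means a one-to-one pairing exists, and the map n → 2n is exactly such a pairing:

```python
# "Same size" for infinite sets means a one-to-one pairing exists.
# Here every integer n pairs with exactly one even integer, 2n.

def to_even(n):
    return 2 * n

def from_even(m):
    assert m % 2 == 0, "not an even integer"
    return m // 2

# The pairing is invertible in both directions, so it is a bijection,
# even though the evens are a proper subset of the integers.
for n in range(-1000, 1001):
    assert from_even(to_even(n)) == n

print([to_even(n) for n in range(5)])  # [0, 2, 4, 6, 8]
```

No finite check can prove the infinite claim, of course; the proof is just that both maps are total and mutually inverse, which the code illustrates on a sample.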

Can we ever have a true scientific paradox? Mathematical truth is purely conceptual, and can thus be "absolute". We define the axioms and rules and mentally manipulate these to prove or disprove other statements. Science is messier. Nothing is absolute in science, because all scientific theories must be supported by observational evidence from the real world. Our observations are limited by various practical considerations. Our data is never 100% accurate, we can never be sure we've observed all of the relevant variables, etc. So our belief in a scientific hypothesis is always conditioned on the evidence, which itself is subject to the limitations of our ability to observe and collect information. Scientific belief thus exists in a continuum between absolute truth and falsehood, and is always conditioned on the available evidence. As new evidence is obtained, we update our beliefs accordingly, toward greater or lesser truth as indicated by the new evidence.

So you can never have a scientific paradox. Scientific honesty demands that observation of evidence contradicting a hypothesis causes you to lower your belief in that hypothesis. A paradox requires two statements which can be shown to be contradictory yet simultaneously true. But neither evidence nor hypotheses carry absolute truth, and our beliefs in either are always conditioned on the other. The scientifically relevant method evaluates belief in hypotheses conditioned on evidence.

Science as most often practiced, using frequentist statistics, evaluates belief in data assuming the truth of a hypothesis, so it's no wonder scientists spend so much time confused about "paradoxes". Take a hypothesis and data that appears to contradict that hypothesis. Then try to test the hypothesis by quantifying your belief in the data, presuming the truth of that hypothesis. When the number comes back low, you basically have two choices: come up with a reason why the data is "wrong" (e.g. a mistake in experimental design, broken instrument, drunken graduate student), or realize that your hypothesis (again, whose truth was assumed as part of the analysis) is possibly not true. If you believe your data AND are 100% convinced of the hypothesis (which raises the question of why you did the experiment in the first place), you'll think you've got a paradox. The only real paradox is that people get paid to make this fundamental error in inference - over and over and over . . .
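
The distinction between P(data | hypothesis), the frequentist quantity, and P(hypothesis | data), the belief we actually care about, can be made concrete with Bayes' rule. The probabilities below are invented purely for illustration:

```python
# Bayes' rule: update belief in a hypothesis H given data D.
# P(H|D) = P(D|H)*P(H) / [ P(D|H)*P(H) + P(D|~H)*P(~H) ]
# All numbers here are invented for illustration.

def posterior(prior, p_data_given_h, p_data_given_not_h):
    p_data = p_data_given_h * prior + p_data_given_not_h * (1 - prior)
    return p_data_given_h * prior / p_data

# Start 90% confident in H, then observe data that is unlikely under H
# but likely under its alternatives:
belief = posterior(prior=0.9, p_data_given_h=0.05, p_data_given_not_h=0.8)
print(round(belief, 2))  # 0.36 -- belief drops; no paradox, just an update
```

Contradictory data doesn't produce a paradox here; it simply moves the posterior down, which is the update the "paradox"-criers refuse to make.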

Our friends who observed the apparently paradoxical protective effect of obesity in heart disease patients have fallen into this trap. The right thing to do upon observing this effect is to update belief in the hypothesis that obesity causes heart disease. The new evidence lowers our belief in that hypothesis, and simultaneously signals that we should evaluate competing hypotheses in the light of all of the available evidence. Indeed, if one were to do a proper analysis, it would be clear that the evidence no more supports the hypothesis that obesity causes heart disease than it does the hypothesis that heart disease causes obesity. Not all heart disease patients are obese, and not all obese people suffer from heart disease. Further, there's no strong metabolic evidence indicating the arrow of causality.

The smart thing to do in such situations is to start looking at hypotheses where a third culprit is the underlying cause of the observed associated effects. So what might cause both obesity and heart disease, or in some people one but not the other?

A growing body of evidence links poor blood sugar control to heart attack risk (see this recent study, for instance). The body maintains blood glucose in a narrow range, because both too little and too much are dangerous. Too little and the brain starves. Too much and you overwhelm the systems which repair the damage caused by sugar, in particular that to the arterial lining. You cannot excrete excess blood glucose like you can excess water or salt (at least not without severely damaging the kidneys). So your options are to either store it, or turn it into something else. The muscles and liver have a limited capacity for storage of glucose. Once they're full, the liver, as directed by insulin, will turn the rest into fat, and your fat tissue, again as directed by insulin, will store that fat.

At least that's how it's supposed to work. Insulin is a hormone, and hormones activate genes to manufacture proteins. The response to a hormonal stimulus is thus partially determined by genetics. Your genes will determine, for instance, the relative expression of lipoprotein lipase and hormone sensitive lipase in response to insulin levels. This in turn governs the ability to take fat from the blood and store it, or release that fat from fat cells to be used as energy. Similarly one guesses that insulin sensitivity of muscle and liver tissue has some genetic basis, and these may further be altered by disease, nutrition, etc. (overconsumption of either alcohol or fructose will make the liver insulin resistant, thus impeding its ability to store glucose, transform it to fat, or scale back manufacture of glucose from protein).

In the framework of this hypothesis, a person with greater propensity towards fat storage has a potential advantage when it comes to heart disease, as it provides another "sink" for excess blood glucose. A perpetually skinny person may be at a disadvantage. If your fat cells don't respond to insulin signals, then the fat has nowhere to go and stacks up in your blood as "triglycerides". If your liver and/or muscle don't properly respond to insulin, glucose begins to build up in the blood. Neither situation is likely good for the development of heart disease, and in reality both seem to occur simultaneously for susceptible individuals.

The news blurb doesn't state whether blood glucose or triglycerides were tested, and the publisher of the Journal of the American College of Cardiology doesn't provide free access to the publication. Perhaps a reader with access can post a comment as to whether blood glucose was tested, and the results. Regardless, it is the unwillingness or inability of the authors to consider alternative hypotheses which leads them to cry "Paradox!" in such a public manner. Such individuals are clearly mired in irrational dogma and/or trying to drum up extra funding. From a broader view, any hypothesis (like diet-heart) which embraces paradoxes (like the "French paradox") is probably junk science. Treat such hypotheses accordingly lest you extinguish your own spark of reason.

Sunday, May 3, 2009

Pearls Before H1N1

The swine flu is making me sick. I don't have the virus (at least no symptoms), but the whole panic over it annoys me on multiple fronts. Take this recent AP story as an example. 241 cases? Is this really worth worrying about? I find the following quote particularly telling:

Even if the swine virus doesn't prove as potent as authorities first feared, Besser said that doesn't mean the U.S. and World Health Organization overreacted in racing to prevent a pandemic, or worldwide spread, of a virus never before seen.

With a new infectious disease, "you basically get one shot, you get one chance to try to reduce the impact," Besser said. "You take a very aggressive approach and as you learn more information you can tailor your response."

It was just over a week ago that authorities learned the new flu CDC had detected in a few people in California and Texas was causing a large outbreak and deaths in Mexico, triggering global alarm.

"We didn't know what its lethality was going to be. We had to move. Once you get behind flu, you can't catch up," Homeland Security Secretary Janet Napolitano said.

Maybe an informed reader can help me out here. Is there any reason to believe that the "very aggressive approach" makes any difference at all? Do all of these countermeasures have any effect? I get the feeling there's a whole bunch of "virus nerds" at the CDC just waiting for the opportunity to do something, which more than anything serves to feed public hysteria and justify their existence. Maybe I'm being overly pessimistic, but the track record of government science types is pretty abysmal. I do think that they think they're being helpful, but I really have a difficult time shaking the feeling that anything public health authorities do to try and stop a virus (which has evolved over billions of years to be very efficient at spreading infection) is roughly equivalent to piling up cheesecloth to protect yourself from a tsunami.

Western culture seems to be developing increasingly extreme paranoia about all things health related. And of course, this is fueled by the media and other groups (like the CDC) who stand to benefit from spreading fear. Too many people spend too much time worrying about "silent killers": cancer, heart disease, viral diseases, you name it. Correspondingly, there exists a MASSIVE "industry" concerned with disseminating information and treatment. Just look at the amount of money spent helping us with that most deadly of conditions, "high cholesterol". Can you watch TV anymore without seeing at least one advertisement for statins (geez, there's one on now - GO TIVO) or a wonder food (like Ch**rios for crying out loud) that is going to save you from the "silent killer"? Ch**rios and statins: the delicious and healthy way to start the day.

Fear is a complicated emotion, and that complication no doubt stems from the underlying complicated nature of trying to survive. I believe the major psychological source of fear is uncertainty, i.e. "was that sound Grog relieving himself due to over-consumption of cachonga root, or a bear coming to eat me?". I suspect the physiological source of fear is hormones, namely the stress hormones. Certainly stress brings about an increase in irrational fear (is there such a thing as rational fear?), and certain drugs can activate those same pathways and create tremendous fear. Our society seems now more than ever in the grip of fear inducers. Though science and technology have advanced human knowledge, the fact is that most of that knowledge is held by precious few. "Back in the day", when we all lived in the forest, you needed lots of knowledge to survive. The "unknown" was largely those aspects of Nature humans could not control, like the weather, hungry bears, and infection. Now we trust that to the "experts", tacitly ceding them control over our lives. And other aspects of lifestyle probably contribute to stress. Crappy nutrition certainly increases stress hormones, as does chronic illness. The diabetes "epidemic" is a pretty good sign that a major portion of the population is suffering from chronic illness due to poor nutrition (and a pronounced lack of sunshine).

The combination is a real mess: a sick and fear-filled population driving a culture of experts to save them from their own ignorance. And of course the experts turn out to have little relevant expertise. Their major source of validation comes from the feedback that we give them. How many people do you know whose doctor fills them full of pills to no effect? The patient experiences little actual improvement in health, yet they keep going back for more. What if people started thinking for themselves and kicked their doctor to the curb in favor of self-informed care? I suspect a swift kick in the pocketbook would change MDs' opinion of statins in a big hurry. Similarly, let's have some fun and watch the shakeout if (as I think is highly probable) swine flu turns out to be a dud. Congress and the media will praise the CDC for quick and decisive action, and they'll wind up with a nice budget increase, which is all the validation they need. I suspect we won't see any sort of critical introspection as to whether or not all of this flopping about and general panic has a measurable benefit on public health. Nobody gets budget increases for that.

In a really nice post on critical thinking, Dr. Mike Eades appropriated one of thousands of fabulous lines from George Carlin (who I personally think was probably smarter than everyone at the CDC - combined). I shall re-appropriate it here:

Think of how stupid the average person is, and then realize that half of them are stupider than that.
I spent about 10 years "in" science, 3 to get my M.S. and Ph.D., another 7 as a post-doc and researcher. I had the chance to interact with many scientists, mostly from physics, but also from other fields, and if there's one thing I can tell you with great certainty it is that the distribution of intelligence amongst scientists pretty much mirrors that of the population at large. By "intelligence", I mean the ability to rationally weigh complex evidence as it relates to different hypotheses. My point here is not to say that scientists are dumb (though as we learned from Carlin, half of them are, by definition), but that you likely have the same reasoning capability as the average scientist. In fact, you're probably a little better than the average scientist. Scientists favor complex solutions, precisely because they are hard to understand. This validates their own self-perception of being smarter than the average bear. As a scientist I've had more than one scientific discussion end with "That's too simple to be right." Not, mind you, "that idea contradicts this piece of evidence". It's just too simple for their taste. So I dressed up the same simple idea with complex-appearing math and verbiage, leading to acceptance.

The swine flu situation presents us with the opportunity to watch this in action. I was just watching a clip from a news conference, where some "expert" was simultaneously back-pedaling on the severity of the present threat, while drumming up some more fear about the future. The thrust of it was that the Spanish Flu "took a summer break", and then re-awoke to slaughter millions. So keep on wiping down those door handles and pouring in the taxpayer dollars for stockpiling flu vaccines and anti-virals. Yet our expert likely missed the glaringly obvious simple hypothesis: people almost never get the flu in the summer, an effect seen in both hemispheres. That applied also to the highly deadly Spanish Flu, which apparently spent the summer at the beach before coming back to clobber Western civilization. Or maybe there's something about the summer that makes people more immune to a pre-existing infection. Gee, I wonder what that could be?

(For those in the cheap seats, it's Vitamin D3.)

Can I say with certainty that Vitamin D3 is the answer to the swine flu? No. But the CDC nerds also have no reason to say that it isn't, other than it's a) too simple, and b) puts a fair dent in their raison d'etre. Even WebMD, normally a bastion of medical orthodoxy, is at least considering the possibility. I presume flu cases are being tested for H1N1 antibodies. I wonder if anybody is bothering to test for Vitamin D3 while they're at it?

Nah, too simple. Ramp up that expensive anti-viral production. Far more tasteful.


I'm following the leads of Dr. Eades and Jimmy Moore onto Twitter. It's hard to find time to blog with my new job, and I wind up with stacks of links I want to talk about. Twitter seems like a good solution to at least push information I find interesting (and perhaps more often, incredibly dumb).

Tuesday, March 24, 2009

Listening to Experts Makes You Stupid

Got to work early this morning, and I thought this article deserved a quickie post:

I think you could replace "Financial Crisis" with "Health Crisis" in the headline and nicely sum up the current boom in metabolic diseases etc. Most of us have done it at one point or another: uncritically accept the advice given by experts, even when a little thought shows it makes little sense. Now we've learned that the brain has a specific mechanism whereby it essentially shuts off when given "expert advice". This perhaps explains why people seem to be thrown into such cognitive dissonance when presented with evidence which is rationally a slam dunk, but also contradicts what their doctors, the government, the media, and so forth have told them. I'm sure many of you have encountered irrational anger from friends and family when you question nutritional dogma. One of the weirdest things for me is how bent people get when I push them to justify why exactly "healthy whole grains" are so healthy. Still waiting (going on a couple of years now) for a response beyond "everybody knows that, so shut up."

That's not to say expert advice is necessarily bad - you just need to use your own brain as well, and weigh the expert information appropriately. Tom Naughton makes this point very nicely in "Fat Head" (see discussion of "functioning brain").

BTW, I'm finding "Fat Head" to be the most effective tool yet in overcoming the mental block created by "expert advice" (as opposed to my usual boring biochemistry lecture - maybe not so surprising). I suspect it's the humor that somehow breaks down the barriers of cognitive dissonance. It would be funny (in every sense of the word) if laughter made us more rational.

Sunday, March 22, 2009

Fat Head: The Blog

Tom Naughton of the fantastic "Fat Head" movie has started his own blog at Great stuff, and like the movie, informative and very funny. Be sure to check out his first post and learn how to make your very own misleading study supporting ridiculous preconceptions.

Thursday, February 26, 2009

Wheat Head

How's this for a mind-blower:

Schizophrenia, gluten, and low-carbohydrate, ketogenic diets: a case report and review of the literature

Synopsis: a 70-year-old schizophrenic experienced complete remission of symptoms after adopting a low-carbohydrate diet. Now, of course, this is just one case study, and needs to be replicated a LOT more times. But it really caught my eye, as since my last post I've been thinking about the mental effects of diet, particularly grains. Schizophrenia is one extreme case of neurological disturbance, but as with all things biological, disease expression is rarely binary. The manifestation of symptoms covers a spectrum when viewed across the population. We just tend to pay the most attention to the extreme cases. Suppose grains were implicated as causal in schizophrenia. It's a good bet they then contribute to other, less obvious forms of mental disturbance. Since grains are so widely consumed, this may actually be viewed as the "norm".

Indeed, other neurological conditions are known to benefit from removal of dietary grains, including pediatric epilepsy (discussed in the paper), ADHD, autism, and multiple sclerosis. I've been doing some poking around on wheat germ agglutinin and the brain. Turns out WGA does indeed cross the blood-brain barrier, will bind to insulin receptors in the brain, and probably does all kinds of other stuff as well. Google "WGA brain", and you'll find WGA is actually used extensively to map out neuronal pathways, so it clearly has potential neurological effects beyond just binding to insulin receptors.

I propose that the sequel to "Fat Head" be "Wheat Head". That covers a lot of ground, from Weston A. Price's observations on cranial development (there definitely seem to be distinct "Wheat Head" phenotypes), to dental disease, to the neurological implications, and probably more.

Wouldn't it be fun if the food pyramid were making us fat, sick, deformed, and crazy all at once?

Wednesday, February 25, 2009

The Children of the Wheat

Hello everyone out there in blog world. Let me start by apologizing for my prolonged absence. The recent lack of posting is partly a result of being busy with other aspects of life, not the least of which was searching for a new job. I feel very fortunate to have found an opportunity given the current economic situation; even more fortunate that I will be helping to build the next generation of genetic sequencing machines. So I get to have a job not only on the cutting edge of science, but also hopefully contributing to the health of our society. I expect to be pretty busy with the new gig, so I don't know that I'll be able to post much in the coming months either.

The other reason for the lack of posts is that I haven't really had much new to say. The one thing I'd like to get to is the rest of the energy regulation series, particularly some info about innate and learned food preferences, including why carbohydrates may be addictive (short answer: insulin tweaks an area of the brain called the insula, which is also lit up by drugs such as cocaine and opiates). I've started a few posts and abandoned them, mainly because they seemed to be covering the same ground. Let me throw out some of these random thoughts here, in no particular order, sort of a brain dump before I disappear again.

First, I want to recommend a couple of DVDs. First is Tom Naughton's Fat Head, which is both funny and educational. Fat Head provides a gentle and highly accessible introduction to some of the topics covered in Gary Taubes' Good Calories, Bad Calories. Even my kids (8 and 4) "got it", though I suppose it didn't hurt that I've been pumping them full of the background info for a few years :-). Watch for the moment when Tom's doctor sees the measurable effects of eating an all fast-food low carb diet. The expression on his face is absolutely priceless (and to his credit, he didn't just blow it off like many in mainstream health would). Also watch the bonus interview footage, great stuff. I particularly liked a quote from Dr. Al Sears, something to the effect of "If you're not dead, you can still heal." Most doctors today seem resigned to mitigating the effects of metabolic syndrome through medication, rather than actually healing. I believe they're generally well-intentioned, just misinformed. But experience has shown that given the opportunity, the body has an amazing ability to heal itself, IF you can remove or at least mitigate the factors reinforcing the underlying disease process. More on this later. Jimmy Moore has a great interview with Tom Naughton as well.

At one point in the bonus interviews, Dr. Sears discusses how rabbits came to be used as a model for heart disease. Apparently researchers started out by feeding dogs large quantities of lard, but the dogs would not develop atherosclerosis. Of course that should have been obvious from the outset: saturated fat and cholesterol from animal fat are a cornerstone of the evolutionary diet of canines. Since the researchers didn't get their preconceptions validated using dogs, they switched to rabbits, whose natural diet is grass, and who thus never evolved any mechanisms for handling large dietary quantities of either fat or cholesterol. Not surprisingly, the poor bunnies' metabolism went berserk, and the researchers extrapolated this result to humans. Talk about confirmation bias.

Here's a related personal anecdote. Our dog Picasso is about 12 years old now. He was getting pretty porky and arthritic, and also began drinking a ridiculous amount of water, so I suspected he was developing doggy diabetes. I was going to take him to the vet, but first checked out the ingredients on his "healthy" doctor-recommended food. First ingredient: corn starch. I felt like a dope for not checking that earlier, and switched him to a diet of raw food, mainly patties made from ground up whole chickens, supplemented with leftover bacon grease (we eat a lot of bacon here) and raw organ meats like heart, liver, kidneys, and tripe. The water-drinking issue disappeared almost immediately. Over time, Picasso has really trimmed up, and looks like a young dog now, with a nice shiny coat. He's become a lot more friendly and playful as well. People are always surprised to learn he's nearly 12. Hint hint: the evolutionary diet of humans is much closer to that of canines than bunny wabbits.

The other DVD is "My Big Fat Diet", which provides several examples of the healing powers of the human body. This documentary follows Dr. Jay Wortman as he treats metabolic syndrome in a group of Canadian Namgis First Nation people via a low carb diet. The results: not only did they lose fat, they also reduced or eliminated many of the other symptoms of metabolic syndrome, along with the associated medications. Even more striking was how the Namgis' sense of community and family returned as their bodies healed. Every time I watch My Big Fat Diet, I wonder how many of our various societal ills are fueled by poor health resulting from bad nutrition. The Western diet promotes a situation where the body perceives itself to be in constant crisis: insulin resistance essentially implies starvation at the cellular level, high blood sugar and dietary polyunsaturated fat contribute to glycative/oxidative stress, and hyperinsulinemia probably leads to chronically high levels of stress hormones like cortisol. Is it any wonder we find our society to be populated by individuals focused on immediate benefits to themselves, rather than considering the much greater long-term benefits of contributing to societal well-being?

Another fun fact from My Big Fat Diet is how the Namgis used fat rendered from the tiny Oolichan fish to supply fat soluble vitamins, particularly Vitamin D in the winter. The Namgis traditionally made the association between the yellow color of the Oolichan grease and sunshine, which I thought was pretty insightful. I started a post on Vitamin D, but there's so much info out there already I decided I had little extra to add. Vitamin D deficiency is quickly making its way into the mainstream medical consciousness as well, which is outstanding. The hormonal version of Vitamin D activates over 1000 genes (something I hope to learn more about in my new job), so it probably should not be surprising that Vitamin D deficiency can lead to a broad spectrum of health problems, particularly those like multiple sclerosis which are known to be influenced by genetic risk factors.

And it's very interesting to think about diseases like influenza, traditionally thought of as being primarily infectious and requiring immunization. But the influenza virus has certainly been around as long as humans, and it's hard to fathom how humanity could have survived if we were all getting knocked flat by the flu once a year. A whole tribe of hunter-gatherers on their backs with flu seems like prime cave bear food. And the flu doesn't behave like a classic infectious disease in the way the common cold does. For instance, Google has a cool new resource estimating flu activity in the US, based on search queries. I've been watching this thing all winter, and it definitely does not show any sort of epidemic pattern. You'd expect flu hotspots to spread geographically over time, but instead the map is pretty much random. This non-infectious pattern for influenza is generally observed: it just pops up simultaneously in geographically separated locations rather than spreading.

A good hypothesis is that humans generally have a given flu virus all year (and I wonder if anybody has bothered to test for influenza antibodies in the summer). We carry all kinds of viruses all the time, they're just suppressed by our immune systems. But if the immune system becomes weakened, say due to Vitamin D deficiency brought about by lack of sunlight exposure in the winter, the virus can take hold and make you sick. Further, it is known that the majority of symptoms from both flu and cold are basically due to your own immune reaction, not the virus itself. The innate immune system uses cells like neutrophils, whose job it is to seek and destroy potentially infectious agents like viruses and bacteria using both physical and chemical means. But once ramped up, these hunter-killer cells will also destroy your own tissue, and so need to be moderated. If not controlled, your immune system will kill you, which is precisely what happened to victims of the Spanish flu epidemic (Epidemiol Infect 2006;134:1129–1140). What is the primary mechanism for moderating this immune response? Vitamin D.

Another personal anecdote: winter used to be medically difficult for our family, as it is for most people. We'd have our two kids at the doctor at least once a month for ear/sinus infections, strep throat, etc., and we adults would usually drag around some kind of virus for a few weeks. But that's just part of life, right? Wrong. Since we began supplementing with Vitamin D in the winter (about two years ago), zero doctor visits. I don't even think the kids have had a fever in this time, certainly not one high enough to cause any concern. We do get sick, but it's minor, never more than an annoyance, and short-lived. While the other kids at school are dropping left and right from strep throat and flu, our kids now sail through pretty much unscathed. I've seen it multiple times with friends and family as well. They have some drawn out respiratory disease, like a persistent cough. When we finally get them on the Vitamin D train, it's gone, never to return. I also used to get bad hayfever attacks - no more. Yes, it's anecdotal, but this is what you'd expect from the well-established interaction between Vitamin D and the immune system.

So it's good to see the mainstream picking up on this. They seem to be "getting it" on other fronts as well, albeit slowly. MSNBC recently had a long article on omega-3's, which also discussed the general problems inherent in vegetable oils processed from seeds and soybeans. It's frustrating, because they get some of it "right" (in the sense that their conclusions follow logically from all of the available evidence), yet still are hung up on issues like dietary cholesterol and almost completely miss certain living elephants, as it were. I posted a longish comment on the article, but my essential criticism is one I've voiced before: if you start with bad or incomplete prior information, even the most rigorously logical analysis will lead to goofy conclusions. Those conclusions in turn drive choices about diet, supplements, medical treatments, etc. Read the article, and watch how some key assumptions lead to all kinds of wild inferences and extrapolations.

Michael Pollan's book "In Defense of Food" is another example of this. Though Pollan's "The Omnivore's Dilemma" remains one of my favorites, I had avoided reading "In Defense of Food", mainly because it espoused the mantra "Eat food. Not too much. Mostly plants." I'm all for eating food vs. factory produced foodlike substances, but the last two statements smack of dogma. But our local library decided to feature this book, so I thought I'd better read it in the interest of causing trouble.

Pollan gets some of it right (again, in terms of drawing logical inferences from the available evidence), but frustratingly misses the big stuff, like the total lack of evidence that dietary fiber contributes to health, or that red meat consumption causes disease. For instance, he discusses Good Calories, Bad Calories, but cherry-picks the evidence that apparently supports his preconceptions and ignores the rest. He rails against reductionism, apparently following in the footsteps of T. Colin Campbell (just about the worst possible choice), missing that the point of studying isolated aspects of nutrition and metabolism is to inform the "big picture". While a complex system like the human body may be greater than the sum of its parts, you certainly have no hope of understanding the whole without at least understanding the parts.

Pollan then gives 24 rules which we should follow when selecting and eating food. This is a great example of how starting from goofy assumptions just leads to over-complication. Who is going to walk around a store or look at a menu and mentally check off 24 different things? There's no way that the simple act of eating should require that level of mental effort; it certainly didn't for our hunter-gatherer ancestors. Some of his recommendations are good, like shop at your farmers market or eat wild foods, but a lot of it is just nonsense. For instance, he wants us to "Eat slowly". What other animal consciously regulates its rate of food intake? Do lions devour their prey at a measured pace, and teach their young to do the same? Like most, Pollan is enthused about a plant-based diet, all hopped up on the idea that we require thousands of different phytonutrients for health. The evidence for this? "In all my interviews with nutrition experts, the benefits of a plant-based diet provided the only point of consensus." Really - after extolling the evils of "nutritionism" for half the book, now you're going to follow the consensus of nutrition experts? Aren't these the same boobs who developed ideas like the food pyramid? Take a walk around your local mall, and you can see how well that's played out in the context of human health. Remember also that scientific consensus is rarely the result of critical examination of the evidence. Rather it's a social phenomenon: the scientists keep repeating the same thing to each other, and eventually it becomes "truth". If you're lucky, somebody with half a brain gets the ball rolling by actually making a logical inference from evidence, but the usual situation is that of the diet-heart hypothesis, where an initial unproved hypothesis turns into scientific "fact" simply through social forces.

Pollan further wants us to focus on eating leaves. This recommendation certainly makes sense for a cow or a gorilla, both of which have vast digestive tracts built for the long process of breaking down cellulose and freeing the nutrition in leaves. Both animals also eat enormous quantities of leaves to meet caloric requirements, e.g. upwards of 60 lbs. daily for a gorilla. Of course humans can predigest leaves through cooking, and we can add fat to help assimilation of the micronutrients, acids to neutralize antinutrients like phytic acid, etc. But those are recent developments in the course of evolution. Humans almost certainly did not evolve getting any significant calories from raw leaves. Our digestive systems just can't handle it, and you'd need to eat many pounds of leaves daily to have any hope of meeting caloric requirements. It seems far more likely that humans would have gravitated toward more calorically dense plant foods, like fruit, nuts, and starchy root vegetables. Of course the most nutritionally dense foods are of animal origin, including seafoods, organs, and eggs.

Pollan's first rule is the worst: "Don't eat anything your great grandmother wouldn't recognize as food." First of all, I don't have the faintest idea what my great-grandmother would or would not consider food, and since she's long since passed on, I can't ask her. But given that margarine popped up over 100 years ago, I would guess it would have passed as food for her. Does that make it okay to eat margarine? And how about bread? Great grandma probably considered bread as food, but bread is the end result of extensively processing wheat, which in its raw state would be toxic. So is bread "food", and should I eat bread? Evidence seems to be mounting that the answer is "no". So this guideline is pointless.

The mental processing required to figure out what to eat should be low. Humans evolved as omnivores, which means we can eat a very wide variety of foods. Any minimally processed food that doesn't taste bad or make you immediately ill is almost certainly going to be healthy (I'm trying to think of a counterexample, with no luck). If your food requires a lot of technology and industrial processing to render it edible, you might want to rethink whether it actually qualifies as "food". Dr. Sears makes this point during his bonus interview on the Fat Head DVD, noting that had our ancestors tried raw grains or soybeans, they would have found them distasteful and contracted a painful bellyache, not exactly a stimulus to eat more of those foods. Unlike fruit, which evolved to be desirable to eat in order to spread seeds, grains and legumes do not want their seeds to be eaten, as that of course prevents those seeds from ever growing into plants. So these plants developed a variety of chemical and physical defenses to discourage predation. Some animals, notably birds, responded with adaptations which allow them to flourish on grains (birds have a gizzard to grind up grains, and enzymes to block anti-nutrients like protease inhibitors). Mammals (and humans in particular) generally lack these adaptations. Note, for example, the effort required to prevent grain-fed cattle from dropping dead inconveniently early.

The wonders of science allow us to turn wheat and soybeans into foodlike substances that at least won't immediately put you in the hospital, and of course these pseudo-foods are made more attractive by formulating them to appeal to innate flavor and texture triggers like fattiness, saltiness, and sweetness. But in our industrialized food environment, we can no longer rely on our senses to distinguish what is good to eat. By contrast, we see "primitive" peoples able to thrive on a wide variety of foods obtainable from their environment, from the flesh/fat laden diet of the Inuit to the largely carbohydrate-based diet of the Kitavans (supplemented with seafood for certain minerals and fat-soluble vitamins). The common thread is that they pretty much directly eat what they obtain from their environment, and intervening processing is usually minimal and aimed at either making nutrients more accessible (cooking) or for preservation (drying/smoking). They don't need 24 rules to figure out what to eat, and neither should we.

Returning to the point about whether or not bread is "food": the role of grains in the modern diet deserves examination. Let me start by putting some context around this. It is, I think, increasingly clear that our "diseases of civilization" are strongly rooted in metabolic disturbances caused by food. Volek and Feinman have made a very strong argument that "metabolic syndrome" can be defined by the response of an individual to dietary carbohydrate, and that the cure is removal of such from the diet. This hypothesis is supported by many scientific studies, by both "Fat Head" and "My Big Fat Diet", and by the personal anecdotes of many thousands (including myself, having lost 100 lbs. and restored many aspects of health on a low carb diet). But the cause of a disease is not necessarily the inverse of the cure, i.e. just eating too many carbohydrates doesn't necessarily cause metabolic syndrome. The traditional diets of the Kitavans and Tarahumara are carbohydrate-based, but neither group develops metabolic syndrome. So I'd venture that it's the type of carbohydrate that drives the development of metabolic syndrome. Once you've broken your metabolism, then any significant quantity of dietary carbs will cause you problems, but what got you to that broken state? For example, your gasoline-powered car can go a long time burning gasoline with no significant issues. Put diesel in the tank, though, and you've really screwed things up, and can no longer use gasoline as fuel until you've fixed the damage done by the diesel.

Now I'll be the first to admit that this question is sort of academic. After all, if you just ate low carb across the board, you'd avoid any subclass of carbohydrate food that could contribute to chronic metabolic problems. But the reality is that our food environment is flooded with refined-carbohydrate-rich (pseudo)foods. They are deeply ingrained in our culture, pushed on us from every direction. And there's also the issue of what to eat if your metabolism is not broken. I think about this particularly in terms of what to feed my children. Is it better to let the kids eat a slice of pizza or french fries? Chocolate-covered pretzels or a lollipop? And you can't monitor their food intake 24/7. You know they'll be offered crap from every direction when you're not around, so how can you teach them to make reasonable choices, e.g. choosing bad instead of worse?

I don't think the scientific evidence is really there to provide a definitive answer. But we can make some reasonable guesses based on what is known about various aspects of metabolism and physiology and how these may hypothetically respond to various inputs. I'm increasingly of the opinion that in the spectrum of carbohydrates, wheat is particularly bad. Stephan at the Whole Health Source blog has many articles discussing potential antinutrients in grains, definitely worth reading. One interesting aspect of grains is lectins, which are proteins with the ability to bind to cellular receptors. Lectins in food are often not broken down to amino acids in the gut, and thus can cause mischief in the digestive system and, if they cross over into circulation (which they do), in the rest of the body as well.

Wheat germ agglutinin (WGA) has been studied in the test tube. If you Google "wga insulin receptor" you'll find a lot of papers, probably because WGA can bind to insulin receptors, which makes it easier to study various aspects of the receptor chemistry. From a physiological standpoint, WGA at least has the potential to be troublesome. For instance, not only does WGA bind to insulin receptors, it sticks there. When an insulin molecule binds to an insulin receptor, the whole complex is absorbed by the cell. One insulin molecule thus generates a specific and discrete response in a cell. When WGA binds to an insulin receptor, the complex is not absorbed; it just sits there activating at least part of the insulin signaling chain until it is knocked off (certain sugars accomplish this, like N-acetyl glucosamine). That's potentially nasty. In test tube experiments, at least, WGA is just as effective as insulin at stimulating glucose transport and blocking lipolysis in fat cells. Stephan also notes that WGA has the potential to block leptin receptors. Leptin resistance is one of the hallmarks of metabolic syndrome.

It is, of course, treacherous to infer effects of grain lectins on a whole organism based on these test-tube experiments. I don't know of any studies which have really examined such effects in detail. But certain anecdotal evidence is at least consistent with the idea that wheat may play a special role in causing metabolic problems. Take the recent much-ballyhooed study of obesity in China. If you look at the data, you'll note that while there is a trend for the more obese subjects to eat more carbohydrates than the less obese, the differences are fairly small, on the order of 10% or less. By contrast, the most obese quartile for men ate 5x more wheat flour than the least obese quartile. Hmmm.

The movie My Big Fat Diet provided another interesting bit of evidence. There's a big festival held by the Namgis each year. The chief (whose name escapes me), who had Type II diabetes and heart disease, had been on a low carb diet for some time, which had allowed him to control his blood sugar entirely without medication. At the festival he had one piece of "traditional" bannock, a deep-fried bread made from white wheat flour. His blood sugar soared, and did not return to normal for a week. Now I very much doubt that such elevated blood sugar was simply the result of the carbohydrate in the bread. Even given his impaired carbohydrate metabolism, this should have been cleared in the course of a day or so. But what if WGA from the flour was instead binding to insulin receptors and sticking there, causing his liver to crank out more sugar? Complete speculation, but interesting to think about.

Finally, I can relate my own experience. After going low carb, I would generally find that even moderate "cheating" would lead to some obvious long-term effects, the most notable being that I'd break out in painful acne. But one night we were at our favorite local restaurant. They have a wonderful scallop dish with a fantastic sauce, and the whole thing is on top of thinly sliced red potatoes. I decided to go ahead and eat the potatoes, mainly just to get at the sauce without being socially unacceptable by drinking it straight from the bowl. Interestingly, though this represented a lot of carbohydrates for me, it had no noticeable effect. I've reproduced this with potatoes a number of times; they seem to not freak out my metabolism when eaten in moderation. On the flip side, even a small amount of bread will reliably trigger a breakout that lasts a week or more. A more dramatic example occurs with beer. If I drink even two or three premium beers now, I will get quite ill, and then suffer a week or so of acne. It's not the alcohol, because an equivalent amount of red wine or vodka has little effect beyond the usual buzz.

One final crumb for thought: why have we as a society developed such a love for wheat? There are plenty of other places you can load up on starch, such as potatoes, corn, and rice. But we seem to have a special soft spot for wheat-based foods, and cultures seem to be quite willing to displace other forms of carbohydrate with wheat (e.g. the Chinese, as discussed above). Remember that near the beginning of the post I noted that insulin stimulates the area of the brain known as the insula. One function of the insula is to mediate food-related rewards: eat some nice sweet fruit, get a bit of an insulin spike, and the insula reinforces that behavior. That makes sense from an evolutionary standpoint: fruit is nutrient dense, but doesn't last for long, so it's a good idea to load up on it while it's around. Certain addictive drugs like cocaine and opiates light up the same area, just with much greater intensity. Suppose WGA were able to get into the insula and bind to insulin receptors there? That would probably reinforce our desire for wheat, even more so if WGA shows the "stickiness" observed in the test tube. Totally unproven hypothesis, but it seems like one worth testing.

Well, I think I've pretty much emptied my head at this point. Let me just close with a quote from Sir William Drummond:
He who will not reason is a bigot; he who cannot is a fool; and he who dares not is a slave.
I feel like this sums up the situation we currently face as a society. Too often, we rely on "experts" to think for us. In other words, we tend to be "slaves" (I personally believe that most people are not "fools": they have the capacity to think, they just don't bother). Yet these "experts" are far too often "bigots", driven by personal goals and desires rather than reason. Scientists generally don't engage in what constitutes actual science, in the sense of testing a hypothesis and objectively evaluating belief in that hypothesis, because there's no money in that. Instead they concentrate on confirming the beliefs of those who hold the purse-strings, and that makes them "bigots" in the sense described by Drummond.

So don't be a slave. All it takes to break free from the bonds the bigots would impose is to start using your own brain, and this intellectual freedom (and only this) will allow you to make choices which maximize your own health, wealth, and well-being.