Why the brain matters in obesity

Those of you interested in nutrition may already be following the online debate between Gary Taubes and Stephan Guyenet — back in August, Guyenet critiqued Taubes’s carbohydrate-insulin hypothesis, and now Taubes is returning the favour by critiquing Guyenet’s food-reward hypothesis. I’m not going to get into the nitty-gritty of the debate here, except to say that I think it’s a mistake to frame this debate as an “either-or.” Despite Taubes’s insistence to the contrary, the two ideas can coexist — and even taken together, I suspect they still don’t add up to the “whole truth” about obesity. Here’s one reason why.

In one of his recent posts, Taubes makes the distinction between body-centred and brain-centred theories of obesity (or, as one of his commenters points out, you can think of it as physiology vs. psychology). Taubes believes obesity originates in the body:

In this paradigm, specific foods are fattening because they induce metabolic and hormonal responses in the body — in the periphery, as it’s known in the lingo — that in turn induce fat cells to accumulate fat. The brain has little say in the matter.

Leaving aside the precise mechanism, I largely agree with the idea that regulation of calories in and calories out isn’t under the conscious control of the brain. And I’m pretty sure Guyenet would agree too. But I’m not quite ready to conclude that the brain plays no role.

This is a figure from a study published in the Archives of Pediatrics & Adolescent Medicine in 2009, from researchers at Penn State (no wisecracks please). The text is freely available here. The study followed 1,061 children, who were tested at the age of 3 for self-control (the length of time they were able to refrain from playing with a fun toy after being asked not to) and then again at the age of 5 for delayed gratification (the classic Marshmallow Test, which I’ve written about before, except using M&Ms, animal crackers or pretzels: they could have a small amount anytime, or a larger amount if they waited 3.5 minutes). Then their BMI was tracked until they turned 12.

The results are pretty clear: doing well on either or both of the impulse-control tests predicts less weight gain nine years later. So the question is: how can a test that involves (not) playing with a toy when you’re 3 years old predict future weight gain, if the brain has no say in weight gain?

Let me be absolutely clear: I don’t think “better impulse control” will play any useful role in weight loss for the vast majority of people. Once you’re overweight, I suspect physiology totally swamps psychology in most cases. But if you’re looking for an overall understanding of the mechanisms of weight gain and loss — and if, like Taubes, you insist that the correctness of your theory means that all alternate ideas must be 100% incorrect — then I believe you can’t ignore the brain (and its interactions with the modern food/physical activity environment) completely.

The brain senses macronutrient ratios, not just amounts

The classic syllogism of “nutritionism” goes something like this:

  1. Eating food A makes people healthy.
  2. Food A contains nutrient X.
  3. Therefore we should isolate nutrient X, manufacture it in powder form, and ingest it in large quantities to become healthy.

This seems pretty logical, and I certainly wouldn’t have questioned this basic mode of thinking a decade ago. And of course the approach has had many successes — taking vitamin C really does ward off scurvy if, for whatever reason, you’re subsisting on a diet devoid of vitamin C. But when we shift from “correcting deficiencies” to “enhancing health,” the approach seems to sputter, as people like Michael Pollan have argued.

The question, from my perspective, is: Why? Why do so many studies find that taking an isolated nutrient fails to reproduce the benefits observed from ingesting that nutrient in the context of a whole food (or, perhaps even more importantly, a whole meal or whole dietary pattern)? There are obviously many factors, such as the rate at which the nutrients are absorbed, and synergies between different nutrients in the food (a possible explanation for why nitrites are “good” when they come from spinach and beets but “evil” in the context of fatty hot dogs).

A new study published last week in Neuron (press release here, abstract here) offers another clue. The study looked at the activation of “orexin/hypocretin” neurons in the hypothalamus, which “regulate energy balance, wakefulness, and reward.” It has long been known that glucose levels in the brain reduce the activation of these neurons. Researchers at the University of Cambridge tested how the neurons responded to protein and fat, and found that certain amino acids increase their activation, while fatty acids have no effect.

Okay, so the brain responds to macronutrient levels in the body. Cool. Carbs turn this particular neural signal down, and protein turns it up. And if you eat both protein and carbs at the same time, you’d expect the net result to be the sum of the two signals. But that’s not what the researchers found. The combined protein-and-carb signal was a nonlinear combination of the two individual signals — meaning that these neurons were, in effect, responding to the protein-to-carb ratio rather than to absolute amounts. As the researchers put it:

In summary, our data show that the activity in the orx/hcrt system is regulated by macronutrient balance, rather than simply by the caloric content of the diet.
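
To see why ratio-sensing breaks the add-up-the-parts logic, here's a minimal sketch in Python. The response functions are invented purely for illustration (the paper reports the nonlinearity, but this is not its model); the point is just that a ratio-dependent signal can't be predicted by summing single-nutrient responses.

```python
# Toy illustration only -- not the model from the Neuron paper.
# Suppose a neuron's firing depends on the protein-to-carb *mix*,
# not on each nutrient independently.

def response_to_carbs(carbs):
    # Hypothetical: glucose alone suppresses firing (negative signal).
    return -carbs

def response_to_protein(protein):
    # Hypothetical: amino acids alone boost firing (positive signal).
    return protein

def combined_response(protein, carbs):
    # Hypothetical ratio-based response: depends on the mix, not the total.
    return (protein - carbs) / (protein + carbs)

protein, carbs = 2.0, 1.0
print(response_to_protein(protein) + response_to_carbs(carbs))  # 1.0: naive linear sum
print(combined_response(protein, carbs))                        # 0.33: toy "real" response

# Doubling both nutrients doubles the calories but leaves the
# ratio-based signal unchanged -- the signature of ratio sensing.
print(combined_response(2 * protein, 2 * carbs))                # still 0.33
```

Test each nutrient in isolation and you'd confidently predict the wrong combined response; the system only reveals itself when the nutrients arrive together.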

The bottom line: if you try to understand how this particular aspect of human physiology works by breaking food down into its constituent nutrients and testing them one by one, you’re doomed to failure because its response to individual nutrients is different from its response to combinations of nutrients. Which leads to a corollary: if you try to create a healthy diet by assembling a collection of pills and powders, you’re almost certainly sacrificing some of the synergies present in real foods.

Sidney Crosby, chiropractic neurology, and the limits of evidence

The good news: Sidney Crosby is back from the concussions that kept him on the bench for more than 10 months, and he had two goals and two assists in his return against the Islanders last night. But one downside, as a reader pointed out to me in an e-mail, is that Crosby’s return may give added credibility to “chiropractic neurology,” the alternative therapeutic approach that Crosby turned to during his rehab. What exactly is this? I don’t know — and I’m not alone:

It’s a field that’s unfamiliar to many traditional doctors, including Randall Benson, a neurologist at Wayne State in Detroit who has studied several ex-NFL players. Says Benson, “It’s very difficult to evaluate what kind of training, expertise or knowledge a chiropractic neurologist has since I have never heard of [the discipline].”

That’s a quote from David Epstein and Michael Farber’s excellent look at Crosby’s rehab from Sports Illustrated in October. A couple of other interesting quotes:

In 1998, at Parker University, a Dallas chiropractic college, Carrick [the chiropractic neurologist who Crosby worked with] worked on Lucinda Harman before 300 students. Two car accidents and a neurotoxic bite from a brown widow spider had left Harman, herself a Ph.D. in experimental psychology, wheelchair-bound and with headaches, during which she saw spots. “[Carrick] asked if they were red and yellow,” she says. “I said, ‘No, they’re green, blue and purple.’ ” Carrick informed the audience that this meant her brain was being drastically deprived of oxygen and that, without treatment, she had six months to live. Harman, now 59, says simply, “Miracle.” But Randall Benson says that “there’s nothing out in peer-reviewed literature supporting” an association between the color of spots a patient sees during a headache and the severity of the oxygen deprivation in the brain.

[…]

Carrick, who has had a handful of studies that have appeared in scientific journals, has never published data on vestibular concussions. “We don’t have enough time to publish studies,” he says, “but we’re doing a large one at Life [University] right now.”

It’s a great piece — fair but rigorous. In some ways, though, the most important quote may be the kicker:

“I don’t think this is a case of trying to do something wacky,” Crosby says. “When someone came along and invented the airplane, people must have thought they were out of their mind. Who thinks he can fly? I’m sure people thought that person might have been stretching it a bit… . At the end of the day, as long as the person getting the care is comfortable, I think that’s what’s important.”

Much as my evidence-based personality protests, I do think there’s some truth to that. Especially in cases like this, where — as with so many health conditions — there isn’t a well-established “standard-of-care” treatment. It’s totally different from, say, Steve Jobs choosing “alternative” forms of cancer treatment instead of surgery. In that case, the potential benefits of the surgery are well-known and well-understood. But many people face health conditions where the verdict of the Cochrane review is basically “there is insufficient evidence to conclude that ANY interventions do any good.” In that case, it’s hard to argue against trying other, unproven approaches rather than simply doing nothing.

Of course, sports medicine is a little different — it’s not life-or-death. For pro athletes, the incentive to try anything and everything in order to return to play (and earn money during their brief career window) is enormous. If I were Tiger Woods or Terrell Owens, I would have tried platelet-rich plasma to speed tendon healing too, despite the lack of evidence that it actually works. The problem is that the use of these therapies by sports stars gives the general public the impression that they’re proven, established treatments — hence the huge surge in PRP use over the last few years. Will the same thing happen with chiropractic neurology? I hope not. But on the other hand, if someone who’s been in two car accidents and been bitten by a neurotoxic spider is in pain and hasn’t been able to get relief from conventional treatment, I’d have a hard time criticizing them if they decided to give it a try.

Cadence in elite runners increases as they accelerate

One quick graph from a new study by Robert Chapman and his collaborators at Indiana University, just published online in Medicine & Science in Sports & Exercise:

This is data from 18 elite runners (12 male, 6 female), showing their stride frequency as a function of speed. For reference, 3.00 Hz corresponds to 180 steps per minute; 3.3 Hz corresponds to about 200. On the speed axis, 4.0 m/s is 4:10 per km, and 7.0 m/s is 2:23 per km. In other words, these are FAST paces. The key point: they get faster, in part, by quickening their cadence. There’s no magic cadence that they stay at while lengthening their stride to accelerate.
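
If you want to check those conversions yourself, the arithmetic is simple. Here's a quick Python sketch (just the unit conversions from the paragraph above, not data from the study):

```python
# Convert the figure's units into more familiar running numbers.

def hz_to_steps_per_min(freq_hz):
    # Treating stride frequency in Hz as steps per second, per the figure.
    return freq_hz * 60

def pace_per_km(speed_m_per_s):
    # Seconds to cover 1000 m, formatted as min:sec per km.
    seconds = round(1000 / speed_m_per_s)
    return f"{seconds // 60}:{seconds % 60:02d}"

print(hz_to_steps_per_min(3.00))  # 180.0 steps/min
print(hz_to_steps_per_min(3.3))   # 198.0, i.e. about 200
print(pace_per_km(4.0))           # 4:10 per km
print(pace_per_km(7.0))           # 2:23 per km
```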

Interesting wrinkle: the women have a faster cadence than the men at any given speed. Chapman assumes this is partly due to the fact that the men are taller — but even normalizing by height doesn’t quite erase the difference. (And that even ignores the argument that, as I blogged about here, cadence should scale with the square root of leg length, not with leg length itself.) The remaining difference, Chapman hypothesizes, could be due to “application of greater ground forces by the men or differences in muscle fiber type distribution.” This makes sense: if you’re stronger (as the men, on average, will be), you’ll have a stronger push-off and a longer stride, and thus a lower cadence at any given speed. But it seems pretty clear that height plays at least some role.
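
For the curious, here's a rough sketch of what the two normalizations look like, with made-up numbers (the study reports group data; these leg lengths and cadences are purely illustrative). The pendulum argument says a leg's natural swing frequency goes as sqrt(g/L), so the dimensionless comparison divides cadence by that quantity rather than scaling by leg length directly.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def norm_by_length(freq_hz, leg_m):
    # Naive normalization: scale cadence directly by leg length.
    return freq_hz * leg_m

def norm_by_pendulum(freq_hz, leg_m):
    # Pendulum scaling: a leg's natural frequency ~ sqrt(g / L), so
    # dividing by it yields a dimensionless cadence.
    return freq_hz / math.sqrt(G / leg_m)

# Hypothetical runners at the same speed (all numbers invented).
runners = [("man", 3.00, 0.95), ("woman", 3.15, 0.88)]
for label, hz, leg in runners:
    print(f"{label}: by-length {norm_by_length(hz, leg):.2f}, "
          f"pendulum {norm_by_pendulum(hz, leg):.3f}")

# If the woman's dimensionless cadence stays higher even after the
# sqrt scaling, leg length alone doesn't explain the sex difference.
```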

Paleo, the pace of evolution, and chronic stress

My Jockology column in today’s Globe and Mail takes a look at the paleo diet — or rather, the paleo “lifestyle.” The column is actually in the form of an infographic in the paper, beautifully illustrated as “cave art” by Trish McAlaster. Unfortunately, the online version so far just lifts the text, without any of the data and graphics that accompany it. Nonetheless, it’s hopefully worth a read!

As a teaser, here’s an excerpt from a section on how the pace of evolution has changed over the past few thousand years, and what that means for the quest for the perfect “ancestral” diet:

The paleo diet depends on the assumption that our genes haven’t had time to adapt to the “modern” diet. Since evolution depends on random mutations, larger populations evolve more quickly because there’s a greater chance that a particularly favourable mutation will occur. As a result, our genome is now changing roughly 100 times faster than it was during the Paleolithic era, meaning that we have had time to at least partly adapt to an agricultural diet.

The classic example: the ability to digest milk, which developed only in populations that domesticated dairy animals. More than 90 per cent of Swedes, for example, carry this mutation. Finnish reindeer herders, in contrast, acquired genes that allow them to digest meat more efficiently, while other populations can better digest alcohol or grains. The “ideal” ancestral diet is most likely different for everyone. [READ THE WHOLE ARTICLE]

And, as another teaser, here’s a section of Trish’s infographic illustrating the difference between the acute stress of the paleo lifestyle and the chronic stress of modern life: