A reality check for altitude tents and houses

THANK YOU FOR VISITING SWEATSCIENCE.COM!

As of September 2017, new Sweat Science columns are being published at www.outsideonline.com/sweatscience. Check out my bestselling new book on the science of endurance, ENDURE: Mind, Body, and the Curiously Elastic Limits of Human Performance, published in February 2018 with a foreword by Malcolm Gladwell.

- Alex Hutchinson (@sweatscience)

***

[UPDATE 11/30: Lots of great discussion of this post below and on Twitter. I’ve added a new post with some responses, more data, and further thoughts HERE.]

A recurring theme on this blog is that not all studies are created equal. The quality of the study design makes a huge difference in the amount of faith that we can place in the results. So it’s always a big pleasure to see awesomely painstaking studies like the new one in Journal of Applied Physiology by Carsten Lundby’s group in Zurich. The topic: the “live high, train low” (LHTL) paradigm used by endurance athletes, in which they spend as much time at high altitude as possible to stimulate adaptations to low oxygen, while descending to lower altitude each day for training so that their actual workout pace isn’t compromised by the lack of oxygen.

There have been a bunch of LHTL studies since the 1990s that found performance benefits — but it’s really difficult to exclude the possibility of placebo effect, since athletes know they’re supposed to get faster under the LHTL strategy (and, conversely, athletes who get stuck in the control group know they’re not supposed to get faster). But Lundby and his colleagues managed to put together a double-blinded, placebo-controlled study of LHTL. The main features:

  • 16 trained cyclists spent eight weeks at the Centre National de Ski Nordique in Premanon, France. For four of those weeks, they spent 16 hours a day confined to their altitude-controlled rooms. Ten of the subjects were kept at altitude (3,000 m), and six were at ambient (~1,000 m) altitude.
  • Neither the subjects nor the scientists taking the measurements knew which cyclists were “living high.” Questionnaires during and after the study showed that the subjects hadn’t been able to guess which group they were in. (One way to run that kind of blinded allocation is sketched just below this list.)
  • On five occasions before, during and after the four weeks, the subjects underwent a whole series of performance and physiological tests.
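
The blinding is the clever part, so here is a minimal sketch of one way such an allocation can be run (my own illustration, not the authors’ actual procedure): a coordinator who never takes any measurements holds the only mapping from subject ID to room condition, and everyone else works from coded IDs.

```python
# Minimal sketch of a blinded allocation, assuming a 10/6 split as in the study.
# This is my own illustration, not code from the paper.
import random

def assign_conditions(subject_ids, n_hypoxia=10, seed=42):
    """Randomly assign subjects to 'hypoxia' (3,000 m) or 'control' (~1,000 m) rooms."""
    rng = random.Random(seed)
    shuffled = subject_ids[:]
    rng.shuffle(shuffled)
    return {sid: ("hypoxia" if i < n_hypoxia else "control")
            for i, sid in enumerate(shuffled)}

subjects = [f"S{i:02d}" for i in range(1, 17)]   # 16 trained cyclists
allocation = assign_conditions(subjects)          # held only by the coordinator

# The scientists taking measurements work from coded IDs alone; the rooms' O2
# controllers read `allocation` directly, so neither subjects nor testers can
# tell who is "living high".
print(sorted(allocation))
```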

So, after going to all this trouble, what were the results?

Hemoglobin mass, maximal O2-uptake in normoxia and at a simulated altitude of 2,500 m and mean power output in a simulated 26.15 km time-trial remained unchanged in both groups throughout the study. Exercise economy (i.e. O2-uptake measured at 200 Watt) did not change during the LHTL-intervention and was never significantly different between groups. In conclusion, four weeks of LHTL using 16 hours per day of normobaric hypoxia did not improve endurance performance or any of the measured associated physiological variables.

This is, frankly, a surprising result, and the paper goes into great detail discussing possible explanations and caveats — especially considering the study didn’t find the same physiological changes (like increased hemoglobin mass, which you’d expect would be placebo-proof) that previous studies have found. Two points worth noting:

(1) The subjects were very well-trained compared to previous studies, with VO2max around 70 ml/kg/min and high initial hemoglobin mass. It’s possible that the beneficial effects of LHTL show up only in less-trained subjects.

(2) There’s a difference between living at 3,000 m and living in a room or tent kept at oxygen levels comparable to 3,000 m: pressure. “Real-world” altitude has lower pressure as well as lower oxygen; this study lowered oxygen but not atmospheric pressure. Apparently a few recent studies have hinted at the possibility that pressure as well as oxygen could play a role in the body’s response to altitude, though this remains highly speculative.
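
To put some numbers on that distinction, here’s a back-of-the-envelope sketch (mine, not the study’s), using the standard-atmosphere pressure formula, a 20.93% O2 fraction, and the usual 47 mmHg water-vapour correction: a normobaric altitude room matches the inspired O2 partial pressure of 3,000 m by diluting the oxygen, while the total pressure stays at sea-level values.

```python
# Rough comparison of real vs. simulated 3,000 m altitude (my own numbers,
# based on the standard atmosphere; the study itself reports no such figures).

def barometric_pressure_mmhg(altitude_m):
    """Approximate barometric pressure (mmHg) from the standard-atmosphere model."""
    pressure_pa = 101325 * (1 - 2.25577e-5 * altitude_m) ** 5.25588
    return pressure_pa / 133.322

def inspired_po2(pressure_mmhg, fio2=0.2093):
    """Partial pressure of inspired O2 (mmHg) after humidification in the airways."""
    return fio2 * (pressure_mmhg - 47)

po2_at_3000m = inspired_po2(barometric_pressure_mmhg(3000))   # ~100 mmHg
po2_sea_level = inspired_po2(barometric_pressure_mmhg(0))     # ~149 mmHg

# A normobaric altitude room hits the same inspired PO2 by lowering the O2
# fraction instead of the pressure:
fio2_equivalent = po2_at_3000m / (barometric_pressure_mmhg(0) - 47)

print(f"Inspired PO2 at 3,000 m: {po2_at_3000m:.0f} mmHg")
print(f"Sea-level FiO2 giving the same PO2: {fio2_equivalent:.1%}")
```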

As always, one new study doesn’t erase all previous studies, nor does it override the practical experience of elite athletes. But it suggests that we should think carefully about whether altitude really works the way we’ve been assuming it works. As the researchers conclude:

In summary, our study provides no indication for LHTL, using normobaric hypoxia, to improve time trial performance or VO2max of highly trained endurance cyclists more than conventional training. Given the considerable financial and logistic effort of performing a LHTL camp, this should be taken into consideration before recommending LHTL to elite endurance athletes.

 

Which “rules of running” should you break?


Forgot to mention this earlier — I have an article in this month’s Runner’s World called “Breaking All the Rules,” which is now available online. Basically, I had a chance to chat with a bunch of veteran coaches — Jack Daniels, Frank “Gags” Gagliano, Roy Benson, Jeff Galloway, Hal Higdon and Pete Pfitzinger — and ask them which “rules of running” they’d recommend not following blindly. Here’s one example:

THE RULE: Do prerace strides
For generations, runners have followed the same rituals to warm up before races or workouts: Start with some jogging, move on to a little bit of stretching, then perform a series of “strides”—short sprints lasting about 10 seconds that get your heart pumping and kick-start the delivery of oxygen to your running muscles. But do these timeworn rituals really help us perform better? Jack Daniels, Ph.D., isn’t convinced. “What I most often see at races is a bunch of runners striding up and down at a speed that is clearly faster than the coming race pace,” he says. Since these strides are the last thing runners do before starting the event, that inappropriate pace is fresh in their minds. “And when the gun finally sounds, they ‘stride’ or sprint right out.” The result: a way-too-fast start followed by an inevitable crash.
HOW TO BREAK IT: For shorter events like 5-K and 10-K races, jogging just long enough to get a good sweat going is all you need to do, says Daniels. (For longer races, you can get away with even less: Run the first mile of a half or full marathon as your warmup.) To get the oxygen-boosting benefits of strides without skewing your pace judgment—and screwing up your race result—try a sustained two- to three-minute effort 10 minutes before starting the race or workout. Run it slightly faster than your half-marathon pace, or at a speed that feels moderately hard. You should not be sprinting.
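
If you want to turn “slightly faster than your half-marathon pace” into an actual number, here’s a rough pace helper; the 2-3 percent adjustment is my own reading of “slightly faster,” not a figure from Daniels or the article.

```python
# Toy pace helper for the warm-up tweak above. The 2-3% band is an assumption
# on my part, not a prescription from the coaches quoted here.

def parse_pace(pace_str):
    """'4:30' (per km or per mile) -> seconds."""
    minutes, seconds = pace_str.split(":")
    return int(minutes) * 60 + int(seconds)

def format_pace(seconds):
    return f"{int(seconds // 60)}:{int(seconds % 60):02d}"

def warmup_pace_window(half_marathon_pace, faster_by=(0.02, 0.03)):
    """Return (fast_end, slow_end) of a window slightly quicker than HM pace."""
    sec = parse_pace(half_marathon_pace)
    return format_pace(sec * (1 - faster_by[1])), format_pace(sec * (1 - faster_by[0]))

fast, slow = warmup_pace_window("4:30")   # e.g. a 4:30/km half-marathoner
print(f"2-3 min warm-up effort at roughly {fast}-{slow} per km")
```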

Other topics covered include stretching, the length and pace of your long run, having a recurring weekly training structure, back-to-back hard days, cross-training during injuries, and so on. I should emphasize: none of the coaches are suggesting that existing rules-of-thumb are 100-percent bad. They’re just worth thinking about to make sure that they really do make sense for your personal situation and goals. For example, Higdon questions the wisdom of the 10-percent rule — he’s not saying that you should never increase your mileage by 10 percent; he’s just saying that sometimes it might make sense to increase it by more, sometimes by less, so you shouldn’t follow the rule blindly.

Can you “train” your fingers and toes to withstand cold?


Winter’s coming, so here’s a topical study just published in the European Journal of Applied Physiology. Do your fingers and toes gradually adapt to being exposed to cold temperatures? There are three questions we can ask:

  1. Are the digits able to maintain a higher temperature when they’re exposed to cold?
  2. Are the digits quicker to experience “cold-induced vasodilation” (CIVD)? (When you get cold, your blood vessels constrict; but after a certain point, the vessel walls get so cold that they can’t stay constricted, so you get a sudden rush of blood that helps to warm up your fingers and toes — which turns out to be a very useful response to avoid frostbite.)
  3. Do your digits hurt less?

Over the years, many researchers have tested whether our digits adapt to cold, and the results are all over the map — some see a positive effect, some see a negative effect, some see no effect. Into the breach come researchers from the Netherlands and from Brock University in Canada. They took 16 volunteers and had them dip their right hand and foot in 8°C water for 30 minutes at a time for 15 consecutive days. At the beginning and end of the experiment, the “trained” hand/foot was compared to the “untrained” hand/foot.

Here’s how skin temperature changed over the 15 days:

The data meanders around a bit, without much of a clear trend. Unfortunately, there was a clear trend for CIVD: during the pre-training test, 52% of subjects experienced CIVD; during the post-training test, only 24% did.

And finally, the pain score:

On the surface, this might seem like good news: it hurts less as you gradually become accustomed to the unpleasant sensation of being cold. In fact, though, this is bad news. As your body gets used to the cold, you notice it less, but you also are less likely to get the warming effects of CIVD. Combine these two factors, and you become increasingly likely to get frostbite without realizing it.

So what does this mean? Well, it probably ends my dreams of being a polar explorer. I have extremely poor circulation in my fingers, and this study suggests that’s unlikely to improve no matter how often I freeze my fingers off. So, despite the odd looks I get, I’m going to continue to run in my big puffy mittens whenever it gets close to freezing, because I won’t get any long-term “training” benefit from suffering.

Why the brain matters in obesity


Those of you interested in nutrition may already be following the online debate between Gary Taubes and Stephan Guyenet — back in August, Guyenet critiqued Taubes’s carbohydrate-insulin hypothesis, and now Taubes is returning the favour by critiquing Guyenet’s food-reward hypothesis. I’m not going to get into the nitty-gritty of the debate here, except to say that I think it’s a mistake to frame this debate as an “either-or.” Despite Taubes’s insistence to the contrary, the two ideas can coexist — and even if they do, I suspect they still don’t add up to the “whole truth” about obesity. Here’s one reason why.

In one of his recent posts, Taubes makes the distinction between body-centred and brain-centred theories of obesity (or, as one of his commenters points out, you can think of it as physiology vs. psychology). Taubes believes obesity originates in the body:

In this paradigm, specific foods are fattening because they induce metabolic and hormonal responses in the body — in the periphery, as it’s known in the lingo — that in turn induce fat cells to accumulate fat. The brain has little say in the matter.

Leaving aside the precise mechanism, I largely agree with the idea that regulation of calories in and calories out isn’t under the conscious control of the brain. And I’m pretty sure Guyenet would agree too. But I’m not quite ready to conclude that the brain plays no role.

This is a figure from a study published in the Archives of Pediatrics & Adolescent Medicine in 2009, from researchers at Penn State (no wisecracks please). The text is freely available here. The study followed 1,061 children, who were tested at the age of 3 for self-control (the length of time they were able to refrain from playing with a fun toy after being asked not to) and then again at the age of 5 for delayed gratification (the classic Marshmallow Test, which I’ve written about before, except using M&Ms, animal crackers or pretzels: they could have a small amount anytime, or a larger amount if they waited 3.5 minutes). Then their BMI was tracked until they turned 12.

The results are pretty clear: doing well on either or both of the impulse-control tests predicts less weight gain nine years later. So the question is: how can a test that involves (not) playing with a toy when you’re 3 years old predict future weight gain, if the brain has no say in weight gain?

Let me be absolutely clear: I don’t think “better impulse control” will play any useful role in weight loss for the vast majority of people. Once you’re overweight, I suspect physiology totally swamps psychology in most cases. But if you’re looking for an overall understanding of the mechanisms of weight gain and loss — and if, like Taubes, you insist that the correctness of your theory means that all alternate ideas must be 100% incorrect — then I believe you can’t ignore the brain (and its interactions with the modern food/physical activity environment) completely.

The brain senses macronutrient ratios, not just amounts


The classic syllogism of “nutritionism” goes something like this:

  1. Eating food A makes people healthy.
  2. Food A contains nutrient X.
  3. Therefore we should isolate nutrient X, manufacture it in powder form, and ingest it in large quantities to become healthy.

This seems pretty logical, and I certainly wouldn’t have questioned this basic mode of thinking a decade ago. And of course the approach has had many successes — taking vitamin C really does ward off scurvy if, for whatever reason, you’re subsisting on a diet devoid of vitamin C. But when we shift from “correcting deficiencies” to “enhancing health,” the approach seems to sputter, as people like Michael Pollan have argued.

The question, from my perspective, is: Why? Why do so many studies find that taking an isolated nutrient fails to reproduce the benefits observed from ingesting that nutrient in the context of a whole food (or, perhaps even more importantly, a whole meal or whole dietary pattern)? There are obviously many factors, such as the rate at which the nutrients are absorbed, and synergies between different nutrients in the food (a possible explanation for why nitrites are “good” when they come from spinach and beets but “evil” in the context of fatty hot dogs).

A new study published last week in Neuron (press release here, abstract here) offers another clue. The study looked at the activation of “orexin/hypocretin” neurons in the hypothalamus, which “regulate energy balance, wakefulness, and reward.” It has long been known that glucose levels in the brain reduce the activation of these neurons. Researchers at the University of Cambridge tested how they responded to protein and fat, and found that certain amino acids increase the activation of the neurons, while fatty acids have no effect.

Okay, so the brain responds to macronutrient levels in the body. Cool. Carbs turn this particular neural signal down, and protein turns it up. And if you eat both protein and carbs at the same time, you’d expect the net result to be the sum of the two signals. But that’s not what the researchers found. The combined protein-and-carb signal was a nonlinear combination of the two individual signals — meaning that these neurons were, in effect, responding to the protein-to-carb ratio rather than the absolute amounts. As the researchers put it:

In summary, our data show that the activity in the orx/hcrt system is regulated by macronutrient balance, rather than simply by the caloric content of the diet.
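
To see why testing nutrients one at a time can miss this, here’s a toy model (my own illustration, not the study’s actual analysis): doubling both protein and carbs doubles an additive signal, but leaves a balance-based signal untouched, because the ratio hasn’t changed.

```python
# Toy contrast between an additive response and a balance (ratio) response.
# Purely illustrative; not a model fitted to the Neuron paper's data.

def additive_response(protein, carbs):
    """Simple sum-style model: amino acids excite (+), glucose inhibits (-)."""
    return protein - carbs

def balance_response(protein, carbs):
    """Ratio-style model: activation tracks protein's share of the mixture."""
    total = protein + carbs
    return 0.0 if total == 0 else (protein - carbs) / total

# Double both macronutrients: the additive signal doubles (5 -> 10)...
print(additive_response(10, 5), additive_response(20, 10))
# ...but the balance-based signal stays put (~0.33 both times), because the
# protein-to-carb ratio is unchanged.
print(balance_response(10, 5), balance_response(20, 10))
```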

The bottom line: if you try to understand how this particular aspect of human physiology works by breaking food down into its constituent nutrients and testing them one by one, you’re doomed to failure because its response to individual nutrients is different from its response to combinations of nutrients. Which leads to a corollary: if you try to create a healthy diet by assembling a collection of pills and powders, you’re almost certainly sacrificing some of the synergies present in real foods.