Archive for November, 2011

A reality check for altitude tents and houses

November 29th, 2011

[UPDATE 11/30: Lots of great discussion of this post below and on Twitter. I've added a new post with some responses, more data, and further thoughts HERE.]

A recurring theme on this blog is that not all studies are created equal. The quality of the study design makes a huge difference in the amount of faith that we can place in the results. So it’s always a big pleasure to see awesomely painstaking studies like the new one in Journal of Applied Physiology by Carsten Lundby’s group in Zurich. The topic: the “live high, train low” (LHTL) paradigm used by endurance athletes, in which they spend as much time at high altitude as possible to stimulate adaptations to low oxygen, while descending to lower altitude each day for training so that their actual workout pace isn’t compromised by the lack of oxygen.

There have been a bunch of LHTL studies since the 1990s that found performance benefits — but it’s really difficult to exclude the possibility of placebo effect, since athletes know they’re supposed to get faster under the LHTL strategy (and, conversely, athletes who get stuck in the control group know they’re not supposed to get faster). But Lundby and his colleagues managed to put together a double-blinded, placebo-controlled study of LHTL. The main features:

  • 16 trained cyclists spent eight weeks at the Centre National de Ski Nordique in Premanon, France. For four of those weeks, they spent 16 hours a day confined to rooms with controlled oxygen levels. Ten of the subjects were kept at a simulated altitude of 3,000 m, and six stayed at the ambient altitude of ~1,000 m.
  • Neither the subjects nor the scientists taking the measurements knew which cyclists were “living high.” Questionnaires during and after the study showed that the subjects hadn’t been able to guess which group they were in.
  • On five occasions before, during and after the four weeks, the subjects underwent a whole series of performance and physiological tests.

So, after going to all this trouble, what were the results?

Hemoglobin mass, maximal O2-uptake in normoxia and at a simulated altitude of 2,500 m and mean power output in a simulated 26.15 km time-trial remained unchanged in both groups throughout the study. Exercise economy (i.e. O2-uptake measured at 200 Watt) did not change during the LHTL-intervention and was never significantly different between groups. In conclusion, four weeks of LHTL using 16 hours per day of normobaric hypoxia did not improve endurance performance or any of the measured associated physiological variables.

This is, frankly, a surprising result, and the paper goes into great detail discussing possible explanations and caveats — especially considering the study didn’t find the same physiological changes (like increased hemoglobin mass, which you’d expect would be placebo-proof) that previous studies have found. Two points worth noting:

(1) The subjects were very well-trained compared to previous studies, with VO2max around 70 ml/kg/min and high initial hemoglobin mass. It’s possible that the beneficial effects of LHTL show up only in less-trained subjects.

(2) There’s a difference between living at 3,000 m and living in a room or tent kept at oxygen levels comparable to 3,000 m: pressure. “Real-world” altitude has lower pressure as well as lower oxygen; this study lowered oxygen but not atmospheric pressure. Apparently a few recent studies have hinted at the possibility that pressure as well as oxygen could play a role in the body’s response to altitude, though this remains highly speculative.

As always, one new study doesn’t erase all previous studies, nor does it override the practical experience of elite athletes. But it suggests that we should think carefully about whether altitude really works the way we’ve been assuming it works. As the researchers conclude:

In summary, our study provides no indication for LHTL, using normobaric hypoxia, to improve time trial performance or VO2max of highly trained endurance cyclists more than conventional training. Given the considerable financial and logistic effort of performing a LHTL camp, this should be taken into consideration before recommending LHTL to elite endurance athletes.

 

Which “rules of running” should you break?

November 28th, 2011

Forgot to mention this earlier — I have an article in this month’s Runner’s World called “Breaking All the Rules,” which is now available online. Basically, I had a chance to chat with a bunch of veteran coaches — Jack Daniels, Frank “Gags” Gagliano, Roy Benson, Jeff Galloway, Hal Higdon and Pete Pfitzinger — and ask them which “rules of running” they’d recommend not following blindly. Here’s one example:

THE RULE: Do prerace strides
For generations, runners have followed the same rituals to warm up before races or workouts: Start with some jogging, move on to a little bit of stretching, then perform a series of “strides”—short sprints lasting about 10 seconds that get your heart pumping and kick-start the delivery of oxygen to your running muscles. But do these timeworn rituals really help us perform better? Jack Daniels, Ph.D., isn’t convinced. “What I most often see at races is a bunch of runners striding up and down at a speed that is clearly faster than the coming race pace,” he says. Since these strides are the last thing runners do before starting the event, that inappropriate pace is fresh in their minds. “And when the gun finally sounds, they ‘stride’ or sprint right out.” The result: a way-too-fast start followed by an inevitable crash.
HOW TO BREAK IT: For shorter events like 5-K and 10-K races, jogging just long enough to get a good sweat going is all you need to do, says Daniels. (For longer races, you can get away with even less: Run the first mile of a half or full marathon as your warmup.) To get the oxygen-boosting benefits of strides without skewing your pace judgment—and screwing up your race result—try a sustained two- to three-minute effort 10 minutes before starting the race or workout. Run it slightly faster than your half-marathon pace, or at a speed that feels moderately hard. You should not be sprinting.

Other topics covered include stretching, the length and pace of your long run, having a recurring weekly training structure, back-to-back hard days, cross-training during injuries, and so on. I should emphasize: none of the coaches are suggesting that existing rules-of-thumb are 100-percent bad. They’re just worth thinking about to make sure that they really do make sense for your personal situation and goals. For example, Higdon questions the wisdom of the 10-percent rule — he’s not saying that you should never increase your mileage by 10 percent; he’s just saying that sometimes it might make sense to increase it by more, sometimes by less, so you shouldn’t follow the rule blindly.

Can you “train” your fingers and toes to withstand cold?

November 28th, 2011

Winter’s coming, so here’s a topical study just published in the European Journal of Applied Physiology. Do your fingers and toes gradually adapt to being exposed to cold temperatures? There are three questions we can ask:

  1. Are the digits able to maintain a higher temperature when they’re exposed to cold?
  2. Are the digits quicker to experience “cold-induced vasodilation” (CIVD)? (When you get cold, your blood vessels constrict; but after a certain point, the vessel walls get so cold that they can’t stay constricted, so you get a sudden rush of blood that helps to warm up your fingers and toes — which turns out to be a very useful response for avoiding frostbite.)
  3. Do your digits hurt less?

Over the years, many researchers have tested whether our digits adapt to cold, and the results are all over the map — some see a positive effect, some see a negative effect, some see no effect. Into the breach come researchers from the Netherlands and from Brock University in Canada. They took 16 volunteers and had them dip their right hand and foot in 8°C water for 30 minutes at a time for 15 consecutive days. At the beginning and end of the experiment, the “trained” hand/foot was compared to the “untrained” hand/foot.

Here’s how skin temperature changed over the 15 days:

The data kind of meanders around, but there’s not much of a clear trend. Unfortunately, there was a clear trend for CIVD: during the pre-training test, 52% of subjects experienced CIVD; during the post-training test, only 24% experienced it.

And finally, the pain score:

On the surface, this might seem like good news: it hurts less as you gradually become accustomed to the unpleasant sensation of being cold. In fact, though, this is bad news. As your body gets used to the cold, you notice it less, but you also are less likely to get the warming effects of CIVD. Combine these two factors, and you become increasingly likely to get frostbite without realizing it.

So what does this mean? Well, it probably ends my dreams of being a polar explorer. I have extremely poor circulation in my fingers, and this suggests that this is unlikely to improve no matter how often I freeze my fingers off. So, despite the odd looks I get, I’m going to continue to run in my big puffy mittens whenever it gets close to freezing, because I won’t get any long-term “training” benefit from suffering.

Why the brain matters in obesity

November 25th, 2011

Those of you interested in nutrition may already be following the online debate between Gary Taubes and Stephan Guyenet — back in August, Guyenet critiqued Taubes’s carbohydrate-insulin hypothesis, and now Taubes is returning the favour by critiquing Guyenet’s food-reward hypothesis. I’m not going to get into the nitty-gritty of the debate here, except to say that I think it’s a mistake to frame this debate as an “either-or.” Despite Taubes’s insistence to the contrary, the two ideas can coexist — and even if they do, I suspect they still don’t add up to the “whole truth” about obesity. Here’s one reason why.

In one of his recent posts, Taubes makes the distinction between body-centred and brain-centred theories of obesity (or you can think of it as physiology vs. psychology, as one of his commenters points out). Taubes believes obesity originates in the body:

In this paradigm, specific foods are fattening because they induce metabolic and hormonal responses in the body — in the periphery, as it’s known in the lingo — that in turn induce fat cells to accumulate fat. The brain has little say in the matter.

Leaving aside the precise mechanism, I largely agree with the idea that regulation of calories in and calories out isn’t under the conscious control of the brain. And I’m pretty sure Guyenet would agree too. But I’m not quite ready to conclude that the brain plays no role.

This is a figure from a study published in the Archives of Pediatrics & Adolescent Medicine in 2009, from researchers at Penn State (no wisecracks please). The text is freely available here. The study followed 1,061 children, who were tested at the age of 3 for self-control (the length of time they were able to refrain from playing with a fun toy after being asked not to) and then again at the age of 5 for delayed gratification (the classic Marshmallow Test, which I’ve written about before, except using M&Ms, animal crackers or pretzels: they could have a small amount anytime, or a larger amount if they waited 3.5 minutes). Then their BMI was tracked until they turned 12.

The results are pretty clear: doing well on either or both of the impulse-control tests predicts less weight gain nine years later. So the question is: how can a test that involves (not) playing with a toy when you’re 3 years old predict future weight gain, if the brain has no say in weight gain?

Let me be absolutely clear: I don’t think “better impulse control” will play any useful role in weight loss for the vast majority of people. Once you’re overweight, I suspect physiology totally swamps psychology in most cases. But if you’re looking for an overall understanding of the mechanisms of weight gain and loss — and if, like Taubes, you insist that the correctness of your theory means that all alternate ideas must be 100% incorrect — then I believe you can’t ignore the brain (and its interactions with the modern food/physical activity environment) completely.

The brain senses macronutrient RATIOs, not just amounts

November 24th, 2011

The classic syllogism of “nutritionism” goes something like this:

  1. Eating food A makes people healthy.
  2. Food A contains nutrient X.
  3. Therefore we should isolate nutrient X, manufacture it in powder form, and ingest it in large quantities to become healthy.

This seems pretty logical, and I certainly wouldn’t have questioned this basic mode of thinking a decade ago. And of course the approach has had many successes — taking vitamin C really does ward off scurvy if, for whatever reason, you’re subsisting on a diet devoid of vitamin C. But when we shift from “correcting deficiencies” to “enhancing health,” the approach seems to sputter, as people like Michael Pollan have argued.

The question, from my perspective, is: Why? Why do so many studies find that taking an isolated nutrient fails to reproduce the benefits observed from ingesting that nutrient in the context of a whole food (or, perhaps even more importantly, a whole meal or whole dietary pattern)? There are obviously many factors, such as the rate at which the nutrients are absorbed, and synergies between different nutrients in the food (a possible explanation for why nitrites are “good” when they come from spinach and beets but “evil” in the context of fatty hot dogs).

A new study published last week in Neuron (press release here, abstract here) offers another clue. The study looked at the activation of “orexin/hypocretin” neurons in the hypothalamus, which “regulate energy balance, wakefulness, and reward.” It has long been known that glucose levels in the brain reduce the activation of these neurons. Researchers at the University of Cambridge tested how they responded to protein and fat, and found that certain amino acids increase the activation of the neurons, while fatty acids have no effect.

Okay, so the brain responds to macronutrient levels in the body. Cool. Protein turns this particular neural signal up, and carbs turn it down. And if you eat both protein and carbs at the same time, you’d expect the net result to be the sum of the two signals. But that’s not what the researchers found. The combined protein-and-carb signal was a nonlinear combination of the two individual signals — meaning that these neurons were, in effect, responding to the protein-to-carb ratio rather than to the absolute amounts. As the researchers put it:

In summary, our data show that the activity in the orx/hcrt system is regulated by macronutrient balance, rather than simply by the caloric content of the diet.

The bottom line: if you try to understand how this particular aspect of human physiology works by breaking food down into its constituent nutrients and testing them one by one, you’re doomed to failure because its response to individual nutrients is different from its response to combinations of nutrients. Which leads to a corollary: if you try to create a healthy diet by assembling a collection of pills and powders, you’re almost certainly sacrificing some of the synergies present in real foods.
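To make the “ratio, not sum” idea concrete, here’s a minimal toy sketch in Python. The response functions are invented for illustration (they’re not taken from the Neuron paper); all they’re meant to capture is the sign of the single-nutrient responses described above, plus the fact that a ratio-sensitive signal can’t be reconstructed by adding up the individual responses.

```python
# Toy model only: invented response functions, not the study's data.

def response_to_protein_alone(protein):
    # amino acids increased activation of the orexin/hypocretin neurons
    return +1.0 * protein

def response_to_carbs_alone(carbs):
    # glucose reduced activation
    return -1.0 * carbs

def combined_response(protein, carbs):
    # Hypothetical ratio-sensitive response: depends on the macronutrient
    # balance, not on the total amount ingested.
    return (protein - carbs) / (protein + carbs)

protein, carbs = 2.0, 1.0

linear_prediction = response_to_protein_alone(protein) + response_to_carbs_alone(carbs)
print(linear_prediction)                  # 1.0  (sum of the individual signals)
print(combined_response(protein, carbs))  # 0.33 (ratio-sensitive signal)

# Doubling both nutrients doubles the linear prediction but leaves the
# ratio-sensitive signal unchanged -- the nonlinearity described in the post.
print(response_to_protein_alone(2 * protein) + response_to_carbs_alone(2 * carbs))  # 2.0
print(combined_response(2 * protein, 2 * carbs))                                    # still 0.33
```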

Sidney Crosby, chiropractic neurology, and the limits of evidence

November 23rd, 2011

The good news: Sidney Crosby is back from the concussions that kept him on the bench for more than 10 months, and he had two goals and two assists in his return against the Islanders last night. But one downside, as a reader pointed out to me in an e-mail, is that Crosby’s return may give added credibility to “chiropractic neurology,” the alternative therapeutic approach that Crosby turned to during his rehab. What exactly is this? I don’t know — and I’m not alone:

It’s a field that’s unfamiliar to many traditional doctors, including Randall Benson, a neurologist at Wayne State in Detroit who has studied several ex-NFL players. Says Benson, “It’s very difficult to evaluate what kind of training, expertise or knowledge a chiropractic neurologist has since I have never heard of [the discipline].”

That’s a quote from David Epstein and Michael Farber’s excellent look at Crosby’s rehab from Sports Illustrated in October. A couple of other interesting quotes:

In 1998, at Parker University, a Dallas chiropractic college, Carrick [the chiropractic neurologist who Crosby worked with] worked on Lucinda Harman before 300 students. Two car accidents and a neurotoxic bite from a brown widow spider had left Harman, herself a Ph.D. in experimental psychology, wheelchair-bound and with headaches, during which she saw spots. “[Carrick] asked if they were red and yellow,” she says. “I said, ‘No, they’re green, blue and purple.’” Carrick informed the audience that this meant her brain was being drastically deprived of oxygen and that, without treatment, she had six months to live. Harman, now 59, says simply, “Miracle.” But Randall Benson says that “there’s nothing out in peer-reviewed literature supporting” an association between the color of spots a patient sees during a headache and the severity of the oxygen deprivation in the brain.

[...]

Carrick, who has had a handful of studies that have appeared in scientific journals, has never published data on vestibular concussions. “We don’t have enough time to publish studies,” he says, “but we’re doing a large one at Life [University] right now.”

It’s a great piece — fair but rigorous. In some ways, though, the most important quote may be the kicker:

“I don’t think this is a case of trying to do something wacky,” Crosby says. “When someone came along and invented the airplane, people must have thought they were out of their mind. Who thinks he can fly? I’m sure people thought that person might have been stretching it a bit… . At the end of the day, as long as the person getting the care is comfortable, I think that’s what’s important.”

Much as my evidence-based personality protests, I do think there’s some truth to that. Especially in cases like this, where — as with so many health conditions — there isn’t a well-established “standard-of-care” treatment. It’s totally different from, say, Steve Jobs choosing “alternative” forms of cancer treatment instead of surgery. In that case, the potential benefits of the surgery are well-known and well-understood. But many people face health conditions where the verdict of the Cochrane review is basically “there is insufficient evidence to conclude that ANY interventions do any good.” In that case, it’s hard to argue against trying other, unproven approaches rather than simply doing nothing.

Of course, sports medicine is a little different — it’s not life-or-death. For pro athletes, the incentive to try anything and everything in order to return to play (and earn money during their brief career window) is enormous. If I were Tiger Woods or Terrell Owens, I would have tried platelet-rich plasma to speed tendon healing too, despite the lack of evidence that it actually works. The problem is that the use of these therapies by sports stars gives the general public the impression that they’re proven, established treatments — hence the huge surge in PRP over the last few years. Will the same thing happen with chiropractic neurology? I hope not. But on the other hand, if someone who’s been in two car accidents and been bitten by a neurotoxic spider is in pain and hasn’t been able to get relief from conventional treatment, I’d have a hard time criticizing them if they decided to give it a try.

Cadence in elite runners increases as they accelerate

November 21st, 2011

One quick graph from a new study by Robert Chapman and his collaborators at Indiana University, just published online in Medicine & Science in Sports & Exercise:

This is data from 18 elite runners (12 male, 6 female), showing their stride frequency as a function of speed. For reference, 3.00 Hz corresponds to 180 steps per minute; 3.3 Hz corresponds to about 200. On the speed axis, 4.0 m/s is 4:10 per km, and 7.0 m/s is 2:23 per km. In other words, these are FAST paces. The key point: they get faster, in part, by quickening their cadence. There’s no magic cadence that they stay at while lengthening their stride to accelerate.
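If you want to check those conversions yourself, here’s a quick sketch of the arithmetic (the numbers are simply the ones quoted above):

```python
# Converting stride frequency (Hz) to steps per minute, and speed (m/s)
# to pace per kilometre -- the conversions used in the paragraph above.

def hz_to_steps_per_minute(freq_hz):
    return freq_hz * 60.0

def speed_to_pace_per_km(speed_m_per_s):
    seconds_per_km = 1000.0 / speed_m_per_s
    minutes, seconds = divmod(round(seconds_per_km), 60)
    return f"{minutes}:{seconds:02d} per km"

print(hz_to_steps_per_minute(3.0))   # 180.0 steps/min
print(hz_to_steps_per_minute(3.3))   # ~198 steps/min
print(speed_to_pace_per_km(4.0))     # 4:10 per km
print(speed_to_pace_per_km(7.0))     # 2:23 per km
```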

Interesting wrinkle: the women have a faster cadence than the men at any given speed. Chapman attributes this partly to the fact that the men are taller — but even normalizing by height doesn’t quite erase the difference. (And that ignores the argument that, as I blogged about here, cadence should scale with the square root of leg length, not leg length itself.) The remaining difference, Chapman hypothesizes, could be due to “application of greater ground forces by the men or differences in muscle fiber type distribution.” This makes sense: if you’re stronger (as the men, on average, will be), you’ll have a stronger push-off and a longer stride, and thus a lower cadence at any given speed. But it seems pretty clear that height plays at least some role.
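For what it’s worth, here’s a rough sketch of what a square-root scaling looks like in practice. The leg lengths and cadences below are made up, and the scaling follows the textbook pendulum argument (natural swing frequency goes as the square root of g divided by leg length), so treat it as an illustration of the normalization idea rather than anything from the study itself.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def dimensionless_cadence(freq_hz, leg_length_m):
    # Pendulum-style scaling: multiplying cadence by sqrt(leg length / g)
    # removes most of the dependence on limb size, so runners of different
    # heights can be compared on a common footing.
    return freq_hz * math.sqrt(leg_length_m / G)

# Hypothetical numbers: a taller and a shorter runner at the same speed.
print(round(dimensionless_cadence(3.0, 0.95), 3))   # taller runner, lower raw cadence
print(round(dimensionless_cadence(3.2, 0.85), 3))   # shorter runner, higher raw cadence
```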

Paleo, the pace of evolution, and chronic stress

November 21st, 2011

My Jockology column in today’s Globe and Mail takes a look at the paleo diet — or rather, the paleo “lifestyle.” The column is actually in the form of an infographic in the paper, beautifully illustrated as “cave art” by Trish McAlaster. Unfortunately, the online version so far just lifts the text, without any of the data and graphics that accompany it. Nonetheless, it’s hopefully worth a read!

As a teaser, here’s an excerpt from a section on how the pace of evolution has changed over the past few thousand years, and what that means for the quest for the perfect “ancestral” diet:

The paleo diet depends on the assumption that our genes haven’t had time to adapt to the “modern” diet. Since evolution depends on random mutations, larger populations evolve more quickly because there’s a greater chance that a particularly favourable mutation will occur. As a result, our genome is now changing roughly 100 times faster than it was during the Paleolithic era, meaning that we have had time to at least partly adapt to an agricultural diet.

The classic example: the ability to digest milk, which developed only in populations that domesticated dairy animals. More than 90 per cent of Swedes, for example, carry this mutation. Finnish reindeer herders, in contrast, acquired genes that allow them to digest meat more efficiently, while other populations can better digest alcohol or grains. The “ideal” ancestral diet is most likely different for everyone. [READ THE WHOLE ARTICLE]

And, as another teaser, here’s a section of Trish’s infographic illustrating the difference between the acute stress of the paleo lifestyle and the chronic stress of modern life:

Compression gear during interval workouts: a new possibility

November 19th, 2011

An interesting wrinkle in the debate over whether compression garments do anything during exercise to improve performance, from a new Australian study just posted in the Journal of Strength & Conditioning Research. The situation so far:

  • Every time you take a step while running, the flexing of your calf muscle operates something called the “calf muscle pump” — basically, your calf literally squeezes the blood vessels in your lower leg, helping to shoot oxygen-depleted blood back toward the heart.
  • Graduated compression of the lower leg (i.e. tighter at the ankle, looser at the knee) is thought to enhance the action of this calf muscle pump, by helping it to squeeze harder. This should reduce the load on your heart and speed the circulation of blood through your body, possibly enhancing performance.
  • One argument against the idea that compression garments boost performance is that, when you’re running hard, the action of the calf muscle pump is already maxed out, so adding more compression doesn’t help. You can’t squeeze more blood from a stone!

The new study put 25 rugby players through a form of interval workout: basically 5:00 easy, 5:00 medium, 5:00 hard, 5:00 easy, 5:00 hard, 5:00 easy. They each did the test twice, once in running shorts and once in full-leg graduated compression bottoms. The researchers measured a bunch of variables (heart rate, oxygen consumption, lactate levels, blood pH) during each stage of the workout. There were basically only two differences between shorts and tights: in the fourth and sixth intervals (i.e. the easy recovery intervals), heart rate and lactate levels were both significantly lower in the compression tights.

On the surface, this fits nicely with the ideas above. The tights don’t help when you’re running fast, since the calf muscle pump is maxed out; but during the easy recovery, the compression does help, resulting in lower lactate and heart rate — and, in theory, better performance on the subsequent hard section.

This is the problem, though: the study didn’t actually measure performance. The pace during each interval was predetermined, so we don’t know whether this difference in physiological parameters actually translates into better real-world performance. That’s a point that was highlighted in another Australian compression study that I blogged about back in August. That study also found physiological “improvements” from compression — but in that case, they also measured performance and found no difference. As the researchers wrote:

However, the magnitude of this improved venous flow through peripheral muscles appears trivial for athletes and coaches, as it did not improve [time-to-exhaustion] performance. This would suggest that any improvement in the clearance of waste products is insufficient to negate the development of fatigue.

Bottom line: I remain skeptical that wearing compression during a run will allow you to run faster. (Note that this is entirely separate from the question of whether wearing compression during and after a run will allow you to avoid or recover more quickly from muscle soreness, a claim that has somewhat better support.) This new study raises the intriguing possibility that compression might boost active recovery during interval workouts — but until it’s directly tested in a performance context, it’s just a hypothesis.

The case against antioxidant vitamin supplements

November 17th, 2011

The December issue of Sports Medicine has an enormous, detailed review of research on the effect of antioxidant supplements (vitamin C, vitamin E, coenzyme Q10 and so on) on training. To most people, this seems like a no-brainer: what could be smarter than popping a multivitamin as “insurance” in case your diet isn’t giving you all the vitamins you need? But (as I’ve blogged about before) there’s an emerging school of thought arguing that taking antioxidants can actually block some of the gains you’d otherwise get from training. Here’s how I explained the debate back in April:

The traditional theory goes like this: strenuous exercise produces “reactive oxygen species” (ROS), which cause damage to cells and DNA in the body. Taking antioxidant supplements like vitamins C and E helps to neutralize the ROS, allowing the body to recover more quickly from workouts.

The new theory, in contrast, goes like this: strenuous exercise produces ROS, which signal to the body that it needs to adapt to this new training stress by becoming stronger and more efficient. Taking antioxidant supplements neutralizes the ROS, which means the body doesn’t receive the same signals telling it to adapt, so you make smaller gains in strength and endurance from your training.

The new paper comes down firmly on the side of the latter view:

The aim of this review is to present and discuss 23 studies that have shown that antioxidant supplementation interferes with exercise training-induced adaptations. The main findings of these studies are that, in certain situations, loading the cell with high doses of antioxidants leads to a blunting of the positive effects of exercise training and interferes with important [reactive oxygen species]-mediated physiological processes, such as vasodilation and insulin signalling.

So is this definitive? Far from it. As the review notes, there have been a few studies that found beneficial effects of antioxidant supplements on exercise performance, tons that have found no effect, and a few (23, to be exact) that have found negative effects. What most of the studies have in common:

As commonly found in sports nutrition research, the vast majority do not adhere to all the accepted features of a high-quality trial (e.g. placebo-controlled, double-blind, randomized design with an intent-to-treat analysis). Indeed, most studies fail to provide sufficient detail regarding inclusion and exclusion criteria, justification of sample size, adverse events, data gathering and reporting, randomization, allocation and concealment methods, and an assessment of blinding success. The poor quality of the majority of studies in this field increases the possibility for bias and needs to be always considered when evaluating the findings.

This is a really important point to bear in mind, and not just when it comes to sports nutrition. Whatever the supplement, training method, or piece of equipment you’re talking about, there’s nearly always a crappy, poorly executed study that seems to “prove” that it works. So where does that leave us? On this topic, I’m in agreement with the authors:

We recommend that an adequate intake of vitamins and minerals through a varied and balanced diet remains the best approach to maintain the optimal antioxidant status in exercising individuals.