Dehydration in the lab vs. the real world
The “right” amount of hydration during exercise is a hot topic these days. For years, lab studies showed that if you forcibly dehydrate someone and then stick them on a treadmill, their performance suffers. But more recently, “real-world” studies (like the one I blogged about here) have shown that in many cases, the fastest finishers in races tend to be the most dehydrated. The difference, according to researchers like Tim Noakes, is that in the real world your brain is able to make pacing decisions that keep your body in a safe state (using the thirst mechanism); on a treadmill at a fixed pace in a lab, your brain is cut out of the loop.
The result is that there are two bodies of research — lab and field — that appear to be answering the same question (how much should we drink?) but that produce completely different answers. So it’s nice to see a study from the “lab” camp that tries to bridge this gap by doing some experiments on an outdoor trail run. Researchers from the University of Connecticut had 14 runners perform two 12K trail runs, one in a hydrated state and the other in a dehydrated state. Here’s what they found:
Pretty straightforward, right? The runners were slower when they were dehydrated. As expected. Case closed. And according to the researchers, we should disregard those inconvenient studies that found that faster runners in real races tend to be more dehydrated:
Although some field studies have found runners to be extremely successful despite considerable body fluid losses, these runners were not compared with a control condition where these same runners remained more optimally hydrated. Therefore, one cannot conclude that performance in these elite runners may have been enhanced if they had maintained or at least attenuated some of their fluid losses while racing.
But hang on a sec. Does this new study really offer valid “field” conditions? Not quite. It may have been conducted outdoors, on nice trails in a local state park, but it nonetheless managed to reproduce all the usual problems of lab studies. First of all, the runners weren’t freely paced: they were instructed to run at a set heart rate, which imposes a rather arbitrary limitation. So they didn’t slow down because they were unable to hold the pace, but because they weren’t allowed to let their heart rate rise. More importantly, the dehydrated runners weren’t allowed to drink or eat “high water content foods” for the 22 hours before the test run! The result:
So what can we conclude from these results? If you subject volunteers to a punishing dehydration regimen before your experiment even starts, their performance will suffer. This is very important to bear in mind next time you’re stranded in the desert. But as for how much water you should drink during your next run, this study has basically nothing to say.
(I should point out that the researchers did measure a number of other physiological parameters in the study, like gastrointestinal temperature. This is useful data. They also argue that holding heart rate steady is useful because it’s “similar to how a cross country or track coach may advise their athletes to maintain a certain intensity level during a run.” Sure, I guess. But if that’s what they’re measuring, then why are they using the results to make claims about what happens in real-world marathons, where nobody starts after a full day of dehydration?)