Active vs. passive warm-up

THANK YOU FOR VISITING SWEATSCIENCE.COM!

My new Sweat Science columns are being published at www.outsideonline.com/sweatscience. Also check out my new book, THE EXPLORER'S GENE: Why We Seek Big Challenges, New Flavors, and the Blank Spots on the Map, published in March 2025.

- Alex Hutchinson (@sweatscience)

***

What exactly is the purpose of a warm-up before exercise? According to a new study in the Journal of Strength and Conditioning Research, it’s:

to enhance physical performance, to reduce muscle soreness, and to prevent sports-related injuries by increasing the body temperature.

But if the main mechanism of the warm-up is literally to warm the body, could we accomplish the same thing by, say, sitting in warm water? That’s what this study tested: three different cycling tests (six minutes at 80% VO2max) after (1) no warm-up, (2) an “active” warm-up of 20 minutes easy cycling, or (3) a “passive” warm-up of soaking the legs in 39°C water for 20 minutes. The result: the active warm-up allowed subjects to use more oxygen (measured VO2) with less effort (lower HR), and possibly lower lactate accumulation (though the latter wasn’t statistically significant).

So what does this mean? It suggests that the benefits of a proper warm-up aren’t just the result of raising your temperature. Higher temperature does confer some benefits: for example, your muscles and tendons become more elastic, reducing the risk of injury. Nerve signals from brain to muscle are transmitted more quickly. The rate of metabolic reactions inside your cells speeds up by 13% for each degree C that the temperature increases.
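That 13%-per-degree figure compounds exponentially, since each additional degree multiplies the rate again. Here’s a minimal sketch of the arithmetic — only the 13% figure comes from the text; the function name and example temperatures are illustrative:

```python
def metabolic_rate_factor(delta_temp_c, per_degree_gain=0.13):
    """Relative speed-up of metabolic reactions after warming by
    delta_temp_c degrees C, assuming the ~13%-per-degree figure
    from the text compounds exponentially."""
    return (1 + per_degree_gain) ** delta_temp_c

# Warming muscle by 2 C speeds reactions by ~28%
print(round(metabolic_rate_factor(2), 2))   # 1.28
# Over a full 10 C, the gain compounds to roughly 3.4x (a Q10 of ~3.4)
print(round(metabolic_rate_factor(10), 1))  # 3.4
```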

But there are other benefits beyond temperature. Crucially, the active warm-up causes your blood vessels to dilate to speed the flow of oxygen to working muscles. When you start the main workout or race, the sudden increase in demand puts you into temporary oxygen debt, because your heart, lungs and muscle metabolism can’t respond instantly to the higher demand. If you’re properly warmed up, your systems are already partly ready for the increased demand (blood vessels dilated from the warm-up, heart rate already elevated, etc.), so they can deliver more oxygen than if they were starting cold. That means the short period of initial oxygen debt doesn’t last as long — and since aerobic metabolism is more efficient than anaerobic metabolism, it means that you’re more efficient overall.
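A common way to picture this is to model oxygen delivery as rising exponentially toward demand at the start of hard exercise; the area between demand and delivery is the oxygen deficit. The sketch below is a toy model under that assumption — all of the numbers (baselines, time constants) are illustrative, not taken from the study:

```python
import math

def oxygen_deficit(demand, baseline, tau_s, duration_s=360, dt=0.5):
    """Integrate the gap between O2 demand and delivery, assuming
    delivery rises mono-exponentially from `baseline` toward `demand`
    with time constant tau_s seconds. Rates in L/s, result in L."""
    deficit, t = 0.0, 0.0
    while t < duration_s:
        delivery = baseline + (demand - baseline) * (1 - math.exp(-t / tau_s))
        deficit += (demand - delivery) * dt
        t += dt
    return deficit

# Cold start: low resting baseline, sluggish response (illustrative values)
cold = oxygen_deficit(demand=3.0 / 60, baseline=0.3 / 60, tau_s=45)
# After an active warm-up: elevated baseline, faster kinetics
warm = oxygen_deficit(demand=3.0 / 60, baseline=1.0 / 60, tau_s=30)
print(cold > warm)  # True: the warmed-up start incurs a smaller deficit
```

For this simple model the closed-form deficit is just (demand − baseline) × tau, which makes the warm-up’s two effects — a higher starting point and faster kinetics — easy to see.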

The practical take-away: well, we all know that warm-ups (as opposed to sitting in a luke-warm bath) are important, so this doesn’t change anything. But there’s still lots of debate about exactly what a warm-up is supposed to do, and what the best way to do it is — hence all the posts about dynamic versus static stretching, for example. In the long run, figuring out which parts of a warm-up really do boost performance will help us design better warm-up routines.

Are fruit and vegetables really more expensive than soy and corn?

I just read an extremely interesting piece on “cheap food,” by Wellesley College prof Robert Paarlberg at Good Magazine (via Yoni Freedhoff’s newish Weighty Matters Facebook page). It’s a mind-twister: Paarlberg argues against the Michael Pollan/Mark Bittman claim that farm subsidies make junk food cheap and healthy food expensive, thus contributing to the obesity problem.

His main claim is that U.S. farm policy acts primarily by propping up prices for foods like sugar, beef and milk, for example by slapping import tariffs on sugar from countries where it’s cheaper to produce. This makes sugar more expensive, not less expensive. Same with ethanol subsidies, which drive the price of corn (and thus high-fructose corn syrup) up, not down. As for direct payments to encourage extra production when prices are low, he argues that these only lower the price of corn “slightly.”

Is this a fair assessment of the overall effect of U.S. farm policy? Or is he just cherry-picking a few elements of the farm policy that support his perspective? According to Bittman, the 2012 U.S. farm bill includes $30 billion of subsidies, of which $5 billion is direct payments to farmers. Direct payments, of course, effectively lower prices, contrary to Paarlberg’s thesis. Where do these direct payments go? According to this analysis, about a third of the total direct payments between 1995 and 2008 went to corn growers, followed by wheat then soybeans.

But we’re still considering the problem too narrowly, I think. Even if we accept the core of Paarlberg’s premise — that subsidies to corn and soy primarily serve to keep prices high so that the farmers earn a reasonable income — that still has far-reaching effects on the rest of the food system. Ethanol subsidies may drive corn prices higher, but they also create an insatiable demand for corn, encouraging farmers to plant corn instead of other crops. By making corn and soy reliably profitable (with production and loan guarantees, by propping up prices, or through whatever other means the farm bill employs), they’re effectively discouraging other crops and making them more expensive.

So what’s the overall effect of all this? The fact is, I don’t have enough information or expertise to figure it out. But we have another option: we can look directly at food prices to see which types of food are getting more or less expensive. Paarlberg does this, and concludes that the price of fruits and vegetables has fallen at exactly the same rate as the price of junk food, drawing on data from a 2008 USDA study. He even helpfully includes a couple of graphs from the study to illustrate his point:

[Figure: price trends for apples and cookies]

Again, we might ask whether this is the full story. If we go back to the USDA report he cites, the very first figure is the following:

[Figure: food prices between 1980 and 2006]

A bit of a different picture! But food prices are complicated, as the authors of the USDA report explain. It’s possible that the rise in the price of fruits and vegetables reflects changes in quality: now we buy peeled baby carrots instead of unpeeled carrots, and we expect fruits to be available all year. That’s why they examine the price trends for specific items like apples and chocolate chip cookies. Of course, once we’re choosing individual products rather than a broad basket of goods, our biases come into play. It’s no surprise that Paarlberg didn’t choose to show the data for, say, broccoli (on the left) and potato chips (on the right):

[Figure: price trends for broccoli and potato chips]

This pair of graphs tells a different story — but they, too, are selectively chosen to make a point. So what’s the truth? It’s very hard to tell; or, conversely, it’s very easy to find data to back up different points of view. Here’s another graph from the Institute for Agriculture and Trade Policy, based on earlier USDA data:

With this surfeit of data to choose from, it’s worth asking who’s choosing what data to show us. In the comments on the Good website, several people point out that Paarlberg was a member of Monsanto’s Biotech Advisory Council to the CEO — not surprising, given that his most recent book is Starved for Science: How Biotechnology Is Being Kept Out of Africa. There’s nothing inherently wrong with advising Monsanto, but it does give us a sense of the perspective with which he’s approaching the data.

Ultimately, I think the article is very much worth reading. It’s a useful — and perhaps much needed — corrective to the idea that all our societal health problems are the result of a conspiracy by big business to cram us full of high-fructose corn syrup. It may well be that, given the realities of modern technology, unhealthy food really is cheaper to produce than healthy food, independent of what the government does — in which case more responsibility devolves onto us to make healthier but more expensive choices. But even if that’s the case, there’s no reason for the government to make the situation worse by continuing to preferentially encourage unhealthy mega-crops over healthy ones!

Heart damage markers disappear 72 hours after marathon

The debate about whether “extreme” exertions like running a marathon can damage your heart continues to simmer. The latest addition is a paper published online last week in Medicine & Science in Sports & Exercise by a group from TU Munich. One of the co-authors is Stefan Moehlenkamp, whose recent studies of fibrosis in the hearts of veteran marathon runners have stirred up controversy.

In this study, they took blood tests (and various other measurements) from 102 participants in the Munich Marathon before the race, immediately after, and 24 and 72 hours after the race. They were looking at the rise and fall of various “cardiac biomarkers” that signal possible heart damage, in particular a newly developed test for cardiac troponins that is much more sensitive than previous tests.

We already know that troponin levels rise after a marathon — but we don’t know whether that’s a signal that heart muscle cells are dying, or whether it just signals some temporary damage, in the same way that your leg muscles are temporarily “damaged” by a marathon but quickly recover. When heart muscle cells actually die, as in a heart attack, levels of troponin stay elevated for four to seven days, as troponin continues to leak from the dead cells. In contrast, temporary damage causes a sharp peak in troponin that returns to normal after a few days. Here are the results:

[Figure: troponin levels return to normal 72 hours after the marathon]

Combining this sharp peak and quick decline with the other measurements in the study, the researchers conclude that “cardiac necrosis [i.e. cell death] during marathon running seems very unlikely.” Instead, the evidence points to temporary damage to cell membranes, possibly caused by decreased availability of oxygen or ATP during the race.
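The diagnostic logic here — sustained elevation means dying cells, a sharp peak that clears means transient leakage — can be sketched as two exponential-clearance curves. Everything below is a toy illustration; the concentrations and half-lives are invented, not clinical values:

```python
def troponin_level(hours, peak, half_life_h, baseline=0.01):
    """Post-peak troponin concentration, assuming simple exponential
    clearance with the given half-life (a toy model, not clinical)."""
    return baseline + (peak - baseline) * 0.5 ** (hours / half_life_h)

# Transient membrane leakage: fast clearance, back near baseline by 72 h
transient = troponin_level(72, peak=0.20, half_life_h=12)
# Necrosis: continued release from dead cells keeps the effective
# half-life long, so levels are still elevated days later
necrosis = troponin_level(72, peak=0.20, half_life_h=120)
print(transient < 0.02 < necrosis)  # True: only necrosis stays elevated
```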

Referring to the earlier study that found fibrosis in veteran marathon runners, the researchers write:

Findings of myocardial injury, as seen in older marathon runners (5) are probably independent of marathon running but rather related to cardiovascular disease or risk factors, particularly smoking.

Does this mean the controversy is over? Far from it. For one thing, this study was written before more recent results showed possible heart damage in elite athletes who weren’t former smokers. More research, as always, is needed. But the results are encouraging — I remain pretty firmly convinced that the cardiac benefits of training for and competing in marathons dramatically outweigh the putative risks.

High-intensity interval training improves insulin sensitivity

“High-intensity interval training” (HIT) has been a big buzzword for the past few years, with plenty of studies showing that short, intense bursts of exercise can produce many of the same results as long, steady cardio sessions. Martin Gibala’s group at McMaster just published a new study in Medicine & Science in Sports & Exercise with a couple of points worth noting:

  • You don’t have to go “all out”: Many of the early studies used 30-second Wingate tests at 100% exertion, which is pretty challenging for inexperienced or unfit exercisers. The more moderate protocol Gibala has been studying is cycling 10 x 60s hard with 60s recovery. The hard sections were done at 60% peak power (80-95% of heart rate reserve) — so hard, but not fall-off-the-bike hard.
  • Anyone can do it: Instead of using relatively fit subjects, this study used older (average age 45) subjects who were sedentary (no regular exercise program for at least a year).

The most interesting result for me: subjects improved their insulin sensitivity by 35% on average after just two weeks, three sessions a week. Lots of other parameters also improved, but insulin sensitivity is something that we know is crucially important in avoiding and managing metabolic syndrome. And the whole workout, including the three-minute warm-up and five-minute warm-down, takes less than half an hour.
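The under-half-an-hour claim is just protocol arithmetic — 10 x (60 s hard + 60 s recovery) plus the warm-up and cool-down. A quick check (the session structure comes from the text; the function itself is mine):

```python
def hit_session_minutes(warmup_min=3, cooldown_min=5,
                        reps=10, work_s=60, recovery_s=60):
    """Total duration of the interval session described in the text:
    10 x 60 s hard with 60 s recovery, plus warm-up and warm-down."""
    interval_min = reps * (work_s + recovery_s) / 60
    return warmup_min + interval_min + cooldown_min

print(hit_session_minutes())  # 28.0 -- comfortably under half an hour
```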

By no means am I suggesting that interval training is the One True Answer to fitness (and neither are Gibala et al.). There are good arguments for varying what type of workout you do. But in terms of bang for buck, it’s hard to compete with HIT.

Can yoga be studied with “conventional” clinical trials?

I’m generally a pretty skeptical guy. If someone tells me that X really works, but it can’t be verified by science because science is just one way of understanding things, I roll my eyes. But bear with me here. A couple of years ago, I interviewed a guy named Timothy McCall, the author of Yoga as Medicine, for an article about yoga research. He was a smart guy with some very interesting things to say, and I still get occasional e-mail updates from his website. The latest includes a link to this article, from the Spring 2011 issue of Yoga Therapy Today, by Nina Moliver, called “Yoga Research: Yes, No, or How?”

The article is a pretty wide-ranging look at the current debate (in yoga circles) about how yoga and science fit together. A couple of points caught my attention. First, McCall and Moliver argue, yoga is a very slow healing technique:

[E]ven six months is a drop in the bucket for a Yoga practice. By privileging short-term studies and standardized protocols, we are forever studying beginners, [and] we are systematically underestimating the healing potential of Yoga in our research…

The bigger argument is that randomized, controlled trials — of any length — to study yoga don’t work, for various complex, holistic reasons that don’t sound very convincing to me. The alternative is observational studies. And as it happens, Moliver completed an award-winning PhD thesis at Northcentral University last year that used an observational design — an online survey — to study yoga in 211 female yoga practitioners plus 182 controls. Observational studies have a lot of problems, in particular the inability to distinguish between cause and effect, as Moliver acknowledges:

For example, if a researcher didn’t randomly assign the participants, it is not possible to know if Yoga practitioners are happier because they practiced Yoga, or if people who were happier were naturally attracted to starting a Yoga practice.

But there are still ways of extracting useful data. For example, if you see a dose-response effect — the longer people have been yoga-ing, the happier they are — that’s pretty suggestive. And as for those confounding variables:

For the Yoga practitioner, these so-called confounders — a healthier diet, a simpler lifestyle, more time outdoors, more kindness and compassion, more loving relationships, more bike-riding, a better path to right livelihood — are not confusing. They are mutually enhancing and reinforcing.

In other words, who cares if you end up happier and healthier because you’ve aligned energy flows in your body or simply because you’ve spent more time being physically active and mindful — the result is what matters. And indeed, Moliver’s study did see evidence of a dose-response relationship in her subjects, some of whom had been practicing yoga for as long as 50 years. (I’m hoping the study will be published, as the abstract isn’t very revealing about the “range of intercorrelated wellness measures” that demonstrated the dose-response effect.)
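For what it’s worth, the dose-response idea is easy to make concrete: with observational survey data, you’d look for a correlation between years of practice and some wellness score. The sketch below uses entirely made-up numbers — nothing here comes from Moliver’s actual data:

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient -- a simple way to check whether
    a 'dose' (years of practice) tracks a 'response' (wellness score)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Entirely invented survey data: years of yoga vs. a wellness score
years    = [0, 1, 2, 5, 10, 20, 35, 50]
wellness = [55, 58, 57, 63, 66, 70, 74, 78]
r = pearson_r(years, wellness)
print(r > 0.9)  # True: a strong positive trend suggests dose-response
```

Of course, as the article itself points out, a correlation like this still can’t distinguish cause from effect — that’s the fundamental limit of the observational design.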

This all sounds very reasonable to me — and in fact, it’s very reminiscent of Paul Williams’ National Runners’ Health Study, which takes a similar observational approach (albeit with more than 100,000 subjects) to tease out dose-response relationships that would be nearly impossible to detect with conventional short-term intervention studies.

One caveat: this approach tells us what works, but it doesn’t tell us how it works. You can’t take an observational study that finds health benefits from yoga and conclude that this proves that we can indeed control the circulation of energy flow in our bodies. To make claims about cause and effect, you really do need proper randomized trials. Notably, Moliver’s study didn’t see any difference between different types of yoga: just doing it, and keeping at it for long periods of time, correlated to better levels of psychological and physical well-being.