Archive for April, 2011

Helmets, head injuries and “risk homeostasis”

April 11th, 2011

This week’s Jockology article in the Globe and Mail takes a look at helmets and the slippery concept of “risk homeostasis” — the idea that wearing protective equipment will cause you to take more risks and cancel out any safety benefits:

The message that Mikael Colville-Andersen offered to his audience at the TEDx conference in Copenhagen last November isn’t what you’d expect from a cycling advocate who travels the world promoting urban bike use.

“There are actually scientific studies that show that your risk of brain injuries is higher when you’re wearing a helmet, and that you have a 14-per-cent greater chance of getting into an accident with a helmet on,” he said. “These are not things that we hear about too often.”

With hockey head shots in the news, and revelations that ex-players like Bob Probert suffered from a form of brain damage called chronic traumatic encephalopathy, head protection is a hot topic. But Mr. Colville-Andersen’s controversial anti-helmet crusade offers a reminder that technology and equipment, on their own, can’t keep us safe. We have to consider the underlying factors that influence our risk-taking decisions – and those of the people around us… [READ THE WHOLE ARTICLE]

UPDATE April 11: Lots of e-mails and comments on the Globe site: this is clearly a controversial topic, so I’d like to expand on a couple of points. The truth is that I started working on this article with the idea of presenting the counterintuitive research that Mikael Colville-Andersen discusses in his TEDx talk, showing why helmets are actually a bad idea. But when I dug into the literature, I found that the picture was much more complicated than the way he portrayed it.

There are really two separate questions: physics and public policy. The first is fairly straightforward and asks whether, in the context of certain types of mishaps that sometimes occur on bicycles, a helmet will significantly reduce the severity of your injury. In the lab, we can clearly see that helmets can mitigate the effects of certain impacts. It’s much harder to show that this is the case in the real world, because we can’t run “controlled” experiments in which people go head-over-heels with and without helmets. And statistical studies of injury rates are hindered by all sorts of very serious limitations (e.g. we don’t actually know how cycling rates change, let alone traffic conditions, road surfaces, cycling skills, etc.).

Now, one could argue (as many people do) that the lack of crystal-clear epidemiological evidence of reduced cycling-related head injuries is PROOF that helmets don’t work. This is specious. Just because something is difficult to prove doesn’t mean that it’s not true. We can debate where the burden of proof should lie, but my personal assessment of the laboratory data, in combination with the admittedly circumstantial epidemiological data, is that there are certain situations in which I’d be very glad to have a helmet on my head. I certainly accept that other people might look at the same data I looked at and decide that helmets aren’t worthwhile – after all, risk calculation is a personal thing. But those who claim that it’s “proven” that helmets make no difference whatsoever in the context of individual accident scenarios are simply delusional, in my opinion.

Where Colville-Andersen is more convincing is in his public policy arguments, rooted in some of the factors I discuss in the article (risk homeostasis, safety in numbers, etc.). There have been dozens and dozens of studies on cycling injury rates and how they changed (or not) with the introduction of helmet campaigns and laws. My impression after reading through many of these studies: there are more studies supporting the efficacy of helmets than there are null studies, but the results are surprisingly weak and the overall conclusions are equivocal at best. All of the studies are plagued by serious methodological challenges inherent in trying to study this question. There are certainly plenty of studies that forcefully conclude that helmets are either good or no good, and plenty of people who cherry-pick those results in order to claim that the debate is settled one way or the other. But if you open both eyes and look at the totality of the data, I don’t believe there’s enough evidence to reach a conclusion either way.

And that leaves us with some interesting policy debates. If the overall effect of helmets is weak at best, are we justified in imposing a helmet law on children? How many cases of brain damage in Ontario would have to be avoided each year to make such a law worthwhile? One? Ten? 100? 0.1? These aren’t science questions, they’re policy questions. They’re worth debating – certainly, I’m more open to the idea of scrapping helmet laws than I would have been before watching Colville-Andersen’s talk. But as I concluded in the Globe article: based on what I learned from going through all this literature, I’ll still be wearing a helmet when I bike.

In praise of balaclavas: facial heat loss makes fingers cold

April 9th, 2011

Okay, this update isn’t exactly seasonal in the northern hemisphere — but winter will be back in another six months, so tuck this away for future reference. Researchers at the U.S. Army Research Institute of Environmental Medicine (Thermal and Mountain Medicine Division) did a neat little study in the European Journal of Applied Physiology looking at the effect of heat loss through the face.

Basically, 10 volunteers each spent two 60-minute sessions standing in a cold chamber (-15 C) facing into a 3 m/s wind (wind chill equivalent to -20 C). In one session, they wore a balaclava and goggles; in the other session, their face was bare (the balaclava was still on, but pulled down to expose the face, and the goggles were removed). Your face tries to stay as warm as possible, meaning that blood vessels there don’t constrict much even when it gets cold — so there’s the potential to lose large amounts of heat. In addition, there are some direct cues: for example, facial cold exposure triggers the trigeminal nerve, which causes blood vessels in the extremities to constrict.

All of this means your fingers and toes are more likely to get cold. And that is, indeed, what they found:

finger temperature with balaclava

The other part of the test was that the subjects had to take their mittens off (while still wearing thin gloves) at multiple points during the test to perform dexterity tests (the Purdue Pegboard and the Minnesota Rate of Manipulation, for what it’s worth). Surprisingly, there were no differences between the trials. Apparently your hands get cold so quickly once the mittens come off that you no longer get any benefit from having been fractionally warmer a moment earlier.

So the takeaway here: if you get cold hands during winter exercise, keeping your face covered could make a difference. And there’s a bonus insight. Ever since I was a kid, I remember people saying things like “Wear a hat, because the body loses X percent of its heat through your head,” where X was always some enormous number like 90. I’ve always wondered what the actual numbers are. Here’s a passage from the introduction to this paper:

Froese and Burton (1957) provide an example of whole-body cold exposure (-4°C, 2.2 m/s wind) where about half of the resting heat production would be lost from a bare head if the rest of the body was well insulated (5 clo). They (Froese and Burton 1957) estimated that the addition of relatively little insulation (2.4 clo) on the head would restore heat balance, although a higher amount (3.5 clo) would be required if the face remained exposed to cold. If thermal face protection can restore heat balance, extremity cooling would also likely be limited.

So there you go. For that particular set of circumstances, about half of the heat your body produces is lost from the head (including the face). Of course, if you start moving, or if the rest of your body isn’t that warmly dressed, or if the temperature or wind conditions change, then the conclusions change. But still, 50% gives us a ballpark estimate. Neat.

And one final aside: one “clo” is defined as the standard amount of insulation required to keep a resting person warm in a windless room at 70 °F (21.1 °C).
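Out of curiosity, we can sanity-check the Froese and Burton scenario with the standard conversion 1 clo = 0.155 m²·K/W. The surface areas, skin temperature, and bare-head insulation value below are my own rough assumptions, not figures from the paper — this is strictly a back-of-envelope sketch:

```python
# Back-of-envelope check of the Froese and Burton scenario (-4 C air).
# Assumed values (NOT from the paper): body surface ~1.7 m^2, head ~0.10 m^2,
# mean skin temperature ~33 C, bare-head boundary air layer ~0.5 clo.
CLO = 0.155  # m^2*K/W, the standard thermal resistance of 1 clo

def heat_loss_watts(area_m2, delta_t_k, insulation_clo):
    """Steady heat loss through an insulating layer: Q = A * dT / R."""
    return area_m2 * delta_t_k / (insulation_clo * CLO)

delta_t = 33 - (-4)  # skin-to-air temperature difference, in K

body_loss = heat_loss_watts(1.7, delta_t, 5.0)   # well-insulated body (5 clo)
head_loss = heat_loss_watts(0.10, delta_t, 0.5)  # bare head

print(round(body_loss), round(head_loss))  # 81 48
```

With these assumptions the bare head dumps roughly 48 W — about half of a typical ~100 W resting heat production, which is at least in the same ballpark as the paper’s claim.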

Hiking the Three Passes route in Nepal’s Everest region

April 8th, 2011

Mount Everest viewed from Kala Pattar

We now pause for a short bit of self-promotion: my article in Sunday’s New York Times travel section is now available online. It’s about the trip to Nepal Lauren and I took last December, where we hiked a route called the Three Passes — a way of seeing the Everest region without spending all our time in the traffic jams along the route to Base Camp:

PERCHED on a narrow platform 17,500 feet above sea level, we paused to snack on boiled potatoes and the spicy Tibetan dumplings called momos, and to drink in the view.

We were at the top of the Renjo La, the pass that is the lowest point along a knife-edged ridge separating two valleys. Behind us, looming above a turquoise glacial lake, was Mount Everest. In front of us, an immense stone staircase led down into a valley dotted with roofless stone shelters and the occasional yak — a ribbon of green hemmed in by the soaring gray and white of Himalayan rock and ice.

Stunned into silence by the panorama, we descended the staircase and hiked on in a reverie. It wasn’t until we reached the banks of a fast-flowing river a few hours later that we noticed that the landscape no longer corresponded to the lines and dots on our map. We’d hiked for five hours without seeing another living soul, and, perhaps in part because of our solitude, somewhere along the way had taken a wrong turn…[READ THE WHOLE ARTICLE]

There’s also a nice slide-show accompanying the article, with some pictures from the trip.

Skipping breakfast leads to lead poisoning?

April 7th, 2011

A few months ago, I blogged about the controversy surrounding whether eating breakfast is a good strategy for people trying to lose weight. I (along with expert clinicians like Yoni Freedhoff) am in the pro-breakfast camp, but a few readers offered well-supported arguments against breakfast.

So I’ve been biding my time since then, waiting for a slam-dunk argument — and now I’ve got it! A new study in the journal Environmental Health looked at blood levels of lead in a group of 1,344 children in China. Apparently, it has been shown previously that fasting increases the rate of lead absorption in the gastrointestinal tract. So if you don’t eat breakfast, this daily mini-fast could cause your body to absorb more lead into the bloodstream. Sure enough, after controlling for factors like age and gender, the study found that regular breakfast-eaters (as reported by their parents) had 15% less lead in their blood than regular breakfast skippers.

In all seriousness, this is unlikely to be relevant to anyone who doesn’t have lead paint on their walls or a toy-box full of lead toys. I just thought it was interesting — and it does show that eating patterns and timing do affect how your body processes the food (and heavy metals) that pass through your gut. Overall, the research on breakfast and weight control is still pretty muddled and conflicting. I remain pro-breakfast, but I realize this study isn’t going to win anyone over!

UPDATE April 8: Perfect timing: I just noticed that Peter Janiszewski over at Obesity Panacea has a post on a new prospective study showing that breakfast-skippers aren’t just heavier in cross-sectional analyses, but also gained the most weight over a two-year follow-up. It still suffers from the same flaws as any non-randomized trial (i.e. the skippers could be the ones who are already battling weight problems), but it’s an interesting finding nonetheless.

Active vs. passive warm-up

April 6th, 2011

What exactly is the purpose of a warm-up before exercise? According to a new study in the Journal of Strength and Conditioning Research, it’s:

to enhance physical performance, to reduce muscle soreness, and to prevent sports-related injuries by increasing the body temperature.

But if the main mechanism of the warm-up is literally to warm the body, could we accomplish the same thing by, say, sitting in warm water? That’s what this study tested: three different cycling tests (six minutes at 80% VO2max) after (1) no warm-up, (2) an “active” warm-up of 20 minutes easy cycling, or (3) a “passive” warm-up of soaking the legs in 39-C water for 20 minutes. The result: the active warm-up allowed subjects to use more oxygen (measured VO2) with less effort (lower HR), and possibly lower lactate accumulation (though the latter wasn’t statistically significant).

So what does this mean? It suggests that the benefits of a proper warm-up aren’t just the result of raising your temperature. Higher temperature does confer some benefits: for example, your muscles and tendons become more elastic, reducing the risk of injury. Nerve signals from brain to muscle are transmitted more quickly. And the rate of metabolic reactions inside your cells speeds up by 13% for each degree C that the temperature increases.
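That 13%-per-degree figure compounds, so even a modest rise in muscle temperature adds up. A quick sketch (the 2 C rise is just an illustrative number, not from the study):

```python
def rate_multiplier(delta_t_c, per_degree=0.13):
    """Relative speed-up of metabolic reactions for a temperature rise,
    assuming a constant 13% increase per degree C, compounded."""
    return (1 + per_degree) ** delta_t_c

# An illustrative 2 C rise in muscle temperature:
print(round(rate_multiplier(2.0), 2))  # 1.28, i.e. reactions run ~28% faster
```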

But there are other benefits beyond temperature. Crucially, the active warm-up causes your blood vessels to dilate, speeding the flow of oxygen to working muscles. When you start the main workout or race, the sudden increase in demand puts you into temporary oxygen debt, because your heart, lungs and muscle metabolism can’t respond instantly to the higher demand. If you’re properly warmed up, your systems are already partly ready for the increased demand (blood vessels dilated from the warm-up, heart rate already elevated, etc.), so they can deliver more oxygen than if they were starting cold. That means the short period of initial oxygen debt doesn’t last as long — and since aerobic metabolism is more efficient than anaerobic metabolism, it means that you’re more efficient overall.
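One standard way to picture this is with mono-exponential VO2 kinetics: at exercise onset, oxygen uptake climbs toward the new demand with some time constant, and the “deficit” is the area between demand and actual uptake. The numbers below are hypothetical, chosen only to illustrate why starting from a higher baseline with faster kinetics shrinks the deficit:

```python
# Mono-exponential VO2 kinetics at exercise onset:
#   vo2(t) = demand - (demand - baseline) * exp(-t / tau)
# The oxygen deficit (area between demand and actual uptake) integrates to
#   deficit = (demand - baseline) * tau
# All numbers below are hypothetical illustrations, not study data.

def o2_deficit_litres(demand_lpm, baseline_lpm, tau_s):
    """Oxygen deficit in litres, for rates in L/min and a time constant in s."""
    return (demand_lpm - baseline_lpm) * tau_s / 60

cold_start = o2_deficit_litres(demand_lpm=3.5, baseline_lpm=0.5, tau_s=40)
warmed_up = o2_deficit_litres(demand_lpm=3.5, baseline_lpm=1.2, tau_s=30)
print(cold_start, round(warmed_up, 2))  # 2.0 1.15 (litres of deficit)
```

The warmed-up case starts closer to the demand and responds faster, so the deficit — the part covered anaerobically — is nearly halved.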

The practical take-away: well, we all know that warm-ups (as opposed to sitting in a lukewarm bath) are important, so this doesn’t change anything. But there’s still lots of debate about exactly what a warm-up is supposed to do, and what the best way to do it is — hence all the posts about dynamic versus static stretching, for example. In the long run, figuring out which parts of a warm-up really do boost performance will help us design better warm-up routines.

Are fruit and vegetables really more expensive than soy and corn?

April 5th, 2011

I just read an extremely interesting piece on “cheap food,” by Wellesley College prof Robert Paarlberg at Good Magazine (via Yoni Freedhoff’s newish Weighty Matters Facebook page). It’s a mind-twister: Paarlberg argues against the Michael Pollan/Mark Bittman claim that farm subsidies make junk food cheap and healthy food expensive, thus contributing to the obesity problem.

His main claim is that U.S. farm policy acts primarily by propping up prices for foods like sugar, beef and milk, for example by slapping import tariffs on sugar from countries where it’s cheaper to produce. This makes sugar more expensive, not less expensive. Same with ethanol subsidies, which drive the price of corn (and thus high-fructose corn syrup) up, not down. As for direct payments to encourage extra production when prices are low, he argues that these only lower the price of corn “slightly.”

Is this a fair assessment of the overall effect of U.S. farm policy? Or is he just cherry-picking a few elements of the farm policy that support his perspective? According to Bittman, the 2012 U.S. farm bill includes $30 billion of subsidies, of which $5 billion is direct payments to farmers. Direct payments, of course, effectively lower prices, contrary to Paarlberg’s thesis. Where do these direct payments go? According to this analysis, about a third of the total direct payments between 1995 and 2008 went to corn growers, followed by wheat then soybeans.

But we’re still considering the problem too narrowly, I think. Even if we accept the core of Paarlberg’s premise — that subsidies to corn and soy primarily serve to keep prices high so that the farmers earn a reasonable income — that still has far-reaching effects on the rest of the food system. Ethanol subsidies may drive corn prices higher, but they also create an insatiable demand for corn, encouraging farmers to plant corn instead of other crops. By making corn and soy reliably profitable (with production and loan guarantees, by propping up prices, or through whatever other means the farm bill employs), they’re effectively discouraging other crops and making them more expensive.

So what’s the overall effect of all this? The fact is, I don’t have enough information or expertise to figure it out. But we have another option: we can look directly at food prices to see which types of food are getting more or less expensive. Paarlberg does this, and concludes that the price of fruits and vegetables has fallen at exactly the same rate as the price of junk food, drawing on data from a 2008 USDA study. He even helpfully includes a couple of graphs from the study to illustrate his point:

price trend for apples and cookies

Again, we might ask whether this is the full story. If we go back to the USDA report he cites, the very first figure is the following:

food prices between 1980 and 2006

A bit of a different picture! But food prices are complicated, as the authors of the USDA report explain. It’s possible that the rise in the price of fruits and vegetables reflects changes in quality: now we buy peeled baby carrots instead of unpeeled carrots, and we expect fruits to be available all year. That’s why they examine the price trends for specific items like apples and chocolate chip cookies. Of course, once we’re choosing individual products rather than a broad basket of goods, our biases come into play. It’s no surprise that Paarlberg didn’t choose to show the data for, say, broccoli (on the left) and potato chips (on the right):

food price broccoli potato chips

This pair of graphs tells a different story — but they, too, are selectively chosen to make a point. So what’s the truth? It’s very hard to tell; or, conversely, it’s very easy to find data to back up different points of view. Here’s another graph from the Institute for Agriculture and Trade Policy, based on earlier USDA data:

With this surfeit of data to choose from, it’s worth asking who’s choosing what data to show us. In the comments on the Good website, several people point out that Paarlberg was a member of Monsanto’s Biotech Advisory Council to the CEO — not surprising, given that his most recent book is Starved for Science: How Biotechnology Is Being Kept Out of Africa. There’s nothing inherently wrong with advising Monsanto, but it does give us a sense of the perspective with which he’s approaching the data.

Ultimately, I think the article is very much worth reading. It’s a useful — and perhaps much needed — corrective to the idea that all our societal health problems are the result of a conspiracy by big business to cram us full of high-fructose corn syrup. It may well be that, given the realities of modern technology, unhealthy food really is cheaper to produce than healthy food, independent of what the government does — in which case more responsibility devolves onto us to make healthier but more expensive choices. But even if that’s the case, there’s no reason for the government to make the situation worse by continuing to preferentially encourage unhealthy mega-crops over healthy ones!

Heart damage markers disappear 72 hours after marathon

April 3rd, 2011

The debate about whether “extreme” exertions like running a marathon can damage your heart continues to simmer. The latest addition is a paper published online last week in Medicine & Science in Sports & Exercise by a group from TU Munich. One of the co-authors is Stefan Moehlenkamp, whose recent studies of fibrosis in the hearts of veteran marathon runners have stirred up controversy.

In this study, they took blood tests (and various other measurements) from 102 participants in the Munich Marathon before, immediately after, 24 hours after, and 72 hours after the race. They were looking at the rise and fall of various “cardiac biomarkers” that signal possible heart damage, in particular a newly developed test for cardiac troponins that is much more sensitive than previous tests.

We already know that troponin levels rise after a marathon — but we don’t know whether that’s a signal that heart muscle cells are dying, or whether it just signals some temporary damage, in the same way that your leg muscles are temporarily “damaged” by a marathon but quickly recover. When heart muscle cells actually die, as in a heart attack, levels of troponin stay elevated for four to seven days, as troponin continues to leak from the dead cells. In contrast, temporary damage causes a sharp peak in troponin that returns to normal after a few days. Here are the results:

troponin levels return to normal 72hrs after marathon

Combining this sharp peak and quick decline with the other measurements in the study, the researchers conclude that “cardiac necrosis [i.e. cell death] during marathon running seems very unlikely.” Instead, the evidence points to temporary damage to cell membranes, possibly caused by decreased availability of oxygen or ATP during the race.

Referring to the earlier study that found fibrosis in veteran marathon runners, the researchers write:

Findings of myocardial injury, as seen in older marathon runners (5) are probably independent of marathon running but rather related to cardiovascular disease or risk factors, particularly smoking.

Does this mean the controversy is over? Far from it. For one thing, this study was written before more recent results showed possible heart damage in elite athletes who weren’t former smokers. More research, as always, is needed. But the results are encouraging — I remain pretty firmly convinced that the cardiac benefits of training for and competing in marathons dramatically outweigh the putative risks.

High-intensity interval training improves insulin sensitivity

April 2nd, 2011

“High-intensity interval training” (HIT) has been a big buzzword for the past few years, with plenty of studies showing that short, intense bursts of exercise can produce many of the same results as long, steady cardio sessions. Martin Gibala’s group at McMaster just published a new study in Medicine & Science in Sports & Exercise with a couple of points worth noting:

  • You don’t have to go “all out”: Many of the early studies used 30-second Wingate tests at 100% exertion, which is pretty challenging for inexperienced or unfit exercisers. The more moderate protocol Gibala has been studying is cycling 10 x 60s hard with 60s recovery. The hard sections were done at 60% peak power (80-95% of heart rate reserve) — so hard, but not fall-off-the-bike hard.
  • Anyone can do it: Instead of using relatively fit subjects, this study used older (average age 45) subjects who were sedentary (no regular exercise program for at least a year).
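For reference, “80-95% of heart rate reserve” is usually computed with the Karvonen formula. The resting and maximum heart rates below are hypothetical examples, not values from the study:

```python
def target_hr(hr_rest, hr_max, fraction):
    """Karvonen method: target = rest + fraction * (max - rest),
    where (max - rest) is the heart rate reserve."""
    return hr_rest + fraction * (hr_max - hr_rest)

# Hypothetical 45-year-old: resting HR 70, estimated max 175 (220 - age)
print(round(target_hr(70, 175, 0.80)))  # 154 bpm, bottom of the range
print(round(target_hr(70, 175, 0.95)))  # 170 bpm, top of the range
```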

The most interesting result for me: subjects improved their insulin sensitivity by 35% on average after just two weeks, three sessions a week. Lots of other parameters also improved, but insulin sensitivity is something that we know is crucially important in avoiding and managing metabolic syndrome. And the whole workout, including the three-minute warm-up and five-minute warm-down, takes less than half an hour.

By no means am I suggesting that interval training is the One True Answer to fitness (and neither are Gibala et al.). There are good arguments for varying what type of workout you do. But in terms of bang for buck, it’s hard to compete with HIT.

Can yoga be studied with “conventional” clinical trials?

April 1st, 2011

I’m generally a pretty skeptical guy. If someone tells me that X really works, but it can’t be verified by science because science is just one way of understanding things, I roll my eyes. But bear with me here. A couple of years ago, I interviewed a guy named Timothy McCall, the author of Yoga as Medicine, for an article about yoga research. He was a smart guy with some very interesting things to say, and I still get occasional e-mail updates from his website. The latest includes a link to this article, from the Spring 2011 issue of Yoga Therapy Today, by Nina Moliver, called “Yoga Research: Yes, No, or How?”

The article is a pretty wide-ranging look at the current debate (in yoga circles) about how yoga and science fit together. A couple of points caught my attention. First, McCall and Moliver argue, yoga is a very slow healing technique:

[E]ven six months is a drop in the bucket for a Yoga practice. By privileging short-term studies and standardized protocols, we are forever studying beginners, [and] we are systematically underestimating the healing potential of Yoga in our research…

The bigger argument is that randomized, controlled trials — of any length — to study yoga don’t work, for various complex, holistic reasons that don’t sound very convincing to me. The alternative is observational studies. And as it happens, Moliver completed an award-winning PhD thesis at Northcentral University last year that used an observational design — an online survey — to study yoga in 211 female yoga practitioners plus 182 controls. Observational studies have a lot of problems, in particular the inability to distinguish between cause and effect, as Moliver acknowledges:

For example, if a researcher didn’t randomly assign the participants, it is not possible to know if Yoga practitioners are happier because they practiced Yoga, or if people who were happier were naturally attracted to starting a Yoga practice.

But there are still ways of extracting useful data. For example, if you see a dose-response effect — the longer people have been yoga-ing, the happier they are — that’s pretty suggestive. And as for those confounding variables:

For the Yoga practitioner, these so-called confounders — a healthier diet, a simpler lifestyle, more time outdoors, more kindness and compassion, more loving relationships, more bike-riding, a better path to right livelihood — are not confusing. They are mutually enhancing and reinforcing.

In other words, who cares if you end up happier and healthier because you’ve aligned energy flows in your body or simply because you’ve spent more time being physically active and mindful — the result is what matters. And indeed, Moliver’s study did see evidence of a dose-response relationship in her subjects, some of whom had been practicing yoga for as long as 50 years. (I’m hoping the study will be published, as the abstract isn’t very revealing about the “range of intercorrelated wellness measures” that demonstrated the dose-response effect.)
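To make the dose-response idea concrete: with observational survey data, you’d look for a positive correlation between years of practice and some wellness measure. The rows below are made-up numbers purely for illustration — Moliver’s actual data and measures aren’t spelled out in the abstract:

```python
import math

def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical survey rows: (years of yoga practice, wellness score 0-100)
years = [0.5, 2, 5, 10, 20, 35, 50]
wellness = [58, 61, 64, 70, 72, 78, 80]

r = pearson_r(years, wellness)
print(round(r, 2))  # a strong positive r is suggestive of dose-response
```

A strong positive correlation doesn’t settle cause and effect, of course — but it’s the kind of pattern that’s hard to explain away as pure self-selection at entry.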

This all sounds very reasonable to me — and in fact, it’s very reminiscent of Paul Williams’ National Runners’ Health Study, which takes a similar observational approach (albeit with more than 100,000 subjects) to tease out dose-response relationships that would be nearly impossible to detect with conventional short-term intervention studies.

One caveat: this approach tells us what works, but it doesn’t tell us how it works. You can’t take an observational study that finds health benefits from yoga and conclude that this proves that we can indeed control the circulation of energy flow in our bodies. To make claims about cause and effect, you really do need proper randomized trials. Notably, Moliver’s study didn’t see any difference between different types of yoga: just doing it, and keeping at it for long periods of time, correlated with better levels of psychological and physical well-being.