Archive for June, 2010

Jungle bound until July 7

June 26th, 2010

I’m flying out tomorrow morning to Papua New Guinea for a hiking trip along the Kokoda Track — definitely new territory for Lauren and me. A friend at Outside sent me this link to a story about a hike in the same region from a few years ago, which made me excited and a little nervous. While we’ve done some fairly intense hikes in Canada and the Australian outback, this will be a different kind of trip. We know a lot about bear-proofing and alpine passes, but not a lot about leeches and jungle rivers…

[photo from the Outside story]

For the record, while most Australians who hike this route do so as part of fully outfitted groups (it’s a World War II battle site with significant meaning to Aussies), we’re schlepping our own food and gear. We will have a guide with us, though. The one thing we don’t have (which this bit of research from last week is making me regret) is hiking poles…

Anyway, the point of this post is that it’s astronomically unlikely that I’ll find any Internet between now and July 7 — so check back then as I start sorting through the backlog of cool studies that I have sitting on my desk right now!

[The picture above, from the Outside story I linked to, is by Philipp Engelhorn.]

Jockology: training for soccer

June 24th, 2010

This week’s Jockology column rounds up a bunch of research on the optimal preparation and training for soccer: the mechanics of kicking, the physiology of repeated short sprints, the psychology of penalty kicks, the optimal warm-up and nutrition, rapid direction changes, etc. It’s in the form of a big infographic, put together by Trish McAlaster, the talented artist I often work with at the Globe. (We’re currently working on a pretty cool graphic for the next column — stay tuned!)

Most interesting bit of info in the current column, for me, was this: when you run a short sprint, you get about 20% of the ATP you need from aerobic processes, and 80% from anaerobic processes. But if you keep sprinting (as you would for a soccer game), the third sprint is already 50% aerobic/50% anaerobic, and the “Nth” sprint is 75% aerobic/25% anaerobic. So if you want to be fast late in the game, you need to fuel yourself like an endurance athlete.

(This info comes from Stuart Phillips‘ chapter in the book Sports Nutrition: From Lab to Kitchen. And I actually simplified the info a bit for the column by combining the contributions from phosphocreatine with other anaerobic sources. The actual split for aerobic/anaerobic/phosphocreatine is 20/30/50 for the first sprint, 50/20/30 for the third, and 75/5/20 for the Nth.)
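The simplification described above is just a matter of lumping the two anaerobic contributions together. A quick sketch using the figures from the chapter (the variable names are mine):

```python
# ATP contributions per sprint as (aerobic, anaerobic glycolysis,
# phosphocreatine) percentages, per Stuart Phillips' chapter.
splits = {
    "first sprint": (20, 30, 50),
    "third sprint": (50, 20, 30),
    "Nth sprint": (75, 5, 20),
}

for sprint, (aerobic, glycolytic, pcr) in splits.items():
    # The column's simplified split counts glycolysis and PCr
    # together as "anaerobic".
    anaerobic = glycolytic + pcr
    print(f"{sprint}: {aerobic}% aerobic / {anaerobic}% anaerobic")
```

Running that reproduces the 80/50/25 percent anaerobic figures quoted in the column.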


Whole-body compression garments for soccer players?

June 20th, 2010

A few more volleys have been fired in the escalating Compression Garment Wars. Australian researchers have just posted a new study in the Journal of Strength and Conditioning Research that gives a thumbs-up to whole-body compression garments for the repeated-sprint activity typical of soccer and other field sports. Meanwhile, researchers at Indiana University have reached the opposite conclusion about the ability of compression leggings to help running endurance and jumping ability.

First the Australian study: they used full-body Skins compression suits — yep, that’s long sleeves and full leggings. The test was a treadmill simulation of the demands of a typical soccer game, with a mix of walking, jogging, running and sprinting for 45 minutes; the more distance you cover, the “better” you’ve done. They had volunteers do the test twice, once with Skins and once without, and found a “moderate strength likely improvement” in distance covered (5.42 vs. 5.88 km), and similar “likely” improvements in muscle oxygenation, but no changes in heart rate and VO2.

(This “likely” business means they didn’t find statistically significant differences — not surprisingly given the small sample size — but still have reason to believe that the difference is big enough to matter to athletes, if I understand correctly.)

Now, I believe the oxygenation stuff. No doubt that if you cram yourself into one of these suits, you are affecting your physiology in some way. (For example, as the authors note, similar previous studies have found that — surprise, surprise — full body suits “may have some thermoregulatory effects.” In other words, they make you hotter.)

But the question is, are these small physiological effects translating into meaningful performance changes? On the surface, you might think this study answers yes. However, it absolutely boggles my mind that anyone could write a paper like this and not even mention the possibility of a placebo effect. Really? You put one group in a $100 spacesuit, the other group in a cotton undershirt, and you’re surprised to see an 8% improvement in a very “soft” measurement with a 12% error bar? And then you conclude, with a straight face, that these things will make you a better soccer player because you’ll have more oxygen in your lower limb muscles, even though the increased speed was only seen during the jogging portions of the test and not the fast running or sprinting?
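The back-of-envelope arithmetic behind that 8% figure, for what it’s worth (the distances are from the study; the variable names are mine):

```python
# Distance covered (km) in the treadmill soccer simulation,
# without and with the full-body compression suit.
without_suit = 5.42
with_suit = 5.88

# Relative improvement as a percentage.
improvement = (with_suit - without_suit) / without_suit * 100
print(f"Improvement: {improvement:.1f}%")  # roughly 8.5%, i.e. the ~8% quoted above
```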

Okay, so on the other hand, two new studies from Indiana. One tested lower leg compression garments on 16 distance runners, and found no changes in muscle oxygenation, running economy, or mechanics. The other tested the upper thigh compression shorts that basketball players love, using three sizes (Goldilocks style: one that was a size up, one that was a size down, and one that was just right). Vertical jump was the same in all three cases.

The most interesting point in the Indiana studies may be the following:

Although overall the study found that the compression garment had no effect on running mechanics and economy, there was some variation. Four subjects had an average of greater than one percent increase in oxygen consumption — their economy worsened — while wearing the compression garment. However, four other subjects experienced a greater than one percent decrease in oxygen consumption — their economy improved — while wearing the compression garment. Laymon had her subjects complete a subjective questionnaire about their feelings toward compression garments before completing their tests. It turned out that the subjects who experienced improvement in their economy were more likely to have a favorable attitude toward compressive wear and believed that by wearing the compressive garment their racing would improve.

“Overall, with these compressive sleeves and the level of compression that they exert, they don’t seem to really do much,” [researcher Abigail] Laymon said. “However, there may be a psychological component to compression’s effects. Maybe if you have this positive feeling about it and you like them then it may work for you. It is a very individual response.”

Two important points here. First, it looks like if you believe they work, then they will. (See my post last week confirming that superstitions boost performance!) Second, there’s significant individual variation. As I said at the top of this post, there’s no doubt that compression does something to the body. But these studies, and the others that continue to pile up, do nothing to convince me that we’ve figured out how to harness those effects in a useful way (at least for performance enhancement — recovery from soreness looks a little more solid at this point).

Trekking poles reduce muscle damage from downhill walking

June 19th, 2010

There has been a surprising amount of research into the effect of trekking poles (or hiking poles or Nordic walking poles or whatever you want to call them) in the last few years. It’s still unclear whether they raise or lower heart rate and effort — it probably depends on how you set up the experiment, and how vigorously you use your arms. But a new study that just appeared online in Medicine & Science in Sports & Exercise has an interesting finding about muscle damage that seems fairly clear-cut.

Researchers from a couple of British universities led a group of 37 people on a day-hike to the top of Mount Snowdon and back down again; half used poles and the other half didn’t. They measured all sorts of variables before, during and after, including heart rate, perceived exertion, maximal voluntary muscle contraction, soreness, etc. Both groups took the same amount of time and had (on average) the same heart rates, but the pole group reported lower perceived exertion on the way up — which agrees with some (but not all) previous studies.

The new finding in this study is that the pole-users had less leg soreness from the downhill portion of the hike (which involves damage-inducing eccentric muscle contractions) immediately after and in the days following. They also had less reduction in their maximal voluntary muscle contractions.

Now, this isn’t particularly earth-shattering. Taking the load off your legs during downhills is precisely the rationale that convinced me to try using (borrowed) poles during a hike in the Rockies a few years ago. If anything, it’s surprising that no one has tested the link between muscle soreness and poles before. So now, with a tough eight-day hike through hilly terrain coming up in exactly a week, I have to decide whether it’s worth investing in poles…

(My current strategy to avoid soreness is based on the principle that inducing DOMS once reduces the severity of the next bout. Lauren and I did a hill workout this morning — 10x30s — and for an added twist we sprinted back down the hill after three of the intervals. We’ll see whether that manages to induce any protective soreness tomorrow!)

Crossing your fingers boosts performance (touch wood)

June 16th, 2010

Amby Burfoot points out a pretty neat study by German researchers in Psychological Science showing that superstition really does boost performance. The researchers point out that, despite their irrationality, superstitions are surprisingly prevalent across cultures, with famous examples such as Michael Jordan wearing his old UNC shorts under his NBA uniform for his entire career. And they’re particularly common in two groups “whose members regularly engage in performance tasks–namely, athletes and students.”

But do they work? The researchers did four studies that suggest they do. The first was a simple test: take ten putts on a golfing green and sink as many as you can.

[W]hile handing the ball over to the participants, the experimenter said, “Here is your ball. So far it has turned out to be a lucky ball” (superstition-activated condition) or “This is the ball everyone has used so far” (control condition).

Sure enough, the lucky-ball group sank an average of 6.42 putts out of 10, while the neutral-ball group sank just 4.75.

The other three experiments involved motor dexterity, memory and anagrams, and the participants were primed with superstitions like keeping their fingers crossed or having a lucky charm present — all without realizing that the true purpose of the experiment was to test superstitions (anyone who figured it out was excluded from the analysis).

The upshot of the experiments is that superstition’s power appears to be mediated through “self-efficacy” — basically, a positive superstition makes you believe you’ll perform better, and that confidence enables you to do so. The researchers point out that this is different from, say, bouncing the ball three times and exhaling loudly every time you take a foul shot. Those sorts of rituals serve to focus attention and trigger well-learned motor sequences, rather than boosting self-efficacy.

And, with respect to truly outstanding performances, [the authors conclude,] the present findings suggest that it may have been the well-balanced combination of existing talent, hard training, and good-luck underwear that made Michael Jordan perform as well as he did.

Soccer science

June 14th, 2010

When I went out for my run this morning, Sydney seemed like a ghost town: empty sidewalks, tumbleweed blowing down the streets, etc. The only signs of life were in the pubs, which had been open since 4 a.m. for the Socceroos’ Monday-morning World Cup debut. Inside, people were huddled quietly over their empty schooners, absorbing their 4-0 loss to Germany.

In that spirit, a couple of good recent articles on the science of soccer:

- Ross Tucker of The Science of Sport has started a series on the physiology of soccer. The first installment offers a good profile of what it takes to play a full game: running 10 to 15 km, including between 80 and 110 sprints, and so on. I had a chance to chat with Ross for a couple of hours last week for an upcoming article — a very interesting guy with lots of insight, as you can gather from the blog.

- A very thorough round-up of recent research on the psychology of the penalty kick, by Andrew Keh of the New York Times. As I write this, Ghana has just taken the first penalty kick of the tournament, scoring to defeat Serbia — but we’ll be seeing a lot more of these when we reach the elimination rounds. One of the most interesting observations:

Kick takers in a shootout score at a rate of 92 percent when the score is tied and a goal ensures their side an immediate win. But when they need to score to tie the shootout, with a miss meaning defeat, the success rate drops to 60 percent.

“This to me is the key finding of all our studies,” said Geir Jordet, a professor at the Norwegian School of Sport Sciences in Oslo who has analyzed shootouts with fervor. Jordet also found that shooting percentages tend to drop with each successive kick — 86.6 percent for the first shooter, 81.7 for the second, 79.3 for the third and so on.

“It demonstrates so clearly the power of psychology,” he said.


How to taper for a race, and why it works

June 11th, 2010

This week’s Jockology column looks at research into tapering: how to reduce your training before an important competition so that you’re well-rested but don’t lose any fitness. It tackles how long you should taper for (two weeks seems to work well); how you should adjust training volume (reduce by 40 to 60 percent), intensity (don’t change) and frequency (don’t change); and the difference between step, linear and exponential tapers.
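The three taper shapes can be sketched as day-by-day volume schedules. A minimal illustration, assuming a two-week taper and a 50% total reduction — the two-week duration and 40-to-60-percent range come from the column, but the curves and function names are my own:

```python
import math

def step_taper(day, days=14, reduction=0.5):
    """Cut volume once at the start of the taper, then hold it flat."""
    return 1.0 - reduction

def linear_taper(day, days=14, reduction=0.5):
    """Shave off the same amount of volume each day."""
    return 1.0 - reduction * (day / days)

def exponential_taper(day, days=14, reduction=0.5, tau=5.0):
    """Drop volume quickly at first, then level off near the floor."""
    floor = 1.0 - reduction
    return floor + (1.0 - floor) * math.exp(-day / tau)

# Fraction of pre-taper weekly volume at the start, midpoint and end.
for day in (0, 7, 14):
    print(day, round(linear_taper(day), 2), round(exponential_taper(day), 2))
```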

The most interesting finding for me came from a new study by Scott Trappe and his colleagues at Ball State’s Human Performance Laboratory, suggesting that tapering isn’t just about rest — it actually helps your muscles grow:

He and his colleagues took a series of muscle biopsies from university cross-country runners preparing for a championship race. Surprisingly, they found that the individual muscle fibres responsible for explosive power in the legs actually got bigger and contracted more powerfully after the training reduction.

“On a molecular level, the wheels are so greased that the engines proceed at a high rate even after you reduce your training,” explains Dr. Trappe. This creates a window of opportunity during which the delicate balance between muscle synthesis and breakdown shifts to favour muscle growth.

In contrast, the researchers found no change in measures of cardiovascular endurance such as VO2max. This suggests that it’s the muscle adaptation that provides the performance boost of tapering – and just as importantly, that a brief period of less training doesn’t compromise endurance. The result: The runners raced 6 per cent faster over 8 kilometres than they had just three weeks earlier. [read the rest of the column]


Weak hips cause runner’s knee

June 8th, 2010

I’m looking forward to going through the research presented last weekend at this year’s ACSM meeting. For starters, a study presented by researchers from Indiana University found that hip strengthening exercises reduce or eliminate “patellofemoral pain” (“runner’s knee”) in female runners. This is an idea that has been gaining momentum over the past few years — I first heard about it back in 2007 from Reed Ferber of the University of Calgary’s Running Injury Clinic (and wrote about it here).

The Indiana study is pretty small — just nine runners, with the five who did the hip strengthening exercises lowering their pain score from 7 to 2 or lower (on a scale of 0 to 10) after six weeks of twice-a-week strengthening. The researchers are hoping to try the same program on a larger group of runners. Normally I wouldn’t get too excited about such a small study, but given that the idea is also being developed elsewhere (such as this study about hip strength and knee arthritis that I blogged about last year), it’s starting to look pretty interesting. I suffered through an extremely persistent case of runner’s knee a decade ago that kept me out of competition for almost two years, so I certainly wish I’d known about the possibility that hip exercises might help.

If you want to give them a try, here are Reed Ferber’s suggested hip exercises [pdf, 2 MB].


Barefoot running and the difference between biomechanics and injury rate studies

June 7th, 2010

I just noticed that a short article I wrote for Canadian Running‘s May/June issue is now available online. It’s my attempt to provide some context for the studies on barefoot running that made lots of (somewhat wild) headlines at the beginning of the year. It doesn’t offer any definitive conclusions, mainly because I don’t think such conclusions yet exist. My main point is the distinction between biomechanical studies and injury-rate studies. Everyone has been beating up on the shoe industry for years because it relies on the former rather than the latter — but that distinction is suddenly being “forgotten” now that biomechanical studies supporting barefoot running are appearing.

A short excerpt:

[...] There’s no doubt that thinking on footwear has evolved in the last decade or two. For instance, plush cushioning is no longer considered the ultimate defence against injury. “I wish running companies would stop rattling on about ‘gel’ and ‘air’ and so on,” says Simon Bartold, an Australian shoe researcher who consults for Asics. Newer shoes reflect this thinking, he says: Nike has introduced the Free, for example, and Asics has completely abandoned the concept of “motion control.” But rushing to the opposite extreme and claiming that runners of all shapes and sizes should give up shoes makes no sense either – and the new studies certainly don’t support this position. [...]


“Heart rate recovery” and acute vs. chronic training fatigue

June 6th, 2010

I had a chance to see an interesting study in progress a few days ago, during my visit to Cape Town, which prompted me to look up a paper that appeared earlier this year in the European Journal of Applied Physiology. It’s a case study of an elite Dutch cyclist being monitored with something called the Lamberts and Lambert Submaximal Cycle Test (LSCT), which was first described last year in a British Journal of Sports Medicine paper.

The gist is as follows: to warm up before a hard workout, you do a specific 15-minute protocol (6min at 60% of max heart rate, 6min at 80%, and 3min at 90%). You measure your power output and perceived exertion during these three stages, and then you measure how much your heart rate decreases during the 90 seconds after the test. Doing the test frequently (it’s not too strenuous, so you can do it as a warm-up before pretty much every workout) gives you objective data that tells you whether you’re fresh or tired, and whether your training is making you faster or slower.
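A rough sketch of the protocol’s bookkeeping — the stage durations and heart-rate targets are from the post, but the data structure and function are hypothetical:

```python
# LSCT stages as (minutes, percent of max heart rate).
LSCT_STAGES = [
    (6, 60),
    (6, 80),
    (3, 90),
]

def heart_rate_recovery(hr_end_of_test, hr_after_90s):
    """Beats the heart rate falls in the 90 s after the final stage."""
    return hr_end_of_test - hr_after_90s

total_minutes = sum(minutes for minutes, _ in LSCT_STAGES)
print(total_minutes)  # the 15-minute protocol

# Example with made-up heart rates:
print(heart_rate_recovery(172, 120))  # 52 beats recovered in 90 s
```

Logging power, perceived exertion and this recovery number for each session is what gives the day-to-day freshness signal described below.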

Just as a sample, here’s a snippet of data, showing the power (at a fixed heart rate) for the first stage of the test, compared to the weekly training load. Pretty clear correlation:

You can see a gradual increase in power as the training cycle progresses, indicating that the cyclist is getting fitter. But you can also see big spikes in power during the heavy training weeks — that’s not because he was “fitter,” but because the acute training-induced fatigue meant he had to work harder (and thus produce more power) in order to get his heart rate up to 60% max. The mechanism has to do with decreased sympathetic nerve activity and increased parasympathetic nerve activity — and what’s most interesting to me is that the exact opposite happens in the case of chronic training-induced fatigue.

The same pattern can be seen in the heart rate recovery data:

During the heavy training weeks, the athlete’s heart rate dropped more quickly than during the other weeks. So he was tired from the dramatic increase in training load — but the test suggests that he was what the researchers call “functionally overreached” as opposed to “non-functionally overreached.” Had he persisted with the extreme training load for too long, his heart-rate recovery would have started to dip down instead of up, indicating overtraining. In other words, the researchers conclude:

This suggests that training-induced acute and chronic fatigue are reflected differently in the LSCT, which has important practical applications for monitoring.

Obviously this test is best suited to cycling, since you can precisely measure your power output. But I wonder whether a simplified version of the test, where you just exercise (run, row, whatever) at a set submaximal heart rate and then measure your heart rate recovery, would provide any meaningful information.

Oh yeah, the study I saw in progress: two groups of cyclists, each doing two (I think) hard workouts a week. One group does them on set days, come hell or high water; the other group does the LSCT three times a week, and determines whether or not to work out that day depending on the results. The hypothesis is that working out when your body is ready to go, and resting when it’s not, will lead to greater gains in fitness and performance. It’ll be interesting to see the results.
