Exercising credibility: why a theory of change matters


Recently, the Chris Evans breakfast show on the UK's Radio 2 picked up a news story about a Danish study reporting that half an hour's exercise a day is better for you than a full hour. Like me, the radio presenters were puzzled by this finding and wanted to know more.

Middle-aged men of reasonable fitness were randomly assigned to two groups, one exercising for half an hour a day and the other for a full hour. After three months, the group exercising less had lost more weight: the half-hour group lost eight pounds compared with six pounds for the one-hour group, about one kilo more weight loss.

The study was widely covered in the press, partly because it has a message most people would like to hear, but also for the novelty value of counterintuitive results. Indeed, the researcher himself is at a loss to explain why this should be so.

Perhaps, he says, the half-hour group weren't tired after half an hour, so they exercised some more and didn't actually exercise less. Or perhaps they exercised much harder in that short time. But some simple calculations of plausible differences in exercise intensity show that it is not possible to burn more calories in half an hour than would be burned in a full hour. Maybe, he added, those exercising more felt the need to eat more and overcompensated for the calories lost. This last explanation is plausible: a sports drink and a chocolate bar wipe out the calorie loss from about half an hour of exercise. And a listener to the programme pointed out that those exercising more may have built more muscle, which weighs more than fat.
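To see why the intensity explanation fails while the compensation one holds up, here is a minimal back-of-envelope sketch. The calorie figures are illustrative assumptions of my own, not numbers from the study.

```python
# Back-of-envelope check with illustrative (assumed) figures, not study data.

MODERATE_KCAL_PER_MIN = 10   # assumed burn rate for a steady one-hour session
VIGOROUS_KCAL_PER_MIN = 14   # assumed burn rate for a harder half-hour session

one_hour_burn = 60 * MODERATE_KCAL_PER_MIN    # ~600 kcal
half_hour_burn = 30 * VIGOROUS_KCAL_PER_MIN   # ~420 kcal

# The half-hour group would need to burn at more than double the moderate rate
# to out-burn the one-hour group, which is implausible for sustained exercise.
print(half_hour_burn > one_hour_burn)   # False

# Compensation is easier to believe: a sports drink plus a chocolate bar
# (assumed ~150 + ~250 kcal) more than cancels half an hour of moderate exercise.
snack_kcal = 150 + 250
print(snack_kcal, 30 * MODERATE_KCAL_PER_MIN)   # 400 vs 300 kcal
```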

To an outsider, the real puzzle here is that the study wasn't set up to be able to explain its findings.

The strength of randomized controlled trials (RCTs), like this study, is their ability to establish a causal relationship between the intervention and the outcome. But we need factual analysis of what happened to complement the counterfactual analysis of causality. In this study, participants should have been asked to keep food and exercise diaries. (Doing so raises another often-ignored problem in research, that 'the measurement is a treatment': keeping diaries often reduces food intake and increases exercise volume.)

Similarly, many randomized controlled trials in international development fail to pay adequate attention to collecting data around the intervention's theory of change, and so the authors resort to speculation unsupported by evidence to explain their findings. But without such an explanation, the appropriate policy response to a study's findings is often unclear.

The Danish researcher plans to go on to study exercise and commuting. But as one of the radio presenters exclaimed, 'He can't do that, he hasn't finished this research yet.' The presenter is right. RCTs that merely link the intervention to outcomes are unfinished research projects. Incredible findings won't become credible without a plausible, evidence-based explanation.

The UK Department for International Development recently released a new paper on the use of theories of change in international development. 3ie's paper on theory-based impact evaluation also offers guidelines for mapping out the causal chain of an intervention. Developing a theory of change helps identify the evaluation questions and often increases a study's policy relevance.


Authors

Howard White, Director, GDN Evaluation and Evidence Synthesis Programme

About

Evidence Matters is 3ie’s blog. It primarily features contributions from staff and board members. Guest blogs are by invitation.

3ie publishes blogs in the form received from the authors. Any errors or omissions are the sole responsibility of the authors. Views expressed are their own and do not represent the opinions of 3ie, its board of commissioners or supporters.
