Development intervention evaluations are lousy for a bunch of reasons.
Organizations tend to overestimate their ‘impact’, because they forget that their results are valid only for the population they track over time. They forget that ‘impact’ is subject to particular geographies, economic conditions, culture, aspirations and the opportunity costs beneficiaries place on participating in the intervention.
But there isn’t anything unduly surprising about this situation. After all, development runs on funds, and funds go to those who have the best impact. Sorry – let me rephrase: to those who demonstrate the best impact.
If the validity of research and impact doesn’t concern you – you aren’t alone. So what if people massage the data a bit or design studies to show a particular effect? In the end, people are ‘empowered’ with all sorts of things, right?
Children are empowered to go to school, parents are empowered to have ‘safe sex’ and communities are empowered to drink fresh water… NB: Just don’t ask by how much!
I’m now a development sector person – so here is what I sit and do all day at a premier development sector organization: answer questions.
If I said 7.2% of all children in India in the age group of X to Y do this – it is the same thing as saying 7.2% of all TV-watching children in the same age group do this, right?
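The gap between those two claims is easy to make concrete. Here is a minimal sketch with entirely invented numbers (nothing below comes from any actual survey) showing how the same count of children yields very different rates depending on which denominator you quietly swap in:

```python
# Illustrative numbers only (all made up): why the base population matters.
# Suppose a survey only ever reached TV-watching children, but the finding
# gets reported as if it described all children.

all_children = 100_000          # hypothetical total child population
tv_watching_children = 40_000   # hypothetical subgroup the survey actually covered
did_the_thing = 2_880           # children observed doing the behaviour (all from the subgroup)

rate_in_subgroup = did_the_thing / tv_watching_children   # the honest denominator
rate_if_misreported = did_the_thing / all_children        # the inflated-sounding claim's denominator

print(f"Among TV-watching children: {rate_in_subgroup:.1%}")    # 7.2%
print(f"Among all children:         {rate_if_misreported:.2%}") # 2.88%
```

Same numerator, different denominators: the subgroup rate is two and a half times the population rate, yet a headline can slide from one to the other without anyone noticing.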
My job description says I should be available at all times to answer ad hoc data requests to support other staff. Data pornographer.
But wait, there is more. The ever-present request for snazzy, pretty graphs. I have nothing against good-looking graphs. In fact, data visualization is a lovely discipline.
Nevertheless – there is something particularly vile about a request for a pretty/snazzy graph in the absence of good data. A poorly coloured graph that has some halfway-decent data to show can still make a tonne of sense. But when a picture of a graph is sent to you and you are asked to recreate a pretty version in Excel – you know it is a lost battle.