Image: Flickr, Kit.
A recent report from the Independent Commission for Aid Impact (ICAI) criticised the UK’s Department for International Development (DFID) for having a great deal of knowledge but being bad at using it.
The report is very specific about what DFID doesn’t do well: a poor approach to sharing knowledge, both internally and externally; a reluctance to incorporate the insights and experiences of ‘local’ staff into learning processes; and poor institutional memory, the result of high staff turnover.
Our experience of many donors’ work on Political Economy Analysis (PEA) – as researchers, consultants, analysts, trainers and observers – suggests the same lessons apply to this area too. And actually, when it comes to ‘doing’ PEA, DFID fares pretty well compared to many other donor agencies.
Our own critique of donors is set out in a recent DLP paper, Donors Doing Political Economy Analysis™: From Process to Product (and Back Again?). We argue that PEA has changed since it first emerged. Once seen as a ‘transformative’ process meant to change how donor officials think, it has in many cases become a set of bespoke products designed by external consultants for ad hoc use by donor staff.
We question whether the current PEA model will actually lead to internal change in donor agencies – to donor staff being better able to think and work politically.
After all, ‘bespoke’ PEA draws very heavily on external consultants for everything from analytical design to implementation to training to evaluation. Certainly ICAI’s pessimistic assessment of DFID’s ability, so far, to foster organisational learning suggests that a different approach is needed to incorporate ‘political thinking’ into development policy and practice.
But lack of institutional learning isn’t the biggest problem with current PEA practice.
In theory, PEA is undertaken as part of the development enterprise. In practice, it may resemble intelligence gathering to some developing country governments. There are instances where the exclusion of partner government officials from the design and execution of PEA has damaged relations between governments and donors.
Even more harmful consequences could flow from the reluctance of PEA practitioners to bring developing country officials – including the ‘local staff’ employed in donor embassies and agencies abroad – into a process that informs wider donor policy in that country. Positive, developmental change requires the buy-in and commitment of power-holders in developing states, something donors themselves acknowledged in promoting ‘ownership’ at Paris, Accra and Busan.
There is a lot crammed into our paper; we’re the first to admit that. But there’s so much more we could have put in.
Like the consultant who told us that they were about to ‘go undercover’ into a madrassa (an Islamic school or college) in a South Asian country. This didn’t mean some sort of 007 operation with interesting gadgets (Heather’s first reaction) but rather misrepresenting to interviewees both the purpose of the interviews and the study’s funder (a bilateral donor). A giant no-no under research ethics rules.
Or another who told us about billing three different donors for the same analysis: although they all use pretty much the same PEA framework, each donor needed the report ‘branded’ with its own logo.
And another who told us that they had done a conflict analysis in a country only recently post-conflict, where the analysis could have been summed up in one sentence (and we quote): ‘You must be out of your *** minds if you think you’ll be able to deliver aid in this environment.’ But pressure from their employer to deliver a report ‘more palatable’ to the funder meant that instead much was written about ‘entry points’ when there weren’t a whole lot of entry points on the ground.
These are extreme cases. Some PEA is likely to be very high quality, adhering to high ethical standards. But most is likely to be somewhere in the middle. Quick and dirty, fairly useful. Sometimes read by the people who commissioned it. Sometimes done with the full knowledge of the partner country.
It’s encouraging, then, to see some evidence that the PEA status quo may be changing. See, for example, ECDPM’s analysis of the European Commission’s decision to suspend PEA and to look within instead. Or Neil McCulloch’s impressive reflection on the impact of donor incentives on their ability to think and work politically. An important new volume edited by Verena Fritz and Brian Levy at the World Bank shows us what donor-led political economy analysis can look like and what results it can produce.
So can new forms of PEA help donors to work politically – to get real about politics, as Alina Rocha Menocal has put it?
In Nigeria and the Philippines, politically savvy programming is bringing national and international actors together. The DFID-funded State Accountability and Voice Initiative (SAVI) in Nigeria uses a participatory approach to PEA. And in the DFAT-funded Coalitions for Change programme in the Philippines, DLP is working with The Asia Foundation on action research that builds ongoing informal political analysis into the programming process.
These programmes are politically smart, domestically owned and genuinely innovative. They involve consultants, of course, because there is almost always a need for some additional technical expertise, but PEA hasn’t been ‘farmed out’ as a one-off bolt-on. Instead, it has been integrated into the programmes’ design and implementation.
These new initiatives are ongoing, so we’re not yet able to say whether they’ll be more effective than traditional programming. But they’re certainly less likely to involve hiding PEA from ‘partner’ governments behind fake occupations on visa applications – or to require one of Q’s gizmo-packed fountain pens.