
Briefing

Baseline survey on
community feedback
mechanisms used by donors

June 2014
Development Initiatives
exists to end
absolute poverty
by 2030

www.devinit.org
Background
Development Initiatives' (DI) PPA project 2011-2016 requires DI to establish a baseline to
better understand the community feedback mechanisms used by donors in Uganda and Kenya.
The objectives of this workstream are: to determine the number of donors that collect and use
community feedback in the design and review of programmes when allocating resources in the
East African region; to improve the information available to specific communities and the
systems through which they can provide feedback; and to review whether this has led to better
donor use of information in their programme design and delivery.
DI believes that information is power. Access to information is central to achieving development
outcomes, because it promotes choice and empowerment for communities. In turn, communities
are able to engage with the decisions made on their behalf. Access to information, preferably in
transparent, open formats, is a precondition to community feedback, but it is not sufficient.
Beyond making information available, aid agencies and other development partners (DPs) need
to actively listen and engage with the feedback received from communities if access to
information is really to result in lasting change and improvements to the lives of people living in
poverty.
This paper sets out the initial findings from the desk research and interviews with donors in
Uganda. It will be followed in due course by findings that draw on donor responses from
Kenya.

Methodology for baseline survey


DI has used two methods to develop the baseline of which beneficiary feedback mechanisms
are used by specific donor agencies working in East Africa in their aid programmes:

- Desk review research
- Structured meetings with donors to discuss their approach to beneficiary feedback mechanisms

The desk review involved searching for and analysing donor material available online, in order
to identify the different mechanisms donors use to capture beneficiary feedback from the
projects they implement.
In the desk review, the 15 highest-priority donor agencies operating in Uganda and elsewhere
in East Africa were reviewed. We looked at each donor's website and navigated through it to
identify the policy or methodology it uses to capture feedback. The key method was a word
search on the publicly available material online. Search terms used included 'beneficiary
feedback', 'community feedback', 'feedback', 'beneficiary engagement', 'community
engagement', 'listening to beneficiaries' and 'consultation' (with the
community/beneficiaries).
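For readers who want to replicate the word-search step, it can be sketched as a short script. The search terms, thresholds and category labels below are illustrative assumptions for the sketch, not DI's actual review criteria.

```python
import re

# Illustrative subset of the desk-review search terms
SEARCH_TERMS = [
    "beneficiary feedback",
    "community feedback",
    "beneficiary engagement",
    "community engagement",
    "listening to beneficiaries",
]

def count_term_hits(text, terms=SEARCH_TERMS):
    """Return case-insensitive occurrence counts of each search term in a page of text."""
    lower = text.lower()
    return {term: len(re.findall(re.escape(term), lower)) for term in terms}

def categorise_agency(pages):
    """Sort an agency into one of the briefing's three categories, based on how
    many of its saved web pages mention any search term.
    The thresholds here are illustrative, not DI's actual criteria."""
    pages_with_hits = sum(1 for page in pages if any(count_term_hits(page).values()))
    if pages_with_hits >= 2:
        return "clear feedback strategy"
    if pages_with_hits == 1:
        return "some feedback captured, no clear strategy"
    return "no feedback captured"
```

In practice each agency's pages would be saved locally and read in, and policy documents would carry more weight than news items; the sketch only shows the counting step.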

Three meetings were held in the week of 24 March 2014 in Kampala to discuss with donors
their use of community feedback. The meetings were held with the European Union (EU), DFID
and the World Bank.
A representative from each donor was interviewed with the aim of (a) understanding current
practices on feedback, (b) gathering thoughts on the current practices of donors and (c) gauging
interest in participating in future discussions on this theme. Questions were adapted to
respondents' answers. Specific questions asked included:
Surveying current practices:
- What does feedback mean to your organisation?
- Why is feedback important to your organisation?
- In what ways do you collect feedback from your beneficiaries? (including at which stages of
the project cycle, methodologies for collecting feedback, and the impact of this feedback on
programming)
- In what types of activities do you seek beneficiary feedback (project planning, project
delivery, advocacy activities, etc)?
- What is your mechanism for responding to beneficiary feedback?

Reflection on current practices:
- What are some of the challenges to collecting and implementing feedback, and integrating it
into programme design?
- What are the drivers of success or failure of beneficiary feedback mechanisms? (context, etc)
- What have you learned from beneficiaries that you wouldn't have learned otherwise?
- What are the costs of beneficiary feedback processes?

Assessing interest in future work:
- What is some of your upcoming work on beneficiary feedback?
- What would be the incentives for integrating beneficiary feedback into your work in a more
systematic way?
- What information and evidence would you need to give greater priority to feedback?
- Would you be interested in learning more about feedback, and in participating in a discussion
on this with other donors?

Results from the desk research


The 15 agencies fall into three categories:
1. Those that have a clear strategy for capturing feedback.
2. Those that don't have a clear strategy but have projects where feedback has been
collected.
3. Those that have no feedback strategy and no feedback captured for any of the projects
available.
Seven agencies have a clear and well-laid-out strategy or policy through which feedback is
collected; some cover it under the accountability or evaluation function, while others include it
in other sectors or departments. These agencies are: DFID, USAID, the World Bank, the African
Development Fund, the Norwegian Refugee Council, DANIDA and SIDA. Some of these agencies
work hard to encourage beneficiaries to engage online through functions such as 'Have your
say', 'Better together' or 'Provide your comments'. Others have engaged more specifically with
communities through community discussions, gathering community views about a particular
project that has been implemented so that the project or future projects can be improved
following feedback.

Five agencies fall into the second group: they capture feedback for some projects but do not
have a clear, well-thought-out strategy for capturing and using feedback in project evaluation
and design. These five agencies are: the European Union, JICA, War Child Holland,
Netherlands Development Cooperation and Belgium Development Cooperation.
The last group of donors have no feedback policy or strategy, and their websites show no
obvious projects where beneficiaries have had the opportunity to input into the projects by
either providing feedback or participating in consultations about the implementation of the
project. These three agencies were GIZ, Irish Aid and AFD.

Results from Interviews with agencies


The meetings revealed quite different levels of experience and practice on community feedback
across the three donors.
While DFID is becoming a thought leader among donors on feedback at the headquarters level,
this is not necessarily reflected at the country office level, where practices on collecting and
integrating feedback are ad hoc and the topic is often not considered a strategic priority. There
is currently no systematic, integrated method of collecting or using feedback in DFID Uganda.
DFID staff interviewed had undertaken some very expensive feedback collection training
(extensive workshops, etc) that had not necessarily demonstrated huge impact, so the
method was later changed to other tools such as U-Report (SMS based). This technique
may have a lower response rate than the more expensive methods, but appears to work much
better, and the light-touch approach is easier to manage.
One of the programme areas operational in northern Uganda collects feedback through
community leadership structures. As these are often long-term settlements, leadership is
relatively stable and is split into committees on particular themes (health, education, etc). This
system works very well and allows prioritisation of needs on a rolling basis. One reason the
feedback loop works well in this context may be that the camp leader is easily accessible, as is
the beneficiary population. DFID Uganda would be interested in engaging in a Kampala
roundtable on feedback. Uganda seems to be a good country for this kind of discussion, given,
for example, its very high mobile phone usage. They were very keen for someone from London
to join the meeting to ensure joined-up thinking across DFID.
The World Bank has a wealth of experience and know-how on collecting feedback, and a
high level of investment in it, but is less able to show how feedback influences programming.
Improving its ability to use feedback strategically is an important step following President Jim
Kim's commitment to collect feedback on 100% of projects.
Transparency, access to information and accountability are at the heart of the World Bank's
agenda in the design and delivery of its programmes. The Bank tries to incorporate feedback
in a number of its workstreams:
- 'Follow the money' work: identification of organisations that would be able to follow the
money, and training of media on budget analysis (2012)
- Country opinion survey: ongoing now; the last one was in 2008, but what impact did it have?
There are examples of the Bank not acting on feedback received
- Corruption hotline (Integrity Section of the Office of Parliament), where anyone can call the
World Bank directly
- Public list of bad contractors
- SMS monitoring systems, trialled and to be scaled up
- Procurement feedback channels, among many others

Feedback is prioritised and collected at different stages of the project cycle: planning missions,
monitoring missions and the like all include a component of talking to beneficiaries. However,
for the Bank the question is whether feedback is integrated and acted upon. How does the
Bank respond to feedback and go back to its beneficiaries to report on how the feedback helped
to make changes to programmes? These questions still need to be reviewed and answered.
The feedback loop is also difficult to close: in the case of the country opinion surveys,
beneficiaries are surveyed by a firm rather than by the World Bank, and do not communicate
directly with the Bank. In addition, there is so much feedback that the Bank has had to put
processes in place to manage the volume it collects. Currently, however, beneficiary feedback
mechanisms (BFMs) work as a system of checks for the World Bank, where the mere existence
of these channels acts as a deterrent to corruption.
While the EU has practically no experience of, nor identifiable high-level commitment to,
feedback, there is strong interest in learning in this area. The EU in Uganda has about 20
ongoing grants with non-state actors, but none of them includes feedback mechanisms. The
interviewee felt that more could be done in this area; specifically, the EU could include a
requirement for community feedback as a new element in its call-for-proposals guidelines. It
was felt this might also help the EU overcome its lack of visibility, with beneficiaries often being
unaware of who is funding the programme. A main question for the EU in this area was how to
communicate directly with beneficiaries and which methods are the most effective in opening
up this communication and engagement. Overall, the EU staff interviewed were very keen to
participate in future workshops or roundtables on this issue to help them develop their practice
in this area.
Areas of particular interest and questions to explore further with the donor agencies include:
- The link between country office and head office: how does feedback from the country office
reach the head office level?
- Are country offices and headquarters on the same page on feedback?
- What should a country office do with feedback once it is collected? How can it be more
strategically integrated into evaluations and programme design? While feedback
collection is seen as relatively straightforward, the question of how to deal with it in a
cost-effective way is still open to debate.
Overall, all three agencies interviewed are interested in knowing what other donors are doing in
this area and in improving their current practices.

Next steps
Similar desk research and interviews will be conducted with donors in Kenya, and the results
of the interviews will be compared to draw out similarities and differences in donor behaviour
across country offices in the region. Other countries may be added to provide more depth of
information for the baseline survey.
Next, specific programmes will seek to increase access to information for beneficiaries and to
improve the channels through which they can provide feedback; this will mainly be in Uganda
and will focus on channels for communicating directly with government rather than with donors.
Meanwhile, DI's policy engagement work will proactively work with key donors to improve their
uptake of BFMs and integrate them more fully into their programming. Donors will then be
surveyed again to see if there has been an improvement in the way they incorporate feedback
into their programme development and delivery.
