Michael Flint
March 2003
CONTENTS
Summary
1. Introduction
2. What is results-based management?
3. The history of results-based approaches
4. Strategic planning
5. Monitoring and reporting
6. Managing
7. Issues in results-based management
8. Conclusions
Annexes
A. Strategic planning: country level
B. Strategic planning: corporate
C. References
SUMMARY
1. The purpose of this report is to present a comparative study of the practice of
results-based management in a sample of five multilateral development
institutions: the United Nations Development Programme (UNDP); United
Nations Children's Fund (UNICEF); United Nations Development Fund for
Women (UNIFEM); Inter-American Development Bank (IDB); and the World
Bank. The report is based on a review of documents and a limited number of
interviews with head office staff in mid-2002. As such, it does not claim to be
definitive, nor necessarily fully up to date with developments since then.
2. The terms "results" and "results-based management" (RBM) are used in
different ways in different institutions. Section 2 of the report provides some
introductory definitions. Results are taken to include outputs, outcomes and
impacts, but with an emphasis on outcomes and impacts. RBM is similar to,
but not synonymous with, performance management.
3. All five institutions are, to a greater or lesser extent, engaging with results-based management. All have made a commitment to increase their results-focus. All have taken steps to, or are working on, improving the planning and reporting of results. As befits their different histories, mandates and cultures, there is enormous variety in their approaches to, and progress on, the four main components of RBM: strategic planning, monitoring, reporting and managing (using).
iii. external accountability is driving much of the recent push for RBM. This needs to be accompanied by a greater emphasis on using results information for internal management.
7. Finally, this study has implications for those supporting and monitoring the
progress of results-based management within multilateral development
institutions. Assessing the quality and extent of management change is not a
straightforward task. Increasing support for the introduction of RBM will need
to be accompanied by a more sophisticated approach to its monitoring.
1. Introduction
1.1 Recent interest in results-based management in multilateral development
institutions is the product of two related developments. The first was the
definition of, and agreement on, global development goals. This process
started in the mid-1990s, and culminated in the endorsement of the
Millennium Development Goals in September 2000 by all 189 United Nations member states. The significance of this event is that, for the first time ever, all
development agencies have a common set of results to which they are
working, and against which their collective performance can be judged. This
focus on results was confirmed at the United Nations Conference on
Financing for Development in Monterrey in March 2002, and is matched by a
broad consensus on development partnership and aid effectiveness. One key
feature of this consensus is the emergence of the country as the primary unit
of account.
1.2 The second development has been the drive to improve public sector
performance in OECD member states. One response in many countries has
been the adoption of results-based management (RBM) by public sector
agencies, including those responsible for development co-operation. OECD
countries are the major donors to the multilateral development institutions
(MDIs). It was therefore only a matter of time before the MDIs themselves
were influenced to embark upon a similar process of reform. This began to
happen in the late 1990s. References to results and results-based
approaches have become increasingly common among MDIs as a
consequence. However, these references often mean different things in
different institutions.
Study objectives
1.3 The purpose of this report is to present a comparative study of the practice of
results-based management in a sample of UN development agencies and
multilateral development banks. This was originally intended as background
to a DFID-sponsored workshop on RBM. Outline conclusions on the value of
RBM as currently practised, and the reforms needed to realise its full
potential, were expected. In the event, DFID decided not to hold a workshop,
in part because of the similar World Bank sponsored workshop in June 2002.
1.4 Eight institutions were originally selected for study. With the agreement of
DFID this was reduced to five.
1.5 The consultant was asked to document and comment on the following aspects
of RBM for each institution: the length of experience; changes made over
time; organisation, effectiveness and timeliness; quality of information;
commitment of operational staff; use made by management; and the quality
of reports. This proved to be a hugely ambitious undertaking. RBM is a
management approach, not a simple technical instrument. There is a huge
difference between how it is meant to work on paper, how it is said to work,
and how it actually works. Understanding RBM basically means
understanding how these institutions are managed, both in head office and in
the countries where they operate. This was clearly impossible in the time
available (25 days in total). Each of these institutions alone would require this much time to do it justice. Useful meetings were held with all the institutions
involved, but these could not really do more than scratch the surface. The
result is a report that is inevitably more superficial than was originally
intended, and which concentrates more on generic issues than on
institutional specifics.
1.6 The report begins with a discussion of the key terms: results and results-based management (section 2). Section 3 contains a brief history of RBM in
each institution. Sections 4-6 cover the main elements of RBM: planning,
monitoring, and managing. The report ends with a discussion of the main
issues in implementing and monitoring RBM.
2. What is results-based management?
2.1 Results-based management can mean many different things. The area is
bedevilled by different definitions. What one institution calls an "outcome" is another's "output", "intermediate outcome", or "impact". Without agreement
about what exactly RBM is, it is very difficult to assess or monitor its
implementation. Some discussion of what these words mean is therefore
required at the outset.
Results
2.2 The recent OECD DAC glossary of key terms defines a result as "the output, outcome or impact of a development intervention" (Box 1). While this is the definition used in this report, it should be noted that this is a broader definition than that used by some of the leading exponents of results-based management. According to the Treasury Board of Canada, a result is "the end or purpose for which a programme or activity is performed ... and refers exclusively to outcomes".1 Outcome in this usage covers both effects and impacts - but not outputs - and may be immediate, intermediate or final.
Box 1: Results
OECD DAC definitions2:
Result: the output, outcome or impact of a development intervention.
Output: the products, capital goods and services which result from a development intervention.
2.4 The other key feature of a result is that it should represent attributable
change resulting from a cause-and-effect relationship. In other words, there
has to be a reasonable connection, or at least a credible linkage, between
the specific outcome and the activities and outputs of the agency. If no
attribution is possible, it is not a result.
[Figure: the results chain - INPUTS and ACTIVITIES lead to OUTPUTS (e.g. immunisation programmes), OUTCOMES (e.g. immunisation coverage) and IMPACTS (e.g. reduced infection).]
Results-based management
2. Glossary of key terms in evaluation and results based management. OECD DAC (2002)
3. Measuring Outputs and Outcomes in IDA Countries. International Development Association. February 2002.
should include measures of process and efficiency, not just results. RBM is
just one, albeit significant, approach to performance management.
2.9 The application of RBM varies from country to country, and from agency to
agency. However, there are four core elements to most RBM approaches 4 :
2.10 The experience and thinking of the five multilateral institutions with respect
to these four elements is considered below, having first briefly outlined the
history of results-based approaches in each.
3. The history of results-based approaches
3.1 This section documents the history of results-based approaches in the five
institutions reviewed: UNDP, the World Bank Group, Inter-American
Development Bank (IDB), UNICEF and UNIFEM. The practice of RBM
needs to be considered at three main levels 5 :
Project
Country
Corporate
3.2 In the context of development co-operation, RBM at the project level has the
longest history and is the best documented6. Work on introducing RBM at
country and corporate level is much more recent. It is at these levels where
the real challenge for RBM lies. This report will accordingly concentrate on
RBM at country and corporate level.
3.3 This does not mean that RBM at project level should be ignored, for two
reasons. First, despite the shift to a non-project development paradigm,
projects still dominate the aid landscape 7 . Second, RBM is most applicable,
and least problematic, at the project level. Despite this, the application of
RBM and logical frameworks to projects has not been particularly
successful. The limited success of RBM in the much simpler environment of
projects should, at the very least, give pause for thought. This issue is
discussed further below (section 7).
3.4 It is important to emphasise that the degree to which RBM has been applied,
or is claimed to be, is not necessarily correlated with effectiveness. The fact
that most of the institutions have not yet adopted and implemented RBM in a
formal sense does not mean that they are not implementing parts of the
approach at some levels. It certainly does not mean that they are not
producing development results.
3.5 UNDP has made the strongest commitment to RBM. It is the only institution
of the five to have begun to implement RBM as an organising principle at all
levels, and is the most advanced of all the UN agencies. Further advances
have been made since the information on which this section is based was
collected. 8
3.6 UNDP's advanced status has two origins. The first was the pressure of
declining core funds in the 1990s. UNDP knew that it had to change if it was
to recover the confidence of the donor community. In 1997 UNDP initiated a
set of change management processes, known as UNDP 2001. The UNDP
change process emphasised, among other things, the need for the
organisation to become more results-orientated 9 .
3.7 In parallel, UNDP's Evaluation Office (EO) had been working on developing
results-based monitoring and evaluation policies, methodologies and tools.
In 1997 EO commissioned a joint study with SIDA on results management 10 ,
and produced a handbook on results-orientated monitoring and evaluation
for programme managers 11 . In 1998 EO was given lead responsibility for
developing a framework for the measurement and assessment of
8. In a response to a draft version of this report, UNDP stated that this report "does not take account of the many, more recent advancements UNDP has made in internalising RBM".
9. Annual Report of the Administrator for 1997. UNDP (1998)
10. Measuring and Managing Results. Poate, D. (1997)
3.8 Since then, UNDP has been working to ensure that assessing and reporting on results is "not a minority preoccupation but a way of doing business for the organisation as a whole".13 Having been piloted in ten countries, RBM was
introduced worldwide in only one year, with the first Results-Orientated
Annual Report (ROAR) produced in 1999. Strategic choices were made to
learn from others; to learn by doing; to tailor RBM to UNDP; to keep the
system as simple as possible; not to over-invest in indicators; and to
manage for (not by) results. The result is an approach that is still being
adapted, but which has been mainstreamed throughout the organisation and
its instruments. The next generation of RBM software is currently being
introduced.
Box 3: UNDP's Results-Based Management System
Planning Instruments:
Reporting Instruments:
11. Results-orientated Monitoring and Evaluation: a Handbook for Programme Managers. UNDP (1997)
12. Results Based Management Overview and General Principles. UNDP.
13. The Multi-Year Funding Framework. UNDP (1998)
3.10 Since 1998 the CC report has used a results orientated format for reporting
against the SBP. By virtue of its close association with UNDP, UNIFEM was
influenced by the UNDP 2001 change process and by the introduction of
RBM in that organisation. UNIFEM uses the Results and Competency
Assessment developed by UNDP, and has an interface with the UNDP
ROAR. However, UNIFEM has also been exploring, and been influenced by,
the RBM approaches of other multilateral and bilateral agencies.
3.11 According to the recent Report of the Executive Director, each of UNIFEM's three programming objectives is "measured and driven by a results-based framework designed to create a learning and knowledge based institution".
3.12 UNICEF is proof that a results-orientation is not necessarily new. While the
use of the term RBM may be new, UNICEF has been practising large parts
of the approach for at least twenty years. One of the best examples was the
child survival campaign launched in 1982. By insisting on strategic action,
measurable results, and clear accountability, the then Executive Director of UNICEF (James P. Grant) spearheaded extraordinary improvements in child survival and
development over the following decade.
3.13 Over the last few years UNICEF has recognised the need to define more
clearly the results it seeks to achieve. In 1996, a new Mission Statement was
approved. This was followed by a Medium-Term Plan (MTP) for 1998-2001. Although the MTP contained a statement of priorities, these were numerous and wide-ranging, and were not mainstreamed within UNICEF. The MTP also
lacked clearly defined targets against which to measure achievement.
Significant progress was nevertheless made over the MTP period in
achieving a stronger results-focus in programming and reporting, and in
moving towards a more strategic approach.
3.14 In 2000, UNICEF produced a Multi-Year Funding Framework. This was seen
as an opportunity to strengthen results-based management within the
organisation. Analytical reporting on results linked to objectives and budget
was identified as a core element of the framework. The Executive Director's Annual Report in the same year was the first to use a results-based format.
3.16 The World Bank has been working to increase its results orientation for the
past ten years. In 1992, the World Bank was criticized by the Wapenhans
Report for giving more attention to the quantity of its lending than to its
quality: a product of the so-called "approval culture". The World Bank
responded with a concerted effort to improve its focus on quality and results.
In 1993 the World Bank issued a new plan entitled Getting Results: the World Bank's Agenda for Development Effectiveness and initiated the Next
Steps reform programme. In 1994 Learning from the Past, Embracing the
Future was published, with a results orientation as one of its six guiding
principles.
3.17 This was followed by the Renewal Process in 1996. A raft of general and
sector-specific performance monitoring indicators, and the logical
framework, were introduced. In the same year, the Quality Assurance Group (QAG) was established to improve, and allow management to keep track of, project design ("quality-at-entry") and supervision. This added a significant
quality element to the traditional measures of lending approvals (number and
amount), project performance (projects at risk), and ratings of closed
projects (outcome, sustainability, and institutional development impact).
15. Report on the Evaluation Function in the Context of the Medium-Term Strategic Plan. UNICEF (2002)
16. Medium-Term Strategic Plan for the Period 2002-2005. UNICEF (2001)
3.18 1997 saw the launch of the Strategic Compact. The Compact aimed to
make the World Bank more effective and efficient in achieving its main mission - reducing poverty - and included a commitment to building a performance assessment system and to making management more
performance based. This led to further improvements in performance
measurement and management, and to some increase in results-orientation.
For example, the 1998 Annual Report on Operations Evaluation (AROE) concluded that, while RBM had not been formally adopted - as had been recommended by the AROE in 1997 - operations "are moving in that direction".
3.19 A similar judgement was made in 2001. The Strategy Update Paper summarised the situation in the following way:
We also are much more explicitly focusing on results, particularly on how
we can better measure, monitor and manage to achieve them. We have
come a long way in developing measures of operational inputs and their
quality, and these have helped us to make a steady improvement in Bank
performance over the last several years. We now need to ratchet up our
results focus, doing more to measure and explain how our work makes a
difference in terms of country outcomes. 17
3.20 Recent IDA-13 and Monterrey discussions have given renewed impetus to
the search for better ways of monitoring country outputs and the contribution
to country outcomes. Most of the improvements in the 1990s were aimed at
improving the quality of the design, implementation and monitoring of
projects. The World Bank accepts that more needs be done to increase its
results orientation, particularly in areas other than projects. Improvements
are planned in the planning and monitoring of country programmes, as well
as for sector and thematic strategies. Further work on implementing the
17. Strategy Update Paper for FY03-05: Implementing the World Bank's Strategic Framework. Executive Summary p.i. March 2002.
3.21 The IDB has not experienced the same level of external pressure for reform
and results, and the associated permanent management revolution, which
has characterised the World Bank over the last decade. However, concern
about the results-focus of the IDB has followed a broadly similar history 19 .
3.22 As with the World Bank, recent efforts to increase the results-focus of the
IDB originated from a critical review of the Bank's portfolio. In 1993 the Task
Force on Portfolio Management (TAPOMA) found that the focus on the initial
approval of projects and the subsequent control of execution took the focus
away from managing for development results. It concluded that a concern for
results needed to be paramount.
3.23 The IDB Board and management endorsed this shift of focus and responded
in the mid-1990s with a series of improvements to the way projects were
designed and monitored. The overall aim was to promote a results-orientated dialogue among Bank staff, executing agencies, and national counterparts, and an increased results-focus in project design, monitoring
and reporting. Improvements included the requirement for logical
frameworks, impact indicators, and project completion reports based on data
on the outcomes or impacts. The new US Administration's emphasis on results throughout 2001, and internal changes in the Office of Evaluation and Oversight, gave fresh momentum to RBM within the IDB.
18. Better Measuring, Monitoring and Managing for Development Results. Development Committee Paper. World Bank. September 2002.
19. This section draws extensively on the Development Effectiveness Report. Office of Evaluation and Oversight. IDB. February 2002.
20. Development Effectiveness at the IDB. Paper for the Board of Executive Directors. January 2002.
4. Strategic planning
4.1 This section considers the extent to which strategic planning at the
country and corporate level has a results-focus. For institutions that are
implementing RBM, strategic planning should be about planning to
achieve outcomes: management for results. Plans should contain clear,
realistic and attributable results; defined indicators specifying exactly what
will be achieved by when; a results chain or logic model linking inputs,
activities, outputs and outcomes; and a strategy or strategies explaining
how and why inputs will lead to outcomes, including a discussion of risk.
Country-level planning
4.2 All the institutions are, to a greater or lesser extent, struggling with three
challenges. First, to align their programmes more explicitly to the country's own plans, such as the Poverty Reduction Strategy Paper
(PRSP). Second, to raise the sights of their programmes from the project
level to country level. And third, to define better country-level results
frameworks.
4.9 Recent UNDP Country Programme Outlines (CPOs) include a results and
resources framework. 27 This lists intended outcomes and outputs (with
indicators) within strategic areas of support. Examples from the Malaysia
CPO are contained in Box 5. Note that the UNDP outcomes are lower-level outcomes (less ambitious and more attributable) than those
specified by UNICEF, IDB or the World Bank.
4.10 The CPO improves on UNDP's earlier Country Cooperation Frameworks (CCFs). These had merely listed the areas of support and
made no mention of results. 28 More recent CCFs had listed key results
under each strategic area of support, but had not distinguished between
outcomes and outputs, nor included indicators 29 .
4.11 Examples from the Health Programme IMEP for Malawi (2002-06) are contained in Box 6.

27. India CPO (2002); Malaysia CPO (2002). UNDP consider that the Malaysia CPO is not a good example from the RBM perspective.
28. Mongolia CCF (1997)
29. Malawi CCF (2001)
Corporate planning
4.13 UNDP has embraced bottom-up planning, but has gone further than any of the other institutions in
determining a corporate results framework. The Strategic Results
Framework (SRF) for 2000-03 lists 7 goals, 24 sub-goals, 142 outcomes
(with indicators), and 84 strategic areas of support. Box 7 contains
examples from the SRF.
4.15 UNIFEM's results framework had a similar structure to that of the UNDP SRF, but without the sub-goals. 120 outcomes and indicators were listed. As with UNDP, no means of verification were given for the indicators. An example is given below.
4.17 The IDB and World Bank have not yet attempted to develop a corporate results framework.
30. How are we doing? Tracking UNIFEM progress in achieving results for management and learning. Briefing Note. UNIFEM (2002)
31. Renewing the Commitment to Development: Report of the Working Group on Institutional Strategy. IDB (1999)
5. Monitoring and reporting
5.1 This section should be as much about monitoring as about reporting. Not
everything that is monitored is reported, or needs to be. However, the limited
duration of this study meant that little information could be collected on
monitoring per se. Time constraints also meant that no country-level reports
were examined.
5.2 The distinctions between monitoring and reporting, and between internal and
external reporting, are important. Many institutions are under pressure to
report externally on results. While this is important for accountability, internal
reporting to management, and monitoring more generally, are arguably at
least as important. RBM is intended to improve both management
effectiveness and accountability.
5.3 It is also important to stress that accountability for results implies more than
just reporting results. Many results (e.g. outcomes) will not be attributable to a single institution. Because of this, reporting needs to demonstrate several things:
i. that the agency is managing for outcomes, not just activities and outputs;
ii.
iii.
iv. that the design and implementation of the results strategy is sound and effective;
v. that the results over which the MDI has a significant degree of control, and is aiming for, are being achieved.
5.4 None of the reports reviewed yet approach this standard. Most concentrate on the second and last tasks - reporting on outputs and outcomes - but without analysing either the strength of the link between the two or the effectiveness of the management strategy.
5.5 The UNDP Results Orientated Annual Report (ROAR) represents the most
ambitious and comprehensive corporate results report. The third ROAR
(2001) presents key findings for each of the six SRF goals, together with in-depth analysis of three selected sub-goals. Aggregated global figures for the
percentage of annual outputs fully or partially achieved, and the percentage
of outcomes where there was positive progress, are presented in the text,
together with the number or percentage of country offices active in each
area. Comparative figures are sometimes given for achievements in the
previous year. One of the general observations made is that "there is still a sizeable gap ... between impressive results at the output levels achieved within each goal and their contribution to realising outcomes".35
Results-Based Management and Accountability for Enhanced Aid Effectiveness. A Reference Paper. CIDA Policy Branch. July 2002.
35. Results-Orientated Annual Report, 2001. UNDP (2002), p.2.
5.7 While the ROAR is clearly a great advance on previous reporting, it lacks
transparency in two respects. First, there is no single table showing
coverage and achievements by goal and sub-goal for 1999, 2000, and 2001.
It would be possible to largely create such a table by extracting the figures
for 2000 and 2001 from the 67 pages of text, but the fact that the data is not
presented in an accessible format is strange. The ROAR badly needs a
straightforward summary. Second, although activities by goal and country
are tabulated in an annex, there is no presentation of the achievement by
outputs and outcome for each country office. This is a deliberate decision 36 ,
and may reflect a judgement that country-specific results would be
misleading given the wide variation in results, projects and countries.
5.8 For the last two years the Evaluation Office of UNDP has prepared a
Development Effectiveness Report (DER). This is largely based on
independent evaluation studies, and complements the ROAR by providing
summary findings on the impact and sustainability of UNDP interventions at
project and country level. The relative paucity of empirical data on the
development impact of UNDP's assistance was noted in both of the last
DERs.
5.9 UNICEF has used a results matrix to report on its Medium-Term Plan (MTP)
since 1999. The results cited are a mix of global outcomes to which UNICEF
made some contribution, or a description of what UNICEF has supported (i.e.
activities). There is no assessment of what UNICEF has directly achieved in
terms of outputs, either against what was planned for the year in question or
over the four years of the plan. A comparison of the results matrix for 1999
and 2001 does not allow any conclusion to be drawn as to whether UNICEF
is more or less effective than it was, or whether its contribution is growing or
shrinking. The better definition of intended results in the MTSP for 2002-05 is likely to lead to improved reporting.

36. Results Based Management Concepts and Methodology. UNDP Technical Note (2000) p.17.
5.10 UNIFEM's Strategy and Business Plan (SBP) for 1997-99 included a
detailed list of activities under each objective. The new SBP for 2000-03
included a report on the previous plan that lists the specific and general
results achieved. However, it is not possible to match the results with the
activities originally listed in the SBP for 1997-99.
5.11 The annual Report of the Executive Director mentions that "implementation of each of [the SBP] objectives is measured and driven by a results-based framework", but does not report against the intended outcomes listed in the SBP for 2000-03.37 This is done in the Consultative Committee Report, which is an annual report of results against the objectives of the SBP, based on data from the 6-month and annual reports submitted from each Sub-Regional Office.
5.12 The IDB prepares an Annual Report on Projects in Execution (ARPE) for the
Board of Executive Directors. This provides detailed information on the
status and performance of the Bank's portfolio, including an assessment of
the extent to which ongoing projects in each country are likely to achieve
their development objectives. In addition, the ARPE provides an assessment
of trends and challenges, the issues affecting portfolio performance and
notes the Bank's response to these challenges. In the last two years the
report has contained an analysis of the quality and compliance rate of
Project Completion Reports (PCRs), has provided information on good
practices noted, and highlighted lessons learned from both the Bank and
Borrowers.
37. UNIFEM Report of the Executive Director. Executive Board of UNDP. September 2002.
5.13 The Bank is in the process of revamping the PCR and the Project
Performance Monitoring Report (PPMR). The PPMR has been modified to
include historical project ratings, as well as greater attention to financial and
sustainability issues and lessons learned, and will be linked to other relevant
reports and monitoring systems. The last PPMR will also serve as a key
input for the preparation of the PCR, which will focus more on results and
comply with OECD/DAC guidelines for MDBs. It will include an evaluation of both Bank and Borrower performance, an assessment of the project's contribution to institutional development, and an outlook on expectations regarding the project's ability to deliver benefits in the medium and long term.
5.14 Like UNDP, the IDB's Office of Evaluation and Oversight (OVE) has also produced a Development Effectiveness Report. However, unlike in UNDP, OVE is independent of management. One of the findings reported in the IDB
DER was that, for completed projects rated as highly likely to achieve their
development objectives, the majority of PCRs only discuss project outputs.
Although it is quite likely that all the projects made some contribution to
outcomes and impacts, this was very rarely documented in the PCR 38 .
5.16 Part of the problem lies in the lack of monitorable outcomes in Country
Assistance Strategies (CAS), Sector Strategy Papers, and Project Appraisal
Documents. These and other problems have been the subject of a
comprehensive M&E action plan since 1999. Further improvements in M&E - such as a CAS completion report - are under way. The methodological
challenges associated with measuring and attributing results are also very
real.
5.19 The ARDE provides a reliable measure of the extent to which completed Bank projects are producing relevant results. Because OED has used a consistent methodology over the past few years, it also allows trends in project performance to be monitored. What it does not attempt to do is to quantify the specific results achieved or assess the contribution towards higher goals, such as the MDGs. In this sense it is more an aggregation of results ratings than of results. This is a practical solution to the problem of aggregating across diverse results.

40. Better Measuring, Monitoring and Managing for Development Results. Development Committee Paper. World Bank. (September 2002) p.11
5.20 The World Bank acknowledges that there is scope to improve its reporting
on its results. The assessment of the Strategic Compact found that the
corporate scorecard was still incomplete, in part because of the lack of an
agreed methodology. Limited progress has been made on agreeing ways of
measuring and monitoring the impact of World Bank actions at country and
sector level. This missing second tier of the corporate scorecard is intended
to link internal Bank measures (such as product quantity and quality) with the
International Development Goals (IDGs). The Strategic Framework
produced in 2001 also highlighted the need to link country and sector work
with the IDGs, but was not able to say how this would be done.
6. Managing

...there is sufficient evidence that the key elements are well known to
donors and carried out to some extent. But in so many instances they have
failed owing to weaknesses in how the systems are used rather than in what
their components are. They reflect the missing link between the measurement
procedures and the way in which the information is used: the management
process.
6.1 There are two primary uses, and motivations, for results information. The first
is for accountability: to demonstrate effectiveness to others. This aspect was
covered in the previous section. The second is to provide continuous
feedback and learning for management. To what extent are these institutions
really managing for outcomes? To what extent are they using information on
outputs and outcomes, and from evaluation, in decision making?
6.2 These are difficult questions to answer, particularly in this type of study. The
potential uses of results information extend throughout the institution, from
planning and budgeting to staff appraisal. The observations below are drawn
from a small number of interviews, and from the few reports that address this
issue.
Planning
6.3 Section 4 looked at the extent to which these institutions were planning to
achieve outcomes: managing for results. In an ideal world strategic planning
should also be about managing by results. Institutions should be amending
their plans on the basis of results and experience. In practice this is
something that few institutions are able or prepared to do. No development
institution has been implementing RBM long enough for the results of one
strategic planning cycle to inform the next.42
6.4 More fundamentally, given the time lags between programmes and
outcomes, let alone between programmes and data on outcomes, it is
doubtful whether management by outcomes or impacts will ever be a practical
proposition for development agencies. The best that can be hoped for is for
periodic reviews to examine the alignment of the programme with outcome
trends. UNICEF has done this to some extent in its Medium-Term Strategic
Plan by concentrating, for example, on countries with particularly high child
mortality rates.
42. This is not to say that the experience of one planning cycle has not informed the next. For example, the UNIFEM SBP for 1997-99 certainly informed the formulation of SBP 2000-03.
43. Measuring Outputs and Outcomes in IDA Countries. IDA (February 2002) p. 5
Resource allocation
6.6 As with planning, resource allocation can mean allocating for results or by
results (or conceivably both). The World Bank is probably the strongest
exponent of budgeting for results. There is good evidence that aid has a
larger impact on growth and poverty reduction in the context of good policies
and institutions. In line with this thinking, the World Bank has for some time
used assessments of country policy and institutional (CPI) performance as a
basis for allocating IDA funding.44 45 Since the Strategic Framework, the Bank
has sought still greater selectivity and focus in its work.
6.8 This is not to deny that there is a real question about how best to balance
need and results in resource allocation, particularly for country allocations.
44. Better Measuring, Monitoring and Managing for Development Results. Development Committee Paper. World Bank. (September 2002) p.7
45. It can be argued that CPI scores are themselves results of previous actions by governments and donors. The World Bank is in effect budgeting on the basis of past results in order to increase the likelihood of future results.
But this is not an either/or choice. Aid should be directed at countries in need
with good policy and institutional environments, and therefore the best
prospects for achieving results. It would appear that the World Bank does this
rather better than do the UN agencies.46
6.9 Finally, results-based budgeting should have implications for how resources
are allocated. The 2001 ARDE included an analysis of which objectives World
Bank projects have been most effective at achieving.47 This showed, for
example, much greater success with physical infrastructure than with public
sector institutional change.
6.10
6.11

46. UNDP's allocations to good policy/bad policy countries (as measured by CPIA scores) became less favourable over the 1990s.
47. Annual Review of Development Effectiveness 2001. OED World Bank (2002) p.20
Staff appraisal
6.12 It was not possible to ascertain the extent to which the assessment of staff
6.13 ...RBM more generally - is the time-lag between inputs and outcomes. Annual
appraisals can only hold staff accountable for very short-term results. Any
higher-level outputs or outcomes will be the product of resources and actions
provided years before. Equally, given the predominance of short postings,
most staff will be long gone by the time the outputs and outcomes of their
work become apparent. Making staff more accountable for planning for
results, and for reporting on results, would be a step in the right direction.
Managerial response
6.14 UNDP is aware that the real challenge for RBM is, and remains, to realise
6.15
6.16 On the other hand, there has been some criticism of the limited response
of management to some of the key ROAR findings, such as the relatively poor
performance of UNDP in respect of gender. There is also reported to be more
commitment to RBM at headquarters than in the country offices, and more
among middle than senior management.
6.17
6.18 ...as UNDP. Any managerial changes are therefore both more incremental and
more difficult for an outsider to detect. The World Bank experience is a case
in point. Ten years of management reform, intended in part to increase the
results-focus of the organisation, have made some difference. The design,
outcomes, sustainability and institutional development impact of World Bank
projects have improved. However, as the Bank itself acknowledges, there is
much more that needs to be done to increase its results-orientation,
particularly at country, sector and corporate level. The Bank's own
assessment of performance measurement under the Strategic Compact
concluded as follows:
... while the measurement of performance as well as several of its uses
have improved during the Compact period, performance measurement has
not yet been used systematically and consistently to make strategic ...
6.20

48. Assessment of the Strategic Compact. Annex 9 Performance Measurement. World Bank (2001) p. 15
7. Issues in results-based management

7.3 Aid agencies have come to recognise that development results depend on
developing countries. As the World Bank said recently: "that is where
development outcomes are realised and measured and where the other
goals will be met, or not".51 This presents development agencies with a ...
49. Assessing Development Effectiveness. Flint, M. and Jones, S. DER Working Paper 1. DFID Evaluation Department (2001)
50. Measuring and Managing Results. Poate, D. (1997)
51. Better Measuring, Monitoring and Managing for Development Results. Development Committee Paper. World Bank. (September 2002) p.4
7.4 Experience in OECD countries suggests that RBM will not succeed
without a supportive policy, fiscal and institutional environment. As Poate
stated in 1997, "even where aid agencies can tackle their own internal
measurement procedures and use of performance information,
advocating performance measurement to clients in isolation is unlikely to
lead to improved results because too many components are missing".
RBM requires, and means, full-scale public sector management reform in
developing countries. This is an altogether more challenging prospect
than attempting to implement RBM within development agencies.52
Strengthening country capacity in parallel is an all-important priority, but a
long-term task. Many countries do not yet even have the systems to track
inputs and outputs, let alone outcomes.53
Attribution
52. This would be akin to a car company introducing RBM in the major offices but not in its car plants.
53. Annual Report on Operations Evaluation 2000-01. World Bank (2002) p.34
PROJECTS
Largely self-contained, involving a relatively limited and identifiable range of actors, each of whom has relatively clearly defined, complementary roles and interests.
Produce tangible outputs.

54. Results-Based Management and Multilateral Programming at CIDA: a discussion paper. Mark Schacter. Institute of Governance, Ottawa, Canada. (1999) p.4-5
7.10
Aggregation
7.11
7.12 ...frameworks or similar may or may not help at country level, but are
unlikely to do so at corporate level. Aggregating the number of
programmes implemented in each results area, as in the UNDP SRF,
merely measures activities, not results or performance.
7.13
Incentives
7.14
7.15
7.16
7.17
55. Killick (1997) and Mosley, Harrigan and Toye (1995) quoted in Shepherd, G. (op cit).
56. Delivering Project Aid in Old and New Ways: Institutions Matter. Geoffrey Shepherd. Workshop Paper (May 2002) p.21
7.18
7.19 Staff members face similar incentives. The amount spent and the
7.20
7.21 ...results-focus is not possible. It does, however, mean that it will not just
happen. Institutions and staff cannot be made individually accountable for
outcomes and impacts in developing countries. They can be made more
accountable for reporting more effectively on their processes, strategies
and near-term results, and for evaluating their contribution. The
experience of the World Bank Quality Assurance Group (QAG) shows
that it is possible to alter internal incentives. Similar mechanisms could be
used to shift incentives towards results-based management.
7.22
59. Improving Aid to Africa. Van de Walle, N. and Johnston, T. Overseas Development Council, Washington (1996) quoted in Shepherd, G. (op cit)
8. Conclusions
8.1 Results-based management is the latest in a very long line of efforts to make
development agencies more effective. It is also the latest in a very long line of
efforts to improve the measurement, monitoring and reporting of
effectiveness. This is not to diminish its potential significance. Thinking about
development in terms of outcomes and impacts, rather than inputs, activities
and outputs, is a powerful idea that has major implications for how
multilateral development institutions operate.
8.2 All five institutions are, to a greater or lesser extent, engaging with results-based
management. All have made a commitment to increase their results-focus,
and all have taken steps to, or are working on, improving the planning
and reporting of results. As befits their different histories, mandates and
cultures, there is enormous variety in their approaches and progress.
8.3 Five conclusions emerge from this study. First, results-based management is
easier said than done, particularly for development institutions, and
particularly given the new focus on country and global results. The
application of RBM to development institutions is an attempt to stretch the
approach from one institutional context (government in developed countries)
to a very different one. It is also an attempt to stretch the approach from one
category of activity where it is relatively easy to apply (projects) to one where
it is not (country and corporate). Institutions should not underestimate the
challenge.
8.4 Second, attempts to implement RBM within MDIs alone, or within developing
countries but without a supportive policy and fiscal environment, are unlikely
to succeed.
8.5 Third, there are tensions and trade-offs between the two uses of results
information: accountability and management. The reality is that external
accountability is driving much of the recent push for improved RBM.
Multilateral agencies are under pressure to demonstrate results. They are
under less pressure to manage for and/or by results. Unless this is corrected,
the result will be an over-emphasis on measuring and reporting results, and
insufficient emphasis on using results information for internal management.
The obvious risk is that so-called results-based approaches will lead more to
an increase in the results claimed by institutions than to an increase in the
results actually achieved.
8.8 Finally, this study has implications for those supporting and monitoring the
progress of results-based management within multilateral development
institutions. The clear message of this report is that RBM is not something
that has either been introduced or not. RBM implies far-reaching changes
throughout the institution. All of these five institutions are trying, in different
ways and with different degrees of success, to increase their results
orientation. The understandable temptation is to claim a greater degree of
change, and a greater impact, than may exist in reality. It follows that
assessing the quality and extent of management change is not a
straightforward task. Increasing support for the introduction of RBM will need
to be accompanied by a more sophisticated approach to its monitoring.
ANNEX A
STRATEGIC PLANNING - COUNTRY LEVEL

UNDP
Document
Logical framework:
MDGs
GOALS (Indicators, Targets)
OUTCOMES (Indicators, Targets)
OUTPUTS (Indicators, Targets)
ACTIVITIES
INPUTS

UNIFEM

UNICEF
Country Programme Outline
Country Note

WB
Country Assistance Strategy

IDB
Country Paper
Document
Logical framework:
MDGs
GOALS (Indicators, Targets)
SUB-GOALS (Indicators)
OUTCOMES (Indicators, Targets)
OUTPUTS
ACTIVITY AREAS
INPUTS

ANNEX B
STRATEGIC PLANNING - CORPORATE

UNDP
Strategic Results Framework 2000-03

UNIFEM

UNICEF

WB
Strategic Framework Paper 2001

IDB
Institutional Strategy 1999
ANNEX C
REFERENCES
CIDA - Results-Based Management and Accountability for Enhanced Aid Effectiveness. A Reference Paper. Policy Branch. July 2002.
Flint, M. and Jones, S. - Assessing Development Effectiveness. DER Working Paper 1. DFID Evaluation Department. 2001.
IDA - Measuring Outputs and Outcomes in IDA Countries. February 2002.
IDB - Development Effectiveness Report. Office of Evaluation and Oversight. February 2002.
IDB - Development Effectiveness at the IDB. Paper for the Board of Executive Directors. January 2002.
IDB - Ecuador Country Paper. 2001.
IDB - Brazil Country Paper. 2000.
IDB - Chile Country Paper. 2001.
IDB - Country Paper Guidelines. February 2002.
IDB - Renewing the Commitment to Development: Report of the Working Group on Institutional Strategy. 1999.
Killick, T. - Principals, Agents and the Failings of Conditionality. Journal of International Development, Vol. 9, No. 4, pp. 483-495. 1997.
Mosley, P., Harrigan, J. and Toye, J. - Aid and Power: the World Bank and Policy-Based Lending. Routledge, New York. 1995.
OECD - Glossary of Key Terms in Evaluation and Results-Based Management. DAC. 2002.
OECD - Results-Based Management in the Development Co-operation Agencies: A Review of Experience. Annette Binnendijk. DAC. 2001.
Poate, D. - Measuring and Managing Results. Report for UNDP. 1997.
Schacter, M. - Results-Based Management and Multilateral Programming at CIDA: a discussion paper. Institute of Governance, Ottawa, Canada. 1999.
Shepherd, G. - Delivering Project Aid in Old and New Ways: Institutions Matter. Workshop Paper. May 2002.
Treasury Board of Canada Secretariat - Results-Based Management Lexicon. 2002.
UNDP - Annual Report of the Administrator for 1997. 1998.
UNDP - Results-Orientated Monitoring and Evaluation: a Handbook for Programme Managers. 1997.
UNDP - Results-Based Management: Overview and Principles.
UNDP - Results-Based Management Concepts and Methodology. Technical Note. 2000.
UNDP - Multi-Year Funding Framework. 1998.
UNDP - First Country Co-operation Framework for Mongolia (1997-2001). 1997.
UNDP - Country Programme Outline for India (2003-2007). June 2002.
UNDP - Country Programme Outline for Malaysia (2003-2007). June 2002.
UNDP - Second Country Co-operation Framework for Malawi (2002-2006). 2001.
UNDP - Second Regional Co-operation Framework for Africa (2002-2006). February 2002.
UNDP - Results-Orientated Annual Report, 2001. 2002.
UNICEF - Report on the Evaluation Function in the Context of the Medium-Term Strategic Plan. 2002.
UNICEF - Medium-Term Strategic Plan for the Period 2002-05. 2001.
UNIFEM - Report of the Executive Director. September 2002.
UNIFEM - How are we doing? Tracking UNIFEM progress in achieving results for management and learning. Briefing Note. 2002.
Van de Walle, N. and Johnston, T. - Improving Aid to Africa. Overseas Development Council, Washington. 1996.