
United Nations Development Programme

Quality Assurance Report

2013 Country Office Results Oriented Annual Reports (ROAR)

October 2014

Contents

1. Introduction
2. The 2013 ROAR: format
3. The 2013 ROAR: reporting process
4. Scoring methodology
5. Quality assurance team
6. Summary of Quality Assurance Scores
7. Acclaim list – ROARs assessed as the highest quality
8. Lower scoring ROARs/ROARs needing attention (amber)
9. General observations on quality and compliance
ANNEXES
Annex 1 – Average scores by year and region
Annex 2 – Country Office ROAR Quality Assurance Scores (A to Z)
Annex 3 – Top ROARs 2011-2013 (scoring 18, 19 or 20 points)
Annex 4 – ROAR Quality Assurance Scoring Methodology
Annex 5 – List of Quality Assurors by region
Annex 6 – ROAR Quality Assurance Scores by region and Country Office from 2011 to 2013


1. Introduction

This paper presents the results of the quality assessment for the 2013 ROARs, and draws out key characteristics of high quality ROARs and areas for improvement. It sets out the steps we need to take to improve the quality of reporting in the future.[1]

The Results Oriented Annual Report (ROAR) exercise is one important part of our corporate strategic planning system – based on the principles of results-based management – which is intended to help units measure and monitor their programmes and the difference they are making for sustained, transformational change. The ROAR should be of value to all country offices as a tool to assess the effectiveness of country programmes based on demonstrable use of quantitative and qualitative evidence. The ROAR quality assurance score is designed to assess the extent to which offices give evidence in their ROAR of understanding RBM approaches and are using evidence to support claims of progress at the output and outcome levels.

UNDP made a commitment to the Executive Board at the Mid-Term Review of the previous Strategic Plan (2008-2013) to improve results reporting. The Results Framework includes an indicator on the Country Office (CO) ROAR quality rating, with a target of a 20% annual increase in the number of COs producing ROARs that meet expected standards (14 out of 20 points). In 2011, 80 out of 136 CO ROARs (59%) met the expected standard, and this increased to 91 (67%) in 2012. In 2013, there was another increase, with 101 out of 136 CO ROARs (74%) meeting the expected standard, and not a single ROAR was scored in the "red" category. Detailed scoring per CO and a report on the results for 2011 and 2012 can be found on the ROAR Intranet page, including full background on the methodology.
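As a quick illustration of how the 20% target works against the green-ROAR counts reported above, the following is a minimal sketch (illustrative only, not a tool from the report):

```python
# A minimal sketch checking the reported green-ROAR counts against the
# Strategic Plan target of a 20% annual increase in the number of COs
# meeting the expected standard (14 or more of 20 points).

green_roars = {2011: 80, 2012: 91, 2013: 101}  # counts reported in this section

for year in (2012, 2013):
    prev, curr = green_roars[year - 1], green_roars[year]
    growth = (curr - prev) / prev
    met = "meets" if growth >= 0.20 else "falls short of"
    print(f"{year}: {curr} green ROARs, +{growth:.0%} vs {year - 1} ({met} the 20% target)")
```

Run as written, this reproduces the figures discussed in Section 6: +14% in 2012 and +11% in 2013, both short of the 20% target.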

2013 can be considered an important milestone for UNDP: not only did it bring together, for the first time at the organizational level, a measurable and monitorable results framework at impact, outcome and output levels (the IRRF), but it was also a transition year during which the organization invested considerable effort to strengthen results-based management (RBM), programme performance, learning from evaluation and results reporting. 2013 also closes the third generation of RBM in UNDP, and constitutes a great opportunity to capitalize on our long-term commitment to raise the level of our analysis and the relevance of our evidence base.

The evidence (ROAR QA scores for 2011, 2012 and 2013) demonstrates that the organization has been delivering incremental improvements in ROAR quality ratings over the past three years. In 2013, our goal that every ROAR be scored satisfactory or higher[2] was reached. The increasingly robust and evidence-based analysis of our performance in 2013 provided us with a strong foundation of data to drive forward higher quality programming. This continuing trend towards a stronger culture of results was also highlighted by the Annual Report on Evaluation 2013.

The ROAR has been an essential source of data and evidence to inform organizational planning, learning, decision-making and the performance and results reporting to the Executive Board. It will continue to be so while undergoing a conceptual revision in 2014 to adapt to the new context. The new format will be integrated within the Corporate Strategic Planning tool.

[1] Quality assurance scores by CO can be found in summary in Section 6 and in full in Annex 2, and will be included in the Balanced Scorecard.

[2] Associate Administrator's ROAR launch message, 11 November 2013.


2. The 2013 ROAR: format

As the last of the generation of ROARs reporting on the previous Strategic Plan 2008-2013, the 2013 ROAR was used as a bridge between the results under the outgoing plan and the organization's positioning to undertake the new plan. The 2012 Cumulative Review section was replaced by a "bridging" section of eight thematic questions intended to help us show how the current results of the organization position us to deliver on important priorities of the new Strategic Plan. The aim was to provide baseline results data relevant to the new programme priorities, and to contribute to the Annual Business Plan/IWP planning exercise for 2014.

To ensure consistency with reporting in previous years and to minimize the burden on Country Offices, the ROAR questions were very similar to 2012. The ROAR was nevertheless shortened (around 20 questions were removed, mainly from the output section).

3. The 2013 ROAR: reporting process

The Associate Administrator launched the 2013 ROAR on 11 November 2013. She insisted on the leadership needed across the organization to keep strengthening its reporting capacities, especially during the critical transition to the new Strategic Plan, and called for stronger evidence- and data-driven programme management. She reminded the organization that we cannot invest in development programming without investing proportionally in monitoring and evaluation, and highlighted that M&E is not an add-on but an integral component of good programme and project management. She recalled that while peer organizations were dedicating between 1% and 3% of programme expenditure to M&E, UNDP was still far behind. She insisted that the ROAR and other initiatives, such as the Country Office Support Initiative/regional roadmap exercise, should take UNDP a major step closer to meeting international standards and help it become an organization whose learning, knowledge and practice are firmly grounded in evidence that guides our, and our partners', decisions on how best to achieve long-term development.

The Operations Support Group (OSG) provided methodological support throughout the whole reporting period (11 November to 31 January) through various webinars/training sessions co-organized by Regional Bureaux, one-on-one sessions, and email support ([email protected]). As usual, the ROAR website (https://intranet.undp.org/apps/roar2013/SitePages/Roar_Home.aspx) contained substantial guidance notes and FAQs.

4. Scoring methodology

The scoring methodology (attached in Annex 4) was discussed and agreed by the OPG in February 2012 and approved again for use in 2013. It was designed in 2011, following an analysis of stronger and weaker ROARs, with a view to identifying key quality criteria that recognize strong examples of results-based monitoring and reporting and address some of the most prevalent shortcomings in UNDP's results monitoring and reporting culture and capacity. In addition, it includes criteria to incentivize compliance with the ROAR guidance and template, to ensure data is provided in full so that credible and complete analysis can be carried out.


The methodology focuses on the quality of the component parts of the report itself (results focus and use of data and evidence), not the quality of the underlying programme, the nature of the actual results achieved, or whether or not progress is on track. The scoring system does not look at every section of the ROAR but instead focuses on the key quality issues in the key sections of the report for which every CO is required to provide a quantitative and qualitative analysis.

ROARs are scored out of 20 points and then flagged on a red, amber, green scale to signal which ones need most attention. Eight of the 20 points available are used to assess whether the CO has complied with corporate requirements in the ROAR, e.g. completion of all sections of the ROAR in full. The remaining 12 points are used to assess results focus and use of evidence and data. By separating the points between quality and compliance, the organisation can pinpoint which areas of the ROAR need greatest improvement.

Balanced Scorecard Indicator (ROARs scored out of 20 points in total):

- <9 = Red (seriously weak, urgent attention needed)
- 9-13 = Amber (needs management attention)
- >13 = Green (meets quality standards expected)
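The mapping from score to flag is mechanical; a minimal sketch of it (an assumed helper for illustration, not code from the report) is:

```python
# A minimal sketch mapping a ROAR QA score out of 20 points to the
# red/amber/green flag used in the Balanced Scorecard:
# <9 = Red, 9-13 = Amber, >13 = Green.

def rag_rating(score: int) -> str:
    if not 0 <= score <= 20:
        raise ValueError("ROAR QA scores run from 0 to 20 points")
    if score < 9:
        return "Red (seriously weak, urgent attention needed)"
    if score <= 13:
        return "Amber (needs management attention)"
    return "Green (meets quality standards expected)"

print(rag_rating(14))  # 14 is the minimum score meeting the expected standard
```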

Lessons learned

- The scoring methodology has revealed weaknesses, the main one being its subjectivity. While the necessity of a scoring mechanism is not questioned, and the rating system (Red Amber Green) has demonstrated its efficiency (it provides an incentive for Country Offices to "do better"), the methodology is not objective enough to ensure that scores are standardized. Even though cross-checks and adjustments are made by OSG after ROARs are quality assured by the internal Quality Assuror team, there is still an important margin for subjective scoring, depending mainly on how lenient or severe the quality assuror was.
- In many instances, quality assurors were extremely severe in their scoring of the "use of data and evidence" parameter.

Recommendation

- The methodology needs to be revamped, as it is not sensitive enough and relies heavily on subjective judgment of quality. A more objective, neutral and systematic scoring of ROARs is needed in the future.

5. Quality assurance team

In 2012, OSG, with the support of one consultant, scored the 2011 ROARs. In 2013 the OPG agreed that the 2012 ROARs should be scored by 18 skilled M&E staff (nominated by regional bureaux, mainly from the COs with the top scoring ROARs in 2011). These staff attended a one-week workshop in New York in March 2013 to learn how to complete the scoring exercise and then carried out the task virtually over a couple of months. OSG managed and coordinated the task team closely to ensure the scoring methodology was applied rigorously and fairly. The workshop was also used for strategic discussions and networking on M&E and the Country Office Support Initiative (COSI) on RBM.

For the 2013 ROAR, three options were submitted to the OPG for setting up the scoring team. The need to maintain consistency of approach with previous years was underlined, as were cost and time. The options presented were based on the principles agreed the year before: that quality assurance and scoring should be devolved to regions and COs, to encourage ownership of the approach and to create an incentive for CO staff to improve ROAR quality, while recognizing the implications of running the exercise during a busy period when HQ, regional and CO staff would be engaged in operationalizing the new Strategic Plan, as well as in analysis of the ROARs and production of the 2013 Annual Report. The team was also expected to cover English, French and Spanish, and the principle that Quality Assurors would ideally score ROARs outside of their own regions was maintained. The deadline for the scoring was end April 2014, and estimated costs were included.[3]

The options were the following:

- Option 1: Staff from the COs with the top scoring 2012 ROARs are invited to complete the scoring exercise virtually, from their COs, with support and coordination from OSG.
- Option 2: Staff who performed the scoring exercise last year are requested to carry out the exercise again, virtually from their CO, with support and coordination by OSG.
- Option 3: Up to five staff (one from each region) from last year's scoring Task Team are invited to complete the 2013 exercise on detail assignment in New York for around one month, with coordination by OSG.

The consensus was to select Option 2 (staff who performed the scoring exercise last year would be requested to carry out the exercise again, virtually from their CO, with support and coordination by OSG). OSG requested support from Regional Bureaux in contacting the CO and RSC staff (and their respective senior management), and insisted that for continuity reasons it would be important to rely as much as possible on the same staff as the previous year. A team of 25 staff members in HQ, Regional centres and Country Offices was designated and undertook the assignment of quality assuring all 136 CO ROARs on behalf of the organization. Among those 25 staff, 17 were from Country Offices in the 5 regions, 3 from 2 Regional Hubs (Panama and Addis), 1 from a Regional Bureau, and 4 from the Operations Support Group.[4]

All the quality assurors were trained on the methodology and the process through 3 webinars, and OSG provided them with methodological support throughout the whole process. Some of them (12) had already quality assured ROARs previously and had gone through a one-week workshop held in New York in 2013. With an aim to keep building capacities, these experienced Quality Assurors were asked to be the "buddies" of the less experienced staff members. This buddy system was welcomed by the different team members, who took it as an opportunity to interact with colleagues in other Country Offices, exchange views on regional specificities, and benefit from the know-how of more experienced staff members.

Overall, the quality assurance support role was played with enthusiasm by most of the staff members. High levels of commitment were noticed, the main reason being that staff saw their involvement in the ROAR QA as a great learning opportunity, as well as a way to interact with HQ on strategic programmatic issues. Two staff members did not manage to undertake the QA: one was away for health reasons, and the other was unable to finalize the exercise on time. Some staff members also mentioned that the absence of an "incentive" this year made the process slightly less attractive to them.

[3] OPG action brief and paper on Scoring of CO ROARs 2013 – options for process.

[4] List of Quality Assurors in Annex 5.

Each member of the QA team scored on average 5 to 6 ROARs. OSG coordinated and managed the task team closely and spot-checked at least one ROAR scored by each member of the team to ensure the scoring methodology was applied consistently, rigorously and fairly. Scores were compared with last year's, and additional spot checks were carried out by OSG to validate the findings in cases where scoring differed significantly. The scores were circulated to regional bureau M&E advisors before the finalization of the final report. Three Regional Bureaux asked OSG to double-check scores for a total of 8 ROARs (3 in RBEC, 3 in RBLAC and 2 in RBAS). The cross-check was conducted by 2 experienced Quality Assurors (who were not given the initial scores, in order not to be influenced) and revealed that the rating was accurate in 6 cases, while one ROAR gained 1 point and another 3.

Lessons learned

- The QA process is seen as a learning and networking opportunity for Quality Assurors;
- The buddy system was a good way for staff to strengthen each other's capacities; however, some staff could not prioritize the QA and the exchanges with their buddies, who in the end had to request methodological support from OSG;
- The workload on staff members is considerable, and delays accumulated because several other initiatives solicited the same staff members (most of the time M&E officers, who were also in charge of the IRRF, programme alignment, etc.).

Recommendation

- Keep the ROAR QA as an internal process, with interesting capacity-building potential, but institutionalize the process and find a way to reward/incentivize staff;
- Increase the total number of Quality Assurors in order to lighten the pressure on staff;
- Instead of a one-on-one buddy system, envisage groups of 4.

6. Summary of Quality Assurance Scores

The overall average quality assurance score is 15 points, showing that quality remains stable (the average was also 15 in 2012).


In 2013, 101 ROARs (74%) met the standard expected (green), which represents 10 more than in 2012 and 21 more than in 2011. This increase in the number of green ROARs represents an 11% improvement from 2012 to 2013, but does not yet reach the target set in the 2008-2013 Strategic Plan Results Framework of a 20% annual increase.

Total number of Country Office ROARs by the red, amber, green rating in 2011, 2012 and 2013:

Year    Green   Amber   Red
2011    80      53      3
2012    91      45      0
2013    101     35      0

Number of Country Office ROARs by RAG rating in 2013, and scoring trend compared to 2012:

Region   Green (#)   Trend 2012-2013   Amber (#)   Trend 2012-2013   Red (#)   Trend 2012-2013   Overall trend
RBA      31          +1                14          -1                0         0                 ↑
RBAP     17          +2                7           -2                0         0                 ↑
RBAS     11          +1                7           -1                0         0                 ↑
RBEC     21          +5                2           -6                0         0                 ↑
RBLAC    21          +1                4           -1                0         0                 ↑
TOTAL    101         +10               35          -10               0         0                 ↑

Scores for Country Offices by region

RBLAC

Country              Region   Total
Argentina            RBLAC    14
Barbados & OECS      RBLAC    16
Bolivia              RBLAC    13
Brazil               RBLAC    20
Chile                RBLAC    15
Colombia             RBLAC    16
Costa Rica           RBLAC    15
Cuba                 RBLAC    16
Dominican Republic   RBLAC    17
Ecuador              RBLAC    20
El Salvador          RBLAC    18
Guatemala            RBLAC    15
Guyana               RBLAC    15
Haiti                RBLAC    14
Honduras             RBLAC    16
Jamaica              RBLAC    12
Mexico               RBLAC    20
Nicaragua            RBLAC    17
Panama               RBLAC    19
Paraguay             RBLAC    19
Peru                 RBLAC    17
Suriname             RBLAC    10
Trinidad & Tobago    RBLAC    11
Uruguay              RBLAC    18
Venezuela            RBLAC    15

RBEC


Country                Region   Total
Albania                RBEC     17
Armenia                RBEC     18
Azerbaijan             RBEC     16
Belarus                RBEC     19
Bosnia & Herzegovina   RBEC     18
Bulgaria               RBEC     14
Croatia                RBEC     16
Cyprus                 RBEC     15
Georgia                RBEC     17
Kazakhstan             RBEC     14
Kosovo                 RBEC     15
Kyrgyzstan             RBEC     19
Lithuania              RBEC     –
Macedonia              RBEC     15
Moldova                RBEC     19
Montenegro             RBEC     18
Romania                RBEC     13
Russia                 RBEC     18
Serbia                 RBEC     19
Tajikistan             RBEC     17
Turkey                 RBEC     17
Turkmenistan           RBEC     12
Ukraine                RBEC     16
Uzbekistan             RBEC     19

RBAP

Country            Region   Total
Afghanistan        RBAP     17
Bangladesh         RBAP     20
Bhutan             RBAP     20
Cambodia           RBAP     17
China              RBAP     17
DPR Korea          RBAP     9
Fiji               RBAP     15
India              RBAP     18
Indonesia          RBAP     18
Iran               RBAP     10
Laos               RBAP     13
Malaysia           RBAP     15
Maldives           RBAP     18
Mongolia           RBAP     19
Myanmar            RBAP     12
Nepal              RBAP     20
Pakistan           RBAP     18
Papua New Guinea   RBAP     18
Philippines        RBAP     14
Samoa              RBAP     12
Sri Lanka          RBAP     16
Thailand           RBAP     13
Timor-Leste        RBAP     13
Vietnam            RBAP     15

RBAS

Country                Region   Total
Algeria                RBAS     10
Bahrain                RBAS     12
Djibouti               RBAS     14
Egypt                  RBAS     16
Iraq                   RBAS     14
Jordan                 RBAS     12
Kuwait                 RBAS     10
Lebanon                RBAS     15
Libya                  RBAS     10
Morocco                RBAS     18
PAPP                   RBAS     13
Saudi Arabia           RBAS     18
Somalia                RBAS     15
Sudan                  RBAS     20
Syria                  RBAS     14
Tunisia                RBAS     16
United Arab Emirates   RBAS     12
Yemen                  RBAS     16

RBA

Country                         Region   Total
Angola                          RBA      16
Benin                           RBA      12
Botswana                        RBA      17
Burkina Faso                    RBA      17
Burundi                         RBA      13
Cameroon                        RBA      17
Cape Verde                      RBA      14
Central Africa Republic         RBA      14
Chad                            RBA      11
Comoros                         RBA      13
Congo                           RBA      10
Cote d'Ivoire                   RBA      16
Democratic Republic of Congo    RBA      20
Equatorial Guinea               RBA      12
Eritrea                         RBA      16
Ethiopia                        RBA      16
Gabon                           RBA      16
Gambia                          RBA      15
Ghana                           RBA      16
Guinea                          RBA      15
Guinea Bissau                   RBA      13
Kenya                           RBA      15
Lesotho                         RBA      16
Liberia                         RBA      12
Madagascar                      RBA      18
Malawi                          RBA      12
Mali                            RBA      14
Mauritania                      RBA      19
Mauritius                       RBA      13
Mozambique                      RBA      15
Namibia                         RBA      19
Niger                           RBA      17
Nigeria                         RBA      13
Rwanda                          RBA      18
Sao Tome and Principe           RBA      12
Senegal                         RBA      10
Sierra Leone                    RBA      19
South Africa                    RBA      16
South Sudan                     RBA      18
Swaziland                       RBA      12
Tanzania (United Republic of)   RBA      16
Togo                            RBA      19
Uganda                          RBA      19
Zambia                          RBA      18
Zimbabwe                        RBA      16

All 5 regions managed to increase the number of CO ROARs scoring in the green range. Over the past 3 years, all 5 regions have maintained a good level of quality, with RBLAC ahead (average score of 15.5 points), closely followed by RBEC (15.3 points), RBAP (15 points), RBA (14.6 points) and RBAS (14 points).

While looking more closely at the scores, the following should be highlighted:

On the positive side, the evidence shows that standards have improved, with 10 more ROARs meeting the minimum standard expected (green). 70 Country Offices gained points, meaning that more than half (52%) of Country Offices in the organisation delivered better ROARs. 26 COs (20%) managed to improve ROAR quality from amber to green (compared to 34 last year). No ROAR was assessed as seriously weak (red), compared to none in 2012 and 3 in 2011.

On the more negative side, it should be noted that overall in 2013, 47 ROARs (35%) lost points. The 27% of ROARs needing management attention to improve quality (amber) is a considerable decrease from 2012 (33%) and 2011 (39%), but this is still too high.

Yearly average ROAR QA score by region (2011-2013):

Region   2011   2012   2013
RBA      15     14     15
RBEC     15     15     16
RBAP     14     15     16
RBLAC    14     16     16
RBAS     13     14     15


Most importantly, 20 COs dropped from green to amber (compared to 20 last year).[5] Particular attention should be paid to those COs which have fallen from green to amber or scored significantly less than last year (i.e. fallen by more than 2 points between 2012 and 2013). It is recommended that each region look carefully at the reasons why scores have fallen or risen, to identify weaknesses, strengths and lessons learned which could help improve RBM.

List of Country Office ROARs having lost more than 2 points between 2012 and 2013:

Country Office           Point loss
Myanmar                  -6
Senegal                  -6
Congo                    -5
Malawi                   -5
Benin                    -4
Mauritius                -4
Swaziland                -4
Bahrain                  -3
Romania                  -3
Sao Tome and Principe    -3
Bolivia                  -3
Trinidad & Tobago        -3

[5] Individual scores by CO can be found in summary in Section 6 and in full in Annex 2.


7. Acclaim list – ROARs assessed as the highest quality

38 ROARs were assessed as the best quality in 2013 (scoring 18, 19 or 20 points), almost three times the 13 ROARs acclaimed in 2012.

Serbia and Bangladesh have fluctuated between 18 and 20 points over the past 3 years and deserve special praise. In addition, Bhutan, Brazil, DRC, Ecuador, Mexico, Nepal and Sudan scored 20 out of 20 points. These ROARs should be used as good examples for other regions and COs to learn from.

It should be noted that those COs acclaimed for the highest quality ROARs in 2012 which are not in the 2013 list still scored in the green range this year (between 14 and 17 points), except for Myanmar (13, amber); China scored 17.

Characteristics of the best quality ROARs include: clear descriptions of outcome level results, evidenced by relevant qualitative or quantitative data; good use of outcome indicators which have been assessed using the latest data against clear baselines and targets; a clear, logical narrative describing how UNDP contributed to outcome level, national development results; and use of evaluations to verify the results achieved or to evidence challenges and lessons affecting progress.


Country Offices receiving scores equal to or above 18 points in 2012 and 2013 (in red, 2 years in a row): 13 COs in 2012 and 37 COs in 2013.

2012 – 20 points: Kyrgyzstan, Mexico, Serbia, South Sudan, Sudan. 19 points: Bangladesh, Brazil, China, Ecuador, India. 18 points: El Salvador, Myanmar, Uruguay.

2013 – 20 points: Bangladesh, Bhutan, Brazil, DRC, Ecuador, Mexico, Nepal, Sudan. 19 points: Belarus, Kyrgyzstan, Mauritania, Moldova, Mongolia, Namibia, Panama, Paraguay, Serbia, Sierra Leone, Togo, Uganda, Uzbekistan. 18 points: Armenia, Bosnia & Herzegovina, India, Indonesia, Madagascar, Maldives, Montenegro, Morocco, Pakistan, Papua New Guinea, Russia, Rwanda, Saudi Arabia, South Sudan, Uruguay, Zambia.


8. Lower scoring ROARs/ROARs needing attention (amber)

In 2013, no ROAR fell into the red category. Thirty-five ROARs were in the amber range, all requiring special management attention on the basis of the following criteria:

• The CO scored less than 50% of the points allocated to quality in the scoring methodology (as opposed to compliance, which should be simpler to address);
• The country is of specific interest to the Executive Board, external partners or UNDP due to a current or recent political or other special situation;
• The country programme is of special interest or strategic importance to UNDP due to its nature (e.g. regional hub, South-South programme, etc.).

All 35 Country Offices with ROARs scored in the amber range should be the focus of management attention in 2014. Of these, 35 are still producing ROARs scoring less than 50% against the quality criteria ("results focus" and "use of evidence and data").

A number of those identified as needing special attention last year[6] continued to score in the amber range and low on quality in 2013 (in red below).

2013 Amber ROARs for attention:

Country                 Region   Total 2013   Total 2012   Quality 2013   Quality 2012   Trend
Algeria                 RBAS     10           13           25%            42%            ↓
Bahrain                 RBAS     12           15           36%            58%            ↓
Benin                   RBA      12           16           33%            67%            ↓
Bolivia                 RBLAC    13           16           42%            75%            ↓
Burundi                 RBA      13           12           42%            33%            ↑
Chad                    RBA      11           13           25%            50%            ↓
Comoros                 RBA      13           13           42%            42%            ↔
Congo                   RBA      10           15           17%            58%            ↓
DPR Korea               RBAP     9            11           8%             25%            ↓
Equatorial Guinea       RBA      12           10           50%            25%            ↑
Guinea Bissau           RBA      13           11           42%            25%            ↑
Iran                    RBAP     10           11           17%            25%            ↓
Jamaica                 RBLAC    12           14           33%            50%            ↓
Jordan                  RBAS     12           13           33%            42%            ↓
Kuwait                  RBAS     10           10           25%            25%            ↔
Laos                    RBAP     13           11           42%            33%            ↑
Liberia                 RBA      12           11           42%            25%            ↑
Libya                   RBAS     10           9            17%            17%            ↑
Malawi                  RBA      12           17           33%            75%            ↓
Mauritius               RBA      13           17           42%            75%            ↓
Myanmar                 RBAP     13           18           42%            83%            ↓
Nigeria                 RBA      13           15           58%            58%            ↓
PAPP                    RBAS     13           13           42%            42%            ↔
Romania                 RBEC     13           16           42%            67%            ↓
Samoa                   RBAP     12           9            33%            50%            ↑
Sao Tome and Principe   RBA      12           15           33%            58%            ↓
Senegal                 RBA      10           16           17%            67%            ↓
Suriname                RBLAC    10           13           17%            42%            ↓
Swaziland               RBA      12           16           33%            67%            ↓
Thailand                RBAP     13           15           42%            58%            ↓
Timor-Leste             RBAP     13           11           50%            42%            ↑
Trinidad & Tobago       RBLAC    11           14           33%            50%            ↓
Turkmenistan            RBEC     12           13           37%            42%            ↓
United Arab Emirates    RBAS     12           14           35%            50%            ↓

[6] Report on the Quality Assurance of 2012 ROARs, p. 6.

The team who carried out the quality assurance of the ROARs made a number of observations about the characteristics of higher and lower scoring ROARs which may help identify how reporting could be improved in future years. Characteristics of lower scoring ROARs are the following:

a. Narrative style lacks clarity and consistency at programme level

o The narrative in general lacked clarity and used too many localized acronyms and references, making the significance of UNDP's work difficult to comprehend.
o Different sections were clearly drafted by different writers, leading to a lack of consistency and fragmenting the results picture. The RR Foreword frequently had its own "voice", suggesting it may have been the only section written or checked by senior management.

b. Unable to be specific about UNDP's role and contribution to outcome level development change

o Lower scoring ROARs failed to be specific about what UNDP actually did to contribute to outcome level change. For example, the CO may have explained the development change at the outcome level but did not specify UNDP's role well and/or struggled to make the connection between UNDP activities and outputs and outcome level results.
o When key development changes did take place, the COs did not elaborate on why the change was important, significant or beneficial to the programme country and its people. In general, the link to transformational change was missing.
o In cases where projects had recently closed or CPD outcomes were in flux, some COs did not report any information, even though expenditures were incurred in the year (suggesting that work had been carried out). In several cases where projects were closed, evaluations had been conducted and could have been cited, but were not.

c. Lack of evidence and data to verify the results reported, and very limited use of evaluations

o No use, or limited use, of evaluation findings to support results statements.
o In some cases evaluations were noted in the evaluation section (Section E) but were not used to validate and elaborate on results reporting, despite the likelihood that their findings would have been pertinent.
o Poor reporting against the outcome indicators. In many cases the outcome indicators were poorly formulated, which made it difficult to report progress against them.
o In weaker ROARs the CO stated in the narrative that progress and development change had been made but did not substantiate this with data or evidence, leaving the reader without any sense of the scale of progress or the credibility of the stated results.

9. General observations on quality and compliance

Results focus:

Country offices with lower scores in this section did not always describe results or change at the outcome level, and instead tended to describe activities and outputs, or struggled to articulate how UNDP contributed to outcome level results. COs with lower scores here need to ensure that they are able to report how their programme contributes to outcome level change and that this can be communicated clearly and logically. These results should be supported by evidence and data (qualitative and/or quantitative) to verify that results have been achieved.

Use of evidence and data:

Country offices were asked to specify the evidence base used in Sections A and B (Resident Representative's foreword and outcome progress) and to draw on data (both qualitative and quantitative) to support their assessment of progress. Furthermore, COs were asked to appropriately monitor and make use of outcome indicators, or to indicate if the indicator set needed revision. Lower scores in this section suggest that the CO did not do so across all outcomes and therefore should improve the use of data and evidence (including evaluations), not only to support annual results reporting but also as a critical part of routine results-based management throughout the programme and project cycle. In particular, outcome indicators were often not articulated or monitored correctly. The rating system recognizes that data for outcome indicators are typically not collected annually, but in many cases there is no instance of qualitative or quantitative data ever having been collected, or no demonstration that it is being used to support claims of progress.

Compliance issues:

Eight of the twenty points available could be achieved through compliance with the requirements of the ROAR template. All COs met 100% of the compliance criteria because the functionality of the ROAR template has improved, preventing a CO from saving the ROAR without completing all sections in full.


ANNEXES

Annex 1 – Average scores by year and region

Number of Country Office ROARs by the red, amber, green rating in 2013, 2012 and 2011:

2013 (trend vs. 2012):

Region   Green        Amber       Red      Total   Trend
RBA      31 (69%)     14 (31%)    0 (0%)   45      ↑
RBAP     17 (71%)     7 (29%)     0 (0%)   24      ↑
RBAS     11 (61%)     7 (39%)     0 (0%)   18      ↑
RBEC     21 (91%)     2 (9%)      0 (0%)   23      ↑
RBLAC    21 (84%)     5 (20%)     0 (0%)   25      ↑
TOTAL    100 (74%)    36 (27%)    0 (0%)   135     ↑

2012 (trend vs. 2011):

Region   Green        Amber       Red      Total   Trend
RBA      30 (67%)     15 (33%)    0 (0%)   45      ↑
RBAP     15 (63%)     9 (38%)     0 (0%)   24      ↑
RBAS     10 (56%)     8 (44%)     0 (0%)   18      ↑
RBEC     16 (67%)     8 (33%)     0 (0%)   24      ↑
RBLAC    20 (80%)     5 (20%)     0 (0%)   25      ↑
TOTAL    91 (67%)     45 (33%)    0 (0%)   136     ↑

2011:

Region   Green        Amber       Red      Total
RBA      29 (64%)     16 (36%)    0 (0%)   45
RBAP     14 (58%)     9 (38%)     1 (4%)   24
RBAS     8 (44%)      9 (50%)     1 (6%)   18
RBEC     16 (67%)     7 (29%)     1 (4%)   24
RBLAC    13 (52%)     12 (48%)    0 (0%)   25
TOTAL    80 (59%)     53 (39%)    3 (2%)   136


Annex 2 - Country Office ROAR Quality Assurance Scores (A to Z)

Country                         Region   2013 score   2012 score   Difference   Trend
Afghanistan                     RBAP     17           16           1            ↑
Albania                         RBEC     17           13           4            ↑
Algeria                         RBAS     10           13           -3           ↓
Angola                          RBA      16           17           -1           ↓
Argentina                       RBLAC    14           16           -2           ↓
Armenia                         RBEC     18           16           2            ↑
Azerbaijan                      RBEC     16           16           0            ↔
Bahrain                         RBAS     12           15           -3           ↓
Bangladesh                      RBAP     20           19           1            ↑
Barbados & OECS                 RBLAC    16           15           1            ↑
Belarus                         RBEC     19           17           2            ↑
Benin                           RBA      12           16           -4           ↓
Bhutan                          RBAP     20           15           5            ↑
Bolivia                         RBLAC    13           16           -3           ↓
Bosnia & Herzegovina            RBEC     18           12           6            ↑
Botswana                        RBA      17           12           5            ↑
Brazil                          RBLAC    20           19           1            ↑
Bulgaria                        RBEC     14           17           -3           ↓
Burkina Faso                    RBA      17           17           0            ↔
Burundi                         RBA      13           12           1            ↑
Cambodia                        RBAP     17           16           1            ↑
Cameroon                        RBA      17           17           0            ↔
Cape Verde                      RBA      14           15           -1           ↓
Central Africa Republic         RBA      14           13           1            ↑
Chad                            RBA      11           13           -2           ↓
Chile                           RBLAC    15           16           -1           ↓
China                           RBAP     17           19           -2           ↓
Colombia                        RBLAC    15           12           3            ↑
Comoros                         RBA      13           13           0            ↔
Congo                           RBA      10           15           -5           ↓
Costa Rica                      RBLAC    15           15           0            ↔
Cote d'Ivoire                   RBA      16           14           2            ↑
Croatia                         RBEC     16           17           -1           ↓
Cuba                            RBLAC    16           16           0            ↔
Cyprus                          RBEC     15           12           3            ↑
Djibouti                        RBAS     14           10           4            ↑
Dominican Republic              RBLAC    17           17           0            ↔
DPR Korea                       RBAP     9            11           -2           ↓
Democratic Republic of Congo    RBA      20           16           4            ↑
Ecuador                         RBLAC    20           19           1            ↑
Egypt                           RBAS     16           14           2            ↑
El Salvador                     RBLAC    18           18           0            ↔
Equatorial Guinea               RBA      12           10           2            ↑
Ethiopia                        RBA      16           15           1            ↑
Eritrea                         RBA      16           17           -1           ↓
Fiji                            RBAP     15           16           -1           ↓
Gabon                           RBA      16           17           -1           ↓
Gambia                          RBA      15           13           2            ↑
Georgia                         RBEC     17           17           0            ↔
Ghana                           RBA      16           16           0            ↔
Guatemala                       RBLAC    15           11           4            ↑
Guinea                          RBA      15           15           0            ↔
Guinea Bissau                   RBA      13           11           2            ↑
Guyana                          RBLAC    15           12           3            ↑
Haiti                           RBLAC    14           17           -3           ↓
Honduras                        RBLAC    16           13           3            ↑
India                           RBAP     18           19           -1           ↓
Indonesia                       RBAP     18           16           2            ↑
Iran                            RBAP     10           11           -1           ↓
Iraq                            RBAS     14           15           -1           ↓
Jamaica                         RBLAC    12           14           -2           ↓
Jordan                          RBAS     12           13           -1           ↓
Kazakhstan                      RBEC     14           13           1            ↑
Kenya                           RBA      15           16           -1           ↓
Kosovo                          RBEC     15           15           0            ↔
Kuwait                          RBAS     10           10           0            ↔
Kyrgyzstan                      RBEC     19           20           -1           ↓
Laos                            RBAP     13           11           2            ↑
Lebanon                         RBAS     15           13           -1           ↓
Lesotho                         RBA      16           13           3            ↑
Liberia                         RBA      12           11           1            ↑
Libya                           RBAS     10           9            1            ↑
Lithuania                       RBEC     –            10           –            –
Macedonia                       RBEC     15           14           1            ↑
Madagascar                      RBA      18           14           4            ↑
Malawi                          RBA      12           17           -5           ↓
Malaysia                        RBAP     15           14           1            ↑
Maldives                        RBAP     18           12           6            ↑
Mali                            RBA      14           16           -2           ↓
Mauritania                      RBA      19           15           4            ↑
Mauritius                       RBA      13           17           -4           ↓
Mexico                          RBLAC    20           20           0            ↔
Moldova                         RBEC     19           16           3            ↑
Mongolia                        RBAP     19           13           6            ↑
Montenegro                      RBEC     18           15           3            ↑
Morocco                         RBAS     18           17           1            ↑
Mozambique                      RBA      15           11           4            ↑
Myanmar                         RBAP     13           18           -5           ↓
Namibia                         RBA      19           10           9            ↑
Nepal                           RBAP     20           16           4            ↑
Nicaragua                       RBLAC    17           14           3            ↑
Niger                           RBA      17           15           2            ↑
Nigeria                         RBA      13           15           -2           ↓
Pakistan                        RBAP     18           12           6            ↑
Panama                          RBLAC    19           17           2            ↑
PAPP                            RBAS     13           13           0            ↔
Papua New Guinea                RBAP     18           17           1            ↑
Paraguay                        RBLAC    19           17           2            ↑
Peru                            RBLAC    17           17           0            ↔
Philippines                     RBAP     14           13           1            ↑
Romania                         RBEC     13           16           -3           ↓
Russia                          RBEC     18           10           8            ↑
Rwanda                          RBA      18           12           6            ↑
Samoa                           RBAP     12           9            3            ↑
Sao Tome and Principe           RBA      12           15           -3           ↓
Saudi Arabia                    RBAS     18           9            9            ↑
Senegal                         RBA      10           16           -6           ↓
Serbia                          RBEC     19           20           -1           ↓
Sierra Leone                    RBA      19           14           5            ↑
Somalia                         RBAS     15           17           -2           ↓
South Africa                    RBA      16           12           4            ↑
South Sudan                     RBA      18           20           -2           ↓
Sri Lanka                       RBAP     16           14           2            ↑
Sudan                           RBAS     20           20           0            ↔
Suriname                        RBLAC    10           13           -3           ↓
Swaziland                       RBA      12           16           -4           ↓
Syria                           RBAS     14           15           -1           ↓
Tajikistan                      RBEC     17           16           1            ↑
Tanzania (United Republic of)   RBA      16           11           5            ↑
Thailand                        RBAP     13           15           -2           ↓
Timor-Leste                     RBAP     13           11           2            ↑
Togo                            RBA      19           16           3            ↑
Trinidad & Tobago               RBLAC    11           14           -3           ↓
Tunisia                         RBAS     16           14           2            ↑
Turkey                          RBEC     17           15           2            ↑
Turkmenistan                    RBEC     12           13           -1           ↓
Uganda                          RBA      19           16           3            ↑
Ukraine                         RBEC     16           16           0            ↔
United Arab Emirates            RBAS     12           14           -2           ↓
Uruguay                         RBLAC    18           18           0            ↔
Uzbekistan                      RBEC     19           12           7            ↑
Venezuela                       RBLAC    15           15           0            ↔
Vietnam                         RBAP     15           17           -2           ↓
Yemen                           RBAS     16           14           2            ↑
Zambia                          RBA      18           14           4            ↑
Zimbabwe                        RBA      16           16           0            ↔


Annex 3 – Top ROARs 2011-2013 (scoring 18, 19 or 20 points)

List of 2011 top ROARs:

Azerbaijan 19
Bangladesh 19
Benin 20
Chile 19
Cote d'Ivoire 19
DRC 19
Eritrea 19
Madagascar 18
Myanmar 19
Serbia 18
Togo 18
Zambia 18

List of 2012 top ROARs:

Bangladesh 19
Brazil 19
China 19
Ecuador 19
El Salvador 18
India 19
Kyrgyzstan 20
Mexico 20
Myanmar 18
Serbia 20
South Sudan 20
Sudan 20
Uruguay 18

In purple, top ROARs for 2 years in a row.

List of 2013 top ROARs:

Armenia 18
Bangladesh 20
Belarus 19
Bhutan 20
Bosnia & Herzegovina 18
Brazil 20
Democratic Republic of Congo 20
Ecuador 20
El Salvador 18
India 18
Indonesia 18
Kyrgyzstan 19
Madagascar 18
Maldives 18
Mauritania 19
Mexico 20
Moldova 19
Mongolia 19
Montenegro 18
Morocco 18
Namibia 19
Nepal 20
Pakistan 18
Panama 19
Papua New Guinea 18
Paraguay 19
Russia 18
Rwanda 18
Saudi Arabia 18
Serbia 19
Sierra Leone 19
South Sudan 18
Sudan 20
Togo 19
Uganda 19
Uruguay 18
Uzbekistan 19
Zambia 18


Annex 4 - ROAR Quality Assurance Scoring Methodology

Results Focus (Quality scoring) – 6 points:

- Referring to achieved or intended significant results in Section A, Q2: "outline your office's key contributions to development change in 2012" (0 points if not at all, 1 point if somewhat or partially, and 2 points for fully).
- Addressing the outcome statement in the third question in Section C: "what has changed in the country under this outcome area?" (0 points if not at all, 1 point if somewhat or partially, and 2 points for fully).
- Explaining UNDP's actual or intended contribution to the outcome in 2011 in the fourth question in Section C: "what has UNDP contributed to this outcome level change?" (0 points if not at all, 1 point if somewhat or partially, and 2 points for fully).

Use of data and evidence (Quality scoring) – 6 points:

- Correctly entering appropriate data for outcome indicators (0 points if not at all, 1 point if somewhat or partially, and 2 points for fully).
- Data being utilized to provide an evidentiary basis for claims in Sections A and C (0 points if not at all, 1 point if somewhat or partially, and 2 points for fully).
- Use of evidence from evaluations (produced by UNDP or other agencies/partners) to support results or lessons learned reported in Sections A and C (0 points if not at all, 1 point if somewhat or partially, and 2 points for fully).

Other key issues (Compliance scoring) – 8 points:

- Completion of the mandatory questions for output 4 in full, as well as any other outputs if selected (0 points if not at all, 1 point if somewhat or partially, and 2 points for fully).
- Output target status entered via the IWP monitoring view for quarter 4 (0 points if not at all, 1 point if somewhat or partially, and 2 points for fully).
- Accurate and full reporting on Unit Key Results (Section D, parts A, B and C) (0 points if not at all, 1 point if somewhat or partially, and 2 points for fully).
- All other mandatory questions in the ROAR completed in full by the deadline (1 February 2013) (0 points if not at all, 1 point if somewhat or partially, and 2 points for fully).

Each ROAR is rated out of a maximum of 20 points in total (12 points for quality and 8 points for compliance).

BSC indicator:

- <9 = Red (needs urgent management attention)
- 9-13 = Amber (needs some management attention)
- >13 = Green (meets or exceeds the minimum standard expected)
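To make the aggregation concrete, the following is a minimal sketch (illustrative only; the criterion groupings follow the table above, but the example ratings are invented) of how the ten criteria combine into a total score:

```python
# A minimal sketch of the Annex 4 aggregation: each of the ten criteria is
# rated 0 (not at all), 1 (somewhat/partially) or 2 (fully); quality covers
# six criteria (12 points) and compliance four (8 points), for a maximum of 20.

ratings = {
    "results_focus": [2, 2, 1],      # three Results Focus criteria (example values)
    "data_evidence": [2, 1, 1],      # three Use of data and evidence criteria
    "compliance":    [2, 2, 2, 2],   # four compliance criteria
}

assert all(r in (0, 1, 2) for group in ratings.values() for r in group)
quality = sum(ratings["results_focus"]) + sum(ratings["data_evidence"])
compliance = sum(ratings["compliance"])
total = quality + compliance
print(f"quality {quality}/12, compliance {compliance}/8, total {total}/20")
```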


Annex 5 –List of Quality Assurors by region

Staff member                  Bureau   Unit                                   French   Spanish   New nomination
Aurelie Boukobza              HQ       OSG                                    Yes      No        Yes
Benoit Conti                  HQ       OSG                                    Yes      No        No
Gemma Wood                    HQ       OSG                                    No       No        No
Heriberto Tapia               HQ       OSG                                    No       Yes       No
Adrien Tigo                   RBA      Benin                                  Yes      No        Yes
Andson Nusne                  RBA      Zambia                                 No       No        No
Francois-Corneille Kedowide   RBA      RC-Addis                               Yes      No        No
Mamadou Ndaw                  RBA      DRC                                    Yes      No        No
Blerta Cela                   RBAP     Bangladesh                             Yes      Yes       Yes
Kamolmas Jaiyen               RBAP     APRC                                   No       No        No
Le Le Lan                     RBAP     Vietnam / DPRK                         No       No        No
Nadia Hamid                   RBAS     Sudan                                  No       No        Yes
Noha Rifaat                   RBAS     Egypt                                  No       No        No
Olfa Borsali Ben Hamida       RBAS     Tunisia                                Yes      No        No
Armine Hovhannisyan           RBEC     Armenia                                No       No        Yes
Daniel Varga                  RBEC     Serbia                                 No       No        No
Irina Goryunova               RBEC     Kazakhstan                             No       No        Yes
Lilia Ormonbekova             RBEC     Kyrgyzstan                             Yes      No        Yes
Zarina Abdulaeva              RBEC     Tajikistan                             No       No        Yes
Carla Lecaros                 RBLAC    Regional Hub Panama (El Salvador CO)   No       Yes       Yes
Juliana Santos                RBLAC    Brazil                                 No       No        Yes
Kenroy Roach                  RBLAC    RBLAC HQ                               No       No        No
Maria Rosa Moreno             RBLAC    Cuba                                   No       Yes       No
Marta Cozar                   RBLAC    Chile                                  No       Yes       Yes
Silvia Guzman                 RBLAC    Regional Hub Panama                    No       Yes       Yes


Annex 6 - ROAR Quality Assurance Scores by region and Country Office from 2011 to 2013

Figure 1: RBAS scores per Country Office, 2011-2013. (Addendum: read '15' for Lebanon 2013.)

Figure 2: RBAP scores per Country Office, 2011-2013.

Figure 3: RBEC scores per Country Office, 2011-2013. (Addendum: read '19' for Kyrgyzstan 2013.)

Figure 4: RBLAC scores per Country Office, 2011-2013.

Figure 5: RBA scores per Country Office, 2011-2013.