    Terms of Reference (ToR)

    NATIONAL INDIVIDUAL CONSULTANCY

    Final evaluation: “Joint Programme on Youth and Women Employment”

    Job Title: National Consultant for the Final Evaluation of “Youth and Women Employment Programme”

Category: Youth Employment
Duty Station: Kigali, Rwanda
Type of contract: National consultant
Expected starting date: Immediately
Duration of assignment: 35 days

    I. Background and context

Initiated in 2013, the One UN Joint Programme “Youth and Women Employment Programme” is a five-year programme ending in June 2018. It responds to the national priority of “Productivity and Youth Employment” defined in Vision 2020 and the second Economic Development and Poverty Reduction Strategy (EDPRS 2, 2013-2018), and is aligned to the programming cycles of the Government of Rwanda (GoR) and the UN through EDPRS 2 and the United Nations Development Assistance Plan (UNDAP 2013-2018) respectively. In particular, it is aligned to Outcomes 1, 2 and 4 of the first UNDAP result area, Inclusive Economic Transformation, and to the Productivity and Youth Employment thematic area of EDPRS 2.

The programme is built on five outcomes: enhancing national capacities to promote employment-intensive growth and to mainstream youth employment in programmes and budgets; building the skills and competences of youth and women for employability and enterprise competitiveness; enhancing job creation and enterprise development through entrepreneurship development, access to markets and inclusive financial services; promoting labour market information systems for youth economic empowerment opportunities; and building the coordination, management and oversight of the programme.

The five programme outcomes are further aligned to the three strategic objectives of the National Employment Programme (NEP): creating sufficient jobs that are adequately remunerative and sustainable across the economy; equipping the workforce with the vital skills and attitudes for increased productivity needed for private sector growth; and providing a national framework for coordinating all employment and related initiatives. They are also aligned to the four NEP pillars: Skills Development; Entrepreneurship and Business Development; Labour Market Intervention; and Coordination and Monitoring & Evaluation.

The Joint Programme brought together 12 sister UN agencies, selected on the basis of their comparative advantages (ILO, UNDP, UNCDF, UNECA, FAO, UN-WOMEN, UN-HABITAT, UNIDO, UNV, UNCTAD, UNESCO and ITC), under the coordination of ILO and UNDP, to support 7 ministries (MYICT, MINECOFIN, MINICOM, MIGEPROF, MIFOTRA, MINEDUC, MINAGRI), the Private Sector Federation, the City of Kigali, WDA and BDF. The Joint Programme was built on the following five outcomes:

✓ National capacities enhanced to promote employment-intensive growth and mainstream youth and women employment in programmes and budgets;

✓ Youth and women skills and competences enhanced for employability and enterprise competitiveness;

✓ Job creation and enterprise development enhanced through entrepreneurship development and inclusive financial services;

✓ Labour market information systems enhanced for youth economic empowerment opportunities; and

✓ Programme coordination, management and oversight strengthened.

Within this Joint Programme, UNDP provided technical and financial support to the Ministry of Youth and ICT (now Ministry of Youth) for the implementation of the YouthConnekt initiative. Covering youth employment, the role of youth in national dialogue, and innovation, with a strong gender and technology focus, YouthConnekt is a successful multi-faceted programme initiated in 2012 by the Government of Rwanda through the Ministry of Youth and ICT (MYICT) in partnership with UNDP [1]. The platform seeks to empower young people and connect them with leaders from the public sector, the private sector and civil society to forge partnerships that provide employment and entrepreneurship opportunities to youth (such as internships, courses, mentorships, job advertisements, job fairs, coaching, business incubator centres, banks, as well as other platforms). It connects young people to role models, resources, knowledge and skills development [2]. By combining elements of skills development, entrepreneurship, access to jobs and finance, awareness raising on issues related to youth development, and promotion of youth citizenship through community work and inclusion in local and national policy dialogue, the YouthConnekt initiative has proven to be a very innovative and effective way of facilitating the demographic transition and creating highly productive off-farm jobs.

[1] https://www.youtube.com/watch?v=SdT6Q4zsNQg
[2] http://youthconnekt.rw/

    II. Evaluation Purpose

The purpose of the Final Evaluation (FE) is to examine the results, achievements and constraints of the UNDP-funded activities of the Joint Programme on Youth and Women Employment. The project was initiated in 2013 and is planned to end in June 2018. The findings, recommendations and lessons learned from its implementation will inform the upcoming project cycle. The evaluation also aims at assessing UNDP’s contribution to the achievement of UNDAP Outcome 1.

The consultant is expected to identify the strengths and weaknesses of the project design and implementation, and to make recommendations on the overall design and orientation of the project and on the work plan for the next programming cycle, after evaluating the adequacy, efficiency and effectiveness of implementation and assessing the achievement of the project outputs and outcomes. The evaluation will also assess early signs of project success or failure and prompt adjustments. The results and recommendations of the evaluation will therefore help UNDP and MINIYOUTH to document lessons learnt and best practices for the next project cycle.

    III. Evaluation scope and objectives

    Objectives

In line with the project’s objectives, UNDP Rwanda, in collaboration with the project’s implementing partner (MINIYOUTH), plans to conduct a final evaluation of the UNDP-funded interventions of the Joint Programme on Youth and Women Employment. The evaluation aims to assess the achievement of the outputs and outcomes. The main objectives of the final evaluation are the following:

    • Assess the Programme’s implementation strategy

    • Assess the relevance, efficiency, effectiveness, sustainability, and impact of the interventions

    • Assess the Programme’s processes, including budgetary efficiency

    • Assess the extent to which planned activities and outputs have been achieved

    • Identify the main achievements and impacts of the programme’s activities

    • Identify the underlying causes and issues of non-achievement of some targets

    • Document lessons learnt

    • Make recommendations for the next project cycle

Scope

The evaluation covers the implementation period of the project, from July 2013 to February 2018. It covers the specific UNDP-funded interventions of the One UN Joint Programme on Youth and Women Employment. The geographic coverage of the evaluation is the whole country (Rwanda). The scope of the final evaluation covers all activities undertaken in the framework of the project. This refers to:

• Planned outputs of the project compared to actual outputs, and the actual results as a contribution to attaining the project objectives

• Problems and necessary corrections and adjustments, to document lessons learnt

• Efficiency of project management, including the delivery of outputs and activities in terms of quality, quantity, timeliness and cost efficiency

• Likely outcomes and impact of the project in relation to the specified goals and objectives of the programme

    The evaluation comprises the following elements:

(i) An assessment of whether the programme design was clear, logical and commensurate with the time and resources available;

(ii) An evaluation of the project’s achievement of its overall objectives;

(iii) An evaluation of the programme’s performance in relation to the indicators, assumptions and risks specified in the logical framework matrix and the Project Document; an assessment of the scope, quality and significance of the programme outputs produced to date in relation to expected results; and identification of any programmatic and financial variances and/or adjustments made during the duration of the project, with an assessment of their conformity with decisions of the PSC and their appropriateness in terms of the overall objectives of the programme;

(iv) An evaluation of the programme’s contribution to the achievement of UNDAP’s outcome and outputs and the national development agenda (EDPRS 2, National Youth Policy);

(v) Identification and, to the extent possible, quantification of any additional outputs and outcomes beyond those specified in the Programme Document;

(vi) An evaluation of project coordination, management and administration, with specific reference to:

a. Organizational/institutional arrangements for collaboration among the different stakeholders involved in project arrangements and execution;

b. The effectiveness of the monitoring and evaluation framework/mechanisms used by MINIYOUTH in monitoring progress in project implementation on a day-to-day basis;

c. Administrative, operational and/or technical challenges and constraints that influenced the effective implementation of the project;

d. An assessment of the functionality of the institutional structure established and the role of the Project Steering Committee (PSC);

e. Financial management of the project, including the balance between expenditures on administrative and overhead charges and those on the achievement of substantive outputs;

(vii) A prognosis of the degree to which the overall objectives and expected outcomes of the programme were met;

(viii) An assessment of progress towards sustainability and replication of programme activities;

(ix) An assessment of the extent to which the design, implementation and results of the programme have incorporated a gender equality perspective and a human rights-based approach [3];

(x) An assessment of the extent to which the design, implementation and results of the project have incorporated environmental sustainability concerns, with recommendations accordingly;

(xi) Lessons learned during programme implementation; and

(xii) An evaluation of the programme’s exit strategy in terms of quality and clarity.

    IV. Evaluation

    Evaluation criteria

    The programme will be evaluated on the basis of the DAC evaluation criteria:

• Relevance: measures whether the programme addresses an important development goal and whether its objectives are still valid.

• Effectiveness: measures whether the programme’s activities achieve its goals.

• Efficiency: measures cost-effectiveness, i.e. the economic use of resources to achieve the desired results.

• Sustainability: measures whether the benefits of the programme are likely to continue after donor funding has been withdrawn. The programme needs to be environmentally as well as financially sustainable.

• Impact of interventions: measures the positive and negative changes produced by the programme, directly or indirectly, intended or unintended.

    Evaluation Questions

More specifically, the final evaluation aims at addressing, but is not limited to, the following questions for each evaluation criterion:

[3] For more guidance on this, the consultants will be requested to use UNEG’s guidance “Integrating Human Rights and Gender Equality in Evaluation”, http://uneval.org/document/detail/1616


    Relevance

    • Where is this Programme being implemented? How was the Programme site selected? What has been the main focus of the programme implementation so far? Who are the main beneficiaries? How were they selected? How was the programme aligned to the national development strategy (EDPRS 2, Vision 2020)?

    • The extent to which the programme activities are suited to the priorities and policies of the target group, recipient and donor.

    • To what extent did the objectives remain valid throughout the programme duration?

    • Were the activities and outputs of the programme consistent with the overall goal and the attainment of its objectives?

• Were the activities and outputs of the programme consistent with the intended impacts and effects?

    Effectiveness

    • To what extent were the objectives achieved?

    • What were the major factors influencing the achievement or non-achievement of the objectives?

    • Did the activities contribute to the achievement of the planned outputs?

    • Have the different outputs been achieved?

    • What progress toward the outcomes has been made?

• To what extent have the design, implementation and results of the programme incorporated a gender equality perspective and a human rights-based approach? What should be done to improve gender and human rights mainstreaming?

    • What has been the result of the capacity building/trainings interventions? Were qualified trainers available to conduct training?

    • How did UNDP support the achievement of programme outcome and outputs?

    • How was the partnership strategy conducted by UNDP? Has UNDP partnership strategy been appropriate and effective? What factors contributed to effectiveness or ineffectiveness? What were the synergies with other programmes?

    Efficiency

    • Were activities cost-efficient?

    • Were objectives achieved on time?

    • Was the programme implemented in the most efficient way compared to alternatives?

    • What was the original budget for the Programme? How have the Programme funds been spent? Were the funds spent as originally budgeted?

    • Are there any management challenges affecting efficient implementation of the Programme? What are they and how are they being addressed?


    Sustainability

• To what extent will the benefits of the programme continue after donor funding stops?

• What were the major factors which influenced the achievement or non-achievement of sustainability of the programme?

    • Does the programme have a clear exit strategy?

• To what extent have the design, implementation and results of the programme incorporated environmental sustainability? What should be done to improve environmental sustainability mainstreaming?

    Impact of interventions

    • What are the stated goals of the Programme? To what extent are these goals shared by stakeholders? What are the primary activities of the programme and expected outputs? To what extent have the activities progressed? How did the programme contribute to the achievement of UNDAP outcomes and outputs?

    • What has happened as a result of the programme?

• What have been the main impacts of the programme on the Youth Employment framework in Rwanda?

    • How many people have been affected?

• Has the programme contributed, or is it likely to contribute, to long-term social, economic, technical and environmental changes for individuals, communities and institutions related to the programme?

    • What difference has the programme made to beneficiaries?

    V. Methodology

General guidance on evaluation methodology can be found in the UNDP Handbook on Planning, Monitoring and Evaluating for Development Results, the UNDP Guidelines for Outcome Evaluators, and UNDP Outcome-Level Evaluation: A Companion Guide to the Handbook on Monitoring and Evaluating for Development Results. UNDP’s Evaluation Policy provides information about the role and use of evaluation within the M&E architecture of the organization. The final decision on the specific design and methods for the evaluation will emerge from consultation among programme staff, the evaluators and key stakeholders, based on the inception report prepared by the evaluators, about what is appropriate and feasible to meet the evaluation purpose and objectives and answer the evaluation questions, given the limitations of budget, time and data.

The evaluation should use a mixed-methods approach, drawing on primary and secondary, quantitative and qualitative data to come up with an overall assessment backed by clear evidence. Data will be collected through surveys of all relevant stakeholders (national and local government institutions, development partners, civil society organization partners, the private sector, beneficiaries, etc.) and through focus group discussions. Further data on the programme indicators (RRF data) will be used by the evaluation to assess the programme’s progress and achievements. The evaluation methodology will include the following:

(i) Desk review of the programme document and monitoring reports (such as minutes of the LPAC meeting, minutes of Steering Committee and other relevant meetings, programme annual implementation reports, quarterly progress reports, and other internal documents including consultant and financial reports);

(ii) Review of specific products produced so far, including datasets, management and action plans, publications and other material and reports;

(iii) Interviews with the Head of SPIU, the Programme Manager, the ICT Innovations and YouthConnekt Specialist, the Technical Assistant/Project Coordinator and the Administrative Assistant in MINIYOUTH;

(iv) Interviews with the Head of the Poverty and Environment Unit and the UNDP Programme Analyst;

(v) Interviews with Government partners (National Youth Council, some districts and other local authorities), UN agencies (UNV, UNFPA) and other relevant stakeholders involved in the implementation of YouthConnekt (including DOT Rwanda, Akazi Kanoze, SNV, Imbuto Foundation, World Vision, etc.); and

(vi) Focus group discussions with all stakeholders.

    VI. Deliverables (Evaluation Products)

    This section presents the key evaluation products the evaluation team will be accountable for producing. The deliverables are the following:

• Evaluation inception report—An inception report should be prepared by the evaluators before going into the full-fledged data collection exercise. It should detail the evaluators’ understanding of what is being evaluated and why, showing how each evaluation question will be answered by way of proposed methods, proposed sources of data and data collection procedures. The inception report should include a proposed schedule of tasks, activities and deliverables, designating a team member with the lead responsibility for each task or product. The inception report provides the programme unit and the evaluators with an opportunity to verify that they share the same understanding about the evaluation and clarify any misunderstanding at the outset. The inception report will be discussed with and approved by UNDP and MINIYOUTH one week after the signing of the contract.

    • Draft evaluation report—Submission of draft evaluation report to UNDP for comments and inputs. The programme unit and key stakeholders in the evaluation will then review the draft evaluation report to ensure that the evaluation covers the scope and meets the required quality criteria.

    • Presentation of Draft evaluation report (PPT presentation) to the Technical Committee for inputs, comments and approval.


    • Final evaluation report. The final report should be completed 1 week after receipt of consolidated comments from stakeholders and submitted to UNDP.

    VII. Evaluation Team Composition and required competencies

    The Individual consultant should have the following skills/competencies and characteristics:

• At least a master’s degree in Public Policy and Management, Public Administration, Development Studies, International Development, Economics, and/or Management and Business;

• At least 7 years of accumulated experience in project/programme evaluation;

• At least 10 years of accumulated experience in programme management support, programme/project formulation, monitoring and evaluation, and RBM implementation;

    • Proven expertise, knowledge and experience in the field of Youth Employment/Empowerment/ entrepreneurship and/or business development;

    • Good understanding of gender equality, human-rights based approach and environmental sustainability concepts;

    • Strong interpersonal and managerial skills, ability to work with people from different backgrounds and evidence of delivering good quality evaluation and research products in a timely manner

    • Proven understanding of key elements of result-based programme management in International development cooperation

• Fluent in English; working knowledge of French would be an added advantage;

    • Excellent written and verbal communication skills in English

    VIII. How to apply

    Candidates should apply by presenting the following documents:

(i) Letter of Confirmation of Interest and Availability, using the template provided by UNDP;

(ii) Personal CV or P11, indicating all past experience from similar projects as well as the contact details (e-mail and telephone number) of the candidate and at least three (3) professional references;

(iii) Brief description of why the individual considers him/herself as the most suitable for the assignment, and a methodology, if applicable, on how he/she will approach and complete the assignment;

(iv) Financial proposal that indicates the all-inclusive fixed total contract price, supported by a breakdown of costs, as per the template provided.

All interested applicants should submit the above documents to UNDP CO Rwanda by emailing [email protected]. The application deadline is 15 February 2018.

    IX. Evaluation Ethics

The evaluation will be conducted in accordance with the principles outlined in the UNEG ‘Ethical Guidelines for Evaluation’ [4]. The critical issues the evaluator must address in the design and implementation of the evaluation include evaluation ethics and procedures to safeguard the rights and confidentiality of information providers, for example: measures to ensure compliance with legal codes governing areas such as provisions to collect and report data, particularly permissions needed to interview or obtain information about children and young people; provisions to store and maintain the security of collected information; and protocols to ensure anonymity and confidentiality.

[4] UNEG, ‘Ethical Guidelines for Evaluation’. Available at www.uneval.org/ethicalguidelines

    X. Implementation Arrangements

This section describes the organization and management structure for the evaluation and defines the roles, key responsibilities and lines of authority of all parties involved in the evaluation process. Implementation arrangements are intended to clarify expectations, eliminate ambiguities, and facilitate an efficient and effective evaluation process.

UNDP

UNDP is responsible for the management of this final evaluation and will contract an independent consultant to conduct the evaluation on behalf of the Government of Rwanda. UNDP will be the focal point for the evaluation and will facilitate the logistical requirements and provide technical assistance during all phases of the evaluation process, including setting up interviews and field visits and making payments to the consultant.

UNDP Programme Analyst

Day-to-day management of the evaluation team will be provided by the UNDP Programme Analyst overseeing the project. He or she will ensure that all issues pertaining to the contract with the evaluation team, including payments, are completed on schedule, and will be responsible for facilitating the work of the evaluation team. He or she will provide all documentation to the team for the desk review, set up interview appointments and field visits, and convene focus group meetings.

Evaluation Management Team

An Evaluation Management Team led by UNDP and composed of a representative of MINIYOUTH, the UNDP Head of Environment Unit, the Programme Analyst and the Chief of the UNDP Management Support Unit will oversee the conduct of the evaluation at the technical level and will be responsible for providing guidance and direction for the evaluation process and inputs and comments on the draft evaluation report, as well as for approving the final document. The team will provide quality assurance and guidance to the evaluation to ensure that it meets the UNEG evaluation quality criteria. The technical committee will oversee the implementation of the agreed schedule of consultation activities, ensure wide stakeholder consultations, be in charge of verifying all facts in the report, and oversee the production of the final report and the drafting and implementation of follow-up actions.

    XI. Time Frame for the evaluation process

    The evaluation will be conducted in February-March 2018 for an estimated 35 working days. The evaluation will include the following phases with their respective time frame.

• Desk review and inception report (5 days): Desk review conducted; briefings of evaluators; preparation of an inception report detailing the evaluators’ understanding of what is being evaluated and why, showing how each evaluation question will be answered by way of proposed methods, proposed sources of data and data collection procedures, and including a proposed schedule of tasks, activities and deliverables, designating a team member with the lead responsibility for each task or product.

• Stakeholder consultations and interviews (15 days): The evaluators will consult with all relevant stakeholders and conduct a series of interviews, focus group discussions and field visits in order to collect the required data.

• Analysis of data and drafting of report (8 days): Once the data is collected, the evaluators will analyse it and draft the evaluation report.

• Presentation of draft evaluation report to stakeholder meeting (2 days): Once the draft final evaluation report is submitted, it will be presented to all stakeholders for review. The comments shared by the stakeholders will be incorporated into the final evaluation report.

• Final report (5 days): The evaluators will revise the final evaluation report based on the comments and inputs provided by all stakeholders and submit the final report to UNDP.

Total number of working days: 35 days


    XII. Price Proposal and Schedule of Payments

The consultancy fee will be paid as a lump sum (inclusive of all expenses related to the consultancy) and will be fixed regardless of changes in the cost components of the consultancy. The consultancy fee will be paid upon completion of the following milestones:

    • 30% after presentation and adoption of the inception report

    • 30% after presentation and approval of the draft report

    • 40% after the approval of the final report

    XIII. Annexes

    Annex 1: Key Stakeholders and partners

Stakeholder | Intervention area | Contact person
Ministry of Youth | GoR | Minister
Ministry of Youth | GoR | Permanent Secretary
Ministry of Youth | GoR | Head of SPIU
Ministry of Youth | GoR | YouthConnekt Specialist
UNDP | UN | One UN Resident Coordinator, Rwanda
UNDP | UN | UNDP Rwanda Country Director
UNDP | UN | Head of Poverty Reduction and Environment Unit, UNDP
UNDP | UN | Programme Analyst, UNDP
UNFPA | UN | UNFPA Rwanda Representative
UNV | UN | Programme Specialist, UNV Rwanda
DOT Rwanda | Youth empowerment | Country Director
DOT Rwanda | Youth empowerment | Country Program Manager
Akazi Kanoze Access/EDC | Youth economic empowerment | Marketing and Communication Coordinator
World Vision | Youth empowerment/livelihood | Country Director
Girls in ICT | Girl awareness raising and capacity building in ICT | President
Girls in ICT | Girl awareness raising and capacity building in ICT | Senior Software Engineer at Rwanda Development Board (RDB)
Imbuto Foundation | Youth empowerment | Director General
Imbuto Foundation | Youth empowerment | Project Officer / Youth Empowerment & Mentorship
SNV Rwanda | Poverty reduction/livelihood | Country Director
National Youth Council | Representative | Executive Secretary
YouthConnekt bootcamp awardees | Beneficiaries | TBD

Annex 2: Documents to be consulted

The list below details the important documents that the evaluator should read at the outset of the evaluation and before finalizing the evaluation design and the inception report. The list might include other relevant documents identified during the inception phase and the consultation process.

    Documents

DRG1 Joint Programme on Youth and Women Employment

Booklet YouthConnekt Documentation, 2017

Annual reports, “Youth and Women Employment Programme”, 2013-2017

Quarterly progress reports, “Youth and Women Employment Programme”, 2013-2018

Project Steering Committee meeting minutes, “Youth and Women Employment Programme”, 2013-2017

National Youth Policy, Rwanda

Republic of Rwanda, NST 2018

Republic of Rwanda, EDPRS 2 (2013-2018)

Republic of Rwanda, Vision 2020

United Nations Rwanda, One UN Programme Rwanda, CCPD 2013-2018

United Nations Rwanda, UNDAP 2013-2018

UNEG, ‘Ethical Guidelines for Evaluation’

UNEG, Guidance on ‘Integrating Human Rights and Gender Equality in Evaluation’

UNDP, Handbook on Planning, Monitoring and Evaluating for Development Results (2009)

UNEG, ‘Standards for Evaluation in the UN System’ (2005)

Addendum June 2011 Evaluation: Updated guidance on evaluation in the Handbook on Planning, Monitoring and Evaluating for Development Results (2009)

Annex 3: Selection criteria

Submissions will be evaluated against the criteria stated below, using the best value for money approach: the technical proposal carries a weight of 70% and the financial proposal a weight of 30%. The breakdown of the technical proposal, which will be brought to 70%, is as follows (an illustrative example of how the combined score might be computed is given after the table):

Criteria (technical) | Weight | Max. points
At least a master’s degree in Public Policy and Management, Public Administration, Development Studies, International Development, Economics, and/or Management and Business | 10% | 10
At least 7 years of accumulated experience in project/programme evaluation | 15% | 15
At least 10 years of experience in programme management support, including formulation, monitoring and evaluation and RBM implementation | 15% | 15
Proven expertise, knowledge and experience in the field of youth employment/empowerment, entrepreneurship and business development initiatives | 20% | 20
Overall methodology (clear demonstration of evaluation methodology and understanding of the ToR) | 30% | 30
Fluent in English (written and verbal skills) | 10% | 10
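The ToR does not spell out the formula for combining the technical and financial scores. The sketch below is illustrative only; it assumes a common practice for this kind of 70/30 weighting, in which only offers reaching an assumed technical pass mark (70 points here) proceed to financial evaluation, the lowest-priced qualifying offer receives the full financial points, other offers are prorated against it, and the combined score is the weighted sum. All names and figures in the code (Offer, combined_scores, the pass mark, the sample prices) are hypothetical and not taken from this ToR.

```python
# Illustrative sketch of a 70/30 "best value for money" combination.
# Assumptions (not stated in the ToR): a 70-point technical pass mark and
# pro-rata financial scoring against the lowest qualifying price.
from dataclasses import dataclass


@dataclass
class Offer:
    name: str
    technical_points: float  # technical score out of 100, per the criteria table
    price: float             # all-inclusive lump-sum financial proposal


def combined_scores(offers, tech_weight=0.70, fin_weight=0.30, pass_mark=70.0):
    """Return {offer name: combined score} for technically qualifying offers."""
    qualifying = [o for o in offers if o.technical_points >= pass_mark]
    if not qualifying:
        return {}
    lowest_price = min(o.price for o in qualifying)
    results = {}
    for o in qualifying:
        tech_score = tech_weight * o.technical_points            # max 70
        fin_score = fin_weight * 100 * (lowest_price / o.price)  # max 30
        results[o.name] = round(tech_score + fin_score, 2)
    return results


if __name__ == "__main__":
    offers = [
        Offer("Consultant A", technical_points=85, price=10_000),
        Offer("Consultant B", technical_points=75, price=8_500),
        Offer("Consultant C", technical_points=60, price=7_000),  # below the pass mark
    ]
    print(combined_scores(offers))  # {'Consultant A': 85.0, 'Consultant B': 82.5}
```

Under these assumptions, the cheapest technically qualifying offer always receives the full 30 financial points, so the ranking is driven primarily by the technical score.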


Annex 4: Sample evaluation matrix

The evaluation matrix is a tool that evaluators need to create as a map and reference in planning and conducting an evaluation. It serves as a useful tool for summarizing and visually presenting the evaluation design and methodology for discussions with stakeholders. It details the evaluation questions that the evaluation will answer, data sources, data collection and analysis tools or methods appropriate for each data source, and the standard or measure by which each question will be evaluated. The draft sample evaluation matrix to be used by the evaluators is presented below.

Annex 5: Required format for the evaluation report

The final report must include, but not necessarily be limited to, the following elements outlined in the quality criteria for evaluation reports.

Title and opening pages—Should provide the following basic information:

    • Name of the evaluation intervention

    • Time frame of the evaluation and date of the report

    • Countries of the evaluation intervention

    • Names and organizations of evaluators

    • Name of the organization commissioning the evaluation

    • Acknowledgements

Table of contents—Should always include boxes, figures, tables and annexes with page references.

List of acronyms and abbreviations

Executive summary—A stand-alone section of two to three pages that should:

• Briefly describe the intervention (the project(s), programme(s), policies or other interventions) that was evaluated.

• Explain the purpose and objectives of the evaluation, including the audience for the evaluation and the intended uses.

• Describe key aspects of the evaluation approach and methods.

• Summarize principal findings, conclusions, and recommendations.


    Introduction—Should:

    • Explain why the evaluation was conducted (the purpose), why the intervention is being evaluated at this point in time, and why it addressed the questions it did.

    • Identify the primary audience or users of the evaluation, what they wanted to learn from the evaluation and why, and how they are expected to use the evaluation results.

    • Identify the intervention (the project(s) programme(s), policies or other interventions) that was evaluated—see upcoming section on intervention.

    • Acquaint the reader with the structure and contents of the report and how the information contained in the report will meet the purposes of the evaluation and satisfy the information needs of the report’s intended users.

    Description of the intervention—Provides the basis for report users to understand the logic and assess the merits of the evaluation methodology and understand the applicability of the evaluation results. The description needs to provide sufficient detail for the report user to derive meaning from the evaluation. The description should:

    • Describe what is being evaluated, who seeks to benefit, and the problem or issue it seeks to address.

    • Explain the expected results map or results framework, implementation strategies, and the key assumptions underlying the strategy.

    • Link the intervention to national priorities, UNDAF priorities, corporate multiyear funding frameworks or strategic plan goals, or other programme or country specific plans and goals.

    • Identify the phase in the implementation of the intervention and any significant changes (e.g., plans, strategies, logical frameworks) that have occurred over time, and explain the implications of those changes for the evaluation.

    • Identify and describe the key partners involved in the implementation and their roles.

    • Describe the scale of the intervention, such as the number of components (e.g., phases of a project) and the size of the target population for each component.

    • Indicate the total resources, including human resources and budgets.

    • Describe the context of the social, political, economic and institutional factors, and the geographical landscape within which the intervention operates and explain the effects (challenges and opportunities) those factors present for its implementation and outcomes.

    • Point out design weaknesses (e.g., intervention logic) or other implementation constraints (e.g., resource limitations).

    Evaluation scope and objectives—The report should provide a clear explanation of the evaluation’s scope, primary objectives and main questions.

    • Evaluation scope: The report should define the parameters of the evaluation, for example, the time period, the segments of the target population included, the geographic area included, and which components, outputs or outcomes were and were not assessed.

    • Evaluation objectives: The report should spell out the types of decisions evaluation users will make, the issues they will need to consider in making those decisions, and what the evaluation will need to achieve to contribute to those decisions.


    • Evaluation criteria: The report should define the evaluation criteria or performance standards used. The report should explain the rationale for selecting the particular criteria used in the evaluation.

    • Evaluation questions: Evaluation questions define the information that the evaluation will generate. The report should detail the main evaluation questions addressed by the evaluation and explain how the answers to these questions address the information needs of users.

    Evaluation approach and methods—The evaluation report should describe in detail the selected methodological approaches, methods and analysis; the rationale for their selection; and how, within the constraints of time and money, the approaches and methods employed yielded data that helped answer the evaluation questions and achieved the evaluation purposes. The description should help the report users judge the merits of the methods used in the evaluation and the credibility of the findings, conclusions and recommendations. The description on methodology should include discussion of each of the following:

    • Data sources—The sources of information (documents reviewed and stakeholders), the rationale for their selection and how the information obtained addressed the evaluation questions.

    • Sample and sampling frame—If a sample was used: the sample size and characteristics; the sample selection criteria (e.g., single women, under 45); the process for selecting the sample (e.g., random, purposive); if applicable, how comparison and treatment groups were assigned; and the extent to which the sample is representative of the entire target population, including discussion of the limitations of the sample for generalizing results.

    • Data collection procedures and instruments—Methods or procedures used to collect data, including discussion of data collection instruments (e.g., interview protocols), their appropriateness for the data source and evidence of their reliability and validity.

    • Performance standards—The standard or measure that will be used to evaluate performance relative to the evaluation questions (e.g., national or regional indicators, rating scales).

    • Stakeholder engagement—Stakeholders’ engagement in the evaluation and how the level of involvement contributed to the credibility of the evaluation and the results.

    • Ethical considerations—The measures taken to protect the rights and confidentiality of informants (see UNEG ‘Ethical Guidelines for Evaluators’ for more information).

    • Background information on evaluators—The composition of the evaluation team, the background and skills of team members and the appropriateness of the technical skill mix, gender balance and geographical representation for the evaluation.

    • Major limitations of the methodology—Major limitations of the methodology should be identified and openly discussed as to their implications for evaluation, as well as steps taken to mitigate those limitations.

    Data analysis—The report should describe the procedures used to analyse the data collected to answer the evaluation questions. It should detail the various steps and stages of analysis that were carried out, including the steps to confirm the accuracy of data and the results. The report also should discuss the appropriateness of the analysis to the evaluation questions. Potential weaknesses in the data analysis and gaps or limitations of the data should be discussed, including their possible influence on the way findings may be interpreted and conclusions drawn.


    Findings and conclusions—The report should present the evaluation findings based on the analysis and conclusions drawn from the findings.

    • Findings—Should be presented as statements of fact that are based on analysis of the data. They should be structured around the evaluation criteria and questions so that report users can readily make the connection between what was asked and what was found. Variances between planned and actual results should be explained, as well as factors affecting the achievement of intended results. Assumptions or risks in the project or programme design that subsequently affected implementation should be discussed.

    • Conclusions—Should be comprehensive and balanced, and highlight the strengths, weaknesses and outcomes of the intervention. They should be well substantiated by the evidence and logically connected to evaluation findings. They should respond to key evaluation questions and provide insights into the identification of and/or solutions to important problems or issues pertinent to the decision making of intended users.

Recommendations—The report should provide practical, feasible recommendations directed to the intended users of the report about what actions to take or decisions to make. The recommendations should be specifically supported by the evidence and linked to the findings and conclusions around key questions addressed by the evaluation. They should address the sustainability of the initiative and comment on the adequacy of the project exit strategy, if applicable.

Lessons learned—As appropriate, the report should include discussion of lessons learned from the evaluation, that is, new knowledge gained from the particular circumstances (intervention, context, outcomes, even evaluation methods) that is applicable to a similar context. Lessons should be concise and based on specific evidence presented in the report.

Report annexes—Suggested annexes should include the following to provide the report user with supplemental background and methodological details that enhance the credibility of the report:

    • ToR for the evaluation

    • Additional methodology-related documentation, such as the evaluation matrix and data collection instruments (questionnaires, interview guides, observation protocols, etc.) as appropriate

    • List of individuals or groups interviewed or consulted and sites visited

    • List of supporting documents reviewed

    • Project or programme results map or results framework

    • Summary tables of findings, such as tables displaying progress towards outputs, targets, and goals relative to established indicators

    • Short biographies of the evaluators and justification of team composition

• Code of conduct signed by evaluators

Annex 6: Code of Conduct for Evaluators in the UN System (see link)
