
ROUTINE DATA QUALITY ASSESSMENT TOOL (RDQA)

GUIDELINES FOR IMPLEMENTATION

DRAFT

SEPTEMBER 28, 2007

The Global Fund to Fight AIDS, Tuberculosis and Malaria, Office of the Global AIDS Coordinator, U.S. President’s Emergency Plan for AIDS Relief,

USAID, WHO, UNAIDS, MEASURE Evaluation, others?


Acknowledgments

This tool was developed with input from a number of people from various organizations. Those most directly involved in development of the tool included Karen Hardee from MEASURE Evaluation/JSI, Ronald Tran Ba Huy of The Global Fund to Fight AIDS, Tuberculosis and Malaria, Cyril Pervilhac and Yves Souteyrand from the World Health Organization, and David Boone from MEASURE Evaluation/JSI. This tool benefited from feedback on the DQA for Auditing from a number of participants in workshops and meetings held in Nigeria, Ghana, Senegal, South Africa, Vietnam, Switzerland and the U.S.

For more information, please contact:

Karen Hardee or David Boone, MEASURE Evaluation/JSI ([email protected], [email protected])
Ronald Tran Ba Huy, The Global Fund ([email protected])
Cyril Pervilhac, WHO ([email protected])


Contents

1. Background and Objectives
2. The Link Between the M&E System and Data Quality
3. Methodology
4. Implementation
5. Ethical Considerations


1- BACKGROUND AND OBJECTIVES

Programs are increasingly expected to provide high quality data to national governments and donors to measure success in reaching ambitious goals related to diseases such as Acquired Immunodeficiency Syndrome (AIDS), Tuberculosis (TB) and Malaria. In the spirit of the M&E component of the “Three Ones”[1], the “Stop TB Strategy” and Roll Back Malaria, a number of multilateral and bilateral organizations have collaborated to develop tools to help Programs and projects improve the data management and reporting aspects of their M&E systems and to improve overall data quality.[2] The Routine Data Quality Assessment (RDQA) Tool, a greatly simplified version of the DQA for Auditing, is presented here. The RDQA provides a methodology to assess the ability of Programs/projects to collect and report quality data, to verify a sample of reported data, and to develop an action plan based on the findings. The RDQA can be used flexibly for a number of purposes, including initial assessment of M&E capacity and routine or periodic monitoring. For example, the RDQA can be used by Programs and/or partners as well as individual project M&E units for monitoring and evaluation, by donors who want to spot-check data quality problems, or even by individual program sites that want to verify that their reporting is accurate and timely. In addition, the RDQA can help programs/projects prepare for external data quality audits.[3]

2- LINK BETWEEN THE M&E SYSTEM AND DATA QUALITY

The RDQA is based on a multidimensional concept of data quality: data flow through a Program/project M&E system that operates at three (or more) levels, and seven dimensions of data quality can pose challenges at each level of the system. Further, the RDQA identifies seven functional areas that need to be addressed to strengthen the data management and reporting system and improve the quality of the data the system produces.

[1] The “Three Ones” principles were established in 2004 at a meeting co-hosted by UNAIDS, the United Kingdom and the United States at which key donors reaffirmed their commitment to strengthening national AIDS responses led by the affected countries themselves. The Three Ones include: one agreed HIV/AIDS Action Framework that provides the basis for coordinating the work of all partners; one National AIDS Coordinating Authority, with a broad-based multisectoral mandate; and one agreed country-level Monitoring and Evaluation System (http://www.unaids.org/en/Coordination/Initiatives/three_ones.asp, Nov. 27, 2006).

[2] The two main tools are the Monitoring and Evaluation Systems Strengthening Tool (MESST) and the Data Quality Assessment Tool (with versions for auditing and for routine assessment).

[3] The U.S. Government has undertaken a number of program audits of PEPFAR programs that have included data quality, and the Global Fund is undertaking data quality audits on a subset of its grants each year.

Objectives of the RDQA
• Verify rapidly that appropriate data management systems are in place;
• Verify rapidly the quality of reported data for key indicators at selected sites; and
• Develop an action plan for strengthening the M&E system and improving data quality.


A- Levels of the M&E System

Data collected, aggregated and reported to measure indicators flow through an M&E system that begins with the recording of an encounter between a client and a program staff member, a commodity distributed, or a person trained. Data are collected on source documents (e.g. patient records, client intake sheets, registers, training registers, commodity distribution logs, etc.). The data from source documents are aggregated and sent to a higher level (e.g. a district, a partner or prime recipient, or a sub-partner or sub-recipient) for further aggregation before being sent to the next level, culminating in aggregation at the highest level of a program (e.g. the M&E Unit of a National Program, the Principal Recipient of a Global Fund grant, or the SI Unit of a USG program). The data from countries are frequently sent to international offices for global aggregation to show progress in meeting goals related to health initiatives. Figure 1 illustrates this data flow in a data management and reporting system that includes service sites, districts and a national M&E Unit. Each country and program/project may have a different data flow. Challenges to data quality can occur at each of these levels.

B- Data Quality Dimensions

The RDQA is grounded in the components of data quality: Programs/projects need accurate and reliable data that are complete, timely, precise and, where appropriate, maintained under conditions of confidentiality (see Table 1). Furthermore, the data must have integrity to be considered credible.

[Figure 1 is a diagram of an illustrative data flow. Five Service Delivery Points each compile a Monthly Report of ARV numbers from their source documents (SDP 1: 45; SDP 2: 20; SDP 3: 75; SDP 4: 50; SDP 5: 200). District Monthly Reports aggregate the SDP reports (District 1: SDPs 1–2, total 65; District 2: SDP 3, total 75; District 3: SDPs 4–5, total 250), and the National Monthly Report aggregates the district/region totals (Region 1: 65; Region 2: 75; Region 3: 250; TOTAL 390).]

Figure 1. Data Flow Through an Illustrative M&E System
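The roll-up arithmetic behind Figure 1 is simple summation at each level. The sketch below is a minimal illustration only (not part of the RDQA tool, which is a Word/Excel checklist); the mapping and counts are taken from Figure 1.

```python
# Sketch of the aggregation in Figure 1: SDP monthly reports roll up into
# district reports, which roll up into the national report.

# Monthly ARV counts reported by each Service Delivery Point (from Figure 1).
sdp_reports = {"SDP 1": 45, "SDP 2": 20, "SDP 3": 75, "SDP 4": 50, "SDP 5": 200}

# Which SDPs report to which district (from Figure 1).
district_map = {
    "District 1": ["SDP 1", "SDP 2"],
    "District 2": ["SDP 3"],
    "District 3": ["SDP 4", "SDP 5"],
}

# Each district's monthly report is the sum of its SDP reports.
district_reports = {
    district: sum(sdp_reports[sdp] for sdp in sdps)
    for district, sdps in district_map.items()
}

# The national monthly report is the sum of the district reports.
national_total = sum(district_reports.values())

print(district_reports)  # {'District 1': 65, 'District 2': 75, 'District 3': 250}
print(national_total)    # 390
```

A data quality problem introduced at any level (a miscounted source document, a transcription error at a district) propagates unchanged into the national total, which is why the RDQA verifies data at each level of this chain.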


C- Functional Areas to Strengthen Data Management and Reporting and Data Quality

To address data quality challenges throughout the data management and reporting system, it is important to focus on the key functional areas that programs/projects can address. Table 2 shows these functional areas and the questions that need to be answered in determining the strength of the M&E system.

Table 1. Data Quality Dimensions (dimension of data quality; operational definition)

Main dimensions of data quality

Accuracy: Also known as validity. Accurate data are considered correct: the data measure what they are intended to measure. Accurate data minimize error (e.g., recording or interviewer bias, transcription error, sampling error) to a point of being negligible.

Reliability: The data generated by a program’s information system are based on protocols and procedures that do not change according to who is using them and when or how often they are used. The data are reliable because they are measured and collected consistently.

Sub-dimensions of data quality

Precision: The data have sufficient detail. For example, an indicator requires the number of individuals who received HIV counseling & testing and received their test results, by sex of the individual. An information system lacks precision if it is not designed to record the sex of the individual who received counseling and testing.

Completeness: The information system from which the results are derived is appropriately inclusive: it represents the complete list of eligible persons or units and not just a fraction of the list.

Timeliness: Data are timely when they are up-to-date (current) and when the information is available on time. Timeliness is affected by: (1) the rate at which the program’s information system is updated; (2) the rate of change of actual program activities; and (3) when the information is actually used or required.

Integrity: Data have integrity when the systems used to generate them are protected from deliberate bias or manipulation for political or personal reasons.

Confidentiality: Clients are assured that their data will be maintained according to national and/or international standards for data. This means that personal data are not disclosed inappropriately, and that data in hard copy and electronic form are treated with appropriate levels of security (e.g. kept in locked cabinets and in password-protected files).


Table 2. M&E Functional Areas and Key Questions to Address Data Quality (functional area; question; dimension of data quality addressed)

I. M&E Capabilities, Roles and Responsibilities
  1. Are key M&E and data-management staff identified with clearly assigned responsibilities? (Accuracy, Reliability)

II. Training
  2. Have the majority of key M&E and data-management staff received the required training? (Accuracy, Reliability)

III. Indicator Definitions
  3. Are there operational indicator definitions meeting relevant standards that are systematically followed by all service points? (Accuracy, Reliability)

IV. Data Reporting Requirements
  4. Has the Program/project clearly documented (in writing) what is reported to whom, and how and when reporting is required? (Accuracy, Reliability, Timeliness, Completeness)

V. Data Collection and Reporting Forms and Tools
  5. Are there standard data-collection and reporting forms that are systematically used? (Accuracy, Reliability)
  6. Are data recorded with sufficient precision/detail to measure relevant indicators? (Accuracy, Precision)
  7. Are data maintained in accordance with international or national confidentiality guidelines? (Confidentiality)
  8. Are source documents kept and made available in accordance with a written policy? (Ability to assess Accuracy, Precision, Reliability, Timeliness, Integrity and Confidentiality)
  9. Does clear documentation of collection, aggregation and manipulation steps exist? (Accuracy, Reliability)

VI. Data Management Processes and Data Quality Controls
  10. Are data quality challenges identified and are mechanisms in place for addressing them? (Accuracy, Reliability)
  11. Are there clearly defined and followed procedures to identify and reconcile discrepancies in reports? (Accuracy, Reliability)
  12. Are there clearly defined and followed procedures to periodically verify source data? (Ability to assess Accuracy, Precision, Reliability, Timeliness, Integrity and Confidentiality)

VII. Links with National Reporting System
  13. Does the data collection and reporting system of the Program/project link to the National Reporting System? (To avoid parallel systems and undue multiple reporting burden)


Answers to these 13 questions can help highlight data quality challenges and the related aspects of the data management and reporting system. For example, if data accuracy is an issue, the RDQA can help assess whether reporting entities are using the same indicator definitions and whether they are collecting the same data, on the same forms, using the same instructions. The RDQA can help assess whether roles and responsibilities are clear (e.g. all staff know what data they are collecting and reporting, when, to whom and how) and whether staff have received relevant training. It can highlight whether indicator definitions are standard, etc. These are referred to in the RDQA as the functional areas of the M&E system related to collecting, aggregating and reporting data to measure indicators. Figure 2 shows the conceptual links between data quality, the levels of the M&E system and the functional areas that are embedded in the RDQA.


3- METHODOLOGY

The RDQA Checklist

The RDQA has a checklist (currently in Word, eventually also in Excel) with sections for service sites, intermediate sites and a central M&E Unit. The checklist for each of these levels includes questions to assess the data management and reporting system and instructions for recounting data and comparing with data that have been previously reported. A copy of the Word version of the RDQA checklist is found in Annex 1.

1- Assessment of Data Management and Reporting Systems

The purpose of assessing the data management and reporting system is to identify potential challenges to data quality created by the design and implementation of data management and reporting systems. The system assessment portion of the checklist is arranged into the seven functional areas, with 13 key summary questions that are critical to evaluating whether the Program’s/project’s data management system is well designed and implemented to produce quality data (Table 2 above).

2- Verifying Data

The purpose of this part of the RDQA is to assess, on a limited scale, whether service delivery and intermediate aggregation sites are collecting and reporting data to measure the indicator(s) accurately, completely and on time, and to cross-check the reported results with other data sources.


4- IMPLEMENTATION

The RDQA for self-assessment and improvement can be implemented in four steps:

Step 1 – PREPARATION

• Determine the purpose, levels of the system to be included, the indicator(s) and data sources and the time period of the RDQA.

• Determine the facilities/sites to be visited and notify the facilities/sites.

A. Purpose

The RDQA checklist can be used for:

• Initial assessment of M&E systems established by new implementing partners (or in decentralized systems) to collect, manage and report data.

• Routine supervision of data management and reporting systems and data quality at various levels. For example, routine supervision visits may include checking a certain time period’s worth of data (e.g. one day, one week or one month) at the service site level, whereas periodic assessments (e.g. quarterly, biannually or annually) could be carried out at all levels to assess the functioning of the entire Program/project’s M&E system.

• Periodic assessment by donors of the quality of data being provided to them (this use of the RDQA could be more frequent and more streamlined than official data quality audits that use the DQA for Auditing, but less frequent than routine monitoring of data).

• Preparation for a formal data quality audit.

The RDQA is flexible for all of these uses. Countries and programs are encouraged to adapt the checklist to fit local program contexts.

B. Levels of the M&E System to be Included

Part of determining the scope of the RDQA is to decide what levels of the M&E system will be included in the assessment: service sites, intermediate aggregation sites and a central M&E unit. The levels will depend on the program that is the subject of the RDQA, and should be set once the appropriate levels have been identified and the data flow “mapped” (e.g. 100 sites provide the service in 10 districts; reports from sites are sent to districts, which then send reports to the M&E Unit of the national program). In some cases, the data flow map will include more than one intermediate level (e.g. regions, provinces or states, or multiple levels of program organizations).

C. Indicator(s), Data Sources and Reporting Period

The RDQA is designed to assess the M&E systems and to verify data related to indicators that are reported to programs or donors. Therefore, it is important to select one or more indicators – or at least program areas – to serve as the subject of the RDQA. This choice will be based on the list of indicators reported by the Program/project. For example, a program or project focusing on the program area of treatment for HIV may report indicators of numbers of people on ART. Another program or project may focus on meeting the needs of orphans or vulnerable children; the indicators for that project would therefore be from the OVC program area. A malaria Program/project might focus on providing insecticide-treated bed nets (ITN) or on treating people for malaria – or on both of those activities. A Program/project may therefore report one indicator or (more likely) multiple indicators to measure success. Table 3 shows an example of program areas: the service delivery areas related to AIDS, TB and Malaria covered under Global Fund grants. Other donors may have somewhat different program areas.

Table 3. Illustrative Example of Program Areas (Global Fund Service Delivery Areas for AIDS, TB and Malaria)

HIV/AIDS
  Prevention: Condom Distribution; Testing and Counseling; PMTCT; STI Diagnosis and Treatment; Behavioral Change Communication – Community Outreach
  Treatment: Antiretroviral Treatment (ART) and Monitoring; Prophylaxis and Treatment for Opportunistic Infections
  Care and Support: Support for the chronically ill and families; Support for orphans and other children (OVC)

MALARIA
  Prevention: Insecticide-Treated Nets; Vector control (other than ITNs); Malaria Prevention in Pregnancy (IPT); Behavioral Change Communication – Community Outreach
  Treatment: Prompt, Effective Antimalaria Treatment; Home-based Management of Malaria

TB
  DOTS

For each program area, a number of indicators are measured through various data sources. For example, under the disease TB, in the program area Treatment, the international community has agreed to the harmonized indicator: Number of new smear-positive TB cases that successfully complete treatment. The data source for this indicator is facility-based, and the source documents are the district TB register along with the facility register and patient treatment cards. As another example, related to the disease AIDS, under the U.S. President’s Emergency Plan for AIDS Relief (PEPFAR), a program area is Orphans and Vulnerable Children, and an indicator is: Number of OVC served by OVC programs (disaggregated by male, female, primary direct and supplemental direct). The data source for this indicator will be community-based organizations that serve OVC, and the source documents will be client records (intake forms, daily logs, registers, etc).

The RDQA can be implemented on one or more indicators, but it is important to keep in mind that data come from various sources, most notably facility-based, community-based, and commodity distribution-based data sources. When planning the RDQA, it is important to determine the data sources that will need to be assessed related to the indicator(s) and program area(s).

The RDQA is designed to assess data related to indicators during a specific (selected) time period, generally a reporting period. Using a specified reporting period gives a reference from which to compare the “recounted” data, and is the recommended method for the RDQA. If sites are large or reporting periods long (e.g. six months), the staff or team assessing the data might decide to review source documents for one day, one week or one month and to cross-check the data with registers or laboratory or pharmacy records.

D. Determine and Notify Sites to be Visited

The types of sites included will depend on the scope of the RDQA. Once the scope and levels have been set, the actual sites to be visited should be selected purposively. The list of sites may include those:

• Due for routine M&E monitoring during a given supervision cycle.

• Selected based on the indicator(s) under review or based on some other characteristic (e.g. all NGO sites offering the related service, or all implementing partners offering services in a program area, such as community-based prevention programs for AIDS, TB or Malaria).

• Belonging to a new implementing partner/subpartner or new principal recipient/sub-recipients.

Sites should be notified prior to the visit for the data quality assessment. This notification is important in order for appropriate staff to be available to answer the M&E systems questions in the checklist and to facilitate the data verification by providing access to relevant source documents.

Step 2 – SITE VISITS

• Assess the Program/project’s data management and reporting system at the levels included in the RDQA: the M&E Unit, any intermediate aggregation levels and selected service sites.

• Verify data for selected indicator(s) from source documents and compare with reported results.


Part 1 of the RDQA checklist, the systems assessment, should be administered at each of the levels of the M&E system included in the RDQA. There is a specific worksheet or part of the checklist for each level (e.g. worksheets for service sites, intermediate aggregation sites and the M&E Unit). The questions should be asked of the staff member(s) at each level who are most familiar with the data management and reporting system. Table 2 above shows the seven functional areas of the M&E system that are included in the assessment of the data management and reporting system. Relevant questions are asked at each level of the M&E system.

Part 2 of the RDQA, data verifications, should also be filled out for each level included. At service sites, recounting means recounting the number of people, cases or events recorded during the reporting period as contained in the relevant source documents. This number is compared to the number reported by the site during the reporting period (on a summary report sent to the next level). At the Intermediate Aggregation Level, recounting means re-aggregating the numbers from reports received from all service delivery points and comparing that number with the aggregated result contained in the summary report prepared by the intermediate aggregation level and submitted to the next level. At the M&E Unit, recounting means re-aggregating reported numbers from all reporting entities and comparing them to the summary report prepared by the M&E Unit for the reporting period. At this level, the reports should also be reviewed to count the number that are available, on time, and complete – all measures of data quality.

Alternatively, the recount can be simplified by comparing the recounted results in the relevant register to the summary report. Using this method, a sample of names from the register can be selected using the methodology for the “cross-checks” in Part 2 of the checklist. Those entries in the register can be compared to the same information on the source documents (e.g. name, age, sex, diagnosis, treatment date(s), treatment regimen, etc. as relevant for the indicator). If errors are found, the methodology of recounting from source documents (described in the paragraph above) should be used to thoroughly check the data.

Summary statistics that can be calculated from the RDQA include:

Service Sites:
• Strength of the Data Management System – The seven functional areas can be assessed using the 13 summary questions listed in Table 2 and appended to each checklist (note: in the Excel version to come, this will be generated automatically).
• Service Site Data Verification Rate (the ratio of recounted to reported numbers, expressed as a percentage). Under 100% indicates over-reporting and over 100% indicates under-reporting of results.

Intermediate Aggregation Sites:
• Strength of the Data Management System (as above).
• Intermediate Aggregation Site Data Verification Rate (the ratio of recounted to reported numbers). Under 100% indicates over-reporting and over 100% indicates under-reporting of results.
• % of Reports that are Available, On time and Complete.

M&E Unit:
• Strength of the Data Management System (as above).
• M&E Unit Data Verification Rate (the ratio of recounted to reported numbers). Under 100% indicates over-reporting and over 100% indicates under-reporting of results.
• % of Reports that are Available, On time and Complete.
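The data verification rate itself is a single ratio. The following is a minimal sketch (illustrative only; the RDQA is administered as a Word checklist, and the Excel version to come is intended to compute this automatically), using hypothetical numbers:

```python
def verification_rate(recounted: float, reported: float) -> float:
    """Data Verification Rate: the recounted result as a percentage of the
    reported result. Under 100% indicates over-reporting; over 100%
    indicates under-reporting."""
    if reported == 0:
        raise ValueError("reported result must be non-zero")
    return 100.0 * recounted / reported

# Hypothetical example: source documents yield 65 recounted cases for a
# site that reported 75 for the same period.
rate = verification_rate(recounted=65, reported=75)
print(f"{rate:.1f}%")  # 86.7% -> under 100%, i.e. the site over-reported
```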

Step 3 – ACTION PLAN

• Based on the findings from the RDQA, prepare an action plan for strengthening the data management system and for improving the quality of data. Part 3 of the RDQA has a template for an action plan that is based on the findings from the RDQA and includes follow-up actions, responsibilities, timelines and resource needs.

Examples of findings related to the design and implementation of the Program/project’s data collection, reporting and management systems include:

• The lack of documentation describing aggregation and data manipulation steps.
• Unclear and/or inconsistent directions provided to reporting sites about when or to whom report data are to be submitted.
• The lack of designated staff to review and question submitted site reports.
• The lack of a formal process to address incomplete or inaccurate submitted site reports.
• The lack of a required training program for site data collectors and managers.
• Differences between program indicator definitions and the definition as cited on the Program/project’s data collection forms.
• The lack of standard data collection forms.

Examples of findings related to verification of data produced by the data management system could include:

• A disconnect between the delivery of services and the filling out of source documents.
• Incomplete or inaccurate source documents.
• Data entry and/or data manipulation errors.
• Misinterpretation or inaccurate application of the indicator definition.

Step 4 – FOLLOW UP

• Implement activities included in the RDQA action plan and follow up with relevant sites/locations to ensure implementation.


The purpose of the RDQA is to strengthen M&E systems related to collecting, managing and reporting data related to indicators. It will be important to follow up to ensure that strengthening measures identified in the action plan are actually carried out.

5- ETHICAL CONSIDERATIONS Data quality assessments must be conducted with the utmost adherence to the ethical standards of the country. While those undertaking the RDQA may require access to personal information (e.g. medical records), under no circumstances should any personal information be disclosed in relation to the conduct of the assessment.


Annex 1: RDQA Checklist (with sheets for Service Sites, Intermediate Aggregation Sites and the Central M&E Unit)


ROUTINE DATA QUALITY ASSESSMENT – Service Delivery Points Assessment and Verification Sheet

This ROUTINE DATA QUALITY ASSESSMENT (RDQA) framework comprises three complementary checklists which make it possible to assess the quality of data at each level of the data-collection and reporting system (i.e., M&E Unit, Intermediate Aggregation Levels and Service Delivery Points). This questionnaire should be used at Service Delivery Points. It is divided into three parts:

PART 1 – Systems Assessment. Assessment of the systems in place to collect, check and report data to the next level of aggregation (e.g. District, Province or National).

PART 2 - Data Verifications. Review of the availability, completeness and accuracy of reported data for a given time period.

PART 3 – Action Plan. Summarize key findings and needed improvements. Identify issues to be followed up at the various levels of the M&E system.

Background Information

Name of Program/project:
Name of Funding Agency (if appropriate):
Name of Central M&E Unit:
Indicator(s) Selected:
Reporting Period Verified:
Documentation Reviewed:
Date of Verifications:

PART 1 - Systems Assessment

This assessment is designed to help identify potential risks to data quality linked to the data-collection and reporting system at Service Delivery Sites.

Data Management System Questions (for each question, note the Yes/No answer and Comments; the relevant functional area* is given in brackets)

1. There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit). [I]
2. The responsibility for recording the delivery of services on source documents is clearly assigned to the relevant staff. [I]
3. All relevant staff have received training on the data management processes and tools. [II]
4. The M&E Unit has provided written guidelines to the Service Delivery Point on reporting requirements and deadlines. [IV]
5. Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools. [I, II, V]
6. The source documents and reporting forms/tools specified by the M&E Unit are consistently used by the Service Delivery Point. [V]
7. All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in case of a computerized system). [V]
8. If applicable, there are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data entry verification, etc). [VI]
9. If applicable, there is a written back-up procedure for when data entry or data processing is computerized. [VI]
10. If yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly). [VI]
11. The data collected on the source document have sufficient precision to measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies disaggregation by these characteristics). [III, V]
12. Relevant personal data are maintained according to national or international confidentiality guidelines. [VI]
13. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, a person registered as receiving the same service in two different locations, etc). [VI]
14. The reporting system enables the identification and recording of a "drop out", a person "lost to follow-up" and a person who died. [VI]
15. When applicable, the data are reported through a single channel of the national reporting system. [VII]

* The 7 functional areas are: I. M&E Capabilities, Roles and Responsibilities; II. Training (of M&E staff); III. Indicator Definitions; IV. Data Reporting Requirements; V. Data Collection and Reporting Forms and Tools; VI. Data Management Processes and Data Quality Controls; VII. Links with National Reporting System.

PART 2 - Data Verifications

A- Documentation Review: Review the availability and completeness of all indicator source documents for the selected reporting period. (For each question, note the Yes/No answer and Comments.)

1. Review available source documents for the reporting period being verified. Is there any indication that source documents are missing? If yes, determine how this might have affected reported numbers.
2. Are all available source documents complete? If no, determine how this might have affected reported numbers.
3. Review the dates on the source documents. Do all dates fall within the reporting period? If no, determine how this might have affected reported numbers.


B- Recounting Reported Results: Recount results from source documents, compare the verified numbers to the site-reported numbers and explain any discrepancies. (For each item, note the % or number and Comments.)

4. Recount the number of people, cases or events recorded during the reporting period by reviewing the source documents. [A]
5. Copy the number of people, cases or events reported by the site during the reporting period from the site summary report. [B]
6. Calculate the % difference between recounted and reported numbers. [A/B]
7. What are the reasons for any observed discrepancy (e.g., data entry errors, arithmetic errors, missing source documents, other)?

Note: Recounting can be simplified by recounting from registers and comparing to summary reports. The registers can be verified using cross-checks with source documents (e.g. client records). See C- below for cross-checks.

C- Cross-check reported results with other data sources: For example, cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying whether these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from the Register to Patient Treatment Cards).

8. List the documents used for performing the cross-checks.
9. Describe the cross-checks performed.
10. What are the reasons for any observed discrepancy?


PART 3 - Recommendations for the Service Site

Based on the findings of the systems review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program.

Description of Action Point | Requirement or Recommendation (please specify) | Timeline

Systems Assessment Questions by Functional Area (for reference)

I. M&E Capabilities, Roles and Responsibilities
  1. Are key M&E and data-management staff identified with clearly assigned responsibilities?

II. Training
  2. Have the majority of key M&E and data-management staff received the required training?

III. Indicator Definitions
  3. Are there operational indicator definitions meeting relevant standards that are systematically followed by all service points?

IV. Data Reporting Requirements
  4. Has the Program/project clearly documented (in writing) what is reported to whom, and how and when reporting is required?

V. Data Collection and Reporting Forms and Tools
  5. Are there standard data-collection and reporting forms that are systematically used?
  6. Are data recorded with sufficient precision/detail to measure relevant indicators?
  7. Are data maintained in accordance with international or national confidentiality guidelines?
  8. Are source documents kept and made available in accordance with a written policy?
  9. Does clear documentation of collection, aggregation and manipulation steps exist?

VI. Data Management Processes and Data Quality Controls
  10. Are data quality challenges identified and are mechanisms in place for addressing them?
  11. Are there clearly defined and followed procedures to identify and reconcile discrepancies in reports?
  12. Are there clearly defined and followed procedures to periodically verify source data?

VII. Links with National Reporting System
  13. Does the data collection and reporting system of the Program/project link to the National Reporting System?


ROUTINE DATA QUALITY ASSESSMENT – Intermediate Levels Assessment and Verification Sheet

This ROUTINE DATA QUALITY ASSESSMENT (RDQA) framework comprises three complementary checklists which make it possible to assess the quality of data at each level of the data-collection and reporting system (i.e., M&E Unit, Intermediate Aggregation Levels and Service Delivery Points). This questionnaire should be used at the Intermediate Aggregation Levels (e.g., districts, regions, provinces). It is divided into three parts:

PART 1 – Systems Assessment. Assessment of the systems in place to aggregate data from sub-reporting levels (e.g. Service Delivery Points) and report results to the next level (e.g. National M&E Unit).

PART 2 – Data Verifications. Review of the availability, completeness and accuracy of reported data for a given time period.

PART 3 – Action Plan. Summarize key findings and needed improvements. Identify issues to be followed up at the various levels of the M&E system.

Background Information

Name of Program/project:
Name of Funding Agency (if appropriate):
Name of Central M&E Unit:
Indicator(s) Selected:
Reporting Period Verified:
Documentation Reviewed:
Date of Verifications:

PART 1 - Systems Assessment

This assessment is designed to help identify potential risks to data quality linked to the data-collection and reporting system at Intermediate Aggregation Levels (e.g., districts, regions, provinces).

Data Management System Questions (for each question, note the Yes/No answer and Comments; the relevant functional area* is given in brackets)

1. There are designated staff responsible for reviewing the quality of data (i.e., accuracy, completeness and timeliness) received from sub-reporting levels (e.g., service points). [I]
2. There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to the central M&E Unit). [I]
3. All relevant staff have received training on the data management processes and tools. [II]
4. The M&E Unit has provided written guidelines to the Intermediate Aggregation Level on reporting requirements and deadlines. [IV]
5. Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools. [I, II, IV, V]
6. The source documents and reporting forms/tools specified by the M&E Unit are consistently used by all reporting levels. [V]
7. All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in case of a computerized system). [V]
8. Feedback is systematically provided to all service points on the quality of their reporting (i.e., accuracy, completeness and timeliness). [VI]
9. If applicable, there are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data entry verification, etc). [VI]
10. If applicable, there is a written back-up procedure for when data entry or data processing is computerized. [VI]
11. If yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly). [VI]
12. Relevant personal data are maintained according to national or international confidentiality guidelines. [VI]
13. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, a person registered as receiving the same service in two different locations, etc). [VI]


14. The reporting system enables the identification and recording of a "drop out", a person "lost to follow-up" and a person who died. [VI]
15. There is a written procedure to address late, incomplete, inaccurate and missing reports, including following up with service points on data quality issues. [VI]
16. If data discrepancies have been uncovered in reports from service points, the Intermediate Aggregation Levels (e.g., districts or regions) have documented how these inconsistencies have been resolved. [VI]
17. When applicable, the data are reported through a single channel of the national reporting system. [VII]

* The 7 functional areas are: I. M&E Capabilities, Roles and Responsibilities; II. Training (of M&E staff); III. Indicator Definitions; IV. Data Reporting Requirements; V. Data Collection and Reporting Forms and Tools; VI. Data Management Processes and Data Quality Controls; VII. Links with National Reporting System.

PART 2 - Data Verifications

D- Re-aggregate reported numbers from all Service Delivery Points: Reported results from all Service Delivery Points should be re-aggregated and the total compared to the number contained in the summary report prepared by the Intermediate Aggregation Site. (For each item, note the % or number and Comments.)

4. What aggregated result was contained in the summary report prepared by the Intermediate Aggregation Site (and submitted to the next reporting level)? [A]
5. Re-aggregate the numbers from the reports received from all Service Delivery Points. What is the re-aggregated number? [B]
6. Calculate the % difference between recounted and reported numbers. [B/A]
7. What are the reasons for any observed discrepancy (e.g., data entry errors, arithmetic errors, missing source documents, other)?


E- Review availability, completeness and timeliness of reports from all Service Delivery Points: How many reports should there have been? How many are there? Were they received on time? Are they complete? (For each item, note the % or number and Comments.)

1. How many reports should there have been from all Service Delivery Points? [A]
2. How many reports are there? [B]
3. Calculate the % of Available Reports. [B/A]
4. Check the dates on the reports received. How many reports were received on time (i.e., received by the due date)? [C]
5. Calculate the % of On-time Reports. [C/A]
6. How many reports were complete (i.e., the report contained all the required indicator data*)? [D]
7. Calculate the % of Complete Reports. [D/A]

* For a report to be considered complete, it should at least include (1) the reported count relevant to the indicator; (2) the reporting period; (3) the date of submission of the report; and (4) a signature from the staff member who submitted the report.
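The percentages in this section are straightforward proportions of the number of expected reports. Below is a minimal sketch (illustrative only, with hypothetical report records; in practice the checklist is completed by hand) applying the four-element completeness rule above:

```python
from datetime import date

# Hypothetical reports received from Service Delivery Points. A report is
# complete if it has (1) the reported count, (2) the reporting period,
# (3) a submission date and (4) a submitter signature.
reports = [
    {"count": 65, "period": "2007-06", "submitted": date(2007, 7, 3), "signed": True},
    {"count": 75, "period": "2007-06", "submitted": date(2007, 7, 12), "signed": True},
    {"count": None, "period": "2007-06", "submitted": date(2007, 7, 4), "signed": False},
]

expected = 5             # [A] reports expected from all Service Delivery Points
due = date(2007, 7, 10)  # reporting deadline for the period

available = len(reports)                               # [B]
on_time = sum(r["submitted"] <= due for r in reports)  # [C]
complete = sum(                                        # [D]
    r["count"] is not None and bool(r["period"])
    and r["submitted"] is not None and r["signed"]
    for r in reports
)

print(f"% Available: {100 * available / expected:.0f}%")  # [B/A] -> 60%
print(f"% On time:   {100 * on_time / expected:.0f}%")    # [C/A] -> 40%
print(f"% Complete:  {100 * complete / expected:.0f}%")   # [D/A] -> 40%
```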

PART 3 – Recommendations for the Intermediate Aggregation Level

Based on the findings of the systems review and data verification at the intermediate aggregation site, please describe any compliance requirements or recommended strengthening measures, with an estimate of the length of time the improvement measure could take. See the systems assessment questions by functional area (table below) for a review of the system. Action points should be discussed with the Program.

Description of Action Point | Requirement or Recommendation (please specify) | Timeline
(Add more rows, as needed)


Systems Assessment Questions by Functional Area (for reference)

I. M&E Capabilities, Roles and Responsibilities
  1. Are key M&E and data-management staff identified with clearly assigned responsibilities?

II. Training
  2. Have the majority of key M&E and data-management staff received the required training?

III. Indicator Definitions
  3. Are there operational indicator definitions meeting relevant standards that are systematically followed by all service points?

IV. Data Reporting Requirements
  4. Has the Program/project clearly documented (in writing) what is reported to whom, and how and when reporting is required?

V. Data Collection and Reporting Forms and Tools
  5. Are there standard data-collection and reporting forms that are systematically used?
  6. Are data recorded with sufficient precision/detail to measure relevant indicators?
  7. Are data maintained in accordance with international or national confidentiality guidelines?
  8. Are source documents kept and made available in accordance with a written policy?
  9. Does clear documentation of collection, aggregation and manipulation steps exist?

VI. Data Management Processes and Data Quality Controls
  10. Are data quality challenges identified and are mechanisms in place for addressing them?
  11. Are there clearly defined and followed procedures to identify and reconcile discrepancies in reports?
  12. Are there clearly defined and followed procedures to periodically verify source data?

VII. Links with National Reporting System
  13. Does the data collection and reporting system of the Program/project link to the National Reporting System?


ROUTINE DATA QUALITY ASSESSMENT – M&E Unit Assessment and Verification Sheet

This ROUTINE DATA QUALITY ASSESSMENT (RDQA) framework comprises three complementary checklists which make it possible to assess the quality of data at each level of the data-collection and reporting system (i.e., M&E Unit, Intermediate Aggregation Levels and Service Delivery Points). This questionnaire should be used at the central M&E Unit. It is divided into three parts:

PART 1 – Systems Assessment. Assessment of the design and implementation at the M&E Unit of the data management and reporting system.

PART 2 – Data Verifications. Review of the availability, completeness and accuracy of reported data for a given time period.

PART 3 – Follow up Recommendations and Action Plan. Summarize key findings and needed improvements. Identify issues to be followed up at the various levels of the M&E system.

Background Information

Name of Program/project:
Name of Funding Agency (if appropriate):
Name of Central M&E Unit:
Indicator(s) Selected:
Reporting Period Verified:
Documentation Reviewed:
Date of Verifications:

PART 1 - Systems Assessment

This assessment is designed to help identify potential risks to data quality linked to the design and implementation of the data-management and reporting system.

Questions (for each question, note the Yes/No answer and Comments; the relevant functional area* is given in brackets)

1. There is a documented organizational structure/chart that clearly identifies positions that have data management responsibilities at the M&E Unit (specify which Unit, e.g. MoH, NAP, GF, World Bank). [I]
2. All staff positions dedicated to M&E and data management systems are filled. [I]
3. A senior staff member (e.g., the Program Manager) is responsible for reviewing the aggregated numbers prior to the submission/release of reports from the M&E Unit. [I]
4. There are designated staff responsible for reviewing the quality of data (i.e., accuracy, completeness, timeliness and confidentiality) received from sub-reporting levels (e.g., regions, districts, service points). [I]
5. There is a training plan which includes staff involved in data-collection and reporting at all levels in the reporting process. [II]
6. All relevant staff have received training on the data management processes and tools. [II]
7. The M&E Unit has documented and shared the definition of the indicator(s) with all relevant levels of the reporting system (e.g., regions, districts, service points). [III]
8. There is a description of the services that are related to each indicator measured by the Program/project. [III]
9. The M&E Unit has provided written guidelines to all reporting entities (e.g., regions, districts, service points) on reporting requirements and deadlines. [IV]
10. If multiple organizations are implementing activities under the Program/project, they all use the same reporting forms and report according to the same reporting timelines. [V]
11. The M&E Unit has identified a standard source document (e.g., medical record, client intake form, register, etc.) to be used by all service delivery points to record service delivery. [V]
12. The M&E Unit has identified standard reporting forms/tools to be used by all reporting levels. [V]


13. Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools. [I, II, V]
14. The data collected by the M&E system have sufficient precision to measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies disaggregation by these characteristics). [V]
15. There is a written policy that states for how long source documents and reporting forms need to be retained. [V]
16. All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in case of a computerized system). [V]
17. The M&E Unit has clearly documented data aggregation, analysis and/or manipulation steps performed at each level of the reporting system. [VI]
18. Feedback is systematically provided to all sub-reporting levels on the quality of their reporting (i.e., accuracy, completeness and timeliness). [VI]
19. There are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data entry verification, etc). [VI]
20. There is a written back-up procedure for when data entry or data processing is computerized. [VI]
21. If yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly). [VI]
22. Relevant personal data are maintained according to national or international confidentiality guidelines. [VI]
23. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, a person registered as receiving the same service in two different locations, etc). [VI]


24. The reporting system enables the identification and recording of a "drop out", a person "lost to follow-up" and a person who died. [VI]
25. There is a written procedure to address late, incomplete, inaccurate and missing reports, including following up with sub-reporting levels on data quality issues. [VI]
26. If data discrepancies have been uncovered in reports from sub-reporting levels (e.g., districts or regions), the M&E Unit has documented how these inconsistencies have been resolved. [VI]
27. The M&E Unit can demonstrate that regular supervisory site visits have taken place and that data quality has been reviewed. [VI]
28. When applicable, the data are reported through a single channel of the national reporting system. [VII]

* The 7 functional areas are: I. M&E Capabilities, Roles and Responsibilities; II. Training (of M&E staff); III. Indicator Definitions; IV. Data Reporting Requirements; V. Data Collection and Reporting Forms and Tools; VI. Data Management Processes and Data Quality Controls; VII. Links with National Reporting System.

PART 2 - Data Verifications

F- Re-aggregate reported numbers from all reporting entities: Reported results from all reporting entities (e.g., Districts or Regions) should be re-aggregated and the total compared to the number contained in the summary report prepared by the M&E Unit. (For each item, note the % or number and Comments.)

4. What aggregated result was contained in the summary report prepared by the M&E Unit? [A]
5. Re-aggregate the numbers from the reports received from all reporting entities. What is the re-aggregated number? [B]
6. Calculate the % difference between recounted and reported numbers. [B/A]
7. What are the reasons for any observed discrepancy (e.g., data entry errors, arithmetic errors, missing source documents, other)?

G- Review availability, completeness and timeliness of reports from all reporting entities: How many reports should there have been? How many are there? Were they received on time? Are they complete? (For each item, note the % or number and Comments.)

1. How many reports should there have been from all reporting entities (e.g., regions, districts, service points)? [A]
2. How many reports are there? [B]
3. Calculate the % of Available Reports. [B/A]
4. Check the dates on the reports received. How many reports were received on time (i.e., received by the due date)? [C]
5. Calculate the % of On-time Reports. [C/A]
6. How many reports were complete (i.e., the report contained all the required indicator data*)? [D]
7. Calculate the % of Complete Reports. [D/A]

* For a report to be considered complete, it should at least include (1) the reported count relevant to the indicator; (2) the reporting period; (3) the date of submission of the report; and (4) a signature from the staff member who submitted the report.

PART 3 – Follow up Recommendations and Action Plan

A – Summarize key issues that the Program should follow up at various levels of the system (e.g. issues found at site level and/or at intermediate aggregation site level).

Key issues to follow up | Level/location

B – Final Action Plan. Based on the findings of the data management systems review and data verification, please describe any challenges to data quality identified and recommended strengthening measures that should be undertaken to strengthen the M&E system and data quality. Please give a complete description of the measure, responsibility, timeline, funding source, technical assistance needs, and follow-up date and comments. This information should be completed after discussion with the Program/project.

RDQA Final Action Plan

Country:
Program/project:
Date of RDQA:
Date of Proposed Follow-up:

Description of strengthening measure | Responsibility | Timeline | Funding required | Funding source | Technical assistance needs | Follow-up date and comments
(Add rows as needed)

Systems Assessment Questions by Functional Area

I. M&E Capabilities, Roles and Responsibilities
  1. Are key M&E and data-management staff identified with clearly assigned responsibilities?

II. Training
  2. Have the majority of key M&E and data-management staff received the required training?

III. Indicator Definitions
  3. Are there operational indicator definitions meeting relevant standards that are systematically followed by all service points?

IV. Data Reporting Requirements
  4. Has the Program/project clearly documented (in writing) what is reported to whom, and how and when reporting is required?

V. Data Collection and Reporting Forms and Tools
  5. Are there standard data-collection and reporting forms that are systematically used?
  6. Are data recorded with sufficient precision/detail to measure relevant indicators?
  7. Are data maintained in accordance with international or national confidentiality guidelines?
  8. Are source documents kept and made available in accordance with a written policy?
  9. Does clear documentation of collection, aggregation and manipulation steps exist?

VI. Data Management Processes and Data Quality Controls
  10. Are data quality challenges identified and are mechanisms in place for addressing them?
  11. Are there clearly defined and followed procedures to identify and reconcile discrepancies in reports?
  12. Are there clearly defined and followed procedures to periodically verify source data?

VII. Links with National Reporting System
  13. Does the data collection and reporting system of the Program/project link to the National Reporting System?