Introducing the Multi-Indicator Version of the RDQA Tool. Presented at the MEMS - MEASURE Evaluation Brown Bag, Abuja, December 7, 2012


Page 1

Introducing the Multi-Indicator Version of the RDQA Tool

Presented at the MEMS - MEASURE Evaluation Brown Bag, Abuja

December 7, 2012

Page 2

■ National programs and donor-funded projects are working towards achieving ambitious goals in the fight against HIV, TB and malaria.

■ Measuring success and improving the management of these initiatives depend on strong M&E systems that produce quality data about program implementation.

■ As a result of strategies like “Three Ones”, the “Stop TB Strategy” and the “RBM Global Strategic Plan”, a multi-partner project* was launched in mid-2006 to develop a joint Routine Data Quality Assessment (RDQA) Tool.

■ The objective of this initiative was to provide a common approach for assessing and improving data quality (between partners and with National Programs).

* Partners most directly involved include PEPFAR, USAID, WHO, Stop TB, the Global Fund and MEASURE Evaluation.

Background - 1

Page 3

■ Importantly, funding is tied to performance, and programs need to show the effectiveness of their interventions

■ Hence, quality data are imperative for demonstrating program effectiveness

■ Interestingly, the single-indicator RDQA tool was used for joint national DQA exercises in 2008, 2009, 2011 and 2012 (led by NACA)

■ The multi-indicator tool has never been used in the country, and there is a need to sensitize M&E professionals to the potential opportunities in this tool

■ This tool provides an opportunity to evaluate data quality for selected priority indicators in different program areas at the same time and to identify areas for improvement

Background - 2

Page 4

Countries where RDQA has been used or is currently being implemented:

- Kenya
- Tanzania
- South Africa, Lesotho, Swaziland
- Nigeria
- Cote d’Ivoire
- DRC
- Haiti
- Mozambique
- India
- Botswana
- Global Fund On-Site Data Verification (OSDV) by LFAs in many countries

Page 5

■ Refers to the worth/accuracy of the information collected, and focuses on ensuring that the processes of data capture, verification and analysis are of a high standard.

■ RDQA tools facilitate this process and also provide an opportunity for capacity building

Data Quality

Page 6

Mistakes should be prevented rather than detected

Correcting data that has been wrongly recorded is difficult and expensive

The quality of the data is largely determined by how well the data are collected and forms are completed

In the presence of errors, data cannot be interpreted – useless!

Increased data quality → increased reliability and usability

Why Data Quality is Important - I

Page 7

Program planning

Data use

Program decision making

Sharing program information

Reporting/Accountability

Why Data Quality is Important - II

Page 8

Data Quality Assessment involves checking data against several criteria/dimensions:

o Validity
o Integrity
o Reliability
o Timeliness
o Completeness
o Precision
o Confidentiality

The DQA tool is used to assess the quality of the data and should be responsive to all seven dimensions

The assessment helps us determine areas of poor data quality and come up with action plans for potential solutions.

Data Quality Assurance

Page 9

• VERIFY the quality of reported data for key indicators at selected sites

• ASSESS the ability of data-management systems to collect, manage and report quality data.

• IMPLEMENT measures with appropriate action plans for strengthening the data management and reporting system and improving data quality.

• MONITOR capacity improvements and performance of the data management and reporting system to produce quality data.

Objectives of RDQA

Page 10

Routine data quality checks as part of on-going supervision

Initial and follow-up assessments of data management and reporting systems – measure performance improvement over time

Strengthening program staff’s capacity in data management and reporting

External assessment by partners and other stakeholders

Uses of RDQA

Page 11

Generally, the quality of reported data is dependent on the underlying data management and reporting systems; stronger systems should produce better quality data.

[Figure: the data-management and reporting system spans the reporting levels (Service Points → Intermediate Aggregation Levels, e.g. LGAs, States → M&E Unit) and should produce QUALITY DATA along the dimensions of quality: Accuracy, Completeness, Reliability, Timeliness, Confidentiality, Precision, Integrity. Functional components of a data management system needed to ensure data quality: M&E Structure, Functions and Capabilities; Indicator Definitions and Reporting Guidelines; Data-collection and Reporting Forms/Tools; Data Management Processes; Data Quality Mechanisms and Controls; Links with the National Reporting System.]

Conceptual Framework of DQA

Page 12

PHASE 1 - PREPARATION
1. Determine the scope of the DQA
2. Determine indicators, data sources and time period
3. Determine and notify facilities/sites

PHASE 2 - IMPLEMENTATION
4. Assess the data management system
5. Verify data

PHASE 3 - ACTION PLAN
6. Summarize findings and prepare an action plan

PHASE 4 - FOLLOW UP
7. Implement activities and follow up

Implementation is conducted at the M&E Unit, service sites and intermediate aggregation levels, as appropriate, given the scope of the DQA

RDQA Methodology: Chronology and Steps

Page 13

■ The methodology for the DQA includes two (2) protocols:

1. Data Verifications (Protocol 1): quantitative comparison of recounted to reported data, and review of the timeliness, completeness and availability of reports.

2. Assessment of Data Management Systems (Protocol 2): qualitative assessment of the strengths and weaknesses of the data-collection and reporting system.

RDQA Methodology: Protocols

Page 14

Data Verification
- Documentation Review
- Recounted results – trace and verify
- Cross-checks – compare with alternative data sources
- Reporting Performance: timeliness, completeness, availability (intermediate level and higher)

System Assessment
- Are elements in place to ensure quality reporting?

RDQA Methodology: Protocols

Page 15

■ PURPOSE: Assess on a limited scale if Service Delivery Points and Intermediate Aggregation Sites are collecting and reporting data accurately and on time.

■ The data verification step takes place in two stages:

- In-depth verifications at the Service Delivery Points; and

- Follow-up verifications at the Intermediate Aggregation Levels (Districts, Regions) and at the M&E Unit.

[Diagram: Trace and Verify Indicator Data. Reported results flow from Service Delivery Sites/Organizations through Intermediate Aggregation Levels (e.g. District, Region) up to the M&E Management Unit (Step 5: Trace and Verify Reported Results).]

RDQA Methodology: Data Verification Component

Page 16

ILLUSTRATION: tracing and verifying reported ARV numbers up the reporting chain. Each Service Delivery Site compiles its monthly report (ARV Nb.) from Source Document 1.

- District 1 monthly report: SDS 1 (45) + SDS 2 (20) = TOTAL 65
- District 2 monthly report: SDS 3 (45) = TOTAL 45
- District 3 monthly report: SDS 4 (75) = TOTAL 75
- District 4 monthly report: SDS 5 (50) + SDS 6 (200) = TOTAL 250
- M&E Unit/National monthly report: District 1 (65) + District 2 (45) + District 3 (75) + District 4 (250) = TOTAL 435

RDQA Methodology: Data Verification
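The roll-up logic in this illustration is easy to mirror in code. Below is a minimal Python sketch (the RDQA tool itself is an Excel workbook; the variable names and structure here are illustrative only) that recounts site-level ARV numbers and compares them with the reported district and national totals from the figure:

# Minimal sketch: recount site-level ARV numbers and compare with reported totals.
# Numbers re-use the illustration above; this is not the RDQA tool itself.
site_reports = {
    "District 1": {"SDS 1": 45, "SDS 2": 20},
    "District 2": {"SDS 3": 45},
    "District 3": {"SDS 4": 75},
    "District 4": {"SDS 5": 50, "SDS 6": 200},
}
reported_district_totals = {"District 1": 65, "District 2": 45,
                            "District 3": 75, "District 4": 250}
reported_national_total = 435

for district, sites in site_reports.items():
    recounted = sum(sites.values())           # recount from the site reports
    reported = reported_district_totals[district]
    status = "OK" if recounted == reported else "MISMATCH"
    print(f"{district}: recounted {recounted} vs reported {reported} [{status}]")

national_recount = sum(sum(s.values()) for s in site_reports.values())
print(f"National: recounted {national_recount} vs reported {reported_national_total}")

Run as written, every district checks out and the national recount matches the reported total of 435; a discrepancy at any level would surface as a MISMATCH to investigate.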

Page 17

SERVICE DELIVERY POINT - 5 TYPES OF DATA VERIFICATIONS

Verification no. 1: Documentation Review. Review availability and completeness of all indicator source documents for the selected reporting period. (In all cases)

Verification no. 2: Data Verification. Trace and verify reported numbers: (1) recount the reported numbers from available source documents; (2) compare the verified numbers to the site-reported number; (3) identify reasons for any differences. (In all cases)

Verification no. 3: Cross-checks. Perform “cross-checks” of the verified report totals with other data sources (e.g. inventory records, laboratory reports, etc.). (If feasible)

Service Delivery Points – Data Verification

Page 18

CROSS CHECKS - Perform cross-checks of the verified report totals with other data sources.

Indicator-specific notes for the auditor: cross-checking may be done by comparing (1) Patient Treatment Cards and the ART Register; and (2) Drug Stock Records and the ART Register. The code of the regimen dispensed to the patient is recorded in the ART Register. The exact number of patients receiving each regimen in the facility at any time can therefore be counted by reviewing the ART Register.

CROSS-CHECK 1.1: From Patient Treatment Cards to the ART Register. Was this cross-check performed? Yes

4.1 If feasible, select 5% of Patient Treatment Cards (or at least 20 cards) for patients currently on treatment. How many cards were selected? 5

4.2 How many of the patients selected were recorded in the ART Register? 3

Calculate % difference for cross-check 1.1: 60.0%

If the result is below 90%, select an additional 5% of Patient Treatment Cards (or at least an extra 10 cards) and redo the calculation (ADD the numbers to the existing numbers in the above cells); repeat up to three times.

Service Delivery Points – Cross Checks
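The 60.0% in the example is simply the share of sampled Patient Treatment Cards found in the ART Register (3 of 5). A small sketch of that calculation, including the tool's repeat-sampling rule, might look like the following; the function name and the second-round numbers are illustrative, not taken from the RDQA workbook:

def cross_check_ratio(cards_selected: int, found_in_register: int) -> float:
    """Share of sampled Patient Treatment Cards found in the ART Register."""
    return 100.0 * found_in_register / cards_selected

# Values from the example above: 5 cards sampled, 3 found in the register.
selected, found = 5, 3
print(f"Cross-check 1.1: {cross_check_ratio(selected, found):.1f}%")  # 60.0%

# Per the tool's instruction, if the result is below 90%, sample additional
# cards, ADD them to the running totals, and recompute (up to three times).
extra_selected, extra_found = 10, 9   # hypothetical second sample
selected += extra_selected
found += extra_found
print(f"After resampling: {cross_check_ratio(selected, found):.1f}%")  # 80.0%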

Page 19

Assessment of Data Management and Reporting Systems

[Diagram: data management and reporting systems are assessed at the Service Delivery Sites/Organizations, the Intermediate Aggregation Levels (e.g. District, Region) and the M&E Management Unit.]

■ PURPOSE: Identify potential risks to data quality created by the data-management and reporting systems at:
- the M&E Management Unit;
- the Service Delivery Points;
- any Intermediate Aggregation Level (District or Region).

■ The RDQA assesses both (1) the design and (2) the implementation of the data-management and reporting systems.

■ The assessment covers 8 functional areas (HR, training, data management processes, etc.)

RDQA Methodology: Systems Assessment Component

Page 20

SYSTEMS ASSESSMENT QUESTIONS BY FUNCTIONAL AREA

I - M&E Capabilities, Roles and Responsibilities
1. Are key M&E and data-management staff identified, with clearly assigned responsibilities?

II - Data Management Processes
2. Does clear documentation of collection, aggregation and manipulation steps exist?

III - Links with National Reporting System
3. Does the data collection and reporting system of the Program/Project link to the National Reporting System?

IV - Indicator Definitions
4. Are there operational indicator definitions meeting relevant standards, and are they systematically followed by all service points?

V - Data-collection and Reporting Forms and Tools
5. Are there standard data-collection and reporting forms that are systematically used?
6. Are source documents kept and made available in accordance with a written policy?

Functional Areas of an M&E System that affect Data Quality

Page 21

RDQA System Assessment (Part 2. Systems Assessment)

I - M&E Capacities, Roles and Responsibilities
1. There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit). (Answer: Yes - completely)
2. The responsibility for recording the delivery of services on source documents is clearly assigned to the relevant staff. (Answer: Partly)

II - Training
3. All relevant staff have received training on the data management processes and tools. (Answer: No - not at all)

III - Data Reporting Requirements
4. The M&E Unit has provided written guidelines to the Service Delivery Point on reporting requirements and deadlines. (Answer: Partly)
5. Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools. (Answer: No - not at all)

IV - Data-collection and Reporting Forms and Tools
6. The source documents and reporting forms/tools specified by the M&E Unit are consistently used by the Service Delivery Point. (Answer: Yes - completely)
7. All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in the case of a computerized system). (Answer: Partly)
8. The data collected on the source document have sufficient precision to measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies disaggregation by these characteristics). (Answer: No - not at all)

For each answer other than "Yes - completely", the tool prompts: "Please Provide a Comment."
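In the workbook, checklist answers like these feed the dashboard scores shown on the following slides. A hedged sketch of that aggregation, assuming a mapping of "Yes - completely" = 3, "Partly" = 2 and "No - not at all" = 1 (the exact weighting used by the tool is not shown in this presentation), could be:

# Hedged sketch: average checklist answers into per-area scores on a 0-3 scale.
# The 3/2/1 weights are an assumption, not confirmed from the RDQA workbook.
SCORE = {"Yes - completely": 3, "Partly": 2, "No - not at all": 1}

answers = {  # functional area -> answers from the example above
    "M&E Capacities, Roles and Responsibilities": ["Yes - completely", "Partly"],
    "Training": ["No - not at all"],
    "Data Reporting Requirements": ["Partly", "No - not at all"],
    "Data-collection and Reporting Forms and Tools":
        ["Yes - completely", "Partly", "No - not at all"],
}

for area, responses in answers.items():
    scores = [SCORE[r] for r in responses if r in SCORE]  # N/A answers skipped
    print(f"{area}: {sum(scores) / len(scores):.1f}")

Averaging per functional area yields values on the 0-3 scale used by the dashboards.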

Page 22

1- Strength of the M&E System: an evaluation based on a review of the Program/Project's data management and reporting system, including responses to overall summary questions on how well the system is designed and implemented;

2- Verification Factors: generated from the trace-and-verify recounting exercise performed on primary records and/or aggregated reports (i.e. the ratio of the recounted value of the indicator to the reported value);

3- Available, On-Time and Complete Reports: percentages calculated at the Intermediate Aggregation Levels and the M&E Unit;

4- Action Plan for System Strengthening: for each level assessed.

RDQA Outputs
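Outputs 2 and 3 are simple ratios. As a worked sketch (all numbers below are hypothetical; the workbook computes these values in Excel):

# Illustrative calculation of RDQA outputs 2 and 3; numbers are hypothetical.

def verification_factor(recounted: int, reported: int) -> float:
    """Ratio of the recounted indicator value to the reported value."""
    return recounted / reported

print(f"Verification factor: {verification_factor(435, 445):.0%}")  # ~98%

# Reporting performance at an aggregation level: suppose 12 reports were
# expected from service points in the period under review.
expected = 12
received, on_time, complete = 10, 8, 9
print(f"% Available: {received / expected:.0%}")   # 83%
print(f"% On Time:   {on_time / expected:.0%}")    # 67%
print(f"% Complete:  {complete / expected:.0%}")   # 75%

A verification factor below 100% suggests over-reporting; above 100%, under-reporting.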

Page 23

RDQA Summary Statistics – Level-Specific Dashboard

Part 4: DASHBOARD: M&E Unit

[Chart: Data Management Assessment - M&E Unit (scale 0-3). Scores by functional area, in charted order: M&E Capacities, Roles and Responsibilities 2.0; Training 2.0; Indicator Definitions 1.5; Data Reporting Requirements 2.0; Data-collection and Reporting Forms and Tools 1.9; Data Management Processes and Data Quality Controls 1.8; Links with National Reporting System 2.0.]

[Chart: Data and Reporting Verifications - M&E Unit: % Available 80%; % On Time 65%; % Complete 85%; Verification Factor 89%.]

Page 24

RDQA Summary Statistics – Global Dashboard

[Chart: DIMENSIONS OF DATA QUALITY – distribution of checklist answers (Yes - Completely / Partly / No - not at all / N/A) for each dimension: Accuracy, Reliability, Timeliness, Completeness, Precision, Confidentiality, Integrity; the number of responses is shown in each bar.]

[Chart: Data Management Assessment - Global Aggregate Score (scale 0-3) across the seven functional areas: M&E Capabilities, Roles and Responsibilities; Training; Indicator Definitions; Data Reporting Requirements; Data Collection and Reporting Forms and Tools; Data Management Processes and Data Quality Controls; Links with National Reporting System.]

[Chart: Data and Reporting Verifications - Global Aggregate Score: % Available 78%; % On Time 65%; % Complete 78%; Verification Factor 67%.]

Page 25

ILLUSTRATION

REPORTING LEVEL: National M&E Unit

Finding: No specific documentation specifying data-management roles and responsibilities, reporting timelines, standard forms, storage policy, …
Recommendation: Develop a data management manual to be distributed to all reporting levels.

Finding: Inability to verify reported numbers at the M&E Unit because too many reports (from Service Points) are missing (67%).
Recommendations: Systematically file all reports from Service Points. Develop guidelines on how to address missing or incomplete reports.

Finding: Most reports received by the M&E Unit are not signed off by any staff or manager from the Service Point.
Recommendation: Reinforce the need for documented review of submitted data – for example, by not accepting un-reviewed reports.

Example of Systems’ Finding at the M&E Unit (HIV/AIDS)

Page 26

REPORTING LEVEL: Intermediate Aggregation Level

Finding: Inability to retrieve source documents (i.e., treatment forms) for a specific period.
Recommendation: Improve the source-document storage process by clearly labeling stored source documents by date.

REPORTING LEVEL: Service Points

Finding: Confusion regarding the definition of a patient “lost to follow-up” (3 months for Temeke Hospital; 2 months for Iringa Hospital).
Recommendation: The M&E Unit should clearly communicate the definition of a patient “lost to follow-up” to all service points.

Finding: The service points do not systematically remove patients “lost to follow-up” from counts of numbers of people on ART.
Recommendation: Develop a mechanism to ensure that patients “lost to follow-up” are systematically removed from the counts of numbers of people on ART.

Finding: In cases of "satellite sites“, the reporting system and source documents do not always identify the location of a patient.
Recommendation: Develop a coding system that clearly identifies a patient’s treatment location so that data verification can be accomplished.

Page 27

Multi-Indicator RDQA Tool

Page 28

Thank you…

Page 29

MEASURE Evaluation is a MEASURE project funded by the U.S. Agency for International Development and implemented by the Carolina Population Center at the University of North Carolina at Chapel Hill in partnership with Futures Group International, ICF Macro, John Snow, Inc., Management Sciences for Health, and Tulane University. Views expressed in this presentation do not necessarily reflect the views of USAID or the U.S. Government.

MEASURE Evaluation is the USAID Global Health Bureau's primary vehicle for supporting improvements in monitoring and evaluation in population, health and nutrition worldwide.

Visit us online at http://www.cpc.unc.edu/measure