Data Quality Assessment Plan


FEATURED

USAID TIPS: Conducting a Data Quality Assessment

These two short guides highlight useful techniques for analyzing performance data, serving as a quick source of reminders about the methods that might be appropriate for particular performance indicators.

ALSO SEE

  • USAID TIPS on Data Quality Standards
  • USAID Performance Management Toolkit
  • USAID Monitoring Toolkit

The purpose of a Data Quality Assessment is to ensure that USAID staff are aware of the strengths and weaknesses of the data they obtain about project and program performance, as determined by reviewing actual data on indicators against USAID's five data quality standards (validity, integrity, precision, reliability, and timeliness), and understand the extent to which the data can be trusted as a basis for management decisions.

Conceptually, USAID performance indicators are like rulers. They can be used to measure the status of people or crops or other matters of interest. The changes in status they measure are treated as being trustworthy because the ruler doesn’t change; it measures what it measures exactly the same way every time.

In practice, not all performance indicators are this reliable. Much of the time, Data Quality Assessments (DQAs) find that the information gathered on performance indicators is sound. But problems do arise, and they serve not only as a warning of what can go wrong but also as a demonstration of the value of conducting DQAs. DQAs help ensure that only rarely is the wording of questionnaires changed between the baseline and later periods, making the results non-comparable, and that only rarely do partners vary how they sample populations or the season during which they collect data, forget to gather data altogether, or "pump up" the numbers they report in order to look good.

As a guard against such problems, and because they are required by USAID’s ADS, plans for carrying out DQAs are included in Project MEL Plans. Particular attention needs to be paid to the schedule for DQAs since data reported to Washington for Government Performance and Results Modernization Act (GPRAMA) reporting purposes or for reporting externally on Agency performance must have had a data quality assessment at some time within the three years before submission. USAID Missions/Offices may choose to conduct data quality assessments more frequently if needed. USAID Missions/Offices are not required to conduct data quality assessments for data that are not reported to USAID/Washington. Thus, managers are not required to do data quality assessments on all performance indicators that they use. However, managers should be aware of the strengths and weaknesses of all indicators, and for this reason, some carry out DQAs on a larger share of their performance indicators than is actually required.

A DQA section of a Project MEL Plan could usefully address several key aspects of a project's approach to this responsibility, including:

  • Who will carry out DQAs for the project?
  • How will they be carried out?
  • On what schedule will they be carried out?

Missions, and offices within Missions, often vary with respect to who carries out DQAs. Some have firms they routinely engage for this purpose. Others organize DQA teams on an ad hoc basis or sometimes carry DQAs out using USAID staff. An MEL Plan can clarify how this will be done for a particular project. The approach and tools used to carry out DQAs also vary. Some Missions use a set of forms included in the USAID Performance Management Toolkit, while others use modified versions of the forms included in USAID's most recent TIPS on Conducting Data Quality Assessments. An MEL Plan can clarify what will be done in the project on which it focuses, and the procedures selected can then be included in the Project Indicator Reference Sheets prepared for each of a project's Objectively Verifiable Indicators.

Including a DQA schedule in a Project MEL Plan is a good way to ensure that indicators are scheduled for DQAs on a timely basis over the life of a project. As the sample table below suggests, some indicators may already be in use and will follow a slightly different schedule than those that are introduced for the first time with the start of a new project.

Sample Project DQA Schedule (Y1–Y5 show the next scheduled DQA by project year; X = scheduled DQA)

| Project LF Level | Performance Indicator | Last DQA | Y1 | Y2 | Y3 | Y4 | Y5 |
|---|---|---|---|---|---|---|---|
| Goal | Growth in real gross domestic product (GDP) per capita – collected by E3 | N/A | | | | | |
| Purpose | Foreign trade (X+M) as a percentage of GDP – collected by E3 | N/A | | | | | |
| | Export sales of assisted firms | | X | | | X | |
| | Domestic investment in non-traditional exports | | X | | | X | |
| | FDI in non-traditional exports | | X | | | X | |
| Sub-Purpose | Time to export/import (days) – from the World Bank | | | | | | |
| | Number of documents required to export/import – from the World Bank | N/A | | | | | |
| | Cost of exports/imports for operators that make electronic submissions | | X | | | X | |
| Outputs | Land customs border crossings fully automated | | X | | | X | |
| | Number of trained customs officers at land border crossings | 2011 | | | X | | |
| | Expedited clearance procedures operational at land customs border crossings | | X | | | X | |
| | Percentages of shippers using expedited shipping, by land customs border crossing | | X | | | X | |
| | Number of firms receiving USG capacity building assistance to export | 2011 | | | X | | |
| Inputs | Computers and other equipment installed on schedule and within budget | | X | | | X | |
| | New agents assigned to land border customs crossings | 2011 | | | X | | |
| | Training provided for existing land border customs crossing staff | | X | | | X | |
| | Person hours of training completed in trade and investment capacity building supported by USG | | X | | | X | |
| | Percentage of shippers that have seen flyers or heard local radio ads | | X | | | X | |
| | Expedited shippers program initiated within 2 months of land border crossing automation | | X | | | X | |
| | Number of days of USG supported technical assistance in trade and investment capacity building provided to counterparts or stakeholders | 2011 | | | X | | |
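A schedule like the sample table can also be kept as structured data, so a MEL team can quickly list which indicators are due for a DQA in a given project year. The sketch below is a hypothetical illustration; the record layout is invented and the indicator entries are abbreviated from the sample.

```python
# Illustrative sketch: a DQA schedule kept as simple records, abbreviated
# from the sample table. The structure (level, indicator, last DQA year,
# project years with a scheduled DQA) is hypothetical.
schedule = [
    ("Goal", "Growth in real GDP per capita (E3)", None, []),
    ("Purpose", "Export sales of assisted firms", None, [1, 4]),
    ("Outputs", "Trained customs officers at border crossings", 2011, [3]),
    ("Inputs", "Training for existing border crossing staff", None, [1, 4]),
]

def due_in_year(schedule, year):
    """List the indicators scheduled for a DQA in the given project year."""
    return [name for _, name, _, years in schedule if year in years]

print(due_in_year(schedule, 1))
# → ['Export sales of assisted firms', 'Training for existing border crossing staff']
```

Keeping the schedule in a machine-readable form makes it easier to update when a Mission adds indicators or conducts DQAs more frequently than required.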

ProjectStarter

BETTER PROJECTS THROUGH IMPROVED
MONITORING, EVALUATION AND LEARNING

A toolkit developed and implemented by:
Office of Trade and Regulatory Reform
Bureau of Economic Growth, Education, and Environment
US Agency for International Development (USAID)

For more information, please contact Paul Fekete.