Developmental Evaluation Pilot Sustained Uptake

Over the course of its history, the U.S. Global Development Lab (hereinafter, “the Lab”) of the United States Agency for International Development (USAID) has evolved its programming related to scaling, adoption, acceleration, and uptake. This evolution occurred in response to the Lab’s charter to “source, test, and scale” development solutions, and was also informed by ad hoc learnings from previous efforts. Following the conception of the Lab Wide Priorities (LWPs), the Lab agreed to undertake active learning to enable it to better understand and implement different approaches to scale/sustained uptake. Over the course of nearly two years, the Developmental Evaluation Pilot Activity (DEPA-MERL) supported Lab teams using a developmental evaluation (DE) approach. The DE approach helped several Lab teams and offices – including Digital Development for Feed the Future (D2FTF), Scaling Off-Grid Energy (SOGE), Digital Financial Services (DFS), Digital Inclusion (DI), and the Office of Evaluation, Impact, and Assessment (EIA) – to rigorously collect, analyze, and disseminate learnings regarding the sustained uptake of innovations these teams seek to promote within and beyond USAID. The DE appealed to the teams given its innovative and rigorous nature and, most importantly, its emphasis on providing timely, on-demand, and use-focused deliverables.

The Sustained Uptake DE worked with seven teams in the Lab over 22 months. During this engagement, the teams built capacity around sustainability planning and answered the following questions.

Sustained Uptake DE Evaluation Questions


1. What are the conditions and working relationships necessary in the LWPs, the Lab, and its partners to achieve sustained uptake internally (Missions and Bureaus) and externally?

2. How do we determine which current Lab approaches are most effective at sustained uptake? What has been the perceived and real value add of the approaches? What can we learn from Lab uptake models?

3. What are the replicable principles/elements from the different sustained uptake models and how should others apply them to a different context?

4. How does the Lab balance sustained uptake initiatives that are internal versus external? What impact (internal or external) does the Lab value more? Where can the Lab have the most impact?

To answer these questions, the Developmental Evaluator (hereinafter, “Evaluator”) employed appreciative inquiry, positive deviance case studies, process tracing, outcome harvesting, and facilitated work sessions with the DE teams. These evaluative efforts contributed to an iterative database used throughout the DE, resulting in evidence informed by 474 sources and 1,675 unique data points. The findings, conclusions, recommendations, and adaptive work with the DE teams produced the following key outcomes from the Sustained Uptake DE.

Key Outcomes:

  • The DE identified effective and efficient models to achieve sustained uptake with both internal and external audiences.
  • The DE helped six teams develop and initiate implementation of Sustainability Plans and exit strategies, thereby improving sustainability of programming and increasing the understanding of pathways to scale for the teams’ respective innovations.
  • The DE created and disseminated the Mission Engagement Playbook – a how-to manual built on DE evidence of how to work with USAID Missions effectively. This helped to improve the efficiency and effectiveness of Mission-Headquarters (HQ) relationships for teams who implemented the guidance.
  • The DE improved working relationships between Bureaus and with private sector partners.
  • The DE helped teams design pathways to scale, including the ability to assess ecosystem-level impact.
  • The DE improved team culture for five teams by fostering action-oriented, adaptive decision-making.

Overall, the Sustained Uptake DE provided extensive evaluative and adaptive management support to the Lab, supplying evidence on effective and efficient models for both internal and external sustained uptake. The DE further strengthened teams’ ability to achieve ecosystem-level outcomes and provided tools to continue this work moving forward.

What is Developmental Evaluation? 


Developmental Evaluation (DE) is an approach to evaluation that supports the continuous adaptation of development interventions. DE provides evaluative thinking and timely feedback to inform ongoing adaptation as needs, findings, and insights emerge in complex, dynamic situations. DE facilitates the movement from findings to action through a collaborative process with DE stakeholders.

Outcome Harvesting Report on Uptake DE


The Developmental Evaluation Pilot Activity-Monitoring, Evaluation, Research, and Learning (DEPA-MERL)—situated in the U.S. Global Development Lab’s Monitoring, Evaluation, Research, and Learning Innovations Program at the United States Agency for International Development (USAID)—is testing the effectiveness of developmental evaluation in the USAID context. Developmental evaluation (DE) was created to evaluate innovative programs that operate in complex environments and are thus expected to adapt over time. From March 2017 to December 2018, DEPA-MERL conducted a DE with the U.S. Global Development Lab (hereinafter, “the Lab”). The Sustained Uptake DE (hereinafter, “the Uptake DE”) was conducted in service of the Lab’s mission to source, test, and scale development solutions. The Uptake DE helped several of the Lab’s teams to collect, analyze, and disseminate learnings regarding the sustained uptake of innovations these teams seek to promote within and beyond USAID.

Evaluation Background and Purpose


DE is an evaluative approach aimed at facilitating continuous adaptation of interventions. In this context, it involves having one or more Developmental Evaluators integrated into the implementation team, usually on a full-time basis. This report seeks to facilitate learning on the implementation of DEs in the USAID context. Readers of this report include USAID stakeholders, organizations funding or implementing DE, and Developmental Evaluators themselves. Using the information collected, the DEPA-MERL consortium aims to build on existing literature and offer readers targeted data and guidance to improve the effectiveness of DE. Additionally, the findings from this study will be compared to findings from other DE pilots conducted by DEPA-MERL. A cross-case comparative report is expected to be released in September 2019.

Methodology and Limitations 


During all 22 months of the Uptake DE, the William Davidson Institute at the University of Michigan (WDI) team collected data to answer the following three research questions:

  • Question 1: How does DE capture, promote, and enable the utilization of emergent learnings in support of ongoing programming in a complex system, in the USAID context?
  • Question 2: What are the barriers and enablers to implementation of DE in the USAID context?
  • Question 3: What do key informants consider to be the value (added or lost) of conducting a DE compared to a traditional evaluation approach?

To answer these questions, the WDI team used mixed methods, which included outcome harvesting. The WDI team conducted a document review, held semi-structured interviews with the Developmental Evaluator and stakeholders, and administered an electronic survey to stakeholders. Limitations of the study included resource constraints (time and funding), respondent selection bias, funding bias, and lack of a counterfactual.

Top Developmental Evaluation Findings


This series of one-pagers covers the most significant findings from the Uptake Developmental Evaluation that are applicable for broad USAID HQ use. From evidence-grounded Mission engagement strategies to strengthened theories of change and metrics for improved ecosystem-level initiatives, these one-pagers provide practical guidance for implementing successful strategies to achieve sustained uptake.

Additional Resources 


Date 
Monday, November 4, 2019 - 3:00pm