Evaluation Development and Implementation

Evaluation studies are essential for understanding the effectiveness, efficiency, and impact of programs and policies. The objective, and the challenge, of an implementation science approach to program evaluation is to support implementation learning and to shift toward systems for continuous (leveraged) learning by engaging in-country, regional, and global learning partners.

Through its network of implementation science experts and more than 33 global organizations, HEARD works collaboratively with USAID country and implementing partners to evaluate and improve global health policy and programs. HEARD seeks to apply and advance the principles and practical applications of implementation science approaches to program evaluation, generating useful knowledge about ongoing implementation and improving understanding of how and why programs are functioning.

Learn more about Evaluation Development and Implementation below:

Implementation Science Approach to Evaluations

If there is a gap in existing data, or a need to understand how to improve practices, experts in our network support research studies, policy analyses, and evaluations to improve the implementation of global health programs. From systematic literature reviews to economic evaluations to process evaluations, we focus on data liberation and the sustainability of health service delivery.

The use of evidence to improve global health delivery and diplomacy requires the active engagement of collaborators representing an extensive array of skill sets. Current partners include governments, regional health bodies, policy advocacy groups, civil society, research organizations, and academic institutions.

By engaging the right mix of partners at the right time with the right evaluation tools and methods, we can determine the most relevant priorities and questions while minimizing the “stalling” of evidence in the research-to-use pathway. Whenever possible, we encourage the co-creation of evaluation designs and implementation.

Rapid Scoping

The HEARD Project emphasizes rapid and robust scoping of proposed evaluations, which quickly refines the evaluation's objectives and identifies and frames opportunities for the best approach and value. Scoping is participatory and involves the client/requestor (often a Mission), implementing partners, and ideally additional in-country and global/regional learning and leverage partners, to understand learning needs from multiple perspectives and how the evaluation can address them.

Independent Evaluation Team Leads

Evaluations undertaken by the HEARD Project are generally conducted by independent Evaluation Team Leads (i.e., individuals not employed by or representing the HEARD Project implementing partners or USAID) to ensure the independence and objectivity of the evaluations. Team Leads are experienced senior-level individuals with expertise in the relevant subject matter and/or evaluation methods. They are responsible for overseeing the design and implementation of the evaluation; maintaining regular communication with the HEARD Management Team focal point and the USAID focal point; and ensuring the quality and timely completion of the evaluation and its reporting.

Partnered Approach

Evaluations undertaken by the HEARD Project draw on HEARD Partners to form the evaluation teams. Supported by the HEARD Core Team:

  • HEARD Global Technical Anchors lead the evaluation design and methodology, participating in scoping activities as needed to understand the request and best support the evaluation design. In collaboration with the Evaluation Lead, the Design Lead is responsible for leading the design of the evaluation, developing the protocol and tools, and managing the data analysis process.
  • The Evaluation Implementation Team is composed of individuals from the HEARD Project Anchor partners, the HEARD Core Team, and, for evaluations in their respective regions, HEARD Sub-Regional Anchors. Other HEARD Technical Resource Partners can be brought in through a competitive process as needed. The size and composition of the team are determined for each evaluation.

Technically Supported

A Strategy Reference Group (SRG) is established for evaluations that would benefit from a broader expert group's consideration of the evaluation findings. The SRG reviews the findings and takes a consensus-building approach to developing recommendations in areas of interest to the client/requestor.