What is Monitoring, Evaluation, and Learning?
International development and similar fields have been tracking, assessing, and learning from their efforts for at least the past six decades. These efforts led to the creation of tools, techniques, and specific skillsets, which collectively influenced the evolution of the monitoring and evaluation (M&E) field. The aim of this field is to better understand and improve development interventions through the creation and use of evidence. Core definitions for this work are provided below.
This specialization often emphasizes two pillars: accountability and learning. As donors shortened periods of performance and international development actors came under increased scrutiny, outsized emphasis was placed on accountability and on measuring statistically significant impact. In recent years, the field has become increasingly aware of the challenges of operating in complexity, the learning needs associated with innovative programming, and the influence of our world's interconnected nature. As such, learning and collaboration have re-emerged as core focal points in the M&E sector.
MONITORING is the process of collecting and analyzing information to track implementation quality, the operating context, and progress toward outcomes, both for decision making and for reporting.
EVALUATION is the study and/or assessment of an activity’s causal mechanisms and contributions to change. Evaluations are leveraged more heavily to make decisions about intervention effectiveness, impact, and sustainability.
LEARNING is the intentional use of evidence to course correct, refine our understanding of the world and our activities' impact on it, respond to emergent environments, and adapt to maintain activity efficacy. Learning maximizes the utility of monitoring and evaluation efforts and provides an opportunity to enhance the return on investment of development resources midstream.
Why and how should we do MEL?
MEL helps us understand our work, its comparative effectiveness, its impact, and how it both influences and is influenced by the context in which it operates. MEL also helps us better understand and integrate partners' and beneficiaries' perspectives, ideas, and experiences into programming. As the field continues to grow, we are seeing the emergence of innovative MEL approaches better suited to address the complexity of the development operating environment. In the past, international development focused its efforts on summative performance and impact evaluations. Impact evaluations assess the effects of an intervention, but they require that evaluators establish causal attribution, which is not always possible or appropriate. Accountability-oriented MEL, including performance monitoring, impact assessments, and research, helps us track progress against known theories of change and validate approaches that we know to be effective, providing evidence for more robust uptake.
While valuable, impact evaluations do not address the field's need to test new approaches and adapt to emergent issues during activity implementation. By taking a more holistic, systems-based approach, we can learn what works in which contexts and be more agile, responding to emergent needs and environmental changes (e.g., conflict outbreaks, new legislation, drought). Learning-oriented MEL focuses on understanding the theory behind what we do and how it affects those we work with, in both intended and unintended ways. A learning approach may compare multiple, slightly different applications to determine which is most effective, or provide a deeper, more nuanced understanding of the enabling environment in order to overcome barriers.
Our MEL Service Offerings
Headlight Consulting Services, LLP provides the tools, methodological expertise, and capacity development skills needed to help organizations both track progress toward development outcomes and adapt programming based on learning. Headlight has helped design the MEL Plans for the USAID Digital Strategy, the initial USAID activities under the Digital Connectivity and Cybersecurity Partnership, and seven of GSMA's Mobile for Development programs, among other past experience. Our expertise and explicit work with innovative MEL approaches enable us to support strategies, programs, and activities that operate in complexity, seek increased integration across workstreams, and/or want to apply cutting-edge MEL to their work. We take a systems-oriented, sustainability-focused approach to all our efforts and incorporate organizational change best practices, setting clients up for long-term success. Headlight's MEL services include, but are not limited to:
- Designing, integrating, and operationalizing context, performance, and learning monitoring for traditional and innovative practices (such as digital programming and disaster risk management):
- MEL Frameworks
- MEL Plans
- Indicator Dashboards
- Data Collection Tools
- Designing and implementing rigorous qualitative evaluations, with facilitated support for uptake of recommendations. *All evaluations Headlight conducts are based on utilization-focused evaluation principles.
Expertise in Developmental Evaluations. Rebecca Herrington, Headlight's CEO, helped originate the concept for USAID's Developmental Evaluation Pilot Activity consortium award and supported the conceptualization, design, management, and implementation of DEs at USAID. She was responsible for the design and implementation of the Sustained Uptake DE conducted under the DEPA-MERL Pilot Activity from 2017 to 2019, across seven teams in the former U.S. Global Development Lab ("Lab"). This DE resulted in improved programmatic models, enhanced capacity and a toolkit for Sustainability Planning, the actualization of evidence-driven decision making, Systems-Based Theories of Change that helped teams intentionally program toward ecosystem-level impact, and the identification of successful and unsuccessful scale pathways. Adaptations and evidence from this DE still shape USAID today: Ms. Herrington's Mission Engagement Playbook informed the Client Services strategy of the new Bureau for Development, Democracy, and Innovation (DDI), and she co-wrote the Implementing Developmental Evaluation Practical Guide for evaluators and administrators, which helped DE become an accepted evaluation method in the newest ADS 201 revisions. Headlight is currently implementing an ever-expanding roster of DEs, including a Project-level DE on Disaster Risk Management for USAID/Ethiopia with an entirely local DE team, a Project-level DE for USAID/RFS's Policy LINK program, and individual buy-in DEs for Policy LINK in Ethiopia and Bangladesh.
- Designing, integrating, and operationalizing fit-for-purpose learning efforts that map existing monitoring and evaluation across activities, identify gaps, enhance collaboration across stakeholders, and facilitate the organizational behavior change necessary to improve adaptability and support a data-driven organization. Headlight has notable Collaborating, Learning, and Adapting (CLA) expertise, having served as the CLA partner on USAID MEL platforms and made direct contributions to the USAID CLA Toolkit:
- CLA training, integration support, and design services
- Learning Reviews
- Learning Agenda design and implementation
- Monitoring-Evaluation-Learning Mapping (mapping MEL efforts to guiding learning questions across a large portfolio and identifying remaining learning gaps and associated methods)
- Case study design, implementation, and dissemination
Headlight would love to support your monitoring, evaluation, and/or learning needs. We have the breadth and depth of expertise, experience, and tools to tailor-design a solution for you. For more information about our services, please email firstname.lastname@example.org. Headlight Consulting Services, LLP is a certified women-owned small business and is therefore eligible for sole-source procurements. We can be found on the Dynamic Small Business Search or on SAM.gov via our name or DUNS number (081332548).