What is a USAID Performance Management Plan?

By Maxine Secskas, CLAME Associate, Headlight Consulting Services, LLC

This blog post is the first in a 2-part series on USAID Performance Management Plans.

This blog post (and a subsequent post reflecting on our recent support in drafting a Performance Management Plan) is intended to inform and offer suggestions from our recent experience to MEL professionals who are writing or working with a USAID Performance Management Plan (PMP), as well as to anyone interested in learning how PMPs came to be and how they are used within USAID today.

A 2020 update to USAID’s operational policy (ADS) included a new focus on integrating Collaborating, Learning, and Adapting (CLA) principles into Performance Management Plans. In 2021, Headlight supported the writing of a new 5-year PMP for a USAID Mission client and found that, while some guidance is available from the USAID Learning Lab and a few recently completed PMPs from other Missions can serve as examples, the CLA-integrated PMP style is still unfamiliar to many Mission staff and development practitioners. We thought it an opportune time to reflect on the history and purpose of PMPs and try to answer two questions: Why does USAID use PMPs? Why is there a new focus on CLA?

This blog mini-series starts with this post on the purpose and history of PMPs; the subsequent post, scheduled for June 2, will share what we learned, and the advice we would offer, from recently supporting the writing of a PMP. The history recounted in this blog is based on personal recollections gathered in interviews and is not meant to be an official record of USAID history.

What is a Performance Management Plan?

A Performance Management Plan (PMP) is the document a USAID Mission uses to set down its plan for monitoring, evaluation, and learning work over a 5-year period. It accompanies the Mission’s 5-year Country Development Cooperation Strategy (CDCS) and serves as the action plan for how the Mission intends to measure progress toward its development objectives and address its learning questions and priorities.

According to the guidance in ADS Chapter 201, USAID’s annual requirements for monitoring and evaluation objectives and planning will be met through the use of a PMP and Activity Monitoring, Evaluation, and Learning (MEL) plans. These will guide USAID staff and partners in managing the MEL processes needed to measure progress and apply what they learn. The PMP should be aligned with the Mission’s Learning Questions and Results Framework from its CDCS and should contain:

  • Table of Contents and Modification Log  
  • Identification of Learning Priorities and a Plan to Address Them with Monitoring, Evaluation, and CLA  
  • Performance Indicators for Intermediate Results (IR)
  • Schedule of Performance Management Tasks and Associated Resources

The PMP is intended to be a living document, updated continually by the Mission over the life of the strategy to include any new information and make adaptations. Missions are meant to utilize the approaches outlined in the PMP when reviewing data as part of their annual portfolio review process and CDCS mid-course stocktaking. 

Our next blog post will go into more detail about what information a PMP could contain and how we helped integrate CLA into the plan to address Learning Priorities.

A History of PMPs 

I sat down with two Headlight staff over the past year to discuss their first-hand experience with PMPs from their time at USAID, with the aim of learning how current PMPs came to be and better understanding how Missions use them in practice. Thank you to Cindy Clapp-Wincek, former Senior Evaluation Advisor at Headlight and former Director of the USAID Office of Learning, Evaluation and Research (2011-2014), and to Monica Matts, current Director of Strategic Learning at Headlight and former Senior Learning Advisor in the USAID Bureau for Policy, Planning and Learning (2013-2021).


The first iteration of a short-term management plan came about in the late 1980s in USAID’s Africa Bureau, which had been using 25-year Action Plans. These were recognized as too long-term to be useful for management purposes, so the Bureau started using documents called “Action Plan Guidance,” which were shorter and more grounded in data use and evidence-informed decision-making. These guidance documents provided direction for measuring progress against the action plans.

In a related development, Missions and Activities started to use logical frameworks in the form of an “objective tree” (now called a Results Framework), which helped standardize how progress was measured and managed. There was recognition at the time that staff should only be held responsible for what they could produce, so objective trees shifted the focus to more easily measurable outputs rather than longer-term impacts. These objective trees/Results Frameworks were combined with action plans and became a precursor to the current PMP.

After a global USAID summit that included all Bureaus and the predecessor to the LER office, the use of the Results Framework for project strategy became more widespread, and it was eventually incorporated into expanded agency-wide guidance, initially as the Performance Monitoring Plan, now called the Performance Management Plan.


Starting in the early 1990s, the use of PMPs for Activities and Projects was standard across Missions. By using PMPs, staff and partners came to understand that the log frames they were using at the time were not realistic, which allowed Missions to make the case for greater funding and more achievable goals. The PMP guided the reporting that Missions could use to facilitate conversations with USAID Washington offices about their strategic objectives, necessary budgeting, and strategy planning.

1990-2010: Era of Standard Indicators

Early PMPs were very indicator-focused. Until the late 1980s, the field of international development placed little emphasis on monitoring; performance monitoring then accelerated, and eventually the scale tipped toward monitoring with very little evaluation. As part of this trend, the USAID Bureau of Policy and Program Coordination, which provided evaluation guidance and support, was suspended from 2005 to 2010.

During the George W. Bush Administration, the standard foreign assistance indicators (“common F indicators”) were developed so that USAID and the State Department could compare spending and progress across countries using common metrics. Monitoring processes at the time were complicated, context-specific, and slow to produce answers; the standard “F” indicators, set by the US State Department and still used by foreign assistance programs today, enabled clearer and quicker aggregation of results. All USAID Missions with relevant programs began to report on the F indicators at that time to standardize foreign assistance reporting, and this standard monitoring focus was reflected in the PMPs of the period.

2010-2020: Reinvigoration and the Program Cycle

Around 2010, with the beginning of the Obama Administration, the US Government put a renewed focus on development. The Bureau for Policy, Planning, and Learning (PPL) was re-formed from the suspended Bureau of Policy and Program Coordination. This new PPL Bureau included an evaluation function. 

An early action from PPL was to write Program Cycle guidance and policy in 2011. The new Program Cycle guidance required Missions to develop five-year strategies. Previously, Missions and Bureaus had programmed at the activity level, and there was no requirement for big-picture planning. The new Program Cycle guidance also instituted a new requirement for PMPs, which had previously been written for individual contracts or specific activities.

At that point, Missions had experience with standard F indicators and reporting on data, and they approached PMPs with that expectation. This first round of Mission-level PMPs was used to capture what data was being tracked and which performance indicators were being used. PPL’s PMP guidance at the time centered evaluations and encouraged Missions to consider how they could learn from Monitoring and Evaluation, but learning was not yet a focus or a requirement.

In 2016, Program Cycle policy was revised again, updating PMPs to require input on Collaborating, Learning, and Adapting approaches. CLA was no longer optional, but it remained separate: the proposed structure for PMPs was a section for monitoring plans, a section for evaluations, and a section on how the Mission would do CLA.

2020-present: Integration of CLA Principles

In the last few years, there was another update to Program Cycle guidance and policy. The changes encourage staff not to treat Monitoring, Evaluation, and CLA as three separate ideas in a vacuum. This is the basis for the most recent PMP guidance, which encourages integrating all three approaches and focusing on learning priorities.

This newest PMP logic became standard in late 2020, when new guidance was issued for ADS Chapter 201, the chapter that codifies the Program Cycle in USAID’s operational policy. The latest ADS 201 states:

A PMP must be grounded in the development hypotheses, objectives, learning questions, Results Framework, and operating context of a Mission’s CDCS. It articulates learning priorities, expected programmatic results, and a Mission’s approach for using monitoring, evaluation, and CLA to understand progress toward results. It informs management decisions that support the national self-reliance objectives. It should also include plans for monitoring, evaluation, and CLA activities and approaches to build the capacity and commitment of local partners. (ADS 201, p. 46)

The new guidance puts a focus on the role of learning and an emphasis on progress toward learning priorities as part of strategy-level MEL for each Mission. This should be accomplished by identifying top learning priorities, then considering how to address them through Monitoring, Evaluation, and CLA. The new guidance attempts to break down the silos between these practices and recognizes that they are all interdependent. 

Please check back on June 2 for the next post on our Reflections on Supporting the Writing of a PMP to read more about what the new PMP guidance looks like in practice, or sign up for blog notifications to receive updates on all future blogs!

If you have any questions about organizational development and learning, or need help integrating CLA into a Performance Management Plan, Headlight would love to support you! We have the breadth and depth of experience to tailor our support to your needs. For more information about our services, please email info@headlightconsultingservices.org. Headlight is a certified woman-owned small business and therefore eligible for sole source procurements. We can be found on the Dynamic Small Business Search or on SAM.gov via our name or DUNS number (081332548).
