Learning Reviews: Using What You Already Know

By: Chelsie Kuhn, MEL Associate, Headlight Consulting Services, LLP

This is the second post in a series of two about Learning Reviews.

So you’ve decided it’s time to use a Learning Review? As a refresher, a Learning Review is a way to systematically look at past assessments, evaluations, reports, and any other learning documentation in order to inform recommendations and strategy, program, or activity design efforts. Unlike Desk Reviews, Learning Reviews focus on coding and analyzing data instead of summarizing it. With layers of triangulation and secondary analysis built into the process, we can confidently draw findings and conclusions knowing that the foundation of the process is built with rigor. Recommendations stemming from these findings and conclusions make the best use of an existing evidence base when designing or revisiting strategies, programs, and activities. For more on different tools for leveraging information in your strategy, program, and activity design, see our previous post.

This blog provides practical guidance on how to operationalize a Learning Review. While you may need support and extra human resources to actualize a Learning Review (Headlight would love to help!), the following steps provide a great overview of the process needed if you choose to implement a Learning Review on your own. 

Step 1: Decide on Taking a Learning Approach

First, clients need to decide that they are ready to take a learning approach to their work. The additional rigor and action-oriented recommendations that stem from a Learning Review will only be useful if clients take the time to reflect on learnings and make adaptations. The timing for a Learning Review is especially opportune when the client is designing a new strategy or program, or pausing to reflect and take stock of progress to date. Once clients have decided that a Learning Review is the appropriate tool and they are ready to commit to acting on recommendations and implementing Collaborating, Learning, and Adapting (CLA) processes, you can establish a contextually appropriate scope for the Learning Review.

Step 2: Develop the Questions and Objectives

Once the decision has been made to pursue a Learning Review, work with the client to decide whether there are any specific themes or questions that you should be looking for in your analysis. What works, what doesn’t work, and enabling environment factors are standard themes, but if there are sub-themes, goals, or particular programming mechanisms that the client wants to understand more about, make sure those are identified upfront so that you can code and add descriptors appropriately when reviewing the evidence. [Examples: the influence of digital, ecosystem-level impacts, gender considerations, factors contributing to community tensions, etc.] This is also a great time to discuss how the Learning Review will be used and by whom. Knowing the audience upfront can help you think about what level of evidence will be trustworthy to decision-makers and how best to structure the deliverables to facilitate taking action.

Step 3: Collect Documents and Confirm Scope

Work with your client to gather all relevant documents for the scope of the project being reviewed. Make sure you collect all relevant evaluations, assessments, performance reports, and learning documentation for the determined period of review. The client may have this documentation on hand, but it may also require some digging and reaching out to other stakeholders to ensure you have a comprehensive dataset for review. Because of the rigorous analysis process in a Learning Review, it can be difficult to add data on an iterative basis. Once you have a complete list of the data you’ll be reviewing, double check with the client that you have all of the documents; it is better to know now and add anything that is missing than to reach the end of coding and analysis only to discover missed documents! Once the documents have been collected, upload them into your qualitative data analysis software of choice.

Step 4: Read and Code the Data

Build out a codebook based on discussions with the client about what the themes should be, and clearly define each code (this will help later when you are trying to figure out how to code excerpts across different types of documentation). Begin reading and coding your data accordingly, allowing the codebook to evolve along the way based on emergent themes. Once you finish coding each document, don’t forget to add descriptors to properly disaggregate the evidence. Disaggregations for this type of review often include type of evidence, strength of evidence (e.g., an Evaluation Report versus a Quarterly Report with anecdotal evidence), sector, focus on special considerations (e.g., a gender report), and time period covered. For more on qualitative coding, see our Top 5 Dos and Don’ts post and our Qualitative Analysis Software post for our take and for more resources.
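To make the underlying structure concrete, here is a minimal sketch (in Python, purely illustrative; the codes, descriptor fields, and file names are hypothetical, and your qualitative data analysis software will typically manage these records for you) of how a codebook entry and a coded, descriptor-tagged excerpt might look:

```python
# Illustrative only: a codebook maps each code to a clear definition so that
# coding stays consistent across different types of documentation.
codebook = {
    "what_works": "Evidence that an approach or mechanism achieved intended results",
    "what_does_not_work": "Evidence that an approach or mechanism underperformed",
    "enabling_environment": "Contextual factors that helped or hindered implementation",
    "gender_considerations": "Evidence on gender dynamics, gaps, or differential effects",
}

# Each coded excerpt carries its codes plus descriptors for disaggregation
# (type of evidence, strength of evidence, sector, period covered, etc.).
excerpt = {
    "document": "FY2021_Annual_Report.pdf",  # hypothetical file name
    "text": "Coordination improved once the community of practice met monthly.",
    "codes": ["what_works"],
    "descriptors": {
        "evidence_type": "Performance report",
        "evidence_strength": "Anecdotal",    # vs. an evaluation finding
        "sector": "Governance",
        "period": "FY2021",
    },
}
```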

Step 5: Complete Secondary Analysis and Triangulation

Once all data has been coded and all documents have descriptors, go back and triangulate findings, starting with the Code Applications table under the analysis tab, which shows how many times each code has been applied. We also refer to the Code Co-Occurrence table to scan for where code applications overlap among the various codes and to flag anything worth looking into more deeply. To claim triangulation, you need to see the accurate application of a code across at least three sources. Documents that come from the same source are not likely to count as separate data points (e.g., a particular theme or incident mentioned in both a quarterly and an annual report from the same implementer). For codes that can be triangulated, conduct secondary analysis to see if you can find sub-trends and draw out more actionable nuance for the finding. This can be done by exporting excerpts from your qualitative analysis software into Excel sheets and then adding columns for secondary codes. For example, when a primary code such as “what works: coordination” is triangulated, a sub-theme might be the mechanism that best enables that coordination, like a “community of practice.”
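If you have exported your coded excerpts into a spreadsheet (one row per excerpt), a quick check like the sketch below can flag which codes meet the three-source triangulation threshold. This is a hypothetical illustration using pandas; the file and column names are assumptions, and most QDA tools provide equivalent tables out of the box:

```python
import pandas as pd

# Assumed export: one row per coded excerpt with (at least) these columns:
# "code", "document", and "source". Adjust names to match your actual export.
excerpts = pd.read_excel("coded_excerpts.xlsx")

# Count how many *distinct* sources each code appears in. Documents from the
# same source (e.g., a quarterly and an annual report from one implementer)
# should share the same "source" value so they are not double-counted.
sources_per_code = excerpts.groupby("code")["source"].nunique()

# Codes supported by at least three independent sources can be treated as
# triangulated and carried forward into secondary analysis.
triangulated = sources_per_code[sources_per_code >= 3].index.tolist()
print("Triangulated codes:", triangulated)
```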

Step 6: Build a Findings, Conclusions, and Recommendations (FCR) Matrix

Begin assembling your FCR Matrix. For findings, this requires simply capturing what the data says (e.g., 15 excerpts from 8 sources highlight …) with no additional interpretation. As you go, be sure to highlight illustrative quotes from the excerpts that best represent the finding and can be integrated into the report to help ground the narrative in data. Conclusions are where you can layer on interpretation, addressing what the findings mean and how multiple findings may be interrelated. From conclusions, we move to actionable, specific recommendations, addressing how the client should apply this evidence moving forward to improve their work; this could be a continuation of something already working well, or a change to improve a process or close a gap. Remember, the more detailed the FCR entries are, the easier the report writing will be!
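As a minimal sketch of what one FCR Matrix row might hold (the wording below is hypothetical and reuses the coordination example from Step 5; in practice the matrix usually lives in a spreadsheet or report template):

```python
# Illustrative FCR Matrix entry: finding (data only), conclusion (interpretation),
# and recommendation (action) kept side by side so the evidence chain stays traceable.
fcr_entry = {
    "finding": ("15 excerpts from 8 sources highlight improved coordination "
                "when a community of practice met regularly."),
    "illustrative_quote": "The monthly community of practice kept partners aligned.",
    "conclusion": ("Regular, structured peer exchange appears to be a key "
                   "mechanism behind effective coordination."),
    "recommendation": ("Budget for and facilitate a community of practice from "
                       "the start of the next activity design."),
}
```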

Step 7: Write and Design the Report

After the FCR Matrix is completed, components are moved into a report and expanded upon for more clarity. Recommendations are listed in order of priority with those most important and with the strongest evidence coming first. When writing the report, consider adding thoughtful data visualizations or graphic elements to help the reader navigate the evidence and enhance engagement. These small details can go a long way in making your report more accessible and easier to read, which improves the likelihood of uptake. 

Step 8: Submit and Support the Use of Findings

Once the Learning Review has been submitted, be sure to continue supporting learning and the implementation of recommendations. Unlike traditional evaluations, Learning Reviews should be used to secure leadership’s attention and decision-making power to implement recommended changes at the design stage, before programs or activities begin again. By thinking about and using this data at the design stage, implementers can build upon past experience and iterate, using resources more efficiently and effectively. Oftentimes, follow-through on implementation is where organizations get stuck, but when recommendations are actionable and specific from the start, and when organizational momentum is built through “quick wins,” the organization is more likely to continue in the learning cycle. It can also be helpful to coordinate a Strategic Learning Debrief to share the results and the action plan for moving forward.

Using the scaffolding above makes completing a Learning Review straightforward and mostly seamless. The biggest things to watch out for are making sure to include all of the documents from the beginning, building out and continuing to reference your codebook for data coding and analysis, and making sure that your FCR Matrix is set up properly and filled out in as much detail as possible. Each of these steps might seem small, but getting them wrong risks an inefficient process with potentially skewed and/or wrong findings.

If you need help implementing a Learning Review, Headlight would love to support you! We have the breadth and depth of expertise, experience, and tools to tailor our support to your needs. For more information about our services, please email info@headlightconsultingservices.org. Headlight Consulting Services, LLP is a certified women-owned small business and therefore eligible for sole source procurements. We can be found on the Dynamic Small Business Search or on SAM.gov via our name or DUNS number (081332548).

