
Evaluating the "FasterReading" program to improve this learning strategy in schools in Africa

Publication by the FID team

30 September 2023



Two students studying in a classroom

The Rising Academy Network (RAN), a network of private schools in Africa, has developed an accelerated reading program called "FasterReading", targeted at over-age Early-Childhood Education (ECE) students. The program is based on targeted instruction, an evidence-based pedagogical approach that tailors instruction to each student's current level of knowledge rather than to the grade level of the official curriculum.

The goal is to prepare these children academically and developmentally for enrolment in age-appropriate grades and to improve the quality of learning. While studies have shown that this approach - grouping children by ability level so that instruction can target the needs of each group - is effective at improving learning outcomes in primary grades, there is limited evidence on its use in Early-Childhood Education classrooms. RAN and the research organization IDinsight formed a partnership to conduct a Randomized Controlled Trial (RCT) to address this evidence gap and evaluate the impact of the program.

The two organizations look back at the genesis of this partnership and discuss the learnings and challenges that they encountered during their collaboration.

What was the initial motivation to work together?


Rising is committed to the highest standards of rigor and transparency at every level, and that becomes even more important when learning about the impact of our model and programs. Alongside our own data and analytics team, we partner with world-class researchers to measure our impact and learn how we can improve. Rising had developed FasterReading over the previous year and had already started implementing it in some of our schools in Sierra Leone and Liberia. Although the program was still at an early stage of development, and we were still working out what the key elements of a successful implementation were, we wanted to learn as much as possible early on so that we could iterate, improve the program, and start to understand its potential impact. IDinsight and Rising had worked together before, and the experience and collaborative approach were fantastic. We were certain they were the right evaluation partner for FasterReading.

We wanted to learn about the program through two related but independent processes: an impact evaluation (carried out by IDinsight) and a process evaluation (carried out by Rising, with support from IDinsight). These two parallel work streams allowed us to identify challenges in program implementation early enough to iterate even within the same academic year.


At IDinsight, we have an organizational commitment to focus on high-impact projects, in which data and evidence from our work closely inform our clients' decisions in making better policies and programs. We measure our impact as a function of the breadth and depth of our partner's impact, as well as our contribution to our partner's decision-making. We were excited to partner with Rising on this particular project because it presented a significant social impact case.

Rising Academy Network (RAN), an education organization working with more than 700 schools in Africa, has developed an accelerated reading program called "FasterReading", a structured pedagogy solution designed to catch students up on foundational literacy. The program was developed to address persistent challenges in foundational learning with modestly skilled teachers, and it combines content, coaching, and feedback into a coherent solution.

We envisaged that our evidence would directly inform RAN's Early-Childhood Education program design and scale-up decisions in these schools, impacting about 250,000 student lives. Since overage enrollment is a critical issue in West Africa, and RAN has cultivated close relationships with government partners in ministries of education, we expect that our results will inform ECE programming across the region.

We also hoped to contribute to the body of literature as a trusted thought partner and to further establish IDinsight's expertise in ECE and education policy in West Africa, given the high social-impact potential and the funder's open-publication requirement for findings.

How did the collaboration go? Which challenges did you face? What are some challenges of a collaboration involving an implementing organization and a research partner? How did you overcome them?


IDinsight played an important role evaluating the program and was also a great thought partner for considering the potential impact of the program. The IDinsight team was supportive and flexible throughout the evaluation process. They were very mindful of the importance of this type of evaluation for Rising as an opportunity to consider practical implications for improving delivery and serving teachers and students more effectively. Decision-making was collaborative and accounted for contextual constraints, and Rising’s feedback was actively sought.

We faced an important challenge early on when we noticed that some schools did not adhere to the study design. This was mainly caused by the approach that we took to the implementation and evaluation of the program. As an education provider, we wanted as many students as possible to benefit from the program while still being able to evaluate its impact. We decided to focus the evaluation only on overage ECE students, both because overage enrollment is a persistent challenge in Liberian schools and because it allowed us to implement the program in more schools and include more primary school students, beyond the scope of the evaluation.

While all schools received FasterReading materials and trainings, only treatment schools were instructed to include overage ECE students in the program, and the evaluation focused on the effect of the program on this subset of overage ECE students. Schools struggled to stick to this design. The challenges came both from schools being confused about which students to include in this new program and from wanting their teachers and students to benefit from the new materials and ways of teaching that were now available to them. We learned from some schools that, even when they knew their ECE students should not be part of the program, they still went ahead and included them because the materials provided were better than the existing ECE resources in school and much easier for students and teachers to follow.

Ultimately, about a third of schools in our control group implemented the program in some form. We informed IDinsight of this challenge as soon as we became aware of it, and they immediately started working on potential solutions to non-compliance from an analysis perspective.


We enjoyed working with Rising immensely. Their team is smart, passionate about the work, sophisticated about evidence use, and fun to be around. We learned a lot working on this project with them and are excited to see how they use the results to further strengthen and expand the FasterReading program.

The main challenge that we faced in the collaboration was simply not having enough time to spend with the Rising team as they implemented the FasterReading program. In general, as a research partner it is tricky to have sufficient visibility over program implementation, but that visibility is crucial for correctly analyzing and interpreting the data. This was doubly hard in our engagement with Rising since our core team was based in IDinsight’s regional offices in Zambia and Senegal, whereas the evaluation was in Liberia.

Fortunately, Rising's core team was closely monitoring implementation and shared frequent updates with us. This transparency and communication enabled Rising to collect detailed data on treatment non-compliance (described by Rising above): with that, we were able to estimate how many cycles of the FasterReading program each school implemented, whether in the treatment or control group. Context provided by RAN - in particular, that non-compliance was driven by confusion among school staff rather than by other school, teacher, or student characteristics - made us confident that our treatment-intensity method gave a fair estimate of the impact of the program on students who completed the five FasterReading cycles. We share more details about how we accounted for non-compliance in this evaluation in a blog post on the IDinsight website.
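To make the logic concrete, here is a minimal illustrative sketch - not IDinsight's actual analysis code - of the standard instrumental-variables (Wald) approach to two-sided non-compliance: random assignment instruments for the number of FasterReading cycles actually implemented, scaling the intent-to-treat effect by the first-stage difference in cycles. All numbers (cycle distributions, the 2-point-per-cycle effect) are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Z: random assignment to treatment or control (the RCT design)
z = rng.integers(0, 2, size=n)

# D: cycles of the program actually implemented (hypothetical distributions).
# Non-compliance runs both ways: some control schools run cycles anyway,
# and some treatment schools complete fewer than all 5 cycles.
d = np.where(
    z == 1,
    rng.choice([3, 4, 5], size=n, p=[0.2, 0.3, 0.5]),
    rng.choice([0, 1, 2], size=n, p=[0.6, 0.25, 0.15]),
)

# Y: reading score with an assumed true effect of 2 points per cycle
y = 50 + 2.0 * d + rng.normal(0, 5, size=n)

# Intent-to-treat effect: difference in mean outcomes by assignment
itt = y[z == 1].mean() - y[z == 0].mean()

# First stage: difference in mean cycles implemented by assignment
first_stage = d[z == 1].mean() - d[z == 0].mean()

# Wald/IV estimate of the per-cycle effect, recovering roughly 2.0
per_cycle_effect = itt / first_stage
print(round(per_cycle_effect, 2))
```

The key assumption, which the text above supports, is that non-compliance was driven by confusion rather than by school, teacher, or student characteristics correlated with outcomes; under that assumption the scaled estimate is a fair measure of the effect of full implementation.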

Were the evaluation results expected? How did RAN welcome the results? What was IDinsight's approach to sharing the results with RAN? And with the broader education community?


Our evaluation found that Rising's program likely led to modest positive effects on reading proficiency, though impact estimates were imprecise due to treatment non-compliance (described above). The positive results aligned with our expectations, given Rising's rationale for designing the FasterReading curriculum and the encouraging feedback from teachers involved in the pilot of the program in the previous school year. When we designed the RCT we expected the statistical estimates to be more precise, but we adjusted expectations as the extent of treatment non-compliance came to light.


Yes. We were aware that the program still needed more iterations before it would reach its full potential, but we were hopeful that even in its early days it could show some positive, encouraging results. It was also important to evaluate early in order to detect any undesirable effects of the program that would otherwise have gone unnoticed. The evaluation not only helped us understand the impact of the program, but also revealed areas where the program could be further strengthened and improved.


On sharing results, we had frequent touchpoints with Rising throughout the engagement. We shared early results during a regularly-scheduled call with Rising, and then iterated with the Rising team to finalize the analysis and the report. In general we find that an iterative and collaborative approach to writing the evaluation report with the implementer leads to a more accurate interpretation of the results and better-informed programmatic recommendations.

In terms of external sharing, we worked with the Rising team to identify promising avenues for dissemination, which so far have included the following:

  • We made the final report publicly available on the IDinsight website.
  • We will also make the anonymized primary data available on IDinsight’s data repository hosted on the Harvard Dataverse.
  • We will be publishing the findings in a peer-reviewed, open-access journal, so no subscription will be required to read them.
  • We shared the final report with key stakeholders such as the Liberian Education Advancement Program (LEAP) and USAID in Liberia.

What were the key lessons learned from this collaboration, from both the results and the process of evaluating the program? What has RAN already done, or what will it do, with these results to improve and expand the FasterReading program?


Based on the results, we learned that the program has the potential to improve student learning, with no major undesirable effects on other outcomes. The evaluation also helped us identify weak spots in the program that could be addressed in future versions. As mentioned above, we have already made some adjustments to the program based on results from this evaluation. Examples of these changes include:

  • Rising has invested more time and effort in teacher training in the early stages of the program.
  • A mobile application containing video-based training content and guidance on implementing the program was developed and made available to all schools.
  • A new, more flexible version of the program was developed specifically for schools with staff constraints; it can be implemented even with just a few teachers in a school.
  • Training and tools that equip field staff to support the implementation and monitoring of FasterReading in schools have been strengthened.


We learned a lot about the pressures on the Liberian education system, what Rising is doing to ease those pressures, and the opportunities and challenges of implementing a phonics-based reading program in this environment. Based on this evaluation, we believe that FasterReading and similar programs have a lot of potential to cost-effectively strengthen early-grade reading and numeracy pedagogy in Liberia and similar contexts.

How will RAN and IDinsight build on this evaluation to continue learning on effective strategies to improve learning in schools? What are the next relevant questions to be answered? Do you envision other evaluations together to contribute to this learning agenda?


Although we have paused the program evaluation this year, we plan to resume it in the near future with more extensive development and implementation experience behind us. We hope that improvements to content, training, tools, and data systems will enhance the program's effectiveness and outcomes.

Additionally, we aim to assess varying levels of support for the FasterReading program to address sustainability challenges in our sector. We want to determine the value of different support levels, such as comparing FasterReading training and materials alone with a version that also includes coaching and monitoring.

We have also created FasterMath, a level-based program for math, currently running in schools outside Liberia. We are considering expanding it to more countries and exploring evaluation options. Given the successful and positive partnership with IDinsight, we are eager to collaborate on future impact evaluations.


We are currently exploring options to support RAN in future years and on other programs they are implementing. We are in regular and frequent contact with RAN to weigh their priorities on possible future engagements. We are excited by the opportunity to continue to work with such an evidence-aligned partner!

