Authors: Camila Alegría, Ciro Avitabile, Annie Chumpitaz Torres

Early warning systems can help prevent dropouts, but administrators need to consider critical aspects to increase effectiveness

Source(s): World Bank

Extended school closures due to COVID-19 increased concerns about students dropping out of school. In response, many Latin American countries accelerated the development of early warning systems (EWS), which use student data to help identify whether a student is at risk of dropping out of school, allowing timely action to be taken by the school to prevent this outcome.

In Latin America and the Caribbean (LAC), countries such as Chile, Colombia, Guatemala, the Dominican Republic, and Belize have accelerated the adoption of such systems in the wake of the pandemic. Similarly, at the end of 2020, the Peruvian Ministry of Education (Minedu) launched Alerta Escuela, a web platform on which school principals and tutors can see each student's level of dropout risk and access guidance for working with those most at risk.

Although there are several EWS experiences, evidence on their effectiveness is limited. The World Bank and Minedu collaborated to design, implement, and evaluate Alerta Escuela, which gave us the opportunity to reflect on some of the critical decisions involved and to derive lessons, grounded in experience, for making these early warning systems more effective (see Implementation Evaluation and Impact Evaluation).

Considerations before implementing early warning systems

Ministry officials will need to make a number of critical decisions when implementing early warning systems. We have outlined some of the major decisions below.

National vs. targeted

Peru opted for a national scope for Alerta Escuela because there was massive concern about the potential increase in school dropouts due to the pandemic, even though the dropout rate for the previous 10 years was less than 3% in primary school and less than 5% in secondary school.

Governments should consider several factors before making this decision: the magnitude of the problem, heterogeneity in the territory, and the social or political context. Deploying an EWS in a targeted manner or with pilots can provide valuable lessons for informed progress. Areas and grades with higher dropout rates can be prioritized and their performance can be continuously evaluated. The profile of school principals and teachers as users, how the system responds to needs, and what adjustments should be made to increase its value for final users should also be evaluated.

Administrative vs. collected data

Alerta Escuela relied mainly on administrative data from Minedu and other government agencies, and included an extra module to collect information on remote education. However, one year after deployment, the implementation evaluation found that the data-entry requirement of this complementary module hurt users' perception of the tool. Because school principals focus on the costs of using the system, they consider it too time consuming, a platform that "only requests information." This could explain why only 7% of schools with students at high or medium risk have accessed the platform, and only 2% have downloaded the management and pedagogical guidelines.

The quantity and quality of data, and how they are collected, influence an EWS's ability to predict risk, generate trust among users, motivate use, and ultimately be effective. Ideally, already-available data should be used. Additional data collections can incorporate more recent data or variables absent from administrative systems, but they raise costs both for the data collection itself and for the users (teachers and school principals).

Simple vs. complex model

Robust educational information systems allow ministries of education to develop analytical models to predict dropout. For example, for Alerta Escuela, Minedu developed a predictive model based on machine learning techniques, in which predictive variables of dropout risk were incorporated according to the literature and available data: student characteristics, history of academic performance, family context, characteristics of the educational service, and socioeconomic information of the household (see AE Methodological Document).
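To make the idea concrete, the sketch below shows how a risk score of this kind can be computed and mapped to the risk bands shown to principals. It is a minimal illustration only: the feature names, weights, and band cutoffs are hypothetical assumptions, not Minedu's actual model, which is a machine-learning model trained on administrative data.

```python
import math

# Hypothetical feature weights -- purely illustrative. Alerta Escuela's real
# model uses variables like these (academic history, family context,
# characteristics of the educational service, household socioeconomics),
# but its actual variables and weights are not reproduced here.
WEIGHTS = {
    "prior_grade_repetition": 1.2,   # history of academic performance
    "low_attendance": 0.9,           # attendance below some threshold
    "rural_school": 0.4,             # characteristic of the educational service
    "low_household_income": 0.7,     # socioeconomic information of the household
}
BIAS = -3.0  # baseline keeps predicted risk low when no risk factors are present


def dropout_risk(student: dict) -> float:
    """Return a dropout-risk probability in [0, 1] via a logistic function."""
    score = BIAS + sum(w for feature, w in WEIGHTS.items() if student.get(feature))
    return 1.0 / (1.0 + math.exp(-score))


def risk_band(probability: float) -> str:
    """Map a probability to a low/medium/high band (cutoffs are illustrative)."""
    if probability < 0.1:
        return "low"
    if probability < 0.3:
        return "medium"
    return "high"
```

In practice a model like this would be trained (for example, with gradient boosting or logistic regression) rather than hand-weighted, but the output shown to a principal is the same: a probability collapsed into a small number of actionable risk bands.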

Despite the methodological rigor and the simple explanations of how risk is calculated, the implementation evaluation shows that school principals do not understand the risk score and do not even know which variables are used to obtain it. The more complex the model, the harder it may be for users to understand; more user training is required as complexity, and with it predictive capacity, increases. This is a trade-off that policymakers should take into account.

Lessons from Peru’s experience

Some key lessons emerged from Peru’s experience implementing Alerta Escuela, which ministries and schools should consider when implementing similar systems.

Early warning systems should be simple to understand and use

Based on the results of focus groups, we found that the few principals who downloaded the guidelines shared them with their teachers without reviewing or explaining them. Principals do not read the guidelines because they "are too long." Therefore, the platform should be user-centered by design, training materials should rely more on images, and resources should be easy to find within the platform.

The benefits of use must outweigh the costs and limitations

Principals had difficulty understanding Alerta Escuela’s potential, focusing its use on reporting information monthly rather than on deploying preventive actions. The benefits and costs of use vary greatly across schools, depending in part on the number of students and on connectivity (36% of educational facilities have internet access). In schools with few students, the system could be more of a burden than a support, since following up on each case individually is more efficient. In areas with limited connectivity, other channels, such as monthly reports via SMS or WhatsApp, can deliver information more effectively. Support from the Ministry, especially at the beginning, could help principals internalize these benefits and costs.

Ensure a continuous cycle of evaluation

In 2021, Minedu conducted a qualitative evaluation that shed light on awareness and implementation of Alerta Escuela. It also evaluated the impact of an SMS strategy based on behavioral principles, finding that reminders significantly increased access to and use of the EWS; however, increased access and use did not translate into a higher propensity to take preventive actions, and as a result there was no reduction in school dropouts. The ministry reviews the estimates yearly to update the predictive model. Given how much the context has changed since Alerta Escuela was designed and evaluated, from distance education to fully face-to-face schooling, it is imperative to keep evaluating the system's usefulness and effectiveness.

Finally, the lessons of Alerta Escuela highlight the importance of framing public policy decisions in an iterative process of design, implementation and evaluation, focused on understanding the needs of users, and ensuring the effectiveness of the solutions deployed.
