Menu of Evidence-based Approaches to Support Foundational Literacy and Numeracy
Introduction

More than half of all children in low- and middle-income countries do not learn to read with comprehension by age 10, despite the ambition of Sustainable Development Goal 4 for "inclusive and equitable quality education and lifelong learning opportunities for all." This high rate of Learning Poverty is just one indicator of the wide learning gaps that prevent education from providing the opportunity it should. We need to understand not just what is effective at getting more children into school, but also how to improve learning outcomes once children are there. Given the scale of the challenge, resources within each country need to be directed to the most cost-effective approaches possible. The investment over the past decade in research on cost-effective ways to improve learning gives us an opportunity to increase the value for money of education programs.

This menu comes at a time when the UNICEF education strategy has shifted to focus on foundational literacy and numeracy (FLN). The strategy articulates an ambition to enable 78 million children aged 10 (up from 24 million) in 74 countries to be ready to succeed at school by 2030, in both development and emergency contexts. The FLN initiative is being launched as COVID-19 has centered attention on a looming education crisis. We see this as a moment of opportunity to "build back better," to model quickly what "good" looks like, and to build "modern" education systems.

A forthcoming special issue led by UNICEF in the International Journal of Educational Development (IJED) lays out the scale of the challenge and potential solutions.

How and when to use this menu

What is provided in this menu: (a) a filterable menu organized by FLN barrier; (b) an evidence review of effective and ineffective approaches; and (c) links to full "evidence memos" with source evidence and implementation details.

In the "Filterable Toolkit" tab, we include a tool that enables the user to filter by context-specific barrier, such as "Children's differential needs are not being met in the classroom." The selected barrier then returns the results of an in-depth evidence synthesis of both effective and ineffective programs. For example, this barrier returns the evidence categories "targeting instruction to a child's learning level rather than age or grade" and "mother tongue instruction," among others. Each evidence category reports results against five dimensions: evidence strength, cost-effectiveness, FLN domain, an evidence to action takeaway, and a link to learn more, covering the source evidence (e.g., source papers), contextual considerations, generalizability across contexts, specific examples, and implementation details and program packages. Each barrier is also tagged with a barrier type in brackets, aligned to the framework used in UNICEF's bottleneck diagnostic analyses: [S] refers to Supply, [D] to Demand, [Q] to Quality, and [EE] to Enabling Environment. The "Full Menu" tab includes the entire synthesis in one table.

For context needs assessment and prioritization, diagnostic options include bottleneck analyses. To help decide on the relevant action (e.g., proof of concept, scale-up, evidence generation), we include an "evidence to action takeaway" column for each evidence category. To monitor learning continuously, inexpensive, high-frequency learning assessment options include MICS, ASER/Uwezo, and EGRA/EGMA.
Assessing the Evidence

In this menu, we review interventions based on their effectiveness at improving learning outcomes. We also provide guidance on the types of contexts in which a specific intervention is likely to be useful in improving learning. Even the best interventions will not be effective if they address a problem that is not present in a given context or are implemented poorly.

Matchmaking contextual needs and conditions with the evidence of effective solutions is at the heart of the Generalizability Framework approach.

We draw on a series of recent academic studies and reviews, as well as a new effort by the Building Evidence in Education (BE2) group, of which UNICEF is a core member, to identify "Smart Buys" to improve foundational learning.

We review a range of rigorous research in education, primarily from low- and middle-income countries. This evidence review is rooted in bodies of evidence rather than single studies. Despite the rapid growth of the evidence base in education, there are many important interventions for which rigorous, actionable evidence is still in short supply; these interventions are still reviewed and grouped under "more evidence needed." Synthesizing the evidence involved both quantitative data and expert judgement capturing qualitative insights and the importance of context, scale, and equity.

Audience and use cases

This evidence menu is meant to be a public good for ministries and national governments, as well as multilateral and bilateral development organizations. Example use cases include (a) public expenditure reviews; (b) program documents and directly supported proofs of concept; and (c) technical assistance and policy review.

Evidence Memos

The evidence memos are linked in the "learn more" section and provide deep dives into each program or policy, including a detailed review of the evidence, specific citations to all study references, analysis of how to adapt and contextualize programs across contexts, and implementation details and toolkits where they are available. In addition, J-PAL is available to make linkages to the original authors and implementers for direct conversations and technical assistance.
Criteria and factors considered

Equity

A focus of the evidence reviewed is on interventions that improve education outcomes for the most marginalized students and for students at scale, which means schooling and learning for all.

Outcomes

This synthesis focuses on identifying the interventions that are most cost-effective at improving learning in basic education, measured in terms of core cognitive skills (typically literacy and numeracy). Because improving learning has proved far more challenging than expanding access, this menu focuses on that goal. In cases where impacts on such cognitive skills are not measured, such as in early childhood development, we have relied on proxies such as effects on school readiness.

Scale

Given the scale of the learning crisis, and UNICEF's in-depth collaboration with governments around the world, we prioritize approaches that have been tested at large scale and/or have the potential to scale.

System reform

To address the learning crisis, a suite of reforms will be needed to achieve long-term system reform. The approaches and solutions outlined in this menu are only one part of this system-wide reform; no single approach is a silver bullet. Many of these approaches are complementary and can be combined to achieve greater impact. Of note, while long-term reform is the ultimate goal, individual approaches can still close substantial learning gaps on their own and, in the process, catalyze long-term systems change.

Shifting current spending versus allocating new spending

Even small changes to "business as usual" approaches in education can often be more cost-effective than adopting entirely new approaches. Many education systems have large line items and expenditures on approaches reviewed in this evidence menu. Rather than allocating new funding, simply shifting current spending could achieve substantial learning progress. For example, popular interventions frequently adopted by ministries of education, such as providing inputs only (e.g., computers, tablets, school grants, extra textbooks) or general-skills teacher training, have very little effect on learning outcomes. In contrast, adaptive educational technology software (rather than hardware) that targets instruction to the level of the child can have large and cost-effective impacts on learning. Shifting existing Information and Communication Technology (ICT) line items in ministry budgets from more hardware to better software could have highly cost-effective returns.

State of the evidence

This toolkit reflects the current state of the evidence reviewed across multiple disciplines, largely education and economics. Over time, as more studies become available, this menu will also evolve.

Terms and definitions

Evidence Strength: We include the following categories to capture effectiveness and level of evidence. "Strong evidence of effectiveness" indicates that a lot of evidence exists and the approach is typically effective. "Promising evidence but some uncertainty" indicates high potential, with perhaps a few studies showing strong effects but other studies showing minimal effects; the uncertainty may arise because the intervention is highly contextual or because results vary with the difficulty of implementation, among other factors. "Strong evidence of ineffectiveness" indicates that a lot of evidence exists and the approach is typically ineffective. "More evidence is needed" indicates a topic that is of central importance or a core component of many education systems, yet has minimal or mixed evidence to date.

Cost Effectiveness: This dimension combines effectiveness information with cost data, either taken directly from a given set of studies or estimated based on the nature of the intervention; for example, information interventions are known to be typically very cheap. We include multiple categories. "High" indicates a program that is highly effective and relatively cheap, such as "teaching at the right level," or moderately effective but very cheap, such as information interventions. "Medium" indicates a program that is highly effective but expensive, or somewhat effective and cheap, or may indicate high variation in cost-effectiveness among the programs included in the category. "Low" indicates a program that is somewhat effective and expensive. "Unknown" indicates a category with few costed impact evaluations to date.

Domain: The domains refer to the critical intervention points and actors targeted for change:
Leaders who lead: education system leaders (e.g., in ministries of education)
Parents who prioritize: informed parents who demand and access quality education
Teachers who teach: teachers who implement effective pedagogy
Learners who learn: learners engaged in foundational skill building

Evidence to Action Takeaway: The evidence to action takeaway categories include (a) pilot/scale and (b) test/research. Approaches with evidence of high and consistent effectiveness are recommended for direct action through pilots to demonstrate local proof of concept and generate momentum in-country; scale-up is recommended especially if an existing country effort is operational and ready for scale. For approaches with limited evidence, mixed evidence, or high variation in effectiveness in the literature, more evidence generation is recommended to close evidence gaps. These takeaways are meant only as a guide rather than a definitive recommendation. In some cases, even effective and well-studied interventions might benefit from further research, for example to test pathways to scale or to optimize programs for cost-effectiveness.
Acknowledgements: this menu was produced by Noam Angrist, Radhika Bhula, and Sam Friedlander in close consultation with John Floretta, Anne Healy, and Karthik Muralidharan from J-PAL, and Kenneth Russell, Atif Rafique, Manuel Cardoso, Pragya Dewan, Hsiao Chen Lin, and Rob Jenkins from UNICEF. We thank Shiraz Chakera from UNICEF and Rachel Hinton from FCDO for inputs and useful discussions.