“Matchmaking” with Researchers to Conduct Impact Evaluations on Protection Programming
Thank you for your interest in collaborating on an impact evaluation as part of IPA and J-PAL’s Humanitarian Protection Initiative (HPI). The objective of this form is to learn about organizations that a) have projects that they would like to evaluate using a rigorous impact evaluation, b) would like to partner with researchers to conduct the evaluation, and c) expect to benefit from the results of such an evaluation. We will use this information to reach out to researchers to explore their interest in a research partnership.
To help us understand whether we may find a suitable match for your organization, please provide the information below about projects that your organization is implementing or will implement, which could potentially be evaluated with a rigorous impact evaluation. We expect that most evaluations supported through HPI will be randomized evaluations. Randomized evaluations are a type of rigorous impact evaluation that involves the random assignment of a program into intervention and comparison groups, e.g. by a lottery. In cases where a randomized evaluation is not possible, we can consider supporting alternate designs.
Below are some questions to consider as you assess your team's interest in participating in the matchmaking process. HPI staff from J-PAL and IPA will review submissions, reach out with any additional questions, and attempt to match promising opportunities for rigorous impact evaluations with researchers as such opportunities arise.
Considerations
Team capacity: Generally, does your team have the time and capacity to engage with researchers, especially to do the up-front work of designing and setting up a research project?
Funding: Do you have sustained access to implementation funding? Researchers are typically responsible for securing funding for any costs related to the evaluation itself (data collection, analysis, etc.), but a partnership requires the implementing partner to commit the financial and logistical capacity to implement the program being evaluated.
Suitability: A few factors in particular help us assess whether a program is suitable for a rigorous evaluation.
Number of program participants or other implementation “units”: To meaningfully detect any difference in outcomes between those who receive the program and those in the comparison group, the study needs enough people, communities, businesses, etc. (depending on the level at which the program is delivered). For programs delivered to individuals, we generally find that the number of research participants, including the comparison group, ranges from a few hundred to a few thousand. The exact number depends on the program's design and expected impact, as well as on evaluation design choices that researchers will be well suited to discuss as part of a prospective partnership.
Timeline: Another key consideration is the project timeline; conducting a rigorous impact evaluation typically requires collaboration prior to the beginning of a program, to ensure the research design is in place before people in the study receive the program. If a program has already started or is about to start, it usually makes sense to wait until a future iteration of the program to conduct a rigorous impact evaluation.
Commitment to running research: Research collaborations require institutional buy-in. For instance, there needs to be commitment to the research design, collaboration throughout the implementation, and a willingness to share data about the cost of the program. For those conducting a randomized evaluation, the willingness of all partners to randomize who receives the program is a key factor to consider. There are many ways in which researchers can help identify the most suitable design, but in principle, partners need to be open to randomization.
Policy relevance and HPI scope: Please carefully read through the call for proposals to ensure that the project you wish to evaluate is in scope for support under the HPI. Policymakers and practitioners must need more or better information to make or influence decisions in this area. Policy-relevant projects also yield useful information about the cost-effectiveness of programs and opportunities for scale. As such, the expected “lessons learned” from the evaluation should have generalizable implications and potential for relevance beyond this case.
Team diversity and accountability to affected populations: We highly encourage proposals from teams with researchers or other research team members with lived experiences related to displacement, and with team members from the countries where the project will take place.
About HPI
The Humanitarian Protection Initiative is a collaboration between J-PAL and Innovations for Poverty Action, with generous support from UK International Development from the UK government. HPI will generate and share new evidence to inform policy and practice to effectively improve protection outcomes by keeping those who are affected by armed conflict safe from violence, coercion, and deliberate deprivation, while ensuring their dignity and rights. Read more from the HPI team »
Next Steps
Please note that teams are required to partner with independent academics to apply. As such, you can expect the academics to be motivated by the ways in which a prospective partnership might contribute to the broader academic literature, in addition to improving the effectiveness of social programs. We do not operate a consultant model and do not have a relationship with the researchers in which we commit their time and contribution to a research partnership. The researchers have full control over which projects they take on and we cannot guarantee that any particular impact evaluation opportunity can be “matched” to a researcher.
However, IPA and J-PAL staff have experience in vetting potential research opportunities, circulating a subset of promising opportunities to researchers, and, where partnerships are formed, having research teams pursue evaluation funding jointly from HPI or other sources. Please note that this process may take time. We are committed to carefully assessing each submission, and will let you know whether or not we were able to identify an interested researcher within eight weeks of your submission.
If you have any questions, please note them in the final question box or reach out to: dli@povertyactionlab.org