Each ToC step / assumption below is described with a one-sentence summary, existing indicators, and planned additional indicators (high-confidence) / possible additional indicators (more speculative).
ToC step / assumption: Suvita collects phone numbers & DoBs for eligible children in a geography and enters this data into Telerivet.
One-sentence summary: We are quite confident that we are successfully enrolling a large majority of children born in public health facilities, and that their data (phone numbers and dates of birth) are generally accurate. We plan to establish more robust ongoing monitoring (rather than one-time assessment) of these indicators, and to investigate and more precisely describe who our programme does not cover (because they aren't present in our data sources).
Existing indicators:
How complete is the data?
- We know 56.9% of children in Bihar (BH) and 55.8% in Maharashtra (MH) are born in public health facilities (our eligible population) [NFHS-5 BH; NFHS-5 MH]
- In MH, we understand that a small percentage of additional children, very likely born in private facilities, are also in the RCH (Reproductive and Child Health) database from which we enroll families.
- BH: In 2022 we completed a one-time assessment of how many children recorded in (public health facilities') labour room birth registers were retained through the Suvita data entry pipeline to enrollment. Monitoring indicators:
  - 0.1% of captured photos were blurry or illegible on first capture (but these were later routinely rectified)
  - 9.18% of children in the health registers don't have any recorded phone number (so can't be enrolled)
  - 2.02% of recorded numbers in facility registers are invalid (e.g. wrong number of digits, or starting with an invalid digit)
  - 0.02% of finally-enrolled numbers (after existing reconciliation & validation measures) were incorrect
So overall we successfully enroll ~90% of children from our eligible sample in BH (see the rough reconciliation sketched after this list).
- MH: We are currently working on a one-off assessment of whether our coverage to date includes all babies who are eventually enrolled in the RCH database.
- In both states, we aim to enroll children before their first reminder for their 6-week vaccines. In MH, 80% of children are enrolled in time for their first vaccine reminder and 98% in time for their second. In BH, 98% of children are enrolled in time for their first reminder.
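A rough reconciliation of the ~90% BH figure, assuming the losses above compound approximately independently (the underlying assessment may combine them differently):
(1 − 0.0918) × (1 − 0.0202) × (1 − 0.0002) ≈ 0.890, i.e. roughly 89-90% of eligible children end up successfully enrolled.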
How accurate is the data?
- We have passively monitored data accuracy (of the government's own RCH database in MH / of labour ward registers in BH) when calling our SMS users during subsequent surveys. Anecdotally we are typically able to reach ~50% of users for these surveys, of whom ~95% report personal data matching that in the database.
Planned / possible additional indicators:
- Exploration of who (demographically) is excluded from our SMS programmes, through statistical analysis of NFHS-5 data (comparing vaccination rates and patterns of undervaccination between those born in public health facilities vs those born outside facilities)
- Automate data quality assurance checks and establish protocols for manual checks (an illustrative sketch of such automated checks follows this list):
  - Calling a random selection of SMS phone numbers to ensure that the owner of the phone is actually the caregiver of the child.
  - Automating data quality processes to flag when a single phone number is registered to multiple children.
  - Automating routine monitoring protocols for how many children we "missed" enrolling even though they were included in government data sources (e.g. because they were added late or were missing a phone number).
- Through field visits, identify children's caregivers who do not receive Suvita SMSs. Then survey those caregivers to establish 1) why they are not enrolled (child born at home, no phone number recorded, or some administrative data error) and 2) their child's vaccination status.
- More broadly, be able to state with higher confidence what percentage of children within a block (or district) are not enrolled in our SMS program.
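The automated checks described above could look something like the following sketch. The file names, column names, and the sharing threshold are purely illustrative assumptions, not Suvita's actual Telerivet or RCH schemas:

```python
import pandas as pd

# Illustrative only: file and column names are assumptions, not Suvita's real schema.
enrolled = pd.read_csv("telerivet_contacts.csv")   # columns: child_id, phone, dob
gov_extract = pd.read_csv("rch_extract.csv")       # columns: child_id, phone, dob

# Check 1: flag phone numbers registered to many children.
# A shared family phone can be legitimate, so flagged rows go to manual review.
children_per_number = enrolled.groupby("phone")["child_id"].nunique()
heavily_shared = children_per_number[children_per_number > 3].index  # threshold is illustrative
flagged_contacts = enrolled[enrolled["phone"].isin(heavily_shared)]

# Check 2: children present in the government data source but not enrolled
# (e.g. added late, or with no phone number recorded).
missed = gov_extract[~gov_extract["child_id"].isin(enrolled["child_id"])]
missed_rate = len(missed) / len(gov_extract)

print(f"{len(flagged_contacts)} enrolled contacts share a heavily reused phone number")
print(f"{missed_rate:.1%} of children in the government extract are not enrolled")
```

Outputs like these would feed the manual-review protocols rather than triggering any automatic removal of contacts.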
ToC step / assumption: Suvita sends SMS reminders to caregivers and they are successfully received on caregivers' phones.
One-sentence summary: We are confident that 75-80% of messages, once sent, are delivered successfully to our users' phones. But we have identified some weaknesses in the final (manual) stages of enrollment that affected message delivery for a small minority of users. We plan to establish more robust ongoing monitoring of delivery rates by geography, and more systematic checks for upload errors (as well as taking actions, e.g. automation, to eliminate the opportunity for errors where possible).
Existing indicators:
- We are aware of 2 instances in the last year where there were errors in the final stage of routine data upload, affecting message delivery to 2% of total enrolled contacts across both states. Currently this is weakly 'monitored' by a) ad-hoc checks that messages for new cohorts are sending as expected and b) user feedback (i.e. if someone gets a message that they think is incorrect, they have our number to call and ask about it, which flags the error to us). Both errors were caught within a few months of cohort upload, but this is a focus for strengthened monitoring this year.
- Our SMS provider tracks delivery rates. In the last 180 days, 75% of sent SMSs in BH and 79% in MH were successfully received by the users' handsets.
Planned / possible additional indicators:
- Creation of a live dashboard summarizing SMS delivery rates by district and by date (a minimal sketch of the underlying aggregation is given after this list)
- More systematic checks for potential errors in the upload process
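The dashboard aggregation itself is straightforward. A minimal sketch, assuming a per-message status export from the SMS provider; the file and field names are illustrative, not the provider's actual schema:

```python
import pandas as pd

# Illustrative schema: message_id, district, sent_at, status ("delivered", "failed", ...).
msgs = pd.read_csv("sms_status_export.csv", parse_dates=["sent_at"])

msgs["delivered"] = msgs["status"].eq("delivered")
daily = (
    msgs.groupby([pd.Grouper(key="sent_at", freq="D"), "district"])["delivered"]
        .agg(sent="size", delivered="sum")
        .assign(delivery_rate=lambda d: d["delivered"] / d["sent"])
        .reset_index()
)

# One row per district-day; this table can feed a live dashboard directly.
print(daily.tail())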
ToC step / assumption: Caregivers read SMSs and absorb some information.
One-sentence summary: User feedback indicates that parents generally remember and read their SMSs from Suvita; report finding the messages useful; and would like to continue with the service. We plan to corroborate this with feedback from health workers, and to conduct in-depth interviews with some parents to better understand the mechanism of impact. We also plan to establish protocols for how often these surveys should be routinely conducted.
Existing indicators:
- We conduct periodic "Receiving, Understanding and Feedback" (RUF) surveys with users in both states. These cover various questions about how parents interact with the messages, e.g. do you remember the message, what did it say, can you read it to me, are the messages useful or not, would you like to continue receiving them? We consistently find that most caregivers recall receiving our messages and accurately recall (unprompted) that the content involved vaccinations. Most of those who recall receiving our messages state that they found the reminders useful and want to continue with the service.
- We conduct ad-hoc field visits where we talk to caregivers and healthcare workers about their experience of the SMS programme. We have collected various anecdotes from these discussions where people described specific instances in which the messages helped them or their patients.
Planned / possible additional indicators:
- In-depth qualitative interviews examining the mechanism through which SMS content changes behaviour.
- Additional follow-up (quantitative) surveys with caregivers, with questions updated based on the results of the qualitative interviews.
- In-depth qualitative interviews with ASHAs (Accredited Social Health Activists) about their observations of the impact of SMS content.
- Follow-up (quantitative) surveys with ASHAs about the SMS programme. This not only provides us with data that is not self-reported by caregivers, but also gives us a better understanding of the causal mechanism of our programme.
- Establishing protocols for how often follow-up surveys are conducted. For example, a follow-up survey could be conducted a certain number of months after expansion into a new district. Or, in the event of expansion into dozens of new districts in a short period of time, a policy as to which new districts get follow-up surveys (e.g. half of districts get a survey 3 months after expansion, and the other half 9 months after).
- From both interviews and surveys of caregivers: estimates of the respective attributable impact of reminders (nudging), increased knowledge, societal pressure, etc.
- Surveys measuring whether norms, knowledge, beliefs, and attitudes surrounding vaccination change [depends on validity of measures; power and effect size]
ToC step / assumption: Caregivers bring eligible children to vaccination appointments, and health workers vaccinate children.
One-sentence summary: Based on self-reporting from parents, 10-15% of our enrolled users state that they probably or definitely wouldn't have got their child vaccinated on time without our reminders. We plan to validate this with other sources, e.g. health workers (through surveys) and ideally, depending on the availability of usable data, through a difference-in-differences (DiD) analysis using government administrative data.
Existing indicators:
- RUF surveys also involve asking a question aiming to get at the counterfactual: “Do you think your child would have got vaccinated on time even if you had not received the SMS message from Suvita?” Results from RUF surveys in each state (N.B. these are not the most recent figures, as we haven't yet fully analysed the latest surveys, but they give some indication):
Bihar
- Sample size = 500 (up to 4 call attempts per person); giving 332 complete interviews
- Asked 197 people the ‘counterfactual question’
- 24 said their child would “probably not” have got vaccinated on time without the reminder and 31 said “definitely not”. Together this is 28% of the 197 who were asked the question or 11% of the total n=500 sample.
Maharashtra
- Sample size = 506 (up to 4 call attempts per person); giving 415 complete interviews
- Asked 206 people the ‘counterfactual question’
- 67 said “probably not” and 11 said “definitely not”. Together this is 38% of the 206 who were asked the question, or 15% of the total n=506 sample.
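These two samples are how the 10-15% range in the summary above is reached; the underlying arithmetic, using only the counts quoted above:
Bihar: (24 + 31) / 197 ≈ 28% of those asked; (24 + 31) / 500 = 11% of the full sample.
Maharashtra: (67 + 11) / 206 ≈ 38% of those asked; (67 + 11) / 506 ≈ 15% of the full sample.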

- We conduct ad-hoc field visits where we talk to caregivers and healthcare workers about their experience of the SMS programme. We have collected various anecdotes from these discussions where people described specific instances in which the messages helped them or their patients.

- We have done some ad-hoc exercises to increase our confidence about the supply of vaccines at clinic days (e.g. a survey of ASHAs in Bihar for the state government). We know that supply is generally strong (e.g. >90% of scheduled clinics are held), but we aim to develop a more deeply informed position on this point this year.
- We are currently (April 2023) working on an internal staggered difference-in-differences (DiD) impact analysis of the SMS + Ambassador programmes in Bihar (and potentially in Maharashtra too) using administrative data from our government partners, though we think it's most likely that the data will be too noisy to draw a clear conclusion.

Planned / possible additional indicators:
- We also plan on conducting in-depth (qualitative) interviews followed by a (quantitative) survey with caregivers on their interactions with vaccination services, including why their child has or has not been vaccinated. Within that, we plan to include questions on whether they faced 'supply side' issues in getting their child vaccinated - i.e. whether the vaccination clinic happened as scheduled, whether they faced harassment from healthcare workers, etc.
- In late 2023, conduct another internal DiD analysis of both programmes with an additional 6 months of data. We expect this analysis will have more statistical power due to a) a larger sample size, b) being against a less 'noisy' (non-pandemic) baseline, and c) less missing data (due to being outside the pandemic). An illustrative specification is sketched below.
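For concreteness, the kind of two-way fixed-effects specification such a staggered DiD typically estimates might look like the following. This is a sketch under assumed district-month administrative counts, not Suvita's actual specification, and "SMSactive" is a hypothetical variable name; staggered roll-outs may also warrant more robust estimators than plain two-way fixed effects:

Y_{dt} = \alpha_d + \gamma_t + \beta \cdot \text{SMSactive}_{dt} + \varepsilon_{dt}

where Y_{dt} is a vaccination outcome (e.g. timely doses administered) in district d in month t, \alpha_d and \gamma_t are district and month fixed effects, SMSactive_{dt} indicates whether the SMS programme was live in district d during month t, and \beta is the DiD estimate of the programme's effect.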