1 of 56

Framework and Methodology of Implementation Research

Ari Probandari

Facilitator of Implementation Research Course, Regional Training Centre for WHO TDR SEARO-WPRO Region, FKKMK UGM

Professor of Public Health, Faculty of Medicine, Universitas Sebelas Maret, Surakarta

2 of 56

Research Methodology

Theoretical framework

Research design

Population and study participants

Data collection

Data analysis

3 of 56

Common IR Frameworks

Consolidated Framework for Implementation Research (CFIR)

RE-AIM

NIRN stages of implementation

ADAPT-ITT

WHO ExpandNet

4 of 56

Consolidated Framework for Implementation Research (CFIR) (https://cfirguide.org)

  • 2009 version and revised 2022 version
  • “The CFIR is intended to be used to collect data from individuals who have power and/or influence over implementation outcomes.”

5 of 56

RE-AIM (https://re-aim.org)

Reach

Effectiveness

Adoption

Implementation (consistency, cost, adaptations)

Maintenance/sustainment

6 of 56

(https://re-aim.org)

7 of 56

How can I use RE-AIM?

  • Focus on one step or all steps
  • Used for program planning or evaluation
  • RE-AIM may be measured qualitatively or quantitatively
  • Used retrospectively or prospectively

(Glasgow et al. 1999)
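The RE-AIM dimensions above are typically reported as simple proportions. The sketch below is a hypothetical worked example (all numbers and the `proportion` helper are illustrative, not from the lecture) of how each dimension reduces to a numerator over a denominator:

```python
# Hypothetical RE-AIM calculations for an imagined health program.
# Every figure below is illustrative only.

def proportion(numerator, denominator):
    """Return a proportion, guarding against a zero denominator."""
    return numerator / denominator if denominator else 0.0

# Reach: participants enrolled / eligible target population
reach = proportion(640, 1000)

# Effectiveness: participants with the desired outcome / participants enrolled
effectiveness = proportion(512, 640)

# Adoption: settings delivering the program / settings invited
adoption = proportion(18, 24)

# Implementation: protocol components delivered as intended / components planned
implementation = proportion(9, 10)

# Maintenance: settings still delivering at follow-up / settings that adopted
maintenance = proportion(12, 18)
```

In practice each denominator must be defined before data collection (who counts as "eligible", which settings were "invited"), since the choice of denominator drives the reported proportion.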

8 of 56

Diffusion of Innovation Theory (Rogers 1962)

9 of 56

Five main factors that influence adoption of an innovation (Rogers 1962)

Relative Advantage

The degree to which an innovation is seen as better than the idea, program, or product it replaces.

Compatibility

How consistent the innovation is with the values, experiences, and needs of the potential adopters.

Complexity

How difficult the innovation is to understand and/or use.

Trialability

The extent to which the innovation can be tested or experimented with before a commitment to adopt is made

Observability

The extent to which the innovation provides tangible results

10 of 56

Other frameworks

  • Implementation fidelity (Carroll et al. 2007)

11 of 56

Other frameworks

  • Implementation sustainability (Schell et al. 2013)

12 of 56

Adapting Frameworks/Models


Using some pieces of a larger model

Tailoring a model to fit the program and context

Combining frameworks

Identify models from the literature, presentations, experts, etc.

Most commonly used in implementation research

13 of 56

Common study designs in IR

Pragmatic Trials

Effectiveness-Implementation Hybrid Trials

Stepped Wedge Cluster RCT

Quality Improvement Studies

Participatory Action Research

Realist Review

Mixed Methods

14 of 56

Considerations of IR study design

  • The selection of study design should consider the purpose/research questions
    • Will any implementation strategies be tested?
  • Consider other things:
    • Target Population(s)
    • Cost/Personnel
    • Setting
    • Data
    • Stakeholders
    • Feasibility
    • Logistics
    • Timeline
    • Ethics

15 of 56

Explanatory vs. Pragmatic Trials (Schwartz and Lellouch, 1967)

Explanatory trials

    • Optimized to determine efficacy
    • Small sample size
    • Highly selected participants
    • May overestimate benefits
    • May underestimate harms

Pragmatic Trials

    • Show the real-world effectiveness of the intervention in broad patient groups
    • Seek to maximize variability in implementation (settings, providers, patient characteristics, resources, time)

(The explanatory trial corresponds to the classic clinical trial.)

16 of 56

    • Explanatory trials

Can this intervention work under ideal conditions?

    • Pragmatic trials

Does this intervention work under usual conditions?

17 of 56

Key Outcomes in Pragmatic Trials

  • Major life events
    • death, hospital admissions, etc.
  • Treatment effects
  • Safety of intervention
  • Symptoms
  • Disability
  • Quality of life

18 of 56

How ‘pragmatic’ is the trial?

19 of 56

The PRECIS Wheel

1: very explanatory

2: rather explanatory

3: equal

4: rather pragmatic

5: very pragmatic

BMJ 2015;350:h2147
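The PRECIS wheel rates each design domain on the 1-to-5 scale above and summarizes the trial's overall leaning. The sketch below is a hypothetical scoring example (the domain names follow the PRECIS-2 tool; the ratings are invented for illustration):

```python
# Illustrative PRECIS-2 scoring: each domain is rated 1 (very explanatory)
# to 5 (very pragmatic). All ratings here are made up for the example.
ratings = {
    "eligibility": 4,
    "recruitment": 5,
    "setting": 4,
    "organisation": 3,
    "flexibility (delivery)": 4,
    "flexibility (adherence)": 5,
    "follow-up": 4,
    "primary outcome": 5,
    "primary analysis": 5,
}

mean_score = sum(ratings.values()) / len(ratings)
leaning = ("pragmatic" if mean_score > 3
           else "explanatory" if mean_score < 3
           else "equal")
```

Note that the wheel is usually presented visually, domain by domain, rather than averaged: a single low-rated domain (e.g., highly selected eligibility) can matter more than the mean suggests.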

20 of 56

BMJ | 20 December 2008 | Volume 337

21 of 56

Common study designs in IR

Pragmatic Trials

Effectiveness-Implementation Hybrid Trials

Quality Improvement Studies

Participatory Action Research

Realist Review

Mixed Methods

22 of 56

Effectiveness-Implementation Hybrid Trials

Explanatory Trial

Pragmatic Trial

Hybrid Designs

23 of 56

Effectiveness-Implementation Hybrid Trials

  • Combine elements of effectiveness and implementation research
  • Intervene in and/or observe the implementation process as it actually occurs
    • assessing implementation outcome variables
  • Allow researchers to simultaneously evaluate the impact of interventions introduced in real-world settings and the implementation strategies used to deliver them
  • Enable rapid translational gains for clinical interventions

24 of 56

Type of Hybrid Designs

    • Test the effects of a health intervention on relevant outcomes while observing and gathering information on implementation.

Type 1

    • Dual testing of health interventions and implementation strategies.

Type 2

    • Test an implementation strategy while observing and gathering information on the health intervention’s impact on relevant outcomes.

Type 3

25 of 56

26 of 56

RQs for Hybrids Type 1

What are potential barriers and facilitators to “real-world” implementation of the intervention?

What problems were associated with delivering the intervention during the clinical effectiveness trial and how might they translate or not to real-world implementation?

What potential modifications to the clinical intervention could be made to maximize implementation?

What potential implementation strategies appear promising?

27 of 56

General conditions

1. there should be strong face validity for the clinical intervention that would support applicability to the new setting, population, or delivery method in question;

2. there should be a strong base of at least indirect evidence (e.g., data from different but associated populations) for the intervention that would support applicability to the new setting, population, or delivery method in question;

3. there should be minimal risk associated with the intervention, including both its direct risk and any indirect risk through replacement of a known adequate intervention.

28 of 56

Additional conditions for Hybrids Type 2 & 3

(4) there should be “implementation momentum” within the clinical system and/or the literature toward routine adoption of the clinical intervention in question.

(5) there should be reasonable expectations that the implementation intervention/strategy being tested is supportable in the clinical and organizational context under study

(6) there is reason to gather more data on the effectiveness of the clinical intervention

29 of 56

Stepped Wedge Trial Design

  • An RCT classified by the way participants are exposed to the intervention
  • The intervention is rolled out sequentially to the trial participants (either as individuals or clusters of individuals) over a number of time periods
  • Allocation to the crossover order is random, but by the end all participants receive the intervention
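The sequential, randomized rollout can be sketched as a crossover schedule: clusters are randomly ordered, then assigned the period in which they switch from control to intervention. This is a minimal illustration (cluster names, step count, and the `stepped_wedge_schedule` helper are all hypothetical):

```python
import random

def stepped_wedge_schedule(clusters, steps, seed=0):
    """Map each cluster to the period in which it crosses over from control
    to intervention. The crossover order is randomised; by the final period
    every cluster is receiving the intervention."""
    rng = random.Random(seed)
    order = list(clusters)
    rng.shuffle(order)
    per_step = -(-len(order) // steps)  # ceiling division: clusters per step
    return {cluster: 1 + i // per_step for i, cluster in enumerate(order)}

# Six clusters rolled out over three periods: two clusters cross over per period.
schedule = stepped_wedge_schedule(["A", "B", "C", "D", "E", "F"], steps=3)
```

The key property, visible in the schedule, is that randomization governs only *when* a cluster crosses over, not *whether* it ever receives the intervention.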

30 of 56

31 of 56

Stepped Wedge Trial Design

  • Useful when logistical, practical or financial constraints rule out simultaneous rollout
  • Trade-offs: longer trial duration, risk of contamination between exposed and unexposed clusters, and blinding is usually not possible

32 of 56

33 of 56

Common study designs in IR

Pragmatic Trials

Effectiveness-Implementation Hybrid Trials

Quality Improvement Studies

Participatory Action Research

Realist Review

Mixed Methods

34 of 56

Langley et al. 2009

35 of 56

Quality Improvement Study

  • Time series studies with pre-post assessment
  • Multiple time series studies
    • Baseline and intervention repeated at different times (equivalent time series - ABAB)
    • Interventions lagged across different groups at different times (multiple baselines- AAAB, ABAB, ABBB, etc.)
  • Factorial design studies
    • Intervention is randomized to groups to compare time series
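The core comparison in a pre-post time-series study is the change in the indicator's level between the baseline (A) phase and the intervention (B) phase; the ABAB and multiple-baseline variants above repeat this comparison across phases or groups. A minimal hypothetical example (all data invented):

```python
from statistics import mean

# Hypothetical monthly quality indicator (e.g., % of records meeting a
# documentation standard), six months before and after a practice change.
baseline = [62, 64, 61, 63, 65, 62]      # phase A: before the change
intervention = [71, 74, 73, 76, 75, 77]  # phase B: after the change

# Simple level change between phases; a fuller interrupted-time-series
# analysis would also model trends within each phase.
level_change = mean(intervention) - mean(baseline)
```

Multiple observations per phase, rather than a single before/after pair, are what let the analyst distinguish a genuine level change from ordinary month-to-month variation.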

36 of 56

37 of 56

Common study designs in IR

Pragmatic Trials

Effectiveness-Implementation Hybrid Trials

Quality Improvement Studies

Participatory Action Research

Realist Review

Mixed Methods

38 of 56

Participatory Action Research

  • Carried out with and by local people rather than on them
  • Assigns power and control over the research process to the participants
  • Involves an iterative process of reflection and action

39 of 56

40 of 56

41 of 56

Common study designs in IR

Pragmatic Trials

Effectiveness-Implementation Hybrid Trials

Quality Improvement Studies

Participatory Action Research

Realist Review

Mixed Methods

42 of 56

Mixed Methods Design

  • Collection and integration of qualitative and quantitative data
  • Good approach for understanding processes, context and complexity
  • Can embed within other types of study designs
  • Qualitative
    • Purposive sampling, data saturation
    • Data collection: Interviews, focus groups, etc.
    • Qualitative data analysis: thematic, content etc.
  • Quantitative Data Collection:
    • Random sampling is preferable, sample size calculation
    • Data collection: surveys, experimental studies, etc.
    • Data analysis: statistical analysis
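The sample-size calculation mentioned for the quantitative strand often uses the standard formula for estimating a single proportion, n = z²·p(1−p)/d². A small sketch (the default values are the common conservative choices, not prescribed by the lecture):

```python
import math

def sample_size_proportion(p=0.5, d=0.05, z=1.96):
    """Minimum sample size for estimating a proportion p with margin of
    error d, at the confidence level implied by z (1.96 ~ 95%).
    p = 0.5 is the conservative choice that maximises the required n."""
    return math.ceil(z * z * p * (1 - p) / (d * d))

n = sample_size_proportion()  # 385 with the conservative defaults
```

Loosening the margin of error to d = 0.10 cuts the requirement to about a quarter of that, which is why the margin of error is usually the first parameter negotiated when resources are tight.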

43 of 56

Integration in Design

44 of 56

Embedded design

  • Collect qualitative and quantitative data to obtain broader and more comprehensive understanding of context and processes
  • Conduct one study within the other type of study design (e.g., qualitative within quantitative study design or quantitative within qualitative study design)
  • Qualitative + Quantitative
  • Concurrent data collection

45 of 56

Explanatory design

  • Qualitative data helps explain or build on initial quantitative results
  • Use qualitative to explain atypical quantitative results
  • Use quantitative participant characteristics to guide sampling for qualitative data collection
  • Quantitative → Qualitative
  • Sequential data collection

46 of 56

Exploratory design

  • Quantitative data help explain or build on initial qualitative results
  • Exploration is needed due to lack of available data, limited understanding of context, and/or few available measures/instruments
  • Qualitative → Quantitative
  • Sequential data collection

47 of 56

Integration in data analysis/interpretation

  • Intramethod analysis (consider linkages, contrasts, and interpretations within each database separately)
  • Core merging analysis (consider linkages, contrasts, and interpretations across the databases)
  • Advanced merging analysis (linking or intersecting the databases and interpreting the analysis using complex procedures, often with computer software)
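At its simplest, merging across databases means pairing the quantitative result and the qualitative finding for each construct, which is also the structure behind the joint displays shown later. A hypothetical sketch (constructs, values, and themes are all invented):

```python
# Hypothetical merged analysis: pair quantitative results and qualitative
# themes by shared construct, the basic structure of a joint display.
quant = {
    "adoption": {"uptake_rate": 0.75},
    "fidelity": {"checklist_score": 8.2},
}
qual = {
    "adoption": ["leadership buy-in seen as decisive"],
    "fidelity": ["staff adapted steps under time pressure"],
}

# One row per construct that appears in both databases.
joint_display = {
    construct: {"quantitative": quant[construct], "qualitative": qual[construct]}
    for construct in quant.keys() & qual.keys()
}
```

Organizing both strands around shared constructs up front makes the later interpretation step (do the strands converge, diverge, or expand on one another?) far easier.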

48 of 56

Integration in data interpretation

49 of 56

50 of 56

51 of 56

Exploratory sequential joint display

Guetterman TC, Fetters MD, Creswell JW. Integrating quantitative and qualitative results in health science mixed methods research through joint displays. Annals of Family Medicine 2015; 13(6)

52 of 56

Exploratory sequential joint display

Guetterman TC, Fetters MD, Creswell JW. Integrating quantitative and qualitative results in health science mixed methods research through joint displays. Annals of Family Medicine 2015; 13(6)

53 of 56

Explanatory sequential joint display

Guetterman TC, Fetters MD, Creswell JW. Integrating quantitative and qualitative results in health science mixed methods research through joint displays. Annals of Family Medicine 2015; 13(6)

54 of 56

Cross-case comparison joint display

Guetterman TC, Fetters MD, Creswell JW. Integrating quantitative and qualitative results in health science mixed methods research through joint displays. Annals of Family Medicine 2015; 13(6)

55 of 56

GRAMMS: Good Reporting of a Mixed Methods Study

Describe the justification for using a mixed methods approach to address the research question.

Describe the design in terms of the purpose, priority and sequence of methods.

Describe each method in terms of sampling, data collection and analysis.

Describe where the integration has occurred, how it has occurred, and who has participated in it.

Describe any limitations derived from associating one method with another.

Describe any insights gained from mixing or integrating methods.

56 of 56

Thank You

Please contact for further discussion: ari.probandari@gmail.com