Edtech Efficacy Academic Research Symposium

Action Session Share Outs

Crowdsourcing Efficacy Research and Product Reviews

Challenge 1:

Creating transparent, representative, and easy-to-use ways for educational institutions to review ed-tech products

#ShowTheEvidence

Solution:

Increase the number of reviews so that different types of users can find reviews relevant to them, raising the overall value of the available reviews.

  • Identify incentives and motivations for reviewers
  • Identify use cases

First step: Evaluate existing rubrics, systems, and taxonomies

Challenge 2:

Efficacy research is difficult for educators to find

Solution:

Create a database of efficacy research

  • Develop a standard rubric for categorizing research
  • Transparency pledge
  • API for cross-listing results

First step: Develop a standard rubric for categorizing research

Funding for Research on EdTech Efficacy

How might we create comprehensive, sustainable funding models for developing and disseminating the levels of evidence needed to establish efficacy in ed-tech?

Bending and Vaulting

Bending

  • Rightsize the level of rigor for the problems we are researching
  • Develop the open-source space for surveys, research tools, data-collection tools/processes, use of data, interoperability, etc., including further development of the rapid research models already built (RCE Coach, Digital Promise Framework, Learning Assembly Taxonomy, PowerMyLearning, etc.)
  • Further develop the testbeds to ensure we have places to run tests (e.g., charter networks, the League of Innovative Schools, and regional EdClusters such as those in Pittsburgh, Chicago, Boston, and Kansas City)

FIRST STEP: Create a concept paper to articulate what the heck we are talking about.

Vaulting

  • Identify the challenges and issues the consortium will address
  • Scan the field for the right people and technologies to align with identified challenges
  • Identify the process and measurement

FIRST STEP: Determine the broad governance of the consortium

Institutional Competence in Evaluating Efficacy Research

The Challenge:

The disconnect between schools of ed and the ed-tech community...

leads to a lack of capacity among schools of ed to prepare educators for effective use of educational technology.

The Solution:

Improve the effective use of educational technology by building relevant knowledge, competencies, and connections among both faculty and students at schools of education

GOAL: Prepare both pre-service and in-service educators & leaders to be transformational users, consumers, and evaluators of ed tech

PROCESS:

  1. Integrate existing tools/frameworks for use and evaluation of ed tech with school of ed curricula.
  2. Learn from exemplar school districts (including through partnerships).
  3. Connect higher ed to professional networks and trainers, and agree upon competencies that are authentically assessed.
  4. Make research and case studies accessible and actionable to practitioners.
  5. Build capacity of faculty (broadly defined) at schools of ed.

IMMEDIATE NEXT STEPS:

  • Develop smart demand. Let schools of ed hear from stakeholders, including recent school-of-ed grads, expert practitioners, ed-tech providers, and professional associations.

  • Collaborate with experts outside of higher ed to develop more effective programs and training.

  • Amplify practices of and consult with lighthouse teacher training programs.

Role of Evidence in Higher Ed Decision Making

Understanding Learning:

Faculty (and students and administrators) need an understanding of how learning really works so they can make informed EdTech decisions.

Building Faculty Capacity:

Time, incentives, and training for faculty to engage in evidence-based learning evaluation and design

  • Access to data from learning platforms
  • Accessible, agreed-upon language about learning outcomes, developed with professional communities
  • Better measures of learning (of different types), and training so people can use those measures
  • Collaboration with vendors on evidence-based learning functionality and data

Company and Investor Views of Research

The challenge:

How might we clarify which outcomes, and which levels of evidence and efficacy, are appropriate to each group: product developers, educators, purchasers (schools/districts/universities), and policymakers?

The Solution:

  • Product user study → developers
  • Short-cycle evaluation → educators and purchasers
  • Longitudinal evaluation → policymakers

First Steps

  • Make a market map/landscape of the “research solutions” that already exist, to get the lay of the land and highlight gaps
  • Create an additional row on the Learning Assembly’s chart: “What constitutes a good case study/short-cycle evaluation/etc.?”
  • Build on the short-cycle evaluation created by the DoE and Mathematica to make it more readable and user-focused
  • Pilot design research options/recommendations

Role of Evidence in K12 District Decision Making

K-12 Use of Research:

How might we better support evidence-based ed-tech decision making among K-12 decision makers?

Connecting the dots

  • Dynamic, interactive database for matching districts with products and for communicating about implementation and product features at the district and teacher levels
    • Searchable by levels of research
    • Pulls in student demographics, teacher demographics, and implementation context, along with outcomes
    • Opportunities for sharing feedback at the district and teacher levels
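
As a rough illustration only, the record-and-filter idea behind such a database could be sketched as follows; all field names, the level scale, and the `search` helper are hypothetical, not part of the group's proposal:

```python
from dataclasses import dataclass, field

@dataclass
class ProductEntry:
    """One hypothetical record in the district–product matching database."""
    product: str
    research_level: int  # assumed scale, e.g. 1 = case study ... 4 = longitudinal study
    student_demographics: dict = field(default_factory=dict)
    teacher_demographics: dict = field(default_factory=dict)
    implementation_context: str = ""
    outcomes: list = field(default_factory=list)
    feedback: list = field(default_factory=list)  # district- and teacher-level notes

def search(entries, min_research_level):
    """'Searchable by levels of research': keep entries at or above a threshold."""
    return [e for e in entries if e.research_level >= min_research_level]

entries = [
    ProductEntry("Product A", research_level=1),
    ProductEntry("Product B", research_level=3),
]
print([e.product for e in search(entries, min_research_level=2)])  # → ['Product B']
```

A real system would add the filters districts help define (see the next-steps slide), but even this sketch shows how demographic and implementation fields could ride along with each searchable record.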

Next Steps:

  • Identify a third party to act as the broker between researchers, product developers, and districts
  • Get commitment from stakeholder groups (including districts) to engage in the research
    • Identify incentives
    • Collaborate with districts to identify the key components (e.g., filters) that should be included
  • Create opportunities to broaden stakeholder voices
    • Content specialists
    • Teacher, student, and parent voice
