3. Public - STEM Evaluation Repository
Each entry in this repository records up to twenty fields: resource type; resource name; whether the resource is publicly available/accessible; how to access it; a brief description; whether a dataset is connected to the resource; whether psychometric or validation information is available; whether publications, reports, or supplemental materials are available; reference/citation information (where either of the previous two answers is "Yes"); relevant STEM content or disciplines; evaluation concepts; audience of evaluand; intervention or programming type; additional keywords or TIG alignments; domain; assessment type; appropriate respondent age; appropriate respondent population; administration time; and additional notes. Empty fields are omitted below.

Entry 1. A Survey for Measuring 21st Century Teaching and Learning: West Virginia 21st Century Teaching and Learning Survey
Resource type: evaluation tools (instruments, measures, scales, protocols)
Publicly available: Yes
Access: https://www.academia.edu/5901608
Description: This teacher survey is available for re-use in studies of 21st century teaching and learning. It has demonstrated excellent reliability, improving on reliable measures from previous studies (std. alpha > .90, inter-item correlations > .58). Support for content validity is based on review of existing frameworks and measures. Support for concurrent validity includes strong relationships to time spent using project-based learning.
Psychometric/validation information: Yes (the reliability and validity evidence summarized in the description above)
Publications/supplemental materials: Yes
References (non-peer-reviewed materials): Ravitz, J., Hixson, N., English, M., & Mergendoller, J. (2011). Using project based learning to teach 21st century skills: Findings from a statewide initiative. Vancouver, BC. https://www.academia.edu/1854322 ; Hixson, N., Ravitz, J., & Whisman, A. (2012). Extended professional development in project-based learning: Impacts on 21st century teaching and student achievement. Charleston, WV: West Virginia Department of Education. Retrieved from https://www.academia.edu/1999374
STEM content/disciplines: general/non-specific
Evaluation concepts: quantitative approach; outcome measures for teaching practices
Audience of evaluand: formal education; preK-12 education; higher education
Intervention/programming type: pedagogy; educator professional development
Domain: competence, skills, or reasoning; teaching practices related to 21st century skills
Assessment type: multiple choice
Respondent age: PK-12 instructor; college/university faculty
Respondent population: instructors
Administration time: short (<15 minutes) or moderate (15-45 minutes)
Notes: Used in a study of project-based learning; could be applied to other constructivist-oriented teaching interventions (inquiry, design-based learning, etc.) to show opportunities to learn 21st century skills. The unique feature of this survey is that each 21st century skill is defined and paired with behavioral examples before respondents are asked for their perceptions; the combination of definition and behaviors increases the validity of those perceptions.
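
The reliability figures quoted above follow a standard relationship: standardized Cronbach's alpha is determined by the number of items k and their mean inter-item correlation. A minimal sketch of that arithmetic (Python; the 10-item count is a hypothetical illustration, not a property of this survey):

    def standardized_alpha(k: int, mean_r: float) -> float:
        """Cronbach's standardized alpha for k items whose mean
        inter-item correlation is mean_r."""
        return k * mean_r / (1 + (k - 1) * mean_r)

    # Hypothetical example: 10 items inter-correlating at the .58 reported
    # above give alpha of about .93, consistent with the reported alpha > .90.
    print(round(standardized_alpha(10, 0.58), 2))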

Entry 2. My Environmental Education Evaluation Resource Assistant (MEERA)
Resource type: evaluation planning resources
Publicly available: Yes
Access: http://meera.snre.umich.edu/
Description: MEERA is focused on evaluation for environmental education programs and functions as a kind of online "evaluation consultant." The site provides information for project leaders, staffers, and environmental education practitioners, from basic background on the meaning and uses of evaluation to specific how-to guides, evaluation planning guides, and tools/instruments.
Connected dataset: No
Psychometric/validation information: No
STEM content/disciplines: general/non-specific; environmental education
Evaluation concepts: evaluation planning; evaluation training or capacity building
Audience of evaluand: informal or out-of-school-time education; lifelong learning/self-directed education; environmental education
Intervention/programming type: public engagement; citizen science/public participation in scientific research; environmental education
Keywords/TIG alignments: cluster, multi-site or multi-level evaluation; environmental programs; mixed methods evaluation

Entry 3. Nanoscale Informal Science Education Network (NISE)
Resource type: relevant repositories and databases
Publicly available: Yes
Access: http://www.nisenet.org/
Description: The Nanoscale Informal Science Education Network site includes evaluation/research studies on nanoscience-related informal science education programs. Its library includes products and example evaluation reports/deliverables applicable to scientists, informal science educators, and/or K-12 educators.
Connected dataset: No
Psychometric/validation information: No
STEM content/disciplines: science; technology; engineering; mathematics

Entry 4. American Camp Association Youth Outcomes Battery
Resource type: evaluation tools (instruments, measures, scales, protocols)
Publicly available: No
Access: http://www.acacamps.org/research/youth-outcomes-battery (submitter contact: Deb Bialeschki, dbialeschki@acacamps.org)
Description: The ACA Youth Outcomes Battery includes 11 separate common outcomes measured by statistically tested, age-appropriate, easily administered tools in two formats (increase, and status + change), with national norms for comparison. A new online system (Phase 1) for the "basic" format allows for electronic administration that generates basic statistical information. Staff/parent perception versions of the same outcomes are also available.
STEM content/disciplines: general/non-specific; 21st century skills supported by STEM
Evaluation concepts: quantitative approach; data analysis; evaluation use
Audience of evaluand: informal or out-of-school-time education; parents or caregivers
Intervention/programming type: out-of-school-time use

Entry 5. Online Evaluation Resource Library (OERL)
Resource type: relevant repositories and databases
Publicly available: Yes
Access: http://oerl.sri.com/
Description: Online evaluation resource library for NSF projects (largely focused on STEM-related projects). Includes evaluation plans, reports, and instruments. Developed by SRI with funding from the National Science Foundation.
Connected dataset: No
Psychometric/validation information: No
STEM content/disciplines: integrated STEM; science; technology; engineering; mathematics
Evaluation concepts: qualitative approach; quantitative approach; mixed methods approach; instruments; evaluation reports
Audience of evaluand: formal education; preK-12 education; higher education

Entry 6. AIHEC Indigenous Evaluation Framework: Telling Our Story in Our Place and Time
Resource type: evaluation planning resources
Publicly available: Yes
Access: http://indigeval.aihec.org/Pages/Documents.aspx
Description: The AIHEC Indigenous Evaluation Framework centers evaluation in traditional ways of knowing. More information can be obtained from the author, Dr. Joan LaFrance, through the American Indian Higher Education Consortium, 121 Oronoco Street, Alexandria, VA 22314; telephone: 703-838-0400. (Submitter contact: Dr. Carol Davis, Tribal Nations Research Group, Turtle Mountain Band of Chippewa, cadavistmt@yahoo.com.)
Psychometric/validation information: No
Publications/supplemental materials: Yes
STEM content/disciplines: general/non-specific; all of the above
Evaluation concepts: evaluation training or capacity building; mixed methods approach; evaluation use
Audience of evaluand: all of the above
Intervention/programming type: all of the above
Keywords/TIG alignments: diversity, equity and inclusion in STEM; underrepresented minorities (URM) in STEM; broadening participation in STEM; disabilities and other vulnerable populations; STEM identity; collaborative, participatory & empowerment evaluation; evaluation use; organizational learning & evaluation capacity building; environmental programs; research on evaluation; mixed methods evaluation

Entry 7. Survey of Undergraduate Research Experiences (SURE)
Resource type: evaluation tools (instruments, measures, scales, protocols)
Publicly available: Yes
Access: http://www.grinnell.edu/academics/areas/psychology/assessnebts/sure-iii-survey
Description: A validated instrument for evaluating undergraduate research programs, developed by and administered through David Lopatto of Grinnell College. It contains many insightful questions and, although it may not align with all of your programmatic goals, it allows for comparisons against hundreds of similar programs throughout the country.
Connected dataset: Evaluators can work with the developer at Grinnell College to access comparison data sets.
Psychometric/validation information: No
Publications/supplemental materials: No
STEM content/disciplines: integrated STEM; undergraduate research experiences
Evaluation concepts: mixed methods approach
Audience of evaluand: higher education; graduate & postdoctoral training
Intervention/programming type: research experiences; mentorships/internships/apprenticeships
Keywords/TIG alignments: underrepresented minorities (URM) in STEM; STEM identity
Domain: competence, skills, or reasoning; engagement or interest; attitude or behavior; career knowledge or acquisition
Assessment type: multiple choice; self report
Respondent age: college/adult
Respondent population: students

Entry 8. Undergraduate Research Student Self-Assessment (URSSA)
Resource type: evaluation tools (instruments, measures, scales, protocols)
Publicly available: Yes
Access: http://www.colorado.edu/eer/research/undergradtools.html
Description: URSSA is an online survey instrument for evaluating student outcomes of undergraduate research experiences in the sciences. It is highly customizable and allows for comparison against similar programs (especially NSF-funded Biology REU programs).
Psychometric/validation information: Yes
Publications/supplemental materials: Yes
References:
Hunter, A.-B., Laursen, S. L., & Seymour, E. (2007). Becoming a scientist: The role of undergraduate research in students' cognitive, personal and professional development. Science Education, 91(1), 36-74.
Hunter, A.-B., Laursen, S. L., & Seymour, E. (2008). Benefits of participating in undergraduate research in science: Comparing student and faculty perceptions. In R. Taraban & R. L. Blanton (Eds.), Creating Effective Undergraduate Research Programs in Science: The Transformation from Student to Scientist (pp. 135-171). New York: Teachers College Press.
Hunter, A.-B., Weston, T. J., Laursen, S. L., & Thiry, H. (2009). URSSA: Evaluating student gains from undergraduate research in science education. Council on Undergraduate Research Quarterly, 29(3), 15-19.
Laursen, S., Hunter, A.-B., Seymour, E., DeAntoni, T., De Welde, K., & Thiry, H. (2006). Undergraduate research in science: Not just for scientists any more. In J. J. Mintzes & W. Leonard (Eds.), Handbook of College Science Teaching (pp. 55-66). Arlington, VA: NSTA Press.
Laursen, S., Hunter, A.-B., Seymour, E., Thiry, H., & Melton, G. (2010). Undergraduate research in the sciences: Engaging students in real science. San Francisco: Jossey-Bass.
Seymour, E., Hunter, A.-B., Laursen, S. L., & DeAntoni, T. (2004). Establishing the benefits of research experiences for undergraduates in the sciences: First findings from a three-year study. Science Education, 88(4), 493-534.
Weston, T. J., & Laursen, S. L. (2015). The Undergraduate Research Student Self-Assessment (URSSA): Validation for use in program evaluation. CBE-Life Sciences Education, 14(3), ar33. DOI: 10.1187/cbe.14-11-0206. http://www.lifescied.org/content/14/3/ar33.full
STEM content/disciplines: integrated STEM; science; biology REU programs
Evaluation concepts: qualitative approach
Audience of evaluand: higher education
Intervention/programming type: research experiences; mentorships/internships/apprenticeships
Keywords/TIG alignments: underrepresented minorities (URM) in STEM; broadening participation in STEM; quantitative methods
Domain: engagement or interest; attitude or behavior; career knowledge or acquisition
Assessment type: point scale; multiple choice; short response; self report
Respondent age: college/adult
Respondent population: students
Administration time: moderate (15-45 minutes)

Entry 9. Field-Tested Learning Assessment Guide (FLAG) for Science, Math, Engineering and Technology Instructors
Resource type: relevant repositories and databases
Publicly available: Yes
Access: http://www.flaguide.org/tools/tools_discipline.php
Description: This online repository of STEM assessments includes discipline-specific resources that have been tested in the field. Resources include conceptual diagnostic tests, attitude surveys, scoring rubrics, and other tools, each aligned to a specific STEM discipline.
STEM content/disciplines: general/non-specific
Evaluation concepts: qualitative approach; quantitative approach
Keywords/TIG alignments: quantitative methods; qualitative methods
Notes: Users can search the FLAG repository by discipline or by evaluation/assessment technique.

Entry 10. InformalScience.org
Resource type: relevant repositories and databases
Publicly available: Yes
Access: http://informalscience.org/
Description: InformalScience.org, hosted by the NSF-funded Center for the Advancement of Informal Science Education, is an existing repository that provides many resources and links for evaluating STEM in informal learning environments. Users can find evaluation guides, example reports, and instruments/tools.
STEM content/disciplines: science
Evaluation concepts: evaluation planning; evaluation training or capacity building
Audience of evaluand: informal or out-of-school-time education
Intervention/programming type: public engagement; citizen science/public participation in scientific research; informal science education

Entry 11. The 2010 User-Friendly Handbook for Project Evaluation (NSF publication)
Resource type: basic introduction to evaluation
Publicly available: Yes
Access: http://informalscience.org/documents/TheUserFriendlyGuide.pdf
Description: Basic guide to evaluation intended for project directors and principal investigators working on NSF-funded STEM education projects. It is very helpful for working with stakeholders who are well-versed in working with federal funders but not very familiar with evaluation.
STEM content/disciplines: general/non-specific
Evaluation concepts: evaluation planning; evaluation training or capacity building
Keywords/TIG alignments: evaluation use; organizational learning & evaluation capacity building
Notes: The Handbook is written with NSF-specific guidelines in mind, but it is helpful and applicable well beyond that context.

Entry 12. Citizen Science Central Resources on Measuring Effects
Resource type: relevant repositories and databases
Publicly available: Yes
Access: http://www.birds.cornell.edu/citscitoolkit/toolkit/steps/effects/resources
Description: Collection of evaluation resources and instruments that are helpful specifically for citizen science initiatives.
STEM content/disciplines: science
Intervention/programming type: citizen science/public participation in scientific research

Entry 13. Cornell Lab of Ornithology/Citizen Science Central DEVISE Scales
Resource type: evaluation tools (instruments, measures, scales, protocols)
Publicly available: Yes
Access: http://www.birds.cornell.edu/citscitoolkit/evaluation/instruments
Description: The DEVISE project (Developing, Validating, and Implementing Situated Evaluation Instruments) developed a set of constructs and instruments to measure outcomes associated with citizen science. Users must request access to scales related to interest, motivation, self-efficacy, and skills.
Psychometric/validation information: Yes; contact the developers for psychometric and validation information, available on request from http://www.birds.cornell.edu/citscitoolkit/evaluation/instruments
STEM content/disciplines: science; nature/process of science
Evaluation concepts: mixed methods approach; research on evaluation
Audience of evaluand: informal or out-of-school-time education; lifelong learning/self-directed education
Intervention/programming type: public engagement; citizen science/public participation in scientific research
Keywords/TIG alignments: broadening participation in STEM; STEM identity; environmental programs
Domain: competence, skills, or reasoning; engagement or interest; attitude or behavior; content knowledge
Assessment type: point scale; multiple choice; self report
Respondent age: any age
Respondent population: any
Administration time: short (<15 minutes); single time point collection; multiple time point collection
Notes: Not all scales are currently available; users must use a contact form to request access.

Entry 14. User's Guide for Evaluating Learning Outcomes from Citizen Science (Cornell Lab of Ornithology)
Resource type: evaluation planning resources
Publicly available: Yes
Access: http://www.birds.cornell.edu/citscitoolkit/evaluation/?searchterm=evaluation
Description: User's guide for practitioners and project leaders involved in citizen science who would like to move toward more robust evaluation of their efforts. Includes templates and worksheets for developing an evaluation plan.
Connected dataset: No
STEM content/disciplines: general/non-specific; citizen science
Evaluation concepts: evaluation training or capacity building
Audience of evaluand: informal or out-of-school-time education; lifelong learning/self-directed education; citizen scientists
Intervention/programming type: public engagement; citizen science/public participation in scientific research
Keywords/TIG alignments: collaborative, participatory & empowerment evaluation; organizational learning & evaluation capacity building

Entry 15. Assessment Tools in Informal Science (ATIS)
Resource type: relevant repositories and databases
Publicly available: Yes
Access: http://www.pearweb.org/atis
Description: Searchable, sortable collection of assessment tools intended for informal or out-of-school-time STEM interventions.
Psychometric/validation information: Yes
Publications/supplemental materials: Yes
References: Individual entries in the repository are tagged with available information, including primary references and basic psychometric properties.
STEM content/disciplines: general/non-specific
Audience of evaluand: informal or out-of-school-time education; lifelong learning/self-directed education
Intervention/programming type: out-of-school-time use
Keywords/TIG alignments: STEM identity; organizational learning & evaluation capacity building
Notes: The database can be searched by age, domain, or assessment type.

Entry 16. A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas (2012, The National Academies Press)
Resource type: evaluation planning resources
Publicly available: Yes
Access: http://www.nap.edu/catalog/13165/a-framework-for-k-12-science-education-practices-crosscutting-concepts
Description: This report of the National Research Council's Board on Science Education lays out a framework for K-12 formal science education. Many of the constructs and ideas discussed within have ramifications for the evaluation of formal STEM education interventions.
Connected dataset: No
STEM content/disciplines: integrated STEM; science; nature/process of science; general/non-specific
Evaluation concepts: STEM policy
Audience of evaluand: formal education; preK-12 education
Intervention/programming type: curriculum; pedagogy
Keywords/TIG alignments: diversity, equity and inclusion in STEM; underrepresented minorities (URM) in STEM; broadening participation in STEM; STEM identity

Entry 17. INSPIRE Assessment Center for STEM Literacy
Resource type: relevant repositories and databases
Publicly available: Yes
Access: http://inspire-purdue.org/assessment-center
Description: Collection of valid and reliable assessment instruments that measure various constructs related to engineering and science.
STEM content/disciplines: integrated STEM; science; engineering; nature/process of engineering

Entry 18. Assessing Women & Men in Engineering (AWE)
Resource type: evaluation tools (instruments, measures, scales, protocols)
Publicly available: Yes
Access: http://www.engr.psu.edu/awe/misc/about.aspx
Description: Several instruments related to persistence, diversity, retention, and other elements of engineering education, available upon registration.
Psychometric/validation information: Yes
Publications/supplemental materials: Yes
References: Citations and psychometric information are available through the website upon registration.
STEM content/disciplines: technology; engineering; nature/process of engineering
Evaluation concepts: qualitative approach; quantitative approach; mixed methods approach
Audience of evaluand: formal education; informal or out-of-school-time education; preK-12 education; higher education
Intervention/programming type: curriculum; design challenges; research experiences; mentorships/internships/apprenticeships
Keywords/TIG alignments: diversity, equity and inclusion in STEM; underrepresented minorities (URM) in STEM; broadening participation in STEM; gender; STEM identity
Domain: competence, skills, or reasoning; engagement or interest; attitude or behavior; career knowledge or acquisition
Assessment type: self report; multiple types
Respondent age: high school; college/adult; college/university faculty
Respondent population: students; instructors
Notes: Multiple, various instruments.

Entry 19. CADRE (Community for Advancing Discovery Research in Education) K-12 Resources
Resource type: relevant repositories and databases
Publicly available: Yes
Access: http://cadrek12.org/resources
Description: Resources related to K-12 STEM education, development, and evaluation, organized by key themes and linked to NSF-funded DR K-12 grantees. Includes evaluation tools/instruments, proposal writing resources, STEM education research, resource guides related to STEM education in various contexts, and more.
STEM content/disciplines: general/non-specific
Keywords/TIG alignments: diversity, equity and inclusion in STEM; underrepresented minorities (URM) in STEM; broadening participation in STEM
Notes: Contains a great deal of material across many of the categories in this STEM repository.

Entry 20. STELAR (STEM Learning and Research Center) Resources
Resource type: relevant repositories and databases
Publicly available: Yes
Access: http://stelar.edc.org/resources
Description: Collection of STEM resources related to the NSF's ITEST (Innovative Technology Experiences for Students and Teachers) grantees. Includes publications, curricular materials, instruments, and evaluation reports.
STEM content/disciplines: general/non-specific
Keywords/TIG alignments: diversity, equity and inclusion in STEM; underrepresented minorities (URM) in STEM; broadening participation in STEM

Entry 21. EvaluATE Resource Library
Resource type: relevant repositories and databases
Publicly available: Yes
Access: http://www.evalu-ate.org/library/
Description: Collection of evaluation resources connected to awardees of the NSF's Advanced Technological Education (ATE) program. Includes opportunities for education and professional development related to evaluation, along with evaluation, research, and other reports/publications related to ATE.
STEM content/disciplines: technology

Entry 22. Horizon Research, Inc., Reports and Presentations
Resource type: example evaluation reports or deliverables
Publicly available: Yes
Access: http://www.horizon-research.com/publications-and-products/
Description: Horizon Research, Inc., offers reports, presentations, instruments, and other types of deliverables related to various aspects of STEM education (primarily in science and mathematics).
Connected dataset: Horizon Research, Inc., hosts the public data release from the 2012 National Survey of Science and Mathematics Education (NSSME), including data from 7,752 science and math teachers and 1,504 schools: http://www.horizon-research.com/2012nssme/2012-national-survey-of-science-and-mathematics-education-public-release-data/
Psychometric/validation information: Yes
Publications/supplemental materials: Yes
References: Psychometric information and publications are available for some of the materials.
STEM content/disciplines: science; mathematics
Audience of evaluand: formal education; preK-12 education
Intervention/programming type: curriculum; pedagogy
Respondent age: middle school; high school; PK-12 instructor
Respondent population: students; instructors

Entry 23. MSPnet Library and Resources
Resource type: relevant repositories and databases
Publicly available: Yes
Access: http://hub.mspnet.org
Description: MSPnet is the online home of awardees of the National Science Foundation's MSP and STEM+C projects. It includes a library of publications and a resource collection (with a section on assessment and evaluation).
STEM content/disciplines: science; mathematics
Audience of evaluand: formal education; preK-12 education
Intervention/programming type: curriculum; pedagogy; educator professional development

Entry 24. National Academies Press: Reports by the Board on Science Education
Resource type: evaluation planning resources
Publicly available: Yes
Access: http://www.nap.edu/author/BOSE/division-of-behavioral-and-social-sciences-and-education/board-on-science-education
Description: The National Academies Press has published a variety of STEM-relevant technical reports and workshop findings. This entry links to reports from the Board on Science Education, which cover both formal and informal/out-of-school-time material.
STEM content/disciplines: science
Evaluation concepts: basic research on STEM learning
Audience of evaluand: formal education; informal or out-of-school-time education; lifelong learning/self-directed education; preK-12 education; higher education

Entry 25. EQuIP Rubric for Science (NGSS)
Resource type: evaluation tools (instruments, measures, scales, protocols)
Publicly available: Yes
Access: http://nextgenscience.org/resources/equip-rubric-lessons-units-science
Description: Educators Evaluating the Quality of Instructional Products (EQuIP) has assembled this rubric and criteria for aligning instructional materials with the three-dimensional learning aspects of the Next Generation Science Standards (NGSS).
STEM content/disciplines: science; nature/process of science
Evaluation concepts: qualitative approach
Audience of evaluand: formal education; preK-12 education
Intervention/programming type: curriculum; pedagogy
Assessment type: quality rubric
Respondent age: PK-12 instructor
Respondent population: instructors
Administration time: moderate (15-45 minutes); long (over 45 minutes)

Entry 26. Systems Evaluation Protocol
Resource type: evaluation planning resources
Publicly available: Yes
Access: https://core.human.cornell.edu/research/systems/protocol/
Description: The Cornell University Office for Research on Evaluation (CORE) provides a guide to its Systems Evaluation Protocol (SEP), along with an online evaluation system for planning evaluations from a systems perspective (https://core.human.cornell.edu/research/systems/netway.cfm). The resources are freely available but may require registration.
Evaluation concepts: evaluation planning; evaluation training or capacity building; evaluation use; research on evaluation
Keywords/TIG alignments: evaluation use; organizational learning & evaluation capacity building; program theory; research on evaluation

Entry 27. Tools and Resources for Assessing Social Impact (TRASI)
Resource type: relevant repositories and databases
Publicly available: Yes
Access: http://trasi.foundationcenter.org/
Description: TRASI provides evaluation instruments, planning tools, and a discussion community focused on social impact assessment. Its database of tools can be filtered by approach, purpose, sector, and focus, and includes education, environment, and other STEM-related content areas.
STEM content/disciplines: general/non-specific (not STEM-specific)
Keywords/TIG alignments: organizational learning & evaluation capacity building

Entry 28. PAC-Involved Discussion Guides for Project Team Interviews
Resource type: evaluation tools (instruments, measures, scales, protocols)
Publicly available: No
Access: Contact Bernadette Wright at bernadette@meaningfulevidence.com.
Description: Discussion guides for three rounds of project team interviews, conducted before, during, and after project implementation. Developed for the evaluation of PAC-Involved, a Howard University project funded by NSF.
Connected dataset: No
Psychometric/validation information: No
Publications/supplemental materials: No
STEM content/disciplines: science; mathematics; physics; astronomy; cosmology
Evaluation concepts: evaluation planning; qualitative approach
Audience of evaluand: informal or out-of-school-time education; preK-12 education
Intervention/programming type: curriculum; pedagogy; using popular media
Keywords/TIG alignments: underrepresented minorities (URM) in STEM; qualitative methods
Domain: project team perceptions of program goals, activities, development, implementation, and outcomes
Assessment type: interview
Respondent age: college/adult; PK-12 instructor; college/university faculty
Respondent population: instructors; institution
Administration time: moderate (15-45 minutes); multiple time point collection

Entry 29. PAC-Involved Student Focus Group Guide
Resource type: evaluation tools (instruments, measures, scales, protocols)
Publicly available: No
Access: Contact Bernadette Wright at bernadette@meaningfulevidence.com.
Description: Guide for conducting focus groups with students. Includes items from existing instruments and new items specific to the project. Developed for the evaluation of PAC-Involved, a Howard University project funded by NSF.
Connected dataset: No
Psychometric/validation information: No
STEM content/disciplines: science; physics; astronomy; cosmology
Evaluation concepts: evaluation planning; qualitative approach
Audience of evaluand: informal or out-of-school-time education; preK-12 education
Intervention/programming type: curriculum; pedagogy; using popular media
Keywords/TIG alignments: underrepresented minorities (URM) in STEM; qualitative methods
Domain: engagement or interest; attitude or behavior; students' perceptions of the program and what difference it made for them
Assessment type: focus group
Respondent age: high school
Respondent population: students
Administration time: long (over 45 minutes)

Entry 30. PAC-Involved Intervention and Comparison Group Student Pre- and Post-Survey Items
Resource type: evaluation tools (instruments, measures, scales, protocols)
Publicly available: No
Access: Contact Bernadette Wright at bernadette@meaningfulevidence.com.
Description: Items included in online surveys of intervention and comparison group students, administered before and after their participation in the program. Developed for the evaluation of PAC-Involved, a Howard University project funded by NSF.
Connected dataset: No
Psychometric/validation information: No
STEM content/disciplines: science; physics; astronomy; cosmology
Evaluation concepts: evaluation planning; qualitative approach
Audience of evaluand: informal or out-of-school-time education; preK-12 education
Intervention/programming type: curriculum; pedagogy; using popular media
Keywords/TIG alignments: underrepresented minorities (URM) in STEM; qualitative methods
Domain: engagement or interest; attitude or behavior; content knowledge
Assessment type: multiple choice; short response; extended response; self report
Respondent age: high school
Respondent population: students
Administration time: moderate (15-45 minutes)
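
Because these items are administered to both an intervention and a comparison group before and after the program, one common way to summarize the results is a difference-in-differences of mean scale scores. A minimal sketch of that calculation with entirely hypothetical scores (an illustration of the design, not the PAC-Involved analysis itself):

    from statistics import mean

    def diff_in_diff(pre_t, post_t, pre_c, post_c):
        """Intervention group's mean pre-to-post gain minus the
        comparison group's mean gain on the same scale."""
        return (mean(post_t) - mean(pre_t)) - (mean(post_c) - mean(pre_c))

    # Hypothetical 1-4 scale scores for a small pilot:
    effect = diff_in_diff(pre_t=[2.1, 2.4, 2.0], post_t=[3.2, 3.5, 3.1],
                          pre_c=[2.2, 2.3, 2.1], post_c=[2.5, 2.6, 2.4])
    print(round(effect, 2))  # 0.8: the gain beyond the comparison group's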

Entry 31. PAC-Involved Unobtrusive Observation Guide
Resource type: evaluation tools (instruments, measures, scales, protocols)
Publicly available: No
Access: Contact Bernadette Wright at bernadette@meaningfulevidence.com.
Description: Guide for conducting unobtrusive observation of a PAC-Involved program session. Developed for the evaluation of PAC-Involved, a Howard University project funded by NSF.
Connected dataset: No
Psychometric/validation information: No
STEM content/disciplines: science; physics; astronomy; cosmology
Evaluation concepts: evaluation planning; qualitative approach
Audience of evaluand: informal or out-of-school-time education; preK-12 education
Intervention/programming type: curriculum; pedagogy; using popular media
Keywords/TIG alignments: underrepresented minorities (URM) in STEM; qualitative methods; mixed methods evaluation
Domain: competence, skills, or reasoning; engagement or interest; attitude or behavior
Assessment type: observation
Respondent age: n/a
Administration time: long (over 45 minutes)

Entry 32. Dimensions of Success (DoS)
Resource type: evaluation tools (instruments, measures, scales, protocols)
Publicly available: No
Access: http://www.pearweb.org/tools/dos.html#overview
Description: The Dimensions of Success (DoS) observation tool defines twelve indicators of STEM program quality in out-of-school time (e.g., afterschool, summer camps). It was developed and studied with funding from the National Science Foundation (NSF) by the Program in Education, Afterschool and Resiliency (PEAR), along with partners at Educational Testing Service (ETS) and Project Liftoff. DoS allows researchers, practitioners, funders, and other stakeholders to track the quality of STEM learning opportunities and to pinpoint strengths and weaknesses. It is an excellent observational assessment tool, but it requires training and certification to administer, making it a heavy lift for non-evaluation staff. Data generated from this assessment are excellent for facilitating discussion around program improvement and staff development.
Connected dataset: PEAR will analyze your data and send reports.
Psychometric/validation information: Yes
Publications/supplemental materials: Yes
References: http://www.pearweb.org/research/pdfs/DoSTechReport_092314_final.pdf ; http://www.pearweb.org/tools/dos.html#research
STEM content/disciplines: integrated STEM; STEM + arts (STEAM); science; technology; engineering; nature/process of science; nature/process of engineering; general/non-specific (not as strong for computer science)
Evaluation concepts: evaluation training or capacity building; qualitative approach
Audience of evaluand: informal or out-of-school-time education; preK-12 education; homeschool education
Keywords/TIG alignments: diversity, equity and inclusion in STEM; underrepresented minorities (URM) in STEM; broadening participation in STEM; disabilities and other vulnerable populations; English-language learners; STEM identity; cluster, multi-site or multi-level evaluation; organizational learning & evaluation capacity building; program design; qualitative methods; youth focused evaluation
Domain: competence, skills, or reasoning; engagement or interest; attitude or behavior; content knowledge
Assessment type: observation
Respondent age: elementary; middle school; high school; college/adult
Respondent population: students
Administration time: long (over 45 minutes); multiple time point collection
Notes: Excellent observational tool, but a heavy lift for non-evaluation professionals.

Entry 33. CS Observation Protocol
Resource type: evaluation tools (instruments, measures, scales, protocols)
Publicly available: No
Access: https://drive.google.com/drive/u/0/folders/0B_uOS_yIRMQ8VTNfMVptUDRBM2s (submitter contact: Kathy Haynie of Haynie Research and Evaluation, kchaynie@stanfordalumni.org)
Description: An observational protocol for high school computer science (CS) classes. It measures instruction/pedagogy, classroom activities, culturally relevant practices, CS content, cognitive demand, computational thinking skills, and student engagement.
Psychometric/validation information: No
Publications/supplemental materials: Yes
References: This protocol was developed, vetted, and piloted as part of the evaluation of the SMASH program, Level the Playing Field Institute.
STEM content/disciplines: technology; computer science
Evaluation concepts: qualitative approach
Audience of evaluand: formal education; preK-12 education
Intervention/programming type: curriculum; pedagogy
Keywords/TIG alignments: diversity, equity and inclusion in STEM; underrepresented minorities (URM) in STEM; broadening participation in STEM; qualitative methods
Domain: competence, skills, or reasoning; engagement or interest; content knowledge
Assessment type: observation
Respondent age: high school
Respondent population: students
Administration time: moderate (15-45 minutes); long (over 45 minutes); one class period

Entry 34. After-School Program Logic Model and Evaluation Guide
Resource type: evaluation planning resources
Publicly available: Yes
Access: Template: https://docs.google.com/document/d/14eLtxMYHoxE1bP6uNF3JOVBDXhTUFIqDuhQ44p5E3vA/edit# ; Exemplar: https://docs.google.com/document/d/1ps0oEgddNFZzE7jlTi2Tcel_N9NrpkhpcoRyaTY48XU/edit (submitter contact: Axel Reitzig, reitzig_axel@svvsd.org)
Description: Begin by constructing a logic model that describes the current state of your program:

1. Identify the resources you currently have (staff, materials, financial, space) and what you still need.

2. Identify the existing outputs of your program (what activities are happening, who is participating).

3. Identify short-term (1-3 years) and long-term (3-5 years) goals.

4. Select 2-3 goals and develop an evaluation statement for each. Statements have four components (modeled in the sketch after this entry):
- Describe the output
- Identify the methodology for measuring the output
- Identify the short-term goal created via the output
- Identify the long-term goal created via the short-term goal

5. Develop an action plan by answering the following questions:
- What are you measuring?
- How will you measure it?
- How will you collect the data?
- Who is responsible?

Connected dataset: None at this time.
Psychometric/validation information: No
Publications/supplemental materials: No
STEM content/disciplines: integrated STEM; technology; engineering; robotics
Evaluation concepts: evaluation planning; mixed methods approach
Audience of evaluand: informal or out-of-school-time education; preK-12 education
Intervention/programming type: educational technology
Domain: engagement or interest
Assessment type: to be determined by the user of the template
Respondent age: elementary; middle school; high school
Respondent population: students
Administration time: TBD
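
The four-component evaluation statement in step 4 can be made concrete as a small record type. A minimal sketch (Python; the field names and the robotics example are hypothetical illustrations of the structure described above, not part of the original template):

    from dataclasses import dataclass

    @dataclass
    class EvaluationStatement:
        """One statement per selected goal, following the four
        components listed in step 4 of the guide."""
        output: str           # the program output being described
        methodology: str      # how the output will be measured
        short_term_goal: str  # the 1-3 year goal created via the output
        long_term_goal: str   # the 3-5 year goal created via the short-term goal

    example = EvaluationStatement(
        output="Weekly after-school robotics sessions reaching 30 students",
        methodology="Attendance logs plus a pre/post engagement survey",
        short_term_goal="Sustained student engagement across the school year",
        long_term_goal="Increased enrollment in high school STEM electives",
    )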

Entry 35. Evaluation Flash Cards
Resource type: evaluation planning resources
Publicly available: Yes
Access: http://www.ottobremer.org/sites/default/files/fact-sheets/OBF_flashcards_201402.pdf
Description: This guide was developed by Michael Quinn Patton for the Otto Bremer Foundation to support evaluators and organizations in getting the most out of their evaluations by introducing and describing core evaluation concepts. There are 25 "flash cards" (listed below) that help organizational leaders understand what constitutes a good evaluation and differentiate the purposes and strengths of different types of evaluation:
Evaluative Thinking
Evaluation Questions
Logic Models
Theory of Change
Evaluation vs. Research
Dosage
Disaggregation
Changing Denominators, Changing Rates
SMART Goals
Distinguishing Outcomes From Indicators
Performance Targets
Qualitative Evaluation
Triangulation Through Mixed Methods
Important and Rigorous Claims of Effectiveness
Accountability Evaluation
Formative Evaluation
Summative Evaluation
Developmental Evaluation
The IT Question
Fidelity or Adaptation
High-Quality Lessons Learned
Evaluation Quality Standards
Complete Evaluation Reporting
Utilization-Focused Evaluation
Distinguish Different Kinds of Evidence
Connected dataset: No
Psychometric/validation information: No
STEM content/disciplines: general/non-specific
Evaluation concepts: evaluation planning; evaluation training or capacity building; qualitative approach; quantitative approach; mixed methods approach; data analysis; evaluation use
Audience of evaluand: formal education; informal or out-of-school-time education; lifelong learning/self-directed education; preK-12 education; homeschool education; higher education; graduate & postdoctoral training; workforce training; parents or caregivers; educational administrators; STEM professionals; organizational leaders
Intervention/programming type: curriculum; pedagogy; educator professional development; pre-service educator training; distance education; educational technology; science communication; public engagement; design challenges; research experiences; mentorships/internships/apprenticeships; science fairs; citizen science/public participation in scientific research
Keywords/TIG alignments: diversity, equity and inclusion in STEM; underrepresented minorities (URM) in STEM; broadening participation in STEM; disabilities and other vulnerable populations; English-language learners; lesbian, gay, bisexual and transgender issues; gender; STEM identity; cluster, multi-site or multi-level evaluation; collaborative, participatory & empowerment evaluation; data visualization and reporting; evaluation use; organizational learning & evaluation capacity building; environmental programs; program design; program theory; research on evaluation; design & analysis of experiments; quantitative methods; qualitative methods; mixed methods evaluation; youth focused evaluation; general use
Notes: The flash cards are practical and intended to make core evaluation concepts easily accessible and retrievable.

Entry 36. Science Learning Activation
Resource type: evaluation tools (instruments, measures, scales, protocols)
Publicly available: Yes
Access: http://www.activationlab.org/tools/ (submitter contact: hartry@berkeley.edu)
Description: The Learning Activation Lab is a national research and design effort to learn and demonstrate how to activate children in ways that ignite persistent engagement in science, technology, engineering, art, and mathematics learning and innovation. Increasing understanding and appreciation of these subjects is critical to keeping our nation at the forefront of technology and innovation and to preparing young people for the challenges of the future. The Lab has developed and tested a variety of survey instruments, interview and observation protocols, and other instruments for use in its research efforts.
Connected dataset: Yes; datasets are available. Contact the Activation Lab for access at info@activationlab.org.
Psychometric/validation information: Yes
Publications/supplemental materials: Yes
References: Validation reports are available at http://www.activationlab.org/tools/ ; the research archive can be accessed at http://www.activationlab.org/research/
STEM content/disciplines: STEM + arts (STEAM); science; technology; engineering; nature/process of science; nature/process of engineering
Evaluation concepts: evaluation planning; qualitative approach; quantitative approach; data analysis; evaluation use
Audience of evaluand: formal education; informal or out-of-school-time education; preK-12 education
Intervention/programming type: curriculum; pedagogy; science fairs; citizen science/public participation in scientific research
Keywords/TIG alignments: diversity, equity and inclusion in STEM; underrepresented minorities (URM) in STEM; broadening participation in STEM; STEM identity; quantitative methods; qualitative methods; youth focused evaluation
Domain: competence, skills, or reasoning; engagement or interest; attitude or behavior; career knowledge or acquisition
Assessment type: interview; point scale; multiple choice; short response; observation; self report
Respondent age: elementary; middle school; high school
Respondent population: students
Administration time: short (<15 minutes); moderate (15-45 minutes)