1 of 30

New Science Standards → New Assessments

Joyce Barry – Plainview-Old Bethpage Schools

jbarry@pobschools.org

Mary Loesing, Ed.D. – Connetquot Schools

mloesing@ccsdli.org

2 of 30

New Standards

  • Based on “A Framework for K-12 Science Education” (2012)
  • Next Generation Science Standards (2013)
  • New York State P-12 Science Learning Standards (Adopted December 2016, Implementation Date July 2017)
    • Phase I: Raise Awareness and Build Capacity 07/2017-08/2019
    • Phase II: Transition and Implementation 09/2019-08/2021
    • Phase III: Implementation and Sustainability 09/2021-08/2024

3 of 30

Nineteen states and the District of Columbia (representing over 36% of U.S. students) have adopted the Next Generation Science Standards (NGSS). The 19 states are Arkansas, California, Connecticut, Delaware, Hawaii, Illinois, Iowa, Kansas, Kentucky, Maryland, Michigan, Nevada, New Hampshire, New Jersey, New Mexico, Oregon, Rhode Island, Vermont, and Washington.

 Twenty states (representing 29% of U.S. students) have developed their own standards based on recommendations in the NRC Framework for K-12 Science Education. The 20 states are Alabama, Colorado, Georgia, Idaho, Indiana, Louisiana, Massachusetts, Mississippi, Missouri, Montana, Nebraska, New York, Oklahoma, South Carolina, South Dakota, Tennessee, Utah, West Virginia, Wisconsin, and Wyoming.

4 of 30

Scientific & Engineering Practices

  1. Asking Questions (for science) and Defining Problems (for engineering)
  2. Developing and Using Models
  3. Planning and Carrying Out Investigations
  4. Analyzing and Interpreting Data
  5. Using Mathematics and Computational Thinking
  6. Constructing Explanations (for science) and Designing Solutions (for engineering)
  7. Engaging in Argument from Evidence
  8. Obtaining, Evaluating, and Communicating Information

5 of 30

Crosscutting Concepts

  1. Patterns
  2. Cause and Effect: Mechanism and Explanation
  3. Scale, Proportion, and Quantity
  4. Systems and System Models
  5. Energy and Matter: Flows, Cycles, and Conservation
  6. Structure and Function
  7. Stability and Change

6 of 30

Disciplinary Core Ideas

  1. Disciplinary Core Ideas outline the content included in each performance expectation (PE).
  2. For any PE that was added by NYSED, a corresponding DCI has also been added.
  3. The DCIs will be especially helpful for elementary teachers who may not have as much science content knowledge as secondary teachers.

7 of 30

What is the difference?

  • Three-dimensional standards are unique because each standard is written as a performance expectation that combines a SEP, CCC and DCI; student proficiency is tied to being able to use the three dimensions together to make sense of phenomena and solve problems.
  • Phenomena – observable events that students can use science to explain.
  • Problems – situations that humans wish to change
      • (Achieve 2018)

8 of 30

Why make this shift?

  • Our standards, based on the Framework, emphasize goals for teaching and learning that are intentionally designed – built on decades of research and expertise – to ensure that all students leave their K-12 educational experience ready to be scientifically literate, critical consumers of information, and able to pursue the full range of opportunities available to them in an increasingly STEM-oriented world. (Achieve 2018)

9 of 30

Need for New Assessments

  • Assessments aligned to the new three-dimensional standards need to measure student performance in ways that inform, incentivize, and monitor the progress of schools and districts as they implement the standards. Doing this requires assessments that allow students to demonstrate science proficiency by
    • Authentically and directly engaging in reasoning about phenomena and problems, and
    • Doing so by integrating the three interacting dimensions of science, each of which has associated knowledge and application expectations.
  • Done well, these assessments can shape better teaching and learning in science, and possibly influence other content areas.


10 of 30

Thought Experiment #1

Jack

“I care most about whether students learned what they were supposed to, as described by the standards (grade-specified PEs, SEPs, CCCs, DCIs).”

Jill

“I care most about whether students can flexibly make sense of different kinds of phenomena and problems, using the DCIs, SEPs, and CCCs as needed.”

Do you agree with Jack or Jill?

11 of 30

Thought Experiment #2

Jack

“At the end of the day, science in school is about understanding ideas and principles that govern the natural world – the focus of science assessments should be on deep conceptual understanding of science principles.”

Jill

“Meh. Science in school is really about making sure students can use evidence and reasoning to make sense of things they encounter – the focus of science assessments should be on process.”

Do you agree with Jack or Jill?

12 of 30

Thought Experiment #3

Jack

“Demonstrating knowledge-in-use is nice to have, but not required for science assessments.”

Jill

“Demonstrating knowledge-in-use is a must-have for science assessments.”

Do you agree with Jack or Jill?

13 of 30

Thought Experiment #4

Jack

“The role of phenomena and problems in assessments is to be an engaging hook. It’s okay if students don’t actually explain the phenomenon as long as they describe the general ideas related to the phenomenon in assessment tasks.”

Jill

“Assessments should always require students to actually figure out a phenomenon or problem through the course of a task.”

14 of 30

Where did we land?

  • There is a range of perspectives about what is most important to elicit from students.
  • Some of these perspectives are driven by different assessment purposes.
  • Some of them are grounded in more personal philosophies.
  • In a system of assessments, we can balance these goals.
  • It is important to keep these differences in mind as we are thinking about features of good assessments because these differences can shape the “flavor” of an assessment.

15 of 30

Which scenario reigns supreme?

  • Both of the following tasks are middle school (MS) tasks, administered as summative assessments.

  • Both are intended to elicit grade-appropriate 3-D performances, and no further information is provided to the student.

16 of 30

Which scenario reigns supreme?

Option A

Option B

Choose: A or B. Be ready to defend your answer!

17 of 30

Option B

18 of 30

What are attributes of a good assessment question?

  • Is necessary for responding to the task
  • Is focused on real-world observations that students can connect with
  • Presents a specific case or instance, not a statement or topic
  • Is puzzling and/or intriguing, creating a “need to know”
  • Is explainable using grade-appropriate elements of DCIs, SEPs, and CCCs
  • Uses real or well-crafted data that are grade-appropriate and accurate
  • Uses at least two modalities
  • Is locally relevant, globally relevant, and/or exhibits features of universality
  • Is comprehensible at the grade level and for a range of student groups
  • Uses as many words as needed, but no more
  • Is sufficiently rich and engaging to drive student sense-making throughout the task

19 of 30

20 of 30

Characteristics of 3-D Science Assessments

  • Item clusters are the base unit for assessment – not individual items
  • Item clusters must be aligned to one or more PEs and must be inclusive of all the dimensions associated with the PE(s)
  • Each individual item within the cluster must align with at least two dimensions of the standards to qualify for inclusion in an item cluster.

21 of 30

22 of 30

Steps to Take to Begin Developing Assessment Questions

  • Build an anchor chart of the standard – an anchor chart is a conceptual model, a visual representation of concepts, made for teachers rather than students.
  • Create a list of the concepts.
  • Vocabulary and crosscutting concepts for each standard are already pulled out on Paul Andersen’s website, www.wonderofscience.com.
  • Sketch out the standard using the vocabulary.
  • Also include sources where you can find data about the topic to be used in the question scenario.

23 of 30

Setting the Scene for the Question

  • Choose a phenomenon.
  • Create a stimulus for the question.
  • Write a prompt – the prompt is what students will use to frame their answer to the question.

24 of 30

Example

  • Standard: MS-ESS3-2 Natural Hazards
  • Phenomenon: The relationship between seismic activity and volcanic eruptions, as seen in the eruption of the Kilauea volcano in Hawaii in the spring of 2018.
  • Stimulus: Maps and charts of plate boundaries and volcanic activity; the USGS weekly volcanic activity report.
  • Prompt: You have been hired to determine the ideal location for a new school on the Big Island of Hawaii. Use evidence from the charts, maps, and reports you have been given to determine the best location on the island.

25 of 30

26 of 30

27 of 30

28 of 30

What are other states doing?

29 of 30

NYSED Science Assessments

  • No dates for the new assessments have been released.
  • Item writers are being hired and recruited – http://www.p12.nysed.gov/assessment/teacher/home.html
  • The New York State Education Department (NYSED) asks all educators who wish to participate in test development activities to complete its online application located at https://www.research.net/s/NYSTPRecruit
  • Item writers need to be trained in developing three-dimensional assessment questions.
  • Banks of three-dimensional items need to be created and field-tested.
  • Suggestions are being made to create transitional assessments which introduce 3-D items.
  • NYSED is using the same process it used to develop the new social studies assessments to develop the science assessments – https://www.engageny.org/new-york-state-k-12-social-studies

30 of 30

Resources