Digging Deeper: Considering researchers' work on complexity in past science assessments and the NGSS
Brian Gong, Center for Assessment
Presentation in the session on "Developing a Common Language to Understand Content Complexity for Alignment Studies of the NGSS"
CCSSO National Conference on Student Assessment
June 28, 2018, San Diego, CA
A conversation in the scientific tradition …
• Thanks to Sara and WebbAlign, and the participants in the May 2018 gathering (and the previous RILS gathering, and … )
• An example of how science happens (Conant, 1948/1957)
– Tug between theory and experience, mediated by reasoning
– Proliferation, then consolidation (maybe)
– The interplay of theory and engineering (technologies)
– Productive cross-fertilization
– The social nature of science (correspondence and publishing)
– Longitudinal
– Serendipitous and tractable
Science complexity - Gong - CCSSO NCSA - 6/28/18
My central messages
• The nature and degree of assessment cognitive complexity, or cognitive demand, arise from the interaction of the specific stimulus of the task, the performance the student is asked to do, and the content standards being assessed.
• Cognitive and learning scientists and researchers have proposed useful frameworks for analyzing assessment cognitive complexity that could be applied to assessments of the NGSS.
• It is imperative to develop practical, appropriate alignment methodologies for assessments of the NGSS. It may be possible to agree on general elements.
Overview
• Specifying claims about the NGSS to allow alignment evaluations
• My focus regarding complexity today
• Some relevant work on complexity in science assessments from the past
• Implications for complexity and alignment evaluation of NGSS assessments
• Some urgent next steps
Alignment and specifying NGSS claims
• Alignment methodologies have been developed to examine and evaluate the degree of appropriate correspondence between:
– Standards and standards
– Standards and assessments
– Claims (reporting) and assessments
• An Interpretive Argument and a Validation Argument are a more complete declaration and evaluation of the relationships between claims (interpretations and uses) and assessments
My focus regarding NGSS complexity
• I am currently focusing on the claims necessary to create an Interpretive Argument, both to help guide development of an NGSS assessment and to evaluate it:
– Domain definition
– PLDs and other statements of quality of performance
– Reporting categories and subcategories
– Evidence to support claims (e.g., test blueprint)
– Measurement model
• Alignment will be part of the Interpretive Argument
• Consideration of cognitive complexity will be an aspect both of the Interpretive Argument and of alignment
NGSS and complexity
• Complexity of what the NGSS mean: integration of the three dimensions of Science and Engineering Practices, Disciplinary Core Ideas, and Crosscutting Concepts, with associated rich documentation (appendices, after-market definitional materials, and examples)
• More than for most standards and assessments, designing assessments of the NGSS requires choices that are difficult to optimize:
– The NGSS are under-specified in several ways in comparison to typical content standards in other domains
– The NGSS performance expectations (PEs) are both too few to define the construct well and too many to assess practically
– Many states are still developing Interpretive Arguments and the associated claims
– States working toward claims often differ in significant ways, creating different targets for alignment, including complexity (however defined)
• Complexity is an important piece of NGSS alignment considerations
• A focus on item/cluster development prior to knowing the intended claim/evidence at the test level does not necessarily lead to viable tests
NGSS and cognitive complexity
• Today: NGSS focus on an analytical unit of "making a purposeful assertion and supporting it with scientific evidence and reasoning"
– This unit is larger than an individual item; it requires being interpreted as a whole, and should also be interpreted as parts
– This unit is smaller than a test because it is not (usually) sufficient evidence to support a test-level claim
• What are ways to characterize the cognitive complexity of assessments of this analytical unit?
– What contributes to the cognitive complexity required by the claim?
– What contributes to the cognitive complexity of the assessment evidence?
• Implications for task design (construct-relevant; construct-irrelevant)
• Implications for scoring of performance
Some relevant work on complexity in science assessments from the past
• Drawing on work by Shavelson, Baxter, Glaser, Wilson, Mislevy, and colleagues
– Context was primarily science performance tasks of the 1990s, in many state programs and university-based R&D projects
• Check for correspondence of those "science standards" to the NGSS
– The analytical approach was primarily that of cognitive psychologists, most with considerable measurement expertise
– The focus is always on the student making a claim and supporting it with evidence in scientific ways
Task types: Shavelson et al.'s four types (1997)
Comparative investigation
• Paper Towels: Discover which of three kinds of paper towels holds the most water and which holds the least (Baxter, Shavelson, Goldman, & Pine, 1992).
• Bubbles: Discover which of three soapy solutions produces the most durable bubbles (Solano-Flores, 1994; Solano-Flores & Shavelson, 1994b).
• Incline Planes: Determine the relationship between the angle of inclination and the amount of force needed to move an object up a plane (Solano-Flores, Jovanovic, Shavelson, & Bachman, 1994).
Component identification
• Electric Mysteries: Determine the components of the mystery box (Shavelson, Baxter, & Pine, 1991).
• Mystery Powders: Given a bag containing substances commonly found in the kitchen (e.g., baking soda, starch, sugar), determine which substances are in the bag (Baxter, Elder, & Glaser, 1995; Baxter & Shavelson, 1995).
• Motor: Given a motor, a battery, and a box containing a battery, determine the polarity of the battery that is inside the box (Druker, Solano-Flores, Brown, & Shavelson, 1996).
Classification
• Sink & Float: Create a classification system that allows you to predict whether an object will sink or float in tap water (Solano-Flores, Shavelson, Ruiz-Primo, Schultz, Wiley, & Brown, 1997).
• Rocks & Charts: Given a set of minerals, test the minerals for known attributes and create a classification system using those attributes (Druker, 1997).
Observation
• Daytime Astronomy: Model the path of the sun from sunrise to sunset and use direction, length, and angle of shadows to solve location problems (Solano-Flores, Shavelson, Ruiz-Primo, Schultz, Wiley, & Brown, 1997).