

  1. Relationship between test item cognitive complexity, content, and instructional sensitivity. Tanya Longabach; Jie Chen, Ph.D.; Neal Kingston, Ph.D.

  2. Presentation outline
     • Significance of instructional sensitivity in today's education environment
     • Study purpose
     • Methods and data analysis
     • Results: factors that may influence reading and math item instructional sensitivity
     • Results: difference in instructional sensitivity of reading and math items
     • Implications and directions for future research

  3. Instructional sensitivity in today's education environment
     • Increased interest in instructional sensitivity as a means to determine the appropriateness of assessments
     • Instructional sensitivity is the responsiveness of a test or test item to instruction
     • A test is only as instructionally sensitive as the items on it
     • However, few studies have examined what makes test items instructionally sensitive

  4. Issues associated with determining instructional sensitivity
     • Critical thinking items and instructional sensitivity (Wiliam, 2007; Embretson, 2010; Muthen, Kao, & Burstein, 1988)
       - Critical thinking items tend to be less instructionally sensitive
     • Content subjects (e.g., math vs. reading) (Nye et al., 2004; Boscardin et al., 2005)
       - Math items tend to be more instructionally sensitive than reading items

  5. Study purpose
     • Address the gaps in knowledge about characteristics of instructionally sensitive items
     • Examine the impact of the following test item characteristics on instructional sensitivity:
       - Use of a formula to answer the question (math only)
       - Presence of a chart or graph in the item (math only)
       - Use of special vocabulary in the item (math only)
       - Cognitive complexity as defined by Bloom's taxonomy (math and reading)
       - Standard to which the items were written (math and reading)
       - Content subject (math vs. reading)
       - Grade level

  6. Bloom's taxonomy

  7. Kansas Curricular Standards example: mathematics (indicator number; indicator description; number of items in each interim assessment)
     Number and computation
     • M.5.1.1.K1: Represents and explains whole numbers and non-negative rational numbers from 0 to 1,000,000. (2 items)
     • M.5.1.3.A4: Uses a variety of computational methods to solve problems with exact or approximate answers. (2 items)
     • M.5.1.3.K2: Uses various strategies to estimate non-negative whole and rational quantities. (2 items)
     • M.5.1.4.A1: Solves one- and two-step problems using a variety of computational procedures. (2 items)
     • M.5.1.4.K4: Determines greatest common factor and least common multiple of two whole numbers. (2 items)
     Algebra
     • M.5.2.2.K1: Represents and relates unknown quantities from 0 to 1,000 using variables and symbols. (2 items)
     • M.5.2.2.K2: Solves one-step equations with whole number solutions using addition, subtraction, or multiplication. (2 items)
     • M.5.2.3.K4: Uses a function table to identify, plot, and label points in the first quadrant of the coordinate plane. (2 items)

  8. Kansas Curricular Standards example: reading (indicator number; indicator description; number of items in each interim assessment)
     Reading, Vocabulary
     • R.5.1.3.1: Determines the meaning of words or phrases by using context clues (e.g., definitions, restatements, examples, descriptions) from sentences or paragraphs. (2 items)
     • R.5.1.3.4: Determines meaning of words through knowledge of word structure (e.g., contractions, root words, prefixes, suffixes). (2 items)
     Literature, Literary Concepts
     • R.5.2.1.1: Identifies and describes characters' physical traits, personality traits, and feelings, and explains reasons for characters' actions and the consequences of those actions. (2 items)
     • R.5.2.1.2: Identifies and describes the setting (e.g., environment, time of day or year, historical period, situation, place) and explains the importance of the setting to the story or literary text. (2 items)
     • R.5.2.1.3: Identifies and describes the major conflict in a story and major events related to the conflict (e.g., problem or conflict, climax, resolution). (2 items)

  9. Methods
     • Data from the 2011-2012 Kansas Interim Assessment, window 2 (November 11, 2011, to January 13, 2012)
     • Multi-stage computer adaptive test, two testlets
     Mathematics (grade level: N items / N students)
     • 3rd grade: 24 / 1307
     • 4th grade: 28 / 778
     • 5th grade: 30 / 724
     • 6th grade: 28 / 936
     • 7th grade: 30 / 1220
     • 8th grade: 28 / 1387
     • Total: 168 / 6350
     Reading (grade level: N items / N students)
     • 3rd grade: 18 / 1194
     • 4th grade: 21 / 807
     • 5th grade: 21 / 838
     • 6th-8th grade: n/a
     • Total: 60 / 2839

  10. Data analysis
     • Conducted logistic regression to identify instructionally sensitive items (Chen & Kingston, 2012):
       Z = β0 + β1*θ + β2*G + β3*(θ*G)
     • Instructional sensitivity was measured as the difference between Nagelkerke's R² value of the logistic regression model in step 3 and that of the model in step 1 (Chen, 2012), converted to a z score:
       ΔR² = R²(M3) - R²(M1)
     • Conducted correlation and stepwise regression to examine the relationship of instructional sensitivity with predicting factors
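To make the two-step procedure concrete, the sketch below (not part of the presentation) shows how an item's ΔR² could be computed in Python with statsmodels, assuming per-student records with a dichotomous item score y, an ability estimate θ, and an instruction/group indicator G; the function names and the Nagelkerke formula implementation are illustrative assumptions, not the authors' code.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def nagelkerke_r2(fit, n):
    # Nagelkerke's R^2 = Cox-Snell R^2 rescaled by its maximum attainable value.
    cox_snell = 1.0 - np.exp((2.0 / n) * (fit.llnull - fit.llf))
    max_cox_snell = 1.0 - np.exp((2.0 / n) * fit.llnull)
    return cox_snell / max_cox_snell

def item_delta_r2(y, theta, group):
    """Delta R^2 for one item: full model M3 (ability, group, interaction)
    minus the ability-only model M1, following the slide's
    Z = b0 + b1*theta + b2*G + b3*(theta*G) setup."""
    n = len(y)
    m1_X = sm.add_constant(pd.DataFrame({"theta": theta}))
    m3_X = sm.add_constant(pd.DataFrame({"theta": theta,
                                         "group": group,
                                         "theta_x_group": theta * group}))
    m1 = sm.Logit(y, m1_X).fit(disp=0)
    m3 = sm.Logit(y, m3_X).fit(disp=0)
    return nagelkerke_r2(m3, n) - nagelkerke_r2(m1, n)

# The per-item delta R^2 values would then be standardized across items,
# e.g. z = (deltas - deltas.mean()) / deltas.std(ddof=1), giving the z scores
# used as the instructional sensitivity measure in the tables that follow.
```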

  11. Results: instructionally sensitive items

  12. Correlation results: mathematics (correlation of instructional sensitivity* with each item characteristic)
      Grade   Bloom's taxonomy   S_1    S_2    S_3    graph   vocabulary   formula
      3M      -.10               -.25   .22    -.35   -.35    .88**        .52**
      4M      -.32               -.21   .12    -.18   .37     .33          .13
      5M      -.08               -.21   .12    -.19   -.25    -.10         .52**
      6M      -.41*              -.22   .60**  -.47*  .40*    .41*         -.14
      7M      .07                .14    -.03   -.18   -.38*   .03          .40*
      8M      -.23               .03    .05    -.26   -.08    .16          -.16
      *Instructional sensitivity from here on was measured as the difference between Nagelkerke's R² value of the logistic regression model in step 3 and that of the model in step 1 (Chen, 2012), converted to a z score. Significant correlations are starred: * p < .05, ** p < .01.

  13. Regression results: mathematics
      Grade   Variable 1   β weight   Variable 2   β weight   R²
      3rd     vocabulary   .86        graph        -.23       .82
      4th     -            -          -            -          -
      5th     formula      .52        -            -          .28
      6th     standard     .60        -            -          .36
      7th     formula      .47        graph        -.46       .36
      8th     -            -          -            -          -
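As a companion to the stepwise regression mentioned on slide 10, here is a minimal forward-selection sketch regressing the item sensitivity z scores on item characteristics; the p-value entry rule, the helper name forward_stepwise, and the column names are illustrative assumptions rather than the authors' actual procedure.

```python
import statsmodels.api as sm

def forward_stepwise(y, X, p_enter=0.05):
    """Forward selection: repeatedly add the predictor with the smallest
    p-value, stopping when no remaining predictor enters at p_enter."""
    selected, remaining = [], list(X.columns)
    while remaining:
        pvals = {}
        for cand in remaining:
            fit = sm.OLS(y, sm.add_constant(X[selected + [cand]])).fit()
            pvals[cand] = fit.pvalues[cand]
        best = min(pvals, key=pvals.get)
        if pvals[best] >= p_enter:
            break
        selected.append(best)
        remaining.remove(best)
    final = sm.OLS(y, sm.add_constant(X[selected])).fit() if selected else None
    return selected, final

# Hypothetical usage: sens is the per-item sensitivity z score, items is a
# DataFrame with columns like "blooms", "graph", "vocabulary", "formula".
# chosen, fit = forward_stepwise(sens, items)
```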

  14. Results: instructionally sensitive items

  15. Correlation and regression results: reading
      Correlation of instructional sensitivity with each item characteristic:
      Grade   Bloom's taxonomy   S_1    S_2
      3R      .02                -.30   -.11
      4R      -.35               .30    .25
      5R      -.21               .13    .35
      • Regression: no variables were entered into the equation.

  16. Combined grades by subject

  17. Correlation and regression results: grades combined by subject
      Correlation of instructional sensitivity with each item characteristic:
      Subject (grades combined)   Bloom's taxonomy   graph   vocabulary   formula   grade
      All grades math             -.29**             -.03    .26**        .09       -.08
      All grades reading          .13                n/a     n/a          n/a       .15
      Regression:
      Subject (grades combined)   Variable 1         β weight   Variable 2   β weight   R²
      All grades math             Bloom's taxonomy   -.24       vocabulary   .21        .12
      All grades reading          -                  -          -            -          -

  18. Discussion
     • More instructionally sensitive items in the math assessment than in the reading assessment
     • Reading skills are acquired in a variety of settings, while math skills are acquired mainly in school
     • Math and reading assessments may be testing constructs at different levels of granularity
     • Non-linear relationship of instructional sensitivity with the grade variable

  19. Implications and future directions
     • OTL (opportunity to learn) information should be included in test information datasets for the purposes of research on instructional sensitivity
     • Further research is needed on how to make test items more instructionally sensitive
     • Controlling for confounding factors, such as previous exposure to content, will increase the precision of instructional sensitivity measurement
     • Standard clarity should be examined further
     • Using more precise measurement of OTL
     • Nested information needs to be taken into account

  20. Questions? Comments? For additional information, please contact: Tanya Longabach tlongabach@ku.edu
