Using Rubrics/Scoring Guides in Program Assessment Carrie Zelna, Ph.D. Director, Office of Assessment Division of Academic and Student Affairs
Outcomes • identify possible opportunities for using rubrics/scoring guides as a tool in program assessment • describe the types of rubrics/scoring guides that can be applied to program assessment • identify the steps necessary to apply rubrics/scoring guides systematically (Norming, Sampling and Analysis)
Four Steps of Assessment
Linda Suskie (2009). Assessing Student Learning: A Common Sense Guide (2nd ed.). Jossey-Bass.
1. Establish learning goals (Plan)
2. Provide learning opportunities (Act)
3. Assess student learning (Observe)
4. Use the results (Reflect)
(Plan/Act/Observe/Reflect labels: P. Steinke & C. Zelna)
Linda Suskie, email to the Assess listserv, 4/4/2008: "...understand that assessment is action research, not experimental research. While it is systematic, action research is context-specific, informal, and designed to inform individual practice. As such, it doesn't have the precision, rigor, or generalizability of experimental research."
Expectations at NC State: http://www.ncsu.edu/assessment/acad_uaap.htm
1. All degree programs and transcripted certificates have a set of 4-7 program-level student learning outcomes, measured within a 3-5 year cycle using direct evidence
2. Make clear decisions based on the data (*note: decisions, not necessarily changes)
How to Identify Products: Curriculum Maps • Identifies where concepts are taught • Highlights potential issues in the curriculum • Identifies possible key courses that may have course products for assessment
Example: Genetics Curriculum Map
Required courses: GN 311, GN 312, GN 421, GN 423, GN 425, GN 492/493. Elective courses: GN 434, GN 441, GN 451, GN 490. (Course titles include: Principles of Genetics; Elementary Genetics Laboratory; Molecular Genetics; Population, Quantitative and Evolutionary Genetics; Advanced Genetics Laboratory; Special Problems in Genetics; Genes and Development; Human and Biomedical Genetics; Genome Science Colloquium.)
LEARNING OUTCOMES — Graduates will be able to:
1) Demonstrate a sound working knowledge of the principles of genetics
   A) Describe the basic concepts in molecular, population, quantitative and evolutionary genetics
   B) Describe how knowledge in genetics is based upon research and the interpretation of experimental results
   C) Describe how model genetic systems are used to understand the biology of all organisms
2) Engage in scientific inquiry and apply technical, analytical and critical thinking skills to solving problems in genetics
   A) Demonstrate the ability to solve genetics problems in the classroom or laboratory
   B) Describe experimental systems used in genetics research
   C) Describe basic laboratory and computational techniques used in research areas such as transmission genetics, population genetics, cytogenetics and molecular genetics
   D) Develop hypotheses related to a research project
   E) Design experiments aimed at answering hypotheses or basic genetics questions
   F) Demonstrate skill at collecting data and analyzing results
(On the original slide these appear as a matrix: each outcome is mapped to the courses that address it, with cells containing activity codes such as E, L, Q, M, O, R, P, D, and H.)
Types of Evidence • Selected Response • Constructed Response • Product/Performance
Selected-Response: Measuring Acquisition of Knowledge and Skills Traditional Test Questions • True/False • Matching • Multiple Choice • Course Assessment: Look for patterns in the answers
Constructed Response • Short-Answer Essay Questions • Concept Maps • Identifying Themes • Making Predictions • Summaries • Explain Your Solution Course Assessment: Rubrics/Scoring Guides http://jfmueller.faculty.noctrl.edu/toolbox/tasks.htm
Product/Performance
"...reveals their understanding of certain concepts and skills and/or their ability to apply, analyze, synthesize or evaluate those concepts and skills"*
Examples: research paper, capstone project, article reviews, film analysis, case study, error analysis, panel discussion, fishbowl discussion, oral presentations
Course assessment: rubrics/scoring guides
* http://jfmueller.faculty.noctrl.edu/toolbox/tasks.htm
Rubric: Definition and Purpose • Rubric: “a scoring tool that lays out the specific expectations for an assignment” (Stevens & Levi, 2005, p. 3) • It is a way of organizing criteria to systematically determine if the outcome is met based on data gathered through papers, observation, document analysis, or some other appropriate method. • When you review the data in the aggregate, a rubric can help identify patterns of strengths and weaknesses that might allow for enhancements to the program.
Types of Rubrics 1. Check-list 2. Rating Scale 3. Descriptive 4. Holistic 5. Structured Observation Guide
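The types above can be thought of as data structures. As a minimal sketch, a descriptive rubric maps each dimension to scale values with performance descriptors (the dimension names and descriptor text below are invented for illustration, not taken from the presentation):

```python
# A simple descriptive rubric: each dimension maps scale values (0-3)
# to a description of performance at that level.
# All dimension names and descriptors here are hypothetical examples.

rubric = {
    "Thesis": {
        3: "Clear, arguable thesis that frames the whole paper",
        2: "Thesis present but partially unfocused",
        1: "Thesis implied but never stated",
        0: "No identifiable thesis",
    },
    "Evidence": {
        3: "Claims consistently supported with relevant evidence",
        2: "Most claims supported; some evidence weakly tied to claims",
        1: "Sparse or largely irrelevant evidence",
        0: "No supporting evidence",
    },
}

def describe(dimension, score):
    """Look up the descriptor for a given dimension and scale value."""
    return rubric[dimension][score]

print(describe("Evidence", 2))
```

A checklist rubric would replace the scale with yes/no criteria, and a holistic rubric would keep one set of descriptors for the product as a whole rather than per dimension.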
Things to Consider • Testing • Norming (multiple raters) • Sampling • Scoring
Testing Your Rubric/Scoring Guide
• Metarubric: use a metarubric (a rubric for evaluating rubrics) to review your work
• Peer review: ask a colleague to review the rubric and provide feedback on content
• Student review: where appropriate, ask several students to review the rubric (students in a class may even help you create it)
• Test with products: once the rubric feels ready, test it against actual student work (3-5 products)
Norming with Multiple Raters
• Will multiple reviewers look at each product?
• Spend time walking through the rubric/scoring guide as a group.
• Review one product individually, then compare responses.
• Repeat as needed (5 to 10 products) until you are comfortable that everyone is on the same page.
• This may result in additional changes to the rubric/scoring guide.
• Consider re-norming throughout the process to ensure that raters are not drifting.
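One simple way to check whether raters are "on the same page" (or drifting) is to compute exact-agreement rates per dimension during norming rounds. A minimal sketch, with made-up dimension names and scores:

```python
# Norming check: on products scored by two raters, what fraction of
# scores match exactly, per rubric dimension?
# The dimension names and scores below are illustrative, not real data.

def agreement_rates(rater_a, rater_b):
    """Fraction of products where both raters gave the identical score,
    computed separately for each rubric dimension."""
    rates = {}
    for dim in rater_a:
        pairs = list(zip(rater_a[dim], rater_b[dim]))
        rates[dim] = sum(a == b for a, b in pairs) / len(pairs)
    return rates

# Each list holds one rater's scores for the same five products, in order.
rater_a = {"Objectivity": [3, 2, 3, 1, 2], "Resolution": [2, 2, 3, 1, 0]}
rater_b = {"Objectivity": [3, 2, 2, 1, 2], "Resolution": [2, 2, 3, 2, 0]}

print(agreement_rates(rater_a, rater_b))  # both dimensions: 4 of 5 match (0.8)
```

If agreement on a dimension stays low after several rounds, that is usually a sign the dimension's descriptors need rewording rather than more practice.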
Sampling: Being Systematic
• Should you sample at all, or review all the products?
• Mary Allen's rule of thumb: 50 to 75 products is sufficient for assessment
• Consider the attitudes of the faculty toward data:
  • Quantitative approaches
  • Qualitative approaches
**My office can help choose random students for your assessment.
Qualitative Approach to Sampling
A qualitative approach would be to apply the rubric to 15 to 20 portfolios randomly chosen across the sections, analyze the results, and repeat. Do this three times, then compare all the results. If the results are consistent, you can choose to stop; if not, repeat again. When you start to see the same results consistently, you have reached what is known as "data saturation." In qualitative methods, seeing consistent results is the key to knowing that your sample is representative of the population.
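The repeated-draw step above can be sketched in a few lines. This is only an illustration: the portfolio IDs, pool size, and sample size are invented, and a fixed seed is used so the draw is reproducible for documentation purposes:

```python
import random

# Draw several independent random samples of portfolios, as in the
# qualitative saturation approach: sample, analyze, repeat, compare.
# Pool size (200), sample size (15), and IDs are hypothetical.

def draw_samples(portfolio_ids, sample_size, rounds, seed=0):
    rng = random.Random(seed)  # seeded for reproducibility
    return [rng.sample(portfolio_ids, sample_size) for _ in range(rounds)]

portfolios = [f"P{n:03d}" for n in range(1, 201)]  # e.g. P001 ... P200
samples = draw_samples(portfolios, sample_size=15, rounds=3)

for i, s in enumerate(samples, 1):
    print(f"Round {i}: {len(s)} portfolios drawn")
```

Each round's rubric results would then be summarized and compared across rounds; stop when the summaries stabilize (saturation), otherwise draw another round.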
Scoring
• Individual vs. aggregate scores
• Average score (mean), by dimension and in total
  • Total score: review total scores to get the big picture
  • Dimension: review dimension scores to look for patterns
• Frequency distributions
  • Scale: frequencies by scale value give a clearer understanding of the data
*Nothing says you have to have a quantitative value... we will come back to this.
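The three summaries above (dimension means, per-student totals, and frequency distributions) are straightforward to compute. A minimal sketch using the Python standard library, with a few invented scores on a 0-3 scale (dimension names echo the example that follows):

```python
from statistics import mean
from collections import Counter

# Illustrative rubric scores for five students on two dimensions (0-3 scale).
# These numbers are made up; real data would come from the scored products.
scores = {
    "Dissonance": [3, 3, 3, 2, 1],
    "Resolution": [3, 3, 2, 2, 1],
}

# Average score by dimension (look for patterns of strength/weakness)
dim_means = {dim: mean(vals) for dim, vals in scores.items()}

# Total score per student (the big-picture view)
totals = [sum(vals) for vals in zip(*scores.values())]

# Frequency distribution by scale value, per dimension
freqs = {dim: Counter(vals) for dim, vals in scores.items()}

print(dim_means)  # {'Dissonance': 2.4, 'Resolution': 2.2}
print(totals)     # [6, 6, 5, 4, 2]
```

Frequencies are often more revealing than means here: a mean of 2.0 could hide a split between students scoring 3 and students scoring 1.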
Example: University of Virginia
Scoring the Data

| ID# | Class | Age | Gender | Paper Length | Total | Separation/Objectivity | Dissonance | Understanding/Change in Perspective | Self-Perception | Resolution | Application/Verification | Totals |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| A | FR | 19 | F | 5 | 18 | 3 | 3 | 3 | 3 | 3 | 3 | 18 |
| B | SR | 21 | M | 3 | 17 | 3 | 3 | 3 | 3 | 3 | 2 | 17 |
| C | FR | 18 | F | 7 | 16 | 3 | 3 | 3 | 2 | 2 | 3 | 16 |
| D | SR | 21 | M | 5 | 16 | 3 | 3 | 3 | 2 | 3 | 2 | 16 |
| E | SO | 19 | F | 9 | 15 | 2 | 3 | 3 | 2 | 2 | 3 | 15 |
| F | FR | 18 | M | 3 | 14 | 3 | 3 | 3 | 2 | 3 | 0 | 14 |
| G | SO | 20 | M | 3 | 14 | 3 | 3 | 3 | 0 | 3 | 2 | 14 |
| H | SO | 19 | M | 5 | 13 | 2 | 2 | 3 | 2 | 2 | 2 | 13 |
| I | FR | 18 | M | 8 | 13 | 3 | 3 | 3 | 2 | 2 | 0 | 13 |
| J | JR | 20 | F | 5 | 13 | 2 | 2 | 2 | 2 | 3 | 2 | 13 |
| K | SO | 20 | M | 5 | 13 | 3 | 3 | 2 | 2 | 2 | 1 | 13 |
| L | FR | 18 | M | 7 | 13 | 2 | 3 | 2 | 2 | 2 | 2 | 13 |
| M | JR | 20 | F | 3 | 11 | 3 | 3 | 3 | 0 | 2 | 0 | 11 |
| N | FR | 18 | F | 5 | 10 | 2 | 2 | 2 | 2 | 2 | 0 | 10 |
| O | SO | 22 | M | 4 | 10 | 2 | 3 | 2 | 2 | 2 | 0 | 11 |
| P | FR | 18 | F | 6 | 10 | 2 | 3 | 1 | 2 | 1 | 1 | 10 |
| Q | FR | 19 | M | 9 | 9 | 2 | 2 | 1 | 2 | 1 | 1 | 9 |
| R | FR | 18 | M | 3 | 9 | 2 | 3 | 2 | 1 | 1 | 0 | 9 |
| S | FR | 18 | M | 15 | 7 | 2 | 1 | 1 | 1 | 1 | 1 | 7 |
| T | SO | 20 | F | 4 | 7 | 1 | 2 | 0 | 1 | 2 | 1 | 7 |
| Average | | | | | | 2.53 | 2.79 | 2.37 | 1.84 | 2.21 | 1.37 | 13.11 |
Frequencies

| Dimension | Scale: 3 | Scale: 2 | Scale: 1 | Scale: 0 |
|---|---|---|---|---|
| Separation/Objectivity | 9 | 10 | 1 | 2 |
| Dissonance | 14 | 5 | 1 | 2 |
| Understanding/Change in Perspective | 10 | 6 | 3 | 3 |
| Self-Perception | 2 | 13 | 3 | 4 |
| Resolution | 6 | 10 | 4 | 2 |
| Application/Verification | 3 | 6 | 5 | 8 |

(Bar chart on the original slide: frequency of each scale value, 0-3, for each dimension.)
Other “Scoring” Options
• Structured observation guide
• Thematic approach: using qualitative coding to determine what you are seeing
• Open coding on a paper (still systematic)
References/Resources
• Allen, M. J. (2006). Assessing General Education Programs. Bolton, MA: Anker Publishing Co.
• Stevens, D. D., & Levi, A. J. (2005). Introduction to Rubrics. Sterling, VA: Stylus Publishing, LLC.
• Suskie, L. (2009). Assessing Student Learning: A Common Sense Guide (2nd ed.). San Francisco, CA: Jossey-Bass.