Assessment & Evaluation
Richard J. Shavelson
Stanford University & Yosemite National Institutes
NPS Symposium, San Diego, California, April 18, 2002
Sessions I & II: Evaluation Background & Group Activity
• Evaluation: Uses, Misuses, and a Working Definition
• Distinctions: Assessment & Evaluation
• Evaluation: Formative and Summative
• Evaluation: Interpretative and Other Educational Programs
• Small-Group Activity: Designing an Evaluation
Evaluation: Uses, Misuses—The YNI Experience
• Initial Use:
  – Education Committee: Improve the field-education program
  – Administration & Some Board Members: Demonstrate, through an independent “Stanford” evaluation, the effectiveness of YNI’s field-based education program
• Initial Misuse: Evaluation was viewed as a marketing tool with the imprimatur of “Stanford University”
• Current Use: Evaluators turned the initial purpose into a focus on field-based education improvement, especially inquiry-based learning
Evaluation: A Working Definition
Evaluation is the art (not a “model”) of bringing conceptual, political, and empirical evidence to bear on some NPS program, using a variety of tools from the social sciences, in order to reach a judgment as to the value of the program in meeting its goals, intended and unintended.
Some Distinctions: Assessment & Evaluation
• Assessment refers to a combination of social measurements (a test, a performance assessment, an observation scale, an interview scale, or an attitude scale score) that bears on an outcome of interest (see the sketch below).
• Evaluation is a judgment of the value of the program under inquiry, on the basis of artful and rigorous application of scientifically justifiable methods.
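A minimal sketch of the “combination of social measurements” idea, under assumptions not in the original slides: the instrument names, scales, scores, and equal weighting below are all hypothetical, chosen only to show how several measures might be standardized and combined into one assessment composite.

```python
# Minimal sketch: combining several hypothetical measurements into one
# standardized assessment composite. Scale names and data are illustrative only.
import numpy as np

def standardize(scores):
    """Convert raw scores to z-scores so different scales are comparable."""
    scores = np.asarray(scores, dtype=float)
    return (scores - scores.mean()) / scores.std(ddof=1)

# Hypothetical scores for ten program participants on three instruments.
knowledge_test    = [62, 75, 58, 80, 71, 66, 90, 55, 73, 68]            # 0-100 test
observation_scale = [3.1, 4.0, 2.8, 4.5, 3.6, 3.2, 4.8, 2.5, 3.9, 3.4]  # 1-5 rubric
attitude_scale    = [2.9, 3.8, 3.0, 4.2, 3.5, 3.1, 4.6, 2.7, 3.7, 3.3]  # 1-5 Likert mean

# Equal-weight composite of the standardized measures; the weights are a choice
# the evaluator must justify, not a given.
composite = np.mean(
    [standardize(knowledge_test),
     standardize(observation_scale),
     standardize(attitude_scale)],
    axis=0,
)
print(np.round(composite, 2))
```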
Example (“Summative”) Evaluation: CPB/Annenberg Distance Education Curriculum
• Background & question—does it work?
• Evaluator’s turn of the question—exchangeability
• Absence of an evaluation model to fit the context
• Creativity in quasi-experimental design with quantitative and qualitative data
• Judgment of value
Formative & Summative Evaluation
• Formative—
  – To improve the NPS program during its development
  – Feedback to close the gap between goals and present conditions (see the gap-report sketch below)
• Summative—
  – To judge the overall value of the NPS program
  – Best conceived as comparing the value of the program against alternatives or standards, sometimes considering program cost (e.g., the CPB evaluation, the class-size reduction evaluation)
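A minimal sketch of the formative “close the gap” idea, under assumptions not in the original slides: the program areas, rubric scale, target levels, and flagging threshold below are all hypothetical, chosen only to show how current measures might be compared against stated goals to produce improvement-oriented feedback.

```python
# Minimal sketch of formative feedback: compare current program measures
# against stated goals and flag the largest gaps. All numbers are hypothetical.
goals = {          # target level on a 1-5 program rubric
    "inquiry-based activities": 4.5,
    "staff training coverage":  4.0,
    "pre-trip communication":   4.0,
    "stewardship integration":  4.5,
}
current = {        # most recent observed level
    "inquiry-based activities": 3.6,
    "staff training coverage":  3.9,
    "pre-trip communication":   2.8,
    "stewardship integration":  3.1,
}

gaps = {area: goals[area] - current[area] for area in goals}
for area, gap in sorted(gaps.items(), key=lambda kv: kv[1], reverse=True):
    flag = "needs improvement" if gap > 0.5 else "on track"
    print(f"{area:28s} gap={gap:+.1f}  ({flag})")
```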
Use of Formative Evaluation—The YNI Experience

Staff Training
  Doing Well: Focus on inquiry, community connections, bio-monitoring, diversity
  Needs Improvement: YI: more inquiry activities, diversity/connections training
  Recommendations: All: Assess staff competencies and create training around weaknesses

Pre-Trips
  Doing Well: Realize that we have as much to learn from pre-trips as students do
  Needs Improvement: Improve knowledge-sharing between groups
  Recommendations: All: Create communication between Outreach/Field Science

Inquiry
  Doing Well: Staff is getting more comfortable with inquiry; trainings on the topic have occurred at all campuses; HI, OPI great progress
  Needs Improvement: YI: Need more support/training
  Recommendations: All: Continue to provide information; HI/YI/YNI: 5 staff attend IFI training in June

Stewardship
  Doing Well: Role/topic of service learning
  Needs Improvement: Stewardship being recognized as more than an end-of-program topic
  Recommendations: Clarify opportunities for stewardship and integrate into the program

Science
  Doing Well: Inquiry activities more common, biomonitoring occurring
  Needs Improvement: Inconsistent communication about the role of science in the program
  Recommendations: Discuss the history of science’s role in education and create an action plan for clarifying it within the organization

Source: Schneider, Quarterly Report, 2002
Use of Summative Evaluation—The Bavarian Experience
Source: Bogner, F. (1998). The influence of short-term outdoor ecology education on long-term variables of environmental perspectives. Journal of Environmental Education, 29(4), 17-29.
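A common way to summarize a summative treatment-versus-comparison contrast like the one studied by Bogner is a standardized mean difference (Cohen's d). The sketch below is hypothetical: the means, standard deviations, and sample sizes are illustrative placeholders, not Bogner's reported data.

```python
# Minimal sketch: standardized mean difference (Cohen's d) for a summative
# treatment-vs-comparison contrast. Scores are hypothetical, not Bogner's data.
import math

def cohens_d(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Effect size using the pooled standard deviation of the two groups."""
    pooled_sd = math.sqrt(
        ((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2)
    )
    return (mean_t - mean_c) / pooled_sd

# Hypothetical post-program environmental-attitude means (1-5 scale).
d = cohens_d(mean_t=3.9, sd_t=0.6, n_t=120, mean_c=3.5, sd_c=0.7, n_c=110)
print(f"Cohen's d = {d:.2f}")   # about 0.6: a moderate standardized difference
```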
Small-Group Discussion: Designing an Evaluation
• Groups choose a problem:
  – Design an evaluation of a National Park’s interpretation program
  – As part of a National Park’s strategic plan for linking with an external field-based science education program, design an evaluation to inform selection decisions
• Consider:
  – Mission
  – Activities
  – Goals
  – Credible evidence
• Groups report back
Appendix to Session I: CPB Evaluation Design
[Design diagram: fall and spring community-college classes at two sites (Site A, Site H), contrasting treatment (T) and comparison classes, with pretest and final-exam effect estimates annotating each comparison.]
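One standard analysis for a quasi-experiment of this shape is to adjust the final-exam outcome for the pretest before comparing treatment and comparison classes (an ANCOVA-style regression). The sketch below uses simulated data, not the CPB data; the sample size, score scales, and the built-in 5-point program effect are all assumptions made for illustration.

```python
# Minimal sketch of a pretest-adjusted (ANCOVA-style) comparison for a
# quasi-experiment like the CPB design. Data are simulated, not the CPB data.
import numpy as np

rng = np.random.default_rng(0)
n = 60                                   # students per group (hypothetical)
pretest_t = rng.normal(50, 10, n)        # treatment-class pretest
pretest_c = rng.normal(48, 10, n)        # comparison-class pretest (non-equivalent groups)
# Simulated outcomes: a built-in 5-point advantage for the treatment group.
final_t = 0.8 * pretest_t + 15 + rng.normal(0, 8, n)
final_c = 0.8 * pretest_c + 10 + rng.normal(0, 8, n)

pretest = np.concatenate([pretest_t, pretest_c])
final   = np.concatenate([final_t, final_c])
treat   = np.concatenate([np.ones(n), np.zeros(n)])

# OLS: final = b0 + b1*pretest + b2*treatment; b2 is the pretest-adjusted
# difference between the treatment and comparison classes.
X = np.column_stack([np.ones_like(pretest), pretest, treat])
coef, *_ = np.linalg.lstsq(X, final, rcond=None)
print(f"Adjusted treatment effect: {coef[2]:.1f} exam points")
```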
Session III: Overview
• Evaluation Questions and Appropriate Designs
• Some Problems with Environmental Education Evaluation “Models”
• Some Problems with Interpretation of Environmental Education Findings
Evaluation Questions and Appropriate Designs
• What’s happening?
  – Statistical point and relational estimates (see the sketch below)
  – Case and other qualitative (e.g., ethnographic) studies
• Is there a systematic effect?
  – Randomized experiments
  – Quasi-experiments
  – Correlational studies
• How is it happening?
  – Experiments
  – Correlational studies
  – Case and other qualitative (e.g., ethnographic) studies
Source: Shavelson & Towne (Eds.) (2002), Scientific Research in Education. National Academies Press.
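A minimal sketch of the descriptive “What's happening?” estimates, under assumptions not in the original slides: the survey items, scales, and responses below are hypothetical, chosen only to show a point estimate with a confidence interval and a simple relational (correlation) estimate.

```python
# Minimal sketch of "What's happening?" estimates: a point estimate with a
# 95% confidence interval and a simple relational (correlation) estimate.
# Data are hypothetical program-survey responses.
import numpy as np
from scipy import stats

satisfaction  = np.array([4.1, 3.8, 4.5, 3.2, 4.0, 4.4, 3.9, 4.2, 3.6, 4.3])  # 1-5 scale
days_in_field = np.array([3, 2, 5, 1, 3, 5, 3, 4, 2, 4])

# Point estimate with a 95% confidence interval based on the t distribution.
mean = satisfaction.mean()
sem = stats.sem(satisfaction)
lo, hi = stats.t.interval(0.95, df=len(satisfaction) - 1, loc=mean, scale=sem)
print(f"Mean satisfaction: {mean:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")

# Relational estimate: correlation between days in the field and satisfaction.
r, p = stats.pearsonr(days_in_field, satisfaction)
print(f"Correlation with days in the field: r = {r:.2f} (p = {p:.3f})")
```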
Some Problems with Environmental Education Evaluation Models
• Inadequate design to address the question
  – One-shot and pre-post studies used to address summative-evaluation (“effects”) questions
  – Posttest attitude questionnaires used to address participants’ perceptions
  – Formative evaluation is seldom reported
• “A variety of research designs and data collection methods were employed. Surveys, observation, and interviewing were used most often, but some studies discuss… focus groups, quizboard testing, or photography…. Several studies employ a pre- and posttest design… but more often… a posttest. Often the form of experimental or quasi-experimental design was unclear” (Wells & Smith, 2000, p. 1).
Some Problems with Environmental Education Evaluation Models (Cont’d.)
• Over-interpretation of environmental education findings
• We have learned that:
  – “Education programs can effectively accomplish the dual-goal of helping students achieve national Science Standards while at the same time fostering stewardship of national parks” (Reynolds, 2001, p. 1).
  – “Based on … case studies of schools that use environmental education [EE] as the focus for their curriculum… Sward found ‘… compared to traditional educational approaches, [EE] improves academic performance across the curriculum’ (Glenn, 2000)” (ASCD, 2001).