Achieving Real Progress: Using a Developmental Model of Assessment
Ben Lawless, Adele Hudson
Aitken College, Melbourne
Culture change
• “I have found that the rubrics are like the building blocks that help me get the best marks that I can achieve.”
• “I like being able to see the rubric descriptors that students in higher year levels have been given, because it gives me something to aim for.”
• “I wouldn’t be where I am today without having these rubrics, as they challenged me to push myself to reach new heights.”
• “I really like it when the teachers give me examples of what each box in the rubric represents. It helps me to understand what I need to do to achieve the next level.”
• “It is so good that the descriptors in the rubrics do not change between Year 7 and Year 12. When I first arrived in Year 7 I had no idea what anything on the rubric meant, but now that I am in Year 10 I know the rubric so well that I don’t even need to look at it for parts of my assessment task.”
Assessment theory
Evidence: teacher experience, assessment data, research
Standard vs developmental models

Standard model:
• Assessment occurs after instruction is complete
• Teachers don’t question each other’s data or the strategies they use
• Teach the whole class at once, with a bit of help for the lower kids and a bit of extension for the top kids
• Compares students to norms and focuses on what students cannot do
• Deficit thinking: students must be at a certain year-level norm and I must correct all the deficits they have

Developmental model:
• Assessment is used to improve teaching
• Teachers hold each other accountable based on their data and teaching strategies
• Targeted teaching as much as possible – ideally individually, but even 3–5 levels is usually sufficient
• Compares students to criteria and focuses on where students are ready to learn
• Developmental thinking: assessment tells me where a student is in their development and I teach them from there
Against scores
▪ Don’t promote a growth mindset
▪ Don’t help improvement
▪ Make students feel ‘judged’
▪ Can’t inform decisions
▪ Hard to interpret
▪ Inaccurate reporting
▪ Parents think an ‘A’ means something, but it usually doesn’t
Norm vs criterion-referencing

Norm-referencing:
▪ Compare performance to other students
▪ Examples: Australian Curriculum, state curricula, VCE / HSC, ATAR, NAPLAN
▪ Good for: admissions, diagnosing learning disabilities, gathering system-level data

Criterion-referencing:
▪ Compare performance to criteria
▪ Examples: driving test, NCEA (NZ), skill-based rubrics (when written correctly)
▪ Good for: targeting instruction, measuring progress
Difficulty of interpretation
• We record outcomes but not inputs
• NAPLAN: what do schools do differently to produce different results? Have confounding factors been disregarded?
• Teacher level: do teachers record their teaching strategies and behaviours?
• If data is impossible to interpret, why gather it?
USE OF DATA
• Data collection takes time, but most of it isn’t used to improve teaching
• So why collect it? Other uses:
▪ Summative reporting
▪ Teacher accountability
▪ The Education Act
• Assessment should be for teaching
THE CURRICULUM IS NOT DEVELOPMENTAL
CREATING PROGRESSIONS
Why use progressions?
• Tell students what they can do
• Tell students how to get better
• Give much more detailed feedback
• Can show student progress over years
• Most within-years assessment isn’t equated, so scores don’t mean anything comparable
• Parents (etc.) incorrectly assume an ‘A’ means something
How
• Write skill-based rubrics that conform to certain guidelines:
▪ Criteria must describe increasing quality of performance
▪ No ambiguous language
▪ No counts or pseudo-counts
• Record assessment data electronically
• Analyse the data:
▪ Perform Guttman analysis on large data sets
▪ This can be used to infer criterion difficulty
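The Guttman analysis step above can be sketched in a few lines. This is a minimal, illustrative version only: it assumes results are recorded as a binary matrix (1 = criterion demonstrated, 0 = not yet), and all student and criterion names are made up, not the presenters' actual data.

```python
# Guttman scalogram sketch: sort students by total score and criteria
# by how many students demonstrated them, then look for the triangular
# pattern. Criteria that break the triangle suggest a badly written
# descriptor; the column order estimates criterion difficulty.

def guttman_order(results):
    """results: dict mapping student name -> dict of criterion -> 0/1."""
    students = list(results)
    criteria = sorted(
        {c for row in results.values() for c in row},
        key=lambda c: -sum(results[s].get(c, 0) for s in students),  # easiest first
    )
    students.sort(key=lambda s: -sum(results[s].values()))  # highest total first
    return students, criteria

def scalogram(results):
    """Render the sorted 0/1 matrix as text for visual inspection."""
    students, criteria = guttman_order(results)
    lines = ["student  " + " ".join(criteria)]
    for s in students:
        lines.append(f"{s:8} " + " ".join(str(results[s].get(c, 0)) for c in criteria))
    return "\n".join(lines)

# Hypothetical assessment records against four rubric criteria.
results = {
    "Ava":  {"list": 1, "find": 1, "explain": 1, "analyse": 0},
    "Ben":  {"list": 1, "find": 1, "explain": 0, "analyse": 0},
    "Caro": {"list": 1, "find": 0, "explain": 0, "analyse": 0},
    "Dev":  {"list": 1, "find": 1, "explain": 1, "analyse": 1},
}

print(scalogram(results))
```

In a clean Guttman pattern the 1s form a triangle: each student demonstrates every criterion easier than their hardest success. Real data sets are larger and noisier, which is why the slide recommends large data sets.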
LEVEL P: Students at this level discuss historical concepts
LEVEL O: Students at this level evaluate the reliability and purpose of sources
LEVEL N: Students at this level evaluate sources and historical events
LEVEL M: Students at this level analyse sources and can find authoritative sources
LEVEL L: Students at this level critique sources, and can use historical context in their writing
LEVEL K: Students at this level draw connections between different historical concepts
LEVEL J: Students at this level use multiple sources and can research independently
LEVEL I: Students at this level write descriptively about history and explain features of sources
LEVEL H: Students at this level discuss historical information in detail and use a variety of sources
LEVEL G: Students at this level can apply historical knowledge to answer questions
LEVEL F: Students at this level can write clearly and explain simple historical ideas
LEVEL E: Students at this level can make detailed historical observations
LEVEL D: Students at this level can make accurate suggestions about historical material
LEVEL C: Students at this level can find historical information
LEVEL B: Students at this level can perform simple actions with sources
LEVEL A: Students at this level can list information
Jane Patel – HISTORY PROGRESS REPORT 2019
LEVEL H: Students at this level discuss historical information in detail and use a variety of sources
LEVEL G: Students at this level can apply historical knowledge to answer questions
LEVEL F: Students at this level can write clearly and explain simple historical ideas
LEVEL E: Students at this level can make detailed historical observations
Key: End of 2018 | End of Semester 1 2019 | End of Semester 2 2019 | Average student achievement by end of Semester 2 2019
What can you do with a developmental progression? 1. Get students to track their own progress 2. Show students what improvement looks like 3. Target teaching of new skills at the right level 4. Design ability based groupings and teaching material
TARGETING TEACHING
CLINICAL TEACHING MODEL
1. What is the learner ready to learn? What evidence shows this?
2. What teaching strategies could be used?
3. Which is the best strategy? What is the expected impact on learning?
4. How will it be resourced and put into effect?
5. How will this be evaluated?
6. What happened? How can this be interpreted?
(The cycle then returns to step 1.)
Targeted teaching
• Rubrics diagnose a student’s “zone of actual development” (ZAD)
• Design individual interventions to target the “goldilocks zone” or “zone of proximal development” (ZPD)
• Or group students who need similar interventions
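The grouping step above can be sketched as follows, assuming each student's ZAD (highest level currently demonstrated on the progression) is known; the next level up is taken as their ZPD target, and students sharing a target form one intervention group. Student names and ZAD values are illustrative.

```python
from collections import defaultdict

# Progression levels from lowest to highest (letters as in the history
# progression above, A lowest through P highest).
LEVELS = list("ABCDEFGHIJKLMNOP")

def intervention_groups(zad_by_student):
    """Group students by the level each is ready to learn next (ZPD).

    zad_by_student: dict of student -> highest level currently demonstrated.
    """
    groups = defaultdict(list)
    for student, zad in zad_by_student.items():
        i = LEVELS.index(zad)
        target = LEVELS[min(i + 1, len(LEVELS) - 1)]  # next level up, clamped at P
        groups[target].append(student)
    return dict(groups)

# Hypothetical ZAD diagnoses from the rubrics.
zads = {"Ava": "E", "Ben": "E", "Caro": "C", "Dev": "H"}
print(intervention_groups(zads))  # {'F': ['Ava', 'Ben'], 'D': ['Caro'], 'I': ['Dev']}
```

Here Ava and Ben share a ZPD target, so they would receive the same small-group intervention, while Caro and Dev are taught at different levels.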
REFERENCES
• Australian Education Act (2013).
• Gonski, D., Arcus, T., Boston, K., Gould, V., Johnson, W., O'Brien, L., . . . Roberts, M. (2018). Through Growth to Achievement. Retrieved from https://www.appa.asn.au/wp-content/uploads/2018/04/20180430-Through-Growth-to-Achievement_Text.pdf
• Griffin, P. (2007). The comfort of competence and the uncertainty of assessment. Studies in Educational Evaluation, 33, 87-99.
• Griffin, P., & Care, E. (2009). Assessment is for teaching. Independence, 34(2), 56-59.
• Griffin, P. (Ed.) (2017). Assessment for Teaching (2nd ed.). Singapore: Cambridge University Press.
• Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81-112.
• Hattie, J. (2009). Visible Learning: A Synthesis of Over 800 Meta-Analyses Relating to Achievement. London: Routledge.
• Masters, G. (2005). Against the grade: In search of continuity in schooling and learning. Professional Educator, 4(1), 12-22.
• Masters, G. (2011). Assessing student learning: Why reform is overdue. Australian Council for Educational Research.
• Vygotsky, L. (1965). Thought and Language. Cambridge, MA: MIT Press.
blawless@aitkencollege.edu.au ahudson@aitkencollege.edu.au