
Institutional Learning Outcomes Information Literacy Pilot



1. Institutional Learning Outcomes Information Literacy Pilot Assessment Project. Presented to the ILO Subcommittee, February 5th, 2018. 2014-15 | 2017-18. Fanny Yeung, Institutional Research, Analysis, and Decision Support; Julie Stein, Educational Effectiveness, APS

2. ILO Information Literacy Rubric Developed, 2014-15: FLC Did Not Include All Colleges (COS, CBE)

3. ILO Information Literacy Rubric Further Developed, Winter 2017: Faculty Included from All Colleges
Information Literacy Rubric Development faculty (First Name | Last Name | College | Department):
Stephanie | Alexander | Library | Project Lead, University Libraries
Stephanie | Seitz | CBE | Management
Deepika | Mathur | CLASS / COS | Human Development & Health Sciences
Craig | Derksen | CLASS | Philosophy
Tom | Bickley | Library | University Libraries
Matt | Atencio | CEAS | Kinesiology
Doc | Matsuda | CLASS | Anthropology

4. ILO Information Literacy Rubric, Winter/Spring 2017

5. ILO Information Literacy Rubric Changes, 2014-15 to 2017 [side-by-side comparison; two elements labeled "new" in the 2017 version]

6. ILO Information Literacy Rubric Piloted, Spring 2017
Information Literacy Rubric Application faculty (First Name | Last Name | College | Department):
Jean | Moran | COS | Earth & Environmental Sciences
Doc | Matsuda | CLASS | Anthropology
Deepika | Mathur | CLASS / COS | Human Development & Health Sciences
Matt | Atencio | CEAS | Kinesiology
Jeff | Newcomb | CBE | Marketing & Entr.
Ben | Klein | CLASS | History
Becky | Beal | CEAS | Kinesiology
Rahima | Gates | COS | Health Sciences

7. ILO Information Literacy Pilot, Spring 2017: Assessment Results

8. Spring 2017 ILO Information Literacy Assessment Results

9. Individual Scores vs. Average Scores
Domain | BB Individual Scores (n=149 reviews) | Individual Scores (n=120 reviews) | Average Scores (n=60 artifacts)
Scope | 3.30 | 3.23 | 3.23
Gather | 2.87 | 2.77 | 2.77
Evaluate | 2.85 | 2.71 | 2.71
Synthesize | 3.07 | 2.98 | 2.98
Communicate | 3.19 | 3.09 | 3.09
Attribute | 3.06 | 3.03 | 3.03
*Note: BB scores are automated prior to data verification and cleaning. Additionally, BB scores reflect all artifacts and assessments (i.e., including graduate courses) from the collection.
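The Individual (n=120) and Average (n=60) columns match to two decimals, which is what you would expect when every artifact receives exactly two reviews (120 reviews across 60 artifacts): the mean of per-artifact averages then equals the mean of all individual scores. A minimal sketch of that computation, using hypothetical scores rather than the pilot's actual data:

```python
from statistics import mean

# Hypothetical reviewer scores: artifact -> (reviewer A, reviewer B).
# With exactly two reviews per artifact, the mean of per-artifact
# averages equals the mean of all individual scores.
reviews = {
    "artifact_01": (3, 4),
    "artifact_02": (2, 3),
    "artifact_03": (4, 4),
}

individual = [s for pair in reviews.values() for s in pair]
per_artifact = [mean(pair) for pair in reviews.values()]

print(f"mean of individual scores:     {mean(individual):.2f}")    # 3.33
print(f"mean of per-artifact averages: {mean(per_artifact):.2f}")  # 3.33
```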

10. CSUEB Information Literacy Assessment, Spring 2017 (n=120 reviews): distribution of ratings by domain
Domain | Rating=1 | Rating=2 | Rating=3 | Rating=4
Scope | 1 | 20 | 49 | 50
Gather | 13 | 36 | 37 | 34
Evaluate | 17 | 33 | 38 | 32
Synthesize | 6 | 28 | 49 | 37
Communicate | 3 | 22 | 56 | 39
Attribute | 12 | 17 | 47 | 44
[Originally rendered as a 100% stacked bar chart, 0-100%.]
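For reference, a minimal sketch of how the plotted percentage shares follow from the rating counts above (counts are taken from the slide; the rendering is an illustration, not the original chart code):

```python
# Rating counts per domain, from the slide (n=120 reviews each).
counts = {
    "Scope":       [1, 20, 49, 50],
    "Gather":      [13, 36, 37, 34],
    "Evaluate":    [17, 33, 38, 32],
    "Synthesize":  [6, 28, 49, 37],
    "Communicate": [3, 22, 56, 39],
    "Attribute":   [12, 17, 47, 44],
}

# Convert each domain's counts (ratings 1-4) into the percentage
# shares shown in the 100% stacked bar chart.
for domain, ratings in counts.items():
    total = sum(ratings)  # 120 for every domain
    shares = "  ".join(f"{100 * c / total:5.1f}%" for c in ratings)
    print(f"{domain:12s} {shares}")
```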

11. Scope: Identification of Question/Concept/Problem (n=60 artifacts)
[Bar chart: course means for Course 1 (n=6) and Courses 2, 4, 5, 6, 7, 8 (n=9 each), plotted against the institutional mean (3.23) and the competent rubric score (3). Data labels, not mapped to specific courses: 3.72, 3.50, 3.44, 3.22, 2.89, 2.89, 2.83.]

12. Gather: Use of Search Strategies (n=60 artifacts)
[Bar chart: course means for Course 1 (n=6) and Courses 2, 4, 5, 6, 7, 8 (n=9 each), plotted against the institutional mean (2.77) and the competent rubric score (3). Data labels, not mapped to specific courses: 3.67, 3.11, 3.00, 2.61, 2.39, 2.33, 2.00.]

13. Evaluation of Sources (n=60 artifacts)
[Bar chart: course means for Course 1 (n=6) and Courses 2, 4, 5, 6, 7, 8 (n=9 each), plotted against the institutional mean (2.71) and the competent rubric score (3). Data labels, not mapped to specific courses: 3.56, 3.00, 2.89, 2.83, 2.33, 2.22, 1.83.]

14. Synthesize: Connections among Sources (n=60 artifacts)
[Bar chart: course means for Course 1 (n=6) and Courses 2, 4, 5, 6, 7, 8 (n=9 each), plotted against the institutional mean (2.98) and the competent rubric score (3). Data labels, not mapped to specific courses: 3.50, 3.33, 3.11, 3.00, 2.67, 2.58, 2.50.]

15. Communicate: Knowledge and Use of Disciplinary Approaches (n=60 artifacts)
[Bar chart: course means for Course 1 (n=6) and Courses 2, 4, 5, 6, 7, 8 (n=9 each), plotted against the institutional mean (3.09) and the competent rubric score (3). Data labels, not mapped to specific courses: 3.67, 3.17, 3.17, 3.17, 2.94, 2.75, 2.67.]

16. Attribute: Effective, Ethical, and Legal Use of Attribution (n=60 artifacts)
[Bar chart: course means for Course 1 (n=6) and Courses 2, 4, 5, 6, 7, 8 (n=9 each), plotted against the institutional mean (3.30) and the competent rubric score (3). Data labels, not mapped to specific courses: 3.83, 3.33, 3.17, 3.17, 3.11, 2.28, 1.92.]
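Slides 11-16 plot each course mean against a single institutional mean. One way to read that relationship: the institutional mean is the artifact-count-weighted average of the course means. A minimal sketch with hypothetical (mean, count) pairings, since the slides' data labels are not mapped to specific courses:

```python
# Hypothetical (course mean, artifact count) pairs for one rubric domain.
courses = {
    "Course 1": (3.72, 6),
    "Course 2": (3.50, 9),
    "Course 4": (2.83, 9),
}

# Institutional mean = weighted average of course means, with each
# course weighted by the number of artifacts it contributed.
total_n = sum(n for _, n in courses.values())
institutional_mean = sum(m * n for m, n in courses.values()) / total_n
print(f"institutional mean: {institutional_mean:.2f}")
```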

17. Information Literacy ILO, Spring 2017: Rater Consistency across Domains
[Grouped bar chart: for each domain (Scope, Gather, Evaluate, Synthesize, Communicate, Attribute), the number of review pairs whose two ratings differed by 0, 1, 2, or 3 points; each domain's four bars sum to 60 pairs, one pair per artifact. Data labels, not mapped to specific bars: 33, 31, 29, 28, 28, 27, 26, 25, 24, 22, 22, 20, 12, 9, 6, 6, 5, 5, 1, 1, 0, 0, 0, 0.]
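A minimal sketch (hypothetical scores) of the tally behind this chart: each artifact has two reviews, and for each rubric domain the pairs are binned by how far apart the two ratings fall:

```python
from collections import Counter

# Hypothetical paired ratings (reviewer A, reviewer B) for one domain.
paired_scores = [(3, 3), (4, 3), (2, 2), (3, 1), (4, 4), (2, 3)]

# Tally the absolute difference between the two raters for each pair.
differences = Counter(abs(a - b) for a, b in paired_scores)
for diff in range(4):
    print(f"{diff} pt difference: {differences[diff]} pairs")
```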

18. Faculty Feedback on Changes to Rubric
Faculty #1: “No suggestions to clarify or change the rubric other than to infuse it as you see fit to speak to your assignment's main goals.”
Faculty #2: “The Rubric was a good fit for evaluating the project assignment. A vital part of the project assignment required critical thinking for analyses, conclusions and takeaways based on information search. I used the Rubric’s criteria for Synthesize and Communicate to evaluate students’ critical thinking. My one recommendation for further refinement of the Rubric would be to transform the Criteria and descriptions using student-friendly language, to help with students’ deeper understanding of requirements and possibilities.”
Faculty #3: “The rubric works for general assessment of that skill, but for giving feedback I find it best to be more specific to their work.”

19. Institutional Learning Outcomes Information Literacy Pilot Assessment Project: Discussion & Questions
• What changes should be made to the Information Literacy rubric and/or assessment process to improve ILO assessment?
• Were there any challenges in assessing the “Evaluate,” “Synthesize,” and/or “Communicate” rubric domains?
• The graduate course was excluded from analysis. How should graduate ILO assessments be structured in the future?
