How Computer Skills Testing Improved Our Student Success Rates


  1. Gateway Technical College: How Computer Skills Testing Improved Our Student Success Rates. Presenters: Raymond Koukari, Jr., Gateway Technical College; Jodi Noll, Labyrinth Learning

  2. AGENDA • Our Problem/Background – Driving Issue • Initial Steps • Refining the Solution • Testing • Creation of Multiple Courses • Implementation • Process • Challenges • Outcomes • Recommendations

  3. Driving Issue – Core Ability: “All students will demonstrate essential computer skills by graduation.” Accreditation: the HLC and ACBSP processes showed that accreditors want proof. Are you doing what you say?

  4. Driving Issue – High Course Failure Rate: 21.5% of students were not completing our PC Basics course. “Why put students into a course that they aren't ready for and where the odds of succeeding are against them?” – Wendy Revolinski

  5. What did we do? Instructor Meetings: Why the high failure rate?

  6. Conclusion: Unprepared/unskilled students. The wide range of skills left some students unprepared, confused, or bored. Needs identified: • A new, developmental course • A method to determine student placement

  7. What did we do? Determined requirements for a tool to test students for course placement: √ Agile √ Office suite √ Gmail √ Versions √ Mirrors the campus environment

  8. What did we do? Consulted the experts √ What should be tested √ Assessment design √ Technical solution √ Testing Process

  9. Test Design – What should we test? Started with recommendations from vendor partner (Labyrinth). Three rounds of feedback/design assessment: • Internal review • Student testing: on-campus beta (summer course, 9 sections, all students), with a custom URL that has all the details for them to check out later • Administrative assistants

  10. Test Timing & Location – When and where to take the test? Concern: extended testing might be too long. • Location selected: Testing Center • Coordinated with Compass testing • Following the Compass test, students click a link and are taken to the digital literacy test • Started testing April 2013. Results: Most students tested in just 20 minutes; a few tested up to an hour.

  11. Solution Details • All students in technical and associate degree programs tested • 2 new courses: 3 CR Computers for Professionals and 1 CR Basic Computing (NEW) • 3 placement options (see the sketch below):

      Score    | Basic Computing | Computers for Professionals | Credit for Prior Learning
      0 – 49   |        X        |                             |
      50 – 89  |                 |              X              |
      90 – 100 |                 |                             |             X
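The placement mapping in the table above is simple enough to express directly. A minimal sketch, assuming only the score bands from the slide (the function name and return labels are illustrative, not Gateway's registration codes):

```python
def place_student(score: int) -> str:
    """Map a digital literacy assessment score (0-100) to one of the
    three placement options from the slide above."""
    if not 0 <= score <= 100:
        raise ValueError(f"score must be between 0 and 100, got {score}")
    if score <= 49:
        return "Basic Computing (1 CR)"               # 0-49: developmental course
    if score <= 89:
        return "Computers for Professionals (3 CR)"   # 50-89: standard course
    return "Credit for Prior Learning (test-out)"     # 90-100: test-out credit
```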

  12. Gaining Institutional Support • Issue has impact across all divisions • Failure in early classes = higher dropout rates • Broader problem involving all students • GOAL: Help students be more successful, not determine admission. Stakeholders: faculty, librarians, administrators, admissions, counselors, the Assessment Center, students, and the provost. PROVOST SUPPORT RECEIVED

  13. Determining Funding – Dean of Enrollment Services (responsible for all testing centers). Compass testing (reading, writing, math) carries a $20 fee, with about $5 of wiggle room between the fee and actual cost; the computer assessment costs $3. Plan A: Increase the fee to cover the cost. Plan B: No increase in fee for students (the $3 assessment fits within the $5 margin).

  14. Our Data – First Trials, Fall 2013 Semester

      Course                      | Code    | # Students | # Sect
      Basic Computing             | 103-142 | 30         | 1.67
      Computers for Professionals | 103-143 | 134        | 7.44
      Test-out Credit             | CFPL    | 13         |
      Total # Scores              |         | 177        |

  15. Our Data – Fall & Spring Semesters

      Fall 2013 Semester
      Course                      | Code    | # Students | # Sect | % Overall
      Basic Computing             | 103-142 | 206        | 11.44  | 15%
      Computers for Professionals | 103-143 | 1044       | 58.00  | 78%
      Test-out Credit             | CFPL    | 85         |        | 6%
      Total # Scores              |         | 1335       |        |

      Spring 2014 Semester
      Course                      | Code    | # Students | # Sect | % Overall
      Basic Computing             | 103-142 | 271        | 15.06  | 15%
      Computers for Professionals | 103-143 | 1420       | 78.89  | 78%
      Test-out Credit             | CFPL    | 119        |        | 7%
      Total # Scores              |         | 1810       |        |

  16. Critical Considerations/Recommendations • Early involvement of all divisions: share the vision & implementation plan, include others early in the planning phase, pass information along to all, and be clear on goals and plan • Engage a wide audience to gather support: business & IT get it, but not the whole college, so try to win over the broader audience • Gain commitment to change from faculty, senior management, and the provost (the provost has to be on board) • Data is key to gaining commitment

  17. Critical Considerations/Recommendations • Get a sponsor in the test area • Seek IT support: students click another link and go into the computer assessment, put another “test field” in the system, and use an API to automate the integration (about 15 minutes; see the sketch below) • Conduct a pilot: test the process and test student results
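The slide only says an API was used to automate the integration; the presentation does not document Labyrinth's actual endpoints. As an illustration only, here is a minimal sketch of that kind of automation, where the URL, token, and JSON fields are all invented placeholders:

```python
import requests

# Hypothetical endpoint and credential -- placeholders, not Labyrinth's real API.
ELAB_API = "https://elab.example.com/api/v1"
API_TOKEN = "replace-with-real-token"

def fetch_assessment_score(student_id: str) -> int | None:
    """Pull a student's digital literacy score so it can be written into
    the extra "test field" in the campus system automatically."""
    resp = requests.get(
        f"{ELAB_API}/students/{student_id}/score",
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=10,
    )
    if resp.status_code == 404:
        return None  # student has not taken the assessment yet
    resp.raise_for_status()
    return resp.json()["score"]
```

Paired with the placement function sketched earlier, a script along these lines could move a score from the testing platform into the student record without manual entry.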

  18. Sound Bites from Instructors “Students feel more comfortable coming into Computers for Professionals. As an instructor, providing that basic class has made my starting days so much easier in Computers for Professionals. Most of the students come in with the basics, and I can start right in teaching what is supposed to be taught in that class.” – C. Wooster “It allows me to give extra attention to the students who are just beginning, and I no longer feel like I'm neglecting the students who need more help in getting started.” – A. Perkins “I think the computer skills assessment has greatly helped Computers for Professionals. I wish it would have been implemented years ago. It was very difficult to teach this course when you have people who have no computer skills at all – yes, they are out there.” – W. Revolinski

  19. Testing Platform – Labyrinth Learning eLab SET • Over 2000 questions covering the Internet, email, computer concepts, Microsoft Office • Realistic simulation environment mimics real world • Select questions and create your own • Flexible test implementation and student registration • Track and analyze scores by student or group • Affordable license keys • Unlimited educator access • Responsive customer support

  20. Testing Platform - Labyrinth Learning DEMO

  21. Outcomes – Direct Results: • 90% of instructors said the students were better prepared • Measured key “core ability” statement • Reduced course failure rates • Achieved higher retention rate. Student Feedback: • Most students were not surprised by their results • They had a sense of relief that they would be placed in a course that was right for them

  22. Outcomes – Prove it! • Source of Actionable Data • Accrediting agencies • Placement • Course improvements

  23. Justification – Interface Grant

  24. Ongoing Process – Future: • 18-month post-test (11/14) • Comparison of students' pre-test and post-test results • Refine the assessment with survey feedback
