

  1. Language Proficiency Assessment. Seumas Rogan, Chief, Test Design & Analysis

  2. DEFENSE LANGUAGE INSTITUTE FOREIGN LANGUAGE CENTER Mission Statement: Develop, validate, administer, sustain, and assess results of standardized language proficiency tests; educate on standards and evaluate student feedback, all in support of the Defense Foreign Language Program.

  3. DEFENSE LANGUAGE INSTITUTE FOREIGN LANGUAGE CENTER Priorities • Sustainment of deployed assessments. • Validity and reliability of all test modalities. • Alignment of the workforce with the mission and strategy. • Stabilization and control of the business processes. • Definition/implementation of the next generation of assessments.

  4. DEFENSE LANGUAGE INSTITUTE FOREIGN LANGUAGE CENTER LPAD Org Chart
  • Kalman Weinfeld: Director, Language Proficiency Assessment
  • Dr. Pradyumna Amatya: Deputy Director
  • Stakeholder Relations; Technology; Evaluations Division (EV); Susan Hagan
  • Dr. Tom Parry: Oral Proficiency Standards Division (PSD)
  • Brent Eickholt: Test Management Division (TM)
  • Dr. Gerd Brendel: Test Review and Education Division (TRE)
  • Dr. Chung Yao Kao: Test Production Division (TP)
  • Dr. Seumas Rogan: Test Analysis and Design Division (TAD)

  5. DEFENSE LANGUAGE INSTITUTE FOREIGN LANGUAGE CENTER Summary of LPAD Functions • TP: Produce language proficiency assessments in Listening and Reading Comprehension. • TRE: Review test items and train faculty and staff in the ILR. • TAD: Design and analyze the performance of language proficiency assessments. • PSD: Certify and manage the performance of Oral Proficiency Interview (OPI) testers. • TM: Schedule and administer DLI graduation tests and external OPI. • EV: Survey students and report statistics and red flags. • Technology: Design and maintain automated solutions for language testing business processes.

  6. DEFENSE LANGUAGE INSTITUTE FOREIGN LANGUAGE CENTER DLPT Lifecycle (cycle diagram): stages include Needs Analysis, Test Specs, Framework, Development, Pretest & Analyze, Standard Setting, Deploy, Test, Implementation, Maintain, Evaluate, and Lessons Learned, with QA/QC and Ethical Practices at the center.

  7. DEFENSE LANGUAGE INSTITUTE FOREIGN LANGUAGE CENTER Test Production • Test platforms: – Multiple Choice (MCT) – high volume (>200). – Constructed Response (CRT) – low volume. • Standard development latency: ~30 months. • Standard development cost (Listening + Reading): ~$1.25M. • Sustainment and Computer Adaptive Tests require pools of characterized items: – Generated via automatic seeding of new items in the released test forms.

  8. DEFENSE LANGUAGE INSTITUTE FOREIGN LANGUAGE CENTER Test Production • Automatic Seeding Status: – Multiple Choice Tests (MCT): online for 6 languages (Chinese-Mandarin, Modern Standard Arabic, Pashto, Persian Farsi, Russian, Spanish); 4 additional languages scheduled per year. – Constructed Response Tests (CRT): a reduction in scored items (via a reduction in scored levels) is required to enable seeding. • Scheduled Test Releases (2015-2017): – 9 MCT: Urdu, Chinese-Cantonese, Tagalog, Portuguese, German, Hindi, Thai, Swahili, Vietnamese – 6 CRT: Haitian-Creole, Yoruba, Kazakh, Amharic, Hausa, Malay • Computer Adaptive Tests (CAT): – Planning implementation for 13 DLI high-volume languages.
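
To make the seeding idea concrete, here is a minimal sketch (in Python) of mixing unscored pretest items into a released form so that response data accumulate without affecting examinee scores. The item fields, counts, and seeding ratio are illustrative assumptions, not DLPT specifications.

```python
# Minimal sketch of seeding unscored pretest items into an operational form.
# All item fields, counts, and the seeding ratio are illustrative assumptions,
# not DLPT specifications.
import random
from dataclasses import dataclass

@dataclass
class Item:
    item_id: str
    scored: bool          # operational (scored) vs. seeded pretest (unscored)
    key: str              # correct option for a multiple-choice item

def assemble_form(operational, pretest_pool, n_seeds=5, seed=0):
    """Mix a few unscored pretest items into the released form."""
    rng = random.Random(seed)
    seeds = rng.sample(pretest_pool, n_seeds)
    form = operational + seeds
    rng.shuffle(form)      # examinees cannot tell scored from seeded items
    return form

def score(form, responses):
    """Score only operational items; keep pretest responses for later calibration."""
    raw = sum(1 for it in form if it.scored and responses.get(it.item_id) == it.key)
    pretest_data = {it.item_id: responses.get(it.item_id)
                    for it in form if not it.scored}
    return raw, pretest_data

operational = [Item(f"OP{i}", True, "A") for i in range(60)]
pretest_pool = [Item(f"PT{i}", False, "B") for i in range(40)]
form = assemble_form(operational, pretest_pool)
```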

  9. DEFENSE LANGUAGE INSTITUTE FOREIGN LANGUAGE CENTER Test Review and Education • Conduct independent target language reviews of each test-item set for correctness, completeness, appropriateness, and adherence to specified ILR levels. • Provide Text Typology and Passage Rating training under the ILR Guidelines to all DLPT5 test developers, independent reviewers, and DLIFLC faculty (by request). • Represent DLI at the ILR testing work-group to ensure interoperability among various government agencies and international partners.

  10. DEFENSE LANGUAGE INSTITUTE FOREIGN LANGUAGE CENTER Test Analysis & Design • Coordinate DLPT design initiatives: – DLPT5 Validity Framework • What do DLPT scores mean? How should they be used? – Item Bank Specification • Are we asking the right questions? – Web-Based Field Testing • Can we obtain a representative sample of examinees to calibrate items? – Small-n Standard Setting Study • How do we set passing scores without item parameter data?
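
As a rough illustration of why field-test responses matter for calibration, the sketch below turns proportions correct from a field test into centered logit difficulties (a PROX-style starting estimate). Operational DLPT5 calibration uses full IRT estimation; this only shows the basic idea, and the numbers are made up.

```python
# Rough sketch: turning field-test proportions correct into Rasch-style logit
# difficulties (a PROX-type starting estimate). Operational calibration uses
# full IRT estimation; this only illustrates why field-test response data are
# what make item calibration possible. The proportions below are invented.
import numpy as np

def logit_difficulties(p_correct):
    """p_correct: proportion of field-test examinees answering each item correctly."""
    p = np.clip(np.asarray(p_correct, dtype=float), 0.01, 0.99)
    b = np.log((1 - p) / p)        # harder items -> higher logit difficulty
    return b - b.mean()            # center the scale (an arbitrary origin choice)

print(logit_difficulties([0.85, 0.60, 0.35, 0.20]))
```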

  11. DEFENSE LANGUAGE INSTITUTE FOREIGN LANGUAGE CENTER Highlight: Design • Issue – Lack of field testing participation for DLPT5s in Cantonese, Tagalog, Hindi, German – Results in: • Examinee proficiency misclassification • Redress – Web-Based Field Testing

  12. DEFENSE LANGUAGE INSTITUTE FOREIGN LANGUAGE CENTER Test Analysis & Design • Conduct statistical analysis of: – Item response data • Do test items perform within specified tolerances? – Test form reliability • Are results on test forms equivalent? – Standard setting cut score recommendations • Are cut scores fair, reliable, and valid? – Examinee comments/feedback • Have examinees expressed actionable concerns? – Item bank characteristics • What is the distribution of content across the range of difficulty?
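
A minimal sketch of the kind of item-level and form-level statistics such an analysis involves: classical item difficulty, corrected item-total discrimination, and Cronbach's alpha as a reliability index. The simulated response matrix and the flagging thresholds are assumptions for illustration, not DLPT tolerances.

```python
# Minimal sketch of classical item analysis and form reliability on a 0/1
# response matrix (examinees x items). The data are simulated and the flagging
# thresholds are illustrative, not DLPT specifications.
import numpy as np

def item_analysis(X):
    X = np.asarray(X, dtype=float)
    p = X.mean(axis=0)                                  # item difficulty (proportion correct)
    disc = []
    for j in range(X.shape[1]):
        rest = np.delete(X, j, axis=1).sum(axis=1)      # total score excluding item j
        disc.append(np.corrcoef(X[:, j], rest)[0, 1])   # corrected item-total correlation
    return p, np.array(disc)

def cronbach_alpha(X):
    X = np.asarray(X, dtype=float)
    k = X.shape[1]
    item_var = X.var(axis=0, ddof=1).sum()
    total_var = X.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(1)
ability = rng.normal(size=300)
difficulty = rng.normal(size=40)
prob = 1 / (1 + np.exp(-(ability[:, None] - difficulty[None, :])))
X = (rng.random(prob.shape) < prob).astype(int)         # simulated Rasch-like responses

p, disc = item_analysis(X)
print("flagged items:", np.where((p < 0.2) | (p > 0.9) | (disc < 0.15))[0])
print("Cronbach's alpha:", round(cronbach_alpha(X), 3))
```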

  13. DEFENSE LANGUAGE INSTITUTE FOREIGN LANGUAGE CENTER Highlight: Analysis (1) • Issue – Questionable standard setting panelist recommendations (figure: density plots of number-correct cut recommendations, NC Cuts 0-60, for ILR Levels 1+, 2, 2+, and 3) – Results in: • Examinee proficiency misclassification • Redress – Small-n standard setting study – Additional field testing data
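
A small sketch of how panelist number-correct (NC) cut recommendations might be summarized in a small-n standard setting study; the panelist values are invented, and a wide spread at a level is what would flag questionable recommendations.

```python
# Minimal sketch of summarizing panelists' number-correct (NC) cut
# recommendations per ILR level in a small-n standard setting. The panelist
# values below are made up for illustration.
import statistics

panelist_cuts = {                      # hypothetical recommendations (NC out of 60)
    "ILR 1+": [18, 20, 22, 19, 25],
    "ILR 2":  [30, 33, 29, 35, 31],
    "ILR 2+": [41, 44, 40, 47, 43],
    "ILR 3":  [52, 55, 50, 57, 53],
}

for level, cuts in panelist_cuts.items():
    med = statistics.median(cuts)
    spread = statistics.stdev(cuts)    # wide spread flags questionable consensus
    print(f"{level}: recommended cut = {med} NC, panelist SD = {spread:.1f}")
```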

  14. DEFENSE LANGUAGE INSTITUTE FOREIGN LANGUAGE CENTER Highlight: Analysis (2) • Issue – DLIFLC faculty question why few examinees are awarded ILR 2+ on the Korean DLPT5 • Redress – Demonstrate that, consistent with test specifications, maximum score precision is concentrated at ILR levels 2/2+/3
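
One way to see this claim is through the standard IRT test information function, which shows where on the ability scale a form measures most precisely. The sketch below uses made-up 2PL item parameters concentrated around a target range; it illustrates the concept, not the actual Korean DLPT5 parameters.

```python
# Sketch of the standard IRT test information function, which shows where on
# the ability scale a form measures most precisely. Item parameters are made
# up; the point is only that information can be concentrated, by design, on
# the ability range corresponding to the test's target levels.
import numpy as np

def test_information(theta, a, b):
    """2PL test information at abilities theta for items with discrimination a, difficulty b."""
    theta = np.asarray(theta)[:, None]
    p = 1 / (1 + np.exp(-a * (theta - b)))         # 2PL probability of a correct response
    return (a**2 * p * (1 - p)).sum(axis=1)        # item information summed over items

rng = np.random.default_rng(2)
a = rng.uniform(0.8, 1.6, size=60)                 # discriminations
b = rng.normal(loc=1.0, scale=0.6, size=60)        # difficulties centered on the target range
theta = np.linspace(-3, 3, 7)
for t, info in zip(theta, test_information(theta, a, b)):
    print(f"theta={t:+.1f}: information={info:.1f}")
```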

  15. DEFENSE LANGUAGE INSTITUTE FOREIGN LANGUAGE CENTER Proficiency Standards Division • Train and certify select faculty as OPI testers at DLIFLC. • Provide orientation training for DLIFLC faculty on the ILR and OPI. • Ensure appropriate interpretation and uniform implementation of the ILR at DLIFLC and the DLIFLC contract entities. • Provide quality assurance that OPI testers (DLIFLC and contract) are providing consistently fair and accurate assessments.

  16. DEFENSE LANGUAGE INSTITUTE FOREIGN LANGUAGE CENTER Test Management Division • Schedule, test, and report results for DLIFLC resident students. • Grade all Constructed Response Tests administered. • FY14 volume: DLPT (RC+LC) @ DLI: 8,949 (Worldwide: ~123,000); OPI: DLI: 3,407, External: 15,463; ICPT: 10,745; CRT gradings: 10,193. • Test capacity: 5 DLPT test labs; 7 OPI studios. – New semester-based scheduling requires 3 additional OPI studios.

  17. DEFENSE LANGUAGE INSTITUTE FOREIGN LANGUAGE CENTER Evaluation Division • Provide DLIFLC leadership with valid and reliable evaluative information. • Outcomes (FY14): – 5,330 ISQ/ESQ Evaluation surveys. – 121 Red Flag reports. – 198 Snapshot reports. – 256 Attrition surveys. – 394 Non-Resident surveys.

  18. DEFENSE LANGUAGE INSTITUTE FOREIGN LANGUAGE CENTER Technology • DOMINO (Test Item Development and Workflow Automation System) – Centralized and version-controlled system hosting 49 DLPT5 development projects • 52 active users • 7,670 test passages; 11,464 test items – Enforces standardized workflow and task assignment processes – Expanding for the contractor model of DLPT item development, psychometric support, and direct reporting
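
As an illustration of what enforcing a standardized workflow means in such a system, here is a hypothetical sketch of a small item-development state machine; the stage names and transitions are invented examples, not DOMINO's actual configuration.

```python
# Illustrative sketch of enforcing a standardized item-development workflow.
# The stage names and transitions are hypothetical examples, not DOMINO's
# actual configuration.
ALLOWED_TRANSITIONS = {
    "draft":               {"language_review"},
    "language_review":     {"revision", "psychometric_review"},
    "revision":            {"language_review"},
    "psychometric_review": {"revision", "approved"},
    "approved":            set(),                 # terminal state
}

def advance(item_state, next_state):
    """Reject any transition the workflow does not permit."""
    if next_state not in ALLOWED_TRANSITIONS[item_state]:
        raise ValueError(f"illegal transition: {item_state} -> {next_state}")
    return next_state

state = "draft"
state = advance(state, "language_review")
state = advance(state, "psychometric_review")
state = advance(state, "approved")
```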

  19. DEFENSE LANGUAGE INSTITUTE FOREIGN LANGUAGE CENTER • SharePoint – Platform for ongoing DLPT5 External Review contract • 150 active users • 5,008 test items – Repository of previously contracted DLPT5 items • 32,952 test items • TDMS (Test Management System) – Automatic scheduling and scoring processes for OPI and DLPT – Repository of score data – Expanding to support psychometric item-response data requirements

  20. DEFENSE LANGUAGE INSTITUTE FOREIGN LANGUAGE CENTER A few words about the ILR • Interagency Language Roundtable – Dates to the 1950s – Skill level descriptions for • Speaking, Listening, Reading – Used as the primary reference by US Government agencies – http://www.govtilr.org/

  21. DEFENSE LANGUAGE INSTITUTE FOREIGN LANGUAGE CENTER ILR Proficiency Scale • Set of general language proficiency descriptions indicating what language learners CAN or CANNOT do using the target language – i.e., Knowledge, Skills & Abilities (KSAs) of/with the target language • 6 "base" levels, and "plus (+)" levels in between to indicate that proficiency exceeds one level but is not sustained at the next level • Used by DLIFLC in developing DLPT5 and classifying the results

  22. DEFENSE LANGUAGE INSTITUTE FOREIGN LANGUAGE CENTER ILR Proficiency Scale
  • 5 = Functional Native Proficiency
  • 4+ = Advanced Professional Proficiency, Plus
  • 4 = Advanced Professional Proficiency
  • 3+ = General Professional Proficiency, Plus
  • 3 = General Professional Proficiency
  • 2+ = Limited Working Proficiency, Plus
  • 2 = Limited Working Proficiency
  • 1+ = Elementary Proficiency, Plus
  • 1 = Elementary Proficiency
  • 0+ = Memorized Proficiency
  • 0 = No Proficiency
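
For analysis purposes, the plus levels can be treated as ordinal steps between the base levels. The helper below is a hypothetical convenience for comparing or tabulating ILR results, not an official ILR artifact.

```python
# Tiny helper (hypothetical, not an official ILR artifact) that puts ILR level
# labels, including plus levels, on an ordinal scale so results can be
# compared or tabulated.
ILR_ORDER = ["0", "0+", "1", "1+", "2", "2+", "3", "3+", "4", "4+", "5"]
ILR_RANK = {level: i for i, level in enumerate(ILR_ORDER)}

def meets(attained: str, required: str) -> bool:
    """True if an attained ILR level is at or above a required level."""
    return ILR_RANK[attained] >= ILR_RANK[required]

print(meets("2+", "2"))   # True: 2+ exceeds level 2 but is not sustained level 3
print(meets("1+", "2"))   # False
```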

  23. DEFENSE LANGUAGE INSTITUTE FOREIGN LANGUAGE CENTER ILR Proficiency Scale (figure: the full 0 to 5 scale with the focus of the Lower Range DLPT5 marked on the lower portion of the scale, levels 0+ through 3)
