Working with What Works Clearinghouse Standards to Evaluate Designs for Broadening Participation

Kate Winter (Kate Winter Evaluation, LLC) & Eva Fernández, Sabrina Avila, Patrick Johnson, & Jennifer Valad (CUNY Queens College)

2018 Transforming STEM Higher Education
Association of American Colleges & Universities ~ Network for Academic Renewal
November 10, 2018
Overview

● Background
● What Works Clearinghouse (WWC)
● At your tables: Case studies
● Group share-out
● At your tables: Implications for planned proposals or projects in progress
● Resources
Background
Context: STEM Bridges Project

RFP, Title III HSI-STEM program, US Dept. of Education: no specification about WWC Evidence Standards or quantitative methods, but specified that the evaluation design would be judged on the extent to which:

(1) The data elements and the data collection procedures are clearly described and appropriate to measure the attainment of activity objectives and to measure the success of the project in achieving the goals of the comprehensive development plan (up to 5 points);
(2) The data analysis procedures are clearly described and are likely to produce formative and summative results on attaining activity objectives and measuring the success of the project on achieving the goals of the comprehensive development plan (up to 5 points); and
(3) The evaluation will provide guidance about effective strategies suitable for replication or testing in other settings (up to 5 points).

(US Department of Education, 2016)
STEM Bridges: The Proposal
http://hsistem.qc.cuny.edu

Two Goals:
(a) graduate more Hispanic & low-income students
(b) develop articulation agreements for QCC-to-QC transfer

Three Activities:
❶ Improve access: redesign STEM “landing” courses
❷ Improve learning: learning collectives
❸ Bridge: cross-campus model for transfer student success
❶ Improve Access: Redesign 20 STEM-landing courses (up to 7,000 students annually in treatment-group courses)
❷ Improve Learning: Develop “learning collectives” where peers mentor students
❸ Bridge: Articulation agreements and assessment processes for all STEM majors, QCC → QC
STEM Bridges: Evaluation Design

● Cluster Randomized Controlled Trial (RCT), intended to meet WWC evidence standards without reservations
● Eligible sections randomly assigned to treatment or control; some treatment sections randomly assigned to learning collectives (txt+)
● Exploring “landing” course GPA, “gateway” course GPA, term-to-term retention, time to graduation
● Hierarchical modeling nesting students within sections, using baselines as covariates (sketched after this slide)
● Two sites, large sample, multiple disciplines
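A minimal sketch of the kind of hierarchical model this design implies, in Python with statsmodels. The column names (gpa, treatment, hs_gpa, section_id) and the simulated data are illustrative assumptions, not the project's actual variables: a random intercept for each section captures students nested within sections, and a baseline covariate enters as a fixed effect.

    # Hedged sketch: simulated data standing in for the project's real records
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n_sections, n_students = 40, 25
    section = np.repeat(np.arange(n_sections), n_students)
    treatment = np.repeat(rng.integers(0, 2, n_sections), n_students)  # section-level assignment
    hs_gpa = rng.normal(3.0, 0.4, section.size)                        # hypothetical baseline covariate
    section_effect = np.repeat(rng.normal(0, 0.2, n_sections), n_students)
    gpa = 2.5 + 0.15 * treatment + 0.5 * (hs_gpa - 3.0) + section_effect + rng.normal(0, 0.5, section.size)
    df = pd.DataFrame({"gpa": gpa, "treatment": treatment, "hs_gpa": hs_gpa, "section_id": section})

    # Random intercept for section reflects students nested within sections
    result = smf.mixedlm("gpa ~ treatment + hs_gpa", data=df, groups="section_id").fit()
    print(result.summary())  # the treatment coefficient estimates the section-level effect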
WWC Evidence Standards
How to Meet WWC Evidence Standards

TLAs:
WWC: What Works Clearinghouse
RCT: Randomized Controlled Trial
RDD: Regression Discontinuity Design
QED: Quasi-Experimental Design
ITT: Intention To Treat

● Without reservations
  ○ RCTs or RDDs* with low attrition
● With reservations
  ○ RCTs with high attrition
  ○ QEDs with baseline equivalence, RDDs*
● All RCTs must follow an ITT protocol and use approved baseline measures
● All studies must be of appropriate interventions with included populations and approved outcomes

*Have additional requirements and are not discussed today.
Attrition

● RCTs with low attrition meet WWC Evidence Standards with no reservations
● RCTs with high attrition must demonstrate baseline equivalence (see the sketch below)
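For high-attrition RCTs (and QEDs), the WWC judges baseline equivalence on the standardized difference between groups on an approved baseline measure. A minimal sketch of that check, assuming a continuous baseline and the WWC handbook's effect-size bands (≤ 0.05 satisfies; 0.05–0.25 requires statistical adjustment; > 0.25 fails):

    import numpy as np

    def baseline_equivalence(treat, comp):
        """Standardized baseline difference (Hedges' g), classified
        against the WWC baseline-equivalence bands."""
        treat, comp = np.asarray(treat, float), np.asarray(comp, float)
        n_t, n_c = treat.size, comp.size
        # Pooled standard deviation across the two groups
        sp = np.sqrt(((n_t - 1) * treat.var(ddof=1) + (n_c - 1) * comp.var(ddof=1)) / (n_t + n_c - 2))
        g = (treat.mean() - comp.mean()) / sp
        g *= 1 - 3 / (4 * (n_t + n_c) - 9)  # small-sample correction
        if abs(g) <= 0.05:
            verdict = "satisfies baseline equivalence"
        elif abs(g) <= 0.25:
            verdict = "requires statistical adjustment for the baseline"
        else:
            verdict = "does not satisfy baseline equivalence"
        return g, verdict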
What is “Low Attrition” versus “High Attrition”?

[Chart comparing the conservative and liberal attrition thresholds.]

*Liberal thresholds are used in postsecondary education reviews.
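The thresholds combine two quantities: overall attrition (the share of the randomized sample missing from the analytic sample) and differential attrition (the gap between the two arms). A minimal sketch of the arithmetic, with illustrative numbers:

    def attrition_rates(n_rand_t, n_analyzed_t, n_rand_c, n_analyzed_c):
        """Overall and differential attrition relative to the randomized sample."""
        attr_t = 1 - n_analyzed_t / n_rand_t
        attr_c = 1 - n_analyzed_c / n_rand_c
        overall = 1 - (n_analyzed_t + n_analyzed_c) / (n_rand_t + n_rand_c)
        differential = abs(attr_t - attr_c)
        return overall, differential

    # e.g., 500 students randomized per arm, 430 vs. 470 in the analytic sample:
    overall, diff = attrition_rates(500, 430, 500, 470)
    print(f"overall {overall:.1%}, differential {diff:.1%}")  # compare to the chosen WWC boundary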
Challenges

● Institutions are inexperienced with experimental designs
  ○ FERPA, IRB consent process
  ○ Faculty reluctance
  ○ Project personnel still figuring out monitoring processes
● Faculty unfamiliar with RCT structure
  ○ Compliance, implementation fidelity, and sample size issues
● Implementing experimental designs in small departments
● Access to institutional data, analytic sample sizes, & statistical power (power sketch after this slide)
● Avoiding attrition
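On the statistical-power point above: because sections, not students, are randomized, the required sample must be inflated by a design effect before it supports the usual power targets. A hedged sketch, with assumed values for the effect size, intraclass correlation, and section size:

    from statsmodels.stats.power import TTestIndPower

    effect_size = 0.25   # minimum detectable effect in SD units (assumption)
    icc = 0.05           # intraclass correlation for course sections (assumption)
    cluster_size = 25    # students per section (assumption)

    # Students per arm if individually randomized, at 80% power and alpha = .05
    n_individual = TTestIndPower().solve_power(effect_size=effect_size, power=0.8, alpha=0.05)

    # Design effect inflates n to account for students nested in sections
    deff = 1 + (cluster_size - 1) * icc
    n_cluster = n_individual * deff
    print(f"n per arm: {n_individual:.0f} (individual) -> {n_cluster:.0f} (clustered)")
    print(f"~{n_cluster / cluster_size:.0f} sections per arm")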
Case Studies

The following case studies are real. At the request of the survivors, the names have been changed. Out of respect for the dead, the rest has been told exactly as it occurred. (Adapted from Fargo, 1996)

Annotated Version + Glossary of Terms: https://goo.gl/99NXkX
Case 1: Fall Term
Case 2: Spring Term
Implications
Applying Our Lessons Learned

● Build fault tolerance in at the proposal stage:
  ○ Plan to exclude MANY sections each term
  ○ Use conservative N for estimates
● Develop resources to prepare faculty to participate in an RCT
● Spend the first term (or the first year) “beta-testing” all processes: faculty recruitment, sampling, implementation monitoring, data acquisition, data analysis
● Let your evaluator play “bad cop” and hold the line on the WWC rules
● Closely monitor treatment implementation so your evaluator can conduct formative ToT (treatment-on-the-treated) analysis (see the sketch after this list)
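On the last bullet: when assignment is random but compliance is imperfect, a common way to turn an ITT estimate into a treatment-on-the-treated (ToT) estimate is the Bloom (1984) adjustment, which divides the ITT effect by the compliance rate, assuming no control-group crossover. A minimal sketch; the figures are illustrative, not project results:

    def tot_effect(itt_effect, compliance_rate):
        """Bloom adjustment: ITT effect scaled by the share of assigned
        students who actually received the treatment (no control crossover)."""
        return itt_effect / compliance_rate

    # e.g., an ITT effect of 0.10 GPA points with 80% of treatment-section
    # students actually receiving the redesigned course activities:
    print(tot_effect(0.10, 0.80))  # 0.125 GPA points for treated students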
Resources

● RCT Basics: https://emj.bmj.com/content/20/2/164
● WWC Homepage: https://ies.ed.gov/ncee/wwc/FWW
● WWC Handbooks: https://ies.ed.gov/ncee/wwc/Handbooks
● WWC Postsecondary Review Protocols: https://ies.ed.gov/ncee/wwc/Handbooks#protocol
● WWC Webinars: https://ies.ed.gov/ncee/wwc/Handbooks#webinars
● Common Guidelines: https://ies.ed.gov/pdf/CommonGuidelines.pdf
● Slides: https://goo.gl/WXXQpJ
● Handouts: https://goo.gl/99NXkX
Thanks!

Kate Winter (kate@katewinterevaluation.com)
http://www.katewinterevaluation.com

Eva Fernández (eva.fernandez@qc.cuny.edu) & Sabrina Avila, Patrick Johnson, Jennifer Valad
http://hsistem.qc.cuny.edu

Slides: https://goo.gl/WXXQpJ
Handouts: https://goo.gl/99NXkX