
Decision Support System CISOA & 3CBG Conference 2014 February 25 - PowerPoint PPT Presentation



  1. Decision Support System
  CISOA & 3CBG Conference 2014, February 25th, 2014
  Daniel Lamoree, Sr. Systems Analyst/Programmer
  Mt. San Antonio College, dlamoree@mtsac.edu

  2. Learning Objectives
  1. How Mt. SAC calculates FTES targets
  2. How Mt. SAC decides which sections to add or cut
  3. How Mt. SAC deans develop prospective schedules

  3. Mt. SAC Story: Lost FTES

  4. Scheduling 2014-2015 Overview
  — Top-Down Approach
  — Get Annual FTES Target
  — Distribute Annual Target between CR, ENHC_NC, NC
  — Grow only Credit? Distribute as before?
  — Distribute CR, ENHC_NC, NC among Terms
  — Grow Summer (yes, please)? Fall? Winter? Spring?
  — Distribute FTES among Divisions . . .

  5. Annual Targeting — Example
  — Funded FTES for Prior Year = 29371.99
  — Growth = 3.5%
  — Unfunded FTES for Prior Year = 400
  — ((29371.99 × 1.035) − 400) ≈ 30000
  — CR: 27000 (90%); ENHC_NC: 2400 (8%); NC: 600 (2%)
  — 10% Summer; 42% Fall; 8% Winter; 40% Spring
  — Of 10% Summer: 36.22% HSS; . . .
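The targeting arithmetic on this slide can be reproduced in a few lines. This is a minimal sketch, not Mt. SAC's actual process; the function and variable names are invented, and the percentage splits are the example figures from the slide.

```python
# Sketch of the annual FTES targeting arithmetic (illustrative names only).

def annual_target(funded_prior, growth_rate, unfunded_prior):
    """Annual FTES target: grow prior-year funded FTES, then back out unfunded FTES."""
    return funded_prior * (1 + growth_rate) - unfunded_prior

target = annual_target(29371.99, 0.035, 400)   # approximately 30000 FTES

# Distribute across instructional categories (example splits: 90/8/2).
cr, enhc_nc, nc = target * 0.90, target * 0.08, target * 0.02

# Distribute across terms (example splits: 10/42/8/40).
terms = {t: target * p for t, p in
         {"Summer": 0.10, "Fall": 0.42, "Winter": 0.08, "Spring": 0.40}.items()}

# A division's slice of a term, e.g. HSS at 36.22% of Summer.
hss_summer = terms["Summer"] * 0.3622
```

The same cascade continues downward: each division's term target is split again among departments, which is what makes the top-down approach workable as a spreadsheet-style calculation.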

  6.-9. Annual Targeting

  10. Annual Targeting — Just one catch . . .

  11. Impossible? Failure?

  12. Minimizing Spring Uncertainty
  — Knowns
    — Sections Scheduled for Spring
    — Scheduled Hours per Section for Spring
    — Historical Fill Rate for Spring
  — Unknowns
    — Future Contact Hours (Fill Rate for WSCH/DSCH or PACH)
  — Mt. SAC Decision: Projection

  13. Projection: Weighted Averages
  [Chart: weighted average across the four most recent like terms, with weights 4, 3, 2, 1 from 1st Most Recent to 4th Most Recent; vertical axis −25% to +25%]
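Reading the chart labels, the projection appears to weight the four most recent like terms 4, 3, 2, 1, newest first. A minimal sketch under that assumption (the function name and sample values are invented):

```python
def weighted_projection(history):
    """Project the next value from the four most recent like terms.

    `history` is ordered newest first; weights 4, 3, 2, 1 favor recent
    terms (weights assumed from the slide's chart labels).
    """
    weights = [4, 3, 2, 1]
    return (sum(w * h for w, h in zip(weights, history))
            / sum(weights[:len(history)]))

# E.g. fill-rate variance for the four most recent springs, newest first.
projection = weighted_projection([0.05, 0.00, -0.05, -0.10])
```

The weighting makes the projection lean toward the most recent term, which works well when fill rates are stable but lags when enrollment is trending, a limitation slide 18 returns to.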

  14. Does the projection work? Spring 2012

      Acct    Potential    Projected   Actual       Error #    Error %
      W       8432.9192    8312.95     8381.988      69.038     0.824%
      IW       289.5956     311.02      266.019      45.001    16.916%
      ID       117.6122     110.82       97.9406     12.8794   13.150%
      D        374.2581     395.12      340.651      54.469    15.990%
      LD        33.2027      25.87       28.3751      2.5051    8.829%
      LW       256.2772     236.96      224.4053     12.5547    5.595%
      TOTAL   9503.865     9392.74     9339.379      53.361     0.571%
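In these tables, Error # is the absolute gap between projected and actual contact hours, and Error % is that gap as a share of actual. A quick check against the W row for Spring 2012 (function name invented):

```python
def projection_error(projected, actual):
    """Absolute error and error as a percentage of actual."""
    err = abs(projected - actual)
    return err, 100 * err / actual

# W row, Spring 2012: projected 8312.95 vs actual 8381.988.
err_num, err_pct = projection_error(8312.95, 8381.988)
# err_num is 69.038 and err_pct rounds to 0.824, matching the table.
```

Note the pattern across the tables: the large W account stays under 1% error while the smaller accounts (IW, ID, D) routinely miss by 13-25%, and the errors mostly cancel in the totals until 2013.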

  15. Does the projection work? Fall 2012

      Acct    Potential    Projected   Actual       Error #    Error %
      W       8980.0658    8939.18     9004.8445     65.6645    0.729%
      IW       294.9029     323.12      268.3027     54.8173   20.431%
      ID       146.9111     148.73      123.4596     25.2704   20.469%
      D        393.9151     385.57      337.1664     48.4036   14.356%
      LD        29.3333      27.49       29.0263      1.5363    5.293%
      LW       234.8028     208.51      228.1898     19.6798    8.624%
      TOTAL  10079.931    10032.6      9990.9893     41.6107    0.416%

  16. Does the projection work? Spring 2013

      Acct    Potential    Projected   Actual       Error #    Error %
      W       9147.3702    8924.05     8839.0069     85.0431    0.962%
      IW       320.2368     337.38      290.7746     46.6054   16.028%
      ID       141.3104     131.81      116.7168     15.0932   12.931%
      D        426.6003     445.74      359.189      86.551    24.096%
      LD        27.8053      23.8        26.2312      2.4312    9.268%
      LW       271.2639     245.29      231.6928     13.5972    5.869%
      TOTAL  10334.5869   10108.07     9863.6113    244.4587    2.478%

  17. Does the projection work? Fall 2013

      Acct    Potential    Projected   Actual       Error #    Error %
      W       9444.0141    9523.71     9263.0738    260.6362    2.814%
      IW       350.3353     418.18      314.465     103.715    32.981%
      ID       189.386      215.32      162.3545     52.9655   32.623%
      D        369.7928     397.66      317.5203     80.1397   25.239%
      LW        93.0252     124.5        83.0736     41.4264   49.867%
      TOTAL  10446.5534   10679.37    10140.487     538.8828    5.314%

  18. What happened?
  — No variance in previous years; easy to project when fill rates hover around 100% (after drops and adds)
  — What now?
    — A more robust, truly predictive model
    — Will that help given downward trends? Always a year or term behind?
  — Maintain Agility
    — Reporting via Argos and APEX
    — Sandboxing via APEX

  19. APEX — Highlights
  — Oracle's primary tool for developing web applications using SQL and PL/SQL
  — Requires only a web browser to develop
  — A no-cost option of the Oracle Database

  20. Reports
  — What sections should we add?
    — Demand
    — 90%+ Fill
    — Waitlists
    — Registration Acceleration
  — What sections should we cut?
    — Lagging Sections
    — Registration Acceleration
  — What else?
    — Room Usage
    — Excluded CRNs
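The "90%+ Fill" and "Lagging Sections" reports boil down to a fill-rate filter over section enrollment. A minimal sketch with invented sample data and thresholds (the real reports run as SQL inside APEX and Argos):

```python
# Invented sample data; field names and thresholds are illustrative only.
sections = [
    {"crn": "10001", "enrolled": 38, "capacity": 40, "waitlist": 12},
    {"crn": "10002", "enrolled": 21, "capacity": 40, "waitlist": 0},
    {"crn": "10003", "enrolled": 40, "capacity": 40, "waitlist": 25},
]

def high_demand(sections, fill_threshold=0.90):
    """Sections at or above the fill threshold: candidates for added sections."""
    return [s["crn"] for s in sections
            if s["enrolled"] / s["capacity"] >= fill_threshold]

def lagging(sections, fill_threshold=0.60):
    """Sections well under the threshold: candidates for cuts."""
    return [s["crn"] for s in sections
            if s["enrolled"] / s["capacity"] < fill_threshold]
```

Waitlist depth and registration acceleration (how fast seats fill after registration opens) would layer additional signals on top of the same per-section query.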

  21. Demand 90%+ Fill

  22. Response?

  23. Waitlists

  24. Registration Acceleration

  25.-26. Lagging Courses

  27. Room Usage

  28. Excluded CRNs From 320

  29.-46. Sandbox

  47. Future
  — Building Predictive Model
  — Room/Space Efficiency
  — Reporting Off Sandboxes
  — Task/Directive Assignment
