What Works Best with TSPi for Small Team Productivity and Quality
William L. Honig, Ph.D., Associate Professor, Department of Computer Science
W. L. Honig, TSPi Symposium 2006, San Diego, CA
TSPi Effectiveness with Small Teams
• TSPi impact on software teams
  – 23 teams of 7 to 12 graduate students on real-world developments
  – Software process awareness and impact
    • Productivity coupled with quality
    • Result of planning and analysis
    • Extensive data collection
• Bringing real-world software experience to the classroom
  – R&D leadership in communications companies
  – Land line, wireless, satellite, private and public networks
    • Voice, data, land line, mobile, satellite, network management
What Results?
• Data Summary – Productivity: Source Lines of Code (LOC) per Person Hour
  – High: 47.4
  – Average: 13.5
  – Low: 1.8
  (Complete Cycle 2 development, including reuse – all phases)
• Data Summary – Quality: Defects Injected per Total KLOC
  – Low: 2.8
  – Average: 24.1
  – High: 86.3
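Both headline metrics are simple ratios over team totals. A minimal sketch of how they are computed (the team numbers below are hypothetical, not the study's raw data):

```python
def productivity_loc_per_hour(total_loc, total_person_hours):
    """Productivity: source lines of code per person-hour."""
    return total_loc / total_person_hours

def defects_per_kloc(defects_injected, total_loc):
    """Defect density: defects injected per thousand LOC."""
    return defects_injected / (total_loc / 1000.0)

# Hypothetical team, chosen to land near the reported averages:
print(productivity_loc_per_hour(5400, 400))  # 13.5 LOC per person-hour
print(defects_per_kloc(130, 5400))           # ~24.1 defects per KLOC
```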
How do the teams work?
• Team composition
  – Students assigned to teams based on Form INFO
  – Roles matched to background
  – Demographic mixture
  – Well-trained individual programmers
• Learning environment
  – 14 to 17 weeks of class
  – Strict enforcement of team discipline
  – Face-to-face team meetings required
  – Students ≠ employees, but can be "fired"
[Photo: Team Phoenix, Fall 2001]
Team Roles at a Glance (Five Specialized Roles)
• Team Leader
• Development Manager
• Planning Manager
• Quality/Process Manager
• Support Manager
COMP 474 Software Engineering, Dr. William L. Honig, 2005
Team Productivity – Cycle 2
[Bar chart: Productivity of each team, Total LOC / Total Time in Phase (Source Lines of Code per hour, 0 to 50); higher is better. Teams: Phoenix II, Evolution, Avalanche, Rambler, Phoenix, Blue Bee, Titans, eUphoria, Silicon Raiders, Ice Cool, Lucid, Dim Sum, Doc, Max, Seals, Sharp, T3, Volki, Socrotes, Kites, G10, Beta Bees, Code Warriors.]
How is TSPi used in the classroom?
• Student teams complete two cycles of development
  – Same team assignment for both cycles
  – Some switch roles for cycle two
• "Customer" provides starting point
  – Product Needs Statement (not full requirements)
  – 2 to 4 meetings with customer to clarify needs and review requirements and plans
• Teams present key milestones and demonstrate product to faculty, research assistants, customer
The Process at a Glance (TSPi)
A controlled, data-driven, step-by-step process for the software life cycle:
Launch → Strategy → Plan → Requirements → Design → Implementation → Test → Postmortem → Repeat
How do students learn PSP first?
• Personal Software Process (PSP)
  – Required for individuals
  – Prerequisite for TSPi
• PSP trial introduction
  – Undergraduate programming course
  – Plan (estimate time), track defects, record time spent
• Only some TSPi student teams have this experience before TSPi begins
  – Quick two-day introduction
  – One programming project
Development Projects
• "Real World" Development
  – University staff groups as customer
    • working system, or
    • prototype, or
    • requirements clarification, …
• Wide range of applications
  – Prospect tracking for Graduate School
  – Summer visit registration for College of Arts and Sciences
  – Student Portal for Information Technology
  – Grant Approval and Tracking for VP Research
• Many technologies
  – C++, Java, XML, ColdFusion, …
[Photo: Titans, Fall 2002]
How are data collected?
• Textbook: Watts S. Humphrey, Introduction to the Team Software Process(SM)
• Key data entered weekly into 21 forms, including:
  – Product Summary (SUMP)
  – Quality Summary (SUMQ)
  – Work Tasks/Effort (TASK)
  – Schedule and Earned Value (SCHEDULE)
  – Defect Identification and Correction (LOGD)
  – Inspection Reports (INS)
  – Time Recording Log (LOGT)
[Photo: Phoenix, Fall 2001]
TSPi Plan Summary: Form SUMP
(Header fields: Name, Date, Team, Instructor, Part/Level, Cycle)
Product Size (Plan / Actual):
  – Requirements pages (SRS); other text pages
  – High-level design pages (SDS); detailed design lines
  – Base LOC (B) – measured
  – Deleted LOC (D) – estimated / counted
  – Modified LOC (M) – estimated / counted
  – Added LOC (A) = N − M (plan); = T − B + D − R (actual)
  – Reused LOC (R) – estimated / counted
  – Total New and Changed LOC (N) – estimated; = A + M (actual)
  – Total LOC (T) = N + B − M − D + R – measured
  – Total New Reuse LOC; Estimated Object LOC (E)
  – Upper Prediction Interval (70%); Lower Prediction Interval (70%)
Time in Phase (hours; Plan / Actual / Actual %):
  Management and miscellaneous; Launch; Strategy and planning; Requirements; System test plan; Requirements inspection; High-level design; Integration test plan; High-level design inspection; Implementation planning; Detailed design; Detailed design review; Test development; Detailed design inspection
"If it's not documented, it's not there… If you can't measure it, it's not there…"
Initial Findings, FEB 2002, Dr. William L. Honig
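The SUMP size identities can be checked mechanically. A minimal sketch of the form's LOC accounting (the function and its sample counts are mine; the formulas are the form's):

```python
def sump_size(base, deleted, modified, reused, total):
    """LOC accounting from TSPi form SUMP.

    base (B) and total (T) are measured; deleted (D),
    modified (M), and reused (R) are counted.
    """
    added = total - base + deleted - reused  # A = T - B + D - R
    new_and_changed = added + modified       # N = A + M
    # Consistency check: T = N + B - M - D + R
    assert total == new_and_changed + base - modified - deleted + reused
    return {"A": added, "N": new_and_changed}

# Hypothetical counts, not from the study:
print(sump_size(base=1000, deleted=100, modified=200, reused=300, total=2500))
# -> {'A': 1300, 'N': 1500}
```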
What Results? – Quality
[Bar chart: Defects Injected per LOC (Total Defects Injected / Total LOC, 0 to 0.09) for each of the 23 teams; lower is better.]
Quality Results from Cycle Testing ONLY
• In-cycle testing determines the quality numbers
  – No "production" use recorded
• "Testing can only show the presence of bugs, not their absence"
  – Fault Seeding
  – Bug Density / Arrival Rate Analysis
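Fault seeding, listed above as one way to reason about defects that testing did not find, can be sketched as a Mills-style capture-recapture estimate (a minimal illustration; the function name and sample numbers are mine):

```python
def fault_seeding_estimate(seeded_total, seeded_found, real_found):
    """Mills-style fault seeding estimate.

    If testing recovers seeded_found of seeded_total deliberately
    planted defects, assume it finds real defects at the same rate:
      estimated_real_total = real_found * seeded_total / seeded_found
    Returns the estimated number of real defects still remaining.
    """
    if seeded_found == 0:
        raise ValueError("no seeded defects found; cannot estimate")
    estimated_total = real_found * seeded_total / seeded_found
    return estimated_total - real_found

# Hypothetical: 20 defects seeded, 16 recovered, 40 real defects found
print(fault_seeding_estimate(20, 16, 40))  # -> 10.0 estimated remaining
```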
Where are the Hours Used?
[Pie chart: Total Cycle 2 hours by phase – Mgmt&Misc, Launch, Strat&Plan, Requirements, Design, Implementation, Test, PostMortem – with shares ranging from 3% to 24%.]
6396 Total Hours to Date
Student Outcomes
• Student Perceptions – Popular Course
  – Team work experiences: very positive learning
  – Understand process – appreciation varies
  – Data collection a struggle
    • Volume of data needed
    • Needed for timely team cooperation
• My Viewpoint
  – Students well equipped to join industrial teams; larger team sizes work well
  – TSPi textbook is great on metrics and quality, limited on coverage of design, testing, …
  – Volume of "paper work" can lead to cybercrud
[Photo: Volki pot luck, Spring 2005]
Students "Value" Forms
Greatest perceived value in forms that manage change and defects, and in forms for project plan creation and tracking.
[Bar chart: Student survey, Question 15 – "Choose the forms useful to your team" – percentage of students (0–100%) selecting each TSPi form.]
Initial Findings, FEB 2002, Dr. William L. Honig
How do these findings apply to industry?
• Student teams approximate small industry task teams / development groups
  – Importance of (self-)policing team behavior
  – Specialized roles help (in addition to developer role)
• Training / Coach / Observer role is critical to rapid introduction of a process such as TSPi
  – Get through one cycle quickly to speed learning
  – Need a Process Coach / Facilitator
• Face-to-face regular meetings
  – Weekly cycle of data, analysis, action
• Emphasis on analysis and quality is key
  – Lead teams to analysis (not just data generation)
• Historical data a real help for getting started
  – If none, BEGIN NOW!
What about TSPi and Small Teams?
• Team data for 23 student teams show industry-level productivity early in learning TSPi
  – Quality *always* needs focus
• TSPi can be learned efficiently and applied rapidly
  – Team composition and coaching matter
• The "academic" learning approach is likely applicable to other types of organizations
  – Value of discipline, data collection, metrics
[Photo: G10, Fall 2002]
LOC Vary Greatly
[Bar chart: Total LOC of each team (0 to 14,000 scale), with max, min, and average lines.]