What Works Best with TSPi for Small Team Productivity and Quality (PowerPoint PPT Presentation)



SLIDE 1

  • W. L. Honig, TSPi Symposium 2006 San Diego, CA

What Works Best with TSPi for Small Team Productivity and Quality

William L. Honig, Ph.D.

Associate Professor, Department of Computer Science

SLIDE 2

TSPi Effectiveness with Small Teams

  • TSPi impact on software teams

– 23 teams of 7 to 12 graduate students on real world developments
– Software process awareness and impact

  • Productivity coupled with quality
  • Result of planning and analysis
  • Extensive data collection
  • Bringing real world software experience to the classroom

– R&D leadership in communications companies
– Land line, wireless, satellite, private and public networks

  • Voice, data, land line, mobile, satellite, network management

SLIDE 3

What Results?

  • Data Summary – Productivity

– Source Lines of Code (LOC) per Person Hour

  • High 47.4
  • Average 13.5
  • Low 1.8

(complete Cycle 2 development, including reuse – all phases)

  • Data Summary – Quality

– Defects Injected per Total KLOC

  • Low 2.8
  • Average 24.1
  • High 86.3
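The two summary metrics above come straight from team totals: LOC per person-hour for productivity, and defects injected per KLOC for quality. A minimal sketch, with illustrative team figures (not the actual study data):

```python
def productivity_loc_per_hour(total_loc, total_person_hours):
    """Source lines of code produced per person-hour, across all phases."""
    return total_loc / total_person_hours

def defects_per_kloc(defects_injected, total_loc):
    """Defects injected per thousand lines of code."""
    return defects_injected / (total_loc / 1000.0)

# Illustrative team: 5000 LOC in 370 person-hours, 120 defects injected.
print(round(productivity_loc_per_hour(5000, 370), 1))  # 13.5, near the study average
print(round(defects_per_kloc(120, 5000), 1))           # 24.0
```

Note that the study's productivity figures cover the complete Cycle 2 development including reuse, so the hour total in the denominator spans all phases, not just coding.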
SLIDE 4

How do the teams work?

  • Team composition
  • Students assigned to teams

» Based on Form INFO

  • Roles matched to background
  • Demographic mixture
  • Well trained individual programmers
  • Learning environment
  • 14 to 17 weeks of class
  • Strict enforcement of team discipline
  • Face to face team meetings required
  • Students ≠ employees, but can be “fired”

Team Phoenix Fall 2001

SLIDE 5
  • Dr. William L. Honig, 2005, COMP 474 Software Engineering

Team Roles at a Glance

(Five Specialized Roles)

  • Support Manager
  • Quality/Process Manager
  • Planning Manager
  • Development Manager
  • Team Leader
SLIDE 6

Team Productivity – Cycle 2

[Chart: Productivity of Each Team. Y-axis: Total LOC / Total Time in Phase (Source Lines of Code per Hour), 0 to 50; higher is better (“Good”). X-axis: teams Rambler, Ice Cool, Lucid, Phoenix, Blue Bee, Dim Sum, Doc Max, Socrotes, Kites, Titans, G10, eUphoria, Beta, Seals, Bees, Sharp, Silicon Raiders, Evolution, Avalanche, T3, Code Warriors, Phoenix II, Volki]

SLIDE 7

How is TSPi used in the classroom?

  • Student teams complete two cycles of development
  • Same team assignment for both cycles
  • Some switch roles for cycle two
  • “Customer” provides starting point
  • Product Needs Statement (not full requirements)
  • 2 to 4 meetings with customer to clarify needs and review requirements and plans

  • Teams present key milestones and demonstrate product to faculty, research assistants, customer

SLIDE 8

The Process at a Glance (TSPi)

Launch → Strategy → Plan → Requirements → Design → Implementation → Test → Postmortem → Repeat

A controlled, data driven, step-by-step process for the software life cycle

SLIDE 9

How do students learn PSP first?

  • Personal Software Process (PSP)

– Required for individuals
– Prerequisite for TSPi

  • PSP trial introduction

– Undergraduate programming course
– Plan (estimate time), track defects, record time spent

  • Only some TSPi student teams have this experience before TSPi begins

– Quick two day introduction
– One programming project

SLIDE 10

Development Projects

  • “Real World” Development

– University staff groups as customer

  • working system or,
  • prototype or,
  • requirements clarification,…
  • Wide range of applications

– Prospect tracking for Graduate School
– Summer visit registration for College of Arts and Sciences
– Student Portal for Information Technology
– Grant Approval and Tracking for VP Research

  • Many technologies

– C++, Java, XML, ColdFusion, …

Titans Fall 2002

SLIDE 11

How are data collected?

  • Textbook: Watts S. Humphrey, Introduction to the Team Software Process(SM)

  • Key data entered weekly into 21 forms, including:

– Product Summary (SUMP)
– Quality Summary (SUMQ)
– Work Tasks/Effort (TASK)
– Schedule and Earned Value (SCHEDULE)
– Defect Identification and Correction (LOGD)
– Inspection Reports (INS)
– Time Recording Log (LOGT)
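The weekly forms are, in essence, structured records rolled up into team totals. A hypothetical minimal model of a time recording log (LOGT) entry and its weekly per-phase rollup; the field names here are illustrative, not the official form layout:

```python
# Hypothetical minimal model of a LOGT (time recording log) entry and a
# weekly rollup; field names are illustrative, not the official form layout.
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class TimeLogEntry:          # one LOGT row
    member: str
    phase: str               # e.g. "Design", "Test"
    minutes: int

def weekly_hours_by_phase(entries):
    """Aggregate logged minutes into hours per phase."""
    totals = defaultdict(int)
    for e in entries:
        totals[e.phase] += e.minutes
    return {phase: m / 60 for phase, m in totals.items()}

log = [TimeLogEntry("alice", "Design", 90), TimeLogEntry("bob", "Design", 30),
       TimeLogEntry("alice", "Test", 60)]
print(weekly_hours_by_phase(log))  # {'Design': 2.0, 'Test': 1.0}
```

This kind of rollup is what feeds the SUMP time-in-phase columns and the SCHEDULE earned-value tracking each week.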

Phoenix Fall 2001

SLIDE 12
  • Dr. William L. Honig, 2002, Initial Findings, FEB 2002

TSPi Plan Summary: Form SUMP

Header fields: Name, Date, Team, Instructor, Part/Level, Cycle

Product Size (Plan / Actual):
– Requirements pages (SRS)
– Other text pages
– High-level design pages (SDS)
– Detailed design lines
– Base LOC (B) (measured)
– Deleted LOC (D) (estimated / counted)
– Modified LOC (M) (estimated / counted)
– Added LOC (A) (N - M / T - B + D - R)
– Reused LOC (R) (estimated / counted)
– Total New and Changed LOC (N) (estimated / A + M)
– Total LOC (T) (N + B - M - D + R / measured)
– Total New Reuse LOC
– Estimated Object LOC (E)
– Upper Prediction Interval (70%)
– Lower Prediction Interval (70%)

Time in Phase (hours) (Plan / Actual / Actual %):
– Management and miscellaneous
– Launch
– Strategy and planning
– Requirements
– System test plan
– Requirements inspection
– High-level design
– Integration test plan
– High-level design inspection
– Implementation planning
– Detailed design
– Detailed design review
– Test development
– Detailed design inspection

If it’s not documented, it’s not there… If you can’t measure it, it’s not there…
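The SUMP size fields obey simple accounting identities: Added LOC A = N - M, and Total LOC T = N + B - M - D + R, with the form's alternate definition A = T - B + D - R following algebraically. A small sketch with illustrative numbers checks that the identities agree:

```python
def added_loc(new_and_changed, modified):
    """Added LOC: A = N - M."""
    return new_and_changed - modified

def total_loc(new_and_changed, base, modified, deleted, reused):
    """Total LOC: T = N + B - M - D + R."""
    return new_and_changed + base - modified - deleted + reused

# Illustrative values: 1200 new-and-changed, 3000 base, 200 modified,
# 100 deleted, 500 reused.
N, B, M, D, R = 1200, 3000, 200, 100, 500
A = added_loc(N, M)           # 1000
T = total_loc(N, B, M, D, R)  # 4400
# Cross-check the form's alternate definition of A: A = T - B + D - R
assert A == T - B + D - R
print(A, T)
```

Running both definitions of A against the same inputs is exactly the kind of consistency check the plan/actual columns of SUMP invite.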

SLIDE 13

What Results?

Defects Injected per LOC

[Chart: Quality (a) of Each Team. Y-axis: Total Defects Injected / Total LOC, 0.01 to 0.09; lower is better (“Good”). X-axis: teams Rambler, Ice Cool, Lucid, Phoenix, Blue Bee, Dim Sum, Doc Max, Socrotes, Kites, Titans, G10, eUphoria, Beta, Seals, Bees, Sharp, Silicon Raiders, Evolution, Avalanche, T3, Code Warriors, Phoenix II, Volki]

SLIDE 14

Quality Results from Cycle Testing ONLY

  • In cycle testing determines the quality numbers

– No “production” use recorded

  • “Testing can only show the presence of bugs, not their absence”

– Fault Seeding
– Bug Density / Arrival Rate Analysis

SLIDE 15

Where are the Hours Used?

Total Cycle 2 Hours by Phase

– Mgmt&Misc: 20%
– Launch: 3%
– Strat&Plan: 9%
– Requirements: 8%
– Design: 16%
– Implementation: 24%
– Test: 15%
– PostMortem: 5%

6396 Total Hours to Date
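Given the phase percentages and the 6396 total hours, per-phase hours follow directly. A small sketch; note the pairing of percentages to phases in slide order is my reading of the chart, not stated explicitly:

```python
# Convert the phase percentages into hours, assuming the percentages pair
# with the phases in the order listed on the slide (my assumption).
phase_pct = {"Mgmt&Misc": 20, "Launch": 3, "Strat&Plan": 9, "Requirements": 8,
             "Design": 16, "Implementation": 24, "Test": 15, "PostMortem": 5}
total_hours = 6396

assert sum(phase_pct.values()) == 100  # sanity check: a full pie
hours = {phase: total_hours * pct / 100 for phase, pct in phase_pct.items()}
print(hours["Implementation"])  # 1535.04
```

Under that pairing, implementation consumes about a quarter of all effort, while launch and postmortem together take under a tenth.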

SLIDE 16

Student Outcomes

  • Student Perceptions – Popular Course
  • Team work experiences very positive learning
  • Understand process – appreciation varies
  • Data collection a struggle

– Volume of data needed
– Needed for timely team cooperation

  • My Viewpoint
  • Students well equipped to join industrial teams; larger team sizes work well

  • TSPi textbook is great on metrics and quality, limited in coverage of design, testing,…
  • Volume of “paper work” can lead to cybercrud

Volki Spring 2005 Pot Luck

SLIDE 17

Students “Value” Forms

Student Survey (Question 15): Choose the forms useful to your team.

[Chart: percentage of students (0% to 100%) selecting each form: CCR, INS, TASK, LOGD, SCHEDULE, SUMS, LOGT, SUMP, SUMQ, CSR, ITL, LOGTEST, STRAT, WEEK, PEER, INFO]

Greatest perceived value in forms that manage change and defects (red) and project plan creation and tracking (blue)

SLIDE 18

How do these findings apply to industry?

  • Student teams approximate small industry task teams / development groups

– Importance of (self) policing team behavior
– Specialized roles help (in addition to developer role)

  • Training / Coach / Observer role is critical to rapid introduction of process such as TSPi

– Get through one cycle quickly to speed learning
– Need Process Coach / Facilitator

  • Face to face regular meetings

– Weekly cycle of data, analysis, action
– Emphasis on analysis and quality is key

  • Lead teams to analysis (not just data generation)
  • Historical data a real help for getting started

– If none, BEGIN NOW!

SLIDE 19

What about TSPi and Small Teams?

  • Team data for 23 student teams show industry level productivity early in learning TSPi

  • Quality *always* needs focus
  • TSPi can be learned efficiently and applied rapidly

– Team composition and coaching

  • The “academic” learning approach likely applicable to other types of organizations

– Value of discipline, data collection, metrics

G10 Fall 2002

SLIDE 20

LOC Vary Greatly

Total LOC and Its Max, Min, and Avg.

[Chart: Total LOC of Each Team, with Max, Min, and Average lines; scale roughly 2000 to 14000 LOC. Teams: Rambler, Ice Cool, Lucid, Phoenix, Blue Bee, Dim Sum, Doc Max, Socrotes, Kites, Titans, G10, eUphoria, Beta, Seals, Bees, Sharp, Silicon Raiders, Evolution, Avalanche, T3, Code Warriors, Phoenix II, Volki]

SLIDE 21

Ramblers Team Metric Chart (5-Up Chart)

– Planned Value vs. Earned Value: EV climbed 11%, 32%, 93%, 100% over Weeks 1 to 4
– Planned Hours vs. Actual Hours (PH / AH) by week
– Defects Injected vs. Removed by week
– CCR Tracking Chart: submitted / approved / rejected by week
– No. INS (inspections) finished by week
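Earned value, the headline panel of a 5-up chart, can be tracked in a few lines: each task carries a planned value proportional to its planned hours, and EV is the planned value of the tasks actually completed. A minimal sketch with illustrative numbers, not the Ramblers' actual data:

```python
# Minimal earned-value sketch (illustrative numbers, not the Ramblers' data).
# Each task's planned value is its share of total planned hours; earned value
# is the planned value of tasks actually completed.
tasks = [  # (name, planned_hours, completed?)
    ("requirements", 20, True),
    ("design", 30, True),
    ("code", 40, False),
    ("test", 10, False),
]
total_planned = sum(h for _, h, _ in tasks)
ev_pct = 100 * sum(h for _, h, done in tasks if done) / total_planned
print(f"EV = {ev_pct:.0f}%")  # EV = 50%
```

Recomputing this weekly and plotting it against planned value is what lets a team see, as the Ramblers did, whether it is tracking to plan or falling behind.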

SLIDE 22

Pert Chart (Cycle 2)

Phases: Launch → Strategy → Plan → Requirements → Design → Implement → Test → Postmortem, spanning Weeks 1 to 4

Project tasks (owner, estimated hours):

– Assign roles (All members), 3 hrs
– Size and time estimation (All members), 15 hrs
– Set goals (All members), 5 hrs
– Update risk & issues (Support M.), 5 hrs
– Update configuration management procedure (Support M.), 5 hrs
– Update product list and size estimation (Plan M.), 14 hrs
– Allocate tasks among members (Plan M.), 3 hrs
– Estimating the defects (Quality M.), 10 hrs
– Produce SRS (Develop M.), 11 hrs
– Inspect SRS (Quality M.), 6 hrs
– Produce STP (Develop M.), 11 hrs
– Inspect STP (Quality M.), 6 hrs
– Produce SDS (Develop M.), 9 hrs
– Inspect SDS (Quality M.), 8 hrs
– Produce ITP (Develop M.), 7 hrs
– Inspect ITP (Quality M.), 5 hrs
– Detailed design (All members), 14.5 hrs
– Detailed design inspection (Quality M.), 11 hrs
– Implementation planning (Develop M.), 11.5 hrs
– Unit test plan (All members), 9.5 hrs
– Code (All members), 22 hrs
– Code inspection (Quality M.), 9 hrs
– Quality review (All members), 7.5 hrs
– Test plan and development (Develop M.), 8 hrs
– Build & integration (Support M.), 9 hrs
– Documentation (Support M.), 11.5 hrs
– System test (Develop M.), 9.5 hrs
– Finish documentation (Support M.), 24 hrs
– Update documents (All members), 22 hrs

Deliverables and forms on the chart: ITL, SUMS, TASK, SCHEDULE, SUMP, SUMQ, SRS, STP, SRS-INS, STP-INS, SDS-INS, ITP-INS, Code, CCR, LOGT, LOGD, INS, LOGTEST, PIIC

SLIDE 23

Larger Team Size Works

  • Flexibility in Roles:

– Some ability to switch roles
– Easier to recover from “drop outs”

  • Student Feedback:

– Students identified the problems their team encountered
– 20% felt a smaller team size of 5 would lessen the problems

SLIDE 24

What are some next steps?

  • Expand Focus on Analysis
  • Metrics for In-cycle Quality Improvement
  • Ease Data Gathering Travail (Mobile Tool)
  • Incorporate Teaching Materials on Technique Best Practices
  • Effectiveness of TSPi to Accelerate Transition to CMMI

Questions, follow-ups, ideas… contact:

William L. Honig

whonig@luc.edu, 1-312-915-7988