  1. Todd Warren – CS 394 Spring 2011

  2. Agenda:
     - Team structure at Microsoft
     - Product complexity and scheduling
     - Quality and software testing
     - Knowing when to ship

  3. Brooks's "Tar Pit" diagram: a program becomes a programming product (3x the effort) or a programming system (3x the effort); a programming systems product, which is both, costs 9x. Source: Fred Brooks Jr., The Mythical Man-Month, "The Tar Pit"

  4. Professional services business vs. product business:
     - Marginal costs: almost constant (services) vs. drive towards high concentration (product)
     - Regional appearance: mainly regional, with increasing tendency to globalization (services) vs. highly globalized (product)
     - Customer relationship: one to one (services) vs. one to many (product)
     - Most important number to watch: capacity utilization rate (services) vs. market share / installed base (product)
     - Relevance of management areas: for services, 1. human resources, 2. software development, 3. marketing and sales, 4. strategy; for products, 1. strategy, 2. marketing and sales, 3. human resources, 4. software development
     Source: Hoch, Roeding, Purkert, Lindner, "Secrets of Software Success", 1999

  5. Team roles by discipline:
     - Program Management: Product Manager, User-Interface Designer, End-User Liaison
     - Software Development Engineers: Project Manager, Architect, Developers, Tool Smith
     - Test and Quality Assurance: QA/Testers, Build Coordinator, Risk Officer
     - User Assistance / Education: End-User Documentation
     Source: McConnell

  6. Size matters! Different sizes call for different methodologies and approaches. The scope of features and quality matters too: it affects the level of process needed, and the overhead.
     - 5-person teams: moderate process, shared roles
     - 24-person teams (PMC): moderate process, lifecycle-oriented roles and specialization; good for "Extreme"-style process
     - 60-100 person teams (MS Project): moderate process, some loose functional specialization and lifecycle roles
     - 100-200 person teams (Windows CE): medium to heavy process, lifecycle roles and functional specialization
     - 1000+ person teams (Windows Mobile): heavy process, multiple methodologies, formal integration process
     Higher quality == more rigorous process. This holds for open source and online projects too; Apache is the best example of a very specified culture of contribution.

  7. [Diagram: feature teams. Development teams 1-3 crossed with functions A, B, and C; organization style ranges from casual through more formal to very formal.]

  8. [Diagram: the Office and Windows organizations, each drawn as a core team surrounded by edge teams.]

  9. - 25% Developers
     - 45% Testers
     - 10% Program Management
     - 10% User Education / Localization
     - 7% Marketing
     - 3% Overhead

  10. - 1 UI Designer
      - 5 Program Managers
      - 8 Developers
      - 10 Testers

  11. - 30 Developers (27%)
      - 36 Testers (33%)
      - 15 Program Managers (14%)
      - 20 UA/Localization (18%)
      - 6 Marketing (5%)
      - 3 Overhead (3%)

  12. - 112 Developers (25.9%)
      - 247 Testers (57.3%)
      - 44 Program Managers (10.2%)
      - 12 Marketing (2.7%)
      - 16 Overhead (3.7%)

  13. - Development 22%
      - Test 49%
      - Program Management 14%
      - User Ed / Localization 13%
      - Admin/Other 1%

  14. Role                % of team   Ratio to dev (role % / developer %)
      Cust. integration     15.79%        0.71
      Developers            22.11%        1.00
      Testers               35.79%        1.62
      Program Managers      10.53%        0.48
      UI Design              5.26%        0.24
      UE                     4.21%        0.19
      Eng services           6.32%        0.29

  15. - A 3-month maximum is a good rule of thumb for a stage/milestone; it is hard for people to focus on anything longer than 3 months.
      - Never let things go unbuilt for longer than a week.

  16. Example schedule:
      - 216 days of development (truthfully, probably more like 260 days)
      - 284 days of "testing" in the example: component tests 188 days, system-wide tests ~97 days
      - Roughly a 50/50 split between design/implement and test/fix
      - Some projects (e.g., operating systems, servers) have a longer integration period, more like 2:1 test to development
      - Factors: how distributed the team is, the number of "moving parts"
      - Shows why some of the Extreme methodology is appealing

  17. Brooks's rule-of-thumb schedule split (see the sketch below):
      - 1/3 planning
      - 1/6 coding
      - 1/4 component test and early system test
      - 1/4 system test, all components in hand
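
The split is easy to sanity-check mechanically. A minimal sketch in C (the 500-day total is an arbitrary example, not from the slides); note that coding is the smallest slice and testing in total gets half the schedule, matching the rough 50/50 design/test split on the previous slide:

    #include <stdio.h>

    /* Brooks's rule-of-thumb split: 1/3 planning, 1/6 coding,
       1/4 component test, 1/4 system test. */
    int main(void)
    {
        double total_days = 500.0;  /* arbitrary example length, not from the slides */

        printf("planning:       %6.1f days\n", total_days / 3.0);
        printf("coding:         %6.1f days\n", total_days / 6.0);
        printf("component test: %6.1f days\n", total_days / 4.0);
        printf("system test:    %6.1f days\n", total_days / 4.0);
        return 0;
    }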

  18. Milestone                   Planned date   Actual date
      M1 start                    5/12/97        5/12/97
      M1 end                      8/8/97         8/22/97
      M2 start                    8/11/97        8/25/97
      M2 end                      11/7/97        12/12/97
      M3 start                    11/10/97       12/15/97
      M3 end ("code complete")    2/23/98        3/31/98
      Beta 1                      3/9/98         6/22/98
      Beta 2                      5/11/98        9/21/98
      RTM U.S.                    7/13/98        3/25/99

  19. Project time split (% of project time):
      Phase            Brooks   Office 2000 plan   Office 2000 actual   Project 2002   Office 2007
      Planning           33%          19%                13%                 11%            26%
      Coding             17%          27%                19%                 27%            18%
      Component test     25%          27%                23%                 38%            26%
      System test        25%          27%                46%                 23%            30%

  20. - Design in scenarios up front: what is necessary for the component? A UI is different than an API; a server is different than a client.
      - Set criteria and usage scenarios.
      - Understand (and, if possible, control) the environment in which the software is developed and used.
      "The last bug is found when the last customer dies." - Brian Valentine, SVP eCommerce, Amazon

  21. - Exchange versions: 4.0 (latest SP), 5.0 (latest SP), and 5.5
      - Windows NT versions: 3.51, 4.0 (latest SPs)
      - Languages (Exchange/Windows NT): USA/USA, JPN/JPN, GER/GER, FRN/FRN, JPN/Chinese, JPN/Taiwan, JPN/Korean
      - Platforms: Intel, Alpha (MIPS, PPC on 4.0 only)
      - Connectors, X.400: over TCP, TP4, TP0/X.25
      - Connectors, IMS: over LAN, RAS, ISDN
      - Connectors, RAS: over NetBEUI, IPX, TCP
      - Connector interop: MS Mail, MAC Mail, cc:Mail, Notes
      - News: NNTP in/out
      - Admin: daily operations
      - Store: public store >16GB and private store >16GB
      - Replication: 29 sites, 130 servers, 200,000 users, 10 AB views
      - Client protocols: MAPI, LDAP, POP3, IMAP4, NNTP, HTTP
      - Telecommunication: slow-link simulator, noise simulation
      - Fault tolerance: Windows NT Clustering
      - Security: Exchange KMS server, MS Certificate Server
      - Proxy firewall: server-to-server and client-to-server

  22. - 5M lines of code
      - 4 processor architectures: ARM/XScale, MIPS, x86, SH
      - 20 Board Support Packages
      - Over 1000 possible operating system components
      - Thousands of peripherals

  23. - 2 code instances ("standard" and "pro")
      - 4 ARM chip variants
      - 3 memory configuration variations
      - 8 screen sizes (QVGA, VGA, WVGA, square, ...)
      - 60 major interacting software components
      - 3 network technologies (CDMA, GSM, WiFi)
      - Some distinct features for 7 major vendors
      - 100 dependent 3rd-party apps for a complete "phone"
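
The point of this list is combinatorial explosion. A back-of-the-envelope sketch multiplying just the dimension sizes above (treating the 7 vendor variants as a full dimension is a simplification, and the slide does not claim full cross-product testing is attempted; real teams sample such a matrix, e.g. with pairwise selection):

    #include <stdio.h>

    int main(void)
    {
        /* Dimension sizes from the slide above. */
        long code_instances = 2;  /* "standard" and "pro" */
        long chip_variants  = 4;  /* ARM chip variants */
        long memory_configs = 3;
        long screen_sizes   = 8;
        long networks       = 3;  /* CDMA, GSM, WiFi */
        long vendors        = 7;

        long combos = code_instances * chip_variants * memory_configs
                    * screen_sizes * networks * vendors;

        /* 2*4*3*8*3*7 = 4032 base configurations, before the 60 interacting
           components and ~100 third-party apps multiply it further. */
        printf("base configurations: %ld\n", combos);
        return 0;
    }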

  24. [Chart: defects found over project life, 3/1/98 through 2/1/00; defects per week (0-1000), broken out as Open, Resolved, and Closed.]

  25. [Chart: defect activity by phase over the same period, with the Coding, Unit Testing, and Releasing phases marked; defects per week (0-1000), broken out as Open, Resolved, and Closed.]

  26. [Diagram: test flow. The feature is specified -> a test design document is written -> the feature is implemented -> unit tests -> test release -> component testing -> specialized system testing -> bug fix -> regression tests.]

  27. Types of tests:
      - Black box
      - White box
      - "Gray" box
      Stage of cycle:
      - Unit test / verification test
      - Component acceptance test
      - System test
      - Performance test
      - Stress test
      - External testing (alpha/beta/"dogfood")
      - Regression testing
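
A hedged illustration of the black-box/white-box distinction (the clamp_byte function and its boundary values are invented for this example): black-box tests come from the spec alone, white-box tests target boundaries known to exist in the code, and rerunning both after every fix is regression testing.

    #include <assert.h>

    /* Hypothetical function under test: clamp a value to the 0..255 range. */
    static int clamp_byte(int v)
    {
        if (v < 0)   return 0;
        if (v > 255) return 255;
        return v;
    }

    int main(void)
    {
        /* Black-box tests: derived from the spec alone. */
        assert(clamp_byte(100) == 100);
        assert(clamp_byte(-5)  == 0);
        assert(clamp_byte(999) == 255);

        /* White-box tests: target the boundaries we know are in the code. */
        assert(clamp_byte(0)   == 0);
        assert(clamp_byte(255) == 255);
        assert(clamp_byte(256) == 255);

        return 0;  /* rerun after every fix: a regression test */
    }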

  28. Test with the customer in mind. Make it measurable: a ship requirement. Catch bugs early; guard the process. [Chart: cost of fix vs. time, rising across M0, M1, M2, RTM.] Source: ProOnGo LLC, May 2009

  29. Questions to answer at each phase:
      - M0 (specs & test plans): What are we building, and why? What metrics and criteria summarize customer demands? Can we reliably measure these metrics and criteria?
      - M1..Mn (development & milestone test): How close are we to satisfying the agreed-upon metrics/criteria? What do our bug trends say about our progress? Based on current trends, when will we pass all criteria?
      - RTM (confirm, ship): Are the criteria passing stably, every time we test? How risky is this last-minute code check-in? Do we pass all criteria? If not: what, why, how?

  30. What are we building, and why? Any problems with this?

          // An API that draws a line from x to y
          VOID LineTo(INT x, INT y);

      What metrics and criteria summarize customer demands? Can we reliably measure these metrics and criteria?
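
The problems the slide is fishing for: the comment and the signature disagree ((x, y) is a single endpoint, not a line "from x to y"), nothing says where the line starts, and there is no way to report failure, so none of it is measurable. A hedged sketch of the contract restated as checkable rules (the state variables and stub body are invented; the real Win32 LineTo additionally takes a device context):

    #include <assert.h>
    #include <stdbool.h>

    /* Sketch of a testable contract for the slide's LineTo example:
       1. Draws from the current position to (x, y); (x, y) itself excluded.
       2. Updates the current position to (x, y).
       3. Fails (returns false) if no drawing surface is selected. */

    static int  cur_x, cur_y;         /* current pen position */
    static bool have_surface = true;  /* stand-in for a valid device context */

    bool LineTo(int x, int y)
    {
        if (!have_surface)
            return false;  /* rule 3: report failure */
        /* ... rasterize from (cur_x, cur_y) up to but not including (x, y) ... */
        cur_x = x;         /* rule 2: move the current position */
        cur_y = y;
        return true;
    }

    int main(void)
    {
        assert(LineTo(10, 20));              /* rules 1, 3: succeeds with a surface */
        assert(cur_x == 10 && cur_y == 20);  /* rule 2 is measurable */
        have_surface = false;
        assert(!LineTo(0, 0));               /* rule 3 is measurable */
        return 0;
    }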

  31. Test passes, from most frequent / shallowest coverage to least frequent / deepest coverage:
      - Canary build
      - Build verification tests
      - Automated test pass completes
      - Manual test pass
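
One way to realize this pyramid mechanically is to tag each test with the shallowest tier that runs it and let the runner filter by depth: canary tests run on every build, deeper passes less often. A minimal sketch (the tier names and placeholder tests are invented for illustration):

    #include <stdio.h>

    typedef enum { CANARY, BVT, FULL } tier_t;

    typedef struct {
        const char *name;
        tier_t      tier;         /* shallowest tier that runs this test */
        int       (*run)(void);   /* returns 1 on pass */
    } test_t;

    static int boots(void)      { return 1; }  /* placeholder checks */
    static int saves_file(void) { return 1; }
    static int prints(void)     { return 1; }

    static test_t tests[] = {
        { "app launches",   CANARY, boots      },
        { "file save/load", BVT,    saves_file },
        { "print pipeline", FULL,   prints     },
    };

    /* Run every test whose tier is no deeper than the requested depth. */
    static void run_pass(tier_t depth)
    {
        for (size_t i = 0; i < sizeof tests / sizeof tests[0]; i++)
            if (tests[i].tier <= depth)
                printf("%-16s %s\n", tests[i].name,
                       tests[i].run() ? "PASS" : "FAIL");
    }

    int main(void)
    {
        run_pass(CANARY);  /* every build: shallowest, most frequent */
        run_pass(FULL);    /* milestone: deepest, least frequent */
        return 0;
    }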

  32. Fast tests that can automatically run at check-in time:
      - Static code analysis (like lint)
      - Trial build, before the check-in is committed to SCM
      - Form-field tests: does the check-in cite a bug number? Is the code-reviewer field filled out?
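
The form-field checks are easy to automate: scan the check-in description before committing and reject it if a required field is missing. A sketch (the "Bug:" and "Reviewer:" field names are invented; a real system would wire this into the SCM's pre-commit hook rather than hard-code the message):

    #include <stdio.h>
    #include <string.h>

    /* Reject a check-in whose description lacks a required form field. */
    static int has_field(const char *msg, const char *field)
    {
        return strstr(msg, field) != NULL;
    }

    int main(void)
    {
        const char *msg = "Fix off-by-one in parser.\n"
                          "Bug: 4231\n"
                          "Reviewer: alice\n";

        if (!has_field(msg, "Bug: ")) {
            fprintf(stderr, "rejected: check-in must cite a bug number\n");
            return 1;
        }
        if (!has_field(msg, "Reviewer: ")) {
            fprintf(stderr, "rejected: code-reviewer field not filled out\n");
            return 1;
        }
        puts("check-in accepted");
        return 0;
    }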
