
Assessing the IDPD Factor: Quality Management Platform Project

Presentation Transcript


1. Assessing the IDPD Factor: Quality Management Platform Project
   Thomas Tan, Qi Li, Mei He

2. Table of Contents
   • Overview of IDPD
   • Quality Management Platform (QMP) Project
   • Project Information
   • Data Collection
   • Data Analysis Results
   • Discussion
   • Q&A

3. Incremental Development Productivity Decline (IDPD)
   • Example: Site Defense BMD software (5 builds, 7 years, $100M)
   • Build 1 productivity: over 300 SLOC/person-month
   • Build 5 productivity: under 150 SLOC/PM, including Build 1-4 breakage, integration, and rework
   • 318% change in requirements across all builds
   • IDPD factor = 20% productivity decrease per build (see the sketch below)
   • Similar trends in later unprecedented systems
   • Not unique to DoD: a key source of Windows Vista delays
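A minimal sketch of the compounding arithmetic behind that 20% figure, using only the two productivity numbers quoted on the slide (everything else is illustrative):

```python
# Compound a constant per-build productivity decline (IDPD factor).
# Figures from the slide: Build 1 ~ 300 SLOC/PM, Build 5 under 150 SLOC/PM.

build1_productivity = 300.0   # SLOC per person-month
idpd_factor = 0.20            # 20% decline per build

for build in range(1, 6):
    productivity = build1_productivity * (1 - idpd_factor) ** (build - 1)
    print(f"Build {build}: {productivity:.0f} SLOC/PM")

# Build 5 comes out to ~123 SLOC/PM, consistent with the slide's
# "under 150 SLOC/PM" after four compounded 20% declines.
```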

4. QMP Project Information
   • Web-based application
   • The system facilitates process improvement initiatives in many small and medium software organizations
   • 6 builds over 6 years, with varying increment durations
   • Size after the 6th build: 548 KSLOC, mostly in Java
   • Average staff on the project: ~20

5. Data Collection
   • Most data come from release documentation, build reports, and project member interviews/surveys
   • Data include product size, effort by engineering phase, effort by engineering activity, defects by phase, requirements changes, project schedule, and COCOMO II driver ratings (rated by project developers and organization experts)
   • Data collection challenges:
     • Incomplete and inconsistent data
     • Data formats varied depending on who filled out the report
     • No system architecture documents available

6. Data Analysis – Staffing
   • Staffing was stable for most early builds, and enough talented staff stayed on the project to overcome the loss of developers
   • Some staff turnover occurred during builds 5 and 6
   • Experience gained in both the application and the platform
   • Team cohesion improved
   [Figure: Staffing and Personnel Capabilities Ratings]

7. Data Analysis – Phase Effort Distribution
   [Figure: Phase Effort Percentage; a computation sketch follows below]
   • Major integration difficulties in build 3 caused a major drop in productivity (see next slide)
   • Forced re-architecting in build 4 increased requirements and design effort
   • The re-architecting paid off in builds 5 and 6, which focused primarily on implementation and testing
   • Testing and fixing were a major source of added integration effort; testing-phase effort increased from build to build
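As a rough illustration of how per-build phase-effort percentages like those in the figure could be derived from the collected effort data (the person-hour figures below are invented placeholders, not the project's numbers):

```python
# Effort by phase per build, in person-hours (placeholder values only).
effort_by_build = {
    3: {"requirements": 400, "design": 600, "code": 2500, "test": 1500},
    4: {"requirements": 900, "design": 1400, "code": 2000, "test": 1700},
}

for build, phases in effort_by_build.items():
    total = sum(phases.values())
    shares = ", ".join(f"{p} {100 * h / total:.0f}%" for p, h in phases.items())
    print(f"Build {build}: {shares}")
```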

8. Data Analysis – Productivity Trends
   • The slope of the trend line is -0.76 SLOC/PH per build
   • Across the five builds, this corresponds to a 14% average decline in productivity per build, smaller than the 20% IDPD factor observed on the large defense program
   • Most likely because this project is considerably smaller in system size and complexity (a fitting sketch follows below)
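A minimal sketch of how such a trend could be fitted; the per-build productivity values are hypothetical placeholders, and only the method (least-squares slope, geometric-mean per-build decline) is illustrated:

```python
import numpy as np

# Hypothetical per-build productivities in SLOC per person-hour (SLOC/PH);
# the real values come from the project's build reports.
builds = np.array([1, 2, 3, 4, 5, 6])
productivity = np.array([5.5, 4.9, 3.8, 4.1, 3.5, 3.0])  # placeholders

# Least-squares linear trend: productivity = slope * build + intercept
slope, intercept = np.polyfit(builds, productivity, 1)
print(f"Trend slope: {slope:.2f} SLOC/PH per build")

# Average percentage decline per build: geometric mean of build-to-build ratios
ratios = productivity[1:] / productivity[:-1]
avg_decline = 1 - ratios.prod() ** (1 / len(ratios))
print(f"Average decline per build: {avg_decline:.0%}")
```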

9. Discussion
   • Staffing stability helps improve team cohesion and developer experience, contributing positively to productivity
   • Design deficiencies and code breakage cause productivity declines
   • In our case study, the development team encountered integration difficulties in build 3, where the original design could not accommodate additional modules; the re-architecting effort in build 4 was necessary to put the project back on track
   • Inserting new code into the previous build adds effort to read, analyze, and test both the new and the old code to ensure nothing is broken; this extra effort may be mitigated by experienced staff
   • Extra testing and fixing effort, particularly regression and integration testing, is inevitable, and the amount of this extra effort grows as the system becomes larger

10. Future Research
   • Use COCOMO II cost drivers to normalize new size and effort:
     • Product effort multipliers on size
     • Personnel effort multipliers on effort
   • Find significant effort multipliers and analyze their impact on productivity
   • Calibrate equivalent new size:
     • Calculate equivalent new size based on CodeCountTM "Diff" for each increment and compare it with actual size
     • Use the results to adjust the parameters for calculating equivalent new size with integration rework taken into account (see the sketch below)
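One concrete piece of such a calibration is the COCOMO II reuse model for equivalent new size. A minimal sketch follows; the DM/CM/IM/AA/SU/UNFM ratings below are hypothetical inputs, while the AAM formula follows the published COCOMO II model (automatic translation is omitted for brevity):

```python
def equivalent_sloc(adapted_sloc, dm, cm, im, aa=4.0, su=30.0, unfm=0.4):
    """COCOMO II reuse model: equivalent new size of adapted code.

    dm, cm, im: percent of design / code / integration modified (0-100).
    aa:   assessment & assimilation increment (0-8).
    su:   software understanding increment (10-50).
    unfm: programmer unfamiliarity with the software (0-1).
    """
    aaf = 0.4 * dm + 0.3 * cm + 0.3 * im  # adaptation adjustment factor
    if aaf <= 50:
        aam = (aa + aaf * (1 + 0.02 * su * unfm)) / 100
    else:
        aam = (aa + aaf + su * unfm) / 100
    return adapted_sloc * aam

# Hypothetical increment: 80 KSLOC carried over, with 15% design, 20% code,
# and 25% integration rework as measured by a "Diff" run between builds.
print(f"{equivalent_sloc(80_000, dm=15, cm=20, im=25):,.0f} equivalent SLOC")
```

Comparing this model-derived equivalent size with the actual counted size per increment is what would drive the parameter adjustment mentioned above.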

11. Q & A
   • Questions?
   • Comments?
   • Thank you very much
