Using in-process metrics to determine the quality status of a software project under development is easier said than done. How does one interpret a test-phase defect curve so that it reflects the true quality status of the project? If the defect curve is below a given baseline, is that a positive sign? What if the lower curve is simply due to slow progress in testing? Likewise, how does one establish meaningful metrics for design reviews and code inspections, and interpret them correctly? What about metrics for stability and reliability?
This paper describes the Effort/Outcome Model, a framework for establishing and interpreting in-process metrics in software development. The model has been validated and used on large-scale software projects in a mature software development organization. The paper discusses the central issue for in-process metrics, the concept and definition of the model, and its use. Examples of metrics from real-life projects are provided.
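To make the idea of pairing an effort indicator with an outcome indicator concrete, the following is a minimal sketch, not the paper's actual formulation. It assumes hypothetical names (WeeklyStatus, assess_status, effort_tolerance) and a simple rule: a defect curve below baseline counts as good news only when test progress is also on plan.

```python
from dataclasses import dataclass

@dataclass
class WeeklyStatus:
    """Cumulative in-process measures for one week of the test phase."""
    planned_tests: int      # test cases planned to be executed by this week
    executed_tests: int     # test cases actually executed by this week
    baseline_defects: int   # cumulative defect arrivals on the baseline project
    actual_defects: int     # cumulative defect arrivals on the current project


def assess_status(week: WeeklyStatus, effort_tolerance: float = 0.9) -> str:
    """Cross-check an outcome indicator (defect arrivals) against an
    effort indicator (test progress) before declaring quality status.

    A defect curve below baseline is only a positive sign when testing
    effort is at least on track; otherwise the lower curve may simply
    reflect slow test progress.
    """
    effort_on_track = week.executed_tests >= effort_tolerance * week.planned_tests
    outcome_favorable = week.actual_defects <= week.baseline_defects

    if effort_on_track and outcome_favorable:
        return "good: testing on track and fewer defects than baseline"
    if effort_on_track and not outcome_favorable:
        return "unfavorable outcome: testing on track but more defects than baseline"
    if not effort_on_track and outcome_favorable:
        return "not interpretable yet: lower defect curve may be due to slow testing"
    return "worst case: testing behind schedule and defects above baseline"


if __name__ == "__main__":
    week = WeeklyStatus(planned_tests=500, executed_tests=350,
                        baseline_defects=120, actual_defects=95)
    print(assess_status(week))
    # -> not interpretable yet: lower defect curve may be due to slow testing
```

In this illustrative scenario the defect curve sits below the baseline, yet only 70% of the planned test cases have been executed, so the lower curve cannot be read as a sign of better quality.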