#63 – ROI IN SOFTWARE PM TOOLS & SOFTWARE QC – CAPERS JONES

The construction of large software systems is one of the most hazardous activities of the business world.  The failure or cancellation rate of large software systems is over 35%.  Of the large systems that are completed, about two thirds experience schedule delays and cost overruns.  Yet some large systems are finished early, meet their budgets, and have few if any quality problems.  How do successful projects differ from projects that fail?  Better project management and better quality control are the most important differences between success and failure in the software world.  Thus excellence in software project management has a very favorable return on investment (ROI) due to cost avoidance.

INTRODUCTION
Software development is a troubling technology.  Software is custom designed, hand coded, and therefore highly labor intensive and error prone.  As a result large software projects are among the most expensive and troublesome undertakings of any commercial product.  For example, large software systems cost far more to build and take much longer to construct than the office buildings occupied by the companies that have commissioned the software.  Really large software systems in the 250,000 function point size range can cost more than building a domed football stadium, a 50-story skyscraper, or a 90,000 ton cruise ship.

Not only are large systems expensive, but they also have one of the highest failure rates of any manufactured object in human history.  The term “failure” refers to projects that are canceled without completion due to cost or schedule overruns, or which run later than planned by more than 25%.

The author often works as an expert witness in litigation for software projects that were either terminated due to cost overruns or did not work properly when delivered.  Depositions and discovery documents in these cases reveal four endemic problems:

  1. Poor estimation practices before the projects began.
  2. Poor quality control practices during development.
  3. Poor change control practices during development.
  4. Poor status tracking by managers during development.

Thus the combination of poor quality and poor project management is the cause of far too many software failures in the modern world.

Let us consider what the phrase “large systems” means in the context of six size plateaus, each separated by an order of magnitude.  In these discussions we will assume an average burdened cost of $10,000 per month and U.S. norms of 132 work hours per month:

1 Function Point (55 Java statements)

There are few software applications of this size except small enhancements to larger applications, or minor personal applications.  The schedules for such small programs usually run from a day to perhaps a week.  The effort ranges from a few hours to a day.  Costs are a few hundred dollars.  Risks are low, apart from ordinary coding bugs.  Defect potentials are below 1.0 per function point and defect removal efficiency (DRE) often tops 99%.

10 Function Points (550 Java statements)

This is the typical size of end-user applications, and also a very frequent size plateau for small enhancements to existing software.  Development schedules are usually less than a month.  The effort for these projects runs from perhaps a week to just over two weeks.  Costs are in the range of $3,500.  Risks are low.  Defect potentials are perhaps 1.25 per function point and defect removal efficiency (DRE) often tops 99%.

100 Function Points (5,500 Java statements)

This is the practical upper limit of end-user applications.  There are comparatively few stand-alone applications of this size in 2014, but 35 years ago there were a number of DOS applications in this size range, such as early Basic interpreters.  Development schedules are usually between four and six months.  The effort for these projects runs from about three months to perhaps six months.  Typical costs for this size range are about $50,000.  Defect potentials are about 1.75 per function point and defect removal efficiency can top 98% if static analysis is used before testing.  Risks are low, but poor quality can be an issue.

1,000 Function Points (55,000 Java statements)

This is a fairly common entry-level size range for many commercial and internal smartphone software applications.  It is also a common size range for small web applications, and it is where many agile projects occur.  Schedules for software projects of this size usually range from 12 to 16 months.  Effort for these projects ranges between about 50 and 75 staff months.  Costs are in the ballpark of $600,000.  Defect potentials can top 3.0 per function point and defect removal efficiency is usually about 96%.  Risks of cost overruns, schedule delays, and poor quality are significant.  Formal estimates and formal project tracking are needed.  Quality control requires pre-test inspections, static analysis, and formal testing.  Security flaws start to become troublesome.

10,000 Function Points (550,000 Java statements)

Applications of this size are usually termed “systems” because they are far too large for individual programs.  This size range is often troubled by cost and schedule overruns and by outright cancellations.  Development teams of 60 to 75 people of multiple disciplines are common, so communication and interface problems are endemic.  (Agile is not optimal for large systems so other methods such as Rational Unified Process (RUP) or Team Software Process (TSP) are the modern replacements for waterfall development.)

Effort for these systems can top 2,000 staff months.  Costs can top $25,000,000.  Defect potentials can approach 4.0 bugs per function point and defect removal efficiency is often below 90% unless a rigorous combination of pre-test inspections, static analysis, and formal testing by certified test personnel is deployed.  Security flaws are endemic at this size range.

Software schedules in this size plateau run from two to more than four years, although the initial planning for applications of this size range tends to naively assume schedules of 18 months or less.  Cost overruns and schedule delays are endemic.  To succeed in this size range, excellent quality control and capable project management are absolute necessities.

Outright failure or cancellation of these systems can approach 30% while schedule delays and cost overruns exceed 75%.

100,000 Function Points (5,500,000 Java statements)

Applications that approach or exceed 100,000 function points in size are among the most expensive and risky constructs in the history of industry.  Applications of 100,000 function points are about the size of IBM’s MVS operating system.  Even larger are ERP packages, which top 250,000 function points, and some major defense applications.

Schedules for these massive systems range from 60 to more than 72 calendar months, although most initial schedule plans are excessively optimistic at about 48 calendar months, a target that is never met.

Staffing for these systems can top 600 people and 50 occupation groups.  Effort for these massive systems can top 40,000 staff months and costs can top $500,000,000.

Defect potentials can approach 6.00 per function point while defect removal efficiency levels are often below 85%.  Only a rigorous combination of defect prevention, pre-test inspections, and static analysis, combined with formal testing by certified test personnel, is adequate for quality control.  Security flaws can number in the hundreds and are endemic at this size range.  Worse, these large applications usually control valuable assets such as financial information or defense data and hence are prime targets for security attacks.

The odds of outright failure top 50% at 100,000 function points.  Of the surviving projects, over 90% of these massive applications run late by more than 12 months and exceed planned budgets by over $50,000,000.  Make no mistake, large software systems are among the most error-prone and expensive artifacts in human history.
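Before turning to outcomes, it may help to make the size and cost assumptions concrete.  The following minimal sketch (in Java, with helper names of my own choosing) encodes the conversions used throughout this article: roughly 55 Java statements per function point, a $10,000 burdened cost per staff month, and 132 work hours per month.

```java
/** Size and cost conversions using the assumptions stated above:
 *  ~55 Java statements per function point, $10,000 burdened cost per
 *  staff month, 132 work hours per month.  Illustrative only. */
public class SizingSketch {
    static final double JAVA_STATEMENTS_PER_FP = 55.0;
    static final double DOLLARS_PER_STAFF_MONTH = 10_000.0;
    static final double WORK_HOURS_PER_MONTH = 132.0;

    static double javaStatements(double functionPoints) {
        return functionPoints * JAVA_STATEMENTS_PER_FP;
    }

    static double costDollars(double staffMonths) {
        return staffMonths * DOLLARS_PER_STAFF_MONTH;
    }

    public static void main(String[] args) {
        // A 10 function point enhancement is roughly 550 Java statements.
        System.out.printf("10 FP = about %,.0f Java statements%n", javaStatements(10));
        // "A week to just over two weeks" of effort is roughly 0.35 staff months,
        // which matches the $3,500 cost quoted for the 10 function point plateau.
        System.out.printf("0.35 staff months = about $%,.0f%n", costDollars(0.35));
    }
}
```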

Using these six size ranges, table 1 shows the approximate frequency of various kinds of outcomes, ranging from finishing early to total cancellation.   Table 1 is derived from a total of about 20,000 projects between 1974 and 2014.

 

Table 1:  Software Project Outcomes by Size of Project Circa 2014
          (Probability of Selected Outcomes)

Function      Early      On-Time    Delayed    Canceled     Total
Points
        1     28.00%     70.00%      1.50%      0.50%     100.00%
       10     23.00%     73.00%      3.00%      1.00%     100.00%
      100     12.00%     73.00%      7.00%      8.00%     100.00%
    1,000      8.00%     65.00%     15.00%     12.00%     100.00%
   10,000      4.00%     20.00%     43.00%     33.00%     100.00%
  100,000      2.00%     11.00%     32.00%     55.00%     100.00%
  Average     12.83%     52.00%     16.92%     18.25%     100.00%

As can easily be seen from table 1, small software projects are successful in the majority of instances, but the risks and hazards of cancellation or major delays rise quite rapidly as the overall application size goes up.  Indeed, the development of large applications in excess of 10,000 function points is one of the most hazardous and risky business undertakings of the modern world.

This is why the value of excellent quality control and of excellent project management tools correlates directly with application size in terms of function points.  Below 100 function points, project management tools are optional and quality control need only be average.  Above 1,000 function points, both quality control and project management tools are mandatory.  Above 10,000 function points, only top-ranked quality control and a full suite of modern project management tools can lead to successful outcomes.
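For readers who want to apply Table 1’s probabilities directly, here is a minimal lookup sketch in Java.  The class and method names are mine, and the plateau-based floor lookup is just one plausible way to use the table; the percentages themselves come straight from Table 1.

```java
import java.util.NavigableMap;
import java.util.TreeMap;

/** Illustrative lookup of Table 1 outcome probabilities by project size. */
public class OutcomeOdds {
    // Probability of finishing early or on time, by function point plateau (Table 1).
    private static final NavigableMap<Integer, Double> EARLY_OR_ON_TIME = new TreeMap<>();
    static {
        EARLY_OR_ON_TIME.put(1,       0.28 + 0.70);
        EARLY_OR_ON_TIME.put(10,      0.23 + 0.73);
        EARLY_OR_ON_TIME.put(100,     0.12 + 0.73);
        EARLY_OR_ON_TIME.put(1_000,   0.08 + 0.65);
        EARLY_OR_ON_TIME.put(10_000,  0.04 + 0.20);
        EARLY_OR_ON_TIME.put(100_000, 0.02 + 0.11);
    }

    /** Returns the Table 1 probability for the nearest plateau at or below the given size. */
    static double probabilityOnTimeOrEarly(int functionPoints) {
        return EARLY_OR_ON_TIME.floorEntry(Math.max(1, functionPoints)).getValue();
    }

    public static void main(String[] args) {
        // A 10,000 function point system has only about a 24% chance of timely delivery.
        System.out.printf("10,000 FP: %.0f%% chance of on-time or early delivery%n",
                100 * probabilityOnTimeOrEarly(10_000));
    }
}
```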

Table 2 shows the approximate cost of development assuming a constant burdened cost rate of $10,000 per month:

 

Table 2:  Software Application Costs by Size

Size in            Application Cost       Application Cost
Function           at $10,000 per         per Function
Points             Staff Month            Point

        1                    $307                   $307
       10                  $5,900                   $590
      100                 $68,750                   $688
    1,000                $837,008                   $837
   10,000             $27,002,794                 $2,700
  100,000            $383,661,586                 $3,837
  Average             $68,596,057                 $1,493

It is obvious from tables 1 and 2 that the value of quality goes up with overall application size.  For small projects below 100 function points, both marginal quality and marginal management are survivable.  Above 1,000 function points, excellence in quality and excellence in project management are mandatory for success, and also to ensure that the projects are actually completed and not terminated due to massive cost and schedule overruns.
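The arithmetic behind Table 2 is simple: total cost is staff effort times the $10,000 burdened rate, and cost per function point is total cost divided by size.  A minimal sketch that reproduces the table’s last column and makes the diseconomy of scale visible:

```java
/** Reproduces the "cost per function point" column of Table 2. */
public class CostPerFunctionPoint {
    public static void main(String[] args) {
        int[] sizes = {1, 10, 100, 1_000, 10_000, 100_000};    // function points
        double[] totalCost = {307, 5_900, 68_750, 837_008,     // Table 2 totals,
                              27_002_794, 383_661_586};        // in dollars
        for (int i = 0; i < sizes.length; i++) {
            // Diseconomy of scale: the per-unit cost rises as application size grows.
            System.out.printf("%,9d FP  ->  $%,.0f per function point%n",
                    sizes[i], totalCost[i] / sizes[i]);
        }
    }
}
```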

(Note: Because manual function point counting is slow and expensive, manual counts are seldom used above 10,000 function points.  The sizes shown are based on the patent-pending sizing method in Software Risk Master (SRM).  Examples of large systems in the 100,000 function point size range and larger include Windows 7 and 8; IBM’s operating systems; ERP packages such as SAP and Oracle; major defense systems such as the World Wide Military Command and Control System (WWMCCS); and some specialized security applications.)

Not only do excellent quality control and excellent project management benefit costs, but they also benefit schedules.  Figure 1 shows the approximate schedule duration in calendar months for 10,000 function point projects with best case quality control and project management; average case; and worst case:

Figure 1:  Schedules in Calendar Months for 10,000 Function Points

The schedule range runs from about 32 months for the best case with excellent quality and management, through 36 months for the average case, to 43 calendar months for the worst case with poor quality control and poor project management.
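One well-known Jones rule of thumb approximates schedule in calendar months as the function point total raised to a power in roughly the 0.32 to 0.45 range, with better-run projects earning lower exponents.  The sketch below picks illustrative exponents that happen to reproduce Figure 1’s three cases; the specific exponent values are assumptions for this example, not published constants.

```java
/** Schedule approximation: months ~= functionPoints ^ exponent (Jones rule of thumb).
 *  The exponents below are illustrative choices that reproduce Figure 1. */
public class SchedulePowerLaw {
    static double scheduleMonths(int functionPoints, double exponent) {
        return Math.pow(functionPoints, exponent);
    }

    public static void main(String[] args) {
        int size = 10_000;
        System.out.printf("Best case:    %.0f months%n", scheduleMonths(size, 0.376)); // ~32
        System.out.printf("Average case: %.0f months%n", scheduleMonths(size, 0.389)); // ~36
        System.out.printf("Worst case:   %.0f months%n", scheduleMonths(size, 0.408)); // ~43
    }
}
```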

Patterns of Project Management Tool Usage

Although software project failures outnumber successes, some large systems are built successfully.  From assessment and benchmark studies, two critical factors differentiate successful software applications from failures:  1) project management; 2) quality control.

Table 3:  Differences between Successful and Unsuccessful Software Projects
          in the 10,000 Function Point Size Category

Successful Projects                    Unsuccessful Projects
Software cost estimating tools         Manual estimating methods
Project planning tools                 No project planning tools
Quality estimating tools               No quality estimates performed
Project management tools               Partial use of project management tools
Project cost tracking tools            Partial use of cost tracking tools
Project progress tracking tools        No progress tracking tools
Accurate progress reports              Progress reports that conceal problems
Formal project milestone tracking      Informal milestone tracking
Defect prevention such as JAD          No defect prevention
Effective requirements gathering       Ineffective requirements gathering
Formal requirements inspections        No requirements inspections
Formal design inspections              No design inspections utilized
Formal code inspections                No code inspections utilized
Pre-test static analysis               No pre-test static analysis
Formal testing                         Informal testing
Certified test personnel               No certified test personnel

To understand the significance of table 3 it is necessary to know why software projects run late or are canceled.  Large systems usually run late because they contain so many defects or errors that they do not work, so testing stretches out toward infinity.  Unfortunately, if this situation is not detected until testing begins, it is too late for corrective actions to be fully successful.

Once testing starts on a defective application, the project will enter a nightmare cycle of finding hundreds of problems, fixing them as fast as possible, and then re-testing the application.  This cycle can run on for months.  In fact, many troubled projects appear to be on schedule until testing begins, when it is suddenly discovered that the application just does not work.

Because software defects are the source of schedule and cost overruns, it is important to know how many defects are likely to occur.  It is also important to know the most effective ways of preventing and removing software defects.  This explains why successful software projects use estimating tools that can predict defect volumes.

If defects are not found until late in the project during testing, it is too late to bring the project back under control.  This explains why successful software projects use formal design and code inspections prior to testing.  Inspections are about twice as effective as most forms of testing in finding software defects, and they take place much earlier.

Project management and quality control are closely intertwined on successful large software projects.  The secret of success is to keep quality under control at all times, and to use state of the art predictive and tracking tools.

Return on Investment in Project Management and Quality Control

The standard economic definition of productivity is “goods or services produced per unit of labor or expense.”  Software bugs or defects are not “goods or services” but rather unfortunate accidents.  The metric “cost per defect” does not measure economic value, and it also has a fatal flaw: cost per defect is always cheapest where the largest number of defects are found.  Therefore this metric penalizes quality.  It cannot safely be used to show the real economic value of quality.
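A small worked example makes the flaw concrete.  The numbers below are hypothetical: a fixed cost for writing and running test cases plus a per-bug repair cost.  Because the fixed cost is amortized over more bugs, the low-quality project reports the *lower* cost per defect even though its total cost of quality, and its cost per function point, are far higher.

```java
/** Illustrates why "cost per defect" penalizes quality (hypothetical figures). */
public class CostPerDefectFallacy {
    static void report(String name, int functionPoints, int defectsFound,
                       double fixedTestCost, double repairCostPerDefect) {
        double total = fixedTestCost + defectsFound * repairCostPerDefect;
        System.out.printf("%s: $%,.0f per defect, but $%,.2f per function point%n",
                name, total / defectsFound, total / functionPoints);
    }

    public static void main(String[] args) {
        // Same size, same fixed testing cost, same repair cost per bug.
        report("Low quality ", 1_000, 500, 50_000, 500); // $600/defect, $300.00/FP
        report("High quality", 1_000,  50, 50_000, 500); // $1,500/defect, $75.00/FP
    }
}
```

The buggy project looks 2.5 times cheaper by cost per defect while actually costing four times more per function point, which is exactly the distortion described above.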

To measure the economic value of quality, it is necessary to show the overall costs of the entire application with and without excellence in quality control (and project management).

In this analysis simple values and ratios are used to illustrate the point that good management and good quality control are valuable.  The data shown is not exact, but uses even numbers to make calculations easy to understand.  The same sequence of calculations can be carried out with local data that matches local results.

You also need to know the costs of project management and quality tools and methods.

  • Software cost estimating tools range from open-source to about $5,000 per seat.  They are typically in the 1,000 to 3,000 function point size range.
  • Project management tools range from open-source to about $5,000 per seat.  They are typically in the 1,000 to 3,000 function point size range.
  • Cost and schedule tracking tools range from open-source to about $7,000 per seat.  They are typically in the 2,000 to 4,000 function point size range.
  • Static analysis tools range from open-source to about $1,000 per seat.  They are typically in the 1,500 to 4,000 function point size range.
  • Defect tracking tools range from open-source to about $1,000 per seat.  They are typically in the 200 to 700 function point size range.
  • Quality estimation is usually performed by specialists such as Namcook Analytics.
  • Inspections are free except for labor costs.

Let us make some simplifying assumptions.  Assume that the average cost to build a successful software application in the 10,000 function point size range is about $1,000 per function point, or $10,000,000.  However, unsuccessful projects that are canceled usually accrue costs of more than $1,500 per function point and are about 12 months late when they are canceled.  In other words, canceled software projects in the 10,000 function point range cost about $15,000,000 and obviously are a write-off without any positive value.

If the failing project becomes subject to breach of contract litigation, the damage costs can be much higher than the cost of the software itself.  But litigation damages are difficult to predict ahead of time.

Fully equipping software project managers on 10,000 function point projects with state of the art software cost and quality estimating tools is not very expensive: the cost would normally be less than $10.00 per function point, or only about $100,000 in total.  The cost per manager would only be in the range of $10,000 or less.

The costs of performing formal design and code inspections run to about $100 per function point, or roughly $1,000,000 for a 10,000 function point application.

Thus an investment in state of the art project management tools and quality control nets out to about $110 per function point.  But this investment could raise the probability of a successful outcome from less than 25% to more than 75%.

Thus an investment of about $110 per function point could head off a possible loss or write-off of $1,500 per function point.  This would generate an approximate return on investment of $13.64 for every dollar spent, which is quite a good ROI.

Further, applications in the 10,000 function point size range have shorter schedules and lower costs if effective project management and quality control are part of the development process.  Even with $110 per function point for inspections and management tools, the probable development cost of this hypothetical application might drop to only $850 per function point, or $8,500,000 in total.

Under this scenario an investment of $110 per function point would yield savings of $150 per function point.  Here the ROI is only $1.36 for every dollar spent.  But if you consider the entire picture, greater value emerges.

  • Investment in inspections and project management tools = $110 per function point.
  • Cost per function point for failing projects without inspections or good project management tools = $1,500.
  • Cost per function point with average quality control and project management tools = $1,000.
  • Cost per function point with inspections and good project management tools = $850.
  • Probability of failure without inspections, static analysis, and good project management tools = 75%.
  • Probability of failure with inspections, static analysis, and good project management tools = 5%.
  • Cost per function point of failed applications of the same size = $1,500
  • Cost per function point difference between success and failure ($1,500 – $850 = $650).
  • Most probable ROI ($650 / $110 = $5.91)
  • Maximum ROI ($1,500 / $110) = $13.64
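One way to combine the figures in the list above is an expected-value comparison.  The sketch below is a minimal illustration, not the author’s published calculation; it uses only the per-function-point costs and failure probabilities just listed.

```java
/** Expected-cost comparison for a 10,000 function point project,
 *  using the per-function-point figures listed above. */
public class RoiSketch {
    public static void main(String[] args) {
        double fp = 10_000;
        double investment     =   110 * fp;  // inspections + management tools
        double failureCost    = 1_500 * fp;  // write-off cost of a canceled project
        double successWithout = 1_000 * fp;  // average project, no investment
        double successWith    =   850 * fp;  // well-run project, with investment

        // Expected costs with the stated failure probabilities (75% vs. 5%).
        double expectedWithout = 0.75 * failureCost + 0.25 * successWithout;
        double expectedWith    = 0.05 * failureCost + 0.95 * successWith + investment;

        System.out.printf("Expected cost without investment: $%,.0f%n", expectedWithout);
        System.out.printf("Expected cost with investment:    $%,.0f%n", expectedWith);
        System.out.printf("Expected saving per dollar invested: $%.2f%n",
                (expectedWithout - expectedWith) / investment);  // about $3.48
    }
}
```

Under these assumptions the expected saving is about $3.48 per dollar invested, which lands between the $1.36 and $13.64 bounds computed above.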

Good project management and good quality control raise the probability of successful projects and lower the probability of failure and delay.  In other words, well-managed, high-quality projects in the 10,000 function point range will often approach the $850 per function point cost, and need never drift into the $1,500 per function point range associated with inept management and inadequate quality.

In short, a combination of excellent project management and excellent quality control lowers the odds of failure and also lowers the cost of development.  If the higher probability of success is included in the ROI calculations, the results are quite favorable.  Above all, project managers should be alert to the fact that canceled projects not only cost much more than successful projects, but are a complete write-off with no residual value.

Table 4 illustrates the fact that the ROI of excellence in quality control and excellence in project management rises in direct proportion to application size.  It shows the approximate ROI for both quality and project management for applications between 1 function point and 100,000 function points:

 

Table 4:  ROI of Excellent Quality and Project Management

Application Size     ROI of        ROI of Excellent     ROI of Total
in Function          Excellent     Project              Project
Points               Quality       Management           Excellence

        1             $1.50             $1.00                $2.50
       10             $2.00             $1.50                $3.50
      100             $5.00             $3.00                $8.00
    1,000            $10.00             $7.50               $17.50
   10,000            $15.00            $12.50               $27.50
  100,000            $25.00            $20.00               $45.00
  AVERAGE             $9.75             $7.58               $17.33

As can be seen from table 4, the value of high quality and good project management goes up rapidly as application size increases.

The reason that the ROI of software quality is slightly larger than the ROI of project management is that the “defect potential” of software projects rises steeply with application size.  Table 5 shows the approximate total numbers of defects that will need to be found and removed for six software size ranges between 1 and 100,000 function points:

 

Table 5:  Software Defect Potentials for Six Application Size Ranges

Function     Req.        Architecture   Design      Code        Doc.      TOTAL
Points       Defects     Defects        Defects     Defects     Defects   Defects

        1          3             1            1           1          1          7
       10         17             4           10          12          5         48
      100        115            29           87         118         26        375
    1,000        746           221          777       1,175        139      3,058
   10,000      4,472         1,653        6,929      11,755        738     25,547
  100,000    195,878        37,356       61,751     117,546      3,921    416,452
  Average     33,539         6,544       11,593      21,768        805     74,248

 

(Note:  Predicting defect potentials and defect removal efficiency (DRE) are both standard features of the author’s Software Risk Master (SRM) estimating tool.  The values shown in table 5 assume average teams, the Java language, and iterative development.  Needless to say the results would vary with team experience, programming languages, and methodologies.)
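Table 5 connects to the DRE percentages quoted earlier through a standard identity: delivered defects are approximately the defect potential times (1 − DRE).  A minimal sketch applying that identity to three of the table’s rows, using the DRE levels the text cites for each plateau:

```java
/** Delivered defects ~= defect potential * (1 - DRE), using Table 5 totals
 *  and the DRE levels quoted in the text for each size plateau. */
public class DeliveredDefects {
    static long delivered(long defectPotential, double dre) {
        return Math.round(defectPotential * (1.0 - dre));
    }

    public static void main(String[] args) {
        System.out.println("1,000 FP at 96% DRE:   " + delivered(3_058, 0.96));   // ~122
        System.out.println("10,000 FP at 90% DRE:  " + delivered(25_547, 0.90));  // ~2,555
        System.out.println("100,000 FP at 85% DRE: " + delivered(416_452, 0.85)); // ~62,468
    }
}
```

At 100,000 function points, even 85% removal efficiency leaves tens of thousands of latent defects, which is why rigorous pre-test defect removal is described above as mandatory at the larger plateaus.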

Returning to the main theme, the presence of a suite of project management tools is not, by itself, the main differentiating factor between successful and unsuccessful software projects.  The primary reason for the differences noted between failing and successful projects is that the project managers who utilize a full suite of management tools are usually better trained and have a firmer grasp of the intricacies of software development than the managers who lack adequate management tools.

Bringing a large software project to a successful conclusion is a very difficult task which is filled with complexity.  The managers who can deal with this complexity recognize that some of the cost and resource scheduling calculations exceed the ability of manual methods.

Managers on failing projects, on the other hand, tend to have a naïve belief that project planning and estimating are simple enough to be done using rough rules of thumb and manual methods.  These same managers tend to skimp on quality control and bypass inspections.

SUMMARY AND CONCLUSIONS
Tools by themselves do not make successful projects.  Capable managers and capable technical personnel are also needed.  However, attempting to construct large software projects without adequate management and quality control tools is not a safe undertaking.  No one in the industrialized world today would dream of starting a large engineering project without adequate tools for project management.  Yet software projects whose total staffing compares to many large-scale engineering projects are routinely started using “back of the envelope” planning and estimating methods.  It is no wonder that failures are so common.

Software itself is intangible, but the schedules and cost estimates for software can be highly tangible.  Software projects are still subject to the basic laws of manufacturing and software needs to be placed on a firm engineering basis.  It is professionally embarrassing that software remains troublesome at this point in the 21st century.

Project managers are the primary key to software project success or failure.  To a very large degree, the sophistication or lack of sophistication of the project management tool suite will determine whether software projects will succeed, experience major cost and schedule overruns, or fail completely.


Copyright 2014 by Capers Jones.
All Rights Reserved.

Bio:
Capers Jones, VP and CTO
Namcook Analytics LLC
Web: www.Namcook.com
Blog: http://Namcookanalytics.com
Email: Capers.Jones3@gmail.com
