#149 – SOFTWARE WASTAGE: TIME SPENT, CANCELED PROJECTS, CYBER ATTACKS, AND LITIGATION FOR POOR QUALITY – CAPERS JONES

Software demographics and software work patterns show interesting changes over the past few decades. A study in 1990 noted 116 software occupations, but only 5 involving quality and none involving risks or cyber-attacks. An updated study in 2016 noted 205 total software occupations, of which 32 were involved with risks, cyber-attacks, and quality. This change is due to the huge increase in the numbers and types of software cyber-attacks.

The overall software industry appears to have more cancelled projects, more time spent repairing defects, and more cyber-attacks than any other industry. It also has many lawsuits for poor quality or breach of contract.

Roughly 100 software engineering work days per year can be viewed as “wastage,” or non-productive work, spent on 1) finding and fixing bugs; 2) recovering from successful cyber-attacks; 3) working on projects that are cancelled; and 4) involvement in litigation for poor quality.

Only about 50 days per year go to actual software development on projects that are delivered and work well after deployment. So far as the author can tell, no other industry has such a poor ratio of wastage to productive work as software. This is why CEOs regard software as their least professional technical organization.

Introduction: Early Software Studies in the 1990s

In 1990 AT&T commissioned the author to carry out a study of software occupations in large organizations. Among the participants were AT&T itself, IBM, the Navy, Texas Instruments, and several banks and insurance companies.

The study had some interesting findings. Among them was the fact that none of the human resource (HR) organizations actually knew how many software personnel were employed, due in part to the use of generic job titles such as “member of the technical staff” that included software and non-software personnel without any way of knowing which was which.

This generic job title is widely used among technology companies and includes electrical engineers, chemical engineers, aerospace engineers, automotive engineers, telecommunications engineers, mechanical engineers, mathematicians, physicists, and also software engineers. None of the HR organizations had any breakdowns of specific kinds of engineers subsumed under this generic job title.

Since HR organizations are the source of U.S. employment statistics, the fact that none of them knew software employment totals raises questions about the accuracy of Bureau of Labor Statistics reports. The original study used interviews with local operating unit managers to find out how many software personnel worked in various business units.

This study noted a total of 116 software occupations. There were five occupations involved with quality at the time: 1) director of quality; 2) quality assurance specialists; and test specialists for 3) information systems, 4) embedded software, and 5) military software. At the time no occupations were found that dealt with software risks or cyber-attacks.

Another long-range study by the author consisted of collecting daily samples of data from volunteer software personnel at several companies about how they spent their time over a three-month period. The choices used by the volunteers included finding and fixing bugs, producing paper documents, developing new code, maintaining old code, attending meetings, taking classes, and several other common activities including holidays. The study also followed up later as to whether software projects in the study were completed or terminated.

Perhaps the most significant finding of this study was that software quality was so bad in 1990 that software engineers and programmers spent about 100 days per year finding and fixing bugs or working on projects that were later canceled. Productive work on successful projects totaled only about 50 days per year.

The author coined the term “wastage” for all of the non-productive activities that intrude into the actual development of successful software that is delivered to clients. Today the topics combined into the term “wastage” include:

Elements of Software Wastage Circa 2016

  1. Finding and fixing bugs before and after release
  2. Working on projects canceled due to negative ROI
  3. Recovering from successful cyber attacks
  4. Depositions and court time for lawsuits for poor quality

The author has also worked as an expert witness in a number of lawsuits where the litigation involved either outright cancellation or claims of poor quality and excessive defects. Obviously time spent being deposed or testifying on a lawsuit for poor quality is not productive work.

New Software Data Circa 2016

The author has continued with these studies and updated both occupation groups and work patterns from time to time.

New occupation data circa 2016 shows some striking differences from the older study of 1990. Due to the huge increase in cyber-attacks and the looming threat of possible cyber-warfare, occupations involving software risks and security have increased significantly. Today, in 2016, out of a total of 205 occupations noted (see Appendix A), jobs for risk, cyber security, and quality total 32. Table 1 shows these occupations:

Table 1: Risk, Cyber Security, and Quality Occupation Groups
1 Chief risk officers (CRO)
2 Chief security officer (CSO)
3 Complexity specialists
4 Cyber-attack specialists
5 Cyber-defense specialists
6 Cyber-warfare specialists
7 Director of Data Quality
8 Director of Quality
9 Ethical hacker
10 Governance specialist
11 Independent Verification and Validation (IV&V) specialists – defense projects
12 Quality assurance specialists – hardware  
13 Quality assurance specialists – software  
14 Quality assurance specialists – systems  
15 Quality circle specialists – Japan      
16 Quality function deployment (QFD) specialists
17 Quality measurement specialists    
18 Risk analysis specialists  
19 Risk management specialists
20 Security specialists – cyber-attacks
21 Security specialists – hardware  
22 Security specialists – office space
23 Security specialists – personnel
24 Security specialists – software
25 Test case design specialists
26 Testing specialists – automated
27 Testing specialists – manual
28 Users – acceptance test team
29 Users – beta test team
30 Users – governance team
31 Vice President of Quality
32 Zero-day security specialist

One reason for the large increase in risk, quality, and cyber-security occupations is that modern software is subject to a significant number of very serious risks, as shown in Table 2:

Table 2: Twenty Five Software Risks Circa 2016

  1. Project cancellations
  2. Project cost overruns
  3. Project schedule delays
  4. Creeping requirements (> 1% per month)
  5. Deferred features due to deadlines (>20% of planned features)
  6. High defect potentials
  7. Low defect removal efficiency (DRE)
  8. Latent security flaws in application when released
  9. Error-prone modules (EPM) in applications
  10. High odds of litigation for outsource contract projects
  11. Low customer satisfaction levels
  12. Low team morale due to overtime and over work
  13. Inadequate defect tracking which fails to highlight real problems
  14. Inadequate cost tracking which omits major expense elements
  15. Long learning curves by maintenance and support teams
  16. Frequent user errors when learning complex new systems
  17. Post-release cyber-attacks (denial of service, hacking, data theft, etc.)
  18. High cost of learning to use the application (COL)
  19. High cost of quality (COQ)
  20. High technical debt
  21. High maintenance costs
  22. High warranty costs
  23. Excessive quantities of rework
  24. Difficult enhancement projects
  25. High total cost of ownership (TCO)

Another significant reason for the increase in occupations involving cyber security and risk is the growing threat from many different forms of cyber-attack, and the corresponding need for various kinds of cyber-attack deterrence, as shown in Table 3:

Table 3: Forms of Cyber-Attack Deterrence and Cyber-Attacks
Cyber-Attack Deterrence    
1 Security static analysis before release
2 Security inspections
3 Security testing
4 Ethical hackers
5 Security-hardened hardware  
6 Security-hardened software  
7 Security-hardened offices  
 
Cyber-Attacks and Recovery    
1 Hacking, data theft
2 Denial of service attack
3 Virus, worm, botnet attack
4 Phishing/whale phishing
5 Cyber blackmail (locked files)
6 Infrastructure/equipment attacks

 

Early cyber-attacks mainly concentrated on identity theft or stealing funds. Today, in 2016, cyber-attacks also include denial of service; locking computers and blackmailing the owners to unlock them; attacks on physical manufacturing equipment; and, even worse, attacks on schools and hospitals. Even prison doors can be opened via cyber-attack. Since computers and software are now the main operating components of every business and government agency, nothing is safe in 2016.

As it happens, software wastage is proportional to the overall size of software applications measured in function points. It also depends strongly on the set of quality control methods utilized.

High-quality software applications utilize requirements models, certified reusable components, formal inspections, security specialists, static analysis, mathematical test case design, certified test personnel, and ISO quality standards, and many are at level 5 on the Software Engineering Institute (SEI) capability maturity model integrated (CMMI). Defect potentials are usually < 3.00 per function point. Defect removal efficiency of high-quality software applications averages over 96.00% and reaches about 99.65% at best.

Poor-quality software applications utilize little if any pre-test defect removal activity and only casual testing by uncertified development personnel. Test case design is not based on mathematical methods such as cause-effect graphs or design of experiments. Defect potentials are often > 5.00 per function point. Defect removal efficiency on poor-quality applications runs from below 80.00% up to a high of about 93.00% at best.
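
To make the difference concrete, here is a minimal sketch (Python) that estimates delivered defects for a 1,000 function point application under each scenario. The specific defect potentials and DRE values are illustrative assumptions chosen from within the ranges quoted in the two paragraphs above, not measured data.

```python
# Minimal sketch: delivered defects under the poor-quality and high-quality
# scenarios described above.  Defect potentials and DRE values are assumed
# figures taken from the ranges quoted in the text.

def delivered_defects(function_points, defect_potential_per_fp, dre):
    """Delivered defects = size x defect potential x fraction NOT removed."""
    total_potential = function_points * defect_potential_per_fp
    return total_potential * (1.0 - dre)

size_fp = 1_000
poor = delivered_defects(size_fp, 5.0, 0.85)   # > 5.00 defects/FP, ~85% DRE
high = delivered_defects(size_fp, 3.0, 0.97)   # < 3.00 defects/FP, ~97% DRE

print(f"Poor quality: about {poor:.0f} defects delivered")   # ~750
print(f"High quality: about {high:.0f} defects delivered")   # ~90
```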

Table 4 shows approximate numbers of waste days and productive days for software applications ranging from 10 function points to 100,000 function points under both poor quality and high quality scenarios:

Table 4: Software Work Patterns by Application Size

                     Poor Quality                    High Quality
Function          Waste      Productive          Waste      Productive
Points            Days       Days                Days       Days
10                  80         100                 20          152
100                 99          81                 45          134
1,000              111          61                 49          131
10,000             129          50                 55          124
100,000            160          25                 66          114
Average            116          63                 47          131

 

As can be seen, the high-quality projects have at least 50% more productive days per year for small applications and more than four times as many productive days for large systems. The overall average shows more than twice as many productive days as waste days for high-quality software, which is also cheaper and faster to build.
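
These comparisons follow directly from Table 4; the small check below (values copied from the table's productive-day columns) reproduces the arithmetic.

```python
# Quick arithmetic check of the ratios quoted above, using the productive-day
# columns of Table 4 (poor quality vs. high quality).

table4_productive_days = {   # function points: (poor quality, high quality)
    10:      (100, 152),
    100:     ( 81, 134),
    1_000:   ( 61, 131),
    10_000:  ( 50, 124),
    100_000: ( 25, 114),
}

for size, (poor, high) in table4_productive_days.items():
    gain = (high - poor) / poor * 100
    print(f"{size:>7,} FP: {gain:4.0f}% more productive days with high quality")
# 10 FP -> about 52% more; 100,000 FP -> about 356% more,
# i.e. roughly 4.6 times as many productive days for the largest systems.
```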

Overall, for the entire software industry, finding and fixing bugs is the #1 cost driver, so better quality control is on the critical path for shortening software schedules and lowering software costs during development and after release.

National Economic Impact of Software Wastage

As already stated the software industry has a poor reputation for schedule delays, cost overruns, cancelled projects, and poor quality. In recent years excessive vulnerabilities to cyber-attacks have also tarnished the reputation of the software industry and added to wastage.

Table 5 attempts to scale up the topics discussed in this paper and show a rough approximation of overall wastage for the U.S. software industry as a whole:

Table 5: U.S. National Economic Impact of Software Wastage Circa 2016

                                 Software        Monthly     Monthly            Annual
                                 Staff           Salary      U.S. Cost          U.S. Cost
U.S. Development staff           1,900,000       $10,000     $19,000,000,000    $228,000,000,000
U.S. Maintenance staff           2,150,000       $9,000      $19,350,000,000    $232,200,000,000
TOTALS                           4,050,000       $9,469      $38,350,000,000    $460,200,000,000

                                                 Average     Total              Annual $ per
                                 Projects        Funct. Pt.  Function Pts.      Function Pt.
Development projects             1,350,000       275         371,250,000        $614.14
Legacy maintenance projects      2,950,000       140         413,000,000        $562.23
TOTALS                           4,300,000       182         784,250,000        $586.80

% on bug repairs                 40%                                            $184,080,000,000
% on cyber-attacks               11%                                            $50,622,000,000
% on cancelled projects           8%                                            $36,816,000,000
% on litigation                   3%                                            $13,806,000,000
TOTAL U.S. WASTAGE               62%                                            $285,324,000,000

Bug repairs per FP               $234.72
Cyber-attacks per FP             $64.55
Cancelled projects per FP        $46.94
Litigation per FP                $17.60
U.S. WASTAGE PER FUNCTION POINT  $363.82

Note 1: Bug repairs, cyber-attacks, and cancelled projects are all symptoms of poor quality.
Note 2: About 5% of outsource contracts go to litigation for poor quality or cancellation.
Note 3: Better quality control would speed up schedules, lower costs, and raise value by about 50%.
Note 4: Static analysis, inspections, models, mathematical test case design, and certified reuse are critical.

Out of a total of about $460 billion for overall U.S. software costs circa 2016, some $285 billion goes to all forms of wastage, or about 62% of total software expenses.
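
The wastage figures are simple products of the cost shares and the national totals in Table 5; the sketch below reproduces the arithmetic (all inputs are values copied from the table, not independent data).

```python
# Rough reproduction of the Table 5 wastage arithmetic.

annual_us_software_cost = 460_200_000_000     # development + maintenance, per Table 5
total_function_points   = 784_250_000

wastage_shares = {                            # share of total annual cost, per Table 5
    "Bug repairs":        0.40,
    "Cyber-attacks":      0.11,
    "Cancelled projects": 0.08,
    "Litigation":         0.03,
}

total_wastage = 0.0
for item, share in wastage_shares.items():
    cost = annual_us_software_cost * share
    total_wastage += cost
    print(f"{item:18s} ${cost/1e9:7.2f} B   ${cost/total_function_points:7.2f} per FP")

print(f"{'TOTAL WASTAGE':18s} ${total_wastage/1e9:7.2f} B   "
      f"${total_wastage/total_function_points:7.2f} per FP")
# Totals: about $285.32 B, or roughly $363.82 per function point (62% of costs).
```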

So far as the author can determine, no other industry has so many failures and cancelled projects, so many bugs in delivered projects, so many successful cyber-attacks, and so much litigation for poor quality and breach of contract.

Reducing Wastage in Future U.S. Software Projects

The main reason for the high percentage of wastage in the U.S. software industry is poor quality control. An underlying reason for poor quality control is that hardly any company actually knows which technology stacks will achieve high quality. An underlying reason for that lack of quality knowledge is that software measurements and metrics are so poor that software quality costs are not even accurately covered in the software literature.

For over 60 years two common software metrics, “lines of code” and “cost per defect,” have distorted reality and concealed the true economic value of high software quality. The “cost per defect” metric penalizes quality and is cheapest for the buggiest software. The “lines of code” metric makes requirements and design bugs invisible, penalizes modern programming languages, and makes older low-level languages seem better than they are.
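
A small hypothetical example illustrates the “lines of code” distortion. The language densities and costs below are assumptions chosen only to show the mechanism: because non-code work (requirements, design, documents) is roughly fixed, the version with more lines of code looks cheaper per line even though it costs more per function point.

```python
# Hypothetical illustration of the LOC distortion (all numbers are assumed,
# not measured data): the same 100 function point application coded in a
# low-level and a high-level language.

FP = 100
loc_per_fp = {"low-level language": 128, "high-level language": 53}  # assumed code density
non_code_cost = 50_000          # requirements, design, documents (same for both)
cost_per_loc_written = 5        # assumed coding cost per line

for lang, density in loc_per_fp.items():
    loc = FP * density
    total = non_code_cost + loc * cost_per_loc_written
    print(f"{lang:20s} {loc:6d} LOC  total ${total:,}  "
          f"${total / loc:.2f}/LOC  ${total / FP:,.2f}/FP")

# The low-level version costs more in total and per function point, yet its
# cost per line of code is lower simply because it has more lines -- which is
# how the LOC metric makes low-level languages "seem better than they are".
```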

The function point metric, used in this paper, is the most accurate available metric for studying software quality and software economics. The new software metric for non-functional requirements called “SNAP” may add value in the future but has little empirical quality data available in 2016, while function points have over 40 years of successful usage and over 75,000 available benchmarks.

The two best metrics for software quality analysis are function points for measuring “defect potentials” combined with “defect removal efficiency” (DRE).

The defect potential metric originated within IBM circa 1970 and shows the total number of potential defects from all sources. Table 6 shows U.S. averages for defect potentials in 2016:

Table 6: Average Software Defect Potentials circa 2016 for the United States

  • Requirements 0.70 defects per function point
  • Architecture 0.10 defects per function point
  • Design 0.95 defects per function point
  • Code 1.15 defects per function point
  • Security code flaws 0.25 defects per function point
  • Documents 0.45 defects per function point
  • Bad fixes 0.65 defects per function point
  • Total 4.25 defects per function point

Averages are misleading, since individual projects can be more than 50% above or below these values.

As can be seen, software defects originate in multiple sources and not just in source code. For example, requirements defects contribute about 20% of delivered defects and are much harder to eliminate than code defects; requirements defects often cannot be removed via testing. Another difficult defect category to remove is “bad fixes,” or new bugs accidentally introduced by bug repairs. About 7% of U.S. bug repairs have new bugs in them.
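
The 7% bad-fix rate compounds: each round of repairs injects a few more defects that themselves need repair. The small sketch below shows the effect; the starting backlog of 1,000 bugs is an assumed figure, only the 7% rate comes from the text.

```python
# Illustration of bad-fix injection: repairing a backlog when about 7% of
# repairs introduce a new defect (rate from the text; backlog size assumed).

initial_bugs = 1_000.0
bad_fix_rate = 0.07

total_repairs, pending = 0.0, initial_bugs
while pending >= 1.0:
    total_repairs += pending
    pending *= bad_fix_rate      # new bugs injected by this round of fixes

print(f"Repairs needed: about {total_repairs:.0f}")
# Roughly 1,075 repairs for 1,000 original bugs -- the geometric series
# 1000 / (1 - 0.07) -- so bad fixes alone add about 7.5% more repair work.
```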

The overall 2016 U.S. averages for defect removal efficiency (DRE) are shown in Table 7 below:

Table 7: U.S. Software Average DRE Ranges 2016
(application sizes in powers of 10 function points)

Function
Points          Best        Average       Worst
1              99.90%       97.00%       94.00%
10             99.00%       96.50%       92.50%
100            98.50%       95.00%       90.00%
1,000          96.50%       94.50%       87.00%
10,000         95.00%       89.50%       83.50%
100,000        93.50%       86.00%       78.00%
Average        97.07%       93.08%       87.50%

 

The calculations for defect removal efficiency (DRE) are fairly simple. All bugs found during development are recorded, and bugs reported by users in the first 90 days of use are added to the total. If developers found 950 bugs and users report 50 bugs in the first three months of use, the defect total is 1,000 bugs and DRE is obviously 95%.

Of course bug reports after 90 days continue to occur but it is necessary to have a fixed calendar point in order to calculate DRE in a consistent fashion.
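
The calculation described above is simple enough to express directly; a minimal sketch follows (the function name is illustrative).

```python
# Defect removal efficiency as described above: defects removed before
# release divided by total defects known at the 90-day point.

def defect_removal_efficiency(found_before_release, found_in_first_90_days):
    total = found_before_release + found_in_first_90_days
    return found_before_release / total

print(f"DRE = {defect_removal_efficiency(950, 50):.0%}")   # 95%, as in the example above
```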

The technology stack topics needed to achieve high software quality levels, and thereby reduce or even eliminate wastage, are shown in Table 8.

Table 8: Technology Stack to Reduce Software Wastage

Low Quality                                        High Quality

Complex structure that does not allow              Effective decomposition into discrete,
effective decomposition                            buildable components
Maximum cyclomatic complexity > 50                 Maximum cyclomatic complexity < 10
Unreliable reusable materials                      Certified reusable materials
Unplanned requirements > 1.5% per month            Unplanned requirements < 0.25% per month
Many security vulnerabilities                      Very few security vulnerabilities
(> 0.3 per function point)                         (< 0.01 per function point)
Multiple error-prone modules                       Zero error-prone modules
Poor maintainability (1 star)                      Good maintainability (5 stars)
No static analysis of legacy code                  Static analysis of 100% of legacy code
No renovation of legacy code before                Full renovation of legacy code before
enhancements                                       enhancements
No SEMAT usage                                     SEMAT used for project
CMMI levels = 0 or 1                               CMMI levels = 3 to 5
Annual Training Assumptions for Managers and Technical Staff
(Courses or self-study courses)

Low Quality                                        High Quality

No curriculum planning                             Effective curriculum planning
Managers = < 2 days                                Managers = > 5 days
Developers = < 2 days                              Developers = > 5 days
Maintainers = < 2 days                             Maintainers = > 5 days
Testers = < 2 days                                 Testers = > 5 days
SQA = < 2 days                                     SQA = > 5 days
Webinars = < 2 per year                            Webinars = > 8 per year
Professional Certification Assumptions for Managers and Technical Staff
(Percent of full-time employees attaining professional certification)

Low Quality                                        High Quality

Managers = < 10%                                   Managers = > 30%
Developers = < 10%                                 Developers = > 20%
Maintainers = < 10%                                Maintainers = 15%
Testers = < 15%                                    Testers = > 50%
SQA = < 10%                                        SQA = > 25%
Annual Technology Investments for Management and Technical Staff
(Training, tools, and methodologies)

Low Quality                                        High Quality

Managers: < $1,000                                 Managers: > $3,000
Developers: < $1,500                               Developers: > $3,500
Maintainers: < $1,000                              Maintainers: > $3,000
Testers: < $1,500                                  Testers: > $3,000
SQA: < $500                                        SQA: > $1,500
TOTAL: < $6,000                                    TOTAL: > $14,000
AVERAGE: < $1,100                                  AVERAGE: > $2,800
PERCENT: < 58%                                     PERCENT: > 147%
   
Software Project Management Assumptions

Low Quality                                        High Quality

No value analysis                                  Formal value analysis adjusted for risks
No risk analysis                                   Early risk analysis
No risk solutions                                  Early risk solutions
No risk monitoring                                 Continuous risk monitoring
Inadequate project corrections                     Rapid project corrections
Informal manual cost estimates                     Automated parametric cost estimates
Inaccurate manual schedule estimates               Automated parametric schedule estimates
Grossly inaccurate progress tracking               Formal, accurate progress tracking
Grossly inaccurate cost tracking                   Formal, accurate cost tracking
No project office for projects > 1000 FP           Automated project office for projects > 1000 FP
Ineffective or erroneous status monitoring         Effective “dashboard” status monitoring
No internal benchmark data available               Formal internal benchmark comparisons
No external benchmark data utilized                Formal external benchmark comparisons
                                                   (Namcook, ISBSG, others)
Casual change control                              Formal change control
Inadequate governance for financial software       Effective governance for financial software
Quality Method and Practice Assumptions

Quality Estimation and Measurement Assumptions

Low Quality                                        High Quality

No quality estimates                               Automated early quality estimates
Defect tracking: post-release only                 Defect tracking: from project start
No test coverage measurements                      Automated test coverage measurements
Unknown test coverage                              Test coverage > 95%
No cyclomatic complexity measures                  Full cyclomatic complexity measures
Unknown cyclomatic complexity (might top 50)       Cyclomatic complexity normally < 10
Bad-fix injection > 20%                            Bad-fix injection < 3%
No quality measures                                Full quality measures:
                                                     defect potentials; defect detection;
                                                     defect removal efficiency; defect severity levels;
                                                     duplicate defects; bad-fix injection;
                                                     invalid defects; defects by origin;
                                                     false positives; defect repair cost;
                                                     defect repair time; root causes
No cost of quality (COQ)                           Full cost of quality (COQ)
No technical debt measures                         Technical debt plus COQ
Lines of code (hazardous metric)                   Function points
Cost per defect (hazardous metric)                 Defect repair costs per function point
No standards used                                  Adherence to ISO 9126 quality standards,
                                                   ISO 14764 maintainability standards,
                                                   ISO 14143 functional sizing, and
                                                   ISO 12207 software lifecycle
     
Defect Prevention Method and Practice Assumptions

Low Quality                                        High Quality

None                                               Joint Application Design (JAD)
                                                   Quality Function Deployment (QFD)
                                                   Kaizen/kanban
                                                   Six Sigma for software
Development Method and Practice Assumptions

Low Quality                                        High Quality

Methods chosen by popularity and not by            Methods chosen based on technical
evaluation of results                              evaluation and empirical data
CMMI 0 or 1                                        CMMI 3, 4, 5
Waterfall                                          Agile < 1000 FP
Agile                                              PSP/TSP > 1000 FP
Uncertified reuse                                  RUP
                                                   XP
                                                   Hybrid
                                                   Certified reuse
                                                   Model-based development
Pre-Test Defect Removal Methods and Practice Assumptions

Low Quality                                        High Quality

No SQA                                             Certified SQA
None                                               Static analysis – text
                                                   FOG index – requirements and design
                                                   Automated correctness proofs
                                                   Requirements inspections
                                                   Design inspections
                                                   Architecture inspections
                                                   Code inspections
                                                   Test-case inspections
                                                   Refactoring
                                                   Static analysis – 100% of code
                                                   Error-prone module removal from legacy code
Testing Method and Practice Assumptions

Low Quality                                        High Quality

Manual testing                                     Manual + automated testing
Uncertified test teams                             Certified test teams
Informal test case design                          Formal test case design using
                                                   mathematical models:
                                                   1) Cause-effect graphs
                                                   2) Design of experiments
Informal, ineffective test library control         Formal, effective test library control
> 50 builds during test                            < 10 builds during test
No risk-based tests                                Risk-based tests
No reusable tests                                  Many reusable tests
Unit test *                                        Unit test *
Function test *                                    Function test **
Regression test *                                  Regression test **
System test *                                      Performance test **
Beta test ***                                      Security test **
Acceptance test ***                                Usability test **
                                                   System test **
                                                   Beta test ***
                                                   Acceptance test ***
No test coverage analysis                          Test coverage analysis (percent of code
                                                   executed during testing; risks, paths,
                                                   inputs, and outputs covered by testing)

(* = tests by developers)
(** = tests by experts)
(*** = tests by customers)

 

Table 8 illustrates an endemic problem of the software industry. Reducing wastage and improving software quality requires a great deal of technical knowledge that can only be derived from accurate measurements of size, quality, costs, and technology effectiveness.

The software industry has suffered with inaccurate metrics and sloppy and incomplete measurement practices for over 60 years. This is a key factor in software wastage.

If medicine had the same poor combination of bad metrics and incomplete measurements as software does, then medical doctors would probably not be using sterile surgical procedures even in 2016 and might still be treating infections with blood-letting and leeches instead of antibiotics.

A final recommendation by the author is that every reader of this paper should also acquire and read Paul Starr’s book The Social Transformation of American Medicine, Perseus Group, 1982. This book won a well-deserved Pulitzer Prize.

Only about 150 years ago medicine had a similar combination of poor measurements and inaccurate metrics combined with mediocre professional training. Starr’s book on how medical practice improved to reach today’s high standards is a compelling story with many topics that are relevant to software.

Summary and Conclusions

The software industry is one of the largest and wealthiest industries in human history. Software has created many multi-billion-dollar companies such as Apple, Facebook, Google, and Microsoft. Software has created many millionaires and also quite a few billionaires such as Bill Gates, Larry Ellison, Sergey Brin, Jeff Bezos, and Mark Zuckerberg.

However software quality remains mediocre in 2016 and software “wastage” remains alarmingly bad. These conditions are somewhat like smallpox and diphtheria. They can be prevented by vaccination or successfully treated.

In order to reduce software wastage the software industry needs to eliminate inaccurate metrics that distort reality. The software industry needs to adopt functional metrics and also capture the true and complete costs of software development and maintenance.

Software also needs to focus on defect prevention and pre-test defect removal, and of course use formal mathematical methods for test case design, plus more certified test personnel.

It would be technically possible to reduce software wastage by over 90% within 10 years if there were a rapid and effective method of technology transfer that could reach hundreds of companies and thousands of software personnel.

However as of 2016 there are no really effective channels that can rapidly spread proven facts about better software quality control. Most software journals don’t even publish accurate quantitative data.

Universities at both graduate and undergraduate levels often still use “lines of code” and “cost per defect” and hence are providing disinformation to students instead of solid facts.

Professional societies such as the IEEE, ACM, SIM, PMI, IFPUG, etc. provide valuable networks and social services for members but not reliable quantitative data. Also, it would be more effective if the software professional societies followed the lead of the American Medical Association (AMA) and provided reciprocal memberships and better sharing of information.

The major standards for software quality and risk, such as ISO 9000/9001 and ISO 31000, provide useful guidelines, but there is no empirical quantified data showing that following them produces tangible benefits for either risk reduction or quality.

Achieving levels 3 through 5 on the Software Engineering Institute’s capability maturity model integrated (CMMI) does yield tangible improvements in quality. However the SEI itself does not collect or publish quantitative data on quality or productivity. (The Air Force gave the author a contract to demonstrate the value of higher CMMI levels.)

The software journals, including refereed software journals, contain almost no quantitative data at all. The author’s first job out of college was editing a medical journal. About a third of the text in medical articles discusses the metrics and measurement methods used and how data was collected and validated. By contrast the author has read over a dozen refereed software articles that used the “lines of code” metric without even defining whether physical lines or logical statements were used, and these can vary by over 500%. Some articles did not even mention which programming languages were used and these can vary by over 2000%. Compared to medical journals refereed software journals are embarrassingly amateurish even in 2016 when it comes to metrics, measures, and quantitative results.

Software quality companies in testing and static analysis make glowing claims about their products but produce no facts or proven quantitative information about defect removal efficiency (DRE).

Software education and training companies teach some useful specific courses, but all of them lack an effective curriculum that includes defect prevention, pre-test defect removal, effective test technologies, and the measurement of defect potentials and defect removal efficiency (DRE), which should be basic topics in all software quality curricula.

Software quality conferences often have entertaining speakers but suffer from a shortage of factual information and solid quantitative data about methods to reduce defect potentials and raise defect removal efficiency.

There are some excellent published books on software quality, but only a few of these have sold more than a few thousand copies in an industry with millions of practitioners. For example, Paul Strassmann’s The Squandered Computer covers software economic topics quite well; Steve Kan’s Metrics and Models in Software Quality Engineering does an excellent job on quality metrics and measures; Mike Harris, David Herron, and Stasia Iwanacki’s The Business Value of IT is another solid title with software economic facts; Alain Abran’s Software Metrics and Metrology covers functional metrics; and Olivier Bonsignour and the author’s The Economics of Software Quality has quantified data on the effectiveness of various methods, tools, and programming languages.

On a scale of 1 to 10 the quality of medical information is about a 9.5; the quality of legal information is about a 9; the quality of information in electronic and mechanical engineering is also about a 9; for software in 2016 the overall quality of published information is maybe a 3.5. In fact some published data that uses “cost per defect” and “lines of code” has a negative value of perhaps -5, due to the distortion of reality by these two common but inaccurate metrics.

Wastage, poor quality, poor metrics, poor measurements, and poor technology transfer are all endemic problems of the software industry. This is not a good situation in a world of accelerating cyber-attacks and looming cyber warfare.

All of these endemic software problems are treatable problems that could be eliminated if software adopts some of the methods used by medicine as discussed in Paul Starr’s book The Social Transformation of American Medicine.

References and Readings on Software Quality

Abran, Alain; Software Estimating Models; Wiley-IEEE Computer Society; 2015.
Abran, Alain; Software Metrics and Metrology; Wiley-IEEE Computer Society; 2010.
Abran, Alain; Software Maintenance Management: Evolution and Continuous Improvement; Wiley-IEEE Computer Society; 2008.
Abran, A. and Robillard, P.N.; “Function Point Analysis, An Empirical Study of its Measurement Processes”; IEEE Transactions on Software Engineering, Vol. 22, No. 12; Dec. 1996; pp. 895-909.
Albrecht, Allan; AD/M Productivity Measurement and Estimate Validation; IBM Corporation, Purchase, NY; May 1984.
Austin, Robert D.; Measuring and Managing Performance in Organizations; Dorset House Press, New York, NY; 1996; ISBN 0-932633-36-6; 216 pages.
Beck, Kent; Test-Driven Development; Addison Wesley, Boston, MA; 2002; ISBN 10: 0321146530; 240 pages.
Black, Rex; Managing the Testing Process: Practical Tools and Techniques for Managing Hardware and Software Testing; Wiley; 2009; ISBN-10 0470404159; 672 pages.
Boehm, Barry Dr.; Software Engineering Economics; Prentice Hall, Englewood Cliffs, NJ; 1981; 900 pages.
Brooks, Fred: The Mythical Man-Month, Addison-Wesley, Reading, Mass., 1974, rev. 1995.
Chelf, Ben and Jetley, Raoul; “Diagnosing Medical Device Software Defects Using Static Analysis”; Coverity Technical Report, San Francisco, CA; 2008.
Chess, Brian and West, Jacob; Secure Programming with Static Analysis; Addison Wesley, Boston, MA; 2007; ISBN 13: 978-0321424778; 624 pages.
Cohen, Lou; Quality Function Deployment – How to Make QFD Work for You; Prentice Hall, Upper Saddle River, NJ; 1995; ISBN 10: 0201633302; 368 pages.
Crosby, Philip B.; Quality is Free; New American Library, Mentor Books, New York, NY; 1979; 270 pages.
Charette, Bob; Software Engineering Risk Analysis and Management; McGraw Hill, New York, NY; 1989.
Charette, Bob; Application Strategies for Risk Management; McGraw Hill, New York, NY; 1990.
Constantine, Larry L; Beyond Chaos: The Expert Edge in Managing Software Development;   ACM Press, 2001.
DeMarco, Tom; Peopleware: Productive Projects and Teams; Dorset House, New York, NY; 1999; ISBN 10: 0932633439; 245 pages.
DeMarco, Tom; Controlling Software Projects; Yourdon Press, New York; 1982; ISBN 0-917072-32-4; 284 pages.
Everett, Gerald D. And McLeod, Raymond; Software Testing; John Wiley & Sons, Hoboken, NJ; 2007; ISBN 978-0-471-79371-7; 261 pages.
Ewusi-Mensah, Kweku; Software Development Failures; MIT Press, Cambridge, MA; 2003; ISBN 0-262-05072-2; 276 pages.
Flowers, Stephen; Software Failures: Management Failures; Amazing Stories and Cautionary Tales; John Wiley & Sons; 1996.
Gack, Gary; Managing the Black Hole: The Executives Guide to Software Project Risk; Business Expert Publishing, Thomson, GA; 2010; ISBN10: 1-935602-01-9.
Gack, Gary; Applying Six Sigma to Software Implementation Projects; http://software.isixsigma.com/library/content/c040915b.asp.
Galorath, Dan and Evans, Michael; Software Sizing, Estimation, and Risk Management: When      Performance is Measured Performance Improves; Auerbach; Philadelphia, PA; 2006.
Garmus, David and Herron, David; Function Point Analysis – Measurement Practices for Successful Software Projects; Addison Wesley Longman, Boston, MA; 2001; ISBN 0-201-69944-3;363 pages.
Gibbs, T. Wayt; “Trends in Computing: Software’s Chronic Crisis”; Scientific American Magazine, 271(3), International edition; pp 72-81; September 1994.
Gilb, Tom and Graham, Dorothy; Software Inspections; Addison Wesley, Reading, MA; 1993; ISBN 10: 0201631814.
Harris, Michael D.S., Herron, David, and Iwanacki, Stasia; The Business Value of IT; CRC Press, Auerbach Publications; 2009.
Hill, Peter R. Practical Software Project Estimation; McGraw Hill, 2010
Hill, Peter; Jones Capers; and Reifer, Don; The Impact of Software Size on Productivity; International Software Standards Benchmark Group (ISBSG), Melbourne, Australia, September 2013.
Howard, Alan (Ed.); Software Metrics and Project Management Tools; Applied Computer Research (ACR); Phoenix, AZ; 1997; 30 pages.
Humphrey, Watts; Managing the Software Process; Addison Wesley, Reading, MA; 1989.
International Function Point Users Group (IFPUG); IT Measurement – Practical Advice from the Experts; Addison Wesley Longman, Boston, MA; 2002; ISBN 0-201-74158-X; 759 pages.
Jacobsen, Ivar, Griss, Martin, and Jonsson, Patrick; Software Reuse – Architecture, Process, and Organization for Business Success; Addison Wesley Longman, Reading, MA; ISBN 0-201-92476-5; 1997; 500 pages.
Jacobsen, Ivar et al; The Essence of Software Engineering; Applying the SEMAT Kernel; Addison Wesley Professional, 2013.
Jones, Capers: Software Risk Master (SRM) tutorial; Namcook Analytics LLC, Narragansett RI, 2015.
Jones, Capers: Software Defect Origins and Removal Methods; Namcook Analytics LLC; Narragansett RI, 2015.
Jones, Capers: The Mess of Software Metrics; Namcook Analytics LLC, Narragansett RI; 2015.
Jones, Capers; The Technical and Social History of Software Engineering; Addison Wesley, 2014.
Jones, Capers and Bonsignour, Olivier; The Economics of Software Quality; Addison Wesley, Boston, MA; 2011; ISBN 978-0-13-258220-9; 587 pages.
Jones, Capers; Software Engineering Best Practices; McGraw Hill, New York; 2010; ISBN 978-0-07-162161-8; 660 pages.
Jones, Capers; Applied Software Measurement; McGraw Hill, 3rd edition, 2008; ISBN 978-0-07-150244-3; 662 pages.
Jones, Capers; Critical Problems in Software Measurement; Information Systems Management Group, 1993; ISBN 1-56909-000-9; 195 pages.
Jones, Capers; Software Productivity and Quality Today — The Worldwide Perspective; Information Systems Management Group, 1993; ISBN 1-56909-001-7; 200 pages.
Jones, Capers; Assessment and Control of Software Risks; Prentice Hall, 1994; ISBN 0-13-741406-4; 711 pages.
Jones, Capers; New Directions in Software Management; Information Systems Management Group; ISBN 1-56909-009-2; 150 pages.
Jones, Capers; Patterns of Software System Failure and Success; International Thomson Computer Press, Boston, MA; December 1995; 250 pages; ISBN 1-850-32804-8; 292 pages.
Jones, Capers: “Sizing Up Software;” Scientific American Magazine, Volume 279, No. 6, December 1998; pages 104-111.
Jones, Capers; Conflict and Litigation Between Software Clients and Developers; Software Productivity Research technical report; Narragansett, RI; 2007; 65 pages
Jones, Capers; Software Quality – Analysis and Guidelines for Success; International Thomson Computer Press, Boston, MA; ISBN 1-85032-876-6; 1997; 492 pages.
Jones, Capers; Estimating Software Costs; 2nd edition; McGraw Hill, New York; 2007; 700 pages.
Jones, Capers; “The Economics of Object-Oriented Software”; SPR Technical Report; Software Productivity Research, Burlington, MA; April 1997; 22 pages.
Jones, Capers; “Software Project Management Practices: Failure Versus Success”; Crosstalk, October 2004.
Jones, Capers; “Software Estimating Methods for Large Projects”; Crosstalk, April 2005.
Kan, Stephen H.; Metrics and Models in Software Quality Engineering, 2nd edition; Addison Wesley Longman, Boston, MA; ISBN 0-201-72915-6; 2003; 528 pages.
Land, Susan K; Smith, Douglas B; Walz, John Z; Practical Support for Lean Six Sigma Software Process Definition: Using IEEE Software Engineering Standards; WileyBlackwell; 2008; ISBN 10: 0470170808; 312 pages.
McConnell, Steve; Software Project Survival Guide; Microsoft Press; 1997.
Mosley, Daniel J.; The Handbook of MIS Application Software Testing; Yourdon Press, Prentice Hall; Englewood Cliffs, NJ; 1993; ISBN 0-13-907007-9; 354 pages.
Nandyal; Raghav; Making Sense of Software Quality Assurance; Tata McGraw Hill Publishing, New Delhi, India; 2007; ISBN 0-07-063378-9; 350 pages.
Pressman, Roger; Software Engineering – A Practitioner’s Approach; McGraw Hill, NY; 6th edition, 2005; ISBN 0-07-285318-2.
Radice, Ronald A.; High Quality Low Cost Software Inspections; Paradoxicon Publishing, Andover, MA; ISBN 0-9645913-1-6; 2002; 479 pages.
Royce, Walker E.; Software Project Management: A Unified Framework; Addison Wesley Longman, Reading, MA; 1998; ISBN 0-201-30958-0.
Starr, Paul; The Social Transformation of American Medicine; Basic Books; Perseus Group; 1982; ISBN 0-465-07834-2. NOTE: This book won a Pulitzer Prize and is highly recommended as a guide for improving both professional education and professional status. There is much of value for the software community.
Strassmann, Paul; Information Payoff; Information Economics Press, Stamford, Ct; 1985.
Strassmann, Paul; Governance of Information Management: The Concept of an Information Constitution; 2nd edition; (eBook); Information Economics Press, Stamford, Ct; 2004.
Strassmann, Paul; Information Productivity; Information Economics Press, Stamford, Ct; 1999.
Weinberg, Gerald M.; The Psychology of Computer Programming; Van Nostrand Reinhold, New York; 1971; ISBN 0-442-29264-3; 288 pages.
Weinberg, Gerald M; Becoming a Technical Leader; Dorset House; New York; 1986; ISBN 0-932633-02-1; 284 pages.
Weinberg, Dr. Gerald; Quality Software Management – Volume 2 First-Order Measurement; Dorset House Press, New York, NY; ISBN 0-932633-24-2; 1993; 360 pages.
Wiegers, Karl A; Creating a Software Engineering Culture; Dorset House Press, New York, NY; 1996; ISBN 0-932633-33-1; 358 pages.
Wiegers, Karl E.; Peer Reviews in Software – A Practical Guide; Addison Wesley Longman, Boston, MA; ISBN 0-201-73485-0; 2002; 232 pages.
Yourdon, Ed; Outsource: Competing in the Global Productivity Race; Prentice Hall PTR, Upper Saddle River, NJ; ISBN 0-13-147571-1; 2005; 251 pages.
Yourdon, Ed; Death March – The Complete Software Developer’s Guide to Surviving “Mission Impossible” Projects; Prentice Hall PTR, Upper Saddle River, NJ; ISBN 0-13-748310-4; 1997; 218 pages.

Appendix A: Software Occupation Groups Circa 2016
Capers Jones, VP and CTO, Namcook Analytics LLC

Note: The risk, cyber security, and quality occupations and activities in this list are the 32 shown separately in Table 1.
1 Accessibility specialist
2 Accounting specialists
3 Agile coaches
4 Agile metric specialists
5 Agile Scrum masters
6 Architect (information)
7 Architect (cloud)
8 Architect (data)
9 Architects (enterprise)
10 Architects (software)
11 Architects (systems)
12 Assessment specialists (SEI CMMI)
13 Assessment specialists (TICKIT)
14 Audit specialists
15 Baldrige Award specialists
16 Baseline specialists
17 Benchmark specialists
18 Business analysts (BA)
19 Business process re-engineering specialists (BPR)
20 Chief data officer (CDO)
21 Chief information officer (CIO)
22 Chief innovation Officer (CINO)
23 Chief knowledge officer (CKO)
24 Chief risk officers (CRO)
25 Chief security officer (CSO)
26 Chief technical officer (CTO)
27 Cloud development specialists
28 CMMI audit specialists
29 Complexity specialists
30 Component development specialists
31 Configuration control specialists
32 Contract administration specialist
33 Consulting specialists
34 Container development specialists
35 Content manager
36 Cost estimating specialists
37 COTS acquisition specialists
38 Curriculum planning specialists
39 Customer liaison specialists
40 Customer support specialists
41 Cyber-attack specialists
42 Cyber-defense specialists
43 Cyber-warfare specialists
44 Data analyst
45 Data base administration specialists
46 Data center specialists
47 Data modeler
48 Data quality specialists
49 Data specialists (big data)
50 Data warehouse specialists
51 Decision support specialists
52 Development process specialists
53 DevOps specialists
54 Digital marketing specialist
55 Director of Data Quality
56 Director of Quality
57 Distributed system specialists
58 Domain specialists
59 Earned value (EVM) specialists
60 Education specialists
61 E-learning specialists
62 Embedded systems specialists
63 Enterprise resource planning (ERP) specialists
64 ERP data migration specialists
65 ERP deployment specialists
66 ERP porting specialists
67 Ethical hacker
68 Executive assistants
69 Frame specialists
70 Frameworks specialist
71 Front-end designer
72 Front-end developer
73 Full-stack developer (DevOps)
74 Function point specialists (COSMIC)
75 Function point specialists (FISMA)
76 Function point specialists (IFPUG)
77 Function point specialists (NESMA)
78 Function point specialists (other)
79 Generalists
80 Globalization and nationalization specialists
81 Governance specialist
82 Graphical user interface (GUI) specialists
83 Human factors specialists
84 Information engineering specialists (IE)
85 Instruction specialists – human factors
86 Instruction specialists – management
87 Instruction specialists – technical
88 Integration specialists
89 Intellectual property (IP) specialists
90 Internet specialists
91 ISO certification specialists
92 IV&V specialists – defense projects
93 Joint application design (JAD) specialists
94 Kaizen specialists
95 Kanban specialists
96 Key process indicator (KPI) specialists
97 Knowledge domain specialists
98 Legacy renovation specialists
99 Library specialists – books
100 Library specialists – programming libraries
101 Library specialists – test libraries
102 Litigation support specialists
103 Machine learning specialists
104 Maintenance specialists
105 Manager – 1st line
106 Manager – 2nd line
107 Manager – 3rd line
108 Marketing specialists
109 Measurement specialists
110 Members of the technical staff
111 Metric specialists
112 Microcode specialists
113 Mobile development specialist
114 Model specialists – design
115 Model specialists – requirements
116 Multi-media specialists
117 Network maintenance specialists
118 Network specialists – (LAN)
119 Network specialists – (WAN)
120 Network specialists (wireless)
121 Neural net specialists
122 Object-oriented specialists
123 Outsource evaluation specialists
124 Package evaluation specialists
125 Pair-programming specialists
126 Patent attorney (software with innovative features)
127 Pattern specialists
128 Performance specialists
129 Process improvement specialists
130 Productivity specialists
131 Programmers – embedded
132 Programmers – general
133 Programmers – microcode
134 Programming language specialists
135 Project cost analysis specialists
136 Project cost estimating specialists
137 Project lead
138 Project managers
139 Project office specialists
140 Project planning specialists
141 Proposal specialist
142 Quality assurance specialists – hardware  
143 Quality assurance specialists – software  
144 Quality assurance specialists – systems  
145 Quality circle specialists – Japan  
146 Quality function deployment (QFD) specialists
147 Quality measurement specialists    
148 Reengineering specialists
149 Requirements engineering specialists
150 Reuse specialists – architecture
151 Reuse specialists – code
152 Reuse specialists – designs
153 Reuse specialists – requirements
154 Reuse specialists – test materials
155 Reverse engineering specialists
156 Risk analysis specialists
157 Risk management specialists
158 Sales support specialists
159 Secretaries
160 Security specialists – cyber-attacks
161 Security specialists – hardware  
162 Security specialists – office space
163 Security specialists – personnel
164 Security specialists – software
165 SEMAT specialist
166 SEO consultant
167 SNAP metric specialists
168 Social media specialist
169 Software engineers
170 Software installation specialists
171 Software Sales specialists
172 Software withdrawal/decommissioning specialists
173 Story-point specialists
174 Systems administrator
175 Systems analysis specialists
176 Systems engineers
177 Systems support specialists
178 Technical illustration specialists
179 Technical support specialists
180 Technical translation specialists
181 Technical writing specialists
182 Test case design specialists
183 Testing specialists – automated
184 Testing specialists – manual
185 Use-case specialists
186 User liaison specialist
187 Users – acceptance test team
188 Users – agile embedded  
189 Users – beta test team  
190 Users – change control team  
191 Users – design review team
192 Users – governance team
193 Users – prototype team
194 Users – requirements team
195 UX designer
196 Vice President of Information Systems
197 Vice President of Quality
198 Vice President of Software Engineering
199 Virtual reality specialists
200 Web analytics specialists
201 Web development specialists
202 Web master
203 Web page design specialists
204 WordPress developer
205 Zero-day security specialist

 BIO:

Capers Jones, Vice President and CTO, Namcook Analytics LLC
Email: Capers.Jones3@Gmail.com
Web:   www.Namcook.com

 
