Developer Productivity with Designer/2000,
a research summary report


Contents
  INTRODUCTION
  AN INDEPENDENT STUDY
  METHOD
  RESULTS
  CONCLUSION
  APPENDIX A: PARTICIPATING PROJECTS
  APPENDIX B: REFERENCES

 

Introduction

The value of any investment in software development technology lies in the increased effectiveness of developers using the new technology. That effectiveness translates to productivity, but not just productivity in writing new code. Most projects are not ‘green field’ projects; they are add-ons, maintenance or enhancements, and so improved productivity must especially apply to these, the majority of projects. Conversely, if a software component is of poor quality, maintenance effort will increase and productivity in projects managing and building on that component will suffer, no matter how swift the initial implementation was. Hence the place to look for sustainable productivity is in mainstream maintenance and enhancement projects.

Of course, in the competitive world of software development tools, most effort is devoted to initial development, which is easier to demonstrate and initially more seductive. Little effort is expended on measuring the full-lifecycle effectiveness of such tools.

This report summarises a study which addresses exactly this problem. It was undertaken as a master’s degree research project at the Technical University of Munich, independently of Oracle Corporation or any other interested party. An overview of the results is presented below; it shows consistently high levels of productivity achieved by Oracle Designer/2000 users, at levels much higher than those of tools benchmarked in other studies.

An Independent Study

The study was undertaken in 1995 and 1996 by Jenns Hofmann of the Technical University of Munich, under the supervision of Professor Broy and of Professor Rudolph of the University of Bremerhaven, an international authority on software metrics.

Oracle Corporation helped Hofmann to get in touch with customers and reimbursed expenses he incurred in visiting these customers in Germany and Italy. A full list of the organisations that participated is given in Appendix A. All the projects measured used one of three configurations: Oracle Developer/2000 alone, the classic 4GL environment; Developer/2000 with partial use of Designer/2000, for modelling and generating the database only; or Developer/2000 with full use of Designer/2000, generating both server- and client-side code.

Method

The measure of software development productivity is generally taken to be the time spent by a developer on a unit of development. The problem remains, however, of how to measure a unit of development.

This problem of computer science, normally known as the software metrics problem, has been around for many years. Perhaps the most venerable and widely known metric is the lines-of-code (LOC) metric, whereby we simply count the number of lines of code written to quantify the unit of development.

However, this solution brings problems of its own:

  • do we count only procedural commands?
  • do we count comments and declarations too?
  • should there be a maximum and/or minimum length for each line?
  • how are macro-instructions and other code-dense techniques to be evaluated?
  • are lines of code in different languages equivalent?

The real problem with LOC as a metric is that, regardless of its shortcomings as a comparator, it is useless as a predictor: we cannot know the number of lines of code in a project until after it has been completed.

In the late seventies an IBM researcher, Allan Albrecht, proposed a unit of measurement called Function Points [a]. Function Point Analysis (FPA) measures the ‘size’ of a requirement based on externally measurable characteristics such as the number of interfaces, the number of entities, and so on. The essence of FPA is that it measures independently of implementation, and it measures on the basis of information known close to the beginning of the lifecycle. Over the last twenty years the use of FPA has grown internationally, and a community now collaborates to standardise function points (the International Function Point User Group, IFPUG).
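The mechanics of a function point count can be sketched in a few lines. The following is a minimal illustration only, not the full IFPUG procedure: it uses the average-complexity weights for the five IFPUG component types and omits the complexity classification and value-adjustment steps; the component counts are invented for the example.

```python
# Average-complexity weights for the five IFPUG component types
# (a simplification: real counts classify each component as
# low/average/high and apply a value adjustment factor afterwards).
AVG_WEIGHTS = {
    "external_inputs": 4,      # EI  - data entering the system
    "external_outputs": 5,     # EO  - reports, derived data leaving
    "external_inquiries": 4,   # EQ  - simple retrievals
    "internal_files": 10,      # ILF - logical files maintained inside
    "external_interfaces": 7,  # EIF - files referenced from outside
}

def unadjusted_fp(counts):
    """Sum each component count times its average weight."""
    return sum(n * AVG_WEIGHTS[kind] for kind, n in counts.items())

# Invented component counts for a small enhancement:
example = {
    "external_inputs": 10,
    "external_outputs": 6,
    "external_inquiries": 5,
    "internal_files": 8,
    "external_interfaces": 2,
}
print(unadjusted_fp(example))  # 10*4 + 6*5 + 5*4 + 8*10 + 2*7 = 184
```

The point of the exercise is that every quantity on the right-hand side is observable from the requirement itself, before any code is written.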

This study uses FPA as its metric because it provides a publicly documented and consistent way of comparing the projects studied. It also permits some comparisons, albeit less rigorous, with other published studies.

Results

Government, industry and software houses participated in the study, which covered the details of seventeen projects. The two charts included below graphically and dramatically illustrate the results.

Figure 1, below, illustrates the range of projects analysed. They cover a wide range of applications and effort: 24 to 421 Function Points and 12 to 4286 developer-hours.

The left side shows function points and effort (in person-hours) for projects using Developer/2000 only. This is the classic 4GL approach for a database project, with neither repository nor upperCASE tools. The right side shows the equivalent figures for projects using the fully integrated Designer/2000 and Developer/2000. The middle column represents projects which used Designer/2000 only for data design and generation.

The figures in this chart range from simple additional reports to very complex enhancements. By way of comparison, in a COBOL environment 421 Function Points might demand something of the order of more than 6,000 hours of development [b].


Figure 1 Project size and development effort


Figure 2, below, represents the productivity achieved. The metric is hours per function point (h/FP) - the mean time it takes a developer to deliver software with the functionality of one function point. The range of the projects is indicated by the bracket. The chart again distinguishes the three categories for the use of Designer/2000 (none, partial and full), and the number of projects studied in each class is indicated.

Figure 2 Productivity in Hours per Function Point

Using Developer/2000 alone, productivity is on average 4 h/FP. Full use of the integrated Designer & Developer environment requires less than 1 h/FP. As one might expect, the productivity realised by using Designer/2000 only for data modelling and generation lies somewhere in between these extremes.
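The practical meaning of these averages can be made concrete with a back-of-the-envelope sketch. The 200-FP project size below is invented for illustration; the 4 h/FP and 1 h/FP rates are the study's round figures (the report states the integrated environment needs "less than 1", so 1.0 is a conservative upper bound).

```python
# Estimated effort for a hypothetical 200-FP enhancement under the
# study's average productivity rates. The project size is invented;
# the rates are the round figures reported in this study.
project_size_fp = 200
rates_h_per_fp = {
    "Developer/2000 only": 4.0,        # classic 4GL approach
    "Full Designer/Developer": 1.0,    # upper bound; study says "< 1"
}

for env, rate in rates_h_per_fp.items():
    hours = project_size_fp * rate
    print(f"{env}: {hours:.0f} hours")
# Developer/2000 only: 800 hours
# Full Designer/Developer: 200 hours
```

On these numbers the same enhancement drops from roughly five person-months to little more than one, which is the fourfold gain the study reports.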

The conclusion to be drawn from these numbers is that a Developer/2000 shop can quadruple its productivity by fully exploiting Designer/2000. As discussed above, developers probably spend more than half their time on just such projects.

This work is based on the standards of IFPUG. Other studies using function points have been published. Table 1, below, combines results from this study and others.

Programming Environment         h/FP    Size of Projects (FP)   Source
Assembler, FORTRAN              >30     not specified           b
COBOL85, PL/1                   15-30   not specified           b
C++                             7-15    not specified           b
General 4GL                     5-10    not specified           b
All Projects                    15-25   <500                    c
Maintenance Projects            8-25    <160                    d
Oracle 4GL (Developer/2000)     2-6     up to 285               e
Integrated Oracle Environment   ≤1      up to 421               e

Table 1 Comparison of FPA Studies

Productivity is again measured in h/FP; where conversion was required, a person-month is assumed to be 150 hours.
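That conversion is a one-line calculation. The following sketch assumes the report's 150-hour person-month; the 10 FP per person-month input is an invented figure used only to show the arithmetic.

```python
# Convert a productivity figure quoted in FP per person-month into the
# h/FP metric used in Table 1, assuming 150 hours per person-month
# as this report does. The input figure below is invented.
HOURS_PER_PERSON_MONTH = 150

def fp_per_month_to_h_per_fp(fp_per_month):
    """Hours needed to deliver one function point."""
    return HOURS_PER_PERSON_MONTH / fp_per_month

print(fp_per_month_to_h_per_fp(10))  # 150 / 10 = 15.0 h/FP
```

Note that the two scales run in opposite directions: a higher FP-per-month figure means a lower (better) h/FP figure.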

The first five entries in Table 1 make no distinction between new applications and maintenance. The sixth entry refers to a specific study of maintenance projects, which hence provides an independent data point for comparison. The comparison is very favourable to the Designer/Developer environment.

Conclusion

Most figures published for programmer productivity are subject to vendor bias, are often based on ad-hoc data gathering, and usually refer to a single project or a single organisation. This study, however, was carried out independently, to the highest scientific standards, using a proven and publicly supported methodology on a significant sample.

The results speak for themselves. Developer/2000 alone delivers a highly effective programming environment by comparison with other similar studies. When Designer/2000 and Developer/2000 are integrated, the results improve dramatically. Designer/2000 sets a benchmark for software development productivity that is way beyond industry norms and out of reach of many alternative environments.

Appendix A: Participating Projects

The Technical University of Munich guaranteed anonymity to the project leaders participating in this work. The published results provide sufficient information on productivity without identifying individual projects. The organisations involved were drawn from government, industry and software houses:

from government:

  • Amt für Raumbezogene Informatik, Bozen Italy
  • Amt für Soziodemographische Informatik, Bozen Italy
  • Amt für Technische Informatik, Bozen Italy
  • Amt für Zweisprachigkeitsprüfungen, Bozen Italy
  • Bayerische Verwaltung für Ländliche Entwicklung, München Germany
  • Finanzministerium Baden-Württemberg, Stuttgart Germany

from industry:

  • Assecura AG, München Germany
  • Daimler Benz Aerospace Airbus GmbH, Bremen Germany
  • Daimler Benz Aerospace Airbus GmbH, Stade Germany
  • K&L Ruppert, Weilheim Germany
  • Sekurit Saint Gobain Deutschland GmbH, Aachen Germany
  • Südfleisch AG, München Germany

from the software sector:

  • CAD MAP, Berlin Germany
  • Dator Software Engineering GmbH, Bozen Italy
  • Ibykus GmbH, Erfurt Germany
  • Ingenieurbüro Pöltl, Herzogenaurach Germany
  • ORACLE Deutschland GmbH, München Germany
  • Pegas GmbH, München Germany
  • Schumann AG, Stuttgart Germany

Appendix B: References


a Albrecht, A.J. Measuring application development productivity, Proceedings
  of the joint SHARE/GUIDE/IBM Application Development Symposium, October 1979, pp. 83-92
b Capers Jones, ComputerWorld, 7 November 1988
c Capers Jones, Applied Software Measurement, McGraw-Hill 1991
d Capers Jones, Applied Software Measurement, McGraw-Hill 1991, Figure 3.19
e Hofmann, J. Empirische Untersuchung des Einflusses eines CASE-Werkzeuges
  auf die Produktivität von Wartungseingriffen in kommerziell genutzte
  Informationssysteme, Institute of Informatics, Technical University of Munich 1995
