Occamal concepts have naturally emerged in software efforts from nearly the beginning, and limited formulations of those concepts have been promoted and valued. The purpose of this section is to identify those early expressions for what they are; they show the widespread, ever-springing nature of the desire to arrive at some principle that will yield optimal programs.
Software tools
One of the earliest pure-software efforts, the work to build the early language FORTRAN, was clearly a step in an Occamal direction. By creating a machine-independent language in the first place, programmers could concentrate on expressing their programs in a single language, FORTRAN, instead of the many machine languages then in existence. Since FORTRAN is closer to the problem domain of most scientific programming, it usually takes fewer statements to express a program in FORTRAN than in assembly language; therefore, writing in FORTRAN eliminates redundancy and is thus better in Occamal terms. The relationship between the statements of the language and the machine language is centralized in one place, the compiler. The compiler’s run-time library was just as important in this respect – it provided a set of commonly used functions that needed to be written only once, and could then be used whenever needed. In the early days of FORTRAN, for example, machines typically did not provide native support for floating-point operations. The programmer could write his floating-point formulas and calculations without thinking about the machine, knowing that if the machine his program eventually ran on had no floating-point support, the compiler would generate the right calls and the run-time library would do the work.
All computer languages take programs in an Occamal direction. Just as FORTRAN abstracted and centralized the floating point operations commonly needed in scientific computing, so did COBOL abstract and centralize processing records of data and the BCD arithmetic needed for financial calculations.
Similarly, the concept of the subroutine and of shared subroutine libraries arose very early. One of the important functions performed by early software societies (IBM’s SHARE, the ACM and IEEE) was to collect, refine and standardize libraries of functions widely used by groups of people. Early software companies owed their existence to the time and money it took to build major functions, and built a business on selling many copies of software that was expensive to build, and less expensive for the user to buy than to build for himself.
The use of macros is clearly driven by Occamal goals. The idea and motivation of a macro are simple: when there are repeated occurrences of a string in the text of your program that you suspect you may want to change in the future, you define the text in a macro and use the macro in place of the text itself. When the need for a change arises, you go to the single macro definition, change it, and you’re good to go, regardless of how extensively the macro is used.
The concepts of inheritance, templates and component re-use are clearly Occamal in nature. The idea is that you write the common parts of a function in a master class or template, and then sub-class it or apply the template to handle special cases or variations. The master class or template gives you the single place to go to make changes, while in each particular application you need spell out only the variation. Using these mechanisms tends to reduce the number of lines of code and reduce the redundancy, and is therefore Occamal.
Software applications
Occamality can be clearly seen in the evolution of software products.
Over time, software efforts in a particular field naturally tend to the Occamal, model-based ideal, because such bodies of code are the easiest to maintain and enhance, while providing maximum flexibility to their users. Early attempts at building Customer Relationship Management (CRM) systems, for example, tended to be bodies of code written in some supposedly easy-to-change 4GL. Every CRM system needs customization. When one of these early CRM systems was installed, the source code would be opened up, programmers would hack away, and sometimes an application vaguely appropriate to the customer’s needs would emerge stumbling from the dust. The awfulness of this approach quickly became evident as customers found that upgrading to the vendor’s latest release meant re-entering the coding war zone and having to fight a battle on two fronts, i.e., their customizations of the old release and the vendor’s “improvements” to the old release in the form of the “upgrade.”
The vendors who survived this nightmare were the ones who, through natural selection and the survival of the fittest, minimized the damage to their customers through the installation and customization process, not unlike the way parasites evolve to avoid killing their hosts. Invariably, this meant increasing degrees of Occamality (not “perfect” Occamality, mind you, just “increasing degrees” of Occamality), nearly always expressed as a more model- and template-based approach to application definition.
Engineering product management systems, usually called PLM systems, are an example of software that has long since evolved to be more Occamal than not. Engineering products and processes are so highly individualized that one of the main functions of modern PLM products is to make it easy to express product and process definitions without requiring modification of the product’s source code.
These cases illustrate the broader evolution towards increasing Occamality in software applications.
These examples make it clear that Occamality is a thread that weaves through the software industry and underlies many of its developments, although the people who created and promoted the developments generally did not think of them in these terms.