Society has changed, is changing now and will change yet again soon. It is in a state of perpetual flux, and yet we always think of it as static, fixed and unchanging. Culture is the expression of society, and the culture of any society leaves its mark on the world in the form of the artifacts it produces.
Our definition of a culture is largely shaped by the legacy it leaves behind. The Colosseum is one of the great legacies left behind by the Roman civilisation of two thousand years ago or so. It acts as a vivid reminder of a society that endorsed slavery, public displays of bloodletting, and many other forms of excess with a regularity almost unparalleled in the rest of human history.
Mention the word Roman to anybody today and you could lay money that the first thought to enter their head would be Gladiator, always a popular subject for films too. Of course the Romans had a gentler side as well and produced much literature, poetry and other forms of fine art. In the words of William Shakespeare, however, "the evil that men do lives after them, while the good is oft interred with their bones."
An early recollection I have of an interaction with computer systems occurred at the beginning of the 1980s, after I had received a nasty letter from the Inland Revenue threatening me with prison if I didn't pay a particular bill within fourteen days. I was quite taken aback by the letter, as I thought I was up to date with my income tax affairs.
A visit to the local tax office wasn't very encouraging. "Don't worry about the letter," they said. "It's just a standard form letter we send to everyone." I found that slightly strange but worse was to come. "What's happened is that the payments you made were paid into the wrong account", was their explanation. "They were mistakenly paid into the previous year, which is why they haven't shown up in this year's and that is the reason you were sent the letter."
So simple! All that needed to be done to rectify the matter would be to transfer the money into the correct year and everything in the garden would be rosy, or so I thought. "It's not actually as easy as that," said the tax bloke. "We can't actually transfer money from one account to another. What you'll have to do is apply for a tax refund and get it back that way. It shouldn't be a problem since you've obviously overpaid. In the meantime, if you want to stop getting the threatening letters, you'll have to pay what's owing and keep up your payments." I won't relate my response to this; most of it is unprintable anyway. Suffice it to say they eventually sorted things out without me having to wait for a refund.
Those were times when computer systems were held in awe and their designs were inflexible and rarely questioned. Systems were created by and for systems people using strict, formalised methods like SSADM. Change was slow and considered, with steering committees and feasibility studies to go through first. We had programmers instead of developers in those days, and their task was to produce code to very tightly defined specifications with little requirement for design skills; occasionally they might aspire to becoming an analyst-programmer.
The 1990s brought many changes. The early part of the decade was a heady period when processing power was delivered to everybody's desktop and the proliferation of networks allowed us to share devices and communications previously undreamed of by mere mortals.
The culmination of all this sharing was, of course, the Internet. Although the Internet has been around in one form or another since the late 1960s, it wasn't accessible by anyone other than academics and military types until the 1990s.
With the new technology came new needs and new market forces. Deming once said, "No customer ever asked for an integrated circuit, a facsimile or an automobile. They invent nothing but are quick learners and will always compare one source with another." Software customers were no longer the systems departments of major corporations but were now individuals that could and would go elsewhere if they didn't like the product or it wasn't delivered quickly enough.
Behind the scenes another change occurred. Since the early 90s, object-orientation has pushed aside structured programming and become the dominant paradigm. This is probably the most radical change in the industry so far and it occurred because structured programming designs were unable to tackle the increasing complexity required by systems.
OO design and development principles and techniques are very different to those of structured programming and the tools and notations previously required became obsolete overnight. After a brief struggle between different camps, the Unified Modelling Language (UML) became the standard and everybody set about learning it.
Over the last few years, it's become more and more apparent that, despite the pervasiveness of UML, the majority of our industry hasn't successfully made the change to object orientation. Most have adopted the terminology and learned the technical details, like language syntax and which library routines do what. These are simple enough things to understand. The principles of object-oriented design (OOD), however, seem to be missing, the result usually being structured programming with objects.
Design is sadly missing from software development today. We moved across to OO because structured programming had reached the limits of its capabilities but, as an industry, we don't seem able to express OO designs effectively. We have the technology and the tools to produce models of the solutions but there's a gap where we're unable to turn those models into well-designed implementations.
Currently there are two approaches to the problem. From the agile camp we have evolutionary design, usually manifested as test-driven development (TDD). Guided by acceptance and unit tests, you start with a very small prototype and add functionality only when it's required by the tests. You continually revisit the design and use refactoring to improve it when necessary. The tests are used as a safety harness to prevent any domino effects rippling through the code if errors are introduced when you make your design improvements.
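The rhythm described above can be sketched in miniature. This is an invented example, not drawn from any real project: the tests are written first, and the function contains only the behaviour those tests have so far demanded.

```python
# A toy illustration of the TDD cycle: tests first, then just enough
# code to make them pass. All names here are invented for the example.

def price_with_discount(total):
    """Pricing rule grown test by test: orders of 100 or more
    earn a 10% discount (integer arithmetic for the sketch)."""
    if total >= 100:
        return total - total // 10
    return total

# The tests below were written before the code that satisfies them.
# Together they form the safety harness: any later refactoring of
# price_with_discount must keep every one of them passing.
def test_small_order_pays_full_price():
    assert price_with_discount(50) == 50

def test_large_order_gets_ten_percent_off():
    assert price_with_discount(200) == 180

test_small_order_pays_full_price()
test_large_order_gets_ten_percent_off()
print("all tests pass")
```

In real TDD each test would first be watched to fail before the code making it pass is written; the point here is only the direction of travel, from test to implementation.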
TDD is a very effective technique for producing very high-quality designs very quickly but requires rigorous discipline and a zealous attitude to testing that only dedicated developers are able to sustain. It can only work if everybody is passionate about maintaining high levels of quality and it requires the participation of the whole team to succeed. The designs produced are obviously specific to the technology they are implemented in.
The other approach is Model Driven Architecture (MDA) from the Object Management Group (OMG). MDA abstracts the design away from the detail of implementation and uses UML models to express it. The models capture the domain knowledge of the organisation in a platform independent model (PIM). MDA uses tools, or more likely suites of tools, provided by independent vendors, to implement the design in the technology of your choice by mapping the PIM to a platform specific model (PSM).
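The PIM-to-PSM idea can be caricatured in a few lines. This is deliberately a toy, nothing like a real MDA tool suite: the "model" is just platform independent data, and each mapping function plays the role of a vendor tool emitting platform specific source.

```python
# A toy sketch of the MDA idea: one platform independent model (PIM),
# several platform specific mappings (PSMs). Invented for illustration.

pim = {
    "class": "Customer",
    "attributes": [("name", "string"), ("age", "int")],
}

def to_java(model):
    # One platform specific mapping: Java field declarations.
    java_types = {"string": "String", "int": "int"}
    fields = "\n".join(
        f"    private {java_types[t]} {name};"
        for name, t in model["attributes"]
    )
    return f"public class {model['class']} {{\n{fields}\n}}"

def to_python(model):
    # The same PIM mapped to a different platform.
    args = ", ".join(name for name, _ in model["attributes"])
    return f"class {model['class']}:\n    def __init__(self, {args}): ..."

print(to_java(pim))
print(to_python(pim))
```

The appeal is exactly what the article describes: if the Java platform were to become obsolete, the PIM stays put and only a new mapping is needed.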
In this way you should be able to insulate your intellectual property from the risk of change or advances in technology. If your platform becomes obsolete, you should be able to press a button and produce a version of your existing PIM for whatever PSM is current.
Your organisation can concentrate on developing its solutions vertically, without having to worry about finding the skills to implement them on different platforms. MDA promises us a brave new world where the problems of cross-platform interoperability are a distant memory, but there is a long way to go yet before it becomes a complete reality.
MDA and TDD are two entirely different approaches and both have their pros and cons but both promise better designs. More importantly, both require good design skills. Refactoring, one of the core practices of TDD, is 'improving the design without changing the functionality'. By definition, to improve a design you need to be able to identify a 'better' one. According to the MDA proponents, in 5 years 80% of today's programmers will no longer be programmers but will be redeployed as modellers.
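A small, invented example makes the refactoring definition concrete: the behaviour before and after is identical, and only the design improves, moving from a conditional ladder towards the kind of object-oriented design the article argues is missing.

```python
# Refactoring: same functionality, better design. Example is invented.

# Before: a conditional chain that must be edited for every new shape,
# the "structured programming with objects" style.
def area_v1(shape, w, h):
    if shape == "rectangle":
        return w * h
    elif shape == "triangle":
        return w * h / 2

# After: each shape owns its own area calculation, so adding a new
# shape means adding a class, not growing an if/elif ladder.
class Rectangle:
    def __init__(self, w, h):
        self.w, self.h = w, h

    def area(self):
        return self.w * self.h

class Triangle:
    def __init__(self, w, h):
        self.w, self.h = w, h

    def area(self):
        return self.w * self.h / 2

# The functionality is unchanged: same inputs, same answers.
assert area_v1("rectangle", 4, 3) == Rectangle(4, 3).area() == 12
assert area_v1("triangle", 4, 3) == Triangle(4, 3).area() == 6.0
```

Judging that the second version is 'better' is precisely the design skill the article says refactoring depends on; the assertions play the role of TDD's safety harness, confirming nothing changed but the structure.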
Jim Highsmith cites changes in the pharmaceutical industry, where they've moved from careful laboratory design to an experimentation-based approach. They've been able to do this because modern, sophisticated instruments can test hundreds, if not thousands, of compounds, created using combinatorial chemistry techniques, in a day. Improvements in technology have made careful specification and design less effective and more costly than experimentation.
Who knows? It may be that MDA, or something similar, will evolve to such an extent that the same will become true in our industry soon. Until then, if we want to succeed as an industry and as individuals, we need to improve our design skills, as these are fast becoming the most sought-after commodity.
Twenty years ago programmers may have been able to leave the responsibility for design to others; this is no longer the case, and will be even less so in the future. The culture we live in today demands good quality. Quality begins with good design, it is everybody's responsibility, and customers will get more discerning, not less.
The artifacts we produce are computer systems and they will be the legacy we are remembered for. I for one would like this era to be remembered for the quality and robustness of its products' designs, rather than the failures of recent years.
First published in Application Development Advisor Jan/Feb 2005