The Future of Structured Analysis
“All attempts to predict the future in any detail appear ludicrous within a few years. This book has a more realistic, yet at the same time, more ambitious, aim. It does not try to describe the future, but to define the boundaries within which possible futures must lie. If we regard the ages that stretch ahead of us as an unmapped and unexplored country, what I am attempting to do is to survey its frontiers and to get some idea of its extent. The detailed geography of the interior must remain unknown -- until we reach it.”
-- Arthur C. Clarke
Profiles of the Future
Throughout this book, we have seen an evolution of ideas and techniques in the field of systems analysis. They fall into three broad periods of time:
- Conventional systems analysis, prior to the mid-1970s, characterized (if it was done at all) by narrative Victorian novel specifications that were hard to read, hard to understand, and virtually impossible to maintain.
- Classical structured analysis, from the mid-1970s to the mid-1980s, as described in [DeMarco, 1978], [Gane and Sarson, 1977], and others. This was characterized by early versions of graphical models, and an emphasis on the modeling of current implementations of a system before modeling the new system.
- Modern structured analysis, as described in this book and recent books such as [Ward and Mellor, 1985] and [McMenamin and Palmer, 1984].
This chapter summarizes some of the major changes that have taken place since the introduction of classical structured analysis in the late 1970s, and uses this as a starting point for discussing likely changes over the next 5 to 10 years.
WHAT HAS CHANGED
Several aspects of structured analysis have gradually changed over the past ten years. The major areas of change include the following:
- Terminology changes. We now use the term environmental model as a way of describing what used to be called just a context diagram. This is because classical structured analysis did not include an event list as part of the formal system model. Also, we now use the term essential instead of logical to describe a model which concentrates on what the system has to do, and the term implementation instead of physical to describe a model that concentrates on how the system will be developed. These are obviously minor changes, but they have helped reduce confusion when talking with users who wonder whether the opposite of a logical system is an illogical system.
- Event partitioning. One of the more significant developments in structured analysis, which we discussed in Chapter 20 and Chapter 21, is the use of an event list to guide the initial development of the behavioral model. This replaces the approach of strict top-down partitioning of the context diagram to a top-level dataflow diagram (Figure 0), and from Figure 0 to lower levels, and so on. While the top-down approach is not wrong in any sense of the word, it has been difficult for many systems analysts to practice; the event partitioning approach, which is a middle-out approach, has proved more successful in many analysis projects.
- De-emphasis of current physical modeling. As Chapter 17 pointed out, there are a number of reasons why the systems analyst might be tempted to model the current implementation of a system. But time and again we have found that it is a dangerous temptation and that the systems analyst spends far more time engaged in this activity than it warrants. While we don’t outlaw the current physical model, we do discourage and de-emphasize it. Modern structured analysis emphasizes the development, as early as possible, of an essential model of the user’s new system.
- Real-time modeling tools. Classical structured analysis was primarily intended for the development of straightforward business systems; it made no provision for interrupts, signals, or timing issues. However, many of today’s complex systems do include a variety of real-time issues, and the structured analysis modeling tools have been extended accordingly. Control flows and control processes have been used to augment dataflow diagrams; and state-transition diagrams have been introduced as a new modeling tool to illuminate the time-dependent requirements of a system.
- Closer integration of process modeling and data modeling. Classical structured analysis used data structure diagrams to model the relationships between stores on the dataflow diagram. However, the relationships were often obscured by the notation, and the notation tended to foster intense discussions and debates about the design and implementation of the physical data base. The entity-relationship diagram presented in this book provides a more logical or conceptual model of the data required by the system, and it allows the relationships between data entities to be described rigorously and in detail. Also, the balancing rules discussed in Chapter 14 ensure that the data model (documented with ERDs) is fully consistent and compatible with the process model (documented with DFDs and process specifications).
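The balancing rule described in the last point can be sketched in a few lines of code. The following Python fragment is only an illustration, with hypothetical store and entity names; a real CASE tool would check the full set of rules from Chapter 14, not just this one.

```python
# A minimal sketch of one DFD/ERD balancing rule: every store on the
# dataflow diagram should correspond to an entity (or relationship) on the
# entity-relationship diagram, and vice versa. The store and entity names
# below are hypothetical examples.

def balance_stores_against_entities(dfd_stores, erd_objects):
    """Return (stores missing from the ERD, ERD objects missing from the DFD)."""
    stores, objects = set(dfd_stores), set(erd_objects)
    return sorted(stores - objects), sorted(objects - stores)

dfd_stores = ["CUSTOMERS", "ORDERS", "PRODUCTS"]
erd_objects = ["CUSTOMERS", "ORDERS", "INVOICES"]

unmatched_stores, unmatched_objects = balance_stores_against_entities(
    dfd_stores, erd_objects)
# unmatched_stores  -> ["PRODUCTS"]  (a store with no corresponding entity)
# unmatched_objects -> ["INVOICES"]  (an entity no process ever stores)
```

Each name reported by a check like this represents either a missing piece of the data model or a missing piece of the process model, which is precisely why the two models can be said to be fully consistent only when both lists are empty.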
It is important for you to be familiar with these changes, for you may find yourself working for an organization that has not yet incorporated the changes into their standards; in 1987, when the first edition of this book was being written, many large organizations that I visited throughout the United States were still using systems development methods that were at least 10 years old.
FUTURE DEVELOPMENTS IN STRUCTURED ANALYSIS
No one can profess to know the future in any detail; the most we can hope for, as Arthur C. Clarke points out in the introduction to this chapter, is to find some signposts to the future. Recent developments suggest a number of trends that are likely to continue well into the next decade. They include the following:
More Widespread Awareness of Systems Analysis
Computers, as we all know, are becoming a ubiquitous part of everyone’s life. Consequently, we are finding that a larger and larger part of society is learning to use computers and talk about computers; more importantly (in the context of this book), many people are becoming increasingly familiar with structured analysis and other aspects of software engineering. I am especially interested in the future impact of structured analysis on three groups: senior management in our business and government organizations, children, and computer professionals in Third World countries.
In most large organizations, one typically finds that the senior levels of management are people in their late 40s, or 50s, or 60s. This means that they received their education and spent the formative years of their careers 20 or 30 or even 40 years ago. If we look at corporate cultures up through the 1970s, and perhaps even the 1980s, we have to remember that while computers were certainly being used, they were not widely available to individuals, and they were not part of the technology or culture that people grew up with. But this has changed dramatically since the early 1990s; we are beginning to see senior levels of management who either (1) began their career in the data processing or IT organization,<ref>Three examples of this are John Reed, the former Chairman of Citicorp; Richard Crandall, former head of American Airlines; and Frank Lautenberg, formerly Chairman of ADP (the payroll services company) and one of the two U.S. Senators from New Jersey until his retirement at the end of 2000. There are also several former programmers and systems analysts who are members of Congress.</ref> or (2) began their career in some other part of the organization (e.g., accounting, sales, or manufacturing) whose day-to-day operation was dramatically affected by computer technology. This means that you can expect, as a systems analyst, that top management will be increasingly aware of the strategic importance of information processing systems in their organization, and increasingly interested in seeing high-level models of major new systems. If you tried to show a dataflow diagram to the CEO of your organization today, chances are that he or she would not understand it, and would not see why it was worth understanding.
Especially now that the Internet and “e-business” are having such a dramatic effect on the economy, I believe that senior management will come to realize that it is just as important to be able to read (and critique) a system model as it is to read and critique a balance sheet or profit and loss statement.
Children will also become more and more familiar with the concepts of structured analysis over the next several years. Already structured programming and structured design are being taught at the high school level in some parts of the United States. Structured analysis, once the subject of graduate level seminars, is now being taught in the third and fourth years of undergraduate computer science and business school curricula, and will soon be part of a standard first-year college subject. Long division was once a college subject and is now taught regularly to young children; similarly, structured analysis will be a topic that children learn during their normal educational process.
It has been estimated that any U.S. child born after 1980 will graduate from secondary school having written 10,000 lines of code; this is equivalent to roughly a year of full-time programming experience for today’s adult programmer. Along with all this programming experience, we can expect that the current generation of children will have more and more experience with systems design and systems analysis. Not all this generation will end up choosing a career as a programmer or systems analyst; indeed, only a small fraction will choose such a career path. But the rest of today’s children, whether they choose to be accountants or engineers, salespeople, teachers, or politicians, will form a community of intelligent end users of information systems, users who will know much more about what to expect from a systems analyst and what to ask of one. Much of our present work, predicated, it seems, on the difficulty of dealing with ignorant users, may be superfluous in the future.
There is one other aspect of the growing awareness of structured analysis that we should mention: the impact on software industries in developing nations. Throughout the latter part of the 20th century, international competition in a variety of manufacturing industries has become more and more intense, and American industries have often lost competitive ground (or gone out of business) when faced with Japanese, Korean, Chinese, or Brazilian industries offering a high-quality product at a competitive price. The same phenomenon is beginning to happen in the systems development industry. Software engineering techniques, including the structured analysis techniques discussed in this book, can help the competitive organization develop systems with a productivity ten times higher than that of many American organizations and with a level of quality (expressed in mean time between failure or number of bugs) one hundred times higher than that of comparative American organizations.<ref>For a discussion of this, see D. Tajima and T. Matsubara, “The Computer Industry in Japan,” Computer, Volume 14, No. 5 (May 1981), pp. 89-96.</ref> And since, to a larger and larger degree, all our products and all our services depend on computer-based information systems, this has profound implications for all of American industry.<ref>I first expressed these thoughts in the first edition of this book, which was written in 1987. Since then, I expanded on the issue of international competition in the software industry in a book entitled Decline and Fall of the American Programmer (Prentice Hall, 1992). Four years later, I concluded that the U.S. software industry still had some unique characteristics that would help it maintain world dominance; see Rise and Resurrection of the American Programmer (Prentice Hall, 1996). Meanwhile, the software industry in countries like India continues to grow by leaps and bounds, and it’s not yet clear just how long the U.S. will retain its leading position.</ref>
Deployment of Automated Tools
Throughout this book, we have discussed the possibility of using workstation-based tools to automate various aspects of structured analysis, especially the labor-intensive activities of creating graphical system models and checking them for completeness and correctness.
Appendix A describes the features of many such analyst toolkits that are currently available. But few systems analysts are using these tools, even today. In the early 1990s, approximately 10% of the systems analysts in North America, Western Europe, and other software development centers had access to personal workstations for creating dataflow diagrams, entity-relationship diagrams, and so on. It was expected that by the end of the decade, we would have reached the “critical mass” of 50% deployment of such tools throughout the industry; however, there was a backlash against such tools in the mid-1990s because of their perceived inefficiency and overhead. Throughout the latter part of the 1990s and the early part of the 21st century, there has been a great emphasis on visual development environments that greatly enhance the ease of prototyping and rapid implementation, and a stagnation of interest in analysis/design modeling tools.
Assuming that we do eventually reach a critical mass of deployment of automatic analysis tools, it will be reasonable to argue that our approach to systems analysis will have fundamentally changed. To draw an analogy: it is interesting to talk about the improvements that can be achieved in the field of carpentry by using a power saw instead of a hand saw, but the issue is moot if only 1% of the carpenters have electricity. And the power of the tool does indeed affect the way we work with the world around us; Craig Brod made this point very eloquently in a classic book called Technostress ([Brod, 1984]):
- Tools have always set in motion great changes within human societies. Tools create us as much as we create them. The spear, for example, did much more than extend a hunter’s reach; it changed the hunter’s gait and use of his arms. It encouraged better eye-hand coordination; it led to social organizations for tracking, killing and retrieving large prey. It widened the gap between the unskilled hunter and the skilled hunter and made pooling of information more important as hunting excursions became more complex. There were other, less obvious effects: changes in the diets of hunting societies led to sharing of food and the formation of new social relationships. The value of craftsmanship increased. People began to plan ahead, storing weapons for reuse. All of these new tool-related demands, in turn, spurred greater development of the brain. Brain complexity led to new tools, and new tools made yet more complex brains advantageous to the survival of the species.
The Impact of Maintenance Disasters
In the previous chapter, we discussed the use of the structured analysis model to facilitate ongoing maintenance and modification of systems. But this is an issue that often seems abstract, philosophical, and politically unimportant during the development phase of a project, when the primary emphasis seems to be getting the system delivered to the user. Preferably a working system; hopefully the system the user wanted. But, failing that, any system that appears to work and appears to satisfy at least some of the user requirements. The simple political reality is this: the importance of structured analysis and formal, rigorous system models has not been fully appreciated by many senior managers in our organizations. Even within the ranks of EDP management, structured analysis doesn’t have the same “gut-level” sense of urgency as the political necessity of delivering a working (or allegedly working) system on time to the user.
As I suggested above, this situation may change as the user population becomes more familiar with computer technology and as competition from Third World countries becomes more intense. But there is another phenomenon that will dramatically highlight the need for current, up-to-date system models that are maintained as diligently as the source code: maintenance disasters that will cause current systems to collapse.
In the extreme case, this may happen because an existing system -- a large, complex, undocumented system -- aborts or comes to a standstill, with no one able to figure out how to repair it. But this is unlikely; it is more likely that the cause of the failure will be identified and simply outlawed. The word will go out, “You can’t enter a type X25 transaction into the system any more because it causes problems.”
No, the likely cause of a major maintenance disaster will be the total impossibility of making a necessary, urgent modification to an existing system. Such changes are often mandated by new legislation or government policy; but they may also be required because of changes to the business environment or competitive situation.
Many organizations faced this problem during their Y2K remediation efforts -- i.e., when systems had to be updated to properly handle both 20th-century and 21st-century dates. Part of the problem was related to the implementation of the system, that is, coding that had been patched and repatched so many times that it was no longer possible to determine accurately how the system operated. In several cases, the only viable option was to scrap the system and replace it with an entirely new package, often acquired at considerable expense.
But the larger problem, both in the Y2K projects and with many current maintenance situations, is that nobody knows or remembers what these systems are really supposed to do. Third-generation maintenance programmers are now interacting with third-generation users to discuss potential changes to a system whose original requirements are a mystery to both. In this environment, it is inevitable that the maintenance programmers will eventually throw up their hands and refuse to make any more changes.
When faced with a problem like this, top management is likely to have a “knee-jerk” reaction: committees will be formed, standards will be imposed, and new procedures will be promulgated. Just as we have seen government leaders react to problems of toxic waste, oil spills, corruption, and a number of other problems only after a major disaster occurs, I believe that a number of top business and government managers will react to the problem of nonexistent system models only after a major maintenance problem occurs.<ref>Indeed, it was widely expected that exactly this situation would occur because of the difficulties associated with Y2K remediation. And in some organizations, it did lead to a drastic overhaul in software engineering practices, as senior management vowed “never again” to be faced with such a crisis.</ref>
The Marriage of Structured Analysis and Artificial Intelligence
Since the late 1980s, significant attention has been devoted throughout business, government, and the computer industry to artificial intelligence (AI): expert systems, natural language systems, robotics, and many related fields. Though AI had previously been considered an academic subject with little practical application, and though it was once implemented on exotic hardware with unfamiliar programming languages like LISP and PROLOG, it has gradually become more of a mainstream topic -- particularly the area of expert systems, those systems that can replicate the behavior of human experts in certain narrowly defined fields. More and more AI software packages and textbooks are available for COBOL-based environments and PC-based environments (see, for example, [Taylor, 1985], [Derfler, 1985], [Webster, 1985], [Keller, 1986], and [Rose, 1985]). More and more AI applications, ranging from medical diagnosis to oil exploration to stock portfolio evaluation to tax planning, are finding their way into the mainstream business world.<ref>A large amount of artificial intelligence work is still carried out on specialized computer hardware, using specialized programming languages such as Lisp and Prolog. However, (1) most companies would prefer to integrate their AI applications with the other applications that they run on their standard IBM mainframe hardware, (2) most programmers would rather use such familiar languages as COBOL than such esoteric languages as Prolog, and (3) the business-oriented AI applications will have to tap into the knowledge base, which already exists on the mainframe computer.</ref>
What does this have to do with structured analysis? The connection between structured analysis and expert systems works in both directions: structured analysis can help in the process of building an expert system, and expert system technology can help in the process of doing structured analysis.
When building an expert system, three aspects of the system are often dominant: the human interface, the knowledge representation, and the inference engine that evaluates and interrogates the knowledge base. The human interface may involve natural-language (English, French, German, etc.) input and a combination of graphics, text, and sound for output. The knowledge base may be expressed as a series of IF-THEN-ELSE rules or as a series of frames.<ref>See [Keller, 1986] for a description of frames.</ref> And the inference engine may be based on a forward-chaining or backward-chaining approach and may be implemented with a vendor-supplied “expert shell.”
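To make the vocabulary of rules and inference above concrete, here is a toy forward-chaining engine written in Python. The IF-THEN rules and facts are invented for illustration; a real vendor-supplied expert shell would add conflict resolution, certainty factors, and an explanation facility on top of this basic cycle.

```python
# A toy forward-chaining inference engine over IF-THEN rules.
# Each rule is (set of conditions, conclusion); the engine repeatedly fires
# any rule whose IF-part is satisfied by the known facts, until no new
# fact can be derived. The rules and facts below are hypothetical.

rules = [
    ({"order overdue", "customer credit good"}, "expedite shipment"),
    ({"customer credit bad"}, "hold order"),
    ({"expedite shipment"}, "notify warehouse"),
]

def forward_chain(initial_facts, rules):
    """Derive every conclusion reachable from the initial facts."""
    facts = set(initial_facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            # Fire the rule only if all conditions hold and it adds something new.
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

derived = forward_chain({"order overdue", "customer credit good"}, rules)
# derived now includes "expedite shipment" and "notify warehouse"
```

A backward-chaining engine inverts this loop: it starts from a goal (e.g., “should we notify the warehouse?”) and works backward through the rules, asking the user only for the facts it actually needs.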
But the significant thing about all this is that the expert system components are now becoming just a part of a larger system, for example, an operational system that feeds and updates the knowledge base or that uses the output of the expert system component to carry on other system functions. Thus, the modeling tools of structured analysis may be used to help model the overall system. But more important, it means that in the coming years, you may need to become familiar with the technology of expert systems and artificial intelligence in order to be a successful systems analyst. Keller’s book ([Keller, 1986]) is a good place to start, for it shows many of the interactions between structured analysis and expert systems.
In a reverse sense, AI can assist the process of structured analysis by acting as a tutor to guide a junior analyst through the various steps and processes described in this book. One could easily imagine, for instance, an “analyst’s assistant” that would ask a series of questions of the human analyst and then produce a proposed context diagram or event list. Just think: can you remember, now that you have reached the end of this book, all of the rules and guidelines of the past several hundred pages? Do you think you’ll remember them all a year from now? Wouldn’t it be nice to have an expert system available in a desktop computer that could remind you how to draw DFDs, ERDs, and STDs that are all properly balanced?
While all this sounds very exciting, you should not be concerned that expert systems are going to do away with human systems analysts. Researchers in this field point out that all the successful expert systems, ranging from medical diagnosis to portfolio analysis, have succeeded because they have concentrated on a very limited domain of expertise. A successful systems analyst, though, really needs to be an expert in several different areas: he must understand the technology of structured analysis presented in this book; he must understand the user’s application area; he should know a lot about accounting, so that he can produce accurate cost-benefit calculations; he should be an expert in communications and cognitive psychology, so that he can communicate effectively with the user; and he should also be well-versed in computer hardware and software, so that he can communicate effectively with designers and programmers. Current estimates (see, for example, [Barstow, 1987]) are that expert systems will be able to help do the job of systems analyst on simple systems by the mid-1990s, but it will be well past the end of the century before expert systems technology will be able to do systems analysis on large systems.
The Impact of New Generations of Computer Hardware
Enormous sums of money are being expended today by private companies, universities, research organizations, military organizations, and governments around the world, all with the objective of producing dramatically more powerful computer hardware during the next 10 to 15 years. What has this to do with systems analysis? Simply this: the business of defining user requirements for an information system has to be done within the context of what the user and the systems analyst think is possible with available technology. But what we think is possible is based largely on what we know now about computer technology. It could well be argued that most end users and most systems analysts do not even begin to make use of existing computer hardware technology, so what are they going to do with technology one million times more powerful?
Past experience with other technological advances, for example, in the field of communication (from smoke signals to telegraph to telephone, etc.) and transportation (from walking to horses to cars to airplanes, etc.) gives us a clue; our first reaction to radically improved technology is to continue doing the same kind of things we did before, but a little faster and more easily (and more economically, in many cases). Only later do we begin to see entirely new applications for the new technology.
As an example, consider the field of transportation, with which we are all familiar; while airplane travel is relatively new (compared to the history of the human race), it has been with us all our lives, and it has an important impact on our assumptions, both conscious and unconscious, explicit and implicit, about the way we can live our lives. Suppose someone told you tomorrow, though, that a supersonic underground train was available to carry you from the East Coast of the United States to the West Coast at speeds of 3000 miles per hour.<ref>This is not science fiction. Serious engineering proposals for such a supersonic train were drawn up at MIT as far back as the mid-1980s.</ref> What would this do to your business life? To your social life? What kind of changes would you begin making today if you were reasonably certain that this advanced technology would be available to you within the next 3 to 5 years?
And that is exactly the position that we will find ourselves in as systems analysts for the rest of this century; each time we are given dramatically more powerful computer technology, our first reaction (and the reaction of the end users, too) will be to reimplement existing applications somewhat more efficiently. The challenge will be to find entirely new applications -- entirely new, radically different uses of computer technology -- rather than merely recreating the applications currently being developed.
It is important that you have a future-oriented perspective as you finish this book and begin practicing structured analysis in the real world. While the modeling, iterative problem solving, top-down partitioning, and other concepts discussed in this book will almost certainly be valid for the foreseeable future, many of the details (e.g., the technology available to support structured analysis, and even such specific techniques as event partitioning) may change or be replaced.
You should not expect that the material you have learned in this book is constant, permanent, impervious to change. Like all science, and especially like all other aspects of computer science, the field of systems analysis is destined to continue changing, evolving, and (hopefully) improving well into the next century and beyond. For some, it is frightening to realize that half of what one learns in this technical field is obsolete within 5 years. For others, and I hope that you will include yourself in this category, it is a source of constant renewal and excitement.
And on that note we come to the end of this book. You are not yet a veteran systems analyst, but you should have enough tools and techniques to enter the profession without fear of falling flat on your face. May you enjoy the practice of structured analysis on information systems that will benefit society. And may you return in less than 5 years to see what has changed. Ciao!
REFERENCES
- Tom DeMarco, Structured Analysis and Systems Specification. Englewood Cliffs, N.J.: Prentice-Hall, 1979.
- Chris Gane and Trish Sarson, Structured Systems Analysis: Tools and Techniques. Englewood Cliffs, N.J.: Prentice-Hall, 1978.
- Arthur C. Clarke, Profiles of the Future. New York: Holt, Rinehart, and Winston, 1984.
- Jared Taylor, “Lightyears Ahead of Paper,” PC Magazine, April 16, 1985.
- Frank Derfler, “Expert-Ease Makes Its Own Rules,” PC Magazine, April 16, 1985.
- Robin Webster, “M.1 Makes a Direct Hit,” PC Magazine, April 16, 1985.
- Robert E. Keller, Expert System Technology: Development and Application. Englewood Cliffs, N.J.: YOURDON Press, 1986.
- Frank Rose, Into the Heart of the Mind. New York: Harper & Row, 1985.
- D. Barstow, “Artificial Intelligence and Software Engineering,” Proceedings of the 9th International Conference on Software Engineering. Washington, D.C.: IEEE Computer Society Press, March 1987.
- Paul Ward and Steve Mellor, Structured Development of Real-Time Systems. New York: YOURDON Press, 1986.
- Steve McMenamin and John Palmer, Essential Systems Analysis. New York: YOURDON Press, 1984.
- Craig Brod, Technostress. New York: John Wiley, 1984.
QUESTIONS AND EXERCISES
- What are the three broad evolutionary stages of systems analysis that have taken place over the last 20 years?
- What are the five major changes that have taken place in the field of systems analysis during the past 10 years?
- What do the terms logical and physical mean in the context of this chapter?
- What is event partitioning? What has it replaced in the field of systems analysis?
- Why has current physical modeling been de-emphasized in systems analysis?
- What additional tools have been added to the field of systems analysis to help model real-time systems?
- What is a data structure diagram? What has it been replaced by in the field of systems analysis?
- How are computers beginning to affect the jobs and activities of senior management in organizations?
- Why will the teaching of structured analysis and structured design to children in the coming years have an impact on systems development projects?
- Why is structured analysis likely to be a factor in international competition between the United States, Europe, and many Third World countries?
- Research Project: What percentage of the programmers and systems analysts have analyst toolkit workstations available to them in your organization?
- Why are automated tools important for systems analysis?
- Why will maintenance disasters have an impact on structured analysis in the future?
- What relationship are we likely to see between artificial intelligence and structured analysis in the future?
- By what percentage, or multiple, is computer hardware expected to improve over the next 10 to 15 years?
- Why will continued improvements in computer hardware have an impact on the way systems analysis is done?