Much of the programming world involves concurrent applications. Here are some examples from operating systems, real-time systems, and simulation.
When you and your colleagues all log in to terminals connected to the same time-sharing system, each of you works separately, yet you are all using the same computer. Each of you has the feeling that the computer is working only on your task, even though many of you are working simultaneously. How is this seeming paradox possible?
The illusion that you are alone on the time-shared computer is caused by a combination of fast computers and clever programming. Suppose you are using the computer to edit a program or read electronic mail. You read and type at human speed. A very fast typist can enter 100 words per minute, or--at an average of six characters per word--about 10 characters per second. In the tenth of a second between two of your keystrokes, a modern computer can execute hundreds of thousands of machine instructions. If those "extra" machine instructions could be put to productive use, the computer would have plenty of time between your keystrokes to service other human users. It is not unusual for a modern time-shared computer to handle 100 or more simultaneous users, each working at human speed.
Managing all those instructions and users is part of the responsibility of a modern operating system. An operating system is, as you know by now, just a sophisticated program; in fact, it is a concurrent program, capable of managing many devices and human users to give the illusion of simultaneity.
Some time-shared computers consist of a single CPU; others consist of a set of identical CPUs. With more than one CPU, programs can be executed in parallel--that is, literally at the same time. With a single CPU, no real parallel execution is possible, but that one CPU can be shared in such a way that many programs seem to be executing in parallel. Concurrent programming is the creation of programs that consist of segments that have the potential for parallel execution; depending upon the actual number of CPUs available, execution of a concurrent program may be literally parallel, entirely time-shared, or some combination of the two.
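Sections 16.2 and 16.3 develop these ideas in Ada; as a foretaste, here is a minimal sketch (the names Two_Workers and Worker are illustrative, not from the text) of a program whose segments have the potential for parallel execution. Each task object runs independently; whether the two tasks truly run in parallel or are time-shared on one CPU depends on the machine.

```ada
--  A sketch of a concurrent Ada program: two tasks that may
--  execute in parallel with each other and with the main program.
--  All names here are illustrative.
with Ada.Text_IO;
procedure Two_Workers is

   task type Worker (ID : Natural);  --  a task type; each object is
                                     --  one potentially parallel segment
   task body Worker is
   begin
      for Step in 1 .. 3 loop
         Ada.Text_IO.Put_Line
           ("Worker" & Natural'Image (ID) & " step" & Natural'Image (Step));
      end loop;
   end Worker;

   First  : Worker (ID => 1);  --  both tasks begin running when the
   Second : Worker (ID => 2);  --  main program reaches its BEGIN

begin
   null;  --  the main program completes only after both tasks finish
end Two_Workers;
```

The interleaving of the two tasks' output lines is not fixed; different runs, or different numbers of CPUs, may produce different orders.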
A bank, for example, can be modeled by a program in which each customer and each teller is represented by a program segment. Similarly, a subway system can be modeled by a program in which each train, station, turnout, and block (section of track that is permitted to hold at most one train) is represented by a program segment. The flow of simulated customers in the bank, or of trains in the subway, can be controlled or varied at will.
Simulation is an important tool in optimizing physical systems--for example, in choosing the most effective number of open checkout lanes in a supermarket or the frequency and maximum speed of trains in a subway. If the computer model is a faithful representation of the physical system, studying the model provides information and insight into the behavior of the system. Concurrent programming provides a natural way of assigning program segments to represent physical objects and therefore is a great aid in developing good simulations.
Ada is one of only a few programming languages--and the only popular one--to provide built-in structures for concurrent programming. In this chapter, we use a series of examples to present a few of the basic Ada structures, ending with a spider program: a simulation of a family of spiders, all trying simultaneously to move around the room without bumping into walls or one another.
Generally, your incoming e-mail is stored in a system file called the electronic mail box, or just mailbox. Suppose you are reading your mail when a friend sends you a message. The new message must be added to your mailbox file; your reading must be momentarily suspended while the file is modified (you may not notice the temporary suspension, but it happens anyway). Now suppose that two incoming messages arrive at the same time. Not only must your reading be suspended, but something in the mail software must update your mailbox one message at a time. If this protection were not provided--if two messages could update the mailbox literally at the same time--the mailbox would become hopelessly garbled and therefore useless.
The e-mail situation is an example of a readers-writers problem, a category of computing problems in which multiple readers of, and multiple writers to, a data structure must be prevented from interfering with one another. The prevention technique is called mutual exclusion: update actions on the data structure are handled one at a time while other actions are excluded. Ada's protected types provide mutual exclusion. We can declare a protected type, and variables of that type, with read operations (called protected functions) and update operations (called protected procedures) that Ada guarantees to execute correctly. Specifically, multiple calls to a protected procedure are executed one at a time.
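To preview the form such a declaration takes (the name Shared_Counter and its operations are illustrative, not the mailbox type developed later), here is a sketch of a protected type guarding one shared value. Increment is a protected procedure, so concurrent calls to it are serialized by Ada; Value is a protected function, so concurrent reads may overlap one another but never overlap an update.

```ada
--  A sketch of an Ada protected type providing mutual exclusion.
--  The names are illustrative.
protected type Shared_Counter is
   procedure Increment;            --  update: callers execute one at a time
   function  Value return Natural; --  read: calls may overlap one another
private
   Count : Natural := 0;           --  the protected data
end Shared_Counter;

protected body Shared_Counter is

   procedure Increment is
   begin
      Count := Count + 1;          --  never interleaved with another update
   end Increment;

   function Value return Natural is
   begin
      return Count;
   end Value;

end Shared_Counter;
```

Given an object such as `Messages : Shared_Counter;`, many tasks may call `Messages.Increment` at once; Ada guarantees that the updates happen one at a time, so no increment is ever lost--exactly the protection the mailbox needs.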
Section 16.2 introduces task types and task objects; Section 16.3 introduces protected types and protected objects.
Copyright © 1996 by Addison-Wesley Publishing Company, Inc.