IV. Introduction to Modeling and Simulation Systems

A. Historical Perspective [SS]

Today, simulation is arguably one of the most multifaceted topics an industrial engineer can face in the workplace. It can also be one of the most important to a corporation, regardless of the industry. Quality, safety, and productivity are all affected by simulation, whether the issues occur in the office, on the manufacturing floor, or in a warehouse. This article traces the development of industrial process simulation from its infancy to the present day, when it is used as a powerful tool for increasing a company's competitiveness and profits [5].

Simulation is used extensively as a tool to increase production capacity. Simulation software used by Cymer Inc., a leading producer of laser illumination sources, helped increase its production capacity from 5 units per month at the beginning of 1999 to 45 units per month at the end of 1999, a ninefold increase [5].

Visualization and graphics have undoubtedly made a huge impact on all simulation companies. Easy-to-use modeling has resulted in low-priced packages that would have been unthinkable just a few years ago. Simulation technology has also grown in value to related industries. The simulation industry is coming of age and is no longer just the domain of academics.

This article provides insight into the working environment and the intellectual and managerial attitudes during the formative period of simulation development. It also suggests a basis for comparison with current practices.

The history of computer simulation dates back to World War II, when two mathematicians, John von Neumann and Stanislaw Ulam, faced the puzzling problem of the behavior of neutrons. Trial-and-error experimentation was too costly, and the problem was too complicated for analysis, so the mathematicians suggested the roulette wheel technique. The basic data regarding the occurrence of the various events were known; the probabilities of the separate events were combined in a step-by-step analysis to predict the outcome of the whole sequence of events. With the technique's remarkable success on the neutron problem, it soon became popular and found many applications in business and industry [1].
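
To make the step-by-step idea concrete, the sketch below (in Python) follows one neutron collision by collision, deciding its fate at each step with a spin of the "roulette wheel," and repeats the experiment many times to estimate the behavior of the whole population. The per-collision probabilities are invented for illustration, not taken from any real neutron data.

    import random

    # Hypothetical per-collision probabilities (illustrative values only).
    P_ABSORB = 0.3   # neutron is absorbed at this collision
    P_ESCAPE = 0.2   # neutron escapes the material
    # otherwise the neutron scatters and undergoes another collision

    def neutron_escapes(rng):
        """Follow one neutron, collision by collision, until it is absorbed or escapes."""
        while True:
            spin = rng.random()              # the "roulette wheel" spin
            if spin < P_ABSORB:
                return False                 # absorbed inside the material
            if spin < P_ABSORB + P_ESCAPE:
                return True                  # escaped
            # else: scattered; loop and sample the next collision

    def estimate_escape_fraction(trials=100_000, seed=42):
        """Merge many single-neutron histories into a population-level estimate."""
        rng = random.Random(seed)
        return sum(neutron_escapes(rng) for _ in range(trials)) / trials

    print(f"Estimated escape fraction: {estimate_escape_fraction():.4f}")

Each history is independent, so the estimate improves simply by running more trials, which is what made the method so attractive once computers could perform the repetition cheaply.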

This was a time, in the post-war world, when new technologies, developed for military purposes during the war, began to emerge as new problem-solving tools in the world at large. At that time the field of computing was divided into two approaches: analog and digital. Analog computers were particularly suitable for problems requiring the solution of differential equations. Analog computers used electronic DC amplifiers configured as integrators and summers, with a variety of non-linear, electronic, and electro-mechanical components for multiplication, division, function generation, etc. These units were manually interconnected so as to produce a system that obeyed the differential equations under study. A great deal of ingenuity was often necessary in order to produce accurate, stable solutions. The electronics used vacuum tubes (valves), as did the early digital computers. The transistor was still some years in the future [3].
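
As a rough illustration of the integrator-chain idea, the fragment below (in Python, with an invented mass-spring equation and step size) mirrors how such a machine would be patched: the feedback path computes the highest derivative, and two chained "integrators," here reduced to numerical accumulators, recover the first derivative and then the variable itself.

    import math

    # Solve x'' = -omega**2 * x the way an analog computer would be wired.
    omega = 2.0          # illustrative natural frequency
    dt = 1e-4            # integration step
    x, v = 1.0, 0.0      # initial position and velocity

    t = 0.0
    while t < 1.0:
        a = -omega**2 * x    # feedback path: x'' from x
        v += a * dt          # first integrator:  x'' -> x'
        x += v * dt          # second integrator: x'  -> x
        t += dt

    # Compare with the exact solution x(t) = cos(omega * t).
    print(f"x(1.0) ~= {x:.4f}, exact = {math.cos(omega):.4f}")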

In the late ‘40s and early ‘50s, commercially designed computers, both analog and digital, started to appear in a number of organizations. Unsuspecting members of the technical staffs of these organizations suddenly found themselves responsible for figuring out how to use these electronic monsters and apply them to the problems of the day. One such engineer, working at the Naval Air Missile Test Center at Point Mugu on the California coast north of Los Angeles, was John McLeod, who took delivery of a new analog computer sometime in 1952. John was not the only engineer in the aerospace community in Southern California facing these problems, and a few of them decided to get together as an informal user group to exchange ideas and experiences [3].

Computer simulation was not a useful tool in the 1950s. Simulation took too long to produce results, needed too many skilled people, and as a result cost a considerable amount in both personnel and computer time. Most disheartening, the results were often ambiguous. One example is the attempt to model field data for peak periods in telephone systems, which did not conform to the queuing theory of the day. One technique used was discrete event computer simulation. The tools available for the approach were an IBM 650, assembly language, and a team consisting of a mathematician, a systems engineer, and a programmer. The team accomplished less than half of what it set out to do, took twice as long as planned, and overran its budget by a factor of two [2].
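
To show what such a discrete event model involves, here is a minimal single-server queue simulation in the same spirit (in Python): time jumps from event to event on an ordered event list, and waiting times are collected as customers, or calls, arrive and are served. The arrival and service rates are illustrative assumptions, not the telephone data described above.

    import heapq
    import random

    def simulate_queue(arrival_rate=1.0, service_rate=1.2, horizon=10_000.0, seed=1):
        """Minimal discrete event simulation of a single-server queue."""
        rng = random.Random(seed)
        events = [(rng.expovariate(arrival_rate), "arrival")]  # time-ordered event list
        waiting = []      # arrival times of customers queued for the server
        delays = []       # time each customer waited before service began
        busy = False

        while events:
            t, kind = heapq.heappop(events)
            if t > horizon:
                break
            if kind == "arrival":
                # Schedule the next arrival, then seize the server or join the queue.
                heapq.heappush(events, (t + rng.expovariate(arrival_rate), "arrival"))
                if busy:
                    waiting.append(t)
                else:
                    busy = True
                    delays.append(0.0)
                    heapq.heappush(events, (t + rng.expovariate(service_rate), "departure"))
            else:  # departure: start serving the next waiting customer, if any
                if waiting:
                    delays.append(t - waiting.pop(0))
                    heapq.heappush(events, (t + rng.expovariate(service_rate), "departure"))
                else:
                    busy = False
        return sum(delays) / len(delays)

    print(f"Average wait before service: {simulate_queue():.3f}")

Even this toy version hints at why the early teams combined a mathematician, a systems engineer, and a programmer: the event-list bookkeeping is mechanical, but choosing the distributions and interpreting ambiguous output is not.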

The computer systems of the 60s were predominantly batch systems. Both data and programs were fed to the computer in a batch via punched cards. Source data were recorded on forms, from which keypunch operators prepared the punched cards, and data-processing staff developed the programs. The early use of punched cards in manufacturing was predominantly seen in their inclusion in job or order packets for material requisition, labor reporting, and job tracking. A mainstay of that period was the classic IBM 1620 [5].

In October 1961 IBM presented the "Gordon Simulator" to Norden, a systems design company. In December 1961 Geoffrey Gordon presented his paper on the General Purpose Systems Simulator (GPSS) at the Fall Joint Computer Conference [1,2]. This new tool was used to design a system for the FAA to distribute weather information to general aviation [2].

IBM provided the software and hardware, and the team was able to construct the model, simulate the problem, and obtain answers in only six weeks. A new tool had become available for systems designers. With the success of this tool, Norden began producing models for outside groups, and a simulation activity was established. Early simulation groups were established at Boeing, Martin Marietta, the Air Force Logistics Command, General Dynamics, Hughes Aircraft, Raytheon, Celanese, Exxon, and Southern Railway, and at the computer manufacturers IBM, Control Data, National Cash Register, and UNIVAC [2].

However, the users of GPSS from IBM were concentrating on aspects of computer systems very different from the Norden systems. Geoffrey Gordon's concept was that the actual designers would use GPSS, but the design engineers preferred to communicate their problems to programmers or a simulation group. The interactions among the GPSS simulation groups occurred through the IBM users' group conference, SHARE. It was a huge meeting, and those interested in simulation had only one session [2].

Meanwhile, at the RAND Corporation, Harry Markowitz, Bernard Hausner, and Herbert Karr produced a version of SIMSCRIPT in 1962 to simulate their inventory problems. Elsewhere, there were other approaches. In England, J. Buxton and J. Laski developed CSL, the Control and Simulation Language. An early version of SIMULA was developed in Norway by O. Dahl and K. Nygaard, and Don Knuth and J. McNeley produced SOL, a Symbolic Language for General Purpose System Simulation. K. D. Tocher wrote a short book, The Art of Simulation [4].

This period was characterized by a profusion of simulation language developments and few efforts to coordinate and compare the different approaches. There was also no organized activity to help users get started or to provide guidance. The first step to address these limitations was a workshop on simulation languages held at Stanford University in March 1964. Then, at the International Federation for Information Processing (IFIP) Congress in New York in May 1965, there was a discussion of languages and applications, which in turn led to another workshop at the University of Pennsylvania in March 1966. One result of this workshop was the realization that a narrower conference on the uses of simulation was needed [4].

In response to these needs, an organizing group was established, composed of members of SHARE, the Joint Users Group of ACM, and the Computer and Systems Science and Cybernetics Groups of IEEE. This group organized the November 1967 Conference on Applications of Simulation Using the General Purpose Simulation System (GPSS). Highlights of the conference included a speech by Geoffrey Gordon, who spoke at length on "The Growth of GPSS," and a session on machine interference for GPSS.

Encouraged by this success, the organizing group set out to broaden the conference format, include other languages, and provide a conference digest. In December 1968 a second conference on the applications of simulation was held at the Hotel Roosevelt in New York, with over seven hundred attendees. For that conference, what is today known as SCS became a sponsor, and a 368-page conference digest was published. It became the first conference to address, in great variety, the many aspects of discrete event simulation. A total of 78 papers were presented in twenty-two sessions [4].

The following topics were discussed at the conference [4]:

  1. "Difficulties in convincing Top Management"
  2. Sessions with papers on statistical considerations, random number generation for GPSS/360, languages (SIMSCRIPT II, SIMULA 67, SPURT), a simulation tutorial, and "The Case for FORTRAN: A Minority Viewpoint".
  3. Sessions covering transportation, computer systems, manufacturing applications, reliability and maintainability, graphics and GPSS modifications, simulation and human behavior, distribution systems, communications, urban systems, gaming models, job shops, materials handling, marketing models, languages for modeling computer systems, facility planning models, and simulation and ecology.

In 1969 the third Conference on the Applications of Simulation was held in December in Los Angeles. One sign that the conference was becoming established was that both AIIE and TIMS joined as sponsors. Among the new items were GASP and a session on health systems. The fourth and fifth conferences, in 1970 and 1971, were the last to be held in New York. The fourth conference featured the first GPSS tutorial, by Tom Schriber. The fifth conference was the first to be titled the Winter Simulation Conference. The number of tutorials grew, with Alan Pritsker covering GASP II and Yen Chao covering SIMSCRIPT. An education session was added, since many schools were offering courses in both continuous and discrete event simulation. The first SIMSCRIPT tutorial, by Ed Russell, was published in 1976. At the 1977 conference, held in Washington, D.C., two new sessions on agricultural and military systems were added. There was also an increased interest in the internal workings of the languages; one example was an improved events list algorithm presented by Jim Henriksen [3].

Simulation was a topic taught to industrial engineers in school but rarely applied. To I.E. graduates of the 70s, simulation meant long hours at a computer terminal and seemingly endless runs to find an obscure bug in a language. When spreadsheet tools were first introduced in the late 1970s, they were used only by a "few true believers". The popularity of simulation as a powerful tool grew along with the number of conferences and sessions: compared with 12 sessions in 1967, the number of simulation sessions had doubled by 1971 and continued to rise to about forty in 1977 and sixty in 1983. A sign of the growing maturity of the field was a panel discussion at Miami in 1978 on the failures of simulation, focusing on what can and does go wrong, together with a paper on managing simulation projects. In 1979 the conference was held in San Diego, and in 1980 in Orlando. There were more tutorials, and papers were organized into tracks of sessions for beginning, intermediate, and advanced practitioners [3].

Two common fears about simulation in the early 80s were [5]:

  1. Simulation is extremely complicated, so only experts can use it.
  2. Simulation takes forever because of programming and debugging.

However, the number of computerized systems increased from relatively few in the early 1970s to a great many in the late 70s and early 80s. A survey of commercially available production management systems published by CAM-I in 1981 listed 283 different computerized systems, most of them priced under $50,000.

The sudden commercial availability of a large number of computerized manufacturing systems was complemented by the emergence of an extensive array of computer hardware and software, particularly from 1980 on. At the same time, attractive reductions in computer price/performance ratios were fueling a similar explosion of computing applications in engineering design and plant automation [5].

In 1982 most simulation software concentrated on material requirements planning (MRP), which considers only the timing and sizing of orders, without regard to capacity limitations. Software had not yet advanced beyond this stage to give true meaning to the automated factory: hundreds of robots and millions of dollars' worth of computer-controlled equipment went to waste because poor planning left them underutilized or working on the wrong parts. In 1982 personal microcomputers were 16-bit machines with memories on the order of 128K, 256K, or even 512K, and little software was available to take advantage of the 16-bit microprocessor and the additional memory. In 1983 the number of companies using simulation was still small. With the evolution of information systems that could collect and store much of the data necessary to build and maintain models, simulation for production planning became more feasible. The widely used factory management system from CAM-I supported distributed, closed-loop control of shop floor operations and closed-loop communications between planning and operations functions. Installing such a system eliminated many of the problems associated with building and maintaining simulation models [5].
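
As a sketch of the MRP logic described at the start of this period, the fragment below (in Python, with invented demand figures and lead time) computes only the timing and sizing of orders from gross requirements, on-hand inventory, and a fixed lead time; nothing in it checks capacity, which was exactly the limitation noted above.

    # Minimal MRP netting sketch. All part data are invented for illustration.
    def plan_orders(gross_requirements, on_hand, lead_time):
        """Return planned order releases as {period: quantity}."""
        releases = {}
        for period, required in enumerate(gross_requirements):
            from_stock = min(on_hand, required)
            on_hand -= from_stock
            net = required - from_stock
            if net > 0:
                # The order must be released lead_time periods earlier;
                # no check is made that any work center can actually do the work.
                release_period = period - lead_time
                releases[release_period] = releases.get(release_period, 0) + net
        return releases

    # Eight weekly demand figures for a hypothetical part, 40 units on hand,
    # and a 2-week lead time.
    print(plan_orders([0, 0, 30, 0, 50, 0, 20, 40], on_hand=40, lead_time=2))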

With the development of SLAM II by Pritsker and Associates in 1983, simulation software became a powerful tool, popularly used on the IBM PC. SLAM II provided three different modeling approaches [5]:

  1. Network
  2. Discrete event
  3. Continuous

It also offered the flexibility to use any combination of these approaches in a single simulation model; its cost was $975.

The late 80s saw the development of SIMAN IV and CINEMA IV, the newest simulation and animation software from Systems Modeling Corporation. All code was self-documenting, and models of complex systems could be developed entirely within SIMAN through an easy-to-use, menu-driven framework. New interactive capabilities aided in constructing and validating the simulation model, and expanded drawing features, real-time plots, and frequency graphics added to CINEMA's new capabilities [5].

In 1984 the first simulation language specifically designed for modeling manufacturing systems was developed. In the late 80s, with the development of discrete event simulation models, management was able to assess the costs and benefits of alternatives such as maintenance strategies, equipment repairs, and capital replacements [5].

In the early 90s, software such as the EMS version of GPSS/PC began to emerge, which allowed users of IBM-compatible personal computers to access additional memory above the 640K limit imposed by the original PC architecture. EXTEND was a Macintosh-based graphical simulation application that supported both discrete event and continuous simulation. MIC-SIM version 3.0 provided modeling capabilities and features that were so easy to learn and use that training and consulting services were no longer needed. GPSS/H was supported by a wide variety of hardware, from PCs and most Unix workstations to VAX/VMS and IBM mainframe systems, and offered numerous extensions that spared users from having to write external code in FORTRAN or C. MAST provided a single environment for the design, acquisition, and operation of manufacturing systems; it required no programming, no modeling, not even text editing to study a production system [5].

The power of simulation as a tool became evident in the mid 90s through the challenges faced by companies like Universal Data Systems, with its ultra-modern electronics assembly plant. The hurdle was to convert the entire plant to a hybrid flow shop, in which an individual unit would be sent to the next operation as soon as it was completed at the current one. One serious reservation about this change was its impact on finished goods inventory. Experiments were carried out using a simulation program written in GPSS/PC (Minuteman) on an IBM PC/AT. The complete simulation effort took 30 days, and the results were positive, leading to the eventual conversion of the entire plant to a flow-shop environment from the original batch environment [5].

Models were increasingly used to design new plants and to plan the flow of work in these new facilities. The influence of graphics became more marked, and a number of vendors used the conference exhibit space to demonstrate the advantages of their systems by actually bringing a computer to the conference site. Technology had moved so far that simulation, for those who were skilled in the art, became quicker, cheaper, and much more responsive to the designs of the model constructor [5].

In 1998, software such as Micro Saint version 2.0 for Windows 95 began to stand out, providing automatic data collection, optimization, and a new Windows interface; in addition, it did not require the ability to write in any programming language. Today, simulation has advanced to the stage where software enables the user to model, execute, and animate any manufacturing system at any level of detail. A complex 2000-foot conveyor can be modeled in minutes. Products, equipment, and information are each represented by a single entity associated with four dimensions (X, Y, Z, and time) and a definition of its behavior [5].

Advanced versions of simulation software today support the following features [5]:

This history provides a springboard from which to extrapolate a few predictions for the simulation capabilities of the future. The future of simulation may involve integration with other techniques and other software applications. Pritsker, for example, was acquired by Symix, a producer of Enterprise Resource Planning (ERP) software, and Deneb Robotics was acquired by Dassault Systèmes, a maker of 3-D CAD software. Simulation has developed in leaps and bounds since the 90s, and it is predicted that companies not using simulation software may struggle to stay afloat in a competitive world [5].
