SimKnowledge: Simulation Based Knowledge Elicitation
On a day-to-day basis, most manufacturing systems are subject to significant interaction and intervention from human decision-makers, ranging from the behaviour of individual operators to the planning and control decisions taken by management. The performance of the operations system will be affected, possibly significantly, by these human interactions [Peer-Olaf et al, 2001].
One means for improving performance would be to improve the operational decisions taken by plant supervisors on a day-to-day basis. The nature of such decision-making, however, is poorly understood. Indeed, many manufacturing supervisors are unable to express the manner in which decisions are taken concerning areas such as production scheduling and machine repair and maintenance.
Previous research (EPSRC grant reference: M72876) has led to the development of a methodology, known as 'Knowledge Based Improvement' (KBI), that attempts to address this issue [Robinson et al, 2001]. The methodology is based upon the use of discrete-event simulation and artificial intelligence and can be summarised in five key stages:
· Stage 1: understanding the decision-making process
· Stage 2: data collection
· Stage 3: determining the experts' decision-making strategies
· Stage 4: determining the consequences of the strategies
· Stage 5: seeking improvements
In the first stage the operations system is observed and a visual simulation model of that system is developed. The simulation acts as a catalyst for asking questions about the nature of decisions that are taken in supervising the manufacturing facility. In the second stage decision-making scenarios are presented to the manufacturing supervisors, who are asked to provide responses; in so doing, a series of example decisions is generated. This can be performed in a number of ways, from a simple paper-based exercise to the use of the simulation model in an interactive mode, forming a manufacturing simulation 'game'.
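The data collection of stage 2 can be sketched as a loop that presents generated scenarios and records the responses. The sketch below is a minimal illustration only: the scenario attributes, the stand-in respondent and all names are hypothetical assumptions, not part of the KBI methodology itself.

```python
import random

def generate_scenario(rng, n_cells=4):
    # Hypothetical scenario attributes: the queue length and status of
    # each test cell, plus the type of engine awaiting allocation.
    return {
        "engine_type": rng.choice(["V6", "I4"]),
        "queues": [rng.randint(0, 5) for _ in range(n_cells)],
        "cell_up": [rng.random() > 0.1 for _ in range(n_cells)],
    }

def collect_examples(respond, n=20, seed=0):
    # Present n scenarios to a decision-maker (any callable standing in
    # for the supervisor) and record (scenario, response) pairs.
    rng = random.Random(seed)
    examples = []
    for _ in range(n):
        scenario = generate_scenario(rng)
        examples.append((scenario, respond(scenario)))
    return examples

def shortest_queue(scenario):
    # Stand-in respondent: choose the working cell with the shortest
    # queue; fall back to any cell if none is working.
    up = [i for i, ok in enumerate(scenario["cell_up"]) if ok]
    if not up:
        up = list(range(len(scenario["queues"])))
    return min(up, key=lambda i: scenario["queues"][i])

examples = collect_examples(shortest_queue)
```

In the actual study the `respond` callable would of course be replaced by the supervisor's answer captured through the simulation interface, whether paper-based or interactive.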
In stage 3 artificial intelligence methods (e.g. neural networks and expert systems) are used to learn and infer decision-making rules from the example decisions collected in stage 2. In stage 4 the consequences of the decision-making rules are determined by letting the artificial intelligence representation of the human decision-maker interact with the simulation model. Since the decision-maker no longer needs to be present during simulation runs, much longer predictive runs can be performed.
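As one concrete, deliberately simple stand-in for the artificial intelligence tools of stages 3 and 4, the sketch below infers decisions by nearest-neighbour lookup over recorded examples. The scenario encoding and the example data are hypothetical; the real work would use the neural network and expert system tools named above.

```python
def features(scenario):
    # Flatten a scenario into a numeric vector (hypothetical encoding).
    return ([float(q) for q in scenario["queues"]] +
            [1.0 if up else 0.0 for up in scenario["cell_up"]])

class NearestNeighbourPolicy:
    # Respond to a new scenario with the decision recorded for the most
    # similar past scenario (1-nearest-neighbour on squared distance).
    def __init__(self, examples):
        self.memory = [(features(s), d) for s, d in examples]

    def decide(self, scenario):
        x = features(scenario)
        _, decision = min(
            self.memory,
            key=lambda fd: sum((a - b) ** 2 for a, b in zip(fd[0], x)))
        return decision

# Hypothetical recorded decisions: (scenario, chosen test cell).
examples = [
    ({"queues": [3, 1, 2], "cell_up": [True, True, True]}, 1),
    ({"queues": [0, 4, 2], "cell_up": [True, True, True]}, 0),
    ({"queues": [2, 2, 0], "cell_up": [True, True, False]}, 0),
]
policy = NearestNeighbourPolicy(examples)
```

During stage 4 runs, the simulation would call `policy.decide(scenario)` wherever it previously paused for the supervisor, which is what makes the much longer predictive runs possible.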
Finally, improvements can be sought (stage 5) by comparing the decision-making strategies of alternative decision-makers, or by using optimisation methods (heuristics) to search for improved decision-making strategies.
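The stage 5 comparison of decision-making strategies can be illustrated on a toy deterministic model (the figures and both rules are invented for illustration): two allocation rules are scored by the time they take to clear a fixed workload.

```python
def makespan(policy, n_engines=100, service=(2, 3, 5)):
    # Toy deterministic model: engine t arrives at time t; the policy
    # picks a cell; engines queue, and cell i takes service[i] time
    # units per engine. A lower makespan means a higher throughput.
    busy_until = [0] * len(service)
    for t in range(n_engines):
        i = policy(busy_until, service)
        busy_until[i] = max(busy_until[i], t) + service[i]
    return max(busy_until)

def earliest_free(busy_until, service):
    # Strategy A: send the engine to the cell that frees up first.
    return min(range(len(busy_until)), key=lambda i: busy_until[i])

def fastest_cell(busy_until, service):
    # Strategy B: always send the engine to the fastest cell.
    return min(range(len(service)), key=lambda i: service[i])

scores = {"earliest_free": makespan(earliest_free),
          "fastest_cell": makespan(fastest_cell)}
```

An optimisation heuristic would search over a parameterised family of such rules rather than comparing a fixed pair, but the scoring step is the same.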
A key question that emerged from this research was how best decision-making scenarios could be presented to the decision-makers in order to obtain realistic example decisions as efficiently as possible.
The purpose of this research is to focus on the knowledge elicitation process at the core of the KBI methodology. The aim is to determine the most efficient and effective means for eliciting knowledge from decision-makers, and more specifically eliciting that knowledge via a simulation model.
The specific objectives are:
· To determine alternative mechanisms for eliciting knowledge from decision-makers using a visual interactive simulation
· To compare the alternative methods in terms of their efficiency (speed of data collection)
· To compare the alternative methods in terms of their effectiveness (accuracy of data collection)
· To compare the data collection methods in terms of the ability to train various artificial intelligence methods from the data sets collected
Neural networks, rule-based expert systems and data mining tools will be among the artificial intelligence methods explored.
The research is to be undertaken using a case based approach at Ford Motor Company. The work will centre on the test area in the Dagenham engine assembly plant. A plant supervisor is required to allocate engines to test cells with the aim of maximising throughput while spreading the work load evenly; the status of the test cells must also be taken into account. Although little is known about how these decisions are taken, they significantly affect the throughput of the facility. It is important, therefore, that the supervisors' knowledge is understood for use by future supervision staff. Another reason for basing the research on the Dagenham engine plant is that a simulation model of the process already exists, although it provides only a simplistic representation of the allocation decision.
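The allocation decision described above can be made concrete with a deliberately naive greedy rule. This is a sketch only: the supervisors' actual rule is precisely what the research aims to discover, and the cell-capability data structure is an assumption.

```python
def allocate(engine_type, queues, cell_up, can_test):
    # Greedy sketch: among working cells able to test this engine type,
    # choose the one with the shortest queue, spreading load evenly.
    candidates = [i for i in range(len(queues))
                  if cell_up[i] and engine_type in can_test[i]]
    if not candidates:
        return None  # hold the engine until a suitable cell recovers
    return min(candidates, key=lambda i: queues[i])

# Assumed capabilities: which engine types each test cell can handle.
cell_capabilities = [{"V6", "I4"}, {"I4"}, {"V6"}]
choice = allocate("V6", [2, 0, 1], [True, True, True], cell_capabilities)
```

Even this crude rule shows why cell status matters: a capable cell that is down removes itself from the candidate set entirely.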
Figure 1 provides an overview of the methodology that is to be employed. The key stages are as follows:
· Investigate the manufacturing system to understand the process and the decision-making required
· Adapt the existing simulation model so it can act as a means for generating decision-making scenarios
· Elicit knowledge from the decision-makers by asking them to respond to the simulated scenarios (create data sets)
· Train artificial intelligence tools with the data sets
· Replace the decision-makers with the trained artificial intelligence tools during further simulation runs
Each of these stages is described in more detail below.
Figure 1 Simulation Based Knowledge Elicitation
First the engine plant test area will be investigated to gain an understanding of the process and the nature of the decision-making. This will be achieved through observation, interviews and investigation of available data, such as layout drawings and the data captured from the plant monitoring systems. The existing visual interactive simulation (VIS [Hurrion, 1976]) will also be used during this stage.
Following this, the existing simulation model will be adapted so that it can act as a means for generating decision-making scenarios. In the third stage, knowledge will be elicited from the decision-makers via the simulated scenarios. A number of factors in the presentation of these scenarios will be investigated:
· Level of visual display: paper based, none, 2D, 2½D, 3D
· Interactive interface: number of decision-making attributes (key data upon which decisions are taken) that are reported to the decision-maker
· Scenario generation: use of historic scenarios, adapted historic scenarios to give more extreme examples, random sampling of scenarios, adapted random sampling of scenarios to give more extreme examples
· Self learning: learning responses to specific scenarios as the data collection progresses and automatically responding to future iterations of the same scenario
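The scenario-generation options listed above can be sketched as follows: random sampling draws plant states directly, while the 'adapted' variant pushes a sampled state towards an extreme case. The attributes and ranges are illustrative assumptions.

```python
import random

def random_scenario(rng, n_cells=4, max_queue=10):
    # Random sampling: draw queue lengths and cell states directly.
    return {"queues": [rng.randint(0, max_queue) for _ in range(n_cells)],
            "cell_up": [rng.random() > 0.1 for _ in range(n_cells)]}

def make_extreme(scenario, max_queue=10):
    # Adapted sampling: push each queue to whichever bound (empty or
    # full) it is already nearer, yielding a more extreme example.
    adapted = dict(scenario)
    adapted["queues"] = [0 if q <= max_queue // 2 else max_queue
                         for q in scenario["queues"]]
    return adapted

rng = random.Random(1)
historic_like = random_scenario(rng)
extreme = make_extreme({"queues": [1, 9, 5, 7], "cell_up": [True] * 4})
```

Historic and adapted-historic scenarios would follow the same pattern, but with the sampled state replaced by states replayed from the plant monitoring data.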
The design of the knowledge elicitation sessions will also be explored, in particular the duration of sessions (observing decision-maker fatigue) and the use of group versus individual sessions. A key issue will be the need for significant input from the decision-makers, which could lead to over-familiarity and fatigue. To avert this problem, knowledge elicitation sessions will also be run with other Ford staff and with non-experts. Although this will not reveal useful information about the nature of the decisions taken by the plant supervisors, it will act as a means for testing alternative representations of decision-making scenarios.
Following knowledge elicitation, the data sets that have been collected will be used to train the various artificial intelligence tools. Where possible, off-the-shelf software will be purchased to reduce development time. In training the artificial intelligence tools, the size of the data sets required and the validity of the representation of the human decision-maker will be investigated.
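One simple validity check for a trained tool is the fraction of held-out human decisions it reproduces. A minimal sketch, in which the stand-in rule and the recorded data are both hypothetical:

```python
def agreement(decide, held_out):
    # Fraction of held-out scenarios on which the trained tool matches
    # the recorded human decision.
    hits = sum(1 for scenario, decision in held_out
               if decide(scenario) == decision)
    return hits / len(held_out)

def rule_policy(scenario):
    # Stand-in for a trained tool: shortest queue among working cells.
    up = [i for i, ok in enumerate(scenario["cell_up"]) if ok]
    return min(up, key=lambda i: scenario["queues"][i]) if up else 0

held_out = [  # hypothetical recorded (scenario, decision) pairs
    ({"queues": [3, 1], "cell_up": [True, True]}, 1),
    ({"queues": [0, 2], "cell_up": [True, True]}, 0),
    ({"queues": [2, 1], "cell_up": [True, False]}, 0),
    ({"queues": [5, 0], "cell_up": [True, True]}, 1),
]
score = agreement(rule_policy, held_out)
```

Repeating such a measurement with training sets of increasing size would indicate how much data each artificial intelligence tool needs to give a valid representation.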
The final stage will be to link the trained artificial intelligence tools with the simulation models in order to represent the human decision-makers. This will act as another means of investigating the validity of the representation of the decision-makers.
Ford Motor Company
References
Anonymous (2001). Knowledge Management - A Guide to Good Practice. PricewaterhouseCoopers/BSi.
Barrett, A.R. and Edwards, J.S. (1995). Knowledge elicitation and knowledge representation in a large domain with multiple experts. Expert Systems with Applications, 8 (1), pp. 169-176.
Edwards, J.S. (1991). Building Knowledge-Based Systems: Towards a Methodology. Pitman.
Edwards, J.S., Duan, Y. and Robins, P.C. (2000). An Analysis of Expert Systems for Business Decision Making at Different Levels and in Different Roles. European Journal of Information Systems, 9 (1), pp. 36-46.
Flitman A.M. and Hurrion, R.D. (1987). Linking Discrete-Event Simulation Models with Expert Systems. J. Opl Res. Soc., 38 (8), pp. 723-734.
Huber, G.P. (2000). Transferring Sticky Knowledge: Suggested Solutions and Needed Studies. In: Edwards, J.S. and Kidd, J.B. (eds.) Knowledge Management Beyond the Hype: Looking Towards the New Millennium, Proceedings of KMAC 2000.
Hurrion, R.D. (1976). The Design, Use and Required Facilities of an Interactive Visual Computer Simulation Language to Explore Production Planning Problems. PhD Thesis, University of London.
Hurrion, R.D. (1991). Intelligent Visual Interactive Modelling. Eur J Opl Res, 54 (3), pp. 349-356.
Hurrion, R. D. (1993). Using 3D Animation Techniques to Help with the Experimental Design and Analysis Phase of a Visual Interactive Simulation Project. J. Opl Res. Soc. 44 (7), pp. 693-700.
Hurrion, R. D. (1997). An Example of Simulation Optimisation Using a Neural Network Metamodel: Finding the Optimum Number of Kanbans in a Manufacturing System. J. Opl Res. Soc. 48 (11), pp. 1105-1112.
Kidd, A.L. (ed.) (1987). Knowledge Acquisition for Expert Systems: A Practical Handbook. Plenum.
Mertins, K., Heisig, P. and Vorbeck, J. (eds.) (2001). Knowledge Management: Best Practices in Europe. Springer.
Peer-Olaf, S., Mason, S., Baines, T. and Ladbrook, J. (2001). An Overview of the Actual State of Human Performance Modelling and the Usage/Usability within the Manufacturing System Design Process. Presented at OR43, Operational Research Society Annual Conference.
Robinson, S. (1994). Successful Simulation: A Practical Approach to Simulation Projects. McGraw-Hill.
Robinson, S. (2002). General Concepts of Quality for Discrete-Event Simulation. European Journal of Operational Research, 138 (1), pp. 103-117.
Robinson, S., Edwards, J.S. and Yongfa, W. (1998). An Expert Systems Approach to Simulating the Human Decision Maker. Proceedings of the 1998 Winter Simulation Conference (D.J. Medeiros, E.F. Watson, M. Manivannan and J. Carson, eds.), The Society for Computer Simulation.
Robinson, S. and Pidd, M. (1998). Provider and Customer Expectations of Successful Simulation Projects. J. Opl Res. Soc., 49 (3), pp. 200-209.
Robinson, S., Alifantis, A., Edwards, J.S., Hurrion, R.D., Ladbrook, J. and Waller, T. (2001). Modelling and Improving Human Decision Making with Simulation. Proceedings of the 2001 Winter Simulation Conference (B.A. Peters, J.S. Smith, D.J. Medeiros and M.W. Rohrer, eds.).