This volume contains a well-balanced set of applications and theory papers on advances in artificial intelligence. The applications papers each discuss a system that is (or is close to being) a fielded system that solves real problems using one or more AI techniques. They cover areas such as education, physics, energy, control, medicine, and mechanical engineering.
The theory papers, representing recent advances in various theoretical aspects of AI technology, concern themselves with “building block” issues, i.e., theories, algorithms, architectures, and software tools that can or will be used for modules within future systems. The topics covered are clustering, natural language, adaptive algorithms, distributed processing, knowledge acquisition, and systems programming.
https://doi.org/10.1142/9789814439404_fmatter
https://doi.org/10.1142/9789814439404_0001
The authors have studied how to integrate concepts from other fields, such as psychology and imagery, into Computer-Aided Education (CAE) using Artificial Intelligence (AI).
It is believed that images (graphic or real) may play an important role in learning. However, if images are to serve as more than illustration, it is necessary to employ concepts like those of AI. In this paper, the application of AI concepts in an image-based CAE system is developed, and the prototype BIRDS is briefly presented.
The concepts presented here include the pedagogical and psychological aspects of learning and the importance of images in learning, both as pedagogical objects and as a means of obtaining psychological information about a learner. A student model, defined by the authors, is developed, and its components, construction, and use are explained.
https://doi.org/10.1142/9789814439404_0002
This paper discusses the design and implementation of PLAYMAKER, a knowledge-based system for characterizing hydrocarbon plays. PLAYMAKER is a component of XX (eXpert eXplorer), a workstation-based tool that aids exploration geologists in a number of different tasks: sediment and carbonate simulation, play and field characterization, retrieval and storage of information in a geological database, comparison of the play or field under study with other fields in the database, and report generation. PLAYMAKER is implemented using MIDST (Mixed Inferencing Dempster-Shafer Tool), a rule-based expert system shell that incorporates mixed-initiative and inexact reasoning based on the Dempster-Shafer evidence combination scheme. This paper discusses the effectiveness of a two-level knowledge base structure adopted for the design and implementation of PLAYMAKER.
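As a rough illustration of the evidence-combination scheme mentioned above, the sketch below implements Dempster's rule for combining two basic probability assignments over a frame of discernment. The frame, the hypotheses, and the mass values are invented for illustration and are not taken from PLAYMAKER or MIDST.

```python
def combine(m1, m2):
    """Combine two basic probability assignments with Dempster's rule.

    Each mass function maps frozensets of hypotheses to belief mass.
    """
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb           # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence cannot be combined")
    k = 1.0 - conflict                        # normalization factor
    return {h: m / k for h, m in combined.items()}

# Two (made-up) pieces of evidence about whether a play is favorable.
frame = frozenset({"favorable", "unfavorable"})
m1 = {frozenset({"favorable"}): 0.6, frame: 0.4}
m2 = {frozenset({"favorable"}): 0.3, frozenset({"unfavorable"}): 0.2, frame: 0.5}
print(combine(m1, m2))
```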
https://doi.org/10.1142/9789814439404_0003
Thermal infrared images of the ocean obtained from satellite sensors are widely used for the study of ocean dynamics. The derivation of mesoscale ocean information from satellite data depends to a large extent on the correct interpretation of infrared oceanographic images. The difficulty of the image analysis and understanding problem for oceanographic images is due in large part to the lack of precise mathematical descriptions of the ocean features, coupled with the time-varying nature of these features and the complication that the view of the ocean surface is typically obscured by clouds, sometimes almost completely. To address this problem, the present paper describes a hybrid technique that utilizes a nonlinear probabilistic relaxation method and an expert system for the oceanographic image interpretation problem. This paper highlights the advantages of using contextual information in the feature labeling algorithm. The need for an expert system and its feedback in automatic interpretation of oceanic features is discussed. The paper presents some important results of the series of experiments conducted at the Remote Sensing Branch of the Naval Oceanographic and Atmospheric Research Laboratory on National Oceanic and Atmospheric Administration Advanced Very High Resolution Radiometer (AVHRR) imagery data. The results clearly indicate the substantial improvement in labeling due to the oceanographic expert system.
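For readers unfamiliar with relaxation labeling, the sketch below shows one update step of a nonlinear probabilistic relaxation scheme in the Rosenfeld-Hummel-Zucker style, in which each site's label probabilities are adjusted by contextual support from its neighbours. The paper's exact formulation may differ, and the label set, compatibility matrix, and probabilities here are purely illustrative.

```python
import numpy as np

def relaxation_step(p, compat, neighbours):
    """One nonlinear probabilistic relaxation update.

    p          -- (n_sites, n_labels) current label probabilities
    compat     -- (n_labels, n_labels) compatibility coefficients in [-1, 1]
    neighbours -- list of neighbour index lists, one per site
    """
    p_new = np.empty_like(p)
    for i in range(p.shape[0]):
        # contextual support q_i(l) = mean over neighbours j of sum_l' r(l, l') p_j(l')
        q = np.mean([compat @ p[j] for j in neighbours[i]], axis=0)
        unnorm = p[i] * (1.0 + q)            # nonlinear update rule
        p_new[i] = unnorm / unnorm.sum()     # renormalize to a probability vector
    return p_new

# Three sites in a row with labels ("water", "cloud"); compatibilities favour agreement.
p = np.array([[0.6, 0.4], [0.5, 0.5], [0.3, 0.7]])
compat = np.array([[0.5, -0.5], [-0.5, 0.5]])
neighbours = [[1], [0, 2], [1]]
print(relaxation_step(p, compat, neighbours))
```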
https://doi.org/10.1142/9789814439404_0004
An expert system that acts as an intelligent assistant to operators tuning a particle beam accelerator was developed. The system incorporates three approaches to tuning: (1) Duplicating within a software program the reasoning and the procedures used by an operator to tune an accelerator. This approach has been used to steer particle beams through the transport section of Lawrence Livermore National Laboratory's Advanced Test Accelerator and through the injector section of the Experimental Test Accelerator. (2) Using a model to simulate the position of a beam in an accelerator. The simulation is based on data taken directly from the accelerator while it is running. This approach will ultimately be used by operators of the Experimental Test Accelerator to first compare actual and simulated beam performance in real time, then to determine which set of parameters is optimum in terms of centering the beam, and finally to feed those parameters to the accelerator. Operators can also use the model to determine if a component has failed. (3) Using a mouse to manually select and control the magnets that steer the beam. Operators on the Experimental Test Accelerator can also use the mouse to call up windows that display the horizontal and vertical positions of the beam as well as its current.
https://doi.org/10.1142/9789814439404_0005
The purpose of this expert system is to assess a predisposition to bleeding in a patient undergoing a tonsillectomy and/or adenoidectomy, as may occur with patients who have certain blood conditions such as hemophilia, von Willebrand's disease, and platelet function defects. This objective is achieved by establishing a correlation between the patients' responses to a medical questionnaire and the relative quantities of blood lost during their operations.
Three major modules constitute the system: the automated questionnaire to be completed by the patient, the expert system proper, and the patient database. The questionnaire is divided into fourteen sections on topics such as the occurrence of bruises and positive familial history. The expert system takes the responses to the questionnaire and determines risk categories for patients undergoing the operations. The patient database contains the patients' questionnaire responses, personal data and the risk assessments generated by the expert system.
Subsequent to the development of the expert system prototype, inductive learning techniques were examined to automatically classify patients as either normal or abnormal. Specifically, Quinlan's ID3 induction algorithm was used whereby a training set of patient data generates a classification tree whose leaf nodes correspond to the classification outcomes.
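The sketch below illustrates the core of Quinlan's ID3 procedure referred to above: at each node the attribute with the highest information gain is chosen, and the training set is split on its values until the leaves are pure. The questionnaire attributes, values, and class labels in the example are hypothetical.

```python
import math
from collections import Counter

def entropy(labels):
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def id3(rows, labels, attributes):
    if len(set(labels)) == 1:                 # pure node -> leaf
        return labels[0]
    if not attributes:                        # no attributes left -> majority class
        return Counter(labels).most_common(1)[0][0]
    base = entropy(labels)
    def gain(attr):
        info = 0.0
        for value in set(r[attr] for r in rows):
            subset = [lab for r, lab in zip(rows, labels) if r[attr] == value]
            info += len(subset) / len(labels) * entropy(subset)
        return base - info
    best = max(attributes, key=gain)          # attribute with highest information gain
    tree = {best: {}}
    for value in set(r[best] for r in rows):
        idx = [i for i, r in enumerate(rows) if r[best] == value]
        tree[best][value] = id3([rows[i] for i in idx], [labels[i] for i in idx],
                                [a for a in attributes if a != best])
    return tree

# Hypothetical questionnaire responses with "normal" / "abnormal" outcomes.
rows = [{"bruises": "often", "family_history": "yes"},
        {"bruises": "rare",  "family_history": "no"},
        {"bruises": "often", "family_history": "no"},
        {"bruises": "rare",  "family_history": "yes"}]
labels = ["abnormal", "normal", "abnormal", "normal"]
print(id3(rows, labels, ["bruises", "family_history"]))
```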
This paper presents the architecture of the hematology expert system, identifies some problems encountered during the knowledge engineering process, and reports statistical data on the accuracy of its risk assessments. It also offers a comparative review of the expert system and inductive learning methodologies.
https://doi.org/10.1142/9789814439404_0006
Even though AI is a relatively new discipline, many of its concepts have already found practical application. Expert systems, in particular, have made significant contributions in fields such as business, medicine, engineering design, chemistry, and particle physics.
This paper describes an expert system developed to aid mechanical engineering designers in the preliminary design of variable-stroke internal-combustion engines. Variable-stroke engines are more economical in fuel consumption, but their design is particularly difficult to accomplish. With the traditional design approach, synthesizing the mechanisms for the design is rather difficult, and evaluating the mechanisms is an even more cumbersome and time-consuming effort. Our expert system assists the designer by generating and evaluating a large number of design alternatives represented in the form of graphs. By applying structural and design rules obtained from design experts to these graphs, good-quality preliminary design configurations of the engines are promptly deduced. This approach can also be used in designing other types of mechanisms.
https://doi.org/10.1142/9789814439404_0007
Clustering, pattern recognition, and classification are important components of many artificial intelligence systems, especially those designed to classify new observations. Often, one has access to a history of old observations that have been previously classified and, under these circumstances, the old data may be used as a training set from which one may obtain rules for the subsequent classification of new data. The techniques used to obtain these rules may be traditional statistical methods or modern computer-intensive techniques. Sometimes, however, the history of old observations has not been previously classified. Under these circumstances, the analyst simply wishes to uncover structure in the data and ascertain whether the structure is apparent or real. When the analyst is searching for clusters, statistical clustering methodologies are often used. Although effective at locating clusters, such approaches leave the interpretation of the clusters as a task for the human analyst. A relatively new class of "conceptual" clustering techniques has emerged from the discipline of machine learning. These techniques attempt to both locate and explain clusters among the data. In this way, the explanations of cluster membership may be used to construct rules for the subsequent classification of new data. The generation of interpretable rules, whether by the use of classification algorithms or conceptual clustering algorithms, is of considerable importance in reducing the knowledge acquisition "bottleneck" that often impedes progress towards the building of rule-based systems. In this paper, two new techniques for conceptual clustering are introduced and compared.
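The abstract does not describe the two new techniques themselves, so the sketch below only illustrates the general flavour of conceptual clustering: category utility, the partition score used by COBWEB-style systems, rewards clusters whose attribute values are more predictable within the cluster than in the data as a whole. The data and attributes are invented.

```python
from collections import Counter

def expected_correct(rows):
    """Sum over attributes of sum_j P(attribute = value_j)^2 within the given rows."""
    total = 0.0
    n = len(rows)
    for attr in rows[0]:
        counts = Counter(r[attr] for r in rows)
        total += sum((c / n) ** 2 for c in counts.values())
    return total

def category_utility(clusters):
    """Score a partition: clusters is a list of lists of dicts (attribute -> value)."""
    all_rows = [r for cluster in clusters for r in cluster]
    baseline = expected_correct(all_rows)
    n, k = len(all_rows), len(clusters)
    cu = 0.0
    for cluster in clusters:
        p_cluster = len(cluster) / n
        cu += p_cluster * (expected_correct(cluster) - baseline)
    return cu / k

clusters = [
    [{"wings": "yes", "legs": "2"}, {"wings": "yes", "legs": "2"}],
    [{"wings": "no",  "legs": "4"}, {"wings": "no",  "legs": "4"}],
]
print(category_utility(clusters))   # higher scores mean more informative clusters
```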
https://doi.org/10.1142/9789814439404_0008
A natural-language man-machine interface, based upon semantics, for a DataBase Management System (DBMS) called VORAS is presented. This DBMS can also be used for knowledge representation and is well-suited to the design of queries in natural language. The system VORAS is an object-oriented DBMS developed from a specific model of representation called the Property Driven Model (PDM). A user may write a query in natural language, and most of the analysis is done at a semantic level. A syntactic level has been added to improve the performance of the system. However, it is still possible to use a short-hand style.
https://doi.org/10.1142/9789814439404_0009
This paper describes Grumman's Rapid Expert Assessment to Counter Threats (REACT) project, designed to aid pilots in air combat decision making. We present a hierarchical design for a planning system which addresses some of the real-time aspects of planning for threat response. This paper concentrates on the lowest level of this hierarchy which is responsible for planning combat maneuvers at low altitude over hilly terrain when the enemy is not in sight. REACT's Lost Line of Sight module attempts to maximize the amount and depth of knowledge which can be utilized in the time available before the system must commit to its next action. It utilizes a hybrid architecture for planning decisions which incorporates multiple knowledge representations and planners based on artificial intelligence, neural networks, and decision theory. This architecture allows planning at different degrees of competence to be performed by concurrently operating planners with differing amounts of knowledge. We describe research on the planning issues in REACT as well as the associated knowledge representation and knowledge acquisition issues. In addition, we describe how work on developing terrain reasoning capability in REACT has suggested guidelines for knowledge base design and data management, system and language specifications, and planner architectures pertinent to real-time coupled systems.
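As a loose illustration of concurrently operating planners with differing amounts of knowledge, the sketch below runs several planners in parallel and commits to the most competent answer available when the real-time budget expires. The planner names, timings, and maneuvers are hypothetical and do not reflect REACT's actual components.

```python
import threading, time

results = {}
lock = threading.Lock()

def planner(name, competence, delay, maneuver):
    time.sleep(delay)                       # stand-in for planning work
    with lock:
        results[name] = (competence, maneuver)

# (name, competence, planning time in seconds, proposed maneuver) -- all made up.
planners = [
    ("reflexive_rules", 1, 0.01, "turn_away_from_last_known_threat"),
    ("neural_net",      2, 0.05, "hug_terrain_along_valley"),
    ("decision_theory", 3, 0.50, "pop_up_and_reacquire_line_of_sight"),
]
threads = [threading.Thread(target=planner, args=p, daemon=True) for p in planners]
for t in threads:
    t.start()

time.sleep(0.1)                             # real-time budget before committing
with lock:
    best = max(results.values(), key=lambda r: r[0])   # most competent answer so far
print("commit to:", best[1])
```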
https://doi.org/10.1142/9789814439404_0010
This work explores a distributed problem solving (DPS) approach, namely the AM/AG (Amplification/Aggregation) model. The AM/AG model is a hierarchic social system metaphor for DPS based on Mintzberg's model of organizations. At the core of the model are information flow mechanisms, namely, amplification and aggregation. Amplification is a process of decomposing a given task, called an agenda, into a set of subtasks with a magnified degree of specificity and distributing them to multiple processing units downward in the hierarchy. Aggregation is a process of combining the results reported from multiple processing units into a unified view, called a resolution, and promoting the conclusion upward in the hierarchy.
Amplification is discussed in detail. A set of generative rules is introduced; each rule specifies a set of actions for transforming an input agenda into other forms with higher specificity. The proposed model can be used to account for the memory recall process, which makes associations among large numbers of related concepts, sorts out the combined results, and promotes the most plausible ones. An example of memory recall is used to illustrate the model.
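A toy rendering of this flow, with the memory-recall example in mind, might look like the sketch below: a supervisor amplifies an agenda into more specific subtasks, leaf units report associations, and aggregation promotes the most plausible ones as the resolution. The generative rule, the stored associations, and the scores are all invented.

```python
def amplify(agenda):
    """Generative rule: rewrite an agenda into more specific subtasks."""
    concept = agenda["recall"]
    return [{"recall": concept, "aspect": aspect}
            for aspect in ("appearance", "habitat", "behavior")]

def work(subtask, memory):
    """A leaf unit reports associations for its subtask, each with a plausibility score."""
    return memory.get((subtask["recall"], subtask["aspect"]), [])

def aggregate(reports):
    """Combine subordinate reports and promote the most plausible associations."""
    merged = [assoc for report in reports for assoc in report]
    return sorted(merged, key=lambda a: a[1], reverse=True)[:3]

memory = {
    ("robin", "appearance"): [("red breast", 0.9), ("small bird", 0.6)],
    ("robin", "habitat"):    [("gardens", 0.7)],
    ("robin", "behavior"):   [("sings at dawn", 0.8)],
}
subtasks = amplify({"recall": "robin"})                 # downward amplification
resolution = aggregate([work(s, memory) for s in subtasks])  # upward aggregation
print(resolution)   # the promoted conclusions, e.g. [('red breast', 0.9), ...]
```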
https://doi.org/10.1142/9789814439404_0011
CAUSA is a knowledge acquisition tool that supports the incremental modeling of complex dynamic systems throughout the knowledge acquisition task. It provides an environment for the modeling and simulation of dynamic systems on a quantitative level. This environment offers a conceptual framework that includes primitives such as objects, processes, and causal dependencies, allowing a broad class of complex systems to be modeled. Simulation allows for the quantitative and qualitative inspection and empirical investigation of the behavior of the modeled system. CAUSA is implemented in Knowledge-Craft and runs on a Symbolics 3640.
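A highly simplified sketch of the kind of object/process model described above is given below: objects carry quantities, and processes whose preconditions hold apply quantitative effects along their causal dependencies at each simulation step. The tank-draining example and all numbers are invented and are not taken from CAUSA.

```python
# Objects and their quantities (all values made up).
objects = {"tank": {"level": 10.0}, "outflow": {"rate": 0.5}}

def drain(objs, dt):
    """Quantitative effect: the outflow rate causally lowers the tank level."""
    objs["tank"]["level"] = max(0.0, objs["tank"]["level"] - objs["outflow"]["rate"] * dt)

processes = [
    {"name": "draining",
     "active": lambda objs: objs["tank"]["level"] > 0.0,   # causal precondition
     "effect": drain},
]

def simulate(objects, processes, dt=1.0, steps=5):
    history = []
    for _ in range(steps):
        for proc in processes:
            if proc["active"](objects):      # fire only while the precondition holds
                proc["effect"](objects, dt)  # apply the quantitative effect
        history.append(objects["tank"]["level"])
    return history

print(simulate(objects, processes))          # [9.5, 9.0, 8.5, 8.0, 7.5]
```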
https://doi.org/10.1142/9789814439404_0012
The Prioritized Production System (PRIOPS) is an architecture that supports time-constrained, knowledge-based embedded system programming and learning. Inspired by the theory of automatic and controlled human information processing in cognitive psychology, PRIOPS supports a two-tiered processing approach. The automatic partition provides for compilation of productions into constant-time-constrained processes for reaction to environmental conditions. The notion of a habit in humans approximates the concept of automatic processing, which trades flexibility and generality for efficiency and predictability in dealing with expected environmental situations. Explicit priorities allow critical automatic activities to pre-empt and defer execution of lower-priority processing. An augmented version of the Rete match algorithm implements O(1), priority-scheduled automatic matching. The controlled partition supports more complex, less predictable activities such as problem solving, planning, and learning, which apply in novel situations for which automatic reactions do not exist. The PRIOPS notation allows the programmer of knowledge-based embedded systems to work at a more appropriate level of abstraction than is provided by conventional embedded system programming techniques. This paper explores programming and learning in PRIOPS in the context of a maze traversal program.
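The sketch below caricatures the two-tier idea in a maze setting: automatic productions are matched against working memory and selected strictly by priority, so a critical reaction pre-empts lower-priority ones, and controlled processing is invoked only when no automatic rule matches. The rules, priorities, and working-memory elements are hypothetical, not PRIOPS's actual notation.

```python
# (priority, name, condition, action) -- the highest-priority match wins.
automatic_rules = [
    (10, "avoid_wall",  lambda wm: wm["wall_ahead"], lambda wm: "turn_left"),
    (5,  "follow_path", lambda wm: wm["path_ahead"], lambda wm: "move_forward"),
]

def controlled_processing(wm):
    """Slower deliberation (planning, learning) for novel situations."""
    return "replan_route"

def step(wm):
    # Conflict set = automatic rules whose conditions match working memory.
    matches = [(prio, name, act) for prio, name, cond, act in automatic_rules if cond(wm)]
    if matches:
        prio, name, act = max(matches, key=lambda m: m[0])  # priority-scheduled selection
        return name, act(wm)
    return "controlled", controlled_processing(wm)          # fall back to controlled tier

print(step({"wall_ahead": True,  "path_ahead": True}))   # ('avoid_wall', 'turn_left')
print(step({"wall_ahead": False, "path_ahead": False}))  # ('controlled', 'replan_route')
```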