  Bestsellers

  • An ANN-based data-predictive approach for comparative study between CFD finite difference and finite volume method

    In computational fluid dynamics (CFD), the methods used to build commercial codes have evolved over the years. Each method has its own strengths, but the export and prediction of data are crucial elements for post-processing and validating results. In the present investigation, a detailed comparative analysis of the finite difference method (FDM) and the finite volume method (FVM) is performed for a 1D steady-state heat conduction problem over a 1-m-long plate. The comparison covers solution construction and validation for both FDM and FVM against the analytical and computational schemes. A convergence-dependent study is performed as a multi-objective optimization to show how an artificial neural network (ANN) can be used to verify and validate the CFD solution.
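
    As a hedged illustration of the FDM/FVM comparison described above, the sketch below solves the same kind of 1D steady-state conduction problem on a 1 m plate and checks both discretizations against the linear analytical profile. The boundary temperatures, grid size, and function names are illustrative assumptions, not the paper's setup.

    ```python
    import numpy as np

    def fdm_1d(n, L=1.0, t_left=100.0, t_right=0.0):
        # Central-difference FDM for d2T/dx2 = 0 at n interior nodes.
        A = -2.0 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)
        b = np.zeros(n)
        b[0], b[-1] = -t_left, -t_right
        x = np.linspace(0.0, L, n + 2)[1:-1]
        return x, np.linalg.solve(A, b)

    def fvm_1d(n, L=1.0, t_left=100.0, t_right=0.0):
        # Cell-centered FVM: boundary faces sit half a cell from the centers.
        dx = L / n
        A = np.zeros((n, n)); b = np.zeros(n)
        for i in range(n):
            a_w = 2.0 / dx if i == 0 else 1.0 / dx
            a_e = 2.0 / dx if i == n - 1 else 1.0 / dx
            A[i, i] = -(a_w + a_e)
            if i > 0: A[i, i - 1] = a_w
            if i < n - 1: A[i, i + 1] = a_e
        b[0] -= 2.0 / dx * t_left
        b[-1] -= 2.0 / dx * t_right
        x = (np.arange(n) + 0.5) * dx
        return x, np.linalg.solve(A, b)

    exact = lambda x, tl=100.0, tr=0.0, L=1.0: tl + (tr - tl) * x / L
    for name, (x, T) in {"FDM": fdm_1d(20), "FVM": fvm_1d(20)}.items():
        print(name, "max error:", np.abs(T - exact(x)).max())
    ```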

  • Incorporating Turbulence Models into the Lattice-Boltzmann Method

    The Lattice-Boltzmann method (LBM) is extended to allow the incorporation of traditional turbulence models. Implementations of a two-layer mixing-length algebraic model and of two versions of the k-ε two-equation model, Standard and RNG, in conjunction with a wall model, are presented. Validation studies are performed for turbulent flow in a straight pipe at three Reynolds numbers and over a backward-facing step with an expansion ratio of 1.5 and Re_H = 44,000. All models produce good agreement with experiment for the straight pipes, but the RNG k-ε model is best able to capture both the recirculation length, within 2% of experiment, and the detailed structure of the mean flow for the backward-facing step.
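
    One common way to couple an algebraic turbulence model to a BGK-type LBM, sketched below under stated assumptions, is to add the eddy viscosity to the molecular viscosity and convert the total into a local relaxation time. The two-layer cap alpha, the wall distance y, and the velocity gradient are illustrative parameters; the paper's exact closure may differ.

    ```python
    import numpy as np

    KAPPA = 0.41  # von Karman constant

    def nu_turbulent(y, dudy, delta, alpha=0.09):
        # Two-layer mixing length: l = kappa*y near the wall,
        # capped at alpha*delta in the outer layer (Prandtl form).
        l = np.minimum(KAPPA * y, alpha * delta)
        return l**2 * np.abs(dudy)

    def effective_tau(nu_molecular, nu_t):
        # BGK relation in lattice units: nu = (tau - 0.5) / 3, so the
        # local relaxation time becomes tau = 3*(nu0 + nu_t) + 0.5.
        return 3.0 * (nu_molecular + nu_t) + 0.5

    # e.g., a node 0.05 channel half-heights from the wall:
    tau = effective_tau(1e-4, nu_turbulent(y=0.05, dudy=2.0, delta=1.0))
    print(tau)
    ```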

  • Assessment of improved delayed detached eddy simulation in predicting unsteady flows and sound around a circular cylinder

    Unsteady flows in engineering are usually calculated with the Unsteady Reynolds-Averaged Navier–Stokes (URANS) equations owing to their low computational cost. However, the numerical resolution of URANS, especially in predicting unsteady wake flows and sound, is still questionable. In this work, unsteady flow and sound calculations for a circular cylinder are carried out using Improved Delayed Detached Eddy Simulation (IDDES) and the Ffowcs Williams–Hawkings (FW-H) analogy. The predicted results are compared with those from previous studies in the literature in terms of the mean and RMS of the velocity components as well as the sound pressure. The results show that IDDES retains much of the numerical accuracy of the Large Eddy Simulation (LES) approach in predicting unsteady flows and noise while requiring reduced computational resources in comparison to LES. It is believed that IDDES can calculate complex unsteady flows and flow-generated sound with reasonable accuracy in the engineering field, making it a promising scale-resolving method for avoiding the expensive computational requirements of LES.
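
    The statistics used for the comparison above are standard; as a minimal sketch (not the authors' post-processing pipeline), the mean/RMS of a velocity signal and an overall sound pressure level can be computed as:

    ```python
    import numpy as np

    P_REF = 20e-6  # standard reference pressure in air, Pa

    def mean_rms(u):
        # Mean and RMS of the fluctuating part of a velocity time series.
        u = np.asarray(u, float)
        mean = u.mean()
        return mean, np.sqrt(np.mean((u - mean) ** 2))

    def oaspl(p):
        # Overall sound pressure level, dB, from pressure fluctuations.
        p = np.asarray(p, float)
        p_rms = np.sqrt(np.mean((p - p.mean()) ** 2))
        return 20.0 * np.log10(p_rms / P_REF)

    # e.g., a synthetic 1 kHz pressure trace:
    t = np.linspace(0.0, 1.0, 48_000)
    print(oaspl(2e-2 * np.sin(2 * np.pi * 1000 * t)))  # ~57 dB
    ```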

  • HIGH LEVEL DOCUMENT ANALYSIS GUIDED BY GEOMETRIC ASPECTS

    The realization of the paper-free office seems to be more difficult than expected. Therefore, good paper-computer interfaces are necessary to transform paper documents into an electronic form that allows the use of a filing and retrieval system. An electronic document page is an optically scanned and digitized representation of a printed page. Document analysis is the problem of interpreting and labeling the constituents of the document. Although there are very reliable optical character recognition (OCR) methods, the process can be very inefficient. To prune the search space and become more efficient, supporting search methods have to be developed. This article proposes an approach to identifying the layout of a document page by dividing it recursively into nested rectangular areas. The procedure is used as the basis for a document layout model, which is able to control an automatic interpretation mechanism for deriving a high-level representation of the contents of a document. We have implemented our method in Common Lisp on a Symbolics 3640 workstation and have run it on a large population of office documents. The results obtained have been very encouraging and have convincingly confirmed the soundness of our approach.
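
    The recursive division into nested rectangles is in the spirit of the classical X-Y cut. A hedged sketch follows, assuming a binary page image and a minimum blank-gap width; both are illustrative parameters, not taken from the paper.

    ```python
    import numpy as np

    def widest_blank_run(profile, min_gap):
        # Widest run of all-blank rows/columns in a projection profile.
        best, start = None, None
        blanks = list(profile == 0) + [False]   # sentinel closes a trailing run
        for i, empty in enumerate(blanks):
            if empty and start is None:
                start = i
            elif not empty and start is not None:
                if i - start >= min_gap and (best is None or i - start > best[1] - best[0]):
                    best = (start, i)
                start = None
        return best

    def xy_cut(img, min_gap=8, x0=0, y0=0, out=None):
        # Recursively split a binary image (1 = ink) into nested rectangles.
        if out is None:
            out = []
        if img.size == 0 or not img.any():
            return out
        cut = widest_blank_run(img.sum(axis=1), min_gap)   # horizontal cut
        if cut:
            s, e = cut
            xy_cut(img[:s], min_gap, x0, y0, out)
            xy_cut(img[e:], min_gap, x0, y0 + e, out)
            return out
        cut = widest_blank_run(img.sum(axis=0), min_gap)   # vertical cut
        if cut:
            s, e = cut
            xy_cut(img[:, :s], min_gap, x0, y0, out)
            xy_cut(img[:, e:], min_gap, x0 + e, y0, out)
            return out
        h, w = img.shape
        out.append((x0, y0, w, h))  # leaf block
        return out
    ```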

  • Virtual Verification and Validation of Automotive System

    An integrated framework for Virtual Verification and Validation (VVV) of a complete automotive system is proposed. The framework can simulate/emulate the system on three levels: System on Chip (SoC), Electronic Control Unit (ECU), and system level. The framework emulates the real system, including hardware (HW) and software (SW). It enhances the automotive V-cycle and allows co-development of the automotive system SW and HW. The procedure for debugging an AUTOSAR application on the virtual platform (VP) is shown. SW and HW profiling is feasible with the presented methodology, and verification and validation of automotive embedded SW is also presented. The proposed methodology remains efficient as system complexity increases, which shortens the development cycle of the automotive system. It also provides fault-injection capability. With HW emulation, a co-debugging mechanism is demonstrated. A case study covering the framework's capability is presented, demonstrating the proposed framework and methodology to design, simulate, trace, profile, and debug AUTOSAR SW using VPs.
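
    Fault injection on a virtual platform typically means corrupting emulated state before the software under test observes it. The toy sketch below flips a random bit in a byte-addressable memory model; the class and method names are hypothetical and not taken from the paper.

    ```python
    import random

    class FaultInjector:
        # Toy VP-style fault injection: flip a bit in an emulated memory
        # so embedded-SW error handling can be exercised deterministically.
        def __init__(self, memory, seed=0):
            self.mem = memory
            self.rng = random.Random(seed)  # seeded for repeatable campaigns

        def inject_bit_flip(self, addr=None):
            addr = self.rng.randrange(len(self.mem)) if addr is None else addr
            bit = self.rng.randrange(8)
            self.mem[addr] ^= 1 << bit
            return addr, bit

    mem = bytearray(64)
    print(FaultInjector(mem).inject_bit_flip())  # (address, bit) corrupted
    ```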

  • VERIFYING REQUIREMENTS THROUGH MATHEMATICAL MODELLING AND ANIMATION

    Achieving confidence in the correctness, completeness, and consistency of requirements specifications can be problematic, and the consequences of incorrect requirements can be costly. In this paper we argue that specification and animation can provide reasonably high levels of assurance in the requirements without the overheads of using general-purpose theorem-proving tools. We propose a framework based on mode analysis and the operational semantics of logic programs for animating specifications. The framework allows us to combine prototyping and limited forms of automated deduction to increase our levels of confidence in specifications. Finally, we show how such a framework can be used to increase the level of confidence in the correctness of a simple dependency-management system specification written in Z.
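
    To make the idea of animating a specification concrete, here is a hedged sketch (a Python stand-in, not the paper's logic-programming framework) of a dependency-management operation rendered as an executable state transition with an explicit precondition, so requirement scenarios can be run directly:

    ```python
    def creates_cycle(deps, a, b):
        # Would adding the edge a -> b make the dependency graph cyclic?
        stack, seen = [b], set()
        while stack:
            n = stack.pop()
            if n == a:
                return True
            if n not in seen:
                seen.add(n)
                stack.extend(deps.get(n, ()))
        return False

    def add_dep(deps, a, b):
        # Z-style operation "AddDep": the precondition is the acyclicity invariant.
        if creates_cycle(deps, a, b):
            raise ValueError("precondition violated: dependency cycle")
        new = {k: set(v) for k, v in deps.items()}
        new.setdefault(a, set()).add(b)
        return new

    state = add_dep({}, "app", "lib")      # animate a scenario...
    state = add_dep(state, "lib", "core")
    # add_dep(state, "core", "app")        # ...this step would be rejected
    ```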

  • A GENERIC FORMAL FRAMEWORK FOR CONSTRUCTING AGENT INTERACTION PROTOCOLS

    The design of agent interaction protocols (AIPs) is one of the principal issues in building multi-agent systems. Indeed, the construction of AIPs should integrate theories, methodologies, and tools. We propose in this paper a unifying framework that provides a generic agent architecture to be reused, as well as a methodology to construct and refine AIP specifications incrementally. This framework is based on the highly expressive formal language Lotos and its related technologies, such as finite state machines and temporal logics. Hence, the proposed framework also facilitates formal validation and verification of AIP specifications using rigorous tools. We argue that there are three layers of semantics of Lotos specifications that can improve Lotos expressivity in describing agent interaction. Therefore, this framework can describe almost all aspects of agent interaction, and at different abstraction levels. In addition, we demonstrate how to generate an online auction protocol from the generic framework, and how to validate and verify this protocol.
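
    As an illustration of the finite-state-machine view of an AIP, here is a minimal sketch of an online-auction protocol; the states and message names are invented for illustration and are not the paper's Lotos specification.

    ```python
    # Protocol as a transition table: (state, message) -> next state.
    AUCTION_FSM = {
        ("open",       "call_for_bids"): "bidding",
        ("bidding",    "bid"):           "bidding",
        ("bidding",    "close"):         "evaluating",
        ("evaluating", "award"):         "done",
        ("evaluating", "no_bids"):       "done",
    }

    def run(trace, state="open"):
        # Validate an interaction trace against the protocol FSM.
        for msg in trace:
            key = (state, msg)
            if key not in AUCTION_FSM:
                raise ValueError(f"illegal message {msg!r} in state {state!r}")
            state = AUCTION_FSM[key]
        return state

    print(run(["call_for_bids", "bid", "bid", "close", "award"]))  # -> done
    ```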

  • ASSISTED KNOWLEDGE BASE GENERATION, MANAGEMENT AND COMPETENCE RETRIEVAL

    Despite the existence of many systems for developing and managing structured taxonomies and/or SKOS models for domains in which small document sets are accessible, the production and maintenance of these domain knowledge bases is still a very expensive and time-consuming process. This paper proposes a solution for assisting expert users in the development and management of knowledge bases, including SKOS and ontology modeling structures and relationships. The proposed solution accelerates knowledge production by crawling and exploiting different kinds of sources (in multiple languages and with several inconsistencies among them). The proposed tool supports the experts in defining relationships among the most recurrent concepts, reducing the time to SKOS production and allowing assisted production. The validity of the produced knowledge base has been assessed by using a SPARQL query interface and a precision and recall model. The results have demonstrated better performance with respect to the state of the art. The solution has been developed for the Open Space Innovative Mind project, which aims to create a portal that allows industries to pose semantic queries to discover potential competences in a large institution such as the University of Florence, in which several distinct domains are associated with their own departments.
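
    The precision-and-recall assessment mentioned above reduces to the standard set-based definitions; a minimal sketch (the harness around the actual SPARQL results is the paper's and is not reproduced here):

    ```python
    def precision_recall(retrieved, relevant):
        # Standard set-based precision and recall over concept identifiers.
        retrieved, relevant = set(retrieved), set(relevant)
        tp = len(retrieved & relevant)  # true positives
        precision = tp / len(retrieved) if retrieved else 0.0
        recall = tp / len(relevant) if relevant else 0.0
        return precision, recall

    print(precision_recall({"robotics", "optics", "welding"},
                           {"robotics", "optics", "photonics"}))  # both ~0.667
    ```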

  • A SYSTEMATIC REVIEW OF THE EMPIRICAL VALIDATION OF OBJECT-ORIENTED METRICS TOWARDS FAULT-PRONENESS PREDICTION

    Object-oriented (OO) approaches to software development promised better maintainable and reusable systems, but the complexity resulting from OO features usually introduces faults that are difficult to detect or anticipate during the software change process. Thus, the earlier such faults are detected and fixed, the lower the maintenance costs. Several OO metrics have been proposed for assessing the quality of OO design and code, and several empirical studies have been undertaken to validate the impact of OO metrics on fault proneness (FP). The question now is: which metrics are useful in measuring the FP of OO classes? Consequently, we investigate the existing empirical validations of CK + SLOC metrics based on their state of significance, validation, and usefulness. We used the systematic literature review (SLR) methodology over a number of relevant article sources, and our results show 29 relevant empirical studies. Further analysis indicates that coupling, complexity, and size measures have a strong impact on the FP of OO classes. Based on the results, we conclude that, when only CK + SLOC metrics are used, these metrics can serve as good predictors for building quality fault models, which could assist in focusing resources on high-risk components liable to cause system failures.
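
    A typical fault-proneness model of the kind the surveyed studies validate regresses a binary faulty/not-faulty label on class-level metrics. A hedged sketch with toy values; the column choice and data are illustrative, and the surveyed studies use various real systems.

    ```python
    # CK metrics include WMC, DIT, NOC, CBO, RFC, and LCOM; SLOC is size.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    X = np.array([  # [CBO, WMC, SLOC] per class (toy values)
        [2, 5, 120], [14, 40, 900], [3, 8, 200], [11, 35, 700],
    ])
    y = np.array([0, 1, 0, 1])  # 1 = class was faulty

    model = LogisticRegression().fit(X, y)
    print(model.predict_proba([[10, 30, 650]])[:, 1])  # predicted FP risk
    ```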

  • DESIGN OF REDUNDANT FORMAL SPECIFICATIONS BY LOGIC PROGRAMMING: MERGING FORMAL TEXT AND GOOD COMMENTS

    Among the various tasks involved in SE & KE, requirements engineering, specification, prototyping, and validation are regarded as crucial, since they decide whether a software system fulfills the users' expectations. Formal methods provide a rigorous framework within which such guarantees can be given. Logic programming has recently been shown to be a promising candidate for supporting these tasks, and some relevant features regarding these concerns can be captured and formalized in that way. Nevertheless, the formalism does need some explanation to make it more readable and understandable.

    This paper focuses on a specification design method that mixes formal text (represented by a logic program) and comments (using either formal or informal assertions). By the design of a specification we refer to the intertwined tasks of describing the specification and improving it through the investigation of proofs. These proofs aim to verify the link between the specification and the comments, and are partly automated. We then present our practical experience in the use of an interactive proof system. As an example, we show how this methodology is currently applied to the draft Prolog standard.

  • THE APPLICATION OF MACHINE LEARNING TOOLS TO THE VALIDATION OF AN AIR TRAFFIC CONTROL DOMAIN THEORY

    In this paper we describe a project (IMPRESS) in which machine learning (ML) tools were created and utilised for the validation of an Air Traffic Control domain theory written in first-order logic. During the project, novel techniques were devised for the automated revision of general clause-form theories using training examples. These techniques were combined in an algorithm that focuses on the parts of a theory involving ordinal sorts, and applies geometrical revision operators to repair faulty component parts. While we illustrate the feasibility of applying ML to this area, we conclude that to be effective it must be tailored to the application at hand, and used in mixed-initiative mode within a tools environment. The method is illustrated with experimental results obtained during the project.

  • THE INFECTION-ENCAPSULATION MODEL — APPLICATION TO DROSOPHILA SIMULANS AND LEPTOPILINA BOULARDI STRAINS FROM TUNISIA

    Some larvae of Drosophila infected by parasitic wasps are able to encapsulate the larvae of the parasitoid, and the emerging hosts present a visible melanized capsule in the abdomen. In this paper, a model for estimating the infection rate R_I from the rate of hosts presenting a capsule, H_C, is developed. For Drosophila simulans parasitized by Leptopilina boulardi, the model R_I = H_C / (k + (1 - k) H_C), with k = 0.123, is validated from experimental data. The validation process is based upon a bootstrap strategy over the 12,870 possible groupings of 8 elementary experimental results among 16. Validation consists of fitting the theoretical curve from one data set and controlling the overlap of the curve with the confidence rectangle established with the complementary data set. This validation process appears to be independent of the confidence level. The infection-encapsulation model is applied to field observations in Tunisia and predicts high levels of infection; this prediction is confirmed at Nasr'Allah by a direct measure of the infection rate. The biological hypotheses involved in the model are discussed. The model allows one to follow the evolution of infection in population cages and in the wild simply by catching and counting adult hosts, without access to breeding sites, and it is generalisable to other species of hosts and parasitoids presenting the encapsulation reaction.
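
    The model above is a one-line computation; a worked sketch with the paper's fitted k (the example H_C value is invented):

    ```python
    def infection_rate(h_c, k=0.123):
        # R_I = H_C / (k + (1 - k) * H_C); k fitted for
        # D. simulans parasitized by L. boulardi.
        return h_c / (k + (1.0 - k) * h_c)

    # If 30% of emerging hosts show a capsule, the estimated infection rate is
    print(round(infection_rate(0.30), 3))  # 0.777
    ```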

  • CONSTRAINTS AS INCOMPATIBILITY RELATIONS IN KBS

    In this paper we show that the different definitions of contradiction found in the literature can be modeled by means of an incompatibility relation.

  • VALIDATION OF AN ELECTROGONIOMETRY SYSTEM AS A MEASURE OF KNEE KINEMATICS DURING ACTIVITIES OF DAILY LIVING

    Purpose: The increasing use of electrogoniometry (ELG) in clinical research requires the validation of different instrumentation. The purpose of this investigation was to examine the concurrent validity of an ELG system during activities of daily living. Methods: A total of 10 asymptomatic participants gave informed consent to participate. A Biometrics SG150 electrogoniometer was directly compared to a 12-camera three-dimensional motion analysis system during walking, stair ascent, stair descent, sit to stand, and stand to sit activities for the measurement of the right knee angle. Analysis of validity was undertaken by linear regression. Standard error of estimate (SEE), standardized SEE (SSEE), and Pearson's correlation coefficient r were computed for paired trials between systems for each functional activity. Results: The 95% confidence interval of SEE was reasonable between systems across walking (LCI = 2.43°; UCI = 2.91°), stair ascent (LCI = 2.09°; UCI = 2.42°), stair descent (LCI = 1.79°; UCI = 2.10°), sit to stand (LCI = 1.22°; UCI = 1.41°), and stand to sit (LCI = 1.17°; UCI = 1.34°). Pearson's correlation coefficient r across walking (LCI = 0.983; UCI = 0.990), stair ascent (LCI = 0.995; UCI = 0.997), stair descent (LCI = 0.995; UCI = 0.997), sit to stand (LCI = 0.998; UCI = 0.999), and stand to sit (LCI = 0.996; UCI = 0.997) was indicative of a strong linear relationship between systems. Conclusion: ELG is a valid method of measuring the knee angle during activities representative of daily living. The range is within that suggested to be acceptable for the clinical evaluation of patients with musculoskeletal conditions.
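
    The validity statistics reported above can be reproduced for any pair of systems in a few lines; a minimal sketch (the standardized SEE and the confidence intervals are omitted, and the sample angles are invented):

    ```python
    import numpy as np

    def validity_stats(criterion, practical):
        # Concurrent-validity statistics for paired knee-angle trials:
        # Pearson's r and the standard error of estimate (SEE) from a
        # simple linear regression of the criterion on the practical measure.
        x = np.asarray(practical, float); y = np.asarray(criterion, float)
        r = np.corrcoef(x, y)[0, 1]
        slope, intercept = np.polyfit(x, y, 1)
        resid = y - (slope * x + intercept)
        see = np.sqrt(np.sum(resid**2) / (len(x) - 2))
        return r, see

    # paired knee angles (deg): motion capture vs. electrogoniometer
    print(validity_stats([10, 25, 41, 60, 78], [11, 24, 43, 58, 80]))
    ```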

  • Intellectual Property Protection of Natural Products

    The article discusses trade secrets, trademarks, patents, and plant patents, focusing on intellectual property protection when dealing with natural products.

  • STRUCTURAL HEALTH MONITORING ORIENTED FINITE ELEMENT MODEL OF TSING MA BRIDGE TOWER

    The modeling, updating, and validation of a structural health monitoring oriented finite element model (FEM) of the Tsing Ma suspension bridge towers are presented in this paper. The portal-type bridge tower is composed of two hollow reinforced concrete legs and four deep pre-stressed cross-beams, with a steel truss cast into the concrete of each cross-beam to form a narrow access corridor between the two legs. Except for the steel trusses, which are modeled with beam elements, all structural components are modeled with solid elements to facilitate local damage detection, in particular at member joints. The established tower model is then updated using a sensitivity-based model updating method, taking the natural frequencies identified from field measurement data as reference. Furthermore, a two-level validation criterion is proposed and implemented to examine the replication performance of the updated finite element model of the bridge tower in terms of (1) natural frequencies in higher modes of vibration and (2) dynamic characteristics of the tower-cable system. The validation results show that a good replication of dynamic characteristics is achieved by the updated tower model when compared to the field measurement results. Finally, the stress distribution and concentration of the bridge tower are investigated through nonlinear static analysis of the tower-cable system.
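
    Sensitivity-based updating, in its generic form, adjusts model parameters so the predicted natural frequencies approach the measured ones. A hedged one-step sketch follows; the parameterization, the toy numbers, and the weighting used for the actual tower model are illustrative assumptions.

    ```python
    import numpy as np

    def update_step(p, f_model, f_measured, S):
        # One Gauss-Newton step: S[i, j] = d f_i / d p_j is the sensitivity
        # of the i-th natural frequency to the j-th model parameter
        # (e.g., an elastic modulus); solve S dp ~= (f_meas - f_model).
        df = np.asarray(f_measured, float) - np.asarray(f_model, float)
        dp, *_ = np.linalg.lstsq(np.asarray(S, float), df, rcond=None)
        return np.asarray(p, float) + dp

    p = update_step([30e9, 200e9],            # concrete, steel moduli (toy)
                    f_model=[0.95, 2.10, 3.40],
                    f_measured=[1.02, 2.18, 3.52],
                    S=[[0.9e-11, 0.1e-11],
                       [1.8e-11, 0.3e-11],
                       [2.5e-11, 0.6e-11]])
    print(p)
    ```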

  • SEGMENT CLASSIFICATION OF ECG DATA AND CONSTRUCTION OF SCATTER PLOTS USING PRINCIPAL COMPONENT ANALYSIS

    In many applications, feature selection is obvious, but in medical domains, selecting features and creating a feature vector may require more effort. The wavelet transform (WT) technique is used to identify the characteristic points of an electrocardiogram (ECG) signal with fairly good accuracy, even in the presence of severe high-frequency and low-frequency noise. Principal component analysis (PCA) is a suitable technique for ECG data analysis, feature extraction, and image processing — an important technique that is not based upon a probability model. The paper seeks to derive better diagnostic parameters for reducing the size of ECG data while preserving morphology, which can be done by PCA. In this analysis, PCA is used for decorrelation of ECG signals, noise, and artifacts from various raw ECG data sets. The aim of this paper is twofold: first, to describe an elegant algorithm that uses WT alone to identify the characteristic points of an ECG signal; and second, to use a composite WT-based PCA method for redundant data reduction and better feature extraction. PCA scatter plots can be observed as a good basis for feature selection to account for cardiac abnormalities. The study is analyzed with higher-order statistics, in contrast to the conventional methods that use only geometric characteristics of feature waves and lower-order statistics. A new algorithm — viz. a PCA variance estimator — is developed for this analysis, and results are also obtained for different combinations of leads to find correlations for feature classification and useful diagnostic information. PCA scatter plots of various chest and augmented ECG leads are obtained to examine the varying orientations of the ECG data in different quadrants, indicating cardiac events and abnormalities. The efficacy of the PCA algorithm is tested on different leads of 12-channel ECG data; file no. 01 of the Common Standards for Electrocardiography (CSE) database is used for this study. Better feature extraction is obtained for some specific combinations of leads, and significant improvement in signal quality is achieved by identifying the noise and artifact components. The quadrant analysis discussed in this paper highlights the filtering requirements for further ECG processing after performing PCA, as a primary step for decorrelation and dimensionality reduction. The values of the parameters obtained from the results of PCA are also compared with those of wavelet methods.
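
    A hedged sketch of the PCA decorrelation step on a multi-lead record (the lead ordering and any preprocessing are assumptions; the paper's full WT-based pipeline is not reproduced):

    ```python
    import numpy as np

    def pca_decorrelate(ecg):
        # ecg: (n_samples, n_leads). Center, eigendecompose the covariance,
        # and project onto principal components; the PC1-PC2 scores give
        # the scatter plots discussed above.
        X = ecg - ecg.mean(axis=0)
        cov = np.cov(X, rowvar=False)
        w, V = np.linalg.eigh(cov)             # ascending eigenvalues
        order = np.argsort(w)[::-1]
        scores = X @ V[:, order]
        explained = w[order] / w.sum()         # per-PC variance fraction
        return scores, explained

    # e.g., a synthetic 12-lead record of 1000 samples:
    scores, ev = pca_decorrelate(np.random.randn(1000, 12))
    print(ev[:3])  # variance fraction captured by PC1-PC3
    ```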

  • AUTOMATIC COMPUTER-BASED TRACINGS (ACT) IN LONGITUDINAL 2-D ULTRASOUND IMAGES USING DIFFERENT SCANNERS

    Objective. The aim of this paper is to present an algorithm for the automatic computer-based tracing (ACT) of the common carotid artery (CCA) in longitudinal B-mode ultrasound images, characterized by four main features: (i) user-independence; (ii) suitability to normal and pathological images; (iii) robustness to noise; and (iv) independence of the OEM ultrasound scanner.

    Methods. Three hundred longitudinal B-mode images (100 normal CCAs, 100 CCAs with increased intima-media thickness, 60 stable plaques, and 40 echolucent plaques) were acquired using three different OEM ultrasound image scanners (GE, Siemens, and Biosound). The algorithm processed each image to delineate the region of interest containing the CCA. The outputs of the algorithm are three segmentation lines representing (a) the distal (far) adventitia layer, (b) the near adventitia layer, and (c) the lumen of the CCA. Three operators qualitatively scored the ACTs.

    Results. The CCA was correctly and automatically traced in all 300 B-mode images. The performance was independent of the image scanner used to acquire the image and of the type of CCA (healthy versus pathologic). Eight ACTs out of 300 received a poor score after visual inspection, due to an automated adventitia tracing that did not correctly follow the CCA wall in a small portion of the image.

    Conclusions. The proposed algorithm is robust in the automatic tracing of the CCA, since it is independent of the scanner and of whether the wall is normal or pathological. This approach could constitute a general basis for a completely automated segmentation procedure.

  • COMPARISON OF THE ACTIVITIES OF THE DEEP TRUNK MUSCLES MEASURED USING INTRAMUSCULAR AND SURFACE ELECTROMYOGRAPHY

    Surface electromyography (EMG) has been used to estimate deep trunk muscle activity. However, it remains unknown whether surface EMG provides an accurate estimation of this activity. The purposes of this study were to compare surface and intramuscular EMG activity measurements and investigate the efficacy of surface EMG measurement for the transversus abdominis (TrA) and the multifidus (MF) muscles. Eight healthy men participated in the study. TrA and MF activities were simultaneously measured by both intramuscular and surface EMG during isometric trunk exercises. Spearman correlation coefficients for the relationship between the two activity measurements for the right TrA, left TrA, right MF, and left MF were 0.55, 0.36, 0.67, and 0.79, respectively. For the TrA, Bland–Altman plots revealed that mean differences between measurements obtained by intramuscular EMG and surface EMG were not close to zero, with a systematic bias toward higher surface EMG values. In conclusion, surface and intramuscular EMG activity measurements were strongly correlated for MF muscles, but poorly correlated for TrA muscles.
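
    A minimal sketch of the agreement statistics used above: Spearman correlation plus a Bland–Altman bias, with 95% limits of agreement added here as a standard companion statistic (the sample values are invented):

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    def agreement_stats(intramuscular, surface):
        # Spearman correlation plus Bland-Altman bias and 95% limits of
        # agreement between the two EMG activity measurements.
        a = np.asarray(intramuscular, float)
        b = np.asarray(surface, float)
        rho, _p = spearmanr(a, b)
        diff = b - a
        bias = diff.mean()
        half_width = 1.96 * diff.std(ddof=1)
        return rho, bias, (bias - half_width, bias + half_width)

    rho, bias, loa = agreement_stats([3, 5, 9, 12, 20], [4, 7, 10, 15, 24])
    print(rho, bias, loa)
    ```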

  • CONSTRUCTION AND VALIDATION OF A THREE-DIMENSIONAL FINITE ELEMENT MODEL OF THE DISTAL RADIOULNAR JOINT

    Objectives: The aim of this study was to establish a precise three-dimensional (3D) finite element model (FEM) of the distal radioulnar joint (DRUJ) and to validate its accuracy for application to research on clinical biomechanics. Materials and methods: The right-forearm DRUJ of a volunteer (male, 28 years old, 62 kg) was scanned by computed tomography (CT) and magnetic resonance imaging (MRI). The resulting sectional images were input into Mimics 10.1 and ANSYS 10.0 to generate the 3D FEM of the DRUJ. With this FEM, bending, axial compression, and torsion load conditions were simulated, and the von Mises stress distribution of the DRUJ was determined. The simulation results were compared with biomechanics experiment results reported in the literature. Results: The constructed FEM consisted of 333,805 elements and 508,384 nodes. The simulation results with this FEM were consistent with those of the reported experiments under bending, axial compression, and torsion load conditions. Discussion: The 3D FEM of the DRUJ objectively reflects the real geometric structure of the DRUJ, and simulation with this FEM can successfully predict the results of biomechanics experiments.