Today, heavy machinery can be equipped with robots operated by artificial intelligence (AI) to streamline operations, decrease human labor, and increase efficiency. Intelligent technologies like these can boost productivity by carrying out routine tasks with extreme accuracy. Because they reduce risks to human workers, robots equipped with AI algorithms and sensors can be deployed in hazardous environments; in contexts such as building-site management, mining, or the handling of hazardous materials, safety must take precedence above everything else. The AI-aided analytics in this study open the door to predictive maintenance procedures: to detect equipment problems, the Hybrid Artificial Intelligence Framework (HAIF) examines sensor data in conjunction with historical patterns, with the objective of avoiding expensive failures and downtime. Equipment utilization, energy consumption, and fuel consumption can be optimized by applying AI-driven machine learning algorithms; several advantages arise from reducing energy waste, including cost avoidance and enhanced operational efficacy. Robots can perform exact alignments, measurements, and operations with the help of AI vision technologies, and this precision significantly impacts the manufacturing, agricultural, and construction industries. Artificial intelligence can also optimize the allocation of resources, the use of heavy equipment, and supply networks by evaluating massive amounts of data.
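As a rough illustration of the predictive-maintenance idea behind HAIF (a minimal sketch, not the framework itself; the window size, threshold, and data below are illustrative assumptions), one can flag live sensor readings that drift outside a band learned from historical data:

    import numpy as np

    def flag_anomalies(history, live, k=3.0):
        # Flag readings more than k standard deviations from the
        # historical mean -- a simple baseline detector, not HAIF itself.
        mu, sigma = np.mean(history), np.std(history)
        return [(i, x) for i, x in enumerate(live) if abs(x - mu) > k * sigma]

    # Hypothetical vibration data: a historical baseline and a live window.
    history = np.random.normal(1.0, 0.05, 10_000)
    live = [1.02, 0.98, 1.45, 1.01]   # 1.45 should be flagged
    print(flag_anomalies(history, live))

In a real deployment, flagged readings would be cross-checked against historical failure patterns before maintenance is scheduled.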
The graph exploration problem is to visit all the nodes of a connected graph by a mobile entity, e.g., a robot. The robot has no a priori knowledge of the topology of the graph or of its size. Cohen et al. [3] introduced label-guided graph exploration, which allows the system designer to add short labels to the graph nodes in a preprocessing stage; these labels can then guide the robot in its exploration of the graph. In this paper, we address the problem of adjustable 1-bit label-guided graph exploration. We focus on labeling schemes that not only enable a robot to explore the graph but also allow the system designer to adjust the ratio between the numbers of differently labeled nodes. This flexibility is necessary when maintaining different labels incurs different costs or when the ratio is pre-specified. We present 1-bit labeling (two colors, namely black and white) schemes for this problem, along with a labeling algorithm for generating the required labels. Given an n-node graph and a rational number ρ, we can design a 1-bit labeling scheme such that n/b ≥ ρ, where b is the number of nodes labeled black. The robot uses O(ρ log Δ) bits of memory for exploring all graphs of maximum degree Δ. The exploration is completed in time . Moreover, our labeling scheme works on graphs containing loops and multiple edges, while that of Cohen et al. is restricted to simple graphs.
CFD modeling for numerical investigation is used in a wide range of applied tasks, e.g., in fluid mechanics. To better understand the effect of operational parameters on the final results, some tasks require monotonous, repetitive calculations over a wide range of operational parameters such as velocity, flow direction, and temperature. In this paper, a Python-based code for automating these repeating calculations in CFD modeling is developed and described. The automation code was tested for CFD modeling in Ansys Fluent on two flow dynamics tasks: a simple 2D geometry, the NACA0018 airfoil, and a complex 3D geometry, a packed bed with heat transfer. Three computers with different computational power were used for the comparison. The results of the CFD modeling were compared with experimental data, and the efficiency of the Python-based code was evaluated through comparison with manual (non-automated) calculation. It was established that the application of the Python-based code does not affect the accuracy of the numerical results. At the same time, it can save up to 25% of computation time for simple 2D geometries with a moderately low number of mesh elements, and up to 15% for complex 3D geometries with meshes of several million elements. The compiled Python-based code is attached as supplementary material to this paper.
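A minimal sketch of such an automation loop is shown below, assuming Fluent is driven in batch mode through journal files; the case name, the in-journal commands, and the exact launcher flags are placeholders that would need to be adapted to the actual case setup:

    import subprocess
    from pathlib import Path

    velocities = [5.0, 10.0, 15.0]   # m/s, illustrative sweep values

    for v in velocities:
        journal = Path(f"run_v{v}.jou")
        # The TUI commands below are placeholders; actual commands depend
        # on the case (boundary-condition names, solver settings, etc.).
        journal.write_text(
            f"/file/read-case naca0018.cas\n"
            f"; ...set inlet velocity to {v} m/s here...\n"
            f"/solve/iterate 500\n"
            f"/file/write-data result_v{v}.dat\n"
            f"exit yes\n"
        )
        # '-g' suppresses the GUI and '-i' supplies a journal file; the
        # launcher name and flags may differ between installations.
        subprocess.run(["fluent", "2ddp", "-g", "-i", str(journal)],
                       check=True)

The paper's actual supplementary code may structure the loop differently; the point is that each parameter value gets its own scripted, unattended run.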
This paper evaluates the validity of available empirical data sources and the extent of the services-sector labor market impact of offshoring in the US, EU-15, and Japan. A three-tier data validity hierarchy is identified, while the employment impact of offshoring in the three regions is found to be limited. Correspondingly, developing Asia is unlikely to experience large-scale employment gains as a destination region. Instead, the crucial role of domestic entrepreneurs in the growth of the Indian IT-related services industry is highlighted, as are the twin educational challenges facing developing Asia: the need to improve both primary and higher education simultaneously.
We present a novel method for detecting rotated lungs in chest radiographs, for quality control and for augmenting automated abnormality detection. The method computes a principal rib-orientation measure using a generalized line histogram technique. To compute the line histogram, we use line seed filters as kernels convolved with edge images, extracting a set of lines from the posterior rib cage. After convolving kernels at all orientations in the range [0°, 180°), we take the angle with maximum magnitude in the line histogram. This measure approximates the principal chest rib orientation for each lung. A chest radiograph is upright if the difference between the orientation angles of the two lungs with respect to the horizontal axis is negligible. We validate our method on sets of normal and abnormal images and argue that rib orientation can be used to detect rotation in chest radiographs as an aid to quality control during image acquisition; it can also be used, for example, to screen training and testing data sets for computer-aided diagnosis research. In our experiments, we achieve a maximum accuracy of approximately 90%.
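A simplified sketch of the line histogram computation follows; the kernel size, angular step, and the way responses are aggregated are our own illustrative choices, not necessarily those of the paper:

    import numpy as np
    from scipy.ndimage import convolve, rotate

    def line_kernel(length=15, angle=0.0):
        # A line seed filter: a horizontal line of ones rotated to 'angle'.
        k = np.zeros((length, length))
        k[length // 2, :] = 1.0
        return rotate(k, angle, reshape=False, order=1)

    def principal_orientation(edge_img, step=5):
        # Return the angle in [0, 180) whose line kernel responds most
        # strongly when convolved with the edge image.
        best_angle, best_mag = 0, -np.inf
        for angle in range(0, 180, step):
            mag = np.abs(convolve(edge_img, line_kernel(angle=angle))).sum()
            if mag > best_mag:
                best_angle, best_mag = angle, mag
        return best_angle

Comparing principal_orientation for the left and right lung fields then gives the rotation check described above.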
We present a novel technique to separate panels from stitched multipanel figures appearing in biomedical research articles. Since such figures may comprise images from different imaging modalities, separating them is a crucial first step for effective biomedical content-based image retrieval (CBIR): multimodal biomedical document classification and/or retrieval, for instance. The method applies local line segment detection based on gray-level pixel changes, then applies a line vectorization process that connects prominent broken lines along the panel boundaries while eliminating insignificant line segments within the panels. We validated our fully automatic technique on a set of stitched multipanel biomedical figures extracted from articles in the Open Access subset of the PubMed Central® repository, achieving precision and recall of 87.16% and 83.51%, respectively, in under 0.461 s per image on average. We also report results from the recent ImageCLEF 2015 competition that highlight the usefulness of the proposed work.
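As a much-simplified stand-in for the line detection and vectorization steps (not the authors' algorithm), panel boundaries in a stitched figure can be approximated as bands of near-uniform gray level; the variance threshold below is an assumption:

    import numpy as np

    def split_rows(gray, var_thresh=5.0):
        # Rows of near-uniform intensity are candidate panel boundaries.
        rows = np.where(gray.var(axis=1) < var_thresh)[0]
        # Keep the first row of each consecutive run of boundary rows.
        cuts = [r for i, r in enumerate(rows) if i == 0 or r != rows[i - 1] + 1]
        edges = [0] + cuts + [gray.shape[0]]
        return [gray[a:b] for a, b in zip(edges, edges[1:]) if b - a > 1]

The same procedure applied to columns, and then recursively to each sub-panel, yields a full panel decomposition for simple layouts.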
Mammographic screening programmes generate large numbers of highly variable, complex images, most of which are unequivocally normal. When present, abnormalities may be small or subtle. Two processes critical to the success of screening programmes are the perception of potential abnormalities and the subsequent analysis of each detected lesion to determine its clinical significance. The consequences of errors are costly, and in many screening centres, films are read by two radiologists in an attempt to reduce errors. The prime objective of our research is to improve the accuracy of the detection and analysis of breast lesions by providing radiologists with computer-aided digital image analysis tools. In this paper we focus on the detection and analysis of mammographic microcalcifications.
We describe a philosophy of research aimed at generating useful computer-based aids for radiologists. Firstly, it is necessary to accurately identify specific tasks which are difficult for the human observer. Having correctly identified a problem, appropriate computer vision methods must be developed and their performance evaluated. It is then important to determine effective ways of using such methods to aid radiologists, and it is essential to prove that the effect on radiologists’ performance is entirely beneficial.
We present results of experiments to determine factors affecting radiologists’ perception of microcalcifications, and to investigate the effects of attention-cueing on detection performance. Our results show that radiologists’ performance can be significantly improved with the use of prompts generated from automatically-detected microcalcification clusters.
We describe a new method for the delineation of mammographic abnormalities based on the analysis of multiple high quality X-ray projections of excised lesions. Biopsy specimens are secured inside a rigid tetrahedron, the edges of which provide a reference frame to which the locations of features can be related. A three-dimensional representation of an abnormality can be formed and rotated to resemble its appearance in the original mammogram.
Oil and gas processing facilities utilize various process automation systems with proprietary controllers. As the systems age, older technologies become obsolete, resulting in frequent premature capital investments to sustain their operation.
This paper presents a new design of automation controller to provide inherent mechanisms for upgrades and/or partial replacement of any obsolete components without obligation for a complete system replacement throughout the expected life cycle of the processing facilities.
The input/output racks are physically and logically decoupled from the controller by converting them into distributed autonomous process interface systems. The proprietary input/output communication between the conventional controller CPU and the associated input/output racks is replaced with standard real-time Data Distribution Service (DDS) middleware, which provides seamless cross-vendor interoperable communication between the controller and the distributed autonomous process interface systems. The objective of this change is to allow all of the controller's subcomponents to be sourced from multiple vendors, safeguarding against premature automation obsolescence.
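To make the decoupling concrete, the toy sketch below models the topic-based publish/subscribe pattern that DDS middleware provides; it is an in-process conceptual illustration only, not a real DDS binding, and all topic and tag names are hypothetical:

    from collections import defaultdict

    class Bus:
        # Toy topic-based bus standing in for the DDS middleware layer.
        def __init__(self):
            self.subscribers = defaultdict(list)
        def subscribe(self, topic, callback):
            self.subscribers[topic].append(callback)
        def publish(self, topic, sample):
            for cb in self.subscribers[topic]:
                cb(sample)

    bus = Bus()

    # Autonomous process interface system: publishes I/O samples by topic,
    # with no direct coupling to any particular controller CPU or vendor.
    def io_rack_publish(bus, tag, value):
        bus.publish(f"io/{tag}", {"tag": tag, "value": value})

    # Controller CPU: subscribes to the topics it needs, regardless of
    # which vendor's rack produces them.
    bus.subscribe("io/PT-101", lambda sample: print("controller got", sample))
    io_rack_publish(bus, "PT-101", 42.7)

Because both sides depend only on topic names and data types, either the racks or the controller CPU can be replaced independently, which is precisely the obsolescence safeguard intended here.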
Detailed performance analysis was conducted to evaluate the viability of using standard real-time DDS middleware in the design of the automation controller to replace the proprietary input/output communication. The key simulation measurements, chosen to demonstrate performance sustainability as the controller grows in size (measured by the number of input/output signals), are communication latency, variation in packet delays (jitter), and communication throughput. The overall performance results confirm the viability of the new proposal as the basis for designing cost-effective, evergreen process automation solutions with optimal total cost of ownership throughout the systems' life span. The only limiting factor is the selected network infrastructure.
The world is witnessing a sudden shift in the paradigm of technology from a centralized to a decentralized approach. A centralized approach creates a single point of failure: if a fault occurs, the whole system comes to rest. Hence, decentralized approaches such as the Multi-Agent System (MAS) are now trending. A MAS is a collection of software entities (agents) working together in pursuit of specified tasks. This paper presents a comprehensive review of various aspects of multi-agent systems. The paper explains the basic concepts of MAS along with the various ways it has been defined in the literature. A comparison is made of the standards to be followed when applying MAS. Classifications of MAS architecture are investigated and compared. Applications of MAS in optimization techniques, software platforms, and real-time simulation are listed in the paper. The paper draws attention to the benefits and limitations of using MAS, based on the survey done. Finally, in view of the wide scope for research in the field of MAS, an attempt is made to identify future research avenues.
The goal of providing faster and optimal solutions to complex, high-dimensional problems is pushing the technical envelope of new algorithms. While many approaches use centralized strategies, the concept of multi-agent systems (MAS) is creating a new option for distributed analyses of optimization problems. A novel learning algorithm for solving global numerical optimization problems is proposed. The proposed learning algorithm integrates the multi-agent system with the hybrid butterfly–particle swarm optimization (BFPSO) algorithm; it is thus named multi-agent-based BFPSO (MABFPSO). To obtain the optimal solution quickly, each agent competes and cooperates with its neighbors, and it can also learn by using its knowledge. Making use of these agent–agent interactions and the sensitivity and probability mechanisms of BFPSO, MABFPSO optimizes the value of the objective function. The designed MABFPSO algorithm is tested on specific benchmark functions. Simulations of the proposed algorithm have been performed for the optimization of functions of 2, 20, and 30 dimensions. Comparative simulation results with conventional PSO approaches demonstrate that the proposed algorithm is a potential candidate for optimization of both low- and high-dimensional functions. The optimization strategy is general and can be used to solve other power system optimization problems as well.
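The sketch below captures the lattice-neighborhood idea in a plain PSO setting; it omits BFPSO's sensitivity and probability mechanics and uses our own illustrative parameters, so it should be read as a simplified relative of MABFPSO rather than the algorithm itself:

    import numpy as np

    def sphere(x):
        # Illustrative benchmark objective; minimum at the origin.
        return np.sum(x ** 2)

    def lattice_pso(f, dim=2, size=5, iters=200, w=0.7, c1=1.5, c2=1.5):
        # Agents live on a size x size lattice; each learns from the best
        # personal-best among its four von Neumann neighbors instead of a
        # global best, mimicking local agent competition/cooperation.
        rng = np.random.default_rng(0)
        pos = rng.uniform(-5, 5, (size, size, dim))
        vel = np.zeros_like(pos)
        pbest = pos.copy()
        for _ in range(iters):
            for i in range(size):
                for j in range(size):
                    nbrs = [((i + di) % size, (j + dj) % size)
                            for di, dj in [(-1, 0), (1, 0), (0, -1), (0, 1)]]
                    lbest = min((pbest[a, b] for a, b in nbrs), key=f)
                    r1, r2 = rng.random(dim), rng.random(dim)
                    vel[i, j] = (w * vel[i, j]
                                 + c1 * r1 * (pbest[i, j] - pos[i, j])
                                 + c2 * r2 * (lbest - pos[i, j]))
                    pos[i, j] += vel[i, j]
                    if f(pos[i, j]) < f(pbest[i, j]):
                        pbest[i, j] = pos[i, j].copy()
        return min(pbest.reshape(-1, dim), key=f)

    print(lattice_pso(sphere))   # should approach the origin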
Requirements traceability (RT) aims at defining and utilizing relationships between stakeholder requirements and artifacts produced during the software development life-cycle and provides an important means to foster software understanding. Although techniques for generating and validating traceability information are available, RT in practice often suffers from the enormous effort and complexity of creating and maintaining traces. This results in invalid or incomplete trace information which cannot support engineers in real-world problems. In this paper we present a tool-supported approach that requires the designer to specify some trace dependencies but eases trace acquisition by generating others automatically. We illustrate the approach using a video-on-demand system and show how the generated traces can be used in various engineering scenarios to improve software understanding. In a case study using an open source software application we demonstrate that the approach is capable of dealing with large-scale systems and delivers valid results.
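One way automatic trace generation can work is by transitively closing the designer-specified links; the sketch below illustrates that idea only, is not the paper's actual inference rules, and uses hypothetical artifact names:

    from collections import defaultdict

    def infer_traces(explicit):
        # Transitively close designer-specified trace links.
        graph = defaultdict(set)
        for src, dst in explicit:
            graph[src].add(dst)
        derived, changed = set(explicit), True
        while changed:
            changed = False
            for a, b in list(derived):
                for c in graph[b]:
                    if (a, c) not in derived:
                        derived.add((a, c))
                        changed = True
        return derived

    # Hypothetical artifacts from a video-on-demand system:
    explicit = {("REQ-playback", "DESIGN-streamer"),
                ("DESIGN-streamer", "src/streamer.py")}
    print(infer_traces(explicit))  # adds REQ-playback -> src/streamer.py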
In this short note, we discuss the basic approach to computational modeling of dynamical systems. If a dynamical system contains multiple time scales, ranging from very fast to slow, computational solution of the dynamical system can be very costly. By resolving the fast time scales in a short time simulation, a model for the effect of the small time scale variation on large time scales can be determined, making solution possible on a long time interval. This process of computational modeling can be completely automated. Two examples are presented, including a simple model problem oscillating at a time scale of 10⁻⁹ computed over the time interval [0,100], and a lattice consisting of large and small point masses.
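A minimal sketch of the idea, under an assumed toy model in which a slow variable is damped by the time average of a fast oscillation (our own example, not either of the paper's):

    import numpy as np

    def fast_average():
        # Resolve the fast oscillation (time scale ~ 1e-9) over a short
        # window and return its time-averaged coupling coefficient.
        tau = np.linspace(0.0, 1e-8, 10_001)        # ~10 fast periods
        return np.mean(np.sin(2 * np.pi * 1e9 * tau) ** 2)

    def solve_slow(T=100.0, dt=0.1):
        # Coarse time stepping over [0, T] using the precomputed
        # fast-scale average instead of resolving every oscillation.
        c = fast_average()                           # ~0.5, since <sin^2> = 1/2
        x, t = 1.0, 0.0
        while t < T:
            x += dt * (-c * x)                       # x' = -<sin^2> x on average
            t += dt
        return x

    print(solve_slow())   # decays roughly like exp(-T/2)

Resolving the 10⁻⁹ scale directly over [0,100] would require on the order of 10¹¹ steps; the averaged model needs only about 10³.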
Failure Mode and Effects Analysis (FMEA) documents single failures of a system by identifying the failure modes, the causes and effects of each potential failure mode on system service, and by defining appropriate detection procedures and corrective actions. When extended with the Criticality Analysis (CA) procedure for classifying failure modes, it is known as Failure Mode, Effects and Criticality Analysis (FMECA). The present paper is a literature review of FME(C)A covering the following aspects: description and review of the basic principles of FME(C)A, types, enhancements of the method, automation and available computer codes, combination with other techniques, and specific applications. We conclude with a discussion of various issues raised by the review.
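A standard device for criticality ranking in FME(C)A practice is the risk priority number, RPN = severity × occurrence × detection, each factor rated on a 1-10 scale; the small sketch below (with hypothetical entries, not data from any reviewed study) ranks failure modes by RPN:

    # Rank failure modes by risk priority number RPN = S * O * D.
    failure_modes = [
        ("seal leak",       {"S": 7, "O": 4, "D": 3}),
        ("sensor drift",    {"S": 4, "O": 6, "D": 7}),
        ("pump cavitation", {"S": 8, "O": 2, "D": 5}),
    ]
    rpn = lambda r: r["S"] * r["O"] * r["D"]
    for name, r in sorted(failure_modes, key=lambda fm: rpn(fm[1]), reverse=True):
        print(f"{name:16s} RPN = {rpn(r)}")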
Evaluating the integrity of the welded pipes used for fluid transportation in the processing industries demands investigation of their erosion and corrosion behavior under various environmental conditions. ASTM A106 Grade-B pipes are butt welded using an automated MIG welding setup to obtain the optimum output responses, namely Reinforcement Form Factor (W1), Penetration Shape Factor (W2), and Tensile Strength (W3), in the weldments. The slurry erosion test is conducted on the weldment surface by varying the velocity and erodent concentration in acidic (0.1 M H2SO4) and alkaline (3.5 wt.% NaCl) conditions. Correspondingly, the samples are subjected to electrochemical corrosion testing in 0.1 M H2SO4 and 3.5 wt.% NaCl solutions. SEM investigations of the eroded weldment surface reveal erosion mechanisms such as shallow and deep ploughing, oxide cracks, ridges and valleys, and scale formation in some areas, attributed to sulphide deposition. Corrosion of the weldment surface is considerably higher under acidic conditions than under alkaline conditions. The reinforcement form factor is the most preferable weld bead characteristic for obtaining erosion- and corrosion-resistant weldments in the investigated pipe material.
Bone Healing from Within
How Technology Helps in Care Coordination: Telehealth?
Soft Wearable Machines for Robot-Assisted Rehabilitation
Technology Can Help Patients Find Doctors and Share Medical Data
Seizing Opportunity in Asia-Pacific's Complex and Rapidly Changing Medical Device Market
How Logistics Technology Can Treat Tomorrow's Life Sciences & Healthcare Complications
Artificial intelligence and associated technologies are rapidly automating routine and non-routine tasks across industries and can severely disrupt labor markets. This paper presents an agent-based, evolutionary model of labor market dynamics in which workers adjust to technology shocks induced by automation. Firms produce a homogeneous service by combining the outputs of tasks performed by workers while stochastically adapting to automation of tasks that displaces workers. We develop a model that includes: (i) a description of occupation mobility as a directed graph, where nodes represent occupations and directed edges represent the mobility pathways along which displaced workers can be retrained and redeployed; and (ii) explicit microfoundations of the processes of job matching and wage setting between firms and heterogeneously skilled workers. The model focuses on the influence of workers' retraining choices on employment levels and wage inequality in the labor market. Simulation results indicate distinct tipping points for unemployment and wage inequality as the mobility pathways along which workers retrain and redeploy across occupations change. An increase in the density of mobility pathways induces a reinforcement effect on employment. Retraining displaced workers without building dense and well-distributed mobility pathways across occupations could widen wage inequality due to excessive crowding of workers around specific tasks. Our work focuses on a finance and insurance industry dataset, where we observe that reskilling displaced workers along occupation mobility pathways, assisted by a lower retraining cost, reduces unemployment levels. Also, if firms aggressively automate their tasks, an increase in the cost of retraining increases wage inequality in the labor market.
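A toy rendering of ingredient (i), with hypothetical occupations, vacancies, and costs, illustrates how retraining pathways and retraining cost interact in redeployment (a structural sketch only, not the model's full microfoundations):

    import random

    # Hypothetical occupation-mobility graph: directed edges are the
    # retraining pathways available to displaced workers.
    pathways = {
        "teller": ["clerk", "customer_support"],
        "clerk": ["analyst"],
        "customer_support": ["analyst"],
        "analyst": [],
    }
    vacancies = {"clerk": 2, "customer_support": 1, "analyst": 3}
    RETRAIN_COST = 1.0   # illustrative; higher cost deters redeployment

    def redeploy(displaced, budget):
        # Greedily move each displaced worker along one mobility edge,
        # if the retraining budget allows and a vacancy exists.
        placed = []
        for worker, occ in displaced:
            options = [o for o in pathways[occ] if vacancies.get(o, 0) > 0]
            if options and budget >= RETRAIN_COST:
                dest = random.choice(options)
                vacancies[dest] -= 1
                budget -= RETRAIN_COST
                placed.append((worker, occ, dest))
        return placed, budget

    print(redeploy([("w1", "teller"), ("w2", "teller")], budget=1.5))

With a budget of 1.5, only one of the two tellers can retrain: a miniature version of the finding that retraining cost gates how much the mobility graph can be exploited.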
Driven by promises of better and quicker decision-making, research on applications of technologies such as artificial intelligence in automating management decisions is increasing. However, the factors influencing the decision to increase the level of automation of a given management decision have remained a sidenote in the literature. In this systematic literature review, we organize these factors from the fragmented and heterogeneous research landscape concerned with automating management decisions. Using a systematically derived sample of research, we categorize and distill a multitude of factors into four themes: goals, foundations, design considerations, and application. We propose positive influences of six factors and a negative influence of one, namely costs, on the decision to increase the level of automation of a management decision. Finally, based on these propositions, we derive an agenda to guide future research.
This study uses bibliometric analysis to integrate, synthesise, and expand the knowledge regarding the relationship between automation and the labour market. In this paper, the authors examined the Web of Science (WoS) core collection database for articles published between 2002 and 2022. The co-citation, co-occurrence, and publication patterns were analysed using VOSviewer 1.6.19. The study comprised 287 papers, with the United States having the highest percentage of research publications, followed by Germany, China, and the United Kingdom. The institutional study shows that the Massachusetts Institute of Technology, Boston University, National Bureau of Economic Research, Harvard University, and the University of London are all leading institutions in this field of study and have more than 100 links. The co-occurrence of keywords revealed “automation”, “employment”, “growth”, and “jobs” as the most discussed terms. The paper concludes by identifying gaps in the literature and proposing possibilities for future studies.
The current repair process for worn-out blades and blisks of aeroengines and industrial gas turbines is highly manual, dependent on operator experience, and hence error-prone. More sophisticated CNC-driven laser equipment is required to replace manual welding for the repair. This paper presents an innovative strategy that leads toward the automation of laser welding and cladding. The project uses reverse engineering techniques to capture the geometric shape of the broken area from a digitized point cloud and the nominal geometry. The core software technologies cover four aspects: (i) point-to-surface best-fitting technology that aligns the point cloud coordinate system with the nominal CAD coordinate system, (ii) a procedure that extracts a broken boundary, automatically separating the broken area from the unbroken area, (iii) a geometric model representation of the broken area, and (iv) generation of an STL file to drive the CNC laser machine. The tool has been implemented with the UG API. Experimental results have shown that the presented strategy is efficient for repair automation.
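The best fitting in step (i) is at heart a rigid registration problem; below is a minimal sketch of the closely related Kabsch (Procrustes) alignment of corresponding point pairs, under the simplifying assumption that correspondences are known (the paper's point-to-surface formulation and UG API calls are not reproduced here):

    import numpy as np

    def rigid_fit(P, Q):
        # Rotation R and translation t minimizing ||R P_i + t - Q_i||
        # for corresponding point sets P, Q of shape (n, 3).
        cP, cQ = P.mean(axis=0), Q.mean(axis=0)
        H = (P - cP).T @ (Q - cQ)
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflection
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        return R, cQ - R @ cP

    # Hypothetical check: recover a known pose of a scanned patch.
    rng = np.random.default_rng(1)
    P = rng.random((50, 3))                       # digitized point cloud
    c, s = np.cos(0.3), np.sin(0.3)
    R_true = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])
    Q = P @ R_true.T + np.array([1.0, 2.0, 3.0])  # nominal CAD frame
    R, t = rigid_fit(P, Q)
    print(np.allclose(R, R_true), np.allclose(t, [1, 2, 3]))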
This review emphasizes the evolving need for automated inspection in metal fabrication processes due to the increasing complexity of design advancements over the years. The study explores various defect detection algorithms and evaluates their effectiveness in enhancing the accuracy and reliability of the inspection process. Machine vision plays a crucial role in this context, contributing significantly to the precision of the inspection process in metal fabrication. Its ability to handle complex tasks ensures a thorough assessment of manufactured components. The paper also explores the use of digital image correlation (DIC) as a key tool in quality assurance for metal fabricated products. This technique provides detailed insights, enabling a thorough understanding of structural integrity and defect identification. By integrating insights on automated inspection through defect detection algorithms, machine vision and DIC, this review aims to advance quality assurance methodologies in the ever-evolving field of metal fabrication.