In digital platforms, abnormal events involve multiple data sources and complex information types, and the similarity and interaction between components, operations, and user behavior make them increasingly difficult to track. To achieve precise tracking and efficient processing of abnormal events, and thereby improve platform stability and security, a digital platform abnormal event tracking method based on a knowledge graph is proposed. First, abnormal event data in the digital platform are collected and integrated using data mining and association rule techniques. The data are then fed into a model that combines residual atrous convolutional neural networks with conditional random fields to precisely identify key entities. On the basis of entity recognition, the correlations between entities are extracted and a knowledge graph architecture for abnormal events is constructed, providing a solid foundation for subsequent in-depth analysis. A visual interface displays the abnormal event knowledge graph intuitively, making it easy for users to quickly grasp the full picture of an event. A knowledge graph subgraph matching algorithm, combined with flow graph indexing and an optimal matching sequence, then achieves accurate tracking and recognition of abnormal events. The experimental results show that this method can effectively track abnormal events in the digital platform. The first detection time is short: 8.3 s for mishandling and 7.9 s for data duplication. The continuous tracking time is long, reaching 50 min for security vulnerabilities. The false alarm rate is low, peaking at 2.1% for data duplication, and the missed detection rate is also low, peaking at 0.8% for mishandling. The method can quantify abnormal events, which helps to assess the stability and health status of the platform. By preventing abnormal events in a timely and effective manner, their frequency can be reduced and the overall security and stability of the platform improved.
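A minimal sketch of the subgraph-matching step described above, using NetworkX's VF2 matcher over a toy abnormal-event knowledge graph. The entity names, relation labels, and choice of matcher are illustrative assumptions; the paper's flow-graph indexing and optimal matching sequence are not reproduced here.

```python
# Minimal sketch of subgraph matching over an abnormal-event knowledge graph.
# Entity names, relation labels, and the use of NetworkX's VF2 matcher are
# illustrative assumptions, not the paper's actual implementation.
import networkx as nx
from networkx.algorithms import isomorphism

# Knowledge graph: nodes are entities with a type, edges carry a relation label.
kg = nx.DiGraph()
kg.add_node("evt_001", type="abnormal_event")
kg.add_node("svc_payment", type="component")
kg.add_node("user_42", type="user")
kg.add_edge("evt_001", "svc_payment", relation="affects")
kg.add_edge("user_42", "evt_001", relation="triggers")

# Query pattern: "some user triggers an abnormal event that affects some component".
query = nx.DiGraph()
query.add_node("u", type="user")
query.add_node("e", type="abnormal_event")
query.add_node("c", type="component")
query.add_edge("u", "e", relation="triggers")
query.add_edge("e", "c", relation="affects")

matcher = isomorphism.DiGraphMatcher(
    kg, query,
    node_match=lambda a, b: a["type"] == b["type"],
    edge_match=lambda a, b: a["relation"] == b["relation"],
)
for mapping in matcher.subgraph_isomorphisms_iter():
    print("match:", mapping)  # e.g. {'user_42': 'u', 'evt_001': 'e', 'svc_payment': 'c'}
```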
The stochastic dynamical ρ4 equation is a robust framework for modeling complex systems characterized by randomness and nonlinearity, with applications spanning various scientific fields. The aim of this paper is to employ an analytical method to identify stochastic traveling wave solutions of the dynamical ρ4 equation. Novel hyperbolic and rational function solutions are obtained through this method. A Galilean transformation is applied to reformulate the model as a planar dynamical system, which enables a comprehensive qualitative analysis. The emergence of chaotic and quasi-periodic patterns following the introduction of a perturbation term is also addressed. Simulation results indicate that adjusting the amplitude and frequency parameters causes significant changes in the system's dynamic behavior. These findings highlight the effectiveness and practicality of the proposed method for analyzing soliton solutions and phase portraits across diverse nonlinear models.
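A hedged numerical sketch of the traveling-wave reduction mentioned above. The concrete deterministic φ4-type form and all coefficients below are assumptions chosen for illustration; the paper's stochastic ρ4 model is not reproduced.

```python
# Hedged sketch: an assumed phi^4-type wave equation u_tt - u_xx + u - u^3 = 0
# is reduced to a planar system via the traveling-wave ansatz u(x, t) = U(xi),
# xi = x - c*t, giving U' = V, V' = (U^3 - U) / (c^2 - 1). A few phase-plane
# orbits are traced numerically. Coefficients and wave speed are assumptions.
import numpy as np
from scipy.integrate import solve_ivp

c = 1.5                       # assumed wave speed, c^2 != 1
k = c**2 - 1.0

def planar_system(xi, y):
    U, V = y
    return [V, (U**3 - U) / k]

for U0 in (0.2, 0.6, 1.0):
    sol = solve_ivp(planar_system, (0.0, 40.0), [U0, 0.0], max_step=0.05)
    print(f"U0={U0}: U range [{sol.y[0].min():.2f}, {sol.y[0].max():.2f}]")
```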
The difficulty of conceptualizing the interactions among a large number of processors makes it hard both to identify the sources of inefficiencies and to determine how a parallel program could be made more efficient. This paper describes an instrumentation system that can trace the execution of distributed memory parallel programs by recording the occurrence of parallel program events. The resulting event traces can be used to compile summary statistics that provide a global view of program performance. In addition, visualization tools permit the graphic display of event traces. Visual presentation of performance data is particularly useful, indeed necessary, for large-scale parallel computers; the enormous volume of performance data mandates visual display.
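An illustrative sketch of how timestamped event records can be reduced to per-processor summary statistics, in the spirit of the instrumentation system above. The event names and record format are invented, not the system's actual trace format.

```python
# Illustrative sketch (not the paper's instrumentation system): event records as
# (timestamp, processor_id, event_name) tuples, reduced to per-processor summaries.
from collections import defaultdict

trace = [
    (0.00, 0, "compute_begin"), (1.20, 0, "compute_end"),
    (1.20, 0, "send_begin"),    (1.35, 0, "send_end"),
    (0.00, 1, "compute_begin"), (1.90, 1, "compute_end"),
    (1.90, 1, "recv_begin"),    (2.05, 1, "recv_end"),
]

busy = defaultdict(float)
open_events = {}
for t, proc, name in sorted(trace):
    phase, edge = name.rsplit("_", 1)
    if edge == "begin":
        open_events[(proc, phase)] = t
    else:
        busy[(proc, phase)] += t - open_events.pop((proc, phase))

for (proc, phase), seconds in sorted(busy.items()):
    print(f"processor {proc}: {phase} time = {seconds:.2f}s")
```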
Since the validity of the Navier-Stokes equations is well established, any fluid dynamic phenomenon could be calculated if methods for solving them correctly are obtained. A fair portion of this dream seems to have come true through the remarkable development of supercomputers and solution algorithms that have made the simulation of high-Reynolds-number flows possible. For understanding the underlying flow mechanism, means of properly visualizing the computed flow field are needed and have been developed. On the whole, computer simulation is becoming the most effective tool for the study of fluid dynamics.
The use of self-organizing maps to analyze data often depends on finding effective methods to visualize the SOM's structure. In this paper we propose a new way to perform that visualization using a variant of Andrews' Curves. We also show that the interaction between these two methods allows us to find sub-clusters within identified clusters. Perhaps more importantly, using the SOM to pre-process data by identifying gross features enables us to use Andrews' Curves on data sets which would have previously been too large for the methodology. Finally, we show how a three-way interaction between the human user and these two methods can be a valuable exploratory data analysis tool.
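A small sketch of the classical Andrews' curve, f_x(t) = x1/sqrt(2) + x2 sin t + x3 cos t + x4 sin 2t + x5 cos 2t + ..., computed here for SOM codebook vectors. The random codebook is a stand-in; the paper's training procedure, curve variant, and interactive workflow are not reproduced.

```python
# Sketch: Andrews' curves evaluated for SOM codebook vectors. The tiny random
# codebook is an assumption standing in for a trained SOM.
import numpy as np

def andrews_curve(x, t):
    """f_x(t) = x1/sqrt(2) + x2 sin t + x3 cos t + x4 sin 2t + x5 cos 2t + ..."""
    result = np.full_like(t, x[0] / np.sqrt(2.0))
    for i, xi in enumerate(x[1:], start=1):
        harmonic = (i + 1) // 2
        result += xi * (np.sin(harmonic * t) if i % 2 == 1 else np.cos(harmonic * t))
    return result

rng = np.random.default_rng(0)
codebook = rng.normal(size=(25, 6))          # e.g. a 5x5 SOM with 6-dimensional inputs
t = np.linspace(-np.pi, np.pi, 200)
curves = np.array([andrews_curve(w, t) for w in codebook])
print(curves.shape)                           # (25, 200): one curve per SOM unit
```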
Growing hierarchical self-organizing models are characterized by the flexibility of their structure, which can easily accommodate complex input datasets. However, most proposals use the Euclidean distance as the only error measure. Here we propose a way to introduce Bregman divergences into these models, based on stochastic approximation principles, so that more general distortion measures can be employed. A procedure is derived to compare the performance of networks using different divergences. Moreover, a probabilistic interpretation of the model is provided, which enables its use as a Bayesian classifier. Experimental results are presented for classification and data visualization applications, which show the advantages of these divergences with respect to the classical Euclidean distance.
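A hedged sketch of what "more general distortion measures" means in practice: winner selection under two standard Bregman divergences. The stochastic-approximation training rule and the growth mechanism of the model are not reproduced; the prototypes and data below are invented.

```python
# Hedged sketch: selecting the winning self-organizing unit under different
# Bregman divergences (squared Euclidean vs. generalized KL). Prototypes and
# the sample are invented; the paper's learning rule is not shown.
import numpy as np

def squared_euclidean(x, w):
    return np.sum((x - w) ** 2, axis=-1)

def generalized_kl(x, w, eps=1e-12):
    # Bregman divergence generated by phi(z) = sum z log z (nonnegative data).
    x = x + eps
    w = w + eps
    return np.sum(x * np.log(x / w) - x + w, axis=-1)

rng = np.random.default_rng(1)
prototypes = rng.uniform(0.1, 1.0, size=(9, 4))   # 9 units, 4-dimensional data
sample = rng.uniform(0.1, 1.0, size=4)

for name, div in (("Euclidean", squared_euclidean), ("generalized KL", generalized_kl)):
    winner = int(np.argmin(div(sample, prototypes)))
    print(f"{name}: winning unit = {winner}")
```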
Visualization is primarily utilized as a training method to enhance athletic movement quality, increase concentration, and minimize competition stress on the player while building firm confidence. Physical literacy (PL) provides a valuable lens for analyzing physical activity (PA) movement within broader social and affective learning processes. This paper presents an interactive visualization positioning approach for physical education (IVPPE) to deal with signal fluctuations and positioning techniques when visualizing with a deep neural network (DNN). To ensure success in their game, athletes are always looking for new ways to improve their health and performance. Using sensors to keep tabs on training and recovery has become more popular among athletes, and sports teams now use sensors to track both internal and external player workloads. The paper also introduces a multilayer localizer (MLL) based on transfer learning to improve positioning accuracy and a physical literacy positioning model (PLPM) as a health determinant. A variety of data augmentation techniques are used to combat signal fluctuations. As a result, the combined effects of motivation-promoting, physical-activity-based visualization improve the accuracy ratio to 96.7%, the prediction ratio to 96.2%, and the efficiency ratio to 96.8%, while reducing the error rate to 18.7% and the stress level to 52.8% compared with other conventional models, with a positive impact on localization and positioning and, ultimately, on physical activity (PA) levels.
By utilizing the molecular dynamics code SPaSM on Livermore's BlueGene/L architecture, consisting of 212 992 IBM PowerPC440 700 MHz processors, a molecular dynamics simulation was run with one trillion atoms. To demonstrate the practicality and future potential of such ultra large-scale simulations, the onset of the mechanical shear instability occurring in a system of Lennard-Jones particles arranged in a simple cubic lattice was simulated. The evolution of the instability was analyzed on-the-fly using the in-house developed massively parallel graphical object-rendering code MD_render.
The ability to simulate several aspects of two-dimensional quantum mechanics is discussed, in conjunction with an ongoing visualization project, WebTOP, that has been of recognizable importance to physics education since its inception in the late 1990s. In the past, the WebTOP project has been primarily used as a means of visualizing optics and wave phenomena and, now, the development of certain interactive quantum mechanical demonstrations has the potential to strengthen its power as an educational tool for the physics community. The added functionality for propagating wave packets forward in time for a given 2D potential gives rise to the ability to investigate interesting quantum behaviors. Fractional revivals of states in the 2D infinite square well can be clearly seen as well as the time delay of scattered wave packets for certain step potentials. Aspects of squeezed and coherent states of the 2D harmonic oscillator potential can also be explored, among other observable phenomena.
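One common scheme for propagating a 2D wave packet in a given potential is the split-operator Fourier method; the sketch below uses it purely as an illustration and is not claimed to be WebTOP's implementation. The step potential, grid, units (hbar = m = 1), and initial Gaussian packet are all assumptions.

```python
# Illustrative split-operator Fourier propagation of a 2D wave packet:
# exp(-i H dt) ~ exp(-i V dt/2) exp(-i T dt) exp(-i V dt/2), hbar = m = 1.
# Potential, grid, and packet parameters are assumptions, not WebTOP's.
import numpy as np

N, L, dt, steps = 256, 20.0, 0.002, 200
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
X, Y = np.meshgrid(x, x)
k = 2.0 * np.pi * np.fft.fftfreq(N, d=L / N)
KX, KY = np.meshgrid(k, k)

V = np.where(X > 2.0, 5.0, 0.0)                                  # assumed step potential
psi = np.exp(-((X + 4) ** 2 + Y ** 2)) * np.exp(1j * 3.0 * X)    # Gaussian packet moving +x
psi /= np.sqrt(np.sum(np.abs(psi) ** 2) * (L / N) ** 2)

half_V = np.exp(-0.5j * V * dt)
kinetic = np.exp(-0.5j * (KX ** 2 + KY ** 2) * dt)
for _ in range(steps):
    psi = half_V * np.fft.ifft2(kinetic * np.fft.fft2(half_V * psi))

print("norm after propagation:", np.sum(np.abs(psi) ** 2) * (L / N) ** 2)
```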
The aim of this paper is to develop a set of algorithms for defect identification in any crystal system based on structural data from molecular dynamics simulations. The set, named FEDIS, consists of two algorithms: the extended centrosymmetry parameter (E-CSP) method and the fast neighbor distance analysis (F-NDA) method. The E-CSP extends the centrosymmetry parameter (CSP) method, originally limited to centrosymmetric materials, by introducing a compensation term for asymmetric crystals so that it adapts to all crystal systems. The F-NDA modifies the neighbor distance analysis (NDA) method by replacing vector computation with scalar computation. The developed algorithms are validated through several cases that demonstrate their effectiveness and efficiency in detecting various types of defects. The algorithms are implemented in C++ and integrated into 3D interactive interface software that can be downloaded from GitHub.
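For reference, a sketch of the classical centrosymmetry parameter that E-CSP builds on: CSP = sum over neighbor pairs |R_i + R_j|^2, pairing each neighbor with the one that most nearly cancels it. The compensation term for asymmetric crystals introduced in the paper is not reproduced; the neighbor coordinates are toy examples.

```python
# Sketch of the classical centrosymmetry parameter (CSP). Greedy pairing and the
# simple-cubic example neighborhood are illustrative choices, not the paper's E-CSP.
import numpy as np

def centrosymmetry_parameter(neighbor_vectors):
    """neighbor_vectors: list of vectors from the central atom to its neighbors (even count)."""
    vecs = list(map(np.asarray, neighbor_vectors))
    csp = 0.0
    while vecs:
        r = vecs.pop(0)
        # Greedy choice of the opposite neighbor: the one minimizing |r + r_j|^2.
        j = int(np.argmin([np.sum((r + v) ** 2) for v in vecs]))
        csp += np.sum((r + vecs.pop(j)) ** 2)
    return csp

# Perfect simple-cubic neighborhood: six neighbors along +/-x, +/-y, +/-z, CSP ~ 0.
perfect = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
print(centrosymmetry_parameter(perfect))                 # ~0.0 for the ideal lattice
distorted = [(1.2, 0, 0)] + perfect[1:]
print(centrosymmetry_parameter(distorted))               # > 0 near a defect
```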
Two important tools of today's science and engineering are computational grids and visualization. While grid infrastructures offer a means to process large amounts of data across different, possibly distant resources, visualization aids in understanding the meaning of data. The Grid Visualization Kernel (GVK) addresses the connection of grid applications and visualization clients on the grid. The visualization capabilities of GVK are provided as flexible grid services via dedicated interfaces and protocols, while GVK itself relies on Globus services to implement the functionality of the visualization pipeline. This paper describes the concept of GVK and its core functionality for grid visualization services, and discusses how to use visualization in the grid environment.
Compiled communication can benefit parallel application design and performance in several ways, such as analyzing the communication pattern to optimize a configurable network for better performance or visualizing the communication requirements to study and improve the application design. In this article we present symbolic expression analysis techniques in an MPI parallel compiler. Symbolic expression analysis allows the identification and representation of the communication pattern and also assists in determining communication phases in MPI parallel applications at compile time. We demonstrate that using compiler analysis based on symbolic expressions to determine the communication pattern can provide an accurate visualization of the communication requirements. Using information from the compiler to program a circuit-switching interconnect in multiprocessor systems has the potential to achieve more efficient communication at lower cost compared with packet/wormhole switching. For example, we demonstrate that our compiler approach provides an average of 2.6 times improvement in message delay over a threshold-based runtime system for our benchmarks, with a maximum improvement of 9.7 times.
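A toy illustration of the underlying idea: keep the MPI destination as a symbolic expression of the rank and problem size, then enumerate it to recover a static communication pattern. The expression, the ring pattern, and the use of SymPy are assumed examples, not the compiler's internal representation.

```python
# Illustrative sketch of symbolic communication analysis: a destination kept as a
# symbolic expression is enumerated to recover the static pattern. Assumed example.
import sympy as sp

rank, P = sp.symbols("rank P", integer=True, nonnegative=True)
dest_expr = sp.Mod(rank + 1, P)          # e.g. MPI_Send(..., dest = (rank + 1) % P, ...)

num_procs = 4
pattern = {r: int(dest_expr.subs({rank: r, P: num_procs})) for r in range(num_procs)}
print(pattern)                            # {0: 1, 1: 2, 2: 3, 3: 0}: a ring pattern
```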
The experiments at the Large Hadron Collider (LHC) rely upon a complex distributed computing infrastructure (WLCG) consisting of hundreds of individual sites worldwide at universities and national laboratories, providing about half a million computing job slots and an exabyte of storage interconnected through high-speed networks. Wide area networking (WAN) is one of the three pillars of LHC computing, together with computational resources and storage. More than 5 PB/day are transferred between WLCG sites. Monitoring is one of the crucial components of WAN and experiment operations. In the past years all experiments have invested significant effort to improve monitoring and to integrate networking information with data management and workload management systems. All WLCG sites are equipped with perfSONAR servers to collect a wide range of network metrics. We will present the latest development to provide a 3D force-directed graph visualization for the data collected by perfSONAR. The visualization package allows site admins, network engineers, scientists, and network researchers to better understand the topology of our Research and Education networks, and it provides the ability to identify unreliable and/or non-optimal network paths, such as those with routing loops or rapidly changing routes.
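A hedged sketch of the kind of path analysis such a visualization supports: spotting routing loops (repeated hops) and rapidly changing routes in traceroute-style measurements. The hostnames, paths, and data layout are invented examples, not perfSONAR data or its schema.

```python
# Toy path analysis: flag routing loops and count distinct routes per site pair.
# All hostnames and measurements below are invented examples.
from collections import Counter

paths = {
    "siteA->siteB": [
        ["rtr1.siteA", "core1.net", "core2.net", "rtr9.siteB"],
        ["rtr1.siteA", "core1.net", "core3.net", "core1.net", "rtr9.siteB"],  # loop
        ["rtr1.siteA", "core4.net", "core2.net", "rtr9.siteB"],
    ],
}

for link, measurements in paths.items():
    loops = [p for p in measurements if max(Counter(p).values()) > 1]
    distinct_routes = {tuple(p) for p in measurements}
    print(f"{link}: {len(loops)} path(s) with routing loops, "
          f"{len(distinct_routes)} distinct route(s) in {len(measurements)} measurements")
```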
The wake of two circular cylinders in tandem arrangement is investigated by flow visualization and PIV experiments in a towing water tank. The two cylinders are spaced at spacing ratios L/d = 2.0 to 15.0, and the cross-flow Reynolds number ranges from 60 to 120. The flow is seeded with fine Rilsan particles and illuminated by a 2 mm thick laser sheet. The PIV image analysis is done by a standard cross-correlation scheme with a powerful validation algorithm, followed by multi-pass adaptive cross-correlation iterations. The main objective of the study is to investigate how the characteristics of the downstream cylinder's wake change considerably with the spacing ratio of the two cylinders.
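A minimal sketch of the standard cross-correlation step in PIV for a single interrogation window: the in-plane displacement is taken as the offset of the correlation peak. The window size and the synthetic particle image are assumptions; the multi-pass adaptive scheme and validation algorithm used in the study are not reproduced.

```python
# FFT-based cross correlation of one PIV interrogation window pair.
# The synthetic images and 32x32 window size are assumptions.
import numpy as np

rng = np.random.default_rng(3)
window_a = rng.random((32, 32))
true_shift = (3, -2)                                   # (rows, cols) displacement
window_b = np.roll(window_a, true_shift, axis=(0, 1))  # second exposure, shifted

corr = np.fft.ifft2(np.fft.fft2(window_a).conj() * np.fft.fft2(window_b)).real
peak = np.unravel_index(np.argmax(corr), corr.shape)
# Map the peak to a signed displacement (account for FFT wrap-around).
shift = [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]
print("estimated displacement (rows, cols):", shift)   # -> [3, -2]
```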
Molecular dynamics (MD) simulations and visualizations were used to investigate changes in the structure of liquid aluminosilicates. Models were constructed for four compositions with varying Al2O3/SiO2 ratio. The local structure and network topology were analyzed through pair radial distribution functions and bond angle, bond length, and coordination number distributions. The results showed that the structure of aluminosilicates mainly consists of basic structural units TOy (T is Al or Si; y = 3, 4, 5). Adjacent TOy units are linked to each other through common oxygen atoms and form a continuous random network of basic structural units. The bond statistics (corner-, edge-, and face-sharing) between two adjacent TOy units are investigated in detail. The self-diffusion coefficients for the three atomic types are affected by the degree of polymerization (DOP) of the network, characterized by the proportions of nonbridging oxygen (NBO) and Qn species in the system. It was found that the Q4 and Q3 tetrahedral species (tetrahedra with four and three bridging oxygens, respectively) decrease, while Q0 (with four nonbridging oxygens) increases, with increasing Al2O3/SiO2 molar ratio, suggesting that a less polymerized network was formed. The structural and dynamical heterogeneities, micro-phase separation, and liquid–liquid phase transition are also discussed in this work.
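A sketch of how coordination numbers and Qn species can be counted from atomic positions with a fixed bond cutoff. The toy coordinates, the 2.0 angstrom cutoff, and the neglect of periodic boundaries are simplifying assumptions, not the paper's analysis code.

```python
# Toy Qn / coordination-number counting from positions with a fixed cutoff.
# Coordinates, cutoff, and the absence of periodic boundaries are assumptions.
import numpy as np

positions = {
    "Si1": np.array([0.0, 0.0, 0.0]),
    "O1":  np.array([1.6, 0.0, 0.0]),   # bridging: also bonded to Si2
    "O2":  np.array([0.0, 1.6, 0.0]),   # nonbridging
    "Si2": np.array([3.2, 0.0, 0.0]),
}
cutoff = 2.0

def neighbors(center, species_prefix):
    return [name for name, pos in positions.items()
            if name != center and name.startswith(species_prefix)
            and np.linalg.norm(pos - positions[center]) < cutoff]

for si in ("Si1", "Si2"):
    oxygens = neighbors(si, "O")
    # An oxygen is bridging if it is bonded to more than one network former (Si/Al).
    bridging = [o for o in oxygens
                if len(neighbors(o, "Si")) + len(neighbors(o, "Al")) > 1]
    print(f"{si}: coordination number = {len(oxygens)}, Q^n with n = {len(bridging)}")
```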
A numerical study is performed on the supersonic flow over an open cavity at a Mach number of 1.5. A newly developed visualization method is employed to visualize the complicated flow structures, providing insight into the major flow physics. Four types of shock/compressive waves observed in experimental schlieren images are also captured in the numerical visualization results. Other flow structures, such as multi-scale vortices, are obtained in the numerical results as well, and a new type of shocklet lying beneath large vortices is found. The shocklet beneath a vortex originates from the leading edge and is then strengthened by successive interactions between feedback compressive waves and its attached vortex. Finally, it collides with the trailing surface and generates a large number of feedback compressive waves and intense pressure fluctuations. It is suggested that the shocklets beneath the vortices play an important role in the cavity's self-sustained oscillation.
The spatial heterogeneity of land use patterns and residents' corresponding economic activities gives rise to the latent structure of urban mobility, which is of great importance for urban planning and transport infrastructure investment but cannot be readily captured using conventional data sources. We developed a methodological framework for detecting urban mobility structure at the transportation analysis zone (TAZ) level in Beijing using mobile phone signal data. First, we derived origin–destination (OD) data at the TAZ level from mobile phone data and visualized them in ArcGIS. Next, we improved community detection algorithms generally used in social networks by reversing the distance weight (e.g., dividing OD flows by the inter-zone distance), and used the results to reveal hidden clustering features of TAZs according to the ODs between them. We visualized and analyzed population density, OD spatial distribution at different times, and the ratio of daytime to nighttime population using the GIS platform; all showed some spatial clustering features. We then applied a structure detection algorithm using the ODs between TAZ pairs to identify the hidden structure of urban mobility extracted from the phone data. For Beijing, the identified mobility structure contains 27 clusters; those in suburban areas tend to match administrative boundaries well, whereas those in the developed central areas show complex distributions and match administrative boundaries poorly. Authorities that provide mobility infrastructure can use the resulting insights to inform urban and transportation planning policy decisions at the local and city levels.
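A hedged sketch of the weighting and community-detection step: build a TAZ graph whose edge weights are OD flows divided by inter-zone distance, then apply a standard modularity-based community algorithm. The TAZ ids, flows, distances, and choice of NetworkX's greedy modularity routine are illustrative; the paper's improved algorithm is not reproduced.

```python
# Toy inverse-distance-weighted OD graph plus standard community detection.
# Records and the greedy modularity algorithm are illustrative assumptions.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# (origin TAZ, destination TAZ, OD trips, distance in km)
od_records = [
    ("taz1", "taz2", 900, 2.0), ("taz2", "taz3", 850, 1.5),
    ("taz1", "taz3", 700, 3.0), ("taz4", "taz5", 950, 1.8),
    ("taz5", "taz6", 800, 2.2), ("taz3", "taz4", 120, 9.0),
]

g = nx.Graph()
for o, d, trips, dist_km in od_records:
    g.add_edge(o, d, weight=trips / dist_km)   # inverse-distance weighting of flows

clusters = greedy_modularity_communities(g, weight="weight")
for i, cluster in enumerate(clusters):
    print(f"cluster {i}: {sorted(cluster)}")
```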
A Simulation, Animation, Visualization and Interactive Control (SAVIC) environment has been developed for the design and operation of an integrated robotic manipulator system. This unique system possesses the abilities for (1) multi-sensor simulation, (2) kinematics and locomotion animation, (3) dynamic motion and manipulation animation, (4) transformation between real and virtual modes within the same graphics system, (5) ease in exchanging software modules and hardware devices between real and virtual world operations, and (6) interfacing with a real robotic system. This research is focused on enhancing the overall productivity of an integrated human-robot system. This paper describes a working system and illustrates the concepts by presenting the simulation, animation and control methodologies for a unique mobile robot with articulated tracks, a manipulator, and sensory modules.
Many different techniques for visualizing data exist, and often users must experiment with several before selecting the most effective one for their problem. Knowledge of the characteristics of the human visual system can assist in our choice of visualization techniques. Limits imposed by our visual "cognitive bandwidth" mean that only detail up to these limits needs to be generated in a visualization scene. Some aspects of our visual process will be discussed and an approach will be described for modeling scene detail, which takes visual limits of such aspects into account.
The use of computer visualization as a means to analyze complex geographic datasets is discussed. Visualization is a valuable tool for conducting exploratory data analysis on geographical data, making good use of the human eye's unparalleled ability to recognize structure and relationships inherent in the data. Traditional GIS are extremely poor at visualization, being limited to a very restricted set of visual attributes with which to convey information (position, size, color). The use of a more sophisticated approach is discussed in detail. Specifically, a system to visualize complex environmental datasets is described, which makes use of knowledge concerning the problem domain as well as knowledge concerning human cognition.
In the realizations produced, the most salient attributes in the data, for a particular task, are assigned to the most striking visual attributes. Assignments are controlled by heuristics that may be changed to alter system behavior. Results are presented showing the application of this approach on datasets involving several multi-dimensional thematic layers of environmental data, used in mineral exploration.
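A toy sketch of the assignment idea: rank data attributes by task salience, rank visual attributes by how striking they are, and pair them in order. Both rankings, the attribute names, and the salience scores are invented heuristics, not the system's actual rules.

```python
# Toy salience-to-visual-attribute assignment. All names and scores are invented.
salience = {"gold_anomaly": 0.95, "elevation": 0.60, "soil_type": 0.40, "survey_year": 0.15}
visual_attributes = ["color_hue", "size", "texture", "position_offset"]  # most to least striking

assignment = {
    data_attr: visual
    for (data_attr, _), visual in zip(
        sorted(salience.items(), key=lambda kv: kv[1], reverse=True), visual_attributes
    )
}
print(assignment)   # e.g. {'gold_anomaly': 'color_hue', 'elevation': 'size', ...}
```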