Scientific Machine Learning (SciML) is a recently emerged research field which combines physics-based and data-driven models for the numerical approximation of differential problems. Physics-based models rely on the physical understanding of the problem at hand, its subsequent mathematical formulation, and numerical approximation. Data-driven models instead aim to extract relations between input and output data without invoking any causality principle underlying the available data distribution. In recent years, data-driven models have been rapidly developed and popularised. This diffusion has been triggered by the huge availability of data (the so-called big data), increasingly cheap computing power, and the development of powerful Machine Learning (ML) algorithms. SciML leverages the physical awareness of physics-based models and, at the same time, the efficiency of data-driven algorithms. With SciML, we can inject physics and mathematical knowledge into ML algorithms. In turn, we can rely on the capability of data-driven algorithms to discover complex and nonlinear patterns from data and improve the descriptive capacity of physics-based models. After recalling the mathematical foundations of digital modelling and ML algorithms, and presenting the most popular ML architectures, we discuss the great potential of a broad variety of SciML strategies in solving complex problems governed by Partial Differential Equations (PDEs). Finally, we illustrate the successful application of SciML to the simulation of the human cardiac function, a field of significant socio-economic importance that poses numerous challenges on both the mathematical and computational fronts. The corresponding mathematical model is a complex system of nonlinear ordinary and partial differential equations describing the electromechanics, valve dynamics, blood circulation, perfusion in the coronary tree, and torso potential. Despite the robustness and accuracy of physics-based models, certain aspects, such as unveiling constitutive laws for cardiac cells and myocardial material properties, as well as devising efficient reduced-order models to tame the extraordinary computational complexity, have been successfully tackled by leveraging data-driven models.
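The idea of injecting physics into ML algorithms is often made concrete through physics-informed neural networks (PINNs), one of the SciML strategies discussed. Below is a minimal illustrative sketch, not taken from the paper, assuming PyTorch: the training loss penalizes the PDE residual at random collocation points together with the boundary conditions, here for the 1D Poisson problem u''(x) = −π² sin(πx) with u(0) = u(1) = 0.

```python
import torch

# Minimal PINN sketch (illustrative; architecture and hyperparameters are
# assumptions, not the paper's): fit u(x) on (0, 1) by minimizing the PDE
# residual of u'' = -pi^2 sin(pi x) plus the boundary penalty u(0) = u(1) = 0.
torch.manual_seed(0)

net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)

def pde_residual(x):
    x = x.requires_grad_(True)
    u = net(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    f = -(torch.pi ** 2) * torch.sin(torch.pi * x)
    return d2u - f                      # zero when the PDE is satisfied

optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)
x_bc = torch.tensor([[0.0], [1.0]])     # boundary points
for step in range(2000):
    x_in = torch.rand(128, 1)           # random collocation points in (0, 1)
    loss = (pde_residual(x_in) ** 2).mean() + (net(x_bc) ** 2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

The exact solution is sin(πx); the network is steered towards it purely by the physics-based loss, without labelled input–output data.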
Informatics and Scientific Computing approach parallel processing in different ways. We briefly describe the points of view of both camps. Next, we concentrate on a case study in the area of scientific computing. The problem chosen is from Physical Chemistry (self-consistent field computation). We describe the problem, the sequential solution, and the parallelization strategy, and present the performance results we have achieved. Our implementation is based on a 60-node transputer system available at the Parallel Processing Laboratory in Basel.
We review Schmidt and Kraus decompositions in the form of the singular value decomposition, using the operations of reshaping, vectorization, and reshuffling. We use the introduced notation to analyze the correspondence between quantum states and operations with the help of the Jamiołkowski isomorphism. The presented matrix reorderings allow us to obtain simple formulae for the composition of quantum channels and the partial operations used in quantum information theory. To provide examples of the discussed operations, we utilize a package for the Mathematica computing system implementing the basic functions used in calculations related to quantum information theory.
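As an illustration of how reshaping and the singular value decomposition interact, the following sketch (assuming NumPy rather than the Mathematica package cited in the abstract; the dimensions and random state are arbitrary) computes the Schmidt decomposition of a bipartite pure state by reshaping its coefficient vector into a matrix and taking its SVD.

```python
import numpy as np

# Schmidt decomposition of |psi> in C^dA (x) C^dB: reshape the coefficient
# vector into a dA x dB matrix; its singular values are the Schmidt
# coefficients, and the singular vectors give the Schmidt bases.
dA, dB = 2, 3
psi = np.random.randn(dA * dB) + 1j * np.random.randn(dA * dB)
psi /= np.linalg.norm(psi)              # normalize the state

M = psi.reshape(dA, dB)                 # reshaping: vector -> matrix
U, s, Vh = np.linalg.svd(M)             # s = Schmidt coefficients

# Verify |psi> = sum_k s_k |u_k> (x) |v_k>.
rebuilt = sum(s[k] * np.kron(U[:, k], Vh[k, :]) for k in range(min(dA, dB)))
assert np.allclose(rebuilt, psi)
print("Schmidt coefficients:", s)
```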
In recent years, intensive use of computing has become the main investigative strategy in many scientific research projects. Progress in computing technology has opened unprecedented opportunities for the systematic collection of experimental data and the associated analysis, which were considered impossible only a few years ago.
This paper focuses on the strategies in use: it reviews the various components that are necessary for an effective solution ensuring the storage, long-term preservation, and worldwide distribution of the large quantities of data required in a large scientific research project.
The paper also mentions several examples of data management solutions used in High Energy Physics for the CERN Large Hadron Collider (LHC) experiments in Geneva, Switzerland, which generate more than 30,000 terabytes of data every year that need to be preserved, analyzed, and made available to a community of several tens of thousands of scientists worldwide.
Janus is a conceptual framework and C++ template library that provides a flexible and extensible collection of efficient data structures and algorithms for a broad class of data-parallel applications. In particular, finite difference methods, (adaptive) finite element methods, and data-parallel graph algorithms are supported. An outstanding advantage of a generic C++ framework is that it offers application-oriented abstractions that achieve high performance without relying on language extensions or non-standard compiler technology. The C++ template mechanism makes it possible to plug user-defined types into the Janus data structures and algorithms. Moreover, Janus components can easily be combined with standard software packages in this field.
Rule-based systems are typically tested using a set of inputs which will produce known outputs. However, one does not know how thoroughly the software has been exercised. Traditional test-coverage metrics do not account for the dynamic data-driven flow of control in rule-based systems. Our literature review found that there has been little prior work on coverage metrics for rule-based systems. This paper proposes test-coverage metrics for rule-based systems derived from metrics defined by prior work, and presents an industrial scale case study.
We conducted a case study to evaluate the practicality and usefulness of the proposed metrics. The case study applied the metrics to a system for computational fluid-dynamics models built on a rule-based application framework. These models were tested using a regression-test suite. The data-flow structure built by the application framework, together with the regression-test suite, provided the case-study data. The test suite was evaluated against three kinds of coverage. The measurements indicated that complete coverage was not achieved, even for the lowest-level definition of coverage. Lists of rules not covered provided insight into how to improve the test suite. The case study illustrated that structural coverage measures can be used to assess the completeness of rule-based system testing.
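The simplest of the coverage notions described, the fraction of rules fired at least once by the test suite, can be sketched as follows; the rule names and per-test firing logs are hypothetical placeholders, not data from the case-study framework.

```python
# Hypothetical rule-coverage computation: a rule counts as covered if any
# test in the suite fires it at least once.
all_rules = {"init_mesh", "set_boundary", "update_flux", "check_converged"}

fired_by_test = {
    "test_lid_driven_cavity": {"init_mesh", "update_flux"},
    "test_channel_flow": {"init_mesh", "set_boundary"},
}

fired = set().union(*fired_by_test.values())
coverage = len(fired) / len(all_rules)
print(f"rule coverage: {coverage:.0%}")            # 75%
print("rules never fired:", sorted(all_rules - fired))
```

As in the case study, the list of rules never fired points directly at the gaps a new test should close.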
This paper presents the mathematical analysis of a dynamical system for avian influenza. The proposed model considers a nonlinear dynamical model of birds and humans. A half-saturated incidence rate is used for the transmission of the avian influenza infection. Rigorous mathematical results are presented for the proposed models. The local and global dynamics of each model are analyzed, and it is proven that when ℛ0 < 1, the disease-free equilibrium of each model is stable both locally and globally, and when ℛ0 > 1, the endemic equilibrium is stable both locally and globally. The numerical results obtained for the proposed model show that influenza could be eliminated from the community if the threshold is not greater than unity.
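A minimal numerical sketch may clarify the half-saturated incidence mechanism βSI/(H + I), in which new infections saturate as the number of infected grows. The two-compartment reduction and all parameter values below are illustrative assumptions (assuming SciPy), not the paper’s full bird–human model.

```python
from scipy.integrate import solve_ivp

# Illustrative S-I system with half-saturated incidence beta*S*I/(H + I):
# for small I it behaves like mass action, for large I it saturates at beta*S.
beta, H, mu, Lam, delta = 0.4, 180.0, 0.02, 10.0, 0.1

def rhs(t, y):
    S, I = y
    new_inf = beta * S * I / (H + I)     # half-saturated incidence
    return [Lam - new_inf - mu * S,      # susceptibles: recruitment, infection
            new_inf - (mu + delta) * I]  # infected: infection, removal

sol = solve_ivp(rhs, (0.0, 400.0), [500.0, 1.0])
print("infected at t = 400:", sol.y[1, -1])
```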
Dengue infection affects more than half of the world’s population, with 1 billion symptomatic cases identified per year and several distinct genetic serotypes: DENV 1–4. Transmitted via the mosquito bite, the dengue virus infects Langerhans cells. Monocytes, B lymphocytes, and mast cells infected with the dengue virus produce various cytokines, although it is not clear which ones are predominant during DHF disease. A mathematical model of dengue virus infection is developed whose complex dynamics are determined by many factors. The model admits a “virus-free” equilibrium that is asymptotically stable when the viral reproduction number is lower than one, which corresponds to a very effective action of the innate immune system that stops the infectious process; the mathematical analysis of stability in the presence of the virus shows how the dynamics of the proposed model change. A Petri net simulates the network of interactions between the various populations involved without considering the speeds of the processes in question, which are indicated in a separate computation. In this research, a hybrid Petri net approach is utilized to connect the discrete models of dengue.
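A brief sketch of the discrete Petri-net ingredient may help: a transition fires when every input place holds enough tokens, moving tokens along the arcs. The place names, arc weights, and initial marking below are hypothetical, not the paper’s dengue model.

```python
# Toy discrete Petri net: places hold tokens, transitions consume tokens
# from input places (pre) and produce tokens on output places (post).
marking = {"susceptible": 5, "virus": 2, "infected": 0, "recovered": 0}

transitions = {
    "infect":  ({"susceptible": 1, "virus": 1}, {"infected": 1, "virus": 1}),
    "recover": ({"infected": 1}, {"recovered": 1}),
}

def enabled(name):
    pre, _ = transitions[name]
    return all(marking[p] >= n for p, n in pre.items())

def fire(name):
    pre, post = transitions[name]
    for p, n in pre.items():
        marking[p] -= n
    for p, n in post.items():
        marking[p] += n

while enabled("infect"):
    fire("infect")
fire("recover")
print(marking)  # {'susceptible': 0, 'virus': 2, 'infected': 4, 'recovered': 1}
```

In a hybrid approach of the kind the abstract mentions, such discrete firings would be coupled to continuous rates computed separately.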