Purpose: This research examines the direct impact of changes in COVID-19 cases and the IIP index on the performance of the NIFTY 50 and the NAVs of selected mutual funds for one year from 11 March 2020, the date WHO declared COVID-19 a pandemic.
Approach/Methodology/Design: The study uses the daily NAV series of four mutual funds across sectors and applies descriptive statistics, correlation, regression, and ANOVA.
Findings: There is a strong relationship between the NIFTY 50 and the NAV performance of all four mutual funds; however, no relationship between daily reported COVID-19 cases and NAV performance could be established.
Practical Implications: Economic turmoil has affected investors' disposable income, and the capital market is highly volatile. This study will help researchers and analysts understand the relationship between COVID-19, the NIFTY 50 index, and the NAV movements of selected mutual funds, so that they can make better decisions in similar situations.
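As an illustration of the kind of correlation and regression analysis the study describes, here is a minimal sketch on synthetic daily return series. The data, coefficients, and variable names are illustrative assumptions, not the study's actual data or results.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily series standing in for the study's data (illustrative only):
# NIFTY 50 daily returns and one fund's NAV returns correlated with them.
nifty_returns = rng.normal(0, 0.015, 250)
fund_returns = 0.9 * nifty_returns + rng.normal(0, 0.004, 250)

# Pearson correlation between the index and the fund.
corr = np.corrcoef(nifty_returns, fund_returns)[0, 1]

# Simple OLS regression: fund_returns = alpha + beta * nifty_returns.
X = np.column_stack([np.ones_like(nifty_returns), nifty_returns])
alpha, beta = np.linalg.lstsq(X, fund_returns, rcond=None)[0]

print(f"correlation = {corr:.3f}, beta = {beta:.3f}")
```

The same two statistics (a correlation coefficient and a regression slope) are what the Findings section reports as strong for NIFTY-versus-NAV and absent for COVID-cases-versus-NAV.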
We present a novel extension to passbits that provides a significant reduction in unsuccessful search lengths for open addressing collision resolution hashing. Both the experimental and analytical results presented demonstrate the dramatic reductions possible. The method does not restrict the hash table configuration parameters and uses very little additional storage space per bucket. The runtime performance for insertion is essentially the same as for ordinary open addressing with passbits; the successful search lengths remain the same as for open addressing without passbits. For any given loading factor, the unsuccessful search length can be made arbitrarily close to one bucket access.
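To make the baseline concrete, here is a minimal sketch of ordinary open addressing (linear probing) with one pass bit per bucket, the scheme the paper extends. The class name and table parameters are illustrative, not taken from the paper.

```python
# Open addressing with linear probing and one pass bit per bucket.
# The pass bit is set whenever an insertion probes *past* a bucket;
# an unsuccessful search may stop at the first bucket whose pass bit
# is clear, since no key can lie further along that probe sequence.
# (Assumes the load factor stays below 1.)

class PassbitTable:
    def __init__(self, size=16):
        self.size = size
        self.keys = [None] * size
        self.passbit = [False] * size

    def _home(self, key):
        return hash(key) % self.size

    def insert(self, key):
        i = self._home(key)
        while self.keys[i] is not None:
            self.passbit[i] = True      # a key has probed past this bucket
            i = (i + 1) % self.size
        self.keys[i] = key

    def contains(self, key):
        i = self._home(key)
        while True:
            if self.keys[i] == key:
                return True
            # Nothing ever probed past this bucket, so the key cannot be
            # further along the probe sequence: terminate the search early.
            if not self.passbit[i]:
                return False
            i = (i + 1) % self.size
```

The paper's contribution is an extension of this idea that drives unsuccessful search lengths toward a single bucket access without constraining the table configuration.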
Transient simulation of a gate circuit is an efficient method of counting signal changes occurring during a transition of the circuit. It is known that this simulation covers the results of classical binary analysis, in the sense that all signal changes appearing in binary analysis are also predicted by the simulation. For feedback-free circuits of 1- and 2-input gates, it had been shown that the converse also holds, if wire delays are taken into account. In this paper we generalize this result. First, we prove that, for any feedback-free circuit N of arbitrary gates, there exists an expanded circuit, constructed by adding a number of delays to each wire of N, such that binary analysis of this expanded circuit covers transient simulation of N. For this result, the number of delays added to a wire is obtained from the transient simulation. Our second result involves adding only one delay per wire, which leads to the singular circuit of N. This result is restricted to circuits consisting only of gates realizing functions from the set {AND, OR, XOR}, functions obtained by complementing any number of inputs and/or the output of a function from this set, and FORKS. The numbers of inputs of the AND, OR and XOR gates are arbitrary, and all functions of two variables are included. We show that binary analysis of the singular circuit covers transient simulation of N. We also show that this result cannot be extended to arbitrary gates, if we allow only a constant number of delays per wire.
Indoor aerosols directly affect human lives. Especially aerosols in workshops, factories, and laboratories, where many chemical substances are used in treatment and production processes, may contain toxic elements: special care must be taken to alleviate air pollution and assure a clean breathing environment for the workers. For this study, size-segregated aerosol particle sampling with a cascade impactor was performed in the machine workshop of the Jožef Stefan Institute. The samples, collected during weekdays and the weekend, were analyzed with a microbeam facility at Tohoku University. Bulk PIXE analysis with scanning over the whole sample area was conducted along with multimodal microanalysis with microscopic scanning. Using bulk analysis, high concentrations of Pb and Ba were detected on weekend days, which was related to the removal of old white paint from the furniture. On weekdays, concentrations of W and of elements of soil origin increased, probably because of machine operations and worker movements. At the same time, a high concentration of sulfur was detected; microscopic multimodal analysis shows that it stems from lubricant oil vapor. The combination of bulk analysis and microanalysis of the size-selected samples is an effective approach to aerosol characterization in the working environment.
Compiled communication can benefit parallel application design and performance in several ways, such as analyzing the communication pattern to optimize a configurable network for performance improvement, or visualizing the communication requirements to study and improve the application design. In this article we present symbolic expression analysis techniques in an MPI parallel compiler. Symbolic expression analysis allows the identification and representation of the communication pattern and also assists in the determination of communication phases in MPI parallel applications at compile-time. We demonstrate that using compiler analysis based on symbolic expressions to determine the communication pattern can provide an accurate visualization of the communication requirements. Using information from the compiler to program a circuit-switching interconnect in multiprocessor systems has the potential to achieve more efficient communication at lower cost compared to packet/wormhole switching. For example, we demonstrate that our compiler approach provides an average of 2.6 times improvement in message delay over a threshold-based runtime system for our benchmarks, with a maximum improvement of 9.7 times.
Sophora flavescens has been widely used in traditional Chinese medicine for over 1700 years. This plant is known for its heat-clearing, damp-drying, insecticidal, and diuretic properties. Phytochemical research has identified prenylated flavonoids as a unique class of bioactive compounds in S. flavescens. Recent pharmacological studies reveal that the prenylated flavonoids from S. flavescens (PFS) exhibit potent antitumor, anti-inflammatory, and glycolipid metabolism-regulating activities, offering significant therapeutic benefits for various diseases. However, the pharmacokinetics and toxicological profiles of PFS have not been systematically studied. Despite the diverse biological effects of prenylated flavonoid compounds against similar diseases, their structure-activity relationship is not yet fully understood. This review aims to summarize the latest findings regarding the chemical composition, drug metabolism, pharmacological properties, toxicity, and structure-activity relationship of prenylated flavonoids from S. flavescens. It seeks to highlight their potential for clinical use and suggest directions for future related studies.
In this paper we describe RooFitUnfold, an extension of the RooFit statistical software package to treat unfolding problems, which includes most of the unfolding methods commonly used in particle physics. The package provides a common interface to these algorithms as well as uniform methods to evaluate their performance in terms of bias, variance and coverage. We exploit this common interface of RooFitUnfold to compare the performance of unfolding with the Richardson–Lucy, Iterative Dynamically Stabilized, Tikhonov, Gaussian Process, bin-by-bin and inversion methods on several example problems.
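As a self-contained illustration of one of the listed methods, here is a minimal numpy sketch of Tikhonov-regularized unfolding. The response matrix, truth spectrum, and regularization strength are illustrative choices, not RooFitUnfold's API or examples.

```python
import numpy as np

truth = np.array([10., 30., 50., 30., 10.])          # true spectrum
# Response matrix: each true bin smears partly into its neighbours.
R = np.array([[0.8, 0.1, 0.0, 0.0, 0.0],
              [0.2, 0.8, 0.1, 0.0, 0.0],
              [0.0, 0.1, 0.8, 0.1, 0.0],
              [0.0, 0.0, 0.1, 0.8, 0.2],
              [0.0, 0.0, 0.0, 0.1, 0.8]])
measured = R @ truth

# Tikhonov unfolding: minimize ||R x - measured||^2 + tau * ||L x||^2,
# with L the second-difference (curvature) operator; solved via the
# normal equations (R^T R + tau L^T L) x = R^T measured.
tau = 0.01
L = np.diff(np.eye(len(truth)), n=2, axis=0)
unfolded = np.linalg.solve(R.T @ R + tau * (L.T @ L), R.T @ measured)

print(unfolded)
```

Plain matrix inversion corresponds to tau = 0; increasing tau trades variance for bias, which is exactly the kind of trade-off the package's uniform evaluation methods are meant to quantify.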
Based on an analysis of the thermodynamic performance of the heat transfer process in low-temperature heat exchangers, the exergy efficiency of the heat transfer process is defined and a general expression for it is derived. This expression can be used to discuss the effect of the number of heat transfer units and the heat capacity ratio of the fluids on the exergy efficiency of a low-temperature heat exchanger. The variation of the exergy efficiency for several flow patterns in low-temperature heat exchangers is compared, and a method for calculating the optimal heat capacity ratio for maximum exergy efficiency is given.
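The two parameters the abstract singles out, the number of transfer units (NTU) and the heat capacity ratio, also govern the standard effectiveness of a heat exchanger. The textbook epsilon-NTU relation for counterflow below shows their effect; it is the classical relation, not the paper's exergy-efficiency expression, which is not reproduced here.

```python
import math

def effectiveness_counterflow(ntu, cr):
    """Textbook epsilon-NTU relation for a counterflow heat exchanger.

    ntu: number of heat transfer units; cr: heat capacity ratio Cmin/Cmax.
    """
    if abs(cr - 1.0) < 1e-12:
        return ntu / (1.0 + ntu)          # limiting case Cmin = Cmax
    e = math.exp(-ntu * (1.0 - cr))
    return (1.0 - e) / (1.0 - cr * e)

# Effectiveness rises with NTU for a fixed capacity ratio.
for ntu in (1.0, 3.0, 5.0):
    print(ntu, round(effectiveness_counterflow(ntu, 0.5), 3))
```

The paper's exergy efficiency is a second-law counterpart of this first-law effectiveness, depending on the same two parameters plus the temperature levels.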
Pedestrian-car interweaving is a prominent problem in old residential communities in Chinese cities. To achieve a better pedestrian-car separation to create a safe and comfortable living environment in old residential communities, this paper investigated the mechanism of the flows of pedestrians and cars on a road network inside an old residential community. A method for calculating the flows of pedestrians and cars was proposed to identify the road segments or nodes where the pedestrian flows are interlaced or intersected with the vehicle flows. This method was applied to the estimation of the traffic in the Wangyuehu Community of Changsha City, China. The estimated distribution of community network traffic and pedestrian-car interweaving sites was consistent with the actual situation.
At high pressures, the axial force of a multistage pump can reach hundreds of tons; the problems caused by excessive axial force have surpassed efficiency, wear and other factors to become a decisive factor in the stable operation of multistage pumps. In this paper, an 11-stage double-case multistage pump is taken as the research object. On the basis of studying the clearance internal flow mechanism, and according to the pressure difference equation of axial motion of the clearance fluid, the main factors affecting the balance force of the balance drum are analyzed, and a new type of balance drum, the "double helical balance drum", is proposed and compared with the smooth balance drum in hydraulic performance and axial force performance. The research shows that with the increase of the axial displacement of the balance clearance, the pressure drop across the clearance decreases linearly. Under the design flow rate condition, the pressure drop between the inlet and outlet of the helical balance drum is 1.30 times that of the smooth balance drum, and the mean velocity, mean velocity curl and velocity coefficient of the double helical balance drum increase, indicating that adding the double helix can increase the dynamic pressure ratio of the clearance fluid. In addition, the test results show that, compared with the smooth balance drum, the head and efficiency of the multistage pump with the double helical balance drum are increased by 0.76% and 1.1%, respectively; the vibration amplitudes of the front and rear bearings are reduced by 3.36%, 0.76% and 1.75% on average along the axial, radial and tangential directions; and the temperatures of the front and rear bearings are reduced by 3.1% and 8.7%, respectively. These results indicate that the double helical balance drum can effectively improve the hydraulic performance and axial force performance of the multistage pump at the design operating point.
The results of the study can provide reference for the long-cycle stable operation of multistage pumps and the optimization of balance drum design.
This paper starts with a brief discussion of so-called wavelet transforms, i.e., decompositions of arbitrary signals into localized contributions labelled by a scale parameter. The main features of the method are first illustrated through simple mathematical examples. Then we present the first applications of the method to the recognition and visualisation of characteristic features of speech and of musical sounds.
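A minimal numerical sketch of such a scale-labelled decomposition, using a real Morlet-like wavelet, is given below. The signal, scales, and wavelet parameters are illustrative choices, not those of the paper.

```python
import numpy as np

def morlet(t, w0=5.0):
    # Real part of a Morlet wavelet: a cosine under a Gaussian envelope.
    return np.cos(w0 * t) * np.exp(-t**2 / 2.0)

def cwt(signal, scales, dt=1.0):
    """Correlate the signal with scaled copies of the wavelet,
    giving one row of coefficients per scale parameter."""
    n = len(signal)
    out = np.empty((len(scales), n))
    t = (np.arange(n) - n // 2) * dt
    for i, s in enumerate(scales):
        w = morlet(t / s) / np.sqrt(s)            # scaled, normalized wavelet
        out[i] = np.convolve(signal, w[::-1], mode="same")
    return out

# A pure tone: its energy concentrates at the scale matching its frequency,
# which is how localized contributions become visible per scale.
t = np.arange(512)
signal = np.sin(2 * np.pi * t / 32)
coeffs = cwt(signal, scales=[2, 8, 32])
```

For speech and music, plotting `coeffs` against time and scale yields the kind of visualisation of characteristic features the paper describes.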
Mammographic screening programmes generate large numbers of highly variable, complex images, most of which are unequivocally normal. When present, abnormalities may be small or subtle. Two processes critical to the success of screening programmes are the perception of potential abnormalities and the subsequent analysis of each detected lesion to determine its clinical significance. The consequences of errors are costly, and in many screening centres, films are read by two radiologists in an attempt to reduce errors. The prime objective of our research is to improve the accuracy of the detection and analysis of breast lesions by providing radiologists with computer-aided digital image analysis tools. In this paper we focus on the detection and analysis of mammographic microcalcifications.
We describe a philosophy of research aimed at generating useful computer-based aids for radiologists. Firstly, it is necessary to accurately identify specific tasks which are difficult for the human observer. Having correctly identified a problem, appropriate computer vision methods must be developed and their performance evaluated. It is then important to determine effective ways of using such methods to aid radiologists, and it is essential to prove that the effect on radiologists’ performance is entirely beneficial.
We present results of experiments to determine factors affecting radiologists’ perception of microcalcifications, and to investigate the effects of attention-cueing on detection performance. Our results show that radiologists’ performance can be significantly improved with the use of prompts generated from automatically-detected microcalcification clusters.
We describe a new method for the delineation of mammographic abnormalities based on the analysis of multiple high quality X-ray projections of excised lesions. Biopsy specimens are secured inside a rigid tetrahedron, the edges of which provide a reference frame to which the locations of features can be related. A three-dimensional representation of an abnormality can be formed and rotated to resemble its appearance in the original mammogram.
In the literature, many chaotic systems based on third-order jerk equations with different nonlinear functions are available. A jerk system is a dynamical system governed by a third-order ordinary differential equation that can exhibit regular and chaotic behavior. By extension, a hyperjerk system is a dynamical system governed by an nth-order ordinary differential equation with n ≥ 4. Hyperjerk systems have been investigated in the literature over the last decade. This paper presents numerical studies and an experimental realization on an FPAA of a fourth-order hyperjerk system with an exponential nonlinear function.
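A fourth-order hyperjerk system is simply x'''' = f(x, x', x'', x''') written as four first-order equations. The sketch below integrates one such system, with an exponential nonlinearity, using classical RK4; the particular right-hand side is an illustrative choice, not the paper's system, and is picked to stay bounded rather than to be chaotic.

```python
import numpy as np

def f(state):
    # Hyperjerk form: x'''' = -3 x''' - 3 x'' - 2 x' - (e^x - 1).
    # Illustrative coefficients only, not the system studied in the paper.
    x, dx, d2x, d3x = state
    return np.array([dx, d2x, d3x,
                     -3.0 * d3x - 3.0 * d2x - 2.0 * dx - (np.exp(x) - 1.0)])

def rk4_step(state, h):
    # One classical fourth-order Runge-Kutta step.
    k1 = f(state)
    k2 = f(state + 0.5 * h * k1)
    k3 = f(state + 0.5 * h * k2)
    k4 = f(state + h * k3)
    return state + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

state = np.array([0.1, 0.0, 0.0, 0.0])   # (x, x', x'', x''')
h = 0.01
trajectory = [state]
for _ in range(5000):
    state = rk4_step(state, h)
    trajectory.append(state)
trajectory = np.array(trajectory)
```

An FPAA realization, as in the paper, implements the same four integrations in analog hardware, with the exponential nonlinearity built from analog blocks.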
This paper describes the Multiagent Systems Engineering (MaSE) methodology. MaSE is a general-purpose methodology for developing heterogeneous multiagent systems. It uses a number of graphically based models to describe system goals, behaviors, agent types, and agent communication interfaces, and also provides a way to specify an architecture-independent, detailed definition of the internal agent design. An example of applying the MaSE methodology is also presented.
A technique for creating analysis models of a web application from a business model is proposed for easy and effective development. In addition, a technique for generating test cases for a web application from sequence diagrams is proposed. The use case diagram and web page list are generated from the business model, which is depicted using the notation of the UML (Unified Modeling Language) activity diagram. The page diagram and logical/physical database models are created based on the web page list and the extracted data fields. Test cases for the web application are generated from the call messages (including self-call messages) of the UML sequence diagram. The effectiveness of these techniques is shown through a practical case study: a development project for RMS (Research Material Management System), a web application.
In this paper, an agent-based development process is proposed for open and adaptive systems that continuously change and evolve to meet new requirements. The proposed methodology is based on a model-based technique that provides a specific model for the type of information to be gathered and uses this model to drive the domain-specific analysis process. The focus is on a clear separation between the requirement gathering and analysis phases. The analysis methodology further splits the analysis phase into a user-centric analysis phase and a system-centric analysis phase. Optimization of system performance is also proposed by exploiting the relationships and dependencies among roles and the criteria for mapping roles to agents. The Gaia and ROADMAP models have been used as a basis for the proposed agent-based modeling method.
Software size estimation at the early analysis phase of the software development lifecycle is crucial for predicting the associated effort and cost. In the object-oriented software development lifecycle, the analysis phase captures the functionality to be addressed in the software, and the Unified Modeling Language captures this functionality through the use case model. This paper proposes a new method, named use case model function point, to estimate the size of object-oriented software at the analysis phase itself. While this approach is based on the use case model, it also adapts the function point analysis technique to it. Features such as actors, use cases, relationships, external references, flows, and messages are extracted from the use case model. Eleven rules have been derived as guidelines to identify the use case model components. The function point analysis components are mapped to use case model components, and a complexity weight is specified for each to calculate the use case model function point. The proposed size estimation approach has been evaluated on object-oriented software developed in our software engineering laboratory to assess its ability to predict development size. The results are empirically analysed using statistical correlation to substantiate the proposed estimation method.
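The core computation in function-point-style methods is a weighted sum over classified components. The sketch below shows that shape for components extracted from a use case model; the component kinds, complexity levels, and weights are illustrative assumptions, not the paper's eleven rules or its actual weightage.

```python
# Hypothetical weight table: each component kind gets a weight per
# complexity level. Values are illustrative, not the paper's.
WEIGHTS = {
    "actor":    {"simple": 1, "average": 2, "complex": 3},
    "use_case": {"simple": 5, "average": 10, "complex": 15},
}

def size_points(components):
    """components: list of (kind, complexity) pairs identified in the
    use case model. Returns the weighted size count."""
    return sum(WEIGHTS[kind][cplx] for kind, cplx in components)

model = [("actor", "simple"), ("actor", "complex"),
         ("use_case", "average"), ("use_case", "complex")]
print(size_points(model))  # prints 29
```

The paper's method additionally covers relationships, external references, flows, and messages, each classified by its own rules before entering the weighted sum.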
With the generation and analysis of Big Data following the development of various information devices, older data processing and management techniques reveal their hardware and software limitations. The hardware limitations can be overcome by CPU and GPU advancements, but the software limitations cannot be resolved by hardware advances alone. This study thus sets out to address the increasing analysis costs of dense Big Data from a software perspective instead of depending on hardware. A modified K-means algorithm using ideal points is proposed to address the analysis cost of dense Big Data. The proposed algorithm finds an optimal cluster by applying Principal Component Analysis (PCA) to the multi-dimensional structure of dense Big Data and categorizes data with the predicted ideal points as the central points of the initial clusters. Its clustering validity index and F-measure results were compared with those of existing algorithms and found to be similar. It was also compared with data classification techniques investigated in previous studies, and we found that it improved analysis costs by about 3–6%.
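A minimal numpy sketch of the general idea, using PCA to pick well-spread initial centres for K-means, is shown below. The way initial points are chosen here (evenly spaced quantiles along the first principal component) is an illustrative assumption, not the paper's exact ideal-point procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
# Three well-separated synthetic clusters in 2-D.
data = np.vstack([rng.normal(loc, 0.3, size=(100, 2))
                  for loc in ([0, 0], [4, 0], [8, 0])])

# PCA via eigendecomposition of the covariance matrix.
centered = data - data.mean(axis=0)
_, vecs = np.linalg.eigh(np.cov(centered.T))
pc1 = centered @ vecs[:, -1]              # scores on the top component

# Initial centres: data points at evenly spaced quantiles of the PC1 scores,
# standing in for the "ideal points" of the proposed algorithm.
k = 3
n = len(pc1)
idx = np.argsort(pc1)[[n // (2 * k) + i * n // k for i in range(k)]]
centres = data[idx]

# Standard Lloyd iterations of K-means.
for _ in range(20):
    labels = np.argmin(((data[:, None] - centres[None]) ** 2).sum(-1), axis=1)
    centres = np.array([data[labels == c].mean(axis=0) for c in range(k)])
```

Spreading the initial centres along the dominant principal component avoids the bad random initializations that inflate K-means iteration counts, which is the analysis-cost angle the study targets.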
Social networks have gained a lot of attention. They are perceived as a vast source of information about their users. A variety of methods and techniques have been proposed to analyze these networks in order to extract valuable information about the users: the things they do, and what they like and dislike. A lot of effort is put into improving analytical methods in order to grasp a more accurate and detailed image of users. Such information can have an impact on many aspects of people's everyday lives: from politics, via professional life, to shopping and entertainment.
The theory of fuzzy sets and systems, introduced in 1965, has the ability to handle imprecise and ambiguous information and to cope with linguistic terms. The theory has evolved into areas such as possibility theory and computing with words. It is very suitable for processing data in a human-like way and providing results in a human-oriented manner.
The paper presents a short survey of works that use fuzzy-based technologies for the analysis of social networks. We pose the idea that fuzzy-based techniques allow for the introduction of human-centric and human-like data analysis processes. We include detailed descriptions of a few target areas of social network analysis that could benefit from applications of fuzzy sets and systems methods.
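As a small illustration of the linguistic-term handling mentioned above, here are triangular fuzzy membership functions for a hypothetical "user activity" variable in a social network. The variable, term names, and breakpoints are illustrative, not taken from the surveyed works.

```python
def triangular(x, a, b, c):
    """Membership degree of x in a triangular fuzzy set
    with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Linguistic terms for a user's posts per day (hypothetical definitions).
terms = {
    "low":    lambda x: triangular(x, -1, 0, 5),
    "medium": lambda x: triangular(x, 2, 6, 10),
    "high":   lambda x: triangular(x, 8, 15, 22),
}

posts_per_day = 4
degrees = {name: mu(posts_per_day) for name, mu in terms.items()}
print(degrees)
```

A crisp value can belong to several terms at once (here both "low" and "medium" to different degrees), which is what makes the resulting analysis human-like rather than threshold-based.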
A methodology for reliability and safety analysis of a certain class of repairable fault-tolerant systems is presented. The analysis leads to a closed-form approximation of the probability of an absorbing state in a state transition diagram. This expression can provide insight into the relationship between various system parameters and system reliability and/or safety. The approximation technique is based on the combination of results from the analysis of several failure mechanisms, each studied by itself, into an expression for the approximate reliability of a system. The resulting approximation error can be analyzed in order to evaluate whether or not the approximation is useful in a given situation.
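To make the setting concrete, the sketch below models a small repairable system as a Markov chain with an absorbing failure state and computes the exact probability of absorption by a given time; a closed-form approximation of this quantity is what such a methodology derives. The states, rates, and coverage value are illustrative assumptions, not the paper's model.

```python
import numpy as np

lam, mu, c = 1e-3, 1e-1, 0.9   # failure rate, repair rate, fault coverage

# States: 0 = up, 1 = under repair (covered fault), 2 = failed (absorbing).
# Generator matrix Q: rows sum to zero; state 2 has no outgoing transitions.
Q = np.array([[-lam, c * lam, (1 - c) * lam],
              [mu,   -mu,     0.0],
              [0.0,  0.0,     0.0]])

def expm(A, squarings=30):
    """Matrix exponential via scaling and squaring of a short Taylor series
    (adequate here; a library routine would normally be used)."""
    n = 2 ** squarings
    M = np.eye(A.shape[0]) + A / n + (A @ A) / (2 * n * n)
    for _ in range(squarings):
        M = M @ M
    return M

t = 1000.0
P = expm(Q * t)                # transition probabilities over time t
prob_failed = P[0, 2]          # probability of absorption by time t
```

Because repairs are fast relative to failures, the exact answer is well approximated by the closed form 1 - exp(-(1 - c) * lam * t), illustrating how an approximation exposes the roles of coverage and failure rate directly.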