Academic literature and market practitioners have always devoted great attention to the analysis of asset management products, with particular regard to fund classification and performance metrics. Less attention has been paid to rating methodologies and to the risk of attributing positive ratings to underperforming asset managers. The most widespread rating criterion is the ordinal one, which assumes that the best asset managers are those who have performed better than their competitors, regardless of their ability to reach a given threshold (i.e., positive outperformance of the benchmark). Our study, after a description of the most common risk-adjusted performance measures, introduces the idea of attributing the rating on a cardinal basis, setting in advance a threshold that must be reached to receive a positive evaluation (i.e., a rating of 3 or higher on a scale of 1–5). The empirical test conducted on a sample of funds (belonging to the main equity and bond asset classes) made it possible to quantify the effects of the cardinal approach on the attribution of the rating and on the probability of assigning a good rating to underperforming funds. The empirical analysis also highlighted that the cardinal method delivers, on average, better performance than the ordinal one, even in an out-of-sample framework. The differences between the two methodologies are particularly remarkable in efficient markets such as the North American equity market. The two rating assignment systems were also analyzed using contingency tables to test their ability to anticipate the default event (underperformance relative to the benchmark). The policy suggestion emerging from our study concerns the significant impact of the rating criterion in reducing the risk of recommending funds that, despite a good rating, have failed to perform satisfactorily and are unlikely to do so in the future.
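To make the two criteria concrete, the following is a minimal sketch in Python, assuming Morningstar-style percentile buckets for the ordinal rating and a preset excess-return threshold (with hypothetical class widths) for the cardinal one; the buckets and widths are illustrative assumptions, not the study's calibration.

```python
import numpy as np

def ordinal_rating(excess_returns, bins=(0.10, 0.325, 0.675, 0.90)):
    # Peer-relative rating on a 1-5 scale by percentile rank: the "best"
    # funds get top ratings even if every fund trails its benchmark.
    ranks = np.argsort(np.argsort(excess_returns)) / (len(excess_returns) - 1)
    return np.digitize(ranks, bins) + 1

def cardinal_rating(excess_returns, threshold=0.0, width=0.01):
    # Threshold-based rating: a rating of 3 or more requires clearing the
    # preset threshold (here, positive excess return over the benchmark);
    # the class width spacing the other grades is a hypothetical choice.
    edges = [threshold - width, threshold, threshold + width, threshold + 2 * width]
    return np.digitize(excess_returns, edges) + 1

excess = np.array([-0.04, -0.02, -0.01, -0.005, 0.01])  # toy annual excess returns
print(ordinal_rating(excess))   # [1 2 3 4 5]: underperformers still rated 3 and 4
print(cardinal_rating(excess))  # [1 1 2 2 4]: only the benchmark-beater scores >= 3
```

On the same toy data, the ordinal criterion awards good ratings (3 and above) to funds with negative excess returns, which is precisely the risk the cardinal criterion is designed to remove.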
Inertial Measurement Units (IMUs) were first applied to aircraft navigation and large devices in the 1930s. At that time their application was restricted by constraints such as size, cost, and power consumption. In recent years, however, Micro-Electro-Mechanical Systems (MEMS) IMUs were introduced with very favorable features such as low cost, compactness, and low processing power. One disadvantage of these low-cost IMU sensors is that their accuracy is lower than that of high-end sensors. However, past experimental results have shown that redundant Magnetic and Inertial Measurement Units (MIMUs) improve navigation performance, for example in unmanned aerial vehicles. Yet although past simulation and experimental results demonstrated that redundant sensors improve navigation performance, none of the current research indicates how many sensors are required to meet a given accuracy. This paper evaluates configurations of an MIMU sensor array with different numbers of sensors using a simulation environment. Differently rotated MIMU sensors are incrementally added, and the Madgwick filter is used to estimate the Euler angles from foot-mounted MIMU data. The evaluation measure is the root mean square error (RMSE) of the Euler angles compared to the ground truth. During the experiments it was noticed that the execution time increases exponentially with the number of sensors; therefore, a parallelization of the code was designed, implemented, and run on a multi-core machine, and the speedup of the parallel implementation was evaluated. With 16 sensors, the parallel version's execution time is less than twice that of a single sensor and 24 times less than that of the sequential version, with the added benefit of a 26% increase in accuracy.
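As an illustration of the evaluation measure, here is a minimal Python sketch of the Euler-angle RMSE against ground truth; the attitude estimation itself (the Madgwick filter) is not reproduced, and the trajectory below is synthetic.

```python
import numpy as np

def euler_rmse(estimate, truth):
    # RMSE per Euler angle (roll, pitch, yaw); both arrays have shape (T, 3)
    # in degrees. Differences are wrapped to [-180, 180) so that, e.g., a
    # 359-degree estimate against a 1-degree truth counts as a 2-degree error.
    err = (estimate - truth + 180.0) % 360.0 - 180.0
    return np.sqrt(np.mean(err ** 2, axis=0))

rng = np.random.default_rng(0)
truth = 0.1 * np.cumsum(rng.normal(size=(1000, 3)), axis=0)  # synthetic ground truth
estimate = truth + rng.normal(scale=2.0, size=(1000, 3))     # noisy filter output
print(euler_rmse(estimate, truth))  # roughly [2, 2, 2] degrees
```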
This study attempts to evaluate and compare the performance of State-Owned Commercial Banks (SOCBs) and Private Commercial Banks (PCBs) of Bangladesh. The CAMEL rating model has been applied to assess where a bank is successful and where it has weaknesses. Data have been collected from four SOCBs and eight PCBs for the years 2014–2017. Among the selected SOCBs, Agrani Bank holds a “Satisfactory” position while Sonali Bank holds a “Fair” position throughout 2014–2017. Janata Bank improved its position from “Fair” to “Satisfactory” for 2016 and 2017, and Rupali Bank holds a “Satisfactory” position only for 2017, having been “Fair” for 2014–2016. In contrast, all the selected PCBs hold a “Satisfactory” position throughout 2014–2017. Though the composite rating for both types of banks (SOCBs and PCBs) is at a “Satisfactory” level, Rank 1 is given to PCBs and Rank 2 to SOCBs. The CAMEL ratios for “Asset quality” of both types of banks indicate a “Dissatisfactory” level, and the “Earning quality” of SOCBs is at a “Marginal” level. Therefore, proper attention should be given to managing “Asset quality”, and SOCBs should improve their “Earning quality”.
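For readers unfamiliar with the model, here is a minimal sketch of how a CAMEL composite can be formed, assuming equal component weights and the conventional 1 (best) to 5 (worst) scale; the labels and equal weighting are illustrative assumptions, not the study's exact calibration.

```python
# Component ratings on the conventional 1 (best) to 5 (worst) scale.
LABELS = {1: "Strong", 2: "Satisfactory", 3: "Fair", 4: "Marginal", 5: "Unsatisfactory"}

def camel_composite(capital, asset_quality, management, earnings, liquidity):
    # Equal weighting is an assumption; supervisors may weight the
    # components differently or apply judgmental overrides.
    components = [capital, asset_quality, management, earnings, liquidity]
    composite = sum(components) / len(components)
    return composite, LABELS[round(composite)]

print(camel_composite(capital=2, asset_quality=4, management=2, earnings=3, liquidity=2))
# (2.6, 'Fair') -- a weak asset-quality component drags the composite down
```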
Agriculture is the economic pillar of Bangladesh, one of the poorest countries in the world, which leaves its economy vulnerable to global warming. The generation of high-resolution climate predictions in Bangladesh can help to reduce the huge damage and losses inflicted by climate-linked disasters. The statistical downscaling model (SDSM) is the most widely used software for robust climate downscaling and prediction analysis. In this study, using the SDSM model, we established the statistical relationship between observed climate data in Bangladesh and the large-scale, low-resolution NCEP data, and used three statistical indicators to evaluate the prediction performance of the SDSM software. Our results show that the SDSM software is more suitable for forecasting humidity and temperature in Bangladesh than rainfall.
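A minimal sketch of the regression step behind this kind of statistical downscaling, with synthetic data standing in for the NCEP predictors and the local observations; the three indicators shown (RMSE, correlation, Nash-Sutcliffe efficiency) are illustrative and may differ from the study's own.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(365, 4))                  # daily large-scale predictors
y = X @ np.array([1.5, -0.8, 0.3, 0.0]) + rng.normal(scale=0.5, size=365)  # local series

X1 = np.column_stack([np.ones(len(X)), X])     # design matrix with intercept
beta, *_ = np.linalg.lstsq(X1, y, rcond=None)  # fit the downscaling regression
y_hat = X1 @ beta

rmse = np.sqrt(np.mean((y - y_hat) ** 2))
r = np.corrcoef(y, y_hat)[0, 1]
nse = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"RMSE={rmse:.2f}  r={r:.2f}  NSE={nse:.2f}")
```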
Molecular imaging is an important technology for clarifying biological and medical uncertainties in the 21st century. This is best realized via in vivo imaging of biological processes in small animals, which requires a dedicated high-resolution small-animal imager. We recently installed a high-resolution animal positron emission tomography (PET) scanner (microPET R4) for in vivo molecular imaging of gene expression. This paper describes the performance evaluation of our microPET R4 scanner, a dedicated PET system for rodent studies. The system is composed of 96 detector modules, each with an 8 × 8 array of 2.1 × 2.1 × 10 mm3 lutetium oxyorthosilicate (LSO) crystals, arranged as 32 crystal rings of 14.8 cm diameter. The detector crystals are coupled to a Hamamatsu R5900-C8 position-sensitive photomultiplier tube (PS-PMT) via a 10 cm long optical fiber bundle. The system operates in 3D mode without inter-plane septa, acquiring data in list mode. A number of scanner parameters such as sensitivity, spatial resolution, and energy resolution were determined in this work. In the center of the field of view (FOV), a maximal sensitivity of 21.04 cps/kBq was calculated from a measurement with a germanium-68 point source and an energy window of 250–750 keV. Spatial resolutions of 2.03 mm (FORE+2D-FBP)/1.61 mm (FORE+2D-OSEM) full width at half maximum (FWHM) in the tangential direction and 2.07 mm (2D-FBP)/1.65 mm (2D-OSEM) FWHM in the radial direction were measured in the center with a 0.28 mm diameter 18F-FDG line source. The energy resolution measured across all crystals ranged from 13.9% to around 34.6%, with a mean of 18.45%. The results show that the microPET R4 is a suitable PET scanner for imaging small animals like mice and rats.
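As an illustration of how FWHM figures like those above are extracted, here is a minimal Python sketch that measures the full width at half maximum of a one-dimensional line-source profile; the Gaussian test profile is synthetic.

```python
import numpy as np

def fwhm(profile, pixel_mm):
    # FWHM of a 1-D profile whose peak lies in the interior, with linear
    # interpolation at the two half-maximum crossings.
    p = np.asarray(profile, dtype=float)
    half = p.max() / 2.0
    above = np.where(p >= half)[0]
    left, right = above[0], above[-1]
    l = left - (p[left] - half) / (p[left] - p[left - 1])
    r = right + (p[right] - half) / (p[right] - p[right + 1])
    return (r - l) * pixel_mm

x = np.arange(64)
sigma = 3.0                                      # true FWHM = 2.3548 * sigma pixels
g = np.exp(-((x - 32) ** 2) / (2 * sigma ** 2))  # synthetic line-source profile
print(fwhm(g, pixel_mm=0.5))                     # ~3.53 mm = 2.3548 * 3 * 0.5
```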
Micro-X-ray computed tomography (micro-CT) has several favorable characteristics: it is non-invasive, offers high spatial resolution and a high signal-to-noise ratio, and provides three-dimensional volume information. Because micro-CT is utilized in many research fields, such as preclinical biomedical studies, designing a performance phantom and developing analytic methods to objectively evaluate the performance of micro-CT are very important. In this study, a performance phantom and analytic methods were developed for the performance evaluation of micro-CT. The performance parameters extracted from CT images, including noise, linearity, spatial resolution, and hardware alignment, were defined following the American Association of Physicists in Medicine (AAPM) Report No. 1 and the American Society for Testing and Materials (ASTM) standard E1695-95. Standard deviation, Pearson's correlation coefficient, the edge response function, and a visualization method were utilized to evaluate noise, linearity, spatial resolution, and hardware alignment, respectively. A digital uniform disk image was utilized to assess the accuracy of the spatial resolution evaluation method. A physical phantom study was performed to evaluate a home-made micro-CT and a commercial micro-CT (Skyscan 1076). According to these results, the performance phantom and analytic methods developed in this study have demonstrated their capability to evaluate the performance of any micro-CT.
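A minimal sketch of one such analytic method: turning a measured edge response function into a presampled MTF, the usual spatial-resolution metric; the blurred edge here is synthesized for illustration.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def mtf_from_edge(esf, pixel_mm):
    # Differentiate the edge response function (ESF) to obtain the line
    # spread function (LSF), then Fourier-transform the LSF; the normalized
    # magnitude is the modulation transfer function (MTF).
    lsf = np.gradient(np.asarray(esf, dtype=float))
    lsf /= lsf.sum()                               # normalize so MTF(0) = 1
    mtf = np.abs(np.fft.rfft(lsf))
    freqs = np.fft.rfftfreq(len(lsf), d=pixel_mm)  # cycles per mm
    return freqs, mtf

# Synthetic edge blurred by a Gaussian system response:
esf = gaussian_filter1d(np.where(np.arange(256) < 128, 0.0, 1.0), sigma=2.0)
freqs, mtf = mtf_from_edge(esf, pixel_mm=0.05)
print(freqs[np.argmin(np.abs(mtf - 0.1))])  # spatial frequency at 10% MTF, ~3.4 cycles/mm
```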
Data envelopment analysis (DEA) is the most widely used non-parametric method in healthcare operations management to measure technical, productive, and allocative efficiency. As healthcare is characterized by complex production processes, subsequent estimation techniques are often needed; thus, additional estimation procedures have been applied in combination with DEA models to evaluate efficiency. Although the literature on DEA is prevalent, there is a lack of evidence on studies using two-stage DEA in healthcare efficiency analysis. This chapter aims to review publications about two-stage DEA, a specific variation of conventional DEA, and to explore how prevalent two-stage DEA procedures are in healthcare. Investigating the state of the art of two-stage DEA models can add value for researchers who plan to conduct research using DEA. This chapter offers a rapid review and bibliometric analysis of publications on the topic. The number of publications reached a peak in 2021. The reviewed articles focused on various healthcare specialties: seventeen articles were related to hospitals and healthcare centers, and Tobit regression remained the primary choice of analysis for the dependent variable in 11 articles. Two-stage DEA was widely used across various units, such as health regions, health systems, and patient-level treatment. Some concerns and controversies are addressed to improve validation and demonstrate practical usefulness.
Every organization, for-profit or non-profit, has a different standard of performance. Performance is a critical factor in determining whether an organization will succeed in a highly competitive market. With their increasing number, educational institutions are becoming significant organizations whose performance must be scrutinized. Accordingly, this chapter's goal is to assess the performance of the 11 faculties of a state university in Turkey based on their activities at the end of the 2020–2021 academic year. Data envelopment analysis has been used as a performance measuring tool to evaluate the existing situation with consideration of both input and output variables. This study's input variables are the number of classrooms, the budget, and the number of academic staff members; the numbers of students, graduates, and articles are utilized as output variables. They are used to assess the efficiency levels of academic departments. Eventually, inefficient academic units were identified, and faculty performance comparisons were accomplished. Finally, the potential causes were discussed comprehensively.
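A minimal sketch of the underlying computation, assuming the classical input-oriented CCR model in multiplier form; the faculty figures are hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    # Input-oriented CCR efficiency of DMU `o` (multiplier form):
    # maximize u.y_o subject to v.x_o = 1, u.y_j - v.x_j <= 0, u, v >= 0.
    n, m = X.shape
    s = Y.shape[1]
    c = np.concatenate([-Y[o], np.zeros(m)])          # linprog minimizes, so negate
    A_ub = np.hstack([Y, -X])                         # u.y_j - v.x_j <= 0 for all j
    A_eq = np.concatenate([np.zeros(s), X[o]])[None]  # v.x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n), A_eq=A_eq, b_eq=[1.0])
    return -res.fun                                   # efficiency in (0, 1]

# Toy faculties: inputs = [classrooms, budget, staff]; outputs = [students, graduates, articles]
X = np.array([[20, 5.0, 40], [35, 8.0, 60], [15, 3.0, 25]])
Y = np.array([[900, 200, 50], [1100, 240, 45], [700, 160, 42]])
for i in range(len(X)):
    print(f"faculty {i}: efficiency = {ccr_efficiency(X, Y, i):.3f}")
```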
State-Owned Enterprises (SOEs) have become important instruments of social and economic policy in industrialized mixed economies and in developing countries. The use of SOEs as instruments of public policy, and the resulting clashes between these enterprises and private firms on the one hand and government and other controllers on the other, are causing concern. Public committees in different countries, as well as international organizations, have been searching for a positive theory to guide them in handling the multitude of problems related to these enterprises. Theoretical models have made important contributions to the formalization of certain problems and the classification of the information needed to solve them. Unfortunately, these theoretical models have had little relevance for the solution of important real problems.
Much of the research on SOEs is concerned with how these enterprises should behave, and what should be the product of their operations. Almost no research has been done on why SOEs function as they do. The paucity of knowledge about the operation of SOEs stems both from insufficient research effort, and from the concern of researchers with formal structures and products of these organizations and not with management behavior or with decision processes.
The purpose of this paper is to call for research beyond the confines of traditional economics, using the tools of management science to obtain insights into the difficult but salient problems of SOEs.
This study focuses on the edge router of an optical network. This device works as an interface between the electronic and optical domains: it takes packets coming from client layers and converts them into optical packets to be sent into the optical network.
Arriving variable-length client packets fill the optical packet until the latter is full or cannot accommodate the arriving packet. The aim of this work is to measure the efficiency with which electronic packets can be carried in optical packets.
Performance measures such as the packetisation efficiency and the mean packetisation time give good results.
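A minimal sketch of this fill rule and the resulting efficiency measure, with a hypothetical optical payload size and a toy mix of client-packet lengths (client packets are assumed not to exceed the optical capacity).

```python
import random

def packetisation_efficiency(capacity, client_lengths):
    # Fill fixed-size optical packets with variable-length client packets;
    # ship the current optical packet as soon as the next arrival no longer
    # fits. Efficiency = payload bytes / (optical packets sent * capacity).
    sent, fill, payload = 0, 0, 0
    for length in client_lengths:
        if fill + length > capacity:  # cannot accommodate: send current packet
            sent += 1
            fill = 0
        fill += length
        payload += length
    if fill:
        sent += 1                     # flush the last, partially filled packet
    return payload / (sent * capacity)

random.seed(1)
lengths = [random.choice([40, 576, 1500]) for _ in range(10000)]  # toy IP packet mix
print(f"efficiency = {packetisation_efficiency(12000, lengths):.3f}")
```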
Wireless ad-hoc networks do not require any preexisting infrastructure and support mobility; thus, they are excellent candidates for military tactical networks. In such networks, multihop routing is often needed. In this paper, we report measurements made on a platform of eighteen nodes implementing OLSR, a proactive routing protocol published as an experimental RFC. Some nodes are embedded in vehicles. This platform is representative of real military tactical configurations in urban areas.
Hot stamping processes for high-strength steel require that the speed and pressure of the forming equipment be adjustable flexibly and accurately. The mechanical link servo press adopts servo direct technology to adjust the slider movement and output pressure flexibly and provides faster cycle times at lower cost, making it ideal equipment for complex hot stamping processes. This paper presents a method of performance evaluation for the mechanical link servo press. A series of repeatable testing methods is used to detect and evaluate its key performances, such as bottom dead center (BDC) precision, pressure holding capacity, production efficiency, and power consumption. The method is applied to a mechanical link servo press on a hot stamping line. The experimental results show that a well-performing mechanical link servo press must combine high BDC precision, high pressure holding capacity, high production efficiency, and low power consumption.
This paper presents a new CPU scheduler to facilitate concurrent applications in Xen. The new scheduler uses the completely fair scheduler to reduce the gap between concurrent and non-concurrent virtual machines (VMs) and to balance physical CPU time distribution and resource utilization. It uses a red-black tree to shorten the virtual CPU (VCPU) lookup time, achieve efficient task scheduling, and reduce the runtime. To assist with concurrent VCPU synchronization and scheduling, we introduce a concurrent waiting queue from which marked VCPUs can be handily picked up for execution. Extended simulation runs were conducted to evaluate the performance of the proposed scheduler, and the results show that it outperforms existing schedulers in reducing the runtime for both concurrent and non-concurrent applications.
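A minimal sketch of the fair-queue picking logic keyed by virtual runtime; the scheduler described above (like Linux's CFS) keeps VCPUs in a red-black tree and picks the leftmost node, for which Python's binary heap stands in here as the ordered structure.

```python
import heapq

class FairQueue:
    # Run queue ordered by virtual runtime; the VCPU that has received
    # the least CPU time so far is always picked next.
    def __init__(self):
        self._heap = []  # entries are (vruntime, vcpu_id)

    def enqueue(self, vcpu_id, vruntime):
        heapq.heappush(self._heap, (vruntime, vcpu_id))

    def pick_next(self):
        # Pop the VCPU with the smallest vruntime (the "leftmost" node).
        vruntime, vcpu_id = heapq.heappop(self._heap)
        return vcpu_id, vruntime

q = FairQueue()
for vcpu, vr in [("vcpu0", 12.5), ("vcpu1", 3.1), ("vcpu2", 7.9)]:
    q.enqueue(vcpu, vr)
vcpu, vr = q.pick_next()
print(vcpu)                # vcpu1 -- least vruntime runs first
q.enqueue(vcpu, vr + 4.0)  # after running, requeue with accumulated vruntime
```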
Hot forming is an advanced technology that achieves lightweighting of the automotive body while guaranteeing crash safety. At present, hot forming is an efficient way for automotive manufacturers to improve the competitiveness of their products. In this paper, a hot-formed B-pillar reinforcement and its original steel sheet are evaluated in terms of component size, microstructure, and mechanical properties. In addition, three-point bending and side impact testing are conducted for B-pillars manufactured from different raw material suppliers.
Modern electronic warfare equipment shows a development trend toward high integration, smartness, and intelligence. It can produce a variety of high-strength, targeted electronic jamming across the whole airspace, frequency domain, and time domain, which seriously affects the detection ability of air defense intelligence radar. To ensure operational effectiveness in a complex interference environment, it is necessary to evaluate radar anti-jamming systems. Such evaluation involves complex technical factors as well as many uncertain, fuzzy, and human factors, which makes the effect and performance difficult to assess. Hence, how to objectively and comprehensively evaluate the anti-jamming ability of modern radar systems has become an important topic of common interest for the departments that design, produce, and use radar.
Effective Facility Management (FM) is essential to maintain old buildings, ensuring that they remain structurally sound and able to support the functionality required by their inhabitants. However, to date there is no comprehensive study on the subject. This paper aims to review previous papers on facility management performance evaluation and to construct an evaluation framework as a first step of strategic management by identifying critical success factors (CSFs) and key performance indicators (KPIs). The evaluation framework could be employed as a cornerstone of strategic management in FM. Moreover, the identified CSFs and KPIs were surveyed in order to offer an empirical approach for strategic management in the FM domain. The survey results indicate that reliability of service, timely responsiveness to emergencies, tenants' safety, customer satisfaction, and work execution control are the most critical factors in successful FM performance. In addition, training is essential to strengthen monitoring; training for emergency situations, education and training for a service mindset, regular meetings with tenants, and safety inspection and patrol are the most important factors in enhancing the level of performance in office building FM practice. Finally, the survey results on CSFs and KPIs for Korean office buildings are addressed from a strategic perspective in FM, suggesting future research.
In this paper, the influence of the stamping effect is investigated in the performance analysis of a side structure. The analysis covers performance measures such as crashworthiness and NVH (noise, vibration, and harshness). Stamping analyses are carried out for a center pillar, and numerical simulations are then performed to identify the stamping effect on the crashworthiness and the natural frequency. The results show that an analysis considering the forming history leads to a different outcome from one that ignores the stamping effect, which demonstrates that auto-body design should take the stamping history into account for an accurate assessment of the various performances.
This article proposes two quadratic-constrained Data Envelopment Analysis (DEA) models for the evaluation of mutual funds, from the perspective of evaluation against endogenous benchmarks. In contrast to previous studies, this article decomposes the two vital factors of mutual fund performance, risk and return, within these quadratic-constrained DEA models, one of which is a partly controllable quadratic-constrained program, in order to construct mutual funds' endogenous benchmarks and provide insightful management suggestions. The approach is illustrated using a sample of 25 actual mutual funds in the Chinese market. It identifies the root causes of inefficiency and ways of improving performance. The results show that although the market environment in 2006 was much better than in 2005, the average efficiency score declined in 2006 due to the relaxing of system risk control. The majority of mutual funds do not show persistence in efficiency ranking. The most important conclusion is that the ranking of mutual funds in China depends mostly on system risk control.
Data Envelopment Analysis (DEA) has been recognized in recent years as a valuable analytical research tool for the performance evaluation of several similar entities engaged in different activities. Handling single or multiple inputs and outputs, this technique distinguishes between efficient and inefficient units, thereby forming an efficient frontier. It measures the level of efficiency of non-frontier units and identifies benchmarks against which such inefficient units can be compared. In the classical DEA approach, an optimization model is formulated and solved to evaluate the efficiency score of each Decision Making Unit (DMU) separately. The Joint Optimization DEA model presented in this paper extends the DEA performance measurement technique by evaluating the performance of all DMUs simultaneously. An interactive method is designed that considers the gap between the target and achieved values of the inputs and outputs of the DMUs and provides the decision maker with an appropriate framework to choose the most preferred solution.
The study of interactions between host and pathogen proteins is important for understanding the underlying mechanisms of infectious diseases and for developing novel therapeutic solutions. Wet-lab techniques for detecting protein–protein interactions (PPIs) can benefit from computational predictions. Machine learning is one of the computational approaches that can assist biologists by predicting promising PPIs. A number of machine learning based methods for predicting host–pathogen interactions (HPI) have been proposed in the literature. The techniques used for assessing the accuracy of such predictors are of critical importance in this domain. In this paper, we question the effectiveness of K-fold cross-validation for estimating the generalization ability of HPI prediction for proteins with no known interactions. K-fold cross-validation does not model this scenario, and we demonstrate a sizable difference between its performance and the performance of an alternative evaluation scheme called leave one pathogen protein out (LOPO) cross-validation. LOPO is more effective in modeling the real world use of HPI predictors, specifically for cases in which no information about the interacting partners of a pathogen protein is available during training. We also point out that currently used metrics such as areas under the precision-recall or receiver operating characteristic curves are not intuitive to biologists and propose simpler and more directly interpretable metrics for this purpose.
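As an illustration of the protocol difference, here is a minimal sketch of LOPO evaluation using scikit-learn's LeaveOneGroupOut, with synthetic pair features and hypothetical pathogen-protein group IDs. Under plain K-fold, pairs involving the same pathogen protein can land in both the training and test folds; LOPO forbids exactly that.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 20))          # toy features for (host, pathogen) pairs
y = rng.integers(0, 2, size=300)        # interact / do not interact
groups = rng.integers(0, 30, size=300)  # pathogen protein ID of each pair

logo = LeaveOneGroupOut()               # one fold per pathogen protein
accs = []
for train, test in logo.split(X, y, groups):
    clf = RandomForestClassifier(n_estimators=50, random_state=0)
    accs.append(clf.fit(X[train], y[train]).score(X[test], y[test]))
# Synthetic labels give chance-level accuracy; the point is the splitting protocol.
print(f"LOPO accuracy: {np.mean(accs):.2f}")
```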