The study presented in this paper highlights an important issue that was the subject of discussion and research about a decade ago and has now gained new interest with the current advances in grid computing and desktop grids. New techniques are being invented for utilizing desktop computers for computational tasks, but no other study, to our knowledge, has explored the availability of these resources. The general assumption has been that the resources exist and that they are available. The study is based on a survey of the availability of resources in an ordinary office environment. The aim of the study was to determine whether there truly are usable, under-utilized networked desktop computers available for non-desktop tasks during off-hours. We found that in more than 96% of the cases the computers in the current investigation were available for the formation of part-time (night and weekend) computer clusters. Finally, we compare the performance of a full-time and a metamorphosic cluster, based on a hypothetical linearly scalable application and a real-world welding simulation.
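To make the comparison concrete, here is a minimal back-of-the-envelope sketch of how a part-time cluster's completion time compares to a full-time one for a linearly scalable workload. The node counts, work size, and off-hour fraction are invented for illustration; none of these figures come from the paper.

```python
# Hypothetical comparison of a full-time cluster vs. a part-time
# (night/weekend) cluster for a linearly scalable workload.
# All figures below are illustrative assumptions, not the paper's data.

def completion_time(work_hours, nodes, availability):
    """Wall-clock hours to finish `work_hours` of single-node work on
    `nodes` machines usable only `availability` of the time, assuming
    perfect linear scaling."""
    return work_hours / (nodes * availability)

WORK = 10_000  # single-node CPU-hours of work (assumed)
FULL_TIME = completion_time(WORK, nodes=32, availability=1.0)
# Roughly 128 off-hours per week (16-hour nights plus weekends) of 168:
PART_TIME = completion_time(WORK, nodes=96, availability=128 / 168)

print(f"full-time 32-node cluster : {FULL_TIME:.0f} h")
print(f"part-time 96-node cluster : {PART_TIME:.0f} h")
```

Under these assumed numbers, the larger part-time cluster finishes the job well before the smaller dedicated one, which is the trade-off the paper's comparison explores.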
This paper reviews recent state-of-the-art H.264 sub-pixel motion estimation (SME) algorithms and architectures. First, H.264 SME is analyzed and the impact of its functionalities on coding performance is investigated. Then, the design space of SME algorithms is explored, covering design problems, approaches, and recent advanced algorithms. In addition, design challenges and strategies for SME hardware architectures are discussed and promising architectures are surveyed. Perspectives and future prospects are also presented to highlight emerging trends and the outlook for SME designs.
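For context, H.264 derives half-pixel luma samples with a 6-tap FIR filter (1, -5, 20, 20, -5, 1)/32 before any sub-pixel search can run. The sketch below shows that interpolation step in isolation; the helper name and edge handling are illustrative, not taken from any surveyed design.

```python
import numpy as np

def half_pel_row(samples):
    """Horizontal half-pixel interpolation with the 6-tap filter
    (1, -5, 20, 20, -5, 1)/32 used for H.264 luma interpolation.
    `samples` is a 1-D integer pixel array; edges are clamped."""
    taps = np.array([1, -5, 20, 20, -5, 1])
    padded = np.pad(samples, (2, 3), mode="edge")
    out = np.convolve(padded, taps[::-1], mode="valid")
    return np.clip((out + 16) >> 5, 0, 255)  # round, scale, clip to 8 bits

# Half-pel values between each pair of neighbouring integer pixels:
print(half_pel_row(np.array([16, 32, 64, 96, 128, 160, 192])))
```

Quarter-pixel positions are then obtained by averaging neighbouring integer and half-pixel samples, which is why SME cost grows quickly with refinement depth.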
The development of vision-based human activity recognition and analysis systems has been a matter of great interest to both the research community and practitioners during the last 20 years. Traditional methods that require a human operator watching raw video streams are nowadays deemed ineffective and expensive. New, smart solutions in automatic surveillance and monitoring have emerged, propelled by significant technological advances in the fields of image processing, artificial intelligence, electronics and optics, embedded computing and networking, molding the future of several applications that can benefit from them, such as security and healthcare. The main motivation is to exploit the highly informative visual data captured by cameras and perform high-level inference in an automatic, ubiquitous and unobtrusive manner, so as to aid human operators or even replace them. This survey attempts to comprehensively review current research and development on vision-based human activity recognition. Synopses of various methodologies are presented in an effort to weigh the advantages and shortcomings of the most recent state-of-the-art technologies. A first-level self-evaluation of methodologies is also proposed, incorporating a set of significant features that best describe the most important aspects of each methodology in terms of operation, performance and other criteria, weighted by their importance. The purpose of this study is to serve as a reference for further research and evaluation, and to raise thoughts and discussions on future improvements of each methodology towards maturity and usefulness.
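As a rough illustration of such a weighted self-evaluation, the following sketch aggregates per-feature scores into an overall methodology score. The feature names, weights, and values are invented for illustration and do not reproduce the survey's actual criteria.

```python
# Importance-weighted aggregation of per-feature methodology scores.
# Feature names, weights, and scores are illustrative assumptions.

WEIGHTS = {"accuracy": 0.4, "real_time": 0.3, "robustness": 0.2, "cost": 0.1}

def overall_score(feature_scores):
    """Importance-weighted aggregate of per-feature scores in [0, 1]."""
    return sum(WEIGHTS[f] * s for f, s in feature_scores.items())

print(overall_score({"accuracy": 0.9, "real_time": 0.5,
                     "robustness": 0.7, "cost": 0.8}))  # ≈ 0.73
```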
A person’s preference to select or reject certain meals is influenced by several factors, including colour. In this paper, we study the relevance of food colour to such preferences. To this end, a set of images of meals is processed by an automatic method that associates mood adjectives capturing such meal preferences. These adjectives are obtained by analyzing the colour palettes in the image, using a method based on Kobayashi’s model of harmonic colour combinations. The paper also validates that the colour palettes calculated for each image are harmonic by developing a rating model that predicts how much a user would like the obtained colour palettes. This rating is computed with a regression model trained on the COLOURlovers dataset to learn users’ preferences. Finally, the adjectives automatically associated with images of dishes are validated by a survey answered by 178 people, which demonstrates that the labels are adequate. The results obtained in this paper have applications in tourism marketing, helping in the design of marketing multimedia material, especially for promoting restaurants and gastronomic destinations.
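For readers who want a concrete picture, here is a minimal sketch of a palette-rating regressor in the spirit of the paper's model. The features, synthetic data, and choice of a random forest are assumptions, not the authors' pipeline, which learns from real COLOURlovers ratings.

```python
# Illustrative palette-rating regression (assumed setup, synthetic data).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
# Each palette: 5 colours x 3 HSV channels, flattened to 15 features.
X = rng.random((500, 15))
y = rng.random(500)  # stand-in for COLOURlovers user ratings

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
print(model.predict(X[:1]))  # predicted rating for one palette
```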
Survey analysis methods are widely used in many areas, such as social studies, marketing research, economics, public health, clinical trials, and transportation data analysis. Minimum sample size determination is needed before a survey is conducted in order to avoid excessive cost. Several statistical methods for finding the minimum required sample size can be found in the literature. This paper proposes a method for finding the minimum total sample size needed for a survey when the population is divided into cells. The proposed method can be used for both the infinite population case and the finite population case. A computer program is needed to carry out the sample size calculation; the authors used SAS/IML, the integrated matrix language (IML) procedure of the Statistical Analysis System (SAS) software.
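For orientation, the textbook single-proportion formula below, with the usual finite-population correction, shows the kind of calculation involved. It is a standard baseline, not the per-cell method the paper proposes.

```python
import math

def sample_size(z=1.96, p=0.5, e=0.05, N=None):
    """Classic minimum sample size for estimating a proportion:
    n0 = z^2 * p * (1 - p) / e^2, with the finite-population
    correction n = n0 / (1 + (n0 - 1) / N) when N is given.
    (A textbook formula, not the paper's per-cell method.)"""
    n0 = z ** 2 * p * (1 - p) / e ** 2
    if N is not None:
        n0 = n0 / (1 + (n0 - 1) / N)
    return math.ceil(n0)

print(sample_size())        # infinite population  -> 385
print(sample_size(N=2000))  # population of 2,000  -> 323
```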
Redundancy is a widely used technique for building computing systems that continue to operate satisfactorily in the presence of faults in hardware and software components. The principal objective of applying redundancy is to achieve reliability goals subject to techno-economic constraints. Owing to the many applications arising in both industrial and military organizations, especially embedded fault-tolerant systems in telecommunication, distributed computer systems, automated manufacturing systems, etc., the reliability and dependability measures of redundant computer-based systems have become attractive features for system designers and production engineers. However, even with the best design of redundant computer-based systems, software and hardware failures may still occur due to many failure mechanisms, leading to serious consequences such as huge economic losses and risk to human life. The objective of the present survey article is to discuss key aspects, failure consequences, and methodologies of redundant systems, along with the software and hardware redundancy techniques that have been developed at the reliability engineering level. The methodological aspects, which depict the steps required to build a block diagram composed of components in different configurations, as well as Markov and non-Markov state transition diagrams representing the structural system, are elaborated. Furthermore, we describe the reliability of a specific redundant system and compare it with a non-redundant system to demonstrate the tractability of the proposed models and their performance analysis.
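A minimal worked comparison of a redundant versus a non-redundant configuration shows the kind of reliability gain at stake. It assumes independent units with a constant failure rate λ, so each unit has reliability R(t) = e^(−λt); the numbers are illustrative only and not taken from the survey.

```python
import math

def unit_reliability(lam, t):
    """Reliability of one unit with constant failure rate lam at time t."""
    return math.exp(-lam * t)

def parallel_reliability(lam, t, n):
    """System survives if at least one of n independent units survives."""
    return 1 - (1 - unit_reliability(lam, t)) ** n

lam, t = 1e-4, 5000  # failures/hour and mission time in hours (assumed)
print(f"single unit     : {unit_reliability(lam, t):.4f}")       # ≈ 0.6065
print(f"2-way redundant : {parallel_reliability(lam, t, 2):.4f}")  # ≈ 0.8452
```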
Construction workers are frequently exposed to awkward work postures and physical demands that can lead to work-related musculoskeletal disorders. There has been limited development of assessment and outreach strategies targeting this highly mobile workforce in general, and Hispanic construction workers especially. We report the prevalence of joint pain in a convenience sample of Hispanic construction workers. A workplace musculoskeletal disorder assessment was undertaken, coinciding with construction-site lunch truck visits, among 54 workers employed at two large South Florida construction sites. A 45-item questionnaire preloaded onto handheld devices was used to record field data. Forty-seven percent of Hispanic workers reported joint pain in the 30 days prior to the interview, of whom 87% indicated these joint problems interfered with work activities. Over 63% reported experiencing low back pain that lasted at least a whole day during the past 3 months. The right and left knees were the most frequently reported painful joints (both 34%). Musculoskeletal disorders, as evidenced by joint pain, appear to be prevalent among Hispanic construction workers. Workplace ergonomic prevention strategies that reduce musculoskeletal disorders using innovative recruitment and engagement methods (such as during lunch truck construction-site visits) may improve opportunities to reduce joint pain and damage.
This paper presents empirical research aimed at studying what characterizes successful information technology (IT) projects. There are often doubts about what characterizes project success and who actually defines it. In this paper, we have reviewed the literature and present significant contributions to the discussion of what characterizes successful IT projects. Furthermore, a survey was conducted in Norway to collect data on successful IT projects. Research results show that the five most important success criteria are: (1) the IT system works as expected and solves the problems, (2) satisfied users, (3) the IT system has high reliability, (4) the solution contributes to improved efficiency and competitive power, and (5) the IT system realizes strategic, tactical and operational objectives.
IT business value research examines the organizational performance impacts of information technology. In this paper, we apply the value configuration of the value shop to describe and measure organizational performance. The value shop consists of five primary activities: problem understanding, solutions to problems, decisions on actions, implementation of actions, and evaluation of actions, in an iterative problem-solving cycle. Police investigation work is defined in terms of value shop activities. Our empirical study of the Norwegian police finds significant relationships between information technology use and investigation performance for all primary activities. The most important primary activities for IT use are problem understanding and implementation of actions, as both significantly improve value shop performance.
This paper is a self-contained introductory tutorial on the problem in proteomics known as peptide sequencing using tandem mass spectrometry. This tutorial deals specifically with de novo sequencing methods (as opposed to database search methods). We first give an introduction to peptide sequencing, its importance and history, and some background on proteins. Next we show the relationship between a peptide and the final spectrum produced by a tandem mass spectrometer, together with a description of the various sources of complications that arise while generating the mass spectrum. From there we model the computational problem of de novo peptide sequencing, which is essentially the inverse problem of identifying the peptide that produced the spectrum. We then present several major approaches to solving it (reviewing some of the current algorithms within each approach), and discuss related problems and post-processing approaches.
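To illustrate the forward direction of the problem, the toy sketch below computes the b-ion prefix masses a peptide would contribute to a spectrum; de novo sequencing inverts this mapping, recovering the peptide from observed peaks. The residue masses are standard monoisotopic values, and the four-residue peptide is arbitrary.

```python
# Forward problem: expected singly charged b-ion m/z values of a peptide.
# Monoisotopic residue masses in daltons; PROTON is the proton mass.

RESIDUE = {"G": 57.02146, "A": 71.03711, "S": 87.03203,
           "P": 97.05276, "V": 99.06841, "L": 113.08406}
PROTON = 1.00728

def b_ions(peptide):
    """m/z of the singly charged b-ion for each prefix of `peptide`."""
    mass, ions = PROTON, []
    for aa in peptide:
        mass += RESIDUE[aa]
        ions.append(round(mass, 4))
    return ions

print(b_ions("GASP"))  # expected peaks for prefixes G, GA, GAS, GASP
```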
The International Liquid Mirror Telescope (ILMT) is a 4-m class survey telescope that has recently achieved first light and is expected to enter full operation by January 1, 2023. It scans the sky in a fixed 22′-wide strip centered at a declination of 29°21′41″ and works in Time Delay Integration (TDI) mode. We present a full catalog of sources in the ILMT strip that can serve as astrometric calibrators. The characteristics of the sources for astrometric calibration are extracted from Gaia EDR3, as it provides very precise measurements of astrometric properties such as RA (α), Dec (δ), parallax (π), and proper motions (μα* and μδ). We have crossmatched Gaia EDR3 with SDSS DR17 and PanSTARRS-1 (PS1) and supplemented the catalog with apparent magnitudes of these sources in the g, r, and i filters. We also present a catalog of spectroscopically confirmed white dwarfs with Sloan Digital Sky Survey (SDSS) magnitudes that may serve as photometric calibrators. The catalogs are stored in an SQLite database for query-based access. We also report the offsets in equatorial positions relative to Gaia for an astrometrically calibrated TDI frame observed with the ILMT.
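As a hedged sketch of such query-based access, the snippet below runs a calibrator query against a tiny in-memory stand-in for the catalog. The table name, column names, and rows are assumptions made for illustration; the actual schema is defined by the catalog release.

```python
import sqlite3

# Tiny in-memory stand-in so the query runs end-to-end; the real
# release ships a populated SQLite file with its own schema.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE astrometric_calibrators
                (ra REAL, dec REAL, pmra REAL, pmdec REAL, g_mag REAL)""")
conn.executemany("INSERT INTO astrometric_calibrators VALUES (?,?,?,?,?)",
                 [(150.12, 29.36, 1.2, -3.4, 16.8),
                  (150.44, 29.35, -0.7, 2.1, 17.5)])

# Bright calibrators within an RA window of the strip:
rows = conn.execute(
    """SELECT ra, dec, pmra, pmdec, g_mag FROM astrometric_calibrators
       WHERE ra BETWEEN ? AND ? AND g_mag < ?""",
    (150.0, 150.5, 18.0)).fetchall()
print(rows)
conn.close()
```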
There has been considerable interest in time series forecasting in recent years. Deep neural networks have shown their effectiveness and accuracy in various industries, and for this reason they are currently among the most extensively used machine-learning approaches for dealing with massive volumes of data. Forecasting is a part of statistical modeling used for decision-making in various fields; its goal is to predict time-varying variables from their past values. Developing models and techniques for trustworthy forecasting is an important part of the forecasting process. This study uses a systematic mapping investigation and a literature review. Time series researchers have relied for decades on ARIMA approaches, notably the autoregressive integrated moving average model, but its stationarity requirement makes the method somewhat rigid. With the introduction of computers, forecasting methods have improved and expanded, ranging from stochastic models to soft computing. Conventional approaches may not be as accurate as soft computing; in addition, the volume of data that can be analyzed and the efficiency of the process are two of the many benefits of soft computing.
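As a concrete baseline of the ARIMA family discussed here, the following minimal statsmodels example fits an ARIMA(1,1,1) model to a synthetic non-stationary series. The data and the model order are assumptions; differencing once (the d=1 term) is what handles the non-stationarity the review mentions.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
# Random walk with drift: non-stationary, so we difference once (d=1).
series = np.cumsum(rng.normal(loc=0.1, scale=1.0, size=200))

model = ARIMA(series, order=(1, 1, 1)).fit()
print(model.forecast(steps=5))  # next five predicted values
```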
Radio frequency identification (RFID) and wireless sensor networks (WSN) are two important wireless technologies that have a wide variety of applications and provide limitless future potential. RFID facilitates the detection and identification of objects that are not easily detectable or distinguishable using current sensor technologies. However, it does not provide information about the condition of the objects it detects. Sensors, on the other hand, provide information about the condition of the objects as well as the environment. Hence, integrating these technologies will expand their overall functionality and capacity. This chapter first presents a brief introduction to RFID and then investigates recent research works, new patents, academic products and applications that integrate RFID with sensor networks. Four types of integration are discussed: (1) integrating tags with sensors; (2) integrating tags with wireless sensor nodes and wireless devices; (3) integrating readers with wireless sensor nodes and wireless devices; and (4) a mix of RFID and wireless sensor networks. New challenges and future work are discussed at the end.
Effective Facility Management (FM) is essential to maintain old buildings, ensuring that they remain structurally sound and able to support the functionality required by their inhabitants. However, to date there has been no comprehensive study on the subject. This paper reviews previous papers on facility management performance evaluation and constructs an evaluation framework as a first step of strategic management by identifying critical success factors (CSFs) and key performance indicators (KPIs). The evaluation framework could serve as a cornerstone of strategic management in FM. Moreover, the identified CSFs and KPIs are surveyed in order to offer an empirical approach to strategic management in the FM domain. The survey results indicate that reliability of service, timely responsiveness to emergencies, tenants’ safety, customer satisfaction, and work execution control are the most critical factors in successful FM performance. In addition, training is essential to strengthen monitoring; training for emergency situations, education and training for a service mindset, regular meetings with tenants, and safety inspection and patrol are the most important factors in enhancing the level of performance in office building FM practice. Finally, the survey results on CSFs and KPIs for Korean office buildings are discussed from a strategic FM perspective, suggesting directions for future research.
The Orienteering Problem (OP) arises in many applications such as logistics scheduling systems, tourist guide services, and athlete recruiting. The OP and its variants have attracted increasing attention from scholars. In this paper, a literature review is conducted on recent developments in the OP and its new variants, including the Time-Dependent Orienteering Problem (TDOP), the Stochastic Orienteering Problem (SOP), and the Multi-Objective Orienteering Problem (MOOP). The definitions and formulations of the OP and its variants are formally described, and the exact and heuristic algorithms used to solve them are summarized and compared.
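To fix ideas, the sketch below implements a simple greedy heuristic for the basic OP: fixed start and end nodes, a travel-time budget Tmax, and the objective of maximizing collected score. The instance is invented, and the greedy rule is only a baseline; real solvers use the exact and (meta)heuristic methods the review covers.

```python
import math

def dist(a, b):
    """Euclidean travel time between two points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def greedy_op(coords, scores, start, end, tmax):
    """Greedy OP baseline: repeatedly visit the unvisited node with the
    best score-per-travel-time ratio, keeping enough budget to reach
    the end node. Returns the route and its collected score."""
    route, t, cur = [start], 0.0, start
    todo = set(range(len(coords))) - {start, end}
    while todo:
        best = max(todo, key=lambda j: scores[j] / dist(coords[cur], coords[j]))
        # Skip the candidate if visiting it would break the time budget.
        if t + dist(coords[cur], coords[best]) + dist(coords[best], coords[end]) > tmax:
            todo.discard(best)
            continue
        t += dist(coords[cur], coords[best])
        route.append(best)
        todo.discard(best)
        cur = best
    route.append(end)
    return route, sum(scores[j] for j in route)

coords = [(0, 0), (1, 2), (2, 1), (3, 3), (4, 0)]  # invented instance
scores = [0, 4, 3, 6, 0]                           # start/end score 0
print(greedy_op(coords, scores, start=0, end=4, tmax=9.0))
```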