
  • Article (Open Access)

    TECHNOLOGIES FOR LARGE DATA MANAGEMENT IN SCIENTIFIC COMPUTING

    In recent years, intensive use of computing has been a central strategy of investigation in several scientific research projects. Progress in computing technology has opened unprecedented opportunities for the systematic collection of experimental data and the associated analysis, which were considered impossible only a few years ago.

    This paper focuses on the strategies in use: it reviews the various components necessary for an effective solution that ensures the storage, long-term preservation, and worldwide distribution of the large quantities of data required by a large scientific research project.

    The paper also presents several examples of data management solutions used in High Energy Physics for the CERN Large Hadron Collider (LHC) experiments in Geneva, Switzerland, which generate more than 30,000 terabytes of data every year that must be preserved, analyzed, and made available to a community of several tens of thousands of scientists worldwide.

  • Article (Free Access)

    Dynamic Virtual Machine Allocation in Cloud Computing Using Elephant Herd Optimization Scheme

    Cloud computing is a rapidly evolving computing technology. The cloud is a type of distributed computing system that provides scalable computational resources on demand, including storage, processing power and applications, as a service via the Internet. With the assistance of virtualization, cloud computing allows transparent data and service sharing across cloud users, as well as access to thousands of machines at once. Virtual machine (VM) allocation is a difficult task in virtualization and an important aspect of VM migration. This process is performed to discover the optimal way to place VMs on physical machines (PMs), since placement has clear implications for resource usage, energy efficiency, and the performance of applications, among other things. Hence an efficient solution to the VM placement problem is required. This paper presents a VM allocation technique based on the elephant herd optimization scheme. The proposed method is evaluated using real-time workload traces, and the empirical results show that it reduces energy consumption and maximizes resource utilization when compared to existing methods.
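    The abstract does not give the algorithm's details, but the general shape of elephant herd optimization (EHO) applied to VM placement can be sketched as follows. This is a minimal illustration, not the authors' implementation: the encoding (one PM index per VM), the consolidation-style fitness (count of powered-on PMs), and all constants are assumptions for the demo.

    ```python
    import random

    # Hypothetical EHO sketch for VM placement: a solution assigns each of
    # N_VMS virtual machines to one of N_PMS physical machines; fewer active
    # PMs means better consolidation (a stand-in for energy efficiency).
    N_VMS, N_PMS = 10, 5
    ALPHA, BETA = 0.5, 0.1          # clan-update and matriarch-update factors
    N_CLANS, CLAN_SIZE, GENS = 3, 5, 50

    def fitness(sol):
        return len(set(sol))        # number of powered-on PMs (lower is better)

    def clamp(x):
        return min(N_PMS - 1, max(0, int(round(x))))

    random.seed(42)
    clans = [[[random.randrange(N_PMS) for _ in range(N_VMS)]
              for _ in range(CLAN_SIZE)] for _ in range(N_CLANS)]

    for _ in range(GENS):
        for clan in clans:
            clan.sort(key=fitness)
            matriarch = clan[0]
            centre = [sum(e[i] for e in clan) / CLAN_SIZE for i in range(N_VMS)]
            # Clan-updating operator: each elephant moves toward the matriarch.
            for e in clan[1:]:
                for i in range(N_VMS):
                    e[i] = clamp(e[i] + ALPHA * (matriarch[i] - e[i]) * random.random())
            # The matriarch moves toward the clan centre.
            for i in range(N_VMS):
                matriarch[i] = clamp(BETA * centre[i] + (1 - BETA) * matriarch[i])
            # Separating operator: the worst elephant is replaced by a random one.
            clan[-1] = [random.randrange(N_PMS) for _ in range(N_VMS)]

    best = min((e for clan in clans for e in clan), key=fitness)
    ```

    In a realistic setting the fitness would also model PM capacity constraints and per-host power curves rather than a simple active-PM count.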

  • Article (Free Access)

    Hybrid COOT-Reverse Cognitive Fruit Fly Optimization-Based Big Data Services and Virtual Machine Allocation for Cloud Storage System

    In recent years, cloud computing technologies have developed rapidly to provide suitable on-demand network access all over the world. A cloud service provider offers numerous types of cloud services to the user. The most significant issue, however, is how to attain optimal virtual machine (VM) allocation for the user and design an efficient big data storage platform that satisfies the requirements of both the cloud service provider and the user. Therefore, this paper presents two novel strategies for optimizing VM resource allocation and cloud storage. An optimized cloud cluster storage service is introduced using a binarization based on modified fuzzy c-means clustering (BMFCM) algorithm to overcome the negative effects of the repetitive nature of big data traffic. The BMFCM algorithm can be implemented transparently and can also address problems associated with massive data storage. VM selection is optimized using a hybrid COOT-reverse cognitive fruit fly (RCFF) optimization algorithm, whose main aim is to improve the handling of massive big data traffic and storage locality. CPU utilization, VM power, memory dimension and network bandwidth are taken as the fitness function of the hybrid COOT-RCFF algorithm. When implemented in CloudSim and Hadoop, the proposed methodology offers improvements in completion time, overall energy consumption, makespan, user-provider satisfaction and load ratio. The results show that the proposed methodology improves execution time and data retrieval efficiency by up to 32% and 6.3%, respectively, over existing techniques.
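    The abstract names CPU utilization, VM power, memory dimension and network bandwidth as the terms of the fitness function. A minimal sketch of such a multi-criteria fitness is shown below; the weights, the sign convention (lower is better) and the field names are illustrative assumptions, not the paper's actual formulation.

    ```python
    # Hypothetical weighted fitness for VM selection: penalize power draw,
    # reward spare CPU, memory and bandwidth. Weights are assumed for the demo.
    WEIGHTS = {"cpu": 0.4, "power": 0.3, "memory": 0.2, "bandwidth": 0.1}

    def vm_fitness(vm):
        """Lower is better."""
        return (WEIGHTS["power"] * vm["power"]
                - WEIGHTS["cpu"] * vm["cpu_free"]
                - WEIGHTS["memory"] * vm["mem_free"]
                - WEIGHTS["bandwidth"] * vm["bw_free"])

    vms = [
        {"power": 120.0, "cpu_free": 0.6, "mem_free": 0.5, "bw_free": 0.7},
        {"power": 90.0,  "cpu_free": 0.3, "mem_free": 0.4, "bw_free": 0.2},
    ]
    best = min(vms, key=vm_fitness)   # the lower-power VM wins here
    ```

    In the paper's hybrid scheme such a scalar fitness would be evaluated inside the COOT-RCFF search loop rather than by direct enumeration.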

  • Article (Open Access)

    Extraction, classification and visualization of 3-dimensional clouds simulated by cloud-resolving atmospheric model

    Cloud-resolving atmospheric general circulation models running on large-scale supercomputers reproduce the realistic behavior of the 3-dimensional atmospheric field on a global scale. For scientists to understand the simulation results, conventional visualization methods based on 2-dimensional cloud classification are not sufficient to characterize individual clouds and their physical properties. In this study, we propose a new 3-dimensional extraction and classification method for simulated clouds based on their 3-dimensional shape and physical properties. The proposed method extracts individual clouds from the cloud water and cloud ice fields, and classifies them into six types by altitude and upward flow. We applied the method to time-varying atmospheric simulation data and visualized atmospheric phenomena in the tropics, such as developing cumulonimbus and tropical cyclones. Two case studies clearly visualize the behavior of each cloud type and show that some cloud types are related to rainfall during active weather phenomena. The proposed method has the potential to analyze phenomena that develop in the vertical as well as the horizontal direction.
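    The pipeline described above (threshold the condensate field, extract connected 3-D cloud objects, classify each by altitude and updraft) can be sketched as follows. The threshold value, grid shape, synthetic data and the particular six type labels are assumptions for illustration, not the authors' configuration.

    ```python
    import numpy as np
    from scipy import ndimage

    # Illustrative 3-D cloud extraction/classification on synthetic data.
    QC_THRESH = 1e-5                        # assumed condensate threshold

    rng = np.random.default_rng(0)
    qc = rng.random((8, 16, 16)) * 2e-5     # (z, y, x) cloud water + ice field
    w = rng.standard_normal((8, 16, 16))    # vertical velocity field

    mask = qc > QC_THRESH                   # step 1: cloudy grid points
    labels, n_clouds = ndimage.label(mask)  # step 2: individual 3-D clouds

    def classify(cloud_idx):
        """Step 3: classify one cloud by mean level and mean updraft sign,
        giving 3 altitude bands x 2 flow regimes = six types."""
        zs, ys, xs = np.nonzero(labels == cloud_idx)
        level = "high" if zs.mean() > 5 else "mid" if zs.mean() > 2 else "low"
        regime = "convective" if w[zs, ys, xs].mean() > 0 else "stratiform"
        return f"{level}-{regime}"

    types = [classify(i) for i in range(1, n_clouds + 1)]
    ```

    On real model output the altitude bands would be defined in physical height or pressure coordinates, and the connectivity structure passed to `ndimage.label` may need to be tuned.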