Cloud computing’s simulation and modeling capabilities are crucial for big data analysis in smart grid power; they are the key to finding practical insights, making the grid resilient, and improving energy management. Because of issues with data scalability and real-time analytics, advanced methods are required to extract useful information from the massive, ever-changing datasets produced by smart grids. This research proposes Dynamic Resource Cloud-based Processing Analytics (DRC-PA), which integrates cloud-based processing and analytics with dynamic resource allocation algorithms. Computational resources must be able to adjust to changing grid conditions, and DRC-PA ensures that big data analysis scales with them. The DRC-PA method has several potential uses, including power grid optimization, anomaly detection, demand response, and predictive maintenance. Hence, the proposed technique enables smart grids to proactively adjust to changing conditions, boosting resilience and sustainability in the energy ecosystem. A thorough simulation analysis is carried out using realistic smart grid scenarios to confirm the usefulness of the DRC-PA approach. The analysis shows that DRC-PA is more efficient than traditional methods, being more accurate, more scalable, and more responsive in real time. In addition to resolving existing issues, the suggested method changes the face of contemporary energy systems by paving the way for innovations in grid optimization, decision assistance, and energy management.
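The abstract does not specify how DRC-PA’s dynamic resource allocation is realized; as a hedged illustration only, the following minimal Python sketch shows one common pattern, threshold-based scaling of analytic workers against a backlog of grid readings. The `ResourcePool` class, its limits, and the per-worker capacity are hypothetical and not part of DRC-PA.

```python
# Hedged sketch: threshold-based dynamic resource allocation for a stream of
# smart-grid measurements. Names and thresholds are illustrative assumptions,
# not the DRC-PA specification.
from dataclasses import dataclass

@dataclass
class ResourcePool:
    workers: int = 2          # currently provisioned analytic workers
    min_workers: int = 1
    max_workers: int = 32

    def scale(self, queue_depth: int, per_worker_capacity: int = 100) -> int:
        """Grow or shrink the pool so the pending records fit current capacity."""
        needed = -(-queue_depth // per_worker_capacity)          # ceiling division
        self.workers = max(self.min_workers, min(self.max_workers, needed))
        return self.workers

pool = ResourcePool()
for queue_depth in [50, 800, 2500, 120]:      # simulated backlog of grid readings
    print(queue_depth, "->", pool.scale(queue_depth), "workers")
```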
The main idea of this framework is to overcome the drawbacks commonly associated with conventional cloud-based methods. Computation and storage resources in Internet of Things (IoT) networks are distributed closer to the network’s edge, so data can be processed in real time near its source; by shortening data transfers, less bandwidth is consumed. Architectural issues, data security, interoperability, and resource allocation remain challenges that hinder the successful implementation of these ideas. This work proposes a cloud-enabled fog computing framework (C-FCF) for data center systems built on the IoT platform. It extends the scalability of cloud computing by combining a scalable architecture, uniform communication interfaces, dynamic resource allocation algorithms, and a data-centric approach with strong security protocols. The wireless sensor network (WSN) component adds versatility, allowing the system to perform different tasks across industries such as smart cities, healthcare, transportation, and industrial automation. These applications illustrate C-FCF’s capacity to foster innovation, model effectiveness, and uncover integration potential within IoT networks. A simulation analysis is carried out to validate C-FCF’s effectiveness in realistic scenarios. The simulations provide evidence of low latency, efficient resource utilization, and strong overall system performance, underlining the practicality of applying C-FCF in different IoT settings. Developing this advanced computing architecture, which surpasses the limitations of conventional technology and supports many different use cases, has the potential to change the data processing and management paradigm in IoT-enabled settings.
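To make the fog/cloud split concrete, here is a minimal sketch, assuming a simple latency-and-size placement rule, of how a fog node might decide whether to process an IoT task locally or forward it to the cloud. The `place_task` function and its thresholds are illustrative assumptions, not C-FCF’s actual policy.

```python
# Hedged sketch: a fog node deciding whether to process an IoT task locally or
# forward it to the cloud. The latency budget and size limit are illustrative
# assumptions, not part of the C-FCF specification.
from dataclasses import dataclass

@dataclass
class Task:
    payload_bytes: int
    deadline_ms: float        # how quickly a response is needed

def place_task(task: Task, cloud_latency_ms: float = 80.0,
               fog_size_limit: int = 64_000) -> str:
    """Keep latency-critical, small tasks at the edge; push the rest upstream."""
    if task.deadline_ms < cloud_latency_ms and task.payload_bytes <= fog_size_limit:
        return "fog"
    return "cloud"

print(place_task(Task(payload_bytes=2_000, deadline_ms=20)))     # -> fog
print(place_task(Task(payload_bytes=500_000, deadline_ms=500)))  # -> cloud
```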
In the new generation of power grids, the smart grid (SG) integrates sophisticated characteristics, including situational awareness, two-way communication, and distributed energy supplies. An integrated SG draws on various operational components, including sensor-equipped devices, meters, and renewable power sources. There are several challenges in securely storing and disposing of electricity data acquired from an SG, which is vulnerable to cyberattacks because of its digitization and its integration of an increasing number of links. Issues with latency, security, privacy, and excessive bandwidth consumption arise when this enormous amount of data is transmitted directly to the cloud. Edge computing (EC) addresses this problem by moving data processing to the network’s periphery, close to the embedded devices. With improved data processing speeds, a more responsive and resilient grid may be achieved, one that responds instantly to changes in energy demand and supply. EC also reduces the volume of sensitive data sent to central servers, lowering the potential for security breaches: data are better protected from intrusion when they are analyzed locally and only pertinent information is transferred to the cloud. Blockchain is therefore an intriguing solution for the SG paradigm, with many benefits. The SG’s need for decentralization and improved cybersecurity has prompted a great deal of work on blockchain technology; since data saved in a blockchain is immutable, it is crucial to verify that data are accurate and comply with high quality standards before they are stored. A Cloud-Edge Fusion Blockchain model for the smart grid (CEFBM-SG) is presented as a practical solution for storing accurate power data while enabling the safe execution of adaptable transactions. Consequently, the SG’s dependability, resilience, and scalability will improve as the number of distributed energy resources (DERs) connected to it increases. The model utilizes edge computing to enhance responsiveness and dependability. Security analyses and performance evaluations demonstrate CEFBM-SG’s exceptional security and efficiency.
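As a hedged illustration of validating data before immutable storage, the sketch below checks smart-meter readings at the edge and appends them to a simple hash-chained ledger. The validation rules and block layout are assumptions for the example and do not reproduce the CEFBM-SG protocol or its consensus mechanism.

```python
# Hedged sketch: validate smart-meter readings, then append them to a simple
# hash-chained ledger. Rules and block layout are illustrative only.
import hashlib, json, time

def valid_reading(reading: dict) -> bool:
    """Basic plausibility checks before immutable storage."""
    return "meter_id" in reading and 0.0 <= reading.get("kwh", -1.0) <= 1_000.0

def make_block(readings: list, prev_hash: str) -> dict:
    body = {"timestamp": time.time(), "readings": readings, "prev_hash": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}

chain = [make_block([], "0" * 64)]                         # genesis block
batch = [r for r in [{"meter_id": "m1", "kwh": 3.2},
                     {"meter_id": "m2", "kwh": -7.0}]      # rejected: negative
         if valid_reading(r)]
chain.append(make_block(batch, chain[-1]["hash"]))          # link to previous block
print(len(chain), chain[-1]["hash"][:16])
```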
Data redundancy consumes enormous storage space when setting up or operating cloud and fog storage, and existing solutions focus primarily on static environments, which must be revised for the dynamic nature of the cloud. Data deduplication techniques help minimize and control this issue by eradicating duplicate data from cloud storage systems. Because it can improve both storage economy and security, data deduplication (DD) over encrypted data is a crucial problem in computing and storage systems. In this research, a novel approach to building secure deduplication systems across cloud and fog environments is developed; it uses a modified cryptographic model for deduplication (MCDD) and convergent encryption (CE), with each file encrypted twice, once with MCDD and once with CE. The suggested approach focuses on the two most significant objectives of such systems: data redundancy must be minimized, and the data must be secured with a robust encryption method. The approach is ideally suited for tasks such as a user uploading new data to cloud or fog storage, and it eliminates data redundancy by detecting duplicates at the block level. The test results indicate that the recommended methodology surpasses several cutting-edge techniques in computational efficiency and security.
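Convergent encryption derives each block’s key from the block’s own content, so identical plaintext blocks encrypt to identical ciphertext and can be stored once. The minimal sketch below illustrates that block-level deduplication idea; the XOR keystream stands in for a real cipher and the MCDD layer is not reproduced, so this should not be read as the paper’s actual construction.

```python
# Hedged sketch: block-level deduplication with convergent encryption.
# Each block's key is derived from its own content, so identical blocks
# produce identical ciphertext and deduplicate. The XOR keystream is for
# illustration only and is NOT a secure cipher.
import hashlib

BLOCK = 4096
store: dict[str, bytes] = {}                        # tag -> ciphertext

def keystream(key: bytes, n: int) -> bytes:
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def put(data: bytes) -> list[str]:
    tags = []
    for i in range(0, len(data), BLOCK):
        block = data[i:i + BLOCK]
        key = hashlib.sha256(block).digest()        # convergent key
        ct = bytes(a ^ b for a, b in zip(block, keystream(key, len(block))))
        tag = hashlib.sha256(ct).hexdigest()        # deduplication tag
        store.setdefault(tag, ct)                   # duplicate blocks stored once
        tags.append(tag)
    return tags

put(b"A" * 4096 + b"B" * 4096)   # two distinct blocks
put(b"A" * 4096)                 # fully deduplicated against the first upload
print("unique blocks stored:", len(store))   # -> 2
```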
This paper presents a possible solution for speeding up the integration of heterogeneous data into the big data mainstream. Data enrichment and the convergence of all possible sources are still in their early stages. As a result, existing techniques must be retooled to better integrate existing databases, as well as those specific to the Internet of Things, so that the advantages of big data can be exploited toward the final goal of creating a web of data. In this paper, semantic-web-specific solutions are used to design a system based on intelligent agents. The system addresses problems specific to automating database migration, with the final goal of creating a common ontology over various data repositories and producers so that they can be integrated into systems based on a big data architecture.
Cloud-resolving atmospheric general circulation models running on large-scale supercomputers reproduce the realistic behavior of the 3-dimensional atmospheric field on a global scale. For scientists to understand the simulation results, conventional visualization methods based on 2-dimensional cloud classification are not sufficient for examining individual clouds and their physical characteristics. In this study, we propose a new 3-dimensional extraction and classification method for simulated clouds based on their 3-dimensional shape and physical properties. Our proposed method extracts individual clouds from cloud water and cloud ice, and classifies them into six types by their altitude and upward flow. We applied the method to time-varying atmospheric simulation data and attempted to visualize atmospheric phenomena in the tropics, such as developing cumulonimbus clouds and tropical cyclones. Two case studies clearly visualize the behavior of individual cloud types and show that some cloud types are related to rainfall during active weather phenomena. The proposed method has the potential to analyze phenomena that develop in the vertical as well as the horizontal direction.
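As a rough illustration of the pipeline, the sketch below thresholds combined cloud water and cloud ice, labels connected 3-D regions as individual clouds, and assigns each a type from cloud-top altitude and maximum updraft. The thresholds and the particular six-way labeling (low/mid/high crossed with convective/stratiform) are assumptions for the example, not the paper’s exact rules.

```python
# Hedged sketch: extract individual 3-D clouds and classify them by altitude
# and upward flow. Thresholds and type labels are illustrative assumptions.
import numpy as np
from scipy import ndimage

def extract_and_classify(qc, qi, w, z, condensate_thresh=1e-5, w_thresh=1.0):
    """qc/qi: cloud water/ice mixing ratio, w: vertical wind, shape (nz, ny, nx);
    z: 1-D level heights in metres."""
    cloudy = (qc + qi) > condensate_thresh          # cloudy grid points
    labels, n = ndimage.label(cloudy)               # individual 3-D clouds
    clouds = []
    for cid in range(1, n + 1):
        mask = labels == cid
        levels = np.where(mask.any(axis=(1, 2)))[0]
        top_km = z[levels.max()] / 1000.0
        convective = w[mask].max() > w_thresh       # strong upward flow?
        kind = "high" if top_km > 8 else "mid" if top_km > 3 else "low"
        kind += "-convective" if convective else "-stratiform"   # six types
        clouds.append((cid, kind))
    return clouds

# toy 3-D field: one shallow cloud with a weak updraft
nz, ny, nx = 20, 8, 8
qc = np.zeros((nz, ny, nx)); qc[2:5, 3:5, 3:5] = 1e-4
qi = np.zeros_like(qc); w = np.full_like(qc, 0.2)
z = np.linspace(0, 15000, nz)
print(extract_and_classify(qc, qi, w, z))
```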
In distributed data-intensive computing environments, securely assigning tasks to appropriate machines is a major job scheduling problem, and its complexity increases with the number of jobs and their execution times. Several meta-heuristic algorithms, including particle swarm optimization (PSO) and variable neighborhood particle swarm optimization (VNPSO), have been employed to solve the problem to a certain extent. This paper proposes a modified PSO with scout adaptation (MPSO-SA) algorithm, which incorporates a cyclically applied mutation operator, to solve the job scheduling problem in the cloud environment. A comparative study between the proposed MPSO-SA scheduling mechanism and conventional scheduling algorithms shows that the proposed method decreases the probability of security risk when scheduling jobs.
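For readers unfamiliar with PSO-style schedulers, the sketch below shows a simplified discrete particle swarm that assigns jobs to machines to minimize makespan and applies a mutation operator to keep diversity; it is a generic illustration and does not reproduce the MPSO-SA update rules or its security model.

```python
# Hedged sketch: simplified discrete PSO with a mutation operator for
# job-to-machine assignment minimizing makespan. Generic illustration only.
import random

def makespan(assign, job_times, n_machines):
    load = [0.0] * n_machines
    for job, m in enumerate(assign):
        load[m] += job_times[job]
    return max(load)

def pso_schedule(job_times, n_machines, n_particles=20, iters=200,
                 mutation_rate=0.05, seed=0):
    rng = random.Random(seed)
    n_jobs = len(job_times)
    swarm = [[rng.randrange(n_machines) for _ in range(n_jobs)]
             for _ in range(n_particles)]
    pbest = [p[:] for p in swarm]
    gbest = min(pbest, key=lambda p: makespan(p, job_times, n_machines))[:]
    for _ in range(iters):
        for i, p in enumerate(swarm):
            for j in range(n_jobs):
                r = rng.random()
                if r < 0.4:                        # pull toward personal best
                    p[j] = pbest[i][j]
                elif r < 0.8:                      # pull toward global best
                    p[j] = gbest[j]
                if rng.random() < mutation_rate:   # mutation keeps diversity
                    p[j] = rng.randrange(n_machines)
            if makespan(p, job_times, n_machines) < makespan(pbest[i], job_times, n_machines):
                pbest[i] = p[:]
        gbest = min(pbest + [gbest],
                    key=lambda p: makespan(p, job_times, n_machines))[:]
    return gbest, makespan(gbest, job_times, n_machines)

rng = random.Random(1)
jobs = [rng.uniform(1.0, 10.0) for _ in range(30)]
print("best makespan:", round(pso_schedule(jobs, n_machines=4)[1], 2))
```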
Today, many information and communication technologies (ICT) and networking technologies are used in industrial control systems. As a result, networked industrial control systems (NICS) are exposed to many security threats, and new technologies for NICS also need to be tested. This paper presents a cloud-based experimental platform for NICS on which new technologies and security threats can be tested. A cloud platform is used to emulate network devices, and Simulink is used to simulate the physical layer. To build this testbed, we modify the cloud platform and add three modules. The first module allows the cloud platform to connect to real devices, so that real devices can be added to the networks in the cloud platform. The second module handles network connection configuration in the testbed, allowing the bandwidth, delay, and packet loss rate of networks in the testbed to be set. The third module connects Simulink to the testbed. The main features of the proposed platform are high flexibility, high authenticity, and low cost. Advanced persistent threat (APT) attacks are a common threat to NICS today. To demonstrate the feasibility of the proposed testbed, a typical NICS is established and an APT attack is executed against it.
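As a hedged sketch of what the inputs to the second (link configuration) module might look like, the code below models a link as bandwidth, delay, and loss rate and renders it into a Linux tc/netem command, assuming the emulated links are ordinary Linux interfaces; the class and field names are illustrative and not taken from the paper.

```python
# Hedged sketch: a testbed link configuration rendered as a tc/netem command.
# Assumes the emulated links appear as Linux interfaces; names are illustrative.
from dataclasses import dataclass

@dataclass
class LinkConfig:
    device: str               # e.g. a veth/tap interface of an emulated node
    bandwidth_mbit: float
    delay_ms: float
    loss_percent: float

    def to_tc_command(self) -> str:
        """netem's rate/delay/loss options require a reasonably recent kernel."""
        return (f"tc qdisc replace dev {self.device} root netem "
                f"rate {self.bandwidth_mbit}mbit "
                f"delay {self.delay_ms}ms loss {self.loss_percent}%")

print(LinkConfig("veth0", bandwidth_mbit=10, delay_ms=50, loss_percent=1).to_tc_command())
```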
In cloud computing, load balancing is crucial for effective resource management. Preventing servers from becoming overloaded entails distributing incoming network traffic or computational tasks among several servers, which results in better resource management, increased throughput, and quicker response times. Several heuristic and metaheuristic techniques have been used to spread the load across the available virtual machines, and researchers have devoted considerable effort to the load balancing problem. This paper develops a novel optimization-assisted load-balancing model with three main processes: Virtual Machine (VM) classification, load balancing, and replica management. For the VM classification process, a modified version of the fuzzy clustering approach is suggested. For the load balancing procedure, the COOT Insisted Bald Eagle Search (COOTIBES) model is suggested. This optimization-assisted load balancing takes into account a number of constraints, including frequency, makespan, memory usage, resource utilization, and execution time. Additionally, the suggested COOTIBES algorithm manages replicas while taking load, put cost, and storage cost into account. Lastly, the suggested work’s performance is compared with traditional models using various performance indicators. While the Inquisitive Genetic Algorithm with Grey Wolf Optimization Algorithm (IG-GWO), Ant Colony Optimization for Continuous Domains (ACOR), Life Choice-Based Optimizer (LCBO), Cat Swarm Optimization (CSO), Genetic Algorithm Combined with First Come First Serve + Genetic Algorithm Combined with Round Robin (GA-FCFS+GA-RR), Jellyfish Optimization (JFO), and Namib Beetle Optimization (NBO) yield larger makespan values, the COOTIBES scheme achieves a smaller makespan of 180 for a task count of 2000.
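To ground the VM classification stage, the sketch below runs plain fuzzy c-means over a toy VM feature matrix (CPU, memory, and bandwidth utilization); it is the standard algorithm, not the paper’s modified fuzzy clustering, and COOTIBES itself is not reproduced.

```python
# Hedged sketch: standard fuzzy c-means used to group VMs by load features
# before load balancing. Not the paper's modified clustering or COOTIBES.
import numpy as np

def fuzzy_c_means(X, n_clusters=3, m=2.0, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), n_clusters))
    U /= U.sum(axis=1, keepdims=True)              # memberships sum to 1 per VM
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-9
        U = 1.0 / (d ** (2.0 / (m - 1.0)))         # standard membership update
        U /= U.sum(axis=1, keepdims=True)
    return U, centers

# toy VM feature matrix: rows = VMs, columns = CPU %, memory %, bandwidth %
vms = np.array([[10, 20, 15], [12, 25, 10], [70, 80, 60],
                [75, 85, 65], [40, 45, 50]], dtype=float)
U, centers = fuzzy_c_means(vms)
print("cluster of each VM:", U.argmax(axis=1))
```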
The term "cloud" comes from internet-based network diagrams that represent the internet, or various parts of it, as schematic clouds; the term, characteristics, and services associated with internet-based computing are collectively called cloud computing. These characteristics include infrastructure, provisioning, network access, and managed metering. This paper discusses the primary business models, namely software, platform, and infrastructure as a service, and the common deployment models used by service providers and users to operate and maintain cloud services: private, public, community, and hybrid clouds. In this paper, cloud computing refers to the different types of services and applications delivered in the internet cloud; in many cases, the devices used to access these services and applications do not require any special software. In that sense, cloud computing is everywhere. Cloud computing also promises to cut operational and capital costs.