
  Bestsellers

  • Article (No Access)

    SHARED RISK RESOURCE GROUP COMPLEXITY AND APPROXIMABILITY ISSUES

    This article investigates complexity and approximability properties of combinatorial optimization problems arising from the notion of Shared Risk Resource Group (SRRG). SRRGs were introduced to capture network survivability issues in which a single failure may disable a whole set of resources, and have been formalized as colored graphs, where each set of resources is represented by a set of edges of the same color. We consider the analogs of classical problems, such as finding paths or cuts with the minimum number of colors, or color-disjoint paths. These optimization problems are much harder than their counterparts in classical graph theory. In particular, standard relationships such as the Max-Flow/Min-Cut equality no longer hold. In this article we identify cases where these problems are polynomial, for example when the edges of a given color form a connected subgraph, and otherwise give hardness and non-approximability results for these problems.
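
The minimum-color path problem mentioned above can be illustrated with a tiny brute-force sketch (illustrative only, not one of the paper's algorithms; the function name and example graph are made up): enumerate color subsets in order of increasing size and test s-t connectivity using only edges whose colors lie in the chosen subset.

```python
from itertools import combinations

def min_color_path(edges, s, t):
    """Smallest number of colors whose edges alone connect s to t.

    edges: list of (u, v, color) tuples for an undirected colored graph.
    Brute force over color subsets -- exponential, for tiny instances only.
    """
    colors = sorted({c for _, _, c in edges})
    for k in range(len(colors) + 1):
        for subset in combinations(colors, k):
            allowed = set(subset)
            # adjacency lists restricted to the chosen colors
            adj = {}
            for u, v, c in edges:
                if c in allowed:
                    adj.setdefault(u, []).append(v)
                    adj.setdefault(v, []).append(u)
            # depth-first search from s
            seen, stack = {s}, [s]
            while stack:
                u = stack.pop()
                if u == t:
                    return k
                for v in adj.get(u, []):
                    if v not in seen:
                        seen.add(v)
                        stack.append(v)
    return None  # t unreachable under any color subset

# Two parallel routes: one uses colors {1, 2}, the other only color 3.
edges = [("s", "a", 1), ("a", "t", 2), ("s", "b", 3), ("b", "t", 3)]
print(min_color_path(edges, "s", "t"))  # 1 (the single-color route via b)
```

The exponential enumeration is exactly why the hardness results matter: unlike shortest path, minimizing colors is not known to admit an efficient general algorithm.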

  • Article (No Access)

    A Sorting Network on Trees

    Sorting networks are a class of parallel, oblivious sorting algorithms. Not only do they have interesting theoretical properties, but they can also be fabricated directly in hardware. A sorting network is a sequence of parallel compare-exchange operations, performed by comparators grouped into stages; the underlying graph of comparators defines the topology of the network. Most results on sorting networks concern the unrestricted case, where the underlying graph is the complete graph; prior results are also known for paths, hypercubes, and meshes. In this paper we introduce a sorting network whose underlying topology is a tree, and formalize the concept of sorting networks on a restricted graph topology by introducing a new graph parameter, the sorting number. The main result of the paper is a description of an O(min(nΔ², n²))-depth sorting network on a tree with maximum degree Δ.
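
For context, the classic sorting network on a restricted topology is odd-even transposition sort, whose underlying graph is a path: n stages of disjoint compare-exchange operations between neighbouring positions. A minimal sketch (this is the well-known path network, not the paper's tree construction):

```python
def odd_even_transposition_sort(a):
    """Odd-even transposition sort: an n-stage sorting network on a path.

    Each stage applies disjoint compare-exchange operations between
    neighbouring positions, alternating between odd and even pairs;
    the comparators within one stage are independent and could run
    in parallel.
    """
    a = list(a)
    n = len(a)
    for stage in range(n):
        start = stage % 2
        for i in range(start, n - 1, 2):
            if a[i] > a[i + 1]:           # compare-exchange on edge (i, i+1)
                a[i], a[i + 1] = a[i + 1], a[i]
    return a

print(odd_even_transposition_sort([5, 1, 4, 2, 3]))  # [1, 2, 3, 4, 5]
```

Its depth is Θ(n); the paper's contribution is an analogous construction when the comparator graph is a tree rather than a path.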

  • Article (No Access)

    SYNTHESIZING LINEAR SYSTOLIC ARRAYS FOR DYNAMIC PROGRAMMING PROBLEMS

    In this paper we use geometric considerations to design modular linear systolic arrays for solving dynamic programming problems. Our results show that solving a dynamic programming problem of size n requires (n − 1)⌈n/α⌉ + 1 cells and (n − 1)((α + 3)⌈n/α⌉ + 2) + 1 time steps, where each cell has a local memory of constant size α and the time delay between two consecutive cells of the array is assumed to be constant.
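
As a quick arithmetic check of the stated formulas (the function name is ours), taking n = 10 and α = 2 gives ⌈n/α⌉ = 5, hence 9·5 + 1 = 46 cells and 9·(5·5 + 2) + 1 = 244 time steps:

```python
from math import ceil

def systolic_cost(n, alpha):
    """Cell and time-step counts from the abstract's formulas:
    cells = (n-1)*ceil(n/alpha) + 1
    steps = (n-1)*((alpha+3)*ceil(n/alpha) + 2) + 1
    """
    blocks = ceil(n / alpha)
    cells = (n - 1) * blocks + 1
    steps = (n - 1) * ((alpha + 3) * blocks + 2) + 1
    return cells, steps

print(systolic_cost(10, 2))  # (46, 244)
```

Increasing the per-cell memory α trades fewer cells for a constant-factor change in the time bound, which is the modularity knob the design exposes.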

  • Article (No Access)

    TRANSITIVE CLOSURE IN PARALLEL ON A LINEAR NETWORK OF PROCESSORS

    In this paper, we propose a parallel algorithm that computes the transitive closure of a directed graph on a linear network of n processors, in time linear in the number of edges of the transitive closure. The underlying architecture is a linear network of processors with neighbour-to-neighbour communication, where the number of processors equals the number of vertices of the graph.
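
For reference, the sequential baseline that such parallel algorithms improve on can be sketched with Warshall's O(n³) transitive-closure computation (illustrative only, not the paper's linear-network algorithm):

```python
def transitive_closure(n, edges):
    """Warshall's sequential O(n^3) transitive closure.

    reach[i][j] ends up True iff there is a directed path from i to j.
    """
    reach = [[False] * n for _ in range(n)]
    for u, v in edges:
        reach[u][v] = True
    for k in range(n):            # allow k as an intermediate vertex
        for i in range(n):
            if reach[i][k]:
                for j in range(n):
                    if reach[k][j]:
                        reach[i][j] = True
    return reach

r = transitive_closure(4, [(0, 1), (1, 2), (2, 3)])
print(r[0][3])  # True
```

An output-sensitive bound, linear in the number of closure edges, can be much better than n³ on sparse closures, which is what makes the parallel result attractive.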

  • Article (No Access)

    SUB-LOGARITHMIC ALGORITHMS FOR THE LARGEST EMPTY RECTANGLE PROBLEM

    We show that the Largest Empty Rectangle problem can be solved by reducing it, in a natural way, to the All Nearest Smaller Values problem. We provide two classes of algorithms. The first assumes that the input points are available sorted by x (resp. y) coordinate; our algorithm for this case runs in O(log log n) time using formula processors in the Common-CRCW-PRAM model. For unsorted input, we present algorithms that run in formula time using formula processors in the Common-CRCW-PRAM model, or in O(log n) time using formula processors in the EREW-PRAM model. No sub-logarithmic-time parallel algorithms had previously been reported for this problem.
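
The All Nearest Smaller Values primitive that the reduction relies on has a simple sequential stack formulation (a sketch of the primitive only, not the parallel PRAM algorithms):

```python
def all_nearest_smaller_values(a):
    """For each index i, return the index of the nearest element to the
    left of a[i] that is strictly smaller than a[i], or None if there
    is none. Sequential O(n) stack version of the ANSV primitive.
    """
    left = []
    stack = []  # indices whose values form an increasing sequence
    for i, x in enumerate(a):
        while stack and a[stack[-1]] >= x:
            stack.pop()               # discard values >= x
        left.append(stack[-1] if stack else None)
        stack.append(i)
    return left

print(all_nearest_smaller_values([3, 1, 4, 1, 5]))  # [None, None, 1, None, 3]
```

Running the same routine on the reversed array yields nearest smaller values to the right, which together bound the maximal empty region around each element.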

  • Article (No Access)

    OPTIMAL PARALLEL SUPERPRIMITIVITY TESTING FOR SQUARE ARRAYS

    We present an optimal O(log log n)-time algorithm on the CRCW PRAM that tests whether an n×n square array A is superprimitive. If A is not superprimitive, the algorithm returns its quasiperiod, i.e., the smallest square array that covers A.
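
The notion of a quasiperiod (cover) is easiest to see in one dimension: the shortest prefix whose occurrences, allowing overlaps, cover every position of the string. A brute-force 1D sketch (illustrative only; the paper's algorithm handles square arrays, in parallel and optimally):

```python
def covers(w, s):
    """True if the occurrences of w in s cover every position of s."""
    n, m = len(s), len(w)
    covered = [False] * n
    for i in range(n - m + 1):
        if s[i:i + m] == w:
            for j in range(i, i + m):
                covered[j] = True
    return all(covered)

def quasiperiod(s):
    """Shortest prefix of s whose occurrences cover s (the 1D quasiperiod).

    Quadratic brute force, for illustration only. A string equal to its
    own quasiperiod is superprimitive.
    """
    for m in range(1, len(s) + 1):
        if covers(s[:m], s):
            return s[:m]

print(quasiperiod("abaababaaba"))  # aba
```

Here "aba" covers "abaababaaba" because its overlapping occurrences at positions 0, 3, 5, and 8 touch every character.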

  • Article (No Access)

    LEAST MEAN SQUARED BACKGROUND CALIBRATION FOR OFDM MULTICHANNEL RECEIVERS

    This paper presents a data-estimation scheme for wideband multichannel charge-sampling filter-bank receivers, together with a complete system-calibration algorithm based on the least mean squares (LMS) algorithm. A unified model of the receiver is defined, containing all first-order mismatches, offsets, and imperfections, and the LMS algorithm is employed to track these errors. The performance of this technique under noisy channel conditions is verified. Moreover, a detailed complexity analysis of the calibration algorithm is provided, showing that sinc filter banks have much lower complexity than traditional continuous-time filter banks.
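
The core LMS update that such calibration schemes build on can be sketched generically (this is the textbook LMS system-identification loop, not the paper's receiver model; all names and parameters here are illustrative):

```python
import numpy as np

def lms_identify(x, d, taps=4, mu=0.05):
    """Textbook LMS system identification: adapt FIR weights w so that
    w . [x[i], x[i-1], ...] tracks the observed output d[i].

    Core stochastic-gradient update: w <- w + mu * e * x_window.
    """
    w = np.zeros(taps)
    for i in range(taps - 1, len(x)):
        window = x[i - taps + 1:i + 1][::-1]  # newest sample first
        e = d[i] - w @ window                 # instantaneous error
        w = w + mu * e * window               # LMS weight update
    return w

# Identify a known 4-tap channel from its input/output samples (noiseless).
rng = np.random.default_rng(1)
h = np.array([0.8, -0.4, 0.2, 0.1])
x = rng.standard_normal(5000)
d = np.convolve(x, h)[:len(x)]
w = lms_identify(x, d)
print(np.allclose(w, h, atol=0.01))  # True
```

In a calibration setting the "channel" h plays the role of the mismatch and offset errors being tracked; the same update runs in the background while the receiver operates.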

  • Article (No Access)

    Reduced Complexity Linearity Improved Threshold Quantized Comparator Based Flash ADC

    This paper describes a new standard-cell-based approach to comparator design for flash ADCs. In a conventional flash ADC, the resistive ladder network and analog comparators consume up to 60% of the power. Threshold inverter quantized (TIQ) comparators reported earlier have improved speed and provide low-power, low-voltage operation, but they require feature-size variation and suffer from non-linearity issues. Here, a new standard-cell comparator is proposed that retains all the advantages of the TIQ comparator while providing improved linearity with reduced hardware complexity. A 4-bit ADC designed using the proposed comparator requires 206 minimum-sized transistors and provides a large area saving compared to previously proposed designs. The thermometer code is partitioned using the algebraic division theorem; this conversion is used for mathematical modeling and for reducing the complexity of the decoder circuit through a semi-parallel organization of comparators. The circuit is designed in a 90 nm technology and exhibits satisfactory performance even under process variation.

  • Article (No Access)

    The Comparison, Analysis and Circuit Implementation of the Chaotic Systems

    A four-dimensional chaotic system with complex dynamical properties is constructed. The complexity of the system is evaluated via its equilibrium points, Lyapunov exponent spectrum, and bifurcation diagrams; together, the Lyapunov exponent spectrum and bifurcation diagrams demonstrate the coexistence of attractors. The C0 and SE complexity algorithms are used to compare and analyze the complexity characteristics of the system, and the most complex integer-order system is obtained. In addition, a novel circuit that switches between different chaotic attractors is presented. In particular, more complex parameters are selected for the fractional-order chaotic system through an analysis of parameter complexity, and the rich dynamics of the system are analyzed. Experimental results on a Field-Programmable Gate Array (FPGA) platform verify the feasibility of the system. Finally, the most complex integer-order system is compared with its corresponding fractional-order system in an image-encryption application, showing that the fractional-order system has better application prospects.

  • Article (No Access)

    Exploring the Potential of Deep Learning and Blockchain for Intrusion Detection Systems: A Comprehensive Review

    Detecting intrusions in real time within cloud networks presents a multifaceted challenge involving intricate processes such as feature representation, intrusion-type classification, and post-processing. Distributed Intrusion Detection Systems (DIDSs) constitute a complex landscape characterized by diverse contextual nuances, distinct functional advantages, and limitations specific to deployment scenarios. Despite these challenges, DIDSs offer immense potential for addressing evolving intrusion detection challenges through tailored contextual adaptations and unique functional advantages. Moreover, exploring the limitations associated with different deployment contexts facilitates targeted improvements and refinements, unlocking new avenues for innovation in intrusion detection technologies. Notably, deep learning (DL) integrated with blockchain technology emerges as a superior approach in terms of security, while bioinspired models excel in Quality of Service (QoS). These models demonstrate higher accuracy across various network scenarios, underscoring their efficacy in intrusion detection. Additionally, edge-based models exhibit high accuracy and scalability with reduced delay, complexity, and cost in real-time network environments. The fusion of these models holds promise for enhancing classification performance across diverse attack types, offering avenues for future research. This review conducts a comprehensive comparison of performance metrics, including accuracy, response delay, computational complexity, scalability, and deployment cost. The proposed Novel DIDS Rank (NDR) streamlines model selection by considering these metrics, enabling users to make well-informed decisions based on multiple performance aspects simultaneously. This unified ranking approach facilitates the identification of DIDSs that achieve high accuracy and scalability while minimizing response delay, cost, and complexity across varied deployment scenarios.

  • Article (No Access)

    COMPLEXITY AND FAMILIARITY WITH COMPUTER ASSISTANCE WHEN MAKING ILL-STRUCTURED BUSINESS DECISIONS

    Using high-level tasks typical of managerial decisions, this experimental study examined the influence of computer assistance on solving ill-structured problems. Of specific interest were the influences of complex and simple assistance on decision performance and decision-maker attitudes. Our findings suggest that performance and user attitudes can be enhanced by technology that provides clear and simple instruction in good problem-solving practices. However, when that technology adds a complex array of technical options and features, the assistance may fail to improve, or may even diminish, performance and damage user attitudes. Familiarization with such complex technology may not improve these outcomes. The findings regarding the relationship between assistance complexity and decision performance are consistent with those of studies suggesting that complexity has a curvilinear influence on performance. When considering the application of technological assistance to ill-structured decision tasks, the complexity of the assistance should not be ignored.

  • Article (No Access)

    OPEN-ENDED ARTIFICIAL EVOLUTION

    Of all the issues discussed at Alife VII: Looking Forward, Looking Backward, the issue of whether it is possible to create an artificial life system that exhibits open-ended evolution of novelty is by far the biggest. Of the 14 open problems settled on as a result of debate at the conference, some six are directly or indirectly related to this issue.

    Most people equate open-ended evolution with complexity growth, although a priori these seem to be different things. In this paper I report on experiments to measure the complexity of Tierran organisms, and show the results for a size-neutral run of Tierra. In this run, no increase in organismal complexity was observed, although organism size did increase through the run. This result is discussed, offering some signposts on the path to solving the issue of open-ended evolution.

  • Article (No Access)

    ADAPTIVE DATA ANALYSIS OF COMPLEX FLUCTUATIONS IN PHYSIOLOGIC TIME SERIES

    We introduce a generic framework of dynamical complexity to understand and quantify fluctuations of physiologic time series. In particular, we discuss the importance of applying adaptive data analysis techniques, such as the empirical mode decomposition algorithm, to address the challenges of nonlinearity and nonstationarity that are typically exhibited in biological fluctuations.

  • Chapter (No Access)

    REFINEMENT AND COMPLEXITY IN THE EVOLUTION OF CHESS

    The history of chess and its past variants is examined based on a new measure of game refinement and complexity. The history of chess variants can be divided into three stages: the origin, with moderate complexity and refinement; many diverging variants, with high complexity and low refinement; and modern sophisticated chess, with low complexity and high refinement.

  • Chapter (No Access)

    TWO TOPICS CONCERNING TWO-DIMENSIONAL AUTOMATA OPERATING IN PARALLEL

    This paper deals with two topics concerning two-dimensional automata operating in parallel. We first investigate the relationship between the accepting powers of two-dimensional alternating finite automata (2-AFAs) and nondeterministic bottom-up pyramid cellular acceptors (NUPCAs), and show that Ω(diameter × log diameter) time is necessary for NUPCAs to simulate 2-AFAs. We then investigate the space complexity of two-dimensional alternating Turing machines (2-ATMs) operating in small space, and show that if L(n) is a two-dimensionally space-constructible function such that lim_(n→∞) L(n)/log log n > 1 and L(n) ≤ log n, and L′(n) is a function satisfying L′(n) = o(L(n)), then there exists a set accepted by some strongly L(n) space-bounded two-dimensional deterministic Turing machine but not accepted by any weakly L′(n) space-bounded 2-ATM; thus there exists a rich space hierarchy for weakly S(n) space-bounded 2-ATMs with log log n ≤ S(n) ≤ log n.

  • Chapter (No Access)

    The Role of the CIO in the Development of Interoperable Information Systems in Healthcare Organizations

    A major challenge for business information systems (BIS) design and management within healthcare units is the need to accommodate the complexity and changeability in terms of their clinical protocols, technology, business and administrative processes. Interoperability is the response to these demands, but there are many ways to achieve an “interoperable” information system. In this chapter we address the requirements of a healthcare interoperability framework to enhance enterprise architecture interoperability of healthcare organizations, while maintaining the organization's technical and operational environments, and installed technology. The healthcare interoperability framework is grounded on the combination of model driven architecture (MDA) and service-oriented architecture (SOA) technologies.

    The main argument of this chapter is to advocate the role of the chief information officer (CIO) in dealing with the challenges posed by the need to achieve BIS grounded in interoperability at the different layers, and in developing and executing an interoperability strategy, which must be aligned with the health organization's business, administrative, and clinical processes. A case study of the Hospital de São Sebastião is described, demonstrating the critical role of the CIO in the development of an interoperable platform.

  • Chapter (No Access)

    Chapter 3: Inaccessible Information and the Mathematical Theory of Oracles

    People always need more information than they have. They can get some of this information by themselves, but a good deal of information remains inaccessible. This situation, which has always existed in human civilization, brought forth Oracles. The idea of an Oracle reflected interactions between systems with less information, such as people and states, and systems with more indispensable information. In the 20th century, it was proved that some information is intrinsically inaccessible, and the concept of an Oracle naturally entered computer science, becoming popular in the realm of algorithms and computation. At first, Turing machines with an Oracle were introduced and studied. Later, inductive Turing machines with Oracles, limit Turing machines with Oracles, and evolutionary Turing machines with Oracles were established and explored. Here we create a theoretical background for the concept of an Oracle. In the first part of this work, we contemplate Oracles in human culture. In the second part, we contemplate Oracles in computer science and mathematics. In the third part, the variety of Oracles is analyzed and classified. In the fourth part, we develop a mathematical theory of Oracles, in which the concepts of an Oracle and its brands are formalized and studied in a mathematical framework.

  • Chapter (No Access)

    Chapter 9: The Chessboard Model: The Missing Links Between Complexity and Emergence

    This article presents a new approach to explaining the relationship between the interacting units of a larger organization. With complexity theory as the theoretical foundation, organizations are viewed as chaordic systems whose interactions produce a positive outcome: emergence. Organizational culture and strategic planning are introduced as mediating factors that help drive the activities of the interacting units. This theoretical model will help practitioners of human resource development and project management understand the underlying forces that drive the successful completion of projects, initiatives, and individual tasks. The “Chessboard” model is introduced as a concept map.