
  Bestsellers

  • chapter (No Access)

    "SIMPLE RULES" FOR IMPROVING CORPORATE IQ: BASIC LESSONS FROM COMPLEXITY SCIENCE

    Human and social capital developments are discussed in the context of increasing corporate IQ, defined as distributed intelligence (DI) in firms, as the basis of improved economic rent generation. A review of complexity science shows that adaptive tension dynamics (energy-differentials) may be used to foster adaptively efficacious DI appreciation. The optimal region for rapidly improving adaptive fitness is the one in which emergent self-organization occurs—between the 1st and 2nd critical values of adaptive tension. Below the 1st value there is little change; above the 2nd value the system becomes chaotic and dysfunctional. Twelve "simple rules" drawn from complexity science are defined. These are available to rent-seeking CEOs wishing to create improved corporate IQ.

  • chapter (Free Access)

    Goodness of machine learning models

    In this context, a model is an algorithm or procedure that is applied to data, resulting in a functional relation τ between an “input space” X and an “output space” Y. In this short paper, we delineate objective criteria that help to disambiguate and rate models’ credibility. We define the pertinent concepts and voice an opinion on the matter of good versus bad versus so–so models.
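The abstract's notion of a model as a functional relation τ: X → Y invites a concrete reading: a candidate τ can be rated by its average loss on held-out data. A minimal sketch; the empirical-risk criterion and the toy models below are illustrative assumptions of this sketch, not the paper's own criteria:

```python
def empirical_risk(tau, data, loss=lambda y, yhat: (y - yhat) ** 2):
    """Rate a model tau: X -> Y by its average loss over held-out pairs (x, y)."""
    return sum(loss(y, tau(x)) for x, y in data) / len(data)

# Toy example: two hypothetical candidate models for data generated by y = 2x.
data = [(x, 2 * x) for x in range(10)]
good = lambda x: 2 * x          # matches the generating relation exactly
soso = lambda x: 2 * x + 1      # systematically biased by +1
```

Under squared loss, `good` scores 0 and `soso` scores 1, so the held-out risk orders the candidates from good to so-so.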

  • chapter (No Access)

    Chapter 4: Cooperation Among Tumor Cell Subpopulations Leads to Intratumor Heterogeneity

    Heterogeneity is a hallmark of all cancers. Tumor heterogeneity is found at different levels — interpatient, intrapatient, and intratumor heterogeneity. All of them pose challenges for clinical treatment. The latter two scenarios can also increase the risk of developing drug resistance. Although the existence of tumor heterogeneity has been known for two centuries, a clear understanding of its origin is still elusive, especially at the level of intratumor heterogeneity (ITH). The coexistence of different subpopulations within a single tumor has been shown to play crucial roles during all stages of carcinogenesis. Here, using concepts from evolutionary game theory and the public goods game, often invoked in the context of the tragedy of the commons, we explore how the interactions among subclone populations influence the establishment of ITH. By using an evolutionary model, which unifies several experimental results in distinct cancer types, we develop quantitative theoretical models for explaining data from in vitro experiments involving pancreatic cancer as well as in vivo data in glioblastoma multiforme. Such physical and mathematical models complement experimental studies, and could provide new ideas for the design of efficacious therapies for cancer patients.

  • chapter (No Access)

    Chapter 13: The Evolution of Leadership Theories: A Literature Review

    Leadership theories have evolved over the decades, reflecting the complex and multifaceted nature of leadership. This narrative literature review analyzes significant leadership literature in popular leadership course books and in published peer-reviewed English-language articles, from which future research directions can be extrapolated. The various developmental stages of leadership research, which hold different research value, are examined and discussed. Since some emerging theories appear to be understudied (shared/distributive leadership, the complexity theory of leadership, etc.), there is arguably much scope and opportunity for these to be developed in the future.

  • chapter (No Access)

    Chapter 11: Leading Quality Management Transformation in Complexity Through Adaptive Space and Metaphors

    Most organizations experience a tension between producing and innovating. With increasing external pressures, e.g. those driven by fast-paced technological innovation and connectivity in the 4th industrial revolution, organizational tension and complexity are increasing rapidly. This poses challenges concerning how quality management (QM) will be carried out in the future. To handle higher complexity and organizational adaptability, QM needs to master enabling leadership that creates adaptive spaces, bridging the entrepreneurial and operational systems in organizations. Guiding images such as metaphors have been suggested to support transformative change, but have received only limited research attention. This paper investigates the role of guiding images as a facilitating tool for leading organizational adaptability and contributes to QM leadership practices in the 4th industrial revolution.

    The paper studied three case organizations where management teams needed to facilitate organizational adaptability in complexity and used a generative image to support the leadership process. The researchers gained access to key meetings to study the processes through observations that were later analyzed with the use of storyboards.

    The study shows how the enabling leadership processes can be facilitated by guiding images. The guiding images can be generated while managers experience tension and can help support the leadership processes of conflicting and connecting. Three types of guiding images were identified.

    The paper contributes practical knowledge about how to support the enabling leadership process with guiding images and contributes to QM by introducing complex adaptive leadership theory.

  • chapter (No Access)

    Chapter 12: FinTech Firms and the Exploration and Exploitation of Financial Landscapes

    A problem is said to be complex if it comprises many parts or dimensions with various interactions. In such a case, it is reasonable to assume that the problem admits a multidimensional solution space, which comprises all candidate solutions. Different solutions can have different goodness scores, which can be called fitness. Innovators and problem solvers interested in solving a complex problem are assumed to be located somewhere in the solution landscape. Ideally, a searching agent would be interested in finding the solution with the highest fitness possible, and in doing so, would employ a search heuristic. Financial problems such as investment, payment, risk management, and fraud detection are all examples of complex problems that have complex landscapes of solutions. In searching for solutions to financial problems, firms often engage in financial innovation. A particular type of financial innovation is developed by FinTech start-ups, which introduce innovations in backend and customer-facing financial services. FinTech firms are thought to follow a unique approach to change characterized by agility and an unconstrained search for new solutions. So far, little has been done to understand the nature of the search behavior of FinTech firms; this, therefore, is still an open gap in the literature that requires further investigation. In this theoretical chapter, I first propose to use the theory of fitness landscapes to model the solution space of financial problems, and second, I adopt an open innovation perspective to model the search heuristics of FinTech firms that look for solutions in such landscapes.
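Search on fitness landscapes of this kind is commonly formalized with Kauffman's NK model, where each of N binary design choices contributes a fitness that depends on K other choices, and ruggedness grows with K. Below is a sketch of purely local (one-bit-flip) hill climbing on such a landscape; the NK formalization and all parameters are stand-ins chosen for this sketch, not the chapter's own model:

```python
import random

def make_nk(N, K, seed=0):
    """Random NK landscape: locus i's contribution depends on itself and K
    random other loci; contribution tables are filled lazily."""
    rng = random.Random(seed)
    nbrs = [tuple(rng.sample([j for j in range(N) if j != i], K)) for i in range(N)]
    tables = [{} for _ in range(N)]
    def fitness(g):
        total = 0.0
        for i in range(N):
            key = (g[i],) + tuple(g[j] for j in nbrs[i])
            if key not in tables[i]:
                tables[i][key] = rng.random()   # contribution drawn once, then cached
            total += tables[i][key]
        return total / N
    return fitness

def hill_climb(fitness, N, seed=1):
    """Local search: accept any one-bit flip that strictly improves fitness,
    until no single flip helps (a local optimum of the landscape)."""
    rng = random.Random(seed)
    g = [rng.randint(0, 1) for _ in range(N)]
    improved = True
    while improved:
        improved = False
        for i in range(N):
            g2 = g[:]
            g2[i] ^= 1
            if fitness(g2) > fitness(g):
                g, improved = g2, True
    return g, fitness(g)
```

With larger K the landscape grows more rugged and local search stalls on poorer optima, which is one way to read the contrast between constrained incumbent search and the broader search attributed to FinTech entrants.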

  • chapter (Free Access)

    Chapter 1: True Storytelling — A Philosophy of Life and an Ethical Change-management Method

    How can we better implement and create commitment for sustainable changes in (for instance) municipalities, help start-ups create stronger teams, and handle complexity as leaders or consultants without stress and burnout? This chapter explores how we can use storytelling to create commitment to sustainable and ethical changes, and to navigate situations of high complexity. We believe the change-management method True Storytelling has some answers. True Storytelling is more than a change-management method; it is a philosophy inspired by indigenous peoples’ storytelling, quantum physics, and the arts. This text introduces the seven principles and seven processes of True Storytelling, its theory, and some of the tools used in the Foundation Ethics Module at the True Storytelling Institute. After reading the chapter you will have a basic understanding of the concept.

  • chapter (No Access)

    Chapter 12: Being in the Fishbowl: A Model for Complexity-Informed Emotional Intelligence in Open-Plan Office Spaces

    This chapter examines a complexity-inspired model for multilevel analysis of situations. Originally termed relational introspection (Wakefield, 2012), other adaptations include fractal relationality (Boje & Henderson, 2015) and the self–others–situation (SOS) model described herein. I then adapt it for use in open-plan office environments, where workers are able to observe one another’s every move. This adaptation leverages storytelling in the form of a self-narrative, as described by Chang (2016), to explore unfolding patterns of behaviors and perceptions, ultimately adding an emotional intelligence component in the interest of encouraging constructive patterns in settings where one’s own influence may not necessarily take center stage.

  • chapter (No Access)

    Chapter 2: Enabling a Turn Toward Relational Process Ontologies via Grounded Theory: Creating Theories that Perform Better Organizations

    We encourage fully enacting grounded theory (GT) to help enable organizational science to make a turn toward relational process ontologies (RPOs). RPOs unify practice and philosophy and enable more positive aspirations for organizational futures. How researchers enact GT has changed over three waves, which we explore in depth in terms of theoretical mindset and practice. GT is presently inadequate for the complex theorization RPOs require; therefore, we need a fourth wave of GT. Using an RPO of theory-as-historical, we guide the development of a fourth wave of GT. Theory-as-historical sees first- and second-order codes as serving different historical aims, so that second-order codes do not have to build on first-order codes. We discuss the fourth wave GT’s implications for new methods, questions, forms of knowledge, and insights, which enable organizational science to create theories that perform better organizations.

  • chapter (No Access)

    ENCOUNTERING COMPLEXITY: IN NEED FOR A SELF-REFLECTING (PRE)EPISTEMOLOGY

    We have recently started to understand that fundamental aspects of complex systems such as emergence, the measurement problem, inherent uncertainty, complex causality in connection with unpredictable determinism, time-irreversibility and non-locality all highlight the observer’s participatory role in determining their workings. In addition, the principle of ‘limited universality’ in complex systems, which prompts us to ‘search for the appropriate level of description in which unification and universality can be expected’, looks like a version of Bohr’s ‘complementarity principle’. It is more or less certain that the different levels of description possible of a complex whole — actually partial objectifications — are projected on to and even redefine its constituent parts. Thus it is interesting that these fundamental complexity issues don’t just bear a formal resemblance to, but reveal a profound connection with, quantum mechanics. Indeed, they point to a common origin on a deeper level of description.

  • chapter (No Access)

    THE CHATTON-OCKHAM STRATEGY; AN ALTERNATIVE TO THE SIMPLICITY PRINCIPLE

    Modern science has been thoroughly influenced by the centuries-old Simplicity Principle, also known as Ockham’s Razor. This principle is usually formulated as “entities must not be multiplied without necessity”. The main problem with this formulation is that the necessity or redundancy of an entity (e.g. a concept, hypothesis, law, rule, an explanatory element) cannot always be compellingly demonstrated. This means that, certainly within an empiristic, positivistic or materialistic worldview, the use of Ockham’s Razor easily tends towards unjustified reductionism. However, ontologically or epistemologically, the Simplicity Principle can no longer be justified. The Simplicity Principle does not provide a sufficient argument to reject “entities” as irrelevant or superfluous. Moreover, a reductionistic conception of science cannot contribute to tackling issues concerning ultimate values, meanings of life, metaphysics, aesthetics, religion and several aspects of practical life, such as counselling, morals, politics and jurisdiction. Therefore, this article proposes an alternative principle that I have called the Chatton–Ockham Strategy, which is an integration of Chatton’s anti-Razor and Ockham’s Razor and deals with the complexity–simplicity polarity.

  • chapter (No Access)

    REFINEMENT AND COMPLEXITY IN THE EVOLUTION OF CHESS

    The history of chess and its past variants is examined based on a new measure of game refinement and complexity. The history of chess variants can be divided into three stages: the origin, with moderate complexity and refinement; many diverging variants, with high complexity and low refinement; and the modern sophisticated chess, with low complexity and high refinement.
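The abstract does not spell out the measure. In Iida's game-refinement theory, on which this line of work builds, the refinement value is usually taken as R = sqrt(B)/D for average branching factor B and average game length D; treat that formula and the numbers below as assumptions of this sketch rather than the chapter's definitions:

```python
import math

def refinement(branching, length):
    """Game refinement value R = sqrt(B) / D (assumed form; B = average
    branching factor, D = average game length in moves)."""
    return math.sqrt(branching) / length

# Commonly quoted rough estimates for modern chess: B ~ 35, D ~ 80,
# giving R ~ 0.074; values near 0.07-0.08 are often reported for refined games.
```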

  • chapter (No Access)

    SEEING THE TREES AND THEIR BRANCHES IN THE NETWORK IS HARD

    Phylogenetic networks are a restricted class of directed acyclic graphs that model evolutionary histories in the presence of reticulate evolutionary events, such as horizontal gene transfer, hybrid speciation, and recombination. Characterizing a phylogenetic network as a collection of trees and their branches has long been the basis for several methods of reconstructing and evaluating phylogenetic networks. Further, these characterizations have been used to understand molecular sequence evolution on phylogenetic networks.

    In this paper, we address theoretical questions with regard to phylogenetic networks, their characterizations, and sequence evolution on them. In particular, we prove that the problem of deciding whether a given tree is contained inside a network is NP-complete. Further, we prove that the problem of deciding whether a branch of a given tree is also a branch of a given network is polynomially equivalent to that of deciding whether the evolution of a molecular character (site) on a network is governed by the infinite site model. Exploiting this equivalence, we establish the NP-completeness of both problems, and provide a parameterized algorithm that runs in time O(2^{k/2} n^2), where n is the total number of nodes and k is the number of recombination nodes in the network, which significantly improves upon the trivial brute-force O(2^k n)-time algorithm for the problem. This reduction in time is significant, particularly when analyzing recombination hotspots.
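The brute-force side of the containment problem is easy to make concrete: a network with k reticulation nodes displays at most 2^k trees, one per way of keeping a single incoming edge at each reticulation. The sketch below simply enumerates those switchings and compares leaf-cluster sets; the edge-list representation, the single-root assumption, and the cluster-equality test (degree-2 suppression only collapses duplicate clusters) are simplifications of this sketch, not the paper's algorithm:

```python
from itertools import product

def clusters(edges, leaves):
    """Leaf clusters (as frozensets) of the nodes reachable from the root of a
    rooted DAG given as a list of directed edges. Assumes a single root."""
    children, nodes = {}, set()
    for u, v in edges:
        children.setdefault(u, []).append(v)
        nodes |= {u, v}
    root = next(iter(nodes - {v for _, v in edges}))
    memo = {}
    def below(v):
        if v not in memo:
            memo[v] = (frozenset([v]) if v in leaves
                       else frozenset().union(*(below(c) for c in children.get(v, []))))
        return memo[v]
    seen, stack, out = set(), [root], set()
    while stack:
        v = stack.pop()
        if v not in seen:
            seen.add(v)
            out.add(below(v))
            stack.extend(children.get(v, []))
    return out

def tree_in_network(net_edges, tree_edges, leaves):
    """Brute-force containment test: try all 2^k ways of keeping one incoming
    edge per reticulation node and compare cluster sets with the tree's."""
    indeg = {}
    for e in net_edges:
        indeg.setdefault(e[1], []).append(e)
    bundles = [es for es in indeg.values() if len(es) > 1]  # reticulation in-edges
    target = clusters(tree_edges, leaves)
    for keep in product(*bundles):       # one kept in-edge per reticulation
        removed = {e for es in bundles for e in es} - set(keep)
        if clusters([e for e in net_edges if e not in removed], leaves) == target:
            return True
    return False
```

On a small network with one reticulation h reachable from two parents, this reports that ((a,b),c) and (a,(b,c)) are displayed while ((a,c),b) is not; the paper's point is that no such enumeration scales, since the problem is NP-complete.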

  • chapter (No Access)

    MONITORING THE DEPTH OF ANAESTHESIA USING FRACTAL COMPLEXITY METHOD

    Complexus Mundi, 01 Jan 2006

    We propose a simple and effective method of characterizing complexity of EEG-signals for monitoring the depth of anaesthesia using Higuchi's fractal dimension method. We demonstrate that the proposed method may compete with the widely used BIS monitoring method.
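Higuchi's method estimates a fractal dimension directly from the time series: a normalized curve length L(k) is computed at coarse-graining scales k, and the dimension is the slope of log L(k) versus log(1/k). A minimal sketch, not the authors' implementation; the choice of k_max and the least-squares fit are assumptions of this sketch:

```python
import numpy as np

def higuchi_fd(x, k_max=8):
    """Higuchi fractal dimension of a 1-D signal: slope of log L(k) vs log(1/k)."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    ks = np.arange(1, k_max + 1)
    L = []
    for k in ks:
        lengths = []
        for m in range(k):                 # k subsampled curves x[m], x[m+k], ...
            idx = np.arange(m, N, k)
            n = len(idx) - 1
            if n < 1:
                continue
            # normalized curve length of the m-th subsample at scale k
            lengths.append(np.abs(np.diff(x[idx])).sum() * (N - 1) / (n * k) / k)
        L.append(np.mean(lengths))
    slope, _ = np.polyfit(np.log(1.0 / ks), np.log(L), 1)
    return slope
```

A straight line yields a dimension near 1 and white noise approaches 2; EEG complexity during anaesthesia is expected to fall between those extremes, which is what makes the dimension usable as a depth index.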

  • chapter (No Access)

    COMPLEXITY ISSUES IN DYNAMIC GEOMETRY

    This article deals with the intrinsic complexity of tracing and reachability questions in the context of elementary geometric constructions. We consider constructions from elementary geometry as dynamic entities: while the free points of a construction perform a continuous motion, the dependent points should move consistently and continuously. We focus on constructions that are entirely built up from join, meet and angular bisector operations. In particular, the last operation introduces an intrinsic ambiguity: two intersecting lines have two different angular bisectors. Under the requirement of continuity it is a fundamental algorithmic problem to resolve this ambiguity properly during motions of the free elements.

    After formalizing this intuitive setup we prove the following main results of this article:

    • It is NP-hard to trace the dependent elements in such a construction.

    • It is NP-hard to decide whether two instances of the same construction lie in the same component of the configuration space.

    • The last problem becomes PSPACE-hard if we allow one additional sidedness test which has to be satisfied during the entire motion.

    On the one hand the results have practical relevance for the implementations of Dynamic Geometry Systems. On the other hand the results can be interpreted as statements concerning the intrinsic complexity of analytic continuation.
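The bisector ambiguity is easy to state concretely: two lines with direction angles t1 and t2 have bisector directions (t1 + t2)/2 and (t1 + t2)/2 + π/2, both taken mod π, and a tracing strategy must pick one consistently along a motion. Below is a toy continuity heuristic, an illustrative sketch only; the article's hardness results imply that no cheap local rule of this kind can resolve the ambiguity correctly in all configurations:

```python
import math

def bisector_dirs(t1, t2):
    """The two angular-bisector directions (mod pi) of lines with direction
    angles t1 and t2."""
    b = ((t1 + t2) / 2) % math.pi
    return b, (b + math.pi / 2) % math.pi

def trace_step(prev, t1, t2):
    """Continuity heuristic: of the two bisectors, keep the one closest to the
    previous choice, measuring distance between undirected directions (mod pi)."""
    def dist(a, b):
        d = abs(a - b) % math.pi
        return min(d, math.pi - d)
    return min(bisector_dirs(t1, t2), key=lambda b: dist(b, prev))
```

For two lines rotating symmetrically about the x-axis, the heuristic keeps following the horizontal bisector rather than jumping to the perpendicular one, which is exactly the consistent-branch behavior a Dynamic Geometry System needs at each motion step.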

  • chapter (No Access)

    MODELING COMPLEXITY USING HIERARCHICAL MULTI-AGENT SYSTEMS

    Existing approaches to modeling natural systems are inadequate for exhibiting all the properties of complex phenomena. Current models in the field of complex systems, such as cellular automata, are straightforward to understand and give interesting theoretical results, but they are not very pertinent for real case studies. We show that hierarchical multi-agent modeling provides an interesting alternative for modeling complex systems through two case examples: a virtual ecosystem for studying species dynamics and the simulation of gravitational structures in cosmology.

  • chapter (No Access)

    STUDYING COMPLEX STELLAR DYNAMICS USING A HIERARCHICAL MULTI-AGENT MODEL

    This paper defines a new approach to cosmological simulation based on complex systems theory: a hierarchical multi-agent system is used to study stellar dynamics. At each level of the model, global behavior emerges from agent interactions. The presented model uses physically based laws and agent interactions to present stellar structures as the result of self-organization. Nevertheless, a strong bond with cosmology is kept by showing the capacity of the model to exhibit structures close to those of the observable universe.

  • chapter (No Access)

    SOME REMARKS ON THE FOUNDATIONS OF NUMERICAL ANALYSIS

    The problem of increasing our understanding of algorithms by considering the foundations of numerical analysis and computer science is addressed. The schism between scientific computing and computer science is discussed from a theoretical perspective. These theoretical considerations have intellectual importance for how we view the computer. In particular, the legitimacy and importance of models of machines that accept real numbers are considered.

  • chapter (No Access)

    COMPLEXITY OF BEZOUT'S THEOREM IV: PROBABILITY OF SUCCESS; EXTENSIONS

    We estimate the probability that a given number of projective Newton steps applied to a linear homotopy of a system of n homogeneous polynomial equations in n + 1 complex variables of fixed degrees will find all the roots of the system. We also extend the framework of our analysis to cover the classical implicit function theorem and revisit the condition number in this context. Further complexity theory is developed.
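The objects in this abstract have a simple one-variable caricature: deform a start system g with a known root into the target f along the linear homotopy H_t = (1 - t)g + tf, applying a few Newton corrections at each step. The sketch below is a univariate affine toy; the paper works with projective Newton steps on systems of n homogeneous equations, so only the homotopy-plus-Newton structure carries over:

```python
def newton_homotopy(f, df, g, dg, z0, steps=50, newton_iters=3):
    """Track a root of H_t(z) = (1 - t)*g(z) + t*f(z) from z0 (a root of g)
    to a root of f, applying a few affine Newton corrections per step."""
    z = z0
    for s in range(1, steps + 1):
        t = s / steps
        for _ in range(newton_iters):
            H = (1 - t) * g(z) + t * f(z)
            dH = (1 - t) * dg(z) + t * df(z)
            z -= H / dH            # one Newton correction toward the root of H_t
    return z
```

Tracking the root z0 = 1 of g(z) = z^2 - 1 into f(z) = z^2 - 2 lands on sqrt(2). The quantity the paper estimates is, roughly, the probability that a fixed budget of such steps suffices to reach every root, which depends on the conditioning of the systems along the path.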

  • chapter (No Access)

    BIO-PHYSICS MANIFESTO — FOR THE FUTURE OF PHYSICS AND BIOLOGY

    The Newtonian revolution taught us how to dissect phenomena into contingencies (e.g., initial conditions) and fundamental laws (e.g., equations of motion). Since then, ‘fundamental physics’ has been pursuing purer and leaner fundamental laws. Consequently, to explain real phenomena many auxiliary conditions are required. Isn't it now time to start studying ‘auxiliary conditions’ seriously?

    The study of biological systems may shed light on this neglected side of phenomena in physics, because we organisms were constructed by our parents, who supplied indispensable auxiliary conditions; we never self-organize. Thus, studying systems lacking self-organizing capability (such as complex systems) may indicate new directions for physics and biology (biophysics).

    There have been attempts to construct a ‘general theoretical framework’ of biology, but most of them never seriously looked at the actual biological world. Every serious natural science must start with establishing a phenomenological framework. Therefore, this must be the main part of bio-physics. However, this article is addressed mainly to theoretical physicists and discusses only certain theoretical aspects (with real illustrative examples).