Human and social capital developments are discussed in the context of increasing corporate IQ, defined as distributed intelligence (DI) in firms, as the basis of improved economic rent generation. A review of complexity science shows that adaptive tension dynamics (energy differentials) may be used to foster adaptively efficacious DI appreciation. The optimal region for rapidly improving adaptive fitness lies between the first and second critical values of adaptive tension, where emergent self-organization occurs. Below the first value there is little change; above the second value the system becomes chaotic and dysfunctional. Twelve "simple rules" drawn from complexity science are defined; they are available to rent-seeking CEOs wishing to create improved corporate IQ.
Heterogeneity is a hallmark of all cancers. Tumor heterogeneity is found at different levels — interpatient, intrapatient, and intratumor heterogeneity. All of them pose challenges for clinical treatment. The latter two scenarios can also increase the risk of developing drug resistance. Although the existence of tumor heterogeneity has been known for two centuries, a clear understanding of its origin is still elusive, especially at the level of intratumor heterogeneity (ITH). The coexistence of different subpopulations within a single tumor has been shown to play crucial roles during all stages of carcinogenesis. Here, using concepts from evolutionary game theory and the public goods game, often invoked in the context of the tragedy of the commons, we explore how interactions among subclonal populations influence the establishment of ITH. Using an evolutionary model that unifies several experimental results across distinct cancer types, we develop quantitative theoretical models that explain data from in vitro experiments on pancreatic cancer as well as in vivo data on glioblastoma multiforme. Such physical and mathematical models complement experimental studies and may provide new ideas for the design of efficacious therapies for cancer patients.
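As a hedged illustration of the kind of model the abstract above invokes (a public goods game among tumor subclones, not the authors' specific equations), the Python sketch below runs discrete-time replicator dynamics for producer and free-rider subclones; the synergy r, cost c, and group size N are illustrative parameters.

# Minimal sketch (not the paper's exact model): replicator dynamics for a
# producer/free-rider public goods game between tumor subclones.  Producers
# secrete a costly growth factor whose benefit is shared by the whole group.
import numpy as np

def payoffs(x, r=3.0, c=1.0, N=5):
    """Expected payoffs to producers and free riders when a fraction x of the
    population produces the public good (linear public goods game)."""
    k = (N - 1) * x                          # expected producers among N-1 co-players
    benefit_free = r * c * k / N             # share of the good without contributing
    benefit_prod = r * c * (k + 1) / N - c   # own contribution included, minus cost
    return benefit_prod, benefit_free

def replicator(x0=0.5, steps=500, dt=0.05, **kw):
    """Discrete-time replicator dynamics for the producer fraction x."""
    x = x0
    traj = [x]
    for _ in range(steps):
        f_p, f_f = payoffs(x, **kw)
        f_bar = x * f_p + (1 - x) * f_f
        x += dt * x * (f_p - f_bar)          # replicator update
        traj.append(x)
    return np.array(traj)

if __name__ == "__main__":
    for r in (2.0, 6.0):                     # low vs. high synergy of the public good
        traj = replicator(r=r)
        print(f"r={r}: producer fraction {traj[0]:.2f} -> {traj[-1]:.2f}")

With synergy below the group size, producers are driven out, a tragedy-of-the-commons outcome; with sufficiently high synergy they take over. In such linear games one type always wins, and models with nonlinear benefits are typically what yield the stable coexistence of subclones that defines ITH.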
Most organizations experience the tension between producing and innovating. With increasing external pressures, e.g. driven by fast-paced technological innovation and connectivity in the 4th industrial revolution, organizational tension and complexity are increasing rapidly. This poses challenges concerning how quality management (QM) will be carried out in the future. In order to handle higher complexity and organizational adaptability, QM needs to master enabling leadership that creates adaptive spaces, bridging the entrepreneurial and operational systems in organizations. Guiding images such as metaphors have been suggested to support transformative change, but have received only limited research attention. This paper investigates the role of guiding images as a facilitating tool for leading organizational adaptability and contributes to QM leadership practices in the 4th industrial revolution.
The paper studied three case organizations in which management teams needed to facilitate organizational adaptability under conditions of complexity and used a generative image to support the leadership process. The researchers gained access to key meetings to study the processes through observations that were later analyzed using storyboards.
The study shows how the enabling leadership processes can be facilitated by guiding images. The guiding images can be generated while managers experience tension and can help support the leadership processes of conflicting and connecting. Three types of guiding images were identified.
The paper contributes practical knowledge about how to support the enabling leadership process with guiding images and contributes to QM by introducing complex adaptive leadership theory.
How can we become better at implementing and creating commitment for sustainable changes in (for instance) municipalities, help start-ups create stronger teams, and handle complexity as a leader or consultant without suffering stress and burnout? This chapter explores how we can use storytelling to create commitment to sustainable and ethical changes and to navigate situations of high complexity. We believe the change-management method True Storytelling has some answers. True Storytelling is more than a change-management method; it is a philosophy inspired by indigenous peoples’ storytelling, quantum physics and the arts. This text introduces the seven principles and seven processes of True Storytelling, its theory and some of the tools used in the Foundation Ethics Module at the True Storytelling Institute. After reading the chapter you will have a basic understanding of the concept.
This chapter examines a complexity-inspired model for multilevel analysis of situations. Originally termed relational introspection (Wakefield, 2012), other adaptations include fractal relationality (Boje & Henderson, 2015) and the self–others–situation (SOS) model described herein. I then adapt it for use in open-plan office environments, where workers are able to observe one another’s every move. This adaptation leverages storytelling in the form of a self-narrative, as described by Chang (2016), to explore unfolding patterns of behaviors and perceptions, ultimately adding an emotional intelligence component in the interest of encouraging constructive patterns in settings where one’s own influence may not necessarily take center stage.
We encourage fully enacting grounded theory (GT) to help enable organizational science to make a turn toward relational process ontologies (RPOs). RPOs unify practice and philosophy and enable more positive aspirations for organizational futures. How researchers enact GT has changed over three waves, waves which we explore in depth in terms of theoretical mindset and practice. GT is presently inadequate for the complex theorization RPOs require; therefore, we need a fourth wave of GT. Using an RPO of theory-as-historical, we guide the development of a fourth wave of GT. Theory-as-historical sees first- and second-order codes as serving different historical aims, thus second-order codes do not have to build on first-order codes. We discuss the fourth wave GT’s implications for new methods, questions, forms of knowledge, and insights, which enable organizational science to create theories that perform better organizations.
We have recently started to understand that fundamental aspects of complex systems such as emergence, the measurement problem, inherent uncertainty, complex causality in connection with unpredictable determinism, time-irreversibility and non-locality all highlight the observer’s participatory role in determining their workings. In addition, the principle of ‘limited universality’ in complex systems, which prompts us to ‘search for the appropriate level of description in which unification and universality can be expected’, looks like a version of Bohr’s ‘complementarity principle’. It is more or less certain that the different levels of description possible of a complex whole — actually partial objectifications — are projected on to and even redefine its constituent parts. Thus it is interesting that these fundamental complexity issues don’t just bear a formal resemblance to, but reveal a profound connection with, quantum mechanics. Indeed, they point to a common origin on a deeper level of description.
Modern science has been thoroughly influenced by the centuries-old Simplicity Principle, also known as Ockham’s Razor. This principle is usually formulated as “entities must not be multiplied without necessity”. The main problem with this formulation is that the necessity or redundancy of an entity (e.g. a concept, hypothesis, law, rule, an explanatory element) cannot always be compellingly demonstrated. This means that, certainly within an empiristic, positivistic or materialistic worldview, the use of Ockham’s Razor easily tends towards unjustified reductionism. However, ontologically or epistemologically, the Simplicity Principle can no longer be justified. The Simplicity Principle does not provide a sufficient argument to reject “entities” as irrelevant or superfluous. Moreover, a reductionistic conception of science cannot contribute to tackling issues concerning ultimate values, meanings of life, metaphysics, aesthetics, religion and several aspects of practical life, such as counselling, morals, politics and jurisdiction. Therefore, this article proposes an alternative principle that I have called the Chatton–Ockham Strategy, which is an integration of Chatton’s anti-Razor and Ockham’s Razor and deals with the complexity–simplicity polarity.
The history of chess and its variants is examined based on a new measure of game refinement and complexity. The history of chess variants can be divided into three stages: the origin, with moderate complexity and refinement; many diverging variants, with high complexity and low refinement; and modern sophisticated chess, with low complexity and high refinement.
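For context, a commonly cited game-refinement measure in this literature relates the average number of possible moves (branching factor) B to the average game length D; whether the paper's new measure takes exactly this form is an assumption here, not something stated in the abstract:

R = \sqrt{B} / D

Sophisticated, well-balanced games are reported to cluster in a narrow band of R, which is the sense in which modern chess is described as combining low complexity with high refinement.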
Phylogenetic networks are a restricted class of directed acyclic graphs that model evolutionary histories in the presence of reticulate evolutionary events, such as horizontal gene transfer, hybrid speciation, and recombination. Characterizing a phylogenetic network as a collection of trees and their branches has long been the basis for several methods of reconstructing and evaluating phylogenetic networks. Further, these characterizations have been used to understand molecular sequence evolution on phylogenetic networks.
In this paper, we address theoretical questions with regard to phylogenetic networks, their characterizations, and sequence evolution on them. In particular, we prove that the problem of deciding whether a given tree is contained inside a network is NP-complete. Further, we prove that the problem of deciding whether a branch of a given tree is also a branch of a given network is polynomially equivalent to that of deciding whether the evolution of a molecular character (site) on a network is governed by the infinite site model. Exploiting this equivalence, we establish the NP-completeness of both problems, and provide a parameterized algorithm that runs in time O(2^(k/2) n^2), where n is the total number of nodes and k is the number of recombination nodes in the network, which significantly improves upon the trivial brute-force O(2^k n) time algorithm for the problem. This reduction in time is significant, particularly when analyzing recombination hotspots.
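As a hedged illustration of the problem setting (in the spirit of the trivial brute-force O(2^k n) approach the abstract mentions, not the paper's improved parameterized algorithm), the Python sketch below decides tree containment by trying every way of keeping one incoming edge per reticulation node; the graph encoding and the toy example are ours.

# Naive brute-force sketch of Tree Containment on a phylogenetic network:
# enumerate all ways of keeping one parent per reticulation node and compare
# the taxa clusters of the resulting tree with those of the query tree.
from itertools import product

def taxa_clusters(children, root, taxa):
    """Set of non-empty taxa clusters (frozensets of leaf labels) below each node."""
    out = set()
    def below(v):
        if v in taxa:
            s = frozenset([v])
        else:
            kids = children.get(v, [])
            s = frozenset().union(*[below(c) for c in kids]) if kids else frozenset()
        if s:
            out.add(s)
        return s
    below(root)
    return out

def network_displays_tree(parents, root, tree_children, tree_root, taxa):
    """Try every 'switching' of the reticulation nodes (2^k of them if binary)."""
    retics = [v for v, ps in parents.items() if len(ps) > 1]
    target = taxa_clusters(tree_children, tree_root, taxa)
    for choice in product(*[parents[r] for r in retics]):
        kept = dict(parents)
        for r, p in zip(retics, choice):
            kept[r] = [p]                        # keep exactly one incoming edge
        children = {}
        for v, ps in kept.items():
            for p in ps:
                children.setdefault(p, []).append(v)
        if taxa_clusters(children, root, taxa) == target:
            return True
    return False

if __name__ == "__main__":
    # Toy network with one reticulation node 'r' whose parents are 'a' and 'b'.
    parents = {'a': ['rho'], 'b': ['rho'], 'x': ['a'],
               'r': ['a', 'b'], 'y': ['b'], 'z': ['r']}
    tree = {'t': ['u', 'y'], 'u': ['x', 'z']}    # the tree ((x,z),y)
    print(network_displays_tree(parents, 'rho', tree, 't', {'x', 'y', 'z'}))  # True

For trees on a fixed taxon set, the set of taxa clusters determines the tree once degree-2 nodes are suppressed, which is why the cluster-set comparison can stand in for an explicit isomorphism test in this sketch.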
We propose a simple and effective method of characterizing complexity of EEG-signals for monitoring the depth of anaesthesia using Higuchi's fractal dimension method. We demonstrate that the proposed method may compete with the widely used BIS monitoring method.
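Since the abstract names Higuchi's fractal dimension as the complexity measure, a minimal sketch of that estimator is given below; kmax and the synthetic test signals are illustrative choices, not the study's clinical settings.

# Minimal sketch of Higuchi's fractal dimension (HFD) for a 1-D signal.
import numpy as np

def higuchi_fd(x, kmax=8):
    """Estimate the fractal dimension of a time series via Higuchi's method."""
    x = np.asarray(x, dtype=float)
    N = x.size
    Lk = []
    for k in range(1, kmax + 1):
        Lmk = []
        for m in range(k):                        # offset m = 0 .. k-1
            idx = np.arange(m, N, k)              # subsampled curve X[m], X[m+k], ...
            n_seg = idx.size - 1
            if n_seg < 1:
                continue
            length = np.abs(np.diff(x[idx])).sum()
            norm = (N - 1) / (n_seg * k)          # Higuchi's length normalisation
            Lmk.append(length * norm / k)
        Lk.append(np.mean(Lmk))
    # Fractal dimension = slope of log(L(k)) versus log(1/k).
    k_vals = np.arange(1, kmax + 1)
    slope, _ = np.polyfit(np.log(1.0 / k_vals), np.log(Lk), 1)
    return slope

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    white = rng.standard_normal(2000)                  # irregular signal
    smooth = np.sin(np.linspace(0, 8 * np.pi, 2000))   # smooth signal
    print(higuchi_fd(white), higuchi_fd(smooth))

On synthetic data the estimate behaves as expected, roughly 2 for white noise and close to 1 for a smooth sinusoid, which is the kind of contrast exploited when tracking changes in EEG complexity with depth of anaesthesia.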
This article deals with the intrinsic complexity of tracing and reachability questions in the context of elementary geometric constructions. We consider constructions from elementary geometry as dynamic entities: while the free points of a construction perform a continuous motion the dependent points should move consistently and continuously. We focus on constructions that are entirely built up from join, meet and angular bisector operations. In particular the last operation introduces an intrinsic ambiguity: Two intersecting lines have two different angular bisectors. Under the requirement of continuity it is a fundamental algorithmic problem to resolve this ambiguity properly during motions of the free elements.
After formalizing this intuitive setup we prove the following main results of this article:
• It is NP-hard to trace the dependent elements in such a construction.
• It is NP-hard to decide whether two instances of the same construction lie in the same component of the configuration space.
• The last problem becomes PSPACE-hard if we allow one additional sidedness test which has to be satisfied during the entire motion.
On the one hand the results have practical relevance for the implementations of Dynamic Geometry Systems. On the other hand the results can be interpreted as statements concerning the intrinsic complexity of analytic continuation.
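To make the ambiguity concrete, the Python sketch below traces one angular bisector of two lines while one line rotates, resolving the two-fold choice at each step by a nearest-direction continuity heuristic; this is an illustrative strategy only, and the hardness results above indicate (assuming P != NP) that no such simple local rule can resolve every situation correctly, particularly near degenerate positions.

# Two intersecting lines have two angular bisectors; during a continuous motion
# a Dynamic Geometry System must keep choosing "the same" one.
import numpy as np

def bisector_directions(d1, d2):
    """Unit direction vectors of the two angular bisectors of lines with unit
    direction vectors d1 and d2 (the two bisectors are orthogonal)."""
    b1 = d1 + d2
    b2 = d1 - d2
    return b1 / np.linalg.norm(b1), b2 / np.linalg.norm(b2)

def trace_bisector(angles, start_choice=0):
    """Follow one bisector of the lines y = 0 and y = tan(theta) * x as theta
    moves through `angles`, resolving the ambiguity by continuity."""
    prev = None
    picks = []
    for theta in angles:
        d1 = np.array([1.0, 0.0])
        d2 = np.array([np.cos(theta), np.sin(theta)])
        cands = bisector_directions(d1, d2)
        if prev is None:
            pick = cands[start_choice]
        else:
            # Continuity heuristic: the candidate closest (up to sign) to the
            # previous direction; |dot| because d and -d describe the same line.
            pick = max(cands, key=lambda c: abs(np.dot(c, prev)))
        picks.append(pick)
        prev = pick
    return picks

if __name__ == "__main__":
    path = trace_bisector(np.linspace(0.3, 2.8, 50))
    print(path[0], path[-1])    # the traced bisector rotates continuously

Near degenerate positions (for instance when the two lines coincide) the candidate directions become numerically indistinguishable, which is exactly where correct tracing requires the kind of global reasoning the complexity results address.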
The problem of improving our understanding of algorithms by examining the foundations of numerical analysis and computer science is considered. The schism between scientific computing and computer science is discussed from a theoretical perspective. These theoretical considerations have intellectual importance for how we view the computer. In particular, the legitimacy and importance of models of machines that accept real numbers are considered.
We estimate the probability that a given number of projective Newton steps applied to a linear homotopy of a system of n homogeneous polynomial equations in n + 1 complex variables of fixed degrees will find all the roots of the system. We also extend the framework of our analysis to cover the classical implicit function theorem and revisit the condition number in this context. Further complexity theory is developed.
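As a hedged, much-simplified illustration of the objects involved (a univariate affine toy, not the projective, homogeneous setting or the probabilistic analysis of the abstract), the sketch below tracks the roots of a start polynomial along a linear homotopy using Newton corrector steps; the polynomials, the complex constant gamma, and the step counts are illustrative choices.

# Toy linear homotopy with Newton corrector steps (affine, univariate).
import numpy as np

f = np.array([1.0, 0.0, -2.0, 5.0])   # f(z) = z^3 - 2z + 5   (target polynomial)
g = np.array([1.0, 0.0, 0.0, -1.0])   # g(z) = z^3 - 1        (start polynomial, known roots)
gamma = np.exp(0.77j)                 # generic complex constant: for almost every
                                      # choice the paths avoid singular systems

def horner(c, z):
    """Evaluate the polynomial with coefficients c (highest degree first) at z."""
    v = 0j
    for a in c:
        v = v * z + a
    return v

def track(z, steps=200, newton_iters=5):
    """Follow one start root along H(z, t) = (1 - t) * gamma * g(z) + t * f(z)."""
    for i in range(1, steps + 1):
        t = i / steps
        h = (1 - t) * gamma * g + t * f               # coefficients of H(., t)
        dh = h[:-1] * np.arange(len(h) - 1, 0, -1)    # derivative coefficients
        for _ in range(newton_iters):                 # Newton corrector at this t
            z = z - horner(h, z) / horner(dh, z)
    return z

if __name__ == "__main__":
    start_roots = [np.exp(2j * np.pi * k / 3) for k in range(3)]   # roots of g
    for z0 in start_roots:
        z = track(z0)
        print(z, abs(horner(f, z)))                   # residual |f(z)| should be tiny

The constant gamma plays the role of the usual randomization that keeps homotopy paths away from singular intermediate systems for almost all choices; the abstract's analysis concerns how many Newton steps such a tracking needs in order to find all roots with a given probability.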
The Newtonian revolution taught us how to dissect phenomena into contingencies (e.g., initial conditions) and fundamental laws (e.g., equations of motion). Since then, ‘fundamental physics’ has been pursuing purer and leaner fundamental laws. Consequently, to explain real phenomena a lot of auxiliary conditions become required. Isn't it now the time to start studying ‘auxiliary conditions’ seriously?
The study of biological systems has a possibility of shedding light on this neglected side of phenomena in physics, because we organisms were constructed by our parents who supplied indispensable auxiliary conditions; we never self-organize. Thus, studying the systems lacking self-organizing capability (such as complex systems) may indicate new directions to physics and biology (biophysics).
There have been attempts to construct a ‘general theoretical framework’ of biology, but most of them never seriously looked at the actual biological world. Every serious natural science must start by establishing a phenomenological framework; therefore, this must be the main part of biophysics. However, this article is addressed mainly to theoretical physicists and discusses only certain theoretical aspects (with real illustrative examples).
Recalling the decomposition methodology, the complexity of the decomposition process is described. The complexity of a system is connected with the depth reached in the decomposition process. In particular, the number of subsystems and the number of active relations present in a decomposition are the elements used to define a complexity index. Considerations about decomposition sequences highlight some basic properties useful for defining the maximum values of complexity. Given some hypotheses about the relation patterns arising in the initial steps of the decomposition process, the range for each decomposition level is evaluated through computer simulations. In addition, some connections with other fields of knowledge, such as graph theory, are presented.
Music, compared to other complex forms of representation, is fundamentally characterized by constant evolution and a dynamic succession of structural reference models. This is without taking into account historical perspective, the analysis of forms and styles, or questions of a semantic nature; the observation refers rather to the phenomenology of the music system. The more abstract a compositional model, the greater the number and frequency of variables that are not assimilated to the reference structure. This "interference", which occurs more often than not in an apparently casual manner, modifies the creative process to varying but always substantial degrees: locally, it produces a disturbance in perceptive, formal and structural parameters, resulting more often than not in a synaesthetic experience; globally, it defines the terms of a transition to a new state, in which the relations between elements and components modify the behavior of the entire system from which they originated. Examples of this phenomenon can be found across the whole range of musical production, in particular in improvisation, in the use of the Basso Continuo, and in some contrapuntal works of the baroque period, music whose temporal dimension can depart from the limits of mensurability and symmetry to define an open compositional environment in continuous evolution.
Physics is a seamless dialogue with Nature that transcends the divisions made by practising physicists—often for convenience, often imposed by their own limitations. It is difficult to sustain the notion that one branch of physics can satisfactorily answer all the questions of Nature. I expand on this theme briefly, and also technically, using the example of hydrodynamic turbulence.
A major challenge for business information systems (BIS) design and management within healthcare units is the need to accommodate the complexity and changeability in terms of their clinical protocols, technology, business and administrative processes. Interoperability is the response to these demands, but there are many ways to achieve an “interoperable” information system. In this chapter we address the requirements of a healthcare interoperability framework to enhance enterprise architecture interoperability of healthcare organizations, while maintaining the organization's technical and operational environments, and installed technology. The healthcare interoperability framework is grounded on the combination of model driven architecture (MDA) and service-oriented architecture (SOA) technologies.
The main argument of this chapter is to advocate the role of the chief information officer (CIO) in dealing with the challenges posed by the need to achieve BIS grounded on interoperability at the different layers, and in developing and executing an interoperability strategy, which must be aligned with the health organization's business, administrative, and clinical processes. A case study of the Hospital de São Sebastião is described, demonstrating the critical role of the CIO in the development of an interoperable platform.
This chapter highlights the impact and manageability of the rapid, constant/continuous and unique changes taking place in humanity that are affecting the existence of individuals, as well as all categories of human organizations. It has been observed that the traditional Newtonian mindset, and its associated reductionist hypothesis and design paradigm, which have served humanity ‘well’, are manifesting their limits/constraints, vulnerability and disparities. The crux of the issue is escalating complexity density, incoherency, a greater mismatch among current thinking, principles, values, structure, dynamic, and hierarchical dominance, limited predictability, and the overall changing ‘reality’.
Vividly, order and linearity are not the only inherent attributes of humanity. Consequently, the significance, appropriateness and exploration of certain properties of complexity theory are introduced, partially to better identify, analyze, comprehend and manage the accelerating gaps of inconsistency — in particular, to nurture a new mindset. Arising from the new mindset, human organizations/systems are confirmed as intrinsic composite complex adaptive systems (composite CAS, nonlinear adaptive dynamical systems) comprising human beings/agents that are CAS. In this respect, leadership, governance, management, and strategic approaches adopted by all human organizations must be redefined.
Concurrently, a special focus on intelligence (and its associated consciousness), the first inherent strengths of all human agents, and its role as the key latent impetus/driver, is vital. This recognition indicates that a change in era is inevitable. Humanity is entering the new intelligence era — the core of the knowledge-intensive and complexity-centric period. Overall, an integrated intelligence/consciousness-centric, complexity-centric and network-centric approach is essential. It adopts a complexity-intelligence-centric path that focuses on the optimization of all intense intrinsic intelligence/consciousness sources (human thinking systems), better exploitation of the co-existence of order and complexity, and integration of networks in human organizations — (certain spaces of complexity must be better utilized, coherency of network of networks must be achieved, and preparation for punctuation point must be elevated).
The new holistic (multi-dimensional) strategy of the intelligent organization theory (IO theory) is the complexity-intelligence strategy, and the new mission focuses on the new intelligence advantage.