
  Bestsellers

  • Article (No Access)

    THE FASTEST AND SHORTEST ALGORITHM FOR ALL WELL-DEFINED PROBLEMS

    An algorithm M is described that solves any well-defined problem p as quickly as the fastest algorithm computing a solution to p, save for a factor of 5 and low-order additive terms. M optimally distributes resources between the execution of provably correct p-solving programs and an enumeration of all proofs, including relevant proofs of program correctness and of time bounds on program runtimes. M avoids Blum's speed-up theorem by ignoring programs without correctness proof. M has broader applicability and can be faster than Levin's universal search, the fastest method for inverting functions save for a large multiplicative constant. An extension of Kolmogorov complexity and two novel natural measures of function complexity are used to show that the most efficient program computing some function f is also among the shortest programs provably computing f.
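    The budget-doubling idea behind Levin's universal search, which the abstract contrasts with M, can be sketched in a few lines. This is a toy illustration, not the paper's algorithm: the candidate "programs" here are hypothetical zero-argument generator factories, and a program of description length l gets roughly 2^(phase − l) steps in each phase, so the total time is within a multiplicative constant of the fastest correct candidate.

```python
def levin_search(programs, lengths, is_solution, max_phase=32):
    """Toy Levin-style search over candidate generator programs."""
    for phase in range(max_phase):
        for prog, length in zip(programs, lengths):
            gen = prog()  # restart the candidate with a doubled budget
            for _ in range(2 ** max(phase - length, 0)):
                try:
                    out = next(gen)
                except StopIteration:
                    break
                if is_solution(out):
                    return out
    return None

# Inverting f(x) = x*x at 49 with a brute-force counting "program":
def counter():
    x = 0
    while True:
        yield x
        x += 1

print(levin_search([counter], [1], lambda x: x * x == 49))  # 7
```

Restarting each generator with an exponentially larger budget is what keeps the overhead multiplicative rather than additive in the number of candidates.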

  • Article (No Access)

    INFORMATION DISTANCE AND ITS APPLICATIONS

    We have been developing a general theory of information distance and a paradigm of applying this theory to practical problems.[3, 19, 20] There are several problems associated with this theory. On the practical side, among other problems, the strict requirement of triangle inequality is unrealistic in some applications; on the theoretical side, the universality theorems for normalized information distances were only proved in a weak form. In this paper, we will introduce a complementary theory that resolves or avoids these problems.

    This article also serves as a brief expository summary of this area. We recount how and why some of the concepts were introduced, survey recent theoretical developments, and describe interesting applications. These applications include whole genome phylogeny, plagiarism detection, document comparison, music classification, language classification, fetal heart rate tracing, question answering, and a wide range of other data mining tasks.
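    The practical form of the normalized information distance used in applications like those listed above is the normalized compression distance (NCD), in which a real compressor stands in for the uncomputable optimal one. A minimal sketch, with zlib as the (assumed) compressor:

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y))."""
    cx = len(zlib.compress(x))
    cy = len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

a = b"the quick brown fox jumps over the lazy dog " * 20
b = bytes(range(256)) * 4
print(ncd(a, a) < ncd(a, b))  # True: similar inputs score lower
```

Any reasonable lossless compressor can be substituted for zlib; stronger compressors generally give distances closer to the theoretical normalized information distance.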

  • Article (No Access)

    SETS OF K-INDEPENDENT STRINGS

    A bit string is random (in the sense of algorithmic information theory) if it is incompressible, i.e., its Kolmogorov complexity is close to its length. Two random strings are independent if knowing one of them does not simplify the description of the other, i.e., the conditional complexity of each string (using the other as a condition) is close to its length. We may define independence of a k-tuple of strings in the same way.

    In this paper we address the following question: what is the maximal cardinality of a set of n-bit strings if any k elements of this set are independent (up to a certain constant)? Lower and upper bounds that match each other (with logarithmic precision) are provided.

  • Article (No Access)

    Analysis of the Complexity Measures in the EEG of Schizophrenia Patients

    Complexity measures have been used extensively in schizophrenia patients to estimate brain dynamics. However, these studies have reported conflicting results, with both increased and reduced complexity values, depending on the patients' clinical status, symptom severity, medication, and age. The objective of this study is to investigate the nonlinear brain dynamics of chronic, medicated schizophrenia patients using distinct complexity estimators. EEG data were collected from 22 relaxed, eyes-closed patients and age-matched healthy controls. A single-trial EEG series of 2 min was partitioned into identical epochs of 20 s. The EEG complexity of the participants was investigated and compared using approximate entropy (ApEn), Shannon entropy (ShEn), Kolmogorov complexity (KC) and Lempel–Ziv complexity (LZC). Lower complexity values were obtained in schizophrenia patients. The most significant complexity differences between patients and controls were found in the left frontal (F3) and parietal (P3) regions of the brain when each complexity measure was applied individually. Notably, we found that KC was more sensitive than the other estimators for detecting the EEG complexity of patients in all investigated brain regions. Moreover, significant inter-hemispheric complexity differences were found in the frontal and parietal areas of the patients' brains. Our findings demonstrate that using sensitive complexity estimators to analyze the brain dynamics of patients might provide a useful discriminative tool for diagnostic purposes. We therefore expect nonlinear analysis to give a deeper understanding of the schizophrenic brain.
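    The LZC estimator mentioned above is typically computed by counting distinct phrases in the Lempel–Ziv (LZ76) parsing of a binarized signal (e.g., thresholded at its median). A compact sketch of the phrase-counting step, with illustrative inputs rather than EEG data:

```python
def lz_complexity(s: str) -> int:
    """Number of distinct phrases in the LZ76 parsing of a symbol string."""
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        # grow the current phrase while it already occurs in the prefix
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1
        i += l
    return c

# Regular signals parse into few phrases; irregular ones into many.
print(lz_complexity("0" * 20))   # 2
print(lz_complexity("01" * 10))  # 3
```

In EEG applications the raw count is usually normalized by n / log2(n) so that series of different lengths are comparable.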

  • Article (No Access)

    THE SECOND QUANTIZED QUANTUM TURING MACHINE AND KOLMOGOROV COMPLEXITY

    The Kolmogorov complexity of a physical state is the minimal physical resources required to reproduce that state. We define a second quantized quantum Turing machine and use it to define second quantized Kolmogorov complexity. There are two advantages to our approach — our measure of the second quantized Kolmogorov complexity is closer to physical reality and unlike other quantum Kolmogorov complexities, it is continuous. We give examples where the second quantized and quantum Kolmogorov complexity differ.

  • Article (No Access)

    KOLMOGOROV COMPLEXITY AND CHAOTIC PHENOMENON IN COMPUTING THE ENVIRONMENTAL INTERFACE TEMPERATURE

    In this paper, we consider the chaotic phenomena and Kolmogorov complexity that arise in computing the environmental interface temperature. First, the environmental interface is defined in the context of complex systems, in particular for autonomous dynamical systems. We then consider the following issues in the modeling procedure: (i) how to replace given differential equations by appropriate difference equations when modeling phenomena in the environment; (ii) whether a mathematically correct solution to the corresponding differential equation or system of equations is always physically possible; and (iii) the phenomenon of chaos in autonomous dynamical systems in environmental problems, in particular in solving the energy balance equation to calculate the environmental interface temperature. The difference form of this equation for computing the environmental interface temperature is discussed and analyzed as a function of the equation's parameters, using the Lyapunov exponent and sample entropy. Finally, the Kolmogorov complexity of time series obtained from this difference equation is analyzed.
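    The Lyapunov-exponent diagnostic used above can be illustrated on a one-dimensional difference equation. The logistic map below is a stand-in chosen for illustration, not the paper's energy-balance equation: the exponent is estimated by averaging log |f′(x)| along the orbit, so a positive value signals chaos and a negative value a stable cycle.

```python
import math

def lyapunov_logistic(r, x0=0.4, n=2000, burn=200):
    """Estimate the Lyapunov exponent of x -> r*x*(1-x)."""
    x = x0
    for _ in range(burn):                         # discard the transient
        x = r * x * (1 - x)
    acc = 0.0
    for _ in range(n):
        x = r * x * (1 - x)
        acc += math.log(abs(r * (1 - 2 * x)))     # log |f'(x)| on the orbit
    return acc / n

print(lyapunov_logistic(3.2))  # negative: stable period-2 orbit
print(lyapunov_logistic(4.0))  # positive (about ln 2): fully developed chaos
```

The same orbit-averaging scheme applies to any one-dimensional map once its derivative is known in closed form.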

  • Article (No Access)

    CHAOS IN COMPUTING THE ENVIRONMENTAL INTERFACE TEMPERATURE: NONLINEAR DYNAMIC AND COMPLEXITY ANALYSIS OF SOLUTIONS

    In this paper, we consider an environmental interface as a complex system, in which difference equations for calculating the environmental interface temperature and the deeper soil layer temperature are represented by coupled maps. The first equation has its background in the energy balance equation, while the second is based on the prognostic equation for the deeper soil layer temperature, commonly used in land surface parametrization schemes. The nonlinear dynamical analysis of this coupled system includes: (i) examination of the period-one fixed point and (ii) bifurcation analysis. The focus of the analysis is the calculation of the Lyapunov exponent for a specific range of system parameter values and a discussion of the domain of stability of the coupled system. Finally, we calculate the Kolmogorov complexity of time series generated by the coupled system.

  • Article (No Access)

    KOLMOGOROV COMPLEXITY SPECTRUM FOR USE IN ANALYSIS OF UV-B RADIATION TIME SERIES

    In this paper, we have used the Kolmogorov complexity and sample entropy measures to estimate the complexity of the UV-B radiation time series in the Vojvodina region (Serbia) for the period 1990–2007. We have defined the Kolmogorov complexity spectrum and have introduced the Kolmogorov complexity spectrum highest value (KCH). We have established the UV-B radiation time series on the basis of their daily sum (dose) for seven representative places in this region using: (i) measured data, (ii) data calculated via a derived empirical formula and (iii) data obtained by a parametric UV radiation model. We have calculated the Kolmogorov complexity (KC) based on the Lempel–Ziv algorithm (LZA), KCH and sample entropy (SE) values for each time series. We have divided the period 1990–2007 into two subintervals, (i) 1990–1998 and (ii) 1999–2007, and calculated the KC, KCH and SE values for the various time series in these subintervals. It is found that during the period 1999–2007 there is a decrease in KC, KCH and SE compared with the period 1990–1998. This complexity loss may be attributed to (i) increased human intervention in the post-civil-war period, causing an increase in air pollution, and (ii) increased cloudiness due to climate change.
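    The sample entropy (SE) measure used above compares how often templates of m points recur (within a tolerance r, in Chebyshev distance) against templates of m+1 points: SampEn = −ln(A/B). A compact sketch on synthetic series rather than UV-B data; the O(n²) loop is fine for short series:

```python
import math
import random

def _template_matches(x, m, r):
    """Count template pairs of length m within Chebyshev distance r."""
    n = len(x)
    count = 0
    for i in range(n - m):
        for j in range(i + 1, n - m):
            if max(abs(x[i + k] - x[j + k]) for k in range(m)) <= r:
                count += 1
    return count

def sample_entropy(x, m=2, r=None):
    if r is None:                        # common default: r = 0.2 * std
        mu = sum(x) / len(x)
        r = 0.2 * (sum((v - mu) ** 2 for v in x) / len(x)) ** 0.5
    b = _template_matches(x, m, r)       # m-point matches (B)
    a = _template_matches(x, m + 1, r)   # (m+1)-point matches (A)
    return math.log(b / a) if a > 0 and b > 0 else float("inf")

random.seed(0)
regular = [math.sin(0.5 * i) for i in range(200)]
irregular = [random.random() for _ in range(200)]
print(sample_entropy(regular) < sample_entropy(irregular))  # True
```

Lower SE for the sinusoid reflects the same kind of regularity that the paper's complexity-loss finding quantifies for the 1999–2007 subinterval.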

  • Article (No Access)

    Calculating the complexity of spatially distributed physical quantities

    With the development of mathematics and the natural sciences, and with the improvement of human cognitive capabilities, a new discipline dealing with the complexity of diverse, complex natural systems has emerged, and several complexity measures have been developed. Complexity measures have given the scientific community new insights into environmental processes that cannot be obtained by traditional mathematical methods. The spatial distribution of heavy metals and radionuclides (HM&RN hereafter) is shaped by natural processes as well as human activities. Although this distribution plays an important role in environmental processes, it has not received the attention it deserves. The usual way to present measurement results intended to describe environmental properties is to create a map of the spatial distribution of some chosen quantity or index. Attempts to introduce a quantitative measure characterizing the measured areal distribution (and the corresponding map) of a physical quantity are rarely encountered. In this paper, we introduce numerical indices as a new measure describing the spatial distribution of a physical quantity, based on the complexity computed by the Lempel–Ziv algorithm (LZA), i.e., Lempel–Ziv complexity (LZC).

  • Article (No Access)

    MINIMUM DESCRIPTION LENGTH METHOD FOR FACET MATCHING

    The Minimum Description Length (MDL) criterion is used to fit a facet model of a car to an image. The best fit is achieved when the difference image between the car and the background has the greatest compression. MDL overcomes the overfitting and parameter precision problems which hamper the more usual maximum likelihood method of model fitting. Some preliminary results are shown.

  • Article (No Access)

    Measuring Algorithmic Complexity in Chaotic Lasers

    Thanks to the simplicity and robustness of its calculation methods, algorithmic (or Kolmogorov) complexity appears as a useful tool to reveal chaotic dynamics when experimental time series are too short and noisy to apply Takens’ reconstruction theorem. We measure the complexity in chaotic regimes, with and without extreme events (sometimes called optical rogue waves), of three different all-solid-state lasers: Kerr lens mode locking femtosecond Ti:Sapphire (“fast” saturable absorber), Nd:YVO4+ Cr:YAG (“slow” saturable absorber) and Nd:YVO4 with modulated losses. We discuss how complexity characterizes the dynamics in an understandable way in all cases, and how it provides a correction factor of predictability given by Lyapunov exponents. This approach may be especially convenient to implement schemes of chaos control in real time.

  • Article (No Access)

    A NOVEL ECG AND EEG CLASSIFICATION SYSTEM BASED ON NONLINEAR STATISTICAL FEATURES

    Fractals, 01 Jan 2023

    Accurate classification of medical signals is urgently needed in clinical medicine. This paper aims to create a classifier that shortens classification time while ensuring sorting accuracy, helping physicians save diagnostic time and formulate treatment plans. We create the classifier based on Kolmogorov complexity, Shannon entropy, Higuchi's Hurst exponent and multifractal features. We obtain one feature value each from Kolmogorov complexity, Shannon entropy and Higuchi's Hurst exponent, and three feature values based on multifractal features, to compose a vector and analyze it. Furthermore, we study a vector composed of six multifractal features as a control group. Electrocardiogram (ECG) and electroencephalogram (EEG) signals are applied to examine the performance of the classifier by support vector machine (SVM). The accuracy of ECG signals based on the mixed classification (MC–ECG–SVM) reaches 94.17%, which is approximately 15% higher than that of ECG signals classified only with multifractal features (UC–ECG–SVM). The sensitivities of MC–ECG–SVM and UC–ECG–SVM are 86.09% and 64.54%, respectively. The specificities of MC–ECG–SVM and UC–ECG–SVM are 98.26% and 93.65%, respectively. Analogously, the accuracy, sensitivity, and specificity of EEG signals based on the mixed classification (MC–EEG–SVM) reach 95.29%, 96.28%, and 94.55%, respectively. The accuracy, sensitivity, and specificity of EEG signals based on multifractal features classification (UC–EEG–SVM) are 87.40%, 89.28%, and 88.11%, respectively. Therefore, the mixed classification method is more accurate than classification based only on multifractal features.

  • Article (No Access)

    An Information Theoretic Approach to Predicting Software Faults

    Software measurement and modeling is intended to improve quality by predicting quality factors, such as reliability, early in the life cycle. The field of software measurement generally assumes that attributes of software products early in the life cycle are somehow related to the amount of information in those products, and thus, are related to the quality that eventually results from the development process.

    Kolmogorov complexity and information theory offer a way to quantify the amount of information in a finite object, such as a program, in a unifying framework. Based on these principles, we propose a new synthetic measure of information composed from a set of conventional primitive metrics in a module. Since not all information is equally relevant to fault-insertion, we also consider components of the overall information content. We present a model for fault-insertion based on a nonhomogeneous Poisson process and Poisson regression. This approach is attractive, because the underlying assumptions are appropriate for software quality data. This approach also gives insight into design attributes that affect fault insertion.

    A validation case study of a large sample of modules from a very large telecommunications system provides empirical evidence that the components of synthetic module complexity can be useful in software quality modeling. A large telecommunications system is an example of a computer system with rigorous software quality requirements.

  • Article (No Access)

    RELATIVIZING CHAITIN'S HALTING PROBABILITY

    As a natural example of a 1-random real, Chaitin proposed the halting probability Ω of a universal prefix-free machine. We can relativize this example by considering a universal prefix-free oracle machine U. Let Ω_U^A be the halting probability of U^A; this gives a natural uniform way of producing an A-random real for every A ∈ 2^ω. It is this operator which is our primary object of study. We can draw an analogy between the jump operator from computability theory and this Omega operator. But unlike the jump, which is invariant (up to computable permutation) under the choice of an effective enumeration of the partial computable functions, Ω_U^A can be vastly different for different choices of U. Even for a fixed U, there are oracles A =* B such that Ω_U^A and Ω_U^B are 1-random relative to each other. We prove this and many other interesting properties of Omega operators. We investigate these operators from the perspective of analysis, computability theory, and of course, algorithmic randomness.

  • Article (No Access)

    Lowness for effective Hausdorff dimension

    We examine the sequences A that are low for dimension, i.e. those for which the effective (Hausdorff) dimension relative to A is the same as the unrelativized effective dimension. Lowness for dimension is a weakening of lowness for randomness, a central notion in effective randomness. By considering analogues of characterizations of lowness for randomness, we show that lowness for dimension can be characterized in several ways. It is equivalent to lowishness for randomness, namely, that every Martin-Löf random sequence has effective dimension 1 relative to A, and to lowishness for K, namely, that the limit of K^A(n)/K(n) is 1. We show that there is a perfect Π^0_1 class of low for dimension sequences. Since there are only countably many low for random sequences, many more sequences are low for dimension. Finally, we prove that every sequence that is low for dimension is jump-traceable in order n^ε, for any ε > 0.

  • Article (No Access)

    ACCURATE DETECTION OF SEIZURE USING NONLINEAR PARAMETERS EXTRACTED FROM EEG SIGNALS

    Electroencephalography (EEG) is the graphical recording of electrical activity along the scalp. The EEG signal monitors brain activity noninvasively, with a temporal resolution of milliseconds, and provides valuable insight into the brain's state. It is also sensitive in detecting spikes in epilepsy.

    Computer-aided diagnosis (CAD) tools allow epilepsy to be diagnosed while avoiding invasive methods. This paper presents a novel CAD system for epilepsy that uses linear features together with nonlinear features such as Hjorth's mobility, complexity and activity parameters and Kolmogorov complexity. The proposed method uses MATLAB software to extract the nonlinear features from the EEG data. The optimal features are selected for classification using statistical analysis, the ANOVA (analysis of variance) test. Once selected, they are fed into a decision tree (DT) for the classification of the different epileptic classes.

    The proposed method affirms that four nonlinear features (Kolmogorov complexity, singular value decomposition, mobility and permutation entropy) are sufficient to provide the highest accuracy of 93%, sensitivity of 97%, specificity of 88% and positive predictive value (PPV) of 94% with the DT classifier.

    Kolmogorov complexity has the highest mean value in the ictal stage, showing the best variation. It also has the highest F-value of 300.439, making it the parameter most favourable for the clinical diagnosis of epilepsy when used together with the DT classifier on 23.6 s of EEG data.
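    The Hjorth descriptors named above (activity, mobility, complexity) are derived from the variances of a signal and of its successive differences. A small sketch on a synthetic sinusoid rather than EEG data:

```python
import math

def hjorth_parameters(x):
    """Hjorth activity, mobility and complexity of a 1-D signal."""
    def var(v):
        mu = sum(v) / len(v)
        return sum((s - mu) ** 2 for s in v) / len(v)
    dx = [x[i + 1] - x[i] for i in range(len(x) - 1)]      # first difference
    ddx = [dx[i + 1] - dx[i] for i in range(len(dx) - 1)]  # second difference
    activity = var(x)                                      # signal power
    mobility = (var(dx) / activity) ** 0.5                 # mean-frequency proxy
    complexity = (var(ddx) / var(dx)) ** 0.5 / mobility    # bandwidth proxy
    return activity, mobility, complexity

sine = [math.sin(0.1 * i) for i in range(5000)]
a, m, c = hjorth_parameters(sine)
# For a pure sinusoid, complexity is close to its minimum value of 1
# and mobility approximates the angular frequency per sample (0.1 here).
```

A broadband or noisy signal would show complexity well above 1, which is what makes the descriptor useful as a classification feature.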

  • Article (No Access)

    A FORMAL THEORY OF EARLY COGNITION DEVELOPMENT

    The formal theory of the development of early perception and motor control presented here treats cognitive development as a mapping from a finite set of given experiences to a set of perceptual and motor-control functions. The theory involves seven constraints that uniquely define the mapping. The compatibility of these constraints with observational phenomena, and their sufficiency, show the validity of the theory. The principle underlying these constraints is coding by the most efficient representation of information. The efficiency of a representation is evaluated by the coding redundancy of given experiences, defined as the number of real numbers that characterize the experiences plus the size of the minimum continuous decoding function. The coding redundancy of experiences under the most efficient representation corresponds to the Kolmogorov complexity of the experiences. The mapping accounts for the dependence of the development of perceptual and motor-control functions on neonatal experience. This theory of development can also be seen as a metatheory of cognition that presents a unified view of the diversity of perceptual and motor-control modules.

  • Article (No Access)

    A GENERAL FRAMEWORK FOR BICLUSTERING GENE EXPRESSION DATA

    A large number of biclustering methods have been proposed to detect patterns in gene expression data. All these methods try to find some type of bicluster, but none can discover all the types of patterns in the data. Furthermore, researchers have to design new algorithms in order to find new types of biclusters/patterns that interest biologists. In this paper, we propose a novel approach for biclustering that, in general, can be used to discover all computable patterns in gene expression data. The method is based on the theory of Kolmogorov complexity. More precisely, we use Kolmogorov complexity to measure the randomness of submatrices as the merit of biclusters, because randomness naturally consists in a lack of regularity, which is a common property of all types of patterns. On the basis of the algorithmic probability measure, we develop a Markov chain Monte Carlo algorithm to search for biclusters. Our method can also be easily extended to solve the problems of conventional clustering and checkerboard-type biclustering. Preliminary experiments on simulated as well as real data show that our approach is very versatile and promising.
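    Since Kolmogorov complexity itself is uncomputable, a randomness score of this kind is in practice approximated with a real compressor. The sketch below is an illustrative stand-in for the paper's merit function, not its actual estimator: a submatrix that compresses well (low ratio) is strongly patterned and hence a promising bicluster candidate.

```python
import zlib
import random

def randomness_score(submatrix):
    """Approximate randomness of a numeric submatrix via zlib's
    compression ratio: near 1 is incompressible, near 0 is patterned."""
    data = ",".join(f"{v:.3f}" for row in submatrix for v in row).encode()
    return len(zlib.compress(data, 9)) / len(data)

random.seed(1)
patterned = [[float(i % 2)] * 30 for i in range(30)]            # regular
noisy = [[random.random() for _ in range(30)] for _ in range(30)]  # irregular
print(randomness_score(patterned) < randomness_score(noisy))  # True
```

A stochastic search (such as the MCMC procedure the abstract describes) could then minimize this score over candidate row/column subsets.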

  • Article (No Access)

    Quantum information distance

    In this paper, we give a definition for quantum information distance. In the classical setting, information distance between two classical strings is developed based on classical Kolmogorov complexity. It is defined as the length of a shortest transition program between these two strings in a universal Turing machine. We define the quantum information distance based on Berthiaume et al.’s quantum Kolmogorov complexity. The quantum information distance between qubit strings is defined as the length of the shortest quantum transition program between these two qubit strings in a universal quantum Turing machine. We show that our definition of quantum information distance is invariant under the choice of the underlying quantum Turing machine.

  • Article (No Access)

    AIT Foundations of Structured Experience

    In this paper, we present a unifying framework to study consciousness based on algorithmic information theory (AIT). We take as a premise that “there is experience” and focus on the requirements for structured experience (𝒮) — the spatial, temporal, and conceptual organization of our first-person experience of the world and of ourselves as agents in it. Our starting point is the insight that access to good models — succinct and accurate generative programs of world data — is crucial for homeostasis and survival. We hypothesize that the successful comparison of such models with data provides the structure to experience. Building on the concept of Kolmogorov complexity, we can associate the qualitative aspects of 𝒮 with the algorithmic features of the model, including its length, which reflects the structure discovered in the data. Moreover, a modeling system tracking structured data will display dimensionality reduction and criticality features that can be used empirically to quantify the structure of the program run by the agent. KT provides a consistent framework to define the concepts of life and agent and allows for the comparison between artificial agents and 𝒮-reporting humans to provide an educated guess about agent experience. A first challenge is to show that a human agent has 𝒮 to the extent they run encompassing and compressive models tracking world data. For this, we propose to study the relation between the structure of neurophenomenological, physiological, and behavioral data. The second is to endow artificial agents with the means to discover good models and study their internal states and behavior. We relate the algorithmic framework to other theories of consciousness and discuss some of its epistemological, philosophical, and ethical aspects.