In this work, we study conditional monotone cumulants and additive convolution in the shuffle-algebraic approach to non-commutative probability. We describe c-monotone cumulants as an infinitesimal character and identify the c-monotone additive convolution as an associative operation on the set of pairs of characters in the dual of a double tensor Hopf algebra. In this algebraic framework, we recover previous results on c-monotone cumulants and prove a combinatorial formula that relates c-free and c-monotone cumulants. We also identify the notion of t-Boolean cumulants in the shuffle-algebraic approach and introduce the corresponding notion of t-monotone cumulants as a particular case of c-monotone cumulants.
The unpredictability of the occurrence of epileptic seizures makes it difficult to detect and treat this condition effectively. An automatic system that characterizes epileptic activities in EEG signals would allow patients or the people near them to take appropriate precautions, would allow clinicians to better manage the condition, and could provide more insight into these phenomena, thereby revealing important clinical information. Various methods have been proposed to detect epileptic activity in EEG recordings. Because of the nonlinear and dynamic nature of EEG signals, the use of nonlinear Higher Order Spectra (HOS) features is a promising approach. This paper presents the methodology employed to extract HOS features (specifically, cumulants) from normal, interictal, and epileptic EEG segments and to use the significant features in classifiers for the detection of these three classes. In this work, 300 sets of EEG data belonging to the three classes were used for feature extraction and for classifier development and evaluation. The results show that the HOS-based measures have distinct ranges for the different classes with a high level of confidence (p-value < 0.0001). On evaluating several classifiers with the significant features, it was observed that the Support Vector Machine (SVM) achieved a high detection accuracy of 98.5%, thereby establishing the possibility of effective EEG segment classification using the proposed technique.
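As a rough illustration of the kind of pipeline described in this abstract, the sketch below estimates low-order cumulants of each EEG segment with k-statistics and feeds them to an SVM. The segment length, cumulant orders, cross-validation scheme, and use of synthetic data are assumptions for the example, not the authors' exact configuration.

```python
# Minimal sketch (not the authors' implementation): per-segment cumulant features + SVM.
import numpy as np
from scipy.stats import kstat                     # k-statistics: unbiased cumulant estimates
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def cumulant_features(segment, orders=(2, 3, 4)):
    """Sample cumulants of orders 2-4 of one EEG segment, used as a feature vector."""
    return np.array([kstat(segment, n) for n in orders])

rng = np.random.default_rng(0)
segments = rng.standard_normal((300, 1024))       # toy stand-in for 300 EEG segments
labels = rng.integers(0, 3, size=300)             # 0 = normal, 1 = interictal, 2 = epileptic

X = np.vstack([cumulant_features(s) for s in segments])
print(cross_val_score(SVC(kernel="rbf"), X, labels, cv=5).mean())
```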
The electrocardiogram (ECG) records the electrical activity of the heart, characterized by the P, QRS, and T waves. Minute changes in the amplitude and duration of the ECG indicate particular types of cardiac abnormality. It is very difficult to decipher the hidden information present in this nonlinear and nonstationary signal. An automatic diagnostic system that characterizes cardiac activities in ECG signals would provide more insight into these phenomena, thereby revealing important clinical information. Various methods have been proposed to detect cardiac abnormalities in ECG recordings. The application of higher order spectra (HOS) features is a promising approach because it can capture the nonlinear and dynamic nature of ECG signals. In this paper, we automatically classify five types of beats using HOS features (higher order cumulants) with two different approaches. The five types of ECG beats are normal (N), right bundle branch block (RBBB), left bundle branch block (LBBB), atrial premature contraction (APC) and ventricular premature contraction (VPC). In the first approach, cumulant features of the segmented ECG signal were used for classification, whereas in the second approach cumulants of discrete wavelet transform (DWT) coefficients were used as features for the classifiers. In both approaches, the cumulant features were subjected to data reduction using principal component analysis (PCA) and classified using three-layer feed-forward neural network (NN) and least squares support vector machine (LS-SVM) classifiers. In this study, we obtained the highest average accuracy of 94.52%, sensitivity of 98.61% and specificity of 98.41% using the first approach with the NN classifier. The developed system is clinically ready to run on large datasets.
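The second approach mentioned above (cumulants of DWT coefficients, PCA, then a classifier) can be sketched as follows; the wavelet family, decomposition level, PCA dimension, network size, and synthetic beats are illustrative assumptions only.

```python
# Minimal sketch (not the authors' implementation): DWT sub-band cumulants -> PCA -> NN.
import numpy as np
import pywt
from scipy.stats import kstat
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

def dwt_cumulant_features(beat, wavelet="db4", level=4, orders=(2, 3, 4)):
    """Cumulants of orders 2-4 of each DWT sub-band of one ECG beat."""
    coeffs = pywt.wavedec(beat, wavelet, level=level)
    return np.array([kstat(c, n) for c in coeffs for n in orders])

rng = np.random.default_rng(0)
beats = rng.standard_normal((500, 256))           # toy stand-in for segmented ECG beats
labels = rng.integers(0, 5, size=500)             # N, RBBB, LBBB, APC, VPC

X = np.vstack([dwt_cumulant_features(b) for b in beats])
model = make_pipeline(PCA(n_components=8),
                      MLPClassifier(hidden_layer_sizes=(20,), max_iter=500))
model.fit(X, labels)
```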
Highlights from Brookhaven National Laboratory (BNL) and experiments at the BNL Relativistic Heavy Ion Collider (RHIC) are presented for the years 2011–2013. This review combines lectures that discussed the latest results each year during a three-year celebration of the 50th anniversary of the International School of Subnuclear Physics in Erice, Sicily, Italy. Since the first collisions in the year 2000, RHIC has provided nucleus–nucleus and polarized proton–proton collisions over a range of nucleon–nucleon center-of-mass energies from 7.7 GeV to 510 GeV with nuclei from deuterium to uranium, most often gold. The objective was the discovery of the Quark Gluon Plasma, which was achieved, and the measurement of its properties, which turned out to be quite different from expectations, namely a "perfect fluid" of quarks and gluons with their color charges exposed rather than a gas. Topics presented include the quenching of light and heavy quarks at large transverse momentum, thermal photons, the search for a QCD critical point, as well as measurements of collective flow, two-particle correlations and J/ψ suppression. During this period, results from the first and subsequent heavy ion measurements at the Large Hadron Collider (LHC) at CERN became available. These confirmed and extended the RHIC discoveries and have led to ideas for new and improved measurements.
Understanding the phase diagram of QCD matter is one of the ultimate goals in high-energy nuclear physics. Event-by-event fluctuations of conserved charges are believed to be sensitive to the QCD phase structure. In this paper, we will review the measurements carried out in the Beam Energy Scan program at RHIC. Our focus will be on the technical details of overcoming the various difficulties of these measurements. We will also discuss the current interpretations of the results and future prospects.
In this paper we shall give combinatorial remarks on the r-free convolution. In particular, we shall introduce a set-partition statistic on non-crossing partitions, which gives the r-free deformed moment-cumulant formula. We shall also give explicitly the probability measure of the r-free Poisson law and its moments.
Exchangeability systems arising from Fock space constructions are considered and the corresponding cumulants are computed for generalized Toeplitz operators and similar noncommutative random variables. In particular, simplified calculations are given for the two known examples of q-cumulants.
In the second half of the paper we consider in detail the Fock states associated to characters of the infinite symmetric group recently constructed by Bożejko and Guta. We express moments of multidimensional Dyck words in terms of the so-called cycle indicator polynomials of certain digraphs.
We define a product of algebraic probability spaces equipped with two states, called the conditionally monotone product. This product is a new example of independence in noncommutative probability theory and unifies the monotone and Boolean products, as well as the orthogonal product. We then define the associated cumulants and calculate the limit distributions in the central limit theorem and in Poisson's law of small numbers. We also prove a combinatorial moment-cumulant formula using monotone partitions. We investigate some other topics such as infinite divisibility for the additive convolution and deformations of the monotone convolution. We define cumulants for a general convolution to analyze the deformed convolutions.
In a fundamental lemma we characterize “generating functions” of certain functors on the category of algebraic non-commutative probability spaces. Special families of such generating functions correspond to “unital, associative universal products” on this category, which again define a notion of non-commutative stochastic independence. Using the fundamental lemma, we prove the existence of cumulants and of “cumulant Lie algebras” for all independences coming from a unital, associative universal product. These include the five independences (tensor, free, Boolean, monotone, anti-monotone) appearing in Muraki’s classification, c-free independence of Bożejko and Speicher, the indented product of Hasebe and the bi-free independence of Voiculescu. We show how the non-commutative independence can be reconstructed from its cumulants and cumulant Lie algebras.
We analyze the solution to the linear stochastic heat equation driven by a multiparameter Hermite process of order q≥1. This solution is an element of the qth Wiener chaos. We discuss various properties of the solution, such as the necessary and sufficient condition for its existence, self-similarity, α-variation and regularity of its sample paths. We will also focus on the probability distribution of the solution, which is non-Gaussian when q≥2.
The boolean and monotone notions of independence lack the property of independent constants. We address this problem from a combinatorial point of view (based on cumulants defined from weights on set-partitions, in the general framework of operator-valued probability spaces). We show that if the weights are singleton inductive (SI), then all higher-order cumulants involving constants vanish, just as in the free and classical case. Our combinatorial considerations lead rather directly to mild variations of boolean and monotone probability theories which are closely related to the usual notions. The SI-boolean case is related to c-free and Fermi convolutions. We also describe some standard combinatorial aspects of the SI-boolean and cyclic-boolean lattices, such as their Möbius functions, featuring well-known combinatorial integer sequences.
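For orientation, the weight-on-set-partitions framework invoked in the preceding abstract can be summarized, in the scalar-valued case, by a schematic moment-cumulant relation of the form below; the choice of weights w(π) (supported, e.g., on interval, non-crossing, or monotone partitions) selects the particular theory. This is a generic schematic with assumed notation, not the operator-valued formulation of the paper.

```latex
% Schematic scalar-valued weighted moment-cumulant relation over set partitions P(n):
\[
  m_n(a_1,\dots,a_n)
  \;=\; \sum_{\pi \in P(n)} w(\pi) \prod_{B \in \pi}
        \kappa_{|B|}\bigl((a_i)_{i \in B}\bigr).
\]
```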
The role of coalgebras as well as algebraic groups in non-commutative probability has long been advocated by the school of von Waldenfels and Schürmann. Another algebraic approach was introduced more recently, based on shuffle and pre-Lie calculus, and results in another construction of groups of characters encoding the behavior of states. Comparing the two, the first approach, recast recently in a general categorical language by Manzel and Schürmann, can be seen as largely driven by the theory of universal products, whereas the second construction builds on Hopf algebras and a suitable algebraization of the combinatorics of non-crossing set partitions. Although both address the same phenomena, moving between the two viewpoints is not obvious. We present here an attempt to unify the two approaches by making explicit the Hopf algebraic connections between them. Our presentation, although relying largely on classical ideas as well as results closely related to Manzel and Schürmann’s aforementioned work, is nevertheless original on several points and fills a gap in the non-commutative probability literature. In particular, we systematically use the language and techniques of algebraic groups together with shuffle group techniques to prove that two notions of algebraic groups naturally associated with free, respectively, Boolean and monotone, probability theories identify. We also obtain explicit formulas for various Hopf algebraic structures and detail arguments that had been left implicit in the literature.
In this study, we analyze nonlinear feature extraction methods in terms of their ability to support the diagnosis of coronary artery disease (CAD) and myocardial infarction (MI). The nonlinear features were extracted from electrocardiogram (ECG) signals that were measured from CAD patients, MI patients as well as normal controls.
We tested 34 recurrence quantification analysis (RQA) features, 14 bispectrum features, and 136 cumulant features. The features were extracted from 10,546 normal, 41,545 CAD, and 40,182 MI heart beats. The feature quality was assessed with Student's t-test, and the t-value was used for feature ranking.
We found that nonlinear features can effectively represent the physiological realities of the human heart.
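A minimal sketch of the t-value-based feature ranking described above is given below; the two-class comparison and the synthetic feature matrices are assumptions for illustration.

```python
# Sketch: rank features by the magnitude of Student's t-statistic between two classes.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
feats_normal = rng.standard_normal((200, 50))        # toy: 200 beats x 50 features
feats_cad = rng.standard_normal((200, 50)) + 0.3     # toy "CAD" features, slightly shifted

t_vals, _ = ttest_ind(feats_normal, feats_cad, axis=0)
ranking = np.argsort(-np.abs(t_vals))                # most discriminative features first
print(ranking[:10])
```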
Enhanced tear film evaporation and diminished tear production cause the dry eye (DE) condition. Non-invasive infrared (IR) thermography is commonly used as a diagnostic tool for DE. However, high-quality IR thermal cameras are difficult to obtain at low cost. Hence, a DE detection system that performs well using low-cost, low-quality images instead of conventional IR images would be a significant contribution. Therefore, in this work, we evaluated the performance of an automated non-invasive DE detection system using low-quality images obtained by adding different levels of noise to high-quality IR images. The performances of two types of non-linear higher-order spectra (HOS) features, cumulants and the bispectrum, are compared. These features are extracted from the IR images with different levels of Gaussian noise. Principal component analysis (PCA) is performed on the extracted features, which are then ranked using the t-value and fed to different classifiers. We achieved accuracies, sensitivities and specificities of: (i) 86.90%, 85.71% and 88.10% with noise level 0 using 24 bispectrum features, and (ii) 80.95%, 85.71% and 76.19% with noise level 10 using 15 bispectrum features for right-eye IR images. This study shows that DE detection is possible even in the presence of high levels of noise and that our proposed method performs well using HOS bispectrum features. Thus, our proposed method can be used to detect DE using low-quality, inexpensive cameras instead of a high-cost IR camera.
The objective of this paper is to study the arbitrage-free pricing of the covariance swap in financial markets driven by Barndorff–Nielsen and Shephard (BN–S) type Lévy processes. One of the major challenges in the arbitrage-free pricing of a swap is to obtain an accurate pricing expression that can be used with good computational accuracy. In this paper, we obtain analytic expressions for the pricing of the covariance swap. We show that with the analytic expressions obtained from the BN–S model, the error in fitting the delivery price is much smaller than for existing models with comparable parameters. The models and pricing formulas proposed in this paper are computable in real time and hence can be used efficiently in practical applications.
We propose an asymptotically unbiased and consistent estimate of the bispectrum of a stationary continuous-time process X = {X(t), t ∈ ℝ}. The estimate is constructed from observations {X(τ_k), k ∈ ℤ} obtained by random sampling in time, where {τ_k, k ∈ ℤ} is a sequence of real random variables generated from a Poisson counting process. Moreover, we establish the asymptotic normality of the constructed estimate.
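For reference, the bispectrum of a zero-mean stationary continuous-time process is conventionally defined as the double Fourier transform of its third-order cumulant function; the block below states this textbook definition (with an assumed normalization), not the random-sampling estimator constructed in the paper.

```latex
% Third-order cumulant function and bispectrum of a zero-mean stationary process X:
\[
  c_3(s,t) = \operatorname{cum}\bigl(X(u),\, X(u+s),\, X(u+t)\bigr),
  \qquad
  f_3(\omega_1,\omega_2)
  = \frac{1}{(2\pi)^2} \int_{\mathbb{R}^2} c_3(s,t)\,
    e^{-i(\omega_1 s + \omega_2 t)}\, ds\, dt .
\]
```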
We consider velocity structure functions in turbulence through an approach using cumulants, for a fixed value of the distance ℓ. This allows us to consider the cumulant generating function Φ_ℓ(q) = log⟨|ΔV_ℓ|^q⟩. Using an atmospheric turbulence database, we show that the cumulant generating function is nonanalytic, with a development compatible with a log-stable model of the form Φ_ℓ(q) = A_ℓ q + B_ℓ q^α with parameter value α = 1.5. The parameters A_ℓ and B_ℓ are estimated experimentally: they are, respectively, increasing and decreasing functions of ℓ, and their scaling ranges correspond to the scaling range of the velocity fluctuations. The dependence between these two functions is studied in relation to the Extended Self-Similarity and Generalized Extended Self-Similarity properties.
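The empirical computation and fit described above can be sketched as follows; the synthetic velocity record, the lag ℓ, the range of moments q, and the least-squares fit with α held at 1.5 are illustrative assumptions.

```python
# Sketch: empirical cumulant generating function Phi_l(q) = log< |dV_l|^q >
# of velocity increments, fitted by A*q + B*q**alpha with alpha = 1.5 fixed.
import numpy as np

rng = np.random.default_rng(0)
v = np.cumsum(rng.standard_normal(100_000))   # toy velocity record (stand-in for data)
ell = 16
dv = v[ell:] - v[:-ell]                       # velocity increments at distance ell

q = np.linspace(0.5, 4.0, 15)
phi = np.array([np.log(np.mean(np.abs(dv) ** qi)) for qi in q])

alpha = 1.5
design = np.column_stack([q, q ** alpha])     # least-squares fit of A*q + B*q**alpha
(A_ell, B_ell), *_ = np.linalg.lstsq(design, phi, rcond=None)
print(A_ell, B_ell)
```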
In this paper we consider the sampling properties of the bootstrap process, that is, the empirical process obtained from a random sample of size n drawn with replacement from a fixed sample of size n from a continuous distribution. The cumulants of the bootstrap process are given up to the order n^{-1} and their unbiased estimation is discussed. Furthermore, it is shown that the bootstrap process has an asymptotic minimax property for a class of distributions up to the order n^{-1/2}.
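For clarity, the bootstrap process referred to above is, in standard notation (an assumption about the authors' conventions), the following object:

```latex
% Bootstrap empirical process: F_n is the empirical d.f. of the original sample of size n,
% and F_n^* that of a resample of size n drawn from it with replacement.
\[
  Y_n(t) \;=\; \sqrt{n}\,\bigl(F_n^{*}(t) - F_n(t)\bigr), \qquad t \in \mathbb{R}.
\]
```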