The world is a global village, and all economies are connected with one another, whether positively or negatively. A financial crisis in one economy is therefore likely to affect other economies. Since the stock exchange plays an important role for a country through its ability to mobilize local resources for productive investment, it is essential to detect and date-stamp any bubbles in the stock market of a particular country and its major trading partners, so that these economies can be protected from crises and guided toward sustainable growth. The current literature on Pakistan discusses bubble detection, but no study analyzes bubbles in Pakistan together with its major trading partners, or asks what the impact of a bubble in one of these countries would be on a specific country or on the others, and with what magnitude. This study fills this void by contributing to the existing literature in two ways: first, it analyzes the presence of bubbles in the stock markets of Pakistan and its major trading partners; second, it measures the connectedness among these stock markets and the impact of a shock in one of the chosen countries on a specific country or on the rest. The empirical analysis is based on monthly data from January 2000 to October 2022. Bubbles are detected via the state-of-the-art generalized supremum ADF (GSADF) test, while connectedness is tested via the Diebold and Yilmaz [(2012). Better to give than to receive: Predictive directional measurement of volatility spillovers. International Journal of Forecasting, 28(1), 57–66] approach. Several interesting results emerge from the empirical findings, and relevant policy recommendations are made.
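For readers unfamiliar with the bubble test, the following minimal Python sketch illustrates the backward sup-ADF recursion on which the GSADF statistic of Phillips, Shi and Yu is built; the simulated log-index series, window length, and lag choice are illustrative assumptions rather than the paper's data or implementation, and formal date-stamping additionally requires the appropriate critical values.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

def backward_sadf(series, min_window=36):
    """Backward sup-ADF sequence: for each end point, take the largest ADF statistic
    over all admissible start points (the GSADF statistic is its supremum).
    Note: this runs O(n^2) ADF regressions, so it is only meant for small samples."""
    stats = []
    for end in range(min_window, len(series) + 1):
        best = max(adfuller(series[start:end], regression="c",
                            maxlag=1, autolag=None)[0]
                   for start in range(0, end - min_window + 1))
        stats.append(best)
    return np.array(stats)

# Hypothetical monthly log-index, 274 observations (January 2000 to October 2022).
log_index = np.cumsum(np.random.default_rng(0).normal(0.001, 0.02, 274))
bsadf = backward_sadf(log_index)
print("GSADF statistic:", bsadf.max().round(3))
```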
We develop an autoregressive model framework based on the concept of Principal Dynamic Modes (PDMs) for the process of action potential (AP) generation in the excitable neuronal membrane described by the Hodgkin–Huxley (H–H) equations. The model's exogenous input is the injected current, and whenever the membrane potential output exceeds a specified threshold, it is fed back as a second input. The PDMs are estimated from the previously developed Nonlinear Autoregressive Volterra (NARV) model and represent an efficient functional basis for Volterra kernel expansion. The PDM-based model admits a modular representation consisting of the forward and feedback PDM bases, acting as linear filterbanks for the exogenous and autoregressive inputs, respectively, whose outputs are fed to a static nonlinearity composed of polynomials operating on the PDM outputs and on cross-terms formed from pair-products of PDM outputs. A two-step model-reduction procedure is performed: first, influential subsets of the forward and feedback PDM bases are identified and selected as the reduced PDM bases; second, the terms of the static nonlinearity are pruned. The first step reduces the model from a total of 65 coefficients to 27, while the second further reduces the model to only eight coefficients. It is demonstrated that the cost of model reduction in terms of out-of-sample prediction accuracy is minimal. Unlike the full model, the eight-coefficient pruned model can be easily visualized to reveal the essential system components, so the data-derived PDM model can yield insight into the underlying system structure and function.
We confirm the results of Chaves and collaborators of very large size effects in the case when a site on the triangular lattice is permanently emptied if it has less than three occupied neighbors. We also show that the relaxation time peaks at an initial concentration different from the percolation threshold.
This article applies the panel stationarity test with a break proposed by Hadri and Rao (2008) to examine whether 14 macroeconomic variables of OECD countries are best represented as a random walk or as stationary fluctuations around a deterministic trend. In contrast to previous studies, which rely essentially on visual inspection of the break type or simply apply the most general break model, we use a model-selection procedure based on the BIC. We do this for each time series, so that heterogeneous break models are allowed within the panel. Our results suggest, overwhelmingly, that once we account for a structural break and cross-sectional dependence and choose the break models to be congruent with the data, the null of stationarity cannot be rejected for any of the 14 macroeconomic variables examined in this article. This is in sharp contrast with the results obtained by Hurlin (2004), using the same data but a different methodology.
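As a rough illustration of BIC-based selection among heterogeneous break models, the sketch below fits, for a single simulated series, a small menu of one-break level-shift and trend-shift regressions and keeps the specification with the lowest BIC; the candidate break dates, model forms, and data are assumptions for illustration and do not reproduce the Hadri and Rao (2008) test itself.

```python
import numpy as np

def bic(resid, n_params):
    """Bayesian information criterion for a least-squares fit."""
    n = resid.size
    rss = float(resid @ resid)
    return n * np.log(rss / n) + n_params * np.log(n)

def fit_break_model(y, break_frac, model="level"):
    """OLS fit of a deterministic-trend model with one break at a given sample fraction.
    model='level' shifts the intercept after the break; model='trend' also shifts the slope."""
    n = y.size
    t = np.arange(n)
    tb = int(break_frac * n)
    dummy = (t >= tb).astype(float)
    cols = [np.ones(n), t, dummy]
    if model == "trend":
        cols.append(dummy * (t - tb))
    X = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return bic(resid, X.shape[1])

# Choose, for a hypothetical series, the break model with the lowest BIC.
y = np.cumsum(np.random.default_rng(0).normal(size=200))
candidates = {(m, bf): fit_break_model(y, bf, m)
              for m in ("level", "trend") for bf in (0.25, 0.5, 0.75)}
best = min(candidates, key=candidates.get)
print("selected break model:", best)
```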
Chong and Lam and Chong et al. show that the SETAR(200) and MA(50) rules outperform other rules in both the U.S. and the Chinese stock markets. This paper investigates the synergy of combining the SETAR(200) and MA(50) rules in ten U.S. and Chinese stock market indexes. It is found that the SETAR rule performs better in the U.S. market, while the MA rule performs better in the Chinese market. In addition, we find evidence that a new strategy combining the two rules creates synergy. An immediate implication of our result is that investors can improve the performance of their portfolios by combining existing profitable trading rules.
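The MA leg of the strategy is simple enough to sketch. The illustrative Python backtest below goes long when the price is above its 50-day moving average and stays in cash otherwise; the SETAR(200) leg and the combination rule are not reproduced, and the simulated prices are purely hypothetical.

```python
import numpy as np
import pandas as pd

def ma_rule_returns(prices: pd.Series, window: int = 50) -> pd.Series:
    """Long when the price closes above its moving average, flat otherwise;
    the position is lagged one day to avoid look-ahead bias."""
    ma = prices.rolling(window).mean()
    position = (prices > ma).astype(float).shift(1).fillna(0.0)
    daily_returns = prices.pct_change().fillna(0.0)
    return position * daily_returns

# Hypothetical price series for illustration.
rng = np.random.default_rng(1)
prices = pd.Series(100 * np.exp(np.cumsum(rng.normal(0.0003, 0.01, 1000))))
strategy = ma_rule_returns(prices, window=50)
buy_hold = prices.pct_change().fillna(0.0)
print(f"MA(50) cumulative return: {(1 + strategy).prod() - 1:.2%}")
print(f"Buy-and-hold return:      {(1 + buy_hold).prod() - 1:.2%}")
```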
Innovation is a major contributor to economic growth, yet it is often hindered by corruption; however, this relationship is not always supported empirically. This study analyzes the interrelation between anti-corruption (AC) and technological innovation (TI) in China by applying bootstrap rolling-window full-sample and subsample Granger causality tests. The results confirm that the influence of AC on TI is two-fold. On the one hand, AC positively influences TI, indicating that it facilitates TI; this finding supports the “sanding-the-wheels” hypothesis, which postulates that corruption impedes innovation. On the other hand, AC also exerts a negative influence on TI, mainly driven by the COVID-19 pandemic. Further, the results show that TI positively influences AC, implying that TI can affect the government’s AC-related decisions. Based on these findings, governments should coordinate their efforts toward innovation and AC, while firms should adopt innovation-driven strategies for long-term growth.
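A minimal sketch of a rolling-window Granger causality scan in the spirit of the subsample analysis is given below; it uses the asymptotic F-test from statsmodels rather than the bootstrap p-values of the paper, and the series, window length, and lag order are illustrative assumptions.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

# Hypothetical monthly series: an anti-corruption index (AC) and a technological
# innovation proxy (TI); the rolling window and lag order are illustrative choices.
rng = np.random.default_rng(0)
n = 240
ac = np.cumsum(rng.normal(size=n))
ti = 0.4 * np.roll(ac, 2) + np.cumsum(rng.normal(size=n))
data = pd.DataFrame({"TI": ti, "AC": ac})

window, lag = 60, 2
pvalues = []
for end in range(window, n):
    sub = data.iloc[end - window:end]
    res = grangercausalitytests(sub[["TI", "AC"]], maxlag=lag, verbose=False)
    pvalues.append(res[lag][0]["ssr_ftest"][1])   # p-value: does AC Granger-cause TI?
print("share of windows with causality at 5%:", np.mean(np.array(pvalues) < 0.05))
```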
We point out that the bootstrap program in quantum mechanics proposed by Han et al. reduces to a bootstrap study of a microcanonical ensemble of the same Hamiltonian in the ℏ→0 limit. In the limit, the quantum mechanical archipelago becomes an extremely thin line for an anharmonic oscillator. For a double-well potential, a peninsula in E≤0 appears.
The quark model emerged from the Gell-Mann–Ne'eman flavor SU(3) symmetry. Its development, in the context of strong interactions, took place in a heuristic theoretical framework referred to as the Bootstrap Era. After setting the background of the dominant ideas in strong-interaction physics of the early 1960s, we outline some aspects of the constituent quark model. An independent theoretical development was the emergence of hadron duality in 1967, which led to a realization of the Bootstrap idea by relating hadron resonances (in the s-channel) to Regge pole trajectories (in the t- and u-channels). The synthesis of duality with the quark model was achieved through duality diagrams, which served as a conceptual framework for discussing many aspects of hadron dynamics toward the end of the 1960s.
Bootstrap equations for conformal correlators that mimic the early theory of the conformal bootstrap are written down within the framework of the AdS/CFT approach. A simplified version of these equations, which may be justified if the Schwinger–Keldysh formalism is used in AdS/CFT instead of the conventional Feynman–Witten diagram technique, permits the calculation of conformal dimensions in the O(N)-symmetric model with a conformal or composite Hubbard–Stratonovich field.
This paper proposes an effective method to improve the performance of saliency detection via iterative bootstrap learning, consisting of two tasks: saliency optimization and saliency integration. Specifically, first, multiscale segmentation and feature extraction are performed on the input image. Second, prior saliency maps are generated using existing saliency models and are combined to produce the initial saliency map. Third, the prior maps are fed into a saliency regressor: training samples are collected from the prior maps at multiple scales, and a random forest regressor is learned from these training data. The initial saliency map and the output of the saliency regressor are then integrated to generate the coarse saliency map. Finally, to further improve the quality of the saliency map, both the initial and coarse saliency maps are fed into the saliency regressor, and its output is integrated with the initial and coarse saliency maps to form the final saliency map. Experimental results on three public data sets demonstrate that the proposed method consistently achieves the best performance, and that significant improvement can be obtained when applying our method to existing saliency models.
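A minimal sketch of the regression step is shown below: a random forest regressor is trained on superpixel features with pseudo-labels taken from an initial saliency map, and its predictions are fused with that map. The feature dimensionality, data, and simple averaging fusion are assumptions for illustration rather than the paper's exact pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical training data: each row describes a superpixel at one scale
# (e.g., color, texture, prior-map values); the target is the pseudo-ground-truth
# saliency taken from the initial saliency map.
rng = np.random.default_rng(0)
features = rng.random((5000, 20))          # 5000 superpixels, 20 features each
pseudo_labels = rng.random(5000)           # initial-map saliency in [0, 1]

regressor = RandomForestRegressor(n_estimators=200, max_depth=12,
                                  n_jobs=-1, random_state=0)
regressor.fit(features, pseudo_labels)

# At test time, predict a saliency score per superpixel and fuse it with the
# initial map (simple averaging here; the paper's integration step is richer).
test_features = rng.random((1200, 20))
initial_map = rng.random(1200)
coarse_map = 0.5 * regressor.predict(test_features) + 0.5 * initial_map
```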
The Monte Carlo approach to power estimation is based on the assumption that the samples of power are normally distributed. However, the power distribution of a circuit is not always normal in the real world. In this paper, the bootstrap method is adopted to adjust the confidence interval and remedy this deficiency of the conventional Monte Carlo method. In addition, a new input-sequence stratification technique for power estimation is proposed. The proposed technique uses multiple regression to compute the coefficient matrix of the indicator function used for stratification. This stratification technique can adaptively update the coefficient matrix and keep the population of input vectors in a better stratification state. The experimental results demonstrate that the proposed bootstrap Monte Carlo method with adaptive stratification can effectively reduce simulation time while meeting the user-specified confidence level and error level.
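The confidence-interval adjustment can be illustrated with a plain percentile bootstrap, as in the sketch below; the skewed sample of power values is hypothetical, and the stratification machinery of the paper is not reproduced.

```python
import numpy as np

def bootstrap_ci(samples, stat=np.mean, n_boot=2000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence interval for a statistic of the power samples."""
    rng = np.random.default_rng(seed)
    n = samples.size
    boot_stats = np.array([stat(rng.choice(samples, size=n, replace=True))
                           for _ in range(n_boot)])
    lo, hi = np.percentile(boot_stats, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return lo, hi

# Hypothetical, skewed (non-normal) power samples from a Monte Carlo run.
power = np.random.default_rng(3).lognormal(mean=0.0, sigma=0.8, size=300)
print("95% bootstrap CI for mean power:", bootstrap_ci(power))
```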
This paper presents a 2×VDD-tolerant I/O buffer built from low-voltage (VDD) devices. A novel bootstrap circuit for the mixed-voltage I/O buffer is proposed to eliminate unwanted leakage paths and gate-oxide reliability issues. The proposed circuit is designed using 1.8 V thick-gate devices in a 22-nm FinFET technology, with 1.8 V signaling and tolerance to 3.3 V. The structure can be used in any CMOS technology to realize a 2×VDD-tolerant I/O buffer.
Highly reliable software systems rarely fail during tests because they are usually designed with fault-tolerant mechanisms and tested comprehensively. It is therefore difficult to obtain sufficient failure data to carry out reliability measurements with traditional software reliability models, which are typically based on probabilistic statistics and whose measurement accuracy cannot be guaranteed with insufficient failure data. We propose a nonparametric bootstrap (NBP) resampling method and six parametric bootstrap (PB) resampling methods to construct software reliability models for small-sample conditions based on commonly used models, i.e., the Jelinski–Moranda (J–M), Goel–Okumoto (G–O), Musa–Okumoto (M–O), Schneidewind, Duane and Littlewood–Verrall models. The bootstrap is a statistical procedure that resamples a single dataset to create many simulated samples. Our experimental results on fourteen failure datasets collected from industry and academia show that the proposed models improve failure-time prediction accuracy by 10.2–18.0%, curve-fitting accuracy by 24.7–30.7%, and reliability-measurement accuracy by 7.7–42.9% compared with the original models. Furthermore, our approaches achieve 58.3–91.1% better failure-time prediction accuracy under small-sample conditions than state-of-the-art machine learning and neural-network-based methods. Overall, our approaches can perform more accurate reliability measurements than the original models, even in scenarios with limited failure data.
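The nonparametric (NBP) resampling step can be sketched as follows: the observed inter-failure times are resampled with replacement to create many simulated failure datasets, each of which would then be fed to a reliability model. The failure data and the summary statistic shown here are illustrative assumptions, not those of the paper.

```python
import numpy as np

def np_bootstrap(inter_failure_times, n_boot=1000, seed=0):
    """Nonparametric bootstrap: resample the observed inter-failure times with
    replacement to create many simulated failure datasets."""
    rng = np.random.default_rng(seed)
    n = inter_failure_times.size
    return rng.choice(inter_failure_times, size=(n_boot, n), replace=True)

# Hypothetical small failure dataset (hours between successive failures).
observed = np.array([12.0, 30.0, 45.0, 51.0, 80.0, 110.0, 160.0, 230.0])
replicas = np_bootstrap(observed)

# Each replica would be fed to a reliability model (J-M, G-O, ...); here we just
# summarize the bootstrap distribution of the mean time to failure.
mttf = replicas.mean(axis=1)
print("MTTF estimate:", observed.mean())
print("95% bootstrap interval:", np.percentile(mttf, [2.5, 97.5]))
```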
Recently, blind source separation (BSS) methods have been applied with great success to biomedical data. This paper reviews the concept of BSS and demonstrates its usefulness in the context of event-related MEG measurements. In a first experiment, we apply BSS to artifact identification in raw MEG data and discuss how the quality of the resulting independent component projections can be evaluated. The second part of our study considers averaged data of event-related magnetic fields. Here, it is particularly important to monitor, and thus avoid, possible overfitting due to the limited sample size. A stability assessment of the BSS decomposition allows this task to be solved, and an additional grouping of the BSS components reveals interesting structure that could ultimately be used to obtain better physiological models of the data.
We introduce a new technique to associate a spanning tree with the average linkage cluster analysis; we term this tree the Average Linkage Minimum Spanning Tree. We also introduce a technique to assign a reliability value to the links of correlation-based graphs by using bootstrap replicas of the data. Both techniques are applied to the portfolio of the 300 most capitalized stocks traded on the New York Stock Exchange during the period 2001–2003. We show that the Average Linkage Minimum Spanning Tree recognizes economic sectors and sub-sectors as communities in the network slightly better than the Minimum Spanning Tree. We also show that the average reliability of links in the Minimum Spanning Tree is slightly greater than in the Average Linkage Minimum Spanning Tree.
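A rough sketch of both ingredients, the correlation-based MST and the bootstrap reliability of its links, is given below; it uses the common d = sqrt(2(1 − ρ)) distance transform and synthetic returns, both of which are assumptions for illustration rather than the paper's data or exact procedure.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

def mst_edges(returns):
    """MST of a correlation-based graph, using the usual d = sqrt(2(1 - rho)) distance."""
    corr = np.corrcoef(returns, rowvar=False)
    dist = np.sqrt(2.0 * (1.0 - corr))
    np.fill_diagonal(dist, 0.0)
    tree = minimum_spanning_tree(dist).toarray()
    return {tuple(sorted(e)) for e in zip(*np.nonzero(tree))}

# Hypothetical daily returns: 500 days x 30 stocks.
rng = np.random.default_rng(0)
returns = rng.normal(size=(500, 30))
base_edges = mst_edges(returns)

# Bootstrap link reliability: resample days, rebuild the MST, count edge recurrence.
n_boot = 100
counts = {e: 0 for e in base_edges}
for _ in range(n_boot):
    idx = rng.integers(0, returns.shape[0], size=returns.shape[0])
    for e in mst_edges(returns[idx]):
        if e in counts:
            counts[e] += 1
reliability = {e: c / n_boot for e, c in counts.items()}
```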
This work illustrates the use of bootstrap methods to quantify the statistical uncertainties on the correlation coefficients between the slope of the symmetry energy and the neutron skin thickness in heavy nuclei. By using several energy density functionals, I discuss the density dependence of such a correlation and its evolution with isospin asymmetry. In particular, I observe that the correlation between the slope of the symmetry energy and the neutron skin is present not only at saturation density, but over a much larger density range.
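The bootstrap quantification can be sketched as follows for a pair of quantities computed across a set of functionals; the (L, neutron-skin) values below are invented for illustration and do not come from the paper.

```python
import numpy as np

def bootstrap_corr(x, y, n_boot=5000, seed=0):
    """Bootstrap distribution of the Pearson correlation between two quantities
    computed over a set of energy density functionals."""
    rng = np.random.default_rng(seed)
    n = x.size
    idx = rng.integers(0, n, size=(n_boot, n))
    xb, yb = x[idx], y[idx]
    xm = xb - xb.mean(axis=1, keepdims=True)
    ym = yb - yb.mean(axis=1, keepdims=True)
    return (xm * ym).sum(axis=1) / np.sqrt((xm**2).sum(axis=1) * (ym**2).sum(axis=1))

# Hypothetical (slope of symmetry energy L, neutron-skin thickness) pairs, one per functional.
L = np.array([30., 45., 52., 60., 70., 85., 95., 110., 118.])
skin = np.array([0.13, 0.15, 0.16, 0.17, 0.19, 0.21, 0.23, 0.26, 0.28])
r = bootstrap_corr(L, skin)
print("correlation:", np.corrcoef(L, skin)[0, 1].round(3))
print("68% bootstrap band:", np.percentile(r, [16, 84]).round(3))
```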
Some larvae of Drosophila infected by parasitic wasps are able to encapsulate the larvae of the parasitoid, and the emerging hosts present a visible melanized capsule in the abdomen. In this paper, a model is developed for estimating the infection rate RI from the rate of hosts presenting a capsule, HC. For Drosophila simulans parasitized by Leptopilina boulardi, the model RI = HC/(k + (1 - k)HC), with k = 0.123, is validated against experimental data. The validation process is based on a bootstrap strategy over the 12,870 ways of grouping 8 elementary experimental results out of 16: the theoretical curve is fitted to one data set and checked for overlap with the confidence rectangle established from the complementary data set. This validation appears to be independent of the confidence level. The infection-encapsulation model is applied to field observations in Tunisia and predicts high levels of infection, a prediction confirmed at Nasr'Allah by a direct measurement of the infection rate. The biological hypotheses underlying the model are discussed. The model allows one to follow the evolution of infection in population cages and in the wild simply by catching and counting adult hosts, without access to breeding sites, and it is generalisable to other species of hosts and parasitoids presenting the encapsulation reaction.
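A minimal sketch of the conversion formula and its inverse is given below, using the validated value k = 0.123; the example capsule rate is hypothetical.

```python
def infection_rate(hc, k=0.123):
    """RI = HC / (k + (1 - k) * HC): infection rate estimated from the observed
    fraction of hosts carrying a melanized capsule."""
    return hc / (k + (1.0 - k) * hc)

def capsule_rate(ri, k=0.123):
    """Inverse relation: expected capsule rate for a given infection rate."""
    return k * ri / (1.0 - (1.0 - k) * ri)

# Example: 40% of trapped adults show a capsule, implying a much higher infection rate.
hc = 0.40
print(f"HC = {hc:.2f}  ->  estimated RI = {infection_rate(hc):.2f}")
```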
Complex systems are composed of mutually interacting components and the output values of these components usually exhibit long-range cross-correlations. Using wavelet analysis, we propose a method of characterizing the joint multifractal nature of these long-range cross correlations, a method we call multifractal cross wavelet analysis (MFXWT). We assess the performance of the MFXWT method by performing extensive numerical experiments on the dual binomial measures with multifractal cross correlations and the bivariate fractional Brownian motions (bFBMs) with monofractal cross correlations. For binomial multifractal measures, we find the empirical joint multifractality of MFXWT to be in approximate agreement with the theoretical formula. For bFBMs, MFXWT may provide spurious multifractality because of the wide spanning range of the multifractal spectrum. We also apply the MFXWT method to stock market indices, and in pairs of index returns and volatilities we find an intriguing joint multifractal behavior. The tests on surrogate series also reveal that the cross correlation behavior, particularly the cross correlation with zero lag, is the main origin of cross multifractality.
This paper focuses on the generalization of several software reliability models and the derivation of confidence intervals for reliability assessment measures. First, we propose a gamma function model as a generalized model and discuss how to obtain confidence intervals from a data set by using a bootstrap scheme when the data set is small. A two-parameter numerical differentiation method is applied to the data set to estimate the model parameters. We also present several numerical illustrations of software reliability assessment.
The paper presents two alternative schemes for pricing European and American call options, both based on artificial neural networks. The first method uses binomial trees linked to an innovative stochastic volatility model based on wavelets and artificial neural networks. The wavelets provide a convenient signal/noise decomposition of the volatility in a nonlinear feature space, and the neural networks are used to infer future volatility levels from the wavelet feature space in an iterative manner. The bootstrap method provides 95% confidence intervals for the option prices. In the second approach, neural networks are trained with genetic algorithms to reverse-engineer the Black–Scholes formula: the standard Black–Scholes model provides a starting point for an evolutionary training process that yields improved option prices. Market option prices as quoted on the Chicago Board Options Exchange are used to compare the performance of the Black–Scholes model and the proposed option pricing schemes. The proposed models produce option prices that are as good as, and often better than, those of the conventional Black–Scholes formula.
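As a point of reference, the benchmark Black–Scholes European call price used for comparison can be computed as in the short sketch below; the contract parameters are hypothetical, and the wavelet/neural-network and genetic-algorithm pricers themselves are not reproduced.

```python
from math import exp, log, sqrt
from statistics import NormalDist

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call: the benchmark against which the
    neural-network pricers are compared."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    N = NormalDist().cdf
    return S * N(d1) - K * exp(-r * T) * N(d2)

# Hypothetical contract: spot 100, strike 105, 6 months, 2% rate, 25% volatility.
print(f"call price: {bs_call(100, 105, 0.5, 0.02, 0.25):.2f}")
```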