The residential sector in Thailand has been a fast-growing energy consumer, with consumption rising at about 6% per year since 1995. This sector contributes significantly to Thailand's rising electricity demand, especially during the COVID-19 pandemic. This study examines the characteristics of Thailand's residential electricity consumption and the factors driving its growth, using a system dynamics (SD) modeling approach to forecast long-term electricity consumption. The COVID-19 pandemic and the associated lockdown can also be seen as a forced social experiment, with the findings demonstrating how resources are used under particular circumstances. Four key factors affecting electricity demand were used in the SD model development: (1) working and studying from home, (2) socio-demographic change, (3) temperature change, and (4) GDP growth. Primary data from a questionnaire survey and secondary data were used as model inputs. The simulation results reveal that behavioral changes involving higher-wattage appliances have a large impact on overall electricity consumption. The pressure to work and study at home contributes to increases in residential electricity consumption during and after the COVID-19 pandemic. The government and related agencies may use these results to plan long-term electricity supply.
We have translated fractional Brownian motion (FBM) signals into a text based on two "letters", as if the signal fluctuations corresponded to a constant-stepsize random walk. We have applied the Zipf method to extract the ζ′ exponent relating word frequency to rank on a log–log plot. We have studied the variation of the Zipf exponent(s) giving the relationship between the frequency of occurrence of words of length m < 8 made of these two letters: ζ′ varies as a power law in m. We have also examined how the ζ′ exponent of the Zipf law is influenced by a linear trend and by the slope of that trend. We can distinguish finite-size effects, and the results depend on whether the starting FBM is persistent or not, i.e., on the FBM Hurst exponent H. It then seems numerically established that the Zipf exponent of a persistent signal is more influenced by the trend than that of an antipersistent signal. The conjectured law ζ′ = |2H − 1| appears to hold only near H = 0.5. We have also introduced considerations based on the notion of a time-dependent Zipf law along the signal.
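As a rough illustration of the word-counting step described above, the sketch below maps an ordinary coin-flip walk (standing in for the H = 0.5 FBM case; the paper's actual FBM generation is not reproduced here) onto a two-letter text and fits the slope of log frequency versus log rank. The function name `zipf_exponent`, the letters 'u'/'d', and the series length are illustrative choices of ours, not the authors' code.

```python
import math
import random

def zipf_exponent(letters, m):
    """Rank the frequencies of all length-m words in a two-letter
    text and fit the slope of log(frequency) vs log(rank)."""
    counts = {}
    for i in range(len(letters) - m + 1):
        word = letters[i:i + m]
        counts[word] = counts.get(word, 0) + 1
    freqs = sorted(counts.values(), reverse=True)
    xs = [math.log(r + 1) for r in range(len(freqs))]
    ys = [math.log(f) for f in freqs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return -num / den  # positive when frequency decays with rank

random.seed(0)
# An unbiased coin-flip walk stands in for the H = 0.5 FBM case:
# 'u' marks an upward fluctuation, 'd' a downward one.
text = "".join(random.choice("ud") for _ in range(20000))
zeta = zipf_exponent(text, 5)
```

For an unbiased walk all length-m words are roughly equally likely, so the fitted exponent stays near zero; persistence or a superposed trend skews the word frequencies and raises it.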
The classification of multivariate time-varying data finds application in several fields, such as economics, finance, marketing research, psychometrics, bioinformatics, medicine, signal processing, and pattern recognition. In this paper, adopting an exploratory formalization, we propose different unsupervised clustering models for multivariate time arrays (objects × quantitative variables × times). These models fall into two approaches: the cross-sectional and the longitudinal. In the first, the objects observed at each time are classified, and the classifications obtained at different time instants are then compared. In the second, we cluster the time trajectories of the objects, obtaining a single classification by comparing the instantaneous and evolutive features of the trajectories. In particular, this work analyzes the second approach in detail, with reference to the so-called single- and double-step procedures. Geometric, correlative, instantaneous, evolutive, and trend characteristics of the multivariate time arrays are taken into account in the proposed clustering models. Furthermore, the fuzzy approach, which is particularly suitable for the dynamic classification problem, has been considered. Extensions of a cluster-validity criterion for the proposed fuzzy dynamic clustering models are also suggested. A socio-economic example concludes the paper.
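As an illustration of the fuzzy approach in its simplest static form, below is a one-dimensional fuzzy c-means sketch. This is the standard FCM algorithm, not the authors' dynamic clustering models; all names, the initialization scheme, and the data are ours.

```python
def fuzzy_c_means(points, c, m=2.0, iters=50):
    """One-dimensional fuzzy c-means: alternate membership and
    centre updates for a fixed number of passes (c >= 2)."""
    # Spread the initial centres across the sorted data.
    spread = sorted(points)
    centers = [spread[round(k * (len(spread) - 1) / (c - 1))]
               for k in range(c)]
    u = [[0.0] * c for _ in points]
    for _ in range(iters):
        # Membership update: inverse-distance shares with fuzzifier m.
        for i, x in enumerate(points):
            dists = [abs(x - v) + 1e-12 for v in centers]
            for k in range(c):
                u[i][k] = 1.0 / sum((dists[k] / d) ** (2.0 / (m - 1.0))
                                    for d in dists)
        # Centre update: membership-weighted means.
        for k in range(c):
            den = sum(u[i][k] ** m for i in range(len(points)))
            centers[k] = sum(u[i][k] ** m * x
                             for i, x in enumerate(points)) / den
    return centers, u

# Two well-separated 1-D clusters; memberships end up near 0/1.
pts = [0.0, 0.1, 0.2, 10.0, 10.1, 10.2]
centers, memb = fuzzy_c_means(pts, 2)
```

Each membership row sums to one, which is the property that lets objects belong partially to several clusters in a dynamic setting.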
A model for a stock market bubble is based upon the assumption that market agents incorporate trend and deviation of the actual price from the fundamental value into their expectations. Possible price dynamics are analyzed, and necessary conditions for large price deviations are obtained.
A series of stock prices typically shows a large trend and smaller fluctuations. These two parts are often studied together, as if parts of a single process, but they appear to have separate causes. In this paper, the two parts are analyzed separately, so that one does not distort the other and spurious interaction terms are avoided. This yields a model in which a wide range of features of stock price behavior can be identified. With logarithms of stock prices, the two parts become more comparable in size, leading to a simpler additive model. On a logarithmic scale, the stock prices show the trend as a straight line (which can be extrapolated), with added fluctuations filling a narrow band; the trend and fluctuations are thus separated. The trend appears to be largely generated by a positive feedback process describing investor behavior. The width of the fluctuation band does not grow with time, so positive feedback is not its cause. The movement of stock prices can therefore be understood by analyzing the trend and fluctuations as separate processes, the latter treated as a stationary stochastic process with a scale factor. This analysis is applied to a historical dataset (the S&P 500 index of daily prices from February 1928). Here, the fluctuations are autocorrelated over short time intervals; there is little structure, except for market crash periods, when variability increases. The slope of the trend showed some jumps, not predictable from price history. This approach to modeling describes many aspects of stock price behavior that are usually discussed in behavioral finance.
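A minimal sketch of the separation step on a synthetic log-price series: fit a straight line to the logarithms and keep the residuals as the fluctuation band. The function name, the synthetic slope of 0.01 per step, and the noise level are assumptions of ours, not the paper's.

```python
import random

def separate_trend(log_prices):
    """Least-squares straight line through a log-price series;
    returns (intercept, slope, residual fluctuations)."""
    n = len(log_prices)
    xs = list(range(n))
    mx = sum(xs) / n
    my = sum(log_prices) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, log_prices)) / \
            sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    residuals = [y - (intercept + slope * x)
                 for x, y in zip(xs, log_prices)]
    return intercept, slope, residuals

random.seed(1)
# Synthetic series: exponential growth plus a bounded noise band.
log_p = [0.01 * t + random.gauss(0.0, 0.05) for t in range(500)]
a, b, resid = separate_trend(log_p)
```

On real data the residual band, not the fitted line, would then be analyzed as a stationary stochastic process with a scale factor.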
The Detrended Fluctuation Analysis (DFA) and its extensions (MF-DFA) have been proposed as robust techniques to determine possible long-range correlations in self-affine signals. However, many studies have reported the susceptibility of DFA to trends which give rise to spurious crossovers and prevent reliable estimations of the scaling exponents. Lately, several modifications of the DFA method have been reported with many different techniques for eliminating the monotonous and periodic trends. In this study, a smoothing algorithm based on the Orthogonal V-system (OVS) is proposed to minimize the effect of power-law trends, periodic trends, assembled trends and piecewise function trends. The effectiveness of the new method is demonstrated on monofractal data and multifractal data corrupted with different trends.
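For readers unfamiliar with the baseline method that the OVS smoothing modifies, here is a bare-bones first-order DFA; the function name, the white-noise test signal, and the choice of scales are illustrative assumptions of ours.

```python
import math
import random

def dfa1(signal, scales):
    """First-order DFA: integrate the mean-subtracted signal,
    linearly detrend it inside boxes of each scale s, and fit the
    slope of log F(s) against log s."""
    mean = sum(signal) / len(signal)
    profile, acc = [], 0.0
    for x in signal:
        acc += x - mean
        profile.append(acc)
    log_s, log_f = [], []
    for s in scales:
        n_boxes = len(profile) // s
        xs = list(range(s))
        x_mean = (s - 1) / 2.0
        x_var = sum((x - x_mean) ** 2 for x in xs)
        sq = 0.0
        for b in range(n_boxes):
            seg = profile[b * s:(b + 1) * s]
            y_mean = sum(seg) / s
            slope = sum((x - x_mean) * (y - y_mean)
                        for x, y in zip(xs, seg)) / x_var
            inter = y_mean - slope * x_mean
            sq += sum((y - (inter + slope * x)) ** 2
                      for x, y in zip(xs, seg))
        log_s.append(math.log(s))
        log_f.append(0.5 * math.log(sq / (n_boxes * s)))
    n = len(log_s)
    sx, sy = sum(log_s) / n, sum(log_f) / n
    return sum((a - sx) * (b - sy) for a, b in zip(log_s, log_f)) / \
           sum((a - sx) ** 2 for a in log_s)

random.seed(2)
white = [random.gauss(0.0, 1.0) for _ in range(4096)]
alpha = dfa1(white, [8, 16, 32, 64, 128])
```

For uncorrelated noise the fitted exponent comes out near 0.5; trends superposed on the input are exactly what bias this estimate and motivate pre-smoothing schemes such as the proposed OVS algorithm.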
The RR and RT time intervals extracted from the electrocardiogram measure, respectively, the duration of the cardiac cycle and of repolarization. The series of these intervals recorded during the exercise test are characterized by two trends: a decreasing one during the stress phase and an increasing one during recovery, separated by a global minimum. We model these series as a sum of a deterministic trend and random fluctuations, and estimate the trend using curve-extraction methods: running mean, polynomial fit, and multiscale wavelet decomposition. We estimate the minimum location from the trend. Data analysis performed on a group of 20 healthy subjects provides evidence that the minimum of the RR series precedes the minimum of the RT series, with a time delay of about 19 seconds.
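The running mean is the simplest of the three curve-extraction methods mentioned. The sketch below applies it to a synthetic V-shaped series mimicking the stress/recovery pattern; the window length, noise level, and function names are illustrative assumptions, not the study's settings.

```python
import random

def running_mean_trend(series, window):
    """Centered moving average; the window is truncated at the edges."""
    half = window // 2
    trend = []
    for i in range(len(series)):
        lo = max(0, i - half)
        hi = min(len(series), i + half + 1)
        trend.append(sum(series[lo:hi]) / (hi - lo))
    return trend

def minimum_location(trend):
    """Index of the global minimum of the estimated trend."""
    return min(range(len(trend)), key=trend.__getitem__)

random.seed(3)
# Decreasing stress phase, global minimum at t = 300, then recovery.
series = [abs(t - 300) + random.gauss(0.0, 5.0) for t in range(600)]
trend = running_mean_trend(series, 51)
t_min = minimum_location(trend)
```

Comparing `t_min` for two such series (e.g., RR and RT) would give the kind of delay estimate reported in the abstract.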
The amplitudes of the R and T waves of the electrocardiogram (ECG) recorded during the exercise test show large inter- and intra-individual variability in response to stress. We analyze a dataset of 65 normal subjects undergoing an ambulatory test. We model the R and T series in the framework of functional data, assuming that the individual series are realizations of a non-stationary process centered at the population trend. We test the time variability of this trend by computing a simultaneous confidence band and the zero crossings of its derivative. The analysis shows that the amplitudes of the R and T waves have opposite responses to stress, consisting respectively of a bump and a dip at the early recovery stage. Our findings support the existence of a relationship between the R and T wave amplitudes and, respectively, the diastolic and systolic ventricular volumes.
Drought is among the natural disasters that seriously impact the environment and human life. This study aims to explore the spatial pattern of drought using the percent of normal precipitation index (PNPI) in Fars Province, in southern Iran. To this end, a drought risk model based on data from 42 stations in Fars Province from 1990 to 2019 was evaluated. The model includes three criteria: maximum drought intensity in the period, drought trend, and maximum number of consecutive dry years. The final drought risk map was obtained as the arithmetic mean of the three indicators of intensity, continuity, and trend. The final hazard map and the three criterion maps were interpolated by the inverse distance weighting (IDW) method and classified into five risk classes: none, mild, moderate, severe, and very severe. The final vulnerability map shows that moderate-hazard areas (5% of the region), observed in the southern parts, are far less widespread than areas under severe hazard (83% of the region), observed in almost all parts of the region. According to the final vulnerability map, about 94% of the area of Fars Province is under severe or very severe conditions. Overall, given its simplicity and its consideration of different dimensions of drought, this study may be utilised as a basic framework to evaluate drought hazards in other locations worldwide. In this respect, it is necessary to study the multiple aspects of this phenomenon for land use planning, resource management, and prevention of water and food crises. The model can therefore help users and administrations with executive initiatives.
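The IDW interpolation step can be sketched as follows. The station coordinates and risk scores are hypothetical, and `power=2` is the common default rather than the study's documented setting.

```python
def idw(known_points, query, power=2.0):
    """Inverse distance weighting: a weighted average of station
    values, with weights 1 / distance**power."""
    num = den = 0.0
    for (x, y, value) in known_points:
        d2 = (x - query[0]) ** 2 + (y - query[1]) ** 2
        if d2 == 0.0:
            return value  # query coincides with a station
        w = d2 ** (-power / 2.0)
        num += w * value
        den += w
    return num / den

# Hypothetical drought-risk scores at four station coordinates:
stations = [(0, 0, 1.0), (10, 0, 3.0), (0, 10, 2.0), (10, 10, 4.0)]
center = idw(stations, (5, 5))
```

Evaluating `idw` on a grid of query points and binning the values into five classes would produce a raster like the study's hazard maps.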
This paper describes a study of the impact of earthquakes on debris flows, with a focus on the Great Wenchuan Earthquake of 2008 in China. Landform, precipitation, and source material are the three key factors for debris flow initiation in the area surrounding Wenchuan. Classifications and examples of four types of debris flow initiation triggering (gully triggering, slope triggering, liquefaction triggering, and gully erosion triggering) are presented. The initiation mechanisms are attributed to hydraulic and geomechanical aspects. Actual debris flow cases linked with the Great Wenchuan Earthquake and other earthquakes in China are used to illustrate the increased magnitudes of debris flows due to the large amount of loose material created by seismic action. The critical precipitation for debris flows is reduced by the earthquake. It is predicted that the impact of the Great Wenchuan Earthquake on local debris flows will be significant in the next 5–6 years and much less in the following years (up to 20 years), after which the debris flow system will reach a relatively stable stage. This prediction is based on historical observations at other earthquake areas and on qualitative analysis of debris flow initiation mechanisms.
Real nonstationary time sequences are in general not monofractals; that is, they cannot be characterized by a single fractal dimension. It has been shown that many real time sequences are crossover-fractals: sequences with two fractal dimensions, one for short ranges and the other for long ranges. Here, we use empirical mode decomposition (EMD) to decompose monofractals into several intrinsic mode functions (IMFs) and then use partial sums of the IMFs decomposed from two monofractals to construct crossover-fractals. The scale-dependent fractal dimensions of these crossover-fractals are checked by the inverse random midpoint displacement method (IRMD).
Empirical mode decomposition (EMD) lacks theoretical support. We propose a piecewise monotonous model for EMD and prove that, under mild conditions, the trend-subtracting iteration converges and the IMF-separating procedure terminates in finitely many steps. Experiments are implemented and compared with the classical EMD.
The estimation and significance testing of the first-order autoregressive (AR1) coefficient in short time series with trends are examined. The purpose is to identify the difficulties that analysis procedures must address to obtain better results. The delta recursive AR1 estimator rδ and the Sen–Theil trend estimator are viable for short-sequence applications. Significance testing for rδ has low power, but the existence of a trend has negligible influence on estimation and testing. The common practice of trend removal before AR1 estimation gives poorer results. Application to air quality data showed that this can greatly change conclusions. Implications for analysis are discussed.
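The Sen–Theil estimator mentioned above is the median of all pairwise slopes, which makes it robust to outliers in short records. A minimal sketch, with example data of our own choosing:

```python
def theil_sen_slope(series):
    """Median of the pairwise slopes (y_j - y_i) / (j - i),
    a trend estimator robust to outliers."""
    slopes = sorted((series[j] - series[i]) / (j - i)
                    for i in range(len(series))
                    for j in range(i + 1, len(series)))
    mid = len(slopes) // 2
    if len(slopes) % 2:
        return slopes[mid]
    return 0.5 * (slopes[mid - 1] + slopes[mid])

# A short trended sequence with one large outlier, as might occur
# in a short air-quality record; the true slope is 2 per step.
data = [2 * t for t in range(12)]
data[5] += 50
slope = theil_sen_slope(data)
```

An ordinary least-squares fit would be pulled noticeably by the spike at index 5; the median of pairwise slopes recovers the underlying trend exactly here.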
Because rainfall may hamper signal availability, rainfall variability and its effects on the environment and on communication have become a global concern. In this study, we investigate the trend and variability of rainfall in southwestern Nigeria and examine its effect on radio communication. Our results reveal a steady increase in rainfall and slightly unstable volume variation in southwestern Nigeria. The study also reveals that the tendency toward higher attenuation over the years was caused by the increasing rainfall trend, with variability across frequencies. Given this rising trend, radio communication infrastructure is likely to experience increasing outages and greater signal loss in future years. This outcome should serve as a useful tool for optimizing satellite link budgets and making better use of available bandwidth.
The Bloomberg Carbon Clock is a web-based and real-time estimate of the global average atmospheric CO2 level. The work was conceived to help non-scientists understand the seasonal nature of atmospheric composition, and offers a click-through explanation of the CO2 data’s meaning, history, and implications. The Carbon Clock was designed to satisfy both scientific rigor and aesthetic appeal.
Indonesia, a tropical maritime continent between the Pacific and Indian Oceans, frequently experiences extreme rainfall (ER) that leads to major disasters such as floods and landslides. These disasters affect economic activity and daily human life, so projecting future changes is important for reducing the impact of extreme rainfall in Indonesia. In this study, we examined the link between the annual maximum (AM) of the daily rainfall series and climate variability, including the El Niño–Southern Oscillation (ENSO), the Indian Ocean Dipole (IOD), and the Madden–Julian Oscillation (MJO), by applying statistical extreme value analysis to daily rainfall data (1985–2014) observed at ten meteorological stations around the Java and Makassar Islands. Using the trend-free pre-whitening (TFPW) Mann–Kendall test, the AM was found to have increased significantly, by 0.983 mm/year (p < 0.001), probably due to a sea surface temperature (SST) anomaly intensified by global warming. Furthermore, based on the best-selected non-stationary generalized extreme value distribution model, the Waingapu and Luwuk stations covary significantly with ENSO, while the Perak and Jakarta stations covary insignificantly with the IOD. The intensified AM during La Niña and negative IOD phases tends to increase Indonesian SST, corresponding to low-level wind convergence that produces upward moisture flux and increases the cloud cover that enriches convective activity. By contrast, the MJO signal in the AM was less prominent at all stations, possibly due to weakened mesoscale circulation: during active MJO over the Indonesian region, increased cloud cover reduces solar radiation, a condition unfavorable for convection, so ER is reduced.
Finally, analyses of Indonesian ER variability reveal some sensitivity to climate variability in adjacent parts of the Indian and Pacific Oceans; through extreme value analysis, this study highlights the interaction between ER variability and sea-air phenomena around Indonesia.
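The core of the TFPW Mann–Kendall procedure is the ordinary Mann–Kendall test, sketched below; the pre-whitening step that removes lag-1 autocorrelation is omitted, and the strictly increasing example series is synthetic.

```python
import math

def mann_kendall(series):
    """Mann-Kendall trend test: returns the S statistic and the
    normal-approximation Z score (no tie correction)."""
    n = len(series)
    s = 0
    for i in range(n - 1):
        for j in range(i + 1, n):
            diff = series[j] - series[i]
            s += (diff > 0) - (diff < 0)
    var = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var)
    elif s < 0:
        z = (s + 1) / math.sqrt(var)
    else:
        z = 0.0
    return s, z

# A strictly increasing 30-value series: every pair is concordant,
# so S equals the number of pairs, 30 * 29 / 2 = 435.
rising = [0.983 * t for t in range(30)]
s_stat, z_score = mann_kendall(rising)
```

A |Z| above 1.96 rejects the no-trend hypothesis at the 5% level, which is the kind of significance statement the abstract reports for the AM series.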
A key element in the design of a repeated sample survey is the rotation pattern, which affects the variability of the time series of survey estimates and of the seasonally adjusted and trend estimates produced from them. This paper considers the choice of rotation pattern for seasonally adjusted and trend estimates obtained from a repeated survey, using X11-based methods.
To provide a data reference for protecting Beihai Silver Beach, this article uses the moving average method, Spearman rank correlation analysis, and the R/S rescaled range method to analyze typhoon tendencies at Beihai Silver Beach and to predict future tendencies in typhoon frequency, intensity, and extremes. The results show that over the past 60 years there has been a downward trend in typhoon frequency at Beihai Silver Beach, with a growing trend in intensity and extremes, though none of the three trends is very pronounced; in the future, typhoon frequency at Beihai Silver Beach will rise, as will intensity and extremes. To ensure that Silver Beach is not affected by future changes in typhoon activity, targeted and reasonable responses must be made.
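The R/S rescaled range method can be sketched as follows; the window sizes and the white-noise test series are illustrative assumptions of ours, and a real application would use the typhoon series in place of the noise.

```python
import math
import random

def rescaled_range(window):
    """R/S of one window: range of the mean-adjusted cumulative sum
    divided by the (population) standard deviation."""
    n = len(window)
    mean = sum(window) / n
    cum, acc = [], 0.0
    for x in window:
        acc += x - mean
        cum.append(acc)
    spread = max(cum) - min(cum)
    std = math.sqrt(sum((x - mean) ** 2 for x in window) / n)
    return spread / std

def hurst_rs(series, window_sizes):
    """Average R/S over non-overlapping windows of each size; the
    slope of log(R/S) vs log(size) estimates the Hurst exponent."""
    xs, ys = [], []
    for w in window_sizes:
        chunks = [series[i:i + w]
                  for i in range(0, len(series) - w + 1, w)]
        mean_rs = sum(rescaled_range(c) for c in chunks) / len(chunks)
        xs.append(math.log(w))
        ys.append(math.log(mean_rs))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / \
           sum((a - mx) ** 2 for a in xs)

random.seed(4)
noise = [random.gauss(0.0, 1.0) for _ in range(4096)]
h = hurst_rs(noise, [16, 32, 64, 128, 256])
```

An estimated Hurst exponent above 0.5 indicates a persistent series, which is how R/S analysis supports extrapolating a trend into the future.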