Data are the direct record of an event, such as a rocket launch, a natural phenomenon, or an engineering process. The record can be taken by our eyes, ears, electronic sensors, or mechanical devices. We analyze the data, detect signals, and make decisions. Data thus connect us to reality, and data analysis is the means by which we understand reality and uncover its underlying driving mechanisms. In this sense, data analysis is very different from data processing. The former emphasizes detailed decomposition and examination of the data to extract physical understanding, while the latter often relies on established algorithms and machines to output values of mathematical parameters.
In the era of big data, science and technology advance at an unprecedented pace. The inadequacies of traditional data analysis methods built on a priori bases have become glaringly clear. Complex data are neither linear nor stationary and cannot be well represented by an a priori basis. We have to face the reality of nonstationarity and nonlinearity in data. Fortunately, some methods, such as empirical mode decomposition (EMD), have already been developed to analyze nonlinear and nonstationary data. A viable way to innovate methodology is to break away from the traditional limitations of a priori bases and make a paradigm shift to adaptive analysis approaches, using iterative algorithms driven only by the data rather than by a fixed basis. EMD, the Bayesian method, Kalman filtering, and machine learning techniques may all be considered adaptive analysis methods. This journal encourages the further development of data analysis methods for nonlinear and nonstationary processes.
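To make "adaptive, based only on data" concrete, the following is a minimal sketch of EMD's sifting procedure, not a production implementation: envelopes are built by cubic-spline interpolation of the local extrema, their mean is subtracted repeatedly, and each extracted intrinsic mode function (IMF) is peeled off the signal. The function names (`sift_once`, `emd`), the fixed iteration counts, and the simple extrema-count stopping guard are illustrative choices, not part of any standard library API.

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import argrelextrema

def sift_once(x, t):
    """One sifting step: subtract the mean of the upper and lower envelopes."""
    maxima = argrelextrema(x, np.greater)[0]
    minima = argrelextrema(x, np.less)[0]
    if len(maxima) < 4 or len(minima) < 4:
        return None  # too few extrema: x is essentially a residual trend
    upper = CubicSpline(t[maxima], x[maxima])(t)  # envelope through the maxima
    lower = CubicSpline(t[minima], x[minima])(t)  # envelope through the minima
    return x - (upper + lower) / 2.0

def emd(x, t, n_imfs=3, n_sift=10):
    """Peel off intrinsic mode functions (IMFs) one at a time."""
    imfs, residual = [], x.copy()
    for _ in range(n_imfs):
        h = residual
        for _ in range(n_sift):  # crude fixed-count stopping rule
            h_new = sift_once(h, t)
            if h_new is None:
                break
            h = h_new
        imfs.append(h)
        residual = residual - h  # remove the extracted IMF and continue
    return imfs, residual
```

Note that nothing here is a prescribed basis: the envelopes, and hence the extracted modes, are determined entirely by the local extrema of the data themselves, which is the sense in which the decomposition is adaptive.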
This journal emphasizes:
This journal publishes original research articles, as well as method surveys, critical reviews of state-of-the-art research, and book reviews. Conference proceedings may be accepted on a case-by-case basis, at the invitation of the Editor-in-Chief.