The denoising of a natural signal/image corrupted by Gaussian white noise is a classical problem in signal/image processing. However, the denoising of high-dimensional data is still in its infancy. In this paper, we extend Sendur and Selesnick's bivariate wavelet thresholding from two-dimensional (2D) image denoising to three-dimensional (3D) data-cube denoising. Our study shows that bivariate wavelet thresholding remains valid for 3D data cubes. Experimental results show that bivariate wavelet thresholding on a 3D data cube outperforms 2D bivariate wavelet thresholding applied to each spectral band separately, VisuShrink, and Chen and Zhu's 3-scale denoising.
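As a minimal sketch (not the authors' code), the Sendur-Selesnick bivariate shrinkage rule that this work extends to 3D shrinks each wavelet coefficient jointly with its parent at the next coarser scale:

```python
import numpy as np

def bivariate_shrink(child, parent, sigma_n, sigma):
    """Sendur-Selesnick bivariate shrinkage: shrink each child coefficient
    jointly with its parent from the next coarser scale.
    sigma_n: noise standard deviation; sigma: (local) signal std."""
    r = np.sqrt(child**2 + parent**2)                  # joint magnitude
    gain = np.maximum(r - np.sqrt(3.0) * sigma_n**2 / sigma, 0.0)
    return child * gain / np.maximum(r, 1e-12)         # avoid divide-by-zero
```

For the 3D extension, the same rule would be applied to the coefficients of a 3D wavelet decomposition of the data cube.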
This paper starts with a brief discussion of so-called wavelet transforms, i.e., decompositions of arbitrary signals into localized contributions labelled by a scale parameter. The main features of the method are first illustrated through simple mathematical examples. Then we present the first applications of the method to the recognition and visualisation of characteristic features of speech and of musical sounds.
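As a simple illustration of such a scale-labelled decomposition (our example, assuming the PyWavelets package is available), a continuous wavelet transform of a chirp shows the signal's energy migrating across scales over time:

```python
import numpy as np
import pywt

t = np.linspace(0.0, 1.0, 1024)
signal = np.sin(2 * np.pi * (5 + 40 * t) * t)       # chirp: rising frequency
scales = np.arange(1, 128)
coefs, freqs = pywt.cwt(signal, scales, 'morl')     # Morlet wavelet
# coefs[i, j]: contribution localized at scale scales[i] and time t[j]
```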
In the digital world, artificial intelligence tools and machine learning algorithms are widely applied in the analysis of medical images to identify diseases and make diagnoses, for example, through recognition and classification. Speckle noise affects all medical imaging systems. Therefore, reducing the corrupting speckle noise is very important, since it degrades the quality of medical images and makes tasks such as recognition and classification difficult. Most existing denoising algorithms have been developed for additive white Gaussian noise (AWGN). However, speckle is not AWGN. Therefore, this work presents a novel speckle noise removal algorithm within the framework of Bayesian estimation and wavelet analysis. This research focuses on noise reduction with a Bayesian wavelet-based method because it provides good noise-reduction efficiency at short processing times. The subband decomposition of a logarithmically transformed image is best described by a family of heavy-tailed densities such as the Logistic distribution. This research therefore proposes a maximum a posteriori (MAP) estimator assuming Logistic random vectors for each parent-child wavelet coefficient pair of the noise-free log-transformed data and a log-normal density for the speckle noise. Moreover, a redundant wavelet transform, i.e., the cycle-spinning method, is applied in our proposed methods. In our experiments, our proposed methods give promising denoising results.
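A minimal sketch of the overall pipeline (our illustration, not the authors' estimator; the Logistic/log-normal MAP rule is replaced by simple soft thresholding, and PyWavelets is assumed):

```python
import numpy as np
import pywt

def denoise_speckle(img, wavelet='db4', shifts=8, thresh=0.1):
    """Log-transform multiplicative speckle into additive noise, then
    cycle-spin: average denoising results over circular shifts."""
    log_img = np.log(img + 1e-6)                   # speckle becomes additive
    acc = np.zeros_like(log_img)
    for s in range(shifts):
        shifted = np.roll(log_img, s, axis=(0, 1))
        coeffs = pywt.wavedec2(shifted, wavelet, level=3)
        den = [coeffs[0]] + [
            tuple(pywt.threshold(c, thresh, mode='soft') for c in level)
            for level in coeffs[1:]
        ]  # stand-in for the paper's Logistic-prior MAP shrinkage
        rec = pywt.waverec2(den, wavelet)[:log_img.shape[0], :log_img.shape[1]]
        acc += np.roll(rec, -s, axis=(0, 1))
    return np.exp(acc / shifts)                    # undo the log transform
```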
We study high frequency Nikkei stock index series and investigate what certain wavelet transforms suggest in terms of the volatility features underlying the observed returns process. Several wavelet transforms are applied for exploratory data analysis. One aim is to use wavelets as a pre-processing smoothing tool to de-noise the data; we believe that this procedure may help in identifying, estimating, and predicting the latent volatility. Evidence is shown of how a non-parametric statistical procedure such as wavelets may be useful for improving the generalization power of GARCH models when applied to de-noised returns.
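A hedged sketch of such a pre-processing step (our example, assuming PyWavelets; the de-noised series would then be passed to a standard GARCH fit):

```python
import numpy as np
import pywt

def denoise_returns(returns, wavelet='sym8', level=4):
    """Soft-threshold the wavelet detail coefficients of a return series
    with the universal threshold, as a smoothing pre-processing step."""
    coeffs = pywt.wavedec(returns, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # robust noise std
    lam = sigma * np.sqrt(2.0 * np.log(len(returns)))     # universal threshold
    coeffs[1:] = [pywt.threshold(c, lam, mode='soft') for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[:len(returns)]
```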
Gaussian noise is an important problem in computer vision. The novel methods that have become popular in recent years for Gaussian noise reduction are Bayesian techniques in the wavelet domain. In the wavelet domain, Bayesian techniques require a prior distribution of the wavelet coefficients. In the general case, the wavelet coefficients may be better modeled by a non-Gaussian density such as the Laplacian, two-sided gamma, or Pearson type VII density. However, statistical analysis of textural images suggests a Gaussian model. We therefore require a model that is flexible between the non-Gaussian and Gaussian cases; the Gumbel density is one such model. So, we present a new Bayesian estimator for Gumbel random vectors in additive white Gaussian noise (AWGN). The proposed method is applied to the dual-tree complex wavelet transform (DT-CWT) as well as the orthogonal discrete wavelet transform (DWT). The simulation results show that our proposed methods outperform the state-of-the-art methods qualitatively and quantitatively.
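The paper's closed-form estimator is not reproduced here; as a generic numerical sketch (our illustration), the MAP rule w_hat = argmax_w [log p(y|w) + log p(w)] with a Gumbel prior and Gaussian likelihood can be evaluated on a grid:

```python
import numpy as np

def gumbel_map(y, sigma_n, mu, beta, grid=None):
    """Numerical MAP estimate of a noise-free coefficient w from the
    observation y = w + n, n ~ N(0, sigma_n^2), with a Gumbel(mu, beta) prior."""
    if grid is None:
        grid = np.linspace(y - 5 * sigma_n, y + 5 * sigma_n, 2001)
    z = (grid - mu) / beta
    log_prior = -z - np.exp(-z) - np.log(beta)        # log Gumbel density
    log_lik = -0.5 * ((y - grid) / sigma_n) ** 2      # log Gaussian likelihood
    return grid[np.argmax(log_lik + log_prior)]
```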
The S-transform is a method of time-local spectral analysis (also known as time-frequency analysis), a modified short-time Fourier transform in which the width of the analyzing window scales inversely with frequency, in analogy with continuous wavelet transforms. However, if the time series is non-stationary and consists of a mix of Gaussian white noise and a deterministic signal, this type of scaling leads to larger apparent noise amplitudes at higher frequencies. In this paper, we introduce a modified S-transform window with a different scaling function that addresses this undesirable characteristic.
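For reference, a direct O(N^2) implementation of the standard S-transform with its 1/f-scaling Gaussian window (our sketch; the paper's modified window replaces this scaling) is:

```python
import numpy as np

def s_transform_row(h, f, fs=1.0):
    """One row of the standard S-transform of signal h at frequency f (Hz):
    a Gaussian window whose width scales as 1/f, times a Fourier kernel."""
    n = len(h)
    t = np.arange(n) / fs
    row = np.empty(n, dtype=complex)
    for i, tau in enumerate(t):
        w = (abs(f) / np.sqrt(2 * np.pi)) * np.exp(-((tau - t) ** 2) * f ** 2 / 2)
        row[i] = np.sum(h * w * np.exp(-2j * np.pi * f * t)) / fs
    return row
```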
The application of image processing in industry has shown remarkable success over the last decade, for example, in security and telecommunication systems. The denoising of a natural image corrupted by Gaussian noise is a classical problem in image processing, and denoising is an indispensable step in many image-processing pipelines. This paper is concerned with dual-tree complex wavelet-based image denoising using Bayesian techniques. One of the cruxes of Bayesian image denoising algorithms is estimating the statistical parameters of the image. Here, we employ maximum a posteriori (MAP) estimation to calculate the local observed variance, with a generalized Gamma density as the prior for the local observed variance and a Laplacian or Gaussian distribution for the noisy wavelet coefficients. Our selection of prior distribution is motivated by the efficiency and flexibility of the generalized Gamma density. The experimental results show that the proposed method yields good denoising results.
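As a hedged sketch of the locally adaptive idea (our simplification, with a plain Gaussian signal model in place of the paper's generalized-Gamma-prior MAP step):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_wiener(coeffs, sigma_n, win=7):
    """Shrink noisy wavelet coefficients using a locally estimated signal
    variance: w_hat = s2 / (s2 + sigma_n^2) * y, with s2 from a local window."""
    local_power = uniform_filter(coeffs ** 2, size=win)   # local E[y^2]
    s2 = np.maximum(local_power - sigma_n ** 2, 0.0)      # signal variance
    return coeffs * s2 / (s2 + sigma_n ** 2)
```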
The paper presents an informal review of some techniques available for signal analysis. In the interpretation of biomedical signals, the identification of hidden transient phenomena in the spectrum can play a crucial role for diagnostic purposes. Since most biological signals are nonstationary, the Fourier transform is not sufficient to detect possible transient phenomena in the spectrum; therefore, improvements to the Fourier transform technique have been made by means of window functions in the transformation kernel. Some of the most important features of recent developments in signal analysis are discussed here, with special focus on the uncertainty principle governing any time–frequency analysis.
In many applications, feature selection is obvious; but in medical domains, selecting features and creating a feature vector may require more effort. The wavelet transform (WT) technique is used to identify the characteristic points of an electrocardiogram (ECG) signal with fairly good accuracy, even in the presence of severe high-frequency and low-frequency noise. Principal component analysis (PCA) is a suitable technique for ECG data analysis, feature extraction, and image processing; it is an important technique that is not based upon a probability model. The aim of the paper is to derive better diagnostic parameters for reducing the size of ECG data while preserving morphology, which can be done by PCA. In this analysis, PCA is used for decorrelation of ECG signals, noise, and artifacts from various raw ECG data sets. The aim of this paper is twofold: first, to describe an elegant algorithm that uses WT alone to identify the characteristic points of an ECG signal; and second, to use a composite WT-based PCA method for redundant data reduction and better feature extraction. PCA scatter plots can be observed as a good basis for feature selection to account for cardiac abnormalities. The study is analyzed with higher-order statistics, in contrast to the conventional methods that use only geometric characteristics of feature waves and lower-order statistics. A new algorithm, viz. the PCA variance estimator, is developed for this analysis, and the results are also obtained for different combinations of leads to find correlations for feature classification and useful diagnostic information. PCA scatter plots of various chest and augmented ECG leads are obtained to examine the varying orientations of the ECG data in different quadrants, indicating the cardiac events and abnormalities. The efficacy of the PCA algorithm is tested on different leads of 12-channel ECG data; file no. 01 of the Common Standards for Electrocardiography (CSE) database is used for this study. Better feature extraction is obtained for some specific combinations of leads, and significant improvement in signal quality is achieved by identifying the noise and artifact components. The quadrant analysis discussed in this paper highlights the filtering requirements for further ECG processing after performing PCA, as a primary step for decorrelation and dimensionality reduction. The values of the parameters obtained from the results of PCA are also compared with those of wavelet methods.
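A minimal sketch of the decorrelation step (our example, not the paper's full algorithm): PCA of a leads-by-samples ECG matrix via the SVD, returning scatter-plot coordinates and the variance explained per component:

```python
import numpy as np

def ecg_pca(X, n_components=3):
    """PCA of a multi-lead ECG record X (rows = leads, columns = samples).
    Returns per-lead scores and the fraction of variance per component."""
    Xc = X - X.mean(axis=1, keepdims=True)             # center each lead
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    var_explained = s ** 2 / np.sum(s ** 2)
    scores = U[:, :n_components] * s[:n_components]    # scatter-plot coords
    return scores, var_explained
```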
Since Donoho et al. proposed the wavelet thresholding method for signal denoising, many different denoising approaches have been suggested. In this paper, we present three different wavelet shrinkage methods, namely NeighShrink, NeighSure and NeighLevel. NeighShrink thresholds the wavelet coefficients based on Donoho's universal threshold and the sum of the squares of all the wavelet coefficients within a neighborhood window. NeighSure adopts Stein's unbiased risk estimator (SURE) instead of the universal threshold of NeighShrink so as to obtain the optimal threshold with minimum risk for each subband. NeighLevel uses parent coefficients in a coarser level as well as neighbors in the same subband. We also apply a multiplying factor to the optimal universal threshold in order to obtain better denoising results, and we found that the value of this constant is about the same for different kinds and sizes of images. Experimental results show that our methods give comparatively higher peak signal-to-noise ratio (PSNR), are much more efficient, and have fewer visual artifacts compared to other methods.
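As a sketch of the NeighShrink rule under the definitions above (our implementation, assuming SciPy): each coefficient is scaled by max(1 - lambda^2 / S^2, 0), where S^2 is the sum of squared coefficients in a window around it and lambda is the universal threshold:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def neighshrink(coeffs, sigma_n, n_samples, win=3):
    """NeighShrink: shrink each wavelet coefficient by
    max(1 - lambda^2 / S^2, 0), S^2 = sum of squares in a win x win window."""
    lam2 = 2.0 * sigma_n ** 2 * np.log(n_samples)           # universal threshold^2
    S2 = uniform_filter(coeffs ** 2, size=win) * win ** 2   # neighborhood sum
    beta = np.maximum(1.0 - lam2 / np.maximum(S2, 1e-12), 0.0)
    return beta * coeffs
```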
First, this paper is concerned with wavelet-based image denoising using a Bayesian technique. In the conventional denoising process, the parameters of the probability density function (PDF) are usually calculated from the first few moments, i.e., the mean and variance. In the first part of our work, a new image denoising algorithm based on Pearson type VII random vectors is proposed. This PDF is used because it allows higher-order moments to be incorporated into the probabilistic model of the noiseless wavelet coefficients. One of the cruxes of Bayesian image denoising algorithms is estimating the variance of the clean image. Here, a maximum a posteriori (MAP) approach is employed not only for noiseless wavelet-coefficient estimation but also for local observed variance acquisition. For the local observed variance estimation, the selection of the noisy wavelet-coefficient model, either a Laplacian or a Gaussian distribution, is based upon the corrupting noise power, and a Gamma distribution is used as the prior for the variance. Our selection of prior is motivated by analytical and computational tractability. In our experiments, our proposed method gives promising denoising results with moderate complexity. Finally, our image denoising method can be extended simply to audio/speech processing by forming a matrix representation whose rows are time segments of the digital speech waveform. In this way, our image denoising methods can be exploited to improve the performance of various audio/speech tasks, e.g., denoised enhancement of voice activity detection to capture voiced speech, which is needed for speech coding and voice conversion applications. Moreover, one voice abnormality detection task, oropharyngeal dysphagia classification, also requires a denoising method to improve signal quality in elderly patients. We provide simple speech examples to demonstrate the prospects of our techniques.
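The speech extension mentioned above can be sketched simply (our illustration, with hypothetical helper names): stack fixed-length frames of the waveform into a matrix, apply any 2D image denoiser, and flatten back:

```python
import numpy as np

def speech_as_image(x, frame_len=256):
    """Reshape a 1D speech waveform into a 2D matrix whose rows are
    consecutive time segments, so 2D image denoisers can be applied."""
    n_frames = len(x) // frame_len
    return x[:n_frames * frame_len].reshape(n_frames, frame_len)

def image_to_speech(M):
    """Inverse of speech_as_image: flatten the rows back into a waveform."""
    return M.reshape(-1)
```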
This paper proposes a novel approach for feature extraction based on the segmentation and morphological alteration of handwritten multi-lingual characters. We explored multi-resolution and multi-directional transforms such as the wavelet, curvelet, and ridgelet transforms to extract classifying features of handwritten multi-lingual images. The pros and cons of each multi-resolution algorithm are discussed, and we conclude that curvelet-based feature extraction is the most promising for multi-lingual character recognition. We also apply morphological operations such as thinning and thickening, and then perform feature-level fusion in order to create a robust feature vector for classification. The classification is performed with K-nearest neighbor (K-NN) and support vector machine (SVM) classifiers, and their relative performance is compared. We experiment with our in-house dataset, compiled in our lab by more than 50 personnel.
In this paper, we present new Bayesian estimators for adaptive generalized Gaussian (GG) random vectors in additive white Gaussian noise (AWGN). The derivations are an extension of existing results for Pearson type VII random vectors in AWGN. The Pearson type VII distribution is one that has been used successfully for image denoising; however, it requires higher-order statistical moments, such as the mean, variance, and kurtosis, to fit the data. In the literature, where higher-order statistics are used, better performance can be obtained, but at much higher computational complexity. The adaptive GG random vector is similar to the Pearson type VII random vector, but this special case requires only the first few statistical moments through an adaptive parameter, so the proposed method can be computed very quickly, without any complex steps. In fact, the adaptive parameter of the adaptive GG density is a function of the standard deviation. Here, we employ minimum mean square error (MMSE) estimation to calculate local observed variances, with a gamma density prior for the local observed variances and a Gaussian distribution for the noisy wavelet coefficients. In our experiments, our proposed method gives promising denoising results with moderate complexity.
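A common way to fit a GG shape parameter from low-order moments (our sketch of standard moment matching, not necessarily the authors' exact adaptive rule): the ratio (E|x|)^2 / E[x^2] depends only on the shape nu and can be inverted numerically:

```python
import numpy as np
from scipy.special import gamma

def gg_shape(x, nus=np.linspace(0.1, 4.0, 400)):
    """Estimate the generalized Gaussian shape nu by matching the ratio
    (E|x|)^2 / E[x^2] = Gamma(2/nu)^2 / (Gamma(1/nu) * Gamma(3/nu))."""
    target = np.mean(np.abs(x)) ** 2 / np.mean(x ** 2)
    ratio = gamma(2.0 / nus) ** 2 / (gamma(1.0 / nus) * gamma(3.0 / nus))
    return nus[np.argmin(np.abs(ratio - target))]   # grid inversion
```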
In optical techniques, noise is a classical problem in medical image processing. Recently, there has been considerable interest in using the wavelet transform with Bayesian estimation as a powerful tool for recovering an image from noisy data. In the wavelet domain, if a Bayesian estimator is used for the denoising problem, the solution requires prior knowledge of the distribution of the wavelet coefficients. Indeed, wavelet coefficients may be better modeled by a super-Gaussian density, which can be generated by a Gaussian scale mixture (GSM). So, we present a new minimum mean square error (MMSE) estimator for a spherically-contoured GSM with Maxwell distribution in additive white Gaussian noise (AWGN). We compare our proposed method to a current state-of-the-art method on standard test images and quantify the achieved performance improvement.
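To illustrate the prior construction (our sketch), a GSM sample is a Gaussian vector scaled by a random multiplier; drawing the multiplier from a Maxwell distribution, i.e., the norm of a 3D standard Gaussian, yields one such super-Gaussian, heavy-tailed density:

```python
import numpy as np

def gsm_maxwell_samples(n, dim=2, seed=0):
    """Draw samples w = sqrt(z) * u from a Gaussian scale mixture:
    u ~ N(0, I) and sqrt(z) Maxwell-distributed (norm of a 3D Gaussian)."""
    rng = np.random.default_rng(seed)
    u = rng.standard_normal((n, dim))
    z_sqrt = np.linalg.norm(rng.standard_normal((n, 3)), axis=1)  # Maxwell
    return u * z_sqrt[:, None]          # heavier tails than a Gaussian
```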
The random interpolation average (RIA) is a simple yet effective denoising method. It first applies several random interpolations to a noisy signal, then applies wavelet transform (WT) denoising to each interpolated signal, and finally averages all of the denoised signals. In this paper, multiple wavelet bases and a level-dependent threshold estimator are used in the RIA scheme to make it more suitable for electrocardiogram (ECG) signal denoising. A synthetic ECG signal, real ECG signals, and four types of noise were used in comparison experiments. The results show that the proposed method provides the best signal-to-noise ratio (SNR) improvement in the denoising of both the synthetic ECG signal and the real ECG signals. For real ECG signal denoising, the average SNR improvement is 5.886 dB, while the results of the RIA scheme with a single wavelet basis (RIAS), the fully translation-invariant method [TI (fully)], and WT denoising using hard thresholding [WT (hard)] are 5.577, 5.274 and 3.484 dB, respectively.
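A hedged sketch of the RIA loop with multiple wavelet bases and level-dependent thresholds (our reading of the scheme, assuming PyWavelets; the random interpolation is realized here as linear interpolation on a randomly shifted grid):

```python
import numpy as np
import pywt

def ria_denoise(x, wavelets=('db4', 'sym8', 'coif3'), n_rounds=12, seed=0):
    """Random interpolation average: denoise several randomly interpolated
    copies of x with different wavelet bases, then average the results."""
    rng = np.random.default_rng(seed)
    t = np.arange(len(x), dtype=float)
    acc = np.zeros(len(x))
    for k in range(n_rounds):
        shift = rng.uniform(0.0, 1.0)
        xi = np.interp(t + shift, t, x)                  # random interpolation
        w = wavelets[k % len(wavelets)]                  # multiple bases
        coeffs = pywt.wavedec(xi, w, level=4)
        for j in range(1, len(coeffs)):                  # level-dependent threshold
            sigma_j = np.median(np.abs(coeffs[j])) / 0.6745
            lam_j = sigma_j * np.sqrt(2.0 * np.log(len(x)))
            coeffs[j] = pywt.threshold(coeffs[j], lam_j, mode='soft')
        yi = pywt.waverec(coeffs, w)[:len(x)]
        acc += np.interp(t, t + shift, yi)               # map back to the grid
    return acc / n_rounds
```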
In this paper, an improved method for de-noising bearing vibration signals to detect bearing faults is proposed. The method is based on discrete wavelet transforms, coefficient shrinkage methods, and fast Fourier transforms. The frequency sub-bands for bearing fault conditions and the power spectra of the de-noised signals are obtained, indicating better detection results compared with conventional methods.
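A minimal sketch of this chain (our example, assuming PyWavelets): hard shrinkage of the wavelet coefficients of the vibration signal, followed by the power spectrum of the de-noised result:

```python
import numpy as np
import pywt

def denoised_spectrum(vib, fs, wavelet='db8', level=4):
    """De-noise a bearing vibration signal by hard wavelet shrinkage,
    then return the frequency axis and power spectrum of the result."""
    coeffs = pywt.wavedec(vib, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745       # robust noise std
    lam = sigma * np.sqrt(2.0 * np.log(len(vib)))
    coeffs[1:] = [pywt.threshold(c, lam, mode='hard') for c in coeffs[1:]]
    clean = pywt.waverec(coeffs, wavelet)[:len(vib)]
    spectrum = np.abs(np.fft.rfft(clean)) ** 2 / len(vib)  # power spectrum
    freqs = np.fft.rfftfreq(len(vib), d=1.0 / fs)
    return freqs, spectrum
```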
Flexible Alternating Current Transmission Systems (FACTS) based on Voltage Source Converters (VSCs) are used for voltage regulation in transmission and distribution systems. FACTS can rapidly supply the dynamic VARs required for voltage support during system faults. The apparent impedance is influenced by the reactive power injected or absorbed by FACTS, which results in the under-reaching or over-reaching of distance relays. This paper presents simulation results of the application of distance relays for the protection of transmission systems employing FACTS controllers. The complete digital simulation of FACTS within a transmission system is performed in the MATLAB/Simulink environment using the Power System Blockset (PSB). An efficient wavelet-transform-based method for fault detection, classification, and location is proposed using an artificial neural network (ANN) technique that is almost independent of the fault impedance, fault distance, and fault inception angle of transmission-line fault currents with FACTS controllers.