
  • Article (Open Access)

    SOME FRACTAL DIMENSION ESTIMATE ALGORITHMS AND THEIR APPLICATIONS TO ONE-DIMENSIONAL BIOMEDICAL SIGNALS

    Fractals can model many classes of time-series data. The fractal dimension is an important characteristic of fractals because it contains information about their geometric structure at multiple scales. Covering methods, such as the box-counting (BC) method, are an efficient class of approaches for estimating the fractal dimension. In this paper, the differential box-counting (DBC) approach, originally developed for 2-D applications, is modified and applied to the 1-D case. In addition, two algorithms, called 1-D shifting-DBC (SDBC-1D) and 1-D scanning-BC (SBC-1D), are proposed for 1-D signal analysis. The fractal dimensions of 1-D biomedical pulse and ECG signals are calculated.
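    A minimal sketch of plain 1-D box counting on a signal's graph, to make the covering idea concrete; the SDBC-1D and SBC-1D variants proposed in the paper are not reproduced here, and the scales, normalization, and test signal are illustrative assumptions.

```python
# Minimal 1-D box-counting sketch (assumptions: uniform sampling, square
# boxes on the normalized graph of the signal). Not the paper's SDBC/SBC.
import numpy as np

def box_count_dimension(x, scales=(2, 4, 8, 16, 32, 64)):
    """Estimate the fractal dimension of a 1-D signal's graph by box counting."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # rescale amplitude to the same range as the sample index so that square
    # boxes of side `s` make sense on the signal's graph
    x = (x - x.min()) / (x.max() - x.min() + 1e-12) * (n - 1)
    counts = []
    for s in scales:                          # s = box side length in samples
        n_boxes = 0
        for start in range(0, n, s):
            seg = x[start:start + s]
            # number of vertical boxes of height s touched in this column
            n_boxes += int(np.floor(seg.max() / s) - np.floor(seg.min() / s)) + 1
        counts.append(n_boxes)
    # N(s) ~ s^(-D), so the slope of log N vs log s gives -D
    slope, _ = np.polyfit(np.log(scales), np.log(counts), 1)
    return -slope

# toy usage: a Brownian-motion-like signal has graph dimension near 1.5
rng = np.random.default_rng(0)
signal = np.cumsum(rng.standard_normal(4096))
print(box_count_dimension(signal))
```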

  • Article (Open Access)

    AN INTEGRATED ECG COMPRESSION AND ERROR PROTECTION SCHEME FOR BLUETOOTH TRANSMISSION IN HOME TELE-CARE APPLICATIONS

    Recently, using wireless technology to transmit physiological signals, such as the electrocardiogram (ECG), for home tele-care systems has received great attention. Although wireless transmission provides mobility, it has to cope with the potential problems of limited bandwidth and induced interference. In this study, we propose an integrated design in which a state-of-the-art compression algorithm called SPIHT (set partitioning in hierarchical trees) is combined with an unequal error protection scheme to address these problems for ECG signals transmitted in Bluetooth packets. In this design, part of the SPIHT bit stream behaves like a fragile variable-length code and needs stronger protection with a forward error correction (FEC) code; the rest of the bit stream is only lightly protected. The simulation results show that the 2/3-rate FEC code in DM packets works effectively without the proposed scheme when the interference is fairly small. With the proposed scheme, however, the quality of the received ECG signals is usually much better than without it when stronger interference from fading channels and wireless LAN is encountered in an indoor environment. Consequently, the important features of an ECG waveform, such as the P wave, QRS complex, and T wave, can be well preserved at the receiver with clinically acceptable reconstruction quality. In addition, the data compression in the proposed scheme saves total transmission power and time, and therefore reduces potential interference with other wireless devices.

  • Article (Open Access)

    USING CORRELATION COEFFICIENT IN ECG WAVEFORM FOR ARRHYTHMIA DETECTION

    Arrhythmia is a class of cardiac disease that can lead to death and may pose an untreatable danger. The most common cardiac arrhythmia is the ventricular premature beat. The main purpose of this study is to develop an efficient arrhythmia detection algorithm based on the morphological characteristics of arrhythmias, using the correlation coefficient of the ECG signal. Subjects included normal subjects, patients with atrial premature contraction (APC), and patients with premature ventricular contraction (PVC). So and Chan's algorithm was used to find the locations of the QRS complexes. Once the QRS complexes were detected, the correlation coefficient and the RR interval were used to quantify the similarity of arrhythmias. The algorithm was tested on the MIT-BIH arrhythmia database, and every QRS complex in the database was classified. The total numbers of test data were 538, 9, and 24 for normal beats, APCs, and PVCs, respectively. The results are presented in terms of performance, positive prediction, and sensitivity. A high overall performance (99.3%) for the classification of the different categories of arrhythmic beats was achieved. The positive prediction results of the system reach 99.44%, 100%, and 95.35% for normal beats, APCs, and PVCs, respectively. The sensitivity results are 99.81%, 81.82%, and 95.83% for normal beats, APCs, and PVCs, respectively. The results show that the system is accurate and efficient in classifying arrhythmias resulting from APC or PVC. The proposed arrhythmia detection algorithm is therefore helpful for clinical diagnosis.
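    To make the morphology-plus-timing idea concrete, here is a hedged sketch that correlates a detected beat with a normal-beat template and combines the result with the RR interval; the window length, thresholds, and decision rules are illustrative assumptions, not the exact settings of So and Chan's pipeline or of this paper.

```python
# Hedged sketch: correlation-coefficient beat matching plus RR-interval timing.
# Thresholds and labels are illustrative assumptions.
import numpy as np

def classify_beat(beat, template, rr, mean_rr, corr_thresh=0.97, rr_tol=0.15):
    """Label a beat as 'normal', 'APC' or 'PVC' from shape and timing."""
    r = np.corrcoef(beat, template)[0, 1]        # morphology similarity
    premature = rr < (1.0 - rr_tol) * mean_rr    # early beat?
    if r >= corr_thresh and not premature:
        return "normal"
    if r >= corr_thresh and premature:
        return "APC"     # normal shape, early timing
    return "PVC"         # changed (e.g., widened) morphology

# toy usage with synthetic windows centred on the R peak
rng = np.random.default_rng(1)
template = np.exp(-np.linspace(-3, 3, 80) ** 2)            # idealized QRS
normal_beat = template + 0.02 * rng.standard_normal(80)
print(classify_beat(normal_beat, template, rr=0.82, mean_rr=0.80))
```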

  • Article (Open Access)

    THE COMPARISON OF SSD ALGORITHM WITH OTHER ECG SAMPLING ALGORITHMS

    We previously proposed a novel and fast electrocardiogram (ECG) compression algorithm based on non-uniform sampling in the time domain [1]. It meets the real-time requirement for clinical application, and its compression performance is stable and uniform even for abnormal ECG signals. A criterion called the sum square difference (SSD) is defined as the error test equation. The algorithm, which uses the SSD to evaluate the error tolerance, is applied to the records of the MIT-BIH database (11-bit resolution, 360 Hz sampling rate). It belongs to the class of threshold-limited algorithms, but [1] does not discuss this class in much detail. In this paper we provide more comparisons among SSD, Fan, scan-along polygonal approximation (SAPA), maximum enclosed area (MEA), and the optimization algorithm (OPT), using two measures, the sample compression ratio (SCR) and the percent root mean squared difference (PRD) with a proper mean offset, which [1] does not adopt. The results show that SSD outperforms the above algorithms at the same computational complexity O(n). Moreover, the comparison with the best but time-consuming coder, OPT (O(n³)), shows how much the algorithm can still be improved.
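    The following sketch illustrates the threshold-limited idea behind SSD under a simple assumption: each retained segment is approximated by the straight line between its endpoints, and a segment grows until its SSD exceeds a tolerance. The greedy rule and the tolerance value are assumptions for illustration, not the exact coder of [1].

```python
# Hedged sketch of SSD-limited non-uniform sampling with a chord approximation.
import numpy as np

def ssd(segment):
    """Sum of squared differences between a segment and its end-to-end chord."""
    chord = np.linspace(segment[0], segment[-1], len(segment))
    return float(np.sum((segment - chord) ** 2))

def nonuniform_sample(x, tol):
    """Greedily extend each segment until its SSD exceeds the tolerance."""
    keep = [0]
    start = 0
    for end in range(2, len(x)):
        if ssd(x[start:end + 1]) > tol:
            keep.append(end - 1)    # last index that still met the tolerance
            start = end - 1
    keep.append(len(x) - 1)
    return np.array(keep)

# toy usage: a slow sine needs far fewer samples than the raw record
t = np.linspace(0, 1, 360)
x = np.sin(2 * np.pi * 3 * t)
idx = nonuniform_sample(x, tol=1e-3)
print(f"SCR = {len(x) / len(idx):.1f}")    # sample compression ratio
```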

  • Article (No Access)

    DESIGN AND IMPLEMENTATION OF ECG COMPRESSION ALGORITHM WITH CONTROLLABLE PERCENT ROOT-MEAN-SQUARE DIFFERENCE

    In this paper, the orthogonality of the coefficient matrices of wavelet filters is utilized to derive an energy equation relating a time-domain signal to its wavelet coefficients. Using the energy equation, the relationship between the wavelet-coefficient error and the reconstruction error is obtained. The errors considered here include the truncation error and the quantization error. This not only helps to control the reconstruction quality but also brings two advantages: (1) it is not necessary to perform the inverse transform to obtain the distortion caused by wavelet-based compression, which reduces the computational effort; (2) using the energy equation, we can search for a threshold value that attains a better compression ratio within a pre-specified percent root-mean-square difference (PRD). A compression algorithm with run-length encoding is proposed based on the energy equation. Finally, Matlab and the MIT-BIH database are used in simulations to verify the feasibility of the proposed method. The algorithm is also implemented on a DSP chip to examine its practicality and suitability. The required computation time for an ECG segment is less than 0.0786 s, which is fast enough to process real-time signals. As a result, the proposed algorithm is suitable for implementation on mobile ECG recording devices.
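    A hedged illustration of the energy argument: for an orthogonal wavelet, the reconstruction-error energy approximately equals the energy of the discarded coefficients, so the PRD can be predicted without an inverse transform and a threshold can be searched against a PRD budget. PyWavelets, the 'db4' wavelet, the target PRD, and the test signal are assumptions of this sketch, not the paper's implementation.

```python
# Hedged sketch: predict PRD from discarded wavelet-coefficient energy and
# search a threshold against a PRD budget, then verify by reconstruction.
import numpy as np
import pywt

rng = np.random.default_rng(2)
x = np.sin(2 * np.pi * np.linspace(0, 4, 1024)) + 0.05 * rng.standard_normal(1024)
coeffs = pywt.wavedec(x, "db4", level=5)
flat = np.concatenate(coeffs)

def predicted_prd(threshold):
    # error energy = energy of the coefficients that hard thresholding removes
    err = np.sum(flat[np.abs(flat) < threshold] ** 2)
    return 100.0 * np.sqrt(err / np.sum(x ** 2))

target = 2.0    # percent PRD budget (assumption)
ths = np.sort(np.abs(flat))
candidates = [t for t in ths if predicted_prd(t) <= target]
th = candidates[-1] if candidates else 0.0

# verify the prediction against the actual reconstruction error
kept = [pywt.threshold(c, th, mode="hard") for c in coeffs]
x_rec = pywt.waverec(kept, "db4")[: len(x)]
actual = 100.0 * np.sqrt(np.sum((x - x_rec) ** 2) / np.sum(x ** 2))
print(f"predicted {predicted_prd(th):.2f}%  actual {actual:.2f}%")
```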

  • Article (No Access)

    MINIATURE MODULES FOR MULTI-LEAD ECG RECORDING

    Remote monitoring systems for home health care have recently become a very active topic. Biomedical signals recorded by portable devices can be transmitted wirelessly over the Internet. In this paper, a miniature signal-conditioning module for ambulatory recording of electrocardiogram (ECG) signals was designed with high input impedance, high common-mode rejection ratio (CMRR), low power, appropriate amplification and filtering, and automatic suppression of offset voltage. For early detection of acute myocardial infarction (AMI), the device was extended to support 12-lead ECG recording. Thanks to the modular approach, the module can also accommodate the recording of other biomedical signals if its gain and pass-band are modified.

  • Article (No Access)

    EMERGENCY HEALTH CARE AND FOLLOW-UP TELEMEDICINE SYSTEM FOR RURAL AREAS BASED ON LABVIEW

    This paper presents the design and development of a prototype for remote ECG data transmission based on Internet-enabled health care services and telemedicine fundamentals. An ECG acquisition system developed by the authors acquires the ECG signal in the lead-II configuration from the patient and stores it in .lvm format on a PC interfaced to the patient module through RS-232. This unit (the data server) on the patient side then transfers the data to a remote client (on the doctor's side) using TCP/IP as the network protocol in the LabVIEW 8.20 environment. Using this system, a specialist doctor can be telematically present at the patient site and instruct medical personnel handling the patient. In recent years, more and more modern tools have found their way into the design, realization, and data processing of Internet-based e-health services. The telemedicine system demonstrated in this work combines real-time and store-and-forward operation.
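    The paper's implementation is in LabVIEW 8.20; purely as a generic illustration of the store-and-forward transfer step, the following Python sketch pushes a stored record from a data server to a remote client over TCP/IP. The host, port, and file names are hypothetical.

```python
# Generic store-and-forward sketch over TCP/IP (illustrative only; the paper
# uses LabVIEW). Host, port and file names are hypothetical.
import socket

def send_record(path="ecg_lead2.lvm", host="127.0.0.1", port=5005):
    """Data server on the patient side: push a stored ECG record to the client."""
    with open(path, "rb") as f, socket.create_connection((host, port)) as s:
        s.sendall(f.read())

def receive_record(path="received.lvm", host="0.0.0.0", port=5005):
    """Remote client on the doctor's side: accept one record and store it."""
    with socket.create_server((host, port)) as srv:
        conn, _ = srv.accept()
        with conn, open(path, "wb") as f:
            while chunk := conn.recv(4096):
                f.write(chunk)
```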

  • Article (No Access)

    OBSTRUCTIVE SLEEP APNEA CLASSIFICATION WITH ARTIFICIAL NEURAL NETWORK BASED ON TWO SYNCHRONIC HRV SERIES

    In the present study, obstructive sleep apnea (OSA) patients and non-OSA patients were classified into two groups using two synchronous heart rate variability (HRV) series obtained from electrocardiography (ECG) and photoplethysmography (PPG) signals. A linear synchronization method called cross power spectral density (CPSD), commonly used on HRV series, was applied to obtain high-quality signal features that discriminate OSA patients from controls. To classify simultaneous sleep ECG and PPG signals recorded from OSA and non-OSA patients, various feed-forward neural network (FFNN) architectures were used, and the mean relative absolute error (MRAE) was applied to the FFNN results to assess the effectiveness of the developed algorithm. The FFNN architectures were trained with various numbers of neurons and hidden layers. The results show that HRV synchronization is directly related to sleep respiratory signals. The CPSD of the HRV series can confirm the clinical diagnosis; both groups, as determined by an expert physician, can be classified with 99% accuracy by a single hidden-layer FFNN structure with an MRAE of 0.0623, in which the maximum magnitude and the phase of the CPSD curve are used as the two features. In future work, features taken from different physiological signals can be added to define a single feature that classifies apnea without error.
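    A hedged sketch of the CPSD feature extraction: the cross power spectral density of two synchronous HRV series is computed and its peak magnitude and phase are taken as the two features fed to the FFNN. The resampling rate, Welch settings, and toy series are illustrative assumptions.

```python
# Hedged sketch: two CPSD-derived features from a pair of synchronous HRV series.
import numpy as np
from scipy.signal import csd

def cpsd_features(hrv_ecg, hrv_ppg, fs=4.0):
    """Return the peak magnitude and its phase from the cross power spectrum."""
    f, pxy = csd(hrv_ecg, hrv_ppg, fs=fs, nperseg=256)
    k = np.argmax(np.abs(pxy))
    return np.abs(pxy[k]), np.angle(pxy[k])    # two features for the FFNN

# toy usage with two noisy, partially coupled series
rng = np.random.default_rng(3)
t = np.arange(0, 300, 1 / 4.0)
common = np.sin(2 * np.pi * 0.25 * t)          # shared respiratory-band rhythm
a = common + 0.3 * rng.standard_normal(len(t))
b = np.roll(common, 3) + 0.3 * rng.standard_normal(len(t))
print(cpsd_features(a, b))
```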

  • Article (No Access)

    COMPARISON OF TIME AND FREQUENCY METHODS FOR CARDIAC ANOMALIES RECOGNITION

    The main aim of this paper is to contribute to the design of an intelligent diagnosis system for detecting cardiac anomalies in acquired ECG (electrocardiogram) signals. To attain this goal, the authors developed two approaches for extracting the most relevant and significant parameters that can best classify real ECG signals into two classes, Normal and Abnormal. The first approach is a time-domain method that models ECGs using an RBF (radial basis function) neural network algorithm, whereas the second is a frequency-based method relying on the DWT (discrete wavelet transform), which projects the signal onto a time-scale (time-frequency) plane. In both approaches, the resulting data are submitted to the same SVM (support vector machine) classifier. The results show a good adjustment of the relevant parameters for each approach and reveal the most efficient combined processing-classification algorithm, which achieves classification rates of up to 100% while using a reduced number of parameters. This considerably simplifies the hardware for a real-time implementation. Finally, the authors compare both approaches and identify some implementation constraints, such as the sampling frequency for the frequency-based approach and the preprocessing and cutting for the time-domain approach.
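    As a concrete illustration of the frequency-based branch, the sketch below uses DWT sub-band energies as features for an SVM classifier; the wavelet, decomposition level, feature choice, and synthetic data are assumptions, not the paper's exact configuration.

```python
# Hedged sketch: DWT sub-band energies as features for an SVM classifier.
import numpy as np
import pywt
from sklearn.svm import SVC

def dwt_features(ecg, wavelet="db4", level=4):
    """Energy of each DWT sub-band as a compact time-frequency descriptor."""
    coeffs = pywt.wavedec(ecg, wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs])

# toy usage: two synthetic classes separated by their dominant rhythm
rng = np.random.default_rng(4)
t = np.linspace(0, 2, 720)
normal = [np.sin(2 * np.pi * 1.2 * t) + 0.1 * rng.standard_normal(t.size) for _ in range(20)]
abnormal = [np.sin(2 * np.pi * 3.0 * t) + 0.1 * rng.standard_normal(t.size) for _ in range(20)]
X = np.array([dwt_features(s) for s in normal + abnormal])
y = np.array([0] * 20 + [1] * 20)
clf = SVC(kernel="rbf").fit(X, y)
print(clf.score(X, y))
```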

  • Article (No Access)

    DRY ELECTRODE MATERIAL TESTING AND GAIN ANALYSIS FOR SINGLE ARM ECG SYSTEM

    Electrocardiography (ECG) is a medical diagnostic procedure that records the electrical activity of the heart and displays it as a waveform. Either wet or dry electrodes can be used to pick up the ECG waveform, and many dry-electrode studies have been carried out around the world. This paper focuses on acquiring the ECG from the left arm and the right arm separately and comparing the waveforms and the gain required by the system. In addition, different dry-electrode materials and dimensions are used to check the acquisition of the ECG waveform from a single arm. The dry electrodes employed in this study are copper, brass, phosphor bronze, and nickel silver. The dimensions used are 3 cm × 3 cm × 0.5 mm, 2.5 cm × 2.7 cm × 0.5 mm, 2.5 cm × 2.5 cm × 0.5 mm, 2 cm × 3 cm × 0.5 mm, 2 cm × 2 cm × 0.5 mm, and 1 cm × 3 cm × 0.5 mm. A total of five healthy subjects participated in the ECG acquisition. The results are promising: the gain required for the left-arm ECG system is lower than that needed for the right-arm system, and a clearer ECG is obtained from the left arm than from the right arm. All the dry electrodes used in the study can pick up the ECG except those of dimension 1 cm × 3 cm × 0.5 mm. Copper and brass provide more stable output than nickel silver and phosphor bronze.

  • Article (No Access)

    A COMPREHENSIVE APPROACH TOWARDS CLASSIFICATION AND PREDICTION OF VENTRICULAR TACHYCARDIA AND VENTRICULAR FIBRILLATION

    Ventricular tachycardia (VT) is a fast heart rate that arises from improper electrical activity in the ventricles of the heart. VT may eventually lead to lethal ventricular fibrillation (VF), which is characterized by a fast and irregular heart rhythm. Since the difference between VT and VF must be diagnosed by a specialist in a critical and stressful situation, the possibility of a wrong decision is not negligible. Here, various sets of ECG features belonging to different domains are used to investigate the predictability and discriminability of VT and VF episodes. Informative features from different domains, such as the correlation dimension (phase space) and the power spectrum (frequency domain), were extracted from electrocardiogram (ECG) signals to describe the amount of irregularity/variation during the episode. In addition, raw signal samples were used to assess the classification task based on time-domain features. Applying the correlation dimension, the power spectrum, and the raw ECG samples to an artificial neural network (ANN) classifier provides 91%, 92%, and 71% classification accuracy between VT and VF signals, respectively. To enrich the time-domain features, surrogate data were generated, which raises the time-domain result to 87% and indicates that ANNs are able to learn the dynamic nature of chaotic signals.
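    A hedged sketch of the phase-space feature: the correlation sum of a delay-embedded segment, whose log-log slope approximates the correlation dimension used here as an ECG feature. The embedding dimension, delay, radii, and test signal are illustrative assumptions.

```python
# Hedged sketch of a correlation-dimension estimate via the correlation sum
# of a delay embedding (Grassberger-Procaccia style). Parameters are assumptions.
import numpy as np

def correlation_dimension(x, dim=3, tau=5, radii=np.logspace(-1.5, 0, 10)):
    x = (x - x.mean()) / x.std()
    n = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    dists = d[np.triu_indices(n, k=1)]
    c = np.array([np.mean(dists < r) for r in radii])    # correlation sum C(r)
    slope, _ = np.polyfit(np.log(radii), np.log(c + 1e-12), 1)
    return slope

rng = np.random.default_rng(5)
segment = np.sin(2 * np.pi * np.linspace(0, 10, 1000)) + 0.05 * rng.standard_normal(1000)
print(correlation_dimension(segment))
```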

  • Article (No Access)

    ENHANCED DYNAMIC THRESHOLD ALGORITHM OF QRS COMPLEX DETECTION

    The dynamic threshold algorithm (DTA) presented by Pan and Tompkins is a popular QRS detection method with high sensitivity and specificity. However, the accuracy of this algorithm is compromised when its sensitivity is increased. In this study, an enhanced dynamic threshold algorithm (EDTA) based on dynamic threshold rules is proposed, which adds a compensation scheme to reduce the rates of false detection and missed detection of R waves under low signal-to-noise-ratio conditions. Sensitivity and detection error rate are calculated on simulated and clinical data to compare the performance of the EDTA and the DTA, and the EDTA yields competitive results. For the clinical data, the average accuracy of the EDTA is 99.24%, which is higher than that of the DTA at 95.98%. Further comparative experiments between the EDTA and two other popular algorithms are conducted, and the results of their validation on a public database are given and discussed, demonstrating the superiority of the proposed method.
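    For context, a compact sketch of dynamic-threshold QRS detection in the Pan-Tompkins spirit (band-pass filtering, differentiation, squaring, moving-window integration, adaptive signal/noise thresholds). The filter band, constants, and refractory period are illustrative, and the EDTA compensation scheme itself is not reproduced.

```python
# Hedged sketch of Pan-Tompkins-style dynamic-threshold QRS detection.
import numpy as np
from scipy.signal import butter, filtfilt

def detect_qrs(ecg, fs=360):
    b, a = butter(2, [5 / (fs / 2), 15 / (fs / 2)], btype="band")
    filt = filtfilt(b, a, ecg)
    # derivative, squaring, and 150 ms moving-window integration
    integ = np.convolve(np.diff(filt) ** 2, np.ones(int(0.15 * fs)), mode="same")
    spki, npki = 0.0, 0.0
    thresh, peaks, last = 0.0, [], -fs
    for i in range(1, len(integ) - 1):
        if integ[i] > integ[i - 1] and integ[i] >= integ[i + 1]:   # local peak
            if integ[i] > thresh and i - last > int(0.2 * fs):     # 200 ms refractory
                peaks.append(i)
                last = i
                spki = 0.125 * integ[i] + 0.875 * spki             # running signal level
            else:
                npki = 0.125 * integ[i] + 0.875 * npki             # running noise level
            thresh = npki + 0.25 * (spki - npki)                   # dynamic threshold
    return np.array(peaks)
```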

  • Article (No Access)

    HYBRID TECHNIQUE FOR ECG SIGNAL COMPRESSION USING PARALLEL AND CASCADE METHOD

    The recording of the electrical activity of the heart using electrodes is known as electrocardiography (ECG). Long-term ECG monitoring generates a huge amount of data and requires a large amount of memory, so an efficient compression technique that retains the clinically important features of the ECG signal is required. Compressing the ECG signal with transforms in cascade is explored in order to combine the advantages of both transforms. This paper presents compression of the ECG signal by a hybrid technique consisting of cascade and parallel combinations of the discrete cosine transform (DCT) and the discrete wavelet transform (DWT). The simulation is carried out in MATLAB, and various wavelet transforms are used for testing. The performance measures used to validate the results are the percent root-mean-square difference (PRD) and the compression ratio (CR). The cascade combination proved better than the parallel technique in terms of CR: the highest CR achieved is 28.2 with DCT and DWT in cascade. The parallel method, in turn, shows an improved PRD compared with the cascade method.
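    A hedged sketch of the cascade idea: transform the signal with the DCT, decompose the DCT coefficients with the DWT, and threshold the result, reporting CR and PRD. The exact cascade/parallel structures and coder of the paper are not reproduced; the wavelet, level, and retention fraction are assumptions.

```python
# Hedged sketch of a DCT -> DWT cascade with thresholding, evaluated by CR and PRD.
import numpy as np
import pywt
from scipy.fft import dct, idct

def cascade_compress(x, wavelet="db4", level=4, keep=0.05):
    c = dct(x, norm="ortho")                        # stage 1: DCT of the signal
    w = pywt.wavedec(c, wavelet, level=level)       # stage 2: DWT of the DCT coefficients
    flat, slices = pywt.coeffs_to_array(w)
    th = np.quantile(np.abs(flat), 1 - keep)        # keep only the largest coefficients
    flat[np.abs(flat) < th] = 0.0
    w_rec = pywt.array_to_coeffs(flat, slices, output_format="wavedec")
    x_rec = idct(pywt.waverec(w_rec, wavelet)[: len(x)], norm="ortho")
    cr = flat.size / np.count_nonzero(flat)         # crude compression ratio
    prd = 100 * np.sqrt(np.sum((x - x_rec) ** 2) / np.sum(x ** 2))
    return x_rec, cr, prd

# toy usage on a synthetic signal
x = np.sin(2 * np.pi * np.linspace(0, 4, 1024))
_, cr, prd = cascade_compress(x)
print(f"CR = {cr:.1f}, PRD = {prd:.2f}%")
```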

  • Article (No Access)

    ELECTROCARDIOGRAM DATA COMPRESSION TECHNIQUES IN 1D/2D DOMAIN

    The electrocardiogram (ECG) is one of the best representatives of physiological signals that reflect the state of the autonomic nervous system, which is primarily responsible for cardiac activity. ECG data compression plays a significant role in localized digital storage and in efficient use of the communication channel in telemedicine applications. The efficiency of lossless and lossy compression systems depends on the methodology used for compression and on the quality measure used to evaluate distortion. Depending on the domain, ECG data compression can be performed in either one dimension (1D) or two dimensions (2D), exploiting inter-beat correlation or both inter- and intra-beat correlation, respectively. In this paper, a comparative study of 1D and 2D ECG data compression methods from the existing literature is carried out to provide an update in this regard. ECG data compression techniques and algorithms in the 1D and 2D domains have their own merits and limitations. Recently, numerous techniques for 1D ECG data compression have been developed, in both the direct and the transform domain; 2D ECG data compression based on period normalization and complexity sorting has also been reported. Finally, several practical issues concerning the assessment of reconstructed-signal quality are highlighted, together with performance comparisons averaged over the existing 1D and 2D ECG compression methods and the digital signal processing systems they use.

  • Article (No Access)

    MULTISCALE BSBL COMPRESSED SENSING-BASED ECG SIGNAL COMPRESSION WITH ENCODING FOR TELEMEDICINE

    The electrocardiogram (ECG) signal is an important diagnostic tool for cardiologists to detect abnormalities. Continuous ambulatory monitoring involves a huge amount of ECG data, which leads to high storage requirements and transmission costs. Hence, an efficient compression or coding technique is required. One of the most promising techniques is compressive sensing (CS), which compresses signals efficiently: a signal can easily be reconstructed if it has a sparse representation. This paper presents a Block Sparse Bayesian Learning (BSBL)-based multiscale compressed sensing (MCS) method for the compression of ECG signals. The main focus of the proposed technique is to achieve a reconstructed signal with less error and more energy efficiency. The ECG signal is sparsely represented by the wavelet transform, the MIT-BIH Arrhythmia database is used for testing, and the Huffman technique is used for encoding and decoding. Signal recovery remains adequate up to 75% compression. The quality of the signal is ascertained using standard performance measures such as the signal-to-noise ratio (SNR) and the percent root mean square difference (PRD), and the reconstructed ECG signal is also validated visually. This method is well suited to telemedicine applications.
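    A hedged sketch of the compressive-sensing front end only: random Bernoulli measurements of an ECG block that is sparse in a wavelet basis. The BSBL recovery stage and the Huffman coder are not implemented here; the block length, measurement count, wavelet, and stand-in signal are illustrative assumptions.

```python
# Hedged sketch of the CS measurement step; BSBL recovery and Huffman coding
# are not implemented. Sizes and the wavelet are assumptions.
import numpy as np
import pywt

rng = np.random.default_rng(6)
n, m = 512, 128                                        # 4:1 measurement reduction
ecg_block = np.sin(2 * np.pi * np.linspace(0, 2, n))   # stand-in for a real ECG block

phi = rng.choice([-1.0, 1.0], size=(m, n)) / np.sqrt(m)   # Bernoulli sensing matrix
y = phi @ ecg_block                                    # compressed measurements to transmit

# sparsity check in the wavelet domain (the structure a BSBL solver exploits on recovery)
coeffs, _ = pywt.coeffs_to_array(pywt.wavedec(ecg_block, "db4", level=5))
energy = np.cumsum(np.sort(coeffs ** 2)[::-1]) / np.sum(coeffs ** 2)
print(f"CR = {n / m:.0f}:1, 99% of energy in {np.argmax(energy > 0.99) + 1} coefficients")
```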

  • Article (No Access)

    MACHINE LEARNING APPROACH TO DETECT ECG ABNORMALITIES USING COST-SENSITIVE DECISION TREE CLASSIFIER

    Cardiac arrhythmia is an abnormal heart rhythm that develops when the electrical impulses that control the heart's contraction do not function properly. The heart can beat too fast (tachycardia), too slow (bradycardia), or in an irregular pattern. Inspecting ECG signal peaks and channels by hand is difficult because of their subtle variations, so automated detection of cardiovascular abnormalities is preferred for the early diagnosis of cardiac disorders. This paper uses machine learning approaches for detecting ECG abnormalities with a Support Vector Machine (SVM) and a Cost-Sensitive Decision Tree (CS-DT) classifier. The Empirical Mode Decomposition approach is used to examine the properties of R peaks and QRS complexes in ECG signals, and various morphological characteristics extracted from the signal are fed to the classifier to diagnose irregular beats. A set of twenty-two clinically feasible features comprising temporal, morphological, and statistical descriptors is extracted from the processed ECG signals and applied to the classifier to categorize cardiovascular irregularities such as Normal (N), Left Bundle Branch Block (LBBB), Right Bundle Branch Block (RBBB), Atrial Premature Beats (APB), and Premature Ventricular Contraction (PVC). The Beth Israel Hospital at Massachusetts Institute of Technology (MIT-BIH) dataset is used, with the feature set split into training and evaluation subsets: the training set is used to train the machine learning models on the extracted features, while the evaluation set is used to assess the performance of the trained models. Evaluation metrics such as Accuracy (Acc), Sensitivity (Se), Specificity (Sp), and Positive Predictivity (Pp) are used to evaluate the models' performance in arrhythmia detection and classification. The simulation is conducted with the SVM and CS-DT classifiers, reporting performance for all individual class labels at a Confidence Factor (CF) of 0.5. Merging the time- and frequency-domain features yields Sensitivity, Specificity, Positive Predictivity, and Accuracy of 89.5%, 98.11%, 87.76%, and 96.8% with the SVM, and 97.71%, 99.58%, 97.66%, and 99.32% with the CS-DT classifier in identifying irregular heartbeats.
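    A minimal sketch of the classification stage, using scikit-learn's DecisionTreeClassifier with balanced class weights as a stand-in for a cost-sensitive decision tree; the features are random placeholders rather than the twenty-two clinical features described above.

```python
# Hedged sketch: class-weighted decision tree as a stand-in for a cost-sensitive
# decision tree; features and labels are random placeholders.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
X = rng.standard_normal((1000, 22))                       # 22 features per beat (placeholder)
y = rng.choice(["N", "LBBB", "RBBB", "APB", "PVC"], size=1000,
               p=[0.7, 0.1, 0.1, 0.05, 0.05])             # imbalanced beat classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = DecisionTreeClassifier(class_weight="balanced",     # penalize rare-class errors more
                             min_samples_leaf=5).fit(X_tr, y_tr)
print(clf.score(X_te, y_te))
```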