Similar Literature
20 similar documents retrieved (search time: 206 ms)
1.
This study reports on a novel method to detect and reduce the contribution of movement artifact (MA) in electrocardiogram (ECG) recordings gathered from horses in free-movement conditions. We propose a model that integrates cardiovascular and movement information to estimate the MA contribution. Specifically, ECG and physical activity are continuously acquired from seven horses through a wearable system. The system employs fully integrated textile electrodes to monitor the ECG and is also equipped with a triaxial accelerometer for movement monitoring. In the literature, the technique most often used to remove movement artifacts when the noise bandwidth overlaps that of the primary source is the adaptive filter. In this study we propose a new algorithm, hereinafter called Stationary Wavelet Movement Artifact Reduction (SWMAR), in which the Stationary Wavelet Transform (SWT) decomposition algorithm is employed to identify and remove movement artifacts from equine ECG signals. A comparative analysis with the Normalized Least Mean Square Adaptive Filter (NLMSAF) technique is performed as well. Results obtained on seven hours of recordings showed a reduction of more than 40% in the MA percentage between before and after application of the proposed algorithm. Moreover, the comparative analysis with the NLMSAF, applied to the same ECG recordings, showed a greater reduction of the MA percentage in favour of SWMAR, with a statistically significant difference (p-value < 0.05).
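As an illustration of the SWT-based step, the sketch below (Python with NumPy and PyWavelets, both assumed available) performs a stationary wavelet decomposition, soft-thresholds the detail bands and attenuates the coarsest approximation, which carries most of the low-frequency movement artifact. The band-selection rules of SWMAR itself, and its use of the accelerometer signal, are not reproduced; the function name and parameter values are illustrative.

```python
import numpy as np
import pywt

def swt_artifact_reduction(ecg, wavelet="db4", level=4, k=3.0, approx_gain=0.1):
    """Generic SWT-based artifact reduction (sketch, not the SWMAR rules):
    soft-threshold the detail bands and attenuate the coarsest approximation."""
    n = len(ecg)
    pad = (-n) % (2 ** level)                  # SWT needs a length multiple of 2**level
    x = np.pad(np.asarray(ecg, float), (0, pad), mode="edge")
    coeffs = pywt.swt(x, wavelet, level=level)

    cleaned = []
    for j, (cA, cD) in enumerate(coeffs):      # coeffs[0] is the coarsest level
        sigma = np.median(np.abs(cD)) / 0.6745     # robust noise estimate
        cD = pywt.threshold(cD, k * sigma, mode="soft")
        if j == 0:
            cA = approx_gain * cA              # damp the slow movement-artifact component
        cleaned.append((cA, cD))

    return pywt.iswt(cleaned, wavelet)[:n]
```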

2.
BACKGROUND: The presence of parasitic interference signals can cause serious problems in the recording of ECG signals, and much work has been done to suppress electromyogram (EMG) artifacts, noise and disturbances in the electrocardiogram (ECG). Recently developed techniques based on global and local transforms have become popular, such as wavelet shrinkage approaches (1995) and time-frequency dependent thresholding (1998). Moreover, other techniques such as artificial neural networks (2003), energy thresholding and Gaussian kernels (2006) have been used to improve on previous work. This review summarizes windowed techniques for this problem. METHODS AND RESULTS: We developed a mathematical method based on two pieces of information: the dominant scale of the QRS complexes and their domain. The task is carried out using a varying-length window that moves over the whole signal. Both the high-frequency (noise) and low-frequency (baseline wander) removal tasks are evaluated on manually corrupted ECG signals and validated on actually recorded ECG signals. CONCLUSIONS: Although the simplicity of the method, its fast implementation and its preservation of the characteristics of the ECG waves make it a suitable algorithm, difficulties may arise from the prerequisite detection of QRS complexes and from the specification of the algorithm's parameters in cases of varying morphology.

3.
The main purpose of the present work is the definition of a fully automatic procedure for correlation dimension (D2) estimation. In the first part, the procedure for the estimation of the correlation dimension (D2) is proposed and tested on various types of mathematical models: chaotic (Lorenz and Henon models), periodic (sinusoidal waves) and stochastic (Gaussian and uniform noise). In all cases, accurate D2 estimates were obtained. The procedure can detect the presence of multiple scaling regions in the correlation integral function. The connection between the presence of multiple scaling regions and multiple dynamic activities cooperating in a system is investigated through the study of composite time series. In the second part of the paper, the proposed algorithm is applied to the study of cardiac electrical activity through the analysis of electrocardiographic signals (ECG) obtained from the commercially available MIT-BIH ECG arrhythmia database. Three groups of ECG signals have been considered: the ECGs of normal subjects and the ECGs of subjects with atrial fibrillation and with premature ventricular contractions. D2 estimates are computed on single ECG intervals (static analysis) of appropriate duration, striking a balance between stationarity requirements and the requirements of accurate computation. In addition, D2 temporal variability is studied by analyzing consecutive intervals of ECG tracings (dynamic analysis). The procedure reveals the presence of multiple scaling regions in many ECG signals, and the D2 temporal variability differs among the three ECG groups considered; it is greater in atrial fibrillation than in normal sinus rhythm. This study points out the importance of considering both static and dynamic D2 analysis for a more complete study of the system under analysis. While the static analysis visualizes the underlying heart activity, dynamic D2 analysis provides insight into the time evolution of the underlying system. Received: 11 April 1997 / Accepted in revised form: 19 March 1999
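For reference, the sketch below computes a classical Grassberger-Procaccia style correlation integral, from which D2 is read off as the slope of log C(r) versus log r in a scaling region. The embedding dimension, delay and radii are illustrative assumptions, and the paper's automatic detection of multiple scaling regions is not reproduced.

```python
import numpy as np

def correlation_integral(x, m=5, tau=8, n_radii=20):
    """Correlation integral C(r) of a scalar time series after time-delay
    embedding (sketch; suitable only for moderately long segments because the
    full pairwise distance matrix is formed)."""
    n = len(x) - (m - 1) * tau
    emb = np.column_stack([x[i * tau: i * tau + n] for i in range(m)])
    d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=-1)  # Chebyshev distances
    d = d[np.triu_indices(n, k=1)]                                  # distinct pairs only
    radii = np.logspace(np.log10(d[d > 0].min()), np.log10(d.max()), n_radii)
    C = np.array([(d < r).mean() for r in radii])
    return radii, C

# D2 is the slope over a chosen scaling region, e.g.:
# r, C = correlation_integral(ecg_segment)
# D2 = np.polyfit(np.log(r[5:15]), np.log(C[5:15] + 1e-12), 1)[0]
```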

4.

Background

The electrocardiogram (ECG) is a diagnostic tool that records the electrical activity of the heart and depicts it as a series of graph-like tracings, or waves. Being able to interpret these details allows the diagnosis of a wide range of heart problems. Fetal electrocardiogram (FECG) extraction plays an important role in medical diagnostics during pregnancy. Since the observed FECG signals are often mixed with the maternal ECG (MECG) and with noise induced by electrode movement or by maternal motion, separating the ECG signal sources from the observed data becomes quite complicated. One complication arises when the ECG sources are dependent; thus, in this paper we introduce a new approach to blind source separation (BSS) in a noisy context for both independent and dependent ECG signal sources. This approach consists in denoising the observed ECG signals using a bilateral total variation (BTV) filter, and then minimizing the Kullback-Leibler divergence between copula densities to separate the FECG signal from the MECG.

Results

We present simulation results illustrating the performance of our proposed method. We consider several examples of independent and dependent source component signals. The results are compared with those of the classical method, independent component analysis (ICA), under the same conditions. The accuracy of source estimation is evaluated using the signal-to-noise ratio (SNR). The first experiment shows that our proposed method gives accurate estimation of the sources in the standard case of independent components, with performance around 27 dB in terms of SNR. In the second experiment, we show that the proposed algorithm can successfully separate two noisy mixtures of dependent source components, a case in which the classical criterion devoted to the independent case fails, and that our method handles the dependent case with good performance.

Conclusions

In this work, we focus specifically on the separation of ECG signal sources taken from two skin electrodes located on a pregnant woman’s body. The ECG separation is interpreted as a noisy linear BSS problem with instantaneous mixtures. Firstly, a denoising step is required to reduce the noise due to motion artifacts, using a BTV filter as a very effective one-pass denoising filter. Then, we use the Kullback-Leibler divergence between copula densities to separate the fetal heart rate from the maternal one, for both the independent and the dependent case.

5.
Abstract

The electrocardiogram (ECG) represents the electrical activity of the heart. It is characterised by a number of waves (P, QRS, T) that are correlated with the status of heart activity. In this paper, the aim is to present a powerful algorithm to aid cardiac diagnosis. The approach used is based on a deterministic method, the decision tree. The different waves of the ECG signal first need to be identified and then measured, following a signal-to-noise enhancement. Signal-to-noise enhancement is performed by a linear adaptive combiner filter, whereas P, QRS and T wave identification and measurement are performed by a derivative approach. Results obtained on simulated and real ECG signals are shown to be highly satisfactory for aiding the diagnosis of cardiac arrhythmias such as junctional escapes, blocks, etc.
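A minimal sketch of a derivative-based QRS locator of the kind mentioned above is given below (Python/NumPy); the window length, threshold and refractory period are illustrative assumptions rather than the paper's settings.

```python
import numpy as np

def derivative_qrs_detect(ecg, fs, refractory_s=0.2):
    """Derivative-based QRS detection sketch: differentiate, square, smooth,
    then keep local maxima of the smoothed slope energy above a fixed fraction
    of its maximum, separated by a refractory period."""
    slope_energy = np.diff(ecg) ** 2                      # steep QRS slopes dominate
    win = max(1, int(0.08 * fs))                          # ~80 ms integration window
    smooth = np.convolve(slope_energy, np.ones(win) / win, mode="same")
    thr = 0.3 * smooth.max()                              # illustrative threshold
    refractory = int(refractory_s * fs)

    peaks = []
    for i in range(1, len(smooth) - 1):
        is_peak = smooth[i] > thr and smooth[i] >= smooth[i - 1] and smooth[i] >= smooth[i + 1]
        if is_peak and (not peaks or i - peaks[-1] > refractory):
            peaks.append(i)
    return np.array(peaks)
```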

6.
The aim of this paper is to develop a method to extract relevant activities from surface electromyography (SEMG) recordings under difficult experimental conditions with a poor signal-to-noise ratio. High-amplitude artifacts, the QRS complex, low-frequency noise and white noise significantly alter the EMG characteristics. The CEM algorithm proved to be useful for segmenting SEMG signals into high-amplitude artifact (HAA), phasic activity (PA) and background postural activity (BA) classes. This segmentation was performed on the signal energy, with classes following a χ2 distribution. Ninety-five percent of HAA events and 96.25% of BA events were detected, and the remaining noise was then identified using AR modeling, with a classification based upon the position of the pole of highest modulus. This method eliminated 91.5% of the noise and misclassified only 3.3% of EMG events when applied to SEMG recorded on passengers subjected to lateral accelerations.

7.
《IRBM》2019,40(3):145-156
Objective: The electrocardiogram (ECG) is a diagnostic tool for recording the electrical activity of the human heart non-invasively. It is detected by electrodes placed on the surface of the skin in a conductive medium. In medical applications, the ECG is used by cardiologists to observe heart anomalies (cardiovascular diseases) such as abnormal heart rhythms, heart attacks, the effects of drug dosage on the subject's heart, and evidence of previous heart attacks. The recorded ECG signal is generally corrupted by various types of noise/distortion, either cardiac (isoelectric interval, prolonged depolarization and atrial flutter) or extra-cardiac (respiration, changes in electrode position, muscle contraction and power-line noise). These factors hide the useful information and alter the signal characteristics, resulting in a low signal-to-noise ratio (SNR). In such situations, any failure to judge the ECG signal correctly may delay treatment and harm the subject's (patient's) health. Therefore, an appropriate pre-processing technique is necessary to improve the SNR and facilitate better treatment of the subject. The effects of different pre-processing techniques on ECG signal analysis (based on R-peak detection) are compared using various figures of merit (FoM) such as sensitivity (Se), accuracy (Acc) and detection error rate (DER), along with SNR. Methods: In this research article, a new fractional wavelet transform (FrWT) is proposed as a pre-processing technique in order to overcome the disadvantages of other commonly used techniques, namely the wavelet transform (WT) and the fractional Fourier transform (FrFT). The proposed FrWT technique possesses the properties of multiresolution analysis and represents the signal in the fractional domain, i.e., in terms of a rotation of the signal in the time-frequency plane. In the literature, ECG signal analysis has also been improved using statistical pre-processing techniques such as principal component analysis (PCA) and independent component analysis (ICA). However, both PCA and ICA are sensitive to slight alterations in either the signal or the noise unless the basis functions are prepared from a comprehensive (worldwide) set of ECGs. Independent Principal Component Analysis (IPCA) has been used to overcome this shortcoming of PCA and ICA. Therefore, in this paper three techniques, FrFT, FrWT and IPCA, are selected for comparison in the pre-processing of ECG signals. Results: The selected methods have been evaluated on the basis of the SNR, Se, Acc and DER of the detected ECG beats. FrWT yields the best results among all the methods considered in this paper: 34.37 dB output SNR, 99.98% Se, 99.96% Acc, and 0.036% DER. These results indicate the quality of the biologically relevant information retained in the pre-processed ECG signals for identifying different heart abnormalities. Conclusion: Correct analysis of the acquired ECG signal is the main challenge for the cardiologist because of the various types of noise (high and low frequency) involved. Twenty-two real-time ECG records, acquired in normal and disease situations, have been evaluated on the basis of FoM such as SNR, Se, Acc and DER for the proposed FrWT and the existing FrFT and IPCA pre-processing techniques. The FoM values indicate a high SNR and better detection of R-peaks in an ECG signal, which is important for the diagnosis of cardiovascular disease. The proposed FrWT outperforms all the other techniques and adequately preserves both the analytical attributes of the actual ECG signal and the alterations in the amplitudes of the various ECG waveforms. It also provides signal representations in the time-fractional-frequency plane with low computational complexity, enabling its practical use in versatile applications.

8.
An ECG signal denoising algorithm based on the wavelet transform
Objective: To remove noise such as EMG interference, power-line interference and baseline drift introduced during ECG acquisition, so that the noise does not cause false or missed detections when identifying and extracting ECG feature points. Methods: The ECG signal is first decomposed with the coif4 wavelet using the Mallat algorithm; denoising is then performed with a soft/hard threshold compromise and wavelet reconstruction. Results: Simulation and validation on ECG signals from the MIT-BIH Arrhythmia Database show that the three common types of noise are effectively removed. Conclusion: The method performs well in real time and lays a foundation for clinical analysis and diagnosis.
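A sketch of the described coif4/Mallat decomposition with a soft/hard threshold compromise is shown below (Python with PyWavelets assumed available); the decomposition level, compromise factor and universal threshold are assumptions, not values from the paper.

```python
import numpy as np
import pywt

def compromise_threshold(c, thr, alpha=0.5):
    """Soft/hard compromise: alpha = 1 gives soft thresholding, alpha = 0 gives hard."""
    return np.where(np.abs(c) > thr, np.sign(c) * (np.abs(c) - alpha * thr), 0.0)

def denoise_ecg(ecg, wavelet="coif4", level=5, alpha=0.5):
    """Mallat-style DWT decomposition, compromise thresholding of the detail
    coefficients, and wavelet reconstruction (illustrative parameter choices)."""
    coeffs = pywt.wavedec(ecg, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745      # noise estimate from finest details
    thr = sigma * np.sqrt(2 * np.log(len(ecg)))         # universal threshold
    coeffs = [coeffs[0]] + [compromise_threshold(c, thr, alpha) for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(ecg)]
```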

9.
《IRBM》2020,41(5):252-260
Objective: Monitoring the heartbeat of the fetus during pregnancy is a vital part of determining fetal health. Current fetal heart monitoring techniques lack accuracy in fetal heart rate monitoring and feature acquisition, leading to diagnostic problems. A reliable method of non-invasive fetal heart monitoring is therefore in high demand. Method: Electrocardiography (ECG) is a method of monitoring the electrical activity produced by the heart. Extracting the fetal ECG (FECG) from the abdominal ECG (AECG) is challenging, since the ECGs of the mother and the baby share similar frequency components and the signals are corrupted by white noise. This paper presents a method of FECG extraction that eliminates all other signals from the AECG. The algorithm is based on attenuating the maternal ECG (MECG) by filtering and wavelet analysis to find the locations of the FECG, and then isolating the fetal beats based on those locations. Two AECG signals collected at different locations on the abdomen are used. The ECG data used contain an MECG with a power of five to ten times that of the FECG. Results: The FECG signals were successfully isolated from the AECG using the proposed method; the QRS complex of each heartbeat was preserved and the heart rate was calculated. The fetal heart rate was 135 bpm and the instantaneous heart rate was 131.58 bpm; the maternal heart rate was 90 bpm with an instantaneous heart rate of 81.9 bpm. Conclusion: The proposed method is promising for FECG extraction since it relies only on filtering and wavelet analysis of two abdominal signals. The implementation is easily adjusted to the power levels of the signals, making it easy to adapt to changing signals in different biosignal applications.

10.
In the living and working environment, stressful factors such as noise can cause health problems, including cardiovascular diseases and noise-induced hearing loss. Some heat shock proteins (Hsps) play an important role in protecting cardiac cells against ischemic injury, and antibodies against these Hsps are associated with the development and prognosis of atherogenesis, coronary heart disease, and hypertension. Whether the presence of such antibodies is associated with abnormal electrocardiography (ECG) in stressed autoworkers exposed to chronic noise was previously unknown. Therefore, we investigated the association between the levels of plasma anti-Hsp60 and anti-Hsp70 and electrocardiographic abnormality in 396 autoworkers exposed to different noise levels, using Western blot, ECG, and multivariate logistic regression analysis. The results showed that increased levels of anti-Hsp70 were associated with a higher risk of ECG abnormalities characteristic of chronic myocardial ischemia (P < 0.05), conduction abnormality (P < 0.01), or heart displacement (P < 0.05); in contrast, elevated anti-Hsp60 was related to ECG abnormalities characteristic of sinus arrhythmia, chronic myocardial ischemia, and ectopic rhythm (P < 0.01 for all). Overall, high levels of both anti-Hsp70 and anti-Hsp60 were associated with a significantly increased risk of ECG abnormalities, both without adjustment (odds ratio [OR] = 1.73, 95% confidence interval [CI] = 1.04-2.86 for anti-Hsp70; OR = 1.36, 95% CI = 1.07-1.72 for anti-Hsp60) and with adjustment for cumulative noise exposure (OR = 1.96, 95% CI = 1.20-3.21 for anti-Hsp70; OR = 3.93, 95% CI = 1.72-8.92 for anti-Hsp60). These findings suggest that the production of both anti-Hsp70 and anti-Hsp60 may be an independent risk factor for the development and progression of abnormal ECG, and therefore possibly of cardiovascular disease, in autoworkers exposed to occupational noise.

11.
The electrocardiogram (ECG) represents the electrical activity of the heart. It is characterised by a number of waves (P, QRS, T) that are correlated with the status of heart activity. In this paper, the aim is to present a powerful algorithm to aid cardiac diagnosis. The approach used is based on a deterministic method, the decision tree. The different waves of the ECG signal first need to be identified and then measured, following a signal-to-noise enhancement. Signal-to-noise enhancement is performed by a linear adaptive combiner filter, whereas P, QRS and T wave identification and measurement are performed by a derivative approach. Results obtained on simulated and real ECG signals are shown to be highly satisfactory for aiding the diagnosis of cardiac arrhythmias such as junctional escapes, blocks, etc.

12.
The surface electromyographic (sEMG) signal that originates in the muscle is inevitably contaminated by various noise signals or artifacts that originate at the skin-electrode interface, in the electronics that amplify the signal, and in external sources. Modern technology is substantially immune to some of these noise sources, but not to baseline noise and movement artifact noise, whose frequency spectra contaminate the low-frequency part of the sEMG spectrum. Many factors must be taken into consideration when determining the appropriate filter specifications to remove these artifacts; they include the muscle tested and the type of contraction, the sensor configuration, and the specific noise source. The band-pass determination is always a compromise between (a) reducing noise and artifact contamination and (b) preserving the desired information in the sEMG signal. This study was designed to investigate the effects of mechanical perturbations and noise that are typically encountered during sEMG recordings in clinical and related applications. The analysis established the relationship between the attenuation rates of the movement artifact and of the sEMG signal as a function of the filter band-pass. When this relationship is combined with other considerations related to the informational content of the signal, the signal distortion introduced by filters, and the kinds of artifacts evaluated in this study, a Butterworth filter with a corner frequency of 20 Hz and a slope of 12 dB/oct is recommended for general use. The results of this study are relevant to biomechanical and clinical applications where measurements of body dynamics and kinematics may include artifact sources.
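The recommended filter is straightforward to realize in software; the sketch below (Python with SciPy assumed available) applies a 2nd-order (12 dB/oct) Butterworth high-pass at a 20 Hz corner. Zero-phase filtering is used here for convenience in offline analysis, which doubles the effective roll-off; a causal single-pass filter would match the 12 dB/oct specification exactly.

```python
from scipy.signal import butter, sosfiltfilt

def highpass_semg(emg, fs, fc=20.0, order=2):
    """Butterworth high-pass (order 2 ~ 12 dB/oct) to suppress baseline and
    movement-artifact components below ~20 Hz in an sEMG channel."""
    sos = butter(order, fc, btype="highpass", fs=fs, output="sos")
    return sosfiltfilt(sos, emg)

# Example (assumed 1 kHz sampling): emg_clean = highpass_semg(emg_raw, fs=1000.0)
```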

13.
《IRBM》2014,35(6):351-361
Nowadays, doctors commonly use the electrocardiogram (ECG) to diagnose heart disease. However, various non-ideal effects are often present in the ECG. The discrete wavelet transform (DWT) is efficient for non-stationary signal analysis. In this paper, the Symlet sym5 is chosen as the wavelet function to decompose recorded ECG signals for noise removal. A soft-thresholding method is then applied for feature detection. To detect the ECG features, the R peak of each heartbeat is detected first, and the onset and offset of the QRS complex are then detected. Finally, the signal is reconstructed to remove high-frequency interference, and an adaptive search window and threshold are applied to detect the P and T waves. We use the MIT-BIH arrhythmia database for algorithm verification. For noise reduction, an SNR improvement of at least 10 dB is achieved at an input SNR of 5 dB, and most of the SNR improvements are at least 1 dB better than those of other methods at different SNR levels. When applied to a real portable ECG device, all R peaks can be detected while patients walk, run, or move at speeds below 9 km/h. The delineation performance on the database shows that our algorithm achieves high sensitivity in detecting ECG features: the QRS detector attains a sensitivity of over 99.94%, while the P and T wave detectors achieve 99.75% and 99.7%, respectively.
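As an illustration of the detection stage, the sketch below locates R peaks on an already denoised ECG using a per-window adaptive threshold; the window length, threshold fraction and refractory period are assumptions, and the paper's QRS onset/offset and P/T search windows are not reproduced.

```python
import numpy as np

def detect_r_peaks(ecg, fs, win_s=2.0, frac=0.6, refractory_s=0.25):
    """R-peak detection with an adaptive per-window threshold: within each
    window, keep local maxima of |ecg| above a fraction of the window maximum,
    separated by a refractory period."""
    peaks = []
    win = int(win_s * fs)
    refractory = int(refractory_s * fs)
    a = np.abs(np.asarray(ecg, float))
    for start in range(0, len(a), win):
        seg = a[start:start + win]
        thr = frac * seg.max()
        for i in range(1, len(seg) - 1):
            if seg[i] > thr and seg[i] >= seg[i - 1] and seg[i] >= seg[i + 1]:
                idx = start + i
                if not peaks or idx - peaks[-1] > refractory:
                    peaks.append(idx)
    return np.array(peaks)
```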

14.
Electrocardiogram (ECG) signals are difficult to interpret, and clinicians must undergo long training to learn to diagnose diabetes from the subtle abnormalities in these signals. To facilitate these diagnoses, we have developed a technique based on the heart rate variability signal obtained from ECG signals. This technique uses digital signal processing methods and therefore automates the detection of diabetes from ECG signals. In this paper, we describe the signal processing techniques that extract features from heart rate (HR) signals and present an analysis procedure that uses these features to diagnose diabetes. Through statistical analysis, we have identified the correlation dimension, a Poincaré geometry property (SD2), and recurrence plot properties (REC, DET, L mean) as useful features. These features differentiate the HR data of diabetic patients from those of patients who do not have the illness, and have been validated using the AdaBoost classifier with a perceptron weak learner (yielding a classification accuracy of 86%). We then developed a novel diabetic integrated index (DII) that is a combination of these nonlinear features. The DII indicates whether a particular HR signal was taken from a person with diabetes. This index aids the automatic detection of diabetes, thereby allowing a more objective assessment and freeing medical professionals for other tasks.
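For one of the listed features, the sketch below computes the Poincaré descriptors SD1 and SD2 from an RR-interval series (Python/NumPy); it illustrates only the standard definitions, and the combination of features into the diabetic integrated index is not reproduced.

```python
import numpy as np

def poincare_sd1_sd2(rr):
    """Poincaré plot descriptors from successive RR intervals: SD1 measures the
    spread across the line of identity (short-term variability) and SD2 the
    spread along it (long-term variability)."""
    x, y = np.asarray(rr[:-1], float), np.asarray(rr[1:], float)
    sd1 = np.std((y - x) / np.sqrt(2.0), ddof=1)
    sd2 = np.std((y + x) / np.sqrt(2.0), ddof=1)
    return sd1, sd2
```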

15.
Surface electromyograms (EMG) of back muscles are often corrupted by electrocardiogram (ECG) signals. This noise prevents a correct appreciation of the spectral content of the EMG signals and of its evolution during, for example, a fatigue process. Several methods have been proposed to reject the ECG noise from EMG recordings, but they seldom take into account possible changes in the ECG characteristics during the experiment. In this paper we propose an adaptive filtering algorithm specifically developed for rejecting the electrocardiogram that corrupts surface electromyograms (SEMG). The first step of the study was to choose the ECG electrode position so as to record an ECG with a shape similar to that found in the noisy SEMGs. Then, the efficiency of different algorithms was tested on 28 erector spinae SEMG recordings. The best algorithm belongs to the fast recursive least squares (FRLS) family; more precisely, the best results were obtained with the simplified formulation of an FRLS algorithm. As an application of the adaptive filtering, the paper compares the evolution of spectral parameters of noisy and denoised (after adaptive filtering) surface EMGs recorded on erector spinae muscles during a trunk extension. The fatigue test was analyzed on 16 EMG recordings. After adaptive filtering, the mean initial values of the energy and of the mean power frequency (MPF) were significantly lower and higher, respectively; the differences corresponded to the removal of the ECG components. Furthermore, the classical fatigue criteria (increase in energy and decrease in MPF over time during the fatigue test) were better observed on the denoised EMGs, and the mean slopes of the energy-time and MPF-time linear relationships differed significantly when established before and after adaptive filtering. These results demonstrate the efficacy of the proposed adaptive filtering method for denoising electrophysiological signals.
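The sketch below shows the general reference-based adaptive cancellation structure, with a separately recorded ECG as the reference input. For brevity it uses the normalized LMS update rather than the fast recursive least squares (FRLS) algorithm retained in the paper; the filter order and step size are illustrative assumptions.

```python
import numpy as np

def adaptive_cancel_ecg(emg, ecg_ref, order=16, mu=0.5, eps=1e-6):
    """Cancel ECG contamination in an sEMG channel: the adaptive filter predicts
    the ECG component leaking into the EMG from the reference channel, and the
    prediction error is the denoised EMG (NLMS update shown, not FRLS)."""
    emg = np.asarray(emg, float)
    ecg_ref = np.asarray(ecg_ref, float)
    w = np.zeros(order)
    cleaned = np.zeros(len(emg))
    for i in range(order, len(emg)):
        x = ecg_ref[i - order:i][::-1]        # most recent reference samples
        e = emg[i] - w @ x                    # error = EMG with the ECG estimate removed
        w += mu * e * x / (x @ x + eps)       # normalized LMS weight update
        cleaned[i] = e
    return cleaned
```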

16.
赵艳娜  魏珑  徐舫舟  赵捷  田杰  王越 《生物磁学》2009,(16):3128-3130
Objective: To study the removal of baseline drift, power-line interference, EMG interference and other noise from ECG signals, and to improve the accuracy of automatic ECG recognition and diagnosis. Methods: The ECG signal is decomposed over 8 scales with the Coif4 wavelet; baseline drift is removed by wavelet decomposition and reconstruction, and power-line and EMG interference are then removed with an improved wavelet thresholding algorithm. Results: Verification on signals from the MIT-BIH Arrhythmia Database using Matlab simulation tools shows that the three types of noise are effectively removed while the R-wave information is well preserved. Conclusion: Without losing useful ECG information, the algorithm removes the three common types of noise well and can be used for pre-processing prior to automatic ECG analysis.
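The baseline-removal step described above can be sketched as follows (Python with PyWavelets assumed available): decompose over 8 scales with Coif4 and discard the coarsest approximation, which carries the slow baseline drift, before reconstructing. The improved thresholding used for power-line and EMG interference is a separate step not shown here.

```python
import numpy as np
import pywt

def remove_baseline_wander(ecg, wavelet="coif4", level=8):
    """Baseline-drift removal by wavelet decomposition/reconstruction: zero the
    approximation coefficients at the coarsest scale and reconstruct."""
    coeffs = pywt.wavedec(ecg, wavelet, level=level)
    coeffs[0] = np.zeros_like(coeffs[0])      # the approximation holds the baseline
    return pywt.waverec(coeffs, wavelet)[: len(ecg)]
```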

17.
This paper presents a new ECG denoising approach based on noise reduction algorithms in the empirical mode decomposition (EMD) and discrete wavelet transform (DWT) domains. Unlike conventional EMD-based ECG denoising approaches, which discard a number of initial intrinsic mode functions (IMFs) containing the QRS complex as well as noise, we propose to perform windowing in the EMD domain in order to reduce the noise in the initial IMFs instead of discarding them completely, thus preserving the QRS complex and yielding a relatively cleaner ECG signal. The signal thus obtained is transformed into the DWT domain, where an adaptive soft-thresholding-based noise reduction algorithm is employed, exploiting the advantageous properties of the DWT over the EMD in preserving energy in the presence of noise and in reconstructing the original ECG signal with better time resolution. Extensive simulations are carried out using the MIT-BIH arrhythmia database, and the performance of the proposed method is evaluated in terms of several standard metrics. The simulation results show that the proposed method reduces noise in noisy ECG signals more accurately and consistently than some of the state-of-the-art methods.

18.
In this paper, two novel and simple wavelet-threshold-based ECG compression algorithms, target distortion level (TDL) and target data rate (TDR), are proposed for real-time applications. The issues involved in using objective error measures, such as the percentage root mean square difference (PRD) and the root mean square error (RMSE), as quality measures in quality-controlled/guaranteed algorithms are investigated with different sets of experiments. For the proposed TDL and TDR algorithms, data rate variability and reconstructed signal quality are evaluated under different ECG signal test conditions. Experimental results show that the TDR algorithm achieves the compression data rate required to meet the demands of a wired/wireless link, while the TDL algorithm does not. The compression performance is assessed in terms of the number of iterations required to achieve convergence and accuracy, the reconstructed signal quality, and the coding delay. The reconstructed signal quality is evaluated by a correct diagnosis (CD) test through visual inspection. Three sets of ECG data from three different databases, the MIT-BIH Arrhythmia (mita) (Fs = 360 Hz, 11 b/sample), the Creighton University Ventricular Tachyarrhythmia (cuvt) (Fs = 250 Hz, 12 b/sample) and the MIT-BIH Supraventricular Arrhythmia (mitsva) (Fs = 128 Hz, 10 b/sample), are used for this work. For each set of ECG data, the compression ratio (CR) range is defined. A CD value of 100% is achieved for CR ≤ 12, CR ≤ 8 and CR ≤ 4 for data from the mita, cuvt and mitsva databases, respectively. The experimental results demonstrate that the proposed TDR algorithm is suitable for real-time applications.
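To make the quality-control idea concrete, the sketch below computes the PRD and bisects a single wavelet threshold until the reconstruction reaches a target distortion level; the wavelet, iteration scheme and the absence of a coefficient coder are assumptions, so this is only an outline of a TDL-style loop, not the paper's algorithm.

```python
import numpy as np
import pywt

def prd(x, y):
    """Percentage root mean square difference between original and reconstruction."""
    return 100.0 * np.sqrt(np.sum((x - y) ** 2) / np.sum(x ** 2))

def threshold_for_target_prd(ecg, target_prd=3.5, wavelet="bior4.4", level=5, iters=20):
    """Bisect a hard threshold on the detail coefficients until the
    reconstruction PRD approaches the target distortion level."""
    coeffs = pywt.wavedec(ecg, wavelet, level=level)
    lo, hi = 0.0, max(np.max(np.abs(c)) for c in coeffs[1:])
    for _ in range(iters):
        thr = 0.5 * (lo + hi)
        kept = [coeffs[0]] + [pywt.threshold(c, thr, mode="hard") for c in coeffs[1:]]
        rec = pywt.waverec(kept, wavelet)[: len(ecg)]
        if prd(ecg, rec) < target_prd:
            lo = thr                      # still under the distortion budget: prune more
        else:
            hi = thr
    return thr, rec
```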

19.
Functional magnetic resonance imaging (fMRI) is a non-invasive and powerful imaging tool for detecting brain activity. The majority of fMRI studies are performed with single-shot echo-planar imaging (EPI) because of its high temporal resolution. Recent studies have demonstrated that, by increasing the spatial resolution of fMRI, previously unidentified neuronal networks can be measured. However, it is challenging to improve the spatial resolution of conventional single-shot EPI based fMRI. Although multi-shot interleaved EPI is superior to single-shot EPI in terms of improved spatial resolution, reduced geometric distortion, and a sharper point spread function (PSF), interleaved EPI based fMRI has two main limitations: 1) the imaging throughput is lower in interleaved EPI; 2) the magnitude and phase signal variations among EPI segments (due to physiological noise, subject motion, and B0 drift) translate into significant in-plane aliasing artifacts across the field of view (FOV). Here we report a method that integrates multiple approaches to address the technical limitations of interleaved EPI-based fMRI. Firstly, the multiplexed sensitivity-encoding (MUSE) post-processing algorithm is used to suppress in-plane aliasing artifacts resulting from time-domain signal instabilities during dynamic scans. Secondly, a simultaneous multi-band interleaved EPI pulse sequence, incorporating a controlled aliasing scheme, is implemented to increase the imaging throughput. Thirdly, the MUSE algorithm is generalized to accommodate fMRI data obtained with our multi-band interleaved EPI pulse sequence, suppressing both in-plane and through-plane aliasing artifacts. The blood-oxygenation-level-dependent (BOLD) signal detectability and the scan throughput can thus be significantly improved for interleaved EPI-based fMRI. Our human fMRI data obtained on 3 Tesla systems demonstrate the effectiveness of the developed methods. It is expected that future fMRI studies requiring high spatial resolvability and fidelity will benefit greatly from the reported techniques.

20.
ECG data compression techniques have received extensive attention in ECG analysis, and numerous data compression algorithms for ECG signals have been proposed during the last three decades. We describe two algorithms based on the scan-along polygonal approximation algorithm (SAPA) that are suitable for multichannel ECG data reduction on a microprocessor-based system. One is a modification of SAPA (MSAPA) that adopts integer-division table searching to speed up data reduction; the other (CSAPA) combines MSAPA with TP, a turning-point algorithm, to preserve the ST-segment signals. Results show that our algorithms achieve a compression ratio of more than 5:1 and a percent rms difference (PRD) from the original signal of less than 3.5%. In addition, the maximum execution time of MSAPA for processing one data point is about 50 μs. Moreover, by employing the TP algorithm, the CSAPA algorithm retains all the details of the ST segment, which are important in ischaemia diagnosis.
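The turning-point component is simple enough to sketch directly (Python/NumPy); the classical TP rule below keeps one of every two samples, preferring slope sign changes, which is how CSAPA preserves ST-segment detail. The SAPA/MSAPA polygonal approximation itself is not reproduced here.

```python
import numpy as np

def turning_point_compress(x):
    """Classical turning-point (TP) 2:1 data reduction: from each pair of
    incoming samples keep the one that forms a turning point (slope sign
    change) relative to the last retained sample, otherwise the second."""
    x = np.asarray(x, float)
    out = [x[0]]
    i = 1
    while i + 1 < len(x):
        s1 = np.sign(x[i] - out[-1])
        s2 = np.sign(x[i + 1] - x[i])
        out.append(x[i] if s1 * s2 < 0 else x[i + 1])
        i += 2
    return np.array(out)

# Roughly 2:1 reduction: st_preserving = turning_point_compress(ecg_segment)
```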
