Similar Articles
20 similar articles found.
1.
The use of the Hodrick-Prescott (HP) filter is presented as an alternative to the traditional digital filtering and spline smoothing methods currently used in biomechanics. In econometrics, HP filtering is a standard tool used to decompose a macroeconomic time series into a nonstationary trend component and a stationary residual component. The use of the HP filter in the present work is based on reasonable assumptions about the jerk and noise components of the raw displacement signal. Its applicability was tested on four kinematic signals with different characteristics. Two are well-known signals taken from the literature on biomechanical signal filtering, and the other two were acquired with our own motion capture system. The criterion for the selection of the cutoff frequency was based on the power spectral density of the raw displacement signals. The results showed the technique to be well suited to filtering biomechanical displacement signals in order to obtain accurate higher derivatives in a simple and systematic way. Specifically, the HP filter and the generalized cross-validated quintic spline (GCVSPL) produce similar RMS errors on the first (0.1063 vs. 0.1024 m/s²) and second (23.76 vs. 23.24 rad/s²) signals. The HP filter performs slightly better than GCVSPL on the third (0.209 vs. 0.236 m/s²) and fourth (1.596 vs. 2.315 m/s²) signals.
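
A minimal sketch of the Hodrick-Prescott decomposition described above, assuming NumPy/SciPy; the function name, the default smoothing parameter, and the example are illustrative and not taken from the paper. The trend solves (I + λDᵀD)τ = y, where D is the second-difference operator; the paper selects the smoothing level from the power spectral density of the raw displacement signal, which is not reproduced here.

    import numpy as np
    from scipy import sparse
    from scipy.sparse.linalg import spsolve

    def hp_filter(y, lam=1600.0):
        """Split y into a smooth trend and a residual by minimising
        sum((y - trend)**2) + lam * sum(second_differences(trend)**2),
        i.e. solving (I + lam * D.T @ D) @ trend = y."""
        y = np.asarray(y, dtype=float)
        n = len(y)
        # (n-2) x n second-difference operator: rows [1, -2, 1]
        D = sparse.diags([1.0, -2.0, 1.0], [0, 1, 2], shape=(n - 2, n))
        A = sparse.eye(n) + lam * (D.T @ D)
        trend = spsolve(A.tocsc(), y)
        return trend, y - trend

Differentiating the trend twice (e.g. with np.gradient) then gives the smoothed acceleration that the abstract compares against GCVSPL.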

2.
This paper presents an empirical approach for the decomposition, simulation, and reconstruction of wind-induced stem displacement of plantation-grown Scots pine trees. Results from singular spectrum analysis (SSA) allow a low-dimensional characterization of the complex tree motion patterns in response to non-destructive wind excitation. Since the motion of the sample trees was dominated by sway in the first mode, the application of SSA to time series of the sample trees' stem displacement yielded characteristic and distinguishable non-oscillatory trend components, quasi-oscillatory sway, and noise, of which only the non-oscillatory components were correlated directly with wind characteristics. Although sway in the range of the dominant damped fundamental frequency dominated the measured stem displacement signals, it was almost decoupled from near-surface airflow. The ability to discriminate SSA components is demonstrated based on correlation and spectral analysis. These SSA components, as well as wind speed measured in the canopy space of the Scots pine forest, were used to train neural networks, which could then reasonably simulate tree response to wind excitation.

3.
This paper is the first of a series focusing on the biomechanical analysis of live trees. The finite element method (FEM) is the most common method used for the analysis of complex mechanical structures. Several industrial FEM codes exist, but they need to be adapted to calculate the mechanical behaviour of growing trees. A general incremental model has been developed for this specific application. In this model, time was discretised and, for any developmental stage, a new equilibrium was written considering the increment of weight due to the mass of new wood layers and new vegetative elements being added. Maturation strains of newly formed cells were also considered for the simulation of the shoot reorientation process. This model was intended for use at the whole-plant level. A multi-layer beam finite element is presented, which is well adapted to discretising tree limbs. The shape evolution of the structure was represented at each time step by the nodal displacement vector. The mechanical stresses induced as a result of growth were determined within the stem using a cumulative process taking into account the past history of each growth ring. The first basic results of growth stresses and shape evolution were compared with already published results at the branch level.

4.
With respect to the first example in Schimmel (2001), Van Dongen et al. (2001) conclude from their Lomb-Scargle analysis that the noise I used 'contains new periodicities that are added to the signal (these periodicities by themselves resemble a harmonic series of a 38-hour rhythm).' They infer that 'the variance of the added noise is about five times as large as the variance of the signal', causing the detection of the new significant periodicities in the noise prior to the 24-h bimodal rhythm. Moreover, the 'example reflects a combination of an extremely non-sinusoidal signal with noise that is not independent, which results in a time series that is difficult to analyze with virtually any known method.' In the following, I briefly examine these concerns to avoid misunderstandings and to caution that misleading conclusions can be obtained when the statistical significance test is used inadequately. Although this paper further emphasizes difficulties in detection with Lomb-Scargle periodograms, this should not be taken as discouragement. As stated in Schimmel (2001), Lomb-Scargle is a powerful technique, but, as with any other method, one should be aware of its limitations and use additional tools to constrain the true data characteristics.
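
For readers unfamiliar with the tool under discussion, a minimal Lomb-Scargle periodogram of an irregularly sampled series, assuming NumPy/SciPy; the synthetic series and the period grid are illustrative and are not the data from the exchange above. The abstract's point carries over directly: a peak in the periodogram is only meaningful relative to a significance test whose noise assumptions actually hold.

    import numpy as np
    from scipy.signal import lombscargle

    rng = np.random.default_rng(0)
    t = np.sort(rng.uniform(0, 240, 400))             # hours, uneven sampling
    y = (np.sin(2 * np.pi * t / 24)                   # 24-h component
         + 0.6 * np.sin(4 * np.pi * t / 24)           # 12-h component (bimodal)
         + rng.normal(0, 1.0, t.size))                # noise

    periods = np.linspace(4, 60, 2000)                # candidate periods (h)
    ang_freqs = 2 * np.pi / periods                   # lombscargle expects angular frequencies
    pgram = lombscargle(t, y - y.mean(), ang_freqs)

    print(f"strongest period: {periods[np.argmax(pgram)]:.1f} h")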

5.
Tennis stroke mechanics have attracted considerable biomechanical analysis, yet current filtering practice may lead to erroneous reporting of data near the impact of racket and ball. This research had three aims: (1) to identify the best method of estimating the displacement and velocity of the racket at impact during the tennis serve, (2) to demonstrate the effect of different methods on upper limb kinematics and kinetics and (3) to report the effect of increased noise on the most appropriate treatment method. The tennis serves of one tennis player, fitted with upper limb and racket retro-reflective markers, were captured with a Vicon motion analysis system recording at 500 Hz. The raw racket tip marker displacement and velocity were used as criterion data to compare three different endpoint treatments and two different filters. The 2nd-order polynomial proved to be the least erroneous extrapolation technique and the quintic spline filter was the most appropriate filter. The previously used "smoothing through impact" method, using a quintic spline filter, underestimated the racket velocity (9.1%) at the time of impact. The polynomial extrapolation method remained effective when noise was added to the marker trajectories.
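
A minimal sketch of the endpoint treatment the abstract identifies as least erroneous: fitting a 2nd-order polynomial to the pre-impact samples and extrapolating to impact rather than filtering through the discontinuity. The function name, the number of fitted samples, and the toy data are assumptions, not the study's protocol.

    import numpy as np

    def extrapolate_to_impact(t, x, t_impact, n_fit=10, order=2):
        """Fit a low-order polynomial to the last n_fit pre-impact samples and
        evaluate displacement and velocity at the impact time."""
        coeffs = np.polyfit(t[-n_fit:], x[-n_fit:], order)
        p = np.poly1d(coeffs)
        return p(t_impact), p.deriv()(t_impact)

    fs = 500.0                                        # marker sampling rate (Hz)
    t = np.arange(0, 0.2, 1 / fs)                     # samples up to one frame before impact
    x = 0.5 + 12.0 * t + 40.0 * t**2                  # toy racket-tip displacement (m)
    pos, vel = extrapolate_to_impact(t, x, t_impact=0.202)
    print(f"extrapolated position {pos:.3f} m, velocity {vel:.2f} m/s")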

6.
Motion analysis systems typically introduce noise to the recorded displacement data. Butterworth digital filters have been used to smooth the displacement data in order to obtain smoothed velocities and accelerations. However, this technique does not yield satisfactory results, especially when dealing with complex kinematic motions that occupy the low- and high-frequency bands. The use of the discrete wavelet transform, as an alternative to digital filters, is presented in this paper. The transform passes the original signal through two complementary low- and high-pass FIR filters and decomposes the signal into an approximation function and a detail function. Further decomposition transforms the signal into a hierarchical set of orthogonal approximation and detail functions. A reverse process is employed to perfectly reconstruct the signal (inverse transform) from its approximation and detail functions. The discrete wavelet transform was applied to the displacement data recorded by Pezzack et al., 1977. The smoothed displacement data were twice differentiated and compared to Pezzack et al.'s acceleration data in order to choose the most appropriate filter coefficients and decomposition level on the basis of maximizing the percentage of retained energy (PRE) and minimizing the root mean square error (RMSE). The Daubechies wavelet of the fourth order (Db4) at the second decomposition level showed better results than both the biorthogonal and Coiflet wavelets (PRE = 97.5%, RMSE = 4.7 rad/s²). The Db4 wavelet was then used to compress complex displacement data obtained from a noisy, mathematically generated function. The results clearly indicate the superiority of this new smoothing approach over traditional filters.
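
A minimal sketch of Db4 wavelet smoothing at the second decomposition level followed by double differentiation, assuming the PyWavelets package. Discarding the detail coefficients outright is the simplest variant; the paper instead tunes the retained coefficients by maximizing PRE and minimizing RMSE, which is not reproduced here.

    import numpy as np
    import pywt

    def wavelet_smooth(x, wavelet="db4", level=2):
        """Decompose, zero the detail coefficients, and reconstruct the signal
        from the approximation only."""
        coeffs = pywt.wavedec(x, wavelet, level=level)
        coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
        return pywt.waverec(coeffs, wavelet)[: len(x)]

    fs = 50.0                                          # sampling rate (Hz), illustrative
    t = np.arange(0, 2, 1 / fs)
    x = np.sin(2 * np.pi * 1.5 * t) + 0.02 * np.random.randn(t.size)
    acc = np.gradient(np.gradient(wavelet_smooth(x), 1 / fs), 1 / fs)  # smoothed acceleration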

7.

Background

The electrocardiogram (ECG) signal provides important information about the heart's electrical activity in medical and diagnostic applications. This signal may be contaminated by different types of noise. One of the noise types which has considerable overlap with the ECG signal in the frequency domain is the electromyogram (EMG). Among the existing approaches for de-noising ECG signals, those based on singular spectrum analysis (SSA) are popular.

Methods

In this paper, we propose a method based on SSA to separate the ECG signal from EMG noise. In general, SSA consists of four steps: embedding, singular value decomposition, grouping, and diagonal averaging. Among these steps, the grouping step involves parameters (indices) which can be adjusted to achieve the desired results. Indeed, grouping is one of the important steps of SSA, as the ECG and EMG signals are separated in this step. Hence, in the proposed method, a new criterion is presented for selecting the indices in the grouping step so as to separate the ECG from the EMG signal with higher accuracy.

Results

The performance of the proposed method is investigated using several experiments. Two subsets from the PhysioNet MIT-BIH arrhythmia database are used for this purpose.

Conclusion

The experimental results demonstrate the effectiveness of the proposed method in comparison with other SSA-based techniques.
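
A minimal sketch of the four SSA steps named in the Methods section above (embedding, SVD, grouping, diagonal averaging), assuming NumPy; the window length and the grouping indices are left as inputs because choosing them is exactly the criterion this paper proposes and is not reproduced here.

    import numpy as np

    def ssa_decompose(x, window, groups):
        """groups is a list of index lists, e.g. [[0, 1, 2], [3, 4]];
        returns one reconstructed series per group."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        k = n - window + 1
        X = np.column_stack([x[i:i + window] for i in range(k)])  # trajectory matrix
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        out = []
        for idx in groups:
            Xg = (U[:, idx] * s[idx]) @ Vt[idx, :]                # grouped elementary components
            y = np.zeros(n)
            counts = np.zeros(n)
            for i in range(window):                               # diagonal (anti-diagonal) averaging
                for j in range(k):
                    y[i + j] += Xg[i, j]
                    counts[i + j] += 1
            out.append(y / counts)
        return out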

8.
Based on the theory of stochastic resonance, the signal-to-noise ratio (SNR) of the HPLC/UV chromatographic signal of roxithromycin is enhanced by the cooperation of signal, noise, and a nonlinear system. A simple new method for the determination of low concentrations of roxithromycin in beagle dog plasma is presented. Using signal enhancement by stochastic resonance, this method extends the limit of quantitation from the reported 0.5 to 0.1 µg/ml. During validation of the new method, HPLC/MS was used as a comparison technique. The results indicate that the recovery and the low concentrations of roxithromycin measured in beagle dog plasma were equivalent between the two methods (P>0.05). Stochastic resonance may be a promising tool for improving detection limits in trace analysis.
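
The chromatographic implementation is not described in enough detail above to reproduce, but the underlying mechanism can be illustrated with the textbook bistable (double-well) system: a sub-threshold signal plus a tuned amount of noise makes the system hop between wells in step with the signal, raising the SNR at the signal frequency. The function, parameters, and toy signal below are assumptions for illustration only.

    import numpy as np

    def bistable_sr(signal, noise_sd, dt=1e-3, a=1.0, b=1.0):
        """Euler-Maruyama integration of dx/dt = a*x - b*x**3 + s(t) + noise."""
        rng = np.random.default_rng(1)
        x = np.zeros(len(signal))
        for i in range(1, len(signal)):
            drift = a * x[i - 1] - b * x[i - 1] ** 3 + signal[i - 1]
            x[i] = x[i - 1] + drift * dt + noise_sd * np.sqrt(dt) * rng.normal()
        return x

    dt = 1e-3
    t = np.arange(0, 50, dt)
    weak = 0.25 * np.sin(2 * np.pi * 0.1 * t)         # sub-threshold periodic signal
    out = bistable_sr(weak, noise_sd=0.8, dt=dt)      # noise level would be tuned for maximum SNR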

9.
10.
A new method for the detection of temporomandibular disorder based on singular spectrum analysis is presented. In this method, the motion data of markers placed at points of special interest on the faces of several subjects are extracted and analysed. The individuals are classified into a group of healthy subjects and a group of those with temporomandibular disorder by extracting the signal components of the original time series and separating the noise using the proposed technique. The results for both simulated and real data verify the effectiveness of the proposed algorithm.

11.
Analysis of dynamic brain imaging data.
Modern imaging techniques for probing brain function, including functional magnetic resonance imaging, intrinsic and extrinsic contrast optical imaging, and magnetoencephalography, generate large data sets with complex content. In this paper we develop appropriate techniques for analysis and visualization of such imaging data to separate the signal from the noise and characterize the signal. The techniques developed fall into the general category of multivariate time series analysis, and in particular we extensively use the multitaper framework of spectral analysis. We develop specific protocols for the analysis of fMRI, optical imaging, and MEG data, and illustrate the techniques by applications to real data sets generated by these imaging modalities. In general, the analysis protocols involve two distinct stages: "noise" characterization and suppression, and "signal" characterization and visualization. An important general conclusion of our study is the utility of a frequency-based representation, with short, moving analysis windows to account for nonstationarity in the data. Of particular note are 1) the development of a decomposition technique (space-frequency singular value decomposition) that is shown to be a useful means of characterizing the image data, and 2) the development of an algorithm, based on multitaper methods, for the removal of approximately periodic physiological artifacts arising from cardiac and respiratory sources.
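
A minimal multitaper power-spectrum estimate in the spirit of the framework the paper uses, assuming NumPy/SciPy; the time-bandwidth product and normalisation are illustrative, and the paper's space-frequency SVD and periodic-artifact removal are not shown.

    import numpy as np
    from scipy.signal.windows import dpss

    def multitaper_psd(x, fs, NW=4):
        """Average the periodograms computed with the first 2*NW-1 Slepian tapers."""
        x = np.asarray(x, dtype=float) - np.mean(x)
        n = len(x)
        tapers = dpss(n, NW, Kmax=2 * NW - 1)          # shape (K, n)
        specs = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2
        return np.fft.rfftfreq(n, d=1 / fs), specs.mean(axis=0) / fs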

12.
Spatial filtering, or beamforming, is a commonly used data-driven analysis technique in the field of Magnetoencephalography (MEG). Although routinely referred to as a single technique, beamforming in fact encompasses several different methods, both with regard to defining the spatial filters used to reconstruct source-space time series and in terms of the analysis of these time series. This paper evaluates two alternative methods of spatial filter construction and application. It demonstrates how encoding different requirements into the design of these filters has an effect on the results obtained. The analyses presented demonstrate the potential value of implementations which examine the time series projections in multiple orientations at a single location by showing that beamforming can reconstruct predominantly radial sources in the case of a multiple-spheres forward model. The accuracy of source reconstruction appears to be more related to depth than source orientation. Furthermore, it is shown that using three 1-dimensional spatial filters can result in inaccurate source-space time series reconstruction. The paper concludes with brief recommendations regarding reporting beamforming methodologies in order to help remove ambiguity about the specifics of the techniques which have been used.
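
One common way to build the multi-orientation spatial filter discussed above is the vector LCMV beamformer, sketched below under stated assumptions (NumPy, a regularised sensor covariance); the function name and regularisation choice are illustrative, and the paper's exact filter definitions may differ.

    import numpy as np

    def lcmv_weights(leadfield, cov, reg=0.05):
        """Vector LCMV filter for one location: W = (L' C^-1 L)^-1 L' C^-1.
        leadfield: (n_sensors, 3) columns for the three source orientations.
        cov:       (n_sensors, n_sensors) sensor data covariance."""
        n = cov.shape[0]
        C = cov + reg * np.trace(cov) / n * np.eye(n)        # Tikhonov regularisation
        Cinv = np.linalg.inv(C)
        L = leadfield
        return np.linalg.solve(L.T @ Cinv @ L, L.T @ Cinv)   # (3, n_sensors)

    # Source-space time series in three orientations at this location:
    # y(t) = W @ b(t), where b is the (n_sensors, n_times) sensor data.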

13.
The analysis of a temporal series usually begins with a visual inspection of the raw data, from which a proper method for the detection of periodicities is chosen. Some of the methods currently used, such as circular statistics, Cosinor, or spectral analyses, are useful when it comes to ascertaining the existence of periods expected ‘a priori’ or to detecting unknown frequencies. Even though some of the methods allow a wide scanning of possibilities, difficulties arise when signals are weak and concealed in larger-amplitude noise. The record of the activity of a cave cricket, Strinatia brevipennis, under constant conditions showed an intricate pattern of small peaks interspersed with rare ones of much higher amplitude. Attempts to analyse these data with the usual methods gave inconsistent results and sometimes did not detect rhythms. The results are mostly biased by the large-amplitude components, which hamper the detection of rhythms from weak signals. Schimmel and Paulssen (1997) proposed a noise-reduction method which detects weak but coherent signals. This tool was developed for the analysis of seismic data and was later adapted to the analysis of temporal series of biological data. The method is called the phase weighted stack (PWS) and performs a weighted summation of temporal series according to their coherence. The result is a stacked time series cleaned of incoherent noise, allowing the detection of weak signals that would otherwise be indistinguishable from noise. The method also enables the identification of the time (hour) of every periodic signal. The use of PWS in the analysis of the crickets' activity data clarified the frequency content, exposing a circadian component in all records.
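
A minimal phase weighted stack, assuming NumPy/SciPy and time-aligned segments (e.g. one day of activity per row); the sharpness exponent and the segmentation are assumptions, not the published implementation.

    import numpy as np
    from scipy.signal import hilbert

    def phase_weighted_stack(traces, nu=2.0):
        """Weight the linear stack of aligned traces by the coherence of their
        instantaneous phases, so incoherent noise is attenuated while weak but
        coherent signals survive.
        traces: (n_traces, n_samples) array of time-aligned segments."""
        analytic = hilbert(traces, axis=1)
        phasors = analytic / np.abs(analytic)          # unit phasors e^{i*phi}
        coherence = np.abs(phasors.mean(axis=0))       # 1 = perfectly coherent across traces
        return traces.mean(axis=0) * coherence ** nu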

14.
Biomechanical signals are represented in the time-frequency domain using the Wigner distribution function. Filtering of this representation for the case of a non-stationary displacement signal with impact is studied. The smoothed displacement data are then double differentiated and compared with reference accelerometer data. It is shown that this technique is able to remove noise from these signals better than the conventional filtering techniques currently used in biomechanics.

15.
1 Introduction: It is well known that nerve cells work in a noisy environment, with noise sources ranging from internal thermal noise to external perturbation. One puzzling problem is how nerve cells accommodate noise in coding and transforming information; recent research shows that noise may p…

16.
Unsupervised clustering represents a powerful technique for self-organized segmentation of biomedical image time series data describing groups of pixels exhibiting similar properties of local signal dynamics. The theoretical background is presented in the beginning, followed by several medical applications demonstrating the flexibility and conceptual power of these techniques. These applications range from functional MRI data analysis to dynamic contrast-enhanced perfusion MRI and breast MRI. For fMRI, these methods can be employed to identify and separate time courses of interest, along with their associated spatial patterns. When applied to dynamic perfusion MRI, they identify groups of voxels associated with time courses that are clinically informative and straightforward to interpret. In breast MRI, a segmentation of the lesion is achieved and in addition a subclassification is obtained within the lesion with regard to regions characterized by different MRI signal time courses. In the present paper, we conclude that unsupervised clustering techniques provide a robust method for blind analysis of time series image data in the important and current field of functional and dynamic MRI.
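
A minimal pixel-time-course clustering using k-means as a stand-in for the family of unsupervised techniques the abstract describes; the specific algorithm, preprocessing, and number of clusters used in the paper may differ, and the data shapes below are purely illustrative.

    import numpy as np
    from sklearn.cluster import KMeans

    n_frames, n_pixels, n_clusters = 120, 4096, 5
    rng = np.random.default_rng(0)
    data = rng.normal(size=(n_frames, n_pixels))        # toy dynamic image series

    X = data.T                                          # one row per pixel time course
    X = (X - X.mean(axis=1, keepdims=True)) / (X.std(axis=1, keepdims=True) + 1e-12)
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(X)

    # Cluster-mean time courses summarise each response class
    prototypes = np.array([data[:, labels == k].mean(axis=1) for k in range(n_clusters)])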

17.
《IRBM》2019,40(3):145-156
Objective

Electrocardiogram (ECG) is a diagnostic tool for recording the electrical activity of the human heart non-invasively. It is detected by electrodes placed on the surface of the skin in a conductive medium. In medical applications, ECG is used by cardiologists to observe heart anomalies (cardiovascular diseases) such as abnormal heart rhythms, heart attacks, effects of drug dosage on a subject's heart, and evidence of previous heart attacks. The recorded ECG signal is generally corrupted by various types of noise/distortion, either cardiac (isoelectric interval, prolonged depolarization and atrial flutter) or extra-cardiac (respiration, changes in electrode position, muscle contraction and power line noise). These factors hide the useful information and alter the signal characteristics due to a low Signal-to-Noise Ratio (SNR). In such situations, any failure to judge the ECG signal correctly may result in a delay in treatment and harm the subject's (patient's) health. Therefore, an appropriate pre-processing technique is necessary to improve the SNR and facilitate better treatment of the subject. The effects of different pre-processing techniques on ECG signal analysis (based on R-peak detection) are compared using various Figures of Merit (FoM) such as sensitivity (Se), accuracy (Acc) and detection error rate (DER), along with SNR.

Methods

In this research article, a new fractional wavelet transform (FrWT) is proposed as a pre-processing technique to overcome the disadvantages of other commonly used techniques, viz. the wavelet transform (WT) and the fractional Fourier transform (FrFT). The proposed FrWT possesses the properties of multiresolution analysis and represents the signal in the fractional domain, i.e. in terms of a rotation of the signal in the time–frequency plane. In the literature, ECG signal analysis has been improved using statistical pre-processing techniques such as principal component analysis (PCA) and independent component analysis (ICA). However, both PCA and ICA are prone to suffer from slight alterations in either signal or noise, unless the basis functions are prepared from a very broad set of ECG records. Independent Principal Component Analysis (IPCA) has been used to overcome this shortcoming of PCA and ICA. Therefore, in this paper three techniques, viz. FrFT, FrWT and IPCA, are selected for comparison in the pre-processing of ECG signals.

Results

The selected methods have been evaluated on the basis of the SNR, Se, Acc and DER of the detected ECG beats. FrWT yields the best results among all the methods considered in this paper: 34.37 dB output SNR, 99.98% Se, 99.96% Acc, and 0.036% DER. These results indicate the quality of the physiological information retained in the pre-processed ECG signals for identifying different heart abnormalities.

Conclusion

Correct analysis of the acquired ECG signal is the main challenge for the cardiologist owing to the presence of various types of noise (high and low frequency). Twenty-two real-time ECG records have been evaluated using FoM such as SNR, Se, Acc and DER for the proposed FrWT and the existing FrFT and IPCA pre-processing techniques. An acquired real-time ECG database covering normal and disease situations is used for this purpose. The FoM values indicate a high SNR and better detection of R-peaks in the ECG signal, which is important for the diagnosis of cardiovascular disease. The proposed FrWT outperforms all other techniques and adequately preserves both the analytical attributes of the actual ECG signal and the amplitude variations of the various ECG waveforms. It also provides signal representations in the time-fractional-frequency plane with low computational complexity, enabling its practical use in versatile applications.
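
For reference, the figures of merit quoted above are simple ratios of true and false beat detections. The definitions below follow common usage in R-peak detection studies and are an assumption; the paper's exact formulas may differ slightly.

    def detection_fom(tp: int, fp: int, fn: int) -> dict:
        se = tp / (tp + fn)                 # sensitivity
        acc = tp / (tp + fp + fn)           # detection accuracy
        der = (fp + fn) / (tp + fn)         # detection error rate relative to true beats
        return {"Se": 100 * se, "Acc": 100 * acc, "DER": 100 * der}

    print(detection_fom(tp=9995, fp=2, fn=2))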

18.
The spike trains generated by a neuron model are studied by the methods of nonlinear time series analysis. The results show that the spike trains are chaotic. To investigate the effect of noise on the transmission of chaotic spike trains, these chaotic spike trains are used as a discrete subthreshold input signal to an integrate-and-fire neuronal model and a FitzHugh-Nagumo (FHN) neuronal model working in a noisy environment. The mutual information between the input spike trains and the output spike trains is calculated; the results show that the transformation of information encoded by the chaotic spike trains is optimized by some level of noise, and that stochastic resonance (SR), measured by mutual information, is a property available for neurons to transmit chaotic spike trains.

19.
The surface electromyographic (sEMG) signal that originates in the muscle is inevitably contaminated by various noise signals or artifacts that originate at the skin-electrode interface, in the electronics that amplifies the signals, and in external sources. Modern technology is substantially immune to some of these noises, but not to the baseline noise and the movement artifact noise. These noise sources have frequency spectra that contaminate the low-frequency part of the sEMG frequency spectrum. There are many factors which must be taken into consideration when determining the appropriate filter specifications to remove these artifacts; they include the muscle tested and type of contraction, the sensor configuration, and specific noise source. The band-pass determination is always a compromise between (a) reducing noise and artifact contamination, and (b) preserving the desired information from the sEMG signal. This study was designed to investigate the effects of mechanical perturbations and noise that are typically encountered during sEMG recordings in clinical and related applications. The analysis established the relationship between the attenuation rates of the movement artifact and the sEMG signal as a function of the filter band pass. When this relationship is combined with other considerations related to the informational content of the signal, the signal distortion of filters, and the kinds of artifacts evaluated in this study, a Butterworth filter with a corner frequency of 20 Hz and a slope of 12 dB/oct is recommended for general use. The results of this study are relevant to biomechanical and clinical applications where the measurements of body dynamics and kinematics may include artifact sources.
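
The recommended filter translates directly into code; a minimal sketch assuming NumPy/SciPy, with the function name and toy data as assumptions. A 2nd-order Butterworth gives the recommended 12 dB/oct slope; note that zero-phase filtering with filtfilt would double the effective roll-off, so lfilter is used here to match the stated specification.

    import numpy as np
    from scipy.signal import butter, lfilter

    def highpass_emg(emg, fs):
        """Butterworth high-pass, 20 Hz corner, 2nd order (12 dB/oct), to
        suppress movement artefact and baseline noise in raw sEMG."""
        b, a = butter(N=2, Wn=20.0, btype="highpass", fs=fs)
        return lfilter(b, a, emg)

    fs = 2000.0                                         # sampling rate (Hz), illustrative
    t = np.arange(0, 2, 1 / fs)
    raw = np.random.randn(t.size) + 0.5 * np.sin(2 * np.pi * 3 * t)  # toy sEMG + low-frequency artefact
    clean = highpass_emg(raw, fs)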

20.
Averaging signals in the time domain is one of the main methods of noise attenuation in biomedical signal processing for systems producing repetitive patterns, such as electrocardiographic (ECG) acquisition systems. This paper presents a comprehensive study of the weighted averaging of ECG signals. The presented methods use criterion function minimization, partitioning of the input data set in the time domain, as well as Bayesian and empirical Bayesian frameworks. The existing methods are described together with their extensions. The performance of all presented methods is experimentally evaluated and compared with traditional averaging using the arithmetic mean and with well-known weighted averaging methods based on criterion function minimization (WACFM).
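
A minimal sketch of one weighted-averaging scheme, assuming NumPy and time-aligned beats: weights inversely proportional to each beat's residual noise power. The paper studies several more elaborate schemes (criterion-function minimization, Bayesian and empirical Bayesian), which are not reproduced here.

    import numpy as np

    def weighted_average_beats(beats):
        """beats: (n_beats, n_samples) array of time-aligned ECG cycles."""
        ref = beats.mean(axis=0)                           # provisional template
        noise_power = np.mean((beats - ref) ** 2, axis=1)  # residual power per beat
        w = 1.0 / np.maximum(noise_power, 1e-12)           # inverse-variance weights
        w /= w.sum()
        return (w[:, None] * beats).sum(axis=0)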
