Similar Articles
20 similar articles retrieved (search time: 828 ms)
1.
A Fourier deconvolution method has been developed to explicitly determine the amount of backbone amide deuterium incorporated into protein regions or segments by hydrogen/deuterium (H/D) exchange with high-resolution mass spectrometry. Determination and analysis of the level and number of backbone amides exchanging in solution provide more information about the solvent accessibility of the protein than do previous centroid methods, which only calculate the average number of deuterons exchanged. After exchange, a protein is digested into peptides as a way of determining the exchange within a local area of the protein. The mass of a peptide upon deuteration is a sum of the natural isotope abundance, fast-exchanging side-chain hydrogens (present in MALDI-TOF H/2H data) and backbone amide exchange. Removal of the components of the isotopic distribution due to the natural isotope abundances and the fast-exchanging side chains allows for a precise quantification of the levels of backbone amide exchange, as is shown by an example from protein kinase A. The deconvoluted results are affected by overlapping peptides or inconsistent mass envelopes, and evaluation procedures for these cases are discussed. Finally, a method for determining the back-exchange-corrected populations is presented, and its effect on the data is discussed under various circumstances.
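The core operation here, removing a known contribution (natural isotope abundance, fast-exchanging side chains) from a measured mass envelope, is a deconvolution. The paper performs it by dividing Fourier transforms; for exact, noise-free distributions, direct sequential division gives the same answer. A minimal sketch (function name and toy distributions are ours, not the paper's):

```python
def deconvolve(observed, kernel):
    """Recover d from observed = kernel (convolved with) d.

    Equivalent to dividing Fourier transforms when the data are
    exact; real spectra need the regularized Fourier-space version.
    """
    n = len(observed) - len(kernel) + 1
    d = [0.0] * n
    rem = list(observed)
    for i in range(n):
        d[i] = rem[i] / kernel[0]          # leading coefficient fixes d[i]
        for j, k in enumerate(kernel):     # subtract its contribution
            rem[i + j] -= d[i] * k
    return d
```

In practice the Fourier-space division is preferred because it can be damped against noise, which plain sequential division amplifies.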

2.
The structure of Chromatium high potential iron protein (HiPIP) has been refined by semiautomatic Fo-Fc (observed minus calculated structure amplitude) Fourier methods to a conventional R index, R = (sum of |Fo - Fc|) / (sum of Fo), of 24.7% for a model in which bond distances and angles are constrained to standard values. Bond length and angle constraints were applied only intermittently during the computations. At a late stage of the refinement, atomic parameters for only the Fe4S4 cluster plus the 4 associated cysteine S-gamma atoms were adjusted by least-squares methods and kept fixed during the rest of the refinement. The refined model consists of 625 of the 632 nonhydrogen atoms in the protein plus 75 water molecules. Seven side-chain atoms could not be located in the final electron density map. A computer program rather than visual inspection was used wherever possible in the refinement: for locating water molecules, for removing water molecules that too closely approach other atoms, for deleting atoms that lay in regions of low electron density, and for evaluating the progress of refinement. Fo-Fc Fourier refinement is sufficiently economical to be applied routinely in protein crystal structure determinations. The complete HiPIP refinement required approximately 12 hours of CDC 3600 computer time and cost less than $3000 starting from a "trial structure," based upon multiple isomorphous replacement phases, which gave an R of 43%...

3.
This study evaluated the contributions of sympathetic and parasympathetic modulation to heart rate variability during situations in which vagal and sympathetic tone predominated. In a placebo-controlled, randomized, double-blind blockade study, six young healthy male individuals received propranolol (0.2 mg x kg(-1)), atropine (0.04 mg x kg(-1)), propranolol plus atropine, or placebo infusions over 4 days. Time-domain indices were calculated during 40 min of rest and 20 min of exercise at 70% of maximal exercise intensity. Spectrum analysis, using fast Fourier transformation, was also performed at rest and during the exercise. The time-domain indices standard deviation of R-R intervals, mean of the standard deviations of all R-R intervals for all 5-min segments, percentage of pairs of adjacent R-R intervals differing by more than 50 ms, and square root of the mean of the sum of squares of differences between adjacent R-R intervals were reduced after atropine and propranolol plus atropine. Propranolol alone caused no appreciable change in any of the time-domain indices. At rest, all spectrum components were similar after placebo and propranolol infusions, but following parasympathetic and double autonomic blockade there was a reduction in all components of the spectrum analysis, except for the low:high ratio. During exercise, partial and double blockade did not significantly change any of the spectrum components. Thus, time- and frequency-domain indices of heart rate variability were able to detect vagal activity, but could not detect sympathetic activity. During exercise, spectrum analysis is not capable of evaluating autonomic modulation of heart rate.
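The time-domain indices named above are simple statistics of the R-R interval series. A minimal sketch of three of them (SDNN, RMSSD, and pNN50; function and variable names are ours, not the study's):

```python
import math

def hrv_time_domain(rr_ms):
    """Common time-domain HRV indices from R-R intervals in ms."""
    n = len(rr_ms)
    mean_rr = sum(rr_ms) / n
    # SDNN: standard deviation of all R-R intervals (sample SD)
    sdnn = math.sqrt(sum((x - mean_rr) ** 2 for x in rr_ms) / (n - 1))
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    # RMSSD: root mean square of successive differences
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    # pNN50: percentage of successive differences exceeding 50 ms
    pnn50 = 100.0 * sum(1 for d in diffs if abs(d) > 50) / len(diffs)
    return sdnn, rmssd, pnn50
```

The fourth index in the abstract (the mean of the SDs of all 5-min segments) is the same SDNN computation applied segment by segment and averaged.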

4.
It has been known for some time that the variability of the R-R intervals in the electrocardiogram yields valuable information concerning the types of arrhythmia which might be present. In this paper, an investigation is made into the application of zero-crossing analysis to the study of such variability. The number of times that the R-R interval crosses its mean value over a specified interval of time is counted, and may be associated with a particular characteristic frequency, related to the dominant frequency components of the power spectrum of R-R intervals. Higher order crossing counts may be computed by taking combinations of sum and difference operations on the original time series. The advantage of using zero-crossing analysis over spectral analysis is the computational simplicity of the former. It is demonstrated, by analysing data taken from the MIT-BIH Arrhythmia database, that zero-crossing analysis can sometimes be used to distinguish between different arrhythmias, but forethought concerning the number of sum and difference operations to be taken on the original data set is required when computing the higher order crossing counts.
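The counting described above is a few lines of code, which is the computational-simplicity argument in miniature. A minimal sketch (names are ours; only the difference operator is shown, though the paper also considers sums):

```python
def zero_crossings(x):
    """Count crossings of the series' mean value."""
    m = sum(x) / len(x)
    signs = [1 if v >= m else -1 for v in x]
    return sum(1 for a, b in zip(signs, signs[1:]) if a != b)

def higher_order_crossings(x, k):
    """Apply the difference operator k times, then count mean crossings.
    Differencing emphasises progressively higher frequencies."""
    y = list(x)
    for _ in range(k):
        y = [b - a for a, b in zip(y, y[1:])]
    return zero_crossings(y)
```

A fast-oscillating series crosses its mean nearly once per sample, so the count maps directly onto a dominant frequency.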

5.
A method of nonlinear analysis in the frequency domain.
A method is developed for the analysis of nonlinear biological systems based on an input temporal signal that consists of a sum of a large number of sinusoids. Nonlinear properties of the system are manifest by responses at harmonics and intermodulation frequencies of the input frequencies. The frequency kernels derived from these nonlinear responses are similar to the Fourier transforms of the Wiener kernels. Guidelines for the choice of useful input frequency sets, and examples satisfying these guidelines, are given. A practical algorithm for varying the relative phases of the input sinusoids to separate high-order interactions is presented. The utility of this technique is demonstrated with data obtained from a cat retinal ganglion cell of the Y type. For a high spatial frequency grating, the entire response is contained in the even-order nonlinear components. Even at low contrast, fourth-order components are detectable. This suggests the presence of an essential nonlinearity in the functional pathway of the Y cell, with its singularity at zero contrast.
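The probing scheme can be illustrated with a toy static nonlinearity standing in for the biological system (all names are ours): squaring a two-sinusoid input moves the entire response to the intermodulation frequencies |f1 - f2| and f1 + f2 and the harmonics 2f1 and 2f2, leaving nothing at the fundamentals.

```python
import math

def stimulus(freqs, t):
    """Sum-of-sinusoids input evaluated at sample times t."""
    return [sum(math.sin(2 * math.pi * f * ti) for f in freqs) for ti in t]

def power_at(signal, t, f):
    """Squared amplitude of the Fourier component at frequency f
    (assumes t spans an integer number of cycles of f)."""
    n = len(signal)
    re = sum(s * math.cos(2 * math.pi * f * ti) for s, ti in zip(signal, t))
    im = sum(s * math.sin(2 * math.pi * f * ti) for s, ti in zip(signal, t))
    return (2 * re / n) ** 2 + (2 * im / n) ** 2

t = [i / 64 for i in range(64)]          # one second, 64 samples
x = stimulus([3, 5], t)                  # input frequencies 3 and 5 Hz
y = [v * v for v in x]                   # even-order (squaring) nonlinearity
# response concentrates at 5 - 3 = 2 Hz and 5 + 3 = 8 Hz, not at 3 or 5
```

This is why the choice of input frequency set matters: the set must keep the harmonic and intermodulation frequencies distinct so each kernel value can be read off unambiguously.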

6.
7.
The measurement of single ion channel kinetics is difficult when those channels exhibit subconductance events. When the kinetics are fast, and when the current magnitudes are small, as is the case for Na+, Ca2+, and some K+ channels, these difficulties can lead to serious errors in the estimation of channel kinetics. I present here a method, based on the construction and analysis of mean-variance histograms, that can overcome these problems. A mean-variance histogram is constructed by calculating the mean current and the current variance within a brief "window" (a set of N consecutive data samples) superimposed on the digitized raw channel data. Systematic movement of this window over the data produces large numbers of mean-variance pairs which can be assembled into a two-dimensional histogram. Defined current levels (open, closed, or sublevel) appear in such plots as low variance regions. The total number of events in such low variance regions is estimated by curve fitting and plotted as a function of window width. This function decreases with the same time constants as the original dwell time probability distribution for each of the regions. The method can therefore be used: 1) to present a qualitative summary of the single channel data from which the signal-to-noise ratio, open channel noise, steadiness of the baseline, and number of conductance levels can be quickly determined; 2) to quantify the dwell time distribution in each of the levels exhibited. In this paper I present the analysis of a Na+ channel recording that had a number of complexities. The signal-to-noise ratio was only about 8 for the main open state; open channel noise and fast flickers to other states were present, as were a substantial number of subconductance states.
"Standard" half-amplitude threshold analysis of these data produced open and closed time histograms that were well fitted by the sum of two exponentials, but with apparently erroneous time constants, whereas the mean-variance histogram technique provided a more credible analysis of the open, closed, and subconductance times for the patch. I also show that the method produces accurate results on simulated data in a wide variety of conditions, whereas the half-amplitude method, when applied to complex simulated data, shows the same errors as were apparent in the real data. The utility and the limitations of this new method are discussed.
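The histogram construction itself reduces to a sliding-window computation; a minimal sketch (names are ours; binning the pairs into the 2-D histogram and the curve fitting are omitted):

```python
def mean_variance_pairs(samples, window):
    """Slide a window of N consecutive samples over the record and
    emit one (mean, variance) pair per position. Low-variance pairs
    cluster at the defined current levels (open, closed, sublevel);
    windows that straddle a transition show inflated variance."""
    pairs = []
    for i in range(len(samples) - window + 1):
        w = samples[i:i + window]
        m = sum(w) / window
        v = sum((s - m) ** 2 for s in w) / window
        pairs.append((m, v))
    return pairs
```

Repeating this over a range of window widths, and fitting the event counts in each low-variance region as a function of width, yields the dwell-time constants described above.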

8.
9.
Background: The Royal College of Pathologists of Australasia (RCPA) Porphyrin Quality Assurance Program assesses the measurement of urine, faecal, plasma and whole blood porphyrins and their components, plus urinary porphobilinogen and delta-aminolaevulinic acid, and has laboratories enrolled from around the world. It was observed that there was a wide scatter in results submitted to some subsections of the program. Methods: A detailed questionnaire covering the analytical techniques used in the diagnosis of porphyria was sent to all laboratories enrolled in the RCPA Porphyrin Quality Assurance Program. Additionally, self-enrolment data over a five-year period were examined for trends/changes in standardisation, reagent sources and analytical technique. Results: Twenty of the 45 laboratories enrolled in the Porphyrin Quality Assurance Program completed the survey, providing a snapshot of the analytical techniques used world-wide. Post-survey self-enrolment data indicated little or no noticeable change in the standardisation of analytical techniques despite the continual lack of agreement of results in subsections of the External Quality Assurance program. Conclusions: While some aspects of porphyria testing are relatively consistent between laboratories, other diagnostic techniques vary widely. A wide variety of individualised reference intervals and reporting techniques is currently in use world-wide. While most of the participants in the survey are regional reference centres specialising in the diagnosis of porphyria and, as such, their diagnostic capability is not in question, international guidelines or global harmonisation of analytical techniques should allow better inter-laboratory comparisons to be made, ultimately improving diagnostic accuracy.

10.
11.
A method based on the Fourier transform is presented for the representation of data by an arbitrary sum of exponentials or Gaussian functions. The method has been successfully applied to the type of data sets which arise in pharmacokinetic studies. Two techniques for error ripple elimination are discussed.

12.
13.
Pulsed-laser photoacoustics is a technique which measures photoinduced enthalpic and volumetric changes on the nano- and microsecond timescales. Analysis of photoacoustic data generally requires deconvolution for a sum of exponentials, a procedure which has been developed extensively in the field of time-resolved fluorescence decay. Initial efforts to adapt an iterative nonlinear least squares computer program, utilizing the Marquardt algorithm, from the fluorescence field to photoacoustics indicated that significant modifications were needed. The major problem arises from the wide range of transient decay times which must be addressed by the photoacoustic technique. We describe an alternative approach to numerical convolution with exponential decays, developed to overcome the problems. Instead of using an approximation method (Simpson's rule) for evaluating the convolution integral, we construct a continuous instrumental response function by quadratic fitting of the discrete data and evaluate the convolution integral directly, without approximations. The success and limitations of this quadratic-fit convolution program are then demonstrated using simulated data. Finally, the program is applied to the analysis of experimental data to compare the resolution capabilities of two commercially available transducers. The advantages of a broadband, heavily damped transducer are shown for a standard organic photochemical system, the quenching of the triplet state of benzophenone by 2,5-dimethyl-2,4-hexadiene.
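For context, the kind of discrete approximation the quadratic-fit approach replaces can be sketched as a plain rectangular-rule convolution of a sampled instrument response with a single exponential decay (names and the rectangular rule are ours; the paper's point is that such approximations break down over its wide range of decay times):

```python
import math

def convolve_exp(irf, dt, tau, amp=1.0):
    """Rectangular-rule convolution of a sampled instrument response
    irf (sample spacing dt) with the decay amp * exp(-t / tau)."""
    decay = [amp * math.exp(-i * dt / tau) for i in range(len(irf))]
    out = [0.0] * len(irf)
    for i in range(len(irf)):
        for j in range(i + 1):           # causal sum over earlier samples
            out[i] += irf[j] * decay[i - j] * dt
    return out
```

When tau is comparable to dt, this sum misrepresents the decay within each sampling interval, which is exactly the error the continuous quadratic-fit evaluation avoids.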

14.
15.
Quantifying mechanical output is fundamental to understanding metabolism that fuels muscle contraction and more recent attempts to understand signal transduction and gene regulation. The latter requires long-term application of exercise protocols that result in large amounts of data on muscle performance. The purpose of this study was to develop software for automated quantification of skeletal muscle contractions. An in situ mouse sciatic nerve stimulation model was used to produce contractions over a broad range of frequencies and recorded as both digital and analog signals using a PC analog to digital converter board and chart recorder, respectively. Spectral analysis of the noise components formed the basis for designing a smoothing Chebyshev filter. Algorithms implemented in custom software identified twitches and estimated baseline levels from the smoothed signal. The time to peak force, peak force, tension-time integral, and half-relaxation time were determined for each twitch after baseline correction. The automated results were compared to those obtained from manual measurements of the analog signal. Bland–Altman analysis of the parameters computed from digital signals compared with the corresponding measurements by manual planometry demonstrates the agreement of the digital processing algorithm with planometry over a wide range of twitch characteristics. This program may also be used to study the mechanics of other preparations from isolated muscles, human proximal limb performance, and other digital physiologic signals. Adaptation of the filter function is required to apply the analysis to another experimental apparatus with differing noise characteristics. A full version of the program and instructions for its use are available for download at www.rad.msu.edu.
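Once a twitch has been segmented and the baseline estimated, the four quantities named above are straightforward to compute. A minimal sketch for a single baseline-corrected twitch (function and variable names are ours, not the published program's):

```python
def twitch_metrics(force, dt, baseline=0.0):
    """Peak force, time to peak, tension-time integral, and
    half-relaxation time for one sampled twitch (spacing dt)."""
    f = [v - baseline for v in force]
    peak = max(f)
    i_peak = f.index(peak)
    t_peak = i_peak * dt
    # tension-time integral by the trapezoidal rule
    tti = sum((a + b) / 2 * dt for a, b in zip(f, f[1:]))
    # half-relaxation: first time after the peak at which force
    # falls to half the peak value
    i_half = next((i for i in range(i_peak, len(f)) if f[i] <= peak / 2), None)
    t_half = i_half * dt - t_peak if i_half is not None else None
    return peak, t_peak, tti, t_half
```

The hard part of the published pipeline is upstream of this: the noise-matched smoothing filter and twitch/baseline identification that make these per-twitch measurements reliable.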

16.
A method is presented that reliably detects spherical viruses from a wide variety of noisy low-contrast electron micrographs. Such detection is one of the first image analysis steps in the computer-aided reconstruction of three-dimensional density distribution models of viruses. Particle detection is based on the comparison of intensity in a circular area and in the surrounding ring followed by a number of tests to validate the potential particles. The only required input from the user in addition to the micrograph is an approximate radius of the particle. The method has been implemented as the program ETHAN that has been tested for several different data sets. ETHAN has also successfully been used to detect DNA-less virus particles for an actual reconstruction.
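The central comparison, mean intensity inside a candidate disc versus the surrounding ring, can be sketched as follows (the 1.5r ring width, the function name, and the scoring convention are our illustrative assumptions, not ETHAN's actual parameters or validation tests):

```python
import math

def disc_ring_contrast(image, cx, cy, r):
    """Score a candidate particle centre (cx, cy): mean intensity in
    the ring (r .. 1.5r) minus mean intensity in the disc (<= r).
    A dark particle on a brighter background scores positive."""
    inner, ring = [], []
    R = int(math.ceil(1.5 * r))
    for y in range(max(0, cy - R), min(len(image), cy + R + 1)):
        for x in range(max(0, cx - R), min(len(image[0]), cx + R + 1)):
            d = math.hypot(x - cx, y - cy)
            if d <= r:
                inner.append(image[y][x])
            elif d <= 1.5 * r:
                ring.append(image[y][x])
    return sum(ring) / len(ring) - sum(inner) / len(inner)
```

Scanning this score over the micrograph and keeping local maxima gives candidate particles, which the published method then subjects to further validation tests.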

17.
Region-based association analysis is a more powerful tool for gene mapping than testing of individual genetic variants, particularly for rare genetic variants. The most powerful methods for regional mapping are based on the functional data analysis approach, which assumes that the regional genome of an individual may be considered as a continuous stochastic function that contains information about both linkage and linkage disequilibrium. Here, we extend this powerful approach, earlier applied only to independent samples, to samples of related individuals. To this end, we additionally include random polygene effects in the functional linear model used for testing association between quantitative traits and multiple genetic variants in the region. We compare the statistical power of different methods using Genetic Analysis Workshop 17 mini-exome family data and a wide range of simulation scenarios. Our method increases the power of regional association analysis of quantitative traits compared with burden-based and kernel-based methods for the majority of the scenarios. In addition, we estimate the statistical power of our method using regions with a small number of genetic variants, and show that our method retains its advantage over burden-based and kernel-based methods in this case as well. The new method is implemented as the R-function ‘famFLM’ using two types of basis functions: the B-spline and Fourier bases. We compare the properties of the new method using models that differ from each other in the type of their function basis. The models based on the Fourier basis functions have an advantage in terms of speed and power over the models that use the B-spline basis functions and those that combine B-spline and Fourier basis functions. The ‘famFLM’ function is distributed under GPLv3 license and is freely available at http://mga.bionet.nsc.ru/soft/famFLM/.

18.
A modification of a method of Gardner, which employs Fourier-transform techniques, is used to obtain initial estimates for the number of terms and values of the parameters for data which are represented by a sum of exponential terms. New experimental methods have increased both the amount and accuracy of data from radiopharmaceutical experiments. This in turn allows one to devise specific numerical methods that utilize the better data. The inherent difficulties of fitting exponentials to data, which is an ill-posed problem, cannot be overcome by any method. However, we show that the present accuracy of Fourier methods may be extended by our numerical methods applied to the improved data sets. In many cases the method yields accurate estimates for the parameters; these estimates then are to be used as initial estimates for a nonlinear least-squares analysis of the problem.

19.
Combining information across genes in the statistical analysis of microarray data is desirable because of the relatively small number of data points obtained for each individual gene. Here we develop an estimator of the error variance that can borrow information across genes using the James-Stein shrinkage concept. A new test statistic (FS) is constructed using this estimator. The new statistic is compared with other statistics used to test for differential expression: the gene-specific F test (F1), the pooled-variance F statistic (F3), a hybrid statistic (F2) that uses the average of the individual and pooled variances, the regularized t-statistic, the posterior odds statistic B, and the SAM t-test. The FS-test shows best or nearly best power for detecting differentially expressed genes over a wide range of simulated data in which the variance components associated with individual genes are either homogeneous or heterogeneous. Thus FS provides a powerful and robust approach to test differential expression of genes that utilizes information not available in individual gene testing approaches and does not suffer from biases of the pooled variance approach.
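The borrowing-of-information idea can be caricatured in a few lines (this sketch uses a fixed shrinkage weight for illustration; the actual FS estimator derives a James-Stein weight from the data on the log-variance scale, and all names here are ours):

```python
import math

def shrink_variances(variances, weight=0.5):
    """Pull per-gene variance estimates toward their common centre on
    the log scale. weight=0 keeps the gene-specific estimates (the F1
    extreme); weight=1 collapses everything to the pooled geometric-mean
    variance (the F3 extreme)."""
    logs = [math.log(v) for v in variances]
    centre = sum(logs) / len(logs)
    return [math.exp(centre + (1 - weight) * (l - centre)) for l in logs]
```

The intermediate weights are what give FS its robustness: unstable gene-specific variances are damped, but genuinely heterogeneous variances are not forced to a single pooled value.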

20.
A method of classifying phytosociological data into non-exclusive groups is described. It is an agglomerative, polythetic technique consisting of two stages: in the first stage, groups are formed and in the second, the groups are classified hierarchically. Both normal and inverse analyses of a set of data from limestone grikes in Bruce County, Ontario, are presented as an example of the application of the method, which, it is suggested, is effective in handling species of wide ecological amplitude and samples containing more than one element of vegetational mosaics. The application of Fourier analysis to the problem of determining the optimum number of groups in a classification is advocated.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号