Similar articles
 20 similar articles found (search time: 31 ms)
1.
A new algorithm for three-dimensional reconstruction from randomly oriented projections has been developed. The algorithm recovers the 3D Radon transform from the 2D Radon transforms (sinograms) of the projections. The structure in direct space is obtained by an inversion of the 3D Radon transform. The mathematical properties of the Radon transform are exploited to design a special filter that can be used to correct inconsistencies in a data set and to fill the gaps in the Radon transform that originate from missing projections. Several versions of the algorithm have been implemented, with and without a filter and with different interpolation methods for merging the sinograms into the 3D Radon transform. The algorithms have been tested on analytical phantoms and experimental data and have been compared with a weighted back projection algorithm (WBP). A quantitative analysis of phantoms reconstructed from noise-free and noise-corrupted projections shows that the new algorithms are more accurate than WBP when the number of projections is small. Experimental structures obtained by the new methods are strictly comparable to those obtained by WBP. Moreover, the algorithm is more than 10 times faster than WBP when applied to a data set of 1000-5000 projections. Copyright 1999 Academic Press.
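The recovery step above rests on the central-slice (projection-slice) theorem relating a projection to a slice of the object's Fourier transform. A minimal numpy sketch of that relation for the axis-aligned case, not the paper's 3D Radon algorithm:

```python
import numpy as np

# Central-slice (projection-slice) theorem, the relation underlying
# Radon-transform reconstruction: the 1D Fourier transform of a projection
# equals a central slice of the object's 2D Fourier transform.
# Illustrative sketch only -- not the paper's 3D Radon algorithm.

rng = np.random.default_rng(0)
image = rng.random((64, 64))               # toy "object"

# Projection along axis 0 (the 0-degree Radon projection).
projection = image.sum(axis=0)

# 1D FT of the projection vs. the k_y = 0 row of the 2D FT.
slice_from_projection = np.fft.fft(projection)
slice_from_2d_fft = np.fft.fft2(image)[0, :]

assert np.allclose(slice_from_projection, slice_from_2d_fft)
```

For oblique angles the same identity holds along a rotated line through the origin of Fourier space, which is what interpolation schemes for merging sinograms must handle.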

2.
J A Dvorak  S M Banks 《Cytometry》1989,10(6):811-813
We describe an algorithm, V_out = Integer{[(2^12 - 1)/(2^(12λ) - 1)] V_in^λ} + 1; λ > 0, based upon Box-Cox transformations as an alternative to nonlinear electronic amplifiers to expand or compress high- or low-amplitude flow cytometer-derived signals. If the indexing parameter λ < 1, input channels in the high-amplitude input range are compressed in the output range, as occurs when an electronic logarithmic amplifier is used. However, if λ > 1, input channels in the low-amplitude input range are compressed in the output range, as occurs when an electronic power amplifier is used. Our modified Box-Cox transform can be implemented either during data collection or off-line for the transformation of previously collected raw data. The transform is the equivalent of an infinite class of nonlinear amplifiers. As the transform is implemented in software, it does not suffer from many of the disadvantages of nonlinear electronic amplifiers.
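Taking the channel formula as reconstructed above, a hedged sketch of the mapping for a 12-bit instrument; the exact rounding and scaling in the published program may differ:

```python
# Sketch of the reconstructed Box-Cox channel transform for a 12-bit ADC,
# V_out = Integer{[(2^12 - 1) / (2^(12*lam) - 1)] * V_in**lam} + 1, lam > 0.
# Illustrative only; the original paper's exact scaling may differ.

def boxcox_channel(v_in: int, lam: float, bits: int = 12) -> int:
    """Map an input channel number to a Box-Cox-transformed output channel."""
    if lam <= 0:
        raise ValueError("lam must be > 0")
    full_scale = 2 ** bits - 1
    scale = full_scale / (2 ** (bits * lam) - 1)
    return int(scale * v_in ** lam) + 1

# lam = 1 is (almost) the identity mapping.
assert boxcox_channel(100, 1.0) == 101

# lam < 1 spreads the low-amplitude range and packs the high-amplitude
# range into fewer output channels, like a logarithmic amplifier.
assert boxcox_channel(110, 0.5) - boxcox_channel(10, 0.5) > 100     # low end expanded
assert boxcox_channel(4095, 0.5) - boxcox_channel(3995, 0.5) < 100  # high end compressed
```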

3.
A new method for processing experimental data from MHD diagnostics is discussed that provides a more detailed study of the dynamics of large-scale MHD instabilities. The method is based on the Hilbert-Huang transform and includes an empirical mode decomposition algorithm, which is used to decompose the experimental MHD diagnostic signals into a set of frequency- and amplitude-modulated harmonics in order to construct the time evolutions of the amplitudes and frequencies of these harmonics with the help of the Hilbert transform. The method can also be applied to analyze data from other diagnostics that measure unsteady oscillating signals.
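The Hilbert-transform step of the method, extracting the instantaneous amplitude of one already-decomposed mode, can be sketched as follows; the empirical mode decomposition stage itself is not shown:

```python
import numpy as np

# Sketch of the Hilbert step of Hilbert-Huang analysis: build the analytic
# signal of one (already decomposed) mode via the FFT and read off its
# instantaneous amplitude envelope.

def analytic_signal(x):
    """FFT-based analytic signal (equivalent to x + i * Hilbert(x))."""
    n = len(x)
    spectrum = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(spectrum * h)

fs = 1000.0
t = np.arange(1000) / fs
x = 0.7 * np.cos(2 * np.pi * 10 * t)       # a 10 Hz mode with amplitude 0.7

amplitude = np.abs(analytic_signal(x))      # instantaneous amplitude
assert np.allclose(amplitude, 0.7, atol=1e-6)
```

The instantaneous frequency follows from the derivative of the unwrapped phase of the same analytic signal.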

4.
A database independent search algorithm for the detection of phosphopeptides is described. The program interrogates the tandem mass spectra of LC-MS/MS data sets regarding the presence of phosphorylation specific signatures. To achieve maximum informational content, the complementary fragmentation techniques electron capture dissociation (ECD) and collisionally activated dissociation (CAD) are used independently for peptide fragmentation. Several criteria characteristic for peptides phosphorylated on either serine or threonine residues were evaluated. The final algorithm searches for product ions generated by either the neutral loss of phosphoric acid or the combined neutral loss of phosphoric acid and water. Various peptide mixtures were used to evaluate the program. False positive results were not observed because the program utilizes the parts-per-million mass accuracy of Fourier transform ion cyclotron resonance mass spectrometry. Additionally, false negative results were not generated owing to the high sensitivity of the chosen criteria. The limitations of database dependent data interpretation tools are discussed and the potential of the novel algorithm to overcome these limitations is illustrated.
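A hedged sketch of the neutral-loss screen described above; the masses are standard monoisotopic values, but the function names, tolerance, and matching logic are illustrative rather than taken from the published program:

```python
# Hypothetical sketch of a phospho neutral-loss screen: flag an MS/MS
# spectrum as a phosphopeptide candidate if it contains a product ion at
# precursor m/z minus the loss of H3PO4 (97.9769 Da) or of H3PO4 + H2O
# (115.9875 Da), within a ppm tolerance. Names are illustrative, not from
# the published program.

H3PO4 = 97.9769                    # monoisotopic neutral-loss masses (Da)
H3PO4_H2O = 97.9769 + 18.0106

def has_neutral_loss(precursor_mz, charge, peaks_mz, tol_ppm=5.0):
    """Return True if any peak matches a phospho neutral loss from the precursor."""
    for loss in (H3PO4, H3PO4_H2O):
        target = precursor_mz - loss / charge   # neutral loss spread over charge
        tol = target * tol_ppm / 1e6
        if any(abs(mz - target) <= tol for mz in peaks_mz):
            return True
    return False

# Doubly charged precursor at m/z 600.30 with a peak at 600.30 - 97.9769/2.
peaks = [200.1, 551.3116, 580.0]
assert has_neutral_loss(600.30, 2, peaks)
assert not has_neutral_loss(600.30, 2, [200.1, 580.0])
```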

5.
Optimal fitting of the logistic curve by the Marquardt method   Cited: 37 times (self-citations: 2, citations by others: 35)
王莽莽  李典谟 《生态学报》1986,6(2):142-147
The nonlinear logistic growth curve is commonly fitted by first linearizing the original equation and then applying linear least squares; this approach is not optimal. This paper proposes fitting the curve with the Marquardt method and compares the methods and computed results of Gause, Andrewartha, May, Pearl, Krebs, Wan Changxiu, and others with ours. The Marquardt method is of general value for estimating the parameters of the many nonlinear curves encountered in biological experiments and ecology.
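A minimal sketch of fitting the logistic curve with the Marquardt (Levenberg-Marquardt) method, the approach advocated above; the parameterization, initial values, and damping schedule are illustrative:

```python
import numpy as np

# Marquardt (Levenberg-Marquardt) fit of a logistic growth curve
# N(t) = K / (1 + exp(a - r*t)), instead of linearize-then-least-squares.
# Illustrative sketch with analytic Jacobian and a simple damping schedule.

def logistic(t, K, a, r):
    return K / (1.0 + np.exp(np.clip(a - r * t, -60.0, 60.0)))

def jacobian(t, K, a, r):
    E = np.exp(np.clip(a - r * t, -60.0, 60.0))
    D = 1.0 + E
    return np.column_stack([1.0 / D,            # dN/dK
                            -K * E / D ** 2,    # dN/da
                            K * t * E / D ** 2])# dN/dr

def marquardt_fit(t, y, p0, mu=1e-3, iters=200):
    p = np.asarray(p0, dtype=float)
    sse = np.sum((y - logistic(t, *p)) ** 2)
    for _ in range(iters):
        res = y - logistic(t, *p)
        J = jacobian(t, *p)
        A = J.T @ J
        step = np.linalg.solve(A + mu * np.diag(np.diag(A)), J.T @ res)
        p_new = p + step
        sse_new = np.sum((y - logistic(t, *p_new)) ** 2)
        if sse_new < sse:            # accept the step, relax the damping
            p, sse, mu = p_new, sse_new, mu * 0.5
        else:                        # reject the step, increase the damping
            mu *= 10.0
    return p

t = np.linspace(0, 15, 40)
y = logistic(t, 100.0, 5.0, 0.8)     # noiseless synthetic growth data
K, a, r = marquardt_fit(t, y, p0=(80.0, 4.0, 0.5))
assert abs(K - 100.0) < 0.1 and abs(a - 5.0) < 0.1 and abs(r - 0.8) < 0.1
```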

6.
Structured illumination microscopy (SIM) with axial optical sectioning capability has found widespread application in three-dimensional live cell imaging in recent years, since it combines high sensitivity, short image acquisition time, and high spatial resolution. To obtain one sectioned slice, three raw images with a fixed phase shift, normally 2π/3, are generally required. In this paper, we report a data processing algorithm based on the one-dimensional Hilbert transform, which needs only two raw images with an arbitrary phase shift for each slice. The proposed algorithm differs in principle from the earlier two-dimensional Hilbert spiral transform algorithm. It has the advantages of a simpler data processing procedure, faster computation, and better reconstructed image quality. The validity of the scheme is verified by imaging biological samples in our DMD-based, LED-illuminated SIM system.
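The two-frame principle can be illustrated in 1D: the difference of two raw images cancels the unmodulated (out-of-focus) background for any nonzero phase shift, and a Hilbert-transform envelope recovers the sectioned component. A sketch of the idea, not the authors' 2D reconstruction:

```python
import numpy as np

# Two-frame optical-sectioning sketch: subtract two raw structured-
# illumination frames (removes the unmodulated background), then take the
# Hilbert-transform envelope of the difference to recover the modulated,
# in-focus component. Illustrative 1D toy model.

def envelope(x):
    """Amplitude envelope via the FFT-based analytic signal."""
    n = len(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.abs(np.fft.ifft(np.fft.fft(x) * h))

n = 1024
carrier = 2 * np.pi * 32 * np.arange(n) / n  # illumination grating phase
background = 5.0                             # out-of-focus, unmodulated light
section = 1.3                                # in-focus, modulated amplitude
theta = 0.7                                  # arbitrary phase shift between frames

img1 = background + section * np.cos(carrier)
img2 = background + section * np.cos(carrier + theta)

diff = img1 - img2                           # background cancels exactly
recovered = envelope(diff) / (2 * np.sin(theta / 2))
assert np.allclose(recovered, section, atol=1e-6)
```

The identity cos(φ) − cos(φ+θ) = 2 sin(θ/2) sin(φ+θ/2) explains the normalization by 2 sin(θ/2), which works for any θ ≠ 0.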

7.
An algorithm for the solution of the Maximum Entropy problem is presented, for use when the data are considerably oversampled, so that the amount of independent information they contain is very much less than the actual number of data points. The application of general purpose entropy maximisation methods is then comparatively inefficient. In this algorithm the independent variables are in the singular space of the transform between map (or image or spectrum) and data. These variables are much fewer in number than either the data or the reconstructed map, resulting in a fast and accurate algorithm. The speed of this algorithm makes feasible the incorporation of recent ideas in maximum entropy theory (Skilling 1989a; Gull 1989). This algorithm is particularly appropriate for the exponential decay problem, solution scattering, fibre diffraction, and similar applications.
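The dimensionality argument above can be illustrated with plain least squares in a truncated singular space; the maximum-entropy iteration itself is not reproduced:

```python
import numpy as np

# Sketch of the core idea: when data are heavily oversampled, the transform
# A from map to data has few significant singular values, so one can work
# with variables in the (truncated) singular space instead of the full map
# or data vectors. Shown with plain least squares, not maximum entropy.

u = np.linspace(0, 1, 200)[:, None]
v = np.linspace(0, 1, 50)[None, :]
A = np.exp(-(u - v) ** 2 / 0.01)        # smooth "blurring" kernel

map_true = np.exp(-((np.linspace(0, 1, 50) - 0.4) / 0.15) ** 2)
data = A @ map_true                      # 200 oversampled data points

U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = int(np.sum(s > 1e-8 * s[0]))         # number of significant singular values
assert k < 50                            # far fewer independent variables

# Solve in the k-dimensional singular space and map back to the image space.
coeffs = (U[:, :k].T @ data) / s[:k]
map_rec = Vt[:k].T @ coeffs
assert np.max(np.abs(A @ map_rec - data)) < 1e-4
```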

8.
Zhang H  Liu X 《Bio Systems》2011,105(1):73-82
DNA computing has been applied in broad fields such as graph theory, finite-state problems, and combinatorial problems. DNA computing approaches are well suited to solving many combinatorial problems because of their vast parallelism and high-density storage. The CLIQUE algorithm is one of the grid-based clustering techniques for spatial data; it reduces clustering to a combinatorial problem over dense cells. We therefore utilize DNA computing with closed-circle DNA sequences to execute the CLIQUE algorithm on two-dimensional data. In our study, the clustering process becomes a parallel biochemical reaction, and the DNA sequences representing the marked cells can be combined to form a closed-circle DNA sequence. This strategy is a new application of DNA computing. Although the strategy is restricted to two-dimensional data, it offers a new way to treat the grid cells as vertices in a graph and transform the search problem into a combinatorial problem.
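For reference, a conventional (silicon, not DNA) sketch of the CLIQUE-style grid step that the paper maps onto biochemical reactions; cell size and density threshold are illustrative:

```python
from collections import deque

# CLIQUE-style grid clustering sketch: partition 2D points into cells, keep
# cells whose point count exceeds a density threshold, and join 4-adjacent
# dense cells into clusters. Names and thresholds are illustrative.

def grid_clusters(points, cell=1.0, min_pts=3):
    """Return clusters as sets of dense grid cells connected 4-adjacently."""
    counts = {}
    for x, y in points:
        key = (int(x // cell), int(y // cell))
        counts[key] = counts.get(key, 0) + 1
    dense = {c for c, n in counts.items() if n >= min_pts}

    clusters, seen = [], set()
    for start in dense:
        if start in seen:
            continue
        comp, queue = set(), deque([start])
        seen.add(start)
        while queue:                    # BFS over adjacent dense cells
            cx, cy = queue.popleft()
            comp.add((cx, cy))
            for nb in ((cx + 1, cy), (cx - 1, cy), (cx, cy + 1), (cx, cy - 1)):
                if nb in dense and nb not in seen:
                    seen.add(nb)
                    queue.append(nb)
        clusters.append(comp)
    return clusters

# Two well-separated dense strips -> two clusters.
pts = [(0.1 * i, 0.0) for i in range(30)] + [(5 + 0.1 * i, 5.0) for i in range(30)]
assert len(grid_clusters(pts, cell=1.0, min_pts=3)) == 2
```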

9.
Brain computer interfaces (BCI) provide a new approach to human computer communication, where control is realised by performing mental tasks such as motor imagery (MI). In this study, we investigate a novel method to automatically segment electroencephalographic (EEG) data within a trial and extract features accordingly in order to improve the performance of MI data classification techniques. A new local discriminant bases (LDB) algorithm using common spatial patterns (CSP) projection as the transform function is proposed for automatic trial segmentation. CSP is also used for feature extraction following trial segmentation. This new technique also yields a more accurate picture of the most relevant temporal-spatial points in the EEG during MI. The results are compared with other standard temporal segmentation techniques such as sliding windows and LDB based on the local cosine transform (LCT).
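A numerical sketch of the CSP projection used as the transform function above, with synthetic covariance matrices standing in for real EEG trial covariances:

```python
import numpy as np

# Common spatial patterns (CSP) sketch: jointly diagonalize the two class
# covariance matrices so that filters maximizing variance for one class
# minimize it for the other. Synthetic SPD matrices stand in for EEG
# per-class covariance estimates.

rng = np.random.default_rng(2)

def random_spd(n):
    M = rng.standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

C1, C2 = random_spd(6), random_spd(6)       # per-class covariance estimates

# Whiten the composite covariance, then diagonalize class 1 in that space.
evals, U = np.linalg.eigh(C1 + C2)
P = np.diag(evals ** -0.5) @ U.T            # whitening: P (C1+C2) P^T = I
d, B = np.linalg.eigh(P @ C1 @ P.T)
W = B.T @ P                                 # CSP filter matrix

D1 = W @ C1 @ W.T
D2 = W @ C2 @ W.T
# Both class covariances are diagonalized, and their eigenvalues sum to 1:
# filters with large variance for class 1 have small variance for class 2.
assert np.allclose(D1 + D2, np.eye(6), atol=1e-8)
assert np.allclose(D1, np.diag(np.diag(D1)), atol=1e-8)
```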

10.
MOTIVATION: At the core of most protein gene-finding algorithms are the coding measures used to make a coding/non-coding decision. Of the protein coding measures, the Fourier measure is one of the most important. However, due to the limited length of the windows usually used, the accuracy of the measure is not satisfactory. This paper is devoted to improving the accuracy by lengthening the sequence to amplify the periodicity of 3 in the coding regions. RESULTS: A new algorithm is presented, called the lengthen-shuffle Fourier transform algorithm. For the same window length, the percentage accuracy of the new algorithm is 6-7% higher than that of the ordinary Fourier transform algorithm. The resulting percentage accuracy (average of specificity and sensitivity) of the new measure is 84.9% for a window length of 162 bp. AVAILABILITY: The program is available on request from C.-T. Zhang. Contact: ctzhang@tju.edu.cn
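For context, the ordinary Fourier coding measure that the paper improves on can be sketched as follows; the lengthen-shuffle step itself is not reproduced:

```python
import numpy as np

# Ordinary Fourier coding measure sketch: project the four base-indicator
# sequences onto frequency 1/3 and sum the squared magnitudes. Strong
# period-3 content suggests a coding region. This is the baseline measure,
# not the lengthen-shuffle variant proposed in the paper.

def period3_power(seq):
    """Spectral power of a DNA window at frequency 1/3, summed over bases."""
    n = len(seq)
    phase = np.exp(-2j * np.pi * np.arange(n) / 3)
    total = 0.0
    for base in "ACGT":
        u = np.array([1.0 if c == base else 0.0 for c in seq])
        total += abs(np.sum(u * phase)) ** 2
    return total / n

coding_like = "ATG" * 54       # perfectly 3-periodic 162 bp window
uniform = "A" * 162            # no period-3 structure

assert period3_power(coding_like) > 10 * period3_power(uniform)
```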

11.
In this paper we discuss the embedding of symmetry information in an algorithm for three-dimensional reconstruction, which is based on the discrete Radon transform. The original algorithm was designed for randomly oriented and in principle asymmetric particles. The expanded version presented here covers all symmetry point groups which can be exhibited by macromolecular protein assemblies. The orientations of all symmetry equivalent projections, based on the orientation of an experimental projection, are obtained using global group operators. Further, an improved interpolation scheme for the recovery of the three-dimensional discrete Radon transform has been designed for greater computational efficiency. The algorithm has been tested on phantom structures as well as on real data, a virus structure possessing icosahedral symmetry.

12.
Finite element (FE) models are used to identify head injury mechanisms and design new and improved injury prevention schemes. Although brain-skull boundary conditions strongly influence the model mechanical responses, limited experimental data are available to develop an informed representation. We hypothesize that spinal cord tension and gravity contribute to the pons displacement in vivo. Static high-resolution T1-weighted sagittal MR images of the inferior portion of the head in neutral and flexion positions were acquired in 15 human volunteers in both supine and prone postures. Boundaries of the pons and clivus were extracted with a gradient-based algorithm, and the pontes were fitted into ellipses. Assuming rigid body motion of the skull, image pairs in different postures were co-registered with an autocorrelation technique. By comparing images before and after the motion, we found that while the rotation of the pons is negligible relative to the skull, the pons displaces significantly at the foramen magnum, on the order of 2 mm. When the spinal cord tension and gravity act in concert, the pons moves caudally; when opposed, superiorly, such that the influence of gravity on the pons is six times that of the spinal cord tension. Based on these findings, we recommend that the brainstem-skull interface be treated as a sliding (with or without friction) boundary condition in FE models of the human head.

13.
A quantitative analysis of the interspecific variability between beamforming baffle shapes in the biosonar system of bats is presented. The data set analyzed consisted of 100 outer ear (pinna) shapes from at least 59 species. A vector-space representation suitable for principal component analysis (PCA) was constructed by virtue of a transform of the pinna surfaces into cylindrical coordinates. The central axis of the cylindrical transform was found by minimizing a potential function. The shapes were aligned by means of their respective axes and their center of gravity. The average pinna of the sample was a symmetrical, obliquely truncated horn. The first seven eigenvalues already accounted for two-thirds of the variability around the mean, which indicates that most of the biodiversity in the bat pinna can be understood in a lower-dimensional space. The first three principal components show that most of the variability of the bat pinna sample is in terms of opening angle, left-right asymmetry, and selective changes in width at the top or the bottom of the pinna. The beampattern effects of these individual components have been characterized. These insights could be used to design bioinspired beamforming devices from the diversity in biosonar.
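The vector-space PCA stage can be sketched with synthetic profiles standing in for the cylindrical-coordinate pinna surfaces; the planted modes and weights are illustrative:

```python
import numpy as np

# PCA-on-shapes sketch: each aligned shape becomes a fixed-length vector
# (synthetic radius profiles here, standing in for cylindrical-coordinate
# pinna surfaces). The SVD of the centered data yields eigenshapes and the
# variance fraction explained by each component.

rng = np.random.default_rng(3)

n_shapes, n_samples = 100, 80
angles = np.linspace(0, 2 * np.pi, n_samples, endpoint=False)

# Synthetic "shapes": a mean profile plus two dominant modes of variation.
mean_profile = 2.0 + 0.3 * np.cos(angles)
modes = np.vstack([np.cos(2 * angles), np.sin(3 * angles)])
weights = rng.standard_normal((n_shapes, 2)) * [1.0, 0.5]
shapes = (mean_profile + weights @ modes
          + 0.01 * rng.standard_normal((n_shapes, n_samples)))

X = shapes - shapes.mean(axis=0)            # center the data
U, s, Vt = np.linalg.svd(X, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)         # variance fraction per component

# The two planted modes dominate: a few components capture almost everything,
# mirroring the paper's finding that seven eigenvalues cover two-thirds.
assert explained[:2].sum() > 0.95
```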

14.
The Ornstein-Uhlenbeck process has been proposed as a model for the spontaneous activity of a neuron. In this model, the firing of the neuron corresponds to the first passage of the process to a constant boundary, or threshold. While the Laplace transform of the first-passage time distribution is available, the probability distribution function has not been obtained in any tractable form. We address the problem of estimating the parameters of the process when the only available data from a neuron are the interspike intervals, or the times between firings. In particular, we give an algorithm for computing maximum likelihood estimates and their corresponding confidence regions for the three identifiable (of the five model) parameters by numerically inverting the Laplace transform. A comparison of the two-parameter algorithm (where the time constant tau is known a priori) to the three-parameter algorithm shows that significantly more data are required in the latter case to achieve comparable parameter resolution as measured by the widths of 95% confidence intervals. The computational methods described here are an efficient alternative to other well-known estimation techniques for leaky integrate-and-fire models. Moreover, they could serve as a template for performing parameter inference on more complex integrate-and-fire neuronal models.
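A hedged Euler-Maruyama sketch of the model generating the interspike-interval data that the estimation algorithm consumes; parameter values are illustrative, and the Laplace-transform likelihood machinery is not reproduced:

```python
import numpy as np

# Euler-Maruyama sketch of the Ornstein-Uhlenbeck neuron model:
# dX = (-X/tau + mu) dt + sigma dW, with a spike and reset to 0 whenever X
# first passes the threshold. The resulting interspike intervals are the
# kind of data the paper's maximum-likelihood procedure would consume.

rng = np.random.default_rng(4)

def simulate_isis(n_spikes, mu=2.0, tau=1.0, sigma=0.8, thresh=1.0, dt=1e-3):
    """Simulate n_spikes interspike intervals of an OU integrate-and-fire neuron."""
    isis = []
    x, t = 0.0, 0.0
    sq = sigma * np.sqrt(dt)
    while len(isis) < n_spikes:
        x += (-x / tau + mu) * dt + sq * rng.standard_normal()
        t += dt
        if x >= thresh:                 # first passage: record spike, reset
            isis.append(t)
            x, t = 0.0, 0.0
    return np.array(isis)

isis = simulate_isis(200)
assert np.all(isis > 0)
# The drift mu*tau = 2 lies above the threshold, so firing is regular and
# fast; the deterministic crossing time would be ln(2) ~ 0.69 s.
assert 0.1 < isis.mean() < 2.0
```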

15.
An approach to Magnetic Resonance (MR) image reconstruction from undersampled data is proposed. Undersampling artifacts are removed using an iterative thresholding algorithm applied to nonlinearly transformed image block arrays. Each block array is transformed using kernel principal component analysis, where the contribution of each image block to the transform depends in a nonlinear fashion on the distance to other image blocks. Elimination of undersampling artifacts is achieved by conventional principal component analysis in the nonlinear transform domain, projection onto the main components and back-mapping into the image domain. Iterative image reconstruction is performed by interleaving the proposed undersampling artifact removal step and gradient updates enforcing consistency with acquired k-space data. The algorithm is evaluated using retrospectively undersampled MR cardiac cine data and compared to k-t SPARSE-SENSE, block matching with spatial Fourier filtering and k-t ℓ1-SPIRiT reconstruction. Evaluation of image quality and root-mean-squared-error (RMSE) reveal improved image reconstruction for up to 8-fold undersampled data with the proposed approach relative to k-t SPARSE-SENSE, block matching with spatial Fourier filtering and k-t ℓ1-SPIRiT. In conclusion, block matching and kernel methods can be used for effective removal of undersampling artifacts in MR image reconstruction and outperform methods using standard compressed sensing and ℓ1-regularized parallel imaging.

16.
Mammalian reproduction evolved within Earth's 1-g gravitational field. As we move closer to the reality of space habitation, there is growing scientific interest in how different gravitational states influence reproduction in mammals. Habitation of space and extended spaceflight missions require prolonged exposure to decreased gravity (hypogravity, i.e., weightlessness). Lift-off and re-entry of the spacecraft are associated with exposure to increased gravity (hypergravity). Existing data suggest that spaceflight is associated with a constellation of changes in reproductive physiology and function. However, limited spaceflight opportunities and confounding effects of various nongravitational factors associated with spaceflight (i.e., radiation, stress) have led to the development of ground-based models for studying the effects of altered gravity on biological systems. Human bed rest and rodent hindlimb unloading paradigms are used to study exposure to hypogravity. Centrifugation is used to study hypergravity. Here, we review the results of spaceflight and ground-based models of altered gravity on reproductive physiology. Studies utilizing ground-based models that simulate hyper- and hypogravity have produced reproductive results similar to those obtained from spaceflight and are contributing new information on biological responses across the gravity continuum, thereby confirming the appropriateness of these models for studying reproductive responses to altered gravity and the underlying mechanisms of these responses. Together, these unique tools are yielding new insights into the gravitational biology of reproduction in mammals.

17.
We developed an algorithm for the automated detection and analysis of elementary Ca2+ release events (ECRE) based on the two-dimensional nondecimated wavelet transform. The transform is computed with the "à trous" algorithm using the cubic B-spline as the basis function and yields a multiresolution analysis of the image. This transform allows for highly efficient noise reduction while preserving signal amplitudes. ECRE detection is performed at the wavelet levels, thus using the whole spectral information contained in the image. The algorithm was tested on synthetic data at different noise levels as well as on experimental data of ECRE. The noise dependence of the statistical properties of the algorithm (detection sensitivity and reliability) was determined from synthetic data and detection parameters were selected to optimize the detection of experimental ECRE. The wavelet-based method shows considerably higher detection sensitivity and less false-positive counts than previously employed methods. It allows a more efficient detection of elementary Ca2+ release events than conventional methods, in particular in the presence of elevated background noise levels. The subsequent analysis of the morphological parameters of ECRE is reliably reproduced by the analysis procedure that is applied to the median filtered raw data. Testing the algorithm more rigorously showed that event parameter histograms (amplitude, rise time, full duration at half-maximum, and full width at half-maximum) were faithfully extracted from synthetic, "in-focus" and "out-of-focus" line scan sparks. Most importantly, ECRE obtained with laser scanning confocal microscopy of chemically skinned mammalian skeletal muscle fibers could be analyzed automatically to reproducibly establish event parameter histograms. In summary, our method provides a new valuable tool for highly reliable automated detection of ECRE in muscle but can also be adapted to other preparations.
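A 1D sketch of the à trous decomposition with the cubic B-spline kernel named above; the paper applies the 2D analogue to line-scan images:

```python
import numpy as np

# "A trous" nondecimated wavelet sketch: repeatedly smooth with an upsampled
# cubic B-spline kernel ([1,4,6,4,1]/16); each wavelet plane is the
# difference of successive smoothings, so the planes plus the final smooth
# array reconstruct the signal exactly.

B3 = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0

def atrous_smooth(x, level):
    """Convolve with the B3 spline kernel dilated by 2**level (wrap boundary)."""
    n = len(x)
    step = 2 ** level
    out = np.zeros(n)
    for k, w in zip((-2, -1, 0, 1, 2), B3):
        out += w * x[(np.arange(n) + k * step) % n]
    return out

def atrous_decompose(x, n_levels):
    planes, c = [], x.astype(float)
    for j in range(n_levels):
        c_next = atrous_smooth(c, j)
        planes.append(c - c_next)        # wavelet plane at scale j
        c = c_next
    return planes, c                      # detail planes + residual smooth

rng = np.random.default_rng(5)
signal = rng.standard_normal(256)
planes, smooth = atrous_decompose(signal, 4)
assert np.allclose(sum(planes) + smooth, signal)   # exact reconstruction
```

Denoising then amounts to thresholding the wavelet planes before summing them back, which preserves event amplitudes better than smoothing the raw data.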

18.
We have developed a computer program for Wiener filtering of evoked potential data. The basic algorithm involves computation of the difference between the power spectrum of the sweep sum and the sum of power spectra of individual sweeps. Power spectra are computed by means of the discrete Fourier transform. The program is now being run on an LSI-11 computer in a neurophysiology research laboratory to analyze somatic evoked potential data from monkeys.
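The quantity at the heart of the algorithm can be checked numerically: for N identical noise-free sweeps the identity is exact, as the sketch below verifies. Building the actual Wiener filter from this signal-spectrum estimate is not shown:

```python
import numpy as np

# Core quantity of the sweep-based Wiener filter: for N sweeps x_i = s + n_i
# with independent zero-mean noise, the power spectrum of the sweep sum
# minus the sum of individual power spectra estimates N(N-1) times the
# signal power spectrum. With noise-free sweeps the identity is exact.

N, n_pts = 8, 128
s = np.sin(2 * np.pi * 5 * np.arange(n_pts) / n_pts)   # "evoked potential"

sweeps = np.tile(s, (N, 1))                            # identical noise-free sweeps

power_of_sum = np.abs(np.fft.fft(sweeps.sum(axis=0))) ** 2
sum_of_powers = np.sum(np.abs(np.fft.fft(sweeps, axis=1)) ** 2, axis=0)

signal_power_est = (power_of_sum - sum_of_powers) / (N * (N - 1))
assert np.allclose(signal_power_est, np.abs(np.fft.fft(s)) ** 2)
```

With noisy sweeps the same estimator is unbiased for the signal spectrum, and the Wiener filter is then H = S_signal / (S_signal + S_noise) applied to the average sweep.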

19.
ABSTRACT: BACKGROUND: Electrical impedance tomography (EIT) is used as a fast clinical imaging technique for monitoring the health of human organs such as the lungs, heart, brain and breast. Each practical EIT reconstruction algorithm should be efficient enough in terms of convergence rate and accuracy. The main objective of this study is to investigate the feasibility of precise empirical conductivity imaging using a sinc-convolution algorithm in the D-bar framework. METHODS: In the first step, synthetic and experimental data were used to compute an intermediate object named the scattering transform. Next, this object was used in a 2-D integral equation, which was precisely and rapidly solved via the sinc-convolution algorithm to find the square root of the conductivity for each pixel of the image. For the purpose of comparison, multigrid and NOSER algorithms were implemented under a similar setting. The quality of reconstructions of synthetic models was tested against GREIT-approved quality measures. To validate the simulation results, reconstructions of a phantom chest and a human lung were used. RESULTS: Evaluation of synthetic reconstructions shows that the quality of sinc-convolution reconstructions is considerably better than that of each of its competitors in terms of amplitude response, position error, ringing, resolution and shape deformation. In addition, the results confirm near-exponential and linear convergence rates for sinc-convolution and multigrid, respectively. Moreover, the smallest relative errors and the highest degrees of truth were found in sinc-convolution reconstructions from experimental phantom data. Reconstructions of clinical lung data show that the related physiological effect is well recovered by the sinc-convolution algorithm. CONCLUSIONS: Parametric evaluation demonstrates the efficiency of sinc-convolution in reconstructing accurate conductivity images from experimental data. Excellent results in phantom and clinical reconstructions using sinc-convolution support the parametric assessment results and suggest that sinc-convolution be used for precise clinical EIT applications.

20.
In this paper, some morphological transformations are used to detect the unevenly illuminated background of text images characterized by poor lighting and to obtain an illumination-normalized result. Based on the morphological Top-Hat transform, an uneven-illumination normalization algorithm is developed and verified through three procedures. The first procedure employs the information from the opening-based Top-Hat operator, which is a classical method. In order to optimize and refine the classical Top-Hat transform, the second procedure, built on the notion of multi-directional illumination, utilizes opening by reconstruction and closing by reconstruction based on multi-directional structuring elements. Finally, the multi-directional images are merged into the final evenly illuminated image. The performance of the proposed algorithm is illustrated and verified through the processing of different ideal synthetic and camera-collected images with backgrounds characterized by poor lighting conditions.
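A 1D sketch of the classical opening-based Top-Hat step (the first of the three procedures above); the reconstruction-based multi-directional refinement is not shown:

```python
import numpy as np

# Opening-based white Top-Hat sketch: erosion then dilation with a flat
# structuring element estimates the slowly varying background, and
# subtracting it leaves the narrow bright features, normalizing uneven
# illumination. 1D stand-in for the 2D text-image case.

def erode(x, size):
    half = size // 2
    pad = np.pad(x, half, mode="edge")
    return np.array([pad[i:i + size].min() for i in range(len(x))])

def dilate(x, size):
    half = size // 2
    pad = np.pad(x, half, mode="edge")
    return np.array([pad[i:i + size].max() for i in range(len(x))])

def white_tophat(x, size):
    opening = dilate(erode(x, size), size)   # background estimate
    return x - opening

# Slowly rising background with two narrow "text strokes" on top.
x = np.linspace(0.0, 4.0, 200)
x[50:53] += 3.0
x[150:153] += 3.0

th = white_tophat(x, size=15)
assert th[51] > 2.5 and th[151] > 2.5        # strokes preserved
assert np.all(th >= 0) and th[100] < 0.5     # ramp background flattened
```

The structuring element must be wider than the strokes but narrower than the scale of the illumination gradient; here size=15 is an illustrative choice.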


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.) | 京ICP备09084417号