Similar Articles
20 similar articles found (search time: 0 ms)
1.
Physical phantom models have conventionally been used to determine the accuracy and precision of radiostereometric analysis (RSA) in various orthopaedic applications. Using a phantom model of a fracture of the distal radius, it has previously been shown that RSA is a highly accurate and precise method for measuring both translation and rotation in three dimensions (3-D). The main shortcoming of a physical phantom model is its inability to mimic complex 3-D motion. The goal of this study was to create a realistic computer model for preoperative planning of RSA studies and to test the accuracy of RSA in measuring complex movements in fractures of the distal radius using this new model. The 3-D computer model was created from a set of tomographic scans. The simulation of the radiographic imaging was performed using ray-tracing software (POV-Ray). RSA measurements were performed according to standard protocol. Using a two-part fracture model (AO/ASIF type A2), it was found that for simple movements in one axis, translations in the range of 25 μm to 2 mm could be measured with an accuracy of ±2 μm. Rotations ranging from 16° to 2° could be measured with an accuracy of ±0.015°. Using a three-part fracture model, the corresponding accuracy values were found to be ±4 μm and ±0.031° for translation and rotation, respectively. For complex 3-D motion in a three-part fracture model (AO/ASIF type C1), the accuracy was ±6 μm for translation and ±0.120° for rotation. The use of 3-D computer modelling can provide a method for preoperative planning of RSA studies in complex fractures of the distal radius and in other clinical situations in which the RSA method is applicable.
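The core computation behind an RSA measurement is recovering the rigid-body motion of a marker set between two imaging moments. A minimal sketch using a least-squares (Kabsch) fit can illustrate why sub-micrometre accuracy is attainable on noise-free simulated data; the marker cloud and the applied motion below are hypothetical, not the study's data:

```python
import numpy as np

def rigid_transform(P, Q):
    """Least-squares rigid transform (Kabsch): returns R, t with Q ≈ R @ p + t per point."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)               # cross-covariance of centred clouds
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])              # guard against reflections
    R = Vt.T @ D @ U.T
    t = cQ - R @ cP
    return R, t

# Hypothetical marker cloud (mm) and an applied motion: 0.025 mm translation
# along x plus a 0.5° rotation about z, the scale of motions RSA resolves.
rng = np.random.default_rng(0)
markers = rng.uniform(-10, 10, size=(6, 3))
theta = np.deg2rad(0.5)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.025, 0.0, 0.0])
moved = markers @ R_true.T + t_true

R_est, t_est = rigid_transform(markers, moved)
trans_err_um = np.linalg.norm(t_est - t_true) * 1000.0   # micrometres
```

On noise-free input the fit recovers the motion to machine precision; in a full RSA simulation the error budget comes from projection geometry, marker detection and noise, not from this step.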

2.
Optimal calibration marker mesh for 2D X-ray sensors in 3D reconstruction
Image intensifiers suffer from distortions due to magnetic fields. In order to use these X-ray projection images for computer-assisted medical interventions, image intensifiers need to be calibrated. Opaque markers are often used for the correction of the image distortion and the estimation of the acquisition geometry parameters; the information under the markers is then lost. In this work, we consider the calibration of image intensifiers in the framework of 3D reconstruction from several 2D X-ray projections. In this context, new schemes of marker distributions are proposed for 2D X-ray sensor calibration. They are based on efficient sampling conditions of the parallel-beam X-ray transform when the detector and source trajectory is restricted to a circle around the measured object. In this situation, efficient sampling schemes are essentially subsets of the standard sampling scheme. The idea is simply to exploit the data redundancy of standard sampling and to place markers at the sample positions that the efficient schemes leave unused. Optimal locations of markers in the sparse efficient sampling geometry can thus be found. In this case, the markers can stay on the sensor during the measurement with, theoretically, no loss of information (when the signal-to-noise ratio is large). Even though the theory is based on the parallel-beam X-ray transform, numerical experiments on both simulated and real data are shown for a weakly divergent beam geometry. We show that the 3D reconstruction from simulated data with interlaced markers is essentially the same as that obtained from data with no markers. We also show that efficient Fourier interpolation formulas based on optimal sparse sampling schemes can be used to recover the information hidden by the markers.

3.
Image-based Roentgen stereophotogrammetric analysis (IBRSA) integrates 2D-3D image registration and conventional RSA. Instead of radiopaque RSA bone markers, IBRSA uses 3D CT data, from which digitally reconstructed radiographs (DRRs) are generated. Using 2D-3D image registration, the 3D pose of the CT is iteratively adjusted such that the generated DRRs resemble the 2D RSA images as closely as possible, according to an image matching metric. Effectively, by registering all 2D follow-up moments to the same 3D CT, the CT volume functions as common ground. In two experiments, using RSA and a micromanipulator as gold standards, IBRSA was validated on cadaveric and sawbone scapula radiographs, and good matching results were achieved. The accuracy was |μ| < 0.083 mm for translations and |μ| < 0.023° for rotations. The precision σ in the x-, y-, and z-directions was 0.090, 0.077, and 0.220 mm for translations and 0.155°, 0.243°, and 0.074° for rotations. Our results show that the accuracy and precision of in vitro IBRSA, performed under ideal laboratory conditions, are lower than those of in vitro standard RSA but higher than those of in vivo standard RSA. Because IBRSA does not require radiopaque markers, it adds functionality to the RSA method by opening new directions and possibilities for research, such as dynamic analyses using fluoroscopy on subjects without markers and computer navigation applications.
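The abstract does not name the image matching metric that scores how closely a generated DRR resembles the 2D RSA image. Normalized cross-correlation is one common choice for such 2D-3D registration and makes a compact illustration (the images below are random stand-ins, not DRRs):

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two images: 1.0 means identical
    up to brightness and contrast, a typical 2D-3D matching metric."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float((a * b).mean())

rng = np.random.default_rng(0)
drr = rng.random((64, 64))            # stand-in for a generated DRR
radiograph = drr * 2.0 + 5.0          # same content, different brightness/contrast
score_match = ncc(drr, radiograph)    # invariant to linear intensity changes
score_mismatch = ncc(drr, np.roll(drr, 16, axis=1))  # badly misregistered copy
```

An optimizer then adjusts the six pose parameters of the CT volume to maximize such a score; intensity invariance is what lets a synthetic DRR be compared against a real radiograph at all.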

4.
Purpose: For our research on computer-optimised and automated cochlear implant surgery, we pursue a model-based approach to overcome the limitations of currently available clinical imaging modalities. A serial cross-section preparation procedure was developed and evaluated for accuracy, to serve as the basis for modelling a digital anatomic atlas that makes delicate soft-tissue structures available for pre-operative planning.

Methods: A special grinding tool was developed that allows a specific amount of abrasion to be set, since equidistant slice thickness was considered crucial. Additionally, each actual abrasion was accurately measured and used during three-dimensional reconstruction of the serial cross-sectional images obtained via digital photo documentation after each microgrinding step. A well-known reference object was prepared using this procedure and evaluated in terms of accuracy.

Results: Reconstruction of the whole sample was achieved with an error of less than 0.4%, and the edge lengths in the direction of abrasion could be reconstructed with an average error of 0.6 ± 0.3 mm; both confirm that equidistant abrasion was achieved. Using artificial registration fiducials and a custom-made algorithm for image alignment, parallelism and rectangularity could be preserved with average errors of less than 0.4° ± 0.3°.

Conclusion: We present a systematic, practicable and reliable method for the geometrically accurate reconstruction of anatomical structures, which is especially suitable for the middle and inner ear anatomy, including soft-tissue structures. For the first time, the quality of such a reconstruction process has been quantified and its usability demonstrated.

5.
6.
A new simulator, INDISIM-FLOC, based on the individual-based simulator INDISIM, is used to examine the predictions of two different models of yeast flocculation. The first, proposed by Calleja, is known as the "addition" model; the second, proposed by Stratford, is known as the "cascade" model. The simulations show that the latter exhibits better qualitative agreement with the available experimental data.

7.
Leaf area index (LAI) of the soybean canopy is an important indicator of the growth and development of the soybean plant. However, traditional field measurement of LAI is expensive and time-consuming, and high accuracy is difficult to achieve. Thus, in this study, a calculation method of LAI for the soybean canopy based on 3D reconstruction was proposed, and a dynamic simulation model of canopy LAI was established. First, the northeast soybean varieties Kangxianchong8 and Dongnong252 were taken as the research objects, and a multi-source image synchronous acquisition platform for the soybean canopy based on Kinect 2.0 was constructed to obtain canopy image data from the V3 to R7 growth periods. Second, the 3D structure of the soybean canopy was reconstructed using conditional filtering and statistical filtering. Third, a soybean LAI estimation method was established by canopy analysis; the coefficient of determination R² between the calculated and standard LAI values was greater than 0.99. Finally, a dynamic simulation model of soybean LAI was established based on the Richards model and the genetic parameters of the varieties. The results showed that the accuracy of the dynamic simulation model was above 0.99. The approach enables rapid detection and dynamic simulation of LAI for the soybean canopy and provides quantitative dynamic prediction and technical support for the scientific regulation of soybean canopy ecology and morphology.
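The dynamic simulation is built on the Richards growth model, a flexible sigmoid. The abstract does not give the exact parameterization or the fitted coefficients, so the form and the numbers below are illustrative assumptions (nu = 1 recovers the logistic curve):

```python
import numpy as np

def richards(t, A, k, t0, nu):
    """One common parameterization of the Richards growth curve:
    A = asymptote, k = growth rate, t0 = inflection timing, nu = shape."""
    return A * (1.0 + nu * np.exp(-k * (t - t0))) ** (-1.0 / nu)

# Hypothetical canopy LAI trajectory over days after emergence.
days = np.arange(0, 121)
lai = richards(days, A=5.5, k=0.12, t0=55, nu=0.8)
```

In a study like this one, A, k, t0 and nu would be fitted per variety (the "genetic parameters"), and the curve then predicts LAI at any growth stage between measurements.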

8.

Background

Accurate measurement of the QT interval is very important from a clinical and pharmaceutical drug safety screening perspective. Expert manual measurement is both imprecise and imperfectly reproducible, yet it is used as the reference standard to assess the accuracy of current automatic computer algorithms, which thus produce reproducible but incorrect measurements of the QT interval. There is a scientific imperative to evaluate the most commonly used algorithms with an accurate and objective 'gold standard' and investigate novel automatic algorithms if the commonly used algorithms are found to be deficient.

Methods

This study uses a validated computer simulation of 8 different noise-contaminated ECG waveforms (with known QT intervals of 461 and 495 ms), generated from a cell array using Luo-Rudy membrane kinetics and the Crank-Nicolson method, as a reference standard to assess the accuracy of commonly used QT measurement algorithms. Each ECG, contaminated with 39 mixtures of noise at 3 levels of intensity, was first filtered and then subjected to three threshold methods (T1, T2, T3), two T wave slope methods (S1, S2) and a Novel method. The reproducibility and accuracy of each algorithm were compared for each ECG.

Results

The coefficients of variation for methods T1, T2, T3, S1, S2 and Novel were 0.36, 0.23, 1.9, 0.93, 0.92 and 0.62 respectively. For ECGs with a real QT interval of 461 ms, the methods T1, T2, T3, S1, S2 and Novel calculated the mean QT intervals (standard deviations) to be 379.4 (1.29), 368.5 (0.8), 401.3 (8.4), 358.9 (4.8), 381.5 (4.6) and 464 (4.9) ms respectively. For ECGs with a real QT interval of 495 ms, the corresponding mean QT intervals (standard deviations) were 396.9 (1.7), 387.2 (0.97), 424.9 (8.7), 386.7 (2.2), 396.8 (2.8) and 493 (0.97) ms respectively. These results showed significant differences between means at the >95% confidence level. Shifting ECG baselines caused large QT interval errors with T1 and T2 but no error with the Novel method.

Conclusion

The algorithms T2, T1 and Novel gave low coefficients of variation for QT measurement. The Novel technique gave the most accurate measurement of the QT interval; T3 (a differential threshold method) was the next most accurate, though by a large margin. The objective and accurate 'gold standard' presented in this paper may be useful for assessing new QT measurement algorithms. The Novel algorithm may prove to be a more accurate and reliable method of measuring the QT interval.
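A threshold rule of the kind evaluated here (T1-T3) marks the T-wave end where the signal falls below some fraction of the T-peak amplitude. The sketch below applies such a rule to a toy beat; the Gaussian waveshape, the search window, the 5% threshold and the assumed Q-onset are all illustrative choices, not the paper's Luo-Rudy-based signals or its exact algorithms:

```python
import numpy as np

fs = 1000                                  # sampling rate, Hz
t = np.arange(0, 0.8, 1.0 / fs)
# Toy beat: a narrow QRS bump at 100 ms and a broad T wave centred at 400 ms.
ecg = 1.0 * np.exp(-((t - 0.10) / 0.01) ** 2) \
    + 0.3 * np.exp(-((t - 0.40) / 0.05) ** 2)

def qt_threshold(sig, t, q_onset, frac=0.05):
    """Threshold rule: T end = first sample after the T peak where the signal
    drops below `frac` of the T-peak amplitude; QT = T end minus Q onset."""
    window = t > 0.25                      # assumed window containing the T wave
    seg, ts = sig[window], t[window]
    peak = seg.argmax()
    below = np.where(seg[peak:] < frac * seg[peak])[0]
    t_end = ts[peak + below[0]]
    return t_end - q_onset

qt = qt_threshold(ecg, t, q_onset=0.08)    # Q onset assumed at 80 ms
```

The sensitivity to baseline shifts reported for T1 and T2 follows directly from this construction: any offset added to the signal moves the crossing point of a fixed amplitude threshold.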

9.
In this article we introduce JULIDE, a software toolkit developed to perform the 3D reconstruction, intensity normalization, volume standardization by 3D image registration and voxel-wise statistical analysis of autoradiographs of mouse brain sections. This software tool has been developed in the open-source ITK software framework and is freely available under a GPL license. The article presents the complete image processing chain from raw data acquisition to 3D statistical group analysis. Results of the group comparison in the context of a study on spatial learning are shown as an illustration of the data that can be obtained with this tool.

10.
Three-dimensional structures of complexes of the 66-amino-acid DNA-binding domains of the human progesterone (hPR), estrogen (hER) and glucocorticoid (hGR) receptors with ten-base-pair DNA duplexes, d(AGGTCATGCT).d(AGCATGACCT) and d(AGAACATGCT).d(AGCATGTTCT), were obtained using computer modeling and molecular mechanics techniques. Cartesian coordinates for the proteins were obtained from: 1) structural data on hER and hGR from NMR spectroscopy; 2) steric constraints imposed by tetrahedral coordination of the zinc ion to Cys residues; and 3) energy minimization in torsional and Cartesian space. The proteins were made to interact with DNA (in B-form) in the major groove through the alpha-helical linker between the two zinc fingers. The geometry of the complexes was obtained by allowing them to slide, glide, penetrate into and out of the groove, and rotate about the helical axis. The complexes were energy minimized, and the number of H-bonds between protein and DNA was maximized. The complex structures were refined by molecular mechanics using AMBER 3.0. Structural parameters of the DNA were analyzed in each complex and compared with those of native DNA optimized separately. The stereochemical differences between the complexes are discussed.

11.
We studied the percolation process in a system consisting of long flexible polymer chains and solvent molecules. The polymer chains were approximated by linear sequences of beads on a two-dimensional triangular lattice. The system was athermal and the excluded volume was the only potential. The properties of the model system across the entire range of polymer concentrations were determined by Monte Carlo simulations employing a cooperative motion algorithm (CMA). The scaling behavior and the structure of the percolation clusters are presented and discussed.
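Percolation analysis hinges on detecting whether occupied sites form a spanning cluster. A minimal sketch of that test (independently occupied sites on a square lattice with 4-neighbour connectivity, rather than the paper's polymer chains on a triangular lattice):

```python
import numpy as np
from collections import deque

def percolates(occ):
    """True if an occupied cluster connects the top row to the bottom row
    (breadth-first search, 4-neighbour connectivity on a square lattice)."""
    n, m = occ.shape
    seen = np.zeros_like(occ, dtype=bool)
    q = deque((0, j) for j in range(m) if occ[0, j])
    for cell in q:
        seen[cell] = True
    while q:
        i, j = q.popleft()
        if i == n - 1:                     # reached the bottom row
            return True
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            a, b = i + di, j + dj
            if 0 <= a < n and 0 <= b < m and occ[a, b] and not seen[a, b]:
                seen[a, b] = True
                q.append((a, b))
    return False

rng = np.random.default_rng(2)
full = percolates(rng.random((50, 50)) < 0.95)    # far above the site threshold
empty = percolates(rng.random((50, 50)) < 0.20)   # far below the site threshold
```

Repeating this over many random configurations at each concentration gives the spanning probability curve; for chains, occupancy of sites is correlated along each chain rather than independent, which is what the CMA simulations handle.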

12.
Rotator cuff (RC) tears may be associated with increased glenohumeral instability; however, this instability is difficult to quantify using currently available diagnostic tools. Recently, the three-dimensional (3D) reconstruction and registration method of the scapula and humeral head, based on sequences of low-dose biplane X-ray images, has been proposed for glenohumeral displacement assessment. This research aimed to evaluate the accuracy and reproducibility of this technique and to investigate its potential with a preliminary application comparing RC tear patients and asymptomatic volunteers. Accuracy was assessed using CT scan model registration on biplane X-ray images for five cadaveric shoulder specimens and showed differences ranging from 0.6 to 1.4 mm depending on the direction of interest. Intra- and interobserver reproducibility was assessed by having two operators repeat the reconstruction of five subjects three times, allowing 95% confidence intervals to be defined, ranging from ±1.8 to ±3.6 mm. The intraclass correlation coefficient varied between 0.84 and 0.98. Comparison between RC tear patients and asymptomatic volunteers showed differences in glenohumeral displacements, especially in the superoinferior direction when the shoulder was abducted at 20° and 45°. This study thus assessed the accuracy of the low-dose 3D biplane X-ray reconstruction technique for glenohumeral displacement assessment and showed its potential for biomechanical and clinical research.

13.
The purpose of this study was to systematically determine the effect of experimental errors on the work output calculated using two different methods of inverse dynamics during vertical jumping: (a) the conventional (rotational) method and (b) the translational method. A two-dimensional musculoskeletal model was used to generate precisely known kinematics. Next, the location of each joint center (JC) and the location of each segment's center of mass (CM) were manipulated by ±10% of segment length to simulate errors in the location of joint centers (ΔJC) and errors in the location of each segment's center of mass (ΔCM), respectively. Work output was subsequently calculated by applying the two methods of inverse dynamics to the manipulated kinematic data. The results showed that the translational method of inverse dynamics was less sensitive (up to 13% error in total work output) to ΔJC and ΔCM than the rotational method (up to 28% error in total work output). The rotational method of inverse dynamics was particularly sensitive to simulated errors in JC.
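Why a joint-centre error propagates strongly into the rotational method can be seen from the joint-moment lever arm: the moment depends directly on the JC estimate, whereas the translational method needs only the joint force and the JC velocity. A toy 2-D illustration (all numbers hypothetical, not from the paper's musculoskeletal model):

```python
import numpy as np

# Ground reaction force F acting at the centre of pressure `cop`;
# true ankle joint centre at `jc`, segment length 0.40 m.
F = np.array([50.0, 800.0])              # N
cop = np.array([0.10, 0.0])              # m
jc = np.array([0.0, 0.40])               # true joint centre, m
seg_len = 0.40

def joint_moment(jc_est):
    """Moment of F about the estimated joint centre (2-D cross product, N·m)."""
    r = cop - jc_est
    return r[0] * F[1] - r[1] * F[0]

M_true = joint_moment(jc)
jc_bad = jc + np.array([0.10 * seg_len, 0.0])   # JC shifted by 10% of segment length
M_bad = joint_moment(jc_bad)
moment_err_pct = abs(M_bad - M_true) / abs(M_true) * 100.0
```

A 10% JC shift here changes the joint moment by tens of percent, and since rotational work integrates moment times angular velocity, that error flows straight into the work output.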

14.
García-Dorado A, Gallego A. Genetics 2003;164(2):807-819
We simulated single-generation data for a fitness trait in mutation-accumulation (MA) experiments and compared three methods of analysis. Bateman-Mukai (BM) and maximum likelihood (ML) need information on both the MA lines and control lines, while minimum distance (MD) can be applied with or without the control. Both MD and ML assume gamma-distributed mutational effects. ML estimates of the rate of deleterious mutation had larger mean square error (MSE) than MD or BM, owing to large outliers. MD estimates obtained by ignoring the mean decline observed from comparison to a control are often better than those obtained using that information. When effects are simulated using the gamma distribution, reducing the precision with which the trait is assayed increases the probability of obtaining no ML or MD estimates but causes no appreciable increase in the MSE. When the residual errors for the means of the simulated lines are sampled from the empirical distribution of an MA experiment, instead of from a normal one, the MSEs of BM, ML, and MD are practically unaffected. When the simulated gamma distribution accounts for a high rate of mild deleterious mutation, BM detects only approximately 30% of the true deleterious mutation rate, while MD and ML detect substantially larger fractions. To test the robustness of the methods, we also added a high rate of common contaminant mutations with a constant mild deleterious effect to a low rate of mutations with gamma-distributed deleterious effects of moderate average size. In that case, BM detects roughly the same fraction as before, regardless of the precision of the assay, while ML fails to provide estimates. MD estimates, however, are obtained by ignoring the control information, detecting approximately 70% of the total mutation rate when the mean of the lines is assayed with good precision, but only 15% for low-precision assays. Contaminant mutations with only tiny deleterious effects could not be detected with acceptable accuracy by any of the above methods.
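The Bateman-Mukai estimators compared above are simple moment ratios of the mean decline ΔM and the among-line variance ΔV: a lower bound U ≥ ΔM²/ΔV on the mutation rate and an upper bound s ≤ ΔV/ΔM on the mean effect. A sketch with constant-effect mutations, the one case where both bounds are exact in expectation (all parameter values are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)
U_true, s = 0.5, 0.05          # per-line mutation rate and constant effect
n_lines = 500
n_mut = rng.poisson(U_true, n_lines)
decline = n_mut * s            # fitness decline of each MA line vs the control

dM = decline.mean()            # mean decline (ΔM)
dV = decline.var(ddof=1)       # among-line variance (ΔV)
U_bm = dM**2 / dV              # Bateman-Mukai lower bound on the rate
s_bm = dV / dM                 # Bateman-Mukai upper bound on the mean effect
```

When effects vary (e.g. gamma-distributed), ΔV is inflated relative to the constant-effect case, so U_bm underestimates the rate, which is exactly the ~30% detection the simulations report for BM under a high rate of mild mutations.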

15.
The calculation of multipoint likelihoods is computationally challenging, with the exact calculation of multipoint probabilities only possible on small pedigrees with many markers or large pedigrees with few markers. This paper explores the utility of calculating multipoint likelihoods using data on markers flanking a hypothesized position of the trait locus. The calculation of such likelihoods is often feasible, even on large pedigrees with missing data and complex structures. Performance characteristics of the flanking marker procedure are assessed through the calculation of multipoint heterogeneity LOD scores on data simulated for Genetic Analysis Workshop 14 (GAW14). Analysis is restricted to data on the Aipotu population on chromosomes 1, 3, and 4, where chromosomes 1 and 3 are known to contain disease loci. The flanking marker procedure performs well, even when missing data and genotyping errors are introduced.

16.
17.
Summary: Embryos of the free-living soil nematode Caenorhabditis elegans are capable of developing normally outside the mother; we have monitored this process in isolated embryos by light microscopy and recorded it on videotape. The size and position of each nucleus were entered into a computer at short time intervals from the 2- to 102-cell stages. Models were reconstructed in which nuclei are represented by spheres and assigned different colors and patterns according to lineage membership. Three-dimensional reconstructions aid visualization of the spatial arrangement of nuclei and demonstrate the small degree of positional variance among individuals. The dynamic processes of nuclear growth during the cell cycle, division, migration, and pattern formation can be quantitatively analyzed. Our knowledge of the complete embryonic lineage allows the correlation of nuclear behavior with eventual cellular fate.

18.
A model of carbohydrate metabolism during differentiation in Dictyostelium discoideum has been used to investigate which enzyme kinetic mechanism(s) might be operative for glycogen phosphorylase in vivo. The model, which has been described previously, is capable of simulating experimentally observed changes in metabolite concentrations and fluxes during differentiation, both under the standard starvation condition and in the presence of glucose (25 mM). The concentrations of the saccharide end products of differentiation under these two conditions differ substantially. Glycogen phosphorylase is described in the model by a rapid-equilibrium random bi-bi mechanism, and the effect of substituting four other kinetic mechanisms was examined. Each of these mechanisms allows simulations compatible with the saccharide accumulation patterns found during differentiation in the absence of glucose. However, in the presence of glucose, only a reversible mechanism (random or ordered) is compatible with the experimental data. It is concluded that glycogen degradation in vivo is controlled by an enzyme catalyzing a reversible reaction, the rate of which is inversely related to the glucose-1-P concentration.

19.
Serial section electron microscopy is typically applied to the investigation of small tissue volumes encompassing subcellular structures. However, in neurobiology, the need to relate subcellular structure to the organization of neural circuits can require investigation of large tissue volumes at ultrastructural resolution. Analysis of ultrastructure and three-dimensional reconstruction of even one to a few cells is time-consuming, and still does not generate the necessary number of observations to form well-grounded insights into biological principles. We describe an assemblage of existing computer-based methods and strategies for graphical analysis of large photographic montages that makes possible the study of multiple neurons through large tissue volumes. Sample preparation, data collection and subsequent analyses can be completed within 3-4 months. These methods generate extremely large data sets that can be mined in future studies of nervous system organization.

20.
Tropisms and other movements of a plant organ result from alterations in local rates of cell elongation and the consequent development of a growth differential between its opposite sides. Relative elemental rates of elongation (RELELs) are useful for characterizing the pattern of growth along and around an organ. We assume that the value of the RELEL at a given point depends on distance from the tip and that the distribution of values along the organ surface can be characterized in terms of the spread and the position of the maximum value. A computer model is described which accommodates these parameters and simulates tropic curvatures due to differential growth. Additional regulatory functions help to return the simulated organ to its original orientation. Particular attention is given to the simulation of root gravitropism because here not only does each of the various growth and regulatory parameters have a known biological counterpart, but some can also be given an actual quantitative value. The growth characteristics relate to the biophysical properties of cells in the elongation zone of the root, while the regulatory functions relate to aspects of the graviperception and transmission systems. We believe that, given a suitably flexible model, computer simulation is a powerful means of characterizing, in a quantitative way, the contribution of each parameter to the elongation of plant organs in general and their tropisms in particular.
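The link between a RELEL differential and tropic curvature can be sketched numerically: if the two flanks of an organ of diameter d elongate at relative rates r_lower > r_upper, local curvature develops at rate (r_lower - r_upper) / d, and integrating that along the organ gives the rate of tip-angle change. The Gaussian RELEL profile and every parameter value below are assumptions for illustration, not the paper's model:

```python
import numpy as np

def relel(s, r_max, s_peak, spread):
    """Assumed Gaussian RELEL profile: peak rate r_max (per hour) at distance
    s_peak from the tip, falling off with width `spread` (mm)."""
    return r_max * np.exp(-((s - s_peak) / spread) ** 2)

d = 1.0                              # organ diameter, mm
s = np.linspace(0.0, 10.0, 200)      # distance from the tip, mm
r_up = relel(s, 0.30, 2.0, 1.5)      # upper-flank RELEL
r_low = 1.2 * r_up                   # lower flank grows 20% faster

# Local rate of curvature development (per hour per mm) at each point:
dkappa_dt = (r_low - r_up) / d
# Rate of tip-angle change = integral of dkappa/dt along the organ (rad/h).
tip_rate = float(np.sum(dkappa_dt) * (s[1] - s[0]))
```

Regulatory functions of the kind the model describes would then feed back on the flank rates (e.g. damping the differential as the tip approaches vertical), returning the simulated organ to its original orientation.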
