Similar Literature

20 similar documents found.
1.

Background

We present a C++ class library for Monte Carlo simulation of molecular systems, including proteins in solution. The design is generic and highly modular, enabling multiple developers to easily implement additional features. The statistical mechanical methods are documented through extensive code comments that are subsequently collected to automatically build a web-based manual.

Results

We show how an object-oriented design can be used to create an intuitively appealing coding framework for molecular simulation. This is exemplified in a minimalistic C++ program that can calculate protein protonation states. We further discuss performance issues related to high-level coding abstraction.
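
To give a flavour of what such a minimalistic program does, here is a short Python sketch (an illustration under assumed energetics, not the library's C++ API) of a single Metropolis move acting on the protonation state of titratable sites:

import math, random

# Hypothetical illustration (not the library's API): a Metropolis move that
# flips the protonation state of one titratable site. The energy change
# combines the intrinsic pKa term with a site-site electrostatic term.
def metropolis_protonation_step(sites, pH, coupling, kT=1.0):
    i = random.randrange(len(sites))
    pKa, protonated = sites[i]
    # Intrinsic free-energy cost of protonating site i (in units of kT)
    dU = math.log(10.0) * (pH - pKa)
    if protonated:
        dU = -dU
    # Electrostatic interaction with the other sites' current charges
    for j, (_, prot_j) in enumerate(sites):
        if j != i and prot_j:
            dU += coupling[i][j] * (-1.0 if protonated else 1.0)
    if dU <= 0 or random.random() < math.exp(-dU / kT):
        sites[i] = (pKa, not protonated)   # accept the flip

# Two interacting sites with weak repulsive coupling, sampled at pH 7
sites = [(6.5, True), (7.4, False)]
coupling = [[0.0, 0.3], [0.3, 0.0]]
for _ in range(10000):
    metropolis_protonation_step(sites, pH=7.0, coupling=coupling)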

Conclusion

C++ and the Standard Template Library (STL) provide a high-performance platform for generic molecular modeling. Automatic generation of code documentation from inline comments has proven particularly useful in that no separate manual needs to be maintained.

2.

Background

CRANKITE is a suite of programs for simulating backbone conformations of polypeptides and proteins. The core of the suite is an efficient Metropolis Monte Carlo sampler of backbone conformations in continuous three-dimensional space in atomic details.

Methods

In contrast to other programs relying on local Metropolis moves in the space of dihedral angles, our sampler utilizes local crankshaft rotations of rigid peptide bonds in Cartesian space.
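
For illustration, a generic crankshaft move can be sketched in a few lines of Python (this is not CRANKITE's implementation): the atoms strictly between two pivot atoms are rotated about the pivot-pivot axis by a small random angle, leaving the rest of the chain untouched.

import numpy as np

# Illustrative sketch only: a crankshaft rotation in Cartesian space.
def crankshaft_move(coords, i, j, max_angle=0.3, rng=np.random.default_rng()):
    axis = coords[j] - coords[i]
    axis /= np.linalg.norm(axis)
    theta = rng.uniform(-max_angle, max_angle)
    # Rodrigues' rotation formula applied to the interior atoms
    k, c, s = axis, np.cos(theta), np.sin(theta)
    new = coords.copy()
    for a in range(i + 1, j):
        v = coords[a] - coords[i]
        new[a] = coords[i] + v * c + np.cross(k, v) * s + k * np.dot(k, v) * (1 - c)
    return new  # accept or reject with the usual Metropolis criterion

chain = np.cumsum(np.ones((10, 3)) * [1.5, 0.2, -0.1], axis=0)  # toy backbone
trial = crankshaft_move(chain, i=2, j=7)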

Results

The sampler allows fast simulation and analysis of secondary structure formation and conformational changes for proteins of average length.

3.

Background

The U.S. Government has encouraged shifting from internal combustion engine vehicles (ICEVs) to alternatively fueled vehicles such as electric vehicles (EVs) for three primary reasons: reducing oil dependence, reducing greenhouse gas emissions, and reducing Clean Air Act criteria pollutant emissions. In comparing these vehicles, there is uncertainty and variability in emission factors and performance variables, which cause wide variation in reported outputs.

Objectives

A model was developed to demonstrate the use of Monte Carlo simulation to predict differences in life cycle emissions and energy consumption between ICEVs and EVs on a per kilometer (km) traveled basis. Three EV battery technologies are considered: lead-acid, nickel-cadmium, and nickel metal hydride.

Methods

Variables were identified to build life cycle inventories for the EVs and the ICEV. Distributions were selected for each of the variables and input to Monte Carlo simulation software called Crystal Ball 2000®.
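
The sampling step can be sketched in a few lines of Python; the distributions and numbers below are placeholders for illustration, not the study's inventory data (Crystal Ball 2000® is commercial software):

import numpy as np

rng = np.random.default_rng(1)
N = 100_000  # Monte Carlo trials

# Placeholder input distributions standing in for the paper's inventory data
ev_kwh_per_km     = rng.triangular(0.15, 0.20, 0.30, N)   # EV electricity use
grid_gco2_per_kwh = rng.lognormal(np.log(900), 0.15, N)   # coal-heavy grid
icev_l_per_km     = rng.triangular(0.06, 0.08, 0.11, N)   # ICEV fuel use
fuel_gco2_per_l   = rng.normal(2900, 150, N)              # incl. upstream

ev_gco2_km   = ev_kwh_per_km * grid_gco2_per_kwh
icev_gco2_km = icev_l_per_km * fuel_gco2_per_l
diff = ev_gco2_km - icev_gco2_km

print(f"P(EV emits more CO2/km than ICEV) = {np.mean(diff > 0):.2f}")
print("90% interval of difference (g/km):",
      np.percentile(diff, [5, 95]).round(1))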

Results and Discussion

All three EV options reduce U.S. oil dependence by shifting to domestic coal. The life cycle energy consumption per kilometer (km) driven for the EVs is comparable to the ICEV; however, there is wide variation in predicted energy values. The model predicts that all three EV technologies will likely increase oxides of sulfur and nitrogen as well as particulate matter emissions on a per km driven basis. The model shows a high probability that volatile organic compounds and carbon monoxide emissions are reduced with the use of EVs. Lead emissions are also predicted to increase for lead-acid battery EVs. The EV will not reduce greenhouse gas emissions substantially and may even increase them based on the current U.S. reliance on coal for electricity generation. The EV may benefit public health by relocating air pollutants from urban centers, where traffic is concentrated, to rural areas where electricity generation and mining generally occur. The use of Monte Carlo simulation in life cycle analysis is demonstrated to be an effective tool to provide further insight on the likelihood of emission outputs and energy consumption.

4.

Background

The identification of copy number aberration in the human genome is an important area in cancer research. We develop a model for determining genomic copy numbers using high-density single nucleotide polymorphism genotyping microarrays. The method is based on a Bayesian spatial normal mixture model with an unknown number of components corresponding to true copy numbers. A reversible jump Markov chain Monte Carlo algorithm is used to implement the model and perform posterior inference.
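
As an illustration of the fixed-dimension core of such a sampler, the Python sketch below runs Gibbs updates for a k-component normal mixture; the reversible jump birth/death moves and the spatial dependence of the paper's model are deliberately omitted.

import numpy as np

rng = np.random.default_rng(0)

# Sketch: Gibbs sampling for a fixed-k normal mixture (known variance).
def gibbs_mixture(y, k=3, iters=500, sigma=0.2, tau=2.0):
    mu = np.linspace(y.min(), y.max(), k)   # component means (copy numbers)
    w = np.full(k, 1.0 / k)                 # mixture weights
    for _ in range(iters):
        # 1) sample component labels given means and weights
        logp = np.log(w) - 0.5 * ((y[:, None] - mu) / sigma) ** 2
        p = np.exp(logp - logp.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)
        z = np.array([rng.choice(k, p=pi) for pi in p])
        # 2) sample means given labels (conjugate normal prior N(0, tau^2))
        for j in range(k):
            yj = y[z == j]
            prec = len(yj) / sigma**2 + 1 / tau**2
            mu[j] = rng.normal((yj.sum() / sigma**2) / prec, 1 / np.sqrt(prec))
        # 3) sample weights from a Dirichlet given label counts
        w = rng.dirichlet(1 + np.bincount(z, minlength=k))
    return mu, w

y = np.concatenate([rng.normal(m, 0.2, 100) for m in (1.0, 2.0, 3.0)])
print(gibbs_mixture(y))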

Results

The performance of the algorithm is examined on both simulated and real cancer data, and it is compared with the popular CNAG algorithm for copy number detection.

Conclusions

We demonstrate that our Bayesian mixture model performs at least as well as the hidden Markov model-based CNAG algorithm and in certain cases does better. One of the added advantages of our method is the flexibility of modeling normal cell contamination in tumor samples.

5.

Purpose

The analysis of uncertainty in life cycle assessment (LCA) studies has been a topic for more than 10 years, and many commercial LCA programs now feature a sampling approach called Monte Carlo analysis. Yet a full Monte Carlo analysis of a large LCA system, for instance one containing the 4,000 unit processes of ecoinvent v2.2, is rarely carried out by LCA practitioners. One reason for this is computation time. A faster alternative to Monte Carlo is analytical error propagation by means of a Taylor series expansion; however, this approach is explained in the literature in conflicting ways, hampering its implementation in most LCA software packages. The purpose of this paper is to compare the two approaches from a theoretical and practical perspective.

Methods

In this paper, we compare the analytical and sampling approaches in terms of their theoretical background and their mathematical formulation. Using three case studies—one stylized, one real-sized, and one input–output (IO)-based—we approach these techniques from a practical perspective and compare them in terms of speed and results.
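
The two techniques can be contrasted on a stylized matrix-based system where the scaling vector s solves As = f and the impact is g = b·s. The Python sketch below (illustrative, not the paper's case studies) propagates an assumed uncertainty in the technology matrix both by sampling and by a first-order Taylor expansion:

import numpy as np

rng = np.random.default_rng(0)

A0 = np.array([[1.0, -0.2], [-0.5, 1.0]])   # technology matrix
b  = np.array([2.0, 1.0])                   # emission intensities
f  = np.array([1.0, 0.0])                   # functional unit
sd = 0.05                                   # st.dev. of each A entry (assumed)

def g_of(A):
    return b @ np.linalg.solve(A, f)

# --- Monte Carlo propagation ---
gs = [g_of(A0 + rng.normal(0, sd, A0.shape)) for _ in range(20000)]
print("MC:     mean %.4f  sd %.4f" % (np.mean(gs), np.std(gs)))

# --- First-order Taylor (analytical) propagation via central differences ---
var, eps = 0.0, 1e-6
for idx in np.ndindex(A0.shape):
    dA = np.zeros_like(A0); dA[idx] = eps
    dg = (g_of(A0 + dA) - g_of(A0 - dA)) / (2 * eps)   # dg/dA_ij
    var += (dg * sd) ** 2                              # independent inputs
print("Taylor: mean %.4f  sd %.4f" % (g_of(A0), var ** 0.5))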

Results

Depending on the precise question, a sampling or an analytical approach provides more useful information. Whenever they provide the same indicators, an analytical approach is much faster but less reliable when the uncertainties are large.

Conclusions

For a good analysis, analytical and sampling approaches are equally important. We recommend that practitioners use both whenever available and that software suppliers implement both.

6.

Background, aim, and scope

Uncertainty information is essential for the proper use of life cycle assessment (LCA) and environmental assessments in decision making. So far, parameter uncertainty propagation has mainly been studied using Monte Carlo techniques that are relatively computationally heavy to conduct, especially for the comparison of multiple scenarios, often limiting its use to research or to inventory only. Furthermore, Monte Carlo simulations do not automatically assess the sensitivity and contribution to overall uncertainty of individual parameters. The present paper aims to develop and apply to both inventory and impact assessment an explicit and transparent analytical approach to uncertainty. This approach applies Taylor series expansions to the uncertainty propagation of lognormally distributed parameters.

Materials and methods

We first apply the Taylor series expansion method to analyze the uncertainty propagation of a single scenario, in which case the squared geometric standard deviation of the final output is determined as a function of the model sensitivity to each input parameter and the squared geometric standard deviation of each parameter. We then extend this approach to the comparison of two or more LCA scenarios. Since in LCA it is crucial to account for both common inventory processes and common impact assessment characterization factors among the different scenarios, we further develop the approach to address this dependency. We provide a method to easily determine a range and a best estimate of (a) the squared geometric standard deviation on the ratio of the two scenario scores, “A/B”, and (b) the degree of confidence in the prediction that the impact of scenario A is lower than B (i.e., the probability that A/B<1). The approach is tested on an automobile case study and resulting probability distributions of climate change impacts are compared to classical Monte Carlo distributions.
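
For lognormally distributed inputs, the first-order propagation rule underlying such an approach can be written as follows (a standard formulation; the paper's exact notation may differ):

\[
  \left(\ln \mathrm{GSD}_y\right)^2 \;\approx\;
  \sum_i S_i^2 \,\left(\ln \mathrm{GSD}_{x_i}\right)^2,
  \qquad
  S_i \;=\; \frac{\partial \ln y}{\partial \ln x_i}
        \;=\; \frac{x_i}{y}\,\frac{\partial y}{\partial x_i},
\]
\[
  \text{so that the squared geometric standard deviation is }
  \mathrm{GSD}_y^2 \;=\;
  \exp\!\Bigl(2\sqrt{\textstyle\sum_i S_i^2 \left(\ln \mathrm{GSD}_{x_i}\right)^2}\Bigr).
\]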

Results

The probability distributions obtained with the Taylor series expansion lead to results similar to the classical Monte Carlo distributions, while being substantially simpler; the Taylor series method tends to underestimate the 2.5% confidence limit by 1-11% and the 97.5% limit by less than 5%. The analytical Taylor series expansion easily provides the explicit contributions of each parameter to the overall uncertainty. For the steel front end panel, the factor contributing most to the climate change score uncertainty is the gasoline consumption (>75%). For the aluminum panel, the electricity and aluminum primary production, as well as the light oil consumption, are the dominant contributors to the uncertainty. The developed approach for scenario comparisons, differentiating between common and independent parameters, leads to results similar to those of a Monte Carlo analysis; for all tested cases, we obtained a good concordance between the Monte Carlo and the Taylor series expansion methods regarding the probability that one scenario is better than the other.

Discussion

The Taylor series expansion method addresses the crucial need of accounting for dependencies in LCA, both for common LCI processes and common LCIA characterization factors. The developed approach in Eq. 8, which differentiates between common and independent parameters, estimates the degree of confidence in the prediction that scenario A is better than B, yielding results similar to those found with Monte Carlo simulations.

Conclusions

The probability distributions obtained with the Taylor series expansion are virtually equivalent to those from a classical Monte Carlo simulation, while being significantly easier to obtain. An automobile case study on an aluminum front end panel demonstrated the feasibility of this method and illustrated its simultaneous and consistent application to both inventory and impact assessment. The explicit and innovative analytical approach, based on Taylor series expansions of lognormal distributions, provides the contribution to the uncertainty from each parameter and strongly reduces calculation time.

7.

Introduction

Virtually all existing expectation-maximization (EM) algorithms for quantitative trait locus (QTL) mapping overlook the covariance structure of genetic effects, even though this information can help enhance the robustness of model-based inferences.

Results

Here, we propose fast EM and pseudo-EM-based procedures for Bayesian shrinkage analysis of QTLs, designed to accommodate the posterior covariance structure of genetic effects through a block-updating scheme, that is, by updating all genetic effects simultaneously over many cycles of iterations.

Conclusion

Simulation results based on computer-generated and real-world marker data demonstrated the ability of our method to swiftly produce sensible results regarding the phenotype-to-genotype association. Our new method provides a robust and remarkably fast alternative to full Bayesian estimation in high-dimensional models where the computational burden associated with Markov chain Monte Carlo simulation is often unwieldy. The R code used to fit the model to the data is provided in the online supplementary material.

8.
The goal of this study is to show that Monte Carlo simulation of light propagation in the head, using a 3-D optical model built from an in vivo MRI data set, can characterize the spatial sensitivity of the cerebral cortex folding geometry. We therefore propose an MRI-based approach to 3-D brain modeling for near-infrared spectroscopy (NIRS). The results show that the spatial sensitivity profile of the cortical folding geometry and the arrangement of source-detector separation must be considered in applications of functional NIRS. The optimal source-detector separation is suggested to be within 3–3.5 cm, based on the received intensity at different separations and on the requirement that the ratio of received light from the gray and white matter layers exceed 50%. Additionally, this study demonstrates the capability of NIRS not only to assess function but also to detect structural change in the brain, by taking advantage of the low scattering and absorption coefficients of CSF in the sagittal view. (© 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
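
The core of such a Monte Carlo photon-transport simulation can be sketched as a weighted random walk. The Python fragment below is a deliberately simplified homogeneous-medium illustration with placeholder coefficients; the study itself uses a voxelized MRI-derived head model with layered tissue optics.

import numpy as np

rng = np.random.default_rng(0)

mu_a, mu_s = 0.02, 10.0          # absorption / scattering [1/mm] (assumed)
mu_t = mu_a + mu_s

def launch_photon(max_steps=10_000):
    pos = np.zeros(3)
    direction = np.array([0.0, 0.0, 1.0])    # launched into the tissue
    weight = 1.0
    for _ in range(max_steps):
        step = -np.log(rng.random()) / mu_t  # sample free path length
        pos = pos + direction * step
        weight *= mu_s / mu_t                # survival (albedo) weighting
        if weight < 1e-4 or pos[2] < 0:      # absorbed or escaped at surface
            break
        # isotropic scattering: draw a new random direction
        cos_t = 2 * rng.random() - 1
        phi = 2 * np.pi * rng.random()
        sin_t = np.sqrt(1 - cos_t**2)
        direction = np.array([sin_t*np.cos(phi), sin_t*np.sin(phi), cos_t])
    return pos, weight

depths = [launch_photon()[0][2] for _ in range(1000)]
print("mean final depth [mm]: %.2f" % np.mean(depths))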

9.

Background

Combinatorial complexity is a challenging problem for the modeling of cellular signal transduction since the association of a few proteins can give rise to an enormous number of feasible protein complexes. The layer-based approach is an approximate but accurate method for the mathematical modeling of signaling systems with inherent combinatorial complexity. The number of variables in the simulation equations is greatly reduced, and the resulting dynamic models show a pronounced modularity. Layer-based modeling allows for the modeling of systems not previously accessible.

Results

ALC (Automated Layer Construction) is a computer program that highly simplifies the building of reduced modular models, according to the layer-based approach. The model is defined using a simple but powerful rule-based syntax that supports the concepts of modularity and macrostates. ALC performs consistency checks on the model definition and provides the model output in different formats (C MEX, MATLAB, Mathematica and SBML) as ready-to-run simulation files. ALC also provides additional documentation files that simplify the publication or presentation of the models. The tool can be used offline or via a form on the ALC website.

Conclusion

ALC allows for a simple rule-based generation of layer-based reduced models. The model files are given in different formats as ready-to-run simulation files.

10.

Purpose

When product systems are optimized to minimize environmental impacts, uncertainty in the process data may impact optimal decisions. The purpose of this article is to propose a mathematical method for life cycle assessment (LCA) optimization that protects decisions against uncertainty at the life cycle inventory (LCI) stage.

Methods

A robust optimization approach is proposed for decision making under uncertainty in the LCI stage. The proposed approach incorporates data uncertainty into an optimization problem in which the matrix-based LCI model appears as a constraint. The level of protection against data uncertainty in the technology and intervention matrices can be controlled to reflect varying degrees of conservatism.
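
A deliberately reduced sketch of the idea, with box uncertainty on the emission factors and a scalar protection level (the paper's robust counterpart is more general), is:

import numpy as np
from scipy.optimize import linprog

# Choose nonnegative process scalings s meeting a demand while minimizing
# the worst-case impact (b + gamma*delta)·s. All numbers are illustrative.
b     = np.array([1.00, 0.80])   # nominal emission factors per unit output
delta = np.array([0.05, 0.40])   # uncertainty half-widths (process 2 riskier)

# Constraint: the two processes together supply one functional unit.
A_eq = np.array([[1.0, 1.0]])
b_eq = np.array([1.0])

for gamma in (0.0, 0.5, 1.0):    # 0 = nominal data, 1 = full protection
    res = linprog(c=b + gamma * delta, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
    print(f"gamma={gamma:.1f}  s={res.x.round(2)}  worst-case impact={res.fun:.3f}")

Raising the protection level flips the optimal choice from the nominally cheaper but uncertain process to the safer one, which is the mechanism the abstract describes.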

Results and discussion

A simple numerical example of an electricity generation product system is used to illustrate the main features of this methodology. A comparison is made between the robust optimization approach and decision making using a Monte Carlo analysis. Challenges in implementing the robust optimization approach for common uncertainty distributions found in LCA and for large product systems are discussed. Supporting source code is available for download at https://github.com/renwang/Robust_Optimization_LCI_Uncertainty.

Conclusions

A robust optimization approach for matrix-based LCI is proposed. The approach incorporates data uncertainties into an optimization framework for LCI and provides a mechanism to control the level of protection against uncertainty. The tool computes optimal decisions that protect against worst-case realizations of data uncertainty. The robust optimal solution is conservative and is able to avoid the negative consequences of uncertainty in decision making.

11.

Background

Steroid 21-hydroxylase deficiency is the most common cause of congenital adrenal hyperplasia (CAH). Detection of underlying mutations in CYP21A2 gene encoding steroid 21-hydroxylase enzyme is helpful both for confirmation of diagnosis and management of CAH patients. Here we report a novel 9-bp insertion in CYP21A2 gene and its structural and functional consequences on P450c21 protein by molecular modeling and molecular dynamics simulations methods.

Methods

A 30-day-old child was referred to our laboratory for molecular diagnosis of CAH. Sequencing of the entire CYP21A2 gene revealed a novel insertion (duplication) of 9 bp in exon 2 of one allele and the well-known mutation I172N in exon 4 of the other allele. Molecular modeling and simulation studies were carried out to understand the plausible structural and functional implications of the novel mutation.

Results

Insertion of the nine bases in exon 2 resulted in the addition of three valine residues at codon 71 of the P450c21 protein. Molecular dynamics simulations revealed that the mutant exhibits faster unfolding kinetics, and an overall destabilization of the structure due to the triple valine insertion was also observed.

Conclusion

The novel 9-bp insertion in exon 2 of the CYP21A2 gene significantly lowers the structural stability of P450c21, thereby leading to the probable loss of its function.

12.

Background

Hyperpolarised helium MRI (He3 MRI) is a new technique that enables imaging of the air distribution within the lungs. This allows accurate determination of the ventilation distribution in vivo. The technique has the disadvantages of requiring an expensive helium isotope, complex apparatus and moving the patient to a compatible MRI scanner. Electrical impedance tomography (EIT) is a non-invasive bedside technique that allows continuous monitoring of lung impedance, which depends on changes in air content in the lung. We used He3 MRI measurements of ventilation distribution as the gold standard for assessment of EIT.

Methods

Seven rats were ventilated in supine, prone, left and right lateral positions with 70% helium/30% oxygen for EIT measurements and with pure helium for He3 MRI. The same ventilator and settings were used for both measurements. Image dimensions, geometric centre and the global inhomogeneity index were calculated.
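
The global inhomogeneity (GI) index is commonly computed as the sum of absolute deviations from the median impedance change within the lung region, normalized by the total change. A short Python sketch under that assumption (the abstract does not spell out the formula) is:

import numpy as np

def gi_index(tidal_image, lung_mask):
    # impedance changes in lung pixels; larger GI = more inhomogeneous
    di = tidal_image[lung_mask]
    return np.abs(di - np.median(di)).sum() / di.sum()

img = np.random.default_rng(0).gamma(2.0, 1.0, (32, 32))  # toy tidal image
mask = np.zeros((32, 32), dtype=bool)
mask[8:24, 4:28] = True                                   # toy lung region
print(f"GI = {gi_index(img, mask):.3f}")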

Results

EIT images were smaller, of lower resolution and contained less anatomical detail than those from He3 MRI. However, both methods could measure positionally induced changes in lung ventilation, as assessed by the geometric centre. The global inhomogeneity indices were comparable between the techniques.

Conclusion

EIT is a suitable technique for monitoring ventilation distribution and inhomogeneity, as assessed by comparison with He3 MRI.

13.

Key message

Linkage analysis confirmed the association in the region of PHYC in pearl millet. The comparison of genes found in this region suggests that PHYC is the best candidate.

Abstract

Major efforts are currently underway to dissect the phenotype–genotype relationship in plants and animals using existing populations. This method exploits historical recombinations accumulated in these populations. However, linkage disequilibrium sometimes extends over a relatively long distance, particularly in genomic regions containing polymorphisms that have been targets for selection. In this case, many genes in the region could be statistically associated with the trait shaped by the selected polymorphism. Statistical analyses could help in identifying the best candidate genes within such a region where an association is found. In a previous study, we proposed that a fragment of the PHYTOCHROME C gene (PHYC) is associated with flowering time and morphological variations in pearl millet. In the present study, we first performed linkage analyses using three pearl millet F2 families to confirm the presence of a QTL in the vicinity of PHYC. We then analyzed a wider genomic region of ~100 kb around PHYC to pinpoint the gene that best explains the association with the trait in this region. A panel of 90 pearl millet inbred lines was used to assess the association. We used a Markov chain Monte Carlo approach to compare 75 markers distributed along this 100-kb region. We found the best candidate markers on the PHYC gene. Signatures of selection in this region were assessed in an independent data set and pointed to the same gene. These results foster confidence in the likely role of PHYC in phenotypic variation and encourage the development of functional studies.

14.

Purpose

To determine the diagnostic and prognostic value of MRI in Peyronie's disease.

Material and Methods

Thirty-one penile MR examinations were performed in 28 patients aged between 21 and 73 (1 Tesla; surface coil; sagittal SE T1, axial SE T2 weighted, T1 before and after gadolinium).
- In all cases but one, fibrous plaques were clinically palpable.
- Images were compared with the clinical examination and with evolution under anti-inflammatory drugs.

Results

- In 3 cases, MRI failed to detect a solitary plaque.
- In 2 additional cases, one of the 2 clinical plaques was not detected.
- In 5 cases, MRI depicted more lesions than palpation.
- Gadolinium enhancement was always correlated with a good response to anti-inflammatory drugs, but this treatment was also useful in one case that showed no enhancement.

Conclusion

MRI can be helpful in the pretreatment assessment and in the follow-up of Peyronie's disease.

15.

Background

Sensitivity analysis is an indispensable tool for the analysis of complex systems. In a recent paper, we have introduced a thermodynamically consistent variance-based sensitivity analysis approach for studying the robustness and fragility properties of biochemical reaction systems under uncertainty in the standard chemical potentials of the activated complexes of the reactions and the standard chemical potentials of the molecular species. In that approach, key sensitivity indices were estimated by Monte Carlo sampling, which is computationally very demanding and impractical for large biochemical reaction systems. Computationally efficient algorithms are needed to make variance-based sensitivity analysis applicable to realistic cellular networks, modeled by biochemical reaction systems that consist of a large number of reactions and molecular species.
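
The Monte Carlo baseline referred to here is typically a pick-freeze (Saltelli-type) estimator of first-order variance-based sensitivity indices; a generic Python sketch with a toy model (not the MAPK case study) is:

import numpy as np

rng = np.random.default_rng(0)

# Plain Monte Carlo estimator of first-order Sobol indices - the costly
# baseline that the analytical approximations are designed to replace.
def first_order_sobol(f, d, N=50_000):
    A = rng.random((N, d))
    B = rng.random((N, d))
    fA, fB = f(A), f(B)
    varY = np.var(np.concatenate([fA, fB]))
    S = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]          # replace column i with fresh samples
        S[i] = np.mean(fB * (f(ABi) - fA)) / varY
    return S

# Toy model standing in for a reaction-network output
def model(X):
    return X[:, 0] + 2.0 * X[:, 1] + 0.5 * X[:, 0] * X[:, 2]

print(first_order_sobol(model, d=3).round(3))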

Results

We present four techniques, derivative approximation (DA), polynomial approximation (PA), Gauss-Hermite integration (GHI), and orthonormal Hermite approximation (OHA), for analytically approximating the variance-based sensitivity indices associated with a biochemical reaction system. By using a well-known model of the mitogen-activated protein kinase signaling cascade as a case study, we numerically compare the approximation quality of these techniques against traditional Monte Carlo sampling. Our results indicate that, although DA is computationally the most attractive technique, special care should be exercised when using it for sensitivity analysis, since it may only be accurate at low levels of uncertainty. On the other hand, PA, GHI, and OHA are computationally more demanding than DA but can work well at high levels of uncertainty. GHI results in a slightly better accuracy than PA, but it is more difficult to implement. OHA produces the most accurate approximation results and can be implemented in a straightforward manner. It turns out that the computational cost of the four approximation techniques considered in this paper is orders of magnitude smaller than traditional Monte Carlo estimation. Software, coded in MATLAB®, which implements all sensitivity analysis techniques discussed in this paper, is available free of charge.

Conclusions

Estimating variance-based sensitivity indices of a large biochemical reaction system is a computationally challenging task that can only be addressed via approximations. Among the methods presented in this paper, a technique based on orthonormal Hermite polynomials seems to be an acceptable candidate for the job, producing very good approximation results for a wide range of uncertainty levels in a fraction of the time required by traditional Monte Carlo sampling.

16.

Background

Conventional magnetic resonance imaging (MRI) has improved the diagnosis and monitoring of multiple sclerosis (MS). In clinical trials, MRI has been found to detect treatment effects with greater sensitivity than clinical measures; however, clinical and MRI outcomes tend to correlate poorly.

Methods

In this observational study, patients (n = 550; 18-50 years; relapsing-remitting MS [Expanded Disability Status Scale score ≤4.0]) receiving interferon (IFN) β-1a therapy (44 or 22 µg subcutaneously [sc] three times weekly [tiw]) underwent standardized MRI, neuropsychological and quality-of-life (QoL) assessments over 3 years. In this post hoc analysis, MRI outcomes and correlations between MRI parameters and clinical and functional outcomes were analysed.

Results

MRI data over 3 years were available for 164 patients. T2 lesion and T1 gadolinium-enhancing (Gd+) lesion volumes, but not black hole (BH) volumes, decreased significantly from baseline to Year 3 (P < 0.0001). Percentage decreases (baseline to Year 3) were greater with the 44 μg dose than with the 22 μg dose for T2 lesion volume (-10.2% vs -4.5%, P = 0.025) and T1 BH volumes (-7.8% vs +10.3%, P = 0.002). A decrease in T2 lesion volume over 3 years predicted stable QoL over the same time period. Treatment with IFN β-1a, 44 μg sc tiw, predicted an absence of cognitive impairment at Year 3.

Conclusion

Subcutaneous IFN β-1a significantly decreased MRI measures of disease, with a significant benefit shown for the 44 µg over the 22 µg dose; higher-dose treatment also predicted better cognitive outcomes over 3 years.

17.

Background

Therapeutic irreversible electroporation (IRE) is an emerging technology for the non-thermal ablation of tumors. The technique involves delivering a series of unipolar electric pulses to permanently destabilize the plasma membrane of cancer cells through an increase in transmembrane potential, which leads to the development of a tissue lesion. Clinically, IRE requires the administration of paralytic agents to prevent the muscle contractions associated with the delivery of electric pulses. This study shows that by applying high-frequency, bipolar bursts, muscle contractions can be eliminated during IRE without compromising the non-thermal mechanism of cell death.

Methods

A combination of analytical, numerical, and experimental techniques was used to investigate high-frequency irreversible electroporation (H-FIRE). A theoretical model for determining transmembrane potential in response to arbitrary electric fields was used to identify optimal burst frequencies and amplitudes for in vivo treatments. A finite element model for predicting thermal damage based on the electric field distribution was used to design non-thermal protocols for in vivo experiments. H-FIRE was applied to the brain of rats, and muscle contractions were quantified via accelerometers placed at the cervicothoracic junction. MRI and histological evaluation were performed post-operatively to assess ablation.
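
The kind of first-order cell model used for such frequency optimization can be sketched with the classical Schwan relation, in which the induced transmembrane potential (TMP) rolls off with the membrane charging time constant. The parameter values below are illustrative assumptions, not the paper's:

import numpy as np

r   = 10e-6        # cell radius [m] (assumed)
tau = 1e-6         # membrane charging time constant [s] (assumed)

def peak_tmp(E, f_hz):
    # Peak induced TMP at the cell pole for a sinusoidal field of amplitude E
    return 1.5 * E * r / np.sqrt(1.0 + (2 * np.pi * f_hz * tau) ** 2)

for f in (1e3, 250e3, 500e3):
    # Higher frequencies require larger fields for the same TMP,
    # which is the trade-off H-FIRE exploits.
    print(f"f = {f/1e3:6.0f} kHz  ->  TMP per kV/cm: {peak_tmp(1e5, f):.3f} V")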

Results

No visual or tactile evidence of muscle contraction was seen during H-FIRE at 250 kHz or 500 kHz, while all IRE protocols resulted in detectable muscle contractions at the cervicothoracic junction. H-FIRE produced ablative lesions in brain tissue that were characteristic in cellular morphology of non-thermal IRE treatments. Specifically, there was complete uniformity of tissue death within targeted areas, and a sharp transition zone was present between lesioned and normal brain.

Conclusions

H-FIRE is a feasible technique for non-thermal tissue ablation that eliminates the muscle contractions seen in IRE treatments performed with unipolar electric pulses. Therefore, it has the potential to be performed clinically without the administration of paralytic agents.

18.

Background

It is a daunting task to identify all the metabolic pathways of brain energy metabolism and develop a dynamic simulation environment that will cover a time scale ranging from seconds to hours. To simplify this task and make it more practicable, we undertook stoichiometric modeling of brain energy metabolism with the major aim of including the main interacting pathways in and between astrocytes and neurons.

Model

The constructed model includes central metabolism (glycolysis, pentose phosphate pathway, TCA cycle), lipid metabolism, reactive oxygen species (ROS) detoxification, amino acid metabolism (synthesis and catabolism), the well-known glutamate-glutamine cycle, other coupling reactions between astrocytes and neurons, and neurotransmitter metabolism. This is, to our knowledge, the most comprehensive attempt at stoichiometric modeling of brain metabolism to date in terms of its coverage of a wide range of metabolic pathways. We then attempted to model the basal physiological behaviour and hypoxic behaviour of the brain cells where astrocytes and neurons are tightly coupled.

Results

The reconstructed stoichiometric reaction model included 217 reactions (184 internal, 33 exchange) and 216 metabolites (183 internal, 33 external) distributed in and between astrocytes and neurons. Flux balance analysis (FBA) techniques were applied to the reconstructed model to elucidate the underlying cellular principles of neuron-astrocyte coupling. Simulation of resting conditions under the constraints of maximization of glutamate/glutamine/GABA cycle fluxes between the two cell types, with subsequent minimization of the Euclidean norm of fluxes, resulted in a flux distribution in accordance with literature-based findings. As a further validation of our model, the effect of oxygen deprivation (hypoxia) on fluxes was simulated using an FBA-derivative approach known as minimization of metabolic adjustment (MOMA). The results show the power of the constructed model to simulate disease behaviour at the flux level, and its potential to analyze cellular metabolic behaviour in silico.
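
The constraint structure of FBA is compact enough to sketch on a toy network (three reactions rather than the model's 217): maximize a linear objective subject to steady-state mass balance S v = 0 and flux bounds.

import numpy as np
from scipy.optimize import linprog

#    R1: -> A      R2: A -> B      R3: B ->     (objective: maximize R3)
S = np.array([[ 1, -1,  0],    # metabolite A
              [ 0,  1, -1]])   # metabolite B
bounds = [(0, 10), (0, 5), (0, None)]   # uptake limited to 10, R2 to 5

res = linprog(c=[0, 0, -1],             # maximize v3 == minimize -v3
              A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print("optimal fluxes:", res.x)         # expected: [5, 5, 5], capped by R2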

Conclusion

The predictive power of the constructed model for the key flux distributions, especially central carbon metabolism and glutamate-glutamine cycle fluxes, and its application to hypoxia is promising. The resultant acceptable predictions strengthen the power of such stoichiometric models in the analysis of mammalian cell metabolism.

19.

Background

The IL4/IL4RA pathway plays an important role in atopy and asthma. Different polymorphisms in the IL4 and IL4RA genes have been described. In particular, the -33C>T IL4 and 576Q>R IL4RA SNPs have been independently associated with atopy and asthma. The purpose of this study was to analyse these polymorphisms in a population of patients with a well-characterized asthma phenotype.

Methods

A total of 212 unrelated Caucasian individuals, 133 patients with asthma and 79 healthy subjects without symptoms or history of asthma or atopy and with negative skin prick tests, were recruited. Lung function was measured by spirometry, and asthma was diagnosed by a specialist physician according to the ATS (American Thoracic Society) criteria and classified following the GINA (Global Initiative for Asthma) guidelines. Skin prick tests were performed according to EAACI recommendations. -33C>T IL4 was studied with a TaqMan assay and 576Q>R IL4RA by the PCR-RFLP technique. Hardy-Weinberg equilibrium was analysed in all groups. Dichotomous variables were analysed using the χ² test, Fisher's exact test, a Monte Carlo simulation test and odds ratios. Logistic regression was used to model the effects of multiple covariates.
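
A Monte Carlo simulation test of this kind can be sketched as a label-permutation test on a 2×2 table; the counts below are made up for illustration:

import numpy as np

rng = np.random.default_rng(0)

def chi2_stat(t):
    # Pearson chi-square statistic for a contingency table
    exp = np.outer(t.sum(1), t.sum(0)) / t.sum()
    return ((t - exp) ** 2 / exp).sum()

table = np.array([[40, 20],    # carriers:     asthma / control
                  [60, 92]])   # non-carriers: asthma / control
obs = chi2_stat(table)

labels = np.repeat([0, 1], table.sum(axis=1))   # carrier status per subject
outcome = np.concatenate([np.repeat([0, 1], row) for row in table])
n_sim, hits = 10_000, 0
for _ in range(n_sim):
    perm = rng.permutation(outcome)             # shuffle outcomes under H0
    sim = np.array([[np.sum((labels == i) & (perm == j)) for j in (0, 1)]
                    for i in (0, 1)])
    hits += chi2_stat(sim) >= obs
print(f"Monte Carlo p = {(hits + 1) / (n_sim + 1):.4f}")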

Results

No statistically significant differences between the group of patients with asthma and the controls were found when the allele and genotype distributions of the -33C>T IL4 and 576Q>R IL4RA polymorphisms were compared. However, the T allele of the -33C>T IL4 SNP was more frequent in patients with persistent asthma. Multivariate analysis adjusted for age and sex confirmed that carriers of the T allele had an increased risk of persistent asthma (OR: 2.77, 95% CI: 1.18–6.49; p = 0.019). Analysis of combinations of polymorphisms showed that patients carrying both the T allele of -33C>T IL4 and the A allele of 576Q>R IL4RA had an increased risk of asthma. This association was particularly observed in persistent asthma [Fisher's p value = 0.0021, Monte Carlo p value (after 10⁴ simulations) = 0.0016, OR: 3.39; 95% CI: 1.50–7.66].

Conclusion

Our results show a trend of association between the combination of the T allele of -33C>T IL4 and the A allele of 576Q>R IL4RA and asthma. This genetic variant was more frequently observed in patients with persistent asthma. Since this study was performed in a small population, further studies in other populations are needed to confirm these results.

20.

Background

Protein loops are flexible structures that are intimately tied to function, but understanding loop motion and generating loop conformation ensembles remain significant computational challenges. Discrete search techniques scale poorly to large loops, optimization and molecular dynamics techniques are prone to local minima, and inverse kinematics techniques can only incorporate structural preferences in an ad hoc fashion. This paper presents Sub-Loop Inverse Kinematics Monte Carlo (SLIKMC), a new Markov chain Monte Carlo algorithm for generating conformations of closed loops according to experimentally available, heterogeneous structural preferences.

Results

Our simulation experiments demonstrate that the method computes high-scoring conformations of large loops (>10 residues) orders of magnitude faster than standard Monte Carlo and discrete search techniques. Two new developments contribute to the scalability of the new method. First, structural preferences are specified via a probabilistic graphical model (PGM) that links conformation variables, spatial variables (e.g., atom positions), constraints and prior information in a unified framework. The method uses a sparse PGM that exploits locality of interactions between atoms and residues. Second, a novel method for sampling sub-loops is developed to generate statistically unbiased samples of probability densities restricted by loop-closure constraints.

Conclusion

Numerical experiments confirm that SLIKMC generates conformation ensembles that are statistically consistent with specified structural preferences. Protein conformations with 100+ residues are sampled on standard PC hardware in seconds. Application to proteins involved in ion binding demonstrates its potential as a tool for loop ensemble generation and missing structure completion.
