Similar Documents
20 similar records found (search time: 31 ms)
1.
Quantification of the uncertainty associated with risk estimates is an important part of risk assessment. In recent years, the use of second-order distributions and two-dimensional simulations has been suggested for quantifying both variability and uncertainty. These approaches are better interpreted within the Bayesian framework. To help practitioners better use such methods and interpret the results, in this article we describe the propagation and interpretation of uncertainty in the Bayesian paradigm. We consider both the estimation problem, where some summary measures of the risk distribution (e.g., mean, variance, or selected percentiles) are to be estimated, and the prediction problem, where the risk values for some specific individuals are to be predicted. We discuss some connections and differences between uncertainties in estimation and prediction problems, and present an interpretation of the decomposition of total variability/uncertainty into variability and uncertainty in terms of the expected squared error of prediction and its reduction under perfect information. We also discuss the role of Monte Carlo methods in characterizing uncertainty. We explain the basic ideas using a simple example, and demonstrate Monte Carlo calculations using another example from the literature.
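The decomposition described above can be illustrated with a two-dimensional Monte Carlo sketch: an outer loop samples the uncertain parameter, an inner loop samples individual variability, and the law of total variance splits total spread into the two components. The lognormal risk model and all parameter values below are invented for illustration, not taken from the article.

```python
import random
import statistics

random.seed(42)

# Two-dimensional Monte Carlo sketch: the outer loop draws the uncertain
# parameter (uncertainty), the inner loop draws individual risks
# (variability). The lognormal risk model and all values are invented.
N_OUTER = 200    # uncertainty dimension
N_INNER = 500    # variability dimension

cond_means, cond_vars = [], []
for _ in range(N_OUTER):
    mu = random.gauss(0.0, 0.3)          # uncertain parameter (assumed prior)
    risks = [random.lognormvariate(mu, 0.5) for _ in range(N_INNER)]
    cond_means.append(statistics.fmean(risks))
    cond_vars.append(statistics.pvariance(risks))

# Law of total variance: total spread = mean within-parameter variance
# (variability) + variance of the conditional means (uncertainty).
variability = statistics.fmean(cond_vars)
uncertainty = statistics.pvariance(cond_means)
total = variability + uncertainty
print(f"variability={variability:.3f} uncertainty={uncertainty:.3f} "
      f"total={total:.3f}")
```

The "perfect information" interpretation is visible here: learning the parameter exactly would remove the `uncertainty` term, leaving only irreducible `variability`.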

2.
Accuracy of results from mathematical and computer models of biological systems is often complicated by the presence of uncertainties in the experimental data used to estimate parameter values. Current mathematical modeling approaches typically use either single-parameter or local sensitivity analyses. However, these methods do not accurately assess uncertainty and sensitivity in the system as, by default, they hold all other parameters fixed at baseline values. Using the techniques described within, we demonstrate how a multi-dimensional parameter space can be studied globally so that all uncertainties can be identified. Further, uncertainty and sensitivity analysis techniques can help to identify and ultimately control uncertainties. In this work we develop methods for applying existing analytical tools to perform analyses on a variety of mathematical and computer models. We compare two specific types of global sensitivity analysis index that have proven to be among the most robust and efficient. Through familiar and new examples of mathematical and computer models, we provide a complete methodology for performing these analyses, in both deterministic and stochastic settings, and propose novel techniques to handle problems encountered during these types of analyses.
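A minimal sketch of the global, variance-based approach (as opposed to one-at-a-time analysis) is a pick-freeze estimator of first-order Sobol indices. The additive test model, its coefficients, and the sample size below are invented; the abstract's own indexes and models are not reproduced here.

```python
import random

random.seed(1)

# Toy additive model: the analytic first-order Sobol indices are
# 16/17 ~ 0.94 for x1 and 1/17 ~ 0.06 for x2 (invented example).
def model(x1, x2):
    return 4.0 * x1 + 1.0 * x2

N = 50_000
A = [(random.random(), random.random()) for _ in range(N)]
B = [(random.random(), random.random()) for _ in range(N)]
yA = [model(*a) for a in A]
mean = sum(yA) / N
var = sum((y - mean) ** 2 for y in yA) / N

def first_order(i):
    # Pick-freeze: evaluate the B sample with column i taken from A,
    # so factor i is "frozen" between the two evaluations.
    yAB = []
    for a, b in zip(A, B):
        x = list(b)
        x[i] = a[i]
        yAB.append(model(*x))
    cov = sum(ya * yab for ya, yab in zip(yA, yAB)) / N - mean * mean
    return cov / var

s1, s2 = first_order(0), first_order(1)
print(f"S1 = {s1:.3f}, S2 = {s2:.3f}")
```

Unlike a local analysis, every factor varies over its full range in every evaluation, so interactions and distant regions of parameter space are not silently held at baseline.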

3.
Løkkeborg, Svein; Fernö, Anders; Jørgensen, Terje. Hydrobiologia (2002) 483(1-3): 259-264
Ultrasonic telemetry using stationary positioning systems allows several fish to be tracked simultaneously, but systems that are incapable of sampling multiple frequencies simultaneously can record data from only one transmitter (individual) at a time. Tracking several individuals simultaneously thus results in longer intervals between successive position fixes for each fish. This deficiency leads to loss of detail in the tracking data collected, and may be expected to cause loss of accuracy in estimates of the swimming speeds and movement patterns of the fish tracked. Even systems that track fish on multiple frequencies are not capable of continuous tracking due to technical issues. We determined the swimming speed, area occupied, activity rhythm and movement pattern of cod (Gadus morhua) using a stationary single-channel positioning system, and analysed how estimates of these behavioural parameters were affected by the interval between successive position fixes. Single fish were tracked one at a time, and position fixes were eliminated at regular intervals in the original data to generate new data sets, as if they had been collected in the course of tracking several fish (2–16). In comparison with the complete set, these data sets gave 30–70% decreases in estimates of swimming speed depending on the number of fish supposedly being tracked. These results were similar for two individuals of different size and activity level, indicating that they can be employed as correction factors to partly compensate for underestimates of swimming speed when several fish are tracked simultaneously. Tracking 'several' fish only slightly affected the estimates of area occupied (1–15%). The diurnal activity rhythm was also similar between the data sets, whereas details in search pattern were not seen when several fish were tracked simultaneously.
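The effect of longer fix intervals on speed estimates can be reproduced on synthetic data: subsampling a track replaces the true path with straight-line chords, which by the triangle inequality can only shorten it. The random-walk movement model below is invented, not the cod data.

```python
import math
import random

random.seed(7)

# Simulated track: one position fix per time unit, unit-step random walk.
# The movement model and step length are invented, not the cod data.
track = [(0.0, 0.0)]
for _ in range(2000):
    ang = random.uniform(0.0, 2.0 * math.pi)
    x, y = track[-1]
    track.append((x + math.cos(ang), y + math.sin(ang)))

def speed(positions, dt):
    # mean speed = total path length / elapsed time
    path = sum(math.dist(p, q) for p, q in zip(positions, positions[1:]))
    return path / (dt * (len(positions) - 1))

full = speed(track, 1)
est = {}
for n_fish in (2, 4, 8, 16):       # fix interval grows with fish tracked
    est[n_fish] = speed(track[::n_fish], n_fish)
    print(f"{n_fish:2d} fish: speed {est[n_fish]:.3f} "
          f"({100 * (1 - est[n_fish] / full):.0f}% underestimate)")
```

As in the study, the underestimate grows with the number of fish "tracked", and the ratio between subsampled and full-resolution estimates is the kind of correction factor the abstract proposes.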

4.
Mathematical models in biology are powerful tools for the study and exploration of complex dynamics. Nevertheless, bringing theoretical results to an agreement with experimental observations involves acknowledging a great deal of uncertainty intrinsic to our theoretical representation of a real system. Proper handling of such uncertainties is key to the successful usage of models to predict experimental or field observations. This problem has been addressed over the years by many tools for model calibration and parameter estimation. In this article we present a general framework for uncertainty analysis and parameter estimation that is designed to handle uncertainties associated with the modeling of dynamic biological systems while remaining agnostic as to the type of model used. We apply the framework to fit an SIR-like influenza transmission model to 7 years of incidence data in three European countries: Belgium, the Netherlands and Portugal.
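A minimal sketch of an SIR-type transmission model of the kind fitted above, integrated with a simple Euler scheme. The parameter values (beta, gamma, giving R0 = 2) are illustrative, not the fitted estimates for Belgium, the Netherlands or Portugal.

```python
# Minimal SIR sketch integrated with an explicit Euler scheme. beta and
# gamma are illustrative (R0 = 2), not the paper's fitted estimates.
def sir(beta, gamma, s0=0.999, i0=0.001, days=150, dt=0.1):
    s, i, r = s0, i0, 0.0
    daily_incidence = []
    steps_per_day = int(1.0 / dt)
    for step in range(int(days / dt)):
        new_inf = beta * s * i * dt      # S -> I flow over one step
        new_rec = gamma * i * dt         # I -> R flow over one step
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        if step % steps_per_day == 0:
            daily_incidence.append(new_inf / dt)   # daily incidence rate
    return daily_incidence

curve = sir(beta=0.4, gamma=0.2)
peak_day = max(range(len(curve)), key=curve.__getitem__)
print(f"epidemic peaks around day {peak_day}")
```

In a calibration framework like the one described, `beta` and `gamma` would be treated as uncertain and the simulated incidence curve compared against observed data.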

5.
Fluorescent nanoparticles (FNPs) have been widely used in chemistry and medicine for decades, but their employment in biology is relatively recent. Past reviews on FNPs have focused on chemical, physical or medical uses, making the extrapolation to biological applications difficult. In biology, FNPs have largely been used for biosensing and molecular tracking. However, concerns over toxicity in early types of FNPs, such as cadmium-containing quantum dots (QDs), may have prevented wide adoption. Recent developments, especially in non-Cd-containing FNPs, have alleviated toxicity problems, facilitating the use of FNPs for addressing ecological, physiological and molecule-level processes in biological research. Standardised protocols from synthesis to application and interdisciplinary approaches are critical for establishing FNPs in the biologists' tool kit. Here, we present an introduction to FNPs, summarise their use in biological applications, and discuss technical issues such as data reliability and biocompatibility. We assess whether biological research can benefit from FNPs and suggest ways in which FNPs can be applied to answer questions in biology. We conclude that FNPs have a great potential for studying various biological processes, especially tracking, sensing and imaging in physiology and ecology.

6.
The development and successful application of high-throughput technologies are transforming biological research. The large quantities of data being generated by these technologies have led to the emergence of systems biology, which emphasizes large-scale, parallel characterization of biological systems and integration of fragmentary information into a coherent whole. Complementing the reductionist approach that has dominated biology for the last century, mathematical modeling is becoming a powerful tool to achieve an integrated understanding of complex biological systems and to guide experimental efforts of engineering biological systems for practical applications. Here I give an overview of current mainstream approaches in modeling biological systems, highlight specific applications of modeling in various settings, and point out future research opportunities and challenges.

7.
Deciphering the biological networks underlying complex phenotypic traits (e.g., human disease) is undoubtedly crucial for understanding the underlying molecular mechanisms and developing effective therapeutics. Due to network complexity and the relatively small number of available experiments, data-driven modeling is a great challenge for deducing the functions of genes/proteins in the network and in phenotype formation. We propose a novel knowledge-driven systems biology method that utilizes qualitative knowledge to construct a Dynamic Bayesian Network (DBN) representing the biological network underlying a specific phenotype. Edges in this network depict physical interactions between genes and/or proteins. A qualitative knowledge model first translates typical molecular interactions into constraints when resolving the DBN structure and parameters. Therefore, the uncertainty of the network is restricted to a subset of models consistent with the qualitative knowledge. All models satisfying the constraints are considered candidates for the underlying network, and these consistent models are used to perform quantitative inference. By in silico inference, we can predict phenotypic traits upon genetic interventions and perturbations of the network. We applied our method to analyze the puzzling mechanism of the breast cancer cell proliferation network, and we accurately predicted cancer cell growth rates upon manipulation of (anti)cancerous marker genes/proteins.

8.
9.
Recent successes of systems biology have clarified that biological functionality is multilevel. We point out that this fact makes it necessary to revise popular views about macromolecular functions and to distinguish between local, physico-chemical functions and global, biological functions. Our analysis shows that physico-chemical functions are merely tools of biological functionality. This result sheds new light on the origin of cellular life, indicating that in evolutionary history the assignment of biological functions to cellular ingredients plays a crucial role. In this wider picture, even if the aggregation of chance mutations of replicator molecules and spontaneously self-assembled proteins led to the formation of a system identical with a living cell in all physical respects but devoid of biological functions, it would remain an inanimate physical system, a pseudo-cell or zombie-cell, but not a viable cell. In origin-of-life scenarios a fundamental circularity arises: if cells are the minimal units of life, assignments of cellular functions require the presence of cells, and vice versa. Resolution of this dilemma requires distinguishing between physico-chemical and biological symbols, as well as between physico-chemical and biological information. Our analysis of the concepts of symbol, rule and code suggests that they all rely implicitly on biological laws or principles. We show that the problem is how to establish physico-chemically arbitrary rules assigning biological functions without the presence of living organisms. We propose a solution to that problem with the help of a generalized action principle and biological harnessing of quantum uncertainties. By our proposal, biology is an autonomous science having its own fundamental principle; the biological principle ought not to be regarded as an emergent phenomenon. It can guide chemical evolution towards the biological one, progressively assigning greater complexity and functionality to macromolecules and systems of macromolecules at all levels of organization. This solution explains some perplexing facts and posits a new context for thinking about the problems of the origin of life and mind.

10.
Besides the often-quoted complexity of cellular networks, the prevalence of uncertainties about components, interactions, and their quantitative features provides a largely underestimated hallmark of current systems biology. This uncertainty impedes the development of mechanistic mathematical models to achieve a true systems-level understanding. However, there is increasing evidence that theoretical approaches from diverse scientific domains can extract relevant biological knowledge efficiently, even from poorly characterized biological systems. As a common denominator, the methods focus on structural, rather than more detailed, kinetic network properties. A deeper understanding, better scaling, and the ability to combine the approaches pose formidable challenges for future theory developments.

11.
A goal of synthetic biology is to make biological systems easier to engineer. One of the aims is to design, with nanometer-scale precision, biomaterials with well-defined properties. The surface-layer protein SbpA forms 2D arrays naturally on the cell surface of Lysinibacillus sphaericus, but also as the purified protein in solution upon the addition of divalent cations. The high propensity of SbpA to form crystalline arrays, which can be simply controlled by divalent cations, and the possibility to genetically alter the protein, make SbpA an attractive molecule for synthetic biology. To be a useful tool, however, it is important that a simple protocol can be used to produce recombinant wild-type and modified SbpA in large quantities and in a biologically active form. The present study addresses this requirement by introducing a mild and non-denaturing purification protocol to produce milligram quantities of recombinant, active SbpA.

12.
In a recent epidemiological study, Bayesian uncertainties on lung doses have been calculated to determine lung cancer risk from occupational exposures to plutonium. These calculations used a revised version of the Human Respiratory Tract Model (HRTM) published by the ICRP. In addition to the Bayesian analyses, which give probability distributions of doses, point estimates of doses (single estimates without uncertainty) were also provided for that study using the existing HRTM as it is described in ICRP Publication 66; these are to be used in a preliminary analysis of risk. To infer the differences between the point estimates and Bayesian uncertainty analyses, this paper applies the methodology to former workers of the United Kingdom Atomic Energy Authority (UKAEA), who constituted a subset of the study cohort. The resulting probability distributions of lung doses are compared with the point estimates obtained for each worker. It is shown that mean posterior lung doses are around two- to fourfold higher than point estimates and that uncertainties on doses vary over a wide range, greater than two orders of magnitude for some lung tissues. In addition, we demonstrate that uncertainties on the parameter values, rather than the model structure, are largely responsible for these effects. Of these it appears to be the parameters describing absorption from the lungs to blood that have the greatest impact on estimates of lung doses from urine bioassay. Therefore, accurate determination of the chemical form of inhaled plutonium and the absorption parameter values for these materials is important for obtaining reliable estimates of lung doses and hence risk from occupational exposures to plutonium.
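One generic way such a posterior-mean/point-estimate gap can arise: if an uncertain dose is lognormally distributed around a median-based point estimate, its posterior mean exceeds that point estimate by a factor of exp(sigma^2/2). The sketch below is a textbook illustration with invented numbers, not the HRTM calculation.

```python
import math
import random

random.seed(0)

# Generic illustration (not the HRTM calculation): if the uncertain dose is
# lognormal about a median-based point estimate, the posterior mean exceeds
# the point estimate by exp(sigma^2 / 2). sigma is invented.
sigma = 1.5                        # log-scale uncertainty, invented
point_estimate = 1.0               # median-based point estimate (arb. units)
draws = [random.lognormvariate(math.log(point_estimate), sigma)
         for _ in range(100_000)]
posterior_mean = sum(draws) / len(draws)
ratio = posterior_mean / point_estimate
print(f"posterior mean is {ratio:.2f}x the point estimate "
      f"(theory: {math.exp(sigma ** 2 / 2):.2f}x)")
```

With sigma = 1.5 the factor is about 3, in the same two- to fourfold range reported above; skewed dose distributions alone can produce this kind of systematic difference.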

13.
Material flow analysis (MFA) is a widely applied tool to investigate resource and recycling systems of metals and minerals. Owing to data limitations and restricted system understanding, MFA results are inherently uncertain. To demonstrate the systematic implementation of uncertainty analysis in MFA, two mathematical concepts for the quantification of uncertainties were applied to Austrian palladium (Pd) resource flows and evaluated: (1) uncertainty ranges expressed by fuzzy sets and (2) uncertainty ranges defined by normal distributions given as mean values and standard deviations. Whereas normal distributions represent the traditional approach for quantifying uncertainties in MFA, fuzzy sets may offer additional benefits in relation to uncertainty quantification in cases of scarce information. With respect to the Pd case study, the fuzzy representation of uncertain quantities is more consistent with the actual data availability in cases of incomplete databases, and fuzzy sets serve to highlight the effect of uncertainty on resource efficiency indicators derived from the MFA results. For both approaches, data reconciliation procedures offer the potential to reduce uncertainty and evaluate the plausibility of the model results. With respect to Pd resource management, improved formal collection of end-of-life (EOL) consumer products is identified as a key factor in increasing the recycling efficiency. In particular, the partial export of EOL vehicles represents a substantial loss of Pd from the Austrian resource system, whereas approximately 70% of the Pd in the EOL consumer products is recovered in waste management. In conclusion, systematic uncertainty analysis is an integral part of MFA required to provide robust decision support in resource management.
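The two uncertainty representations can be contrasted on a single toy mass balance (flow_total = flow_a + flow_b). All flow values below are invented, not the Austrian Pd data: normal standard deviations combine in quadrature, while triangular fuzzy numbers combine by interval arithmetic, so their support widths add outright.

```python
import math

# One toy MFA balance, flow_total = flow_a + flow_b, under the two
# uncertainty representations compared in the abstract. All values invented.

# (1) Normal distributions: means add, standard deviations in quadrature.
mean_a, sd_a = 120.0, 10.0
mean_b, sd_b = 80.0, 15.0
mean_t, sd_t = mean_a + mean_b, math.hypot(sd_a, sd_b)

# (2) Triangular fuzzy numbers (min, mode, max): interval arithmetic,
# so support widths add outright -- more conservative with scarce data.
fa = (100.0, 120.0, 135.0)
fb = (55.0, 80.0, 110.0)
ft = tuple(x + y for x, y in zip(fa, fb))

print(f"normal: {mean_t:.0f} +/- {sd_t:.1f}")
print(f"fuzzy : min={ft[0]:.0f} mode={ft[1]:.0f} max={ft[2]:.0f}")
```

The wider, asymmetric fuzzy interval reflects the point made above: with scarce information, fuzzy ranges track what the data actually support rather than implying a precisely known standard deviation.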

14.
High-throughput technologies have led to the generation of an increasing amount of data in different areas of biology. Datasets capturing the cell's response to its intra- and extra-cellular microenvironment allow such data to be incorporated as signed and directed graphs or influence networks. These prior knowledge networks (PKNs) represent our current knowledge of the causality of cellular signal transduction. New signalling data are often examined and interpreted in conjunction with PKNs. However, different biological contexts, such as cell types or disease states, may have distinct variants of signalling pathways, resulting in the misinterpretation of new data. The identification of inconsistencies between measured data and signalling topologies, as well as the training of PKNs using context-specific datasets (PKN contextualization), are necessary conditions for constructing reliable, predictive models, and both remain challenges in the systems biology of cell signalling. Here we present PRUNET, a user-friendly software tool designed to address the contextualization of a PKN to specific experimental conditions. As input, the algorithm takes a PKN and the expression profiles of two given stable steady states or cellular phenotypes. The PKN is iteratively pruned using an evolutionary algorithm to perform an optimization process. This optimization rests on a match between predicted attractors in a discrete logic (Boolean) model and a Booleanized representation of the phenotypes, within a population of alternative subnetworks that evolves iteratively. We validated the algorithm by applying PRUNET to four biological examples and using the resulting contextualized networks to predict missing expression values and to simulate well-characterized perturbations. PRUNET constitutes a tool for the automatic curation of a PKN to make it suitable for describing biological processes under particular experimental conditions. The general applicability of the implemented algorithm makes PRUNET suitable for a variety of biological processes, for instance cellular reprogramming or transitions between healthy and disease states.
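The core idea, matching attractors of pruned Boolean subnetworks against a target phenotype, can be sketched on a toy 3-node model. The rules, edges and target below are invented, and PRUNET itself searches large PKNs with an evolutionary algorithm; exhaustive enumeration is feasible here only because the example is tiny.

```python
from itertools import product

# Toy version of the pruning idea: enumerate subnetworks of a tiny Boolean
# model and keep those whose fixed-point attractor matches the target
# phenotype. Rules, edges and target are invented; PRUNET itself searches
# large networks with an evolutionary algorithm instead of enumeration.
EDGES = [("A", "B"), ("B", "C"), ("A", "C")]
TARGET = {"A": 1, "B": 1, "C": 0}

def update(state, edges):
    new = dict(state)
    # B is activated by A when the A->B edge is present
    new["B"] = state["A"] if ("A", "B") in edges else state["B"]
    # C is ON only if activated by B and not inhibited by A
    act = state["B"] if ("B", "C") in edges else 0
    inh = state["A"] if ("A", "C") in edges else 0
    new["C"] = int(act and not inh)
    return new

def attractor(edges):
    state = {"A": 1, "B": 0, "C": 0}     # initial condition
    for _ in range(10):                  # tiny net: converges in a few steps
        nxt = update(state, edges)
        if nxt == state:
            return state                 # fixed-point attractor
        state = nxt
    return None                          # cyclic attractor: discard

consistent = []
for keep in product([0, 1], repeat=len(EDGES)):
    edges = {e for e, k in zip(EDGES, keep) if k}
    if attractor(edges) == TARGET:
        consistent.append(edges)
print(f"{len(consistent)} subnetwork(s) reproduce the target phenotype")
```

The surviving subnetworks all retain the A->B edge, mirroring how contextualization identifies which interactions are required for the observed phenotype.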

15.
The Guide to the Expression of Uncertainty in Measurement (usually referred to as the GUM) provides the basic framework for evaluating uncertainty in measurement. The GUM, however, does not always provide clearly identifiable procedures suitable for medical laboratory applications, particularly when internal quality control (IQC) is used to derive most of the uncertainty estimates. The GUM modelling approach requires advanced mathematical skills for many of its procedures, but Monte Carlo simulation (MCS) can be used as an alternative for many medical laboratory applications. In particular, calculations for determining how uncertainties in the input quantities to a functional relationship propagate through to the output can be accomplished using a readily available spreadsheet such as Microsoft Excel. The MCS procedure uses algorithmically generated pseudo-random numbers which are then forced to follow a prescribed probability distribution. When IQC data provide the uncertainty estimates, the normal (Gaussian) distribution is generally considered appropriate, but MCS is by no means restricted to this particular case. With input variations simulated by random numbers, the functional relationship then provides the corresponding variations in the output in a manner which also yields its probability distribution. The MCS procedure thus provides output uncertainty estimates without the need for the differential equations associated with GUM modelling. The aim of this article is to demonstrate the ease with which Microsoft Excel (or a similar spreadsheet) can be used to provide an uncertainty estimate for measurands derived through a functional relationship. In addition, we also consider the relatively common situation where an empirically derived formula includes one or more 'constants', each of which has an empirically derived numerical value. Such empirically derived 'constants' must also have associated uncertainties which propagate through the functional relationship and contribute to the combined standard uncertainty of the measurand.
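The spreadsheet MCS procedure translates directly into a few lines of Python: draw each input quantity, including an empirically derived 'constant', from its assumed distribution, push the draws through the functional relationship, and read the combined standard uncertainty off the output sample. The relationship y = k*a/b and all numbers below are invented for illustration.

```python
import random
import statistics

random.seed(2024)

# Hypothetical functional relationship y = k * a / b, where k is an
# empirically derived 'constant' carrying its own standard uncertainty.
# All distributions and numbers are invented for illustration.
N = 50_000
ys = []
for _ in range(N):
    a = random.gauss(50.0, 1.5)    # input quantity (e.g., IQC-derived u)
    b = random.gauss(25.0, 0.8)    # second input quantity
    k = random.gauss(2.00, 0.05)   # empirical constant, also uncertain
    ys.append(k * a / b)

ys.sort()
mean = statistics.fmean(ys)
u = statistics.stdev(ys)           # combined standard uncertainty
lo, hi = ys[int(0.025 * N)], ys[int(0.975 * N)]
print(f"y = {mean:.3f}, u(y) = {u:.3f}, 95% interval [{lo:.3f}, {hi:.3f}]")
```

No partial derivatives are needed: the sorted sample also gives the coverage interval directly, which is exactly the advantage over the GUM modelling approach described above.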

16.
17.
Identifying the reactions that govern a dynamical biological system is a crucial but challenging task in systems biology. In this work, we present a data-driven method to infer the underlying biochemical reaction system governing a set of observed species concentrations over time. We formulate the problem as a regression over a large, but limited, mass-action-constrained reaction space and utilize sparse Bayesian inference via the regularized horseshoe prior to produce robust, interpretable biochemical reaction networks, along with uncertainty estimates of the parameters. The resulting systems of chemical reactions and their posteriors inform the biologist of potentially several reaction systems that can be further investigated. We demonstrate the method on two examples of recovering the dynamics of an unknown reaction system, illustrating the gains in accuracy and the information obtained.
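The library-regression idea can be sketched with plain least squares standing in for the paper's sparse horseshoe prior: fit the observed dA/dt against candidate mass-action terms and threshold small weights. The reaction A -> B, its rate constant, and the two-term candidate library are invented for illustration.

```python
# Library-regression sketch with plain thresholded least squares standing
# in for the paper's sparse horseshoe prior. The reaction A -> B (rate 0.5)
# and the two-term candidate library {A, B} are invented for illustration.
dt, k_true = 0.01, 0.5
A, B = [1.0], [0.0]
for _ in range(500):                      # simulate A -> B with Euler steps
    dA = -k_true * A[-1] * dt
    A.append(A[-1] + dA)
    B.append(B[-1] - dA)

dAdt = [(a2 - a1) / dt for a1, a2 in zip(A, A[1:])]   # observed derivative

# Solve the 2x2 normal equations (X^T X) theta = X^T y by hand,
# where the columns of X are the candidate library terms A and B.
saa = sum(a * a for a in A[:-1])
sab = sum(a * b for a, b in zip(A[:-1], B[:-1]))
sbb = sum(b * b for b in B[:-1])
ta = sum(a * y for a, y in zip(A[:-1], dAdt))
tb = sum(b * y for b, y in zip(B[:-1], dAdt))
det = saa * sbb - sab * sab
theta = ((ta * sbb - tb * sab) / det, (tb * saa - ta * sab) / det)
sparse = [t if abs(t) > 0.05 else 0.0 for t in theta]   # sparsify
print(f"dA/dt ~ {sparse[0]:.3f}*A + {sparse[1]:.3f}*B")
```

The fit recovers only the true term (-0.5*A); the Bayesian horseshoe formulation additionally yields posterior uncertainty on each coefficient, which this least-squares stand-in does not.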

18.
Robustness analysis and tuning of synthetic gene networks

19.
20.
A major advancement in the use of radio telemetry has been the development of automated radio tracking systems (ARTS), which allow animal movements to be tracked continuously. A new ARTS approach is the use of a network of simple radio receivers (nodes) that collect radio signal strength (RSS) values from animal-borne radio transmitters. However, the use of RSS-based localization methods in wildlife tracking research is new, and analytical approaches critical for determining high-quality location data have lagged behind technological developments. We present an analytical approach to optimize RSS-based localization estimates for a node network designed to track fine-scale animal movements in a localized area. Specifically, we test the application of analytical filters (signal strength, distance among nodes) to data from real and simulated node networks that differ in the density and configuration of nodes. We evaluate how different filters and network configurations (density and regularity of node spacing) may influence the accuracy of RSS-based localization estimates. Overall, the use of signal strength and distance-based filters resulted in a 3- to 9-fold increase in median accuracy of location estimates over unfiltered estimates, with the most stringent filters providing location estimates with a median accuracy ranging from 28 to 73 m depending on the configuration and spacing of the node network. We found that distance filters performed significantly better than RSS filters for networks with evenly spaced nodes, but the advantage diminished when nodes were less uniformly spaced within a network. Our results not only provide analytical approaches to greatly increase the accuracy of RSS-based localization estimates, as well as the computer code to do so, but also provide guidance on how to best configure node networks to maximize the accuracy and capabilities of such systems for wildlife tracking studies.
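A sketch of RSS-based localization with a simple signal-strength filter: receiver nodes on a grid, a log-distance path-loss model, and a weighted-centroid estimator, comparing all nodes against the k strongest signals. The grid layout, path-loss parameters and filter are all invented and this is not the paper's algorithm or published code.

```python
import math
import random

random.seed(5)

# Sketch of RSS-based localization on a grid of receiver nodes using a
# log-distance path-loss model; the grid layout, path-loss parameters and
# "keep the k strongest signals" filter are invented for illustration.
NODES = [(float(x), float(y)) for x in range(0, 500, 100)
         for y in range(0, 500, 100)]          # 25 nodes, 100 m spacing
P0, PLE, NOISE = -40.0, 2.5, 4.0   # dBm at 1 m, path-loss exponent, noise sd

def rss(node, tag):
    d = max(math.dist(node, tag), 1.0)
    return P0 - 10 * PLE * math.log10(d) + random.gauss(0.0, NOISE)

def localize(readings, k=None):
    if k is not None:               # signal-strength filter: k strongest
        readings = sorted(readings, key=lambda ns: ns[1], reverse=True)[:k]
    w = [(n, 10 ** (s / 10)) for n, s in readings]   # linearized power
    tot = sum(wi for _, wi in w)
    return (sum(n[0] * wi for n, wi in w) / tot,
            sum(n[1] * wi for n, wi in w) / tot)

tag = (230.0, 310.0)                # true transmitter location (metres)
errs_raw, errs_filt = [], []
for _ in range(200):
    readings = [(n, rss(n, tag)) for n in NODES]
    errs_raw.append(math.dist(localize(readings), tag))
    errs_filt.append(math.dist(localize(readings, k=4), tag))

median_raw = sorted(errs_raw)[100]
median_filt = sorted(errs_filt)[100]
print(f"median error: all nodes {median_raw:.0f} m, "
      f"4 strongest {median_filt:.0f} m")
```

Distance-among-nodes filters of the kind evaluated above would instead discard readings far from the strongest-signal node; both strategies aim to keep distant, noise-dominated receivers from dragging the centroid.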
