Similar Documents
20 similar documents found (search time: 46 ms)
1.
The availability of high-throughput genomic data has motivated the development of numerous algorithms to infer gene regulatory networks. The validity of an inference procedure must be evaluated relative to its ability to infer a model network close to the ground-truth network from which the data have been generated. The input to an inference algorithm is a sample set of data and its output is a network. Since input, output, and algorithm are mathematical structures, the validity of an inference algorithm is a mathematical issue. This paper formulates validation in terms of a semi-metric distance between two networks, or the distance between two structures of the same kind deduced from the networks, such as their steady-state distributions or regulatory graphs. The paper sets up the validation framework, provides examples of distance functions, and applies them to some discrete Markov network models. It also considers approximate validation methods based on data for which the generating network is not known, the kind of situation one faces when using real data. Key Words: Epistemology, gene network, inference, validation.
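The steady-state-distribution distance mentioned above can be illustrated with a minimal sketch (hypothetical two-state transition matrices; total variation distance is one example of a semi-metric, since two distinct networks can share a steady state):

```python
import numpy as np

def steady_state(P):
    """Stationary distribution of a row-stochastic transition matrix P,
    obtained as the left eigenvector for eigenvalue 1."""
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    return pi / pi.sum()

def tv_distance(P1, P2):
    """Semi-metric between two Markov networks: total variation distance
    between their steady-state distributions (zero does not imply P1 == P2)."""
    return 0.5 * np.abs(steady_state(P1) - steady_state(P2)).sum()

P_true = np.array([[0.9, 0.1], [0.2, 0.8]])   # hypothetical ground-truth network
P_inf  = np.array([[0.8, 0.2], [0.2, 0.8]])   # hypothetical inferred network
print(tv_distance(P_true, P_inf))
```

Because only the steady states are compared, this distance validates inferred long-run behaviour rather than the transition structure itself.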

2.
3.
Acetabular fracture presents a challenging situation to trauma surgeons today due to its complexity. Finite element (FE) models can be of great help as they can improve the surgical planning and post-surgery patient management for those with acetabular fractures. We have developed a non-linear finite element model of the pelvis and validated its fracture prediction capability with synthetic polyurethane pelves. A mechanical experiment was performed with the synthetic bones, and fracture loads and patterns were observed for two different loading cases. Fracture loads predicted by our FE model were within one standard deviation of the experimental fracture loads for both loading cases. The incipient fracture pattern predicted by the model also resembled the actual pattern from the experiment. Although it is not a complete validation with human cadaver bones, the good agreement between model predictions and experimental results indicates the validity of our approach in using a non-linear FE formulation along with contact conditions in predicting bone fractures.

4.
Purpose: Radiomic models have been demonstrated to have acceptable discrimination capability for detecting lymph node metastasis (LNM). We aimed to develop a computed tomography–based radiomic model and validate its usefulness in the prediction of normal-sized LNM at node level in cervical cancer. Methods: A total of 273 LNs of 219 patients from 10 centers were evaluated in this study. We randomly divided the LNs from the 2 centers with the largest number of LNs into the training and internal validation cohorts, and the rest served as the external validation cohort. Radiomic features were extracted from the arterial and venous phase images. We trained an artificial neural network (ANN) to develop two single-phase models. A radiomic model reflecting the features of two-phase images was also built for directly predicting LNM in cervical cancer. Moreover, four state-of-the-art methods were used for comparison. The performance of all models was assessed using the area under the receiver operating characteristic curve (AUC). Results: Among the models we built, the models combining the features of two phases surpassed the single-phase models, and the models generated by ANN had better performance than the others. We found that the radiomic model achieved the highest AUCs of 0.912 and 0.859 in the training and internal validation cohorts, respectively. In the external validation cohort, the AUC of the radiomic model was 0.800. Conclusion: We constructed a radiomic model that exhibited great ability in the prediction of LNM. The application of the model could optimize clinical staging and decision-making.
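The AUC used to assess such models can be computed directly from the Mann-Whitney rank statistic: the probability that a randomly chosen positive node outranks a randomly chosen negative one. A minimal sketch with hypothetical node-level labels and scores (not the paper's data or model):

```python
import numpy as np

def auc(y_true, scores):
    """AUC via the Mann-Whitney U statistic: fraction of (positive, negative)
    pairs where the positive scores higher (ties count as 0.5)."""
    y_true = np.asarray(y_true, dtype=bool)
    pos = np.asarray(scores)[y_true]
    neg = np.asarray(scores)[~y_true]
    wins = (pos[:, None] > neg[None, :]).sum() \
         + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))

# hypothetical node-level labels (1 = metastatic LN) and predicted scores
labels = [1, 1, 1, 0, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.5, 0.3, 0.2, 0.1]
print(auc(labels, scores))
```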

5.
We propose a structure for presenting risk assessments with the purpose of enhancing the transparency of the selection process of scientific theories and models derived from them. The structure has two stages, with 7 steps, where the stages involve two types of theories: core and auxiliary, which need to be identified in order to explain and evaluate observations and predictions. Core theories are those that are “fundamental” to the phenomena being observed, whereas auxiliary theories are those that describe or explain the actual observation process of the phenomena. The formulation of a scientific theory involves three constitutive components or types of judgments: explanative, evaluative, and regulative or aesthetic, driven by reason. Two perspectives guided us in developing the proposed structure: (1) In a risk assessment, explanations based on notions of causality can be used as a tool for developing models and predictions of possible events outside the range of direct experience. The use of causality for development of models is based on judgments, reflecting regulative or aesthetic conceptualizations of different phenomena and how they (should) fit together in the world. (2) Weight of evidence evaluation should be based on falsification principles for excluding models, rather than validation or justification principles that select the best or nearly best-fitting models. Falsification entails discussion that identifies challenges to proposed models, and reconciles apparent inconsistencies between models and data. Based on the discussion of these perspectives, the 7 steps of the structure are: the first stage for core theories, (A) scientific concepts, (B) causality network, and (C) mathematical model; and the second stage for auxiliary theories, (D) data interpretation, (E) statistical model, (F) evaluation (weight of evidence), and (G) reconciliation, which includes the actual decision formulation.

6.
Metabolic system modeling for model-based glycaemic control is becoming increasingly important. Few metabolic system models are clinically validated for both fit to the data and prediction ability. This research introduces a new form of pharmacodynamic (PD) surface comparison for model analysis and validation. These 3D surfaces are developed for 3 clinically validated models and 1 model with an added saturation dynamic. The models include the well-known Minimal Model. They are fit to two different data sets of clinical PD data from hyperinsulinaemic clamp studies at euglycaemia and/or hyperglycaemia. The models are fit to the first data set to determine an optimal set of population parameters. The second data set is used to test trend prediction of the surface modeling as it represents a lower insulin sensitivity cohort and should thus require only scaling in these (or related) parameters to match this data set. This particular approach clearly highlights differences in modeling methods, and the model dynamics utilized that may not appear as clearly in other fitting or prediction validation methods. Across all models saturation of insulin action is seen to be an important determinant of prediction and fit quality. In particular, the well-reported under-modeling of insulin sensitivity in the Minimal Model can be seen in this context to be a result of a lack of saturation dynamics, which in turn affects its ability to detect differences between cohorts. The overall approach of examining PD surfaces is seen to be an effective means of analyzing and thus validating a metabolic model's inherent dynamics and basic trend prediction on a population level, but is not a replacement for data-driven, patient-specific fit and prediction validation for clinical use. The overall method presented could be readily generalized to similar PD systems and therapeutics.
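The role of saturation dynamics can be illustrated with a toy comparison between a linear, Minimal-Model-style insulin action term and a Michaelis-Menten-style saturating one (illustrative functional forms and parameter values only, not the clinical models compared in the study):

```python
import numpy as np

def linear_action(I, sI):
    """Linear insulin action: effect grows without bound in insulin I."""
    return sI * I

def saturating_action(I, sI, I_half):
    """Michaelis-Menten-style saturation (illustrative): effect approaches
    the plateau sI * I_half asymptotically at high insulin."""
    return sI * I_half * I / (I_half + I)

# one slice of a PD "surface": insulin action across an insulin range
I = np.linspace(0.0, 1000.0, 101)    # hypothetical insulin range (mU/L)
lin = linear_action(I, sI=1e-3)
sat = saturating_action(I, sI=1e-3, I_half=100.0)
print(lin[-1], sat[-1])   # the saturating model plateaus at high insulin
```

At high insulin the two models diverge sharply, which is why a surface over the full clamp range exposes saturation effects that a fit at a single operating point can miss.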

7.
Dataset partitioning and validation techniques are required in all artificial neural network based waste models. However, there is currently no consensual approach on the validation techniques. This study examines the effects of three time series nested forward validation techniques (rolling origin, RO; rolling window, RW; and growing window, GW) on total municipal waste disposal estimates using recurrent neural network (RNN) models, and benchmarks model performance with respect to multiple linear regression (MLR) models. Validation selection techniques appear important to waste disposal time series model construction and evaluation. Sample size is found to be an important factor in model accuracy for both RNN and MLR models. Better performance in Trial RW4 is observed, probably due to a more consistent testing set in 2019. Overall, the MAPE of the waste disposal models ranged from 10.4% to 12.7%. Both GW and RO validation techniques appear appropriate for RNN waste models. However, MLR waste models are more sensitive to the dataset characteristics, and the RO validation technique appears more suitable for MLR models. It is found that data characteristics are more important than training period duration. It is recommended that dataset normality and skewness be examined for waste disposal modeling.
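One common reading of the growing-window and rolling-window schemes can be sketched as index-split generators; the exact windowing and origin handling used in the study may differ:

```python
def growing_window(n, initial, horizon=1):
    """Training set expands from the start; test on the next `horizon` points."""
    splits = []
    for end in range(initial, n - horizon + 1):
        splits.append((list(range(0, end)), list(range(end, end + horizon))))
    return splits

def rolling_window(n, window, horizon=1):
    """Fixed-size training window slides forward; test on the next points."""
    splits = []
    for end in range(window, n - horizon + 1):
        splits.append((list(range(end - window, end)),
                       list(range(end, end + horizon))))
    return splits

# six time steps, three training points to start, one-step-ahead testing
print(growing_window(n=6, initial=3))
print(rolling_window(n=6, window=3))
```

Both schemes respect temporal order (no test point ever precedes its training data), which is what distinguishes nested forward validation from random partitioning.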

8.
Guillaume Martin, Genetics, 2014, 197(1): 237-255
Models relating phenotype space to fitness (phenotype–fitness landscapes) have seen important developments recently. They can roughly be divided into mechanistic models (e.g., metabolic networks) and more heuristic models like Fisher’s geometrical model. Each has its own drawbacks, but both yield testable predictions on how the context (genomic background or environment) affects the distribution of mutation effects on fitness and thus adaptation. Both have received some empirical validation. This article aims at bridging the gap between these approaches. A derivation of the Fisher model “from first principles” is proposed, where the basic assumptions emerge from a more general model, inspired by mechanistic networks. I start from a general phenotypic network relating unspecified phenotypic traits and fitness. A limited set of qualitative assumptions is then imposed, mostly corresponding to known features of phenotypic networks: a large set of traits is pleiotropically affected by mutations and determines a much smaller set of traits under optimizing selection. Otherwise, the model remains fairly general regarding the phenotypic processes involved or the distribution of mutation effects affecting the network. A statistical treatment and a local approximation close to a fitness optimum yield a landscape that is effectively the isotropic Fisher model or its extension with a single dominant phenotypic direction. The fit of the resulting alternative distributions is illustrated in an empirical data set. These results bear implications on the validity of Fisher’s model’s assumptions and on which features of mutation fitness effects may vary (or not) across genomic or environmental contexts.
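The qualitative behaviour of the isotropic Fisher model can be reproduced with a short simulation (Gaussian log-fitness and Gaussian mutation effects are standard FGM assumptions; the dimensions and scales below are arbitrary): at the optimum every mutation is deleterious, while away from it a fraction below one-half is beneficial.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness_effects(z, n_traits, sigma, n_mut=100000):
    """Selection coefficients of random mutations in the isotropic FGM:
    log-fitness m(z) = -||z||^2 / 2, mutation effects dz ~ N(0, sigma^2 I)."""
    dz = rng.normal(0.0, sigma, size=(n_mut, n_traits))
    m0 = -0.5 * np.sum(z ** 2)
    m1 = -0.5 * np.sum((z + dz) ** 2, axis=1)
    return m1 - m0

at_optimum = fitness_effects(np.zeros(10), 10, sigma=0.05)
away       = fitness_effects(np.full(10, 0.3), 10, sigma=0.05)
print(at_optimum.mean(), (away > 0).mean())
```

This is the context dependence the abstract refers to: the same mutational process yields different distributions of fitness effects depending on where the genotype sits in phenotype space.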

9.
10.
We compare two broad types of empirically grounded random network models in terms of their abilities to capture both network features and simulated Susceptible-Infected-Recovered (SIR) epidemic dynamics. The types of network models are exponential random graph models (ERGMs) and extensions of the configuration model. We use three kinds of empirical contact networks, chosen to provide both variety and realistic patterns of human contact: a highly clustered network, a bipartite network and a snowball sampled network of a “hidden population”. In the case of the snowball sampled network we present a novel method for fitting an edge-triangle model. In our results, ERGMs consistently capture clustering as well or better than configuration-type models, but the latter models better capture the node degree distribution. Despite the additional computational requirements to fit ERGMs to empirical networks, the use of ERGMs provides only a slight improvement in the ability of the models to recreate epidemic features of the empirical network in simulated SIR epidemics. Generally, SIR epidemic results from using configuration-type models fall between those from a random network model (i.e., an Erdős-Rényi model) and an ERGM. The addition of subgraphs of size four to edge-triangle type models does improve agreement with the empirical network for smaller densities in clustered networks. Additional subgraphs do not make a noticeable difference in our example, although we would expect the ability to model cliques to be helpful for contact networks exhibiting household structure.
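A configuration-type model can be sketched as uniform stub pairing over a degree sequence, followed by a discrete-time SIR simulation; this is a simplified stand-in for the models compared in the study, with arbitrary parameters:

```python
import random

def configuration_model(degrees, rng):
    """Pair edge stubs uniformly at random to realise a degree sequence
    (self-loops dropped, multi-edges collapsed; fine for illustration)."""
    stubs = [v for v, d in enumerate(degrees) for _ in range(d)]
    rng.shuffle(stubs)
    adj = {v: set() for v in range(len(degrees))}
    for u, v in zip(stubs[::2], stubs[1::2]):
        if u != v:
            adj[u].add(v)
            adj[v].add(u)
    return adj

def sir(adj, beta, gamma, seed, rng):
    """Discrete-time SIR: each infected node infects each susceptible
    neighbour with prob. beta per step and recovers with prob. gamma."""
    S = set(adj) - {seed}
    I, R = {seed}, set()
    while I:
        new_inf = {v for u in I for v in adj[u]
                   if v in S and rng.random() < beta}
        new_rec = {u for u in I if rng.random() < gamma}
        S -= new_inf
        I = (I | new_inf) - new_rec
        R |= new_rec
    return len(R)  # final epidemic size

rng = random.Random(1)
degrees = [3] * 100      # 3-regular degree sequence (sum is even)
adj = configuration_model(degrees, rng)
print(sir(adj, beta=0.5, gamma=0.3, seed=0, rng=rng))
```

Fitting this model to an empirical network amounts to copying its degree sequence, which is why configuration-type models match degree distributions well but capture clustering only through extensions such as edge-triangle models.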

11.
The European Centre for the Validation of Alternative Methods (ECVAM) proposes to make the validation process more flexible, while maintaining its high standards. The various aspects of validation are broken down into independent modules, and the information necessary to complete each module is defined. The data required to assess test validity in an independent peer review, not the process, are thus emphasised. Once the information to satisfy all the modules is complete, the test can enter the peer-review process. In this way, the between-laboratory variability and predictive capacity of a test can be assessed independently. Thinking in terms of validity principles will broaden the applicability of the validation process to a variety of tests and procedures, including the generation of new tests, new technologies (for example, genomics, proteomics), computer-based models (for example, quantitative structure-activity relationship models), and expert systems. This proposal also aims to take into account existing information, defining this as retrospective validation, in contrast to a prospective validation study, which has been the predominant approach to date. This will permit the assessment of test validity by completing the missing information via the relevant validation procedure: prospective validation, retrospective validation, catch-up validation, or a combination of these procedures.

12.
邹应斌, 米湘成, 石纪成, 《生态学报》(Acta Ecologica Sinica), 2004, 24(12): 2967-2972
Using an artificial neural network model, with rice population tillering dynamics as an example, a BP network model of rice growth was trained and evaluated by cross-validation and independent validation, and its results were compared with an accumulated-temperature statistical model, a basic dynamic model, and a composite tillering model. The results show that the neural network model has some extrapolation ability, but that ability depends on a large number of training samples. The neural network fits well because it has many parameters, so training it requires a large sample to keep those parameters from being overfitted. For a network with extrapolation ability, the minimum number of training samples should be greater than 6.75 times, and less than 13.5 times, the number of network parameters. Therefore, when a neural network model includes many input variables, techniques such as principal component analysis or correspondence analysis can be used to condense the input information and correspondingly reduce the number of network parameters. On the other hand, when training samples are insufficient, it is best to use the neural network model only to simulate the same system, and to apply it to extrapolation with caution. Neural network models offer crop-modeling researchers a "point-and-shoot" tool: for agricultural researchers unfamiliar with mathematical modeling, an artificial neural network can substitute for mathematical modeling in simulation experiments; for researchers well versed in mathematical modeling, it is at least a complement and a comparative nonlinear data-processing method.
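The abstract's rule of thumb (training samples between 6.75 and 13.5 times the number of network parameters) is easy to check for a given architecture; the layer sizes below are hypothetical:

```python
def bp_param_count(n_in, n_hidden, n_out):
    """Weights plus biases of a single-hidden-layer BP network."""
    return n_in * n_hidden + n_hidden + n_hidden * n_out + n_out

def sample_range(n_params, low=6.75, high=13.5):
    """Suggested training-sample range: 6.75x to 13.5x the parameter count."""
    return low * n_params, high * n_params

# hypothetical tillering network: 4 inputs, 6 hidden units, 1 output
p = bp_param_count(n_in=4, n_hidden=6, n_out=1)
print(p, sample_range(p))
```

The count also shows why condensing inputs (e.g., via principal component analysis) helps: each input removed cuts `n_hidden` parameters, shrinking the required sample size proportionally.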

13.
In systems and computational biology, much effort is devoted to functional identification of systems and networks at the molecular or cellular scale. However, similarly important networks exist at anatomical scales, such as the tendon network of human fingers: the complex array of collagen fibers that transmits and distributes muscle forces to finger joints. This network is critical to the versatility of the human hand, and its function has been debated since at least the 16th century. Here, we experimentally infer the structure (both topology and parameter values) of this network through sparse interrogation with force inputs. A population of models representing this structure co-evolves in simulation with a population of informative future force inputs via the predator-prey estimation-exploration algorithm. Model fitness depends on their ability to explain experimental data, while the fitness of future force inputs depends on causing maximal functional discrepancy among current models. We validate our approach by inferring two known synthetic Latex networks, and one anatomical tendon network harvested from a cadaver's middle finger. We find that functionally similar but structurally diverse models can exist within a narrow range of the training set and cross-validation errors. For the Latex networks, models with low training set error [<4%] and resembling the known network have the smallest cross-validation errors [~5%]. The low training set [<4%] and cross-validation [<7.2%] errors for models for the cadaveric specimen demonstrate what, to our knowledge, is the first experimental inference of the functional structure of complex anatomical networks. This work expands current bioinformatics inference approaches by demonstrating that sparse, yet informative interrogation of biological specimens holds significant computational advantages in accurate and efficient inference over random testing, or assuming model topology and only inferring parameter values. These findings also hold clues to both our evolutionary history and the development of versatile machines.

14.
Outcome prediction is important for conservation; however, analysis may be hampered by specialist resource deficiencies. Mental modelling techniques offer a potential solution, drawing on accessible sources of knowledge held informally by local stakeholders. Mental models show linked social and ecological variables from the perspectives of community members, whose insights may otherwise be neglected. Currently, an important weakness in conservation mental modelling is inadequate attention paid to real-time model predictive validity. To address this knowledge gap, baseline mental model predictions concerning beaver (Castor fiber) reintroduction in Southwest England were followed up after three years. Participants were invited to submit outcome observations for concept variables identified in their original models, blind to inferences based on model dynamic analysis, so that the two sets of data could be compared. Individual concept values and models were found to show weak and highly inconsistent predictive validity; however, multi-stakeholder aggregated mental models showed consistently strong predictive performance. This finding was enhanced by setting tighter thresholds for inclusion of individual model items in aggregation procedures. Threshold effects can be interpreted as a reflection of greater agreement: tighter thresholds retain more highly shared model components. It is proposed that the enhanced real-time predictive validity of aggregated models is explained by a ‘wisdom of the crowd’ statistical effect, analogous to well-recognised crowd judgement effects observed in relation to much simpler questions. The findings show the scope for stakeholder mental modelling methods as an investigative tool, to supplement more conventional ecosystem assessments in predicting data-poor conservation outcomes.
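The 'wisdom of the crowd' effect invoked above can be demonstrated in a few lines: the mean of many noisy individual predictions is typically far closer to the truth than a typical individual is. The numbers below are synthetic, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(42)

truth = 10.0                                         # hypothetical outcome value
individuals = truth + rng.normal(0.0, 3.0, size=50)  # 50 noisy stakeholder predictions
crowd = individuals.mean()                           # aggregated "crowd" prediction

individual_err = np.abs(individuals - truth).mean()  # typical individual error
crowd_err = abs(crowd - truth)                       # error of the aggregate
print(individual_err, crowd_err)
```

Averaging cancels independent, unbiased errors, which is the statistical core of the crowd effect; tighter inclusion thresholds plausibly help by filtering out items where errors are not independent or not unbiased.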

15.
An overview is presented of the validation process adopted by the European Centre for the Validation of Alternative Methods, with particular emphasis on the central role of the prediction model (PM). The development of an adequate PM is considered to be just as important as the development of an adequate test system, since the validity of an alternative test can only be established when both components (the test system and the PM) have successfully undergone validation. It is argued, however, that alternative tests and their associated PMs do not necessarily need to undergo validation at the same time, and that retrospective validation may be appropriate when a test system is found to be reliable, but the case for its relevance remains to be demonstrated. For an alternative test to be considered "scientifically valid", it is necessary for three conditions to be fulfilled, referred to here as the criteria for scientific relevance, predictive relevance, and reliability. A minimal set of criteria for the acceptance of any PM is defined, but it should be noted that required levels of predictive ability need to be established on a case-by-case basis, taking into account the inherent variability of the alternative and in vivo test data. Finally, in view of the growing shift in emphasis from the use of stand-alone alternative tests to alternative testing strategies, the importance of making the PM an integral part of the testing strategy is discussed.

16.
The objective of this study was to validate the MRI-based joint contact modeling methodology in the radiocarpal joints by comparison of model results with invasive specimen-specific radiocarpal contact measurements from four cadaver experiments. We used a single validation criterion for multiple outcome measures to characterize the utility and overall validity of the modeling approach. For each experiment, a Pressurex film and a Tekscan sensor were sequentially placed into the radiocarpal joints during simulated grasp. Computer models were constructed based on MRI visualization of the cadaver specimens without load. Images were also acquired during the loaded configuration used with the direct experimental measurements. Geometric surface models of the radius, scaphoid and lunate (including cartilage) were constructed from the images acquired without the load. The carpal bone motions from the unloaded state to the loaded state were determined using a series of 3D image registrations. Cartilage thickness was assumed uniform at 1.0 mm with an effective compressive modulus of 4 MPa. Validation was based on experimental versus model contact area, contact force, average contact pressure and peak contact pressure for the radioscaphoid and radiolunate articulations. Contact area was also measured directly from images acquired under load and compared to the experimental and model data. Qualitatively, there was good correspondence between the MRI-based model data and experimental data, with consistent relative size, shape and location of radioscaphoid and radiolunate contact regions. Quantitative data from the model generally compared well with the experimental data for all specimens. Contact area from the MRI-based model was very similar to the contact area measured directly from the images. For all outcome measures except average and peak pressures, at least two specimen models met the validation criteria with respect to experimental measurements for both articulations. Only the model for one specimen met the validation criteria for average and peak pressure of both articulations; however, the experimental measures for peak pressure also exhibited high variability. MRI-based modeling can reliably be used for evaluating the contact area and contact force with similar confidence as in currently available experimental techniques. Average contact pressure and peak contact pressure were more variable across all measurement techniques, and these measures from MRI-based modeling should be used with some caution.

17.
Substance flow analysis (SFA) is a frequently used industrial ecology technique for studying societal metal flows, but it is limited in its ability to inform us about future developments in metal flow patterns and how we can affect them. Equation‐based simulation modeling techniques, such as dynamic SFA and system dynamics, can usefully complement static SFA studies in this respect, but they are also restricted in several ways. The objective of this article is to demonstrate the ability of agent‐based modeling to overcome these limitations and its usefulness as a tool for studying societal metal flow systems. The body of the article summarizes the parallel implementation of two models—an agent‐based model and a system dynamics model—both addressing the following research question: What conditions foster the development of a closed‐loop flow network for metals in mobile phones? The results from in silico experimentation with these models highlight three important differences between agent‐based modeling (ABM) and equation‐based modeling (EBM) techniques. An analysis of how these differences affected the insights that could be extracted from the constructed models points to several key advantages of ABM in the study of metal flow systems. In particular, this analysis suggests that a key advantage of the ABM technique is its flexibility to enable the representation of societal metal flow systems in a more native manner. This added flexibility endows modelers with enhanced leverage to identify options for steering metal flows and opens new opportunities for using the metaphor of an ecosystem to understand metal flow systems more fully.

18.
Validation constitutes a vital process in model development and application, as it ensures the applicability of a model for the intended purposes and trustworthy results within the range of model assumptions. Commonly, independent empirical data sets are statistically compared with the generated model results, which is an adequate approach for models which operate on a single hierarchical level, such as most equation-based models. Individual-based models (IBM) can operate on different organisational levels synchronously and have an inherently complex and variable interaction structure for many applications. Thus a plain comparison of data congruity on the result levels might leave too many questions unanswered. However, a more comprehensive assessment of model validity can require additional investigations which also encompass qualitative and structural relationships. Here we describe a hierarchically structured validation which is oriented towards the investigated context of the model and allows organising the validation process in close relation to the different hierarchical levels which are covered in the model. The context-oriented organisation protocol for validation includes the following steps: (1) assessing the different model levels separately, then (2) applying a set of different techniques such as visual inspection, statistical comparison, involvement of experts, aggregation of data on higher integration levels and experimental validation. The context-oriented approach accounts for the specificity of individual-based models – i.e., the dynamic self-organisation of model outcomes from biologically underpinned individual interactions without an inherent determination of properties on higher hierarchical levels – and extends the potential of the validation process qualitatively, as it allows the assessment of complex structural and causal relations and multi-level feedback processes of the developed models.

19.
Two neural network models, called clustering-RBFNN and clustering-BPNN models, are created for estimating the capacity of a freeway work zone as a function of seventeen different factors, through judicious integration of the subtractive clustering approach with the radial basis function (RBF) and the backpropagation (BP) neural network models. The clustering-RBFNN model has the attractive characteristics of training stability, accuracy, and quick convergence. The results of validation indicate that work zone capacity can be estimated by clustering-neural network models in general with an error of less than 10%, even with limited data available to train the models. The clustering-RBFNN model is used to study several main factors affecting work zone capacity. The results of such parametric studies can assist work zone engineers and highway agencies in creating effective traffic management plans (TMPs) for work zones quantitatively and objectively.
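An RBF network with a linear least-squares readout can be sketched as follows; for brevity, a coarse grid of centers stands in for subtractive clustering, and the data are synthetic rather than work-zone measurements:

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_features(X, centers, width):
    """Gaussian RBF activation for each (sample, center) pair."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

# toy capacity-style data: output depends nonlinearly on two normalized factors
X = rng.uniform(0.0, 1.0, size=(200, 2))
y = np.sin(3.0 * X[:, 0]) + X[:, 1] ** 2

# stand-in for subtractive clustering: a 4x4 grid of RBF centers
g = np.linspace(0.1, 0.9, 4)
centers = np.array([[a, b] for a in g for b in g])

Phi = rbf_features(X, centers, width=0.3)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)   # linear readout weights
pred = Phi @ w
mape_like = np.abs(pred - y).mean() / np.abs(y).mean()
print(mape_like)
```

Because only the readout weights are fitted (the centers and widths are fixed first), training reduces to a linear least-squares problem, which is the source of the training stability and quick convergence noted above.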

20.
I investigate how theoretical assumptions, pertinent to different perspectives and operative during the modeling process, are central in determining how nature is actually taken to be. I explore two different models by Michael Turelli and Steve Frank of the evolution of parasite-mediated cytoplasmic incompatibility, guided, respectively, by Fisherian and Wrightian perspectives. Since the two models can be shown to be commensurable both with respect to mathematics and data, I argue that the differences between them in the (1) mathematical presentation of the models, (2) explanations, and (3) objectified ontologies stem neither from differences in mathematical method nor the employed data, but from differences in the theoretical assumptions, especially regarding ontology, already present in the respective perspectives. I use my "set up, mathematically manipulate, explain, and objectify" (SMEO) account of the modeling process to track the model-mediated imposition of theoretical assumptions. I conclude with a discussion of the general implications of my analysis of these models for the controversy between Fisherian and Wrightian perspectives.

