Similar Documents
20 similar documents found (search time: 31 ms)
1.
Goal, Scope and Background  The main focus in OMNIITOX is on characterisation models for toxicological impacts in a life cycle assessment (LCA) context. The OMNIITOX information system (OMNIITOX IS) is being developed primarily to facilitate characterisation modelling and the calculation of characterisation factors, providing users with information necessary for environmental management and control of industrial systems. The modelling and implementation of operational characterisation models of ecotoxic and human-toxic impacts require the use of data and modelling approaches that often originate from regulatory chemical risk assessment (RA) related disciplines. Hence, there is a need for a concept model for the data and modelling approaches that can be interchanged between these different natural-system modelling contexts. Methods  The concept modelling methodology applied in the OMNIITOX project builds on database design principles and ontological principles, in a consensus-based and iterative process involving participants from the LCA, RA and environmental informatics disciplines. Results  The developed OMNIITOX concept model focuses on the core concepts of substance, nature framework, load, indicator, and mechanism, with supplementary concepts to support them. These refer to the modelled cause, the effect, and the relation between them, aspects inherent in all models used in the disciplines within the scope of OMNIITOX. This structure makes it possible to compare the models on a fundamental level, provides a language for communicating information between the disciplines, and supports assessing whether data and modelling approaches of various levels of detail and complexity can be transparently reused. Conclusions  Current experience from applying the concept model shows that the OMNIITOX concept model improves the structuring of all information needed to describe characterisation models transparently.
From a user perspective, the OMNIITOX concept model aids in understanding the applicability and use of a characterisation model and how to interpret model outputs. Recommendations and Outlook  The concept model provides a tool for structured characterisation modelling, model comparison, model implementation, model quality management, and model usage. Moreover, it could be used to structure any natural-environment cause-effect model concerning impact categories other than toxicity.

2.
The complexity of a system cannot be measured in absolute terms, but only relative to a specified observer. As a result, the operational definition of complexity will differ for different sorts of systems and problems; in many cases it will not take the form of a real-valued function. For the study of evolution we adopt as our definition the information content of the instructions required to build the system. Because we take into account factors such as the properties of the epigenetic system and the laws of physics and chemistry, this is not the same as the information content of the genome as measured in terms of codon or amino acid frequencies. Using this definition, we derive a principle of minimum increase in complexity, and we apply it to explain two well-known phenomena in evolution, Williston's law and parallelism. It is unlikely that any definite trends in the evolution of complex systems can be accounted for by simple optimization principles such as the law of natural selection, as these contain no arrow of time.

3.
Material Flow Analysis (MFA) is a useful method for modeling, understanding, and optimizing sociometabolic systems. Among other things, MFAs can be distinguished by two general system properties: first, they differ in their complexity, which depends on system structure and size; second, they differ in their inherent uncertainty, which arises from limited data quality. In this article, uncertainty and complexity in MFA are approached from a systems perspective and expressed as formally linked phenomena. MFAs are, in a graph-theoretical sense, understood as networks. The uncertainty and complexity of these networks are computed using information measures from the field of theoretical ecology. The size of a system is formalized as a function of its number of flows; it defines the potential information content of an MFA system and serves as the reference against which complexity and uncertainty are gauged. Integrating data quality measures, the uncertainty of an MFA before and after balancing is determined. The actual information content of an MFA is measured by relating its uncertainty to its potential information content. The complexity of a system is expressed based on the configuration of each individual flow in relation to its neighboring flows. The proposed metrics enable different material flow systems to be compared to one another and the role of individual flows within a system to be assessed. They provide information useful for the design of MFAs and for the communication of MFA results. For exemplification, the regional MFAs of aluminum and plastics in Austria are analyzed in this article.
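As a rough, hypothetical illustration of the kind of information measure the abstract refers to (the article's own metrics, which also account for data quality and flow configuration, are more elaborate), the Shannon entropy of normalized flow magnitudes can serve as a first proxy for how evenly material is distributed across an MFA network; the function name and the toy flows below are mine, not the article's:

```python
import math

def flow_entropy(flows):
    """Shannon entropy (in bits) of the normalized flow magnitudes.

    A hypothetical first proxy for MFA network uncertainty/complexity:
    maximal (log2 n) when all n flows carry equal amounts, and zero
    when a single flow dominates the system.
    """
    total = sum(flows)
    probs = [f / total for f in flows if f > 0]
    return -sum(p * math.log2(p) for p in probs)

# four equal flows reach the maximum log2(4) = 2 bits,
# the potential information content of a four-flow system
uniform = flow_entropy([10.0, 10.0, 10.0, 10.0])
skewed = flow_entropy([37.0, 1.0, 1.0, 1.0])  # one dominant flow: far lower
```

Relating the skewed value to the uniform maximum mirrors the abstract's idea of gauging actual against potential information content.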

4.
Fluorescent optical mapping of electrically active cardiac tissues provides a unique method to examine the excitation wave dynamics of underlying action potentials. Such mapping can be viewed as a bridge between cellular-level and organ-systems physiology, e.g., by facilitating the development of advanced theoretical concepts of arrhythmia. We present the design and use of a high-speed, high-resolution optical mapping system composed entirely of "off the shelf" components. The electrical design integrates a 256-element photodiode array with a 16-bit data acquisition system. Proper grounding and shielding at various stages of the design reduce electromagnetic interference. Our mechanical design provides flexibility in terms of mounting positions and applications (use for whole-heart or tissue preparations), while maintaining precise alignment between all optical components. The system software incorporates a user-friendly graphical user interface; e.g., spatially recorded action potentials can be represented as intensity graphs or in strip chart format. Thus, this system is capable of displaying cardiac action potentials with high spatiotemporal resolution. Results from cardiac action potential mapping with intact mouse hearts are provided. It should be noted that this system could be readily configured to study isolated myocardial biopsies (e.g., isolated ventricular trabeculae). We describe the details of a versatile, user-friendly system that could be employed for a multitude of study protocols.

5.
The world of regulatory RNAs is fast expanding into mainstream molecular biology as both a subject of intense mechanistic study and as a tool for functional characterization. The RNA world is one of complex structures that carry out catalysis, sense metabolites and synthesize proteins. The dynamic and structural nature of RNAs presents a whole new set of informatics challenges to the computational community. The ability to relate structure and dynamics to function will be key to understanding this complex world. I review several important classes of structured RNAs that present our community with a series of biologically novel informatics challenges. I also review available informatics tools that have been recently developed in the field.

6.
7.
The determination of the configuration of a protein in three-dimensional (3D) space constitutes one of the major challenges in molecular biology research today. One method consists of choosing, from a database, the protein structure that minimizes an energy function. First, we model the problem in terms of dynamic programming and show that determining the order in which the variables must be considered to minimize the time complexity is an NP-hard problem. Second, we propose a new decomposition algorithm for the threading problem that is based on the connectivity of the graph induced by the 3D structure of a protein. Our decomposition could be used to solve the threading problem. The goal of this paper is to evaluate the intrinsic complexity of a 3D structure, which can be viewed as information that may be incorporated into a solution method. It provides two indexes of complexity (time and space) and determines, in polynomial time, complex components of the 3D structure of a protein.

8.
Recently the terms "codes" and "information" as used in the context of molecular biology have been the subject of much discussion. Here I propose that a variety of structural realism can assist us in rethinking the concepts of DNA codes and information apart from semantic criteria. Using the genetic code as a theoretical backdrop, a necessary distinction is made between codes qua symbolic representations and information qua structure that accords with data. Structural attractors are also shown to be entailed by the mapping relation that any DNA code is a part of (as the domain). In this framework, these attractors are higher-order informational structures that obviate any "DNA-centric" reductionism. In addition to the implications that are discussed, this approach validates the array of coding systems now recognized in molecular biology.

9.
The usefulness of the indicator dye method for the detection of transient enzyme-product intermediates has been very limited due to the near impossibility of resolving the apparent single-exponential time courses resulting from sequences of steps linked by rather similar rate constants. We propose here a novel approach, the proton-product time course method, a procedure which can extract a great deal of the mechanistic information that remains buried in conventional proton release-time course measurements. The method involves nothing more than measuring the ratio, r, of the moles of H+ released to the moles of product formed as a function of time. We derive the theory relating this r function to mechanisms of varying complexity, explore the theoretical behavior of the function in various possible mechanistic situations, and employ the new approach in an experimental system. We demonstrate that the proton/product time course ratio method can provide evidence of the existence of hidden steps in transient-state kinetic studies, that it can determine accurate thermodynamic pK values of the intermediate complexes involved in those steps, and that it can produce time courses of individual intermediates which are obscure to conventional kinetic methods.
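To make the r function concrete, here is a minimal kinetic sketch under an assumed mechanism (not the authors' experimental system): a sequential first-order scheme A -(k1)-> B -(k2)-> C with A(0) = 1, where the proton is released in the first step and the product in the second. Then [H+](t) = 1 - exp(-k1 t), [C](t) follows the standard sequential-reaction solution, and r(t) = [H+]/[C] starts well above 1 (betraying the hidden intermediate B) before decaying to 1:

```python
import math

def r_ratio(t, k1, k2):
    """r(t) = moles H+ released / moles product formed for the assumed
    scheme A -(k1)-> B -(k2)-> C (first-order steps, A(0) = 1, k1 != k2),
    with H+ released in the first step and product C in the second."""
    h_plus = 1.0 - math.exp(-k1 * t)                        # [H+](t)
    product = 1.0 - (k2 * math.exp(-k1 * t)
                     - k1 * math.exp(-k2 * t)) / (k2 - k1)  # [C](t)
    return h_plus / product

# early in the reaction r >> 1, revealing the hidden step;
# at long times every released proton is matched by a product molecule
early = r_ratio(0.1, 1.0, 0.5)
late = r_ratio(100.0, 1.0, 0.5)
```

A single-step mechanism would give r = 1 at all times, so any sustained departure of r(t) from 1 is the signature of a hidden intermediate.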

10.
Chan JY. BioSystems 2012; 108(1-3): 28-33
Recent evidence supports the existence of a mutator phenotype in cancer cells, although the mechanistic basis remains unknown. In this paper, it is shown that this enhanced genetic instability is generated by an amplified measurement uncertainty on genetic information during DNA replication. At baseline, an inherent measurement uncertainty implies an imprecision in the recognition, replication and transfer of genetic information, and forms the basis for an intrinsic genetic instability in all biological cells. Genetic information is contained in the sequence of DNA bases, each existing, due to proton tunnelling, as a coherent superposition of quantum states composed of both the canonical and rare tautomeric forms until decoherence by interaction with DNA polymerase. The result of such a quantum measurement process may be interpreted classically as akin to a Bernoulli trial, whose outcome X is random and can be either of two possibilities, depending on whether the proton is tunnelled (X=1) or not (X=0). This inherent quantum uncertainty is represented by a binary entropy function and quantified in terms of the Shannon information entropy H(X) = -P(X=1) log2 P(X=1) - P(X=0) log2 P(X=0). Enhanced genetic instability may either be derived directly from amplified uncertainty induced by increases in quantum and thermodynamic fluctuation, or arise indirectly from the loss of natural uncertainty-reduction mechanisms.
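The binary entropy function quoted in the abstract is straightforward to compute; a minimal sketch (the function name is mine):

```python
import math

def binary_entropy(p):
    """Shannon entropy H(X), in bits, of a Bernoulli outcome with P(X=1) = p:
    H = -p*log2(p) - (1-p)*log2(1-p).
    Zero when the outcome is certain (p = 0 or 1), maximal (1 bit) at p = 0.5."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

# a rare tautomer (small p) contributes little uncertainty per base;
# amplified fluctuations that push p toward 0.5 drive the entropy,
# and on the paper's account the genetic instability, toward its maximum
h_rare = binary_entropy(0.01)
h_max = binary_entropy(0.5)
```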

11.
Complexity is an important aspect of evolutionary biology, but there are many reasonable concepts of complexity, and its objective measurement is an elusive matter. Here we develop a simple measure of complexity based on counts of elements, incorporating the hierarchical information as represented in anatomical ontologies. Neomorphic and transformational characters are used to identify novelties and individuated morphological regions, respectively. By linking the characters to terms in an anatomical ontology, a node-driven approach is implemented, where a node ontology and a complexity score are inferred from the optimization of individual characters on each ancestral or terminal node. From the atomized vector of character scorings, the anatomical ontology is used to integrate the hierarchical structure of morphology in terminals and ancestors. These node ontologies are used to calculate a measure of complexity that can be traced on phylogenetic trees and is harmonious with usual phylogenetic operations. This strategy is compared with a terminal-driven approach, in which the complexity scores are calculated only for terminals and optimized as a continuous character on the internal nodes. These ideas are applied to a real dataset of 166 araneomorph spider species scored for 393 characters, using the Spider Ontology (SPD, https://bioportal.bioontology.org/ontologies/SPD); complexity scores and transitions are calculated for each node and branch, respectively. This results in a distribution of transitions skewed towards simplification; the transitions in complexity have no apparent correlation with character branch lengths. The node-driven and terminal-driven estimations are generally correlated in the complexity scores, but diverge more in the transition values. The structure of the ontology is used to provide complexity scores for organ systems and body parts of the focal groups.

12.
The meaning of optimality and economy in phylogenetics and evolutionary biology is discussed. It can be shown that the prevailing concepts of optimality and economy are equivocal, as they are not based on strict theoretical positions and have a variable meaning in different theoretical contexts. The ideas of optimality and economy can be considered identical with the expectation of a relatively simple order in a particular field of study. Although there exists no way of inferring one or several methods of solving scientific problems from the presupposed idea of economy and optimality, a lack of motivation for scientific investigations would result if the concepts of economy and optimality in nature were dropped. By reference to several examples, it is shown that the concepts of optimality and economy are only useful against the background of indispensable theories. If there is a shift from one theory to another, a restriction on the use of these concepts is necessary. Optimality and economy in the sense of operations research in engineering or the economic sciences depend on the principle of minimum costs. Both theoretical concepts, technical efficiency in relation to the energy required to run a machine and profit maximisation in an economic framework, must be shown to be realistic assumptions. In the field of biology, processes of optimization and economization are normally discussed under two different views:
  1. The concept of economy is used in cases of functional adaptation when the organism makes good use of the building material which is available to fulfill one (or more) functions. The theoretical background must be seen in the energy-consuming aspect of the organism.
  2. In evolutionary change and phylogeny ‘economization’ and ‘optimization’ are deduced from the evolutionary theory, and evolution is shown to produce a special kind of biological economy in biological systems (Bock & von Wahlert, 1965). The ‘Ökonomie-Prinzip’ or ‘Lesrichtungskriterium’ points out the arguments needed to state a phylogenetic theory and to construct a dendrogram (Peters & Gutmann, 1971).
In every phylogenetic theory concerning the adaptational change in the evolving biological system, an explanation for the function of all stages is required. Only those statements should be accepted as phylogenetic theories which are characterized by the demonstration of the process of economization in the functional relations of the evolving organism. The process of adaptation can be determined by the improved chance of some mutants to propagate their genetic information. In this process all functional systems, in their interrelations (i.a. mutual dependence) and their relation with the environment, add their functional efficiency to the information to be delivered to their progeny, because the more economical biological system in a certain environment will have a better chance to produce offspring. This outcome is affirmed by natural selection, which works on all levels of the evolving biological systems (Gutmann & Peters, 1973). Nevertheless, a judgment about adaptation cannot be taken as a scale of measurement in the phylogenetic process. The conditions in the organism itself and in the environment, or in the organic system alone, can change in so profound a manner that the marginal conditions of the earlier stages of the process of adaptation are not the same as in the derived ones. During phylogenetic change of the evolving organism the selective strains are also continuously changing. As a consequence, no static or invariant concept of economy can cover the different stages of the phylogenetic process. The pragmatic meaning of these theoretical considerations is substantiated by the example of the hydrostatic skeleton theory, in which the chordates are derived from metameric worms with a fluid skeleton. The authors are indebted to Professor Dr. P. Dullemeijer for his critical reading and valuable suggestions.

13.
Non-adjacent or long-distance dependencies (LDDs) are routinely considered a distinctive trait of language, which purportedly locates it higher than other sequentially organized signal systems in terms of structural complexity. This paper argues that particular languages display specific resources (e.g. non-interpretive morphological agreement paradigms) that help the brain system responsible for dealing with LDDs to develop the capacity of acquiring and processing expressions with such a human-typical degree of computational complexity. Independently obtained naturalistic data are discussed and put to the service of the idea that the above-mentioned resources exert their developmental role from the outside, but in compliance with other internal resources, ultimately compounding an integrated developmental system. Parallels with other human and nonhuman developmental phenomena are explored, which point to the conclusion that the developmental system of concern can be assimilated to cases currently being conceptualized as ‘cue-response systems’ or ‘developmental hybrids’ within the ecological-developmental paradigm in theoretical biology. Such a conclusion is used to support the idea that both current externalist and internalist concepts fall short of a correct characterization of language.

14.
15.
16.
MOTIVATION: The sheer volume of textually described biomedical knowledge exerts the need for natural language processing (NLP) applications in order to allow flexible and efficient access to relevant information. Specialized semantic networks (such as biomedical ontologies, terminologies or semantic lexicons) can significantly enhance these applications by supplying the necessary terminological information in a machine-readable form. With the explosive growth of bio-literature, new terms (representing newly identified concepts or variations of the existing terms) may not be explicitly described within the network and hence cannot be fully exploited by NLP applications. Linguistic and statistical clues can be used to extract many new terms from free text. The extracted terms still need to be correctly positioned relative to other terms in the network. Classification as a means of semantic typing represents the first step in updating a semantic network with new terms. RESULTS: The MaSTerClass system implements the case-based reasoning methodology for the classification of biomedical terms.

17.
We introduce concepts of external and internal complexity to analyze the relation between an adaptive system and its environment. We apply this theoretical framework to the construction of models in a cognitive system and to the selection between hypotheses through selective observations performed on a data set in a recurrent process, and we propose a corresponding neural network architecture.

18.
Complexity can enhance stability in competitive systems
Empirical observations often indicate that complexity enhances stability, while most theoretical studies, such as May's (1972) classic paper, point to the opposite. Despite the wide generality of these latter theoretical analyses, our examination of the well-known competitive Lotka–Volterra system reveals that increasing complexity (measured in terms of connectance) can enhance species coexistence and persistence in model communities (measured in terms of their feasibility and stability). The high feasibility and stability found for tightly interconnected competitive subsystems might provide an explanation for the clumped structure in food webs.
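The competitive Lotka–Volterra dynamics invoked above can be sketched in a few lines (a generic simulation with parameters I have assumed for illustration, not the authors' analysis): with growth rates r_i and competition matrix alpha, dx_i/dt = r_i x_i (1 - sum_j alpha_ij x_j), and when interspecific competition (off-diagonal alpha) is weaker than intraspecific (diagonal), both competitors persist at a feasible equilibrium:

```python
def competitive_lv(r, alpha, x0, dt=0.01, steps=20000):
    """Forward-Euler integration of the competitive Lotka-Volterra system
    dx_i/dt = r_i * x_i * (1 - sum_j alpha[i][j] * x_j)."""
    x = list(x0)
    n = len(x)
    for _ in range(steps):
        x = [x[i] + dt * r[i] * x[i]
             * (1.0 - sum(alpha[i][j] * x[j] for j in range(n)))
             for i in range(n)]
    return x

# two competitors with interspecific competition (0.5) weaker than
# intraspecific (1.0): the feasible equilibrium x1 = x2 = 2/3 is stable,
# so both species coexist regardless of unequal starting abundances
x = competitive_lv([1.0, 1.0], [[1.0, 0.5], [0.5, 1.0]], [0.1, 0.2])
```

Raising the off-diagonal entries above 1 in this toy setup drives one competitor extinct, which is the feasibility/stability question the abstract studies as a function of connectance.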

19.
The standard approach to the definition of physical quantities has not produced satisfactory results with the concepts of information and meaning. In the case of information we have at least two unrelated definitions, while in the case of meaning we have no definition at all. Here it is shown that both information and meaning can be defined by operative procedures, but it is also pointed out that we need to recognize them as a new type of natural entity. They are not quantities (neither fundamental nor derived) because they cannot be measured, and they are not qualities because they are not subjective features. Here it is proposed to call them nominable entities, i.e., entities which can be specified only by naming their components in their natural order. If the genetic code is not a linguistic metaphor but a reality, we must conclude that information and meaning are real natural entities, and also that they are not equivalent to the quantities and qualities of our present theoretical framework. This gives us two options. One is to extend the definition of physics and say that the list of its fundamental entities must include information and meaning. The other is to say that physics is the science of quantities only, in which case information and meaning become the exclusive province of biology. The boundary between physics and biology, in short, is a matter of convention, but the existence of information and meaning is not. We can decide to study them in the framework of an extended physics or in a purely biological framework, but we cannot avoid studying them for what they are, i.e., as fundamental components of the fabric of Nature.

20.
The purpose of the present paper is to offer a precise definition of the concepts of integration, emergence and complexity in biological networks through the use of information theory. If two distinct properties of a network are expressed by two discrete variables, the classical subadditivity principle of Shannon's information theory applies when all the nodes of the network are associated with these properties. If not, the subadditivity principle may not apply. This situation is often encountered with enzyme and metabolic networks, for some nodes may well not be associated with both properties. This is precisely what occurs with an enzyme that binds its two substrates randomly. This situation implies that an enzyme, or a metabolic network, may display a joint entropy equal to, smaller than, or larger than the corresponding sum of individual entropies of its component sub-systems. In the first case, the collective properties of the network can be reduced to the individual properties of its components; moreover, the network is devoid of any information. In the second case, the system displays integration effects, behaves as a coherent whole, and has positive information. But if the joint entropy of the network is larger than the sum of the individual entropies of its components, then the system has emergent collective properties and can be considered complex; moreover, under these conditions, its information is negative. The extent of negative information is enhanced if the enzyme, or the metabolic network, is far from equilibrium.
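The entropy bookkeeping described above can be sketched for the two cases where the classical principle holds (a generic illustration with hypothetical two-variable distributions, not the paper's enzyme model): the information I = H(X) + H(Y) - H(X,Y) is zero for independent properties and positive for integrated ones.

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def integration(joint):
    """I = H(X) + H(Y) - H(X,Y) for a joint distribution given as a matrix.
    I = 0: the collective behaviour reduces to the parts (no information);
    I > 0: the system shows integration effects and behaves as a whole."""
    h_x = entropy([sum(row) for row in joint])           # marginal of X
    h_y = entropy([sum(col) for col in zip(*joint)])     # marginal of Y
    h_xy = entropy([p for row in joint for p in row])    # joint entropy
    return h_x + h_y - h_xy

independent = [[0.25, 0.25], [0.25, 0.25]]  # I = 0 bits
correlated = [[0.5, 0.0], [0.0, 0.5]]       # I = 1 bit: integrated whole
```

For any well-defined joint distribution, subadditivity guarantees I >= 0; the paper's third case (negative information, joint entropy exceeding the sum) can only arise in the situations it describes, where not all nodes are associated with both properties and the classical principle no longer applies.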
