Related Articles
1.
The past decade of China's new era has been the ten years in which the understanding of ecological and environmental protection has been deepest, the effort greatest, the measures most concrete, the progress fastest, and the results most remarkable. As ecological environmental governance has delivered results, management measures have gradually matured and become standardized, and multimodal data such as texts, videos and photographs recording the resulting ecological management knowledge have grown increasingly rich. Applying advanced knowledge graph concepts to innovate China's ecological and environmental protection work is of great significance for winning the battle against pollution and for building a modern environmental governance system. Focusing on the Beautiful China and ecological civilization construction programmes, this study takes multimodal materials from typical pollution prevention and control campaigns and ecological restoration projects as data sources and, through data integration, knowledge extraction and knowledge fusion, forms standardized knowledge representations and constructs an ecological management knowledge graph system. Specifically, the work (1) quantitatively analyses data from successful remediation cases of "scattered, disorderly and polluting" ("散乱污") enterprises in Shenzhen, extracting entities such as management subjects and management objects and mining the relations among their spatial characteristics, pollution characteristics and remediation effects; (2) analyses the correlations among enterprise locations, pollutant hotspots and urban space; (3) analyses the specific "implementing action - damaged object - impaired function" relations in typical Chinese ecological environmental damage compensation cases and extracts an "ecological governance action - affected environmental element - degree of ecosystem service improvement" ecological environment management knowledge graph; and (4) finally integrates the "散乱污" remediation and ecological governance actions into a comprehensive ecological management knowledge graph, implemented as a graph database containing 12 ontology classes, 82 entities, and 201 relations of 4 types. The study shows that building China's ecological management knowledge graph from multimodal data on successful pollution-control cases and the outcomes of ecological restoration projects can produce a knowledge system close to practical needs; it supports the whole process of law-based, science-based and precise pollution control, and it also supports the "many causes, one effect" and "one cause, many effects" analyses in ecological environmental damage identification and assessment. It is recommended that future work expand the application of the ecological management knowledge graph so as to precisely identify management objects, enable scientific analysis and intelligent decision-making, promote public participation in ecological management, and accelerate the realization of the value of ecological products.
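As a minimal illustration only (not the pipeline used in the study), the sketch below stores a few "entity - relation - entity" triples of the kind described above and runs two simple queries; every entity and relation name in it is a hypothetical example.

```python
# Minimal illustrative triple store for an ecological-management knowledge graph.
# Entity and relation names are hypothetical examples, not data from the study.
from collections import defaultdict

triples = [
    ("Shenzhen EPB", "regulates", "scattered-polluting enterprise A"),
    ("scattered-polluting enterprise A", "located_in", "industrial district X"),
    ("scattered-polluting enterprise A", "emits", "VOCs"),
    ("remediation action 1", "targets", "scattered-polluting enterprise A"),
    ("remediation action 1", "improves", "air-quality service"),
]

# Index triples by subject and by relation for simple queries.
by_subject = defaultdict(list)
by_relation = defaultdict(list)
for s, r, o in triples:
    by_subject[s].append((r, o))
    by_relation[r].append((s, o))

# Query 1: everything known about one management object.
print(by_subject["scattered-polluting enterprise A"])

# Query 2: which actions improve which ecosystem services ("one cause, many effects").
print(by_relation["improves"])
```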

2.
We investigate the growth dynamics of Greater London, defined by the administrative boundary of the Greater London Authority, based on the evolution of its street network during the last two centuries. This is done by employing a unique dataset consisting of the planar graph representation of nine time slices of Greater London's road network spanning 224 years, from 1786 to 2010. Within this time-frame, we address the concept of the metropolitan area or city in physical terms, in that urban evolution reveals observable transitions in the distribution of relevant geometrical properties. Given that London has a hard boundary enforced by its long-standing green belt, we show that its street network dynamics can be described as a fractal space-filling phenomenon up to a capacitated limit, whence its growth can be predicted with a striking level of accuracy. This observation is confirmed by the analytical calculation of key topological properties of the planar graph, such as the topological growth of the network and its average connectivity. This study thus represents an example of a strong violation of Gibrat's law. In particular, we are able to show analytically how London evolves from a more loop-like structure, typical of planned cities, toward a more tree-like structure, typical of self-organized cities. These observations are relevant to the discourse on sustainable urban planning with respect to the control of urban sprawl in many large cities which have developed under the conditions of spatial constraints imposed by green belts and hard urban boundaries.
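A small sketch, on invented data, of the topological quantities this abstract refers to: the average connectivity of a planar street graph and how far it sits between a tree and a maximally looped layout. The toy grid graph is a stand-in, not London data.

```python
# Toy illustration of average connectivity and "loopiness" of a planar street graph.
# The small grid graph below is hypothetical, not London data.
import networkx as nx

G = nx.grid_2d_graph(4, 4)             # a planned, grid-like street pattern
N, E = G.number_of_nodes(), G.number_of_edges()

avg_degree = 2 * E / N                 # average connectivity <k>
cycles = E - N + 1                     # independent loops of a connected planar graph
max_cycles = 2 * N - 5                 # maximum possible for a planar graph on N nodes
meshedness = cycles / max_cycles       # 0 = pure tree, 1 = maximally looped

print(f"<k> = {avg_degree:.2f}, independent loops = {cycles}, meshedness = {meshedness:.2f}")
```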

3.
4.
A new approach to loop analysis is presented in which decompositions of the total elasticity of a population projection matrix over a set of life history pathways are obtained as solutions of a constrained system of linear equations. In loop analysis, life history pathways are represented by loops in the life cycle graph, and the elasticity of the loop is interpreted as a measure of the contribution of the life history pathway to the population growth rate. Associated with the life cycle graph is a vector space -- the cycle space of the graph -- which is spanned by the loops. The elasticities of the transitions in the life cycle graph can be represented by a vector in the cycle space, and a loop decomposition of the life cycle graph is then defined to be any nonnegative linear combination of the loops which sum to the vector of elasticities. In contrast to previously published algorithms for carrying out loop analysis, we show that a given life cycle graph admits of either a unique loop decomposition or an infinite set of loop decompositions which can be characterized as a bounded convex set of nonnegative vectors. Using this approach, loop decompositions which minimize or maximize a linear objective function can be obtained as solutions of a linear programming problem, allowing us to place lower and upper bounds on the contributions of life history pathways to the population growth rate. Another consequence of our approach to loop analysis is that it allows us to identify the exact tradeoffs in contributions to the population growth rate that must exist between life history pathways.  相似文献   
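The linear-programming formulation can be illustrated with a toy example. The three-stage life cycle, its loops and its elasticity values below are invented for illustration and are not taken from the paper; the code bounds the contribution of one loop over all valid decompositions.

```python
# Illustrative loop-decomposition bounds via linear programming (scipy).
# The 3-stage life cycle, its loops and elasticities are invented, not from the paper.
import numpy as np
from scipy.optimize import linprog

# Edges of the life cycle graph (i -> j) and their elasticities e_ij (sum to 1).
edges = ["1->2", "1->3", "2->1", "2->3", "3->1", "3->2"]
e = np.array([0.20, 0.15, 0.20, 0.15, 0.15, 0.15])

# Loops (simple cycles) expressed as 0/1 vectors over the edges above.
loops = np.array([
    [1, 0, 1, 0, 0, 0],   # L1: 1->2->1
    [0, 0, 0, 1, 0, 1],   # L2: 2->3->2
    [0, 1, 0, 0, 1, 0],   # L3: 1->3->1
    [1, 0, 0, 1, 1, 0],   # L4: 1->2->3->1
    [0, 1, 1, 0, 0, 1],   # L5: 1->3->2->1
]).T                      # shape: (edges, loops)

def bound(loop_index, sense):
    """Min or max contribution of one loop over all nonnegative decompositions."""
    c = np.zeros(loops.shape[1])
    c[loop_index] = 1.0 if sense == "min" else -1.0
    res = linprog(c, A_eq=loops, b_eq=e, bounds=[(0, None)] * loops.shape[1])
    return res.x[loop_index]

print("L4 contribution ranges from", round(bound(3, "min"), 3),
      "to", round(bound(3, "max"), 3))   # 0.0 to 0.15 in this toy example
```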

5.
6.
Richard E. Scammon's article, “The First Seriatim Study of Human Growth,” provided one of the best‐known visuals in the field of human biology. Scammon resurrected longitudinal height data of one child from Buffon's Histoire Naturelle, converted them to metric, and plotted these measurements as a function of age. The result was the first graph of one individual's growth curve from birth to 18 years of age. This image was subsequently reproduced in numerous texts on human growth and biology. Published in 1927, Scammon's article provides a snapshot of the state of growth research at the time and gives a (literal) picture of the future of human biology. The graph of the growth of one child symbolizes the importance of process and variation in biological anthropology.  相似文献   

7.
Chen Y 《PloS one》2011,6(9):e24791
Zipf's law is one of the most conspicuous empirical facts about cities; however, there is no convincing explanation for the scaling relation between rank and size or for its scaling exponent. Using ideas from fractals and scaling, I propose a dual competition hypothesis of city development to explain the value intervals and the special value, 1, of the power exponent. Zipf's law and Pareto's law can be mathematically transformed into one another, but they represent different processes of urban evolution. Based on the Pareto distribution, a frequency correlation function can be constructed; by scaling analysis and the multifractal spectrum, the interval of the Pareto exponent is derived as (0.5, 1]. Based on the Zipf distribution, a size correlation function can be built, which is opposite to the first; by this second correlation function and the multifractal notion, the Pareto exponent interval is derived as [1, 2). Thus urban evolution involves two effects: the Pareto effect, indicating an increase in the number of cities (external complexity), and the Zipf effect, indicating growth in city size (internal complexity). Because of the competition between the two effects, the scaling exponent varies from 0.5 to 2; if the two effects reach equilibrium with each other, the scaling exponent approaches 1. A series of mathematical experiments on hierarchical correlation is employed to verify the models, and the conclusion is that if cities in a given region follow Zipf's law, the frequency and size correlations will follow the scaling law. This theory can be generalized to interpret the inverse power-law distributions found in various fields of the physical and social sciences.
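As a hedged numerical illustration (not part of the paper), the sketch below generates synthetic city sizes obeying the rank-size rule with exponent 1 and recovers the Zipf exponent by a log-log regression, together with the implied Pareto exponent.

```python
# Toy estimation of the Zipf (rank-size) exponent by log-log regression.
# The synthetic "city sizes" below are generated, not empirical data.
import numpy as np

rng = np.random.default_rng(0)
q_true = 1.0                                     # Zipf exponent used to generate data
ranks = np.arange(1, 501)
sizes = 1e6 * ranks ** (-q_true)                 # ideal rank-size rule S(r) = S1 * r^(-q)
sizes *= rng.lognormal(0.0, 0.05, ranks.size)    # small multiplicative noise

# Fit log S = log S1 - q * log r by least squares.
slope, intercept = np.polyfit(np.log(ranks), np.log(sizes), 1)
print(f"estimated Zipf exponent q = {-slope:.3f}")      # close to 1

# The corresponding Pareto exponent of the size distribution, P(S >= s) ~ s^(-a), is a = 1/q.
print(f"implied Pareto exponent a = {-1/slope:.3f}")
```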

8.
S Waner  Y H Wu 《Bio Systems》1988,21(2):115-124
We propose an automata-theoretical framework for structured hierarchical control, in terms of rules and meta-rules, for sequences of moves on a graph. This leads to a notion of a "universal" hierarchically structured automaton mu which can move on a given graph in such a way as to emulate any automaton which moves on that graph in response to inputs. This emulation is achieved via a mapping of the inputs in the given automaton to those of mu, and we think of such a mapping as an encoding of the given automaton. We see in several examples that efficient encodings of graph-search algorithms correspond to their natural hierarchical structure (in terms of rules and meta-rules), and this leads one to a precise notion of the "depth" of an automaton which moves on a given graph. By way of application, we discuss a proposed structure of a series of stochastic neural networks which can learn, by example, to encode a given sequence of moves on a graph, so that the encoding obtained is structurally the "natural" one for the given sequence of moves. Thus, such a learning system would perform both structural pattern recognition (in terms of "patterns" of moves), and encoding based on a desired outcome.  相似文献   

9.
Scientific formalizations of the notion of growth and measurement of the rate of growth in living organisms are age-old problems. The most frequently used metric, “Average Relative Growth Rate” is invariant under the choice of the underlying growth model. Theoretically, the estimated rate parameter and relative growth rate remain constant for all mutually exclusive and exhaustive time intervals if the underlying law is exponential but not for other common growth laws (e.g., logistic, Gompertz, power, general logistic). We propose a new growth metric specific to a particular growth law and show that it is capable of identifying the underlying growth model. The metric remains constant over different time intervals if the underlying law is true, while the extent of its variation reflects the departure of the assumed model from the true one. We propose a new estimator of the relative growth rate, which is more sensitive to the true underlying model than the existing one. The advantage of using this is that it can detect crucial intervals where the growth process is erratic and unusual. It may help experimental scientists to study more closely the effect of the parameters responsible for the growth of the organism/population under study.  相似文献   
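A minimal numerical sketch of the underlying point, with arbitrary parameter values rather than the authors' data: the interval-wise average relative growth rate is constant when the true law is exponential but drifts when it is logistic.

```python
# Interval-wise average relative growth rate (RGR) for two growth laws.
# RGR over [t1, t2] is (ln x(t2) - ln x(t1)) / (t2 - t1); parameter values are arbitrary.
import numpy as np

def exponential(t, x0=1.0, r=0.3):
    return x0 * np.exp(r * t)

def logistic(t, K=100.0, x0=1.0, r=0.3):
    return K / (1 + (K / x0 - 1) * np.exp(-r * t))

t = np.arange(0, 21, 5, dtype=float)         # interval endpoints 0, 5, 10, 15, 20
for name, f in [("exponential", exponential), ("logistic", logistic)]:
    x = f(t)
    rgr = np.diff(np.log(x)) / np.diff(t)    # average RGR on each interval
    print(name, np.round(rgr, 3))
# Exponential: RGR is 0.3 on every interval.
# Logistic: RGR decays across intervals, revealing departure from the exponential law.
```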

10.
We present a software platform for reconstructing and analyzing the growth of a plant root system from a time-series of 3D voxelized shapes. It aligns the shapes with each other, constructs a geometric graph representation together with the function that records the time of growth, and organizes the branches into a hierarchy that reflects the order of creation. The software includes the automatic computation of structural and dynamic traits for each root in the system enabling the quantification of growth on fine-scale. These are important advances in plant phenotyping with applications to the study of genetic and environmental influences on growth.  相似文献   

11.
The directed Hamiltonian path (DHP) problem is one of the hard computational problems for which no practical algorithm on a conventional computer is available. Many problems, including the traveling salesperson problem and the longest path problem, can be translated into the DHP problem, which implies that an algorithm for DHP can also solve all the translated problems. To study the robustness of the laboratory protocol of the pioneering DNA computing experiment for the DHP problem performed by Leonard Adleman (1994), we investigated how the graph size, the multiplicity of the Hamiltonian paths, and the size of the oligonucleotides that encode the vertices would affect the laboratory procedures. We applied Adleman's protocol with an 18-mer oligonucleotide per node to a graph with 8 vertices and 14 edges containing two Hamiltonian paths (Adleman used 20-mer oligonucleotides for a graph with 7 nodes, 14 edges and one Hamiltonian path). We found that, depending on graph characteristics such as the number of short cycles, the oligonucleotide size, and the hybridization conditions used to encode the graph, the protocol should be executed with parameters different from Adleman's.
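For comparison with the molecular protocol, a conventional brute-force search for directed Hamiltonian paths is trivial to write for graphs of this size; the 6-vertex digraph below is hypothetical, not the 8-vertex graph used in the study.

```python
# Brute-force search for directed Hamiltonian paths on a small digraph.
# The 6-vertex edge list below is hypothetical, not the graph used in the study.
from itertools import permutations

vertices = range(6)
edges = {(0, 1), (1, 2), (2, 3), (3, 4), (4, 5),
         (0, 2), (2, 4), (1, 3), (3, 5), (5, 0)}

def hamiltonian_paths(start, end):
    """Yield every ordering of all vertices that walks only along existing edges."""
    inner = [v for v in vertices if v not in (start, end)]
    for middle in permutations(inner):
        path = (start, *middle, end)
        if all((a, b) in edges for a, b in zip(path, path[1:])):
            yield path

for p in hamiltonian_paths(0, 5):
    print(p)
```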

12.
13.
Non-linear behaviour of biochemical networks, such as intracellular gene, protein or metabolic networks, is commonly represented using graphs of the underlying topology. Nodes represent abundance of molecules and edges interactions between pairs of molecules. These graphs are linear and thus based on an implicit linearization of the kinetic reactions in one or several dynamic modes of the total system. It is common to use data from different sources -- experiments conducted under different conditions or even on different species -- meaning that the graph will be a superposition of linearizations made in many different modes. The mixing of different modes makes it hard to identify functional modules, that is sub-systems that carry out a specific biological function, since the graph will contain many interactions that do not naturally occur at the same time. The ability to establish a boundary between the sub-system and its environment is critical in the definition of a module, contrary to a motif in which only internal interactions count. Identification of functional modules should therefore be done on graphs depicting the mode in which their function is carried out, i.e. graphs that only contain edges representing interactions active in the specific mode. In general, when an interaction between two molecules is established, one should always state the mode of the system in which it is active.  相似文献   

14.
We simulate the growth of neuronal networks using the two recently published tools, NETMORPH and CX3D. The goals of the work are (1) to examine and compare the simulation tools, (2) to construct a model of growth of neocortical cultures, and (3) to characterize the changes in network connectivity during growth, using standard graph theoretic methods. Parameters for the neocortical culture are chosen after consulting both the experimental and the computational work presented in the literature. The first (three) weeks in culture are known to be a time of development of extensive dendritic and axonal arbors and establishment of synaptic connections between the neurons. We simulate the growth of networks from day 1 to day 21. It is shown that for the properly selected parameters, the simulators can reproduce the experimentally obtained connectivity. The selected graph theoretic methods can capture the structural changes during growth.  相似文献   
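A sketch of the kind of standard graph-theoretic characterization mentioned above, applied to synthetic snapshots; the random graphs and connection densities stand in for simulator output and are not NETMORPH or CX3D results.

```python
# Graph-theoretic characterization of network snapshots at different culture ages.
# Random graphs stand in for simulated connectivity; the densities are hypothetical.
import networkx as nx

snapshots = {1: 0.01, 7: 0.05, 14: 0.10, 21: 0.15}   # day in vitro -> connection density

for day, p in snapshots.items():
    G = nx.gnp_random_graph(100, p, seed=day)        # 100 neurons, random "synapses"
    giant = G.subgraph(max(nx.connected_components(G), key=len))
    print(f"day {day:2d}: mean degree {2 * G.number_of_edges() / G.number_of_nodes():.1f}, "
          f"clustering {nx.average_clustering(G):.3f}, "
          f"giant-component path length {nx.average_shortest_path_length(giant):.2f}")
```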

15.
16.
Graph-theoretical methods have recently been used to analyze certain properties of natural and social networks. In this work, we have investigated the early stages in the growth of a Uruguayan academic network, the Biology Area of the Programme for the Development of Basic Science (PEDECIBA). This transparent social network is a territory for the exploration of the reliability of clustering methods that can potentially be used when we are confronted with opaque natural systems that provide us with a limited spectrum of observables (as happens in research on the relations between brain, thought and language). From our social net, we constructed two different graph representations based on the relationships among researchers revealed by their co-participation in Master's thesis committees. We studied these networks at different times and found that they achieve connectedness early in their evolution and exhibit the small-world property (i.e. high clustering with short path lengths). The data seem compatible with power-law distributions of connectivity, clustering coefficients and betweenness centrality. Evidence of preferential attachment of new nodes and of new links between old nodes was also found in both representations. These results suggest that there are topological properties observed throughout the growth of the network that do not depend on the representations we have chosen but reflect intrinsic properties of the academic collective under study. Researchers in PEDECIBA are classified according to their specialties. We analysed the community structure detected by a standard algorithm in both representations. We found that much of the pre-specified structure is recovered and that part of the mismatches can be attributed to convergent interests between scientists from different sub-disciplines. This result shows the potential of some clustering methods for the analysis of partially known natural systems.
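One way to build the co-participation representation described above is a bipartite projection: researchers become linked when they served on the same Master's thesis committee. The names and committees in the sketch are invented.

```python
# Co-participation graph from thesis committees (bipartite projection).
# The researcher names and committee memberships below are invented examples.
import networkx as nx
from networkx.algorithms import bipartite

committees = {
    "thesis_1": ["A", "B", "C"],
    "thesis_2": ["B", "C", "D"],
    "thesis_3": ["D", "E", "F"],
    "thesis_4": ["A", "D", "E"],
}

B = nx.Graph()
for committee, members in committees.items():
    for m in members:
        B.add_edge(committee, m)            # bipartite: committees vs researchers

researchers = {m for members in committees.values() for m in members}
G = bipartite.weighted_projected_graph(B, researchers)   # edge weight = shared committees

print("clustering:", round(nx.average_clustering(G), 3))
print("edges:", sorted(G.edges(data="weight")))
```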

17.
Growth of plants in terrestrial ecosystems is often limited by the availability of nitrogen (N) or phosphorus (P). Liebig's law of the minimum states that the nutrient in least supply relative to the plant's requirement will limit the plant's growth. An alternative to the law of the minimum is the multiple limitation hypothesis (MLH), which states that plants adjust their growth patterns such that they are limited by several resources simultaneously. We use a simple model of plant growth and nutrient uptake to explore the consequences for the plant's relative growth rate of letting plants invest differentially in N and P uptake. We find a smooth transition between limiting elements, in contrast to the strict transition in Liebig's law of the minimum. At N : P supply ratios where the two elements simultaneously limit growth, an increase in either of the nutrients will increase the growth rate because more resources can be allocated towards the limiting element, as suggested by the multiple limitation hypothesis. However, the further the supply ratio deviates from these supply rates, the more the plants will follow the law of the minimum. Liebig's law of the minimum will in many cases be a useful first-order approximation.
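A toy numerical contrast, with arbitrary parameter values rather than the authors' model, between the strict minimum rule and a smooth co-limitation of growth by N and P.

```python
# Strict Liebig minimum vs. a smooth co-limitation of growth by N and P.
# Supply and requirement values are arbitrary illustration numbers.
import numpy as np

N_supply = np.array([1.0, 2.0, 4.0, 8.0])   # increasing N supply
P_supply = 2.0                               # fixed P supply
rN, rP = 1.0, 1.0                            # requirement per unit of growth

liebig = np.minimum(N_supply / rN, P_supply / rP)     # growth set by the scarcest element
colimit = 1.0 / (rN / N_supply + rP / P_supply)       # smooth simultaneous limitation

for n, g1, g2 in zip(N_supply, liebig, colimit):
    print(f"N supply {n:>4}:  Liebig growth {g1:.3f}   co-limited growth {g2:.3f}")
# Liebig: growth plateaus once P becomes the scarcest element.
# Co-limitation: growth keeps responding smoothly to extra N around the transition.
```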

18.
The theory of steady-state enzyme processes which avoids using the mass action law of chemical kinetics and consistently describes catalytic mechanisms by probabilistic concepts has recently been proposed (Mazur, 1991, J. theor. Biol. 148, 229-242). To facilitate the analysis of complex reaction graphs by this theory the possibility of constructing schematic rules similar to those used in classical kinetics is studied. It is found that due to the similarity of algebraic procedures the popular method of King & Altman can be applied in probabilistic kinetics in addition to the earlier proposed rule based on enumeration of cycles of the reaction graph. This similarity also allows one to adapt many other shortcut methods of classical kinetics for probabilistic reaction graphs. The paper considers separately the possibility of transforming reaction mechanisms so that the initial graph is replaced by a simpler but equivalent one. It is shown that there are few cases when a group of states can be replaced by one united state, with earlier known rules such as the rule of Cha for equilibrium stages being particular cases of a more general procedure. In addition a novel method is proposed which performs step-by-step reduction of any reaction graph. All the new methods can be adapted for traditional kinetics as well. The results obtained demonstrate that many schematic rules of classical kinetics are of probabilistic origin.  相似文献   
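A small sketch of the quantity that such diagram rules ultimately deliver: the steady-state occupation probabilities of the enzyme states. Here they are obtained by directly solving the balance equations for a hypothetical three-state cycle; King & Altman-style patterns give the same result by enumerating graph diagrams.

```python
# Steady-state probabilities of a hypothetical 3-state catalytic cycle (E, ES, EP),
# solved directly from the balance equations; the rate constants are invented.
import numpy as np

# rate[i, j] = transition rate from state i to state j  (states: E, ES, EP)
rate = np.array([[0.0, 5.0, 0.0],
                 [1.0, 0.0, 3.0],
                 [4.0, 0.5, 0.0]])

# Generator matrix: off-diagonal = rates, diagonal = -(total outflow from the state).
Q = rate - np.diag(rate.sum(axis=1))

# Solve p @ Q = 0 with sum(p) = 1 (replace one balance equation by the normalization).
A = np.vstack([Q.T[:-1], np.ones(3)])
b = np.array([0.0, 0.0, 1.0])
p = np.linalg.solve(A, b)
print(dict(zip(["E", "ES", "EP"], np.round(p, 4))))
```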

19.
Sawa K  Gunji YP 《Bio Systems》2007,90(3):783-791
We propose a dialogue-based society model which explains how the transitive law of causality originates. Causality is generally formalized using axiomatic approaches. Instead, we compose a model consisting of agents who have knowledge about causal relations among objects. The model society can reveal the transitive law through dialogues among the agents. The agents influence each other reciprocally if they have either exactly the same opinions or a particular pattern of opinions regarded as an extension of such exact accordance. In addition, we add some vagueness to the dialogue, which brings it closer to real communication than the former model. Each agent's knowledge is expressed as a directed graph; hence every model can be construed as transformations of directed graphs through interactions among the directed graphs themselves. From this perspective, the models are systems that connect local logic with global logic, with the union of the directed graphs regarded as the global one.
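A toy sketch of the basic ingredient only (not the authors' dialogue dynamics): each agent's causal knowledge as a directed edge set, a dialogue modelled crudely as pooling the edges, and the transitive link that only the pooled graph reveals. The objects and edges are invented.

```python
# Two agents' causal knowledge as directed edge sets; their dialogue (here, a simple
# union of the edge sets) reveals a transitive causal link neither agent held alone.
def transitive_closure(edges):
    """Return the smallest superset of `edges` closed under (x->y, y->z) => (x->z)."""
    closure = set(edges)
    changed = True
    while changed:
        changed = False
        for (x, y) in list(closure):
            for (u, v) in list(closure):
                if y == u and (x, v) not in closure:
                    closure.add((x, v))
                    changed = True
    return closure

agent_1 = {("a", "b")}               # agent 1 knows: a causes b
agent_2 = {("b", "c")}               # agent 2 knows: b causes c
shared = agent_1 | agent_2           # after the dialogue, knowledge is pooled

print(transitive_closure(shared))    # contains ('a', 'c'): the transitive law emerges
```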
