Similar Documents
 20 similar documents found (search time: 15 ms)
1.
Purpose

The majority of LCA studies begin with the drawing of a process flow diagram, which then needs to be translated manually into an LCA model. This study presents an initial image processing pipeline, implemented in an open-source software package called lcopt-cv, which can be used to identify the boxes and links in a photograph of a hand-drawn process flow diagram and automatically create an LCA foreground model.

Methods

The computer vision pipeline consists of a total of 15 steps, beginning with loading the image file and conversion to greyscale. The background is equalised, then the foreground of the image is extracted from the background using thresholding. The lines are then dilated and closed to account for drawing errors. Contours in the image are detected and simplified, and rectangles (contours with four corners) are identified from the simplified contours as ‘boxes’. Links between these boxes are identified using a flood-filling technique. Heuristic processing, based on knowledge of common practice in the drawing of process flow diagrams, is then performed to more accurately identify the typology of the identified boxes and the direction of the links between them.
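The flood-filling step can be illustrated with a minimal sketch (this is not the lcopt-cv implementation; the grid, names and 4-connectivity choice are assumptions): starting from a foreground pixel, a breadth-first fill collects the connected stroke, and a link is inferred when the filled region touches two detected boxes.

```python
from collections import deque

# Hypothetical binary image: two box outlines joined by a line ('1' = foreground).
GRID = [
    "1111....1111",
    "1..1....1..1",
    "1..111111..1",
    "1..1....1..1",
    "1111....1111",
]

def flood_fill(grid, start):
    """Return the set of foreground pixels 4-connected to `start`."""
    rows, cols = len(grid), len(grid[0])
    seen, queue = {start}, deque([start])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == "1" and (nr, nc) not in seen:
                seen.add((nr, nc))
                queue.append((nr, nc))
    return seen

# Filling from the left box reaches the right box (columns >= 8) through the
# connecting stroke, so the two rectangles are inferred to be linked.
component = flood_fill(GRID, (0, 0))
touches_right_box = any(c >= 8 for _, c in component)
print(touches_right_box)  # True
```

In an OpenCV-based pipeline the same idea would typically use `cv2.floodFill` on the thresholded image; the pure-Python version above just makes the logic explicit.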

Results and discussion

The performance of the image processing pipeline was tested on four flow diagrams of increasing difficulty: one simple computer-drawn diagram and three photographs of hand-drawn diagrams (a simple diagram, a complex diagram and a diagram with merged lines). A set of default values for the variables that define the pipeline was developed through trial and error. For the two simple flow charts, all boxes and links were identified using the default settings. The complex diagram required minor tweaks to the default values to detect all boxes and links. An ‘unstacking’ heuristic allowed the diagram with merged lines to be correctly processed. After some manual reclassification of link directions and process types, the diagrams were turned into LCA models and exported to open-source LCA software packages (lcopt and Brightway) to be verified and analysed.

Conclusions

This study demonstrates that it is possible to generate a fully functional LCA model from a picture of a flow chart. This has potentially important implications not only for LCA practitioners as a whole, but in particular for the teaching of LCA. Bypassing the steep learning curve required by most LCA software packages allows teachers to focus on important LCA concepts, while participants retain the benefits of experiential learning by doing a ‘real’ LCA.


2.
A numerical model of the coupled motion of a flexing surface in a high Reynolds number flow is presented for the simulation of flexible polyurethane heart valves in the aortic position. This is achieved by matching a Lagrangian dynamic leaflet model with a panel-method-based flow solver. The two models are coupled via the time-dependent pressure field using the unsteady Bernoulli equation. Incorporation of sub-cycling in the dynamic model equations and fast preconditioning techniques in the panel method solver yields efficient convergence and near real-time simulations of valve motion. The generality of the dynamic model allows different material properties and/or geometries to be studied easily and interactively. This interactivity is realized by embedding the models within a design environment created using the software IRIS Explorer. Two flow domains are developed: an infinite domain and an internal domain using conformal mapping theory. In addition, the bending stress on the valve is computed using a simple stress model based on spline and circle equation techniques.

3.
Abstract

A numerical model of the coupled motion of a flexing surface in a high Reynolds number flow is presented for the simulation of flexible polyurethane heart valves in the aortic position. This is achieved by matching a Lagrangian dynamic leaflet model with a panel-method-based flow solver. The two models are coupled via the time-dependent pressure field using the unsteady Bernoulli equation.

Incorporation of sub-cycling in the dynamic model equations and fast preconditioning techniques in the panel method solver yields efficient convergence and near real-time simulations of valve motion. The generality of the dynamic model allows different material properties and/or geometries to be studied easily and interactively. This interactivity is realized by embedding the models within a design environment created using the software IRIS Explorer™.

Two flow domains are developed: an infinite domain and an internal domain using conformal mapping theory. In addition, the bending stress on the valve is computed using a simple stress model based on spline and circle equation techniques.

4.
MOTIVATION: The cost of molecular quasi-statics or dynamics simulations increases with the size of the simulated systems, which is a problem when studying biological phenomena that involve large molecules over long time scales. To address this problem, one has often to either increase the processing power (which might be expensive) or make arbitrary simplifications to the system (which might bias the study). RESULTS: We introduce adaptive torsion-angle quasi-statics, a general simulation method able to rigorously and automatically predict the most mobile regions in a simulated system, under user-defined precision or time constraints. By predicting and simulating only these most important regions, the adaptive method provides the user with complete control over the balance between precision and computational cost, without requiring arbitrary a priori simplifications. We build on our previous research on adaptive articulated-body simulation and show how, by taking advantage of the partial rigidification of a molecule, we are able to propose novel data structures and algorithms for adaptive update of molecular forces and energies. This results in a globally adaptive molecular quasi-statics simulation method. We demonstrate our approach on several examples and show how adaptive quasi-statics allows a user to interactively design, modify and study potentially complex protein structures.

5.
王宜成 (Wang Yicheng), Acta Ecologica Sinica, 2013, 33(11): 3258-3268
The traditional methods for nature reserve design are scoring and gap analysis, which are simple and easy to apply but not very reliable; the use of geographic information systems (GIS) in reserve design is also well known. This paper focuses on two methods that have developed rapidly in recent years but are little used domestically: mathematical modelling and computer simulation. Mathematical modelling is mainly used to select a subset of candidate land parcels to form a nature reserve, using linear and nonlinear models solved with heuristic or optimization algorithms. Heuristic algorithms are fast and flexible, but their solutions are usually not optimal, so they cannot guarantee the optimal use of scarce resources. Optimization algorithms are computationally inefficient and may run into difficulties with only a few hundred variables, but their solutions are optimal. Both classes of algorithm are expected to continue to develop. Computer simulation is mainly used for reserve evaluation, functional zoning, and predicting the effects of particular conditions, such as spatial features and climate change, on species; it mostly relies on heuristic algorithms and is combined with other software to display results graphically. Both methods, and computer simulation in particular, require strong specialist knowledge from reserve designers. Open problems and new research directions are discussed, including at least: 1) basic data still need to be improved; 2) how new factors such as dynamics and uncertainty can be incorporated into models and combined with other factors; 3) how simulation parameters should be assessed and adjusted under climate change scenarios; 4) how to reconcile conservation with development; 5) practical application of the methods requires communication mechanisms between researchers and decision makers; 6) experts from multiple fields and relevant stakeholders should have the opportunity to participate in reserve design.

6.
Ligand-Info: searching for similar small compounds using index profiles (Total citations: 1; self-citations: 0; citations by others: 1)
MOTIVATION: The Ligand-Info system is based on the assumption that small molecules with similar structures have similar functional (binding) properties. The system enables a fast and sensitive index-based search for similar compounds in large databases. Index profiles, constructed by averaging the indexes of related molecules, are used to increase the specificity of the search. The utilization of index profiles helps to focus on frequent, common features of a family of compounds. RESULTS: A Java-based tool for clustering and scanning of small molecules has been created. The tool can interactively cluster sets of molecules, create index profiles on the user side and automatically download similar molecules from a database of 250,000 compounds. The results of applying index profiles demonstrate that the profile-based search strategy can increase the quality of the selection process. AVAILABILITY: The system is available at http://Ligand.Info. The application requires the Java Runtime Environment 1.4, which can be automatically installed during first use on desktop systems that support it. A standalone version of the program is available from the authors upon request.
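As a rough sketch of the index-profile idea (the bit-vector indexes and scoring function here are illustrative assumptions, not the Ligand-Info code), a family's binary fingerprints can be averaged into a per-bit frequency profile, which then scores candidates by how many of the family's frequent features they share:

```python
def make_profile(fingerprints):
    """Average a family's binary fingerprints into a per-bit frequency profile."""
    n = len(fingerprints)
    return [sum(bits) / n for bits in zip(*fingerprints)]

def profile_score(profile, candidate):
    """Dot-product score: candidate bits weighted by their frequency in the family."""
    return sum(p * b for p, b in zip(profile, candidate))

# Hypothetical 5-bit structural fingerprints for a family of related molecules.
family = [
    [1, 1, 0, 1, 0],
    [1, 1, 0, 0, 0],
    [1, 1, 1, 1, 0],
]
profile = make_profile(family)  # frequent bits get high weights

in_family = profile_score(profile, [1, 1, 0, 1, 0])  # shares the frequent bits
outlier = profile_score(profile, [0, 0, 1, 0, 1])    # only rare/absent bits
print(in_family > outlier)  # True
```

Averaging over the family suppresses idiosyncratic bits of any single molecule, which is one way a profile search can be more specific than pairwise similarity to one query.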

7.
Typical component-placement systems for populating surface mount technology printed circuit boards now exhibit a high degree of concurrency in their functional operations. This concurrency ideally yields high “burst-rate” estimates of throughput. However, if the concurrency is not properly understood and exploited, the burst rate is severely degraded, as exhibited by process rates observed in the actual production environment. Achieving this understanding requires an experimental characterization of the system's functional operation, which must also reflect the peculiarities of the controller. Such an experimental analysis is an essential precursor to performance-optimization procedures for numerically controlled flexible manufacturing systems. This article describes our analysis of an extremely complex workcell with a high degree of concurrency. Given this complexity, the methodological framework for the analysis should be applicable to a broad class of concurrent systems. Empirically verifying the characterization required the development of an emulator that quantitatively defines the system to be modeled. As such, it is a numerical, off-line design and analysis tool. It has been utilized to obtain the process rate for particular products, preevaluate proposed engineering changes, interactively construct setups and sequences, and obtain parameters required for line-balancing procedures.

8.
Experiments using quantitative real-time PCR to test hypotheses are limited by technical and biological variability; we seek to minimise sources of confounding variability through optimum use of biological and technical replicates. The quality of an experiment design is commonly assessed by calculating its prospective power. Such calculations rely on knowledge of the expected variances of the measurements of each group of samples and the magnitude of the treatment effect, the estimation of which is often uninformed and unreliable. Here we introduce a method that exploits a small pilot study to estimate the biological and technical variances in order to improve the design of a subsequent large experiment. We measure the variance contributions at several ‘levels’ of the experiment design and provide a means of using this information to predict both the total variance and the prospective power of the assay. A validation of the method is provided through a variance analysis of representative genes in several bovine tissue-types. We also discuss the effect of normalisation to a reference gene in terms of the measured variance components of the gene of interest. Finally, we describe a software implementation of these methods, powerNest, that gives the user the opportunity to input data from a pilot study and interactively modify the design of the assay. The software automatically calculates expected variances, statistical power, and optimal design of the larger experiment. powerNest enables the researcher to minimise the total confounding variance and maximise prospective power for a specified maximum cost for the large study.
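A minimal sketch of the underlying calculation (a normal-approximation z-test, not the powerNest implementation; the variance components and replicate counts are illustrative assumptions):

```python
from math import sqrt
from statistics import NormalDist

def group_mean_variance(var_bio, var_tech, n_bio, n_tech):
    """Variance of a group mean with n_bio biological replicates,
    each measured with n_tech technical replicates."""
    return var_bio / n_bio + var_tech / (n_bio * n_tech)

def prospective_power(effect, var_bio, var_tech, n_bio, n_tech, alpha=0.05):
    """Approximate power of a two-sided z-test comparing two such groups."""
    se = sqrt(2 * group_mean_variance(var_bio, var_tech, n_bio, n_tech))
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    return 1 - NormalDist().cdf(z_alpha - effect / se)

# With pilot-estimated variance components where biology dominates,
# adding biological replicates buys more power than adding technical ones
# for the same total number of measurements (16 wells in both designs).
p_more_bio = prospective_power(1.0, var_bio=0.8, var_tech=0.2, n_bio=8, n_tech=2)
p_more_tech = prospective_power(1.0, var_bio=0.8, var_tech=0.2, n_bio=4, n_tech=4)
print(p_more_bio > p_more_tech)  # True
```

This is the kind of trade-off the pilot-study variance estimates make quantifiable: once `var_bio` and `var_tech` are measured, the replicate allocation can be optimised under a cost constraint.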

9.
A computer program is described which aids the clinician in planning craniofacial surgical procedures. It operates on a three-dimensional landmark data base derived by combining posteroanterior and lateral cephalograms from the patient and from the Bolton normative standards. A three-dimensional surgical simulation program based on computerized tomographic (CT) data is also described which can be linked to the cephalometrically based program. After the clinician has selected the number and type of osteotomies to be performed on the patient, an automated optimization program computes the postoperative positions of these fragments which best fit the appropriate normal cephalometric form. The clinician then interactively modifies the design to account for such variables as bone-graft resorption, relapse tendency, occlusal disparities, and the condition of the overlying soft-tissue matrix. Osteotomy movement specifications are easily transferred between the CT-based and the cephalometrically based surgical simulation programs. This allows the automated positioning step to be performed on the cephalometrically based model while the interactive step is performed using the superior image provided by the CT-based model.

10.
Numerous images are produced daily in biomedical research. In order to extract relevant and useful results, various processing and analysis steps are mandatory. The present paper describes a new, powerful and user-friendly image analysis system: LaboImage. In addition to standard image processing modules, LaboImage also contains various specialized tools. These multiple processing modules and tools are first introduced. A one-dimensional gel analysis method is then described. The new concept of a ‘normalized virtual one-dimensional gel’ is introduced, making comparisons between gels particularly easy. This normalized gel is obtained by compensating for the bending of the lanes automatically; no information loss is incurred in the process. Finally, the model of interaction in a multi-window environment is discussed. LaboImage is designed to run in two ways: interactively, using menus and panels; and in batch mode by means of user-defined macros. Examples are given to illustrate the potential of the software. Received on August 27, 1990; accepted on November 15, 1990.

11.
12.
In order to interpret the Feulgen-dependent chromatin morphology on a functional basis, we performed model experiments in which labeling with 14C-thymidine and 14C-uridine was used as a functional parameter. Using a relocation facility, information on either the DNA or RNA labeling intensity of a cell was added to the parameters of image analysis by measuring the same cell by scanning photometry after Feulgen staining. The Feulgen-stained nuclei were interactively sampled and automatically segmented. Most of the textural information was gained from a flat texture image obtained by subtracting the original image from a median-filtered image. In addition to the autoradiographic features, visually recognizable differences in nuclear morphology, such as the number of nucleoli and the level of condensed (inactive) and diffuse (active) regions of the chromatin, were also correlated with textural parameters. Using the supervised cluster analysis method, an attempt was made to establish a correlation between visual nuclear morphology and autoradiographic labeling intensity that improved the functional understanding of the Feulgen features. Our results further clarify the supramolecular chromatin structure and its dynamics during specific transitions in the cell cycle, namely the G0-G1, G1-S and S-G2 transitions; this information may become useful in diagnostic procedures.

13.
The totally asymmetric simple exclusion process (TASEP), which describes the stochastic dynamics of interacting particles on a lattice, has been actively studied over the past several decades and applied to model important biological transport processes. Here, we present a software package, called EGGTART (Extensive GUI gives TASEP-realization in Real Time), which quantifies and visualizes the dynamics associated with a generalized version of the TASEP with an extended particle size and heterogeneous jump rates. This computational tool is based on analytic formulas obtained from deriving and solving the hydrodynamic limit of the process. It allows an immediate quantification of the particle density, flux, and phase diagram, as a function of a few key parameters associated with the system, which would be difficult to achieve via conventional stochastic simulations. Our software should therefore be of interest to biophysicists studying general transport processes and can in particular be used in the context of gene expression to model and quantify mRNA translation of different coding sequences.
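For contrast with EGGTART's analytic approach, the conventional stochastic route can be sketched as follows (a hedged, minimal open-boundary TASEP with unit-size particles and homogeneous bulk rate, not the generalized process the package handles; all parameter values are illustrative):

```python
import random

def simulate_tasep(L=50, alpha=0.3, beta=0.7, steps=200_000, seed=1):
    """Random-sequential-update TASEP with entry rate alpha and exit rate beta.
    Returns the average current measured as particle exits per sweep."""
    random.seed(seed)
    lattice = [0] * L
    exits = 0
    for _ in range(steps):
        i = random.randrange(-1, L)  # -1 selects the entry move
        if i == -1:
            if lattice[0] == 0 and random.random() < alpha:
                lattice[0] = 1       # particle enters the first site
        elif i == L - 1:
            if lattice[-1] == 1 and random.random() < beta:
                lattice[-1] = 0      # particle leaves the last site
                exits += 1
        elif lattice[i] == 1 and lattice[i + 1] == 0:
            lattice[i], lattice[i + 1] = 0, 1  # bulk hop at rate 1
    return exits / (steps / (L + 1))  # one sweep = L + 1 attempted moves

# In the low-density phase (alpha < beta, alpha < 1/2) the mean-field
# steady-state current is alpha * (1 - alpha) = 0.21 for alpha = 0.3;
# the simulated value fluctuates around this.
print(simulate_tasep())
```

The slow convergence of such simulations, relative to evaluating closed-form density and flux expressions, is exactly the motivation the abstract gives for the analytic tool.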

14.
MOTIVATION: Analyzing the networks of interactions between genes and proteins has become a central theme in systems biology. Versatile software tools for interactively displaying and analyzing these networks are therefore very much in demand. The public-domain open software environment Cytoscape has been developed with the goal of facilitating the design and development of such software tools by the scientific community. RESULTS: We present GenePro, a plugin to Cytoscape featuring a set of versatile tools that greatly facilitates the visualization and analysis of protein networks derived from high-throughput interaction data and the validation of various methods for parsing these networks into meaningful functional modules. AVAILABILITY: The GenePro plugin is available at the website http://genepro.ccb.sickkids.ca.

15.
Identifying ecological corridors and their relative importance by integrating circuit theory (Total citations: 2; self-citations: 0; citations by others: 2)
宋利利 (Song Lili), 秦明周 (Qin Mingzhou), Chinese Journal of Ecology (《生态学杂志》), 2016, 27(10): 3344-3352
Landscape connectivity is considered an important factor affecting many ecological processes. Least-cost path identification based on the minimum cumulative resistance model can effectively identify functional connections in heterogeneous landscapes and has been widely applied to evaluating functional landscape connectivity and simulating ecological corridors. Connectivity models based on circuit theory replace the edges of graph theory with resistors and cost distance with resistance distance to measure the functional connectivity of heterogeneous landscapes. Using simulated landscapes generated with the SIMMAP 2.0 software, and with the help of the Linkage Mapper toolkit and the Circuitscape software, this paper explores how the minimum cumulative resistance model can be combined with circuit-theory connectivity models to identify ecological corridors and the relative importance of landscape elements. The results show that the two models are complementary, each with its own strengths in application: the least-cost path method effectively identifies minimum-cost corridors between habitat patches, while the circuit-theory connectivity model, through calculation of current density, effectively identifies the landscape elements and ‘pinch point’ areas that are critical to landscape connectivity; moreover, the locations of pinch points are unaffected by corridor width, giving the model a clear advantage in corridor-importance analysis. This approach can provide a scientific basis for regional conservation planning and ecological corridor design.

16.
We present a general computational approach to simulate RNA folding kinetics that can be used to extract population kinetics, folding rates and the formation of particular substructures that might be intermediates in the folding process. Simulating RNA folding kinetics can provide unique insight into RNA whose functions are dictated by folding kinetics and not always by nucleotide sequence or the structure of the lowest free-energy state. The method first builds an approximate map (or model) of the folding energy landscape, from which the population kinetics are analyzed by solving the master equation on the map. We present results obtained using an analysis technique, map-based Monte Carlo simulation, which stochastically extracts folding pathways from the map. Our method compares favorably with other computational methods that begin with a comprehensive free-energy landscape, illustrating that the smaller, approximate map captures the major features of the complete energy landscape. As a result, our method scales to larger RNAs. For example, here we validate the kinetics of RNAs of more than 200 nucleotides. Our method accurately computes the kinetics-based functional rates of wild-type and mutant ColE1 RNAII and MS2 phage RNAs, showing excellent agreement with experiment.

17.
18.
Changes in vegetation structure and biogeography due to climate change feed back to alter climate by changing fluxes of energy, moisture, and momentum between land and atmosphere. While the current class of land process models used with climate models parameterizes these fluxes in detail, these models prescribe surface vegetation and leaf area from data sets. In this paper, we describe an approach in which ecological concepts from a global vegetation dynamics model are added to the land component of a climate model to grow plants interactively. The vegetation dynamics model is the Lund–Potsdam–Jena (LPJ) dynamic global vegetation model. The land model is the National Center for Atmospheric Research (NCAR) Land Surface Model (LSM). Vegetation is defined in terms of plant functional types. Each plant functional type is represented by an individual plant with the average biomass, crown area, height, and stem diameter (trees only) of its population, by the number of individuals in the population, and by the fractional cover in the grid cell. Three time-scales (minutes, days, and years) govern the processes. Energy fluxes, the hydrologic cycle, and carbon assimilation, core processes in LSM, occur at a 20 min time step. Instantaneous net assimilated carbon is accumulated annually to update vegetation once a year. This is carried out with the addition of establishment, resource competition, growth, mortality, and fire parameterizations from LPJ. The leaf area index is updated daily based on prevailing environmental conditions, but the maximum value depends on the annual vegetation dynamics. The coupling approach is successful. The model simulates global biogeography, net primary production, and dynamics of tundra, boreal forest, northern hardwood forest, tropical rainforest, and savanna ecosystems, which are consistent with observations. This suggests that the model can be used with a climate model to study biogeophysical feedbacks in the climate system related to vegetation dynamics.

19.
In silico design plays a fundamental role in the endeavour to synthesise biological systems. In particular, computer-aided design software enables users to manage the complexity of biological entities that is connected to their construction and reconfiguration. The software's graphical user interface bridges the gap between the machine-readable data on the algorithmic subface of the computer and its human-amenable surface represented by standardised diagrammatic elements. Notations like the Systems Biology Graphical Notation (SBGN), together with interactive operations such as drag & drop, allow the user to visually design and simulate synthetic systems as ‘bio-algorithmic signs’. Finally, the digital programming process should be extended to the wet lab to manufacture the designed synthetic biological systems. By exploring the different ‘faces’ of synthetic biology, I argue that computer-aided design (CAD) in particular is pushing the idea of automatically producing de novo objects. Multifaceted software processes serve mutually aesthetic, epistemic and performative purposes by simultaneously black-boxing and bridging different data sources, experimental operations and community-wide standards. So far, synthetic biology is mainly a product of digital media technologies that structurally mimic the epistemological challenge of taking both qualitative and quantitative aspects of biological systems into account in order to understand and produce new and functional entities.

20.