Similar Documents (20 results found)
1.
This paper describes advancements in recreation management using new technology that couples Geographic Information Systems (GIS) with intelligent agents to simulate recreation behaviour in real-world settings. RBSim 2 (Recreation Behaviour Simulator) is a computer simulation program that enables recreation managers to explore the consequences of changing any one or more variables, so that increasing visitor use can be accommodated while maintaining the quality of the visitor experience. RBSim provides both a qualitative understanding of management scenarios, through map graphics from a GIS, and a quantitative understanding of management consequences, through statistics generated during the simulation. Managers are able to identify points of overcrowding, bottlenecks in circulation systems, and conflicts between different user groups.

RBSim 2 is a tool designed specifically for simulating human recreation behaviour in outdoor environments. The software allows recreation researchers and managers to simulate any recreation environment where visitors are restricted to movement on a network (roads, trails, rivers, etc.). The software architecture comprises the following components:

• GIS module to enter travel network, facilities, and elevation data

• Agent module to specify tourist personality types, travel modes, and agent rules

• Typical trip planner to specify trips as an aggregation of entry/exit nodes, arrival curves, destinations, and agents

• Scenario designer to specify combinations of travel networks and typical trip plans

• Statistical module to specify outputs and summarise simulation results

This paper describes the RBSim software architecture, with specific reference to the trip-planning algorithms used by the recreation agents. An application of the simulator at Port Campbell National Park, Victoria, Australia, is described.
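As a toy illustration of the kind of simulation RBSim performs, the sketch below moves a single agent along a trail network; the network, the destination_bias parameter, and the movement rule are hypothetical stand-ins for RBSim's agent rules and personality types, not its actual code.

```python
# Minimal sketch (not RBSim's actual code): an agent that walks a trail
# network, choosing its next node by a personality-weighted random choice.
import random

# Hypothetical trail network: node -> list of neighbouring nodes.
TRAILS = {
    "entry": ["lookout"],
    "lookout": ["entry", "beach", "cliff"],
    "beach": ["lookout"],
    "cliff": ["lookout", "exit"],
    "exit": [],
}

class RecreationAgent:
    def __init__(self, name, destination_bias):
        self.name = name
        self.position = "entry"
        self.log = ["entry"]
        # 0.0 = pure wanderer, 1.0 = heads straight for the exit.
        self.destination_bias = destination_bias

    def step(self):
        options = TRAILS[self.position]
        if not options:
            return False  # reached a terminal node
        if "exit" in options and random.random() < self.destination_bias:
            self.position = "exit"
        else:
            self.position = random.choice(options)
        self.log.append(self.position)
        return True

agent = RecreationAgent("hiker-1", destination_bias=0.3)
while agent.step():
    pass
print(agent.log)  # e.g. ['entry', 'lookout', 'beach', 'lookout', 'cliff', 'exit']
```

In RBSim proper, travel modes, arrival curves and agent rules come from the agent module and the typical trip planner; here one parameter stands in for a personality type.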


2.
3.
FoodChain-Lab is modular open-source software for trace-back and trace-forward analysis in food-borne disease outbreak investigations. Development of FoodChain-Lab has been driven by the need for appropriate software in several food-related outbreaks in Germany since 2011. The software allows integrated data management, data linkage, enrichment and visualization, as well as interactive supply chain analyses. Identification of possible outbreak sources or vehicles is facilitated by calculation of tracing scores for food-handling stations (companies or persons) and food products under investigation. The software also supports consideration of station-specific cross-contamination, analysis of geographical relationships, and topological clustering of the tracing network structure. FoodChain-Lab has been applied successfully in previous outbreak investigations, for example during the 2011 EHEC outbreak and the 2013/14 European hepatitis A outbreak. The software is most useful in complex, multi-area outbreak investigations where epidemiological evidence may be insufficient to discriminate between multiple implicated food products. The automated analysis and visualization components would be of greater value if trading information on food ingredients and compound products were more easily available.
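FoodChain-Lab's published scoring method is more elaborate; the following sketch only illustrates the basic idea behind a tracing score, ranking candidate source stations by the share of case-linked stations their deliveries reach (all station names and data are invented):

```python
# Simplified illustration of a trace-back score (not FoodChain-Lab's exact
# formula): rank stations by how many case-linked stations they can reach.
from collections import deque

# Hypothetical delivery graph: station -> stations it delivered to.
DELIVERIES = {
    "farm_A": ["packer_1"],
    "farm_B": ["packer_1", "packer_2"],
    "packer_1": ["retail_X", "retail_Y"],
    "packer_2": ["retail_Y", "retail_Z"],
    "retail_X": [], "retail_Y": [], "retail_Z": [],
}
CASE_STATIONS = {"retail_X", "retail_Y"}  # stations linked to cases

def reachable(start):
    """All stations reachable by following deliveries forward."""
    seen, queue = set(), deque([start])
    while queue:
        node = queue.popleft()
        for nxt in DELIVERIES.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

def tracing_score(station):
    hits = reachable(station) & CASE_STATIONS
    return len(hits) / len(CASE_STATIONS)

for station in ["farm_A", "farm_B", "packer_1", "packer_2"]:
    print(station, tracing_score(station))
# farm_A, farm_B and packer_1 reach both case stations (score 1.0);
# packer_2 reaches only one (score 0.5).
```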

4.
Cornelis H, Coop AD, Bower JM. PLoS ONE 2012;7(1):e28956
Simulator interoperability and extensibility has become a growing requirement in computational biology. To address this, we have developed a federated software architecture. It is federated by its union of independent disparate systems under a single cohesive view, provides interoperability through its capability to communicate, execute programs, or transfer data among different independent applications, and supports extensibility by enabling simulator expansion or enhancement without the need for major changes to system infrastructure. Historically, simulator interoperability has relied on the development of declarative markup languages such as the neuron modeling language NeuroML, while simulator extension typically occurred through modification of existing functionality. The software architecture we describe here allows for both these approaches. However, it is designed to support alternative paradigms of interoperability and extensibility through the provision of logical relationships and defined application programming interfaces. They allow any appropriately configured component or software application to be incorporated into a simulator. The architecture defines independent functional modules that run stand-alone. They are arranged in logical layers that naturally correspond to the occurrence of high-level data (biological concepts) versus low-level data (numerical values) and distinguish data from control functions. The modular nature of the architecture and its independence from a given technology facilitates communication about similar concepts and functions for both users and developers. It provides several advantages for multiple independent contributions to software development. Importantly, these include: (1) Reduction in complexity of individual simulator components when compared to the complexity of a complete simulator, (2) Documentation of individual components in terms of their inputs and outputs, (3) Easy removal or replacement of unnecessary or obsolete components, (4) Stand-alone testing of components, and (5) Clear delineation of the development scope of new components.
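As a language-neutral illustration of this layering (not the paper's actual interfaces), the sketch below wires stand-alone modules with declared inputs and outputs, separating high-level biological concepts from low-level numerical values:

```python
# Sketch of the layering idea (assumed interfaces, not the paper's API):
# stand-alone modules declare inputs/outputs and are wired by a broker,
# keeping high-level "biological" data separate from low-level numerics.
from abc import ABC, abstractmethod

class Module(ABC):
    inputs: tuple = ()
    outputs: tuple = ()

    @abstractmethod
    def run(self, data: dict) -> dict: ...

class ModelLayer(Module):          # high level: biological concepts
    outputs = ("compartments",)
    def run(self, data):
        return {"compartments": [{"name": "soma", "capacitance_nF": 1.0}]}

class SolverLayer(Module):         # low level: numerical values
    inputs = ("compartments",)
    outputs = ("voltages",)
    def run(self, data):
        return {"voltages": [-65.0 for _ in data["compartments"]]}

def federate(modules):
    """Run modules in order, checking each one's declared inputs."""
    state = {}
    for m in modules:
        missing = [k for k in m.inputs if k not in state]
        if missing:
            raise ValueError(f"{type(m).__name__} missing {missing}")
        state.update(m.run(state))
    return state

print(federate([ModelLayer(), SolverLayer()]))
```

Documenting each component by its inputs and outputs, as in the `inputs`/`outputs` declarations above, is what makes stand-alone testing and easy replacement possible.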

5.
6.
In recent years, traceability systems have been developed as effective tools for improving the transparency of supply chains, thereby guaranteeing the quality and safety of food products. In this study, we propose a cattle/beef supply chain traceability model and a traceability system based on radio frequency identification (RFID) technology and the EPCglobal network. First, the transformations of traceability units are defined and analyzed throughout the cattle/beef chain. Second, we describe in detail the internal and external traceability information acquisition, transformation, and transmission processes throughout the beef supply chain, and explain a methodology for modeling traceability information using the electronic product code information service (EPCIS) framework. The traceability system was then implemented based on the Fosstrak and FreePastry software packages, with animal ear tag codes and electronic product codes (EPCs) used to identify traceability units. Finally, a cattle/beef supply chain comprising a breeding business, a slaughter and processing business, a distribution business, and a sales outlet was used as a case study to evaluate the beef supply chain traceability system. The results demonstrate that the major advantages of the traceability system are effective information sharing among businesses and gapless traceability of the cattle/beef supply chain.
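EPCIS is an XML/web-service standard and the described system builds on Fosstrak's Java implementation; the sketch below merely mimics the capture-and-query pattern over EPC-keyed events, with field names simplified from EPCIS concepts:

```python
# Simplified EPCIS-style event capture (field names follow EPCIS concepts;
# the real Fosstrak-based system uses the full EPCIS interfaces).
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ObjectEvent:
    epcs: list            # EPCs of the traceability units observed
    biz_step: str         # e.g. "slaughtering", "shipping"
    read_point: str       # where the RFID reader saw the tags
    event_time: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

EVENTS = []

def capture(event):
    EVENTS.append(event)

def trace(epc):
    """Return every recorded event mentioning the given EPC."""
    return [e for e in EVENTS if epc in e.epcs]

capture(ObjectEvent(["urn:epc:id:sgtin:0614141.107346.2017"],
                    "slaughtering", "plant_01"))
capture(ObjectEvent(["urn:epc:id:sgtin:0614141.107346.2017"],
                    "shipping", "distribution_center_03"))
for e in trace("urn:epc:id:sgtin:0614141.107346.2017"):
    print(e.biz_step, e.read_point, e.event_time.isoformat())
```

Gapless traceability follows from every station capturing such events for the same EPC, so that a single query reconstructs the unit's full history.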

7.
In the supply chain of the automotive industry, where the procurement ratio of parts from partners is very high, the trustworthiness of partners should be considered during supply chain optimization. In this paper, to deal with the trust issue related to collaboration and to reduce the computational load in production planning, we develop a collaborative fractal-based supply chain management framework for the automotive industry. In our framework, the relationships between the participants of a supply chain are modeled as a fractal. Each fractal has a goal model and generates a production plan for its participants based on that goal model. The goal model of each fractal is developed from an operational perspective to consider the trust value of participants during production planning. A fuzzy trust evaluation model is used to express trust as a numerical value. To validate the developed framework in the automotive industry, simulations were conducted. The results of the simulations indicate that our framework can be useful in generating precise production plans.
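The paper's fuzzy rule base is not reproduced in the abstract; as a minimal sketch of the general idea, linguistic trust ratings can be mapped to a crisp value with triangular membership peaks and a weighted-centroid defuzzification (levels and weights below are invented):

```python
# Illustrative fuzzy trust score (not the paper's actual rule base):
# linguistic ratings become a crisp trust value via the peaks of
# triangular membership functions and a weighted-centroid defuzzification.

# Peak of each triangular membership function on a 0..1 trust scale.
TRUST_LEVELS = {"low": 0.1, "medium": 0.5, "high": 0.9}

def fuzzy_trust(ratings):
    """ratings: list of (linguistic_level, membership_degree) pairs."""
    num = sum(TRUST_LEVELS[lvl] * deg for lvl, deg in ratings)
    den = sum(deg for _, deg in ratings)
    return num / den if den else 0.0

# A partner rated mostly "high" on delivery, partly "medium" on quality:
score = fuzzy_trust([("high", 0.8), ("medium", 0.4), ("low", 0.1)])
print(round(score, 3))  # 0.715 -> a numeric trust value usable in planning
```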

8.
A mechanism is proposed for mRNA translocation during peptide chain synthesis, which involves a reciprocal rotation of the two ribosomal components on each other in such a way as to propel the mRNA along a string of ribosomes on an endoplasmic reticulum membrane, or to propel a single ribosome along an mRNA molecule in a membrane-free system. Release of tRNAs is assumed to occur from only one of the two ribosomal components at a time, so that there is no danger of losing the tRNAs during peptide chain synthesis. The mechanism would make the ribosome an active, dynamic structure in addition to being a platform with catalytic activity that holds the components of peptide chain synthesis in juxtaposition.

9.
In this article, we present a neurologically motivated computational architecture for visual information processing. The architecture focuses on multiple strategies: hierarchical processing, parallel and concurrent processing, and modularity. It is modular and expandable in both hardware and software, so that it can also cope with multisensory integration, making it an ideal tool for validating and applying computational neuroscience models in real time under real-world conditions. We apply our architecture in real time to validate a long-standing biologically inspired visual object recognition model, HMAX. In this context, the overall aim is to supply a humanoid robot with the ability to perceive and understand its environment, with a focus on the active aspect of real-time spatiotemporal visual processing. We show that our approach is capable of simulating information processing in the visual cortex in real time, and that our entropy-adaptive modification of HMAX achieves higher efficiency and classification performance than the standard model (up to approximately +6%).
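The abstract does not spell out the entropy adaptation; the sketch below shows only the generic building block such a modification could rest on, ranking image patches by Shannon entropy so that low-information regions can be skipped:

```python
# Generic entropy-based patch selection (illustrative only; not the
# authors' modification of HMAX): rank patches by Shannon entropy and
# keep the most informative ones for further processing.
import numpy as np

def patch_entropy(patch, bins=16):
    hist, _ = np.histogram(patch, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
image = rng.random((64, 64))
image[:32, :32] = 0.5            # a flat, low-entropy quadrant

patches = [(r, c, image[r:r+16, c:c+16])
           for r in range(0, 64, 16) for c in range(0, 64, 16)]
ranked = sorted(patches, key=lambda t: patch_entropy(t[2]), reverse=True)
keep = ranked[:8]                # process only the most informative half
print([(r, c) for r, c, _ in keep])  # the flat quadrant is excluded
```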

10.
Within industrial ecology, there is a substantial community focusing on life cycle assessment (LCA) and corresponding tools and methods. Within the field of supply chain management, an increasing community is converging around sustainable supply chains. These two communities study the same underlying systems, but bring different perspectives to bear. We review seven issues that arise at this intersection of LCA and supply chain management, with the aim of illustrating how both communities can enrich each other by closer interaction. We conclude with some suggestions for how the two communities can further collaborate.

11.
Phacilitate held a Special Interest Group workshop event in Edinburgh, UK, in May 2017. The event brought together leading stakeholders in the cell therapy bioprocessing field to identify present and future challenges and propose potential solutions to automation in cell therapy bioprocessing. Here, we review and summarize discussions from the event. Deep biological understanding of a product, its mechanism of action and indication pathogenesis underpin many factors relating to bioprocessing and automation. To fully exploit the opportunities of bioprocess automation, therapeutics developers must closely consider whether an automation strategy is applicable, how to design an 'automatable' bioprocess and how to implement process modifications with minimal disruption. Major decisions around bioprocess automation strategy should involve all relevant stakeholders; communication between technical and business strategy decision-makers is of particular importance. Developers should leverage automation to implement in-process testing, in turn applicable to process optimization, quality assurance (QA)/quality control (QC), batch failure control, adaptive manufacturing and regulatory demands, but a lack of precedent and technical opportunities can complicate such efforts. Sparse standardization across product characterization, hardware components and software platforms is perceived to complicate efforts to implement automation. The use of advanced algorithmic approaches such as machine learning may have application to bioprocess and supply chain optimization. Automation can substantially de-risk the wider supply chain, including tracking and traceability, cryopreservation, thawing and logistics. The regulatory implications of automation are currently unclear because few hardware options exist and novel solutions require case-by-case validation, but automation can present attractive regulatory incentives.

12.
Reverse engineering is defined as the process whereby the internal structures and dynamics of a given system are inferred and analyzed from external observations and relevant knowledge. The first part of this paper surveys existing techniques for biosystem reverse engineering. Network structure inference techniques such as Correlation Matrix Construction (CMC), Boolean network and Bayesian network-based methods are explained. After the numeric and logical simulation techniques are briefly described, several representative working software tools are introduced. The second part presents our component-based software architecture for biosystem reverse engineering. After three design principles are established, a loosely coupled federation architecture consisting of 11 autonomous components is proposed, along with their respective functions.
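As a minimal sketch of the CMC idea mentioned above (synthetic data, an arbitrary threshold), pairwise correlations between expression profiles are computed and strong ones kept as putative network edges:

```python
# Sketch of Correlation Matrix Construction (CMC)-style inference: compute
# pairwise Pearson correlations between gene profiles and keep strong ones
# as putative network edges. Data and threshold here are illustrative.
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 20)
expr = np.vstack([
    np.sin(2 * np.pi * t),                                   # gene 0
    np.sin(2 * np.pi * t) + 0.1 * rng.standard_normal(20),   # gene 1 ~ gene 0
    rng.standard_normal(20),                                 # gene 2: unrelated
])

corr = np.corrcoef(expr)                 # genes x genes correlation matrix
threshold = 0.8
edges = [(i, j, round(float(corr[i, j]), 2))
         for i in range(len(corr)) for j in range(i + 1, len(corr))
         if abs(corr[i, j]) >= threshold]
print(edges)  # expect a single edge, between genes 0 and 1
```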

13.
This study formulates a model to maximize the profit of a lignocellulosic biofuel supply chain ranging from feedstock suppliers to biofuel customers. The model deals with a time-staged, multi-commodity production/distribution system, prescribing facility locations and capacities, technologies, and material flows. A case study based on a region in Central Texas demonstrates application of the proposed model to design the most profitable biofuel supply chain under each of several scenarios. A sensitivity analysis identifies ethanol (ETOH) price as the most significant factor in the economic viability of a lignocellulosic biofuel supply chain.
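The paper's model is a time-staged, multi-commodity formulation; as a one-period toy sketch of the same profit-maximization idea (all numbers invented), a linear program can be solved with scipy:

```python
# Toy profit-maximization LP in the spirit of such models (the real
# formulation is time-staged and multi-commodity; this is a one-period
# sketch with made-up numbers). scipy's linprog minimizes, so we negate.
from scipy.optimize import linprog

# Decision variables: gallons of ethanol from feedstocks A and B.
profit_per_gal = [1.2, 0.9]        # revenue minus feedstock/transport cost
c = [-p for p in profit_per_gal]   # negate for minimization

A_ub = [
    [1.0, 1.0],    # refinery capacity: total gallons <= 100,000
    [2.0, 1.0],    # biomass supply: A uses twice the biomass per gallon
]
b_ub = [100_000, 150_000]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)],
              method="highs")
print(res.x, -res.fun)  # optimal plan (50000, 50000) and profit 105000
```

Raising or lowering `profit_per_gal` and re-solving is the one-period analogue of the paper's sensitivity analysis on ethanol price.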

14.
This paper presents the design, implementation, and evaluation of an extensible, scalable, and distributed heterogeneous-cluster-based programmable router, called DHCR (Distributed Heterogeneous Cluster based Router), capable of supporting and deploying network services at run time. DHCR is a software IP router relying on a heterogeneous cluster of separate computers with different hardware and software capabilities, running different operating systems and interconnected through a high-speed network. DHCR ensures dynamic deployment of services and distributed control of router components (forwarding and routing elements) over heterogeneous system environments. It combines the IETF ForCES (Forwarding and Control Element Separation) architecture with software component technologies to meet the requirements of next-generation software routers. To ensure reliable and transparent communication between separated, decentralized, and heterogeneous router components, CORBA-based middleware is used for DHCR's internal communication. The paper also explores the use of the CORBA Component Model (CCM) to design and implement a modular, distributed, and heterogeneous forwarding path for the DHCR architecture. The CCM-based forwarding plane ensures dynamic reconfiguration of the data-path topology needed for low-level service deployment. Results on achievable performance using the proposed DHCR router are reported.
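As a toy illustration of the ForCES-style separation DHCR builds on (simple method calls instead of CORBA; all names invented), a control element pushes routes to forwarding elements that resolve next hops independently:

```python
# Hedged sketch of ForCES-style separation (toy message passing, not
# CORBA): a control element pushes routes to forwarding elements, which
# then resolve next hops independently by longest-prefix match.
import ipaddress

class ForwardingElement:
    def __init__(self, name):
        self.name = name
        self.table = []                      # (network, next_hop)

    def configure(self, network, next_hop):  # invoked by the control element
        self.table.append((ipaddress.ip_network(network), next_hop))
        self.table.sort(key=lambda e: e[0].prefixlen, reverse=True)

    def forward(self, dst):
        addr = ipaddress.ip_address(dst)
        for network, next_hop in self.table:  # longest-prefix match
            if addr in network:
                return next_hop
        return None

class ControlElement:
    def __init__(self, fes):
        self.fes = fes

    def push_route(self, network, next_hop):  # distributed control
        for fe in self.fes:
            fe.configure(network, next_hop)

fe1, fe2 = ForwardingElement("fe1"), ForwardingElement("fe2")
ce = ControlElement([fe1, fe2])
ce.push_route("10.0.0.0/8", "if0")
ce.push_route("10.1.0.0/16", "if1")
print(fe1.forward("10.1.2.3"))  # if1 (the more specific prefix wins)
```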

15.
Liu Yishu, Zheng Jiangbo. Cluster Computing 2022;25(3):2271-2280

With the rapid development of science and technology and the integration of the global economy, competition in the global market has become increasingly fierce, and competition between enterprises has gradually become competition between supply chains. Applying Internet technology to the agricultural industry chain has wide potential for development, and the Internet of Things (IoT) and big data have become important pillars of smart agriculture. Internet technology improves the visibility and transparency of the agricultural supply chain, reduces supply chain uncertainty, and improves smart management of the supply chain. The research, development, and implementation of 5G IoT technology bring new development opportunities for traditional supply chain management. To understand how production and daily life change in a 5G IoT environment, this paper applies case analysis and bibliometric methods to documents collected from CNKI, Wanfang, SSCI, and other databases, and analyses different fields using GIS spatial analysis. The results show that with 5G IoT the supply chain can provide more information for developing an intelligent supply chain, and supply chain efficiency improves significantly, by approximately 25%. Taking automotive supply as an example, with the help of the IoT system the accuracy of delivery and the speed of logistics transport improve significantly; in the smart model the coefficient can reach 0.813, which shows that smart management of the supply chain under 5G IoT can play a major role.

16.
MOTIVATION: The presentation of genomics data in a perspicuous visual format is critical for its rapid interpretation and validation. Relatively few public database developers have the resources to implement sophisticated front-end user interfaces themselves. Accordingly, these developers would benefit from a reusable toolkit of user interface and data visualization components. RESULTS: We have designed the bioWidget toolkit as a set of JavaBean components. It includes a wide array of user interface components and defines an architecture for assembling applications. The toolkit is founded on established software engineering design patterns and principles, including componentry, Model-View-Controller, factored models, and schema neutrality. As a proof of concept, we have used the bioWidget toolkit to create three extensible applications: AnnotView, BlastView and AlignView.
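bioWidget itself is a set of Java components; purely to illustrate the Model-View-Controller and factored-model principles it is founded on (not bioWidget's API), here is a minimal hypothetical sketch in which two views stay in sync with one shared sequence model:

```python
# Minimal MVC / factored-model sketch (hypothetical, not bioWidget's API):
# one sequence model, several views kept in sync by change notifications.
class SequenceModel:
    def __init__(self, residues):
        self.residues = residues
        self.selection = None
        self._listeners = []

    def add_listener(self, cb):
        self._listeners.append(cb)

    def select(self, start, end):       # "controller" entry point
        self.selection = (start, end)
        for cb in self._listeners:
            cb(self)

def text_view(model):
    s, e = model.selection
    print("text view:", model.residues[s:e])

def coordinate_view(model):
    print("coordinate view: residues", model.selection)

model = SequenceModel("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ")
model.add_listener(text_view)
model.add_listener(coordinate_view)
model.select(3, 9)   # both views update from the same shared model
```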

17.
Confocal laser scanning microscopy (CLSM) is a powerful tool for the investigation of biofilms. Very few investigations have successfully quantified the concurrent distribution of more than two components within biofilms because: 1) selection of fluorescent dyes having minimal spectral overlap is complicated, and 2) quantification of multiple fluorochromes poses a multifactorial problem. Objectives: To report a methodology for quantifying and comparing concurrent 3-dimensional distributions of three cellular/extracellular components of biofilms grown on relevant substrates. Methods: The method consists of distinct, interconnected steps involving biofilm growth, staining, CLSM imaging, biofilm structural analysis and visualization, and statistical analysis of structural parameters. Biofilms of Streptococcus mutans (strain UA159) were grown for 48 hr on sterile specimens of Point 4 and TPH3 resin composites. Specimens were subsequently immersed for 60 sec in either Biotène PBF (BIO) or Listerine Total Care (LTO) mouthwashes, or water (control group; n=5/group). Biofilms were stained with fluorochromes for extracellular polymeric substances, proteins, and nucleic acids before imaging with CLSM. Biofilm structural parameters calculated using ISA3D image analysis software were biovolume and mean biofilm thickness. Mixed-model statistical analyses compared structural parameters between mouthwash and control groups (SAS software; α=0.05). Volocity software permitted visualization of 3D distributions of overlaid biofilm components (fluorochromes). Results: Mouthwash BIO produced biofilm structures that differed significantly from the control (p<0.05) on both resin composites, whereas LTO did not produce differences (p>0.05) on either product. Conclusions: This methodology efficiently and successfully quantified and compared concurrent 3D distributions of three major components within S. mutans biofilms on relevant substrates, thus overcoming two challenges to simultaneous assessment of biofilm components. The method can also be used to determine the efficacy of antibacterial/antifouling agents against multiple biofilm components, as shown using mouthwashes. Furthermore, it has broad application because it facilitates comparison of the 3D structure/architecture of biofilms in a variety of disciplines.

18.
This paper presents a novel networking architecture designed for communication-intensive parallel applications running on clusters of workstations (COWs) connected by high-speed networks. The architecture addresses what is considered one of the most important problems of cluster-based parallel computing: the inherent inability to scale the performance of communication software along with host CPU performance. The Virtual Communication Machine (VCM), resident on the network coprocessor, presents a scalable software solution by providing configurable communication functionality directly accessible at user level. The VCM architecture is configurable in that it enables the transfer to the VCM of selected communication-related functionality that is traditionally part of the application and/or the host kernel. Such transfers are beneficial when a significant reduction of the host CPU's load translates into a small increase in the coprocessor's load. The functionality implemented by the coprocessor is available at the application level as VCM instructions. Host CPU(s) and coprocessor interact through shared memory regions, thereby avoiding expensive CPU context switches. The host kernel is not involved in this interaction; it simply "connects" the application to the VCM during the initialization phase and is called infrequently to handle exceptional conditions. Protection is enforced by the VCM based on information supplied by the kernel. The VCM-based communication architecture admits low-cost and open implementations, as demonstrated by its current ATM-based implementation built from off-the-shelf hardware components and using standard AAL5 packets. The architecture makes it easy to implement communication software that exhibits negligible overheads on the host CPU(s) and offers latencies and bandwidths close to the hardware limits of the underlying network. These characteristics are due to the VCM's support for zero-copy messaging with gather/scatter capabilities and the VCM's direct access to any data structure in an application's address space. This paper describes two versions of an ATM-based VCM implementation, which differ in the way they use the memory on the network adapter. Their performance under heavy load is compared in the context of a synthetic client/server application. The same application is used to evaluate the scalability of the architecture to multiple VCM-based network interfaces per host. Parallel implementations of the Traveling Salesman Problem and of Georgia Tech Time Warp, an engine for discrete-event simulation, are used to demonstrate VCM functionality and the high performance of its implementation. The distributed- and shared-memory versions of these two applications exhibit comparable performance, despite the significant cost-performance advantage of the distributed-memory platform.
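As a minimal sketch of the zero-copy gather idea (plain Python buffers standing in for the shared memory regions a VCM would use), a message is described by (buffer, offset, length) descriptors and memoryview slices reference the application's data in place:

```python
# Sketch of zero-copy gather: a message is described by a list of
# (buffer, offset, length) descriptors, and memoryview slices expose
# those regions without copying the payload bytes.
header = bytearray(b"\x00\x01HDR")
payload = bytearray(b"application data living in user space")

# Gather list referencing application buffers in place:
gather = [(header, 0, len(header)), (payload, 12, 4)]

def iov(gather_list):
    """Yield zero-copy views over each described region."""
    for buf, off, length in gather_list:
        yield memoryview(buf)[off:off + length]

# A real VCM would hand these regions to the coprocessor via shared
# memory; here we just show that no bytes were duplicated.
views = list(iov(gather))
payload[12:16] = b"DATA"            # mutate the source buffer...
print(bytes(views[1]))              # ...the view sees it: b'DATA'
```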

19.
The new economic challenges and recent trends in globalization have made it very difficult for Canadian forest products companies to improve their financial position without the coordinated involvement of the entire company, including its supply chain (distributed facilities, company offices, industrial customers, and distributors). Such a new level of efficiency involves facilities and offices spread around the world as well as customers. One consequence of this new reality is that forest products companies now face the need to re-engineer their organizational processes and business practices with their partners. To do this they must adopt new technologies to support the coordination of their planning and control efforts in a customer-centered environment. This paper first proposes a generic software architecture for the development of an experimentation environment to design and test distributed advanced planning and scheduling systems. This architecture enables the combination of agent-based technology and operations research-based tools, first to take advantage of the ability of agent technology to integrate distributed decision problems, and second to take advantage of the ability of operations research to develop and exploit specific normative decision models. Next, the paper describes how this architecture has been configured into an advanced planning and scheduling tool for the lumber industry. Finally, we present how an application of this advanced planning tool is currently being validated and tested in a real manufacturing setting.

20.
Agility can be viewed as the need to encourage enterprise-wide integration of flexible and core-competent resources so as to offer value-added products and services in a volatile competitive environment. Since flexibility is considered a property that provides change capabilities of different enterprise-wide resources and processes in time and cost dimensions, supply chain flexibility can be considered a composite state of enterprise-wide resources to meet agility needs. Enterprise modeling frameworks depicting these composite flexibility states are difficult to model because of the complex and tacit interrelationships among system parameters, and also because agility thrives on many business objectives. In view of this, the modeling framework presented in this paper is based on the analytic network process (ANP), since this methodology can accommodate the complex and tacit interrelationships among factors affecting enterprise agility. The modeling framework forms a three-level network with the goal of attaining agility from the perspective of market, product, and customer as the actors. The goal depends on substrategies that address the characteristics of the three actors. Each of these substrategies further depends on the manufacturing, logistics, sourcing, and information technology (IT) flexibility elements of the enterprise supply chain (SC). The research highlights that, under different environmental conditions, enterprises require synergy among appropriate supply chain flexibilities for practising agility. In the present research, the ANP modeling software tool Super Decisions has been used for relative prioritization of the supply chain flexibilities. We demonstrate through sensitivity analysis that dynamic conditions do require adjustments in the enterprise-wide flexibility spectrum.
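As a numerical sketch of how ANP-style limit priorities emerge (the supermatrix entries below are invented, not the paper's judgments), raising a column-stochastic supermatrix to a high power makes its columns converge to the relative priorities of the flexibility elements:

```python
# Minimal ANP-style prioritization sketch (illustrative weights, not the
# paper's model): raise a column-stochastic supermatrix to a high power so
# its columns converge to the limit priorities of the flexibility elements.
import numpy as np

# Columns are stochastic: entry [i, j] = influence of element i on j.
# Order: manufacturing, logistics, sourcing, IT flexibility.
W = np.array([
    [0.40, 0.20, 0.25, 0.10],
    [0.30, 0.40, 0.25, 0.20],
    [0.20, 0.20, 0.30, 0.30],
    [0.10, 0.20, 0.20, 0.40],
])

limit = np.linalg.matrix_power(W, 64)   # converges for a primitive matrix
priorities = limit[:, 0]                # any column works at the limit
for name, p in zip(["manufacturing", "logistics", "sourcing", "IT"],
                   priorities):
    print(f"{name:13s} {p:.3f}")
```

Sensitivity analysis of the kind the paper reports amounts to perturbing entries of W and observing how the limit priorities shift.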
