Similar Literature
20 similar documents found (search time: 15 ms)
1.
Discrete dynamical systems are used to model various realistic systems in network science, from social unrest in human populations to regulation in biological networks. A common approach is to model the agents of a system as vertices of a graph, and the pairwise interactions between agents as edges. Agents are in one of a finite set of states at each discrete time step and are assigned functions that describe how their states change based on neighborhood relations. Full characterization of state transitions of one system can give insights into fundamental behaviors of other dynamical systems. In this paper, we describe a discrete graph dynamical systems (GDSs) application called GDSCalc for computing and characterizing system dynamics. It is an open access system that is used through a web interface. We provide an overview of GDS theory. This theory is the basis of the web application; i.e., an understanding of GDS provides an understanding of the software features, while abstracting away implementation details. We present a set of illustrative examples to demonstrate its use in education and research. Finally, we compare GDSCalc with other discrete dynamical system software tools. Our perspective is that no single software tool will perform all computations that may be required by all users; tools typically have particular features that are more suitable for some tasks. We situate GDSCalc within this space of software tools.
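As a hedged illustration of the kind of system GDSCalc characterizes (not code from the tool itself, which is used through its web interface), the following Python sketch runs a synchronous Boolean graph dynamical system on a toy 4-cycle and enumerates its full phase space; the NOR vertex function is an arbitrary choice.

# Minimal sketch of a synchronous Boolean graph dynamical system (GDS).
# Illustrative only; GDSCalc itself is used through its web interface.

from itertools import product

# Undirected graph as an adjacency list (a 4-cycle as a toy example).
graph = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}

def local_nor(states, v, neighbors):
    """Example vertex function: NOR of the closed neighborhood."""
    return 0 if states[v] or any(states[u] for u in neighbors) else 1

def step(states):
    """Synchronous update: every vertex applies its function at once."""
    return tuple(local_nor(states, v, graph[v]) for v in sorted(graph))

def phase_space():
    """Map every global state to its successor (full characterization)."""
    n = len(graph)
    return {s: step(s) for s in product((0, 1), repeat=n)}

if __name__ == "__main__":
    for state, successor in phase_space().items():
        print(state, "->", successor)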

2.
MOTIVATION: The importance of studying biology at the system level has been well recognized, yet there is no well-defined process or consistent methodology to integrate and represent biological information at this level. To overcome this hurdle, a blending of disciplines such as computer science and biology is necessary. RESULTS: By applying an adapted, sequential software engineering process, a complex biological system (severe acute respiratory syndrome coronavirus viral infection) has been reverse-engineered and represented as an object-oriented software system. The scalability of this object-oriented software engineering approach indicates that we can apply this technology to the integration of large complex biological systems. AVAILABILITY: A navigable web-based version of the system is freely available at http://people.musc.edu/~zhengw/SARS/Software-Process.htm
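The abstract describes the object-oriented representation only at a high level; the Python sketch below is a hypothetical illustration of what modeling biological entities as interacting objects can look like. The class names and the receptor-matching rule are invented and are not taken from the system at the cited URL.

# Hypothetical illustration of modeling biological entities as objects.
# Class and attribute names are invented; they are not from the cited system.

class Virus:
    def __init__(self, name):
        self.name = name

    def attach(self, cell):
        """Attachment succeeds only if the cell carries a matching receptor."""
        return self.name in cell.receptors

class HostCell:
    def __init__(self, receptors):
        self.receptors = set(receptors)
        self.infected = False

    def infect(self, virus):
        if virus.attach(self):
            self.infected = True
        return self.infected

if __name__ == "__main__":
    cell = HostCell(receptors={"SARS-CoV"})
    print(cell.infect(Virus("SARS-CoV")))   # True: receptor match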

3.
4.
5.
Information about the state of the system is of paramount importance in determining the dynamics underlying manufacturing systems. In this paper, we present an adaptive scheduling policy for dynamic manufacturing system scheduling using information obtained from snapshots of the system at various points in time. Specifically, the framework presented allows for information-based dynamic scheduling, where information collected about the system is used to (1) adjust appropriate parameters in the system and (2) search or optimize using genetic algorithms. The main feature of this policy is that it tailors the dispatching rule to be used at a given point in time to the prevailing state of the system. Experimental studies indicate the superiority of the suggested approach over the alternative approach of repeatedly applying a single dispatching rule, for randomly generated test problems as well as a real system. In particular, its relative performance improves further when there are frequent disruptions and when disruptions are caused by the introduction of tight due date jobs and machine breakdown, two of the most common sources of disruption in most manufacturing systems. From an operational perspective, the most important characteristics of the pattern-directed scheduling approach are its ability to incorporate the idiosyncratic characteristics of the given system into the dispatching rule selection process and its ability to refine itself incrementally on a continual basis by taking new system parameters into account.
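The following Python sketch illustrates the core idea of tailoring the dispatching rule to the prevailing system state; the two rules, the due-date-tightness measure, and the threshold are illustrative assumptions rather than the paper's actual policy or its genetic-algorithm search.

# Hedged sketch: pick a dispatching rule based on a snapshot of the shop state.
# The rules and the selection threshold are illustrative, not the paper's policy.

def spt(jobs):   # Shortest Processing Time
    return sorted(jobs, key=lambda j: j["proc_time"])

def edd(jobs):   # Earliest Due Date
    return sorted(jobs, key=lambda j: j["due_date"])

def select_rule(snapshot):
    """Use due-date tightness observed in the snapshot to choose a rule."""
    slack = [j["due_date"] - snapshot["now"] - j["proc_time"]
             for j in snapshot["queue"]]
    tight = sum(s < 0 for s in slack) / max(len(slack), 1)
    return edd if tight > 0.3 else spt   # threshold is an assumption

if __name__ == "__main__":
    snapshot = {"now": 0, "queue": [
        {"id": 1, "proc_time": 5, "due_date": 4},
        {"id": 2, "proc_time": 2, "due_date": 20},
        {"id": 3, "proc_time": 7, "due_date": 6},
    ]}
    rule = select_rule(snapshot)
    print([j["id"] for j in rule(snapshot["queue"])])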

6.
The complexity and diversity of manufacturing software and the need to adapt this software to the frequent changes in the production requirements necessitate the use of a systematic approach to developing this software. The software life-cycle model (Royce, 1970) that consists of specifying the requirements of a software system, designing, implementing, testing, and evolving this software can be followed when developing large portions of manufacturing software. However, the presence of hardware devices in these systems and the high costs of acquiring and operating hardware devices further complicate the manufacturing software development process and require that the functionality of this software be extended to incorporate simulation and prototyping. This paper reviews recent methods for planning, scheduling, simulating, and monitoring the operation of manufacturing systems. A synopsis of the approaches to designing and implementing the real-time control software of these systems is presented. It is concluded that current methodologies support, in a very restricted sense, these planning, scheduling, and monitoring activities, and that enhanced performance can be achieved via an integrated approach.

7.
Microbiological risk assessment is an area of growing importance and significant potential, where the underlying science, software systems and databases are developing to the point of real and useful application. It is also an area where the developing science is posing as many questions as it is presenting answers. Key issues emerging from the day included: the need for more sophisticated management of uncertainty, which is much more relevant to microbiological risk analysis than to other applications; the need for global surveillance systems with better compatibility and appropriate peer review; considered assessment of the impact of new molecular-based diagnostic and screening techniques; the explosion of relevant information available, particularly on the Internet, which makes computer literacy essential both to professionals and 'laymen'; and the appearance of software systems which are either tailored for microbiological application or have the potential for this use. The closely associated issues of risk communication and perception also emerged as being vital to the effective application of microbiological risk management to public health issues. Overall, the majority of participants considered the event to have been valuable and stimulating and thought that it would lead to improvements in the use of microbiological risk assessment. The Advisory Committee on Dangerous Pathogens is committed to taking this topic forward and will be both taking up the messages from this seminar and encouraging development of suitable databases and software systems.

8.
9.
The evolution of biomedical technology has led to an extraordinary use of medical devices in health care delivery. During the last decade, clinical engineering departments (CEDs) turned toward computerization and application of specific software systems for medical equipment management in order to improve their services and monitor outcomes. Recently, much emphasis has been given to patient safety. Through its Medical Device Directives, the European Union has required all member nations to use a vigilance system to prevent the recurrence of adverse events that could lead to injuries or death of patients or personnel as a result of equipment malfunction or improper use. The World Health Organization also has made this issue a high priority and has prepared a number of actions and recommendations. In the present work, a new integrated, Windows-oriented system is proposed, addressing all tasks of CEDs but also offering a global approach to their management needs, including vigilance. The system architecture is based on a star model, consisting of a central core module and peripheral units. Its development has been based on the integration of 3 software modules, each one addressing specific predefined tasks. The main features of this system include equipment acquisition and replacement management, inventory archiving and monitoring, follow up on scheduled maintenance, corrective maintenance, user training, data analysis, and reports. It also incorporates vigilance monitoring and information exchange for adverse events, together with a specific application for quality-control procedures. The system offers clinical engineers the ability to monitor and evaluate the quality and cost-effectiveness of the service provided by means of quality and cost indicators. Particular emphasis has been placed on the use of harmonized standards with regard to medical device nomenclature and classification. The system's practical applications have been demonstrated through a pilot evaluation trial.

10.
Research progress on dynamic evolution models of urban ecosystems
郁亚娟, 郭怀成, 刘永, 黄凯, 王真. 生态学报 (Acta Ecologica Sinica), 2007, 27(6): 2603-2614
Starting from systems analysis, this paper reviews the development history of dynamic evolution models of urban ecosystems, the methods and steps of model building, software development approaches, and currently available modeling software. The modeling process is summarized in six major steps: model definition, simulation, implementation, validation, analysis, and application. The main methods currently used at home and abroad for dynamic evolution modeling of urban ecosystems include mathematical-model-based approaches, ecological cybernetics and sensitivity models, system dynamics models, and multi-objective programming. Software already developed for dynamic simulation of urban ecosystems falls into two categories: specialized models based on land use and transportation planning, and general software based on system dynamics and sensitivity models. Commonly used urban evolution modeling software is summarized, and the research objects and application scope of the models are discussed. The sources of uncertainty in building dynamic evolution models of urban ecosystems are analyzed, and it is pointed out that polarization toward both the macro and micro scales is one development trend, while integration with new methods such as artificial intelligence and geographic information systems is another. The prospects for developing such models lie in the qualitative and quantitative analysis of uncertainty, and the coupling and integration of multiple models is an inevitable trend.

11.
Increasingly, applications need to be able to self-reconfigure in response to changing requirements and environmental conditions. Autonomic computing has been proposed as a means for automating software maintenance tasks. As the complexity of adaptive and autonomic systems grows, designing and managing the set of reconfiguration rules becomes increasingly challenging and may produce inconsistencies. This paper proposes an approach to leverage genetic algorithms in the decision-making process of an autonomic system. This approach enables a system to dynamically evolve target reconfigurations at run time that balance tradeoffs between functional and non-functional requirements in response to changing requirements and environmental conditions. A key feature of this approach is incorporating system and environmental monitoring information into the genetic algorithm such that specific changes in the environment automatically drive the evolutionary process towards new viable solutions. We have applied this genetic-algorithm based approach to the dynamic reconfiguration of a collection of remote data mirrors, demonstrating an effective decision-making method for diffusing data and minimizing operational costs while maximizing data reliability and network performance, even in the presence of link failures.
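A minimal Python sketch of the kind of genetic-algorithm decision making described above, assuming a hypothetical encoding (one bit per data mirror) and invented weights; the point is that monitoring data enters the fitness function, so a change in the environment shifts which reconfigurations score well.

# Hedged sketch of a GA-style fitness function for choosing a reconfiguration
# of remote data mirrors; the encoding and weights are illustrative assumptions.

import random

N_MIRRORS = 6

def fitness(genome, env):
    """genome[i] = 1 if mirror i receives a replica; env holds monitoring data."""
    active = [i for i, g in enumerate(genome) if g]
    if not active:
        return 0.0
    cost = sum(env["link_cost"][i] for i in active)
    reliability = 1.0 - 0.5 ** len(active)          # more copies, fewer losses
    bandwidth = min(env["bandwidth"][i] for i in active)
    # Monitoring data (costs, bandwidths) steers the search as conditions change.
    return 2.0 * reliability + 0.01 * bandwidth - 0.1 * cost

def evolve(env, pop_size=30, generations=50):
    pop = [[random.randint(0, 1) for _ in range(N_MIRRORS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda g: fitness(g, env), reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_MIRRORS)
            child = a[:cut] + b[cut:]
            if random.random() < 0.1:                 # mutation
                i = random.randrange(N_MIRRORS)
                child[i] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=lambda g: fitness(g, env))

if __name__ == "__main__":
    env = {"link_cost": [3, 1, 4, 1, 5, 9], "bandwidth": [80, 60, 90, 70, 50, 40]}
    print(evolve(env))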

12.
Background: The laboratory testing process consists of five phases that constitute the total testing process framework. Activities in the laboratory process, including testing itself, are error-prone and affect the use of laboratory information systems. This study seeks to identify error factors related to system use and to the first and last phases of the laboratory testing process, using a proposed framework known as total testing process-laboratory information systems. Methods: We conducted a qualitative case study evaluation in two private hospitals and a medical laboratory. We collected data using interviews, observations, and document analysis methods involving physicians, nurses, an information technology officer, and the laboratory staff. We employed the proposed framework and Lean problem-solving tools, namely Value Stream Mapping and A3, for data analysis. Results: Errors in laboratory information systems and the laboratory testing process were attributed to failure to fulfill user requirements, poor cooperation between the information technology unit and the laboratory, inconsistency of software design in system integration, errors during inter-system data transmission, and lack of motivation in system use. The error factors are related to system development elements, namely latent failures, that considerably affected information quality and system use. Errors in system development were also attributed to poor service quality. Conclusions: The complex laboratory testing process and laboratory information systems require rigorous evaluation to minimize errors and ensure patient safety. The proposed framework and the Lean approach are applicable for evaluating the laboratory testing process and laboratory information systems in a rigorous, comprehensive, and structured manner.

13.
Recent advances in processor, networking and software technologies have made distributed computing a reality in today's world. Distributed systems offer many advantages, ranging from a higher performance to the effective utilization of physically dispersed resources. Many diverse application domains can benefit by exploiting principles of distributed computing. Information filtering is one such application domain. In this article, we present a design of a homogeneous distributed multi-agent information filtering system, called D-SIFTER. D-SIFTER is based on the language-dependent model of Java RMI. The detailed design process and various experiments carried out using D-SIFTER are also described. The results indicate that the distributed inter-agent collaboration improves the overall filtering performance.
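The sketch below illustrates inter-agent collaboration in information filtering in a single Python process; D-SIFTER itself distributes its agents over Java RMI, which is not reproduced here, and the keyword-profile agents are an invented example.

# Single-process Python sketch of inter-agent collaboration in information
# filtering; D-SIFTER itself is distributed over Java RMI, which is not shown here.

def keyword_agent(profile):
    """Each agent scores a document against its own keyword profile."""
    def score(doc):
        words = doc.lower().split()
        return sum(words.count(k) for k in profile) / max(len(words), 1)
    return score

def collaborate(agents, doc):
    """Merge the agents' individual judgments into one filtering decision."""
    scores = [agent(doc) for agent in agents]
    return sum(scores) / len(scores)

if __name__ == "__main__":
    agents = [keyword_agent({"neural", "network"}),
              keyword_agent({"distributed", "filtering"})]
    doc = "A distributed neural network approach to information filtering"
    print(round(collaborate(agents, doc), 3))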

14.
Recent neurophysiological research has begun to reveal that neurons encode information in the timing of spikes. Spiking neural network simulations are a flexible and powerful method for investigating the behaviour of neuronal systems. Software simulation of spiking neural networks cannot rapidly generate output spikes for large-scale networks. An alternative approach, hardware implementation of such systems, makes it possible to generate independent spikes precisely and to output spike waves simultaneously in real time, provided the spiking neural network can take full advantage of the inherent parallelism of hardware. In this work we introduce a configurable FPGA-oriented hardware platform for spiking neural network simulation. We aim to use this platform to combine the speed of dedicated hardware with the programmability of software, so that it might allow neuroscientists to put together sophisticated computational experiments with their own models. A feed-forward hierarchical network is developed as a case study to describe the operation of biological neural systems (such as orientation selectivity in the visual cortex) and computational models of such systems. This model demonstrates how a feed-forward neural network constructs the circuitry required for orientation selectivity and provides a platform for reaching a deeper understanding of the primate visual system. In the future, larger-scale models based on this framework can be used to replicate the actual architecture of the visual cortex, leading to more detailed predictions and insights into visual perception phenomena.
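The platform itself is FPGA hardware; as a hedged software illustration of the kind of spiking unit such platforms simulate, the following Python sketch implements a leaky integrate-and-fire neuron with illustrative parameters (not the paper's model).

# Hedged software sketch of a leaky integrate-and-fire (LIF) neuron, the kind of
# spiking unit such hardware platforms simulate; parameters are illustrative.

def lif_run(input_current, dt=1.0, tau=20.0, v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Return the list of time steps at which the neuron spikes."""
    v = v_rest
    spikes = []
    for t, i_ext in enumerate(input_current):
        # Leaky integration: the membrane decays toward rest and sums the input.
        v += dt * ((v_rest - v) / tau + i_ext)
        if v >= v_thresh:          # threshold crossing emits a spike
            spikes.append(t)
            v = v_reset            # reset after the spike
    return spikes

if __name__ == "__main__":
    constant_drive = [0.08] * 100
    print(lif_run(constant_drive))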

15.
To address the need for a clinically applicable intravital optical imaging system, we developed a new hardware and software framework. We demonstrate its utility by applying it to an endoscope-based white light and fluorescent imaging system. The capabilities include acquisition and visualization algorithms that perform registration, segmentation, and histogram-based autoexposure of two imaging channels (full-spectrum white light and near-infrared fluorescence), all in real time. Data are processed and saved as 12-bit files, matching the standards of clinical imaging. Dynamic range is further improved by the evaluation of flux as a quantitative parameter. The above features are demonstrated in a series of in vitro experiments, and the in vivo application is shown with the visualization of fluorescent-labeled vasculature of a mouse peritoneum. The approach may be applied to diverse systems, including handheld devices, fixed geometry intraoperative devices, catheter-based imaging, and multimodal systems.
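As a hedged illustration of histogram-based autoexposure (not the system's actual algorithm), the Python sketch below derives an exposure correction for a 12-bit channel by driving a chosen intensity percentile toward a target fraction of full scale; the percentile, target, and gain limits are assumptions.

# Hedged sketch of histogram-based autoexposure for a 12-bit channel: scale the
# exposure so the chosen intensity percentile sits near a target level.
# Percentile, target and gain limits are assumptions, not the paper's values.

def percentile(pixels, q):
    ordered = sorted(pixels)
    idx = min(int(q * (len(ordered) - 1)), len(ordered) - 1)
    return ordered[idx]

def autoexposure_gain(pixels, bit_depth=12, q=0.99, target_fraction=0.8,
                      min_gain=0.1, max_gain=10.0):
    """Return a multiplicative exposure correction for the next frame."""
    full_scale = (1 << bit_depth) - 1
    bright = max(percentile(pixels, q), 1)        # avoid division by zero
    gain = (target_fraction * full_scale) / bright
    return max(min_gain, min(gain, max_gain))

if __name__ == "__main__":
    # Simulated under-exposed frame: most pixels well below full scale.
    frame = [100] * 900 + [400] * 100
    print(round(autoexposure_gain(frame), 2))     # gain > 1 brightens the image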

16.
In this paper we consider an approach to neuronal transients that is predicated on the information they contain. This perspective is provided by information theory, in particular the principle of maximum information transfer. It is illustrated here in application to visually evoked neuronal transients. The receptive fields that ensue concur with those observed in the real brain, predicting, almost exactly, functional segregation of the sort seen in the visual system. This information-theoretic perspective can be reconciled with a selectionist stance by noting that a high mutual information between neuronal systems and the environment has, itself, adaptive value and will be subject to selective pressure, at any level one cares to consider.
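The quantity at the heart of the maximum information transfer principle is the mutual information between inputs and neuronal responses. The following generic Python sketch estimates mutual information from paired discrete samples; it illustrates the quantity being maximized, not the paper's derivation of receptive fields.

# Generic sketch: estimate mutual information (in bits) between two discrete
# variables from paired samples, e.g. a stimulus feature and a neuronal response.
# This illustrates the quantity being maximized; it is not the paper's method.

from collections import Counter
from math import log2

def mutual_information(xs, ys):
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px = Counter(xs)
    py = Counter(ys)
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        mi += p_joint * log2(p_joint / ((px[x] / n) * (py[y] / n)))
    return mi

if __name__ == "__main__":
    stimulus = [0, 0, 1, 1, 0, 1, 0, 1]
    response = [0, 0, 1, 1, 0, 1, 1, 1]   # mostly follows the stimulus
    print(round(mutual_information(stimulus, response), 3))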

17.
Computer-Integrated Manufacturing (CIM) systems may be classified as real-time systems. Hence, the applicability of methodologies that are developed for specifying, designing, implementing, testing, and evolving real-time software is investigated in this article. The paper highlights the activities of the software development process. Among these activities, a great emphasis is placed on automating the software requirements specification activity, and a set of formal models and languages for specifying these requirements is presented. Moreover, a synopsis of the real-time software methodologies that have been implemented by the academic and industrial communities is presented together with a critique of the strengths and weaknesses of these methodologies. The possible use of the real-time methodologies in developing the control software of efficient and dependable manufacturing systems is explored. In these systems, efficiency is achieved by increasing the level of concurrency of the operations of a plan, and by scheduling the execution of these operations with the intent of maximizing the utilization of the devices of these systems. On the other hand, dependability requires monitoring the operations of these systems. This monitoring activity facilitates the detection of faults that may occur when executing the scheduled operations of a plan, recovering from these faults, and, whenever feasible, resuming the original schedule of the system. The paper concludes that the set of surveyed methodologies may be used to develop the real-time control software of efficient and dependable manufacturing systems. However, an integrated approach to planning, scheduling, and monitoring the operations of these systems would significantly enhance their utility, and no such approach is supported by any of these methodologies.

18.
Aqueous two-phase systems (ATPSs) have great potential for use in the downstream processing of fermentation products. A major drawback of these systems, which has limited their application in industrial practice until now, is the consumption of large amounts of auxiliary materials such as polymers and salts. Making use of alternative auxiliaries can diminish this relatively large discharge. A possible approach is to use volatile salts formed by combinations of ammonia and carbon dioxide that can be recycled to the extraction system. As part of an ongoing research effort on ATPSs with volatile salts, this work aims at obtaining more information on the system boundaries, or operating conditions, of these systems in terms of phase behavior. The results show that the NH3/CO2 ratio is an important parameter that has a large influence on the system boundaries. For both PEG 2000 and PEG 4000 systems, this ratio has to be larger than about 1.75 to make liquid-liquid phase separation possible. The optimal ratio appears to be 2.0 for reasons of solution composition and absence of solid salt.

19.
A new methodology based on a metabolic control analysis (MCA) approach is developed for the optimization of continuous cascade bioreactor systems. A general framework is proposed for representing a cascade bioreactor system consisting of a large number of reactors as a single network. The kinetic and transport processes occurring in the system are represented as a reaction network with appropriate stoichiometry. Such a representation of the bioreactor system makes it amenable to the direct application of the MCA approach. The process sensitivity information is extracted using the MCA methodology in the form of flux and concentration control coefficients. This sensitivity information is shown to be a useful guide for choosing decision variables for optimization. A generalized optimization problem for the bioreactor is formulated in which the decision variables are the operating conditions and kinetic parameters. The gradient of the objective function to be maximized with respect to all decision variables is obtained in the form of response coefficients. This gradient information can be used in any gradient-based optimization algorithm. The efficiency of the proposed technique is demonstrated with two examples taken from the literature: the biotransformation of crotonobetaine and alcohol fermentation in a cascade bioreactor system.
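In MCA, the scaled sensitivities take the form R = (p / J) * (dJ/dp); the Python sketch below computes such a response coefficient by central finite differences for an arbitrary flux model. The Michaelis-Menten-style toy flux is an assumption used only to exercise the function, not the cascade bioreactor model of the paper.

# Generic sketch of MCA-style scaled sensitivities: the response coefficient
# R = (p / J) * dJ/dp computed by central finite differences. The toy steady-state
# flux model below is an assumption used only to exercise the function.

def response_coefficient(flux, params, name, rel_step=1e-4):
    """Scaled sensitivity of the flux J with respect to parameter `name`."""
    p0 = params[name]
    h = rel_step * abs(p0)
    up = dict(params, **{name: p0 + h})
    down = dict(params, **{name: p0 - h})
    dj_dp = (flux(up) - flux(down)) / (2.0 * h)
    return p0 * dj_dp / flux(params)

def toy_flux(p):
    """Toy Michaelis-Menten-like steady-state flux (illustrative only)."""
    return p["vmax"] * p["s"] / (p["km"] + p["s"])

if __name__ == "__main__":
    params = {"vmax": 10.0, "km": 2.0, "s": 1.0}
    for name in params:
        print(name, round(response_coefficient(toy_flux, params, name), 3))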

20.
The use of camera traps is now widespread and their importance in wildlife studies is well understood. Camera trap studies can produce millions of photographs, so there is a need for software to help manage photographs efficiently. In this paper, we describe a software system that was built to successfully manage a large behavioral camera trap study that produced more than a million photographs. We describe the software architecture and the design decisions that shaped the evolution of the program over the study's three-year period. The software system can automatically extract metadata from images and add customized metadata to the images in a standardized format. It can be installed as a standalone application on popular operating systems. It is minimalistic, scalable, and extendable, so that it can be used by small teams or individual researchers for a broad variety of camera trap studies.
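As a hedged illustration of automatic metadata extraction from camera trap photographs (not the described software, whose implementation is not specified in the abstract), the Python sketch below reads EXIF capture time and camera model with the Pillow library; Pillow and the "photos" directory are assumptions.

# Illustrative sketch (not the described software): scan a folder of camera trap
# photos and pull basic EXIF metadata with Pillow. Pillow is an assumption here;
# the actual tool's implementation and stack are not specified in the abstract.

import os
from PIL import Image, ExifTags

def read_exif(path):
    """Return EXIF tags of one image as a {tag_name: value} dict."""
    with Image.open(path) as img:
        raw = img.getexif()
    return {ExifTags.TAGS.get(tag_id, tag_id): value for tag_id, value in raw.items()}

def catalog(folder):
    """Collect capture time and camera model for every JPEG in the folder."""
    records = []
    for name in sorted(os.listdir(folder)):
        if not name.lower().endswith((".jpg", ".jpeg")):
            continue
        tags = read_exif(os.path.join(folder, name))
        records.append({"file": name,
                        "taken": tags.get("DateTime"),
                        "camera": tags.get("Model")})
    return records

if __name__ == "__main__":
    for record in catalog("photos"):   # "photos" is a placeholder directory
        print(record)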
