Similar Literature
20 similar records retrieved.
1.
In a dynamic and flexible manufacturing environment, a shop-floor controller must be designed so that it responds automatically (or with minimal human intervention) and quickly to changes in the system (e.g., in part type or part routing). Such performance can be achieved provided that the controller is simple and sufficiently general in its scope of application. In this article, we present an architecture for such a shop-floor controller. The architecture is based on colored Petri nets with ordered color sets and structured input and output functions.
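As a toy illustration of the colored-Petri-net idea (ours, not the article's architecture), the Python sketch below lets tokens carry a part-type color and fires a transition only when its guard accepts a color present in every input place:

```python
from collections import defaultdict

class ColoredPetriNet:
    def __init__(self):
        self.marking = defaultdict(list)           # place -> list of colored tokens

    def add_tokens(self, place, colors):
        self.marking[place].extend(colors)

    def fire(self, inputs, outputs, guard=lambda c: True):
        """Fire on the first color that passes the guard and is in every input place."""
        for color in list(self.marking[inputs[0]]):
            if guard(color) and all(color in self.marking[p] for p in inputs):
                for p in inputs:
                    self.marking[p].remove(color)  # consume one token per input place
                for p in outputs:
                    self.marking[p].append(color)  # produce into the output places
                return color
        return None                                # transition not enabled

net = ColoredPetriNet()
net.add_tokens("queue", ["partA", "partB"])
net.add_tokens("machine_free", ["partA", "partB"])   # machine accepts both part types
print(net.fire(["queue", "machine_free"], ["machining"], guard=lambda c: c == "partB"))
```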

2.
This paper reports on a multi-fold approach to building user models based on the identification of navigation patterns in a virtual campus, allowing the campus's usability to be adapted to actual learners' needs and thus greatly enriching the learning experience. User modeling in this context, however, implies constant processing and analysis of user interaction data during long-term learning activities, which produces huge amounts of valuable data typically stored in server log files. Given the very large log files generated daily, massive processing is an essential first step in extracting useful information. To this end, this work first studies the viability of processing large log data files of a real virtual campus using different distributed infrastructures. More precisely, we study the time performance of massive processing of daily log files implemented following the master-slave paradigm and evaluated on Cluster Computing and PlanetLab platforms. The study reveals the complexity and challenges of massive processing in the big data era, such as the need to carefully tune the size of the log data chunks processed at the slave nodes, and the bottleneck that arises in truly geographically distributed infrastructures due to the communication overhead between the master and slave nodes. An application of the massive processing approach, resulting in log data processed and stored in a well-structured format, is then presented. We show how to extract knowledge from the log data by using the WEKA framework for data mining, demonstrating its usefulness for building user models that identify interesting navigation patterns of online learners. The study is motivated and conducted in the context of the actual data logs of the Virtual Campus of the Open University of Catalonia.
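The master-slave processing the paper evaluates can be sketched as follows; the chunk size, the toy parser and all names are our assumptions, with a local process pool standing in for the slave nodes:

```python
from multiprocessing import Pool
from collections import Counter

CHUNK_LINES = 100_000   # tuning this chunk size is exactly the issue the study raises

def parse_chunk(lines):
    """Slave task: count events per user id (assumed to be the first field)."""
    return Counter(line.split()[0] for line in lines if line.strip())

def process_log(path, workers=4):
    with open(path) as f:
        lines = f.readlines()
    chunks = [lines[i:i + CHUNK_LINES] for i in range(0, len(lines), CHUNK_LINES)]
    with Pool(workers) as pool:
        partial = pool.map(parse_chunk, chunks)    # scatter chunks to the slaves
    total = Counter()
    for c in partial:
        total.update(c)                            # master-side merge of results
    return total
```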

3.
Reproductive performance has recently become a growing concern in dairy cattle systems, but few research methodologies are available to address it as a complex problem in a livestock farming system. The aim of this paper is to propose a methodology that combines systemic and analytical approaches in order to better understand and improve reproductive performance in a dairy cattle system. The first phase of our methodology consists of a systemic approach to framing the problem. It results in formalising a set of potential risk factors relevant to the particular system under consideration. The second phase is based on an analytical approach that involves both analysing the shapes of the individual lactation curves and carrying out logistic regression procedures to study the links between reproductive performance and the previously identified potential risk factors. It makes it possible to formulate hypotheses about the biotechnical phenomena underpinning reproductive performance. The last phase is another systemic approach that aims at suggesting new practices to improve the situation, paying particular attention to the consistency of those suggestions with the farmer's general objectives. This methodology was applied to a French system experiment based on an organic low-input grazing system. It ultimately suggested slightly shifting the dates of the breeding period so as to improve reproductive performance. The hypotheses leading to this suggestion identified breed (Holstein or Montbéliarde cows), parity, year and calving date relative to the turnout date as the risk factors of impaired performance. The possible use of such a methodology on any commercial farm encountering a biotechnical problem is discussed.
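As a hedged sketch of the second (analytical) phase, a logistic regression can link the identified risk factors to a binary reproduction outcome; the file and column names below are invented for illustration:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data set: one row per cow-lactation, with the paper's risk
# factors (breed, parity, year, calving date relative to turnout) as columns.
df = pd.read_csv("cows.csv")
model = smf.logit(
    "conceived ~ C(breed) + C(parity) + C(year) + days_calving_to_turnout",
    data=df,
).fit()
print(model.summary())   # coefficients/odds ratios flag the influential factors
```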

4.
This study presents a compound approach with a five-phase process to assess and improve green performance. Based on the Six Sigma approach, the five-phase process consists of the definition, measurement, analysis, improvement and control (DMAIC) phases. During the first three phases, an assessment model is developed based on the analytic hierarchy process. Particle swarm optimization is then performed to search for a sequence of actions that improves green performance based on the output of the assessment model. Managers can monitor the progress of the improvement plan during the control phase. A case study involving two global footwear manufacturers demonstrates the effectiveness of the proposed approach.
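A minimal version of the AHP step used in such an assessment model looks like this (the pairwise-comparison matrix is a made-up example, not the case-study data): criterion weights come from the principal eigenvector, with a consistency index as a sanity check:

```python
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])      # Saaty-scale pairwise comparisons

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                       # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                      # normalized criterion weights
CI = (eigvals.real[k] - len(A)) / (len(A) - 1)    # consistency index
print("weights:", w, "CI:", CI)
```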

5.
Predictive performance modelling of parallel component compositions
Large-scale scientific computing applications frequently make use of closely-coupled distributed parallel components. The performance of such applications is therefore dependent on the component parts and their interaction at run-time. This paper describes a methodology for predictive performance modelling and evaluation of parallel applications composed of multiple interacting components. In this paper, the fundamental steps and required operations involved in the modelling and evaluation process are identified—including component decomposition, component model combination, M×N communication modelling, dataflow analysis and overall performance evaluation. A case study is presented to illustrate the modelling process and the methodology is verified through experimental analysis.
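One plausible toy reading of the combination step (the additive structure and all numbers are our assumptions, not the paper's model) treats the composed run time as per-component times plus an M×N redistribution term:

```python
def mxn_comm_time(m, n, volume_bytes, latency=5e-6, bandwidth=1e9):
    """Model an M-process to N-process redistribution: per-message latency
    plus the data volume moved at the given bandwidth."""
    messages = max(m, n)
    return messages * latency + volume_bytes / bandwidth

def composed_time(component_times, comm_times):
    """Dataflow assumed sequential here: components compute and communicate in turn."""
    return sum(component_times) + sum(comm_times)

t = composed_time([1.2, 0.8], [mxn_comm_time(16, 64, 8e8)])
print(f"predicted run time: {t:.3f} s")
```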

6.
This study presents the development of a multi-criteria control methodology for flexible manufacturing systems (FMSs). The control methodology is based on a two-tier decision-making mechanism. The first tier is designed to select a dominant decision criterion and a relevant scheduling rule set using a rule-based algorithm. In the second tier, using look-ahead multi-pass simulation, the scheduling rule that best advances the selected criterion is determined. The decision-making mechanism was integrated with a shop floor control module that comprises a real-time simulation model at the top control level and the RapidCIM methodology at the lower equipment control level. A factorial experiment was designed to analyze and evaluate the two-tier decision-making mechanism and the effects of the main design parameters on the system's performance. The proposed control methodology was then compared to a selected group of scheduling rules/policies using data envelopment analysis (DEA). The results demonstrated the superiority of the suggested control methodology as well as its capacity to cope with a fast-changing environment.
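The second tier can be sketched as one look-ahead simulation pass per candidate rule, keeping the best scorer; `simulate` below is a stand-in for the real-time simulation model and returns a made-up criterion value:

```python
import random

RULES = ["SPT", "EDD", "FIFO"]      # candidate scheduling rules

def simulate(rule, horizon):
    """Stand-in for a look-ahead pass: returns the value of the selected
    criterion (say, mean tardiness; lower is better) after simulating the
    shop for `horizon` minutes under `rule`."""
    rng = random.Random(rule)        # toy outcome, fixed per rule
    return rng.uniform(0, 10)

def select_rule(horizon=480):
    scores = {rule: simulate(rule, horizon) for rule in RULES}  # one pass per rule
    return min(scores, key=scores.get)

print(select_rule())
```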

7.
Starfish oocytes or eggs were inseminated at various times between first prometaphase and pronuclear stage, and were subsequently labeled with the thymidine analogue 5-bromo-2'-deoxyuridine (BrdU) in order to detect the onset of DNA synthesis phase (S phase) during the first cell cycle using a monoclonal antibody against BrdU. The interval between fertilization and the first S phase was found to be constant (30-45 min, depending on batches) in eggs fertilized after formation of the first polar body. Eggs fertilized before first polar body formation, however, always entered the S phase 10-20 min after the second polar body formation. On the basis of these observations we conclude that (i) the chain of events triggered by fertilization, collectively called "postactivation process" for the first S phase, goes on in parallel with the process of maturation and (ii) only the final step of the postactivation process is arrested until the termination of meiosis. In eggs that had been fertilized before the first polar body formation, the female and male pronuclei exhibited uniformly distributed chromatin soon after the second polar body formation. In eggs that had been fertilized after the second polar body formation, however, the chromatin of the pronuclei remained fibrillar even during the S phase. Thus full decondensation of chromatin appears to depend on a certain advance in the postactivation process.

8.
Real-time fuzzy-knowledge-based control of Baker's yeast production
A real-time fuzzy-knowledge-based system for fault diagnosis and control of bioprocesses was constructed using the object-oriented programming environment Smalltalk/V Mac. The basic system was implemented on a Macintosh Quadra 900 computer and built to function connected online to the process computer. Fuzzy logic was employed to handle uncertainties both in the knowledge and in the measurements. The fuzzy sets defined for the process variables could be changed online according to process dynamics. Process knowledge was implemented in a graphical two-level hierarchical knowledge base. In online process control the system first recognizes the current process phase on the basis of the top-level rules in the knowledge base. Then, according to the results of process diagnosis based on measurement data, the appropriate control strategy is inferred using the lower-level rules describing the process during the phase in question. (c) 1995 John Wiley & Sons, Inc.
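A toy of the two-level idea, with invented membership breakpoints and variable names: trapezoidal membership functions fuzzify a measurement, and a top-level rule picks the phase with the strongest membership:

```python
def trapezoid(x, a, b, c, d):
    """Standard trapezoidal membership: 0 outside (a, d), ramps to 1 on [b, c]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

def recognize_phase(ethanol_gpl):
    grades = {
        "oxidative_growth":  trapezoid(ethanol_gpl, -1, 0, 1, 3),
        "ethanol_oxidation": trapezoid(ethanol_gpl, 1, 3, 8, 12),
    }
    return max(grades, key=grades.get)   # top-level rule: strongest membership wins

print(recognize_phase(5.0))   # -> ethanol_oxidation
```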

9.
Heiner M, Koch I, Will J. Biosystems 2004;75(1-3):15-28
This paper demonstrates the first steps of a new integrating methodology to develop and analyse models of biological pathways in a systematic manner using well-established Petri net technologies. The whole approach comprises step-wise modelling, animation, model validation, and qualitative and quantitative analysis for behaviour prediction. This paper addresses the first phase: how to develop and validate a qualitative model, which might afterwards be extended to a quantitative model. The example used is devoted to apoptosis, the genetically programmed cell death. Apoptosis is an essential part of normal physiology for most metazoan species; disturbances in the apoptotic process can lead to several diseases. The signal transduction pathway of apoptosis includes highly complex mechanisms to control and execute programmed cell death. This paper explains how to model and validate this pathway using qualitative Petri nets. The results provide a mathematically unique and valid model enabling the confirmation of known properties as well as new insights into this pathway.

10.
Understanding the principles governing the dynamic coordination of functional brain networks remains an important unmet goal within neuroscience. How do distributed ensembles of neurons transiently coordinate their activity across a variety of spatial and temporal scales? While a complete mechanistic account of this process remains elusive, evidence suggests that neuronal oscillations may play a key role in this process, with different rhythms influencing both local computation and long-range communication. To investigate this question, we recorded multiple single unit and local field potential (LFP) activity from microelectrode arrays implanted bilaterally in macaque motor areas. Monkeys performed a delayed center-out reach task either manually using their natural arm (Manual Control, MC) or under direct neural control through a brain-machine interface (Brain Control, BC). In accord with prior work, we found that the spiking activity of individual neurons is coupled to multiple aspects of the ongoing motor beta rhythm (10–45 Hz) during both MC and BC, with neurons exhibiting a diversity of coupling preferences. However, here we show that for identified single neurons, this beta-to-rate mapping can change in a reversible and task-dependent way. For example, as beta power increases, a given neuron may increase spiking during MC but decrease spiking during BC, or exhibit a reversible shift in the preferred phase of firing. The within-task stability of coupling, combined with the reversible cross-task changes in coupling, suggests that task-dependent changes in the beta-to-rate mapping play a role in the transient functional reorganization of neural ensembles. We characterize the range of task-dependent changes in the mapping from beta amplitude, phase, and inter-hemispheric phase differences to the spike rates of an ensemble of simultaneously-recorded neurons, and discuss the potential implications that dynamic remapping from oscillatory activity to spike rate and timing may hold for models of computation and communication in distributed functional brain networks.
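One standard spike-field coupling measure consistent with this kind of analysis (a plausible step, not the paper's exact pipeline) band-passes the LFP to the beta band, takes the Hilbert phase, and computes the phase-locking of spikes to it:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def beta_phase(lfp, fs, lo=10.0, hi=45.0):
    """Instantaneous phase of the beta-band component of an LFP trace."""
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return np.angle(hilbert(filtfilt(b, a, lfp)))

def spike_beta_coupling(lfp, fs, spike_samples):
    """Phase-locking value (0 = none, 1 = perfect) and preferred firing phase."""
    phases = beta_phase(lfp, fs)[spike_samples]   # beta phase at each spike time
    vector = np.mean(np.exp(1j * phases))         # mean resultant vector
    return np.abs(vector), np.angle(vector)
```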

11.
The consolidation of the industrial production of second-generation (2G) bioethanol relies on improving the economics of the process. Within this general scope, this paper addresses one aspect that impacts the costs of the biochemical route to 2G bioethanol: defining optimal operational policies for the reactor running the enzymatic hydrolysis of the C6 biomass fraction. Fed-batch reactors are a common choice for this process, aiming at maximum yields and productivities. The optimization problem for fed-batch reactors usually consists in determining substrate feeding profiles that maximize some performance index. In the present control problem, the performance index and the system dynamics are both linear with respect to the control variable (the trajectory of the substrate feed flow). Simple Michaelis–Menten pseudo-homogeneous kinetic models with product inhibition were used in the dynamic modeling of a fed-batch reactor, and two feeding policies were implemented and validated in bench-scale reactors processing pre-treated sugarcane bagasse. The first approach applied classical optimal control theory. The second policy was defined with the purpose of sustaining high rates of glucose production, adding enzyme (Accellerase® 1500) and substrate simultaneously during the reaction course. A methodology is described that uses economic criteria to compare the performance of the reactor operating in successive batches and in fed-batch mode. Fed-batch mode was less sensitive to enzyme prices than successive batches. Process intensification in the fed-batch reactor led to final glucose concentrations around 200 g/L.
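A minimal fed-batch model of the pseudo-homogeneous Michaelis–Menten-with-product-inhibition kind can be sketched as below; all parameter values and the constant early-feeding policy are illustrative assumptions, not the paper's fitted model:

```python
from scipy.integrate import solve_ivp

Vmax, Km, Kp, Y = 3.0, 20.0, 15.0, 1.1   # g/L/h, g/L, g/L, g glucose per g substrate

def feed(t):
    return 0.05 if t < 10 else 0.0       # L/h: feed substrate early, then stop

def rhs(t, y, Sf=200.0):
    S, P, V = y                          # substrate, glucose, reactor volume
    r = Vmax * S / (Km * (1 + P / Kp) + S)   # hydrolysis rate, inhibited by glucose
    F = feed(t)
    dS = -r + F / V * (Sf - S)           # consumption plus feeding/dilution
    dP = Y * r - F / V * P               # production, diluted by the feed
    return [dS, dP, F]                   # dV/dt = F

sol = solve_ivp(rhs, (0, 48), [100.0, 0.0, 1.0])
print(f"final glucose: {sol.y[1, -1]:.1f} g/L")
```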

12.
Cognitive control is required in situations that involve uncertainty or change, such as when resolving conflict, selecting responses and switching tasks. Recently, it has been suggested that cognitive control can be conceptualised as a mechanism which prioritises goal-relevant information to deal with uncertainty. This hypothesis has been supported using a paradigm that requires conflict resolution. In this study, we examine whether cognitive control during task switching is also consistent with this notion. We used information theory to quantify the level of uncertainty in different trial types during a cued task-switching paradigm. We test the hypothesis that differences in uncertainty between task repeat and task switch trials can account for typical behavioural effects in task-switching. Increasing uncertainty was associated with less efficient performance (i.e., slower and less accurate), particularly on switch trials and trials that afford little opportunity for advance preparation. Interestingly, both mixing and switch costs were associated with a common episodic control process. These results support the notion that cognitive control may be conceptualised as an information processor that serves to resolve uncertainty in the environment.
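The information-theoretic quantification reduces to Shannon entropy over the trial-type distribution; a tiny sketch (the block probabilities below are ours, for illustration):

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([1.0]))        # repeat-only block: 0 bits, no uncertainty
print(entropy([0.5, 0.5]))   # 50/50 switch vs. repeat: 1 bit of uncertainty
```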

13.
High-fidelity computational fluid dynamics (CFD) tools, such as the large eddy simulation technique, have become feasible aids for computational aeroacoustics (CAA), enabling noise computation on petascale computing platforms. CAA poses significant challenges for researchers because the computational schemes used in the CFD tools must have high accuracy, good spectral resolution, and low dispersion and diffusion errors. A high-order compact finite difference scheme, which is implicit in space, can be used for such simulations because it fulfills the requirements of CAA. Usually, this method is parallelized using a transposition scheme; however, that approach has a high communication overhead. In this paper, we discuss the use of a parallel tridiagonal linear system solver based on the truncated SPIKE algorithm for reducing the communication overhead in our large eddy simulations. We present a theoretical performance analysis and report experimental results collected on two parallel computing platforms.
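SPIKE partitions the tridiagonal system so each partition can be solved with little communication; the sequential kernel such a partition-level solve reduces to is the classic Thomas algorithm, sketched here:

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a[i]*x[i-1] + b[i]*x[i] + c[i]*x[i+1] = d[i] (a[0], c[-1] unused)."""
    n = len(b)
    cp, dp = np.zeros(n), np.zeros(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):                      # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.zeros(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):             # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# 4x0+x1=5, x0+4x1+x2=6, x1+4x2=5  ->  x = (1, 1, 1)
print(thomas(np.array([0., 1, 1]), np.array([4., 4, 4]),
             np.array([1., 1, 0]), np.array([5., 6, 5])))
```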

14.
Reliability analysis of the electrical control system of a subsea blowout preventer (BOP) stack is carried out using the Markov method. For the subsea BOP electrical control system considered in this work, the 3-2-1-0 and 3-2-0 input voting schemes are available. The effects of the voting schemes on system performance are evaluated based on Markov models. In addition, the effects of module failure rates and repair time on the system reliability indices are investigated.
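A minimal Markov availability computation of the kind such an analysis rests on: build the transition-rate matrix Q from failure and repair rates and solve πQ = 0 with Σπ = 1. The two-state model and the rates below are illustrative, not the BOP system's:

```python
import numpy as np

lam, mu = 1e-4, 1e-1          # failure and repair rates (per hour), assumed values
Q = np.array([[-lam,  lam],   # state 0: system up
              [  mu,  -mu]])  # state 1: system down, under repair

A = np.vstack([Q.T, np.ones(2)])          # append the normalization condition
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(f"steady-state availability: {pi[0]:.6f}")   # equals mu / (lam + mu)
```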

15.
In this paper, we present a fault-tolerance and recovery system called FRASystem (Fault Tolerant & Recovery Agent System) that uses multiple agents in distributed computing systems. Previous rollback-recovery protocols depended on the underlying communication subsystem and operating system, which degraded computing performance. We propose a rollback-recovery protocol that works independently of the operating system, leading to increased portability and extensibility. We define four types of agents: (1) a recovery agent performs the rollback-recovery protocol after a failure; (2) an information agent constructs domain knowledge as fault-tolerance rules and information during failure-free operation; (3) a facilitator agent controls the communication between agents; (4) a garbage collection agent removes useless fault-tolerance information. Since agent failures may lead to inconsistent system states and a domino effect, we propose an agent recovery algorithm. A garbage collection protocol addresses the performance degradation caused by the growth of saved fault-tolerance information in stable storage. We implemented a prototype of FRASystem using Java and CORBA and evaluated the proposed rollback-recovery protocol experimentally. The simulation results indicate that the performance of our protocol is better than that of previous rollback-recovery protocols, which use independent checkpointing and pessimistic message logging without agents. Our contributions are as follows: (1) this is the first rollback-recovery protocol using agents, (2) FRASystem does not depend on an operating system, and (3) FRASystem provides portability and extensibility.
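FRASystem itself is Java/CORBA; the Python toy below only illustrates the rollback-recovery idea the recovery agent implements: serialize state to stable storage at each step and, after a failure, restart from the last checkpoint:

```python
import os
import pickle

CKPT = "state.ckpt"

def checkpoint(state):
    with open(CKPT, "wb") as f:
        pickle.dump(state, f)                  # write to "stable storage"

def recover(default):
    if os.path.exists(CKPT):                   # a failure happened earlier:
        with open(CKPT, "rb") as f:            # roll back to the last checkpoint
            return pickle.load(f)
    return default

state = recover({"step": 0})
while state["step"] < 10:
    # ... one unit of re-executable work would go here ...
    state["step"] += 1
    checkpoint(state)                          # independent checkpoint per step
```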

16.
Two major sources of theoretical development for biofeedback as an intervention paradigm are considered. An integration of cognitive learning theory approaches to the potential regulation of autonomic processes in an information-processing framework and the phenomenological information-processing approach of Kelly's personal construct theory suggest a new methodological paradigm for biofeedback as a tool of psychotherapeutic intervention, especially for the discipline of behavioral medicine. Biofeedback is reconstructed as a sequence of allocating attention to automatic cognitive processes until cognitive control has been mastered. This sequence is also seen as a circumspection-preemption-control cycle that Kelly (1955) suggested was essential to all problem solving. In light of Kelly's fundamental assumptions regarding the nature of constructs, it is suggested that controlled processing approaches to biofeedback require the biofeedback trainee to investigate both ends of psychophysiological dichotomies instead of demonstrative constructs of traditional biofeedback methodology. Other psychotherapeutic techniques are reviewed to validate this new theoretical approach. Finally, treatment within this paradigm is discussed as a recircumspection of relevant constructs that were routinized during the alarm reaction phase of Selye's general adaptation syndrome.

17.
Cloud storage is an important service of cloud computing. After a data file is outsourced, the data owner no longer has physical control over the storage. To verify the integrity of such data efficiently, several Proof of Retrievability (POR) schemes have been proposed. The existing POR schemes offer decent solutions to various practical issues; however, they either have a non-trivial (linear or quadratic) communication cost or support only private verification. Moreover, most existing POR schemes are vulnerable to active attacks and information leakage during the data checking procedure. It has remained open to design a secure POR scheme with both public verifiability and constant communication cost. To solve these problems, we propose a novel privacy-preserving POR scheme with public verifiability and constant communication cost, based on end-to-end aggregation authentication. To resist information leakage, we incorporate a zero-knowledge technique that hides the data in the integrity checking process. Our scheme is shown to be secure and efficient by security and performance analysis. Its security rests on the computational Diffie–Hellman problem and the discrete logarithm problem. Finally, we extend the POR scheme to support multi-file integrity checking, and simulation results show that the verifier needs only a small computational cost to check data integrity in our extended scheme.
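A toy of aggregation-based checking in the private-verification (Shacham–Waters) style, which a scheme like this one extends with public verifiability and zero-knowledge: because tags are linear in the blocks, one aggregated pair answers a whole challenge, giving constant communication. Everything below is an illustrative sketch over a prime field, not the paper's construction:

```python
import random

N, p = 8, (1 << 61) - 1            # number of blocks; a Mersenne prime field

blocks = [random.randrange(p) for _ in range(N)]          # the outsourced file
alpha = random.randrange(1, p)                            # owner's secret key
f = [random.randrange(p) for _ in range(N)]               # plays the PRF's role
tags = [(alpha * m + f[i]) % p for i, m in enumerate(blocks)]

# verifier's challenge: a few random indices with random coefficients
chal = [(i, random.randrange(1, p)) for i in random.sample(range(N), 4)]

# prover: aggregate blocks and tags into one constant-size response
mu  = sum(v * blocks[i] for i, v in chal) % p
tau = sum(v * tags[i]   for i, v in chal) % p

# verifier: recompute the expected aggregated tag from the secret key
assert tau == (alpha * mu + sum(v * f[i] for i, v in chal)) % p
print("integrity check passed")
```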

18.
High-throughput screening (HTS) is used in modern drug discovery to screen hundreds of thousands to millions of compounds against selected protein targets. It is an industrial-scale process relying on sophisticated automation and state-of-the-art detection technologies. Quality control (QC) is an integral part of the process and is used to ensure good-quality data and minimize assay variability while maintaining assay sensitivity. The authors describe new QC methods and show numerous real examples from their biologist-friendly Stat Server HTS application, a custom-developed software tool built from the commercially available S-PLUS and Stat Server statistical analysis and server software. This system remotely processes HTS data using powerful and sophisticated statistical methodology but insulates users from the technical details by outputting results in a variety of readily interpretable graphs and tables. It allows users to visualize HTS data and examine assay performance during the HTS campaign to quickly react to or avoid quality problems.
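One standard per-plate QC statistic that HTS systems of this kind typically report is the Z'-factor (Zhang et al., 1999), computed from positive- and negative-control wells; values above roughly 0.5 are conventionally taken as an acceptable assay window. The control readings below are made up:

```python
import numpy as np

def z_prime(pos, neg):
    """Z'-factor: 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|."""
    pos, neg = np.asarray(pos, float), np.asarray(neg, float)
    return 1 - 3 * (pos.std(ddof=1) + neg.std(ddof=1)) / abs(pos.mean() - neg.mean())

print(z_prime([980, 1010, 995, 1002], [110, 95, 102, 99]))   # ~0.94: a clean assay
```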

19.
Nonlinear modeling of multi-input multi-output (MIMO) neuronal systems using Principal Dynamic Modes (PDMs) provides a novel method for analyzing the functional connectivity between neuronal groups. This paper presents the PDM-based modeling methodology and initial results from actual multi-unit recordings in the prefrontal cortex of non-human primates. We used the PDMs to analyze the dynamic transformations of spike train activity from Layer 2 (input) to Layer 5 (output) of the prefrontal cortex in primates performing a Delayed-Match-to-Sample task. The PDM-based models reduce the complexity of representing large-scale neural MIMO systems that involve large numbers of neurons, and also offer the prospect of improved biological/physiological interpretation of the obtained models. PDM analysis of neuronal connectivity in this system revealed "input–output channels of communication" corresponding to specific bands of neural rhythms that quantify the relative importance of these frequency-specific PDMs across a variety of different tasks. We found that behavioral performance during the Delayed-Match-to-Sample task (correct vs. incorrect outcome) was associated with differential activation of frequency-specific PDMs in the prefrontal cortex.

20.
Buck M, Nehaniv CL. Biosystems 2008;94(1-2):28-33
Artificial Genetic Regulatory Networks (GRNs) are interesting control models owing to their simplicity and versatility. They can easily be implemented, evolved and modified, and their similarity to their biological counterparts also makes them interesting for simulations of life-like systems. These aspects suggest they may be ideal control systems for distributed computing in diverse situations, but to be usable for such applications the computational power and evolvability of GRNs need to be studied. In this research we propose a simple distributed system that uses GRNs to solve the well-known NP-complete graph colouring problem. Every node (cell) of the graph to be coloured is controlled by an instance of the same GRN. All the cells communicate directly with their immediate neighbours in the graph so as to establish a good colouring. The quality of this colouring directs the evolution of the GRNs using a genetic algorithm. We then observe the quality of the colouring for two different graphs according to different communication protocols and the number of distinct proteins in the cell (a measure of the possible complexity of a GRN). These two points, the main scalability issues that any computational paradigm raises, are then discussed.
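A sketch of the fitness side of such a setup (the greedy `grn_decide` below is only a stand-in for an evolved GRN): each cell reads its neighbours' colours and updates its own, and the genetic algorithm scores the resulting colouring by its conflicts:

```python
def grn_decide(own, neighbour_colours, n_colours):
    """Stand-in controller: take the smallest colour unused by any neighbour,
    keeping the current colour if none is free."""
    for c in range(n_colours):
        if c not in neighbour_colours:
            return c
    return own

def conflicts(edges, colouring):
    """Colouring quality that directs the evolution: monochromatic edges."""
    return sum(colouring[u] == colouring[v] for u, v in edges)

edges = [(0, 1), (1, 2), (2, 0), (2, 3)]        # toy graph to be coloured
colouring = {n: 0 for n in range(4)}
for _ in range(3):                              # communication rounds
    for n in colouring:                         # each cell reads its neighbours
        nbrs = {colouring[v] for u, v in edges if u == n} | \
               {colouring[u] for u, v in edges if v == n}
        colouring[n] = grn_decide(colouring[n], nbrs, 3)
print(conflicts(edges, colouring))              # 0 -> a proper 3-colouring
```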
