Similar Articles
20 similar articles found.
1.
We investigate a multi-agent patrolling problem where information is distributed alongside threats in environments with uncertainties. Specifically, the information and threat at each location are independently modelled as multi-state Markov chains, whose states are not observed until the location is visited by an agent. While agents obtain information at a location, they may also suffer damage from the threat at that location. Therefore, the goal of the agents is to gather as much information as possible while mitigating the damage incurred. To address this challenge, we formulate the single-agent patrolling problem as a Partially Observable Markov Decision Process (POMDP) and propose a computationally efficient algorithm to solve this model. Building upon this, to compute patrols for multiple agents, the single-agent algorithm is extended for each agent with the aim of maximising its marginal contribution to the team. We empirically evaluate our algorithm on multi-agent patrolling problems and show that it outperforms a baseline algorithm by up to 44% for 10 agents and by 21% for 15 agents in large domains.
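For illustration, here is a minimal sketch of the kind of location model the abstract describes: two independent multi-state Markov chains per location, hidden until an agent visits. The transition matrices, information values and damage values are hypothetical, and the sketch does not attempt the POMDP solver itself.

```python
import random

# A minimal sketch (not the paper's POMDP solver): each location carries two
# independent multi-state Markov chains, one for information and one for threat.
# States evolve every time step but are only observed when an agent visits.
# All transition matrices, information values and damage values are illustrative.

class Location:
    def __init__(self, info_T, threat_T, info_value, damage):
        self.info_T = info_T          # row-stochastic transition matrix (information chain)
        self.threat_T = threat_T      # row-stochastic transition matrix (threat chain)
        self.info_value = info_value  # information gathered in each info state
        self.damage = damage          # damage suffered in each threat state
        self.info_state = 0
        self.threat_state = 0

    def step(self):
        """Advance both hidden chains by one time step."""
        self.info_state = random.choices(
            range(len(self.info_T)), weights=self.info_T[self.info_state])[0]
        self.threat_state = random.choices(
            range(len(self.threat_T)), weights=self.threat_T[self.threat_state])[0]

    def visit(self):
        """An agent observes the states and receives reward = information - damage."""
        gain = self.info_value[self.info_state]
        loss = self.damage[self.threat_state]
        self.info_state = 0  # visiting collects the available information (chain resets)
        return gain - loss

# Two-state example: information "low/high", threat "safe/dangerous".
loc = Location(info_T=[[0.7, 0.3], [0.4, 0.6]],
               threat_T=[[0.9, 0.1], [0.5, 0.5]],
               info_value=[0.0, 1.0], damage=[0.0, 2.0])
for t in range(10):
    loc.step()
print("reward from a visit at t=10:", loc.visit())
```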

2.
The notion that cooperation can aid a group of agents to solve problems more efficiently than if those agents worked in isolation is prevalent in computer science and business circles. Here we consider a primordial form of cooperation – imitative learning – that allows an effective exchange of information between agents, which are viewed as the processing units of a social intelligence system or collective brain. In particular, we use agent-based simulations to study the performance of a group of agents in solving a cryptarithmetic problem. An agent can either perform local random moves to explore the solution space of the problem or imitate a model agent – the best performing agent in its influence network. There is a trade-off between the number of agents and the imitation probability, and for the optimal balance between these parameters we observe a thirtyfold diminution in the computational cost to find the solution of the cryptarithmetic problem as compared with the independent search. If those parameters are chosen far from the optimal setting, however, then imitative learning can greatly impair the performance of the group.
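A toy agent-based sketch of the imitative-learning dynamics described above, using a stand-in bit-string fitness in place of the cryptarithmetic cost and a fully connected influence network; the group size N and imitation probability p are arbitrary illustrative values.

```python
import random

# Illustrative sketch of imitative learning. A stand-in bit-string fitness
# replaces the cryptarithmetic cost function of the paper; p is the imitation
# probability and N the number of agents (both hypothetical values).

L, N, p = 40, 20, 0.3
target = [random.randint(0, 1) for _ in range(L)]

def fitness(agent):           # number of correct "digits"
    return sum(a == t for a, t in zip(agent, target))

agents = [[random.randint(0, 1) for _ in range(L)] for _ in range(N)]
cost = 0                      # total number of agent updates (computational cost)

while True:
    model = max(agents, key=fitness)          # best agent in the influence network
    if fitness(model) == L:
        break
    i = random.randrange(N)
    agent = agents[i]
    cost += 1
    if random.random() < p and agent is not model:
        # imitation: copy one trait in which the agent differs from the model
        diff = [k for k in range(L) if agent[k] != model[k]]
        if diff:
            k = random.choice(diff)
            agent[k] = model[k]
    else:
        # local random move: flip one randomly chosen trait
        k = random.randrange(L)
        agent[k] = 1 - agent[k]

print("solution found after", cost, "agent updates")
```

Sweeping N and p in such a simulation exposes the trade-off the abstract mentions: too much imitation collapses the group's diversity, while too little forfeits the benefit of the collective search.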

3.
DNA error correcting codes over the edit metric consist of embeddable markers for sequencing projects that are tolerant of sequencing errors. When a genetic library has multiple sources for its sequences, the use of embedded markers permits tracking of sequence origin. This study compares different methods for synthesizing DNA error correcting codes. A new code-finding technique called the salmon algorithm is introduced and used to improve the best known code sizes in five difficult cases of the problem, including the most studied case: length six, distance three codes. An updated table of the best known code sizes with 36 improved values, resulting from three different algorithms, is presented. Mathematical background results for the problem from multiple sources are summarized. A discussion of practical details that arise in application, including biological design and decoding, is also given in this study.
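The object being optimised is a set of DNA words whose pairwise Levenshtein (edit) distance is at least d. As a baseline illustration only (not the salmon algorithm), a greedy lexicode-style construction for the length-six, distance-three case might look like this:

```python
from itertools import product

# A minimal sketch of a greedy (lexicode-style) construction of a DNA code over
# the edit (Levenshtein) metric; it is not the salmon algorithm from the study,
# only a baseline that illustrates the object being optimised.

def edit_distance(a, b):
    """Classic dynamic-programming Levenshtein distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def greedy_code(length, min_dist):
    """Scan all DNA words of the given length and keep those at edit distance
    >= min_dist from every codeword accepted so far."""
    code = []
    for word in product("ACGT", repeat=length):
        w = "".join(word)
        if all(edit_distance(w, c) >= min_dist for c in code):
            code.append(w)
    return code

# The most studied case mentioned in the abstract: length 6, distance 3.
code = greedy_code(6, 3)
print(len(code), "codewords, e.g.", code[:5])
```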

4.

Background

Studies of functional modules in a Protein-Protein Interaction (PPI) network contribute greatly to the understanding of biological mechanisms. With the development of computer science, computational approaches have played an increasingly important role in detecting functional modules.

Results

We present a new approach using multi-agent evolution for the detection of functional modules in PPI networks. The proposed approach consists of two stages: the solution construction for agents in a population and the evolutionary process of computational agents in a lattice environment, where each agent corresponds to a candidate solution to the detection problem of functional modules in a PPI network. First, the approach utilizes a connection-based encoding scheme to model an agent, and employs a random-walk behavior that merges topological characteristics with functional information to construct a solution. Next, it applies several evolutionary operators, i.e., competition, crossover, and mutation, to realize information exchange among agents as well as solution evolution. Systematic experiments have been conducted on three benchmark testing sets of yeast networks. Experimental results show that the approach is more effective than several other existing algorithms.

Conclusions

The algorithm achieves outstanding recall, F-measure, sensitivity and accuracy while remaining competitive on other performance measures, so it can be applied to biological studies that require high accuracy.
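To make the solution-construction stage more concrete, the following toy sketch grows a candidate module by a random walk whose step probabilities blend topological overlap with functional similarity. The graph, similarity scores and weighting are illustrative assumptions, not the paper's connection-based encoding.

```python
import random

# A simplified sketch of the construction idea only: a random walk on a toy PPI
# network whose next step is biased by a mix of topology (shared neighbours)
# and functional similarity. All values below are illustrative.

network = {                       # adjacency lists of a toy PPI graph
    "A": ["B", "C", "D"], "B": ["A", "C"], "C": ["A", "B", "D"],
    "D": ["A", "C", "E"], "E": ["D", "F"], "F": ["E"],
}
func_sim = {frozenset(e): s for e, s in [
    (("A", "B"), 0.9), (("A", "C"), 0.8), (("A", "D"), 0.2), (("B", "C"), 0.7),
    (("C", "D"), 0.3), (("D", "E"), 0.6), (("E", "F"), 0.5),
]}

def weight(u, v, alpha=0.5):
    """Blend topological overlap with functional similarity."""
    shared = len(set(network[u]) & set(network[v]))
    topo = shared / max(len(network[u]), len(network[v]))
    return alpha * topo + (1 - alpha) * func_sim.get(frozenset((u, v)), 0.0)

def random_walk_module(start, size=4):
    """Grow a candidate functional module by a biased random walk."""
    module, current = {start}, start
    while len(module) < size:
        nbrs = network[current]
        weights = [weight(current, v) + 1e-6 for v in nbrs]
        current = random.choices(nbrs, weights=weights)[0]
        module.add(current)
    return module

print(random_walk_module("A"))
```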

5.
Evolutionary models of human cooperation are increasingly emphasizing the role of reputation and the requisite truthful “gossiping” about reputation-relevant behavior. If resources were allocated among individuals according to their reputations, competition for resources via competition for “good” reputations would have created incentives for exaggerated or deceptive gossip about oneself and one’s competitors in ancestral societies. Correspondingly, humans should have psychological adaptations to assess gossip veracity. Using social psychological methods, we explored cues of gossip veracity in four experiments. We found that simple reiteration increased gossip veracity, but only for those who found the gossip relatively uninteresting. Multiple sources of gossip increased its veracity, as did the independence of those sources. Information that suggested alternative, benign interpretations of gossip decreased its veracity. Competition between a gossiper and her target decreased gossip veracity. These results provide preliminary evidence for psychological adaptations for assessing gossip veracity, mechanisms that might be used to assess veracity in other domains involving social exchange of information.

6.
In this paper, we present a fault tolerant and recovery system called FRASystem (Fault Tolerant & Recovery Agent System) that uses multiple agents in distributed computing systems. Previous rollback-recovery protocols depended on an inherent communication mechanism and the underlying operating system, which caused a decline in computing performance. We propose a rollback-recovery protocol that works independently of the operating system and thus improves portability and extensibility. We define four types of agents: (1) a recovery agent performs the rollback-recovery protocol after a failure, (2) an information agent constructs domain knowledge, as fault-tolerance rules and information, during failure-free operation, (3) a facilitator agent controls the communication between agents, and (4) a garbage collection agent performs garbage collection of useless fault-tolerance information. Since agent failures may lead to inconsistent states of a system and a domino effect, we propose an agent recovery algorithm. A garbage collection protocol addresses the performance degradation caused by the growth of saved fault-tolerance information in stable storage. We implemented a prototype of FRASystem using Java and CORBA and experimented with the proposed rollback-recovery protocol. The simulation results indicate that the performance of our protocol is better than that of previous rollback-recovery protocols which use independent checkpointing and pessimistic message logging without agents. Our contributions are as follows: (1) this is the first rollback-recovery protocol using agents, (2) FRASystem is not dependent on an operating system, and (3) FRASystem provides portability and extensibility.

7.
Tolerance of human pathogenic fungi to antifungal drugs is an emerging medical problem. We show how strains of the causative agent of human aspergillosis, Aspergillus fumigatus, tolerant to cell wall-interfering antimycotic drugs become susceptible through chemosensitization by natural compounds. Tolerance of the A. fumigatus mitogen-activated protein kinase (MAPK) mutant, sakAΔ, to these drugs indicates the osmotic/oxidative stress MAPK pathway is involved in maintaining cell wall integrity. Using deletion mutants of the yeast, Saccharomyces cerevisiae, we first identified thymol and 2,3-dihydroxybenzaldehyde (2,3-D) as potent chemosensitizing agents that target the cell wall. We then used these chemosensitizing agents to act as synergists to commercial antifungal drugs against tolerant strains of A. fumigatus. Thymol was an especially potent chemosensitizing agent for amphotericin B, fluconazole or ketoconazole. The potential use of natural, safe chemosensitizing agents in antifungal chemotherapy of human mycoses as an alternative to combination therapy is discussed.

8.
Peak lists derived from nuclear magnetic resonance (NMR) spectra are commonly used as input data for a variety of computer assisted and automated analyses. These include automated protein resonance assignment and protein structure calculation software tools. Prior to these analyses, peak lists must be aligned to each other and sets of related peaks must be grouped based on common chemical shift dimensions. Even when programs can perform peak grouping, they require the user to provide uniform match tolerances or use default values. However, peak grouping is further complicated by multiple sources of variance in peak position, limiting the effectiveness of grouping methods that utilize uniform match tolerances. In addition, no method currently exists for deriving peak positional variances from single peak lists for grouping peaks into spin systems, i.e. spin system grouping within a single peak list. Therefore, we developed a complementary pair of peak list registration analysis and spin system grouping algorithms designed to overcome these limitations. We have implemented these algorithms into an approach that can identify multiple dimension-specific positional variances that exist in a single peak list and group peaks from a single peak list into spin systems. The resulting software tools generate a variety of useful statistics on both a single peak list and pairwise peak list alignment, especially for quality assessment of peak list datasets. We used a range of low and high quality experimental solution NMR and solid-state NMR peak lists to assess the performance of our registration analysis and grouping algorithms. Analyses show that an algorithm using a single iteration with uniform match tolerances recovers only 50 to 80% of the spin systems due to the presence of multiple sources of variance. Our algorithm recovers additional spin systems by reevaluating match tolerances in multiple iterations. To facilitate evaluation of the algorithms, we developed a peak list simulator within our nmrstarlib package that generates user-defined assigned peak lists from a given BMRB entry or database of entries. In addition, over 100,000 simulated peak lists with one or two sources of variance were generated to evaluate the performance and robustness of these new registration analysis and peak grouping algorithms.
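A simplified sketch of the grouping idea, assuming peaks that share two anchor chemical-shift dimensions (e.g. 1H and 15N): peaks are merged under per-dimension match tolerances, the dimension-specific spreads are re-estimated from the resulting groups, and the grouping is repeated. This is only an illustration of iterative tolerance refinement, not the published registration analysis algorithm.

```python
# Toy sketch of tolerance-based spin-system grouping with per-dimension
# tolerances that are re-estimated between iterations. Tolerances and peaks
# below are illustrative values, not real NMR data.

def group_peaks(peaks, tol):
    """Greedy grouping of (h_shift, n_shift) peaks against group centroids."""
    groups = []
    for h, n in peaks:
        for g in groups:
            gh = sum(p[0] for p in g) / len(g)     # group centroid, 1H
            gn = sum(p[1] for p in g) / len(g)     # group centroid, 15N
            if abs(h - gh) <= tol[0] and abs(n - gn) <= tol[1]:
                g.append((h, n))
                break
        else:
            groups.append([(h, n)])
    return groups

def refine(peaks, tol=(0.05, 0.5), iterations=3):
    """Re-estimate per-dimension spreads from the current groups and regroup."""
    for _ in range(iterations):
        groups = group_peaks(peaks, tol)
        spreads = ([], [])
        for g in groups:
            for d in (0, 1):
                c = sum(p[d] for p in g) / len(g)
                spreads[d].extend(abs(p[d] - c) for p in g)
        # widen each tolerance to ~3x the mean observed deviation (heuristic)
        tol = tuple(max(t, 3 * sum(s) / len(s)) for t, s in zip(tol, spreads))
    return groups, tol

peaks = [(8.21, 120.4), (8.23, 120.6), (7.95, 110.1), (7.93, 110.3), (8.22, 120.5)]
groups, tol = refine(peaks)
print(len(groups), "spin systems, final tolerances", tol)
```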

9.
The fair division of a surplus is one of the most widely examined problems. This paper focuses on bargaining problems with fixed disagreement payoffs where risk-neutral agents have reached an agreement that is the Nash-bargaining solution (NBS). We consider a stochastic environment, in which the overall return consists of multiple pies with uncertain sizes, and we examine how these pies can be allocated fairly among agents. Specifically, fairness is based on Aristotle’s maxim: “equals should be treated equally and unequals unequally, in proportion to the relevant inequality”. In this context, fairness is achieved when all the individual stochastic surplus shares which are allocated to agents are distributed in proportion to the NBS. We introduce a novel algorithm, which can be used to compute the ratio of each pie that should be allocated to each agent, in order to ensure fairness within a symmetric or asymmetric NBS.
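In the simplest reading of proportional fairness (a deliberately reduced illustration, not the paper's algorithm), every agent receives the same ratio of every pie, with the ratio taken from its NBS share; expected payoffs are then proportional to the NBS regardless of how the pie sizes are realised:

```python
# A much-simplified illustration: if the NBS gives agent i a share s_i of the
# total expected surplus, allocating the same ratio r_i = s_i / sum(s) of every
# pie makes each agent's expected payoff proportional to its NBS share.
# All numbers below are hypothetical.

nbs_shares = {"agent1": 30.0, "agent2": 50.0, "agent3": 20.0}   # hypothetical NBS
expected_pies = [40.0, 35.0, 25.0]                              # E[size] of each pie

total = sum(nbs_shares.values())
ratios = {a: s / total for a, s in nbs_shares.items()}          # r_i, applied to every pie

allocation = {a: [r * pie for pie in expected_pies] for a, r in ratios.items()}
for agent, parts in allocation.items():
    print(agent, "gets", [round(x, 2) for x in parts],
          "expected total =", round(sum(parts), 2))
```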

10.
The theoretical basis for the relationship between the electric and magnetic fields of the heart is examined in terms of impressed currents, Jj, from a biologically active region imbedded in an Ohmic conductor. For quasistatic sources in a uniform conductor, it is shown that the problem of measuring the electrocardiogram or magnetocardiogram is an inverse problem rendered non-unique by the presence of so-called silent sources. An important class of sources, toroidal Jj, are silent electrically and not silent magnetically and these sources result in inherent differences between the information content of the electric and magnetic measurement techniques. A hypothetical example of cardiac activation departing from the conventional uniform double-layer model is presented to indicate that electrically silent sources cannot be ruled out a priori without careful magnetic measurements.
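For orientation, the standard quasistatic source-field relations for an impressed current density $\mathbf{J}^i$ in an unbounded homogeneous conductor of conductivity $\sigma$ (textbook forms, not taken from this paper) are:

```latex
\Phi(\mathbf{r}) = \frac{1}{4\pi\sigma}\int_V \mathbf{J}^i(\mathbf{r}')\cdot\nabla'\!\left(\frac{1}{|\mathbf{r}-\mathbf{r}'|}\right)\mathrm{d}v',
\qquad
\mathbf{B}(\mathbf{r}) = \frac{\mu_0}{4\pi}\int_V \mathbf{J}^i(\mathbf{r}')\times\nabla'\!\left(\frac{1}{|\mathbf{r}-\mathbf{r}'|}\right)\mathrm{d}v'.
```

Because the first expression depends only on $\nabla'\cdot\mathbf{J}^i$, a divergence-free (e.g. toroidal) impressed current is electrically silent, while it can still produce a measurable $\mathbf{B}$ through the second expression, which is the asymmetry the abstract refers to.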

11.
Gossip is a subject that has been studied by researchers from an array of disciplines with various foci and methods. We measured the content of language use by members of a competitive sports team across 18 months, integrating qualitative ethnographic methods with quantitative sampling and analysis. We hypothesized that the use of gossip would vary significantly depending on whether it is used for self-serving or group-serving purposes. Our results support a model of gossip derived from multilevel selection theory that expects gossip to serve group-beneficial rules when rewards are partitioned at the group level on a scale that permits mutual monitoring. We integrate our case study with earlier studies of gossip conducted by anthropologists, psychologists, and management researchers. Kevin M. Kniffin studies cooperation within and among organizations. Kniffin is presently an Honorary Fellow of the University of Wisconsin-Madison’s Department of Anthropology. Kniffin has consulted for a variety of clients, including community-development organizations, labor unions, and credit unions. David Sloan Wilson is an evolutionary biologist interested in a broad range of issues relevant to human behavior. He has authored numerous articles and books, including most recently Darwin’s Cathedral: Evolution, Religion, and the Nature of Society (University of Chicago Press, 2002). Wilson is a professor of biological sciences and anthropology and Director of Evolutionary Studies (EvoS) at SUNY-Binghamton.

12.
This research deals with an innovative methodology for optimising the coal train scheduling problem. Based on our previously published work, generic solution techniques are developed by utilising a “toolbox” of well-solved standard scheduling problems. According to our analysis, the coal train scheduling problem can basically be modelled as a Blocking Parallel-Machine Job-Shop Scheduling (BPMJSS) problem with some minor additional constraints. To construct feasible train schedules, an innovative constructive algorithm called the SLEK algorithm is proposed. To optimise the train schedule, a three-stage hybrid algorithm called the SLEK-BIH-TS algorithm is developed based on the definition of a sophisticated neighbourhood structure under the mechanism of the Best-Insertion-Heuristic (BIH) algorithm and the Tabu Search (TS) metaheuristic. A case study is performed for optimising a complex real-world coal rail system in Australia. A method to calculate a lower bound on the makespan is proposed to evaluate the results. The results indicate that the proposed methodology is promising for finding optimal or near-optimal feasible train timetables of a coal rail system under network and terminal capacity constraints.
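As a generic illustration of the Best-Insertion-Heuristic step (not the SLEK or SLEK-BIH-TS algorithms, and without the blocking and terminal-capacity constraints of the real system), the sketch below repeatedly removes a job from a permutation schedule and re-inserts it at the position that minimises a simple stand-in flow-shop makespan:

```python
# A generic Best-Insertion-Heuristic (BIH) sketch on a permutation schedule,
# using a plain permutation flow-shop makespan as a stand-in objective.
# Processing times are hypothetical.

proc = [  # proc[job][machine]
    [3, 6, 2], [5, 2, 4], [2, 4, 6], [4, 3, 3], [6, 2, 5],
]

def makespan(order):
    """Completion-time recursion for a permutation flow shop."""
    m = len(proc[0])
    completion = [0.0] * m
    for job in order:
        for k in range(m):
            ready = completion[k - 1] if k > 0 else 0.0
            completion[k] = max(completion[k], ready) + proc[job][k]
    return completion[-1]

def best_insertion(order):
    """Repeatedly remove one job and re-insert it at its best position."""
    improved = True
    while improved:
        improved = False
        for job in list(order):
            base = [j for j in order if j != job]
            candidates = [base[:i] + [job] + base[i:] for i in range(len(base) + 1)]
            best = min(candidates, key=makespan)
            if makespan(best) < makespan(order):
                order, improved = best, True
    return order

start = list(range(len(proc)))
print("initial makespan:", makespan(start))
best = best_insertion(start)
print("after best insertion:", best, makespan(best))
```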

13.
Humans commonly face choices between multiple options with uncertain outcomes. Such situations occur in many contexts, from purely financial decisions (which shares should I buy?) to perceptuo-motor decisions between different actions (where should I aim my shot at goal?). Regardless of context, successful decision-making requires that the uncertainty at the heart of the decision-making problem is taken into account. Here, we ask whether humans can recover an estimate of exogenous uncertainty and then use it to make good decisions. Observers viewed a small dot that moved erratically until it disappeared behind an occluder. We varied the size of the occluder and the unpredictability of the dot's path. The observer attempted to capture the dot as it emerged from behind the occluded region by setting the location and extent of a ‘catcher’ along the edge of the occluder. The reward for successfully catching the dot was reduced as the size of the catcher increased. We compared human performance with that of an agent maximizing expected gain and found that observers consistently selected catcher sizes close to this theoretical solution. These results suggest that humans are finely tuned to exogenous uncertainty information and can exploit it to guide action.
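The normative benchmark can be sketched as follows: choose the catcher width that maximises expected gain, i.e. reward multiplied by the probability that the dot emerges inside the catcher. The Gaussian spread of the emergence point and the linear reward schedule are illustrative assumptions, not the paper's parameters.

```python
import math

# A sketch of an expected-gain-maximising agent: pick the catcher extent w that
# maximises reward(w) * P(dot emerges inside the catcher). The Gaussian spread
# sigma and the reward schedule are illustrative assumptions.

def p_catch(width, sigma):
    """Probability that a zero-mean Gaussian emergence point lies within +/- width/2."""
    return math.erf(width / (2 * math.sqrt(2) * sigma))

def reward(width, max_reward=100.0, cost_per_unit=2.0):
    """Reward shrinks as the catcher gets wider."""
    return max(0.0, max_reward - cost_per_unit * width)

def best_width(sigma, grid=None):
    grid = grid or [w / 10 for w in range(1, 500)]
    return max(grid, key=lambda w: reward(w) * p_catch(w, sigma))

for sigma in (2.0, 5.0, 10.0):   # more erratic paths / larger occluders -> larger sigma
    w = best_width(sigma)
    print(f"sigma={sigma}: optimal catcher width ~ {w:.1f}, "
          f"expected gain ~ {reward(w) * p_catch(w, sigma):.1f}")
```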

14.
Buhler and Tompa (2002) introduced the random projection algorithm for the motif discovery problem and demonstrated that this algorithm performs well on both simulated and biological samples. We describe a modification of the random projection algorithm, called the uniform projection algorithm, which utilizes a different choice of projections. We replace the random selection of projections by a greedy heuristic that approximately equalizes the coverage of the projections. We show that this change in selection of projections leads to improved performance on motif discovery problems. Furthermore, the uniform projection algorithm is directly applicable to other problems where the random projection algorithm has been used, including comparison of protein sequence databases.
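A compact sketch of projection-based bucketing, contrasting random projections with a greedy "uniform" selection that approximately equalises how often each motif position is covered. It illustrates the idea of the modification only and is not the authors' exact heuristic.

```python
import random
from collections import Counter, defaultdict

# Projection-based motif finding, sketched: each projection picks k of the l
# motif positions; every l-mer is hashed by the letters at those positions and
# heavily populated buckets become motif candidates. The "uniform" variant
# greedily picks the least-covered positions instead of sampling at random.

def random_projections(l, k, m):
    return [sorted(random.sample(range(l), k)) for _ in range(m)]

def uniform_projections(l, k, m):
    coverage = Counter()
    projections = []
    for _ in range(m):
        # greedily take the k positions used least often so far (ties broken randomly)
        order = sorted(range(l), key=lambda p: (coverage[p], random.random()))
        proj = sorted(order[:k])
        coverage.update(proj)
        projections.append(proj)
    return projections

def bucket(sequences, l, projections):
    buckets = defaultdict(list)
    for s_id, seq in enumerate(sequences):
        for i in range(len(seq) - l + 1):
            lmer = seq[i:i + l]
            for p_id, proj in enumerate(projections):
                key = (p_id, "".join(lmer[p] for p in proj))
                buckets[key].append((s_id, i))
    return buckets

seqs = ["ACGTACGTTTGACGGA", "TTGACGGACGTACGTA", "GGGACGTACGTTTGAC"]
buckets = bucket(seqs, l=8, projections=uniform_projections(l=8, k=4, m=5))
print("largest bucket size:", max(len(v) for v in buckets.values()))
```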

15.
Stabilisation of protein/peptide drugs against thermal denaturation is a challenging problem, especially for liquid formulations. Various polysaccharides at high concentrations have been reported to improve the stability of polypeptides, probably by providing a crowded environment which retards kinetic unfolding and the resultant degradation. Levan is a fructose homopolysaccharide which is finding increasing use in pharmaceutical applications, but it has so far seen little use for protein drug stabilization. In this study, we used levan for stabilizing a liquid preparation of a peptide antibiotic, bacitracin. We prepared liquid formulations of bacitracin with or without levan and subjected them to storage at 25 °C. The stored samples were then analysed over 120 days for denaturation and antibacterial activity. Differential Scanning Calorimetry, Circular Dichroism and High Performance Liquid Chromatography were used for evaluating the effect of levan on thermal denaturation of bacitracin. We found that levan at 2.5% w/v significantly preserved the antibacterial activity of bacitracin for 120 days as compared to plain buffered bacitracin, even when stored at 25 °C. Also, levan at high concentrations maintained the secondary structure and increased the melting temperature (Tm) of bacitracin in solution. Levan did not form covalent interactions or strong complexation with bacitracin. Based on this study, levan appears to be a promising stabilizing agent for preparing liquid formulations of protein/peptide drugs that can be stored at room temperature.

16.
In this paper, a stochastic leader gravitational search algorithm (SL-GSA) based on a randomized k is proposed. Standard GSA (SGSA) utilizes the best agents without any randomization, and thus is more prone to converging on suboptimal results. Initially, the new approach randomly chooses k agents from the set of all agents to improve the global search ability. Gradually, the set of agents is reduced by eliminating the agents with the poorest performances to allow rapid convergence. The performance of the SL-GSA was analyzed for six well-known benchmark functions, and the results are compared with SGSA and some of its variants. Furthermore, the SL-GSA is applied to the minimum variance distortionless response (MVDR) beamforming technique to ensure compatibility with real-world optimization problems. The proposed algorithm demonstrates a superior convergence rate and quality of solution for both real-world problems and benchmark functions compared to the original algorithm and other recent variants of SGSA.
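A compact sketch of the stochastic-leader idea layered on a standard GSA loop and applied to the sphere benchmark: leaders are drawn at random rather than taken as the current best agents, and the population is gradually thinned by dropping the worst performers. All parameter values are illustrative, not those tuned in the paper.

```python
import numpy as np

# Sketch of a GSA-style loop with stochastic (random) leader selection and
# gradual elimination of the worst agents, minimising the sphere function.
# G0, alpha, the k schedule and the elimination rate are illustrative.

def sphere(x):
    return float(np.sum(x ** 2))

def sl_gsa(dim=5, n_agents=30, iters=200, G0=100.0, alpha=20.0, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(-5, 5, (n_agents, dim))     # positions
    V = np.zeros_like(X)                        # velocities
    for t in range(iters):
        fit = np.array([sphere(x) for x in X])
        best, worst = fit.min(), fit.max()
        m = (fit - worst) / (best - worst - 1e-12)           # raw masses (minimisation)
        M = m / (m.sum() + 1e-12)
        G = G0 * np.exp(-alpha * t / iters)                  # decaying gravitational constant
        k = max(2, int(len(X) * (1 - t / iters)))            # shrinking leader-set size
        leaders = rng.choice(len(X), size=k, replace=False)  # stochastic leaders, not the best
        A = np.zeros_like(X)
        for i in range(len(X)):
            for j in leaders:
                if j == i:
                    continue
                R = np.linalg.norm(X[i] - X[j]) + 1e-12
                A[i] += rng.random() * G * M[j] * (X[j] - X[i]) / R
        V = rng.random((len(X), 1)) * V + A
        X = X + V
        # gradually eliminate the worst performers to speed up convergence
        if len(X) > 10 and t % 20 == 19:
            keep = np.argsort(fit)[:-2]
            X, V = X[keep], V[keep]
    fit = np.array([sphere(x) for x in X])
    return X[fit.argmin()], fit.min()

x_best, f_best = sl_gsa()
print("best fitness found:", f_best)
```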

17.
A correlation had previously been established between the actomyosin content of homogenized skeletal muscle cell segments, as determined by extraction in strong salt solution, and the ability of those segments to empty when extracted with buffered water. In this study, we examined the ability of certain compounds to inhibit the process of emptying. Adenosine triphosphate (ATP) and adenosine diphosphate (ADP), which dissociate actomyosin, inhibited the process of emptying, while adenosine monophosphate (AMP), which does not dissociate actomyosin, did not. We conclude that the formation of actomyosin is a necessary prerequisite for emptying and not just a secondary effect. Polyvalent cations were also found to inhibit emptying. The inhibition was reversible by washing with a solution of NaCl-histidine or with the chelating agents ethylenediaminetetraacetate (EDTA) and ethylene-glycol-bis(β-amino-ethyl ether) tetraacetic acid (EGTA). A factor(s) solubilized from aged muscle functions as an inhibitory agent; the suggestion is made that this factor(s) may be a polyvalent cation.

18.
In clinical arthrographic examination, strong hypertonic contrast agents are injected directly into the joint space. This may reduce the stiffness of articular cartilage, which is further hypothesized to lead to overload-induced cell death. We investigated cell death in articular cartilage while the tissue was compressed in situ in physiological saline solution and in full strength hypertonic X-ray contrast agent Hexabrix™. Samples were prepared from bovine patellae and stored in Dulbecco’s Modified Eagle’s Medium overnight. Further, impact tests with or without creep were conducted for the samples with contact stresses and creep times changing from 1 MPa to 10 MPa and from 0 min to 15 min, respectively. Finally, depth-dependent cell viability was assessed with a confocal microscope. In order to characterize changes in the biomechanical properties of cartilage as a result of the use of Hexabrix™, stress-relaxation tests were conducted for the samples immersed in Hexabrix™ and phosphate buffered saline (PBS). Both the dynamic and equilibrium moduli of the samples immersed in Hexabrix™ were significantly (p<0.05) lower than those of the samples immersed in PBS. Cartilage samples immersed in physiological saline solution showed load-induced cell death primarily in the superficial and middle zones. However, under high 8–10 MPa contact stresses, the samples immersed in full strength Hexabrix™ showed significantly (p<0.05) higher numbers of dead cells than the samples compressed in physiological saline, especially in the deep zone of cartilage. In conclusion, excessive loading stresses followed by tissue creep might increase the risk of chondrocyte death in articular cartilage immersed in hypertonic X-ray contrast agent, especially in the deep zone of cartilage.

19.
Accurate mapping of spliced RNA-Seq reads to genomic DNA is a well-known challenging problem. Despite significant efforts invested in developing efficient algorithms, with the human genome as a primary focus, the best solution is still not known. A recently introduced tool, TrueSight, has demonstrated better performance compared with earlier developed algorithms such as TopHat and MapSplice. To improve detection of splice junctions, TrueSight uses information on statistical patterns of nucleotide ordering in intronic and exonic DNA. This line of research led to yet another new algorithm, UnSplicer, designed for eukaryotic species with compact genomes where functional alternative splicing is likely to be dominated by splicing noise. Genome-specific parameters of the new algorithm are generated by GeneMark-ES, an ab initio gene prediction algorithm based on unsupervised training. UnSplicer shares several components with TrueSight; the difference lies in the training strategy and the classification algorithm. We tested UnSplicer on RNA-Seq data sets of Arabidopsis thaliana, Caenorhabditis elegans, Cryptococcus neoformans and Drosophila melanogaster. We have shown that splice junctions inferred by UnSplicer are in better agreement with knowledge accumulated on these well-studied genomes than predictions made by earlier developed tools.

20.
Fusarium toxins are of great practical relevance in animal feeding since they may occur in toxicologically relevant concentrations. Therefore, many attempts have been made to detoxify contaminated feedstuffs or diets in order to cope with the problem. The supplementation of contaminated diets with detoxifying agents seems to be the most feasible approach, and in vitro results seem convincing. According to Guideline 87/153/EEC, the efficacy has to be proven by using an experimental design justified according to the claim for the use of the additive. A review of the literature shows that only a few studies investigated specific parameters which clearly reflect the claimed mode of action of the additives, and those studies demonstrated no measurable detoxifying effects. The majority of investigations focused on rather non-specific performance parameters, and many of these applied incomplete experimental designs. Nevertheless, most of the experiments did not demonstrate preventative effects. It is concluded that testing of currently available detoxifying agents did not follow the Council Directive since the claim for their use was not proven. The application of complete two-by-two factorial experimental designs, the investigation of mycotoxins and/or metabolites in physiological samples as specific parameters, and the verification of the specificity of the detoxifying agent are recommended for future in vivo investigations.
