Similar Documents
1.
Task scheduling for large-scale computing systems is a challenging problem. From the user's perspective, the main concern is the performance of the submitted tasks, whereas for cloud service providers, reducing operating cost while providing the required service is critical. It is therefore important for task scheduling mechanisms to balance users' performance requirements against energy efficiency, because energy consumption is one of the major operational costs. We present a time-dependent value of service (VoS) metric that is maximized by a scheduling algorithm that takes into consideration the arrival time of a task while evaluating the value functions for completing the task at a given time and the task's energy consumption. We consider the variation in value for completing a task at different times, such that the value of energy reduction can change significantly between peak and non-peak periods. To determine the value of a task completion, we use completion time and energy consumption with soft and hard thresholds. We define the VoS of a given workload to be the sum of the values of all tasks executed during a given period of time. Our system model is based on virtual machines, where each task is assigned a resource configuration characterized by a number of homogeneous cores and an amount of memory. To schedule each task submitted to our system, we use an estimated-time-to-compute matrix and an estimated-energy-consumption matrix, both built from historical data. We design, evaluate, and compare our task scheduling methods to show that a significant improvement in energy consumption can be achieved when considering time-of-use-dependent scheduling algorithms. The simulation results show that we improve the performance and energy values by up to 49% compared to schedulers that do not consider the value functions. Consistent with the simulation results, our experimental results from running our value-based scheduling on an IBM blade server show up to 82% improvement in performance value, 110% improvement in energy value, and up to 77% improvement in VoS compared to schedulers that do not consider the value functions.
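As a rough illustration of how a thresholded value function and the workload-level VoS sum might be computed (the piecewise-linear shape, the field names, and the equal weighting of time and energy values are assumptions for this sketch, not the paper's actual functions):

```python
def task_value(completion_time, soft, hard, max_value=1.0):
    """Piecewise-linear value: full value up to the soft threshold,
    linear decay to zero at the hard threshold, zero afterwards.
    (Illustrative shape; the paper's value functions may differ.)"""
    if completion_time <= soft:
        return max_value
    if completion_time >= hard:
        return 0.0
    return max_value * (hard - completion_time) / (hard - soft)

def value_of_service(tasks):
    """VoS of a workload: sum of per-task values (time value + energy value)."""
    return sum(task_value(t["perf_time"], t["perf_soft"], t["perf_hard"])
               + task_value(t["energy"], t["energy_soft"], t["energy_hard"])
               for t in tasks)
```

A task finishing well before its soft deadline contributes full value; one finishing between the soft and hard thresholds contributes a proportionally reduced value.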

2.
As the domain of communication systems grows, heterogeneity among the computers and subnetworks employed for a task also increases. The channel bandwidth available to a message on a communication network varies with time and link. This variation can have a significant effect on the performance of an individual message and on that of the network as a whole. It is therefore important to understand the effects of bandwidth heterogeneity on network performance in order to utilize a heterogeneous communication network optimally. The ability to use such a network optimally is highly desirable in many applications, such as network-based data-intensive high performance computing. The main goal of this paper is to analyze in detail, via extensive simulation, the effects of temporal and spatial heterogeneity on the performance of individual messages, in terms of throughput, end-to-end delay, etc. The problems of path selection and multi-path data transfer are also considered, to illustrate how the analysis results may be used in future efforts to optimize network performance by taking channel bandwidth heterogeneity into account.

3.
4.
Problem: A large audit of colonoscopy in the United Kingdom showed that the unadjusted completion rate was 57% when stringent criteria for identifying the caecum were applied. The caecum should be reached 90% of the time. Little information is available on what units or operators need to do to improve to acceptable levels.
Design: Quality improvement programme using two completed cycles of audit.
Setting: Endoscopy department in a university-linked general hospital in northeast England.
Key measures for improvement: Colonoscopy completion rate.
Strategy for change: Two audit cycles were completed between 1999 and 2002. Changes to practice were based on the results of audit and took into account the opinions of relevant staff. Lack of time for each colonoscopy, poor bowel preparation (especially in frail patients), and a mismatch between the number of colonoscopies done and the completion rate of individual operators were responsible for failed colonoscopies. Appropriate changes were made.
Effects of change: The initial crude colonoscopy completion rate was 60%, improving to 71% after the first round of audit and 88% after the second round, which approximates the agreed audit standard of 90%. The final adjusted completion rate was 94%.
Lessons learnt: Achievement of the national targets in a UK general hospital is possible by lengthening appointments, admitting frail patients to one ward for bowel preparation, and allocating colonoscopies to the most successful operators.

5.
Some principles of information theory are utilized in the design of neural nets of the McCulloch-Pitts type. In particular, problems are considered where signals from several neurons must pass through a single one, resulting in a "bottleneck" in the flow of information, an abstract model of the corresponding bottleneck from the retina to the optic nerve. The first part of the paper deals with the construction of a McCulloch-Pitts net in which the redundancy in the messages originating in two neurons is exploited so that the messages can be sent over a single neuron with little loss of information. In the second part, messages from a set of neurons are "pumped" into two channel neurons. The optimum connection scheme is computed for this case, i.e., the one resulting in a minimum loss of information. Possible biological implications of this approach are indicated.

6.
We examined intra-patriline behavioral plasticity in communication behavior by generating lifetime behavioral profiles for the performance of the vibration signal and waggle dance in workers that were the progeny of three unrelated queens, each inseminated with the semen of a single, different drone. We found pronounced variability within each patriline in the tendency to produce each signal, the ontogeny of signal performance, and the persistence with which individual workers performed the signals throughout their lifetimes. Within each patriline, the number of workers that performed each signal and the distribution of onset ages for each signal were significantly different. In each patriline, workers of all ages could perform vibration signals; vibration signal production began 3–5 d before waggle dancing; and some workers began performing waggle dances at ages typically associated with precocious foraging. Most workers vibrated and waggled only 1–2 d during their lifetimes, although each patriline contained some workers that performed the signal persistently for up to 8 or 9 d. We also found marked variability in signal performance among the three worker lineages examined. Because the vibration signal and waggle dance influence task performance, variability in signaling behavior within and between subfamilies may help to organize information flow and collective labor in honey bee colonies. Inter-patriline variability may influence the total number of workers from different patrilines that perform the signals, whereas intra-patriline variability may further fine-tune signal performance and the allocation of labor to a given set of circumstances. Although intra-patriline behavioral variability is assumed to be widespread in the social insects, our study is the first to document the extent of this variability for honey bee communication signals.

7.
During wakefulness, a constant and continuous stream of complex stimuli and self-driven thoughts permeate the human mind. Here, eleven participants were asked to count down numbers and remember negative or positive autobiographical episodes of their personal lives, for 32 seconds at a time, during which they could freely engage in the execution of those tasks. We then examined the possibility of determining from a single whole-brain functional magnetic resonance imaging scan which one of the two mental tasks each participant was performing at a given point in time. Linear support-vector machines were used to build within-participant classifiers and across-participants classifiers. The within-participant classifiers could correctly discriminate scans with an average accuracy as high as 82%, when using data from all individual voxels in the brain. These results demonstrate that it is possible to accurately classify self-driven mental tasks from whole-brain activity patterns recorded in a time interval as short as 2 seconds.
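A toy version of the within-participant classification setup can be sketched with synthetic data; here a nearest-class-mean linear classifier stands in for the paper's linear support-vector machine, and the "voxel" data are simulated, not fMRI recordings:

```python
import numpy as np

# Synthetic stand-in for whole-brain scans: two mental tasks, each yielding a
# slightly shifted pattern across 50 "voxels" (illustrative data, not fMRI).
rng = np.random.default_rng(0)
n_train, n_test, n_voxels = 200, 100, 50

def make_scans(n):
    y = rng.integers(0, 2, n)             # 0 = counting, 1 = episode recall
    X = rng.normal(size=(n, n_voxels))
    X[y == 1, :5] += 1.0                  # task-dependent signal in 5 voxels
    return X, y

X_tr, y_tr = make_scans(n_train)
X_te, y_te = make_scans(n_test)

# Nearest-class-mean rule: assign each scan to the closer class centroid.
mu0 = X_tr[y_tr == 0].mean(axis=0)
mu1 = X_tr[y_tr == 1].mean(axis=0)
pred = (np.linalg.norm(X_te - mu1, axis=1)
        < np.linalg.norm(X_te - mu0, axis=1)).astype(int)
accuracy = (pred == y_te).mean()
```

Even this minimal linear rule separates the two simulated "tasks" well above chance, illustrating why a single short scan can carry enough pattern information to classify.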

8.
In today's technology-assisted society, social interactions may be expressed through a variety of techno-communication channels, including online social networks, email and mobile phones (calls, text messages). Consequently, a clear grasp of human behavior across these diverse communication media is considered a key factor in understanding the formation of today's information society. So far, previous research on user communication behavior has focused on a single communication activity. In this paper we take another step along this research path by performing a multidimensional study of human sociality as expressed in the use of mobile phones. The paper focuses on users' temporal communication behavior in the interplay between two complementary communication media, text messages and phone calls, which represent the bi-dimensional scenario of the analysis. Our study provides a theoretical framework for analyzing multidimensional bursts as the most general burst category, which includes one-dimensional bursts as the simplest case, and offers empirical evidence of their nature by following the combined phone-call/text-message communication patterns of approximately one million people over a three-month period. This quantitative approach enables the design of a generative model, rooted in the three most significant features of the multidimensional burst (the number of dimensions, prevalence and interleaving degree), able to reproduce the main media-usage patterns. The paper's other contributions include a novel multidimensional burst detection algorithm and an analysis of the human media selection process.

9.
Human behaviour is highly individual by nature, yet statistical structures are emerging that seem to govern the actions of human beings collectively. Here we search for universal statistical laws dictating the timing of human actions in communication decisions. We focus on the distribution of the time interval between messages in human broadcast communication, as documented on Twitter, and study a collection of over 160,000 tweets for three user categories: personal (controlled by one person), managed (typically controlled by a PR agency) and bot-controlled (automated system). To test our hypothesis, we investigate whether it is possible to differentiate between user types based on tweet timing behaviour, independently of message content. For this purpose, we developed a system to process a large number of tweets for reality mining and implemented two simple probabilistic inference algorithms: (1) a naive Bayes classifier, which distinguishes between two and three account categories with classification performance of 84.6% and 75.8%, respectively, and (2) an algorithm that predicts the time of a user's next tweet. Our results show that we can reliably distinguish between the three user categories and predict the distribution of a user's inter-message time with reasonable accuracy. More importantly, we identify a characteristic power-law decrease in the tail of the inter-message time distribution of human users that differs from the one obtained for managed and automated accounts. This result is evidence of a universal law that permeates the timing of human decisions in broadcast communication, and extends the findings of several previous studies of peer-to-peer communication.
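A minimal stand-in for timing-based account classification: synthetic inter-message gaps (heavy-tailed for a "human", near-periodic for a "bot", both invented here) are summarized by a burstiness ratio and classified by a nearest-neighbour rule rather than the paper's naive Bayes:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative inter-tweet times (seconds): "human" accounts are bursty and
# heavy-tailed, "bot" accounts near-periodic. Hypothetical data, not the corpus.
def account_feature(kind, n=200):
    if kind == "human":
        gaps = rng.lognormal(mean=6.0, sigma=2.0, size=n)   # bursty
    else:
        gaps = rng.normal(loc=3600.0, scale=60.0, size=n)   # scheduled
    # One summary feature: std/mean of the gaps (a burstiness ratio).
    return gaps.std() / gaps.mean()

train = [(account_feature(k), k) for k in ["human", "bot"] for _ in range(50)]

def classify(feature, train):
    """1-nearest-neighbour on the burstiness feature (a minimal stand-in
    for the paper's naive Bayes classifier)."""
    return min(train, key=lambda fk: abs(fk[0] - feature))[1]
```

Because heavy-tailed gap distributions have a std/mean far above that of near-periodic ones, even this single feature separates the two synthetic account types cleanly.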

10.
A well-known problem in the design of control systems for robot swarms concerns the definition of suitable individual rules that result in the desired coordinated behaviour. A possible solution to this problem is the automatic synthesis of the individual controllers through evolutionary or learning processes. These processes make it possible to freely search the space of possible solutions for a given task, under the guidance of a user-defined utility function. Nonetheless, there are no general principles to follow when defining such a utility function so that it rewards coordinated group behaviours. As a consequence, task-dependent functions must be devised each time a new coordination problem is studied. In this paper, we propose the use of measures developed in Information Theory as task-independent, implicit utility functions. We present two experiments in which three robots are trained to produce generic coordinated behaviours. Each robot is provided with a rich sensory and motor apparatus, which can be exploited to explore the environment and to communicate with the other robots. We show how coordinated behaviours can be synthesised through a simple evolutionary process. The only criterion used to evaluate the performance of the robotic group is an estimate of the mutual information between the motor states of the robots.
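The fitness measure described above reduces to estimating mutual information from samples. A common plug-in estimator bins the two signals and sums over the joint histogram; the toy "motor state" series below are assumptions for illustration:

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Plug-in estimate of I(X;Y) in bits from a 2-D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)      # marginal of X
    py = pxy.sum(axis=0, keepdims=True)      # marginal of Y
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(2)
a = rng.normal(size=5000)                      # motor state of robot 1 (toy)
coordinated = a + 0.1 * rng.normal(size=5000)  # robot 2 tracks robot 1
independent = rng.normal(size=5000)            # robot 2 moves on its own
```

A coordinated pair yields a high mutual-information estimate while an independent pair yields one near zero, which is exactly the gradient an evolutionary process can exploit as a task-independent reward.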

11.
This paper contrasts two accounts of audience design during multiparty communication: audience design as a strategic individual-level message adjustment or as a non-strategic interaction-level message adjustment. Using a non-interactive communication task, Experiment 1 showed that people distinguish between messages designed for oneself and messages designed for another person; consistent with strategic message design, messages designed for other people were longer (in number of words) than those designed for oneself. However, audience size did not affect message length: messages designed for audiences of different sizes were similar in length. Using an interactive communication task, Experiment 2 showed that as group size increased, so too did communicative effort (the number of words exchanged between interlocutors). Consistent with a non-strategic account, as group members were added, more social interaction was necessary to coordinate the group's collective situation model. Experiment 3 validates and extends the production measures used in Experiments 1 and 2 using a comprehension task. Taken together, our results indicate that audience design arises as a non-strategic outcome of social interaction during group discussion.

12.
13.
A “generic” problem amenable to matrix-algebraic treatment is outlined. Several examples are given and one, a communication system, is studied in some detail. A typical structure matrix is used to describe the channels of communication, and a “status” matrix is used to describe the distribution of information in the system at any time. A theorem is proved relating the status matrix at any time t to the t-th power of the structure matrix. The elements of the communication system are interpreted as individuals who can send messages to each other. For individuals attempting to solve a “group problem”, certain relations are derived between the structure and status matrices and the time of solution. The structure of the communication system is permitted to vary with time. A general theorem is proved relating the status matrix to the matrix product of the series of structure matrices representing the changing structure of the system. Some suggestions are made for further generalizations. In particular, it is suggested that so-called “higher order” information transmission can be treated similarly.
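The central theorem, that the status matrix at time t is the t-th power of the structure matrix, is easy to check numerically on a small example (the three-node ring below is hypothetical):

```python
import numpy as np

# Hypothetical 3-node communication system: S[i][j] = 1 if i can send to j.
# Here the nodes form a directed ring: 0 -> 1 -> 2 -> 0.
S = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]], dtype=int)

def status_matrix(structure, t):
    """Entry (i, j) counts the distinct t-step message routes from i to j:
    the t-th power of the structure matrix."""
    return np.linalg.matrix_power(structure, t)
```

On this ring, after three steps every message is back where it started, so the status matrix at t = 3 is the identity.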

14.
15.
The original aim of Information Theory (IT) was to solve a purely technical problem: to increase the performance of communication systems, which are constantly affected by interference that diminishes the quality of the transmitted information. That is, the theory deals only with the problem of transmitting, with maximal precision, the symbols constituting a message. In Shannon's theory, messages are characterized only by their probabilities, regardless of their value or meaning. As for its present-day status, it is generally acknowledged that Information Theory has solid mathematical foundations and fruitful, strong links with Physics in both theoretical and experimental areas. However, many applications of Information Theory to Biology are limited to using it as a technical tool to analyze biopolymers such as DNA, RNA or protein sequences. The main point of discussion about the applicability of IT to explaining information flow in biological systems is that in a classic communication channel the symbols that make up the coded message are transmitted one by one, independently, through a noisy channel, and noise can alter each of the symbols, distorting the message; in contrast, in a genetic communication channel the coded messages are transmitted not as symbols but by signaling cascades. Consequently, the information flow from the emitter to the effector is due to a series of coupled physicochemical processes that must ensure the accurate transmission of the message. In this review we discuss a novel proposal to overcome this difficulty, which consists of modeling gene expression with a stochastic approach that allows Shannon entropy (H) to be used directly to measure the amount of uncertainty that the genetic machinery has about the correct decoding of a message transmitted into the nucleus by a signaling pathway. From the value of H we can define a function I that measures the amount of information contained in the input message that the cell's genetic machinery is processing during a given time interval. Furthermore, combining Information Theory with frequency-response analysis of dynamical systems, we can examine the cell's genetic response to input signals of varying frequency, amplitude and form, in order to determine whether the cell can distinguish between different regimes of information flow from the environment. In the particular case of the ethylene signaling pathway, the amount of information handled by the root cell of Arabidopsis can be correlated with the frequency of the input signal. The ethylene signaling pathway cuts off very low and very high frequencies, allowing a window of frequency response in which the nucleus reads the incoming message as a varying input. Outside this window the nucleus reads the input message as approximately non-varying. This frequency-response analysis is also useful for estimating the rate of information transfer during the transport of each new ERF1 molecule into the nucleus. Additionally, applying Information Theory to the analysis of information flow in the ethylene signaling pathway provides deeper insight into the way in which the transition between auxin and ethylene hormonal activity occurs during a circadian cycle. An ambitious goal for the future would be to use Information Theory as a theoretical foundation for a suitable model of the information flow that runs at each level, and through all levels, of biological organization.
Key words: information theory, Shannon entropy, frequency systems analysis, Arabidopsis thaliana, ethylene signaling systems, plant genetic networks, circadian cycles
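The entropy computation at the heart of this proposal can be sketched as follows; the two mRNA-count distributions are invented for illustration, and defining I as an entropy difference is one simple reading of the review's H-to-I construction, not its exact formula:

```python
import numpy as np

def shannon_entropy(p):
    """H(p) in bits for a discrete distribution, e.g. over mRNA copy numbers."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()
    nz = p > 0
    return float(-(p[nz] * np.log2(p[nz])).sum())

# Hypothetical steady-state distributions of a gene's mRNA count under two
# input regimes (illustrative numbers, not the Arabidopsis data).
uninduced = [0.70, 0.20, 0.07, 0.03]   # sharply peaked: low uncertainty
induced   = [0.25, 0.25, 0.25, 0.25]   # flat: maximal uncertainty

# One way to quantify information gained when the signal resolves the state:
# the drop in uncertainty relative to the flat (maximum-entropy) case.
I = shannon_entropy(induced) - shannon_entropy(uninduced)
```

A flat four-state distribution carries the maximal 2 bits of uncertainty; the peaked one carries less, and the difference is the information the decoded signal supplies.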

16.
One difficulty in summarising biological survivorship data is that the hazard rates are often neither constant, nor increasing, nor decreasing with time over the entire life span. The promising Weibull model does not work here. The paper demonstrates how bathtub-shaped quadratic models may be used in such a case. Further, sometimes, due to a paucity of data, actual lifetimes are not ascertainable. It is shown how a concept from queuing theory, namely first in first out (FIFO), can be profitably used here. Another nonstandard situation considered is one in which the lifespan of the individual entity is too long compared to the duration of the experiment. This situation is dealt with by using ancillary information. In each case the methodology is illustrated with numerical examples.
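A quadratic bathtub hazard and the survival curve it implies can be sketched numerically; the parameter values below are invented, not fitted to any data set:

```python
import numpy as np

def bathtub_hazard(t, a=0.002, m=40.0, c=0.01):
    """Quadratic 'bathtub' hazard: high early and late in life, lowest at
    age m. (Illustrative parameter values only.)"""
    return a * (t - m) ** 2 + c

def survival(t_max, dt=0.01, **kw):
    """S(t) = exp(-cumulative hazard), integrated numerically."""
    t = np.arange(0.0, t_max + dt, dt)
    H = np.cumsum(bathtub_hazard(t, **kw)) * dt   # cumulative hazard
    return t, np.exp(-H)
```

Because the hazard is strictly positive everywhere, the survival curve is strictly decreasing, and the bathtub shape concentrates mortality at the two ends of the life span.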

17.
In a previous paper a method was given by which the efferent activity of an idealized neural net could be calculated from a given afferent pattern. Those results are extended in the present paper. Conditions are given under which nets may be considered equivalent. Rules are given for the reduction or extension of a net to an equivalent net. A procedure is given for constructing a net which has the property of converting each of a given set of afferent activity patterns into its corresponding prescribed efferent activity pattern.

18.
Clusters of workstations and networked parallel computing systems are emerging as promising computational platforms for HPC applications. The processors in such systems are typically interconnected by a collection of heterogeneous networks such as Ethernet, ATM, and FDDI, among others. In this paper, we develop techniques to perform block-cyclic redistribution over P processors interconnected by such a collection of heterogeneous networks. We represent the communication scheduling problem using a timing-diagram formalism: each interprocessor communication event is represented by a rectangle whose height denotes the time to perform the event over the heterogeneous network. The communication scheduling problem is then one of positioning the rectangles so as to minimize the completion time of all the communication events. For the important case where the block size changes by a factor of K, we develop a heuristic algorithm whose completion time is at most twice the optimal. The running time of the heuristic is O(PK^2). Our heuristic algorithm is adaptive to variations in network performance, and derives schedules at run time based on current information about the available network bandwidth. Our experimental results show that our schedules always have communication times that are very close to optimal. This revised version was published online in July 2006 with corrections to the cover date.
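The rectangle-positioning problem is a makespan-minimization problem. The paper's K-factor heuristic is not reproduced here; the sketch below shows only the generic longest-first greedy idea on which such heuristics are commonly built (slot counts and event durations are hypothetical):

```python
import heapq

def greedy_schedule(durations, n_slots):
    """Longest-first greedy placement of communication events (rectangle
    heights = durations) into n_slots parallel communication channels;
    returns the resulting completion time. (A generic makespan heuristic,
    not the paper's K-factor algorithm.)"""
    slots = [0.0] * n_slots          # current finish time of each channel
    heapq.heapify(slots)
    for d in sorted(durations, reverse=True):
        t = heapq.heappop(slots)     # least-loaded channel
        heapq.heappush(slots, t + d)
    return max(slots)
```

Like the paper's heuristic, longest-first greedy placement carries a constant-factor bound on completion time relative to the optimal schedule.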

19.
Animals must continuously evaluate sensory information to select the preferable among possible actions in a given context, including the option to wait for more information before committing to another course of action. In experimental sensory decision tasks that replicate these features, reaction time distributions can be informative about the implicit rules by which animals determine when to commit and what to do. We measured reaction times of Long-Evans rats discriminating the direction of motion in a coherent random dot motion stimulus, using a self-paced two-alternative forced-choice (2-AFC) reaction time task. Our main findings are: (1) When motion strength was constant across trials, the error trials had shorter reaction times than correct trials; in other words, accuracy increased with response latency. (2) When motion strength was varied in randomly interleaved trials, accuracy increased with motion strength, whereas reaction time decreased. (3) Accuracy increased with reaction time for each motion strength considered separately, and in the interleaved motion strength experiment overall. (4) When stimulus duration was limited, accuracy improved with stimulus duration, whereas reaction time decreased. (5) Accuracy decreased with response latency after stimulus offset. This was the case for each stimulus duration considered separately, and in the interleaved duration experiment overall. We conclude that rats integrate visual evidence over time, but in this task the time of their response is governed more by elapsed time than by a criterion for sufficient evidence.
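The evidence-integration account suggested by these findings is often formalized as a drift-diffusion process. The toy simulation below (bound, drift and noise values are assumptions, not the authors' fitted parameters) reproduces the qualitative pattern that stronger motion yields faster decisions:

```python
import numpy as np

rng = np.random.default_rng(3)

def diffusion_trial(drift, threshold=30.0, noise=1.0, max_steps=10000):
    """Accumulate noisy evidence until a bound is crossed; return
    (choice, reaction time in steps). A textbook drift-diffusion sketch,
    not the paper's model of the rats' behaviour."""
    x = 0.0
    for step in range(1, max_steps + 1):
        x += drift + noise * rng.normal()
        if abs(x) >= threshold:
            return (1 if x > 0 else -1), step
    return 0, max_steps     # no commitment within the trial
```

Running many trials at a high versus a low drift rate (analogous to strong versus weak motion coherence) shows shorter mean reaction times for the stronger signal, mirroring finding (2) above.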

20.
Motta G, Ombao H. Biometrics 2012;68(3):825–836.
Summary: In this article, we develop a novel method that explains the dynamic structure of multi-channel electroencephalograms (EEGs) recorded over several trials of a motor-visual task experiment. Preliminary analyses of our data suggest two statistical challenges. First, the variance at each channel and the cross-covariance between each pair of channels evolve over time. Moreover, the cross-covariance profiles display a common structure across all pairs, and these features appear consistently across all trials. In the light of these features, we develop a novel evolutionary factor model (EFM) for multi-channel EEG data that systematically integrates information across replicated trials and allows for smoothly time-varying factor loadings. The individual EEG series share common features across trials, suggesting the need to pool information across trials, which motivates the use of the EFM for replicated time series. We explain the common co-movements of EEG signals through the existence of a small number of common factors. These latent factors are primarily responsible for processing the visual-motor task and, through the loadings, drive the behavior of the signals observed at the different channels. The estimation of the time-varying loadings is based on the spectral decomposition of the estimated time-varying covariance matrix.
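A crude numerical analogue of estimating time-varying loadings from replicated trials: pool trials within a local time window, estimate the covariance, and take its leading eigenvector (the data, the single-factor structure, and the windowing are assumptions of this sketch, not the EFM's actual estimator):

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy replicated multi-channel series: one latent factor drives 4 channels
# with loadings that drift over time (illustrative data, not the EEG study).
T, n_ch, n_trials = 300, 4, 20
t = np.linspace(0, 1, T)
true_load = np.stack([1 + 0.5 * t, 1 - 0.5 * t, 0.5 + t, np.ones(T)], axis=1)
factor = rng.normal(size=(n_trials, T))
X = (factor[:, :, None] * true_load[None, :, :]
     + 0.1 * rng.normal(size=(n_trials, T, n_ch)))

def loadings_at(s, half_window=30):
    """Leading eigenvector of the covariance estimated by pooling all trials
    within a local time window: a stand-in for a smooth loading estimate."""
    lo, hi = max(0, s - half_window), min(T, s + half_window)
    seg = X[:, lo:hi, :].reshape(-1, n_ch)   # pool trials and window times
    cov = np.cov(seg, rowvar=False)
    w, V = np.linalg.eigh(cov)               # eigenvalues ascending
    v = V[:, -1]                             # leading eigenvector
    return v * np.sign(v[-1])                # fix sign for comparability
```

Pooling across replicated trials is what makes the local covariance, and hence the loading estimate, stable at each time point, which is the motivation the abstract gives for the EFM.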


Copyright©北京勤云科技发展有限公司  京ICP备09084417号