Similar documents
 20 similar documents found
1.
This paper addresses the efficient exploitation of task-level parallelism, present in many dense linear algebra operations, from the point of view of both computational performance and energy consumption. The strategies considered here, referred to as the Slack Reduction Algorithm (SRA) and the Race-to-Idle Algorithm (RIA), adjust the operating frequency of the cores during the execution of a collection of tasks (into which many dense linear algebra algorithms can be decomposed), taking very different approaches to saving energy. The procedures are evaluated using an energy-aware simulator, which schedules and maps the execution of these tasks to the cores, leveraging the dynamic voltage and frequency scaling (DVFS) featured by current processors. Experiments with this tool, together with the practical integration of the RIA strategy into a runtime system, show the energy gains for two versions of the QR factorization.
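For illustration, here is a minimal sketch of the race-to-idle idea, not the authors' simulator: run pending work at the highest frequency and drop to the lowest when the core is idle. Frequencies, power figures, and the task list are made up for the example.

```python
# Hedged sketch of a race-to-idle frequency policy for a single core.
F_HIGH, F_LOW = 2.6, 0.8          # GHz (illustrative)
P_HIGH, P_LOW = 30.0, 5.0         # Watts at each frequency (illustrative)

def race_to_idle(task_cycles, horizon):
    """Run queued work at F_HIGH, idle at F_LOW; return (busy time, energy)."""
    busy = sum(c / (F_HIGH * 1e9) for c in task_cycles)   # seconds at full speed
    idle = max(horizon - busy, 0.0)
    energy = busy * P_HIGH + idle * P_LOW
    return busy, energy

tasks = [3e9, 5e9, 2e9]           # cycles per task (illustrative)
print(race_to_idle(tasks, horizon=10.0))
```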

2.
This work tackles the problem of reducing the power consumption of the OLSR routing protocol in vehicular networks. Nowadays, energy-aware and green communication protocols are important research topics, especially when deploying wireless mobile networks. This article introduces a fast automatic methodology to search for energy-efficient OLSR configurations by using a parallel evolutionary algorithm. The experimental analysis demonstrates that significant improvements over the standard configuration can be attained in terms of power consumption, with no noteworthy loss in QoS.
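A hedged sketch of the kind of evolutionary search described: candidate OLSR configurations are mutated and selected against a fitness function. HELLO and TC intervals are genuine OLSR parameters; the bounds and the fitness function below are placeholders for the simulator-based power/QoS evaluation used in the study.

```python
import random

BOUNDS = {"hello_interval": (1.0, 8.0), "tc_interval": (2.0, 16.0)}  # illustrative ranges

def random_config():
    return {k: random.uniform(lo, hi) for k, (lo, hi) in BOUNDS.items()}

def mutate(cfg, rate=0.5):
    child = dict(cfg)
    for k, (lo, hi) in BOUNDS.items():
        if random.random() < rate:
            child[k] = min(hi, max(lo, child[k] + random.gauss(0, (hi - lo) * 0.1)))
    return child

def fitness(cfg):
    # Placeholder: fewer control messages per second as a proxy for lower power use.
    return -(1.0 / cfg["hello_interval"] + 1.0 / cfg["tc_interval"])

population = [random_config() for _ in range(20)]
for _ in range(100):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]
    population = parents + [mutate(random.choice(parents)) for _ in range(10)]
print(max(population, key=fitness))
```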

3.
Autonomous wireless sensor networks are subject to power, bandwidth, and resource limitations that can be represented as capacity constraints imposed on their equivalent flow networks. The maximum sustainable workload (i.e., the maximum data flow from the sensor nodes to the collection point that is compatible with the capacity constraints) is the max-flow of the flow network. Although a large number of energy-aware routing algorithms for ad-hoc networks have been proposed, they usually aim at maximizing the lifetime of the network rather than the steady-state sustainability of the workload. Energy harvesting techniques, which provide a renewable supply to sensor nodes, prompt a paradigm shift from energy-constrained lifetime optimization to power-constrained workload optimization.
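The maximum sustainable workload is a standard max-flow computation on the equivalent flow network. A minimal Edmonds-Karp sketch on a toy topology follows; node names and capacities are illustrative, not taken from the paper.

```python
from collections import deque

def max_flow(capacity, source, sink):
    """Edmonds-Karp: repeatedly augment along shortest residual paths."""
    flow = 0
    residual = {u: dict(vs) for u, vs in capacity.items()}
    while True:
        parent = {source: None}
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v, cap in residual.get(u, {}).items():
                if cap > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return flow
        # Collect the augmenting path, find its bottleneck, update residual capacities.
        path, v = [], sink
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(residual[u][v] for u, v in path)
        for u, v in path:
            residual[u][v] -= bottleneck
            residual.setdefault(v, {}).setdefault(u, 0)
            residual[v][u] += bottleneck
        flow += bottleneck

# Toy sensor network: capacities encode per-link sustainable data rates.
caps = {"src": {"s1": 3, "s2": 2}, "s1": {"relay": 3}, "s2": {"relay": 2},
        "relay": {"sink": 4}, "sink": {}}
print(max_flow(caps, "src", "sink"))   # -> 4
```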

4.
Understanding the role of genetic variation in human diseases remains an important problem in genomics. An important component of such variation consists of variations at single sites in DNA, or single nucleotide polymorphisms (SNPs). Typically, the problem of associating particular SNPs with phenotypes has been confounded by hidden factors such as the presence of population structure, family structure, or cryptic relatedness in the sample of individuals being analyzed. Such confounding factors lead to a large number of spurious associations and missed associations. Various statistical methods have been proposed to account for such confounding factors, such as linear mixed-effect models (LMMs) or methods that adjust data based on a principal components analysis (PCA), but these methods either suffer from low power or cease to be tractable for larger numbers of individuals in the sample. Here we present a statistical model for conducting genome-wide association studies (GWAS) that accounts for such confounding factors. Our method's runtime scales quadratically in the number of individuals being studied, with only a modest loss in statistical power compared to LMM-based and PCA-based methods when tested on synthetic data generated from a generalized LMM. Applying our method to both real and synthetic human genotype/phenotype data, we demonstrate its ability to correct for confounding factors while requiring significantly less runtime than LMMs. We have implemented methods for fitting these models, which are available at http://www.microsoft.com/science.
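As a hedged, minimal illustration of the PCA-based correction that the abstract contrasts against (not the authors' own model): include the top principal components of the genotype matrix as covariates in a per-SNP linear regression. The data below are synthetic and the statistic is a simple t-like score.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 50                                    # individuals, SNPs (toy data)
genotypes = rng.integers(0, 3, size=(n, p)).astype(float)
phenotype = genotypes[:, 0] * 0.5 + rng.normal(size=n)   # SNP 0 is causal

# Top principal components of the genotype matrix capture population structure.
centered = genotypes - genotypes.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
pcs = centered @ vt[:2].T

def snp_statistic(snp):
    """Regress phenotype on one SNP plus PCs; return a |t|-like score for the SNP."""
    X = np.column_stack([np.ones(n), snp, pcs])
    beta = np.linalg.lstsq(X, phenotype, rcond=None)[0]
    resid = phenotype - X @ beta
    se = np.sqrt(resid @ resid / (n - X.shape[1]) * np.linalg.inv(X.T @ X)[1, 1])
    return abs(beta[1] / se)

scores = [snp_statistic(genotypes[:, j]) for j in range(p)]
print(int(np.argmax(scores)))                     # the causal SNP (index 0) scores highest
```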

5.
The mass media is a major source of health information for the public, and as such the quality and independence of health news reporting is an important concern. Concerns have been expressed that journalists reporting on health are increasingly dependent on their sources—including representatives of industries responsible for manufacturing health-related products—for story ideas and content. Many critics perceive an imbalance of power between journalists and industry sources, with industry being in a position of relative power; however, the empirical evidence to support this view is limited. The analysis presented here—part of a larger study of industry-journalist relationships—draws on in-depth, semi-structured interviews with representatives of health-related industries in Australia to inductively examine their perceptions of power relations between industry and journalists. Participants painted a picture in which journalists, rather than themselves, were in a position to control the nature, extent, and outcome of their interactions with industry sources. Our results resonate with the concept of “mediatisation” as it has been applied in the domain of political reporting. It appears that, from the perspective of industry representatives, the imposition of media logic on health-related industries may inappropriately influence the information that the public receives about health-related products.

6.

Data centers, clusters, and grids have historically supported High-Performance Computing (HPC) applications. Due to the high capital and operational expenditures associated with such infrastructures, there have been consistent efforts in recent years to run HPC applications in the cloud. The potential advantages of this shift include higher scalability and lower costs. While application instantiation—through customized Virtual Machines (VMs)—is a well-solved issue, the network still represents a significant bottleneck. When moving HPC applications to the cloud, we lose control over where VMs are placed and over the paths traversed for processes to communicate with one another. To bridge this gap, we present Janus, a framework for dynamic, just-in-time path provisioning in cloud infrastructures. By leveraging emerging software-defined networking principles, the framework allows an HPC application, once deployed, to have its interprocess communication paths configured on use based on the least-used network links (instead of resorting to shortest, pre-computed paths). Janus is fully configurable to cope with different operating parameters and communication strategies, providing a rich ecosystem for speeding up application execution. Through an extensive experimental evaluation, we provide evidence that the proposed framework can lead to significant runtime gains. Moreover, we show what one can expect in terms of system overheads, providing essential insights on how to benefit most from Janus.
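A hedged sketch of the path-selection idea behind this kind of provisioning, not the framework's actual implementation: weight each link by its current utilization and run Dijkstra, so interprocess paths prefer the least-used links rather than the shortest hop count. Topology and utilization values are illustrative.

```python
import heapq

def least_used_path(graph, utilization, src, dst):
    """Dijkstra over link utilization instead of hop count or distance."""
    dist, prev = {src: 0.0}, {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v in graph.get(u, []):
            cost = d + utilization[(u, v)]
            if cost < dist.get(v, float("inf")):
                dist[v], prev[v] = cost, u
                heapq.heappush(heap, (cost, v))
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return list(reversed(path))

# Toy topology: the direct link s1-s3 is congested, so the path detours via s2.
graph = {"s1": ["s2", "s3"], "s2": ["s1", "s3"], "s3": ["s1", "s2"]}
util = {("s1", "s3"): 0.9, ("s3", "s1"): 0.9, ("s1", "s2"): 0.1,
        ("s2", "s1"): 0.1, ("s2", "s3"): 0.2, ("s3", "s2"): 0.2}
print(least_used_path(graph, util, "s1", "s3"))   # -> ['s1', 's2', 's3']
```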


7.
During the past decade, cluster computing and mobile communication technologies have been extensively deployed and widely applied because of their enormous commercial value. Rapid technological advancement makes it feasible to integrate these two technologies, and a revolutionary application called mobile cluster computing is arising on the horizon. Mobile cluster computing can further enhance the power of our laptops and mobile devices by running parallel applications. However, scheduling parallel applications on mobile clusters is technically challenging due to the significant communication latency and limited battery life of mobile devices. Therefore, shortening schedule length and conserving energy have become two major concerns in designing efficient and energy-aware scheduling algorithms for mobile clusters. In this paper, we propose two novel scheduling strategies aimed at balancing performance and power consumption for parallel applications running on mobile clusters. Our research focuses on scheduling precedence-constrained parallel tasks, and duplication heuristics are applied to minimize communication overheads. However, existing duplication algorithms are developed with consideration of schedule lengths, completely ignoring the energy consumption of clusters. In this regard, we design two energy-aware duplication scheduling algorithms, called EADUS and TEBUS, to schedule precedence-constrained parallel tasks with a complexity of O(n^2), where n is the number of tasks in a parallel task set. Unlike existing duplication-based scheduling algorithms that replicate all possible predecessors of each task, the proposed algorithms judiciously replicate the predecessors of a task only if the duplication helps conserve energy. Our energy-aware scheduling strategies are conducive to balancing the schedule lengths and energy savings of a set of precedence-constrained parallel tasks. We conducted extensive experiments using both synthetic benchmarks and real-world applications to compare our algorithms with two existing approaches. Experimental results based on simulated mobile clusters demonstrate the effectiveness and practicality of the proposed duplication-based scheduling strategies. For example, EADUS and TEBUS reduce energy consumption for the Gaussian Elimination application by averages of 16.08% and 8.1%, with merely 5.7% and 2.2% increases in schedule length, respectively.
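A hedged sketch of the duplication decision the abstract describes: a predecessor task is replicated onto the child's processor only when the energy spent on the extra computation is lower than the energy of the inter-processor communication it removes. The power constants below are illustrative, not the paper's energy model.

```python
COMPUTE_POWER = 20.0      # Watts while executing a task (illustrative)
NETWORK_POWER = 5.0       # Watts while a message is in flight (illustrative)

def should_duplicate(pred_runtime, message_time):
    """Duplicate a predecessor only if it costs less energy than the communication."""
    duplication_energy = pred_runtime * COMPUTE_POWER
    communication_energy = message_time * NETWORK_POWER
    return duplication_energy < communication_energy

# Duplicating a short predecessor to avoid a long transfer pays off:
print(should_duplicate(pred_runtime=0.05, message_time=0.5))   # True
print(should_duplicate(pred_runtime=1.0,  message_time=0.5))   # False
```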

8.
The paper gives an overview of the status of the theoretical analysis of Ant Colony Optimization (ACO) algorithms, with a special focus on the analytical investigation of the runtime required to find an optimal solution to a given combinatorial optimization problem. First, a general framework for studying questions of this type is presented, and three important ACO variants are recalled within this framework. Second, two classes of formal techniques for runtime investigations of the considered type are outlined. Finally, some available runtime complexity results for ACO variants, referring to elementary test problems that have been introduced in the theoretical literature on evolutionary algorithms, are cited and discussed.
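The elementary test problems mentioned (for example, OneMax from the evolutionary-computation literature) admit very small ACO models. Below is a hedged sketch of a single-ant, pheromone-based construction with bounded pheromones; it illustrates the general setup rather than any specific variant analyzed in the paper.

```python
import random

def onemax(bits):
    return sum(bits)

def aco_onemax(n=30, rho=0.1, iterations=2000, seed=1):
    """One ant per iteration samples a bit string from per-bit pheromone values."""
    random.seed(seed)
    tau = [0.5] * n                       # pheromone = probability of setting bit i to 1
    best, best_val = None, -1
    for _ in range(iterations):
        bits = [1 if random.random() < t else 0 for t in tau]
        if onemax(bits) > best_val:
            best, best_val = bits, onemax(bits)
        # Reinforce the best-so-far solution, keeping pheromones in [1/n, 1 - 1/n].
        for i in range(n):
            tau[i] = (1 - rho) * tau[i] + rho * best[i]
            tau[i] = min(max(tau[i], 1.0 / n), 1.0 - 1.0 / n)
    return best_val

print(aco_onemax())    # typically reaches the optimum, n = 30
```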

9.
10.
Industrial ecology rests historically—even in its short lifetime of 15 years or so—on the metaphorical power of natural ecosystems. Its evolution parallels the rise of concerns over unsustainability, that is, the threats to our world's ability to support human life, and the emergence of sustainability as a normative goal on a global scale. This article examines the relationships between industrial ecology and sustainability and argues that, in its historical relationship to classical ecology models, the field lacks the power to address the full range of goals of sustainability, however defined. The classical ecosystem analogy omits aspects of human social and cultural life central to sustainability. But by moving beyond this model to more recent ecosystem models based on complexity theory, the field can expand its purview to address sustainability more broadly and powerfully. Complexity models of living systems can also ground alternative normative models for sustainability as an emergent property rather than the output of a mechanistic economic model of society's workings.

11.
Results from studies of the parameters of a novel type of plasma source—a hollow cathode magnetron—are presented. The magnetron operates at a gas pressure of 5–20 mTorr, with the discharge power in the range of 0.5–4 kW. At discharge powers exceeding 2 kW, a plasma flow with a density higher than 10¹¹ cm⁻³ and a length of up to 30 cm forms at the magnetron output. Using a gridded quartz crystal microbalance, the ionized copper flux fraction was measured as a function of the gas pressure, discharge power, and distance from the target. At gas pressures higher than 15 mTorr, the degree of ionization at a distance of 31 cm exceeds 50%.

12.
13.
The sensitivity analysis of a Cellular Genetic Algorithm (CGA) with local search is used to design a new and faster heuristic for the problem of mapping independent tasks to a distributed system (such as a computer cluster or grid) in order to minimize makespan (the time when the last task finishes). The proposed heuristic improves on the previously known Min-Min heuristic. Moreover, the heuristic finds mappings of similar quality to the original CGA but in a significantly reduced runtime (about 1,000 times faster). The proposed heuristic is evaluated across twelve different classes of scheduling instances. In addition, a proof of the energy efficiency of the algorithm is provided, and the accompanying convergence study suggests how additional energy reduction can be achieved by inserting low-power computing nodes into the distributed computer system. Simulation results show that this approach reduces both energy consumption and makespan.
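For reference, a minimal sketch of the baseline Min-Min heuristic that the paper improves on: repeatedly pick the task whose earliest completion time over all machines is smallest and assign it to that machine. The ETC (expected time to compute) matrix below is illustrative.

```python
def min_min(etc):
    """etc[t][m] = estimated execution time of task t on machine m (illustrative)."""
    machines = len(etc[0])
    ready = [0.0] * machines                 # machine availability times
    unassigned = set(range(len(etc)))
    schedule = []
    while unassigned:
        # Find the task/machine pair with the overall minimum completion time
        # (equivalent to the "minimum of per-task minima" formulation of Min-Min).
        best_task, best_machine, best_ct = None, None, float("inf")
        for t in unassigned:
            for m in range(machines):
                ct = ready[m] + etc[t][m]
                if ct < best_ct:
                    best_task, best_machine, best_ct = t, m, ct
        schedule.append((best_task, best_machine))
        ready[best_machine] = best_ct
        unassigned.remove(best_task)
    return schedule, max(ready)              # assignments and makespan

etc = [[3, 5], [2, 4], [4, 1], [6, 2]]       # 4 tasks x 2 machines
print(min_min(etc))                          # -> ([(2, 1), (1, 0), (3, 1), (0, 0)], 5)
```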

14.
All social species face various “collective action problems” (CAPs) or “social dilemmas,” that is, problems in achieving cooperation when the best move from a selfish point of view yields an inferior collective outcome. Compared to most other species, humans are very good at solving these challenges, suggesting that something rather peculiar about human sociality facilitates collective action. This article proposes that language — the uniquely human faculty of symbolic communication — fundamentally alters the possibilities for collective action. I explore these issues using simple game-theoretic models and empirical evidence (both ethnographic and experimental). I review several standard mechanisms for the evolution of cooperation — mutualism, reciprocal altruism, indirect reciprocity, and signaling — highlighting their limitations when it comes to explaining large-group cooperation, as well as the ways in which language helps overcome those limitations. Language facilitates complex coordination and is essential for establishing norms governing production efforts and the distribution of collective goods that motivate people to cooperate voluntarily in large groups. Language also significantly lowers the cost of detecting and punishing “free riders,” thus greatly enhancing the scope and power of standard conditional reciprocity. In addition, symbolic communication encourages new forms of collectively beneficial displays and reputation management — what evolutionists often term “signaling” and “indirect reciprocity.” Thus, language reinforces existing forces that favor the evolution of cooperation, while creating new opportunities for collective action not available even to our closest primate relatives.

15.
Sequence analysis has become essential to the study of genomes and to biological research in general. The Basic Local Alignment Search Tool (BLAST) leads the way as the most widely accepted method for performing query searches and analysis of discovered genes. To combat growing data sizes and speed up job runtimes, scientists are turning to grid computing technologies. However, grid environments are characterized by the dynamic, heterogeneous, and transient state of available resources, which is a major hindrance for users trying to attain desired levels of service. This paper analyzes the performance characteristics of NCBI BLAST on several resources and captures the influence of resource characteristics and job parameters on BLAST job runtime across those resources. The results obtained are summarized as a set of principles characterizing the performance of NCBI BLAST across homogeneous and heterogeneous environments. These principles are then applied and verified through the creation of a grid-enabled BLAST wrapper application called Dynamic BLAST. Results show runtime savings of up to 50% and a resource utilization improvement of approximately 40%.
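A hedged sketch of the load-splitting idea behind this kind of wrapper, not its actual code: divide the query set across resources in proportion to their benchmarked throughput, then dispatch one BLAST invocation per chunk. The resource names, throughputs, and file names are illustrative; the commented blastn command follows NCBI BLAST+ conventions but is shown only as an example.

```python
import subprocess

def split_queries(query_ids, throughputs):
    """Assign query sequences to resources proportionally to their throughput."""
    total = sum(throughputs.values())
    chunks, start = {}, 0
    for name, speed in throughputs.items():
        size = round(len(query_ids) * speed / total)
        chunks[name] = query_ids[start:start + size]
        start += size
    # Any remainder from rounding goes to the fastest resource.
    leftover = query_ids[start:]
    if leftover:
        fastest = max(throughputs, key=throughputs.get)
        chunks[fastest] += leftover
    return chunks

resources = {"cluster-a": 4.0, "cluster-b": 1.0}     # relative throughputs (benchmarked)
queries = [f"q{i}" for i in range(10)]
print(split_queries(queries, resources))

# Illustrative dispatch per chunk; 'chunk.fasta' and the database name are placeholders.
# subprocess.run(["blastn", "-query", "chunk.fasta", "-db", "nt", "-out", "chunk.out"])
```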

16.
The objective of this paper is to explain the cause and course of the 1930s Great Depression from a biophysical economic perspective. The Depression was a painful episode in the socio-technological transition from a coal/railroad regime to one based on hydrocarbons, motor vehicles, and electricity. Its beginning—the Great Crash of October 1929—corresponded with drastic cuts in oil prices and the announcement of oil supply certainty, following the discovery of huge oilfields in the US Southwest. The Depression principally centered on a change from railroads to motor-vehicle-based transportation, but was long and drawn out due to the hegemonic power that the railroads held over the US economy. The late 1920s saw increased use of hydrocarbon-based technologies, but the emerging technologies were still reliant on the old technological system. Methods of biophysical economics, mapping energy flows to capital formation, show the critical role of railroads in the Depression. In 1929, railroads accounted for 24% of the non-residential capital stock; they delivered between 70% and 76% of energy needs and 69% of the energy required for capital formation. Thus a hypothesis emerges that dwindling investment in the railroads was a major constraint on the economy. In biophysical terms, the US economy's main energy delivery system—coal carried by railcars—was hamstrung. Energy flow Sankey diagrams for 1929 and 1939 show the gradual change in energy systems that occurred over the Depression.

17.
Whole-genome comparison based on the analysis of gene cluster conservation has become a popular approach in comparative genomics. While gene order and gene content as a whole randomize over time, certain groups of genes, which are often functionally related, are observed to remain co-located across species. However, the conservation is usually not perfect, which turns the identification of these structures, often referred to as approximate gene clusters, into a challenging task. In this article, we present an efficient set-distance-based approach that computes approximate gene clusters by means of reference occurrences. We show that it yields results highly comparable to those of the corresponding non-reference-based approach, while its polynomial runtime allows for approximate gene cluster detection in parameter ranges that used to be feasible only with simpler, e.g., max-gap-based, gene cluster models. To further illustrate the performance and predictive power of our algorithm, we compare it to a state-of-the-art approach for max-gap gene cluster computation.
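A toy illustration of the set-distance idea, not the authors' algorithm: compare the gene content of a reference interval against sliding windows of another genome using the symmetric-difference distance and report the best-matching window. Gene identifiers are made up for the example.

```python
def set_distance(a, b):
    """Symmetric-difference distance between two gene-content sets."""
    return len(a ^ b)

def best_window(reference_genes, genome, window):
    """Slide a window over `genome`; return (start, distance) of the closest match."""
    ref = set(reference_genes)
    best = None
    for start in range(len(genome) - window + 1):
        d = set_distance(ref, set(genome[start:start + window]))
        if best is None or d < best[1]:
            best = (start, d)
    return best

# Toy gene orders; identifiers stand for gene-family labels.
reference = ["g1", "g2", "g3", "g4"]
other     = ["g9", "g2", "g4", "g3", "g1", "g7", "g8"]
print(best_window(reference, other, window=4))   # -> (1, 0): g1..g4 co-located despite reordering
```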

18.
MOTIVATION: Physical mapping of chromosomes using the maximum likelihood (ML) model is a problem of high computational complexity, entailing both discrete optimization to recover the optimal probe order and continuous optimization to recover the optimal inter-probe spacings. In this paper, two versions of the genetic algorithm (GA) are proposed, one with heuristic crossover and deterministic replacement and the other with heuristic crossover and stochastic replacement, for the physical mapping problem under the maximum likelihood model. The genetic algorithms are compared with two other discrete optimization approaches, namely simulated annealing (SA) and large-step Markov chains (LSMC), in terms of solution quality and runtime efficiency. RESULTS: The physical mapping algorithms based on the GA, SA, and LSMC have been tested using synthetic datasets and real datasets derived from cosmid libraries of the fungus Neurospora crassa. The GA, especially the version with heuristic crossover and stochastic replacement, is shown to consistently outperform the SA-based and LSMC-based physical mapping algorithms in terms of runtime and final solution quality. Experimental results on real and simulated datasets are presented. Further improvements to the GA in the context of physical mapping under the maximum likelihood model are proposed. AVAILABILITY: The software is available upon request from the first author.

19.
20.
MOTIVATION: To identify and characterize regions of functional interest in genomic sequence requires full, flexible query access to an integrated, up-to-date view of all related information, irrespective of where it is stored (within an organization or across the Internet) and its format (traditional database, flat file, web site, results of runtime analysis). Wide-ranging multi-source queries often return unmanageably large result sets, requiring non-traditional approaches to exclude extraneous data. RESULTS: Target Informatics Net (TINet) is a readily extensible data integration system developed at GlaxoSmithKline (GSK), based on the Object-Protocol Model (OPM) multidatabase middleware system of Gene Logic Inc. Data sources currently integrated include: the Mouse Genome Database (MGD) and Gene Expression Database (GXD), GenBank, SwissProt, PubMed, GeneCards, the results of runtime BLAST and PROSITE searches, and GSK proprietary relational databases. Special-purpose class methods used to filter and augment query results include regular-expression pattern matching over BLAST HSP alignments and retrieval of partial sequences derived from primary structure annotations. All data sources and methods are accessible through an SQL-like query language or a GUI, so that when new investigations arise no additional programming beyond query specification is required. The power and flexibility of this approach are illustrated by such integrated queries as: (1) 'find homologs in genomic sequence to all novel genes cloned and reported in the scientific literature within the past three months that are linked to the MeSH term "neoplasms"'; (2) 'using a neuropeptide precursor query sequence, return only HSPs where the target genomic sequences conserve the G[KR][KR] motif at the appropriate points in the HSP alignment'; and (3) 'of the human genomic sequences annotated with exon boundaries in GenBank, return only those with valid putative donor/acceptor sites and start/stop codons'.
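The second example query maps directly onto a small piece of code. Below is a hedged sketch of regular-expression filtering over HSP target sequences for the G[KR][KR] motif; the sequences and hit names are illustrative, not TINet output.

```python
import re

# Illustrative HSP target (subject) sequences from a BLAST result; the filter keeps
# only hits that conserve the G[KR][KR] cleavage motif of the neuropeptide precursor.
hsp_subjects = {
    "hit1": "MKTLLVLAVGKRSSPEEAQ",     # contains G-K-R
    "hit2": "MKTLLVLAVGQASSPEEAQ",     # motif lost
    "hit3": "MASNPLGRKTTEEWLKQ",       # contains G-R-K
}

motif = re.compile(r"G[KR][KR]")
conserved = {name: seq for name, seq in hsp_subjects.items() if motif.search(seq)}
print(sorted(conserved))              # -> ['hit1', 'hit3']
```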
