Similar Articles
20 similar articles found.
1.
When users’ tasks in a distributed heterogeneous computing environment (e.g., a cluster of heterogeneous computers) are allocated resources, the total demand placed on some system resources by the tasks, for a given interval of time, may exceed the availability of those resources. In such a case, some tasks may receive degraded service or be dropped from the system. One part of a measure to quantify the success of a resource management system (RMS) in such a distributed environment is the collective value of the tasks completed during an interval of time, as perceived by the user, application, or policy maker. The Flexible Integrated System Capability (FISC) measure presented here quantifies this collective value. The FISC measure is a flexible multi-dimensional measure such that any task attribute can be inserted; attributes may include priorities, versions of a task or data, deadlines, situational mode, security, application- and domain-specific QoS, and task dependencies. For an environment where it is important to investigate how well data communication requests are satisfied, the data communication requests satisfied can be the basis of the FISC measure instead of tasks completed. The motivation behind the FISC measure is to determine the performance of resource management schemes when tasks have multiple attributes that need to be satisfied. The goal of this measure is to compare the results of different resource management heuristics that are trying to achieve the same performance objective but with different approaches.

This research was supported by the DARPA/ITO Quorum Program, by the DARPA/ISO BADD Program and the Office of Naval Research under ONR grant number N00014-97-1-0804, by the DARPA/ITO AICE program under contract numbers DABT63-99-C-0010 and DABT63-99-C-0012, and by the Colorado State University George T. Abell Endowment. Intel and Microsoft donated some of the equipment used in this research.

Jong-Kook Kim is pursuing a Ph.D. degree in the School of Electrical and Computer Engineering at Purdue University (expected in August 2004). Jong-Kook received his M.S. degree in electrical and computer engineering from Purdue University in May 2000. He received his B.S. degree in electronic engineering from Korea University, Seoul, Korea, in 1998. He has presented his work at several international conferences and has been a reviewer for numerous conferences and journals. His research interests include heterogeneous distributed computing, computer architecture, performance measures, resource management, evolutionary heuristics, and power-aware computing. He is a student member of the IEEE, IEEE Computer Society, and ACM.

Debra Hensgen is a member of the Research and Evaluation Team at OpenTV in Mountain View, California. OpenTV produces middleware for set-top boxes in support of interactive television. She received her Ph.D. in the area of Distributed Operating Systems from the University of Kentucky. Prior to moving to private industry, she was an Associate Professor in the systems area, where she worked with students and colleagues to design and develop tools and systems for resource management, network re-routing algorithms and systems that preserve quality-of-service guarantees, and visualization tools for performance debugging of parallel and distributed systems. She has published numerous papers concerning her contributions to the Concurra toolkit for automatically generating safe, efficient concurrent code, the Graze parallel processing performance debugger, the SAAM path information base, and the SmartNet and MSHN Resource Management Systems.

Taylor Kidd is currently a Software Architect for Vidiom Systems in Portland, Oregon. His current work involves writing multi-company industrial specifications and architecting software systems for the digital cable television industry. He has been involved in the establishment of international specifications for digital interactive television in both Europe and the US. Prior to his current position, Dr. Kidd was a researcher for the US Navy as well as an Associate Professor at the Naval Postgraduate School. Dr. Kidd received his Ph.D. in Electrical Engineering in 1991 from the University of California, San Diego.

H. J. Siegel was appointed the George T. Abell Endowed Chair Distinguished Professor of Electrical and Computer Engineering at Colorado State University (CSU) in August 2001, where he is also a Professor of Computer Science. In December 2002, he became the first Director of the CSU Information Science and Technology Center (ISTeC). ISTeC is a university-wide organization for promoting, facilitating, and enhancing CSU’s research, education, and outreach activities pertaining to the design and innovative application of computer, communication, and information systems. From 1976 to 2001, he was a professor at Purdue University. He received two BS degrees from MIT, and the MA, MSE, and PhD degrees from Princeton University. His research interests include parallel and distributed computing, heterogeneous computing, robust computing systems, parallel algorithms, parallel machine interconnection networks, and reconfigurable parallel computer systems. He has co-authored over 300 published papers on parallel and distributed computing and communication, is an IEEE Fellow, is an ACM Fellow, was a Coeditor-in-Chief of the Journal of Parallel and Distributed Computing, and was on the Editorial Boards of both the IEEE Transactions on Parallel and Distributed Systems and the IEEE Transactions on Computers. He was Program Chair/Co-Chair of three major international conferences, General Chair/Co-Chair of four international conferences, and Chair/Co-Chair of five workshops. He has been an international keynote speaker and tutorial lecturer, and has consulted for industry and government.

David St. John is Chief Information Officer for WeatherFlow, Inc., a weather services company specializing in coastal weather observations and forecasts. He received a master’s degree in Engineering from the University of California, Irvine. He spent several years as the head of staff on the Management System for Heterogeneous Networks project in the Computer Science Department of the Naval Postgraduate School. His current relationship with cluster computing is as a user of the Regional Atmospheric Modeling System (RAMS), a numerical weather model developed at Colorado State University. WeatherFlow runs RAMS operationally on a Linux-based cluster.

Cynthia Irvine is a Professor of Computer Science at the Naval Postgraduate School in Monterey, California. She received her Ph.D. from Case Western Reserve University and her B.A. in Physics from Rice University. She joined the faculty of the Naval Postgraduate School in 1994. Previously, she worked in industry on the development of high assurance secure systems. In 2001, Dr. Irvine received the Naval Information Assurance Award. Dr. Irvine is the Director of the Center for Information Systems Security Studies and Research at the Naval Postgraduate School. She has served on special panels for NSF, DARPA, and OSD. In the area of computer security education, Dr. Irvine has most recently served as the general chair of the Third World Conference on Information Security Education and the Fifth Workshop on Education in Computer Security. She co-chaired the NSF workshop on Cyber-security Workforce Needs Assessment and Educational Innovation and was a participant in the Computing Research Association/NSF sponsored Grand Challenges in Information Assurance meeting. She is a member of the editorial board of the Journal of Information Warfare and has served as a reviewer and/or program committee member of a variety of security-related conferences. She has written over 100 papers and articles and has supervised the work of over 80 students. Professor Irvine is a member of the ACM, the AAS, a life member of the ASP, and a Senior Member of the IEEE.

Timothy E. Levin is a Research Associate Professor at the Naval Postgraduate School. He has spent over 18 years working in the design, development, evaluation, and verification of secure computer systems, including operating systems, databases, and networks. His current research interests include high assurance system design and analysis, development of models and methods for the dynamic selection of QoS security attributes, and the application of formal methods to the development of secure computer systems.

Viktor K. Prasanna received his BS in Electronics Engineering from Bangalore University and his MS from the School of Automation, Indian Institute of Science. He obtained his Ph.D. in Computer Science from the Pennsylvania State University in 1983. Currently, he is a Professor in the Department of Electrical Engineering as well as in the Department of Computer Science at the University of Southern California, Los Angeles. He is also an associate member of the Center for Applied Mathematical Sciences (CAMS) at USC. He served as the Division Director for the Computer Engineering Division during 1994–98. His research interests include parallel and distributed systems, embedded systems, configurable architectures, and high performance computing. Dr. Prasanna has published extensively and consulted for industries in the above areas. He has served on the organizing committees of several international meetings in VLSI computations, parallel computation, and high performance computing. He is the Steering Co-chair of the International Parallel and Distributed Processing Symposium [a merger of the IEEE International Parallel Processing Symposium (IPPS) and the Symposium on Parallel and Distributed Processing (SPDP)] and is the Steering Chair of the International Conference on High Performance Computing (HiPC). He serves on the editorial boards of the Journal of Parallel and Distributed Computing and the Proceedings of the IEEE. He is the Editor-in-Chief of the IEEE Transactions on Computers. He was the founding Chair of the IEEE Computer Society Technical Committee on Parallel Processing. He is a Fellow of the IEEE.

Richard F. Freund is the originator of GridIQ’s network scheduling concepts, which arose from mathematical and computing approaches he developed for the Department of Defense in the early 1980s. Dr. Freund has over twenty-five years of experience in computational mathematics, algorithm design, high performance computing, distributed computing, network planning, and heterogeneous scheduling. Since 1989, Dr. Freund has published over 45 journal articles in these fields. He has also been an editor of special editions of IEEE Computer and the Journal of Parallel and Distributed Computing. In addition, he is a founder of the Heterogeneous Computing Workshop, held annually in conjunction with the International Parallel Processing Symposium. Dr. Freund is the recipient of many awards, including the prestigious Department of Defense Meritorious Civilian Service Award in 1984 and the Lauritsen-Bennet Award from the Space and Naval Warfare Systems Command in San Diego, California.
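To make the idea of a collective-value measure concrete, the following is a purely hypothetical Python sketch of how value might be accumulated over completed tasks that carry several attributes (priority, deadline, delivered version). The attribute names, weights, and the weighted form are illustrative assumptions, not the published FISC formula.

```python
from dataclasses import dataclass

@dataclass
class Task:
    priority: float       # user/policy-assigned weight, e.g. 1.0 = routine, 4.0 = critical
    deadline_met: bool    # whether the task finished before its deadline
    version_value: float  # relative worth of the version/data quality delivered, in [0, 1]

def collective_value(completed_tasks):
    """Sum a per-task value over all tasks completed in the evaluation interval.

    Hypothetical form: each completed task contributes its priority, discounted
    by the delivered version quality and by a penalty if its deadline was missed.
    """
    total = 0.0
    for t in completed_tasks:
        deadline_factor = 1.0 if t.deadline_met else 0.5   # assumed late-completion penalty
        total += t.priority * t.version_value * deadline_factor
    return total

tasks = [Task(4.0, True, 1.0), Task(1.0, False, 0.8), Task(2.0, True, 0.6)]
print(collective_value(tasks))  # 5.6 = 4.0 + 0.4 + 1.2
```

Two resource management heuristics pursuing the same objective could then be compared by the collective value each accrues over the same interval, which is the kind of comparison the FISC measure is intended to support.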

2.
Single-cell RNA-sequencing (scRNA-seq) has made it possible to profile gene expression in tissues at high resolution. An important preprocessing step prior to performing downstream analyses is to identify and remove cells with poor or degraded sample quality using quality control (QC) metrics. Two widely used QC metrics to identify a ‘low-quality’ cell are (i) if the cell includes a high proportion of reads that map to mitochondrial DNA (mtDNA) encoded genes and (ii) if a small number of genes are detected. Current best practices use these QC metrics independently with either arbitrary, uniform thresholds (e.g. 5%) or biological context-dependent (e.g. species) thresholds, and fail to jointly model these metrics in a data-driven manner. Current practices are often overly stringent and especially untenable on certain types of tissues, such as archived tumor tissues, or tissues associated with mitochondrial function, such as kidney tissue [1]. We propose a data-driven QC metric (miQC) that jointly models both the proportion of reads mapping to mtDNA genes and the number of detected genes with mixture models in a probabilistic framework to predict the low-quality cells in a given dataset. We demonstrate how our QC metric easily adapts to different types of single-cell datasets to remove low-quality cells while preserving high-quality cells that can be used for downstream analyses. Our software package is available at https://bioconductor.org/packages/miQC.
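miQC itself is an R/Bioconductor package (linked above). As a rough Python sketch of the underlying idea only, the snippet below fits a two-component mixture jointly to the two QC metrics (detected genes and percent mitochondrial reads) and flags cells assigned to the component with the higher mitochondrial mean. The use of scikit-learn's GaussianMixture, the posterior cutoff, and the toy data are assumptions for illustration; the published method's mixture formulation differs.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def flag_low_quality(n_genes, pct_mito, posterior_cutoff=0.75, seed=0):
    """Jointly model (detected genes, % mito reads) with a 2-component mixture
    and flag cells likely belonging to the compromised (high-mito) component."""
    X = np.column_stack([np.log10(n_genes), pct_mito])
    gmm = GaussianMixture(n_components=2, random_state=seed).fit(X)
    compromised = int(np.argmax(gmm.means_[:, 1]))   # component with higher mean mito%
    post = gmm.predict_proba(X)[:, compromised]
    return post > posterior_cutoff                   # True = low-quality cell to remove

# Toy data: 200 intact cells and 50 degraded cells.
rng = np.random.default_rng(1)
n_genes = np.concatenate([rng.normal(3000, 500, 200), rng.normal(800, 200, 50)]).clip(100)
pct_mito = np.concatenate([rng.normal(3, 1, 200), rng.normal(25, 8, 50)]).clip(0, 100)
print(flag_low_quality(n_genes, pct_mito).sum(), "cells flagged")
```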

3.
The dual-luciferase reporter gene system provides sensitive readouts, but it depends on a constitutively expressed internal reference to normalize those readouts, and most internal references are not constitutively expressed under all conditions. We therefore established an effective method for constructing an internal reference plasmid suited to the dual-luciferase reporter system in silkworm (Bombyx mori) cells. First, the hormone-response elements of the BmVgP78 promoter were mutated to obtain BmVgP78M, a constitutive promoter that is stably expressed in silkworm cells. Next, the SV40 promoter and chimeric intron of the pRL-SV40 plasmid were replaced with BmVgP78M, yielding the internal reference plasmid pRL-VgP78M. Finally, cell transfection experiments confirmed that pRL-VgP78M is stably expressed in silkworm cell lines and that its expression is unaffected by ecdysone, juvenile hormone, or hormone-related transcription factors. The resulting plasmid, which is stably and moderately expressed in silkworm cells, can serve as an effective internal reference for dual-luciferase reporter assays in studies of hormone signaling in silkworm cell lines. The construction strategy also provides a reference for building internal reference plasmids for dual-luciferase reporter systems in cell lines of other species.

4.
5.

Background

A fundamental problem for translational genomics is to find optimal therapies based on gene regulatory intervention. Dynamic intervention involves a control policy that optimally reduces a cost function based on phenotype by externally altering the state of the network over time. When a gene regulatory network (GRN) model is fully known, the problem is addressed using classical dynamic programming based on the Markov chain associated with the network. When the network is uncertain, a Bayesian framework can be applied, where policy optimality is with respect to both the dynamical objective and the uncertainty, as characterized by a prior distribution. In the presence of uncertainty, it is of great practical interest to develop an experimental design strategy and thereby select experiments that optimally reduce a measure of uncertainty.

Results

In this paper, we employ mean objective cost of uncertainty (MOCU), which quantifies uncertainty based on the degree to which uncertainty degrades the operational objective, that being the cost owing to undesirable phenotypes. We assume that a number of conditional probabilities characterizing regulatory relationships among genes are unknown in the Markovian GRN. In sum, there is a prior distribution which can be updated to a posterior distribution by observing a regulatory trajectory, and an optimal control policy, known as an “intrinsically Bayesian robust” (IBR) policy. To obtain a better IBR policy, we select an experiment that minimizes the MOCU remaining after applying its output to the network. At this point, we can either stop and find the resulting IBR policy or proceed to determine more unknown conditional probabilities via regulatory observation and find the IBR policy from the resulting posterior distribution. For sequential experimental design this entire process is iterated. Owing to the computational complexity of experimental design, which requires computation of many potential IBR policies, we implement an approximate method utilizing mean first passage times (MFPTs) – but only in experimental design, the final policy being an IBR policy.

Conclusions

Comprehensive performance analysis based on extensive simulations on synthetic and real GRNs demonstrates the efficacy of the proposed method, including the accuracy and computational advantage of the approximate MFPT-based design.
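As a schematic of the MOCU-based experiment selection loop described above, the pseudocode-style Python below iterates over candidate experiments, estimates the expected MOCU remaining after observing each experiment's outcome under the current prior, and returns the best one. The helper callables (posterior update, IBR policy derivation, outcome probabilities, and the cost of applying a policy to a specific network) are placeholders standing in for the paper's model-specific computations, not its actual API.

```python
def mocu(prior, networks, ibr_policy, cost, optimal_policy):
    """Mean objective cost of uncertainty under `prior`: expected excess cost of the
    robust (IBR) policy over the network-specific optimal policy."""
    pi = ibr_policy(prior)
    return sum(
        prior[th] * (cost(pi, networks[th]) - cost(optimal_policy(networks[th]), networks[th]))
        for th in prior
    )

def select_experiment(prior, networks, experiments, outcome_prob, update_posterior,
                      ibr_policy, cost, optimal_policy):
    """Pick the experiment minimizing the expected MOCU remaining after its outcome."""
    best, best_score = None, float("inf")
    for e in experiments:
        expected_remaining = 0.0
        for outcome, p in outcome_prob(e, prior).items():   # marginal outcome probabilities
            post = update_posterior(prior, e, outcome)
            expected_remaining += p * mocu(post, networks, ibr_policy, cost, optimal_policy)
        if expected_remaining < best_score:
            best, best_score = e, expected_remaining
    return best
```

For sequential design, this loop would be repeated: run the chosen experiment, update the prior with the real outcome, and re-select, with the MFPT-based approximation substituting for the exact computations inside the design loop only, the final policy still being an IBR policy.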

6.
HiperLAN/2 (HIgh PErformance Radio Local Area Network) is a new standard from ETSI (European Telecommunications Standards Institute) for high-speed wireless LANs, interconnecting portable devices to each other and to broadband core networks, based on different networking technologies such as IP, ATM, IEEE 1394, and others. This paper introduces the basic features of the HiperLAN/2 MAC protocol. It presents performance evaluation results, specifically related to the mechanisms provided by HiperLAN/2 to manage the request and granting of bandwidth resources. These results are assessed in terms of their flexibility and efficiency in supporting delay-sensitive traffic, such as voice and Web data traffic, which are expected to be transported by broadband wireless LANs.

7.
Aim Adaptive trait continua are axes of covariation observed in multivariate trait data for a given taxonomic group. These continua quantify and summarize life‐history variation at the inter‐specific level in multi‐specific assemblages. Here we examine whether trait continua can provide a useful framework to link life‐history variation with demographic and evolutionary processes in species richness gradients. Taking an altitudinal species richness gradient for Mediterranean butterflies as a study case, we examined a suite of traits (larval diet breadth, adult phenology, dispersal capacity and wing length) and species‐specific habitat measures (temperature and aridity breadth). We tested whether traits and species‐specific habitat measures tend to co‐vary, whether they are phylogenetically conserved, and whether they are able to explain species distributions and spatial genetic variation in a large number of butterfly assemblages. Location Catalonia, Spain. Methods We formulated predictions associated with species richness gradients and adaptive trait continua. We applied principal components analyses (PCAs), structural equation modelling and phylogenetic generalized least squares models. Results We found that traits and species‐specific habitat measures covaried along a main PCA axis, ranging from multivoltine trophic generalists with high dispersal capacity to univoltine (i.e. one generation per year), trophic specialist species with low dispersal capacity. This trait continuum was closely associated with the observed distributions along the altitudinal gradient and predicted inter‐specific differences in patterns of spatial genetic variability (FST and genetic distances), population responses to the impacts of global change and local turnover dynamics. Main conclusions The adaptive trait continuum of Mediterranean butterflies provides an integrative and mechanistic framework to: (1) analyse geographical gradients in species richness, (2) explain inter‐specific differences in population abundances, spatial distributions and demographic trends, (3) explain inter‐specific differences in patterns of genetic variation (FST and genetic distances), and (4) study specialist–generalist life‐history transitions frequently involved in butterfly diversification processes.

8.
Translation is the final stage of gene expression where messenger RNA is used as a template for protein polymerization from appropriate amino acids. Release of the completed protein requires a release factor protein acting at the termination/stop codon to liberate it. In this paper we focus on a complex feedback control mechanism involved in the translation and synthesis of release factor proteins, which has been observed in different systems. These release factor proteins are involved in the termination stage of their own translation. Further, mutations in the release factor gene can result in a premature stop codon. In this case translation can result either in early termination and the production of a truncated protein or in readthrough of the premature stop codon and production of the complete release factor protein. Thus, during translation of the release factor mRNA containing a premature stop codon, the full-length protein negatively regulates its production by its action on the premature stop codon, while positively regulating its production by its action on the regular stop codon. This paper develops a mathematical modelling framework to investigate this complex feedback control system involved in translation. A series of models is established to carefully investigate the role of individual mechanisms and how they work together. The steady state and dynamic behaviour of the resulting models are examined both analytically and numerically.
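The paper's own equations are not reproduced in the abstract; the snippet below is a minimal, hypothetical ODE sketch of the feedback it describes: the full-length release factor R both terminates translation at the premature stop codon (suppressing its own full-length synthesis) and is required at the regular stop codon (enabling release of the full-length protein). Parameter names and values are illustrative assumptions only.

```python
from scipy.integrate import solve_ivp

# Illustrative parameters (not from the paper).
k_syn, delta = 10.0, 0.1    # maximal synthesis rate, degradation rate
K_ptc, K_stop = 5.0, 1.0    # half-saturation constants for the two stop codons

def dR_dt(t, y):
    R = y[0]
    readthrough = 1.0 / (1.0 + R / K_ptc)   # negative feedback: R terminates at the premature stop
    release = R / (K_stop + R)              # positive feedback: R is needed at the regular stop
    return [k_syn * readthrough * release - delta * R]

sol = solve_ivp(dR_dt, (0.0, 200.0), [1.0])
print("approximate steady-state release factor level:", round(sol.y[0, -1], 2))
```

Varying the premature-stop constant K_ptc in such a toy model mimics mutations that change termination efficiency, which is the kind of mechanism-by-mechanism comparison the series of models in the paper is set up to explore.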

9.
10.
Glutaraldehyde (GLUT) was evaluated for control of single and dual species biofilms of Bacillus cereus and Pseudomonas fluorescens on stainless steel surfaces using a chemostat system. The biofilms were characterized in terms of mass, cell density, total and matrix proteins and polysaccharides. The control action of GLUT was assessed in terms of inactivation and removal of biofilm. Post-biocide action was characterized 3, 7, 12, 24, 48 and 72 h after treatment. Tests with planktonic cells were also performed for comparison. The results demonstrated that in dual species biofilms the metabolic activity, cell density and the content of matrix proteins were higher than those of either single species. Planktonic B. cereus was more susceptible to GLUT than P. fluorescens. The biocide susceptibility of dual species planktonic cultures was an average of each single species. Planktonic cells were more susceptible to GLUT than their biofilm counterparts. Biofilm inactivation was similar for both of the single biofilms while dual biofilms were more resistant than single species biofilms. GLUT at 200 mg l−1 caused low biofilm removal (<10%). Analysis of the post-biocide treatment data revealed the ability of biofilms to recover their activity over time. However, 12 h after biocide application, sloughing events were detected for both single and dual species biofilms, but were more marked for those formed by P. fluorescens (removal >40% of the total biofilm). The overall results suggest that GLUT exerts significant antimicrobial activity against planktonic bacteria and a partial and reversible activity against B. cereus and P. fluorescens single and dual species biofilms. The biocide had low antifouling effects when analysed immediately after treatment. However, GLUT had significant long-term effects on biofilm removal, inducing significant sloughing events (recovery in terms of mass 72 h after treatment for single biofilms and 42 h later for dual biofilms). In general, dual species biofilms demonstrated higher resistance and resilience to GLUT exposure than either of the single species biofilms. P. fluorescens biofilms were more susceptible to the biocide than B. cereus biofilms.

11.
12.
Proteomics is a rapidly expanding field encompassing a multitude of complex techniques and data types. To date, much effort has been devoted to achieving the highest possible coverage of proteomes with the aim of informing future developments in basic biology as well as in clinical settings. As a result, growing amounts of data have been deposited in publicly available proteomics databases. These data are in turn increasingly reused for orthogonal downstream purposes such as data mining and machine learning. These downstream uses, however, need ways to validate a posteriori whether a particular data set is suitable for the envisioned purpose. Furthermore, the (semi-)automatic curation of repository data is dependent on analyses that can highlight misannotation and edge conditions for data sets. Such curation is an important prerequisite for efficient proteomics data reuse in the life sciences in general. We therefore present here a selection of quality control metrics and approaches for the a posteriori detection of potential issues encountered in typical proteomics data sets. We illustrate our metrics by relying on publicly available data from the Proteomics Identifications Database (PRIDE), and simultaneously show the usefulness of the large body of PRIDE data as a means to derive empirical background distributions for relevant metrics.
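The abstract does not list the specific metrics. As a simple, hypothetical example of the kind of a posteriori check meant here, the sketch below compares the precursor mass-error distribution of one data set against an empirical background distribution pooled from many public data sets, flagging the data set when the two distributions diverge. The column semantics, the Kolmogorov-Smirnov test, and the cutoff are assumptions, not metrics taken from the paper.

```python
import numpy as np
from scipy.stats import ks_2samp

def mass_error_qc(dataset_ppm_errors, background_ppm_errors, alpha=0.01):
    """Flag a data set whose precursor mass-error distribution departs from an
    empirical background built from previously curated repository data."""
    res = ks_2samp(dataset_ppm_errors, background_ppm_errors)
    return {"ks_statistic": res.statistic, "p_value": res.pvalue, "flagged": res.pvalue < alpha}

# Toy illustration: a well-calibrated background vs. a data set with a +3 ppm shift.
rng = np.random.default_rng(0)
background = rng.normal(0.0, 1.5, 50_000)   # ppm errors pooled from many public data sets
suspect = rng.normal(3.0, 1.5, 2_000)       # data set with a systematic calibration offset
print(mass_error_qc(suspect, background))
```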

13.
Economic framework for decision making in biological control
Economic analyses are a valuable input into the decision-making process for biological control programs. The challenge, though, is how to incorporate qualitative risk assessments of biological control programs, or the risk of nontargeted effects, into mathematical economic models. A technique known as threshold cost/benefit analysis is presented, and an example of how to apply this method is illustrated using the yellow starthistle biological control program. The results show that incorporating uncertainty into the analysis can have a significant impact on the decision to undertake a biological control program.
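As a worked numerical illustration of threshold cost/benefit analysis in this setting (all figures are hypothetical and not taken from the yellow starthistle program), the sketch below finds the minimum probability of program success at which expected discounted benefits still cover costs; a decision-maker then only has to judge whether the true success probability plausibly exceeds that threshold.

```python
def present_value(annual_amount, rate, years):
    """Discounted present value of a constant annual amount over `years` years."""
    return sum(annual_amount / (1 + rate) ** t for t in range(1, years + 1))

# Hypothetical figures for a weed biological control program.
program_cost = 1_200_000    # up-front research, rearing, and release cost
annual_benefit = 250_000    # avoided forage and control losses per year if successful
pv_benefits = present_value(annual_benefit, rate=0.05, years=20)

threshold_probability = program_cost / pv_benefits
print(f"PV of benefits: ${pv_benefits:,.0f}")
print(f"Success probability needed to break even: {threshold_probability:.1%}")
```

Uncertainty about nontargeted effects can be folded in by subtracting their expected cost from the benefit stream, which raises the break-even probability and may reverse the decision.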

14.
How to design an efficient large-area survey continues to be an interesting question for ecologists. In sampling large areas, as is common in environmental studies, adaptive sampling can be efficient because it ensures survey effort is targeted to subareas of high interest. In two-stage sampling, higher density primary sample units are usually of more interest than lower density primary units when populations are rare and clustered. Two-stage sequential sampling has been suggested as a method for allocating second stage sample effort among primary units. Here, we suggest a modification: adaptive two-stage sequential sampling. In this method, the adaptive part of the allocation process means the design is more flexible in how much extra effort can be directed to higher-abundance primary units. We discuss how best to design an adaptive two-stage sequential sample.
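A minimal sketch of the allocation idea, under assumptions of my own (a fixed second-stage budget split in proportion to first-stage counts that reach a trigger value); the paper's actual allocation rule and estimators are not given in the abstract.

```python
def allocate_second_stage(first_stage_counts, total_second_stage_units, trigger=1):
    """Divide a fixed second-stage budget among primary units, directing extra effort
    to primary units whose first-stage counts reach the trigger value."""
    weights = [c if c >= trigger else 0 for c in first_stage_counts]
    total_weight = sum(weights)
    if total_weight == 0:                       # nothing detected: spread effort evenly
        n = len(first_stage_counts)
        return [total_second_stage_units // n] * n
    # Rounding may leave the total slightly off budget; a real design would adjust for this.
    return [round(total_second_stage_units * w / total_weight) for w in weights]

# Ten primary units surveyed at stage one; the population is rare and clustered.
counts = [0, 0, 3, 0, 12, 1, 0, 0, 5, 0]
print(allocate_second_stage(counts, total_second_stage_units=60))
```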

15.
When combining adaptive designs with control of the False Discovery Rate, one has to keep in mind that the most frequently used procedure for controlling the False Discovery Rate, the explorative Simes procedure, is a stepwise multiple testing procedure. At the interim analysis of an adaptive design, however, it is not yet known what the boundaries for rejection of hypotheses in the final analysis will be, as these boundaries depend on the size of the final p-values. Therefore, classical adaptive designs with a predefined stopping criterion for early rejection of hypotheses are not well suited. We propose a generalized definition of a global p-value for a two-stage adaptive design permitting a flexible decision for stopping at the interim analysis. By means of a simulation study in the field of genetic epidemiology, we illustrate how applying such a two-stage design can reduce costs.
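The paper's generalized global p-value is not defined in the abstract. The sketch below shows one standard way stage-wise p-values are often combined in two-stage designs (inverse-normal combination with prespecified weights) before applying the Benjamini-Hochberg step-up procedure to the combined values; treat the combination rule as an assumption rather than the authors' definition.

```python
import numpy as np
from scipy.stats import norm

def combined_p(p_stage1, p_stage2, w1=0.5):
    """Inverse-normal combination of two stage-wise p-values (weights w1 and 1 - w1)."""
    z = np.sqrt(w1) * norm.isf(p_stage1) + np.sqrt(1 - w1) * norm.isf(p_stage2)
    return norm.sf(z)

def benjamini_hochberg(pvals, q=0.05):
    """Step-up FDR control: reject the k smallest p-values, where k is the largest
    index with p_(k) <= k * q / m."""
    p = np.asarray(pvals)
    order = np.argsort(p)
    m = len(p)
    below = p[order] <= (np.arange(1, m + 1) * q) / m
    k = np.max(np.nonzero(below)[0]) + 1 if below.any() else 0
    rejected = np.zeros(m, dtype=bool)
    rejected[order[:k]] = True
    return rejected

p1 = np.array([0.001, 0.03, 0.20, 0.45])   # interim p-values for four markers
p2 = np.array([0.004, 0.01, 0.60, 0.30])   # second-stage p-values
p_comb = combined_p(p1, p2)
print(p_comb.round(4), benjamini_hochberg(p_comb))
```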

16.
17.
Correct subcellular localization is of paramount importance for proteins to exert their biological functions. Prediction of protein subcellular localization by computational methods is required in the post-genomic era. Recent studies have been focusing on predicting not only single-location proteins but also multi-location proteins. However, most of the existing predictors are far from effective for tackling the challenges of multi-label proteins. This article proposes an efficient multi-label predictor, namely mPLR-Loc, based on penalized logistic regression and adaptive decisions for predicting both single- and multi-location proteins. Specifically, for each query protein, mPLR-Loc exploits the information from the Gene Ontology (GO) database by using its accession number (AC) or the ACs of its homologs obtained via BLAST. The frequencies of GO occurrences are used to construct feature vectors, which are then classified by an adaptive decision-based multi-label penalized logistic regression classifier. Experimental results based on two recent stringent benchmark datasets (virus and plant) show that mPLR-Loc remarkably outperforms existing state-of-the-art multi-label predictors. In addition to being able to rapidly and accurately predict subcellular localization of single- and multi-label proteins, mPLR-Loc can also provide probabilistic confidence scores for the prediction decisions. For readers’ convenience, the mPLR-Loc server is available online (http://bioinfo.eie.polyu.edu.hk/mPLRLocServer).
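As a rough Python sketch of the pipeline described (GO-occurrence feature vectors, one-vs-rest penalized logistic regression, and an adaptive decision threshold tied to the top score), with the toy GO terms, compartment labels, and all parameter choices being assumptions rather than the published mPLR-Loc settings:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier
from sklearn.preprocessing import MultiLabelBinarizer

# Toy training data: each protein is represented by the GO terms retrieved for it
# (or for its BLAST homologs); labels are one or more subcellular compartments.
go_terms = ["GO:0005634 GO:0003677", "GO:0005737 GO:0005524",
            "GO:0005634 GO:0005737", "GO:0009507 GO:0016020"]
labels = [["nucleus"], ["cytoplasm"], ["nucleus", "cytoplasm"], ["chloroplast"]]

vec = CountVectorizer(token_pattern=r"\S+")   # GO-term occurrence counts as features
X = vec.fit_transform(go_terms)
mlb = MultiLabelBinarizer()
Y = mlb.fit_transform(labels)

clf = OneVsRestClassifier(LogisticRegression(penalty="l2", C=1.0, max_iter=1000)).fit(X, Y)

def predict_locations(go_string, ratio=0.5):
    """Adaptive decision: keep every compartment whose score is at least `ratio`
    times the top score, so multi-location proteins can receive several labels."""
    scores = clf.predict_proba(vec.transform([go_string]))[0]
    keep = scores >= ratio * scores.max()
    return [loc for loc, k in zip(mlb.classes_, keep) if k]

print(predict_locations("GO:0005634 GO:0005737"))
```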

18.
This paper highlights an innovative application of inorganic-binding peptides as quality control tools for detecting defects on inorganic surfaces of any shape. The approach involves attaching a fluorescent label to an inorganic-binding peptide and exploiting the peptide's high binding specificity to detect, by simple fluorescence microscopy, chemical composition defects of µm size and crystallographic state defects. Proof of concept was demonstrated by monitoring binding of a previously isolated ZnO-binding peptide to galvanized steel substrates. The approach was further validated for TiO2 coatings and stainless steel, with two new, specific inorganic-binding peptides isolated by phage display.

19.
Plants shaded by neighbors or overhead foliage experience both a reduction in the ratio of red to far red light (R:FR), a specific cue perceived by phytochrome, and reduced photosynthetically active radiation (PAR), an essential resource. We tested the adaptive value of plasticity to crowding and to the cue and resource components of foliage shade in the annual plant Arabidopsis thaliana by exposing 36 inbred families from four natural populations to four experimental treatments: (1) high density, full sun; (2) low density, full sun; (3) low density, neutral shade; and (4) low density, low R:FR-simulated foliage shade. Genotypic selection analysis within each treatment revealed strong environmental differences in selection on plastic life-history traits. We used specific contrasts to measure plasticity to density and foliage shade, to partition responses to foliage shade into phytochrome-mediated responses to the R:FR cue and responses to PAR, and to test whether plasticity was adaptive (i.e., in the same direction as selection in each environment). Contrary to expectation, we found no evidence for adaptive plasticity to density. However, we observed both adaptive and maladaptive responses to foliage shade. In general, phytochrome-mediated plasticity to the R:FR cue of foliage shade was adaptive and counteracted maladaptive growth responses to reduced PAR. These results support the prediction that active developmental responses to environmental cues are more likely to be adaptive than are passive resource-mediated responses. Multiple regression analysis detected a few costs of adaptive plasticity and adaptive homeostasis, but such costs were infrequent and their expression depended on the environment. Thus, costs of plasticity may occasionally constrain the evolution of adaptive responses to foliage shade in Arabidopsis, but this constraint may differ among environments and is far from ubiquitous.

20.
Robotic biomechanics is a powerful tool for further developing our understanding of biological joints, tissues and their repair. Both velocity-based and hybrid force control methods have been applied to biomechanics, but the complex and non-linear properties of joints have limited these to slow or stepwise loading, which may not capture the real-time behaviour of joints. This paper presents a novel force control scheme combining stiffness- and velocity-based methods, aimed at achieving six-degree-of-freedom unconstrained force control at physiological loading rates.
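The control law itself is not given in the abstract; below is a generic, hypothetical sketch of a combined stiffness/velocity force-control update of the kind described, in which the commanded velocity along each degree of freedom converts the force error into a displacement increment via an estimate of the specimen's local stiffness and is then rate-limited. Gains, limits, and the stiffness estimate are illustrative assumptions, not the paper's scheme.

```python
import numpy as np

def force_control_step(f_desired, f_measured, stiffness_est, kp=1.0, v_max=5.0, dt=0.01):
    """One control cycle: convert the force/torque error into a displacement increment
    using the estimated joint stiffness, then rate-limit it so loading stays within a
    chosen velocity bound. All quantities are 6-vectors (three forces, three torques)."""
    force_error = np.asarray(f_desired) - np.asarray(f_measured)
    dx = kp * force_error / np.maximum(stiffness_est, 1e-6)   # stiffness-based correction
    v_cmd = np.clip(dx / dt, -v_max, v_max)                   # velocity command to the robot
    return v_cmd

f_des = np.array([0, 0, 100.0, 0, 0, 0])           # e.g. 100 N axial compression target
f_meas = np.array([0, 0, 62.0, 0, 0, 0])
k_est = np.array([200.0, 200, 300, 50, 50, 50])    # local stiffness estimate, updated online in practice
print(force_control_step(f_des, f_meas, k_est))
```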
