991.
When users’ tasks in a distributed heterogeneous computing environment (e.g., a cluster of heterogeneous computers) are allocated resources, the total demand placed on some system resources by the tasks, for a given interval of time, may exceed the availability of those resources. In such cases, some tasks may receive degraded service or be dropped from the system. One component of a measure quantifying the success of a resource management system (RMS) in such a distributed environment is the collective value of the tasks completed during an interval of time, as perceived by the user, application, or policy maker. The Flexible Integrated System Capability (FISC) measure presented here quantifies this collective value. FISC is a flexible, multi-dimensional measure into which any task attribute can be incorporated, including priorities, versions of a task or data, deadlines, situational mode, security, application- and domain-specific QoS, and task dependencies. For environments where it is important to investigate how well data communication requests are satisfied, the data communication requests satisfied can serve as the basis of the FISC measure instead of tasks completed. The motivation behind the FISC measure is to determine the performance of resource management schemes when tasks have multiple attributes that need to be satisfied. The goal of this measure is to compare the results of different resource management heuristics that pursue the same performance objective with different approaches. This research was supported by the DARPA/ITO Quorum Program, by the DARPA/ISO BADD Program and the Office of Naval Research under ONR grant number N00014-97-1-0804, by the DARPA/ITO AICE program under contract numbers DABT63-99-C-0010 and DABT63-99-C-0012, and by the Colorado State University George T. Abell Endowment. Intel and Microsoft donated some of the equipment used in this research. Jong-Kook Kim is pursuing a Ph.D.
degree from the School of Electrical and Computer Engineering at Purdue University (expected in August 2004). Jong-Kook received his M.S. degree in electrical and computer engineering from Purdue University in May 2000. He received his B.S. degree in electronic engineering from Korea University, Seoul, Korea, in 1998. He has presented his work at several international conferences and has been a reviewer for numerous conferences and journals. His research interests include heterogeneous distributed computing, computer architecture, performance measures, resource management, evolutionary heuristics, and power-aware computing. He is a student member of the IEEE, the IEEE Computer Society, and the ACM. Debra Hensgen is a member of the Research and Evaluation Team at OpenTV in Mountain View, California. OpenTV produces middleware for set-top boxes in support of interactive television. She received her Ph.D. in the area of distributed operating systems from the University of Kentucky. Before moving to private industry, as an Associate Professor in the systems area she worked with students and colleagues to design and develop tools and systems for resource management, network re-routing algorithms and systems that preserve quality-of-service guarantees, and visualization tools for performance debugging of parallel and distributed systems. She has published numerous papers concerning her contributions to the Concurra toolkit for automatically generating safe, efficient concurrent code, the Graze parallel processing performance debugger, the SAAM path information base, and the SmartNet and MSHN Resource Management Systems. Taylor Kidd is currently a Software Architect for Vidiom Systems in Portland, Oregon. His current work involves writing multi-company industrial specifications and architecting software systems for the digital cable television industry.
He has been involved in establishing international specifications for digital interactive television in both Europe and the US. Before his current position, Dr. Kidd was a researcher for the US Navy and an Associate Professor at the Naval Postgraduate School. Dr. Kidd received his Ph.D. in Electrical Engineering in 1991 from the University of California, San Diego. H. J. Siegel was appointed the George T. Abell Endowed Chair Distinguished Professor of Electrical and Computer Engineering at Colorado State University (CSU) in August 2001, where he is also a Professor of Computer Science. In December 2002, he became the first Director of the CSU Information Science and Technology Center (ISTeC), a university-wide organization for promoting, facilitating, and enhancing CSU’s research, education, and outreach activities pertaining to the design and innovative application of computer, communication, and information systems. From 1976 to 2001, he was a professor at Purdue University. He received two B.S. degrees from MIT, and the M.A., M.S.E., and Ph.D. degrees from Princeton University. His research interests include parallel and distributed computing, heterogeneous computing, robust computing systems, parallel algorithms, parallel machine interconnection networks, and reconfigurable parallel computer systems. He has co-authored over 300 published papers on parallel and distributed computing and communication, is an IEEE Fellow and an ACM Fellow, was a Co-editor-in-Chief of the Journal of Parallel and Distributed Computing, and was on the editorial boards of both the IEEE Transactions on Parallel and Distributed Systems and the IEEE Transactions on Computers. He was Program Chair or Co-Chair of three major international conferences, General Chair or Co-Chair of four international conferences, and Chair or Co-Chair of five workshops. He has been an international keynote speaker and tutorial lecturer, and has consulted for industry and government. David St.
John is Chief Information Officer for WeatherFlow, Inc., a weather services company specializing in coastal weather observations and forecasts. He received a master’s degree in Engineering from the University of California, Irvine. He spent several years as head of staff on the Management System for Heterogeneous Networks project in the Computer Science Department of the Naval Postgraduate School. His current relationship with cluster computing is as a user of the Regional Atmospheric Modeling System (RAMS), a numerical weather model developed at Colorado State University; WeatherFlow runs RAMS operationally on a Linux-based cluster. Cynthia Irvine is a Professor of Computer Science at the Naval Postgraduate School in Monterey, California. She received her Ph.D. from Case Western Reserve University and her B.A. in Physics from Rice University. She joined the faculty of the Naval Postgraduate School in 1994. Previously, she worked in industry on the development of high-assurance secure systems. In 2001, Dr. Irvine received the Naval Information Assurance Award. Dr. Irvine is the Director of the Center for Information Systems Security Studies and Research at the Naval Postgraduate School. She has served on special panels for the NSF, DARPA, and OSD. In the area of computer security education, Dr. Irvine has most recently served as general chair of the Third World Conference on Information Security Education and the Fifth Workshop on Education in Computer Security. She co-chaired the NSF workshop on Cyber-security Workforce Needs Assessment and Educational Innovation and participated in the Computing Research Association/NSF-sponsored Grand Challenges in Information Assurance meeting. She is a member of the editorial board of the Journal of Information Warfare and has served as a reviewer and/or program committee member for a variety of security-related conferences. She has written over 100 papers and articles and has supervised the work of over 80 students.
Professor Irvine is a member of the ACM, the AAS, a life member of the ASP, and a Senior Member of the IEEE. Timothy E. Levin is a Research Associate Professor at the Naval Postgraduate School. He has spent over 18 years working in the design, development, evaluation, and verification of secure computer systems, including operating systems, databases, and networks. His current research interests include high-assurance system design and analysis, development of models and methods for the dynamic selection of QoS security attributes, and the application of formal methods to the development of secure computer systems. Viktor K. Prasanna received his B.S. in Electronics Engineering from Bangalore University and his M.S. from the School of Automation, Indian Institute of Science. He obtained his Ph.D. in Computer Science from the Pennsylvania State University in 1983. Currently, he is a Professor in the Department of Electrical Engineering as well as in the Department of Computer Science at the University of Southern California, Los Angeles. He is also an associate member of the Center for Applied Mathematical Sciences (CAMS) at USC. He served as the Division Director for the Computer Engineering Division during 1994–98. His research interests include parallel and distributed systems, embedded systems, configurable architectures, and high performance computing. Dr. Prasanna has published extensively and consulted for industry in the above areas. He has served on the organizing committees of several international meetings on VLSI computations, parallel computation, and high performance computing. He is the Steering Co-chair of the International Parallel and Distributed Processing Symposium [the merger of the IEEE International Parallel Processing Symposium (IPPS) and the Symposium on Parallel and Distributed Processing (SPDP)] and the Steering Chair of the International Conference on High Performance Computing (HiPC).
He serves on the editorial boards of the Journal of Parallel and Distributed Computing and the Proceedings of the IEEE. He is the Editor-in-Chief of the IEEE Transactions on Computers. He was the founding Chair of the IEEE Computer Society Technical Committee on Parallel Processing. He is a Fellow of the IEEE. Richard F. Freund is the originator of GridIQ’s network scheduling concepts, which arose from mathematical and computing approaches he developed for the Department of Defense in the early 1980s. Dr. Freund has over twenty-five years’ experience in computational mathematics, algorithm design, high performance computing, distributed computing, network planning, and heterogeneous scheduling. Since 1989, he has published over 45 journal articles in these fields. He has also been an editor of special editions of IEEE Computer and the Journal of Parallel and Distributed Computing. In addition, he is a founder of the Heterogeneous Computing Workshop, held annually in conjunction with the International Parallel Processing Symposium. Dr. Freund is the recipient of many awards, including the prestigious Department of Defense Meritorious Civilian Service Award in 1984 and the Lauritsen-Bennet Award from the Space and Naval Warfare Systems Command in San Diego, California.
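The abstract above describes FISC as a flexible, multi-attribute measure of the collective value of completed tasks. As a rough illustration only, the sketch below combines per-task attributes (priority, deadline, QoS satisfaction) into a normalized collective value; the attribute names, weighting scheme, and normalization are hypothetical inventions, not the paper's actual formulation.

```python
# Hypothetical sketch of a FISC-style collective-value measure.
# The attributes and weights below are illustrative, not the paper's.

def task_value(completed, priority, deadline_met, qos_fraction):
    """Value contributed by one task, as perceived by a policy maker."""
    if not completed:
        return 0.0
    value = priority                       # base worth of the task
    value *= 1.0 if deadline_met else 0.5  # late results are worth less
    value *= qos_fraction                  # partial QoS satisfaction scales value
    return value

def fisc_like_measure(tasks):
    """Collective value of all tasks attempted during the interval,
    normalized by the best value achievable if every task had been
    completed on time with full QoS."""
    achieved = sum(task_value(**t) for t in tasks)
    best = sum(t["priority"] for t in tasks)
    return achieved / best if best else 0.0

tasks = [
    {"completed": True,  "priority": 3.0, "deadline_met": True,  "qos_fraction": 1.0},
    {"completed": True,  "priority": 1.0, "deadline_met": False, "qos_fraction": 0.8},
    {"completed": False, "priority": 2.0, "deadline_met": False, "qos_fraction": 0.0},
]
print(fisc_like_measure(tasks))  # (3.0 + 0.4 + 0.0) / 6.0 ≈ 0.567
```

A normalized score like this lets two resource management heuristics pursuing the same objective be compared on the same scale, which is the stated goal of the measure.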
992.
The increasing availability of single-cell RNA-sequencing (scRNA-seq) data from various developmental systems provides the opportunity to infer gene regulatory networks (GRNs) directly from data. Herein we describe IQCELL, a platform to infer, simulate, and study executable logical GRNs directly from scRNA-seq data. Such executable GRNs allow simulation of fundamental hypotheses governing developmental programs and help accelerate the design of strategies to control stem cell fate. We first describe the architecture of IQCELL. Next, we apply IQCELL to scRNA-seq datasets from early mouse T-cell and red blood cell development, and show that the platform can infer over 74% of the causal gene interactions previously reported from decades of research. We also show that dynamic simulations of the generated GRNs qualitatively recapitulate the effects of known gene perturbations. Finally, we implement an IQCELL gene selection pipeline that allows us to identify candidate genes without prior knowledge. We demonstrate that GRN simulations based on the inferred gene set yield results similar to those based on the original curated lists. In summary, the IQCELL platform offers a versatile tool to infer, simulate, and study executable GRNs in dynamic biological systems.
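An "executable" logical GRN of the kind described above is essentially a Boolean network: each gene is on or off, and logical update rules determine the next state. The toy sketch below shows what simulating such a network and a knockout perturbation looks like; the three genes and their rules are invented for illustration and are not taken from IQCELL's inferred networks.

```python
# Toy Boolean-network simulation illustrating an "executable" logical GRN.
# The genes and update rules are invented examples, not IQCELL output.

def step(state):
    """Synchronously apply each gene's logical update rule once."""
    return {
        "Gata1": state["Gata1"] and not state["Pu1"],  # mutual antagonism
        "Pu1":   state["Pu1"] and not state["Gata1"],
        "Klf1":  state["Gata1"],                       # downstream target
    }

def simulate(state, n_steps):
    """Return the trajectory of states over n_steps synchronous updates."""
    trajectory = [state]
    for _ in range(n_steps):
        state = step(state)
        trajectory.append(state)
    return trajectory

start = {"Gata1": True, "Pu1": False, "Klf1": False}
final = simulate(start, 5)[-1]
print(final)  # Gata1 stays on and activates Klf1; Pu1 stays off
```

A gene perturbation can be simulated by clamping a gene's value to False inside `step`, and the resulting trajectory compared against the unperturbed one, which is the kind of qualitative perturbation test the abstract describes.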
993.
Incongruence between phylogenetic estimates based on nuclear and chloroplast DNA (cpDNA) markers was used to infer that there have been at least two instances of chloroplast transfer, presumably through wide hybridization, in subtribe Helianthinae. One instance involved Simsia dombeyana, which exhibited a cpDNA restriction site phenotype that was markedly divergent from all of the other species of the genus that were surveyed but that matched the restriction site pattern previously reported for South American species of Viguiera. In contrast, analysis of sequence data from the nuclear ribosomal DNA internal transcribed spacer (ITS) region showed Simsia to be entirely monophyletic and placed samples of S. dombeyana as the sister group to the relatively derived S. foetida, a result concordant with morphological information. A sample of a South American species of Viguiera was placed by ITS sequence data as the sister group to a member of V. subg. Amphilepis, which was consistent with cpDNA restriction site data. Samples of Tithonia formed a single monophyletic clade based on ITS sequence data, whereas they were split between two divergent clades based on cpDNA restriction site analysis. The results suggested that cpDNA transfer has occurred between taxa diverged to the level of morphologically distinct genera, and highlight the need for careful and complete assessment of molecular data as a source of phylogenetic information.
994.
Experiments performed on Cu(II), Pb(II), and Zn(II) binding by saltbush biomass (Atriplex canescens) showed that metal binding increased as pH increased from 2.0 to 5.0. The highest amounts of Cu, Pb, and Zn bound by the native biomass varied from 48-89%, 89-94%, and 65-73%, respectively. The hydrolyzed biomass bound a similar amount of Pb and 50% more Cu and Zn than the native biomass. The esterified biomass had a lower binding capacity than the native biomass; however, esterified flowers bound 45% more Cu at pH 2.0 than native flowers. The optimum binding time was 10 min or less. More than 60% of the bound Cu was recovered using 0.1 mM HCl, while more than 90% of the Pb was recovered with either HCl or sodium citrate at 0.1 mM. For Zn, 0.1 mM sodium citrate allowed the recovery of 75%. The results indicate that carboxyl groups participate in Cu, Pb, and Zn binding.
995.
996.
Non-ionic detergents used for the solubilization and purification of acetylcholine receptor from Torpedo californica electroplax may remain tightly bound to this protein. The presence of detergent greatly hinders spectrophotometric and hydrodynamic studies of the receptor protein. β-D-Octylglucopyranoside, however, is found to be effective in solubilizing the receptor from electroplax membranes with minimal interference in the characterization of the protein. The acetylcholine receptor purified from either octylglucopyranoside- or Triton X-100-solubilized extracts exhibits identical amino acid compositions, α-bungarotoxin and (+)-tubocurarine binding parameters, and subunit distributions in SDS-polyacrylamide gels. The use of octylglucopyranoside allows for the assignment of a molar absorptivity for the purified receptor at 280 nm of approx. 530 000 M−1 · cm−1. Additionally, successful reconstitution of octylglucopyranoside-extracted acetylcholine receptor into functional membrane vesicles has recently been achieved (Gonzales-Ros, J.M., Paraschos, A. and Martinez-Carrion, M. (1980) Proc. Natl. Acad. Sci. U.S.A. 77, 1796–1799). Removal of octylglucopyranoside by dialysis does not alter the specific toxin and antagonist binding ability of the receptor or its solubility at low protein concentrations. Sedimentation profiles of the purified acetylcholine receptor in sucrose density gradients reveal several components. Sedimentation coefficients obtained for the slowest sedimenting species agree with previously reported molecular weight values. Additionally, the different sedimenting forms exhibit distinctive behavior in isoelectric focusing gels. Our results suggest that both the concentration and type of detergent greatly influence the physicochemical behavior of the receptor protein.
997.
998.
The double-stranded RNA sensor kinase PKR is one of four integrated stress response (ISR) sensor kinases that phosphorylate the α subunit of eukaryotic initiation factor 2 (eIF2α) in response to stress. The current model of PKR activation considers the formation of back-to-back PKR dimers as a prerequisite for signal propagation. Here we show that PKR signaling involves the assembly of dynamic PKR clusters. PKR clustering is driven by ligand binding to PKR’s sensor domain and by front-to-front interfaces between PKR’s kinase domains. PKR clusters are discrete, heterogeneous, autonomous coalescences that share some protein components with processing bodies. Strikingly, eIF2α is not recruited to PKR clusters, and PKR cluster disruption enhances eIF2α phosphorylation. Together, these results support a model in which PKR clustering may limit encounters between PKR and eIF2α to buffer downstream signaling and prevent the ISR from misfiring.
999.
Wheat is a major crop worldwide, mainly cultivated for human consumption and animal feed. Grain quality is paramount in determining its value and downstream use. While we know that climate change threatens global crop yields, a better understanding of its impacts on wheat end-use quality is also critical. Combining quantitative genetics with climate model outputs, we investigated UK-wide trends in genotypic adaptation for wheat quality traits. In our approach, we augmented genomic prediction models with environmental characterisation of field trials to predict trait values and climate effects in historical field trial data between 2001 and 2020. Adding environmental covariates, such as temperature and rainfall, successfully enabled prediction of genotype-by-environment interactions (G × E) and increased prediction accuracy of most traits for new genotypes in new-year cross-validation. We then extended predictions from these models to much larger numbers of simulated environments using climate scenarios projected under Representative Concentration Pathway 8.5 for 2050–2069. We found geographically varying climate change impacts on wheat quality due to contrasting associations between specific weather covariables and quality traits across the UK. Notably, negative impacts on quality traits were predicted in the East of the UK due to increased summer temperatures, while the climate in the North and South-west may become more favourable with increased summer temperatures. Furthermore, by projecting 167,040 simulated future genotype-environment combinations, we found only limited potential for breeding to exploit predictable G × E to mitigate year-to-year environmental variability for most traits except Hagberg falling number. This suggests low adaptability of current UK wheat germplasm across future UK climates. More generally, the approaches demonstrated here will be critical to enable adaptation of global crops to near-term climate change.
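The modelling idea in the abstract above, augmenting genomic prediction with environmental covariates so that G × E becomes predictable, can be sketched as a linear model with marker, environment, and marker-by-environment interaction features. The sketch below uses synthetic data and plain ridge regression; the paper's actual models, covariates, and validation scheme differ.

```python
# Sketch of genomic prediction with environmental covariates and G x E
# interaction features, on synthetic data (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
n_geno, n_env, n_markers, n_covs = 40, 6, 50, 3

G = rng.choice([0.0, 1.0, 2.0], size=(n_geno, n_markers))  # marker genotypes
E = rng.normal(size=(n_env, n_covs))                       # env covariates (e.g. temp, rain)

# Simulate a trait with additive marker, environment, and G x E effects.
beta_g = rng.normal(scale=0.3, size=n_markers)
beta_e = rng.normal(scale=1.0, size=n_covs)
beta_ge = rng.normal(scale=0.05, size=n_markers * n_covs)

rows, y = [], []
for i in range(n_geno):
    for j in range(n_env):
        inter = np.outer(G[i], E[j]).ravel()   # G x E interaction features
        rows.append(np.concatenate([G[i], E[j], inter]))
        y.append(G[i] @ beta_g + E[j] @ beta_e + inter @ beta_ge
                 + rng.normal(scale=0.5))      # residual noise
X, y = np.array(rows), np.array(y)

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge regression: solve (X'X + lam*I) w = X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

w = ridge_fit(X, y)
r = np.corrcoef(X @ w, y)[0, 1]  # in-sample fit (held-out CV omitted for brevity)
print(f"in-sample correlation: {r:.2f}")
```

Once fitted, such a model can be evaluated on environment vectors drawn from climate projections rather than observed trials, which is how the abstract's predictions for simulated 2050–2069 environments are structured.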
1000.
Species trees have traditionally been inferred from a few selected markers, and genome-wide investigations remain largely restricted to model organisms or small groups of species for which sampling of fresh material is available, leaving out most of the existing and historical species diversity. The genomes of an increasing number of species, including specimens extracted from natural history collections, are being sequenced at low depth. While these data sets are widely used to analyse organelle genomes, the nuclear fraction is generally ignored. Here we evaluate different reference-based methods to infer phylogenies of large taxonomic groups from such data sets. Using the example of the Oleeae tribe, a worldwide-distributed group, we build phylogenies based on single nucleotide polymorphisms (SNPs) obtained using two reference genomes (the olive and ash trees). The inferred phylogenies are overall congruent, yet present differences that might reflect the effect of distance to the reference on the amount of missing data. To limit this issue, genome complexity was reduced by using pairs of orthologous coding sequences as the reference, allowing us to combine SNPs obtained using two distinct references. Concatenated and coalescence trees based on these combined SNPs suggest events of incomplete lineage sorting and/or hybridization during the diversification of this large phylogenetic group. Our results show that genome-wide phylogenetic trees can be inferred from low-depth sequence data sets for eukaryote groups with complex genomes and histories of reticulate evolution. This opens new avenues for large-scale phylogenomics and biogeographical analyses covering both the extant diversity and the historical diversity stored in museum collections.