901.
Bacteriorhodopsin has a polar cluster of amino acids surrounding the retinal molecule, which is responsible for the light harvesting that fuels proton pumping. Our previous studies showed that threonine 90 is the pivotal amino acid in this polar cluster, both functionally and structurally. In an attempt to perform a phenotype rescue, we chemically designed a retinal analogue to compensate for the drastic effects of the T90A mutation in bacteriorhodopsin. This analogue replaces the methyl group at position C(13) of the retinal hydrocarbon chain with an ethyl group (20-methyl retinal). We analyzed the effect of reconstituting the wild-type and T90A mutant apoproteins with all-trans-retinal and its 20-methyl derivative (hereafter, 13-ethyl retinal). Biophysical characterization indicates that recovering the steric interaction between residue 90 and the retinal eases the accommodation of the chromophore; however, it is not sufficient for a complete phenotype rescue. The characterization of these chemically engineered chromoproteins provides further insight into the role of the hydrogen-bond network and the steric interactions involving the retinal binding pocket in bacteriorhodopsin and other microbial sensory rhodopsins.
902.

Background

Hepatitis C is a treatment-resistant disease affecting millions of people worldwide. The hepatitis C virus (HCV) genome is a single-stranded RNA molecule. After infection of the host cell, viral RNA is translated into a polyprotein that is cleaved by host and viral proteinases into functional viral proteins, both structural and non-structural. Cleavage of the polyprotein involves the viral NS3/4A proteinase, a proven drug target. HCV mutates as it replicates and, as a result, multiple emerging quasispecies rapidly become resistant to anti-virals, including NS3/4A inhibitors.

Methodology/Principal Findings

To circumvent drug resistance and complement the existing anti-virals, NS3/4A inhibitors that are additional to, and distinct from, the FDA-approved telaprevir and boceprevir α-ketoamide inhibitors are required. To test potential new avenues for inhibitor development, we probed several distinct exosites of NS3/4A that are either outside of or partially overlapping with the active-site groove of the proteinase. For this purpose, we employed virtual ligand screening, using the 275,000-compound library of the Developmental Therapeutics Program (NCI/NIH) as a ligand source and the X-ray crystal structure of NS3/4A as a target. As a result, we identified several novel, previously uncharacterized, nanomolar-range inhibitory scaffolds, which suppressed NS3/4A activity in vitro and the replication of a sub-genomic HCV RNA replicon with a luciferase reporter in human hepatocarcinoma cells. The binding sites of these novel inhibitors do not significantly overlap with those of α-ketoamides. As a result, the most common resistance mutations, including V36M, R155K, A156T, D168A and V170A, did not considerably diminish the inhibitory potency of certain novel inhibitor scaffolds we identified.

Conclusions/Significance

Overall, further optimization of the in silico strategy and software platform we developed, and of the lead compounds we identified, may lead to advances in novel anti-virals.
903.
Posterior polymorphous corneal dystrophy (PPCD) is a rare, genetically heterogeneous, autosomal dominant disorder. Nineteen Czech PPCD pedigrees with 113 affected family members were identified, and 17 of these kindreds were genotyped for markers on chromosome 20p12.1-20q12. Comparison of haplotypes in 81 affected members, 20 unaffected first-degree relatives and 13 spouses, as well as 55 unrelated controls, supported the hypothesis of a shared ancestor in 12 families originating from one geographic location. In 38 affected individuals from nine of these pedigrees, a common haplotype was observed between D20S48 and D20S107, spanning approximately 23 Mb and demonstrating segregation of disease with the PPCD1 locus. This haplotype was not detected in 110 ethnically matched control chromosomes. Within the common founder haplotype, a core mini-haplotype was detected for D20S605, D20S182 and M189K2 in all 67 affected members from families 1-12; however, alleles representing the core mini-haplotype were also detected in population-matched controls. The most likely location of the responsible gene within the disease interval, and the estimated mutational age, were inferred by linkage disequilibrium mapping (DMLE+2.3). The appearance of a disease-causing mutation was dated to between 64 and 133 generations ago. The inferred ancestral locus carrying a PPCD1 disease-causing variant within the disease interval spans 60 kb on 20p11.23 and contains a single known protein-coding gene, ZNF133. However, direct sequence analysis of coding and untranslated exons did not reveal a potential pathogenic mutation. Microdeletion or duplication was also excluded by comparative genomic hybridization using a dense chromosome 20-specific array. Geographical origin, haplotype and statistical analysis suggest that in 14 unrelated families an as yet undiscovered mutation on 20p11.23 was inherited from a common ancestor.
The prevalence of PPCD in the Czech Republic appears to be the highest worldwide, and our data suggest that at least one other novel locus for PPCD also exists.
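The dating logic behind estimates such as the 64-133 generation figure can be illustrated with a simple moment estimator. This is a sketch only, not the DMLE+ method used in the study (DMLE+ fits a full Bayesian likelihood over the marker map); the function name and the example numbers are illustrative assumptions. On disease chromosomes, the ancestral allele at a flanking marker survives g generations of recombination with probability (1 - θ)^g, so observing a fraction p of carriers still bearing it gives g ≈ ln p / ln(1 - θ).

```python
import math

def mutation_age_generations(p_ancestral: float, theta: float) -> float:
    """Moment estimate of mutation age: solve p = (1 - theta)**g for g,
    where p is the fraction of disease chromosomes retaining the ancestral
    allele at a marker with recombination fraction theta per generation.
    Illustrative only; DMLE+ uses a full Bayesian model over many markers."""
    return math.log(p_ancestral) / math.log(1.0 - theta)

# e.g. if half the disease chromosomes have recombined at a marker with
# theta = 0.01 (roughly 1 cM away), the estimated age is ~69 generations.
age = mutation_age_generations(0.5, 0.01)
```

Such single-marker estimates are crude; multi-marker likelihood methods like the one used in the study narrow the interval considerably.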
904.
Autocatalytic cycles are rather widespread in nature, and in several theoretical models of catalytic reaction networks their emergence is hypothesized to be inevitable once the network is, or becomes, sufficiently complex. Nevertheless, the emergence of autocatalytic cycles has never been observed in wet laboratory experiments. Here, we present a novel model of catalytic reaction networks with the explicit goal of filling the gap between theoretical predictions and experimental findings. The model is based on earlier work by Kauffman, with two new features: a stochastic algorithm to describe the dynamics, and the possibility of increasing the number of elements and reactions according to the dynamical evolution of the system. Furthermore, the introduction of a temporal threshold allows the detection of cycles even in our context of a stochastic model with asynchronous update. In this study, we describe the model and present results concerning the effect on the overall dynamics of varying (a) the average residence time of the elements in the reactor, (b) the composition of the firing disk and the concentration of the molecules belonging to it, and (c) the composition of the incoming flux.
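A minimal Gillespie-style sketch can make the ingredients of such a stochastic catalytic reaction network concrete. Everything here is an illustrative assumption rather than the authors' implementation: a three-species toy network (food species A and B, plus the condensate AB, which catalyses its own formation), a constant inflow standing in for the firing disk, and a first-order outflow whose rate is set by the average residence time tau.

```python
import random

def simulate(t_max=50.0, tau=10.0, inflow=1.0, k_cat=0.005, seed=0):
    """Gillespie simulation of a toy autocatalytic network (illustrative)."""
    rng = random.Random(seed)
    x = {"A": 20, "B": 20, "AB": 1}  # molecule counts; AB is the condensate
    t = 0.0
    while t < t_max:
        # Propensities: constant inflow of the "firing disk" species A and B,
        # the catalysed condensation A + B --(AB)--> 2 AB (autocatalysis),
        # and first-order outflow of every species (residence time tau).
        props = {"in_A": inflow, "in_B": inflow,
                 "cat": k_cat * x["A"] * x["B"] * x["AB"]}
        for s in x:
            props["out_" + s] = x[s] / tau
        total = sum(props.values())
        if total == 0.0:
            break
        t += rng.expovariate(total)   # waiting time to the next event
        r = rng.uniform(0.0, total)   # choose an event by its propensity
        chosen = None
        for name, p in props.items():
            r -= p
            if r <= 0.0:
                chosen = name
                break
        if chosen is None:            # float round-off guard
            continue
        if chosen == "in_A":
            x["A"] += 1
        elif chosen == "in_B":
            x["B"] += 1
        elif chosen == "cat":
            x["A"] -= 1
            x["B"] -= 1
            x["AB"] += 1
        else:                         # outflow event "out_<species>"
            x[chosen[4:]] -= 1
    return x

counts = simulate()
```

With the autocatalytic step switched off (k_cat=0), AB can only wash out of the reactor; with it on, AB tends to accumulate until the food species limit it. The temporal-threshold cycle detection described above would operate on such a trajectory, flagging a cycle only when its constituent reactions recur within a bounded time window.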
905.
906.
Synthetic biology and nuclear physics share many commonalities in terms of public perception and funding. Synthetic biologists could learn valuable lessons from the history of the atomic bomb and nuclear power.

On 16 July 1945, in the desert of New Mexico, the first nuclear bomb was exploded. It was a crucial moment in the history of the physical sciences, proof positive of the immense forces at work in the heart of atoms, and it inevitably changed the world. In 2010, a team at the J. Craig Venter Research Institute in the USA first created artificial life by inserting a synthetic 1.08 megabase pair genome into a mycoplasma cell that lacked its own. They demonstrated that this new cell with its man-made genome was capable of surviving and reproducing [1]. It was a colossal achievement for biology, and its significance might well rank alongside the detonation of the first atomic bomb in terms of scientific advance.

There are several similarities between twentieth century physics, and twentieth and twenty-first century biology. The nuclear explosion in New Mexico was the result of decades of research and the first splitting of an atom in Otto Hahn's laboratory in 1938. It ushered in an era of new ideas and hopes for a brighter future built on the power of the atom, but the terrible potential of nuclear weapons and the threat of nuclear warfare ultimately overshadowed these hopes and changed the course of science and politics. The crucial achievement of synthetic life is a strikingly similar event; the culmination of decades of research that started with its own atom-splitting moment: recombinant DNA technology. It promises to bring forth a new era for biology and enable a huge variety of applications for industry, medicine and the military. However, as with post-war physics, synthetic biology's promises of a brighter future might not all materialize and could have far-reaching effects on society, science and politics. Biology should therefore take note of the consequences of nuclear physics' iconic event in 1945 for science, politics and society.

To appreciate the similarities of these breakthroughs and their consequences for society, it is necessary to understand the historical perspective. The pivotal discoveries for both disciplines were related to fundamental elements of nature. The rise of nuclear physics can be traced back to the discovery of neutrons by James Chadwick in 1932 [2]. Neutrons are essential to the stability of atoms as they insulate the nucleus against the repulsive forces of its positively charged protons. However, the addition of an extra neutron can destabilize the nucleus and cause it to split, releasing more neutrons and a tremendous amount of energy. This nuclear fission reaction was first described by Otto Hahn and Fritz Strassmann in 1938. Leo Szilard realized the possibility of using the neutrons released from the fission of heavy atoms to trigger a nuclear chain reaction to release huge quantities of energy. The first successful chain reactions took place in 1942 in Germany at Leipzig University in the laboratory of Robert Döpel, and in the USA at the University of Chicago in the so-called Chicago Pile-1 reactor, developed by Enrico Fermi. These first nuclear reactors provided the proof of concept for using a nuclear chain reaction as a source of energy. However, even before that, Albert Einstein and Leo Szilard wrote to US President Franklin D. Roosevelt in 1939, suggesting that the US government should develop a new powerful bomb based on nuclear fission. President Roosevelt created the Manhattan Project, which developed the first atomic bomb in 1945.

The Cold War and the mutually assured nuclear destruction between the USA and the USSR fanned widespread fears about a nuclear Third World War that could wipe out human civilization; Robert Oppenheimer, one of the physicists who developed the atomic bomb, was actually among the first to warn of the spectre of nuclear war. By contrast, the civilian use of nuclear physics, mainly in the form of nuclear reactors, promised a brave new future based on harnessing the power of the atom, but it also generated increasing concerns about the harmful effects of radioactivity, the festering problems of nuclear waste and the safety of nuclear power plants. The nuclear disasters at the Chernobyl reactor in 1986 and the Fukushima power plant in 2011 heightened these concerns to the point that several nations might now abandon nuclear energy altogether.

The fundamental discovery in biology, crucial to the creation of synthetic organisms, was the double helix structure of DNA in 1953 by Francis Crick and James Watson [3]. The realization that DNA molecules have a universal chemical structure to store and pass on genetic information was the intellectual basis for the development of recombinant DNA (rDNA) technology and genetic engineering. Twenty years after this discovery, Stanley Cohen and Herbert Boyer first transferred DNA from one organism into another by using endonucleases and DNA ligases [4]. This early toolkit was later expanded to include DNA sequencing and synthesizing technologies as well as PCR, which culminated in the creation of the first artificial organism in 2010. Craig Venter's team synthesized a complete bacterial chromosome from scratch and transferred it into a bacterial cell lacking a genome: the resulting cell was able to synthesize a new set of proteins and to replicate. This proof of concept experiment now enables scientists to pursue further challenges, such as creating organisms with fully designed genomes to achieve agro-biotechnological, commercial, medical and military goals.

Similarly to nuclear physics, the advent of rDNA technology has concerned the public, as many fear that genetically modified bacteria could escape the laboratory and wreak havoc, or that the technology could be abused to create biological weapons. Unlike with nuclear physics, the scientists working on rDNA technology anticipated these concerns very early on. In 1974, a group of scientists led by Paul Berg decided to suspend research into rDNA technology to discuss possible hazards and regulation. This discussion took place at a meeting in Asilomar, California, in 1975 [5].

A pertinent similarity between these two areas of science is the confluence of several disciplines to create a hybrid technoscience, in which the boundaries between science and technology have become transient [6]. This convergence was vital for the success of both nuclear physics and later synthetic biology, which combines biotechnology, nanotechnology, information technologies and other new fields that have been created along the way [7]. In physics, technoscience received massive support from the government when the military potential of nuclear fission was realized. Although the splitting of the atom took place before the Manhattan Project, the Second World War served as a catalyst to combine research in nuclear physics with organized and goal-directed funding. As most of this funding came from the government, it changed the relationship between politics and research, as scientists were employed to meet specific goals. In the wake of the detonation of the first atomic bombs, the post-war period was another watershed moment for politics, technoscience, industry and society as it generated new and more intimate relationships between science and governments. These included the appointment of a scientific advisor to the President of the USA, the creation of funding organizations such as the National Science Foundation and research organizations such as the National Aeronautics and Space Administration, and large amounts of federal funding for technoscience research at private and public universities. It also led to the formation of international organizations such as the civilian-controlled International Atomic Energy Agency [6].

There is no global war to serve as a catalyst for government spending on synthetic biology. Although the research has benefited tremendously from government agencies and research infrastructure, the funding for Venter's team largely came from the private sector. In this regard, the relationship between biological technoscience and industry might already be more advanced than with the public sector, given the enormous potential of synthetic life for industrial, medical and environmental applications.

Research and innovation at universities have always played a vital role in the success of industry-based capitalism [8]; technoscience is now the major determinant of a knowledge-based economy or 'technocapitalism' [9]. At the heart of technocapitalism are private and public organizations, driven by research and innovation, in sharp contrast to industrial capitalism, where factories were production-driven and research was of less importance [10]. Furthermore, synthetic biology might provide valuable resources to the scientific community and thereby generate new research opportunities and directions for many biological fields [11].

However, given the far-reaching implications of creating synthetic life and the risk of abuse, it is probable that the future relationship between synthetic biology and government will include issues of national security. In the light of potential misuse of synthetic biology for bioterrorism, and the safety risks involved in commercial applications, synthetic biology will eventually require some government regulation and oversight. In contrast to nuclear physics, in which the International Atomic Energy Agency was established only after the atomic bomb, the synthetic biology community should hold a new Asilomar meeting to address concerns and formulate guidelines and management protocols, rather than waiting for politicians or commercial enterprises to regulate the field.

So far, synthetic biology differs from nuclear physics in terms of handling information. The Manhattan Project inevitably created a need for secrecy, as it was created at the height of the Second World War, but the research maintained this shroud of secrecy after the war. After the bombing of Hiroshima and Nagasaki in August 1945, the US government released carefully compiled documents to the American public. The existence of useable nuclear power had been secret until then, and the control of information ensured that the public further supported or tolerated the technology of nuclear fission and the subsequent use of atomic bombs [12]. This initially positive view changed in the ensuing decades with the threat of a global nuclear war.

Information management in synthetic biology differs from nuclear physics, in that most of the crucial breakthroughs are immediately published in peer-reviewed journals and covered by the media. The value of early public discourse on science issues is evident from the reaction towards genetically modified crops and stem cell research. In this regard, synthetic biology has side-stepped the mistakes of nuclear physics and might well achieve a more balanced public integration of future developments.

The main issues that might threaten to dampen public support for synthetic biology and favourable public perception are ethics and biosecurity concerns. Ethical concerns have already been addressed in several forums between scientists and public interest groups; this early engagement between science and society and their continuing dialogue might help to address the public's ethical objections. In terms of biosecurity, biology might learn from nuclear physics' intimate entanglement with politics and the military. Synthetic biologists should maintain control and regulation of their research and avoid the fate of nuclear physicists, who were recruited to fight the Cold War and were not free to pursue their own research. For synthetic biology to stay independent of government, industry and society, it must capitalize on its public engagement and heed the lessons and mistakes of nuclear physics' atom-splitting moment. It should not just evaluate, discuss and address the risks for human or environmental health or biosafety concerns, but should also evaluate potential risks to synthetic biology research itself that could come either from falling public acceptance or from government intrusion.
907.
908.
909.
910.
Overstimulation of the glutamatergic system (excitotoxicity) is involved in various acute and chronic brain diseases. Several studies support the hypothesis that guanosine-5′-monophosphate (GMP) can modulate glutamatergic neurotransmission. The aim of this study was to evaluate the effects of chronically administered GMP on cortical glutamatergic parameters in mice. Additionally, we investigated the neuroprotective potential of GMP treatment by submitting cortical brain slices to oxygen and glucose deprivation (OGD). Cerebrospinal fluid (CSF) purine levels were also measured after the treatment. Mice received oral saline or GMP for 3 weeks. GMP significantly decreased cortical glutamate binding and uptake. Accordingly, GMP reduced the immunocontent of the glutamate receptor subunits NR2A/B and GluR1 (NMDA and AMPA receptors, respectively) and of the glutamate transporters EAAC1 and GLT1. GMP treatment significantly reduced the immunocontent of PSD-95, but did not affect the content of SNAP-25, GLAST or GFAP. Moreover, GMP treatment increased the resistance of the neocortex to the OGD insult. Chronic GMP administration increased the CSF levels of GMP and its metabolites. Altogether, these findings suggest a potential modulatory role of GMP in the neocortical glutamatergic system, promoting functional and plastic changes associated with increased resistance of the mouse neocortex to an in vitro excitotoxic event.