31.
The enormous variety of substances that may be added to forage to manipulate and improve the ensilage process presents an empirical, combinatorial optimization problem of great complexity. To investigate the utility of genetic algorithms for designing effective silage additive combinations, a series of small-scale proof-of-principle silage experiments was performed with fresh ryegrass. Having established that significant biochemical changes occur over an ensilage period as short as 2 days, we performed a series of experiments in which we used 50 silage additive combinations (prepared by using eight bacterial and other additives, each of which was added at six different levels, including zero [i.e., no additive]). The decrease in pH, the increase in lactate concentration, and the free amino acid concentration were measured after 2 days and used to calculate a “fitness” value that indicated the quality of the silage (compared to a control silage made without additives). This analysis also included a “cost” element to account for different total additive levels. In the initial experiment additive levels were selected randomly, but subsequently a genetic algorithm program was used to suggest new additive combinations based on the fitness values determined in the preceding experiments. The result was very efficient selection for silages in which large decreases in pH and high levels of lactate occurred along with low levels of free amino acids. During the series of five experiments, each of which comprised 50 treatments, there was a steady increase in the amount of lactate that accumulated; the best treatment combination was that used in the last experiment, which produced 4.6 times more lactate than the untreated silage. The additive combinations that were found to yield the highest fitness values in the final (fifth) experiment were assessed to determine a range of biochemical and microbiological quality parameters during full-term silage fermentation. We found that these combinations compared favorably both with uninoculated silage and with a commercial silage additive. The evolutionary computing methods described here are a convenient and efficient approach for designing silage additives.
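The closed loop this abstract describes (measure a fitness value for 50 treatment combinations, then let a genetic algorithm propose the next 50) can be sketched as below. This is an illustrative sketch only: the fitness function is a made-up stand-in for the measured pH drop, lactate increase, free amino acid level, and cost term, and the selection, crossover, and mutation operators are generic textbook choices, not the ones the authors used.

```python
import random

N_ADDITIVES = 8  # eight additives, per the abstract
N_LEVELS = 6     # six levels each, including zero
POP_SIZE = 50    # 50 treatment combinations per experiment

def fitness(combo):
    """Hypothetical fitness: a stand-in for the measured silage quality
    (pH drop, lactate, amino acids) minus a cost proportional to the
    total additive load. The weights here are arbitrary."""
    benefit = sum(level * (i % 3 + 1) for i, level in enumerate(combo))
    cost = 0.5 * sum(combo)
    return benefit - cost

def next_generation(population, scores):
    """Generic GA step: tournament selection, one-point crossover,
    occasional point mutation of a single additive level."""
    def pick():
        a, b = random.sample(range(len(population)), 2)
        return population[a] if scores[a] >= scores[b] else population[b]
    children = []
    while len(children) < len(population):
        mum, dad = pick(), pick()
        cut = random.randrange(1, N_ADDITIVES)
        child = mum[:cut] + dad[cut:]
        if random.random() < 0.1:
            child[random.randrange(N_ADDITIVES)] = random.randrange(N_LEVELS)
        children.append(child)
    return children

random.seed(0)
# Initial experiment: additive levels chosen at random.
pop = [[random.randrange(N_LEVELS) for _ in range(N_ADDITIVES)]
       for _ in range(POP_SIZE)]
# Five experiments of 50 treatments each, as in the study.
for _ in range(5):
    scores = [fitness(c) for c in pop]
    pop = next_generation(pop, scores)
best = max(pop, key=fitness)
```

In the actual experiments the fitness evaluation is of course a wet-lab measurement after 2 days of ensilage, not a function call; only the proposal step is computational.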
32.
Expertise in recognizing objects in cluttered scenes is a critical skill for our interactions in complex environments and is thought to develop with learning. However, the neural implementation of object learning across stages of visual analysis in the human brain remains largely unknown. Using combined psychophysics and functional magnetic resonance imaging (fMRI), we show a link between shape-specific learning in cluttered scenes and distributed neuronal plasticity in the human visual cortex. We report stronger fMRI responses for trained than untrained shapes across early and higher visual areas when observers learned to detect low-salience shapes in noisy backgrounds. However, training with high-salience pop-out targets resulted in lower fMRI responses for trained than untrained shapes in higher occipitotemporal areas. These findings suggest that learning of camouflaged shapes is mediated by increasing neural sensitivity across visual areas to bolster target segmentation and feature integration. In contrast, learning of prominent pop-out shapes is mediated by associations at higher occipitotemporal areas that support sparser coding of the critical features for target recognition. We propose that the human brain learns novel objects in complex scenes by reorganizing shape processing across visual areas, while taking advantage of natural image correlations that determine the distinctiveness of target shapes.
34.
Wolbachia is a cytoplasmically inherited alpha-proteobacterium found in a wide range of host arthropod and nematode taxa. Wolbachia infection in Drosophila is closely associated with the expression of a unique form of post-fertilization lethality termed cytoplasmic incompatibility (CI). This form of incompatibility is expressed only by infected males, suggesting that Wolbachia exerts its effect during spermatogenesis. The growth and distribution of Wolbachia throughout sperm development in individual spermatocysts and elongating sperm bundles is described. Wolbachia growth within a developing cyst seems to begin during the pre-meiotic spermatocyte growth phase, with the majority of bacteria accumulating during cyst elongation. Wolbachia are predominantly localized in the proximal end of the immature cyst, opposite the spermatid nuclei, and throughout development there appears to be little movement of Wolbachia between spermatids via the connecting cytoplasmic bridges. The overall number of new cysts infected, as well as the number of spermatids per cyst infected, seems to decrease with age and corresponds to the previously documented drop in CI with age. In contrast, in one CI-expressing line of Drosophila melanogaster, fewer cysts are infected and a much greater degree of variation in numbers is observed between spermatids. Furthermore, the initiation and extent of the fastest period of Wolbachia growth in the D. melanogaster strain lags behind that of Drosophila simulans. The possible implications for the as-yet-unexplained mechanism of CI are discussed.
35.
Although portable instruments have been used in the assessment of sleep disturbance for patients with low back pain (LBP), the accuracy of the instruments in detecting sleep/wake episodes for this population is unknown. This study investigated the criterion validity of two portable instruments (Armband and Actiwatch) for assessing sleep disturbance in patients with LBP. Fifty patients with LBP underwent simultaneous overnight sleep recordings in a university sleep laboratory. All 50 participants were assessed by polysomnography (PSG) and the Armband, and a subgroup of 33 participants wore an Actiwatch. Criterion validity was determined by calculating epoch-by-epoch agreement, sensitivity, specificity and prevalence- and bias-adjusted kappa (PABAK) for sleep versus wake between each instrument and PSG. The relationship between PSG and the two instruments was assessed using intraclass correlation coefficients (ICC 2, 1). The study participants showed symptoms of sub-threshold insomnia (mean ISI = 13.2, 95% CI = 6.36) and poor sleep quality (mean PSQI = 9.20, 95% CI = 4.27). Observed agreement with PSG was 85% and 88% for the Armband and Actiwatch. Sensitivity was 0.90 for both instruments, specificity was 0.54 and 0.67, and PABAK was 0.69 and 0.77 for the Armband and Actiwatch, respectively. The ICC (95% CI) was 0.76 (0.61 to 0.86) and 0.80 (0.46 to 0.92) for total sleep time, 0.52 (0.29 to 0.70) and 0.55 (0.14 to 0.77) for sleep efficiency, 0.64 (0.45 to 0.78) and 0.52 (0.23 to 0.73) for wake after sleep onset, and 0.13 (−0.15 to 0.39) and 0.33 (−0.05 to 0.63) for sleep onset latency, for the Armband and Actiwatch, respectively. The findings showed that the two instruments have varied criterion validity across the sleep parameters, from excellent validity for measures of total sleep time, to good validity for measures of sleep efficiency and wake after sleep onset, to poor validity for sleep onset latency.
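The epoch-by-epoch statistics used in this validation (observed agreement, sensitivity, specificity, and PABAK, where PABAK = 2 × observed agreement − 1) can be computed as in this minimal sketch. Treating a sleep epoch as the positive class follows the abstract; the function name and data layout (1 = sleep, 0 = wake) are illustrative assumptions, not the study's code.

```python
def epoch_metrics(psg, device):
    """Epoch-by-epoch criterion validity of a portable device against
    PSG. Inputs are equal-length sequences of 1 (sleep) / 0 (wake)."""
    assert len(psg) == len(device)
    tp = sum(p == 1 and d == 1 for p, d in zip(psg, device))  # both sleep
    tn = sum(p == 0 and d == 0 for p, d in zip(psg, device))  # both wake
    fp = sum(p == 0 and d == 1 for p, d in zip(psg, device))  # device oversleeps
    fn = sum(p == 1 and d == 0 for p, d in zip(psg, device))  # device overwakes
    po = (tp + tn) / len(psg)  # observed proportion of agreement
    return {
        "agreement": po,
        "sensitivity": tp / (tp + fn),  # ability to detect sleep
        "specificity": tn / (tn + fp),  # ability to detect wake
        "pabak": 2 * po - 1,            # prevalence- and bias-adjusted kappa
    }
```

PABAK is used here because sleep epochs vastly outnumber wake epochs overnight; ordinary Cohen's kappa is distorted by that prevalence imbalance, whereas PABAK depends only on the observed agreement.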
37.

Introduction

Combination antiretroviral therapy (cART) can effectively prevent vertical transmission of HIV but there is potential risk of adverse maternal, foetal or infant effects. Specifically, the effect of cART use during pregnancy on mitochondrial DNA (mtDNA) content in HIV-positive (HIV+) women is unclear. We sought to characterize subclinical alterations in peripheral blood mtDNA levels in cART-treated HIV+ women during pregnancy and the postpartum period.

Methods

This prospective longitudinal observational cohort study enrolled both HIV+ and HIV-negative (HIV-) pregnant women. Clinical data and blood samples were collected at three time points in pregnancy (13 to <23 weeks, 23 to <30 weeks, and 30 to 40 weeks), and at delivery and six weeks postpartum in HIV+ women. The peripheral blood mtDNA to nuclear DNA (nDNA) ratio was measured by qPCR.

Results

Over a four-year period, 63 HIV+ and 42 HIV- women were enrolled. HIV+ women showed significantly lower mtDNA/nDNA ratios compared to HIV- women during pregnancy (p = 0.003), after controlling for platelet count and repeated measurements using a multivariable mixed-effects model. Ethnicity, gestational age (GA) and substance use were also significantly associated with mtDNA/nDNA ratio (p≤0.02). Among HIV+ women, higher CD4 nadir was associated with higher mtDNA/nDNA ratios (p<0.0001), and these ratios were significantly lower during pregnancy compared to the postpartum period (p<0.0001).

Conclusions

In the context of this study, it was not possible to distinguish between mtDNA effects related to HIV infection versus cART. Nevertheless, while mtDNA levels were relatively stable over time in both groups during pregnancy, they were significantly lower in HIV+ women compared to HIV- women. Although no immediate clinical impact was observed on maternal or infant health, lower maternal mtDNA levels may exert long-term effects on women and children and remain a concern. Improved knowledge of such subclinical alterations is another step toward optimizing the safety and efficacy of cART regimens during pregnancy.
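For readers unfamiliar with qPCR-based mtDNA quantification: a relative mtDNA/nDNA ratio is commonly derived from the threshold cycles (Ct) of a mitochondrial and a nuclear target using the standard delta-Ct model, which assumes the template doubles every cycle (100% amplification efficiency). The sketch below illustrates that model only; the abstract does not specify this study's exact quantification protocol, so the function and its defaults are assumptions.

```python
def mtdna_ratio(ct_mito, ct_nuclear, efficiency=2.0):
    """Relative mtDNA/nDNA content from qPCR threshold cycles.

    Under the delta-Ct model, a target that crosses threshold k cycles
    earlier is present at efficiency**k times the copy number. With
    perfect doubling (efficiency=2.0) this is the classic 2**(delta Ct).
    """
    return efficiency ** (ct_nuclear - ct_mito)

# A mitochondrial target crossing threshold 8 cycles before the nuclear
# target implies roughly 2**8 = 256 mtDNA copies per nuclear copy.
ratio = mtdna_ratio(ct_mito=18.0, ct_nuclear=26.0)
```

In practice, amplification efficiencies are calibrated per assay and the nuclear gene's diploid copy number may be folded into the ratio, which is why published mtDNA "content" values are comparable only within a given protocol.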
39.

Background and Aims

Autumn leaf senescence marks the end of the growing season in temperate ecosystems. Its timing influences a number of ecosystem processes, including carbon, water and nutrient cycling. Climate change is altering leaf senescence phenology and, as those changes continue, it will affect individual woody plants, species and ecosystems. In contrast to spring leaf-out times, however, leaf senescence times remain relatively understudied. Variation in the phenology of leaf senescence among species and locations is still poorly understood.

Methods

Leaf senescence phenology of 1360 deciduous plant species at six temperate botanical gardens in Asia, North America and Europe was recorded in 2012 and 2013. This large data set was used to explore ecological and phylogenetic factors associated with variation in leaf senescence.

Key Results

Leaf senescence dates among species varied by 3 months on average across the six locations. Plant species tended to undergo leaf senescence in the same order in the autumns of both years at each location, but the order of senescence was only weakly correlated across sites. Leaf senescence times were not related to spring leaf-out times, were not evolutionarily conserved and were only minimally influenced by growth habit, wood anatomy and percentage colour change or leaf drop. These weak patterns of leaf senescence timing contrast with much stronger leaf-out patterns from a previous study.

Conclusions

The results suggest that, in contrast to the broader temperature effects that determine leaf-out times, leaf senescence times are probably determined by a larger or different suite of local environmental effects, including temperature, soil moisture, frost and wind. Determining the importance of these factors for a wide range of species represents the next challenge for understanding how climate change is affecting the end of the growing season and associated ecosystem processes.
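Comparing the *order* of senescence across sites, as in the Key Results above, amounts to a rank correlation between species' senescence dates at two locations. A minimal Spearman correlation sketch (standard library only; the function names and average-rank tie handling are illustrative choices, not necessarily the analysis the authors ran):

```python
def rank(values):
    """Average-rank transform: tied values share the mean of their ranks."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # ranks are 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho: the Pearson correlation of the rank-transformed data."""
    rx, ry = rank(x), rank(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

With species' senescence dates (e.g. day of year) at two gardens as `x` and `y`, rho near 1 would mean species senesce in the same order at both sites; the study found only weak correlations of this kind across sites.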