Search results: 2,939 articles in total (2,646 subscription full text, 175 free, 118 free domestic access). Distribution by publication year:

  2023: 40    2022: 29    2021: 45    2020: 78    2019: 77    2018: 58
  2017: 65    2016: 68    2015: 65    2014: 149   2013: 115   2012: 104
  2011: 110   2010: 102   2009: 97    2008: 110   2007: 134   2006: 122
  2005: 127   2004: 106   2003: 100   2002: 67    2001: 89    2000: 95
  1999: 66    1998: 47    1997: 42    1996: 42    1995: 40    1994: 43
  1993: 42    1992: 26    1991: 27    1990: 24    1989: 21    1988: 34
  1987: 28    1986: 25    1985: 24    1984: 31    1983: 22    1982: 35
  1981: 25    1980: 20    1979: 24    1978: 14    1976: 17    1974: 15
  1973: 11    1972: 12
91.
Objective: To verify that the cardiopulmonary exercise tests (CPET) completed by clinical subjects represent true maximal exercise, to further design and refine the Max test for verifying the accuracy of CPET as an objective, quantitative functional assessment, and to examine whether it is feasible to use a specific value of a particular index as the criterion for terminating exercise.

Methods: A total of 216 subjects who signed informed consent and underwent both CPET and the Max test at Fuwai Hospital between September 2017 and January 2019 were enrolled: 41 normal subjects and 175 clinical patients whose CPET maximal-exercise results were in doubt because the peak respiratory exchange ratio (RER) was ≤1.10 or because heart rate and blood pressure failed to rise during exercise. Sixty of these cases had been reported preliminarily; the present study expands that work. Max test protocol: after completing the CPET, subjects first pedaled at ≥60 r/min, a constant work rate equal to 130% of the CPET peak power was then applied, and subjects were encouraged to exercise to the limit at which they could no longer continue. The 30-s maximum heart rate and maximum oxygen uptake of the Max test were calculated and analyzed, together with their differences and percentage differences from the CPET peak heart rate and peak oxygen uptake, where percentage difference = (Max value - peak value)/Max value × 100%. Evaluation criteria: (1) if the percentage difference of either heart rate or oxygen uptake was ≤-10% (the Max-test value below the CPET peak value), the Max test was judged an operational failure; otherwise it was judged successful; (2) if the percentage differences of both heart rate and oxygen uptake fell within -10% to 10%, the Max test was successful and demonstrated that the CPET represented maximal exercise and that the CPET peak data were reasonably accurate; (3) if the percentage difference of either heart rate or oxygen uptake was ≥10%, the Max test was successful and demonstrated that the CPET was not maximal exercise.

Results: In the patient group, peak oxygen uptake (L/min, ml/(min·kg), %pred), anaerobic threshold (L/min, ml/(min·kg), %pred), peak oxygen pulse (ml/beat, %pred), peak RER, peak systolic blood pressure (mmHg), peak work rate (W/min), peak heart rate (bpm), and the oxygen uptake efficiency plateau (OUEP; ratio, %pred) were lower than in the normal group, whereas the lowest 90-s average ventilatory equivalent for carbon dioxide (lowest Ve/VCO2; ratio, %pred) and the Ve/VCO2 slope (ratio, %pred) were higher than in the normal group (P<0.05). All subjects in both groups completed the CPET and the Max test safely without any adverse events. Among the 216 subjects, the Max test was successful in 198 (91.7%), confirming maximal exercise in 182 CPETs and non-maximal exercise in 16; it failed in 18 (8.3%).

Conclusion: In clinical testing, when there is doubt as to whether a CPET result represents true maximal exercise, the Max test can be used to verify whether the CPET was maximal. The Max test is safe and feasible and merits further investigation and wider clinical application.
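To make the evaluation criteria above concrete, here is a minimal sketch of the percentage-difference calculation and the three-way classification; the function and variable names (for example `max_hr`, `peak_vo2`) are illustrative placeholders, not part of the published protocol.

```python
def pct_diff(max_value: float, peak_value: float) -> float:
    """Percentage difference = (Max value - CPET peak value) / Max value * 100%."""
    return (max_value - peak_value) / max_value * 100.0

def classify_max_test(max_hr, peak_hr, max_vo2, peak_vo2):
    """Apply the three evaluation criteria described in the abstract.

    Returns one of:
      'operation failed'                 - either index <= -10% (Max below CPET peak)
      'CPET was non-maximal exercise'    - either index >= 10%
      'CPET was maximal exercise'        - both indices within -10% .. 10%
    """
    d_hr = pct_diff(max_hr, peak_hr)
    d_vo2 = pct_diff(max_vo2, peak_vo2)
    if d_hr <= -10 or d_vo2 <= -10:
        return "operation failed"
    if d_hr >= 10 or d_vo2 >= 10:
        return "CPET was non-maximal exercise"
    return "CPET was maximal exercise (peak data considered accurate)"

# Hypothetical example: Max-test HR/VO2 nearly equal to the CPET peaks -> maximal.
print(classify_max_test(max_hr=165, peak_hr=162, max_vo2=1.85, peak_vo2=1.80))
```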
92.
Hypothesis/objective: Prolonged QT interval is an index of propensity for dangerous ventricular tachyarrhythmias. The aim of this article is to establish an automatic algorithm for QT interval measurement.

Method: The proposed method is based on the continuous wavelet transform. In this method, the concepts of rescaled wavelet coefficients and the dominant scales of the electrocardiogram (ECG) components are used to detect the ECG characteristic points. A new concept, the rescaled maximum energy density, is introduced to estimate the QT interval.

Results and conclusion: We applied the algorithm to lead II recordings from the PTB database of PhysioBank/PhysioNet and evaluated the results against the corresponding reference QT values. The criterion used to assess the method's performance is the root mean square (RMS) error. The method achieved an RMS error of 27.89 ms over 549 subjects. The proposed method is fast, simple, and applicable to a wide range of ECG cardiac cycle morphologies.
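For orientation only, the following sketch shows the general shape of such a pipeline, assuming a sampled lead-II ECG array and using the PyWavelets continuous wavelet transform; the scale range, the Mexican-hat wavelet, and the RMS helper are illustrative choices rather than the authors' rescaled-coefficient formulation.

```python
import numpy as np
import pywt

def cwt_energy(ecg, fs, scales=np.arange(1, 64)):
    """Continuous wavelet transform of an ECG trace; returns |coefficients|^2,
    whose scale-wise maxima highlight QRS complexes and T waves."""
    coefs, _freqs = pywt.cwt(ecg, scales, "mexh", sampling_period=1.0 / fs)
    return np.abs(coefs) ** 2  # shape: (len(scales), len(ecg))

def rms_error_ms(qt_estimated, qt_reference):
    """Root-mean-square error (in ms) between estimated and reference QT intervals."""
    qt_estimated = np.asarray(qt_estimated, float)
    qt_reference = np.asarray(qt_reference, float)
    return float(np.sqrt(np.mean((qt_estimated - qt_reference) ** 2)))

# Hypothetical usage: energy ridges locate Q onsets and T offsets, from which QT
# intervals would be measured and compared against reference annotations.
fs = 1000.0                                                 # sampling rate in Hz (assumed)
ecg = np.sin(2 * np.pi * 1.2 * np.arange(0, 5, 1 / fs))     # placeholder signal
energy = cwt_energy(ecg, fs)
print(energy.shape, rms_error_ms([392, 405], [400, 398]))
```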
93.
94.
Some species are adapting to changing environments by expanding their geographic ranges. Understanding whether range shifts will be accompanied by increased exposure to other threats is crucial to predicting when and where new populations could successfully establish. If species overlap to a greater extent with human development under climate change, this could form ecological traps which are attractive to dispersing individuals, but the use of which substantially reduces fitness. Until recently, the core nesting range for the Critically Endangered Kemp's ridley sea turtle (Lepidochelys kempii) was ca. 1000 km of sparsely populated coastline in Tamaulipas, Mexico. Over the past twenty‐five years, this species has expanded its range into populated areas of coastal Florida (>1500 km outside the historical range), where nesting now occurs annually. Suitable Kemp's ridley nesting habitat has persisted for at least 140 000 years in the western Gulf of Mexico, and climate change models predict further nesting range expansion into the eastern Gulf of Mexico and northern Atlantic Ocean. Nesting beaches in the expanded range are 6–12% more likely than current nesting beaches to occur along uninhabited stretches of coastline, suggesting that novel nesting areas will not be associated with high levels of anthropogenic disturbance. Although the high breeding‐site fidelity of some migratory species could limit adaptation to climate change, rapid population recovery following effective conservation measures may enhance opportunities for range expansion. Anticipating the interactive effects of past or contemporary conservation measures, climate change, and future human activities will help focus long‐term conservation strategies.
95.
Phytoplankton size structure is key for the ecology and biogeochemistry of pelagic ecosystems, but the relationship between cell size and maximum growth rate (μmax) is not yet well understood. We used cultures of 22 species of marine phytoplankton from five phyla, ranging from 0.1 to 10^6 μm^3 in cell volume (Vcell), to determine experimentally the size dependence of growth, metabolic rate, elemental stoichiometry and nutrient uptake. We show that both μmax and carbon‐specific photosynthesis peak at intermediate cell sizes. Maximum nitrogen uptake rate (VmaxN) scales isometrically with Vcell, whereas the nitrogen minimum quota scales as Vcell^0.84. Large cells thus possess high ability to take up nitrogen, relative to their requirements, and large storage capacity, but their growth is limited by the conversion of nutrients into biomass. Small species show similar volume‐specific VmaxN compared to their larger counterparts, but have higher nitrogen requirements. We suggest that the unimodal size scaling of phytoplankton growth arises from taxon‐independent, size‐related constraints in nutrient uptake, requirement and assimilation.
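For readers who want to reproduce this kind of allometric analysis, here is a minimal sketch of estimating a scaling exponent by least squares on log-transformed data; the arrays and the 0.84 exponent used to generate the toy data are placeholders, not the study's measurements.

```python
import numpy as np

def fit_power_law(x, y):
    """Fit y = a * x**b by linear regression in log-log space.
    Returns (a, b); b is the allometric scaling exponent
    (b close to 1 means isometric scaling, as reported for VmaxN vs. Vcell)."""
    logx, logy = np.log10(x), np.log10(y)
    b, log_a = np.polyfit(logx, logy, 1)
    return 10 ** log_a, b

# Hypothetical data: cell volumes (um^3) and a volume-scaled trait with noise.
vcell = np.array([0.5, 5.0, 50.0, 5e2, 5e3, 5e4, 5e5])
trait = 0.02 * vcell ** 0.84 * np.exp(np.random.normal(0, 0.1, vcell.size))
a, b = fit_power_law(vcell, trait)
print(f"fitted exponent b = {b:.2f}")   # should be close to 0.84 here
```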
96.
Long-term balancing selection typically leaves narrow footprints of increased genetic diversity, and therefore most detection approaches only achieve optimal performance when sufficiently small genomic regions (i.e., windows) are examined. Such methods are sensitive to window sizes and suffer substantial losses in power when windows are large. Here, we employ mixture models to construct a set of five composite likelihood ratio test statistics, which we collectively term B statistics. These statistics are agnostic to window sizes and can operate on diverse forms of input data. Through simulations, we show that they exhibit power comparable to the best-performing current methods, and retain substantially high power regardless of window sizes. They also display considerable robustness to high mutation rates and uneven recombination landscapes, as well as an array of other common confounding scenarios. Moreover, we applied a specific version of the B statistics, termed B2, to a human population-genomic data set and recovered many top candidates from prior studies, including the then-uncharacterized STPG2 and CCDC169-SOHLH2, both of which are related to gamete functions. We further applied B2 to a bonobo population-genomic data set. In addition to the MHC-DQ genes, we uncovered several novel candidate genes, such as KLRD1, involved in viral defense, and SCN9A, associated with pain perception. Finally, we show that our methods can be extended to account for multiallelic balancing selection, and we integrated the set of statistics into open-source software named BalLeRMix for future applications by the scientific community.
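The B statistics themselves are defined in the paper and implemented in BalLeRMix. Purely to illustrate the composite likelihood ratio idea they build on, a toy window scan might look like the sketch below; the Beta(alpha, alpha) "balancing-like" alternative against a uniform null is a deliberately simplified stand-in, not the authors' model.

```python
import numpy as np
from scipy.stats import beta
from scipy.optimize import minimize_scalar

def window_clr(freqs):
    """Toy composite likelihood ratio for one window of minor-allele frequencies.

    Alternative: frequencies ~ Beta(alpha, alpha) with alpha >= 1, so probability
    mass is pulled toward 0.5 (loosely mimicking balancing selection).
    Null: frequencies ~ Uniform(0, 1), whose log-density is 0 everywhere.
    """
    freqs = np.clip(np.asarray(freqs, float), 1e-6, 1 - 1e-6)

    def neg_loglik(alpha):
        return -np.sum(beta.logpdf(freqs, alpha, alpha))

    res = minimize_scalar(neg_loglik, bounds=(1.0, 100.0), method="bounded")
    return 2.0 * (-res.fun)      # 2 * (logL_alt - logL_null), with logL_null = 0

# Hypothetical windows: one enriched in intermediate-frequency alleles, one not.
print(window_clr([0.45, 0.52, 0.48, 0.55, 0.50]))   # large positive CLR
print(window_clr([0.05, 0.93, 0.10, 0.88, 0.97]))   # CLR near zero
```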
97.
Researchers in observational survival analysis are interested not only in estimating the survival curve nonparametrically but also in obtaining statistical inference for the parameter. We consider right-censored failure time data in which we observe n independent and identically distributed observations of a vector random variable consisting of baseline covariates, a binary treatment at baseline, a survival time subject to right censoring, and the censoring indicator. We allow the baseline covariates to affect both treatment and censoring, so that an estimator that ignores covariate information would be inconsistent. The goal is to use these data to estimate the counterfactual average survival curve of the population if all subjects were assigned the same treatment at baseline. Existing observational survival analysis methods do not yield monotone survival curve estimators, which is undesirable and may sacrifice efficiency by not constraining the estimator's shape with prior knowledge of the estimand. In this paper, we present a one-step Targeted Maximum Likelihood Estimator (TMLE) for the counterfactual average survival curve and show that this new TMLE can be executed via recursion in small local updates. We demonstrate the finite-sample performance of the one-step TMLE in simulations and in an application to a monoclonal gammopathy data set.
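The one-step TMLE is developed in the paper itself. As a much simpler point of reference for the estimand, the counterfactual average survival curve, here is a sketch of an inverse-probability-weighted Kaplan-Meier estimator; it is not the authors' estimator, and the weights and toy data are placeholders.

```python
import numpy as np

def ipw_kaplan_meier(time, event, weight, grid):
    """Inverse-probability-weighted Kaplan-Meier estimate of S(t) on `grid`.

    time   : observed follow-up times
    event  : 1 if the event was observed, 0 if right-censored
    weight : subject-level weights, e.g. 1 / P(A = a | covariates)
    grid   : time points at which to evaluate the survival curve
    """
    time, event, weight = map(np.asarray, (time, event, weight))
    event_times = np.sort(np.unique(time[event == 1]))
    hazards = []
    for t in event_times:
        d = weight[(time == t) & (event == 1)].sum()   # weighted events at t
        n = weight[time >= t].sum()                    # weighted at-risk at t
        hazards.append(d / n)
    hazards = np.array(hazards)
    # Survival: product of (1 - hazard) over event times up to each grid point.
    return np.array([np.prod(1.0 - hazards[event_times <= g]) for g in grid])

# Hypothetical toy data; in practice the weights would come from a fitted
# treatment (and possibly censoring) model.
t_obs = np.array([2.0, 3.5, 3.5, 5.0, 7.2, 9.0])
d_obs = np.array([1,   1,   0,   1,   0,   1  ])
w     = np.array([1.8, 1.2, 1.5, 2.0, 1.1, 1.4])
print(ipw_kaplan_meier(t_obs, d_obs, w, grid=np.array([1, 4, 8, 10])))
```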
98.
Experiments that longitudinally collect RNA sequencing (RNA-seq) data can provide transformative insights in biology research by revealing the dynamic patterns of genes. Such experiments create a great demand for new analytic approaches to identify differentially expressed (DE) genes based on large-scale time-course count data. Existing methods, however, are suboptimal with respect to power and may lack theoretical justification. Furthermore, most existing tests are designed to distinguish among conditions based on overall differential patterns across time, though in practice a variety of composite hypotheses are of more scientific interest. Finally, some current methods may fail to control the false discovery rate. In this paper, we propose a new model and testing procedure to address the above issues simultaneously. Specifically, conditional on a latent Gaussian mixture with evolving means, we model the data by negative binomial distributions. Motivated by Storey (2007) and Hwang and Liu (2010), we introduce a general testing framework based on the proposed model and show that the proposed test enjoys the optimality property of maximum average power. The test allows not only identification of traditional DE genes but also testing of a variety of composite hypotheses of biological interest. We establish the identifiability of the proposed model, implement the proposed method via efficient algorithms, and demonstrate its good performance via simulation studies. The procedure reveals interesting biological insights when applied to data from an experiment that examines the effect of varying light environments on the fundamental physiology of the marine diatom Phaeodactylum tricornutum.
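The proposed model and optimal test are specific to the paper. As a simple baseline for contrast, a generic per-gene negative binomial likelihood ratio test between two conditions could be sketched as follows; the dispersion handling and the toy counts are placeholders, not the proposed procedure.

```python
import numpy as np
from scipy.stats import nbinom, chi2
from scipy.optimize import minimize

def nb_negloglik(params, counts):
    """Negative log-likelihood of counts ~ NegBin(mean=mu, dispersion=alpha),
    parameterised on the log scale for unconstrained optimisation."""
    mu, alpha = np.exp(params)              # mean and dispersion, both > 0
    r = 1.0 / alpha                         # scipy's 'n' parameter
    p = r / (r + mu)
    return -np.sum(nbinom.logpmf(counts, r, p))

def nb_lrt(counts_a, counts_b):
    """Likelihood ratio test: separate mean and dispersion per condition versus
    a single pooled fit; the statistic is compared with a chi2(2) distribution."""
    fit = lambda c: minimize(nb_negloglik, x0=[np.log(np.mean(c) + 1), 0.0],
                             args=(np.asarray(c),), method="Nelder-Mead").fun
    ll_alt = -(fit(counts_a) + fit(counts_b))         # separate fits per condition
    ll_null = -fit(np.concatenate([counts_a, counts_b]))
    stat = 2.0 * (ll_alt - ll_null)
    return stat, chi2.sf(stat, df=2)

# Hypothetical counts for one gene in two light conditions.
stat, p = nb_lrt([12, 15, 9, 14], [30, 41, 28, 35])
print(f"LRT statistic = {stat:.2f}, p = {p:.3g}")
```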
99.
Recurrent event data are widely encountered in clinical and observational studies. Most methods for recurrent events treat the outcome as a point process and, as such, neglect any associated event duration. This generally leads to a less informative and potentially biased analysis. We propose a joint model for the recurrent event (incidence) rate and the event duration, with the two processes linked through a bivariate normal frailty. For example, when the event is hospitalization, we can treat the time to admission and the length of stay as two alternating recurrent events. In our method, the regression parameters are estimated through a penalized partial likelihood, and the variance-covariance matrix of the frailty is estimated through a recursive estimating formula. Moreover, we develop a likelihood ratio test to assess the dependence between the incidence and duration processes. Simulation results demonstrate that our method provides accurate parameter estimation with a relatively fast computation time. We illustrate the method through an analysis of hospitalizations among end-stage renal disease patients.
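To make the data structure concrete, the sketch below simulates the kind of alternating recurrent-event data this model targets, with admission gaps and lengths of stay linked through a bivariate normal frailty; the rates, covariance, and follow-up window are arbitrary illustration values, and this is not the estimation procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_subject(follow_up=365.0, base_gap=60.0, base_los=5.0,
                     frailty_cov=((0.25, 0.15), (0.15, 0.25))):
    """Simulate one subject's alternating gap times (time to next admission) and
    lengths of stay, sharing a bivariate normal frailty (b1, b2)."""
    b1, b2 = rng.multivariate_normal([0.0, 0.0], frailty_cov)
    records, t = [], 0.0
    while True:
        gap = rng.exponential(base_gap * np.exp(-b1))   # larger b1 -> more admissions
        los = rng.exponential(base_los * np.exp(b2))    # larger b2 -> longer stays
        if t + gap > follow_up:
            break
        records.append({"admit": t + gap, "discharge": min(t + gap + los, follow_up)})
        t = t + gap + los
    return records

# Hypothetical cohort: with a positive frailty covariance, subjects who are
# admitted more often also tend to stay longer, the dependence the joint model captures.
cohort = [simulate_subject() for _ in range(3)]
for i, recs in enumerate(cohort):
    print(f"subject {i}: {len(recs)} hospitalizations")
```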
100.