22.

Background

The Centers for Disease Control and Prevention recommends nontargeted opt-out HIV screening in healthcare settings. Cost effectiveness is critical when considering potential screening methods. Our goal was to compare programmatic costs of nontargeted opt-out rapid HIV screening with physician-directed diagnostic rapid HIV testing in an urban emergency department (ED) as part of the Denver ED HIV Opt-Out Trial.

Methods

This was a prospective cohort study nested in a larger quasi-experiment. Over 16 months, nontargeted rapid HIV screening (intervention) and diagnostic rapid HIV testing (control) were alternated in 4-month time blocks. During the intervention phase, patients were offered HIV testing using an opt-out approach during registration; during the control phase, physicians used a diagnostic approach to offer HIV testing to patients. Each method was fully integrated into ED operations. Direct program costs were determined from the perspective of the ED. Time-motion methodology was used to estimate personnel activity costs. Costs per patient newly diagnosed with HIV infection were calculated for each phase, along with incremental cost-effectiveness ratios.

Results

During the intervention phase, 28,043 eligible patients were included, 6,933 (25%) completed testing, and 15 (0.2%, 95% CI: 0.1%–0.4%) were newly-diagnosed with HIV infection. During the control phase, 29,925 eligible patients were included, 243 (0.8%) completed testing, and 4 (1.7%, 95% CI: 0.4%–4.2%) were newly-diagnosed with HIV infection. Total annualized costs for nontargeted screening were $148,997, whereas total annualized costs for diagnostic HIV testing were $31,355. The average costs per HIV diagnosis were $9,932 and $7,839, respectively. Nontargeted HIV screening identified 11 more HIV infections at an incremental cost of $10,693 per additional infection.
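The incremental cost-effectiveness arithmetic follows directly from the figures above; a minimal sketch (small rounding differences from the published per-diagnosis figures are expected, since the paper's averages were presumably computed from unrounded costs):

```python
# Programmatic costs and newly diagnosed cases reported for each phase.
cost_screening, cases_screening = 148_997, 15   # nontargeted opt-out screening
cost_diagnostic, cases_diagnostic = 31_355, 4   # physician-directed testing

avg_screening = cost_screening / cases_screening     # ~$9,933 per diagnosis
avg_diagnostic = cost_diagnostic / cases_diagnostic  # ~$7,839 per diagnosis

# Incremental cost-effectiveness ratio: extra cost per additional infection found.
icer = (cost_screening - cost_diagnostic) / (cases_screening - cases_diagnostic)
print(f"ICER: ${icer:,.0f} per additional infection identified")
```

This yields roughly $10,695 per additional infection, consistent with the paper's reported $10,693 up to rounding.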

Conclusions

Compared to diagnostic testing, nontargeted HIV screening was more costly but identified more HIV infections. More effective and less costly testing strategies may be required to improve the identification of patients with undiagnosed HIV infection in the ED.
23.

Background

Tools to predict death or spontaneous survival are necessary to inform liver transplantation (LTx) decisions in pediatric acute liver failure (PALF), but such tools are not available. Recent data suggest that immune/inflammatory dysregulation occurs in the setting of acute liver failure. We hypothesized that specific, dynamic, and measurable patterns of immune/inflammatory dysregulation will correlate with outcomes in PALF.

Methods

We assayed 26 inflammatory mediators on stored serum samples obtained from a convenience sample of 49 children in the PALF study group (PALFSG) collected within 7 days after enrollment. Outcomes were assessed within 21 days of enrollment consisting of spontaneous survivors, non-survivors, and LTx recipients. Data were subjected to statistical analysis, patient-specific Principal Component Analysis (PCA), and Dynamic Bayesian Network (DBN) inference.
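As an illustration of the patient-specific PCA step, the sketch below runs PCA (via SVD) on a synthetic serial-samples-by-mediators matrix for one hypothetical patient; none of the values are study data:

```python
import numpy as np

# Illustrative patient-specific PCA: rows are serial samples (time points),
# columns are the 26 inflammatory mediators. Values here are synthetic.
rng = np.random.default_rng(0)
samples = rng.normal(size=(7, 26))  # 7 samples within the 7-day window x 26 mediators

# Center each mediator, then take the SVD; right singular vectors give the
# principal axes, and squared singular values give the variance explained.
centered = samples - samples.mean(axis=0)
_, s, vt = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / np.sum(s**2)

# Mediators with the largest |loading| on PC1 dominate this patient's variance.
top_mediators = np.argsort(np.abs(vt[0]))[::-1][:5]
print(explained[:3], top_mediators)
```

In the study's setting, the loading patterns (rather than raw levels) are what would be compared across outcome groups.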

Findings

Raw inflammatory mediator levels assessed over time did not distinguish among PALF outcomes. However, DBN analysis did reveal distinct interferon-gamma-related networks that distinguished spontaneous survivors from those who died. The network identified in LTx patients pre-transplant was more like that seen in spontaneous survivors than in those who died, a finding supported by PCA.

Interpretation

The application of DBN analysis of inflammatory mediators in this small patient sample appears to differentiate survivors from non-survivors in PALF. Patterns associated with LTx pre-transplant were more like those seen in spontaneous survivors than in those who died. DBN-based analyses might lead to a better prediction of outcome in PALF, and could also have more general utility in other complex diseases with an inflammatory etiology.
24.

Background

Epidemiological evidence suggests that vitamin D deficiency is linked to various chronic diseases. However, direct measurement of serum 25-hydroxyvitamin D (25(OH)D) concentration, the accepted biomarker of vitamin D status, may not be feasible in large epidemiological studies. An alternative approach is to estimate vitamin D status using a predictive model based on parameters derived from questionnaire data. In previous studies, models developed using Multiple Linear Regression (MLR) have explained a limited proportion of the variance, and predicted values have correlated only modestly with measured values. Here, a new modelling approach, nonlinear radial basis function support vector regression (RBF SVR), was used to predict serum 25(OH)D concentration, and its predicted scores were compared with those from an MLR model.

Methods

Previously identified determinants of serum 25(OH)D in Caucasian adults (n = 494) were modelled using MLR and RBF SVR to develop a 25(OH)D prediction score, which was then validated in an independent dataset. The correlation between actual and predicted serum 25(OH)D concentrations was assessed with a Pearson correlation coefficient.

Results

Predicted scores correlated better with measured 25(OH)D concentrations under the RBF SVR model than under MLR (Pearson correlation coefficient: 0.74 for RBF SVR; 0.51 for MLR). The RBF SVR model also more accurately identified individuals with lower 25(OH)D levels (<75 nmol/L).
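The MLR-versus-RBF-SVR comparison can be sketched with scikit-learn on synthetic data; the single nonlinear predictor and all values below are invented stand-ins for the questionnaire-derived determinants:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.linear_model import LinearRegression

# Synthetic data (NOT the study's): one predictor with a nonlinear effect.
rng = np.random.default_rng(42)
X = rng.uniform(0, 3, size=(400, 1))
y = np.sin(3 * X[:, 0]) + rng.normal(scale=0.1, size=400)
X_train, X_test, y_train, y_test = X[:300], X[300:], y[:300], y[300:]

# Fit both models on the same determinants, as in the study design.
mlr = LinearRegression().fit(X_train, y_train)
svr = SVR(kernel="rbf", C=1.0, gamma="scale").fit(X_train, y_train)

# Compare Pearson correlations between predicted and held-out values.
r_mlr = np.corrcoef(y_test, mlr.predict(X_test))[0, 1]
r_svr = np.corrcoef(y_test, svr.predict(X_test))[0, 1]
print(f"Pearson r: MLR {r_mlr:.2f}, RBF SVR {r_svr:.2f}")
```

With a genuinely nonlinear determinant-outcome relationship, the RBF kernel captures curvature that a linear model cannot, which is the pattern the study reports.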

Conclusion

Using identical determinants, the RBF SVR model provided improved prediction of serum 25(OH)D concentrations and vitamin D deficiency compared with an MLR model in this dataset.
25.

Objective

To replicate the associations of leukocyte telomere length (LTL) with variants at four loci and to investigate their associations with coronary heart disease (CHD) and type II diabetes (T2D), in order to examine possible causal effects of telomere maintenance machinery on disease aetiology.

Methods

Four SNPs at three loci, BICD1 (rs2630578 G>C), 18q12.2 (rs2162440 G>T), and OBFC1 (rs10786775 C>G, rs11591710 A>C), were genotyped in four studies comprising 2,353 subjects, of whom 1,148 had CHD and 566 had T2D. Three SNPs (rs12696304 C>G, rs10936601 G>T, and rs16847897 G>C) at the TERC locus were genotyped in these four studies, in addition to an offspring study of 765 healthy students. For all samples, LTL had been measured using a real-time PCR-based method.

Results

Only one SNP showed a significant effect on LTL: the minor allele G of the OBFC1 rs10786775 SNP was associated with longer LTL (β=0.029, P=0.04). No SNP was significantly associated with CHD or T2D. For OBFC1, the haplotype carrying both rare alleles (rs10786775G and rs11591710C, haplotype frequency 0.089) was associated with lower CHD prevalence (OR: 0.77; 95% CI: 0.61–0.97; P=0.03). The TERC haplotype GTC (rs12696304G, rs10936601T, and rs16847897C, haplotype frequency 0.210) was associated with lower risk of both CHD (OR: 0.86; 95% CI: 0.75–0.99; P=0.04) and T2D (OR: 0.74; 95% CI: 0.61–0.91; P=0.004), with no effect on LTL. Only the last association remained significant after adjusting for multiple testing.
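For readers unfamiliar with the reported statistics, a haplotype odds ratio with a 95% Wald confidence interval is computed from a 2x2 carriers-by-outcome table as sketched below; the counts are invented purely to illustrate the formula (they are not the study's data) and happen to give an OR near 0.77:

```python
import math

# Hypothetical 2x2 table: haplotype carriers vs. non-carriers, cases vs. controls.
a, b = 120, 880   # carriers:     CHD cases, controls
c, d = 150, 850   # non-carriers: CHD cases, controls

# Odds ratio and Wald CI on the log-odds scale.
or_ = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(or_) - 1.96 * se_log_or)
hi = math.exp(math.log(or_) + 1.96 * se_log_or)
print(f"OR = {or_:.2f}, 95% CI: {lo:.2f}-{hi:.2f}")
```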

Conclusion

Of the reported associations, only that between the OBFC1 rs10786775 SNP and LTL was confirmed, although our study had limited power to detect modest effects. A 2-SNP OBFC1 haplotype was associated with lower risk of CHD, and a 3-SNP TERC haplotype was associated with lower risk of both CHD and T2D. Further work is required to confirm these results and explore the mechanisms of these effects.
26.
Noma (cancrum oris) is a gangrenous disease of unknown etiology affecting the maxillo-facial region of young children in extremely resource-limited countries. In an attempt to better understand the microbiological events occurring during this disease, we used phylogenetic and low-density microarrays targeting the 16S rRNA gene to characterize the gingival flora of acute noma and acute necrotizing gingivitis (ANG) lesions, and compared them to healthy control subjects of the same geographical and social background. Our observations raise doubts about Fusobacterium necrophorum, a previously suspected causative agent of noma, as this species was not associated with noma lesions. Various oral pathogens were more abundant in noma lesions, notably Atopobium spp., Prevotella intermedia, Peptostreptococcus spp., Streptococcus pyogenes and Streptococcus anginosus. On the other hand, pathogens associated with periodontal diseases such as Aggregatibacter actinomycetemcomitans, Capnocytophaga spp., Porphyromonas spp. and Fusobacteriales were more abundant in healthy controls. Importantly, the overall loss of bacterial diversity observed in noma samples, together with the similarity of their microbiota to that of ANG, supports the hypothesis that ANG might be the immediate step preceding noma.
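The diversity comparison can be illustrated with the Shannon index H' on two invented taxon-abundance profiles (the counts below are hypothetical, not the microarray data):

```python
import math

def shannon(counts):
    """Shannon diversity index H' = -sum(p_i * ln(p_i)) over observed taxa."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Hypothetical abundance profiles: many evenly spread taxa vs. a collapsed,
# dominated community, mimicking the healthy-vs-noma contrast.
healthy = [30, 25, 20, 15, 10, 8, 6, 4, 2, 1]
noma = [70, 20, 5, 3, 2]

print(f"healthy H' = {shannon(healthy):.2f}, noma H' = {shannon(noma):.2f}")
```

A lower H' in the noma-like profile reflects both fewer taxa and dominance by a few, which is the "loss of diversity" pattern the abstract describes.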
27.

Background

Increasing active travel (walking, bicycling, and public transport) is promoted as a key strategy to increase physical activity and reduce the growing burden of noncommunicable diseases (NCDs) globally. Little is known about patterns of active travel or associated cardiovascular health benefits in low- and middle-income countries. This study examines mode and duration of travel to work in rural and urban India and associations between active travel and overweight, hypertension, and diabetes.

Methods and Findings

This was a cross-sectional study of 3,902 participants (1,366 rural, 2,536 urban) in the Indian Migration Study. Associations between mode and duration of active travel and cardiovascular risk factors were assessed using random-effect logistic regression models adjusting for age, sex, caste, standard of living, occupation, factory location, leisure time physical activity, daily fat intake, smoking status, and alcohol use. Rural dwellers were significantly more likely to bicycle to work than urban dwellers (68.3% versus 15.9%; p<0.001). The prevalence of overweight or obesity was 50.0%, 37.6%, 24.2%, and 24.9%; of hypertension was 17.7%, 11.8%, 6.5%, and 9.8%; and of diabetes was 10.8%, 7.4%, 3.8%, and 7.3% in participants who travelled to work by private transport, public transport, bicycling, and walking, respectively. In the adjusted analysis, those walking (adjusted risk ratio [ARR] 0.72; 95% CI 0.58–0.88) or bicycling to work (ARR 0.66; 95% CI 0.55–0.77) were significantly less likely to be overweight or obese than those travelling by private transport. Those bicycling to work were also significantly less likely to have hypertension (ARR 0.51; 95% CI 0.36–0.71) or diabetes (ARR 0.65; 95% CI 0.44–0.95). There was evidence of a dose-response relationship: longer duration of bicycling to work was associated with lower risk of being overweight and of having hypertension or diabetes. The main limitation of the study is the cross-sectional design, which limits causal inference for the associations found.
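As a sketch of the risk-ratio arithmetic, the block below computes a crude (unadjusted) risk ratio for overweight/obesity, bicycling versus private transport, from the prevalences above. The group sizes are hypothetical, and the paper's ARRs additionally adjust for covariates via random-effect logistic regression, so they differ from this crude estimate:

```python
import math

# Prevalences from the abstract; group sizes are invented for illustration.
n_bike, p_bike = 800, 0.242   # bicycling to work
n_priv, p_priv = 800, 0.500   # private transport
a, c = round(n_bike * p_bike), round(n_priv * p_priv)  # events per group

# Crude risk ratio with a 95% CI on the log scale.
rr = (a / n_bike) / (c / n_priv)
se = math.sqrt(1/a - 1/n_bike + 1/c - 1/n_priv)
lo, hi = (math.exp(math.log(rr) + z * se) for z in (-1.96, 1.96))
print(f"crude RR = {rr:.2f}, 95% CI: {lo:.2f}-{hi:.2f}")
```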

Conclusions

Walking and bicycling to work were associated with reduced cardiovascular risk in the Indian population. Efforts to increase active travel in urban areas and halt declines in rural areas should be integral to strategies to maintain healthy weight and prevent NCDs in India. Please see later in the article for the Editors' Summary.
29.
Monitoring intraocular pressure (IOP) is essential in pediatric cataract treatment but is often difficult because young children cannot cooperate. We present the baseline characteristics and first-year results of a long-term prospective cohort study that aims to determine the relationship between the incidence of ocular hypertension (OH) in children during the first year after cataract surgery and the risk of developing late-onset glaucoma. Children were included if they were ≤10 years old, were scheduled to undergo cataract surgery with or without intraocular lens (IOL) implantation, and were compliant with our follow-up protocol, which included monitoring IOP with a Tono-Pen under sedation or anesthesia. Incidence of OH, peak OH value, OH onset time, and OH duration within the 12-month period following surgery were measured. In brief, 206 patients (379 eyes) were included, and OH developed in 66 of 379 (17.4%) eyes. The mean follow-up period was 14.0±3.2 months (median, 12 months; range, 10–16 months). Overall, 33 of 196 (16.8%) aphakic eyes and 33 of 183 (18.0%) IOL eyes were diagnosed with OH. The peak OH onset times were at the 1-week (34/66, 51.5%) and 1-month (14/66, 21.2%) postsurgical appointments. The peak IOP value in the OH eyes was 29.9±7.5 mmHg (median, 29 mmHg; range, 21–48 mmHg). The duration of OH was 30.9±31.2 days (median, 30 days; range, 3–150 days). OH recurred in 13 of the 54 eyes with OH diagnosed within 1 month postsurgery (13/54, 24.1%), requiring temporary or long-term antiglaucoma medication. In conclusion, the incidence of OH in children during the first year after cataract surgery was 17.4%. Children who have had elevated IOP in the first year after cataract surgery should be followed closely to determine whether they have an increased risk of developing late-onset glaucoma.
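The reported 17.4% incidence is a simple proportion; the sketch below adds a 95% Wilson score interval around it (the interval is our illustrative addition, not a figure reported in the study):

```python
import math

def wilson_ci(k, n, z=1.96):
    """95% Wilson score interval for a binomial proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

# Counts reported above: OH developed in 66 of 379 eyes.
k, n = 66, 379
lo, hi = wilson_ci(k, n)
print(f"incidence = {k/n:.1%}, 95% CI: {lo:.1%}-{hi:.1%}")
```

The Wilson interval is preferred over the simple Wald interval for proportions of this size because it stays within [0, 1] and has better coverage.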