11.

Background

The Centers for Disease Control and Prevention recommends nontargeted opt-out HIV screening in healthcare settings. Cost effectiveness is critical when considering potential screening methods. Our goal was to compare programmatic costs of nontargeted opt-out rapid HIV screening with physician-directed diagnostic rapid HIV testing in an urban emergency department (ED) as part of the Denver ED HIV Opt-Out Trial.

Methods

This was a prospective cohort study nested in a larger quasi-experiment. Over 16 months, nontargeted rapid HIV screening (intervention) and diagnostic rapid HIV testing (control) were alternated in 4-month time blocks. During the intervention phase, patients were offered HIV testing using an opt-out approach during registration; during the control phase, physicians used a diagnostic approach to offer HIV testing to patients. Each method was fully integrated into ED operations. Direct program costs were determined from the perspective of the ED. Time-motion methodology was used to estimate personnel activity costs. Costs per patient newly diagnosed with HIV infection and incremental cost-effectiveness ratios were calculated for each phase.

Results

During the intervention phase, 28,043 eligible patients were included, 6,933 (25%) completed testing, and 15 (0.2%, 95% CI: 0.1%–0.4%) were newly diagnosed with HIV infection. During the control phase, 29,925 eligible patients were included, 243 (0.8%) completed testing, and 4 (1.7%, 95% CI: 0.4%–4.2%) were newly diagnosed with HIV infection. Total annualized costs for nontargeted screening were $148,997, whereas total annualized costs for diagnostic HIV testing were $31,355. The average costs per HIV diagnosis were $9,932 and $7,839, respectively. Nontargeted HIV screening identified 11 more HIV infections at an incremental cost of $10,693 per additional infection.
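The incremental cost-effectiveness arithmetic behind these figures can be sketched as follows. This is illustrative only, reconstructed from the numbers reported above; small differences from the published $10,693 reflect rounding in the annualized costs and counts.

```python
# Reported figures from the Denver ED HIV Opt-Out Trial cost comparison.
cost_screening = 148_997   # nontargeted opt-out screening, annualized ($)
cost_diagnostic = 31_355   # physician-directed diagnostic testing, annualized ($)
dx_screening = 15          # new HIV diagnoses, intervention phase
dx_diagnostic = 4          # new HIV diagnoses, control phase

# Average cost per new diagnosis under each strategy.
cost_per_dx_screening = cost_screening / dx_screening      # ≈ $9,933
cost_per_dx_diagnostic = cost_diagnostic / dx_diagnostic   # $7,838.75

# Incremental cost-effectiveness ratio (ICER): extra cost per extra diagnosis.
icer = (cost_screening - cost_diagnostic) / (dx_screening - dx_diagnostic)
```

The ICER evaluates to roughly $10,695 per additional infection identified, consistent with the reported $10,693 up to rounding.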

Conclusions

Compared to diagnostic testing, nontargeted HIV screening was more costly but identified more HIV infections. More effective and less costly testing strategies may be required to improve the identification of patients with undiagnosed HIV infection in the ED.
12.

Background

Tools to predict death or spontaneous survival are necessary to inform liver transplantation (LTx) decisions in pediatric acute liver failure (PALF), but such tools are not available. Recent data suggest that immune/inflammatory dysregulation occurs in the setting of acute liver failure. We hypothesized that specific, dynamic, and measurable patterns of immune/inflammatory dysregulation will correlate with outcomes in PALF.

Methods

We assayed 26 inflammatory mediators on stored serum samples obtained from a convenience sample of 49 children in the PALF study group (PALFSG) collected within 7 days after enrollment. Outcomes were assessed within 21 days of enrollment consisting of spontaneous survivors, non-survivors, and LTx recipients. Data were subjected to statistical analysis, patient-specific Principal Component Analysis (PCA), and Dynamic Bayesian Network (DBN) inference.

Findings

Raw inflammatory mediator levels assessed over time did not distinguish among PALF outcomes. However, DBN analysis did reveal distinct interferon-gamma-related networks that distinguished spontaneous survivors from those who died. The network identified in LTx patients pre-transplant was more like that seen in spontaneous survivors than in those who died, a finding supported by PCA.

Interpretation

The application of DBN analysis of inflammatory mediators in this small patient sample appears to differentiate survivors from non-survivors in PALF. Patterns associated with LTx pre-transplant were more like those seen in spontaneous survivors than in those who died. DBN-based analyses might lead to a better prediction of outcome in PALF, and could also have more general utility in other complex diseases with an inflammatory etiology.
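The principal component analysis step used above can be sketched in a few lines. This is a generic PCA-via-SVD illustration on a toy matrix of the same shape (49 patients × 26 mediators); the actual PALFSG data and the patient-specific/DBN machinery are not reproduced here.

```python
import numpy as np

# Toy stand-in data: rows = patients, columns = 26 inflammatory mediators.
rng = np.random.default_rng(1)
X = rng.normal(size=(49, 26))

# Center each mediator, then obtain principal components via SVD.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

explained = s**2 / np.sum(s**2)   # fraction of variance per component
scores = Xc @ Vt.T                # patient coordinates in PC space
```

Outcome groups (spontaneous survivors, non-survivors, LTx recipients) could then be compared by plotting or clustering the leading PC scores.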
13.
Noma (cancrum oris) is a gangrenous disease of unknown etiology affecting the maxillo-facial region of young children in countries with extremely limited resources. In an attempt to better understand the microbiological events occurring during this disease, we used phylogenetic and low-density microarrays targeting the 16S rRNA gene to characterize the gingival flora of acute noma and acute necrotizing gingivitis (ANG) lesions, and compared them to healthy control subjects of the same geographical and social background. Our observations raise doubts about Fusobacterium necrophorum, a previously suspected causative agent of noma, as this species was not associated with noma lesions. Various oral pathogens were more abundant in noma lesions, notably Atopobium spp., Prevotella intermedia, Peptostreptococcus spp., Streptococcus pyogenes and Streptococcus anginosus. On the other hand, pathogens associated with periodontal diseases such as Aggregatibacter actinomycetemcomitans, Capnocytophaga spp., Porphyromonas spp. and Fusobacteriales were more abundant in healthy controls. Importantly, the overall loss of bacterial diversity observed in noma samples, as well as its similarity to the ANG microbiota, supports the hypothesis that ANG might be the immediate step preceding noma.
14.

Background

Increasing active travel (walking, bicycling, and public transport) is promoted as a key strategy to increase physical activity and reduce the growing burden of noncommunicable diseases (NCDs) globally. Little is known about patterns of active travel or associated cardiovascular health benefits in low- and middle-income countries. This study examines mode and duration of travel to work in rural and urban India and associations between active travel and overweight, hypertension, and diabetes.

Methods and Findings

Cross-sectional study of 3,902 participants (1,366 rural, 2,536 urban) in the Indian Migration Study. Associations between mode and duration of active travel and cardiovascular risk factors were assessed using random-effect logistic regression models adjusting for age, sex, caste, standard of living, occupation, factory location, leisure time physical activity, daily fat intake, smoking status, and alcohol use. Rural dwellers were significantly more likely to bicycle (68.3% versus 15.9%; p<0.001) to work than urban dwellers. The prevalence of overweight or obesity was 50.0%, 37.6%, 24.2%, 24.9%; hypertension was 17.7%, 11.8%, 6.5%, 9.8%; and diabetes was 10.8%, 7.4%, 3.8%, 7.3% in participants who travelled to work by private transport, public transport, bicycling, and walking, respectively. In the adjusted analysis, those walking (adjusted risk ratio [ARR] 0.72; 95% CI 0.58–0.88) or bicycling to work (ARR 0.66; 95% CI 0.55–0.77) were significantly less likely to be overweight or obese than those travelling by private transport. Those bicycling to work were significantly less likely to have hypertension (ARR 0.51; 95% CI 0.36–0.71) or diabetes (ARR 0.65; 95% CI 0.44–0.95). There was evidence of a dose-response relationship between duration of bicycling to work and being overweight, having hypertension or diabetes. The main limitation of the study is the cross-sectional design, which limits causal inference for the associations found.
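The gap between the crude prevalences and the adjusted risk ratios above can be illustrated with a quick calculation. This uses only the unadjusted overweight/obesity prevalences reported in the text, with private transport as the reference mode; it is not the study's model, whose ARRs (walk 0.72, bicycle 0.66) come from random-effect logistic regression adjusting for many covariates.

```python
# Reported crude overweight/obesity prevalence by commute mode.
prevalence = {"private": 0.500, "public": 0.376, "bicycle": 0.242, "walk": 0.249}

# Crude prevalence ratios relative to private transport.
crude_pr = {mode: p / prevalence["private"] for mode, p in prevalence.items()}
```

The crude ratios (bicycle ≈ 0.48, walk ≈ 0.50) are stronger than the adjusted ARRs, showing how much of the raw difference is absorbed by confounders such as occupation and standard of living.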

Conclusions

Walking and bicycling to work were associated with reduced cardiovascular risk in the Indian population. Efforts to increase active travel in urban areas and halt declines in rural areas should be integral to strategies to maintain healthy weight and prevent NCDs in India.
16.
Monitoring intraocular pressure (IOP) is essential in pediatric cataract treatment but is often difficult because young children cannot cooperate. We present the baseline characteristics and first-year results of a long-term prospective cohort study, which aims to determine the relationship between the incidence of ocular hypertension (OH) in children during the first year after cataract surgery and the risk of developing late-onset glaucoma. Children were included with the following criteria: they were ≤10 years old and scheduled to undergo cataract surgery with or without intraocular lens (IOL) implantation, and they were compliant with our follow-up protocol, which included monitoring IOP with a Tono-Pen under sedation or anesthesia. Incidence of OH, peak OH value, OH onset time, and OH duration within the 12-month period following surgery were measured. In brief, 206 patients (379 eyes) were included, and OH developed in 66 of 379 (17.4%) eyes. The mean follow-up period was 14.0±3.2 months (median, 12 months; range, 10–16 months). Moreover, 33 of 196 (16.8%) aphakic eyes and 33 of 183 (18.0%) IOL eyes were diagnosed with OH. The peak OH onset times were at the 1-week (34/66, 51.5%) and 1-month (14/66, 21.2%) appointments after surgery. The peak IOP value in the OH eyes was 29.9±7.5 mmHg (median, 29 mmHg; range, 21–48 mmHg). The duration of OH was 30.9±31.2 days (median, 30 days; range, 3–150 days). OH recurred in 13 of the 54 eyes with OH diagnosed within 1 month after surgery (13/54, 24.1%), requiring temporary or long-term antiglaucoma medication. In conclusion, the incidence of OH in children during the first year after cataract surgery was 17.4%. Children who have had elevated IOP in the first year after cataract surgery should be followed closely to determine whether they are at increased risk of developing late-onset glaucoma.
18.
The aim of this study was to determine the existence of a circadian rhythm (CR) in the onset of acute myocardial infarction (AMI) in different patient subgroups. Information was collected on 41,244 infarctions from the database of the ARIAM (Analysis of Delay in AMI) Spanish multicenter study. CR in AMI was explored in subgroups of cases categorized by age, gender, previous ischemic heart disease (PIHD), outcome in the coronary care unit, infarction electrocardiograph (ECG) characteristics (Q wave or non-Q wave), and location of AMI. Cases were classified according to these variables in the different subgroups. To verify the presence of CR, a simple test of equality of time series based on multiple-sinusoid (24, 12, and 8 h periods) cosinor analysis was developed. For the group as a whole, the time of pain onset as an indicator of AMI occurrence showed a CR (p<0.0001), with a morning peak at 10:10 h. All the analyzed subgroups also showed CR. Comparison between subgroups showed significant differences in the PIHD (p<0.01) and infarction ECG characteristics (p<0.01) groups. The CR of the subgroup with Q-wave infarction differed from that of the non-Q-wave subgroup (p<0.01) when the patients had PIHD (23% in Q-wave infarction vs. 39.2% in non-Q wave). AMI onset followed a CR pattern, which was also observed in all analyzed subgroups. Differences in the CR according to Q/non-Q-wave infarction characteristics could be determined by PIHD. The cosinor model fit with three components (24, 12, and 8 h periods) showed higher sensitivity than the single 24 h period analysis.
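A multiple-component cosinor fit of the kind described above can be sketched as an ordinary least-squares problem: the rhythm is modeled as a mesor plus cosine/sine pairs at the 24, 12, and 8 h periods, and the acrophase (time of peak) is recovered from the fitted coefficients. The data below are synthetic, built to peak at 10:10 h; the ARIAM data are not reproduced.

```python
import numpy as np

# Synthetic onset-rate series sampled every 15 min, peaking at 10:10 h.
rng = np.random.default_rng(0)
t = np.arange(0, 24, 0.25)                 # time of day, hours
peak = 10 + 10 / 60                        # 10:10 h
y = 100 + 30 * np.cos(2 * np.pi * (t - peak) / 24) + rng.normal(0, 1, t.size)

# Design matrix: intercept (mesor) + cos/sin pair per period (24, 12, 8 h).
periods = [24, 12, 8]
X = np.column_stack(
    [np.ones_like(t)]
    + [f(2 * np.pi * t / p) for p in periods for f in (np.cos, np.sin)]
)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Acrophase of the 24 h component from its cos/sin coefficients.
a24, b24 = beta[1], beta[2]
acrophase = (np.arctan2(b24, a24) * 24 / (2 * np.pi)) % 24   # ≈ 10.17 h
```

A significance test of the rhythm would then compare this model against the intercept-only fit (e.g. via an F test), which is the essence of the zero-amplitude cosinor test.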
19.

Background

Nosocomial bloodstream infections (nBSIs) are an important cause of morbidity and mortality and are the most frequent type of nosocomial infection in pediatric patients.

Methods

We identified the predominant pathogens and antimicrobial susceptibilities of nosocomial bloodstream isolates in pediatric patients (≤16 years of age) in the Brazilian Prospective Surveillance for nBSIs at 16 hospitals from 12 June 2007 to 31 March 2010 (Br SCOPE project).

Results

In our study a total of 2,563 cases of nBSI were reported by hospitals participating in the Br SCOPE project. Among these, 342 clinically significant episodes of BSI were identified in pediatric patients (≤16 years of age). Ninety-six percent of BSIs were monomicrobial. Gram-negative organisms caused 49.0% of these BSIs, Gram-positive organisms caused 42.6%, and fungi caused 8.4%. The most common pathogens were coagulase-negative staphylococci (CoNS) (21.3%), Klebsiella spp. (15.7%), Staphylococcus aureus (10.6%), and Acinetobacter spp. (9.2%). The crude mortality was 21.6% (74 of 342). Forty-five percent of nBSIs occurred in a pediatric or neonatal intensive-care unit (ICU). The most frequent underlying condition was malignancy, in 95 patients (27.8%). Among the potential factors predisposing patients to BSI, central venous catheters were the most frequent (66.4%). Methicillin resistance was detected in 37 S. aureus isolates (27.1%). Of the Klebsiella spp. isolates, 43.2% were resistant to ceftriaxone. Of the Acinetobacter spp. and Pseudomonas aeruginosa isolates, 42.9% and 21.4%, respectively, were resistant to imipenem.

Conclusions

In our multicenter study, we found high mortality and a large proportion of Gram-negative bacilli with elevated levels of resistance in pediatric patients.
20.

Objective

Stevens-Johnson syndrome (SJS) is one of the most severe mucocutaneous diseases, and its occurrence is often attributed to drug use. The aim of the present study is to quantify the risk of SJS in association with drug and vaccine use in children.

Methods

A multicenter surveillance of children hospitalized through the emergency departments for acute conditions of interest is currently ongoing in Italy. Cases with a diagnosis of SJS were retrieved from all admissions. Parents were interviewed on child’s use of drugs and vaccines preceding the onset of symptoms that led to the hospitalization. We compared the use of drugs and vaccines in cases with the corresponding use in a control group of children hospitalized for acute neurological conditions.

Results

Twenty-nine children with a diagnosis of SJS and 1,362 with neurological disorders were hospitalized between 1st November 1999 and 31st October 2012. Cases were more frequently exposed to drugs (79% vs 58% in the control group; adjusted OR 2.4; 95% CI 1.0–6.1). Anticonvulsants presented the highest adjusted OR: 26.8 (95% CI 8.4–86.0). Significantly elevated risks were also estimated for antibiotic use (adjusted OR 3.3; 95% CI 1.5–7.2), corticosteroids (adjusted OR 4.2; 95% CI 1.8–9.9), and paracetamol (adjusted OR 3.2; 95% CI 1.5–6.9). No increased risk was estimated for vaccines (adjusted OR: 0.9; 95% CI 0.3–2.8).
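A crude (unadjusted) odds ratio for overall drug exposure can be reconstructed from the percentages above as a check on the direction of the association. The cell counts here are hypothetical round-offs (79% of 29 cases, 58% of 1,362 controls); the study's OR of 2.4 is adjusted for covariates, so the crude value differs.

```python
import math

# Hypothetical 2x2 table reconstructed from the reported percentages.
a, b = 23, 6        # cases: drug-exposed, unexposed (≈79% of 29)
c, d = 790, 572     # controls: drug-exposed, unexposed (≈58% of 1,362)

# Crude odds ratio and Woolf 95% confidence interval on the log scale.
crude_or = (a / b) / (c / d)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo = math.exp(math.log(crude_or) - 1.96 * se_log_or)
hi = math.exp(math.log(crude_or) + 1.96 * se_log_or)
```

The crude OR comes out near 2.8 with a wide interval (roughly 1.1 to 6.9), dominated by the small number of unexposed cases, which mirrors the wide adjusted CI reported above.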

Discussion

Our study provides additional evidence on the etiologic role of drugs in the occurrence of SJS in children; no increased risk was found for vaccines.