Similar Articles (20 results)
1.

Rationale

The clinical impact of Xpert MTB/RIF for tuberculosis (TB) diagnosis in high HIV-prevalence settings is unknown.

Objective

To determine the diagnostic accuracy and impact of Xpert MTB/RIF among high-risk TB suspects.

Methods

We prospectively enrolled consecutive hospitalized Ugandan TB suspects in two phases: a baseline phase, in which Xpert MTB/RIF results were not reported to clinicians, and an implementation phase, in which results were reported. We determined the diagnostic accuracy of Xpert MTB/RIF in reference to culture (solid and liquid) and compared patient outcomes by study phase.

Results

477 patients were included (baseline phase 287, implementation phase 190). Xpert MTB/RIF had high sensitivity (187/237, 79%, 95% CI: 73–84%) and specificity (190/199, 96%, 95% CI: 92–98%) for culture-positive TB overall, but sensitivity was lower (34/81, 42%, 95% CI: 31–54%) among smear-negative TB cases. Xpert MTB/RIF reduced median days-to-TB detection for all TB cases (1 [IQR 0–26] vs. 0 [IQR 0–1], p<0.001), and for smear-negative TB (35 [IQR 22–55] vs. 22 [IQR 0–33], p = 0.001). However, median days-to-TB treatment was similar for all TB cases (1 [IQR 0–5] vs. 0 [IQR 0–2], p = 0.06) and for smear-negative TB (7 [IQR 3–53] vs. 6 [IQR 1–61], p = 0.78). Two-month mortality was also similar between study phases among 252 TB cases (17% vs. 14%, difference +3%, 95% CI: −21% to +27%, p = 0.80), and among 87 smear-negative TB cases (28% vs. 22%, difference +6%, 95% CI: −34 to +46%, p = 0.77).

Conclusions

Xpert MTB/RIF facilitated more accurate and earlier TB diagnosis, leading to a higher proportion of TB suspects with a confirmed TB diagnosis prior to hospital discharge in a high HIV/low MDR-TB prevalence setting. However, our study did not detect a decrease in two-month mortality following implementation of Xpert MTB/RIF, possibly because of insufficient statistical power and differences in empiric TB treatment rates and disease severity between study phases.

2.

Background

Lactic acidosis is a common cause of high anion gap metabolic acidosis. Sodium bicarbonate may be considered for an arterial pH <7.15 but paradoxically depresses cardiac performance and exacerbates acidosis by enhancing lactate production. This study aimed to evaluate the cause and mortality rate of lactic acidosis and to investigate the effect of factors, including sodium bicarbonate use, on death.
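For orientation, a minimal sketch of the anion gap arithmetic behind the "high anion gap" label; the formula AG = Na − (Cl + HCO3) is standard, but the 12 mEq/L cutoff and the example electrolyte values are illustrative assumptions, not data from this study.

```python
def anion_gap(na, cl, hco3):
    """Serum anion gap (mEq/L) without potassium: AG = Na - (Cl + HCO3)."""
    return na - (cl + hco3)

def is_high_ag_acidosis(na, cl, hco3, ph, ag_cutoff=12.0):
    """Flag high anion gap metabolic acidosis (cutoff is an assumed convention)."""
    return ph < 7.35 and anion_gap(na, cl, hco3) > ag_cutoff

# Illustrative values only: Na 138, Cl 100, HCO3 12 mEq/L, pH 7.10 -> AG 26, flagged.
print(anion_gap(138, 100, 12), is_high_ag_acidosis(138, 100, 12, 7.10))
```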

Methods

We conducted a single-center analysis from May 2011 through April 2012, retrospectively analyzing 103 patients with lactic acidosis among 207 patients with metabolic acidosis. We used the SOFA and APACHE II scores to estimate illness severity. Multivariate logistic regression and Cox regression models were used to identify factors affecting mortality.

Results

Of the 103 patients (mean age 66.1±11.4 years), 83 (80.6%) died; causes included sepsis (61.4%), hepatic failure, cardiogenic shock, and others. The rates of sodium bicarbonate administration (p = 0.006), catecholamine use, ventilator care, and male gender were higher in the non-survivor group than in the survivor group. Non-survivors had significantly higher initial and follow-up lactic acid levels, lower initial albumin, and higher SOFA and APACHE II scores than survivors. The mortality rate was significantly higher in patients who received sodium bicarbonate, and sodium bicarbonate administration (p = 0.016) was associated with higher mortality. Independent factors affecting mortality were SOFA score (Exp(B) = 1.72, 95% CI = 1.12–2.63, p = 0.013) and sodium bicarbonate administration (Exp(B) = 6.27, 95% CI = 1.10–35.78, p = 0.039).

Conclusions

Lactic acidosis, which has a high mortality rate, should be evaluated in patients with metabolic acidosis. In addition, sodium bicarbonate should be prescribed with caution in lactic acidosis, because its administration may adversely affect mortality.

3.

Introduction

The growing number of renal transplant recipients in a sustained immunosuppressive state is a factor that can contribute to increased incidence of sepsis. However, relatively little is known about sepsis in this population. The aim of this single-center study was to evaluate the factors associated with hospital mortality in renal transplant patients admitted to the intensive care unit (ICU) with severe sepsis and septic shock.

Methods

Patient demographics and transplant-related and ICU stay data were retrospectively collected. Multiple logistic regression was conducted to identify the independent risk factors associated with hospital mortality.

Results

A total of 190 patients were enrolled, 64.2% of whom received kidneys from deceased donors. The mean patient age was 51±13 years (115 males, 60.5%), and the median APACHE II score was 20 (IQR 16–23). The majority of patients developed sepsis late after renal transplantation (median 2.1 [IQR 0.6–2.3] years). The lung was the most common infection site (59.5%). Upon ICU admission, 16.4% of the patients met ≤1 systemic inflammatory response syndrome criterion. Among the patients, 61.5% presented with ≥2 organ failures at admission, and 27.9% experienced septic shock within the first 24 hours of ICU admission. The overall hospital mortality rate was 38.4%. In the multivariate analysis, the independent determinants of hospital mortality were male gender (OR = 5.9; 95% CI, 1.7–19.6; p = 0.004), delta SOFA at 24 h (OR = 1.7; 95% CI, 1.2–2.3; p = 0.001), mechanical ventilation (OR = 30; 95% CI, 8.8–102.2; p<0.0001), hematologic dysfunction (OR = 6.8; 95% CI, 2.0–22.6; p = 0.002), admission from the ward (OR = 3.4; 95% CI, 1.2–9.7; p = 0.02) and acute kidney injury stage 3 (OR = 5.7; 95% CI, 1.9–16.6; p = 0.002).

Conclusions

Hospital mortality in renal transplant patients with severe sepsis and septic shock was associated with male gender, admission from the wards, worse SOFA scores on the first day and the presence of hematologic dysfunction, mechanical ventilation or advanced graft dysfunction.

4.

Background and Objectives

Elevated blood lipids during childhood are predictive of dyslipidemia in adults. Although obese and inactive children have elevated values, any potentially protective role of elementary school physical education is unknown. Our objective was to determine the effect of a modern elementary school physical education (PE) program on the blood lipid concentrations in community-based children.

Methods

In this cluster-randomized controlled trial, 708 healthy children (8.1±0.3 years; 367 boys) in 29 schools were allocated either to a 4-year intervention program of specialist-taught PE (13 schools) or to a control group receiving the currently practiced PE taught by generalist classroom teachers. Fasting blood lipids were measured at ages 8, 10, and 12 years, and intervention and control class activities were recorded.

Results

Intervention classes included more fitness work and more moderate and vigorous physical activity than control classes (both p<0.001). With no group differences at baseline, the percentage of 12-year-old boys and girls with elevated low-density lipoprotein cholesterol (LDL-C >3.36 mmol/L, 130 mg/dL) was lower in the intervention than the control group (14% vs. 23%, p = 0.02). There was also an intervention effect on mean LDL-C across all boys (a 9.6% reduction for intervention vs. 2.8% for control, p = 0.02), but not girls (p = 0.2). The intervention effect on total cholesterol mirrored LDL-C, but there were no detectable 4-year intervention effects on high-density lipoprotein cholesterol or triglycerides.

Conclusions

The PE program delivered by specialist teachers over four years in elementary school reduced the incidence of elevated LDL-C in boys and girls, and provides a means by which early preventative practices can be offered to all children.

Trial Registration

Australia New Zealand Clinical Trial Registry ANZRN12612000027819 https://www.anzctr.org.au/Trial/Registration/TrialReview.aspx?id=347799.

5.

Aim

To analyze alcohol use, clinical data and laboratory parameters that may affect FIB-4, an index for measuring liver fibrosis, in HCV-monoinfected and HCV/HIV-coinfected drug users.
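For context, a minimal sketch of the FIB-4 calculation as commonly published (FIB-4 = age × AST / (platelets × √ALT)); the example inputs and the 1.45/3.25 cutoffs are illustrative conventions, not values from this cohort.

```python
import math

def fib4(age_years, ast_u_l, alt_u_l, platelets_10e9_l):
    """FIB-4 = (age x AST) / (platelets x sqrt(ALT)), as commonly published."""
    return (age_years * ast_u_l) / (platelets_10e9_l * math.sqrt(alt_u_l))

# Illustrative patient, not from the study: age 31, AST 58 U/L, ALT 45 U/L, platelets 210 x10^9/L.
score = fib4(31, 58, 45, 210)
print(round(score, 2))  # ~1.28; cutoffs of <1.45 / >3.25 are often cited for low / high fibrosis risk
```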

Patients and Methods

Patients admitted for substance abuse treatment between 1994 and 2006 were studied. Socio-demographic data, alcohol and drug use characteristics and clinical variables were obtained through hospital records. Blood samples for biochemistry, liver function tests, CD4 cell count, and serology of HIV and HCV infection were collected at admission. Multivariate linear regression was used to analyze the predictors of FIB-4 increase.

Results

A total of 472 patients (83% male, 17% female) were eligible. The median age at admission was 31 years (interquartile range (IQR) 27–35 years), and the median duration of drug use was 10 years (IQR 5.5–15 years). Unhealthy drinking (>50 grams/day) was reported by 32% of the patients. FIB-4 scores were significantly greater in the HCV/HIV-coinfected patients (1.14, IQR 0.76–1.87) than in the HCV-monoinfected patients (0.75, IQR 0.56–1.11) (p<0.001). In the multivariate analysis, unhealthy drinking (p = 0.034), lower total cholesterol (p = 0.042), lower serum albumin (p<0.001), higher GGT (p<0.001) and a longer duration of addiction (p = 0.005) were independently associated with higher FIB-4 scores in the HCV-monoinfected drug users. The effect of unhealthy drinking on FIB-4 scores disappeared in the HCV/HIV-coinfected patients, whereas lower serum albumin (p<0.001), a lower CD4 cell count (p = 0.006), higher total bilirubin (p<0.001) and a longer drug addiction duration (p<0.001) were significantly associated with higher FIB-4 values.

Conclusions

Unhealthy alcohol use in the HCV-monoinfected patients and HIV-related immunodeficiency in the HCV/HIV-coinfected patients are important risk factors associated with liver fibrosis in the respective populations.

6.

Background and Objectives

Sudden cardiac death is the most common cause of mortality in chronic kidney disease patients, and it occurs mostly due to ventricular arrhythmias. In this study, we aimed at investigating the prevalence of ventricular arrhythmia and the factors associated with its occurrence in nondialyzed chronic kidney disease patients.

Design, Setting, Participants and Measurements

This cross-sectional study evaluated 111 chronic kidney disease patients (estimated glomerular filtration rate 34.7±16.1 mL/min/1.73 m2; age 57±11.4 years; 60% male; 24% diabetic). Ventricular arrhythmia was assessed by 24-hour electrocardiogram. Left ventricular hypertrophy (echocardiogram), 24-hour ambulatory blood pressure monitoring, coronary artery calcification (multi-slice computed tomography) and laboratory parameters were also evaluated.

Results

Ventricular arrhythmia was found in 35% of the patients. Uncontrolled hypertension was observed in 21%, absence of nocturnal systolic dipping in 29%, left ventricular hypertrophy in 27%, systolic dysfunction in 10%, and coronary artery calcification in 49%. Patients with ventricular arrhythmia were older (p<0.001), predominantly men (p = 0.009), and had higher estimated glomerular filtration rates (p = 0.03) and hemoglobin (p = 0.005), and lower intact parathyroid hormone (p = 0.024) and triglycerides (p = 0.011), compared with patients without ventricular arrhythmia. In addition, a higher left ventricular mass index (p = 0.002) and coronary calcium score (p = 0.002), and a lower ejection fraction (p = 0.001), were observed among patients with ventricular arrhythmia. In multiple logistic regression analysis, older age, higher hemoglobin levels and reduced ejection fraction were independently related to the presence of ventricular arrhythmia.

Conclusions

Ventricular arrhythmia is prevalent in nondialyzed chronic kidney disease patients. Age, hemoglobin levels and ejection fraction were the factors associated with ventricular arrhythmia in these patients.

7.

Objectives

To study the MRI findings of the pancreatic duct in patients with acute pancreatitis.

Materials and Methods

A total of 239 patients with acute pancreatitis and 125 controls were analyzed in this study. The severity of acute pancreatitis was graded using the MR severity index (MRSI) and the Acute Physiology and Chronic Health Evaluation II (APACHE II) scoring systems. The number of main pancreatic duct (MPD) segments visualized, the MPD diameter, and pancreatic duct disruption were noted and compared with the severity of acute pancreatitis.
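As a sketch of how such grading might be implemented: the MRSI bands (0–3 mild, 4–6 moderate, 7–10 severe) match the three MRSI tiers reported below, while the APACHE II cutoff of ≥8 for severe disease is an assumed convention, since the abstract does not state its threshold.

```python
def grade_mrsi(mrsi):
    """Three-tier MRSI grading (assumed bands: 0-3 mild, 4-6 moderate, 7-10 severe)."""
    if mrsi <= 3:
        return "mild"
    return "moderate" if mrsi <= 6 else "severe"

def grade_apache2(score, cutoff=8):
    """Two-tier APACHE II grading (assumed cutoff: >=8 severe)."""
    return "severe" if score >= cutoff else "mild"

print(grade_mrsi(5), grade_apache2(11))  # moderate severe
```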

Results

The frequency of MPD segment visualization in the control group was higher than that in the acute pancreatitis group (p<0.05). The number of MPD segments visualized was negatively correlated with the MRSI score (p<0.05) and the APACHE II score (p<0.05). There was no difference in MPD diameter between the acute pancreatitis and control groups or among patients with different severities of acute pancreatitis (p>0.05). The prevalence of pancreatic duct disruption was 7.9% in the acute pancreatitis group: 4.8% and 15.3% in the mild and severe groups based on the APACHE II score, respectively, and 0%, 5.7% and 43.5% in the mild, moderate and severe groups according to the MRSI score, respectively. The prevalence of pancreatic duct disruption was correlated with the severity of acute pancreatitis based on both the APACHE II score (p<0.05) and the MRSI score (p<0.05).

Conclusion

The pancreatic duct in acute pancreatitis patients was of normal diameter. The number of MPD segments visualized and visible pancreatic duct disruption on MRI may be supplementary indicators for determining the severity of acute pancreatitis.

8.

Objective

The aim of this study was to evaluate the association between preoperative health-related quality of life (HRQoL) and mortality in a cohort of elderly patients (>65 years) with gastrointestinal, gynecological and genitourinary carcinomas.

Design

Prospective cohort pilot study.

Setting

Tertiary university hospital in Germany.

Patients

Between June 2008 and July 2010, after ethics committee approval and written informed consent, 126 patients scheduled for onco-surgery were included. Prior to surgery, as well as 3 and 12 months postoperatively, all participants completed the EORTC QLQ-C30 questionnaire (measuring self-reported health-related quality of life). Additionally, demographic and clinical data, including the Mini Mental State Examination (MMSE), were collected. Surgery and anesthesia were conducted according to standard operating procedures. The primary endpoint was cumulative mortality over the 12 months after surgery; changes in quality of life were the secondary outcome.

Results

Mortality after one year was 28%. In univariable and multivariable logistic regression analyses, baseline self-reported cognitive function (OR per point: 0.98; 95% CI 0.96–0.99; p = 0.024) and a higher symptom burden for appetite loss (OR per point: 1.02; 95% CI 1.00–1.03; p = 0.014) were predictive of long-term mortality. Additionally, the MMSE as an objective measure of cognitive impairment (OR per point: 0.69; 95% CI 0.51–0.96; p = 0.026) and severity of surgery (OR 0.31; 95% CI 0.11–0.93; p = 0.036) were predictive of long-term mortality. Global health status 12 months after surgery was comparable to baseline levels in survivors, despite moderate impairments in other domains.

Conclusion

This study showed that objective and self-reported cognitive functioning, together with appetite loss, were prognostic for mortality in elderly cancer patients. Impaired cognitive function and severity of surgery were predictive of one-year mortality, whereas in this selected surgical population age, gender, cancer site and metastases were not.

9.

Purpose

The occurrence of brushite stones has increased during recent years. However, the pathogenic factors driving the development of brushite stones remain unclear.

Methods

Twenty-eight brushite stone formers and 28 age-, sex- and BMI-matched healthy individuals were enrolled in this case-control study. Anthropometric, clinical, 24 h urinary parameters and dietary intake from 7-day weighed food records were assessed.

Results

Pure brushite stones were present in 46% of patients, while calcium oxalate was the major secondary stone component. Urinary pH and oxalate excretion were significantly higher, whereas urinary citrate was lower, in patients compared with healthy controls. Despite lower dietary intake, urinary calcium excretion was significantly higher in brushite stone patients. Binary logistic regression analysis revealed urinary pH >6.50 (OR 7.296; p = 0.035), calcium >6.40 mmol/24 h (OR 25.213; p = 0.001) and citrate excretion <2.600 mmol/24 h (OR 15.352; p = 0.005) as urinary risk factors for brushite stone formation. A total of 56% of patients exhibited distal renal tubular acidosis (dRTA). Urinary pH, calcium and citrate excretion did not significantly differ between patients with and without dRTA.

Conclusions

Hypercalciuria, diminished citrate excretion and an elevated urinary pH were the major urinary determinants of brushite stone formation. Interestingly, urinary phosphate was not associated with urolithiasis. Increased urinary oxalate excretion, possibly due to decreased calcium intake, promotes the risk of mixed stone formation with calcium oxalate. Neither dietary factors nor dRTA accounted for the hypercalciuria, higher urinary pH and diminished citrate excretion. Further research is needed to define the role of dRTA in brushite stone formation and to evaluate the hypothesis of an acquired acidification defect.

10.

Background

HIV infection is a major contributor to maternal mortality in resource-limited settings. Since 2002, the Drug Resource Enhancement Against AIDS and Malnutrition Programme has promoted HAART use during pregnancy and postpartum for prevention of mother-to-child HIV transmission (PMTCT), irrespective of maternal CD4 cell counts.

Methods

Records for all HIV-positive pregnancies followed in Mozambique and Malawi from June 2002 to June 2010 were reviewed. The cohort comprised pregnancies in which women were referred for PMTCT and started HAART during prenatal care (n = 8172, group 1) and pregnancies in which women were referred on established HAART (n = 1978, group 2).

Results

10,150 pregnancies were followed. Median baseline values were: age 26 years (IQR 23–30), CD4 count 392 cells/mm3 (IQR 258–563), viral load 3.9 log10 copies/mL (IQR 3.2–4.4), BMI 23.4 (IQR 21.5–25.7), and hemoglobin 10.0 (IQR 9.0–11.0). 101 maternal deaths (0.99%) occurred from pregnancy through 6 weeks postpartum: 87 (1.1%) in group 1 and 14 (0.7%) in group 2. Mortality was 1.3% in women with <350 CD4 cells/mm3 and 0.7% in women with >350 CD4 cells/mm3 [OR = 1.9 (95% CI 1.3–2.9), p = 0.001]. Mortality was higher in patients with shorter antenatal HAART: 22/991 (2.2%) with <30 days versus 79/9159 (0.9%) with ≥31 days [OR = 2.6 (95% CI 1.6–4.2), p<0.001]. By multivariate analysis, shorter antenatal HAART (p<0.001) and baseline CD4 cell count (p = 0.012), hemoglobin (p = 0.02) and BMI (p<0.001) were associated with mortality. Four years later, survival was 92% for women with shorter antenatal HAART and 98% for women on established therapy prior to pregnancy (p = 0.001).
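The HAART-duration comparison can be reproduced directly from the counts reported above. A minimal sketch of the 2×2 odds ratio with a Wald confidence interval (the Wald method is an assumption about how the interval was computed):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    exposed deaths a / survivors b, unexposed deaths c / survivors d."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo, hi = (math.exp(math.log(or_) + s * z * se) for s in (-1, 1))
    return or_, lo, hi

# Reported counts: 22/991 deaths with <30 days of antenatal HAART vs 79/9159 with >=31 days.
print(odds_ratio_ci(22, 991 - 22, 79, 9159 - 79))  # ~ (2.6, 1.6, 4.2), matching the abstract
```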

Conclusions

Antiretrovirals for PMTCT have a significant impact on maternal mortality, as do CD4 counts and nutritional status. In resource-limited settings, PMTCT programs should provide universal HAART to all HIV-positive pregnant women, given its impact on prevention of maternal death.

11.

Introduction

Statins have pleiotropic effects that could influence the prevention and outcome of some infectious diseases. There is no information about their specific effect on Staphylococcus aureus bacteremia (SAB).

Methods

We performed a prospective cohort study including all episodes of SAB diagnosed in patients aged ≥18 years admitted to a 950-bed tertiary hospital from March 2008 to January 2011. The main outcome variable was 14-day mortality; the secondary outcome variables were 30-day mortality, persistent bacteremia (PB) and the presence of severe sepsis or septic shock at diagnosis of SAB. The effect of statin therapy at the onset of SAB was studied by multivariate logistic regression and Cox regression analysis, including a propensity score for statin therapy.
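A minimal sketch of the propensity-score adjustment described here, on synthetic data; the covariate stand-ins and the use of scikit-learn are illustrative assumptions, not the authors' code.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 160  # same order as the cohort; all data below are synthetic
X = rng.normal(size=(n, 4))             # stand-ins for age, Charlson index, Pitt score, high-risk source
statin = rng.binomial(1, 0.2, size=n)   # treatment at SAB onset
death14 = rng.binomial(1, 0.2, size=n)  # 14-day mortality

# Step 1: propensity score = P(statin | covariates)
ps = LogisticRegression().fit(X, statin).predict_proba(X)[:, 1]

# Step 2: outcome model including treatment, covariates, and the propensity score
design = np.column_stack([statin, X, ps])
outcome_model = LogisticRegression().fit(design, death14)
adj_or = np.exp(outcome_model.coef_[0][0])
print(f"propensity-adjusted OR for statin therapy: {adj_or:.2f}")
```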

Results

We included 160 episodes. Thirty-three patients (21.3%) were receiving statins at the onset of SAB, and 14-day mortality was 21.3%. After adjustment for age, Charlson index, Pitt score, adequate management, and high-risk source, statin therapy had a protective effect on 14-day mortality (adjusted OR = 0.08; 95% CI: 0.01–0.66; p = 0.02) and on PB (OR = 0.89; 95% CI: 0.27–1.00; p = 0.05), although the effect was not significant for 30-day mortality (OR = 0.35; 95% CI: 0.10–1.23; p = 0.10) or presentation with severe sepsis or septic shock (adjusted OR = 0.89; 95% CI: 0.27–2.94; p = 0.8). Nor could an effect on 30-day mortality be demonstrated in the Cox analysis (adjusted HR = 0.5; 95% CI: 0.19–1.29; p = 0.15).

Conclusions

Statin treatment in patients with SAB was associated with lower early mortality and PB. Randomized studies are necessary to identify the role of statins in the treatment of patients with SAB.

12.

Objectives

We prospectively compared the preventive effects of rosuvastatin and atorvastatin on contrast-induced nephropathy (CIN) in patients with chronic kidney disease (CKD) undergoing percutaneous coronary intervention (PCI).

Methods

We enrolled 1078 consecutive patients with CKD undergoing elective PCI. Patients in Group 1 (n = 273) received rosuvastatin (10 mg), and those in group 2 (n = 805) received atorvastatin (20 mg). The primary end-point was the development of CIN, defined as an absolute increase in serum creatinine ≥0.5 mg/dL, or an increase ≥25% from baseline within 48–72 h after contrast medium exposure.
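The primary endpoint is directly computable; a minimal sketch of the stated CIN criterion, assuming the follow-up value is the 48–72 h post-contrast measurement:

```python
def is_cin(baseline_scr_mg_dl, followup_scr_mg_dl):
    """Apply the study's CIN definition: absolute rise >=0.5 mg/dL or >=25%
    from baseline in serum creatinine within 48-72 h of contrast exposure."""
    delta = followup_scr_mg_dl - baseline_scr_mg_dl
    return delta >= 0.5 or delta >= 0.25 * baseline_scr_mg_dl

# Illustrative: baseline 1.6 mg/dL rising to 2.05 mg/dL -> CIN (28% relative increase).
print(is_cin(1.6, 2.05))  # True
```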

Results

CIN was observed in 58 (5.4%) patients. The incidence of CIN was similar in patients pretreated with either rosuvastatin or atorvastatin (5.9% vs. 5.2%, p = 0.684). The same results were also observed when using other definitions of CIN. Clinical and procedural characteristics did not show significant differences between the two groups (p>0.05). Additionally, there were no significant inter-group differences with respect to in-hospital mortality rates (0.4% vs. 1.5%, p = 0.141), or other in-hospital complications. Multivariate logistic regression analysis revealed that rosuvastatin and atorvastatin demonstrated similar efficacies for preventing CIN, after adjusting for potential confounding risk factors (odds ratio = 1.17, 95% confidence interval, 0.62–2.20, p = 0.623). A Kaplan–Meier survival analysis showed that patients taking either rosuvastatin or atorvastatin had similar incidences of all-cause mortality (9.4% vs. 7.1%, respectively; p = 0.290) and major adverse cardiovascular events (29.32% vs. 23.14%, respectively; p = 0.135) during follow-up.

Conclusions

Rosuvastatin and atorvastatin have similar efficacies for preventing CIN in patients with CKD undergoing PCI.

13.

Background

Reduced estimated glomerular filtration rate (eGFR) calculated using cystatin C-derived equations might be a better predictor of cardiovascular disease (CVD) mortality than eGFR from creatinine-derived equations, but this association remains unclear in elderly individuals.

Aim

The aim of this study was to compare the predictive values of the Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI)-creatinine, CKD-EPI-cystatin C and CKD-EPI-creatinine-cystatin C eGFR equations for all-cause mortality and CVD events (hospitalizations with or without mortality).

Methods

Prospective cohort study of 1165 elderly women aged >70 years. Associations between eGFR and outcomes were examined using Cox regression analysis. The accuracy of the eGFR equations for predicting outcomes was examined using receiver operating characteristic (ROC) analysis and the net reclassification improvement (NRI).
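For reference, a sketch of the 2009 CKD-EPI creatinine equation as commonly published; the example patient is illustrative, and the exact implementation used by the authors is not stated in the abstract.

```python
def ckd_epi_creatinine(scr_mg_dl, age_years, female, black=False):
    """2009 CKD-EPI creatinine eGFR (mL/min/1.73 m2), as commonly published:
    141 x min(Scr/k,1)^a x max(Scr/k,1)^-1.209 x 0.993^age x 1.018[F] x 1.159[black]."""
    k = 0.7 if female else 0.9
    a = -0.329 if female else -0.411
    egfr = 141 * min(scr_mg_dl / k, 1) ** a * max(scr_mg_dl / k, 1) ** -1.209 * 0.993 ** age_years
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

# Illustrative: a 75-year-old woman with serum creatinine 1.1 mg/dL -> ~49 mL/min/1.73 m2.
print(round(ckd_epi_creatinine(1.1, 75, female=True), 1))
```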

Results

The risk of all-cause mortality for every incremental reduction in eGFR was similar whether eGFR was determined using the CKD-EPI-creatinine, CKD-EPI-cystatin C or CKD-EPI-creatinine-cystatin C equation. Areas under the ROC curves of the CKD-EPI-creatinine, CKD-EPI-cystatin C and CKD-EPI-creatinine-cystatin C equations for all-cause mortality were 0.604 (95% CI 0.561–0.647), 0.606 (95% CI 0.563–0.649; p = 0.963) and 0.606 (95% CI 0.563–0.649; p = 0.894), respectively. For all-cause mortality, there was no improvement in the reclassification of eGFR categories using the CKD-EPI-cystatin C (NRI −4.1%; p = 0.401) or CKD-EPI-creatinine-cystatin C (NRI −1.2%; p = 0.748) equation compared with the CKD-EPI-creatinine equation. Similar findings were observed for CVD events.

Conclusion

eGFR derived from the CKD-EPI-cystatin C and CKD-EPI-creatinine-cystatin C equations did not improve accuracy or predictive ability for clinical events compared with the CKD-EPI-creatinine equation in this cohort of elderly women.

14.

Purpose

Delays in antimicrobial therapy increase mortality in ventilator-associated pneumonia (VAP). The more objective ventilator-associated complications (VAC) are increasingly used for quality reporting. It is unknown whether delays in antimicrobial administration after patients meet VAC criteria lead to worse outcomes.

Materials and Methods

We studied a cohort of 81 episodes of antimicrobial treatment for VAP and compared mortality, superinfections and treatment failures conditional on the timing of identification of VAC.

Results

60% of patients with VAC had an identifiable episode at least 48 hours before the initiation of antimicrobials. Antimicrobial administration after the identification of VAC was not associated with intensive care unit (ICU) mortality (OR 0.71, 95% CI 0.11–4.48, p = 0.701) compared with immediate antimicrobial administration. Similarly, the risk of treatment failure or superinfection was not affected by the timing of antimicrobial administration in VAC (HR 0.95, 95% CI 0.42–2.19, p = 0.914).

Conclusions

We observed no signal of harm associated with the timing of antimicrobial initiation after the identification of a VAC. The identification of VAC should not lead clinicians to start antimicrobials before a diagnosis of VAP can be established.

15.

Background

It has been suggested that modestly elevated circulating D-dimer values may be associated with acute ischemic stroke (AIS). The purpose of this study was therefore to investigate the association between plasma D-dimer level at admission and AIS in a Chinese population.

Methods

In a prospective observational study, plasma D-dimer levels were measured on admission using a particle-enhanced immunoturbidimetric assay in 240 Chinese patients with AIS. The National Institutes of Health Stroke Scale (NIHSS) score was assessed on admission, blinded to D-dimer levels.

Results

Median plasma D-dimer levels were significantly higher (p<0.001) in AIS patients than in healthy controls (0.88 [interquartile range (IQR) 0.28–2.11] mg/L vs. 0.31 [IQR 0.17–0.74] mg/L). D-dimer levels increased with increasing severity of stroke as defined by the NIHSS score (r = 0.179, p = 0.005) and infarct volume (r = 0.425, p<0.001). These positive trends persisted even after correcting for possible confounding factors (p = 0.012 and p<0.001, respectively). Based on the receiver operating characteristic (ROC) curve, the optimal cut-off value of plasma D-dimer level as an indicator for diagnosis of cardioembolic stroke was 0.91 mg/L, which yielded a sensitivity of 83.7% and a specificity of 81.5%; the area under the curve was 0.862 (95% confidence interval [CI], 0.811–0.912).
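A minimal sketch of how an "optimal cut-off" of this kind is typically derived from an ROC curve via Youden's J statistic; the data here are synthetic, and the use of Youden's J is an assumption, since the abstract does not name its criterion.

```python
import numpy as np
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(1)
# Synthetic stand-ins: D-dimer tends to be higher in cardioembolic stroke (label 1).
y = np.concatenate([np.ones(60), np.zeros(180)])
ddimer = np.concatenate([rng.lognormal(0.4, 0.6, 60), rng.lognormal(-0.6, 0.6, 180)])

fpr, tpr, thresholds = roc_curve(y, ddimer)
j = tpr - fpr                      # Youden's J statistic at each candidate cutoff
best = np.argmax(j)
print(f"optimal cutoff {thresholds[best]:.2f} mg/L, "
      f"sensitivity {tpr[best]:.1%}, specificity {1 - fpr[best]:.1%}, AUC {auc(fpr, tpr):.3f}")
```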

Conclusion

Plasma D-dimer levels increased with increasing severity of stroke as defined by the NIHSS score and infarct volume, and these associations were independent of other possible variables. In addition, cardioembolic strokes could be distinguished from other stroke etiologies by measuring plasma D-dimer levels very early (0–48 hours from stroke symptom onset).

16.

Background

Sleep deprivation and obesity are associated with neurocognitive impairments. The combined effects of sleep deprivation and obesity on cognition are unknown, and the long-term cognitive effects of improvement of sleep have not been prospectively assessed in short-sleeping, obese individuals.

Objective

To characterize neurocognitive function and assess its reversibility.

Design

Prospective cohort study.

Setting

Tertiary Referral Research Clinical Center.

Patients

A cohort of 121 short-sleeping (<6.5 h/night) obese (BMI 30–55 kg/m2) men and pre-menopausal women.

Intervention

Sleep extension (468±88 days) with lifestyle modifications.

Measurements

Neurocognitive functions, sleep quality and sleep duration.

Results

At baseline, 44% of the individuals had an impaired global deficit score (t-score 0–39). An impaired global deficit score was associated with worse subjective sleep quality (p = 0.02) and lower urinary dopamine levels (p = 0.001). Memory was impaired in 33%, attention in 35%, motor skills in 42%, and executive function in 51% of individuals. At the final evaluation (N = 74), subjective sleep quality improved by 24% (p<0.001), self-reported sleep duration increased by 11% by questionnaires (p<0.001) and by 4% by diaries (p = 0.04), and daytime sleepiness tended to improve (p = 0.10). Global cognitive function and attention improved by 7% and 10%, respectively (both p = 0.001), and memory and executive functions tended to improve (p = 0.07 and p = 0.06). Serum cortisol increased by 17% (p = 0.02). In a multivariate mixed model, subjective sleep quality and sleep efficiency, urinary free cortisol and dopamine, and plasma total ghrelin accounted for one-fifth of the variability in global cognitive function.

Limitations

Drop-out rate.

Conclusions

Chronically sleep-deprived obese individuals exhibit substantial neurocognitive deficits that are partially reversible upon improvement of sleep in a non-pharmacological way. These findings have clinical implications for large segments of the US population.

Trial registration

www.ClinicalTrials.gov NCT00261898. NIDDK protocol 06-DK-0036.

17.

Background

Baseline hyponatremia predicts acute mortality following pulmonary embolism (PE). The natural history of serum sodium levels after PE and their relevance to acute and long-term mortality are unknown.

Methods

Clinical details of all patients (n = 1023) admitted to a tertiary institution from 2000–2007 with acute PE were retrieved retrospectively. Serum sodium results from days 1, 3–4, 5–6, and 7 of admission were pre-specified and recorded. We excluded 250 patients who lacked a day-1 sodium or any subsequent sodium assessment, leaving 773 patients as the studied cohort. There were 605 patients with normonatremia (sodium ≥135 mmol/L throughout admission), 57 with corrected hyponatremia (day-1 sodium <135 mmol/L, then normalized), 54 with acquired hyponatremia and 57 with persistent hyponatremia. Patients' outcomes were tracked from a state-wide death registry and analyses performed using multivariate regression modelling.
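A minimal sketch of the four sodium-trajectory groups; the rules for "acquired" and "persistent" hyponatremia are inferred from the group names and may differ in detail from the authors' definitions.

```python
def classify_sodium(sodiums, cutoff=135.0):
    """Assign the four trajectory groups from serial sodium values (day-1 first).
    'Acquired' and 'persistent' are inferred from the naming, not stated rules."""
    day1_low = sodiums[0] < cutoff
    later_low = any(s < cutoff for s in sodiums[1:])
    if day1_low:
        return "persistent hyponatremia" if later_low else "corrected hyponatremia"
    return "acquired hyponatremia" if later_low else "normonatremia"

print(classify_sodium([132, 137, 139]))  # corrected hyponatremia
print(classify_sodium([138, 133, 136]))  # acquired hyponatremia
```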

Results

Mean (±standard deviation) day-1 sodium was 138.2±4.3 mmol/L. Total mortality (mean follow-up 3.6±2.5 years) was 38.8% (in-hospital mortality 3.2%). There was no survival difference between studied (n = 773) and excluded (n = 250) patients. Day-1 sodium (adjusted hazard ratio [aHR] 0.89, 95% confidence interval [CI] 0.83–0.95, p = 0.001) predicted in-hospital death. Relative to normonatremia, corrected hyponatremia increased the risk of in-hospital death 3.6-fold (95% CI 1.20–10.9, p = 0.02) and persistent hyponatremia increased the risk 5.6-fold (95% CI 2.08–15.0, p = 0.001). Patients with either persistent or acquired hyponatremia had worse long-term survival than those whose hyponatremia corrected or who had been normonatremic throughout (aHR 1.47, 95% CI 1.06–2.03, p = 0.02).

Conclusion

Sodium fluctuations after acute PE predict acute and long-term outcome. Factors mediating the correction of hyponatremia following acute PE warrant further investigation.

18.

Introduction

Larger populations at risk, broader use of antibiotics and longer hospital stays have increased the incidence of Candida sp. bloodstream infections (CBSI).

Objective

To determine clinical and epidemiologic characteristics of patients with CBSI in two tertiary care reference medical institutions in Mexico City.

Design

Prospective, observational, laboratory-based surveillance study conducted from July 2008 to June 2010.

Methods

All patients with CBSI were included. Identification and antifungal susceptibility testing were performed using CLSI M27-A3 standard procedures. Frequencies, the Mann-Whitney U test or the t test were used as needed. Risk factors were determined with multivariable binary logistic regression analysis.

Results

CBSI represented 3.8% of nosocomial bloodstream infections. Cumulative incidence was 2.8 per 1000 discharges (incidence rate: 0.38 per 1000 patient-days). C. albicans was the predominant species (46%), followed by C. tropicalis (26%). C. glabrata was isolated from patients with diabetes (50%) and elderly patients. Sixty-four patients (86%) received antifungals, with amphotericin B deoxycholate (AmBD) the most commonly used agent (66%). The overall mortality rate reached 46%; risk factors for death were APACHE II score ≥16 (OR = 6.94, 95% CI = 2.34–20.58, p<0.0001) and liver disease (OR = 186.11, 95% CI = 7.61–4550.20, p = 0.001). Full susceptibility to fluconazole, AmBD and echinocandins was observed among C. albicans, C. tropicalis, and C. parapsilosis.

Conclusions

The cumulative incidence in these centers was higher than that reported by other tertiary care hospitals in Latin America. Knowledge of local epidemiologic patterns permits the design of more specific strategies for prevention and preemptive therapy of CBSI.

19.

Background

Motor proficiency is positively associated with physical activity levels. The aim of this study is to investigate associations between the timing of infant motor development and subsequent sports participation during adolescence.

Methods

Prospective observational study. The study population consisted of 9,009 individuals from the Northern Finland Birth Cohort 1966. Motor development was assessed by parental report at age 1 year, using age at walking with support and age at standing unaided. At follow-up at age 14 years, data were collected on the school grade awarded for physical education (PE). Self-report was used to collect information on the frequency of sports participation and the number of different sports reported.

Principal Findings

Earlier infant motor development was associated with a better school PE grade, both for age at walking supported (p<0.001) and for age at standing unaided (p<0.001). Earlier infant motor development, in terms of age at walking supported, was positively associated with the number of different sports reported (p = 0.003) and with a greater frequency of sports participation (p = 0.043). These associations were independent of gestational age and birth weight, as well as father's social class and body mass index at age 14 years.

Conclusions

Earlier infant motor development may predict higher levels of physical activity as indicated by higher school PE grade, participation in a greater number of different types of sports and increased frequency of sports participation. Identification of young children with slower motor development may allow early targeted interventions to improve motor skills and thereby increase physical activity in later life.

20.

Objective

To evaluate the rate and factors associated with attrition of patients receiving ART in tertiary and secondary hospitals in Nigeria.

Methods and Findings

We reviewed patient-level data collected between 2007 and 2010 from 11 hospitals across Nigeria. Kaplan-Meier product-limit and Cox regression analyses were used to determine the probability of retention in care and risk factors for attrition, respectively. Of 6,408 patients in the cohort, 3,839 (59.9%) were female, the median age of the study population was 33 years (IQR: 27–40), and 4,415 (69%) were from secondary health facilities. The NRTI backbone was stavudine (D4T) in 3,708 (57.9%) and zidovudine (ZDV) in 2,613 (40.8%) of patients. Patients lost to follow-up accounted for 62.7% of all attrition, followed by treatment stops (25.3%) and deaths (12.0%). Attrition was 14.1% (N = 624) and 15.1% (N = 300) in secondary and tertiary hospitals, respectively (p = 0.169), in the first 12 months of follow-up. During the 13- to 24-month follow-up period, attrition was 10.7% (N = 407) and 19.6% (N = 332) in secondary and tertiary facilities, respectively (p<0.001). Median time to loss to follow-up was 11.1 (IQR: 6.1 to 18.5) months in secondary sites compared with 13.6 (IQR: 9.9 to 17.0) months in tertiary sites (p = 0.002). At 24 months of follow-up, male gender [AHR 1.18, 95% CI: 1.01–1.37, p = 0.038], WHO clinical stage III [AHR 1.30, 95% CI: 1.03–1.66, p = 0.03], clinical stage IV [AHR 1.90, 95% CI: 1.20–3.02, p = 0.007] and care in a tertiary hospital [AHR 2.21, 95% CI: 1.83–2.67, p<0.001] were associated with attrition.
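A minimal sketch of the Kaplan-Meier and Cox workflow named above, using the lifelines library on synthetic stand-in data; the column names and values are illustrative assumptions, not the study dataset.

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Synthetic stand-in for the patient-level data: months on ART, attrition event, facility tier.
df = pd.DataFrame({
    "months":    [3, 11, 24, 13, 6, 24, 18, 9, 24, 14],
    "attrition": [1, 1, 0, 1, 1, 0, 1, 1, 0, 1],
    "tertiary":  [1, 0, 0, 1, 1, 0, 1, 0, 0, 1],
})

kmf = KaplanMeierFitter()
kmf.fit(df["months"], event_observed=df["attrition"])
print(kmf.survival_function_)  # Kaplan-Meier probability of retention in care over time

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="attrition")
print(cph.hazard_ratios_)      # hazard of attrition for tertiary vs. secondary care
```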

Conclusion

Attrition could potentially be reduced by decentralizing patients on ART to lower-level facilities after the first 12 months of therapy, initiating treatment earlier, and strengthening adherence counseling among males.
