Similar articles (20 results)
1.

Background

Acute coronary syndrome (ACS) is common in patients approaching the end of life (EoL), but these patients rarely receive palliative care. We compared the utility of a palliative care prognostic tool, the Gold Standards Framework (GSF), and the Global Registry of Acute Coronary Events (GRACE) score in identifying patients approaching the EoL.

Methods and Findings

172 unselected consecutive patients with confirmed ACS admitted over an eight-week period were assessed using both prognostic tools and followed up for 12 months. GSF criteria identified 40 (23%) patients suitable for EoL care, while GRACE identified 32 (19%) patients with ≥10% risk of death within 6 months. Patients meeting GSF criteria were older (p = 0.006), had more comorbidities (1.6±0.7 vs. 1.2±0.9, p = 0.007), had more frequent hospitalisations before (p = 0.001) and after (p = 0.0001) their index admission, and were more likely to die during follow-up (GSF+ 20% vs GSF- 7%, p = 0.03). The GRACE score was predictive of 12-month mortality (C-statistic 0.75), and this was improved by the addition of previous hospital admissions and previous history of stroke (C-statistic 0.88).

Conclusions

This study has highlighted a potentially large number of ACS patients eligible for EoL care. GSF or GRACE could be used in the hospital setting to help identify these patients. GSF identifies ACS patients with more comorbidity and at increased risk of hospital readmission.
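The abstract above summarizes discrimination with C-statistics (0.75, improved to 0.88). As a reminder of what that metric measures, here is a minimal sketch with made-up risk scores and outcomes (not data from the study): the C-statistic is the probability that a randomly chosen patient who died was assigned a higher risk score than a randomly chosen survivor, ties counting one half.

```python
def c_statistic(scores, died):
    """C-statistic (AUC) by exhaustive comparison of case/non-case pairs."""
    pos = [s for s, d in zip(scores, died) if d]       # scores of patients who died
    neg = [s for s, d in zip(scores, died) if not d]   # scores of survivors
    pairs = concordant = 0.0
    for p in pos:
        for n in neg:
            pairs += 1
            if p > n:
                concordant += 1
            elif p == n:
                concordant += 0.5                      # ties count half
    return concordant / pairs

# Hypothetical risk scores and 12-month deaths, for illustration only.
scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]
died   = [1,   1,   0,   1,   0,   0]
print(c_statistic(scores, died))  # → 0.8888888888888888 (8 of 9 pairs concordant)
```

A value of 0.5 means the score is no better than chance; 1.0 means perfect ranking of deaths above survivors.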

2.
Li Y, Liu Y, Fu L, Mei C, Dai B. PLoS One. 2012;7(4):e34450.

Background

A few studies of statin therapy as a specific prophylactic measure against contrast-induced nephropathy (CIN) have been published, with conflicting results. In this meta-analysis of randomized controlled trials, we aimed to assess the effectiveness of short-term high-dose statin treatment for the prevention of CIN and clinical outcomes, and to re-evaluate the potential benefits of statin therapy.

Methods

We searched the PubMed, OVID, EMBASE, Web of Science and Cochrane Central Register of Controlled Trials databases for randomized controlled trials comparing short-term high-dose statin treatment versus low-dose statin treatment or placebo for preventing CIN. Our outcome measures were the risk of CIN within 2–5 days after contrast administration and the need for dialysis.

Results

Seven randomized controlled trials with a total of 1,399 patients were identified and analyzed. The overall results, based on a fixed-effect model, showed that short-term high-dose statin treatment was associated with a significant reduction in the risk of CIN (RR = 0.51, 95% CI 0.34–0.76, p = 0.001; I2 = 0%). The incidence of acute renal failure requiring dialysis was not significantly different after the use of statins (RR = 0.33, 95% CI 0.05–2.10, p = 0.24; I2 = 0%). The use of statins was not associated with a significant decrease in the plasma C-reactive protein level (SMD −0.64, 95% CI −1.57 to 0.29, p = 0.18, I2 = 97%).

Conclusions

Although this meta-analysis supports the use of statins to reduce the incidence of CIN, the result must be considered in the context of variable patient demographics. Only a limited recommendation can be made in favour of statin use based on current data. Given the limitations of the included studies, a large, well-designed trial that incorporates the evaluation of clinically relevant outcomes in participants with different underlying risks of CIN is required to more adequately assess the role of statins in CIN prevention.
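The fixed-effect pooling behind the RR above can be illustrated with a short sketch: each trial's log relative risk is weighted by the inverse of its variance. The trial counts below are invented for illustration (they are not the seven included trials).

```python
import math

# Hypothetical per-trial counts: (events_treated, n_treated, events_control, n_control).
trials = [
    (3, 120, 10, 118),
    (5, 200, 12, 195),
    (2, 90, 6, 92),
]

# Fixed-effect (inverse-variance) pooling on the log-RR scale.
num = den = 0.0
for a, n1, c, n2 in trials:
    log_rr = math.log((a / n1) / (c / n2))
    var = 1/a - 1/n1 + 1/c - 1/n2       # approximate variance of log RR
    w = 1 / var                          # inverse-variance weight
    num += w * log_rr
    den += w

pooled_rr = math.exp(num / den)
se = math.sqrt(1 / den)
ci = (math.exp(num / den - 1.96 * se), math.exp(num / den + 1.96 * se))
print(pooled_rr, ci)  # pooled RR ≈ 0.35 for these made-up trials
```

Heterogeneity statistics such as I2 (0% in the abstract) are computed separately from the same weights and log RRs.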

4.
Chavada R, Kok J, van Hal S, Chen SC. PLoS One. 2011;6(12):e28247.

Background

Fungal peritonitis is a serious complication of peritoneal dialysis (PD) therapy, with the majority of patients ceasing PD permanently. The aims of this study were to identify risk factors and clinical associations that may discriminate fungal from bacterial peritonitis.

Methods

We retrospectively identified episodes of fungal peritonitis from 2001–2010 in PD patients at Liverpool and Westmead Hospitals (Australia). Fungal peritonitis cases were matched in a 1:2 ratio with bacterial peritonitis patients from each institution's dialysis registry, occurring closest in time to the fungal episode. Patient demographic, clinical and outcome data were obtained from the medical records.

Results

Thirty-nine episodes of fungal peritonitis (rate of 0.02 episodes per patient-year of dialysis) were matched with 78 episodes of bacterial peritonitis. Candida species were the commonest pathogens (35/39; 90% of episodes), with Candida albicans (37%), Candida parapsilosis (32%) and Candida glabrata (13%) the most frequently isolated species. Compared to bacterial peritonitis, fungal peritonitis patients had received PD for significantly longer (1133 vs. 775 catheter-days; p = 0.016), were more likely to have had previous episodes of bacterial peritonitis (51% vs. 10%; p = 0.01), and were more likely to have received prior antibacterial therapy (51% vs. 10%; p = 0.01). Patients with fungal peritonitis were less likely to have fever and abdominal pain on presentation, but had higher rates of PD catheter removal (79% vs. 22%; p<0.005) and permanent transfer to haemodialysis (87% vs. 24%; p<0.005). Hospital length of stay was significantly longer in patients with fungal peritonitis (26.1 days vs. 12.6 days; p = 0.017), but the all-cause 30-day mortality rate was similar in both groups. Fluconazole was a suitable empiric antifungal agent, with no Candida resistance detected.

Conclusion

Prompt recognition of clinical risk factors, initiation of antifungal therapy and removal of PD catheters are key considerations in optimising outcomes.

5.

Objectives

Clinical characteristics and trends in the outcome of acute coronary syndrome (ACS) in patients with prior coronary artery bypass graft surgery (CABG) are unclear. The aim of this study was to evaluate clinical characteristics, in-hospital treatment, and outcomes of patients presenting with ACS, with or without a history of prior CABG, over two decades.

Methods

Data were derived from a hospital-based registry of patients hospitalized with ACS in Doha, Qatar, from 1991 through 2010, and were analyzed according to history of prior CABG. Baseline clinical characteristics, in-hospital treatment, and outcomes were compared.

Results

A total of 16,750 consecutive patients with ACS were studied, of whom 693 (4.1%) had prior CABG. Patients with prior CABG were older (mean 60.5±11 vs. 53±12 years; p = 0.001), more likely to be female, and had more cardiovascular risk factors than the non-CABG group. Prior-CABG patients had larger infarct sizes, were less likely to receive reperfusion or early invasive therapy, and were more likely to receive evidence-based therapies when compared to non-CABG patients. In-hospital mortality and stroke rates were comparable between the two groups. Over two decades, in-hospital mortality and stroke rates fell in both groups (prior CABG: death 13.2% to 4%, stroke 1.9% to 0.0%; non-CABG: death 10% to 3.2%, stroke 1.0% to 0.1%; all p = 0.001).

Conclusion

There was a significant reduction in in-hospital morbidity and mortality among ACS patients with prior CABG over the 20-year period.

6.

Objective

It may be possible to thrombolyse ischaemic stroke (IS) patients up to 6 h by using penumbral imaging. We investigated whether a perfusion CT (CTP) mismatch can help to select patients for thrombolysis up to 6 h.

Methods

A cohort of 254 thrombolysed IS patients was studied. 174 (69%) were thrombolysed at 0–3 h using non-contrast CT (NCCT), and 80 (31%) at 3–6 h (35 at 3–4.5 h and 45 at 4.5–6 h) using CTP mismatch criteria. Symptomatic intracerebral haemorrhage (SICH), mortality and the modified Rankin Scale (mRS) score were assessed at 3 months. Independent determinants of outcome in patients thrombolysed between 3 and 6 h were identified.

Results

The baseline characteristics were comparable in the two groups. There were no differences in SICH (3% v 4%, p = 0.71), any ICH (7% v 9%, p = 0.61), mortality (16% v 9%, p = 0.15) or mRS 0–2 at 3 months (55% v 54%, p = 0.96) between patients thrombolysed at 0–3 h (NCCT only) and at 3–6 h (CTP mismatch). There were no significant differences in outcome between patients thrombolysed at 3–4.5 h and at 4.5–6 h. The NIHSS score was the only independent determinant of an mRS of 0–2 at 3 months (OR 0.89, 95% CI 0.82–0.97, p = 0.007) in patients treated using CTP mismatch criteria beyond 3 h.

Conclusions

The use of a CTP mismatch model may help to guide thrombolysis decisions up to 6 h after IS onset.

7.
Cao L, Silvestry S, Zhao N, Diehl J, Sun J. PLoS One. 2012;7(2):e30094.

Background and Objective

Postoperative cardiocerebral and renal complications are a major threat for patients undergoing cardiac surgery. This study aimed to examine the effect of preoperative aspirin use in patients undergoing cardiac surgery.

Methods

An observational cohort study was performed on consecutive patients (n = 1879) receiving cardiac surgery at this institution. Patients were excluded if they were taking preoperative anticoagulants, had unknown aspirin use, or underwent emergent cardiac surgery. Outcome events were 30-day mortality, renal failure, readmission, and a composite outcome of major adverse cardiocerebral events (MACE): permanent or transient stroke, coma, perioperative myocardial infarction (MI), heart block and cardiac arrest.

Results

Of all patients, 1145 met the inclusion criteria and were divided into two groups: those taking (n = 858) or not taking (n = 287) aspirin within the 5 days preceding surgery. Patients taking aspirin were significantly more likely to have a history of hypertension, diabetes, peripheral arterial disease, previous MI or angina, and to be older. After propensity-score adjustment and multivariate logistic regression, however, preoperative aspirin therapy (vs. no aspirin) significantly reduced the risk of MACE (8.4% vs. 12.5%, odds ratio [OR] 0.585, 95% CI 0.355–0.964, P = 0.035), postoperative renal failure (2.6% vs. 5.2%, OR 0.438, CI 0.203–0.945, P = 0.035) and need for dialysis (0.8% vs. 3.1%, OR 0.230, CI 0.071–0.742, P = 0.014), but did not significantly reduce 30-day mortality (4.1% vs. 5.8%, OR 0.744, CI 0.376–1.472, P = 0.396), nor did it increase readmissions.

Conclusions

Preoperative aspirin therapy is associated with a significant decrease in the risk of MACE and renal failure, without increasing readmissions, in patients undergoing non-emergent cardiac surgery.

8.

Rationale

Contamination by bacterial or fungal organisms reduces the effectiveness of mycobacterial culture for diagnosis of pulmonary tuberculosis (TB). We evaluated the effect of an anti-microbial and an anti-fungal oral rinse prior to expectoration on culture-contamination rates.

Methods

We enrolled a consecutive random sample of adults with cough for ≥2 weeks and suspected TB admitted to Mulago Hospital (Kampala, Uganda) between October 2008 and June 2009. We randomly assigned patients to oral rinse (60 seconds with chlorhexidine followed by 60 seconds with nystatin) vs. no oral rinse prior to initial sputum collection. Uganda National Tuberculosis Reference Laboratory technicians blinded to the method of sputum collection (with or without oral rinse) processed all sputum specimens for smear microscopy (direct Ziehl-Neelsen) and mycobacterial culture (Lowenstein-Jensen media).

Results

Of 220 patients enrolled, 177 (80%) were HIV-seropositive (median CD4 count 37 cells/µL, IQR 13–171 cells/µL). Baseline characteristics were similar between patients in the oral-rinse (N = 110) and no-oral-rinse (N = 110) groups. The proportion of contaminated cultures was significantly lower in the oral-rinse group than in the no-oral-rinse group (4% vs. 15%, risk difference −11%, 95% CI −18 to −3%, p = 0.005). Oral rinse significantly reduced the proportion of contaminated cultures among HIV-infected patients (3% vs. 18%, risk difference −14%, 95% CI −23 to −6%, p = 0.002) but not among HIV-uninfected patients (6% vs. 4%, risk difference 2%, 95% CI −12 to +15%, p = 0.81). However, the proportions of smear-positive specimens (25% vs. 35%, p = 0.10) and culture-positive specimens (48% vs. 56%, p = 0.24) were lower in the oral-rinse group than in the no-oral-rinse group, although the differences were not statistically significant.

Conclusions

Oral rinse prior to sputum expectoration is a promising strategy for reducing mycobacterial culture contamination in areas with high HIV prevalence, provided strategies can be devised to reduce the adverse impact of oral rinse on smear and culture positivity.
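The contamination results above are reported as risk differences with 95% CIs. A Wald-interval sketch reproduces that arithmetic; the counts 4/110 vs. 16/110 are an approximate reconstruction of the reported 4% vs. 15%, so the numbers are illustrative, not the study's raw data.

```python
import math

def risk_difference(x1, n1, x2, n2):
    """Risk difference p1 - p2 with a Wald (normal-approximation) 95% CI."""
    p1, p2 = x1 / n1, x2 / n2
    rd = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return rd, (rd - 1.96 * se, rd + 1.96 * se)

# Approximate counts: 4/110 contaminated with rinse vs. 16/110 without.
rd, (lo, hi) = risk_difference(4, 110, 16, 110)
print(f"RD = {rd:.1%}, 95% CI ({lo:.1%}, {hi:.1%})")
```

With these counts the sketch gives roughly −11% (−18% to −3%), matching the magnitude reported in the abstract.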

10.

Background

Modification of ritonavir-boosted lopinavir (LPV/r)-based antiretroviral therapy is required for HIV-infected children co-treated for tuberculosis (TB). We aimed to determine virologic and toxicity outcomes among TB/HIV co-treated children with the following modifications to their antiretroviral therapy (ART): (1) super-boosted LPV/r, (2) double-dose LPV/r or (3) ritonavir.

Methods and Findings

A medical record review was conducted at two clinical sites in Johannesburg, South Africa. The records of children 6–24 months of age initiating LPV/r-based therapy were reviewed. Children co-treated for TB were categorized based on the modifications made to their ART regimen and were compared to children of the same age at each site not treated for TB. Included are 526 children, 294 (56%) co-treated for TB. All co-treated children had more severe HIV disease, including lower CD4 percents and worse growth indicators, than comparisons. Children in the super-boosted group (n = 156) were as likely to be virally suppressed (<400 copies/ml) at 6 months as comparisons (69.2% vs. 74.8%, p = 0.36). Children in the double-dose (n = 47) and ritonavir groups (n = 91) were significantly less likely to be virally suppressed at 6 months (53.1% and 49.3%) than comparisons (74.8% and 82.1%; p = 0.02 and p<0.0001, respectively). At 12 months only children in the ritonavir group still had lower rates of virological suppression relative to comparisons (63.9% vs 83.3%, p<0.05). Grade 1 or greater ALT elevations were more common in the super-boosted (75%) than double-dose (54.6%) or ritonavir (33.9%) groups (p = 0.09 and p<0.0001), but grade 3/4 elevations were observed in 3 (13.6%) of the super-boosted, 7 (15.9%) of the double-dose and 5 (8.9%) of the ritonavir group (p = 0.81 and p = 0.29).

Conclusion

Good short-term virologic outcomes were achieved in children co-treated for TB and HIV who received super-boosted LPV/r. Treatment-limiting toxicity was rare. Strategies for increased dosing of LPV/r with TB treatment warrant further investigation.

12.

Background

Current recommendations to prevent malaria in African pregnant women rely on insecticide treated nets (ITNs) and intermittent preventive treatment (IPTp). However, there is no information on the safety and efficacy of their combined use.

Methods

1030 pregnant Mozambican women of all gravidities received a long-lasting ITN during antenatal clinic (ANC) visits and, irrespective of HIV status, were enrolled in a randomised, double blind, placebo-controlled trial, to assess the safety and efficacy of 2-dose sulphadoxine-pyrimethamine (SP). The main outcome was the reduction in low birth weight.

Findings

Two-dose SP was safe and well tolerated, but was not associated with reductions in anaemia prevalence at delivery (RR, 0.92 [95% CI, 0.79–1.08]), low birth weight (RR, 0.99 [95% CI, 0.70–1.39]), or overall placental infection (p = 0.964). However, the SP group showed a 40% reduction (95% CI, 7.40–61.20; p = 0.020) in the incidence of clinical malaria during pregnancy, and reductions in the prevalence of peripheral parasitaemia (7.10% vs 15.15%, p<0.001) and of actively infected placentas (7.04% vs 13.60%, p = 0.002). There was a reduction in severe anaemia at delivery of borderline statistical significance (p = 0.055). These effects were not modified by gravidity or HIV status. Reported ITN use was more than 90% in both groups.

Conclusions

Two-dose SP was associated with reductions in some indicators, but these did not translate into significant improvements in other maternal or birth outcomes. The use of ITNs during pregnancy may reduce the need to administer IPTp. ITNs should be part of the ANC package in sub-Saharan Africa.

Trial Registration

ClinicalTrials.gov NCT00209781

13.

Objective

1) To evaluate whether peripheral blood mononuclear cells (PBMCs) from type 2 diabetic patients present an impairment of phagocytic activity; 2) To determine whether the eventual impairment in phagocytic activity is related to glycemic control and can be reversed by improving blood glucose levels.

Methods

21 type 2 diabetic patients and 21 healthy volunteers were prospectively recruited for a case-control study. In addition, patients whose HbA1c was higher than 8% (n = 12) were hospitalized to complete a 5-day intensified blood glucose treatment. Phagocytic activity was assessed using a modified flow cytometry procedure developed in our laboratory, based on DNA/RNA viable staining to discriminate erythrocytes and debris. This method is simple, highly sensitive and reproducible, and it takes advantage of classic methods that are widely used in flow cytometry.

Results

Type 2 diabetic patients showed a lower percentage of activated macrophages than non-diabetic subjects (54.00±18.93 vs 68.53±12.77%; p = 0.006). Significant negative correlations between phagocytic activity and fasting glucose (r = −0.619, p = 0.004) and HbA1c (r = −0.506, p = 0.019) were detected. In addition, multiple linear regression analyses showed that fasting plasma glucose and HbA1c were each independently associated with phagocytic activity. Furthermore, in the subset of patients who underwent metabolic optimization, a significant increase in phagocytic activity was observed (p = 0.029).

Conclusions

Glycemic control is related to phagocytic activity in type 2 diabetes. Our results suggest that improvement in phagocytic activity can be added to the beneficial effects of metabolic optimization.

14.

Objectives

Generic triage risk assessments are widely used in the emergency department (ED), but have not been validated for prediction of short-term risk among patients with acute heart failure (HF). Our objective was to evaluate the Canadian Triage Acuity Scale (CTAS) for prediction of early death among HF patients.

Methods

We included patients presenting with HF to an ED in Ontario from Apr 2003 to Mar 2007. We used the National Ambulatory Care Reporting System and vital statistics databases to examine care and outcomes.

Results

Among 68,380 patients (76±12 years, 49.4% men), early mortality was stratified with death rates of 9.9%, 1.9%, 0.9%, and 0.5% at 1-day, and 17.2%, 5.9%, 3.8%, and 2.5% at 7-days, for CTAS 1, 2, 3, and 4–5, respectively. Compared to lower acuity (CTAS 4–5) patients, adjusted odds ratios (aOR) for 1-day death were 1.32 (95%CI; 0.93–1.88; p = 0.12) for CTAS 3, 2.41 (95%CI; 1.71–3.40; p<0.001) for CTAS 2, and highest for CTAS 1: 9.06 (95%CI; 6.28–13.06; p<0.001). Predictors of triage-critical (CTAS 1) status included oxygen saturation <90% (aOR 5.92, 95%CI; 3.09–11.81; p<0.001), respiratory rate >24 breaths/minute (aOR 1.96, 95%CI; 1.05–3.67; p = 0.034), and arrival by paramedic (aOR 3.52, 95%CI; 1.70–8.02; p = 0.001). While age/sex-adjusted CTAS score provided good discrimination for ED (c-statistic = 0.817) and 1-day (c-statistic = 0.724) death, mortality prediction was improved further after accounting for cardiac and non-cardiac co-morbidities (c-statistics 0.882 and 0.810, respectively; both p<0.001).

Conclusions

A semi-quantitative triage acuity scale assigned at ED presentation and based largely on respiratory factors predicted emergent death among HF patients.

15.

Background

The change in malaria case-management policy in Kenya, recommending universal parasitological diagnosis and targeted treatment with artemether-lumefantrine (AL), is supported by implementation activities aiming at universal coverage of, and adherence to, the recommendations by 2013. We evaluated changes in health-systems and case-management indicators between a baseline survey undertaken before implementation of the policy and a follow-up survey following the first year of the implementation activities.

Methods/Findings

National, cross-sectional surveys using quality-of-care methods were undertaken at public facilities. The baseline and follow-up surveys respectively included 174 and 176 facilities, 224 and 237 health workers, and 2,405 and 1,456 febrile patients. Health-systems indicators showed variable changes between surveys: AL stock-out (27% to 21%; p = 0.152); availability of diagnostics (55% to 58%; p = 0.600); training on the new policy (0 to 22%; p = 0.001); exposure to supervision (18% to 13%; p = 0.156); and access to guidelines (0 to 6%; p = 0.001). At all facilities, there was an increase in the proportion of patients tested for malaria (24% to 31%; p = 0.090) and in those both tested and treated according to the test result (16% to 22%; p = 0.048). At facilities with AL and malaria diagnostics, testing increased from 43% to 50% (p = 0.196), while the proportion of patients both tested and treated according to the test result increased from 28% to 36% (p = 0.114). Treatment adherence improved for test-positive patients from 83% to 90% (p = 0.150) and for test-negative patients from 47% to 56% (p = 0.227). No association was found between testing and exposure to training, supervision or guidelines; however, testing was significantly associated with facility ownership, type of testing, and patients' caseload, age and clinical presentation.

Conclusions

Most of the case-management indicators showed improving trends; however, the differences were smaller than expected, rarely statistically significant, and still leave a substantial gap relative to the targets. The quantitative and qualitative improvement of the interventions will ultimately determine the success of the new policy.

16.

Background

The management of patients with heart failure (HF) needs to account for changeable and complex individual clinical characteristics. Guidelines recommend titrating renin-angiotensin system inhibitors (RAAS-I) to target doses, but physicians seemingly do not sufficiently follow this recommendation, and little is known about the physician and patient predictors of adherence.

Methods

To examine whether primary care (PC) physicians' knowledge and self-perceived competencies regarding RAAS-I cohere with their prescribing behavior, in relation to patient-associated barriers. Cross-sectional follow-up study after a randomized medical-education intervention trial with a seven-month observation period. PC physicians (n = 37) and patients with systolic HF (n = 168) from practices in Baden-Wuerttemberg. Measurements were knowledge (blueprint-based multiple-choice test), self-perceived competencies (questionnaire on global confidence in the therapy and on frequency of use of RAAS-I), and patient variables (age, gender, NYHA functional status, blood pressure, potassium level, renal function). Prescribing data were collected from the trial's documentation. The target variable, prescription of ≥50% of the recommended RAAS-I dose, was investigated using two-level logistic regression models.

Results

Patients (69% male, mean age 68.8 years) showed symptomatic and objectified left ventricular (NYHA II vs. III/IV: 51% vs. 49%; mean LVEF 33.3%) and renal (GFR<50%: 22%) impairment. The mean percentage of the RAAS-I target dose was 47%, with 59% of patients receiving ≥50%. Determinants of improved prescribing of RAAS-I were patient age (OR 0.95, CI 0.92–0.99, p = 0.01), physician's global self-confidence at follow-up (OR 1.09, CI 1.02–1.05, p = 0.01) and NYHA class (II vs. III/IV) (OR 0.63, CI 0.38–1.05, p = 0.08).

Conclusions

A change in physicians' confidence as a predictor of RAAS-I dose increase is a new finding that might reflect an intervention effect of improved physician intention, and that might foster novel strategies to improve safe evidence-based prescribing. These should include targeting knowledge, attitudes and skills.

17.

Objective

HIV and type 2 diabetes are known risk factors for albuminuria, but no previous reports have characterized albuminuria in HIV-infected patients with diabetes.

Research Design and Methods

We performed a cross-sectional study including 73 HIV-infected adults with type 2 diabetes, 82 HIV-infected non-diabetics, and 61 diabetic control subjects without HIV. Serum creatinine >1.5 mg/dL was exclusionary. Albuminuria was defined as urinary albumin/creatinine ratio >30 mg/g.

Results

The prevalence of albuminuria was significantly increased among HIV-infected diabetics (34%, vs. 13% in HIV-infected non-diabetics and 16% in diabetic controls; p = 0.005). HIV status and diabetes remained significant predictors of albuminuria after adjusting for age, race, BMI, and blood pressure. Albumin/creatinine ratio correlated significantly with HIV viral load (r = 0.28, p = 0.0005), and HIV-infected subjects with albuminuria had significantly greater cumulative exposure to abacavir (p = 0.01). In an adjusted multivariate regression analysis of HIV-infected subjects, the diagnosis of diabetes (p = 0.003), higher HIV viral load (p = 0.03) and cumulative exposure to abacavir (p = 0.0009) were significant independent predictors of albuminuria.

Conclusions

HIV and diabetes appear to have additive effects on albuminuria, which is also independently associated with increased exposure to abacavir and HIV viral load. Future research on the persistence, progression and management of albuminuria in this unique at-risk population is needed.

18.

Background

Studies indicate that acquired deficits negatively affect patients' self-reported health-related quality of life (HRQOL) and survival, but the impact of HRQOL deterioration after surgery on survival has not been explored.

Objective

To assess whether change in HRQOL after surgery is a predictor of survival in patients with glioblastoma.

Methods

Sixty-one patients with glioblastoma were included. The majority (n = 56, 91.8%) were operated on using a neuronavigation system that utilizes preoperative 3D MRI and updated intraoperative 3D ultrasound volumes to guide resection. HRQOL was assessed using EuroQol 5D (EQ-5D), a generic instrument. HRQOL data were collected 1–3 days preoperatively and after 6 weeks. The mean change in EQ-5D index 6 weeks after surgery was −0.05 (95% CI −0.15–0.05; p = 0.285). Thirty patients (49.2%) reported deterioration 6 weeks after surgery. In a Cox multivariate survival analysis we evaluated deterioration in HRQOL after surgery together with established risk factors (age, preoperative condition, radiotherapy, temozolomide and extent of resection).

Results

There were significant independent associations between survival and use of temozolomide (HR 0.30, p = 0.019), radiotherapy (HR 0.26, p = 0.030), and deterioration in HRQOL after surgery (HR 2.02, p = 0.045). Inclusion of surgically acquired deficits in the model did not alter the conclusion.

Conclusion

Early deterioration in HRQOL after surgery is independently and markedly associated with impaired survival in patients with glioblastoma. Deterioration in patient-reported HRQOL after surgery is a meaningful outcome in surgical neuro-oncology, as the measure reflects both the burden of symptoms and treatment hazards and is linked to overall survival.

19.
Li HM, Peng RR, Li J, Yin YP, Wang B, Cohen MS, Chen XS. PLoS One. 2011;6(8):e23431.

Background

Men who have sex with men (MSM) have become one of the priority populations for prevention and control of the HIV pandemic in China. Information on HIV incidence among MSM is important for describing the spread of infection and predicting its trends in this population. We reviewed the published literature on the incidence of HIV infection among MSM in China.

Methods

We identified relevant studies using a comprehensive strategy including searches of Medline and two Chinese electronic publication databases from January 2005 to September 2010. Random-effects point estimates of HIV incidence, with corresponding 95% confidence intervals (CI), were calculated using the Comprehensive Meta-Analysis software. Subgroup analyses were performed separately, stratified by study design and geographic location.

Results

Twelve studies were identified, including three cohort studies and nine cross-sectional studies. The subgroup analyses revealed that the sub-overall incidence estimates were 3.5% (95% CI, 1.7%–5.3%) and 6.7% (95% CI, 4.8%–8.6%) for cohort and cross-sectional studies, respectively (difference between sub-overalls, Q = 5.54, p = 0.02); and 8.3% (95% CI, 6.9%–9.7%) and 4.6% (95% CI, 2.4%–6.9%) for studies in Chongqing and other areas, respectively (difference between sub-overalls, Q = 7.58, p<0.01). Syphilis infection (RR = 3.33, p<0.001), multiple sex partnerships (RR = 2.81, p<0.001), and unprotected receptive anal intercourse in the past six months (RR = 3.88, p = 0.007) were significant risk factors for HIV seroconversion.

Conclusions

Findings from this meta-analysis indicate that HIV incidence is substantial among MSM in China. The high incidence of HIV infection and unique patterns of sexual risk behaviour in this population are a call to action that should be answered with innovative social and public-health intervention strategies and the development of biological prevention strategies.

20.

Background

Bacteremia caused by Pseudomonas aeruginosa is a severe infection. It is not clear whether beta-lactam monotherapy leads to rates of treatment success similar to those of combinations of beta-lactams with aminoglycosides or quinolones.

Methods

This was a retrospective cohort study from 3 tertiary hospitals (2 in Greece and 1 in Italy). Pseudomonas aeruginosa isolates were susceptible to a beta-lactam and an aminoglycoside or a quinolone, and patients received appropriate therapy for at least 48 hours. The primary outcome of interest was treatment success with definitive beta-lactam combination therapy compared to monotherapy. Secondary outcomes were treatment success with the same empirical and definitive regimen, mortality, and toxicity.

Results

Of 92 bacteremias, there were 54 evaluable episodes for the primary outcome (20 received monotherapy). Treatment success was higher with combination therapy (85%) than with beta-lactam monotherapy (65%), although not statistically significantly so [odds ratio (OR) 3.1; 95% confidence interval (CI) 0.69–14.7, p = 0.1]. Very long (>2 months) hospitalisation before bacteremia was the only factor independently associated with treatment success (OR 0.73; 95% CI 0.01–0.95, p = 0.046), although this result was based on few episodes. All-cause mortality did not differ significantly between combination therapy [6/31 (19%)] and monotherapy [8/19 (42%)] (p = 0.11). Only the Charlson comorbidity index was associated with excess mortality (p = 0.03).

Conclusion

Our study, in accordance with previous ones, indicates that the choice between monotherapy and combination therapy may not affect treatment success significantly. However, it lacks the statistical power to identify small or moderate differences. A large randomized controlled trial evaluating this issue is justified.
