Similar Documents

20 similar documents retrieved.
1.
Background

Invasive pneumococcal disease (IPD) causes considerable morbidity and mortality. We aimed to identify host factors and biomarkers associated with poor outcomes in adult patients with IPD in Japan, which has a rapidly aging population.

Methods

In a large-scale surveillance study of 506 Japanese adults with IPD, we investigated the role of host factors, disease severity, biomarkers based on clinical laboratory data, treatment regimens, and bacterial factors on 28-day mortality.

Results

Overall mortality was 24.1%, and the mortality rate increased from 10.0% in patients aged <50 years to 33.1% in patients aged ≥80 years. Mortality also rose with disease severity, from 12.5% among patients with bacteraemia without sepsis to 35.0% in patients with severe sepsis and 56.9% in those with septic shock. The death rate within 48 hours after admission was high at 54.9%. Risk factors for mortality identified by multivariate analysis were as follows: white blood cell (WBC) count <4000 cells/μL (odds ratio [OR], 6.9; 95% confidence interval [CI], 3.7–12.8, p < .001); age ≥80 years (OR, 6.5; 95% CI, 2.0–21.6, p = .002); serum creatinine ≥2.0 mg/dL (OR, 4.5; 95% CI, 2.5–8.1, p < .001); underlying liver disease (OR, 3.5; 95% CI, 1.6–7.8, p = .002); mechanical ventilation (OR, 3.0; 95% CI, 1.7–5.6, p < .001); and lactate dehydrogenase ≥300 IU/L (OR, 2.4; 95% CI, 1.4–4.0, p = .001). Pneumococcal serotype and drug resistance were not associated with poor outcomes.

Conclusions

Host factors, disease severity, and biomarkers, especially WBC counts and serum creatinine, were more important determinants of mortality than bacterial factors.

2.
Background

Thrombocytopenia is a hallmark of dengue infection, and bleeding is a dreaded complication of dengue fever. Prophylactic platelet transfusion has been used to prevent bleeding in the management of dengue fever, although the evidence for its benefit is lacking. In adult dengue patients with platelet count <20,000/mm3 without bleeding, we aimed to assess whether prophylactic platelet transfusion was effective in reducing clinical bleeding and improving other outcomes.

Methods

We conducted a retrospective non-randomised observational study of dengue patients with platelet count <20,000/mm3 without bleeding (except petechiae) admitted to Tan Tock Seng Hospital from January 2005 to December 2008. Baseline characteristics and clinical outcomes were compared between the non-transfused and transfused groups. Outcomes studied were clinical bleeding, platelet increment, hospital length of stay, intensive care unit admission and death.

Results

Of the 788 patients included, 486 received prophylactic platelet transfusion. There was no significant difference in the presence of clinical bleeding between the two groups (18.2% in the non-transfused group vs. 23.5% in the transfused group; P = 0.08). Patients in the transfused group took a median of 1 day longer than the non-transfused group to increase their platelet count to 50,000/mm3 or more (3 days vs. 2 days, P < 0.0001). The median duration of hospital stay was 5 days in the non-transfused group vs. 6 days in the transfused group (P < 0.0001). There was no significant difference in the proportion requiring ICU admission (non-transfused 0.66% vs. transfused 1.23%, P = 0.44) or death (non-transfused 0% vs. transfused 0.2%, P = 0.43).

Conclusions

Platelet transfusion in the absence of bleeding in adult dengue patients with platelet count <20,000/mm3 did not reduce bleeding or expedite platelet recovery. There was potential harm in slowing the recovery of the platelet count to >50,000/mm3 and increasing the length of hospitalization.

3.

Introduction

Markers of the systemic inflammatory response, including C-reactive protein and albumin (combined to form the modified Glasgow Prognostic Score), as well as neutrophil, lymphocyte and platelet counts have been shown to be prognostic of survival in patients with cancer. The aim of the present study was to examine the prognostic relationship between these markers of the systemic inflammatory response and all-cause, cancer, cardiovascular and cerebrovascular mortality in a large incidentally sampled cohort.

Methods

Patients (n = 160 481) who had an incidental blood sample taken between 2000 and 2008 were studied for the prognostic value of C-reactive protein (>10 mg/l), albumin (<35 g/l), neutrophil (>7.5×10⁹/l), lymphocyte and platelet counts. In addition, patients (n = 52 091) sampled following the introduction of high-sensitivity C-reactive protein (>3 mg/l) measurements were studied. Combinations of these markers, forming cumulative inflammation-based scores, were investigated.

Results

In all patients (n = 160 481), C-reactive protein (>10 mg/l) (HR 2.71, p<0.001), albumin (<35 g/l) (HR 3.68, p<0.001) and neutrophil count (HR 2.18, p<0.001) were independently predictive of all-cause mortality. These associations were also observed for cancer, cardiovascular and cerebrovascular mortality, both before and after the introduction of high-sensitivity C-reactive protein (>3 mg/l) measurements (n = 52 091). A combination of high-sensitivity C-reactive protein (>3 mg/l), albumin and neutrophil count predicted all-cause (HR 7.37, p<0.001, AUC 0.723), cancer (HR 9.32, p<0.001, AUC 0.731), cardiovascular (HR 4.03, p<0.001, AUC 0.650) and cerebrovascular (HR 3.10, p<0.001, AUC 0.623) mortality.

Conclusion

The results of the present study show that an inflammation-based prognostic score combining high-sensitivity C-reactive protein, albumin and neutrophil count is prognostic of all-cause mortality.

4.
Severe ADAMTS13 deficiency occurs in 13% to 75% of thrombotic microangiopathies (TMA). In this context, early identification of a severe, antibody-mediated ADAMTS13 deficiency may allow clinicians to start targeted therapies such as B-lymphocyte-depleting monoclonal antibodies. To date, assays exploring ADAMTS13 activity require skill and are limited to a few specialized reference laboratories, given the very low incidence of the disease. To identify clinical features that may allow rapid prediction of an acquired ADAMTS13 deficiency, we performed a cross-sectional analysis of our national registry from 2000 to 2007. The clinical presentation of 160 patients with TMA and acquired ADAMTS13 deficiency was compared with that of 54 patients with detectable ADAMTS13 activity. ADAMTS13 deficiency was associated with more relapses during treatment and with a good renal prognosis. Patients with acquired ADAMTS13 deficiency had a platelet count <30×10⁹/L (adjusted odds ratio [OR] 9.1, 95% confidence interval [CI] 3.4–24.2, P<.001), serum creatinine level ≤200 µmol/L (OR 23.4, 95% CI 8.8–62.5, P<.001), and detectable antinuclear antibodies (OR 2.8, 95% CI 1.0–8.0, P<.05). When at least one criterion was met, patients with a severe acquired ADAMTS13 deficiency were identified with a positive predictive value of 85%, a negative predictive value of 93.3%, a sensitivity of 98.8%, and a specificity of 48.1%. These criteria should be useful for rapidly identifying newly diagnosed patients with an acquired ADAMTS13 deficiency, to better tailor treatment for different pathophysiological groups.
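The "at least one of three criteria" screening rule described in this abstract is simple enough to express directly. A minimal sketch (the function and argument names are ours, not from the study):

```python
def predicts_severe_adamts13_deficiency(platelet_count, creatinine_umol_l, ana_positive):
    """Screen for severe acquired ADAMTS13 deficiency: positive when at
    least one of the three registry-derived criteria is met.

    platelet_count    -- platelets, x10^9/L
    creatinine_umol_l -- serum creatinine, umol/L
    ana_positive      -- detectable antinuclear antibodies (bool)
    """
    return (platelet_count < 30
            or creatinine_umol_l <= 200
            or ana_positive)
```

With this OR-of-criteria structure, sensitivity is high (98.8% in the study) at the cost of specificity (48.1%), which is the usual trade-off for a rule intended to rule out disease quickly.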

5.
Background

The link of low estimated glomerular filtration rate (eGFR) and high proteinuria to cardiovascular disease (CVD) mortality is well known. However, their link to mortality from other causes is less clear.

Methods

We studied 367,932 adults (20–93 years old) in the Korean Heart Study (baseline between 1996–2004 and follow-up until 2011) and assessed the associations of creatinine-based eGFR and dipstick proteinuria with mortality due to CVD (1,608 cases), cancer (4,035 cases), and other (non-CVD/non-cancer) causes (3,152 cases) after adjusting for potential confounders.

Results

Although cancer was overall the most common cause of mortality, in participants with chronic kidney disease (CKD), non-CVD/non-cancer deaths accounted for approximately half of all deaths (47.0% for eGFR <60 ml/min/1.73 m2 and 54.3% for proteinuria ≥1+). Lower eGFR (<60 vs. ≥60 ml/min/1.73 m2) was significantly associated with mortality due to CVD (adjusted hazard ratio 1.49 [95% CI, 1.24–1.78]) and non-CVD/non-cancer causes (1.78 [1.54–2.05]). The risk of cancer mortality only reached significance at eGFR <45 ml/min/1.73 m2, with eGFR 45–59 ml/min/1.73 m2 as the reference (1.62 [1.10–2.39]). High proteinuria (dipstick ≥1+ vs. negative/trace) was consistently associated with mortality due to CVD (1.93 [1.66–2.25]), cancer (1.49 [1.32–1.68]), and other causes (2.19 [1.96–2.45]). Examining finer mortality causes, low eGFR and high proteinuria were both associated with mortality due to coronary heart disease, any infectious disease, diabetes, and renal failure. In addition, proteinuria was related to death from stroke, cancers of the stomach, liver, pancreas, and lung, myeloma, pneumonia, and viral hepatitis.

Conclusions

Low eGFR was associated with CVD and non-CVD/non-cancer mortality, whereas higher proteinuria was consistently related to mortality due to CVD, cancer, and other causes. These findings suggest the need for multidisciplinary prevention and management strategies in individuals with CKD, particularly when proteinuria is present.

6.
Changes in blood cell parameters are a well-known feature of malarial infections. To add to this information, the objective of this study was to investigate the effects that different levels of parasite density have on blood cell parameters. Patients diagnosed with malaria at Phobphra Hospital, Tak Province, Thailand between January 1st 2009 and January 1st 2012 were recruited as subjects for data collection. Blood cell parameters of 2,024 malaria-infected patients were evaluated and statistically analyzed. Neutrophil and platelet counts were significantly higher, whereas RBC count was significantly lower, in patients with P. falciparum infection compared with those with P. vivax infection (p<0.0001). Leukocyte counts were also significantly higher in patients with high parasitemia compared with those with low and moderate parasitemia. In the differential leukocyte count, neutrophil count was significantly higher in patients with high parasitemia than in those with low and moderate parasitemia (p<0.0001), whereas both lymphocyte and monocyte counts were significantly lower (p<0.0001). RBC count and Hb concentration (p<0.05), as well as platelet count (p<0.0001), were also significantly reduced. In summary, patients infected with different malaria parasites exhibited distinctive hematological parameters, with neutrophil and eosinophil counts being the two most affected. In addition, patients with different parasite densities also exhibited important changes in leukocyte count, platelet count and hemoglobin concentration during infection. These findings offer the opportunity to recognize and diagnose malaria-related anemia, help support its treatment, and relieve symptoms of severe malaria in endemic regions.

7.
Background

Pneumonia is a major cause of mortality among HIV-infected patients. Pneumonia severity scores are promising tools to assist clinicians in predicting patients' 30-day mortality, but existing scores were developed in populations infected with neither HIV nor tuberculosis (TB) and include laboratory data that may not be available in resource-limited settings. The objective of this study was to develop a score to predict mortality in HIV-infected adults with pneumonia in TB-endemic, resource-limited settings.

Methods

We conducted a secondary analysis of data from a prospective study enrolling HIV-infected adults with cough ≥2 weeks and <6 months and clinically suspected pneumonia admitted to Mulago Hospital in Kampala, Uganda from September 2008 to March 2011. Patients provided two sputum specimens for mycobacteria, and those with Ziehl-Neelsen sputum smears negative for mycobacteria underwent bronchoscopy with inspection for Kaposi sarcoma and testing for mycobacteria and fungi, including Pneumocystis jirovecii. A multivariable best-subsets regression model was developed, and one point was assigned to each variable in the model to create a clinical predictor score for 30-day mortality.

Results

Overall, 835 patients were studied (mean age 34 years, 53.4% female, 30-day mortality 18.2%). A four-point clinical predictor score was identified, comprising heart rate >120 beats/minute, respiratory rate >30 breaths/minute, oxygen saturation <90%, and CD4 cell count <50 cells/mm3. Patients' 30-day mortality, stratified by score, was: score 0 or 1, 12.6%; score 2 or 3, 23.4%; score 4, 53.9%. For each 1-point increase in the clinical predictor score, the odds of 30-day mortality increased by 65% (OR 1.65, 95% CI 1.39–1.96, p < 0.001).

Conclusions

A simple, four-point scoring system can stratify patients by level of mortality risk. Rapid identification of higher-risk patients, combined with timely and appropriate treatment, may improve clinical outcomes. This predictor score should be validated in other resource-limited settings.
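Because the score assigns one point per criterion, it can be sketched in a few lines. A minimal illustration of the four-point score and the mortality strata reported in the abstract (function and variable names are ours):

```python
def pneumonia_mortality_score(heart_rate, resp_rate, spo2_pct, cd4_count):
    """Four-point clinical predictor score for 30-day mortality:
    one point per criterion listed in the abstract."""
    criteria = [
        heart_rate > 120,   # beats/minute
        resp_rate > 30,     # breaths/minute
        spo2_pct < 90,      # oxygen saturation, %
        cd4_count < 50,     # CD4 cells/mm3
    ]
    return sum(criteria)

def observed_mortality_stratum(score):
    """Map a score onto the 30-day mortality strata reported in the study."""
    if score <= 1:
        return "12.6%"
    if score <= 3:
        return "23.4%"
    return "53.9%"
```

For example, a patient with heart rate 130, respiratory rate 35, SpO2 85% and CD4 count 40 scores 4 points, the highest-risk stratum.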

8.
Background

Eosinophilic meningitis (EM) is a rare clinical syndrome caused by both infectious and noninfectious diseases. In tropical Pacific countries, Angiostrongylus cantonensis is the most common cause. However, the definition of EM varies in the literature, and its relation to parasitic meningitis (PM) remains unclear.

Methodology/Principal findings

Adult and adolescent patients aged 13 years or older with suspected central nervous system (CNS) infections and abnormal CSF findings were prospectively enrolled at a tertiary referral hospital in Hanoi, Vietnam from June 2012 to May 2014. Patients with EM or suspected PM (EM/PM) were defined by the presence of either ≥10% eosinophils or an absolute eosinophil count of ≥10/mm3 in the CSF, or blood eosinophilia (>16% of WBCs) without CSF eosinophils. In total, 679 patients were enrolled: 7 (1.03%) had ≥10% CSF eosinophilia, 20 (2.95%) had ≥10/mm3 CSF eosinophilia, and 7 (1.03%) had >16% blood eosinophilia. Patients with ≥10% CSF eosinophilia were significantly younger (p = 0.017) and had a lower body temperature (p = 0.036) than patients with ≥10/mm3 CSF eosinophilia, among whom bacterial pathogens were detected in 72.2% (13/18) of those tested by culture and/or PCR. In contrast, the characteristics of patients with >16% blood eosinophilia resembled those of patients with ≥10% CSF eosinophilia. We further conducted serological tests and real-time PCR to identify A. cantonensis. Serology or real-time PCR was positive in 3 (42.8%) patients with ≥10% CSF eosinophilia and 6 (85.7%) patients with >16% blood eosinophilia without CSF eosinophils, but in none of the patients with ≥10/mm3 CSF eosinophilia.

Conclusions

The etiology of PM in northern Vietnam is A. cantonensis. The eosinophil percentage is a more reliable predictor of parasitic EM than the absolute eosinophil count in the CSF. Patients with PM may present with a high percentage of eosinophils in the peripheral blood but not in the CSF.
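The study's three enrolment criteria can be written as a small classifier. A sketch under the thresholds quoted in the abstract (function name and returned labels are ours):

```python
def em_pm_category(csf_eos_pct, csf_eos_abs, blood_eos_pct):
    """Classify a patient against the study's EM/suspected-PM criteria:
    >=10% CSF eosinophils, >=10/mm3 absolute CSF eosinophils, or
    >16% blood eosinophilia without CSF eosinophils."""
    if csf_eos_pct >= 10:
        return "CSF eosinophilia >=10%"
    if csf_eos_abs >= 10:  # cells/mm3
        return "CSF eosinophilia >=10/mm3"
    if blood_eos_pct > 16 and csf_eos_abs == 0:
        return "blood eosinophilia >16% without CSF eosinophils"
    return "not EM/PM"
```

The ordering matters: a patient meeting the percentage criterion is reported under that group even if the absolute count is also high, which mirrors how the abstract contrasts the percentage-defined and count-defined groups.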

9.
Introduction

The increasing incidence of dengue among adults in Malaysia and other countries has important implications for health services. Before 2004, in order to cope with the surge in adult dengue admissions, each of the six medical wards in a university hospital took turns daily to admit and manage patients with dengue. Despite regular in-house training, implementation of the WHO 1997 dengue case management guidelines by the multiple medical teams was piecemeal and resulted in high variability of care. A restructuring of the adult dengue inpatient service in 2004 resulted in all patients being admitted to one ward under the care of the infectious disease unit. Hospital and intensive care unit admission criteria, discharge criteria and clinical laboratory testing were maintained unchanged throughout the study period.

Objectives

To evaluate the impact of cohorting adult dengue patients on the quality of care and clinical outcome in a university hospital in Malaysia.

Methods

A pre-intervention (2003) and post-intervention (2005–6) retrospective study was undertaken.

Intervention

Cohorting all dengue patients under the care of the infectious disease team in a designated ward in 2004.

Results

The number of patients enrolled was 352 in 2003, 785 in 2005 and 1158 in 2006. The evaluation and detection of haemorrhage remained high (>90%) and unchanged throughout the study period. The evaluation of plasma leakage increased from 35.4% pre-intervention to 78.8% post-intervention (p < 0.001), while its detection increased from 11.4% to 41.6% (p < 0.001). Examination for peripheral perfusion was undertaken in only 13.1% of patients pre-intervention, with a significant increase post-intervention, to 18.6% and 34.2% respectively (p < 0.001). Pre-intervention, more patients had hypotension (21.5%) than detected peripheral hypoperfusion (11.4%), indicating that clinicians recognised shock only when patients developed hypotension. In contrast, post-intervention, clinicians recognised peripheral hypoperfusion as an early sign of shock. The highest haematocrit was significantly higher post-intervention, but the lowest total white cell counts and platelet counts remained unchanged. A significant and progressive reduction in the use of platelet transfusions occurred, from 21.7% pre-intervention to 14.6% in 2005 and 5.2% in 2006 (p < 0.001). Likewise, the use of plasma transfusion decreased significantly from 6.1% pre-intervention to 4.0% and 1.6% in the post-intervention years of 2005 and 2006 respectively (p < 0.001). The duration of intravenous fluid therapy decreased from 3 days pre-intervention to 2.5 days post-intervention (p < 0.001); the length of hospital stay fell from 4 days to 3 days (p < 0.001); and the rate of intensive care admission fell from 5.8% pre-intervention to 2.6% and 2.5% post-intervention (p = 0.005).

Conclusion

Cohorting adult dengue patients under a dedicated and trained team of doctors and nurses led to a substantial improvement in quality of care and clinical outcome.

10.

Background

Successful combination antiretroviral therapy (cART) increases levels of CD4+ T-cells; however, this increase may not accurately reflect long-term immune recovery, since T-cell dysregulation and loss of T-cell homeostasis often persist. We therefore assessed the impact of a decade of effective cART on immune regulation, T-cell homeostasis, and overall T-cell phenotype.

Methods

We conducted a retrospective study of 288 HIV+ cART-naïve patients initiating therapy. We identified 86 individuals who received cART for at least a decade, of which 44 consistently maintained undetectable plasma HIV-RNA levels throughout therapy. At baseline, participants were classified into three groups according to pre-treatment CD4+ T-cell counts: Group I (CD4<200 cells/mm3); Group II (CD4: 200–350 cells/mm3); Group III (CD4>350 cells/mm3). Outcomes of interest were: (1) CD4+ T-cell count restoration (CD4>532 cells/mm3); (2) normalization of CD4:CD8 T-cell ratio (1.2–3.3); (3) maintenance of CD3+ T-cell homeostasis (CD3: 65%–85% of peripheral lymphocytes); (4) normalization of the complete T-cell phenotype (TCP).

Results

Despite a decade of sustained successful cART, complete T-cell phenotype normalization only occurred in 16% of patients, most of whom had initiated therapy at high CD4+ T-cell counts (>350 cells/mm3). The TCP parameter that was the least restored among patients was the CD4:CD8 T-cell ratio.

Conclusions

Failure to normalize the complete T-cell phenotype was most apparent in patients who initiated cART with a CD4+ T-cell count <200 cells/mm3. The impact of this impaired T-cell phenotype on life-long immune function and potential comorbidities remains to be elucidated.

11.
Introduction

The aim of this study was to investigate the prognostic significance of the combination of the preoperative platelet count and neutrophil-lymphocyte ratio (COP-NLR) for predicting postoperative survival of patients undergoing complete resection for non-small cell lung cancer (NSCLC).

Methods

The preoperative COP-NLR was calculated from the data obtained: patients with both an increased platelet count (>30.0×10⁴/mm³) and an elevated NLR (>2.3) were assigned a score of 2, and patients with one or neither abnormality were assigned a score of 1 or 0, respectively.

Results

A total of 1238 NSCLC patients were enrolled in this analysis. Multivariate analysis using the 15 clinicolaboratory variables selected by univariate analyses demonstrated that the preoperative COP-NLR was an independent prognostic factor for DFS (HR: 1.834, 95% CI: 1.536 to 2.200, P<0.001) and OS (HR: 1.810, 95% CI: 1.587 to 2.056, P<0.001). In sub-analyses by tumor stage (I, II, IIIA), a significant association was found between DFS and OS and the level of COP-NLR in each subgroup (P<0.001, P=0.002, P<0.001 for DFS, respectively; P<0.001, P=0.001, P<0.001 for OS). When the subgroup of patients with a high-risk COP-NLR (score of 2) was analyzed, no benefit of adjuvant chemotherapy could be found (P=0.237 for DFS and P=0.165 for OS).

Conclusions

The preoperative COP-NLR is able to predict the prognosis of patients with NSCLC and to divide these patients into three independent groups before surgery. Our results also demonstrate that high-risk patients based on the COP-NLR do not benefit from adjuvant chemotherapy. Independent validation of our findings is warranted.
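The COP-NLR is simply the count of the two threshold abnormalities. A minimal sketch using the cut-offs stated in the abstract (function and argument names are ours):

```python
def cop_nlr(platelet_count, neutrophils, lymphocytes):
    """Preoperative COP-NLR score (0, 1 or 2).

    platelet_count -- platelets in 10^4/mm3 (threshold >30.0)
    neutrophils, lymphocytes -- absolute counts in the same unit,
    used only via their ratio (NLR threshold >2.3)
    """
    nlr = neutrophils / lymphocytes
    return int(platelet_count > 30.0) + int(nlr > 2.3)
```

A patient with platelets 35.0×10⁴/mm³ and an NLR of 3.0 scores 2 (high risk); one with platelets 20.0×10⁴/mm³ and an NLR of 2.0 scores 0.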

12.
Background

Snakebite is a neglected problem with a high mortality in India. There are no simple clinical prognostic tools that can predict mortality in viper envenomings. We aimed to develop and validate a mortality-risk prediction score for patients with viper envenoming in southern India.

Methods

We used clinical predictors from a prospective cohort of 248 patients with a syndromic diagnosis of viper envenoming and a positive 20-minute whole blood clotting test (WBCT 20) from a tertiary-care hospital in Puducherry, India. We applied multivariable logistic regression with a backward elimination approach. External validation of this score was done among 140 patients from the same centre, and its performance was assessed with the concordance statistic and calibration plots.

Findings

The final model, termed VENOMS (from "Viper ENvenOming Mortality Score"), included 7 admission clinical parameters (recorded in the first 48 hours after the bite): presence of overt bleeding manifestations, presence of capillary leak syndrome, haemoglobin <10 g/dL, bite-to-antivenom administration time >6.5 h, systolic blood pressure <100 mm Hg, urine output <20 mL/h in 24 h, and female gender. The lowest possible VENOMS score of 0 predicted an in-hospital mortality risk of 0.06%, while the highest score of 12 predicted a mortality of 99.1%. The model had a concordance statistic of 0.86 (95% CI 0.79–0.94) in the validation cohort. Calibration plots indicated good agreement between predicted and observed outcomes.

Conclusions

The VENOMS score is a good predictor of mortality in viper envenoming in southern India, where the burden of Russell's viper envenoming is high. The score may have potential applications in triaging patients and guiding management after further validation.
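Structurally, VENOMS is a weighted sum over the 7 admission parameters. The abstract gives the parameters and the total range (0–12) but not the points per parameter, so the weights below are placeholders chosen only to make the maximum equal 12; they are NOT the published point values:

```python
# Placeholder weights (NOT from the study): the abstract lists the 7
# parameters and a 0-12 range but does not publish per-parameter points.
HYPOTHETICAL_WEIGHTS = {
    "overt_bleeding": 2,
    "capillary_leak_syndrome": 2,
    "haemoglobin_below_10": 2,
    "bite_to_antivenom_over_6_5h": 2,
    "sbp_below_100": 2,
    "urine_below_20ml_per_h": 1,
    "female": 1,
}

def venoms_score(findings):
    """Sum the points for the admission parameters present.
    `findings` is a set of keys from HYPOTHETICAL_WEIGHTS."""
    return sum(w for k, w in HYPOTHETICAL_WEIGHTS.items() if k in findings)
```

The published score should be consulted for the real point assignments; the sketch only illustrates the additive structure of the model.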

13.
Background

Given the high death rate during the first two months of tuberculosis (TB) therapy in HIV patients, it is critical to define the optimal time to initiate combination antiretroviral therapy (cART).

Methods

A randomized, open-label clinical trial comparing the efficacy and safety of efavirenz-based cART initiated one week, four weeks, and eight weeks after TB therapy in patients with a baseline CD4 count <200 cells/μL was conducted. The primary endpoint was the all-cause mortality rate at 48 weeks. The secondary endpoints were hepatotoxicity requiring interruption of TB therapy, TB-associated immune reconstitution inflammatory syndrome, new AIDS-defining illnesses, CD4 counts, HIV RNA levels, and AFB smear conversion rates. All analyses were intention-to-treat.

Results

We studied 478 patients with a median CD4 count of 73 cells/μL and 5.2 logs of HIV RNA, randomized to week one (n = 163), week four (n = 160), and week eight (n = 155). Sixty-four deaths (13.4%) occurred over 339.2 person-years. All-cause mortality rates at 48 weeks were 25 per 100 person-years in week one, 18 per 100 person-years in week four and 15 per 100 person-years in week eight (P = 0.2 by the log-rank test). All-cause mortality incidence rate ratios in subgroups with CD4 count below 50 cells/μL versus above were 2.8 in week one (95% CI 1.2–6.7), 3.1 in week four (95% CI 1.2–8.6) and 5.1 in week eight (95% CI 1.8–16). Serum albumin <3 g/dL (adjusted HR, aHR = 2.3) and CD4 <50 cells/μL (aHR = 2.7) were independent predictors of mortality. Compared with similar subgroups from weeks four and eight, first-line TB treatment interruption was high among week-one deaths (P = 0.03) and in the CD4 <50 cells/μL subgroup (P = 0.02).

Conclusions

Antiretroviral therapy one week after TB therapy does not improve overall survival. Despite the increased mortality with CD4 <50 cells/μL, we recommend initiating cART later than the first week of TB therapy to avoid serious hepatotoxicity and treatment interruption.

Trial Registration

ClinicalTrials.gov NCT01315301

14.
Introduction

Sepsis is associated with increased mortality, delirium and long-term cognitive impairment in intensive care unit (ICU) patients. Electroencephalogram (EEG) abnormalities occurring at the acute stage of sepsis may correlate with the severity of brain dysfunction. The predictive value of early standard EEG abnormalities for mortality in ICU septic patients remains to be assessed.

Methods

In this prospective, single-center, observational study, standard EEG was performed, analyzed and classified according to both the Synek and Young EEG scales in consecutive patients acutely admitted to the ICU for sepsis. Delirium, coma and the level of sedation were assessed at the time of EEG recording; the duration of sedation and the occurrence of in-ICU delirium or death were assessed during follow-up. Adjusted analyses were carried out using multiple logistic regression.

Results

One hundred ten patients were included; mean age 63.8 (±18.1) years, median SAPS-II score 38 (29–55). At the time of EEG recording, 46 patients (42%) were sedated and 22 (20%) suffered from delirium. Overall, 54 patients (49%) developed delirium, of whom 32 (29%) in the days after EEG recording, and 23 patients (21%) died in the ICU. Absence of EEG reactivity was observed in 27 patients (25%), periodic discharges (PDs) in 21 (19%) and electrographic seizures (ESZ) in 17 (15%). ICU mortality was independently associated with a delta-predominant background (OR: 3.36; 95% CI [1.08 to 10.4]), absence of EEG reactivity (OR: 4.44; 95% CI [1.37–14.3]), PDs (OR: 3.24; 95% CI [1.03 to 10.2]), Synek grade ≥ 3 (OR: 5.35; 95% CI [1.66–17.2]) and Young grade > 1 (OR: 3.44; 95% CI [1.09–10.8]) after adjustment for the Simplified Acute Physiology Score (SAPS-II) at admission and the level of sedation. Delirium at the time of EEG was associated with ESZ in non-sedated patients (32% vs 10%, p = 0.037), with Synek grade ≥ 3 (36% vs 7%, p < 0.05) and with Young grade > 1 (36% vs 17%, p < 0.001). Occurrence of delirium in the days after EEG was associated with a delta-predominant background (48% vs 15%, p = 0.001), absence of reactivity (39% vs 10%, p = 0.003), Synek grade ≥ 3 (42% vs 17%, p = 0.001) and Young grade > 1 (58% vs 17%, p = 0.0001).

Conclusions

In this prospective cohort of 110 septic ICU patients, early standard EEG was significantly disturbed. Absence of EEG reactivity, a delta-predominant background, PDs, Synek grade ≥ 3 and Young grade > 1 at day 1 to 3 following admission were independent predictors of ICU mortality and were associated with the occurrence of delirium. ESZ and PDs were found in about 20% of our patients; their prevalence, and hence their predictive value, could have been higher had they been sought more thoroughly using continuous EEG.

15.

Background

Anticoagulation therapy is usually required in patients with chronic kidney disease (CKD) for treatment or prevention of thromboembolic diseases. However, this benefit could easily be offset by the risk of bleeding.

Objectives

To determine the incidence of adverse outcomes of anticoagulants in hospitalized patients with CKD, and to compare the rates of major bleeding events between the unfractionated heparin (UFH) and enoxaparin users.

Methods

A one-year prospective observational study was conducted in patients with CKD stages 3 to 5 (estimated GFR, 10–59 ml/min/1.73 m2) who were admitted to the renal unit of Dubai Hospital. Propensity scores for the use of anticoagulants, estimated for each of the 488 patients, were used to identify a cohort of 117 pairs of patients. The Cox regression method was used to estimate the association between anticoagulant use and adverse outcomes.

Results

Major bleeding occurred in 1 in 3 patients who received anticoagulation during hospitalization (hazard ratio [HR], 4.61 [95% confidence interval [CI], 2.05–10.35]). Compared with enoxaparin users, patients who received anticoagulation with unfractionated heparin had a lower mean [SD] platelet count (139.95 [113]×10³/µL vs 205.56 [123]×10³/µL; P<0.001) and a higher risk of major bleeding (HR, 4.79 [95% CI, 1.85–12.36]). Furthermore, compared with those who did not receive anticoagulants, patients who did had higher in-hospital mortality (HR, 2.54 [95% CI, 1.03–6.25]), a longer length of hospitalization (HR, 1.04 [95% CI, 1.01–1.06]), and higher hospital readmission at 30 days (HR, 1.79 [95% CI, 1.10–2.91]).

Conclusions

Anticoagulation among hospitalized patients with CKD was significantly associated with an increased risk of bleeding and in-hospital mortality. Hence, intensive monitoring and preventive measures such as laboratory monitoring and/or dose adjustment are warranted.

16.
Background

The immunological and virological status of HIV-infected individuals entering the Brazilian public health system over time was analyzed. We evaluated the impact of ART on virological and immunological outcomes and on antiretroviral resistance over time.

Methods

CD4+ T cell counts, viral loads and genotypes from patients over 13 years old from 2001–2011 were analyzed according to demographic data. We compared groups using parametric t-tests and linear regression analysis in the R statistical software language.

Results

Mean baseline CD4+ T cell counts varied from 348 (2003) to 389 (2009) and were higher among women (p = 1.1 x 10−8), lower in older patients (p < 1 x 10−8) and lower in less developed regions (p = 1.864 x 10−5). The percentage of treated patients with undetectable viral loads increased linearly from 46% (2001) to 77% (2011), and was lower among women (p = 2.851 x 10−6), at younger ages (p = 1 x 10−3), and in less developed regions (p = 1.782 x 10−4). Acquired NRTI resistance was 86% in 2001–3 and decreased over time. NNRTI resistance increased from 2001–3 (50%) to 2006–9 (60%), PI resistance decreased from 2001–3 (60%) to 2009 (40%), and 3-class resistance was stable over time at around 25%. Subtype prevalence comprised B (75.3%), B/F recombinants (12.2%), C (5.7%), F (5.3%) and B/C recombinants (1.5%), with regional variations. Three-class resistance was 26.5% among subtype B, 22.4% among subtype F and 17.2% among subtype C infections.

Conclusions

HIV diagnosis occurs late, especially among elderly Brazilians. Younger individuals need special attention due to poor virological response to treatment. The antiretroviral resistance profile is subtype-related.

17.
18.
Background

To investigate the effects of ultrasound-guided lumbar-sciatic nerve block and epidural anesthesia on the levels of inflammatory factors, such as interleukin-6 (IL-6), interleukin-8 (IL-8) and tumor necrosis factor-α (TNF-α), and coagulation factors in the peripheral blood of elderly patients after hip arthroplasty, to provide a reference for the choice of intraoperative anesthesia.

Methods

Ninety-six elderly patients who underwent hip arthroplasty in our hospital from March 2018 to December 2019 were selected and randomly divided into an ultrasound-guided lumbar-sciatic nerve block group (group A) and an epidural anesthesia group (group B), with 48 cases in each group. The onset time of intraoperative anesthesia, postoperative hemodynamic indexes, pain scores, inflammatory factors and coagulation factor levels were compared between the two groups.

Results

(1) The onset times of sensory and motor block in group B were shorter than in group A, and the maintenance time of anesthesia was longer (P<0.05). (2) Compared with group A, the postoperative visual analogue scale (VAS) score of group B patients was lower (P<0.05). (3) The systolic blood pressure (SBP) and diastolic blood pressure (DBP) of group B were higher than those of group A at T1 and T2 (P<0.05), while there was no statistical difference in SBP and DBP between the groups at T3 and T4 (P>0.05). (4) Compared with group A, the levels of TNF-α, IL-8 and IL-6 in the peripheral blood of group B were lower at T2, T3 and T4 (P<0.05). (5) Plasma factor V activity (FV:C), coagulation factor VIII activity (FVIII:C) and fibrinogen (FIB) levels differed between groups A and B at T2, T3 and T4, with significantly lower values in group B than in group A (P<0.05). (6) The half-year mortality rates in the two groups were 5.56% and 8.33% respectively, with no significant difference between groups A and B (P>0.05).

Conclusions

Compared with epidural anesthesia, lumbar-sciatic nerve block resulted in significantly lower postoperative concentrations of peripheral blood coagulation factors and inflammatory factors, thereby alleviating postoperative hypercoagulability and inflammation.

19.

Objective

To assess the validity of CRB-65 (Confusion, Respiratory rate ≥30 breaths/min, BP <90/60 mmHg, age ≥65 years) as a pneumonia severity index in a Malawian hospital population, and to determine whether an alternative score has greater accuracy in this setting.
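The CRB-65 index assessed here assigns one point per criterion, giving a 0–4 score. A minimal sketch of that calculation, using commonly cited cut-offs (exact thresholds vary slightly across guidelines, and the BP criterion is interpreted here as SBP <90 mmHg or DBP ≤60 mmHg):

```python
def crb65(confusion, resp_rate, sbp, dbp, age):
    """CRB-65 pneumonia severity score (0-4): one point each for
    Confusion, Respiratory rate >= 30/min, low Blood pressure
    (SBP < 90 or DBP <= 60 mmHg), and age >= 65 years."""
    return (int(bool(confusion))
            + int(resp_rate >= 30)
            + int(sbp < 90 or dbp <= 60)
            + int(age >= 65))
```

For example, a 70-year-old without confusion, respiratory rate 32/min and BP 85/55 scores 3; a 40-year-old with normal vitals scores 0.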

Design

Forty three variables were prospectively recorded during the first 48 hours of admission in all patients admitted to Queen Elizabeth Central Hospital, Malawi, for management of lower respiratory tract infection over a two month period (N = 240). Calculation of sensitivity and specificity for CRB-65 in predicting mortality was followed by multivariate modeling to create a score with superior performance in this population.

Results

The median age was 37 years, HIV prevalence 79.9%, and overall mortality 18.3%. CRB-65 predicted mortality poorly, as indicated by an area under the ROC curve of 0.649. Independent predictors of death were: male sex, “S” (AOR 2.6); wasting, “W” (AOR 6.6); non-ambulatory status, “A” (AOR 2.5); temperature >38°C or <35°C, “T” (AOR 3.2); and BP <100/60, “Bp” (AOR 3.7). Combining these factors into a severity index (SWAT-Bp) predicted mortality with high sensitivity and specificity (AUC: 0.867). Mortality for scores 0–5 was 0%, 3.3%, 7.4%, 29.2%, 61.5% and 87.5%, respectively. A score ≥3 was 84% sensitive and 77% specific for mortality prediction, with a negative predictive value of 95.8%.
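The SWAT-Bp index described above is a simple additive score, one point per predictor. A minimal sketch of the scoring and the study's ≥3 severity cut-off (the BP criterion "BP <100/60" is interpreted here as SBP <100 mmHg or DBP <60 mmHg, which is an assumption):

```python
def swat_bp(male, wasting, non_ambulatory, temp_c, sbp, dbp):
    """SWAT-Bp score (0-5): one point each for male Sex, Wasting,
    non-Ambulatory status, Temperature >38 or <35 degrees C, and
    low Blood pressure (assumed SBP < 100 or DBP < 60 mmHg)."""
    return (int(bool(male))
            + int(bool(wasting))
            + int(bool(non_ambulatory))
            + int(temp_c > 38 or temp_c < 35)
            + int(sbp < 100 or dbp < 60))

def classify(score):
    # Study cut-off: >= 3 flags severe illness (84% sensitive, 77% specific)
    return "severe" if score >= 3 else "non-severe"
```

A febrile, wasted male with normal BP who can walk scores 3 and is flagged severe; a patient with none of the features scores 0.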

Conclusion

CRB-65 performs poorly in this population. The SWAT-Bp score can accurately stratify patients: a score ≤2 indicates non-severe infection (mortality 4.4%) and ≥3 severe illness (mortality 45%).

20.
RationaleThe Spanish guideline for COPD (GesEPOC) recommends COPD treatment according to four clinical phenotypes: non-exacerbator phenotype with either chronic bronchitis or emphysema (NE), asthma-COPD overlap syndrome (ACOS), frequent exacerbator phenotype with emphysema (FEE), and frequent exacerbator phenotype with chronic bronchitis (FECB). However, little is known about the distribution and outcomes of the four suggested phenotypes.ObjectiveWe aimed to determine the distribution of these COPD phenotypes and their relation to one-year clinical outcomes.MethodsWe followed a cohort of well-characterized patients with COPD for up to one year. Baseline characteristics, health status (CAT), BODE index, rate of exacerbations and mortality up to one year of follow-up were compared between the four phenotypes.ResultsOverall, 831 stable COPD patients were evaluated. They were distributed as NE, 550 (66.2%); ACOS, 125 (15.0%); FEE, 38 (4.6%); and FECB, 99 (11.9%); an additional 19 (2.3%) COPD patients with frequent exacerbations fulfilled the criteria for neither FEE nor FECB. At baseline, there were significant differences in symptoms, FEV1 and BODE index (all p<0.05). The FECB phenotype had the highest CAT score (17.1±8.2, p<0.05 compared to the other phenotypes). The frequent exacerbator groups (FEE and FECB) were receiving more pharmacological treatment at baseline and also experienced more exacerbations in the following year (all p<0.05), with no differences in one-year mortality. Most NE patients (93%) and half of the exacerbators were stable after one year.ConclusionsThere is an uneven distribution of COPD phenotypes in stable COPD patients, with significant differences in demographics, patient-centered outcomes and health care resource use.
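The four-phenotype split described above can be expressed as a small decision rule. The sketch below is an illustrative simplification, not the full GesEPOC algorithm: it assumes ≥2 exacerbations/year defines a frequent exacerbator and that asthma-overlap features take precedence over exacerbation status, and it returns the "neither FEE nor FECB" category the study observed in 19 of 831 frequent exacerbators.

```python
def gesepoc_phenotype(exacerbations_per_year, asthma_features,
                      chronic_bronchitis, emphysema):
    """Simplified sketch of the four GesEPOC clinical phenotypes.

    Assumptions (not from the abstract): >= 2 exacerbations/year
    defines a frequent exacerbator, and ACOS takes precedence.
    """
    if asthma_features:
        return "ACOS"                 # asthma-COPD overlap syndrome
    if exacerbations_per_year >= 2:   # frequent exacerbator branch
        if chronic_bronchitis:
            return "FECB"
        if emphysema:
            return "FEE"
        return "frequent exacerbator, neither FEE nor FECB"
    return "NE"                       # non-exacerbator
```

For instance, a patient with chronic bronchitis and three exacerbations in the past year maps to FECB, while the same patient with no exacerbations maps to NE.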
