Similar Articles
20 similar articles found (search time: 28 ms)
1.

Introduction

Mean platelet volume (MPV) is suggested as an index of inflammation, disease activity, and anti-inflammatory treatment efficacy in chronic inflammatory disorders; however, the effect of MPV on sepsis mortality remains unclear. Therefore, we investigated whether the change in MPV between hospital admission and 72 hours (ΔMPV72h-adm) predicts 28-day mortality in severe sepsis and/or septic shock.

Methods

We prospectively enrolled 345 patients admitted to the emergency department (ED) who received standardized resuscitation (early goal-directed therapy) for severe sepsis and/or septic shock between November 2007 and December 2011. Changes in platelet indices, including ΔMPV72h-adm, were compared between survivors and non-survivors by linear mixed model analysis. The prognostic value of ΔMPV72h-adm for 28-day mortality was ascertained by Cox proportional hazards model analysis.
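The ΔMPV72h-adm variable described above is simply the arithmetic change in MPV between admission and 72 hours. A minimal sketch of how such a variable might be derived (function name and values are illustrative, not from the study):

```python
def delta_mpv(mpv_admission: float, mpv_72h: float) -> float:
    """Change in mean platelet volume (fL) from admission to 72 hours."""
    return mpv_72h - mpv_admission

# Illustrative values only (not study data): a rise in MPV over 72 hours
# yields a positive delta, the direction associated with mortality here.
print(round(delta_mpv(8.4, 9.6), 1))  # 1.2
```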

Results

Thirty-five (10.1%) patients died within 28 days after ED admission. MPV increased significantly during the first 72 hours in both non-survivors (P = 0.001) and survivors (P < 0.001); however, the rate of MPV increase was significantly higher in non-survivors (P = 0.003). In contrast, the rate of platelet decline over the first 72 hours did not differ significantly between groups (P = 0.360). In multivariate analysis, ΔMPV72h-adm was an independent predictor of 28-day mortality after adjusting for plausible confounders (hazard ratio, 1.44; 95% confidence interval, 1.01–2.06; P = 0.044).

Conclusions

An increase in MPV during the first 72 hours of hospitalization is an independent risk factor for adverse clinical outcomes. Therefore, continuous monitoring of MPV may be useful to stratify mortality risk in patients with severe sepsis and/or septic shock.

2.

Objectives

To retrospectively validate the new Chinese DIC scoring system (CDSS).

Methods

This study retrospectively collected data on 619 patients (371 with non-hematologic malignancies, 248 with hematologic malignancies) suspected of DIC at Wuhan Union Hospital between April 2013 and June 2014. We validated the CDSS by comparing it with three leading scoring systems, from the International Society on Thrombosis and Haemostasis (ISTH), the Japanese Association for Acute Medicine (JAAM), and the Japanese Ministry of Health and Welfare (JMHW), and evaluated its prognostic value using 28-day mortality and APACHE II and SOFA scores.

Results

In non-hematologic malignancies, the CDSS was more specific than the JAAM system (72.55% vs. 50.49%, p<0.05) and more sensitive than the ISTH system (77.07% vs. 62.03%, p<0.05). In hematologic malignancies, the area under the ROC curve of the CDSS was larger than that of ISTH (0.933 vs. 0.889, p<0.01) and JMHW (0.944 vs. 0.845, p<0.01). In addition, the 28-day mortality rate, SOFA scores, and APACHE II scores of DIC patients diagnosed by the CDSS were significantly greater than those of non-DIC patients (P<0.05).
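The sensitivity and specificity comparisons above reduce to simple confusion-matrix arithmetic. A minimal sketch (the counts below are invented for illustration, not the study's data):

```python
def sensitivity(tp: int, fn: int) -> float:
    """Fraction of true DIC cases the score flags: TP / (TP + FN)."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """Fraction of non-DIC cases the score clears: TN / (TN + FP)."""
    return tn / (tn + fp)

# Illustrative counts only, not the study's data.
print(f"sensitivity = {sensitivity(77, 23):.2%}")  # 77.00%
print(f"specificity = {specificity(73, 27):.2%}")  # 73.00%
```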

Conclusions

We are the first group to propose the CDSS. It emphasizes the value of clinical manifestations, a rapidly declining platelet count, and APTT in the diagnosis of DIC, and uses D-dimer as the fibrin-related marker. DIC with hematological malignancies is treated as a distinct category. In this study, the CDSS displayed acceptable performance for the diagnosis of DIC, with appropriate sensitivity and specificity, and also had good prognostic value for DIC patients.

3.

Introduction

Multimodality monitoring is regularly employed in adult traumatic brain injury (TBI) patients where it provides physiologic and therapeutic insight into this heterogeneous condition. Pediatric studies are less frequent.

Methods

We analyzed data collected prospectively from 12 pediatric TBI patients admitted to the Pediatric Intensive Care Unit (PICU) of Addenbrooke's Hospital between August 2012 and December 2014. Intracranial pressure (ICP), mean arterial pressure (MAP), and cerebral perfusion pressure (CPP) were monitored continuously using ICM+® brain monitoring software, and the pressure reactivity index (PRx) and 'optimal CPP' (CPPopt) were calculated. Patient outcome was dichotomized into survivors and non-survivors.
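ICM+ computes PRx internally with its own averaging and window settings; as an illustration only, PRx is conventionally a moving Pearson correlation between ICP and MAP samples. A hedged stdlib sketch (the window length below is an arbitrary assumption):

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / sqrt(vx * vy)

def prx(icp, map_, window=30):
    """PRx-style index: moving-window correlation of ICP vs. MAP.

    The window length is an assumption for illustration; ICM+ uses its
    own sample averaging and window configuration."""
    return [pearson(icp[i:i + window], map_[i:i + window])
            for i in range(len(icp) - window + 1)]

# A perfectly pressure-passive brain (ICP tracks MAP) gives PRx near +1,
# the impaired-autoregulation end of the scale.
map_vals = [80 + (i % 7) for i in range(40)]
icp_vals = [10 + 0.5 * m for m in map_vals]
print(round(prx(icp_vals, map_vals, window=30)[0], 6))  # 1.0
```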

Results

At 6 months, 8/12 (66%) of the cohort had survived the TBI. The median (±IQR) ICP was significantly lower in survivors (13.1±3.2 mm Hg) than in non-survivors (21.6±42.9 mm Hg; p = 0.003). The median time spent with ICP over 20 mm Hg was lower in survivors (9.7±9.8% vs. 60.5±67.4% in non-survivors; p = 0.003). Although there was no evidence that CPP differed between survival groups, the time spent with CPP close to (within 10 mm Hg of) the optimal CPP was significantly longer in survivors (90.7±12.6%) than in non-survivors (70.6±21.8%; p = 0.02). PRx provided significant outcome separation, with a median PRx of 0.02±0.19 in survivors versus 0.39±0.62 in non-survivors (p = 0.02).

Conclusion

Our observations provide evidence that multimodality monitoring may be useful in pediatric TBI, with ICP, deviation of CPP from CPPopt, and PRx all correlating with patient outcome.

4.

Background and objective

Acute Physiology and Chronic Health Evaluation (APACHE) III score has been widely used for prediction of clinical outcomes in mixed critically ill patients. However, it has not been validated in patients with sepsis-associated acute lung injury (ALI). The aim of the study was to explore the calibration and predictive value of APACHE III in patients with sepsis-associated ALI.

Method

The study was a secondary analysis of a prospective randomized controlled trial investigating the efficacy of rosuvastatin in sepsis-associated ALI (Statins for Acutely Injured Lungs from Sepsis, SAILS). The study population was sepsis-related ALI patients. The primary outcome of the current study was the same as in the original trial, 60-day in-hospital mortality, defined as death before hospital discharge, censored 60 days after enrollment. Discrimination of APACHE III was assessed by calculating the area under the receiver operating characteristic (ROC) curve (AUC) with its 95% CI. Hosmer-Lemeshow goodness-of-fit statistic was used to assess the calibration of APACHE III. The Brier score was reported to represent the overall performance of APACHE III in predicting outcome.
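The discrimination and overall-performance measures named above can be computed without a statistics library: the AUC equals the Mann-Whitney probability that a random non-survivor scores above a random survivor, and the Brier score is a mean squared error. A sketch (the scores below are invented for illustration):

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney statistic: the
    probability a random positive (non-survivor) outscores a random
    negative (survivor), counting ties as one half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

def brier(probs, outcomes):
    """Brier score: mean squared gap between predicted probability
    and the observed 0/1 outcome (lower is better)."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

# Illustrative APACHE III-style scores, not study data:
dead = [120, 105, 98]     # non-survivors
alive = [70, 88, 95, 60]  # survivors
print(auc(dead, alive))   # 1.0 - perfect separation in this toy example
```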

Main results

A total of 745 patients were included in the study: 540 survivors and 205 non-survivors. Non-survivors were significantly older than survivors (59.71±16.17 vs. 52.00±15.92 years, p<0.001). The primary causes of ALI also differed between survivors and non-survivors (p = 0.017); survivors were more likely than non-survivors to have sepsis as the cause (21.2% vs. 15.1%). The APACHE III score was higher in non-survivors than in survivors (106.72±27.30 vs. 88.42±26.86; p<0.001). Discrimination of APACHE III in predicting mortality in ALI patients was moderate, with an AUC of 0.68 (95% confidence interval: 0.64–0.73).

Conclusion

This study is the first to validate the discrimination of APACHE III in patients with sepsis-associated ALI. The results show that the APACHE III score has moderate predictive value for in-hospital mortality among adults with sepsis-associated acute lung injury.

5.

Background/Aim

Acetaminophen (APAP) hepatotoxicity is related to the formation of N-acetyl-p-benzoquinone imine (NAPQI), which is detoxified through conjugation with reduced glutathione (GSH). Ophthalmic acid (OA) is an analogue of GSH in which cysteine is replaced with 2-aminobutyrate. Metabolomics studies of mice with APAP-induced acute liver failure (APAP-ALF) identified OA as a marker of oxidative stress and hepatic GSH consumption. The aim of the current study was to determine whether OA is detectable in patients with APAP-ALF either early (day 2) or late (day 4), and whether OA levels are associated with in-hospital survival in the absence of liver transplant.

Methods

Serum samples from 130 APAP-ALF patients (82 survivors, 48 non-survivors) were analyzed by liquid chromatography-tandem mass spectrometry (LC-MS/MS) and correlated with clinical data from the United States Acute Liver Failure Study Group (US ALFSG) Registry (2004–2011).

Results

Survivors had significantly lower admission bilirubin (4.2 vs. 5.7 mg/dl) and lactate levels (3.3 vs. 6.5 μmol/l; p<0.05 for both). During the first 7 days of the study, survivors were less likely to require mechanical ventilation (55% vs. 88%), vasopressor support (9.8% vs. 67%), or renal replacement therapy (26% vs. 63%; p<0.001 for all). Non-survivors were more likely to have detectable OA levels both early (31% vs. 15%, p = 0.034) and late (27% vs. 11%, p = 0.02). However, there were no significant differences in mean OA levels between non-survivors and survivors (early: 0.48 vs. 0.36; late: 0.43 vs. 0.37; P > 0.5 for all).

Conclusion

OA was detectable more frequently in APAP-ALF non-survivors, but mean OA levels were not associated with survival. The routine clinical administration of N-acetylcysteine could replenish GSH levels and prevent OA production.

6.

Background

Therapeutic plasma exchange (TPE)-based protocols immediately before cadaveric donor kidney transplantation have been extensively used in highly sensitized recipients. Plasma is generally preferred over human albumin as the replacement fluid to avoid depletion of coagulation factors and perioperative bleeding. The aim of this study was to estimate bleeding risk after TPE with albumin replacement, using rotational thromboelastometry (ROTEM).

Methodology

Ten patients without overt coagulation abnormalities underwent TPE. Standard laboratory coagulation tests (thromboplastin time, activated partial thromboplastin time (aPTT), international normalized ratio (INR), thrombin clotting time, fibrinogen levels and antithrombin activity) were compared with thrombelastometry analysis (EXTEM and INTEM tests) before and after TPE.
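Among the listed tests, the INR is itself derived from the prothrombin time via the standard formula INR = (PT_patient / PT_mean_normal)^ISI. A small sketch (the times and ISI value below are illustrative):

```python
def inr(pt_patient_s: float, pt_mean_normal_s: float, isi: float) -> float:
    """International normalized ratio from prothrombin times (seconds).

    ISI is the reagent's International Sensitivity Index, supplied by the
    manufacturer; the value 1.0 below is purely for illustration."""
    return (pt_patient_s / pt_mean_normal_s) ** isi

# Illustrative: a PT of 15 s against a 12 s lab mean with ISI 1.0.
print(round(inr(15.0, 12.0, 1.0), 2))  # 1.25
```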

Principal Findings

TPE significantly reduced fibrinogen levels (482 ± 182 vs. 223 ± 122 mg/dL), antithrombin activity (103 ± 11 vs. 54 ± 11%), and thromboplastin time (108 ± 11 vs. 68 ± 11%), and prolonged aPTT (28 ± 3 vs. 45 ± 8 s), INR (0.95 ± 0.06 vs. 1.25 ± 0.16), and thrombin clotting time (18 ± 2 vs. 20 ± 3 s). INTEM and EXTEM analyses revealed significantly prolonged clot formation times and reduced maximum clot firmness.

Conclusions/Significance

TPE with albumin replacement induces significant changes in global hemostasis parameters, potentially increasing bleeding risk. Pretransplant TPE should therefore be considered carefully in candidates for kidney transplantation. The role of ROTEM point-of-care testing in estimating bleeding risk in renal transplantation needs to be evaluated in further studies.

7.

Introduction

Acute traumatic coagulopathy has been associated with shock and tissue injury, and may be mediated via activation of the protein C pathway. Patients with acute traumatic coagulopathy have prolonged PT and PTT, and decreased activity of factors V and VIII; they are also hypocoagulable by thromboelastometry (ROTEM) and other viscoelastic assays. To test the etiology of this phenomenon, we hypothesized that such coagulopathy could be induced in vitro in healthy human blood with the addition of activated protein C (aPC).

Methods

Whole blood was collected from 20 healthy human subjects, and was “spiked” with increasing concentrations of purified human aPC (control, 75, 300, 2000 ng/mL). PT/PTT, factor activity assays, and ROTEM were performed on each sample. Mixed effect regression modeling was performed to assess the association of aPC concentration with PT/PTT, factor activity, and ROTEM parameters.

Results

In all subjects, increasing concentrations of aPC produced ROTEM tracings consistent with traumatic coagulopathy. ROTEM EXTEM parameters differed significantly by aPC concentration, with stepwise prolongation of clotting time (CT) and clot formation time (CFT), decreased alpha angle (α), impaired early clot formation (a10 and a20), and reduced maximum clot firmness (MCF). PT and PTT were significantly prolonged at higher aPC concentrations, with corresponding significant decreases in factor V and VIII activity.

Conclusion

A phenotype of acute traumatic coagulopathy can be induced in healthy blood by the in vitro addition of aPC alone, as evidenced by viscoelastic measures and confirmed by conventional coagulation assays and factor activity. This may lend further mechanistic insight to the etiology of coagulation abnormalities in trauma, supporting the central role of the protein C pathway. Our findings also represent a model for future investigations in the diagnosis and treatment of acute traumatic coagulopathy.

8.

Introduction

Impairment of fibrinolysis during sepsis is associated with worse outcome. Early identification of this condition could be of interest. The aim of this study was to evaluate whether a modified point-of-care viscoelastic hemostatic assay can detect sepsis-induced impairment of fibrinolysis and to correlate impaired fibrinolysis with morbidity and mortality.

Methods

This single center observational prospective pilot study was performed in an adult Intensive Care Unit (ICU) of a tertiary academic hospital. Forty consecutive patients admitted to the ICU with severe sepsis or septic shock were included. Forty healthy individuals served as controls. We modified conventional kaolin activated thromboelastography (TEG) adding urokinase to improve assessment of fibrinolysis in real time (UK-TEG). TEG, UK-TEG, plasminogen activator inhibitor (PAI)-1, thrombin-activatable fibrinolysis inhibitor (TAFI), d-dimer, DIC scores and morbidity (rated with the SOFA score) were measured upon ICU admission. Logistic regression was used to calculate odds ratios (ORs) and 95% confidence intervals (95% CIs) of mortality at ICU discharge.
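The odds ratios and confidence intervals mentioned above come from exponentiating logistic-regression coefficients. A sketch of that standard transformation (the coefficient and standard error below are illustrative, not the study's fitted model):

```python
from math import exp

def odds_ratio_ci(beta: float, se: float, z: float = 1.96):
    """Odds ratio and 95% CI from a logistic-regression coefficient
    and its standard error: OR = exp(beta), CI = exp(beta +/- z*SE)."""
    return exp(beta), exp(beta - z * se), exp(beta + z * se)

# Illustrative coefficient/SE chosen to be of the same magnitude as the
# Ly30 result reported below; not the study's actual fit.
or_, lo, hi = odds_ratio_ci(beta=-0.05, se=0.013)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # OR 0.95 (95% CI 0.93-0.98)
```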

Results

UK-TEG revealed greater impairment of fibrinolysis in sepsis patients than in healthy individuals, confirmed by PAI-1 levels. TAFI did not differ between sepsis patients and healthy individuals. Of the 40 sepsis patients, 18 had impaired fibrinolysis according to UK-TEG; these patients showed a higher SOFA score (8 (6–13) vs. 5 (4–7), p = 0.03), higher mortality (39% vs. 5%, p = 0.01), and higher markers of cellular damage (lactate, LDH, and bilirubin). Mortality at ICU discharge was predicted by the degree of fibrinolysis impairment measured by the UK-TEG Ly30 (%) parameter (OR 0.95, 95% CI 0.93–0.98, p = 0.003).

Conclusions

Sepsis-induced impairment of fibrinolysis detected by UK-TEG was associated with increased markers of cellular damage, morbidity, and mortality.

9.

Introduction

Initial lactate level, lactate clearance, C-reactive protein, and procalcitonin are associated with hospital mortality in critically ill patients with sepsis. However, no study has established which factor is most important for mortality in severe sepsis patients with lactic acidosis. We sought to clarify this issue in patients with lactic acidosis who were receiving sodium bicarbonate supplementation.

Materials and Methods

Data were collected at a single center between May 2011 and April 2014. One hundred nine patients with severe sepsis and lactic acidosis who were receiving sodium bicarbonate supplementation were included.

Results

The 7-day mortality rate was 71.6%. Survivors had higher albumin levels, lower SOFA and APACHE II scores, less vasopressor use, and lower follow-up lactate levels after the initial lactate measurement. In particular, a lactate clearance of at least 10% over the first 6, 24, and 48 hours of treatment was more common among survivors than non-survivors. Although patients treated with broad-spectrum antibiotics had greater illness severity than those who received conventional antibiotics, there was no significant difference in mortality. After adjusting for confounding variables, 6-, 24-, and 48-hour lactate clearance (HR: 4.000, 95% CI: 1.309–12.219, P = 0.015) and vasopressor use (HR: 4.156, 95% CI: 1.461–11.824, P = 0.008) were significantly associated with mortality.
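Lactate clearance at a time point is conventionally the percent fall from the initial value. A minimal sketch (values are illustrative, not study data):

```python
def lactate_clearance(initial: float, followup: float) -> float:
    """Percent lactate clearance; positive values mean lactate fell."""
    return (initial - followup) / initial * 100.0

# Illustrative: lactate falling from 8.0 to 6.0 mmol/L is 25% clearance,
# comfortably above a 10% threshold; a rise gives a negative value.
print(lactate_clearance(8.0, 6.0))  # 25.0
```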

Conclusions

Lactate clearance at a discrete time point appears to be a more reliable prognostic index than the initial lactate value in severe sepsis patients with lactic acidosis receiving sodium bicarbonate supplementation. Careful consideration of vasopressor use and initial application of broad-spectrum antibiotics within the first 48 hours may help improve survival; further study is warranted.

10.
11.

Background

The treatment efficacy and costs of anti-VEGF drugs have not been studied in routine clinical practice.

Objective

To compare treatment costs and clinical outcomes of the medications when adjusting for patients’ characteristics and clinical status.

Design

Comparative study.

Setting

The largest public ophthalmologic clinic in Switzerland.

Patients

Health care claims data of patients with age-related macular degeneration, diabetic macula edema and retinal vein occlusion were matched to clinical and outcome data.

Measurements

Patients' underlying condition, gender, age, visual acuity and retinal thickness at baseline and after completing the loading phase, total number of injections per treatment, visual outcome, and vital status were recorded.

Results

We included 315 patients (19,595 claims) with a follow-up of 1 to 99 months (mean 32.7, SD 25.8) covering the years 2006–2014. Mean age was 78 years (SD 9.3), and 200 patients (63.5%) were female. At baseline, the mean number of letters was 55.6 (SD 16.3) and the central retinal thickness was 400.1 μm (SD 110.1). Patients received a mean of 15.1 injections (SD 13.7; range 1 to 85). Compared to AMD, adjusted costs per month were significantly higher for patients with DME (+2174.88 CHF, 95% CI: 1094.50–3255.27; p<0.001), while costs per month for RVO were slightly but not significantly higher (+284.71 CHF, 95% CI: −866.73 to 1436.15; p = 0.627).

Conclusions

Patients with DME incur almost twice the costs of AMD and RVO patients, with the excess arising from non-ophthalmologic interventions. The currently licensed anti-VEGF medications did not differ in costs, injection frequency, or clinical outcomes. Linking health care claims to clinical data is a useful tool for examining routine clinical care.

12.

Introduction

A significant barrier to medical diagnostics in low-resource environments is the lack of medical care and equipment. Here we present a low-cost, cloud-connected digital microscope for applications at the point-of-care. We evaluate the performance of the device in the digital assessment of estrogen receptor-alpha (ER) expression in breast cancer samples. Studies suggest that computer-assisted analysis of tumor samples digitized with whole-slide scanners may be comparable to manual scoring; here, we study whether similar results can be obtained with the presented device.

Materials and Methods

A total of 170 samples of human breast carcinoma, immunostained for ER expression, were digitized with a high-end slide scanner and with the point-of-care microscope. Corresponding regions from the samples were extracted, and ER status was determined visually and digitally. Samples were classified as ER negative (<1% ER positivity) or positive, and further into weakly (1–10% positivity) and strongly positive. Interobserver agreement (Cohen's kappa) was measured, and correlation coefficients (Pearson's product-moment) were calculated for comparison of the methods.
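Cohen's kappa compares observed agreement between two raters with the agreement expected by chance. A stdlib sketch (the ER calls below are invented for illustration, not the study's data):

```python
from collections import Counter

def cohens_kappa(a, b):
    """Cohen's kappa for two raters scoring the same items:
    (p_observed - p_expected) / (1 - p_expected)."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n          # observed agreement
    ca, cb = Counter(a), Counter(b)
    pe = sum(ca[k] * cb[k] for k in set(ca) | set(cb)) / (n * n)  # chance
    return (po - pe) / (1 - pe)

# Illustrative negative/weak/strong ER calls from two scoring methods:
manual = ["neg", "neg", "weak", "strong", "strong", "weak", "neg", "strong"]
device = ["neg", "weak", "weak", "strong", "strong", "weak", "neg", "strong"]
print(round(cohens_kappa(manual, device), 3))  # 0.814
```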

Results

Results from the two devices showed strong correlation and interobserver agreement (r = 0.98, p < 0.001; kappa = 0.84, 95% CI = 0.75–0.94). Concordance between the point-of-care microscope and manual scoring was good (r = 0.94, p < 0.001; kappa = 0.71, 95% CI = 0.61–0.80), and comparable to the concordance between the slide scanner and manual scoring (r = 0.93, p < 0.001; kappa = 0.69, 95% CI = 0.60–0.78). There were 14 (8%) cases discrepant between manual and device-based scoring with the slide scanner and 16 (9%) with the point-of-care microscope, all representing samples with low ER expression.

Conclusions

Tumor ER status can be accurately quantified with a low-cost imaging device and digital image-analysis, with results comparable to conventional computer-assisted or manual scoring. This technology could potentially be expanded for other histopathological applications at the point-of-care.

13.

Introduction

Bilirubin is a well-recognized marker of hepatic dysfunction in intensive care unit (ICU) patients. Multiple organ failure often complicates the evolution of acute respiratory distress syndrome (ARDS) and is associated with high mortality. The effect of early hepatic dysfunction on ARDS mortality has been poorly investigated. We evaluated the incidence and prognostic significance of increased serum bilirubin levels in the initial phase of ARDS.

Methods

The data of 805 patients with ARDS were retrospectively analysed. This population was drawn from two recent multicenter, prospective, randomised trials. Patients with ARDS and a ratio of the partial pressure of arterial oxygen to the fraction of inspired oxygen < 150 mmHg, measured with a PEEP ≥ 5 cm of water, were included. Total serum bilirubin was measured at inclusion and at days 2, 4, 7, and 14. The primary objective was to analyse bilirubin at inclusion in relation to the 90-day mortality rate.

Results

The 90-day mortality rate was 33.8% (n = 272). Non-survivors were older, had higher Sepsis-related Organ Failure Assessment (SOFA) scores, and were more likely to have a medical diagnosis on admission than survivors. At inclusion, the SOFA score without the liver component (10.3±2.9 vs. 9.0±3.0, p<0.0001) and serum bilirubin levels (36.1±57.0 vs. 20.5±31.5 μmol/L, p<0.0001) were significantly higher in non-survivors than in survivors. Age, the hepatic SOFA score, the coagulation SOFA score, arterial pH, and plateau pressure were independently associated with 90-day mortality in patients with ARDS.

Conclusion

Serum bilirubin, used as a surrogate marker of hepatic dysfunction and measured early in the course of ARDS, was associated with the 90-day mortality rate.

14.

Background

To determine the effect of day 1 urinary excretion of cadmium (D1-UE-Cd) on mortality among patients admitted to a coronary care unit (CCU).

Methods

A total of 323 patients were enrolled in this 6-month study. Urine and blood samples were taken within 24 h after CCU admission. Demographic data, clinical diagnoses, and hospital mortality were recorded. The scores of established systems for prediction of mortality in critically ill patients were calculated.

Results

Compared with survivors (n = 289), non-survivors (n = 34) had higher levels of D1-UE-Cd. Stepwise multiple linear regression analysis indicated that D1-UE-Cd was positively associated with pulse rate and level of aspartate aminotransferase, but negatively associated with serum albumin level. Multivariate Cox analysis, with adjustment for other significant variables and measurements from mortality scoring systems, indicated that respiratory rate and D1-UE-Cd were independent and significant predictors of mortality. For each 1 μg/day increase of D1-UE-Cd, the hazard ratio for CCU mortality was 3.160 (95% confidence interval: 1.944–5.136, p < 0.001). The chi-square value of Hosmer-Lemeshow goodness-of-fit test for D1-UE-Cd was 10.869 (p = 0.213). The area under the receiver operating characteristic curve for D1-UE-Cd was 0.87 (95% confidence interval: 0.81–0.93).
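A hazard ratio quoted per 1 μg/day scales multiplicatively with the size of the increase (HR^k for a k-unit increase). A sketch of that arithmetic using the reported point estimate:

```python
def scaled_hr(hr_per_unit: float, units: float) -> float:
    """Hazard ratio for a multi-unit increase of a covariate, given the
    per-unit hazard ratio from a Cox model."""
    return hr_per_unit ** units

# Using the reported HR of 3.160 per 1 ug/day of D1-UE-Cd: a 2 ug/day
# increase corresponds to roughly a tenfold hazard.
print(round(scaled_hr(3.160, 2), 2))  # 9.99
```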

Conclusions

The D1-UE-Cd, an objective variable with no inter-observer variability, accurately predicted hospital mortality of CCU patients and outperformed other established scoring systems. Further studies are needed to determine the physiological mechanism of the effect of cadmium on mortality in CCU patients.

15.

Background

Formation of compact and poorly lysable clots has been reported in thromboembolic disorders. Little is known about clot properties in bleeding disorders.

Objectives

We hypothesized that more permeable and lysis-sensitive fibrin clots can be detected in women with heavy menstrual bleeding (HMB).

Methods

We studied 52 women with HMB of unknown cause and 52 age-matched control women. Plasma clot permeability (Ks), turbidity and efficiency of fibrinolysis, together with coagulation factors, fibrinolysis proteins, and platelet aggregation were measured.

Results

Women with HMB formed looser plasma fibrin clots (+16% [95% CI 7–18%] Ks) that displayed lower maximum absorbance (−7% [95% CI −9 to −1%] ΔAbsmax) and shorter clot lysis time (−17% [95% CI −23 to −11%] CLT). HMB patients and controls did not differ with regard to coagulation factors, fibrinogen, von Willebrand factor antigen, thrombin generation markers, or the proportion of subjects with defective platelet aggregation. Patients had lower platelet counts (−12% [95% CI −19 to −2%]), tissue plasminogen activator antigen (−39% [95% CI −41 to −29%] tPA:Ag), and plasminogen activator inhibitor-1 antigen (−28% [95% CI −38 to −18%] PAI-1:Ag) than controls. Multiple regression analysis, after adjustment for age, body mass index, glucose, and fibrinogen, showed that decreased tPA:Ag and shortened CLT were independent predictors of HMB.

Conclusions

Increased clot permeability and susceptibility to fibrinolysis are associated with HMB, suggesting that altered plasma fibrin clot properties might contribute to bleeding disorders of unknown origin.

16.

Background

Caring for children diagnosed with cancer affects parents' professional lives. The long-term impact, however, is not clear. We aimed to compare the employment situation of parents of long-term childhood cancer survivors with control parents of the general population, and to identify clinical and socio-demographic factors associated with parental employment.

Methods

As part of the Swiss Childhood Cancer Survivor Study, we sent a questionnaire to parents of survivors aged 5–15 years, who survived ≥5 years after diagnosis. Information on control parents of the general population came from the Swiss Health Survey (restricted to men and women with ≥1 child aged 5–15 years). Employment was categorized as not employed, part-time, and full-time employed. We used generalized ordered logistic regression to determine associations with clinical and socio-demographic factors. Clinical data was available from the Swiss Childhood Cancer Registry.

Results

We included 394 parent-couples of survivors and 3,341 control parents (1,731 mothers; 1,610 fathers). Mothers of survivors were more often not employed (29% versus 22%; ptrend = 0.007), but no differences between mothers remained in multivariable analysis. Fathers of survivors were more often employed full-time (93% versus 87%; ptrend = 0.002), which remained significant in multivariable analysis. Among parents of survivors, mothers with tertiary education (OR = 2.40, CI: 1.14–5.07) were more likely to be employed, and a migration background (OR = 3.63, CI: 1.71–7.71) increased the likelihood of full-time employment among mothers. Less likely to be employed were mothers of survivors diagnosed with lymphoma (OR = 0.31, CI: 0.13–0.73), mothers with >2 children (OR = 0.48, CI: 0.30–0.75), and fathers of survivors who had had a relapse (OR = 0.13, CI: 0.04–0.36).

Conclusion

The employment situation of parents of long-term survivors reflected traditional parenting roles. Specific support for parents with low education, additional children, or a child with more severe cancer could improve their long-term employment situation.

17.

Background

Little information is available on mortality and the risk factors for death in disseminated non-tuberculous mycobacterial infection (dNTM) among HIV-infected patients in the ART era.

Methods

In a single-center study, HIV-infected patients with dNTM and positive NTM cultures from sterile sites between 2000 and 2013 were analysed. Clinical characteristics at commencement of anti-mycobacterial treatment (baseline) were compared between survivors and non-survivors.

Results

Twenty-four patients were analyzed (median CD4 count 27/μL, range 2–185). Mycobacterium avium and M. intracellulare accounted for 20 (83%) and 3 (13%) of isolated NTM, respectively. NTM bacteremia was diagnosed in 15 (63%) patients. Seven (29%) patients died, and NTM bacteremia was significantly associated with mortality (p = 0.022). The baseline CD4 count was significantly lower in non-survivors than in survivors (median 7/μL versus 49/μL, p = 0.034). Concomitant AIDS-defining diseases or malignancies were not associated with mortality. Immune reconstitution syndrome (IRS) occurred in 19 (79%) patients (8 paradoxical, 11 unmasking), and prognosis tended to be better with unmasking IRS than in the other patients (n = 13) (p = 0.078). Patients with paradoxical IRS had a marginally lower CD4 count and a higher frequency of bacteremia than those with unmasking IRS (p = 0.051 and 0.059, respectively). Systemic corticosteroids were given to 63% and 55% of patients with paradoxical and unmasking IRS, respectively.

Conclusion

dNTM in HIV-infected patients carried high mortality even in the ART era. NTM bacteremia and a low CD4 count were risk factors for death, whereas patients presenting with unmasking IRS had a marginally better prognosis. IRS occurred in 79% of the patients, underscoring the difficulty of managing dNTM.

18.

Background

Rapid coronary recanalization following ST-elevation myocardial infarction (STEMI) requires effective anti-platelet and anti-thrombotic therapies. This study tested the impact of door to end of procedure (‘door-to-end’) time and baseline platelet activity on platelet inhibition within 24hours post-STEMI.

Methods and Findings

108 patients, treated with prasugrel and procedural bivalirudin, underwent Multiplate® platelet function testing at baseline and at 0, 1, 2, and 24 hours post-procedure. Major adverse cardiac events (MACE), bleeding, and stent thrombosis (ST) were recorded. Baseline ADP activity was high (88.3 U [71.8–109.0]), and procedural time and consequently bivalirudin infusion duration were short (median door-to-end time 55 minutes [40–70]; infusion duration 30 minutes [20–42]). Baseline ADP activity influenced all subsequent measurements of ADP activity, whereas door-to-end time influenced only ADP immediately post-procedure. High residual platelet reactivity (HRPR, ADP > 46.8 U) was observed in 75% of patients immediately post-procedure and persisted in 24% of patients at 2 hours. Five patients suffered in-hospital MACE (4.6%). Acute ST occurred in 4 patients, all <120 minutes post-procedure and all with HRPR. No significant bleeding was observed. In a post-hoc analysis, pre-procedural morphine use was associated with significantly higher ADP activity following intervention.
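The HRPR definition used here (ADP-test result above 46.8 U) is a simple threshold rule over the Multiplate readings. A minimal sketch (the readings below are invented for illustration):

```python
HRPR_THRESHOLD_U = 46.8  # ADP-test cutoff used in the study

def hrpr_rate(adp_values):
    """Fraction of patients with high residual platelet reactivity,
    i.e. ADP-test values strictly above the threshold."""
    flagged = [v for v in adp_values if v > HRPR_THRESHOLD_U]
    return len(flagged) / len(adp_values)

# Illustrative post-procedure ADP readings (U), not study data; only the
# values strictly above 46.8 U count as HRPR.
readings = [88.3, 52.1, 46.8, 30.0]
print(hrpr_rate(readings))  # 0.5
```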

Conclusions

Baseline platelet function, time to STEMI treatment, and opiate use all significantly influence immediate post-procedural platelet activity.

19.

Aim

All-trans retinoic acid combined with anthracycline-based chemotherapy is the standard regimen for acute promyelocytic leukemia, and the advent of arsenic trioxide has improved anti-leukemic efficacy. The objective of the current study was to evaluate whether dual induction with all-trans retinoic acid and arsenic trioxide accelerates recovery from the coagulation and fibrinolysis abnormalities of acute promyelocytic leukemia.

Methods

A retrospective analysis was performed on 103 newly diagnosed patients with acute promyelocytic leukemia. Hemostatic variables and blood component consumption were compared among patients treated with induction regimens with or without arsenic trioxide.

Results

Compared to patients with other subtypes of de novo acute myeloid leukemia, patients with acute promyelocytic leukemia had lower platelet counts and fibrinogen levels, significantly prolonged prothrombin times, and elevated D-dimers (P<0.001). Patients with high- or intermediate-risk prognostic stratification presented with lower initial fibrinogen levels than the low-risk group (P<0.05). After induction treatment, the abnormal coagulation and fibrinolysis of patients with acute promyelocytic leukemia improved significantly before day 10. Recovery of abnormal hemostatic variables (platelet count, prothrombin time, fibrinogen, and D-dimer) was not significantly accelerated by adding arsenic trioxide to induction regimens, nor did the consumption of transfused blood components (platelets and plasma) change appreciably. Patients with high- or intermediate-risk prognostic stratification had higher platelet transfusion demands than the low-risk group (P<0.05).

Conclusions

Unexpectedly, adding arsenic trioxide did not accelerate recovery from the coagulation and fibrinolysis abnormalities in acute promyelocytic leukemia patients who received all-trans retinoic acid combined with chemotherapy.

20.

Background

The analysis of heart rate variability (HRV) has been shown to be a promising non-invasive technique for assessing cardiac autonomic modulation in trauma. The aim of this study was to evaluate HRV during hemorrhagic shock and fluid resuscitation, in comparison with traditional hemodynamic and metabolic parameters.

Methods

Twenty anesthetized and mechanically ventilated pigs were subjected to hemorrhagic shock (60% of estimated blood volume) and observed for 60 minutes without fluid replacement. Surviving animals were treated with Ringer solution and evaluated for an additional 180 minutes. HRV metrics (time and frequency domain), together with hemodynamic and metabolic parameters, were evaluated in surviving and non-surviving animals.
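Time-domain HRV metrics are computed directly from the RR-interval series; which exact metrics the study used beyond "time domain" is not specified, so SDNN and RMSSD below are representative examples only (population SD is used for SDNN here; clinical tools may use the sample SD):

```python
from math import sqrt
from statistics import pstdev

def sdnn(rr_ms):
    """SDNN: standard deviation of RR intervals (ms), a global
    time-domain HRV measure (population SD used in this sketch)."""
    return pstdev(rr_ms)

def rmssd(rr_ms):
    """RMSSD: root mean square of successive RR differences (ms),
    a short-term time-domain HRV measure."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return sqrt(sum(d * d for d in diffs) / len(diffs))

# Illustrative RR series (ms); a perfectly flat series has zero
# variability by both measures.
print(sdnn([800, 800, 800, 800]), rmssd([800, 800, 800, 800]))  # 0.0 0.0
```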

Results

Seven of the 20 animals died during hemorrhage and initial fluid resuscitation. All animals showed an increase in time-domain HRV measures during hemorrhage, and fluid resuscitation restored baseline values. Although not significantly, normalized low-frequency power and the LF/HF ratio decreased during the early stages of hemorrhage, recovered baseline values later during hemorrhagic shock, and increased after fluid resuscitation. At 30 minutes after hemorrhagic shock, non-surviving animals presented significantly lower mean arterial pressure (43±7 vs. 57±9 mmHg, P<0.05) and cardiac index (1.7±0.2 vs. 2.6±0.5 L/min/m2, P<0.05), higher plasma lactate (7.2±2.4 vs. 3.7±1.4 mmol/L, P<0.05) and potassium (5.3±0.6 vs. 4.2±0.3 mmol/L, P<0.05), and a more negative base excess (−6.8±3.3 vs. −2.3±2.8 mmol/L, P<0.05) than surviving animals.

Conclusions

HRV increased early during hemorrhage, but none of the evaluated HRV metrics was able to discriminate survivors from non-survivors during hemorrhagic shock. Metabolic and hemodynamic variables reflected the severity of hemorrhagic shock more reliably than HRV metrics.
