Similar Articles
20 similar articles retrieved.
1.
It is unclear whether the duration of delayed graft function (DGF) is associated with kidney transplant (KT) outcomes. This study investigated the impact of prolonged DGF on patient and graft survival and on renal function one year after KT. This single-center retrospective analysis included all deceased-donor KTs performed between January 1998 and December 2008 (n = 1412). Patients were grouped in quartiles according to the duration of DGF (1–5, 6–10, 11–15, and >15 days, designated as prolonged DGF). The overall incidence of DGF was 54.2%. Prolonged DGF was associated with retransplantation (OR 2.110, 95% CI 1.064–4.184, p = 0.033) and more than 3 HLA mismatches (OR 1.819, 95% CI 1.117–2.962, p = 0.016). The incidence of acute rejection was higher in patients with DGF than in those without DGF (36.2% vs. 12.2%, p<0.001). Compared with patients without DGF, DGF(1–5), DGF(6–10), and DGF(11–15), patients with prolonged DGF showed inferior one-year patient survival (95.2% vs. 95.4% vs. 95.5% vs. 93.4% vs. 88.86%, p = 0.003), graft survival (91% vs. 91.4% vs. 92% vs. 88.7% vs. 70.5%, p<0.001), death-censored graft survival (95.7% vs. 95.4% vs. 96.4% vs. 94% vs. 79.3%, p<0.001), and creatinine clearance (58.0±24.6 vs. 55.8±22.2 vs. 53.8±24.1 vs. 53.0±27.2 vs. 36.8±27.0 mL/min, p<0.001), respectively. Multivariable analysis showed that prolonged DGF was an independent risk factor for graft loss (OR 3.876, 95% CI 2.270–6.618, p<0.001), death-censored graft loss (OR 4.103, 95% CI 2.055–8.193, p<0.001), and death (OR 3.065, 95% CI 1.536–6.117, p = 0.001). Prolonged DGF, which was itself associated with retransplantation and a higher number of HLA mismatches, predicted inferior renal function and inferior patient and graft survival at one year.
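The grouping and multivariable analysis described in this abstract can be illustrated with a minimal sketch (not the authors' code). The file name, column names and covariates below are hypothetical and only a subset of what such a model would normally adjust for.

```python
# Hedged sketch: bin DGF duration into the abstract's groups and fit a
# multivariable logistic regression for one-year graft loss.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("kidney_tx.csv")  # hypothetical dataset

# Bins as in the abstract: 0 days (no DGF), 1-5, 6-10, 11-15, >15 (prolonged)
bins = [-1, 0, 5, 10, 15, float("inf")]
labels = ["no_DGF", "1-5", "6-10", "11-15", "prolonged"]
df["dgf_group"] = pd.cut(df["dgf_days"], bins=bins, labels=labels)

# Graft loss at 1 year vs. DGF group, adjusted for illustrative covariates
model = smf.logit(
    "graft_loss_1y ~ C(dgf_group, Treatment('no_DGF')) + retransplant + hla_mm_gt3",
    data=df,
).fit()

# Odds ratios with 95% confidence intervals
or_table = np.exp(model.conf_int())
or_table["OR"] = np.exp(model.params)
print(or_table)
```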

2.

Background

Elderly patients with end-stage renal disease have become the fastest growing population of kidney transplant candidates in recent years. However, the risk factors associated with long-term outcomes in these patients remain unclear.

Methods

We retrospectively analyzed 166 recipients aged 60 years or older who underwent primary deceased kidney transplantation between 2002 and 2013 in our center. The main outcomes included 1-, 3- and 5-year patient survival as well as overall and death-censored graft survival. The independent risk factors affecting graft and patient survival were analyzed using Cox regression analysis.

Results

The 1-, 3-, 5-year death-censored graft survival rates were 93.6%, 89.4% and 83.6%, respectively. Based on the Cox multivariate analysis, panel reactive antibody (PRA)>5% [hazard ratio (HR) 4.295, 95% confidence interval (CI) 1.321–13.97], delayed graft function (HR 4.744, 95% CI 1.611–13.973) and acute rejection (HR 4.971, 95% CI 1.516–16.301) were independent risk factors for graft failure. The 1-, 3-, 5-year patient survival rates were 84.8%, 82.1% and 77.1%, respectively. Longer dialysis time (HR 1.011 for 1-month increase, 95% CI 1.002–1.020), graft loss (HR 3.501, 95% CI 1.559–7.865) and low-dose ganciclovir prophylaxis (1.5 g/d for 3 months) (HR 3.173, 95% CI 1.063–9.473) were risk factors associated with patient death.

Conclusions

The five-year results show excellent graft and patient survival in elderly kidney transplant recipients aged ≥60 years. PRA >5%, delayed graft function, and acute rejection are risk factors for graft failure, while a longer duration of dialysis, graft loss, and low-dose ganciclovir prophylaxis are risk factors for mortality in elderly recipients. These factors represent potential targets for interventions aimed at improving graft and patient survival in elderly recipients.

3.

Introduction

Female gender is a known risk factor for early and late mortality after coronary artery bypass graft surgery (CABG). Higher age of women at operation may influence outcome, since age per se is also an important risk factor. The purpose of our study was to analyze possible gender differences in outcome after isolated CABG in different age groups to delineate the impact of female gender and age.

Methods

All patients over 60 years of age undergoing isolated CABG at our department between 2001 and 2011 were included, categorized by age into sexagenarians (2266, 16.6% women), septuagenarians (2332, 25.4% women) and octogenarians (374, 32% women), and assessed by gender for 30-day and 180-day mortality.

Results

Thirty-day mortality was significantly higher in women only amongst septuagenarians (7.1 vs. 4.7%, p = 0.033). The same difference applied to 180-day mortality (12.3 vs. 8.2%, p = 0.033) and estimated one-year survival (81.6 ± 4.2 vs. 86.9 ± 2.2%, p = 0.001). Predictive factors for 30-day mortality in septuagenarians were logistic EuroSCORE (ES) (p = 0.003), perioperative myocardial infarction (MI) (p<0.001), pneumonia (p<0.001), abnormal LV function (p<0.04) and use of a LIMA graft (p<0.001), but not female gender. However, female gender was found to be an independent predictor of 180-day mortality (HR 1.632, p = 0.001) in addition to ES, use of a LIMA graft, perioperative MI, pneumonia and abnormal LV function (HR 1.013, p = 0.004; HR 0.523, p<0.001; HR 2.710, p<0.001; HR 3.238, p<0.001; HR 2.013, p<0.001).

Conclusion

Among septuagenarians, women have a higher observed probability of early death after CABG. However, female gender was an independent risk factor for 180-day, but not 30-day, mortality. Therefore, reduction of high-impact risk factors such as perioperative MI and greater use of the LIMA should be future goals. In view of our findings, the decision for surgical revascularization should not be based on gender.

4.
Introduction

Delayed graft function (DGF) is a prevalent clinical problem in renal transplantation for which there is no objective system to predict occurrence in advance. It can substantially increase the need for hospitalisation post-transplant and is a significant risk factor for other post-transplant complications.

Methodology

MicroRNAs (miRNAs), a specific subclass of small RNAs, have been clearly shown to influence many pathways in health and disease. To investigate the influence of miRNAs on renal allograft performance post-transplant, the expression of a panel of miRNAs in pre-transplant renal biopsies was measured using qPCR. Expression was then related to clinical parameters and outcomes in two independent renal transplant cohorts.

Results

In two independent cohorts of pre-implantation human renal allograft biopsies, a novel pre-transplant renal performance scoring system (GRPSS) determined the occurrence of DGF with high sensitivity (>90%) and specificity (>60%) for donor allografts pre-transplant, using just three senescence-associated microRNAs combined with donor age and type of organ donation.

Conclusion

These results demonstrate a relationship between pre-transplant microRNA expression levels, cellular biological ageing pathways and clinical outcomes in renal transplantation. They provide a simple, rapid, quantitative molecular pre-transplant assay to predict post-transplant allograft function, and scope for future intervention. Furthermore, they demonstrate the involvement of senescence pathways in ischaemic injury during organ transplantation and indicate accelerated bio-ageing as a consequence of both warm and cold ischaemia.
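The abstract does not give the GRPSS formula itself, so the sketch below only illustrates, under assumed inputs, one generic way three miRNA qPCR values plus donor age and donation type could be combined into a risk score and evaluated at the reported sensitivity level. The file and column names are invented; this is not the authors' scoring system.

```python
# Illustrative only: combine miRNA qPCR values with donor covariates into a
# logistic risk score for DGF and read off specificity at 90% sensitivity.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_curve

biopsies = pd.read_csv("pre_implant_biopsies.csv")  # hypothetical cohort

X = biopsies[["mir_a_dct", "mir_b_dct", "mir_c_dct", "donor_age", "dcd_donor"]]
y = biopsies["dgf"]  # 1 = delayed graft function occurred

clf = LogisticRegression(max_iter=1000).fit(X, y)
score = clf.predict_proba(X)[:, 1]  # continuous risk score

# Pick the threshold reaching >=90% sensitivity, then report specificity there
fpr, tpr, thresholds = roc_curve(y, score)
idx = np.argmax(tpr >= 0.90)
print(f"threshold={thresholds[idx]:.3f}  sensitivity={tpr[idx]:.2f}  specificity={1 - fpr[idx]:.2f}")
```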

5.

Introduction

Acute kidney injury is associated with a poor prognosis in acute liver failure, but little is known about outcomes in patients undergoing transplantation for acute liver failure who require renal replacement therapy.

Methods

A retrospective analysis of the United Kingdom Transplant Registry was performed (1 January 2001–31 December 2011) with patient and graft survival determined using Kaplan-Meier methods. Cox proportional hazards models were used together with propensity-score based full matching on renal replacement therapy use.

Results

Three-year patient and graft survival for patients receiving renal replacement therapy were 77.7% and 72.6%, compared with 85.1% and 79.4% for those not requiring renal replacement therapy (P<0.001 and P = 0.009 respectively, n = 725). In a Cox proportional hazards model, renal replacement therapy was a predictor of patient death (hazard ratio (HR) 1.59, 95% CI 1.01–2.50, P = 0.044) but not graft loss (HR 1.39, 95% CI 0.92–2.10, P = 0.114). In groups fully matched on baseline covariates, those not receiving renal replacement therapy who had a serum creatinine greater than 175 μmol/L had a significantly higher risk of graft failure than those receiving renal replacement therapy.

Conclusion

In patients being transplanted for acute liver failure, use of renal replacement therapy is a strong predictor of patient death and graft loss. Those not receiving renal replacement therapy with an elevated serum creatinine may be at greater risk of early graft failure than those receiving renal replacement therapy. A low threshold for instituting renal replacement therapy may therefore be beneficial.
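A minimal sketch of the Cox proportional hazards step described above, assuming a hypothetical registry extract with the listed (numeric) columns. The propensity-score full matching used in the study is not reproduced here; only the regression step is shown.

```python
# Hedged sketch: Cox model for patient death with RRT as the exposure.
# Columns are hypothetical; rrt and donor_dcd are assumed to be 0/1 indicators.
import pandas as pd
from lifelines import CoxPHFitter

reg = pd.read_csv("uk_transplant_alf.csv")  # hypothetical registry extract

cph = CoxPHFitter()
cph.fit(
    reg[["time_to_death_days", "died", "rrt", "recipient_age", "creatinine", "donor_dcd"]],
    duration_col="time_to_death_days",
    event_col="died",
)
cph.print_summary()  # the coefficient on 'rrt' plays the role of the reported HR for death
```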

6.
Background

Immunological non-response (INR) despite virological suppression is associated with AIDS-defining events/death (ADE). Little is known about its association with serious non-AIDS-defining events (nADE).

Methods

Patients starting highly active antiretroviral therapy (HAART) with <200 CD4+ cells/μl and achieving HIV-RNA <50 copies/ml within 12 (±3) months were categorized as INR if the CD4+ T-cell count at year 1 remained <200 cells/μl. Predictors of nADE (malignancies, severe infections, renal failure, i.e. estimated glomerular filtration rate <30 ml/min, cardiovascular events and liver decompensation) were assessed using multivariable Cox models. Follow-up was right-censored in case of HAART discontinuation or confirmed HIV-RNA >50 copies/ml.

Results

1221 patients were observed for a median of 3 (IQR: 1.3-6.1) years. Pre-HAART CD4+ counts were 77 cells/μl (IQR: 28-142), and 56% of patients had experienced an ADE. After 1 year, CD4+ counts increased to 286 cells/μl (IQR: 197-387), but 26.1% of patients were INR. Thereafter, 86 nADE (30.2% malignancies, 27.9% infectious, 17.4% renal, 17.4% cardiovascular, 7% hepatic) were observed, corresponding to an incidence of 1.83 events (95% CI: 1.73-2.61) per 100 PYFU. After adjusting for measurable confounders, INR patients had a significantly greater risk of nADE (HR 1.65; 95% CI: 1.06-2.56). Older age (per year, HR 1.03; 95% CI: 1.01-1.05), hepatitis C co-infection (HR 2.09; 95% CI: 1.19-3.7), a history of previous nADE (HR 2.16; 95% CI: 1.06-4.4) and the occurrence of ADE during follow-up (HR 2.2; 95% CI: 1.15-4.21) were other independent predictors of newly diagnosed nADE.

Conclusions

Patients failing to restore CD4+ counts to >200 cells/μl run a greater risk of serious nADE, which is intertwined with, or predicted by, AIDS progression. Improved management of this fragile population and innovative therapies able to induce immune reconstitution are urgently needed. Our results also underscore the importance of earlier diagnosis and HAART initiation.

7.
Background

Femoral fragility fractures are among the most common injuries managed by orthopedic surgeons. Malnutrition contributes to the poor outcomes observed in this population. Our purpose was to assess annual trends in malnutrition diagnosis and to determine risk factors for malnutrition and complications in patients 65 years and older presenting with femoral fragility fractures. We hypothesized that malnutrition would increase the risk of postoperative wound infection, wound dehiscence, non-union, and mortality.

Methods

The PearlDiver database was reviewed from 2010 to 2020. Patients aged ≥65 years with femur fractures treated with operative fixation were identified by CPT code. A preoperative diagnosis of a malnourished state was defined by ICD-9 and ICD-10 codes, and patients were divided into malnourished and non-malnourished cohorts. Patients were tracked for one year following operative fixation of a femoral fragility fracture for the occurrence of infection, wound dehiscence, non-union and mortality. The rates of these complications were compared between the malnourished and non-malnourished cohorts.

Results

There were 178,283 femoral fragility fractures identified in patients aged 65 years or older. The overall prevalence of malnutrition diagnosis in this geriatric population was 12.8%. Documented malnutrition in femoral fragility fractures increased from 1.6% to 32.9% between 2010 and 2020 (P<0.0001). Compared with patients without malnutrition, patients with malnutrition were at increased risk of mortality (OR 1.31, 95% CI 1.2558–1.3752, p < 0.0001) and were more likely to develop a wound infection (OR 1.49; 95% CI 1.252–1.7626; p < 0.0001), wound dehiscence (OR 1.55; 95% CI 1.3416–1.7949; p < 0.0001), and non-union (OR 1.89; 95% CI 1.6946–2.1095; p < 0.0001). Multiple demographic variables were associated with malnutrition diagnosis, including higher age, higher Charlson Comorbidity Index, female sex, dementia, and institutionalization. Parkinson's disease, feeding difficulty and institutionalization carried the highest risk of malnutrition.

Conclusion

Malnutrition diagnosis significantly increases the risk of adverse medical events in elderly adults with femoral fragility fractures. The rate of documented malnutrition increased steadily from 2010 to 2020, likely reflecting increased awareness of and testing for malnutrition rather than an actual increase in prevalence. Multiple expected demographic variables are associated with a diagnosis of malnutrition.

Level of Evidence: III
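For readers unfamiliar with how odds ratios like those above are derived, the worked example below computes an OR and its 95% CI from a 2×2 table using the standard log-OR standard error. The counts are invented for illustration and are not the study's data.

```python
# Worked example: odds ratio with Woolf 95% confidence interval from a 2x2 table.
import math

# rows: malnourished / not malnourished; columns: wound infection yes / no (hypothetical counts)
a, b = 120, 880
c, d = 700, 9300

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```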

8.
Background

The early prediction of delayed graft function (DGF) would facilitate patient management after kidney transplantation.

Methods

In a single-centre retrospective analysis, we investigated kinetic estimated GFR under non-steady-state conditions (KeGFR) for the prediction of DGF. KeGFRsCr was calculated at 4 h, 8 h and 12 h in 56 recipients of deceased donor kidneys from initial serum creatinine (sCr) concentrations, estimated creatinine production rate, volume of distribution, and the difference between consecutive sCr values. The utility of KeGFRsCr for DGF prediction was compared with sCr, plasma cystatin C (pCysC), and KeGFRpCysC similarly derived from pCysC concentrations.

Results

At 4 h, the KeGFRsCr area under the receiver operating characteristic curve (AUC) for DGF prediction was 0.69 (95% CI: 0.56–0.83), while sCr was not useful (AUC 0.56, CI: 0.41–0.72). Integrated discrimination improvement analysis showed that KeGFRsCr improved a validated clinical prediction model at 4 h, 8 h, and 12 h, increasing the AUC from 0.68 (0.52–0.83) to 0.88 (0.78–0.99) at 12 h (p = 0.01). KeGFRpCysC also improved DGF prediction. In contrast, sCr provided no improvement at any time point.

Conclusions

Calculation of KeGFR from sCr facilitates early prediction of DGF within 4 hours of renal transplantation.
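The abstract does not spell out the KeGFR formula. The sketch below follows the published kinetic eGFR equation (Chen, J Am Soc Nephrol 2013), which KeGFRsCr appears to be based on, using an externally estimated creatinine production rate and a simple volume-of-distribution approximation. All numbers are illustrative and the study's exact implementation may differ.

```python
# Hedged sketch of a kinetic eGFR calculation from two consecutive creatinine values.
def kinetic_egfr(cr1, cr2, hours_between, crprod_mg_per_day, weight_kg):
    """cr1, cr2 in mg/dL; crprod estimated externally; returns KeGFR in mL/min."""
    vd_dl = 0.6 * weight_kg * 10                  # ~total body water as creatinine Vd, in dL
    max_rise_per_day = crprod_mg_per_day / vd_dl  # max possible sCr rise (mg/dL/day) if clearance were 0
    mean_cr = (cr1 + cr2) / 2
    # clearance that would hold at the mean sCr under steady state (dL/day -> mL/min via /14.4)
    steady_state_clearance = crprod_mg_per_day / (14.4 * mean_cr)
    return steady_state_clearance * (1 - 24 * (cr2 - cr1) / (hours_between * max_rise_per_day))

# Example: creatinine rising from 1.5 to 1.8 mg/dL over the first 4 h post-transplant
print(round(kinetic_egfr(cr1=1.5, cr2=1.8, hours_between=4,
                         crprod_mg_per_day=1400, weight_kg=70), 1))
```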

9.
Background

Public and scientific concerns about the social gradient of end-stage renal disease and access to renal replacement therapies are increasing. This study investigated the influence of social inequalities on (i) access to the renal transplant waiting list, (ii) access to renal transplantation and (iii) patient survival.

Methods

All incident adult patients with end-stage renal disease who lived in Bretagne, a French region, and started dialysis during the 2004–2009 period were geocoded to census blocks. Each census block was assigned a level of neighborhood deprivation and a degree of urbanization. Cox proportional hazards models were used to identify factors associated with each study outcome.

Results

Patients living in neighborhoods with a low level of deprivation had a higher chance of being placed on the waiting list and a lower risk of death (HR = 1.40, 95% CI 1.1–1.7; HR = 0.82, 95% CI 0.7–0.98), but these associations did not remain after adjustment for the patients' clinical features. The likelihood of receiving renal transplantation after being waitlisted was not associated with neighborhood deprivation in univariate or multivariate analyses.

Conclusions

In a mixed rural and urban French region, patients living in deprived or advantaged neighborhoods had the same chance of being placed on the waiting list and of undergoing renal transplantation. They also showed the same mortality risk once their clinical features were taken into account.

10.
Background

The purpose of this study was to determine risk factors for blood transfusion in primary anatomic and reverse total shoulder arthroplasty (TSA) performed for osteoarthritis.

Methods

Patients who underwent anatomic or reverse TSA for a diagnosis of primary osteoarthritis were identified in a national surgical database from 2005 to 2018 using both CPT and ICD-9/ICD-10 codes. Univariate analysis compared the transfused and non-transfused cohorts for differences in comorbidities and demographics. Independent risk factors for perioperative blood transfusion were identified via multivariate regression models.

Results

305 transfused and 18,124 non-transfused patients were identified. Female sex (p<0.001), age >85 years (p=0.001), insulin-dependent diabetes mellitus (p=0.001), dialysis dependence (p=0.001), acute renal failure (p=0.012), hematologic disorders (p=0.010), disseminated cancer (p<0.001), ASA ≥3 (p<0.001), and functional dependence (p=0.001) were independent risk factors for blood transfusion on multivariate logistic regression analysis.

Conclusion

Several independent risk factors for blood transfusion following anatomic/reverse TSA for osteoarthritis were identified. Awareness of these risk factors can help surgeons and perioperative care teams identify and optimize high-risk patients to decrease transfusion requirements and their associated complications in this patient population.

Level of Evidence: III

11.

Background

Reactivation of cytomegalovirus (CMV) has been reported occasionally in immunocompetent patients in the intensive care unit (ICU). The epidemiology and association of CMV infection with adverse outcomes are not well defined in this population. Patients undergoing major heart surgery (MHS) are at particularly high risk of infection. CMV infection has not been systematically monitored in MHS-ICU patients.

Methods

We assessed CMV plasma viremia weekly using a quantitative polymerase chain reaction assay in a prospective cohort of immunocompetent adults admitted to the MHS-ICU for at least 72 hours between October 2012 and May 2013. Risk factors for CMV infection and its potential association with continued hospitalization or death by day 30 (composite endpoint) were assessed using univariate and multivariate logistic regression analyses.

Results

CMV viremia at any level was recorded in 16.5% of patients at a median of 17 days (range, 3-54 days) after admission to the MHS-ICU. Diabetes (adjusted OR, 5.6; 95% CI, 1.8-17.4; p=0.003) and transfusion requirement (>10 units) (adjusted OR, 13.7; 95% CI, 3.9-47.8; p<0.001) were independent risk factors associated with CMV reactivation. Reactivation of CMV at any level was independently associated with the composite endpoint (adjusted OR, 12.1; 95% CI, 2.3-64; p=0.003).

Conclusion

Reactivation of CMV is relatively frequent in immunocompetent patients undergoing MHS and is associated with prolonged hospitalization or death.

12.
Purpose

To assess the prevalence of myopia and identify associated risk factors in urban school children.

Methods

This was a cross-sectional study screening children for subnormal vision and refractive errors in Delhi. Vision was tested by trained health workers using ETDRS charts. A risk factor questionnaire was completed for children with vision <6/9.5, for children wearing spectacles, and for a subset (10%) of randomly selected children with normal vision. All children with vision <6/9.5 underwent cycloplegic refraction. The prevalence of myopia (< −0.5 diopters) was assessed. The association between risk factors and myopia was analyzed for children with myopia and randomly selected non-myopic children, and adjusted odds ratios for all risk factors were estimated.

Results

A total of 9884 children were screened, with a mean age of 11.6 ± 2.2 years; 66.8% were boys. The prevalence of myopia was 13.1%, with only 320 children (24.7%) wearing appropriate spectacles. The mean myopic spherical error was −1.86 ± 1.4 diopters. The prevalence of myopia was higher in private schools than in government schools (p<0.001), in girls vs. boys (p = 0.004) and among older (>11 years) children (p<0.001). There was a positive association of myopia with studying in private schools vs. government schools (p<0.001), a positive family history (p<0.001) and higher socio-economic status (p = 0.037). A positive association with myopia was also observed for children studying/reading >5 hours per day (p<0.001), watching television >2 hours per day (p<0.001) and playing computer/video/mobile games (p<0.001). An inverse association was observed with outdoor activities/playing for children playing >2 hours per day.

Conclusion

Myopia is a major health problem among Indian school children. It is important to identify modifiable risk factors associated with its development and to develop cost-effective intervention strategies.

13.

Introduction

Anemia and renal impairment are important co-morbidities among patients with coronary artery disease undergoing Percutaneous Coronary Intervention (PCI). Disease progression to eventual death can be understood as the combined effect of baseline characteristics and intermediate outcomes.

Methods

Using data from a prospective cohort study, we investigated clinical pathways reflecting the transitions from PCI through intermediate ischemic or hemorrhagic events to all-cause mortality in a multi-state analysis as a function of anemia (hemoglobin concentration <120 g/l and <130 g/l, for women and men, respectively) and renal impairment (creatinine clearance <60 ml/min) at baseline.

Results

Among 6029 patients undergoing PCI, anemia and renal impairment were observed in isolation or in combination in 990 (16.4%), 384 (6.4%), and 309 (5.1%) patients, respectively. The most frequent transition was from PCI to death (6.7%, 95% CI 6.1–7.3), followed by ischemic events (4.8%, 95% CI 4.3–5.4) and bleeding (3.4%, 95% CI 3.0–3.9). Among patients with both anemia and renal impairment, the risk of death was increased approximately 4-fold compared with the reference group (HR 3.9, 95% CI 2.9–5.4) and roughly doubled compared with patients with either anemia (HR 1.7, 95% CI 1.3–2.2) or renal impairment (HR 2.1, 95% CI 1.5–2.9) alone. Hazard ratios indicated an increased risk of bleeding in all three groups compared with patients with neither anemia nor renal impairment.

Conclusions

Applying a multi-state model, we found evidence of a gradient of risk for the composite of bleeding, ischemic events, or death as a function of baseline hemoglobin and estimated glomerular filtration rate.
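One common way to implement the kind of multi-state analysis described above is to fit a transition-specific Cox model for each transition (PCI to bleeding, PCI to ischemic event, PCI to death, and so on). The sketch below assumes a hypothetical long-format dataset with one row per patient per possible transition; it is not the authors' code, and the study's actual model may differ.

```python
# Hedged sketch: transition-specific Cox models on long-format multi-state data.
# Columns are hypothetical; anemia, renal_impairment, anemia_and_renal are 0/1 indicators.
import pandas as pd
from lifelines import CoxPHFitter

long_fmt = pd.read_csv("pci_transitions_long.csv")  # one row per patient per transition

models = {}
for trans, sub in long_fmt.groupby("transition"):   # e.g. "pci->death", "pci->bleeding"
    cph = CoxPHFitter()
    cph.fit(
        sub[["time", "occurred", "anemia", "renal_impairment", "anemia_and_renal"]],
        duration_col="time",
        event_col="occurred",   # 1 if this particular transition happened
    )
    models[trans] = cph
    print(trans)
    cph.print_summary()
```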

14.
Although renal dysfunction, especially mild impairment (60<eGFR<90 ml/min), has often been described in the HIV-infected population, its potential contribution to HIV disease evolution and to the risk of cerebro-cardiovascular disease (CCVD) has not been clarified. We analysed data from HIV-1-infected patients enrolled in the Italian Cohort of Antiretroviral-Naïve (Icona) Foundation Study, collected between January 2000 and February 2014, with at least two creatinine values available. eGFR was estimated with the CKD-EPI equation, and renal dysfunction was defined using a priori cut-offs of 60 (severely impaired) and 90 ml/min/1.73 m2 (mildly impaired). Characteristics of patients were described after stratification into these groups and compared using the chi-square test (categorical variables) or the Kruskal-Wallis test (median values). Follow-up accrued from baseline up to the date of a CCVD or AIDS-related event, death, or the last available visit. Kaplan-Meier curves were used to estimate the cumulative probability of occurrence of the events over time. Adjusted analyses were performed using a proportional hazards Cox regression model. We included 7,385 patients, observed for a median follow-up of 43 months (inter-quartile range [IQR]: 21-93 months). Over this time, 130 cerebro-cardiovascular events (including 11 deaths due to CCVD) and 311 AIDS-related events (including 45 deaths) were observed. The rate of CCVD events among patients with eGFR >90, 60-89 and <60 ml/min was 2.91 (95% CI 2.30-3.67), 4.63 (95% CI 3.51-6.11) and 11.9 (95% CI 6.19-22.85) per 1,000 PYFU, respectively, with an unadjusted hazard ratio (HR) of 4.14 (95% CI 2.07-8.29) for patients with eGFR <60 ml/min and 1.58 (95% CI 1.10-2.27) for eGFR 60-89, compared with those with eGFR ≥90. Of note, these estimates are adjusted for traditional cardiovascular risk factors (e.g. smoking, diabetes, hypertension, dyslipidemia). The incidence of AIDS-related events was 9.51 (95% CI 8.35-10.83), 6.04 (95% CI 4.74-7.71) and 25.0 (95% CI 15.96-39.22) per 1,000 PYFU among patients with eGFR >90, 60-89 and <60 ml/min, respectively, with an unadjusted HR of 2.49 (95% CI 1.56-3.97) for patients with eGFR <60 ml/min and 0.68 (95% CI 0.52-0.90) for eGFR 60-89. The risk of AIDS events was significantly lower in the mild renal dysfunction group even after adjustment for HIV-related characteristics. Our data confirm that impaired renal function is an important risk marker for CCVD events in the HIV population; importantly, even those with mild renal impairment (60<eGFR<90 ml/min) appear to be at increased risk of cerebro-cardiovascular morbidity and mortality.
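For reference, the CKD-EPI equation used to classify patients here is presumably the 2009 creatinine-based version; a direct transcription follows, with illustrative input values.

```python
# CKD-EPI 2009 creatinine equation (Levey et al.); returns eGFR in mL/min/1.73 m^2.
def ckd_epi_2009(scr_mg_dl, age, female, black=False):
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    egfr = (141
            * min(scr_mg_dl / kappa, 1) ** alpha
            * max(scr_mg_dl / kappa, 1) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159  # race coefficient included in the original 2009 equation
    return egfr

# Example: the study's categories were >=90 (normal), 60-89 (mild), <60 (severe impairment)
print(round(ckd_epi_2009(scr_mg_dl=1.1, age=45, female=False), 1))  # ~81 -> "mildly impaired"
```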

15.
Background

The development of non-Hodgkin's lymphoma (NHL) in Sjögren's syndrome (SS) remains a potentially lethal complication, and efforts should focus on identifying predictors that could aid appropriate therapeutic decisions.

Methods

To identify potential prognostic factors for outcome in SS-associated NHL, we retrospectively analyzed a cohort of 77 patients diagnosed with NHL according to WHO classification criteria and meeting the American-European Consensus Classification (AECC) criteria for SS, and examined the effect of SS activity (defined by the EULAR SS disease activity index, ESSDAI) on the prognosis of SS-related NHL, defined in terms of overall and event-free survival (OS and EFS). An event was defined as lymphoma relapse, treatment failure, disease progression, histological transformation or death. The effect of NHL clinical and laboratory characteristics was also investigated.

Results

MALT lymphomas constituted the majority (66.2%) of lymphomas. During follow-up (median = 57.93 months), 5-year OS was 90.91% (95% CI: 82.14–95.80%) and EFS was 77.92% (95% CI: 67.37–85.82%). Patients with a high ESSDAI score at lymphoma diagnosis had a greater risk of death (OR = 5.241, 95% CI: 1.034–26.568) or of an event (OR = 4.317, 95% CI: 1.146–9.699, p = 0.008). These patients also had significantly worse EFS (HR = 4.541, 95% CI: 1.772–11.637) and OS (HR = 5.946, 95% CI: 1.259–28.077). In addition, post-chemotherapy ESSDAI improvement was significantly lower in patients who had experienced an event (p = 0.005). An unfavorable International Prognostic Index (IPI) score (high-intermediate/high) was associated with a high risk of death and of an event (OR = 13.867, 95% CI: 2.656–72.387 and OR = 12.589, 95% CI: 3.911–40.526, respectively), worse EFS (log-rank p<0.001, HR = 8.718, 95% CI: 3.477–21.858), and worse OS (log-rank p<0.001, HR = 11.414, 95% CI: 2.414–53.974). After adjustment for identified risk factors, the IPI score retained a significant prognostic role, followed by a strong effect of ESSDAI on survival outcomes.

Conclusions

At the time of NHL diagnosis, the IPI and ESSDAI may prove useful predictive tools for prognosis in SS-associated lymphoma, pointing toward a more patient-tailored approach.

16.
Background

Red cell distribution width (RDW), neutrophil-to-lymphocyte ratio (NLR), and platelet-to-lymphocyte ratio (PLR) are known inflammatory indices. Elevated values are found in many cancers and may be associated with a poor prognosis. This study aimed to assess the impact of RDW, NLR, and PLR on the overall survival (OS) of patients with oropharyngeal cancer treated with radiotherapy (RT).

Materials and methods

This retrospective study includes 208 patients treated for oropharyngeal cancer with definitive RT or RT combined with neoadjuvant or concurrent systemic therapy at one institution between 2004 and 2014. The receiver operating characteristic (ROC) method, log-rank testing, and Cox proportional hazards regression were used for the analysis.

Results

OS was significantly higher in the RDW ≤13.8% (p = 0.001) and NLR ≤2.099 (p = 0.016) groups. The RDW index had the highest discriminatory ability [area under the curve (AUC) = 0.59, 95% confidence interval (CI): 0.51–0.67], closely followed by NLR (AUC = 0.58, 95% CI: 0.50–0.65). In univariate Cox regression, RDW [hazard ratio (HR): 1.28, 95% CI: 1.12–1.47, p < 0.001] and NLR (HR: 1.11, 95% CI: 1.06–1.18, p < 0.001) were associated with an increased risk of death. In the multivariate analysis, among the analyzed indices, only NLR was significantly associated with survival (HR: 1.16, 95% CI: 1.03–1.29, p = 0.012).

Conclusions

Only NLR proved to be an independent predictor of OS. However, its clinical value is limited by relatively low sensitivity and specificity.
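A minimal sketch of the cutoff-finding step implied above: deriving a threshold for an inflammatory index from the ROC curve (here via the Youden index, one common choice) and comparing OS above versus below it with a log-rank test. The dataset and column names are hypothetical, and the study's exact cutoff criterion is not stated in the abstract.

```python
# Hedged sketch: ROC-derived NLR cutoff followed by a log-rank comparison of OS.
import numpy as np
import pandas as pd
from sklearn.metrics import roc_curve
from lifelines.statistics import logrank_test

df = pd.read_csv("oropharyngeal_rt.csv")  # hypothetical cohort

# ROC of NLR against death as a binary outcome; Youden index picks the cutoff
fpr, tpr, thr = roc_curve(df["died"], df["nlr"])
cutoff = thr[np.argmax(tpr - fpr)]

high = df["nlr"] > cutoff
res = logrank_test(df.loc[high, "os_months"], df.loc[~high, "os_months"],
                   event_observed_A=df.loc[high, "died"],
                   event_observed_B=df.loc[~high, "died"])
print(f"NLR cutoff = {cutoff:.3f}, log-rank p = {res.p_value:.4f}")
```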

17.
Severe non-AIDS bacterial infections (SBI) are the leading cause of hospital admissions among people living with HIV (PLHIV) in industrialized countries. We aimed to estimate the incidence of SBI and its risk factors in a large prospective cohort of PLHIV over a 13-year period in France. Patients followed in the ANRS CO3 Aquitaine cohort between 2000 and 2012 were eligible; SBI was defined as a clinical diagnosis associated with hospitalization of ≥48 hours or death. Survival analysis was conducted to identify risk factors for SBI. Total follow-up duration was 39,256 person-years [PY] (31,370 PY on antiretroviral treatment [ART]). The incidence of SBI decreased from 26.7/1000 PY [95% CI: 22.9–30.5] in 2000–2002 to 11.9/1000 PY [10.1–13.8] in 2009–2012 (p<0.0001). Factors independently associated with an increased risk of SBI were: plasma HIV RNA >50 copies/mL (hazard ratio [HR] = 5.1, 95% confidence interval: 4.2–6.2), CD4 count <500 cells/mm3 and CD4/CD8 ratio <0.8 (with a dose-response relationship for both markers), history of cancer (HR = 1.4 [1.0–1.9]), AIDS stage (HR = 1.7 [1.3–2.1]) and HCV coinfection (HR = 1.4 [1.1–1.6]). HIV-positive patients with diabetes were more prone to SBI (HR = 1.6 [0.9–2.6]). The incidence of SBI decreased over the 13-year period owing to the improvement in the virological and immune status of PLHIV on ART. Risk factors for SBI include a low CD4 count and detectable HIV RNA, but also a low CD4/CD8 ratio, HCV coinfection, history of cancer and diabetes, comorbid conditions that have become frequent among PLHIV in recent years.

18.
Patients with a history of diabetes mellitus (DM) have worse survival than those without DM after liver transplantation. However, the effect of liver grafts from DM donors on the post-transplantation survival of recipients is unclear. Using the Scientific Registry of Transplant Recipients database (2004–2008), 25,413 patients were assessed, among whom 2,469 recipients received grafts from donors with DM. The demographics and outcomes of patients were assessed. Survival was assessed using Kaplan-Meier methodology and Cox regression analyses. Recipients from DM donors experienced worse graft survival than recipients from non-DM donors (one-year survival: 81% versus 85%; five-year survival: 67% versus 74%; P<0.001). Graft survival was significantly lower for recipients from DM donors with a DM duration >5 years than from those with a DM duration <5 years (P<0.001). Cox regression analyses showed that DM donors were independently associated with worse graft survival (hazard ratio, 1.11; 95% confidence interval, 1.02–1.19). The effect of DM donors was more pronounced for certain underlying liver diseases of recipients. Increases in the risk of graft loss were noted among recipients with hepatitis C virus (HCV) infection who received grafts from DM donors, whereas those without HCV experienced outcomes similar to recipients from non-DM donors. These data suggest that recipients from DM donors experience significantly worse graft survival after liver transplantation. However, in patients without HCV infection, using DM donors was not independently associated with worse post-transplantation graft survival. Matching these DM donors to recipients without HCV may be safe.

19.

Background

Polyomavirus-associated nephropathy (PVAN) is a significant cause of early allograft loss, and its course is difficult to predict. The aim of this study was to identify factors influencing the outcome of PVAN.

Methods

Between 2006 and 2014, we diagnosed PVAN in 48 (7.8%) of 615 patients who were monitored for BK virus every 1–4 weeks after modification of maintenance immunosuppression. Logistic and Cox regression analyses were performed to determine which risk factors independently affected clinical outcome and graft loss, respectively.

Results

After 32.1 ± 26.4 months of follow-up, the frequencies of any decline in graft function at 1 year post-diagnosis, graft loss, and any decline in graft function at the last available follow-up were 27.1% (13/48), 25.0% (12/48), and 33.3% (16/48), respectively. The 1-, 3- and 5-year graft survival rates were 100%, 80.5% and 69.1%, respectively. The mean serum creatinine at 1 year post-diagnosis and the long-term graft survival rates were worst in class C (p<0.05). Thirty-eight of 46 patients (82.6%) with BKV DNAuria reduced their urinary viral load by 90%, with a median time of 2.75 months (range, 0.25–34.0 months), and showed better graft survival rates than the 8 patients (17.4%) without viral load reduction (p<0.001). Multivariate logistic regression analysis showed that extensive interstitial inflammation (OR 20.2, p = 0.042) and a delayed fall in urinary viral load (>2.75 months for a >90% decrease) (OR 16.7, p = 0.055) correlated with worse creatinine at 1 year post-diagnosis. Multivariate Cox regression analysis showed that extensive interstitial inflammation at diagnosis (HR 46988, p = 0.032) and a high PVAN stage (HR 162.2, p = 0.021) were associated with worse long-term graft survival rates.

Conclusions

The extent of interstitial inflammation influences short- and long-term graft outcomes in patients with PVAN. The degree of PVAN, the rate of reduction in viral load, and viral clearance can also be used as prognostic markers in PVAN.

20.
Urinary tract infection (UTI) is a common complication after kidney transplantation, often associated with graft loss and increased healthcare costs. Kidney transplant patients (KTPs) are particularly susceptible to infection by extended-spectrum β-lactamase (ESBL)-producing Enterobacteriaceae. A retrospective case-control study was conducted to identify independent risk factors for ESBL-producing Escherichia coli and Klebsiella pneumoniae in non-hospitalized KTPs with UTI. Forty-nine patients with UTI caused by ESBL-producing bacteria (ESBL-P, case group) were compared with the same number of patients with UTI caused by ESBL-negative bacteria (ESBL-N, control group). Clinical data, renal function parameters during UTI episodes, UTI recurrence and relapse rates, risk factors for recurrence, the molecular characteristics of the isolates and their antimicrobial susceptibility profiles were evaluated. Diabetes mellitus (p<0.007), previous antibiotic prophylaxis (p=0.017) or therapy (p<0.001), previous UTI (p=0.01), relapsing infection (p=0.019) and delayed graft function after transplantation (p=0.001) were risk factors for infection by ESBL-positive Enterobacteriaceae in KTPs. Interestingly, the time between the date of transplantation and the date of UTI was shorter in the ESBL-P case group (28.8 months) than in the ESBL-N control group (50.9 months). ESBL-producing bacteria exhibited higher resistance to fluoroquinolones (p=0.002), trimethoprim-sulfamethoxazole (p<0.001) and gentamicin (p<0.001). Molecular analysis showed that blaCTX-M was the most common ESBL-encoding gene (65.3%), although in 55.1% of cases more than one ESBL gene was found. In 29.4% of K. pneumoniae isolates, three bla genes (blaCTX-M, blaTEM and blaSHV) were detected simultaneously. A low estimated glomerular filtration rate (p=0.009) was a risk factor for UTI recurrence. Over 60% of recurrent UTI episodes were caused by genetically similar strains. UTIs caused by ESBL-producing Enterobacteriaceae in KTPs represent an important clinical challenge, not only in hospitalized patients but also in outpatients.
