Similar Articles
1.

Background

Expanded criteria donors (ECDs) are currently accepted as potential sources to increase the donor pool and to provide more chances of kidney transplantation for elderly recipients who would not survive long waiting periods. Hypothermic machine perfusion (HMP) is designed to mitigate the deleterious effects of simple cold storage (CS) on the quality of preserved organs, particularly when the donor is in a marginal status.

Methods

We compared the transplant outcomes in patients receiving ECD kidneys with either HMP or CS graft preservation. Articles from the MEDLINE, EMBASE and Cochrane Library databases were searched and all studies reporting outcomes from HMP versus CS methods of kidney preservation were included in this meta-analysis. The parameters analyzed included the incidence of delayed graft function (DGF), primary non-function (PNF) and one-year graft and patient survival.

Results

A total of seven studies qualified for the review, involving 2374 HMP-preserved and 8716 CS-preserved kidney grafts, all from ECDs. The incidence of delayed graft function (DGF) was significantly reduced with HMP compared to CS, with an odds ratio (OR) of 0.59 (95% CI 0.54–0.66, P<0.001), and one-year graft survival was significantly improved (OR 1.12, 95% CI 1.03–1.21, P = 0.005). However, there was no difference between HMP and CS in the incidence of PNF (OR 0.54, 95% CI 0.21–1.40, P = 0.20) or in one-year patient survival (OR 0.98, 95% CI 0.94–1.02, P = 0.36).
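The pooled estimates above follow from standard fixed-effect meta-analysis: each study's log odds ratio is weighted by the inverse of its variance. A minimal sketch of the computation; the 2×2 counts used in the comments are illustrative and not taken from the included studies:

```python
import math

def or_ci(a, b, c, d):
    """Odds ratio and 95% CI from a 2x2 table:
    a = events in group 1, b = non-events in group 1,
    c = events in group 2, d = non-events in group 2."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    return (or_,
            math.exp(math.log(or_) - 1.96 * se),
            math.exp(math.log(or_) + 1.96 * se))

def pooled_or(tables):
    """Fixed-effect (inverse-variance) pooled OR across several 2x2 tables."""
    num = den = 0.0
    for a, b, c, d in tables:
        log_or = math.log((a * d) / (b * c))
        w = 1.0 / (1/a + 1/b + 1/c + 1/d)  # weight = 1 / Var(log OR)
        num += w * log_or
        den += w
    pooled_log = num / den
    se = math.sqrt(1.0 / den)
    return (math.exp(pooled_log),
            math.exp(pooled_log - 1.96 * se),
            math.exp(pooled_log + 1.96 * se))

# Illustrative: 10/100 DGF events vs. 20/100 gives OR = (10*80)/(90*20) ~ 0.44
```

With a single study the pooled estimate reduces to that study's own OR, which is a quick sanity check on the weighting.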

Conclusions

HMP was associated with a reduced incidence of DGF and with increased one-year graft survival, but it was not associated with the incidence of PNF or with one-year patient survival.

2.

Background

The use of expanded criteria donor (ECD) kidneys has been associated with worse outcomes. Whole gene expression of pre-implantation allograft biopsies from deceased donor kidneys (DDKs) was evaluated to compare the effect of pulsatile pump preservation (PPP) vs. cold storage preservation (CSP) on standard and ECD kidneys.

Methodology/Principal Findings

99 pre-implantation DDK biopsies were studied using gene expression with GeneChips. Kidney transplant recipients were followed post-transplantation for 35.8 months (range = 24–62). The PPP group included 60 biopsies (cold ischemia time (CIT) = 1,367±509 minutes) and the CSP group included 39 biopsies (CIT = 1,022±485 minutes) (P<0.001). Donor age (42.0±14.6 vs. 34.1±14.2 years, P = 0.009) and the percentage of ECD kidneys (PPP = 35% vs. CSP = 12.8%, P = 0.012) differed significantly between groups. A two-sample t-test was performed, and probe sets with P<0.001 were considered significant. Probe-set-level linear models were fit using cold ischemia time and CSP/PPP as independent variables to determine significant probe sets (P<0.001) between groups after adjusting for cold ischemia time. Thus, 43 significant genes were identified (P<0.001). Over-expression of genes associated with inflammation (CD86, CD209, CLEC4, EGFR2, TFF3, among others) was observed in the CSP group. Cell-to-cell signaling and interaction, and antigen presentation, were the most important pathways with genes significantly over-expressed in CSP kidneys. When the analysis was restricted to ECD kidneys, genes involved in inflammation were also differentially up-regulated in those undergoing CSP. However, graft survival at the end of the study was similar between groups (P = 0.2), and the incidence of delayed graft function did not differ significantly between groups.

Conclusions/Significance

Inflammation was the most important up-regulated pattern in pre-implantation biopsies undergoing CSP, even though the PPP group included a larger proportion of ECD kidneys. No significant difference was observed in delayed graft function incidence or graft function post-transplantation. These findings support the use of PPP in ECD kidneys.

3.

Background

Older studies reported worse outcomes for second transplant recipients (STR) than for first transplant recipients (FTR), mainly due to non-comparable populations with numerous confounding factors. More recent analyses, based on improved methodology using multivariate regressions, have challenged this generally accepted idea: the poor prognosis of STR is still under debate.

Methodology

To assess long-term patient and graft survival in STR compared to FTR, we performed an observational study based on the French DIVAT prospective cohort between 1996 and 2010 (N = 3103, including 641 STR). All patients received a CNI, an mTOR inhibitor or belatacept, in addition to steroids and mycophenolate mofetil, as maintenance therapy. Patient and graft survival and acute rejection episodes (ARE) were analyzed using Cox models adjusted for all potential confounding factors, such as pre-transplant anti-HLA immunization.

Results

We showed that STR have a higher risk of graft failure than FTR (HR = 2.18, p = 0.0013), but this excess risk was observed only after a few years of transplantation. There was no significant difference between STR and FTR in the occurrence of either overall ARE (HR = 1.01, p = 0.9675) or steroid-resistant ARE (HR = 1.27, p = 0.4087).

Conclusions

The risk of graft failure following second transplantation remained consistently higher than that observed in first transplantation after adjusting for confounding factors. Time-dependent statistical modeling, which is rarely performed, may explain the heterogeneous conclusions in the literature concerning second transplantation outcomes. In clinical practice, physicians should not consider STR and FTR equally.

4.

Background

Most liver transplant recipients receive calcineurin inhibitors (CNIs), especially tacrolimus and cyclosporine, as immunosuppressant agents to prevent rejection. A controversy exists as to whether the outcomes of hepatitis C virus (HCV)-infected liver transplant patients differ based on the CNIs used. This meta-analysis compares the clinical outcomes of tacrolimus-based and cyclosporine-based immunosuppression, especially cases of HCV recurrence in liver transplant patients with end-stage liver disease caused by HCV infection.

Methods

Related articles were identified from the Cochrane Hepato-Biliary Group Controlled Trials Register, the Cochrane Central Register of Controlled Trials (CENTRAL) in the Cochrane Library, Medline, and Embase. Meta-analyses were performed for the results of homogeneous studies.

Results

Nine randomized or quasi-randomized controlled trials were included. The total effect size of mortality (RR = 0.98, 95% CI: 0.77–1.25, P = 0.87) and graft loss (RR = 1.05, 95% CI: 0.83–1.33, P = 0.67) showed no significant difference between the two groups irrespective of the duration of immunosuppressant therapy after liver transplantation. In addition, HCV recurrence-induced mortality (RR = 1.11, 95% CI: 0.66–1.89, P = 0.69), graft loss (RR = 1.62, 95% CI: 0.64–4.07, P = 0.31) and retransplantation (RR = 1.40, 95% CI: 0.48–4.09, P = 0.54) were similar, and available biopsies confirmed that rates of histological HCV recurrence (RR = 0.92, 95% CI: 0.71–1.19, P = 0.51) were also similar.

Conclusion

These results suggest no difference between tacrolimus-based and cyclosporine-based immunosuppression in posttransplant HCV recurrence-induced mortality, graft loss, retransplantation, or histological HCV recurrence.

5.

Background

The aim of this study was to investigate factors that may improve the condition of a marginal kidney preserved with a normothermic solution following cardiac death (CD) in a model of rat kidney transplantation (RTx).

Methods

Post-euthanasia, Lewis (LEW) donor rats were left for 1 h in a 23°C room. These critical kidney grafts were preserved in University of Wisconsin (UW), lactated Ringer's (LR), or extracellular-trehalose-Kyoto (ETK) solution, followed by intracellular-trehalose-Kyoto (ITK) solution at 4, 23, or 37°C for another 1 h, and finally transplanted into bilaterally nephrectomized LEW recipient rats (n = 4–6). Grafts of rats surviving to day 14 after RTx were evaluated by histopathological examination. The energy activity of these marginal rat kidneys was measured by high-performance liquid chromatography (HPLC; n = 4 per group) and fluorescence intensity assay (n = 6 per group) after preservation with UW or ETK solutions at each temperature. Finally, the transplanted kidney was assessed by an in vivo luciferase imaging system (n = 2).

Results

Using the 1-h normothermic preservation of post-CD kidneys, five out of six recipients in the ETK group survived until 14 days, in contrast to zero out of six in the UW group (p<0.01). Preservation with ITK rather than ETK at 23°C tended to have an inferior effect on recipient survival (p = 0.12). Energy activities of the fresh donor kidneys decreased in a temperature-dependent manner, while those of post-CD kidneys remained at the lower level. ETK was superior to UW in protecting against edema of the post-CD kidneys at the higher temperature. Luminescence intensity of successful grafts recovered within 1 h, while the intensity of grafts of deceased recipients did not change at 1 h post-reperfusion.

Conclusions

Normothermic storage with an extracellular-type solution containing trehalose might prevent reperfusion injury by limiting temperature-dependent tissue edema.

6.

Background

HLA-C is an important ligand for killer immunoglobulin-like receptors (KIR) that regulate natural killer (NK) cell function. Based on KIR specificity, HLA-C molecules are allocated into two groups, HLA-C1 and HLA-C2; HLA-C2 inhibits NK cell function more strongly than HLA-C1. We studied the clinical importance of HLA-C genotypes on the long-term graft survival of 760 kidney transplants performed at our centre, utilising a population-based genetic study and a cell culture model to define putative mechanisms.

Methods and Findings

Genotyping was performed using conventional DNA PCR techniques and correlations were made to clinical outcomes. We found that transplant recipients with HLA-C2 had significantly better long-term graft survival than transplant recipients with HLA-C1 (66% versus 44% at 10 years, log-rank p = 0.002, HR = 1.51, 95% CI = 1.16–1.97). In an in-vitro NK and dendritic cell (DC) co-culture model we made several key observations that correlated with the population-based genetic study. We observed that donor-derived NK cells, on activation with IL-15, promoted differential HLA-C genotype-dependent DC maturation. In NK-DC co-culture, possession of HLA-C2 by DC was associated with anti-inflammatory cytokine production (IL-1RA/IL-6), diminished DC maturation (CD86, HLA-DR), and absent CCR7 expression. Conversely, possession of HLA-C1 by DC was associated with pro-inflammatory cytokine synthesis (TNF-α, IL-12p40/p70), enhanced DC maturation and up-regulation of CCR7 expression. Immunohistochemistry confirmed the presence of donor NK cells in pre-transplant kidneys.

Conclusions

We propose that after kidney transplantation, IL-15-activated donor-derived NK cells interact with recipient DC, with less activation of indirect allo-reactivity in HLA-C2-positive recipients than in HLA-C1-positive recipients; this has implications for long-term graft survival. Early events following kidney transplantation involving NK-DC interaction via the KIR and HLA-C immune synapse may have a central role in long-term kidney transplant outcomes.

7.

Background

Elderly patients with end-stage renal disease have become the fastest growing population of kidney transplant candidates in recent years. However, the risk factors associated with long-term outcomes in these patients remain unclear.

Methods

We retrospectively analyzed 166 recipients aged 60 years or older who underwent primary deceased kidney transplantation between 2002 and 2013 in our center. The main outcomes included 1-, 3- and 5-year patient survival as well as overall and death-censored graft survival. The independent risk factors affecting graft and patient survival were analyzed using Cox regression analysis.

Results

The 1-, 3-, 5-year death-censored graft survival rates were 93.6%, 89.4% and 83.6%, respectively. Based on the Cox multivariate analysis, panel reactive antibody (PRA)>5% [hazard ratio (HR) 4.295, 95% confidence interval (CI) 1.321–13.97], delayed graft function (HR 4.744, 95% CI 1.611–13.973) and acute rejection (HR 4.971, 95% CI 1.516–16.301) were independent risk factors for graft failure. The 1-, 3-, 5-year patient survival rates were 84.8%, 82.1% and 77.1%, respectively. Longer dialysis time (HR 1.011 for 1-month increase, 95% CI 1.002–1.020), graft loss (HR 3.501, 95% CI 1.559–7.865) and low-dose ganciclovir prophylaxis (1.5 g/d for 3 months) (HR 3.173, 95% CI 1.063–9.473) were risk factors associated with patient death.

Conclusions

The five-year results show excellent graft and patient survival in elderly kidney transplant recipients aged ≥60 years. PRA>5%, delayed graft function, and acute rejection are risk factors for graft failure, while longer duration of dialysis, graft loss and low-dose ganciclovir prophylaxis are risk factors for mortality in elderly recipients. These factors represent potential targets for interventions aimed at improving graft and patient survival in elderly recipients.

8.

Aim

The selection criteria for patients with hepatocellular carcinoma (HCC) to undergo liver transplantation should accurately predict posttransplant recurrence while not denying potential beneficiaries. In the present study, we attempted to identify risk factors associated with posttransplant recurrence and to expand the selection criteria.

Patients and Methods

Adult patients with HCC who underwent liver transplantation between November 2004 and September 2012 at our centre were recruited into the current study (N = 241). Clinical and pathological data were retrospectively reviewed. Patients who died during the perioperative period or died of non-recurrence causes were excluded from this study (N = 25). All potential risk factors were analysed using uni- and multi-variate analyses.

Results

Sixty-one of the 216 qualified patients suffered recurrence. Similar recurrence-free and long-term survival rates were observed between living donor liver transplant recipients (N = 60) and deceased donor liver transplant recipients (N = 156). Total tumour volume (TTV) and preoperative percentage of lymphocytes (L%) were two independent risk factors in the multivariate analysis. We propose a prognostic score model based on these two risk factors. Patients within our criteria achieved a recurrence-free survival similar to that of patients within the Milan criteria. Seventy-one patients who were beyond the Milan criteria but within our criteria also had comparable survival to patients within the Milan criteria.

Conclusions

TTV and L% are two risk factors that contribute to posttransplant recurrence. The selection criteria based on these two factors proposed by our study expanded the Milan criteria without increasing the risk of posttransplant recurrence.

9.

Background

Extensively drug-resistant Acinetobacter baumannii (XDR-Ab) has emerged as a major nosocomial pathogen, but optimal treatment regimens are unknown. Although solid organ transplant (SOT) recipients are particularly susceptible to XDR-Ab infections, studies in this population are limited. Our objectives were to determine the epidemiology, clinical characteristics and outcomes of XDR-Ab infections among SOT patients.

Methods

A retrospective study of SOT recipients at our center who were colonized or infected with XDR-Ab between November 2006 and December 2011 was conducted. Among infected patients, the primary outcome was survival at 28 days. Secondary outcomes included survival at 90 days and clinical success at 28 days, and XDR-Ab infection recurrence.

Results

XDR-Ab was isolated from 69 SOT patients, of whom 41% (28) and 59% (41) were colonized and infected, respectively. Infections were significantly more common among cardiothoracic than abdominal transplant recipients (p = 0.0004). Ninety-eight percent (40/41) of patients had respiratory tract infections, most commonly ventilator-associated pneumonia (VAP; 88% [36/41]). Survival rates at 28 and 90 days were 54% (22/41) and 46% (19/41), respectively. Treatment with a colistin-carbapenem regimen was an independent predictor of 28-day survival (p = 0.01; odds ratio = 7.88 [95% CI: 1.60–38.76]). Clinical success at 28 days was achieved in 49% (18/37) of patients who received antimicrobial therapy, but 44% (8/18) of successes were associated with infection recurrence within 3 months. Colistin resistance emerged in 18% (2/11) and 100% (3/3) of patients treated with colistin-carbapenem and colistin-tigecycline, respectively (p = 0.03).

Conclusions

XDR-Ab causes VAP and other respiratory infections following SOT that are associated with significant recurrence and mortality rates. Cardiothoracic transplant recipients are at greatest risk. Results from this retrospective study suggest that colistin-carbapenem combinations may result in improved clinical responses and survival compared to other regimens and may also limit the emergence of colistin resistance.

10.

Introduction

The growing number of renal transplant recipients in a sustained immunosuppressive state is a factor that can contribute to increased incidence of sepsis. However, relatively little is known about sepsis in this population. The aim of this single-center study was to evaluate the factors associated with hospital mortality in renal transplant patients admitted to the intensive care unit (ICU) with severe sepsis and septic shock.

Methods

Patient demographics and transplant-related and ICU stay data were retrospectively collected. Multiple logistic regression was conducted to identify the independent risk factors associated with hospital mortality.

Results

A total of 190 patients were enrolled, 64.2% of whom received kidneys from deceased donors. The mean patient age was 51±13 years (males, 115 [60.5%]), and the median APACHE II was 20 (16–23). The majority of patients developed sepsis late after the renal transplantation (2.1 [0.6–2.3] years). The lung was the most common infection site (59.5%). Upon ICU admission, 16.4% of the patients had ≤1 systemic inflammatory response syndrome criteria. Among the patients, 61.5% presented with ≥2 organ failures at admission, and 27.9% experienced septic shock within the first 24 hours of ICU admission. The overall hospital mortality rate was 38.4%. In the multivariate analysis, the independent determinants of hospital mortality were male gender (OR = 5.9; 95% CI, 1.7–19.6; p = 0.004), delta SOFA 24 h (OR = 1.7; 95% CI, 1.2–2.3; p = 0.001), mechanical ventilation (OR = 30; 95% CI, 8.8–102.2; p<0.0001), hematologic dysfunction (OR = 6.8; 95% CI, 2.0–22.6; p = 0.002), admission from the ward (OR = 3.4; 95% CI, 1.2–9.7; p = 0.02) and acute kidney injury stage 3 (OR = 5.7; 95% CI, 1.9–16.6; p = 0.002).
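Adjusted odds ratios like those above are obtained by exponentiating logistic-regression coefficients, with the 95% CI derived from each coefficient's standard error. A sketch; the coefficient and SE below are hypothetical, chosen only to land near the reported OR for male gender:

```python
import math

def logistic_or(beta, se, z=1.96):
    """Odds ratio and 95% CI from a logistic-regression
    coefficient (beta) and its standard error (se)."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical inputs: beta = 1.77, SE = 0.61 give OR ~ 5.9,
# roughly the magnitude reported for male gender in this study.
```

The same transformation applies to every covariate in the multivariate model, which is why ORs and their CIs are always positive and asymmetric around the point estimate.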

Conclusions

Hospital mortality in renal transplant patients with severe sepsis and septic shock was associated with male gender, admission from the wards, worse SOFA scores on the first day and the presence of hematologic dysfunction, mechanical ventilation or advanced graft dysfunction.

11.

Background

To evaluate the outcomes of Descemet-stripping automated endothelial keratoplasty (DSAEK) with the use of continuous air pumping technique in Asian eyes with previous Ahmed glaucoma valve implantation.

Methods

The DSAEK procedure was modified in that complete air retention of the anterior chamber was maintained for 10 min using continuous air pumping at 30 mm Hg. The primary outcome measurement was graft survival; postoperative clinical features, including rate of graft detachment, endothelial cell count, intraocular pressure (IOP), surgical time and cup/disc ratio, were also recorded.

Results

A total of 13 eyes of 13 patients which underwent modified DSAEK and 6 eyes of 6 patients which underwent conventional DSAEK were included. There was a significant difference in graft survival curves between the two groups (P = 0.029); the 1-year graft survival rates were estimated as 100% and 66.7% for patients with modified DSAEK and those with conventional DSAEK, respectively. The rate of graft detachment was 0% and 33.3% for the modified DSAEK and conventional DSAEK groups, respectively (P = 0.088). Surgical time for air tamponade was significantly shorter in the modified DSAEK group than in the conventional DSAEK group [median (IQR): 10.0 (10.0, 10.0) min vs. 24.5 (22.0, 27.0) min; P<0.001]. Postoperatively, patients in the modified DSAEK group had significantly lower IOP as compared to the conventional DSAEK group [12.0 (11.0, 15.0) mm Hg vs. 16.0 (15.0, 18.0) mm Hg; P = 0.047]. Modified DSAEK patients had higher endothelial cell counts as compared to conventional DSAEK patients [2148.0 (1964.0, 2218.0) vs. 1529.0 (713.0, 2014.0)], but the difference did not reach statistical significance (P = 0.072).

Conclusions

The new continuous air pumping technique in DSAEK can be performed safely and effectively in patients with prior glaucoma drainage device (GDD) placement who have corneal failure.

12.

Introduction

The overall effect of pamidronate on bone mineral density (BMD) in the early renal transplant period varies considerably among studies. The effects of pamidronate on graft function have not been determined.

Materials and Methods

A comprehensive search was conducted in PubMed, the Cochrane Central Register of Controlled Trials (CENTRAL) and Embase independently by two authors. Randomized controlled trials of pamidronate evaluating bone loss in the first year of renal transplantation were included. Methods reported in the “Cochrane Handbook for Systematic Reviews of Interventions 5.0.2” were used to evaluate changes of lumbar spine and femoral neck BMD, and serum creatinine, calcium and intact parathyroid hormone (iPTH) levels. Fixed or random effect models were used as appropriate.

Results

Six randomized trials evaluating 281 patients were identified: 144 were treated with pamidronate and 137 were controls. Administration of pamidronate was associated with a significant reduction of bone loss in the lumbar spine compared to the control group (standardized mean difference (SMD) = 24.62 [16.25, 32.99]). There was no difference in femoral neck BMD between the pamidronate-treated and control groups (SMD = 3.53 [−1.84, 8.90]). A significant increase in the serum creatinine level of the intervention group was seen compared to the control group (SMD = −3.101 [−5.33, −0.89]), while serum calcium and iPTH did not differ between groups after 1 year (serum calcium: SMD = 2.18 [−0.8, 5.16]; serum iPTH: SMD = 0.06 [−0.19, 0.31]). Heterogeneity was low for serum calcium and iPTH and high for serum creatinine.
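The standardized mean difference used throughout these results divides the between-group difference in means by a pooled standard deviation (Cohen's d). A sketch with illustrative inputs, not values from the trials above:

```python
import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Standardized mean difference (Cohen's d):
    difference in group means divided by the pooled SD."""
    pooled_sd = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2)
                          / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Illustrative: means 10 vs. 8 with SD 2 in both groups gives d = 1.0
```

Because the denominator is in the outcome's own units, SMDs let a meta-analysis pool studies that measured BMD on different scales.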

Conclusions

This meta-analysis demonstrated the beneficial clinical efficacy of pamidronate on BMD, with no association with any alteration in graft function during the first year after renal transplantation. However, significant heterogeneity precludes firm conclusions about the relationship between serum creatinine and pamidronate.

13.

Background and Aims

Treatment of patients with Barcelona Clinic Liver Cancer Stage B hepatocellular carcinoma (BCLC-B HCC) is controversial. This study compared the long-term survival of patients with BCLC-B HCC who received liver resection (LR) or transarterial chemoembolization (TACE).

Methods

A total of 257 and 135 BCLC-B HCC patients undergoing LR and TACE, respectively, were retrospectively evaluated. The Kaplan–Meier method was used for long-term survival analysis. Independent prognostic predictors were determined by the Cox proportional hazards model.
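The Kaplan–Meier method referenced here steps the survival estimate down at each observed event time, while censored patients simply leave the risk set without producing a step. A self-contained sketch with illustrative follow-up data:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve.
    times: follow-up durations (e.g. months); events: 1 = death, 0 = censored.
    Returns [(time, survival_probability), ...] at each event time."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        ties = sum(1 for tt, e in data if tt == t)  # all leaving at time t
        if deaths:
            s *= (at_risk - deaths) / at_risk  # step down at an event time
            curve.append((t, s))
        at_risk -= ties
        i += ties
    return curve

# Illustrative: deaths at t=1,3,4 and one censoring at t=2
# give steps 0.75, 0.375, 0.0
```

Group comparisons such as LR vs. TACE are then made between two such curves, typically with a log-rank test, while the Cox model adjusts for covariates.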

Results

The hospital mortality rate was similar between groups (3.1% vs. 3.7%; P = 0.76). However, the LR group showed a significantly higher postoperative complication rate than the TACE group (28.0% vs. 18.5%; P = 0.04). At the same time, the LR group showed significantly higher overall survival rates (1 year, 84% vs. 69%; 3 years, 59% vs. 29%; 5 years, 37% vs. 14%; P<0.001). Moreover, similar results were observed in the propensity score model. Three independent prognostic factors were associated with worse overall survival: serum AFP level (≥400 ng/ml), serum ALT level, and TACE.

Conclusions

LR appears to be as safe as TACE for patients with BCLC-B HCC, and it provides better long-term overall survival. However, prospective studies are needed to determine whether LR should be regarded as the preferred treatment for these patients as long as liver function is preserved.

14.

Background

The reoperation rate remains high after liver transplantation and the impact of reoperation on graft and recipient outcome is unclear. The aim of our study is to evaluate the impact of early reoperation following living-donor liver transplantation (LDLT) on graft and recipient survival.

Methods

Recipients that underwent LDLT (n = 111) at the University of Tokyo Hospital between January 2007 and December 2012 were divided into a reoperation group (n = 27) and a non-reoperation group (n = 84), and a case-control study was conducted.

Results

Early reoperation was performed in 27 recipients (24.3%). Mean time [standard deviation] from LDLT to reoperation was 10 [9.4] days. Female sex, Child-Pugh class C, non-HCV etiology, fulminant hepatitis, and the amount of intraoperative fresh frozen plasma (FFP) administered were identified as possibly predictive variables, among which female sex and the amount of FFP were identified as independent risk factors for early reoperation by multivariable analysis. The 3- and 6-month graft survival rates were 88.9% (95% confidence interval [CI], 70.7–96.4) and 85.2% (95% CI, 66.5–94.3), respectively, in the reoperation group (n = 27), and 95.2% (95% CI, 88.0–98.2) and 92.9% (95% CI, 85.0–96.8), respectively, in the non-reoperation group (n = 84) (log-rank test, p = 0.31). The 12- and 36-month overall survival rates were 96.3% (95% CI, 77.9–99.5) and 88.3% (95% CI, 69.3–96.2), respectively, in the reoperation group, and 89.3% (95% CI, 80.7–94.3) and 88.0% (95% CI, 79.2–93.4), respectively, in the non-reoperation group (log-rank test, p = 0.59).

Conclusions

Observed graft survival was lower in recipients who underwent reoperation than in those who did not, though the difference was not statistically significant. Overall survival with reoperation was comparable to that without. These findings underscore the importance of vigilant surveillance for postoperative complications and surgical rescue at an early postoperative stage in the LDLT setting.

15.

Background

Immunosuppressive therapy is usually administered following renal transplantation to protect the graft from rejection; however, it often leads to complications such as infections. Single nucleotide polymorphisms (SNPs) within the CTLA4 gene, such as −1772T/C (rs733618), +49A/G (rs231775) and +6230G/A (rs3087243), can affect graft rejection and the long-term clinical outcome of organ transplantation. The role of CTLA4 SNPs in T cell-mediated immunity in renal transplantation and their association with infection after transplantation are unknown.

Methods

In this study, the risk of infection according to CTLA4 SNPs was investigated in 304 patients who received kidney graft transplants between 2008 and 2012.

Results

The frequency of the rs4553808 GG genotype was significantly higher in recipients with viral infection (14.89%) than in those without infections (3.50%) (Bonferroni-adjusted p = 0.005). The log-rank test showed a significant difference (p = 0.001) between patients with the rs4553808 GG genotype and those with the AA+AG genotypes in the viral cohort. A significant association was found between the rs4553808 genotype and onset of viral infection in transplant recipients (p = 0.001). The frequencies of the CGTAG and CGCAG haplotypes were significantly higher in the viral infection group (9.6% and 5.3%) than in the non-viral infection group (3.8% and 1.4%) (p = 0.0149 and p = 0.0111). No association between any CTLA4 SNP and bacterial infection was found. Multivariate analyses revealed that one risk factor, the use of antibody induction therapy (p = 0.007), was associated with bacterial infection, and two risk factors, antibody use (p = 0.015) and recipient rs4553808 genotype (p = 0.001), were associated with viral infection.

Conclusions

The rs4553808 GG genotype may be a risk factor for viral infection in kidney transplantation. The CTLA4 haplotypes CGTAG and CGCAG were partially associated with the development of viral infection in Chinese kidney transplant recipients.

16.
I Gotsman, D Zwas, C Lotan, A Keren. PLoS ONE 2012, 7(7): e41022

Background

Patients with heart failure (HF) have a poor prognosis. The proportion of patients with HF and preserved left ventricular function (LVF) is increasing, and the long-term prognosis of HF with preserved LVF may not be as benign as once thought.

Objectives

To evaluate the long term clinical outcome of patients with HF and preserved LVF and predictors of outcome.

Methods

We prospectively evaluated 309 patients hospitalized with a definite clinical diagnosis of HF. Patients were followed for a mean of 6.5 years for clinical outcome.

Results

More than a third (36%) of the patients had preserved systolic LVF based on echocardiography. The long-term survival rate in this group was poor and not significantly different from that of patients with reduced LVF (28% vs 23%, respectively, P = 0.2). The adjusted survival rate by Cox regression analysis was also not significantly different (hazard ratio 1.16, 95% confidence interval 0.87–1.55, P = 0.31). Event-free survival (freedom from death or heart failure re-hospitalization) was also low in both groups and not significantly different between patients with preserved vs. reduced LVF (12% vs. 10%, respectively, P = 0.2). Predictors of mortality in patients with preserved LVF were age, functional capacity and serum urea levels.

Conclusions

The long-term clinical outcome of patients with heart failure and preserved LVF is poor and not significantly different from that of patients with reduced LVF.

17.

Background

The increasing incidence and heterogeneous behavior of intestinal neuroendocrine tumors (iNETs) pose a clinicopathological challenge. Our goal was to describe the prognostic value of the new WHO 2010 grading and the AJCC/UICC TNM staging systems for iNETs. Moreover, outcomes of patients treated with somatostatin analogs were assessed.

Methods

We collected epidemiological and clinicopathological data from 93 patients with histologically proven iNETs including progression and survival outcomes. The WHO 2010 grading and the AJCC/UICC TNM staging systems were applied for all cases. RECIST criteria were used to define progression. Kaplan-Meier analyses for progression free survival (PFS) and overall survival (OS) were performed.

Results

Mean follow-up was 58.6 months (4–213 months). WHO 2010 grading yielded PFS and disease-specific OS of 125.0 and 165.8 months for grade 1 (G1), 100.0 and 144.2 months for G2 and 15.0 and 15.8 months for G3 tumors (p = 0.004 and p = 0.001). Using AJCC staging, patients with stage I and II tumors had no progression and no deaths. Stage III and IV patients demonstrated PFS of 138.4 and 84.7 months (p = 0.003) and disease-specific OS of 210.0 and 112.8 months (p = 0.017). AJCC staging also provided informative PFS (91.2 vs. 50.0 months, p = 0.004) and OS (112.3 vs. 80.0 months, p = 0.005) measures with somatostatin analog use in stage IV patients.

Conclusion

Our findings underscore the complementarity of the WHO 2010 and AJCC classifications in providing better estimates of iNET disease outcomes, and extend the evidence for somatostatin analog benefit in patients with metastatic disease.

18.

Background

Growth hormone (GH) treatment has become a frequently applied growth promoting therapy in short children born small for gestational age (SGA). Children born SGA have a higher risk of developing attention deficit hyperactivity disorder (ADHD). Treatment of ADHD with methylphenidate (MP) has greatly increased in recent years, therefore more children are being treated with GH and MP simultaneously. Some studies have found an association between MP treatment and growth deceleration, but data are contradictory.

Objective

To explore the effects of MP treatment on growth in GH-treated short SGA children.

Methods

Anthropometric measurements were performed in 78 GH-treated short SGA children (mean age 10.6 yr), 39 of whom were also treated with MP (SGA-GH/MP). The SGA-GH/MP group was compared to 39 SGA-GH treated subjects. They were matched for sex, age and height at start of GH, height SDS at start of MP treatment and target height SDS. Serum insulin-like growth factor-I (IGF-I) and IGF binding protein-3 (IGFBP-3) levels were yearly determined. Growth, serum IGF-I and IGFBP-3 levels during the first three years of treatment were analyzed using repeated measures regression analysis.

Results

The SGA-GH/MP group had a lower height gain during the first 3 years than the SGA-GH subjects, though the difference was significant only between 6 and 12 months of MP treatment. After 3 years of MP treatment, the height gain was 0.2 SDS (±0.1 SD) lower in the SGA-GH/MP group (P = 0.17). Adult height was not significantly different between the SGA-GH/MP and SGA-GH groups (−1.9 SDS and −1.9 SDS, respectively; P = 0.46). Moreover, during the first 3 years of MP treatment, IGF-I and IGFBP-3 measurements were similar in both groups.

Conclusion

MP has a modest negative effect on growth during the first years of treatment in short SGA children treated with GH, but adult height is not affected.

19.

Background

Reduced lean body mass (LBM) is one of the main indicators in malnutrition inflammation syndrome among patients on dialysis. However, the influence of LBM on peritoneal dialysis (PD) patients’ outcomes and the factors related to increasing LBM are seldom reported.

Methods

We enrolled 103 incident PD patients between 2002 and 2003, and followed them until December 2011. Clinical characteristics, PD-associated parameters, residual renal function, and serum chemistry profiles of each patient were collected at 1 month and 1 year after initiating PD. LBM was estimated using creatinine index corrected with body weight. Multiple linear regression analysis, Kaplan–Meier survival analysis, and Cox regression proportional hazard analysis were used to define independent variables and compare survival between groups.

Results

Using the median LBM value (70% for men and 64% for women), patients were divided into group 1 (n = 52; low LBM) and group 2 (n = 51; high LBM). Group 1 patients had higher rates of peritonitis (1.6 vs. 1.1/100 patient months; p<0.05) and hospitalization (14.6 vs. 9.7/100 patient months; p<0.05). Group 1 patients also had shorter overall survival and technique survival (p<0.01). Each percentage point increase in LBM reduced the hazard ratio for mortality by 8% after adjustment for diabetes, age, sex, and body mass index (BMI). Changes in residual renal function and protein catabolic rate were independently associated with changes in LBM in the first year of PD.
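The reported 8% lower mortality hazard per percentage point of LBM corresponds to a per-unit hazard ratio of about 0.92. Under the Cox proportional-hazards model such effects compound multiplicatively, which the following illustrative arithmetic (not the study's own computation) makes concrete:

```python
import math

# Per-unit hazard ratio implied by the reported ~8% reduction in
# mortality hazard per percentage point of LBM (illustrative only).
hr_per_point = 0.92
beta = math.log(hr_per_point)  # corresponding Cox regression coefficient

def hazard_ratio(delta_lbm, beta=beta):
    """Hazard ratio implied by a delta_lbm-point difference in LBM,
    assuming the Cox model HR = exp(beta * delta)."""
    return math.exp(beta * delta_lbm)

# A 10-point LBM difference compounds to 0.92**10, i.e. the hazard is
# more than halved rather than reduced by a flat 10 * 8 = 80%.
hr_10 = hazard_ratio(10)
```

The point of the sketch is the multiplicative scale: per-unit hazard ratios from a Cox model should be exponentiated over the covariate difference, not summed.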

Conclusions

LBM serves as a useful parameter in addition to BMI for predicting the survival of patients on PD. Preserving residual renal function and increasing protein intake can increase LBM.

20.

Objective

To evaluate surgical outcomes and prognostic factors for T4 gastric cancer treated with curative resection.

Methods

Between January 1994 and December 2008, 94 patients diagnosed with histological T4 gastric carcinoma and treated with curative resection were recruited. Patient characteristics, surgical complications, survival, and prognostic factors were analyzed.

Results

Postoperative morbidity and mortality were 18.1% and 2.1%, respectively. Multivariate analysis indicated that lymph node metastasis (hazard ratio 2.496; 95% confidence interval 1.218–5.115; p = 0.012) was an independent prognostic factor.

Conclusions

For patients with T4 gastric cancer, lymph node metastasis was associated with poorer survival. Neoadjuvant chemotherapy or aggressive adjuvant chemotherapy after radical resection is strongly recommended for these patients.
