Similar literature (20 records)
1.

Background

Traveling to seek specialized care such as liver transplantation (LT) is a reality in the United States. Patient migration has been attributed to organ availability. The aims of this study were to delineate patterns of patient migration and outcomes after LT.

Study Design

All deceased-donor LTs performed between 2008 and 2013 were extracted from UNOS data. Migrated patients were defined as those who underwent LT at a center in a UNOS region different from their region of residence and who traveled a distance > 100 miles.
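To make the cohort definition concrete, the sketch below shows one way such a filter could be expressed in Python. The column names (home/center region and coordinates) and the haversine distance calculation are illustrative assumptions, not the UNOS variables or distance method actually used in the study.

```python
import math
import pandas as pd

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two latitude/longitude points."""
    r = 3958.8  # mean Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def flag_migrated(df: pd.DataFrame) -> pd.Series:
    """Migrated = transplanted in a different UNOS region AND > 100 miles traveled.

    Assumes hypothetical columns: home_region, center_region,
    home_lat/home_lon and center_lat/center_lon.
    """
    distance = df.apply(
        lambda row: haversine_miles(row["home_lat"], row["home_lon"],
                                    row["center_lat"], row["center_lon"]),
        axis=1,
    )
    return (df["home_region"] != df["center_region"]) & (distance > 100)
```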

Results

Migrated patients comprised 8.2% of the 28,700 LTs performed. Efflux and influx of patients were observed in all 11 UNOS regions. Regions 1, 5, 6, and 9 had a net efflux, while regions 2, 3, 4, 7, 10, and 11 had a net influx of patients. After multivariate adjustment for donor and recipient factors, graft survival (p = 0.68) and patient survival (p = 0.52) were similar between migrated and non-migrated patients.

Conclusion

A significant number of patients migrated in patterns that could not be explained by regional variations in MELD score and wait time alone. Migration may reflect a complex interplay of factors, including referral patterns, specialized services at centers of excellence, and patient preference.

2.
Li C, Mi K, Wen TF, Yan LN, Li B, Yang JY, Xu MQ, Wang WT, Wei YG. PLoS ONE 2011;6(11):e27366.

Background/Aims

The number of people undergoing living donor liver transplantation (LDLT) has increased rapidly in many transplant centres. Patients considering LDLT need to know whether LDLT is riskier than deceased donor liver transplantation (DDLT). The aim of this study was to compare the outcomes of patients undergoing LDLT versus DDLT.

Methods

A total of 349 patients with benign liver diseases were recruited from 2005 to 2011 for this study. LDLT was performed in 128 patients and DDLT in 221 patients. Pre- and intra-operative variables for the two groups were compared. The post-operative outcomes analysed included the overall incidence of complications, biliary and vascular complications, hepatitis B virus (HBV) recurrence, long-term survival, and outcomes of emergency transplantation.

Results

The waiting times were 22.10±15.31 days for patients undergoing LDLT versus 35.81±29.18 days for patients undergoing DDLT. The cold ischemia time (CIT) was 119.34±19.75 minutes for the LDLT group and 346±154.18 minutes for the DDLT group. The LDLT group had higher intraoperative blood loss, but red blood cell (RBC) transfusion requirements did not differ. Rates of Clavien grade III or higher complications, vascular complications, HBV recurrence, and long-term survival were similar. LDLT patients suffered a higher incidence of biliary complications in the early postoperative period. However, during long-term follow-up, biliary complication rates were similar between the two groups. The long-term survival rate of patients undergoing emergency transplantation was lower than that of patients undergoing elective transplantation. However, no significant difference was observed between emergency LDLT and emergency DDLT.

Conclusions

Patients undergoing LDLT achieved outcomes similar to those of patients undergoing DDLT. Although LDLT patients may suffer a higher incidence of early biliary complications, the overall biliary complication rate was similar during long-term follow-up.

3.

Background

Islet transplantation may potentially cure type 1 diabetes mellitus (T1DM). However, immune rejection, especially that induced by the alloreactive T-cell response, remains a restraining factor for the long-term survival of grafted islets. Programmed death ligand-1 (PD-L1) is a negative costimulatory molecule. PD-L1 deficiency within the donor heart accelerates allograft rejection. Here, we investigate whether PD-L1 deficiency in donor islets reduces allograft survival time.

Methods

Glucose stimulation assays were performed to evaluate whether PD-L1 deficiency has detrimental effects on islet function. Islets isolated from PD-L1-deficient mice or wild-type (WT) mice (C57BL/6J) were implanted beneath the renal capsule of streptozotocin (STZ)-induced diabetic BALB/c mice. Blood glucose levels and graft survival time after transplantation were monitored. Moreover, we analyzed the residual islets, infiltrating immune cells and alloreactive cells from the recipients.

Results

PD-L1 deficiency within islets does not affect islet function. However, islet PD-L1 deficiency increased allograft rejection and was associated with enhanced inflammatory cell infiltration and recipient T-cell alloreactivity.

Conclusions

This is the first report to demonstrate that PD-L1 deficiency accelerates islet allograft rejection and modulates recipient alloimmune responses.

4.

Objectives

Cytomegalovirus (CMV) infections in liver transplant recipients are common and result in significant morbidity and mortality. Intravenous ganciclovir and oral valganciclovir are the standard treatments for CMV infection. The present study investigates the efficacy of oral valganciclovir as a preemptive treatment for CMV infection after liver transplantation.

Methods

Between 2012 and 2013, 161 patients underwent liver transplantation at Samsung Medical Center. All patients received tacrolimus, steroids, and mycophenolate mofetil. Patients with CMV infection were administered oral valganciclovir (VGCV) 900 mg/day or intravenous ganciclovir (GCV) 5 mg/kg twice daily as preemptive treatment. Stable liver transplant recipients received VGCV.

Results

Eighty-three patients (51.6%) received antiviral therapy as preemptive treatment because of CMV infection. The Model for End-stage Liver Disease (MELD) score and the proportions of Child-Pugh class C, hepatorenal syndrome, and deceased-donor liver transplantation were higher in the CMV infection group than in the group without CMV infection. Sixty-one patients received GCV and 22 patients received VGCV. MELD scores were higher in the GCV group than in the VGCV group, but there were no statistically significant differences in pretransplant variables between the two groups. AST, ALT, and total bilirubin levels at the time of CMV infection were higher in the GCV group than in the VGCV group. The incidences of recurrent CMV infection in the GCV and VGCV groups were 14.8% and 4.5%, respectively (P = 0.277).

Conclusion

Oral valganciclovir is feasible as a preemptive treatment for CMV infection in liver transplant recipients with stable graft function.

5.
Ling Q, Xu X, Wei Q, Liu X, Guo H, Zhuang L, Chen J, Xia Q, Xie H, Wu J, Zheng S, Li L. PLoS ONE 2012;7(1):e30322.

Background

A high Model for End-stage Liver Disease (MELD) score before liver transplantation (LT) indicates a poor prognosis. The artificial liver support system (ALSS) has been shown to effectively improve liver and kidney function and thus reduce the MELD score. We aimed to evaluate whether downgrading the MELD score could improve patient survival after LT.

Methodology/Principal Findings

One hundred and twenty-six LT candidates with acute-on-chronic hepatitis B liver failure and a MELD score ≥30 were included in this prospective study. Of the 126 patients, 42 received emergency LT within 72 h (ELT group) and the other 84 were given ALSS as salvage treatment. Of these 84 patients, 33 had a reduced MELD score (<30) on the day of LT (DGM group) and 51 underwent LT with a persistently high MELD score (N-DGM group). The median waiting time for a donor was 10 days for the DGM group and 9.5 days for the N-DGM group. Overall mortality was significantly higher in the N-DGM group (43.1%) than in the ELT group (16.7%) and the DGM group (15.2%). N-DGM status (vs. ELT and DGM) was the only independent risk factor for overall mortality (P = 0.003). Age >40 years and an interval from the last ALSS session to LT of >48 h were independent negative predictors of MELD downgrading.
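Because the MELD score and the ≥30 cutoff are central here (and recur throughout these abstracts), a minimal sketch of the classic UNOS laboratory MELD formula is shown below for orientation. It is illustrative only and not necessarily the exact variant or capping rules applied in the cited study.

```python
import math

def meld_score(bilirubin_mg_dl: float, inr: float, creatinine_mg_dl: float,
               dialysis_twice_last_week: bool = False) -> int:
    """Classic laboratory MELD score (pre MELD-Na), illustrative implementation.

    Values below 1.0 are floored at 1.0; creatinine is capped at 4.0 mg/dL
    (and set to 4.0 if the patient was dialyzed at least twice in the past week);
    the final score is rounded and capped at 40.
    """
    bili = max(bilirubin_mg_dl, 1.0)
    inr = max(inr, 1.0)
    crea = 4.0 if dialysis_twice_last_week else min(max(creatinine_mg_dl, 1.0), 4.0)
    raw = 3.78 * math.log(bili) + 11.2 * math.log(inr) + 9.57 * math.log(crea) + 6.43
    return min(round(raw), 40)

# Example candidate with advanced disease, above the study's >=30 threshold
print(meld_score(bilirubin_mg_dl=25.0, inr=2.5, creatinine_mg_dl=1.8))  # -> 34
```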

Conclusions/Significance

Downgrading the MELD score for liver transplant candidates with a MELD score ≥30 was effective in improving patient prognosis. Appropriate ALSS treatment within 48 h prior to LT is potentially beneficial.

6.

Background

Sleeve anastomosis is the most common technique used to rearterialize orthotopic liver transplants (OLT). However, this technique has a number of disadvantages, including the difficulty of performing the anastomosis visually unaided. We herein describe a novel rearterialized OLT model in the rat.

Materials and Methods

Forty-six male Sprague-Dawley rats (300–400 g) were used as donors and recipients. Based on Kamada's cuff technique, the new model involved a modified "sleeve" anastomosis between the celiac trunk of the donor and the common hepatic artery of the recipient to reconstruct hepatic arterial blood flow. An additional ten male Sprague-Dawley rats underwent liver transplantation without arterial reconstruction. Liver grafts were retrieved from the two groups and histological examination was performed after surgery.

Results

Mean operating times were approximately 42 minutes for donor liver retrieval and 57 minutes for the recipient operation. Graft preparation took an additional 15 minutes, and fixing the arterial bracket took approximately 3 minutes. During transplantation, the anhepatic phase lasted 18 ± 2.5 min and the arterial reconstruction required only about 3 minutes. The patency rate was 94.44% and the 4-week survival rate was 90%. Histology showed obvious fibrosis in liver grafts without arterial reconstruction, whereas normal histology was observed in the arterialized grafts.

Conclusions

This new method allows the surgical procedure to be performed visually unaided, with good survival and patency rates, and represents an alternative model for investigating OLT in rats.

7.

Objective

To compare the effectiveness of different technique modifications in laparoscopic donor nephrectomy.

Design

Systematic review and meta-analyses.

Data Sources

Searches of PubMed, EMBASE, Web of Science and CENTRAL from January 1, 1997 until April 1, 2014.

Study Design

All cohort studies and randomized clinical trials comparing fully laparoscopic donor nephrectomy with modifications of the standard technique, including hand-assisted, retroperitoneoscopic and single-port techniques, were included.

Data-Extraction and Analysis

The primary outcome measure was the number of complications. Secondary outcome measures included conversion to open surgery, first warm ischemia time, estimated blood loss, graft function, operation time and length of hospital stay. Each technique modification was compared with standard laparoscopic donor nephrectomy. Data were pooled with a random-effects meta-analysis using odds ratios, weighted mean differences and their corresponding 95% confidence intervals. The I² statistic was used to assess heterogeneity. Randomized clinical trials and cohort studies were first analyzed separately; when the data were comparable, pooled analyses were performed.
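As an illustration of the pooling step described above, the following is a minimal DerSimonian-Laird random-effects sketch that combines study-level odds ratios and reports the I² statistic. It is a simplified stand-in with invented inputs, not the actual software used in the cited review.

```python
import numpy as np

def pool_random_effects(or_values, ci_lower, ci_upper):
    """DerSimonian-Laird random-effects pooling of study-level odds ratios.

    ORs and their 95% CIs are converted to log-ORs with standard errors;
    returns the pooled OR, its 95% CI, and the I^2 heterogeneity statistic.
    """
    y = np.log(np.asarray(or_values, dtype=float))
    se = (np.log(np.asarray(ci_upper, dtype=float))
          - np.log(np.asarray(ci_lower, dtype=float))) / (2 * 1.96)
    w = 1.0 / se**2                                    # fixed-effect weights
    q = float(np.sum(w * (y - np.sum(w * y) / np.sum(w)) ** 2))
    df = len(y) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                      # between-study variance
    w_re = 1.0 / (se**2 + tau2)                        # random-effects weights
    mu = np.sum(w_re * y) / np.sum(w_re)
    se_mu = np.sqrt(1.0 / np.sum(w_re))
    i2 = 100.0 * max(0.0, (q - df) / q) if q > 0 else 0.0
    return np.exp(mu), np.exp(mu - 1.96 * se_mu), np.exp(mu + 1.96 * se_mu), i2

# Hypothetical study-level odds ratios with 95% confidence intervals
pooled, lo, hi, i2 = pool_random_effects([0.60, 0.45, 0.70], [0.30, 0.20, 0.40], [1.20, 1.00, 1.20])
print(f"pooled OR {pooled:.2f} (95% CI {lo:.2f}-{hi:.2f}), I2 = {i2:.0f}%")
```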

Results

Thirty-one studies comparing laparoscopic donor nephrectomy with other technique modifications were identified, including 5 randomized clinical trials and 26 cohort studies. Because data from randomized clinical trials and cohort studies were comparable, they were pooled. There were significantly fewer complications in the retroperitoneoscopic group than in the transperitoneal group (OR 0.52, 95% CI 0.33–0.83, I² = 0%). Hand-assisted techniques showed shorter first warm ischemia and operation times.

Conclusions

Hand assistance reduces operation and first warm ischemia times and may improve safety for surgeons with less experience in laparoscopic donor nephrectomy. The retroperitoneoscopic approach was associated with significantly fewer complications. However, given the generally poor-to-intermediate quality and considerable heterogeneity of the included studies, further high-quality studies are required.

Trial Registration

The review protocol was registered in the PROSPERO database before the start of the review process (registration number CRD42013006565).

8.

Background

Delirium is one of the main causes of increased length of intensive care unit (ICU) stay among patients who have undergone living donor liver transplantation (LDLT). We aimed to evaluate risk factors for delirium after LDLT as well as to investigate whether delirium impacts the length of ICU and hospital stay.

Methods

Seventy-eight patients who underwent LDLT at a single medical center between January 2010 and December 2012 were enrolled. The Confusion Assessment Method for the Intensive Care Unit (CAM-ICU) scale was used to diagnose delirium. Preoperative, postoperative, and hematologic factors were evaluated as potential risk factors for developing delirium.

Results

During the study period, delirium was diagnosed in 37 (47.4%) patients after LDLT. Symptoms began a mean of 7.0±5.5 days after surgery and lasted a mean of 5.0±2.6 days. The length of ICU stay for patients with delirium (39.8±28.1 days) was significantly longer than that for patients without delirium (29.3±19.0 days) (p<0.05). Risk factors associated with delirium included a history of alcohol abuse [odds ratio (OR) = 6.40, 95% confidence interval (CI): 1.85–22.06], preoperative hepatic encephalopathy (OR = 4.45, 95% CI: 1.36–14.51), APACHE II score ≥16 (OR = 1.73, 95% CI: 1.71–2.56), and duration of endotracheal intubation ≥5 days (OR = 1.81, 95% CI: 1.52–2.23).
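The adjusted odds ratios and confidence intervals reported above are the typical output of a multivariable logistic regression. The sketch below shows how such estimates can be obtained with statsmodels; the data frame and column names are synthetic stand-ins, not the study's dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic stand-in data: one row per LDLT recipient, binary predictors as in the abstract
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "delirium": rng.integers(0, 2, 200),
    "alcohol_abuse": rng.integers(0, 2, 200),
    "hepatic_encephalopathy": rng.integers(0, 2, 200),
    "apache2_ge16": rng.integers(0, 2, 200),
    "intubation_ge5d": rng.integers(0, 2, 200),
})

X = sm.add_constant(df[["alcohol_abuse", "hepatic_encephalopathy",
                        "apache2_ge16", "intubation_ge5d"]])
fit = sm.Logit(df["delirium"], X).fit(disp=False)

# Exponentiating the coefficients and their confidence bounds yields ORs with 95% CIs
table = np.exp(pd.concat([fit.params, fit.conf_int()], axis=1))
table.columns = ["OR", "2.5%", "97.5%"]
print(table)
```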

Conclusions

A history of alcohol abuse, preoperative hepatic encephalopathy, an APACHE II score ≥16, and endotracheal intubation for ≥5 days were predictive of developing delirium in the ICU following liver transplantation and were associated with increased length of ICU and hospital stay.

9.

Background

Hepatitis B virus-associated hepatocellular carcinoma (HBV-HCC) and hepatitis C virus (HCV)-HCC are the main indications for liver transplantation. We compared differences in survival outcomes between these two conditions.

Methods and Findings

The China Liver Transplant Registry (CLTR) contains data collated from all transplants performed in 86 liver transplantation centers across China. We analyzed CLTR data from January 1999 to December 2010. In all, 7,658 patients (7,162 with HBV-HCC and 496 with HCV-HCC) were included in this study. Clinical characteristics were compared between the HBV-HCC and HCV-HCC groups, and Kaplan-Meier analysis was used to calculate overall, tumor-free and hepatitis-free survival rates. The 1-year, 3-year and 5-year overall survival rates were significantly higher in HBV-HCC recipients than in HCV-HCC recipients (76.65%, 56.61% and 49.10% vs. 64.59%, 42.78% and 39.20%, respectively; P<0.001). The corresponding tumor-free survival rates (63.55%, 47.37% and 40.99% vs. 56.84%, 38.04% and 35.66%) and hepatitis-free survival rates (75.49%, 54.84% and 47.34% vs. 63.87%, 42.15% and 39.33%) were also superior in HBV-HCC recipients (both P<0.001). Multivariate analyses identified hepatitis, preoperative alpha-fetoprotein (AFP) level, size of the largest tumor, number of tumor nodules, TNM stage, vascular invasion and preoperative Model for End-stage Liver Disease (MELD) score as independent predictors of overall, tumor-free and hepatitis-free survival.
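A minimal sketch of the Kaplan-Meier comparison described here is shown below, using the lifelines library with hypothetical column names ('months', 'death', 'etiology'). It illustrates the method rather than reproducing the CLTR analysis.

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

def compare_survival_by_etiology(df: pd.DataFrame) -> None:
    """Kaplan-Meier survival estimates and a log-rank test for HBV-HCC vs. HCV-HCC.

    Assumes columns: 'months' (follow-up time), 'death' (1 = event), 'etiology'.
    """
    kmf = KaplanMeierFitter()
    for name, grp in df.groupby("etiology"):
        kmf.fit(grp["months"], event_observed=grp["death"], label=name)
        surv_5y = float(kmf.survival_function_at_times(60).iloc[0])
        print(f"{name}: estimated 5-year overall survival = {surv_5y:.2%}")

    hbv = df[df["etiology"] == "HBV-HCC"]
    hcv = df[df["etiology"] == "HCV-HCC"]
    test = logrank_test(hbv["months"], hcv["months"],
                        event_observed_A=hbv["death"],
                        event_observed_B=hcv["death"])
    print("log-rank p-value:", test.p_value)
```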

Conclusions

Survival outcomes after liver transplantation were significantly better in HBV-HCC patients than in HCV-HCC patients. This finding may be used to guide donor liver allocation in transplantation programs.

10.

Background

Given the limited efficacy and high adverse event rate associated with treatment of recurrent hepatitis C after liver transplantation, an individualized treatment strategy should be considered. The aim of this study was to identify predictors of response to antiviral therapy for hepatitis C after living donor liver transplantation (LDLT) and to study the associated adverse events.

Methods

A retrospective chart review was performed on 125 hepatitis C virus (HCV)-positive LDLT recipients who received interferon plus ribavirin and/or peginterferon plus ribavirin therapy at Kyoto University between January 2001 and June 2011.

Results

Serum HCV RNA became undetectable within 48 weeks in 77 (62%) of the 125 patients, and these patients were defined as showing a virological response (VR). Of 117 patients, 50 (43%) achieved a sustained VR (SVR). Predictive factors associated with both VR and SVR by univariate analysis included low pretransplant serum HCV RNA levels, a non-1 HCV genotype, and low pretreatment serum HCV RNA levels. In addition, LDLT from ABO-mismatched donors was significantly associated with VR, and white cell and neutrophil counts before interferon therapy were associated with SVR. Multivariate analysis showed that two variables, a pretransplant serum HCV RNA level below 500 kIU/mL and a non-1 HCV genotype, remained in the models of both VR and SVR, and that ABO mismatch was associated with VR. No variables with a significant effect on treatment withdrawal were found.

Conclusions

Virological response to antiviral therapy in patients with hepatitis C recurring after LDLT can be predicted prior to transplantation, based on pretransplant serum HCV RNA levels and HCV genotype. LDLT from ABO-mismatched donors may contribute to more efficacious interferon therapy.

Trial Registration

UMIN-CTR UMIN000003286

11.
Huang CT, Lin HC, Chang SC, Lee WC. PLoS ONE 2011;6(8):e22689.

Objective

Post-operative pulmonary complications significantly affect patient survival, but there is still no conclusive evidence regarding the effect of post-operative respiratory failure after liver transplantation on patient prognosis. This study aimed to identify risk factors for post-operative respiratory failure (PRF) after liver transplantation and to assess its impact on short-term survival.

Design

This retrospective observational cohort study was conducted in a twelve-bed adult surgical intensive care unit in northern Taiwan. The medical records of 147 liver transplant patients treated from September 2002 to July 2007 were reviewed. Sixty-two patients experienced post-operative respiratory failure, while the remaining 85 did not.

Measurements and Main Results

Gender, age, etiology, disease history, pre-operative ventilator use, molecular adsorbent recirculating system (MARS) use, source of the transplanted organ, Model for End-stage Liver Disease (MELD) score and Child-Turcotte-Pugh score calculated immediately before surgery were assessed for the two groups. Length of intensive care unit stay, length of hospital admission, and mortality within 30 days, 3 months, and 1 year were also evaluated. In a logistic regression model, post-operative respiratory failure was associated with diabetes mellitus prior to liver transplantation, pre-operative impaired renal function, pre-operative ventilator use, pre-operative MARS use and a deceased-donor organ source (p<0.05). Once liver transplant patients developed PRF, their lengths of ICU stay and hospital admission were prolonged, and their mortality and morbidity increased significantly (p<0.001).

Conclusions

These pre-operative risk factors significantly influenced the occurrence of post-operative respiratory failure after liver transplantation.

12.

Background

Advanced liver cirrhosis is associated with systemic hemodynamic derangement leading to the development of severe complications associated with increased mortality. Copeptin is a stable cleavage product of the precursor of arginine vasopressin, a key regulator of hemodynamic homeostasis. Copeptin is currently considered a reliable prognostic marker in a wide variety of diseases other than cirrhosis. The present study aimed to assess copeptin, both experimentally and clinically, as a potential biomarker of hemodynamic derangement and to evaluate its prognostic significance in cirrhosis.

Materials and Methods

Two studies were performed: 1) plasma copeptin and hemodynamic measurements were obtained in 18 thioacetamide-induced cirrhotic rats and 5 control rats; 2) serum copeptin concentration was measured in 61 cirrhotic patients in samples collected at the time of registration on the waiting list for liver transplantation. In 46 patients, a second copeptin measurement was performed during follow-up while registered on the waiting list. Cox proportional hazards regression and Kaplan-Meier analyses were performed to determine the association of serum copeptin and clinical data with outcome.
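The sketch below illustrates the kind of Cox proportional hazards model described, using the lifelines library with hypothetical column names ('months', 'event', 'copeptin', 'meld'). It is an assumption-laden example, not the study's actual analysis code.

```python
import pandas as pd
from lifelines import CoxPHFitter

def copeptin_cox_model(df: pd.DataFrame) -> CoxPHFitter:
    """Cox proportional hazards model for transplant-free survival.

    Assumes columns: 'months' (time to event or censoring), 'event' (1 = death),
    and 'copeptin' and 'meld' as covariates, so that copeptin's association with
    outcome can be assessed independently of the MELD score.
    """
    cph = CoxPHFitter()
    cph.fit(df[["months", "event", "copeptin", "meld"]],
            duration_col="months", event_col="event")
    cph.print_summary()  # hazard ratios with 95% CIs and p-values
    return cph
```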

Results

Plasma copeptin concentration was significantly higher in cirrhotic rats than in controls (1.6 ± 0.5 vs. 0.9 ± 0.1 pmol/L, p < 0.01) and was negatively correlated with mean arterial blood pressure (r = -0.574, p = 0.013). In cirrhotic patients, serum copeptin concentration was high [11.0 (5.2–24.0) pmol/L] and increased significantly during the time on the liver transplantation waiting list. The MELD and MELD-sodium scores were significantly correlated with serum copeptin [MELD: r = 0.33, p = 0.01; MELD-sodium: r = 0.29, p = 0.02], also at the time of the second copeptin measurement [MELD and MELD-sodium: r = 0.39, p < 0.01]. In cirrhotic patients, serum copeptin concentration was significantly associated with outcome, independently of the MELD and MELD-sodium scores. Patients with a low serum copeptin concentration at the time of registration on the liver transplant waiting list had significantly better transplant-free survival at 3, 6 and 12 months of follow-up than those with a high serum copeptin concentration (log-rank: p < 0.01, p < 0.01 and p = 0.02, respectively).

Conclusions

Circulating copeptin levels are elevated in rats and humans with cirrhosis. Copeptin is independently associated with outcome in cirrhotic patients awaiting liver transplantation.

13.

Background

Liver transplantation can prolong survival in patients with end-stage liver disease. We have previously proposed that the Sequential Organ Failure Assessment (SOFA) score calculated on post-transplant day 7 has great discriminative power for predicting 1-year mortality after liver transplantation. The Chronic Liver Failure-Sequential Organ Failure Assessment (CLIF-SOFA) score, a modified SOFA score, is a newly developed scoring system designed specifically for patients with end-stage liver disease. This study was designed to compare the CLIF-SOFA score with other major scoring systems in predicting outcomes for liver transplant patients.

Methods

We retrospectively reviewed the medical records of 323 patients who received liver transplants in a tertiary care university hospital from October 2002 to December 2010. Demographic parameters and clinical characteristics were recorded on the first day of admission before transplantation and on post-transplantation days 1, 3, 7, and 14.

Results

The overall 1-year survival rate was 78.3% (253/323). Liver disease was most commonly attributed to hepatitis B virus infection (34%). The CLIF-SOFA score had better discriminatory power than the Child-Pugh score, Model for End-Stage Liver Disease (MELD) score, RIFLE (risk of renal dysfunction, injury to the kidney, failure of the kidney, loss of kidney function, and end-stage kidney disease) criteria, and SOFA score. The area under the receiver operating characteristic (AUROC) curve for predicting 1-year mortality was highest for the CLIF-SOFA score on post-transplant day 7. Cumulative survival rates differed significantly between patients with a CLIF-SOFA score ≤8 and those with a CLIF-SOFA score >8 on post-transplant day 7.
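Comparing scoring systems by AUROC, as done above, amounts to computing the area under the ROC curve of each score against the 1-year mortality label. A small hedged sketch with scikit-learn and synthetic data is shown below; the score values and effect sizes are invented, not taken from the study.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def compare_discrimination(died_within_1yr, scores: dict) -> None:
    """Print the AUROC of each severity score for predicting 1-year mortality."""
    for name, values in scores.items():
        print(f"{name}: AUROC = {roc_auc_score(died_within_1yr, values):.2f}")

# Synthetic example: higher scores loosely associated with mortality
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 300)
compare_discrimination(y, {
    "CLIF-SOFA day 7": rng.normal(8, 3, 300) + 3 * y,
    "MELD": rng.normal(20, 8, 300) + 2 * y,
    "SOFA day 7": rng.normal(7, 3, 300) + 1 * y,
})
```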

Conclusion

The CLIF-SOFA score improves the accuracy of prognostic prediction after transplantation. Moreover, the CLIF-SOFA score on post-transplant day 7 had the best discriminative power for predicting 1-year mortality after liver transplantation.

14.
Zhang M, Yin F, Chen B, Li YP, Yan LN, Wen TF, Li B. PLoS ONE 2012;7(3):e31256.

Background

The scarcity of available grafts necessitates an allocation system that considers expected posttransplant survival in addition to pretransplant mortality as estimated by the MELD score. So far, however, conventional linear techniques have failed to achieve sufficient accuracy in predicting posttransplant outcomes. In this study, we aimed to develop a pretransplant model predicting survival of liver recipients with benign end-stage liver diseases (BESLD) using a nonlinear method based on pretransplant characteristics, and to compare its performance with a BESLD-specific prognostic model (MELD) and a general illness-severity model (the Sequential Organ Failure Assessment, or SOFA, score).

Methodology/Principal Findings

Using retrospectively collected data on 360 recipients who underwent deceased-donor transplantation for BESLD between February 1999 and August 2009 at the West China Hospital of Sichuan University, we developed a multi-layer perceptron (MLP) network to predict one-year and two-year survival probability after transplantation. The performance of the MLP, SOFA, and MELD was assessed by measuring both calibration and discriminative power, with the Hosmer-Lemeshow test and receiver operating characteristic analysis, respectively. Forward stepwise selection identified donor age and BMI; serum concentrations of HB, Crea, ALB, TB, ALT, and Na+; INR; pretransplant diabetes; pretransplant dialysis; and microbiologically proven sepsis as the optimal input features. The MLP, employing 18 input neurons and 12 hidden neurons, yielded high predictive accuracy, with a c-statistic of 0.91 (P<0.001) for one-year and 0.88 (P<0.001) for two-year prediction. The performance of SOFA and MELD in prognostic assessment was fairly poor, with c-statistics of 0.70 and 0.66, respectively, for one-year prediction, and 0.67 and 0.65 for two-year prediction.
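To make the described architecture concrete (18 input features, one hidden layer of 12 neurons, evaluated by the c-statistic/AUROC), here is a hedged scikit-learn sketch on synthetic stand-in data. It mirrors the setup described in the abstract but is not the authors' implementation.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import roc_auc_score

# Synthetic stand-in for the 18 pretransplant input features and a 1-year survival label
rng = np.random.default_rng(42)
X = rng.normal(size=(360, 18))
y = (X[:, :3].sum(axis=1) + rng.normal(scale=1.0, size=360) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# One hidden layer with 12 neurons, mirroring the architecture described in the abstract
mlp = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(12,), max_iter=2000, random_state=0))
mlp.fit(X_train, y_train)

# The c-statistic for a binary outcome is the AUROC of the predicted probabilities
c_stat = roc_auc_score(y_test, mlp.predict_proba(X_test)[:, 1])
print(f"c-statistic on held-out data: {c_stat:.2f}")
```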

Conclusions/Significance

Posttransplant prognosis is a multidimensional nonlinear problem, and the MLP achieved significantly higher accuracy than the SOFA and MELD scores in predicting posttransplant survival. Pattern-recognition methodologies such as the MLP hold promise for posttransplant outcome prediction.

15.

Background

Lung transplantation has been established as the definitive treatment option for patients with advanced lymphangioleiomyomatosis (LAM). However, the prognosis after waiting-list registration and the circumstances of lung transplantation in the era of sirolimus therapy have not been reported.

Methods

In this national survey, we analyzed data from 98 LAM patients registered for lung transplantation in the Japan Organ Transplantation Network.

Results

Transplantation was performed in 57 patients as of March 2014. The survival rate was 86.7% at 1 year, 82.5% at 3 years, 73.7% at 5 years, and 73.7% at 10 years. Of the 98 patients, 21 had an inactive status and had received sirolimus more frequently than those with an active history (67% vs. 5%, p<0.001). Nine of the twelve patients who remained inactive as of March 2014 had initiated sirolimus before or while on the waiting list and remained on sirolimus thereafter. Although the difference was not statistically significant, survival after registration tended to be better for lung transplant recipients than for those who awaited transplantation (p = 0.053).

Conclusions

Lung transplantation is a satisfactory therapeutic option for advanced LAM, but the circumstances of pre-transplantation LAM patients are likely to change with the use of sirolimus.

16.

Objective

Aim of this study was to identify the nitric oxide synthase (NOS) isoform involved in early microcirculatory derangements following solid organ transplantation.

Background

Donor treatment with tetrahydrobiopterin has been shown to specifically attenuate these derangements following pancreas transplantation, and the protective effects of tetrahydrobiopterin rely on its NOS-cofactor activity rather than on its antioxidant capacity. However, the NOS isoform mainly involved in this process has yet to be defined.

Methods

Using a murine pancreas transplantation model, grafts lacking one of the three NOS isoforms were compared with grafts from wild-type controls. Donors were either treated with tetrahydrobiopterin or left untreated. All grafts were subjected to 16 h of cold ischemia and transplanted into wild-type recipients. After 4 h of graft reperfusion, the microcirculation was analysed by confocal intravital fluorescence microscopy. Recipient survival was monitored for 50 days.

Results

Transplantation of the pancreas from untreated wild-type donor mice resulted in microcirculatory damage of the transplanted graft, and no recipient survived more than 72 h. Transplanting grafts from untreated donor mice lacking either endothelial or inducible NOS led to similar outcomes. In contrast, donor treatment with tetrahydrobiopterin prevented microcirculatory breakdown and enabled long-term survival. The sole exception was transplantation of grafts from donor mice lacking neuronal NOS, which resulted in an intact microvascular structure and long-term recipient survival whether the donor mice were untreated or treated with tetrahydrobiopterin.

Conclusion

We demonstrate for the first time the crucial involvement of neuronal NOS in early microcirculatory derangements following solid organ transplantation. In this model, the protective effects of tetrahydrobiopterin are mediated by targeting this isoform.

17.

Background

Kidney transplantation is the therapy of choice for end-stage kidney disease. Graft life span is shorter than expected, due in part to the delayed diagnosis of various complications, specifically those with silent progression. Serum creatinine levels and proteinuria are recognized to be poor markers of mild kidney lesions, resulting in delayed clinical information. Many investigations are searching for early markers of graft damage. Decreased cortical microcirculation of the kidney graft has been related to a poor prognosis in kidney transplantation. Cortical capillary blood flow (CCBF) can be measured by real-time contrast-enhanced sonography (RT-CES). Our aim was to describe the natural history of CCBF over time under diverse conditions of kidney transplantation, to explore the influence of donor conditions and recipient events, and to determine the capacity of CCBF to predict renal function in the medium term.

Patients and Methods

RT-CES was performed in 79 consecutive kidney transplant recipients during the first year after transplantation under regular clinical practice, and cortical capillary blood flow was measured. Clinical variables were analyzed. Factors influencing CCBF were assessed by univariate and multivariate analyses using mixed regression models based on sequential measurements for each patient over time, with a first-order autoregressive structure for the covariance between measurements. Post-hoc comparisons were performed using the Bonferroni correction.

Results

CCBF values varied significantly over the study periods and were significantly lower at 48 h and day 7. Brain-dead donor age and CCBF levels showed an inverse relationship (r = -0.62, p<0.001). Living donors showed higher mean CCBF levels than brain-dead donors at each study time point. These significant differences persisted at month 12 (54.5 ± 28.2 vs. 33.7 ± 30 dB/sec for living vs. brain-dead donors, respectively, p = 0.004) despite similar serum creatinine levels (1.5 ± 0.3 and 1.5 ± 0.5 mg/dL). A single rejection episode was associated with lower overall CCBF values over the first year. CCBF characterized medium-term graft function status better than serum creatinine level.

Conclusion

RT-CES is a non-invasive tool that can quantify and iteratively estimate cortical microcirculation. We have described the natural history of cortical capillary blood flow under regular clinical conditions.

18.

Objective

Fecal microbiota transplantation (FMT) is an investigational treatment for diseases thought to involve alterations in the intestinal microbiota including ulcerative colitis (UC). Case reports have described therapeutic benefit of FMT in patients with UC, possibly due to changes in the microbiota. We measured the degree to which the transplanted microbiota engraft following FMT in patients with UC using a donor similarity index (DSI).

Methods

Seven patients with mild to moderate UC (UC disease activity index scores 3–10) received a single colonoscopic administration of FMT. Metagenomic sequence data from stool were analyzed using an alignment-free comparison tool to measure the DSI and a phylogenetic analysis tool to characterize taxonomic changes. Clinical, endoscopic, histologic, and fecal calprotectin outcome measures were also collected.
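The abstract does not specify the alignment-free tool, so as a purely illustrative stand-in the sketch below computes a k-mer Jaccard similarity between donor and recipient reads, one simple alignment-free way to express "percent donor similarity"; the study's actual DSI computation may differ substantially.

```python
def kmer_set(sequence: str, k: int = 21) -> set:
    """All k-mers of a DNA sequence (a common alignment-free representation)."""
    seq = sequence.upper()
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def jaccard_similarity(donor_reads, recipient_reads, k: int = 21) -> float:
    """Jaccard similarity between the k-mer sets of two read collections.

    A toy stand-in for an alignment-free donor similarity index; not the
    dedicated metagenomic comparison tool used in the cited study.
    """
    donor_kmers = set().union(*(kmer_set(r, k) for r in donor_reads))
    recipient_kmers = set().union(*(kmer_set(r, k) for r in recipient_reads))
    if not donor_kmers or not recipient_kmers:
        return 0.0
    return len(donor_kmers & recipient_kmers) / len(donor_kmers | recipient_kmers)

# Toy example with short reads and a small k
print(jaccard_similarity(["ACGTACGTACGTACGTACGTACGTA"],
                         ["ACGTACGTACGTACGTACGTTTTTA"], k=8))
```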

Results

One of the 5 patients from whom sequencing data were available achieved the primary endpoint of 50% donor similarity at week 4; an additional 2 patients achieved 40% donor similarity. One patient with 40% donor similarity achieved clinical and histologic remission 1 month after FMT. However, this remission was lost by 2–3 months, and its loss correlated with a decrease in DSI. The remaining patients did not demonstrate clinical response or remission. Histology scores improved in all but 1 patient. No patients remained in remission 3 months after FMT.

Conclusions

Following a single colonoscopic fecal transplant, a DSI of 40–50% was achieved in about two-thirds of recipients. This level of engraftment correlated with a temporary clinical improvement in only 1 of 5 patients. Larger sample sizes could further validate this method for measuring engraftment, and changes in transplant frequency or method might improve microbiota engraftment and efficacy.

Trial Registration

ClinicalTrials.gov NCT01742754

19.

Background

Bronchial epithelium is a target of the alloimmune response in lung transplantation, and intact epithelium may protect allografts from rejection and obliterative bronchiolitis (OB). Herein we study the influence of chimerism on bronchial epithelium and OB development in pigs.

Methods

A total of 54 bronchial allografts from immunosuppressed and unimmunosuppressed recipients were serially obtained 2–90 days after transplantation. Histology (H&E) was assessed, and fluorescence in situ hybridization (FISH) for the Y chromosome using a pig-specific DNA label was used to detect recipient-derived cells in the graft epithelium and bronchial wall and to track donor cell migration to recipient organs. Ingraft chimerism was studied using male recipients with female donors, whereas donor cell migration to recipient organs was studied using female recipients with male donors.

Results

Early appearance of recipient-derived cells in the airway epithelium appeared predictive of epithelial destruction (R = 0.610–0.671, p < 0.05) and of obliteration of the bronchial lumen (R = 0.698, p < 0.01). All allografts with preserved epithelium showed epithelial chimerism throughout the follow-up. Antirejection medication did not prevent, but delayed, the appearance of Y-chromosome-positive cells in the epithelium (p < 0.05) and bronchial wall (p < 0.05).

Conclusions

In this study, we demonstrate that early appearance of Y chromosomes in the airway epithelium predicts features characteristic of OB. Chimerism occurred in all allografts, including those without features of OB. We therefore suggest that ingraft chimerism may be a mechanism involved in the repair of alloimmune-mediated tissue injury after transplantation.

20.

Objectives

Induction of an immune response is a major problem in replacement therapies for inherited protein deficiencies. Tolerance created in utero can facilitate postnatal treatment. In this study, we aimed to induce immune tolerance towards a foreign protein by early-gestational cell transplantation into the chorionic villi under ultrasound guidance in a murine model.

Methods

Pregnant C57BL/6 (B6) mice on day 10 of gestation were anesthetized and imaged by high-resolution ultrasound. Murine embryos and their placentas were positioned to obtain a clear B-mode and power-mode view of the labyrinth, which is the murine equivalent of the human chorionic villi. Bone marrow cells (BMCs) from B6 green fluorescent protein (B6-GFP) transgenic mice were injected with glass microcapillary pipettes into the fetal side of the placenta, which includes the labyrinth. Each fetal mouse received 2 × 10⁵ viable GFP-BMCs. After birth, we evaluated the humoral and cell-mediated immune response against GFP.

Results

Bone marrow transfer into the fetal side of the placenta efficiently distributed donor cells to the fetal mice. The survival rate of the procedure was 13.5% (5 of 37). Successful engraftment of B6-GFP donor skin grafts was observed in all recipient mice (5 of 5) 6 weeks after birth. Induction of anti-GFP antibodies was completely inhibited. Cytotoxic immune reactivity of thymic cells against GFP-expressing cells was suppressed, as shown by ELISPOT assay.

Conclusions

In this study, we used early-gestational placental injection targeting the murine fetus to transfer donor cells carrying a foreign protein into the fetal circulation. This approach is sufficient to induce both humoral and cell-mediated immune tolerance against the foreign protein.
