Similar Documents
A total of 20 similar documents were retrieved.
1.

Introduction

With increasing numbers of patients diagnosed with ESRD, arteriovenous fistula (AVF) maturation has become a major factor in improving both dialysis-related outcomes and quality of life for these patients. Compared with other types of access, a functional AVF is the least likely to be associated with thrombosis, infection, hospital admission, secondary intervention to maintain patency, and death.

Aim

To study the demographic factors implicated in the functional maturation of arteriovenous fistulas, and to explore any possible association between preoperative haematological investigations and functional maturation.

Methods

We performed a retrospective chart review of all patients with ESRD who were referred to the vascular service in the University Hospital of Limerick for creation of vascular access for HD. We included patients with primary AVFs and excluded those who underwent secondary procedures.

Results

The overall AVF functional maturation rate in our study was 53.7% (52/97). Female gender showed a significant association with non-maturation (P = 0.004) and was the only predictor of non-maturation in a logistic regression model (P = 0.011). Patients who had a history of renal transplant (P = 0.036), had relatively lower haemoglobin levels (P = 0.01), or were on calcium channel blockers (P = 0.001) showed better functional maturation rates.

Conclusion

Female gender was found to be associated with functional non-maturation, while a history of kidney transplant, calcium channel blocker use and low haemoglobin levels were all associated with successful functional maturation. In view of the conflicting evidence in the literature, large prospective multi-centre registry-based studies with well-defined outcomes are needed.

2.

Background

Lung transplantation has been established as the definitive treatment option for patients with advanced lymphangioleiomyomatosis (LAM). However, the prognosis after registration and the circumstances of lung transplantation with sirolimus therapy have never been reported.

Methods

In this national survey, we analyzed data from 98 LAM patients registered for lung transplantation in the Japan Organ Transplantation Network.

Results

Transplantation was performed in 57 patients as of March 2014. The survival rate was 86.7% at 1 year, 82.5% at 3 years, 73.7% at 5 years, and 73.7% at 10 years. Of the 98 patients, 21 had an inactive status and had received sirolimus more frequently than those with an active status (67% vs. 5%, p<0.001). Nine of the twelve patients who remained inactive as of March 2014 had initiated sirolimus before or while on the waiting list and remained on sirolimus thereafter. Although the difference was not statistically significant, the survival rate after registration tended to be better for lung transplant recipients than for those who awaited transplantation (p = 0.053).

Conclusions

Lung transplantation is a satisfactory therapeutic option for advanced LAM, but the circumstances of pre-transplantation LAM patients are likely to change with the use of sirolimus.

3.

Introduction

Cardiovascular disease is the leading cause of mortality after renal transplantation. The purpose of this study was to analyze cardiovascular risk factors at transplantation and the occurrence of cardiovascular events in the first year after transplantation, and to evaluate the pre-transplant work-up.

Materials and Methods

In total, 244 renal transplant recipients older than 50 years were included. The results of pre-transplant work-up, including clinical evaluation, electrocardiogram, echocardiography, myocardial perfusion testing and coronary angiography were analyzed.

Results

Patients had multiple risk factors at inclusion on the renal transplantation waiting list, such as high blood pressure (94.7%), dyslipidemia (81.1%), smoking (45.3%), diabetes (23.6%), a past history of cardiovascular disease (21.3%) and obesity (12.7%). Following transplantation, 15.5% (n = 38) of patients experienced a cardiovascular event, including acute coronary syndrome in 2.8% (n = 7), an isolated increase in troponin level in 5.8% (n = 14) and new-onset atrial fibrillation in 5.3% (n = 13). The pre-transplant parameters associated with a cardiovascular event were a past medical history of cardiovascular disease (HR = 2.06 [1.06–4.03], p = 0.03), echocardiographic left ventricular hypertrophy (HR = 2.04 [1.04–3.98], p = 0.037) and abnormal myocardial perfusion testing (HR = 2.25 [1.09–5.96], p = 0.03). Pre-transplantation evaluation allowed the diagnosis of previously unknown coronary artery lesions in 8.9% of patients.

4.

Background

We combined the outcomes of all randomised controlled trials to investigate the safety and efficacy of steroid avoidance or withdrawal (SAW) regimens in paediatric kidney transplantation compared with steroid-based (SB) regimens.

Methods

A systematic literature search of PubMed, Embase, the Cochrane Library, trial registries and BIOSIS Previews was performed. The change in the height standard deviation score from baseline (ΔHSDS) and acute rejection were the primary endpoints.

Results

Eight reports from 5 randomised controlled trials were included, with a total of 528 patients. A significant increase in ΔHSDS was observed in the SAW group (mean difference (MD) = 0.38, 95% confidence interval (CI) 0.07–0.68, P = 0.01), particularly within the first year post-withdrawal (MD = 0.22, 95% CI 0.10–0.35, P = 0.0003) and in prepubertal recipients (MD = 0.60, 95% CI 0.21–0.98, P = 0.002). There was no significant difference in the risk of acute rejection between the groups (relative risk = 1.04, 95% CI 0.80–1.36, P = 0.77).

Conclusions

The SAW regimen is justified in select paediatric renal allograft recipients because it provides significant benefits in post-transplant growth within the first year post-withdrawal, with minimal effects on the risk of acute rejection, graft function, and graft and patient survival within 3 years post-withdrawal. These select paediatric recipients should have the following characteristics: prepubertal; Caucasian; primary disease not related to immunological factors; de novo kidney transplant recipient; low panel-reactive antibody level.
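The pooled mean differences and relative risk quoted above are standard meta-analytic summaries of the individual trial estimates. As a purely illustrative sketch, and not the authors' analysis code, a fixed-effect inverse-variance pooling of per-trial mean differences can be computed as follows; the per-trial values are hypothetical placeholders.

```python
import math

# Hypothetical per-trial mean differences in ΔHSDS and their standard errors
# (placeholders only, not the values from the five trials in this review).
trials = [
    {"md": 0.45, "se": 0.20},
    {"md": 0.30, "se": 0.25},
    {"md": 0.55, "se": 0.30},
]

# Fixed-effect inverse-variance pooling: each trial is weighted by 1 / SE^2.
weights = [1.0 / (t["se"] ** 2) for t in trials]
pooled_md = sum(w * t["md"] for w, t in zip(weights, trials)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))

# 95% confidence interval for the pooled mean difference.
ci_low, ci_high = pooled_md - 1.96 * pooled_se, pooled_md + 1.96 * pooled_se
print(f"Pooled MD = {pooled_md:.2f} (95% CI {ci_low:.2f} to {ci_high:.2f})")
```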

5.

Background

The growing burden of end-stage renal disease (ESRD) has been a great challenge to the health care system of China. However, exact epidemiological data for ESRD in China remain unclear. We aimed to investigate the epidemiology of ESRD treated by renal replacement therapy (RRT) in Nanjing, based on an analysis of ten years of data from Nanjing's insured population of approximately three million.

Methods

Using the electronic registry system of Urban Employee Basic Medical Insurance (UEBMI), we included all subjects insured by UEBMI in Nanjing from 2005 to 2014 and identified subjects who developed ESRD and started RRT in this cohort.

Results

The UEBMI population in Nanjing increased from 1,301,882 in 2005 to 2,921,065 in 2014, among whom a total of 5,840 subjects developed ESRD and received RRT. Over the 10-year period, the adjusted incidence rate of RRT in the UEBMI cohort gradually decreased from 289.3 per million population (pmp) in 2005 to 218.8 pmp in 2014, whereas the adjusted prevalence rate increased steadily from 891.7 pmp in 2005 to 1,228.6 pmp in 2014. The adjusted annual mortality rate decreased from 138.4 per 1,000 patient-years in 2005 to 97.8 per 1,000 patient-years in 2014. Long-term survival fluctuated over the decade, with the 1-year survival rate ranging from 85.1% to 91.7%, the 3-year survival rate from 69.9% to 78.3%, and the 5-year survival rate from 58% to 65.4%.

Conclusion

Nanjing is facing an increasing burden of ESRD as health care reform improves access to treatment. The ten-year complete registry data on RRT in urban employees in Nanjing provide a unique opportunity to understand the real threat of ESRD confronting China during its health care transition.

6.

Background

End-stage renal disease patients have a dysfunctional, prematurely aged peripheral T-cell system. Here we hypothesized that the degree of premature T-cell ageing before kidney transplantation predicts the risk for early acute allograft rejection (EAR).

Methods

A total of 222 living-donor kidney transplant recipients were prospectively analyzed. EAR was defined as biopsy-proven acute allograft rejection within 3 months after kidney transplantation. The differentiation status of circulating T cells, relative telomere length, and the number of CD31+ naive T cells were determined as T-cell ageing parameters.

Results

Of the 222 patients analyzed, 30 (14%) developed EAR. Donor age and the historical panel-reactive antibody score were significantly higher (p = 0.024 and p = 0.039, respectively), and the number of related-donor kidney transplantations was significantly lower (p = 0.018), in the EAR group. EAR patients showed lower CD4+CD28null T-cell numbers (p<0.01), and the same trend was observed for CD8+CD28null T-cell numbers (p = 0.08). No differences in the other ageing parameters were found. A multivariate Cox regression analysis showed that a higher CD4+CD28null T-cell number was associated with a lower risk for EAR (HR: 0.65, p = 0.028). In vitro, a significantly lower percentage of alloreactive T cells was observed within the CD28null T-cell fraction (p<0.001).

Conclusion

Immunological ageing-related expansion of highly differentiated CD28null T cells is associated with a lower risk for EAR.
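The multivariate Cox model reported above (HR 0.65 for the CD4+CD28null T-cell covariate) can be sketched with a standard survival-analysis library. The example below uses the Python lifelines package on a tiny hypothetical dataset with made-up column names; it illustrates the kind of model fitted, not the study's actual code or data.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical toy data: one row per recipient, time to early acute rejection
# (days, censored at 90 days), an event indicator, and candidate covariates.
df = pd.DataFrame({
    "time_to_ear_days":   [90, 45, 90, 20, 90, 60, 90, 30],
    "ear_event":          [0,  1,  0,  1,  0,  1,  0,  1],
    "cd4_cd28null_count": [120, 40, 15, 10, 200, 90, 30, 25],
    "donor_age":          [45, 62, 38, 70, 55, 60, 50, 48],
    "historical_pra":     [0,  10, 5,  25, 0,  15, 20, 8],
})

# Multivariable Cox proportional hazards model; a hazard ratio below 1 for
# cd4_cd28null_count would correspond to a lower risk of EAR.
cph = CoxPHFitter()
cph.fit(df, duration_col="time_to_ear_days", event_col="ear_event")
cph.print_summary()
```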

7.

Background

The proportion of elderly patients starting dialysis is increasing globally. Whether early referral (ER) of elderly patients is associated with favorable outcomes remains under debate. We investigated the influence of referral timing on the mortality of elderly patients.

Methods

We retrospectively assessed mortality in 820 patients aged ≥70 years with end-stage renal disease (ESRD) who initiated hemodialysis at a tertiary university hospital between 2000 and 2010. Mortality data were obtained from the time of dialysis initiation until December 2010. We assigned patients to one of two groups according to the time of their first encounter with a nephrologist relative to dialysis initiation: ER (≥3 months) and late referral (LR; <3 months).

Results

During a mean follow-up period of 25.1 months, the ER group showed a 24% lower risk of long-term mortality relative to the LR group (HR = 0.760, P = 0.009). The reduction in 90-day mortality for ER patients was 58% (HR = 0.422, P = 0.012). However, the difference in mortality between the ER and LR groups was no longer statistically significant across age groups after 90 days. Old age, LR, use of a central venous catheter, higher white blood cell count and corrected calcium level, and lower levels of albumin, creatinine, hemoglobin, and sodium were significantly associated with an increased risk of mortality.

Conclusions

Timely referral was also associated with reduced mortality in elderly ESRD patients who initiated hemodialysis. In particular, the initial 90-day mortality reduction in ER patients accounted for much of the mortality difference during the follow-up period.

8.

Introduction

Immunosuppressants are used ubiquitously after liver transplantation to prevent allograft rejection, but their effects on hepatocytes are unknown. Experimental data from non-liver cells indicate that immunosuppressants may promote cell death, thereby driving an inflammatory response that promotes fibrosis, and raise concerns that a similar effect may occur within the liver. We evaluated apoptosis in the liver tissue of liver transplant recipients and correlated these findings with in vitro experiments investigating the effects of immunosuppressants on apoptosis in primary hepatocytes.

Methods

Hepatocyte apoptosis was assessed by immunohistochemistry for M30 CytoDEATH and cleaved PARP in human liver tissue. Primary mouse hepatocytes were treated with various combinations of cyclosporine, tacrolimus, sirolimus, or mycophenolate mofetil (MMF). Cell viability and apoptosis were evaluated using crystal violet assays and Western immunoblots probed for cleaved PARP and cleaved caspase 3.

Results

Post-liver transplant patients had 4.9-fold and 1.7-fold increases in M30 CytoDEATH and cleaved PARP, respectively, compared with normal subjects. Cyclosporine and tacrolimus at therapeutic concentrations did not affect hepatocyte apoptosis; however, when they were combined with MMF, cell death was significantly enhanced. Cell viability was reduced by 46% and 41%, cleaved PARP was increased 2.6-fold and 2.2-fold, and cleaved caspase 3 was increased 2.2-fold and 1.8-fold following treatment with cyclosporine/MMF and tacrolimus/MMF, respectively. By contrast, the sirolimus/MMF combination did not significantly reduce hepatocyte viability or promote apoptosis.

Conclusion

Commonly used immunosuppressive drug regimens employed after liver transplantation enhance hepatocyte cell death and may thus contribute to the increased liver fibrosis that occurs in a proportion of liver transplant recipients.

9.

Background

Patients started on long-term hemodialysis have typically had low reported rates of renal recovery, with recent estimates ranging from 0.9% to 2.4%, whereas higher recovery rates have been reported in cohorts with larger proportions of patients with acute renal failure requiring dialysis.

Study Design

Our analysis followed approximately 194,000 patients who initiated hemodialysis during a 2-year period (2008–2009) with CMS-2728 forms submitted to CMS by dialysis facilities, cross-referenced with patient record updates through the end of 2010, and tracked through December 2010 in the CMS SIMS registry.

Results

We report a sustained renal recovery rate (i.e., no return to ESRD during the available follow-up period) among Medicare ESRD patients of >5%, much higher than previously reported. Recovery occurred primarily in the first 2 months after incident dialysis and was more likely when renal failure was secondary to etiologies associated with acute kidney injury. Patients experiencing sustained recovery were markedly less likely than true long-term ESRD patients to have a permanent vascular access in place at incident hemodialysis, while non-White patients and patients with any prior nephrology care appeared to have significantly lower rates of renal recovery. We also found widespread geographic variation in rates of renal recovery across the United States.

Conclusions

Renal recovery rates in the US Medicare ESRD program are higher than previously reported and appear to have significant geographic variation. Patients with diagnoses associated with acute kidney injury who are initiated on long-term hemodialysis have significantly higher rates of renal recovery than the general ESRD population and lower rates of permanent access placement.

10.

Background

HLA-directed antibodies play an important role in acute and chronic allograft rejection. During viral infection in a patient with HLA antibodies, HLA antibody levels may rise even though there is no new immunization with antigen. However, it is not known whether the converse occurs, or whether changes in non-donor-specific antibodies are associated with any outcomes following HLA antibody-incompatible renal transplantation.

Methods

55 patients (31 women and 24 men) who underwent HLA antibody-incompatible (HLAi) renal transplantation in our center from September 2005 to September 2010 were included in the studies. We analysed the data using two different approaches, based on (i) donor-specific antibody (DSA) levels and (ii) rejection episodes post-transplant. HLA antibody levels were measured during the early post-transplant period, and corresponding CMV, VZV and anti-HBs IgG antibody levels and blood group IgG, IgM and IgA antibody levels were quantified.

Results

Despite a significant rise in DSA levels, no significant rise in non-donor-specific HLA antibodies, viral antibodies or blood group antibodies was found. In the rejection episode analyses, multiple logistic regression modelling showed that the change in DSA was significantly associated with rejection (p = 0.002), even when adjusted for other antibody levels. No other antibody levels were predictive of rejection. An increase in DSA from pre-treatment to a post-transplant peak of 1000 was equivalent to an increased chance of rejection, with an odds ratio of 1.47 (1.08, 2.00).

Conclusion

Despite increases or decreases in DSA levels, there were no changes in the viral or blood group antibodies in these patients. Thus the DSA rise is specific, in contrast to the viral, blood group or third-party antibodies post-transplantation. Increases in DSA post-transplant relative to pre-treatment levels are strongly associated with the occurrence of rejection.

11.

Background/Objective

The RAC1 gene could influence susceptibility to renal failure by altering the activity and expression of Rac1, a member of the Rho family of small GTP-binding proteins. In clinical practice, renal transplantation provides the optimal treatment for people with end-stage renal disease (ESRD). The objective of the present study was to determine whether RAC1 gene polymorphisms were associated with primary ESRD susceptibility in Chinese renal transplant recipients.

Methods

Six single nucleotide polymorphisms (SNPs) of RAC1 gene, including rs836488 T>C, rs702482 A>T, rs10951982 G>A, rs702483 A>G, rs6954996 G>A, and rs9374 G>A, were genotyped in 300 renal transplant recipients (cases) and 998 healthy Chinese subjects (controls) by using TaqMan SNP genotyping assay. Allele, genotype, and haplotype frequencies of the six SNPs were compared between cases and controls. Odds ratios (OR) and 95% confidence intervals (CI) were calculated in logistic regression models to evaluate the associations of the six SNPs with ESRD risk.

Results

The genotype distributions of the six SNPs in controls were consistent with Hardy-Weinberg equilibrium (P > 0.05). Association analysis revealed that three SNPs were significantly associated with ESRD risk. Positive associations with ESRD risk were found for rs836488, rs702482, and rs702483 in the co-dominant model (minor-allele homozygotes versus major-allele homozygotes); specifically, the frequencies of the minor-allele homozygotes and the minor allele for these three SNPs were higher in cases than in controls. These three SNPs were also associated with increased ESRD risk under the additive model (P < 0.05), and positive associations were additionally found for rs836488 in the dominant model (P < 0.05) and for rs702483 in the recessive model (P < 0.05). All of these associations were independent of confounding factors. The other three SNPs (rs10951982, rs6954996, and rs9374) were not associated with ESRD risk in any comparison model (P > 0.05). In haplotype analysis, carriers of the "C-T-G-G-G-G" haplotype had a significantly higher risk of ESRD than carriers of the most common haplotype "T-A-G-A-G-G" (P = 0.011, OR = 1.46, 95% CI = 1.09–1.94).

Conclusion

This study suggests that polymorphisms of the RAC1 gene might influence susceptibility to ESRD in the Chinese Han population. Further studies are necessary to confirm our findings.
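The Hardy-Weinberg equilibrium check mentioned in the results is a chi-square comparison of observed genotype counts in the controls against the counts expected from the estimated allele frequencies. The sketch below shows that calculation for one SNP with hypothetical genotype counts (the abstract does not report the actual counts); it is illustrative only.

```python
from scipy.stats import chisquare

# Hypothetical genotype counts among the 998 controls for one SNP
# (e.g. rs836488 T>C): TT homozygotes, TC heterozygotes, CC homozygotes.
n_TT, n_TC, n_CC = 520, 390, 88
n = n_TT + n_TC + n_CC

# Allele frequencies estimated from the observed genotype counts.
p = (2 * n_TT + n_TC) / (2 * n)   # frequency of the T allele
q = 1 - p                         # frequency of the C allele

# Expected genotype counts under Hardy-Weinberg equilibrium (p^2, 2pq, q^2).
observed = [n_TT, n_TC, n_CC]
expected = [p * p * n, 2 * p * q * n, q * q * n]

# ddof=1 accounts for the allele frequency estimated from the data,
# giving 3 - 1 - 1 = 1 degree of freedom for the test.
chi2, p_value = chisquare(observed, f_exp=expected, ddof=1)
print(f"HWE chi-square = {chi2:.2f}, P = {p_value:.2f}")  # P > 0.05 -> consistent with HWE
```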

12.

Objectives

The primary objective was to examine trends in new HIV diagnoses in a UK area of high HIV prevalence between 2000 and 2012 with respect to site of diagnosis and stage of HIV infection.

Design

Single-centre observational cohort study.

Setting

An outpatient HIV department in a secondary care UK hospital.

Participants

1359 HIV-infected adults.

Main Outcome Measures

Demographic information (age, gender, ethnicity, and sexual orientation), site of initial HIV diagnosis (Routine settings such as HIV/GUM clinics versus Non-Routine settings such as primary care and community venues), stage of HIV infection, CD4 count and seroconversion symptoms were collated for each participant.

Results

There was a significant increase in the proportion of new HIV diagnoses made in Non-Routine settings (from 27.0% in 2000 to 58.8% in 2012; p<0.001). Overall, the rate of late diagnosis decreased from 50.7% to 32.9% (p = 0.001), while diagnosis of recent infection increased from 23.0% to 47.1% (p = 0.001). Of those with recent infection, the proportion reporting symptoms consistent with a seroconversion illness rose significantly over the 13 years (from 17.6% to 65.0%; p<0.001).

Conclusions

This is, we believe, the first study to demonstrate significant improvements in HIV diagnosis and a shift in the site of HIV diagnosis from HIV/GUM settings to primary care and community settings as a result of multiple initiatives.

13.

Background

Islet transplantation may potentially cure type 1 diabetes mellitus (T1DM). However, immune rejection, especially that induced by the alloreactive T-cell response, remains a restraining factor for the long-term survival of grafted islets. Programmed death ligand-1 (PD-L1) is a negative costimulatory molecule, and PD-L1 deficiency within the donor heart accelerates allograft rejection. Here, we investigated whether PD-L1 deficiency in donor islets reduces allograft survival time.

Methods

Glucose stimulation assays were performed to evaluate whether PD-L1 deficiency has detrimental effects on islet function. Islets isolated from PD-L1-deficient mice or wild-type (WT) mice (C57BL/6J) were implanted beneath the renal capsule of streptozotocin (STZ)-induced diabetic BALB/c mice. Blood glucose levels and graft survival time after transplantation were monitored. In addition, we analyzed the residual islets, infiltrating immune cells and alloreactive cells from the recipients.

Results

PD-L1 deficiency within islets did not affect islet function. However, islet PD-L1 deficiency accelerated allograft rejection and was associated with enhanced inflammatory cell infiltration and recipient T-cell alloreactivity.

Conclusions

This is the first report to demonstrate that PD-L1 deficiency accelerates islet allograft rejection and modulates recipient alloimmune responses.

14.

Introduction

Data on the efficacy and safety of everolimus in pediatric renal transplantation compared to other immunosuppressive regimens are scarce.

Patients/Methods

We therefore performed a multicenter, observational, matched cohort study over 4 years post-transplant in 35 patients on everolimus plus low-dose cyclosporine, who were matched (1:2) with a control group of 70 children receiving a standard-dose calcineurin-inhibitor- and mycophenolate mofetil-based regimen.

Results

Corticosteroids were withdrawn in 83% of patients in the everolimus group vs. 39% in the control group (p<0.001). Patient and graft survival were comparable. The rate of biopsy-proven acute rejection episodes (Banff score ≥ IA) during the first year post-transplant was 6% in the everolimus group vs. 13% in the control group (p = 0.23). The rate of de novo donor-specific HLA antibodies (11% with everolimus, 18% in controls) was comparable (p = 0.55). At 4 years post-transplant, mean eGFR was 56±33 ml/min per 1.73 m² in the everolimus group vs. 63±22 ml/min per 1.73 m² in the control group (p = 0.14). Everolimus therapy was associated with less BK polyomavirus replication (3% vs. 17% in controls; p = 0.04), but with a higher rate of arterial hypertension and more hyperlipidemia (p<0.001).

Conclusion

In pediatric renal transplantation, an everolimus-based regimen with low-dose cyclosporine yields four-year results comparable to those of a standard regimen, but with a different side-effect profile.

15.

Introduction

The delta neutrophil index (DNI) is the fraction of circulating immature granulocytes, which reflects infectious and/or septic conditions. Distinguishing acute graft pyelonephritis (AGPN) from acute graft rejection is a frequently encountered diagnostic and therapeutic dilemma in kidney transplant recipients, but little is known about the clinical usefulness of the DNI in differentiating the two conditions.

Material & Methods

A total of 90 episodes of AGPN or acute graft rejection were evaluated at Kangdong Sacred Heart Hospital between 2008 and 2014. We performed a retrospective analysis of demographic, clinical, and laboratory data. Receiver operating characteristic (ROC) curve analysis and multivariate logistic regression were used to assess the utility of the DNI in discriminating between AGPN and acute graft rejection.

Results

The AGPN group had significantly higher DNI values than the acute graft rejection group (2.9% vs. 1.9%, P < 0.001). The area under the ROC curve for the DNI in discriminating between AGPN and acute graft rejection was 0.85 (95% confidence interval [CI] 0.76–0.92, P < 0.001). A DNI value of 2.7% was selected as the cut-off for AGPN, and kidney transplant recipients with a DNI ≥ 2.7% were at higher risk of infection than those with a DNI < 2.7% (odds ratio [OR] 40.50; 95% CI 8.68–189.08; P < 0.001). In a multivariate logistic regression analysis, the DNI was a significant independent predictor of AGPN after adjusting for age, sex, log WBC count, log neutrophil count, log lymphocyte count, CRP concentration, and procalcitonin concentration (OR 4.32; 95% CI 1.81–10.34, P < 0.001).

Conclusions

This study showed that the DNI is an effective marker for differentiating between AGPN and acute graft rejection. These findings suggest that the DNI may be a useful marker in the management of these patients.
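The 2.7% DNI cut-off reported above is the kind of threshold typically chosen from an ROC curve, for example by maximising Youden's index (sensitivity + specificity - 1). The following minimal sketch uses scikit-learn on a small, entirely hypothetical set of DNI values; it illustrates the procedure rather than the authors' actual analysis.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Hypothetical data: DNI values (%) and labels (1 = AGPN, 0 = acute graft rejection).
dni = np.array([3.1, 0.8, 4.2, 1.5, 2.9, 1.1, 5.0, 2.2, 3.6, 0.9])
label = np.array([1,   0,   1,   0,   1,   0,   1,   0,   1,   0])

# The area under the ROC curve quantifies how well the DNI separates the two groups.
auc = roc_auc_score(label, dni)

# Choose the cut-off that maximises Youden's index (sensitivity + specificity - 1),
# i.e. the threshold where the gap between true and false positive rates is largest.
fpr, tpr, thresholds = roc_curve(label, dni)
best_cutoff = thresholds[np.argmax(tpr - fpr)]

print(f"AUC = {auc:.2f}, optimal DNI cut-off = {best_cutoff:.1f}%")
```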

16.

Introduction

Acute kidney injury is associated with a poor prognosis in acute liver failure, but little is known about outcomes in patients undergoing transplantation for acute liver failure who require renal replacement therapy.

Methods

A retrospective analysis of the United Kingdom Transplant Registry was performed (1 January 2001 to 31 December 2011), with patient and graft survival determined using Kaplan-Meier methods. Cox proportional hazards models were used together with propensity score-based full matching on renal replacement therapy use.

Results

Three-year patient and graft survival for patients receiving renal replacement therapy were 77.7% and 72.6%, compared with 85.1% and 79.4% for those not requiring renal replacement therapy (P<0.001 and P = 0.009, respectively; n = 725). In a Cox proportional hazards model, renal replacement therapy was a predictor of patient death (hazard ratio (HR) 1.59, 95% CI 1.01–2.50, P = 0.044) but not of graft loss (HR 1.39, 95% CI 0.92–2.10, P = 0.114). In groups fully matched on baseline covariates, those not receiving renal replacement therapy with a serum creatinine greater than 175 μmol/L had a significantly higher risk of graft failure than those receiving renal replacement therapy.

Conclusion

In patients transplanted for acute liver failure, use of renal replacement therapy is a strong predictor of patient death and graft loss. Those not receiving renal replacement therapy with an elevated serum creatinine may be at greater risk of early graft failure than those receiving renal replacement therapy. A low threshold for instituting renal replacement therapy may therefore be beneficial.
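Patient and graft survival in this registry analysis were estimated with Kaplan-Meier methods and compared between recipients who did and did not receive renal replacement therapy. The sketch below shows a minimal Kaplan-Meier estimate of this kind using the Python lifelines package on hypothetical follow-up data; it is not the registry analysis itself.

```python
import pandas as pd
from lifelines import KaplanMeierFitter

# Hypothetical follow-up data: years to death (or censoring at last follow-up),
# an event indicator, and whether renal replacement therapy (RRT) was used.
df = pd.DataFrame({
    "years": [0.5, 3.0, 1.2, 3.0, 2.1, 3.0, 0.8, 3.0],
    "died":  [1,   0,   1,   0,   1,   0,   1,   0],
    "rrt":   [1,   1,   1,   1,   0,   0,   0,   0],
})

# Fit a Kaplan-Meier curve separately for each RRT group and report
# the estimated survival probability at 3 years.
kmf = KaplanMeierFitter()
for rrt_group, grp in df.groupby("rrt"):
    kmf.fit(grp["years"], event_observed=grp["died"])
    surv_3y = kmf.survival_function_at_times(3.0).iloc[0]
    print(f"RRT = {rrt_group}: 3-year survival = {surv_3y:.2f}")
```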

17.

Objectives

Cytomegalovirus (CMV) infection in liver transplant recipients is common and results in significant morbidity and mortality. Intravenous ganciclovir or oral valganciclovir is the standard treatment for CMV infection. The present study investigated the efficacy of oral valganciclovir as a preemptive treatment for CMV infection after liver transplantation.

Methods

Between 2012 and 2013, 161 patients underwent liver transplantation at Samsung Medical Center. All patients received tacrolimus, steroids, and mycophenolate mofetil. Patients with CMV infection were administered oral valganciclovir (VGCV) 900 mg once daily or intravenous ganciclovir (GCV) 5 mg/kg twice daily as preemptive treatment. Stable liver transplant recipients received VGCV.

Results

Eighty-three patients (51.6%) received antiviral therapy as a preemptive treatment because of CMV infection. The model for end-stage liver disease (MELD) score and the proportions of Child-Pugh class C, hepatorenal syndrome, and deceased-donor liver transplantation were higher in the CMV infection group than in the no-CMV-infection group. Sixty-one patients received GCV and 22 received VGCV. MELD scores were higher in the GCV group than in the VGCV group, but there were no statistically significant differences in the other pretransplant variables between the two groups. AST, ALT, and total bilirubin levels at the time of CMV infection were higher in the GCV group than in the VGCV group. The incidence of recurrent CMV infection was 14.8% in the GCV group and 4.5% in the VGCV group (P = 0.277).

Conclusion

Oral valganciclovir is feasible as a preemptive treatment for CMV infection in liver transplant recipients with stable graft function.

18.

Background

Risk factors for, and optimal surveillance of, renal dysfunction in patients on tenofovir disoproxil fumarate (TDF) remain unclear. We investigated whether the urine protein-to-osmolality (P/O) ratio is associated with renal dysfunction in HIV-infected persons on TDF.

Methods

This retrospective, single-center study investigated the relationship between parameters of renal function (estimated glomerular filtration rate (eGFR) and P/O ratio) and risk factors for the development of kidney dysfunction. Subjects were HIV-infected adults receiving TDF with at least one urinalysis and serum creatinine measurement performed between 2010 and 2013. Regression analyses were used to identify risk factors associated with an abnormal P/O ratio and an abnormal eGFR during TDF therapy.

Results

Patients were predominantly male (81%), and 65% were Caucasian. Mean age was 45.1 (±11.8) years, and median [IQR] duration of TDF therapy was 3.3 years [1.5–7.6]. Median CD4+ T-cell count and HIV viral load were 451 cells/μL [267.5–721.5] and 62 copies/mL [0–40,150], respectively. An abnormal P/O ratio was not associated with low eGFR; 68% of subjects had an abnormal P/O ratio and 9% had low eGFR. Duration of TDF use, age, diabetes and hypertension were associated with renal dysfunction in this study. After adjustment for age, subjects on TDF for more than 5 years had an almost four-fold higher likelihood of an abnormal P/O ratio than subjects on TDF for less than 1 year (OR 3.9; 95% CI 1.2–14.0; p = 0.024).

Conclusion

An abnormal P/O ratio is common in HIV-infected patients on TDF but was not significantly associated with low eGFR, suggesting that an abnormal P/O ratio may be a very early biomarker of decreased renal function in HIV-infected patients.

19.

Introduction

We explored the risk of end-stage renal disease (ESRD) among gout patients in a representative cohort in Taiwan.

Methods

The primary database used was the Taiwan National Health Insurance Research Database. Subjects older than 20 years without ESRD, coronary heart disease, or stroke were included in the study. Gout was defined as a diagnosis of gout together with medical treatment for gout. An ESRD case was defined by the presence of chronic renal failure necessitating long-term renal replacement therapy. Multivariate Cox proportional hazards models were used to evaluate the risk of ESRD among gout patients.

Results

The analysis included data from 656,108 patients who were followed up for a mean of 8.0 years. Among them, 19,963 (3.0%) had gout. At the end of 2008, 2,377 individuals (gout, n = 276; non-gout, n = 2,101) had developed ESRD, and 861 individuals (gout, n = 77, 27.9%; non-gout, n = 521, 24.8%) had died due to ESRD. The incidence rates of ESRD were 1.73 and 0.41 cases per 1,000 patient-years in the gout and non-gout groups, respectively. After adjustment for age, sex, and history of diabetes mellitus and/or hypertension, gout was associated with a hazard ratio (HR) of 1.57 for ESRD (95% confidence interval [CI] 1.38–1.79; P < 0.001). Among patients with ESRD, the adjusted HR for death in patients with gout was 0.95 (95% CI 0.74–1.23, P = 0.71), indicating a mortality risk similar to that of patients without gout.

Conclusions

Gout is associated with an increased hazard of developing ESRD.
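The incidence rates quoted above (1.73 vs. 0.41 cases per 1,000 patient-years) are simple event-count-per-person-time figures. The sketch below reproduces them approximately from the numbers given in the abstract, using the stated mean follow-up of 8.0 years as the per-patient person-time; the authors' exact person-time denominators are not reported here, so this is an approximation for illustration.

```python
# Approximate reconstruction of the incidence rates quoted in the abstract,
# using the mean follow-up of 8.0 years as each patient's person-time.
def rate_per_1000_py(events: int, n_patients: int, mean_follow_up_years: float) -> float:
    person_years = n_patients * mean_follow_up_years
    return events / person_years * 1000

gout_rate = rate_per_1000_py(events=276, n_patients=19_963, mean_follow_up_years=8.0)
non_gout_rate = rate_per_1000_py(events=2_101, n_patients=656_108 - 19_963,
                                 mean_follow_up_years=8.0)

print(f"Gout group: {gout_rate:.2f} per 1,000 patient-years")         # ~1.73
print(f"Non-gout group: {non_gout_rate:.2f} per 1,000 patient-years")  # ~0.41
```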

20.

Background

Arterial stiffness is closely associated with cardiovascular disease (CVD) in end-stage renal disease (ESRD) patients. However, the clinical significance of pre-transplant arterial stiffness and the impact of kidney transplantation (KT) on arterial stiffness have not yet been determined.

Methods

We measured brachial-ankle pulse wave velocity (baPWV) before KT and one year after KT. We evaluated the potential utility of pre-transplant baPWV as a screening test to predict CVD. The impact of KT on the progression of arterial stiffness was evaluated according to changes in baPWV after KT, and the factors influencing the change in baPWV after KT were also examined.

Results

The mean pre-transplant baPWV was 1508 ± 300 cm/s in ESRD patients, and 93.4% of patients had a higher baPWV than healthy controls. Pre-transplant baPWV was higher in patients with CVD than in those without CVD (1800 ± 440 vs. 1491 ± 265 cm/s, p<0.05) and was a strong predictive factor for CVD (OR 1.003, p<0.05). The optimal cut-off value of baPWV for the detection of CVD was 1591 cm/s, and this value was an independent predictor of CVD in KT recipients (OR 6.3, p<0.05). Post-transplant baPWV was significantly lower than the pre-transplant value (1418 ± 235 vs. 1517 ± 293 cm/s, p<0.05), and progression of arterial stiffness was not observed in 86.9% of patients. Logistic regression analysis revealed that a higher body mass index and the degree of increase in calcium levels were independent risk factors affecting baPWV after KT.

Conclusions

Evaluation of arterial stiffness with baPWV is a useful screening test for predicting CVD after KT, and KT is effective in preventing the progression of arterial stiffness in ESRD patients.
