Similar Articles
1.

Background

Comorbid conditions are highly prevalent among patients with end-stage renal disease (ESRD), and the comorbidity index score is a predictor of mortality in dialysis patients. The aim of this study was to perform a population-based cohort study investigating survival by age and Charlson comorbidity index (CCI) in incident dialysis patients.

Methods

Using the catastrophic illness registration of the Taiwan National Health Insurance Research Database for all patients from 1 January 1998 to 31 December 2008, individuals newly diagnosed with ESRD and receiving dialysis for more than 90 days were eligible for our study. Individuals younger than 18 years and patients who received a renal transplant either before or after dialysis were excluded. We calculated the CCI and age-weighted CCI by the Deyo-Charlson method according to ICD-9 codes and categorized the CCI into six groups: index scores <3, 4–6, 7–9, 10–12, 13–15, and >15. Cox regression models were used to analyze the association between age, CCI and survival, and to identify risk markers of survival.
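For illustration, a minimal Python sketch of how a Deyo-style CCI could be computed from ICD-9 codes and binned into score groups follows; the prefix-to-condition table and weights shown are a small hypothetical subset of the full Deyo mapping, not the study's actual implementation.

```python
# Illustrative sketch of a Deyo-style Charlson comorbidity index (CCI):
# map ICD-9 code prefixes to conditions, sum the Charlson weights,
# then bin the total score. The prefix/weight tables below are a small,
# hypothetical subset of the full Deyo mapping, shown only to make the
# procedure concrete.

CONDITION_WEIGHTS = {
    "myocardial_infarction": 1,        # e.g. ICD-9 410.x
    "congestive_heart_failure": 1,     # e.g. ICD-9 428.x
    "diabetes_with_complications": 2,  # e.g. ICD-9 250.4-250.7
    "moderate_severe_liver_disease": 3,
    "metastatic_solid_tumour": 6,
}

ICD9_PREFIXES = {
    "410": "myocardial_infarction",
    "428": "congestive_heart_failure",
    "2504": "diabetes_with_complications",
}

def charlson_score(icd9_codes):
    conditions = set()
    for code in icd9_codes:
        compact = code.replace(".", "")
        for prefix, condition in ICD9_PREFIXES.items():
            if compact.startswith(prefix):
                conditions.add(condition)
    return sum(CONDITION_WEIGHTS[c] for c in conditions)

def cci_group(score):
    # Score bands roughly as listed in the abstract (first band taken as <=3
    # so that every score falls into a group).
    bands = [(3, "<=3"), (6, "4-6"), (9, "7-9"), (12, "10-12"), (15, "13-15")]
    for upper, label in bands:
        if score <= upper:
            return label
    return ">15"

print(charlson_score(["410.1", "250.42"]), cci_group(3))  # -> 3 <=3
```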

Results

There were 79,645 incident dialysis patients, whose mean age (±SD) was 60.96 (±13.92) years; 51.43% of the patients were women and 51.2% were diabetic. In Cox proportional hazards models stratified by age, older patients had significantly higher mortality than younger patients. The mortality risk was higher in persons with a higher CCI than in those with a low CCI. Mortality increased steadily with higher age or comorbidity in both unadjusted and adjusted models. For all age groups, mortality rates increased across CCI groups, with the highest rates occurring in the oldest age groups.

Conclusions

Age and CCI are both strong predictors of survival in Taiwan. Older age or a higher comorbidity index in incident dialysis patients is associated with lower long-term survival rates. These population-based estimates may assist clinicians in making decisions when patients need long-term dialysis.

2.

Background

Invariant Natural Killer T (iNKT) cells are a determinant of the course of infections and diseases; however, their role in the pathogenesis of non-infectious co-morbidities in HIV-positive patients is unknown.

Methods

Flow cytometry was used to investigate iNKT cell frequency, phenotype and function in HIV-infected patients on HAART with bone and/or cardiovascular disorders and in HIV-positive controls free from co-morbidities.

Results

iNKT cells from subjects with bone and cardiovascular impairment expressed high levels of CD161 and predominantly secreted TNF. iNKT cells from individuals with bone disease alone did not show any distinctive phenotypic or functional characteristics. The functional capacity of iNKT cells in patients with cardiovascular disorders was impaired, with no cytokine release upon stimulation.

Conclusion

iNKT cells may have a role in non-infectious co-morbidities in treated HIV disease, possibly through the exacerbation of inflammation. Further studies are needed to investigate iNKT cells in the pathogenesis of non-communicable disorders in HIV infection.

3.

Objective

To determine the cost-effectiveness of in-utero percutaneous vesico-amniotic shunting (VAS) in the management of fetal lower urinary tract obstruction (LUTO).

Design

Model-based economic analysis using data from the randomised controlled arm of the PLUTO (percutaneous vesico-amniotic shunting for lower urinary tract obstruction) trial.

Setting

Fetal medicine departments in the United Kingdom, Ireland and the Netherlands.

Population or Sample

Pregnant women with a male, singleton fetus with LUTO.

Methods

Costs and outcomes were prospectively collected in the trial; three separate base case analyses were performed using the intention to treat (ITT), per protocol and uniform prior methods. Deterministic and probabilistic sensitivity analyses were performed to explore data uncertainty.

Main Outcome Measures

Survival at 28 days and at 1 year, and disease-free survival at 1 year.

Results

VAS was more expensive but appeared to result in higher rates of survival compared with conservative management in patients with LUTO. Using ITT analysis, the incremental cost-effectiveness ratios based on survival at 28 days, survival at 1 year, and 1 morbidity-free year in the VAS arm were £15,506, £15,545, and £43,932, respectively.
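For readers unfamiliar with the metric, an incremental cost-effectiveness ratio (ICER) is simply the difference in mean cost divided by the difference in mean effect between arms. The sketch below illustrates the calculation with made-up arm totals, not PLUTO trial data.

```python
# Sketch of an incremental cost-effectiveness ratio (ICER) calculation.
# All cost_* and effect_* values are hypothetical placeholders, not PLUTO data.
def icer(cost_intervention, cost_comparator, effect_intervention, effect_comparator):
    delta_cost = cost_intervention - cost_comparator
    delta_effect = effect_intervention - effect_comparator
    return delta_cost / delta_effect  # cost per additional unit of effect

# e.g. cost per additional survivor at 28 days (illustrative numbers only)
print(round(icer(cost_intervention=25000.0, cost_comparator=18000.0,
                 effect_intervention=0.65, effect_comparator=0.20)))
# -> 15556 pounds per additional survivor, the same order of magnitude as the
#    reported £15,506 (the real inputs differ)
```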

Conclusions

VAS is a more expensive option than the conservative approach in the management of individuals with LUTO. Data from the RCT suggest that VAS improves neonatal survival but does not result in significant improvements in morbidity. Our analysis concludes that VAS is not likely to be cost-effective in the management of these patients given the NICE (National Institute for Health and Clinical Excellence) threshold of £20,000 per QALY.

4.

Objective

To explore the feasibility and implementation efficiency of Nutritional Report Cards (NRCs) in helping children make healthier food choices at school.

Methods

Pilot testing was conducted in a rural New York school district (K-12). Over a five-week period, 27 parents received a weekly e-mail containing an NRC listing how many meal components (fruits, vegetables, starches, milk), snacks, and a-la-carte foods their child had selected. We analyzed the choices of students in the NRC group versus the control group, both before and during the intervention period. Point-of-sale system data for a-la-carte items were analyzed using generalized least squares regressions with clustered standard errors.
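As a rough illustration of that regression setup, the sketch below fits a selection model with cluster-robust standard errors in statsmodels; the column names (selected, treated, post, cluster_id) are assumptions, and plain OLS with clustered errors stands in for the authors' GLS specification.

```python
# Minimal sketch: regression of item selection on NRC treatment with
# cluster-robust standard errors. Column names are assumptions, not the
# authors' actual variables, and OLS stands in for their GLS model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def fit_selection_model(df: pd.DataFrame):
    # df columns (assumed): selected (0/1), treated (0/1 NRC group),
    # post (0/1 intervention period), cluster_id (e.g. student id)
    model = smf.ols("selected ~ treated * post", data=df)
    return model.fit(cov_type="cluster", cov_kwds={"groups": df["cluster_id"]})

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 400
    df = pd.DataFrame({
        "treated": rng.integers(0, 2, n),
        "post": rng.integers(0, 2, n),
        "cluster_id": rng.integers(0, 27, n),
    })
    # simulated selection probability drops for treated students post-intervention
    df["selected"] = (rng.random(n) < 0.14 - 0.05 * df["treated"] * df["post"]).astype(int)
    print(fit_selection_model(df).summary().tables[1])
```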

Results

NRCs encouraged more home conversations about nutrition and greater awareness of food selections. Despite the small sample, the NRC was associated with reduced selection of some items; for example, the percentage of students selecting cookies decreased from 14.3 to 6.5 percent. Additionally, despite requiring new keys on the check-out registers to generate the NRC, checkout times increased by only 0.16 seconds per transaction, and compiling and sending the NRCs required a total weekly investment of 30 minutes of staff time.

Conclusions

This test of concept suggests that NRCs are a feasible and inexpensive tool to guide children towards healthier choices.

5.

Background

New Zealand (NZ) has a high prevalence of both peritoneal dialysis (PD) and home haemodialysis (HD) relative to other countries, and probably less selection bias. We aimed to determine if home dialysis associates with better survival than facility HD by simultaneous comparisons of the three modalities.

Methods

We analysed survival by time-varying dialysis modality in New Zealanders over a 15-year period to 31-Dec-2011, adjusting for patient co-morbidity by Cox proportional hazards multivariate regression.

Results

We modelled 6,419 patients with 3,254 deaths over 20,042 patient-years of follow-up. Patients treated with PD and facility HD are similar; those on home HD are younger and healthier. Compared to facility HD, home dialysis (as a unified category) associates with an overall 13% lower mortality risk. Home HD associates with a 52% lower mortality risk. PD associates with a 20% lower mortality risk in the early period (<3 years) that is offset by a 33% greater mortality risk in the late period (>3 years), with no overall net effect. There was effect modification and less observable benefit associated with PD in those with diabetes mellitus, co-morbidity, and in NZ Maori and Pacific People. There was no effect modification by age or by era.

Conclusion

Our study supports the culture of home dialysis in NZ, and suggests that the extent and duration of the survival benefit associated with early PD may be greater than appreciated. We are planning further analyses to exclude residual confounding from unmeasured co-morbidity and other sociodemographic factors, using database linkage to NZ government datasets. Finally, our results suggest further research into the practice of PD in NZ Maori and Pacific People, as well as a definitive study to determine the best timing for switching from PD in the late phase.

6.

Objective

The aim of this study was to identify the nitric oxide synthase (NOS) isoform involved in early microcirculatory derangements following solid organ transplantation.

Background

Tetrahydrobiopterin donor treatment has been shown to specifically attenuate these derangements following pancreas transplantation, and its protective effects have been shown to rely on its NOS-cofactor activity rather than on its antioxidant capacity. However, the NOS isoform mainly involved in this process remains to be defined.

Methods

Using a murine pancreas transplantation model, grafts lacking one of the three NOS isoforms were compared with grafts from wild-type controls. Donors were either treated with tetrahydrobiopterin or left untreated. All grafts were subjected to 16 h of cold ischemia and transplanted into wild-type recipients. After 4 h of graft reperfusion, the microcirculation was analysed by confocal intravital fluorescence microscopy. Recipient survival was monitored for 50 days.

Results

Transplantation of pancreases from untreated wild-type donor mice resulted in microcirculatory damage of the transplanted graft, and no recipient survived more than 72 h. Transplanting grafts from untreated donor mice lacking either endothelial or inducible NOS led to similar outcomes. In contrast, donor treatment with tetrahydrobiopterin prevented microcirculatory breakdown, enabling long-term survival. The sole exception was transplantation of grafts from donor mice lacking neuronal NOS, which resulted in an intact microvascular structure and long-term recipient survival whether the donor mice were untreated or treated with tetrahydrobiopterin.

Conclusion

We demonstrate for the first time the crucial involvement of neuronal NOS in early microcirculatory derangements following solid organ transplantation. In this model, protective effects of tetrahydrobiopterin are mediated by targeting this isoform.

7.

Background

Controversy persists regarding the appropriate initiation timing of renal replacement therapy for patients with end-stage renal disease. We evaluated the effect of dialysis initiation timing on clinical outcomes. Initiation times were classified according to glomerular filtration rate (GFR).

Methods

We enrolled a total of 1691 adult patients who started dialysis between August 2008 and March 2013 in a multi-center, prospective cohort study at the Clinical Research Center for End Stage Renal Disease in the Republic of Korea. The patients were classified into early-start and late-start groups using the mean estimated GFR value at dialysis initiation (7.37 ml/min/1.73 m2) as the cut-off. The primary outcome was patient survival, and the secondary outcomes were hospitalization, cardiovascular events, vascular access complications, change of dialysis modality, and peritonitis. The two groups were compared before and after matching with propensity scores.
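As an illustration of the matching step, a minimal sketch of 1:1 nearest-neighbour propensity-score matching follows; the covariate names and the caliper are assumptions for illustration, not the study's actual specification.

```python
# Minimal sketch of 1:1 propensity-score matching of early vs. late starters.
# Covariate names and the caliper are placeholders; the study's actual
# matching variables and algorithm may differ.
import pandas as pd
from sklearn.linear_model import LogisticRegression

def match_by_propensity(df, treat_col="early_start",
                        covariates=("age", "diabetes", "albumin"),
                        caliper=0.05):
    # 1) estimate each patient's probability of early start (propensity score)
    ps = LogisticRegression(max_iter=1000).fit(df[list(covariates)], df[treat_col])
    df = df.assign(pscore=ps.predict_proba(df[list(covariates)])[:, 1])
    # 2) greedily pair each treated patient with the nearest unmatched control
    treated = df[df[treat_col] == 1].sort_values("pscore")
    control = df[df[treat_col] == 0].copy()
    pairs = []
    for idx, row in treated.iterrows():
        dist = (control["pscore"] - row["pscore"]).abs()
        if len(dist) and dist.min() <= caliper:
            j = dist.idxmin()
            pairs.append((idx, j))
            control = control.drop(index=j)  # match without replacement
    return pairs  # list of (treated_index, control_index) matched pairs
```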

Results

Before propensity score matching, the early-start group had a poorer survival rate (P<0.001). Hospitalization, cardiovascular events, vascular access complications, changes in dialysis modality, and peritonitis did not differ between the groups. A total of 854 patients (427 in each group) were selected by propensity score matching. After matching, neither patient survival nor any of the other outcomes differed between the groups.

Conclusions

After adjustment by propensity scores, there was no clinical benefit of early compared with late initiation of dialysis.

8.

Background

Transplantation as a therapeutic strategy for inherited retinal degeneration has historically been viewed as a method to restore vision by replacing the lost retinal cells and attempting to reconstruct the neural circuitry with stem cells, progenitor cells and mature neural retinal cells.

Methods and Findings

We present evidence for an alternative strategy aimed at preventing the secondary loss of cones, the photoreceptors most crucial for vision, by transplanting normal photoreceptor cells into the eye of the P23H rat, a model of dominant retinitis pigmentosa. We transplanted photoreceptors or whole neural retina into 3-month-old P23H rats and evaluated function and cell counts 6 months after surgery. In both groups, cone loss was significantly reduced (10%) in the transplanted eyes, where the cone outer segments were considerably longer. This morphological effect correlated with maintenance of the visual function of cones as scored by photopic ERG recording, and more precisely with an increase in photopic b-wave amplitudes of 100% and 78% for photoreceptor transplantation and whole-retina transplantation, respectively.

Conclusions

We demonstrate here that the transplanted tissue prevents the loss of cone function, which is further translated into cone survival.

9.
Li C, Mi K, Wen TF, Yan LN, Li B, Yang JY, Xu MQ, Wang WT, Wei YG. PLoS One. 2011;6(11):e27366.

Background/Aims

The number of people undergoing living donor liver transplantation (LDLT) has increased rapidly in many transplant centres. Patients considering LDLT need to know whether LDLT is riskier than deceased donor liver transplantation (DDLT). The aim of this study was to compare the outcomes of patients undergoing LDLT versus DDLT.

Methods

A total of 349 patients with benign liver diseases were recruited from 2005 to 2011 for this study. LDLT was performed in 128 patients and DDLT in 221 patients. Pre- and intra-operative variables of the two groups were compared. Post-operative outcomes analysed included the incidence of complications, biliary and vascular complications, hepatitis B virus (HBV) recurrence, long-term survival rates and the outcomes of emergency transplantation.

Results

The waiting times were 22.10±15.31 days for patients undergoing LDLT versus 35.81±29.18 days for patients undergoing DDLT. The cold ischemia time (CIT) was 119.34±19.75 minutes for the LDLT group and 346±154.18 minutes for the DDLT group. The LDLT group had higher intraoperative blood loss, but red blood cell (RBC) transfusion requirements did not differ. Rates of Clavien grade ≥III complications, vascular complications, HBV recurrence and long-term survival were similar. LDLT patients suffered a higher incidence of biliary complications in the early postoperative period. However, during the long-term follow-up period, biliary complication rates were similar between the two groups. The long-term survival rate of patients undergoing emergency transplantation was lower than that of patients undergoing elective transplantation. However, no significant difference was observed between emergency LDLT and emergency DDLT.

Conclusions

Patients undergoing LDLT achieved similar outcomes to patients undergoing DDLT. Although LDLT patients may suffer a higher incidence of early biliary complications, the total biliary complication rate was similar during the long-term follow-up period.

10.
Chang CM, Huang KY, Hsu TW, Su YC, Yang WZ, Chen TC, Chou P, Lee CC. PLoS One. 2012;7(7):e40590.

Background

Positive associations between caseloads and outcomes have been validated for several procedures and cancer treatments. However, there is limited information available on the combined effects of surgeon and hospital caseloads. We used nationwide population-based data to explore the association between surgeon and hospital caseloads and survival rates for major cancers.

Methodology

A total of 11,677 patients with incident cancer diagnosed in 2002 were identified from the Taiwan National Health Insurance Research Database. Survival analysis, Cox proportional hazards models, and propensity scores were used to assess the relationship between 5-year survival rates and different caseload combinations.
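A minimal sketch of such a survival model, fitted with the lifelines library under assumed column names, follows; it is illustrative only and not the authors' actual model specification.

```python
# Minimal sketch of the survival model: Cox proportional hazards with a
# low-volume-surgeon/low-volume-hospital indicator. Column names are
# assumptions for illustration, not the study's actual variables.
import pandas as pd
from lifelines import CoxPHFitter

def fit_volume_model(df: pd.DataFrame) -> CoxPHFitter:
    # df columns (assumed): time_years (follow-up), died (0/1),
    # low_surgeon_low_hospital (0/1), age, comorbidity_score
    cph = CoxPHFitter()
    cph.fit(df[["time_years", "died", "low_surgeon_low_hospital",
                "age", "comorbidity_score"]],
            duration_col="time_years", event_col="died")
    return cph  # cph.hazard_ratios_ gives HRs analogous to the 1.3-1.8 range reported
```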

Results

Based on the Cox proportional hazards model, cancer patients treated by low-volume surgeons in low-volume hospitals had poorer survival rates, with hazard ratios ranging from 1.3 for head and neck cancer to 1.8 for lung cancer after adjusting for patients’ demographic variables, co-morbidities, and treatment modality. When analyzed using propensity scores, the adjusted 5-year survival rates were poorer for patients treated by low-volume surgeons in low-volume hospitals compared with those treated by high-volume surgeons in high-volume hospitals (P<0.005).

Conclusions

After adjusting for differences in case mix, cancer patients treated by low-volume surgeons in low-volume hospitals had poorer 5-year survival rates. Payers may wish to target quality-of-care improvement efforts at low-volume surgeons.

11.

Background

The presence of monocyte-macrophage lineage cells in rejecting kidney transplants is associated with worse graft outcome. At present, it is still unclear how the monocyte-macrophage related responses develop after transplantation. Here, we studied the dynamics, phenotypic and functional characteristics of circulating monocytes during the first 6 months after transplantation and aimed to establish the differences between kidney transplant recipients and healthy individuals.

Methods

The phenotype, activation status and cytokine production capacity of classical (CD14++CD16−), intermediate (CD14++CD16+) and non-classical (CD14+CD16++) monocytes were determined by flow cytometry, using a cross-sectional approach, in a cohort of 33 healthy individuals, 30 renal transplant recipients at transplantation, 19 recipients at 3 months and 16 recipients at 6 months after transplantation.

Results

The percentage of both CD16+ monocyte subsets was significantly increased in transplant recipients compared with healthy individuals, indicative of triggered innate immunity (p≤0.039). Enhanced production capacity of tumor necrosis factor-α, interferon-γ and interleukin-1β was observed in monocytes at transplantation compared with healthy individuals. Remarkably, three months post-transplant, in the presence of potent immunosuppressive drugs and despite improved kidney function, interferon-γ, tumor necrosis factor-α and interleukin-10 production capacity remained significantly increased.

Conclusion

Our data demonstrate a skewed balance towards pro-inflammatory CD16+ monocytes that is present at the time of transplantation and retained for at least 6 months after transplantation. This shift could be one of the important drivers of early post-transplant cellular immunity.

12.

Background

Prognostic biomarkers are needed for superficial gastroesophageal adenocarcinoma (EAC) to predict clinical outcomes and select therapy. Although recurrent mutations have been characterized in EAC, little is known about their clinical and prognostic significance. Aneuploidy is predictive of clinical outcome in many malignancies but has not been evaluated in superficial EAC.

Methods

We quantified copy number changes in 41 superficial EACs using Affymetrix SNP 6.0 arrays. We identified recurrent chromosomal gains and losses and calculated the total copy number abnormality (CNA) count for each tumor as a measure of aneuploidy. We correlated the CNA count with overall survival and time to first recurrence in univariate and multivariate analyses.
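For illustration, the sketch below shows one way a per-tumour CNA count could be derived from called copy-number segments and grouped into quartiles; the input layout and the log2-ratio thresholds are assumptions, not the study's actual pipeline.

```python
# Minimal sketch: total copy-number abnormality (CNA) count per tumour and
# quartile grouping. The input format (one row per called segment) and the
# log2-ratio thresholds are assumptions for illustration only.
import pandas as pd

def cna_counts(segments: pd.DataFrame, gain_thr=0.3, loss_thr=-0.3) -> pd.Series:
    # segments columns (assumed): sample_id, chrom, start, end, log2_ratio
    abnormal = segments[(segments["log2_ratio"] >= gain_thr) |
                        (segments["log2_ratio"] <= loss_thr)]
    return abnormal.groupby("sample_id").size().rename("cna_count")

def quartile_groups(counts: pd.Series) -> pd.Series:
    # Q1/Q4 versus Q2-Q3, mirroring the lowest/highest-quartile comparison above.
    q = pd.qcut(counts, 4, labels=["Q1", "Q2", "Q3", "Q4"])
    return q.map(lambda g: "extreme" if g in ("Q1", "Q4") else "intermediate")
```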

Results

Recurrent segmental gains and losses involved multiple genes, including HER2, EGFR, MET, CDK6 and KRAS (recurrent gains) and FHIT, WWOX, CDKN2A/B, SMAD4 and RUNX1 (recurrent losses). There was a 40-fold variation in CNA count across all cases. Tumors in the lowest and highest quartiles of CNA count had significantly better overall survival (p = 0.032) and time to first recurrence (p = 0.010) than tumors with intermediate CNA counts. These associations persisted when controlling for other prognostic variables.

Significance

SNP arrays facilitate the assessment of recurrent chromosomal gain and loss and allow high resolution, quantitative assessment of segmental aneuploidy (total CNA count). The non-monotonic association of segmental aneuploidy with survival has been described in other tumors. The degree of aneuploidy is a promising prognostic biomarker in a potentially curable form of EAC.

13.

Background:

Relatively little is known about the management and outcomes of Aboriginal children with renal failure in Canada. We evaluated differences in dialysis modality, time spent on dialysis, rates of kidney transplantation, and patient and allograft survival between Aboriginal children and non-Aboriginal children.

Methods:

For this population-based cohort study, we used data from a national pediatric end-stage renal disease database. Patients less than 18 years old who started renal replacement treatment (dialysis or kidney transplantation) in nine Canadian provinces (Quebec data were not available) and all three territories between 1992 and 2007 were followed until death, loss to follow-up or end of the study period. We compared initial modality of dialysis and time to first kidney transplant between Aboriginal children, white children and children of other ethnicity. We examined the association between ethnicity and likelihood of kidney transplantation using adjusted Cox proportional hazard models for Aboriginal and white children (data for the children of other ethnicity did not meet the assumptions of proportional hazards).

Results:

Among 843 pediatric patients included in the study, 104 (12.3%) were Aboriginal, 521 (61.8%) were white, and 218 (25.9%) were from other ethnic minorities. Hemodialysis was the initial modality of dialysis for 48.0% of the Aboriginal patients, 42.7% of the white patients and 62.6% of those of other ethnicity (p < 0.001). The time from start of dialysis to first kidney transplant was longer among the Aboriginal children (median 1.75 years, interquartile range 0.69–2.81) than among the children in the other two groups (p < 0.001). After adjustment for confounders, Aboriginal children were less likely than white children to receive a transplant from a living donor (hazard ratio [HR] 0.36, 95% confidence interval [CI] 0.21–0.61) or a transplant from any donor (HR 0.54, 95% CI 0.40–0.74) during the study period.

Interpretation:

The time from start of dialysis to first kidney transplant was longer among Aboriginal children than among white children. Further evaluation is needed to determine barriers to transplantation among Aboriginal children.

Compared with non-Aboriginal people, Aboriginal adults with end-stage renal disease in Canada have lower rates of kidney transplantation, the optimal treatment for renal failure. Most studies to date that have examined health outcomes among Canadian Aboriginal people with kidney disease have focused on adults. Relatively little is known about the outcomes among Aboriginal children with renal failure. A single-centre cohort study from the province of British Columbia reported that Aboriginal children who received a kidney transplant had similar short-term, but poorer long-term, allograft survival compared with white children. No further studies have examined differences in modality of renal replacement treatment or the likelihood of kidney transplantation among Aboriginal children with renal failure. We performed an observational cohort study of children beginning renal replacement treatment in Canada. We compared differences in dialysis modality, time spent on dialysis, rates of kidney transplantation, and graft and patient survival between Aboriginal children, white children and children of other ethnicities.

14.

Background

Anti-glomerular basement membrane (GBM) antibody disease may lead to acute crescentic glomerulonephritis with a poor renal prognosis. Current therapy favours plasma exchange (PE) for removal of the pathogenic antibodies. Immunoadsorption (IAS) is superior to PE with regard to the efficiency of antibody removal and to safety. Apart from anecdotal data, there is no systematic analysis of the long-term effects of IAS on anti-GBM disease and antibody kinetics.

Objective

To examine the long-term effect of high-frequency IAS combined with standard immunosuppression on patient and renal survival in patients with anti-GBM disease, and to quantify antibody removal and kinetics during IAS.

Design

Retrospective review of patients treated with IAS for anti-GBM-antibody disease confirmed by biopsy and/or anti-GBM-antibodies.

Setting

University Hospital of Vienna, Austria.

Participants

10 patients with anti-GBM-disease treated with IAS.

Measurements

Patient and renal survival, renal histology, anti-GBM-antibodies.

Results

Anti-GBM antibodies were reduced to negative levels in all patients within the first 9 IAS treatments (mean of 23 treatments per patient). Renal survival was 40% at diagnosis, 70% after the end of IAS, 63% after one year and 50% at the end of observation (mean 84 months, range 9 to 186). Dialysis dependency was successfully reversed in three of six patients. Patient survival was 90% at the end of observation.

Conclusion

IAS efficiently eliminates anti-GBM antibodies, suggesting non-inferiority to PE with regard to renal and patient survival. Hence, IAS should be considered a valuable treatment option for anti-GBM disease, especially in patients presenting with a high percentage of crescents and dialysis dependency, given the unusually high proportion of responders.

15.

Background

Patients with epilepsy often suffer from other important conditions. The existence of such co-morbidities is frequently not recognized and their relationship with epilepsy usually remains unexplained.

Methodology/Principal Findings

We describe three patients with common, sporadic, non-syndromic epilepsies in whom large genomic microdeletions were found during a study of genetic susceptibility to epilepsy. We performed detailed gene-driven clinical investigations in each patient. Disruption of the function of genes in the deleted regions can explain co-morbidities in these patients.

Conclusions/Significance

Co-morbidities in patients with epilepsy can be part of a genomic abnormality even in the absence of (known) congenital malformations or intellectual disabilities. Gene-driven phenotype examination can also reveal clinically significant unsuspected conditions.

16.

Background

Collaborative care is an effective treatment for the management of depression but evidence on its cost-effectiveness in the UK is lacking.

Aims

To assess the cost-effectiveness of collaborative care in a UK primary care setting.

Methods

An economic evaluation alongside a multi-centre cluster randomised controlled trial comparing collaborative care with usual primary care for adults with depression (n = 581). Costs, quality-adjusted life-years (QALYs), and incremental cost-effectiveness ratios (ICER) were calculated over a 12-month follow-up, from the perspective of the UK National Health Service and Personal Social Services (i.e. Third Party Payer). Sensitivity analyses are reported, and uncertainty is presented using the cost-effectiveness acceptability curve (CEAC) and the cost-effectiveness plane.
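As an illustration of how a CEAC can be derived, the sketch below bootstraps per-patient costs and QALYs and reports the probability that the intervention is cost-effective at each willingness-to-pay threshold; the inputs are hypothetical arrays, not the trial data, and the trial's actual analysis may differ in detail.

```python
# Sketch of a cost-effectiveness acceptability curve (CEAC) by nonparametric
# bootstrap of per-patient costs and QALYs. Inputs are hypothetical numpy
# arrays, not the trial data.
import numpy as np

def ceac(costs_tx, qalys_tx, costs_uc, qalys_uc, thresholds, n_boot=2000, seed=1):
    rng = np.random.default_rng(seed)
    probs = []
    for lam in thresholds:  # lam = willingness to pay per QALY
        wins = 0
        for _ in range(n_boot):
            i = rng.integers(0, len(costs_tx), len(costs_tx))  # resample arms
            j = rng.integers(0, len(costs_uc), len(costs_uc))
            d_cost = costs_tx[i].mean() - costs_uc[j].mean()
            d_qaly = qalys_tx[i].mean() - qalys_uc[j].mean()
            if lam * d_qaly - d_cost > 0:  # positive incremental net benefit
                wins += 1
        probs.append(wins / n_boot)
    return probs  # P(cost-effective) at each threshold, e.g. £20,000 per QALY
```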

Results

The collaborative care intervention had a mean cost of £272.50 per participant. Health and social care service use, excluding collaborative care, indicated a similar profile of resource use between collaborative care and usual care participants. Collaborative care offered a mean incremental gain of 0.02 (95% CI: –0.02, 0.06) quality-adjusted life-years over 12 months, at a mean incremental cost of £270.72 (95% CI: –202.98, 886.04), and resulted in an estimated mean cost per QALY of £14,248. Where costs associated with informal care are considered in sensitivity analyses, collaborative care is expected to be less costly and more effective, thereby dominating treatment as usual.

Conclusion

Collaborative care offers health gains at a relatively low cost, and is cost-effective compared with usual care against a decision-maker willingness to pay threshold of £20,000 per QALY gained. Results here support the commissioning of collaborative care in a UK primary care setting.

17.

Background

Patients with end-stage renal disease (ESRD) who are latently infected with Mycobacterium tuberculosis (LTBI) are at higher risk of developing tuberculosis (TB) than healthy subjects. Interferon-gamma release assays (IGRAs) have been reported to be more sensitive than tuberculin skin tests for the detection of infected individuals among dialysis patients.

Methods

In 143 prospectively enrolled dialysis patients, we compared the results of the QuantiFERON®-TB Gold assay (QFT) with those of an IGRA based on in vitro stimulation of circulating mononuclear cells with the mycobacterial latency antigen Heparin-Binding Haemagglutinin purified from Mycobacterium bovis BCG (native HBHA, nHBHA).

Results

Seven patients had a past history of active TB and one had an undetermined result with both IGRAs. Among the other 135 patients, 94 had concordant results with the QFT and the nHBHA-IGRA: 40.0% were negative and therefore considered not latently infected, and 29.6% were positive and thus considered LTBI. Discrepant results between the tests were found for 36 patients positive only with the nHBHA-IGRA and 5 positive only with the QFT.

Conclusions

The nHBHA-IGRA is more sensitive than the QFT for the detection of LTBI in dialysis patients, and follow-up of the patients will allow us to define the clinical significance of discrepant results between the nHBHA-IGRA and the QFT.

18.

Background

Elderly patients with end-stage renal disease have become the fastest growing population of kidney transplant candidates in recent years. However, the risk factors associated with long-term outcomes in these patients remain unclear.

Methods

We retrospectively analyzed 166 recipients aged 60 years or older who underwent primary deceased kidney transplantation between 2002 and 2013 in our center. The main outcomes included 1-, 3- and 5-year patient survival as well as overall and death-censored graft survival. The independent risk factors affecting graft and patient survival were analyzed using Cox regression analysis.

Results

The 1-, 3- and 5-year death-censored graft survival rates were 93.6%, 89.4% and 83.6%, respectively. Based on the Cox multivariate analysis, panel reactive antibody (PRA) >5% [hazard ratio (HR) 4.295, 95% confidence interval (CI) 1.321–13.97], delayed graft function (HR 4.744, 95% CI 1.611–13.973) and acute rejection (HR 4.971, 95% CI 1.516–16.301) were independent risk factors for graft failure. The 1-, 3- and 5-year patient survival rates were 84.8%, 82.1% and 77.1%, respectively. Longer dialysis time (HR 1.011 per 1-month increase, 95% CI 1.002–1.020), graft loss (HR 3.501, 95% CI 1.559–7.865) and low-dose ganciclovir prophylaxis (1.5 g/d for 3 months; HR 3.173, 95% CI 1.063–9.473) were risk factors associated with patient death.

Conclusions

The five-year results show excellent graft and patient survival in elderly kidney transplant recipients aged ≥60 years. PRA >5%, delayed graft function, and acute rejection are risk factors for graft failure, while longer duration of dialysis, graft loss and low-dose ganciclovir prophylaxis are risk factors for mortality in elderly recipients. These factors represent potential targets for interventions aimed at improving graft and patient survival in elderly recipients.

19.
Wu HY, Hung KY, Huang TM, Hu FC, Peng YS, Huang JW, Lin SL, Chen YM, Chu TS, Tsai TJ, Wu KD. PLoS One. 2012;7(1):e30337.

Background

Effects of long-term glucose load on peritoneal dialysis (PD) patient safety and outcomes have seldom been reported. This study demonstrates the influence of long-term glucose load on patient and technique survival.

Methods

We surveyed 173 incident PD patients. Long-term glucose load was evaluated by calculating the average dialysate glucose concentration since the initiation of PD. Risk factors were assessed by fitting Cox models with repeatedly measured time-dependent covariates.
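A minimal sketch of a Cox model with time-dependent covariates in long (start-stop) format, using the lifelines library, follows; the column names are assumptions for illustration, not the study's actual variables.

```python
# Minimal sketch of a Cox model with repeatedly measured, time-dependent
# covariates (long format: one row per patient per interval). Column names
# are assumptions for illustration only.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

def fit_glucose_model(long_df: pd.DataFrame) -> CoxTimeVaryingFitter:
    # long_df columns (assumed): id, start, stop, died (0/1),
    # avg_dialysate_glucose, age, residual_renal_function
    ctv = CoxTimeVaryingFitter()
    ctv.fit(long_df, id_col="id", start_col="start", stop_col="stop",
            event_col="died")
    return ctv  # ctv.print_summary() shows hazard ratios for each covariate
```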

Results

We noted that older age, higher glucose concentration, and lower residual renal function (RRF) were significantly associated with a worse patient survival. We found that female gender, absence of diabetes, lower glucose concentration, use of icodextrin, higher serum high density lipoprotein cholesterol, and higher RRF were significantly associated with a better technique survival.

Conclusions

Long-term glucose load predicted mortality and technique failure in chronic PD patients. These findings emphasize the importance of minimizing glucose load in PD patients.

20.

Background

Expanded criteria donors (ECDs) are currently accepted as potential sources to increase the donor pool and to provide more chances of kidney transplantation for elderly recipients who would not survive long waiting periods. Hypothermic machine perfusion (HMP) is designed to mitigate the deleterious effects of simple cold storage (CS) on the quality of preserved organs, particularly when the donor is in a marginal status.

Methods

We compared transplant outcomes in patients receiving ECD kidneys preserved with either HMP or CS. The MEDLINE, EMBASE and Cochrane Library databases were searched, and all studies reporting outcomes of HMP versus CS kidney preservation were included in this meta-analysis. The parameters analyzed included the incidence of delayed graft function (DGF), primary non-function (PNF), and one-year graft and patient survival.

Results

A total of seven studies qualified for the review, involving 2374 kidney grafts preserved with HMP and 8716 preserved with CS, all from ECDs. The incidence of delayed graft function (DGF) was significantly reduced with HMP compared with CS [odds ratio (OR) 0.59, 95% CI 0.54–0.66, P<0.001], and one-year graft survival was significantly improved (OR 1.12, 95% CI 1.03–1.21, P = 0.005). However, there was no difference in the incidence of PNF (OR 0.54, 95% CI 0.21–1.40, P = 0.20) or in one-year patient survival (OR 0.98, 95% CI 0.94–1.02, P = 0.36) between HMP and CS preservation.
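For illustration, the sketch below pools per-study odds ratios by fixed-effect inverse-variance weighting of log odds ratios; the 2×2 tables used are made-up placeholders, not the seven included studies, which may also have been pooled with a different model (e.g. Mantel-Haenszel or random effects).

```python
# Sketch of fixed-effect (inverse-variance) pooling of log odds ratios from
# per-study 2x2 tables (events/totals in the HMP arm, then the CS arm).
# The example tables are hypothetical placeholders.
import math

def pooled_or(tables):
    num, den = 0.0, 0.0
    for a, n1, c, n2 in tables:  # a/n1: events/total (HMP); c/n2: events/total (CS)
        b, d = n1 - a, n2 - c
        log_or = math.log((a * d) / (b * c))
        var = 1 / a + 1 / b + 1 / c + 1 / d   # variance of the log OR
        num += log_or / var                    # inverse-variance weighting
        den += 1 / var
    pooled = num / den
    se = math.sqrt(1 / den)
    ci = (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))
    return math.exp(pooled), ci

print(pooled_or([(40, 300, 90, 400), (25, 200, 60, 300)]))  # illustrative only
```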

Conclusions

HMP was associated with a reduced incidence of DGF and an increased one-year graft survival, but it was not associated with differences in the incidence of PNF or in one-year patient survival.
