Similar Articles
20 similar articles retrieved.
1.

Background

The worldwide elderly (≥65 years old) dialysis population has grown significantly. This population is expected to have more comorbid conditions and shorter life expectancies than the general elderly population. Predicting outcomes for this population is important for decision-making. Recently, a new comorbidity index (nCI) with good predictive value for patient outcomes was developed and validated in chronic dialysis patients regardless of age. Our study examined the outcome predictability of the nCI in elderly dialysis patients.

Methods and Findings

For this population-based cohort study, we used Taiwan's National Health Insurance Research Database to identify elderly patients who began maintenance dialysis between January 1999 and December 2005. A total of 21,043 incident dialysis patients were divided into 4 groups by nCI score (score intervals ≤3, 4–6, 7–9, ≥10) and followed for nearly 10 years. All-cause mortality and life expectancy were analyzed. During the follow-up period, 11,272 (53.55%) patients died. Kaplan-Meier curves showed a significant difference in survival among the groups (log-rank: P<0.001). After stratification by age, life expectancy was significantly longer in groups with lower nCI scores.
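The group-wise comparison above (Kaplan-Meier curves with a log-rank test across nCI strata) can be illustrated with standard survival-analysis tooling. The sketch below is a minimal example on a hypothetical toy cohort; the column names and values are illustrative only and are not the authors' data or code.

```python
# Minimal sketch of a Kaplan-Meier comparison across nCI strata
# (hypothetical toy data, not the study cohort).
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import multivariate_logrank_test

df = pd.DataFrame({
    "time_months": [12, 48, 96, 7, 30, 110, 5, 22, 60, 3],   # follow-up time
    "died":        [1,  0,  0,  1, 1,  0,   1, 1,  0,  1],   # 1 = death observed
    "nci_group":   ["<=3", "<=3", "4-6", "4-6", "7-9",
                    "7-9", ">=10", ">=10", "<=3", ">=10"],    # nCI stratum
})

# One Kaplan-Meier curve per nCI stratum.
kmf = KaplanMeierFitter()
for group, sub in df.groupby("nci_group"):
    kmf.fit(sub["time_months"], event_observed=sub["died"], label=group)
    print(group, "median survival (months):", kmf.median_survival_time_)

# Log-rank test for any difference in survival across the strata.
result = multivariate_logrank_test(df["time_months"], df["nci_group"], df["died"])
print("log-rank p-value:", result.p_value)
```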

Conclusion

The nCI, even without an age component, is a strong predictor of mortality in elderly dialysis patients. Because lower nCI scores predict better survival, more attention should be paid to adequate dialysis rather than palliative care, especially in patients without obvious functional impairment.

2.

Purpose

To characterize the impact of comorbidity on survival outcomes for patients with nasopharyngeal carcinoma (NPC) post radiotherapy (RT).

Methods

A total of 4095 patients with NPC treated by RT or RT plus chemotherapy (CT) in the period from 2007 to 2011 were included through Taiwan’s National Health Insurance Research Database. Information on comorbidity present prior to the NPC diagnosis was obtained and adapted to the Charlson Comorbidity Index (CCI), Age-Adjusted Charlson Comorbidity Index (ACCI) and a revised head and neck comorbidity index (HN-CCI). The prevalence of comorbidity and the influence on survival were calculated and analyzed.

Results

Most of the patients (75%) were male (age 51±13 years), and 2470 of them (60%) had at least one comorbid condition, the most common being diabetes mellitus. For all three comorbidity indices (CCI, ACCI and HN-CCI), higher scores were associated with worse overall survival (P<0.001). Receiver operating characteristic (ROC) curves were used to assess the discriminating ability of the CCI, ACCI and HN-CCI scores; the predictive ability for mortality of the ACCI (AUC 0.693, 95% CI 0.670–0.715) was superior to that of the CCI (0.619, 95% CI 0.593–0.644) and the HN-CCI (0.545, 95% CI 0.519–0.570).
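The AUC comparison above amounts to scoring each comorbidity index against the mortality outcome in a ROC analysis. Below is a minimal sketch of that comparison on hypothetical simulated score and outcome arrays; the variable names and values are illustrative and do not reproduce the study data.

```python
# Minimal sketch of comparing comorbidity indices by ROC AUC
# (hypothetical simulated scores, not the study data).
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 500
died = rng.integers(0, 2, size=n)                                # mortality indicator (0/1)
cci = died * rng.normal(2.0, 1.0, n) + rng.normal(3.0, 1.5, n)   # informative score
acci = died * rng.normal(3.0, 1.0, n) + rng.normal(3.0, 1.5, n)  # more informative score
hncci = rng.normal(1.0, 0.5, n)                                  # weakly informative score

for name, score in [("CCI", cci), ("ACCI", acci), ("HN-CCI", hncci)]:
    print(name, "AUC:", round(roc_auc_score(died, score), 3))
```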

Conclusion

Comorbidities greatly influenced the clinical presentation, therapeutic interventions, and outcomes of patients with NPC after RT. Higher comorbidity index scores were associated with worse survival. The ACCI appears to be a more appropriate prognostic indicator and should be considered in further clinical studies.

3.

Background and Aims

Visceral fat has a crucial role in the development and progression of cardiovascular disease, the major cause of death in end-stage renal disease (ESRD). Although sagittal abdominal diameter (SAD), as an index of visceral fat, significantly correlated with mortality in the general population, the impact of SAD on clinical outcomes has never been explored in ESRD patients. Therefore, we sought to elucidate the prognostic value of SAD in incident peritoneal dialysis (PD) patients.

Methods

We prospectively determined SAD by lateral abdominal X-ray at PD initiation, and evaluated the association of SAD with all-cause and cardiovascular mortality in 418 incident PD patients.

Results

The mean SAD was 24.5±4.3 cm. During a mean follow-up of 39.4 months, 97 patients (23.2%) died, 49.4% of them from cardiovascular disease. SAD was a significant independent predictor of all-cause [3rd versus 1st tertile, HR (hazard ratio): 3.333, 95% CI (confidence interval): 1.514–7.388, P = 0.01; per 1 cm increase, HR: 1.071, 95% CI: 1.005–1.141, P = 0.03] and cardiovascular mortality (3rd versus 1st tertile, HR: 8.021, 95% CI: 1.994–32.273, P = 0.01; per 1 cm increase, HR: 1.106, 95% CI: 1.007–1.214, P = 0.03). Multivariate fractional polynomial analysis also showed that all-cause and cardiovascular mortality risk increased steadily with higher SAD values. In addition, SAD provided higher predictive value for all-cause (AUC: 0.691 vs. 0.547, P<0.001) and cardiovascular mortality (AUC: 0.644 vs. 0.483, P<0.001) than body mass index (BMI). Subgroup analysis revealed that a higher SAD (≥24.2 cm) was significantly associated with all-cause mortality in men, women, younger patients (<65 years), and patients with lower BMI (<22.3 kg/m2).
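The per-centimetre and tertile hazard ratios above come from Cox proportional-hazards regression with SAD as the exposure. The sketch below shows the per-1 cm form of such a model on simulated data; the column names and covariate set are hypothetical and do not reproduce the study's full adjustment.

```python
# Minimal sketch of a Cox model with SAD as a continuous predictor
# (simulated data and a reduced covariate set, not the study's model).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 418
df = pd.DataFrame({
    "time_months": rng.exponential(40, n),   # follow-up time
    "died": rng.integers(0, 2, n),           # 1 = death observed
    "sad_cm": rng.normal(24.5, 4.3, n),      # sagittal abdominal diameter
    "age": rng.normal(55, 12, n),
    "bmi": rng.normal(23, 3, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_months", event_col="died")
# exp(coef) for sad_cm is read as the hazard ratio per 1 cm increase in SAD.
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```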

Conclusions

SAD determined by lateral abdominal X-ray at PD initiation was a significant independent predictor of all-cause and cardiovascular mortality in incident PD patients. Estimating visceral fat by SAD could be useful to stratify mortality risk in these patients.

4.

Background

Over the past decade, there has been a steep rise in the number of people with complex medical problems who require dialysis. We sought to determine the life expectancy of elderly patients after starting dialysis and to identify changes in survival rates over time.

Methods

All patients aged 65 years or older who began dialysis in Canada between 1990 and 1999 were identified from the Canadian Organ Replacement Register. We used Cox proportional hazards models to examine the effect that the period during which dialysis was initiated (era 1, 1990–1994; era 2, 1995–1999) had on patient survival, after adjusting for diabetes, sex and comorbidity. Patients were followed from initiation of dialysis until death, transplantation, loss to follow-up or study end (Dec. 31, 2004).

Results

A total of 14 512 patients aged 65 years or older started dialysis between 1990 and 1999. The proportion of these patients who were 75 years or older at the start of dialysis increased from 32.7% in era 1 (1990–1994) to 40.0% in era 2 (1995–1999). Despite increased comorbidity over the 2 study periods, the unadjusted 1-, 3- and 5-year survival rates among patients aged 65–74 years at dialysis initiation rose from 74.4%, 44.9% and 25.8% in era 1 to 78.1%, 51.5% and 33.5% in era 2. The respective survival rates among those aged 75 or more at dialysis initiation increased from 67.2%, 32.3% and 14.2% in era 1 to 69.0%, 36.7% and 20.3% in era 2. This survival advantage persisted after adjustment for diabetes, sex and comorbidity in both age groups (65–74 years: hazard ratio [HR] 0.76, 95% confidence interval [CI] 0.72–0.81; 75 years or more: HR 0.86, 95% CI 0.80–0.92).

Interpretation

Survival after dialysis initiation among elderly patients improved from 1990 to 1999, despite an increasing burden of comorbidity. Physicians may find these data useful when discussing prognosis with elderly patients who are initiating dialysis.

Within general medical and subspecialty areas, chronic kidney disease is increasingly recognized as an important comorbid condition that is often associated with prolonged hospital stays and increased morbidity and mortality.1–3 As a result, internists and other specialists are more likely than before to be involved with the care of patients for whom dialysis needs to be started because of end-stage kidney disease. The majority of patients starting dialysis are 65 years or older at the time of their first treatment, and many are over 75 years.4 Given the heightened awareness of chronic kidney disease, its high prevalence, its association with multiple comorbidities, and the impact of dialysis on survival and quality of life, we sought to calculate the mean life expectancy of elderly patients who began dialysis at either 65–74 years of age or at 75 years or more, and to identify whether there was any change over the past decade in survival probability on dialysis, or in the effect of comorbidity characteristics.

5.

Background

Previous studies have suggested that erectile dysfunction (ED) is an independent risk factor for macrovascular disease. Very few studies have evaluated the relationship between ED and risk of end stage renal disease (ESRD) requiring dialysis.

Methods

A random sample of 1,000,000 individuals from Taiwan's National Health Insurance database was collected. We identified 3985 patients with newly diagnosed ED between 2000 and 2008 and compared them with a matched cohort of 23,910 patients without ED; controls were matched to subjects by age, diabetes, hypertension, coronary heart disease, hyperlipidemia, area of residence, monthly income, and index date. All patients were tracked from the index date to identify those who subsequently required dialysis.

Results

The incidence rates of dialysis in the ED cohort and the comparison group were 10.85 and 9.06 per 10,000 person-years, respectively. Stratified by age, the incidence rate ratio for dialysis was greater in ED patients aged <50 years (3.16, 95% CI: 1.62–6.19, p = 0.0008) but not in those aged 50–64 years (0.94, 95% CI: 0.52–1.69, p = 0.8397) or ≥65 years (0.69, 95% CI: 0.32–1.52, p = 0.3594). After adjustment for patient characteristics and medical comorbidities, the adjusted HR for dialysis remained greater in ED patients aged <50 years (adjusted HR: 2.08, 95% CI: 1.05–4.11, p<0.05). The log-rank test revealed that ED patients aged <50 years had a significantly higher cumulative incidence of dialysis than those without ED (p = 0.0004).

Conclusion

Patients with ED, especially younger patients, are at an increased risk for ESRD requiring dialysis later in life.

6.

Purpose

This study aimed to investigate one-year mortality and its associated factors in hemodialysis patients who underwent neurosurgical intervention after traumatic brain injury (TBI), using a nationwide database in Taiwan.

Materials and Methods

An age- and gender-matched longitudinal cohort study of 4416 subjects, 1104 TBI patients with end-stage renal disease (ESRD) and 3312 TBI patients without ESRD, was conducted using the National Health Insurance Research Database in Taiwan between January 2000 and December 2007. The demographic characteristics, length of stay (LOS), length of ICU stay, length of ventilation (LOV), and tracheostomy were collected and analyzed. The co-morbidities of hypertension (HTN), diabetes mellitus (DM), myocardial infarction (MI), stroke, and heart failure (HF) were also evaluated.

Results

TBI patients with ESRD presented a shorter LOS, a longer length of ICU stay and LOV, and a higher percentage of comorbidities compared with those without ESRD. TBI patients with ESRD displayed a stable one-year mortality rate, 75.82% to 76.79%, from 2000 to 2007. For TBI patients with ESRD, the median survival time was 0.86 months, and pre-existing stroke was a significant risk factor for mortality (HR: 1.29, 95% C.I.: 1.08–1.55). Pre-existing DM (HR: 1.35, 95% C.I.: 1.12–1.63) and MI (HR: 1.61, 95% C.I.: 1.07–2.42) affected mortality in ESRD patients who underwent surgical intervention for TBI in the younger (age <65) and older (age ≥65) populations, respectively. In addition, the length of ICU stay and tracheostomy may provide important information for predicting mortality risk.

Conclusions

This is the first report indicating an increased risk of one-year mortality among TBI patients with pre-existing ESRD. Comorbidities were more common in TBI patients with ESRD. Physicians should pay more attention to TBI patients with ESRD, taking into account age, comorbidities, length of ICU stay, and tracheostomy, to improve their survival.

7.

Background

Studies comparing patient survival on hemodialysis (HD) and peritoneal dialysis (PD) have yielded conflicting results, and no such study has come from South-East Asia. This study aimed to compare the survival outcomes of patients with end-stage renal disease (ESRD) who started dialysis with HD and PD in Singapore.

Methods

Survival data for a maximum of 5 years from a single-center cohort of 871 ESRD patients starting dialysis with HD (n = 641) or PD (n = 230) from 2005 to 2010 were analyzed using the flexible Royston-Parmar (RP) model. The model was also applied to a subsample of 225 propensity-score-matched patient pairs and to subgroups defined by age, diabetes mellitus, and cardiovascular disease.
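The matched subsample above presupposes a propensity-score step: modeling each patient's probability of starting PD from baseline covariates and pairing PD starters with HD starters of similar propensity. The sketch below shows a simple greedy 1:1 nearest-neighbour match (with replacement) on hypothetical covariates; it is not the authors' procedure, and the Royston-Parmar survival model itself is not reproduced here.

```python
# Minimal sketch of 1:1 propensity-score matching of PD to HD starters
# (hypothetical covariates and simulated data, not the study's method).
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(2)
n = 871
df = pd.DataFrame({
    "pd_modality": rng.integers(0, 2, n),   # 1 = started PD, 0 = started HD
    "age": rng.normal(60, 13, n),
    "diabetes": rng.integers(0, 2, n),
    "cvd": rng.integers(0, 2, n),
})

# Step 1: estimate the propensity of starting PD from baseline covariates.
X = df[["age", "diabetes", "cvd"]]
df["ps"] = LogisticRegression(max_iter=1000).fit(X, df["pd_modality"]).predict_proba(X)[:, 1]

# Step 2: pair each PD patient with the HD patient closest in propensity score.
pd_pts = df[df["pd_modality"] == 1]
hd_pts = df[df["pd_modality"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(hd_pts[["ps"]])
_, idx = nn.kneighbors(pd_pts[["ps"]])
matched_hd = hd_pts.iloc[idx.ravel()]
print("matched pairs:", len(pd_pts))
```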

Results

After adjusting for the effect of socio-demographic and clinical characteristics, the risk of death was higher in patients initiating dialysis with PD than those initiating dialysis with HD (hazard ratio [HR]: 2.08; 95% confidence interval [CI]: 1.67–2.59; p<0.001), although there was no significant difference in mortality between the two modalities in the first 12 months of treatment. Consistently, in the matched subsample, patients starting PD had a higher risk of death than those starting HD (HR: 1.73, 95% CI: 1.30–2.28, p<0.001). Subgroup analysis showed that PD may be similar to or better than HD in survival outcomes among young patients (≤65 years old) without diabetes or cardiovascular disease.

Conclusion

ESRD patients who initiated dialysis with HD experienced better survival outcomes than those who initiated dialysis with PD in Singapore, although survival outcomes may not differ between the two dialysis modalities in young and healthier patients. These findings are potentially confounded by selection bias, as patients were not randomized to the two dialysis modalities in this cohort study.

8.

Objectives

The effects of disability and comorbidity on mortality are widely perceived as additive in clinical models of frailty.

Design

National data were retrospectively extracted from the medical records of community hospitals.

Data Sources

A total of 12,804 acutely disabled patients admitted for inpatient rehabilitation in Singapore community rehabilitation hospitals from 1996 through 2005 were followed up for death until 31 December 2011.

Outcome Measure

Cox proportional-hazards regression was used to assess the interaction of comorbidity and disability at discharge on all-cause mortality.

Results

During a median follow-up of 10.9 years, there were 8,565 deaths (66.9%). The mean age was 73.0 (standard deviation: 11.5) years. Independent risk factors for mortality were higher comorbidity (p<0.001), severity of disability at discharge (p<0.001), being widowed (adjusted hazard ratio [aHR]: 1.38, 95% confidence interval [CI]: 1.25–1.53), low socioeconomic status (aHR: 1.40, 95% CI: 1.29–1.53), discharge to a nursing home (aHR: 1.14, 95% CI: 1.05–1.22), and re-admission into acute care (aHR: 1.54, 95% CI: 1.45–1.65). In the main-effects model, those with high comorbidity had an aHR of 2.41 (95% CI: 2.13–2.72), whereas those with total disability had an aHR of 2.28 (95% CI: 2.12–2.46). In the interaction model, a synergistic interaction existed between comorbidity and disability (p<0.001): those with both high comorbidity and total disability had a much higher aHR of 6.57 (95% CI: 5.15–8.37).
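The contrast between the main-effects and interaction models corresponds to adding a comorbidity-by-disability product term to the Cox regression and testing whether it improves on the purely additive fit. A minimal sketch on simulated data, assuming hypothetical 0/1 indicators rather than the study's coding:

```python
# Minimal sketch of a comorbidity-by-disability interaction in a Cox model
# (simulated data and hypothetical 0/1 indicators, not the study's coding).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 1000
df = pd.DataFrame({
    "years": rng.exponential(8, n),              # follow-up time
    "died": rng.integers(0, 2, n),               # 1 = death observed
    "high_comorbidity": rng.integers(0, 2, n),
    "total_disability": rng.integers(0, 2, n),
})
# The product term captures a more-than-additive (synergistic) effect on the hazard.
df["comorb_x_disab"] = df["high_comorbidity"] * df["total_disability"]

cph = CoxPHFitter().fit(df, duration_col="years", event_col="died")
print(cph.summary["exp(coef)"])   # HR for each main effect and for the interaction
```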

Conclusions

Patients with greater comorbidity and disability at discharge, discharge to a nursing home, re-admission into acute care, lower socioeconomic status, or widowhood had a higher mortality risk. Our results identified predictive variables of mortality that map well onto the frailty cascade model. Increasing comorbidity and disability interacted synergistically to increase mortality risk.

9.

Background

Prolonged mechanical ventilation (PMV) is increasingly common worldwide, consuming enormous healthcare resources. Factors that modify PMV outcome are still obscure.

Methods

We selected patients without preceding mechanical ventilation within the past year who developed PMV during an index admission in Taiwan's National Health Insurance (NHI) system during 1998–2007, and compared their mortality and resource use. They were divided into three groups: (1) patients with end-stage renal disease (ESRD) before the index admission for PMV onset; (2) patients with dialysis-requiring acute kidney injury (AKI-dialysis) during the hospitalization course; and (3) patients without AKI or with non-dialysis-requiring AKI during the hospitalization course (non-AKI). We used a random-effects logistic regression model to identify factors associated with mortality.

Results

Compared with the other two groups, patients with AKI-dialysis had significantly longer mechanical ventilation, more frequent use of vasopressors, longer intensive care unit/hospital stay and higher inpatient expenditures during the index admission. Relative to non-AKI patients, patients with AKI-dialysis had an elevated mortality hazard; the adjusted relative risk ratios were 1.51 (95% confidence interval [CI]:1.46–1.56), 1.27 (95% CI: 1.23–1.32), and 1.10 (95% CI: 1.08–1.12) for mortality rates at discharge, 3 months, and 4 years after PMV, respectively. Patients with AKI-dialysis also consumed significantly higher total in-patient expenditure than the other two patient groups (p<0.001).

Conclusions

Among patients who need PMV care during an admission, the presence of de novo AKI requiring dialysis significantly increased short- and long-term mortality and the demand for healthcare resources.

10.

Background

Reduced lean body mass (LBM) is one of the main indicators of malnutrition-inflammation syndrome among patients on dialysis. However, the influence of LBM on peritoneal dialysis (PD) patients' outcomes and the factors related to increasing LBM are seldom reported.

Methods

We enrolled 103 incident PD patients between 2002 and 2003, and followed them until December 2011. Clinical characteristics, PD-associated parameters, residual renal function, and serum chemistry profiles of each patient were collected at 1 month and 1 year after initiating PD. LBM was estimated using the creatinine index corrected for body weight. Multiple linear regression analysis, Kaplan–Meier survival analysis, and Cox proportional hazards regression analysis were used to define independent variables and compare survival between groups.

Results

Using the median LBM value (70% for men and 64% for women), patients were divided into group 1 (n = 52; low LBM) and group 2 (n = 51; high LBM). Group 1 patients had higher rates of peritonitis (1.6 vs. 1.1/100 patient months; p<0.05) and hospitalization (14.6 vs. 9.7/100 patient months; p<0.05). Group 1 patients also had shorter overall survival and technique survival (p<0.01). Each percentage point increase in LBM reduced the hazard ratio for mortality by 8% after adjustment for diabetes, age, sex, and body mass index (BMI). Changes in residual renal function and protein catabolic rate were independently associated with changes in LBM in the first year of PD.

Conclusions

LBM serves as a good parameter in addition to BMI to predict the survival of patients on PD. Preserving residual renal function and increasing protein intake can increase LBM.

11.

Background

Few studies have examined the association of rheumatoid arthritis (RA) with nontuberculous mycobacterial (NTM) disease and pulmonary tuberculosis (PTB).

Methods

We identified 29 131 patients with RA diagnosed from 1998 through 2008 from the catastrophic illness registry; 116 524 patients without RA from inpatient data files were randomly frequency-matched according to sex, age, and index year and used as a comparison group. Both groups were followed up until the end of 2010 to measure the incidence of NTM disease and active PTB. We analyzed the risk of NTM disease and active PTB using Cox proportional hazards regression models, controlling for sex, age, and Charlson comorbidity index (CCI).

Results

The incidence of NTM disease was 4.22 times greater in the RA group than in the non-RA group (1.91 vs 0.45 per 10,000 person-years). The incidence of PTB was 2.99 times greater in the RA group than in the non-RA group (25.3 vs 8.46 per 10,000 person-years). After adjusting for age, sex, and CCI, the adjusted hazard ratios (HRs) of NTM disease and active PTB for the RA group were 4.17 (95% CI = 2.61–6.65) and 2.87 (95% CI = 2.55–3.23), respectively, compared with the non-RA group. In the first 2 years of follow-up, the RA group yielded corresponding adjusted HRs of 4.98 and 3.39 compared with the non-RA group. The HRs for the RA group relative to the non-RA group varied with follow-up time for both NTM disease and active PTB.

Conclusion

This study can serve as a reference for clinicians, increasing awareness of the need to detect NTM disease and active PTB in RA patients at any stage of the clinical course, even in those without comorbidities on the CCI.

12.

Background

Abnormal serum potassium is associated with an increased risk of mortality in dialysis patients. However, the impact of serum potassium levels on short- and long-term mortality and the association of potassium variability with death in peritoneal dialysis (PD) patients are uncertain.

Methods

We examined the mortality predictability of baseline serum potassium and of its variability in PD patients treated in our center from January 2006 through December 2010, with follow-up through December 2012. Hazard ratios (HRs) were used to assess the relationship between baseline potassium levels and short-term (≤1 year) as well as long-term (>1 year) survival. Variability of serum potassium was defined as the coefficient of variation of serum potassium (CVSP) during the first year of PD.
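The coefficient of variation used here is the sample standard deviation of a patient's first-year potassium measurements divided by their mean, expressed as a percentage. A minimal sketch for one hypothetical patient's values:

```python
# Minimal sketch of the coefficient of variation of serum potassium (CVSP)
# over the first year of PD, for one hypothetical patient.
import numpy as np

potassium_meq_l = np.array([3.9, 4.4, 3.6, 4.1, 4.8, 3.8])   # hypothetical values
cv_sp = potassium_meq_l.std(ddof=1) / potassium_meq_l.mean() * 100
print(f"CVSP = {cv_sp:.1f}%")   # patients are then grouped, e.g. <7.5%, 7.5 to <12.0%, ...
```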

Results

A total of 886 incident PD patients were enrolled, of whom 248 (27.9%) presented with hypokalemia (serum potassium <3.5 mEq/L). During a median follow-up of 31 months (range: 0.5–81.0 months), the adjusted all-cause mortality hazard ratios (HRs) and 95% confidence intervals (CIs) for baseline serum potassium of <3.0, 3.0 to <3.5, 3.5 to <4.0, 4.5 to <5.0, and ≥5.0 mEq/L, compared with 4.0 to <4.5 mEq/L (reference), were 1.79 (1.02–3.14), 1.15 (0.72–1.86), 1.31 (0.82–2.08), 1.33 (0.71–2.48), and 1.28 (0.53–3.10), respectively. The increased mortality risk associated with lower potassium was evident during the first year of follow-up but vanished thereafter. Adjusted all-cause mortality HRs for CVSP categories of 7.5% to <12.0%, 12.0% to <16.7%, and ≥16.7%, compared with <7.5% (reference), were 1.35 (0.67–2.71), 2.00 (1.05–3.83), and 2.18 (1.18–4.05), respectively. Similar associations were found between serum potassium levels and their variability and cardiovascular mortality.

Conclusions

A lower serum potassium level was associated with all-cause and cardiovascular mortality during the first year of follow-up in incident PD patients. In addition, higher variability of serum potassium levels conferred an increased risk of death in this population.

13.

Background and Objectives

Numerous substances accumulate in the body in uremia but those contributing to cardiovascular morbidity and mortality in dialysis patients are still undefined. We examined the association of baseline free levels of four organic solutes that are secreted in the native kidney — p-cresol sulfate, indoxyl sulfate, hippurate and phenylacetylglutamine — with outcomes in hemodialysis patients.

Design, Setting, Participants and Measurements

We measured these solutes in stored specimens from 394 participants of a US national prospective cohort study of incident dialysis patients. We examined the relation of each solute and a combined solute index to cardiovascular mortality and morbidity (first cardiovascular event) using Cox proportional hazards regression adjusted for demographics, comorbidities, clinical factors and laboratory tests including Kt/Vurea.

Results

Mean age of the patients was 57 years, 65% were white and 55% were male. In fully adjusted models, a higher p-cresol sulfate level was associated with a greater risk (HR per SD increase; 95% CI) of cardiovascular mortality (1.62; 1.17–2.25; p=0.004) and first cardiovascular event (1.60; 1.23–2.08; p<0.001). A higher phenylacetylglutamine level was associated with a greater risk of first cardiovascular event (1.37; 1.18–1.58; p<0.001). Patients in the highest quintile of the combined solute index had a 96% greater risk of cardiovascular mortality (1.96; 1.05–3.68; p=0.04) and 62% greater risk of first cardiovascular event (1.62; 1.12–2.35; p=0.01) compared with patients in the lowest quintile. Results were robust in sensitivity analyses.
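Reporting hazard ratios "per SD increase" implies that each solute was standardized (z-scored) before entering the Cox model, and the quintile results imply a combined index split into fifths. The sketch below illustrates both steps on hypothetical solute measurements; because the abstract does not give the actual construction of the combined solute index, a simple mean of the standardized solutes is used here as a stand-in.

```python
# Minimal sketch of per-SD scaling and quintiles of a combined solute index
# (hypothetical measurements; the study's index construction is not given here).
import numpy as np
import pandas as pd

rng = np.random.default_rng(4)
n = 394
solutes = pd.DataFrame({
    "p_cresol_sulfate": rng.lognormal(1.0, 0.5, n),
    "indoxyl_sulfate": rng.lognormal(1.2, 0.4, n),
    "hippurate": rng.lognormal(0.8, 0.6, n),
    "phenylacetylglutamine": rng.lognormal(0.9, 0.5, n),
})

# Z-scoring each solute lets the Cox exp(coef) be read as the HR per SD increase.
z = (solutes - solutes.mean()) / solutes.std(ddof=0)

# Stand-in combined index: mean of the standardized solutes, split into quintiles.
combined = z.mean(axis=1)
quintile = pd.qcut(combined, 5, labels=["Q1", "Q2", "Q3", "Q4", "Q5"])
print(quintile.value_counts().sort_index())
```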

Conclusions

Free levels of uremic solutes that are secreted by the native kidney are associated with a higher risk of cardiovascular morbidity and mortality in incident hemodialysis patients.

14.

Background

Profound alterations in immune responses associated with uremia and exacerbated by dialysis increase the risk of active tuberculosis (TB). Evidence of the long-term risk and outcome of active TB after acute kidney injury (AKI) is limited.

Methods

This population-based cohort study used claims records retrieved from the Taiwan National Health Insurance database. We retrieved records of all hospitalized patients older than 18 years who underwent dialysis for acute kidney injury (AKI) during 1999–2008, validated against the NSARF data. A time-dependent Cox proportional hazards model, adjusting for the ongoing effect of end-stage renal disease (ESRD), was used to predict long-term de novo active TB after discharge from the index hospitalization.

Results

Of 2,909 AKI dialysis patients surviving 90 days after index discharge, 686 did not require dialysis after hospital discharge. The control group included 11,636 hospitalized patients without AKI, dialysis, or a history of TB. The relative risk of active TB in AKI dialysis patients, relative to the general population, after a mean follow-up period of 3.6 years was 7.71. Patients who did (hazard ratio [HR], 3.84; p<0.001) and did not (HR, 6.39; p<0.001) recover from dialysis-requiring AKI had a significantly higher incidence of TB than patients without AKI. The external validation data also showed that the non-recovery subgroup (HR = 4.37; p = 0.049) had a higher risk of developing active TB than patients without AKI. Additionally, active TB was associated with long-term all-cause mortality after AKI requiring dialysis (HR, 1.34; p = 0.032).

Conclusions

AKI requiring dialysis appears to independently increase the long-term risk of active TB, even among those who were weaned from dialysis by discharge. These results raise concerns that the increasing global burden of AKI will in turn increase the incidence of active TB.

15.

Background

Patients started on long-term hemodialysis have typically had low rates of reported renal recovery, with recent estimates ranging from 0.9% to 2.4%, while higher rates of recovery have been reported in cohorts with larger proportions of patients with acute renal failure requiring dialysis.

Study Design

Our analysis followed approximately 194,000 patients who were initiated on hemodialysis during a 2-year period (2008 & 2009) with CMS-2728 forms submitted to CMS by dialysis facilities, cross-referenced with patient record updates through the end of 2010, and tracked through December 2010 in the CMS SIMS registry.

Results

We report a sustained renal recovery rate (i.e., no return to ESRD during the available follow-up period) among Medicare ESRD patients of >5%, much higher than previously reported. Recovery occurred primarily in the first 2 months after incident dialysis and was more likely in cases with renal failure secondary to etiologies associated with acute kidney injury. Patients experiencing sustained recovery were markedly less likely than true long-term ESRD patients to have permanent vascular accesses in place at incident hemodialysis, while non-White patients and patients with any prior nephrology care appeared to have significantly lower rates of renal recovery. We also found widespread geographic variation in the rates of renal recovery across the United States.

Conclusions

Renal recovery rates in the US Medicare ESRD program are higher than previously reported and appear to have significant geographic variation. Patients with diagnoses associated with acute kidney injury who are initiated on long-term hemodialysis have significantly higher rates of renal recovery than the general ESRD population and lower rates of permanent access placement.

16.

Purpose

We examined individual-level and neighborhood-level predictors of mortality in patients with colorectal cancer (CRC) diagnosed in Florida to identify high-risk groups for targeted interventions.

Methods

Demographic and clinical data from the Florida Cancer Data System registry (2007–2011) were linked with Agency for Health Care Administration and US Census data (n = 47,872). Cox proportional hazards regression models were fitted with candidate predictors of CRC survival and stratified by age group (18–49, 50–64, 65+).

Results

Stratified by age group, the increase in mortality risk per comorbidity was greatest in the youngest group (21%), followed by the middle (19%) and oldest (14%) age groups. The two younger age groups had higher mortality risk with proximal than with distal cancer. Compared with private insurance, those in the middle age group were at higher risk of death if uninsured (HR = 1.35) or insured through Medicare (HR = 1.44), Medicaid (HR = 1.53), or the Veteran's Administration (HR = 1.26). Only Medicaid in the youngest group (52% higher risk) and lack of insurance in the oldest group (24% lower risk) differed significantly from the privately insured counterparts. In the 18–49 and 50–64 age groups, there was a higher mortality risk among the lowest SES (1.17- and 1.23-fold higher in the middle age group and 1.12- and 1.17-fold higher in the older age group, respectively) compared with the highest SES. Married patients fared significantly better than divorced/separated (HR = 1.22), single (HR = 1.29), or widowed (HR = 1.19) patients.

Conclusion

Factors associated with increased mortality risk among individuals with CRC included older age, being uninsured, being unmarried, having more comorbidities, living in lower-SES neighborhoods, and being diagnosed at a later disease stage. The higher risk among younger patients was attributed to proximal cancer site, Medicaid coverage, and distant disease; however, lower SES and being unmarried were not risk factors in this age group. Targeted interventions to improve survivorship, together with greater social support tailored by age group, may assist these high-risk groups.

17.

Background

Studies on the association between iron supplementation and mortality in dialysis patients are rare and conflicting.

Methods

In our observational single-center cohort study (INVOR study), we prospectively studied 235 incident dialysis patients. Time-dependent Cox proportional hazards models using all measured laboratory values for up to 7.6 years were applied to study the association between iron supplementation and all-cause, cardiovascular, and sepsis-related mortality. Furthermore, the time-dependent association of ferritin levels with mortality in patients with normal C-reactive protein (CRP) levels (<0.5 mg/dL) and elevated CRP levels (≥0.5 mg/dL) was evaluated using non-linear P-splines to allow flexible modeling of the association.

Results

One hundred and ninety-one (81.3%) patients received intravenous iron and 13 (5.5%) received oral iron, whereas 31 (13.2%) patients were never supplemented with iron throughout the observation period. Eighty-two (35%) patients died during a median follow-up of 34 months, 38 from cardiovascular events and 21 from sepsis. Baseline CRP levels did not differ between patients with and without iron supplementation. However, baseline serum ferritin levels were lower in patients receiving iron during follow-up (median 93 vs 251 ng/mL, p<0.001). Iron supplementation was associated with significantly reduced all-cause mortality [HR (95% CI): 0.22 (0.08–0.58); p = 0.002] and with reduced cardiovascular and sepsis-related mortality [HR (95% CI): 0.31 (0.09–1.04); p = 0.06]. Increasing ferritin concentrations in patients with normal CRP were associated with decreasing mortality, whereas in patients with elevated CRP values, ferritin levels >800 ng/mL were linked with increased mortality.

Conclusions

Iron supplementation is associated with reduced all-cause mortality in incident dialysis patients. While serum ferritin levels up to 800 ng/mL appear to be safe, higher ferritin levels are associated with increased mortality in the setting of concomitant inflammation.

18.

Background

The causes of death contributing to long-term mortality after acute kidney injury (AKI) have not been well studied. The purpose of this study was to evaluate the role of comorbidities and the causes of death in long-term mortality after AKI.

Methodology/Principal Findings

We retrospectively studied 507 patients who experienced AKI in 2005–2006 and were discharged free from dialysis. In June 2008 (median: 21 months after AKI), we found that 193 (38%) patients had died. This mortality is much higher than that of the population of São Paulo City, even after adjustment for age. A multiple survival analysis was performed using a Cox proportional hazards regression model and showed that death was associated with a Khan's index indicating high risk [adjusted hazard ratio 2.54 (1.38–4.66)], chronic liver disease [1.93 (1.15–3.22)], admission to a non-surgical ward [1.85 (1.30–2.61)], and a second AKI episode during the same hospitalization [1.74 (1.12–2.71)]. AKI severity, evaluated either by the worst stage reached during AKI (P = 0.20) or by the need for dialysis (P = 0.12), was not associated with death. The cause of death was identified from a death certificate in 85% of the non-survivors. Among those who died from circulatory system diseases (the main cause of death), 59% had already suffered from hypertension, 34% from diabetes, 47% from heart failure, and 38% from coronary disease, and 66% had a glomerular filtration rate <60 mL/min/1.73 m2 before the AKI episode. Among those who died from neoplasms, 79% already had the disease before the AKI episode.

Conclusions

Among AKI survivors who were discharged free from dialysis, the increased long-term mortality was associated with their pre-existing chronic conditions and not with the severity of the AKI episode. These findings suggest that these survivors should have medical follow-up after hospital discharge and that all efforts should be made to control their comorbidities.

19.

Background

Delayed nephrology consultation (NC) seems to be associated with worse prognosis in critically ill acute kidney injury (AKI) patients.

Design, Setting, Participants, & Measurements

The aims of this study were to analyze factors related to the timing of NC and its relation to the outcomes of AKI patients in the intensive care units of a tertiary hospital. AKI was defined as an increase of ≥50% over baseline serum creatinine (SCr). Early and delayed NC were defined as NC performed before and two days after the AKI diagnosis day, respectively. Multivariable logistic regression and propensity scores (PS) were used to adjust for confounding and selection biases. Hospital mortality and dialysis dependence at hospital discharge were the primary outcomes.

Results

A total of 366 AKI patients were analyzed, and NCs were carried out in 53.6% of the patients. Hospital mortality was 67.8%, and dialysis was required in 31.4% of patients (115/366). Delayed NCs (34%) occurred two days after the AKI diagnosis day. This group presented higher mortality (OR: 4.04, CI: 1.60–10.17) and increased dialysis dependence (OR: 3.00, CI: 1.43–6.29) at hospital discharge. Four variables were retained in the PS model for delayed NC: diuresis (per 1000 mL/24 h, OR: 1.92, CI: 1.27–2.90), SCr (OR: 0.49, CI: 0.32–0.75), surgical AKI (OR: 3.67, CI: 1.65–8.15), and mechanical ventilation (OR: 2.82, CI: 1.06–7.44). After correction by PS, delayed NC remained associated with higher mortality (OR: 3.39, CI: 1.24–9.29) and increased dialysis dependence (OR: 3.25, CI: 1.41–7.51). Delayed NC was associated with increased mortality in both dialyzed (OR: 1.54, CI: 1.35–1.78) and non-dialyzed patients (OR: 2.89, CI: 1.00–8.35).
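The "correction by PS" above corresponds to estimating each patient's propensity for delayed NC from the four retained variables and then carrying that score into the outcome model for hospital mortality. A minimal sketch with hypothetical variables and simulated data, not the study's analysis:

```python
# Minimal sketch of propensity-score adjustment for delayed nephrology consultation
# (hypothetical variables and simulated data, not the study's analysis).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 366
df = pd.DataFrame({
    "delayed_nc": rng.integers(0, 2, n),      # 1 = delayed consultation
    "diuresis_l": rng.normal(1.0, 0.5, n),    # urine output, L/24 h
    "scr": rng.normal(2.5, 1.0, n),           # serum creatinine
    "surgical_aki": rng.integers(0, 2, n),
    "mech_vent": rng.integers(0, 2, n),
    "hospital_death": rng.integers(0, 2, n),
})

# Step 1: propensity of delayed NC from the four variables retained in the PS model.
ps_X = sm.add_constant(df[["diuresis_l", "scr", "surgical_aki", "mech_vent"]])
df["ps"] = sm.Logit(df["delayed_nc"], ps_X).fit(disp=0).predict(ps_X)

# Step 2: outcome model for hospital mortality, adjusted for the propensity score.
out_X = sm.add_constant(df[["delayed_nc", "ps"]])
fit = sm.Logit(df["hospital_death"], out_X).fit(disp=0)
print("OR for delayed NC:", np.exp(fit.params["delayed_nc"]))
```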

Conclusion

Delayed NC was associated with higher mortality and increased dialysis dependence rates in critically ill AKI patients at hospital discharge. Further studies are necessary to ascertain whether this effect is due to delayed nephrology intervention or residual confounding factors.

20.

Background

Diastolic heart failure (HF), the prevalence of which is gradually increasing, is associated with cardiovascular (CV) morbidity and mortality in the general population and, more specifically, in patients with end-stage renal disease (ESRD). However, the impact of diastolic dysfunction on CV outcomes has not been studied in incident dialysis patients with preserved systolic function.

Methods

This prospective observational cohort study investigated the clinical consequences of diastolic dysfunction and the predictive power of diastolic echocardiographic parameters for CV events in 194 incident ESRD patients with normal or near-normal systolic function who started dialysis between July 2008 and August 2012.

Results

During a mean follow-up duration of 27.2 months, 57 patients (29.4%) experienced CV events. Compared to the CV event-free group, patients with CV events had a significantly higher left ventricular (LV) mass index, ratio of early mitral flow velocity (E) to early mitral annulus velocity (E’) (E/E’), LA volume index (LAVI), deceleration time, and right ventricular systolic pressure, and a significantly lower LV ejection fraction and E’. In multivariate Cox proportional hazards analysis, E/E’>15 and LAVI>32 mL/m2 significantly predicted CV events (E/E’>15: hazard ratio [HR] = 5.40, 95% confidence interval [CI] = 2.73–10.70, P<.001; LAVI>32 mL/m2: HR = 5.56, 95% CI = 2.28–13.59, P<.001). Kaplan-Meier analysis revealed that patients with both E/E’>15 and LAVI>32 mL/m2 had the worst CV outcomes.

Conclusion

An increase in E/E’ or LAVI is a significant risk factor for CV events in incident dialysis patients with preserved LV systolic function.
