Similar Articles
A total of 20 similar articles were found.
1.

Background

The worldwide elderly (≥65 years old) dialysis population has grown significantly. This population is expected to have more comorbid conditions and shorter life expectancies than the general elderly population. Predicting outcomes for this population is important for decision-making. Recently, a new comorbidity index (nCI) with good predictive value for patient outcomes was developed and validated in chronic dialysis patients regardless of age. Our study examined the nCI outcome predictability in elderly dialysis patients.

Methods and Findings

For this population-based cohort study, we used Taiwan's National Health Insurance Research Database of enrolled elderly patients who began maintenance dialysis between January 1999 and December 2005. A total of 21,043 incident dialysis patients were divided into 4 groups by nCI score (≤3, 4–6, 7–9, ≥10) and followed for nearly 10 years. All-cause mortality and life expectancy were analyzed. During the follow-up period, 11,272 (53.55%) patients died. Kaplan-Meier curves showed a significant difference in survival between the groups (log-rank: P<0.001). After stratification by age, life expectancy was found to be significantly longer in groups with lower nCI scores.
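
A rough sketch of the survival comparison described above (Kaplan-Meier curves per nCI group plus an overall log-rank test) using the Python lifelines package; the file and column names (time_months, died, nci_group) are illustrative assumptions, not the study's actual variables.

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import multivariate_logrank_test

df = pd.read_csv("elderly_dialysis_cohort.csv")   # hypothetical extract

kmf = KaplanMeierFitter()
for label, grp in df.groupby("nci_group"):        # groups: <=3, 4-6, 7-9, >=10
    kmf.fit(grp["time_months"], event_observed=grp["died"], label=str(label))
    print(label, kmf.median_survival_time_)       # median survival per nCI group

# overall log-rank test across the four nCI groups
result = multivariate_logrank_test(df["time_months"], df["nci_group"], df["died"])
print(result.p_value)
```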

Conclusion

The nCI, even without the age component, is a strong predictor of mortality in elderly dialysis patients. Because lower nCI scores may predict better survival, more attention should be paid to adequate dialysis rather than palliative care, especially in patients without obvious functional impairment.

2.

Background

Comorbid conditions are highly prevalent among patients with end-stage renal disease (ESRD), and the comorbidity index score is a predictor of mortality in dialysis patients. The aim of this population-based cohort study was to investigate survival by age and Charlson comorbidity index (CCI) in incident dialysis patients.

Methods

Using the catastrophic illness registry of the Taiwan National Health Insurance Research Database for 1 January 1998 to 31 December 2008, individuals newly diagnosed with ESRD and receiving dialysis for more than 90 days were eligible for our study. Individuals younger than 18 years and patients who received a renal transplant before or after dialysis were excluded. We calculated the CCI and the age-weighted CCI by the Deyo-Charlson method according to ICD-9 codes and categorized the CCI into six groups (index scores <3, 4–6, 7–9, 10–12, 13–15, >15). Cox regression models were used to analyze the association of age and CCI with survival and to identify risk markers of survival.
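
The Deyo-Charlson scoring step can be sketched as follows. The ICD-9 prefix map below is a deliberately abbreviated, illustrative subset (the full Deyo mapping is much longer), and the claims-file layout and column names are assumptions.

```python
import pandas as pd

# ICD-9 prefix -> (condition, Charlson weight); abbreviated illustrative subset only
CHARLSON_PREFIXES = {
    "410": ("myocardial_infarction", 1),
    "428": ("congestive_heart_failure", 1),
    "250": ("diabetes", 1),
    "571": ("mild_liver_disease", 1),
    "585": ("renal_disease", 2),
    "196": ("metastatic_cancer", 6),
}

def charlson_score(icd9_codes):
    seen = {}
    for code in icd9_codes:
        for prefix, (condition, weight) in CHARLSON_PREFIXES.items():
            if code.startswith(prefix):
                seen[condition] = weight          # count each condition once
    return sum(seen.values())

claims = pd.DataFrame({"patient_id": [1, 1, 2], "icd9": ["4101", "25000", "58581"]})
scores = claims.groupby("patient_id")["icd9"].apply(lambda c: charlson_score(list(c)))

# six score bands, roughly following the study's grouping
groups = pd.cut(scores, bins=[-1, 3, 6, 9, 12, 15, float("inf")],
                labels=["<=3", "4-6", "7-9", "10-12", "13-15", ">15"])
print(pd.concat([scores.rename("cci"), groups.rename("cci_group")], axis=1))
```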

Results

There were 79,645 incident dialysis patients with a mean age (±SD) of 60.96 (±13.92) years; 51.43% were women and 51.2% were diabetic. In Cox proportional hazards models stratified by age, older patients had significantly higher mortality than younger patients, and mortality risk was higher in persons with higher CCI than in those with low CCI. Mortality increased steadily with higher age or comorbidity in both unadjusted and adjusted models. Across all age groups, mortality rates increased with higher CCI, with the highest rates occurring in the oldest age groups.

Conclusions

Age and CCI are both strong predictors of survival in Taiwan. Older age or a higher comorbidity index in incident dialysis patients is associated with lower long-term survival rates. These population-based estimates may assist clinicians in decision-making when patients need long-term dialysis.

3.

Background

Studies comparing patient survival on hemodialysis (HD) and peritoneal dialysis (PD) have yielded conflicting results, and no such study has been conducted in South-East Asia. This study aimed to compare the survival outcomes of patients with end-stage renal disease (ESRD) who started dialysis with HD and PD in Singapore.

Methods

Survival data for a maximum of 5 years from a single-center cohort of 871 ESRD patients starting dialysis with HD (n = 641) or PD (n = 230) from 2005 to 2010 were analyzed using the flexible Royston-Parmar (RP) model. The model was also applied to a subsample of 225 propensity-score-matched patient pairs and to subgroups defined by age, diabetes mellitus, and cardiovascular disease.
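
The propensity-score matching step might look roughly like the sketch below: a logistic model for the probability of starting PD, followed by 1:1 nearest-neighbour matching on the logit of the score. The covariate list and column names are assumptions, and the Royston-Parmar survival model itself (typically fitted with specialised packages) is not reproduced here.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

df = pd.read_csv("esrd_cohort.csv")              # hypothetical extract
covariates = ["age", "diabetes", "cvd", "male"]  # assumed baseline covariates

# propensity score: probability of starting PD given baseline covariates
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["pd"])
df["ps"] = ps_model.predict_proba(df[covariates])[:, 1]
df["logit_ps"] = np.log(df["ps"] / (1 - df["ps"]))

# 1:1 nearest-neighbour matching of PD starters to HD starters
treated, control = df[df["pd"] == 1], df[df["pd"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["logit_ps"]])
_, idx = nn.kneighbors(treated[["logit_ps"]])
matched = pd.concat([treated, control.iloc[idx.ravel()]])
print(matched.groupby("pd")[covariates].mean())  # crude check of covariate balance
```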

Results

After adjusting for the effect of socio-demographic and clinical characteristics, the risk of death was higher in patients initiating dialysis with PD than those initiating dialysis with HD (hazard ratio [HR]: 2.08; 95% confidence interval [CI]: 1.67–2.59; p<0.001), although there was no significant difference in mortality between the two modalities in the first 12 months of treatment. Consistently, in the matched subsample, patients starting PD had a higher risk of death than those starting HD (HR: 1.73, 95% CI: 1.30–2.28, p<0.001). Subgroup analysis showed that PD may be similar to or better than HD in survival outcomes among young patients (≤65 years old) without diabetes or cardiovascular disease.

Conclusion

ESRD patients who initiated dialysis with HD experienced better survival outcomes than those who initiated dialysis with PD in Singapore, although survival outcomes may not differ between the two dialysis modalities in younger, healthier patients. These findings are potentially confounded by selection bias, as patients were not randomized to the two dialysis modalities in this cohort study.

4.

Background

Many Canadian patients who receive hemodialysis live far from their attending nephrologist, which may affect clinical outcomes. We investigated whether patients receiving hemodialysis who live farther from their attending nephrologist are more likely to die than those who live closer.

Methods

We studied a random sample of 18 722 patients who began hemodialysis between 1990 and 2000 in Canada. We calculated the distance between each patient's residence location at the start of dialysis and the practice location of their attending nephrologist. We used Cox proportional hazards models to examine the adjusted relation between distance and clinical outcomes (death from all causes, infectious causes and cardiovascular causes) over a follow-up period of up to 14 years.
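
One simple way to compute a residence-to-practice distance from geographic coordinates is the great-circle (haversine) formula sketched below. The coordinates shown are placeholder values, and the paper's actual geocoding and distance method may differ.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two latitude/longitude points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

# distance between a placeholder residence and practice location, in km;
# the result would then be binned into <=50, 50.1-150, 150.1-300, >300 km
print(haversine_km(53.5461, -113.4938, 52.2681, -113.8112))
```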

Results

During the follow-up period (median 2.5 yr, interquartile range 1.0–4.7 yr), 11 582 (62%) patients died. Compared with patients who lived within 50 km of their nephrologist, the adjusted hazard ratio of death was 1.06 (95% confidence interval [CI] 1.01–1.12) for those who lived 50.1–150 km away, 1.13 (95% CI 1.04–1.22) for those who lived 150.1–300 km away and 1.13 (95% CI 1.03–1.24) for those who lived more than 300 km from their nephrologist (p for trend < 0.001). The risk of death from infectious causes increased with greater distance from the attending nephrologist (p for trend < 0.001). The risk of death from cardiovascular causes did not increase with distance from the attending nephrologist (p for trend = 0.21). Compared with patients who lived within 50 km of their nephrologist, the adjusted hazard ratio of death among those who lived more than 300 km away was 1.75 (95% CI 1.32–2.32) for infectious causes and 0.93 (95% CI 0.79–1.09) for cardiovascular causes.

Conclusions

Mortality associated with hemodialysis was greater among patients who lived farther from their attending nephrologist, as compared with those who lived closer. This was especially evident for death from infectious causes.

In Canada, no one is denied renal replacement therapy because of their residence location; however, a substantial proportion of patients receiving dialysis live more than 300 km from the closest nephrologist.1 Since this geographic barrier may make it more difficult to provide high-quality renal care, it is plausible that disparities in access to appropriate care may result in differences in health outcomes. Despite the potential implications for health policy, this issue has not been formally studied.

We sought to examine this issue using data collected prospectively from patients who began hemodialysis in Canada between 1990 and 2000. We hypothesized that patients who lived farther away from their attending nephrologist would be more likely than patients who lived closer to die after starting dialysis.

5.

Purpose

To characterize the impact of comorbidity on survival outcomes for patients with nasopharyngeal carcinoma (NPC) post radiotherapy (RT).

Methods

A total of 4095 patients with NPC treated by RT or RT plus chemotherapy (CT) in the period from 2007 to 2011 were included through Taiwan’s National Health Insurance Research Database. Information on comorbidities present prior to the NPC diagnosis was obtained and adapted to the Charlson Comorbidity Index (CCI), the Age-Adjusted Charlson Comorbidity Index (ACCI) and a revised head and neck comorbidity index (HN-CCI). The prevalence of comorbidity and its influence on survival were calculated and analyzed.

Results

Most of the patients (75%) were male (age 51±13 years), and 2470 (60%) had at least one comorbid condition, the most common being diabetes mellitus. For all three comorbidity indices (CCI, ACCI and HN-CCI), higher scores were associated with worse overall survival (P<0.001). Receiver operating characteristic (ROC) curves were used to assess the discriminating ability of the CCI, ACCI and HN-CCI scores and showed that the predictive ability for mortality of the ACCI (0.693, 95% CI 0.670–0.715) was superior to that of the CCI (0.619, 95% CI 0.593–0.644) and the HN-CCI (0.545, 95% CI 0.519–0.570).
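
The discrimination comparison reported above can be sketched as the area under the ROC curve for mortality, using each index as the sole predictor. Column names (died, cci, acci, hn_cci) are assumptions, and the confidence intervals reported in the paper are not reproduced here.

```python
import pandas as pd
from sklearn.metrics import roc_auc_score

df = pd.read_csv("npc_cohort.csv")                   # hypothetical extract
for score in ["cci", "acci", "hn_cci"]:
    auc = roc_auc_score(df["died"], df[score])       # higher score assumed to mean higher risk
    print(f"{score}: AUC = {auc:.3f}")
```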

Conclusion

Comorbidities greatly influenced the clinical presentations, therapeutic interventions, and outcomes of patients with NPC post RT. Higher comorbidity index scores were associated with worse survival. The ACCI appears to be a more appropriate prognostic indicator and should be considered in further clinical studies.

6.

Background

The association between diabetes mellitus (DM) and tuberculosis (TB) is re-emerging worldwide, and the prevalence of DM is increasing in resource-poor countries with a high TB burden. The objective of the current study was to determine the prevalence of TB and DM comorbidity and to analyze its associated factors in the South-Eastern Amhara Region, Ethiopia.

Methods

This was a facility-based cross-sectional study. All newly diagnosed TB patients attending selected health facilities in the study area were consecutively screened for DM. DM was diagnosed based on the World Health Organization diagnostic criteria. A pre-tested semi-structured questionnaire was used to collect socio-demographic, lifestyle and clinical data. Logistic regression analysis was performed to identify factors associated with TB and DM comorbidity.
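
A minimal sketch of the logistic-regression step, under assumed variable names and codings (the study's actual model specification may differ):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("tb_screening.csv")      # hypothetical extract
model = smf.logit(
    "dm ~ C(sex) + C(age_group) + C(pulmonary_tb) + C(family_history_dm)",
    data=df,
).fit()
print(model.summary())
print(np.exp(model.params))               # coefficients exponentiated to odds ratios
```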

Result

Among the 1314 patients who participated in the study, 109 (8.3%) had DM. Being female [odds ratio (OR) 1.70; 95% confidence interval (CI) 1.10–2.62], older age [41–64 years: OR 3.35, 95% CI 2.01–5.57; 65–89 years: OR 3.18, 95% CI 1.52–6.64], being a pulmonary TB case (OR 1.69; 95% CI 1.09–2.63) and having a family history of DM (OR 4.54; 95% CI 2.36–8.73) were factors associated with TB and DM comorbidity.

Conclusion

The prevalence of DM among TB patients in the South-Eastern Amhara Region is high. Routine screening of TB patients for DM is recommended in the study area.

7.

Background

Outcomes for pediatric solid tumors have significantly improved over the last 30 years. However, much of this improvement is due to improved outcome for patients with localized disease. Here we evaluate overall survival (OS) for pediatric patients with metastatic disease over the last 40 years.

Procedure

The United States Surveillance, Epidemiology, and End Results (SEER) database was used to conduct this study. Patients diagnosed between 0 and 18 years of age with metastatic Ewing sarcoma, neuroblastoma, osteosarcoma, rhabdomyosarcoma or Wilms tumor were included in the analysis.

Results

3,009 patients diagnosed between 1973–2010 met inclusion criteria for analysis. OS at 10 years for patients diagnosed between 1973–1979, 1980–1989, 1990–1999 and 2000–2010 was 28.3%, 37.2%, 44.7% and 49.3%, respectively (p<0.001). For patients diagnosed between 2000–2010, 10-year OS for patients with Ewing sarcoma, neuroblastoma, osteosarcoma, rhabdomyosarcoma and Wilms tumor was 30.6%, 54.4%, 29.3%, 27.5%, and 76.6%, respectively, as compared to 13.8%, 25.1%, 13.6%, 17.9% and 57.1%, respectively, for patients diagnosed between 1973–1979. OS for neuroblastoma significantly increased with each decade. For patients with osteosarcoma and Ewing sarcoma, there was no improvement in OS over the last two decades. There was no improvement in outcome for patients with rhabdomyosarcoma or Wilms tumor over the last 30 years.

Conclusions

OS for pediatric patients with metastatic solid tumors has significantly improved since the 1970s. However, outcome has changed little for some malignancies in the last 20–30 years. These data underscore the importance of continued collaboration and studies to improve outcome for these patients.

8.

Background

Elderly patients with end-stage renal disease have become the fastest growing population of kidney transplant candidates in recent years. However, the risk factors associated with long-term outcomes in these patients remain unclear.

Methods

We retrospectively analyzed 166 recipients aged 60 years or older who underwent primary deceased-donor kidney transplantation between 2002 and 2013 in our center. The main outcomes included 1-, 3- and 5-year patient survival as well as overall and death-censored graft survival. The independent risk factors affecting graft and patient survival were analyzed using Cox regression analysis.

Results

The 1-, 3-, 5-year death-censored graft survival rates were 93.6%, 89.4% and 83.6%, respectively. Based on the Cox multivariate analysis, panel reactive antibody (PRA)>5% [hazard ratio (HR) 4.295, 95% confidence interval (CI) 1.321–13.97], delayed graft function (HR 4.744, 95% CI 1.611–13.973) and acute rejection (HR 4.971, 95% CI 1.516–16.301) were independent risk factors for graft failure. The 1-, 3-, 5-year patient survival rates were 84.8%, 82.1% and 77.1%, respectively. Longer dialysis time (HR 1.011 for 1-month increase, 95% CI 1.002–1.020), graft loss (HR 3.501, 95% CI 1.559–7.865) and low-dose ganciclovir prophylaxis (1.5 g/d for 3 months) (HR 3.173, 95% CI 1.063–9.473) were risk factors associated with patient death.

Conclusions

The five-year results show excellent graft and patient survival in elderly kidney transplant recipients aged ≥60 years. PRA>5%, delayed graft function, and acute rejection are risk factors for graft failure, while longer duration of dialysis, graft loss and low-dose ganciclovir prophylaxis are risk factors for mortality in elderly recipients. These factors represent potential targets for interventions aimed at improving graft and patient survival in elderly recipients.

9.

Background/Aims

Monitoring of serum ferritin levels is widely recommended in the management of anemia among patients on dialysis. However, associations between serum ferritin and mortality are unclear and there have been no investigations among patients undergoing peritoneal dialysis (PD).

Methods

Baseline data of 191,902 patients on dialysis (age, 65 ± 13 years; male, 61.1%; median dialysis duration, 62 months) were extracted from a nationwide dialysis registry in Japan at the end of 2007. Outcomes, such as one-year mortality, were then evaluated using the registry at the end of 2008.

Results

Within one year, a total of 15,284 (8.0%) patients had died, including 6,210 (3.2%) from cardiovascular and 2,707 (1.4%) from infection-related causes. Higher baseline serum ferritin levels were associated with higher mortality rates among patients undergoing hemodialysis (HD). In contrast, there were no clear associations between serum ferritin levels and mortality among PD patients. Multivariate Cox regression analysis of HD patients showed that those in the highest serum ferritin decile had higher rates of all-cause and cardiovascular mortality than those in the lowest decile (hazard ratio [HR], 1.54; 95% confidence interval [CI], 1.31–1.81 and HR, 1.44; 95% CI, 1.13–1.84, respectively), whereas the association with infection-related mortality became non-significant (HR, 1.14; 95% CI, 0.79–1.65).
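
The decile analysis might be set up as in the sketch below: baseline ferritin is split into deciles and a Cox model is fitted with the lowest decile as the reference. Column names are assumptions, and the paper's full multivariable adjustment set is not reproduced.

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("dialysis_registry.csv")             # hypothetical extract
df["ferritin_decile"] = pd.qcut(df["ferritin"], 10, labels=False)   # 0 = lowest decile

# dummy-code deciles with the lowest as reference, plus a few assumed adjusters
design = pd.get_dummies(df["ferritin_decile"], prefix="d", drop_first=True, dtype=int)
design[["age", "male", "dialysis_vintage"]] = df[["age", "male", "dialysis_vintage"]]
design[["time_years", "died"]] = df[["time_years", "died"]]

cph = CoxPHFitter()
cph.fit(design, duration_col="time_years", event_col="died")
cph.print_summary()     # exp(coef) for d_9 approximates the top-vs-bottom decile HR
```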

Conclusions

In this Japanese nationwide dialysis registry, higher serum ferritin values were associated with mortality in HD patients but not in PD patients.

10.

Background

The incidence of acute coronary syndrome (ACS) in young people (≤65 years) is continuously rising. While prognostic factors in ACS are well investigated, less attention has been paid to their age-dependent prognostic value and their particular relevance in younger patients. The aim of our study was to assess the age-dependent prognostic impact of butyrylcholinesterase (BChE).

Methods

This was a retrospective cohort study including 624 patients with ACS. Patients were stratified by age into three equal groups (n = 208 each) corresponding to "young" patients (45–64 years), "middle-aged" patients (65–84 years) and "old" patients (85–100 years). Cox regression analysis was used to assess the influence of BChE on survival.

Results

After a mean follow-up time of 4.0 (interquartile range [IQR] 2.0–6.4) years, 154 patients (24.7%) died due to a cardiac cause. In the overall cohort, BChE was inversely associated with cardiac mortality (adjusted hazard ratio [HR] 0.70, 95% confidence interval [CI] 0.53–0.93, p = 0.01). The primary analysis of BChE by age stratum showed the strongest effect in the 45–64-year group, with an adjusted HR per 1-SD of 0.28 (95% CI 0.12–0.64, p = 0.003), a weaker association with mortality in middle-aged patients (65–84 years: adjusted HR per 1-SD 0.66, 95% CI 0.41–1.06, p = 0.087), and no association in older patients (85–100 years: adjusted HR per 1-SD 0.89, 95% CI 0.58–1.38, p = 0.613).
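
An HR "per 1-SD" means the predictor was standardised before model fitting, so the hazard ratio refers to a one-standard-deviation increase in BChE. A minimal sketch under assumed column names:

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("acs_cohort.csv")                    # hypothetical extract
df["bche_z"] = (df["bche"] - df["bche"].mean()) / df["bche"].std()   # z-score: per 1-SD

for label, grp in df.groupby("age_band"):             # e.g. 45-64, 65-84, 85-100
    cph = CoxPHFitter().fit(grp[["time_years", "cardiac_death", "bche_z"]],
                            duration_col="time_years", event_col="cardiac_death")
    print(label, cph.hazard_ratios_["bche_z"])        # unadjusted HR per 1-SD of BChE
```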

Conclusion

BChE is a strong predictor of cardiac mortality, specifically in younger ACS patients aged between 45 and 64 years. No significant association between BChE and cardiac mortality was detected in the other age classes.

11.

Background

Previous studies have suggested that erectile dysfunction (ED) is an independent risk factor for macrovascular disease. Very few studies have evaluated the relationship between ED and risk of end stage renal disease (ESRD) requiring dialysis.

Methods

A random sample of 1,000,000 individuals from Taiwan's National Health Insurance database was collected. We identified 3,985 patients with newly diagnosed ED between 2000 and 2008 and compared them with a matched cohort of 23,910 patients without ED; subjects and controls were matched by age, diabetes, hypertension, coronary heart disease, hyperlipidemia, area of residence, monthly income and index date. All patients were tracked from the index date to identify those who subsequently required dialysis.

Results

The incidence rates of dialysis in the ED cohort and the comparison group were 10.85 and 9.06 per 10,000 person-years, respectively. Stratified by age, the incidence rate ratio for dialysis was higher in ED patients aged <50 years (3.16, 95% CI 1.62–6.19, p = 0.0008) but not in those aged 50–64 years (0.94, 95% CI 0.52–1.69, p = 0.8397) or ≥65 years (0.69, 95% CI 0.32–1.52, p = 0.3594). After adjustment for patient characteristics and medical comorbidities, the hazard ratio for dialysis remained higher in ED patients aged <50 years (adjusted HR 2.08, 95% CI 1.05–4.11, p<0.05). The log-rank test showed that ED patients aged <50 years had a significantly higher cumulative incidence of dialysis than those without ED (p = 0.0004).
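
The rates above are events per 10,000 person-years, and the incidence rate ratio is simply their quotient. A worked example follows; the event counts and person-years are illustrative values chosen to roughly reproduce the reported overall rates, not the study's data.

```python
def incidence_rate(events, person_years, per=10_000):
    """Events per `per` person-years."""
    return events / person_years * per

rate_ed      = incidence_rate(87, 80_200)     # illustrative counts giving ~10.85
rate_control = incidence_rate(435, 480_000)   # illustrative counts giving ~9.06
print(round(rate_ed, 2), round(rate_control, 2))
print("crude IRR:", round(rate_ed / rate_control, 2))
```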

Conclusion

Patients with ED, especially younger patients, are at an increased risk for ESRD requiring dialysis later in life.

12.

Background

In an accompanying article, we report moderate between-hospital variation in the postdischarge use of β-blockers, angiotensin-modifying drugs and statins by elderly patients who had been admitted to hospital with acute myocardial infarction. Our objective was to identify the characteristics of patients, physicians, hospitals and communities associated with differences in the use of these medications after discharge.

Methods

For this retrospective, population-based cohort study, we used linked administrative databases. We examined data for all patients aged 65 years or older who were discharged from hospital in 2005/06 with a diagnosis of myocardial infarction. We determined the effect of patient, physician, hospital and community characteristics on the rate of postdischarge medication use.

Results

Increasing patient age was associated with lower postdischarge use of medications. The odds ratios (ORs) for a 1-year increase in age were 0.98 (95% confidence interval [CI] 0.97–0.99) for β-blockers, 0.97 (95% CI 0.97–0.98) for angiotensin-converting-enzyme inhibitors and angiotensin-receptor blockers and 0.94 (95% CI 0.93–0.95) for statins. Having a general or family practitioner, a general internist or a physician of another specialty as the attending physician, relative to having a cardiologist, was associated with lower postdischarge use of β-blockers, angiotensin-modifying agents and statins (ORs ranging from 0.46 to 0.82). Having an attending physician with 29 or more years of experience, relative to having a physician who had graduated within the past 15 years, was associated with lower use of β-blockers (OR 0.71, 95% CI 0.60–0.84) and statins (OR 0.81, 95% CI 0.67–0.97).

Interpretation

Patients who received care from noncardiologists and physicians with at least 29 years of experience had substantially lower use of evidence-based drug therapies after discharge. Dissemination strategies should be devised to improve the prescribing of evidence-based medications by these physicians.

The use of medications such as acetylsalicylic acid (ASA), β-blockers, angiotensin-modifying drugs (angiotensin-converting-enzyme [ACE] inhibitors and angiotensin-receptor blockers) and statins is a mainstay of secondary prevention of myocardial infarction. In a companion study published in this issue of CMAJ, we report substantial increases in the use of evidence-based drug therapies after discharge among elderly patients with myocardial infarction over a 14-year period.1 However, despite temporal improvements, the prescribing of evidence-based drug therapies differed among hospitals in 2005.

Studies from the late 1980s to the mid-1990s showed that the prescribing of evidence-based drug therapies was influenced by patient characteristics.2–6 However, the extent to which postdischarge prescribing is influenced by patient, physician, hospital and community characteristics has not been extensively explored.

Our objective was to identify patient, physician, hospital and community characteristics associated with the use of evidence-based drug therapies after discharge among patients with myocardial infarction.

13.

Context

The autopsy rate gradually decreased during 1950–1999 and increased during the most recent decade (2000–2009). The diagnostic inaccuracy rate remained high throughout the 60 years.

Objective

To investigate disagreement between pathological and clinical diagnoses over 60 years (1950–2009).

Data Sources

A 60-year retrospective study was carried out on the 4140 autopsy cases performed at Zhejiang University School of Medicine.

Results

The highest number of cases was 1037 during 1960–1969, while the lowest was 102 during 1990–1999. During the 2000–2009 period, 978 cases were completed, which ranked second within the 60 years. The total clinical misdiagnosis rate was 46.38%, with the highest rate, 73.82%, occurring in 2000–2009. During the 60 years, the diseases associated with the highest diagnostic inaccuracy rates were circulatory diseases (76.97%), cancer (60.99%), and brain diseases (54.48%). Invasive fungal infection was present in 1.84% of the 4140 cases, and the diagnostic inaccuracy rate for this condition reached as high as 86.10%. In the autopsied disease spectrum over the 60 years, the most common diseases were respiratory (1349, 32.58%), circulatory (495, 11.96%), and brain diseases (424, 10.24%).

Conclusion

Although the number of autopsies decreased from 1950 to 1999, it increased from 2000 to 2009, while the discordance rate between clinical and autopsy diagnosis remained high throughout.

14.

Background

Second-line therapy is frequently utilized for metastatic urothelial carcinoma, but there are limited data to guide this approach. While an assessment of overall survival based on registry data may not capture the impact of second- and third-line therapies on clinical outcome, this may be reflected in relative conditional survival (RCS).

Methods

Patients with stage IV urothelial carcinoma diagnosed from 1990–2010 were identified from the Surveillance, Epidemiology and End Results (SEER) dataset. The association of clinicopathologic variables with disease-specific survival (DSS) was explored through univariate and multivariate analyses. DSS in subgroups divided by time period (1990–2000 vs. 2001–2010) was compared using the Kaplan-Meier method and log-rank test. One-year RCS at annual landmarks up to 5 years was compared in subgroups divided by time period.
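
One-year conditional survival at a landmark s is S(s + 1) / S(s), the probability of surviving one further year given survival to the landmark. The sketch below computes it from a Kaplan-Meier fit; the paper's relative conditional survival additionally divides by the expected survival of a matched general population, which is omitted here, and the column names are assumptions.

```python
import pandas as pd
from lifelines import KaplanMeierFitter

df = pd.read_csv("seer_stage_iv_uc.csv")                # hypothetical extract
kmf = KaplanMeierFitter().fit(df["time_years"], df["dss_event"])

for s in range(5):                                       # landmarks at 0..4 years
    s_vals = kmf.survival_function_at_times([s, s + 1]).values
    print(f"1-year conditional survival at year {s}: {s_vals[1] / s_vals[0]:.2f}")
```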

Results

Of 261,987 patients diagnosed with urothelial carcinoma from 1990–2010, 3,110 met the criteria for the current analysis. Characteristics of patients diagnosed from 1990 to 2000 (n = 810) and from 2001 to 2010 (n = 2,300) were similar, and there was no significant difference in DSS between the two groups. On multivariate analysis, older age (≥ 80 years) was associated with shorter DSS (HR 1.79, 95% CI 1.48–2.15), but no association was found between time period of diagnosis and outcome. One-year RCS improved substantially through successive annual landmarks up to 5 years, but no differences were seen in subgroups divided by time of diagnosis.

Conclusions

No difference in RCS was observed between patients with stage IV urothelial carcinoma diagnosed in 1990–2000 and those diagnosed in 2001–2010. The lack of difference in RCS (more so than in cumulative DSS) may reflect a lack of progress in salvage therapies for this disease.

15.

Objectives

The incidence of ischemic stroke has increased and that of hemorrhagic stroke has decreased in urban China; however, the trends in rural areas are unknown. We aimed to explore the secular trends in incidence and transition of stroke subtypes among rural Chinese.

Methods

This was a population-based stroke surveillance study within the Tianjin Brain Study. A total of 14,538 residents in a township of Ji County in Tianjin, China have participated in the study since 1985. We investigated the age-standardized stroke incidence (sex-specific, type-specific, and age-specific), the annual percentage change in stroke incidence, and the proportion of intracerebral hemorrhage in the periods 1992–1998, 1999–2005, and 2006–2012, because neuroimaging has been available in this area since 1992.
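
Direct age standardisation weights the age-specific rates by a fixed standard population so that incidence can be compared across periods despite ageing of the cohort. The sketch below uses made-up counts and assumed standard-population weights, not the study's data.

```python
import pandas as pd

data = pd.DataFrame({
    "age_band":   ["45-64", "65+"],
    "cases":      [30, 45],           # illustrative counts for one period
    "person_yrs": [60_000, 25_000],
    "std_weight": [0.7, 0.3],         # assumed standard-population weights (sum to 1)
})
data["rate"] = data["cases"] / data["person_yrs"] * 100_000          # per 100,000 p-y
age_standardised = (data["rate"] * data["std_weight"]).sum()
print(f"age-standardised incidence: {age_standardised:.1f} per 100,000 person-years")
```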

Results

The age-standardized incidence per 100,000 person-years increased significantly for both intracerebral hemorrhage (37.8 in 1992–1998, 46.5 in 1999–2005, and 76.5 in 2006–2012) and ischemic stroke (83.9 in 1992–1998, 135.3 in 1999–2005, and 238.0 in 2006–2012). The age-standardized incidence of first-ever stroke increased annually by 4.9% for intracerebral hemorrhage and by 7.3% for ischemic stroke. The greatest increase was observed in men aged 45–64 years for both stroke types (P < 0.001). The proportion of intracerebral hemorrhage was stable overall, increased among men aged 45–64 years, and decreased among men aged ≥65 years. The average age at onset of intracerebral hemorrhage in men decreased by 7.5 years from 1992 to 2012.

Conclusion

The age-standardized incidence of the main stroke subtypes increased significantly in rural China over the past 21 years; the overall proportion of intracerebral hemorrhage was stable, but its incidence increased significantly among middle-aged men. These findings imply that controlling stroke risk factors in middle-aged men is crucial for stroke prevention in future decades.

16.

Objectives

The combined effect of disability and comorbidity on mortality is widely perceived as additive in clinical models of frailty.

Design

National data were retrospectively extracted from community hospital medical records.

Data Sources

A total of 12,804 acutely disabled patients admitted for inpatient rehabilitation in Singapore rehabilitation community hospitals from 1996 through 2005 were followed up for death until 31 December 2011.

Outcome Measure

Cox proportional-hazards regression was used to assess the interaction of comorbidity and disability at discharge on all-cause mortality.
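
A minimal sketch of such an interaction model: a Cox regression with main effects for comorbidity and disability plus their product term, so that synergy beyond the main effects can be tested directly. Column names and codings are assumptions.

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("rehab_cohort.csv")                   # hypothetical extract
df["high_comorbidity"] = (df["comorbidity_level"] == "high").astype(int)
df["total_disability"] = (df["disability_level"] == "total").astype(int)
df["comorbidity_x_disability"] = df["high_comorbidity"] * df["total_disability"]

cols = ["time_years", "died", "age", "high_comorbidity",
        "total_disability", "comorbidity_x_disability"]
cph = CoxPHFitter().fit(df[cols], duration_col="time_years", event_col="died")
cph.print_summary()    # the product term tests for interaction beyond the main effects
```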

Results

During a median follow-up of 10.9 years, there were 8,565 deaths (66.9%). The mean age was 73.0 (standard deviation: 11.5) years. Independent risk factors for mortality were higher comorbidity (p<0.001), severity of disability at discharge (p<0.001), being widowed (adjusted hazard ratio [aHR] 1.38, 95% confidence interval [CI] 1.25–1.53), low socioeconomic status (aHR 1.40, 95% CI 1.29–1.53), discharge to a nursing home (aHR 1.14, 95% CI 1.05–1.22) and re-admission into acute care (aHR 1.54, 95% CI 1.45–1.65). In the main-effects model, those with high comorbidity had an aHR of 2.41 (95% CI 2.13–2.72), whereas those with total disability had an aHR of 2.28 (95% CI 2.12–2.46). In the interaction model, a synergistic interaction existed between comorbidity and disability (p<0.001): those with both high comorbidity and total disability had a much higher aHR of 6.57 (95% CI 5.15–8.37).

Conclusions

Patients with greater comorbidity and disability at discharge, discharge to a nursing home or re-admission into acute care, lower socioeconomic status, or widowhood had a higher mortality risk. Our results identified predictive variables of mortality that map well onto the frailty cascade model. Increasing comorbidity and disability interacted synergistically to increase mortality risk.

17.

Introduction

Dialysis-requiring acute kidney injury is a severe illness associated with poor prognosis. However, information pertaining to incidence rates and the prevalence of risk factors remains limited in spite of increasing focus. We evaluated time trends in incidence rates and changing patterns in the prevalence of comorbidities, concurrent medication, and other risk factors in a nationwide retrospective cohort study.

Materials and Methods

All patients with dialysis-requiring acute kidney injury were identified between January 1st 2000 and December 31st 2012. By cross-referencing data from national administrative registries, the association of changing patterns in dialysis treatment, comorbidity, concurrent medication and demographics with incidence of dialysis-requiring acute kidney injury was evaluated.

Results

A total of 18,561 adult patients with dialysis-requiring AKI were identified between 2000 and 2012. The crude incidence rate of dialysis-requiring AKI increased from 143 per million (95% confidence interval, 137–144) in 2000 to 366 per million (357–375) in 2006, and remained stable thereafter. Notably, the incidence of continuous renal replacement therapy (CRRT) and the use of acute renal replacement therapy in elderly patients >75 years increased substantially, from 23 per million (20–26) and 328 per million (300–355) in 2000 to 213 per million (206–220) and 1124 per million (1076–1172) in 2012, respectively. Simultaneously, patient characteristics and demographics shifted towards increased age and comorbidity.

Conclusions

Although growth in the crude incidence rate of dialysis-requiring AKI stabilized in 2006, continued growth in the use of CRRT and in acute renal replacement therapy for elderly patients >75 years was observed. Our results indicate an underlying shift in clinical paradigm rather than unabated growth in the incidence of dialysis-requiring AKI.

18.

Background

Surveillance in patients with previous polypectomy was underused in the Medicare population in 1994. This study investigates whether expansion of Medicare reimbursement for colonoscopy screening in high-risk individuals has reduced this underuse of surveillance.

Methods

We used Kaplan-Meier analysis to estimate time to surveillance and polyp recurrence rates for Medicare beneficiaries with a colonoscopy with polypectomy between 1998 and 2003 who were followed through 2008 for receipt of surveillance colonoscopy. Generalized Estimating Equations were used to estimate risk factors for: 1) failing to undergo surveillance and 2) polyp recurrence among these individuals. Analyses were stratified into three 2-year cohorts based on baseline colonoscopy date.

Results

Medicare beneficiaries undergoing a colonoscopy with polypectomy in the 1998–1999 (n = 4,136), 2000–2001 (n = 3,538) and 2002–2003 (n = 4,655) cohorts had respective probabilities of 30%, 26% and 20% (p<0.001) of subsequent surveillance events within 3 years. At the same time, 58%, 52% and 45% (p<0.001) of beneficiaries received a surveillance event within 5 years. Polyp recurrence rates after 5 years were 36%, 30% and 26% (p<0.001) respectively. Older age (≥ 70 years), female gender, later cohort (2000–2001 & 2002–2003), and severe comorbidity were the most important risk factors for failure to undergo a surveillance event. Male gender and early cohort (1998–1999) were the most important risk factors for polyp recurrence.

Conclusions

Expansion of Medicare reimbursement for colonoscopy screening in high-risk individuals has not reduced underutilization of surveillance in the Medicare population. It is important to take action now to improve this situation, because polyp recurrence is substantial in this population.

19.

Background

The impact of dialysis modality on survival is still somewhat controversial. Given possible differences in patients’ characteristics and the cause and rate of death in different countries, the issue needs to be evaluated in Korean cohorts.

Methods

A nationwide prospective observational cohort study (NCT00931970) was performed to compare survival between peritoneal dialysis (PD) and hemodialysis (HD). A total of 1,060 end-stage renal disease patients in Korea who began dialysis between September 1, 2008 and June 30, 2011 were followed through December 31, 2011.

Results

The patients (PD, 30.6%; HD, 69.4%) were followed up for 16.3±7.9 months. PD patients were significantly younger and less likely to be diabetic, and had a lower body mass index and a larger urine volume than HD patients. Infection was the most common cause of death. Multivariate Cox regression with the entire cohort revealed that PD tended to be associated with a lower risk of death compared to HD [hazard ratio (HR) 0.63, 95% confidence interval (CI) 0.36–1.08]. In propensity score matched pairs (n = 278 in each modality), cumulative survival probabilities for PD and HD patients were 96.9% and 94.1% at 12 months (P = 0.152) and 94.3% and 87.6% at 24 months (P = 0.022), respectively. Patients on PD had a 51% lower risk of death compared to those on HD (HR 0.49, 95% CI 0.25–0.97).

Conclusions

PD was associated with superior survival compared with HD in the early period of dialysis, even after adjusting for differences in patient characteristics between the two modalities. Notably, the most common cause of death in this Korean cohort was infection.

20.

Background

Older patients who experience a fragility fracture are at high risk of future fractures but are rarely tested or treated for osteoporosis. We developed a multifaceted intervention directed at older patients with wrist fractures (in the form of telephone-based education) and their physicians (in the form of guidelines endorsed by opinion leaders, supported by reminders) to improve the quality of osteoporosis care.

Methods

In a randomized controlled trial with blinded ascertainment of outcomes, we compared our intervention with usual care (provision of printed educational materials to patients). Eligible patients were those older than 50 years of age who had experienced a wrist fracture and were seen in emergency departments and fracture clinics; we excluded those who were already being treated for osteoporosis. The primary outcome was bisphosphonate treatment within 6 months after the fracture. Secondary outcomes included bone mineral density testing, “appropriate care” (consisting of bone mineral density testing with treatment if bone mass was low) and quality of life.

Results

We screened 795 patients for eligibility and randomly assigned 272 to the intervention (137 patients) or control (135 patients) group. The median age was 60 years; 210 (77%) of the subjects were women, and 130 (48%) reported a previous fracture as an adult. Six months after the fracture, 30 (22%) of the intervention patients, as compared with 10 (7%) of the control patients, were receiving bisphosphonate therapy for osteoporosis (adjusted relative risk [RR] 2.6, 95% confidence interval [CI] 1.3–5.1, p = 0.008). Intervention patients were more likely than control patients to undergo bone mineral density testing (71/137 [52%] v. 24/135 [18%]; adjusted RR 2.8, 95% CI 1.9–4.2, p < 0.001) and to receive appropriate care (52/137 [38%] v. 15/135 [11%]; adjusted RR 3.1, 95% CI 1.8–5.3, p < 0.001). There were no differences between the groups in other outcomes. One patient died, and 4 others experienced recurrent fracture.

Interpretation

A multifaceted intervention directed at high-risk patients and their physicians substantially increased rates of testing and treatment for osteoporosis. Nevertheless, more than half of the patients in the intervention group were not receiving appropriate care 6 months after their fracture, which suggests that additional strategies should be explored. (ClinicalTrials.gov trial register no. NCT00152321.)

Osteoporosis is a common, chronic and costly condition affecting at least 25% of women and 12% of men over 50 years of age.1–3 Without better prevention strategies, the incidence of and costs related to osteoporotic fractures are expected to increase by 50% over the next 2 decades.3 Case-finding and secondary prevention (e.g., by identifying patients who have experienced a fragility fracture, ensuring that their bone mineral density is tested and offering efficacious osteoporosis treatments to those with low bone mass) constitute the most cost-effective strategy for reducing future fractures.4–6

An obvious target group for case-finding consists of older patients who experience a wrist fracture. Wrist fracture is the most common symptomatic fracture related to osteoporosis; its occurrence is a powerful forecaster of future fractures, and these fractures typically occur 10–20 years before the more devastating osteoporosis-related fractures of the spine or the hip.7 Unfortunately, although most older patients with wrist fractures have low bone mass and are eligible for treatment,4,7 less than about 10% to 20% are tested or treated for osteoporosis in the 6 to 12 months after a wrist fracture.4–9

We previously reported a nonrandomized study of an intervention that incorporated patient education, physician reminders and treatment guidelines endorsed by opinion leaders, to improve osteoporosis treatment in patients with wrist fractures; in that study, which involved 102 patients, the rate of treatment was 40% in the intervention group but only 10% in the group receiving usual care.7 Several concerns were raised about the internal and external validity of that small study, so we conducted a randomized controlled trial of the intervention, which is reported here.
