Similar Documents (20 results)
1.

Objective

To study the association between long-term statin use and the risk of low-energy hip fractures in middle-aged and elderly women.

Design

A register-based cohort study.

Setting

Finland.

Participants

Women aged 45–75 years initiating statin therapy between 1996 and 2001 with adherence to statins ≥80% during the subsequent five years (n = 40 254), a respective cohort initiating hypertension drugs (n = 41 610), and women randomly selected from the population (n = 62 585).

Main Outcome Measures

Incidence rate of and hazard ratio (HR) for low-energy hip fracture during the follow-up extending up to 7 years after the 5-year exposure period.

Results

Altogether 199 low-energy hip fractures occurred during the 135 330 person-years (py) of follow-up in the statin cohort, giving an incidence rate of 1.5 hip fractures per 1000 py. In the hypertension and the population cohorts, the rates were 2.0 per 1000 py (312 fractures per 157 090 py) and 1.0 per 1000 py (212 fractures per 216 329 py), respectively. Adjusting for a propensity score and individual variables strongly predicting the outcome, good adherence to statins for five years was associated with a 29% decreased risk (HR 0.71; 95% CI 0.58–0.86) of a low-energy hip fracture in comparison with adherent use of hypertension drugs. The association was of the same magnitude when comparing the statin users with the population cohort, the HR being 0.69 (0.55–0.87). When women with poor (<40%), moderate (40 to 80%), and good adherence (≥80%) to statins were compared to those with good adherence to hypertension drugs (≥80%) or to the population cohort, the protective effect associated with statin use attenuated with the decreasing level of adherence.
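The incidence rates above follow directly from events divided by person-years. A quick sketch of the arithmetic, using the figures reported in the abstract:

```python
# Incidence rate per 1000 person-years = events / person-years * 1000.
# All counts below are those reported in the abstract.
def rate_per_1000(events, person_years):
    return events / person_years * 1000

statin = rate_per_1000(199, 135_330)        # statin cohort
hypertension = rate_per_1000(312, 157_090)  # hypertension-drug cohort
population = rate_per_1000(212, 216_329)    # population cohort

print(round(statin, 1), round(hypertension, 1), round(population, 1))
# → 1.5 2.0 1.0, matching the reported rates
```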

Conclusions

Five-year exposure to statins is associated with a reduced risk of low-energy hip fracture in women aged 50–80 years without prior hospitalizations for fractures.

2.

Background

Whether HbA1c is a predictor of end-stage renal disease (ESRD) in type 2 diabetes patients remains unclear. This study evaluated the relationship between HbA1c and ESRD in Chinese patients with type 2 diabetes.

Methods

Patients aged ≥ 30 years who were free of ESRD (n = 51 681) were included from the National Diabetes Care Management Program (2002–2003). An extended Cox proportional hazards model accounting for the competing risk of death was used to evaluate the association between HbA1c level and ESRD.

Results

A total of 2613 (5.06%) people developed ESRD during a follow-up period of 8.1 years. The overall incidence rate of ESRD was 6.26 per 1000 person-years. The incidence rate rose with HbA1c level, from 4.29 per 1000 person-years for HbA1c of 6.0%–6.9% to 10.33 per 1000 person-years for HbA1c ≥ 10.0%. Notably, patients with HbA1c < 6.0% had a slightly higher ESRD incidence (4.34 per 1000 person-years) than those with HbA1c of 6.0%–6.9%. A J-shaped relationship between HbA1c level and ESRD risk was thus observed. After adjustment, patients with HbA1c < 6.0% and ≥ 10.0% exhibited an increased risk of ESRD (HR: 1.99, 95% CI: 1.62–2.44 and HR: 4.42, 95% CI: 3.80–5.14, respectively) compared with those with HbA1c of 6.0%–6.9%.
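The cumulative incidence and the overall rate can be roughly cross-checked from the reported counts. A sketch, assuming every patient contributes the stated mean follow-up of 8.1 years (which only approximates the true person-time, hence the small discrepancy with the reported 6.26):

```python
# Consistency check of the reported figures (illustrative approximation:
# person-time is taken as n * mean follow-up, not the true sum).
events, n, mean_follow_up = 2613, 51_681, 8.1

cumulative_incidence = events / n * 100              # percent developing ESRD
approx_rate = events / (n * mean_follow_up) * 1000   # per 1000 person-years

print(round(cumulative_incidence, 2))  # → 5.06, as reported
print(round(approx_rate, 2))           # ~6.24, close to the reported 6.26
```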

Conclusions

Diabetes care has focused on preventing hyperglycemia, but not hypoglycemia. Our study revealed that an HbA1c level ≥ 7.0% was linked with increased ESRD risk in type 2 diabetes patients, and that HbA1c < 6.0% also had the potential to increase ESRD risk. Our study provides epidemiological evidence that appropriate glycemic control is essential in diabetes care to meet HbA1c targets and improve outcomes without increasing risk in this population. Clinicians should pay attention to HbA1c results when monitoring for diabetic nephropathy.

3.

Background

In left-sided heart failure, a decrease in quality of life (QoL) precedes poor survival, and this decrease can be reversed with exercise training. We investigated whether QoL is associated with mortality in patients with pulmonary arterial hypertension due to congenital heart disease (PAH-CHD).

Methods

In this observational study, PAH-CHD adults referred for PAH-specific therapy were included. QoL surveys (SF36) were recorded during 2 years of therapy. Based on the shift in SF36 scores during this period, patients were classified as having either decreased or non-decreased QoL. Subsequently, the patients were followed for mortality.

Results

Thirty-nine PAH-CHD patients (mean age 42 years, 44 % male, 49 % Down’s syndrome) were analysed. Following PAH-specific therapy, the SF36 physical component summary (PCS) decreased in 13 patients (mean 35 to 31 points, p = 0.001) and did not decrease in 26 patients (mean 34 to 43 points, p < 0.001). After the initiation phase, median follow-up was 4.5 years, during which 12 deaths occurred (31 %): 10 (56 %) in the decreased and 2 (10 %) in the non-decreased group (p = 0.002). Cox regression showed that a decrease in SF36 PCS predicted mortality (HR 3.4, 95 % CI 1.03–11, p = 0.045).

Conclusions

In PAH-CHD patients, decrease in SF36 PCS following initiation of PAH-specific therapy is a determinant of mortality.

Electronic supplementary material

The online version of this article (doi:10.1007/s12471-015-0666-9) contains supplementary material, which is available to authorized users.

4.

Background

Patients aged ≥65 years are vulnerable to readmissions due to a transient period of generalized risk after hospitalization. However, whether young and middle-aged adults share a similar risk pattern is uncertain. We compared the rate, timing, and readmission diagnoses following hospitalization for heart failure (HF), acute myocardial infarction (AMI), and pneumonia among patients aged 18–64 years with patients aged ≥65 years.

Methods and Findings

We used an all-payer administrative dataset from California consisting of all hospitalizations for HF (n = 206,141), AMI (n = 107,256), and pneumonia (n = 199,620) from 2007–2009. The primary outcomes were unplanned 30-day readmission rate, timing of readmission, and readmission diagnoses. Our findings show that the readmission rate among patients aged 18–64 years exceeded the readmission rate in patients aged ≥65 years in the HF cohort (23.4% vs. 22.0%, p<0.001), but was lower in the AMI (11.2% vs. 17.5%, p<0.001) and pneumonia (14.4% vs. 17.3%, p<0.001) cohorts. When adjusted for sex, race, comorbidities, and payer status, the 30-day readmission risk in patients aged 18–64 years was similar to patients ≥65 years in the HF (HR 0.99; 95%CI 0.97–1.02) and pneumonia (HR 0.97; 95%CI 0.94–1.01) cohorts and was marginally lower in the AMI cohort (HR 0.92; 95%CI 0.87–0.96). For all cohorts, the timing of readmission was similar; readmission risks were highest between days 2 and 5 and declined thereafter across all age groups. Diagnoses other than the index admission diagnosis accounted for a substantial proportion of readmissions among age groups <65 years; a non-cardiac diagnosis represented 39–44% of readmissions in the HF cohort and 37–45% of readmissions in the AMI cohort, while a non-pulmonary diagnosis represented 61–64% of patients in the pneumonia cohort.

Conclusion

When adjusted for differences in patient characteristics, young and middle-aged adults have 30-day readmission rates that are similar to elderly patients for HF, AMI, and pneumonia. A generalized risk after hospitalization is present regardless of age.

5.

Background

In dementia screening, most studies have focused on early cognitive impairment by comparing patients suffering from mild dementia or mild cognitive impairment with normal subjects. Few studies have focused on changes over time in the cognitive function of the healthy elderly. The objective of the present study was to analyze the cognitive function changes between two samples born more than 15 years apart.

Method

A first sample of 204 cognitively normal participants was recruited in the memory clinic of Broca hospital between 1991 and 1997. A second sample of 177 cognitively normal participants was recruited in 2008–2009 in the same institution. Both samples were from the same districts of Paris and were assessed with the same neuropsychological test battery. Mean cognitive test scores were compared between the 1991 and 2008 samples, between participants < 80 years old and ≥ 80 years old within each sample, and finally between participants < 80 years old in the 1991 sample and participants ≥ 80 years old in the 2008 sample. Means were compared with t-tests stratified on gender, age group and educational level.

Results

Cognitive scores were significantly higher in the 2008 sample. Participants < 80 years old outperformed those ≥ 80 in both samples. However, participants < 80 years old in the 1991 sample and participants ≥ 80 in the 2008 sample, born on average in 1923, performed almost identically.

Conclusion

This study showed a significant increase in cognitive scores over time. Further, contemporary octogenarians in the later sample performed like septuagenarians in the former sample. These findings might be consistent with the increase in life expectancy and healthy life span. The study highlights the necessity of taking into account factors which may contaminate and artificially inflate age-related differences in favor of younger over older adults.

6.

Background

The cardiopulmonary exercise test (CPX) is an affordable tool for risk prediction in patients with chronic heart failure (CHF). We aimed to determine the role of CPX parameters in predicting the risk of incidence of sustained ventricular arrhythmias (SVA) in CHF.

Methods

Sixty-one consecutive patients with CHF enrolled in the Daunia Heart Failure Registry underwent CPX and were followed for 327 ± 247 days. Clinical follow-up was performed every month, and earlier in case of re-hospitalisation for cardiac disease. Incidence of SVA was evaluated by direct clinical examination (ECG, ambulatory ECG).

Results

Patients with episodes of SVA (n = 14) showed lower values of pVO2 and PetCO2, and higher values of VE/VCO2, VE/VCO2 slope, and VE%. After correction for age, gender, diabetes, ischaemic heart disease and left ventricular ejection fraction, peak VO2 (hazard ratio (HR) 0.68, 95 % confidence interval (CI) 0.51–0.91, p < 0.05), VE% (HR 1.38, 95 % CI 1.04–1.84, p < 0.05), VE/VCO2 (HR 1.38, 95 % CI 1.04–1.82, p < 0.05), VE/VCO2 slope (HR 1.77, 95 % CI 1.31–2.39, p < 0.01), and PetCO2 (HR 0.66, 95 % CI 0.50–0.88, p < 0.01) were found to be predictors of SVA. On Kaplan-Meier analysis, lower event-free rates were found in subjects with peak VO2 values below the median (log rank p < 0.05), VE/VCO2 values above the mean (p < 0.05), higher VE/VCO2 slope tertiles (p < 0.05), and PetCO2 values below the median (p < 0.05).

Conclusions

CPX provides independent prognostic information on the risk of SVA in subjects with CHF.

7.

Background

In Kenya, detailed data on the age-specific burden of influenza and RSV are essential to inform use of limited vaccination and treatment resources.

Methods

We analyzed surveillance data from August 2009 to July 2012 for hospitalized severe acute respiratory illness (SARI) and outpatient influenza-like illness (ILI) at two health facilities in western Kenya to estimate the burden of influenza and respiratory syncytial virus (RSV). Incidence rates were estimated by dividing the number of cases with laboratory-confirmed virus infections by the mid-year population. Rates were adjusted for healthcare-seeking behavior, and to account for patients who met the SARI/ILI case definitions but were not tested.

Results

The average annual incidence of influenza-associated SARI hospitalization per 1,000 persons was 2.7 (95% CI 1.8–3.9) among children <5 years and 0.3 (95% CI 0.2–0.4) among persons ≥5 years; for RSV-associated SARI hospitalization, it was 5.2 (95% CI 4.0–6.8) among children <5 years and 0.1 (95% CI 0.0–0.2) among persons ≥5 years. The incidence of influenza-associated medically-attended ILI per 1,000 was 24.0 (95% CI 16.6–34.7) among children <5 years and 3.8 (95% CI 2.6–5.7) among persons ≥5 years. The incidence of RSV-associated medically-attended ILI was 24.6 (95% CI 17.0–35.4) among children <5 years and 0.8 (95% CI 0.3–1.9) among persons ≥5 years.

Conclusions

Influenza and RSV both exact an important burden in children. This highlights the possible value of influenza vaccines, and future RSV vaccines, for Kenyan children.

8.

Background

The role of pulmonary hypertension as a cause of mortality in sickle cell disease (SCD) is controversial.

Methods and Results

We evaluated the relationship between an elevated estimated pulmonary artery systolic pressure and mortality in patients with SCD. We followed patients from the walk-PHaSST screening cohort for a median of 29 months. A tricuspid regurgitation velocity (TRV) ≥3.0 m/sec cutoff, which has a 67–75% positive predictive value for a mean pulmonary artery pressure ≥25 mm Hg, was used. Among 572 subjects, 11.2% had TRV≥3.0 m/sec. Among 582 with a measured NT-proBNP, 24.1% had values ≥160 pg/mL. Of 22 deaths during follow-up, 50% had a TRV≥3.0 m/sec. At 24 months the cumulative survival was 83% with TRV≥3.0 m/sec and 98% with TRV<3.0 m/sec (p<0.0001). The hazard ratios for death were 11.1 (95% CI 4.1–30.1; p<0.0001) for TRV≥3.0 m/sec, 4.6 (1.8–11.3; p = 0.001) for NT-proBNP≥160 pg/mL, and 14.9 (5.5–39.9; p<0.0001) for both TRV≥3.0 m/sec and NT-proBNP≥160 pg/mL. Age >47 years, male gender, chronic transfusions, WHO class III–IV, increased hemolytic markers, ferritin and creatinine were also associated with increased risk of death.
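Cumulative survival figures like the 83% vs. 98% at 24 months come from Kaplan-Meier estimates. A minimal product-limit estimator in pure Python, run on a small hypothetical dataset (illustrative only, not the walk-PHaSST data):

```python
# Kaplan-Meier product-limit estimator (sketch; toy data, not study data).
def kaplan_meier(times, events):
    """times: follow-up (e.g. months); events: 1 = death, 0 = censored.
    Returns [(time, survival probability)] at each distinct event time."""
    data = sorted(zip(times, events))
    survival, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        n_at_risk = sum(1 for tt, _ in data if tt >= t)
        if deaths:
            survival *= 1 - deaths / n_at_risk   # product-limit step
            curve.append((t, survival))
        i += sum(1 for tt, _ in data if tt == t)  # skip past this time point
    return curve

# Hypothetical follow-up times (months) and event indicators
times  = [3, 6, 6, 10, 14, 20, 24, 24]
events = [1, 1, 0,  1,  0,  1,  0,  0]
print(kaplan_meier(times, events))
# → [(3, 0.875), (6, 0.75), (10, 0.6), (20, 0.4)]
```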

Conclusions

A TRV≥3.0 m/sec occurs in approximately 10% of individuals and carried the highest risk of death of any measured variable.

The study is registered in ClinicalTrials.gov with identifier

NCT00492531

9.

Background

Wearing an activity monitor as a motivational tool and incorporating a behavior-based reward system or a computerized game element might have a synergistic effect on an increase in daily physical activity, thereby inducing body fat reduction. This pilot crossover study aimed to examine the effects of a short-term lifestyle intervention using an activity monitor with computerized game functions on physical activity and body composition.

Methods

Twenty healthy volunteers (31 ± 3 years) participated in a 12-week crossover study. The participants were randomly assigned to either Group A (a 6-week game intervention followed by a 6-week normal intervention) or Group B (a 6-week normal intervention followed by a 6-week game intervention). The participants wore both a normal activity monitor (Lifecorder EX) and an activity monitor with computerized game functions (Yuuhokei) during the game intervention, whereas they only wore a normal activity monitor during the normal intervention. Before, during, and after the intervention, body composition was assessed.

Results

Significantly more daily steps were recorded for the game intervention than for the normal intervention (10,520 ± 562 versus 8,711 ± 523 steps/day, P < 0.01). The participants performed significantly more physical activity at an intensity of ≥ 3 metabolic equivalents (METs) in the game intervention than in the normal intervention (3.1 ± 0.2 versus 2.4 ± 0.2 METs · hour/day, P < 0.01). Although body mass and fat were significantly reduced in both periods (P < 0.01), the difference in body fat reduction was significantly greater in the game intervention than in the normal intervention (P < 0.05).

Conclusions

A short-term intervention using an activity monitor with computerized game functions increases physical activity and reduces body fat more effectively than an intervention using a standard activity monitor.

10.

Background

It is unclear how geographic and social diversity affects the prevalence of chronic obstructive pulmonary disease (COPD). We sought to characterize the prevalence of COPD and identify risk factors across four settings in Peru with varying degrees of urbanization, altitude, and biomass fuel use.

Methods

We collected sociodemographics, clinical history, and post-bronchodilator spirometry in a randomly selected, age-, sex- and site-stratified, population-based sample of 2,957 adults aged ≥35 years (median age was 54.8 years and 49.3% were men) from four resource-poor settings: Lima, Tumbes, urban and rural Puno. We defined COPD as a post-bronchodilator FEV1/FVC < 70%.

Results

Overall prevalence of COPD was 6.0% (95% CI 5.1%–6.8%) but with marked variation across sites: 3.6% in semi-urban Tumbes, 6.1% in urban Puno, 6.2% in Lima, and 9.9% in rural Puno (p < 0.001). Population attributable risks (PARs) of COPD due to smoking ≥10 pack-years were less than 10% for all sites, consistent with a low prevalence of daily smoking (3.3%). Rather, we found that PARs of COPD varied by setting. In Lima, for example, the highest PARs were attributed to post-treatment tuberculosis (16% and 22% for men and women, respectively). In rural Puno, daily biomass fuel for cooking among women was associated with COPD (prevalence ratio 2.22, 95% CI 1.02–4.81) and the PAR of COPD due to daily exposure to biomass fuel smoke was 55%.
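The population attributable risks quoted here are conventionally computed with Levin's formula, PAR = p(RR − 1) / (1 + p(RR − 1)), where p is the exposure prevalence and RR the relative risk. A sketch using the rural Puno figures: the prevalence ratio of 2.22 is from the abstract, but the near-universal daily biomass exposure among rural women (p ≈ 1.0) is an illustrative assumption, not a figure reported above.

```python
def levin_par(prevalence, rr):
    """Levin's population attributable risk: p(RR-1) / (1 + p(RR-1))."""
    x = prevalence * (rr - 1)
    return x / (1 + x)

# Rural Puno women: prevalence ratio 2.22 (from the abstract);
# exposure prevalence 1.0 is an assumption for illustration.
par = levin_par(1.0, 2.22)
print(f"{par:.0%}")  # → 55%, consistent with the reported PAR
```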

Conclusions

The burden of COPD in Peru was not uniform and, unlike other settings, was not predominantly explained by tobacco smoking. This study emphasizes the role of biomass fuel use, and highlights pulmonary tuberculosis as an often neglected risk factor in endemic areas.

11.

Introduction

The aim of this study was to investigate the influence of symptom duration on treatment response and on the correlation between improvements in patient reported outcomes (PRO) and objective inflammation in patients with axial spondylarthritis (SpA) treated with etanercept (ETA) or adalimumab (ADA).

Methods

Data from 112 patients with axial SpA originally enrolled in two randomized controlled clinical trials were pooled and analyzed after one year of treatment with ETA (n = 66) or ADA (n = 46). Patients with <4 years and ≥4 years of disease were compared for improvement in Bath Ankylosing Spondylitis Disease Activity Index (BASDAI), Bath Ankylosing Spondylitis Functional Index (BASFI), Ankylosing Spondylitis Disease Activity Score (ASDAS), C-reactive protein (CRP) and magnetic resonance imaging (MRI) score for sacroiliac joints (SIJ).

Results

Patients with <4 years of disease showed significantly better improvement than patients with longer disease duration in BASDAI (3.2 (95% confidence interval (CI): 2.7 to 3.7) vs. 1.7 (1.1 to 2.2)), BASFI, BASMI and ASDAS (1.6 (1.4 to 1.8) vs. 0.9 (0.7 to 1.1)). The change in BASDAI showed a significant correlation with the change in SIJ score (Spearman’s rank correlation coefficient (rho) = 0.37, P = 0.01) and the change in CRP (rho = 0.45, P = 0.001) in patients with <4 years of disease. For patients with longer disease duration, these correlations were poor and did not achieve statistical significance (rho = 0.13, P = 0.46 and rho = 0.22, P = 0.13, respectively).
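Spearman's rho, used above, is simply the Pearson correlation computed on ranks. A minimal pure-Python version, run on toy data (the variable names echo the abstract's measures, but the values are invented for illustration):

```python
def spearman_rho(x, y):
    """Spearman rank correlation (ties get the average rank)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        i = 0
        while i < len(order):
            j = i
            while j + 1 < len(order) and v[order[j + 1]] == v[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1          # average 1-based rank over the tie run
            for k in range(i, j + 1):
                r[order[k]] = avg
            i = j + 1
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Toy data: change in BASDAI vs. change in MRI SIJ score (illustrative only)
d_basdai = [3.1, 2.0, 4.2, 1.1, 3.8, 0.5]
d_sij    = [5.0, 2.5, 6.0, 1.0, 4.0, 2.0]
print(round(spearman_rho(d_basdai, d_sij), 2))  # → 0.89
```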

Conclusion

The low correlation between change in PROs and change in objective signs of inflammation seen in axial SpA patients with longer symptom duration treated with a tumor necrosis factor blocker suggests that inflammation is not the only cause of these patients’ symptoms, whereas inflammation seems to be the major cause in patients with shorter symptom duration.

Trial registration

ClinicalTrials.gov NCT00844142 (Trial 1); NCT00235105 (Trial 2)

12.

Background

The combination of aclidinium bromide, a long-acting anticholinergic, and formoterol fumarate, a long-acting beta2-agonist (400/12 μg twice daily) achieves improvements in lung function greater than either monotherapy in patients with chronic obstructive pulmonary disease (COPD), and is approved in the European Union as a maintenance treatment. The effect of this combination on symptoms of COPD and exacerbations is less well established. We examined these outcomes in a pre-specified analysis of pooled data from two 24-week, double-blind, parallel-group, active- and placebo-controlled, multicentre, randomised Phase III studies (ACLIFORM and AUGMENT).

Methods

Patients ≥40 years with moderate to severe COPD (post-bronchodilator forced expiratory volume in 1 s [FEV1]/forced vital capacity <70 % and FEV1 ≥30 % but <80 % predicted normal) were randomised (ACLIFORM: 2:2:2:2:1; AUGMENT: 1:1:1:1:1) to twice-daily aclidinium/formoterol 400/12 μg or 400/6 μg, aclidinium 400 μg, formoterol 12 μg or placebo via Genuair™/Pressair®. Dyspnoea (Transition Dyspnoea Index; TDI), daily symptoms (EXAcerbations of Chronic pulmonary disease Tool [EXACT]-Respiratory Symptoms [E-RS] questionnaire), night-time and early-morning symptoms, exacerbations (Healthcare Resource Utilisation [HCRU] and EXACT definitions) and relief-medication use were assessed.

Results

The pooled intent-to-treat population included 3394 patients. Aclidinium/formoterol 400/12 μg significantly improved TDI focal score versus placebo and both monotherapies at Week 24 (all p < 0.05). Over 24 weeks, significant improvements in E-RS total score, overall night-time and early-morning symptom severity and limitation of early-morning activities were observed with aclidinium/formoterol 400/12 μg versus placebo and both monotherapies (all p < 0.05). The rate of moderate or severe HCRU exacerbations was significantly reduced with aclidinium/formoterol 400/12 μg compared with placebo (p < 0.05) but not monotherapies; the rate of EXACT-defined exacerbations was significantly reduced with aclidinium/formoterol 400/12 μg versus placebo (p < 0.01) and aclidinium (p < 0.05). Time to first HCRU or EXACT exacerbation was longer with aclidinium/formoterol 400/12 μg compared with placebo (all p < 0.05) but not the monotherapies. Relief-medication use was reduced with aclidinium/formoterol 400/12 μg versus placebo and aclidinium (p < 0.01).

Conclusions

Aclidinium/formoterol 400/12 μg significantly improves 24-hour symptom control compared with placebo, aclidinium and formoterol in patients with moderate to severe COPD. Furthermore, aclidinium/formoterol 400/12 μg reduces the frequency of exacerbations compared with placebo.

Trial registration

NCT01462942 and NCT01437397 (ClinicalTrials.gov)

Electronic supplementary material

The online version of this article (doi:10.1186/s12931-015-0250-2) contains supplementary material, which is available to authorized users.

13.

Background

Elderly patients with end-stage renal disease have become the fastest growing population of kidney transplant candidates in recent years. However, the risk factors associated with long-term outcomes in these patients remain unclear.

Methods

We retrospectively analyzed 166 recipients aged 60 years or older who underwent primary deceased kidney transplantation between 2002 and 2013 in our center. The main outcomes included 1-, 3- and 5-year patient survival as well as overall and death-censored graft survival. The independent risk factors affecting graft and patient survival were analyzed using Cox regression analysis.

Results

The 1-, 3-, 5-year death-censored graft survival rates were 93.6%, 89.4% and 83.6%, respectively. Based on the Cox multivariate analysis, panel reactive antibody (PRA) >5% [hazard ratio (HR) 4.295, 95% confidence interval (CI) 1.321–13.97], delayed graft function (HR 4.744, 95% CI 1.611–13.973) and acute rejection (HR 4.971, 95% CI 1.516–16.301) were independent risk factors for graft failure. The 1-, 3-, 5-year patient survival rates were 84.8%, 82.1% and 77.1%, respectively. Longer dialysis time (HR 1.011 per 1-month increase, 95% CI 1.002–1.020), graft loss (HR 3.501, 95% CI 1.559–7.865) and low-dose ganciclovir prophylaxis (1.5 g/d for 3 months) (HR 3.173, 95% CI 1.063–9.473) were risk factors associated with patient death.

Conclusions

The five-year results show excellent graft and patient survival in elderly kidney transplant recipients aged ≥60 years. PRA >5%, delayed graft function, and acute rejection are risk factors for graft failure, while longer duration of dialysis, graft loss and low-dose ganciclovir prophylaxis are risk factors for mortality in elderly recipients. These factors represent potential targets for interventions aimed at improving graft and patient survival in this population.

14.

Background

Patients eligible for cardiac resynchronisation therapy (CRT) have an indication for primary prophylactic implantable cardioverter defibrillator (ICD) therapy. However, response to CRT might influence processes involved in arrhythmogenesis and therefore change the necessity of ICD therapy in certain patients.

Method

In 202 CRT-defibrillator patients, the association between baseline variables, 6-month echocardiographic outcome (volume response: left ventricular end-systolic volume decrease <15 % vs. ≥15 %, and left ventricular ejection fraction (LVEF) ≤35 % vs. >35 %) and the risk of first appropriate ICD therapy was analysed retrospectively.

Results

Fifty (25 %) patients received appropriate ICD therapy during a median follow-up of 37 (23–52) months. At baseline, ischaemic cardiomyopathy (hazard ratio (HR) 2.0, p = 0.019) and a B-type natriuretic peptide level > 163 pmol/l (HR 3.8, p < 0.001) were significantly associated with the risk of appropriate ICD therapy. After 6 months, 105 (52 %) patients showed volume response and 51 (25 %) reached an LVEF > 35 %. Three (6 %) patients with an LVEF > 35 % received appropriate ICD therapy following echocardiography at ± 6 months, compared with 43 patients (29 %) with an LVEF ≤ 35 % (p = 0.001). LVEF post-CRT was more strongly associated with the risk of ventricular arrhythmias than volume response (LVEF > 35 %, HR 0.23, p = 0.020).

Conclusion

Assessing the necessity of an ICD in patients eligible for CRT remains a challenge. Six months post-CRT an LVEF > 35 % identified patients at low risk of ventricular arrhythmias. LVEF might be used at the time of generator replacement to identify patients suitable for downgrading to a CRT-pacemaker.

15.

Background

It is important to gain insight into opportunities for secondary prevention of cardiovascular disease. Our aim was to investigate levels and trends in cardiovascular risk factors and drug treatment in Dutch post-myocardial infarction (MI) patients between 2002 and 2006 and to make comparisons with the EUROASPIRE surveys (1999–2007).

Methods

We analysed data from 4837 post-MI patients (aged 69 years, 78% men) from 32 Dutch hospitals, using baseline cross-sectional data from the Alpha Omega Trial.

Results

Between 2002 and 2006, significant declines were found in the prevalence of smoking (23% to 16%, p < 0.001), hypercholesterolaemia (≥5 mmol/l; 54% to 27%, p < 0.0001) and hypertension (≥140/90 mmHg; 58% to 48%, p < 0.001). The prevalence of antithrombotic drugs was high (97%). The prevalence of lipid-modifying drugs and antihypertensives was high, and increased (74% to 90%, p < 0.0001 and 82% to 93%, p < 0.001, respectively). The prevalence of obesity (27%) was high in 2002 and decreased to 24% in 2006, albeit not significantly. Diabetes prevalence was high and increased between 2002 and 2006 (18% to 22%, p = 0.02). In comparison with EUROASPIRE patients, who were on average 8–10 years younger, our study in 2006 included patients with lower levels of obesity, hypertension, hypercholesterolaemia, diabetes and lower use of antiplatelets and β-blockers, but similar levels of lipid-modifying drugs.

Conclusions

This study showed that older Dutch post-MI patients were adequately treated with drugs, and that risk factors reached lower levels than in the younger EUROASPIRE patients. However, there is room for improvement in diet and lifestyle, given the high prevalence of smoking, obesity, and diabetes.

Electronic supplementary material

The online version of this article (doi:10.1007/s12471-012-0248-z) contains supplementary material, which is available to authorized users.

16.

Background

Liver stiffness measurement (LSM) by transient elastography (TE, FibroScan) is a validated method for noninvasively staging liver fibrosis. Most hepatic complications occur in patients with advanced fibrosis. Our objective was to determine the ability of LSM by TE to predict hepatic complications and mortality in a large cohort of patients with chronic liver disease.

Methods

In consecutive adults who underwent LSM by TE between July 2008 and June 2011, we used Cox regression to determine the independent association between liver stiffness and death or hepatic complications (decompensation, hepatocellular carcinoma, and liver transplantation). The performance of LSM to predict complications was determined using the c-statistic.

Results

Among 2,052 patients (median age 51 years, 65% with hepatitis B or C), 87 patients (4.2%) died or developed a hepatic complication during a median follow-up period of 15.6 months (interquartile range, 11.0–23.5 months). Patients with complications had higher median liver stiffness than those without complications (13.5 vs. 6.0 kPa; P<0.00005). The 2-year incidence rates of death or hepatic complications were 2.6%, 9%, 19%, and 34% in patients with liver stiffness <10, 10–19.9, 20–39.9, and ≥40 kPa, respectively (P<0.00005). After adjustment for potential confounders, liver stiffness by TE was an independent predictor of complications (hazard ratio [HR] 1.05 per kPa; 95% confidence interval [CI] 1.03–1.06). The c-statistic of liver stiffness for predicting complications was 0.80 (95% CI 0.75–0.85). A liver stiffness below 20 kPa effectively excluded complications (specificity 93%, negative predictive value 97%); however, the positive predictive value of higher results was sub-optimal (20%).
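The c-statistic reported above is the probability that a randomly chosen patient who developed a complication had higher liver stiffness than a randomly chosen patient who did not (equivalently, the area under the ROC curve). A minimal computation on toy stiffness values (illustrative only, not the study data):

```python
def c_statistic(cases, controls):
    """Probability that a random case scores higher than a random control,
    with ties counted as one half — equivalent to the AUROC."""
    wins = sum(
        1.0 if c > k else 0.5 if c == k else 0.0
        for c in cases
        for k in controls
    )
    return wins / (len(cases) * len(controls))

# Toy liver-stiffness values (kPa): patients with vs. without complications
with_complication    = [13.5, 25.0, 40.0, 9.0]
without_complication = [6.0, 5.5, 10.0, 7.2, 4.8]
print(c_statistic(with_complication, without_complication))  # → 0.95
```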

Conclusions

Liver stiffness by TE accurately predicts the risk of death or hepatic complications in patients with chronic liver disease. TE may facilitate the estimation of prognosis and guide management of these patients.

17.

Purpose

To retrospectively assess, for ureteroscopy (URS) planning, the clinical utility of cumulative stone diameter (CSD), which does not account for stone width or depth, as a predictor of URS outcome, and to compare it with stone volume.

Materials and Methods

Patients with renal stones treated at a single institute by flexible URS were retrospectively evaluated. To assess the clinical utility of CSD, relationships between stone-free (SF) status and stone burden (CSD and volume) were analyzed using the area under the receiver operating characteristic curve (AUROC). To identify the impact of stone number on CSD, the AUROC of CSD divided by stone number was evaluated. Correlation coefficients of CSD and stone volume were also calculated for groups by stone number.

Results

In cases with CSD <20.0 mm, CSD and stone volume revealed equal ability to predict SF status. In cases with CSD ≥20.0 mm, stone volume showed higher predictive ability. The ROC curves for cases with ≥4 stones showed that CSD was less predictive of SF status than stone volume. The correlation coefficients of CSD and stone volume by stone number were 0.922 for 1 stone, 0.900 for 2–3 stones, and 0.661 for ≥4 stones.

Conclusions

In cases with CSD ≥20.0 mm or ≥4 stones, stone volume should be evaluated as the more predictive measure of stone burden, and pretreatment non-contrast CT seems sufficient for this purpose. In cases with CSD <20.0 mm or 1–3 stones, CSD was as valid a predictor of preoperative stone burden as stone volume, so preoperative kidney-ureter-bladder (KUB) films may be sufficient.

18.

Introduction

In clinical practice, nonsteroidal anti-inflammatory drugs (NSAIDs) are commonly discontinued after response to biologic therapy is achieved in patients with axial spondyloarthritis (axSpA), but the impact of NSAID discontinuation has not been assessed in prospective controlled trials. The aim of the SPARSE study was to evaluate the effects of the anti-tumor necrosis factor agent etanercept on NSAID intake and conventional clinical outcomes in axSpA patients.

Methods

In the double-blind, placebo-controlled period, patients with active (mini Bath Ankylosing Spondylitis Disease Activity Index (BASDAI) ≥4) axSpA despite optimal NSAID intake were randomized to receive etanercept 50 mg or placebo once weekly for 8 weeks. All patients were advised to taper/discontinue their NSAID intake during the treatment period. NSAID intake was self-reported by diary, and Assessment of SpondyloArthritis International Society (ASAS)-NSAID scores were calculated based on ASAS recommendations. The primary endpoint was the change from baseline to week 8 in ASAS-NSAID score (analysis of covariance).

Results

Among the 90 randomized patients, baseline mean (standard deviation) age was 38.9 (11.8) years; disease duration, 5.7 (8.1) years; 59/90 (66%) were human leukocyte antigen-B27 positive; 51/90 (57%) had radiographic sacroiliitis; and 45/90 (50%) were positive for sacroiliitis on magnetic resonance imaging. Mean ASAS-NSAID scores at baseline were similar between the etanercept and placebo groups (98.2 (39.0) versus 93.0 (23.4)), as were BASDAI (6.0 (1.7) versus 5.9 (1.5)) and Bath Ankylosing Spondylitis Functional Index scores (5.2 (2.1) versus 5.1 (2.2)). Mean changes (SE) in ASAS-NSAID score from baseline to week 8 were –63.9 (6.1) and –36.6 (5.9) in the etanercept and placebo groups, respectively (between-group difference, –27.3; P = 0.002). Significantly higher proportions of patients receiving etanercept versus placebo had an ASAS-NSAID score <10 (46% versus 17%; P = 0.008) and an ASAS-NSAID score of 0 (41% versus 14%; P = 0.013) at this time point. Significantly more patients in the etanercept group than the placebo group achieved BASDAI50 (39% versus 18%; P = 0.032) and ASAS40 (44% versus 21%; P = 0.028) at week 8.

Conclusions

In patients with axSpA, etanercept was associated with clinically relevant NSAID-sparing effects in addition to significant improvements in conventional clinical outcomes.

Trial registration

ClinicalTrials.gov NCT01298531. Registered 16 February 2011.

Electronic supplementary material

The online version of this article (doi:10.1186/s13075-014-0481-5) contains supplementary material, which is available to authorized users.

19.

Introduction

Pain in osteoarthritis (OA) has been classically attributed to joint structural damage. The disparity between the degree of radiographic structural damage and the severity of symptoms implies that factors other than the joint pathology itself contribute to the pain. Peripheral and central sensitization have been suggested as two of the underlying mechanisms that contribute to pain in OA. The aim of this study was to explore, in symptomatic knee OA patients, the structural changes assessed by magnetic resonance imaging (MRI) that could be used as markers of neuropathic pain (NP).

Methods

This cross-sectional observational pilot study included 50 knee OA patients with moderate to severe pain (VAS ≥40) in the target knee. The presence of NP was determined based on the PainDETECT questionnaire. Among the 50 patients included, 25 had PainDETECT score ≤12 (unlikely NP), 9 had PainDETECT score between 13 and 18 (uncertain NP) and 16 had PainDETECT score ≥19 (likely NP). WOMAC, PainDETECT, and VAS pain scores as well as knee MRI were assessed.

Results

Data showed no significant difference in demographic characteristics between the three groups. However, a positive and statistically significant association was found between PainDETECT scores and the WOMAC pain (P <0.001), function (P <0.001), stiffness (P = 0.007), and total (P <0.001) scores, as well as the VAS pain score (P = 0.023). Although no difference was found in cartilage volume between groups, the presence of meniscal extrusion in both the medial (P = 0.006) and lateral (P = 0.023) compartments, and the presence of meniscal tears in the lateral compartment (P = 0.011), were significantly associated with increasing PainDETECT scores. Moreover, the presence of bone marrow lesions in the lateral plateau and the extent of synovial membrane thickness in the lateral recess were associated with increasing PainDETECT scores (P = 0.032 and P = 0.027, respectively).

Conclusions

In this study, meniscal lesions, particularly extrusion, were found to be among the strongest risk factors for NP in knee OA patients.

Trial registration

ClinicalTrials.gov NCT01733277. Registered 16 November 2012.
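The three-way stratification used in this study maps directly onto the PainDETECT cutoffs reported in the abstract (≤12 unlikely NP, 13–18 uncertain NP, ≥19 likely NP). A minimal sketch of that grouping rule:

```python
def neuropathic_pain_group(paindetect_score):
    """Classify a PainDETECT score using the cutoffs from the abstract:
    <=12 unlikely NP, 13-18 uncertain NP, >=19 likely NP."""
    if paindetect_score <= 12:
        return "unlikely"
    if paindetect_score <= 18:
        return "uncertain"
    return "likely"

# The study's 50 patients split 25 / 9 / 16 across these three groups.
print(neuropathic_pain_group(12), neuropathic_pain_group(15), neuropathic_pain_group(19))
```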

20.

Background

Evidence for the efficacy of clopidogrel in the chronic kidney disease (CKD) population with acute coronary syndrome (ACS) is inconclusive. Furthermore, CKD patients are prone to bleeding with antiplatelet therapy. We investigated the efficacy and safety of clopidogrel in patients with ACS and CKD.

Methods

In a Taiwan nationwide registry, 2819 ACS patients were enrolled. CKD was defined as an estimated glomerular filtration rate of less than 60 ml/min per 1.73 m2. The primary endpoint was the composite of death, non-fatal myocardial infarction, and stroke at 12 months.

Results

Overall, 949 (33.7%) patients had CKD and 2660 (94.36%) patients received clopidogrel treatment. CKD was associated with an increased risk of the primary endpoint at 12 months (HR 2.39, 95% CI 1.82–3.15, p<0.01). Clopidogrel use was associated with a reduced risk of the primary endpoint at 12 months (HR 0.42, 95% CI 0.29–0.60, p<0.01). Cox regression analysis showed that clopidogrel reduced death and the primary endpoint in the CKD population (HR 0.35, 95% CI 0.21–0.61 and HR 0.48, 95% CI 0.30–0.77, respectively; both p<0.01). Patients with clopidogrel(−)/CKD(−), clopidogrel(+)/CKD(+), and clopidogrel(−)/CKD(+) had 2.4-, 3.0-, and 10.4-fold risks of the primary endpoint compared with those receiving clopidogrel treatment without CKD (all p<0.01). Clopidogrel treatment was not associated with increased in-hospital Thrombolysis In Myocardial Infarction (TIMI) bleeding in the CKD population.

Conclusion

Clopidogrel could decrease mortality and improve cardiovascular outcomes without increasing the risk of bleeding in ACS patients with CKD.
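The study's four-way comparison is built from two binary variables defined in the abstract: CKD status (estimated GFR below 60 ml/min per 1.73 m2) and clopidogrel use. A minimal sketch of that grouping, with invented patient values for illustration:

```python
def has_ckd(egfr):
    # Study definition: CKD = estimated GFR < 60 ml/min per 1.73 m^2.
    return egfr < 60

def exposure_group(egfr, on_clopidogrel):
    """Label a patient for the abstract's four-way comparison,
    e.g. 'clopidogrel(+)/CKD(-)' (the reference group)."""
    c = "+" if on_clopidogrel else "-"
    k = "+" if has_ckd(egfr) else "-"
    return f"clopidogrel({c})/CKD({k})"

# Hypothetical patients as (eGFR, clopidogrel use) pairs:
patients = [(75, True), (45, True), (45, False)]
print([exposure_group(e, c) for e, c in patients])
# ['clopidogrel(+)/CKD(-)', 'clopidogrel(+)/CKD(+)', 'clopidogrel(-)/CKD(+)']
```

Relative to the clopidogrel(+)/CKD(−) reference group, the abstract reports 2.4-, 3.0-, and 10.4-fold risks for the other combinations, with the untreated CKD group faring worst.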
