Similar Articles
 20 similar articles found (search time: 15 ms)
1.
Wald NJ, Simmonds M, Morris JK. PLoS ONE. 2011;6(5):e18742

Background

Risk factors such as blood pressure and serum cholesterol are used, with age, in screening for future cardiovascular disease (CVD) events. The value of using these risk factors with age compared with using age alone is not known. We compared screening for future CVD events using age alone with screening using age and multiple risk factors based on regular Framingham risk assessments.

Methods

Ten-year CVD risk was estimated using Framingham risk equations in a hypothetical sample population of 500,000 people aged 0–89 years. Risk estimates were used to identify individuals who did and did not have a CVD event over a ten-year period. For screening using age alone (age screening) and screening using multiple risk factors and age (Framingham screening), we estimated the (i) detection rate (sensitivity); (ii) false-positive rate; (iii) proportion of CVD-free years of life lost in affected individuals with positive results (person-years detection rate); and (iv) cost per CVD-free life year gained from preventive treatment.

Results

Age screening using a cut-off of 55 years detected 86% of all first CVD events arising in the population every year and 72% of CVD-free years of life lost, for a 24% false-positive rate; for five-yearly Framingham screening the false-positive rate was 21% at the same 86% detection rate. The estimated cost per CVD-free year of life gained was £2,000 for age screening and £2,200 for Framingham screening, assuming a Framingham screen costs £150 and the annual cost of preventive treatment is £200.

Conclusion

Age screening for future CVD events is simpler than Framingham screening, with similar screening performance and cost-effectiveness. It avoids blood tests and medical examinations. The advantages of age screening in the prevention of heart attack and stroke warrant considering its use in preference to multiple risk factor screening.

2.

Background

Leisure time physical activity reduces the risk of premature mortality, but the years of life expectancy gained at different levels remains unclear. Our objective was to determine the years of life gained after age 40 associated with various levels of physical activity, both overall and according to body mass index (BMI) groups, in a large pooled analysis.

Methods and Findings

We examined the association of leisure time physical activity with mortality during follow-up in pooled data from six prospective cohort studies in the National Cancer Institute Cohort Consortium, comprising 654,827 individuals, 21–90 y of age. Physical activity was categorized by metabolic equivalent hours per week (MET-h/wk). Life expectancies and years of life gained/lost were calculated using direct adjusted survival curves (for participants 40+ years of age), with 95% confidence intervals (CIs) derived by bootstrap. The study includes a median 10 y of follow-up and 82,465 deaths. A physical activity level of 0.1–3.74 MET-h/wk, equivalent to brisk walking for up to 75 min/wk, was associated with a gain of 1.8 (95% CI: 1.6–2.0) y in life expectancy relative to no leisure time activity (0 MET-h/wk). Higher levels of physical activity were associated with greater gains in life expectancy, with a gain of 4.5 (95% CI: 4.3–4.7) y at the highest level (22.5+ MET-h/wk, equivalent to brisk walking for 450+ min/wk). Substantial gains were also observed in each BMI group. In joint analyses, being active (7.5+ MET-h/wk) and normal weight (BMI 18.5–24.9) was associated with a gain of 7.2 (95% CI: 6.5–7.9) y of life compared to being inactive (0 MET-h/wk) and obese (BMI 35.0+). A limitation was that physical activity and BMI were ascertained by self report.

Conclusions

More leisure time physical activity was associated with longer life expectancy across a range of activity levels and BMI groups. Please see later in the article for the Editors' Summary.

3.

Background

Point-of-care CD4 tests at HIV diagnosis could improve linkage to care in resource-limited settings. Our objective is to evaluate the clinical and economic impact of point-of-care CD4 tests compared to laboratory-based tests in Mozambique.

Methods and Findings

We use a validated model of HIV testing, linkage, and treatment (CEPAC-International) to examine two strategies of immunological staging in Mozambique: (1) laboratory-based CD4 testing (LAB-CD4) and (2) point-of-care CD4 testing (POC-CD4). Model outcomes include 5-y survival, life expectancy, lifetime costs, and incremental cost-effectiveness ratios (ICERs). Input parameters include linkage to care (LAB-CD4, 34%; POC-CD4, 61%), probability of correctly detecting antiretroviral therapy (ART) eligibility (sensitivity: LAB-CD4, 100%; POC-CD4, 90%) or ART ineligibility (specificity: LAB-CD4, 100%; POC-CD4, 85%), and test cost (LAB-CD4, US$10; POC-CD4, US$24). In sensitivity analyses, we vary POC-CD4-specific parameters, as well as cohort and setting parameters to reflect a range of scenarios in sub-Saharan Africa. We consider ICERs less than three times the per capita gross domestic product in Mozambique (US$570) to be cost-effective, and ICERs less than one times the per capita gross domestic product in Mozambique to be very cost-effective. Projected 5-y survival in HIV-infected persons with LAB-CD4 is 60.9% (95% CI, 60.9%–61.0%), increasing to 65.0% (95% CI, 64.9%–65.1%) with POC-CD4. Discounted life expectancy and per person lifetime costs with LAB-CD4 are 9.6 y (95% CI, 9.6–9.6 y) and US$2,440 (95% CI, US$2,440–US$2,450) and increase with POC-CD4 to 10.3 y (95% CI, 10.3–10.3 y) and US$2,800 (95% CI, US$2,790–US$2,800); the ICER of POC-CD4 compared to LAB-CD4 is US$500/year of life saved (YLS) (95% CI, US$480–US$520/YLS). POC-CD4 improves clinical outcomes and remains near the very cost-effective threshold in sensitivity analyses, even if point-of-care CD4 tests have lower sensitivity/specificity and higher cost than published values. In other resource-limited settings with fewer opportunities to access care, POC-CD4 has a greater impact on clinical outcomes and remains cost-effective compared to LAB-CD4. 
Limitations of the analysis include the uncertainty around input parameters, which is examined in sensitivity analyses. The potential added benefits due to decreased transmission are excluded; their inclusion would likely further increase the value of POC-CD4 compared to LAB-CD4.
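As a back-of-the-envelope check, the headline ICER can be reproduced from the point estimates quoted in this abstract. This is a sketch only; the CEPAC-International model itself simulates full patient trajectories, and the numbers below are the rounded published values:

```python
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per unit of extra effect."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Discounted per-person lifetime cost (US$) and life expectancy (years)
lab = {"cost": 2440, "le": 9.6}   # laboratory-based CD4 (LAB-CD4)
poc = {"cost": 2800, "le": 10.3}  # point-of-care CD4 (POC-CD4)

ratio = icer(poc["cost"], lab["cost"], poc["le"], lab["le"])
print(round(ratio))  # ≈ 514 US$/year of life saved; the abstract rounds to US$500
```

The published 95% CI (US$480–US$520/YLS) reflects uncertainty propagated through the full model, not this single point calculation.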

Conclusions

POC-CD4 at the time of HIV diagnosis could improve survival and be cost-effective compared to LAB-CD4 in Mozambique, if it improves linkage to care. POC-CD4 could have the greatest impact on mortality in settings where resources for HIV testing and linkage are most limited. Please see later in the article for the Editors' Summary.

4.

Background

Data on mortality among homeless people are limited. Therefore, this study aimed to describe mortality patterns within a cohort of homeless adults in Rotterdam (the Netherlands) and to assess excess mortality as compared to the general population in that city.

Methods

Based on 10-year follow-up of homeless adults aged ≥ 20 years who visited services for homeless people in Rotterdam in 2001, and on vital statistics, we assessed the association of mortality with age, sex and type of service used (e.g. only day care, convalescence care, other) within the homeless cohort, and also compared mortality between the homeless and general population using Poisson regression. Life tables and decomposition methods were used to examine differences in life expectancy.

Results

During follow-up, 265 of the 2096 homeless adults died. Among the homeless, no significant sex differences were found in overall mortality rates or life expectancy at age 30 years. Compared with the general Rotterdam population, mortality rates were 3.5 times higher in the homeless cohort. Excess mortality was larger in women (rate ratio [RR] 5.56, 95% CI 3.95–7.82) than in men (RR 3.31, 95% CI 2.91–3.77), and decreased with age (RR 7.67, 95% CI 6.87–8.56 for the age group 20–44 years and RR 1.63, 95% CI 1.41–1.88 for the age group 60+ years). Life expectancy at age 30 years was 11.0 (95% CI 9.1–12.9) years lower for homeless men and 15.9 (95% CI 10.3–21.5) years lower for homeless women, compared to men and women in the general population respectively.

Conclusion

Homeless adults face excessive losses in life expectancy, with the greatest disadvantages among homeless women and the younger age groups.

5.
PLoS ONE. 2013;8(12)

Background

Combination antiretroviral therapy (ART) has significantly increased survival among HIV-positive adults in the United States (U.S.) and Canada, but gains in life expectancy for this region have not been well characterized. We aim to estimate temporal changes in life expectancy among HIV-positive adults on ART from 2000–2007 in the U.S. and Canada.

Methods

Participants were from the North American AIDS Cohort Collaboration on Research and Design (NA-ACCORD), aged ≥20 years and on ART. Mortality rates were calculated using participants' person-time from January 1, 2000 or ART initiation until death, loss to follow-up, or administrative censoring on December 31, 2007. Life expectancy at age 20, defined as the average number of additional years that a person of a specific age will live provided the current age-specific mortality rates remain constant, was estimated using abridged life tables.
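An abridged life table converts a schedule of age-band mortality rates into a life expectancy by chaining band survival probabilities. The sketch below assumes a constant hazard within each band; all rates are invented for illustration and are not NA-ACCORD estimates:

```python
import math

def life_expectancy(bands):
    """bands: list of (width_years, mortality_rate); the last band is open-ended."""
    e, surv = 0.0, 1.0
    for width, rate in bands[:-1]:
        p = math.exp(-rate * width)        # P(survive the band | alive at entry)
        e += surv * (1 - p) / rate         # person-years lived within the band
        surv *= p
    e += surv / bands[-1][1]               # open-ended band: remaining LE = 1/rate
    return e

# Ten 5-year bands covering ages 20-69 at 5 deaths/1,000 person-years,
# then an open 70+ band at 50 deaths/1,000 person-years (hypothetical rates).
bands = [(5, 0.005)] * 10 + [(None, 0.05)]
print(round(life_expectancy(bands), 1))  # → 59.8 additional years at age 20
```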

Results

Among 22,937 individuals contributing 82,022 person-years, there were 1,622 deaths, for a crude mortality rate of 19.8/1,000 person-years. Life expectancy increased from 36.1 [standard error (SE) 0.5] to 51.4 [SE 0.5] years from 2000–2002 to 2006–2007. Men and women had comparable life expectancies in all periods except the last (2006–2007). Life expectancy was lower for individuals with a history of injection drug use, non-whites, and patients with baseline CD4 counts <350 cells/mm3.

Conclusions

A 20-year-old HIV-positive adult on ART in the U.S. or Canada is expected to live into their early 70s, a life expectancy approaching that of the general population. Differences by sex, race, HIV transmission risk group, and CD4 count remain.

6.

Background

There has been substantial research on psychosocial and health care determinants of health disparities in the United States (US) but less on the role of modifiable risk factors. We estimated the effects of smoking, high blood pressure, elevated blood glucose, and adiposity on national life expectancy and on disparities in life expectancy and disease-specific mortality among eight subgroups of the US population (the “Eight Americas”) defined on the basis of race and the location and socioeconomic characteristics of county of residence, in 2005.

Methods and Findings

We combined data from the National Health and Nutrition Examination Survey and the Behavioral Risk Factor Surveillance System to estimate unbiased risk factor levels for the Eight Americas. We used data from the National Center for Health Statistics to estimate age–sex–disease-specific number of deaths in 2005. We used systematic reviews and meta-analyses of epidemiologic studies to obtain risk factor effect sizes for disease-specific mortality. We used epidemiologic methods for multiple risk factors to estimate the effects of current exposure to these risk factors on death rates, and life table methods to estimate effects on life expectancy. Asians had the lowest mean body mass index, fasting plasma glucose, and smoking; whites had the lowest systolic blood pressure (SBP). SBP was highest in blacks, especially in the rural South—5–7 mmHg higher than whites. The other three risk factors were highest in Western Native Americans, Southern low-income rural blacks, and/or low-income whites in Appalachia and the Mississippi Valley. Nationally, these four risk factors reduced life expectancy at birth in 2005 by an estimated 4.9 y in men and 4.1 y in women. Life expectancy effects were smallest in Asians (M, 4.1 y; F, 3.6 y) and largest in Southern rural blacks (M, 6.7 y; F, 5.7 y). Standard deviation of life expectancies in the Eight Americas would decline by 0.50 y (18%) in men and 0.45 y (21%) in women if these risks had been reduced to optimal levels. Disparities in the probabilities of dying from cardiovascular diseases and diabetes at different ages would decline by 69%–80%; the corresponding reduction for probabilities of dying from cancers would be 29%–50%. Individually, smoking and high blood pressure had the largest effect on life expectancy disparities.
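The building block of such a comparative risk assessment is the population attributable fraction (PAF): the share of deaths that would be avoided if exposure were removed. A minimal sketch for one dichotomous risk factor, with made-up inputs:

```python
def paf(prevalence, relative_risk):
    """Levin's formula: PAF = p(RR - 1) / (p(RR - 1) + 1)."""
    excess = prevalence * (relative_risk - 1)
    return excess / (excess + 1)

# Hypothetical: 20% exposure prevalence, RR = 2.5 for a given cause of death
print(round(paf(0.20, 2.5), 3))  # → 0.231, i.e. ~23% of those deaths attributable
```

Applying cause-specific PAFs to observed death counts, and feeding the reduced rates into a life table, yields the life expectancy effects reported above.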

Conclusions

Disparities in smoking, blood pressure, blood glucose, and adiposity explain a significant proportion of disparities in mortality from cardiovascular diseases and cancers, and some of the life expectancy disparities in the US. Please see later in the article for the Editors' Summary.

7.

Introduction

To compare statin initiation and treatment non-adherence following a first acute myocardial infarction (MI) between patients with inflammatory rheumatic disease (IRD) and the general population.

Methods

We conducted a retrospective cohort study using a population-based linked database. Cases of first MI from July 2001 to June 2009 were identified based on International Classification of Diseases (ICD-10-AM) codes. Statin initiation and adherence were identified from pharmaceutical claims records. Logistic regression was used to assess the odds of statin initiation by IRD status. Non-adherence was assessed as the time to first treatment gap using a Cox proportional hazards model.

Results

There were 18,518 individuals with an index MI over the period who survived longer than 30 days, of whom 415 (2.2%) were IRD patients. The adjusted odds of receiving a statin were significantly lower for IRD patients (OR = 0.69, 95% CI: 0.55 to 0.86) compared to the general population. No association between IRD status and statin non-adherence was identified (hazard ratio [HR] = 1.12, 95% CI: 0.82 to 1.52).

Conclusions

Statin initiation was significantly lower for people with IRD conditions compared to the general population. Once initiated on statins, the proportion of IRD patients who adhered to treatment was similar to the general population. Given the burden of cardiovascular disease and excess mortality in IRD patients, encouraging the use of evidence-based therapies is critical for ensuring the best outcomes in this high risk group.

Electronic supplementary material

The online version of this article (doi:10.1186/s13075-014-0443-y) contains supplementary material, which is available to authorized users.

8.

Background

Prostate cancer (PCa) is the most common non-skin cancer among men in developed countries. Several novel treatments have been adopted by healthcare systems to manage PCa. Most observational studies and randomized trials on PCa have concurrently evaluated only a few treatments over short follow-up. Further, previous decision analytic models of PCa management have not evaluated the various contemporary management options. A contemporary decision analytic model was therefore needed to address these limitations by synthesizing the evidence on novel treatments and forecasting short- and long-term clinical outcomes.

Objectives

To develop and validate a Markov Monte Carlo model for the contemporary clinical management of PCa, and to assess the clinical burden of the disease from diagnosis to end-of-life.

Methods

A Markov Monte Carlo model was developed to simulate the management of PCa in men 65 years and older from diagnosis to end-of-life. The health states modeled were: risk at diagnosis, active surveillance, active treatment, PCa recurrence, PCa recurrence-free, metastatic castration-resistant prostate cancer, and overall and PCa-specific death. Treatment trajectories were based on state transition probabilities derived from the literature. Validation and sensitivity analyses assessed the accuracy and robustness of model-predicted outcomes.
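The mechanics of a Markov Monte Carlo (individual-level) simulation can be sketched as follows. The states and annual transition probabilities below are invented for illustration and are not the study's calibrated values:

```python
import random

# Hypothetical annual transition probabilities; each row sums to 1.
P = {
    "recurrence-free": {"recurrence-free": 0.93, "recurrence": 0.04, "dead": 0.03},
    "recurrence":      {"recurrence": 0.85, "dead": 0.15},
}

def simulate_one(rng, start="recurrence-free", max_years=60):
    """Walk one individual through the model; return years lived (capped)."""
    state, years = start, 0
    while state != "dead" and years < max_years:
        r, cum = rng.random(), 0.0
        for nxt, p in P[state].items():
            cum += p
            if r < cum:
                state = nxt
                break
        years += 1
    return years

rng = random.Random(1)
cohort = [simulate_one(rng) for _ in range(20_000)]
print(round(sum(cohort) / len(cohort), 1))  # mean years alive across the cohort
```

Repeating the cohort run many times is one way such models produce the confidence intervals attached to their outputs.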

Results

Validation indicated that model-predicted rates were comparable to observed rates in the published literature. The simulated distribution of clinical outcomes for the base case was consistent with sensitivity analyses. Predicted rates of clinical outcomes and mortality varied across risk groups. Life expectancy and health-adjusted life expectancy predicted for the simulated cohort were 20.9 years (95% CI 20.5–21.3) and 18.2 years (95% CI 17.9–18.5), respectively.

Conclusion

Study findings indicated that contemporary management strategies improved survival and quality of life in patients with PCa. This model could be used to compare the long-term outcomes and life expectancy conferred by different PCa management paradigms.

9.

Background

Deceased donor kidneys for transplantation are in most countries allocated preferentially to recipients who have limited co-morbidities. Little is known about the incremental health and economic gain from transplanting those with co-morbidities compared to remaining on dialysis. The aim of our study is to estimate the average and incremental survival benefits and health care costs of listing and transplantation compared to dialysis among individuals with varying co-morbidities.

Methods

A probabilistic Markov model was constructed, using current outcomes for patients with defined co-morbidities treated with either dialysis or transplantation, to compare the health and economic benefits of listing and transplantation with dialysis.

Findings

Using the current waiting time for deceased donor transplantation, transplanting a potential recipient, with or without co-morbidities, achieves survival gains of between 6 months and more than three life years compared to remaining on dialysis, with an average incremental cost-effectiveness ratio (ICER) of less than $50,000 per life year saved (LYS), even among those of advanced age. Age at listing and the waiting time for transplantation are the most influential variables in the model. If there were an unlimited supply of organs and no waiting time, transplanting the younger and healthier individuals would save the most life years and be cost-saving, whereas transplanting middle-aged to older patients would still achieve substantial incremental gains in life expectancy compared to remaining on dialysis.

Conclusions

Our modelled analyses suggest that transplanting the younger and healthier individuals with end-stage kidney disease maximises survival gains and saves money. Listing and transplanting those with considerable co-morbidities is also cost-effective and achieves substantial survival gains compared with the dialysis alternative. Preferentially excluding the older and sicker individuals cannot be justified on utilitarian grounds.

10.
Li Y, Liu Y, Fu L, Mei C, Dai B. PLoS ONE. 2012;7(4):e34450

Background

A few studies of statin therapy as a specific prophylactic measure against contrast-induced nephropathy (CIN) have been published, with conflicting results. In this meta-analysis of randomized controlled trials, we aimed to assess the effectiveness of short-term high-dose statin treatment for the prevention of CIN and related clinical outcomes, and to re-evaluate the potential benefits of statin therapy.

Methods

We searched the PubMed, OVID, EMBASE, Web of Science, and Cochrane Central Register of Controlled Trials databases for randomized controlled trials comparing short-term high-dose statin treatment versus low-dose statin treatment or placebo for preventing CIN. Our outcome measures were the risk of CIN within 2–5 days after contrast administration and the need for dialysis.

Results

Seven randomized controlled trials with a total of 1,399 patients were identified and analyzed. The overall results, based on a fixed-effect model, showed that short-term high-dose statin treatment was associated with a significant reduction in the risk of CIN (RR = 0.51, 95% CI 0.34–0.76, p = 0.001; I2 = 0%). The incidence of acute renal failure requiring dialysis was not significantly different with statin use (RR = 0.33, 95% CI 0.05–2.10, p = 0.24; I2 = 0%). Statin use was not associated with a significant decrease in plasma C-reactive protein level (SMD −0.64, 95% CI: −1.57 to 0.29, P = 0.18, I2 = 97%).
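A fixed-effect estimate of this kind comes from standard inverse-variance pooling on the log-RR scale. A sketch with hypothetical per-trial inputs (not the seven trials analyzed here):

```python
import math

def pooled_rr(trials):
    """trials: list of (rr, se_of_log_rr); returns pooled RR with 95% CI."""
    weights = [1 / se ** 2 for _, se in trials]
    log_pool = sum(w * math.log(rr) for (rr, _), w in zip(trials, weights)) / sum(weights)
    se_pool = 1 / math.sqrt(sum(weights))
    lo = math.exp(log_pool - 1.96 * se_pool)
    hi = math.exp(log_pool + 1.96 * se_pool)
    return math.exp(log_pool), lo, hi

# Hypothetical trials: (risk ratio, standard error of the log risk ratio)
trials = [(0.45, 0.40), (0.60, 0.35), (0.50, 0.50)]
rr, lo, hi = pooled_rr(trials)
print(f"pooled RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # pooled RR 0.52 (95% CI 0.33-0.83)
```

Larger trials (smaller standard errors) receive proportionally more weight, which is the defining property of the fixed-effect approach.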

Conclusions

Although this meta-analysis supports the use of statins to reduce the incidence of CIN, the results must be considered in the context of variable patient demographics. Only a limited recommendation can be made in favour of statin use based on current data. Given the limitations of the included studies, a large, well-designed trial that incorporates the evaluation of clinically relevant outcomes in participants with different underlying risks of CIN is required to more adequately assess the role of statins in CIN prevention.

11.

Background

Indian guidelines recommend routine referral for HIV testing of all tuberculosis (TB) patients in the nine states with the highest HIV prevalence, and selective referral for testing elsewhere. We assessed the clinical impact and cost-effectiveness of alternative HIV testing referral strategies among TB patients in India.

Methods and Findings

We utilized a computer model of HIV and TB disease to project outcomes for patients with active TB in India. We compared life expectancy, cost, and cost-effectiveness for three HIV testing referral strategies: 1) selective referral for HIV testing of those with increased HIV risk, 2) routine referral of patients in the nine highest HIV prevalence states with selective referral elsewhere (current standard), and 3) routine referral of all patients for HIV testing. TB-related data were from the World Health Organization. HIV prevalence among TB patients was 9.0% in the highest prevalence states, 2.9% in the other states, and 4.9% overall. The selective referral strategy, beginning from age 33.50 years, had a projected discounted life expectancy of 16.88 years and a mean lifetime HIV/TB treatment cost of US$100. The current standard increased mean life expectancy to 16.90 years with additional per-person cost of US$10; the incremental cost-effectiveness ratio was US$650/year of life saved (YLS) compared to selective referral. Routine referral of all patients for HIV testing increased life expectancy to 16.91 years, with an incremental cost-effectiveness ratio of US$730/YLS compared to the current standard. For HIV-infected patients cured of TB, receiving antiretroviral therapy increased survival from 4.71 to 13.87 years. Results were most sensitive to the HIV prevalence and the cost of second-line antiretroviral therapy.

Conclusions

Referral of all patients with active TB in India for HIV testing will be both effective and cost-effective. While effective implementation of this strategy would require investment, routine, voluntary HIV testing of TB patients in India should be recommended.

12.

Background

A combination of clinical and routine laboratory data with biomarkers reflecting different pathophysiological pathways may help to refine risk stratification in heart failure (HF). A novel calculator (BCN Bio-HF calculator) was developed incorporating N-terminal pro B-type natriuretic peptide (NT-proBNP, a marker of myocardial stretch), high-sensitivity cardiac troponin T (hs-cTnT, a marker of myocyte injury), and high-sensitivity soluble ST2 (ST2, reflective of myocardial fibrosis and remodeling).

Methods

Model performance was evaluated using discrimination, calibration, and reclassification tools for 1-, 2-, and 3-year mortality. Ten-fold cross-validation with 1,000 bootstrap replicates was used.

Results

The BCN Bio-HF calculator was derived from 864 consecutive outpatients (72% men; mean age 68.2±12 years; 73%/27% New York Heart Association (NYHA) class I–II/III–IV; left ventricular ejection fraction 36%; ischemic etiology 52.2%) followed for a median of 3.4 years (305 deaths). After an initial evaluation of 23 variables, eight independent models were developed. The variables included in these models were age, sex, NYHA functional class, left ventricular ejection fraction, serum sodium, estimated glomerular filtration rate, hemoglobin, loop diuretic dose, β-blocker, angiotensin-converting enzyme inhibitor/angiotensin II receptor blocker, and statin treatments, and hs-cTnT, ST2, and NT-proBNP levels. The calculator may be run with none, one, two, or all three biomarkers available. The calculated risk of death changed significantly with additive biomarker data. The average C-statistic in cross-validation analysis was 0.79.

Conclusions

A new HF risk calculator incorporating available biomarkers that reflect different pathophysiological pathways allowed better individual prediction of death at 1, 2, and 3 years.

13.

Background

Disease prevention has been claimed to reduce health care costs. However, preventing lethal diseases increases life expectancy and, thereby, indirectly increases the demand for health care. Previous studies have argued that, on balance, preventing diseases that reduce longevity increases health care costs, while preventing non-fatal diseases could lead to health care savings. The objective of this research is to investigate whether disease prevention could result in both increased longevity and lower lifetime health care costs.

Methods

Mortality rates for the Netherlands in 2009 were used to construct cause-deleted life tables. Data originating from the Dutch Costs of Illness study were incorporated in order to estimate lifetime health care costs in the absence of selected disease categories. We took into account that, for most diseases, health care expenditures are concentrated in the last year of life.
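Cause deletion can be sketched by removing a cause's share of each age band's hazard and recomputing life expectancy from the reduced rates. All numbers below are invented, and the function uses closed bands only, so it is a truncated toy rather than a full life table:

```python
import math

def partial_le(rates, width=5):
    """Expected years lived across the given closed age bands (constant hazard)."""
    e, surv = 0.0, 1.0
    for m in rates:
        p = math.exp(-m * width)   # probability of surviving the band
        e += surv * (1 - p) / m    # person-years lived within the band
        surv *= p
    return e

# Hypothetical all-cause rates for eight 5-year bands, doubling per band
all_cause = [0.002 * 2 ** i for i in range(8)]
share = 0.25  # hypothetical fraction of deaths in each band due to one cause
deleted = [m * (1 - share) for m in all_cause]

gain = partial_le(deleted) - partial_le(all_cause)
print(round(gain, 2))  # years gained over the 40-year window if the cause vanished
```

The cost side of the published analysis then attaches age-specific (and last-year-of-life) expenditures to the extra person-years.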

Results

Elimination of diseases that reduce life expectancy considerably increases lifetime health care costs. Neoplasms are exemplary: their elimination would increase both life expectancy and lifetime health care spending by roughly 5% for men and women. Cost savings are incurred when prevention has only a small effect on longevity, as in the case of mental and behavioural disorders. Diseases of the circulatory system stand out because their elimination would increase life expectancy while reducing health care spending.

Conclusion

The stronger the negative impact of a disease on longevity, the higher health care costs would be after elimination. Successful treatment of fatal diseases leaves less room for longevity gains due to effective prevention but more room for health care savings.

14.

Background

The population of Japan has achieved the longest life expectancy in the world. To further improve population health, consistent and comparative evidence on mortality attributable to preventable risk factors is necessary for setting priorities for health policies and programs. Although several past studies have quantified the impact of individual risk factors in Japan, to our knowledge no study has assessed and compared the effects of multiple modifiable risk factors for non-communicable diseases and injuries using a standard framework. We estimated the effects of 16 risk factors on cause-specific deaths and life expectancy in Japan.

Methods and Findings

We obtained data on risk factor exposures from the National Health and Nutrition Survey and epidemiological studies, data on the number of cause-specific deaths from vital records adjusted for ill-defined codes, and data on relative risks from epidemiological studies and meta-analyses. We applied a comparative risk assessment framework to estimate effects of excess risks on deaths and life expectancy at age 40 y. In 2007, tobacco smoking and high blood pressure accounted for 129,000 deaths (95% CI: 115,000–154,000) and 104,000 deaths (95% CI: 86,000–119,000), respectively, followed by physical inactivity (52,000 deaths, 95% CI: 47,000–58,000), high blood glucose (34,000 deaths, 95% CI: 26,000–43,000), high dietary salt intake (34,000 deaths, 95% CI: 27,000–39,000), and alcohol use (31,000 deaths, 95% CI: 28,000–35,000). In recent decades, cancer mortality attributable to tobacco smoking has increased in the elderly, while stroke mortality attributable to high blood pressure has declined. Life expectancy at age 40 y in 2007 would have been extended by 1.4 y for both sexes (men, 95% CI: 1.3–1.6; women, 95% CI: 1.2–1.7) if exposures to multiple cardiovascular risk factors had been reduced to their optimal levels as determined by a theoretical-minimum-risk exposure distribution.

Conclusions

Tobacco smoking and high blood pressure are the two major risk factors for adult mortality from non-communicable diseases and injuries in Japan. There is a large potential population health gain if multiple risk factors are jointly controlled. Please see later in the article for the Editors' Summary.

15.

Introduction

HIV infection is a disease associated with chronic inflammation and immune activation. Antiretroviral therapy reduces inflammation, but not to the levels seen in comparable HIV-negative individuals. The HMG-CoA reductase inhibitors (statins) inhibit several pro-inflammatory processes and suppress immune activation, and are a logical therapy to assess for a possible salutary effect on HIV disease progression and outcomes.

Methods

Eligible patients were those enrolled in the Johns Hopkins HIV Clinical Cohort who achieved virologic suppression within 180 days of starting a new highly active antiretroviral therapy (HAART) regimen after January 1, 1998. Assessment continued until death in patients who maintained virologic suppression, with right-censoring of follow-up time if HIV RNA rose above 500 copies/ml. Cox proportional hazards regression was used to assess statin use as a time-varying covariate, as well as other demographic and clinical factors.

Results

A total of 1538 HIV-infected patients fulfilled eligibility criteria, of whom 238 (15.5%) received a statin while taking HAART. There were 85 deaths (7 in statin users, 78 in non-users). By multivariate Cox regression, statin use was associated with a relative hazard of 0.33 (95% CI: 0.14, 0.76; P = 0.009) after adjusting for CD4, HIV-1 RNA, hemoglobin and cholesterol levels at the start of HAART, age, race, HIV risk group, prior use of ART, year of HAART start, NNRTI vs. PI-based ART, prior AIDS-defining illness, and viral hepatitis coinfection. Malignancy, non-AIDS-defining infection and liver failure were particularly prominent causes of death.

Discussion

Statin use was associated with a significantly lower hazard of dying in these HIV-infected patients who were being effectively treated with HAART, as determined by virologic suppression. Our results suggest the need for confirmation in other observational cohorts and, if confirmed, the need for a clinical trial of statin use in HIV infection.

16.

Background

Guidelines recommend incorporating life expectancy (LE) into clinical decision-making for preventive interventions such as cancer screening. Previous research focused on mortality risk (e.g. 28% at 4 years), which is more difficult for both patients and clinicians to interpret than LE (e.g. 7.3 years). Our objective was to use the Gompertz law of human mortality, which states that mortality risk doubles in a fixed time interval, to transform the Lee mortality index into an LE calculator.

Methods

We examined community-dwelling older adults aged 50 and over enrolled in the nationally representative 1998 wave of the Health and Retirement Study, or HRS (response rate 81%), dividing study respondents into development (n = 11701) and validation (n = 8009) cohorts. In the development cohort, we fit proportional hazards Gompertz survival functions for each of the risk groups defined by the Lee mortality index. We validated our LE estimates by comparing predicted LE with observed survival in the HRS validation cohort and in an external validation cohort from the 2004 wave of the English Longitudinal Study of Ageing, or ELSA (n = 7042).
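Under the Gompertz law the hazard grows exponentially, h(t) = a*exp(b*t), so mortality doubles every ln(2)/b years and remaining LE is the integral of the survival curve S(t) = exp(-(a/b)(exp(bt) - 1)). A numerical sketch with illustrative parameters (not the fitted HRS values):

```python
import math

def gompertz_le(a, b, horizon=60.0, step=0.01):
    """Remaining life expectancy: trapezoidal integral of the survival curve."""
    def S(t):
        return math.exp(-(a / b) * (math.exp(b * t) - 1))
    n = int(horizon / step)
    return sum(0.5 * (S(i * step) + S((i + 1) * step)) * step for i in range(n))

b = math.log(2) / 8  # mortality risk doubling every ~8 years (illustrative)
print(round(gompertz_le(a=0.01, b=b), 1))  # remaining LE in years at baseline hazard a
```

Under the proportional-hazards assumption each risk group scales the baseline a while sharing b, which is how a discrete mortality index can be mapped onto an LE estimate.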

Results

The ELSA cohort had a lower 8-year mortality risk (14%) than our HRS development (23%) and validation (25%) cohorts. Our model had good discrimination in the validation cohorts (Harrell's c = 0.78 in HRS and 0.80 in ELSA). Our predicted LEs were similar to observed survival in the HRS validation cohort, without evidence of miscalibration (Hosmer-Lemeshow, p = 0.2 at 8 years). However, our predicted LEs were longer than observed survival in the ELSA cohort, with evidence of miscalibration (Hosmer-Lemeshow, p < 0.001 at 8 years), reflecting the lower mortality rate in ELSA.

Conclusion

We transformed a previously validated mortality index into an LE calculator that incorporates patient-level risk factors. Our LE calculator may help clinicians determine which preventive interventions are most appropriate for older US adults.
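The transformation described above can be sketched in a few lines: the Gompertz law fixes the hazard's growth rate from the risk-doubling time, the index's fixed-horizon mortality risk pins down the baseline hazard, and integrating the resulting survival curve yields LE. A minimal sketch, where the 8-year doubling time and the integration settings are illustrative assumptions rather than the study's fitted values:

```python
import math

def gompertz_life_expectancy(risk, horizon_years, doubling_years=8.0):
    """Convert a fixed-horizon mortality risk into a life-expectancy estimate
    under a Gompertz hazard h(t) = a * exp(b * t)."""
    b = math.log(2) / doubling_years  # risk-doubling time sets the growth rate
    # Survival is S(t) = exp(-(a/b) * (exp(b*t) - 1)); solve S(horizon) = 1 - risk for a.
    a = -b * math.log(1.0 - risk) / (math.exp(b * horizon_years) - 1.0)
    # Numerically integrate the survival curve to get life expectancy.
    dt, t, le = 0.01, 0.0, 0.0
    while t < 120.0:  # truncate the integral at 120 years
        le += math.exp(-(a / b) * (math.exp(b * t) - 1.0)) * dt
        t += dt
    return le

# Example: the abstract's 28% 4-year risk mapped to a single LE in years.
le = gompertz_life_expectancy(0.28, 4)
```

With the abstract's example input, this toy version yields an LE of the same order as the quoted 7.3 years, though the published calculator's risk-group-specific fitted parameters will differ.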

17.

Background

The Zimbabwean national prevention of mother-to-child HIV transmission (PMTCT) program provided primarily single-dose nevirapine (sdNVP) from 2002–2009 and is currently replacing sdNVP with more effective antiretroviral (ARV) regimens.

Methods

Published HIV and PMTCT models, with local trial and programmatic data, were used to simulate a cohort of HIV-infected, pregnant/breastfeeding women in Zimbabwe (mean age 24.0 years, mean CD4 451 cells/µL). We compared five PMTCT regimens at a fixed level of PMTCT medication uptake: 1) no antenatal ARVs (comparator); 2) sdNVP; 3) WHO 2010 guidelines using “Option A” (zidovudine during pregnancy/infant NVP during breastfeeding for women without advanced HIV disease; lifelong 3-drug antiretroviral therapy (ART) for women with advanced disease); 4) WHO “Option B” (ART during pregnancy/breastfeeding without advanced disease; lifelong ART with advanced disease); and 5) “Option B+”: lifelong ART for all pregnant/breastfeeding, HIV-infected women. Pediatric (4–6 week and 18-month infection risk, 2-year survival) and maternal (2- and 5-year survival, life expectancy from delivery) outcomes were projected.

Results

Eighteen-month pediatric infection risks ranged from 25.8% (no antenatal ARVs) to 10.9% (Options B/B+). Although maternal short-term outcomes (2- and 5-year survival) varied only slightly by regimen, maternal life expectancy was reduced after receipt of sdNVP (13.8 years) or Option B (13.9 years) compared to no antenatal ARVs (14.0 years), Option A (14.0 years), or Option B+ (14.5 years).

Conclusions

Replacement of sdNVP with currently recommended regimens for PMTCT (WHO Options A, B, or B+) is necessary to reduce infant HIV infection risk in Zimbabwe. The planned transition to Option A may also improve both pediatric and maternal outcomes.

18.

Background

Oral pre-exposure prophylaxis (PrEP) can be clinically effective and cost-effective for HIV prevention in high-risk men who have sex with men (MSM). However, individual patients have different risk profiles, real-world populations vary, and no practical tools exist to guide clinical decisions or public health strategies. We introduce a practical model of HIV acquisition, including both a personalized risk calculator for clinical management and a cost-effectiveness calculator for population-level decisions.

Methods

We developed a decision-analytic model of PrEP for MSM. The primary clinical effectiveness and cost-effectiveness outcomes were the number needed to treat (NNT) to prevent one HIV infection, and the cost per quality-adjusted life-year (QALY) gained. We characterized patients according to risk factors including PrEP adherence, condom use, sexual frequency, background HIV prevalence and antiretroviral therapy use.

Results

With standard PrEP adherence and national epidemiologic parameters, the estimated NNT was 64 (95% uncertainty range: 26, 176) at a cost of $160,000 (cost saving, $740,000) per QALY, comparable to other published models. With high (35%) HIV prevalence, the NNT was 35 (21, 57) and the cost per QALY was $27,000 (cost saving, $160,000); with high PrEP adherence, the NNT was 30 (14, 69) and the cost per QALY was $3,000 (cost saving, $200,000). In contrast, for monogamous, serodiscordant relationships with partner antiretroviral therapy use, the NNT was 90 (39, 157) and the cost per QALY was $280,000 ($14,000, $670,000).

Conclusions

PrEP results vary widely across individuals and populations. Risk calculators may aid in patient education, clinical decision-making, and cost-effectiveness evaluation.
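The NNT reported above is, at its core, the reciprocal of the absolute risk reduction over the time horizon. A minimal sketch of that calculation, where the incidence, efficacy, and horizon values are illustrative assumptions rather than the published model's parameters:

```python
def nnt_to_prevent_one_infection(annual_incidence, prep_efficacy, years=10):
    """NNT = 1 / (absolute risk reduction over the time horizon)."""
    # Cumulative infection risk without PrEP over the horizon.
    risk_untreated = 1.0 - (1.0 - annual_incidence) ** years
    # With PrEP, the per-year incidence is scaled down by efficacy.
    risk_treated = 1.0 - (1.0 - annual_incidence * (1.0 - prep_efficacy)) ** years
    return 1.0 / (risk_untreated - risk_treated)

# Example with assumed inputs: 1.5%/year background incidence and
# 44% efficacy (roughly trial-level adherence), over 10 years.
nnt = nnt_to_prevent_one_infection(0.015, 0.44)
```

As the abstract's results illustrate, the NNT falls as background prevalence or adherence (and hence efficacy) rises, which this toy formula reproduces directionally.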

19.

Background

An arteriovenous fistula (AVF) is considered the vascular access of choice, but uncertainty exists about the optimal time for its creation in pre-dialysis patients. The aim of this study was to determine the optimal vascular access referral strategy for stage 4 (glomerular filtration rate <30 ml/min/1.73 m2) chronic kidney disease patients using a decision analytic framework.

Methods

A Markov model was created to compare two strategies: refer all stage 4 chronic kidney disease patients for an AVF versus wait until the patient starts dialysis. Data from published observational studies were used to estimate the probabilities used in the model. A Markov cohort analysis was used to determine the optimal strategy with life expectancy and quality adjusted life expectancy as the outcomes. Sensitivity analyses, including a probabilistic sensitivity analysis, were performed using Monte Carlo simulation.

Results

The wait strategy results in a higher life expectancy (66.6 versus 65.9 months) and quality-adjusted life expectancy (38.9 versus 38.5 quality-adjusted life months) than immediate AVF creation. This result was robust across all parameters except at higher rates of disease progression and lower rates of ischemic steal syndrome.

Conclusions

Early creation of an AVF, as recommended by most guidelines, may not be the preferred strategy for all pre-dialysis patients. Further research on cost implications and patient preferences for treatment options is needed before early AVF creation can be recommended.
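The Markov cohort analysis described above can be illustrated with a toy three-state model (pre-dialysis, dialysis, dead) that accumulates expected alive-months cycle by cycle. The monthly transition probabilities below are assumptions for illustration only, not the study's calibrated estimates, and the real model also tracks access-related events such as ischemic steal:

```python
# States: 0 = pre-dialysis, 1 = dialysis, 2 = dead (absorbing).
# Monthly transition matrix; each row sums to 1. All values are assumed.
P = [
    [0.97, 0.02, 0.01],    # pre-dialysis: stay, progress, or die
    [0.00, 0.985, 0.015],  # dialysis: stay or die
    [0.00, 0.00, 1.00],    # dead is absorbing
]

cohort = [1.0, 0.0, 0.0]   # everyone starts in pre-dialysis
life_months = 0.0
for _ in range(1200):      # 100 years of monthly cycles
    life_months += cohort[0] + cohort[1]          # membership in alive states
    cohort = [sum(cohort[i] * P[i][j] for i in range(3)) for j in range(3)]
```

Weighting each alive state by a utility instead of 1.0 would turn the same loop into the quality-adjusted life expectancy the abstract reports; comparing two strategies means running the loop with each strategy's transition matrix.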

20.

Introduction

Systemic lupus erythematosus (SLE) is a chronic autoimmune disease. Cardiovascular disease (CVD) is common in SLE and a major cause of mortality. Studies on cardiovascular morbidity are abundant, whereas mortality studies focusing on cardiovascular outcomes are scarce. The aim of this study was to investigate causes of death and baseline predictors of overall (OM), non-vascular (N-VM), and specifically cardiovascular (CVM) mortality in SLE, and to evaluate the Systematic Coronary Risk Evaluation (SCORE) algorithm.

Methods

208 SLE patients were included between 1995 and 1999 and followed up 12 years later. Clinical evaluation, CVD risk factors, and biomarkers were recorded at inclusion. Death certificates and autopsy protocols were collected. Causes of death were divided into CVM (ischemic vascular and general atherosclerotic diseases), N-VM, and death due to pulmonary hypertension. Predictors of mortality were investigated using multivariable Cox regression. SCORE and the standardized mortality ratio (SMR) were calculated.

Results

During follow-up, 42 patients died, at a mean age of 62 years; the SMR was 2.4 (95% CI 1.7-3.0). CVM accounted for 48% of deaths. SCORE underestimated CVM, but not significantly. Age, high cystatin C levels, and established arterial disease were the strongest predictors of all-cause mortality. After adjusting for these in multivariable analyses, only smoking among traditional risk factors, and high soluble vascular cell adhesion molecule-1 (sVCAM-1), high-sensitivity C-reactive protein (hsCRP), anti-beta2 glycoprotein-1 (abeta2GP1), and any antiphospholipid antibody (aPL) among biomarkers, remained predictive of CVM.

Conclusion

With the exception of smoking, traditional risk factors do not capture the main underlying risk factors for CVM in SLE. Rather, cystatin C levels, inflammatory and endothelial markers, and antiphospholipid antibodies (aPL) differentiate patients with favorable versus severe cardiovascular prognosis. Our results suggest that these new biomarkers are useful in evaluating the future risk of cardiovascular mortality in SLE patients.
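The SMR above is the ratio of observed deaths to the number expected in an age- and sex-matched general population. A sketch with a Poisson-based approximate 95% CI, where the expected count is an assumed value chosen only to echo the abstract's figures:

```python
import math

observed = 42     # deaths during follow-up (from the abstract)
expected = 17.5   # assumed expected deaths in a matched population

smr = observed / expected
# Approximate 95% CI: log(SMR) has standard error ~ 1/sqrt(observed)
# under a Poisson model for the observed count.
se_log = 1.0 / math.sqrt(observed)
lo = smr * math.exp(-1.96 * se_log)
hi = smr * math.exp(1.96 * se_log)
```

With these assumed inputs the point estimate and interval come out close to the abstract's SMR 2.4 (CI 1.7-3.0); an exact Poisson interval would be preferable for small observed counts.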
