Similar Literature
20 similar records found.
1.

Background

This study was conducted to assess the impact of chikungunya on health costs during the epidemic that occurred on La Réunion in 2005–2006.

Methodology/Principal Findings

From data collected from health agencies, the additional costs incurred by chikungunya in terms of consultations, drug consumption and absence from work were determined by a comparison with the expected costs outside the epidemic period. The cost of hospitalization was estimated from data provided by the national hospitalization database for short-term care by considering all hospital stays in which the ICD-10 code A92.0 appeared. A cost-of-illness study was conducted from the perspective of the third-party payer. Direct medical costs per outpatient and inpatient case were evaluated. The costs were estimated in Euros at 2006 values. Additional reimbursements for consultations with general practitioners and drugs were estimated as €12.4 million (range: €7.7 million–€17.1 million) and €5 million (€1.9 million–€8.1 million), respectively, while the cost of hospitalization for chikungunya was estimated to be €8.5 million (€5.8 million–€8.7 million). Productivity costs were estimated as €17.4 million (€6 million–€28.9 million). The medical cost of the chikungunya epidemic was estimated as €43.9 million, 60% due to direct medical costs and 40% to indirect costs (€26.5 million and €17.4 million, respectively). The direct medical cost was assessed as €90 for each outpatient and €2,000 for each inpatient.

Conclusions/Significance

The medical management of chikungunya during the epidemic on La Réunion Island was associated with a substantial economic burden. The estimated cost of the reported disease can be used to evaluate the cost-effectiveness and cost-benefit ratios of prevention and control programmes for emerging arboviruses.
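As a quick check of the reported breakdown, the short Python sketch below recomputes the 60%/40% split of the €43.9 million total from the direct and indirect components quoted in the abstract; all figures are taken directly from the text above.

# Minimal sketch: reproduce the reported cost split for the La Réunion
# chikungunya epidemic using the point estimates quoted in the abstract
# (figures in million EUR, 2006 values).
direct_medical = 26.5   # consultations, drugs and hospitalization combined
indirect = 17.4         # productivity losses from absence from work

total = direct_medical + indirect
print(f"Total cost: EUR {total:.1f} million")          # 43.9
print(f"Direct share: {direct_medical / total:.0%}")   # ~60%
print(f"Indirect share: {indirect / total:.0%}")       # ~40%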

2.

Objective

Estimates of healthcare costs associated with HIV infection would provide valuable insight for evaluating the cost-effectiveness of possible prevention interventions. We evaluate the additional lifetime healthcare cost incurred due to living with HIV.

Methods

We used a stochastic computer simulation model to project the distribution of lifetime outcomes and costs for men who have sex with men (MSM) infected with HIV in 2013 at age 30, over 10,000 simulation runs. We assumed a resource-rich setting with no loss to follow-up, and that the standards and costs of healthcare management remain as they are now.

Results

Based on a median (interquartile range) life expectancy of 71.5 (45.0–81.5) years for MSM in such a setting, the estimated mean lifetime cost of treating one person was £360,800 ($567,000 or €480,000). With 3.5% discounting, it was £185,200 ($291,000 or €246,000). The largest proportion (68%) of these costs was attributable to antiretroviral drugs. If patented drugs are replaced by generic versions (at 20% of the patented price), the estimated mean lifetime cost falls to £179,000 ($281,000 or €238,000) undiscounted and £101,200 ($158,900 or €134,600) discounted.

Conclusions

If 3,000 MSM had been infected in 2013, then future lifetime costs relating to HIV care are likely to be in excess of £1 billion. It is imperative that investment in prevention programmes be continued or scaled up in settings with good access to HIV care services. Costs would be reduced considerably with the use of generic antiretroviral drugs.
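For readers unfamiliar with discounting, the following minimal Python sketch shows how a 3.5% annual discount rate is applied to a stream of future care costs. The annual cost and time horizon used here are hypothetical placeholders, not parameters of the study's simulation model.

# Illustrative only: present value of a constant annual care cost under
# 3.5% discounting. The annual cost and horizon are hypothetical placeholders.
def discounted_total(annual_cost, years, rate=0.035):
    return sum(annual_cost / (1 + rate) ** t for t in range(years))

annual_cost = 8_000   # GBP per year, hypothetical
years = 40            # hypothetical remaining years of care
print(f"Undiscounted: GBP {annual_cost * years:,.0f}")
print(f"Discounted at 3.5%: GBP {discounted_total(annual_cost, years):,.0f}")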

3.
4.

Background

To evaluate the type and frequency of antiretroviral drug stock-outs, and their impact on death and interruption in care among HIV-infected patients in Abidjan, Côte d'Ivoire.

Methods and Findings

We conducted a cohort study of patients who initiated combination antiretroviral therapy (cART) in three adult HIV clinics between February 1, 2006 and June 1, 2007. Follow-up ended on February 1, 2008. The primary outcome was cART regimen modification, defined as at least one drug substitution, or discontinuation for at least one month, due to drug stock-outs at the clinic pharmacy. The secondary outcome for patients who were on cART for at least six months was interruption in care or death. A Cox regression model with time-dependent variables was used to assess the impact of antiretroviral drug stock-outs on interruption in care or death. Overall, 1,554 adults initiated cART and were followed for a mean of 13.2 months. During this time, 72 patients discontinued treatment and 98 modified their regimen because of drug stock-outs. Stock-outs involved nevirapine and fixed-dose combination zidovudine/lamivudine in 27% and 51% of cases, respectively. Of 1,554 patients, 839 (54%) initiated cART with fixed-dose stavudine/lamivudine/nevirapine and did not face stock-outs during the study period. Among the 975 patients who were on cART for at least six months, stock-out-related cART discontinuations increased the risk of interruption in care or death (adjusted hazard ratio [HR], 2.83; 95%CI, 1.25–6.44) but cART modifications did not (adjusted HR, 1.21; 95%CI, 0.46–3.16).

Conclusions

cART stock-outs affected at least 11% of the population on treatment. Treatment discontinuations due to stock-outs were frequent and doubled the risk of interruption in care or death. These stock-outs did not involve the most common first-line regimen. As access to cART continues to increase in sub-Saharan Africa, first-line regimens should be standardized to decrease the probability of drug stock-outs.
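The time-dependent Cox model described in the methods can be sketched in Python with the lifelines package; the long-format data frame and column names below are hypothetical, intended only to illustrate the interval (start, stop] structure with a time-updated stock-out exposure.

# Sketch of a Cox model with a time-dependent covariate, in the spirit of the
# analysis described above, using the lifelines package. Rows cover intervals
# during which the exposure flag is constant; data are toy values.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

df = pd.DataFrame({
    "id":    [1, 1, 2, 3, 3, 4, 5, 6],
    "start": [0, 6, 0, 0, 4, 0, 0, 0],
    "stop":  [6, 13, 10, 4, 12, 12, 8, 9],
    "stockout_discontinuation": [0, 1, 0, 0, 1, 0, 1, 0],  # time-updated exposure
    "event": [0, 1, 1, 0, 0, 0, 1, 0],  # interruption in care or death
})

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
ctv.print_summary()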

5.

Objective

The HEART score is used for risk stratification of chest pain patients at the emergency department (ED). Applying this score may support quicker and better-founded decisions in these patients. An analysis of the medical resource use of 122 acute chest pain patients admitted before the introduction of this score may indicate possible savings.

Methods

The numbers of cardiology investigations and clinical admission days were counted. Charged costs of medical care were divided into three categories: ED, in-hospital, and outpatient clinic.

Results

The total cost of care was €469,631, an average of €3,849 per patient. Seventy-five percent of this cost was due to hospitalisation under the initial working diagnosis of acute coronary syndrome (ACS). This diagnosis was confirmed in only 29/122 (24%) of the patients. The low-risk group (41 patients with HEART scores 0–3) included one patient with a previously scheduled CABG. In the remaining 40 patients, hospitalisation occurred in 12/40 (30%) and 30/40 (75%) visited the outpatient clinic. The total cost of medical care after presentation in these 40 patients was €37,641; in no case was a new diagnosis of coronary artery disease made. If medical care in this subgroup were considered redundant, major savings on national medical care budgets could be made.

Conclusion

If the HEART score were to be routinely applied, diagnostic pathways could be shortened and costs reduced, in particular in low-risk patients.
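A short Python sketch of the per-patient averages implied by the totals quoted above (figures taken from the abstract):

# Sketch: average cost per patient from the totals quoted in the abstract.
total_cost_eur = 469_631
n_patients = 122
print(f"Mean cost per patient: EUR {total_cost_eur / n_patients:,.0f}")  # ~3,849

# Cost attributed to the low-risk subgroup (HEART score 0-3, 40 evaluable patients)
low_risk_total = 37_641
print(f"Mean cost per low-risk patient: EUR {low_risk_total / 40:,.0f}")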

6.

Background

Efficiently delivered interventions to reduce HIV, malaria, and diarrhea are essential to accelerating global health efforts. A 2008 community integrated prevention campaign in Western Province, Kenya, reached 47,000 individuals over 7 days, providing HIV testing and counseling, water filters, insecticide-treated bed nets, condoms, and, for HIV-infected individuals, cotrimoxazole prophylaxis and referral for ongoing care. We modeled the potential cost-effectiveness of a scaled-up integrated prevention campaign.

Methods

We estimated averted deaths and disability-adjusted life years (DALYs) based on published data on baseline mortality and morbidity and on the protective effect of interventions, including antiretroviral therapy. We incorporated a previously estimated scaled-up campaign cost. We used published costs of medical care to estimate savings from averted illness (for all three diseases) and the added costs of initiating treatment earlier in the course of HIV disease.

Results

Per 1000 participants, projected reductions in cases of diarrhea, malaria, and HIV infection avert an estimated 16.3 deaths, 359 DALYs and $85,113 in medical care costs. Earlier care for HIV-infected persons adds an estimated 82 DALYs averted (to a total of 442), at a cost of $37,097 (reducing total averted costs to $48,015). Accounting for the estimated campaign cost of $32,000, the campaign saves an estimated $16,015 per 1000 participants. In multivariate sensitivity analyses, 83% of simulations result in net savings, and 93% in a cost per DALY averted of less than $20.

Discussion

A mass, rapidly implemented campaign for HIV testing, safe water, and malaria control appears economically attractive.
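The net-savings arithmetic reported per 1,000 participants can be retraced with the point estimates quoted above; small differences from the published figures reflect rounding of the inputs.

# Sketch of the net-savings arithmetic per 1,000 participants, using the point
# estimates quoted in the abstract (USD). Small discrepancies versus the
# published figures reflect rounding of the inputs.
averted_medical_costs = 85_113   # savings from averted diarrhea, malaria, HIV care
earlier_hiv_care_cost = 37_097   # added cost of starting HIV treatment earlier
campaign_cost = 32_000           # estimated campaign cost per 1,000 participants

net_averted = averted_medical_costs - earlier_hiv_care_cost
net_savings = net_averted - campaign_cost
print(f"Net averted care costs: ${net_averted:,}")               # ~48,015 in the abstract
print(f"Net savings per 1,000 participants: ${net_savings:,}")   # ~16,015 in the abstract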

7.

Background

Antiretroviral Treatment (ART) significantly reduces HIV transmission. We conducted a cost-effectiveness analysis of the impact of expanded ART in South Africa.

Methods

We model a best-case scenario of 90% annual HIV testing coverage in adults 15–49 years old and four ART eligibility scenarios: CD4 count <200 cells/mm3 (current practice), CD4 count <350, CD4 count <500, and all CD4 levels. Outcomes for 2011–2050 include deaths, disability-adjusted life years (DALYs), HIV infections, cost, and cost per DALY averted. Service and ART costs reflect South African data and international generic prices. ART reduces transmission by 92%. We conducted sensitivity analyses.

Results

Expanding ART to CD4 count <350 cells/mm3 prevents an estimated 265,000 (17%) and 1.3 million (15%) new HIV infections over 5 and 40 years, respectively. Cumulative deaths decline 15%, from 12.5 to 10.6 million, and DALYs by 14%, from 109 to 93 million, over 40 years. Costs drop $504 million over 5 years and $3.9 billion over 40 years, with breakeven by 2013. Compared with the current scenario, expanding to <500 prevents an additional 585,000 and 3 million new HIV infections over 5 and 40 years, respectively. Expanding to all CD4 levels decreases HIV infections by 3.3 million (45%) and costs by $10 billion over 40 years, with breakeven by 2023. By 2050, using higher ART and monitoring costs, treating at all CD4 levels saves $0.6 billion versus current practice; the other ART scenarios cost $9–194 per DALY averted. If ART reduces transmission by 99%, savings from treating at all CD4 levels reach $17.5 billion. Sensitivity analyses suggest that poor retention and predominantly acute-phase transmission reduce DALYs averted by 26% and savings by 7%.

Conclusion

Increasing the provision of ART to those with CD4 counts <350 cells/mm3 may significantly reduce costs while reducing the HIV burden. Feasibility, including HIV testing and ART uptake, retention, and adherence, should be evaluated.
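The cost per DALY averted figures quoted above follow the standard incremental cost-effectiveness calculation, sketched below in Python with hypothetical inputs rather than outputs of the South African model.

# Generic incremental cost-effectiveness calculation of the kind reported above
# (cost per DALY averted). The inputs are hypothetical placeholders.
def cost_per_daly_averted(cost_new, cost_base, dalys_new, dalys_base):
    """Incremental cost divided by DALYs averted; negative values imply savings."""
    return (cost_new - cost_base) / (dalys_base - dalys_new)

# Hypothetical example: an expanded-eligibility scenario versus current practice.
icer = cost_per_daly_averted(cost_new=1.05e9, cost_base=1.00e9,
                             dalys_new=92e6, dalys_base=93e6)
print(f"Cost per DALY averted: ${icer:,.0f}")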

8.

Aim

To calculate the use, cost and cost-effectiveness of routine treatment and care for people living with HIV (PLHIV) before starting combination antiretroviral therapy (cART), and for PLHIV starting first-line 2NRTIs+NNRTI or 2NRTIs+boosted PI, comparing PLHIV with CD4≤200 cells/mm3 and CD4>200 cells/mm3. Few studies have calculated the use, cost and cost-effectiveness of routine treatment and care before starting cART, or of starting cART above and below 200 CD4 cells/mm3.

Methods

Use, costs and cost-effectiveness were calculated for PLHIV in routine pre-cART care and for those starting first-line cART, comparing CD4≤200 cells/mm3 with CD4>200 cells/mm3 (2008 UK prices).

Results

cART-naïve patients with CD4≤200 cells/mm3 had a cost of £6,407 (95%CI £6,382 to £6,425) per person-year (PPY), compared with £2,758 (95%CI £2,752 to £2,761) PPY for those with CD4>200 cells/mm3; the cost per life-year gained of pre-cART treatment and care for those with CD4>200 cells/mm3 was £1,776 (cost-saving to £2,752). The annual cost of starting 2NRTIs+NNRTI or 2NRTIs+boosted PI with CD4≤200 cells/mm3 was £12,812 (95%CI £12,685–£12,937), compared with £10,478 (95%CI £10,376–£10,581) for PLHIV with CD4>200 cells/mm3. The cost per additional life-year gained on first-line therapy for those with CD4>200 cells/mm3 was £4,639 (£3,967 to £2,960).

Conclusion

Starting to use HIV services before the CD4 count falls to ≤200 cells/mm3 is cost-effective and enables PLHIV to be monitored so that they start cART with a CD4 count >200 cells/mm3, which results in better outcomes and is cost-effective. However, 25% of PLHIV accessing services continue to present with CD4≤200 cells/mm3. This highlights the need to investigate the cost-effectiveness of testing and early treatment programs for key populations in the UK.

9.

Background

Trials in Africa indicate that medical adult male circumcision (MAMC) reduces the risk of HIV by 60%. MAMC may avert 2 to 8 million HIV infections over 20 years in sub-Saharan Africa and cost less than treating those who would have been infected. This paper estimates the financial and human resources required to roll out MAMC and the net savings due to reduced infections.

Methods

We developed a model which included costing, demography and HIV epidemiology. We used it to investigate 14 countries in sub-Saharan Africa where the prevalence of male circumcision was lower than 80% and HIV prevalence among adults was higher than 5%, in addition to Uganda and the Nyanza province in Kenya. We assumed that the roll-out would take 5 years and lead to an MC prevalence among adult males of 85%. We also assumed that surgery would be done as it was in the trials. We calculated public program cost, number of full-time circumcisers and net costs or savings when adjusting for averted HIV treatments. Costs were in USD, discounted to 2007. 95% percentile intervals (95% PI) were estimated by Monte Carlo simulations.

Results

In the first 5 years the number of circumcisers needed was 2,282 (95% PI: 2,018 to 2,959), or 0.24 (95% PI: 0.21 to 0.31) per 10,000 adults. In years 6–10, the number of circumcisers needed fell to 513 (95% PI: 452 to 664). The estimated 5-year cost of rolling out MAMC in the public sector was $919 million (95% PI: $726 to $1,245 million). The cumulative net cost over the first 10 years was $672 million (95% PI: $437 to $1,021 million) and over 20 years there were net savings of $2.3 billion (95% PI: $1.4 to $3.4 billion).

Conclusion

A rapid roll-out of MAMC in sub-Saharan Africa requires substantial funding and a high number of circumcisers for the first five years. These investments are justified by MAMC's substantial health benefits and the savings accrued by averting future HIV infections. Lower ongoing costs and continued care savings suggest long-term sustainability.
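The 95% percentile intervals (95% PI) reported above are obtained from Monte Carlo draws; the Python sketch below illustrates the general approach with a toy cost model whose distributions are hypothetical, not those of the roll-out model.

# Sketch of deriving 95% percentile intervals (95% PI) from Monte Carlo draws.
# The cost model is a toy stand-in: uncertain unit cost x uncertain volume.
import numpy as np

rng = np.random.default_rng(0)
n_draws = 10_000
unit_cost = rng.normal(loc=40.0, scale=6.0, size=n_draws)    # USD per circumcision, hypothetical
procedures = rng.normal(loc=22e6, scale=3e6, size=n_draws)   # number performed, hypothetical
total_cost = unit_cost * procedures

lo, hi = np.percentile(total_cost, [2.5, 97.5])
print(f"Mean: ${total_cost.mean()/1e6:,.0f} million")
print(f"95% PI: ${lo/1e6:,.0f} to ${hi/1e6:,.0f} million")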

10.

Background

A multicentre, double-blind, randomised controlled trial (M-RCT), carried out in the Netherlands in 2005–2007, showed that hospitalised patients with S. aureus nasal carriage who were treated prophylactically with mupirocin nasal ointment and chlorhexidine gluconate medicated soap (MUP-CHX) had a significantly lower risk of healthcare-associated S. aureus infections than patients receiving placebo (3.4% vs. 7.7%, RR 0.42, 95% CI 0.23–0.75). The objective of the present study was to determine whether treatment of patients undergoing elective cardiothoracic or orthopaedic surgery with MUP-CHX (screen-and-treat strategy) affected the costs of patient care.

Methods

We compared hospital costs of patients undergoing cardiothoracic or orthopaedic surgery (n = 415) in one of the participating centres of the M-RCT. Data from the ‘Planning and Control’ department were used to calculate total hospital costs of the patients. Total costs were calculated including nursing days, costs of surgery, costs for laboratory and radiological tests, functional assessments and other costs. Costs for personnel, materials and overhead were also included. Mean costs in the two treatment arms were compared using the t-test for equality of means (two-tailed). Subgroup analysis was performed for cardiothoracic and orthopaedic patients.

Results

An investigator-blinded analysis revealed that costs of care in the treatment arm (MUP-CHX, n = 210) were on average €1911 lower per patient than costs of care in the placebo arm (n = 205) (€8602 vs. €10513, p = 0.01). Subgroup analysis showed that MUP-CHX treated cardiothoracic patients cost €2841 less (n = 280, €9628 vs €12469, p = 0.006) and orthopaedic patients €955 less than non-treated patients (n = 135, €6097 vs €7052, p = 0.05).

Conclusions

In patients undergoing cardiothoracic or orthopaedic surgery, screening for S. aureus nasal carriage and treating carriers with MUP-CHX results in a substantial reduction of hospital costs.
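The comparison of mean costs with a two-tailed t-test, as described in the methods, could be run along the following lines in Python; the cost vectors here are simulated placeholders rather than trial data.

# Sketch of the cost comparison described in the methods: mean hospital costs in
# the MUP-CHX and placebo arms compared with a two-tailed t-test for equality of
# means. The cost vectors below are simulated placeholders, not trial data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
costs_mupchx = rng.gamma(shape=4.0, scale=2150.0, size=210)   # ~EUR 8,600 on average
costs_placebo = rng.gamma(shape=4.0, scale=2630.0, size=205)  # ~EUR 10,500 on average

t_stat, p_value = stats.ttest_ind(costs_mupchx, costs_placebo)  # two-tailed by default
print(f"Mean difference: EUR {costs_placebo.mean() - costs_mupchx.mean():,.0f}")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")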

11.

Background

In France, roughly 40,000 HIV-infected persons are unaware of their HIV infection. Although previous studies have evaluated the cost-effectiveness of routine HIV screening in the United States, differences in both the epidemiology of infection and HIV testing behaviors warrant a setting-specific analysis for France.

Methods/Principal Findings

We estimated the life expectancy (LE), cost and cost-effectiveness of alternative HIV screening strategies in the French general population and high-risk sub-populations using a computer model of HIV detection and treatment, coupled with French national clinical and economic data. We compared risk-factor-based HIV testing (“current practice”) to universal routine, voluntary HIV screening in adults aged 18–69. Screening frequencies ranged from once to annually. Input data included mean age (42 years), undiagnosed HIV prevalence (0.10%), annual HIV incidence (0.01%), test acceptance (79%), linkage to care (75%) and cost/test (€43). We performed sensitivity analyses on HIV prevalence and incidence, cost estimates, and the transmission benefits of ART. “Current practice” produced LEs of 242.82 quality-adjusted life months (QALM) among HIV-infected persons and 268.77 QALM in the general population. Adding a one-time HIV screen increased LE by 0.01 QALM in the general population and increased costs by €50/person, for a cost-effectiveness ratio (CER) of €57,400 per quality-adjusted life year (QALY). More frequent screening in the general population increased survival, costs and CERs. Among injection drug users (prevalence 6.17%; incidence 0.17%/year) and in French Guyana (prevalence 0.41%; incidence 0.35%/year), annual screening compared to every five years produced CERs of €51,200 and €46,500/QALY.

Conclusions/Significance

One-time routine HIV screening in France improves survival compared to “current practice” and compares favorably to other screening interventions recommended in Western Europe. In higher-risk groups, more frequent screening is economically justifiable.
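The cost-effectiveness ratio for one-time screening can be retraced from the per-person figures quoted above; with the rounded inputs in the abstract the calculation gives roughly €60,000 per QALY, close to the published €57,400, which was presumably derived from unrounded model outputs.

# Sketch of the cost-effectiveness ratio (CER) calculation: incremental cost per
# person divided by the incremental survival gain, converted from quality-adjusted
# life months (QALM) to QALYs. Inputs are the rounded values quoted in the abstract.
incremental_cost = 50.0   # EUR per person for adding a one-time screen
incremental_qalm = 0.01   # gain in quality-adjusted life months per person

incremental_qaly = incremental_qalm / 12.0
cer = incremental_cost / incremental_qaly
print(f"CER: EUR {cer:,.0f} per QALY")   # ~60,000 with these rounded inputs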

12.

Background

Immune activation is a strong predictor of disease progression in HIV infection. Combinatorial plasma biomarker signatures that represent surrogate markers of immune activation in both viremic and aviremic HIV patients on combination antiretroviral therapy (cART) have not been defined. Here, we identify a plasma inflammatory biomarker signature that distinguishes between both viremic and aviremic HIV patients on cART and healthy controls and examine relationships of this signature to markers of disease progression.

Methods

Multiplex profiling and ELISA were used to detect 15 cytokines/chemokines, soluble IL-2R (sIL-2R), and soluble CD14 (sCD14) in plasma from 57 HIV patients with CD4 nadir <300 cells/µl and 29 healthy controls. Supervised and unsupervised analyses were used to identify biomarkers explaining variance between groups defined by HIV status or drug abuse. Relationships between biomarkers and disease markers were examined by Spearman correlation.

Results

The majority (91%) of HIV subjects were on cART, with 38% having undetectable viral loads (VL). Hierarchical clustering identified a plasma biomarker cluster consisting of two interferon-stimulated gene products (CXCL9 and CXCL10), a T cell activation marker (sIL-2R), and a monocyte activation marker (sCD14) that distinguished both viremic and aviremic HIV patients on cART from controls (p<0.0001); these markers were also top-ranked in variable importance in projection plots. IL-12 and CCL4 were also elevated in viremic and aviremic patients compared to controls (p<0.05). IL-12 correlated with IFNα, IFNγ, CXCL9, and sIL-2R (p<0.05). CXCL10 correlated positively with plasma VL and the percentage of CD16+ monocytes, and inversely with CD4 count (p = 0.001, <0.0001, and 0.04, respectively).

Conclusion

A plasma inflammatory biomarker signature consisting of CXCL9, CXCL10, sIL-2R, and sCD14 may be useful as a surrogate marker to monitor immune activation in both viremic and aviremic HIV patients on cART during disease progression and therapeutic responses.

13.

Objectives

We analyzed clinical progression among persons diagnosed with HIV at the time of an AIDS-defining event, and assessed the impact on outcome of timing of combined antiretroviral treatment (cART).

Methods

Retrospective European and Canadian multicohort study. Patients were diagnosed with HIV between 1997 and 2004 and had clinical AIDS from 30 days before to 14 days after diagnosis. Clinical progression (new AIDS event, death) was described using Kaplan-Meier analysis, stratifying by type of AIDS event. Factors associated with progression were identified with multivariable Cox regression. Progression rates were compared between those starting early (<30 days after the AIDS event) and deferred (30–270 days after the AIDS event) cART.

Results

The median (interquartile range) CD4 count and viral load (VL) at diagnosis of the 584 patients were 42 (16, 119) cells/µL and 5.2 (4.5, 5.7) log10 copies/mL. Clinical progression was observed in 165 (28.3%) patients. Older age, a higher VL at diagnosis, and a diagnosis of non-Hodgkin lymphoma (NHL) (vs. other AIDS events) were independently associated with disease progression. Of 366 patients with an opportunistic infection, 178 (48.6%) received early cART. There was no significant difference in clinical progression between those initiating cART early and those deferring treatment (adjusted hazard ratio 1.32 [95% confidence interval 0.87, 2.00], p = 0.20).

Conclusions

Older patients and patients with high VL or NHL at diagnosis had a worse outcome. Our data suggest that earlier initiation of cART may be beneficial among HIV-infected patients diagnosed with clinical AIDS in our setting.

14.

Background

Adolescents have been identified as a high-risk group for poor adherence to and defaulting from combination antiretroviral therapy (cART) care. However, data on outcomes for adolescents on cART in resource-limited settings remain scarce.

Methods

We conducted an observational study of patients who started cART at The AIDS Service Organization (TASO) in Uganda between 2004 and 2009. Age was stratified into three groups: children (≤10 years), adolescents (11–19 years), and adults (≥20 years). Kaplan-Meier survival curves were generated to describe time to mortality and loss to follow-up, and Cox regression was used to model associations between age and mortality or loss to follow-up. To address loss to follow-up, we applied a weighted analysis that assumed 50% of lost patients had died.

Findings

A total of 23,367 patients were included in this analysis: 810 (3.5%) children, 575 (2.5%) adolescents, and 21,982 (94.0%) adults. A lower percentage of children (5.4%) died during cART treatment than adolescents (8.5%) and adults (10%). After adjusting for confounding, features other than age alone predicted mortality. Mortality was higher among males (p<0.001), patients with a low initial CD4 cell count (p<0.001), patients with advanced WHO clinical disease stage (p<0.001), and patients with a shorter duration of time receiving cART (p<0.001). The crude mortality rate was lower for children (22.8 per 1000 person-years; 95% CI: 16.1, 29.5) than for adolescents (36.5 per 1000 person-years; 95% CI: 26.3, 46.8) and adults (37.5 per 1000 person-years; 95% CI: 35.9, 39.1).

Interpretation

This study is the largest assessment of adolescents receiving cART in Africa. Adolescents did not have cART mortality outcomes that differed from those of adults or children.

15.

Objective

Intravenous iron is widely used to treat iron deficiency in day-care units. Ferric carboxymaltose (FCM) allows administration of larger iron doses than iron sucrose (IS) in each infusion (1000 mg vs. 200 mg). As FCM reduces the number of infusions required but is more expensive, we performed a cost-minimization analysis to compare the cost impact of the two drugs.

Materials and Methods

The number of infusions and the iron dose of 111 consecutive patients who received intravenous iron at a gastrointestinal diseases day-care unit from August 2007 to July 2008 were obtained retrospectively. Costs of intravenous iron drugs were obtained from the Spanish regulatory agencies. The hospital's accounting department determined the direct and indirect hospital costs of outpatient iron infusion. Non-hospital direct costs were calculated on the basis of patient interviews. In the pharmacoeconomic model, base-case mean costs per patient were calculated for administering 1000 mg of iron per infusion using FCM or 200 mg using IS. Sensitivity analysis and Monte Carlo simulation were performed.

Results

Under baseline assumptions, the estimated cost of iron infusion per patient and year was €304 for IS and €274 for FCM, a difference of €30 in favour of FCM. Adding non-hospital direct costs to the model increased the difference to €67 (€354 for IS vs. €287 for FCM). A Monte Carlo simulation taking into account non-hospital direct costs favoured the use of FCM in 97% of simulations.

Conclusion

In this pharmacoeconomic analysis, FCM infusion reduced the costs of iron infusion at a gastrointestinal day-care unit.
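A cost-minimization comparison of this kind reduces to counting infusions and pricing each one; the Python sketch below illustrates the structure of the base case with hypothetical unit prices, since the abstract reports only the resulting totals (€304 for IS vs. €274 for FCM per patient-year).

# Sketch of a cost-minimization comparison in the spirit of the base case above:
# the same annual iron dose delivered as 1,000 mg FCM infusions or 200 mg IS
# infusions, each infusion carrying a drug cost plus an administration cost.
# The unit prices below are hypothetical placeholders.
import math

def annual_cost(total_dose_mg, dose_per_infusion_mg, drug_cost_per_infusion,
                admin_cost_per_infusion):
    n_infusions = math.ceil(total_dose_mg / dose_per_infusion_mg)
    return n_infusions * (drug_cost_per_infusion + admin_cost_per_infusion)

total_dose = 1000  # mg iron per patient-year, hypothetical
cost_is = annual_cost(total_dose, 200, drug_cost_per_infusion=20, admin_cost_per_infusion=40)
cost_fcm = annual_cost(total_dose, 1000, drug_cost_per_infusion=200, admin_cost_per_infusion=40)
print(f"IS:  EUR {cost_is}")
print(f"FCM: EUR {cost_fcm}")
print(f"Difference in favour of FCM: EUR {cost_is - cost_fcm}")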

16.

Background

Progressive multifocal leukoencephalopathy (PML), a rare devastating demyelinating disease caused by the polyomavirus JC (JCV), occurs in severely immunocompromised patients, most of whom have advanced-stage HIV infection. Despite combination antiretroviral therapy (cART), 50% of patients die within 6 months of PML onset. We conducted a multicenter, open-label pilot trial evaluating the survival benefit of a five-drug cART designed to accelerate HIV replication decay and JCV-specific immune recovery.

Methods and Findings

All the patients received an optimized cART with three or more drugs for 12 months, plus the fusion inhibitor enfuvirtide during the first 6 months. The main endpoint was the one-year survival rate. A total of 28 patients were enrolled. At entry, the median CD4+ T-cell count was 53 per microliter and 86% of patients had detectable plasma HIV RNA and CSF JCV DNA levels. Seven patients died, all before month 4. The one-year survival estimate was 0.75 (95% confidence interval, 0.61 to 0.93). At month 6, JCV DNA was undetectable in the CSF of 81% of survivors. At month 12, 81% of patients had undetectable plasma HIV RNA, and the median CD4+ T-cell increment was 105 per microliter. In univariate analysis, higher total and naive CD4+ T-cell counts and a lower CSF JCV DNA level at baseline were associated with better survival. JCV-specific functional memory CD4+ T-cell responses, based on a proliferation assay, were detected in 4% of patients at baseline and 43% at month 12 (P = 0.008).

Conclusions

The early use of five-drug cART after PML diagnosis appears to improve survival. This is associated with recovery of anti-JCV T-cell responses and JCV clearance from CSF. A low CD4+ T-cell count (particularly naive subset) and high JCV DNA copies in CSF at PML diagnosis appear to be risk factors for death.

Trial Registration

ClinicalTrials.gov NCT00120367

17.

Objectives

Immune activation is decreased by combination antiretroviral therapy (cART) in patients infected with human immunodeficiency virus (HIV), but residual activation remains and has been proposed as a cause of premature aging and death, although data are lacking. We analyzed the relationship between T-cell subsets after 18 months of cART and overall survival during 12 years of follow-up.

Methods

A cohort of 101 HIV-infected patients who had undetectable plasma HIV after starting cART was included in 1997–1998. T-cell subsets were analyzed by flow cytometry after 18 months of cART. The relationship with survival was assessed using Kaplan-Meier curves and multiple Cox regression.

Results

Seventeen patients died during the observation period. The leading causes of death were non-AIDS cancer and cardiovascular disease. Higher levels of CD8 memory T cells (CD8+, CD45RO+, CD45RA-) showed a significant beneficial effect on survival, with a HR of 0.95 (95% confidence interval 0.91–0.99, P = 0.016) when adjusted for age, nadir CD4 count, CD4 count, and AIDS and hepatitis C status. T-cell activation was not associated with an increased risk of death.

Conclusions

Larger, longitudinal studies are needed to accurately establish prognostic factors, but the overall results suggest that prognostic information exists within the CD8 compartment.

18.

Background

The costs and benefits of controlling nosocomial spread of antibiotic-resistant bacteria are unknown.

Methods

We developed a mathematical algorithm to determine the cost-effectiveness of infection control programs and explored the dynamic interactions between different epidemiological variables and cost-effectiveness. The algorithm includes the occurrence of nosocomial infections, attributable mortality, the costs and efficacy of infection control, and how antibiotic-resistant bacteria affect the total number of infections: whether infections with antibiotic-resistant bacteria replace infections caused by susceptible bacteria (replacement scenario) or occur in addition to them (addition scenario). Methicillin-resistant Staphylococcus aureus (MRSA) bacteremia was used for illustration, using observational data on S. aureus bacteremia (SAB) in our hospital (n = 189 between 2001 and 2004, all methicillin-susceptible S. aureus [MSSA]).

Results

In the replacement scenario, the costs per life year gained range from €45,912 to €6,590 for attributable mortality rates ranging from 10% to 50%. Using €20,000 per life year gained as a threshold, completely preventing MRSA would be cost-effective in the replacement scenario if attributable mortality of MRSA is ≥21%. In the addition scenario, infection control would be cost saving along the entire range of estimates for attributable mortality.

Conclusions

The cost-effectiveness of controlling antibiotic-resistant bacteria is highly sensitive to the interaction between infections caused by resistant and susceptible bacteria (addition or replacement) and to attributable mortality. In our setting, controlling MRSA would be cost saving in the addition scenario but would not be cost-effective in the replacement scenario if attributable mortality were <21%.

19.
20.

Introduction

As high out-of-pocket healthcare expenses pose a heavy financial burden on families, the Government of India is considering a variety of financing and delivery options to universalize health care services. Hence, an estimate of the cost of delivering universal health care services is needed.

Methods

We developed a model to estimate the recurrent and annual costs of providing health services through a mix of public and private providers in Chandigarh, located in northern India. The health services required to deliver good-quality care were defined by the Indian Public Health Standards. National Sample Survey data were used to estimate disease burden. In addition, morbidity and treatment data were collected from two secondary and two tertiary care hospitals. The unit cost of treatment was estimated from the published literature. For diseases where data on treatment cost were not available, we collected data on standard treatment protocols and the cost of care from local health providers.

Results

We estimate that the cost of universal health care delivery through the existing mix of public and private health institutions would be INR 1,713 (USD 38, 95% CI USD 18–73) per person per annum in India. This cost would be 24% higher if branded drugs were used. Extrapolation of these costs to the entire country indicates that the Indian government would need to spend 3.8% (2.1%–6.8%) of GDP to universalize health care services.

Conclusion

The cost of universal health care delivered through a combination of public and private providers is estimated to be INR 1,713 per capita per year in India. Important issues such as the delivery strategy for ensuring quality, reducing inequities in access, and managing the growth of health care demand need to be explored.
