Similar Literature

20 similar documents found (search time: 218 ms)
2.

Background

Universal access to first-line antiretroviral therapy (ART) for HIV infection is becoming a reality in most low- and middle-income countries in Asia. However, second-line therapies remain relatively scarce.

Methods and Findings

We developed a mathematical model of an HIV epidemic in a Southeast Asian setting and used it to forecast the impact of treatment plans, without second-line options, on the potential degree of acquisition and transmission of drug-resistant HIV strains. We show that after 10 years of universal treatment access, up to 20% of treatment-naïve individuals with HIV may carry drug-resistant strains, although this depends on the relative fitness of the viral strains.

Conclusions

If viral load testing of people on ART is carried out yearly and virological failure leads to effective second-line therapy, then transmitted drug resistance could be reduced by 80%. Greater efforts are required to minimize first-line failure, to detect virological failure earlier, and to secure access to second-line therapies.

4.

Objective

This study aims to describe the virological, immunological and clinical efficacy of protease inhibitor (PI)-based second-line antiretroviral therapy (ART) in rural South Africa.

Methods

An observational cohort study was performed on 210 patients (including 39 children) who initiated PI-based second-line therapy at least 12 months prior to data collection. Biannual clinical, immunological and virological monitoring was performed. Primary endpoints were adequate virological response (plasma HIV-1 RNA <400 copies/ml), full virological suppression (plasma HIV-1 RNA <50 copies/ml) and treatment failure, defined as either virological failure (plasma HIV-1 RNA >1,000 copies/ml after an initial virological response) or ongoing viremia (plasma HIV-1 RNA never <400 copies/ml for more than six months). Data were analyzed by on-treatment (OT) and intention-to-treat (ITT) approaches. Analyses were primarily performed on the group of patients who switched following first-line virological failure.

Results

Median duration of follow-up after the switch to second-line treatment was 20 months [IQR 11–35]. A total of 191 patients had switched to second-line ART due to first-line virological failure; 139/191 (72.8%, ITT) were in care and on treatment at the end of follow-up and 11/191 (5.8%, ITT) had died. After twelve months, an adequate virological response was seen in 92/128 patients (71.9%, OT), of whom 78/128 (60.9%, OT) achieved full virological suppression. Virological response remained stable after 24 months. Virological efficacy was similar among adult and pediatric patients. As in first-line ART, we observed a lack of correlation between virological failure and WHO-defined immunological failure.

Conclusions

Good virological outcomes following first-line failure can be achieved with PI-based, second-line antiretroviral therapy in both adult and pediatric patients in rural South Africa. Retention rates were high and virological outcomes were sustainable during the two-year follow-up period, although persisting low-level viremia occurred in a subset of patients. The observed viro-immunological dissociation emphasizes the need for virological monitoring.

6.

Objectives

In developing nations, operational parameters (OPs) are seldom used to predict outcomes of clinical care, a missed opportunity to enhance the care process. We modeled the impact of multiple measurements of adherence on antiretroviral treatment (ART) outcomes in Peru.

Design and Methods

Retrospective cohort study including ART-naïve, non-pregnant adults initiating therapy at Hospital Nacional Cayetano Heredia, Lima, Peru (2006–2010). Three OPs were defined: 1) medication possession ratio (MPR): days with antiretrovirals dispensed/days on first-line therapy; 2) laboratory monitoring constancy (LMC): proportion of 6-month intervals with ≥1 viral load or CD4 count reported; 3) clinic visit constancy (CVC): proportion of 6-month intervals with ≥1 clinic visit. Three multivariable Cox proportional hazards (PH) models (one per OP) were fit for (1) duration of first-line ART persistence and (2) time to second-line virologic failure. All models were adjusted for socio-demographic, clinical and laboratory variables.
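The three operational parameters defined above are simple ratios computable from routine program records. A minimal sketch (the example figures are hypothetical, not taken from the study):

```python
def medication_possession_ratio(days_dispensed, days_on_first_line):
    """MPR: days with antiretrovirals dispensed / days on first-line therapy."""
    return days_dispensed / days_on_first_line

def interval_constancy(intervals_with_event, total_intervals):
    """LMC or CVC: proportion of 6-month intervals with >=1 qualifying event
    (a viral load/CD4 result for LMC, a clinic visit for CVC)."""
    return intervals_with_event / total_intervals

# Hypothetical patient: 540 days of ARVs dispensed over 600 days on first-line therapy
mpr = medication_possession_ratio(540, 600)   # 0.9
# Lab results reported in 4 of 5 six-month intervals
lmc = interval_constancy(4, 5)                # 0.8
```

In the study's Cox models these quantities entered as covariates scaled per 10% increase.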

Results

A total of 856 patients were included in the first-line persistence analyses; median age was 35.6 years [IQR 29.4–42.9] and most were male (624; 73%). In multivariable PH models, MPR (per 10% increase: HR=0.66; 95%CI 0.61–0.71) and LMC (per 10% increase: 0.83; 0.71–0.96) were associated with prolonged time on first-line therapy. Among 79 individuals included in the time to second-line virologic failure analyses, MPR was the only OP independently associated with prolonged time to second-line virologic failure (per 10% increase: 0.88; 0.77–0.99).

Conclusions

The capture and utilization of program-level parameters such as MPR can provide valuable insight into patient-level treatment outcomes.

7.

Background

There is an urgent need to improve the evidence base for provision of second-line antiretroviral therapy (ART) following first-line virological failure. This is particularly the case in Sub-Saharan Africa where 70% of all people living with HIV/AIDS (PHA) reside. The aim of this study was to simulate the potential risks and benefits of treatment simplification in second-line therapy compared to the current standard of care (SOC) in a lower-middle income and an upper-middle income country in Sub-Saharan Africa.

Methods

We developed a microsimulation model to compare outcomes associated with reducing treatment discontinuations between the current SOC for second-line therapy in South Africa and Nigeria and an alternative regimen: ritonavir-boosted lopinavir (LPV/r) combined with raltegravir (RAL). We used published studies and data from collaborating sites to estimate efficacy, adverse effects and costs. Model outcomes were reported as incremental cost-effectiveness ratios (ICERs) in 2011 USD per quality-adjusted life year ($/QALY) gained.

Results

Reducing treatment discontinuations with LPV/r+RAL resulted in an additional 0.4 discounted QALYs and increased the undiscounted life expectancy by 0.8 years per person compared to the current SOC. The average incremental cost was $6,525 per treated patient in Nigeria and $4,409 per treated patient in South Africa. The cost-effectiveness ratios were $16,302/QALY gained and $11,085/QALY gained for Nigeria and South Africa, respectively. Our results were sensitive to the probability of ART discontinuation and the unit cost for RAL.
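The ICER quoted above is incremental cost divided by incremental QALYs. Plugging in the rounded figures from the abstract (0.4 discounted QALYs gained per person) approximately reproduces the reported ratios; the small discrepancies reflect rounding of the published inputs:

```python
def icer(incremental_cost, incremental_qalys):
    """Incremental cost-effectiveness ratio, in $ per QALY gained."""
    return incremental_cost / incremental_qalys

# Rounded inputs from the abstract: 0.4 discounted QALYs gained per person
nigeria = icer(6525, 0.4)        # ~16,312 $/QALY (abstract reports $16,302/QALY)
south_africa = icer(4409, 0.4)   # ~11,022 $/QALY (abstract reports $11,085/QALY)
```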

Conclusions

The combination of raltegravir and ritonavir-boosted lopinavir was projected to be cost-effective in South Africa. However, at its current price, it is unlikely to be cost-effective in Nigeria.

8.

Objectives

Mortality in patients starting antiretroviral therapy (ART) is higher in Malawi and Zambia than in South Africa. We examined whether different monitoring of ART (viral load [VL] in South Africa and CD4 count in Malawi and Zambia) could explain this mortality difference.

Design

Mathematical modelling study based on data from ART programmes.

Methods

We used a stochastic simulation model to study the effect of VL monitoring on mortality over 5 years. In baseline scenario A all parameters were identical between strategies except for more timely and complete detection of treatment failure with VL monitoring. Additional scenarios introduced delays in switching to second-line ART (scenario B) or higher virologic failure rates (due to worse adherence) when monitoring was based on CD4 counts only (scenario C). Results are presented as relative risks (RR) with 95% prediction intervals and percent of observed mortality difference explained.

Results

RRs comparing VL with CD4 cell count monitoring were 0.94 (0.74–1.03) in scenario A, 0.94 (0.77–1.02) with delayed switching (scenario B) and 0.80 (0.44–1.07) when assuming a 3-times higher rate of failure (scenario C). The observed mortality at 3 years was 10.9% in Malawi and Zambia and 8.6% in South Africa (absolute difference 2.3%). The percentage of the mortality difference explained by VL monitoring ranged from 4% (scenario A) to 32% (scenarios B and C combined, assuming a 3-times higher failure rate). Eleven percent was explained by non-HIV related mortality.

Conclusions

VL monitoring reduces mortality moderately when assuming improved adherence and decreased failure rates.

9.

Background

Tenofovir (TDF) and ritonavir-boosted lopinavir (LPV/r) were not introduced in China as second-line medications until 2009. The efficacy and safety of a TDF/3TC/LPV/r-based second-line regimen have not been evaluated in Chinese HIV patients who failed first-line regimens.

Methods

This was a multicenter cohort study recruiting patients from Beijing, Shanghai, Guangdong, and Henan provinces between November 2008 and January 2010. Eighty HIV-infected patients failing first-line regimens, with serum creatinine below 1.5 times the upper limit of normal, received TDF + lamivudine (3TC) + LPV/r and were followed for 120 weeks. CD4 cell count, viral load, and estimated glomerular filtration rate (eGFR) were monitored at each visit.

Results

At baseline, 31.2% and 48.8% of patients had moderate- or high-level resistance to TDF and 3TC, respectively, while 2.5% of patients had only low-level resistance to LPV/r. During 120 weeks of follow-up, the virological suppression rate exceeded 70% (<40 copies/ml) and 90% (<400 copies/ml), and median CD4 cell count increased from 157 cells/μL at baseline to 307 cells/μL at week 120. Baseline drug-resistance mutations had no impact on the efficacy of second-line antiretroviral therapy. Median eGFR dropped from 104.7 ml/min/1.73 m² at baseline to 95.6 ml/min/1.73 m² at week 24 and then recovered after week 96.

Conclusion

This study is the first to demonstrate that TDF + 3TC + LPV/r is efficacious as a second-line regimen, with an acceptable nephrotoxicity profile, in Chinese patients who failed zidovudine- or stavudine-based first-line regimens.

Trial Registration

ClinicalTrials.gov NCT00872417

10.

Background

Hepatitis B coinfection is common in HIV-positive individuals, and as antiretroviral therapy has made death due to AIDS less common, hepatitis has become increasingly important. Several drugs are available to treat hepatitis B. The most potent, and the one with the lowest risk of resistance, appears to be tenofovir (TDF). However, several questions remain unanswered regarding the use of TDF, including the proportion of patients that achieves suppression of HBV viral load and over what time; whether suppression is durable; and whether prior treatment with other HBV-active drugs, such as lamivudine, compromises the efficacy of TDF through possible selection of resistant HBV strains.

Methods

We performed a systematic review and meta-analysis following PRISMA guidelines, using multilevel mixed-effects logistic regression stratified by prior and/or concomitant use of lamivudine and/or emtricitabine.

Results

Data were available from 23 studies including 550 HBV/HIV-coinfected patients treated with TDF. Follow-up was for up to seven years but, to ensure sufficient power, the analyses were limited to three years. The overall proportion achieving suppression of HBV replication was 57.4%, 79.0% and 85.6% at one, two and three years, respectively. No effect of prior or concomitant 3TC/FTC was shown. Virological rebound on TDF treatment was rare.

Interpretation

TDF suppresses HBV to undetectable levels in the majority of HBV/HIV coinfected patients, with the proportion fully suppressed continuing to increase during continuous treatment. Prior treatment with 3TC/FTC does not compromise the efficacy of TDF treatment. The use of combination treatment with 3TC/FTC offers no significant benefit over TDF alone.

11.

Aim

To investigate the costs and effects of single-pill versus two- or three-pill first-line antiretroviral combinations in reducing viral load, increasing CD4 counts, and the first-line failure rates associated with the respective regimens at 6 and 12 months.

Methods

Patients on first-line TDF+3TC+EFV, TDF+FTC+EFV, Truvada®+EFV or Atripla® between 1996 and 2008 were identified, and viral load and CD4 counts were measured at baseline, six and twelve months. Factors that independently predicted treatment failure at six and twelve months were derived using multivariate Cox's proportional hazards regression analyses. Use and costs of hospital services were calculated at six and twelve months.

Results

All regimens reduced viral load to below the limit of detection, and CD4 counts increased to similar levels at six and twelve months for all treatment regimens. No statistically significant differences were observed in the rate of treatment failure at six and twelve months. People on Atripla® generated lower healthcare costs for non-AIDS patients, at £5,340 (£5,254 to £5,426) per patient-semester and £9,821 (£9,719 to £9,924) per patient-year, which was £1,344 (95%CI £1,222 to £1,465) less per patient-semester and £1,954 (95%CI £1,801 to £2,107) less per patient-year compared with Truvada®+EFV; healthcare costs for AIDS patients were similar across all regimens.

Conclusion

The single-pill regimen is as effective as the two- and three-pill regimens of the same drugs, but if started as first-line induction therapy it would yield a 20% saving in healthcare costs at six months and 17% at twelve months compared with Truvada®+EFV, the regimen that generated the next-lowest costs.
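As a quick arithmetic check, the quoted savings percentages follow from the per-patient cost figures and differences reported in the results above:

```python
# Atripla® costs and savings vs Truvada®+EFV, non-AIDS patients (from the abstract)
atripla_semester, saving_semester = 5340, 1344   # £ per patient-semester
atripla_year, saving_year = 9821, 1954           # £ per patient-year

# Implied Truvada®+EFV costs
truvada_semester = atripla_semester + saving_semester   # £6,684
truvada_year = atripla_year + saving_year               # £11,775

# Savings as a share of the comparator cost
pct_semester = 100 * saving_semester / truvada_semester  # ~20.1% (abstract: 20%)
pct_year = 100 * saving_year / truvada_year              # ~16.6% (abstract: 17%)
```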

15.

Background

Costs of tuberculosis diagnosis and treatment may represent a significant burden for the poor and for the health system in resource-poor countries.

Objectives

The aim of this study was to analyze patients' costs of tuberculosis care and to estimate the incremental cost-effectiveness ratio (ICER) of the directly observed treatment (DOT) strategy per completed treatment in Rio de Janeiro, Brazil.

Methods

We interviewed 218 adult patients with bacteriologically confirmed pulmonary tuberculosis. Information on direct costs (out-of-pocket expenses), indirect costs (hours lost), loss of income and costs of extra help was gathered through a questionnaire. Additional healthcare system costs due to supervision of pill intake were calculated from staff salaries. Effectiveness was measured by the treatment completion rate. The ICER of DOT compared to self-administered therapy (SAT) was calculated.

Principal Findings

DOT increased costs during the treatment phase, while SAT increased costs in the pre-diagnostic phase, for both the patient and the health system. Treatment completion rates were 71% in SAT facilities and 79% in DOT facilities. Costs per completed treatment were US$ 194 for patients and US$ 189 for the health system in SAT facilities, compared to US$ 336 and US$ 726 in DOT facilities. The ICER was US$ 6,616 per completed DOT treatment compared to SAT.

Conclusions

Costs incurred by TB patients are high in Rio de Janeiro, especially for those under DOT. The DOT strategy doubles patients' costs and increases the health system costs per completed treatment fourfold. These additional costs may be one of the factors contributing to completion rates below the 85% target recommended by WHO.

16.

Introduction

Antimalarial resistance has led to a global policy of artemisinin-based combination therapy. Despite growing resistance, chloroquine (CQ) remained until recently the official first-line treatment for falciparum malaria in Pakistan, with sulfadoxine-pyrimethamine (SP) as second-line. Co-treatment with the gametocytocidal primaquine (PQ) is recommended for transmission control in South Asia. The relative effects of artesunate (AS) or primaquine, as partner drugs, on clinical outcomes and gametocyte carriage in this setting were unknown.

Methods

A single-blinded, randomized trial among Afghan refugees in Pakistan compared six treatment arms: CQ; CQ+(single-dose)PQ; CQ+(3 d)AS; SP; SP+(single-dose)PQ; and SP+(3 d)AS. The objectives were to compare treatment failure rates and effects on gametocyte carriage of CQ or SP monotherapy against the respective combinations (with PQ or AS). Outcomes included trophozoite and gametocyte clearance (read by light microscopy), and clinical and parasitological failure.

Findings

A total of 308 (87%) patients completed the trial. Failure rates by day 28 were: CQ 55/68 (81%); CQ+AS 19/67 (28%); SP 4/41 (9.8%); SP+AS 1/41 (2.4%). The addition of PQ to CQ or SP did not affect failure rates (CQ+PQ 49/67 (73%) failed; SP+PQ 5/33 (16%) failed). AS was superior to PQ at clearing gametocytes; gametocytes were seen on day 7 in 85% of CQ, 40% of CQ+PQ, 21% of CQ+AS, 91% of SP, 76% of SP+PQ and 23% of SP+AS treated patients. PQ was more effective at clearing older gametocyte infections, whereas AS was more effective at preventing the emergence of mature gametocytes, except in cases that recrudesced.

Conclusions

CQ is no longer appropriate by itself or in combination. These findings influenced the replacement of CQ with SP+AS for first-line treatment of uncomplicated falciparum malaria in the WHO Eastern Mediterranean Region. The threat of SP resistance remains, as SP monotherapy is still common. Three-day AS was superior to single-dose PQ in reducing gametocyte carriage.

Trial Registration

ClinicalTrials.gov

17.

Background

HIV drug resistance (HIVDR) testing is not routinely available in many resource-limited settings, therefore antiretroviral therapy (ART) program and site factors known to be associated with emergence of HIVDR should be monitored to optimize the quality of patient care and minimize the emergence of preventable HIVDR.

Methods

In 2010, Namibia selected five World Health Organization Early Warning Indicators (EWIs) and scaled up monitoring from 9 to 33 ART sites: ART prescribing practices; patients lost to follow-up (LTFU) at 12 months; patients switched to a second-line regimen at 12 months; on-time antiretroviral (ARV) drug pick-up; and ARV drug-supply continuity.

Results

Records allowed reporting on three of the five selected EWIs. Twenty-two of 33 (67%) sites met the target of 100% of patients initiated on appropriate first-line regimens; 17 of 33 (52%) met the target of ≤20% LTFU; and 15 of 33 (45%) met the target of 0% switched to a second-line regimen.

Conclusions

EWI monitoring directly resulted in public health action that will optimize the quality of care, specifically the strengthening of ART record systems, engagement of ART sites, and operational research for improved adherence assessment and ART patient defaulter tracing.

18.

Background

In resource-limited settings where viral load (VL) monitoring is scarce or unavailable, clinicians must use immunological and clinical criteria to define HIV virological treatment failure. This study examined the performance of World Health Organization (WHO) clinical and immunological failure criteria in predicting virological failure in HIV patients receiving antiretroviral therapy (ART).

Methods

In an HIV/AIDS program at Busia District Hospital, Kenya, a retrospective, cross-sectional cohort analysis was performed in April 2008 for all adult patients (>18 years old) on ART for ≥12 months who were treatment-naive at ART start, had attended the clinic at least once in the last 6 months, and had given informed consent. Treatment failure was assessed per WHO clinical (disease stage 3 or 4) and immunological (CD4 cell count) criteria, and compared with virological failure (VL >5,000 copies/mL).

Results

Of 926 patients, 123 (13.3%) had clinically defined treatment failure, 53 (5.7%) had immunologically defined failure, and 55 (6.0%) had virological failure. The sensitivity, specificity, positive predictive value, and negative predictive value of the combined clinical and immunological criteria in predicting virological failure were 36.4%, 83.5%, 12.3%, and 95.4%, respectively.
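These four metrics come from a standard 2×2 table of predicted versus actual failure. A sketch using hypothetical counts back-calculated from the reported percentages (55 virological failures among 926 patients; the counts are illustrative, not taken from the paper):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 table, in percent."""
    return {
        "sensitivity": 100 * tp / (tp + fn),   # failures correctly flagged
        "specificity": 100 * tn / (tn + fp),   # non-failures correctly cleared
        "ppv": 100 * tp / (tp + fp),           # flagged patients truly failing
        "npv": 100 * tn / (tn + fn),           # cleared patients truly suppressed
    }

# Hypothetical counts consistent with the abstract's percentages
m = diagnostic_metrics(tp=20, fp=143, fn=35, tn=728)
# m["sensitivity"] ~ 36.4, m["ppv"] ~ 12.3, m["npv"] ~ 95.4,
# m["specificity"] ~ 83.6 (abstract: 83.5; the gap is rounding of the
# back-calculated counts)
```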

Conclusions

In this analysis, clinical and immunological criteria were found to perform relatively poorly in predicting virological failure of ART. VL monitoring and new algorithms for assessing clinical or immunological treatment failure, as well as improved adherence strategies, are required in ART programs in resource-limited settings.

19.

Aim

To describe the incidence of extensively drug-resistant tuberculosis (XDR-TB) reported in the Peruvian national multidrug-resistant tuberculosis (MDR-TB) registry over a period of more than ten years, and to present the treatment outcomes for a cohort of these patients.

Methods

From the Peruvian MDR-TB registry we extracted all entries approved for second-line anti-TB treatment between January 1997 and June 2007 that had drug susceptibility test (DST) results indicating resistance to both rifampicin and isoniazid (i.e. MDR-TB), in addition to results for at least one fluoroquinolone and one second-line injectable agent (amikacin, capreomycin or kanamycin).

Results

Of 1,989 confirmed MDR-TB cases with second-line DSTs, 119 (6.0%) XDR-TB cases were detected between January 1997 and June 2007. Lima and its metropolitan area accounted for 91% of cases, a distribution statistically similar to that of MDR-TB. A total of 43 XDR-TB cases were included in the cohort analysis, 37 of whom received an individualized treatment regimen (ITR). Of these, 17 (46%) were cured, 8 (22%) died and 11 (30%) either failed or defaulted treatment. Of the 14 XDR-TB patients diagnosed as such before ITR initiation, 10 (71%) were cured and the median conversion time was 2 months.

Conclusion

In the Peruvian context, with long experience in treating MDR-TB and a low HIV burden, a large proportion of XDR-TB patients can be cured if DST for second-line drugs is performed early and treatment is delivered according to WHO guidelines, although the overall cure rate in this cohort was poor.

20.

Objectives

Switching to second-line antiretroviral therapy (ART) largely depends on careful clinical assessment and access to biological measurements. We performed a systematic review and meta-analysis to estimate the incidence of switching to second-line ART in sub-Saharan Africa and its main programmatic determinants.

Methods

We searched 2 databases for studies reporting the incidence rate of switching to second-line ART in adults living in sub-Saharan Africa. Data on the incidence rate of switching were pooled, and random-effect models were used to evaluate the effect of factors measured at the programme level on this incidence rate.

Results

Nine studies (157,340 patients) in 21 countries were included in the meta-analysis. All studies considered patients on first-line ART, and the conditions for initiating ART were similar across studies. Overall, 3,736 (2.4%) patients switched to second-line ART. The mean incidence rate of switching was 2.65 per 100 person-years (PY) (95% confidence interval: 2.01–3.30), ranging from 0.42 to 4.88 per 100 PY in programmes with viral load monitoring and from 0 to 4.80 per 100 PY in programmes without. No factors measured at the programme level were associated with the incidence rate of switching to second-line ART.
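An incidence rate per 100 person-years is simply events divided by accumulated person-time, scaled by 100. A sketch (the person-time figure below is a back-of-envelope value implied by the pooled rate, not taken from the abstract):

```python
def incidence_per_100py(events, person_years):
    """Incidence rate per 100 person-years of follow-up."""
    return 100 * events / person_years

# 3,736 switches over an assumed ~141,000 person-years of follow-up
# gives a rate near the pooled mean of 2.65 per 100 PY reported above
rate = incidence_per_100py(3736, 141000)   # ~2.65
```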

Conclusion

The low incidence rate of switching to second-line ART suggests that monitoring of patients on ART is challenging and that access to second-line ART is inadequate; efforts should be made to extend access to second-line ART to those in need by providing monitoring tools, education and training, as well as more convenient regimens.

