Similar Articles (20 results)
1.

Objective

To describe patient combination antiretroviral therapy (cART) outcomes associated with intensive decentralization of services in a rural HIV program in Malawi.

Methods

Longitudinal analysis of data from HIV-infected patients starting cART between August 2001 and December 2008 and of a cross-sectional immunovirological assessment conducted 12 (±2) months after therapy start. One-year mortality, loss to follow-up, and attrition (deaths and losses to follow-up) rates were estimated with exact Poisson 95% confidence intervals (CI) by type of care delivery and year of initiation. The association of virological suppression (<50 copies/mL) and immunological success (CD4 gain ≥100 cells/µL) with type of care was investigated using multiple logistic regression.
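The exact Poisson confidence intervals referred to above are typically the Garwood chi-squared intervals. A minimal Python sketch of that calculation; the counts below are invented for illustration and are not the study's data:

```python
from scipy.stats import chi2

def exact_poisson_rate_ci(events, person_years, alpha=0.05):
    """Exact (Garwood) confidence interval for a rate, reported per 100 person-years."""
    lower = 0.0 if events == 0 else chi2.ppf(alpha / 2, 2 * events) / 2
    upper = chi2.ppf(1 - alpha / 2, 2 * (events + 1)) / 2
    scale = 100.0 / person_years
    return events * scale, lower * scale, upper * scale

# Hypothetical counts, for illustration only (not taken from the study):
rate, lo, hi = exact_poisson_rate_ci(events=230, person_years=2320.0)
print(f"{rate:.1f} per 100 person-years (95% CI {lo:.1f}-{hi:.1f})")
```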

Results

During the study period, 4,322 cART patients received centralized care and 11,090 decentralized care. At therapy start, patients treated in decentralized health facilities had higher median CD4 counts (167 vs. 130 cells/µL, P<0.0001) than other patients. Two years after cART start, program attrition was lower in decentralized than in centralized facilities (9.9 per 100 person-years, 95% CI: 9.5–10.4 vs. 20.8 per 100 person-years, 95% CI: 19.7–22.0). One year after treatment start, differences in immunological success (adjusted OR = 1.23, 95% CI: 0.83–1.83) and viral suppression (adjusted OR = 0.80, 95% CI: 0.56–1.14) between patients followed at centralized and decentralized facilities were not statistically significant.

Conclusions

In rural Malawi, 1- and 2-year program attrition was lower in decentralized than in centralized health facilities, and no statistically significant differences in one-year immunovirological outcomes were observed between the two health care levels. Longer follow-up is needed to confirm these results.

2.

Background

Life expectancy has increased for newly diagnosed HIV patients since the inception of combination antiretroviral treatment (cART), but there remains a need to better understand the characteristics of long-term survival in HIV-positive patients. We examined long-term survival in HIV-positive patients receiving cART in the Australian HIV Observational Database (AHOD), to describe changes in mortality compared to the general population and to develop longer-term survival models.

Methods

Data were examined from 2,675 HIV-positive participants in AHOD who started cART. Standardised mortality ratios (SMR) were calculated by age, sex and calendar year across prognostic characteristics, using Australian Bureau of Statistics national data as the reference. SMRs were examined by duration of cART, stratified by CD4 count and, separately, by viral load. Survival was analysed using Cox proportional hazards and parametric survival models.
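For reference, an SMR is the ratio of observed deaths to the deaths expected if stratum-specific reference rates (here by age, sex and calendar year, from Australian Bureau of Statistics data) applied to the cohort's person-time. A small sketch under that standard definition, with made-up strata and an exact Poisson interval on the observed count:

```python
from scipy.stats import chi2

def smr(observed_deaths, person_years_by_stratum, reference_rates_by_stratum, alpha=0.05):
    """Standardised mortality ratio: observed deaths divided by deaths expected
    under stratum-specific reference mortality rates."""
    expected = sum(py * rate for py, rate in
                   zip(person_years_by_stratum, reference_rates_by_stratum))
    point = observed_deaths / expected
    # Exact Poisson limits on the observed count, scaled by the expected count
    lower = 0.0 if observed_deaths == 0 else chi2.ppf(alpha / 2, 2 * observed_deaths) / (2 * expected)
    upper = chi2.ppf(1 - alpha / 2, 2 * (observed_deaths + 1)) / (2 * expected)
    return point, lower, upper

# Hypothetical strata (person-years, reference death rate per person-year):
print(smr(35, [4000, 6000, 2000], [0.001, 0.002, 0.004]))
```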

Results

The overall SMR for all-cause mortality was 3.5 (95% CI: 3.0–4.0). SMRs by CD4 count were 8.6 (95% CI: 7.2–10.2) for CD4 <350 cells/µL; 2.1 (95% CI: 1.5–2.9) for CD4 350–499 cells/µL; and 1.5 (95% CI: 1.1–2.0) for CD4 ≥500 cells/µL. SMRs for patients with CD4 counts <350 cells/µL were much higher than for patients with higher CD4 counts across all durations of cART, as were SMRs for patients with viral loads greater than 400 copies/mL. Multivariate models demonstrated improved survival associated with higher recent CD4 count, lower recent viral load, younger age, never having been HBsAg-positive, year of HIV diagnosis and incidence of AIDS-defining illness (ADI). Parametric models showed a fairly constant mortality risk by year of cART up to 15 years of treatment.

Conclusion

Observed mortality remained fairly constant by duration of cART and was modelled accurately by accepted prognostic factors. These rates did not vary much by duration of treatment. Changes in mortality with age were similar to those in the Australian general population.

3.

Background

Limited antiretroviral treatment regimens in resource-limited settings require long-term sustainability of patients on the few available options. We evaluated the incidence and predictors of combination antiretroviral treatment (cART) modification in an outpatient cohort of 955 patients who initiated cART between January 2009 and January 2011 in western Kenya.

Methods

cART modification was defined as either a first-time single-drug substitution or a regimen switch. Incidence rates were determined by Poisson regression, and risk factors were assessed using multivariate Cox regression modeling.

Results

Over a median follow-up period of 10.7 months, 178 (18.7%) patients modified regimens (incidence rate [IR]: 18.6 per 100 person-years [95% CI: 16.2–21.8]). Toxicity was the most commonly cited reason (66.3%). In the adjusted multivariate Cox piecewise regression model, WHO disease stage III/IV (aHR: 1.82, 95% CI: 1.25–2.66), stavudine (d4T) use (aHR: 2.21, 95% CI: 1.49–3.30) and increasing age (aHR: 1.02, 95% CI: 1.00–1.04) were associated with an increased risk of treatment modification within the first year post-cART. Zidovudine (AZT) and tenofovir (TDF) use were associated with a reduced risk of modification (aHR: 0.60, 95% CI: 0.38–0.96 and aHR: 0.51, 95% CI: 0.29–0.91, respectively). Beyond one year of treatment, d4T use (aHR: 2.75, 95% CI: 1.25–6.05), baseline CD4 count ≤350 cells/mm3 (aHR: 2.45, 95% CI: 1.14–5.26), increasing age (aHR: 1.05, 95% CI: 1.02–1.07) and high baseline weight >60 kg (aHR: 2.69, 95% CI: 1.58–4.59) were associated with risk of cART modification.

Conclusions

Early treatment initiation at higher CD4 counts and avoiding d4T use may reduce treatment modification and thereby improve the sustainability of patients on the limited available options.

4.

Study Background

Vitamin D has wide-ranging effects on the immune system, and studies suggest that low serum vitamin D levels are associated with worse clinical outcomes in HIV. Recent studies have identified an interaction between antiretrovirals used to treat HIV and reduced serum vitamin D levels, but these studies have been done in North American and European populations.

Methods

Using a prospective cohort study design nested in a multinational clinical trial, we examined the effect of three combination antiretroviral therapy (cART) regimens on serum vitamin D levels in 270 cART-naïve, HIV-infected adults in nine diverse countries (Brazil, Haiti, Peru, Thailand, India, Malawi, South Africa, Zimbabwe and the United States). We evaluated the change in serum vitamin D levels from baseline to 24 and 48 weeks after cART initiation.

Results

Serum vitamin D levels decreased significantly from baseline to 24 weeks among those randomized to efavirenz/lamivudine/zidovudine (mean change: −7.94 [95% confidence interval (CI): −10.42, −5.54] ng/mL) and efavirenz/emtricitabine/tenofovir-DF (mean change: −6.66 [95% CI: −9.40, −3.92] ng/mL) compared with those randomized to atazanavir/emtricitabine/didanosine-EC (mean change: −2.29 [95% CI: −4.83, 0.25] ng/mL). Vitamin D levels did not change significantly between weeks 24 and 48. Other factors that significantly affected the change in serum vitamin D included country (p<0.001), season (p<0.001) and baseline vitamin D level (p<0.001).

Conclusion

Efavirenz-containing cART regimens adversely affected vitamin D levels in patients from economically, geographically and racially diverse resource-limited settings. This effect was most pronounced early after cART initiation. Research is needed to define the role of vitamin D supplementation in HIV care.

5.

Background

Combination antiretroviral treatment (cART) has been very successful, especially among selected patients in clinical trials. The aim of this study was to describe outcomes of cART on the population level in a large national cohort.

Methods

Characteristics of participants of the Swiss HIV Cohort Study on stable cART at two semiannual visits in 2007 were analyzed with respect to era of treatment initiation, number of previously virologically failed regimens and self-reported adherence. Starting ART in the mono/dual era, before HIV-1 RNA assays became available, was counted as one failed regimen. Logistic regression was used to identify risk factors for virological failure between the two consecutive visits.
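The risk-factor analysis described here is a standard logistic regression with a binary outcome (detectable viral load at the second visit). A hedged sketch of such a model using statsmodels on simulated toy data; the variable names and data below are purely illustrative and are not taken from the Swiss HIV Cohort Study:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated toy data: outcome = detectable viral load at visit 2,
# predictors = number of previously failed regimens, missed doses last month.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "detectable": rng.integers(0, 2, 500),
    "prev_failures": rng.integers(0, 5, 500),
    "missed_doses": rng.integers(0, 4, 500),
})

model = smf.logit("detectable ~ C(prev_failures) + C(missed_doses)", data=df).fit(disp=0)
print(np.exp(model.params))      # odds ratios per category vs. the reference level
print(np.exp(model.conf_int()))  # 95% confidence intervals on the odds-ratio scale
```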

Results

Of 4541 patients, 31.2% and 68.8% had initiated therapy in the mono/dual and cART eras, respectively, and had been on treatment for a median of 11.7 vs. 5.7 years. At visit 1 in 2007, the mean number of previously failed regimens was 3.2 vs. 0.5, and the viral load was undetectable (<50 copies/mL) in 84.6% vs. 89.1% of participants, respectively. Adjusted odds ratios of a detectable viral load at visit 2 for participants from the mono/dual era with a history of 2, 3, 4 and >4 previous failures compared to 1 were 0.9 (95% CI 0.4–1.7), 0.8 (0.4–1.6), 1.6 (0.8–3.2) and 3.3 (1.7–6.6), respectively, and 2.3 (1.1–4.8) for >2 missed cART doses during the last month compared to perfect adherence. For the cART era, odds ratios with a history of 1, 2 and >2 previous failures compared to none were 1.8 (95% CI 1.3–2.5), 2.8 (1.7–4.5) and 7.8 (4.5–13.5), respectively, and 2.8 (1.6–4.8) for >2 missed cART doses during the last month compared to perfect adherence.

Conclusions

A higher number of previously virologically failed regimens and imperfect adherence to therapy were independent predictors of imminent virological failure.

6.
7.

Background

In 2012, the World Health Organization (WHO) amended its 2010 guidelines for women receiving limited-duration, triple-antiretroviral drug regimens during pregnancy and breastfeeding for prevention of mother-to-child transmission of HIV (tARV-PMTCT) (Option B) to include the option to continue lifelong combination antiretroviral therapy (cART) (Option B+). We evaluated clinical and CD4 outcomes in women who had received antiretrovirals for prevention of mother-to-child transmission and then discontinued antiretrovirals 6 months postpartum.

Methods and Findings

The Kisumu Breastfeeding Study, 2003–2009, was a prospective, non-randomized, open-label clinical trial of tARV-PMTCT in ARV-naïve Kenyan women. Women received tARV-PMTCT from 34 weeks' gestation until 6 months postpartum, when women were instructed to discontinue breastfeeding. Women with CD4 count (CD4) <250 cells/mm3 or WHO stage III/IV prior to 6 months postpartum continued cART indefinitely. We estimated the change in CD4 after discontinuing tARV-PMTCT and the adjusted relative risk [aRR] for factors associated with declines in maternal CD4. We compared maternal and infant outcomes following weaning (when tARV-PMTCT was discontinued) by maternal ARV status through 24 months postpartum. Compared with women who continued cART, discontinuing antiretrovirals was associated with infant HIV transmission and death (10.1% vs. 2.4%; P = 0.03). Among women who discontinued antiretrovirals, CD4 <500 cells/mm3 at initiation (21.8% vs. 1.5%; P = 0.002; aRR: 9.8; 95% confidence interval [CI]: 2.4–40.6) and at discontinuation (36.9% vs. 8.3%; P<0.0001; aRR: 4.4; 95% CI: 1.9–5.0) were each associated with an increased risk of the woman requiring cART for her own health within 6 months after discontinuing.

Conclusions

Considering the serious health risks to the woman's infant and the brief reprieve from cART gained by stopping, every country should evaluate the need for, and the feasibility of implementing, WHO Option B+ for PMTCT. Evaluating CD4 at antiretroviral initiation or at 6 months postpartum can identify pregnant women who would benefit most from continuing cART in settings unable to implement WHO Option B+.

8.

Background

Conducted in Wuhan, China, this study examined follow-up and health markers in HIV patients receiving care in two treatment settings. Participants, all men who have sex with men, were followed for 18–24 months.

Method

Patients in a “one-stop” service (ACC; N = 89) and those in standard care clinics (CDC; N = 243) were compared on HIV treatment and retention-in-care outcomes.

Results

Among patients with CD4 cell count ≤350 cells/µL, the proportion receiving cART did not differ across clinic groups. The ACC was favored across five other indicators: proportion receiving CD4 cell count tests at the six-month interval (98.2% vs. 79.4%, 95% CI 13.3–24.3, p<0.001), proportion with HIV suppression among patients receiving cART for 6 months (86.5% vs. 57.1%, 95% CI 14.1–44.7, p<0.001), proportion with CD4 cell recovery among patients receiving cART for 12 months (55.8% vs. 22.2%, 95% CI 18.5–48.6, p<0.001), median time from HIV confirmation to first CD4 cell count test (7 days, 95% CI 4–8 vs. 10 days, 95% CI 9–12, log-rank p<0.001) and median time from first CD4 cell count ≤350 cells/µL to cART initiation (26 days, 95% CI 16–37 vs. 41.5 days, 95% CI 35–46, log-rank p = 0.031). Clinic groups did not differ on any biomedical indicator at baseline, and no baseline biomedical or demographic variables remained significant in the multivariate analysis. Nonetheless, post-hoc analyses suggest the possibility of self-selection bias.

Conclusions

Study findings lend preliminary support to a one-stop patient-centered care model that may be useful across various HIV care settings.

9.

Background

In Kenya, detailed data on the age-specific burden of influenza and respiratory syncytial virus (RSV) are essential to inform the use of limited vaccination and treatment resources.

Methods

We analyzed surveillance data from August 2009 to July 2012 for hospitalized severe acute respiratory illness (SARI) and outpatient influenza-like illness (ILI) at two health facilities in western Kenya to estimate the burden of influenza and RSV. Incidence rates were estimated by dividing the number of cases with laboratory-confirmed virus infections by the mid-year population. Rates were adjusted for healthcare-seeking behavior and to account for patients who met the SARI/ILI case definitions but were not tested.
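A simplified sketch of that kind of rate calculation, scaling the confirmed cases up for untested patients and for ill persons who never sought care; all numbers, and the assumption that untested cases are positive at the same rate as tested ones, are illustrative only:

```python
def adjusted_incidence_per_1000(confirmed_cases, tested, case_definition_met,
                                midyear_population, prop_seeking_care):
    """Crude incidence scaled up for untested cases and for ill persons who
    never sought care (a simplified version of the adjustment described above)."""
    # Assume untested SARI/ILI cases are positive at the same rate as tested ones
    estimated_cases = confirmed_cases * case_definition_met / tested
    # Inflate for the fraction of ill persons who never attended the facility
    estimated_cases /= prop_seeking_care
    return 1000.0 * estimated_cases / midyear_population

# Hypothetical inputs, for illustration only:
print(adjusted_incidence_per_1000(confirmed_cases=120, tested=800,
                                  case_definition_met=1000,
                                  midyear_population=60000,
                                  prop_seeking_care=0.55))
```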

Results

The average annual incidence of influenza-associated SARI hospitalization per 1,000 persons was 2.7 (95% CI 1.8–3.9) among children <5 years and 0.3 (95% CI 0.2–0.4) among persons ≥5 years; for RSV-associated SARI hospitalization, it was 5.2 (95% CI 4.0–6.8) among children <5 years and 0.1 (95% CI 0.0–0.2) among persons ≥5 years. The incidence of influenza-associated medically-attended ILI per 1,000 was 24.0 (95% CI 16.6–34.7) among children <5 years and 3.8 (95% CI 2.6–5.7) among persons ≥5 years. The incidence of RSV-associated medically-attended ILI was 24.6 (95% CI 17.0–35.4) among children <5 years and 0.8 (95% CI 0.3–1.9) among persons ≥5 years.

Conclusions

Influenza and RSV both exact a substantial burden on children. This highlights the potential value of influenza vaccines, and of future RSV vaccines, for Kenyan children.

10.

Background

In Côte d'Ivoire during 2004–2007, the number of ART enrollees increased from <5,000 to 36,943. Trends in nationally representative ART program outcomes have not yet been reported.

Methodology/Principal Findings

We conducted a retrospective chart review to assess trends in patient characteristics and attrition [death or loss to follow-up (LTFU)] over time among a nationally representative sample of 3,682 adults (≥15 years) initiating ART during 2004–2007 at 34 health facilities. Among ART enrollees during 2004–2007, median age was 36, the proportion female was 67%, the proportion HIV-2-infected or dually HIV-1&2 reactive was 5%, and median baseline CD4+ T-cell (CD4) count was 135 cells/µL. Comparing cohorts initiating ART in 2004 with cohorts initiating ART in 2007, median baseline weight declined from 55 kg to 52 kg (p = 0.008) and the proportion weighing <45 kg increased from 17% to 22% (p = 0.014). During 2004–2007, pharmacy-based estimates of the percentage of new ART enrollees ≥95% adherent to ART declined from 74% to 60% (p = 0.026), and twelve-month retention declined from 86% to 69%, due to increases in 12-month mortality from 2% to 4% and in LTFU from 12% to 28%. In univariate analysis, year of ART initiation was associated with increasing rates of both LTFU and mortality. Controlling for baseline CD4, weight, adherence, and other risk factors, year of ART initiation was still strongly associated with LTFU but not mortality. In multivariate analysis, weight <45 kg and adherence <95% remained strong predictors of LTFU and mortality.

Conclusions

During 2004–2007, the increasing prevalence among ART enrollees of measured mortality risk factors, including weight <45 kg and ART adherence <95%, might explain increases in mortality over time. However, the association between later calendar year and increasing LTFU is not explained by the risk factors evaluated in this analysis. Undocumented transfers, political instability, and patient dissatisfaction with crowded facilities might explain the increasing LTFU.

11.

Background

In human immunodeficiency virus (HIV) treatment, adequate virological suppression is warranted; nevertheless, for some patients it remains a challenge. We investigated factors associated with low-level viraemia (LLV) and virological failure (VF) under combination antiretroviral therapy (cART).

Materials and Methods

We analysed patients receiving standard regimens between 1 July 2012 and 1 July 2013 with at least one viral load (VL) measurement below the quantification limit (BLQ) in their treatment history. After a minimum of 6 months of unmodified cART, the next single VL measurement within 6 months was analysed. VF was defined as HIV RNA ≥200 copies/mL; all other quantifiable measurements were classified as LLV. Factors associated with LLV and VF, compared with BLQ, were identified by logistic regression models.
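A literal restatement of those viral-load categories as a small helper; the 50 copies/mL quantification limit is an assumption for illustration, since the study relied on each assay's own limit:

```python
def classify_viral_load(copies_per_ml, quantification_limit=50):
    """Classify a single VL measurement using the definitions above.
    BLQ = below the assay quantification limit (assumed 50 copies/mL here),
    VF = >=200 copies/mL, LLV = quantifiable but below 200 copies/mL."""
    if copies_per_ml < quantification_limit:
        return "BLQ"
    if copies_per_ml >= 200:
        return "VF"
    return "LLV"

print([classify_viral_load(v) for v in (20, 120, 450)])  # ['BLQ', 'LLV', 'VF']
```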

Results

Of 2276 participants, 1972 (86.6%) were BLQ, 222 (9.8%) showed LLV and 82 (3.6%) had VF. A higher risk for LLV and VF was shown in patients with cART interruptions and in patients on boosted PI therapy. The risk for LLV and VF was lower in patients from centres using the Abbott rather than the Roche assay to measure VL. A higher risk for LLV but not for VF was found in patients with a higher VL before cART [for >99,999 copies/mL: aOR (95% CI): 4.19 (2.07–8.49); for 10,000–99,999 copies/mL: aOR (95% CI): 2.52 (1.23–5.19)] and shorter cART duration [for <9 months: aOR (95% CI): 2.59 (1.38–4.86)]. A higher risk for VF but not for LLV was found in younger patients [for <30 years: aOR (95% CI): 2.76 (1.03–7.35); for 30–50 years: aOR (95% CI): 2.70 (1.26–5.79)], people originating from high-prevalence countries [aOR (95% CI): 2.20 (1.09–4.42)] and male injecting drug users [aOR (95% CI): 2.72 (1.38–5.34)].

Conclusions

For both VF and LLV, factors associated with adherence play a prominent role. Furthermore, performance characteristics of the diagnostic assay used for VL quantification should also be taken into consideration.

12.

Objective

To evaluate the Fibrosis-4 (FIB-4) index as a predictor of major liver-related events (LRE) and liver-related death (LRD) in human immunodeficiency virus (HIV) type-1 patients initiating combination antiretroviral therapy (cART).

Design

Retrospective analysis of a prospective cohort study.

Setting

Italian HIV care centers participating in the ICONA Foundation cohort.

Participants

Treatment-naive patients enrolled in ICONA were selected who initiated cART, had hepatitis C virus (HCV) serology results, were HBsAg-negative, and had an available FIB-4 index at cART start and during follow-up.

Methods

Cox regression models were used to determine the association of FIB-4 with the risk of major LRE (gastrointestinal bleeding, ascites, hepatic encephalopathy, hepatorenal syndrome or hepatocellular carcinoma) or LRD.
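The FIB-4 index used here is conventionally computed as age × AST divided by platelet count × √ALT, and read against the same cut-offs reported in this study (<1.45, 1.45–3.25, >3.25). A minimal sketch with hypothetical patient values:

```python
import math

def fib4(age_years, ast_u_per_l, alt_u_per_l, platelets_10e9_per_l):
    """FIB-4 index: (age [years] x AST [U/L]) / (platelets [10^9/L] x sqrt(ALT [U/L])).
    Commonly read against the cut-offs <1.45, 1.45-3.25 and >3.25."""
    return (age_years * ast_u_per_l) / (platelets_10e9_per_l * math.sqrt(alt_u_per_l))

# Hypothetical patient values, for illustration only:
print(round(fib4(age_years=39, ast_u_per_l=48, alt_u_per_l=52, platelets_10e9_per_l=180), 2))
```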

Results

Three thousand four hundred seventy-five patients were enrolled: 73.3% were male and 27.2% were HCV-seropositive. At baseline (time of cART initiation), median age was 39 years, median CD4+ T-cell count was 260 cells/µL, and median HIV RNA was 4.9 log10 copies/mL; 65.9% had a FIB-4 <1.45, 26.4% 1.45–3.25 and 7.7% >3.25. Over a follow-up of 18,662 person-years, 41 events were observed: 25 major LRE and 16 LRD (incidence rate [IR] 2.2 per 1,000 PYFU [95% confidence interval, CI: 1.6–3.0]). The IR was higher in HCV-seropositive than in HCV-seronegative patients (5.9 vs. 0.5 per 1,000 PYFU). Higher baseline FIB-4 category compared with <1.45 (FIB-4 1.45–3.25: HR 3.55, 95% CI 1.09–11.58; FIB-4 >3.25: HR 4.25, 1.21–14.92) and time-updated FIB-4 (FIB-4 1.45–3.25: HR 3.40, 1.02–11.40; FIB-4 >3.25: HR 21.24, 6.75–66.84) were independently predictive of major LRE/LRD, after adjusting for HIV- and HCV-related variables, alcohol consumption and type of cART.

Conclusions

The FIB-4 index at cART initiation, and its change over time, are risk factors for major LRE or LRD independently of HCV infection, and could be used to monitor patients on cART.

13.

Background

Improving childhood tuberculosis (TB) evaluation and care is a global priority, but data on performance at community health centers in TB endemic regions are sparse.

Objective

To describe the current practices and quality of TB evaluation for children with cough ≥2 weeks' duration presenting to community health centers in Uganda.

Methods

Cross-sectional analysis of children (<15 years) receiving care for any reason at five Level IV community health centers in rural Uganda between 2009 and 2012. Quality of TB care was assessed using indicators derived from the International Standards of Tuberculosis Care (ISTC).

Results

From 2009–2012, 1713 of 187,601 (0.9%, 95% CI: 0.4–1.4%) children presenting to community health centers had cough ≥2 weeks' duration. Of those children, only 299 (17.5%, 95% CI: 15.7–19.3%) were referred for sputum microscopy, but 251 (84%, 95% CI: 79.8–88.1%) completed sputum examination if referred. The yield of sputum microscopy was only 3.6% (95% CI: 1.3–5.9%), and only 55.6% (95% CI: 21.2–86.3%) of children with acid-fast bacilli positive sputum were started on treatment. Children under age 5 were less likely to be referred for sputum examination and to receive care in accordance with ISTC. The proportion of children evaluated in accordance with ISTC increased over time (4.6% in 2009 to 27.9% in 2012, p = 0.03), though this did not result in increased case detection.

Conclusion

The quality of TB evaluation was poor for children with cough ≥2 weeks' duration presenting for health care. Referrals for sputum smear microscopy and linkage to TB treatment were key gaps in the TB evaluation process, especially for children under the age of five.

14.
15.

Background

With the increased availability of paediatric combination antiretroviral therapy (cART) in resource-limited settings, cART outcomes and the factors associated with them should be assessed.

Methods

HIV-infected children <15 years of age initiating cART in Kigali, Rwanda, were followed for 18 months. Prospective clinical and laboratory assessments included weight-for-age (WAZ) and height-for-age (HAZ) z-scores, complete blood cell count, liver transaminases, creatinine and lipid profiles, CD4 T-cell count/percentage, and plasma HIV-1 RNA concentration. Clinical success was defined as WAZ and HAZ >−2, immunological success as CD4 ≥500 cells/mm3 for children over 5 years and CD4 ≥25% for children under 5 years, and virological success as a plasma HIV-1 RNA concentration <40 copies/mL.
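These success definitions can be restated literally as a small helper; the handling of children exactly 5 years old is an assumption, since the abstract only distinguishes "over" and "under" 5 years:

```python
def treatment_success(age_years, waz, haz, cd4_count, cd4_percent, hiv_rna_copies_ml):
    """Apply the three success definitions above (a literal restatement,
    not the study's analysis code)."""
    clinical = waz > -2 and haz > -2
    immunological = cd4_count >= 500 if age_years >= 5 else cd4_percent >= 25
    virological = hiv_rna_copies_ml < 40
    return clinical, immunological, virological

# Hypothetical child: 7 years old, WAZ -1.2, HAZ -1.8, CD4 620 cells/mm3 (31%), VL 30 copies/mL
print(treatment_success(7, -1.2, -1.8, 620, 31, 30))
```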

Results

Between March 2008 and December 2009, 123 HIV-infected children were included. The median (interquartile range, IQR) age at cART initiation was 7.4 (3.2, 11.5) years; 40% were <5 years old and 54% were female. Mean (95% confidence interval, CI) HAZ and WAZ at baseline were −2.01 (−2.23, −1.80) and −1.73 (−1.95, −1.50), respectively, and rose to −1.75 (−1.98, −1.51) and −1.17 (−1.38, −0.96) after 12 months of cART. Median (IQR) CD4 T-cell values for children <5 and ≥5 years of age were 20% (13, 28) and 337 (236, 484) cells/mm3, respectively, and increased to 36% (28, 41) and 620 (375, 880) cells/mm3. After 12 months of cART, 24% of children had a detectable viral load, including 16% with virological failure (HIV RNA >1,000 copies/mL). Older age at cART initiation, poor adherence, and exposure to antiretrovirals around birth were associated with virological failure. A third (33%) of children had side effects (by self-report or clinical assessment), but only 9% experienced a severe side effect requiring a cART regimen change.

Conclusions

cART in Rwandan HIV-infected children was successful, but success might be improved further by initiating cART as early as possible, optimizing adherence and optimizing the management of side effects.

16.

Background

Reliable HIV incidence estimates for Mozambique are limited. We conducted a prospective HIV incidence study as part of a clinical research site development initiative in Chókwè district, Gaza Province, southern Mozambique.

Methods

Between June 2010 and October 2012, we recruited women at sites where women at higher risk of HIV infection were likely to be found. We enrolled and tested 1,429 sexually active women in the screening phase and 479 uninfected women in the prospective phase. Participants were scheduled for 12+ months of follow-up, during which they underwent face-to-face interviews, HIV counseling and testing, and pregnancy testing. We observed a total of 373.1 woman-years (WY) of follow-up, with a mean (median) of 9.4 (9.7) woman-months per participant.

Results

The prevalence of HIV was 29.4% (95% confidence interval [CI]: 27.0–31.8%). In multivariable logistic regression analysis, factors that remained significantly associated with prevalent HIV were older age (OR: 0.6; 95% CI: 0.4–0.7), lower educational level (OR: 0.4; 95% CI: 0.3–0.7), and use of hormonal contraception (OR: 0.6; 95% CI: 0.4–0.7) or condoms (OR: 0.5; 95% CI: 0.3–0.7). We observed an HIV incidence rate of 4.6 per 100 WY (95% CI: 2.7, 7.3): 4.8 per 100 WY (95% CI: 2.5, 8.3) in women aged 18–24 years, 4.5 per 100 WY (95% CI: 1.2, 11.4) in women aged 25–29 years and 3.2 per 100 WY (95% CI: 0.1, 18.0) in women aged 30–35 years. None of the demographic or time-varying behavioral factors examined was significantly associated with incident HIV infection in bivariable analysis at p≤0.10.

Conclusions

We found a high HIV incidence among sexually active young women in Chókwè, Mozambique. HIV prevention programs should be strengthened in the area, with more comprehensive reproductive health services, regular HIV testing, condom promotion, and messaging about multiple sexual partners.

17.

Background

Adolescents have been identified as a high-risk group for poor adherence to and defaulting from combination antiretroviral therapy (cART) care. However, data on outcomes for adolescents on cART in resource-limited settings remain scarce.

Methods

We conducted an observational study of patients who started cART at The AIDS Service Organization (TASO) in Uganda between 2004 and 2009. Age was stratified into three groups: children (≤10 years), adolescents (11–19 years), and adults (≥20 years). Kaplan-Meier survival curves were generated to describe time to mortality and loss to follow-up, and Cox regression was used to model associations between age and mortality and loss to follow-up. To address loss to follow-up, we applied a weighted analysis that assumed 50% of patients lost to follow-up had died.

Findings

A total of 23,367 patients were included in this analysis: 810 (3.5%) children, 575 (2.5%) adolescents, and 21,982 (94.0%) adults. A lower percentage of children (5.4%) died during cART treatment compared with adolescents (8.5%) and adults (10%). After adjusting for confounding, factors other than age predicted mortality: mortality was higher among males (p<0.001), patients with a low initial CD4 cell count (p<0.001), patients with advanced WHO clinical disease stage (p<0.001), and patients with a shorter duration of cART (p<0.001). The crude mortality rate was lower for children (22.8 per 1000 person-years; 95% CI: 16.1, 29.5) than for adolescents (36.5 per 1000 person-years; 95% CI: 26.3, 46.8) and adults (37.5 per 1000 person-years; 95% CI: 35.9, 39.1).

Interpretation

This study is the largest assessment of adolescents receiving cART in Africa. Adolescents did not have cART mortality outcomes different from those of adults or children.

18.

Background

Elevated serum phosphorus levels have been linked with cardiovascular disease and mortality with conflicting results, especially in the presence of normal renal function.

Methods

We studied the association between serum phosphorus levels and clinical outcomes in 1663 patients with acute myocardial infarction (AMI). Patients were categorized into 4 groups based on serum phosphorus levels (<2.50, 2.51–3.50, 3.51–4.50 and >4.50 mg/dL). Cox proportional-hazards models were used to examine the association between serum phosphorus and clinical outcomes after adjustment for potential confounders.

Results

The mean follow-up was 45 months. The lowest mortality occurred in patients with serum phosphorus between 2.50 and 3.50 mg/dL, with multivariable-adjusted hazard ratios of 1.24 (95% CI 0.85–1.80), 1.35 (95% CI 1.05–1.74), and 1.75 (95% CI 1.27–2.40) in patients with serum phosphorus <2.50, 3.51–4.50 and >4.50 mg/dL, respectively. Higher phosphorus levels were also associated with an increased risk of heart failure, but not with the risk of myocardial infarction or stroke. The effect of elevated phosphorus was more pronounced in patients with chronic kidney disease (CKD). The hazard ratio for mortality in patients with serum phosphorus >4.50 mg/dL compared with patients with serum phosphorus 2.50–3.50 mg/dL was 2.34 (95% CI 1.55–3.54) with CKD and 1.53 (95% CI 0.87–2.69) without CKD.

Conclusion

We found a graded, independent association between serum phosphorus and both all-cause mortality and heart failure in patients after AMI. The risk of mortality appears to increase with serum phosphorus even within the normal range and is more prominent in the presence of CKD.

19.

Background

Several epidemiological studies have addressed the later effects of anesthesia on neurodevelopment in children. However, the results remain inconclusive.

Methods

We conducted a systematic review and meta-analysis to summarize the currently available clinical and epidemiologic evidence on the association of anesthesia/surgery with neurodevelopmental outcomes in children, searching the PubMed, EMBASE, and Web of Science databases (January 1, 2000 to February 1, 2013). The evaluation of neurodevelopment included language and learning disabilities, cognition, behavioral development, and academic performance. Both retrospective and prospective studies were included. Data were abstracted from seven eligible studies. We estimated pooled hazard ratios (HR) and 95% confidence intervals (CI), with the pooling model chosen according to inter-study heterogeneity.
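Pooling hazard ratios under a random-effects model is commonly done with the DerSimonian-Laird estimator on the log scale; whether this study used exactly that estimator is not stated, so the sketch below is illustrative, with made-up study-level inputs:

```python
import numpy as np

def pooled_hr_random_effects(hrs, ci_lows, ci_highs):
    """DerSimonian-Laird random-effects pooling of hazard ratios.
    Study-level standard errors are recovered from the 95% CIs on the log scale."""
    y = np.log(hrs)
    se = (np.log(ci_highs) - np.log(ci_lows)) / (2 * 1.96)
    w = 1 / se**2                                    # fixed-effect weights
    q = np.sum(w * (y - np.sum(w * y) / np.sum(w))**2)
    df = len(y) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                    # between-study variance
    w_star = 1 / (se**2 + tau2)                      # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se_pooled = np.sqrt(1 / np.sum(w_star))
    return (np.exp(pooled),
            np.exp(pooled - 1.96 * se_pooled),
            np.exp(pooled + 1.96 * se_pooled))

# Hypothetical study-level HRs with 95% CIs, for illustration only:
print(pooled_hr_random_effects([1.1, 1.4, 1.2], [0.9, 1.1, 0.8], [1.35, 1.8, 1.8]))
```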

Results

The pooled HR for the association of anesthesia/surgery with an adverse behavioral or developmental outcome was 1.25 (95% CI, 1.13–1.38; P<0.001; random-effects model) in children undergoing their first anesthesia before the age of 4 years. We then analyzed factors underlying this association using meta-regression, which showed that the number of exposures (HR = 1.75, 95% CI 1.31–2.33; P<0.001), rather than the age at exposure before 4 years (HR = 1.08, 95% CI 0.87–1.34 per 1-year-earlier exposure; P = 0.47), was a risk factor for neurodevelopmental impairment.

Conclusion

The current clinical evidence suggests a modestly elevated risk of adverse neurodevelopmental outcomes in children exposed to anesthesia/surgery during early childhood, especially those with multiple exposures. Given the limitations of retrospective studies, prospective investigations are needed to determine whether the association with anesthesia/surgery is causal.

20.

Importance

The association between hospital volume and inpatient mortality for severe sepsis is unclear.

Objective

To assess the association between severe sepsis case volume and inpatient mortality.

Design, Setting, and Participants

Retrospective cohort study of 646,988 patient discharges with severe sepsis from 3,487 hospitals in the Nationwide Inpatient Sample, 2002 to 2011.

Exposures

The exposure of interest was the mean yearly sepsis case volume per hospital divided into tertiles.

Main Outcomes and Measures

Inpatient mortality.

Results

Compared with the highest tertile of severe sepsis volume (>60 cases per year), the odds ratio for inpatient mortality among persons admitted to hospitals in the lowest tertile (≤10 severe sepsis cases per year) was 1.188 (95% CI: 1.074–1.315), while the odds ratio for patients admitted to hospitals in the middle tertile was 1.090 (95% CI: 1.031–1.152). Similarly, improved survival was seen across the tertiles, with an adjusted inpatient mortality incidence of 35.81 (95% CI: 33.64–38.03) for hospitals with the lowest volume of severe sepsis cases, falling to 32.07 (95% CI: 31.51–32.64) for hospitals with the highest volume.

Conclusions and Relevance

We demonstrate an association between higher severe sepsis case volume and decreased inpatient mortality. A systems-based approach to improving outcomes may require a high volume of severely septic patients.
