Similar Articles
20 similar articles found (search time: 46 ms)
1.

Aim

Few studies have calculated the use, cost and cost-effectiveness of routine treatment and care before starting combination antiretroviral therapy (cART), or of cART started above and below a CD4 count of 200 cells/mm3. We aimed to calculate the use, cost and cost-effectiveness of care for people living with HIV (PLHIV) receiving routine treatment and care before starting cART, and for PLHIV starting first-line 2NRTIs+NNRTI or 2NRTIs+PIboosted, comparing PLHIV with CD4≤200 cells/mm3 and CD4>200 cells/mm3.

Methods

Use, costs and cost-effectiveness were calculated for PLHIV in routine pre-cART and starting first-line cART, comparing CD4≤200 cells/mm3 with CD4>200 cells/mm3 (2008 UK prices).

Results

cART-naïve patients with CD4≤200 cells/mm3 had an annual cost of £6,407 (95%CI £6,382 to £6,425) per person-year (PPY) compared with £2,758 (95%CI £2,752 to £2,761) PPY for those with CD4>200 cells/mm3; the cost per life-year gained of pre-cART treatment and care for those with CD4>200 cells/mm3 was £1,776 (cost-saving to £2,752). The annual cost of starting 2NRTIs+NNRTI or 2NRTIs+PIboosted with CD4≤200 cells/mm3 was £12,812 (95%CI £12,685–£12,937) compared with £10,478 (95%CI £10,376–£10,581) for PLHIV with CD4>200 cells/mm3. The cost per additional life-year gained on first-line therapy for those with CD4>200 cells/mm3 was £4,639 (£3,967 to £2,960).

Conclusion

Starting to use HIV services before the CD4 count falls to ≤200 cells/mm3 is cost-effective: it enables PLHIV to be monitored so that they start cART with CD4>200 cells/mm3, which results in better outcomes and is itself cost-effective. However, 25% of PLHIV accessing services continue to present with CD4≤200 cells/mm3. This highlights the need to investigate the cost-effectiveness of testing and early treatment programs for key populations in the UK.

2.

Background

In the USA, most HIV-1-infected children are on antiretroviral drug regimens, and many survive through adolescence and into adulthood. The course of HIV-1 infection in these children is variable and understudied.

Methodology/Principal Findings

We determined whether qualitative differences in immune cell subsets could explain a slower disease course in long-term survivors with no evidence of immune suppression (LTS-NS; CD4%≥25%) compared to those with severe immune suppression (LTS-SS; CD4%≤15%). Subjects in the LTS-NS group had significantly higher frequencies of naïve (CCR7+CD45RA+) and central memory (CCR7+CD45RA−) CD4+ T cells than LTS-SS subjects (p = 0.0005 and <0.0001, respectively). Subjects in the rapidly progressing group had significantly higher levels of CD4+ TEMRA (CCR7−CD45RA+) cells than slowly progressing subjects (p<0.0001).

Conclusions/Significance

Rapid disease progression in vertical infection is associated with significantly higher levels of CD4+ TEMRA (CCR7−CD45RA+) cells.

3.

Introduction

CYP19A1 encodes aromatase, the enzyme responsible for the conversion of androgens to estrogens, and may play a role in variation in outcomes among men and women with cardiovascular disease. We sought to examine genetic variation in CYP19A1 for its potential role in sex differences in cardiovascular disease outcomes.

Methods

Caucasian individuals from two independent populations were assessed: 1) a prospective cohort of patients with acute coronary syndromes (ACS) with 3-year mortality follow-up (n = 568) and 2) a nested case-control study from a randomized, controlled trial of hypertension patients with stable coronary disease, in which the primary outcome was death, nonfatal myocardial infarction (MI) or nonfatal stroke (n = 619). Six CYP19A1 SNPs were genotyped (-81371 C>T, -45965 G>C, M201T, R264C, 80 A>G, and +32226 G>A). The sex*genotype interaction term was assessed for the primary outcome and, where a significant interaction was identified, outcomes were compared by genotype separately in men and women.

Results

We identified a significant interaction between -81371 C>T and sex (p = 0.025) in the ACS population. The variant allele was associated with a 78% increase in mortality in men (HR 1.78, 95% confidence interval [CI] 1.08-2.94) and a nonsignificant 42% decrease in mortality among women (HR 0.58, 95% CI 0.22-1.54). We identified a similar association in the hypertensive CAD group: the -81371 C>T*sex interaction term was significant (p<0.0001), with an associated 65% increase in death, MI, or stroke (HR 1.65, 95% CI 1.00-2.73) in men and a 69% decrease (HR 0.31, 95% CI 0.16-0.60) in women.

Conclusions

Using two independent populations, this study is the first to document a significant interaction between CYP19A1 genotype and sex on cardiovascular outcomes. These findings could illuminate potential mechanisms of sex differences in cardiovascular disease outcomes.

4.

Introduction

Reduced left ventricular function in patients with severe symptomatic valvular aortic stenosis is associated with impaired clinical outcome after surgical aortic valve replacement (SAVR). Transcatheter aortic valve implantation (TAVI) has been shown to be non-inferior to SAVR in high-risk patients with respect to mortality and may result in faster left ventricular recovery.

Methods

We investigated clinical outcomes of high-risk patients with severe aortic stenosis undergoing medical treatment (n = 71) or TAVI (n = 256), stratified by left ventricular ejection fraction (LVEF), in a prospective single-center registry.

Results

Twenty-five patients (35%) among the medical cohort were found to have an LVEF≤30% (mean 26.7±4.1%) and 37 patients (14%) among the TAVI patients (mean 25.2±4.4%). Estimated peri-interventional risk as assessed by logistic EuroSCORE was significantly higher in patients with severely impaired LVEF as compared to patients with LVEF>30% (medical/TAVI 38.5±13.8%/40.6±16.4% versus medical/TAVI 22.5±10.8%/22.1±12.8%, p<0.001). In patients undergoing TAVI, there was no significant difference in the combined endpoint of death, myocardial infarction, major stroke, life-threatening bleeding, major access-site complications, valvular re-intervention, or renal failure at 30 days between the two groups (21.0% versus 27.0%, p = 0.40). After TAVI, patients with LVEF≤30% experienced a rapid improvement in LVEF (from 25±4% to 34±10% at discharge, p = 0.002) associated with improved NYHA functional class at 30 days (decrease ≥1 NYHA class in 95%). During long-term follow-up no difference in survival was observed in patients undergoing TAVI irrespective of baseline LVEF (p = 0.29), whereas there was a significantly higher mortality in medically treated patients with severely reduced LVEF (log rank p = 0.001).

Conclusion

TAVI in patients with severely reduced left ventricular function may be performed safely and is associated with rapid recovery of systolic left ventricular function and heart failure symptoms.

5.

Background

Cryptococcal infection is a frequent cause of mortality in Cambodian HIV-infected patients with CD4+ count ≤100 cells/µl. This study assessed the cost-effectiveness of three strategies for cryptococcosis prevention in HIV-infected patients.

Methods

A Markov decision tree was used to compare the following strategies at the time of HIV diagnosis: no intervention; one-time systematic serum cryptococcal antigen (CRAG) screening with treatment of positive patients; and systematic primary prophylaxis with fluconazole. The trajectory of a hypothetical cohort of HIV-infected patients with CD4+ count ≤100 cells/µl initiating care was simulated over a 1-year period (cotrimoxazole initiation at enrollment; antiretroviral therapy within 3 months). Natural history and cost data (2009 US$) were from Cambodia. Efficacy data were from the international literature.

Results

In a population in which 81% of patients had a CD4+ count ≤50 cells/µl and 19% a count between 51–100 cells/µl, the proportion alive 1 year after enrollment was 61% (cost $472) with no intervention, 70% (cost $483) with screening, and 72% (cost $492) with prophylaxis. After one year of follow-up, the cost-effectiveness of screening vs. no intervention was US$180 per life-year gained (LYG). The cost-effectiveness of prophylaxis vs. screening was $511/LYG, rising to an estimated $1,538/LYG if the proportion of patients with CD4+ count ≤50 cells/µl decreased by 75%.
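The dollars-per-life-year figures above follow the standard incremental cost-effectiveness ratio, ICER = Δcost / Δeffect. A minimal Python sketch using the per-patient costs from the abstract; the life-year values are hypothetical placeholders (the abstract reports 1-year survival proportions, not life-years), chosen only so the ratios land near the reported $180/LYG and $511/LYG:

```python
def icer(cost_new, cost_ref, effect_new, effect_ref):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of effect."""
    return (cost_new - cost_ref) / (effect_new - effect_ref)

# Per-patient costs (US$) from the abstract; life-years per patient are
# hypothetical placeholders, NOT values reported in the study.
cost_none, cost_screen, cost_prophy = 472, 483, 492
ly_none, ly_screen, ly_prophy = 0.8000, 0.8610, 0.8786

print(round(icer(cost_screen, cost_none, ly_screen, ly_none)))      # screening vs. none -> 180
print(round(icer(cost_prophy, cost_screen, ly_prophy, ly_screen)))  # prophylaxis vs. screening -> 511
```

The same formula with a smaller life-year gain for prophylaxis reproduces the sensitivity-analysis result: a shrinking effect difference drives the ICER up toward the $1,538/LYG estimate.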

Conclusion

In an area with high endemicity of cryptococcosis and HIV infection, serum CRAG screening and fluconazole prophylaxis are both cost-effective strategies to prevent AIDS-associated cryptococcosis in patients with CD4+ count ≤100 cells/µl over a short-term horizon, with screening more cost-effective but less effective than prophylaxis. Systematic primary prophylaxis may be preferred in patients with CD4+ counts below 50 cells/µl, while systematic serum CRAG screening for early targeted treatment may be preferred in patients with CD4+ counts between 51–100 cells/µl.

6.

Background

There are limited data on the clinical outcome of patients with pandemic H1N1 (pH1N1) pneumonia who received oseltamivir treatment, especially when treatment was administered more than 48 hours after symptom onset.

Methods

During the 2009 pandemic, a cohort of patients with pH1N1 influenza pneumonia was assembled in China; their clinical information was collected systematically and analyzed with Cox models.

Results

920 adults and 541 children with pneumonia who did not receive corticosteroids were analyzed. In-hospital mortality was higher in adults who did not receive antiviral therapy (18.2%) than in those who received oseltamivir ≤2 days (2.9%), between 2–5 days (4.6%), or >5 days after illness onset (4.9%), p<0.01. A similar trend was observed in pediatric patients. Cox regression showed that at 60 days after symptom onset, 11 patients (10.8%) who did not receive antivirals had died, versus 4 (1.8%), 18 (3.3%), and 23 (3.7%) patients whose oseltamivir treatment was started ≤2 days, between 2–5 days, and >5 days after onset, respectively. For male patients, patients aged ≥14 years, and those with baseline PaO2/FiO2<200, oseltamivir administration reduced the mortality risk by 92.1%, 88% and 83.5%, respectively. Higher doses of oseltamivir (>3.8 mg/kg/d) did not improve clinical outcome (mortality, higher dose 2.5% vs standard dose 2.8%, p>0.05).

Conclusions

Antiviral therapy might reduce mortality in patients with pH1N1 pneumonia, even when initiated more than 48 hours after illness onset. Protective effects might be greater in males, patients aged 14–60 years, and patients with PaO2/FiO2<200.

7.

Objective

To determine the optimal imaging strategy for intracerebral hemorrhage (ICH), incorporating CT angiography (CTA) or digital subtraction angiography (DSA), with and without a non-contrast CT (NCCT) risk stratification algorithm.

Methods

A Markov model incorporated costs, outcomes, the prevalence of a vascular lesion, and the sensitivity and specificity of a risk stratification algorithm, all drawn from the literature. The four imaging strategies were: (a) CTA screening of the entire cohort; (b) CTA only where NCCT suggested a high or indeterminate likelihood of a lesion; (c) DSA screening of the entire cohort; and (d) DSA only for those with a high or indeterminate suspicion of a lesion following NCCT. Branch (d) was the comparator.

Results

The age of the cohort and the probability of an underlying lesion influenced the choice of optimal imaging strategy. With a low suspicion for a lesion (<12%), branch (a) was the optimal strategy at a willingness-to-pay of $100,000/QALY. Branch (a) remained optimal in younger people (<35 years) with a risk below 15%. If the probability of a lesion was >15%, branch (b) became the preferred strategy. Probabilistic sensitivity analysis showed that branch (b) was the optimal choice 70–72% of the time over varying willingness-to-pay values.
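At a fixed willingness-to-pay (WTP) threshold, choosing among branches in a model like this amounts to maximizing net monetary benefit, NMB = QALYs × WTP − cost. A sketch with entirely hypothetical per-patient QALYs and costs (the study's real inputs come from the literature and are not reproduced here):

```python
def net_monetary_benefit(qalys, cost, wtp):
    """NMB = QALYs * willingness-to-pay - cost; the highest-NMB strategy wins."""
    return qalys * wtp - cost

# Hypothetical per-patient (QALYs, cost in US$) for the four branches.
strategies = {
    "a: CTA all": (10.20, 9_000),
    "b: CTA if NCCT high/indeterminate": (10.18, 8_200),
    "c: DSA all": (10.21, 15_000),
    "d: DSA if NCCT high/indeterminate": (10.19, 11_000),
}

wtp = 100_000  # $/QALY threshold used in the abstract
best = max(strategies, key=lambda s: net_monetary_benefit(*strategies[s], wtp))
print(best)
```

With these illustrative inputs the small QALY edge of universal CTA outweighs its extra cost at $100,000/QALY; shifting the inputs (e.g. raising lesion probability, which would raise branch (b)'s QALYs) shifts the winner, mirroring the threshold behavior described above.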

Conclusions

CTA has a clear role in the evaluation of people presenting with ICH, though the choice between CTA for everyone and CTA guided by risk stratification depends on age and the likelihood of finding a lesion.

8.

Objective

Pre-treatment with angiotensin receptor blockers is known to improve neurological outcome after stroke. This study investigated for the first time whether the renin inhibitor aliskiren has similar neuroprotective effects.

Methods

Since aliskiren specifically blocks human renin, double-transgenic rats expressing the human renin and angiotensinogen genes were used. To achieve a systolic blood pressure of 150 or 130 mmHg, animals were treated with aliskiren (7.5 or 12.5 mg/kg/d) or candesartan (1.5 or 10 mg/kg/d) via osmotic minipump starting five days before middle cerebral artery occlusion with reperfusion. Infarct size was determined by magnetic resonance imaging. mRNA of inflammatory marker genes was studied in different brain regions.

Results

The mortality of 33.3% (7 of 21 animals) in the vehicle group was reduced to below 10% by treatment with candesartan or aliskiren (p<0.05). Aliskiren-treated animals had a better neurological outcome 7 days post-ischemia, compared to candesartan (Garcia scale: 9.9±0.7 vs. 7.3±0.7; p<0.05). The reduction of infarct size in the aliskiren group did not reach statistical significance compared to candesartan and vehicle (24 h post-ischemia: 314±81 vs. 377±70 and 403±70 mm3 respectively). Only aliskiren was able to significantly reduce stroke-induced gene expression of CXC chemokine ligand 1, interleukin-6 and tumor necrosis factor-alpha in the ischemic core.

Conclusions

Head-to-head comparison suggests that treatment with aliskiren before and during cerebral ischemia is at least as effective as candesartan in double transgenic rats. The improved neurological outcome in the aliskiren group was blood pressure independent. Whether this effect is due to primary anti-inflammatory mechanisms has to be investigated further.

9.

Background

Diagnosis of childhood tuberculosis (TB) is difficult in high TB burden settings. Interferon-gamma-induced protein 10 (IP10) has been suggested as a marker of TB infection and disease, but its ability to differentiate the two conditions remains uncertain.

Objectives

To describe interferon-gamma (IFNγ) and IP10 expression in children with TB infection and disease and in controls, to assess their potential to differentiate latent from active TB.

Methods

This was a cross-sectional study of 322 children aged 1–15 years with symptoms of TB (28 confirmed, 136 probable and 131 unlikely TB), 335 children in contact with adults with pulmonary TB, and 156 community controls in Southern Ethiopia. The Tuberculin Skin Test (TST) and QuantiFERON In-Tube (QFT-IT) assay were performed. IFNγ and IP10 were measured in plasma supernatants.

Results and Interpretation

Children with confirmed and probable TB and contacts were more likely to be TST+ (78.6%, 59.3% and 54.1%, respectively) than children with unlikely TB (28.7%) and controls (12.8%) (p<0.001). Children with confirmed TB (59.3%) and contacts (44.7%) were more likely to be IFNγ+ than children with probable (37.6%) or unlikely TB (28.1%) and controls (13.1%) (p<0.001). IP10 concentrations were higher in IFNγ+ children independently of TST (p<0.001). There was no difference between the IP10 concentrations of children with confirmed TB and contacts (p = 0.8), or of children with and without HIV (p>0.1). IFNγ and IP10 can identify children with TB infection and disease, but cannot differentiate between the two conditions. HIV status did not affect the expression of IP10.

10.

Purpose

To assess whether tennis at prepubertal age elicits hypertrophy of the dominant arm muscles.

Methods

The volume of the muscles of both arms was determined using magnetic resonance imaging (MRI) in 7 male prepubertal tennis players (TP) and 7 non-active control subjects (CG) (mean age 11.0±0.8 years, Tanner 1–2).

Results

TP had 13% greater total muscle volume in the dominant than in the contralateral arm. The magnitude of inter-arm asymmetry was greater in TP than in CG (13 vs 3%, P<0.001). The dominant arm of TP was 16% larger than the dominant arm of CG (P<0.01), whilst the non-dominant arms had similar total muscle volumes in both groups (P = 0.25), after accounting for height as a covariate. In TP, the dominant deltoid (11%), forearm supinator (55%), forearm flexors (21%) and extensors (25%) were hypertrophied compared to the contralateral arm (P<0.05). In CG, the dominant supinator muscle was bigger than its contralateral counterpart (63%, P<0.05).
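The asymmetry percentages here express how much the dominant-arm muscle volume exceeds the non-dominant arm, relative to the non-dominant arm. A small illustrative Python sketch with hypothetical volumes (not the study's MRI data):

```python
def asymmetry_pct(dominant_cm3, nondominant_cm3):
    """Inter-arm asymmetry: % by which the dominant arm exceeds the non-dominant arm."""
    return 100 * (dominant_cm3 - nondominant_cm3) / nondominant_cm3

# Hypothetical total arm muscle volumes (cm^3) illustrating the ~13% vs ~3% contrast.
print(round(asymmetry_pct(1130, 1000)))  # tennis-player-like asymmetry -> 13
print(round(asymmetry_pct(1030, 1000)))  # control-like asymmetry -> 3
```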

Conclusions

Tennis at prepubertal age is associated with marked hypertrophy of the dominant arm, leading to a level of asymmetry (+13%) much greater than that observed in non-active controls (+3%). Tennis participation at prepubertal age is therefore associated with increased muscle volume in the dominant compared to the non-dominant arm, likely due to selective hypertrophy of the loaded muscles.

11.

Introduction

Antimalarial resistance has led to a global policy of artemisinin-based combination therapy. Despite growing resistance, chloroquine (CQ) remained until recently the official first-line treatment for falciparum malaria in Pakistan, with sulfadoxine-pyrimethamine (SP) as second-line. Co-treatment with the gametocytocidal primaquine (PQ) is recommended for transmission control in South Asia. The relative effects of artesunate (AS) or primaquine, as partner drugs, on clinical outcomes and gametocyte carriage in this setting were unknown.

Methods

A single-blinded, randomized trial among Afghan refugees in Pakistan compared six treatment arms: CQ; CQ+(single-dose)PQ; CQ+(3 d)AS; SP; SP+(single-dose)PQ; and SP+(3 d)AS. The objectives were to compare treatment failure rates and effects on gametocyte carriage of CQ or SP monotherapy against the respective combinations (with PQ or AS). Outcomes included trophozoite and gametocyte clearance (read by light microscopy), and clinical and parasitological failure.

Findings

A total of 308 (87%) patients completed the trial. Failure rates by day 28 were: CQ 55/68 (81%), CQ+AS 19/67 (28%), SP 4/41 (9.8%), and SP+AS 1/41 (2.4%). The addition of PQ to CQ or SP did not affect failure rates (CQ+PQ 49/67 (73%) failed; SP+PQ 5/33 (16%) failed). AS was superior to PQ at clearing gametocytes; gametocytes were seen on day 7 in 85% of CQ, 40% of CQ+PQ, 21% of CQ+AS, 91% of SP, 76% of SP+PQ and 23% of SP+AS treated patients. PQ was more effective at clearing older gametocyte infections, whereas AS was more effective at preventing the emergence of mature gametocytes, except in cases that recrudesced.

Conclusions

CQ is no longer appropriate by itself or in combination. These findings influenced the replacement of CQ with SP+AS for first-line treatment of uncomplicated falciparum malaria in the WHO Eastern Mediterranean Region. The threat of SP resistance remains, as SP monotherapy is still common. Three-day AS was superior to single-dose PQ for reducing gametocyte carriage.

Trial Registration

ClinicalTrials.gov

12.

Background

Oxaliplatin, a platinum-based chemotherapy utilised in the treatment of colorectal cancer, produces two forms of neurotoxicity: acute sensorimotor neuropathic symptoms and a dose-limiting chronic sensory neuropathy. Given that a Na+ channelopathy has been proposed as the mechanism underlying acute oxaliplatin-induced neuropathy, the present study aimed to determine the specific mechanisms of Na+ channel dysfunction.

Methodology/Principal Findings

Specifically, the function of transient and persistent Na+ currents was followed during treatment and investigated in relation to oxaliplatin dose level. Eighteen patients were assessed before and after a single oxaliplatin infusion, with motor and sensory axonal excitability studies performed on the median nerve at the wrist. While refractoriness (associated with Na+ channel inactivation) was significantly altered post-infusion in both motor (Pre: 31.7±6.4%; Post: 68.8±14.5%; P≤.001) and sensory axons (Pre: 31.4±5.4%; Post: 21.4±5.5%; P<.05), the strength-duration time constant (a marker of persistent Na+ conductances) was not significantly altered post-infusion (Motor Pre: 0.395±0.01 ms; Post: 0.394±0.02 ms; NS; Sensory Pre: 0.544±0.03 ms; Post: 0.535±0.05 ms; NS). However, changes in strength-duration time constant were significantly correlated with changes in refractoriness in motor and sensory axons (Motor correlation coefficient = −.65; P<.05; Sensory correlation coefficient = .67; P<.05).

Conclusions/Significance

It is concluded that the predominant effect of acute oxaliplatin exposure in human motor and sensory axons is mediated through changes in transient rather than persistent Na+ conductances. These findings are likely to have implications for the design and trial of neuroprotective strategies.

13.

Objective

To evaluate the predictive value of the Boston Acute Stroke Imaging Scale (BASIS) in acute ischemic stroke in a Chinese population.

Methods

This was a retrospective study. 566 patients with acute ischemic stroke were classified as having a major or minor stroke based on BASIS. We compared short-term outcome (death, occurrence of complications, admission to an intensive care unit [ICU] or neurological intensive care unit [NICU]), long-term outcome (death, recurrence of stroke, myocardial infarction, modified Rankin scale) and economic indices including in-hospital cost and length of hospitalization. Continuous variables were compared using the Student t test or Kruskal-Wallis test. Categorical variables were tested with the Chi-square test. Cox regression analysis was applied to determine whether BASIS was an independent predictor of death.

Results

During hospitalization, 9 patients (4.6%) died in the major stroke group while no patients died in the minor stroke group (p<0.001); 12 patients in the major stroke group and 5 patients in the minor stroke group were admitted to the ICU/NICU (p = 0.001). There were more complications (cerebral hernia, pneumonia, urinary tract infection) in the major stroke group than in the minor stroke group (p<0.05). The average cost of hospitalization was US$3,100 in the major stroke group and US$1,740 in the minor stroke group (p<0.001); the average length of stay was 21.3 days and 17.3 days, respectively (p<0.001). Follow-up showed that 52 patients (26.7%) died in the major stroke group while 56 patients (15.1%) died in the minor stroke group (P<0.001). 62.2% of patients in the major stroke group and 80.4% of patients in the minor stroke group were able to live independently (P = 0.002). Survival analysis showed that patients with major stroke had an 80% higher risk of death than patients with minor stroke, even after adjusting for traditional atherosclerotic factors and baseline NIHSS (HR = 1.8, 95% CI: 1.1–3.1).

Conclusion

BASIS can predict in-hospital mortality, occurrence of complications, length of stay and hospitalization cost of acute ischemic stroke patients, and can also estimate long-term outcome (death and dependency). BASIS could and should be used as a dichotomous stroke classification system in daily practice.

14.

Background

Accurate, inexpensive point-of-care CD4+ T cell testing technologies are needed that can deliver CD4+ T cell results at lower-level health centers or community outreach voluntary counseling and testing sites. We sought to evaluate a point-of-care CD4+ T cell counter, the Pima CD4 Test System, a portable, battery-operated bench-top instrument designed to use finger-stick blood samples, suitable for field use in conjunction with rapid HIV testing.

Methods

Duplicate measurements were performed on both capillary and venous samples using Pima CD4 analyzers and compared to the BD FACSCalibur (reference method). The mean bias was estimated by paired Student's t-test. Bland-Altman plots were used to assess agreement.

Results

206 participants were enrolled, with a median CD4 count of 396 cells/µL (range 18–1500). The finger-stick Pima had a mean bias of −66.3 cells/µL (95%CI −83.4 to −49.2, P<0.001) compared to the FACSCalibur; the bias was smaller at lower CD4 counts (0–250 cells/µL), with a mean bias of −10.8 (95%CI −27.3 to +5.6, P = 0.198), and much greater at higher CD4 counts (>500 cells/µL), with a mean bias of −120.6 (95%CI −162.8 to −78.4, P<0.001). The sensitivity (95%CI) of the Pima CD4 analyzer was 96.3% (79.1–99.8%) at a <250 cells/µL cut-off, with a negative predictive value of 99.2% (95.1–99.9%).
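The agreement and screening metrics quoted here reduce to simple formulas: the Bland-Altman bias is the mean of the paired differences (test − reference), sensitivity is TP/(TP+FN), and negative predictive value is TN/(TN+FN). A Python sketch with hypothetical paired counts and a hypothetical 2×2 table (chosen only to land near the abstract's 96.3% and 99.2%; these are not the study's data):

```python
from statistics import mean

def mean_bias(test_counts, reference_counts):
    """Mean paired difference (test - reference): the Bland-Altman bias."""
    return mean(t - r for t, r in zip(test_counts, reference_counts))

def sensitivity_npv(tp, fn, tn):
    """Sensitivity = TP/(TP+FN); NPV = TN/(TN+FN) at a chosen cut-off."""
    return tp / (tp + fn), tn / (tn + fn)

# Hypothetical paired CD4 counts (cells/uL): the test device reads low.
pima = [180, 350, 420, 600, 780]
facs = [190, 370, 500, 680, 900]
print(mean_bias(pima, facs))  # negative value => device under-reads

# Hypothetical counts at a <250 cells/uL eligibility threshold.
sens, npv = sensitivity_npv(tp=26, fn=1, tn=120)
print(round(sens, 3), round(npv, 3))
```

A high NPV at the treatment-eligibility cut-off is what makes the device useful for ruling patients out of immediate ART despite the absolute bias at high counts.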

Conclusions

The Pima CD4 finger-stick test is an easy-to-use, portable, relatively fast device for testing CD4+ T cell counts in the field. Negatively biased CD4 cell counts, especially at higher absolute numbers, will limit its utility for monitoring longitudinal immunologic response to ART. The high sensitivity and negative predictive value of the test make it an attractive option for field use to identify patients eligible for ART, potentially reducing delays in linkage to care and ART initiation.

15.

Background

Prospective cohort studies have shown that high fruit and vegetable consumption is inversely associated with coronary heart disease (CHD). Whether food processing affects this association is unknown. Therefore, we quantified the association of fruit and vegetable consumption with 10-year CHD incidence in a population-based study in the Netherlands and the effect of processing on these associations.

Methods

Prospective population-based cohort study, including 20,069 men and women aged 20 to 65 years, enrolled between 1993 and 1997 and free of cardiovascular disease at baseline. Diet was assessed using a validated 178-item food frequency questionnaire. Hazard ratios (HR) were calculated for CHD incidence using multivariable Cox proportional hazards models.

Results

During a mean follow-up of 10.5 years, 245 incident cases of CHD were documented, comprising 211 non-fatal acute myocardial infarctions and 34 fatal CHD events. The risk of incident CHD was 34% lower for participants with a high intake of total fruit and vegetables (>475 g/d; HR: 0.66; 95% CI: 0.45–0.99) compared to participants with a low total fruit and vegetable consumption (≤241 g/d). Intake of raw fruit and vegetables (>262 g/d vs ≤92 g/d; HR: 0.70; 95% CI: 0.47–1.04) as well as processed fruit and vegetables (>234 g/d vs ≤113 g/d; HR: 0.79; 95% CI: 0.54–1.16) was inversely related to CHD incidence.

Conclusion

Higher consumption of fruit and vegetables, whether consumed raw or processed, may protect against CHD incidence.

16.

Background

Limited knowledge exists on early HIV events that may inform preventive and therapeutic strategies. This study aimed to characterize the earliest immunologic and virologic events following HIV infection and to investigate a novel therapeutic strategy.

Methods and Findings

We prospectively screened 24,430 subjects in Bangkok and identified 40 individuals with acute HIV infection (AHI). Thirty Thais were enrolled (8 Fiebig I, 5 Fiebig II, 15 Fiebig III, 2 Fiebig IV), of whom 15 completed 24 weeks of megaHAART (tenofovir/emtricitabine/efavirenz/raltegravir/maraviroc). Sigmoid biopsies were completed in 24/30 at baseline and 13/15 at week 24. At baseline, the median age was 29 years and 83% were MSM. Most were symptomatic (87%) and were infected with R5-tropic (77%) CRF01_AE (70%). Median CD4 was 406 cells/mm3. HIV RNA was 5.5 log10 copies/ml. Median total blood HIV DNA was higher in Fiebig III (550 copies/106 PBMC) vs. Fiebig I (8 copies/106 PBMC) (p = 0.01), while the median %CD4+CCR5+ gut T cells was lower in Fiebig III (19%) vs. Fiebig I (59%) (p = 0.0008). After 24 weeks of megaHAART, HIV RNA levels of <50 copies were achieved in 14/15 in blood and 13/13 in gut. Total blood HIV DNA at week 0 predicted reservoir size at week 24 (p<0.001). Total HIV DNA declined significantly and was undetectable in 3 of 15 in blood and 3 of 7 in gut. The frequency of CD4+CCR5+ gut T cells increased from 41% at baseline to 64% at week 24 (p>0.050); subjects with less than 40% at baseline had a significant increase in CD4+CCR5+ T cells from baseline to week 24 (14% vs. 71%, p = 0.02).

Conclusions

Gut T cell depletion and HIV reservoir seeding increase with progression of AHI. MegaHAART was associated with immune restoration and reduced reservoir size. Our findings could inform research on strategies to achieve HIV drug-free remission.

17.

Background

The long-term efficacy of raltegravir (RAL)-containing regimens in highly pre-treated HIV-1-infected patients has been demonstrated in registration trials. However, few studies have assessed durability in routine clinical settings.

Methods

Antiretroviral treatment-experienced patients initiating a RAL-containing salvage regimen were enrolled. Routine clinical and laboratory follow-up was performed at baseline, weeks 4 and 12, and every 12 weeks thereafter. Data were censored at week 96.

Results

Out of 320 patients enrolled, 292 (91.25%) maintained their initial regimen for 96 weeks; 28 discontinued prematurely for various reasons: death (11), viral failure (8), adverse events (5), loss to follow-up (3), consent withdrawal (1). Eight of these 28 subjects maintained RAL but changed the accompanying drugs. The mean CD4+ T-cell increase at week 96 was 227/mm3; 273 of the 300 patients (91%) still receiving RAL at week 96 achieved viral suppression (HIV-1 RNA <50 copies/mL). When the immuno-virologic outcome was analyzed by the number of drugs in the regimen, 2 (n = 45), 3 (n = 111), 4 (n = 124), or >4 (n = 40), CD4+ T-cell gain was similar across strata: +270, +214, +216, and +240 cells/mm3, respectively, as was the proportion of subjects with undetectable viral load. Laboratory abnormalities (elevation of liver enzymes, total cholesterol and triglycerides) were rare, ranging from 0.9 to 3.1%. The mean 96-week total cholesterol increase was 23.6 mg/dL.

Conclusions

In a routine clinical setting, a RAL-based regimen allowed most patients on salvage therapy to achieve optimal viral suppression for at least 96 weeks, with a meaningful immunologic gain and very few adverse events.

18.
19.

Background

Type I interferons play important roles in innate immune defense. In HIV infection, type I interferons may delay disease progression by inhibiting viral replication while at the same time accelerating disease progression by contributing to chronic immune activation.

Methods

To investigate the effects of type I interferons in HIV-infection, we obtained cryopreserved peripheral blood mononuclear cell samples from 10 subjects who participated in AIDS Clinical Trials Group Study 5192, a trial investigating the activity of systemic administration of IFNα for twelve weeks to patients with untreated HIV infection. Using flow cytometry, we examined changes in cell cycle status and expression of activation antigens by circulating T cells and their maturation subsets before, during and after IFNα treatment.

Results

The proportion of CD38+HLA-DR+CD8+ T cells increased from a mean of 11.7% at baseline to 24.1% after twelve weeks of interferon treatment (p = 0.006). These frequencies dropped to an average of 20.1% six weeks after the end of treatment. In contrast to CD8+ T cells, the frequencies of activated CD4+ T cells did not change with administration of type I interferon (mean percentage of CD38+HLA-DR+ cells = 2.62% at baseline and 2.17% after 12 weeks of interferon therapy). The fall in plasma HIV levels with interferon therapy was correlated with a “paradoxical” increase in CD8+ T cell activation (p<0.001).

Conclusion

Administration of type I interferon increased expression of the activation markers CD38 and HLA-DR on CD8+ T cells, but not on CD4+ T cells, of HIV+ persons. These observations suggest that type I interferons may contribute to the high levels of CD8+ T cell activation that occur during HIV infection.

20.

Objective

It may be possible to thrombolyse ischaemic stroke (IS) patients up to 6 h after onset by using penumbral imaging. We investigated whether a perfusion CT (CTP) mismatch can help to select patients for thrombolysis up to 6 h.

Methods

A cohort of 254 thrombolysed IS patients was studied: 174 (69%) were thrombolysed at 0–3 h using non-contrast CT (NCCT), and 80 (31%) at 3–6 h (35 at 3–4.5 h and 45 at 4.5–6 h) using CTP mismatch criteria. Symptomatic intracerebral haemorrhage (SICH), mortality and the modified Rankin Score (mRS) were assessed at 3 months. Independent determinants of outcome in patients thrombolysed between 3 and 6 h were identified.

Results

The baseline characteristics were comparable in the two groups. There were no differences in SICH (3% v 4%, p = 0.71), any ICH (7% v 9%, p = 0.61), mortality (16% v 9%, p = 0.15) or mRS 0–2 at 3 months (55% v 54%, p = 0.96) between patients thrombolysed at 0–3 h (NCCT only) and at 3–6 h (CTP mismatch). There were no significant differences in outcome between patients thrombolysed at 3–4.5 h and at 4.5–6 h. The NIHSS score was the only independent determinant of an mRS of 0–2 at 3 months (OR 0.89, 95% CI 0.82–0.97, p = 0.007) in patients treated using CTP mismatch criteria beyond 3 h.

Conclusions

The use of a CTP mismatch model may help to guide thrombolysis decisions up to 6 h after IS onset.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)