Similar Literature
20 similar records retrieved (search time: 31 ms)
1.

Purpose

Improve the ability to infer sex behaviors more accurately using network data.

Methods

A hybrid network analytic approach was used to integrate: (1) the plurality of reports from others tied to the individual(s) of interest; and (2) structural features of the network generated from those ties. Network data were generated from digitally extracted cell-phone contact lists of a purposeful sample of 241 high-risk men in India. These data were integrated with interview responses describing the corresponding individuals in the contact lists and the ties between them. HIV serostatus was collected for each respondent and served as an internal validation of the model’s predictions of sex behavior.
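The abstract does not give implementation details; below is a minimal sketch of this kind of hybrid feature construction, assuming a hypothetical edge list, a per-contact report table, and a small set of structural features (degree, betweenness, clustering). All of these are illustrative choices, not the authors' method.

```python
import networkx as nx
from collections import Counter

# Hypothetical tie list extracted from phone contact lists (not the study data)
edges = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D")]
# reports[target] = behaviour labels reported by others tied to `target`
reports = {"A": ["high_risk", "high_risk", "low_risk"], "D": ["low_risk"]}

G = nx.Graph(edges)
betweenness = nx.betweenness_centrality(G)

def plurality_report(target):
    """Most common report about `target` from the people tied to them, if any."""
    labels = reports.get(target, [])
    return Counter(labels).most_common(1)[0][0] if labels else None

for node in G.nodes:
    features = {
        "degree": G.degree(node),
        "betweenness": betweenness[node],
        "clustering": nx.clustering(G, node),
    }
    print(node, plurality_report(node), features)
```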

Results

We found that network-based model predictions of sex behavior and self-reported sex behavior had limited correlation (54% agreement). Additionally, when respondent sex behaviors were re-classified from self-reported data to the network-model predictions, HIV seroprevalence among the groups of men with lower-risk behavior decreased by 30.7%, which is consistent with HIV transmission biology.

Conclusion

Combining the relative completeness and objectivity of digital network data with the substantive details of classical interview and HIV biomarker data permitted new analyses and insights into the accuracy of self-reported sex behavior.

2.
Abstract Objective: To investigate the Western blot (WB) band patterns and epidemiological characteristics of samples with indeterminate HIV antibody results and of those that subsequently seroconverted during follow-up. Methods: WB band and epidemiological data were analyzed for 790 samples with indeterminate HIV antibody results and for 71 individuals who tested positive during subsequent follow-up, identified through surveillance in Taiyuan between 2017 and 2022. Results: (1) Indeterminate HIV antibody samples came mainly from hospitals, blood centers, and voluntary counseling and testing (VCT) facilities (84.47%); seroconverted samples came mainly from hospitals and VCT facilities (88.73%). Within the indeterminate group, STD clinic attendees and VCT clients had the highest seroconversion rates, at 50% and 25.49% respectively. (2) Both indeterminate and seroconverted samples were predominantly from individuals aged 20-40 years (53.8%); within the indeterminate group, those aged 20-30 years had the highest seroconversion rate (15.57%). (3) Indeterminate samples showed no marked differences by education level or sex, whereas among seroconverted samples the proportion of men far exceeded that of women and highly educated individuals were relatively over-represented. (4) Indeterminate samples were mainly from married, employed individuals without high-risk behaviors, whereas seroconverted samples were mainly from individuals who were unmarried, divorced, or widowed, had an HIV-positive spouse, were men who have sex with men (MSM), or were students, unemployed, or seeking work. (5) Among indeterminate samples, the P24 and P17 bands and their corresponding band patterns predominated; among seroconverted samples, the P24 and P17 bands predominated, with band patterns containing both P24 and gP160 being the most common. Conclusion: Indeterminate samples from STD clinic attendees, VCT clients, MSM, individuals aged 20-30 years, and those who are unmarried or unemployed, as well as samples showing P24 or gP160 bands, have a higher probability of subsequent seroconversion. Prevention and control efforts should focus on these groups, with strengthened case tracking and follow-up testing of indeterminate samples with these characteristics.

3.
Serological studies are the gold standard method to estimate influenza infection attack rates (ARs) in human populations. In a common protocol, blood samples are collected before and after the epidemic in a cohort of individuals, and a rise in haemagglutination-inhibition (HI) antibody titers during the epidemic is considered a marker of infection. Because of inherent measurement errors, a 2-fold rise is usually considered insufficient evidence for infection, and seroconversion is therefore typically defined as a 4-fold rise or more. Here, we revisit this widely accepted 70-year-old criterion. We develop a Markov chain Monte Carlo data augmentation model to quantify measurement errors and reconstruct the distribution of latent true serological status in a Vietnamese 3-year serological cohort in which replicate measurements were available. We estimate that the 1-sided probability of a 2-fold error is 9.3% (95% Credible Interval, CI: 3.3%, 17.6%) when the antibody titer is below 10 but is 20.2% (95% CI: 15.9%, 24.0%) otherwise. After correction for measurement errors, we find that the proportion of individuals with 2-fold rises in antibody titers was too large to be explained by measurement errors alone. Estimates of ARs vary greatly depending on whether those individuals are included in the definition of the infected population. A simulation study shows that our method is unbiased. The 4-fold rise case definition is relevant when aiming at a specific diagnostic for individual cases, but the justification is less obvious when the objective is to estimate ARs. In particular, it may lead to large underestimates of ARs. Determining which biological phenomenon contributes most to 2-fold rises in antibody titers is essential to assess bias with the traditional case definition and offer improved estimates of influenza ARs.
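To see why the 4-fold threshold is the traditional compromise, the following small simulation (a sketch, not the paper's MCMC data augmentation model) shows how often pure one-dilution (2-fold) measurement errors would produce apparent 2-fold and 4-fold rises in someone who was never infected; the error probability used is illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
p_err = 0.20          # assumed 1-sided probability of a 2-fold (one-dilution) error

def measure(true_log2_titer):
    """One HI measurement: the true titer plus a possible +/- one-dilution error."""
    u = rng.random(true_log2_titer.shape)
    err = np.where(u < p_err, -1, np.where(u > 1 - p_err, 1, 0))
    return true_log2_titer + err

true = np.full(n, 5.0)                 # constant true titer (no infection)
rise = measure(true) - measure(true)   # post - pre, in log2 units

print("P(apparent >=2-fold rise):", np.mean(rise >= 1))  # one error is enough
print("P(apparent >=4-fold rise):", np.mean(rise >= 2))  # needs two opposite errors
```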

4.
The data on risk of mortality from cardiovascular disease due to radiation exposure at low or medium doses are inconsistent. This paper reports an analysis of the Semipalatinsk historical cohort exposed to radioactive fallout from nuclear testing in the vicinity of the Semipalatinsk Nuclear Test Site, Kazakhstan. The cohort study, which includes 19,545 persons of exposed and comparison villages in the Semipalatinsk region, had been set up in the 1960s and comprises 582,656 person-years of follow-up between 1960 and 1999. A dosimetric approach developed by the U.S. National Cancer Institute (NCI) has been used. Radiation dose estimates in this cohort range from 0 to 630 mGy (whole-body external). Overall, the exposed population showed a high mortality from cardiovascular disease. Rates of mortality from cardiovascular disease in the exposed group substantially exceeded those of the comparison group. Dose-response analyses were conducted for both the entire cohort and the exposed group only. A dose-response relationship that was found when analyzing the entire cohort could be explained completely by differences between the baseline rates in exposed and unexposed groups. When taking this difference into account, no statistically significant dose-response relationship for all cardiovascular disease, for heart disease, or for stroke was found. Our results suggest that within this population and at the level of doses estimated, there is no detectable risk of radiation-related mortality from cardiovascular disease.

5.
Recurrent episodes of tuberculosis (TB) can be due to relapse of latent infection or exogenous reinfection, and discrimination is crucial for control planning. Molecular genotyping of Mycobacterium tuberculosis isolates offers concrete opportunities to measure the relative contribution of reinfection in recurrent disease. Here, a mathematical model of TB transmission is fitted to data from 14 molecular epidemiology studies, enabling the estimation of relevant epidemiological parameters. Meta-analysis reveals that rates of reinfection after successful treatment are higher than rates of new TB, raising an important question about the underlying mechanism. We formulate two alternative mechanisms within our model framework: (i) infection increases susceptibility to reinfection or (ii) infection affects individuals differentially, thereby recruiting high-risk individuals to the group at risk for reinfection. The second mechanism is better supported by the fits to the data, suggesting that reinfection rates are inflated through a population phenomenon that occurs in the presence of heterogeneity in individual risk of infection. As a result, rates of reinfection are higher when measured at the population level even though they might be lower at the individual level. Finally, differential host recruitment is modulated by transmission intensity, being less pronounced when incidence is high.

6.

Background

Efforts to monitor malaria transmission increasingly use cross-sectional surveys to estimate transmission intensity from seroprevalence data using malarial antibodies. To date, seroconversion rates estimated from cross-sectional surveys have not been compared to rates estimated in prospective cohorts. Our objective was to compare seroconversion rates estimated in a prospective cohort with those from a cross-sectional survey in a low-transmission population.

Methods and Findings

The analysis included two studies from Haiti: a prospective cohort of 142 children ages ≤11 years followed for up to 9 years, and a concurrent cross-sectional survey of 383 individuals ages 0–90 years. From all individuals, we analyzed 1,154 blood spot specimens for antibodies to the malaria antigen MSP-1(19) using a multiplex bead antigen assay. We classified individuals as positive for malaria using a cutoff derived from the mean plus 3 standard deviations of antibody responses from a negative control set of unexposed individuals. We estimated prospective seroconversion rates from the longitudinal cohort based on 13 incident seroconversions among 646 person-years at risk. We also estimated seroconversion rates from the cross-sectional survey using a reversible catalytic model fit by maximum likelihood. We found the two approaches provided consistent results: the seroconversion rate for ages ≤11 years was 0.020 (0.010, 0.032) per year estimated prospectively versus 0.023 (0.001, 0.052) per year in the cross-sectional survey.
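For readers unfamiliar with the reversible catalytic model, the sketch below illustrates the general idea under simple assumptions: seropositivity at age a is P(a) = λ/(λ+ρ) · (1 − exp(−(λ+ρ)·a)), with seroconversion rate λ and seroreversion rate ρ estimated by maximum likelihood. The age and serostatus values are hypothetical; this is not the study's code.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical cross-sectional data: age in years and serostatus (1 = seropositive)
ages = np.array([1, 3, 5, 8, 12, 20, 35, 50, 70], dtype=float)
pos  = np.array([0, 0, 1, 0, 1, 0, 1, 1, 1])

def neg_log_lik(params):
    lam, rho = np.exp(params)          # work on the log scale to keep rates positive
    p = lam / (lam + rho) * (1 - np.exp(-(lam + rho) * ages))
    p = np.clip(p, 1e-9, 1 - 1e-9)
    return -np.sum(pos * np.log(p) + (1 - pos) * np.log(1 - p))

fit = minimize(neg_log_lik, x0=np.log([0.02, 0.01]), method="Nelder-Mead")
lam_hat, rho_hat = np.exp(fit.x)
print(f"cross-sectional fit: lambda = {lam_hat:.3f}, rho = {rho_hat:.3f} per year")

# Simple prospective estimate quoted in the abstract: 13 seroconversions / 646 person-years
print(f"prospective estimate: {13 / 646:.3f} seroconversions per person-year")
```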

Conclusions

The estimation of seroconversion rates using cross-sectional data is a widespread and generalizable problem for many infectious diseases that can be measured using antibody titers. The consistency between these two estimates lends credibility to model-based estimates of malaria seroconversion rates using cross-sectional surveys. This study also demonstrates the utility of including malaria antibody measures in multiplex assays alongside targets for vaccine coverage and other neglected tropical diseases, which together could comprise an integrated, large-scale serological surveillance platform.

7.
Some improvements are presented for the affected-pedigree-member method of linkage analysis, which is a generalization of the sib-pair method. The test statistic is extended to include contrasts between affected and unaffected pedigree members, so that it now utilizes marker information from all typed pedigree members rather than just the typed affected members. Computer simulation using a sample pedigree of 14 individuals shows that this modification can substantially increase statistical power where there is a direct association between marker variation and disease and where disease risk is elevated in carriers of the disease allele. Data on Huntington disease in 16 British families, which were analyzed previously using only the affected individuals, are reanalyzed with the unaffected individuals included. Strong rejection of the null hypothesis of no association between Huntington disease and the HindIII polymorphism is confirmed, but the particular families in which the association is significant differ from those obtained through an analysis based only on affected individuals and reflect more closely the results obtained from a lod-score analysis. The test statistic is also modified here to incorporate contrasts between individuals of zero kinship, if needed. This enables contrasts between individuals from different pedigrees, as well as contrasts involving individuals sampled from the general population, to be incorporated into the test of association. For population data, the methodology reduces to a type of contingency-table analysis, in which the rows of the table correspond to different marker-locus genotypes and the two columns categorize subjects into an "affected" group versus an "unaffected," or control, group. This aspect of the methodology is illustrated using two population data sets, the first relating APO-E genotype to the frequency of individuals undergoing maintenance hemodialysis and the second relating APO-B genotype to the frequency of coronary artery disease. The present methodology confirms the lack of association between marker and disease in the former data set and confirms the presence of association in the latter. Finally, the methodology is formulated here in terms of ordinary, multiperson kinship coefficients rather than in terms of the generalized kinship coefficients originally proposed. This greatly reduces the number of coefficients to be calculated, thereby enhancing the computational efficiency of the computer program.
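A minimal sketch of the contingency-table reduction described for population data, using hypothetical genotype counts and a standard chi-square test (the full method also accommodates related individuals via kinship coefficients, which is not shown here):

```python
import numpy as np
from scipy.stats import chi2_contingency

# Rows: marker-locus genotypes; columns: affected vs. unaffected (control).
# Counts are hypothetical, purely for illustration.
#                 affected  unaffected
table = np.array([[30,        70],     # genotype A/A
                  [45,        55],     # genotype A/B
                  [25,        25]])    # genotype B/B

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.4f}")
```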

8.
We describe a new approach for analysis of the epidemiology of progressive genetic disorders that quantifies the rate of progression of the disease in the population by measuring the mutational flow. The framework is applied to Huntington disease (HD), a dominant neurological disorder caused by the expansion of a CAG-trinucleotide sequence to >35 repeats. The disease is 100% penetrant in individuals with ≥42 repeats. Measurement of the flow from disease alleles provides a minimum estimate of the flow in the whole population and implies that the new mutation rate for HD in each generation is ≥10% of currently known cases (95% confidence limits 6%-14%). Analysis of the pattern of flow demonstrates systematic underascertainment for repeat lengths <44. Ascertainment falls to <50% for individuals with 40 repeats and to <5% for individuals with 36-38 repeats. Clinicians should not assume that HD is rare outside known pedigrees or that most cases have onset at age <50 years.

9.
In England, during the 2009 H1N1 pandemic, vaccine efficacy and immunogenicity population studies in priority groups were rolled out in parallel to evaluate the pandemic vaccination programme. This provided a unique opportunity to compare immunogenicity and clinical protection in the same population and thus provide insights into the correlates of protection for the pandemic H1N1 2009 vaccine in risk groups. While clinical protection from the AS03-adjuvanted pandemic 2009 H1N1 vaccine was high in those aged <25 years and in pregnant women, effectiveness in older adults with chronic conditions has been found to be surprisingly poor. Here we present results from the immunogenicity study derived from the same population. Individuals from priority groups eligible for pandemic vaccination attending participating general practices were recruited. Pre- and post-vaccination blood samples were collected, and HI antibody testing was performed to assess the immune response to vaccination. The final cohort consisted of 610 individuals: 60 healthy children aged <5 years; 32 healthy pregnant women; and 518 individuals from risk groups. The seroconversion rate in healthy children aged <5 years (87%, 95% CI: 75% to 94%) was higher than that of the risk groups combined (65%, 95% CI: 61% to 69%) (p<0.001). Multivariable analysis of the risk groups showed that the size of the response in those who did seroconvert was lower in those who had received the 2009/10 seasonal TIV (fold effect: 0.52, 0.35 to 0.78). Predicted immunological boosting from higher pre-vaccine titres after 2009 pandemic H1N1 vaccination occurred only in children (seroconversion rate = 92%) and not in individuals aged 10 to 39 from risk groups (seroconversion rate = 74%). The lack of clinical protection identified in the same population in older adults from risk groups could be attributed to these lower seroresponses. Current immunogenicity licensing criteria for pandemic influenza vaccines may not correlate with clinical protection in individuals with chronic disease or in those who are immunocompromised.
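As a rough illustration of the children-versus-risk-groups comparison, the counts below are reconstructed approximately from the reported percentages and group sizes (about 52/60 vs. 337/518) and tested with Fisher's exact test; this is not the study's analysis.

```python
from scipy.stats import fisher_exact

children = (52, 60 - 52)        # ~87% of 60 healthy children <5 y seroconverted
risk_grp = (337, 518 - 337)     # ~65% of 518 risk-group individuals seroconverted

table = [list(children), list(risk_grp)]
odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.2e}")
```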

10.

Background

Life expectancy has increased in HIV-positive individuals receiving combination antiretroviral therapy (cART); however, they still experience increased mortality due to ageing-associated comorbidities compared with HIV-negative individuals.

Methods

A retrospective study of 314 Queensland HIV-infected males on cART was conducted. The negative impact of ageing was assessed by estimating the probability of 5-year mortality; comparisons were made between an HIV-specific predictive tool (the VACS index) and the Australian Bureau of Statistics (ABS) life-tables to examine potential differences attributable to HIV. The negative impact of ageing was also assessed by the prevalence of comorbidities. Associations between comorbidity and estimates of predicted mortality were assessed by regression analysis.

Results

The mean predicted 5-year mortality rate was 6% using the VACS index compared with 2.1% using the ABS life-table (p<0.001). The proportions of patients at predicted high risk of mortality (>9%) using the VACS index or the ABS life-table were 17% and 1.8%, respectively. Comorbidities were also more prevalent in this cohort than in age-matched Australian men from the general population. Metabolic disease (38.2%) was the most prevalent comorbidity, followed by renal (33.1%) and cardiovascular disease (23.9%). Multivariate analysis demonstrated that patients with a history of cardiovascular disease had a higher predicted risk of mortality (OR = 1.69; 95% CI: 1.17-2.45), whereas ex-smokers had a lower predicted risk of mortality (OR = 0.61; 95% CI: 0.41-0.92).

Conclusions

Using the VACS Index, there is an increased predicted risk of mortality in cART-treated HIV-infected Australian men compared with age-matched men using the ABS data. This increased predicted mortality risk is associated with cardiovascular disease and with the number of comorbidities per subject, which suggests that the VACS Index may discriminate between high and low predicted mortality risks in this population. However, until the VACS Index is validated in Australia, these data may suggest that the VACS Index overestimates predicted mortality risk in this country.

11.
S. R. Stock, A. Gafni, R. F. Bloch. CMAJ. 1990;142(9):937-946
The universal precautions recommended by the US Centers for Disease Control (CDC), Atlanta, for the prevention of HIV (human immunodeficiency virus) transmission to health care workers are widely accepted, despite little documentation of their effectiveness and efficiency. We reviewed the evidence on the risk of HIV transmission to hospital workers and the effectiveness of the universal precautions. We also evaluated the costs of implementing the recommendations in a 450-bed acute care teaching hospital in Hamilton, Ont. On the basis of aggregated results from six prospective studies, the risk of HIV seroconversion among hospital workers after a needlestick injury involving a patient known to have AIDS (acquired immune deficiency syndrome) is 0.36% (upper 95% confidence limit 0.67%); the risk after skin and mucous membrane exposure to blood or other body fluids of AIDS patients is 0% (upper 95% confidence limit 0.38%). We estimated that 0.038 cases of HIV seroconversion would be prevented annually in the study hospital if the CDC recommendations were followed. The incremental cost of implementing the universal precautions was estimated to be about $315,000 per year, or over $8 million per case of HIV seroconversion prevented. If all HIV-infected workers were assumed to have AIDS within 10 years of infection, the cost of the program would be about $565,000 per life-year saved. When less conservative, more probable assumptions were applied, the best estimate of the implementation cost was $128,862,000 per case of HIV seroconversion prevented. The universal precautions implemented in the study hospital were not found to be efficacious or cost-effective. To minimize the already small risk of HIV transmission in hospitals, the sources of risk of percutaneous injury should be better defined, and the design of percutaneous lines, needles and surgical equipment, as well as techniques, should be improved. Preventive measures recommended on the basis of demonstrated efficacy and aimed at routes of exposure that represent true risk are needed.
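The headline cost-effectiveness figure follows from simple arithmetic on the two numbers quoted above; a quick check:

```python
# Quick arithmetic check of the figures quoted in the abstract (illustrative only).
annual_cost = 315_000            # incremental cost of universal precautions per year
cases_prevented_per_year = 0.038 # estimated HIV seroconversions prevented per year

cost_per_case_prevented = annual_cost / cases_prevented_per_year
print(f"${cost_per_case_prevented:,.0f} per HIV seroconversion prevented")
# ~ $8.3 million, consistent with the "over $8 million" figure in the abstract
```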

12.
The goal of predictive testing is to modify the risk for currently healthy individuals to develop a genetic disease in the future. Such testing using polymorphic DNA markers has had major application in Huntington disease. The Canadian Collaborative Study of Predictive Testing for Huntington Disease has been guided by major principles of medical ethics, including autonomy, beneficence, confidentiality, and justice. Numerous ethical and legal dilemmas have arisen in this program, challenging these principles and occasionally casting them into conflict. The present report describes these dilemmas and offers our approach to resolving them. These issues will have relevance to predictive-testing programs for other adult-onset disorders.

13.
The prevention of common diseases relies on identifying risk factors and implementing intervention in high-risk groups. Nevertheless, most known risk factors have low positive predictive value (PPV) and low population-attributable fraction (PAF) for diseases (e.g., cholesterol and coronary heart disease). With advancing genetic technology, it will be possible to refine the risk-factor approach to target intervention to individuals with risk factors who also carry disease-susceptibility allele(s). We provide an epidemiological approach to assess the impact of genetic testing on the PPV and PAF associated with risk factors. Under plausible models of interaction between a risk factor and a genotype, we derive values of PPV and PAF associated with the joint effects of a risk factor and a genotype. The use of genetic testing can markedly increase the PPV of a risk factor. PPV increases with increasing genotype-risk factor interaction and increasing marginal relative risk associated with the factor, but it is inversely proportional to the prevalences of the genotype and the factor. For example, for a disease with lifetime risk of 1%, if all the risk-factor effect is confined to individuals with a susceptible genotype, a risk factor with 10% prevalence and disease relative risk of 2 in the population will have a disease PPV of 1.8%, but it will have a PPV of 91.8% among persons with a genotype of 1% prevalence. On the other hand, genetic testing and restriction of preventive measures to those susceptible may decrease the PAF of the risk factor, especially at low prevalences of the risk factor and genotype.(ABSTRACT TRUNCATED AT 250 WORDS)
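The numeric example above can be reproduced directly. The sketch below assumes, as the abstract states, that the entire risk-factor effect is confined to carriers of the susceptibility genotype, and additionally that the genotype and the risk factor occur independently:

```python
# Worked sketch of the abstract's numeric example.
lifetime_risk = 0.01   # P(disease) in the population
f_prev = 0.10          # prevalence of the risk factor
rr = 2.0               # marginal relative risk associated with the risk factor
g_prev = 0.01          # prevalence of the susceptibility genotype

# Marginal risks with/without the factor: P(D) = rr*x*f_prev + x*(1 - f_prev)
risk_no_factor = lifetime_risk / (rr * f_prev + (1 - f_prev))
risk_factor = rr * risk_no_factor
print(f"PPV of the risk factor alone: {risk_factor:.1%}")                      # ~1.8%

# Concentrate the entire excess risk in the genotype-positive subgroup
excess = risk_factor - risk_no_factor
risk_factor_and_genotype = risk_no_factor + excess / g_prev
print(f"PPV among factor + genotype carriers: {risk_factor_and_genotype:.1%}")  # ~91.8%
```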

14.

Objective

Randomized clinical trials of HIV prevention in high-risk populations of women often assume that all participants have similar exposure to HIV. However, a substantial fraction of women enrolled in the trial may have no or low exposure to HIV. Our objective was to estimate the proportion of women exposed to HIV throughout a hypothetical high-risk study population.

Methods

A stochastic individual-based model was developed to simulate the sexual behavior and the risk of HIV acquisition for a cohort of sexually active HIV-uninfected women in high HIV prevalence settings. Key behavior and epidemic assumptions in the model were based on published studies of HIV transmission in South Africa. The prevalence of exposure, defined as the proportion of women who have sex with an HIV-infected partner, and the HIV incidence were evaluated.
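A heavily simplified, illustrative individual-based sketch of the exposure calculation (not the published model; the partner prevalence, partner-change rate, and per-partnership transmission probability are assumed values chosen only to show the mechanics):

```python
import numpy as np

rng = np.random.default_rng(1)
n_women = 100_000
hiv_prev = 0.15                 # assumed HIV prevalence among partners
mean_partners = 1.2             # assumed mean new partners per woman per year
p_transmit_per_partner = 0.05   # assumed cumulative transmission risk per infected partner

partners = rng.poisson(mean_partners, n_women)
infected_partners = rng.binomial(partners, hiv_prev)

exposed = infected_partners > 0
acquired = rng.random(n_women) < 1 - (1 - p_transmit_per_partner) ** infected_partners

print(f"proportion exposed in one year: {exposed.mean():.1%}")
print(f"annual HIV incidence:           {acquired.mean():.2%}")
```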

Results

Our model projects that in communities with an HIV incidence rate of 1 per 100 person-years, only 5-6% of women are exposed to HIV annually, while in communities with an HIV incidence of 5 per 100 person-years, 20-25% of women are exposed to HIV. Approximately 70% of the new infections are acquired from partners with asymptomatic HIV.

Conclusions

Mathematical models suggest that a high proportion of women enrolled in HIV prevention trials may be unexposed to HIV even when incidence rates are high. The relationship between HIV exposure and other risk factors should be carefully analyzed when future clinical trials are planned.

15.

Background

Local HIV epidemiology data are critical in determining the suitability of a population for HIV vaccine efficacy trials. The objective of this study was to estimate the prevalence and incidence of, and determine risk factors for HIV transmission in a rural community-based HIV vaccine preparedness cohort in Masaka, Uganda.

Methods

Between February and July 2004, we conducted a house-to-house HIV sero-prevalence survey among consenting individuals aged 18–60 years. Participants were interviewed, counseled and asked to provide blood for HIV testing. We then enrolled the HIV-uninfected participants in a 2-year HIV sero-incidence study. Medical evaluations, HIV counseling and testing, and sample collection for laboratory analysis were done quarterly. Sexual risk behaviour data were collected every 6 months.

Results

The HIV point prevalence was 11.2%, and was higher among women than men (12.9% vs. 8.6%, P = 0.007). Risk factors associated with prevalent HIV infection for men were age <25 years (aOR = 0.05, 95% CI 0.01–0.35) and reported genital ulcer disease in the past year (aOR = 2.17, 95% CI 1.23–3.83). Among women, being unmarried (aOR = 2.59, 95% CI 1.75–3.83) and reported genital ulcer disease in the past year (aOR = 2.40, 95% CI 1.64–3.51) were associated with prevalent HIV infection. Twenty-one seroconversions were recorded over 2025.8 person-years, an annual HIV incidence of 1.04% (95% CI: 0.68–1.59). The only significant risk factor for incident HIV infection was being unmarried (aRR = 3.44, 95% CI 1.43–8.28). Cohort retention after 2 years was 87%.
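As a quick check of the incidence figure, the rate and an exact (Poisson) 95% confidence interval can be computed from the 21 events and 2025.8 person-years; the abstract's interval may have been derived with a slightly different method.

```python
from scipy.stats import chi2

events, person_years = 21, 2025.8
rate = events / person_years

# Exact Poisson confidence limits via the chi-square distribution
lower = chi2.ppf(0.025, 2 * events) / 2 / person_years
upper = chi2.ppf(0.975, 2 * (events + 1)) / 2 / person_years
print(f"incidence: {100*rate:.2f} per 100 person-years "
      f"(exact 95% CI {100*lower:.2f}-{100*upper:.2f})")
```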

Conclusions

We found a high prevalence but low incidence of HIV in this cohort. HIV vaccine efficacy trials in this population may not be feasible due to the large sample sizes that would be required. HIV vaccine preparatory efforts in this setting should include identification of higher risk populations.

16.
Summary In the last decade, interest has been focused on human immunodeficiency virus (HIV) antibody assays and testing strategies that could distinguish recent infections from established infection in a single serum sample. Incidence estimates are obtained by using the relationship between prevalence, incidence, and duration of recent infection (the window period). However, recent work has demonstrated limitations of this approach due to the use of an estimated mean “window period.” We propose an alternative approach that consists in estimating the distribution of infection times based on serological marker values at the moment when the infection is first discovered. We propose a model for the marker trajectory based on repeated measurements of virological markers of seroconversion. The parameters of the model are estimated using data from a cohort of HIV-infected patients enrolled during primary infection. This model can be used to estimate the distribution of infection times for subjects newly diagnosed with HIV and reported in an HIV surveillance system. An approach is proposed for estimating HIV incidence from these results.

17.
OBJECTIVES: To evaluate the level of 90K as a predictor of AIDS; to describe 90K levels over time after HIV seroconversion; and to evaluate the 90K level as a marker of the maturity of infection. DESIGN: Prospective incident cohort of HIV-infected individuals with documented dates of seroconversion. METHODS: Cox models were applied to estimate the crude and adjusted relative hazards (RH) of AIDS by level of 90K. Regression models were applied to describe the temporal trend and the correlates of the level of 90K over time after HIV seroconversion. Logistic models were applied to evaluate the probability of a sample of 90K having been taken within a certain time period after HIV seroconversion. RESULTS: The study population consisted of 150 participants of the Italian Seroconversion Study. A total of 429 measurements of 90K were taken. Both early and later measurements of 90K were highly predictive of AIDS, even when adjusting for CD4 lymphocyte count and HIV load. The 90K level (U/ml) increased by 10% annually (95% CI: 7%-13%); the increase over time was linear. IDUs had higher 90K levels than heterosexuals and homosexuals over the course of HIV disease. High 90K levels were highly predictive of distant seroconversions (age-adjusted probability, 74%), whereas they were poorly predictive of recent seroconversions (age-adjusted probability, 5%); the results were similar for the predictability of CD4 lymphocyte count. CONCLUSIONS: The level of 90K is a useful prognostic tool for clinical purposes. As a marker of the maturity of infection, 90K is similar to the CD4 lymphocyte count, with the advantage of being able to use serum instead of fresh whole blood. It has a good capacity to identify distant infections.
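The abstract names Cox models for the relative hazard of AIDS by 90K level. A minimal sketch of such a model, using made-up data and the lifelines package rather than the study's data or software, might look like this:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical follow-up data: time to AIDS (or censoring), event indicator,
# log 90K level, and CD4 count as an adjustment covariate.
df = pd.DataFrame({
    "years_to_aids_or_censor": [2.1, 5.4, 3.3, 7.0, 1.2, 6.5, 4.0, 8.0, 2.8, 5.9],
    "aids":                    [1,   0,   1,   0,   1,   0,   1,   0,   0,   1],
    "log_90k":                 [2.3, 1.1, 2.0, 0.9, 2.6, 1.4, 1.3, 2.2, 1.8, 1.0],
    "cd4":                     [250, 600, 320, 700, 180, 550, 400, 650, 380, 500],
})

cph = CoxPHFitter(penalizer=0.1)   # small penalty keeps this tiny example stable
cph.fit(df, duration_col="years_to_aids_or_censor", event_col="aids")
cph.print_summary()                # hazard ratios play the role of adjusted relative hazards
```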

18.
The HIV epidemic is, by many criteria, the worst outbreak of infectious disease in history. The rate of new infections is now approximately 5 million per year, mainly in the developing world, and is increasing. Women are now substantially more at risk of infection with HIV than men. With no cure or effective vaccine in sight, a huge effort is required to develop topical agents (often called microbicides) that, applied to the vaginal mucosa, would prevent infection of these high-risk individuals. We discuss the targets for topical agents that have been identified by studies of the biology of HIV infection and provide an overview of the progress towards the development of a usable agent.

19.
Project Horizonte, an open cohort of homosexual and bisexual human immunodeficiency virus (HIV-1)-negative men, is a component of the AIDS Vaccine Program in Belo Horizonte, Minas Gerais, Brazil. The objective of this study was to compare volunteers testing HIV positive at cohort entry with a sample of those who tested HIV negative, in order to identify risk factors for prevalent HIV infection in a population being screened for enrollment in Project Horizonte. A nested case-control study was conducted. HIV-positive volunteers at entry (cases) were matched by age and admission date to three HIV-negative controls each. Selected variables used for the current analysis included demographic factors, sexual behavior and other risk factors for HIV infection. During the study period (1994-2001), among the 621 volunteers screened, 61 tested positive for HIV. Cases were matched to 183 HIV-negative control subjects. After adjustment, the main risk factors associated with HIV infection were unprotected sex with an occasional partner, OR = 3.7 (95% CI 1.3-10.6), receptive anal intercourse with an occasional partner, OR = 2.8 (95% CI 0.9-8.9), and belonging to the black racial group, OR = 3.4 (95% CI 1.1-11.9). These variables were associated with an increase in the risk of HIV infection among men who have sex with men at screening for admission to an open HIV-negative cohort.

20.
The mechanism underlying the excess risk of non-AIDS diseases among HIV-infected people is unclear. HIV-associated inflammation/hypercoagulability likely plays a role. While antiretroviral therapy (ART) may return this process to pre-HIV levels, this has not been directly demonstrated. We analyzed data and specimens on 249 HIV+ participants from the US Military HIV Natural History Study, a prospective, multicenter observational cohort of >5600 active duty military personnel and beneficiaries living with HIV. We used stored blood specimens to measure D-dimer and interleukin-6 (IL-6) at three time points: pre-HIV seroconversion, ≥6 months post-HIV seroconversion but prior to ART initiation, and ≥6 months post-ART with documented HIV viral suppression on two successive evaluations. We evaluated the changes in biomarker levels between time points, and the association between these biomarker changes and future non-AIDS events. During a median follow-up of 3.7 years, there were 28 incident non-AIDS diseases. At ART initiation, the median CD4 count was 361 cells/mm3; the median duration of documented HIV infection was 392 days; and the median time on ART was 354 days. The adjusted mean percent increase in D-dimer levels from pre-seroconversion to post-ART was 75.1% (95% confidence interval 24.6–148.0, p = 0.002). This increase in D-dimer was associated with a significant 22% increased risk of future non-AIDS events (p = 0.03). Changes in IL-6 levels across time points were small and not associated with future non-AIDS events. In conclusion, ART initiation and HIV viral suppression do not eliminate the HIV-associated elevation in D-dimer levels. This residual pathology is associated with an increased risk of future non-AIDS diseases.
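As a small illustration of how a mean percent increase such as the 75.1% figure can be computed from paired biomarker measurements on the log scale (the study's actual adjustment model is not described in the abstract, and the values below are hypothetical):

```python
import numpy as np

pre  = np.array([0.20, 0.15, 0.30, 0.25, 0.18])   # hypothetical pre-seroconversion D-dimer (ug/mL)
post = np.array([0.38, 0.22, 0.55, 0.40, 0.35])   # hypothetical post-ART D-dimer (ug/mL)

# Average the paired log-ratios, then back-transform to a percent increase
mean_log_ratio = np.mean(np.log(post / pre))
pct_increase = (np.exp(mean_log_ratio) - 1) * 100
print(f"geometric mean increase: {pct_increase:.1f}%")
```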
