Similar articles (20 results)
1.
Z Zhao  S Li  G Liu  F Yan  X Ma  Z Huang  H Tian 《PloS one》2012,7(7):e41641

Background and Objective

Emerging evidence from biological and epidemiological studies has suggested that body iron stores and heme-iron intake may be related to the risk of type 2 diabetes (T2D). We aimed to examine the association of body iron stores and heme-iron intake with T2D risk by conducting a systematic review and meta-analysis of previously published studies.

Research Design and Methods

A systematic review and subsequent meta-analysis were conducted by searching the MEDLINE database up to June 22, 2012 to identify studies that analyzed the association of body iron stores or dietary heme-iron intake with T2D risk. The meta-analysis used the effect estimates and 95% confidence intervals (CIs) to calculate pooled risk estimates, while heterogeneity among studies was examined using the I2 and Q statistics.
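The pooling arithmetic described here (inverse-variance weighting of log effect estimates, with Q and I2 to quantify heterogeneity) can be sketched in a few lines. The study values below are hypothetical, and the DerSimonian-Laird estimator is an assumed choice, since the abstract does not name the random-effects method used:

```python
import math

# Hypothetical per-study estimates (RR, lower 95% CI, upper 95% CI) --
# illustrative values only, not data from any study in this listing.
studies = [(1.4, 1.1, 1.8), (2.0, 1.3, 3.1), (1.2, 0.9, 1.6)]

# Work on the log scale; recover each standard error from the CI width.
log_rr = [math.log(rr) for rr, lo, hi in studies]
se = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for _, lo, hi in studies]

# Fixed-effect (inverse-variance) pooled estimate.
w = [1 / s ** 2 for s in se]
fixed = sum(wi * y for wi, y in zip(w, log_rr)) / sum(w)

# Cochran's Q and the I2 statistic quantify between-study heterogeneity.
q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rr))
df = len(studies) - 1
i2 = max(0.0, (q - df) / q) * 100

# DerSimonian-Laird between-study variance, then random-effects pooling.
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - df) / c)
w_re = [1 / (s ** 2 + tau2) for s in se]
pooled_re = math.exp(sum(wi * y for wi, y in zip(w_re, log_rr)) / sum(w_re))

print(f"pooled RR = {pooled_re:.2f}, I2 = {i2:.1f}%")
```

With moderate heterogeneity (as here), the random-effects weights shrink toward equality, so small studies pull the pooled RR further than they would under the fixed-effect model.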

Results

The meta-analysis included 16 high-quality studies: 12 analyzed ferritin levels (4,366 T2D patients and 41,091 controls) and 4 measured heme-iron intake (9,246 T2D patients and 179,689 controls). The combined relative risk (RR) comparing the highest and lowest categories of ferritin levels was 1.66 (95% CI: 1.15–2.39) for prospective studies and 2.29 (95% CI: 1.48–3.54) for cross-sectional studies, both with significant heterogeneity (Q = 14.84, p = 0.01, I2 = 66.3%; Q = 44.16, p<0.001, I2 = 88.7%). The combined RR comparing the highest and lowest categories of heme-iron intake was 1.31 (95% CI: 1.21–1.43) with no evidence of heterogeneity (Q = 1.39, p = 0.71, I2 = 0%). No publication bias was found. An additional 15 studies that were of good quality, had significant results, and analyzed the association between body iron stores and T2D risk were included qualitatively in the systematic review.

Conclusions

The meta-analysis and systematic review suggest that increased ferritin levels and heme-iron intake are both associated with a higher risk of T2D.

2.
Zhou YH  Tang JY  Wu MJ  Lu J  Wei X  Qin YY  Wang C  Xu JF  He J 《PloS one》2011,6(9):e25142

Background

Folic acid is widely used to lower homocysteine concentrations and prevent adverse cardiovascular outcomes. However, the effect of folic acid on cardiovascular events is not clear at the present time. We carried out a comprehensive systematic review and meta-analysis to assess the effects of folic acid supplementation on cardiovascular outcomes.

Methodology and Principal Findings

We systematically searched Medline, EmBase, the Cochrane Central Register of Controlled Trials, reference lists of articles, and proceedings of major meetings for relevant literature. We included randomized placebo-controlled trials that reported on the effects of folic acid on cardiovascular events compared to placebo. Of 1594 identified studies, we included 16 trials reporting data on 44841 patients. These studies reported 8238 major cardiovascular events, 2001 strokes, 2917 myocardial infarctions, and 6314 deaths. Folic acid supplementation as compared to placebo had no effect on major cardiovascular events (RR, 0.98; 95% CI, 0.93–1.04), stroke (RR, 0.89; 95% CI, 0.78–1.01), myocardial infarction (RR, 1.00; 95% CI, 0.93–1.07), or deaths from any cause (RR, 1.00; 95% CI, 0.96–1.05). Moreover, folic acid as compared to placebo also had no effect on the following secondary outcomes: risk of revascularization (RR, 1.05; 95% CI, 0.95–1.16), acute coronary syndrome (RR, 1.06; 95% CI, 0.97–1.15), cancer (RR, 1.08; 95% CI, 0.98–1.21), vascular death (RR, 0.94; 95% CI, 0.88–1.02), or non-vascular death (RR, 1.06; 95% CI, 0.97–1.15).

Conclusion/Significance

Folic acid supplementation has no effect on the incidence of major cardiovascular events, stroke, myocardial infarction, or all-cause mortality.

3.

Purpose

Despite discrepant results on clinical utility, several trials are already prospectively randomizing non-small cell lung cancer (NSCLC) patients by ERCC1 status. We aimed to characterize the prognostic and predictive effect of ERCC1 by systematic review and meta-analysis.

Methods

Eligible studies assessed survival and/or chemotherapy response in NSCLC or SCLC by ERCC1 status. Effect measures of interest were hazard ratio (HR) for survival or relative risk (RR) for chemotherapy response. Random-effects meta-analyses were used to account for between-study heterogeneity, with unadjusted/adjusted effect estimates considered separately.

Results

23 eligible studies provided survival results in 2,726 patients. Substantial heterogeneity was observed in all meta-analyses (I2 always >30%), partly due to variability in thresholds defining ‘low’ and ‘high’ ERCC1. Meta-analysis of unadjusted estimates showed high ERCC1 was associated with significantly worse overall survival in platinum-treated NSCLC (average unadjusted HR = 1.61, 95% CI: 1.23–2.1, p = 0.014), but not in NSCLC untreated with chemotherapy (average unadjusted HR = 0.82, 95% CI: 0.51–1.31). Meta-analysis of adjusted estimates was limited by variable choice of adjustment factors and potential publication bias (Egger's p<0.0001). There was evidence that high ERCC1 was associated with reduced response to platinum (average RR = 0.80; 95% CI: 0.64–0.99). SCLC data were inadequate to draw firm conclusions.

Conclusions

Current evidence suggests high ERCC1 may adversely influence survival and response in platinum-treated NSCLC patients, but not in non-platinum-treated patients, although definitive evidence of a predictive influence is lacking. International consensus is urgently required to provide consistent, validated ERCC1 assessment methodology. ERCC1 assessment for treatment selection should currently be restricted to, and evaluated within, clinical trials.
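The Egger's regression test for publication bias cited in the Results above can be sketched as follows. This is a minimal illustration on hypothetical effect sizes, not the authors' analysis: standardized effects are regressed on precision, and an intercept far from zero signals funnel-plot asymmetry:

```python
import math

# Hypothetical log hazard ratios and standard errors; the small studies
# (large SE) deliberately show larger effects, producing asymmetry.
effects = [0.10, 0.15, 0.30, 0.50, 0.60, 0.70]
ses = [0.05, 0.08, 0.15, 0.25, 0.30, 0.35]

# Egger's test: regress standardized effect (effect/SE) on precision (1/SE).
y = [e / s for e, s in zip(effects, ses)]
x = [1 / s for s in ses]
n = len(x)
mx, my = sum(x) / n, sum(y) / n
sxx = sum((xi - mx) ** 2 for xi in x)
slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
intercept = my - slope * mx

# t statistic for the intercept (compare with a t distribution, n - 2 df).
resid = [yi - (intercept + slope * xi) for xi, yi in zip(x, y)]
s2 = sum(r ** 2 for r in resid) / (n - 2)
se_intercept = math.sqrt(s2 * (1 / n + mx ** 2 / sxx))
t_stat = intercept / se_intercept
print(f"Egger intercept = {intercept:.2f}, t = {t_stat:.1f}")
```

A symmetric funnel would give an intercept near zero; here the contrived small-study effects push it well above zero, which is the pattern a significant Egger's p-value reflects.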

4.

Background

Recent clinical trials and observational studies have reported increased coronary events associated with non-steroidal anti-inflammatory drugs (NSAIDs). There appeared to be a disproportionate increase in non-fatal versus fatal events; however, the numbers of fatal events in individual studies were too small, and event rates too low, to be meaningful.

Objectives

We undertook a pooled analysis to investigate the effect of NSAIDs on myocardial infarction (MI) risk with the specific aim to differentiate non-fatal from fatal events.

Methods

We searched Pubmed (January, 1990 to March, 2010) for observational studies and randomised controlled trials that assessed the effect of NSAIDs (traditional or selective COX-2 inhibitors [coxibs]) on MI incidence separately for fatal and non-fatal events. Summary estimates of relative risk (RR) for non-fatal and fatal MIs were calculated with a random effects model.

Results

NSAID therapy carried an RR of 1.30 (95% CI, 1.20–1.41) for non-fatal MI with no effect on fatal MI (RR 1.02; 95% CI, 0.89–1.17) in six observational studies. Overall, the risk increase for non-fatal MI was 25% higher (95% CI, 11%–42%) than for fatal MI. The two studies that included only individuals with prior cardiovascular disease presented risk estimates for non-fatal MI on average 58% greater (95% CI, 26%–98%) than those for fatal MI. In nine randomised controlled trials, all investigating coxibs, the pooled RR estimate was 1.61 (95% CI, 1.04–2.50) for non-fatal MI and 0.86 (95% CI, 0.51–1.47) for fatal MI.

Conclusions

NSAID use increases the risk of non-fatal MI with no substantial effect on fatal events. Such differential effects, with potentially distinct underlying pathology, may provide insights into NSAID-induced coronary pathology. We studied the association between the use of nonsteroidal anti-inflammatory drugs (NSAIDs) and the risk of myocardial infarction (MI), separating non-fatal from fatal events and summarizing the evidence from both observational studies and randomised controlled trials. An increased risk of non-fatal MI was clearly found in both types of studies, while use of NSAIDs did not confer an increased risk of fatal MI. Our findings support the concept that thrombi generated under NSAID treatment could differ from spontaneous thrombi.

5.

Background

Several sub-Saharan African countries have rapidly scaled up the number of households that own insecticide-treated mosquito nets (ITNs). Although the efficacy of ITNs in trials has been shown, evidence on their impact under routine conditions is limited to a few countries and the extent to which the scale-up of ITNs has improved population health remains uncertain.

Methods and Findings

We used matched logistic regression to assess the individual-level association between household ITN ownership or use in children under 5 years of age and the prevalence of parasitemia among children, using six malaria indicator surveys (MIS) and one demographic and health survey. We used Cox proportional hazards models to assess the relationship between ITN household ownership and child mortality using 29 demographic and health surveys. The pooled relative reduction in parasitemia prevalence from random-effects meta-analysis associated with household ownership of at least one ITN was 20% (95% confidence interval [CI] 3%–35%; I2 = 73.5%, p<0.01 for I2 value). Sleeping under an ITN was associated with a pooled relative reduction in parasitemia prevalence in children of 24% (95% CI 1%–42%; I2 = 79.5%, p<0.001 for I2 value). Ownership of at least one ITN was associated with a pooled relative reduction in mortality between 1 month and 5 years of age of 23% (95% CI 13%–31%; I2 = 25.6%, p>0.05 for I2 value).

Conclusions

Our findings across a number of sub-Saharan African countries were highly consistent with results from previous clinical trials. These findings suggest that the recent scale-up in ITN coverage has likely been accompanied by significant reductions in child mortality and that additional health gains could be achieved with further increases in ITN coverage in populations at risk of malaria.

6.
Wu Y  Shi Y  Wu H  Bian C  Tang Q  Xu G  Yang J 《PloS one》2011,6(6):e20759

Background

It has been controversial whether abciximab offers additional benefit for diabetic patients undergoing percutaneous coronary intervention (PCI) with thienopyridine loading.

Methods

MEDLINE, EMBASE, the Cochrane Library clinical trials registry, ISI Science Citation Index, ISI Web of Knowledge and China National Knowledge Infrastructure (CNKI) were searched, supplemented with manual screening for relevant publications. Quantitative meta-analyses were performed to assess differences between abciximab groups and controls with respect to post-PCI risk of major cardiac events (MACEs), angiographic restenosis and bleeding complications.

Results

Nine trials were identified, involving 2,607 diabetic patients receiving PCI for coronary artery disease. Among patients who underwent elective or primary PCI, pooled results showed that abciximab did not significantly reduce the risk of MACEs (for elective-PCI patients: RR1-month: 0.93, 95% CI: 0.60–1.44; RR1-year: 0.95, 95% CI: 0.81–1.11; for primary-PCI patients: RR1-month: 1.05, 95% CI: 0.70–1.57; RR1-year: 0.98, 95% CI: 0.80–1.21), nor all-cause mortality, re-infarction or angiographic restenosis in either group. The only beneficial effect of abciximab appeared to be a decreased 1-year TLR (target lesion revascularization) risk in elective-PCI patients (RR1-year: 0.83, 95% CI: 0.70–0.99). Moreover, the occurrence of minor bleeding complications increased in elective-PCI patients treated with abciximab (RR: 2.94, 95% CI: 1.68–5.13, P<0.001), whereas major bleeding rates were similar (RR: 0.83, 95% CI: 0.27–2.57).

Conclusions

Concomitant dosing of abciximab and thienopyridines provides no additional benefit for diabetic patients undergoing PCI; this conclusion, though, needs further confirmation in larger studies.

7.
Zhang Lq  Zhou Jn  Wang J  Liang Gd  Li Jy  Zhu Yd  Su Yt 《PloS one》2012,7(3):e32425

Background and Objectives

N-Acetyltransferase 2 (NAT2) is an important enzyme involved in the metabolism of various xenobiotics, including potential carcinogens, and its phenotypes have been reported to be related to individual susceptibility to colorectal cancer (CRC). However, the results remain conflicting. We performed this meta-analysis to assess the relationship between NAT2 phenotypes and CRC risk.

Methods

A comprehensive literature search of PubMed and EMBASE, up to May 20, 2011, was conducted to identify all case-control or cohort studies of NAT2 acetylator status and susceptibility to CRC. Crude odds ratios (ORs) with 95% confidence intervals (CIs) were used to assess the association.

Results

A total of over 40,000 subjects from 40 published studies were identified by searching the databases. No significantly elevated CRC risk was found in individuals with NAT2 slow acetylator status compared with fast acetylators when all studies were pooled (OR = 0.95, 95% CI: 0.87–1.04, I2 = 52.6%). When the three studies contributing to the heterogeneity were removed, the result remained null (OR = 0.96, 95% CI: 0.90–1.03, P = 0.17 for heterogeneity, I2 = 17.8%). In addition, we failed to detect any associations in the stratified analyses by race, sex, source of controls, smoking status, genotyping methods or tumor localization. No publication bias was observed in this study.

Conclusions

This meta-analysis suggests that the NAT2 phenotypes may not be associated with colorectal cancer development.

8.

Background

Statin therapy reduces the risk of occlusive vascular events, but uncertainty remains about potential effects on cancer. We sought to provide a detailed assessment of any effects on cancer of lowering LDL cholesterol (LDL-C) with a statin using individual patient records from 175,000 patients in 27 large-scale statin trials.

Methods and Findings

Individual records of 134,537 participants in 22 randomised trials of statin versus control (median duration 4.8 years) and 39,612 participants in 5 trials of more intensive versus less intensive statin therapy (median duration 5.1 years) were obtained. Reducing LDL-C with a statin for about 5 years had no effect on newly diagnosed cancer or on death from such cancers in either the trials of statin versus control (cancer incidence: 3755 [1.4% per year [py]] versus 3738 [1.4% py], RR 1.00 [95% CI 0.96-1.05]; cancer mortality: 1365 [0.5% py] versus 1358 [0.5% py], RR 1.00 [95% CI 0.93–1.08]) or in the trials of more versus less statin (cancer incidence: 1466 [1.6% py] vs 1472 [1.6% py], RR 1.00 [95% CI 0.93–1.07]; cancer mortality: 447 [0.5% py] versus 481 [0.5% py], RR 0.93 [95% CI 0.82–1.06]). Moreover, there was no evidence of any effect of reducing LDL-C with statin therapy on cancer incidence or mortality at any of 23 individual categories of sites, with increasing years of treatment, for any individual statin, or in any given subgroup. In particular, among individuals with low baseline LDL-C (<2 mmol/L), there was no evidence that further LDL-C reduction (from about 1.7 to 1.3 mmol/L) increased cancer risk (381 [1.6% py] versus 408 [1.7% py]; RR 0.92 [99% CI 0.76–1.10]).

Conclusions

In 27 randomised trials, a median of five years of statin therapy had no effect on the incidence of, or mortality from, any type of cancer (or the aggregate of all cancer).

9.

Objective

Biological evidence suggests that inflammation might induce type 2 diabetes (T2D), and epidemiological studies have shown an association between higher white blood cell count (WBC) and T2D. However, the association has not been systematically investigated.

Research Design and Methods

Studies were identified through computer-based and manual searches. Previously unreported studies were sought through correspondence. 20 studies were identified (8,647 T2D cases and 85,040 non-cases). Estimates of the association of WBC with T2D were combined using random effects meta-analysis; sources of heterogeneity as well as presence of publication bias were explored.

Results

The combined relative risk (RR) comparing the top to bottom tertile of WBC count was 1.61 (95% CI: 1.45; 1.79, p = 1.5×10−18). Substantial heterogeneity was present (I2 = 83%). For granulocytes the RR was 1.38 (95% CI: 1.17; 1.64, p = 1.5×10−4), for lymphocytes 1.26 (95% CI: 1.02; 1.56, p = 0.029), and for monocytes 0.93 (95% CI: 0.68; 1.28, p = 0.67), comparing top to bottom tertile. In cross-sectional studies, the RR was 1.74 (95% CI: 1.49; 2.02, p = 7.7×10−13), while in cohort studies it was 1.48 (95% CI: 1.22; 1.79, p = 7.7×10−5). We assessed the impact of confounding in the EPIC-Norfolk study and found that the age- and sex-adjusted HR of 2.19 (95% CI: 1.74; 2.75) was attenuated to 1.82 (95% CI: 1.45; 2.29) after further accounting for smoking, T2D family history, physical activity, education, BMI and waist circumference.

Conclusions

A raised WBC is associated with a higher risk of T2D. The presence of publication bias and the failure to control for all potential confounders in all studies mean that the observed association is likely an overestimate.

10.

Background

Patients who participate in clinical trials may experience better clinical outcomes than patients who initiate similar therapy within clinical care (trial effect), but no published studies have evaluated a trial effect in HIV clinical trials.

Methods

To examine a trial effect we compared virologic suppression (VS) among patients who initiated HAART in a clinical trial versus in routine clinical care. VS was defined as a plasma HIV RNA ≤400 copies/ml at six months after HAART initiation and was assessed within strata of early (1996–99) or current (2000–06) HAART periods. Risk ratios (RR) were estimated using binomial models.

Results

Of 738 persons initiating HAART, 30.6% were women, 61.7% were black, 30% initiated therapy in a clinical trial and 67% (n = 496) had an evaluable six month HIV RNA result. HAART regimens differed between the early and current periods (p<0.001); unboosted PI regimens (55.6%) were more common in the early and NNRTI regimens (46.4%) were more common in the current period. Overall, 78% (95%CI 74, 82%) of patients achieved VS and trial participants were 16% more likely to achieve VS (unadjusted RR 1.16, 95%CI 1.06, 1.27). Comparing trial to non-trial participants, VS differed by study period. In the early period, trial participants initiating HAART were significantly more likely to achieve VS than non-trial participants (adjusted RR 1.33; 95%CI 1.15, 1.54), but not in the current period (adjusted RR 0.98; 95%CI 0.87, 1.11).

Conclusions

A clear clinical trial effect on suppression of HIV replication was observed in the early HAART period but not in the current period.

11.
Li Y  Liu Y  Fu L  Mei C  Dai B 《PloS one》2012,7(4):e34450

Background

A few studies of statin therapy as a specific prophylactic measure against contrast-induced nephropathy (CIN) have been published, with conflicting results. In this meta-analysis of randomized controlled trials, we aimed to assess the effectiveness of short-term high-dose statin treatment for the prevention of CIN and clinical outcomes, and to re-evaluate the potential benefits of statin therapy.

Methods

We searched PubMed, OVID, EMBASE, Web of science and the Cochrane Central Register of Controlled Trials databases for randomized controlled trials comparing short-term high-dose statin treatment versus low-dose statin treatment or placebo for preventing CIN. Our outcome measures were the risk of CIN within 2–5 days after contrast administration and need for dialysis.

Results

Seven randomized controlled trials with a total of 1,399 patients were identified and analyzed. The overall results based on a fixed-effect model showed that short-term high-dose statin treatment was associated with a significant reduction in the risk of CIN (RR = 0.51, 95% CI 0.34–0.76, p = 0.001; I2 = 0%). The incidence of acute renal failure requiring dialysis was not significantly different after the use of statins (RR = 0.33, 95% CI 0.05–2.10, p = 0.24; I2 = 0%). The use of statins was not associated with a significant decrease in the plasma C-reactive protein level (SMD −0.64, 95% CI: −1.57 to 0.29, P = 0.18, I2 = 97%).

Conclusions

Although this meta-analysis supports the use of statin to reduce the incidence of CIN, it must be considered in the context of variable patient demographics. Only a limited recommendation can be made in favour of the use of statin based on current data. Considering the limitations of included studies, a large, well designed trial that incorporates the evaluation of clinically relevant outcomes in participants with different underlying risks of CIN is required to more adequately assess the role for statin in CIN prevention.

12.

Background

The response rates to physician postal surveys remain modest. The primary objective of this study was to assess the effect of tracking responses on physician survey response rate (i.e., determining whether each potential participant has responded or not). A secondary objective was to assess the effects of day of mailing (Monday vs. Friday) on physician survey response rate.

Methods

We conducted 3 randomized controlled trials. The first 2 trials had a 2×2 factorial design and tested the effect of day of mailing (Monday vs. Friday) and of tracking vs. no tracking responses. The third trial tested the effect of day of mailing (Monday vs. Friday). We meta-analyzed these 3 trials using a random effects model.

Results

The total number of participants in the 3 trials was 1339. The response rate with tracked mailing was not statistically different from that with non-tracked mailing by the time of the first reminder (RR = 1.01; 95% CI 0.84, 1.22; I2 = 0%). There was a trend towards a lower response rate with tracked mailing by the time of the second reminder (RR = 0.91; 95% CI 0.78, 1.06; I2 = 0%). The response rate with mailing on Mondays was not statistically different from that with Friday mailing by the time of the first reminder (RR = 1.01; 95% CI 0.87, 1.17; I2 = 0%), or by the time of the second reminder (RR = 1.08; 95% CI 0.84, 1.39; I2 = 77%).

Conclusions

Tracking responses may negatively affect physicians' response rates. The day of mailing does not appear to affect physicians' response rates.

13.

Objective

To describe the likely extent of confounding in evaluating the risks of cardiovascular (CV) events and mortality in patients using diabetes medication.

Methods

The General Practice Research Database (GPRD) was used to identify inception cohorts of users of insulin and different oral antidiabetic drugs. Bias and the incidence of mortality, acute coronary syndrome, stroke and heart failure were analysed using GPRD, Hospital Episode Statistics and death certificates.

Results

206,940 patients were identified. The bias analysis showed that past thiazolidinedione users had a lower mortality risk compared to past metformin users. There were no differences between past users of rosiglitazone and pioglitazone (adjusted RR 1.04; 95% CI 0.93–1.18). Current rosiglitazone users had an increased risk of death (adjusted RR 1.20; 95% CI 1.08–1.34) and of hospitalisation for heart failure (adjusted RR 1.73; 95% CI 1.19–2.51) compared to current pioglitazone users. The risk of mortality was increased two-fold shortly after starting rosiglitazone. The excess risk of death over 3 years with rosiglitazone was 0.3 per 100 in those aged 50–64 years, 2.0 in those aged 65–74, 3.0 in those aged 75–84, and 7.0 in those aged 85+. With rosiglitazone, the cause of death was more likely to be a disease of the circulatory system.

Conclusions

Higher risks for death (overall and due to cardiovascular disease) and heart failure were found for rosiglitazone compared to pioglitazone. These excess risks were largest in patients aged 65 years or older. The European regulatory decision to suspend rosiglitazone is supported by this study.

14.
Chen J  Zhang R  Wang J  Liu L  Zheng Y  Shen Y  Qi T  Lu H 《PloS one》2011,6(11):e26827

Background

Interferon-gamma release assays (IGRAs) have provided a new method for the diagnosis of Mycobacterium tuberculosis infection. However, the role of IGRAs for the diagnosis of active tuberculosis (TB), especially in HIV-infected patients remains unclear.

Methods

We searched the PubMed, EMBASE and Cochrane databases to identify studies published between January 2001 and July 2011 that evaluated the QuantiFERON-TB Gold In-Tube (QFT-GIT) and T-SPOT.TB (T-SPOT) blood assays for the diagnosis of active TB in HIV-infected patients.

Results

The search identified 16 eligible studies that included 2801 HIV-infected individuals (637 culture-confirmed TB cases). The pooled sensitivity for the diagnosis of active TB was 76.7% (95% CI, 71.6–80.5%) and 77.4% (95% CI, 71.4–82.6%) for QFT-GIT and T-SPOT, respectively, while the specificity was 76.1% (95% CI, 74.0–78.0%) and 63.1% (95% CI, 57.6–68.3%) after excluding indeterminate results. Studies conducted in low/middle-income countries showed slightly lower sensitivity and specificity than those in high-income countries. The proportion of indeterminate results was as high as 10% (95% CI, 8.8–11.3%) and 13.2% (95% CI, 10.6–16.0%) for QFT-GIT and T-SPOT, respectively.

Conclusion

IGRAs in their current formulations have limited accuracy in diagnosing active TB in HIV-infected patients, and should not be used alone to rule out or rule in active TB cases in HIV-infected patients. Further modification is needed to improve their accuracy.

15.
Woo HD  Kim J 《PloS one》2012,7(4):e34615

Background

Good biomarkers for early detection of cancer lead to better prognosis. However, harvesting tumor tissue is invasive and cannot be routinely performed. Global DNA methylation of peripheral blood leukocyte DNA was evaluated as a biomarker for cancer risk.

Methods

We performed a meta-analysis to estimate overall cancer risk according to global DNA hypomethylation levels among studies with various cancer types and analytical methods used to measure DNA methylation. Studies were systematically searched via PubMed, with no language limitation, up to July 2011. Summary estimates were calculated using a fixed-effects model.

Results

Subgroup analyses by the experimental method used to determine DNA methylation level were performed because of heterogeneity within the selected studies (p<0.001, I2: 80%). Heterogeneity was not found in the %5-mC subgroup (p = 0.393, I2: 0%) or among LINE-1 studies using the same target sequence (p = 0.097, I2: 49%), whereas considerable variance remained in LINE-1 studies overall (p<0.001, I2: 80%) and in bladder cancer studies (p = 0.016, I2: 76%). These results suggest that the experimental method used to quantify global DNA methylation is an important factor in association studies between hypomethylation levels and cancer risk. Overall, cancer risks in the group with the lowest DNA methylation levels were significantly higher than in the group with the highest methylation levels [OR (95% CI): 1.48 (1.28–1.70)].

Conclusions

Global DNA hypomethylation in peripheral blood leukocytes may be a suitable biomarker for cancer risk. However, the association between global DNA methylation and cancer risk may differ according to the experimental method, the DNA region targeted for measuring global hypomethylation, and the cancer type. It is therefore important to select a precise and accurate surrogate marker for global DNA methylation levels in association studies between global DNA methylation in peripheral leukocytes and cancer risk.

16.
Li BS  Wang XY  Ma FL  Jiang B  Song XX  Xu AG 《PloS one》2011,6(12):e28078

Background

High Resolution Melting Analysis (HRMA) is becoming the preferred method for mutation detection. However, its accuracy in the individual clinical diagnostic setting is variable. To assess the diagnostic accuracy of HRMA for human mutations in comparison to DNA sequencing in different routine clinical settings, we have conducted a meta-analysis of published reports.

Methodology/Principal Findings

Out of 195 publications obtained from the initial search criteria, thirty-four studies assessing the accuracy of HRMA were included in the meta-analysis. We found that HRMA was a highly sensitive test for detecting disease-associated mutations in humans. Overall, the summary sensitivity was 97.5% (95% confidence interval (CI): 96.8–98.5; I2 = 27.0%). Subgroup analysis showed even higher sensitivity for non-HR-1 instruments (98.7%; 95% CI: 97.7–99.3; I2 = 0.0%) and for the eligible-sample-size subgroup (99.3%; 95% CI: 98.1–99.8; I2 = 0.0%). HRMA specificity showed considerable heterogeneity between studies. Sensitivity of the technique was influenced by sample size and instrument type, but not by sample source or dye type.

Conclusions/Significance

These findings show that HRMA is a highly sensitive, simple and low-cost test to detect human disease-associated mutations, especially for samples with mutations of low incidence. The burden on DNA sequencing could be significantly reduced by the implementation of HRMA, but it should be recognized that its sensitivity varies according to the number of samples with/without mutations, and positive results require DNA sequencing for confirmation.

17.

Background

QuantiFERON-TB Gold In Tube (QFT-GIT) is a tool for detecting M. tuberculosis infection. However, interpretation and utility of serial QFT-GIT testing of pediatric tuberculosis (TB) contacts is not well understood. We compared TB prevalence between baseline and 6 months follow-up using QFT-GIT and tuberculin skin testing (TST) in children who were household contacts of adults with pulmonary TB in South Africa, and explored factors associated with QFT-GIT conversions and reversions.

Method

Prospective study with six-month longitudinal follow-up.

Results

Among 270 enrolled pediatric contacts, 196 (73%) underwent 6-month follow-up testing. The 6-month prevalence estimate of MTB infection in pediatric contacts increased significantly from a baseline of 29% (79/270, 95% CI [24–35]) to 38% (103/270, 95% CI [32–44], p<0.001) using QFT-GIT; prevalence increased from a baseline of 28% (71/254, 95% CI [23–34]) to 33% (88/263, 95% CI [21–32], p = 0.002) using TST. Prevalence estimates were influenced by thresholds for positivity for TST, but not for QFT-GIT. Among 134 children with a negative or indeterminate baseline QFT-GIT, 24 (18%) converted to positive at follow-up; conversion rates did not differ significantly when more stringent thresholds were used to define QFT-GIT conversion. Older age (>10 years; AOR 8.9, 95% CI [1.1–72]) and baseline TST positivity ≥5 mm (AOR 5.2, 95% CI [1.2–23]) were associated with QFT-GIT conversion. Among 62 children with a positive baseline QFT-GIT, 9 (15%) reverted to negative; female gender (AOR 18.5, 95% CI [1.1–321]; p = 0.04) was associated with reversion, while children with a positive baseline TST were less likely to have QFT-GIT reversion (AOR 0.01, 95% CI [0.001–0.24]).

Conclusion

Among pediatric contacts of adult household TB cases in South Africa, prevalence estimates of TB infection increased significantly from baseline to 6 months. Conversions and reversions occurred among pediatric TB contacts using QFT-GIT, but QFT-GIT conversion rates were less influenced by thresholds used for conversions than were TST conversion rates.

18.

Background

To estimate the effectiveness of routine antenatal anti-D prophylaxis for preventing sensitisation in pregnant Rhesus negative women, and to explore whether this depends on the treatment regimen adopted.

Methods

Ten studies identified in a previous systematic literature search were included. Potential sources of bias were systematically identified using bias checklists, and their impact and uncertainty were quantified using expert opinion. Study results were adjusted for biases and combined, first in a random-effects meta-analysis and then in a random-effects meta-regression analysis.

Results

In a conventional meta-analysis, the pooled odds ratio for sensitisation was estimated as 0.25 (95% CI 0.18, 0.36), comparing routine antenatal anti-D prophylaxis to control, with some heterogeneity (I2 = 19%). However, this naïve analysis ignores substantial differences in study quality and design. After adjusting for these, the pooled odds ratio for sensitisation was estimated as 0.31 (95% CI 0.17, 0.56), with no evidence of heterogeneity (I2 = 0%). A meta-regression analysis was performed, which used the data available from the ten anti-D prophylaxis studies to inform us about the relative effectiveness of three licensed treatments. This gave an 83% probability that a dose of 1250 IU at 28 and 34 weeks is most effective and a 76% probability that a single dose of 1500 IU at 28–30 weeks is least effective.
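The random-effects pooling and I2 heterogeneity statistic described above can be sketched with the DerSimonian-Laird estimator. The study-level log odds ratios and standard errors below are hypothetical placeholders, not the data from the ten anti-D prophylaxis studies:

```python
import math

def dersimonian_laird(log_ors, ses):
    """Random-effects pooled odds ratio and I^2 via DerSimonian-Laird."""
    k = len(log_ors)
    w = [1 / se**2 for se in ses]                       # inverse-variance weights
    fixed = sum(wi * y for wi, y in zip(w, log_ors)) / sum(w)
    q = sum(wi * (y - fixed)**2 for wi, y in zip(w, log_ors))  # Cochran's Q
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                  # between-study variance
    i2 = max(0.0, (q - (k - 1)) / q) if q > 0 else 0.0  # heterogeneity I^2
    w_star = [1 / (se**2 + tau2) for se in ses]         # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_star, log_ors)) / sum(w_star)
    return math.exp(pooled), i2

# hypothetical study log-ORs and standard errors for illustration only
or_pooled, i2 = dersimonian_laird([-1.2, -1.5, -1.0], [0.3, 0.4, 0.35])
print(or_pooled, i2)
```

When Q falls below its degrees of freedom (k - 1), tau² and I² are truncated at zero and the random-effects result coincides with the fixed-effect pooled estimate, which is the situation the bias-adjusted analysis above reports (I2 = 0%).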

Conclusion

There is strong evidence for the effectiveness of routine antenatal anti-D prophylaxis for prevention of sensitisation, in support of the policy of offering routine prophylaxis to all non-sensitised pregnant Rhesus negative women. All three licensed dose regimens are expected to be effective.

19.

Introduction

The utility of T-cell based interferon-gamma release assays for the diagnosis of latent tuberculosis infection remains unclear in settings with a high burden of tuberculosis.

Objectives

To determine risk factors associated with positive QuantiFERON-TB Gold In-Tube (QFT-GIT) and tuberculin skin test (TST) results and the level of agreement between the tests; to explore the hypotheses that positivity in QFT-GIT is more related to recent infection and less affected by HIV than the TST.

Methods

Adult household contacts of tuberculosis patients were invited to participate in a cross-sectional study across 24 communities in Zambia and South Africa. HIV, QFT-GIT and TST tests were done. A questionnaire was used to assess risk factors.

Results

A total of 2,220 contacts were seen. Of these, 1,803 individuals had interpretable results for both tests: 1,147 (63.6%) were QFT-GIT positive, while 725 (40.2%) were TST positive. Agreement between the tests was low (kappa = 0.24). Both QFT-GIT and TST results were associated with increasing age (adjusted OR [aOR] for each 10-year increase: 1.15, 95% CI: 1.06–1.25 for QFT-GIT; 1.10, 95% CI: 1.01–1.20 for TST). HIV positivity was less common among those with positive results on QFT-GIT (aOR: 0.51; 95% CI: 0.39–0.67) and TST (aOR: 0.61; 95% CI: 0.46–0.82). Smear positivity of the index case was associated with QFT-GIT (aOR: 1.25; 95% CI: 0.90–1.74) and TST (aOR: 1.39; 95% CI: 0.98–1.98) results. We found little evidence in our data to support our hypotheses.
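The reported agreement statistic (kappa = 0.24) follows the standard Cohen's kappa formula for a 2x2 table. A minimal sketch, using hypothetical cell counts chosen to be consistent with the reported marginals (1,147 QFT-GIT positive, 725 TST positive, n = 1,803) rather than the paper's actual cross-tabulation, which the abstract does not give:

```python
def cohens_kappa(a, b, c, d):
    """Cohen's kappa for a 2x2 agreement table.
    a: both positive, b: test1+/test2-, c: test1-/test2+, d: both negative."""
    n = a + b + c + d
    po = (a + d) / n                      # observed agreement
    p1 = (a + b) / n                      # test 1 positive rate
    p2 = (a + c) / n                      # test 2 positive rate
    pe = p1 * p2 + (1 - p1) * (1 - p2)    # agreement expected by chance
    return (po - pe) / (1 - pe)

# hypothetical counts consistent with the reported marginals, not the study's table
kappa = cohens_kappa(575, 572, 150, 506)
print(round(kappa, 2))
```

Because both tests have high positivity rates in this high-burden setting, chance agreement is substantial, so even moderate observed agreement yields a low kappa.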

Conclusion

QFT-GIT may not be more sensitive than the TST at detecting risk factors associated with tuberculous infection. We found little evidence to support the hypotheses that QFT-GIT positivity is more related to recent infection and less affected by HIV than the TST.

20.

Background

Starting HAART at a very advanced stage of disease is assumed to be the most prevalent form of initiation among HIV-infected subjects in developing countries, but data from Latin America and the Caribbean are still lacking. Our main objective was to determine the frequency, risk factors and trends over time for being a late HAART initiator (LHI) in this region.

Methodology

Cross-sectional analysis of 9,817 HIV-infected treatment-naïve patients initiating HAART at 6 sites (Argentina, Chile, Haiti, Honduras, Peru and Mexico) from October 1999 to July 2010. LHI had a CD4+ count ≤200 cells/mm3 prior to HAART. Late testers (LT) were those LHI who initiated HAART within 6 months of HIV diagnosis; late presenters (LP) initiated HAART more than 6 months after diagnosis. Prevalence, risk factors and trends over time were analyzed.

Principal Findings

Among subjects starting HAART (n = 9817) who had baseline CD4+ counts available (n = 8515), 76% were LHI: Argentina (56% [95%CI:52–59]), Chile (80% [95%CI:77–82]), Haiti (76% [95%CI:74–77]), Honduras (91% [95%CI:87–94]), Mexico (79% [95%CI:75–83]), Peru (86% [95%CI:84–88]). The proportion of LHI changed significantly over time in all sites except Honduras (p≤0.02; Honduras p = 0.7), with a tendency towards lower rates in recent years. Males had an increased risk of LHI in Chile, Haiti, Peru, and in the combined site analyses (CSA). Older patients were more likely to be LHI in Argentina and Peru (OR 1.21 per 10-year increase in age, 95%CI:1.02–1.45; OR 1.20, 95%CI:1.02–1.43, respectively), but not in CSA (OR 1.07, 95%CI:0.94–1.21). Higher education was associated with decreased risk of LHI in Chile (OR 0.92 per additional year of education, 95%CI:0.87–0.98), with similar trends in Mexico, Peru, and CSA. Among LHI with date of HIV diagnosis available, 55% were LT and 45% were LP.

Conclusion

LHI was highly prevalent in CCASAnet sites, mostly due to LT; the main associated risk factors were male sex and older age. Earlier HIV diagnosis and earlier treatment initiation are needed to maximize benefits from HAART in the region.
