Similar Articles (20 results)
1.

Background

Intermittent preventive treatment of malaria in children (IPTc) is a promising new approach to the control of malaria in areas of seasonal malaria transmission but it is not known if IPTc adds to the protection provided by an insecticide-treated net (ITN).

Methods and Findings

An individually randomised, double-blind, placebo-controlled trial of seasonal IPTc was conducted in Burkina Faso in children aged 3 to 59 months who were provided with a long-lasting insecticide-treated bednet (LLIN). Three rounds of treatment with sulphadoxine pyrimethamine plus amodiaquine or placebos were given at monthly intervals during the malaria transmission season. Passive surveillance for malaria episodes was established, a cross-sectional survey was conducted at the end of the malaria transmission season, and use of ITNs was monitored during the intervention period. Incidence rates of malaria were compared using a Cox regression model and generalized linear models were fitted to examine the effect of IPTc on the prevalence of malaria infection, anaemia, and on anthropometric indicators. 3,052 children were screened and 3,014 were enrolled in the trial; 1,505 in the control arm and 1,509 in the intervention arm. Similar proportions of children in the two treatment arms were reported to sleep under an LLIN during the intervention period (93%). The incidence of malaria, defined as fever or history of fever with parasitaemia ≥5,000/µl, was 2.88 (95% confidence interval [CI] 2.70–3.06) per child during the intervention period in the control arm versus 0.87 (95% CI 0.78–0.97) in the intervention arm, a protective efficacy (PE) of 70% (95% CI 66%–74%) (p<0.001). There was a 69% (95% CI 6%–90%) reduction in incidence of severe malaria (p = 0.04) and a 46% (95% CI 7%–69%) (p = 0.03) reduction in the incidence of all-cause hospital admissions. IPTc reduced the prevalence of malaria infection at the end of the malaria transmission season by 73% (95% CI 68%–77%) (p<0.001) and that of moderately severe anaemia by 56% (95% CI 36%–70%) (p<0.001). IPTc reduced the risks of wasting (risk ratio [RR] = 0.79; 95% CI 0.65–1.00) (p = 0.05) and of being underweight (RR = 0.84; 95% CI 0.72–0.99) (p = 0.03). Children who received IPTc were 2.8 (95% CI 2.3–3.5) (p<0.001) times more likely to vomit than children who received placebo but no drug-related serious adverse event was recorded.
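
As an illustration of how a protective efficacy (PE) of this kind relates to a Cox model hazard ratio (PE = 1 − HR), the short Python sketch below fits a Cox model to simulated single-season, time-to-first-episode data; the lifelines package, the simulated rates and the time-to-first-event simplification are all assumptions for illustration and not the trial's actual analysis.

    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    # Simulated single-season, time-to-first-episode data (illustrative assumptions only)
    rng = np.random.default_rng(0)
    n = 3000
    arm = rng.integers(0, 2, n)                      # 1 = IPTc + LLIN, 0 = placebo + LLIN
    hazard = np.where(arm == 1, 0.3, 1.0) * 2.9      # assumed episode hazard per season
    t = rng.exponential(1.0 / hazard)                # time to first episode (seasons)
    df = pd.DataFrame({"arm": arm,
                       "time": np.minimum(t, 1.0),   # censor at the end of the season
                       "event": (t < 1.0).astype(int)})

    cph = CoxPHFitter()
    cph.fit(df, duration_col="time", event_col="event")
    hr = float(np.exp(cph.params_["arm"]))
    print(f"hazard ratio = {hr:.2f}; protective efficacy = {1 - hr:.0%}")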

Conclusions

IPT of malaria provides substantial protection against malaria in children who sleep under an ITN. There is now strong evidence to support the integration of IPTc into malaria control strategies in areas of seasonal malaria transmission.

Trial Registration

ClinicalTrials.gov NCT00738946

2.

Background

Reducing substance use and unprotected sex by HIV-positive persons improves individual health status while decreasing the risk of HIV transmission. Despite recommendations that health care providers screen and counsel their HIV-positive patients for ongoing behavioral risks, it is unknown how to best provide “prevention with positives” in clinical settings. Positive Choice, an interactive, patient-tailored computer program, was developed in the United States to improve clinic-based assessment and counseling for risky behaviors.

Methodology and Findings

We conducted a parallel-group randomized controlled trial (December 2003–September 2006) at 5 San Francisco area outpatient HIV clinics. Eligible patients (HIV-positive English-speaking adults) completed an in-depth computerized risk assessment. Participants reporting substance use or sexual risks (n = 476) were randomized in stratified blocks. The intervention group received tailored risk-reduction counseling from a “Video Doctor” via laptop computer and a printed Educational Worksheet; providers received a Cueing Sheet on reported risks. Compared with control, fewer intervention participants reported continuing illicit drug use (RR 0.81, 95% CI: 0.689, 0.957, p = 0.014 at 3 months; and RR 0.65, 95% CI: 0.540, 0.785, p<0.001 at 6 months) and unprotected sex (RR 0.88, 95% CI: 0.773, 0.993, p = 0.039 at 3 months; and RR 0.80, 95% CI: 0.686, 0.941, p = 0.007 at 6 months). Intervention participants reported fewer mean days of ongoing illicit drug use (−4.0 days vs. −1.3 days, p = 0.346, at 3 months; and −4.7 days vs. −0.7 days, p = 0.130, at 6 months) than did controls, and had fewer casual sex partners (−2.3 vs. −1.4, p = 0.461, at 3 months; and −2.7 vs. −0.6, p = 0.042, at 6 months).

Conclusions

The Positive Choice intervention achieved significant cessation of illicit drug use and unprotected sex at the group level, and modest individual-level reductions in days of ongoing drug use and number of casual sex partners compared with the control group. Positive Choice, including Video Doctor counseling, is an efficacious and appropriate adjunct to risk-reduction efforts in outpatient settings, and holds promise as a public health HIV intervention.

Trial Registration

ClinicalTrials.gov NCT00447707

3.

Background

Current recommendations to prevent malaria in African pregnant women rely on insecticide treated nets (ITNs) and intermittent preventive treatment (IPTp). However, there is no information on the safety and efficacy of their combined use.

Methods

1030 pregnant Mozambican women of all gravidities received a long-lasting ITN during antenatal clinic (ANC) visits and, irrespective of HIV status, were enrolled in a randomised, double blind, placebo-controlled trial, to assess the safety and efficacy of 2-dose sulphadoxine-pyrimethamine (SP). The main outcome was the reduction in low birth weight.

Findings

Two-dose SP was safe and well tolerated, but was not associated with reductions in anaemia prevalence at delivery (RR, 0.92 [95% CI, 0.79–1.08]), low birth weight (RR, 0.99 [95% CI, 0.70–1.39]), or overall placental infection (p = 0.964). However, the SP group showed a 40% reduction (95% CI, 7.40–61.20; p = 0.020) in the incidence of clinical malaria during pregnancy, and reductions in the prevalence of peripheral parasitaemia (7.10% vs 15.15%) (p<0.001), and of actively infected placentas (7.04% vs 13.60%) (p = 0.002). There was a reduction in severe anaemia at delivery of borderline statistical significance (p = 0.055). These effects were not modified by gravidity or HIV status. Reported ITN use was more than 90% in both groups.

Conclusions

Two-dose SP was associated with a reduction in some indicators, but these did not translate into significant improvements in other maternal or birth outcomes. The use of ITNs during pregnancy may reduce the need to administer IPTp. ITNs should be part of the ANC package in sub-Saharan Africa.

Trial Registration

ClinicalTrials.gov NCT00209781

4.

Background

Behavioral interventions that promote adherence to antiretroviral medications may decrease HIV treatment failure. Antiretroviral treatment programs in sub-Saharan Africa confront increasing financial constraints to provide comprehensive HIV care, which include adherence interventions. This study compared the impact of counseling and use of an alarm device on adherence and biological outcomes in a resource-limited setting.

Methods and Findings

A randomized controlled, factorial designed trial was conducted in Nairobi, Kenya. Antiretroviral-naïve individuals initiating free highly active antiretroviral therapy (HAART) in the form of fixed-dose combination pills (d4T, 3TC, and nevirapine) were randomized to one of four arms: counseling (three counseling sessions around HAART initiation), alarm (pocket electronic pill reminder carried for 6 months), counseling plus alarm, and neither counseling nor alarm. Participants were followed for 18 months after HAART initiation. Primary study endpoints included plasma HIV-1 RNA and CD4 count every 6 months, mortality, and adherence measured by monthly pill count. Between May 2006 and September 2008, 400 individuals were enrolled, 362 initiated HAART, and 310 completed follow-up. Participants who received counseling were 29% less likely to have monthly adherence <80% (hazard ratio [HR] = 0.71; 95% confidence interval [CI] 0.49–1.01; p = 0.055) and 59% less likely to experience viral failure (HIV-1 RNA ≥5,000 copies/ml) (HR 0.41; 95% CI 0.21–0.81; p = 0.01) compared to those who received no counseling. There was no significant impact of using an alarm on poor adherence (HR 0.93; 95% CI 0.65–1.32; p = 0.7) or viral failure (HR 0.99; 95% CI 0.53–1.84; p = 1.0) compared to those who did not use an alarm. Neither counseling nor alarm was significantly associated with mortality or rate of immune reconstitution.

Conclusions

Intensive early adherence counseling at HAART initiation resulted in sustained, significant impact on adherence and virologic treatment failure during 18-month follow-up, while use of an alarm device had no effect. As antiretroviral treatment clinics expand to meet an increasing demand for HIV care in sub-Saharan Africa, adherence counseling should be implemented to decrease the development of treatment failure and spread of resistant HIV.

Trial Registration

ClinicalTrials.gov NCT00273780

5.

Background

Oncogenic BRAF mutations have been found in diverse malignancies and activate RAF/MEK/ERK signaling, a critical pathway of tumorigenesis. We examined the clinical characteristics and outcomes of patients with mutant (mut) BRAF advanced cancer referred to phase 1 clinic.

Methods

We reviewed the records of 80 consecutive patients with mutBRAF advanced malignancies and 149 with wild-type (wt) BRAF (matched by tumor type) referred to the Clinical Center for Targeted Therapy and analyzed their outcome.

Results

Of 80 patients with mutBRAF advanced cancer, 56 had melanoma, 10 colorectal, 11 papillary thyroid, 2 ovarian and 1 esophageal cancer. Mutations in codon 600 were found in 77 patients (62, V600E; 13, V600K; 1, V600R; 1, unreported). Multivariate analysis showed less soft tissue (odds ratio (OR) = 0.39, 95%CI: 0.20–0.77, p = 0.007), lung (OR = 0.38, 95%CI: 0.19–0.73, p = 0.004) and retroperitoneal metastases (OR = 0.34, 95%CI: 0.13–0.86, p = 0.024) and more brain metastases (OR = 2.05, 95%CI: 1.02–4.11, p = 0.043) in patients with mutBRAF versus wtBRAF. Compared with their wtBRAF counterparts, mutBRAF melanoma patients showed a non-significant trend toward longer median survival from diagnosis (131 vs. 78 months, p = 0.14), while mutBRAF colorectal cancer patients showed a non-significant trend toward shorter median survival from diagnosis (48 vs. 53 months, p = 0.22). In melanoma, V600K mutations in comparison to other BRAF mutations were associated with more frequent brain (75% vs. 36.3%, p = 0.02) and lung metastases (91.6% vs. 47.7%, p = 0.007), and shorter time from diagnosis to metastasis and to death (19 vs. 53 months, p = 0.046 and 78 vs. 322 months, p = 0.024, respectively). Treatment with RAF/MEK-targeting agents (hazard ratio (HR) = 0.16, 95%CI: 0.03–0.89, p = 0.037) and any decrease in tumor size after referral (HR = 0.07, 95%CI: 0.015–0.35, p = 0.001) correlated with longer survival in mutBRAF patients.

Conclusions

BRAF appears to be a druggable mutation that also defines subgroups of patients with phenotypic overlap, albeit with differences that correlate with histology or site of mutation.

6.

Background

Buruli ulcer is an infectious disease involving the skin, caused by Mycobacterium ulcerans. Its exact transmission mechanism remains unknown. Several arguments indicate a possible role for insects in its transmission. A previous case-control study in the Nyong valley region in central Cameroon showed an unexpected association between bed net use and protection against Buruli ulcer. We investigated whether this association persisted in a newly discovered endemic Buruli ulcer focus in Bankim, northwestern Cameroon.

Methodology/Principal Findings

We conducted a case-control study on 77 Buruli ulcer cases and 153 age-, gender- and village-matched controls. Participants were interviewed about their activities and habits. Multivariate conditional logistic regression analysis identified systematic use of a bed net (Odds-Ratio (OR) = 0.4, 95% Confidence Interval [95%CI] = [0.2–0.9], p-value (p) = 0.04), cleansing wounds with soap (OR [95%CI] = 0.1 [0.03–0.3], p<0.0001) and growing cassava (OR [95%CI] = 0.3 [0.2–0.7], p = 0.005) as independent protective factors. Independent risk factors were bathing in the Mbam River (OR [95%CI] = 6.9 [1.4–35], p = 0.02) and reporting scratch lesions after insect bites (OR [95%CI] = 2.7 [1.4–5.4], p = 0.004). The proportion of cases that could be prevented by systematic bed net use was 32%, and by adequate wound care was 34%.
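
To make the "proportion of cases that could be prevented" figure concrete, the sketch below applies a standard case-based attributable-fraction formula to the reported bed net odds ratio; the exposure prevalence among cases is a hypothetical value chosen only so the result lands near 32%, and this is not necessarily the exact calculation used in the study.

    # Case-based attributable (preventable) fraction for a protective exposure:
    # work through the odds ratio of the complementary exposure (not using a bed net).
    or_bed_net = 0.4                  # protective OR reported for systematic bed net use
    or_non_use = 1.0 / or_bed_net     # OR for NOT sleeping under a bed net systematically
    p_cases_non_use = 0.53            # assumed share of cases not using a net (hypothetical)

    preventable_fraction = p_cases_non_use * (or_non_use - 1.0) / or_non_use
    print(f"preventable fraction ≈ {preventable_fraction:.0%}")   # ≈ 32%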

Conclusions/Significance

Our study confirms that two previously identified factors, adequate wound care and bed net use, significantly decreased the risk of Buruli ulcer. These associations withstand generalization to different geographic, climatic and epidemiologic settings. The involvement of insects in the household environment and the relationship between wound hygiene and M. ulcerans infection should now be investigated.

7.

Background

Thymic stromal lymphopoietin (TSLP), an IL7-like cytokine produced by bronchial epithelial cells is upregulated in asthma and induces dendritic cell maturation supporting a Th2 response. Environmental pollutants, including tobacco smoke and diesel exhaust particles upregulate TSLP suggesting that TSLP may be an interface between environmental pollution and immune responses in asthma. Since asthma is prevalent in urban communities, variants in the TSLP gene may be important in asthma susceptibility in these populations.

Objectives

To determine whether genetic variants in TSLP are associated with asthma in an urban admixed population.

Methodology and Main Results

Ten tag-SNPs in the TSLP gene were analyzed for association with asthma using 387 clinically diagnosed asthmatic cases and 212 healthy controls from an urban admixed population. One SNP (rs1898671) showed nominally significant association with asthma (odds ratio (OR) = 1.50; 95% confidence interval (95% CI): 1.09–2.05, p = 0.01) after adjusting for age, BMI, income, education and population stratification. Association results were consistent using two different approaches to adjust for population stratification. When stratified by smoking status, the same SNP showed a significantly increased risk of asthma in ex-smokers (OR = 2.00, 95% CI: 1.04–3.83, p = 0.04) but no significant association in never-smokers (OR = 1.34; 95% CI: 0.93–1.94, p = 0.11). A haplotype-specific score test indicated that an elevated risk for asthma was associated with a specific haplotype of TSLP involving SNP rs1898671 (OR = 1.58, 95% CI: 1.10–2.27, p = 0.01). Association of this SNP with asthma was confirmed in an independent large population-based cohort consortium study (OR = 1.15, 95% CI: 1.07–1.23, p = 0.0003) and the results stratified by smoking status were also validated (ex-smokers: OR = 1.21, 95% CI: 1.08–1.34, p = 0.003; never-smokers: OR = 1.06, 95% CI: 0.94–1.17, p = 0.33).

Conclusions

Genetic variants in TSLP may contribute to asthma susceptibility in admixed urban populations, possibly through a gene-environment interaction.

8.

Background

Given that micronutrient deficiency, neglected intestinal parasitic infections (IPIs) and poor socioeconomic status are closely linked, we conducted a cross-sectional study to assess the relationship between IPIs and nutritional status of children living in remote and rural areas in West Malaysia.

Methods/Findings

A total of 550 children participated, comprising 520 (94.5%) school children aged 7 to 12 years old, 30 (5.5%) young children aged 1 to 6 years old, 254 (46.2%) boys and 296 (53.8%) girls. Of the 550 children, 26.2% were anaemic, 54.9% iron deficient and 16.9% had iron deficiency anaemia (IDA). The overall prevalence of helminths was 76.5%, comprising Trichuris trichiura (71.5%), Ascaris lumbricoides (41.6%) and hookworm infection (13.5%). Iron deficiency was significantly more common in girls than in boys (p = 0.032). Univariate analysis demonstrated that a low level of mother's education (OR = 2.52; 95% CI = 1.38–4.60; p = 0.002), non-working parents (OR = 2.18; 95% CI = 2.06–2.31; p = 0.013), low household income (OR = 2.02; 95% CI = 1.14–3.59; p = 0.015), T. trichiura (OR = 2.15; 95% CI = 1.21–3.81; p = 0.008) and A. lumbricoides infections (OR = 1.63; 95% CI = 1.04–2.55; p = 0.032) were significantly associated with the high prevalence of IDA. Multivariate analysis confirmed that a low level of mother's education (OR = 1.48; 95% CI = 1.33–2.58; p<0.001) was a significant predictor of IDA in these children.

Conclusion

A comprehensive primary health care programme for these communities, encompassing periodic de-worming, nutritional supplementation, improved household income, education, sanitation and personal hygiene, is crucial to improving the nutritional status of these children.

9.

Objective

Biological evidence suggests that inflammation might induce type 2 diabetes (T2D), and epidemiological studies have shown an association between higher white blood cell count (WBC) and T2D. However, the association has not been systematically investigated.

Research Design and Methods

Studies were identified through computer-based and manual searches. Previously unreported studies were sought through correspondence. 20 studies were identified (8,647 T2D cases and 85,040 non-cases). Estimates of the association of WBC with T2D were combined using random effects meta-analysis; sources of heterogeneity as well as presence of publication bias were explored.
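
For readers unfamiliar with random-effects pooling, the sketch below shows one common approach (DerSimonian-Laird) together with Cochran's Q and I2, computed on made-up study estimates rather than the 20 studies pooled here.

    import numpy as np

    # Hypothetical study-level relative risks with 95% CIs (not the actual 20 studies)
    rr = np.array([1.5, 1.8, 1.4, 2.0, 1.3])
    lo = np.array([1.1, 1.2, 1.0, 1.4, 0.9])
    hi = np.array([2.0, 2.7, 2.0, 2.9, 1.9])

    y = np.log(rr)                                  # log relative risks
    se = (np.log(hi) - np.log(lo)) / (2 * 1.96)     # SE recovered from the CI width
    w = 1.0 / se**2                                 # inverse-variance (fixed-effect) weights

    y_fe = np.sum(w * y) / np.sum(w)                # fixed-effect pooled estimate
    Q = np.sum(w * (y - y_fe) ** 2)                 # Cochran's Q
    k = len(y)
    I2 = max(0.0, (Q - (k - 1)) / Q)                # heterogeneity as a proportion

    # DerSimonian-Laird between-study variance and random-effects pooling
    tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1.0 / (se**2 + tau2)
    y_re = np.sum(w_re * y) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))

    print(f"pooled RR = {np.exp(y_re):.2f} "
          f"(95% CI {np.exp(y_re - 1.96 * se_re):.2f}-{np.exp(y_re + 1.96 * se_re):.2f}), "
          f"I2 = {I2:.0%}")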

Results

The combined relative risk (RR) comparing the top to bottom tertile of the WBC count was 1.61 (95% CI: 1.45; 1.79, p = 1.5×10−18). Substantial heterogeneity was present (I2 = 83%). For granulocytes the RR was 1.38 (95% CI: 1.17; 1.64, p = 1.5×10−4), for lymphocytes 1.26 (95% CI: 1.02; 1.56, p = 0.029), and for monocytes 0.93 (95% CI: 0.68; 1.28, p = 0.67) comparing top to bottom tertile. In cross-sectional studies, the RR was 1.74 (95% CI: 1.49; 2.02, p = 7.7×10−13), while in cohort studies it was 1.48 (95% CI: 1.22; 1.79, p = 7.7×10−5). We assessed the impact of confounding in the EPIC-Norfolk study and found that the age- and sex-adjusted HR of 2.19 (95% CI: 1.74; 2.75) was attenuated to 1.82 (95% CI: 1.45; 2.29) after further accounting for smoking, T2D family history, physical activity, education, BMI and waist circumference.

Conclusions

A raised WBC is associated with a higher risk of T2D. The presence of publication bias and the failure to control for all potential confounders in all studies mean that the observed association is likely to be an overestimate.

10.

Background

Chagas' disease is an important neglected public health problem in many Latin American countries, but population-based epidemiological data are scarce. Here we present a nationwide analysis of Chagas-associated mortality and of risk factors for death from this disease.

Methodology/Principal Findings

We analyzed all death certificates of individuals who died between 1999 and 2007 in Brazil, based on the nationwide Mortality Information System (a total of 243 data sets with about 9 million entries). Chagas' disease was mentioned in 53,930 (0.6%) of death certificates, with 44,537 (82.6%) as an underlying cause and 9,387 (17.4%) as an associated cause of death. Acute Chagas' disease was responsible for 2.8% of deaths. The mean standardized mortality rate was 3.36/100,000 inhabitants/year. Nationwide standardized mortality rates declined gradually, from 3.78 (1999) to 2.78 (2007) deaths/year per 100,000 inhabitants (−26.4%). Standardized mortality rates were highest in the Central-West region, ranging from 15.23 in 1999 to 9.46 in 2007 (−37.9%), with a significant negative linear trend (p = 0.001; R2 = 82%). Proportional mortality considering multiple causes of death was 0.60%. The Central-West showed the highest proportional mortality among regions (2.17%), with a significant negative linear trend, from 2.28% to 1.90% (−19.5%; p = 0.001; R2 = 84%). There was a significant increase in the Northeast of 38.5% (p = 0.006; R2 = 82%). Bivariable analysis of risk factors for death from Chagas' disease showed the highest relative risks (RR) in older age groups (RR: 10.03; 95% CI: 9.40–10.70; p<0.001) and in those residing in the Central-West region (RR: 15.01; 95% CI: 3.90–16.22; p<0.001). In logistic regression analysis, age ≥30 years (adjusted OR: 10.81; 95% CI: 10.03–10.65; p<0.001) and residence in one of the three high-risk federative units, Minas Gerais, Goiás or the Federal District (adjusted OR: 5.12; 95% CI: 5.03–5.22, p<0.001), remained important independent risk factors for death from Chagas' disease.

Conclusions/Significance

This is the first nationwide population-based study on Chagas mortality in Brazil, considering multiple causes of death. Despite the decline of mortality associated with Chagas' disease in Brazil, the disease remains a serious public health problem with marked regional differences.

11.

Background

It is not known whether delivering acupuncture triggers the mechanisms usually attributed to placebo, nor whether verum or sham acupuncture reduces radiotherapy-induced emesis more than standard care.

Methodology/Principal Findings

Cancer patients receiving radiotherapy over abdominal/pelvic regions were randomized to verum (penetrating) acupuncture (n = 109; 99 provided data) in the alleged antiemetic acupuncture point PC6 or sham acupuncture (n = 106; 101 provided data) performed with a telescopic non-penetrating needle at a sham point 2–3 times/week during the whole radiotherapy period. The acupuncture cohort was compared to a reference cohort receiving standard care (n = 62; 62 provided data). The occurrence of emesis in each group was compared after a mean dose of 27 Gray. Nausea and vomiting were experienced during the preceding week by 37% and 8% in the verum acupuncture group, 38% and 7% in the sham acupuncture group and 63% and 15% in the standard care group, respectively. The lower occurrence of nausea in the acupuncture cohort (verum and sham) compared to patients receiving standard care (37% versus 63%, relative risk (RR) 0.6, 95% confidence interval (CI) 0.5–0.8) persisted after adjustment for potential confounding factors for nausea (RR 0.8, CI 0.6 to 0.9). Nausea intensity was lower in the acupuncture cohort (78% no nausea, 13% a little, 8% moderate, 1% much) than in the standard care cohort (52% no nausea, 32% a little, 15% moderate, 2% much) (p = 0.002). Most of the acupuncture cohort (95%) expected antiemetic effects from their treatment. Patients who expected nausea had an increased risk of nausea compared with patients who expected a low risk of nausea (RR 1.6; CI 1.2–2.4).

Conclusions/Significance

Patients treated with verum or sham acupuncture experienced less nausea and vomiting compared to patients receiving standard care, possibly through a general care effect or due to the high level of patient expectancy.

Trial Registration

ClinicalTrials.gov NCT00621660

12.

Background

Several studies have shown an association between vitamin D deficiency and cardiovascular risk. Vitamin D status is assessed by determination of 25-hydroxyvitamin D [25(OH)D] in serum.

Methods

We assessed the prognostic utility of 25(OH)D in 982 chest-pain patients with suspected acute coronary syndrome (ACS) from Salta, Northern Argentina. 2-year follow-up data including all-cause mortality, cardiac death and sudden cardiac death were analyzed in quartiles of 25(OH)D, applying univariate and multivariate analysis.

Results

There were statistically significant seasonal changes in 25(OH)D levels. At follow-up, 119 patients had died. Mean 25(OH)D levels were significantly lower among patients who died than in long-term survivors, both in the total population and in patients with a troponin T (TnT) release (n = 388). Comparing the highest to the lowest quartile of 25(OH)D in a multivariable Cox regression model, the hazard ratios (HR) for all-cause mortality, cardiac death and sudden cardiac death in the total population were 0.37 (95% CI, 0.19–0.73; p = 0.004), 0.23 (95% CI, 0.08–0.67; p = 0.007) and 0.32 (95% CI, 0.11–0.94; p = 0.038), respectively. In patients with TnT release, the respective HRs were 0.24 (95% CI, 0.10–0.54; p = 0.001), 0.18 (95% CI, 0.05–0.60; p = 0.006) and 0.25 (95% CI, 0.07–0.89; p = 0.033). 25(OH)D had no prognostic value in patients with no TnT release.

Conclusion

Vitamin D, measured at admission, was shown to be a useful biomarker for predicting mortality in chest-pain patients with suspected ACS.

Trial Registration

ClinicalTrials.gov NCT01377402

13.

Background

Migraine is associated with an increased risk for cardiovascular disease (CVD). Both migraine and CVD are highly heritable. However, the genetic liability for CVD among migraineurs is unclear.

Methods

We performed a genome-wide association study for incident CVD events during 12 years of follow-up among 5,122 migraineurs participating in the population-based Women's Genome Health Study. Migraine was self-reported and CVD events were confirmed after medical records review. We calculated odds ratios (OR) and 95% confidence intervals (CI) and considered a genome-wide p-value <5×10−8 as significant.
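
As a rough sketch of the arithmetic behind a per-SNP odds ratio, Wald confidence interval and p-value, and of the 5×10−8 genome-wide threshold, the code below works through a single hypothetical 2×2 table of carrier counts; all counts are invented and the study's actual per-SNP models may differ.

    import numpy as np
    from scipy.stats import norm

    # Hypothetical 2x2 table for one SNP: [carriers, non-carriers]
    cases    = np.array([30, 134])       # women with an incident CVD event (invented split)
    controls = np.array([400, 4558])     # event-free women (invented split)

    a, b = cases
    c, d = controls
    or_ = (a * d) / (b * c)                          # cross-product odds ratio
    se_log_or = np.sqrt(1/a + 1/b + 1/c + 1/d)       # Woolf's SE on the log scale
    z = norm.ppf(0.975)
    ci_lo, ci_hi = np.exp(np.log(or_) + np.array([-z, z]) * se_log_or)
    p = 2 * norm.sf(abs(np.log(or_)) / se_log_or)    # Wald test p-value

    print(f"OR = {or_:.2f} (95% CI {ci_lo:.2f}-{ci_hi:.2f}), p = {p:.1e}; "
          f"genome-wide significant: {p < 5e-8}")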

Results

Among the 5,122 women with migraine 164 incident CVD events occurred during follow-up. No SNP was associated with major CVD, ischemic stroke, myocardial infarction, or CVD death at the genome-wide level; however, five SNPs showed association with p<5×10−6. Among migraineurs with aura rs7698623 in MEPE (OR = 6.37; 95% CI 3.15–12.90; p = 2.7×10−7) and rs4975709 in IRX4 (OR = 5.06; 95% CI 2.66–9.62; p = 7.7×10−7) appeared to be associated with ischemic stroke, rs2143678 located close to MDF1 with major CVD (OR = 3.05; 95% CI 1.98–4.69; p = 4.3×10−7), and the intergenic rs1406961 with CVD death (OR = 12.33; 95% CI 4.62–32.87; p = 5.2×10−7). Further, rs1047964 in BACE1 appeared to be associated with CVD death among women with any migraine (OR = 4.67; 95% CI 2.53–8.62; p = 8.0×10−7).

Conclusion

Our results provide some suggestion of an association of five SNPs with CVD events among women with migraine; none of the results was genome-wide significant. Four associations appeared among migraineurs with aura, two of those with ischemic stroke. Although our population is among the largest with migraine and incident CVD information, these results must be treated with caution, given the limited number of CVD events among women with migraine and the low minor allele frequencies for three of the SNPs. Our results await independent replication and should be considered hypothesis-generating for future research.

14.

Introduction

Pre-temozolomide studies demonstrated that loss of the tumor suppressor gene PTEN held independent prognostic significance in GBM patients. We investigated whether loss of PTEN predicted shorter survival in the temozolomide era. The role of PTEN in the PI3K/Akt pathway is also reviewed.

Methods

Patients with histologically proven newly diagnosed GBM were identified from a retrospective database between 2007 and 2010. Cox proportional hazards analysis was used to calculate the independent effects of PTEN expression, age, extent of resection, Karnofsky performance scale (KPS), and treatment on overall survival.

Results

Sixty-five percent of patients were men, the median age was 63 years, and 70% had KPS≥80. Most patients (81%) received standard treatment (temozolomide with concurrent radiation). A total of 72 (47%) patients had retained PTEN expression. Median overall survival (OS) was 19.1 months (95% CI: 15.0–22.5). Median survival of 20.0 months (95% CI: 15.0–25.5) and 18.2 months (95% CI: 13.0–25.7) was observed in PTEN-retained and PTEN-loss patients, respectively (p = .71). PTEN-loss patients were also found to have amplification of the EGFR gene more frequently than patients with retained PTEN (70.8% vs. 47.8%, p = .01). Multivariate analysis showed that older age (HR 1.64, CI: 1.02–2.63, p = .04), low KPS (HR 3.57, CI: 2.20–5.79, p<.0001), and lack of standard treatment (HR 3.98, CI: 2.38–6.65, p<.0001) were associated with worse survival. PTEN loss was not prognostic of overall survival (HR 1.31, CI: 0.85–2.03, p = .22).

Conclusions

Loss of PTEN expression does not confer poor overall survival in the temozolomide era. These findings imply a complex and non-linear molecular relationship between PTEN, its regulators and its effectors in the tumorigenesis of glioblastoma. Additionally, there is evidence that temozolomide may be more effective in eradicating GBM cells with PTEN loss and may hence level the outcomes between the PTEN-retained and PTEN-loss groups.

15.

Background

The clinical and scientific use of patient-reported outcome measures is increasing in health services, and paper forms are often used. Manual double entry of data is regarded as the gold standard for transferring data to an electronic format, but the process is laborious. Automated forms processing may be an alternative, but further validation is warranted.

Methods

200 patients were randomly selected from a cohort of 5777 patients who had previously answered two different questionnaires. The questionnaires were scanned using an automated forms processing technique, as well as processed by single and double manual data entry, using the EpiData Entry data entry program. The main outcome measure was the proportion of correctly entered numbers at question, form and study level.

Results

Manual double-key data entry (error proportion per 1,000 fields = 0.046; 95% CI: 0.001–0.258) performed better than single-key data entry (error proportion per 1,000 fields = 0.370; 95% CI: 0.160–0.729; p = 0.020). There was no statistically significant difference between Optical Mark Recognition (error proportion per 1,000 fields = 0.046; 95% CI: 0.001–0.258) and double-key data entry (p = 1.000). With the Intelligent Character Recognition method, there was no statistically significant difference compared to single-key data entry (error proportion per 1,000 fields = 6.734; 95% CI: 0.817–24.113; p = 0.656), nor compared to double-key data entry (error proportion per 1,000 fields = 3.367; 95% CI: 0.085–18.616; p = 0.319).
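
To make the "errors per 1,000 fields" metric concrete, here is a small sketch that derives an error proportion and an exact (Clopper-Pearson) confidence interval from raw counts; the counts are assumed for illustration and merely chosen to land near the double-key figure quoted above.

    from statsmodels.stats.proportion import proportion_confint

    # Assumed counts: 1 error in roughly 21,700 entered fields (illustrative only)
    errors, fields = 1, 21700

    rate_per_1000 = 1000 * errors / fields
    ci_lo, ci_hi = proportion_confint(errors, fields, alpha=0.05, method="beta")  # exact CI
    print(f"{rate_per_1000:.3f} errors per 1,000 fields "
          f"(95% CI {1000 * ci_lo:.3f}-{1000 * ci_hi:.3f})")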

Conclusions

Automated forms processing is a valid alternative to double manual data entry for highly structured forms containing only check boxes, numerical codes and no dates. Automated forms processing can be superior to single manual data entry through a data entry program, depending on the method chosen.

16.
Z Zhao, S Li, G Liu, F Yan, X Ma, Z Huang, H Tian. PLoS ONE 2012, 7(7): e41641

Background and Objective

Emerging evidence from biological and epidemiological studies has suggested that body iron stores and heme-iron intake may be related to the risk of type 2 diabetes (T2D). We aimed to examine the association of body iron stores and heme-iron intake with T2D risk by conducting a systematic review and meta-analysis of previously published studies.

Research Design and Methods

A systematic review and subsequent meta-analysis were conducted by searching the MEDLINE database up to June 22, 2012 to identify studies that analyzed the association of body iron stores or dietary heme-iron intake with T2D risk. The meta-analysis was performed using the effect estimates and 95% confidence intervals (CIs) to calculate the pooled risk estimates, while the heterogeneity among studies was examined using the I2 and Q statistics.

Results

The meta-analysis included 16 high-quality studies: 12 studies analyzed ferritin levels (4,366 T2D patients and 41,091 controls) and 4 measured heme-iron intake (9,246 T2D patients and 179,689 controls). The combined relative risk (RR) comparing the highest and lowest category of ferritin levels was 1.66 (95% CI: 1.15–2.39) for prospective studies and 2.29 (95% CI: 1.48–3.54) for cross-sectional studies, with heterogeneity in both (Q = 14.84, p = 0.01, I2 = 66.3%, and Q = 44.16, p<0.001, I2 = 88.7%, respectively). The combined RR comparing the highest and lowest category of heme-iron intake was 1.31 (95% CI: 1.21–1.43), with no evidence of heterogeneity (Q = 1.39, p = 0.71, I2 = 0%). No publication bias was found. An additional 15 studies that were of good quality, had significant results, and analyzed the association between body iron stores and T2D risk were included qualitatively in the systematic review.

Conclusions

The meta-analysis and systematic review suggest that increased ferritin levels and heme-iron intake are both associated with higher risk of T2D.

17.

Objectives

Generic triage risk assessments are widely used in the emergency department (ED), but have not been validated for prediction of short-term risk among patients with acute heart failure (HF). Our objective was to evaluate the Canadian Triage Acuity Scale (CTAS) for prediction of early death among HF patients.

Methods

We included patients presenting with HF to an ED in Ontario from Apr 2003 to Mar 2007. We used the National Ambulatory Care Reporting System and vital statistics databases to examine care and outcomes.

Results

Among 68,380 patients (76±12 years, 49.4% men), early mortality was stratified with death rates of 9.9%, 1.9%, 0.9%, and 0.5% at 1-day, and 17.2%, 5.9%, 3.8%, and 2.5% at 7-days, for CTAS 1, 2, 3, and 4–5, respectively. Compared to lower acuity (CTAS 4–5) patients, adjusted odds ratios (aOR) for 1-day death were 1.32 (95%CI; 0.93–1.88; p = 0.12) for CTAS 3, 2.41 (95%CI; 1.71–3.40; p<0.001) for CTAS 2, and highest for CTAS 1: 9.06 (95%CI; 6.28–13.06; p<0.001). Predictors of triage-critical (CTAS 1) status included oxygen saturation <90% (aOR 5.92, 95%CI; 3.09–11.81; p<0.001), respiratory rate >24 breaths/minute (aOR 1.96, 95%CI; 1.05–3.67; p = 0.034), and arrival by paramedic (aOR 3.52, 95%CI; 1.70–8.02; p = 0.001). While age/sex-adjusted CTAS score provided good discrimination for ED (c-statistic = 0.817) and 1-day (c-statistic = 0.724) death, mortality prediction was improved further after accounting for cardiac and non-cardiac co-morbidities (c-statistics 0.882 and 0.810, respectively; both p<0.001).
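
The c-statistics quoted here are areas under the ROC curve of the fitted risk models. The sketch below shows the general recipe on simulated triage data; the coefficients, event rates and two-predictor model are invented stand-ins, not the study's age/sex- and comorbidity-adjusted models.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    # Simulated HF triage data (hypothetical coefficients, not the study model)
    rng = np.random.default_rng(42)
    n = 5000
    ctas = rng.integers(1, 5, n)                     # CTAS category, 1 = most acute
    age = rng.normal(76, 12, n)
    logit = -6.0 + 1.2 * (5 - ctas) + 0.02 * (age - 76)
    died_1d = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

    X = np.column_stack([ctas, age])
    model = LogisticRegression().fit(X, died_1d)
    c_stat = roc_auc_score(died_1d, model.predict_proba(X)[:, 1])
    print(f"c-statistic (1-day death) = {c_stat:.3f}")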

Conclusions

A semi-quantitative triage acuity scale assigned at ED presentation and based largely on respiratory factors predicted emergent death among HF patients.

18.

Background

Our objective was to examine whether gestational diabetes mellitus (GDM) or newborns' high birthweight can be prevented by lifestyle counseling in pregnant women at high risk of GDM.

Method and Findings

We conducted a cluster-randomized trial, the NELLI study, in 14 municipalities in Finland, where 2,271 women were screened by oral glucose tolerance test (OGTT) at 8–12 wk gestation. Euglycemic (n = 399) women with at least one GDM risk factor (body mass index [BMI] ≥25 kg/m2, glucose intolerance or newborn's macrosomia (≥4,500 g) in any earlier pregnancy, family history of diabetes, age ≥40 y) were included. The intervention included individual intensified counseling on physical activity, diet and weight gain at five antenatal visits. Primary outcomes were incidence of GDM as assessed by OGTT (maternal outcome) and newborns' birthweight adjusted for gestational age (neonatal outcome). Secondary outcomes were maternal weight gain and the need for insulin treatment during pregnancy. Adherence to the intervention was evaluated on the basis of changes in physical activity (weekly metabolic equivalent task (MET) minutes) and diet (intake of total fat, saturated and polyunsaturated fatty acids, saccharose, and fiber). Multilevel analyses took into account cluster-, maternity clinic- and nurse-level influences in addition to age, education, parity, and prepregnancy BMI. 15.8% (34/216) of women in the intervention group and 12.4% (22/179) in the usual care group developed GDM (absolute effect size 1.36, 95% confidence interval [CI] 0.71–2.62, p = 0.36). Neonatal birthweight was lower in the intervention than in the usual care group (absolute effect size −133 g, 95% CI −231 to −35, p = 0.008), as was the proportion of large-for-gestational-age (LGA) newborns (26/216, 12.1% versus 34/179, 19.7%, p = 0.042). Women in the intervention group increased their intake of dietary fiber (adjusted coefficient 1.83, 95% CI 0.30–3.25, p = 0.023) and polyunsaturated fatty acids (adjusted coefficient 0.37, 95% CI 0.16–0.57, p<0.001), decreased their intake of saturated fatty acids (adjusted coefficient −0.63, 95% CI −1.12 to −0.15, p = 0.01) and of saccharose (adjusted coefficient −0.83, 95% CI −1.55 to −0.11, p = 0.023), and tended to have a smaller decrease in MET minutes/week for at least moderate-intensity activity (adjusted coefficient 91, 95% CI −37 to 219, p = 0.17) than women in the usual care group. In subgroup analysis, adherent women in the intervention group (n = 55/229) had a decreased risk of GDM (27.3% versus 33.0%, p = 0.43) and of LGA newborns (7.3% versus 19.5%, p = 0.03) compared to women in the usual care group.

Conclusions

The intervention was effective in controlling birthweight of the newborns, but failed to have an effect on maternal GDM.

Trial Registration

Current Controlled Trials ISRCTN33885819

19.

Objective

To compare the rate of decline of renal function in tenofovir- and abacavir-based antiretroviral therapy (ART) in low-body weight treatment-naïve patients with HIV infection.

Design

We conducted a single-center retrospective cohort study of 503 Japanese patients who commenced on either tenofovir- or abacavir-based initial ART.

Methods

The incidence of renal dysfunction, defined as more than 25% fall in estimated glomerular filtration rate (eGFR) from the baseline, was determined in each group. The effect of tenofovir on renal dysfunction was estimated by univariate and multivariate Cox hazards models as the primary exposure. Changes in eGFR until 96 weeks were estimated in both groups with a repeated measures mixed model.
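
A repeated-measures mixed model of this kind can be sketched with statsmodels as follows; the longitudinal eGFR values, decline slopes and visit schedule are simulated stand-ins rather than the cohort's data.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Simulated longitudinal eGFR measurements (assumed slopes, purely illustrative)
    rng = np.random.default_rng(7)
    weeks = np.arange(0, 97, 12)
    rows = []
    for pid in range(200):
        arm = "tenofovir" if pid % 2 else "abacavir"
        slope = -0.08 if arm == "tenofovir" else -0.03   # assumed eGFR change per week
        intercept = rng.normal(100, 10)
        for wk in weeks:
            rows.append((pid, arm, wk, intercept + slope * wk + rng.normal(0, 5)))
    df = pd.DataFrame(rows, columns=["patient", "arm", "week", "egfr"])

    # Random intercept per patient; the week:arm interaction captures differing slopes
    result = smf.mixedlm("egfr ~ week * arm", df, groups=df["patient"]).fit()
    print(result.summary())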

Results

The median body weight of the cohort was 64 kg. The estimated incidence of renal dysfunction in the tenofovir and the abacavir arm was 9.84 per 100 and 4.55 per 100 person-years, respectively. Tenofovir was significantly associated with renal dysfunction by univariate and multivariate analysis (HR = 1.747; 95% CI, 1.152–2.648; p = 0.009) (adjusted HR = 2.080; 95% CI, 1.339–3.232; p<0.001). In subgroup analysis of the patients stratified by intertertile baseline body weight, the effect of tenofovir on renal dysfunction was more evident in patients with lower baseline body weight by multivariate analysis (≤60 kg: adjusted HR = 2.771; 95%CI, 1.494–5.139; p = 0.001) (61–68 kg: adjusted HR = 1.908; 95%CI, 0.764–4.768; p = 0.167) (>68 kg: adjusted HR = 0.997; 95%CI, 0.318–3.121; p = 0.995). The fall in eGFR was significantly greater in the tenofovir arm than the abacavir arm after starting ART (p = 0.003).

Conclusion

The incidence of renal dysfunction in low-body-weight patients treated with tenofovir was twice as high as in those treated with abacavir. Close monitoring of renal function is recommended for tenofovir-treated patients with low body weight, especially those with a baseline body weight <60 kg.

20.

Background

Intermittent preventive treatment (IPT) is a promising malaria control strategy; however, the optimal regimen remains unclear. We conducted a randomized, single-blinded, placebo-controlled trial to evaluate the efficacy, safety, and tolerability of a single course of sulfadoxine-pyrimethamine (SP), amodiaquine + SP (AQ+SP) or dihydroartemisinin-piperaquine (DP) among schoolchildren to inform IPT.

Methods

Asymptomatic girls aged 8 to 12 years and boys aged 8 to 14 years enrolled in two primary schools in Tororo, Uganda were randomized to receive one of the study regimens or placebo, regardless of presence of parasitemia at enrollment, and followed for 42 days. The primary outcome was risk of parasitemia at 42 days. Survival analysis was used to assess differences between regimens.

Results

Of 780 enrolled participants, 769 (98.6%) completed follow-up and were assigned a treatment outcome. The risk of parasitemia at 42 days varied significantly between DP (11.7% [95% confidence interval (CI): 7.9, 17.1]), AQ+SP (44.3% [37.6, 51.5]), and SP (79.7% [95% CI: 73.6, 85.2], p<0.001). The risk of parasitemia in SP-treated children was no different than in those receiving placebo (84.6% [95% CI: 79.1, 89.3], p = 0.22). No serious adverse events occurred, but AQ+SP was associated with increased risk of vomiting compared to placebo (13.0% [95% CI: 9.1, 18.5] vs. 4.7% [95% CI: 2.5, 8.8], respectively, p = 0.003).

Conclusions

DP was the most efficacious and well-tolerated regimen tested, although AQ+SP appears to be a suitable alternative for IPT in schoolchildren. Use of SP for IPT may not be appropriate in areas with high-level SP resistance in Africa.

Trial Registration

ClinicalTrials.gov NCT00852371
