Similar documents
 Found 20 similar documents (search time: 31 ms)
1.
Zhou YH  Tang JY  Wu MJ  Lu J  Wei X  Qin YY  Wang C  Xu JF  He J 《PloS one》2011,6(9):e25142

Background

Folic acid is widely used to lower homocysteine concentrations and prevent adverse cardiovascular outcomes. However, the effect of folic acid on cardiovascular events is not clear at the present time. We carried out a comprehensive systematic review and meta-analysis to assess the effects of folic acid supplementation on cardiovascular outcomes.

Methodology and Principal Findings

We systematically searched Medline, EmBase, the Cochrane Central Register of Controlled Trials, reference lists of articles, and proceedings of major meetings for relevant literature. We included randomized placebo-controlled trials that reported on the effects of folic acid on cardiovascular events compared to placebo. Of 1,594 identified studies, we included 16 trials reporting data on 44,841 patients. These studies reported 8,238 major cardiovascular events, 2,001 strokes, 2,917 myocardial infarctions, and 6,314 deaths. Folic acid supplementation as compared to placebo had no effect on major cardiovascular events (RR, 0.98; 95% CI, 0.93–1.04), stroke (RR, 0.89; 95% CI, 0.78–1.01), myocardial infarction (RR, 1.00; 95% CI, 0.93–1.07), or deaths from any cause (RR, 1.00; 95% CI, 0.96–1.05). Moreover, folic acid as compared to placebo also had no effect on the following secondary outcomes: risk of revascularization (RR, 1.05; 95% CI, 0.95–1.16), acute coronary syndrome (RR, 1.06; 95% CI, 0.97–1.15), cancer (RR, 1.08; 95% CI, 0.98–1.21), vascular death (RR, 0.94; 95% CI, 0.88–1.02), or non-vascular death (RR, 1.06; 95% CI, 0.97–1.15).
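Pooled relative risks like those above are typically obtained by inverse-variance weighting of log relative risks across trials. A minimal sketch of DerSimonian-Laird random-effects pooling, using invented trial results rather than the review's data:

```python
import math

def pool_random_effects(rrs, cis, z=1.96):
    """Inverse-variance pooling of log relative risks with a
    DerSimonian-Laird estimate of between-study variance (tau^2)."""
    logs = [math.log(r) for r in rrs]
    # Standard errors recovered from 95% CIs: (ln(hi) - ln(lo)) / (2 * 1.96)
    ses = [(math.log(hi) - math.log(lo)) / (2 * z) for lo, hi in cis]
    w = [1.0 / s**2 for s in ses]
    fixed = sum(wi * li for wi, li in zip(w, logs)) / sum(w)
    q = sum(wi * (li - fixed) ** 2 for wi, li in zip(w, logs))  # Cochran's Q
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(rrs) - 1)) / c) if c > 0 else 0.0
    w_re = [1.0 / (s**2 + tau2) for s in ses]
    pooled = sum(wi * li for wi, li in zip(w_re, logs)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return math.exp(pooled), (math.exp(pooled - z * se),
                              math.exp(pooled + z * se))

# Three invented trials, for illustration only.
rr, ci = pool_random_effects([0.95, 1.02, 0.99],
                             [(0.85, 1.06), (0.90, 1.16), (0.88, 1.11)])
print(f"pooled RR {rr:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```

When Q does not exceed its degrees of freedom, tau^2 is truncated to zero and the estimate reduces to fixed-effects pooling.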

Conclusion/Significance

Folic acid supplementation has no effect on the incidence of major cardiovascular events, stroke, myocardial infarction, or all-cause mortality.

2.

Background

Most risk factors for ischemic stroke are known, but their combined population-attributable risk percent (PAR%) remains unclear in most countries. In a case-control study, we estimated the individual odds ratios (ORs) and the individual and combined PAR%, including risk factors not addressed in previous studies.

Methods

Cases and controls were selected from patients attending an emergency department. Cases were patients aged 45 years or older with a first episode of ischemic stroke, characterized by a focal neurological deficit or a change in mental status occurring during the previous 24 hours. Controls, matched to cases by age and gender, were selected from patients without neurological complaints.

Results

133 cases and 272 controls were studied. Odds ratios for ischemic stroke were: atrial fibrillation (27.3; CI 95% 7.5–99.9), left ventricular hypertrophy (20.3; CI 95% 8.8–46.4), history of hypertension (11.2; CI 95% 5.4–23.3), physical inactivity (6.6; CI 95% 3.3–13.1), low levels of HDL-cholesterol (5.0; CI 95% 2.8–8.9), heavy smoking (2.8; CI 95% 1.5–5.0), carotid bruit (2.5; CI 95% 1.3–4.6), diabetes (2.4; CI 95% 1.4–4.0), and alcohol abuse (2.1; CI 95% 1.1–4.0). The combination of these risk factors accounted for 98.9% (95% CI: 96.4%–99.7%) of the PAR% for all stroke.
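PAR% estimates of this kind combine an effect estimate with the prevalence of exposure. As a rough illustration, Levin's attributable-fraction formula can be sketched as follows; the exposure prevalences below are invented for illustration and are not taken from the study:

```python
def par_percent(p_exposed: float, rr: float) -> float:
    """Levin's population attributable risk percent:
    PAR% = 100 * p(RR - 1) / (p(RR - 1) + 1),
    where p is the exposure prevalence in the population.
    For a rare outcome, the OR approximates the RR."""
    excess = p_exposed * (rr - 1.0)
    return 100.0 * excess / (excess + 1.0)

# Hypothetical exposure prevalences paired with ORs from the abstract.
factors = {
    "history of hypertension": (0.30, 11.2),
    "physical inactivity":     (0.40, 6.6),
    "diabetes":                (0.10, 2.4),
}
for name, (p, odds_ratio) in factors.items():
    print(f"{name}: PAR% = {par_percent(p, odds_ratio):.1f}%")
```

Note that individual PAR% values do not simply sum to the combined PAR%, because exposures overlap; that is why studies estimate the combined value jointly.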

Conclusions

Nine easily identified risk factors explain almost 100% of the population-attributable risk for ischemic stroke.

3.

Background

Stavudine continues to be used in antiretroviral treatment (ART) regimens in many resource-limited settings. The use of zidovudine instead of stavudine in higher-risk patients to reduce the likelihood of lactic acidosis and hyperlactatemia (LAHL) has not been examined.

Methods

Antiretroviral-naïve, HIV-infected adults initiating ART between 2004 and 2007 were divided into cohorts of those initiated on stavudine- or zidovudine-containing therapy. We evaluated stavudine or zidovudine use, age, sex, body mass index (BMI), baseline CD4 cell count, creatinine, hemoglobin, alanine aminotransferase, and albumin as predictors of time to LAHL with Cox Proportional Hazards (PH) regression models.

Results

Among 2062 patients contributing 2747 patient years (PY), the combined incidence of LAHL was 3.2/100 PY in those initiating stavudine- and 0.34/100 PY in those initiating zidovudine-containing ART (RR 9.26, 95% CI: 1.28–66.93). In multivariable Cox PH analysis, stavudine exposure (HR 14.31, 95% CI: 5.79–35.30), female sex (HR 3.41, 95% CI: 1.89–6.19), higher BMI (HR 3.21, 95% CI: 2.16–4.77), higher creatinine (HR 1.63, 95% CI: 1.12–2.36), higher albumin (HR 1.04, 95% CI: 1.01–1.07), and lower CD4 cell count (HR 0.96, 95% CI: 0.92–1.0) at baseline were associated with higher LAHL rates. Among participants who started on stavudine, switching to zidovudine was associated with lower LAHL rates (HR 0.15, 95% CI: 0.06–0.35). Subgroup analysis limited to women with BMI ≥25 kg/m2 initiated on stavudine also showed that switching to zidovudine was protective when controlling for other risk factors (HR 0.21, 95% CI: 0.07–0.64).
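Incidence rate ratios like the RR 9.26 above are ratios of events per person-year between cohorts. A sketch of a crude rate ratio with a log-scale 95% CI, using invented counts rather than the cohort's data (the study's adjusted estimates come from Cox models, not this crude calculation):

```python
import math

def rate_ratio(events_a, py_a, events_b, py_b, z=1.96):
    """Crude incidence rate ratio (events per person-year) with a
    normal-approximation CI on the log scale: SE(ln RR) = sqrt(1/a + 1/b)."""
    ratio = (events_a / py_a) / (events_b / py_b)
    se = math.sqrt(1.0 / events_a + 1.0 / events_b)
    return ratio, (math.exp(math.log(ratio) - z * se),
                   math.exp(math.log(ratio) + z * se))

# Invented example: 30 events over 900 PY versus 4 events over 1,800 PY.
rr, ci = rate_ratio(30, 900, 4, 1800)
print(f"rate ratio {rr:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```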

Conclusions

Stavudine exposure, female sex, and higher BMI are strong, independent predictors for developing LAHL. Patients with risk factors for lactic acidosis have less LAHL while on zidovudine- rather than stavudine-containing ART. Switching patients from stavudine to zidovudine is protective. Countries continuing to use stavudine should avoid this drug in women and patients with higher BMI.

4.

Background

To review the epidemiologic evidence concerning previous lung diseases as risk factors for lung cancer, we conducted a systematic review and meta-analysis.

Methods

Relevant studies were identified through MEDLINE searches. Using random effects models, summary effects of specific previous conditions were evaluated separately and combined. Stratified analyses were conducted based on smoking status, gender, control sources and continent.

Results

A previous history of COPD, chronic bronchitis or emphysema conferred relative risks (RR) of 2.22 (95% confidence interval (CI): 1.66, 2.97) (from 16 studies), 1.52 (95% CI: 1.25, 1.84) (from 23 studies) and 2.04 (95% CI: 1.72, 2.41) (from 20 studies), respectively, and for all these diseases combined 1.80 (95% CI: 1.60, 2.11) (from 39 studies). The RR of lung cancer for subjects with a previous history of pneumonia was 1.43 (95% CI: 1.22–1.68) (from 22 studies) and for subjects with a previous history of tuberculosis was 1.76 (95% CI: 1.49, 2.08) (from 30 studies). Effects were attenuated when restricting the analysis to never smokers for COPD/emphysema/chronic bronchitis (RR = 1.22, 95% CI: 0.97–1.53), but remained significant for pneumonia (RR = 1.36, 95% CI: 1.10, 1.69) (from 8 studies) and tuberculosis (RR = 1.90, 95% CI: 1.45, 2.50) (from 11 studies).

Conclusions

Previous lung diseases are associated with an increased risk of lung cancer, and the evidence among never smokers supports a direct relationship between previous lung diseases and lung cancer.

5.

Background

Statin therapy reduces the risk of occlusive vascular events, but uncertainty remains about potential effects on cancer. We sought to provide a detailed assessment of any effects on cancer of lowering LDL cholesterol (LDL-C) with a statin using individual patient records from 175,000 patients in 27 large-scale statin trials.

Methods and Findings

Individual records of 134,537 participants in 22 randomised trials of statin versus control (median duration 4.8 years) and 39,612 participants in 5 trials of more intensive versus less intensive statin therapy (median duration 5.1 years) were obtained. Reducing LDL-C with a statin for about 5 years had no effect on newly diagnosed cancer or on death from such cancers in either the trials of statin versus control (cancer incidence: 3755 [1.4% per year [py]] versus 3738 [1.4% py], RR 1.00 [95% CI 0.96-1.05]; cancer mortality: 1365 [0.5% py] versus 1358 [0.5% py], RR 1.00 [95% CI 0.93–1.08]) or in the trials of more versus less statin (cancer incidence: 1466 [1.6% py] vs 1472 [1.6% py], RR 1.00 [95% CI 0.93–1.07]; cancer mortality: 447 [0.5% py] versus 481 [0.5% py], RR 0.93 [95% CI 0.82–1.06]). Moreover, there was no evidence of any effect of reducing LDL-C with statin therapy on cancer incidence or mortality at any of 23 individual categories of sites, with increasing years of treatment, for any individual statin, or in any given subgroup. In particular, among individuals with low baseline LDL-C (<2 mmol/L), there was no evidence that further LDL-C reduction (from about 1.7 to 1.3 mmol/L) increased cancer risk (381 [1.6% py] versus 408 [1.7% py]; RR 0.92 [99% CI 0.76–1.10]).

Conclusions

In 27 randomised trials, a median of five years of statin therapy had no effect on the incidence of, or mortality from, any type of cancer (or the aggregate of all cancer).

6.

Objectives

We prospectively examined whether socioeconomic status (SES) predicts incident type II diabetes (diabetes), a cardiovascular risk equivalent and burgeoning public health epidemic among women.

Methods

Participants included 23,992 women with HbA1c levels <6% and no CVD or diabetes at baseline, followed from February 1993 to March 2007. SES was measured by education and income, while diabetes was self-reported.

Results

Over 12.3 years of follow-up, 1,262 women developed diabetes. In age and race adjusted models, the relative risk of diabetes decreased with increasing education (<2 years of nursing, 2 to <4 years of nursing, bachelor's degree, master's degree, and doctorate: 1.0, 0.7 [95% Confidence Interval (CI), 0.6–0.8], 0.6 (95% CI, 0.5–0.7), 0.5 (95% CI, 0.4–0.6), 0.4 (95% CI, 0.3–0.5); ptrend<0.001). Adjustment for traditional and non-traditional cardiovascular risk factors attenuated this relationship (education: ptrend = 0.96). Similar associations were observed between income categories and diabetes.

Conclusion

Advanced education and increasing income were both inversely associated with incident diabetes even in this relatively well-educated cohort. This relationship was largely explained by behavioral factors, particularly body mass index.

7.

Background

Hearing difficulties are a major public health problem. Little is known about the risk of disability pension among people who have been on sickness absence due to these difficulties.

Methods

A cohort including all 4,687,756 individuals living in Sweden in 2005, aged 20–64, and not on disability or old-age pension, was followed through 2009. Incidence rate ratios (RR) of disability pension with 95% confidence intervals (CI) were estimated using Cox proportional hazard models.

Results

In multivariable models, individuals who had a sick-leave spell due to otoaudiological diagnoses in 2005 had a 1.52-fold (95% CI: 1.43–1.62) increased risk of being granted a disability pension compared to individuals on sick leave due to other diagnoses. Hearing and tinnitus sick-leave diagnoses were associated with risk of disability pension: RR 3.38, 95% CI: 3.04–3.75, and 3.30, 95% CI: 2.95–3.68, respectively. No association was observed between sick leave due to vertigo diagnoses and disability pension whereas otological diagnoses and no sick leave were inversely associated with risk of disability pension compared to non-otoaudiological sick-leave diagnoses. Sick leave due to otoaudiological diagnoses was positively associated with risk of disability pension due to otoaudiological diagnoses and sick leave due to a tinnitus diagnosis was also associated with risk of disability pension due to mental diagnoses. The risk of disability pension among individuals with hearing or tinnitus sick-leave diagnoses was highest in the age group 35–44. Moreover, men had a slightly higher risk.

Conclusion

This large cohort study suggests an increased risk of disability pension among those with sickness absence due to otoaudiological diagnoses, particularly hearing and tinnitus diagnoses, compared to those with sickness absence due to non-otoaudiological diagnoses.

8.
Chen J  Zhang R  Wang J  Liu L  Zheng Y  Shen Y  Qi T  Lu H 《PloS one》2011,6(11):e26827

Background

Interferon-gamma release assays (IGRAs) have provided a new method for the diagnosis of Mycobacterium tuberculosis infection. However, the role of IGRAs for the diagnosis of active tuberculosis (TB), especially in HIV-infected patients remains unclear.

Methods

We searched the PubMed, EMBASE and Cochrane databases to identify studies published between January 2001 and July 2011 that evaluated the use of QuantiFERON-TB Gold in-tube (QFT-GIT) and T-SPOT.TB (T-SPOT) on blood for the diagnosis of active TB in HIV-infected patients.

Results

The search identified 16 eligible studies that included 2801 HIV-infected individuals (637 culture confirmed TB cases). The pooled sensitivity for the diagnosis of active TB was 76.7% (95%CI, 71.6–80.5%) and 77.4% (95%CI, 71.4–82.6%) for QFT-GIT and T-SPOT, respectively, while the specificity was 76.1% (95%CI, 74.0–78.0%) and 63.1% (95%CI, 57.6–68.3%) after excluding the indeterminate results. Studies conducted in low/middle income countries showed slightly lower sensitivity and specificity compared with those in high-income countries. The proportion of indeterminate results was as high as 10% (95%CI, 8.8–11.3%) and 13.2% (95%CI, 10.6–16.0%) for QFT-GIT and T-SPOT, respectively.
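Before pooling, sensitivity and specificity are computed per study from each 2×2 table of test results against the culture reference. A sketch with simple normal-approximation (Wald) intervals, using invented counts for a single hypothetical study:

```python
import math

def sens_spec_ci(tp, fn, tn, fp, z=1.96):
    """Sensitivity = TP/(TP+FN) and specificity = TN/(TN+FP),
    each with a Wald 95% CI clipped to [0, 1]."""
    def prop_ci(x, n):
        p = x / n
        half = z * math.sqrt(p * (1.0 - p) / n)
        return p, (max(0.0, p - half), min(1.0, p + half))
    return prop_ci(tp, tp + fn), prop_ci(tn, tn + fp)

# Invented 2x2 counts, not the review's data.
(sens, sens_ci), (spec, spec_ci) = sens_spec_ci(tp=80, fn=20, tn=90, fp=10)
print(f"sensitivity {sens:.1%} ({sens_ci[0]:.1%}-{sens_ci[1]:.1%}), "
      f"specificity {spec:.1%} ({spec_ci[0]:.1%}-{spec_ci[1]:.1%})")
```

Exact or score intervals are preferable for small counts; the Wald form is shown only because it is the simplest to read.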

Conclusion

IGRAs in their current formulations have limited accuracy in diagnosing active TB in HIV-infected patients, and should not be used alone to rule out or rule in active TB cases in HIV-infected patients. Further modification is needed to improve their accuracy.

9.

Introduction

The utility of T-cell based interferon-gamma release assays for the diagnosis of latent tuberculosis infection remains unclear in settings with a high burden of tuberculosis.

Objectives

To determine risk factors associated with positive QuantiFERON-TB Gold In-Tube (QFT-GIT) and tuberculin skin test (TST) results and the level of agreement between the tests; to explore the hypotheses that positivity in QFT-GIT is more related to recent infection and less affected by HIV than the TST.

Methods

Adult household contacts of tuberculosis patients were invited to participate in a cross-sectional study across 24 communities in Zambia and South Africa. HIV, QFT-GIT and TST tests were done. A questionnaire was used to assess risk factors.

Results

A total of 2,220 contacts were seen. 1,803 individuals had interpretable results for both tests: 1,147 (63.6%) were QFT-GIT positive, while 725 (40.2%) were TST positive. Agreement between the tests was low (kappa = 0.24). QFT-GIT and TST results were associated with increasing age (adjusted OR [aOR] for each 10 year increase for QFT-GIT 1.15; 95% CI: 1.06–1.25, and for TST aOR: 1.10; 95% CI 1.01–1.20). HIV positivity was less common among those with positive results on QFT-GIT (aOR: 0.51; 95% CI: 0.39–0.67) and TST (aOR: 0.61; 95% CI: 0.46–0.82). Smear positivity of the index case was associated with QFT-GIT (aOR: 1.25; 95% CI: 0.90–1.74) and TST (aOR: 1.39; 95% CI: 0.98–1.98) results. We found little evidence in our data to support our hypotheses.
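The reported agreement (kappa = 0.24) is Cohen's kappa, which corrects raw observed agreement for the agreement expected by chance from the marginal positivity rates. A sketch from a 2×2 table; the cell counts below are invented, with only the marginals kept loosely consistent with the abstract:

```python
def cohens_kappa(both_pos, a_only, b_only, both_neg):
    """Cohen's kappa for two binary tests: (p_obs - p_exp) / (1 - p_exp),
    where p_exp is chance agreement from the marginal totals."""
    n = both_pos + a_only + b_only + both_neg
    p_obs = (both_pos + both_neg) / n
    p_exp = ((both_pos + a_only) * (both_pos + b_only)
             + (b_only + both_neg) * (a_only + both_neg)) / n**2
    return (p_obs - p_exp) / (1.0 - p_exp)

# Hypothetical counts: 1,147 QFT-GIT positive and 725 TST positive of 1,803.
print(f"kappa = {cohens_kappa(560, 587, 165, 491):.2f}")
```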

Conclusion

QFT-GIT may not be more sensitive than the TST to detect risk factors associated with tuberculous infection. We found little evidence to support the hypotheses that positivity in QFT-GIT is more related to recent infection and less affected by HIV than the TST.

10.

Objective

To identify factors associated with negative direct sputum examination among African and Cambodian patients co-infected by Mycobacterium tuberculosis and HIV.

Design

Prospective multicenter study (ANRS1260) conducted in Cambodia, Senegal and Central African Republic.

Methods

Univariate and multivariate analyses (logistic regression) were used to identify clinical and radiological features associated with negative direct sputum examination in HIV-infected patients with positive M. tuberculosis culture on Lowenstein-Jensen medium.

Results

Between September 2002 and December 2005, 175 co-infected patients were hospitalized with at least one respiratory symptom and pulmonary radiographic anomaly. Acid-fast bacillus (AFB) examination was positive in sputum samples from 110 subjects (63%) and negative in 65 patients (37%). Most patients were at an advanced stage of HIV disease (92% at stage III or IV of the WHO classification) with a median CD4 cell count of 36/mm3. In this context, we found that sputum AFB negativity was more frequent in co-infected subjects with associated respiratory tract infections (OR = 2.8 [95%CI:1.1–7.0]), dyspnea (OR = 2.5 [95%CI:1.1–5.6]), and localized interstitial opacities (OR = 3.1 [95%CI:1.3–7.6]), but was less frequent with CD4≤50/mm3 (OR = 0.4 [95%CI:0.2–0.90), adenopathies (OR = 0.4 [95%CI:0.2–0.93]) and cavitation (OR = 0.1 [95%CI:0.03–0.6]).

Conclusions

One novel finding of this study is the association between concomitant respiratory tract infection and negative sputum AFB, particularly in Cambodia. This finding suggests that AFB testing should be repeated in AFB-negative patients when broad-spectrum antibiotic treatment does not lead to complete recovery from respiratory symptoms. In HIV-infected patients with a CD4 cell count below 50/mm3 and no identified cause of pneumonia, systematic AFB direct sputum examination is justified because of atypical clinical features (without cavitation) and high pulmonary mycobacterial burden.

11.
Feng JY  Su WJ  Chiu YC  Huang SF  Lin YY  Huang RM  Lin CH  Hwang JJ  Lee JJ  Yu MC  Yu KW  Lee YC 《PloS one》2011,6(9):e23715

Background

Despite effective anti-TB treatments, tuberculosis remains a serious threat to public health and is associated with high mortality. Old age and multiple co-morbidities are known risk factors for death. The association of clinical presentations with mortality in pulmonary tuberculosis patients remains controversial.

Methods

This prospective observational study enrolled newly diagnosed, culture-proven pulmonary tuberculosis patients from five medical centers and one regional hospital, which were referral hospitals of TB patients. Radiographic findings and clinical symptoms were determined at the time of diagnosis. Patients who died for any reason during the course of anti-TB treatment were defined as mortality cases and death that occurred within 30 days of initiating treatment was defined as early mortality. Clinical factors associated with overall mortality and early mortality were investigated.

Results

A total of 992 patients were enrolled and 195 (19.7%) died. Nearly one-third (62/195, 31.8%) of the deaths occurred before or within 30 days of treatment initiation. Older age (RR = 1.04, 95%CI: 1.03–1.05), malignancy (RR = 2.42, 95%CI: 1.77–3.31), renal insufficiency (RR = 1.77, 95%CI: 1.12–2.80), presence of chronic cough (RR = 0.63, 95%CI: 0.47–0.84), fever (RR = 1.45, 95%CI: 1.09–1.94), and anorexia (RR = 1.49, 95%CI: 1.07–2.06) were independently associated with overall mortality. Kaplan-Meier survival analysis demonstrated significantly higher mortality in patients presenting with fever (p<0.001), anorexia (p = 0.005), and without chronic cough (p<0.001). Among patients who died, those with respiratory symptoms of chronic cough (RR = 0.56, 95%CI: 0.33–0.98) and dyspnea (HR = 0.51, 95%CI: 0.27–0.98) were less likely to experience early mortality. The radiological features were comparable between survivors and non-survivors.

Conclusions

In addition to demographic characteristics, clinical presentations, including the presence of fever, anorexia, and the absence of chronic cough, were independent predictors of on-treatment mortality in pulmonary tuberculosis patients.

12.

Purpose

Residing in deprived areas may increase risk of mortality beyond that explained by a person's own SES-related factors and lifestyle. The aim of this study was to examine the relation between neighborhood socioeconomic deprivation and all-cause, cancer- and cardiovascular disease (CVD)-specific mortality for men and women after accounting for education and other important person-level risk factors.

Methods

In the longitudinal NIH-AARP Study, we analyzed data from healthy participants, ages 50–71 years at study baseline (1995–1996). Deaths (n = 33831) were identified through December 2005. Information on census tracts was obtained from the 2000 US Census. Cox models estimated hazard ratios (HRs) and 95% confidence intervals (CIs) for quintiles of neighborhood deprivation.

Results

Participants in the highest quintile of deprivation had elevated risks for overall mortality (HRmen = 1.17, 95% CI: 1.10, 1.24; HRwomen = 1.13, 95% CI: 1.05, 1.22) and marginally increased risk for cancer deaths (HRmen = 1.09, 95% CI: 1.00, 1.20; HRwomen = 1.09, 95% CI: 0.99, 1.22). CVD mortality associations appeared stronger in men (HR = 1.33, 95% CI: 1.19, 1.49) than women (HR = 1.18, 95% CI: 1.01, 1.38). There was no evidence of an effect modification by education.

Conclusion

Higher neighborhood deprivation was associated with modest increases in all-cause, cancer- and CVD-mortality after accounting for many established risk factors.

13.

Objectives

To use electronic health records to assess the comorbidity burden of Autism Spectrum Disorder (ASD) in children and young adults.

Study Design

A retrospective prevalence study was performed using a distributed query system across three general hospitals and one pediatric hospital. Over 14,000 individuals under age 35 with ASD were characterized by their comorbidities and, conversely, the prevalence of ASD within these comorbidities was measured. The comorbidity prevalence of younger (age <18 years) and older (age 18–34 years) individuals with ASD was compared.

Results

19.44% of ASD patients had epilepsy as compared to 2.19% in the overall hospital population (95% confidence interval for difference in percentages 13.58–14.69%), 2.43% of ASD with schizophrenia vs. 0.24% in the hospital population (95% CI 1.89–2.39%), inflammatory bowel disease (IBD) 0.83% vs. 0.54% (95% CI 0.13–0.43%), bowel disorders (without IBD) 11.74% vs. 4.5% (95% CI 5.72–6.68%), CNS/cranial anomalies 12.45% vs. 1.19% (95% CI 9.41–10.38%), diabetes mellitus type I (DM1) 0.79% vs. 0.34% (95% CI 0.3–0.6%), muscular dystrophy 0.47% vs. 0.05% (95% CI 0.26–0.49%), sleep disorders 1.12% vs. 0.14% (95% CI 0.79–1.14%). Autoimmune disorders (excluding DM1 and IBD) were not significantly different at 0.67% vs. 0.68% (95% CI −0.14% to 0.13%). Three of the studied comorbidities increased significantly when comparing ages 0–17 vs. 18–34 with p<0.001: schizophrenia (1.43% vs. 8.76%), diabetes mellitus type I (0.67% vs. 2.08%), and IBD (0.68% vs. 1.99%), whereas sleep disorders, bowel disorders (without IBD) and epilepsy did not change significantly.

Conclusions

The comorbidities of ASD encompass disease states that are significantly overrepresented in ASD with respect to even the patient populations of tertiary health centers. This burden of comorbidities goes well beyond those routinely managed in developmental medicine centers and requires broad multidisciplinary management that payors and providers will have to plan for.

14.

Background

A common weakness of patient satisfaction surveys is a suboptimal participation rate. Some patients may be unable to participate, because of language barriers, physical limitations, or mental problems. As the role of these barriers is poorly understood, we aimed to identify patient characteristics that are associated with non-participation in a patient satisfaction survey.

Methodology

At the University Hospitals of Geneva, Switzerland, a patient satisfaction survey is regularly conducted among all adult patients hospitalized for >24 hours over a one-month period in the departments of internal medicine, geriatrics, surgery, neurosciences, psychiatry, and gynaecology-obstetrics. To assess the factors associated with non-participation in the patient satisfaction survey, a case-control study was conducted among patients selected for the 2005 survey. Cases (non-respondents, n = 195) and controls (respondents, n = 205) were randomly selected from the satisfaction survey, and information about potential barriers to participation was abstracted in a blinded fashion from the patients' medical and nursing charts.

Principal Findings

Non-participation in the satisfaction survey was independently associated with the presence of a language barrier (odds ratio [OR] 4.53, 95% confidence interval [CI95%]: 2.14–9.59), substance abuse (OR 3.75, CI95%: 1.97–7.14), cognitive limitations (OR 3.72, CI95%: 1.64–8.42), a psychiatric diagnosis (OR 1.99, CI95%: 1.23–3.23) and a sight deficiency (OR 2.07, CI95%: 0.98–4.36). The odds ratio for non-participation increased gradually with the number of predictors.

Conclusions

Five patient-level factors associated with non-participation in a mailed survey were identified. Gathering patient feedback through mailed surveys may lead to under-representation of some patient subgroups.

15.

Background

Numerous observational studies suggest that preventable adverse drug reactions are a significant burden in healthcare, but no meta-analysis using a standardised definition for adverse drug reactions exists. The aim of the study was to estimate the percentage of patients with preventable adverse drug reactions and the preventability of adverse drug reactions in adult outpatients and inpatients.

Methods

Studies were identified through searching Cochrane, CINAHL, EMBASE, IPA, Medline, PsycINFO and Web of Science in September 2010, and by hand searching the reference lists of identified papers. Original peer-reviewed research articles in English that defined adverse drug reactions according to WHO’s or similar definition and assessed preventability were included. Disease or treatment specific studies were excluded. Meta-analysis on the percentage of patients with preventable adverse drug reactions and the preventability of adverse drug reactions was conducted.

Results

Data were analysed from 16 original studies on outpatients with 48,797 emergency visits or hospital admissions and from 8 studies involving 24,128 inpatients. No studies in primary care were identified. Among adult outpatients, 2.0% (95% confidence interval (CI): 1.2–3.2%) had preventable adverse drug reactions and 52% (95% CI: 42–62%) of adverse drug reactions were preventable. Among inpatients, 1.6% (95% CI: 0.1–51%) had preventable adverse drug reactions and 45% (95% CI: 33–58%) of adverse drug reactions were preventable.

Conclusions

This meta-analysis corroborates that preventable adverse drug reactions are a significant burden to healthcare among adult outpatients. Among both outpatients and inpatients, approximately half of adverse drug reactions are preventable, demonstrating that further evidence on prevention strategies is required. The percentage of patients with preventable adverse drug reactions among inpatients and in primary care is largely unknown and should be investigated in future research.

16.

Background

Intimate Partner Violence (IPV) is a major public health problem with serious consequences. This study was conducted to assess the magnitude of IPV in a predominantly rural community in Southwest Ethiopia.

Methods

This community-based cross-sectional study was conducted in May 2009 in Southwest Ethiopia using the World Health Organization core questionnaire to measure violence against women. Trained data collectors interviewed 851 ever-married women. Stata version 10.1 and SPSS version 12.0.1 for Windows were used for data analysis.

Result

In this study, the lifetime prevalence of sexual or physical partner violence, or both, was 64.7% (95%CI: 61.4%–67.9%). Lifetime sexual violence [50.1% (95% CI: 46.7%–53.4%)] was considerably more prevalent than physical violence [41.1% (95% CI: 37.8%–44.5%)]. A sizable proportion [41.5% (95%CI: 38.2%–44.8%)] of women reported physical or sexual violence, or both, in the past year. Men who were controlling were more likely to be violent against their partner.

Conclusion

Physical and sexual violence is common among ever-married women in Southwest Ethiopia. Interventions targeting controlling men might help in reducing IPV. Further prospective longitudinal studies among ever-married women are important to identify predictors and to study the dynamics of violence over time.

17.

Background

Fever is common following infant vaccinations. Two randomized controlled trials demonstrated the efficacy of acetaminophen prophylaxis in preventing fever after whole cell pertussis vaccination, but acetaminophen prophylaxis has not been evaluated for prevention of fever following contemporary vaccines recommended for infants in the United States.

Methods

Children six weeks through nine months of age were randomized 1∶1 to receive up to five doses of acetaminophen (10–15 mg per kg) or placebo following routine vaccinations. The primary outcome was a rectal temperature ≥38°C within 32 hours following the vaccinations. Secondary outcomes included medical utilization, infant fussiness, and parents' time lost from work. Parents could request unblinding of the treatment assignment if the child developed fever or symptoms that would warrant supplementary acetaminophen treatment for children who had been receiving placebo.

Results

A temperature ≥38°C was recorded for 14% (25/176) of children randomized to acetaminophen compared with 22% (37/176) of those randomized to placebo but that difference was not statistically significant (relative risk [RR], 0.63; 95% CI, 0.40–1.01). Children randomized to acetaminophen were less likely to be reported as being much more fussy than usual (10% vs 24%) (RR, 0.42; 95% CI, 0.25–0.70) or to have the treatment assignment unblinded (3% vs 9%) (RR, 0.31; 95% CI, 0.11–0.83) than those randomized to placebo. In age-stratified analyses, among children ≥24 weeks of age, there was a significantly lower risk of temperature ≥38°C in the acetaminophen group (13% vs. 25%; p = 0.03).
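Relative risks like those above are ratios of event proportions between trial arms. A crude RR with a Katz log-scale CI can be sketched as follows; the counts are invented for illustration, and published trial estimates may additionally reflect modeling choices:

```python
import math

def risk_ratio(events_a, n_a, events_b, n_b, z=1.96):
    """Crude risk ratio between two arms with a Katz log-scale CI:
    SE(ln RR) = sqrt(1/a - 1/n_a + 1/b - 1/n_b)."""
    ratio = (events_a / n_a) / (events_b / n_b)
    se = math.sqrt(1.0 / events_a - 1.0 / n_a + 1.0 / events_b - 1.0 / n_b)
    return ratio, (math.exp(math.log(ratio) - z * se),
                   math.exp(math.log(ratio) + z * se))

# Invented example: 14/100 events in the treated arm vs. 22/100 on placebo.
rr, ci = risk_ratio(14, 100, 22, 100)
print(f"crude RR {rr:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```

A CI that crosses 1.0, as in the trial's primary outcome, is what "not statistically significant" means at the 5% level.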

Conclusion

The results of this relatively small trial suggest that acetaminophen may reduce the risk of post-vaccination fever and fussiness.

Trial registration

Clinicaltrials.gov NCT00325819

18.
Z Zhao  S Li  G Liu  F Yan  X Ma  Z Huang  H Tian 《PloS one》2012,7(7):e41641

Background and Objective

Emerging evidence from biological and epidemiological studies has suggested that body iron stores and heme-iron intake may be related to the risk of type 2 diabetes (T2D). We aimed to examine the association of body iron stores and heme-iron intake with T2D risk by conducting a systematic review and meta-analysis of previously published studies.

Research Design and Methods

A systematic review and subsequent meta-analysis were conducted by searching the MEDLINE database up to June 22, 2012 to identify studies that analyzed the association of body iron stores or dietary heme-iron intake with T2D risk. The meta-analysis was performed using the effect estimates and 95% confidence intervals (CIs) to calculate pooled risk estimates, while heterogeneity among studies was examined using the I2 and Q statistics.

Results

The meta-analysis included 16 high-quality studies: 12 studies analyzed ferritin levels (4,366 T2D patients and 41,091 controls) and 4 measured heme-iron intake (9,246 T2D patients and 179,689 controls). The combined relative risk (RR) comparing the highest and lowest category of ferritin levels was 1.66 (95% CI: 1.15–2.39) for prospective studies and 2.29 (95% CI: 1.48–3.54) for cross-sectional studies, with significant heterogeneity (Q = 14.84, p = 0.01, I2 = 66.3%; Q = 44.16, p<0.001, I2 = 88.7%). The combined RR comparing the highest and lowest category of heme-iron intake was 1.31 (95% CI: 1.21–1.43), without evidence of heterogeneity (Q = 1.39, p = 0.71, I2 = 0%). No publication bias was found. An additional 15 studies that were of good quality, had significant results, and analyzed the association between body iron stores and T2D risk were included qualitatively in the systematic review.

Conclusions

The meta-analysis and systematic review suggest that increased ferritin levels and heme-iron intake are both associated with a higher risk of T2D.

19.

Background

Patients who participate in clinical trials may experience better clinical outcomes than patients who initiate similar therapy within clinical care (trial effect), but no published studies have evaluated a trial effect in HIV clinical trials.

Methods

To examine a trial effect we compared virologic suppression (VS) among patients who initiated HAART in a clinical trial versus in routine clinical care. VS was defined as a plasma HIV RNA ≤400 copies/ml at six months after HAART initiation and was assessed within strata of early (1996–99) or current (2000–06) HAART periods. Risk ratios (RR) were estimated using binomial models.

Results

Of 738 persons initiating HAART, 30.6% were women, 61.7% were black, 30% initiated therapy in a clinical trial, and 67% (n = 496) had an evaluable six-month HIV RNA result. HAART regimens differed between the early and current periods (p<0.001): unboosted PI regimens (55.6%) were more common in the early period and NNRTI regimens (46.4%) in the current period. Overall, 78% (95% CI 74–82%) of patients achieved VS, and trial participants were 16% more likely to achieve VS (unadjusted RR 1.16; 95% CI 1.06–1.27). Comparing trial to non-trial participants, VS differed by study period: in the early period, trial participants initiating HAART were significantly more likely to achieve VS than non-trial participants (adjusted RR 1.33; 95% CI 1.15–1.54), but not in the current period (adjusted RR 0.98; 95% CI 0.87–1.11).
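An unadjusted risk ratio of this kind comes directly from a 2×2 table of suppression counts, with a Wald 95% CI computed on the log scale. A minimal sketch, using hypothetical counts (not the study's data) chosen so the ratio lands near the reported unadjusted RR of 1.16:

```python
import math

# Hypothetical 2x2 counts: virologic suppression (VS) at six months
# among trial vs. routine-care HAART initiators (illustration only).
vs_trial, n_trial = 130, 150      # suppressed / total, trial participants
vs_routine, n_routine = 258, 346  # suppressed / total, routine care

p1 = vs_trial / n_trial
p0 = vs_routine / n_routine
rr = p1 / p0  # risk ratio for achieving VS

# Wald standard error of log(RR), then back-transform the CI bounds.
se_log_rr = math.sqrt((1 - p1) / vs_trial + (1 - p0) / vs_routine)
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)

print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

Because the lower CI bound stays above 1.0, the hypothetical trial group is significantly more likely to achieve VS; the abstract's adjusted RRs additionally control for covariates via binomial regression models.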

Conclusions

A clear clinical trial effect on suppression of HIV replication was observed in the early HAART period but not in the current period.

20.

Background

Soil-transmitted helminth (STH) infections (i.e., Ascaris lumbricoides, hookworm, and Trichuris trichiura) affect more than a billion people. Preventive chemotherapy (i.e., repeated administration of anthelmintic drugs to at-risk populations), is the mainstay of control. This strategy, however, does not prevent reinfection. We performed a systematic review and meta-analysis to assess patterns and dynamics of STH reinfection after drug treatment.

Methodology

We systematically searched PubMed, ISI Web of Science, EMBASE, Cochrane Database of Systematic Reviews, China National Knowledge Infrastructure, WanFang Database, Chinese Scientific Journal Database, and Google Scholar. Information on study year, country, sample size, age of participants, diagnostic method, drug administration strategy, prevalence and intensity of infection pre- and posttreatment, cure and egg reduction rate, evaluation period posttreatment, and adherence was extracted. Pooled risk ratios from random-effects models were used to assess the risk of STH reinfection after treatment. Our protocol is available on PROSPERO, registration number: CRD42011001678.

Principal Findings

From 154 studies identified, 51 were included: 24 provided STH infection rates pre- and posttreatment, whereas 42 reported determinants of predisposition to reinfection. At 3, 6, and 12 months posttreatment, A. lumbricoides prevalence reached 26% (95% confidence interval (CI): 16–43%), 68% (95% CI: 60–76%), and 94% (95% CI: 88–100%) of pretreatment levels, respectively. For T. trichiura, the respective reinfection prevalences were 36% (95% CI: 28–47%), 67% (95% CI: 42–100%), and 82% (95% CI: 62–100%); for hookworm, 30% (95% CI: 26–34%), 55% (95% CI: 34–87%), and 57% (95% CI: 49–67%). Prevalence and intensity of reinfection were positively correlated with pretreatment infection status.

Conclusion

STH reinfections occur rapidly after treatment, particularly for A. lumbricoides and T. trichiura. Hence, there is a need for frequent anthelmintic drug administrations to maximize the benefit of preventive chemotherapy. Integrated control approaches emphasizing health education and environmental sanitation are needed to interrupt transmission of STH.
