Similar literature (20 records)
1.

Background

Clostridium difficile infection (CDI) has become a global epidemiological problem for both hospitalized patients and outpatients. The most commonly used drugs to treat CDI are metronidazole and vancomycin. The aim of this study was to compare the efficacy and safety of metronidazole monotherapy with vancomycin monotherapy and combination therapy in CDI patients.

Methods

A comprehensive search without publication status or other restrictions was conducted. Studies comparing metronidazole monotherapy with vancomycin monotherapy or combination therapy in patients with CDI were considered eligible. Meta-analysis was performed using the Mantel-Haenszel fixed-effects model, and odds ratios (ORs) with 95% confidence intervals (95% CIs) were calculated and reported.
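The Mantel-Haenszel fixed-effects pooling used in this review can be sketched in a few lines. The 2×2 counts below are illustrative only, not data from the included studies:

```python
def mantel_haenszel_or(tables):
    """Pool 2x2 tables into a single Mantel-Haenszel fixed-effects odds ratio.

    Each table is (a, b, c, d) = (events on drug A, non-events on drug A,
    events on drug B, non-events on drug B)."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den

# Two hypothetical studies (counts are illustrative, not from the review).
tables = [(40, 10, 30, 20), (55, 15, 45, 25)]
print(round(mantel_haenszel_or(tables), 2))  # -> 2.28
```

In practice each study's clinical-cure counts would form one table, and the confidence interval would come from the Robins-Breslow-Greenland variance; the sketch shows only the point estimate.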

Results

Of the 1910 records identified, seventeen studies from thirteen articles (n = 2501 patients) were included. No statistically significant difference in the rate of clinical cure was found between metronidazole and vancomycin for mild CDI (OR = 0.67, 95% CI (0.45, 1.00), p = 0.05) or between either monotherapy and combination therapy for CDI (OR = 1.07, 95% CI (0.58, 1.96), p = 0.83); however, the rate of clinical cure was lower for metronidazole than for vancomycin for severe CDI (OR = 0.46, 95% CI (0.26, 0.80), p = 0.006). No statistically significant difference in the rate of CDI recurrence was found between metronidazole and vancomycin for mild CDI (OR = 0.99, 95% CI (0.40, 2.45), p = 0.98) or severe CDI (OR = 0.98, 95% CI (0.63, 1.53), p = 0.94) or between either monotherapy and combination therapy for CDI (OR = 0.91, 95% CI (0.66, 1.26), p = 0.56). In addition, there was no significant difference in the rate of adverse events (AEs) between metronidazole and vancomycin (OR = 1.18, 95% CI (0.80, 1.74), p = 0.41). In contrast, the rate of AEs was significantly lower for either monotherapy than for combination therapy (OR = 0.30, 95% CI (0.17, 0.51), p<0.0001).

Conclusions

Metronidazole and vancomycin are equally effective for the treatment of mild CDI, but vancomycin is superior for the treatment of severe CDI. Combination therapy is not superior to monotherapy and appears to be associated with an increased rate of AEs.

2.

Background

Studies have demonstrated seasonal variability in rates of Clostridium difficile infection (CDI). Synthesising all available information on seasonality is a necessary step in identifying large-scale epidemiological patterns and elucidating underlying causes.

Methods

Three medical and life sciences publication databases were searched from inception to October 2014 for longitudinal epidemiological studies written in English, Spanish or Portuguese that reported the incidence of CDI. The monthly frequencies of CDI were extracted, standardized and weighted according to the number of follow-up months. Cross-correlation coefficients (XCORR) were calculated to examine the correlation and lag between the year-month frequencies of reported CDI across hemispheres and continents.
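The lag-finding step can be illustrated with a normalized cross-correlation scanned over candidate lags. The two monthly series below are synthetic stand-ins, not the study data:

```python
import math

def xcorr_best_lag(x, y, max_lag):
    """Return (lag, r) maximizing the Pearson correlation between x and
    y shifted by `lag` months, scanning lags from -max_lag to +max_lag."""
    best = (0, -2.0)
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = x[lag:], y[:len(y) - lag]
        else:
            a, b = x[:lag], y[-lag:]
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
        sa = math.sqrt(sum((u - ma) ** 2 for u in a))
        sb = math.sqrt(sum((v - mb) ** 2 for v in b))
        r = cov / (sa * sb)
        if r > best[1]:
            best = (lag, r)
    return best

# Synthetic monthly series: a seasonal peak in one hemisphere, shifted ~6 months.
north = [math.cos(2 * math.pi * m / 12) for m in range(48)]
south = [math.cos(2 * math.pi * (m - 6) / 12) for m in range(48)]
lag, r = xcorr_best_lag(north, south, 12)
print(lag, round(r, 2))
```

With real surveillance counts the series would first be standardized and weighted by follow-up months, as the abstract describes.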

Results

The search identified 13, 5 and 2 studies from North America, Europe, and Oceania, respectively that met the inclusion criteria. CDI had a similar seasonal pattern in the Northern and Southern Hemisphere characterized by a peak in spring and lower frequencies of CDI in summer/autumn with a lag of 8 months (XCORR = 0.60) between hemispheres. There was no difference between the seasonal patterns across European and North American countries.

Conclusion

CDI demonstrates a distinct seasonal pattern that is consistent across North America, Europe and Oceania. Further studies are required to identify the driving factors of the observed seasonality.

3.

Background

Research is needed to identify early life risk factors associated with different developmental paths leading to overweight by adolescence.

Objectives

To model heterogeneity in overweight development during middle childhood and identify factors associated with differing overweight trajectories.

Methods

Data were drawn from the Quebec Longitudinal Study of Child Development (QLSCD; 1998-2010). Trained research assistants measured height and weight according to a standardized protocol and conducted yearly home interviews with the child’s caregiver (the mother in 98% of cases). Information on several putative early life risk factors for the development of overweight was obtained, including factors related to the child’s perinatal, early behavioral, family and social environment. Group-based trajectories of the probability of overweight (6-12 years) were identified with a semiparametric method (n=1678). Logistic regression analyses were used to identify early risk factors (5 months to 5 years) associated with each trajectory.

Results

Three trajectories of overweight were identified: “early-onset overweight” (11.0%), “late-onset overweight” (16.6%) and “never overweight” (72.5%). Multinomial analyses indicated that children in the early- and late-onset groups, compared with the never overweight group, shared three common risk factors: parental overweight, preschool overweight history, and large size for gestational age. Maternal overprotection (OR=1.12, CI: 1.01-1.25), short nighttime sleep duration (OR=1.66, CI: 1.07-2.57), and immigrant status (OR=2.01, CI: 1.05-3.84) were factors specific to the early-onset group. Finally, family food insufficiency (OR=1.81, CI: 1.00-3.28) was weakly associated with membership in the late-onset trajectory group.

Conclusions

The development of overweight in childhood follows two different trajectories, which have common and distinct risk factors that could be the target of early preventive interventions.

4.

Background

There are limited data examining healthcare resource utilization in patients with recurrent Clostridium difficile infection (CDI).

Methods

Patients with CDI at a tertiary-care hospital in Houston, TX, were prospectively enrolled into an observational cohort study. Recurrence was assessed via follow-up phone calls. Patients with one or more recurrence were included in this study. The location at which healthcare was obtained by patients with recurrent CDI was identified along with hospital length of stay. CDI-attributable readmissions, defined as a positive toxin test within 48 hours of admission and a primary CDI diagnosis, were also assessed.

Results

372 primary cases of CDI were identified, of whom 64 (17.2%) experienced at least one CDI recurrence. Twelve of the 64 patients experienced 18 further episodes of CDI recurrence. Of these 64 patients, 33 (50.8%) were readmitted, of whom 6 (18.2%) required ICU care; 29 (45.3%) received outpatient care only; and 2 (3.1%) had an ED visit. Nineteen (55.9%) readmissions were defined as CDI-attributable. For patients with a CDI-attributable readmission, the average length of stay was 6 ± 6 days.

Conclusion

Recurrent CDI leads to significant healthcare resource utilization. Methods of reducing the burden of recurrent CDI should be further studied.

5.

Background

Modifiable lifestyle risk behaviours such as smoking, unhealthy diet, physical inactivity and alcohol misuse are the leading causes of major, non-communicable diseases worldwide. It is increasingly being recognised that interventions which target more than one risk behaviour may be an effective and efficient way of improving people’s lifestyles. To date, there has been no attempt to summarise the global evidence base for interventions targeting multiple risk behaviours.

Objective

To identify and map the characteristics of studies evaluating multiple risk behaviour change interventions targeted at adult populations in any country.

Methods

Seven bibliographic databases were searched for publications between January 1990 and January/May 2013. Authors of protocols, conference abstracts, and other relevant articles were contacted. Study characteristics were extracted and entered into Eppi-Reviewer 4.

Results

In total, 220 studies were included in the scoping review. Most were randomised controlled trials (62%) conducted in the United States (49%), and targeted diet and physical activity (56%) in people from general populations (14%) or subgroups of general populations (45%). Very few studies had been conducted in the Middle East (2%), Africa (0.5%), or South America (0.5%). There was also a scarcity of studies conducted among young adults (1%), or racial and minority ethnic populations (4%) worldwide.

Conclusions

Research is required to investigate the interrelationships of lifestyle risk behaviours in varying cultural contexts around the world. Cross-cultural development and evaluation of multiple risk behaviour change interventions is also needed, particularly in populations of young adults and racial and minority ethnic populations.

6.

Background

Norms clarification has been identified as an effective component of college student drinking interventions, prompting research on norms clarification as a single-component intervention known as Personalized Normative Feedback (PNF). Previous reviews have examined PNF in combination with other components but not as a stand-alone intervention.

Objectives

To investigate the degree to which computer-delivered stand-alone personalized normative feedback interventions reduce alcohol consumption and alcohol-related harms among college students and to compare gender-neutral and gender-specific PNF.

Data Sources

Electronic databases were searched systematically through November 2014. Reference lists were reviewed manually and forward and backward searches were conducted.

Selection Criteria

Outcome studies that compared computer-delivered, stand-alone PNF intervention with an assessment only, attention-matched, or active treatment control and reported alcohol use and harms among college students.

Methods

Between-group effect sizes were calculated as the standardized mean difference in change scores between treatment and control groups divided by pooled standard deviation. Within-group effect sizes were calculated as the raw mean difference between baseline and follow-up divided by pooled within-groups standard deviation.
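These effect-size definitions translate directly into code. The change scores, SDs, and group sizes below are hypothetical, chosen only to show the arithmetic:

```python
import math

def pooled_sd(sd1, n1, sd2, n2):
    """Pooled standard deviation of two independent groups."""
    return math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2))

def d_between(change_t, sd_t, n_t, change_c, sd_c, n_c):
    """Standardized mean difference in change scores, treatment vs control."""
    return (change_t - change_c) / pooled_sd(sd_t, n_t, sd_c, n_c)

# Hypothetical arms: treatment drops 3 drinks/week, control drops 1
# (both SD 7, n = 100 per arm).
print(round(d_between(-3.0, 7.0, 100, -1.0, 7.0, 100), 3))  # -> -0.286
```

The sign convention here is raw (treatment minus control); a review reporting "greater reductions" as positive d would simply flip it.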

Results

Eight studies (13 interventions) with a total of 2,050 participants were included. Compared to control participants, students who received gender-neutral (d between = 0.291, 95% CI [0.159, 0.423]) and gender-specific PNF (d between = 0.284, 95% CI [0.117, 0.451]) reported greater reductions in drinking from baseline to follow-up. Students who received gender-neutral PNF reported 3.027 (95% CI [2.171, 3.882]) fewer drinks per week at first follow-up and gender-specific PNF reported 3.089 (95% CI [0.992, 5.186]) fewer drinks. Intervention effects were small for harms (d between = 0.157, 95% CI [0.037, 0.278]).

Conclusions

Computer-delivered PNF is an effective stand-alone approach for reducing college student drinking and has a small impact on alcohol-related harms. Effects are small but clinically relevant when considered from a public health perspective. Additional research is needed to examine computer-delivered, stand-alone PNF as a population-level prevention program.

7.

Background

Measuring effectiveness of HIV prevention interventions is challenged by bias when using self-reported knowledge, attitude or behavior change. HIV incidence is an objective marker to measure effectiveness of HIV prevention interventions, however, because new infection rates are relatively low, prevention studies require large sample sizes. Herpes simplex virus type 2 (HSV-2) is similarly transmitted and more prevalent and could thus serve as a proxy marker for sexual risk behavior and therefore HIV infection.

Methods

HSV-2 antibodies were assessed in a sub-study of 70,000 students participating in an education intervention in Western Province, Kenya. The feasibility of testing for HSV-2 antibodies was assessed by comparing two methods using Fisher’s exact test. Three hundred and ninety-four students (aged 18 to 22 years) were randomly chosen from the cohort and tested for HIV, Chlamydia trachomatis, Neisseria gonorrhoeae, and Trichomonas vaginalis. Of these, 139 students were tested for HSV-2 with ELISA and surveyed for sexual risk behavior, and 89 students were additionally tested for HSV-2 with a point-of-care (POC) test.

Results

Prevalence rates were 0.5%, 1.8%, 0.3% and 2.3% for HIV, Chlamydia trachomatis, Neisseria gonorrhoeae, and Trichomonas vaginalis, respectively. The prevalence of HSV-2 antibodies was 3.4% as measured by the POC test (n=89) and 14.4% by ELISA (n=139). The specificity of the POC test compared with ELISA was 100%, but its sensitivity was only 23.1%. Associations between self-reported sexual behavior and HSV-2 serostatus could not be shown.
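The reported sensitivity and specificity are consistent with a simple 2×2 reconstruction; the counts below are hypothetical, chosen only to match the published percentages:

```python
# Hypothetical 2x2 counts consistent with the reported figures (n = 89):
# 13 students ELISA-positive, of whom 3 were also POC-positive; all 76
# ELISA-negative students were POC-negative.
tp, fn = 3, 10   # POC+ / POC- among the ELISA-positive
tn, fp = 76, 0   # POC- / POC+ among the ELISA-negative

sensitivity = tp / (tp + fn)   # 3/13
specificity = tn / (tn + fp)   # 76/76
print(f"{sensitivity:.1%}, {specificity:.1%}")  # -> 23.1%, 100.0%
```

Treating ELISA as the reference standard, sensitivity is TP/(TP+FN) and specificity is TN/(TN+FP); the zero false-positive count is what yields the 100% specificity.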

Conclusions

Associations between self-reported sexual risk behavior and HSV-2 serostatus could not be shown, probably owing to social desirability bias in interviews, since HSV-2 transmission is clearly linked to sexual behavior. HSV-2 antibody testing is feasible in resource-poor settings and shows higher prevalence rates than other sexually transmitted diseases, thus representing a potential biomarker for evaluating HIV prevention interventions.

8.

Objective

Allergic conjunctivitis (AC) is a common condition, especially in childhood. The extent to which it occurs concurrently with or independently from allergic rhinitis (AR) has not been well described.

Aim

To examine the inter-relationship between rhinitis and conjunctivitis and the epidemiological risk factors for these conditions in a rural UK population.

Methods

Cross-sectional study of rural school children (aged 5–11 years). Parental questionnaires were used to diagnose allergic outcomes (including conjunctivitis, rhinitis and rhinoconjunctivitis) and to collect data on atopic history, demographics and environmental exposures. Odds ratios of each allergic outcome by exposure were examined, adjusted for age, sex, breastfeeding, family history of allergy, and number of older and younger siblings.

Results

Prevalence of conjunctivitis was 17.5%, rhinitis 15.1% and rhinoconjunctivitis 13.0%. Seasonality of symptoms varied by condition: 64.7% of those with conjunctivitis had seasonal symptoms (April-Sept only), 46.7% of those with rhinitis and 92.2% of those with rhinoconjunctivitis. Living on a farm consistently reduced the risk of conjunctivitis (odds ratio 0.47, 95%CI 0.29–0.79, p = 0.004), rhinitis (OR 0.57, 95%CI 0.33–1.01, p = 0.05) and rhinoconjunctivitis (OR 0.57, 95%CI 0.32–1.03, p = 0.06). Exposure to farm animals (particularly in early life), current consumption of unpasteurised milk and playing in a barn or stable significantly reduced the risk of all three conditions.

Conclusion

More children had parent-reported conjunctivitis than rhinitis. The majority of children with either condition also reported symptoms of the other condition. Farmers’ children have fewer eye and/or nasal symptoms. A number of farming variables linked with the farm microbial environment are likely to be mediating the protective effect.

9.

Background

Mainland Tanzania scaled up multiple malaria control interventions between 1999 and 2010. We evaluated whether, and to what extent, reductions in all-cause under-five child mortality (U5CM) tracked with malaria control intensification during this period.

Methods

Four nationally representative household surveys permitted trend analysis for malaria intervention coverage, severe anemia (hemoglobin <8 g/dL) prevalence (SAP) among children 6–59 months, and U5CM rates stratified by background characteristics, age, and malaria endemicity. Prevalence of contextual factors (e.g., vaccination, nutrition) likely to influence U5CM were also assessed. Population attributable risk percentage (PAR%) estimates for malaria interventions and contextual factors that changed over time were used to estimate magnitude of impact on U5CM.
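The PAR% estimates rest on Levin's attributable-risk formula. A minimal sketch, with hypothetical ITN coverage and risk figures (not the survey values):

```python
def par_percent(prevalence, rr):
    """Population attributable risk percent (Levin's formula) for an
    exposure with the given prevalence and relative risk."""
    x = prevalence * (rr - 1)
    return 100 * x / (1 + x)

# Illustrative only: if 36% of children lack an ITN and non-use doubles
# mortality risk, ~26% of deaths would be attributable to non-use.
print(round(par_percent(0.36, 2.0), 1))  # -> 26.5
```

In the evaluation itself, exposure prevalences and relative risks would come from the household surveys and the literature, and a separate PAR% would be computed for each intervention and contextual factor.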

Results

Household ownership of insecticide-treated nets (ITNs) rose from near zero in 1999 to 64% (95% CI, 61.7–65.2) in 2010. Intermittent preventive treatment of malaria in pregnancy reached 26% (95% CI, 23.6–28.0) by 2010. Sulfadoxine-pyrimethamine replaced chloroquine in 2002, and artemisinin-based combination therapy (ACT) was introduced in 2007. SAP among children 6–59 months declined 50% between 2005 (11.1%; 95% CI, 10.0–12.3%) and 2010 (5.5%; 95% CI, 4.7–6.4%), and U5CM declined by 45% between baseline (1995–9) and endpoint (2005–9), from 148 to 81 deaths/1000 live births. Mortality declined 55% among children 1–23 months of age in higher malaria endemicity areas. A large reduction in U5CM was attributable to ITNs (PAR% = 11), with other malaria interventions adding further gains. Multiple contextual factors also contributed to survival gains.

Conclusion

Marked declines in U5CM occurred in Tanzania between 1999 and 2010 with high impact from ITNs and ACTs. High-risk children (1–24 months of age in high malaria endemicity) experienced the greatest declines in mortality and SAP. Malaria control should remain a policy priority to sustain and further accelerate progress in child survival.

10.

Background

Breast cancer survivors have an increased risk of bone fracture, but the risk among young patients receiving adjuvant therapies remains unknown. This population-based study aimed to assess the incidence and risk of fracture among young (aged 20 to 39 years) breast cancer patients who received adjuvant therapies.

Methods

From January 2001 to December 2007, 5,146 newly diagnosed breast cancer patients were enrolled from the National Health Insurance Research Database (NHIRD) in Taiwan. Patients were observed for a maximum of 6 years to determine the incidence of new-onset fracture. Kaplan-Meier and Cox regression analyses were used to evaluate the risk of fracture in young breast cancer patients who received adjuvant treatments.
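The Kaplan-Meier estimator used in such analyses can be sketched in plain Python. The times and event indicators below are toy values, not NHIRD records:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve.  `times` are follow-up times and
    `events` are 1 for an observed fracture, 0 for censoring.  Returns
    (time, survival) steps at each distinct event time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)      # events at time t
        c = sum(1 for tt, _ in data if tt == t)      # subjects leaving at t
        if d:
            s *= (n_at_risk - d) / n_at_risk
            curve.append((t, s))
        n_at_risk -= c
        i += c
    return curve

# Toy data: five patients, fractures at times 1, 2 and 3, censoring at 2 and 4.
print(kaplan_meier([1, 2, 2, 3, 4], [1, 0, 1, 1, 0]))
```

At each event time the survival estimate is multiplied by (at-risk − events)/at-risk; censored subjects leave the risk set without contributing an event. Cox regression then models covariate effects on the hazard underlying this curve.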

Results

Among the 5,146 young (aged 20 to 39 years) breast cancer patients, Cox multivariate proportional hazards analysis showed that aromatase inhibitors (AIs), radiotherapy, and monoclonal antibodies were significantly associated with a high risk of fracture. Moreover, patients who received AIs for more than 180 days had a hazard ratio (HR) of 1.77 (95% CI = 0.68–4.57), and patients who received more than four radiotherapy visits had an HR of 2.54 (95% CI = 1.07–6.06). In the site-specific analysis, young breast cancer patients who received AIs had the highest risk of hip fracture (HR = 8.520, 95% CI = 1.711–42.432, p < 0.04), whereas patients who received radiotherapy had the highest risk of vertebral fracture (HR = 5.512, 95% CI = 1.847–16.451, p < 0.01).

Conclusion

Young breast cancer patients receiving AIs, radiotherapy or monoclonal antibodies should be monitored carefully to prevent fracture events. We suggest that breast cancer treatment plans incorporate fracture prevention interventions.

11.

Background

People with severe mental illnesses die early from cardiovascular disease. Evidence is lacking regarding effective primary care based interventions to tackle this problem.

Aim

To identify current procedures for, barriers to, and facilitators of the delivery of primary care based interventions for lowering cardiovascular risk for people with severe mental illnesses.

Method

75 GPs, practice nurses, service users, community mental health staff and carers in UK GP practice or community mental health settings were interviewed in 14 focus groups which were audio-recorded, transcribed and analysed using Framework Analysis.

Results

Five barriers to delivering primary care based interventions for lowering cardiovascular risk in people with severe mental illnesses were identified by the groups: negative perceptions of people with severe mental illnesses amongst some health professionals, difficulties accessing GP and community-based services, difficulties in managing a healthy lifestyle, not attending appointments, and a lack of awareness of increased cardiovascular risk in people with severe mental illnesses by some health professionals. Identified facilitators included involving supportive others, improving patient engagement with services, continuity of care, providing positive feedback in consultations and goal setting.

Conclusion

We identified a range of factors which can be incorporated into the design, delivery and evaluation of services to reduce cardiovascular risk for people with severe mental illnesses in primary care. The next step is determining the clinical and cost effectiveness of primary care based interventions for lowering cardiovascular risk in people with severe mental illnesses, and evaluating the most important components of such interventions.

12.

Background

Given the prevalence of non-valvular atrial fibrillation in the geriatric population, thromboembolic prevention by means of vitamin K antagonists (VKA) is one of the most frequent daily concerns of practitioners. The effectiveness and safety of treatment with VKA correlate directly with maximizing the time in therapeutic range, with an International Normalized Ratio (INR) of 2.0-3.0. The older population concentrates many of the factors known to influence the INR, particularly concomitant medications and concurrent medical conditions, also defined as comorbidities.

Objective

To determine whether a high burden of comorbidities, defined by a Charlson Comorbidity Index (CCI) of 3 or greater, is associated with lower-quality INR control.

Study-Design

Cross-sectional study.

Settings

French geriatric care units nationwide.

Participants

2164 patients aged 80 and over and treated with vitamin K antagonists.

Measurements

Comorbidities were assessed using the Charlson Comorbidity Index (CCI). The recorded data included age, sex, falls, kidney failure, hemorrhagic event, VKA treatment duration, and the number and type of concomitant medications. Quality of INR control, defined as time in therapeutic range (TTR), was assessed using the Rosendaal method.
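The Rosendaal method linearly interpolates the INR between consecutive measurements and counts the interpolated person-time in range. A day-level approximation, using a hypothetical monitoring record:

```python
def rosendaal_ttr(days, inrs, low=2.0, high=3.0):
    """Time in therapeutic range by the Rosendaal method: linearly
    interpolate the INR between consecutive measurements and count the
    fraction of days spent between `low` and `high` (day-level grid)."""
    in_range = total = 0
    for (d0, i0), (d1, i1) in zip(zip(days, inrs), zip(days[1:], inrs[1:])):
        span = d1 - d0
        total += span
        for step in range(span):
            inr = i0 + (i1 - i0) * step / span
            if low <= inr <= high:
                in_range += 1
    return in_range / total

# Hypothetical record: INR 1.5 on day 0, 2.5 on day 10, 3.5 on day 20.
print(rosendaal_ttr([0, 10, 20], [1.5, 2.5, 3.5]))  # -> 0.55
```

The exact Rosendaal computation finds the interpolated crossing points of the 2.0 and 3.0 thresholds analytically; sampling once per day, as here, approximates that to within a day per interval.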

Results

487 patients were identified in the low-quality INR control group. On multivariate logistic regression analysis, low-quality INR control was independently associated with a CCI ≥3 (OR = 1.487; 95% CI [1.15; 1.91]). The other variables associated with low-quality INR control were hemorrhagic events (OR = 3.151; 95% CI [1.64; 6.07]) and hospitalization (OR = 1.614; 95% CI [1.21; 2.14]).

Conclusion

An elevated CCI score (≥3) was associated with low-quality control of INR in elderly patients treated with VKA. Further research is needed to corroborate this finding.

13.

Introduction

Alcohol use is one of the leading modifiable morbidity and mortality risk factors among young adults.

Study Design

Two-arm, parallel-group randomized controlled trial with follow-up at 1 and 6 months.

Setting/Participants

Internet-based study in a general population sample of young men with low-risk drinking, recruited between June 2012 and February 2013.

Intervention

Internet-based brief alcohol primary prevention intervention (IBI). The IBI aims to prevent an increase in alcohol use: it consists of normative feedback, feedback on consequences, the caloric value of alcohol, computed blood alcohol concentration, and an indication that the reported alcohol use is associated with no or limited health risks. Intervention group participants received the IBI; control group (CG) participants completed only an assessment.

Main Outcome Measures

Alcohol use (number of drinks per week), binge drinking prevalence. Analyses were conducted in 2014–2015.

Results

Of 4365 men invited to participate, 1633 did so; 896 reported low-risk drinking and were randomized (IBI: n = 451; CG: n = 445). At baseline, 1 and 6 months, the mean (SD) number of drinks/week was 2.4(2.2), 2.3(2.6), 2.5(3.0) for IBI, and 2.4(2.3), 2.8(3.7), 2.7(3.9) for CG. Binge drinking, absent at baseline, was reported by 14.4% (IBI) and 19.0% (CG) at 1 month and by 13.3% (IBI) and 13.0% (CG) at 6 months. At 1 month, beneficial intervention effects were observed on the number of drinks/week (p = 0.05). No significant differences were observed at 6 months.

Conclusion

We found short-term protective effects of a primary prevention IBI.

Trial Registration

Controlled-Trials.com ISRCTN55991918

14.

Objectives

Cigarette smoking has been shown to be related to inflammatory bowel disease. We investigated whether smoking affected the probability of developing Clostridium difficile infection (CDI).

Methods

We conducted a longitudinal study of 16,781 older individuals from the nationally representative Health and Retirement Study. Data were linked to files from the Centers for Medicare and Medicaid Services.

Results

Overall, the rate of CDI in older individuals was 220.6 per 100,000 person-years (95% CI 193.3, 248.0). Rates of CDI were 281.6/100,000 person-years in current smokers, 229.0/100,000 in former smokers and 189.1/100,000 person-years in never smokers. The odds of CDI were 33% greater in former smokers (95% CI: 8%, 65%) and 80% greater in current smokers (95% CI: 33%, 145%) when compared to never smokers. When the number of CDI-related visits was evaluated, current smokers had a 75% increased rate of CDI compared to never smokers (95% CI: 15%, 167%).

Conclusions

Smoking is associated with developing a Clostridium difficile infection. Current smokers have the highest risk, followed by former smokers, when compared to rates of infection in never smokers.

15.

Background

There is evidence from 2 previous meta-analyses that interventions to promote poison prevention behaviours are effective in increasing a range of poison prevention practices in households with children. The published meta-analyses compared any intervention against a “usual care or no intervention” which potentially limits the usefulness of the analysis to decision makers. We aim to use network meta-analysis to simultaneously evaluate the effectiveness of different interventions to increase prevalence of safe storage of i) Medicines only, ii) Other household products only, iii) Poisons (both medicines and non-medicines), iv) Poisonous plants; and v) Possession of poison control centre (PCC) telephone number in households with children.

Methods

Data on the effectiveness of poison prevention interventions were extracted from primary studies identified in 2 newly-undertaken systematic reviews. Effect estimates were pooled across studies using a random effects network meta-analysis model.
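The random-effects pooling that underlies each pairwise comparison in such a network can be sketched with the DerSimonian-Laird estimator. The study-level odds ratios and log-OR variances below are illustrative, not data from the reviews:

```python
import math

def dersimonian_laird(log_ors, variances):
    """Random-effects pooled odds ratio via the DerSimonian-Laird tau^2
    estimator (the pairwise building block of a network meta-analysis)."""
    w = [1 / v for v in variances]
    fixed = sum(wi * y for wi, y in zip(w, log_ors)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_ors))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(log_ors) - 1)) / c)   # between-study variance
    w_re = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * y for wi, y in zip(w_re, log_ors)) / sum(w_re)
    return math.exp(pooled)  # back on the odds-ratio scale

# Illustrative study-level odds ratios and variances of their logs.
ors = [2.0, 4.0, 1.2]
variances = [0.10, 0.20, 0.15]
print(round(dersimonian_laird([math.log(o) for o in ors], variances), 2))  # -> 2.06
```

A full network model additionally links indirect comparisons through common comparators (here, usual care) under a consistency assumption, which is why the paper reports credible rather than confidence intervals from a Bayesian fit.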

Results

28 of the 47 primary studies identified were included in the analysis. Compared with usual care, the intervention with education and low-cost/free equipment elements was most effective in promoting safe storage of medicines (odds ratio 2.51, 95% credible interval 1.01 to 6.00), while interventions with education, low-cost/free equipment, home safety inspection and fitting components were most effective in promoting safe storage of other household products (2.52, 1.12 to 7.13), safe storage of poisons (11.10, 1.60 to 141.50) and possession of the PCC number (38.82, 2.19 to 687.10). No one intervention package was more effective than the others in promoting safe storage of poisonous plants.

Conclusion

The most effective interventions varied by poison prevention practice, but education alone was not the most effective intervention for any poison prevention practice. Commissioners and providers of poison prevention interventions should tailor the interventions they commission or provide to the poison prevention practices they wish to promote.

Highlights

  • Network meta-analysis is useful for comparing multiple injury-prevention interventions.
  • More intensive poison prevention interventions were more effective than education alone.
  • Education and low cost/free equipment was most effective in promoting safe storage of medicines.
  • Education, low cost/free equipment, home safety inspection and fitting was most effective in promoting safe storage of household products and poisons.
  • Education, low cost/free equipment and home inspection were most effective in promoting possession of a poison control centre number.
  • None of the intervention packages was more effective than the others in promoting safe storage of poisonous plants.

16.

Background

Men who have sex with men (MSM) are marginalized, hidden, underserved and at high risk for HIV in Nepal. We examined the association between MSM sub-populations, psychosocial health problems and support, access to prevention and non-use of condoms.

Methods

Between September and November 2010, a cross-sectional survey on HIV-related risk behavior was performed across Nepal through snowball sampling facilitated by non-governmental organizations, recruiting 339 MSM aged 15 or older. The primary outcomes were: (a) non-use of condoms at least once in the last three anal sex encounters with men and (b) non-use of condoms with women in the last encounter. The secondary outcome was participation in HIV prevention interventions in the past year.

Results

Among the 339 MSM interviewed, 78% did not use condoms at their last anal sex with another man, 35% did not use condoms in their last sex with a woman, 70% had experienced violence in the last 12 months, 61% were experiencing depression and 47% had thought of committing suicide. After adjustment for age, religion, marital status, and MSM subpopulations (bisexual, ta, meti, gay), non-use of condoms at last anal sex with a man was significantly associated with non-participation in HIV interventions, experience of physical and sexual violence, depression, repeated suicidal thoughts, small social support network and being dissatisfied with social support. Depression was marginally associated with non-use of condoms with women. The findings suggest that among MSM who reported non-use of condoms at last anal sex, the ta subgroup and those lacking family acceptance were the least likely to have participated in any preventive interventions.

Conclusions

MSM in Nepal have a high prevalence of psychosocial health problems, which are in turn associated with high-risk behavior for HIV. Future HIV prevention efforts targeting MSM in Nepal should cover all MSM subpopulations and prioritize psychosocial health interventions.

17.

Objective

To evaluate the efficacy of the program Keep Moving toward Healthy Heart and Healthy Brain (KM2H2) in encouraging physical activities for the prevention of heart attack and stroke among hypertensive patients enrolled in the Community-Based Hypertension Control Program (CBHCP).

Design

Cluster randomized controlled trial with three waves of longitudinal assessments at baseline, 3 and 6 months post intervention.

Setting

Community-based, patient-centered behavioral self-care intervention in urban settings in China.

Participants

A total of 450 participants diagnosed with hypertension from 12 community health centers in Wuhan, China were recruited, and were randomly assigned by center to receive either KM2H2 plus standard CBHCP care (6 centers and 232 patients) or the standard care only (6 centers and 218 patients).

Intervention

KM2H2 is a behavioral intervention guided by the Transtheoretical Model, the Model of Personalized Medicine and Social Capital Theory. It consists of six intervention sessions and two booster sessions engineered in a progressive manner. The purpose is to motivate and maintain physical activities for the prevention of heart attack and stroke.

Outcome Measures

Heart attack and stroke (clinically diagnosed; primary outcome), blood pressure (measured; secondary outcome), and physical activity (self-reported; tertiary outcome) were assessed at the individual level at baseline and at 3 and 6 months post-intervention.

Results

Relative to standard care, receiving KM2H2 was associated with significant reductions in the incidence of heart attack (3.60% vs. 7.03%, p < 0.05) and stroke (5.11% vs. 9.90%, p < 0.05) and a moderate reduction in blood pressure (−3.72 mmHg systolic and −2.92 mmHg diastolic) at 6 months post-intervention, as well as significant increases in physical activity at 3 months (d = 0.53, 95% CI: 0.21, 0.85) and 6 months (d = 0.45, 95% CI: 0.04, 0.85) post-intervention.

Conclusion

The program KM2H2 is efficacious in reducing the risk of heart attack and stroke among older patients on anti-hypertensive medication. The findings of this study provide solid data supporting a formal phase-III trial to establish the effectiveness of KM2H2 for preventive use in community settings.

Trial Registration

ISRCTN Register: ISRCTN12608966

18.

Background

Pancreatic cancer has a poor prognosis, and existing interventions provide only modest benefit. Statins have anti-cancer properties that might enhance survival in pancreatic cancer patients. We sought to determine whether statin treatment after cancer diagnosis is associated with longer survival in those with pancreatic ductal adenocarcinoma (PDAC).

Methods

We analyzed data on 7813 elderly patients with PDAC using the linked Surveillance, Epidemiology, and End Results (SEER)-Medicare claims files. Information on the type, intensity, and duration of statin use after cancer diagnosis was extracted from Medicare Part D. We treated statin use as a time-dependent variable in a Cox regression model to determine its association with overall survival, adjusting for follow-up, age, sex, race, neighborhood income, stage, grade, tumor size, pancreatectomy, chemotherapy, radiation, obesity, dyslipidemia, diabetes, chronic pancreatitis, and chronic obstructive pulmonary disease (COPD).
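Time-dependent covariates of the kind described above are typically encoded in counting-process (start, stop) format, splitting each patient's follow-up at the point statin use begins so the model sees statin = 0 before that point and statin = 1 after. A minimal sketch with hypothetical values (the function name and example data are assumptions, not from the study):

```python
# Counting-process encoding for a time-dependent exposure.
# Each row is (start, stop, statin, event); times are in months.
def split_followup(follow_up, statin_start, died):
    """Split one patient's follow-up into unexposed/exposed intervals."""
    rows = []
    if statin_start is None or statin_start >= follow_up:
        # never exposed during follow-up
        rows.append((0.0, follow_up, 0, int(died)))
    elif statin_start <= 0:
        # exposed from the outset
        rows.append((0.0, follow_up, 1, int(died)))
    else:
        rows.append((0.0, statin_start, 0, 0))                # unexposed interval
        rows.append((statin_start, follow_up, 1, int(died)))  # exposed interval
    return rows

# Hypothetical patient: 24 months of follow-up, statin started at month 6, died.
print(split_followup(24.0, 6.0, True))
# [(0.0, 6.0, 0, 0), (6.0, 24.0, 1, 1)]
```

Rows in this long format can then be fed to any Cox implementation that accepts (start, stop) intervals.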

Results

Overall, statin use after cancer diagnosis was not significantly associated with survival when all PDAC patients were considered (HR = 0.94, 95% CI 0.89, 1.01). However, statin use after cancer diagnosis was associated with a 21% reduced hazard of death (hazard ratio (HR) = 0.79, 95% confidence interval (CI) 0.67, 0.93) in those with grade I or II PDAC, and to a similar extent in those who had undergone a pancreatectomy, those with chronic pancreatitis, and those who had not been treated with a statin prior to cancer diagnosis.

Conclusions

We found that statin treatment after cancer diagnosis is associated with enhanced survival in patients with low-grade, resectable PDAC.

19.

Purpose

To evaluate the ability of longitudinal Useful Field of View (UFOV) and simulated driving measurements to predict future occurrence of motor vehicle collision (MVC) in drivers with glaucoma.

Design

Prospective observational cohort study.

Participants

117 drivers with glaucoma followed for an average of 2.1 ± 0.5 years.

Methods

All subjects underwent standard automated perimetry (SAP), UFOV testing, driving simulation, and cognitive assessment at baseline and every 6 months during follow-up. The driving simulator evaluated reaction times to high- and low-contrast peripheral divided-attention stimuli presented while negotiating a winding country road, with central driving-task performance assessed as “curve coherence”. Drivers with an MVC during follow-up were identified from Department of Motor Vehicles records.

Main Outcome Measures

Survival models were used to evaluate the ability of driving simulator and UFOV to predict MVC over time, adjusting for potential confounding factors.

Results

Mean age at baseline was 64.5 ± 12.6 years. Eleven of 117 (9.4%) drivers had an MVC during follow-up. In the multivariable models, low-contrast reaction time was significantly predictive of MVC, with a hazard ratio (HR) of 2.19 per 1 SD of slower reaction time (95% CI, 1.30 to 3.69; P = 0.003). UFOV divided attention was also significantly predictive of MVC, with an HR of 1.98 per 1 SD of worse performance (95% CI, 1.10 to 3.57; P = 0.022). Global SAP visual field indices in the better or worse eye were not predictive of MVC. The longitudinal model including driving simulator performance was a better predictor of MVC than the model including UFOV (R2 = 0.41 vs. R2 = 0.18).
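Because the hazard ratios above are expressed per 1 SD, they scale multiplicatively under the usual log-linear Cox model form. A small illustrative calculation (assuming that standard model form; not a figure from the study):

```python
# Per-SD hazard ratios compound multiplicatively: a driver k SD worse
# on a predictor has the baseline hazard multiplied by hr_per_sd ** k.
def hr_for_k_sd(hr_per_sd, k):
    """Hazard ratio for a k-SD difference, given the per-1-SD HR."""
    return hr_per_sd ** k

print(hr_for_k_sd(2.19, 1))            # 2.19 (as reported)
print(round(hr_for_k_sd(2.19, 2), 2))  # 4.80 for a 2-SD-slower driver
```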

Conclusions

Longitudinal divided-attention metrics on the UFOV test and during simulated driving were significantly predictive of the risk of MVC in glaucoma patients. These findings may help improve understanding of the factors associated with driving impairment related to glaucoma.

20.

Background and Aims

Prediction of severe clinical outcomes in Clostridium difficile infection (CDI) is important to inform management decisions for optimum patient care. Currently, treatment recommendations for CDI vary based on disease severity, but validated methods to predict severe disease are lacking. The aim of this study was to derive and validate a clinical prediction tool for severe outcomes in CDI.

Methods

A cohort of 638 patients with CDI was prospectively studied at three tertiary-care clinical sites (Boston, Dublin, and Houston). The clinical prediction rule (CPR) was developed by multivariate logistic regression analysis using the Boston cohort, and the performance of this model was then evaluated in the combined Houston and Dublin cohorts.

Results

The CPR included three binary variables: age ≥ 65 years, peak serum creatinine ≥ 2 mg/dL, and peak peripheral blood leukocyte count ≥ 20,000 cells/μL. The resulting Clostridium difficile severity score (CDSS) correctly classified 76.5% (95% CI: 70.87-81.31) and 72.5% (95% CI: 67.52-76.91) of patients in the derivation and validation cohorts, respectively. In the validation cohort, CDSS scores of 0, 1, 2, or 3 were associated with severe clinical outcomes of CDI in 4.7%, 13.8%, 33.3%, and 40.0% of cases, respectively.
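The score itself is simple enough to sketch directly: one point per criterion met. A minimal illustration (the function name and structure are assumptions; the thresholds and validation-cohort outcome rates are taken from the text above):

```python
# CDSS: one point per binary criterion (age, creatinine, leukocyte count).
def cdss_score(age_years, peak_creatinine_mg_dl, peak_wbc_per_ul):
    """Return the CDSS (0-3) for one patient."""
    return (int(age_years >= 65)
            + int(peak_creatinine_mg_dl >= 2.0)
            + int(peak_wbc_per_ul >= 20_000))

# Observed rate of severe clinical outcome by score (validation cohort).
severe_outcome_rate = {0: 0.047, 1: 0.138, 2: 0.333, 3: 0.400}

score = cdss_score(70, 2.5, 25_000)
print(score, severe_outcome_rate[score])  # 3 0.4
```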

Conclusions

We prospectively derived and validated a clinical prediction rule for severe CDI that is simple, reliable, and accurate, and can be used to identify high-risk patients most likely to benefit from measures to prevent complications of CDI.


Copyright © Beijing Qinyun Technology Development Co., Ltd. 京ICP备09084417号