Similar Articles (20 results)
1.

Background

Access to “safe” water and “adequate” sanitation is emphasized as an important measure for schistosomiasis control. Indeed, the schistosomes' lifecycles suggest that their transmission may be reduced through safe water and adequate sanitation. However, the evidence has not previously been compiled in a systematic review.

Methodology

We carried out a systematic review and meta-analysis of studies reporting schistosome infection rates in people who do or do not have access to safe water and adequate sanitation. PubMed, Web of Science, Embase, and the Cochrane Library were searched from inception to 31 December 2013, without restrictions on year of publication or language. Studies' titles and abstracts were screened by two independent assessors. Papers deemed of interest were read in full and appropriate studies included in the meta-analysis. Publication bias was assessed through the visual inspection of funnel plots and through Egger's test. Heterogeneity of datasets within the meta-analysis was quantified using Higgins' I².
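The pooled odds ratios and heterogeneity statistics quoted below can be reproduced with standard inverse-variance formulas. The following Python sketch is a minimal illustration on made-up per-study odds ratios (not the review's data or code): it recovers standard errors from confidence intervals, computes Cochran's Q and Higgins' I², and pools the log odds ratios with DerSimonian-Laird random-effects weights.

```python
# Minimal random-effects meta-analysis sketch; the per-study ORs and CIs are hypothetical.
import numpy as np

or_ = np.array([0.45, 0.60, 0.52, 0.70])       # per-study odds ratios (made up)
ci_lo = np.array([0.30, 0.42, 0.40, 0.50])     # lower 95% CI bounds
ci_hi = np.array([0.68, 0.86, 0.68, 0.98])     # upper 95% CI bounds

log_or = np.log(or_)
se = (np.log(ci_hi) - np.log(ci_lo)) / (2 * 1.96)   # SE recovered from the CI width
w = 1.0 / se**2                                     # fixed-effect (inverse-variance) weights

fixed = np.sum(w * log_or) / np.sum(w)
Q = np.sum(w * (log_or - fixed) ** 2)               # Cochran's Q
df = len(or_) - 1
I2 = max(0.0, (Q - df) / Q) * 100                   # Higgins' I^2 (%)

# DerSimonian-Laird between-study variance and the random-effects pooled OR
tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
w_re = 1.0 / (se**2 + tau2)
pooled = np.sum(w_re * log_or) / np.sum(w_re)
se_pooled = np.sqrt(1.0 / np.sum(w_re))
print(f"pooled OR = {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96*se_pooled):.2f}-{np.exp(pooled + 1.96*se_pooled):.2f}), "
      f"I^2 = {I2:.0f}%")
```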

Principal Findings

Safe water supplies were associated with significantly lower odds of schistosomiasis (odds ratio (OR) = 0.53, 95% confidence interval (CI): 0.47–0.61). Adequate sanitation was associated with lower odds of Schistosoma mansoni (OR = 0.59, 95% CI: 0.47–0.73) and Schistosoma haematobium (OR = 0.69, 95% CI: 0.57–0.84) infection. Included studies were mainly cross-sectional and quality was largely poor.

Conclusions/Significance

Our systematic review and meta-analysis suggests that increasing access to safe water and adequate sanitation is an important measure to reduce the odds of schistosome infection. However, most of the studies were observational and of poor quality. Hence, there is a pressing need for adequately powered cluster-randomized trials comparing schistosome infection risk with access to safe water and adequate sanitation, for more studies that rigorously define water and sanitation, and for new research on the relationships between water, sanitation, hygiene, human behavior, and schistosome transmission.

2.
This study investigated the relationship between level of stress in middle and high school students aged 12–18 and the risk of atopic dermatitis. Data from the Sixth Korea Youth Risk Behavior Web-based Survey (KYRBWS-VI), a cross-sectional study of 74,980 students in 800 middle and high schools with a response rate of 97.7%, were analyzed. Ordinal logistic regression analyses were conducted to determine the relationship between stress and atopic dermatitis severity. A total of 5,550 boys and 6,964 girls reported having been diagnosed with atopic dermatitis. Younger students were more likely to have atopic dermatitis. Interestingly, the educational level of parents was found to be associated with having atopic dermatitis and with having a more severe condition. In particular, girls whose mothers had at least a college education had a 41% higher risk of having atopic dermatitis and a severe atopic condition (odds ratio (OR) = 1.41, 95% CI, 1.22–1.63; P<0.0001) compared with those whose mothers had attended middle school at most. A similar trend was observed among both boys and girls for their father's education level. The stress level was found to be significantly associated with the risk of atopic dermatitis. Compared to boys who reported “no stress”, boys with “very high” stress had a 46% higher risk of having more severe atopic dermatitis (OR = 1.46, 95% CI, 1.20–1.78; P<0.0001), boys with “high” stress a 44% higher risk (OR = 1.44, 95% CI, 1.19–1.73; P<0.0001), and boys with “moderate” stress a 21% higher risk (OR = 1.21, 95% CI, 1.00–1.45; P = 0.05). In contrast, we found no statistically significant relationship between stress and atopic dermatitis in girls. This study suggests that stress and parents' education level were associated with atopic dermatitis. Specifically, the degree of stress was positively correlated with both the likelihood of being diagnosed with this condition and its severity.
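A proportional-odds (ordinal logistic) model of the kind described above can be fitted, for example, with statsmodels' OrderedModel. The sketch below is illustrative only: the simulated data frame and column names (stress, age, mother_college, severity) are assumptions, not the KYRBWS-VI variables.

```python
# Hedged sketch of an ordinal logistic regression of severity on stress level.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "stress": rng.integers(0, 5, n),          # 0 = none ... 4 = very high (simulated)
    "age": rng.integers(12, 19, n),
    "mother_college": rng.integers(0, 2, n),
})
# Ordered outcome: 0 = no atopic dermatitis, 1 = mild, 2 = severe (simulated)
latent = 0.3 * df["stress"] - 0.1 * (df["age"] - 15) + rng.logistic(size=n)
df["severity"] = pd.cut(latent, bins=[-np.inf, 0.5, 1.5, np.inf], labels=False)

model = OrderedModel(df["severity"], df[["stress", "age", "mother_college"]], distr="logit")
res = model.fit(method="bfgs", disp=False)
print(res.summary())
print("OR per one-level increase in stress:", round(float(np.exp(res.params["stress"])), 2))
```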

3.

Background

Isoniazid preventive therapy (IPT), with or without concomitant administration of antiretroviral therapy (ART), is a proven intervention to prevent tuberculosis among people living with HIV (PLHIV). However, there are few data on the routine implementation of this intervention and on its effectiveness in settings with limited resources.

Objectives

To measure the level of uptake and effectiveness of IPT in reducing tuberculosis incidence in a cohort of PLHIV enrolled into HIV care between 2007 and 2010 in five hospitals in southern Ethiopia.

Methods

A retrospective cohort analysis of an electronic patient database was done. The independent effects of no intervention, “IPT-only,” “IPT-before-ART,” “IPT-and-ART started simultaneously,” “ART-only,” and “IPT-after-ART” on TB incidence were measured. Cox proportional hazards regression was used to assess the association of treatment categories with TB incidence.
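One way to fit a Cox proportional-hazards comparison of treatment categories is with the lifelines package; the study's own software is not stated, so this is only a hedged sketch on simulated data, with hazard ratios loosely echoing the adjusted HRs reported below.

```python
# Illustrative Cox model on simulated follow-up data; not the Ethiopian cohort itself.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 500
group = rng.integers(0, 4, n)                 # 0 = no intervention, 1 = IPT only, 2 = ART only, 3 = IPT+ART
hr = np.array([1.0, 0.36, 0.32, 0.20])        # illustrative hazard ratios
event_time = rng.exponential(1.0 / (0.026 * hr[group]))   # baseline TB incidence ~2.6/100 person-years
censor_time = rng.exponential(40.0, n)                    # administrative censoring

df = pd.DataFrame({
    "time": np.minimum(event_time, censor_time),
    "tb_event": (event_time <= censor_time).astype(int),
    "ipt_only": (group == 1).astype(int),
    "art_only": (group == 2).astype(int),
    "ipt_and_art": (group == 3).astype(int),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="tb_event")
cph.print_summary()        # exp(coef) ~ hazard ratio for each category vs. no intervention
```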

Results

Of 7,097 patients, 867 were excluded because they were transferred in; a further 823 (12%) were excluded because they were either identified as having TB through screening (292 patients) or were already on TB treatment (531). Among the remaining 5,407 patients observed, IPT had been initiated for 39% of eligible patients. Children, men, patients with advanced disease, and those in pre-ART care were less likely to be initiated on IPT. The overall TB incidence was 2.6 per 100 person-years. Compared with no intervention, “IPT-only” (aHR = 0.36, 95% CI = 0.19–0.66) and “ART-only” (aHR = 0.32, 95% CI = 0.24–0.43) were each associated with a significant reduction in TB incidence. Combining ART and IPT had a more profound effect: starting IPT before ART (aHR = 0.18, 95% CI = 0.08–0.42) or simultaneously with ART (aHR = 0.20, 95% CI = 0.10–0.42) reduced TB incidence further, by approximately 80%.

Conclusions

IPT was found to be effective in reducing TB incidence, both independently and with concomitant ART, under programme conditions in resource-limited settings. The level of IPT provision and its effectiveness in reducing TB were encouraging in the study setting. Scaling up and strengthening IPT services in addition to ART can have a beneficial effect in reducing the TB burden among PLHIV in settings with a high TB/HIV burden.

4.

Objective

Recent treatment patterns and cost data associated with HIV in the United States are limited. This study assessed first-line persistence and healthcare costs of HIV-1 in patients by treatment line and CD4 cell count.

Methods

MarketScan Commercial Claims and Encounters Database (2007–2011) and Lab Database (2007–2010) were used to construct two HIV-1 cohorts: 1) newly treated HIV-1–infected patients with ≥6 months' continuous enrollment prior to first third-agent drug claim (Newly Treated Cohort) and 2) CD4 cell count test results (CD4 Measurements Cohort). All patients were ≥18 years old and without hepatitis co-infection. The Kaplan-Meier method was used to measure treatment switch rates. Generalized linear models (gamma distribution, log link) were used to compare healthcare costs by treatment line and CD4 cell count controlling for potential confounders.
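A cost model of the kind described above (generalized linear model with gamma distribution and log link) can be sketched with statsmodels. The simulated costs, covariates, and column names below are assumptions for illustration, not MarketScan data; exponentiated coefficients are read as relative costs versus first-line treatment.

```python
# Gamma GLM with log link on simulated annual costs by treatment line.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 600
line = rng.integers(1, 4, n)                        # 1 = first-, 2 = second-, 3 = third-line
age = rng.integers(18, 65, n)
mu = np.exp(10.3 + 0.22 * (line == 2) + 0.34 * (line == 3) + 0.002 * age)
cost = rng.gamma(shape=2.0, scale=mu / 2.0)         # gamma-distributed annual cost

X = pd.DataFrame({
    "second_line": (line == 2).astype(int),
    "third_line": (line == 3).astype(int),
    "age": age,
})
X = sm.add_constant(X)
glm = sm.GLM(cost, X, family=sm.families.Gamma(link=sm.families.links.Log()))
res = glm.fit()
# exp(coefficient) ~= relative cost versus first-line treatment
print(np.exp(res.params[["second_line", "third_line"]]))
```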

Results

Newly treated patients (n = 8,617) had a mean age of 41 years, 82% were male, and 20% had experienced AIDS-defining events at baseline. Over 20% of newly treated patients switched their initial treatment regimen within 2 years. Average unadjusted (and covariate-adjusted) total healthcare cost/year was $33,674 ($28,861) for first-line, $39,191 ($35,805) for second-line, and $39,882 ($40,804) for third-line treatment. Covariate-adjusted costs of care on second- and third-line treatments were significantly higher than on first-line treatment (24% [p<0.001] and 41% [p = 0.006] higher, respectively). The CD4 Measurements Cohort included 803 CD4 measurements (mean age 49, 76% male, 8% had experienced an AIDS-defining event). Costs associated with CD4 measurements <100 cells/µL were 92% higher than those with >350 cells/µL (p<0.001). For higher CD4 cell counts, the majority of expenditures were for antiretrovirals (64% of the total for CD4 >350 cells/µL).

Conclusions

Despite modern advances in antiretroviral therapy and medical care, direct medical costs of HIV-1–infected patients increase after treatment switch and with lower CD4 counts, consistent with previous costing studies.

5.
It has been suggested that mitochondrial dysfunction plays a role in metabolic disorders including obesity, diabetes, and hypertension. The fact that mitochondrial defects can accumulate over time as a normal part of aging may explain why some individuals can eat all sorts of foods and remain at a normal weight while they are young. However, around the fourth decade of life there is a trend towards “middle-age spread,” with weight gain and a decreasing ability of the body to metabolize calories efficiently. To test the hypothesis that mitochondrial variants are associated with BMI in adults, we analyzed a total of 984 mitochondrial single nucleotide polymorphisms (mtSNPs) in a sample of 6,528 individuals participating in the KORA studies. To assess mtSNP associations while taking heteroplasmy into account, we used the raw signal intensity values measured on the microarray and applied linear regression. Significant results were obtained for 2 mtSNPs located in the Cytochrome c oxidase subunit genes (MT-CO1: Padjusted = 0.0140 and MT-CO3: Padjusted = 0.0286) and 3 mtSNPs located in the NADH dehydrogenase subunit genes (MT-ND1, MT-ND2 and MT-ND4L: Padjusted = 0.0286). Polymorphisms located in the MT-CO3 and MT-ND4L genes have not previously been associated with BMI or related phenotypes. Our results highlight the importance of the mitochondrial genome among the factors that contribute to the risk of high BMI. Focusing on mitochondrial variants may lead to further insights regarding the effects of existing medications, or even to the development of innovative treatments.
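A per-mtSNP linear regression on raw intensity values, followed by a multiple-testing adjustment, could look like the sketch below. The simulated intensities, sample sizes, and the Bonferroni correction are assumptions; the KORA analysis details (exact adjustment method, covariates) may differ.

```python
# Per-SNP OLS of BMI on raw signal intensity with multiple-testing adjustment (illustrative).
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(3)
n_subjects, n_snps = 2000, 50
bmi = rng.normal(27, 4, n_subjects)
intensity = rng.normal(0, 1, (n_subjects, n_snps))     # raw array intensities (heteroplasmy proxy)
intensity[:, 0] += 0.03 * (bmi - bmi.mean())           # plant one weak true association

pvals = []
for j in range(n_snps):
    X = sm.add_constant(intensity[:, j])
    res = sm.OLS(bmi, X).fit()
    pvals.append(res.pvalues[1])                       # p-value for the intensity term

reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="bonferroni")
print("significant mtSNPs after adjustment:", np.where(reject)[0])
```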

6.

Objective

How do the holidays – and the possible New Year’s resolutions that follow – influence a household’s purchase patterns of healthier foods versus less healthy foods? This has important implications for both holiday food shopping and post-holiday shopping.

Methods

207 households were recruited to participate in a randomized controlled trial conducted at two regional grocery-chain locations in upstate New York. Item-level transaction records were tracked over a seven-month period (July 2010 to March 2011). The cooperating grocer’s proprietary nutrient-rating system was used to designate “healthy” and “less healthy” items. Calorie data were extracted from online nutritional databases. Expenditures and calories purchased for the holiday period (Thanksgiving–New Year’s) and the post-holiday period (New Year’s–March) were compared to baseline (July–Thanksgiving) amounts.

Results

During the holiday season, household food expenditures increased 15% compared to baseline ($105.74 to $121.83; p<0.001), with 75% of additional expenditures accounted for by less-healthy items. Consistent with what one would expect from New Year’s resolutions, sales of healthy foods increased 29.4% ($13.24/week) after the holiday season compared to baseline, and 18.9% ($9.26/week) compared to the holiday period. Unfortunately, sales of less-healthy foods remained at holiday levels ($72.85/week holiday period vs. $72.52/week post-holiday). Calories purchased each week increased 9.3% (450 calories per serving/week) after the New Year compared to the holiday period, and increased 20.2% (890 calories per serving/week) compared to baseline.

Conclusions

Despite resolutions to eat more healthfully after New Year’s, consumers may adjust to a new “status quo” of increased less-healthy food purchasing during the holidays, and dubiously fulfill their New Year’s resolutions by spending more on healthy foods. Encouraging consumers to substitute healthy items for less-healthy items may be one way for practitioners and public health officials to help consumers fulfill New Year’s resolutions, and reverse holiday weight gain.

7.

Background

Even when tuberculosis (TB) treatment is free, hidden costs incurred by patients and their households (TB-affected households) may worsen poverty and health. Extreme TB-associated costs have been termed “catastrophic” but are poorly defined. We studied TB-affected households' hidden costs and their association with adverse TB outcome to create a clinically relevant definition of catastrophic costs.

Methods and Findings

From 26 October 2002 to 30 November 2009, TB patients (n = 876, 11% with multi-drug-resistant [MDR] TB) and healthy controls (n = 487) were recruited to a prospective cohort study in shantytowns in Lima, Peru. Patients were interviewed prior to and every 2–4 wk throughout treatment, recording direct (household expenses) and indirect (lost income) TB-related costs. Costs were expressed as a proportion of the household's annual income. In poorer households, costs were lower but constituted a higher proportion of the household's annual income: 27% (95% CI = 20%–43%) in the least-poor households versus 48% (95% CI = 36%–50%) in the poorest. Adverse TB outcome was defined as death, treatment abandonment or treatment failure during therapy, or recurrence within 2 y. 23% (166/725) of patients with a defined treatment outcome had an adverse outcome. Total costs ≥20% of household annual income were defined as catastrophic because this threshold was most strongly associated with adverse TB outcome. Catastrophic costs were incurred by 345 households (39%). Having MDR TB was associated with a higher likelihood of incurring catastrophic costs (54% [95% CI = 43%–61%] versus 38% [95% CI = 34%–41%], p<0.003). Adverse outcome was independently associated with MDR TB (odds ratio [OR] = 8.4 [95% CI = 4.7–15], p<0.001), previous TB (OR = 2.1 [95% CI = 1.3–3.5], p = 0.005), days too unwell to work pre-treatment (OR = 1.01 [95% CI = 1.00–1.01], p = 0.02), and catastrophic costs (OR = 1.7 [95% CI = 1.1–2.6], p = 0.01). The adjusted population attributable fraction of adverse outcomes explained by catastrophic costs was 18% (95% CI = 6.9%–28%), similar to that of MDR TB (20% [95% CI = 14%–25%]). Sensitivity analyses demonstrated that existing catastrophic costs thresholds (≥10% or ≥15% of household annual income) were not associated with adverse outcome in our setting. Study limitations included not measuring certain “dis-saving” variables (including selling household items) and gathering only 6 mo of cost-specific follow-up data for MDR TB patients.
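A population attributable fraction of this kind can be illustrated with Miettinen's formula, PAF = p_c × (RR − 1)/RR, where p_c is the proportion of adverse outcomes occurring in exposed (catastrophic-cost) patients and the adjusted odds ratio stands in for the relative risk. In the sketch below the OR of 1.7 is taken from the abstract, but the exposed-case proportion is a hypothetical value chosen only to show the arithmetic; the study's own adjusted PAF calculation may differ.

```python
# Worked PAF illustration (Miettinen's formula); p_cases_exposed is an assumed value.
def paf_miettinen(p_cases_exposed: float, relative_risk: float) -> float:
    """Attributable fraction for the whole population (Miettinen 1974)."""
    return p_cases_exposed * (relative_risk - 1.0) / relative_risk

adjusted_or = 1.7          # catastrophic costs vs. adverse outcome (from the abstract)
p_cases_exposed = 0.45     # hypothetical share of adverse outcomes among exposed households
print(f"PAF ~= {paf_miettinen(p_cases_exposed, adjusted_or):.0%}")   # ~19%, near the reported 18%
```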

Conclusions

Despite free TB care, having TB disease was expensive for impoverished TB patients in Peru. Incurring higher relative costs was associated with adverse TB outcome. The population attributable fraction indicated that catastrophic costs and MDR TB were associated with similar proportions of adverse outcomes. Thus TB is a socioeconomic as well as an infectious problem, and TB control interventions should address both the economic and clinical aspects of this disease. Please see later in the article for the Editors' Summary.

8.
Latent print examiners use their expertise to determine whether the information present in a comparison of two fingerprints (or palmprints) is sufficient to conclude that the prints were from the same source (individualization). When fingerprint evidence is presented in court, it is the examiner's determination—not an objective metric—that is presented. This study was designed to ascertain the factors that explain examiners' determinations of sufficiency for individualization. Volunteer latent print examiners (n = 170) were each assigned 22 pairs of latent and exemplar prints for examination, and annotated features, correspondence of features, and clarity. The 320 image pairs were selected specifically to control clarity and quantity of features. The predominant factor differentiating annotations associated with individualization and inconclusive determinations is the count of corresponding minutiae; other factors such as clarity provided minimal additional discriminative value. Examiners' counts of corresponding minutiae were strongly associated with their own determinations; however, due to substantial variation of both annotations and determinations among examiners, one examiner's annotation and determination on a given comparison is a relatively weak predictor of whether another examiner would individualize. The extensive variability in annotations also means that we must treat any individual examiner's minutia counts as interpretations of the (unknowable) information content of the prints: saying “the prints had N corresponding minutiae marked” is not the same as “the prints had N corresponding minutiae.” More consistency in annotations, which could be achieved through standardization and training, should lead to process improvements and provide greater transparency in casework.

9.

Background

Patients with type 2 diabetes vary greatly with respect to degree of obesity at time of diagnosis. To address the heterogeneity of type 2 diabetes, we characterised patterns of change in body mass index (BMI) and other cardiometabolic risk factors before type 2 diabetes diagnosis.

Methods and Findings

We studied 6,705 participants from the Whitehall II study, an observational prospective cohort study of civil servants based in London. White men and women, initially free of diabetes, were followed with 5-yearly clinical examinations from 1991–2009 for a median of 14.1 years (interquartile range [IQR]: 8.7–16.2 years). Type 2 diabetes developed in 645 participants (1,209 person-examinations) and 6,060 remained free of diabetes during follow-up (14,060 person-examinations). Latent class trajectory analysis of incident diabetes cases was used to identify patterns of pre-disease BMI. Associated trajectories of cardiometabolic risk factors were studied using adjusted mixed-effects models. Three patterns of BMI change were identified. Most participants belonged to the “stable overweight” group (n = 604, 94%), with a relatively constant BMI level within the overweight category throughout follow-up. They experienced a slight worsening of beta cell function and insulin sensitivity from 5 years prior to diagnosis. A small group of “progressive weight gainers” (n = 15) exhibited a pattern of consistent weight gain before diagnosis. Linear increases in blood pressure and an exponential increase in insulin resistance a few years before diagnosis accompanied the weight gain. The “persistently obese” (n = 26) were severely obese throughout the whole 18 years before diabetes diagnosis. They experienced an initial beta cell compensation followed by loss of beta cell function, whereas insulin sensitivity was relatively stable. Since the generalizability of these findings is limited, the results need confirmation in other study populations.
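The second modelling step, adjusted mixed-effects models of risk-factor trajectories, can be sketched as follows. The data are simulated and the variable names (id, years_to_dx, sbp) are assumptions; the latent class trajectory step that precedes it in the study is not reproduced here.

```python
# Mixed-effects model of a pre-diagnosis risk-factor trajectory with random intercepts per participant.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n_people, n_visits = 200, 4
pid = np.repeat(np.arange(n_people), n_visits)
years_to_dx = np.tile(np.array([-15.0, -10.0, -5.0, 0.0]), n_people)   # time before diagnosis
subject_effect = rng.normal(0, 5, n_people)[pid]                        # random intercept
sbp = 125 + 0.6 * years_to_dx + subject_effect + rng.normal(0, 6, pid.size)

df = pd.DataFrame({"id": pid, "years_to_dx": years_to_dx, "sbp": sbp})
model = smf.mixedlm("sbp ~ years_to_dx", data=df, groups=df["id"])
result = model.fit()
print(result.summary())      # fixed-effect slope ~= average pre-diagnosis SBP trajectory
```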

Conclusions

Three patterns of obesity changes prior to diabetes diagnosis were accompanied by distinct trajectories of insulin resistance and other cardiometabolic risk factors in a white, British population. While these results should be verified independently, the great majority of patients had modest weight gain prior to diagnosis. These results suggest that strategies focusing on small weight reductions for the entire population may be more beneficial than predominantly focusing on weight loss for high-risk individuals. Please see later in the article for the Editors' Summary.

10.

Objective

To examine whether psychosocial factors mediate (explain) the association between socioeconomic position and takeaway food consumption.

Design

A cross-sectional postal survey conducted in 2009.

Setting

Participants reported their usual consumption of 22 takeaway food items, and these were grouped into a “healthy” and a “less healthy” index based on each item's nutritional properties. Principal Components Analysis was used to derive three psychosocial scales measuring beliefs about the relationship between diet and health (α = 0.73), and perceptions about the value (α = 0.79) and pleasure (α = 0.61) of takeaway food. A nutrition knowledge index was also used. Socioeconomic position was measured by highest attained education level.
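The scale-construction step (principal components plus Cronbach's alpha for internal consistency) can be sketched as below. The simulated item matrix and single latent factor are assumptions, not the survey's psychosocial items.

```python
# Cronbach's alpha and PCA on a simulated block of correlated questionnaire items.
import numpy as np
from sklearn.decomposition import PCA

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix for one scale."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(4)
latent = rng.normal(size=(903, 1))                        # one underlying belief factor
items = latent + rng.normal(scale=0.8, size=(903, 5))     # five correlated questionnaire items

print("Cronbach's alpha:", round(cronbach_alpha(items), 2))
pca = PCA(n_components=2).fit(items)
print("variance explained by first component:", round(pca.explained_variance_ratio_[0], 2))
```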

Subjects

Randomly selected adults (n = 1,500) aged 25–64 years in Brisbane, Australia (response rate = 63.7%, N = 903).

Results

Compared with those with a bachelor degree or higher, participants with a diploma level of education were more likely to consume “healthy” takeaway food (p = 0.023) whereas the least educated (high school only) were more likely to consume “less healthy” choices (p = 0.002). The least educated were less likely to believe in a relationship between diet and health (p<0.001), and more likely to have lower nutritional knowledge compared with their highly educated counterparts (p<0.001). Education differences in beliefs about the relationship between diet and health partly and significantly mediated the association between education and “healthy” takeaway food consumption. Diet- and health-related beliefs and nutritional knowledge partly and significantly mediated the education differences in “less healthy” takeaway food consumption.

Conclusions

Interventions that target beliefs about the relationship between diet and health, and nutritional knowledge, may reduce socioeconomic differences in takeaway food consumption, particularly for “less healthy” options.

11.
12.
13.
Recent evidence has demonstrated the efficacy of pre-exposure prophylaxis (PrEP) for HIV prevention, but concerns persist around its use. Little is known about Canadian physicians' knowledge of and willingness to prescribe PrEP. We disseminated an online survey to Canadian family, infectious disease, internal medicine, and public health physicians between September 2012 and June 2013 to determine willingness to prescribe PrEP. Criteria for analysis were met by 86 surveys. 45.9% of participants felt “very familiar” with PrEP, 49.4% felt that PrEP should be approved by Health Canada, and 45.4% of respondents were willing to prescribe PrEP. Self-identifying as an HIV expert (odds ratio, OR = 4.1, 95% confidence interval, CI = 1.6–10.2), familiarity with PrEP (OR = 5.0, 95% CI = 1.3–19.0), and having been asked by patients about PrEP (OR = 4.0, 95% CI = 1.5–10.5) were positively associated with willingness to prescribe PrEP on univariable analysis. The latter two were the strongest predictors on multivariable analysis. Participants cited cost and efficacy as major concerns. 75.3% did not feel that information had been adequately disseminated among physicians. In summary, Canadian physicians demonstrate varying levels of support for PrEP and express concerns about its implementation. Further research on real-world effectiveness, continuing medical education, and clinical support is needed to prepare physicians for this prevention strategy.

14.
PLoS ONE, 2013, 8(6)

Background

Multimorbidity has a negative impact on health-related quality of life (HRQL). Previous studies included only a limited number of conditions. In this study, we analyse the impact of a large number of conditions on HRQL in multimorbid patients without preselecting particular diseases. We also explore the effects of these conditions on the specific dimensions of HRQL.

Materials and Methods

This analysis is based on a multicenter, prospective cohort study of 3,189 multimorbid primary care patients aged 65 to 85. The impact of 45 conditions on HRQL was analysed, and the severity of the conditions was rated. The EQ-5D, consisting of 5 dimensions and a visual analogue scale (EQ VAS), was employed. Data were analysed using multiple ordinary least squares and multiple logistic regressions. Multimorbidity measured by a weighted count score was significantly associated with lower overall HRQL (EQ VAS), b = −1.02 (SE: 0.06). Parkinson’s disease had the most pronounced negative effect on overall HRQL (EQ VAS), b = −12.29 (SE: 2.18), followed by rheumatism, depression, and obesity. With regard to the individual EQ-5D dimensions, depression (OR = 1.39 to 3.3) and obesity (OR = 1.44 to 1.95) negatively affected the EQ-5D dimensions, with the exception of the anxiety/depression dimension, on which obesity had a positive effect, OR = 0.78 (SE: 0.07). The dimensions “self-care,” OR = 4.52 (SE: 1.37), and “usual activities,” OR = 3.59 (SE: 1.0), were most strongly affected by Parkinson’s disease. As a limitation, our sample may only represent patients with at most moderate disease severity.

Conclusions

The overall HRQL of multimorbid patients decreases with an increasing count and severity of conditions. Parkinson’s disease, depression, and obesity have the strongest impact on HRQL. Further studies should address the impact of disease combinations, which will require very large sample sizes as well as advanced statistical methods.

15.

Purpose

To investigate the current status of diabetic self-management behavior and the factors influencing this behavior in Chengdu, a typical city in western China.

Methods

We performed stratified sampling in 6 urban districts of Chengdu. We used questionnaires covering self-management knowledge, self-management beliefs, self-management efficacy, social support, and self-management behavior to investigate patients with type 2 diabetes mellitus (T2DM) from August to November 2011. All of the data were analyzed using the SPSS 17.0 statistical package.

Results

We enrolled a total of 364 patients in the present study. The median score of self-management behavior was 111.00, the interquartile range was 100.00–119.00, and the index score was 77.77. Self-management was described as “good” in 46%, “fair” in 45%, and “poor” in 6% of patients. A multiple-factor analysis identified age (OR, 0.43; 95% CI, 0.20–0.91; P = 0.026), education in “foot care” (OR, 0.42; 95% CI, 0.18–0.99; P = 0.048), self-management knowledge (OR, 0.86; 95% CI, 0.80–0.92; P<0.001), self-management belief (OR, 0.92; 95% CI, 0.87–0.97; P = 0.002), self-efficacy (OR, 0.93; 95% CI, 0.90–0.96; P<0.001), and social support (OR, 0.62; 95% CI, 0.41–0.94; P = 0.023) as positive factors. Negative factors included diabetes duration (5–9 years: OR, 14.82; 95% CI, 1.64–133.73; P = 0.016; and ≥10 years: OR, 10.28; 95% CI, 1.06–99.79; P = 0.045) and hospitalization experience (OR, 2.96; 95% CI, 1.64–5.36; P<0.001).

Conclusion

We observed good self-management behavior in patients with T2DM in Chengdu. When self-management education is provided, age, education, knowledge, beliefs, self-efficacy, and social support should be considered in order to offer more appropriate interventions and to improve patients' behavior.

16.

Objectives

We aimed to disentangle the effects of obesity and mobility limitation on cervical and breast cancer screening among community dwelling women.

Methods

The data source was the French national Health and Disability Survey - Household Section, 2008. The Body Mass Index (BMI) was used to categorize obesity status. We constructed a continuous score of mobility limitations to assess the severity of disability (Cronbach's alpha = 0.84). Logistic regressions were performed to examine the association between obesity, mobility limitations, and the use of the Pap test (n = 8,133) and of mammography (n = 7,561). Adjusted odds ratios (AORs) were calculated. Interaction terms between obesity and the disability score were included in models testing for effect modification.
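A logistic regression with an obesity-by-disability interaction term can be sketched with the statsmodels formula interface; the simulated data and variable names (obese, disability_score, age, pap_test) are illustrative, not the survey variables.

```python
# Logistic regression of screening use with an obesity x disability interaction (illustrative).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 3000
df = pd.DataFrame({
    "obese": rng.integers(0, 2, n),
    "disability_score": rng.gamma(2.0, 2.0, n),      # continuous mobility-limitation score
    "age": rng.integers(25, 75, n),
})
logit_p = 0.5 - 0.27 * df["obese"] - 0.05 * df["disability_score"] - 0.01 * df["age"]
df["pap_test"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = smf.logit("pap_test ~ obese * disability_score + age", data=df).fit(disp=False)
print(np.exp(model.params))     # adjusted odds ratios; the interaction term tests effect modification
```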

Results

Compared with non-obese women, the odds of having a Pap test in the past 3 years were 24% lower in obese women (AOR = 0.76; 95% CI: 0.65 to 0.89), and the odds of having a mammogram in the past 2 years were 23% lower (AOR = 0.77; 95% CI: 0.66 to 0.91). Each time the disability score was 5 points higher, the odds of having a Pap test decreased by 20% (AOR = 0.96; 95% CI: 0.94 to 0.98) and the odds of having a mammogram decreased by 25% (AOR = 0.95; 95% CI: 0.94 to 0.97). There was no significant interaction between obesity and the disability score.

Conclusion

Obesity and mobility limitation are independently associated with a lower likelihood of cervical and breast cancer screening. Protective outreach and follow-up are necessary to reduce inequalities and thus to reduce health disparities in these vulnerable and high-risk populations of obese women with disabilities.

17.

Background

Access to safe drinking-water is a fundamental requirement for good health and is also a human right. Global access to safe drinking-water is monitored by WHO and UNICEF using as an indicator “use of an improved source,” which does not account for water quality measurements. Our objectives were to determine whether water from “improved” sources is less likely to contain fecal contamination than “unimproved” sources and to assess the extent to which contamination varies by source type and setting.

Methods and Findings

Studies in Chinese, English, French, Portuguese, and Spanish were identified from online databases, including PubMed and Web of Science, and from grey literature. Studies in low- and middle-income countries published between 1990 and August 2013 that assessed drinking-water for the presence of Escherichia coli or thermotolerant coliforms (TTC) were included, provided they associated results with a particular source type. In total 319 studies were included, reporting on 96,737 water samples. The odds of contamination within a given study were considerably lower for “improved” sources than for “unimproved” sources (odds ratio [OR] = 0.15 [0.10–0.21], I² = 80.3% [72.9–85.6]). However, in 38% of 191 studies, over a quarter of samples from improved sources contained fecal contamination. Water sources in low-income countries (OR = 2.37 [1.52–3.71]; p<0.001) and in rural areas (OR = 2.37 [1.47–3.81]; p<0.001) were more likely to be contaminated. Studies rarely reported stored water quality or sanitary risks, and few achieved robust random selection. Safety may be overestimated due to infrequent water sampling and deterioration in quality prior to consumption.

Conclusion

Access to an “improved source” provides a measure of sanitary protection but does not ensure that water is free of fecal contamination, nor is contamination consistent between source types or settings. International estimates therefore greatly overstate the use of safe drinking-water and do not fully reflect disparities in access. An enhanced monitoring strategy would combine indicators of sanitary protection with measures of water quality. Please see later in the article for the Editors' Summary.

18.

Objective

Sexual orientation is usually considered to be determined in early life and stable in the course of adulthood. In contrast, some transgender individuals report a change in sexual orientation. A common reason for this phenomenon is not known.

Methods

We included 115 transsexual persons (70 male-to-female “MtF” and 45 female-to-male “FtM”) from our endocrine outpatient clinic who completed a questionnaire retrospectively evaluating the history of their gender transition phase. The questionnaire focused on sexual orientation and recalled time points of changes in sexual orientation in the context of transition. Participants were further asked to provide a personal concept for a potential change in sexual orientation.

Results

In total, 32.9% (n = 23) of MtF persons reported a change in sexual orientation, in contrast to 22.2% (n = 10) of FtM transsexual persons (p = 0.132). Of these patients, 39.1% (MtF) and 60% (FtM) reported a change in sexual orientation before having undergone any sex reassignment surgery. FtM who had initially been sexually oriented towards males (androphilic) were significantly more likely to report a change in sexual orientation than gynephilic, analloerotic, or bisexual FtM (p = 0.012). Similarly, gynephilic MtF reported a change in sexual orientation more frequently than androphilic, analloerotic, or bisexual MtF transsexual persons (p = 0.05).

Conclusion

In line with earlier reports, we show that a change in self-reported sexual orientation is frequent and does not solely occur in the context of particular transition events. Transsexual persons who are attracted to individuals of the opposite biological sex are more likely to change sexual orientation. Qualitative reports suggest that the individual's biography, autogynephilic and autoandrophilic sexual arousal, confusion before and after transitioning, social and self-acceptance, as well as the concept of sexual orientation itself, may explain this phenomenon.

19.

Objective

This study aims to determine the up-to-date prevalence of overweight and obesity, the distributions of body weight perception and weight loss practice in Beijing adults.

Methods

A cross-sectional study was conducted in 2011. A total of 2563 men and 4088 women aged 18–79 years from the general population were included. Data were obtained from questionnaire and physical examination.

Results

The prevalence of overweight (BMI 24–27.9 kg/m²) and obesity (BMI ≥28 kg/m²) was 42.1% and 20.3% in men and 35.6% and 17.1% in women, respectively. Age was inversely associated with overweight in both sexes, and with obesity in women. Education level was negatively associated with overweight and obesity in women but not in men. Only 49.1% of men and 58.3% of women had a correct perception of their body weight. Underestimation of body weight was more common than overestimation, especially among men, older people, and those with a low education level. The proportion taking action to lose weight was lower among men and older adults, and positively associated with higher education level, higher BMI, and self-perception as “fat” (OR = 3.78 in men, OR = 2.91 in women). Only 26.1% of overweight/obese individuals took action to lose weight. The top two weight-loss practices were reducing the amount of food intake and exercising.

Conclusion

Overweight and obesity were highly prevalent, and incorrect body weight perception was common, in the general adult population in Beijing. Weight-loss practice was poor in overweight and obese individuals. Actions at multiple levels are needed to slow or control this overweight and obesity epidemic.

20.

Background

To examine histomorphometrically the parapapillary region in human eyes.

Methodology/Principal Findings

The histomorphometric study included 65 human globes (axial length: 21–37 mm). On anterior-posterior histological sections, we measured the distances Bruch's membrane end (BME)–optic nerve margin (“Gamma zone”), BME–retinal pigment epithelium (RPE) (“Beta zone”), BME–beginning of non-occluded choriocapillaris, and BME–beginning of the photoreceptor layer. “Delta zone” was defined as the part of gamma zone in which blood vessels of at least 50 µm diameter were not present over a length of >300 µm. Beta zone (mean length: 0.35±0.52 mm) was significantly (P = 0.01) larger in the glaucoma group than in the non-glaucomatous group. It was not significantly (P = 0.28) associated with axial length. Beta zone was significantly (P = 0.004) larger than the region with occluded choriocapillaris. Gamma zone (mean length: 0.63±1.25 mm) was associated with axial length (P<0.001; r² = 0.73), with an increase starting at an axial length of 26.5 mm. It was not significantly (P = 0.24) associated with glaucomatous optic neuropathy. Delta zone (present only in eyes with an axial length of ≥27 mm) was associated with axial length (P = 0.001) and scleral flange length (P<0.001) but not with glaucoma (P = 0.73).

Conclusions/Significance

Parapapillary gamma zone (peripapillary sclera without overlying choroid, Bruch's membrane, and deep retinal layers) was related to axial globe elongation and was independent of glaucoma. Delta zone (no blood vessels >50 µm in diameter within gamma zone) was present only in highly axially elongated globes and was not related to glaucoma. Beta zone (Bruch's membrane without RPE) was correlated with glaucoma but not with globe elongation. Since the region with occluded choriocapillaris was smaller than beta zone, complete loss of the RPE may have occurred before complete choriocapillaris closure.
