Similar Documents
20 similar documents found; search time 15 ms.
1.
2.

Background

It has been suggested that prenatal stress contributes to the risk of obesity later in life. In a population-based cohort study, we examined whether prenatal stress related to maternal bereavement during pregnancy was associated with the risk of overweight in offspring during school age.

Methodology/Principal Findings

We followed 65,212 children born in Denmark from 1970–1989 who underwent health examinations from 7 to 13 years of age in public or private schools in Copenhagen. We identified 459 children as exposed to prenatal stress, defined by being born to mothers who were bereaved by the death of a close family member from one year before pregnancy until the birth of the child. We compared the prevalence of overweight between the exposed and the unexposed. Body mass index (BMI) values and prevalence of overweight were higher in the exposed children than in the unexposed, but the differences did not reach statistical significance until 10 years of age and onwards. For example, the adjusted odds ratio (OR) for overweight was 1.68 (95% confidence interval [CI] 1.08–2.61) at 12 years of age and 1.63 (95% CI 1.00–2.61) at 13 years of age. The highest ORs were observed when the death occurred in the period from 6 to 0 months before pregnancy (OR 3.31, 95% CI 1.71–6.42 at age 12, and OR 2.31, 95% CI 1.08–4.97 at age 13).
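As a rough illustration of how an odds ratio and its confidence interval of the kind reported above are derived from a 2×2 table, here is a minimal Python sketch using Woolf's log-based method; the counts are invented for illustration and are not the study's data:

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with Woolf's log-based confidence interval for a
    2x2 table: a/b = exposed cases/non-cases, c/d = unexposed."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    return or_, exp(log(or_) - z * se), exp(log(or_) + z * se)

# Invented counts for illustration, NOT the cohort's data:
or_, lo, hi = odds_ratio_ci(40, 400, 60, 1000)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # OR 1.67 (95% CI 1.10-2.53)
```

Note this unadjusted calculation omits the covariate adjustment the study performed; adjusted ORs come from a logistic regression model instead.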

Conclusions/Significance

Our results suggest that severe pre-pregnancy stress is associated with an increased risk of overweight in the offspring in later childhood.

3.

Introduction

HIV prevalence among state prison inmates in the United States is more than five times higher than among nonincarcerated persons, but HIV transmission within U.S. prisons is sparsely documented. We investigated 88 HIV seroconversions reported from 1988–2005 among male Georgia prison inmates.

Methods

We analyzed medical and administrative data to describe seroconverters' HIV testing histories and performed a case-crossover analysis of their risks before and after HIV diagnosis. We sequenced the gag, env, and pol genes of seroconverters' HIV strains to identify genetically-related HIV transmission clusters and antiretroviral resistance. We combined risk, genetic, and administrative data to describe prison HIV transmission networks.

Results

Forty-one (47%) seroconverters were diagnosed with HIV from July 2003–June 2005, when voluntary annual testing was offered. Seroconverters were less likely to report sex (OR [odds ratio] = 0.02, 95% CI [confidence interval]: 0–0.10) and tattooing (OR = 0.03, 95% CI: <0.01–0.20) in prison after their HIV diagnosis than before. Of 67 seroconverters' specimens tested, 33 (49%) fell into one of 10 genetically-related clusters; of these, 25 (76%) reported sex in prison before their HIV diagnosis. The HIV strains of 8 (61%) of 13 antiretroviral-naïve and 21 (40%) of 52 antiretroviral-treated seroconverters were antiretroviral-resistant.

Discussion

Half of all HIV seroconversions were identified when routine voluntary testing was offered, and seroconverters reduced their risks following their diagnosis. Most genetically-related seroconverters reported sex in prison, suggesting HIV transmission through sexual networks. Resistance testing before initiating antiretroviral therapy is important for newly-diagnosed inmates.

4.

Background

Human exposure to silica dust is very common in both working and living environments. However, the potential long-term health effects have not been well established across different exposure situations.

Methods and Findings

We studied 74,040 workers who worked at 29 metal mines and pottery factories in China for 1 y or more between January 1, 1960, and December 31, 1974, with follow-up until December 31, 2003 (median follow-up of 33 y). We estimated the cumulative silica dust exposure (CDE) for each worker by linking work history to a job–exposure matrix. We calculated standardized mortality ratios for underlying causes of death based on Chinese national mortality rates. Hazard ratios (HRs) for selected causes of death associated with CDE were estimated using the Cox proportional hazards model. The population attributable risks were estimated based on the prevalence of workers with silica dust exposure and HRs. The number of deaths attributable to silica dust exposure among Chinese workers was then calculated using the population attributable risk and the national mortality rate. We observed 19,516 deaths during 2,306,428 person-years of follow-up. Mortality from all causes was higher among workers exposed to silica dust than among non-exposed workers (993 versus 551 per 100,000 person-years). We observed significant positive exposure–response relationships between CDE (measured in milligrams/cubic meter–years, i.e., the sum of silica dust concentrations multiplied by the years of silica exposure) and mortality from all causes (HR 1.026, 95% confidence interval 1.023–1.029), respiratory diseases (1.069, 1.064–1.074), respiratory tuberculosis (1.065, 1.059–1.071), and cardiovascular disease (1.031, 1.025–1.036). Significantly elevated standardized mortality ratios were observed for all causes (1.06, 95% confidence interval 1.01–1.11), ischemic heart disease (1.65, 1.35–1.99), and pneumoconiosis (11.01, 7.67–14.95) among workers exposed to respirable silica concentrations equal to or lower than 0.1 mg/m3. After adjustment for potential confounders, including smoking, silica dust exposure accounted for 15.2% of all deaths in this study. 
We estimated that 4.2% of deaths (231,104 cases) among Chinese workers were attributable to silica dust exposure. The limitations of this study included a lack of data on dietary patterns and leisure time physical activity, possible underestimation of silica dust exposure for individuals who worked at the mines/factories before 1950, and a small number of deaths (4.3%) where the cause of death was based on oral reports from relatives.
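The attributable-death calculation described above combines an exposure prevalence with a relative risk. A minimal sketch of Levin's population attributable fraction, with purely illustrative inputs (not the study's estimates):

```python
def levin_paf(p_exposed, rr):
    """Levin's population attributable fraction:
    PAF = p(RR - 1) / (1 + p(RR - 1)), where p is the prevalence of
    exposure and RR the rate (or hazard) ratio."""
    excess = p_exposed * (rr - 1)
    return excess / (1 + excess)

# Illustrative inputs only (not the study's estimates):
paf = levin_paf(0.3, 1.8)      # 30% exposed, RR = 1.8
print(round(paf, 3))           # 0.194
print(round(paf * 100_000))    # attributable deaths of 100,000 total: 19355
```

Multiplying the PAF by the total number of deaths gives the attributable count, as in the study's national extrapolation.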

Conclusions

Long-term silica dust exposure was associated with substantially increased mortality among Chinese workers. The increased risk was observed not only for deaths due to respiratory diseases and lung cancer, but also for deaths due to cardiovascular disease. Please see later in the article for the Editors' Summary.

5.

Background

Environmental factors during childhood are thought to play a role in the aetiology of Crohn's Disease (CD). However, the association between age at time of exposure and the subsequent development of CD in South Africa is unknown.

Methods

A case-control study of all consecutive CD patients seen at 2 large inflammatory bowel disease (IBD) referral centers in the Western Cape, South Africa between September 2011 and January 2013 was performed. Numerous environmental exposures during 3 age intervals (0–5, 6–10, and 11–18 years) were extracted using an investigator-administered questionnaire. An agreement analysis was performed to determine the reliability of the questionnaire data for all relevant variables.

Results

This study included 194 CD patients and 213 controls. On multiple logistic regression analysis, a number of childhood environmental exposures during the 3 age intervals were significantly associated with the risk of developing CD. During the age interval 6–10 years, never having consumed unpasteurized milk (OR = 5.84; 95% CI, 2.73–13.53) and never having a donkey, horse, sheep, or cow on the property (OR = 2.48; 95% CI, 1.09–5.98) significantly increased the risk of developing future CD. During the age interval 11–18 years, an independent risk association was identified for never having consumed unpasteurized milk (OR = 2.60; 95% CI, 1.17–6.10) and for second-hand cigarette smoke exposure (OR = 1.93; 95% CI, 1.13–3.35).

Conclusion

This study demonstrates that both limited microbial exposure and exposure to second-hand cigarette smoke during childhood are associated with future development of CD.

6.
7.
8.

Setting

Under India's Revised National Tuberculosis Control Programme (RNTCP), >15% of previously-treated patients in the reported 2006 patient cohort defaulted from anti-tuberculosis treatment.

Objective

To assess the timing, characteristics, and risk factors for default amongst re-treatment TB patients.

Methodology

For this case-control study, treatment records were abstracted in 90 randomly selected programme units for all 2006 defaulters from the RNTCP re-treatment regimen (cases), with one consecutively selected non-defaulter per case. Patients who interrupted anti-tuberculosis treatment for >2 months were classified as defaulters.

Results

1,141 defaulters and 1,189 non-defaulters were included. The median duration of treatment prior to default was 81 days (25%–75% interquartile range 44–117 days), and documented retrieval efforts after treatment interruption were inadequate. Defaulters were more likely to be male (adjusted odds ratio [aOR] 1.4, 95% confidence interval [CI] 1.2–1.7), to have previously defaulted from anti-tuberculosis treatment (aOR 1.3, 95% CI 1.1–1.6), to have previous treatment from non-RNTCP providers (aOR 1.3, 95% CI 1.0–1.6), or to have public health facility-based treatment observation (aOR 1.3, 95% CI 1.1–1.6).

Conclusions

Amongst the large number of re-treatment patients in India, default occurs early and often. Improved pre-treatment counseling and community-based treatment provision may reduce default rates. Efforts to retrieve treatment interrupters prior to default require strengthening.

9.

Background

Previous studies have demonstrated an association between preterm delivery and increased risk of special educational need (SEN). The aim of our study was to examine the risk of SEN across the full range of gestation.

Methods and Findings

We conducted a population-based, retrospective study by linking school census data on the 407,503 eligible school-aged children resident in 19 Scottish Local Authority areas (total population 3.8 million) to their routine birth data. SEN was recorded in 17,784 (4.9%) children; 1,565 (8.4%) of those born preterm and 16,219 (4.7%) of those born at term. The risk of SEN increased across the whole range of gestation from 40 to 24 wk: 37–39 wk adjusted odds ratio (OR) 1.16, 95% confidence interval (CI) 1.12–1.20; 33–36 wk adjusted OR 1.53, 95% CI 1.43–1.63; 28–32 wk adjusted OR 2.66, 95% CI 2.38–2.97; 24–27 wk adjusted OR 6.92, 95% CI 5.58–8.58. The association did not differ between elective and spontaneous deliveries. Overall, gestation at delivery accounted for 10% of the adjusted population attributable fraction of SEN. Because of their high frequency, early term deliveries (37–39 wk) accounted for 5.5% of cases of SEN compared with preterm deliveries (<37 wk), which accounted for only 3.6% of cases.

Conclusions

Gestation at delivery had a strong, dose-dependent relationship with SEN that was apparent across the whole range of gestation. Because early term delivery is more common than preterm delivery, the former accounts for a higher percentage of SEN cases. Our findings have important implications for clinical practice in relation to the timing of elective delivery. Please see later in the article for the Editors' Summary.

10.

Background

The Centers for Disease Control recommend screening for asymptomatic sexually transmitted infection (STI) among HIV-infected men when there is self-report of unprotected anal-receptive exposure. The study goals were: (1) to estimate the validity and usefulness for screening policies of self-reported unprotected anal-receptive exposure as a risk indicator for asymptomatic anorectal infection with Neisseria gonorrhoeae (GC) and/or Chlamydia trachomatis (CT); and (2) to estimate the number of infections that would be missed if anal diagnostic assays were not performed among patients who denied unprotected anorectal exposure in the preceding month.

Methods and Findings

Retrospective analysis was performed in HIV primary care and high-resolution anoscopy (HRA) clinics. HIV-infected adult men were screened for self-reported exposure during the previous month at all primary care and HRA appointments. Four sub-cohorts were defined based on microbiology methodology (GC culture and CT direct fluorescent antibody vs. GC/CT nucleic acid amplification test) and clinical setting (primary care vs. HRA). Screening question operating characteristics were estimated using contingency table methods and then pooled across sub-cohorts. Among 803 patients, the prevalence of anorectal GC/CT varied from 3.5% to 20.1% in the 4 sub-cohorts. The sensitivity of the screening question for self-reported exposure to predict anorectal STI was higher in the primary care clinic than in the HRA clinic (86–100% vs. 12–35%, respectively). The negative predictive value of the screening question to predict asymptomatic anorectal STI was ≥90% in all sub-cohorts. In sensitivity analyses, the probability of being an unidentified case among those denying exposure increased from 0.4% to 8.1% in the primary care setting, and from 0.9% to 18.8% in the HRA setting, as the prevalence varied from 1% to 20%.
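The central point above, that the proportion of missed cases among screen-negatives grows with prevalence, follows directly from Bayes' rule. A short sketch with illustrative sensitivity and specificity values (not the study's estimates):

```python
def npv(sens, spec, prev):
    """Negative predictive value, P(no infection | negative screen),
    from sensitivity, specificity and prevalence via Bayes' rule."""
    false_neg = prev * (1 - sens)    # infected but screen-negative
    true_neg = (1 - prev) * spec     # uninfected and screen-negative
    return true_neg / (true_neg + false_neg)

# Probability of an unidentified case among those screening negative
# is 1 - NPV; sens/spec here are illustrative, not the study's:
for prev in (0.01, 0.20):
    print(prev, round(1 - npv(0.90, 0.50, prev), 3))
```

With these inputs the miss probability rises from about 0.2% at 1% prevalence to nearly 5% at 20% prevalence, mirroring the pattern the sensitivity analyses describe.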

Conclusion

As STI prevalence increases, denial of unprotected anal-receptive exposure, if used as a criterion not to obtain microbiologic assays, leaves an increasingly unacceptable proportion of asymptomatic anorectal STIs unidentified.

11.

Background

Thousands of human deaths from rabies occur annually despite the availability of effective vaccines, both for post-exposure use in humans and for disease control in the animal reservoir. Our aim was to assess risk factors associated with exposure and to determine why human deaths from endemic canine rabies still occur.

Methods and Findings

Contact tracing was used to gather data on rabies exposures, post-exposure prophylaxis (PEP) delivered, and deaths in two rural districts in northwestern Tanzania from 2002 to 2006. Data on risk factors and the propensity to seek and complete courses of PEP were collected using questionnaires. Exposures varied from 6 to 141/100,000 per year. Risk of exposure to rabies was greater in an area with agro-pastoralist communities (and larger domestic dog populations) than in an area with pastoralist communities. Children were at greater risk than adults of being exposed to rabies and of developing clinical signs. PEP dramatically reduced the risk of developing rabies (odds ratio [OR] 17.33, 95% confidence interval [CI] 6.39–60.83), and when PEP was not delivered the risks were higher in the pastoralist than in the agro-pastoralist area (OR 6.12, 95% CI 2.60–14.58). Low socioeconomic class and distance to medical facilities lengthened delays before PEP delivery. Over 20% of rabies-exposed individuals did not seek medical treatment and were not documented in official records, and <65% received PEP. Animal bite injury records were an accurate indicator of rabies exposure incidence.

Conclusions

Insufficient knowledge about rabies dangers and prevention, particularly prompt PEP, but also wound management, was the main cause of rabies deaths. Education, particularly in poor and marginalized communities, but also for medical and veterinary workers, would prevent future deaths.

12.

Background

The beneficial effects of statins in rheumatoid arthritis (RA) have been suggested previously, but it is unclear whether statins may prevent its development. The aim of this retrospective cohort study was to explore whether persistent use of statins is associated with onset of RA.

Methods and Findings

The computerized medical databases of a large health organization in Israel were used to identify diagnosed RA cases among adults who began statin therapy between 1998 and 2007. Persistence with statins was assessed by calculating the mean proportion of follow-up days covered (PDC) with statins for every study participant. To assess the possible effects of healthy user bias, we also examined the risk of osteoarthritis (OA), a common degenerative joint disease that is unlikely to be affected by use of statins. A total of 211,627 and 193,770 individuals were eligible for the RA and OA cohort analyses, respectively. During the study follow-up period, there were 2,578 incident RA cases (3.07 per 1,000 person-years) and 17,878 incident OA cases (24.34 per 1,000 person-years). The crude incidence density rate of RA among nonpersistent patients (PDC level of <20%) was 51% higher (3.89 per 1,000 person-years) than among highly persistent patients, who were covered with statins for at least 80% of the follow-up period. After adjustment for potential confounders, highly persistent patients had a hazard ratio of 0.58 (95% confidence interval 0.52–0.65) for RA compared with nonpersistent patients. Larger differences were observed in younger patients and in patients initiating treatment with high-efficacy statins. In the OA cohort analysis, high persistence with statins was associated with only a modest decrement in risk (hazard ratio = 0.85; 0.81–0.88) compared to nonpersistent patients.
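The persistence measure used here, proportion of days covered (PDC), can be sketched as the union of supplied days over a follow-up window. This is a simplified illustration (no carry-forward of overlapping supply) with a hypothetical fill history, not the study's algorithm:

```python
from datetime import date

def pdc(fills, start, end):
    """Proportion of follow-up days covered (PDC).
    `fills` is a list of (fill_date, days_supplied); coverage is the
    union of supplied days (overlapping supply is not carried forward)."""
    total = (end - start).days + 1
    covered = set()
    for fill_date, days in fills:
        first = fill_date.toordinal()
        for day in range(first, first + days):
            if start.toordinal() <= day <= end.toordinal():
                covered.add(day)
    return len(covered) / total

# Hypothetical fill history: three fills over a one-year follow-up.
fills = [(date(2021, 1, 1), 90), (date(2021, 4, 1), 90), (date(2021, 10, 1), 30)]
print(round(pdc(fills, date(2021, 1, 1), date(2021, 12, 31)), 3))  # 210/365
```

Under the study's cut-points, this hypothetical patient (PDC ≈ 0.58) would fall between the nonpersistent (<20%) and highly persistent (≥80%) groups.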

Conclusions

The present study demonstrates an association between persistence with statin therapy and reduced risk of developing RA. The relationship between continuation of statin use and OA onset was weak and limited to patients with short-term follow-up. Please see later in the article for the Editors' Summary.

13.

Background

We aimed to compare reproductive outcomes following ectopic pregnancy (EP) versus livebirth, miscarriage, or termination in a first pregnancy.

Methods and Findings

A retrospective cohort study design was used. Scottish national data on all women whose first pregnancy occurred between 1981 and 2000 were linked to records of a subsequent pregnancy. The exposed cohort comprised women with an EP in their first pregnancy. There were three unexposed cohorts: women with livebirth, miscarriage, and termination of their first pregnancies. Any differences in rates of second pregnancy, livebirth, EP, miscarriage, or terminations and complications of a second ongoing pregnancy and delivery were assessed among the different exposure groups. A total of 2,969 women had an initial EP; 667,299 had a livebirth, 39,705 women miscarried, and 78,697 terminated their first pregnancies. Women with an initial EP had an increased chance of another pregnancy within 2 years (adjusted hazard ratio (AHR) 2.76 [95% CI 2.58–2.95]) or after 6 years (AHR 1.57 [95% CI 1.29–1.91]) compared to women with a livebirth. In comparison with women with an initial miscarriage, women who had an EP had a lower chance of a second pregnancy (AHR 0.53 [95% CI 0.50–0.56]). Compared to women with an initial termination, women with an EP had an increased chance of a second pregnancy (AHR 2.38 [95% CI 2.23–2.55]) within 2 years. Women with an initial EP suffered an increased risk of another EP compared to women with a livebirth (AHR 13.0 [95% CI 11.63–16.86]), miscarriage (AHR 6.07 [95% CI 4.83–7.62]), or termination (AHR 12.84 [95% CI 10.07–16.37]). Perinatal complications in a pregnancy following EP were not significantly higher than those in primigravidae or in women with a previous miscarriage or termination.

Conclusion

Women with an initial EP have a lower chance of conception than those who miscarry but an increased risk of a repeat EP in comparison with all three comparison groups. A major limitation of this study was the inability to separate women using contraception from those who were intending to conceive. Please see later in the article for the Editors' Summary.

14.

Background

Integrated disease prevention in low resource settings can increase coverage, equity and efficiency in controlling high burden infectious diseases. A public-private partnership with the Ministry of Health, CDC, Vestergaard Frandsen and CHF International implemented a one-week integrated multi-disease prevention campaign.

Method

Residents of Lurambi, Western Kenya were eligible for participation. The aim was to offer services to at least 80% of those aged 15–49. Thirty-one temporary sites in strategically dispersed locations offered: HIV counseling and testing, 60 male condoms, an insecticide-treated bednet, a household water filter for women or an individual filter for men, and, for those testing positive, a 3-month supply of cotrimoxazole and referral for follow-up care and treatment.

Findings

Over 7 days, 47,311 people attended the campaign, with a 96% uptake of the multi-disease preventive package. Of these, 99.7% were tested for HIV (87% in the target 15–49 age group); 80% had never previously tested. 4% of those tested were positive (5% of women and 3% of men); 61% of those testing positive were women, with a median CD4 count of 541 cells/µL (IQR 356, 754). 386 certified counselors attended to an average of 17 participants per day, consistent with recommended national figures for mass campaigns. Among women, HIV infection varied by age and was more likely with an ended marriage (e.g. widowed vs. never married, OR 3.91; 95% CI 2.87–5.34) and lack of occupation. In men, quantitatively stronger relationships were found (e.g. widowed vs. never married, OR 7.0; 95% CI 3.5–13.9). Always using condoms with a non-steady partner was more common among HIV-infected women participants who knew their status compared to those who did not (OR 5.4; 95% CI 2.3–12.8).

Conclusion

Through integrated campaigns it is feasible to efficiently cover large proportions of eligible adults in rural underserved communities with multiple disease-preventive services, simultaneously achieving various national and international health development goals.

15.

Background

The public health response to pandemic influenza is contingent on the pandemic strain's severity. In late April 2009, a potentially pandemic novel H1N1 influenza strain (nH1N1) was recognized. New York City (NYC) experienced an intensive initial outbreak that peaked in late May, providing the need and opportunity to rapidly quantify the severity of nH1N1.

Methods and Findings

Telephone surveys using rapid polling methods of approximately 1,000 households each were conducted May 20–27 and June 15–19, 2009. Respondents were asked about the occurrence of influenza-like illness (ILI, fever with either cough or sore throat) for each household member from May 1–27 (survey 1) or the preceding 30 days (survey 2). For the overlap period, prevalence data were combined by weighting the survey-specific contribution based on a Serfling model using data from the NYC syndromic surveillance system. Total and age-specific prevalence of ILI attributed to nH1N1 were estimated using two approaches to adjust for background ILI: discounting by ILI prevalence in less affected NYC boroughs and by ILI measured in syndromic surveillance data from 2004–2008. Deaths, hospitalizations and intensive care unit (ICU) admissions were determined from enhanced surveillance including nH1N1-specific testing. Combined ILI prevalence for the 50-day period was 15.8% (95% CI:13.2%–19.0%). The two methods of adjustment yielded point estimates of nH1N1-associated ILI of 7.8% and 12.2%. Overall case-fatality (CFR) estimates ranged from 0.054–0.086 per 1000 persons with nH1N1-associated ILI and were highest for persons ≥65 years (0.094–0.147 per 1000) and lowest for those 0–17 (0.008–0.012). Hospitalization rates ranged from 0.84–1.34 and ICU admission rates from 0.21–0.34 per 1000, with little variation in either by age-group.
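Rates of the kind reported above, events per 1,000 persons with nH1N1-associated ILI, divide surveillance counts by a survey-derived denominator. A one-line sketch with invented numbers (not the NYC figures):

```python
def rate_per_1000(events, ili_prevalence, population):
    """Events per 1,000 persons with ILI, using a survey-estimated
    denominator: events / (ILI prevalence * population) * 1000."""
    return events / (ili_prevalence * population) * 1000

# Invented inputs (NOT the NYC figures): 52 deaths, 10% ILI
# prevalence, population of 8,000,000.
print(round(rate_per_1000(52, 0.10, 8_000_000), 3))  # 0.065 per 1,000
```

The same arithmetic yields the hospitalization and ICU-admission rates once those counts replace the death count.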

Conclusions

ILI prevalence can be quickly estimated using rapid telephone surveys, using syndromic surveillance data to determine expected “background” ILI proportion. Risk of severe illness due to nH1N1 was similar to seasonal influenza, enabling NYC to emphasize preventing severe morbidity rather than employing aggressive community mitigation measures.

16.

Background

Falls are a major cause of morbidity and mortality in dementia, but there have been no prospective studies of risk factors for falling specific to this patient population, and no successful falls intervention/prevention trials. This prospective study aimed to identify modifiable risk factors for falling in older people with mild to moderate dementia.

Methods and Findings

179 participants aged over 65 years were recruited from outpatient clinics in the UK (38 Alzheimer's disease (AD), 32 Vascular dementia (VAD), 30 Dementia with Lewy bodies (DLB), 40 Parkinson's disease with dementia (PDD), 39 healthy controls). A multifactorial assessment of baseline risk factors was performed and fall diaries were completed prospectively for 12 months. Dementia participants experienced nearly 8 times more incident falls (9118/1000 person-years) than controls (1023/1000 person-years; incidence density ratio: 7.58, 3.11–18.5). In dementia, significant univariate predictors of sustaining at least one fall included diagnosis of Lewy body disorder (proportional hazard ratio (HR) adjusted for age and sex: 3.33, 2.11–5.26), and history of falls in the preceding 12 months (HR: 2.52, 1.52–4.17). In multivariate analyses, significant potentially modifiable predictors were symptomatic orthostatic hypotension (HR: 2.13, 1.19–3.80), autonomic symptom score (HR per point 0–36: 1.055, 1.012–1.099), and Cornell depression score (HR per point 0–40: 1.053, 1.01–1.099). Higher levels of physical activity were protective (HR per point 0–9: 0.827, 0.716–0.956).

Conclusions

The management of symptomatic orthostatic hypotension, autonomic symptoms and depression, and the encouragement of physical activity may provide the core elements for the most fruitful strategy to reduce falls in people with dementia. Randomised controlled trials to assess such a strategy are a priority.

17.

Background

The timeliness of HIV diagnosis and the initiation of antiretroviral treatment are major determinants of survival for HIV-infected people. Injection drug users (IDUs) are less likely than persons in other transmission categories to seek early HIV counseling, testing, and treatment. Our objective was to estimate the proportion of IDUs with a late HIV diagnosis (AIDS diagnosis within 12 months of HIV diagnosis) and determine the factors associated with disease progression after HIV diagnosis.

Methodology/Principal Findings

Using data from 33 states with confidential name-based HIV reporting, we determined the proportion of IDUs aged ≥13 years who received a late HIV diagnosis during 1996–2004. We used standardized Kaplan-Meier survival methods to determine differences in time of progression from HIV to AIDS and death, by race/ethnicity, sex, age group, CD4+ T-cell count, metropolitan residence, and diagnosis year. We compared the survival of IDUs with the survival of persons in other transmission categories. During 1996–2004, 42.2% (11,635) of 27,572 IDUs were diagnosed late. For IDUs, the risk for progression from HIV to AIDS 3 years after HIV diagnosis was greater for nonwhites, males and older persons. Three-year survival after HIV diagnosis was lower for IDU males (87.3%, 95% confidence interval (CI), 87.1–87.4) compared with males exposed through male-to-male sexual contact (91.6%, 95% CI, 91.6–91.7) and males exposed through high-risk heterosexual contact (HRHC) (91.9%, 95% CI, 91.8–91.9). Survival was also lower for IDU females (89.5%, 95% CI, 89.4–89.6) compared to HRHC females (93.3%, 95% CI, 93.3–93.4).
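The survival estimates reported here rest on the product-limit formula S(t) = Π (1 − d_i/n_i). A minimal, unstandardized pure-Python sketch of the Kaplan-Meier estimator on a toy cohort (the study additionally standardized its estimates, which this sketch omits):

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimate.
    `times`: follow-up times; `events`: 1 = death, 0 = censored.
    Returns [(time, S(t))] at each distinct death time."""
    data = sorted(zip(times, events))
    n_at_risk, s, curve, i = len(data), 1.0, [], 0
    while i < len(data):
        t, deaths, leaving = data[i][0], 0, 0
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            leaving += 1
            i += 1
        if deaths:
            s *= 1 - deaths / n_at_risk
            curve.append((t, s))
        n_at_risk -= leaving   # those observed at t leave the risk set
    return curve

# Toy cohort of five subjects (times in years):
print([(t, round(s, 2)) for t, s in
       kaplan_meier([1, 2, 2, 3, 4], [1, 1, 0, 0, 1])])
# [(1, 0.8), (2, 0.6), (4, 0.0)]
```

Censored subjects contribute to the risk set up to their censoring time but trigger no drop in S(t), which is how differential follow-up is handled.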

Conclusions/Significance

A substantial proportion of IDUs living with HIV received their HIV diagnosis late. To improve survival of IDUs, HIV prevention efforts must ensure early access to HIV testing and care, as well as encourage adherence to antiretroviral treatment to slow disease progression.

18.

Background

Exposure to energy restriction during childhood and adolescence is associated with a lower risk of developing colorectal cancer (CRC). Epigenetic dysregulation during this critical period of growth and development may be a mechanism to explain such observations. Within the Netherlands Cohort Study on diet and cancer, we investigated the association between early life energy restriction and risk of subsequent CRC characterized by the (promoter) CpG island methylation phenotype (CIMP).

Methodology/Principal Findings

Information on diet and risk factors was collected by baseline questionnaire (n = 120,856). Three indicators of exposure were assessed: place of residence during the Hunger Winter (1944–45) and World War II years (1940–44), and father's employment status during the Economic Depression (1932–40). Methylation-specific PCR (MSP) on DNA from paraffin-embedded tumor tissue was performed to determine CIMP status according to the Weisenberger markers. After 7.3 years of follow-up, 603 cases and 4631 sub-cohort members were available for analysis. Cox regression was used to calculate hazard ratios (HR) and 95% confidence intervals for CIMP+ (27.7%) and CIMP− (72.3%) tumors according to the three time periods of energy restriction, adjusted for age and gender. Individuals exposed to severe famine during the Hunger Winter had a decreased risk of developing a tumor characterized by CIMP compared to those not exposed (HR 0.65, 95% CI: 0.45–0.92). Further categorizing individuals by an index of ‘0–1’, ‘2–3’, or ‘4–7’ genes methylated in the promoter region suggested that exposure to the Hunger Winter was associated with the degree of promoter hypermethylation (‘0–1 genes methylated’ HR = 1.01, 95% CI: 0.74–1.37; ‘2–3 genes methylated’ HR = 0.83, 95% CI: 0.61–1.15; ‘4–7 genes methylated’ HR = 0.72, 95% CI: 0.49–1.04). No associations were observed with respect to the Economic Depression and WWII years.

Conclusions

This is the first study indicating that exposure to a severe, transient environmental condition during adolescence and young adulthood may result in persistent epigenetic changes that later influence CRC development.

19.

Background

Environmental risk factors playing a causative role in Crohn's Disease (CD) remain largely unknown. Recently, it has been suggested that refrigerated food could be involved in disease development. We thus conducted a pilot case-control study to explore the association of CD with exposure to domestic refrigeration in childhood.

Methodology/Principal Findings

Using a standard questionnaire, we interviewed 199 CD cases and 207 age-matched patients with irritable bowel syndrome (IBS) as controls. Cases and controls were followed by the same gastroenterologists in tertiary referral clinics in Tehran, Iran. The questionnaire focused on the date of first acquisition of a home refrigerator and freezer. Data were analysed by a multivariate logistic model. The mean current age of CD cases was 34 years, and the percentages of females in the case and control groups were 48.3% and 63.7%, respectively. Cases were exposed to the refrigerator earlier than controls (χ2 = 9.9, df = 3, P = 0.04), and refrigerator exposure at birth was found to be a risk factor for CD (OR = 2.08, 95% CI: 1.01–4.29; P = 0.05). Comparable results were obtained for exposure to a freezer at home. Finally, among the other recorded items reflecting hygiene and comfort at home, we also found a personal television, car, and washing machine to be associated with CD.

Conclusion

This study supports the opinion that CD is associated with exposure to domestic refrigeration, among other household factors, during childhood.

20.

Background

The relationship between passive smoking exposure (PSE) and breast cancer risk is of major interest.

Objective

To evaluate the relationship between PSE from partners and breast cancer risk, stratified by hormone-receptor (HR) status, in an urban Chinese female population.

Design

Hospital-based matched case control study.

Setting

Chinese urban breast cancer patients without a current or previous active smoking history, seen at China Medical University 1st Hospital, Liaoning Province, China between January 2009 and November 2009.

Patients

Each breast cancer patient was matched 1∶1 by gender and age (±2 years) with a healthy control from the same hospital.

Measurements

The authors used unconditional logistic regression analyses to estimate odds ratios for the association between PSE from partners and breast cancer risk.

Results

312 pairs were included in the study. Women who endured PSE had a significantly increased risk of breast cancer (adjusted OR: 1.46; 95% CI: 1.05–2.03; P = 0.027) compared with unexposed women. Women exposed to >5 cigarettes/day also had a significantly increased risk (adjusted OR: 1.99; 95% CI: 1.28–3.10; P = 0.002), as did women exposed to passive smoke for 16–25 years (adjusted OR: 1.87; 95% CI: 1.22–2.86; P = 0.004) and those exposed to >4 pack-years (adjusted OR: 1.71; 95% CI: 1.17–2.50; P = 0.004). Similar trends were significant for the estrogen receptor (ER)/progesterone receptor (PR) double-positive subgroup (adjusted OR: 1.71; 2.20; 1.99; 1.92, respectively), but not for the ER+/PR−, ER−/PR+, or ER−/PR− subgroups.

Limitations

Limitations include the hospital-based retrospective design, lack of information on entire-lifetime PSE, and low statistical power.

Conclusions

Our findings provide further evidence that PSE from partners contributes to an increased risk of breast cancer, especially ER/PR double-positive breast cancer, in Chinese urban women.
