Similar articles (20 results)
1.

Objectives

This study aimed to investigate longitudinal relations between leisure and social activities and mental health status, considering the presence or absence of other persons in the activity as an additional variable, among middle-aged adults in Japan. This study used nationally representative data in Japan with a five-year follow-up period.

Methods

This study focused on 16,642 middle-aged adults, age 50–59 at baseline, from a population-based, six-year panel survey conducted by the Japanese Ministry of Health, Labour and Welfare. To investigate the relations between two leisure activities (‘hobbies or cultural activities’ and ‘exercise or sports’) and four social activities (‘community events’, ‘support for children’, ‘support for elderly individuals’ and ‘other social activities’) at baseline and mental health status at follow-up, multiple logistic regression analysis was used. We also used multiple logistic regression analysis to investigate the association between ways of participating in these activities (‘by oneself’, ‘with others’, or ‘both’ (both ‘by oneself’ and ‘with others’)) at baseline and mental health status at follow-up.
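For readers unfamiliar with the modelling step described above, the following is a minimal, illustrative Python sketch of a multiple logistic regression of follow-up mental health on baseline activity participation. It is not the authors' code; the data are synthetic and the column names are hypothetical stand-ins for the survey items.

    # Illustrative sketch only (not the study's code): logistic regression of
    # follow-up mental-health status on baseline activity participation.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 2000
    df = pd.DataFrame({
        "hobbies_cultural": rng.integers(0, 2, n),   # 1 = participated at baseline (hypothetical coding)
        "exercise_sports":  rng.integers(0, 2, n),
        "community_events": rng.integers(0, 2, n),
        "age":              rng.integers(50, 60, n),
        "sex":              rng.integers(0, 2, n),
    })
    # Synthetic outcome: 1 = poor mental health at follow-up
    logit_p = -0.5 - 0.4 * df["hobbies_cultural"] - 0.3 * df["exercise_sports"]
    df["poor_mh_followup"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

    model = smf.logit(
        "poor_mh_followup ~ hobbies_cultural + exercise_sports + community_events"
        " + age + C(sex)",
        data=df,
    ).fit(disp=False)
    print(np.exp(model.params))        # odds ratios
    print(np.exp(model.conf_int()))    # 95% confidence intervals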

Results

Involvement in both leisure activity categories, but not in social activities, was significantly and positively related to mental health status in both men and women. Furthermore, in men, both ‘hobbies or cultural activities’ and ‘exercise or sports’ were significantly related to mental health status only when conducted ‘with others’. In women, the effect of ‘hobbies or cultural activities’ on mental health status did not differ by way of participating, whereas the result for ‘exercise or sports’ was the same as in men.

Conclusions

Leisure activities appear to benefit mental health status among this age group, whereas the specific social activities examined do not. Moreover, participation in leisure activities may be especially effective when others are present. These findings should be useful for preventing the deterioration of mental health status in middle-aged adults in Japan.

2.

Objective

To estimate the prevalence and identify correlates of smokeless tobacco consumption among married rural women with a history of at least one pregnancy in Madaripur, Bangladesh.

Materials and Methods

We conducted a cross-sectional survey using an interviewer administered, pre-tested, semi-structured questionnaire. All women living in the study area, aged 18 years and above with at least one pregnancy in their lifetime, who were on the electoral roll and agreed to participate were included in the study. Information on socio-demographic characteristics and smokeless tobacco consumption was collected. Smokeless tobacco consumption was categorized as ‘Current’, ‘Ever but not current’ and ‘Never’. Associations between smokeless tobacco consumption and the explanatory variables were estimated using simple and multiple binary logistic regression.

Results

8074 women participated (response rate 99.9%). The prevalence of ‘Current consumption’, ‘Ever consumption but not current’, and ‘Never consumption’ was 25%, 44% and 31%, respectively. The mean age at first use was 31.5 years. 87% of current consumers reported using either Shadapata or Hakimpuree Jarda. Current consumption was associated with age, level of education, religion, occupation, being an income earner, marital status, and age at first use of smokeless tobacco. After adjustment for demographic variables, current consumption was associated with being over 25 years of age, a lower level of education, being an income earner, being Muslim, and being divorced, separated or widowed.

Conclusion

The prevalence of smokeless tobacco consumption is high among rural women in Bangladesh and the age of onset is considerably older than that for smoking. Smokeless tobacco consumption is likely to be producing a considerable burden of non-communicable disease in Bangladesh. Smokeless tobacco control strategies should be implemented.

3.

Background

The positive association between education level and health outcomes can be partly explained by dietary behaviour. We investigated the associations between education and several indices of food intake and potential influencing factors, placing special emphasis on physical-activity patterns, using a representative sample of the German adult population.

Methods

The German National Health Interview and Examination Survey 1998 (GNHIES98) involved 7,124 participants aged between 18 and 79. Complete information on the exposure (education) and outcome (nutrition) variables was available for 6,767 persons. The associations between ‘education’ and indices of ‘sugar-rich food’, ‘fat-rich food’, ‘fruit-and-vegetable’ and ‘alcohol’ intake were analysed separately for men and women using multivariate logistic regression analysis. Odds ratios (OR) of education level on nutrition outcomes were calculated and adjusted for age, region (former East/West Germany), occupation, income and other influencing factors such as physical activity indicators.

Results

Men and women with only a primary education had a more frequent intake of sugar-rich and fat-rich foods and a less frequent intake of fruit and vegetables and alcohol than people with a tertiary education. ‘Physical work activity’ partly explained the association between education and sugar-rich food intake, and this attenuating influence was stronger among men than among women. No significant associations between education and energy-dense food intake were observed in the retirement-age group of persons aged 65+ or among persons with low energy expenditure.

Conclusions

In Germany, adults with a low level of education report that they consume energy-dense foods more frequently – and fruit and vegetables and alcohol less frequently – than adults with a high education level. High levels of physical work activity among adults with a low education level may partly explain why they consume more energy-dense foods.

4.

Objective

Treatment in the ultra-high risk stage for a psychotic episode is critical to the course of symptoms. Markers for the development of psychosis have been studied, to optimize the detection of people at risk of psychosis. One possible marker for the transition to psychosis is social cognition. To estimate effect sizes for social cognition based on a quantitative integration of the published evidence, we conducted a meta-analysis of social cognitive performance in people at ultra high risk (UHR).

Methods

A literature search (1970–July 2015) was performed in PubMed, PsycINFO, Medline, Embase, and ISI Web of Science, using the search terms ‘social cognition’, ‘theory of mind’, ‘emotion recognition’, ‘attributional style’, ‘social knowledge’, ‘social perception’, ‘empathy’, ‘at risk mental state’, ‘clinical high risk’, ‘psychosis prodrome’, and ‘ultra high risk’. The pooled effect size (Cohen’s d) and the effect sizes for each domain of social cognition were calculated. A random effects model with 95% confidence intervals was used.
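The pooled effect size described above is a standard random-effects calculation; the sketch below illustrates DerSimonian-Laird pooling in Python with hypothetical per-study effect sizes and variances (the 17 real study estimates are not listed in this summary).

    import numpy as np

    # Hypothetical per-study Cohen's d values and variances, for illustration only.
    d = np.array([0.40, 0.55, 0.62, 0.48, 0.51])
    v = np.array([0.020, 0.035, 0.050, 0.028, 0.040])

    w = 1.0 / v                                    # fixed-effect weights
    d_fixed = np.sum(w * d) / np.sum(w)
    Q = np.sum(w * (d - d_fixed) ** 2)             # heterogeneity statistic
    k = len(d)
    tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))

    w_star = 1.0 / (v + tau2)                      # random-effects weights
    d_pooled = np.sum(w_star * d) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    print(f"pooled d = {d_pooled:.2f}, "
          f"95% CI = ({d_pooled - 1.96 * se:.2f}, {d_pooled + 1.96 * se:.2f})")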

Results

Seventeen studies were included in the analysis. The overall significant effect was of medium magnitude (d = 0.52, 95% CI = 0.38–0.65). No moderator effects were found for age, gender and sample size. Sub-analyses demonstrated that individuals in the UHR phase show significant moderate deficits in affect recognition and affect discrimination in faces as well as in voices, and in verbal Theory of Mind (ToM). Because of an insufficient number of studies, we did not calculate effect sizes for attributional bias or social perception/knowledge. A majority of studies did not find a correlation between social cognition deficits and transition to psychosis, which may suggest that social cognition in general is not a useful marker for the development of psychosis. However, some studies suggest a possible predictive value of verbal ToM and of the recognition of specific emotions in faces for the transition to psychosis. More research is needed on these subjects.

Conclusion

The published literature indicates consistent general impairments in social cognition in people in the UHR phase, but only very specific impairments seem to predict transition to psychosis.

5.

Background and Aim

To gain insight into patient and doctor delay in testicular cancer (TC) and factors associated with delay.

Materials and Methods

Sixty of the 66 eligible men (median age 26 years, range 17–45), diagnosed with TC at the University Medical Center Groningen, completed a questionnaire on patients’ delay (the interval from symptom onset to first consultation with a general practitioner (GP)) and doctors’ delay (the interval between the GP visit and the specialist visit).

Results

Median patient-reported delay was 30 (range 1–365) days. Patient delay and TC tumor stage were associated (p = .01). Less educated men and men embarrassed about their scrotal change reported longer patient delay (r = -.25 and r = .79, respectively). Neither age, marital status, TC awareness, warning signals, nor perceived limitations were associated with patient delay. Median patient-reported time from GP to specialist (doctors’ delay) was 7 (range 0–240) days. Referral time and disease stage were associated (p = .04). Six patients never reported a scrotal change. Of the 54 patients reporting a testicular change, 29 (54%) were initially ‘misdiagnosed’, leading to a median doctors’ delay of 14 (range 1–240) days, which was longer (p < .001) than in the 25 (46%) patients whose GP suspected TC (median doctors’ delay 1 (range 0–7) days).

Conclusions

High variation in patients’ and doctors’ delay was found. The most important risk variables for longer patient delay were embarrassment and lower education; the most important GP-related risk variable was ‘misdiagnosis’. TC awareness programs for men and physicians are required to decrease delay in the diagnosis of TC and improve disease-free survival.

6.

Background

Failure to recognize acute deterioration in hospitalized patients may contribute to cardiopulmonary arrest, unscheduled intensive care unit admission and increased mortality.

Purpose

In this systematic review we aimed to determine whether continuous non-invasive respiratory monitoring improves early diagnosis of patient deterioration and reduces critical incidents on hospital wards.

Data Sources

Studies were retrieved from Medline, Embase, CINAHL, and the Cochrane library, searched from 1970 till October 25, 2014.

Study Selection

Electronic databases were searched using keywords and corresponding synonyms ‘ward’, ‘continuous’, ‘monitoring’ and ‘respiration’. Pediatric, fetal and animal studies were excluded.

Data Extraction

Since no validated tool is currently available for diagnostic or intervention studies with continuous monitoring, methodological quality was assessed with a modified tool based on modified STARD, CONSORT, and TREND statements.

Data Synthesis

Six intervention and five diagnostic studies were included, evaluating the use of eight different devices for continuous respiratory monitoring. Quantitative data synthesis was not possible because intervention, study design and outcomes differed considerably between studies. Outcome estimates for the intervention studies ranged from RR 0.14 (0.03, 0.64) for cardiopulmonary resuscitation to RR 1.00 (0.41, 2.35) for unplanned ICU admission after introduction of continuous respiratory monitoring.
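The risk ratios and 95% confidence intervals quoted above are standard 2×2-table quantities; the sketch below shows the calculation in Python with purely hypothetical counts (the underlying study counts are not reported in this summary).

    import math

    def risk_ratio(events_exposed, n_exposed, events_control, n_control):
        """Risk ratio with a Wald 95% CI computed on the log scale."""
        rr = (events_exposed / n_exposed) / (events_control / n_control)
        se_log = math.sqrt(1 / events_exposed - 1 / n_exposed
                           + 1 / events_control - 1 / n_control)
        lo = math.exp(math.log(rr) - 1.96 * se_log)
        hi = math.exp(math.log(rr) + 1.96 * se_log)
        return rr, lo, hi

    # Hypothetical counts: 2 arrests in 1000 monitored patients vs 14 in 1000 controls.
    print(risk_ratio(2, 1000, 14, 1000))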

Limitations

The methodological quality of most studies was moderate, e.g. ‘before-after’ designs, incomplete reporting of primary outcomes, and incomplete clinical implementation of the monitoring system.

Conclusions

Based on the findings of this systematic review, implementation of routine continuous non-invasive respiratory monitoring on general hospital wards cannot yet be advocated, as results are inconclusive and the methodological quality of the studies needs improvement. Future research in this area should focus on technology explicitly suitable for low-care settings and on tailored alarm and treatment algorithms.

7.

Objective

Recognising overweight and obesity is critical to prompting action, and consequently preventing and treating obesity. The present study examined the association between parental perceptions of child weight status and child’s diet.

Methods

Participants were members of the Gateshead Millennium Study. Parental perception of their child’s weight status was assessed using a questionnaire and compared against International Obesity Task Force cut-offs for childhood overweight and obesity when the children were aged 6–8 years. Diet was assessed at age 6–8 years using the FAST (Food Assessment in Schools Tool) food diary method. The association between parental perception and dietary patterns, as defined by Principal Components Analysis, was assessed using multivariate regression after adjustment for child’s gender, child’s weight status, maternal body mass index (BMI), maternal education and deprivation status.
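As an illustration of the analytic pipeline described above (PCA-derived dietary pattern scores, then adjusted regression on parental perception), here is a hedged Python sketch with synthetic data and hypothetical column names; it is not the study's code, and the covariate set is abbreviated.

    import numpy as np
    import pandas as pd
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 361
    foods = ["fruit", "vegetables", "crisps", "sweets", "fizzy_drinks"]   # hypothetical items
    diet = pd.DataFrame(rng.poisson(3, size=(n, len(foods))), columns=foods)
    diet["misperceived"] = rng.integers(0, 2, n)      # 1 = parent misperceived child's weight
    diet["maternal_bmi"] = rng.normal(26, 4, n)

    # First principal component of standardised food frequencies as a pattern score
    X = StandardScaler().fit_transform(diet[foods])
    diet["healthy_pattern"] = PCA(n_components=1).fit_transform(X)[:, 0]

    model = smf.ols("healthy_pattern ~ misperceived + maternal_bmi", data=diet).fit()
    print(model.params["misperceived"], model.conf_int().loc["misperceived"].values)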

Results

Of the 361 parents who provided complete data on confounders and on their perception of their child’s weight status, 63 (17%) parents perceived their child as being of ‘normal’ weight or ‘overweight’ when they were actually ‘overweight’ or ‘obese’, respectively. After adjustment for confounders, parents who misperceived their child’s weight had children with a lower ‘healthy’ dietary pattern score compared to children whose parents correctly perceived their weight (β = -0.88; 95% CI: -1.7, -0.1; P-value = 0.028). This association was found despite higher consumption of reduced sugar carbonated drinks amongst children whose parents incorrectly perceived their weight status compared to children whose parents perceived their weight correctly (52.4% vs. 33.6%; P-value = 0.005).

Conclusions

Children whose parents did not correctly perceive their weight status scored lower on the ‘healthy’ dietary pattern. Further research is required to define parents’ diets based on their perception status and to examine whether a child’s or parent’s diet mediates the association between parental perception and child weight.

8.

Background

China’s rapidly changing economic landscape has led to widening social inequalities. Occupational status, in terms of occupational type and prestige, may reflect these socio-structural shifts of social position and be more predictive of self-rated health status than income and education, which may only reflect more gradual acquisitions of social status over time. The goals of this study were to understand the role of occupational status in predicting self-rated health, which is well known to be associated with long-term mortality, and to compare occupational status with the other major socioeconomic indicators, income and education.

Methods

Data from the 2010 baseline survey of the China Family Panel Studies, which utilized multi-stage probability sampling with implicit stratification, were used. Logistic regression was used to examine the relationship of various socioeconomic indicators (i.e. occupational status, income, and education) with self-rated health as the primary outcome of interest. A series of models considered the associations of occupational category or occupational prestige with self-rated health.

Results

The final sample consisted of 14,367 employed adults aged 18–60, which was nationally representative of working adults in China. We found that occupation was not a major predictor of self-rated health in China when age, ethnicity, location, marital status, and physical and mental health status were controlled for, with the exception of women working in lower grade management and professional jobs (OR = 1.82, 95% CI: 1.03–3.22). In comparison, income, followed by education, showed a stronger association with self-rated health. The highest income group was the least likely to report poor health (men: OR = 0.30, 95% CI: 0.21–0.43; women: OR = 0.44, 95% CI: 0.26–0.73). People with a junior high school education had better self-rated health than those with primary education or below (men: OR = 0.62, 95% CI: 0.50–0.75; women: OR = 0.53, 95% CI: 0.42–0.68). Income, education and occupation were correlated with each other.

Conclusions

Within the context of rapid societal changes in China, income, with its implications for greater healthcare access and benefits, had the strongest association with self-rated health, followed by education; occupational status was not associated with self-rated health. Occupational categories and prestige measures should be better adapted to reflect China’s unique sociopolitical and historical context.

9.

Background

Most diabetic foot amputations are caused by ulcers on the skin of the foot, i.e. diabetic foot ulcers. Early identification of patients at high risk for diabetic foot ulcers is crucial. The ‘Simplified 60-Second Diabetic Foot Screening Tool’ has been designed to rapidly detect high-risk diabetic feet, allowing for timely identification and referral of patients needing treatment. This study aimed to determine the clinical performance and inter-rater reliability of the ‘Simplified 60-Second Diabetic Foot Screening Tool’ in order to evaluate its applicability for routine screening.

Methods and Findings

The tool was independently tested by 12 assessors on 18 Guyanese patients with diabetes. Inter-rater reliability was assessed by calculating Cronbach’s alpha for each of the assessment items. A minimum value of 0.60 was considered acceptable. Reliability scores of the screening tool assessment items were: ‘monofilament test’ 0.98; ‘active ulcer’ 0.97; ‘previous amputation’ 0.97; ‘previous ulcer’ 0.97; ‘fixed ankle’ 0.91; ‘deformity’ 0.87; ‘callus’ 0.87; ‘absent pulses’ 0.87; ‘fixed toe’ 0.80; ‘blisters’ 0.77; ‘ingrown nail’ 0.72; and ‘fissures’ 0.55. The item ‘stiffness in the toe or ankle’ was removed as it was observed in only 1.3% of patients. The item ‘fissures’ was also removed due to low inter-rater reliability. Clinical performance was assessed via a pilot study utilizing the screening tool on 1,266 patients in an acute care setting in Georgetown, Guyana. In total, 48% of patients either had existing diabetic foot ulcers or were found to be at high risk for developing ulcers.
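Cronbach's alpha, as used above for inter-rater reliability, can be computed directly from a raters-by-patients score matrix; the minimal Python sketch below uses simulated 0/1 ratings (12 assessors, 18 patients) purely for illustration.

    import numpy as np

    def cronbach_alpha(ratings):
        """ratings: 2-D array, rows = patients, columns = raters (one assessment item)."""
        ratings = np.asarray(ratings, dtype=float)
        k = ratings.shape[1]                             # number of raters
        item_vars = ratings.var(axis=0, ddof=1).sum()    # sum of per-rater variances
        total_var = ratings.sum(axis=1).var(ddof=1)      # variance of the summed scores
        return k / (k - 1) * (1 - item_vars / total_var)

    # Simulated 0/1 scores from 12 assessors on 18 patients for one item (not real data):
    rng = np.random.default_rng(0)
    demo = rng.integers(0, 2, size=(18, 12))
    print(cronbach_alpha(demo))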

Conclusions

Clinicians in low- and middle-income countries such as Guyana can use the Simplified 60-Second Diabetic Foot Screening Tool to facilitate early detection and appropriate treatment of diabetic foot ulcers. Implementation of this screening tool has the potential to decrease diabetes-related disability and mortality.

10.

Background

Over the last decade, academic interest in the prevalence and nature of herbal medicine use by pregnant women has increased significantly. Such data are usually collected by means of an administered questionnaire survey; however, a key methodological limitation of this approach is the need to clearly define the scope of ‘herbals’ to be investigated. The majority of published studies in this area neither define ‘herbals’ nor provide a detailed checklist naming specific ‘herbals’ and CAM modalities, which limits inter-study comparison, generalisability and the potential for meta-analyses. The aim of this study was to compare the self-reported use of herbs, herbal medicines and herbal products using two different approaches implemented in succession.

Methods

Cross-sectional questionnaire surveys of women attending for their mid-trimester scan or attending the postnatal unit following live birth at the Royal Aberdeen Maternity Hospital, North-East Scotland. The questionnaire utilised two approaches to collect data on ‘herbals’ use: a single closed yes/no question (“Have you used herbs, herbal medicines and herbal products in the last three months?”) and a request to tick which of a list of 40 ‘herbals’ they had used in the same time period.
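The abstract does not name the statistical test used to compare the two approaches; for the same respondents answering both a closed question and a checklist, McNemar's test for paired proportions is the conventional choice. The sketch below is illustrative only; the 2×2 cells follow from the reported figures (38 'yes' via the closed question, 350 via the checklist, 312 discordant 'no'/'yes', 889 respondents overall).

    import numpy as np
    from statsmodels.stats.contingency_tables import mcnemar

    # rows: closed question (yes, no); columns: checklist (yes, no)
    table = np.array([[38, 0],
                      [312, 539]])
    print(mcnemar(table, exact=False, correction=True))   # paired comparison of the two approaches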

Results

A total of 889 responses were obtained, of which 4.3% (38) answered ‘yes’ to herbal use via the closed question. However, using the checklist, 39% (350) of respondents reported the use of one or more specific ‘herbals’ (p<0.0001). The 312 respondents who reported ‘no’ to ‘herbals’ use via the closed question but ‘yes’ via the checklist consumed a total of 20 different ‘herbals’ (median 1, interquartile range 1–2, range 1–6).

Conclusions

This study demonstrates that the use of a single closed question asking about the use of ‘herbals’, as frequently reported in published studies, may not yield valid data, resulting in a gross underestimation of actual use.

11.

Objectives

Severe pre-eclampsia and eclampsia are among the major causes of maternal mortality globally. Reducing maternal morbidity and mortality demands optimizing quality of care. Criteria-based audits are a tool to define, assess and improve quality of care. The aim of this study was to determine the applicability of a criteria-based audit to assess quality of care delivered to women with severe hypertensive disorders in pregnancy, and to assess adherence to protocols and quality of care provided at a regional hospital in Accra, Ghana.

Methods

Checklists for management of severe preeclampsia, hypertensive emergency and eclampsia were developed in an audit cycle based on nine existing key clinical care protocols. Fifty cases were audited to assess quality of care, defined as adherence to protocols. Analysis was stratified for complicated cases, defined as (imminent) eclampsia, perinatal mortality and/or one or more WHO maternal near miss C-criteria.

Results

Mean adherence to the nine protocols ranged from 15% to 85%. Protocols for ‘plan for delivery’ and ‘magnesium sulphate administration’ were best adhered to (85%), followed by adherence to protocols for ‘eclampsia’ (64%), ‘severe pre-eclampsia at admission’ (60%), ‘severe pre-eclampsia ward follow-up’ (53%) and ‘hypertensive emergency’ (53%). Protocols for monitoring were least adhered to (15%). No difference in adherence was observed for complicated cases. Increased awareness, protocol-based training of staff, and clear task assignment were identified as contributors to better adherence.

Conclusion

A criteria-based audit is an effective tool to determine quality of care, identify gaps in the standard of care, and allow for monitoring and evaluation in a health facility, ultimately resulting in improved quality of care and reduced maternal morbidity and mortality. In our audit, good adherence was observed for plan for delivery and treatment with magnesium sulphate. Substandard adherence to a number of protocols was identified, pointing towards opportunities for targeted improvement strategies.

12.

Background

The HIV cascade of care (cascade) is a comprehensive tool which identifies attrition along the HIV care continuum. We executed analyses to explicate heterogeneity in the cascade across key strata, as well as identify predictors of attrition across stages of the cascade.

Methods

Using linked individual-level data for the population of HIV-positive individuals in BC, we considered the 2011 calendar year, including individuals diagnosed at least 6 months prior, and excluding individuals who died or were lost to follow-up before January 1st, 2011. We defined five stages in the cascade framework: HIV ‘diagnosed’, ‘linked’ to care, ‘retained’ in care, ‘on HAART’ and virologically ‘suppressed’. We stratified the cascade by sex, age, risk category, and regional health authority. Finally, multiple logistic regression models were built to predict attrition across each stage of the cascade, adjusting for stratification variables.
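As a simple illustration of how stage-to-stage attrition along such a cascade is quantified, the sketch below computes retention between consecutive stages. Apart from the 7621 diagnosed individuals reported in the Results, the counts are hypothetical (chosen only so that 85% of those on HAART are suppressed, as reported).

    # Illustrative sketch: retention and attrition between consecutive cascade stages.
    stages = {
        "diagnosed": 7621,     # reported in the abstract
        "linked":    7000,     # hypothetical
        "retained":  5800,     # hypothetical
        "on_haart":  5200,     # hypothetical
        "suppressed": 4420,    # hypothetical (85% of 5200)
    }
    names = list(stages)
    for prev, cur in zip(names, names[1:]):
        retained = stages[cur] / stages[prev]
        print(f"{prev} -> {cur}: {retained:.0%} retained ({1 - retained:.0%} attrition)")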

Results

We identified 7621 HIV diagnosed individuals during the study period; 80% were male and 5% were <30, 17% 30–39, 37% 40–49 and 40% were ≥50 years. Of these, 32% were MSM, 28% IDU, 8% MSM/IDU, 12% heterosexual, and 20% other. Overall, 85% of individuals ‘on HAART’ were ‘suppressed’; however, this proportion ranged from 60%–93% in our various stratifications. Most individuals, in all subgroups, were lost between the stages: ‘linked’ to ‘retained’ and ‘on HAART’ to ‘suppressed’. Subgroups with the highest attrition between these stages included females and individuals <30 years (regardless of transmission risk group). IDUs experienced the greatest attrition of all subgroups. Logistic regression results found extensive statistically significant heterogeneity in attrition across the cascade between subgroups and regional health authorities.

Conclusions

We found that extensive heterogeneity in attrition existed across subgroups and regional health authorities along the HIV cascade of care in BC, Canada. Our results provide critical information to optimize engagement in care and health service delivery.

13.

Background

Aedes aegypti, the principal vector of dengue fever, has been genetically engineered for use in a sterile insect control programme. To improve our understanding of the dispersal ecology of mosquitoes and to inform appropriate release strategies for ‘genetically sterile’ male Aedes aegypti, detailed knowledge of the dispersal ability of the released insects is needed.

Methodology/Principal Findings

The dispersal ability of released ‘genetically sterile’ male Aedes aegypti at a field site in Brazil has been estimated. Dispersal kernels embedded within a generalized linear model framework were used to analyse data collected from three large-scale mark-release-recapture studies. The methodology was also applied to previously published dispersal data to compare the dispersal ability of ‘genetically sterile’ male Aedes aegypti in contrasting environments. We parameterised dispersal kernels and estimated the mean distance travelled by insects in Brazil: 52.8 m (95% CI: 49.9 m, 56.8 m) and Malaysia: 58.0 m (95% CI: 51.1 m, 71.0 m).
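The study estimates mean distance travelled by fitting dispersal kernels within a GLM; as a much simpler illustration of the quantity involved, the sketch below computes a recapture-weighted mean distance from hypothetical trap distances and counts, with no correction for trap density or annulus area.

    import numpy as np

    # Hypothetical mark-release-recapture data: trap distances (m) and recapture counts.
    trap_distance_m = np.array([10, 25, 50, 75, 100, 150, 200])
    recaptures      = np.array([60, 45, 30, 18, 10,  4,   1])

    # Recapture-weighted mean distance travelled (a simplification of the kernel approach)
    mdt = np.sum(recaptures * trap_distance_m) / np.sum(recaptures)
    print(f"mean distance travelled ≈ {mdt:.1f} m")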

Conclusions/Significance

Our results provide specific, detailed estimates of the dispersal characteristics of released ‘genetically sterile’ male Aedes aegypti in the field. The comparative analysis indicates that despite differing environments and recapture rates, key features of the insects’ dispersal kernels are conserved across the two studies. The results can be used to inform both risk assessments and release programmes using ‘genetically sterile’ male Aedes aegypti.

14.

Background

HIV Pre-Exposure Prophylaxis (PrEP) has been found to be efficacious in preventing HIV acquisition among seronegative individuals in a variety of risk groups, including men who have sex with men and people who inject drugs. To date, however, it remains unclear how socio-cultural norms (e.g., attitudes towards HIV; social understandings regarding HIV risk practices) may influence the scalability of future PrEP interventions. The objective of this study is to assess how socio-cultural norms may influence the implementation and scalability of future HIV PrEP interventions in Vancouver, Canada.

Methods

We conducted 50 interviews with young men (ages 18–24) with a variety of HIV risk behavioural profiles (e.g., young men who inject drugs; MSM). Interviews focused on participants’ experiences and perceptions with various HIV interventions and policies, including PrEP.

Results

While awareness of PrEP was generally low, perceptions about the potential personal and public health gains associated with PrEP were interconnected with expressions of complex and sometimes conflicting social norms. Some accounts characterized PrEP as a convenient form of reliable protection against HIV, likening it to the female birth control pill. Other accounts cast PrEP as a means to facilitate ‘socially unacceptable’ behaviour (e.g., promiscuity). Stigmatizing rhetoric was used to position PrEP as a tool that could promote some groups’ proclivities to take ‘risks’.

Conclusion

Stigma regarding ‘risky’ behaviour and PrEP should not be underestimated as a serious implementation challenge. Pre-implementation strategies that concomitantly aim to improve knowledge about PrEP, while addressing associated social prejudices, may be key to effective implementation and scale-up.

15.

Objectives

To investigate the teaching of antimicrobial stewardship (AS) in undergraduate healthcare educational degree programmes in the United Kingdom (UK).

Participants and Methods

Cross-sectional survey of undergraduate programmes in human and veterinary medicine, dentistry, pharmacy and nursing in the UK. The main outcome measures included prevalence of AS teaching; stewardship principles taught; estimated hours apportioned; mode of content delivery and teaching strategies; evaluation methodologies; and frequency of multidisciplinary learning.

Results

80% (112/140) of programmes responded adequately. The majority of programmes teach AS principles (88/109, 80.7%). ‘Adopting necessary infection prevention and control precautions’ was the most frequently taught principle (83/88, 94.3%), followed by ‘timely collection of microbiological samples for microscopy, culture and sensitivity’ (73/88, 82.9%) and ‘minimisation of unnecessary antimicrobial prescribing’ (72/88, 81.8%). The ‘use of intravenous administration only to patients who are severely ill, or unable to tolerate oral treatment’ was reported in ~50% of courses. Only 32/88 (36.3%) programmes included all recommended principles.

Discussion

Antimicrobial stewardship principles are included in most undergraduate healthcare and veterinary degree programmes in the UK. However, future professionals responsible for using antimicrobials receive disparate education. Education could be strengthened by standardisation and by greater emphasis on the less frequently taught principles.

16.

Objectives

The home environment is thought to play a key role in early weight trajectories, although direct evidence is limited. There is general agreement that multiple factors exert small individual effects on weight-related outcomes, so use of composite measures could demonstrate stronger effects. This study therefore examined whether composite measures reflecting the ‘obesogenic’ home environment are associated with diet, physical activity, TV viewing, and BMI in preschool children.

Methods

Families from the Gemini cohort (n = 1096) completed a telephone interview (Home Environment Interview; HEI) when their children were 4 years old. Diet, physical activity, and TV viewing were reported at interview. Child height and weight measurements were taken by the parents (using standard scales and height charts) and reported at interview. Responses to the HEI were standardized and summed to create four composite scores representing the food (sum of 21 variables), activity (sum of 6 variables), media (sum of 5 variables), and overall (food composite/21 + activity composite/6 + media composite/5) home environments. These were categorized into ‘obesogenic risk’ tertiles.
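A hedged Python sketch of the composite construction described above follows (standardise each interview variable, sum within domain, weight domains by their number of variables, then split into tertiles); the data are synthetic and the variable names are placeholders for the real HEI items.

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(1)
    # Placeholder variables mirroring the HEI structure: 21 food, 6 activity, 5 media items.
    food_vars = [f"food_{i}" for i in range(21)]
    activity_vars = [f"act_{i}" for i in range(6)]
    media_vars = [f"media_{i}" for i in range(5)]
    hei = pd.DataFrame(rng.normal(size=(1096, 32)),
                       columns=food_vars + activity_vars + media_vars)

    z = (hei - hei.mean()) / hei.std(ddof=0)          # standardise each variable
    food = z[food_vars].sum(axis=1)                   # domain composites
    activity = z[activity_vars].sum(axis=1)
    media = z[media_vars].sum(axis=1)
    overall = food / 21 + activity / 6 + media / 5    # overall composite as described

    risk_tertile = pd.qcut(overall, 3, labels=["lower", "middle", "higher"])
    print(risk_tertile.value_counts())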

Results

Children in ‘higher-risk’ food environments consumed less fruit (OR; 95% CI = 0.39; 0.27–0.57) and vegetables (0.47; 0.34–0.64), and more energy-dense snacks (3.48; 2.16–5.62) and sweetened drinks (3.49; 2.10–5.81) than children in ‘lower-risk’ food environments. Children in ‘higher-risk’ activity environments were less physically active (0.43; 0.32–0.59) than children in ‘lower-risk’ activity environments. Children in ‘higher-risk’ media environments watched more TV (3.51; 2.48–4.96) than children in ‘lower-risk’ media environments. Neither the individual nor the overall composite measures were associated with BMI.

Conclusions

Composite measures of the obesogenic home environment were associated as expected with diet, physical activity, and TV viewing. Associations with BMI were not apparent at this age.

17.

Background

Smoking may worsen disease outcomes in patients with Crohn’s disease (CD); however, the effect of exposure to second-hand cigarette smoke during childhood is unclear. In South Africa, no such literature exists. The aim of this study was to investigate whether disease phenotype at the time of diagnosis of CD was associated with exposure to second-hand cigarette smoke during childhood and with active cigarette smoking habits.

Methods

A cross-sectional examination of all consecutive CD patients seen during the period September 2011–January 2013 at two large inflammatory bowel disease centers in the Western Cape, South Africa was performed. Data were collected via review of patient case notes, interviewer-administered questionnaire and clinical examination by the attending gastroenterologist. Disease phenotype (behavior and location) was evaluated at time of diagnosis, according to the Montreal Classification scheme. In addition, disease behavior was stratified as ‘complicated’ or ‘uncomplicated’, using predefined definitions. Passive cigarette smoke exposure was evaluated during 3 age intervals: 0–5, 6–10, and 11–18 years.

Results

One hundred and ninety-four CD patients were identified. Cigarette smoking during the 6 months prior to, or at the time of, diagnosis was significantly associated with ileo-colonic (L3) disease (RRR = 3.63; 95% CI, 1.32–9.98, p = 0.012) and ileal (L1) disease (RRR = 3.54; 95% CI, 1.06–11.83, p = 0.040) compared with colonic disease. In smokers, childhood passive cigarette smoke exposure during the 0–5 years age interval was significantly associated with ileo-colonic CD location (RRR = 21.3; 95% CI, 1.16–391.55, p = 0.040). No significant association between smoking habits and disease behavior at diagnosis, whether defined by the Montreal scheme or stratified as ‘complicated’ vs ‘uncomplicated’, was observed.
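The relative risk ratios (RRRs) reported above are the kind of estimate produced by a multinomial logistic regression with colonic (L2) disease as the base outcome; the abstract does not state the exact model or software, so the Python sketch below, with synthetic data and hypothetical covariates, is illustrative only.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    n = 194
    cd = pd.DataFrame({
        "location":          rng.integers(0, 3, n),   # 0 = colonic (base), 1 = ileal, 2 = ileo-colonic
        "current_smoker":    rng.integers(0, 2, n),
        "passive_smoke_0_5": rng.integers(0, 2, n),
        "age":               rng.integers(18, 70, n),
    })
    model = smf.mnlogit("location ~ current_smoker + passive_smoke_0_5 + age",
                        data=cd).fit(disp=False)
    print(np.exp(model.params))   # columns are the non-base outcomes; values are RRRs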

Conclusion

Smoking habits were associated with ileo-colonic (L3) and ileal (L1) disease at time of diagnosis in a South African cohort.

18.

Objectives

To quantify and compare the association of the World Health Organization’s Asian-specific trigger points for public health action [‘increased risk’: body mass index (BMI) ≥23 kg/m2; ‘high risk’: BMI ≥27.5 kg/m2] with self-reported cardiovascular-related conditions in Asian-Canadian sub-groups.

Methods

Six cycles of the Canadian Community Health Survey (2001–2009) were pooled to examine BMI and health in Asian sub-groups (South Asians, Chinese, Filipino, Southeast Asians, Arabs, West Asians, Japanese and Korean; N = 18 794 participants, ages 18–64 y). Multivariable logistic regression, adjusting for demographic, lifestyle characteristics and acculturation measures, was used to estimate the odds of cardiovascular-related health (high blood pressure, heart disease, diabetes, ‘at least one cardiometabolic condition’) outcomes across all eight Asian sub-groups.
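For reference, the WHO Asian-specific trigger points used in this analysis translate into a simple classification rule; the sketch below encodes those cut-offs (the 18.5 kg/m2 lower bound for ‘acceptable’ risk is the standard WHO threshold and is assumed here, as the abstract does not state it).

    # Classification by the WHO Asian-specific BMI trigger points described above.
    def asian_bmi_category(bmi_kg_m2: float) -> str:
        if bmi_kg_m2 >= 27.5:
            return "high risk"
        if bmi_kg_m2 >= 23.0:
            return "increased risk"
        if bmi_kg_m2 >= 18.5:          # assumed lower bound for 'acceptable' risk
            return "acceptable risk"
        return "underweight"

    for bmi in (17.0, 22.0, 24.5, 29.0):
        print(bmi, asian_bmi_category(bmi))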

Results

Compared to South Asians (OR = 1.00), Filipinos had higher odds of having ‘at least one cardiometabolic condition’ (OR = 1.29, 95% CI: 1.04–1.62), whereas Chinese (0.63, 0.474–0.9) and Arab-Canadians had lower odds (0.38, 0.28–0.51). In ethnic-specific analyses (with ‘acceptable’ risk weight as the referent), ‘increased’ and ‘high’ risk weight categories were the most highly associated with ‘at least one cardiometabolic condition’ in Chinese (‘increased’: 3.6, 2.34–5.63; ‘high’: 8.9, 3.6–22.01). Compared to normal weight South Asians, being in the ‘high’ risk weight category in all but the Southeast Asian, Arab, and Japanese ethnic groups was associated with approximately 3-times the likelihood of having ‘at least one cardiometabolic condition’.

Conclusion

Differences in the association between obesity and cardiometabolic health risks were seen among Asian sub-groups in Canada. The use of WHO’s lowered Asian-specific BMI cut-offs identified obesity-related risks in South Asian, Filipino and Chinese sub-groups that would have been masked by traditional BMI categories. These findings have implications for public health messaging, especially for ethnic groups at higher odds of obesity-related health risks.

19.

Background

A considerable number of bariatric patients report poor long-term weight loss after Roux-en-Y gastric bypass (RYGB) surgery. One possibility for an underlying cause is an impairment of cognitive control that impedes this patient group’s dietary efforts.

Objective

To investigate whether patients with either a poor or a good weight loss response, ~12 years after RYGB surgery, differ in their ability to inhibit prepotent responses when processing food cues during attentional operations, as a measure of cognitive control.

Methods

In terms of weight loss following RYGB surgery, 15 ‘poor responders’ and 15 ‘good responders’, matched for gender, age, education, preoperative body mass index, and years since surgery, were administered two tasks that measure sustained attention and response control: a go/no-go task and a Stroop interference task, both of which are associated with maladaptive eating behaviours.

Results

The poor responders (vs. good responders) needed significantly more time on the go/no-go task (603±134 vs. 519±44 msec, p = 0.03), but the number of errors did not differ between groups. On the Stroop interference task, poor responders named the ink colour of fewer words than good responders (68±16 vs. 85±10 words, p = 0.002).
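The abstract does not state which test produced these p-values; a two-sample comparison of the reported group means and standard deviations can be reproduced with Welch's t-test on summary statistics, as sketched below (n = 15 per group).

    from scipy.stats import ttest_ind_from_stats

    # go/no-go reaction time, poor vs good responders (msec)
    print(ttest_ind_from_stats(603, 134, 15, 519, 44, 15, equal_var=False))
    # Stroop task, words processed, poor vs good responders
    print(ttest_ind_from_stats(68, 16, 15, 85, 10, 15, equal_var=False))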

Conclusion

Patients lacking sustainable weight loss after RYGB surgery showed poorer inhibitory control than patients who successfully lost weight. In the authors’ view, these results suggest that cognitive behavioral therapies after RYGB surgery may represent a promising behavioral adjuvant to achieve sustainable weight loss in patients undergoing this procedure. Future studies should examine whether these control deficits in poor responders are food-specific or not.

20.

Background

A healthy diet has been associated with better muscle strength and physical performance in cross-sectional studies of older adults, but the effect of dietary patterns (DP) on subsequent decline, particularly in the very old (aged 85+), has not been determined.

Objective

We investigated the association between previously established DP and decline in muscle strength and physical performance in the very old.

Design

791 participants (61.8% women) from the Newcastle 85+ Study were followed up for change in hand grip strength (HGS) and the Timed Up-and-Go (TUG) test over 5 years (four waves 1.5 years apart). Mixed models were used to determine the effects of DP on muscle strength and physical performance in the entire cohort and separately by sex.
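A hedged sketch of the kind of mixed model described above (repeated HGS measurements across waves, random intercepts per participant, dietary pattern by wave interaction) is shown below; it is not the authors' model specification, and the data and variable names are synthetic.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(3)
    n_people, n_waves = 200, 4
    long_df = pd.DataFrame({
        "pid":  np.repeat(np.arange(n_people), n_waves),
        "wave": np.tile(np.arange(n_waves), n_people),
        "dp":   np.repeat(rng.integers(1, 4, n_people), n_waves),   # dietary pattern 1-3
    })
    # Synthetic grip strength: overall decline per wave plus person-level variation
    subj_intercept = np.repeat(rng.normal(0, 3, n_people), n_waves)
    long_df["hgs_kgf"] = (25 - 0.4 * long_df["wave"] + subj_intercept
                          + rng.normal(0, 1.5, len(long_df)))

    m = smf.mixedlm("hgs_kgf ~ wave * C(dp)", data=long_df,
                    groups=long_df["pid"]).fit()
    print(m.summary())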

Results

Previously, we established three DP that varied in intake of red meat, potato, gravy and butter and differed in key health and social factors. HGS declined linearly by 1.59 kgF in men and 1.08 kgF in women (both p<0.001), and TUG slowed by 0.13 log10-transformed seconds (log10-s) in men and 0.11 log10-s in women per wave after adjusting for important covariates (both p<0.001), and also showed a nonlinear change (p<0.001). Men in DP1 (‘High Red Meat’) had worse overall HGS (β = -1.70, p = 0.05), but men in DP3 (‘High Butter’) had a steeper decline (β = -0.63, p = 0.05) than men in DP2 (‘Low Meat’). Men in DP1 and women in DP3 also had overall slower TUG than those in DP2 (β = 0.08, p = 0.001 and β = 0.06, p = 0.01, respectively), but a similar rate of decline after adjusting for sociodemographic, lifestyle, health, and functioning factors. The results for HGS and TUG were not affected by participants’ cognitive status.

Conclusions

DP high in red meat, potato and gravy (DP1), or in butter (DP3), may adversely affect muscle strength and physical performance in later life, independently of important covariates and cognitive status.
