Similar Articles (20 results)
1.

Objectives

To date, limited and inconsistent evidence exists regarding racial discrimination and risk of cardiovascular disease (CVD).

Methods

Cross-sectional observational study of 1005 US-born non-Hispanic black (n = 504) and white (n = 501) participants aged 35–64, randomly selected from community health centers in Boston, MA (2008–2010; 82.4% response rate), using three racial discrimination measures: explicit self-report; the implicit association test (IAT, a reaction-time test for self and group as target vs. perpetrator of discrimination); and structural (Jim Crow status of state of birth, i.e. legal racial discrimination prior to 1964).

Results

Black and white participants both had adverse cardiovascular and socioeconomic profiles, with black participants most highly exposed to racial discrimination. Positive crude associations among black participants occurred for Jim Crow birthplace and hypertension (odds ratio (OR) 1.92, 95% confidence interval (CI) 1.28, 2.89) and for explicit self-report and the Framingham 10-year CVD risk score (beta = 0.04; 95% CI 0.01, 0.07); among white participants, only negative crude associations existed (for the self-targeted IAT, with lower systolic blood pressure (SBP; beta = −4.86; 95% CI −9.08, −0.64) and a lower Framingham CVD score (beta = −0.36; 95% CI −0.63, −0.08)). All of these associations were attenuated, and all but the white IAT-Framingham risk score association were rendered null, in analyses that controlled for lifetime socioeconomic position and additional covariates. Controlling for racial discrimination, socioeconomic position, and other covariates did not attenuate the crude black excess risk for SBP and hypertension, and left unaffected the null excess risk for the Framingham CVD score.
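
As a worked illustration of the crude odds ratios above, the following sketch computes an OR and a Wald 95% CI from a hypothetical 2×2 table (Jim Crow birthplace by hypertension); the counts are invented, not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical 2x2 table: exposure = Jim Crow birthplace, outcome = hypertension.
# Counts are invented for illustration; the abstract reports OR 1.92 (95% CI 1.28, 2.89).
a, b = 60, 40    # exposed: cases, non-cases
c, d = 90, 115   # unexposed: cases, non-cases

or_hat = (a * d) / (b * c)                  # cross-product odds ratio
se_log_or = np.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR), Woolf method
z = stats.norm.ppf(0.975)                   # ~1.96 for a 95% CI
lo, hi = np.exp(np.log(or_hat) + np.array([-z, z]) * se_log_or)
print(f"OR {or_hat:.2f} (95% CI {lo:.2f}, {hi:.2f})")
```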

Conclusion

Despite worse exposures among the black participants, racial discrimination and socioeconomic position were not associated, in multivariable analyses, with risk of CVD. We interpret these results in relation to the constrained variability of exposures and outcomes and discuss implications for valid research on social determinants of health.

2.

Background

Implicit racial bias denotes socio-cognitive attitudes towards other-race groups that operate outside conscious awareness. In parallel, other-race faces are more difficult to differentiate than own-race faces – the “Other-Race Effect.” To examine the relationship between these two biases, we trained Caucasian subjects to better individuate other-race faces and measured implicit racial bias for those faces both before and after training.

Methodology/Principal Findings

Two groups of Caucasian subjects were exposed equally to the same African American faces in a training protocol run over 5 sessions. In the individuation condition, subjects learned to discriminate between African American faces. In the categorization condition, subjects learned to categorize faces as African American or not. In both conditions, we measured the Other-Race Effect (using old-new recognition) and implicit racial bias (using a novel implicit social measure, the “Affective Lexical Priming Score” (ALPS)) both before and after training. Subjects in the individuation condition, but not in the categorization condition, showed improved discrimination of African American faces with training. Concomitantly, subjects in the individuation condition, but not the categorization condition, showed a reduction in their ALPS. Critically, for the individuation condition only, the degree to which an individual subject's ALPS decreased was significantly correlated with the degree of improvement that subject showed in their ability to differentiate African American faces.
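
The key correlational test, relating each subject's ALPS reduction to their gain in other-race face discrimination, amounts to a Pearson correlation of two difference scores. A minimal sketch on simulated data (all values hypothetical):

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n = 20  # hypothetical number of individuation-condition subjects

# Simulated pre/post difference scores; real values would come from the study data.
recognition_gain = rng.normal(0.5, 0.3, n)                     # improvement in old-new recognition
alps_change = -0.6 * recognition_gain + rng.normal(0, 0.2, n)  # change in ALPS (negative = reduction)

r, p = pearsonr(recognition_gain, alps_change)
print(f"r = {r:.2f}, p = {p:.3f}")
```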

Conclusions/Significance

Our results establish a causal link between the Other-Race Effect and implicit racial bias. We demonstrate that training that ameliorates the perceptual Other-Race Effect also reduces socio-cognitive implicit racial bias. These findings suggest that implicit racial biases are multifaceted, and include malleable perceptual skills that can be modified with relatively little training.

3.
W Wang, W Fu, J Wu, XC Ma, XL Sun, Y Huang, K Hashimoto, CG Gao. PLoS One 2012, 7(7): e41665

Context

On May 12th, 2008, a devastating earthquake measuring 8.0 on the Richter scale struck Wenchuan county and surrounding areas in China. The prevalence of mental illness among children and adolescents in a rural town far from the earthquake epicenter is unknown.

Objective

To assess the prevalence of posttraumatic stress disorder (PTSD) and depression among junior middle school students in a rural town in Ningqiang county, 327 km from the earthquake epicenter.

Design, Setting, and Participants

A population-based mental health survey was conducted in March 2009.

Main Outcome Measure

A self-designed General Condition Survey Scale, the Children's Revised Impact of Event Scale (CRIES-13), and the Depression Self-rating Scale for Children (DSRSC) were administered to 1,841 junior middle school students in Ningqiang county, ten months after the Wenchuan earthquake.

Results

The prevalence of high risk for PTSD was 28.4%: 32.7% among females and 23.8% among males (female vs. male, p<0.001), and 38.6% in the severe exposure group versus 24.3% in the mild exposure group (severe vs. mild exposure, p<0.001). For depressive symptoms, the overall prevalence was 19.5%: 24.0% among females and 14.7% among males (p<0.001), and 24.5% in the severe exposure group versus 17.5% in the mild exposure group (p<0.001). In multivariate analysis, factors such as having felt despair or danger during the earthquake and having one's own house destroyed or damaged were significantly associated with PTSD symptoms. Female gender and delayed evacuation in females, and earthquake-related experiences in males, were significantly associated with depression.
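
The female-versus-male and severe-versus-mild prevalence comparisons are two-proportion tests. A sketch using statsmodels with hypothetical counts, since the abstract gives percentages but not the per-group denominators:

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical counts consistent with the reported rates (32.7% of females and
# 23.8% of males at high risk for PTSD); denominators are invented.
high_risk = [294, 214]   # high-risk cases: females, males
group_n = [900, 900]     # assumed group sizes

stat, p = proportions_ztest(high_risk, group_n)
print(f"z = {stat:.2f}, p = {p:.4f}")
```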

Conclusion

Traumatic events experienced during the earthquake were significantly associated with symptoms of PTSD and depression in children and adolescents, ten months after the Wenchuan earthquake. These data highlight a need for mental health services for children and adolescents in rural areas, far from earthquake epicenters.

4.

Background

Reducing substance use and unprotected sex by HIV-positive persons improves individual health status while decreasing the risk of HIV transmission. Despite recommendations that health care providers screen and counsel their HIV-positive patients for ongoing behavioral risks, it is unknown how to best provide “prevention with positives” in clinical settings. Positive Choice, an interactive, patient-tailored computer program, was developed in the United States to improve clinic-based assessment and counseling for risky behaviors.

Methodology and Findings

We conducted a parallel-groups randomized controlled trial (December 2003–September 2006) at 5 San Francisco area outpatient HIV clinics. Eligible patients (HIV-positive English-speaking adults) completed an in-depth computerized risk assessment. Participants reporting substance use or sexual risks (n = 476) were randomized in stratified blocks. The intervention group received tailored risk-reduction counseling from a “Video Doctor” via laptop computer and a printed Educational Worksheet; providers received a Cueing Sheet on reported risks. Compared with control, fewer intervention participants reported continuing illicit drug use (RR 0.81, 95% CI: 0.689, 0.957, p = 0.014 at 3 months; and RR 0.65, 95% CI: 0.540, 0.785, p<0.001 at 6 months) and unprotected sex (RR 0.88, 95% CI: 0.773, 0.993, p = 0.039 at 3 months; and RR 0.80, 95% CI: 0.686, 0.941, p = 0.007 at 6 months). Intervention participants reported greater mean reductions in days of ongoing illicit drug use (−4.0 days vs. −1.3 days, p = 0.346, at 3 months; and −4.7 days vs. −0.7 days, p = 0.130, at 6 months) than did controls, and greater reductions in the number of casual sex partners (−2.3 vs. −1.4, p = 0.461, at 3 months; and −2.7 vs. −0.6, p = 0.042, at 6 months).
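
The risk ratios above can be reproduced from arm-level counts with the standard log-RR variance formula. A sketch with invented counts chosen to echo the 6-month drug-use RR of 0.65 (the abstract reports RRs, not raw counts):

```python
import numpy as np
from scipy import stats

# Invented arm-level counts: participants still reporting illicit drug use at 6 months.
events_tx, n_tx = 80, 238    # intervention arm
events_ct, n_ct = 123, 238   # control arm

rr = (events_tx / n_tx) / (events_ct / n_ct)
se_log_rr = np.sqrt(1/events_tx - 1/n_tx + 1/events_ct - 1/n_ct)  # delta-method SE of log(RR)
z = stats.norm.ppf(0.975)
lo, hi = np.exp(np.log(rr) + np.array([-z, z]) * se_log_rr)
print(f"RR {rr:.2f} (95% CI {lo:.3f}, {hi:.3f})")
```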

Conclusions

The Positive Choice intervention achieved significant cessation of illicit drug use and unprotected sex at the group level, and modest individual-level reductions in days of ongoing drug use and number of casual sex partners, compared with the control group. Positive Choice, including Video Doctor counseling, is an efficacious and appropriate adjunct to risk-reduction efforts in outpatient settings, and holds promise as a public health HIV intervention.

Trial Registration

Clinicaltrials.gov NCT00447707

5.
Wan X, Shin SS, Wang Q, Raymond HF, Liu H, Ding D, Yang G, Novotny TE. PLoS One 2011, 6(8): e23028

Background

Rural-to-urban migrant women may be vulnerable to smoking initiation as they are newly exposed to risk factors in the urban environment. We sought to identify correlates of smoking among rural-to-urban migrant women in China.

Methods/Principal Findings

A cross-sectional survey of rural-to-urban migrant women working in restaurants and hotels (RHWs) and those working as commercial sex workers (CSWs) was conducted in ten provincial capital cities in China. Multiple logistic regression was conducted to identify correlates of smoking. We enrolled 2229 rural-to-urban migrant women (1697 RHWs aged 18–24 years and 532 CSWs aged 18–30 years). Of these, 18.4% of RHWs and 58.3% of CSWs reported having ever tried smoking, and 3.2% of RHWs and 41.9% of CSWs reported current smoking. Participants who first tried smoking after moving to the city were more likely to be current smokers than participants who first tried smoking before moving (25.3% vs. 13.8% among RHWs, p = 0.02; 83.6% vs. 58.6% among CSWs, p<0.01). Adjusting for other factors, having “tried female cigarette brands” had the strongest association with current smoking (OR 5.69, 95% CI 3.44 to 9.41) among participants who had ever tried smoking.
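
The adjusted odds ratios come from multiple logistic regression. A minimal statsmodels sketch on simulated data, with illustrative variable names that are not the study's coding:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1200  # hypothetical analysis sample of women who ever tried smoking

# Simulated dataset; variable names are illustrative, not the study's.
df = pd.DataFrame({
    "current_smoker": rng.integers(0, 2, n),
    "tried_female_brand": rng.integers(0, 2, n),
    "csw": rng.integers(0, 2, n),           # commercial sex worker vs. RHW
    "age": rng.integers(18, 31, n),
})

fit = smf.logit("current_smoker ~ tried_female_brand + csw + age", data=df).fit(disp=0)
or_table = np.exp(pd.concat([fit.params, fit.conf_int()], axis=1))
or_table.columns = ["OR", "2.5%", "97.5%"]
print(or_table)  # adjusted odds ratios with 95% CIs
```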

Conclusions/Significance

Exposure to female cigarette brands may increase susceptibility to smoking among rural-to-urban migrant women. Smoke-free policies and increased taxes may be effective in preventing smoking initiation among rural-to-urban migrant women.

6.

Background

Poverty and blindness are believed to be intimately linked, but empirical data supporting this purported relationship are sparse. The objective of this study is to assess whether there is a reduction in poverty after cataract surgery among visually impaired cases.

Methodology/Principal Findings

A multi-centre intervention study was conducted in three countries (Kenya, Philippines, Bangladesh). Poverty data (household per capita expenditure (PCE), asset ownership, and self-rated wealth) were collected from cases aged ≥50 years who were visually impaired due to cataract (visual acuity <6/24 in the better eye) and from age-sex matched controls with normal vision. Cases were offered free/subsidised cataract surgery. Approximately one year later, participants were re-interviewed about poverty. 466 cases and 436 controls were examined at both baseline and follow-up (follow-up rate: 78% for cases, 81% for controls), of whom 263 cases had undergone cataract surgery (“operated cases”). At baseline, operated cases were poorer than controls in terms of PCE (Kenya: $22 vs. $35, p = 0.02; Bangladesh: $16 vs. $24, p = 0.004; Philippines: $24 vs. $32, p = 0.0007), assets and self-rated wealth. By follow-up, PCE had increased significantly among operated cases in each of the three settings to the level of controls (Kenya: $30 vs. $36, p = 0.49; Bangladesh: $23 vs. $23, p = 0.20; Philippines: $45 vs. $36, p = 0.68). There were smaller increases in self-rated wealth and no changes in assets. Changes in PCE were apparent in different socio-demographic and ocular groups. The largest PCE increases were apparent among the cases that were poorest at baseline.

Conclusions/Significance

This study showed that cataract surgery can contribute to poverty alleviation, particularly among the most vulnerable members of society. This study highlights the need for increased provision of cataract surgery to poor people and shows that a focus on blindness may help to alleviate poverty and achieve the Millennium Development Goals.

7.

Objective

Willingness to participate in obesity prevention programs is low, and the underlying reasons are poorly understood. We evaluated reasons for (non)participation in a novel telephone-based obesity prevention program for overweight children and their families.

Method

Overweight children and adolescents (BMI>90th percentile) aged 3.5–17.4 years were screened via the CrescNet database, a representative cohort of German children, and program participation (repeated computer-aided telephone counseling) was offered by their local pediatrician. Identical questionnaires collecting baseline data on anthropometrics, lifestyle, eating habits, and sociodemographic and psychosocial parameters were analyzed from 433 families (241 participants, 192 nonparticipants). Univariate analyses and binary logistic regression were used to identify factors associated with nonparticipation.

Results

The number of overweight children (BMI>90th percentile) was higher among nonparticipants than participants (62% vs. 41.1%, p<0.001), whereas the number of obese children (BMI>97th percentile) was higher among participants (58.9% vs. 38%, p<0.001). Participating girls were younger than boys (8.8 vs. 10.4 years, p<0.001). Regular breakfasts (87.3% vs. 72.2%, p = 0.008) and 5 regular daily meals (40% vs. 24.7%, p = 0.003) were reported more often by participants than nonparticipants. Nonparticipants had a lower household net income (p<0.001) but higher subjective physical wellbeing than participants (p = 0.018), and believed that changes in lifestyle can be made easily (p = 0.05).

Conclusion

An important reason for nonparticipation was parents' lack of awareness of their child's weight status. Nonparticipants, who were often low-income families, believed that they already maintain a healthy lifestyle and had higher subjective wellbeing. We hypothesize that even a low-threshold intervention program does not reach the families who really need it.

8.

Objective

To examine the associations between pet keeping in early childhood and asthma and allergies in children aged 6–10 years.

Design

Pooled analysis of individual participant data of 11 prospective European birth cohorts that recruited a total of over 22,000 children in the 1990s.

Exposure definition

Ownership of only cats, dogs, birds, rodents, or cats/dogs combined during the first 2 years of life.

Outcome definition

Current asthma (primary outcome), allergic asthma, allergic rhinitis and allergic sensitization during 6–10 years of age.

Data synthesis

Three-step approach: (i) common definition of outcome and exposure variables across cohorts; (ii) calculation of adjusted effect estimates for each cohort; (iii) pooling of effect estimates using random-effects meta-analysis models.
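
Step (iii), pooling adjusted effect estimates under a random-effects model, is commonly done with the DerSimonian-Laird estimator, which also yields the I2 heterogeneity statistic quoted in the Results; whether this exact estimator was used here is an assumption. A sketch with hypothetical per-cohort odds ratios:

```python
import numpy as np

# Hypothetical per-cohort adjusted ORs and 95% CIs standing in for the cohorts
# in one exposure-outcome analysis; the paper's per-cohort estimates differ.
ors = np.array([0.9, 1.1, 1.3, 0.8, 1.0, 1.2, 0.95, 1.05, 1.15, 0.85])
ci_lo, ci_hi = ors * 0.6, ors / 0.6

y = np.log(ors)                                   # per-cohort log odds ratios
se = (np.log(ci_hi) - np.log(ci_lo)) / (2 * 1.96) # back out SEs from the CIs
w = 1 / se**2                                     # inverse-variance weights

# DerSimonian-Laird between-cohort variance tau^2 and heterogeneity I^2
q = np.sum(w * (y - np.sum(w * y) / np.sum(w))**2)
df, c = len(y) - 1, np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (q - df) / c)
i2 = max(0.0, (q - df) / q) * 100

w_re = 1 / (se**2 + tau2)                         # random-effects weights
pooled = np.sum(w_re * y) / np.sum(w_re)
half = 1.96 * np.sqrt(1 / np.sum(w_re))
print(f"pooled OR {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - half):.2f}, {np.exp(pooled + half):.2f}), I2 = {i2:.0f}%")
```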

Results

We found no association between furry or feathered pet keeping early in life and asthma at school age. For example, the odds ratio for asthma comparing cat ownership with “no pets” (10 studies, 11489 participants) was 1.00 (95% confidence interval 0.78 to 1.28) (I2 = 9%; p = 0.36). The odds ratio for asthma comparing dog ownership with “no pets” (9 studies, 11433 participants) was 0.77 (0.58 to 1.03) (I2 = 0%, p = 0.89). Owning both cat(s) and dog(s) compared to “no pets” resulted in an odds ratio of 1.04 (0.59 to 1.84) (I2 = 33%, p = 0.18). Similarly, we found no associations between any type of pet ownership early in life and either allergic asthma or allergic rhinitis. However, we found some evidence for an association between ownership of furry pets during the first 2 years of life and a reduced likelihood of becoming sensitized to aero-allergens.

Conclusions

Pet ownership in early life did not appear to either increase or reduce the risk of asthma or allergic rhinitis symptoms in children aged 6–10. Advice from health care practitioners to avoid or to specifically acquire pets for primary prevention of asthma or allergic rhinitis in children should not be given.

9.

Background

Preclinical research implicates dopaminergic and noradrenergic mechanisms in mediating the reinforcing effects of drugs of abuse, including cocaine. The objective of this study was to evaluate the impact of treatment with the noradrenergic α1 receptor antagonist doxazosin on the positive subjective effects of cocaine.

Methods

Thirteen non-treatment-seeking, cocaine-dependent volunteers completed this single-site, randomized, placebo-controlled, within-subjects study. In one study phase volunteers received placebo and in the other they received doxazosin, with the order counterbalanced across participants. Study medication was masked by over-encapsulating doxazosin tablets, and matched lactose placebo served as the control. Study medication was initiated at 1 mg doxazosin (or an equivalent number of placebo capsules) PO/day and increased by 1 mg every three days. After receiving 4 mg doxazosin or the equivalent number of placebo capsules, participants received masked doses of 20 and 40 mg cocaine IV, in that order, with placebo saline randomly interspersed to maintain the blind.

Results

Doxazosin treatment was well tolerated and doxazosin alone produced minimal changes in heart rate and blood pressure. During treatment with placebo, cocaine produced dose-dependent increases in subjective effect ratings of “high”, “stimulated”, “like cocaine”, “desire cocaine”, “any drug effect”, and “likely to use cocaine if had access” (p<0.001). Doxazosin treatment significantly attenuated the effects of 20 mg cocaine on ratings of “stimulated”, “like cocaine”, and “likely to use cocaine if had access” (p<0.05). There were trends for doxazosin to reduce ratings of “stimulated”, “desire cocaine”, and “likely to use cocaine if had access” (p<0.10).

Conclusions

Medications that block noradrenergic α1 receptors, such as doxazosin, may be useful as treatments for cocaine dependence, and should be evaluated further.

Trial Registration

Clinicaltrials.gov NCT01062945

10.

Background

While evidence of the contribution of racial discrimination to ethnic health disparities has increased significantly, there has been less research examining relationships between ascribed racial/ethnic categories and health. It has been hypothesized that in racially-stratified societies being assigned as belonging to the dominant racial/ethnic group may be associated with health advantage. This study aimed to investigate associations between socially-assigned ethnicity, self-identified ethnicity, and health, and to consider the role of self-reported experience of racial discrimination in any relationships between socially-assigned ethnicity and health.

Methods

The study used data from the 2006/07 New Zealand Health Survey (n = 12,488), a nationally representative cross-sectional survey of adults aged 15 years and over. Racial discrimination was measured as reported individual-level experiences across five domains. Health outcome measures examined were self-reported general health and psychological distress.

Results

The study identified varying levels of agreement between participants' self-identified and socially-assigned ethnicities. Individuals who reported both self-identifying and being socially-assigned as always belonging to the dominant European grouping tended to have more socioeconomic advantage and to experience less racial discrimination. This group also had the highest odds of reporting optimal self-rated health and lower mean levels of psychological distress. These differences were attenuated in models adjusting for socioeconomic measures and individual-level racial discrimination.
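
Agreement between self-identified and socially-assigned ethnicity can be quantified with Cohen's kappa; the abstract does not name the agreement statistic used, so this choice is illustrative. A sketch with hypothetical paired labels:

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical paired classifications for six respondents; the survey compared
# self-identified with socially-assigned ethnicity for n = 12,488 adults.
self_identified = ["European", "Maori", "European", "Pacific", "Maori", "European"]
socially_assigned = ["European", "European", "European", "Pacific", "Maori", "Maori"]

kappa = cohen_kappa_score(self_identified, socially_assigned)
print(f"Cohen's kappa = {kappa:.2f}")  # chance-corrected agreement
```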

Conclusions

The results suggest health advantage accrues to individuals who self-identify and are socially-assigned as belonging to the dominant European ethnic grouping in New Zealand, operating in part through socioeconomic advantage and lower exposure to individual-level racial discrimination. This is consistent with the broader evidence of the negative impacts of racism on health and ethnic inequalities that result from the inequitable distribution of health determinants, the harm and chronic stress linked to experiences of racial discrimination, and via the processes and consequences of racialization at a societal level.

11.
Doolan K, Ehrlich R, Myer L. PLoS One 2007, 2(12): e1290

Background

Violence is a leading cause of morbidity and mortality in South Africa and needs to be researched from a public health perspective. In violence research, socioeconomic position is typically used in the analysis to control for confounding. Social epidemiology instead approaches this variable as a primary determinant of interest, and this research takes that approach to better understand the aetiology of violence in South Africa. We hypothesised that measures of socioeconomic position (employment, education and household wealth) would be inversely related to violence at the individual and household levels.

Methodology/Principal Findings

Data came from the 1998 South African Demographic and Health Survey (SADHS). Measures of socioeconomic position used were employment, education and household wealth. Eighty-eight people (0.2%) received treatment for a violent injury in the previous 30 days and 103 households (0.9%) experienced a violent death in the previous year. Risk factors for violence at the individual level included employment (41% of those who experienced violence were employed vs. 27% of those who did not, p = 0.02) and education (those who experienced violence had, on average, one year more education than those who did not, p = 0.04). Belonging to a household in the wealthiest quintile was protective against violence (OR: 0.32; 95% CI: 0.12–0.89). In contrast, at the household level all three measures of socioeconomic position were protective against the experience of a violent death. The only association to persist in the multivariate analysis was that between household wealth and violence at the individual level.

Conclusions/Significance

Our hypothesis was supported when household wealth was used as the measure of socioeconomic position at the individual level. While more research is needed to explain the conflicting results observed between the individual and household levels, this analysis has begun to identify the disparities across the socioeconomic structure with respect to violent outcomes.

12.

Background

Stigmatization is one of the greatest obstacles to the successful integration of people with Trisomy 21 (T21, or Down syndrome), the most frequent genetic disorder associated with intellectual disability. Research on attitudes and stereotypes toward these people still focuses on explicit measures subject to social-desirability biases, and neglects how variability in facial stigmata influences attitudes and stereotyping.

Methodology/Principal Findings

The participants were 165 adults: 55 young adult students, 55 non-student adults, and 55 professional caregivers working with intellectually disabled persons. They completed implicit association tests (IATs), a well-known technique whereby response latency is used to capture the relative strength with which some groups of people—here, photographed faces of typically developing children and children with T21—are automatically (without conscious awareness) associated with positive versus negative attributes in memory. Each participant also rated the same photographed faces (consciously accessible evaluations). We provide the first evidence that the positive bias typically found in explicit judgments of children with T21 is smaller for those whose facial features are highly characteristic of this disorder, compared to their counterparts with less distinctive features and to typically developing children. We also show that this bias can coexist with negative evaluations at the implicit level (with large effect sizes), even among professional caregivers.
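
The IAT summarizes bias as a latency difference between compatible and incompatible pairings, conventionally expressed as a D score (mean difference divided by the pooled SD). The abstract does not give the exact scoring pipeline, so the sketch below shows the conventional calculation on simulated latencies:

```python
import numpy as np

def iat_d_score(rt_compatible, rt_incompatible):
    """D score in the style of Greenwald et al.: mean latency difference between
    incompatible and compatible blocks, divided by the pooled SD. A sketch of the
    conventional scoring, not the exact pipeline used in the study."""
    rt_c = np.asarray(rt_compatible, dtype=float)
    rt_i = np.asarray(rt_incompatible, dtype=float)
    pooled_sd = np.std(np.concatenate([rt_c, rt_i]), ddof=1)
    return (rt_i.mean() - rt_c.mean()) / pooled_sd

rng = np.random.default_rng(2)
# Hypothetical trial latencies (ms) for the two pairings.
d = iat_d_score(rng.normal(750, 120, 40), rng.normal(850, 140, 40))
print(f"D = {d:.2f}")  # positive D = faster responses in the compatible pairing
```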

Conclusion

These findings support recent models of feature-based stereotyping, and more importantly show how crucial it is to go beyond explicit evaluations to estimate the true extent of stigmatization of intellectually disabled people.

13.

Background

The Mallett Unit is a clinical test designed to detect the fixation disparity that is most likely to occur in the presence of a decompensated heterophoria. It measures the associated phoria: the “aligning prism” needed to nullify the subjective disparity. The technique has gained widespread acceptance within professions such as optometry for investigating suspected cases of decompensating heterophoria; it is, however, rarely used by orthoptists and ophthalmologists. The aim of this study was to investigate whether fusional vergence reserves, measured routinely by both orthoptists and ophthalmologists to detect heterophoria decompensation, were correlated with aligning prism (associated phoria) in a normal clinical population.

Methodology/Principal Findings

Aligning prism (using the Mallett Unit) and fusional vergence reserves (using a prism bar) were measured in 500 participants (mean age 41.63 years; standard deviation 11.86 years) at 40 cm and 6 m. At 40 cm, a strong correlation (p<0.001) between base-in aligning prism (Exo FD) and positive fusional reserves was found. Of the participants with zero aligning prism, 30% had reduced fusional reserves. At 6 m, a weak correlation between base-out aligning prism (Eso FD) and negative fusional reserves was found for both the break (p = 0.01) and recovery (p = 0.048) points. Of the participants with zero aligning prism, 12% had reduced fusional reserves.

Conclusions/Significance

For near vision testing, the strong inverse correlation between base-in aligning prism (Exo FD) and fusional vergence reserves supports the notion that both measures are indicators of decompensation of heterophoria. For distance vision testing, and for those patients reporting zero aligning prism, further research is required to determine why the relationship appears to be weak or non-existent.

14.

Background

The sieve analysis for the Step trial found evidence that breakthrough HIV-1 sequences for MRKAd5/HIV-1 Gag/Pol/Nef vaccine recipients were more divergent from the vaccine insert than placebo sequences in regions with predicted epitopes. We linked the viral sequence data with immune response and acute viral load data to explore mechanisms for and consequences of the observed sieve effect.

Methods

Ninety-one male participants (37 placebo and 54 vaccine recipients) were included; viral sequences were obtained at the time of HIV-1 diagnosis. T-cell responses were measured 4 weeks post-second vaccination and at the first or second week post-diagnosis. Acute viral load was obtained at RNA-positive and antibody-negative visits.

Findings

Vaccine recipients had a greater magnitude of post-infection CD8+ T cell response than placebo recipients (median 1.68% vs. 1.18%; p = 0.04) and greater breadth of post-infection response (median 4.5 vs. 2; p = 0.06). Viral sequences for vaccine recipients were marginally more divergent from the insert than placebo sequences in regions of Nef targeted by pre-infection immune responses (p = 0.04; Pol p = 0.13; Gag p = 0.89). Magnitude and breadth of pre-infection responses did not correlate with distance of the viral sequence to the insert (p>0.50). Acute log viral load trended lower in vaccine versus placebo recipients (estimated mean 4.7 vs. 5.1) but the difference was not significant (p = 0.27). Neither was acute viral load associated with distance of the viral sequence to the insert (p>0.30).
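
The "distance of the viral sequence to the insert" can be illustrated, in its simplest form, as the proportion of mismatched residues over an aligned region; the actual sieve analysis restricts this to predicted-epitope regions and involves alignment machinery not shown. A toy sketch:

```python
def divergence(seq, insert):
    """Fraction of aligned amino-acid positions that differ from the vaccine
    insert; a toy stand-in for the sieve-analysis distance, which in the paper
    is computed over regions with predicted epitopes."""
    assert len(seq) == len(insert)
    return sum(a != b for a, b in zip(seq, insert)) / len(insert)

# Hypothetical aligned 9-mer from a breakthrough Nef sequence vs. the insert.
print(divergence("FLKEKGGLE", "FLKEKGGLD"))  # 1 mismatch / 9 sites ≈ 0.111
```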

Interpretation

Despite evidence of anamnestic responses, the sieve effect was not well explained by available measures of T-cell immunogenicity. Sequence divergence from the vaccine was not significantly associated with acute viral load. While point estimates suggested weak vaccine suppression of viral load, the result was not significant and more viral load data would be needed to detect suppression.

15.

Background & Methods

To examine the relationship between breastfeeding and maternally rated infant temperament, 316 infants in the prospective Cambridge Baby Growth Study (UK) had their temperament assessed at age 3 months by their mothers using the Revised Infant Behavior Questionnaire, which produces scores for three main dimensions of temperament derived from 14 subscales. Infant temperament scores were related to mode of infant milk feeding at age 3 months (breast only, formula milk only, or mixed), with adjustment for the infant's age at assessment and an index of deprivation.

Results

Infant temperament dimension scores differed across the three infant feeding groups, but appeared to be comparable between exclusive breast-fed and mixed-fed infants. Compared to formula milk-fed infants, exclusive breast-fed and mixed-fed infants were rated as having lower impulsivity and lower positive responses to stimulation (adjusted mean [95% CI] “Surgency/Extraversion” in formula-fed vs. mixed-fed vs. breast-fed groups: 4.3 [4.2–4.5] vs. 4.0 [3.8–4.1] vs. 4.0 [3.9–4.1]; p-heterogeneity = 0.0006), lower ability to regulate their own emotions (“Orienting/Regulation”: 5.1 [5.0–5.2] vs. 4.9 [4.8–5.1] vs. 4.9 [4.8–5.0]; p = 0.01), and higher emotional instability (“Negative affectivity”: 2.8 [2.6–2.9] vs. 3.0 [2.8–3.1] vs. 3.0 [2.9–3.1]; p = 0.03).

Conclusions

Breast-fed and mixed-fed infants were rated by their mothers as having more challenging temperaments in all three dimensions; particular subscales included greater distress, less smiling, laughing, and vocalisation, and lower soothability. Increased awareness of the behavioural dynamics of breastfeeding, better expectations of normal infant temperament, and support to cope with difficult infant temperament could potentially help to promote successful breastfeeding.

16.
17.

Purpose

Graft failure remains an obstacle to experimental subretinal cell transplantation. A key step is preparing a viable graft, as high levels of necrosis and apoptosis increase the risk of graft failure. Retinal grafts are commonly harvested from cell cultures. We termed the graft preparation procedure “transplant conditions” (TC). We hypothesized that culture conditions influenced graft viability, and investigated whether viability decreased following TC using a mouse retinal pigment epithelial (RPE) cell line, DH01.

Methods

Cell viability was assessed by trypan blue exclusion. Levels of apoptosis and necrosis in vitro were determined by flow cytometry for annexin V and propidium iodide and Western blot analysis for the pro- and cleaved forms of caspases 3 and 7. Graft viability in vivo was established by terminal deoxynucleotidyl transferase dUTP nick end labeling (TUNEL) and cleaved caspase 3 immunolabeling of subretinal allografts.
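
Trypan blue exclusion scores dye-permeable (blue) cells as nonviable, so the readout is a simple proportion. A sketch of the calculation with invented counts that echo the magnitudes reported below:

```python
def percent_nonviable(blue_cells, total_cells):
    """Trypan blue exclusion: dye-permeable (blue) cells are scored nonviable.
    A sketch of the standard hemocytometer calculation with invented counts."""
    return 100.0 * blue_cells / total_cells

# Counts chosen to echo the reported ~6.6% (pre-confluent) and ~13.1%
# (post-confluent) nonviable fractions.
print(percent_nonviable(13, 197))   # pre-confluent culture
print(percent_nonviable(26, 198))   # post-confluent culture
```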

Results

Pre-confluent cultures had significantly fewer nonviable cells than post-confluent cultures (6.6%±0.8% vs. 13.1%±0.9%, p<0.01). Cell viability in either group was not altered significantly following TC. Caspases 3 and 7 were not altered by level of confluence or following TC. Pre-confluent cultures had low levels of apoptosis/necrosis (5.6%±1.1%) that did not increase following TC (4.8%±0.5%). However, culturing beyond confluence led to progressively increasing levels of apoptosis and necrosis (up to 16.5%±0.9%). Allografts prepared from post-confluent cultures had significantly more TUNEL-positive cells 3 hours post-operatively than grafts of pre-confluent cells (12.7%±3.1% vs. 4.5%±1.4%, p<0.001). Subretinal grafts of post-confluent cells also had significantly higher rates of cleaved caspase 3 than pre-confluent grafts (20.2%±4.3% vs. 7.8%±1.8%, p<0.001).

Conclusion

Pre-confluent cells should be used to maximize graft cell viability.

18.

Background

The United States Public Health Service (USPHS) Guideline for Treating Tobacco Use and Dependence includes ten key recommendations regarding the identification and the treatment of tobacco users seen in all health care settings. To our knowledge, the impact of system-wide brief interventions with cigarette smokers on smoking prevalence and health care utilization has not been examined using patient population-based data.

Methods and Findings

Data on clinical interventions with cigarette smokers were examined for primary care office visits of 104,639 patients at 17 Harvard Vanguard Medical Associates (HVMA) sites. An operational definition of “systems change” was developed. It included thresholds for intervention frequency and sustainability. Twelve sites met the criteria. Five did not. Decreases in self-reported smoking prevalence were 40% greater at sites that achieved systems change (13.6% vs. 9.7%, p<0.01). On average, the likelihood of quitting increased by 2.6% (p<0.05, 95% CI: 0.1%–4.6%) per occurrence of brief intervention. For patients with a recent history of current smoking whose home site experienced systems change, the likelihood of an office visit for smoking-related diagnoses decreased by 4.3% on an annualized basis after systems change occurred (p<0.05, 95% CI: 0.5%–8.1%). There was no change in the likelihood of an office visit for smoking-related diagnoses following systems change among non-smokers.

Conclusions

The clinical practice data from HVMA suggest that a systems approach can lead to significant reductions in smoking prevalence and the rate of office visits for smoking-related diseases. Most comprehensive tobacco intervention strategies focus on the provider or the tobacco user, but these results argue that health systems should be included as an integral component of a comprehensive tobacco intervention strategy. The HVMA results also give us an indication of the potential health impacts when meaningful use core tobacco measures are widely adopted.

19.

Objectives

To examine demographic, environmental and clinical factors associated with severe bronchiolitis in infants admitted to hospital and quantify the independent effects of these factors.

Design

Prospective cohort study.

Setting

Alder Hey Children's Hospital, Liverpool, United Kingdom.

Participants

378 infants admitted to hospital with a diagnosis of bronchiolitis, of whom 299 (79%) were antigen-positive for respiratory syncytial virus (RSV).

Outcome

Severity of disease during admission, defined as “no need for supplemental oxygen” (reference group), “any need for supplemental oxygen” and “any need for mechanical ventilation”.

Results

Univariate analysis found that male sex (p = 0.035) and tobacco smoking by a household member (p<0.001) were associated with the need for both supplemental oxygen and mechanical ventilation. Premature birth, low gestation, low birth weight, low admission weight and low corrected age on admission were also associated with the need for mechanical ventilation (all p≤0.002). Deprivation scores (IMD 2004) were significantly higher in households where a member smoked than in non-smoking households (p<0.001). The odds of smoking predicted by deprivation were 7 times higher (95% CI: 3.59, 14.03) when comparing the least and most deprived quintiles of the study population. Family history of atopic disease and deprivation score were not associated with severe disease. Multivariate multinomial logistic regression, which initially included all covariates, found that household tobacco smoking (adjusted OR = 2.45, 95% CI: 1.60, 3.74) predicted the need for oxygen supplementation. Household tobacco smoking (adjusted OR = 5.49, 95% CI: 2.78, 10.83) and weight (kg) on admission (adjusted OR = 0.51, 95% CI: 0.40, 0.65) were both significant predictors in the final model for mechanical ventilation. The same associations and similar effect sizes were found when only children with proven RSV infection were included in the analysis.
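
The final severity model is a multinomial logistic regression with "no need for supplemental oxygen" as the reference category. A minimal statsmodels sketch on simulated data; variable names and values are illustrative, not the study dataset:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 378  # the study enrolled 378 infants

# Simulated dataset; outcome coded 0 = no supplemental oxygen (reference),
# 1 = supplemental oxygen, 2 = mechanical ventilation. Names mirror the
# abstract, values are invented.
df = pd.DataFrame({
    "severity": rng.integers(0, 3, n),
    "household_smoker": rng.integers(0, 2, n),
    "admission_weight_kg": rng.normal(5.0, 1.5, n).clip(2, 10),
})

fit = smf.mnlogit("severity ~ household_smoker + admission_weight_kg", data=df).fit(disp=0)
print(np.exp(fit.params))  # adjusted ORs for each outcome vs. the reference
```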

Conclusions

Low admission weight and household tobacco smoking increased the risk of severe bronchiolitis in infants admitted to hospital. These effects were independent of a standard deprivation measure. NIHR Study Ref. DHCS/G121/10.

20.

Introduction

Mitochondrial function influences T cell dynamics and is affected by mitochondrial DNA (mtDNA) variation. We previously reported an association between African mtDNA haplogroup L2 and less robust CD4 cell recovery on antiretroviral therapy (ART) in non-Hispanic black ACTG 384 subjects. We explored whether additional T cell parameters in this cohort differed by mtDNA haplogroup.

Methods

ACTG 384 randomized ART-naïve subjects to two different nucleoside regimens with efavirenz, nelfinavir, or both. CD4 and CD8 memory and activation markers were available at baseline and week 48 for most subjects. mtDNA sequencing was performed on whole blood DNA, and haplogroups were determined. We studied non-Hispanic black subjects with HIV RNA <400 copies/mL at week 48. Analyses included the Wilcoxon rank-sum test and linear regression.
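
The Wilcoxon rank-sum test used to compare haplogroup groups is scipy's Mann-Whitney U. A sketch on simulated CD4 gains whose medians echo the Results below (all values hypothetical):

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(4)
# Simulated week-48 CD4 cell increases; the Results report median gains of
# +95 (L2, N = 31) vs. +178 (non-L2), so values here only echo those magnitudes.
cd4_gain_l2 = rng.normal(95, 60, 31)
cd4_gain_other = rng.normal(178, 60, 57)

stat, p = mannwhitneyu(cd4_gain_l2, cd4_gain_other, alternative="two-sided")
print(f"U = {stat:.0f}, p = {p:.4f}")
```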

Results

Data from 104 subjects were included. Major African mtDNA haplogroups included L1 (N = 25), L2 (N = 31), and L3 (N = 32). Baseline age, HIV RNA, and CD4 cells did not differ between L2 and non-L2 haplogroups. Compared to non-L2 haplogroups, L2 subjects had lower baseline activated CD4 cells (median 12% vs. 17%; p = 0.03) and tended toward lower activated CD8 cells (41% vs. 47%; p = 0.06). At 48 weeks of ART, L2 subjects had smaller decreases in activated CD4 cells (−4% vs. −11%; p = 0.01), and smaller CD4 cell increases (+95 vs. +178; p = 0.002). In models adjusting for baseline age, CD4 cells, HIV RNA, and naïve-to-memory CD4 cell ratio, haplogroup L2 was associated with lower baseline (p = 0.04) and 48-week change in (p = 0.01) activated CD4 cells.

Conclusions

Among ART-naïve non-Hispanic blacks, mtDNA haplogroup L2 was associated with baseline and 48-week change in T cell activation, and poorer CD4 cell recovery. These data suggest mtDNA variation may influence CD4 T cell dynamics by modulating T cell activation. Further study is needed to replicate these associations and identify mechanisms.
