Similar Articles
20 similar articles retrieved (search time: 46 ms)
1.

Background

Recent behavioral studies report correlational evidence to suggest that non-musicians with good pitch discrimination sing more accurately than those with poorer auditory skills. However, other studies have reported a dissociation between perceptual and vocal production skills. In order to elucidate the relationship between auditory discrimination skills and vocal accuracy, we administered an auditory-discrimination training paradigm to a group of non-musicians to determine whether training-enhanced auditory discrimination would specifically result in improved vocal accuracy.

Methodology/Principal Findings

We utilized micromelodies (i.e., melodies with seven different interval scales, each smaller than a semitone) as the main stimuli for auditory discrimination training and testing, and we used single-note and melodic singing tasks to assess vocal accuracy in two groups of non-musicians (experimental and control). To determine if any training-induced improvements in vocal accuracy would be accompanied by related modulations in cortical activity during singing, the experimental group of non-musicians also performed the singing tasks while undergoing functional magnetic resonance imaging (fMRI). Following training, the experimental group exhibited significant enhancements in micromelody discrimination compared to controls. However, we did not observe a correlated improvement in vocal accuracy during single-note or melodic singing, nor did we detect any training-induced changes in activity within brain regions associated with singing.

Conclusions/Significance

Given the observations from our auditory training regimen, we therefore conclude that perceptual discrimination training alone is not sufficient to improve vocal accuracy in non-musicians, supporting the suggested dissociation between auditory perception and vocal production.

2.

Introduction

Difficulties in word-level reading skills are prevalent in Brazilian schools and may deter children from gaining the knowledge obtained through reading and academic achievement. Music education has emerged as a potential method to improve reading skills because of a common neurobiological substratum.

Objective

To evaluate the effectiveness of music education for the improvement of reading skills and academic achievement among children (eight to 10 years of age) with reading difficulties.

Method

235 children with reading difficulties in 10 schools participated in a five-month, cluster-randomized clinical trial (RCT) in an impoverished zone within the city of São Paulo to test the effects of a music education intervention on reading skills and academic achievement during the school year. Five schools were chosen randomly to incorporate music classes (n = 114), and five served as controls (n = 121). Two different methods of analysis were used to evaluate the effectiveness of the intervention: the standard intention-to-treat (ITT) approach, and the Complier Average Causal Effect (CACE) estimation method, which takes compliance status into account.
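For readers unfamiliar with the distinction between the two estimators, the sketch below illustrates, with invented numbers rather than the trial's data, how an intention-to-treat contrast relates to a Complier Average Causal Effect obtained with the simple Wald (instrumental-variable) estimator; the published analysis used a more elaborate CACE model, so this is only a conceptual approximation.

```python
import numpy as np

# Illustrative arrays (not the study's data): one value per child.
# assigned: 1 if the child's school was randomized to music classes
# complied: 1 if the child actually attended the music classes
# outcome:  correct words read per minute at follow-up
rng = np.random.default_rng(0)
assigned = rng.integers(0, 2, size=235)
complied = assigned * rng.integers(0, 2, size=235)   # only assigned children can comply
outcome = 40 + 10 * complied + rng.normal(0, 8, 235) # hypothetical treatment effect

# Intention-to-treat effect: compare groups as randomized, ignoring compliance.
itt = outcome[assigned == 1].mean() - outcome[assigned == 0].mean()

# Complier Average Causal Effect via the Wald/IV estimator:
# ITT effect divided by the difference in compliance rates between arms.
compliance_gap = complied[assigned == 1].mean() - complied[assigned == 0].mean()
cace = itt / compliance_gap

print(f"ITT estimate:  {itt:.2f}")
print(f"CACE estimate: {cace:.2f}")
```

Because only compliers can be affected by attending the classes, the CACE scales the ITT contrast up by the compliance rate, which is why CACE coefficients are typically larger than ITT ones when compliance is partial.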

Results

The ITT analyses were not very promising; only a marginal effect was observed for the rate of correct real words read per minute. Considering ITT, however, improvements were observed in the secondary outcomes (slope of Portuguese = 0.21 [p<0.001] and slope of math = 0.25 [p<0.001]). The CACE estimation (i.e., complier versus non-complier children) showed more promising effects for the rate of correct words read per minute [β = 13.98, p<0.001] and phonological awareness [β = 19.72, p<0.001], as well as for the secondary outcomes (academic achievement in Portuguese [β = 0.77, p<0.0001] and math [β = 0.49, p<0.001] throughout the school year).

Conclusion

The results may be seen as promising, but they are not, in themselves, enough to justify adopting music lessons as public policy.

3.

Background

The ability to separate two interleaved melodies is an important factor in music appreciation. This ability is greatly reduced in people with hearing impairment, contributing to difficulties in music appreciation. The aim of this study was to assess whether visual cues, musical training or musical context could have an effect on this ability, and potentially improve music appreciation for the hearing impaired.

Methods

Musicians (N = 18) and non-musicians (N = 19) were asked to rate the difficulty of segregating a four-note repeating melody from interleaved random distracter notes. Visual cues were provided on half the blocks, and two musical contexts were tested, with the overlap between melody and distracter notes either gradually increasing or decreasing.

Conclusions

Visual cues, musical training, and musical context all affected the difficulty of extracting the melody from a background of interleaved random distracter notes. Visual cues were effective in reducing the difficulty of segregating the melody from distracter notes, even in individuals with no musical training. These results are consistent with theories that indicate an important role for central (top-down) processes in auditory streaming mechanisms, and suggest that visual cues may help the hearing-impaired enjoy music.

4.

Background

Enjoyment of music is an important part of life that may be degraded for people with hearing impairments, especially those using cochlear implants. The ability to follow separate lines of melody is an important factor in music appreciation. This ability relies on effective auditory streaming, which is much reduced in people with hearing impairment, contributing to difficulties in music appreciation. The aim of this study was to assess whether visual cues could reduce the subjective difficulty of segregating a melody from interleaved background notes in normally hearing listeners, those using hearing aids, and those using cochlear implants.

Methodology/Principal Findings

Normally hearing listeners (N = 20), hearing aid users (N = 10), and cochlear implant users (N = 11) were asked to rate the difficulty of segregating a repeating four-note melody from random interleaved distracter notes. The pitch of the background notes was gradually increased or decreased throughout blocks, providing a range of difficulty from easy (with a large pitch separation between melody and distracter) to impossible (with the melody and distracter completely overlapping). Visual cues were provided on half the blocks, and difficulty ratings for blocks with and without visual cues were compared between groups. Visual cues reduced the subjective difficulty of extracting the melody from the distracter notes for normally hearing listeners and cochlear implant users, but not hearing aid users.

Conclusion/Significance

Simple visual cues may improve the ability of cochlear implant users to segregate lines of music, thus potentially increasing their enjoyment of music. More research is needed to determine what type of acoustic cues to encode visually in order to optimise the benefits they may provide.

5.

Background

The contribution of different cognitive abilities to academic performance in children surviving cerebral insult can guide the choice of interventions to improve cognitive and academic outcomes. This study's objective was to identify which cognitive abilities are associated with academic performance in children after malaria with neurological involvement.

Methods

62 Ugandan children with a history of malaria with neurological involvement were assessed for cognitive ability (working memory, reasoning, learning, visual spatial skills, attention) and academic performance (reading, spelling, arithmetic) three months after the illness. Linear regressions were fit for each academic score with the five cognitive outcomes entered as predictors. Adjusters in the analysis were age, sex, education, nutrition, and home environment. Exploratory factor analysis (EFA) and structural equation models (SEM) were used to determine the nature of the association between cognition and academic performance. Predictive residual sum of squares was used to determine which combination of cognitive scores was needed to predict academic performance.
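As a rough illustration of the regression step described above, the following sketch fits one academic score on the five cognitive outcomes plus adjusters using the statsmodels formula interface; the data frame and column names are placeholders, not the study's actual variables, and the adjuster set is abbreviated.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data frame standing in for the 62 children; columns are illustrative.
rng = np.random.default_rng(1)
n = 62
df = pd.DataFrame({
    "reading": rng.normal(50, 10, n),
    "working_memory": rng.normal(0, 1, n),
    "reasoning": rng.normal(0, 1, n),
    "learning": rng.normal(0, 1, n),
    "visual_spatial": rng.normal(0, 1, n),
    "attention": rng.normal(0, 1, n),
    "age": rng.uniform(5, 12, n),
    "sex": rng.integers(0, 2, n),
    "education": rng.integers(0, 7, n),
})

# One linear regression per academic score: all five cognitive outcomes as
# predictors, with demographic adjusters, mirroring the analysis described above.
model = smf.ols(
    "reading ~ working_memory + reasoning + learning + visual_spatial"
    " + attention + age + sex + education",
    data=df,
).fit()
print(model.summary())                # coefficient estimates
print(model.conf_int(alpha=0.05))     # 95% confidence intervals
```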

Results

In regressions of a single academic score on all five cognitive outcomes and adjusters, the only significant associations were between Working Memory and Reading (coefficient estimate = 0.36, 95% confidence interval = 0.10 to 0.63, p<0.01) and Spelling (0.46, 0.13 to 0.78, p<0.01), between Visual Spatial Skills and Arithmetic (0.15, 0.03 to 0.26, p<0.05), and between Learning and Reading (0.06, 0.00 to 0.11, p<0.05). One latent cognitive factor was identified using EFA. The SEM found a strong association between this latent cognitive ability and each academic performance measure (P<0.0001). Working memory, visual spatial ability, and learning were the best predictors of academic performance.

Conclusion

Academic performance is strongly associated with the latent variable labelled “cognitive ability”, which captures most of the variation in the individual-specific cognitive outcome measures. Working memory, visual spatial skills, and learning together stood out as the best combination to predict academic performance.

6.

Background

The purpose of this work was to determine, in a clinical trial, whether sustained cortical activation evoked by passive exposure to a specific auditory stimulus (particular music) can reduce or prevent seizures in patients with neurological handicaps. The specific type of stimulation had been shown in previous studies to evoke anti-epileptiform/anti-seizure brain activity.

Methods

The study was conducted at the Thad E. Saleeby Center in Hartsville, South Carolina, which is a permanent residence for individuals with heterogeneous neurological impairments, many with epilepsy. We investigated the ability to reduce or prevent seizures in subjects through cortical stimulation from sustained passive nightly exposure to a specific auditory stimulus (music) in a three-year randomized controlled study. In year 1, baseline seizure rates were established. In year 2, subjects were randomly assigned to treatment and control groups. Treatment group subjects were exposed during sleeping hours to specific music at regular intervals. Control subjects received no music exposure and were maintained on regular anti-seizure medication. In year 3, music treatment was terminated and seizure rates were followed. We found a significant treatment effect (p = 0.024) during the treatment phase persisting through the follow-up phase (p = 0.002). Subjects exposed to treatment exhibited a significant 24% decrease in seizures during the treatment phase, and a 33% decrease persisting through the follow-up phase. Twenty-four percent of treatment subjects exhibited a complete absence of seizures during treatment.

Conclusion/Significance

Exposure to specific auditory stimuli (i.e., music) can significantly reduce seizures in subjects with a range of epilepsy and seizure types, in some cases achieving a complete cessation of seizures. These results are consistent with previous work showing reductions in epileptiform activity from particular music exposure and offer potential for a non-invasive, non-pharmacologic treatment of epilepsy.

Trial Registration

ClinicalTrials.gov NCT01459692

7.

Objective

To estimate the effectiveness of anterior cervical discectomy with arthroplasty (ACDA) compared to anterior cervical discectomy with fusion (ACDF) for patient-important outcomes for single-level cervical spondylosis.

Data sources

Electronic databases (MEDLINE, EMBASE, Cochrane Register for Randomized Controlled Trials, BIOSIS and LILACS), archives of spine meetings and bibliographies of relevant articles.

Study selection

We included RCTs of ACDF versus ACDA in adult patients with single-level cervical spondylosis reporting at least one of the following outcomes: functionality, neurological success, neck pain, arm pain, quality of life, surgery for adjacent level degeneration (ALD), reoperation and dysphonia/dysphagia. We used no language restrictions. We performed title and abstract screening and full text screening independently and in duplicate.

Data synthesis

We used random-effects model to pool data using mean difference (MD) for continuous outcomes and relative risk (RR) for dichotomous outcomes. We used GRADE to evaluate the quality of evidence for each outcome.
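The sketch below shows one conventional way to pool log relative risks under a random-effects (DerSimonian-Laird) model, which is the kind of computation described above; the per-trial counts are invented for illustration, and the review's own software and exact estimator may differ.

```python
import numpy as np

# Hypothetical per-trial event counts and arm sizes (not the review's data).
events_acda = np.array([10, 8, 12, 5])
n_acda      = np.array([100, 90, 120, 60])
events_acdf = np.array([18, 15, 20, 9])
n_acdf      = np.array([100, 95, 115, 58])

# Log relative risks and their variances for each trial.
log_rr = np.log((events_acda / n_acda) / (events_acdf / n_acdf))
var_rr = 1/events_acda - 1/n_acda + 1/events_acdf - 1/n_acdf

# DerSimonian-Laird estimate of between-trial variance (tau^2).
w_fixed = 1 / var_rr
mean_fixed = np.sum(w_fixed * log_rr) / w_fixed.sum()
q = np.sum(w_fixed * (log_rr - mean_fixed) ** 2)
c = w_fixed.sum() - np.sum(w_fixed ** 2) / w_fixed.sum()
tau2 = max(0.0, (q - (len(log_rr) - 1)) / c)

# Random-effects weights and pooled relative risk with a 95% CI.
w_re = 1 / (var_rr + tau2)
pooled = np.sum(w_re * log_rr) / w_re.sum()
se = np.sqrt(1 / w_re.sum())
print(f"pooled RR = {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96*se):.2f}-{np.exp(pooled + 1.96*se):.2f})")
```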

Results

Of 2804 citations, 9 articles reporting on 9 trials (1778 participants) were eligible. ACDA is associated with a clinically significant lower incidence of neurologic failure (RR = 0.53, 95% CI 0.37–0.75, p = 0.0004) and improvement on the neck pain visual analogue scale (VAS) (MD = 6.56, 95% CI 3.22–9.90, p = 0.0001; minimal clinically important difference (MCID) = 2.5). ACDA is associated with a statistically but not clinically significant improvement in arm pain VAS and the SF-36 physical component summary. ACDA is associated with a non-statistically significant greater improvement in the Neck Disability Index score and lower incidence of ALD requiring surgery, reoperation, and dysphagia/dysphonia.

Conclusions

There is no strong evidence to support the routine use of ACDA over ACDF in single-level cervical spondylosis. Current trials lack the long-term data required to assess safety as well as surgery for ALD. We suggest that ACDA in patients with single-level cervical spondylosis is an option, although its benefits and indication over ACDF remain in question.

8.

Background

Birth asphyxia kills 0.7 to 1.6 million newborns a year globally, with 99% of deaths occurring in developing countries. Effective newborn resuscitation could reduce this burden of disease, but the training of health-care providers in low-income settings is often outdated. Our aim was to determine whether a simple one-day newborn resuscitation training (NRT) alters health worker resuscitation practices in a public hospital setting in Kenya.

Methods/Principal Findings

We conducted a randomised, controlled trial with health workers receiving early training with NRT (n = 28) or late training (the control group, n = 55). The training was adapted locally from the approach of the UK Resuscitation Council. The primary outcome was the proportion of appropriate initial resuscitation steps with the frequency of inappropriate practices as a secondary outcome. Data were collected on 97 and 115 resuscitation episodes over 7 weeks after early training in the intervention and control groups respectively. Trained providers demonstrated a higher proportion of adequate initial resuscitation steps compared to the control group (trained 66% vs control 27%; risk ratio 2.45, [95% CI 1.75–3.42], p<0.001, adjusted for clustering). In addition, there was a statistically significant reduction in the frequency of inappropriate and potentially harmful practices per resuscitation in the trained group (trained 0.53 vs control 0.92; mean difference 0.40, [95% CI 0.13–0.66], p = 0.004).

Conclusions/Significance

Implementation of a simple, one-day newborn resuscitation training can be followed immediately by significant improvement in health workers' practices. However, evidence of the effects on long-term performance or clinical outcomes can only be established by larger cluster randomised trials.

Trial Registration

Controlled-Trials.com ISRCTN92218092

9.

Background

Evidence based largely on self-report data suggests that factors associated with medical education erode the critical human quality of empathy. These reports have caused serious concern among medical educators and clinicians and have led to changes in medical curricula around the world. This study aims to provide a more objective index of possible changes in empathy across the spectrum of clinical exposure, by using a behavioural test of empathic accuracy in addition to self-report questionnaires. Moreover, non-medical groups were used to control for maturation effects.

Methods

Three medical groups (N = 3×20) representing a spectrum of clinical exposure, and two non-medical groups (N = 2×20) matched for age, sex and educational achievements completed self-report measures of empathy, and tests of empathic accuracy and interoceptive sensitivity.

Results

Between-group differences in reported empathy related to maturation rather than clinical training/exposure. Conversely, analyses of the “eyes” test results specifically identified clinical practice, but not medical education, as the key influence on performance. The data from the interoception task did not support a link between visceral feedback and empathic processes.

Conclusions

Clinical practice, but not medical education, impacts on empathy development and seems instrumental in maintaining empathetic skills against the general trend of declining empathic accuracy with age.

10.

Objectives

There is a lack of information on sexual violence (SV) among men who have sex with men and transgendered individuals (MSM-T) in southern India. As SV has been associated with HIV vulnerability, this study examined health related behaviours and practices associated with SV among MSM-T.

Design

Data were from cross-sectional surveys from four districts in Karnataka, India.

Methods

Multivariable logistic regression models were constructed to examine factors related to SV. Multivariable negative binomial regression models examined the association between physician visits and SV.
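A minimal sketch of the two model families named above, using the statsmodels formula interface; the variable names, covariate set, and data are placeholders rather than the survey's actual coding.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical survey records; column names are illustrative, not the study's.
rng = np.random.default_rng(2)
n = 543
df = pd.DataFrame({
    "sexual_violence": rng.integers(0, 2, n),
    "sex_work": rng.integers(0, 2, n),
    "casual_partners_5plus": rng.integers(0, 2, n),
    "physician_visits": rng.poisson(2, n),
    "age": rng.integers(18, 50, n),
    "district": rng.integers(0, 4, n),
})

# Multivariable logistic regression: odds of reporting sexual violence.
logit_fit = smf.logit(
    "sexual_violence ~ casual_partners_5plus + sex_work + age + C(district)",
    data=df,
).fit()
print(np.exp(logit_fit.params))   # exponentiated coefficients = adjusted odds ratios

# Negative binomial regression: physician visits as an over-dispersed count outcome.
nb_fit = smf.negativebinomial(
    "physician_visits ~ sexual_violence + sex_work + age + C(district)",
    data=df,
).fit()
print(np.exp(nb_fit.params))      # adjusted rate ratios (last entry is the dispersion alpha)
```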

Results

A total of 543 MSM-T were included in the study. Prevalence of SV was 18% in the past year. HIV prevalence among those reporting SV was 20%, compared to 12% among those not reporting SV (p = .104). In multivariable models, and among sex workers, those reporting SV were more likely to report anal sex with 5+ casual sex partners in the past week (AOR: 4.1; 95%CI: 1.2–14.3, p = .029). Increased physician visits among those reporting SV was reported only for those involved in sex work (ARR: 1.7; 95%CI: 1.1–2.7, p = .012).

Conclusions

These results demonstrate high levels of SV among MSM-T populations, highlighting the importance of integrating interventions to reduce violence as part of HIV prevention programs and health services.

11.

Background

Capecitabine has proven effective as a chemotherapy for metastatic breast cancer. Although several Phase II/III studies of capecitabine as neoadjuvant chemotherapy have been conducted, the results remain inconsistent. Therefore, we performed a meta-analysis to obtain a more precise understanding of the role of capecitabine in neoadjuvant chemotherapy for breast cancer patients.

Methods

The electronic database PubMed and online abstracts from ASCO and SABCS were searched to identify randomized clinical trials comparing neoadjuvant chemotherapy with or without capecitabine in early/operable breast cancer patients without distant metastasis. Risk ratios were used to estimate the association between capecitabine in neoadjuvant chemotherapy and various efficacy outcomes. Fixed- or random-effect models were adopted to pool data in RevMan 5.1.

Results

Five studies were included in the meta-analysis. Neoadjuvant use of capecitabine with anthracycline- and/or taxane-based therapy was not associated with significant improvement in clinical outcomes, including pathologic complete response in the breast (pCR; RR = 1.10, 95% CI 0.87–1.40, p = 0.43), pCR in breast tumor and nodes (tnpCR; RR = 0.99, 95% CI 0.83–1.18, p = 0.90), overall response rate (ORR; RR = 1.00, 95% CI 0.94–1.07, p = 0.93), or breast-conserving surgery (BCS; RR = 0.98, 95% CI 0.93–1.04, p = 0.49).

Conclusions

Neoadjuvant treatment of breast cancer involving capecitabine did not significantly improve pCR, tnpCR, BCS, or ORR. Thus, adding capecitabine to neoadjuvant chemotherapy regimens is unlikely to improve outcomes in breast cancer patients without distant metastasis. Further research is required to establish the conditions under which capecitabine may be useful in neoadjuvant chemotherapy for breast cancer.

12.

Background

Basal cell carcinoma (BCC) tumors are the most common skin cancer and are highly immunogenic.

Objective

The goal of this study was to assess how immune-cell related gene expression in an initial BCC tumor biopsy was related to the appearance of subsequent BCC tumors.

Materials and Methods

Levels of mRNA for CD3ε (a T-cell receptor marker), CD25 (the alpha chain of the interleukin (IL)-2 receptor expressed on activated T-cells and B-cells), CD68 (a marker for monocytes/macrophages), the cell surface glycoprotein intercellular adhesion molecule-1 (ICAM-1), the cytokine interferon-γ (IFN-γ) and the anti-inflammatory cytokine IL-10 were measured in BCC tumor biopsies from 138 patients using real-time PCR.

Results

The median follow-up was 26.6 months, and 61% of subjects were free of new BCCs two years post-initial biopsy. Patients with low CD3ε, CD25, CD68, and ICAM-1 mRNA levels had significantly shorter times before new tumors were detected (p = 0.03, p = 0.02, p = 0.003, and p = 0.08, respectively). Furthermore, older age diminished the association of mRNA levels with the appearance of subsequent tumors.

Conclusions

Our results show that levels of CD3ε, CD25, CD68, and ICAM-1 mRNA in BCC biopsies may predict risk for new BCC tumors.

13.

Background

Non-pulsatile tinnitus is considered a subjective auditory phantom phenomenon present in 10 to 15% of the population. Tinnitus as a phantom phenomenon is related to hyperactivity and reorganization of the auditory cortex. Magnetoencephalography studies demonstrate a correlation between gamma band activity in the contralateral auditory cortex and the presence of tinnitus. The present study aims to investigate the relation between objective gamma-band activity in the contralateral auditory cortex and subjective tinnitus loudness scores.

Methods and Findings

In unilateral tinnitus patients (N = 15; 10 right, 5 left) source analysis of resting state electroencephalographic gamma band oscillations shows a strong positive correlation with Visual Analogue Scale loudness scores in the contralateral auditory cortex (max r = 0.73, p<0.05).
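The reported association is, at its core, a correlation between a per-patient gamma-band power estimate and a VAS loudness score; a minimal sketch with invented values follows (the study used source-localized resting-state EEG power, which is not reproduced here).

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-patient values (N = 15), standing in for the study's data:
# gamma-band power in the auditory cortex contralateral to the tinnitus ear,
# and the patient's Visual Analogue Scale loudness rating.
rng = np.random.default_rng(3)
gamma_power = rng.normal(1.0, 0.2, 15)
vas_loudness = 4 + 5 * (gamma_power - 1.0) + rng.normal(0, 0.5, 15)

r, p = pearsonr(gamma_power, vas_loudness)
print(f"r = {r:.2f}, p = {p:.3f}")
```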

Conclusion

Auditory phantom percepts thus show sound-level-dependent activation of the contralateral auditory cortex similar to that observed in normal audition. In view of recent consciousness models and tinnitus network models, these results suggest that tinnitus loudness is coded by gamma band activity in the contralateral auditory cortex, which might not, by itself, be responsible for tinnitus perception.

14.

Study Objectives

To investigate the effect of an eight-week, home-based, personalized, computerized cognitive training program on sleep quality and cognitive performance among older adults with insomnia.

Design

Participants (n = 51) were randomly allocated to a cognitive training group (n = 34) or to an active control group (n = 17). The participants in the cognitive training group completed an eight-week, home-based, personalized, computerized cognitive training program, while the participants in the active control group completed an eight-week, home-based program involving computerized tasks that do not engage high-level cognitive functioning. Before and after training, all participants' sleep was monitored for one week by an actigraph and their cognitive performance was evaluated.

Setting

Community setting: residential sleep/performance testing facility.

Participants

Fifty-one older adults with insomnia (aged 65–85).

Interventions

Eight weeks of computerized cognitive training for older adults with insomnia.

Results

Mixed models for repeated measures showed between-group improvements for the cognitive training group in both sleep quality (sleep onset latency and sleep efficiency) and cognitive performance (avoiding distractions, working memory, visual memory, general memory, and naming). Hierarchical linear regression analyses in the cognitive training group indicated that improved visual scanning was associated with an earlier advent of sleep, while improved naming was associated with reductions in wake after sleep onset and in the number of awakenings. Likewise, improved “avoiding distractions” was associated with an increase in the duration of sleep. Moreover, in the active control group, cognitive decline in working memory was associated with an increase in the time required to fall asleep.
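A minimal sketch of a repeated-measures mixed model of the kind described above, with a group-by-time interaction serving as the between-group test; the outcome, column names, and data are illustrative, not the trial's.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per participant per time point.
rng = np.random.default_rng(4)
records = []
for subject in range(51):
    group = "training" if subject < 34 else "control"
    for time in (0, 1):  # 0 = baseline, 1 = post-intervention
        records.append({
            "subject": subject,
            "group": group,
            "time": time,
            "sleep_efficiency": rng.normal(80 + 3 * time * (group == "training"), 5),
        })
df = pd.DataFrame(records)

# Random-intercept mixed model: the group-by-time interaction tests whether the
# training group improved more than the active control group.
fit = smf.mixedlm("sleep_efficiency ~ group * time", data=df, groups=df["subject"]).fit()
print(fit.summary())
```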

Conclusions

New learning is instrumental in promoting initiation and maintenance of sleep in older adults with insomnia. Lasting and personalized cognitive training is particularly indicated to generate the type of learning necessary for combined cognitive and sleep enhancements in this population.

Trial Registration

ClinicalTrials.gov NCT00901641 (http://clinicaltrials.gov/ct2/show/NCT00901641)

15.

Background

The clinical, radiological and pathological similarities between sarcoidosis and tuberculosis can make disease differentiation challenging. A complicating factor is that some cases of sarcoidosis may be initiated by mycobacteria. We hypothesised that immunological profiling might provide insight into a possible relationship between the diseases or allow us to distinguish between them.

Methods

We analysed bronchoalveolar lavage (BAL) fluid in sarcoidosis (n = 18), tuberculosis (n = 12) and healthy volunteers (n = 16). We further investigated serum samples in the same groups: sarcoidosis (n = 40), tuberculosis (n = 15) and healthy volunteers (n = 40). A cross-sectional analysis of multiple cytokine profiles was performed, and the data were used to discriminate between samples.

Results

We found that BAL profiles were indistinguishable between both diseases and significantly different from healthy volunteers. In sera, tuberculosis patients had significantly lower levels of the Th2 cytokine interleukin-4 (IL-4) than those with sarcoidosis (p = 0.004). Additional serum differences allowed us to create a linear regression model for disease differentiation (within-sample accuracy 91%, cross-validation accuracy 73%).
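The within-sample versus cross-validation contrast reported above can be reproduced in miniature as follows; a regularized logistic classifier stands in for the authors' linear model, and the cytokine matrix is random placeholder data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Hypothetical serum cytokine matrix (rows = patients, columns = cytokines) and
# disease labels (0 = sarcoidosis, 1 = tuberculosis); not the study's data.
rng = np.random.default_rng(7)
X = rng.normal(size=(55, 10))
y = rng.integers(0, 2, 55)

# Within-sample accuracy is optimistic, so cross-validated accuracy is reported
# alongside it, mirroring the 91% vs 73% contrast in the abstract.
clf = LogisticRegression(max_iter=1000).fit(X, y)
print("within-sample accuracy:", clf.score(X, y))
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```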

Conclusions

These data warrant replication in independent cohorts to further develop and validate a serum cytokine signature that may be able to distinguish sarcoidosis from tuberculosis. Systemic Th2 cytokine differences between sarcoidosis and tuberculosis may also underlie different disease outcomes to similar respiratory stimuli.

16.

Background

Family caregivers of dementia patients are at increased risk of developing depression or anxiety. A multi-component program designed to mobilize support of family networks demonstrated effectiveness in decreasing depressive symptoms in caregivers. However, the impact of an intervention consisting solely of family meetings on depression and anxiety has not yet been evaluated. This study examines the preventive effects of family meetings for primary caregivers of community-dwelling dementia patients.

Methods

A randomized multicenter trial was conducted among 192 primary caregivers of community dwelling dementia patients. Caregivers did not meet the diagnostic criteria for depressive or anxiety disorder at baseline. Participants were randomized to the family meetings intervention (n = 96) or usual care (n = 96) condition. The intervention consisted of two individual sessions and four family meetings which occurred once every 2 to 3 months for a year. Outcome measures after 12 months were the incidence of a clinical depressive or anxiety disorder and change in depressive and anxiety symptoms (primary outcomes), caregiver burden and quality of life (secondary outcomes). Intention-to-treat as well as per protocol analyses were performed.

Results

A substantial number of caregivers (72/192) developed a depressive or anxiety disorder within 12 months. The intervention was not superior to usual care either in reducing the risk of disorder onset (adjusted IRR 0.98; 95% CI 0.69 to 1.38) or in reducing depressive (randomization-by-time interaction coefficient = −1.40; 95% CI −3.91 to 1.10) or anxiety symptoms (randomization-by-time interaction coefficient = −0.55; 95% CI −1.59 to 0.49). The intervention did not reduce caregiver burden or their health related quality of life.

Conclusion

This study did not demonstrate preventive effects of family meetings on the mental health of family caregivers. Further research should determine whether this intervention might be more beneficial if provided in a more concentrated dose, when applied for therapeutic purposes or targeted towards subgroups of caregivers.

Trial Registration

Controlled-Trials.com ISRCTN90163486

17.

Background

There are approximately 3 million people aged 50 and older in sub-Saharan Africa who are HIV-positive. Despite this, little is known about the characteristics of older adults who are on treatment and their treatment outcomes.

Methods

A retrospective cohort analysis was performed using routinely collected data with Malawi Ministry of Health monitoring tools from facilities providing antiretroviral therapy services in Zomba district. Patients aged 25 years and older initiated on treatment from July 2005 to June 2010 were included. Differences in survival, by age group, were determined using Kaplan–Meier survival plots and Cox proportional hazards regression models.
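A minimal sketch of the survival analyses named above, using the lifelines package as one possible tool; the columns and simulated records are placeholders for the cohort's routinely collected variables.

```python
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Hypothetical patient-level records; column names are placeholders.
rng = np.random.default_rng(5)
n = 500
df = pd.DataFrame({
    "time_on_art_months": rng.exponential(36, n),
    "died": rng.integers(0, 2, n),
    "age_60_plus": rng.integers(0, 2, n),
    "male": rng.integers(0, 2, n),
    "advanced_stage": rng.integers(0, 2, n),
})

# Kaplan-Meier survival estimate for each age group.
kmf = KaplanMeierFitter()
for label, grp in df.groupby("age_60_plus"):
    kmf.fit(grp["time_on_art_months"], event_observed=grp["died"], label=str(label))
    print(label, kmf.median_survival_time_)

# Cox proportional hazards model adjusting for sex and stage at initiation.
cph = CoxPHFitter()
cph.fit(df, duration_col="time_on_art_months", event_col="died")
print(cph.summary)  # hazard ratios with 95% CIs
```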

Results

There were 10,888 patients aged 25 and older. Patients aged 50 and older (N = 1419) were more likely to be male (P<0.0001) and located in rural areas (P = 0.003) than those aged 25–49. Crude survival estimates among those aged 50–59 were not statistically different from those aged 25–49 (P = 0.925). However, survival among those aged 60 and older (N = 345) was worse (P = 0.019) than among those aged 25–59. In the proportional hazards model, after controlling for sex and stage at initiation, survival in those aged 50–59 did not differ significantly from those aged 25–49 (hazard ratio 1.00; 95% CI: 0.79 to 1.27; P = 0.998), but the hazard ratio was 1.46 (95% CI: 1.03 to 2.06; P = 0.032) for those aged 60 and older compared to those aged 25–49.

Conclusions

Treatment outcomes of those aged 50–59 are similar to those aged 25–49. A better understanding of how older adults present for and respond to treatment is critical to improving HIV services.

18.

Introduction

Brain derived neurotrophic factor (BDNF) has been implicated in memory, learning, and neurodegenerative diseases. However, the relationship of BDNF with cardiometabolic risk factors is unclear, and the effect of exercise training on BDNF has not been previously explored in individuals with type 2 diabetes.

Methods

Men and women (N = 150) with type 2 diabetes were randomized to an aerobic exercise (aerobic), resistance exercise (resistance), or a combination of both (combination) for 9 months. Serum BDNF levels were evaluated at baseline and follow-up from archived blood samples.

Results

Baseline serum BDNF was not associated with fitness, body composition, anthropometry, glucose control, or strength measures (all p>0.05). Similarly, no significant change in serum BDNF levels was observed following exercise training in the aerobic (−1649.4 pg/ml, CI: −4768.9 to 1470.2), resistance (−2351.2 pg/ml, CI: −5290.7 to 588.3), or combination groups (−827.4 pg/ml, CI: −3533.3 to 1878.5) compared to the control group (−2320.0 pg/ml, CI: −5750.8 to 1110.8). However, reductions in waist circumference were directly associated with changes in serum BDNF following training (r = 0.25, p = 0.005).

Conclusions

Serum BDNF was not associated with fitness, body composition, anthropometry, glucose control, or strength measures at baseline. Likewise, serum BDNF measures were not altered by 9 months of aerobic, resistance, or combination training. However, reductions in waist circumference were associated with decreased serum BDNF levels. Future studies should investigate the relevance of BDNF to measures of cognitive function specifically in individuals with type 2 diabetes.

19.

Introduction

Several authors have underscored a strong relation between the molecular subtypes and the axillary status of breast cancer patients. The aim of our work was to decipher the interaction between this classification and the probability of a positive sentinel node biopsy.

Materials and Methods

Our dataset consisted of a total of 2654 early-stage breast cancer patients. Patients treated first with conservative breast surgery plus sentinel node biopsy were selected. A multivariate logistic regression model was trained and validated. An interaction covariate between the ER and HER2 markers was a forced input of this model. The performance of the multivariate model in the training set and the two validation sets was analyzed in terms of discrimination and calibration. The probability of axillary metastasis was detailed for each molecular subtype.
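A rough sketch of a logistic model with a forced ER-by-HER2 interaction and an AUC-based discrimination check, in the spirit of the model described above; the variable names and data are invented, and the published model also involved calibration testing and external validation sets not reproduced here.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.metrics import roc_auc_score

# Hypothetical patient records; variable names are placeholders for the study's.
rng = np.random.default_rng(6)
n = 2654
df = pd.DataFrame({
    "sn_positive": rng.integers(0, 2, n),
    "er": rng.integers(0, 2, n),
    "her2": rng.integers(0, 2, n),
    "tumour_size_mm": rng.normal(15, 5, n),
    "lvi": rng.integers(0, 2, n),
    "age": rng.integers(30, 80, n),
})

# "er * her2" expands to both main effects plus the forced ER-by-HER2 interaction.
fit = smf.logit("sn_positive ~ er * her2 + tumour_size_mm + lvi + age", data=df).fit()
print(fit.summary())

# Discrimination: area under the ROC curve for the fitted probabilities.
auc = roc_auc_score(df["sn_positive"], fit.predict(df))
print(f"AUC = {auc:.2f}")
```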

Results

The interaction covariate between ER and HER2 status was a stronger predictor (p = 0.0031) of positive sentinel node biopsy than ER status by itself (p = 0.016). A multivariate model to determine the probability of sentinel node positivity was defined with the following variables: tumour size, lympho-vascular invasion, molecular subtype, and age at diagnosis. This model showed similar results in terms of discrimination (AUC = 0.72/0.73/0.72) and calibration (HL p = 0.28/0.05/0.11) in the training and validation sets. The interaction between molecular subtypes, tumour size, and sentinel node status was approximated.

Discussion

We showed that biologically driven analyses can build new models with higher performance for predicting breast cancer axillary status. The molecular subtype classification interacts strongly with the axillary and distant metastasis process.

20.

Background

Cannabis dependence is a significant public health problem. Because there are no approved medications for this condition, treatment must rely on behavioral approaches, empirically complemented by lifestyle changes such as exercise.

Aims

To examine the effects of moderate aerobic exercise on cannabis craving and use in cannabis dependent adults under normal living conditions.

Design

Participants attended 10 supervised 30-min treadmill exercise sessions standardized using heart rate (HR) monitoring (60–70% HR reserve) over 2 weeks. Exercise sessions were conducted by exercise physiologists under medical oversight.

Participants

Sedentary or minimally active non-treatment seeking cannabis-dependent adults (n = 12, age 25±3 years, 8 females) met criteria for primary cannabis dependence using the Substance Abuse module of the Structured Clinical Interview for DSM-IV (SCID).

Measurements

Self-reported drug use was assessed for 1-week before, during, and 2-weeks after the study. Participants viewed visual cannabis cues before and after exercise in conjunction with assessment of subjective cannabis craving using the Marijuana Craving Questionnaire (MCQ-SF).

Findings

Daily cannabis use within the run-in period was 5.9 joints per day (SD = 3.1, range 1.8–10.9). Average cannabis use levels within the exercise (2.8 joints, SD = 1.6, range 0.9–5.4) and follow-up (4.1 joints, SD = 2.5, range 1.1–9.5) periods were lower than during the run-in period (both P<.005). Average MCQ factor scores for the pre- and post-exercise craving assessments were reduced for compulsivity (P  = .006), emotionality (P  = .002), expectancy (P  = .002), and purposefulness (P  = .002).

Conclusions

The findings of this pilot study warrant larger, adequately powered controlled trials to test the efficacy of prescribed moderate aerobic exercise as a component of cannabis dependence treatment. The neurobiological mechanisms that account for these beneficial effects on cannabis use may lead to understanding of the physical and emotional underpinnings of cannabis dependence and recovery from this disorder.

Trial Registration

ClinicalTrials.gov NCT00838448
