Similar Articles
20 similar articles found.
1.

Background

Cannabis dependence is a significant public health problem. Because there are no approved medications for this condition, treatment must rely on behavioral approaches, empirically complemented by lifestyle changes such as exercise.

Aims

To examine the effects of moderate aerobic exercise on cannabis craving and use in cannabis dependent adults under normal living conditions.

Design

Participants attended 10 supervised 30-min treadmill exercise sessions standardized using heart rate (HR) monitoring (60–70% HR reserve) over 2 weeks. Exercise sessions were conducted by exercise physiologists under medical oversight.

Participants

Sedentary or minimally active non-treatment seeking cannabis-dependent adults (n = 12, age 25±3 years, 8 females) met criteria for primary cannabis dependence using the Substance Abuse module of the Structured Clinical Interview for DSM-IV (SCID).

Measurements

Self-reported drug use was assessed for 1 week before, during, and for 2 weeks after the study. Participants viewed visual cannabis cues before and after exercise, in conjunction with assessment of subjective cannabis craving using the Marijuana Craving Questionnaire (MCQ-SF).

Findings

Daily cannabis use within the run-in period was 5.9 joints per day (SD = 3.1, range 1.8–10.9). Average cannabis use levels within the exercise (2.8 joints, SD = 1.6, range 0.9–5.4) and follow-up (4.1 joints, SD = 2.5, range 1.1–9.5) periods were lower than during the run-in period (both P < .005). Average MCQ factor scores for the pre- and post-exercise craving assessments were reduced for compulsivity (P = .006), emotionality (P = .002), expectancy (P = .002), and purposefulness (P = .002).

Conclusions

The findings of this pilot study warrant larger, adequately powered controlled trials to test the efficacy of prescribed moderate aerobic exercise as a component of cannabis dependence treatment. The neurobiological mechanisms that account for these beneficial effects on cannabis use may lead to understanding of the physical and emotional underpinnings of cannabis dependence and recovery from this disorder.

Trial Registration

ClinicalTrials.gov NCT00838448

2.

Introduction

Cannabis is Europe's most commonly used illicit drug. Some users do not develop dependence or other problems, whereas others do. Many factors are associated with the occurrence of cannabis-related disorders, which makes it difficult to identify key risk factors and markers for profiling at-risk cannabis users with traditional hypothesis-driven approaches. This study therefore demonstrates the use of a data-mining technique called binary recursive partitioning by building a classification tree to profile at-risk users.

Methods

59 variables on cannabis use and drug market experiences were extracted from an internet-based survey dataset collected in four European countries (Czech Republic, Italy, Netherlands and Sweden), n = 2617. These 59 potential predictors of problematic cannabis use were used to partition individual respondents into subgroups with low and high risk of having a cannabis use disorder, based on their responses on the Cannabis Abuse Screening Test. Both a generic model for the four countries combined and four country-specific models were constructed.
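The partitioning step described above can be sketched as a tiny CART-style tree builder: at each node, pick the predictor and threshold that minimise Gini impurity, then recurse on the two subgroups. The predictors and toy data below are invented for illustration and are not the survey's variables.

```python
# Minimal binary recursive partitioning sketch (CART-style, Gini criterion).
# Feature 0 mimics "days of cannabis use in the last 12 months" (hypothetical).

def gini(labels):
    """Gini impurity of a binary label list."""
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n
    return 2 * p * (1 - p)

def best_split(rows, labels):
    """Return (score, feature, threshold) minimising weighted Gini, or None."""
    best, n = None, len(rows)
    for f in range(len(rows[0])):
        for t in sorted({r[f] for r in rows})[:-1]:
            left = [y for r, y in zip(rows, labels) if r[f] <= t]
            right = [y for r, y in zip(rows, labels) if r[f] > t]
            score = (len(left) * gini(left) + len(right) * gini(right)) / n
            if best is None or score < best[0]:
                best = (score, f, t)
    return best

def build_tree(rows, labels, depth=0, max_depth=2):
    if depth == max_depth or len(set(labels)) == 1:
        return int(sum(labels) >= len(labels) / 2)  # leaf: majority class
    split = best_split(rows, labels)
    if split is None:
        return int(sum(labels) >= len(labels) / 2)
    _, f, t = split
    left = [(r, y) for r, y in zip(rows, labels) if r[f] <= t]
    right = [(r, y) for r, y in zip(rows, labels) if r[f] > t]
    return (f, t,
            build_tree([r for r, _ in left], [y for _, y in left], depth + 1, max_depth),
            build_tree([r for r, _ in right], [y for _, y in right], depth + 1, max_depth))

def predict(tree, row):
    while isinstance(tree, tuple):
        f, t, lo, hi = tree
        tree = lo if row[f] <= t else hi
    return tree

# Toy data: [use days in last 12 months, age at first use]; 1 = high risk.
X = [[250, 15], [300, 14], [220, 16], [10, 18], [40, 19], [5, 20], [210, 15], [30, 17]]
y = [1, 1, 1, 0, 0, 0, 1, 0]
tree = build_tree(X, y)
```

In this toy set the first split already separates the classes on use days, mirroring the study's finding that frequency of use dominates the generic model.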

Results

Of the 59 variables included in the first analysis step, only three were required to construct a generic partitioning model that classified high-risk cannabis users with 65–73% accuracy. Based on the generic model for the four countries combined, the highest risk of cannabis use disorder was seen in participants reporting cannabis use on more than 200 days in the last 12 months. Compared to the generic model, the country-specific models yielded modest, non-significant improvements in classification accuracy, with the exception of Italy (p = 0.01).

Conclusion

Using recursive partitioning, it is feasible to construct classification trees that use only a few variables yet perform acceptably in classifying cannabis users into groups at low or high risk of meeting criteria for cannabis use disorder. The number of cannabis use days in the last 12 months is the most relevant variable. The identified variables may be considered for use in future screeners for cannabis use disorders.

3.

Background

Bipolar disorder (BD) is a significant cause of functional, cognitive, and social impairment. However, classic studies of functioning and social skills have not investigated how BD may impact behavior on the Internet. Given that the digital age has been changing the way people communicate, this study aims to investigate the pattern of Internet use in patients with BD.

Methods

This cross-sectional study assessed 30 patients with BD I or II and 30 matched controls. Patients were not in an acute mood episode, according to DSM-IV. A standard protocol examined sociodemographic variables and social behavior on the Internet, assessed by the number of Facebook friends (FBN) and the lifetime estimated number of offline contacts (social network number, SNN).

Results

SNN (p<0.001) and FBN (p = 0.036) were significantly lower in patients with BD than in controls. Variables related to Internet use were also significantly lower in patients, e.g., close contacts on Facebook (p = 0.021), Internet experience (p = 0.020), and knowledge of terms associated with social networking sites (p = 0.042). Patients also deviated from the pattern of Internet use expected for their age generation, including poorer knowledge of SNS (p = 0.018) and a lower frequency of Internet use (p = 0.010).

Discussion

This study suggests that patients with BD have smaller social networks both in real-world settings and on the Internet. Patients also tend to use the Internet and social networking sites less frequently and show poorer knowledge of the Internet and social media than healthy controls, below what is expected for their generation. These differences between patients and controls suggest that the effects of BD on social relationships and functioning extend to electronic media.

4.

Background

The tobacco withdrawal syndrome indicates the development of neurophysiologic dependence. Clinical evidence indicates that neurophysiologic dependence develops through a set sequence of symptom presentation that can be assessed with a new 3-item survey measure of wanting, craving, and needing tobacco, the Level of Physical Dependence (PD). This study sought to determine if advancing neurophysiologic dependence as measured by the Level of PD correlates with characteristics of white matter structure measured by Fractional Anisotropy (FA).

Methods

Diffusion-MRI based FA and diffusion tensor imaging probabilistic tractography were used to evaluate 11 smokers and 10 nonsmokers. FA was also examined in relation to two additional measures of dependence severity, the Hooked on Nicotine Checklist (HONC), and the Fagerström Test for Nicotine Dependence (FTND).

Results

Among smokers, FA in the left anterior cingulate bundle (ACb) correlated negatively with the Level of PD (r = −0.68, p = 0.02) and HONC scores (r = −0.65, p = 0.03), but the correlation with the FTND did not reach statistical significance (r = −0.49, p = 0.12). With advancing Levels of PD, the density of streamlines between the ACb and precuneus increased (r = −0.67, p<0.05) and those between the ACb and white matter projecting to the superior-frontal cortex (r = −0.86, p = 0.0006) decreased significantly.

Conclusions

The correlations of neural structure with both the clinical Level of PD survey measure and the HONC suggest that these instruments may reflect the microstructural integrity of white matter as influenced by tobacco abuse. Given that the Level of PD measures a sequence of symptoms of neurophysiologic dependence that develops over time, its correlation with neural structure suggests that these features might represent neuroplastic changes that develop over time to support the development of neurophysiologic dependence.

5.

Background

To assess the contributions of environmental and genetic risk to the transition from health to psychotic disorder, a prospective study of individuals at average (n = 462) and high genetic risk (n = 810) was conducted.

Method

A three-year cohort study examined the rate of transition to psychotic disorder. Binary measures indexing environmental exposure (combining urban birth, cannabis use, ethnicity and childhood trauma) and proxy genetic risk (high-risk sibling status) were used to model transition.

Results

The majority of high-risk siblings (68%) and healthy comparison subjects (60%) had been exposed to one or more environmental risks. The risk of transition was higher in siblings (n = 9, 1.1%) than in healthy comparison subjects (n = 2, 0.4%; ORadj = 2.2, 95% CI: 0.5–10.3). All transitions (100%) were associated with environmental exposure, compared to 65% of non-transitions (p = 0.014), with the greatest effects for childhood trauma (ORadj = 34.4, 95% CI: 4.4–267.4), cannabis use (OR = 4.1, 95% CI: 1.1–15.4), minority ethnic group (OR = 3.8, 95% CI: 1.2–12.8) and urban birth (OR = 3.7, 95% CI: 0.9–15.4). The proportion of transitions in the population attributable to each risk factor was 28% for minority ethnic group, 45% for urban birth, 57% for cannabis use, 86% for childhood trauma, and 50% for high-risk sibling status. Nine of the 11 transitions (82%) were exposed to both genetic and environmental risk, compared to only 43% of non-transitions (p = 0.03).

Conclusion

Environmental risk associated with transition to psychotic disorder is semi-ubiquitous regardless of genetic high-risk status. Careful prospective documentation suggests most transitions can be attributed to powerful environmental effects that become detectable when analysed against elevated background genetic risk, indicating gene-environment interaction.

6.
7.

Aims

To investigate functional platelet recovery after preoperative withdrawal of aspirin and clopidogrel and platelet function 5 days after treatment resumption.

Methods/Results

We conducted an observational study that prospectively included consecutive patients taking aspirin, patients taking clopidogrel, and untreated controls (15 per group). The antiplatelet drugs were withdrawn five days before surgery (baseline) and reintroduced two days after surgery. Platelet function was evaluated by optical aggregometry in the presence of collagen, arachidonic acid (aspirin) and ADP (clopidogrel), and by the VASP assay (clopidogrel). Platelet-leukocyte complex (PLC) levels were quantified at each time-point. At baseline, platelet function was efficiently inhibited by aspirin and had recovered fully in most patients five days after drug withdrawal. PLC levels five days after aspirin reintroduction were similar to baseline (+4±10%; p = 0.16), consistent with effective platelet inhibition. Chronic clopidogrel treatment was associated with variable platelet inhibition, and its withdrawal led to variable functional recovery. PLC levels were significantly increased five days after clopidogrel reintroduction (+10±15%; p = 0.02) compared to baseline.

Conclusions

Aspirin withdrawal five days before high-bleeding-risk procedures was associated with functional platelet recovery, and its reintroduction two days after surgery restored antiplatelet efficacy five days later. This was not the case for clopidogrel, and further work is therefore needed to define its optimal perioperative management.

8.

Background

Treatment of end-stage renal disease (ESRD) patients with short daily hemodialysis has been associated with an improvement in blood pressure. It is unclear from these studies whether anti-hypertensive management had been optimized prior to starting short daily hemodialysis. Also, the potential mechanism(s) of blood pressure improvement remain to be fully elucidated.

Study Design, Setting and Participants

We undertook a randomized cross-over trial in adult hypertensive patients with ESRD treated with conventional hemodialysis to determine: 1) whether short daily hemodialysis is associated with a reduction in systolic blood pressure after a 3-month blood pressure optimization period and 2) the potential mechanism(s) of blood pressure reduction. Blood pressure was measured using Canadian Hypertension Education Program guidelines. Extracellular fluid volume (ECFV) was assessed with bioimpedance. Serum catecholamines were used to assess sympathetic nervous system activity. Interleukin-6 (IL-6) and thiobarbituric acid reactive substances (T-BARS) were used as markers of inflammation and oxidative stress, respectively.

Results

After a 3-month run-in phase in which systolic blood pressure improved, there was no significant difference in pre-dialysis systolic pressure between short-daily and conventional hemodialysis (p = 0.39). However, similar blood pressures were achieved on fewer anti-hypertensive medications with short daily hemodialysis compared to conventional hemodialysis (p = 0.01). Short daily hemodialysis, compared to conventional hemodialysis, was not associated with a difference in dry weight or ECFV (p = 0.77). Sympathetic nervous system activity as assessed by plasma epinephrine (p = 1.0) and norepinephrine (p = 0.52) was also not different. Markers of inflammation (p = 0.42) and oxidative stress (p = 0.83) were also similar between the two treatment arms.

Conclusions

Patients treated with short daily hemodialysis, compared to conventional hemodialysis, have similar blood pressure control on fewer anti-hypertensive medications. The mechanism(s) by which short daily hemodialysis allows for decreased anti-hypertensive medication use remains unclear, but effects on sodium balance and changes in peripheral vascular resistance require further study.

Trial Registration

ClinicalTrials.gov NCT00759967

9.

Background/Objectives

It has been hypothesized that consuming most of the daily caloric intake later in the day leads to metabolic disadvantages, but few studies are available on this topic. The aim of our study was to prospectively examine whether eating a larger share of the daily caloric intake at dinner leads to an increased risk of obesity, hyperglycemia, metabolic syndrome, and non-alcoholic fatty liver disease (NAFLD).

Subjects/Methods

1245 non-obese, non-diabetic middle-aged adults from a population-based cohort completed a 3-day food record questionnaire at enrollment. Anthropometric values, blood pressure, blood metabolic variables, and estimated liver fat were measured at baseline and at 6-year follow-up.

Design

Prospective cohort study.

Results

Subjects were divided according to tertiles of the percentage of daily caloric intake consumed at dinner. A significant increase in the incidence of obesity (from 4.7 to 11.4%), metabolic syndrome (from 11.1 to 16.1%), and estimated NAFLD (from 16.5 to 23.8%) was observed from the lowest to the highest tertile. In a multiple logistic regression model adjusted for multiple covariates, subjects in the highest tertile showed an increased risk of developing obesity (OR = 2.33; 95% CI 1.17–4.65; p = 0.02), metabolic syndrome (OR = 1.52; 95% CI 1.01–2.30; p = 0.04), and NAFLD (OR = 1.56; 95% CI 1.10–2.22; p = 0.01).
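Odds ratios of the kind reported above come from a 2×2 table of exposed/unexposed versus cases/non-cases; a Wald 95% confidence interval is then built on the log scale. A minimal sketch follows; the counts are made up and are not this study's data.

```python
# Odds ratio and Wald 95% CI from a 2x2 table (illustrative counts only).
import math

def odds_ratio_ci(a, b, c, d):
    """a/b = cases/non-cases among exposed; c/d = cases/non-cases among unexposed."""
    or_ = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical: 40/375 incident cases in the highest tertile vs 20/810 elsewhere.
or_, lo, hi = odds_ratio_ci(40, 375, 20, 810)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

A CI whose lower bound exceeds 1 corresponds to the significant associations (p < 0.05) reported in the paragraph above.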

Conclusions

Consuming more of the daily energy intake at dinner is associated with an increased risk of obesity, metabolic syndrome, and NAFLD.

10.

Objectives

To investigate the presence and the nature of cognitive impairment in a large sample of patients with Multiple Sclerosis (MS), and to identify clinical and demographic determinants of cognitive impairment in MS.

Methods

303 patients with MS and 279 healthy controls were administered the Brief Repeatable Battery of Neuropsychological tests (BRB-N); measures of pre-morbid verbal competence and neuropsychiatric measures were also administered.

Results

Patients and healthy controls were matched for age, gender, education and pre-morbid verbal Intelligence Quotient. Cognitive impairment was present in 108/303 patients (35.6%). In the overall group of participants, the significant predictors of the most sensitive BRB-N scores were presence of MS, age, education, and vocabulary. When considering MS patients only, the significant predictors were course of MS, age, education, vocabulary, and depression. In logistic regression analyses, significant determinants of cognitive impairment in relapsing-remitting MS patients were duration of illness (OR = 1.053, 95% CI = 1.010–1.097, p = 0.015), Expanded Disability Status Scale score (OR = 1.247, 95% CI = 1.024–1.517, p = 0.028), and vocabulary (OR = 0.960, 95% CI = 0.936–0.984, p = 0.001), whereas in the smaller group of progressive MS patients these predictors did not play a significant role in determining cognitive outcome.

Conclusions

Our results corroborate the evidence on the presence and nature of cognitive impairment in a large sample of patients with MS. Furthermore, our findings identify, for the first time, significant clinical and demographic determinants of cognitive impairment in a large sample of MS patients. Implications for further research and clinical practice are discussed.

11.
12.

Background

The efficacy of tumor necrosis factor alpha (TNF-α) blockers for ulcerative colitis that is unresponsive to conventional therapy is unclear, as recent studies have yielded conflicting results.

Aim

To assess the efficacy and safety of anti-TNF-α agents for treatment of ulcerative colitis patients who were intolerant or refractory to conventional medical therapy.

Methods

PubMed, Embase, and the Cochrane database were searched. Analysis was performed on randomized controlled trials assessing anti-TNF-α therapy in ulcerative colitis patients who had previously failed therapy with corticosteroids and/or immunosuppressants. The primary outcome was the proportion of patients achieving clinical remission. Secondary outcomes included rates of steroid-free remission, extent of mucosal healing, and the number of colectomies and serious side effects.

Results

Eight trials from seven studies (n = 2122) met the inclusion criteria and were included in the analysis. TNF-α blockers showed clinical benefit compared to placebo, as evidenced by increased frequencies of clinical remission (p<0.00001), steroid-free remission (p = 0.01) and endoscopic remission (p<0.00001), and a decreased frequency of colectomy (p = 0.03). No difference was found for serious side effects (p = 0.05). Three small trials (n = 57) comparing infliximab to corticosteroid treatment showed no difference in frequency of clinical remission (p = 0.93), mucosal healing (p = 0.80), or requirement for colectomy (p = 0.49). One trial compared infliximab to cyclosporine (n = 115) and found no difference in mucosal healing (p = 0.85), colectomy frequency (p = 0.60) or serious side effects (p = 0.23).

Conclusion

TNF-α blockers are effective and safe for the induction and maintenance of long-term remission, and for the avoidance of colectomy, in patients with refractory ulcerative colitis in whom conventional treatment was ineffective. Furthermore, infliximab and cyclosporine were found to be comparable for treating acute severe steroid-refractory ulcerative colitis.

13.

Background

Substance misuse is associated with cognitive dysfunction. We used a stop signal task to examine deficits in cognitive control in individuals with opioid dependence (OD). We examined how response inhibition and post-error slowing are compromised and whether methadone maintenance treatment (MMT), abstinence duration, and psychiatric comorbidity are related to these measures in individuals with OD.

Methods

264 men with OD who were incarcerated at a detention center and abstinent for up to 2 months (n = 108) or at a correctional facility and abstinent for approximately 6 months (n = 156), 65 OD men under MMT at a psychiatric clinic, and 64 age- and education-matched healthy control (HC) participants were assessed. We computed the stop signal reaction time (SSRT) to index the capacity for response inhibition and post-error slowing (PES) to represent error-related behavioral adjustment, as in our previous work. We examined group effects with analyses of variance and covariance, followed by planned comparisons. Specifically, we compared OD and HC participants to examine the effects of opioid dependence and MMT, and compared OD sub-groups to examine the effects of abstinence duration and psychiatric comorbidity.
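The abstract does not state which SSRT estimator was used; a common choice in stop-signal work is the integration method, which takes the go-RT percentile matching the probability of responding on stop trials and subtracts the mean stop-signal delay (SSD). A sketch with invented numbers:

```python
# Integration-method estimate of stop-signal reaction time (SSRT).
# All reaction times and delays below are illustrative, in milliseconds.

def ssrt_integration(go_rts, p_respond_on_stop, mean_ssd):
    """SSRT = nth-percentile go RT minus mean SSD, where n is the
    probability of responding on stop-signal trials."""
    rts = sorted(go_rts)
    idx = int(round(p_respond_on_stop * len(rts))) - 1
    idx = max(0, min(idx, len(rts) - 1))  # clamp to a valid index
    return rts[idx] - mean_ssd

go_rts = [420, 450, 480, 500, 510, 530, 560, 580, 600, 650]
print(ssrt_integration(go_rts, 0.5, 250))  # 510 - 250 = 260 ms
```

A longer SSRT, as reported for the OD group below, indicates a slower (worse) inhibitory process.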

Results

The SSRT was significantly prolonged in OD but not MMT individuals, as compared to HC. The extent of post-error slowing was diminished in OD and MMT as compared to HC (trend; p = 0.061), with no difference between the OD and MMT groups. Individuals with longer abstinence were no less impaired on these measures. Furthermore, these results remained when psychiatric comorbidities, including misuse of other substances, were accounted for.

Conclusions

Methadone treatment appears to be associated with relatively intact cognitive control in opioid-dependent individuals. MMT may facilitate public health by augmenting cognitive control and thereby mitigating risky behaviors in heroin addicts.

14.

Background

Infections with different herpes viruses have been associated with cognitive functioning in psychiatric patients and healthy adults. The aim of this study was to find out whether antibodies to different herpes viruses are prospectively associated with cognitive functioning in a general adolescent population.

Methods

This study was performed in TRAILS, a large prospective general population cohort (N = 1084, 54% female, mean age 16.2 years (SD 0.6)). At age 16, immunoglobulin G antibodies against HSV1, HSV2, CMV and EBV were measured, alongside high-sensitivity C-reactive protein (hsCRP). Two years later, immediate memory and executive functioning were assessed using the 15 Words Task and the Self-Ordered Pointing Task. Multiple linear regression analysis with bootstrapping was performed to study the association between viral infections and cognitive function, adjusting for gender, socioeconomic status, ethnicity, and cannabis use.

Results

Presence of HSV1 antibodies was associated with memory function (B = −0.272, 95% CI = −0.556 to −0.016, p = 0.047), while the association with executive functioning did not reach statistical significance (B = 0.560, 95% CI = −0.053 to 1.184, p = 0.075). The level of HSV1 antibodies was associated with both memory function (B = −0.160, 95% CI = −0.280 to −0.039, p = 0.014) and executive functioning (B = 0.296, 95% CI = 0.011 to 0.578, p = 0.046). Other herpes viruses and hsCRP were not associated with cognitive functioning.

Conclusions

Both presence and level of HSV1 antibodies are prospectively associated with reduced cognitive performance in a large cohort of adolescents.

15.

Objective

Retrograde trans-synaptic degeneration of the retinal ganglion cell layer (GCL) has been proposed as one of the mechanisms contributing to permanent disability after visual pathway damage. We set out to test this mechanism, taking advantage of new methods for high-resolution imaging of the macula by optical coherence tomography (OCT), in patients with lesions of the posterior visual pathway. Additionally, we explored the association between GCL thinning as an imaging marker and visual impairment such as visual field defects.

Methods

Retrospective case note review of patients with retrogeniculate lesions studied by spectral domain OCT of the macula and quadrant pattern deviation (PD) of the visual fields.

Results

We analysed 8 patients with either hemianopia or quadrantanopia due to brain lesions (stroke = 5; surgery = 2; infection = 1). First, we found significant thinning of the GCL in the projecting sector of the retina mapping to the brain lesion. Second, we found a strong correlation between the PD of the visual field quadrant and the corresponding macular GCL sector for the right (R = 0.792, p<0.001) and left eyes (R = 0.674, p<0.001).

Conclusions

The mapping between lesions in the posterior visual pathway and their projection onto the macular GCL corroborates retrograde trans-synaptic neuronal degeneration after brain injury as a mechanism of damage with functional consequences. This finding supports the use of GCL thickness as an imaging marker of trans-synaptic degeneration in the visual pathway after brain lesions.

16.

Context

Cost-effective, scalable programs are urgently needed in countries deeply affected by HIV.

Methods

This parallel-group RCT was conducted in four secondary schools in Mbarara, Uganda. Participants were 12 years and older, reported past-year computer or Internet use, and provided informed caregiver permission and youth assent. The intervention, CyberSenga, was a five-hour online healthy sexuality program. Half of the intervention group was further randomized to receive a booster at four months post-intervention. The control arm received 'treatment as usual' (i.e., school-delivered sexuality programming). The main outcome measures were: 1) condom use and 2) abstinence in the past three months, assessed at six months post-intervention. Secondary outcomes were the same measures at three months post-intervention, and six-month outcomes by booster exposure. Analyses were intention-to-treat.

Results

All 416 eligible youth were invited to participate, 88% (n = 366) of whom enrolled. Participants were randomized to the intervention (n = 183) or control (n = 183) arm; 91 intervention participants were further randomized to the booster. No statistically significant results were noted among the main outcomes. Among the secondary outcomes: At three-month follow-up, trends suggested that intervention participants (81%) were more likely to be abstinent than control participants (74%; p = 0.08), and this was particularly true among youth who were abstinent at baseline (88% vs. 77%; p = 0.02). At six-month follow-up, those in the booster group (80%) reported higher rates of abstinence than youth in the intervention, no booster (57%) and control (55%) groups (p = 0.15); they also reported lower rates of unprotected sex (5%) compared to youth in the intervention, no booster (24%) and control (21%) groups (p = 0.21) among youth sexually active at baseline.

Conclusions

The CyberSenga program may affect HIV preventive behavior among abstinent youth in the short term and, with the booster, may also promote HIV preventive behavior among sexually active youth in the longer term.

Trial Registration

NCT00906178.

17.

Background

Poor sleep is a frequent symptom in patients with multiple sclerosis (MS). Sleep may be influenced by MS-related symptoms and by adverse effects of immunotherapy and symptomatic medications. We aimed to study the prevalence of poor sleep and the influence of socio-demographic and clinical factors on sleep quality in MS patients.

Methods

A total of 90 MS patients and 108 sex- and age-matched controls were included in a questionnaire survey. Sleep complaints were evaluated with the Pittsburgh Sleep Quality Index (PSQI), and the global PSQI score was used to separate good sleepers (≤5) from poor sleepers (>5). Excessive daytime sleepiness, use of immunotherapy and antidepressant drugs, symptoms of pain, depression and fatigue, and MS-specific health-related quality of life were registered. Results were compared between patients and controls and between good and poor sleepers among MS patients.
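The good/poor sleeper split described above is a simple threshold rule on the global PSQI score; a one-line sketch (the example scores are illustrative):

```python
# Good vs. poor sleeper classification on the global PSQI score (range 0-21):
# a score of 5 or less marks a good sleeper, above 5 a poor sleeper.

def sleeper_group(global_psqi: float) -> str:
    return "good" if global_psqi <= 5 else "poor"

print(sleeper_group(4), sleeper_group(8.6))  # good poor
```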

Results

MS patients reported a higher mean global PSQI score than controls (8.6 vs. 6.3, p = 0.001), and 67.1% of the MS patients, compared to 43.9% of the controls (p = 0.002), were poor sleepers. Pain (p = 0.02), fatigue (p = 0.001), depression (p = 0.01) and female gender (p = 0.04) were associated with sleep disturbance. Multivariate analyses showed that female gender (p = 0.02), use of immunotherapy (p = 0.05) and a high psychological burden of MS (p = 0.001) were associated with poor sleep among MS patients.

Conclusions

Poor sleep is common in patients with MS. Early identification and treatment of modifiable risk factors may improve sleep and quality of life in MS.

18.

Background

Pneumonia is one of the most prevalent infectious diseases worldwide and is associated with considerable mortality. Compared with the general population, schizophrenia patients hospitalized for pneumonia have poorer outcomes. We explored the risk factors for short-term mortality in this population because such information is lacking in the literature.

Methods

In a nationwide schizophrenia cohort, derived from the National Health Insurance Research Database in Taiwan, that was hospitalized for pneumonia between 2000 and 2008 (n = 1,741), we identified 141 subjects who died during their hospitalizations or shortly after discharge. Based on risk-set sampling in a 1:4 ratio, 468 matched controls were selected from the study cohort (i.e., the schizophrenia cohort with pneumonia). Physical illnesses were categorized as pre-existing or incident illnesses that developed after pneumonia. Exposure to medications was categorized by type, duration, and defined daily dose. We used stepwise conditional logistic regression to explore the risk factors for short-term mortality.

Results

Pre-existing arrhythmia was associated with short-term mortality (adjusted risk ratio [RR] = 4.99, p<0.01). Several variables during hospitalization were associated with increased mortality risk, including incident arrhythmia (RR = 7.44, p<0.01), incident heart failure (RR = 5.49, p = 0.0183) and the use of hypoglycemic drugs (RR = 2.32, p<0.01). Furthermore, individual antipsychotic drugs (such as clozapine) known to induce pneumonia were not significantly associated with the risk.

Conclusions

Incident cardiac complications following pneumonia are associated with increased short-term mortality. These findings have broad implications for clinical intervention, and future studies are needed to clarify the mechanisms underlying these risk factors.

19.
20.

Introduction

Balance deficits are identified as important risk factors for falling in individuals with chronic obstructive pulmonary disease (COPD). However, the specific use of proprioception, which is of primary importance during balance control, has not been studied in individuals with COPD. The objective was to determine the specific proprioceptive control strategy during postural balance in individuals with COPD and healthy controls, and to assess whether this was related to inspiratory muscle weakness.

Methods

Center of pressure displacement was determined in 20 individuals with COPD and 20 age/gender-matched controls during upright stance on an unstable support surface without vision. Ankle and back muscle vibration were applied to evaluate the relative contribution of different proprioceptive signals used in postural control.

Results

Individuals with COPD showed increased anterior-posterior body sway during upright stance (p = 0.037). Compared to controls, individuals with COPD showed increased posterior body sway during ankle muscle vibration (p = 0.047), decreased anterior body sway during back muscle vibration (p = 0.025), and increased posterior body sway during simultaneous ankle and back muscle vibration (p = 0.002). Individuals with COPD with the weakest inspiratory muscles showed the greatest reliance on ankle muscle input compared to stronger individuals with COPD (p = 0.037).

Conclusions

Individuals with COPD, especially those with inspiratory muscle weakness, increased their reliance on ankle muscle proprioceptive signals and decreased their reliance on back muscle proprioceptive signals during balance control, resulting in decreased postural stability compared to healthy controls. These proprioceptive changes may be due to an impaired postural contribution of the inspiratory muscles to trunk stability. Further research is required to determine whether interventions such as proprioceptive training and inspiratory muscle training improve postural balance and reduce fall risk in individuals with COPD.
