20 similar documents found (search time: 31 ms)
1.
Aims
Patients with renal failure develop cardiovascular alterations that contribute to their higher rate of cardiac death. Blockade of the renin-angiotensin system ameliorates the development of such changes. It is unclear, however, to what extent ACE inhibitors can also reverse existing cardiovascular alterations. We therefore investigated the effect of high-dose enalapril treatment on these alterations.
Methods
Male Sprague Dawley rats underwent subtotal nephrectomy (SNX, n = 34) or sham operation (sham, n = 39). Eight weeks after surgery, rats were sacrificed or allocated to treatment with high-dose enalapril, a furosemide/dihydralazine combination, or solvent for 4 weeks. Heart and aorta were evaluated using morphometry, stereological techniques and TaqMan PCR.
Results
After 8 and 12 weeks, systolic blood pressure, albumin excretion, and left ventricular weight were significantly higher in untreated SNX than in sham. Twelve weeks after SNX, a significantly higher volume density of cardiac interstitial tissue (2.57±0.43% in SNX vs 1.50±0.43% in sham, p<0.05) and a significantly lower capillary length density (4532±355 mm/mm³ in SNX vs 5023±624 mm/mm³ in sham, p<0.05) were found. Treatment of SNX with enalapril from week 8 to 12 significantly reduced myocardial fibrosis (1.63±0.25%, p<0.05), but corrected neither the capillary reduction (3908±486 mm/mm³) nor the increased intercapillary distance. In contrast, alternative antihypertensive treatment showed no such effect. Significantly increased media thickness, together with decreased vascular smooth muscle cell number and a disarray of elastic fibres, was found in the aorta of SNX animals compared to sham. Both antihypertensive treatments failed to cause complete regression of these alterations.
Conclusions
The study indicates that high-dose ACE inhibitor treatment causes partial, but not complete, reversal of cardiovascular changes in SNX.
2.
Kennelly KP, Wallace DM, Holmes TM, Hankey DJ, Grant TS, O'Farrelly C, Keegan DJ. PLoS ONE. 2011;6(6):e21365
Purpose
Graft failure remains an obstacle to experimental subretinal cell transplantation. A key step is preparing a viable graft, as high levels of necrosis and apoptosis increase the risk of graft failure. Retinal grafts are commonly harvested from cell cultures. We termed the graft preparation procedure “transplant conditions” (TC). We hypothesized that culture conditions influenced graft viability, and investigated whether viability decreased following TC using a mouse retinal pigment epithelial (RPE) cell line, DH01.
Methods
Cell viability was assessed by trypan blue exclusion. Levels of apoptosis and necrosis in vitro were determined by flow cytometry for annexin V and propidium iodide and Western blot analysis for the pro- and cleaved forms of caspases 3 and 7. Graft viability in vivo was established by terminal deoxynucleotidyl transferase dUTP nick end labeling (TUNEL) and cleaved caspase 3 immunolabeling of subretinal allografts.
Results
Pre-confluent cultures had significantly fewer nonviable cells than post-confluent cultures (6.6%±0.8% vs. 13.1%±0.9%, p<0.01). Cell viability in either group was not altered significantly following TC. Caspases 3 and 7 were not altered by levels of confluence or following TC. Pre-confluent cultures had low levels of apoptosis/necrosis (5.6%±1.1%) that did not increase following TC (4.8%±0.5%). However, culturing beyond confluence led to progressively increasing levels of apoptosis and necrosis (up to 16.5%±0.9%). Allografts prepared from post-confluent cultures had significantly more TUNEL-positive cells 3 hours post-operatively than grafts of pre-confluent cells (12.7%±3.1% vs. 4.5%±1.4%, p<0.001). Subretinal grafts of post-confluent cells also had significantly higher rates of cleaved caspase 3 than pre-confluent grafts (20.2%±4.3% vs. 7.8%±1.8%, p<0.001).
Conclusion
Pre-confluent cells should be used to maximize graft cell viability.
3.
4.
Meuwese MC, Broekhuizen LN, Kuikhoven M, Heeneman S, Lutgens E, Gijbels MJ, Nieuwdorp M, Peutz CJ, Stroes ES, Vink H, van den Berg BM. PLoS ONE. 2010;5(12):e14262
Objective
Functional studies show that disruption of the endothelial surface layer (ESL) is accompanied by enhanced sensitivity of the vasculature towards atherogenic stimuli. However, the relevance of ESL disruption as a causal mechanism for vascular dysfunction remains to be demonstrated. We examined whether loss of the ESL through enzymatic degradation affects vascular barrier properties in an atherogenic model.
Methods
Eight-week-old male apolipoprotein E-deficient mice on a Western-type diet for 10 weeks received continuous active or heat-inactivated hyaluronidase (10 U/hr, i.v.) through an osmotic minipump during 4 weeks. Blood chemistry and anatomic changes in both the macrovasculature and the kidneys were examined.
Results
Infusion with active hyaluronidase resulted in decreased ESL (0.32±0.22 mL) and plasma volume (1.03±0.18 mL) compared to inactivated hyaluronidase (0.52±0.29 mL and 1.28±0.08 mL, respectively; p<0.05). Active hyaluronidase increased proteinuria compared to inactive hyaluronidase (0.27±0.02 vs. 0.15±0.01 µg/µg protein/creatinine, p<0.05) without changes in glomerular morphology or development of tubulo-interstitial inflammation. Atherosclerotic lesions in the aortic branches showed increased matrix production (collagen, 32±5% vs. 18±3%; glycosaminoglycans, 11±5% vs. 0.1±0.01%; active vs. inactive hyaluronidase, p<0.05).
Conclusion
ESL degradation in apoE-deficient mice contributes to increased urinary protein excretion without significant changes in renal morphology. In addition, the induction of compositional changes in atherogenic plaques by hyaluronidase points towards increased plaque vulnerability. These findings support further efforts to evaluate whether ESL restoration is a valuable target to prevent (micro)vascular disease progression.
5.
Brinkley TE, Jerosch-Herold M, Folsom AR, Carr JJ, Hundley WG, Allison MA, Bluemke DA, Burke GL, Szklo M, Ding J. PLoS ONE. 2011;6(12):e28410
Background
Pericardial fat has adverse effects on the surrounding vasculature. Previous studies suggest that pericardial fat may contribute to myocardial ischemia in symptomatic individuals. However, it is unknown whether pericardial fat has similar effects in asymptomatic individuals.
Methods
We determined the association between pericardial fat and myocardial blood flow (MBF) in 214 adults with no prior history of cardiovascular disease from the Minnesota field center of the Multi-Ethnic Study of Atherosclerosis (43% female, 56% Caucasian, 44% Hispanic). Pericardial fat volume was measured by computed tomography. MBF was measured by MRI at rest and during adenosine-induced hyperemia. Myocardial perfusion reserve (PR) was calculated as the ratio of hyperemic to resting MBF.
Results
Gender-stratified analyses revealed significant differences between men and women, including less pericardial fat (71.9±31.3 vs. 105.2±57.5 cm³, p<0.0001) and higher resting MBF (1.12±0.23 vs. 0.93±0.19 ml/min/g, p<0.0001), hyperemic MBF (3.49±0.76 vs. 2.65±0.72 ml/min/g, p<0.0001), and PR (3.19±0.78 vs. 2.93±0.89, p = 0.03) in women. Correlations between pericardial fat and clinical and hemodynamic variables were stronger in women. In women only (p = 0.01 for gender interaction), higher pericardial fat was associated with higher resting MBF (p = 0.008). However, this association was attenuated after accounting for body mass index or rate-pressure product. There were no significant associations between pericardial fat and hyperemic MBF or PR after multivariate adjustment in either gender. In logistic regression analyses there was also no association between impaired coronary vasoreactivity, defined as having a PR <2.5, and pericardial fat in men (OR, 1.18; 95% CI, 0.82–1.70) or women (OR, 1.11; 95% CI, 0.68–1.82).
Conclusions
Our data fail to support an independent association between pericardial fat and myocardial perfusion in adults without symptomatic cardiovascular disease. Nevertheless, these findings highlight potentially important differences between asymptomatic and symptomatic individuals with respect to the underlying subclinical disease burden.
6.
Jankowska EA, Filippatos GS, von Haehling S, Papassotiriou J, Morgenthaler NG, Cicoira M, Schefold JC, Rozentryt P, Ponikowska B, Doehner W, Banasiak W, Hartmann O, Struck J, Bergmann A, Anker SD, Ponikowski P. PLoS ONE. 2011;6(1):e14506
Objectives
We hypothesised that assessment of plasma C-terminal pro-endothelin-1 (CT-proET-1), a stable endothelin-1 precursor fragment, is of prognostic value in patients with chronic heart failure (CHF), beyond other prognosticators, including N-terminal pro-B-type natriuretic peptide (NT-proBNP).
Methods
We examined 491 patients with systolic CHF (age: 63±11 years, 91% men, New York Heart Association [NYHA] class [I/II/III/IV]: 9%/45%/38%/8%, 69% ischemic etiology). Plasma CT-proET-1 was detected using a chemiluminescence immunoassay.
Results
Increasing CT-proET-1 was a predictor of increased cardiovascular mortality at 12 months of follow-up (standardized hazard ratio 1.42, 95% confidence interval [CI] 1.04–1.95, p = 0.03) after adjusting for NT-proBNP, left ventricular ejection fraction (LVEF), age, creatinine, and NYHA class. In receiver operating characteristic curve analysis, areas under the curve for 12-month follow-up were similar for CT-proET-1 and NT-proBNP (p = 0.40). Both NT-proBNP and CT-proET-1 added prognostic value to a base model that included LVEF, age, creatinine, and NYHA class. Adding CT-proET-1 to the base model had stronger prognostic power (p<0.01) than adding NT-proBNP (p<0.01). Adding CT-proET-1 to NT-proBNP in this model yielded further prognostic information (p = 0.02).
Conclusions
Plasma CT-proET-1 constitutes a novel predictor of increased 12-month cardiovascular mortality in patients with CHF. High CT-proET-1 together with high NT-proBNP identifies patients with CHF who have particularly unfavourable outcomes.
7.
Purpose
Mitochondrial disease is the most common neuromuscular disease and has a profound impact upon daily life, disease and longevity. Exercise therapy has been shown to improve mitochondrial function in patients with mitochondrial disease. However, no information exists about the level of habitual physical activity of people with mitochondrial disease and its relationship with clinical phenotype.
Methods
Habitual physical activity, genotype and clinical presentations were assessed in 100 patients with mitochondrial disease. Comparisons were made with a control group individually matched by age, gender and BMI.
Results
Patients with mitochondrial disease had significantly lower levels of physical activity in comparison to matched people without mitochondrial disease (steps/day; 6883±3944 vs. 9924±4088, p = 0.001). Seventy-eight percent of the mitochondrial disease cohort did not achieve 10,000 steps per day, and 48% were classified as overweight or obese. Mitochondrial disease was associated with fewer breaks in sedentary activity (Sedentary to Active Transitions, % per day; 13±0.03 vs. 14±0.03, p = 0.001) and an increase in sedentary bout duration (bout lengths / fraction of total sedentary time; 0.206±0.044 vs. 0.187±0.026, p = 0.001). After adjusting for covariates, higher physical activity was moderately associated with lower clinical disease burden (steps/day; rs = −0.49; 95% CI −0.33, −0.63, p<0.01). There were no systematic differences in physical activity between different genotypes of mitochondrial disease.
Conclusions
These results demonstrate for the first time that low levels of physical activity are prominent in mitochondrial disease. Combined with a high prevalence of obesity, physical activity may constitute a significant and potentially modifiable risk factor in mitochondrial disease.
8.
Background
There is increasing recognition that pulmonary artery stiffness is an important determinant of right ventricular (RV) afterload in pulmonary arterial hypertension (PAH). We used intravascular ultrasound (IVUS) to evaluate the mechanical properties of the elastic pulmonary arteries (PA) in subjects with PAH, and assessed the effects of PAH-specific therapy on indices of arterial stiffness.
Method
Using IVUS and simultaneous right heart catheterisation, 20 pulmonary segments in 8 PAH subjects and 12 pulmonary segments in 8 controls were studied to determine their compliance, distensibility, elastic modulus and stiffness index β. PAH subjects underwent repeat IVUS examinations after 6 months of bosentan therapy.
Results
At baseline, PAH subjects demonstrated greater stiffness in all measured indices compared to controls: compliance (1.50±0.11×10⁻² mm²/mmHg vs 4.49±0.43×10⁻² mm²/mmHg, p<0.0001), distensibility (0.32±0.03%/mmHg vs 1.18±0.13%/mmHg, p<0.0001), elastic modulus (720±64 mmHg vs 198±19 mmHg, p<0.0001), and stiffness index β (15.0±1.4 vs 11.0±0.7, p = 0.046). Strong inverse exponential associations existed between mean pulmonary artery pressure and compliance (r² = 0.82, p<0.0001), and also between mean PAP and distensibility (r² = 0.79, p = 0.002). Bosentan therapy for 6 months was not associated with significant changes in any index of PA stiffness.
Conclusion
Increased stiffness occurs in the proximal elastic PA in patients with PAH and contributes to the pathogenesis of RV failure. Bosentan therapy may not be effective at improving PA stiffness.
9.
Janiszewski PM, Ross R, Despres JP, Lemieux I, Orlando G, Carli F, Bagni P, Menozzi M, Zona S, Guaraldi G. PLoS ONE. 2011;6(9):e25032
Background
Although half of HIV-infected patients develop lipodystrophy and metabolic complications, there exists no simple clinical screening tool to discern high- from low-risk HIV-infected patients. We therefore evaluated the associations between waist circumference (WC) combined with triglyceride (TG) levels and the severity of lipodystrophy and cardiovascular risk among HIV-infected men and women.
Methods
1481 HIV-infected men and 841 HIV-infected women were recruited between 2005 and 2009 at the metabolic clinic of the University of Modena and Reggio Emilia in Italy. Within each gender, patients were categorized into 4 groups according to WC and TG levels. Total and regional fat and fat-free mass were assessed by dual-energy X-ray absorptiometry, and visceral adipose tissue (VAT) and abdominal subcutaneous AT (SAT) were quantified by computed tomography. Various cardiovascular risk factors were assessed in clinic after an overnight fast.
Results
The high TG/high WC men had the most VAT (208.0±94.4 cm²), as well as the highest prevalence of metabolic syndrome (42.2%) and type-2 diabetes (16.2%), and the highest Framingham risk score (10.3±6.5), in comparison to the other groups (p<0.05 for all). High TG/high WC women also had elevated VAT (150.0±97.9 cm²), a higher prevalence of metabolic syndrome (53.3%), hypertension (30.5%) and type-2 diabetes (12.0%), and a higher Framingham risk score (2.9±2.8) by comparison to low TG/low WC women (p<0.05 for all).
Conclusions
A simple tool combining WC and TG levels can discriminate high- from low-risk HIV-infected patients.
10.
Nankabirwa J, Cundill B, Clarke S, Kabatereine N, Rosenthal PJ, Dorsey G, Brooker S, Staedke SG. PLoS ONE. 2010;5(10):e13438
Background
Intermittent preventive treatment (IPT) is a promising malaria control strategy; however, the optimal regimen remains unclear. We conducted a randomized, single-blinded, placebo-controlled trial to evaluate the efficacy, safety, and tolerability of a single course of sulfadoxine-pyrimethamine (SP), amodiaquine + SP (AQ+SP) or dihydroartemisinin-piperaquine (DP) among schoolchildren to inform IPT.
Methods
Asymptomatic girls aged 8 to 12 years and boys aged 8 to 14 years enrolled in two primary schools in Tororo, Uganda, were randomized to receive one of the study regimens or placebo, regardless of the presence of parasitemia at enrollment, and followed for 42 days. The primary outcome was risk of parasitemia at 42 days. Survival analysis was used to assess differences between regimens.
Results
Of 780 enrolled participants, 769 (98.6%) completed follow-up and were assigned a treatment outcome. The risk of parasitemia at 42 days varied significantly between DP (11.7% [95% confidence interval (CI): 7.9, 17.1]), AQ+SP (44.3% [37.6, 51.5]), and SP (79.7% [95% CI: 73.6, 85.2], p<0.001). The risk of parasitemia in SP-treated children was not different from that in those receiving placebo (84.6% [95% CI: 79.1, 89.3], p = 0.22). No serious adverse events occurred, but AQ+SP was associated with an increased risk of vomiting compared to placebo (13.0% [95% CI: 9.1, 18.5] vs. 4.7% [95% CI: 2.5, 8.8], respectively, p = 0.003).
Conclusions
DP was the most efficacious and well-tolerated regimen tested, although AQ+SP appears to be a suitable alternative for IPT in schoolchildren. Use of SP for IPT may not be appropriate in areas with high-level SP resistance in Africa.
Trial Registration
ClinicalTrials.gov NCT00852371
11.
Background
Augmentation cystoplasty (AC) with autogenous ileum remains the current gold standard surgical treatment for many patients with end-stage bladder disease. However, the presence of mucus-secreting epithelium within the bladder is associated with debilitating long-term complications. Currently, decellularised biological materials derived from porcine extracellular matrix (ECM) are under investigation as potential augmentation scaffolds. Important biomechanical limitations of ECMs are decreased bladder capacity and poor compliance after implantation.
Methodology/Principal Findings
In the present ex vivo study, a novel concept was investigated in which ECM scaffolds with a two-fold larger surface area relative to the resected ileal segment were compared with ileum in ovine bladder models after AC. Results showed that bladder capacity increased by 40±4% and 37±11% at 10 mmHg, and compliance by 40.4±4% and 39.7±6% (ΔP = 0–10 mmHg), after AC with ileum and porcine urinary bladder matrix (UBM), respectively (p<0.05). Comparative assessment between ileum and UBM demonstrated no significant differences in bladder capacity or compliance increases after AC (p>0.05).
Conclusions
These findings may have important clinical implications, as metabolic, infective and malignant complications precipitated by mucus-secreting epithelium are potentially avoided after augmentation with ECM scaffolds.
12.
Background
For successful cardiac resynchronisation therapy (CRT), a spatial and electrical separation of right and left ventricular electrodes is essential. The spatial distribution of electrical delays within the coronary sinus (CS) tributaries has not yet been identified.
Objective
Electrical delays within the CS are described during sinus rhythm (SR) and right ventricular pacing (RVP). A coordinate system grading the mitral ring from 0° to 360°, together with three vertical segments, is proposed to define the lead positions irrespective of individual CS branch orientation.
Methods
In 13 patients undergoing implantation of a CRT device, 6±2.5 (median 5) lead positions within the CS were mapped during SR and RVP. The delay to the onset and the peak of the local signal was measured from the earliest QRS activation or the pacing spike. Fluoroscopic positions were compared to localizations on a nonfluoroscopic electrode imaging system.
Results
During SR, electrical delays in the CS were inhomogeneous in patients with or without left bundle branch block (LBBB). During RVP, the delays increased by 44±32 ms (signal onset from 36±33 ms to 95±30 ms, p<0.001; signal peak from 105±44 ms to 156±30 ms, p<0.001). The activation pattern during RVP was homogeneous and predictable by taking the grading on the CS ring into account: %QRS = 78 − 0.002·(grade − 162)², p<0.0001. This indicates that a maximum peak delay of 78% of the QRS duration can be expected at 162° on the CS ring.
Conclusion
Electrical delays within the CS vary during SR, but prolong and become predictable during RVP. A coordinate system helps predicting the local delays and facilitates interindividual comparison of lead positions irrespective of CS branch anatomy.
13.
Al-Aqeedi R, Asaad N, Al-Qahtani A, Singh R, Al Binali HA, Al Mulla AW, Al Suwaidi J. PLoS ONE. 2012;7(7):e40571
Objectives
Clinical characteristics and trends in the outcome of acute coronary syndrome (ACS) in patients with prior coronary artery bypass graft surgery (CABG) are unclear. The aim of this study was to evaluate clinical characteristics, in-hospital treatment, and outcomes in patients presenting with ACS with or without a history of prior CABG over 2 decades.
Methods
Data were derived from a hospital-based study that collected data from 1991 through 2010 on patients hospitalized with ACS in Doha, Qatar. Data were analyzed according to history of prior CABG. Baseline clinical characteristics, in-hospital treatment, and outcomes were compared.
Results
A total of 16,750 consecutive patients with ACS were studied, of whom 693 (4.1%) had prior CABG. Patients with prior CABG were older (mean 60.5±11 vs. 53±12 years; p = 0.001), more likely to be female, and had more cardiovascular risk factors than the non-CABG group. Prior CABG patients had larger infarct sizes and were less likely to receive reperfusion therapy or early invasive therapy, but more likely to receive evidence-based therapies, when compared to non-CABG patients. In-hospital mortality and stroke rates were comparable between the 2 groups. Over 2 decades, there was a reduction in in-hospital mortality and stroke rates in both groups (CABG: death, 13.2% to 4%; stroke, 1.9% to 0.0%; non-CABG: death, 10% to 3.2%; stroke, 1.0% to 0.1%; all p = 0.001).
Conclusion
There was a significant reduction in in-hospital morbidity and mortality among ACS patients with prior CABG over the 20-year period.
14.
Background
Acute coronary syndrome (ACS) is common in patients approaching the end of life (EoL), but these patients rarely receive palliative care. We compared the utility of a palliative care prognostic tool (Gold Standards Framework (GSF)) and the Global Registry of Acute Coronary Events (GRACE) score to help identify patients approaching EoL.
Methods and Findings
172 unselected consecutive patients with confirmed ACS admitted over an eight-week period were assessed using the prognostic tools and followed up for 12 months. GSF criteria identified 40 (23%) patients suitable for EoL care, while GRACE identified 32 (19%) patients with ≥10% risk of death within 6 months. Patients meeting GSF criteria were older (p = 0.006), had more comorbidities (1.6±0.7 vs. 1.2±0.9, p = 0.007), more frequent hospitalisations before (p = 0.001) and after (p = 0.0001) their index admission, and were more likely to die during follow-up (GSF+ 20% vs GSF− 7%, p = 0.03). GRACE score was predictive of 12-month mortality (C-statistic 0.75), and this was improved by the addition of previous hospital admissions and previous history of stroke (C-statistic 0.88).
Conclusions
This study has highlighted a potentially large number of ACS patients eligible for EoL care. GSF or GRACE could be used in the hospital setting to help identify these patients. GSF identifies ACS patients with more comorbidity and at increased risk of hospital readmission.
15.
Versluis B, Dremmen MH, Nelemans PJ, Wildberger JE, Schurink GW, Leiner T, Backes WH. PLoS ONE. 2012;7(3):e31514
Objectives
The aim of this work was to develop an MRI method to determine arterial flow reserve in patients with intermittent claudication and to investigate whether this method can discriminate between patients and healthy control subjects.
Methods
Ten consecutive patients with intermittent claudication and 10 healthy control subjects were included. All subjects underwent vector cardiography-triggered quantitative 2D cine MR phase-contrast imaging to obtain flow waveforms of the popliteal artery at rest and during reactive hyperemia. Resting flow, maximum hyperemic flow and absolute flow reserve were determined and compared between the two groups by two independent MRI readers. Interreader reproducibility of the flow measures was also reported.
Results
Resting flow was lower in patients than in controls (4.9±1.6 vs. 11.1±3.2 mL/s in patients and controls, respectively; p<0.01). Maximum hyperemic flow was 7.3±2.9 vs. 16.4±3.2 mL/s (p<0.01), and absolute flow reserve was 2.4±1.6 vs. 5.3±1.3 mL/s (p<0.01), in patients and controls respectively. The interreader coefficient of variation was below 10% for all measures in both patients and controls.
Conclusions
Quantitative 2D MR cine phase-contrast imaging is a promising method to determine flow reserve measures in patients with peripheral arterial disease and can be helpful to discriminate patients with intermittent claudication from healthy controls.
16.
Background
Sports-related head trauma is common, but there is still no established laboratory test for the diagnosis of minimal or mild traumatic brain injury. Furthermore, the effects of recurrent head trauma on brain injury markers are unknown. The purpose of this study was to investigate the relationship between Olympic (amateur) boxing and cerebrospinal fluid (CSF) brain injury biomarkers.
Methods
The study was designed as a prospective cohort study. Thirty Olympic boxers with a minimum of 45 bouts and 25 non-boxing matched controls were included in the study. CSF samples were collected by lumbar puncture 1–6 days after a bout and after a rest period of at least 14 days. The controls were tested once. Biomarkers for acute and chronic brain injury were analysed.
Results
NFL (mean±SD: 532±553 vs 135±51 ng/L, p = 0.001), GFAP (496±238 vs 247±147 ng/L, p<0.001), T-tau (58±26 vs 49±21 ng/L, p<0.025) and S-100B (0.76±0.29 vs 0.60±0.23 ng/L, p = 0.03) concentrations were significantly increased after boxing compared to controls. NFL (402±434 ng/L, p = 0.004) and GFAP (369±113 ng/L, p = 0.001) concentrations remained elevated after the rest period.
Conclusion
Increased CSF levels of T-tau, NFL, GFAP, and S-100B in >80% of the boxers demonstrate that both the acute and the cumulative effects of head trauma in Olympic boxing may induce CSF biomarker changes suggesting minor central nervous system injury. The lack of normalization of NFL and GFAP after the rest period in a subgroup of boxers may indicate ongoing degeneration. The recurrent head trauma in boxing may be associated with an increased risk of chronic traumatic brain injury.
17.
Background
The importance of neonatal experience upon behaviour in later life is increasingly recognised. The overlap between pain and reward pathways led us to hypothesise that neonatal pain experience influences reward-related pathways and behaviours in adulthood.
Methodology/Principal Findings
Rat pups received repeat plantar skin incisions (neonatal IN) or control procedures (neonatal anesthesia only, AN) at postnatal days (P)3, 10 and 17. When adult, rats with neonatal ‘pain history’ showed greater sensory sensitivity than control rats following acute plantar skin incision. Motivational behaviour in the two groups of rats was tested in a novelty-induced hypophagia (NIH) paradigm. The sensitivity of this paradigm to pain-induced changes in motivational behaviour was shown by significant increases in the time spent in the central zone of the arena (43.7±5.9% vs. 22.5±6.7%, p<0.05), close to centrally placed food treats, and a decreased number of rears (9.5±1.4 vs. 19.2±2.3, p<0.001) in rats with acute plantar skin incision compared to naive, uninjured animals. Rats with a neonatal ‘pain history’ showed the same pain-induced behaviour in the novelty-induced hypophagia paradigm as controls. However, differences were observed in reward-related neural activity between the two groups. Two hours after behavioural testing, brains were harvested and neuronal activity mapped using c-Fos expression in lateral hypothalamic orexin neurons, part of a specific reward-seeking pathway. Pain-induced activity in orexin neurons of control rats (18.4±2.8%) was the same as in uninjured naive animals (15.5±2.6%), but in those rats with a ‘pain history’, orexinergic activity was significantly increased (27.2±4.1%, p<0.01). Furthermore, the extent of orexin neuron activation in individual rats with a ‘pain history’ was highly correlated with their motivational behaviour (r = −0.86, p = 0.01).
Conclusions/Significance
These results show that acute pain alters motivational behaviour and that neonatal pain experience causes long-term changes in brain motivational orexinergic pathways, known to modulate mesolimbic dopaminergic reward circuitry.
18.
Background
Hypercapnic chronic obstructive pulmonary disease (COPD) exacerbation in patients with comorbidities and multidrug therapy is complicated by mixed acid-base, hydro-electrolyte and lactate disorders. The aim of this study was to determine the relationships of these disorders with the requirement for, and duration of, noninvasive ventilation (NIV) when treating hypercapnic respiratory failure.
Methods
Sixty-seven consecutive patients who were hospitalized for hypercapnic COPD exacerbation had their clinical condition, respiratory function, blood chemistry, arterial blood gases, blood lactate and volemic state assessed. Heart and respiratory rates, pH, PaO2, PaCO2 and blood lactate were checked at the 1st, 2nd, 6th and 24th hours after starting NIV.
Results
Nine patients were transferred to the intensive care unit. NIV was performed in 11/17 (64.7%) mixed respiratory acidosis–metabolic alkalosis, 10/36 (27.8%) respiratory acidosis and 3/5 (60%) mixed respiratory-metabolic acidosis patients (p = 0.026), with durations of 45.1±9.8, 36.2±8.9 and 53.3±4.1 hours, respectively (p = 0.016). The duration of ventilation was associated with higher blood lactate (p<0.001), lower pH (p = 0.016), lower serum sodium (p = 0.014) and lower chloride (p = 0.038). Hyponatremia without hypervolemic hypochloremia occurred in 11 respiratory acidosis patients. Hypovolemic hyponatremia with hypochloremia and hypokalemia occurred in 10 mixed respiratory acidosis–metabolic alkalosis patients, and euvolemic hypochloremia occurred in the other 7 patients with this mixed acid-base disorder.
Conclusions
Mixed acid-base and lactate disorders during hypercapnic COPD exacerbations predict the need for and longer duration of NIV. The combination of mixed acid-base disorders and hydro-electrolyte disturbances should be further investigated.
19.
Reduced exercise tolerance and pulmonary capillary recruitment with remote secondhand smoke exposure
Rationale
Flight attendants who worked on commercial aircraft before the smoking ban on flights (pre-ban FAs) were exposed to high levels of secondhand smoke (SHS). We previously showed never-smoking pre-ban FAs to have reduced diffusing capacity (Dco) at rest.
Methods
To determine whether pre-ban FAs increase their Dco and pulmonary blood flow during exercise, we administered a symptom-limited, supine, progressively increasing cycle exercise test to determine the maximum work (watts) and oxygen uptake achieved by FAs. After 30 min of rest, we then measured Dco and pulmonary blood flow at 20, 40, 60, and 80 percent of maximum observed work.
Results
The FAs with abnormal resting Dco achieved a lower level of maximum predicted work and oxygen uptake compared to those with normal resting Dco (mean±SEM: 88.7±2.9 vs. 102.5±3.1% predicted; p = 0.001). Exercise limitation was associated with the FAs' FEV1 (r = 0.33; p = 0.003). The Dco increased less with exercise in those with abnormal resting Dco (mean±SEM: 1.36±0.16 vs. 1.90±0.16 ml/min/mmHg per 20% increase in predicted watts; p = 0.020), and amongst all FAs, the increase with exercise seemed to be incrementally lower in those with lower resting Dco. The exercise-induced increase in pulmonary blood flow was not different between the two groups. However, the FAs with abnormal resting Dco had less augmentation of their Dco per unit increase in pulmonary blood flow during exercise (mean±SEM: 0.93±0.06 vs. 1.47±0.09 ml/min/mmHg per L/min; p<0.0001). The Dco during exercise was inversely associated with years of exposure to SHS in those FAs with ≥10 years of pre-ban experience (r = −0.32; p = 0.032).
Conclusions
This cohort of never-smoking FAs with SHS exposure showed exercise limitation based on their resting Dco. Those with lower resting Dco had reduced pulmonary capillary recruitment. Exposure to SHS in the aircraft cabin seemed to be a predictor of lower Dco during exercise.
20.
David G. Zacharias, Sung Gyun Kim, Alfonso Eirin Massat, Adi R. Bachar, Yun K. Oh, Joerg Herrmann, Martin Rodriguez-Porcel, Pinchas Cohen, Lilach O. Lerman, Amir Lerman. PLoS ONE. 2012;7(2)