Similar Articles (20 results)
1.

Purpose

To investigate the characteristics of macular ganglion cell-inner plexiform layer (GCIPL) thickness profiles associated with ocular dominance.

Setting

Private practice, Seoul, Republic of Korea.

Design

Comparative case-control study.

Methods

Both eyes of 199 participants with no ophthalmic abnormalities were included. Participants were imaged by spectral-domain optical coherence tomography and underwent dominant-eye testing using a hole-in-a-card test (sighting dominance) at the same visit. Macular GCIPL thickness, as well as circumpapillary retinal nerve fiber layer (RNFL) thickness, was compared for individual patients according to ocular dominance.

Results

Ocular dominance occurred predominantly in the right eye (right vs. left: 72.36% vs. 27.60%; P < 0.001). In the comparison of macular GCIPL thickness, the average (81.27±5.01 μm vs. 80.66±6.31 μm in dominant vs. non-dominant eyes), inferonasal (81.39±5.47 μm vs. 80.33±6.82 μm), and inferior sectors (77.95±6.05 μm vs. 76.97±8.15 μm) were significantly different between dominant and non-dominant eyes (P = 0.040, 0.005, and 0.032, respectively). Significant predictors of average GCIPL thickness were spherical equivalent (β = 1.37, P < 0.001), astigmatic power (β = 1.44, P = 0.009), disc area (β = 3.90, P < 0.001), average RNFL thickness (β = 0.22, P < 0.001), average cup-to-disc ratio (β = 5.74, P = 0.002), difference between the inferior and superior quadrant RNFL thicknesses (β = 0.08, P = 0.024), and ocular dominance (β = 2.10, P = 0.020). On multivariate regression analysis, ocular dominance was correlated with average GCIPL thickness after adjusting for potential confounders (β = 1.63, P = 0.048).

Conclusions

Dominant eyes had significantly thicker average macular GCIPL than non-dominant eyes. This finding suggests that macular GCIPL thickness may provide an indicator of the relative dominance of an eye.

2.

Background

Complementary and alternative medicine (CAM) use has become increasingly popular among patients with cancer. The purposes of this study were to compare quality of life (QOL) between CAM users and non-CAM users and to determine whether CAM use influences QOL among breast cancer patients during chemotherapy.

Methodology

A cross-sectional survey was conducted at two outpatient chemotherapy centers. A total of 546 patients completed the questionnaires on CAM use. QOL was evaluated based on the European Organization for Research and Treatment of Cancer (EORTC) core quality of life (QLQ-C30) and breast cancer-specific quality of life (QLQ-BR23) questionnaires.

Results

A total of 70.7% of patients were identified as CAM users. There was no significant difference in global health status scores or in any of the five subscales of the QLQ-C30 functional scales between CAM users and non-CAM users. On the QLQ-C30 symptom scales, CAM users (44.96±3.89) had significantly (p = 0.01) higher mean scores for financial difficulties than non-CAM users (36.29±4.81). On the QLQ-BR23 functional scales, CAM users reported significantly higher mean scores for sexual enjoyment (6.01±12.84 vs. 4.64±12.76, p = 0.04) than non-CAM users. On the QLQ-BR23 symptom scales, CAM users reported higher systemic therapy side effects (41.34±2.01 vs. 37.22±2.48, p = 0.04) and breast symptoms (15.76±2.13 vs. 11.08±2.62, p = 0.02) than non-CAM users. Multivariate logistic regression analysis indicated that the use of a CAM modality was not significantly associated with higher global health status scores (p = 0.71).

Conclusion

While the findings indicated no significant difference between users and non-users of CAM in terms of QOL, CAM use may serve health professionals as a surrogate for identifying patients with greater systemic therapy side effects and breast symptoms. Furthermore, given that CAM users reported higher financial burdens (which may have contributed to increased distress), patients should be encouraged to discuss the potential benefits and disadvantages of CAM use with their healthcare providers.

3.

Objective

We wished to determine the prevalence of fever as one of the first symptoms of the enthesitis-related arthritis (ERA) subtype of juvenile idiopathic arthritis, and to ascertain whether ERA patients with fever at disease onset differed from those without fever.

Methods

Consecutive cases of ERA were diagnosed and followed in a retrospective observational study from 1998 to 2013. Clinical and laboratory data, medications, magnetic resonance imaging (MRI) findings, and disease activity during the study period were also recorded.

Results

A total of 146 consecutive ERA patients were assessed. Among them, 52 patients (35.6%) had fever as one of the first symptoms at disease onset. Compared with ERA patients without fever at disease onset, patients with fever had significantly more painful joints (3.5 vs. 2.8), more swollen joints (1.1 vs. 0.8), and more enthesitis (1.0 vs. 0.4) (p<0.05 for all comparisons). Patients with fever had significantly higher mean values of erythrocyte sedimentation rate, C-reactive protein, platelet count, and child health assessment questionnaire (CHAQ) scores (40.8 vs. 26.4 mm/h; 20.7 vs. 9.7 mg/dL; 353.2×109/L vs. 275.6×109/L; 1.0 vs. 0.8, respectively; all p<0.05). During two-year follow-up, CHAQ score, number of flares, as well as the number of patients treated with oral non-steroidal anti-inflammatory drugs, corticosteroids and combination therapy with disease-modifying anti-rheumatic drugs, were significantly higher in ERA patients with fever.

Conclusions

Fever was a frequent manifestation of ERA. ERA patients with fever had more active disease at disease onset and poorer outcomes than those without fever.

4.

Introduction

Multimodality monitoring is regularly employed in adult traumatic brain injury (TBI) patients where it provides physiologic and therapeutic insight into this heterogeneous condition. Pediatric studies are less frequent.

Methods

An analysis of data collected prospectively from 12 pediatric TBI patients admitted to the Addenbrooke's Hospital Pediatric Intensive Care Unit (PICU) between August 2012 and December 2014 was performed. Patients' intracranial pressure (ICP), mean arterial pressure (MAP), and cerebral perfusion pressure (CPP) were monitored continuously using ICM+® brain monitoring software. The pressure reactivity index (PRx) and 'optimal CPP' (CPPopt) were calculated. Patient outcome was dichotomized into survivors and non-survivors.

Results

At 6 months, 8/12 (66%) of the cohort had survived the TBI. The median (±IQR) ICP was significantly lower in survivors (13.1±3.2 mm Hg) than in non-survivors (21.6±42.9 mm Hg; p = 0.003). The median time spent with ICP over 20 mm Hg was lower in survivors (9.7±9.8% vs. 60.5±67.4% in non-survivors; p = 0.003). Although there was no evidence that CPP differed between survival groups, the time spent with CPP close to (within 10 mm Hg of) the optimal CPP was significantly longer in survivors (90.7±12.6%) than in non-survivors (70.6±21.8%; p = 0.02). PRx provided significant outcome separation, with a median PRx of 0.02±0.19 in survivors compared to 0.39±0.62 in non-survivors (p = 0.02).

Conclusion

Our observations provide evidence that multimodality monitoring may be useful in pediatric TBI, with ICP, deviation of CPP from CPPopt, and PRx correlating with patient outcome.

5.

Background

Immunonutrition in sepsis, including n-3 polyunsaturated fatty acid (PUFA) or L-arginine supplementation, is a controversial issue that has generated a great number of studies over the last thirty-five years, yet the conclusions regarding the quantity and quality of this support in patients remain disappointing. The aim of the present experimental study was to investigate the effects of pretreatment with enteral nutrition enriched with n-3 PUFAs or L-arginine on vascular dysfunction, inflammation, and oxidative stress during septic shock in rats.

Design

Rats were fed enteral Peptamen® HN (HN group), Peptamen® AF containing n-3 PUFAs (AF group), or Peptamen® AF enriched with L-arginine (AFA group). On day 4, peritonitis was induced by cecal ligation and puncture (CLP). Rats were resuscitated (H18) once septic shock was established. After 4 hours of resuscitation, vessels and organs were harvested to assess inflammation and superoxide anion, nitric oxide, and prostacyclin levels. Ex vivo vascular reactivity was also assessed.

Results

Compared to the CLP-AF and CLP-HN groups, 47.6% of CLP-AFA rats died before the beginning of hemodynamic measurements (vs. 8.0% and 20.0%, respectively; p<0.05). AF and AFA rats required significantly higher norepinephrine infusion rates to reach the mean arterial pressure objective, compared to CLP-HN rats. Both CLP-AF and CLP-AFA reduced mesenteric resistance artery contractility and decreased vascular oxidative stress, but increased NF-κB expression (0.40±0.15 in CLP-AF and 0.69±0.06 in CLP-AFA vs. 0.09±0.03 in SHAM rats and 0.30±0.06 in CLP-HN, β-actin ratio, p<0.05), pIκB expression (0.60±0.03 in CLP-AF and 0.94±0.15 in CLP-AFA vs. 0.04±0.01 in SHAM rats and 0.56±0.07 in CLP-HN, β-actin ratio, p<0.05), and nitric oxide and prostacyclin production in septic rats.

Conclusions

Although n-3 PUFA or L-arginine supplementation exhibited an antioxidant effect, it worsened septic shock-induced vascular dysfunction. Furthermore, mortality was higher after L-arginine supplementation.

6.

Background and objective

The Acute Physiology and Chronic Health Evaluation (APACHE) III score has been widely used for the prediction of clinical outcomes in mixed critically ill patients. However, it has not been validated in patients with sepsis-associated acute lung injury (ALI). The aim of this study was to explore the calibration and predictive value of APACHE III in patients with sepsis-associated ALI.

Method

The study was a secondary analysis of a prospective randomized controlled trial investigating the efficacy of rosuvastatin in sepsis-associated ALI (Statins for Acutely Injured Lungs from Sepsis, SAILS). The study population was sepsis-related ALI patients. The primary outcome of the current study was the same as in the original trial, 60-day in-hospital mortality, defined as death before hospital discharge, censored 60 days after enrollment. Discrimination of APACHE III was assessed by calculating the area under the receiver operating characteristic (ROC) curve (AUC) with its 95% CI. Hosmer-Lemeshow goodness-of-fit statistic was used to assess the calibration of APACHE III. The Brier score was reported to represent the overall performance of APACHE III in predicting outcome.
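As a rough illustration of the two performance measures named above, here is a minimal sketch using made-up outcomes and predicted probabilities (not the SAILS data): the AUC via the rank-sum (Mann-Whitney) formulation, and the Brier score as the mean squared gap between predicted probability and observed outcome.

```python
# Illustrative sketch only: hypothetical outcomes and predicted probabilities
# (NOT the SAILS data) showing how discrimination (AUC) and overall
# performance (Brier score) are computed.

def auc(labels, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) formulation."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def brier(labels, probs):
    """Brier score: mean squared gap between predicted probability and outcome."""
    return sum((p - l) ** 2 for l, p in zip(labels, probs)) / len(labels)

died = [1, 1, 0, 0, 0, 1, 0, 0]                  # hypothetical 60-day outcomes
prob = [0.8, 0.4, 0.3, 0.2, 0.5, 0.7, 0.1, 0.6]  # hypothetical predictions

print(round(auc(died, prob), 2), round(brier(died, prob), 3))
```

An AUC near 0.5 means no discrimination and 1.0 perfect discrimination, which is why the study's 0.68 is described as moderate; a lower Brier score indicates better overall performance.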

Main results

A total of 745 patients were included in the study: 540 survivors and 205 non-survivors. Non-survivors were significantly older than survivors (59.71±16.17 vs. 52.00±15.92 years, p<0.001). The primary causes of ALI also differed between survivors and non-survivors (p = 0.017); survivors were more likely than non-survivors to have sepsis as the primary cause (21.2% vs. 15.1%). The APACHE III score was higher in non-survivors than in survivors (106.72±27.30 vs. 88.42±26.86; p<0.001). Discrimination of APACHE III for predicting mortality in ALI patients was moderate, with an AUC of 0.68 (95% confidence interval: 0.64–0.73).

Conclusion

This study is the first to validate the discrimination of APACHE III in sepsis-associated ALI patients. The results show that the APACHE III score has moderate predictive value for in-hospital mortality among adults with sepsis-associated acute lung injury.

7.

Objective

We investigated whether and to what extent cystatin C was associated with angiographic coronary collateralization in patients with stable coronary artery disease and chronic total occlusion.

Methods

Serum levels of cystatin C and high-sensitivity C-reactive protein (hsCRP) and the glomerular filtration rate (GFR) were determined in 866 patients with stable angina and angiographic total occlusion of at least one major coronary artery. The degree of collaterals supplying the distal aspect of a total occlusion from the contralateral vessel was graded as poor (Rentrop score of 0 or 1) or good (Rentrop score of 2 or 3) coronary collateralization.

Results

Serum cystatin C was higher in patients with poor collateralization than in those with good collateralization (1.08 ± 0.32 mg/L vs. 0.90 ± 0.34 mg/L, P < 0.001), and correlated inversely with Rentrop score (adjusted Spearman's r = -0.145, P < 0.001). The prevalence of poor coronary collateralization increased stepwise with increasing cystatin C quartiles (P for trend < 0.001). After adjusting for age, gender, risk factors for coronary artery disease, GFR, and hsCRP, serum cystatin C ≥ 0.97 mg/L remained independently associated with poor collateralization (OR 2.374, 95% CI 1.660–3.396, P < 0.001). The diagnostic value of cystatin C levels for detecting poor coronary collateralization persisted regardless of age, gender, and the presence or absence of diabetes, hypertension, or renal dysfunction.
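The inverse correlation reported above is a Spearman rank correlation. As a minimal sketch (with a tiny hypothetical data set, not the study's 866 patients), the coefficient can be computed by ranking both variables, with tied values sharing their average rank, and taking the Pearson correlation of the ranks:

```python
# Minimal Spearman rank-correlation sketch (hypothetical values, NOT the
# study's data): rank both variables, then correlate the ranks.

def ranks(xs):
    """Average ranks (1-based), with tied values sharing their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        for k in range(i, j + 1):
            r[order[k]] = (i + j) / 2 + 1  # mean of the tied rank positions
        i = j + 1
    return r

def spearman(x, y):
    """Pearson correlation computed on the ranks of x and y."""
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

cystatin = [1.3, 1.1, 0.9, 0.8]  # hypothetical cystatin C values (mg/L)
rentrop = [0, 1, 2, 3]           # hypothetical Rentrop grades
print(spearman(cystatin, rentrop))  # perfectly inverse ranking here
```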

Conclusions

Serum cystatin C reflects angiographic coronary collateralization in patients with stable coronary artery disease, and cystatin C ≥ 0.97 mg/L indicates a high risk of poor coronary collaterals.

8.

Background

The current clinical classification of chronic kidney disease (CKD) is not perfect and may be overestimating both the prevalence and the risk for progressive disease. Novel markers are being sought to identify those at risk of progression. This preliminary study evaluates the feasibility of magnetic resonance imaging based markers to identify early changes in CKD.

Methods

Fifty-nine subjects (22 healthy, 7 anemic subjects with no renal disease, and 30 subjects with CKD) participated. Data were acquired using 3D volume imaging, blood oxygenation level dependent (BOLD), and diffusion MRI. BOLD MRI acquisition was repeated after 20 mg of IV furosemide.

Results

Compared to healthy subjects, those with CKD had lower renal parenchymal volumes (329.6±66.4 vs. 257.1±87.0 ml, p<0.005), higher cortical R2* values (19.7±3.2 vs. 23.2±6.3 s−1, p = 0.013), suggesting higher levels of hypoxia, and a lower response to furosemide in medullary R2* (6.9±3.3 vs. 3.1±7.5 s−1, p = 0.02). All three parameters showed a significant correlation with estimated glomerular filtration rate (eGFR). When the groups were matched for age and sex, cortical R2* and kidney volume still showed significant differences between CKD and healthy controls. The most interesting observation is that a small number of subjects (8 of 29) contributed to the increase in mean value observed in CKD. The difference in cortical R2* between these subjects and the rest was highly significant, with a large effect size (Cohen's d = 3.5). While highly suggestive, future studies are necessary to verify whether such higher levels of hypoxia are indicative of progressive disease. Diffusion MRI showed no differences between CKD and healthy controls.

Conclusions

These data demonstrate that BOLD MRI can be used to identify enhanced hypoxia associated with CKD, and the preliminary observations are consistent with the chronic hypoxia model of disease progression in CKD. Longitudinal studies are warranted to further verify these findings and assess their predictive value.

9.

Aims

Growth arrest-specific protein 6 (Gas6) is a vitamin K-dependent protein expressed by endothelial cells and leukocytes that is involved in cell survival, migration, and proliferation in response to inflammatory processes. The aim of this study was to assess the implications of Gas6 in Sjögren syndrome (SS) and its expression in the labial salivary gland.

Methods and Results

A total of 254 adults, including 159 with primary Sjögren syndrome (pSS), 34 with secondary Sjögren syndrome (sSS), and 61 normal controls, were recruited. Plasma Gas6 concentrations were determined, and Gas6 expression in labial salivary gland (LSG) tissues from controls and pSS and sSS patients was also evaluated. Plasma Gas6 concentrations were significantly lower in patients with pSS than in normal controls (13.5 ± 8.6 vs. 19.9 ± 13.4 ng/ml, p < 0.001). There were, however, no significant differences in plasma Gas6 levels between pSS and sSS patients (13.5 ± 8.6 vs. 16.9 ± 11.2 ng/ml, p = 0.068). In multivariate logistic regression analysis, after adjustment for white blood cell count, hemoglobin level, platelet count, lymphocyte count, and C3 and C4 levels, lower plasma Gas6 concentrations were significantly associated with an increased risk of SS. Moreover, using a semi-quantitative scale, Gas6 expression was found to be markedly lower in LSG tissues from pSS patients than in tissues from normal controls.

Conclusions

Decreased plasma Gas6 concentration and LSG expression were associated with pSS. As such, Gas6 may represent a novel independent risk factor for pSS, with a potential role in salivary gland inflammation and dysfunction.

10.

Background

The analysis of heart rate variability (HRV) has been shown to be a promising non-invasive technique for assessing cardiac autonomic modulation in trauma. The aim of this study was to evaluate HRV during hemorrhagic shock and fluid resuscitation, in comparison with traditional hemodynamic and metabolic parameters.

Methods

Twenty anesthetized and mechanically ventilated pigs were submitted to hemorrhagic shock (60% of estimated blood volume) and evaluated for 60 minutes without fluid replacement. Surviving animals were treated with Ringer solution and evaluated for an additional period of 180 minutes. HRV metrics (time and frequency domain), as well as hemodynamic and metabolic parameters, were evaluated in surviving and non-surviving animals.
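As a minimal illustration of the time-domain metrics mentioned above (with a short, made-up RR-interval series, not the study's recordings), two standard measures are SDNN and RMSSD:

```python
# Minimal sketch of two time-domain HRV metrics (made-up RR intervals in ms,
# NOT the study's recordings): SDNN and RMSSD.

import math

def sdnn(rr_ms):
    """Standard deviation of all RR (NN) intervals, in ms."""
    mean = sum(rr_ms) / len(rr_ms)
    return math.sqrt(sum((x - mean) ** 2 for x in rr_ms) / len(rr_ms))

def rmssd(rr_ms):
    """Root mean square of successive RR-interval differences, in ms."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d ** 2 for d in diffs) / len(diffs))

rr = [812, 800, 795, 810, 830, 825, 805, 790]  # hypothetical RR intervals (ms)
print(round(sdnn(rr), 1), round(rmssd(rr), 1))
```

SDNN reflects overall variability, while RMSSD is weighted toward beat-to-beat (parasympathetically mediated) changes; frequency-domain metrics such as the LF/HF ratio require spectral analysis of the same RR series.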

Results

Seven of the 20 animals died during hemorrhage and initial fluid resuscitation. All animals presented an increase in time-domain HRV measures during hemorrhage, and fluid resuscitation restored baseline values. Although not significantly, the normalized low-frequency power and LF/HF ratio decreased during the early stages of hemorrhage, recovered baseline values later during hemorrhagic shock, and increased after fluid resuscitation. At 30 minutes after hemorrhagic shock, non-surviving animals presented significantly lower mean arterial pressure (43±7 vs. 57±9 mmHg, P<0.05) and cardiac index (1.7±0.2 vs. 2.6±0.5 L/min/m2, P<0.05), and higher plasma lactate (7.2±2.4 vs. 3.7±1.4 mmol/L, P<0.05), a more negative base excess (-6.8±3.3 vs. -2.3±2.8 mmol/L, P<0.05), and higher potassium (5.3±0.6 vs. 4.2±0.3 mmol/L, P<0.05) compared with surviving animals.

Conclusions

HRV increased early during hemorrhage, but none of the evaluated HRV metrics was able to discriminate survivors from non-survivors during hemorrhagic shock. Moreover, metabolic and hemodynamic variables were more reliable than HRV metrics in reflecting hemorrhagic shock severity.

11.

Introduction

Bilirubin is a well-recognized marker of hepatic dysfunction in intensive care unit (ICU) patients. Multiple organ failure often complicates acute respiratory distress syndrome (ARDS) evolution and is associated with high mortality. The effect of early hepatic dysfunction on ARDS mortality has been poorly investigated. We evaluated the incidence and prognostic significance of increased serum bilirubin levels in the initial phase of ARDS.

Methods

The data of 805 patients with ARDS were retrospectively analysed. This population was extracted from two recent multicenter, prospective, randomised trials. Patients presenting with ARDS with a ratio of the partial pressure of arterial oxygen to the fraction of inspired oxygen < 150 mmHg, measured with a PEEP ≥ 5 cm of water, were included. Total serum bilirubin was measured at inclusion and at days 2, 4, 7, and 14. The primary objective was to analyse bilirubin at inclusion in relation to the 90-day mortality rate.

Results

The 90-day mortality rate was 33.8% (n = 272). The non-survivors were older, had a higher Sepsis-related Organ Failure Assessment (SOFA) score, and were more likely to have a medical diagnosis on admission than the survivors. At inclusion, the SOFA score without the liver component (10.3±2.9 vs. 9.0±3.0, p<0.0001) and the serum bilirubin levels (36.1±57.0 vs. 20.5±31.5 μmol/L, p<0.0001) were significantly higher in the non-survivors than in the survivors. Age, the hepatic SOFA score, the coagulation SOFA score, the arterial pH level, and the plateau pressure were independently associated with 90-day mortality in patients with ARDS.

Conclusion

Bilirubin used as a surrogate marker of hepatic dysfunction and measured early in the course of ARDS was associated with the 90-day mortality rate.

12.
Objective

It has been shown that muscle paralysis is more protective for the injured lung in severe acute respiratory distress syndrome (ARDS), but the precise mechanism is not clear. The purpose of this study was to test the hypothesis that abdominal muscle activity during mechanical ventilation increases lung injury in severe ARDS.

Methods

Eighteen male Beagles were studied under mechanical ventilation with anesthesia. Severe ARDS was induced by repetitive oleic acid infusion. After lung injury, Beagles were randomly assigned to a spontaneous breathing group (BIPAPSB) or an abdominal muscle paralysis group (BIPAPAP). Both groups were ventilated in BIPAP mode for 8 h, with the high pressure titrated to reach a tidal volume of 6 ml/kg, the low pressure set at 10 cmH2O, an I:E ratio of 1:1, and the respiratory rate adjusted to a PaCO2 of 35–60 mmHg. Six Beagles without ventilator support comprised the control group. Respiratory variables, end-expiratory lung volume (EELV), and gas exchange were assessed during mechanical ventilation. The levels of interleukin (IL)-6 and IL-8 in lung tissue and plasma were measured by qRT-PCR and ELISA, respectively. Lung injury scores were determined at the end of the experiment.

Results

For comparable ventilator settings, the BIPAPAP group, compared with the BIPAPSB group, presented higher EELV (427±47 vs. 366±38 ml) and oxygenation index (293±36 vs. 226±31 mmHg), lower plasma levels of IL-6 (216.6±48.0 vs. 297.5±71.2 pg/ml) and IL-8 (246.8±78.2 vs. 357.5±69.3 pg/ml), and lower expression levels of IL-6 mRNA (15.0±3.8 vs. 21.2±3.7) and IL-8 mRNA (18.9±6.8 vs. 29.5±7.9) in lung tissue. In addition, less histopathological lung injury was revealed in the BIPAPAP group (22.5±2.0 vs. 25.2±2.1).

Conclusion

Abdominal muscle activity during mechanical ventilation is one of the injurious factors in severe ARDS, so abdominal muscle paralysis might be an effective strategy to minimize ventilator-induced lung injury.

13.

Background

We have previously reported that high glucose impairs coronary vasodilation by reducing voltage-gated K+ (Kv) channel activity. However, the underlying mechanisms remain unknown. Advanced glycation end products (AGEs) are potent factors that contribute to the development of diabetic vasculopathy. The aim of this study was to investigate the role of AGEs in high glucose-induced impairment of Kv channels-mediated coronary vasodilation.

Methods

Patch-clamp recording and molecular biological techniques were used to assess the function and expression of Kv channels. Vasodilation of isolated rat small coronary arteries was measured using a pressurized myograph. Treatment of isolated coronary vascular smooth muscle cells (VSMCs) and streptozotocin-induced diabetic rats with aminoguanidine, the chemical inhibitor of AGEs formation, was performed to determine the contribution of AGEs.

Results

Incubation of VSMCs with high glucose reduced Kv current density by 60.4 ± 4.8% and decreased the expression of Kv1.2 and Kv1.5 at both the gene and protein levels, whereas inhibiting AGEs formation or blocking the interaction of AGEs with their receptors prevented the high glucose-induced impairment of Kv channels. In addition, diabetic rats manifested reduced Kv channel-mediated coronary dilation (9.3 ± 1.4% vs. 36.9 ± 1.4%, P < 0.05), which was partly corrected by treatment with aminoguanidine (24.4 ± 2.2% vs. 9.3 ± 1.4%, P < 0.05).

Conclusions

Excessive formation of AGEs impairs Kv channels in VSMCs, thereby leading to attenuation of Kv channel-mediated coronary vasodilation.

14.

Background and Purpose

Maternal glucocorticoid treatment for threatened premature delivery dramatically improves neonatal survival and short-term morbidity; however, its effects on neurodevelopmental outcome are variable. We investigated the effect of maternal glucocorticoid exposure after acute asphyxia on injury in the preterm brain.

Methods

Chronically instrumented singleton fetal sheep at 0.7 of gestation received asphyxia induced by complete umbilical cord occlusion for 25 minutes. Fifteen minutes after release of occlusion, ewes received a 3 ml i.m. injection of either dexamethasone (12 mg, n = 10) or saline (n = 10). Sheep were killed after 7 days of recovery; survival of neurons in the hippocampus and basal ganglia, and of oligodendrocytes in periventricular white matter, was assessed using an unbiased stereological approach.

Results

Maternal dexamethasone after asphyxia was associated with more severe loss of neurons in the hippocampus (CA3 regions, 290±76 vs 484±98 neurons/mm2, mean±SEM, P<0.05) and basal ganglia (putamen, 538±112 vs 814±34 neurons/mm2, P<0.05) compared to asphyxia-saline, and with greater loss of both total (913±77 vs 1201±75/mm2, P<0.05) and immature/mature myelinating oligodendrocytes in periventricular white matter (66±8 vs 114±12/mm2, P<0.05, vs sham controls 165±10/mm2, P<0.001). This was associated with transient hyperglycemia (peak 3.5±0.2 vs. 1.4±0.2 mmol/L at 6 h, P<0.05) and reduced suppression of EEG power in the first 24 h after occlusion (maximum −1.5±1.2 dB vs. −5.0±1.4 dB in saline controls, P<0.01), but later onset and fewer overt seizures.

Conclusions

In preterm fetal sheep, exposure to maternal dexamethasone during recovery from asphyxia exacerbated brain damage.

15.

Background

The ovary is an important site where gene variants modulate pubertal timing. The cannabinoid receptor 2 (CB2) is expressed in the ovary, plays a role in folliculogenesis and ovulation, and can be modulated by estrogens. Obesity is strictly associated with early menarche and is characterized by sex hormone and endocannabinoid derangement.

Aim

In this study, we investigated the role of the CB2 receptor in determining the age at menarche in obese girls.

Methods

We studied a cohort of 240 obese girls (age 11.9±3 years; BMI z-score 2.8±0.8). The age at menarche (if it had already occurred) was recorded at the time of the visit or by phone call. The CNR2 rs35761398 polymorphism, which leads to the CB2 Q63R variant, was detected by TaqMan assay.

Results

In total, 105 patients were homozygous for the R63-coding allele (RR), 113 were QR, and 22 were QQ. Variance analysis revealed a significantly earlier age at menarche in subjects carrying the Q63 allele, which persisted after adjusting for BMI z-score (11±1.2 vs. 11.6±1.2 years, p = 0.0003). Logistic regression analysis demonstrated that patients homozygous for the Q allele had a 2.2-fold higher risk (odds ratio = 2.2; 95% CI 1.1–3.4; p = 0.02) of presenting with early menarche (age at menarche <12 years).
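The 2.2-fold risk estimate above comes from logistic regression; a crude (unadjusted) odds ratio from a 2×2 table captures the same idea. The counts below are invented purely for illustration and are not the study's data:

```python
# Crude (unadjusted) odds-ratio sketch from a 2x2 table. The counts are
# hypothetical, NOT the study's data; the paper's adjusted estimate came
# from logistic regression.

def odds_ratio(exp_event, exp_none, unexp_event, unexp_none):
    """Odds of the event in the exposed group over odds in the unexposed."""
    return (exp_event / exp_none) / (unexp_event / unexp_none)

# hypothetical counts: Q-allele homozygotes with/without early menarche,
# followed by the remaining girls with/without early menarche
print(round(odds_ratio(12, 10, 70, 148), 2))
```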

Conclusion

We demonstrated for the first time the association between the CB2 Q63R functional variant and the age at menarche in a cohort of Italian obese girls.

16.

Background

There is a need for biomarkers ensuring identification of septic patients at high risk of death. We performed a prospective, multicenter, observational study to investigate the time course of lipopolysaccharide binding protein (LBP) serum levels in patients with severe sepsis and examined whether serial serum levels of LBP could be used as a marker of outcome.

Methodology/Principal Findings

LBP serum levels at study entry, at 48 hours, and at day 7 were measured in 180 patients with severe sepsis. Data regarding the nature of infections, disease severity, development of acute lung injury (ALI) and acute respiratory distress syndrome (ARDS), and intensive care unit (ICU) outcome were recorded. LBP serum levels were similar in survivors and non-survivors at study entry (117.4±75.7 µg/mL vs. 129.8±71.3 µg/mL, P = 0.249), but there were significant differences at 48 hours (77.2±57.0 vs. 121.2±73.4 µg/mL, P<0.0001) and at day 7 (64.7±45.8 vs. 89.7±61.1 µg/mL, P = 0.017). At 48 hours, LBP levels were significantly higher in ARDS patients than in ALI patients (112.5±71.8 vs. 76.6±55.9 µg/mL, P = 0.0001). An increase in LBP levels at 48 hours was associated with higher mortality (odds ratio 3.97; 95% CI: 1.84–8.56; P<0.001).

Conclusions/Significance

Serial LBP serum measurements may offer a clinically useful biomarker for identification of patients with severe sepsis having the worst outcomes and the highest probability of developing sepsis-induced ARDS.

17.

Background

We determined the reliability of cardiac output (CO) measured by the pulse wave transit time cardiac output system (esCCO system; COesCCO) vs. transthoracic echocardiography (COTTE) in mechanically ventilated patients in the early phase of septic shock. A secondary objective was to assess the ability of esCCO to detect changes in CO after fluid infusion.

Methods

Mechanically ventilated patients admitted to the ICU, aged >18 years, in sinus rhythm, and in the early phase of septic shock were prospectively included. We performed fluid infusion of 500 ml of crystalloid solution over 20 minutes and recorded CO by esCCO and TTE immediately before (T0) and 5 minutes after (T1) fluid administration. Patients were divided into two groups (responders and non-responders) according to a threshold of a 15% increase in COTTE in response to volume expansion.

Results

In total, 25 patients were included, with a mean age of 64±15 years; 15 (60%) were men. Mean SAPS II and SOFA scores were 55±21.3 and 13±2, respectively. ICU mortality was 36%. Mean cardiac output at T0 was 5.8±1.35 L/min by esCCO and 5.27±1.17 L/min by COTTE. At T1, the respective values were 6.63±1.57 L/min for esCCO and 6.10±1.29 L/min for COTTE. Overall, 12 patients were classified as responders and 13 as non-responders by the reference method. A threshold of an 11% increase in COesCCO was found to discriminate responders from non-responders with a sensitivity of 83% (95% CI, 0.52–0.98) and a specificity of 77% (95% CI, 0.46–0.95).

Conclusion

We showed a strong correlation between esCCO and echocardiography for measuring CO, and for detecting changes in CO after fluid infusion, in ICU patients.

18.

Objectives

Oxidized low-density lipoproteins (oxLDL) and oxidized low-density lipoprotein autoantibodies (OLAB) have been detected in human plasma and atherosclerotic lesions. OLAB appear to play a role in the clearance of oxLDL from circulation. Higher levels of OLAB appear to be associated with a reduced risk of a wide range of cardiovascular diseases. We investigated the prognostic value of plasma oxLDL and OLAB in patients undergoing primary coronary balloon angioplasty for acute ST-elevation myocardial infarction (STEMI).

Methods

Plasma oxLDL and OLAB concentrations were measured in 56 patients with acute STEMI before primary angioplasty, and then 3 days, 7 days, and 1 month after the acute event. Follow-up angiography was repeated 6 months later to detect the presence of restenosis (defined as >50% luminal diameter stenosis). The thrombolysis in myocardial infarction (TIMI) risk score was calculated to determine the relationship between the OLAB/oxLDL ratio and TIMI risk scores.

Results

Of the 56 patients, 18 (31%) had angiographic evidence of restenosis. Plasma OLAB concentrations were significantly lower in the restenosis group before angioplasty (181±114 vs. 335±257 U/L, p = 0.003), and at day 3 (155±92 vs. 277±185 U/L, p<0.001) and day 7 (177±110 vs. 352±279 U/L, p<0.001) after the acute event. There was no difference in oxLDL concentration between the two groups. The OLAB/oxLDL ratio correlated positively with TIMI risk scores before angioplasty (p for trend = 0.004), and at day 3 (p = 0.008) and day 7 (p<0.001) after STEMI.

Significance

A relative deficit of OLAB, and hence likely impaired clearance of oxLDL, is associated with the risk of arterial restenosis after primary angioplasty for acute STEMI.

19.

Background and Aims

Since high-density lipoprotein (HDL) has pro-endothelial and anti-thrombotic effects, an HDL-recruiting stent may prevent restenosis. In the present study, we addressed the functional characteristics of an apolipoprotein A-I (ApoA-I) antibody coating in vitro. Subsequently, we tested its biological performance applied on stents in vivo in rabbits.

Materials and Methods

The impact of anti-ApoA-I versus anti-ApoB antibody-coated stainless steel discs was evaluated in vitro for endothelial cell adhesion, thrombin generation, and platelet adhesion. In vivo, the response to injury in the iliac artery of New Zealand white rabbits was used as the read-out, comparing ApoA-I-coated versus bare metal stents.

Results

ApoA-I antibody-coated metal discs showed increased endothelial cell adhesion and proliferation and decreased thrombin generation and platelet adhesion compared to control discs. In vivo, no difference was observed between ApoA-I-coated and bare metal stents in lumen stenosis (23.3±13.8% vs. 23.3±11.3%, p = 0.77) or intima surface area (0.81±0.62 mm2 vs. 0.84±0.55 mm2, p = 0.85). Immunohistochemistry also revealed no differences in cell proliferation, fibrin deposition, inflammation, or endothelialization.

Conclusion

ApoA-I antibody coating has potent pro-endothelial and anti-thrombotic effects in vitro, but failed to enhance stent performance in a balloon-injury rabbit model in vivo.

20.
Background

The aim of the study was to compare bone marrow dosimetry between standard IMRT (SD-IMRT) and bone marrow-sparing IMRT (BMS-IMRT) among carcinoma cervix patients who underwent radical or adjuvant chemoradiation in a tertiary cancer center.

Materials and methods

Forty eligible patients with histopathologically proven carcinoma cervix were enrolled in the study and randomized on a 1:1 basis between SD-IMRT and BMS-IMRT from July 2018 to October 2019. The whole pelvis, bilateral femoral heads, and upper one-third of the femur were contoured using the whole-bone technique as a surrogate for the bone marrow. In both arms, V10, V20, and V40 of the bone marrow were noted, along with the mean, maximum, and minimum doses and total volume. DVHs for the bone marrow in the two arms were compared using the unpaired Student's t-test.

Results

We found no significant difference in the mean of the various parameters in the SD-IMRT arm vs. the BMS-IMRT arm for the bone marrow: V10 (89 ± 4.3% vs. 86.7 ± 3.7%), V20 (73.2 ± 5.3% vs. 73.1 ± 4.5%), V40 (23.9 ± 5.4% vs. 26.6 ± 7.4%) and, similarly, mean dose (28.1 ± 3.5% vs. 28.1 ± 1.8%), maximum dose (53.4 ± 0.58% vs. 53.2 ± 0.58%), minimum dose (0.33 ± 0.18% vs. 0.38 ± 0.38%), and total volume (961 ± 110 cc vs. 901 ± 152 cc).

Conclusion

This study shows no statistically significant difference in dosimetry between the two groups, which suggests that SD-IMRT spares the bone marrow adequately. Therefore, BMS-IMRT using the present contouring technique does not give any added advantage over SD-IMRT. However, a larger sample size, other novel contouring techniques, and multivariate analysis are needed to reach a definite conclusion.
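The V10/V20/V40 parameters compared above are dose-volume histogram (DVH) points: the percentage of a structure's volume receiving at least 10, 20, or 40 Gy. A minimal sketch, with a made-up per-voxel dose list (not the trial's DVH data) and equal voxel volumes assumed:

```python
# DVH-point sketch: Vx = percent of a structure's volume receiving >= x Gy.
# The per-voxel doses are hypothetical (NOT the trial's data); all voxels
# are assumed to have equal volume.

def v_x(doses_gy, threshold_gy):
    """Percent of equal-volume voxels receiving >= threshold_gy."""
    return 100.0 * sum(d >= threshold_gy for d in doses_gy) / len(doses_gy)

voxel_doses = [5, 12, 18, 22, 25, 31, 38, 41, 44, 9]  # hypothetical doses (Gy)
print(v_x(voxel_doses, 10), v_x(voxel_doses, 20), v_x(voxel_doses, 40))
```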
