Similar Articles
20 similar articles retrieved.
1.

Background

The G1 cell cycle inhibitors tissue inhibitor of metalloproteinase-2 (TIMP-2) and insulin-like growth factor-binding protein 7 (IGFBP7) have been identified as promising biomarkers for the prediction of adverse outcomes, including renal replacement therapy (RRT) and mortality, in critically ill adult patients who develop acute kidney injury (AKI). However, the prognostic value of urinary TIMP-2 and IGFBP7 for adverse outcomes in neonatal and pediatric AKI has not yet been investigated.

Methods

The product of the urinary concentration of TIMP-2 and IGFBP7 ([TIMP-2]•[IGFBP7]) was assessed by a commercially available immunoassay (NephroCheck) in a prospective cohort study in 133 subjects aged 0–18 years including 46 patients with established AKI according to pRIFLE criteria, 27 patients without AKI (non-AKI group I) and 60 apparently healthy neonates and children (non-AKI group II). AKI etiologies were: dehydration/hypovolemia (n = 7), hemodynamic instability (n = 7), perinatal asphyxia (n = 9), septic shock (n = 7), typical hemolytic-uremic syndrome (HUS; n = 5), interstitial nephritis (n = 5), vasculitis (n = 4), nephrotoxic injury (n = 1) and renal vein thrombosis (n = 1).

Results

When AKI patients were classified according to the pRIFLE criteria, 6/46 (13%) patients fulfilled the criteria for the category “Risk”, 13/46 (28%) for “Injury”, 26/46 (57%) for “Failure” and 1/46 (2%) for “Loss”. Patients in the “Failure” stage had a median 3.7-fold higher urinary [TIMP-2]•[IGFBP7] compared to non-AKI subjects (P<0.001). When analyzed by AKI etiology, the highest [TIMP-2]•[IGFBP7] values were found in patients with septic shock (P<0.001 vs. non-AKI I+II). Receiver operating characteristic (ROC) curve analyses in the AKI group revealed good performance of [TIMP-2]•[IGFBP7] in predicting 30-day (area under the curve (AUC) 0.79; 95% CI, 0.61–0.97) and 3-month mortality (AUC 0.84; 95% CI, 0.67–0.99) and moderate performance in predicting RRT (AUC 0.67; 95% CI, 0.50–0.84).
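The headline numbers here are ROC AUCs computed on the product of the two urinary concentrations. A minimal sketch of that analysis, using scikit-learn and entirely synthetic values (the variable names and data are illustrative, not the study's measurements):

```python
# Minimal sketch: ROC analysis of the urinary [TIMP-2]*[IGFBP7] product against an outcome.
# Entirely synthetic, illustrative data -- not values from the study.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(0)
n = 46                                               # size of the AKI group
timp2 = rng.lognormal(mean=0.0, sigma=1.0, size=n)   # urinary TIMP-2 (ng/mL)
igfbp7 = rng.lognormal(mean=0.5, sigma=1.0, size=n)  # urinary IGFBP7 (ng/mL)
died_30d = rng.integers(0, 2, size=n)                # 30-day mortality (0/1)

marker = timp2 * igfbp7 / 1000.0                     # product, conventionally reported as (ng/mL)^2/1000
auc = roc_auc_score(died_30d, marker)
fpr, tpr, thresholds = roc_curve(died_30d, marker)
print(f"AUC for 30-day mortality: {auc:.2f}")
```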

Conclusions

This study shows that urinary [TIMP-2]•[IGFBP7] has good diagnostic performance in predicting adverse outcomes in neonatal and pediatric AKI of heterogeneous etiology.

2.

Objective

To assess the ability of the urinary biomarkers IGFBP7 (insulin-like growth factor-binding protein 7) and TIMP-2 (tissue inhibitor of metalloproteinase 2) to predict acute kidney injury (AKI) early in high-risk surgical patients.

Introduction

Postoperative AKI is associated with increased short- and long-term mortality. Using IGFBP7 and TIMP-2 for early detection of cellular kidney injury, thus allowing the early initiation of renal protection measures, may represent a new approach to evaluating renal function.

Methods

In this prospective study, urinary [TIMP-2]×[IGFBP7] was measured in surgical patients at high risk for AKI. A predefined cut-off value of [TIMP-2]×[IGFBP7] >0.3 was used for assessing diagnostic accuracy. Perioperative characteristics were evaluated, and ROC analyses as well as logistic regression models of risk assessment were calculated with and without a [TIMP-2]×[IGFBP7] test.

Results

107 patients were included in the study, of whom 45 (42%) developed AKI. The highest median biomarker values were detected in septic patients, transplant recipients, and patients after hepatic surgery (1.24 vs 0.45 vs 0.47 (ng/mL)²/1000). The area under the receiver operating characteristic curve (AUC) for the risk of any AKI was 0.85, for early use of RRT 0.83, and for 28-day mortality 0.77. In a multivariable model with established perioperative risk factors, the [TIMP-2]×[IGFBP7] test was the strongest predictor of AKI and significantly improved the risk assessment (p<0.001).
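The multivariable analysis compares risk models with and without the biomarker. A hedged sketch of one common way to do this (a likelihood-ratio test between nested logistic models in statsmodels; the data and covariate names are synthetic assumptions, not the study's variables):

```python
# Sketch: does adding the [TIMP-2]x[IGFBP7] result improve a perioperative AKI risk model?
# Nested logistic models compared with a likelihood-ratio test (statsmodels).
# Data and covariates are synthetic and illustrative, not those of the study.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import chi2

rng = np.random.default_rng(1)
n = 107
df = pd.DataFrame({
    "age": rng.normal(65, 10, n),
    "baseline_creatinine": rng.normal(1.0, 0.3, n),
    "biomarker": rng.lognormal(-0.5, 1.0, n),        # [TIMP-2]x[IGFBP7], (ng/mL)^2/1000
})
df["log_biomarker"] = np.log(df["biomarker"])
true_logit = -4.0 + 0.05 * df["age"] + 1.2 * df["log_biomarker"]
df["aki"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-true_logit)))

base = smf.logit("aki ~ age + baseline_creatinine", data=df).fit(disp=0)
full = smf.logit("aki ~ age + baseline_creatinine + log_biomarker", data=df).fit(disp=0)

lr_stat = 2 * (full.llf - base.llf)   # likelihood-ratio statistic for the added marker
p_value = chi2.sf(lr_stat, df=1)      # one extra parameter
print(f"LR statistic = {lr_stat:.2f}, p = {p_value:.4g}")
```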

Conclusions

The urinary [TIMP-2]×[IGFBP7] test adequately detects patients at risk of AKI after major non-cardiac surgery. Due to its rapid responsiveness, it extends the time frame for interventions to prevent the development of AKI.

3.

Importance

Poor mental health places a burden on individuals and populations. Resilient persons are able to adapt to life’s challenges and maintain high quality of life and function. Finding effective strategies to bolster resilience in individuals and populations is of interest to many stakeholders.

Objectives

To synthesize the evidence for resiliency training programs in improving mental health and capacity in 1) diverse adult populations and 2) persons with chronic diseases.

Data Sources

Electronic databases, clinical trial registries, and bibliographies. We also contacted study authors and field experts.

Study Selection

Randomized trials assessing the efficacy of any program intended to enhance resilience in adults and published after 1990. No restrictions were made based on outcome measured or comparator used.

Data Extraction and Synthesis

Reviewers worked independently and in duplicate to extract study characteristics and data. These were confirmed with authors. We conducted a random effects meta-analysis on available data and tested for interaction in planned subgroups.

Main Outcomes

The standardized mean difference (SMD) effect of resiliency training programs on 1) resilience/hardiness, 2) quality of life/well-being, 3) self-efficacy/activation, 4) depression, 5) stress, and 6) anxiety.

Results

We found 25 small trials at moderate to high risk of bias. Interventions varied in format and theoretical approach. Random effects meta-analysis showed a moderate effect of generalized stress-directed programs on enhancing resilience [pooled SMD 0.37 (95% CI 0.18, 0.57) p = .0002; I2 = 41%] within 3 months of follow up. Improvement in other outcomes was favorable to the interventions and reached statistical significance after removing two studies at high risk of bias. Trauma-induced stress-directed programs significantly improved stress [−0.53 (−1.04, −0.03) p = .03; I2 = 73%] and depression [−0.51 (−0.92, −0.10) p = .04; I2 = 61%].
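For reference, random-effects pooling of standardized mean differences with an I² heterogeneity estimate follows the conventional DerSimonian–Laird formulation sketched below (the review does not name its estimator, so this is an assumption about the standard approach):

```latex
% Conventional random-effects (DerSimonian-Laird) pooling of k standardized mean
% differences d_i with within-study variances v_i; an assumed, standard formulation.
\begin{align*}
w_i &= \frac{1}{v_i}, \qquad
\bar{d}_{\mathrm{FE}} = \frac{\sum_i w_i d_i}{\sum_i w_i}, \qquad
Q = \sum_{i=1}^{k} w_i \left(d_i - \bar{d}_{\mathrm{FE}}\right)^2,\\[4pt]
\hat{\tau}^2 &= \max\!\left(0,\; \frac{Q - (k-1)}{\sum_i w_i - \sum_i w_i^2 / \sum_i w_i}\right), \qquad
I^2 = \max\!\left(0,\; \frac{Q - (k-1)}{Q}\right) \times 100\%,\\[4pt]
w_i^{*} &= \frac{1}{v_i + \hat{\tau}^2}, \qquad
\hat{\mu}_{\mathrm{RE}} = \frac{\sum_i w_i^{*} d_i}{\sum_i w_i^{*}}, \qquad
\mathrm{SE}\!\left(\hat{\mu}_{\mathrm{RE}}\right) = \frac{1}{\sqrt{\sum_i w_i^{*}}}.
\end{align*}
```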

Conclusions

We found evidence warranting low confidence that resiliency training programs have a small to moderate effect at improving resilience and other mental health outcomes. Further study is needed to better define the resilience construct and to design interventions specific to it.

Registration Number

PROSPERO #CRD42014007185

4.

Background

There is an ongoing debate about whether stroke patients presenting with minor or moderate symptoms benefit from thrombolysis. To date, stroke severity on admission has typically been measured with the NIHSS and subsequently used for the treatment decision.

Hypothesis

Acute MRI lesion volume assessment can aid in therapy decision for iv-tPA in minor stroke.

Methods

We analysed 164 patients with NIHSS 0–7 from a prospective stroke MRI registry, the 1000+ study (clinicaltrials.org NCT00715533). Patients were examined in a 3 T MRI scanner and either received (n = 62) or did not receive (n = 102) thrombolysis. DWI (diffusion-weighted imaging) and PI (perfusion imaging) at admission were evaluated for diffusion–perfusion mismatch. Our primary outcome parameter was final lesion volume, defined as lesion volume on day 6 FLAIR images.

Results

The association between t-PA and FLAIR lesion volume on day 6 was significantly different for patients with smaller DWI volumes compared to patients with larger DWI volumes (interaction between DWI and t-PA: p = 0.021). Baseline DWI lesion volume was dichotomized at the median (0.7 ml): among patients with large baseline DWI volumes, final lesion volume at day 6 was larger without t-PA treatment (median difference 3 ml, IQR −0.4 to 9.3 ml), whereas it was smaller after t-PA treatment (median difference 0 ml, IQR −4.1 to 5 ml). However, this did not translate into a significant difference in the mRS at day 90 (p = 0.577).
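The central claim is a statistical interaction between baseline DWI lesion volume and t-PA treatment on the day-6 FLAIR lesion volume. A minimal sketch of how such an interaction term is fitted and tested (synthetic data and illustrative variable names, not the registry data):

```python
# Sketch: testing a DWI-volume x t-PA interaction on final (day-6) lesion volume.
# Synthetic data; variable names are illustrative only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 164
df = pd.DataFrame({
    "dwi_vol": rng.lognormal(mean=-0.4, sigma=1.2, size=n),   # baseline DWI volume (ml)
    "tpa": rng.integers(0, 2, size=n),                        # 1 = received iv-tPA
})
# Simulate an interaction: t-PA limits lesion growth mainly when the baseline lesion is larger.
df["flair_day6"] = (df["dwi_vol"] * (1.8 - 0.9 * df["tpa"])
                    + rng.normal(0, 1.0, size=n)).clip(lower=0)

model = smf.ols("flair_day6 ~ dwi_vol * tpa", data=df).fit()
print(model.params)
print("interaction p-value:", model.pvalues["dwi_vol:tpa"])
```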

Conclusion

Although this study is only hypothesis-generating given the number of cases, we believe that DWI lesion volume may support the therapy decision in patients with minor stroke.

Trial Registration

Clinicaltrials.org NCT00715533

5.

Introduction

Regional citrate anticoagulation (RCA) is gaining popularity in continuous renal replacement therapy (CRRT) for critically ill patients. The risk of citrate toxicity is a primary concern during this prolonged process. The aim of this study was to assess the pharmacokinetics of citrate in critically ill patients with AKI and to use the kinetic parameters to predict the risk of citrate accumulation in this population undergoing continuous veno-venous hemofiltration (CVVH) with RCA.

Methods

Critically ill patients with AKI (n = 12) and healthy volunteers (n = 12) were investigated during infusion of comparable doses of citrate. Serial blood samples were taken before, during the 120-min infusion, and for up to 120 min after the infusion. Citrate pharmacokinetics were calculated and compared between groups. The estimated kinetic parameters were then applied to the citrate kinetic equation for validation in ten additional patients' CVVH sessions with citrate anticoagulation.

Results

Total body clearance of citrate was similar in critically ill patients with AKI and healthy volunteers (648.04±347.00 L/min versus 686.64±353.60 L/min; P = 0.624). Basal and peak citrate concentrations were similar in both groups (P = 0.423 and 0.247, respectively). The predicted citrate curves showed an excellent fit to the measurements.
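As context, total body clearance in a study of this kind is typically estimated non-compartmentally as the infused dose divided by the area under the concentration–time curve. A hedged sketch with made-up numbers (the study's actual sampling schedule, doses, and reporting units are not reproduced here):

```python
# Sketch: non-compartmental citrate clearance, CL = infused dose / AUC(0->inf),
# with AUC from the trapezoidal rule plus a log-linear tail extrapolation.
# All numbers are illustrative, not the study's measurements or units.
import numpy as np

t = np.array([0.0, 15, 30, 60, 90, 120, 150, 180, 240])                   # min
conc = np.array([0.08, 0.35, 0.55, 0.70, 0.72, 0.68, 0.40, 0.25, 0.12])   # mmol/L above baseline
dose_mmol = 60.0                                                          # total citrate infused (illustrative)

auc_obs = np.sum(np.diff(t) * (conc[1:] + conc[:-1]) / 2.0)   # trapezoidal AUC over the sampling window
lam_z = -np.polyfit(t[-3:], np.log(conc[-3:]), 1)[0]          # terminal rate constant (1/min)
auc_inf = auc_obs + conc[-1] / lam_z                          # extrapolate to infinity
cl = dose_mmol / auc_inf                                      # total body clearance (L/min)
print(f"AUC(0-inf) = {auc_inf:.1f} mmol*min/L, CL = {cl:.2f} L/min")
```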

Conclusions

Citrate clearance is not impaired in critically ill patients with AKI in the absence of severe liver dysfunction. Citrate pharmacokinetic data can provide a basis for predicting the risk of citrate accumulation in clinical practice.

Trial Registration

ClinicalTrials.gov Identifier NCT00948558

6.

Background

The incidence of acute kidney injury (AKI) is increasing globally and it is much more common than end-stage kidney disease. AKI is associated with high mortality and cost of hospitalisation. Studies of treatments to reduce this high mortality have used differing renal replacement therapy (RRT) modalities and have not shown improvement in the short term. The reported long-term outcomes of AKI are variable and the effect of differing RRT modalities upon them is not clear. We used the prolonged follow-up of a large clinical trial to prospectively examine the long-term outcomes and effect of RRT dosing in patients with AKI.

Methods and Findings

We extended the follow-up of participants in the Randomised Evaluation of Normal vs. Augmented Levels of RRT (RENAL) study from 90 days to 4 years after randomization. Primary and secondary outcomes were mortality and requirement for maintenance dialysis, respectively, assessed in 1,464 (97%) patients at a median of 43.9 months (interquartile range [IQR] 30.0–48.6 months) post randomization. A total of 468/743 (63%) and 444/721 (62%) patients died in the lower and higher intensity groups, respectively (risk ratio [RR] 1.04, 95% CI 0.96–1.12, p = 0.49). Amongst survivors to day 90, 21 of 411 (5.1%) and 23 of 399 (5.8%) in the respective groups were treated with maintenance dialysis (RR 1.12, 95% CI 0.63–2.00, p = 0.69). The prevalence of albuminuria among survivors was 40% and 44%, respectively (p = 0.48). Quality of life was not different between the two treatment groups. The generalizability of these findings to other populations with AKI requires further exploration.

Conclusions

Patients with AKI requiring RRT in intensive care have high long-term mortality but few require maintenance dialysis. Long-term survivors have a heavy burden of proteinuria. Increased intensity of RRT does not reduce mortality or subsequent treatment with dialysis.

Trial registration

www.ClinicalTrials.gov NCT00221013

7.

Background

To investigate whether the cramp threshold frequency (CTF) can be altered by electrical muscle stimulation in a shortened position.

Methods

A total of 15 healthy male sport students were randomly allocated to an intervention group (IG, n = 10) and a non-treatment control group (CG, n = 5). Calf muscles of both legs in the IG were stimulated equally twice a week over 6 weeks. The protocol was 3×5 s on, 10 s off, 150 µs impulse width, 30 Hz above the individual CTF, at 85% of the maximal tolerated stimulation energy. One leg was stimulated in a shortened position, inducing muscle cramps (CT), while the opposite leg was fixed in a neutral position at the ankle, preventing muscle cramps (nCT). CTF tests were performed prior to the first and 96 h after the 6th (3 w) and 12th (6 w) training sessions.

Results

After 3 w, the CTF had increased significantly (p<0.001) in CT calves, from 23.3±5.7 Hz to 33.3±6.9 Hz, while it remained unchanged in nCT calves (pre: 23.6±5.7 Hz, mid: 22.3±3.5 Hz) and in both legs of the CG (pre: 21.8±3.2 Hz, mid: 22.0±2.7 Hz). Only CT calves showed further, non-significant increases in CTF at 6 w. The applied stimulation energy (mA² • µs) correlated positively with the effect on the CTF (r = 0.92; p<0.001).

Conclusions

The present study may be useful for developing new non-pharmacological strategies to reduce cramp susceptibility.

Trial Registry

German Clinical Trials Register DRKS00005312

8.

Background

Protein supplementation has been shown to reduce the increases in intrahepatic triglyceride (IHTG) content induced by acute hypercaloric high-fat and high-fructose diets in humans.

Objective

To assess the effect of a 12-wk iso-energetic high protein-low carbohydrate (HPLC) diet compared with an iso-energetic high carbohydrate-low protein (HCLP) diet on IHTG content in healthy non-obese subjects, at a constant body weight.

Design

Seven men and nine women [mean ± SD age: 24±5 y; BMI: 22.9±2.1 kg/m²] were randomly allocated to a HPLC [30/35/35% of energy (En%) from protein/carbohydrate/fat] or a HCLP (5/60/35 En%) diet by stratification on sex, age and BMI. Dietary guidelines were prescribed based on individual daily energy requirements. IHTG content was measured by ¹H-magnetic resonance spectroscopy before and after the dietary intervention.

Results

IHTG content changed in different directions with the HPLC diet (CH₂/H₂O ratio: 0.23±0.17 to 0.20±0.10; IHTG%: 0.25±0.20% to 0.22±0.11%) compared with the HCLP diet (CH₂/H₂O ratio: 0.34±0.20 to 0.38±0.21; IHTG%: 0.38±0.22% to 0.43±0.24%). This resulted in a lower IHTG content in the HPLC than in the HCLP diet group after 12 weeks, a difference that almost reached statistical significance (P = 0.055).

Conclusions

Compared with a HCLP diet, a HPLC diet has the potential to preserve rather than enlarge IHTG content in healthy non-obese subjects at a constant body weight.

Trial Registration

Clinicaltrials.gov NCT01551238

9.

Introduction

131I-CLR1404 is a small molecule that combines a tumor-targeting moiety with a therapeutic radioisotope. The primary aim of this phase 1 study was to determine the administered radioactivity expected to deliver 400 mSv to the bone marrow. The secondary aims were to determine the pharmacokinetic (PK) and safety profiles of 131I-CLR1404.

Methods

Eight subjects with refractory or relapsed advanced solid tumors were treated with a single injection of 370 MBq of 131I-CLR1404. Whole body planar nuclear medicine scans were performed at 15–35 minutes; 4–6, 18–24, 48, 72, and 144 hours; and 14 days post injection. Optional single photon emission computed tomography imaging was performed on two patients 6 days post injection. Clinical laboratory parameters were evaluated in blood and urine. Plasma PK was evaluated based on 127I-CLR1404 mass measurements. To evaluate renal clearance of 131I-CLR1404, urine was collected for 14 days post injection. Absorbed dose estimates for target organs were determined using the RADAR method with OLINDA/EXM software.

Results

Single administrations of 370 MBq of 131I-CLR1404 were well tolerated by all subjects. No severe adverse events were reported and no adverse event was dose-limiting. Plasma 127I-CLR1404 concentrations declined in a bi-exponential manner with a mean t½ value of 822 hours. Mean Cmax and AUC(0-t) values were 72.2 ng/mL and 15753 ng•hr/mL, respectively. An administered activity of approximately 740 MBq is predicted to deliver 400 mSv to marrow.
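The bi-exponential decline and terminal half-life reported above correspond to the standard two-phase disposition model (shown here in its conventional form; A, B, α and β are fitted constants, not values from the study):

```latex
% Conventional bi-exponential disposition model assumed for the plasma data;
% the terminal half-life and AUC follow directly from the fitted constants.
\begin{align*}
C(t) &= A\,e^{-\alpha t} + B\,e^{-\beta t}, \qquad \alpha > \beta > 0,\\[4pt]
t_{1/2,\beta} &= \frac{\ln 2}{\beta}, \qquad
\mathrm{AUC}_{0\to\infty} = \frac{A}{\alpha} + \frac{B}{\beta}.
\end{align*}
```

With the reported mean t½ of 822 hours, the implied terminal rate constant is β = ln 2 / 822 h ≈ 8.4 × 10⁻⁴ h⁻¹.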

Conclusions

Preliminary data suggest that 131I-CLR1404 is well tolerated and may have unique potential as an anti-cancer agent.

Trial Registration

ClinicalTrials.gov NCT00925275

10.

Background

Induction of HIV-1-specific T-cell responses relevant to diverse subtypes is a major goal of HIV vaccine development. Prime-boost regimens using heterologous gene-based vaccine vectors have induced potent, polyfunctional T cell responses in preclinical studies.

Methods

The first opportunity to evaluate the immunogenicity of DNA priming followed by recombinant adenovirus serotype 5 (rAd5) boosting was as open-label rollover trials in subjects who had been enrolled in prior studies of HIV-1 specific DNA vaccines. All subjects underwent apheresis before and after rAd5 boosting to characterize in depth the T cell and antibody response induced by the heterologous DNA/rAd5 prime-boost combination.

Results

rAd5 boosting was well-tolerated with no serious adverse events. Compared to DNA or rAd5 vaccine alone, sequential DNA/rAd5 administration induced 7-fold higher magnitude Env-biased HIV-1-specific CD8+ T-cell responses and 100-fold greater antibody titers measured by ELISA. There was no significant neutralizing antibody activity against primary isolates. Vaccine-elicited CD4+ and CD8+ T-cells expressed multiple functions and were predominantly long-term (CD127+) central or effector memory T cells that persisted in blood for >6 months. Epitopes mapped in Gag and Env demonstrated partial cross-clade recognition.

Conclusion

Heterologous prime-boost using vector-based gene delivery of vaccine antigens is a potent immunization strategy for inducing both antibody and T-cell responses.

Trial Registration

ClinicalTrials.gov NCT00102089, NCT00108654

11.

Background

Tree nut consumption has been associated with reduced diabetes risk; however, results from randomized trials on glycemic control have been inconsistent.

Objective

To provide better evidence for diabetes guidelines development, we conducted a systematic review and meta-analysis of randomized controlled trials to assess the effects of tree nuts on markers of glycemic control in individuals with diabetes.

Data Sources

MEDLINE, EMBASE, CINAHL, and Cochrane databases through 6 April 2014.

Study Selection

Randomized controlled trials of ≥3 weeks' duration conducted in individuals with diabetes that compared the effect of diets emphasizing tree nuts with that of isocaloric diets without tree nuts on HbA1c, fasting glucose, fasting insulin, and HOMA-IR.

Data Extraction and Synthesis

Two independent reviewers extracted relevant data and assessed study quality and risk of bias. Data were pooled by the generic inverse variance method and expressed as mean differences (MD) with 95% CIs. Heterogeneity was assessed (Cochran Q-statistic) and quantified (I²).

Results

Twelve trials (n = 450) were included. Diets emphasizing tree nuts at a median dose of 56 g/d significantly lowered HbA1c (MD = −0.07% [95% CI: −0.10, −0.03%]; P = 0.0003) and fasting glucose (MD = −0.15 mmol/L [95% CI: −0.27, −0.02 mmol/L]; P = 0.03) compared with control diets. No significant treatment effects were observed for fasting insulin and HOMA-IR; however, the direction of effect favoured tree nuts.
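The pooling described above (generic inverse variance, Cochran Q, I²) reduces to a few lines. The sketch below uses the fixed-effect inverse-variance form with placeholder per-trial estimates, not the twelve trials' actual data:

```python
# Sketch: generic inverse-variance pooling of mean differences with Cochran's Q and I^2.
# The mean differences and standard errors below are placeholders, not the review's data.
import numpy as np
from scipy.stats import norm

md = np.array([-0.10, -0.05, -0.12, -0.02, -0.08])   # per-trial mean differences in HbA1c (%)
se = np.array([0.04, 0.06, 0.05, 0.07, 0.05])        # per-trial standard errors

w = 1.0 / se**2                                      # inverse-variance weights
pooled = np.sum(w * md) / np.sum(w)
pooled_se = 1.0 / np.sqrt(np.sum(w))
ci = pooled + np.array([-1.0, 1.0]) * norm.ppf(0.975) * pooled_se

q = np.sum(w * (md - pooled) ** 2)                   # Cochran's Q
dfree = len(md) - 1
i2 = max(0.0, (q - dfree) / q) * 100 if q > 0 else 0.0

print(f"MD = {pooled:.3f} (95% CI {ci[0]:.3f} to {ci[1]:.3f}), I^2 = {i2:.0f}%")
```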

Limitations

The majority of trials were of short duration and poor quality.

Conclusions

Pooled analyses show that tree nuts improve glycemic control in individuals with type 2 diabetes, supporting their inclusion in a healthy diet. Owing to the uncertainties in our analyses there is a need for longer, higher quality trials with a focus on using nuts to displace high-glycemic index carbohydrates.

Trial Registration

ClinicalTrials.gov NCT01630980

12.

Introduction

During HIV infection, the severe depletion of intestinal CD4+ T-cells is associated with microbial translocation, systemic immune activation, and disease progression. This study examined the reconstitution of intestinal and peripheral CD4+ T-cell subsets under combined antiretroviral therapy (cART), together with systemic immune activation markers.

Methods

This longitudinal single-arm pilot study evaluated CD4+ T cells, including Th1 and Th17 cells, in gut and blood, as well as soluble inflammation markers, in HIV-infected individuals before (M0) and after eight months (M8) of cART. From January 2010 to December 2011, 10 cART-naïve HIV-1-infected patients were screened and 9 enrolled. Blood and gut CD4+ T-cell subsets and cellular immune activation were determined by flow cytometry, and plasma soluble CD14 by ELISA. CD4+ Th17 cells were detected in gut biopsies by immunohistochemistry. Microbial translocation was measured by limulus amebocyte lysate assay to detect bacterial lipopolysaccharide (LPS) and by real-time PCR to detect plasma bacterial 16S rDNA.

Results

Eight months of cART increased intestinal CD4+ and Th17 cells and reduced levels of T-cell activation and proliferation. The magnitude of intestinal CD4+ T-cell reconstitution correlated with the reduction of plasma LPS. Importantly, the magnitude of Th17 cells reconstitution correlated directly with blood CD4+ T-cell recovery.

Conclusion

Short-term antiretroviral therapy resulted in a significant increase in the levels of total and Th17 CD4+ T-cells in the gut mucosa and in a decline in T-cell activation. The observation that pre-treatment levels of CD4+ and CD8+ T-cell activation are predictors of the magnitude of Th17 cell reconstitution following cART provides further rationale for early initiation of cART in HIV-infected individuals.

Trial Registration

ClinicalTrials.gov NCT02097381

13.

Background

By measuring very early changes in muscle strength and functional performance after fast-track total hip arthroplasty (THA), post-operative rehabilitation, introduced soon after surgery, can be designed to specifically target identified deficits.

Objective(s)

Firstly, to quantify changes (compared to pre-operative values) in hip muscle strength, leg-press power, and functional performance in the first week after THA, and secondly, to explore relationships between the muscle strength changes, and changes in hip pain, systemic inflammation, and thigh swelling.

Design

Prospective, cohort study.

Setting

Convenience sample of patients receiving a THA at Copenhagen University Hospital, Hvidovre, Denmark, between March and December 2011.

Participants

Thirty-five patients (65.9±7.2 years) undergoing THA.

Main outcome measures

Hip muscle strength, leg-press power, performance-based function, and self-reported disability were determined prior to, and 2 and 8 days after, THA (Day 2 and 8, respectively). Hip pain, thigh swelling, and C-Reactive Protein were also determined.

Results

Five patients were lost to follow-up. Hip muscle strength and leg press power were substantially reduced at Day 2 (range of reductions: 41–58%, P<0.001), but less pronounced at Day 8 (range of reductions: 23–31%, P<0.017). Self-reported symptoms and function (HOOS: Pain, Symptoms, and ADL) improved at Day 8 (P<0.014). Changes in hip pain, C-Reactive Protein, and thigh swelling were not related to the muscle strength and power losses.

Conclusion(s)

Hip muscle strength and leg-press power decreased substantially in the first week after THA – especially at Day 2 – with some recovery at Day 8. The muscle strength loss and power loss were not related to changes in hip pain, systemic inflammation, or thigh swelling. In contrast, self-reported symptoms and function improved. These data on surgery-induced changes in muscle strength may help design impairment-directed, post-operative rehabilitation to be introduced soon after surgery.

Trial Registration

ClinicalTrials.gov NCT01246674.

14.

Background

While current recommendations on exercise type and volume have strong experimental bases, there is no clear evidence from large-sized studies indicating whether increasing training intensity provides additional benefits to subjects with type 2 diabetes.

Objective

To compare the effects of moderate-to-high intensity (HI) versus low-to-moderate intensity (LI) training of equal energy cost, i.e. exercise volume, on modifiable cardiovascular risk factors.

Design

Pre-specified sub-analysis of the Italian Diabetes and Exercise Study (IDES), a randomized multicenter prospective trial comparing a supervised exercise intervention with standard care for 12 months (2005–2006).

Setting

Twenty-two outpatient diabetes clinics across Italy.

Patients

Sedentary patients with type 2 diabetes assigned to twice-a-week supervised progressive aerobic and resistance training plus exercise counseling (n = 303).

Interventions

Subjects were randomized by center to LI (n = 142, 136 completed) or HI (n = 161, 152 completed) progressive aerobic and resistance training, i.e. at 55% or 70% of predicted maximal oxygen consumption and at 60% or 80% of predicted 1-Repetition Maximum, respectively, of equal volume.

Main Outcome Measure(s)

Hemoglobin (Hb) A1c and other cardiovascular risk factors; 10-year coronary heart disease (CHD) risk scores.

Results

Volume of physical activity, both supervised and non-supervised, was similar in LI and HI participants. Compared with LI training, HI training produced only clinically marginal, though statistically significant, improvements in HbA1c (mean difference −0.17% [95% confidence interval −0.44, 0.10], P = 0.03), triglycerides (−0.12 mmol/l [−0.34, 0.10], P = 0.02) and total cholesterol (−0.24 mmol/l [−0.46, −0.01], P = 0.04), but not in other risk factors and CHD risk scores. However, intensity was not an independent predictor of reduction of any of these parameters. The adverse event rate was similar in HI and LI subjects.

Conclusions

Data from the large IDES cohort indicate that, in low-fitness individuals such as sedentary subjects with type 2 diabetes, increasing exercise intensity is not harmful, but does not provide additional benefits on cardiovascular risk factors.

Trial Registration

www.ISRCTN.org ISRCTN-04252749.

15.

Background

We evaluated the efficacy of a new anesthetic technique termed ultrasound-guided capsule-sheath space block (CSSB) combined with anterior cervical cutaneous nerve block (CCNB) for thyroidectomy.

Methods

The study included two parts. Part one was an imaging study to determine the feasibility of the technique. The CSSB was performed on five healthy volunteers by introducing the needle 0.5 cm lateral to the probe under in-plane ultrasound guidance. After puncture of the false capsule and its subsequent contraction with the true capsule of the thyroid, 10 mL of contrast medium was deposited slowly in the capsule-sheath space. The CCNB was performed bilaterally as follows: under ultrasound guidance, a subcutaneous injection of 10 mL of contrast medium was made along the sternocleidomastoid, raising a girdle-shaped wheal from the cricoid cartilage to the supraclavicular region. The spreading pattern of the contrast medium was imaged using computed tomographic scanning. In part two (a clinical case series), the efficacy of the technique was evaluated. Seventy-eight patients undergoing thyroidectomy received ultrasound-guided CSSB and CCNB with local anesthetics. The sensory onset of the CCNB, intraoperative hemodynamic parameters, and the analgesic effect were assessed, and complications were noted.

Results

The distribution of contrast medium was well defined. In part two, the onset time of the CCNB was 2.2 ± 0.7 min, and hemodynamic parameters remained stable intraoperatively. The median (range) recalled visual analogue scale score during surgery was 2 (1–4). The median (range) patient and surgeon satisfaction scores were 2 (1–4) and 1 (1–3), respectively. No serious complications occurred.

Conclusions

Combining ultrasound-guided CSSB and CCNB is a feasible, effective and safe technique for thyroidectomy.

Trial registration

Current Controlled Trials ChiCTR-ONC-12002025. Registered 19 March 2012.

16.

Background

To determine the maximum tolerated dose (MTD) and dose limiting toxicity (DLT) of irinotecan administered in combination with vincristine, temozolomide and bevacizumab in children with refractory solid tumors.

Methods

The study design included two dose levels (DL) of irinotecan given intravenously once daily for 5 consecutive days (DL1: 30 mg/m2; DL2: 50 mg/m2), combined with vincristine 1.5 mg/m2 on days 1 and 8, temozolomide 100 mg/m2 on days 1–5, and bevacizumab 15 mg/kg on day 1, administered every 21 days for a maximum of 12 cycles.

Results

Thirteen patients were enrolled and 12 were evaluable for toxicity. Dose-limiting toxicities included grade 3 hyperbilirubinemia in 1 of 6 patients on DL1 and grade 3 colitis in 1 of 6 patients on DL2. DL2 was determined to be the MTD. A total of 87 cycles were administered. Myelosuppression was mild. Grade 1–2 diarrhea occurred in the majority of cycles, with grade 3 diarrhea occurring in only one cycle. Grade 2 hypertension developed in two patients. Severe hemorrhage, intestinal perforation, posterior leukoencephalopathy, and growth plate abnormalities were not observed. Objective responses were noted in three patients with Wilms tumor and in one patient each with medulloblastoma and hepatocellular carcinoma. Five patients completed all 12 cycles of protocol therapy.

Conclusions

Irinotecan 50 mg/m2/day for 5 days was the MTD when combined with vincristine, temozolomide and bevacizumab administered on a 21 day schedule. Encouraging anti-tumor activity was noted.

Trial Registration

ClinicalTrials.gov; NCT00993044; http://clinicaltrials.gov/show/NCT00993044

17.

Background

While the ergogenic effect of sodium bicarbonate (BICA) on short-term, sprint-type performance has been repeatedly demonstrated, little is known about its effectiveness during prolonged high-intensity exercise in well-trained athletes. Therefore, this study aims to examine the influence of BICA on performance during exhaustive, high-intensity endurance cycling.

Methods

This was a single-center, double-blind, randomized, placebo-controlled cross-over study. Twenty-one well-trained cyclists (mean ± SD: age 24±8 y, BMI 21.3±1.7 kg/m², VO2peak 67.3±9.8 ml·kg−1·min−1) were randomly allocated to sequences of the following interventions: oral ingestion of 0.3 g·kg−1 BICA or 4 g of sodium chloride (placebo). One hour after ingestion, subjects exercised for 30 min at 95% of the individual anaerobic threshold (IAT), followed by exercise at 110% IAT until exhaustion. Prior to these constant-load tests, stepwise incremental exercise tests were conducted under both conditions to determine the IAT and VO2peak. Analyses of blood gas parameters, blood lactate (BLa), and gas exchange were conducted before, during, and after the tests. The main outcome measure was the time to exhaustion in the constant-load test.

Results

Cycling time to exhaustion was improved (p<0.05) under BICA (49.5±11.5 min) compared with placebo (45.0±9.5 min). No differences in maximal or sub-maximal measures of performance were observed during stepwise incremental tests. BICA ingestion resulted in an increased pH, bicarbonate concentration and BLa before, throughout and after both exercise testing modes.

Conclusion

The results suggest that ingestion of BICA may improve prolonged, high-intensity cycling performance.

Trial Registration

German Clinical Trials Register (DRKS) DRKS00006198.

18.

Background

The effect of tranexamic acid (TXA) on bleeding and on the quality of the surgical field during functional endoscopic sinus surgery (FESS) is not yet clear. This study was conducted to answer this question.

Methods

This trial was conducted on 60 patients with chronic sinusitis at Beasat Hospital, Hamadan, Iran, from April to November 2013. Thirty patients in the intervention group received three pledgets soaked with TXA 5% and phenylephrine 0.5% in each nasal cavity for 10 minutes before surgery. Thirty patients in the control group received phenylephrine 0.5% in the same way. The amount of bleeding and the quality of the surgical field were evaluated at 15, 30, and 45 minutes after the start of surgery using the Boezaart grading scale.

Results

The quality of the surgical field was significantly better in the intervention group than in the control group during the first (P = 0.002) and second quarters (P = 0.003), but not during the third (P = 0.163). Furthermore, the amount of bleeding was much lower during all periods in the intervention group than in the control group (P = 0.001).

Conclusion

Topical TXA can efficiently reduce bleeding and improve the surgical field in FESS in patients with rhinosinusitis. Based on these findings, topical TXA may be a useful method for providing a suitable surgical field during the first 30 minutes after use.

Trial Registration

Iranian Registry of Clinical Trials IRCT201212139014N15

19.

Objective

In this investigation, we aimed to study deep brain stimulation (DBS) battery drain, with particular attention to patient symptoms prior to and following battery replacement.

Background

Previously our group developed web-based calculators and smart phone applications to estimate DBS battery life (http://mdc.mbi.ufl.edu/surgery/dbs-battery-estimator).

Methods

A cohort of 320 patients undergoing DBS battery replacement from 2002 to 2012 was included in an IRB-approved study. Statistical analysis was performed using SPSS 20.0 (IBM, Armonk, NY).

Results

The mean charge density for treatment of Parkinson’s disease was 7.2 µC/cm²/phase (SD = 3.82), for dystonia was 17.5 µC/cm²/phase (SD = 8.53), for essential tremor was 8.3 µC/cm²/phase (SD = 4.85), and for OCD was 18.0 µC/cm²/phase (SD = 4.35). There was a significant relationship between charge density and battery life (r = −.59, p<.001), as well as total power and battery life (r = −.64, p<.001). The UF estimator (r = .67, p<.001) and the Medtronic helpline (r = .74, p<.001) predictions of battery life were significantly positively associated with actual battery life. Battery status indicators on Soletra and Kinetra were poor predictors of battery life. In 38 cases, the symptoms improved following a battery change, suggesting that the neurostimulator was likely responsible for symptom worsening. For these cases, both the UF estimator and the Medtronic helpline were significantly correlated with battery life (r = .65 and r = .70, respectively, both p<.001).
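Charge density per phase, the quantity reported above in µC/cm²/phase, is conventionally calculated from the stimulation amplitude, pulse width, and electrode contact surface area. A hedged sketch (the 0.06 cm² contact area and the example settings are illustrative assumptions, not parameters from the paper):

```python
# Sketch: charge density per phase for a DBS setting, in uC/cm^2/phase.
# Charge per phase = current x pulse width; density = charge / contact surface area.
# The 0.06 cm^2 contact area and the example settings are illustrative assumptions.

def charge_density_uc_cm2(current_ma: float, pulse_width_us: float,
                          contact_area_cm2: float = 0.06) -> float:
    charge_uc = current_ma * pulse_width_us / 1000.0   # mA * us = nC; /1000 -> uC per phase
    return charge_uc / contact_area_cm2

def charge_density_from_voltage(volts: float, impedance_ohm: float,
                                pulse_width_us: float,
                                contact_area_cm2: float = 0.06) -> float:
    # Voltage-controlled devices: approximate the current via Ohm's law.
    current_ma = volts / impedance_ohm * 1000.0
    return charge_density_uc_cm2(current_ma, pulse_width_us, contact_area_cm2)

# Example: 3.0 V, 1000 Ohm impedance, 90 us pulse width -> 4.5 uC/cm^2/phase
print(f"{charge_density_from_voltage(3.0, 1000.0, 90.0):.1f} uC/cm^2/phase")
```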

Conclusions

Battery estimations, charge density, total power and clinical symptoms were important factors. The observation of clinical worsening that was rescued following neurostimulator replacement reinforces the notion that changes in clinical symptoms can be associated with battery drain.

20.

Background

Short cycle treatment interruption could reduce toxicity and drug costs and contribute to further expansion of antiretroviral therapy (ART) programs.

Methods

A 72-week non-inferiority trial enrolled 146 HIV-positive persons receiving ART (CD4+ cell count ≥125 cells/mm3 and plasma HIV RNA <50 copies/ml) into one of three arms: continuous treatment, 7 days on/7 days off, or 5 days on/2 days off. The primary endpoint was ART treatment failure, determined by plasma HIV RNA level, CD4+ cell count decrease, death attributed to study participation, or opportunistic infection.

Results

Following enrollment of 32 participants, the 7 days on/7 days off arm was closed because of a failure rate of 31%. Six of 52 (11.5%) participants in the 5 days on/2 days off arm failed: five had virologic failure and one had immunologic failure. Eleven of 51 (21.6%) participants in the continuous treatment arm failed: nine had virologic failure, with 1 death (lactic acidosis) and 1 clinical failure (extra-pulmonary TB). The upper 97.5% confidence boundary for the difference between the percentage of non-failures in the 5 days on/2 days off arm (88.5% non-failure) and continuous treatment (78.4% non-failure) was 4.8%, which is well within the preset non-inferiority margin of 15%. No significant difference was found in time to failure between the 2 study arms (p = 0.39).
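For orientation, the non-inferiority comparison comes down to an upper confidence bound on the between-arm difference in non-failure proportions. A simple Wald-style sketch using the counts reported above (the trial's prespecified method is not stated, so this only approximates the published 4.8%):

```python
# Sketch: upper 97.5% Wald confidence bound for the difference in non-failure rates
# (continuous minus 5-on/2-off), compared against the 15% non-inferiority margin.
# A simple normal approximation; the trial's prespecified method may differ slightly.
import math

nonfail_int, n_int = 52 - 6, 52     # 5 days on / 2 days off arm: 46/52 non-failures
nonfail_con, n_con = 51 - 11, 51    # continuous arm: 40/51 non-failures

p_int = nonfail_int / n_int         # ~0.885
p_con = nonfail_con / n_con         # ~0.784

diff = p_con - p_int                # negative: the intermittent arm had fewer failures
se = math.sqrt(p_int * (1 - p_int) / n_int + p_con * (1 - p_con) / n_con)
upper_975 = diff + 1.96 * se

print(f"difference = {diff:.3f}, upper 97.5% bound = {upper_975:.3f} (margin 0.15)")
# With these counts the bound is roughly 0.04, in the range of the reported 4.8%
# and well below the 0.15 margin.
```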

Conclusions

Short cycle 5 days on/2 days off intermittent ART was at least as effective as continuous therapy.

Trial Registration

ClinicalTrials.gov NCT00339456
