Similar Articles
20 similar articles retrieved.
1.

Background/Objective

Inadvertent intraoperative hypothermia (core temperature <36.0 °C) is a recognized risk in surgery and has adverse consequences. However, no data about this complication in China are available. Our study aimed to determine the incidence of inadvertent intraoperative hypothermia and its associated risk factors in a sample of Chinese patients.

Methods

We conducted a regional cross-sectional survey in Beijing from August through December 2013. Eight hundred thirty patients who underwent various operations under general anesthesia were randomly selected from 24 hospitals through multistage probability sampling. Multivariate logistic regression analyses were applied to explore the risk factors for developing hypothermia.
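
For readers who want to see the mechanics, the sketch below shows how adjusted odds ratios and 95% CIs of the kind reported in the Results can be derived from a multivariable logistic regression. It is illustrative only and not the authors' code; the file name and column names (hypothermia, active_warming, and so on) are hypothetical.

```python
# Minimal sketch (not the study's code): adjusted ORs with 95% CIs from a
# multivariable logistic regression. The file and all column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("hypothermia_survey.csv")  # hypothetical data set

model = smf.logit(
    "hypothermia ~ active_warming + overweight + baseline_core_temp + "
    "ambient_temp + major_plus_operation + anesthesia_duration + unwarmed_iv_fluid",
    data=df,
).fit()

# Exponentiating the coefficients and their confidence bounds gives ORs and 95% CIs.
or_table = np.exp(pd.concat([model.params, model.conf_int()], axis=1))
or_table.columns = ["OR", "2.5%", "97.5%"]
print(or_table)
```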

Results

The overall incidence of intraoperative hypothermia was high, at 39.9%. All patients were warmed passively with surgical sheets or cotton blankets, whereas only 10.7% of patients received active warming with space heaters or electric blankets. Pre-warmed intravenous fluids were administered to 16.9% of patients, and 34.6% of patients had their wounds irrigated with pre-warmed fluid. Active warming (OR = 0.46, 95% CI 0.26–0.81), overweight or obesity (OR = 0.39, 95% CI 0.28–0.56), high baseline core temperature before anesthesia (OR = 0.08, 95% CI 0.04–0.13), and high ambient temperature (OR = 0.89, 95% CI 0.79–0.98) were significant protective factors against hypothermia. In contrast, major-plus operations (OR = 2.00, 95% CI 1.32–3.04), duration of anesthesia of 1–2 h (OR = 3.23, 95% CI 2.19–4.78) or >2 h (OR = 3.44, 95% CI 1.90–6.22), and un-warmed intravenous fluids (OR = 2.45, 95% CI 1.45–4.12) significantly increased the risk of hypothermia.

Conclusions

The incidence of inadvertent intraoperative hypothermia in Beijing is high, and the rate of active warming of patients during operations is low. Concern for the development of intraoperative hypothermia should be especially high in patients undergoing major operations, requiring long periods of anesthesia, and receiving un-warmed intravenous fluids.

2.

Objectives

The purpose of this study was to compare the efficacy and safety of single-dose intra-articular morphine plus bupivacaine versus morphine alone in patients undergoing arthroscopic knee surgery.

Methods

Randomized controlled trials comparing a combination of morphine and bupivacaine with morphine alone injected intra-articularly for the management of pain after arthroscopic knee surgery were retrieved (up to August 10, 2014) from the MEDLINE, Cochrane Library, and Embase databases. Weighted mean differences (WMD), relative risks (RR), and their corresponding 95% confidence intervals (CIs) were calculated using RevMan statistical software.
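
As an illustration of the pooling step (the study itself used RevMan), the sketch below computes a random-effects weighted mean difference with the DerSimonian-Laird estimator; every per-trial value shown is a placeholder, not data from the included trials.

```python
# Illustrative DerSimonian-Laird random-effects pooling of mean differences.
# The per-trial means, SDs, and sample sizes are placeholders, not study data.
import numpy as np

# Each row: (mean_treat, sd_treat, n_treat, mean_ctrl, sd_ctrl, n_ctrl)
trials = np.array([
    (2.1, 1.5, 30, 3.4, 1.6, 30),
    (1.8, 1.2, 25, 2.7, 1.4, 26),
    (2.5, 1.8, 40, 3.2, 1.7, 38),
])

m1, s1, n1, m2, s2, n2 = trials.T
md = m1 - m2                              # per-trial mean difference
var = s1**2 / n1 + s2**2 / n2             # variance of each mean difference
w = 1.0 / var                             # inverse-variance (fixed-effect) weights

# DerSimonian-Laird between-trial variance (tau^2)
md_fixed = np.sum(w * md) / np.sum(w)
q = np.sum(w * (md - md_fixed) ** 2)
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (q - (len(md) - 1)) / c)

w_re = 1.0 / (var + tau2)                 # random-effects weights
pooled = np.sum(w_re * md) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))
print(f"WMD = {pooled:.2f}, 95% CI {pooled - 1.96*se:.2f} to {pooled + 1.96*se:.2f}")
```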

Results

Thirteen randomized controlled trials were included. Statistically significant differences were observed in the VAS scores during the immediate period (0–2 h) (WMD −1.16; 95% CI −2.01 to −0.31; p = 0.007) and in the time to first request for rescue analgesia (WMD = 2.05; 95% CI 0.19 to 3.92; p = 0.03). However, there was no significant difference in the VAS pain score during the early period (2–6 h) (WMD −0.36; 95% CI −1.13 to 0.41; p = 0.35) or the late period (6–48 h) (WMD 0.11; 95% CI −0.40 to 0.63; p = 0.67), nor in the number of patients requiring supplementary analgesia (RR = 0.78; 95% CI 0.57 to 1.05; p = 0.10). In addition, the systematic review showed that intra-articular morphine plus bupivacaine did not increase the incidence of adverse effects compared with morphine alone.

Conclusion

The present study suggested that the administration of single-dose intra-articular morphine plus bupivacaine provided better pain relief during the immediate period (0–2 h) and lengthened the time to the first request for rescue analgesia, without increasing short-term side effects, compared with morphine alone.

Level of Evidence

Level I, meta-analysis of Level I studies.

3.

Background

Badminton players often perform powerful, long-distance lunges during competitive matches. The objective of this study was to compare the plantar loads of three one-step maximum forward lunges in badminton.

Methods

Fifteen right-handed male badminton players participated in the study. Each participant performed five successful maximum lunges in each of three directions, wearing three different shoe brands for each direction. Plantar loading, including peak pressure, maximum force, and contact area, was measured using an insole pressure measurement system. Two-way ANOVA with repeated measures was employed to determine the effects of lunge direction and shoe, as well as their interaction, on the measurements.
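
To make the analysis concrete, here is a minimal sketch of a two-way repeated-measures ANOVA (direction × shoe) as it might be run in Python; it is not the authors' code, and the file and column names are hypothetical.

```python
# Illustrative two-way repeated-measures ANOVA (lunge direction x shoe brand).
# File and column names (subject, direction, shoe, peak_pressure) are hypothetical.
import pandas as pd
from statsmodels.stats.anova import AnovaRM

df = pd.read_csv("plantar_loads_long.csv")  # hypothetical long-format data

aov = AnovaRM(
    data=df,
    depvar="peak_pressure",
    subject="subject",
    within=["direction", "shoe"],   # 3 directions x 3 shoe brands
    aggregate_func="mean",          # average the five repeated lunges per cell
).fit()
print(aov)
```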

Results

The maximum force (MF) on the lateral midfoot was lower when performing left-forward lunges than when performing front-forward lunges (p = 0.006, 95% CI = −2.88 to −0.04%BW). The MF and peak pressures (PP) on the great toe region were lower for the front-forward lunge than for the right-forward lunge (MF, p = 0.047, 95% CI = −3.62 to −0.02%BW; PP, p = 0.048, 95% CI = −37.63 to −0.16 kPa) and the left-forward lunge (MF, p = 0.015, 95% CI = −4.39 to −0.38%BW; PP, p = 0.008, 95% CI = −47.76 to −5.91 kPa).

Conclusions

These findings indicate that, compared with the front-forward lunge, left- and right-forward maximum lunges induce greater plantar loads on the great toe region of the dominant leg of badminton players. The differences in plantar loads across lunge directions may pose a potential risk of injury to the lower extremities of badminton players.

4.

Background

Whether additional benefit can be achieved with the use of trimetazidine (TMZ) in patients with chronic heart failure (CHF) remains controversial. We therefore performed a meta-analysis of randomized controlled trials (RCTs) to evaluate the effects of TMZ treatment in CHF patients.

Methods

We searched the PubMed, EMBASE, and Cochrane databases through October 2013 and included 19 RCTs involving 994 CHF patients who received TMZ or placebo. Risk ratios (RR) and weighted mean differences (WMD) were calculated using fixed- or random-effects models.

Results

TMZ therapy was associated with considerable improvement in left ventricular ejection fraction (WMD: 7.29%, 95% CI: 6.49 to 8.09, p<0.01) and New York Heart Association classification (WMD: −0.55, 95% CI: −0.81 to −0.28, p<0.01). Moreover, treatment with TMZ also resulted in significant decreases in left ventricular end-systolic volume (WMD: −17.09 ml, 95% CI: −20.15 to −14.04, p<0.01), left ventricular end-diastolic volume (WMD: −11.24 ml, 95% CI: −14.06 to −8.42, p<0.01), hospitalization for cardiac causes (RR: 0.43, 95% CI: 0.21 to 0.91, p = 0.03), B-type natriuretic peptide (BNP; WMD: −157.08 pg/ml, 95% CI: −176.55 to −137.62, p<0.01), and C-reactive protein (CRP; WMD: −1.86 mg/l, 95% CI: −2.81 to −0.90, p<0.01). However, there were no significant differences in exercise duration or all-cause mortality between patients treated with TMZ and those given placebo.

Conclusions

TMZ treatment in CHF patients may improve clinical symptoms and cardiac function, reduce hospitalization for cardiac causes, and decrease serum levels of BNP and CRP.

5.

Objective

The English-language Pregnancy-Unique Quantification of Emesis and Nausea (PUQE) questionnaire identifies women with severe hyperemesis gravidarum. Our aim was to investigate whether scores from the translated Norwegian version, SUKK (SvangerskapsUtløst Kvalme Kvantifisering), were associated with the severity of hyperemesis and with nutritional intake.

Design

A prospective cohort validation study.

Setting

Hospital cohort of Hyperemesis Gravidarum (HG) patients from western Norway and healthy pregnant women from Bergen, Norway.

Sample

38 women hospitalized due to HG and 31 healthy pregnant controls attending routine antenatal check-ups at health centers.

Methods

Data were collected from May 2013 to January 2014. The study participants answered the Norwegian PUQE questionnaire (scores ranging from 3 to 15) and prospectively recorded their 24-hour nutritional intake using a food list form.

Main outcome measures

Differences in PUQE scores, QOL scores, and nutritional intake between hyperemesis patients and controls.

Results

Hyperemesis patients had a shorter gestational age than controls (median 9.7 weeks, 95% CI 8.6–10.6, versus 11.9 weeks, 95% CI 10.1–12.9; p=0.004) and a larger weight change from pre-pregnancy weight (median loss of 3 kg, 95% CI 3–4, versus gain of 2 kg, 95% CI 0.5–2; p<0.001); otherwise, the groups were similar with regard to pre-pregnancy BMI, age, gravidity, and weight at inclusion. Compared with controls, hyperemesis patients had significantly higher PUQE scores (median 13, 95% CI 11–14, vs. 7, 95% CI 4–8), lower QOL (median score 3, 95% CI 2–4, vs. 6, 95% CI 4.5–8), and lower nutritional intake (median energy intake 990 kcal/24 hours, 95% CI 709–1233, vs. 1652, 95% CI 1558–1880; all p<0.001). PUQE score was inversely correlated with nutritional intake (r = −0.5, p<0.001). At discharge, the PUQE score had fallen to a median of 6 (95% CI 5–8) and the QOL score had risen to 7 (95% CI 6–8) in the HG group (both p<0.001 compared with admission values).

Conclusion

PUQE-scoring has been validated as a robust indicator of severe hyperemesis gravidarum and insufficient nutritional intake in a Norwegian setting.

6.

Background

Red cell distribution width (RDW) is a routine laboratory measure associated with poor outcomes in adult critical illness.

Objective

We determined the utility of RDW as an early pragmatic biomarker for outcome in pediatric critical illness.

Methods

We used multivariable logistic regression to test the association of RDW on the first day of pediatric intensive care unit (PICU) admission with prolonged PICU length of stay (LOS) >48 hours and with mortality. The area under the receiver operating characteristic curve (AUROC) for RDW was compared with that of the Pediatric Index of Mortality (PIM)-2 score.
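
As a pragmatic illustration of the discrimination analysis (not the authors' code), the AUROC of RDW and of the PIM-2 score against mortality can be computed as sketched below; the file and column names are hypothetical, and confidence intervals would typically be obtained by bootstrapping.

```python
# Illustrative AUROC computation for RDW and the PIM-2 score against PICU mortality.
# File and column names (died, rdw_day1, pim2_risk) are hypothetical.
import pandas as pd
from sklearn.metrics import roc_auc_score

df = pd.read_csv("picu_admissions.csv")  # hypothetical data set

auc_rdw = roc_auc_score(df["died"], df["rdw_day1"])
auc_pim2 = roc_auc_score(df["died"], df["pim2_risk"])
print(f"AUROC RDW: {auc_rdw:.2f}  AUROC PIM-2: {auc_pim2:.2f}")
```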

Results

Over a 13-month period, 596 unique patients had RDW measured on the first day of PICU admission. Sepsis was an effect modifier for LOS >48 hours but not for mortality. In sepsis, RDW was not associated with LOS >48 hours. For patients without sepsis, each 1% increase in RDW was associated with 1.17-fold (95% CI 1.06, 1.30) increased odds of LOS >48 hours. In all patients, RDW was independently associated with PICU mortality (OR 1.25, 95% CI 1.09, 1.43). The AUROC for RDW to predict LOS >48 hours and mortality was 0.61 (95% CI 0.56, 0.66) and 0.65 (95% CI 0.55, 0.75), respectively. Although the AUROC for mortality was comparable to that of PIM-2 (0.75, 95% CI 0.66, 0.83; p = 0.18), RDW did not increase the discriminative utility when added to PIM-2. Despite the moderate AUROC, patients with RDW <13.4% (upper limit of the lower quartile) had a 53% risk of LOS >48 hours and a 3.3% risk of mortality, compared with a 78% risk of LOS >48 hours and a 12.9% risk of mortality in patients with RDW >15.7% (lower limit of the upper quartile) (p<0.001 for both outcomes).

Conclusions

Elevated RDW was associated with outcome in pediatric critical illness and provided similar prognostic information as the more complex PIM-2 severity of illness score. Distinct RDW thresholds best discriminate low- versus high-risk patients.

7.

Background

We performed an updated meta-analysis of randomized placebo-controlled trials testing memantine monotherapy for patients with Alzheimer’s disease (AD).

Methods

The meta-analysis included randomized controlled trials of memantine monotherapy for AD, omitting those in which patients were also administered a cholinesterase inhibitor. Cognitive function, activities of daily living, behavioral disturbances, global function, stage of dementia, drug discontinuation rate, and individual side effects were compared between memantine monotherapy and placebo groups. The primary outcomes were cognitive function and behavioral disturbances; the others were secondary outcomes.

Results

Nine studies including 2433 patients met the inclusion criteria. Memantine monotherapy significantly improved cognitive function [standardized mean difference (SMD)=−0.27, 95% confidence interval (CI)=−0.39 to −0.14, p=0.0001], behavioral disturbances (SMD=−0.12, 95% CI=−0.22 to −0.01, p=0.03), activities of daily living (SMD=−0.09, 95% CI=−0.19 to −0.00, p=0.05), global function assessment (SMD=−0.18, 95% CI=−0.27 to −0.09, p=0.0001), and stage of dementia (SMD=−0.23, 95% CI=−0.33 to −0.12, p=0.0001) scores. Memantine was superior to placebo in terms of discontinuation because of inefficacy [risk ratio (RR)=0.36, 95% CI=0.17 to 0.74, p=0.006, number needed to harm (NNH)=not significant]. Moreover, memantine was associated with less agitation compared with placebo (RR=0.68, 95% CI=0.49 to 0.94, p=0.02, NNH=not significant). There were no significant differences between the memantine monotherapy and placebo groups in the rates of discontinuation for any cause, all adverse events, or individual side effects other than agitation.
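
For context, the standardized mean difference used above is, for a single trial, the between-group difference in means divided by the pooled standard deviation; the sketch below shows the calculation with placeholder numbers only (the pooled SMDs in this meta-analysis additionally combine trials with inverse-variance weighting).

```python
# Illustrative standardized mean difference (Cohen's d) for one trial;
# all numbers below are placeholders, not data from the included studies.
import math

def smd(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Between-group mean difference divided by the pooled SD.
    Negative values favour treatment when lower scores indicate improvement."""
    pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

print(round(smd(20.1, 6.5, 120, 21.9, 6.8, 118), 2))  # placeholder values
```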

Conclusions

Memantine monotherapy improved cognition, behavior, activities of daily living, global function, and stage of dementia and was well-tolerated by AD patients. However, the effect size in terms of efficacy outcomes was small and thus there is limited evidence of clinical benefit.

8.

Objective

We performed a systematic review and meta-analysis of double-blind, randomized, placebo-controlled trials evaluating suvorexant for primary insomnia.

Methods

Relevant studies were identified through searches of PubMed, the Cochrane Library databases, and PsycINFO for citations through June 27, 2015. We performed a systematic review and meta-analysis of efficacy and safety outcomes from the suvorexant trials. The primary efficacy outcomes were subjective total sleep time (sTST) and subjective time-to-sleep onset (sTSO) at 1 month. The secondary outcomes were other efficacy measures, the discontinuation rate, and individual adverse events. Risk ratios, numbers needed to treat/harm, and weighted mean differences (WMD), with 95% confidence intervals (CI), were calculated based on a random-effects model.

Results

The computerized literature database search initially yielded 48 results, from which 37 articles were excluded after review of titles and abstracts, and another eight review articles were excluded after full-text review. Thus, we identified 4 trials including a total of 3,076 patients. Suvorexant was superior to placebo with regard to the two primary efficacy outcomes (sTST: WMD = −20.16, 95% CI = −25.01 to −15.30, 1889 patients, 3 trials; sTSO: WMD = −7.62, 95% CI = −11.03 to −4.21, 1889 patients, 3 trials) and did not differ from placebo in trial discontinuations. Suvorexant caused a higher incidence than placebo of at least one side effect, as well as of abnormal dreams, somnolence, excessive daytime sleepiness/sedation, fatigue, dry mouth, and rebound insomnia.

Conclusions

Our analysis of published trial results suggests that suvorexant is effective in treating primary insomnia and is well-tolerated.

9.

Background

Shift work has been suggested to be associated with adverse metabolic disorders. However, the potential effects of shift work on metabolic syndrome (MetS) and its components have not been well established.

Methods

In total, 26,382 workers from the Dongfeng-Tongji Cohort were included in this study. Information on shift work history was gathered through questionnaires, and metabolic traits were measured. Logistic regression models were used to calculate odds ratios (OR) and 95% confidence intervals (CI) for the association of long-term shift work with MetS and with each of its components. Further stratified analysis was performed to detect differences in MetS between female and male shift workers.

Results

Long-term shift work was associated with MetS without adjustment for any confounders. Compared with the non-shift-work group, the multivariate-adjusted ORs (95% CI) of MetS associated with 1–10, 11–20, and ≥20 years of shift work were 1.05 (0.95–1.16), 1.14 (1.03–1.26), and 1.16 (1.01–1.31), respectively. In female workers, we found a dose-response relationship in which every 10-year increase in shift work was associated with a 10% (95% CI: 1%–20%) higher odds of MetS, whereas no significant dose-response trend was found among male workers. Furthermore, shift work duration was significantly associated with higher odds of high blood pressure (1.07, 1.01–1.13), large waist circumference (1.10, 1.01–1.20), and high glucose levels (1.09, 1.04–1.15). No significant association was observed between shift work and low HDL cholesterol or raised triglyceride levels.

Conclusions

Long-term shift work was associated with metabolic syndrome in retired workers, and the association might differ by gender. Applicable intervention strategies are needed to prevent metabolic disorders in shift workers.

10.

Objectives

We aimed to describe and compare the prevalence of vitamin D deficiency between HIV-negative and HIV-infected veterans in the southern United States, and to determine risk factors for vitamin D deficiency for HIV infected patients.

Methods

This was a cross-sectional, retrospective study including all patients followed at the Atlanta VA Medical Center with a first 25-hydroxyvitamin D [25(OH)D] level determined between January 2007 and August 2010. Multivariate logistic regression analysis was used to determine risk factors associated with vitamin D deficiency (<20 ng/ml).

Results

There was a higher prevalence of 25(OH)D deficiency among HIV-positive than HIV-negative patients (53.2% vs. 38.5%, p<0.001). Independent risk factors for vitamin D deficiency in HIV-positive patients included black race (OR 3.24, 95% CI 2.28–4.60), winter season (OR 1.39, 95% CI 1.05–1.84), and higher GFR (OR 1.01, 95% CI 1.00–1.01); increasing age (OR 0.98, 95% CI 0.95–0.98) and tenofovir use (OR 0.72, 95% CI 0.54–0.96) were associated with less vitamin D deficiency.

Conclusions

Vitamin D deficiency is a prevalent problem that varies inversely with age and affects HIV-infected patients more than other veterans in care. In addition to age, tenofovir and kidney disease seem to confer a protective effect from vitamin D deficiency in HIV-positive patients.

11.

Objective

To assess the predictive factors for subjective improvement with nonsurgical treatment in consecutive patients with lumbar spinal stenosis (LSS).

Materials and Methods

Patients with LSS were enrolled from 17 medical centres in Japan. We followed up 274 patients (151 men; mean age, 71 ± 7.4 years) for 3 years. A multivariable logistic regression model was used to assess the predictive factors for subjective symptom improvement with nonsurgical treatment.

Results

In 30% of patients, conservative treatment led to a subjective improvement in symptoms; in the remaining 70%, symptoms remained unchanged or worsened, or the patients required surgical treatment. In the multivariable analysis of predictive factors for subjective improvement with nonsurgical treatment, the absence of cauda equina symptoms (only radicular symptoms) had an odds ratio (OR) of 3.31 (95% confidence interval [CI]: 1.50–7.31); absence of degenerative spondylolisthesis/scoliosis had an OR of 2.53 (95% CI: 1.13–5.65); <1-year duration of illness had an OR of 3.81 (95% CI: 1.46–9.98); and hypertension had an OR of 2.09 (95% CI: 0.92–4.78).

Conclusions

The predictive factors for subjective symptom improvement with nonsurgical treatment in LSS patients were the presence of only radicular symptoms, absence of degenerative spondylolisthesis/scoliosis, and an illness duration of <1 year.

12.

Background

A high prevalence of erectile dysfunction (ED) has been observed in patients with chronic prostatitis/chronic pelvic pain syndrome (CP/CPPS). However, whether CP/CPPS is a risk factor for ED remains unknown and controversial. Therefore, we conducted this systematic review and meta-analysis to evaluate the relationship between CP/CPPS and ED.

Methods

PubMed, Embase, Web of Science, and The Cochrane Library were searched up to November 11, 2014 to identify studies reporting the association between CP/CPPS and ED. Case–control, cohort, and cross-sectional studies were included, and the quality of the included studies was assessed. The odds ratio of ED and the mean difference in five-item International Index of Erectile Function (IIEF-5) scores were pooled using a random-effects model. Subgroup and sensitivity analyses were performed.

Results

Three cross-sectional studies, two case–control studies, and four retrospective studies with 31,956 participants were included in the calculation of the pooled odds ratio of ED, and two studies with 1499 participants were included in the calculation of the pooled mean difference in IIEF-5 scores. A strong association was found between CP/CPPS and ED (pooled odds ratio: 3.02, 95% CI: 2.18–4.17, P < 0.01), with heterogeneity across studies (I² = 65%; P < 0.01). A significant decrease in the IIEF-5 score was observed in the CP/CPPS group (pooled mean difference: −4.54, 95% CI: −5.11 to −3.98; P < 0.01).

Conclusion

Our study indicates that patients with CP/CPPS have an increased risk of ED. Assessment of erectile function is necessary in the management of patients with CP/CPPS. Further evidence is needed to confirm the relationship between CP/CPPS and ED.

13.

Background

Previous analyses reported age- and gender-related differences in the provision of cardiac care. The objective of the study was to compare circadian disparities in the delivery of primary percutaneous coronary intervention (PCI) for acute myocardial infarction (AMI) according to the patient’s age and gender.

Methods

We investigated patients included in the Acute Myocardial Infarction in Switzerland (AMIS) registry who presented to one of 11 centers in Switzerland providing primary PCI around the clock, and stratified patients according to gender and age.

Findings

A total of 4723 patients presented with AMI between 2005 and 2010; 1319 (28%) were women and 2172 (54%) were ≥65 years of age. More than 90% of patients <65 years of age underwent primary PCI, without differences between genders. Elderly patients, and particularly women, were at increased risk of having primary PCI withheld (males: adj. HR 4.91, 95% CI 3.93–6.13; females: adj. HR 9.31, 95% CI 7.37–11.75) compared with males <65 years of age. An increased risk of a door-to-balloon time >90 minutes was found in elderly males (adj. HR 1.66, 95% CI 1.40–1.95, p<0.001) and elderly females (adj. HR 1.57, 95% CI 1.27–1.93, p<0.001), as well as in females <65 years (adj. HR 1.47, 95% CI 1.13–1.91, p = 0.004), compared with males <65 years of age, with significant differences in circadian patterns between on- and off-duty hours.

Conclusions

In a cohort of patients with AMI in Switzerland, we observed discrimination against elderly patients and females in the circadian provision of primary PCI.

14.

Background

Anterior plate fusion is an effective procedure for the treatment of cervical spinal diseases but is accompanied by a high incidence of postoperative dysphagia. A zero profile (Zero-P) spacer is increasingly being used to reduce postoperative dysphagia and other potential complications associated with surgical intervention. Studies comparing the Zero-P spacer and anterior plate have reported conflicting results.

Methodology

A meta-analysis was conducted to compare the safety, efficacy, radiological outcomes and complications associated with the use of a Zero-P spacer versus an anterior plate in anterior cervical spine fusion for the treatment of cervical spinal disease. We comprehensively searched PubMed, Embase, the Cochrane Library and other databases and performed a meta-analysis of all randomized controlled trials (RCTs) and prospective or retrospective comparative studies assessing the two techniques.

Results

Ten studies enrolling 719 cervical spondylosis patients were included. The pooled data showed significant differences in operation time [SMD = −0.58, 95% CI = −0.77 to −0.40, p < 0.01] and blood loss [SMD = −0.40, 95% CI = −0.59 to −0.21, p < 0.01] between the two groups. Compared with the anterior plate group, the Zero-P group exhibited a significantly improved JOA score and reduced NDI and VAS scores. However, the anterior plate group had greater postoperative segmental and cervical Cobb angles than the Zero-P group at the last follow-up. The fusion rates in the two groups were similar. More importantly, the Zero-P group had a lower incidence of both early and late postoperative dysphagia.

Conclusions

Compared with anterior plate fusion, Zero-P is a safe and effective procedure, with a similar fusion rate and a lower incidence of early and late postoperative dysphagia. However, the results of this meta-analysis should be interpreted with caution owing to the limitations of the study. Further evaluation and large-sample RCTs are required to confirm and update these results.

15.

Background

The post-resuscitation phase after out-of-hospital cardiac arrest (OHCA) is characterised by a systemic inflammatory response (e.g., severe sepsis), for which the immature granulocyte count is a diagnostic marker. In this study we evaluated the prognostic significance of the delta neutrophil index (DNI), which is the difference in leukocyte subfractions as assessed by an automated blood cell analyser, for early mortality after OHCA.

Materials and Methods

OHCA records from the emergency department cardiac arrest registry were retrospectively analysed. Patients who survived at least 24 h after return of spontaneous circulation were included in the analysis. We evaluated mortality and cerebral performance category scores at 30 days.

Results

A total of 83 patients with OHCA were included in the study. Our results showed that DNI >8.4% on day 1 (hazard ratio [HR], 3.227; 95% CI, 1.485–6.967; p = 0.001) and DNI >10.5% on day 2 (HR, 3.292; 95% CI, 1.662–6.519; p<0.001) were associated with increased 30-day mortality in patients with OHCA. Additionally, DNI >8.4% on day 1 (HR, 2.718; 95% CI, 1.508–4.899; p<0.001) and DNI >10.5% on day 2 (HR, 1.709; 95% CI, 1.051–2.778; p = 0.02) were associated with worse neurologic outcomes 30 days after OHCA.

Conclusion

A higher DNI is a promising prognostic marker of 30-day mortality and neurologic outcome after OHCA. Our findings indicate that patients with elevated DNI values after OHCA should be closely monitored so that appropriate treatment strategies can be implemented.

16.

Background and Aims

The fatty liver index (FLI) is an algorithm based on waist circumference, body mass index, and serum levels of triglycerides and gamma-glutamyl transferase, used to identify fatty liver. Although some studies have attempted to validate the FLI, few have been conducted for external validation among Asians. We attempted to validate the FLI for predicting ultrasonographic fatty liver in Taiwanese subjects.
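
The abstract does not restate the FLI formula; for orientation, the commonly cited version is sketched below. The coefficients are reproduced from the widely used published algorithm and should be verified against the original publication before any use.

```python
# Commonly cited fatty liver index (FLI) formula (verify against the original
# publication). Inputs: triglycerides (mg/dL), BMI (kg/m^2), GGT (U/L),
# waist circumference (cm); the output ranges from 0 to 100.
import math

def fatty_liver_index(triglycerides, bmi, ggt, waist):
    y = (0.953 * math.log(triglycerides)
         + 0.139 * bmi
         + 0.718 * math.log(ggt)
         + 0.053 * waist
         - 15.745)
    return 100.0 * math.exp(y) / (1.0 + math.exp(y))

print(round(fatty_liver_index(triglycerides=150, bmi=27, ggt=40, waist=92), 1))  # placeholder inputs
```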

Methods

We enrolled consecutive subjects who received health check-up services at the Taipei Veterans General Hospital from 2002 to 2009. Ultrasonography was applied to diagnose fatty liver. The ability of the FLI to detect ultrasonographic fatty liver was assessed by analyzing the area under the receiver operating characteristic (AUROC) curve.

Results

Among the 29,797 subjects enrolled in this study, fatty liver was diagnosed in 44.5% of the population. In multivariate analysis, subjects with ultrasonographic fatty liver had a significantly higher FLI than those without fatty liver (odds ratio 1.045; 95% confidence interval, CI, 1.044–1.047; p<0.001). Moreover, the FLI had the best discriminative ability to identify patients with ultrasonographic fatty liver (AUROC: 0.827, 95% CI 0.822–0.831). An FLI <25 (negative likelihood ratio (LR−) 0.32) for males and <10 (LR− 0.26) for females ruled out ultrasonographic fatty liver, whereas an FLI ≥35 (positive likelihood ratio (LR+) 3.12) for males and ≥20 (LR+ 4.43) for females ruled in ultrasonographic fatty liver.

Conclusions

The FLI accurately identified ultrasonographic fatty liver in a large-scale population in Taiwan, but with lower cut-off values than in Western populations. The cut-off values were also lower in females than in males.

17.

Background

Screening of household contacts of tuberculosis (TB) patients is a recommended strategy to improve early case detection. While it has been widely implemented in low-prevalence countries, the optimal protocols for contact investigation in high-prevalence, low-resource settings are yet to be determined. This study evaluated contact investigation interventions in eleven low- and middle-income countries and reviewed the association between context- or program-related factors and the yield of cases among contacts.

Methods

We reviewed data from nineteen first-wave TB REACH-funded projects piloting innovations to improve case detection. These nineteen projects fulfilled the eligibility criteria: implementation of contact investigation and complete data reporting. We performed a cross-sectional analysis of the percentage yield and case notifications for each project. Implementation strategies were delineated, and the association between independent variables and yield was analyzed by fitting a random-effects logistic regression.

Findings

Overall, the nineteen interventions screened 139,052 household contacts, with great heterogeneity in the percentage yield of microscopy-confirmed (SS+) cases, ranging from 0.1% to 6.2%. Compared with the most restrictive testing criterion (at least two weeks of cough), the aORs for less restrictive (any TB-related symptom) and least restrictive (all contacts) testing criteria were 1.71 (95% CI 0.94–3.13) and 6.90 (95% CI 3.42–13.93), respectively. The aOR for inclusion of SS− and extrapulmonary TB index cases was 0.31 (95% CI 0.15–0.62) compared with restricting index cases to SS+ TB. Contact investigation contributed between <1% and 14% of all SS+ cases diagnosed in the intervention areas.

Conclusions

This study confirms that high numbers of active TB cases can be identified through contact investigation in a variety of contexts. However, design and program implementation factors appear to influence the yield of contact investigation and its concomitant contribution to TB case detection.

18.

Background

Peptidylprolyl cis/trans isomerase NIMA-interacting 1 (PIN1) is involved in the process of tumorigenesis. Two single nucleotide polymorphisms (−667T>C and −842G>C) in the PIN1 promoter region have long been suspected of being associated with cancer risk, but findings remain inconclusive.

Methods

Eligible case-control studies were retrieved by searching databases and the references of related reviews and studies. Genotype distribution data, adjusted odds ratios (ORs), and 95% confidence intervals (CIs) were extracted to calculate pooled ORs.

Results

A total of 4619 cancer cases and 4661 controls were included in this meta-analysis. Overall, the PIN1 −667T>C polymorphism was not associated with cancer risk, while the −842C allele was significantly associated with reduced cancer risk (CC+GC vs. GG, OR = 0.725, 95% CI: 0.607–0.865; Pheterogeneity = 0.012 and GC vs. GG: OR = 0.721, 95% CI: 0.591–0.880; Pheterogeneity = 0.003). Results from genotype distribution data were in agreement with those calculated with adjusted ORs and 95% CIs. No publication bias was detected.

Conclusions

Results of this meta-analysis suggest that the PIN1 −842G>C polymorphism is associated with decreased cancer risk, but that the −667T>C polymorphism is not.

19.

Objective

To test the hypothesis that acute myocardial infarction (AMI) might accelerate development of new onset diabetes in patients with coronary artery disease independent of known risk factors.

Methods

We conducted a retrospective cohort study within the COACT (CathOlic medical center percutAneous Coronary inTervention) registry. Of a total of 9,127 subjects, 2,036 were diabetes-naïve and were followed up for at least one year with both index and follow-up laboratory data on diabetes. A Cox proportional hazards model was used to derive hazard ratios (HRs) and 95% confidence intervals (CIs) for new-onset diabetes associated with AMI in univariate and multivariate analyses after adjusting for several covariates.
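
As an illustration of the modelling step (not the COACT analysis itself), a Cox proportional hazards fit that yields adjusted hazard ratios can be sketched as follows; the file, column names, and covariates are hypothetical.

```python
# Illustrative Cox proportional hazards model for new-onset diabetes using lifelines.
# File, column names, and covariates are hypothetical, not from the COACT registry.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("diabetes_naive_cohort.csv")  # hypothetical data set

cph = CoxPHFitter()
cph.fit(
    df[["followup_years", "new_onset_dm", "ami", "age", "bmi", "hypertension"]],
    duration_col="followup_years",
    event_col="new_onset_dm",
)
cph.print_summary()  # hazard ratios (exp(coef)) with 95% CIs
```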

Results

The overall hazard for diabetes was higher in AMI than in non-AMI patients (p by log-rank <0.01), with an HR of 1.78 (95% CI 1.37–2.32) in univariate analysis. This association remained significant after adjusting for covariates (HR, 1.54; 95% CI, 1.14–2.07; p<0.01). AMI was an independent predictor of a higher quartile of WBC count in multivariate ordinal logistic regression analysis (OR, 6.75; 95% CI, 5.53–8.22; p<0.01). In subgroup analysis, the diabetogenic effect of AMI was more prominent in the subgroup without MetS than in MetS patients (p for interaction <0.05). Compared with the reference group of non-AMI+non-MetS, the AMI+non-MetS (HR, 2.44; 95% CI, 1.58–3.76), non-AMI+MetS (HR, 3.42; 95% CI, 2.34–4.98), and AMI+MetS (HR, 4.12; 95% CI, 2.67–6.36) groups showed higher HRs after adjusting for covariates. However, the hazard did not differ between the non-AMI+MetS and AMI+non-MetS groups.

Conclusions

AMI patients have a greater risk of new-onset diabetes than non-AMI patients, especially those with mild metabolic abnormalities.

20.

Objective

A systematic review and meta-analysis were carried out to summarize the currently published studies and to evaluate LINE-1 hypomethylation in blood and other tissues as an epigenetic marker of cancer risk.

Methods

A systematic literature search of the Medline database, using PubMed, was conducted for epidemiological studies published before March 2014. A random-effects model was used to estimate weighted mean differences (MDs) with 95% confidence intervals (CIs). Furthermore, subgroup analyses were conducted by sample type (tissue or blood), cancer type, and the assay used to measure global DNA methylation levels. The Cochrane software package Review Manager 5.2 was used.

Results

A total of 19 unique articles on 6107 samples (2554 from cancer patients and 3553 control samples) were included in the meta-analysis. LINE-1 methylation levels were significantly lower in cancer patients than in controls (MD: −6.40, 95% CI: −7.71 to −5.09; p<0.001). The significant difference in methylation levels was confirmed in tissue samples (MD: −7.55; 95% CI: −9.14 to −5.95; p<0.001), but not in blood samples (MD: −0.26, 95% CI: −0.69 to 0.17; p = 0.23). LINE-1 methylation levels were also significantly lower in colorectal and gastric cancer patients than in controls (MD: −8.33; 95% CI: −10.56 to −6.10; p<0.001 and MD: −5.75; 95% CI: −7.75 to −3.74; p<0.001, respectively), whereas no significant difference was observed for hepatocellular cancer.

Conclusions

The present meta-analysis adds new evidence to the growing literature on the role of LINE-1 hypomethylation in human cancer and demonstrates that LINE-1 methylation levels were significantly lower in cancer patients than in controls, especially for certain cancer types. This result was confirmed in tissue samples, both fresh/frozen and FFPE specimens, but not in blood. Further studies are needed to clarify the role of LINE-1 methylation in specific subgroups, considering both cancer and sample type, and the methods of measurement.
