Similar Articles
20 similar articles found (search time: 31 ms)
1.

Objective

To evaluate the effect of an improved salt-restriction spoon on attitudes toward salt restriction, the rate of salt-restriction spoon use, actual salt intake, and 24-hour urinary sodium excretion (24HUNa).

Design

A community intervention study.

Setting

Two villages in Beijing.

Participants

403 local adult residents responsible for home cooking.

Intervention

Participants were randomly assigned to the intervention group or the control group. Those in the intervention group were provided with an improved salt-restriction spoon and health education, and were informed of their actual salt intake and 24HUNa. No intervention was given to the control group.

Main Outcome Measures

Scores on the Health Belief Model variables, the rate of salt-restriction spoon use, actual salt intake, and 24HUNa.

Analysis

Analyses of covariance, chi-square tests, Student’s t tests, and repeated-measures analyses of variance.

Results

After 6 months of intervention, the intervention group perceived significantly fewer objective barriers and reported significantly more cues to action than the control group. The rates of use, and of correct use, of the salt-restriction spoon were significantly higher in the intervention group. Daily salt intake decreased by 1.42 g in the intervention group and by 0.28 g in the control group; repeated-measures analysis of variance showed a significant change over time (F = 7.044, P<0.001) and a significant group-by-time difference (F = 2.589, P = 0.041). The 24HUNa decreased by 34.84 mmol in the intervention group and by 33.65 mmol in the control group; repeated-measures analysis of variance showed a significant change over time (F = 14.648, P<0.001) but no significant group-by-time difference (F = 0.222, P = 0.870).

Conclusions

The intervention was acceptably effective; the improved salt-restriction spoon and corresponding health education could therefore be considered an alternative salt-reduction strategy in China and in other countries where salt intake comes mainly from home cooking.
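The spoon-use rates above were compared with chi-square tests. As a minimal, illustrative sketch (the counts in the usage below are hypothetical, not from the study), the Pearson chi-square statistic for a 2×2 group-by-usage table can be computed directly:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]],
    e.g. rows = intervention/control, columns = uses spoon / does not."""
    n = a + b + c + d
    # shortcut form of sum((observed - expected)^2 / expected)
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
```

The statistic is compared against the chi-square distribution with 1 degree of freedom (critical value 3.84 at the 5% level); `chi_square_2x2(20, 10, 10, 20)` would therefore indicate a significant difference in usage rates.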

2.

Background

The relationship between passive smoking exposure (PSE) and breast cancer risk is of major interest.

Objective

To evaluate the relationship between PSE from partners and breast cancer risk, stratified by hormone-receptor (HR) status, in a Chinese urban female population.

Design

Hospital-based matched case control study.

Setting

Chinese urban breast cancer patients with no current or previous history of active smoking, seen at China Medical University 1st Hospital, Liaoning Province, China, between January 2009 and November 2009.

Patients

Each breast cancer patient was matched 1:1 by gender and age (±2 years) with a healthy control from the same hospital.

Measurements

The authors used unconditional logistic regression to estimate odds ratios for the association between PSE from partners and breast cancer risk.

Results

312 pairs were included in the study. Women who endured PSE had a significantly increased risk of breast cancer compared with unexposed women (adjusted OR: 1.46; 95% CI: 1.05–2.03; P = 0.027). Women exposed to >5 cigarettes/day also had a significantly increased risk (adjusted OR: 1.99; 95% CI: 1.28–3.10; P = 0.002), as did women exposed to passive smoke for 16–25 years (adjusted OR: 1.87; 95% CI: 1.22–2.86; P = 0.004) and those exposed to >4 pack-years (adjusted OR: 1.71; 95% CI: 1.17–2.50; P = 0.004). Similar trends were significant for the estrogen receptor (ER)/progesterone receptor (PR) double-positive subgroup (adjusted OR: 1.71, 2.20, 1.99, and 1.92, respectively), but not for the ER+/PR−, ER−/PR+, or ER−/PR− subgroups.

Limitations

Limitations include the hospital-based retrospective design, the lack of information on entire-lifetime PSE, and low statistical power.

Conclusions

Our findings provide further evidence that PSE from partners contributes to an increased risk of breast cancer, especially ER/PR double-positive breast cancer, in Chinese urban women.
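The adjusted odds ratios above come from logistic regression. For a single 2×2 exposure table, a crude odds ratio with a Woolf (log-scale) 95% confidence interval can be sketched as follows (the counts in the usage are hypothetical):

```python
import math

def odds_ratio_ci(exp_cases, exp_controls, unexp_cases, unexp_controls, z=1.96):
    """Crude odds ratio and Woolf 95% CI for a 2x2 exposure table."""
    or_ = (exp_cases * unexp_controls) / (exp_controls * unexp_cases)
    # standard error of log(OR)
    se = math.sqrt(1 / exp_cases + 1 / exp_controls
                   + 1 / unexp_cases + 1 / unexp_controls)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi
```

For example, `odds_ratio_ci(40, 20, 20, 40)` yields OR = 4.0 with a CI excluding 1, i.e. a significant association; adjusted ORs additionally control for covariates in the regression model.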

3.

Introduction

World Health Organization (WHO) radiological classification remains an important entry criterion in epidemiological studies of pneumonia in children. We report inter-observer variability in the interpretation of 169 chest radiographs in children suspected of having pneumonia.

Methods

An 18-month prospective aetiological study of pneumonia was undertaken in Northern England. Chest radiographs were performed on eligible children aged ≤16 years with clinical features of pneumonia. The initial radiology report was compared with a subsequent assessment by a consultant cardiothoracic radiologist. Chest radiographic changes were categorised according to the WHO classification.

Results

There was significant disagreement (22%) between the first and second reports (kappa = 0.70, P<0.001), notably in those aged <5 years (26%, kappa = 0.66, P<0.001). The most frequent sources of disagreement were the reporting of patchy and perihilar changes.

Conclusion

This substantial inter-observer variability highlights the need for an international expert consensus review of the radiological definition of pneumonia in children.
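Inter-observer agreement above is quantified with Cohen's kappa, which corrects raw agreement for the agreement expected by chance. A minimal sketch (the agreement tables in the usage are hypothetical):

```python
def cohens_kappa(table):
    """Cohen's kappa for a k x k agreement table:
    table[i][j] = number of radiographs rated category i by reader 1
    and category j by reader 2."""
    n = sum(sum(row) for row in table)
    k = len(table)
    p_obs = sum(table[i][i] for i in range(k)) / n  # observed agreement
    row_tot = [sum(row) for row in table]
    col_tot = [sum(table[i][j] for i in range(k)) for j in range(k)]
    p_exp = sum(row_tot[i] * col_tot[i] for i in range(k)) / (n * n)  # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)
```

A kappa of 0.70 with 22% raw disagreement, as in the study, is conventionally read as "substantial" agreement, which is why the authors still flag the residual variability.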

4.

Objective

To determine whether the patient-clinician relationship has a beneficial effect on either objective or validated subjective healthcare outcomes.

Design

Systematic review and meta-analysis.

Data Sources

Electronic databases EMBASE and MEDLINE and the reference sections of previous reviews.

Eligibility Criteria for Selecting Studies

Included studies were randomized controlled trials (RCTs) in adult patients in which the patient-clinician relationship was systematically manipulated and healthcare outcomes were either objective (e.g., blood pressure) or validated subjective measures (e.g., pain scores). Studies were excluded if the encounter was a routine physical, or a mental health or substance abuse visit; if the outcome was an intermediate outcome such as patient satisfaction or adherence to treatment; if the patient-clinician relationship was manipulated solely by intervening with patients; or if the duration of the clinical encounter was unequal across conditions.

Results

Thirteen RCTs met eligibility criteria. Observed effect sizes for the individual studies ranged from d = −.23 to .66. Using a random-effects model, the estimate of the overall effect size was small (d = .11), but statistically significant (p = .02).

Conclusions

This systematic review and meta-analysis of RCTs suggests that the patient-clinician relationship has a small, but statistically significant effect on healthcare outcomes. Given that relatively few RCTs met our eligibility criteria, and that the majority of these trials were not specifically designed to test the effect of the patient-clinician relationship on healthcare outcomes, we conclude with a call for more research on this important topic.
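The pooled effect size above (d = .11) was obtained with a random-effects model. A common estimator is DerSimonian-Laird, which inflates each study's variance by an estimate of between-study heterogeneity (tau-squared) before inverse-variance pooling. A minimal sketch (the effect sizes and variances in the usage are hypothetical):

```python
def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate via DerSimonian-Laird."""
    w = [1 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    # Cochran's Q measures excess dispersion beyond sampling error
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)  # between-study variance, floored at 0
    w_star = [1 / (v + tau2) for v in variances]
    return sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
```

When studies disagree more than sampling error alone would predict, tau-squared grows and the pooled estimate moves toward an unweighted average of the study effects.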

5.

Purpose

To investigate the vitreous and plasma levels of vascular endothelial growth factor (VEGF) in patients with proliferative diabetic retinopathy (PDR) and to determine whether they predict a disease prognosis after primary vitrectomy.

Methods

Fifty patients (50 eyes) with PDR who underwent pars plana vitrectomy (PPV) and 56 healthy controls (56 eyes) were enrolled in this retrospective study. Clinical data were collected and analyzed. Vitreous and plasma VEGF concentrations were measured using enzyme-linked immunosorbent assays. VEGF levels and clinical data were analyzed to determine whether they predicted PDR progression after primary vitrectomy over more than 6 months of follow-up. The correlation between vitreous and plasma VEGF concentrations was also analyzed.

Results

The average best-corrected visual acuity (BCVA) improved significantly after surgery (P<0.001). Vitreous and plasma VEGF levels were significantly higher in PDR patients than in healthy controls (P vitreous<0.001; P plasma<0.001). Both vitreous and plasma VEGF levels were significantly higher in the PDR progression group than in the stable group (P vitreous<0.001; P plasma = 0.004). Multivariate logistic regression showed that an increased vitreous VEGF level was associated with PDR progression after primary PPV (OR = 1.539; P = 0.036). Vitreous VEGF level was positively associated with plasma VEGF level in PDR patients (P<0.001).

Conclusion

An increased VEGF level in vitreous fluid may serve as a significant predictive factor for the outcome of vitrectomy in patients with PDR.

6.

Background

We sought to examine whether type 2 diabetes increases the risk of acute organ dysfunction and of hospital mortality following severe sepsis that requires admission to an intensive care unit (ICU).

Methods

Nationwide population-based retrospective cohort study of 16,497 subjects with severe sepsis who had been admitted for the first time to an ICU during 1998–2008. A diabetic cohort (n = 4,573) and a non-diabetic cohort (n = 11,924) were then created. Relative risk (RR) of organ dysfunctions, length of hospital stay (LOS), 90-day hospital mortality, ICU resource utilization, and hazard ratio (HR) of mortality adjusted for age, gender, Charlson-Deyo comorbidity index score, surgical condition, and number of acute organ dysfunctions were compared between patients with severe sepsis with and without diabetes.

Results

Diabetic patients with sepsis had a higher risk of developing acute kidney injury (RR, 1.54; 95% confidence interval (CI), 1.44–1.63) and were more likely to undergo hemodialysis in the ICU (15.55% vs. 7.24%). However, the diabetic cohort had a lower risk of developing acute respiratory dysfunction (RR = 0.96, 0.94–0.97), hematological dysfunction (RR = 0.70, 0.56–0.89), and hepatic dysfunction (RR = 0.77, 0.63–0.93). In terms of adjusted HR for 90-day hospital mortality, diabetic patients with severe sepsis did not fare significantly worse across cardiovascular, respiratory, hepatic, renal, and/or neurologic organ dysfunctions, or by number of organ dysfunctions. There was no statistically significant difference in LOS between the two cohorts (median 17 vs. 16 days, interquartile range (IQR) 8–30 days, p = 0.11). Multiple logistic regression analysis predicting mortality showed that diabetes was not a predictive factor (odds ratio 0.972, 95% CI 0.890–1.061, p = 0.5203).

Interpretation

This large nationwide population-based cohort study suggests that diabetic patients do not fare worse than non-diabetic patients when suffering from severe sepsis that requires ICU admission.
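The organ-dysfunction comparisons above are expressed as relative risks between the two cohorts. For a single dysfunction, a crude RR with a 95% CI on the log scale can be sketched as follows (the cohort sizes and case counts in the usage are hypothetical):

```python
import math

def relative_risk_ci(cases_exp, n_exp, cases_unexp, n_unexp, z=1.96):
    """Crude relative risk and 95% CI (Katz log method)."""
    rr = (cases_exp / n_exp) / (cases_unexp / n_unexp)
    # standard error of log(RR)
    se = math.sqrt(1 / cases_exp - 1 / n_exp + 1 / cases_unexp - 1 / n_unexp)
    return rr, math.exp(math.log(rr) - z * se), math.exp(math.log(rr) + z * se)
```

A CI that excludes 1 (such as the study's AKI estimate of 1.54, 1.44–1.63) indicates a significant difference in risk between the cohorts.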

7.

Purpose

To determine whether clinicians can distinguish patients with healthcare-associated infections (HCAI) from those with community-acquired infections (CAI), and to assess the impact of HCAI on the adequacy of initial antibiotic therapy and on hospital mortality.

Methods

One-year prospective cohort study including all consecutive infected patients admitted to a large university tertiary care hospital.

Results

A total of 1035 patients were included in this study. There were 718 patients admitted from the community: 225 (31%) with HCAI and 493 (69%) with CAI. Total microbiologic documentation rate of infection was 68% (n = 703): 56% in CAI, 73% in HCAI and 83% in hospital-acquired infections (HAI). Antibiotic therapy was inadequate in 27% of patients with HCAI vs. 14% of patients with CAI (p<0.001). Among patients with HCAI, 47% received antibiotic therapy in accordance with international recommendations for treatment of CAI. Antibiotic therapy was inadequate in 36% of patients with HCAI whose treatment followed international recommendations for CAI vs. 19% in the group of HCAI patients whose treatment did not follow these guidelines (p = 0.014). Variables independently associated with inadequate antibiotic therapy were: decreased functional capacity (adjusted OR = 2.24), HCAI (adjusted OR = 2.09) and HAI (adjusted OR = 2.24). Variables independently associated with higher hospital mortality were: age (adjusted OR = 1.05, per year), severe sepsis (adjusted OR = 1.92), septic shock (adjusted OR = 8.13) and inadequate antibiotic therapy (adjusted OR = 1.99).

Conclusions

HCAI was associated with an increased rate of inadequate antibiotic therapy but not with a significant increase in hospital mortality. Clinicians need to be aware of healthcare-associated infections among infected patients arriving from the community: the existing guidelines on antibiotic therapy do not apply to this group, which will otherwise receive inadequate antibiotic therapy, with a negative impact on hospital outcome.

8.

Background and Aim

Hyponatremia is common in patients with chronic kidney disease and is associated with increased mortality in hemodialysis patients. However, few studies have addressed this issue in peritoneal dialysis (PD) patients.

Methods

This prospective observational study included a total of 441 incident patients who started PD between January 2000 and December 2005. Using time-averaged serum sodium (TA-Na) levels, we aimed to investigate whether hyponatremia can predict mortality in these patients.

Results

Among the baseline parameters, serum sodium level was positively associated with serum albumin (β = 0.145; p = 0.003) and residual renal function (RRF) (β = 0.130; p = 0.018) and inversely associated with PD ultrafiltration (β = −0.114; p = 0.024) in a multivariable linear regression analysis. During a median follow-up of 34.8 months, 149 deaths were recorded. All-cause death occurred in 81 (55.9%) patients in the lowest tertile compared to 37 (25.0%) and 31 (20.9%) patients in the middle and highest tertiles, respectively. After adjusting for multiple potentially confounding covariates, increased TA-Na level was associated with a significantly decreased risk of all-cause (HR per 1 mEq/L increase, 0.79; 95% CI, 0.73–0.86; p<0.001) and infection-related (HR per 1 mEq/L increase, 0.77; 95% CI, 0.70–0.85; p<0.001) deaths.

Conclusions

This study showed that hyponatremia is an independent predictor of mortality in PD patients. Whether correcting hyponatremia improves patient survival remains unknown; future interventional studies should address this question.
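Survival comparisons such as the tertile analysis above typically start from Kaplan-Meier estimates, which handle censored follow-up. A minimal product-limit estimator in plain Python (the follow-up times and event indicators in the usage are hypothetical):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimates.

    times[i]  -- follow-up time for subject i
    events[i] -- 1 if the subject died at times[i], 0 if censored
    Returns a list of (time, survival probability) at each event time.
    """
    s = 1.0
    curve = []
    for t in sorted(set(times)):
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        n = sum(1 for ti in times if ti >= t)  # number still at risk at t
        if d > 0:
            s *= 1 - d / n  # multiply in the conditional survival at t
            curve.append((t, s))
    return curve
```

Censored subjects (event = 0) leave the risk set without forcing the curve down, which is why the estimator is preferred over naive death fractions when follow-up lengths differ.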

9.
10.

Background

There is limited evidence for the impacts of meteorological changes on asthma hospital admissions in adults in Shanghai, China.

Objectives

To quantitatively evaluate the short-term effects of daily mean temperature on asthma hospital admissions.

Methods

Daily hospital admissions for asthma and daily mean temperatures between January 2005 and December 2012 were analyzed. After controlling for secular and seasonal trends, weather, air pollution, and other confounding factors, a Poisson generalized additive model (GAM) combined with a distributed lag non-linear model was used to explore the associations between temperature and hospital admissions for asthma.

Results

During the study period, there were 15,678 hospital admissions for asthma by residents of Shanghai, an average of 5.6 per day. Pearson correlation analysis found a significant negative correlation between asthma hospitalizations and daily mean temperature (DMT) (r = −0.174, P<0.001). The DMT effect on asthma increased below the median DMT, with lower temperatures associated with a higher risk of hospital admission for asthma. Generally, the cold effect appeared relatively quickly and lasted several weeks, while the hot effect was short-term. The relative risk of asthma hospital admission associated with cold temperature (the 25th percentile of temperature relative to the median) was 1.20 (95% confidence interval [CI], 1.01–1.41) at lag 0–14 days. Warmer temperatures were not associated with asthma hospital admissions.

Conclusions

Cold temperatures may trigger asthmatic attacks. Effective strategies are needed to protect populations at risk from the effects of cold.
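The r = −0.174 above is a Pearson correlation between paired daily series of admissions and temperature. A minimal sketch (the toy series in the usage are illustrative, not study data):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

Note that a simple correlation ignores the lags and non-linearity that motivate the distributed lag non-linear model used in the study; it serves only as a first descriptive summary.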

11.

Rationale

Optimal management of complicated parapneumonic effusions (CPPE) remains controversial.

Objectives

To assess the safety and efficacy of iterative therapeutic thoracentesis (ITTC), the first-line treatment of CPPE at Rennes University Hospital.

Methods

Patients with CPPE were identified through our computerized database. We retrospectively studied all cases of CPPE initially managed with ITTC in our institution between 2001 and 2010. ITTC failure was defined by the need for additional treatment (i.e. surgery or percutaneous drainage), or death.

Results

Seventy-nine consecutive patients were included. The success rate was 81% (n = 64). Only 3 patients (4%) were referred for thoracic surgery. The one-year survival rate was 88%. On multivariate analysis, microorganisms observed in pleural fluid after Gram staining and a first thoracentesis volume ≥450 mL were associated with ITTC failure, with adjusted odds ratios of 7.65 [95% CI, 1.44–40.67] and 6.97 [95% CI, 1.86–26.07], respectively. The main complications of ITTC were iatrogenic pneumothorax (n = 5, 6%) and vasovagal reactions (n = 3, 4%). None of the pneumothoraces required chest tube drainage, and no hemothorax or re-expansion pulmonary edema was observed.

Conclusions

Although not indicated in international recommendations, ITTC is a safe, effective, and minimally invasive first-line treatment of CPPE.
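The 81% success rate above (64 of 79 patients) is reported without a confidence interval; for a binomial proportion of this size, the Wilson score interval is a robust choice. A minimal sketch:

```python
import math

def wilson_interval(successes, n, z=1.96):
    """Wilson score 95% CI for a binomial proportion."""
    p = successes / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half
```

Unlike the naive Wald interval, the Wilson interval stays inside [0, 1] and behaves sensibly for proportions near 0 or 1, such as the 4% surgery-referral rate.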

12.
13.

Aims

This study aimed to document and compare the nature of clinical pharmacists’ interventions made in different practice settings within a children’s hospital.

Methods

The primary investigator observed and documented all clinical interventions performed by clinical pharmacists over 35–37 days on each of the five study wards drawn from three practice settings: general medical, general surgical, and hematology-oncology. The rates, types, and significance of the pharmacists’ interventions in the different settings were compared.

Results

A total of 982 interventions were documented, relating to the 16,700 medication orders reviewed on the five wards over the duration of the study. Taking medication histories and/or patient counselling were the most common pharmacist interventions in the general settings, constituting more than half of all interventions. On the Hematology-Oncology Ward the pattern was different, with drug therapy changes being the most common interventions (n = 73/195, 37.4% of all interventions). Active interventions (pharmacist activities leading to a change in drug therapy) constituted less than a quarter of all interventions on the general medical and surgical wards, compared to nearly half on the specialty Hematology-Oncology Ward. The majority (n = 37/42, 88.1%) of a random sample of active interventions reviewed were rated as clinically significant. Dose adjustment was the most frequent active intervention in the general settings, whilst drug addition was the most common active intervention on the Hematology-Oncology Ward. The degree of acceptance of pharmacists’ active interventions by prescribers was high (n = 223/244, 91.4%).

Conclusions

The rate of pharmacists’ active interventions differed across practice settings, being highest in the specialty hematology-oncology setting. The nature and type of interventions documented in the hematology-oncology setting also differed from those in the general medical and surgical settings.

14.

Background

Partial mosquito-proofing of houses with screens and ceilings has the potential to reduce indoor densities of malaria mosquitoes. We sought to measure whether it would also reduce indoor densities of vectors of neglected tropical diseases.

Methodology

The main house entry points preferred by anopheline and culicine vectors were determined through controlled experiments using specially designed experimental huts and village houses in Lupiro village, southern Tanzania. The benefit of screening different entry points (eaves, windows and doors) using PVC-coated fibre glass netting material in terms of reduced indoor densities of mosquitoes was evaluated compared to the control.

Findings

23,027 mosquitoes were caught with CDC light traps; 77.9% (17,929) were Anopheles gambiae sensu lato, of which 66.2% were An. arabiensis and 33.8% An. gambiae sensu stricto. The remainder comprised 0.2% (50) An. funestus, 10.2% (2,359) Culex spp. and 11.6% (2,664) Mansonia spp. Screening eaves reduced densities of Anopheles gambiae s. l. (relative ratio (RR) = 0.91; 95% CI = 0.84, 0.98; P = 0.01), Mansonia africana (RR = 0.43; 95% CI = 0.26, 0.76; P<0.001) and Mansonia uniformis (RR = 0.37; 95% CI = 0.25, 0.56; P<0.001), but not of Culex quinquefasciatus, Cx. univittatus or Cx. theileri. Numbers of these species were also reduced by screening windows and doors, but not significantly.

Significance

This study confirms that across Africa, screening eaves protects households against important mosquito vectors of filariasis, Rift Valley Fever and O’nyong-nyong as well as malaria. While full house screening is required to exclude Culex species mosquitoes, screening of eaves alone or fitting ceilings has considerable potential for integrated control of other vectors of filariasis, arboviruses and malaria.

15.

Objectives

To (1) identify social and rehabilitation predictors of nursing home placement, (2) investigate the association between effectiveness and efficiency in rehabilitation and nursing home placement of patients admitted for inpatient rehabilitation from 1996 to 2005 by disease in Singapore.

Design

National data were retrospectively extracted from medical records of community hospitals.

Data Sources

There were 12,506 first admissions for rehabilitation in four community hospitals. Of these, 8,594 (90.3%) patients were discharged home and 924 (9.7%) were discharged to a nursing home. Other discharge destinations, such as sheltered homes (n = 37), other community hospitals (n = 31), death in a community hospital (n = 12), acute hospitals (n = 1,182) and discharge against doctor’s advice (n = 24), were excluded.

Outcome Measure

Nursing home placement.

Results

Those who were discharged to nursing homes had 33% lower median rehabilitation effectiveness and 29% lower median rehabilitation efficiency compared to those who were discharged home. Patients discharged to nursing homes were significantly older (mean age: 77 vs. 73 years), had lower mean Barthel Index scores (40 vs. 48), a longer median length of stay (40 vs. 33 days) and a longer time to rehabilitation (19 vs. 15 days), and had higher proportions without a caregiver (28 vs. 7%), being single (21 vs. 7%) and having dementia (23 vs. 10%). Compared with stroke patients, patients admitted for lower limb amputation or falls had 175% (p<0.001) and 65% (p = 0.043) higher odds, respectively, of being discharged to a nursing home.

Conclusions

In our study, the odds of nursing home placement were increased among Chinese patients; males; those single, widowed or separated/divorced; patients in high-subsidy wards for hospital care; patients with dementia; those without caregivers; those with lower functional scores at admission or lower rehabilitation effectiveness or efficiency at discharge; and primary diagnosis groups such as fractures, lower limb amputation and falls, in comparison to strokes.
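The abstract does not define rehabilitation effectiveness and efficiency, but in the rehabilitation literature they are commonly computed from admission and discharge functional scores: effectiveness as the fraction of the achievable gain actually achieved, and efficiency as gain per inpatient day. A sketch using those common definitions (assuming a 0–100 Barthel Index; the scores in the usage are hypothetical):

```python
def rehab_effectiveness(admission, discharge, max_score=100):
    """Fraction of the potential functional gain actually achieved
    (common definition; max_score assumes a 0-100 Barthel Index)."""
    return (discharge - admission) / (max_score - admission)

def rehab_efficiency(admission, discharge, los_days):
    """Functional gain per day of inpatient stay (common definition)."""
    return (discharge - admission) / los_days
```

Under these definitions, a patient admitted at 40, discharged at 70 after 30 days achieves effectiveness 0.5 (half the possible gain) and efficiency 1.0 point/day.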

16.

Background

With growing evidence on the role of inflammation in cancer biology, the presence of a systemic inflammatory response has been postulated to have prognostic significance in a wide range of cancer types. The derived neutrophil to lymphocyte ratio (dNLR), an easily determinable potential prognostic marker for daily practice and clinical trials, has never been externally validated in pancreatic cancer (PC) patients.

Methods

Data from 474 consecutive PC patients, treated between 2004 and 2012 at a single centre, were evaluated retrospectively. Cancer-specific survival (CSS) was assessed using the Kaplan-Meier method. To evaluate the prognostic relevance of dNLR, univariate and multivariate Cox regression models were applied.

Results

ROC analysis identified a dNLR cut-off of 2.3 as optimal for discriminating patient survival in the whole cohort. Kaplan-Meier analysis revealed dNLR ≥2.3 as a factor for decreased CSS in PC patients (p<0.001, log-rank test). In multivariate analysis, a high dNLR ≥2.3 showed an independent, significant association with poor clinical outcome (HR = 1.24, 95% CI = 1.01–1.51, p = 0.041).

Conclusion

In the present study we confirmed elevated pre-treatment dNLR as an independent prognostic factor for clinical outcome in PC patients. Our data encourage independent replication of this easily available parameter in other series and settings, as well as analyses stratified by tumor resectability.
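ROC-based cut-off selection, as used above for the dNLR, is often done by maximizing Youden's J (sensitivity + specificity − 1) over candidate thresholds; the abstract does not state the exact criterion, so this is one common approach. A minimal sketch (the marker values and outcome labels in the usage are hypothetical):

```python
def youden_cutoff(values, labels):
    """Threshold maximizing Youden's J = sensitivity + specificity - 1.
    labels[i] = 1 for the event (e.g. cancer-specific death), 0 otherwise;
    higher marker values are taken to predict the event."""
    pos = sum(labels)
    neg = len(labels) - pos
    best_t, best_j = None, -1.0
    for t in sorted(set(values)):
        sens = sum(1 for v, l in zip(values, labels) if v >= t and l == 1) / pos
        spec = sum(1 for v, l in zip(values, labels) if v < t and l == 0) / neg
        j = sens + spec - 1
        if j > best_j:
            best_j, best_t = j, t
    return best_t, best_j
```

Data-driven cut-offs like this tend to be optimistic on the cohort they were derived from, which is one reason the authors call for external replication.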

17.

Background

Toll-like receptor 4 (TLR4) has been related to inflammation and beta-amyloid deposition in the Alzheimer’s disease (AD) brain. No previous study has explored the association between haplotype-tagging single nucleotide polymorphisms (htSNPs) of TLR4 and AD risk, and ApoE e4 status alone has shown low sensitivity in identifying late-onset AD (LOAD) patients.

Methods

A total of 269 LOAD patients were recruited from three hospitals in northern Taiwan (2007–2010). Controls (n = 449) were recruited from elderly health checkups and hospital volunteers during the same period. Five common (frequency ≥5%) TLR4 htSNPs were selected to assess the association between TLR4 polymorphisms and the risk of LOAD in this Chinese ethnic population.

Results

Homozygosity of TLR4 rs1927907 was significantly associated with an increased risk of LOAD [TT vs. CC: adjusted odds ratio (AOR) = 2.45, 95% confidence interval (CI) = 1.30–4.64]. After stratification, the association increased further in ApoE e4 non-carriers (AOR = 3.07) and in hypertensive patients (AOR = 3.60). Haplotype GACGG was associated with a decreased risk of LOAD (1 vs. 0 copies: AOR = 0.59, 95% CI = 0.36–0.96; 2 vs. 0 copies: AOR = 0.31, 95% CI = 0.14–0.67) in ApoE e4 non-carriers. ApoE e4 status significantly modified this association (p interaction = 0.01). These associations remained significant after correction for multiple tests.

Conclusions

Sequence variants of TLR4 were associated with an increased risk of LOAD, especially in ApoE e4 non-carriers and in hypertensive patients. The combination of TLR4 rs1927907 and ApoE e4 significantly increased the screening sensitivity in identifying LOAD patients from 0.4 to 0.7.

18.

Objective

We analyzed the correlation between the pre-treatment apparent diffusion coefficient (ADC) and the clinical, histological, and immunohistochemical status of rectal cancers.

Materials and Methods

Forty-nine rectal cancer patients who underwent primary MRI with diffusion-weighted imaging (DWI) and received surgical resection without neoadjuvant therapy were selected. Tumor ADC values were determined and analyzed for correlations with pre-treatment CEA or CA19-9 levels and with the histological and immunohistochemical properties of the tumor.

Results

Inter-observer agreement between the two observers was suitable for ADC measurement (k = 0.775). The pre-treatment ADC values of different T stage tumors were not equal (p = 0.003); the overall trend was that higher T stages correlated with lower ADC values. ADC values were also significantly lower for tumors with extranodal tumor deposits (p = 0.006) and for tumors with CA19-9 levels ≥35 U/ml (p = 0.006). There were negative correlations between Ki-67 LI and the ADC value (r = −0.318, p = 0.026) and between the AgNOR count and the ADC value (r = −0.310, p = 0.030).

Conclusion

Significant correlations were found between pre-treatment ADC values and T stage, extranodal tumor deposits, CA19-9 levels, Ki-67 LI, and AgNOR counts. Lower ADC values were associated with more aggressive tumor behavior. The ADC value may therefore be a useful biomarker for assessing the biological features and status of rectal cancers.

19.

Aim

To assess the feasibility and safety of early oral feeding (EOF) after gastrectomy for gastric cancer through a systematic review and meta-analysis based on randomized controlled trials.

Methods

A literature search in PubMed, Embase, Web of Science and Cochrane library databases was performed for eligible studies published between January 1995 and March 2014. Systematic review was carried out to identify randomized controlled trials comparing EOF and traditional postoperative oral feeding after gastric cancer surgery. Meta-analyses were performed by either a fixed effects model or a random effects model according to the heterogeneity using RevMan 5.2 software.

Results

Six studies remained for the final analysis. The included studies were published between 2005 and 2013 and reported on a total of 454 patients. No significant differences were observed between the two groups in postoperative complications (RR = 0.95; 95%CI, 0.70 to 1.29; P = 0.75), tolerability of oral feeding (RR = 0.98; 95%CI, 0.91 to 1.06; P = 0.61), readmission rate (RR = 1; 95%CI, 0.30 to 3.31; P = 1.00) or incidence of anastomotic leakage (RR = 0.31; 95%CI, 0.01 to 7.30; P = 0.47). EOF after gastrectomy for gastric cancer was associated with a significantly shorter duration of hospital stay (WMD = −2.36; 95%CI, −3.37 to −1.34; P<0.0001) and time to first flatus (WMD = −19.94; 95%CI, −32.03 to −7.84; P = 0.001). There were no significant differences in postoperative complications, tolerability of oral feeding, readmission rates, duration of hospital stay or time to first flatus among subgroups stratified by the time EOF was started, by partial versus total gastrectomy, or by laparoscopic versus open surgery.

Conclusions

This meta-analysis showed that EOF after gastric cancer surgery appears feasible and safe, even when started on the day of surgery, irrespective of the extent of gastric resection and the type of surgery. However, more prospective, well-designed multicenter RCTs with more clinical outcomes are needed for further validation.
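The pooled weighted mean differences above (e.g. WMD = −2.36 days for hospital stay) come from inverse-variance pooling; in the fixed-effect case each study is simply weighted by 1/SE². A minimal sketch (the study-level differences and standard errors in the usage are hypothetical):

```python
def pooled_wmd(diffs, ses, z=1.96):
    """Fixed-effect inverse-variance pooled mean difference with 95% CI.
    diffs[i] -- mean difference from study i; ses[i] -- its standard error."""
    w = [1.0 / (se * se) for se in ses]
    pooled = sum(wi * d for wi, d in zip(w, diffs)) / sum(w)
    se_pooled = (1.0 / sum(w)) ** 0.5  # SE of the pooled estimate
    return pooled, pooled - z * se_pooled, pooled + z * se_pooled
```

A random-effects pooled estimate replaces each weight with 1/(SE² + tau²), where tau² captures between-study heterogeneity, as in the abstract's choice between the two models.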

20.

Objective

The purpose of this study was to establish an animal model of chronic pulmonary hypertension with a single-dose intraperitoneal injection of monocrotaline (MCT) in young Tibet minipigs, so as to enable both invasive and noninvasive measurements and hence facilitate future studies.

Methods

Twenty-four 8-week-old minipigs were randomized to receive a single-dose injection of 12.0 mg/kg MCT (MCT group, n = 12) or placebo (control group, n = 12). On day 42, all animals were evaluated for pulmonary hypertension by conventional transthoracic echocardiography and right heart catheterization (RHC), and examined for pathological changes. Findings were compared between the two groups.

Results

On echocardiography, the MCT group showed significantly higher pulmonary arterial mean pressure (PAMP) than the controls (P<0.001). The pulmonary valve curve showed v-shaped signals with a reduction of a-waves in MCT-treated minipigs. In addition, the MCT group had longer pulmonary artery pre-ejection phases and shorter acceleration and ejection times. RHC revealed a higher mean pulmonary arterial pressure (mPAP) in the MCT group than in the control group (P<0.01). A significant positive correlation between mPAP and PAMP values (R = 0.974, P<0.0001) and a negative correlation between mPAP and ejection time (R = 0.680, P<0.0001) were noted. Pathology demonstrated pulmonary vascular remodeling and a higher index of right ventricular hypertrophy in MCT-treated minipigs.

Conclusion

A chronic pulmonary hypertension model can be successfully established in young minipigs six weeks after MCT injection. These minipigs exhibited features of pulmonary arterial hypertension that can be evaluated by both invasive (RHC) and noninvasive (echocardiography) measurements, and may serve as an easy and stable tool for future studies on pulmonary hypertension.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号