Similar Literature (20 records retrieved)
1.

Background and Aims

Surveillance is an integral part of the colorectal cancer (CRC) screening process. We aimed to investigate inter-physician variation in follow-up procedures after screening colonoscopy in an opportunistic CRC screening program.

Methods

A historical cohort study in the German statutory health insurance system was conducted. 55,301 individuals who underwent screening colonoscopy in 2006 in Bavaria, Germany, and who were not diagnosed with CRC were included. Utilization of follow-up colonoscopies performed by the same physician (328 physicians overall) within 3 years was ascertained. Mixed effects logistic regression modelling was used to assess the effect of physicians and other potential predictors (screening result, age group, and sex) on re-utilization of colonoscopy. Physicians were grouped into quintiles according to individual effects estimated in a preliminary model. Predicted probabilities of follow-up colonoscopy by screening result and physician group were calculated.
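
As an illustration of the modelling approach described above, the sketch below fits a logistic regression with physician indicators in Python (statsmodels), using fixed physician effects as a simple stand-in for the paper's mixed-effects model; the file name and column names (followup, result, age_group, sex, physician) are assumptions, not the study's actual variables.

    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("screening_colonoscopies.csv")  # hypothetical analytic file

    # Follow-up colonoscopy within 3 years (0/1) modelled on screening result,
    # age group, sex and physician (fixed effects approximate the random effects).
    fit = smf.logit(
        "followup ~ C(result) + C(age_group) + sex + C(physician)", data=df
    ).fit(disp=False)

    # Predicted probability of follow-up by screening result for one physician,
    # holding age group and sex at chosen reference values.
    new = pd.DataFrame({
        "result": ["negative", "low_risk_adenoma", "high_risk_adenoma"],
        "age_group": "60-64",
        "sex": "female",
        "physician": df["physician"].iloc[0],
    })
    print(fit.predict(new))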

Results

The observed rate of follow-up colonoscopy was 6.2% (95% confidence interval: 5.9-6.4%), 18.6% (17.8-19.4%), and 37.0% (35.5-38.4%) after negative colonoscopy, low-risk adenoma and high-risk adenoma detection, respectively. All considered predictors were statistically significantly associated with follow-up colonoscopy. The predicted probabilities of follow-up colonoscopy ranged from 1.7% (1.4-2.0%) to 11.0% (10.2-11.7%), from 7.3% (6.2-8.5%) to 35.1% (32.6-37.7%), and from 17.9% (15.5-20.6%) to 56.9% (53.5-60.3%) in the 1st quintile (lowest rates of follow-up) and 5th quintile (highest rates of follow-up) of physicians after negative colonoscopy, low-risk adenoma and high-risk adenoma detection, respectively.

Conclusions

This study suggests substantial inter-physician variation in follow-up habits after screening colonoscopy. Interventions, including organizational changes in CRC screening, should be considered to reduce this variation.

2.
3.

Background

Supplementary observational data in the community setting are required to better assess the predictors of colorectal polyp recurrence and the effectiveness of colonoscopy surveillance under real circumstances.

Aim

The goal of this study was to identify patient characteristics and polyp features at baseline colonoscopy that are associated with the recurrence of colorectal polyps (including hyperplastic polyps) among patients consulting private practice physicians.

Patients and Methods

This cohort study was conducted from March 2004 to December 2010 in 26 private gastroenterology practices (France). It included 1023 patients with a first-time diagnosis of histologically confirmed polyp removed during a diagnostic or screening colonoscopy. At enrollment, interviews were conducted to obtain data on socio-demographic variables and risk factors. Pathology reports were reviewed to abstract data on polyp features at baseline colonoscopy. Colorectal polyps diagnosed at the surveillance colonoscopy were considered as end points. The time to event was analyzed with an accelerated failure time model assuming a Weibull distribution.
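
For readers who want to reproduce this kind of time-to-recurrence analysis, the sketch below fits a Weibull accelerated failure time model with the lifelines library; the file name, column names and covariate coding (numeric, with sex coded 0/1) are assumptions rather than the study's actual variables.

    import pandas as pd
    from lifelines import WeibullAFTFitter

    df = pd.read_csv("polyp_cohort.csv")  # hypothetical: one row per patient

    # Assumed columns: years_to_surveillance, recurrence (1 = polyp found at
    # surveillance colonoscopy), n_polyps_baseline, age, sex (0/1).
    cols = ["years_to_surveillance", "recurrence", "n_polyps_baseline", "age", "sex"]

    aft = WeibullAFTFitter()
    aft.fit(df[cols], duration_col="years_to_surveillance", event_col="recurrence")
    aft.print_summary()  # exp(coef) < 1 means the covariate shortens time to recurrence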

Results

Among the 1023 patients with colorectal polyp at baseline, 553 underwent a surveillance colonoscopy. The mean time interval from baseline colonoscopy to first surveillance examination was 3.42 (standard deviation, 1.45) years. The recurrence rates were 50.5% and 32.9% for all polyps and adenomas, respectively. In multivariate models, the number of polyps at baseline was the only significant predictor for both polyp recurrence (hazard ratio [HR] 1.19, 95% CI 1.06 to 1.33), and adenoma recurrence (HR 1.17, 95% CI 1.03 to 1.34).

Conclusion

The efficacy of surveillance colonoscopy in community gastroenterology practice compared favorably with that in academic settings. This study provides further evidence that the number of initial colorectal polyps is useful for predicting the risk of polyp recurrence, even in the community setting.

4.

Purpose

Epidermal growth factor receptor (EGFR) inhibitors are approved for treating metastatic colorectal cancer (CRC); KRAS mutation testing is recommended prior to treatment. We conducted a non-inferiority analysis to examine whether KRAS testing has impacted survival in CRC patients.

Patients and Methods

We included 1186 metastatic CRC cases from seven health plans. A cutpoint of July, 2008, was used to define two KRAS testing time period groups: “pre-testing” (n = 760 cases) and “post-testing” (n = 426 cases). Overall survival (OS) was estimated, and the difference in median OS between the groups was calculated. The lower bound of the one-sided 95% confidence interval (CI) for the difference in survival was used to test the null hypothesis of post-testing inferiority. Multivariable Cox regression models were constructed to adjust for covariates.
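
One way to carry out the non-inferiority comparison described above is sketched below: Kaplan-Meier median OS in each period and a bootstrap one-sided 95% lower bound for the difference, compared against the -5 month margin. File and column names are assumptions, and the bootstrap is only one of several ways to obtain the bound.

    import numpy as np
    import pandas as pd
    from lifelines import KaplanMeierFitter

    df = pd.read_csv("crc_metastatic_cohort.csv")  # hypothetical: os_months, death, period

    def median_os(sub):
        km = KaplanMeierFitter().fit(sub["os_months"], sub["death"])
        return km.median_survival_time_

    diffs = []
    for _ in range(2000):  # bootstrap the post-minus-pre difference in median OS
        boot = df.groupby("period", group_keys=False).sample(frac=1.0, replace=True)
        diffs.append(median_os(boot[boot["period"] == "post"]) -
                     median_os(boot[boot["period"] == "pre"]))

    lower_bound = np.percentile(diffs, 5)  # one-sided 95% lower confidence bound
    margin = -5.0                          # non-inferiority margin in months
    print("non-inferior" if lower_bound > margin else "cannot reject inferiority")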

Results

The median unadjusted OS was 15.4 months (95% CI: 14.0–17.5) and 12.8 months (95% CI: 10.0–15.2) in the pre- and post-testing groups, respectively. The OS difference was −2.6 months with one-sided 95% lower confidence bound of −5.13 months, which was less than the non-inferiority margin (−5.0 months, unadjusted p = 0.06), leading to a failure to reject inferiority of OS in the post-testing period. In contrast, in the adjusted analysis, OS non-inferiority was identified in the post-testing period (p = 0.001). Sensitivity analyses using cutpoints before and after July, 2008, also met the criteria for non-inferiority.

Conclusion

Implementation of KRAS testing did not influence CRC OS. Our data support the use of KRAS testing to guide administration of EGFR inhibitors for treatment of metastatic CRC without diminished OS.

5.

Purpose

Very few studies have examined regret over choosing colorectal cancer (CRC) screening tests. We evaluated the determinants of regret and tested the hypothesis that regret over screening choices was associated with poorer screening compliance.

Methods

A bowel cancer screening centre invited all Hong Kong citizens aged 50-70 years without symptoms of CRC to participate in free-of-charge screening programmes. On attendance, participants received health seminars on CRC and its screening and were offered a choice between a yearly faecal immunochemical test (FIT) for up to four years and a single direct colonoscopy; they were not allowed to switch screening options after the decision. A self-administered, four-item validated survey assessed whether they regretted their choice (score > 2 = regretful, on a scale from 0 [no regret] to 5 [extreme regret]). A binary logistic regression model evaluated whether initial regret over the choice was associated with poorer programme compliance.

Results

Of the 4,341 screening participants who chose FIT or colonoscopy, 120 (2.8%) regretted their decision and 1,029 (23.7%) were non-compliant with the screening programme. Younger age and feeling pressure when making the decision were associated with regret. People who regretted their decision were 2.189 times (95% C.I. 1.361-3.521, p = 0.001) more likely to be non-compliant with the programme.

Conclusions

This study is the first to show that regret over the initial CRC screening choice was associated with later non-compliance. Screening participants who expressed regret over their choice should receive additional reminders to improve their programmatic compliance.

6.

Background

Medication nonadherence costs $300 billion annually in the US. Medicare Advantage plans have a financial incentive to increase medication adherence among members because the Centers for Medicare and Medicaid Services (CMS) now awards substantive bonus payments to such plans, based in part on population adherence to chronic medications. We sought to build an individualized surveillance model that detects early which beneficiaries will fall below the CMS adherence threshold.

Methods

This was a retrospective study of over 210,000 beneficiaries initiating statins in a database of private insurance claims from 2008-2011. A logistic regression model was constructed to use statin adherence from initiation to day 90 to predict beneficiaries who would not meet the CMS measure of a proportion of days covered of 0.8 or above from day 91 to 365. The model controlled for 15 additional characteristics. In a sensitivity analysis, we varied the number of days of adherence data used for prediction.
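
A minimal sketch of this kind of early-warning model is shown below, assuming an analytic file with the first-90-day proportion of days covered (PDC) and a few member characteristics; all file and column names are illustrative, not the study's actual covariates.

    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    df = pd.read_csv("statin_initiators.csv")  # hypothetical analytic file

    X = df[["pdc_day_1_90", "age", "n_chronic_meds", "copay_per_fill"]]
    y = (df["pdc_day_91_365"] < 0.8).astype(int)  # 1 = below the CMS adherence threshold

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"AUC = {auc:.2f}")  # the published model reported an AUC of 0.80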

Results

Lower adherence in the first 90 days was the strongest predictor of one-year nonadherence, with an odds ratio of 25.0 (95% confidence interval 23.7-26.5) for poor adherence at one year. The model had an area under the receiver operating characteristic curve of 0.80. Sensitivity analysis revealed that predictions of comparable accuracy could be made only 40 days after statin initiation. When members with 30-day supplies for their first statin fill had predictions made at 40 days, and members with 90-day supplies for their first fill had predictions made at 100 days, poor adherence could be predicted with 86% positive predictive value.

Conclusions

To preserve their Medicare Star ratings, plan managers should identify or develop effective programs to improve adherence. An individualized surveillance approach can be used to target members who would most benefit, recognizing the tradeoff between improved model performance over time and the advantage of earlier detection.

7.

Introduction

K-ras gene mutations are common in colorectal cancer patients, but their relationship with prognosis is unclear.

Objective

To verify prognostic differences between patients with and without mutant K-ras genes by reviewing the published evidence.

Method

Systematic reviews and databases were searched for cohort and case-control studies comparing the prognosis of colorectal cancer patients with detected K-ras mutations versus those without mutant K-ras genes, all of whom received chemotherapy. The number of patients, chemotherapy regimens, and short-term or long-term survival rates (disease-free or overall) were extracted. Study quality was also evaluated.

Principal Findings

Seven studies with a control-group comparison were identified. No association was found between K-ras gene status and either short-term disease-free survival (OR = 1.01, 95% CI 0.73-1.38, P = 0.97) or overall survival (OR = 1.06, 95% CI 0.82-1.36, P = 0.66) in CRC patients who received chemotherapy. Comparison of long-term survival between the two groups also indicated no significant difference after heterogeneity was eliminated (OR = 1.09, 95% CI 0.85-1.40, P = 0.49).

Conclusions

K-ras gene mutations may not be a prognostic index for colorectal cancer patients who received chemotherapy.

8.

Background

Linkage of risk-factor data for blood-stream infection (BSI) in paediatric intensive care (PICU) with bacteraemia surveillance data to monitor risk-adjusted infection rates in PICU is complicated by a lack of unique identifiers and under-ascertainment in the national surveillance system. We linked, evaluated and performed preliminary analyses on these data to provide a practical guide on the steps required to handle linkage of such complex data sources.

Methods

Data on PICU admissions in England and Wales for 2003-2010 were extracted from the Paediatric Intensive Care Audit Network. Records of all positive isolates from blood cultures taken for children <16 years and captured by the national voluntary laboratory surveillance system for 2003-2010 were extracted from the Public Health England database, LabBase2. “Gold-standard” datasets with unique identifiers were obtained directly from three laboratories, containing microbiology reports that were eligible for submission to LabBase2 (defined as “clinically significant” by laboratory microbiologists). Reports in the gold-standard datasets were compared to those in LabBase2 to estimate ascertainment in LabBase2. Linkage was evaluated by comparing results from two classification methods (highest-weight classification of match weights and prior-informed imputation using match probabilities) with linked records in the gold-standard data. BSI rate was estimated as the proportion of admissions associated with at least one BSI.
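
The match-weight step used in probabilistic linkage can be illustrated with a toy Fellegi-Sunter calculation: each comparison field contributes log2(m/u) when it agrees and log2((1-m)/(1-u)) when it disagrees, and candidate pairs are classified on the summed weight. The fields and m/u probabilities below are illustrative, not those estimated in the study.

    import math

    # field: (m = P(agree | records truly match), u = P(agree | records do not match))
    fields = {
        "date_of_birth": (0.98, 0.01),
        "sex":           (0.99, 0.50),
        "hospital":      (0.95, 0.10),
        "specimen_date": (0.90, 0.02),
    }

    def match_weight(agreements):
        """Summed log2 likelihood ratio for one candidate record pair."""
        total = 0.0
        for field, (m, u) in fields.items():
            if agreements[field]:
                total += math.log2(m / u)
            else:
                total += math.log2((1 - m) / (1 - u))
        return total

    pair = {"date_of_birth": True, "sex": True, "hospital": True, "specimen_date": False}
    print(round(match_weight(pair), 2))  # link if above an agreed threshold weight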

Results

Reporting gaps were identified in 548/2596 lab-months of LabBase2. Ascertainment of clinically significant BSI in the remaining months was approximately 80-95%. Prior-informed imputation provided the least biased estimate of BSI rate (5.8% of admissions). Adjusting for ascertainment, the estimated BSI rate was 6.1-7.3%.

Conclusion

Linkage of PICU admission data with national BSI surveillance provides the opportunity for enhanced surveillance but analyses based on these data need to take account of biases due to ascertainment and linkage error. This study provides a generalisable guide for linkage, evaluation and analysis of complex electronic healthcare data.

9.

Background

Non-adherence is one of the strongest predictors of therapeutic failure in HIV-positive patients. Virologic failure with subsequent emergence of resistance reduces future treatment options and long-term clinical success.

Methods

Prospective observational cohort study including patients starting a new class of antiretroviral therapy (ART) between 2003 and 2010. Participants were naïve to the ART class and completed ≥1 adherence questionnaire prior to resistance testing. Outcomes were development of any IAS-USA, class-specific, or M184V mutations. Associations between adherence and resistance were estimated using logistic regression models stratified by ART class.

Results

Of 314 included individuals, 162 started an NNRTI and 152 a PI/r regimen. Adherence was similar between groups, with 85% reporting adherence ≥95%. The number of new mutations increased with increasing non-adherence. In the NNRTI group, multivariable models indicated a significant linear association in the odds of developing IAS-USA (odds ratio (OR) 1.66, 95% confidence interval (CI): 1.04-2.67) or class-specific (OR 1.65, 95% CI: 1.00-2.70) mutations. Levels of drug resistance were considerably lower in the PI/r group, and adherence was only significantly associated with M184V mutations (OR 8.38, 95% CI: 1.26-55.70). Adherence was significantly associated with HIV RNA in PI/r but not NNRTI regimens.

Conclusion

Therapies containing PI/r appear more forgiving to incomplete adherence compared with NNRTI regimens, which allow higher levels of resistance, even with adherence above 95%. However, in failing PI/r regimens good adherence may prevent accumulation of further resistance mutations and therefore help to preserve future drug options. In contrast, adherence levels have little impact on NNRTI treatments once the first mutations have emerged.

10.

Introduction

Health authorities find thresholds useful to gauge the start and severity of influenza seasons. We explored a method for deriving thresholds proposed in an influenza surveillance manual published by the World Health Organization (WHO).

Methods

For 2002-2011, we analysed two routine influenza-like-illness (ILI) datasets, general practice sentinel surveillance and a locum medical service sentinel surveillance, plus laboratory data and hospital admissions for influenza. For each sentinel dataset, we created two composite variables from the product of weekly ILI data and the relevant laboratory data, indicating the proportion of tested specimens that were positive. For all datasets, including the composite datasets, we aligned data on the median week of peak influenza or ILI activity and assigned three threshold levels: seasonal threshold, determined by inspection; and two intensity thresholds termed average and alert thresholds, determined by calculations of means, medians, confidence intervals (CI) and percentiles. From the thresholds, we compared the seasonal onset, end and intensity across all datasets from 2002-2011. Correlation between datasets was assessed using the mean correlation coefficient.
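
The threshold calculation can be sketched as follows for one dataset: take the seasonal peak value for each historical year, then use the mean as the average threshold and either a 90% upper confidence limit or the 95th percentile as the alert threshold. The file layout (one row per season of peak-aligned weekly values) is an assumption.

    import numpy as np
    import pandas as pd

    aligned = pd.read_csv("ili_seasons_peak_aligned.csv", index_col=0)  # hypothetical
    peaks = aligned.max(axis=1)  # one peak value per season, 2002-2011

    mean_peak = peaks.mean()
    sem = peaks.std(ddof=1) / np.sqrt(len(peaks))

    average_threshold = mean_peak                    # typical seasonal intensity
    alert_threshold_ci = mean_peak + 1.645 * sem     # 90% upper confidence limit
    alert_threshold_p95 = np.percentile(peaks, 95)   # 95th percentile alternative

    print(average_threshold, alert_threshold_ci, alert_threshold_p95)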

Results

The median week of peak activity was week 34 for all datasets, except hospital data (week 35). Means and medians were comparable and the 90% upper CIs were similar to the 95th percentiles. Comparison of thresholds revealed variations in defining the start of a season but good agreement in describing the end and intensity of influenza seasons, except in hospital admissions data after the pandemic year of 2009. The composite variables improved the agreements between the ILI and other datasets. Datasets were well correlated, with mean correlation coefficients of >0.75 for a range of combinations.

Conclusions

Thresholds for influenza surveillance are easily derived from historical surveillance and laboratory data using the approach proposed by WHO. Use of composite variables is helpful for describing influenza season characteristics.

11.

Introduction

Colonoscopy can prevent deaths due to colorectal cancer (CRC) through early diagnosis or resection of colonic adenomas. We conducted a prospective, nationwide study on colonoscopy practice in France.

Methods

An online questionnaire was administered to 2,600 French gastroenterologists. Data from all consecutive colonoscopies performed during one week were collected. A statistical extrapolation of the results to a whole year was performed, and factors potentially associated with the adenoma detection rate (ADR) or the diagnosis of polyps or cancer were assessed.

Results

A total of 342 gastroenterologists, representative of the overall population of French gastroenterologists, provided data on 3,266 colonoscopies, corresponding to 1,200,529 (95% CI: 1,125,936-1,275,122) procedures for the year 2011. The indication for colonoscopy was CRC screening and digestive symptoms in 49.6% and 38.9% of cases, respectively. Polypectomy was performed in 35.5% of cases. The ADR and prevalence of CRC were 17.7% and 2.9%, respectively. The main factors associated with a high ADR were male gender (p=0.0001), age over 50 (p=0.0001), personal or family history of CRC or colorectal polyps (p<0.0001 and p<0.0001, respectively), and positive fecal occult blood test (p=0.0005). The prevalence of CRC was three times higher in patients with their first colonoscopy (4.2% vs. 1.4%; p<0.0001).

Conclusions

For the first time in France, we report nationwide prospective data on colonoscopy practice, including histological results. We found an average ADR of 17.7% and observed a lower prevalence of CRC in patients with a previous colonoscopy.

12.

Background

Recent studies showed that previous negative results from faecal immunochemical tests (FITs) for colorectal cancer (CRC) screening were associated with a lower risk of advanced neoplasia (AN). Using data from 2008–2012, we evaluated whether prior FIT results should be included when estimating the risk of AN.

Methods

A community-based screening practice recruited 5,813 asymptomatic residents aged 50 to 70 years in Hong Kong for CRC screening. We included study participants who had (1) a positive FIT with subsequent colonoscopy workup (FIT+ group; n = 356); (2) negative FITs in three consecutive years followed by a colonoscopy (FIT- group; n = 857); (3) a colonoscopy without FIT (colonoscopy group; n = 473); or (4) both colonoscopy and FIT at the same time (combined group; n = 4,127). A binary logistic regression model evaluated whether prior FIT results were associated with colonoscopy findings of AN.

Results

The proportion of participants with AN/CRC was 18.0% (FIT+), 5.5% (FIT-), 8.0% (colonoscopy group), and 4.3% (combined group). Compared with the colonoscopy group, those in the FIT- group were not significantly more or less likely to have AN/CRC (AOR = 0.77, 95% C.I. = 0.51 to 1.18, p = 0.230). Having one (AOR = 0.73, 95% C.I. 0.48–1.12, p = 0.151) or three consecutive (AOR = 0.98, 95% C.I. 0.60–1.62, p = 0.944) negative FIT results was not associated with a lower risk of AN/CRC. Subjects in the FIT+ group were 3.32-fold (95% C.I. 2.07 to 5.32, p<0.001) more likely to have AN/CRC.

Conclusions

These findings indicate that subjects with negative FIT findings could be risk-stratified similarly to those who had not previously received FIT.

13.

Context

The stress response induced by surgery is proposed to play an important role in the pathogenesis of postoperative cognitive dysfunction.

Objective

To investigate the association between postoperative serum cortisol level and occurrence of cognitive dysfunction early after coronary artery bypass graft surgery.

Design

Prospective cohort study.

Setting

Two teaching hospitals.

Patients

One hundred and sixty-six adult patients referred for elective coronary artery bypass graft surgery from March 2008 to December 2009.

Intervention

None.

Main Outcome Measures

Neuropsychological tests were completed one day before and seven days after surgery. Cognitive dysfunction was defined using the same definition as in the ISPOCD1 study. Blood samples were obtained on the first postoperative morning for measurement of serum cortisol concentration. Multivariable logistic regression analyses were performed to assess the relationship between serum cortisol level and the occurrence of postoperative cognitive dysfunction.

Results

Cognitive dysfunction occurred in 39.8% (66 of 166) of patients seven days after surgery. Multivariable logistic regression analysis showed that a high serum cortisol level was significantly associated with the occurrence of postoperative cognitive dysfunction (odds ratio [OR] 2.603, 95% confidence interval [CI] 1.371-4.944, P = 0.003). Other independent predictors of early postoperative cognitive dysfunction included high preoperative New York Heart Association functional class (OR 0.402, 95% CI 0.207-0.782, P = 0.007), poor preoperative Grooved Pegboard test score of the nondominant hand (OR 1.022, 95% CI 1.003-1.040, P = 0.020), use of penehyclidine as premedication (OR 2.565, 95% CI 1.109-5.933, P = 0.028), and occurrence of complications within seven days after surgery (OR 2.677, 95% CI 1.201-5.963, P = 0.016).

Conclusions

A high serum cortisol level on the first postoperative morning was associated with an increased risk of cognitive dysfunction seven days after coronary artery bypass graft surgery.

14.

Background

Peritoneal carcinomatosis (PC) is a difficult clinical challenge in colorectal cancer (CRC) because conventional treatment modalities have not produced a significant survival benefit, which highlights the acute need for new treatment strategies. Our previous case-control study demonstrated the potential survival advantage of cytoreductive surgery (CRS) plus hyperthermic intraperitoneal chemotherapy (HIPEC) over CRS alone. This phase II study aimed to further investigate the efficacy and adverse events of CRS+HIPEC for Chinese patients with CRC PC.

Methods

A total of 60 consecutive CRC PC patients underwent 63 procedures consisting of CRS+HIPEC and postoperative chemotherapy, all by a designated team focusing on this combined treatment modality. All the clinico-pathological information was systematically integrated into a prospective database. The primary end point was disease-specific overall survival (OS), and the secondary end points were perioperative safety profiles.

Results

At the most recent database update, the median follow-up was 29.9 (range 3.5–108.9) months. A peritoneal cancer index (PCI) ≤20 was present in 47.0% of patients, and complete cytoreductive surgery (CC0-1) was performed in 53.0% of patients. The median OS was 16.0 (95% confidence interval [CI] 12.2–19.8) months, and the 1-, 2-, 3-, and 5-year survival rates were 70.5%, 34.2%, 22.0% and 22.0%, respectively. Mortality and grade 3 to 5 morbidity rates within 30 postoperative days were 0.0% and 30.2%, respectively. Univariate analysis identified 3 parameters with significant effects on OS: PCI ≤20, CC0-1 and adjuvant chemotherapy over 6 cycles. On multivariate analysis, however, only CC0-1 and adjuvant chemotherapy ≥6 cycles were found to be independent factors for OS benefit.

Discussion

CRS+HIPEC at a specialized treatment center could improve OS for selected CRC PC patients from China, with acceptable perioperative safety.

15.

Purpose

To examine the associations between area-level socioeconomic attributes and stage of esophageal adenocarcinoma diagnoses in 16 SEER cancer registries during 2000-2007.

Methods

Odds ratios (OR) and 95% confidence intervals (CI) were calculated using multivariable logistic regression models to assess the relationship between distant-stage esophageal adenocarcinoma and individual, census tract, and county-level attributes.

Results

Among cases with data on birthplace, no significant association was seen between reported birth within versus outside the United States and distant-stage cancer (adjusted OR=1.02, 95% CI: 0.85-1.22). Living in an area with a higher percentage of residents born outside the United States than the national average was associated with distant-stage esophageal adenocarcinoma (census tract level >11.8%: OR=1.10, 95% CI: 1.01-1.19; county level >11.8%: OR=1.14, 95% CI: 1.05-1.24). No association was observed between median household income and distant-stage cancer at either the census tract or county level.

Conclusion

The finding of greater odds of distant-stage esophageal adenocarcinoma among cases residing in SEER areas with a higher proportion of residents born outside the United States suggests local areas where esophageal cancer control efforts might be focused. Missing data at the individual level were a limitation of the present study. Furthermore, the inconsistent associations with foreign birth at the individual versus area level caution against using area-level attributes as proxies for case attributes.

16.

Background

There is little information about influenza among the Pakistani population. In order to assess the trends of Influenza-like-Illness (ILI) and to monitor the predominant circulating strains of influenza viruses, a country-wide lab-based surveillance system for ILI and Severe Acute Respiratory Illness (SARI) with weekly sampling and reporting was established in 2008. This system was necessary for early detection of emerging novel influenza subtypes and timely response for influenza prevention and control.

Methods

Five sentinel sites at tertiary care hospitals across Pakistan collected epidemiological data and respiratory samples from Influenza-like illness (ILI) and severe acute respiratory illness (SARI) cases from January 2008 to December 2011. Samples were typed and sub-typed by Real-Time RT-PCR assay.

Results

A total of 6258 specimens were analyzed; influenza virus was detected in 1489 (24%) samples, including 1066 (72%) influenza type A and 423 (28%) influenza type B viruses. Amongst influenza A viruses, 25 (2%) were seasonal A/H1N1, 169 (16%) were A/H3N2 and 872 (82%) were A(H1N1)pdm09. Influenza B virus circulation was detected throughout the year, along with a few cases of seasonal A/H1N1 virus during late winter and spring. Influenza A/H3N2 virus circulation was mainly observed during summer months (August-October).

Conclusions

The findings of this study emphasize the need for continuous and comprehensive influenza surveillance. Prospective data from multiple years are needed to predict seasonal trends for vaccine development and to further fortify pandemic preparedness.

17.

Objective

To identify the sociodemographic, lifestyle and foot-examination-related predictors of diabetic foot and leg ulcers, with a view to developing a screening tool appropriate for use in an outpatient setting.

Research design and methods

This cross-sectional study included type 2 diabetes mellitus (DM) patients: 88 subjects with leg and foot ulcers and 80 non-ulcer controls. Sociodemographic data and lifestyle factors were documented. Feet were examined for skin changes and structural abnormalities. Distal peripheral neuropathy was assessed by pressure sense, vibration sense and joint position sense. Multivariable logistic regression analysis was used to determine the significant predictors for screening for foot ulcers.

Results

Education of grade 6 or below (OR = 1.41, 95% CI 1.03-4.68), low income (OR = 23.3, 95% CI 1.5-34.0), impaired vibration sense (OR = 24.79, 95% CI 9.3-66.2), and an abnormal monofilament test on the first (OR = 1.69, 95% CI 1.36-16.6), third (OR = 3.4, 95% CI 1.1-10.6) and fifth (OR = 1.8, 95% CI 1.61-12.6) toes were found to be predictors of increased risk, whereas incidental diagnosis of DM (OR = 0.03, 95% CI 0.003-0.28), wearing covered shoes (OR = 0.003, 95% CI 0.00-0.28), normal skin color (OR = 0.01, 95% CI 0.001-0.14) and a normal monofilament test on the first metatarsal head (OR = 0.10, 95% CI 0.00-0.67) were protective factors for ulcers.

Conclusions

Ten independent risk and protective factors identified in this study are proposed as a simple screening tool to predict the risk of developing leg and foot ulcers in patients with DM.

18.

Introduction

Colorectal cancer (CRC) is one of the world’s three most common cancers, and its incidence is rising. Identifying patients who benefit from adjuvant therapy requires novel biomarkers. The regenerating islet-derived gene (REG) 4 product belongs to a group of small secretory proteins involved in cell proliferation and regeneration. Its up-regulated expression occurs in inflammatory bowel diseases as well as in gastrointestinal cancers. Reports on the association of REG4 expression with CRC prognosis have been mixed. Our aim was to investigate tumor REG4 expression in CRC patients and its coexpression with other intestinal markers.

Methods

Tumor expression of REG4 was evaluated by immunohistochemistry in 840 consecutive surgically treated CRC patients at Helsinki University Central Hospital. Expression of MUC1, MUC2, MUC5AC, synaptophysin, and chromogranin was evaluated in a subgroup of 220 consecutively operated CRC patients. Associations of REG4 expression with clinicopathological parameters and other intestinal markers, and the impact of REG4 expression on survival, were assessed.

Results

REG4 expression was associated with favorable clinicopathological parameters and with higher overall survival in non-mucinous CRC (p = 0.019). For such patients under 65, its expression was an independent marker of a lower risk of death from that cancer within 5 years: univariable hazard ratio (HR) = 0.57, 95% confidence interval (CI) 0.34–0.94; multivariable HR = 0.55, 95% CI 0.33–0.92. In non-mucinous CRC, REG4 expression was associated with positive MUC2, MUC4, and MUC5AC expression.

Conclusion

We show, to our knowledge for the first time, that REG4 IHC expression is an independent marker of favorable prognosis in non-mucinous CRC. Our results contradict those from studies based on quantification of REG4 mRNA levels, a discrepancy warranting further studies.

19.

Background

Methionine is one of the key components of one-carbon metabolism. Experimental studies indicate that methionine may reduce inflammation-induced colon cancer. However, epidemiologic findings as to whether dietary methionine intake influences colorectal cancer incidence in humans are inconsistent.

Objective

To investigate the relationship between dietary methionine intake and risk of colorectal cancer by performing a meta-analysis of prospective studies.

Methods

Eligible studies were identified by searching PubMed and Embase and by reviewing the bibliographies of the retrieved publications. The summary risk estimates were computed using both a random-effects and a fixed-effects model.
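
The pooling step can be illustrated with a short inverse-variance calculation, including a DerSimonian-Laird random-effects variant; the relative risks and confidence intervals below are placeholders, not the eight studies analysed here.

    import numpy as np

    rr      = np.array([0.85, 0.95, 0.70, 1.10, 0.80])   # illustrative study RRs
    ci_low  = np.array([0.70, 0.80, 0.50, 0.85, 0.60])
    ci_high = np.array([1.03, 1.13, 0.98, 1.42, 1.07])

    y = np.log(rr)                                        # log relative risks
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)  # SE recovered from the 95% CI
    w = 1.0 / se**2                                       # inverse-variance weights

    fixed = np.exp(np.sum(w * y) / np.sum(w))             # fixed-effects pooled RR

    # DerSimonian-Laird estimate of between-study variance tau^2
    q = np.sum(w * (y - np.sum(w * y) / np.sum(w)) ** 2)
    tau2 = max(0.0, (q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1.0 / (se**2 + tau2)
    random_effects = np.exp(np.sum(w_re * y) / np.sum(w_re))

    print(f"fixed-effects RR = {fixed:.2f}, random-effects RR = {random_effects:.2f}")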

Results

Eight eligible prospective cohort studies involving 431,029 participants and 6,331 colorectal cancer cases were identified. According to the random-effects model, the summary relative risks (RRs) for the highest compared with the lowest intake of methionine were 0.89 (95% confidence interval [CI] = 0.77-1.03) for colorectal cancer, 0.77 (95% CI = 0.64-0.92) for colon cancer, and 0.88 (95% CI = 0.55-1.42) for rectal cancer. In the stratified analysis, a significant inverse association between dietary methionine intake and risk of colorectal cancer was observed in studies with longer follow-up time (RR = 0.81, 95% CI = 0.70-0.95), in Western studies (RR = 0.83, 95% CI = 0.73-0.95) and in men (RR = 0.75, 95% CI = 0.57-0.99). We found no indication of publication bias.

Conclusion

This meta-analysis indicates that dietary methionine intake may be associated with decreased risk of colorectal cancer, especially colon cancer. More prospective studies with long follow-up time are needed to confirm these findings.

20.

Objectives:

Muscle mass and muscle power decline considerably with aging. The aim of the present study was to determine the association between muscular function, assessed by mechanography, and sarcopenia, falls and impairment in activities of daily living (ADL) in a sample of 293 community-dwelling women and men aged 60-85 years in Berlin, Germany.

Methods:

Muscle function was determined by muscle power per body mass in vertical countermovement jumps (2LJPrel) and the chair rising test (CRTPrel) on a force plate. Sarcopenia status was assessed by estimating appendicular muscle mass with dual-energy X-ray absorptiometry. Self-reported ADL impairment and falls in the last 12 months were determined.
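
Jump power per body mass of the kind reported here is typically derived from the vertical ground-reaction-force trace by integrating net force to obtain velocity and multiplying by force; the sketch below shows that computation under an assumed sampling rate and file layout.

    import numpy as np

    g = 9.81
    fs = 1000.0                               # assumed sampling frequency in Hz
    force = np.loadtxt("vertical_grf.txt")    # hypothetical vertical force trace in N

    body_mass = force[:500].mean() / g        # quiet standing at the start of the trace
    accel = (force - body_mass * g) / body_mass
    velocity = np.cumsum(accel) / fs          # integrate acceleration over time
    power = force * velocity                  # instantaneous power in W

    peak_power_per_mass = power.max() / body_mass  # W/kg, analogous to 2LJPrel
    print(round(peak_power_per_mass, 1))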

Results:

ADL impairment was significantly correlated with all performance tests but not with muscle mass. The 2LJPrel (OR 0.88, 95% CI 0.79-0.98), the Esslinger Fitness Index (EFI) (OR 0.97, 95% CI 0.94-1.00) and the maximal velocity of the CRT (OR 0.70, 95% CI 0.53-0.93) remained significant correlates for sarcopenia independent of age in men but not in women. The EFI could differentiate female individuals who had past fall events (OR 0.96, 95% CI 0.93-0.98).

Conclusion:

The results of the present study highlight the importance of assessing muscle power in older individuals as a relevant correlate for functional decline.
