Similar Articles
1.

Background

We aimed to develop a multivariable model for predicting underestimated invasiveness in women diagnosed with ductal carcinoma in situ at stereotactic large core needle biopsy that can be used to select patients for sentinel node biopsy at primary surgery.

Methods

From the literature, we selected potential preoperative predictors of underestimated invasive breast cancer. Data of patients with nonpalpable breast lesions who were diagnosed with ductal carcinoma in situ at stereotactic large core needle biopsy, drawn from the prospective COBRA (Core Biopsy after RAdiological localization) and COBRA2000 cohort studies, were used to fit the multivariable model and assess its overall performance, discrimination, and calibration.

Results

348 women with large core needle biopsy-proven ductal carcinoma in situ were available for analysis. In 100 (28.7%) patients, invasive carcinoma was found at subsequent surgery. Nine predictors were included in the model. In the multivariable analysis, the predictors with the strongest associations were lesion size (OR 1.12 per cm, 95% CI 0.98-1.28), number of cores retrieved at biopsy (OR per core 0.87, 95% CI 0.75-1.01), presence of lobular cancerization (OR 5.29, 95% CI 1.25-26.77), and microinvasion (OR 3.75, 95% CI 1.42-9.87). The overall performance of the multivariable model was poor, with an explained variation of 9% (Nagelkerke's R²), mediocre discrimination with an area under the receiver operating characteristic curve of 0.66 (95% confidence interval 0.58-0.73), and fairly good calibration.

Conclusion

The evaluation of our multivariable prediction model in a large, clinically representative study population shows that routine clinical and pathological variables are not suitable for selecting patients with large core needle biopsy-proven ductal carcinoma in situ for sentinel node biopsy during primary surgery.
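
The discrimination metric reported above (area under the receiver operating characteristic curve) has a direct probabilistic reading: it is the chance that a randomly chosen upstaged case receives a higher predicted risk than a randomly chosen pure-DCIS case. A minimal sketch of that equivalence, using made-up predicted risks rather than the study's data:

```python
def auc(case_scores, control_scores):
    """Area under the ROC curve via the Mann-Whitney statistic: the
    probability that a random case outscores a random control
    (ties count as one half)."""
    concordant = 0.0
    for c in case_scores:
        for k in control_scores:
            if c > k:
                concordant += 1.0
            elif c == k:
                concordant += 0.5
    return concordant / (len(case_scores) * len(control_scores))

# Hypothetical predicted risks of invasive upstaging (illustrative only)
invasive = [0.45, 0.30, 0.60, 0.35]   # invasive carcinoma at surgery
pure_dcis = [0.25, 0.40, 0.20, 0.30]  # pure DCIS confirmed at surgery
print(round(auc(invasive, pure_dcis), 3))  # → 0.844
```

An AUC of 0.66, as reported, means this probability is only 66%, which is why the authors describe the model's discrimination as mediocre.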

2.

Context

The stress response induced by surgery is proposed to play an important role in the pathogenesis of postoperative cognitive dysfunction.

Objective

To investigate the association between postoperative serum cortisol level and occurrence of cognitive dysfunction early after coronary artery bypass graft surgery.

Design

Prospective cohort study.

Setting

Two teaching hospitals.

Patients

One hundred and sixty-six adult patients who were referred for elective coronary artery bypass graft surgery from March 2008 to December 2009.

Intervention

None.

Main Outcome Measures

Neuropsychological tests were completed one day before and seven days after surgery. Cognitive dysfunction was defined using the same definition as in the ISPOCD1 study. Blood samples were obtained on the first postoperative morning for measurement of serum cortisol concentration. Multivariate logistic regression analyses were performed to assess the relationship between serum cortisol level and the occurrence of postoperative cognitive dysfunction.

Results

Cognitive dysfunction occurred in 39.8% (66 of 166) of patients seven days after surgery. Multivariate logistic regression analysis showed that a high serum cortisol level was significantly associated with the occurrence of postoperative cognitive dysfunction (odds ratio [OR] 2.603, 95% confidence interval [CI] 1.371-4.944, P = 0.003). Other independent predictors of early postoperative cognitive dysfunction included high preoperative New York Heart Association functional class (OR 0.402, 95% CI 0.207-0.782, P = 0.007), poor preoperative Grooved Pegboard test score of the nondominant hand (OR 1.022, 95% CI 1.003-1.040, P = 0.020), use of penehyclidine as premedication (OR 2.565, 95% CI 1.109-5.933, P = 0.028), and occurrence of complications within seven days after surgery (OR 2.677, 95% CI 1.201-5.963, P = 0.016).

Conclusions

A high serum cortisol level on the first postoperative morning was associated with an increased risk of cognitive dysfunction seven days after coronary artery bypass graft surgery.

3.

Background

Studies conducted in high-income countries have shown that anaemia is a common medical condition among older people, but such data are scarce in Africa. The objectives of this study were to estimate the prevalence, types, risk factors and clinical correlates of anaemia in older people.

Methods

Participants aged ≥50 years were recruited from a general population cohort from January 2012 to January 2013. Blood samples were collected to assess hemoglobin, serum ferritin, serum vitamin B12, serum folate, C-reactive protein and malaria infection, and stool samples to assess hookworm infection. HIV status was assessed using an algorithm for HIV rapid testing. Questionnaires were used to collect data on sociodemographic characteristics and other risk factors for anaemia.

Results

In total, 1449 people participated (response rate 72.3%). The overall prevalence of anaemia was 20.3% (95% CI 18.2-22.3%), and it was higher in males (24.1%, 95% CI 20.7-27.7%) than in females (17.5%, 95% CI 15.0-20.1%). In males, the prevalence of anaemia increased rapidly with age, almost doubling between 50 and 65 years (p-trend<0.001). Unexplained anaemia accounted for more than half of all cases (59.7%). Anaemia was independently associated with infections, including malaria (OR 3.49, 95% CI 1.78-6.82), HIV (OR 2.17, 95% CI 1.32-3.57) and heavy hookworm infection (OR 3.45, 95% CI 1.73-6.91), as well as with low fruit consumption (OR 1.55, 95% CI 1.05-2.29) and being unmarried (OR 1.37, 95% CI 1.01-1.89). However, the odds of anaemia were lower among older people with elevated blood pressure (OR 0.47, 95% CI 0.29-0.77).

Conclusion

Anaemia control programmes in Uganda should target older people and should include interventions to treat and control hookworms and educational programs on diets that enhance iron absorption. Clinicians should consider screening older people with HIV or malaria for anaemia. Further studies should be done on unexplained anaemia and serum ferritin levels that predict iron deficiency anaemia in older people.

4.

Background

Limited evidence exists on the utilization of surveillance colonoscopy in colorectal cancer (CRC) screening programs. We assessed adherence to physician recommendations for surveillance in opportunistic CRC screening in Germany.

Methods

A follow-up study of screening colonoscopy participants in 2007-2009 in Saarland, Germany, was conducted using health insurance claims data. Utilization of additional colonoscopies through to 2011 was ascertained. Adherence to surveillance intervals of 3, 6, 12 and 36 months, defined as having had colonoscopy at 2.5 to 4, 5 to 8, 10.5 to 16 and 33 to 48 months, respectively (i.e., tolerating a delay of 33% of each interval) was assessed. Potential predictors of non-adherence were investigated using logistic regression analysis.

Results

A total of 20,058 screening colonoscopy participants were included in the study. Of those with recommended surveillance intervals of 3, 6, 12 and 36 months, 46.5% (95% confidence interval [CI]: 37.3-55.7%), 38.5% (95% CI: 29.6-47.3%), 25.4% (95% CI: 21.2-29.6%) and 28.0% (95% CI: 25.5-30.5%), respectively, had a subsequent colonoscopy within the specified margins. Older age, a longer recommended surveillance interval, not having had polypectomy at screening, and a negative screening colonoscopy were statistically significant predictors of non-adherence.

Conclusion

This study suggests frequent non-adherence to physician recommendations for surveillance colonoscopy in community practice. Increased efforts to improve adherence, including the introduction of more elements of an organized screening program, seem necessary to ensure a high-quality CRC screening process.
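
The interval definitions in the Methods map directly onto acceptance windows: each upper bound is the recommended interval plus the tolerated 33% delay. A small sketch of that adherence check (function and variable names are my own, not the study's):

```python
# Surveillance windows in months, as specified in the study: a follow-up
# colonoscopy counts as adherent if it falls within the window for the
# recommended interval. Each upper bound equals interval * 4/3,
# i.e. a tolerated delay of 33% of the interval.
WINDOWS = {3: (2.5, 4), 6: (5, 8), 12: (10.5, 16), 36: (33, 48)}

def is_adherent(recommended_interval_months, months_to_followup):
    lo, hi = WINDOWS[recommended_interval_months]
    return lo <= months_to_followup <= hi

print(is_adherent(12, 14))  # within 10.5-16 months → True
print(is_adherent(36, 50))  # more than 33% late → False
```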

5.

Background

Several studies have reported osteomyelitis of the jaw (OMJ) as a side effect of bisphosphonates (BPs), and the risk associated with oral BPs has recently been clarified. However, other systemic risk factors for OMJ remain unclear. Importantly, the possibility of risk classification based on the clinical characteristics of patients has not been explored. Here, we clarified risk factors for OMJ and evaluated the predictive accuracy of risk indices in osteoporosis patients.

Methods

We performed sub-analysis using a database developed for a retrospective cohort study in patients taking medications for osteoporosis at Kyoto University Hospital. Risk indices for OMJ were constructed using logistic regression analysis, and odds ratios (OR) for OMJ cases and 95% confidence intervals (CI) were estimated. Potential risk factors included in the statistical analysis were age; sex; diabetes; use of oral BPs, corticosteroids, cancer chemotherapy, antirheumatic drugs, and biologic agents; and their interactions. Risk indices were calculated by the sum of potential risk factors of an individual patient multiplied by the regression coefficients. The discriminatory power of the risk indices was assessed by receiver operating characteristic (ROC) analysis.

Results

In the analysis of all patients, oral BPs (OR: 4.98, 95% CI: 1.94-12.75), age (OR: 1.28, 95% CI: 1.06-1.60) and the sex-chemotherapy interaction (OR: 11.70, 95% CI: 1.46-93.64) were significant risk factors for OMJ. The areas under the ROC curves of these risk indices indicated moderate discriminatory power regardless of group (0.683 to 0.718).

Conclusions

Our data suggest that oral BP use, age, and the sex-chemotherapy interaction are predictors of OMJ in osteoporosis patients. The discriminatory power of the risk indices is moderately high, allowing prediction of OMJ incidence.
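
The risk index described in the Methods is a linear predictor: each patient's risk-factor values are multiplied by the fitted regression coefficients (the log odds ratios) and summed. A sketch using the ORs reported above; the patient values and helper names are illustrative assumptions, and the abstract does not report an intercept:

```python
import math

# Log odds ratios derived from the reported ORs (coefficient = ln OR).
COEFS = {
    "oral_bp": math.log(4.98),      # oral bisphosphonate use
    "age": math.log(1.28),          # per unit of the age scale used
    "sex_chemo": math.log(11.70),   # sex-chemotherapy interaction
}

def risk_index(patient):
    """Sum of the patient's risk-factor values times the coefficients."""
    return sum(COEFS[name] * value for name, value in patient.items())

no_bp = risk_index({"oral_bp": 0, "age": 1, "sex_chemo": 0})
with_bp = risk_index({"oral_bp": 1, "age": 1, "sex_chemo": 0})
# Oral BP use raises the index by ln(4.98), i.e. multiplies the odds by 4.98
print(round(math.exp(with_bp - no_bp), 2))  # → 4.98
```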

6.

Background

Despite routine use of clopidogrel, adverse cardiovascular events recur among some patients undergoing percutaneous coronary intervention (PCI). To optimize antiplatelet therapies, we performed a meta-analysis to quantify the efficacy of high versus standard-maintenance-dose clopidogrel in these patients.

Methods

Randomized controlled trials (RCTs) comparing high (>75 mg) and standard maintenance doses of clopidogrel in patients undergoing PCI were included. The primary efficacy and safety end-points were major adverse cardiovascular/cerebrovascular events (MACE/MACCE) and major bleeding. The secondary end-points were other ischemic and bleeding adverse effects. The pooled odds ratio (OR) for each outcome was estimated.

Results

14 RCTs with 4424 patients were included. Compared with standard-maintenance-dose clopidogrel, high-maintenance-dose clopidogrel significantly reduced the incidence of MACE/MACCE (OR 0.60; 95% CI 0.43 to 0.83), stent thrombosis (OR 0.56; 95% CI 0.32 to 0.99) and target vessel revascularization (OR 0.38; 95% CI 0.20 to 0.74), without a significant decrease in the risk of cardiovascular death (OR 0.92; 95% CI 0.74 to 1.13) or myocardial infarction (OR 0.83; 95% CI 0.51 to 1.33). For safety outcomes, it did not significantly increase the risk of major bleeding (OR 0.73; 95% CI 0.41 to 1.32), minor bleeding (OR 1.29; 95% CI 1.00 to 1.66) or any bleeding (OR 1.14; 95% CI 0.91 to 1.43).

Conclusion

High-maintenance-dose clopidogrel reduces the recurrence of most ischemic events in patients after PCI without increasing the risk of bleeding complications.
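
Pooled odds ratios like those above come from inverse-variance weighting of study-level log odds ratios, with each standard error recovered from the reported confidence interval. A minimal fixed-effect sketch; the three input studies are invented for illustration and are not the trials in this meta-analysis:

```python
import math

Z = 1.96  # normal quantile for a 95% CI

def pooled_or(studies):
    """Fixed-effect (inverse-variance) pooling of (OR, ci_low, ci_high)."""
    num = den = 0.0
    for or_, lo, hi in studies:
        log_or = math.log(or_)
        se = (math.log(hi) - math.log(lo)) / (2 * Z)  # SE of log OR
        weight = 1.0 / se ** 2
        num += weight * log_or
        den += weight
    mean = num / den
    se_pooled = math.sqrt(1.0 / den)
    return (math.exp(mean),
            math.exp(mean - Z * se_pooled),
            math.exp(mean + Z * se_pooled))

# Invented study-level results (OR, 95% CI bounds) -- illustrative only
studies = [(0.55, 0.35, 0.86), (0.70, 0.45, 1.09), (0.58, 0.33, 1.02)]
or_hat, lo, hi = pooled_or(studies)
print(round(or_hat, 2), round(lo, 2), round(hi, 2))
```

A random-effects model, as used in some of the other meta-analyses in this list, adds a between-study variance component to each weight.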

7.

Background

Non-adherence is one of the strongest predictors of therapeutic failure in HIV-positive patients. Virologic failure with subsequent emergence of resistance reduces future treatment options and long-term clinical success.

Methods

Prospective observational cohort study including patients starting a new class of antiretroviral therapy (ART) between 2003 and 2010. Participants were naïve to the ART class and completed ≥1 adherence questionnaire prior to resistance testing. Outcomes were the development of any IAS-USA, class-specific, or M184V mutations. Associations between adherence and resistance were estimated using logistic regression models stratified by ART class.

Results

Of 314 included individuals, 162 started an NNRTI regimen and 152 a PI/r regimen. Adherence was similar between groups, with 85% reporting adherence ≥95%. The number of new mutations increased with increasing non-adherence. In the NNRTI group, multivariable models indicated a significant linear association in the odds of developing IAS-USA (odds ratio (OR) 1.66, 95% confidence interval (CI): 1.04-2.67) or class-specific (OR 1.65, 95% CI: 1.00-2.70) mutations. Levels of drug resistance were considerably lower in the PI/r group, and adherence was only significantly associated with M184V mutations (OR 8.38, 95% CI: 1.26-55.70). Adherence was significantly associated with HIV RNA in PI/r but not NNRTI regimens.

Conclusion

Therapies containing a PI/r appear more forgiving of incomplete adherence than NNRTI regimens, which allow higher levels of resistance even with adherence above 95%. However, in failing PI/r regimens, good adherence may prevent the accumulation of further resistance mutations and therefore help to preserve future drug options. In contrast, adherence levels have little impact on NNRTI treatments once the first mutations have emerged.

8.

Introduction

In clinical populations, paranoid delusions are associated with making global, stable and external attributions for negative events. Paranoia is common in community samples, but it is not known whether it is associated with a similar cognitive style. This study investigates the association between cognitive style and paranoia in a large community sample of young adults.

Methods

2694 young adults (mean age 17.8, SD 4.6) from the ALSPAC cohort provided data on psychotic experiences and cognitive style. Psychotic experiences were assessed using a semi-structured interview and cognitive style was assessed using the Cognitive Styles Questionnaire-Short Form (CSQ-SF) on the same occasion. Logistic regression was used to investigate associations between paranoia and CSQ-SF scores, both total and domain-related (global, stable, self, external). The role of concurrent self-reported depressive symptoms in the association was explored.

Results

Paranoia was associated with Total CSQ-SF scores (adjusted OR 1.69, 95% CI 1.29-2.22), as well as with the global (OR 1.56, 95% CI 1.17-2.08), stable (OR 1.56, 95% CI 1.17-2.08) and self (OR 1.37, 95% CI 1.05-1.79) domains; only the Total score and global domain associations remained after additional adjustment for self-reported depression. There was no association between paranoia and an external cognitive style (OR 1.10, 95% CI 0.83-1.47).

Conclusion

Paranoid ideation in a community sample is associated with a global rather than an external cognitive style. An external cognitive style may be a characteristic of more severe paranoid beliefs. Further work is required to determine the role of depression in the association between cognitive style and paranoia.

9.

Introduction

K-ras gene mutations are common in colorectal cancer patients, but their relationship with prognosis is unclear.

Objective

To verify prognostic differences between patients with and without mutant K-ras genes by reviewing the published evidence.

Method

Systematic reviews and databases were searched for cohort/case-control studies on the prognosis of colorectal cancer patients with detected K-ras mutations versus those without mutant K-ras genes, both of whom received chemotherapy. The number of patients, chemotherapy regimens, and short-term or long-term survival rates (disease-free or overall) were extracted. The quality of the studies was also evaluated.

Principal Findings

Seven studies with a control-group comparison were identified. K-ras gene status was associated with neither short-term disease-free survival (OR=1.01, 95% CI, 0.73-1.38, P=0.97) nor overall survival (OR=1.06, 95% CI, 0.82-1.36, P=0.66) in CRC patients who received chemotherapy. Comparison of long-term survival between the two groups also indicated no significant difference after heterogeneity was eliminated (OR=1.09, 95% CI, 0.85-1.40, P=0.49).

Conclusions

K-ras gene mutations may not be a prognostic index for colorectal cancer patients who receive chemotherapy.

10.

Objectives

To compare uterine rupture, maternal and perinatal morbidity rates in women with a single previous cesarean after spontaneous onset of labor or after low-dose prostaglandin-induced cervical ripening for an unfavourable cervix.

Study Design

This was a retrospective cohort study of 4,137 women with a single previous cesarean over a 22-year period. Inpatient prostaglandin administration consisted of a single daily local application.

Results

Vaginal delivery was planned for 3,544 (85.7%) patients, 2,704 (76.3%) of whom delivered vaginally (vaginal birth after Cesarean (VBAC) rate = 65.4%). Among women receiving prostaglandins (n=515), 323 (62.7%) delivered vaginally. Uterine rupture (0.7% compared with 0.8%, OR 1.1, 95% CI 0.4-3.4, p=0.88), maternal (0.9% compared with 1.2%, OR 1.3, 95% CI 0.5-3.2, p=0.63) and perinatal (0.3% compared with 0.8%, OR 2.4, 95% CI 0.7-8.5, p=0.18) morbidity rates did not differ significantly between patients with spontaneous onset of labor and those receiving prostaglandins, nor did these rates differ according to the planned mode of delivery.

Conclusion

In comparison with spontaneous labor, inducing cervical ripening with low-dose prostaglandins in cases of an unfavourable cervix is not associated with an appreciable increase in uterine rupture or in maternal or perinatal morbidity.

11.

Purpose

Patients with atrial fibrillation (AF) and prior stroke are classified as high risk in all risk stratification schemes. A systematic review and meta-analysis was performed to compare the efficacy and safety of new oral anticoagulants (NOACs) with warfarin in patients with AF and previous stroke or transient ischemic attack (TIA).

Methods

Three randomized controlled trials (RCTs), including a total of 14,527 patients, comparing NOACs (apixaban, dabigatran and rivaroxaban) with warfarin were included in the analysis. The primary efficacy endpoint was ischemic stroke, and the primary safety endpoint was intracranial bleeding. Random-effects models were used to pool efficacy and safety data across RCTs. RevMan and Stata software were used for direct and indirect comparisons, respectively.

Results

In patients with AF and previous stroke or TIA, the effects of NOACs were not statistically different from those of warfarin in the reduction of stroke (odds ratio [OR] 0.86, 95% confidence interval [CI] 0.73-1.01), disabling and fatal stroke (OR 0.85, 95% CI 0.71-1.04), and all-cause mortality (OR 0.90, 95% CI 0.79-1.02). Randomization to NOACs was associated with a significantly lower risk of intracranial bleeding (OR 0.42, 95% CI 0.25-0.70). There were no major differences in efficacy between apixaban, dabigatran (110 mg BID and 150 mg BID) and rivaroxaban. Major bleeding was significantly lower with apixaban and dabigatran (110 mg BID) than with dabigatran (150 mg BID) and rivaroxaban.

Conclusion

NOACs may not be more effective than warfarin for the secondary prevention of ischemic stroke in patients with a prior history of cerebrovascular ischemia, but they carry a lower risk of intracranial bleeding.

12.

Background

Physical performance is a major determinant of health in older adults and is related to lifestyle factors. Dietary fiber has multiple health benefits, but it remains unclear whether fiber intake is independently linked to superior physical performance. We aimed to assess the association between dietary fiber and physical performance in older adults.

Methods

This was a cross-sectional study of community-dwelling adults aged 55 years and older (n=2680) from the ongoing Healthy Aging Longitudinal Study (HALST) in Taiwan, 2008-2010. Daily dietary fiber intake was assessed using a validated food frequency questionnaire. Physical performance was determined objectively by measuring gait speed, 6-minute walk distance, timed “up and go” (TUG) time, summary performance score, and hand grip strength.

Results

After adjusting for all potential confounders, participants with higher fiber intake had significantly faster gait speed, longer 6-minute walk distance, faster TUG performance, higher summary performance scores, and greater hand grip strength (all P <.05). Compared with the highest quartile of fiber intake, the lowest quartile was significantly associated with the lowest sex-specific quartile of gait speed (adjusted OR, 2.18 in men [95% CI, 1.33-3.55] and 3.65 in women [95% CI, 2.20-6.05]), 6-minute walk distance (OR, 2.40 in men [95% CI, 1.38-4.17] and 4.32 in women [95% CI, 2.37-7.89]), TUG (OR, 2.42 in men [95% CI, 1.43-4.12] and 3.27 in women [95% CI, 1.94-5.52]), summary performance score (OR, 2.12 in men [95% CI, 1.19-3.78] and 5.47 in women [95% CI, 3.20-9.35]), and hand grip strength (OR, 2.64 in men [95% CI, 1.61-4.32] and 4.43 in women [95% CI, 2.62-7.50]).

Conclusions

Dietary fiber intake was independently associated with better physical performance.

13.

Objective

To describe the prevalence of dementia and subtypes in a general elderly population in northwestern Spain and to analyze the influence of socio-demographic factors.

Methods

Cross-sectional, two-phase, door-to-door, population-based study. A total of 870 individuals from a rural region and 2,119 individuals from an urban region of Valladolid, Spain, were involved. The seven-minute screen neurocognitive battery was used in the screening phase. A control group was included.

Results

A total of 2,170 individuals aged 65 to 104 years (57% women) were assessed, and 184 subjects were diagnosed with dementia. The crude prevalence was 8.5% (95% CI: 7.3-9.7). The age- and sex-adjusted prevalence was 5.5% (95% CI: 4.5-6.5). The main subtypes of dementia were Alzheimer’s disease (AD), 77.7%; Lewy body disease, 7.6%; and vascular dementia (VD), 5.9%. Crude prevalences were 6.6% (AD), 0.6% (Lewy body disease), and 0.5% (VD). Dementia was associated with age (OR 1.14 per 1-year increase in age), female sex (OR 1.79) and the absence of formal education (OR 2.53 compared with subjects with primary education or more).

Conclusion

The prevalence of dementia in the study population was lower than the most recent estimates for Western Europe. There was a high proportion of AD among all dementia cases and very low prevalence of VD. Old age, female sex, and low education level were independent risk factors for dementia and AD.

14.

Background

Gonadotropin-releasing hormone agonists (GnRHa) may play a role in preserving ovarian function in lymphoma patients by inhibiting chemotherapy-induced ovarian follicular damage. However, studies of their clinical efficacy have reported conflicting results.

Method

We conducted a meta-analysis to determine the effect of administering GnRHa on the preservation of ovarian function in young patients with lymphoma undergoing chemotherapy. Seven studies that met the inclusion criteria were identified, comprising 434 patients assigned to GnRHa combined with chemotherapy or to chemotherapy alone.

Results

The incidence of premature ovarian failure (POF) showed a statistically significant difference in favor of GnRHa use (OR=0.32, 95% CI 0.13-0.77). In addition, the final FSH level in the GnRHa group was significantly lower than in the control group (MD=-11.73, 95% CI -22.25 to -1.20), and the final AMH level in the GnRHa group was significantly higher than in the control group (MD=0.80, 95% CI 0.61 to 0.98). However, there was no statistically significant difference between the treatment and control groups in the incidence of spontaneous pregnancy (OR=1.11, 95% CI 0.55 to 2.26).

Conclusion

This meta-analysis suggests that GnRHa may be effective in protecting ovarian function during chemotherapy in lymphoma patients. More well-designed prospective studies are needed to further clarify this topic.

15.

Purpose

To examine the associations between area-level socioeconomic attributes and stage at diagnosis of esophageal adenocarcinoma in 16 SEER cancer registries during 2000-2007.

Methods

Odds ratios (OR) and 95% confidence intervals (CI) were calculated using multivariable logistic regression models to assess the relationship between distant-stage esophageal adenocarcinoma and individual, census tract, and county-level attributes.

Results

Among cases with data on birthplace, no significant association was seen between reported birth within versus outside the United States and distant-stage cancer (adjusted OR=1.02, 95% CI: 0.85-1.22). Living in an area with a higher percentage of residents born outside the United States than the national average was associated with distant-stage esophageal adenocarcinoma; census tract level: >11.8% (OR=1.10, 95% CI: 1.01-1.19); county level: >11.8% (OR=1.14, 95% CI: 1.05-1.24). No association was observed between median household income and distant-stage cancer at either the census tract or county level.

Conclusion

The finding of greater odds of distant-stage esophageal adenocarcinoma among cases residing in SEER areas with a higher proportion of residents born outside the United States suggests local areas where esophageal cancer control efforts might be focused. Missing data at the individual level were a limitation of the present study. Furthermore, the inconsistent associations with foreign birth at the individual versus area level caution against using area-level attributes as proxies for case attributes.

16.

Background

Migration is a major challenge to tuberculosis (TB) control worldwide. TB treatment requires multiple drugs for at least six months. Some TB patients default before completing their treatment regimen, which can lead to ongoing infectiousness and drug resistance.

Methods

We conducted a retrospective analysis of 29,943 active TB cases among urban migrants that were reported between 2000 and 2008 in Shanghai, China. We used logistic regression models to identify factors independently associated with treatment default among TB patients who were urban migrants during 2005-2008.

Results

Fifty-two percent of all TB patients reported in Shanghai during the study period were urban migrants. Three factors increased the odds of a treatment default: case management using self-administered therapy (OR, 5.84, 95% CI, 3.14-10.86, p<0.0005), being a retreatment case (OR, 1.47, 95% CI, 1.25-1.71, p<0.0005), and age >60 years (OR, 1.33, 95% CI, 1.05-1.67, p=0.017). The presence of a cavity on the initial chest radiograph decreased the odds of a treatment default (OR, 0.87, 95% CI, 0.77-0.97, p=0.015), as did migration from central China (OR, 0.85, 95% CI, 0.73-0.99, p=0.042), case management by family members (OR, 0.73, 95% CI 0.66-0.81, p<0.0005), and the combination of case detection by a required physical exam and case management by health care staff (OR, 0.64, 95% CI, 0.45-0.93, p=0.019).

Conclusion

Among TB patients who were urban migrants in Shanghai, case management using self-administered therapy was the strongest modifiable risk factor independently associated with treatment default. Interventions that target retreatment TB cases could also reduce treatment defaults among urban migrants. Health departments should develop effective measures to prevent treatment defaults among urban migrants, to ensure completion of therapy among urban migrants who move between cities and provinces, and to improve the reporting of treatment outcomes.

17.

Background

The association between diabetes mellitus (DM) and tuberculosis (TB) is re-emerging worldwide, and the prevalence of DM is increasing in resource-poor countries where the TB burden is high. The objective of the current study was to determine the prevalence of TB and DM comorbidity and analyze its associated factors in the South-Eastern Amhara Region, Ethiopia.

Methods

This was a facility-based cross-sectional study. All newly diagnosed TB patients attending selected health facilities in the study area were consecutively screened for DM. DM was diagnosed based on the World Health Organization diagnostic criteria. A pre-tested semi-structured questionnaire was used to collect sociodemographic, lifestyle and clinical data. Logistic regression analysis was performed to identify factors associated with TB and DM comorbidity.

Result

Among a total of 1314 patients who participated in the study, 109 (8.3%) had DM. Being female [odds ratio (OR) 1.70; 95% confidence interval (CI) 1.10-2.62], age 41-64 years [OR 3.35; 95% CI 2.01-5.57] or 65-89 years [OR 3.18; 95% CI 1.52-6.64], being a pulmonary TB case [OR 1.69; 95% CI 1.09-2.63] and having a family history of DM [OR 4.54; 95% CI 2.36-8.73] were the factors identified as associated with TB and DM comorbidity.

Conclusion

The prevalence of DM among TB patients in the South-Eastern Amhara Region is high. Routine screening of TB patients for DM is recommended in the study area.

18.

Background

Breast fibroglandular (dense) tissue is a risk factor for breast cancer. Beyond breast cancer, little is known regarding the prognostic significance of mammographic features.

Methods

We evaluated relationships between nondense (fatty) breast area and dense area with all-cause mortality in 4,245 initially healthy women from the Breast Cancer Detection Demonstration Project; 1,361 died during a mean follow-up of 28.2 years. Dense area and total breast area were assessed using planimeter measurements from screening mammograms. Percent density reflects dense area relative to breast area and nondense area was calculated as the difference between total breast area and dense area. Hazard ratios (HRs) and 95% confidence intervals (CIs) were estimated by Cox proportional hazards regression.

Results

In age-adjusted models, greater nondense and total breast area were associated with increased risk of death (HR 1.17, 95% CI 1.10-1.24 and HR 1.13, 95% CI 1.06-1.19, per SD difference) while greater dense area and percent density were associated with lower risk of death (HR 0.91, 95% CI 0.86-0.95 and HR 0.87, 95% CI 0.83-0.92, per SD difference). Associations were not attenuated with adjustment for race, education, mammogram type (x-ray or xerogram), smoking status, diabetes and heart disease. With additional adjustment for body mass index, associations were diminished for all features but remained statistically significant for dense area (HR 0.94, 95% CI 0.89-0.99, per SD difference) and percent density (HR 0.93, 95% CI 0.87-0.98, per SD difference).

Conclusions

These data indicate that dense area and percent density may be related to survival in healthy women and suggest the potential utility of mammograms beyond the prediction of breast cancer risk.
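
The hazard ratios above are reported "per SD difference"; under the proportional hazards assumption they scale multiplicatively, so a difference of k standard deviations corresponds to the HR raised to the power k. A one-line illustration using the fully adjusted dense-area estimate from the Results:

```python
hr_per_sd = 0.94               # dense area, BMI-adjusted (from the abstract)
hr_two_sd = hr_per_sd ** 2     # implied HR for a 2-SD difference
print(round(hr_two_sd, 4))     # → 0.8836
```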

19.

Objective

To investigate the contributions of adenoid and tonsil size to childhood obstructive sleep apnea (OSA) and the interactions between adenotonsillar hypertrophy, age, and obesity in children with OSA.

Methods

In total, 495 symptomatic patients were recruited and assigned to four groups according to age: toddler (age 1-3, n=42), preschool (age 3-6, n=164), school (age 6-12, n=200), and adolescent (age 12-18, n=89). All subjects had tonsil size graded by otolaryngologists, adenoid size determined on lateral radiographs (Fujioka method), and full-night polysomnography. The apnea-hypopnea index (AHI), adenoid size, and tonsil size were compared between obese and non-obese children in the four age groups. Adjusted odds ratios (ORs) and 95% confidence intervals (CIs) for adenotonsillar hypertrophy and OSA risk were estimated by multivariable logistic regression.

Results

The AHI was positively related to tonsil grade (r=0.33, p<0.001) and adenoid size (r=0.24, p<0.01) in all patients. Tonsil grade was positively related to AHI in all four age groups. Adenoid size was positively related to AHI in the toddler, preschool, and school groups, but not in the adolescent group (r=0.11, p=0.37). Tonsil grade and adenoid size were both positively related to AHI in obese and non-obese children. In the regression model, obesity (OR=2.89; 95% CI 1.47-5.68), tonsillar hypertrophy (OR=3.15; 95% CI 2.04-4.88), and adenoidal hypertrophy (OR=1.89; 95% CI 1.19-3.00) significantly increased OSA risk.

Conclusions

Adenotonsillar hypertrophy and obesity are the major determinants of OSA in children. However, the influence of adenoid size decreases in adolescence.

20.

Background

Although both oral fluoropyrimidines have been reported to be effective and safe, it remains unclear whether S-1 or capecitabine is more advantageous in advanced gastric carcinoma (AGC). Here, we performed a meta-analysis to comprehensively compare the efficacy and safety of S-1-based versus capecitabine-based chemotherapy as first-line treatment for AGC.

Methods

PubMed/Medline, EmBase, the Cochrane Library, and the China National Knowledge Infrastructure databases were searched for articles comparing S-1-based with capecitabine-based chemotherapy for AGC. Primary outcomes were overall response rate (ORR), time to progression (TTP), overall survival (OS), progression-free probability, and survival probability. Secondary outcomes were toxicities. A fixed-effects model was used, and all results were confirmed with a random-effects model.

Results

Five randomized controlled trials and five cohort studies with 821 patients were included. We found equivalent ORR (38.3% vs. 39.1%, odds ratio [OR] 0.92, 95% confidence interval [CI] 0.69-1.24, P = 0.59), TTP (hazard ratio [HR] 0.98, 95% CI 0.82-1.16, P = 0.79), OS (HR 0.99, 95% CI 0.87-1.13, P = 0.91), progression-free probability (3-month OR 1.02, 95% CI 0.62-1.68, P = 0.94; 6-month OR 1.34, 95% CI 0.88-2.04, P = 0.18) and survival probability (0.5-year OR 0.90, 95% CI 0.61-1.31, P = 0.57; 1-year OR 0.97, 95% CI 0.70-1.33, P = 0.84; 2-year OR 1.15, 95% CI 0.61-2.17, P = 0.66). Grade 3 to 4 hematological and non-hematological toxicities were equivalent, except that hand-foot syndrome was less frequent with S-1-based chemotherapy (0.3% vs. 5.9%, OR 0.19, 95% CI 0.06-0.56, P = 0.003). There was no significant heterogeneity or publication bias. Cumulative analysis found a stable time-dependent trend, and consistent results were observed when stratified by study design, age, regimen, cycle, and country.

Conclusion

S-1-based chemotherapy was associated with non-inferior antitumor efficacy and a better safety profile compared with capecitabine-based therapy. We suggest that S-1 and capecitabine can be used interchangeably for AGC, at least in Asia.
