Similar Articles (20 results)
1.

Objective

Patients with late-onset depression (LOD) have been reported to run a higher risk of subsequent dementia. The present study was conducted to assess whether statins can reduce the risk of dementia in these patients.

Methods

We used data from the National Health Insurance of Taiwan covering 1996–2009. Standardized incidence ratios (SIRs) were calculated for LOD and subsequent dementia. The criteria for an LOD diagnosis included age ≥65 years, diagnosis of depression after 65 years of age, at least three service claims, and treatment with antidepressants. A time-dependent Cox proportional hazards model was applied for multivariate analyses. Propensity scores with one-to-one nearest-neighbor matching were used to select matched patients for validation studies. Kaplan-Meier estimates were used to assess dementia-free survival after the diagnosis of LOD.
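The one-to-one nearest-neighbor matching step can be illustrated with a short sketch (not the authors' implementation; the patient IDs, scores, and the 0.05 caliper below are hypothetical):

```python
def greedy_match(treated, controls, caliper=0.05):
    """Greedy 1:1 nearest-neighbor matching on propensity scores.

    treated, controls: dicts mapping patient id -> propensity score.
    Returns (treated_id, control_id) pairs; each control is used at
    most once (matching without replacement).
    """
    pairs = []
    available = dict(controls)
    # Match treated patients in descending score order (a common heuristic).
    for t_id, t_score in sorted(treated.items(), key=lambda kv: -kv[1]):
        if not available:
            break
        c_id = min(available, key=lambda c: abs(available[c] - t_score))
        if abs(available[c_id] - t_score) <= caliper:
            pairs.append((t_id, c_id))
            del available[c_id]
    return pairs

# Hypothetical propensity scores.
print(greedy_match({"T1": 0.71, "T2": 0.42},
                   {"C1": 0.70, "C2": 0.45, "C3": 0.10}))
# → [('T1', 'C1'), ('T2', 'C2')]
```

The caliper rejects matches whose scores differ too much, at the cost of leaving some treated patients unmatched.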

Results

In total, 45,973 patients aged ≥65 years were enrolled. The prevalence of LOD was 12.9% (5,952/45,973). Patients with LOD had a higher incidence of subsequent dementia than those without LOD (odds ratio: 2.785; 95% CI 2.619–2.958). Among patients with LOD, users of lipid-lowering agents (LLAs) for at least 3 months had a lower incidence of subsequent dementia than non-users (hazard ratio = 0.781, 95% CI 0.685–0.891). However, only statin users showed a reduced risk of dementia (hazard ratio = 0.674, 95% CI 0.547–0.832), while users of other LLAs did not; this was further validated by Kaplan-Meier estimates after propensity scores with one-to-one nearest-neighbor matching were used to control for confounding factors.

Conclusions

Statins may reduce the risk of subsequent dementia in patients with LOD.

2.
Background

We sought to determine the clinical outcomes of patients with breast cancer (BC) who had undergone stereotactic radiosurgery (SRS) for a limited number of brain metastases (BM) and to identify factors influencing overall survival (OS) and local control.

Materials and methods

The records of 45 patients who underwent SRS for 72 brain lesions were retrospectively evaluated. Statistics included the chi-squared test, Kaplan-Meier method, and the multivariate Cox model.

Results

The median number of treated BM was 2 (range 1–10). Median OS from BM diagnosis and post-SRS were 27.6 months [95% confidence interval (CI): 14.8–40.5] and 18.5 months (95% CI: 11.1–25.8), respectively. One-year and two-year survival rates after BM diagnosis were 55% and 41%, respectively. In a univariate analysis, the Luminal B human epidermal growth factor receptor 2-positive (HER2+) subtype had the longest median OS at 39.1 months (95% CI: 34.1–44.1, p = 0.004). In an adjusted analysis, grade 2 [hazard ratio (HR): 0.1; 95% CI: 0.1–0.6, p = 0.005], craniotomy (HR: 0.3; 95% CI: 0.1–0.7; p = 0.006), and ≥2 systemic therapies received (HR: 0.3; 95% CI: 0.1–0.9, p = 0.028) were associated with improved OS. One-year and two-year intracranial progression-free survival rates were 85% and 63%, respectively. Four factors for a higher risk of any intracranial recurrence remained significant in the adjusted analysis: age <50 years (HR: 4.2; 95% CI: 1.3–36.3; p = 0.014), grade 3 (HR: 3.7; 95% CI: 1.1–13.2; p = 0.038), HER2+ (HR: 6.9; 95% CI: 1.3–36.3; p = 0.023), and the brain being the first metastatic site (HR: 4.7; 95% CI: 1.6–14.5; p = 0.006).

Conclusion

Intrinsic BC characteristics are important determinants of both survival and intracranial control for patients undergoing SRS for oligometastatic brain disease.
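The Kaplan-Meier method used in this and several of the following abstracts is a product-limit estimator; a minimal sketch (illustrative only; the follow-up times and event flags below are hypothetical):

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimate.

    times: follow-up time for each patient.
    events: 1 if the event (e.g. death) occurred, 0 if censored.
    Returns (time, survival_probability) pairs at each event time.
    """
    data = sorted(zip(times, events))
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        at_risk = sum(1 for tt, _ in data if tt >= t)
        if deaths:
            surv *= 1 - deaths / at_risk
            curve.append((t, surv))
        while i < len(data) and data[i][0] == t:  # skip ties at t
            i += 1
    return curve

# Hypothetical follow-up times (months) and event indicators.
print(kaplan_meier([6, 7, 10, 15, 19, 25], [1, 0, 1, 1, 0, 1]))
```

Censored patients (event = 0) leave the risk set without forcing a drop in the curve, which is what distinguishes this estimator from a naive survival fraction.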

3.

Background

Liver transplantation has received increased attention in the medical field since the 1980s, following the introduction of new immunosuppressants and improved surgical techniques. Currently, transplantation is the treatment of choice for patients with end-stage liver disease, and its indications have been expanded. Liver transplantation outcomes depend on donor factors, operating conditions, and the disease stage of the recipient. A retrospective cohort comprising all adult liver transplants performed in the state of São Paulo, Brazil, between 2006 and 2012 was studied to identify mortality and graft failure rates and their associated factors.

Methods and Findings

A hierarchical Poisson multiple regression model was used to analyze factors related to mortality and graft failure in liver transplants. A total of 2,666 patients, 18 years or older, (1,482 males; 1,184 females) were investigated. Outcome variables included mortality and graft failure rates, which were grouped into a single binary variable called the negative outcome rate. Additionally, donor clinical, laboratory, intensive care, and organ characteristics and recipient clinical data were analyzed. The mortality rate was 16.2 per 100 person-years (py) (95% CI: 15.1–17.3), and the graft failure rate was 1.8 per 100 py (95% CI: 1.5–2.2). Thus, the negative outcome rate was 18.0 per 100 py (95% CI: 16.9–19.2). The best risk model demonstrated that recipient creatinine ≥ 2.11 mg/dl [RR = 1.80 (95% CI: 1.56–2.08)], total bilirubin ≥ 2.11 mg/dl [RR = 1.48 (95% CI: 1.27–1.72)], Na+ ≥ 141.01 mg/dl [RR = 1.70 (95% CI: 1.47–1.97)], INR ≥ 2.71 [RR = 1.64 (95% CI: 1.41–1.90)], body surface ≥ 1.98 [RR = 0.81 (95% CI: 0.68–0.97)] and donor age ≥ 54 years [RR = 1.28 (95% CI: 1.11–1.48)], male gender [RR = 1.19 (95% CI: 1.03–1.37)], dobutamine use [RR = 0.54 (95% CI: 0.36–0.82)] and intubation ≥ 6 days [RR = 1.16 (95% CI: 1.10–1.34)] affected the negative outcome rate.
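A rate per 100 person-years with a normal-approximation Poisson confidence interval can be computed as below (one common approach; the paper's exact interval method is not stated, and the counts used here are hypothetical):

```python
import math

def rate_per_100py(events, person_years, z=1.96):
    """Event rate per 100 person-years with a normal-approximation CI
    (treats the event count as Poisson, so SE ~ sqrt(events))."""
    rate = events / person_years * 100
    se = math.sqrt(events) / person_years * 100
    return rate, rate - z * se, rate + z * se

# Hypothetical counts: 430 deaths over 2650 person-years.
rate, lo, hi = rate_per_100py(430, 2650)
print(f"{rate:.1f} per 100 py (95% CI {lo:.1f}-{hi:.1f})")
```

With small event counts an exact Poisson interval would be preferable; the normal approximation is shown only because it fits in a few lines.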

Conclusions

The current study confirms that both donor and recipient characteristics must be considered in post-transplant outcomes and prognostic scores. Our data demonstrated that recipient characteristics have a greater impact on post-transplant outcomes than donor characteristics. This finding should prompt liver transplant teams to reconsider the limits of the MELD allocation system, under which many teams compete with one another. The results suggest that, although donor features raise some concerns, recipient factors were the strongest predictors of poor outcomes.

4.
The incidence and outcomes of acute kidney injury (AKI) in kidney transplantation are poorly characterized. We performed a retrospective cohort analysis of all patients (≥3 months after transplantation and ≥16 years of age) admitted to the hospital for medical or surgical complications from 2007 to 2010. We analyzed 458 kidney transplant recipients: 55.2% men, median age 49 (IQR, 36–58) years, at a median of 12.5 (IQR, 3–35) months after kidney transplantation. Most of the patients received a kidney from a deceased donor (62.2%), the primary cause of hospital admission was infection (60.7%), and 57 (12.4%) individuals were diagnosed with acute rejection (AR). The incidence of AKI was 82.3%: 31.9% stage 1, 29.3% stage 2, and 21.2% stage 3. Intensive care unit (ICU) admission (OR 8.90, 95% CI: 1.77–44.56, p = 0.008), infection (OR 5.73, 95% CI: 2.61–12.56, p<0.001), and the use of contrast media (OR 9.34, 95% CI: 2.04–42.70, p = 0.004) were independent risk factors for AKI development. The mortality rate was 2.1%, and all patients who died had been diagnosed with AKI. Even after the exclusion of AR cases, at 12 months the individuals with AKI exhibited higher percent changes in creatinine values than individuals without AKI (9.1% vs. -4.3%; p<0.001). Applying the KDIGO classification, we found a high incidence of AKI among the complications of renal transplantation. As in other settings, AKI was associated with loss of renal function at 1 year after hospital discharge.
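The KDIGO creatinine-based staging referred to above can be sketched as a simplified classifier (illustrative only; it ignores the urine-output criteria and the 48-hour window the full definition attaches to the 0.3 mg/dL absolute-rise rule, and the values used are hypothetical):

```python
def kdigo_stage(baseline_scr, peak_scr, on_rrt=False):
    """Simplified KDIGO AKI stage from serum creatinine (mg/dL) alone.

    Stage 3: renal replacement therapy, >=3x baseline, or SCr >= 4.0.
    Stage 2: 2.0-2.9x baseline.
    Stage 1: 1.5-1.9x baseline or an absolute rise >= 0.3 mg/dL.
    """
    ratio = peak_scr / baseline_scr
    if on_rrt or ratio >= 3.0 or peak_scr >= 4.0:
        return 3
    if ratio >= 2.0:
        return 2
    if ratio >= 1.5 or peak_scr - baseline_scr >= 0.3:
        return 1
    return 0

# Hypothetical creatinine values (baseline -> peak).
print(kdigo_stage(1.0, 1.6), kdigo_stage(1.0, 2.4), kdigo_stage(1.2, 4.1))
# → 1 2 3
```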

5.

Objective

Two studies have reported that patients with the 4G/4G genotype of the plasminogen activator inhibitor-1 (PAI-1) genetic polymorphism had higher plasma PAI-1 concentrations and higher risk of death than those with the 4G/5G or 5G/5G genotypes; one study involved 175 children with meningococcal disease, and the other included 88 adult patients with septic shock. Thus, the objective of this study was to determine whether there is an association between carriage of the 4G/4G genotype, plasma PAI-1 concentrations and mortality in a large series of adult septic patients.

Methods

An observational, prospective, multicenter study was carried out in six Spanish intensive care units, enrolling patients with severe sepsis. We determined the PAI-1 4G/5G polymorphism and plasma PAI-1 concentrations in all patients. The end-points of the study were 30-day and 6-month mortality.

Results

We included a total of 260 patients: 82 (31.5%) with the 4G/4G, 126 (48.5%) with the 4G/5G, and 52 (20.0%) with the 5G/5G genotype. Multivariate logistic regression analysis showed that the 4G/4G genotype was associated with higher mortality at 30 days (odds ratio = 1.95; 95% CI = 1.063–3.561; p = 0.03) and at 6 months (odds ratio = 2.19; 95% CI = 1.221–3.934; p = 0.01), and that higher plasma PAI-1 concentrations were associated with higher mortality at 30 days (odds ratio = 1.01; 95% CI = 1.002–1.022; p = 0.02) and at 6 months (odds ratio = 1.01; 95% CI = 1.003–1.023; p = 0.01). Multivariate linear regression analysis showed that increased plasma PAI-1 concentrations were associated with the PAI-1 4G/4G genotype (regression coefficient = 4.82; 95% CI = 3.227 to 6.406; p<0.001).

Conclusions

The major findings of our study, to our knowledge the largest series reporting data on the 4G/5G polymorphism of the PAI-1 gene, plasma PAI-1 concentrations, and mortality in septic patients, were that septic patients with the 4G/4G genotype had higher plasma PAI-1 concentrations and a higher risk of death than those with the 4G/5G or 5G/5G genotypes.

6.
Background

Invasive pneumococcal disease (IPD) causes considerable morbidity and mortality. We aimed to identify host factors and biomarkers associated with poor outcomes in adult patients with IPD in Japan, which has a rapidly-aging population.

Methods

In a large-scale surveillance study of 506 Japanese adults with IPD, we investigated the effects of host factors, disease severity, biomarkers based on clinical laboratory data, treatment regimens, and bacterial factors on 28-day mortality.

Results

Overall mortality was 24.1%, and the mortality rate increased from 10.0% in patients aged <50 years to 33.1% in patients aged ≥80 years. Mortality also rose with disease severity, from 12.5% among patients with bacteraemia without sepsis to 35.0% in patients with severe sepsis and 56.9% in those with septic shock. The proportion of deaths occurring within 48 hours of admission was high (54.9%). Risk factors for mortality identified by multivariate analysis were as follows: white blood cell (WBC) count <4000 cells/μL (odds ratio [OR], 6.9; 95% confidence interval [CI], 3.7–12.8, p < .001); age ≥80 years (OR, 6.5; 95% CI, 2.0–21.6, p = .002); serum creatinine ≥2.0 mg/dL (OR, 4.5; 95% CI, 2.5–8.1, p < .001); underlying liver disease (OR, 3.5; 95% CI, 1.6–7.8, p = .002); mechanical ventilation (OR, 3.0; 95% CI, 1.7–5.6, p < .001); and lactate dehydrogenase ≥300 IU/L (OR, 2.4; 95% CI, 1.4–4.0, p = .001). Pneumococcal serotype and drug resistance were not associated with poor outcomes.

Conclusions

Host factors, disease severity, and biomarkers, especially WBC counts and serum creatinine, were more important determinants of mortality than bacterial factors.

7.

Background

Adults with sickle cell anemia (HbSS) are inconsistently treated with hydroxyurea.

Objectives

We retrospectively evaluated the effects of elevating fetal hemoglobin with hydroxyurea on organ damage and survival in patients enrolled in our screening study between 2001 and 2010.

Methods

An electronic medical record facilitated development of a database for comparison of study parameters based on hydroxyurea exposure and dose. This study is registered with ClinicalTrials.gov, number NCT00011648.

Results

Three hundred eighty-three adults with homozygous sickle cell disease were analyzed, with 59 deaths during study follow-up. Cox regression analysis revealed that deceased subjects had more hepatic dysfunction (elevated alkaline phosphatase, hazard ratio = 1.005, 95% CI 1.003–1.006, p<0.0001), kidney dysfunction (elevated creatinine, hazard ratio = 1.13, 95% CI 1.00–1.27, p = 0.043), and cardiopulmonary dysfunction (elevated tricuspid jet velocity on echocardiogram, hazard ratio = 2.22, 95% CI 1.23–4.02, p = 0.0082). Sixty-six percent of subjects were treated with hydroxyurea, although only 66% of those received a dose within the recommended therapeutic range. Hydroxyurea use was associated with improved survival (hazard ratio = 0.58, 95% CI 0.34–0.97, p = 0.040). This effect was most pronounced in those taking the recommended dose of 15–35 mg/kg/day (hazard ratio 0.36, 95% CI 0.17–0.73, p = 0.0050). Hydroxyurea use was not associated with changes in organ function over time. Further, subjects with higher fetal hemoglobin responses to hydroxyurea were more likely to survive (p = 0.0004). While alkaline phosphatase was lowest in patients with the best fetal hemoglobin response (95.4 versus 123.6 U/L, p = 0.0065, and 96.1 versus 113.6 U/L, p = 0.041, at first and last visits, respectively), other markers of organ damage were not consistently improved over time in patients with the highest fetal hemoglobin levels.

Conclusions

Our data suggest that adults should be treated with the maximum tolerated hydroxyurea dose, ideally before organ damage occurs. Prospective studies are indicated to validate these findings.

8.
There are limited data available on the risk factors for multidrug-resistant tuberculosis (MDR-TB). We therefore conducted a retrospective matched case-control study among adults with pulmonary TB who received treatment at the Central Chest Institute of Thailand (CCIT) between January 2007 and December 2013, in order to determine the risk factors associated with MDR-TB among patients with pulmonary TB. We identified 145 patients with pulmonary MDR-TB (cases) and 145 patients with drug-sensitive pulmonary TB (controls). Multivariate analysis identified the following independent risk factors for MDR-TB: (1) ≥2 episodes of prior pulmonary TB (odds ratio [OR] 39.72, 95% confidence interval [95% CI] 7.86–200.66), (2) duration of illness >60 days (OR 3.08, 95% CI 1.52–6.22), (3) sputum acid-fast bacilli smear 3+ (OR 13.09, 95% CI 4.64–36.91), (4) presence of lung cavities (OR 3.82, 95% CI 1.89–7.73), and (5) presence of pleural effusion (OR 2.75, 95% CI 1.06–7.16). Prior pulmonary TB management with a non-category I regimen (P = 0.012) and treatment failure or default as the treatment outcome (P = 0.036) were observed in a higher proportion of patients with MDR-TB. Particular characteristics of lung cavities, including maximum diameter ≥30 mm (P < 0.001), number of cavities ≥3 (P = 0.001), bilateral involvement (P < 0.001), and ≥2 lung zones involved (P = 0.001), were more commonly observed in patients with MDR-TB. In conclusion, these clinical factors and chest radiographic findings associated with MDR-TB among patients with pulmonary TB may help physicians provide proper case management to prevent the development and spread of MDR-TB in the future.
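For context, an unadjusted odds ratio with a Woolf (log-scale) confidence interval from a 2×2 table can be computed as follows (the abstract's ORs are multivariate estimates from a matched design; the counts below are hypothetical):

```python
import math

def odds_ratio(a, b, c, d, z=1.96):
    """Unadjusted odds ratio from a 2x2 table with a Woolf (log) CI.

    a: exposed cases, b: unexposed cases,
    c: exposed controls, d: unexposed controls.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: lung cavities in 80/145 cases vs 40/145 controls.
or_, lo, hi = odds_ratio(80, 65, 40, 105)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

For 1:1 matched pairs, a conditional (McNemar-type) odds ratio based on discordant pairs would be the more rigorous choice; the Woolf interval is shown for its simplicity.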

9.

Objective

The HAS-BLED score enables a risk estimate of major bleeds in patients with atrial fibrillation on vitamin K-antagonist (VKA) treatment, but has not been validated for patients with venous thromboembolism (VTE). We analyzed whether the HAS-BLED score accurately identifies patients at high risk of major bleeds during VKA treatment for acute VTE.

Methods

Medical records of 537 patients with acute VTE (primary diagnosis pulmonary embolism in 223, deep vein thrombosis in 314) starting VKA treatment between 2006 and 2007 were searched for the items of the HAS-BLED score and the occurrence of major bleeds during the first 180 days of follow-up. The hazard ratio (HR) for the occurrence of major bleeds, comparing high-risk (HAS-BLED score ≥3 points) with non-high-risk patients, was calculated using Cox regression analysis.
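The HAS-BLED score itself is a simple sum of nine binary items; a sketch of its computation (one point per positive item, with score ≥3 taken as high risk, as in the study; the example patient is hypothetical):

```python
def has_bled(hypertension, abnormal_renal, abnormal_liver, stroke,
             bleeding_history, labile_inr, age_over_65, drugs, alcohol):
    """HAS-BLED bleeding-risk score: one point per positive item (max 9).
    A score >= 3 points is conventionally treated as high risk."""
    items = (hypertension, abnormal_renal, abnormal_liver, stroke,
             bleeding_history, labile_inr, age_over_65, drugs, alcohol)
    score = sum(bool(x) for x in items)
    return score, score >= 3

# Example: hypertensive 70-year-old with a prior bleeding event.
score, high_risk = has_bled(True, False, False, False, True,
                            False, True, False, False)
print(score, high_risk)  # → 3 True
```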

Results

Major bleeds occurred in 11/537 patients (2.0%; 5.2 per 100 person-years, 95% CI 2.8–9.2). Cumulative incidences of major bleeds were 1.3% (95% CI 0.1–2.5) in the non-high-risk group (HAS-BLED <3) and 9.6% (95% CI 2.2–17.0) in the high-risk group (HAS-BLED ≥3) (p<0.0001 by log-rank test), with an HR of 8.7 (95% CI 2.7–28.4). Of the items in the HAS-BLED score, abnormal renal function (HR 10.8, 95% CI 1.9–61.7) and a history of bleeding events (HR 10.4, 95% CI 2.5–42.5) were independent predictors of major bleeds during follow-up.

Conclusion

Patients with acute VTE and a HAS-BLED score ≥3 points are at increased risk of major bleeding. These results warrant correction of potentially reversible risk factors for major bleeding and careful International Normalized Ratio monitoring in acute VTE patients with a high HAS-BLED score.

10.
Background

In the late twentieth century, the emergence of high rates of treatment failure with antimonial compounds (SSG) for visceral leishmaniasis (VL) caused a public health crisis in Bihar, India. We hypothesize that exposure to arsenic through drinking contaminated groundwater may be associated with SSG treatment failure due to the development of antimony-resistant parasites.

Methods

A retrospective cohort design was employed, as antimony treatment is no longer in routine use. The study was performed on patients treated with SSG between 2006 and 2010. Outcomes of treatment were assessed through a field questionnaire, and treatment failure was used as a proxy for parasite resistance. Arsenic exposure was quantified through analysis of 5 water samples from within and surrounding each patient's home. A logistic regression model was used to evaluate the association between arsenic exposure and treatment failure. In a secondary analysis, survival curves and Cox regression models were applied to assess the risk of mortality in VL patients exposed to arsenic.

Results

One hundred and ten VL patients treated with SSG were analysed. The failure rate with SSG was 59%. Patients with a high mean local arsenic level had a non-statistically-significant higher risk of treatment failure (OR = 1.78, 95% CI: 0.7–4.6, p = 0.23) than patients using wells with arsenic concentrations <10 μg/L. Twenty-one patients died in our cohort, 16 directly as a result of VL. Arsenic levels ≥10 μg/L increased the risk of all-cause (HR 3.27; 95% CI: 1.4–8.1) and VL-related (HR 2.65; 95% CI: 0.96–7.65) death. This effect was time dependent: 3 months after VL symptom development, elevated risks of all-cause mortality (HR 8.56; 95% CI: 2.5–29.1) and of VL-related mortality (HR 9.27; 95% CI: 1.8–49.0) were detected.

Discussion/Conclusion

This study indicates a trend towards increased treatment failure in arsenic-exposed patients. The limitations of the retrospective study design may have masked a strong association between arsenic exposure and selection for antimonial resistance in the field. The unanticipated strong correlation between arsenic exposure and VL mortality warrants further investigation.

11.

Background and Purpose

Selecting an ideal antithrombotic therapy for elderly patients with atrial fibrillation (AF) undergoing percutaneous coronary intervention (PCI) can be challenging, since they have a higher thromboembolic and bleeding risk than younger patients. The current study aimed to assess the efficacy and safety of triple therapy (TT: oral anticoagulation plus dual antiplatelet therapy, i.e. aspirin plus clopidogrel) in patients ≥75 years of age with AF undergoing PCI.

Methods

A prospective multicenter study was conducted from 2003 to 2012 at 6 Spanish teaching hospitals. A cohort study of consecutive patients with AF undergoing PCI and treated with TT or dual antiplatelet therapy (DAPT) was analyzed. All outcomes were evaluated at 1-year of follow-up.

Results

Five hundred and eighty-five patients were identified, 289 (49%) of whom were ≥75 years of age (79.6±3.4 years; 33% women). TT was prescribed at discharge in 55.9% of patients; these patients had a higher thromboembolic risk (CHA2DS2-VASc score: 4.23±1.51 vs 3.76±1.40, p = 0.007) and a higher bleeding risk (HAS-BLED ≥3: 88.6% vs 79.2%, p = 0.02) than those on dual antiplatelet therapy (DAPT). Despite this, patients on TT had a lower rate of thromboembolism than those on DAPT (0.6% vs 6.9%, p = 0.004; HR 0.08, 95% CI: 0.01–0.70, p = 0.004). Major bleeding events occurred more frequently in patients on TT than in those on DAPT (11.7% vs 2.4%, p = 0.002; HR 5.2, 95% CI: 1.53–17.57, p = 0.008). The overall mortality rate was similar in the two treatment groups (11.9% vs 13.9%, p = 0.38); however, after adjustment for confounding variables, TT was associated with a reduced mortality rate (HR 0.33, 95% CI: 0.12–0.86, p = 0.02).

Conclusions

In elderly patients with AF undergoing PCI, the use of TT compared with DAPT was associated with reduced thromboembolism and mortality rates, although at the cost of a higher rate of major bleeding.

12.

Background

The clinical presentation of M. ulcerans disease and the safety and effectiveness of treatment may differ in elderly compared with younger populations, owing to relative immune deficiencies, co-morbidities, and drug interactions. However, elderly populations with M. ulcerans disease have not been comprehensively studied.

Methodology/Principal Findings

A retrospective analysis was performed on an observational cohort of all confirmed M. ulcerans cases managed at Barwon Health from 1/1/1998 to 31/12/2014. The cohort included 327 patients: 131 (40.0%) aged ≥65 years and 196 (60.0%) aged <65 years. Patients ≥65 years had a shorter median duration of symptoms prior to diagnosis (p<0.01), a higher proportion with diabetes (p<0.001) and immune suppression (p<0.001), and were more likely to have lesions that were multiple (OR 4.67, 95% CI 1.78–12.31, p<0.001) and WHO category 3 (OR 4.59, 95% CI 1.98–10.59, p<0.001). Antibiotic complications occurred in 69 (24.3%) treatment episodes, at an increased incidence in those aged ≥65 years (OR 5.29, 95% CI 2.81–9.98, p<0.001). There were 4 (1.2%) deaths, all in the age group ≥65 years (4 vs 0 deaths, p = 0.01). The overall treatment success rate was 92.2%. For the age group ≥65 years, there was a reduced rate of treatment success overall (OR 0.34, 95% CI 0.14–0.80, p<0.01) and when surgery was used alone (OR 0.21, 95% CI 0.06–0.76, p<0.01). Patients ≥65 years were also more likely to have a paradoxical reaction (OR 2.06, 95% CI 1.17–3.62, p = 0.01).

Conclusions/Significance

Elderly patients comprise a significant proportion of M. ulcerans disease patients in Australian populations and present with more severe and advanced disease forms. Currently recommended treatments are associated with increased toxicity and reduced effectiveness in elderly populations. Increased efforts are required to diagnose M. ulcerans earlier in elderly populations, and research is urgently required to develop more effective and less toxic treatments for this age-group.

13.
Methods

A total of 1187 patients with a mean age of 65±12 years consecutively referred for atrial flutter (AFL) ablation were retrospectively analyzed in the study.

Results

445 (37.5%) patients were aged ≥70 (range 70 to 93), among which 345 (29.1%) were aged 70 to 79 years and 100 (8.4%) were aged ≥80. In multivariable analysis, AFL-related rhythmic cardiomyopathy and presentation with 1/1 AFL were less frequent in patients ≥70 (respectively, adjusted OR = 0.44, 95% CI 0.27–0.74, p = 0.002 and adjusted OR = 0.29, 95% CI 0.16–0.52, p<0.0001). AFL ablation-related major complications were more frequent in patients ≥70, although the rate remained lower than 10% (7.4% in ≥70 vs. 4.2% in <70; adjusted OR = 1.74, 95% CI 1.04–2.89, p = 0.03). After 2.1±2.7 years, AFL recurrence was less frequent in patients ≥70 (adjusted OR = 0.54, 95% CI 0.37–0.80, p = 0.002), whereas atrial fibrillation (AF) occurrence was as frequent in the 70–79 and ≥80 age subsets. As expected, cardiac mortality was higher in older patients. Patients aged ≥80 also had a low probability of AFL recurrence (5.0%) and AF onset (19.0%).

Conclusions

Older patients represent 37.5% of patients referred for AFL ablation and displayed a <10% risk of ablation-related complications. Importantly, AFL recurrences were less frequent in patients ≥70, while AF occurrence was as frequent as in patients <70. Similar observations were made in patients ≥80 years. AFL ablation appears to be safe and effective and should not be ruled out in elderly patients.

14.
Background

To date, few studies have been published on image-guided helical tomotherapy (HT) with moderate hypofractionation for localized prostate cancer (PCa). We report the outcome and toxicity of patients with localized PCa treated with HT-based moderately hypofractionated radiotherapy.

Materials and methods

76 patients were retrospectively analyzed. A total dose of 60 Gy (20 × 3 Gy) or 67.5 Gy (25 × 2.7 Gy) was prescribed. The χ2 test was used to analyze associations between toxicity and dosimetric and clinical parameters. The Cox proportional hazards regression model was used for multivariate analysis. The Kaplan-Meier method was used for survival analysis.

Results

Median follow-up was 42.26 months [interquartile range (IQR), 23–76]. At 4 years, overall survival (OS) and metastasis-free survival (MFS) were 91% and 89%, respectively. In multivariate analysis, smoking habit was associated with MFS [hazard ratio (HR) 7.32, 95% CI: 1.57–34.16, p = 0.011]. Acute and late grade ≥2 gastro-intestinal (GI) toxicity was observed in 6.5% and 2.6% of patients, respectively. Acute and late grade ≥2 genito-urinary (GU) toxicity rates were 31.5% and 3.9%. Four-year late GI and GU grade ≥2 toxicity rates were 3% and 7%, respectively. Acute GI toxicity was associated with statin medication (p = 0.04) and androgen deprivation therapy (p = 0.013). Acute GU toxicity was associated with the use of anticoagulants (p = 0.029) and antiaggregants (p = 0.013).

Conclusions

HT-based moderate hypofractionation shows very low rates of toxicity. Smoking habit is associated with the risk of developing metastases after radical treatment for localized PCa.

15.
Introduction

Sepsis is associated with increased mortality, delirium, and long-term cognitive impairment in intensive care unit (ICU) patients. Electroencephalogram (EEG) abnormalities occurring at the acute stage of sepsis may correlate with the severity of brain dysfunction. The predictive value of early standard EEG abnormalities for mortality in septic ICU patients remains to be assessed.

Methods

In this prospective, single-center, observational study, standard EEG was performed, analyzed, and classified according to both the Synek and Young EEG scales in consecutive patients acutely admitted to the ICU for sepsis. Delirium, coma, and the level of sedation were assessed at the time of EEG recording; the duration of sedation and the occurrence of in-ICU delirium or death were assessed during follow-up. Adjusted analyses were carried out using multiple logistic regression.

Results

One hundred ten patients were included, mean age 63.8 (±18.1) years, median SAPS-II score 38 (29–55). At the time of EEG recording, 46 patients (42%) were sedated and 22 (20%) suffered from delirium. Overall, 54 patients (49%) developed delirium, 32 (29%) of them in the days after EEG recording. 23 (21%) patients died in the ICU. Absence of EEG reactivity was observed in 27 patients (25%), periodic discharges (PDs) in 21 (19%), and electrographic seizures (ESZ) in 17 (15%). ICU mortality was independently associated with a delta-predominant background (OR: 3.36; 95% CI [1.08–10.4]), absence of EEG reactivity (OR: 4.44; 95% CI [1.37–14.3]), PDs (OR: 3.24; 95% CI [1.03–10.2]), Synek grade ≥3 (OR: 5.35; 95% CI [1.66–17.2]), and Young grade >1 (OR: 3.44; 95% CI [1.09–10.8]) after adjustment for Simplified Acute Physiology Score (SAPS-II) at admission and level of sedation. Delirium at the time of EEG was associated with ESZ in non-sedated patients (32% vs 10%, p = 0.037), with Synek grade ≥3 (36% vs 7%, p<0.05), and with Young grade >1 (36% vs 17%, p<0.001). Occurrence of delirium in the days after EEG was associated with a delta-predominant background (48% vs 15%, p = 0.001), absence of reactivity (39% vs 10%, p = 0.003), Synek grade ≥3 (42% vs 17%, p = 0.001), and Young grade >1 (58% vs 17%, p = 0.0001).

Conclusions

In this prospective cohort of 110 septic ICU patients, early standard EEG was significantly disturbed. Absence of EEG reactivity, a delta-predominant background, PDs, Synek grade ≥3, and Young grade >1 at day 1 to 3 following admission were independent predictors of ICU mortality and were associated with the occurrence of delirium. ESZ and PDs were found in about 20% of our patients; their prevalence, and hence their predictive value, could have been higher if they had been diagnosed more thoroughly using continuous EEG.

16.

Background

Obesity is associated with advanced cardiovascular disease. However, some studies have reported the “obesity paradox” after percutaneous coronary intervention (PCI). The relationship between body mass index (BMI) and clinical outcomes after PCI has not been thoroughly investigated, especially in Asian populations.

Methods

We studied 10,142 patients who underwent PCI at 15 Japanese hospitals participating in the JCD-KICS registry from September 2008 to April 2013. Patients were divided into four groups according to BMI: underweight, BMI <18.5 (n=462); normal, BMI ≥18.5 and <25.0 (n=5,945); overweight, BMI ≥25.0 and <30.0 (n=3,100); and obese, BMI ≥30.0 (n=635).
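The four BMI groups above reduce to a simple classifier over the stated cut-points:

```python
def bmi_group(bmi):
    """Assign the BMI category using the study's cut-points."""
    if bmi < 18.5:
        return "underweight"
    if bmi < 25.0:
        return "normal"
    if bmi < 30.0:
        return "overweight"
    return "obese"

# Hypothetical BMI values spanning the four groups.
for bmi in (17.0, 22.3, 27.8, 31.5):
    print(bmi, bmi_group(bmi))
```

Note that each boundary value (18.5, 25.0, 30.0) belongs to the upper group, matching the "≥ lower bound and < upper bound" definitions in the Methods.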

Results

Patients with a high BMI were significantly younger (p<0.001) and had a higher incidence of coronary risk factors such as hypertension (p<0.001), hyperlipidemia (p<0.001), diabetes mellitus (p<0.001), and current smoking (p<0.001) than those with a low BMI. Importantly, patients in the underweight group had the worst in-hospital outcomes, including overall complications (underweight, normal, overweight, and obese groups: 20.4%, 11.5%, 8.4%, and 10.2%, respectively; p<0.001), in-hospital mortality (5.8%, 2.1%, 1.2%, and 2.7%, p<0.001), cardiogenic shock (3.5%, 2.0%, 1.5%, and 1.6%, p=0.018), bleeding complications (10.0%, 4.5%, 2.6%, and 2.8%, p<0.001), and blood transfusion (7.6%, 2.7%, 1.6%, and 1.7%, p<0.001). BMI was inversely associated with bleeding complications after adjustment by multivariate logistic regression analysis (odds ratio, 0.95; 95% confidence interval, 0.92–0.98; p=0.002). In subgroup multivariate analysis of patients without cardiogenic shock, BMI was inversely associated with overall complications (OR, 0.98; 95% CI, 0.95–0.99; p=0.033) and bleeding complications (OR, 0.95; 95% CI, 0.91–0.98; p=0.006). Furthermore, there was a trend toward an inverse association between BMI and in-hospital mortality (OR, 0.94; 95% CI, 0.88–1.01; p=0.091).

Conclusions

Lean patients, rather than obese patients, are at greater risk of in-hospital complications during and after PCI, particularly bleeding complications.

17.

Background

Cutaneous leishmaniasis (CL) is a neglected infectious disease and a major health problem in several developing countries. Despite reasonable explanations for their potential benefits, there is only scant evidence regarding the role of dressings in the treatment of CL.

Methods

This randomized, assessor-blind, controlled clinical trial was conducted in an area of Iran endemic for CL caused by Leishmania major, to assess the efficacy of weekly intralesional meglumine antimoniate (i.l.MA) administered for 6 weeks, either alone or combined with application of a silver or a non-silver polyester dressing to the lesions. After screening of 241 patients with CL lesions, 83 eligible patients with 158 lesions were randomly allocated to the three arms of the study. Eligibility criteria included parasitologically confirmed CL, age of 12 to 60 years, willingness to participate, duration of lesion <3 months, number of lesions <5, and largest ulcer diameter <5 cm. Pregnant or lactating women were excluded. The primary outcome was the absolute risk reduction (ARR) based on the proportion of complete healing, defined as more than 75% reduction in the size of the lesion compared with baseline, in each group at the termination of treatment and 1 month later.

Findings

ARR (95% Confidence Interval [CI]) in i.l.MA versus i.l.MA+non-silver dressing groups was 5.98% (−7.07% to 20.25%), between i.l.MA versus i.l.MA+silver dressing groups was −0.23% (−13.53% to 14.82%), and between i.l.MA+non-silver dressing versus i.l.MA+silver dressing groups was −6.21% (−18.28% to 6.52%) after 6 weeks of treatment. ARR (95% CI) in i.l.MA versus i.l.MA+non-silver dressing groups was −2.22% (−22.12% to 18.10%), between i.l.MA versus i.l.MA+silver dressing groups was 3.64% (−15.36% to 22.82%), and between i.l.MA+non-silver dressing versus i.l.MA+silver dressing groups was 5.86% (−12.86% to 24.31%) 1 month later.
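For readers unfamiliar with the outcome measure, the ARR here is simply the difference in the proportion of completely healed lesions between two arms. The sketch below illustrates the computation with a Wald-type confidence interval and hypothetical counts (the trial's asymmetric CIs suggest a different interval method, such as Newcombe's, was actually used):

```python
import math

def arr_with_ci(cured_a: int, n_a: int, cured_b: int, n_b: int, z: float = 1.96):
    """Absolute risk reduction (proportion healed in arm A minus arm B)
    with a Wald-type 95% confidence interval."""
    p_a, p_b = cured_a / n_a, cured_b / n_b
    arr = p_a - p_b
    se = math.sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
    return arr, (arr - z * se, arr + z * se)

# Hypothetical counts for illustration only (not the trial's actual data):
arr, (lo, hi) = arr_with_ci(20, 28, 18, 27)
print(f"ARR = {arr:.2%} (95% CI {lo:.2%} to {hi:.2%})")
```

A confidence interval spanning zero, as in all six comparisons reported above, is consistent with no demonstrable difference between the arms.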

Conclusion

The trial could not demonstrate that either dressing improved the efficacy of i.l.MA.

Trial Registration

Iranian Registry of Clinical Trials (IRCT.ir) IRCT138707201166N2.

18.

Objective

To evaluate the efficacy and safety of sorafenib for Korean patients with metastatic renal cell carcinoma (mRCC).

Methods

A total of 177 mRCC patients using sorafenib as first- (N = 116), second- (N = 43), and third-line (N = 18) therapies were enrolled from 11 Korean centers between 2006 and 2012. The patient characteristics, therapy duration, tumor response, disease control rate, and tolerability were assessed at baseline and at routine follow-ups, and the progression-free survival (PFS) and overall survival (OS) times and rates were analyzed.

Results

Among all patients, 18 (10.2%) stopped sorafenib treatment for a median of 1.7 weeks, including 15 (8.5%) who discontinued the drug, while 40 (22.6%) and 12 (6.8%) patients required dose reductions and drug interruptions, respectively. Severe adverse events (AEs) or poor compliance were observed in 64 (36.2%) patients, with 118 grade ≥3 AEs (7.4% of all AEs). During the treatment, one myocardial infarction was observed. The number of grade ≥3 AEs in the first-line sorafenib group was 71 (6.8% of the total 1,048 AEs). During a median follow-up of 17.2 months, the radiologically confirmed best objective response rate, disease control rate, median PFS, and median OS were 22.0%, 53.0%, 6.4 months (95% confidence interval [CI], 5.2–8.9), and 32.6 months (95% CI, 27.3–63.8), respectively, for the total 177 sorafenib-treated patients, and 23.2%, 56.0%, 7.4 months (95% CI, 5.5–10.5), and not yet reached (95% CI, 1.0–31.1), respectively, for the first-line sorafenib group.

Conclusions

Sorafenib showed a tolerable safety profile, with a grade ≥3 AE rate of 7.4%, and an acceptable disease control rate (53.0%) in Korean mRCC patients.

19.

Background

Obesity is associated with increased mortality, and weight loss trials show rapid improvement in many mortality risk factors. Yet, observational studies typically associate weight loss with higher mortality risk. The purpose of this meta-analysis of randomized controlled trials (RCTs) of weight loss was to clarify the effects of intentional weight loss on mortality.

Methods

2,484 abstracts were identified and reviewed in PubMed, yielding 15 RCTs reporting (1) randomization to weight loss or non-weight loss arms, (2) a duration of ≥18 months, and (3) deaths by intervention arm. Weight loss interventions were all lifestyle-based. Relative risks (RR) and 95% confidence intervals (95% CI) were estimated for each trial. For trials reporting at least one death (n = 12), a summary estimate was calculated using the Mantel-Haenszel method. A sensitivity analysis using sparse-data methods included the remaining trials.

Results

Trials enrolled 17,186 participants (53% female, mean age at randomization = 52 years). Mean body mass indices ranged from 30 to 46 kg/m2, follow-up times ranged from 18 months to 12.6 years (mean: 27 months), and average weight loss in reported trials was 5.5±4.0 kg. A total of 264 deaths were reported in weight loss groups and 310 in non-weight loss groups. The weight loss groups experienced a 15% lower all-cause mortality risk (RR = 0.85; 95% CI: 0.73–1.00). There was no evidence for heterogeneity of effect (Cochran’s Q = 5.59, 11 d.f., p = 0.90; I2 = 0). Results were similar in trials with a mean age at randomization ≥55 years (RR = 0.84; 95% CI 0.71–0.99) and a follow-up time of ≥4 years (RR = 0.85; 95% CI 0.72–1.00).
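The Mantel-Haenszel pooling named in the methods combines per-trial 2x2 tables into a single weighted relative risk. A minimal sketch of the point estimate (hypothetical trial counts, not the meta-analysis's actual data; no CI or sparse-data handling):

```python
def mantel_haenszel_rr(tables):
    """Pooled relative risk across 2x2 tables via the Mantel-Haenszel method.
    Each table is (events_treated, n_treated, events_control, n_control)."""
    numerator = denominator = 0.0
    for a, n1, c, n2 in tables:
        total = n1 + n2
        numerator += a * n2 / total    # treated events, weighted
        denominator += c * n1 / total  # control events, weighted
    return numerator / denominator

# Hypothetical counts for three trials, for illustration only:
trials = [(10, 500, 14, 500), (8, 300, 9, 310), (5, 250, 7, 240)]
print(f"Pooled RR = {mantel_haenszel_rr(trials):.3f}")
```

With a single table the formula reduces to the ordinary relative risk; across tables it weights each trial by its size, which is why one large trial can dominate the summary estimate.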

Conclusions

In obese adults, intentional weight loss may be associated with an approximately 15% reduction in all-cause mortality.

20.

Objective

To evaluate the incidence rate of Chronic Kidney Disease (CKD) stage 3-5 (persistently decreased kidney function, below 60 mL/min per 1.73 m2) among patients with type 2 diabetes over five years, to identify the risk factors associated with CKD, and to develop a risk table predicting five-year risk of CKD stage 3-5 for clinical use.

Design

The MADIABETES Study is a prospective cohort study of 3,443 outpatients with type 2 diabetes mellitus, sampled from 56 primary health care centers (131 general practitioners) in Madrid (Spain).

Results

The cumulative incidence of CKD stage 3-5 at five years was 10.23% (95% CI = 9.12–11.44) and the incidence density was 2.07 (95% CI = 1.83–2.33) cases per 1,000 patient-months, or 2.48 (95% CI = 2.19–2.79) cases per 100 patient-years. The highest hazard ratio (HR) for developing CKD stage 3-5 was for albuminuria ≥300 mg/g (HR = 4.57; 95% CI = 2.46–8.48). Other variables with a high HR were age over 74 years (HR = 3.20; 95% CI = 2.13–4.81), a history of hypertension (HR = 2.02; 95% CI = 1.42–2.89), myocardial infarction (HR = 1.72; 95% CI = 1.25–2.37), dyslipidemia (HR = 1.68; 95% CI = 1.30–2.17), duration of diabetes mellitus ≥10 years (HR = 1.46; 95% CI = 1.14–1.88), and systolic blood pressure >149 mmHg (HR = 1.52; 95% CI = 1.02–2.24).
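The two incidence-density figures above are the same rate in different units; converting cases per 1,000 patient-months to cases per 100 patient-years just multiplies by 12 (months per year) and divides by 10 (the change in denominator scale). A quick arithmetic check:

```python
def per_100_patient_years(rate_per_1000_patient_months: float) -> float:
    """Convert an incidence density from cases per 1,000 patient-months
    to cases per 100 patient-years (x12 for months->years, /10 for scale)."""
    return rate_per_1000_patient_months * 12 / 10

# The abstract reports 2.07 cases per 1,000 patient-months:
print(per_100_patient_years(2.07))  # 2.484, i.e. ~2.48 per 100 patient-years
```

This reproduces the abstract's reported 2.48 cases per 100 patient-years.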

Conclusions

After a five-year follow-up, the cumulative incidence of CKD is concordant with rates described in Spain and other countries. Albuminuria ≥300 mg/g and age over 74 years were the risk factors most strongly associated with developing CKD stage 3-5. Blood pressure, lipid, and albuminuria control could reduce the incidence of CKD in patients with T2DM.
