Similar articles
 20 similar articles found
1.

Background

Anticoagulation therapy is usually required in patients with chronic kidney disease (CKD) for treatment or prevention of thromboembolic diseases. However, this benefit could easily be offset by the risk of bleeding.

Objectives

To determine the incidence of adverse outcomes of anticoagulants in hospitalized patients with CKD, and to compare the rates of major bleeding events between unfractionated heparin (UFH) and enoxaparin users.

Methods

A one-year prospective observational study was conducted in patients with CKD stages 3 to 5 (estimated GFR, 10–59 ml/min/1.73 m2) who were admitted to the renal unit of Dubai Hospital. Propensity scores for the use of anticoagulants, estimated for each of the 488 patients, were used to identify a cohort of 117 pairs of patients. Cox regression was used to estimate the association between anticoagulant use and adverse outcomes.
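The 1:1 propensity-score pairing described above is commonly implemented as greedy nearest-neighbor matching within a caliper. The sketch below is illustrative only (the study does not publish its matching code); the patient IDs, scores, and caliper value are invented.

```python
# Illustrative sketch of greedy 1:1 nearest-neighbor propensity-score
# matching within a caliper. Not the study's code; data are hypothetical.

def match_pairs(treated, control, caliper=0.05):
    """Pair each treated unit with the nearest unmatched control whose
    propensity score lies within `caliper`; units without a match drop out."""
    pairs = []
    available = dict(control)  # id -> propensity score
    for t_id, t_score in sorted(treated.items(), key=lambda kv: kv[1]):
        best_id, best_gap = None, caliper
        for c_id, c_score in available.items():
            gap = abs(t_score - c_score)
            if gap <= best_gap:
                best_id, best_gap = c_id, gap
        if best_id is not None:
            pairs.append((t_id, best_id))
            del available[best_id]  # each control is used at most once
    return pairs

treated = {"T1": 0.31, "T2": 0.62, "T3": 0.90}
control = {"C1": 0.30, "C2": 0.58, "C3": 0.33}
print(match_pairs(treated, control))  # T3 has no control within the caliper
```

Outcomes are then compared within the matched pairs, which is what motivates the 117-pair cohort above.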

Results

Major bleeding occurred in 1 in 3 patients who received anticoagulation during hospitalization (hazard ratio [HR], 4.61 [95% confidence interval (CI), 2.05–10.35]). Compared with enoxaparin users, patients who received anticoagulation with unfractionated heparin had a lower mean [SD] platelet count (139.95 [113]×103/µL vs 205.56 [123]×103/µL; P<0.001) and a higher risk of major bleeding (HR, 4.79 [95% CI, 1.85–12.36]). Furthermore, compared with those who did not receive anticoagulants, patients who did had higher in-hospital mortality (HR, 2.54 [95% CI, 1.03–6.25]), a longer length of hospitalization (HR, 1.04 [95% CI, 1.01–1.06]), and a higher rate of 30-day hospital readmission (HR, 1.79 [95% CI, 1.10–2.91]).

Conclusions

Anticoagulation among hospitalized patients with CKD was significantly associated with an increased risk of bleeding and in-hospital mortality. Hence, intensive monitoring and preventive measures such as laboratory surveillance and/or dose adjustment are warranted.

2.

Background

There is conflicting evidence regarding the impact of preexisting renal dysfunction (RD) on mid-term outcomes after transcatheter aortic valve implantation (TAVI) in patients with symptomatic aortic stenosis (AS).

Methods and results

Forty-seven articles representing 32,131 patients with AS undergoing TAVI were included in this systematic review and meta-analysis. Pooled analyses were performed with both univariate and multivariate models, using a fixed- or random-effects method as appropriate. Compared with patients with normal renal function, mid-term mortality was significantly higher in patients with preexisting RD, whether defined by the study authors (univariate hazard ratio [HR]: 1.69; 95% confidence interval [CI]: 1.50–1.90; multivariate HR: 1.47; 95% CI: 1.17–1.84), by baseline estimated glomerular filtration rate (eGFR) (univariate HR: 1.65; 95% CI: 1.47–1.86; multivariate HR: 1.46; 95% CI: 1.24–1.71), or by serum creatinine (univariate HR: 1.69; 95% CI: 1.48–1.92; multivariate HR: 1.65; 95% CI: 1.36–1.99). Advanced chronic kidney disease (CKD stage 3–5) was strongly related to bleeding (univariate HR in CKD stage 3: 1.30, 95% CI: 1.13–1.49; in CKD stage 4: 1.30, 95% CI: 1.04–1.62), acute kidney injury (AKI) (univariate HR in CKD stage 3: 1.28, 95% CI: 1.03–1.59; in CKD stage 4: 2.27, 95% CI: 1.74–2.96), stroke (univariate HR in CKD stage 4: 3.37, 95% CI: 1.52–7.46), and mid-term mortality (univariate HR in CKD stage 3: 1.57, 95% CI: 1.26–1.95; in CKD stage 4: 2.77, 95% CI: 2.06–3.72; in CKD stage 5: 2.64, 95% CI: 1.91–3.65) compared with CKD stage 1–2. Patients with CKD stage 4 had a higher incidence of AKI (univariate HR: 1.70, 95% CI: 1.34–2.16) and all-cause death (univariate HR: 1.60, 95% CI: 1.28–1.99) than those with CKD stage 3. A per-unit increase in serum creatinine was also associated with higher mortality at mid-term follow-up (univariate HR: 1.24, 95% CI: 1.18–1.30; multivariate HR: 1.19, 95% CI: 1.08–1.30).

Conclusions

Preexisting RD was associated with increased mid-term mortality after TAVI. Patients with CKD stage 4 had significantly higher incidences of peri-procedural complications and a poorer prognosis, a finding that should be factored into the clinical decision-making process regarding these patients.

3.

Background

Limited information exists on adults ≥50 years receiving HIV care in sub-Saharan Africa.

Methodology

Using routinely collected longitudinal patient-level data on 391,111 adults ≥15 years enrolling in HIV care from January 2005 to December 2010, of whom 184,689 initiated ART, we compared characteristics and outcomes between older (≥50 years) and younger adults at 199 clinics in Kenya, Mozambique, Rwanda, and Tanzania. We calculated the proportions over time of newly enrolled and active adults receiving HIV care and initiating ART who were ≥50 years; the cumulative incidence of loss to follow-up (LTF) and recorded death one year after enrollment and after ART initiation; and the CD4+ response following ART initiation.

Findings

From 2005 to 2010, the percentage of adults ≥50 years newly enrolled in HIV care remained stable at 10%, while the percentages of adults ≥50 years newly initiating ART (10% [2005] to 12% [2010]), active in follow-up (10% [2005] to 14% [2010]), and active on ART (10% [2005] to 16% [2010]) increased significantly. One year after enrollment, older patients had a significantly lower incidence of LTF (33.1% vs. 32.6% [40–49 years], 40.5% [25–39 years], and 56.3% [15–24 years]; p-value<0.0001), but a significantly higher incidence of recorded death (6.0% vs. 5.0% [40–49 years], 4.1% [25–39 years], and 2.8% [15–24 years]; p-value<0.0001). LTF was lower after vs. before ART initiation for all ages, with older adults experiencing less LTF than younger adults. Among 85,763 ART patients with baseline and follow-up CD4+ counts, the adjusted average 12-month CD4+ response for older adults was 20.6 cells/mm3 lower than for adults 25–39 years of age (95% CI: 17.1–24.1).

Conclusions

The proportion of patients who are ≥50 years has increased over time, driven by aging of the existing patient population. Older patients experienced less LTF, higher recorded mortality, and a less robust CD4+ response after ART initiation. Increased programmatic attention to older adults receiving HIV care in sub-Saharan Africa is warranted.

4.

Background

Associations between angiotensin-converting enzyme (ACE) gene insertion/deletion (I/D) polymorphisms and chronic kidney disease (CKD) have been extensively studied, with most studies reporting that individuals with the D allele have a higher risk. Although some factors, such as ethnicity, may moderate the association between ACE I/D polymorphisms and CKD risk, gender-dependent effects on the CKD risk remain controversial.

Objectives

This study investigated the gender-dependent effects of ACE I/D polymorphisms on CKD risk.

Data sources

PubMed, the Cochrane library, and EMBASE were searched for studies published before January 2013.

Study eligibility criteria, participants, and interventions

Cross-sectional surveys and case–control studies analyzing ACE I/D polymorphisms and CKD were included. They were required to match the following criteria: age >18 years, absence of rare diseases, and Asian or Caucasian ethnicity.

Study appraisal and synthesis methods

The effect of carrying the D allele on CKD risk was assessed by meta-analysis and meta-regression using random-effects models.

Results

Ethnicity [odds ratio (OR): 1.24; 95% confidence interval (CI): 1.08–1.42] and hypertension (OR: 1.55; 95% CI: 1.04–2.32) significantly moderated the association between ACE I/D polymorphisms and CKD risk, but not in the diabetic nephropathy subgroup. Males had a higher OR for the association between ACE I/D polymorphisms and CKD risk than females among Asians but not Caucasians, regardless of adjustment for hypertension (p<0.05). In subgroup analyses, this result was significant in the nondiabetic nephropathy group. Compared with the I allele, the D allele carried the highest CKD risk (OR: 3.75; 95% CI: 1.84–7.65) in hypertensive Asian males.

Conclusions and implications of key findings

The D allele of the ACE I/D polymorphism may confer the highest risk of CKD in hypertensive Asian males.

5.

Background

The benefit of corticosteroids in community-acquired pneumonia (CAP) remains controversial. We performed a meta-analysis of all randomized controlled trials (RCTs) that used corticosteroids as adjunctive therapy to examine the benefits and risks of corticosteroids in the treatment of CAP in adults.

Methods

Databases including PubMed, EMBASE, the Cochrane Controlled Trials Register, and Google Scholar were searched for relevant trials. Randomized and quasi-randomized trials of corticosteroid treatment in adult patients with CAP were included. Effects on the primary outcome (mortality) and secondary outcomes (adverse events) were assessed in this meta-analysis.

Results

Nine trials involving 1001 patients were included. Use of corticosteroids did not significantly reduce mortality (Peto odds ratio [OR] 0.62, 95% confidence interval [CI] 0.37–1.04; P = 0.07). In subgroup analysis by severity, a survival benefit was found among severe CAP patients (Peto OR 0.26, 95% CI 0.11–0.64; P = 0.003). In subgroup analysis by duration of corticosteroid treatment, significantly reduced mortality was found among patients with prolonged corticosteroid treatment (Peto OR 0.51, 95% CI 0.26–0.97; P = 0.04; I² = 37%). Corticosteroids increased the risk of hyperglycemia (Peto OR 2.64, 95% CI 1.68–4.15; P<0.0001) but did not increase the risk of gastroduodenal bleeding (Peto OR 1.67, 95% CI 0.41–6.80; P = 0.47) or superinfection (Peto OR 1.36, 95% CI 0.65–2.84; P = 0.41).
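The pooled Peto odds ratios above come from the standard fixed-effect Peto method, which sums observed-minus-expected event counts and hypergeometric variances across trials. As a hedged illustration (the 2×2 counts below are invented, not the trial data):

```python
# Sketch of the fixed-effect Peto odds ratio used to pool sparse events
# across trials. The per-trial counts are hypothetical, not study data.
import math

def peto_or(trials):
    """trials: list of (events_trt, n_trt, events_ctl, n_ctl) tuples."""
    sum_o_minus_e, sum_v = 0.0, 0.0
    for a, n1, c, n2 in trials:
        n = n1 + n2
        d = a + c
        e = n1 * d / n  # expected treatment-arm events under the null
        v = n1 * n2 * d * (n - d) / (n**2 * (n - 1))  # hypergeometric variance
        sum_o_minus_e += a - e
        sum_v += v
    log_or = sum_o_minus_e / sum_v
    se = 1 / math.sqrt(sum_v)
    ci = (math.exp(log_or - 1.96 * se), math.exp(log_or + 1.96 * se))
    return math.exp(log_or), ci

or_, (lo, hi) = peto_or([(4, 50, 9, 50), (2, 40, 6, 42)])
print(f"Peto OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The Peto method is chosen when events are rare, as in trial-level mortality counts, because it avoids the zero-cell corrections other OR estimators need.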

Conclusion

Results from this meta-analysis did not suggest an overall mortality benefit of corticosteroid treatment in patients with CAP. However, corticosteroid use was associated with improved survival in severe CAP, and prolonged corticosteroid therapy suggested a beneficial effect on mortality. These results should be confirmed by adequately powered randomized trials.

6.

Background

There is a paucity of clinical trials informing specific questions faced by infectious diseases (ID) specialists. The ClinicalTrials.gov registry offers an opportunity to evaluate the ID clinical trials portfolio.

Methods

We examined 40,970 interventional trials registered with ClinicalTrials.gov from 2007–2010, focusing on study conditions and interventions to identify ID-related trials. Relevance to ID was manually confirmed for each programmatically identified trial, yielding 3570 ID trials and 37,400 non-ID trials for analysis.

Results

The number of ID trials was similar to the number of trials in cardiovascular medicine (n = 3437) or mental health (n = 3695). Slightly over half of ID trials were treatment-oriented (53%, vs. 77% for non-ID trials), followed by prevention (38%, vs. 8% for non-ID trials). ID trials tended to be larger than those of other specialties, with a median enrollment of 125 subjects (interquartile range [IQR], 45–400) vs. 60 (IQR, 30–160) for non-ID trials. Most ID studies were randomized (73%) but nonblinded (56%). Industry was the funding source in 51% of ID trials, vs. 10% that were primarily NIH-funded. HIV-AIDS trials constituted the largest subset of ID trials (n = 815 [23%]), followed by influenza vaccine (n = 375 [11%]) and hepatitis C (n = 339 [9%]) trials. Relative to U.S. and global mortality rates, HIV-AIDS and hepatitis C virus trials were over-represented, whereas lower respiratory tract infection trials were under-represented in this large sample of ID clinical trials.

Conclusions

This work is the first to characterize ID clinical trials registered in ClinicalTrials.gov, providing a framework to discuss prioritization, methodology, and policy.

7.

Background

We used population-based infectious disease surveillance to characterize mortality rates among residents of an urban slum in Kenya.

Methods

We analyzed biweekly household visit data collected within two weeks before death for 749 cases who died between January 1, 2007 and December 31, 2010. We also selected controls matched by age and gender who had a biweekly household visit within two weeks of the corresponding case's death, and compared the symptoms reported.

Results

The overall mortality rate was 6.3 per 1,000 person-years of observation (PYO) (females: 5.7; males: 6.8). The infant mortality rate was 50.2 per 1,000 PYO, and the rate for children <5 years old was 15.1 per 1,000 PYO. Poisson regression indicated a significant decrease over time in overall mortality (from 6.0 in 2007 to 4.0 in 2010 per 1,000 PYO; p<0.05) in persons ≥5 years old; this decrease was most pronounced in females (7.8 to 5.7 per 1,000 PYO; p<0.05). Within the two weeks before death, significantly higher prevalences of cough (OR = 4.7 [95% CI: 3.7–5.9]), fever (OR = 8.1 [95% CI: 6.1–10.7]), and diarrhea (OR = 9.1 [95% CI: 6.4–13.2]) were reported among cases than among controls. Diarrhea (OR = 14.4 [95% CI: 7.1–29.2]) and fever (OR = 11.4 [95% CI: 6.7–19.4]) were independently associated with death.
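The person-years rates above follow the usual deaths-per-follow-up-time calculation. A minimal sketch (the person-year total below is a hypothetical figure chosen only to illustrate the arithmetic behind the headline 6.3/1,000 rate, not a number from the study):

```python
# Minimal sketch of a mortality rate per 1,000 person-years of observation.
# The person-years denominator here is hypothetical, not study data.

def rate_per_1000_pyo(deaths, person_years):
    """Deaths per 1,000 person-years of observation."""
    return 1000 * deaths / person_years

# e.g. 749 deaths over an assumed ~118,900 person-years gives ~6.3 per 1,000
print(round(rate_per_1000_pyo(749, 118900), 1))
```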

Conclusions

Despite accessible health care, mortality rates are high among people living in this urban slum, and infectious disease syndromes appear to be linked to a substantial proportion of deaths. Rapid urbanization poses an increasing challenge to national efforts to improve health outcomes, including reducing childhood mortality. Effective interventions targeting impoverished people in urban slums, such as water and sanitation programs, are needed to achieve national health objectives.

8.

Background

Uremic toxins are emerging as important, non-traditional cardiovascular risk factors in chronic kidney disease (CKD). P-cresol has been defined as a prototype protein-bound uremic toxin. Conjugation of p-cresol creates p-cresylsulfate (PCS) as the main metabolite and p-cresylglucuronide (PCG), at a markedly lower concentration. The objective of the present study was to evaluate serum PCG levels, determine the latter’s association with mortality and establish whether the various protein-bound uremic toxins (i.e. PCS, PCG and indoxylsulfate (IS)) differed in their ability to predict mortality.

Methodology/Principal Findings

We studied 139 patients (mean ± SD age: 67±12 years; males: 60%) at different CKD stages (34.5% at CKD stages 2–3, 33.5% at stages 4–5 and 32% at stage 5D). A recently developed high-performance liquid chromatography method was used to assay PCG concentrations. Total and free PCG levels increased with the severity of CKD. During the study period (mean duration: 779±185 days), 38 patients died. High free and total PCG levels were correlated with overall and cardiovascular mortality independently of well-known predictors of survival, such as age, vascular calcification, anemia, inflammation and (in predialysis patients) the estimated glomerular filtration rate. In the same cohort, free PCS and free IS levels were both correlated with mortality. Furthermore, the respective predictive powers of three Cox multivariate models (free PCS + other risk factors, free IS + other risk factors, and free PCG + other risk factors) were quite similar, suggesting that an elevated PCG concentration has much the same impact on mortality as other uremic toxins (such as PCS or IS) do.

Conclusions

Although PCG is the minor metabolite of p-cresol, our study is the first to reveal its association with mortality. Furthermore, the free fraction of PCG appears to have much the same predictive power for mortality as PCS and IS do.

9.

Background

Hyponatremia is the most common electrolyte disorder in clinical practice, and evidence to date indicates that severe hyponatremia is associated with increased morbidity and mortality. The aim of our study was to perform a meta-analysis of published studies comparing mortality rates in subjects with and without hyponatremia of any degree.

Methods and Findings

An extensive Medline, Embase and Cochrane search was performed to retrieve studies published up to October 1st, 2012, using the search terms “hyponatremia” and “mortality”. Eighty-one studies satisfied the inclusion criteria, encompassing a total of 850,222 patients, of whom 17.4% were hyponatremic. The identification of relevant abstracts, the selection of studies and the subsequent data extraction were performed independently by two of the authors, with conflicts resolved by a third investigator. Across all 81 studies, hyponatremia was significantly associated with an increased risk of overall mortality (RR = 2.60 [2.31–2.93]). Hyponatremia was also associated with an increased risk of mortality in patients with myocardial infarction (RR = 2.83 [2.23–3.58]), heart failure (RR = 2.47 [2.09–2.92]), cirrhosis (RR = 3.34 [1.91–5.83]), pulmonary infections (RR = 2.49 [1.44–4.30]), mixed diseases (RR = 2.59 [1.97–3.40]), and in hospitalized patients (RR = 2.48 [2.09–2.95]). A mean serum [Na+] difference of 4.8 mmol/L was found between subjects who died and survivors (130.1±5.6 vs 134.9±5.1 mmol/L). A meta-regression analysis showed that the hyponatremia-related risk of overall mortality was inversely correlated with serum [Na+]. This association was confirmed in a multiple regression model after adjusting for age, gender, and diabetes mellitus as an associated morbidity.
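Pooled risk ratios like those above are typically obtained by inverse-variance weighting with a between-study variance term. A minimal DerSimonian-Laird sketch, with invented per-study RRs and standard errors (not the data from this meta-analysis), is:

```python
# Illustrative DerSimonian-Laird random-effects pooling of log risk ratios,
# the kind of computation behind pooled RRs. Inputs are invented.
import math

def pool_random_effects(rrs, ses):
    """rrs: per-study risk ratios; ses: standard errors of log(RR)."""
    y = [math.log(rr) for rr in rrs]
    w = [1 / se**2 for se in ses]                 # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar)**2 for wi, yi in zip(w, y))  # heterogeneity Q
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)       # between-study variance
    w_star = [1 / (se**2 + tau2) for se in ses]   # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, y)) / sum(w_star)
    se_pooled = 1 / math.sqrt(sum(w_star))
    return math.exp(pooled), (math.exp(pooled - 1.96 * se_pooled),
                              math.exp(pooled + 1.96 * se_pooled))

rr, (lo, hi) = pool_random_effects([2.8, 2.5, 3.3], [0.12, 0.10, 0.25])
print(f"pooled RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

When the heterogeneity statistic Q is below its degrees of freedom, tau² is truncated at zero and the estimate collapses to the fixed-effect result.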

Conclusions

This meta-analysis shows for the first time that even a moderate serum [Na+] decrease is associated with an increased risk of mortality in commonly observed clinical conditions across large numbers of patients.

10.

Background

Elevated serum phosphorus levels have been linked with cardiovascular disease and mortality with conflicting results, especially in the presence of normal renal function.

Methods

We studied the association between serum phosphorus levels and clinical outcomes in 1663 patients with acute myocardial infarction (AMI). Patients were categorized into 4 groups based on serum phosphorus levels (<2.50, 2.51–3.5, 3.51–4.50 and >4.50 mg/dL). Cox proportional-hazards models were used to examine the association between serum phosphorus and clinical outcomes after adjustment for potential confounders.

Results

The mean follow-up was 45 months. The lowest mortality occurred in patients with serum phosphorus between 2.51 and 3.50 mg/dL; the multivariable-adjusted hazard ratios were 1.24 (95% CI 0.85–1.80), 1.35 (95% CI 1.05–1.74), and 1.75 (95% CI 1.27–2.40) in patients with serum phosphorus <2.50, 3.51–4.50 and >4.50 mg/dL, respectively. Higher phosphorus levels were also associated with an increased risk of heart failure, but not with the risk of myocardial infarction or stroke. The effect of elevated phosphorus was more pronounced in patients with chronic kidney disease (CKD): the hazard ratio for mortality in patients with serum phosphorus >4.50 mg/dL compared to patients with serum phosphorus 2.51–3.50 mg/dL was 2.34 (95% CI 1.55–3.54) with CKD and 1.53 (95% CI 0.87–2.69) without CKD.

Conclusion

We found a graded, independent association between serum phosphorus and all-cause mortality and heart failure in patients after AMI. The risk for mortality appears to increase with serum phosphorus levels within the normal range and is more prominent in the presence of CKD.

11.

Background

Previous studies exploring the association between 25[OH]D levels and mortality in adults with and without kidney disease utilized 25[OH]D thresholds that have recently been scrutinized by the Institute of Medicine Committee to Review Dietary Reference Intakes for Vitamin D and Calcium.

Objective

We explored all-cause mortality rates across the spectrum of 25[OH]D levels over an eighteen-year follow-up among adults with and without an estimated glomerular filtration rate (eGFR) <60 ml/min/1.73 m2.

Design

The study included 1,097 U.S. adults with eGFR <60 ml/min/1.73 m2 and 14,002 adults with eGFR ≥60 ml/min/1.73 m2. Mortality rates and rate ratios (RR) across 25[OH]D groups were calculated with Poisson regression and restricted cubic splines while adjusting for covariates.

Results

The prevalence of 25[OH]D levels <30 and <20 ng/ml among adults with eGFR <60 ml/min/1.73 m2 was 76.5% (population estimate 6.2 million) and 35.4% (population estimate 2.9 million), respectively. Among adults with eGFR ≥60 ml/min/1.73 m2, 70.5% had 25[OH]D levels <30 ng/ml (population estimate 132.2 million) and 30.3% had levels <20 ng/ml (population estimate 56.8 million). Significantly higher mortality rates were noted among individuals with 25[OH]D levels <12 ng/ml compared to the referent group (24 to <30 ng/ml): RR 1.41 (95% CI 1.17, 1.71) among individuals with eGFR <60 ml/min/1.73 m2 and RR 1.32 (95% CI 1.13, 1.56) among individuals with eGFR ≥60 ml/min/1.73 m2, after adjustment for covariates including co-morbid conditions. Mortality rates were fairly similar across all 25[OH]D groups with levels >20 ng/ml after adjustment for all covariates.

Conclusions

Regardless of the presence of eGFR <60 ml/min/1.73 m2, mortality rates across groups with 25[OH]D levels of 20–40 ng/ml are similar.

12.

Background

Evidence for the efficacy of clopidogrel in the chronic kidney disease (CKD) population with acute coronary syndrome (ACS) is inconclusive, and CKD patients are prone to bleeding on antiplatelet therapy. We therefore investigated the efficacy and safety of clopidogrel in patients with ACS and CKD.

Methods

In a Taiwan nationwide registry, 2819 ACS patients were enrolled. CKD was defined as an estimated glomerular filtration rate of less than 60 ml/min per 1.73 m2. The primary endpoint was the composite of death, non-fatal myocardial infarction and stroke at 12 months.

Results

Overall, 949 (33.7%) patients had CKD and 2660 (94.36%) patients received clopidogrel treatment. CKD was associated with an increased risk of the primary endpoint at 12 months (HR 2.39, 95% CI 1.82 to 3.15, p<0.01), while clopidogrel use was associated with a reduced risk (HR 0.42, 95% CI 0.29–0.60, p<0.01). Cox regression analysis showed that clopidogrel reduced death and the primary endpoint in the CKD population (HR 0.35, 95% CI 0.21–0.61 and HR 0.48, 95% CI 0.30–0.77, respectively; both p<0.01). Patients with clopidogrel(−)/CKD(−), clopidogrel(+)/CKD(+) and clopidogrel(−)/CKD(+) had 2.4-, 3.0- and 10.4-fold risks of the primary endpoint, respectively, compared with those receiving clopidogrel treatment without CKD (all p<0.01). Clopidogrel treatment was not associated with increased in-hospital Thrombolysis In Myocardial Infarction (TIMI) bleeding in the CKD population.

Conclusion

Clopidogrel may decrease mortality and improve cardiovascular outcomes without increasing the risk of bleeding in ACS patients with CKD.

13.

Objective

To review the incidence of respiratory conditions and their effect on mortality in HIV-infected and uninfected individuals prior to and during the era of highly active antiretroviral therapy (HAART).

Design

Two large observational cohorts of HIV-infected and HIV-uninfected men (Multicenter AIDS Cohort Study [MACS]) and women (Women’s Interagency HIV Study [WIHS]), followed since 1984 and 1994, respectively.

Methods

Adjusted odds or hazards ratios for incident respiratory infections or non-infectious respiratory diagnoses, respectively, in HIV-infected compared to HIV-uninfected individuals in both the pre-HAART (MACS only) and HAART eras; and adjusted Cox proportional hazard ratios for mortality in HIV-infected persons with lung disease during the HAART era.

Results

Compared to HIV-uninfected participants, HIV-infected individuals had more incident respiratory infections both pre-HAART (MACS, adjusted odds ratio [OR], 2.4; 95% confidence interval [CI], 2.2–2.7; p<0.001) and after HAART availability (MACS, adjusted OR, 1.5; 95% CI, 1.3–1.7; p<0.001; WIHS, adjusted OR, 2.2; 95% CI, 1.8–2.7; p<0.001). Chronic obstructive pulmonary disease was more common in HIV-infected vs. HIV-uninfected MACS participants pre-HAART (adjusted hazard ratio [HR] 2.9; 95% CI, 1.02–8.4; p = 0.046). After HAART availability, non-infectious lung diseases were not significantly more common in HIV-infected participants in either MACS or WIHS. HIV-infected participants with respiratory infections in the HAART era had an increased risk of death compared to those without infections (MACS adjusted HR, 1.5; 95% CI, 1.3–1.7; p<0.001; WIHS adjusted HR, 1.9; 95% CI, 1.5–2.4; p<0.001).

Conclusion

HIV infection remained a significant risk for infectious respiratory diseases after the introduction of HAART, and infectious respiratory diseases were associated with an increased risk of mortality.

14.

Background

The effects of intermittent, high-dose vitamin D treatment in older adults have not been well documented. We conducted a meta-analysis to provide a quantitative assessment of the efficacy of intermittent, high-dose vitamin D treatment on falls, fractures, and mortality among older adults.

Methods

Electronic databases were searched for randomized controlled trials (RCTs) of high-dose, intermittent vitamin D supplementation among older adults. Two researchers independently screened the literature against prespecified inclusion and exclusion criteria and extracted the data. Meta-analysis was performed using Review Manager 5.1.0.

Results

Nine trials were included in this meta-analysis. High-dose, intermittent vitamin D therapy did not decrease all-cause mortality among older adults (risk ratio [RR] 1.04, 95% CI 0.91–1.17). No benefit was seen in fracture or fall prevention: the RR (95% CI) was 1.17 (0.97–1.41) for hip fractures, 1.06 (0.91–1.22) for non-vertebral fractures, and 1.02 (0.96–1.08) for falls. Results remained robust after sensitivity analysis.

Conclusion

Intermittent, high-dose vitamin D supplementation may not be effective in preventing overall mortality, fractures, or falls among older adults. The route of administration of vitamin D supplements may well change their physiological effects.

15.

Background

Elevated serum levels of alkaline phosphatase (ALP) and phosphate have been shown to indicate higher risks of cardiovascular disease (CVD) and total mortality in populations with chronic kidney disease (CKD), but it remains unclear whether this association exists in people with normal or preserved renal function.

Methods

Embase and PubMed were searched from inception to December 2013 using keywords including “ALP”, “phosphate”, “CVD” and “mortality”; 24 trials with a total of 147,634 patients were ultimately included. Dose-response and semi-parametric meta-analyses were performed.

Results

A linear association of serum ALP and phosphate levels with the risks of coronary heart disease (CHD) events, CVD events and deaths was identified. The relative risk (RR) of ALP for CVD deaths was 1.02 (95% confidence interval [CI], 1.01–1.04). The RRs of phosphate for CVD deaths and events were 1.05 (95% CI, 1.02–1.09) and 1.04 (95% CI, 1.03–1.06), respectively. A non-linear association of ALP and phosphate with total mortality was identified. Compared with the reference categories, the pooled RR for total mortality was 1.57 (95% CI, 1.27–1.95) in the high-ALP group and 1.33 (95% CI, 1.21–1.46) in the high-phosphate group. Subgroup analysis suggested that higher serum ALP and phosphate levels indicate a higher mortality rate in diabetic patients and in those with previous CVD; the increase in total mortality was most evident in men and in Asians with high ALP.

Conclusion

A non-linear relationship exists between serum levels of ALP and phosphate and the risk of total mortality. There appears to be a positive association of serum ALP and phosphate levels with total mortality in people with normal or preserved renal function, while the relationship between ALP and CVD remains ambiguous.

16.

Objectives

To perform a systematic review of randomized controlled trials to determine whether preventing or slowing the progression of chronic kidney disease (CKD) translates into improved survival and, if so, to estimate the mortality risk attributable to CKD itself.

Background

CKD is associated with increased mortality. This association is largely based on evidence from observational studies; evidence from randomized controlled trials is lacking.

Methods

We searched Ovid, Medline and Embase for RCTs in which an intervention was given to prevent or slow the progression of CKD and mortality was reported as a primary, secondary or adverse outcome. In the first phase, pooled relative risks for renal endpoints were assessed. In the second phase, we assessed the effect on mortality in trials of interventions that definitively reduced CKD endpoints.

Results

Among the 52 studies selected in the first phase, only renin-angiotensin-aldosterone-system blockade vs. placebo (n = 18 trials, 32,557 participants) met the efficacy criteria for further analysis in the second phase, reducing renal endpoints by 15% to 27% compared to placebo. There was no difference in all-cause mortality (RR 0.99, 95% CI 0.92 to 1.08) or cardiovascular death (RR 0.97, 95% CI 0.78 to 1.21) between the treatment and control groups in these trials. There was sufficient statistical power to detect a 9% relative risk reduction in all-cause mortality and a 14% relative risk reduction in cardiovascular mortality.

Conclusions

Firm evidence is lacking that prevention of CKD translates into reductions in mortality. Larger trials with longer follow-up are needed to determine the benefit of CKD prevention on survival.

17.

Background

Severe malaria (SM) is a major cause of death in sub-Saharan Africa. Identification of both specific and sensitive clinical features to predict death is needed to improve clinical management.

Methods

A 13-year observational study of 2,901 children with SM aged 4 months to 14 years, enrolled at the Royal Victoria Teaching Hospital in The Gambia, was conducted from 1997 through 2009 to identify sensitive and specific predictors of poor outcome. We measured the sensitivity and specificity of clinical features that predict death or the development of neurological sequelae.

Findings

Impaired consciousness (odds ratio [OR] 4.4 [95% confidence interval (CI), 2.7–7.3]), respiratory distress (OR 2.4 [95% CI, 1.7–3.2]), hypoglycemia (OR 1.7 [95% CI, 1.2–2.3]), jaundice (OR 1.9 [95% CI, 1.2–2.9]) and renal failure (OR 11.1 [95% CI, 3.3–36.5]) were independently associated with death in children with SM. The clinical features with the highest sensitivity and specificity for predicting death were respiratory distress (area under the curve [AUC] 0.63 [95% CI, 0.60–0.65]) and impaired consciousness (AUC 0.61 [95% CI, 0.59–0.63]), comparable to the ability of hyperlactatemia (blood lactate >5 mM) to predict death (AUC 0.64 [95% CI, 0.55–0.72]). A Blantyre coma score (BCS) of 2 or less had a sensitivity of 74% and a specificity of 67% to predict death (AUC 0.70 [95% CI, 0.68–0.72]), and a sensitivity and specificity of 74% and 69%, respectively, to predict the development of neurological sequelae (AUC 0.72 [95% CI, 0.67–0.76]). The specificity of this BCS threshold for identifying children at risk of dying improved in children less than 3 years of age (AUC 0.74 [95% CI, 0.71–0.76]).
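Sensitivity and specificity for a clinical threshold such as BCS ≤2 come from a simple 2×2 classification of test result against outcome. A sketch with an invented toy cohort (not the study data):

```python
# Sketch of the sensitivity/specificity calculation for a binary clinical
# threshold ("positive" = BCS <= 2). The toy cohort below is invented.

def sens_spec(scores_outcomes, threshold=2):
    """scores_outcomes: iterable of (bcs, died) pairs."""
    tp = sum(1 for bcs, died in scores_outcomes if bcs <= threshold and died)
    fn = sum(1 for bcs, died in scores_outcomes if bcs > threshold and died)
    tn = sum(1 for bcs, died in scores_outcomes if bcs > threshold and not died)
    fp = sum(1 for bcs, died in scores_outcomes if bcs <= threshold and not died)
    return tp / (tp + fn), tn / (tn + fp)  # sensitivity, specificity

cohort = [(1, True), (2, True), (4, False), (5, False), (2, False), (3, True)]
sens, spec = sens_spec(cohort)
print(f"sensitivity {sens:.2f}, specificity {spec:.2f}")
```

Sweeping the threshold over all possible scores and plotting sensitivity against 1 − specificity is what produces the AUC figures quoted above.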

Conclusion

The BCS is a quantitative predictor of death. A BCS of 2 or less is the most sensitive and specific clinical feature to predict death or development of neurological sequelae in children with SM.

18.

Background

Area-based measures of economic deprivation are seldom applied to large medical records databases to establish population-scale associations between deprivation and disease.

Objective

To study the association between deprivation and incidence of common cancer types in a Southern European region.

Methods

Retrospective ecological study using the SIDIAP (Information System for the Development of Research in Primary Care) database of longitudinal electronic medical records for a representative population of Catalonia (Spain) and the MEDEA index, based on urban socioeconomic indicators from the Spanish census. Study outcomes were incident cervical, breast, colorectal, prostate, and lung cancer in 2009–2012. The completeness of SIDIAP cancer recording was evaluated through linkage of a geographic data subset to a hospital cancer registry. Associations between MEDEA quintiles and cancer incidence were evaluated using zero-inflated Poisson regression adjusted for sex, age, smoking, alcoholism, obesity, hypertension, and diabetes.

Results

SIDIAP sensitivity was 63% to 92% for the five cancers studied. There was a direct association between deprivation and lung, colorectal, and cervical cancer: incidence rate ratios (IRR) 1.82 [1.64–2.01], 1.60 [1.34–1.90], and 1.22 [1.07–1.38], respectively, comparing the most deprived to the most affluent areas. In wealthy areas, prostate and breast cancers were more common: IRR 0.92 [0.80–1.00] and 0.91 [0.78–1.06]. Adjustment for confounders attenuated the association with lung cancer risk (fully adjusted IRR 1.16 [1.08–1.25]), reversed the direction of the association with colorectal cancer (IRR 0.90 [0.84–0.95]), and did not modify the associations with cervical (IRR 1.27 [1.11–1.45]), prostate (IRR 0.74 [0.69–0.80]), and breast (IRR 0.76 [0.71–0.81]) cancer.
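An IRR from a (zero-inflated) Poisson regression is the exponentiated coefficient, and its Wald confidence limits are the exponentiated endpoints of β ± 1.96·SE. A reminder of that arithmetic, using a hypothetical coefficient and standard error chosen only to land near the lung-cancer figure above (these are not the study's fitted values):

```python
import math

def irr_with_ci(beta, se, z=1.96):
    """Turn a Poisson regression coefficient and its standard error
    into an incidence rate ratio with a Wald 95% CI."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Hypothetical coefficient for the most-deprived vs most-affluent contrast.
irr, lo, hi = irr_with_ci(beta=0.599, se=0.052)  # roughly 1.82 [1.64, 2.02]
```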

Conclusions

Deprivation is associated differently with the occurrence of different cancer types. These results provide evidence that MEDEA is a useful area-based deprivation index for analyses of the SIDIAP database. This information can help improve screening programs and cancer prevention and management strategies, reaching patients more effectively, particularly in deprived urban areas.

19.

Background

The expanding and invasive features of tumor nests could reflect the malignant biological behavior of breast invasive ductal carcinoma. Useful information on cancer invasiveness hidden within tumor nests could be extracted and analyzed by computer image processing and big-data analysis.

Methods

Tissue microarrays from invasive ductal carcinoma (n = 202) were first stained for cytokeratin by immunohistochemistry to clearly demarcate the tumor nests. An expert-aided computer analysis system was then developed to study the mathematical and geometrical features of the tumor nests: a computer recognition system and image-analysis software extracted tumor nest information, and mathematical features of the tumor nests were calculated. The relationship between tumor nest mathematical parameters and patients' 5-year disease-free survival was studied.

Results

Eight mathematical parameters were extracted by the expert-aided computer analysis system. Three parameters (number, circularity, and total perimeter) with an area under the curve >0.5 and four parameters (average area, average perimeter, total area/total perimeter, and average area/perimeter) with an area under the curve <0.5 in ROC analysis were combined into integrated parameter 1 and integrated parameter 2, respectively. Multivariate analysis showed that integrated parameter 1 (P = 0.040) was an independent prognostic factor for patients' 5-year disease-free survival. The hazard ratio of integrated parameter 1 was 1.454 (95% CI [1.017–2.078]), higher than that of N stage (HR 1.396, 95% CI [1.125–1.733]) and hormone receptor status (HR 0.575, 95% CI [0.353–0.936]), but lower than that of histological grading (HR 3.370, 95% CI [1.125–5.364]) and T stage (HR 1.610, 95% CI [1.026–2.527]).
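The AUC used above to sort parameters into the two integrated groups equals the Mann-Whitney probability that a randomly chosen positive case has a higher parameter value than a randomly chosen negative case (AUC > 0.5 means higher values track the adverse outcome; AUC < 0.5, the reverse). A minimal sketch of that rank-based computation, with illustrative values standing in for a tumor nest parameter:

```python
def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a random positive outscores a random
    negative, with ties counted as half a win."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical total-perimeter values and 5-year recurrence labels.
total_perimeter = [4.1, 3.8, 2.0, 1.5, 3.0, 1.2]
recurred        = [1,   1,   0,   0,   1,   0]
a = auc(total_perimeter, recurred)  # > 0.5: higher values track recurrence
```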

Conclusions

This study indicates that integrated parameter 1, combining mathematical features (number, circularity, and total perimeter) of tumor nests, could be a useful parameter for predicting the prognosis of early-stage breast invasive ductal carcinoma.

20.

Aims

The role of a low ankle-brachial index (ABI) in early-stage chronic kidney disease (CKD) is not fully known. This study was designed to investigate the prevalence of low ABI in early-stage CKD, defined as an estimated glomerular filtration rate (eGFR) of 60–89 ml/min/1.73 m2, in type 2 diabetic patients without albuminuria, and to determine the association between low ABI and mildly decreased eGFR.

Methods

This cross-sectional study enrolled 448 type 2 diabetic patients with normoalbuminuria. The patients were stratified into two groups according to the CKD-EPI eGFR: a normal group with eGFR ≥90 mL/min/1.73 m2 and a lower group with eGFR of 60–89 mL/min/1.73 m2. ABI was categorized as normal (1.0–1.39), low-normal (0.9–0.99), or low (<0.9). Both stepwise forward multiple linear regression and binary logistic regression analyses were performed to examine the association between ABI categories and eGFR levels and to assess the relation between low ABI and early-stage CKD.
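The three ABI bands in the Methods map onto a simple threshold rule. A minimal sketch of that categorization, with the "above range" label being an assumption for values of 1.40 or higher (the study's bands stop at 1.39; such values are conventionally treated as incompressible arteries and handled separately):

```python
def abi_category(abi):
    """Categorize an ankle-brachial index using the study's cut-offs:
    low (<0.9), low-normal (0.9-0.99), normal (1.0-1.39).
    Values >= 1.40 fall outside the three bands (assumed label)."""
    if abi < 0.90:
        return "low"
    elif abi < 1.00:
        return "low-normal"
    elif abi < 1.40:
        return "normal"
    return "above range"
```

In the regression analyses described above, these categories would enter as a factor with "normal" as the reference level.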

Results

The prevalence of low ABI in early-stage CKD among type 2 diabetic patients without albuminuria was 39.5%. Low ABI was associated with an approximately 3-fold greater risk of early-stage CKD in bivariate logistic regression analysis, and remained significantly associated with a 2.2-fold risk (95% confidence interval: 1.188–4.077; P = 0.012) after adjusting for traditional chronic kidney disease risk factors.

Conclusions

There was a high prevalence of low ABI among early-stage CKD patients with type 2 diabetes and normoalbuminuria, and a close relation between low ABI and early-stage CKD. This suggests that, in clinical work, more attention should be paid to patients who have only mildly decreased eGFR and normoalbuminuria but already have a low ABI, and that preventive therapy should be considered at an early stage.
