Similar Articles
20 similar articles found (search time: 15 ms)
1.

Background

The contribution of the causes of death to long-term mortality after acute kidney injury (AKI) has not been well studied. The purpose of this study was to evaluate the role of comorbidities and causes of death in long-term mortality after AKI.

Methodology/Principal Findings

We retrospectively studied 507 patients who experienced AKI in 2005–2006 and were discharged free from dialysis. By June 2008 (median: 21 months after AKI), 193 (38%) patients had died. This mortality is much higher than that of the population of São Paulo City, even after adjustment for age. A multiple survival analysis was performed using a Cox proportional hazards regression model and showed that death was associated with a Khan's index indicating high risk [adjusted hazard ratio 2.54 (1.38–4.66)], chronic liver disease [1.93 (1.15–3.22)], admission to a non-surgical ward [1.85 (1.30–2.61)] and a second AKI episode during the same hospitalization [1.74 (1.12–2.71)]. AKI severity, evaluated either by the worst stage reached during AKI (P = 0.20) or by the need for dialysis (P = 0.12), was not associated with death. The cause of death was identified from a death certificate in 85% of the non-survivors. Among those who died from circulatory system diseases (the main cause of death), 59% had already suffered from hypertension, 34% from diabetes, 47% from heart failure, 38% from coronary disease, and 66% had a glomerular filtration rate <60 prior to the AKI episode. Among those who died from neoplasms, 79% already had the disease before the AKI episode.
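As an illustration of the type of analysis described above, the following is a minimal Python sketch of fitting a Cox proportional hazards model with the lifelines library; the DataFrame, column names and simulated values are hypothetical and are not taken from the study.

```python
# Minimal sketch (not the study's code): Cox proportional hazards regression
# for long-term mortality, using simulated data with illustrative covariates.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "khan_high_risk":   rng.integers(0, 2, n),
    "chronic_liver":    rng.integers(0, 2, n),
    "nonsurgical_ward": rng.integers(0, 2, n),
    "second_aki":       rng.integers(0, 2, n),
    "followup_months":  rng.exponential(24, n),   # time from AKI to death or censoring
    "died":             rng.integers(0, 2, n),    # 1 = died during follow-up
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_months", event_col="died")
cph.print_summary()  # the exp(coef) column gives the adjusted hazard ratios
```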

Conclusions

Among AKI survivors who were discharged free from dialysis, the increased long-term mortality was associated with their pre-existing chronic conditions and not with the severity of the AKI episode. These findings suggest that these survivors should receive medical follow-up after hospital discharge and that all efforts should be made to control their comorbidities.

2.

Background

The incidence of acute kidney injury (AKI) is increasing globally and it is much more common than end-stage kidney disease. AKI is associated with high mortality and cost of hospitalisation. Studies of treatments to reduce this high mortality have used differing renal replacement therapy (RRT) modalities and have not shown improvement in the short term. The reported long-term outcomes of AKI are variable and the effect of differing RRT modalities upon them is not clear. We used the prolonged follow-up of a large clinical trial to prospectively examine the long-term outcomes and effect of RRT dosing in patients with AKI.

Methods and Findings

We extended the follow-up of participants in the Randomised Evaluation of Normal vs. Augmented Levels of RRT (RENAL) study from 90 days to 4 years after randomization. Primary and secondary outcomes were mortality and requirement for maintenance dialysis, respectively, assessed in 1,464 (97%) patients at a median of 43.9 months (interquartile range [IQR] 30.0–48.6 months) post randomization. A total of 468/743 (63%) and 444/721 (62%) patients died in the lower and higher intensity groups, respectively (risk ratio [RR] 1.04, 95% CI 0.96–1.12, p = 0.49). Amongst survivors to day 90, 21 of 411 (5.1%) and 23 of 399 (5.8%) in the respective groups were treated with maintenance dialysis (RR 1.12, 95% CI 0.63–2.00, p = 0.69). The prevalence of albuminuria among survivors was 40% and 44%, respectively (p = 0.48). Quality of life was not different between the two treatment groups. The generalizability of these findings to other populations with AKI requires further exploration.
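For readers who want to see the arithmetic behind a crude risk ratio like the one quoted above, here is a small Python sketch; it computes an unadjusted ratio with a Wald 95% CI from the reported event counts, which need not equal the published estimate produced by the trial's own analysis model.

```python
# Minimal sketch: crude risk ratio with a Wald 95% CI from 2x2 event counts.
import math

def risk_ratio(events_a, n_a, events_b, n_b):
    """Crude risk ratio of group A vs. group B with a Wald 95% CI."""
    p_a, p_b = events_a / n_a, events_b / n_b
    rr = p_a / p_b
    se_log_rr = math.sqrt(1/events_a - 1/n_a + 1/events_b - 1/n_b)
    lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
    hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
    return rr, lo, hi

# Deaths: 468/743 (lower intensity) vs. 444/721 (higher intensity)
print(risk_ratio(468, 743, 444, 721))
```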

Conclusions

Patients with AKI requiring RRT in intensive care have high long-term mortality but few require maintenance dialysis. Long-term survivors have a heavy burden of proteinuria. Increased intensity of RRT does not reduce mortality or subsequent treatment with dialysis.

Trial registration

www.ClinicalTrials.gov NCT00221013. Please see later in the article for the Editors' Summary.

3.

Background

Endotoxemia is exaggerated and contributes to systemic inflammation and atherosclerosis in patients requiring continuous ambulatory peritoneal dialysis (CAPD). The risk of mortality is substantially increased in patients requiring CAPD for >2 years. However, little is known about the effects of long-term CAPD on circulating endotoxin and cytokine levels. Therefore, the present study evaluated the associations among plasma endotoxin levels, cytokine levels and clinical parameters, as well as the effects of a short-dwell exchange on endotoxemia and cytokine levels, in patients on long-term CAPD.

Methods

A total of 26 patients were enrolled and divided into two groups (short-term or long-term CAPD) according to whether their CAPD duration exceeded 2 years. Plasma endotoxin and cytokine levels were measured before and after a short-dwell exchange (4-h dwell) during a peritoneal equilibration test (a standardized method for evaluating the solute transport function of the peritoneal membrane). These data were analyzed to compare circulating endotoxin levels, cytokines and clinical characteristics between the two groups.

Results

Plasma endotoxin and monocyte chemotactic protein-1 (MCP-1) levels were significantly elevated in the long-term group. PD duration was significantly correlated with plasma endotoxin (r = 0.479, P = 0.016) and MCP-1 (r = 0.486, P = 0.012), and was independently associated with plasma MCP-1 levels in multivariate regression. After the 4-h short-dwell exchange, plasma MCP-1 levels in the long-term PD group tended to decrease (13.3% reduction, P = 0.077), whereas endotoxin levels did not.
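A minimal sketch of the kind of correlation reported above is given below; the abstract does not state which correlation coefficient was used, so Pearson is assumed, and all values are simulated rather than taken from the study.

```python
# Minimal sketch (illustrative only): correlation between PD duration and a
# plasma marker, in the style of "r = 0.479, P = 0.016".
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
pd_duration_months = rng.uniform(3, 120, 26)                      # 26 hypothetical patients
endotoxin = 0.02 * pd_duration_months + rng.normal(0, 0.8, 26)    # made-up marker values

r, p = stats.pearsonr(pd_duration_months, endotoxin)
print(f"r = {r:.3f}, P = {p:.3f}")
```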

Conclusion

Long-term PD may result in exaggerated endotoxemia and elevated plasma MCP-1 levels. The duration of PD was significantly correlated with circulating endotoxin and MCP-1 levels, and was an independent predictor of plasma MCP-1 levels. Short-dwell exchange seemed to have favorable effects on circulating MCP-1 levels in patients on long-term PD.

4.

Background

Although currently available IGRAs have been reported to be promising markers of TB infection, they cannot distinguish active tuberculosis (TB) from latent TB infection (LTBI).

Objective

Children with LTBI, children with active TB disease and uninfected children were prospectively evaluated with an in-house ELISPOT assay in order to investigate possible immunological markers for the differential diagnosis of LTBI and active TB.

Methods

Children at risk of TB infection who were prospectively enrolled in our infectious diseases unit were evaluated with in-house IFN-γ- and IL-2-based ELISPOT assays using a panel of Mycobacterium tuberculosis antigens.

Results

Twenty-nine children were classified as uninfected, 21 as LTBI and 25 as active TB cases (including 5 definite and 20 probable cases). Significantly higher IFN-γ ELISPOT responses were observed in infected vs. uninfected children for ESAT-6 (p<0.0001), CFP-10 (p<0.0001), TB 10.3 (p = 0.003), and AlaDH (p = 0.001), while differences were not significant for Ag85B (p = 0.063), PstS1 (p = 0.512), and HspX (16 kDa) (p = 0.139). IL-2 ELISPOT responses differed for ESAT-6 (p<0.0001), CFP-10 (p<0.0001), TB 10.3 (p<0.0001), HspX (16 kDa) (p<0.0001), PstS1 (p<0.0001) and AlaDH (p = 0.001), but not for Ag85B (p = 0.063). Comparing children with LTBI and those with TB disease, differences were significant in the IFN-γ ELISPOT only for the AlaDH antigen (p = 0.021) and in the IL-2 ELISPOT for AlaDH (p<0.0001) and the TB 10.3 antigen (p = 0.043). ROC analysis demonstrated a sensitivity of 100% and a specificity of 81% for the AlaDH-IL-2 ELISPOT assay in discriminating between latent and active TB using a cut-off of 12.5 SCF per million PBMCs.
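The sketch below shows how an ROC-based cut-off of the kind reported above could be derived in Python; the labels and spot counts are simulated and the threshold selection by Youden's index is an assumption, not necessarily the study's method.

```python
# Minimal sketch (not the study's code): ROC analysis of an ELISPOT readout
# for separating active TB from LTBI, with a cut-off chosen by Youden's index.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(2)
y = np.concatenate([np.ones(25), np.zeros(21)])                   # 1 = active TB, 0 = LTBI
spots = np.concatenate([rng.normal(40, 15, 25), rng.normal(8, 6, 21)])  # hypothetical counts

fpr, tpr, thresholds = roc_curve(y, spots)
best = (tpr - fpr).argmax()
print(f"AUC = {roc_auc_score(y, spots):.2f}")
print(f"cut-off = {thresholds[best]:.1f}, sensitivity = {tpr[best]:.2f}, "
      f"specificity = {1 - fpr[best]:.2f}")
```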

Conclusion

Our data suggest that an IL-2-based ELISPOT with the AlaDH antigen may help discriminate children with active TB from those with latent TB.

5.

Background

Previous studies have suggested that erectile dysfunction (ED) is an independent risk factor for macrovascular disease. Very few studies have evaluated the relationship between ED and risk of end stage renal disease (ESRD) requiring dialysis.

Methods

A random sample of 1,000,000 individuals was drawn from Taiwan's National Health Insurance database. We identified 3,985 patients with newly diagnosed ED between 2000 and 2008 and compared them with a matched cohort of 23,910 patients without ED, matched by age, diabetes, hypertension, coronary heart disease, hyperlipidemia, area of residence, monthly income and index date. All patients were tracked from the index date to identify those who subsequently required dialysis.

Results

The incidence rates of dialysis in the ED and comparison cohorts were 10.85 and 9.06 per 10,000 person-years, respectively. Stratified by age, the incidence rate ratio for dialysis was greater in ED patients aged <50 years (3.16, 95% CI: 1.62–6.19, p = 0.0008) but not in those aged 50–64 years (0.94, 95% CI: 0.52–1.69, p = 0.8397) or those aged ≥65 years (0.69, 95% CI: 0.32–1.52, p = 0.3594). After adjustment for patient characteristics and medical comorbidities, the adjusted HR for dialysis remained greater in ED patients aged <50 years (adjusted HR: 2.08, 95% CI: 1.05–4.11, p<0.05). The log-rank test revealed that ED patients aged <50 years had a significantly higher cumulative incidence of dialysis than their counterparts without ED (p = 0.0004).
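The following sketch shows how incidence rates per 10,000 person-years and a crude incidence rate ratio with a Wald 95% CI can be computed. The event counts and person-years below are hypothetical values chosen only so that they reproduce the quoted rates; they are not the study's actual counts.

```python
# Minimal sketch: incidence rates and a crude incidence rate ratio with 95% CI.
import math

def incidence_rate_ratio(events_exp, py_exp, events_ctrl, py_ctrl):
    rate_exp = events_exp / py_exp
    rate_ctrl = events_ctrl / py_ctrl
    irr = rate_exp / rate_ctrl
    se_log = math.sqrt(1/events_exp + 1/events_ctrl)
    lo = math.exp(math.log(irr) - 1.96 * se_log)
    hi = math.exp(math.log(irr) + 1.96 * se_log)
    # rates reported per 10,000 person-years
    return rate_exp * 1e4, rate_ctrl * 1e4, irr, lo, hi

# Hypothetical counts: 30 dialysis events / 27,650 PY vs. 150 events / 165,600 PY
print(incidence_rate_ratio(30, 27_650, 150, 165_600))
```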

Conclusion

Patients with ED, especially younger patients, are at an increased risk for ESRD requiring dialysis later in life.

6.

Background

Blood-based interferon-gamma release assays (IGRAs) for the diagnosis of tuberculosis do not discriminate between active TB disease and latent TB infection (LTBI). The search for distinguishing biomarkers therefore continues, as the accurate diagnosis of tuberculosis is particularly challenging in children. IFN-γ-inducible protein 10 (IP-10/CXCL10) has recently been evaluated as a marker for active TB in adults, with promising results.

Aim

To investigate this new biomarker for active TB and LTBI in paediatrics.

Method

We measured IP-10 levels using ELISA in supernatants of whole blood samples stimulated with TB-specific-antigens and negative control antigen.

Results

IP-10 was produced at high levels following mycobacterial antigen stimulation in active TB (n = 17) and LTBI (n = 16) compared with controls (n = 16), and at higher levels than IFN-γ. Baseline levels of IP-10 were increased in active TB and in LTBI, but there was no significant difference in stimulated IP-10 levels between active TB and LTBI.

Conclusions

IP-10 is a biomarker for tuberculosis in children. However, like IFN-γ, IP-10 does not distinguish between active TB and LTBI.

7.

Background

There is a need for reliable markers to diagnose active and latent tuberculosis (TB). Interferon-gamma release assays (IGRAs) are more specific than the tuberculin skin test (TST) but cannot discriminate between recent and remote TB infection. Here, the Flow-cytometric Assay for Specific Cell-mediated Immune-response in Activated whole blood (FASCIA), which quantifies expanded T-lymphoblasts by flow-cytometric analysis after long-term antigen stimulation of whole blood, is combined with cytokine/chemokine analysis of the supernatant by multiplex technology for the diagnosis of Mycobacterium tuberculosis (Mtb) infection.

Methods and Findings

Consecutive patients with suspected TB (n = 85), patients with microbiologically verified active pulmonary TB (n = 33), extrapulmonary TB (n = 21), clinical TB (n = 11), presumed latent TB infection (LTBI) (n = 23), patients negative for TB (n = 8) and 21 healthy controls were studied. Blood samples were analyzed with FASCIA and multiplex technology to determine and correlate proliferative responses and the value of 14 cytokines for the diagnosis of Mtb infection: IFN-γ, IL-2, TNF-α, IP-10, IL-12, IL-6, IL-4, IL-5, IL-13, IL-17, MIP-1β, GM-CSF, IFN-α2 and IL-10. Levels of IFN-γ, IP-10, MIP-1β, IL-2, TNF-α, IL-6, IL-10, IL-13 and GM-CSF were significantly higher after stimulation with the Mtb-specific antigens ESAT-6 and CFP-10 in patients with active TB than in healthy controls (p<0.05) and correlated with proliferative responses. IP-10 was positive in all patients with verified TB when a combination of ESAT-6 and CFP-10 was used, and it was the only marker significantly more sensitive than IFN-γ in detecting active TB (p = 0.012). Cytokine responses in patients with active TB were more frequent and detected at higher levels than in patients with LTBI.

Conclusions

IP-10 seems to be an important marker for the diagnosis of active and latent TB. Patients with active TB and LTBI responded with similar cytokine profiles against TB antigens, but proliferative and cytokine responses were generally higher in patients with active TB.

8.

Background and Aims

The presence and progression of vascular calcification have been shown to be important risk factors for mortality in dialysis patients. However, since the majority of subjects in most previous studies were hemodialysis patients, limited information is available for peritoneal dialysis (PD) patients. Therefore, the aim of this study was to investigate the prevalence of aortic arch calcification (AoAC) and the prognostic value of AoAC progression in PD patients.

Methods

We prospectively determined AoAC by chest X-ray at PD start and after 12 months, and evaluated the impact of AoAC progression on mortality in 415 incident PD patients.

Results

Of the 415 patients, 169 (40.7%) had AoAC at baseline, with a mean AoAC of 18.1±11.2%. Baseline AoAC was an independent predictor of all-cause [hazard ratio (HR): 2.181, 95% confidence interval (CI): 1.336–3.561, P = 0.002] and cardiovascular mortality (HR: 3.582, 95% CI: 1.577–8.132, P = 0.002). Among the 363 patients with follow-up chest X-rays at 12 months after PD start, the proportion with AoAC progression was significantly higher in patients with baseline AoAC (64.2% vs. 5.3%, P<0.001). Moreover, all-cause and cardiovascular death rates were significantly higher in the progression group than in the non-progression group (P<0.001). Multivariate Cox analysis revealed that AoAC progression was an independent predictor of all-cause (HR: 2.625, 95% CI: 1.150–5.991, P = 0.022) and cardiovascular mortality (HR: 4.008, 95% CI: 1.079–14.890, P = 0.038) in patients with AoAC at baseline.

Conclusions

The presence and progression of AoAC assessed by chest X-ray were independently associated with unfavorable outcomes in incident PD patients. Regular follow-up by chest X-ray could be a simple and useful method to stratify mortality risk in these patients.

9.

Background

Continuous ambulatory peritoneal dialysis (CAPD) patients with diabetes are at increased risk of mortality and high peritoneal transporters appear to contribute to poor survival. However, little is known about the combined impacts of high peritoneal transporters and diabetes on mortality.

Methods

This was a prospective observational cohort study in which 776 incident CAPD patients were enrolled. Unadjusted and adjusted Cox proportional hazards regression models were used to evaluate the association and interaction of peritoneal transport status and diabetic status with mortality.

Results

In the entire cohort, high peritoneal transport status was associated with an increased risk of all-cause mortality in the unadjusted model [hazard ratio (HR) 2.35, 95% confidence interval (CI) 1.30 to 4.25, P = 0.01], but this association was not significant in the multivariable model. There was an interaction between peritoneal membrane transport status and diabetes (P = 0.028). Subgroup analyses showed that, compared with low and low-average transport, high transport status was associated with a higher risk of all-cause mortality (adjusted HR 1.78, 95% CI 1.07 to 4.70, P = 0.04) in CAPD patients without diabetes, but not in those with diabetes (adjusted HR 0.79, 95% CI 0.33 to 1.89, P = 0.59). Results were similar when transport status was assessed as a continuous variable.
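A minimal sketch of how an interaction between transport status and diabetes could be tested in a Cox model is shown below, assuming a recent lifelines version with formula support; the data and variable names are simulated, not the study's.

```python
# Minimal sketch (not the study's code): Cox model with a
# transport-status-by-diabetes interaction term, on simulated data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 776
df = pd.DataFrame({
    "high_transport": rng.integers(0, 2, n),
    "diabetes":       rng.integers(0, 2, n),
    "time_months":    rng.exponential(36, n),
    "death":          rng.integers(0, 2, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_months", event_col="death",
        formula="high_transport + diabetes + high_transport:diabetes")
cph.print_summary()  # the interaction row tests whether the transport effect differs by diabetes
```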

Conclusions

The association between high peritoneal transport and all-cause mortality was likely to vary with diabetes status. High peritoneal transport was associated with an elevated risk of death among CAPD patients without diabetes, but not in those with diabetes.

10.

Background

Both chronic obstructive pulmonary disease (COPD) and tuberculosis (TB) primarily affect the lungs and are major causes of morbidity and mortality worldwide. COPD and TB share common risk factors such as smoking, low socioeconomic status and dysregulation of host defence functions. COPD is a prevalent comorbid condition, especially in elderly patients with TB, but in contrast to other diseases known to increase the risk of TB, relatively little is known about the specific relationship between COPD and TB incidence and mortality.

Methods and Findings

All individuals ≥40 years of age discharged with a diagnosis of COPD from Swedish hospitals during 1987–2003 were identified in the Swedish Inpatient Register (n = 115,867). Records were linked to the Swedish Tuberculosis Register 1989–2007, and the relative risk of active TB in patients with COPD compared with control subjects randomly selected from the general population (matched for sex, year of birth and county of residence) was estimated using Cox regression. The analyses were stratified by year of birth, sex and county of residence and adjusted for immigration status, socioeconomic status (SES) and inpatient comorbidities previously known to increase the risk of TB. COPD patients had a three-fold increased hazard of developing active TB (HR 3.0, 95% confidence interval 2.4 to 4.0), mainly driven by an increased risk of pulmonary TB. In addition, logistic regression estimates showed that COPD patients who developed active TB had a two-fold increased risk of death from all causes within the first year after the TB diagnosis compared with general-population control subjects with TB (OR 2.2, 95% confidence interval 1.2 to 4.1).

Conclusions

This population-based study, comprising a large number of COPD patients, shows that these patients have an increased risk of developing active TB compared with the general population. The results raise concerns that the increasing global burden of COPD will increase the incidence of active TB. The underlying contributory factors need to be disentangled in further studies.

11.

Background

It has long been debated whether three sputum specimens are required for the diagnosis of pulmonary tuberculosis (TB) or whether TB can be diagnosed effectively using two consecutive sputum specimens. This study set out to evaluate the value of examining multiple sputum specimens in the diagnosis of TB.

Methods

We retrospectively reviewed the acid-fast bacillus (AFB) smear and culture results of sputum specimens collected on three consecutive days from 413 confirmed TB patients, who were identified as part of a larger active case-finding study in Dhaka Central Jail, the largest correctional facility in Bangladesh.

Results

AFB was detected in 81% (n = 334) of patients: 89% (n = 297) were diagnosed from the first sputum specimen and an additional 9% (n = 30) from the second. M. tuberculosis growth was observed for 406 patients; 85% (n = 343) were detected from the first sputum specimen and an additional 10% (n = 42) from the second. The third specimen did not add significant diagnostic value for the detection of AFB by microscopy or for growth of M. tuberculosis.
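The incremental-yield arithmetic behind these percentages can be checked with a few lines of Python; the sketch below uses only the smear figures quoted above (297 of 334 smear-positive patients detected on the first specimen, 30 more on the second).

```python
# Minimal sketch: incremental and cumulative diagnostic yield of successive
# sputum specimens, using the smear figures quoted in the abstract.
detected = {"specimen_1": 297, "specimen_2": 30, "specimen_3": 334 - 297 - 30}
total_positive = 334

cumulative = 0
for specimen, n in detected.items():
    cumulative += n
    print(f"{specimen}: +{n} patients "
          f"({n / total_positive:.0%} incremental, {cumulative / total_positive:.0%} cumulative)")
```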

Conclusions

We conclude that examining two consecutive sputum specimens is sufficient for the effective diagnosis of TB. This approach can also decrease the laboratory workload and thereby improve the quality of work in high TB-burden settings such as Bangladesh.

12.

Background

Diffuse bronchiectasis (DB) may occur in rheumatoid arthritis (RA). CFTR (cystic fibrosis transmembrane conductance regulator) mutations predispose RA patients to DB, but the prognosis of RA-associated DB (RA-DB) is unclear.

Methods

We report long-term mortality data from a nationwide family-based association study of patients with RA only, DB only or RA-DB. We assessed mortality as a function of clinical characteristics and CF/CFTR-RD (CFTR-related disorders) mutations in 137 subjects from 24 kindreds. Potential risk factors were investigated by Cox proportional-hazard analysis with shared Gaussian random effects to account for within-family correlations.

Results

During a median follow-up of 11 years after inclusion, 18 patients died, mostly from cardiorespiratory causes. Survival was significantly lower for RA-DB patients than for unaffected relatives and for patients with RA or DB only. RA patients with DB also had a poorer prognosis in terms of survival after RA diagnosis (HR, 8.6; 95% CI, 1.5–48.2; P = 0.014) and from birth (HR, 9.6; 95% CI, 1.1–81.7; P = 0.039). Early onset of DB (HR, 15.4; 95% CI, 2.1–113.2; P = 0.007) and CF/CFTR-RD mutations (HR, 7.2; 95% CI, 1.4–37.1; P = 0.018) were associated with poorer survival in patients with RA-DB. Thus, CF/CFTR-RD mutations in RA patients with early-onset DB defined a subgroup of high-risk patients with higher mortality rates (log-rank test P = 1.28×10−5).

Conclusion

DB is associated with poorer survival in patients with RA. Early-onset DB and CFTR mutations are two markers that identify RA patients at a high risk of death, for whom future therapeutic interventions should be designed and evaluated.

13.

Background

Betaine is a major osmolyte that is also important in methyl group metabolism. Blood concentrations of betaine, its metabolite dimethylglycine and its analog trimethylamine-N-oxide (TMAO) are cardiovascular risk markers. Diabetes disturbs betaine metabolism: does diabetes alter the associations between betaine-related measures and cardiovascular risk?

Methods

Plasma samples were collected from 475 subjects four months after discharge following an acute coronary admission. Death (n = 81), secondary acute MI (n = 87), admission for heart failure (n = 85), unstable angina (n = 72) and all cardiovascular events (n = 283) were recorded (median follow-up: 1804 days).

Results

High and low metabolite concentrations were defined as the top or bottom quintile of the total cohort. In subjects with diabetes (n = 79), high plasma betaine was associated with increased frequencies of events, significantly so for heart failure, hazard ratio 3.1 (1.2–8.2), and all cardiovascular events, HR 2.8 (1.4–5.5). In subjects without diabetes (n = 396), low plasma betaine was associated with events, significantly so for secondary myocardial infarction, HR 2.1 (1.2–3.6), unstable angina, HR 2.3 (1.3–4.0), and all cardiovascular events, HR 1.4 (1.0–1.9). In diabetes, high TMAO was a marker of all outcomes: HR 2.7 (1.1–7.1) for death, 4.0 (1.6–9.8) for myocardial infarction, 4.6 (2.0–10.7) for heart failure, 9.1 (2.8–29.7) for unstable angina and 2.0 (1.1–3.6) for all cardiovascular events. In subjects without diabetes, TMAO was significant only for death, HR 2.7 (1.6–4.8), and heart failure, HR 1.9 (1.1–3.4). Adding the estimated glomerular filtration rate to the Cox regression models tended to increase the apparent risks associated with low betaine.

Conclusions

Elevated plasma betaine concentration is a marker of cardiovascular risk in diabetes; conversely low plasma betaine concentrations indicate increased risk in the absence of diabetes. We speculate that the difference reflects control of osmolyte retention in tissues. Elevated plasma TMAO is a strong risk marker in diabetes.

14.

Aims

To investigate whether uric acid (UA) is an independent predictor of cardiovascular (CV) and all-cause mortality in peritoneal dialysis (PD) patients after controlling for recognized CV risk factors.

Methods

A total of 2264 patients on chronic PD were recruited from seven centers affiliated with the Socioeconomic Status on the Outcome of Peritoneal Dialysis (SSOP) Study. All demographic and laboratory data were recorded at baseline. Multivariate Cox regression was used to calculate the hazard ratios (HRs) of CV and all-cause mortality with adjustment for recognized traditional and uremia-related CV risk factors.

Results

There were no significant differences in baseline characteristics between patients with (n = 2193) and without (n = 71) measured UA. Each 1 mg/dL increase in UA was associated with higher all-cause mortality [HR 1.05 (1.00–1.10)] and higher CV mortality [HR 1.12 (1.05–1.20)] after adjusting for age, gender and center size. The highest gender-specific tertile of UA predicted higher all-cause mortality [HR 1.23 (1.00–1.52)] and higher CV mortality [HR 1.69 (1.21–2.38)] after the same adjustments. The predictive value of UA was stronger in patients younger than 65 years without CV disease or diabetes at baseline. The prognostic value of UA, as both a continuous and a categorical variable, weakened or disappeared after further adjustment for uremia-related and traditional CV risk factors.

Conclusions

The prognostic value of UA for CV and all-cause mortality in PD patients was generally weak and was confounded by uremia-related and traditional CV risk factors.

15.

Background

To examine the characteristics of oxidative stress in patients with acute kidney injury (AKI) and investigate the association between plasma nitrotyrosine levels and 90-day mortality in patients with AKI.

Methodology/Principal Findings

A total of 158 patients with hospital-acquired AKI were recruited into this prospective cohort study according to the RIFLE (Risk, Injury, Failure, Loss, End-stage kidney disease) criteria. Twelve critically ill patients without AKI and 15 age- and gender-matched healthy subjects served as controls. Plasma 3-nitrotyrosine was analyzed in relation to 90-day all-cause mortality of patients with AKI. The patients with AKI were followed up for 90 days and grouped according to the median plasma 3-nitrotyrosine concentration. The highest 3-NT/Tyr was detected in patients with AKI compared with healthy subjects and critically ill patients without AKI (ANOVA p<0.001). The 90-day survival curves of patients with high 3-NT/Tyr differed significantly from those of patients with low 3-NT/Tyr (p = 0.001 by log-rank test). Multivariate analysis (Cox regression) revealed that 3-NT/Tyr (p = 0.025) was independently associated with mortality after adjustment for age, gender, sepsis and Acute Physiology and Chronic Health Evaluation (APACHE) II score.
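A minimal sketch of a log-rank comparison of two survival groups, such as the high vs. low 3-NT/Tyr groups described above, is given below; the survival times and censoring are simulated, not the study's data.

```python
# Minimal sketch (not the study's code): log-rank test comparing 90-day
# survival between two groups, using simulated data with lifelines.
import numpy as np
from lifelines.statistics import logrank_test

rng = np.random.default_rng(4)
t_high = np.minimum(rng.exponential(120, 79), 90)   # hypothetical survival times (days)
t_low  = np.minimum(rng.exponential(300, 79), 90)
e_high = t_high < 90                                # death observed before day 90
e_low  = t_low < 90                                 # times reaching day 90 are censored

result = logrank_test(t_high, t_low, event_observed_A=e_high, event_observed_B=e_low)
print(f"log-rank p = {result.p_value:.4f}")
```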

Conclusions/Significance

There is excess plasma protein oxidation in patients with AKI, as evidenced by increased nitrotyrosine content. The 3-NT/Tyr level was associated with mortality in AKI patients independently of the severity of illness.

16.

Background

Biomarkers that differentiate between active tuberculosis (TB) and latent TB infection (LTBI) and that monitor treatment responses are needed to complement TB diagnostics and control, particularly in patients with multidrug-resistant TB. We studied soluble markers of the Toll-like receptor 4 (TLR-4) pathway in various stages of TB disease and during anti-TB treatment.

Methods

Plasma samples from patients with culture-confirmed drug-sensitive TB (n = 19) were collected before and after 2, 8 and 24 weeks of effective anti-TB treatment, and from an LTBI group (n = 6). Soluble (s)CD14 and myeloid differentiation-2 (MD-2) were analyzed by enzyme-linked immunosorbent assay (ELISA). Lipopolysaccharide (LPS) was analyzed by the Limulus amebocyte lysate colorimetric assay. Nonparametric statistics were applied.

Results

Plasma levels of sCD14 (p<0.001), MD-2 (p = 0.036) and LPS (p = 0.069) were elevated at baseline in patients with untreated active TB compared to the LTBI group. MD-2 concentrations decreased after 2 weeks of treatment (p = 0.011), while LPS levels decreased after 8 weeks (p = 0.005). In contrast, sCD14 levels increased after 2 weeks (p = 0.047) with a subsequent modest decrease throughout the treatment period. There was no significant difference in concentrations of any of these markers between patients with pulmonary and extrapulmonary TB or between patients with or without symptoms.

Conclusion

Our data suggest that plasma levels of LPS, MD-2 and sCD14 can discriminate between active TB and LTBI. A decline in LPS and MD-2 concentrations was associated with response to anti-TB treatment. The clinical potential of these soluble TLR-4 pathway proteins needs to be further explored.

17.

Background & Aims

Official guidelines do not recommend hepatic resection (HR) for patients with hepatocellular carcinoma (HCC) and portal hypertension (PHT). This study aims to investigate the safety and efficacy of HR for patients with HCC and PHT.

Methods

Mortality and survival after HR were analyzed retrospectively in a consecutive sample of 1738 HCC patients with PHT (n = 386) or without it (n = 1352). To assess the robustness of the findings, we repeated the analysis using propensity score matching. We also comprehensively searched the PubMed database for studies evaluating the efficacy and safety of HR for patients with HCC and PHT.

Results

The 90-day mortality rate was 6.7% among those with PHT and 2.1% among those without it (P<.001). Patients without PHT had a survival benefit over those with PHT at 1, 3, and 5 years (96% vs 90%, 75% vs 67%, and 54% vs 45%, respectively; P = .001). In contrast, PHT was not associated with worse short- or long-term survival when the analysis included only propensity score-matched pairs, patients with early-stage HCC, or patients who underwent minor hepatectomy (all P>.05). Moreover, recurrence rates were similar between the two groups. Consistent with our findings, all 9 studies identified in our literature search reported HR to be safe and effective for patients with HCC and PHT.
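The sketch below illustrates one generic way to build 1:1 nearest-neighbour propensity score matching with a logistic-regression score; the covariates, variable names and matching-with-replacement choice are assumptions for illustration only and are not the study's matching procedure.

```python
# Minimal sketch (illustrative only): 1:1 nearest-neighbour propensity score
# matching of PHT vs. non-PHT patients, with replacement, on simulated data.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(5)
n = 1738
df = pd.DataFrame({
    "pht":        rng.integers(0, 2, n),      # 1 = portal hypertension
    "age":        rng.normal(55, 10, n),      # hypothetical baseline covariates
    "tumor_size": rng.normal(5, 2, n),
    "child_pugh": rng.integers(5, 10, n),
})

X = df[["age", "tumor_size", "child_pugh"]]
df["pscore"] = LogisticRegression(max_iter=1000).fit(X, df["pht"]).predict_proba(X)[:, 1]

treated, control = df[df.pht == 1], df[df.pht == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
_, idx = nn.kneighbors(treated[["pscore"]])
matched_control = control.iloc[idx.ravel()]
print(len(treated), "PHT patients matched to", len(matched_control), "controls")
```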

Conclusions

HR is safe and effective in HCC patients with PHT and preserved liver function. This is especially true for patients who have early-stage HCC or who undergo minor hepatectomy.

18.

Introduction

Regional citrate anticoagulation (RCA) is gaining popularity in continuous renal replacement therapy (CRRT) for critically ill patients. The risk of citrate toxicity is a primary concern during this prolonged process. The aim of this study was to assess the pharmacokinetics of citrate in critically ill patients with AKI and to use the kinetic parameters to predict the risk of citrate accumulation in this population during continuous veno-venous hemofiltration (CVVH) with RCA.

Methods

Critically ill patients with AKI (n = 12) and healthy volunteers (n = 12) were investigated during infusion of comparable doses of citrate. Serial blood samples were taken before the infusion, during the 120-min infusion, and for up to 120 min afterwards. Citrate pharmacokinetics were calculated and compared between groups. The estimated kinetic parameters were then applied to the citrate kinetic equation for validation in ten additional patients' CVVH sessions with citrate anticoagulation.

Results

Total body clearance of citrate was similar in critically ill patients with AKI and healthy volunteers (648.04±347.00 mL/min versus 686.64±353.60 mL/min; P = 0.624). Basal and peak citrate concentrations were similar in both groups (p = 0.423 and 0.247, respectively). The predicted citrate curve showed an excellent fit to the measured values.
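For orientation, the sketch below shows a simple non-compartmental estimate of total body clearance as infused dose divided by the area under the concentration-time curve; the study likely used its own kinetic model, and the times, concentrations and dose here are made-up numbers.

```python
# Minimal sketch (illustrative only, not the study's model): clearance from
# dose / AUC, with AUC obtained by trapezoidal integration of baseline-corrected
# plasma citrate concentrations.
import numpy as np

t = np.array([0, 30, 60, 90, 120, 150, 180, 210, 240], dtype=float)   # minutes
conc = np.array([0.0, 0.4, 0.7, 0.9, 1.0, 0.6, 0.35, 0.2, 0.1])       # mmol/L above baseline
dose_mmol = 60.0                                                       # total citrate infused (hypothetical)

auc = np.sum((conc[1:] + conc[:-1]) / 2 * np.diff(t))   # trapezoidal AUC, mmol*min/L
clearance_l_min = dose_mmol / auc                        # L/min
print(f"AUC = {auc:.1f} mmol*min/L, CL = {clearance_l_min * 1000:.0f} mL/min")
```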

Conclusions

Citrate clearance is not impaired in critically ill patients with AKI in the absence of severe liver dysfunction. Citrate pharmacokinetic data can provide a basis for predicting the risk of citrate accumulation in clinical practice.

Trial Registration

ClinicalTrials.gov Identifier NCT00948558

19.

Background

The impact of dialysis modality on survival is still somewhat controversial. Given possible differences in patients’ characteristics and the cause and rate of death in different countries, the issue needs to be evaluated in Korean cohorts.

Methods

A nationwide prospective observational cohort study (NCT00931970) was performed to compare survival between peritoneal dialysis (PD) and hemodialysis (HD). A total of 1,060 end-stage renal disease patients in Korea who began dialysis between September 1, 2008 and June 30, 2011 were followed through December 31, 2011.

Results

The patients (PD, 30.6%; HD, 69.4%) were followed up for 16.3±7.9 months. PD patients were significantly younger and less likely to be diabetic, and had a lower body mass index and larger urine volume than HD patients. Infection was the most common cause of death. Multivariate Cox regression in the entire cohort revealed that PD tended to be associated with a lower risk of death than HD [hazard ratio (HR) 0.63, 95% confidence interval (CI) 0.36–1.08]. In propensity score-matched pairs (n = 278 in each modality), cumulative survival probabilities for PD and HD patients were 96.9% and 94.1% at 12 months (P = 0.152) and 94.3% and 87.6% at 24 months (P = 0.022), respectively. Patients on PD had a 51% lower risk of death than those on HD (HR 0.49, 95% CI 0.25–0.97).

Conclusions

PD was associated with superior survival compared with HD in the early period of dialysis, even after adjusting for differences in patient characteristics between the two modalities. Notably, the most common cause of death in this Korean cohort was infection.

20.

Background

Several studies have shown a prolonged or increased susceptibility to malaria in the post-partum period. A matched cohort study was conducted to prospectively evaluate the susceptibility to malaria of post-partum women in an area where P. falciparum and P. vivax are prevalent.

Methods

In an area of low seasonal malaria transmission on the Thai-Myanmar border, pregnant women attending antenatal clinics were matched to non-pregnant, non-post-partum controls and followed up prospectively until 12 weeks after delivery.

Results

Post-partum women (n = 744) experienced significantly fewer P. falciparum episodes than controls (hazard ratio (HR) 0.39 (95% CI 0.21–0.72), p = 0.003) but significantly more P. vivax episodes (HR 1.34 (1.05–1.72), p = 0.018). The reduced risk of falciparum malaria was accounted for by reduced exposure, whereas a history of P. vivax infection during pregnancy was a strong risk factor for P. vivax in post-partum women (HR 13.98 (9.13–21.41), p<0.001). After controlling for effect modification by history of P. vivax, post-partum women were not more susceptible to P. vivax than controls (HR 0.33 (0.21–0.51), p<0.001). Genotyping of pre- and post-partum infections (n = 10) showed that each post-partum P. falciparum infection was newly acquired.

Conclusions

In this area of low seasonal malaria transmission, post-partum women were less likely to develop falciparum malaria but more likely to develop vivax malaria than controls, explained by reduced exposure and an increased risk of relapse, respectively. There was no evidence of altered susceptibility to malaria in the post-partum period. The treatment of vivax malaria during and immediately after pregnancy needs to be improved.
