Similar Literature (20 articles found)
1.

Introduction

Low birthweight, which can be caused by inappropriate intrauterine growth or prematurity, is associated with development of gestational diabetes mellitus (GDM) as well as pre-eclampsia later in life, but the relative effects of prematurity and inappropriate intrauterine growth remain uncertain.

Methods

Through nationwide registries we identified all Danish mothers in the years 1989–2007. Two separate cohorts, consisting of mothers born 1974–1977 (n = 84219) and 1978–1981 (n = 32376), were studied because of different methods of registering birthweight and gestational age in the two periods. Data were linked with information on GDM, pre-eclampsia and education.

Results

In a multivariate logistic regression model, the odds of developing GDM were increased by 5–7% for each week the mother was born before term (p = 0.018 for 1974–1977, p = 0.048 for 1978–1981), while the odds were increased by 13–17% for each standard deviation (SD) reduction in birthweight for gestational age for those who were small or normal for gestational age (p<0.0001 and p = 0.035) and by 118–122% for each SD increase above the normal range (p<0.0001 and p = 0.024). The odds of pre-eclampsia were increased by 3–5% for each week the mother was born before term (p = 0.064 and p = 0.04), while the odds were increased by 11–12% for each SD reduction in birthweight for gestational age (p<0.0001 and p = 0.0002).
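For illustration only, a minimal sketch of the kind of multivariate logistic regression reported above, written in Python with simulated data and hypothetical column names (weeks_before_term, bw_sd_for_ga, gdm); it is not the authors' analysis code:

import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical cohort data; column names and effect sizes are illustrative only.
rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "weeks_before_term": rng.integers(0, 9, n),   # weeks born before term
    "bw_sd_for_ga": rng.normal(0, 1, n),          # birthweight SD score for gestational age
})
logit = -4.0 + 0.06 * df["weeks_before_term"] - 0.15 * df["bw_sd_for_ga"]
df["gdm"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(df[["weeks_before_term", "bw_sd_for_ga"]])
fit = sm.Logit(df["gdm"], X).fit(disp=0)

# Odds ratio per week born before term and per SD of birthweight for gestational age
print(np.exp(fit.params))
print(fit.pvalues)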

Conclusion

In this cohort of young Danish mothers, being born prematurely or with increasingly low birthweight for gestational age was associated with an increased risk of GDM and pre-eclampsia in adulthood, while increasingly high birthweight for gestational age was associated with an increased risk of GDM and a decreased risk of pre-eclampsia. Inappropriate weight for gestational age was a more important risk factor than prematurity.

2.

Background

Thymic stromal lymphopoietin (TSLP), an IL7-like cytokine produced by bronchial epithelial cells, is upregulated in asthma and induces dendritic cell maturation supporting a Th2 response. Environmental pollutants, including tobacco smoke and diesel exhaust particles, upregulate TSLP, suggesting that TSLP may be an interface between environmental pollution and immune responses in asthma. Since asthma is prevalent in urban communities, variants in the TSLP gene may be important in asthma susceptibility in these populations.

Objectives

To determine whether genetic variants in TSLP are associated with asthma in an urban admixed population.

Methodology and Main Results

Ten tag-SNPs in the TSLP gene were analyzed for association with asthma using 387 clinically diagnosed asthmatic cases and 212 healthy controls from an urban admixed population. One SNP (rs1898671) showed nominally significant association with asthma (odds ratio (OR) = 1.50; 95% confidence interval (95% CI): 1.09–2.05, p = 0.01) after adjusting for age, BMI, income, education and population stratification. Association results were consistent using two different approaches to adjust for population stratification. When stratified by smoking status, the same SNP showed a significantly increased risk associated with asthma in ex-smokers (OR = 2.00, 95% CI: 1.04–3.83, p = 0.04) but not in never-smokers (OR = 1.34; 95% CI: 0.93–1.94, p = 0.11). A haplotype-specific score test indicated that an elevated risk for asthma was associated with a specific haplotype of TSLP involving SNP rs1898671 (OR = 1.58, 95% CI: 1.10–2.27, p = 0.01). Association of this SNP with asthma was confirmed in an independent large population-based cohort consortium study (OR = 1.15, 95% CI: 1.07–1.23, p = 0.0003) and the results stratified by smoking status were also validated (ex-smokers: OR = 1.21, 95% CI: 1.08–1.34, p = 0.003; never-smokers: OR = 1.06, 95% CI: 0.94–1.17, p = 0.33).

Conclusions

Genetic variants in TSLP may contribute to asthma susceptibility in admixed urban populations, with a gene-environment interaction.

3.

Background

The clinical and scientific use of patient-reported outcome measures is increasing in the health services. Paper forms are often used. Manual double entry of data is considered the gold standard for transferring data to an electronic format, but the process is laborious. Automated forms processing may be an alternative, but further validation is warranted.

Methods

200 patients were randomly selected from a cohort of 5777 patients who had previously answered two different questionnaires. The questionnaires were scanned using an automated forms processing technique, as well as processed by single and double manual data entry, using the EpiData Entry data entry program. The main outcome measure was the proportion of correctly entered numbers at question, form and study level.

Results

Manual double-key data entry (error proportion per 1000 fields = 0.046; 95% CI: 0.001–0.258) performed better than single-key data entry (error proportion per 1000 fields = 0.370; 95% CI: 0.160–0.729; p = 0.020). There was no statistical difference between Optical Mark Recognition (error proportion per 1000 fields = 0.046; 95% CI: 0.001–0.258) and double-key data entry (p = 1.000). With the Intelligent Character Recognition method, there was no statistical difference compared to single-key data entry (error proportion per 1000 fields = 6.734; 95% CI: 0.817–24.113; p = 0.656) or to double-key data entry (error proportion per 1000 fields = 3.367; 95% CI: 0.085–18.616; p = 0.319).
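As a sketch of how an error proportion per 1000 fields and its exact 95% CI can be computed (hypothetical error and field counts; the authors' exact procedure is not reproduced here):

from statsmodels.stats.proportion import proportion_confint

def error_rate_per_1000(n_errors, n_fields):
    """Error proportion per 1000 fields with a Clopper-Pearson exact 95% CI."""
    low, high = proportion_confint(n_errors, n_fields, alpha=0.05, method="beta")
    return n_errors / n_fields * 1000, low * 1000, high * 1000

# Hypothetical counts for two entry methods
print(error_rate_per_1000(1, 21600))   # e.g. double-key entry
print(error_rate_per_1000(8, 21600))   # e.g. single-key entry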

Conclusions

Automated forms processing is a valid alternative to double manual data entry for highly structured forms containing only check boxes, numerical codes and no dates. Automated forms processing can be superior to single manual data entry through a data entry program, depending on the method chosen.

4.
Feng JY  Su WJ  Chiu YC  Huang SF  Lin YY  Huang RM  Lin CH  Hwang JJ  Lee JJ  Yu MC  Yu KW  Lee YC 《PloS one》2011,6(9):e23715

Background

Despite effective anti-TB treatments, tuberculosis remains a serious threat to public health and is associated with high mortality. Old age and multiple co-morbidities are known risk factors for death. The association of clinical presentations with mortality in pulmonary tuberculosis patients remains an issue of controversy.

Methods

This prospective observational study enrolled newly diagnosed, culture-proven pulmonary tuberculosis patients from five medical centers and one regional hospital, which were referral hospitals for TB patients. Radiographic findings and clinical symptoms were determined at the time of diagnosis. Patients who died of any cause during the course of anti-TB treatment were defined as mortality cases, and death that occurred within 30 days of initiating treatment was defined as early mortality. Clinical factors associated with overall mortality and early mortality were investigated.

Results

A total of 992 patients were enrolled and 195 (19.7%) died. Nearly one-third (62/195, 31.8%) of the deaths occurred before or within 30 days of treatment initiation. Older age (RR = 1.04, 95%CI: 1.03–1.05), malignancy (RR = 2.42, 95%CI: 1.77–3.31), renal insufficiency (RR = 1.77, 95%CI: 1.12–2.80), presence of chronic cough (RR = 0.63, 95%CI: 0.47–0.84), fever (RR = 1.45, 95%CI: 1.09–1.94), and anorexia (RR = 1.49, 95%CI: 1.07–2.06) were independently associated with overall mortality. Kaplan-Meier survival analysis demonstrated significantly higher mortality in patients presenting with fever (p<0.001), anorexia (p = 0.005), and without chronic cough (p<0.001). Among patients who died, those with respiratory symptoms of chronic cough (RR = 0.56, 95%CI: 0.33–0.98) and dyspnea (HR = 0.51, 95%CI: 0.27–0.98) were less likely to experience early mortality. The radiological features were comparable between survivors and non-survivors.
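A sketch of the Kaplan-Meier and proportional-hazards steps named above, using the lifelines package on simulated data with hypothetical column names (time_days, died, fever, age); it is not the study's actual analysis:

import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({
    "age":   rng.integers(30, 90, n),
    "fever": rng.integers(0, 2, n),
})
# Hypothetical event times: hazard rises with age and with fever at presentation
hazard = 0.002 * np.exp(0.03 * (df["age"] - 60) + 0.4 * df["fever"])
t = rng.exponential(1 / hazard)
df["died"] = (t <= 365).astype(int)          # administrative censoring at one year
df["time_days"] = np.minimum(t, 365)

# Kaplan-Meier curves stratified by fever at presentation
kmf = KaplanMeierFitter()
for label, grp in df.groupby("fever"):
    kmf.fit(grp["time_days"], grp["died"], label=f"fever={label}")
    print(label, kmf.median_survival_time_)

# Cox proportional-hazards model; exp(coef) gives the adjusted hazard ratios
cph = CoxPHFitter()
cph.fit(df[["time_days", "died", "age", "fever"]],
        duration_col="time_days", event_col="died")
cph.print_summary()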

Conclusions

In addition to demographic characteristics, clinical presentations, including the presence of fever, anorexia, and the absence of chronic cough, were also independent predictors of on-treatment mortality in pulmonary tuberculosis patients.

5.

Background

Oncogenic BRAF mutations have been found in diverse malignancies and activate RAF/MEK/ERK signaling, a critical pathway of tumorigenesis. We examined the clinical characteristics and outcomes of patients with mutant (mut) BRAF advanced cancer referred to a phase 1 clinic.

Methods

We reviewed the records of 80 consecutive patients with mutBRAF advanced malignancies and 149 with wild-type (wt) BRAF (matched by tumor type) referred to the Clinical Center for Targeted Therapy and analyzed their outcome.

Results

Of 80 patients with mutBRAF advanced cancer, 56 had melanoma, 10 colorectal, 11 papillary thyroid, 2 ovarian and 1 esophageal cancer. Mutations in codon 600 were found in 77 patients (62, V600E; 13, V600K; 1, V600R; 1, unreported). Multivariate analysis showed less soft tissue (Odds ratio (OR) = 0.39, 95%CI: 0.20–0.77, P = 0.007), lung (OR = 0.38, 95%CI: 0.19–0.73, p = 0.004) and retroperitoneal metastases (OR = 0.34, 95%CI: 0.13–0.86, p = 0.024) and more brain metastases (OR = 2.05, 95%CI: 1.02–4.11, P = 0.043) in patients with mutBRAF versus wtBRAF. Compared to their wtBRAF counterparts, mutBRAF melanoma patients had a nonsignificant trend toward longer median survival from diagnosis (131 vs. 78 months, p = 0.14), while mutBRAF colorectal cancer patients had a nonsignificant trend toward shorter median survival from diagnosis (48 vs. 53 months, p = 0.22). In melanoma, V600K mutations in comparison to other BRAF mutations were associated with more frequent brain (75% vs. 36.3%, p = 0.02) and lung metastases (91.6% vs. 47.7%, p = 0.007), and shorter time from diagnosis to metastasis and to death (19 vs. 53 months, p = 0.046, and 78 vs. 322 months, p = 0.024, respectively). Treatment with RAF/MEK targeting agents (Hazard ratio (HR) = 0.16, 95%CI: 0.03–0.89, p = 0.037) and any decrease in tumor size after referral (HR = 0.07, 95%CI: 0.015–0.35, p = 0.001) correlated with longer survival in mutBRAF patients.

Conclusions

BRAF appears to be a druggable mutation that also defines subgroups of patients with phenotypic overlap, albeit with differences that correlate with histology or site of mutation.

6.

Background

The change in malaria case-management policy in Kenya, recommending universal parasitological diagnosis and targeted treatment with artemether-lumefantrine (AL), is supported by activities aiming at universal coverage of and adherence to the recommendations by 2013. We evaluated changes in health systems and case-management indicators between the baseline survey undertaken before implementation of the policy and the follow-up survey following the first year of the implementation activities.

Methods/Findings

National, cross-sectional surveys using quality-of-care methods were undertaken at public facilities. Baseline and follow-up surveys respectively included 174 and 176 facilities, 224 and 237 health workers, and 2,405 and 1,456 febrile patients. Health systems indicators showed variable changes between surveys: AL stock-out (27% to 21%; p = 0.152); availability of diagnostics (55% to 58%; p = 0.600); training on the new policy (0 to 22%; p = 0.001); exposure to supervision (18% to 13%; p = 0.156) and access to guidelines (0 to 6%; p = 0.001). At all facilities, there was an increase in patients tested for malaria (24% to 31%; p = 0.090) and in those who were both tested and treated according to test result (16% to 22%; p = 0.048). At facilities with AL and malaria diagnostics, testing increased from 43% to 50% (p = 0.196) while patients who were both tested and treated according to test result increased from 28% to 36% (p = 0.114). Treatment adherence improved for test positive patients from 83% to 90% (p = 0.150) and for test negative patients from 47% to 56% (p = 0.227). No association was found between testing and exposure to training, supervision and guidelines; however, testing was significantly associated with facility ownership, type of testing, and patients' caseload, age and clinical presentation.

Conclusions

Most of the case-management indicators showed some improvement; however, the differences were smaller than expected, rarely statistically significant, and still left a substantial gap relative to the optimistic targets. The quantitative and qualitative improvement of interventions will ultimately determine the success of the new policy.

7.

Background

Low adherence to multidrug therapy against leprosy (MDT) is still an important obstacle to disease control and may lead to remaining sources of infection, incomplete cure, irreversible complications, and multidrug resistance.

Methodology/Principal Finding

We performed a population-based study in 78 municipalities in Tocantins State, central Brazil, and applied structured questionnaires on leprosy-affected individuals. We used two outcomes for assessment of risk factors: defaulting (not presenting to health care center for supervised treatment for >12 months); and interruption of MDT. In total, 28/936 (3.0%) patients defaulted, and 147/806 (18.2%) interrupted MDT. Defaulting was significantly associated with: low number of rooms per household (OR = 3.43; 0.98–9.69; p = 0.03); moving to another residence after diagnosis (OR = 2.90; 0.95–5.28; p = 0.04); and low family income (OR = 2.42; 1.02–5.63; p = 0.04). Interruption of treatment was associated with: low number of rooms per household (OR = 1.95; 0.98–3.70; p = 0.04); difficulty in swallowing MDT drugs (OR = 1.66; 1.03–2.63; p = 0.02); temporal non-availability of MDT at the health center (OR = 1.67; 1.11–2.46; p = 0.01); and moving to another residence (OR = 1.58; 95% confidence interval: 1.03–2.40; p = 0.03). Logistic regression identified temporal non-availability of MDT as an independent risk factor for treatment interruption (adjusted OR = 1.56; 1.05–2.33; p = 0.03), and residence size as a protective factor (adjusted OR = 0.89 per additional number of rooms; 0.80–0.99; p = 0.03). Residence size was also independently associated with defaulting (adjusted OR = 0.67; 0.52–0.88; p = 0.003).

Conclusions

Defaulting and interruption of MDT are associated with some poverty-related variables such as family income, household size, and migration. Intermittent problems of drug supply need to be resolved, mainly on the municipality level. MDT producers should consider oral drug formulations that may be more easily accepted by patients. Thus, an integrated approach is needed for further improving control, focusing on vulnerable population groups and the local health system.

8.

Objectives

To assess the association of gender with clinical expression, health-related quality of life (HRQoL), disability, and self-reported symptoms of depression and anxiety in patients with systemic sclerosis (SSc).

Methods

SSc patients fulfilling the American College of Rheumatology and/or the Leroy and Medsger criteria were assessed for clinical symptoms, disability, HRQoL, and self-reported symptoms of depression and anxiety using specific measurement scales.

Results

Overall, 381 SSc patients (62 males) were included. Mean age and disease duration at the time of evaluation were 55.9 (13.3) and 9.5 (7.8) years, respectively. One hundred and forty-nine (40.4%) patients had diffuse cutaneous SSc (dcSSc). On bivariate analysis, differences were observed between males and females for clinical symptoms and self-reported symptoms of depression and anxiety, though without reaching statistical significance. Indeed, a trend was found for higher body mass index (BMI) (25.0 [4.1] vs 23.0 [4.5], p = 0.013), more frequent dcSSc, echocardiographic systolic pulmonary artery pressure (PAP) >35 mmHg and interstitial lung disease in males than females (54.8% vs 37.2%, p = 0.010; 24.2% vs 10.5%, p = 0.003; and 54.8% vs 41.2%, p = 0.048, respectively), whereas calcinosis and self-reported anxiety symptoms tended to be more frequent in females than males (36.0% vs 21.4%, p = 0.036, and 62.3% vs 43.5%, p = 0.006, respectively). On multivariate analysis, BMI, echocardiographic PAP >35 mmHg, and anxiety were the variables most closely associated with gender.

Conclusions

In SSc patients, male gender tends to be associated with diffuse disease and female gender with calcinosis and self-reported symptoms of anxiety. Disease-associated disability and HRQoL were similar in both groups.

9.

Background

Hypercapnic Chronic Obstructive Pulmonary Disease (COPD) exacerbation in patients with comorbidities and multidrug therapy is complicated by mixed acid-base, hydro-electrolyte and lactate disorders. The aim of this study was to determine the relationships of these disorders with the requirement for and duration of noninvasive ventilation (NIV) when treating hypercapnic respiratory failure.

Methods

Sixty-seven consecutive patients who were hospitalized for hypercapnic COPD exacerbation had their clinical condition, respiratory function, blood chemistry, arterial blood gases, blood lactate and volemic state assessed. Heart and respiratory rates, pH, PaO2 and PaCO2 and blood lactate were checked at the 1st, 2nd, 6th and 24th hours after starting NIV.

Results

Nine patients were transferred to the intensive care unit. NIV was performed in 11/17 (64.7%) mixed respiratory acidosis–metabolic alkalosis, 10/36 (27.8%) respiratory acidosis and 3/5 (60%) mixed respiratory-metabolic acidosis patients (p = 0.026), with durations of 45.1±9.8, 36.2±8.9 and 53.3±4.1 hours, respectively (p = 0.016). The duration of ventilation was associated with higher blood lactate (p<0.001), lower pH (p = 0.016), lower serum sodium (p = 0.014) and lower chloride (p = 0.038). Hyponatremia without hypervolemic hypochloremia occurred in 11 respiratory acidosis patients. Hypovolemic hyponatremia with hypochloremia and hypokalemia occurred in 10 mixed respiratory acidosis–metabolic alkalosis patients, and euvolemic hypochloremia occurred in the other 7 patients with this mixed acid-base disorder.

Conclusions

Mixed acid-base and lactate disorders during hypercapnic COPD exacerbations predict the need for and longer duration of NIV. The combination of mixed acid-base disorders and hydro-electrolyte disturbances should be further investigated.

10.

Background

Multiple sclerosis (MS) patients with breakthrough disease on immunomodulatory drugs are frequently offered a switch to natalizumab or immunosuppressants. The effect of natalizumab monotherapy in patients with breakthrough disease is unknown.

Methods

This is an open-label retrospective cohort study of 993 patients seen at least four times at the University of California San Francisco MS Center; 95 had breakthrough disease on first-line therapy (60 patients switched to natalizumab, 22 to immunosuppressants and 13 declined the switch [non-switchers]). We used Poisson regression adjusted for potential confounders to compare the relapse rate within and across groups before and after the switch.
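A minimal sketch of a Poisson regression for relapse rates with a person-years exposure offset, in the spirit of the method described above (simulated data; the full set of confounders is reduced to age here, and all names are hypothetical):

import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical per-patient relapse counts before and after switching therapy
rng = np.random.default_rng(2)
n = 200
df = pd.DataFrame({
    "period_after": np.repeat([0, 1], n // 2),   # 0 = before switch, 1 = after switch
    "years":        rng.uniform(1, 3, n),        # person-years of follow-up
    "age":          rng.integers(20, 60, n),
})
rate = np.exp(-0.2 - 1.0 * df["period_after"])   # true relapse rate drops after the switch
df["relapses"] = rng.poisson(rate * df["years"])

X = sm.add_constant(df[["period_after", "age"]])
fit = sm.GLM(df["relapses"], X, family=sm.families.Poisson(),
             exposure=df["years"]).fit()

# exp(coef) for period_after = relapse-rate ratio after vs. before the switch
print(np.exp(fit.params["period_after"]))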

Results

In the within-group analyses, the relapse rate decreased by 70% (95% CI 50,82%; p<0.001) in switchers to natalizumab and by 77% (95% CI 59,87%; p<0.001) in switchers to immunosuppressants; relapse rate in non-switchers did not decrease (6%, p = 0.87). Relative to the reduction among non-switchers, the relapse rate was reduced by 68% among natalizumab switchers (95% CI 19,87%; p = 0.017) and by 76% among the immunosuppressant switchers (95% CI 36,91%; p = 0.004).

Conclusions

Switching to natalizumab or immunosuppressants in patients with breakthrough disease is effective in reducing clinical activity of relapsing MS. The magnitude of the effect and the risk-benefit ratio should be evaluated in randomized clinical trials and prospective cohort studies.

11.

Background

Studies indicate that acquired deficits negatively affect patients' self-reported health-related quality of life (HRQOL) and survival, but the impact of HRQOL deterioration after surgery on survival has not been explored.

Objective

To assess whether change in HRQOL after surgery is a predictor of survival in patients with glioblastoma.

Methods

Sixty-one patients with glioblastoma were included. The majority of patients (n = 56, 91.8%) were operated using a neuronavigation system which utilizes 3D preoperative MRI and updated intraoperative 3D ultrasound volumes to guide resection. HRQOL was assessed using EuroQol 5D (EQ-5D), a generic instrument. HRQOL data were collected 1–3 days preoperatively and after 6 weeks. The mean change in EQ-5D index was −0.05 (95% CI −0.15–0.05) 6 weeks after surgery (p = 0.285). There were 30 patients (49.2%) reporting deterioration 6 weeks after surgery. In a Cox multivariate survival analysis we evaluated deterioration in HRQOL after surgery together with established risk factors (age, preoperative condition, radiotherapy, temozolomide and extent of resection).

Results

There were significant independent associations between survival and use of temozolomide (HR 0.30, p = 0.019), radiotherapy (HR 0.26, p = 0.030), and deterioration in HRQOL after surgery (HR 2.02, p = 0.045). Inclusion of surgically acquired deficits in the model did not alter the conclusion.

Conclusion

Early deterioration in HRQOL after surgery is independently and markedly associated with impaired survival in patients with glioblastoma. Deterioration in patient reported HRQOL after surgery is a meaningful outcome in surgical neuro-oncology, as the measure reflects both the burden of symptoms and treatment hazards and is linked to overall survival.

12.

Objective

Sirtuins (SIRTs) and mitochondrial uncoupling proteins (UCPs) have been implicated in cardiovascular diseases through the control of reactive oxygen species production. This study sought to investigate the association between genetic variants in the SIRT and UCP genes and carotid plaque.

Methods

In a group of 1018 stroke-free subjects from the Northern Manhattan Study with high-definition carotid ultrasonography and genotyping, we investigated the associations of 85 single nucleotide polymorphisms (SNPs) in the 11 SIRT and UCP genes with the presence and number of carotid plaques, and evaluated interactions of SNPs with sex, smoking, diabetes and hypertension as well as interactions between SNPs significantly associated with carotid plaque.

Results

Overall, 60% of subjects had carotid plaques. After adjustment for demographic and vascular risk factors, T-carriers of the SIRT6 SNP rs107251 had an increased risk for carotid plaque (odds ratio, OR = 1.71, 95% CI = 1.23–2.37, Bonferroni-corrected p = 0.03) and for the number of plaques (rate ratio, RR = 1.31, 1.18–1.45, Bonferroni-corrected p = 1.4×10−5), whereas T-carriers of the UCP5 SNP rs5977238 had a decreased risk for carotid plaque (OR = 0.49, 95% CI = 0.32–0.74, Bonferroni-corrected p = 0.02) and plaque number (RR = 0.64, 95% CI = 0.52–0.78, Bonferroni-corrected p = 4.9×10−4). Some interactions with a nominal p≤0.01 were found between sex and SNPs in the UCP1 and UCP3 genes; between smoking, diabetes, hypertension and SNPs in UCP5 and SIRT5; and between SNPs in the UCP5 gene and the UCP1, SIRT1, SIRT3, SIRT5, and SIRT6 genes in association with plaque phenotypes.
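For reference, the Bonferroni adjustment reported above simply multiplies each raw p-value by the number of tests and caps the result at 1; a small sketch with hypothetical p-values:

from statsmodels.stats.multitest import multipletests

n_tests = 85                                 # SNPs examined in the study
raw_p = [3.5e-4, 2.4e-4, 0.004, 0.03]        # hypothetical raw p-values

# Bonferroni by hand: multiply each p-value by the number of tests, cap at 1
adj_manual = [min(p * n_tests, 1.0) for p in raw_p]

# multipletests applies the same rule across whatever list it is given
reject, adj_auto, _, _ = multipletests(raw_p, alpha=0.05, method="bonferroni")

print(adj_manual)   # corrected for all 85 tests
print(adj_auto)     # corrected only for the 4 values passed in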

Conclusion

We observed significant associations between genetic variants in the SIRT6 and UCP5 genes and atherosclerotic plaque. We also found potential effect modifications by sex, smoking and vascular risk factors of the SIRT/UCP genes in the associations with atherosclerotic plaque. Further studies are needed to validate our observations.

13.

Rationale

Contamination by bacterial or fungal organisms reduces the effectiveness of mycobacterial culture for diagnosis of pulmonary tuberculosis (TB). We evaluated the effect of an anti-microbial and an anti-fungal oral rinse prior to expectoration on culture-contamination rates.

Methods

We enrolled a consecutive random sample of adults with cough for ≥2 weeks and suspected TB admitted to Mulago Hospital (Kampala, Uganda) between October 2008 and June 2009. We randomly assigned patients to oral rinse (60 seconds with chlorhexidine followed by 60 seconds with nystatin) vs. no oral rinse prior to initial sputum collection. Uganda National Tuberculosis Reference Laboratory technicians blinded to the method of sputum collection (with or without oral rinse) processed all sputum specimens for smear microscopy (direct Ziehl-Neelsen) and mycobacterial culture (Lowenstein-Jensen media).

Results

Of 220 patients enrolled, 177 (80%) were HIV-seropositive (median CD4 count 37 cells/µL, IQR 13–171 cells/µL). Baseline characteristics were similar between patients in the oral-rinse (N = 110) and no oral-rinse (N = 110) groups. The proportion of contaminated cultures was significantly lower in the oral-rinse group compared to the no oral-rinse group (4% vs. 15%, risk difference −11%, 95% CI −18 to −3%, p = 0.005). Oral rinse significantly reduced the proportion of contaminated cultures among HIV-infected patients (3% vs. 18%, risk difference −14%, 95% CI −23 to −6%, p = 0.002) but not HIV-uninfected (6% vs. 4%, risk difference 2%, 95% CI −12 to +15%, p = 0.81) patients. However, the proportions of smear-positive specimens (25% vs. 35%, p = 0.10) and culture-positive specimens (48% vs. 56%, p = 0.24) were lower in the oral-rinse compared to the no oral-rinse group, although the differences were not statistically significant.
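A sketch of a risk-difference calculation with a normal-approximation (Wald) 95% CI; the counts below are hypothetical and the paper's exact interval method is not stated:

import numpy as np
from scipy.stats import norm

def risk_difference(x1, n1, x2, n2, alpha=0.05):
    """Risk difference p1 - p2 with a Wald 95% CI."""
    p1, p2 = x1 / n1, x2 / n2
    rd = p1 - p2
    se = np.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    z = norm.ppf(1 - alpha / 2)
    return rd, rd - z * se, rd + z * se

# Hypothetical counts: 4/110 contaminated with oral rinse vs. 16/110 without
print(risk_difference(4, 110, 16, 110))   # roughly -0.11 (-0.18, -0.03)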

Conclusions

Oral rinse prior to sputum expectoration is a promising strategy to reduce mycobacterial culture contamination in areas with high HIV prevalence, if strategies can be devised to reduce the adverse impact of oral rinse on smear- and culture-positivity.

14.

Background

Myocardial perfusion imaging (MPI) can detect myocardial perfusion abnormalities, but many examinations are without pathological findings. This study examines whether circulating biomarkers can be used as a screening modality prior to MPI.

Methodology/Principal Findings

243 patients with an intermediate risk of coronary artery disease (CAD), or with known CAD and renewed suspicion of ischemia, were referred to MPI. Blood samples were analyzed for the N-terminal fragment of the prohormone brain natriuretic peptide (NT-proBNP), YKL-40, IL-6, matrix metalloproteinase 9 (MMP-9) and high-sensitivity C-reactive protein (hsCRP). Patients with myocardial perfusion defects had elevated levels of NT-proBNP (p<0.0001), YKL-40 (p = 0.03) and IL-6 (p = 0.03), but not of hsCRP (p = 0.58) or MMP-9 (p = 0.14). The NT-proBNP increase was observed in both genders (p<0.0001), whereas YKL-40 (p = 0.005) and IL-6 (p = 0.02) were elevated only in men. An NT-proBNP cut-off concentration of 25 ng/l predicted a normal MPI with a negative predictive value >95%, regardless of existing CAD.
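A sketch of how a negative predictive value at a biomarker cut-off is obtained, using hypothetical NT-proBNP values and MPI results rather than the study data:

import pandas as pd

# Hypothetical patients: NT-proBNP in ng/l and whether MPI showed a perfusion defect
df = pd.DataFrame({
    "nt_probnp":  [12, 18, 60, 30, 40, 110, 80, 22, 95, 55],
    "mpi_defect": [0,  0,  1,  0,  1,  1,   1,  0,  1,  1],
})

cutoff = 25  # ng/l
below = df[df["nt_probnp"] < cutoff]

# NPV: proportion of test-negative (below cut-off) patients with a normal MPI
npv = (below["mpi_defect"] == 0).mean()
spared = len(below) / len(df)                 # share of MPI scans that could be avoided
print(f"NPV = {npv:.0%}, scans potentially spared = {spared:.0%}")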

Conclusions

20–25% of patients suspected of CAD could have been spared an MPI by using an NT-proBNP cut-off concentration of 25 ng/l with a negative predictive value >95%. NT-proBNP therefore has potential as a screening marker for CAD before referral of the patient to MPI.

15.

Objectives

We hypothesised that assessment of plasma C-terminal pro-endothelin-1 (CT-proET-1), a stable endothelin-1 precursor fragment, is of prognostic value in patients with chronic heart failure (CHF), beyond other prognosticators, including N-terminal pro-B-type natriuretic peptide (NT-proBNP).

Methods

We examined 491 patients with systolic CHF (age: 63±11 years, 91% men, New York Heart Association [NYHA] class [I/II/III/IV]: 9%/45%/38%/8%, 69% ischemic etiology). Plasma CT-proET-1 was detected using a chemiluminescence immunoassay.

Results

Increasing CT-proET-1 was a predictor of increased cardiovascular mortality at 12 months of follow-up (standardized hazard ratio 1.42, 95% confidence interval [CI] 1.04–1.95, p = 0.03) after adjusting for NT-proBNP, left ventricular ejection fraction (LVEF), age, creatinine, and NYHA class. In receiver operating characteristic curve analysis, areas under the curve for 12-month follow-up were similar for CT-proET-1 and NT-proBNP (p = 0.40). Both NT-proBNP and CT-proET-1 added prognostic value to a base model that included LVEF, age, creatinine, and NYHA class. Adding CT-proET-1 to the base model had stronger prognostic power (p<0.01) than adding NT-proBNP (p<0.01). Adding CT-proET-1 to NT-proBNP in this model yielded further prognostic information (p = 0.02).
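A sketch of the ROC step: computing areas under the curve for the two biomarkers against 12-month cardiovascular death on simulated data (the formal test for comparing correlated AUCs used in the paper is not reproduced here):

import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
n = 491
died_12m = rng.binomial(1, 0.15, n)                 # hypothetical 12-month CV death indicator

# Hypothetical biomarker values, higher on average in non-survivors
ct_proet1 = rng.normal(70 + 15 * died_12m, 20)
nt_probnp = rng.normal(1500 + 900 * died_12m, 800)

print("AUC CT-proET-1:", roc_auc_score(died_12m, ct_proet1))
print("AUC NT-proBNP: ", roc_auc_score(died_12m, nt_probnp))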

Conclusions

Plasma CT-proET-1 constitutes a novel predictor of increased 12-month cardiovascular mortality in patients with CHF. High CT-proET-1 together with high NT-proBNP makes it possible to identify patients with CHF and particularly unfavourable outcomes.

16.
17.

Background

Modification of ritonavir-boosted lopinavir (LPV/r)-based antiretroviral therapy is required for HIV-infected children co-treated for tuberculosis (TB). We aimed to determine virologic and toxicity outcomes among TB/HIV co-treated children with the following modifications to their antiretroviral therapy (ART): (1) super-boosted LPV/r, (2) double-dose LPV/r or (3) ritonavir.

Methods and Findings

A medical record review was conducted at two clinical sites in Johannesburg, South Africa. The records of children 6–24 months of age initiating LPV/r-based therapy were reviewed. Children co-treated for TB were categorized based on the modifications made to their ART regimen and were compared to children of the same age at each site not treated for TB. In total, 526 children were included, 294 (56%) of whom were co-treated for TB. All co-treated children had more severe HIV disease, including lower CD4 percentages and worse growth indicators, than comparisons. Children in the super-boosted group (n = 156) were as likely to be virally suppressed (<400 copies/ml) at 6 months as comparisons (69.2% vs. 74.8%, p = 0.36). Children in the double-dose (n = 47) and ritonavir groups (n = 91) were significantly less likely to be virally suppressed at 6 months (53.1% and 49.3%) than comparisons (74.8% and 82.1%; p = 0.02 and p<0.0001, respectively). At 12 months, only children in the ritonavir group still had lower rates of virological suppression relative to comparisons (63.9% vs 83.3%, p<0.05). Grade 1 or greater ALT elevations were more common in the super-boosted (75%) than double-dose (54.6%) or ritonavir (33.9%) groups (p = 0.09 and p<0.0001), but grade 3/4 elevations were observed in 3 (13.6%) of the super-boosted, 7 (15.9%) of the double-dose and 5 (8.9%) of the ritonavir group (p = 0.81 and p = 0.29).

Conclusion

Good short-term virologic outcomes were achieved in children co-treated for TB and HIV who received super-boosted LPV/r. Treatment-limiting toxicity was rare. Strategies for increased dosing of LPV/r with TB treatment warrant further investigation.

18.

Background

The preoperative extent of heterotopic ossification (HO) is usually one of the main criteria used to predict recurrence before excision. Brooker et al. built a radiologic scale to assess this preoperative extent around the hip. The aim of this study was to investigate the relationship between the recurrence risk after hip HO excision in Traumatic Brain Injury (TBI) and Spinal Cord Injury (SCI) patients and the preoperative extent of HO.

Methodology/Principal Findings

This was a case-control study including TBI or SCI patients following surgery for troublesome hip HO, with (cases, n = 19) or without (controls, n = 76) recurrence. Matching criteria were sex, pathology (SCI or TBI) and age at the time of surgery (±4.5 years). For each etiology (TBI and SCI), the residual cognitive and functional status (Garland classification), the preoperative extent (Brooker status), the modified radiological and functional status (GCG-BD classification), HO localization, side, mean age at the time of the CNS damage, mean delay to the first HO surgery and, for the case series, the mean operative delay for recurrence after the first surgical intervention were noted.

Conclusions/Significance

The median delay to first HO surgery was 38.6 months (range 4.5 to 414.5) for the case subgroup and 17.6 months (range 5.7 to 339.6) for the control group. No significant link was found between recurrence and operative delay (p = 0.51), the location around the joint (p = 0.07), or the Brooker (p = 0.52) or GCG-BD status (p = 0.79). Including all the matching factors, no significant relationship was found between the HO recurrence risk and the preoperative extent of troublesome hip HO using Brooker status (OR = 1.56; 95% CI: 0.47–5.19) or GCG-BD status (OR class 3 versus 2 = 0.67; 95% CI: 0.11–4.24, and OR class 4 versus 2 = 0.79; 95% CI: 0.09–6.91). Until the pathophysiology of HO development is understood, it will be difficult to create tools that can predict HO recurrence.

19.
Yoo DE  Park JT  Oh HJ  Kim SJ  Lee MJ  Shin DH  Han SH  Yoo TH  Choi KH  Kang SW 《PloS one》2012,7(1):e30072

Background

The effect of glycemic control after starting peritoneal dialysis (PD) on the survival of diabetic PD patients has largely been unexplored, especially in Asian populations.

Methods

We conducted a prospective observational study, in which 140 incident PD patients with diabetes were recruited. Patients were divided into tertiles according to the means of quarterly HbA1C levels measured during the first year after starting PD. We examined the association between HbA1C and all-cause mortality using Cox proportional hazards models.
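A sketch of splitting patients into HbA1C tertiles and fitting a Cox proportional hazards model for all-cause mortality, using lifelines and simulated data (hypothetical column names; not the authors' code):

import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 140
df = pd.DataFrame({"hba1c": rng.normal(7.3, 1.2, n)})

# Hypothetical follow-up: hazard of death rises with mean HbA1C in the first PD year
hazard = 0.05 * np.exp(0.4 * (df["hba1c"] - 7.3))
t = rng.exponential(1 / hazard)
df["died"] = (t <= 9.5).astype(int)              # administrative censoring at 9.5 years
df["years"] = np.minimum(t, 9.5)

# Tertiles of first-year HbA1C, with the lowest tertile (T1) as reference
df["tertile"] = pd.qcut(df["hba1c"], 3, labels=["T1", "T2", "T3"])
dummies = pd.get_dummies(df["tertile"], drop_first=True).astype(float)

cph = CoxPHFitter()
cph.fit(pd.concat([df[["years", "died"]], dummies], axis=1),
        duration_col="years", event_col="died")
cph.print_summary()                               # exp(coef) = HR of T2 and T3 vs. T1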

Results

The mean age was 58.7 years, 59.3% were male, and the mean follow-up duration was 3.5 years (range 0.4–9.5 years). The mean HbA1C levels were 6.3%, 7.1%, and 8.5% in the 1st, 2nd, and 3rd tertiles, respectively. Compared to the 1st tertile, the all-cause mortality rates were higher in the 2nd [hazard ratio (HR), 4.16; 95% confidence interval (CI), 0.91–18.94; p = 0.065] and significantly higher in the 3rd (HR, 13.16; 95% CI, 2.67–64.92; p = 0.002) tertiles (p for trend = 0.005), after adjusting for confounding factors. Cardiovascular mortality, however, did not differ significantly among the tertiles (p for trend = 0.682). In contrast, non-cardiovascular deaths, most of which were caused by infection, were more frequent in the 2nd (HR, 7.67; 95% CI, 0.68–86.37; p = 0.099) and the 3rd (HR, 51.24; 95% CI, 3.85–681.35; p = 0.003) tertiles than the 1st tertile (p for trend = 0.007).

Conclusions

Poor glycemic control is associated with high mortality rates in diabetic PD patients, suggesting that better glycemic control may improve the outcomes of these patients.

20.

Background

Cardiac allograft vasculopathy (CAV) is the principal cause of long-term graft failure following heart transplantation. Early identification of patients at risk of CAV is essential to target invasive follow-up procedures more effectively and to establish appropriate therapies. We evaluated the prognostic value of the first heart biopsy (median: 9 days post-transplant) versus all biopsies obtained within the first three months for the prediction of CAV and graft failure due to CAV.

Methods and Findings

In a prospective cohort study, we developed multivariate regression models evaluating markers of atherothrombosis (fibrin, antithrombin and tissue plasminogen activator [tPA]) and endothelial activation (intercellular adhesion molecule-1) in serial biopsies obtained during the first three months post-transplantation from 172 patients (median follow-up = 6.3 years; min = 0.37 years, max = 16.3 years). Presence of fibrin was the dominant predictor in first-biopsy models (Odds Ratio [OR] for one- and 10-year graft failure due to CAV = 38.70, p = 0.002, 95% CI = 4.00–374.77; and 3.99, p = 0.005, 95% CI = 1.53–10.40), and loss of tPA was predominant in three-month models (OR for one- and 10-year graft failure due to CAV = 1.81, p = 0.025, 95% CI = 1.08–3.03; and 1.31, p = 0.001, 95% CI = 1.12–1.55). First-biopsy and three-month models had similar predictive and discriminative accuracy and were comparable in their capacities to correctly classify patient outcomes, with the exception of 10-year graft failure due to CAV, for which the three-month model was more predictive. Both models had particularly high negative predictive values (e.g., first-biopsy vs. three-month models: 99% vs. 100% at 1 year and 96% vs. 95% at 10 years).

Conclusions

Patients with absence of fibrin in the first biopsy and persistence of normal tPA in subsequent biopsies rarely develop CAV or graft failure during the next 10 years and could potentially be monitored less invasively. Presence of early risk markers in the transplanted heart may be secondary to ischemia/reperfusion injury, a potentially modifiable factor.
