Similar Articles (20 results)
1.

Background

The Trypanosoma cruzi satellite DNA (satDNA) OligoC-TesT is a standardised PCR format for diagnosis of Chagas disease. The sensitivity of the test is lower for discrete typing unit (DTU) TcI than for TcII-VI and the test has not been evaluated in chronic Chagas disease patients.

Methodology/Principal Findings

We developed a new prototype of the OligoC-TesT based on kinetoplast DNA (kDNA) detection. We evaluated the satDNA and kDNA OligoC-TesTs in a multi-cohort study with 187 chronic Chagas disease patients and 88 healthy endemic controls recruited in Argentina, Chile and Spain, and 26 diseased non-endemic controls from D.R. Congo and Sudan. All specimens were tested in duplicate. The overall specificity in the controls was 99.1% (95% CI 95.2%–99.8%) for the satDNA OligoC-TesT and 97.4% (95% CI 92.6%–99.1%) for the kDNA OligoC-TesT. The overall sensitivity in the patients was 67.9% (95% CI 60.9%–74.2%) for the satDNA OligoC-TesT and 79.1% (95% CI 72.8%–84.4%) for the kDNA OligoC-TesT.

Conclusions/Significance

Specificities of the two T. cruzi OligoC-TesT prototypes are high in both non-endemic and endemic controls. Sensitivities are moderate but significantly (p = 0.0004) higher for the kDNA OligoC-TesT than for the satDNA OligoC-TesT.

2.

Background

Anterior odontoid screw fixation (AOSF) has been one of the most popular treatments for odontoid fractures. However, the true efficacy of AOSF remains unclear. In this study, we aimed to provide the pooled rates of non-union, reoperation, infection, and approach-related complications after AOSF for odontoid fractures.

Methods

We searched studies that discussed complications after AOSF for type II or type III odontoid fractures. A proportion meta-analysis was done and potential sources of heterogeneity were explored by meta-regression analysis.

Results

Of 972 references initially identified, 63 were eligible for inclusion. 54 studies provided data regarding non-union; the pooled non-union rate was 10% (95% CI: 7%–13%). 48 citations provided re-operation information, with a pooled proportion of 5% (95% CI: 3%–7%). Infection was described in 20 studies, with an overall rate of 0.2% (95% CI: 0%–1.2%). The main approach-related complication was postoperative dysphagia, with a pooled rate of 10% (95% CI: 4%–17%). Proportions for the other approach-related complications, such as postoperative hoarseness (1.2%, 95% CI: 0%–3.7%), esophageal/retropharyngeal injury (0%, 95% CI: 0%–1.1%), wound hematomas (0.2%, 95% CI: 0%–1.8%), and spinal cord injury (0%, 95% CI: 0%–0.2%), were very low. Significant heterogeneity was detected when we combined the rates of non-union, re-operation, and dysphagia. Multivariate meta-regression analysis showed that old age was significantly predictive of non-union. Subgroup comparisons showed significantly higher non-union rates in patients aged ≥70 than in those aged ≤40 or 40 to <50. Meta-regression analysis did not reveal any examined variables influencing the re-operation rate, but showed that age had a significant effect on the dysphagia rate.

Conclusions/Significance

This study summarized the rates of non-union, reoperation, infection, and approach-related complications after AOSF for odontoid fractures. Elderly patients were more likely to experience non-union and dysphagia.

3.

Purpose

The role of spot sign on computed tomography angiography (CTA) for predicting hematoma expansion (HE) after primary intracerebral hemorrhage (ICH) has been the focus of many studies. Our study sought to evaluate the predictive accuracy of spot signs for HE in a meta-analytic approach.

Materials and Methods

The PubMed, Embase, and Cochrane Library databases were searched for eligible studies. Studies were included if they reported data on HE in primary ICH patients, assessed by spot sign on first-pass CTA. Studies with additional data from second-pass CTA, post-contrast CT (PCCT) and CT perfusion (CTP) were also included.

Results

18 studies were pooled into the meta-analysis, including 14 studies of first-pass CTA and 7 studies of combined CT modalities. In evaluating the accuracy of the spot sign for predicting HE, studies of first-pass CTA showed a sensitivity of 53% (95% CI, 49%–57%) and a specificity of 88% (95% CI, 86%–89%). The pooled positive likelihood ratio (PLR) was 4.70 (95% CI, 3.28–6.74) and the negative likelihood ratio (NLR) was 0.44 (95% CI, 0.34–0.58). For studies of combined CT modalities, the sensitivity was 73% (95% CI, 67%–79%) with a specificity of 88% (95% CI, 86%–90%). The aggregated PLR was 6.76 (95% CI, 3.70–12.34) and the overall NLR was 0.17 (95% CI, 0.06–0.48).
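The likelihood ratios above are derived from sensitivity and specificity. As a point of reference, a minimal sketch of that relationship (note the paper's pooled 4.70/0.44 come from pooling PLR and NLR across studies, so they differ slightly from a naive calculation on the summary sensitivity and specificity):

```python
def likelihood_ratios(sensitivity, specificity):
    """Diagnostic likelihood ratios from sensitivity and specificity.

    PLR = sens / (1 - spec): how much a positive spot sign raises the
    odds of hematoma expansion.
    NLR = (1 - sens) / spec: how much a negative scan lowers those odds.
    """
    plr = sensitivity / (1.0 - specificity)
    nlr = (1.0 - sensitivity) / specificity
    return plr, nlr

# Illustration with the pooled first-pass CTA summary estimates
# (sens 53%, spec 88%); this naive ratio gives ~4.42 / ~0.53 rather than
# the paper's directly pooled 4.70 / 0.44.
plr, nlr = likelihood_ratios(0.53, 0.88)
```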

Conclusions

Spot signs appeared to be a reliable imaging biomarker for HE. The additional detection of delayed spot signs was helpful in improving the predictive accuracy of early spot signs. Awareness of our results may impact primary ICH care by providing supportive evidence for the use of combined CT modalities in detecting spot signs.

4.

Background

Lung-dominant connective tissue disease (LD-CTD) is a new concept for classifying the subset of patients with interstitial pneumonia who have clinical features suggesting an associated CTD, but whose features fall short of a clear diagnosis of CTD under the current rheumatologic classification systems. The impact of mean pulmonary arterial pressure (MPAP) in LD-CTD has not been sufficiently elucidated.

Objectives

To evaluate the survival impact of MPAP measured during the initial evaluation in patients with LD-CTD.

Methods

We retrospectively analyzed the initial evaluation data of 100 LD-CTD patients undergoing pulmonary function test, 6-min walk test (6MWT), and right heart catheterization (RHC).

Results

The mean MPAP was 16.2±4.4 mm Hg, and 18 patients had MPAP≥20 mm Hg. A univariate Cox proportional hazards model showed that MPAP and several other variables had a statistically significant impact on survival. In stepwise multivariate Cox proportional hazards analysis, MPAP (HR = 1.293; 95% CI 1.130–1.480; p<0.001) and forced vital capacity (FVC) % predicted (HR = 0.958; 95% CI 0.930–0.986; p = 0.004) were shown to be independent determinants of survival.

Conclusions

Higher MPAP and lower %FVC at the initial evaluation were significant independent prognostic factors in LD-CTD. MPAP evaluation provides additional information about disease status and will help physicians predict mortality in LD-CTD.

5.

Background

Radiographic manifestations of pulmonary tuberculosis (TB) in patients with diabetes mellitus (DM) have previously been reported, with inconsistent results. We conducted a study to investigate whether glycemic control has an impact on radiographic manifestations of pulmonary TB.

Methods

Consecutive patients with culture-positive pulmonary TB who had DM in three tertiary care hospitals from 2005–2010 were selected for review and compared with a similar number of patients without DM. Glycemic control was assessed by glycated haemoglobin A1C (HbA1C). A pre-treatment chest radiograph was read independently by two qualified pulmonologists blinded to patients’ diabetic status. Films with any discordant reading were read by a third reader.

Results

1209 culture-positive pulmonary TB patients (581 with DM and 628 without DM) were enrolled. Compared with those without DM, TB patients with DM were significantly more likely to have opacities over the lower lung fields, extensive parenchymal lesions, any cavity, multiple cavities, and large cavities (>3 cm). The relative risk of lower lung field opacities was 0.80 (95% CI 0.46–1.42) for patients with DM with A1C<7%, 2.32 (95% CI 1.36–3.98) for A1C 7%–9%, and 1.62 (95% CI 1.12–2.36) for A1C>9%; that of any cavity over no cavity was 0.87 (95% CI 0.46–1.62) for patients with DM with A1C<7%, 1.84 (95% CI 1.20–2.84) for A1C 7%–9%, and 3.71 (95% CI 2.64–5.22) for A1C>9%, relative to patients without DM.

Conclusions

Glycemic control significantly influenced radiographic manifestations of pulmonary TB in patients with DM.

6.

Background

Overweight/obesity is a serious public health problem affecting a large part of the world population across all age and racial/ethnic groups. However, there has been no meta-analysis of the prevalence of childhood and adolescent overweight/obesity in China over the past 30 years.

Methods

The China National Knowledge Infrastructure and Wanfang DATA, MEDLINE, EMBASE and Cumulative Index to Nursing and Allied Health Literature were searched for relevant studies published between January 1970 and June 2012. The prevalence of overweight/obesity over time was pooled using Stata/SE, version 9. Summary statistics (odds ratios, ORs) were used to compare sex-specific and urban-rural preponderance of overweight/obesity using Review Manager.

Results

After screening 1326 papers, we included 35 papers (41 studies), most of medium quality. The prevalence of overweight and obesity increased from 1.8% (95% confidence interval [CI], 0.4%–3.1%) and 0.4% (95% CI, −0.1% to 0.8%), respectively, in 1981–1985 to 13.1% (95% CI, 11.2%–15.0%) and 7.5% (95% CI, 6.6%–8.4%), respectively, in 2006–2010. The average annual increase was 8.3% and 12.4%, respectively. Boys were more likely to be overweight/obese than girls (OR, 1.36; 95% CI, 1.24–1.49 and OR, 1.68; 95% CI, 1.52–1.86, respectively). The prevalence of overweight/obesity was higher in urban areas than in rural areas (OR, 1.66; 95% CI, 1.54–1.79 and OR, 1.97; 95% CI, 1.68–2.30, respectively). In age-specific subgroup analyses, both overweight and obesity increased more rapidly in the toddler stage than in other developmental stages. Sensitivity analyses showed that sample-size differences, study quality, overweight/obesity criteria and geographical distribution affected overweight/obesity prevalence.
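The "average annual increase" figures are consistent with a compound (geometric) growth rate between the midpoints of the two survey periods (1983 to 2008, i.e. about 25 years; the 25-year span is an assumption about how the figure was computed, not stated in the abstract):

```python
def compound_annual_growth(p_start, p_end, years):
    """Geometric mean annual growth rate between two prevalence estimates."""
    return (p_end / p_start) ** (1.0 / years) - 1.0

# Overweight: 1.8% (1981-1985) -> 13.1% (2006-2010), midpoint gap ~25 years
overweight_growth = compound_annual_growth(1.8, 13.1, 25)  # ~8.3% per year
# Obesity: 0.4% -> 7.5% over the same span
obesity_growth = compound_annual_growth(0.4, 7.5, 25)      # ~12.4% per year
```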

Conclusions

Toddlers and urban boys were at particularly high risk; the prevalence in these groups increased more rapidly than in their counterparts. Public health prevention strategies are urgently needed to modify the health behaviors of children and adolescents and to control overweight/obesity in China.

7.

Objective

Virtual touch tissue quantification (VTQ) with acoustic radiation force impulse (ARFI) imaging is a new quantitative technique for measuring tissue stiffness. This study aimed to assess the usefulness of VTQ in the diagnosis of thyroid nodules.

Methods

173 pathologically proven thyroid nodules in 142 patients were included, and all were examined by conventional ultrasound (US), conventional elasticity imaging (EI) and VTQ of ARFI. Tissue stiffness on VTQ was expressed as shear wave velocity (SWV, m/s). Receiver-operating characteristic (ROC) curve analyses were performed to assess diagnostic performance. Intra- and inter-observer reproducibility of the VTQ measurements was assessed.

Results

The SWVs of benign and malignant thyroid nodules were 2.34±1.17 m/s (range: 0.61–9.00 m/s) and 4.82±2.53 m/s (range: 2.32–9.00 m/s), respectively (P<0.001). The mean SWV ratios between each nodule and the adjacent thyroid tissue were 1.19±0.67 (range: 0.31–6.87) for benign and 2.50±1.54 (range: 0.85–6.69) for malignant nodules (P<0.001). ROC analyses indicated that the area under the curve was 0.861 (95% CI: 0.804, 0.918) (P<0.001) for SWV and 0.831 (95% CI: 0.761, 0.900) (P<0.001) for the SWV ratio. The cutoff points for differential diagnosis were 2.87 m/s for SWV and 1.59 for the SWV ratio. The sensitivity, specificity, accuracy, positive predictive value, and negative predictive value were 65.9%, 66.7%, 66.5%, 40.3%, and 85.1%, respectively, for EI, and 63.6%–75%, 82.2%–88.4%, 80.3%–82.1%, 58.9%–65.1%, and 87.7%–90.5%, respectively, for VTQ. The diagnostic value of VTQ was highest for nodules >20 mm and lowest for those ≤10 mm. The correlation coefficients were 0.904 for intraobserver and 0.864 for interobserver measurements.
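All five EI accuracy figures follow from a single 2×2 confusion table. The counts below (TP=29, FN=15, FP=43, TN=86, i.e. 44 malignant and 129 benign of the 173 nodules) are a reconstruction consistent with the reported EI percentages, not values taken from the paper:

```python
def diagnostic_metrics(tp, fn, fp, tn):
    """Standard accuracy measures from a 2x2 confusion table
    (rows: disease present/absent; columns: test positive/negative)."""
    return {
        "sensitivity": tp / (tp + fn),   # true positives among diseased
        "specificity": tn / (tn + fp),   # true negatives among healthy
        "accuracy": (tp + tn) / (tp + fn + fp + tn),
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# Hypothetical counts consistent with the reported EI results
# (65.9% / 66.7% / 66.5% / 40.3% / 85.1%)
ei = diagnostic_metrics(tp=29, fn=15, fp=43, tn=86)
```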

Conclusions

VTQ of ARFI provides quantitative and reproducible information about tissue stiffness, which is useful for differentiating between benign and malignant thyroid nodules. The diagnostic performance of VTQ is higher than that of conventional EI.

8.

Background

A suboptimal left ventricular (LV) pacing site may account for the non-responsiveness of patients to cardiac resynchronization therapy (CRT). The vector selection of a novel quadripolar LV pacing lead, which was mainly developed to overcome technical issues with stimulation thresholds and phrenic nerve capture, may affect hemodynamic response and was therefore assessed in this study (German Clinical Trials Register DRKS00000573).

Methods and Results

Hemodynamic effects of a total of 145 LV pacing configurations (LVPCs; 9.1 per patient) of CRT devices with a quadripolar LV lead (Quartet™, St. Jude Medical) were assessed in 16 of 20 consecutive patients by invasive measurement of LV+dP/dtmax at an invasively optimized AV-interval in random order. The optimal (worst) LVPCs per patient were identified as those with the maximal (minimal) %change in LV+dP/dtmax (%ΔLV+dP/dtmax) compared to a preceding baseline. LV+dP/dtmax significantly increased in all 145 LVPCs (p<0.0001 compared to baseline), with significant intraindividual differences between LVPCs (p<0.0001). Overall, CRT acutely augmented %ΔLV+dP/dtmax by 31.3% (95% CI: 24%–39%) in the optimal, by 21.3% (95% CI: 15%–27%) in the worst and by 28.2% (95% CI: 21%–36%) in a default distal LVPC. This resulted in an absolute additional acute increase in %ΔLV+dP/dtmax of 10.0% (95% CI: 7%–13%) for the optimal compared with the worst LVPC (p<0.0001), and of 3.1% (95% CI: 1%–5%) for the optimal compared with the default distal LVPC (p<0.001). The optimal LVPC was not programmable with a standard bipolar lead in 44% (7/16) of patients.

Conclusion

The pacing configuration of a quadripolar LV lead determines the acute hemodynamic response. Pacing in the individually optimized configuration yields an additional absolute 10% increase in %ΔLV+dP/dtmax when comparing the optimal and worst vectors.

9.

Background

In July 2010 a new multiple hub-and-spoke model for acute stroke care was implemented across the whole of London, UK, with continuous specialist care during the first 72 hours provided at 8 hyper-acute stroke units (HASUs) compared to the previous model of 30 local hospitals receiving acute stroke patients. We investigated differences in clinical outcomes and costs between the new and old models.

Methods

We compared outcomes and costs ‘before’ (July 2007–July 2008) vs. ‘after’ (July 2010–June 2011) the introduction of the new model, adjusted for patient characteristics and national time trends in mortality and length of stay. We constructed 90-day and 10-year decision analytic models using data from population based stroke registers, audits and published sources. Mortality and length of stay were modelled using survival analysis.

Findings

In a pooled sample of 307 patients ‘before’ and 3156 patients ‘after’, survival improved in the ‘after’ period (age-adjusted hazard ratio 0.54; 95% CI 0.41–0.72). The predicted survival rates at 90 days in the deterministic model adjusted for national trends were 87.2% ‘before’ (95% CI 86.7%–87.7%) and 88.7% ‘after’ (95% CI 88.6%–88.8%), a relative reduction in deaths of 12% (95% CI 8%–16%). Based on a cohort of 6,438 stroke patients, the model produces a total cost saving of £5.2 million per year at 90 days (95% CI £4.9–£5.5 million; £811 per patient).

Conclusion

A centralized model for acute stroke care across an entire metropolitan city appears to have reduced mortality for a reduced cost per patient, predominantly as a result of reduced hospital length of stay.

10.

Background

Retaining patients with HIV infection in care is still a major challenge in sub-Saharan Africa, particularly in the Democratic Republic of Congo (DRC), where antiretroviral treatment (ART) coverage is low. Monitoring retention is an important tool for evaluating the quality of care.

Methods and Findings

A review of medical records of HIV-infected children was performed in three health facilities in the DRC: the Amo-Congo Health Center, the Monkole Clinic in Kinshasa, and the HEAL Africa Clinic in Goma. Medical records of 720 children were included. Kaplan–Meier curves were constructed for the probability of retention at 6 months, 1 year, 2 years and 3 years. Retention rates were 88.2% (95% CI: 85.1%–90.8%) at 6 months, 85% (95% CI: 81.5%–87.6%) at one year, 79.4% (95% CI: 75.5%–82.8%) at two years and 74.7% (95% CI: 70.5%–78.5%) at 3 years. Retention varied across study sites: 88.2%, 66.6% and 92.5% at 6 months; 84%, 59% and 90% at 12 months; and 75.7%, 56.3% and 85.8% at 24 months for Amo-Congo/Kasavubu, the Monkole facility and HEAL Africa, respectively. After multivariable Cox regression, four variables remained independently associated with attrition: study site, CD4 cell count <350 cells/µL, age younger than 2 years, and having a caregiver who was a member of an independent church.
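Retention probabilities of this kind come from the Kaplan–Meier product-limit estimator, which multiplies conditional retention probabilities across event times while censored records (children still in care at last visit) only shrink the risk set. A minimal sketch on toy data (not the study's records):

```python
def kaplan_meier(times, events):
    """Product-limit survival estimates.

    times  : follow-up time per child (e.g. months in care)
    events : 1 if the child left care (attrition) at that time, 0 if censored
    Returns (time, survival_probability) pairs at each event time.
    """
    at_risk = len(times)
    surv, curve = 1.0, []
    for t in sorted(set(times)):
        # events at this time reduce survival; censoring only reduces risk set
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        if d:
            surv *= 1.0 - d / at_risk
            curve.append((t, surv))
        at_risk -= sum(1 for ti in times if ti == t)
    return curve

# Toy example: 4 children, attrition at 6 and 12 months, two censored
curve = kaplan_meier([6, 12, 12, 24], [1, 1, 0, 0])
```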

Conclusions

Attrition remains a challenge for pediatric HIV-positive patients in ART programs in the DRC. In addition, the low coverage of pediatric treatment exacerbates the situation of pediatric HIV/AIDS.

11.

Background

Recent studies suggest that cancer increases risk of atrial fibrillation. Whether atrial fibrillation is a marker for underlying occult cancer is unknown.

Methods

We conducted a cohort study (1980–2011) of all Danish patients with new-onset atrial fibrillation. To examine cancer risk, we computed absolute risk at 3 months and standardized incidence ratios (SIRs) by comparing observed cancer incidence among patients newly diagnosed with atrial fibrillation with that expected based on national cancer incidence during the period.

Results

Median follow-up time was 3.4 years among 269,742 atrial fibrillation patients. Within 3 months of follow-up, 6656 cancers occurred (absolute risk, 2.5%; 95% confidence interval [CI], 2.4%–2.5%) versus 1302 expected, yielding an SIR of 5.11 (95% CI, 4.99–5.24). Associations were particularly strong for cancers of the lung, kidney, colon and ovary, and for non-Hodgkin's lymphoma. The SIR within 3 months of follow-up was 7.02 (95% CI, 6.76–7.28) for metastatic and 3.53 (95% CI, 3.38–3.68) for localized cancer. Beyond 3 months of follow-up, overall cancer risk was modestly increased (SIR, 1.13; 95% CI, 1.12–1.15).
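The SIR of 5.11 is simply the ratio of observed to expected cancers (6656/1302), and the reported interval is consistent with a log-scale normal approximation treating the observed count as Poisson. A sketch (the CI method is an assumption on our part; the authors may have used an exact Poisson interval):

```python
import math

def sir_with_ci(observed, expected, z=1.96):
    """Standardized incidence ratio with a log-scale normal-approximation CI.

    Treats the observed count as Poisson, so se(log SIR) ~ 1/sqrt(observed).
    """
    sir = observed / expected
    se_log = 1.0 / math.sqrt(observed)
    lo = sir * math.exp(-z * se_log)
    hi = sir * math.exp(z * se_log)
    return sir, lo, hi

# Cancers within 3 months of new-onset atrial fibrillation
sir, lo, hi = sir_with_ci(observed=6656, expected=1302)  # ~5.11 (4.99-5.24)
```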

Conclusion

Patients with new-onset atrial fibrillation had a markedly increased relative risk of a cancer diagnosis within the next three months; however, the corresponding absolute risk was small.

12.

Objective

The HIV/AIDS epidemic has evolved with an increasing burden in older adults. We assessed knowledge about aging and HIV/AIDS among clinicians in Kampala district, Uganda.

Methods

A cross-sectional survey of 301 clinicians, complemented by 9 key-informant interviews, was conducted between May and October 2011. Data were analyzed by multivariable logistic regression to identify potential determinants of clinician knowledge about HIV/AIDS in older adults, estimating adjusted odds ratios (aOR) and 95% confidence intervals (95% CI) using Stata 11.2.

Results

Two hundred and sixty-two questionnaires (87.7%) were returned. Respondents had a median age of 30 years (IQR 27–34), and 57.8% were general medical doctors. The mean knowledge score was 49% (range 8.8%–79.4%). Questions related to co-morbidities in HIV/AIDS (non-AIDS related cancers and systemic diseases) and chronic antiretroviral treatment toxicities (metabolic disorders) had significantly lower scores (mean 41.7%, 95% CI: 39.3%–44%) than those on HIV/AIDS epidemiology and prevention (mean 65.7%, 95% CI: 63.7%–67.7%). Determinants of clinician knowledge in the multivariable analysis included clinician age 30–39 years (aOR 3.28, 95% CI 1.65–9.75), having seen fewer than 50 persons with HIV/AIDS in the past year (aOR 0.34, 95% CI 0.14–0.86) and being a clinical nurse practitioner (aOR 0.31, 95% CI 0.11–0.83). Diploma-level education had a marginal association with lower knowledge about HIV and aging (p = 0.09).

Conclusion

Our study identified gaps in, and determinants of, knowledge about HIV/AIDS in older adults among clinicians in Kampala district, Uganda. Clinicians in low- and middle-income countries could benefit from targeted training in chronic care for older adults with HIV/AIDS and in the long-term complications of antiretroviral treatment.

13.

Background

Bevacizumab is believed to be as effective and safe as ranibizumab for ophthalmic diseases; however, the magnitude of its effectiveness and its safety profile remain controversial. Thus, a meta-analysis and systematic review appeared necessary.

Methods

PubMed and EMBASE were systematically searched with no restrictions. All relevant citations comparing ranibizumab and bevacizumab were considered for inclusion. Pooled effect estimates were obtained using a fixed- and random-effects meta-analysis.

Results

Nine independent randomised controlled clinical trials (RCTs) involving 2,289 participants were identified. Compared with bevacizumab, the overall combined weighted mean difference (WMD) in the mean change in visual acuity for ranibizumab was 0.52 letters (95% CI −0.11–1.14). The odds ratios (ORs) of gaining ≥15, gaining 5–14, losing 5–14 and losing ≥15 letters were 1.10 (95% CI 0.90–1.33), 0.93 (95% CI 0.77–1.11), 0.89 (95% CI 0.65–1.22) and 0.95 (95% CI 0.73–1.25), respectively. The risk of serious systemic events increased by 17% (95% CI 6%–27%, p = 0.0042) for bevacizumab treatment in comparison with ranibizumab. No statistically significant differences between the two treatments were found for nonfatal arterial thrombotic events, serious ocular adverse events, or death from vascular or all causes.
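A fixed-effect pooled estimate such as the WMD above weights each trial's effect by its inverse variance. A sketch with made-up per-study numbers (illustrative only, not the trial data; ORs would be pooled on the log scale by the same machinery):

```python
import math

def fixed_effect_pool(effects, std_errors, z=1.96):
    """Inverse-variance fixed-effect meta-analysis of per-study estimates."""
    weights = [1.0 / se ** 2 for se in std_errors]          # w_i = 1/var_i
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    ci = (pooled - z * pooled_se, pooled + z * pooled_se)
    return pooled, ci

# Hypothetical per-study mean differences in letters with standard errors
pooled, (lo, hi) = fixed_effect_pool([0.8, 0.3, 0.6], [0.5, 0.4, 0.6])
```

Precise studies (small standard errors) dominate the weighted average, which is why a single large trial can drive the pooled WMD.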

Conclusions

Bevacizumab is not inferior to ranibizumab as a treatment for improving visual acuity. However, the use of bevacizumab was associated with an increased risk of serious systemic events. Weighing the costs and health outcomes is necessary when choosing between bevacizumab and ranibizumab for ophthalmic diseases. Due to the limitations of the available data, further research is needed.

14.

Background

Centenarians are a rapidly growing demographic group worldwide, yet their health and social care needs are seldom considered. This study aims to examine trends in place of death and associations for centenarians in England over 10 years to consider policy implications of extreme longevity.

Methods and Findings

This is a population-based observational study using death registration data linked with area-level indices of multiple deprivation for people aged ≥100 years who died between 2001 and 2010 in England, compared with those dying at ages 80–99. We used linear regression to examine time trends in the number of deaths and place of death, and Poisson regression to evaluate factors associated with centenarians’ place of death. The cohort totalled 35,867 people with a median age at death of 101 years (range: 100–115 years). Centenarian deaths increased 56% (95% CI 53.8%–57.4%) over the 10 years. Most died in a care home with nursing (26.7%, 95% CI 26.3%–27.2%) or without nursing (34.5%, 95% CI 34.0%–35.0%), or in hospital (27.2%, 95% CI 26.7%–27.6%). The proportion of deaths in nursing homes decreased over the 10 years (−0.36% annually, 95% CI −0.63% to −0.09%, p = 0.014), while hospital deaths changed little (0.25% annually, 95% CI −0.06% to 0.57%, p = 0.09). Dying with frailty was common, with “old age” stated in 75.6% of death certifications. Centenarians were more likely to die of pneumonia (e.g., 17.7% [95% CI 17.3%–18.1%] versus 6.0% [5.9%–6.0%] for those aged 80–84 years) and old age/frailty (28.1% [27.6%–28.5%] versus 0.9% [0.9%–0.9%]) and less likely to die of cancer (4.4% [4.2%–4.6%] versus 24.5% [24.6%–25.4%]) and ischemic heart disease (8.6% [8.3%–8.9%] versus 19.0% [18.9%–19.0%]) than were younger elderly patients. More care home beds available per 1,000 population were associated with fewer deaths in hospital (PR 0.98, 95% CI 0.98–0.99, p<0.001).

Conclusions

Centenarians are more likely to have causes of death certified as pneumonia and frailty, and less likely to have causes of death of cancer or ischemic heart disease, compared with younger elderly patients. Reducing reliance on hospital care at the end of life requires recognition of centenarians’ increased likelihood of “acute” decline, notably from pneumonia, wider provision of anticipatory care to enable people to remain in their usual residence, and increased care home bed capacity.

15.

Introduction

The increasing number of people requiring HIV treatment in South Africa calls for efficient use of its human resources for health in order to ensure optimum treatment coverage and outcomes. This paper describes an innovative public-private partnership model which uses private sector doctors to treat public sector patients and ascertains the model’s ability to maintain treatment outcomes over time.

Methods

The study used a retrospective design based on the electronic records of patients who were down-referred from government hospitals to selected private general medical practitioners (GPs) between November 2005 and October 2012. In total, 2535 unique patient records from 40 GPs were reviewed. The survival functions for mortality and attrition were calculated. Cumulative incidence of mortality for different time cohorts (defined by year of treatment initiation) was also established.

Results

The median number of patients per GP was 143 (IQR: 66–246). At the time of down-referral to private GPs, 13.8% of the patients had a CD4 count <200 cells/mm3; this proportion decreased to 6.6% at 12 months and 4.1% at 48 months. Similarly, 88.4% of the patients had a suppressed viral load (defined as HIV-1 RNA <400 copies/ml) at 48 months. The patients’ probability of survival at 12 and 48 months was 99.0% (95% CI: 98.4%–99.3%) and 89.0% (95% CI: 87.1%–90.0%), respectively. Patient retention at 48 months remained high at 94.3% (95% CI: 93.0%–95.7%).

Conclusions

The study findings demonstrate the ability of the GPs to effectively maintain patient treatment outcomes and potentially contribute to HIV treatment scale-up, given the relevant support mechanisms. The model demonstrates how an assisted private sector-based programme can be used effectively and efficiently to target specific health concerns or key populations, or to serve as a stop-gap measure to meet urgent health needs.

16.

Aim

To investigate the influence of metformin use on liver dysfunction and hepatic encephalopathy in a retrospective cohort of diabetic cirrhotic patients, and to analyze the impact of metformin on glutaminase activity and ammonia production in vitro.

Methods

Eighty-two cirrhotic patients with type 2 diabetes were included: 41 patients were classified as insulin-sensitizer experienced (metformin) and 41 as controls (cirrhotic patients with type 2 diabetes mellitus without metformin treatment). Baseline analysis included insulin, glucose, glucagon, leptin, adiponectin, TNFr2, AST and ALT, and HOMA-IR was calculated. Baseline HE risk was calculated according to minimal hepatic encephalopathy, the oral glutamine challenge and mutations in the glutaminase gene. We also performed an in vitro study including an enzymatic activity assay in which glutaminase inhibition was measured at different metformin concentrations. In Caco2 cells, glutaminase inhibition was evaluated by ammonia production at 24, 48 and 72 hours after metformin treatment.

Results

Hepatic encephalopathy was diagnosed during follow-up in 23.2% (19/82): 4.9% (2/41) of patients receiving metformin and 41.5% (17/41) of patients without metformin treatment (log-rank 9.81; p = 0.002). In multivariate analysis, metformin use (HR 11.4, 95% CI: 1.2–108.8; p = 0.034), age at diagnosis (HR 1.12, 95% CI: 1.04–1.2; p = 0.002), female sex (HR 10.4, 95% CI: 1.5–71.6; p = 0.017) and HE risk (HR 21.3, 95% CI: 2.8–163.4; p = 0.003) were found to be independently associated with hepatic encephalopathy. In the enzymatic assay, glutaminase inhibition reached 68% with metformin 100 mM. In Caco2 cells, metformin (20 mM) decreased glutaminase activity by up to 24% at 72 hours post-treatment (p<0.05).

Conclusions

Metformin use was found to be independently related to overt hepatic encephalopathy in patients with type 2 diabetes mellitus and a high risk of hepatic encephalopathy, and metformin inhibited glutaminase activity in vitro. Therefore, metformin use seems to be protective against hepatic encephalopathy in diabetic cirrhotic patients.

17.

Background

The prognostic importance of tumor size in gastric cancer is unclear. This study investigated whether the inclusion of tumor size could improve prognostic accuracy in node-negative gastric cancer.

Methods

Clinical and pathological data of 492 patients with node-negative gastric cancer who underwent radical surgery in our department from January 1995 to December 2008 were analyzed. The prognostic accuracy of T stage was compared with that of T stage plus tumor size. The ability of tumor size to improve the 95% confidence interval (CI) of postoperative 5-year survival rate in gastric cancer patients was assessed. Different T stages plus tumor size were further analyzed to assess improvements in prognosis.

Results

Mean tumor size was 3.79±1.98 cm with a normal distribution. Multivariate analysis showed that tumor size and T stage were independent prognostic factors. Postoperative 5-year survival rate tended to decrease as tumor size increased in 1 cm increments. The addition of tumor size to T stage improved accuracy in predicting 5-year survival by 4.2% (P<0.05), as well as improving the 95% CI of postoperative 5-year survival rate by 3.2–5.1%. The addition of tumor size improved the predictive accuracy of postoperative 5-year survival rate by 3.9% (95% CI 70.4%–91.1%, P = 0.033) in patients with stage T3N0M0 tumors and by 6.5% (95% CI 68.7%–88.4%, P = 0.014) in patients with stage T4aN0M0 tumors.

Conclusions

Tumor size is an independent prognostic factor for survival in patients with node-negative gastric cancer and improves prognostic accuracy in stage T3/4aN0M0 tumors.

18.

Background

Studies conducted across Sierra Leone from 1974 to 2005 showed onchocerciasis endemicity in 12 of 14 health districts (HDs), and baseline studies in 2005–2008 showed lymphatic filariasis (LF) endemicity in all 14 HDs. Three rounds of integrated annual mass drug administration (MDA) were conducted in the 12 co-endemic districts in 2008–2010, with good geographic, programme and drug coverage. A midterm assessment was conducted in 2011 to determine the impact of these MDAs on LF in these districts.

Methodology/Principal Findings

The mf prevalence and intensity in the 12 districts were determined using the thick blood film method, and the results were compared with baseline data from 2007–2008. Overall mf prevalence fell from 2.6% (95% CI: 2.3%–3.0%) to 0.3% (95% CI: 0.19%–0.47%), a decrease of 88.5% (p = 0.000). Prevalence was 0.0% (a 100.0% decrease) in four districts: Bo, Moyamba, Kenema and Kono (p = 0.001, 0.025, 0.085 and 0.000, respectively), and seven districts had reductions in mf prevalence of between 70.0% and 95.0% (p = 0.000, 0.060, 0.001, 0.014, 0.000, 0.000 and 0.002 for Bombali, Bonthe, Kailahun, Kambia, Koinadugu, Port Loko and Tonkolili districts, respectively). Pujehun had a baseline mf prevalence of 0.0%, which was maintained. Only Bombali still had an mf prevalence ≥1.0% (1.58%, 95% CI: 0.80%–3.09%), and this was the district with the highest baseline mf prevalence: 6.9% (95% CI: 5.3%–8.8%). The overall arithmetic mean mf density after three MDAs was 17.59 mf/ml (95% CI: 15.64–19.55 mf/ml) among mf-positive individuals, a 65.4% decrease from the baseline of 50.9 mf/ml (95% CI: 40.25–61.62 mf/ml; p = 0.001), and 0.05 mf/ml (95% CI: 0.03–0.08 mf/ml) for the entire population examined, a 96.2% decrease from the baseline of 1.32 mf/ml (95% CI: 1.00–1.65 mf/ml; p = 0.000).

Conclusions/Significance

The results show that mf prevalence decreased to <1.0% in all but one of the 12 districts after three MDAs. Overall mf density was reduced by 65.0% among mf-positive individuals and by 95.8% for the entire population.

19.

Purpose

We aimed to characterize the antiretroviral therapy (ART) cascade among female sex workers (FSWs) globally.

Methods

We systematically searched PubMed, Embase and MEDLINE in March 2014 to identify studies reporting on ART uptake, attrition, adherence, and outcomes (viral suppression or CD4 count improvements) among HIV-infected FSWs globally. When possible, available estimates were pooled using random-effects meta-analyses (heterogeneity was assessed using Cochran's Q test and the I² statistic).

Results

39 studies, reporting on 21 different FSW study populations in Asia, Africa, North America, South America, and Central America and the Caribbean, were included. Current ART use among HIV-infected FSWs was 38% (95% CI: 29%–48%, I² = 96%, 15 studies), and estimates were similar between high-income and low- and middle-income countries. Ever having used ART among HIV-infected FSWs was more common in high-income countries (80%; 95% CI: 48%–94%, I² = 70%, 2 studies) than in low- and middle-income countries (36%; 95% CI: 7%–81%, I² = 99%, 3 studies). Loss to follow-up after ART initiation was 6% (95% CI: 3%–11%, I² = 0%, 3 studies), and death after ART initiation was 6% (95% CI: 3%–11%, I² = 0%, 3 studies). The proportion adherent to ≥95% of prescribed pills was 76% (95% CI: 68%–83%, I² = 36%, 4 studies), and 57% (95% CI: 46%–68%, I² = 82%, 4 studies) of FSWs on ART were virally suppressed. Median gains in CD4 count after 6 to 36 months on ART ranged between 103 and 241 cells/mm³ (4 studies).
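The pooled estimates and I² values above come from random-effects meta-analysis; a minimal sketch of the standard DerSimonian-Laird procedure is shown below. The inputs are illustrative values only, not the study data.

```python
def dersimonian_laird(estimates, variances):
    """Pool per-study estimates with a DerSimonian-Laird random-effects model.

    Returns (pooled_estimate, I_squared_percent).
    """
    # Fixed-effect (inverse-variance) weights and pooled mean
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, estimates)) / sum(w)

    # Cochran's Q: weighted squared deviations from the fixed-effect mean
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, estimates))
    df = len(estimates) - 1

    # Between-study variance tau^2 (truncated at zero)
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)

    # Random-effects weights and pooled estimate
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, estimates)) / sum(w_star)

    # I^2: percentage of variability attributable to heterogeneity
    i2 = max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0
    return pooled, i2
```

With identical study estimates the heterogeneity statistic I² is 0%; widely separated estimates drive it toward 100%, as in the pooled proportions reported in this review.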

Conclusions

Despite global increases in ART coverage, there is a concerning lack of published data on HIV treatment for FSWs. The available data suggest that FSWs can achieve levels of ART uptake, retention, adherence, and treatment response comparable to those seen among women in the general population, but these data come from only a few research settings. More routine programme data on HIV treatment among FSWs across settings should be collected and disseminated.

20.

Objective

Despite antihypertensive treatment, most hypertensive patients still have high blood pressure (BP), notably high systolic blood pressure (SBP). The EFFICIENT study examines the efficacy and acceptability of a single-pill combination of sustained-release (SR) indapamide, a thiazide-like diuretic, and amlodipine, a calcium channel blocker (CCB), in the management of hypertension.

Methods

In this multicenter prospective phase 4 study, patients who were previously uncontrolled on CCB monotherapy (BP ≥140/90 mm Hg) or previously untreated with grade 2 or 3 essential hypertension (BP ≥160/100 mm Hg) received a single-pill combination tablet containing indapamide SR 1.5 mg and amlodipine 5 mg daily for 45 days. The primary outcome was the mean change in BP from baseline; the percentage of patients achieving BP control (BP <140/90 mm Hg) was a secondary endpoint. The ratio of SBP reduction to diastolic BP reduction (ΔSBP/ΔDBP) was evaluated from baseline to day 45. Safety and tolerability were also assessed.

Results

The mean baseline BP of the 196 patients (mean age 52.3 years) was 160.2/97.9 mm Hg. After 45 days, mean SBP decreased by 28.5 mm Hg (95% CI, 26.4 to 30.6), and mean diastolic BP decreased by 15.6 mm Hg (95% CI, 14.5 to 16.7). BP control (<140/90 mm Hg) was achieved in 85% of patients. ΔSBP/ΔDBP was 1.82 in the overall population. Few patients (n = 3 [2%]) reported side effects, and most (n = 194 [99%]) adhered to treatment.
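The ΔSBP/ΔDBP ratio and the BP-control criterion used above reduce to simple arithmetic; a minimal sketch (the rounded mean reductions give ≈1.83, while the study's 1.82 reflects unrounded patient-level data):

```python
# Ratio of mean reductions from the reported values (rounded to 1 decimal place)
delta_sbp = 28.5  # mm Hg
delta_dbp = 15.6  # mm Hg
ratio = delta_sbp / delta_dbp  # ~1.83 from these rounded means

def bp_controlled(sbp, dbp):
    """BP control as defined in the study: BP < 140/90 mm Hg (both thresholds)."""
    return sbp < 140 and dbp < 90
```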

Conclusion

In patients who were previously uncontrolled on CCB monotherapy or previously untreated with grade 2 or 3 hypertension, the single-pill combination of indapamide SR and amlodipine reduced BP effectively (especially SBP) over 45 days, and was safe and well tolerated.

Trial Registration

Clinical Trial Registry – India CTRI/2010/091/000114
