Similar Articles
 20 similar articles found.
1.

Objectives

To examine the associations of intimate partner violence (IPV) with stress-related sleep disturbance (measured using the Ford Insomnia Response to Stress Test [FIRST]) and poor sleep quality (measured using the Pittsburgh Sleep Quality Index [PSQI]) during early pregnancy.

Methods

This cross-sectional study included 634 pregnant Peruvian women. In-person interviews were conducted in early pregnancy to collect information regarding IPV history and sleep traits. Adjusted odds ratios (aOR) and 95% confidence intervals (95% CIs) were calculated using logistic regression procedures.
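As an illustration of this analytic step (not the study's actual code), the sketch below shows how aORs and 95% CIs are commonly obtained from a fitted logistic model in Python with statsmodels; the file name and column names (ipv_lifetime, poor_sleep_psqi, age, parity) are hypothetical placeholders.

```python
# Minimal sketch: adjusted odds ratios (aOR) with 95% CIs via logistic
# regression. Dataset and column names are hypothetical, for illustration only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("pregnancy_sleep.csv")  # hypothetical analytic dataset

# Outcome: poor sleep quality (PSQI); exposure: lifetime IPV; example covariates.
fit = smf.logit("poor_sleep_psqi ~ ipv_lifetime + age + parity", data=df).fit()

# Exponentiating coefficients turns log-odds into odds ratios; the same
# transform of the CI bounds yields the reported 95% CIs.
aor = np.exp(fit.params).rename("aOR")
ci = np.exp(fit.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})
print(pd.concat([aor, ci], axis=1))
```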

Results

Lifetime IPV was associated with a 1.54-fold increased odds of stress-related sleep disturbance (95% CI: 1.08–2.17) and a 1.93-fold increased odds of poor sleep quality (95% CI: 1.33–2.81). Compared with women reporting no lifetime IPV, the aORs (95% CIs) for stress-related sleep disturbance by type of IPV were: physical abuse only, 1.24 (95% CI: 0.84–1.83); sexual abuse only, 3.44 (95% CI: 1.07–11.05); and physical and sexual abuse, 2.51 (95% CI: 1.27–4.96). The corresponding aORs for poor sleep quality were 1.72 (95% CI: 1.13–2.61), 2.82 (95% CI: 0.99–8.03), and 2.50 (95% CI: 1.30–4.81), respectively. Women reporting any IPV in the year prior to pregnancy had increased odds of stress-related sleep disturbance (aOR = 2.07; 95% CI: 1.17–3.67) and poor sleep quality (aOR = 2.27; 95% CI: 1.30–3.97) during pregnancy.

Conclusion

Lifetime and prevalent IPV exposures are associated with stress-related sleep disturbance and poor sleep quality during pregnancy. Our findings suggest that sleep disturbances may be important mechanisms that underlie the lasting adverse effects of IPV on maternal and perinatal health.

2.

Objective

To evaluate the Fibrosis (FIB)-4 index as a predictor of major liver-related events (LRE) and liver-related death (LRD) in human immunodeficiency virus (HIV) type-1 patients initiating combination antiretroviral therapy (cART).
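For context, the FIB-4 index evaluated here is the standard formula combining age, aminotransferases, and platelet count; the 1.45 and 3.25 cut-offs used below are its conventional thresholds:

$$ \mathrm{FIB\text{-}4} = \frac{\text{age (years)} \times \text{AST (U/L)}}{\text{platelets } (10^{9}/\text{L}) \times \sqrt{\text{ALT (U/L)}}} $$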

Design

Retrospective analysis of a prospective cohort study.

Setting

Italian HIV care centers participating in the ICONA Foundation cohort.

Participants

Treatment-naive patients enrolled in ICONA were selected who initiated cART, had hepatitis C virus (HCV) serology results, were HBsAg negative, and had an available FIB-4 index at cART start and during follow-up.

Methods

Cox regression models were used to determine the association of FIB-4 with the risk of major LRE (gastrointestinal bleeding, ascites, hepatic encephalopathy, hepatorenal syndrome, or hepatocellular carcinoma) or LRD.

Results

A total of 3,475 patients were enrolled: 73.3% were male and 27.2% HCV seropositive. At baseline (the time of cART initiation), the median age was 39 years, the median CD4+ T-cell count was 260 cells/µL, and the median HIV RNA was 4.9 log10 copies/mL; 65.9% had a FIB-4 <1.45, 26.4% a FIB-4 of 1.45–3.25, and 7.7% a FIB-4 >3.25. Over a follow-up of 18,662 person-years, 41 events were observed: 25 major LRE and 16 LRD (incidence rate, IR, 2.2 per 1,000 PYFU [95% confidence interval, CI: 1.6–3.0]). The IR was higher in HCV-seropositive than in HCV-seronegative patients (5.9 vs. 0.5 per 1,000 PYFU). Higher baseline FIB-4 category compared with <1.45 (FIB-4 1.45–3.25: HR 3.55, 95% CI 1.09–11.58; FIB-4 >3.25: HR 4.25, 1.21–14.92) and time-updated FIB-4 (FIB-4 1.45–3.25: HR 3.40, 1.02–11.40; FIB-4 >3.25: HR 21.24, 6.75–66.84) were independently predictive of major LRE/LRD after adjusting for HIV- and HCV-related variables, alcohol consumption, and type of cART.
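As a quick consistency check, the crude incidence rate reported above follows directly from events divided by person-time:

$$ \text{IR} = \frac{41 \text{ events}}{18{,}662 \text{ PYFU}} \times 1{,}000 \approx 2.2 \text{ per 1,000 PYFU} $$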

Conclusions

The FIB-4 index at cART initiation and its change over time are risk factors for major LRE or LRD, independent of HCV infection, and could be used to monitor patients on cART.

3.

Purpose

Cataract is a highly prevalent ocular disorder, and environmental risk factors for age-related cataracts have been widely investigated. We aimed to evaluate the association of dietary sodium intake and socioeconomic factors with the development of age-related cataracts.

Methods

We conducted a cross-sectional case-control study based on the 2008–2011 Korea National Health and Nutrition Examination Survey. Dietary sodium intake was estimated using the urinary sodium-to-creatinine ratio (U[Na+]/Cr).

Results

Among a total of 12,693 participants, 2,687 (21.1%) had cataracts and 10,006 participants without cataracts served as controls. The prevalence of cataracts increased with age and with quartiles of U[Na+]/Cr (p for trend < 0.001). Multivariate logistic regression analyses revealed that factors related to the development of cataracts were age ≥ 50 years (adjusted odds ratio [aOR] 15.34, 95% confidence interval [CI] 13.31–17.69), low income (aOR 1.85, 95% CI 1.64–2.09), low educational attainment (aOR 1.76, 95% CI 1.57–1.96), and high sodium intake (U[Na+]/Cr > 16.4 mmol/mmol; aOR 1.29, 95% CI 1.16–1.44). In a subgroup analysis, a robust effect on cataracts across U[Na+]/Cr quartiles was observed in patients ≥ 50 years of age (aOR 1.11, 95% CI 1.04–1.18), though not in younger patients (aOR 1.06, 95% CI 0.96–1.17).

Conclusions

Our results suggest that high sodium intake and low socioeconomic status may affect the development of cataracts, and that a low-salt diet could be helpful for the prevention of cataracts in an older population. Furthermore, efforts to close gaps in health services due to socioeconomic factors may contribute to a reduction in the prevalence of cataracts.

4.

Background

Clinically significant bleeding is important for subsequent optimal case management in dengue patients, but most studies have focused on dengue severity as an outcome. Our study objective was to identify differences in admission parameters between patients who developed clinically significant bleeding and those who did not. We sought to develop a model for discriminating between these patients.

Methods

We conducted a retrospective study of 4,383 adults aged >18 years who were hospitalized with dengue infection at Tan Tock Seng Hospital, Singapore from 2005 to 2008. Patients were divided into those with clinically significant bleeding (n = 188) and those without (n = 4,195). Demographic, clinical, and laboratory variables on admission were compared between groups to determine factors associated with clinically significant bleeding during hospitalization.

Results

On admission, female gender (p<0.001); temperature >38°C (p<0.001); nausea/vomiting (p = 0.009) and abdominal pain/tenderness (p = 0.005); lower systolic blood pressure (p<0.001); higher pulse rate (p<0.001); increased absolute neutrophil count (ANC; p<0.001); reduced absolute lymphocyte count (ALC; p<0.001), haematocrit percentage (p<0.001), and platelet count (p = 0.04); and increased prothrombin time (p = 0.003) were significantly associated with clinically significant bleeding on univariate analysis. Multivariate analysis showed that the independent variables in the final model were female gender (aOR 2.85; 95% CI: 1.9–4.33), temperature >38°C (aOR 1.81; 95% CI: 1.27–2.61), nausea/vomiting (aOR 1.39; 95% CI: 0.94–2.12), ANC (aOR 1.3; 95% CI: 1.15–1.46), ALC (aOR 0.4; 95% CI: 0.25–0.64), haematocrit percentage (aOR 0.96; 95% CI: 0.92–1.002), and platelet count (aOR 0.993; 95% CI: 0.988–0.998). At a cutoff of -3.919, the model achieved an AUC of 0.758 (sensitivity: 0.87, specificity: 0.38, PPV: 0.06, NPV: 0.98).
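A minimal sketch (not the study's code) of how a linear-predictor cutoff such as -3.919 translates into the reported operating characteristics; the labels and scores below are simulated placeholders.

```python
# Threshold a hypothetical logistic-model linear predictor and summarize
# discrimination (AUC) and classification metrics at the cutoff.
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

rng = np.random.default_rng(0)
y = rng.integers(0, 2, 500)               # simulated bleeding labels (0/1)
score = rng.normal(-4.0 + 0.8 * y, 1.0)   # simulated linear predictor

pred = (score > -3.919).astype(int)       # cutoff from the abstract
tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
sens, spec = tp / (tp + fn), tn / (tn + fp)
ppv, npv = tp / (tp + fp), tn / (tn + fn)
print(f"AUC={roc_auc_score(y, score):.3f}  sens={sens:.2f}  spec={spec:.2f}  "
      f"PPV={ppv:.2f}  NPV={npv:.2f}")
```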

Conclusion

Clinical risk factors associated with clinically significant bleeding were identified. This model may be useful to complement clinical judgement in triaging adult dengue patients given the dynamic nature of acute dengue, particularly in pre-identifying those less likely to develop clinically significant bleeding.

5.

Background

WHO recommends regular viral load (VL) monitoring of patients on antiretroviral therapy (ART) for timely detection of virological failure, prevention of acquired HIV drug resistance (HIVDR), and avoidance of unnecessary switching to second-line ART. However, the cost and complexity of routine VL testing remain prohibitive in most resource-limited settings (RLS). We evaluated a simple, low-cost, qualitative viral-failure assay (VFA) on dried blood spots (DBS) in three clinical settings in Uganda.

Methods

We conducted a cross-sectional diagnostic accuracy study in three HIV/AIDS treatment centres of the Joint Clinical Research Centre in Uganda. The VFA employs semi-quantitative detection of HIV-1 RNA amplified from the LTR gene. We used paired dried blood spot (DBS) and plasma samples, with the COBAS AmpliPrep/COBAS TaqMan (Roche), version 2 (VLref) as the reference assay. We applied the VFA at two viral-load thresholds (>5,000 or >1,000 copies/ml).

Results

496 paired VFA and VLref results were available for comparative analysis. Overall, the VFA demonstrated 78.4% sensitivity (95% CI: 69.7%–87.1%), 93% specificity (95% CI: 89.7%–96.4%), 89.3% accuracy (95% CI: 85%–92%), and agreement of kappa = 0.72 compared with the VLref. The positive and negative predictive values among patients on ART for >12 months were 72.7% and 99.3%, respectively.
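For reference, the agreement statistic reported here is Cohen's kappa, which discounts the agreement expected by chance:

$$ \kappa = \frac{p_o - p_e}{1 - p_e} $$

where p_o is the observed proportion of agreement (the 89.3% accuracy above) and p_e is the agreement expected from the marginal frequencies of the two assays; kappa = 0.72 is conventionally read as substantial agreement.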

Conclusions

The VFA correctly classified 89% of patients with respect to viral failure (VF). Only 11% of patients were misclassified, with the potential for an unnecessary or late switch to second-line ART. Our findings present an opportunity to roll out simple and affordable VL monitoring for HIV-1 treatment in RLS.

6.

Background

Community water supplies in underserved areas of the United States may be associated with increased microbiological contamination and risk of gastrointestinal disease. Microbial and health risks affecting such systems have not been systematically characterized outside outbreak investigations. The objective of the study was to evaluate associations between self-reported gastrointestinal illnesses (GII) and household-level water supply characteristics.

Methods

We conducted a cross-sectional study of water quality, water supply characteristics, and GII in 906 households served by 14 small and medium-sized community water supplies in Alabama’s underserved Black Belt region.

Results

We identified associations between respondent-reported water supply interruption and any symptoms of GII (adjusted odds ratio (aOR): 3.01, 95% confidence interval (CI) = 1.65–5.49), as well as between low water pressure and any symptoms of GII (aOR: 4.51, 95% CI = 2.55–7.97). We also identified associations between measured water quality indicators, such as a lack of total chlorine, and any symptoms of GII (aOR: 5.73, 95% CI = 1.09–30.1), and between detection of E. coli in water samples and increased reports of vomiting (aOR: 5.01, 95% CI = 1.62–15.52) or diarrhea (aOR: 7.75, 95% CI = 2.06–29.15).

Conclusions

In this 2012 cross-sectional study of small and medium water systems in rural Alabama, increased self-reported GII was associated with key water system characteristics measured at the point of sampling, suggesting that these water supplies can contribute to endemic gastrointestinal disease risks. Future studies should focus on further characterizing and managing microbial risks in systems facing similar challenges.

7.

Objectives

To examine the process and outcomes of care of COPD patients by Advanced Practice Providers (APPs) and primary care physicians.

Methods

We conducted a cross-sectional retrospective cohort study of Medicare beneficiaries with COPD who had at least one hospitalization in 2010. We examined the process measures of receipt of spirometry evaluation, influenza and pneumococcal vaccination, use of COPD medications, and referral to a pulmonary specialist visit. Outcome measures were emergency room (ER) visits, number of hospitalizations, and 30-day readmission in 2010.

Results

A total of 7,257 Medicare beneficiaries with COPD were included. Of these, 1,999 received primary care from APPs and 5,258 from primary care physicians. Patients in the APP group were more likely to be white, younger, male, and residing in non-metropolitan areas, and to have fewer comorbidities. In terms of process-of-care measures, APPs were more likely to prescribe short-acting bronchodilators (adjusted odds ratio [aOR] = 1.18, 95% confidence interval [CI] 1.05–1.32) and oxygen therapy (aOR = 1.25, 95% CI 1.12–1.40) and to consult a pulmonary specialist (aOR = 1.39, 95% CI 1.23–1.56), but less likely to give influenza and pneumococcal vaccinations. Patients receiving care from APPs had lower rates of ER visits for COPD (aOR = 0.84, 95% CI 0.71–0.98) and a higher rate of follow-up with a pulmonary specialist within 30 days of hospitalization for COPD (aOR = 1.25, 95% CI 1.07–1.48) than those cared for by physicians.

Conclusions

Compared with patients cared for by physicians, patients cared for by APPs were more likely to receive short-acting bronchodilators and oxygen therapy and to be referred to a pulmonologist; however, they had lower vaccination rates, probably owing to their younger age distribution. Patients cared for by APPs were also less likely to visit an ER for COPD, whereas there were no differences in hospitalization or readmission for COPD between physicians and APPs.

8.

Background

Few studies have examined the contribution of treatment to the mortality of dementia in a population-based setting.

Objective

To investigate the effects of anti-dementia and nootropic treatments on the mortality of dementia using a population-based cohort study.

Methods

12,193 incident dementia patients were identified from 2000 to 2010. Their data were compared with those of 12,193 age- and sex-matched non-dementia controls randomly selected from the same database. Dementia was classified into vascular (VaD) and degenerative dementia. Mortality incidence and hazard ratios (HRs) were calculated.
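A minimal sketch of how median survival times such as those reported below are typically estimated (Kaplan-Meier), here using the Python lifelines package; the file and column names are hypothetical.

```python
# Kaplan-Meier median survival with 95% CI per treatment group.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.utils import median_survival_times

df = pd.read_csv("dementia_cohort.csv")   # hypothetical: years, died, group

kmf = KaplanMeierFitter()
for name, g in df.groupby("group"):       # e.g. VaD with/without nootropics
    kmf.fit(g["years"], event_observed=g["died"], label=name)
    ci = median_survival_times(kmf.confidence_interval_)
    print(name, round(kmf.median_survival_time_, 2), ci.values.ravel())
```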

Results

The median survival time was 3.39 years (95% confidence interval [CI]: 2.88–3.79) for VaD without medication, 6.62 years (95% CI: 6.24–7.21) for VaD with nootropics, 3.01 years (95% CI: 2.85–3.21) for degenerative dementia without medication, 8.11 years (95% CI: 6.30–8.55) for degenerative dementia with anti-dementia medication, 6.00 years (95% CI: 5.73–6.17) for degenerative dementia with nootropics, and 9.03 years (95% CI: 8.02–9.87) for degenerative dementia with both anti-dementia and nootropic medications. Compared to the non-dementia group, the HRs among individuals with degenerative dementia were 2.69 (95% CI: 2.55–2.83) without medication, 1.46 (95% CI: 1.39–1.54) with nootropics, 1.05 (95% CI: 0.82–1.34) with anti-dementia medication, and 0.92 (95% CI: 0.80–1.05) with both nootropic and anti-dementia medications. VaD with nootropics had a lower mortality (HR: 1.25, 95% CI: 1.15–1.37) than VaD without medication (HR: 2.46, 95% CI: 2.22–2.72).

Conclusion

Pharmacological treatments have a beneficial effect in prolonging the survival of patients with dementia.

9.

Background

Predictors of death in hospitalized HIV-infected patients have not been previously reported in Bangladesh.

Objective

The primary aim of this study was to determine predictors of death among hospitalized HIV-infected patients at a large urban hospital in Bangladesh.

Methods

A study was conducted in the HIV in-patient unit (Jagori Ward) of icddr,b's Dhaka Hospital. The characteristics of patients who died during hospitalization were compared with those of patients discharged from the ward. Bivariate analysis was performed to determine associations between potential risk factors and death. Multivariable logistic regression was used to identify factors independently associated with death.

Results

Of 293 patients admitted to the Jagori Ward, 57 died during hospitalization. Most hospitalized patients (67%) were male, and the median age was 35 (interquartile range: 2–65) years. Overall, 153 (52%) patients were diagnosed with HIV within 6 months of hospitalization. The most common presumptive opportunistic infections (OIs) identified were tuberculosis (32%), oesophageal candidiasis (9%), Pneumocystis jirovecii pneumonia (PJP) (8%), and histoplasmosis (7%). On multivariable analysis, independent predictors of mortality were CD4 count ≤200 cells/mm3 (adjusted odds ratio [aOR]: 16.6, 95% confidence interval [CI]: 3.7–74.4), PJP (aOR: 18.5, 95% CI: 4.68–73.3), oesophageal candidiasis (aOR: 27.5, 95% CI: 5.5–136.9), malignancy (aOR: 15.2, 95% CI: 2.3–99.4), and bacteriuria (aOR: 7.9, 95% CI: 1.2–50.5). Being on antiretroviral therapy prior to hospitalization (aOR: 0.2, 95% CI: 0.06–0.5) was associated with decreased mortality.

Conclusion

This study showed that most patients who died during hospitalization on the Jagori Ward had HIV-related illnesses which could have been averted with earlier diagnosis of HIV and proper management of OIs. It is prudent to develop a national HIV screening programme to facilitate early identification of HIV.

10.

Background

Systematic reviews of randomised controlled trials report that probiotics reduce the risk of necrotising enterocolitis (NEC) in preterm neonates.

Aim

To determine whether routine probiotic supplementation (RPS) to preterm neonates would reduce the incidence of NEC.

Methods

The incidence of NEC ≥ Stage II and all-cause mortality was compared for an equal period of 24 months ‘before’ (Epoch 1) and ‘after’ (Epoch 2) RPS with Bifidobacterium breve M-16V in neonates <34 weeks. Multivariate logistic regression analysis was conducted to adjust for relevant confounders.

Results

A total of 1,755 neonates (Epoch 1 vs. 2: 835 vs. 920) with comparable gestation and birth weights were admitted. There was a significant reduction in NEC ≥ Stage II (3% vs. 1%; adjusted odds ratio (aOR) = 0.43, 95% CI: 0.21–0.87) and in 'NEC ≥ Stage II or all-cause mortality' (9% vs. 5%; aOR = 0.53, 95% CI: 0.32–0.88), but not in all-cause mortality alone (7% vs. 4%; aOR = 0.58, 95% CI: 0.31–1.06) in Epoch 2. The benefits in neonates <28 weeks did not reach statistical significance: NEC ≥ Stage II, 6% vs. 3% (aOR 0.51, 95% CI: 0.20–1.27); 'NEC ≥ Stage II or all-cause mortality', 21% vs. 14% (aOR = 0.59, 95% CI: 0.29–1.18); all-cause mortality, 17% vs. 11% (aOR = 0.63, 95% CI: 0.28–1.41). There was no probiotic sepsis.

Conclusion

RPS with Bifidobacterium breve M-16V was associated with decreased NEC ≥ Stage II and 'NEC ≥ Stage II or all-cause mortality' in neonates <34 weeks. A larger sample size is required to assess the potential benefits of RPS in neonates <28 weeks.

11.

Background

Limited antiretroviral treatment regimens in resource-limited settings require long-term sustainability of patients on the few available options. We evaluated the incidence and predictors of combined antiretroviral treatment (cART) modifications, in an outpatient cohort of 955 patients who initiated cART between January 2009 and January 2011 in western Kenya.

Methods

cART modification was defined as either a first-time single-drug substitution or a regimen switch. Incidence rates were determined by Poisson regression, and risk factors were assessed using multivariate Cox regression modeling.
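A minimal sketch of these two steps (person-time incidence rate, then adjusted hazard ratios via Cox regression), here with the Python lifelines package; the dataset and column names are hypothetical.

```python
# Crude incidence rate per 100 person-years, then a Cox model for predictors
# of cART modification. df should contain only the model columns (numeric).
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("cart_cohort.csv")  # hypothetical: time_yrs, modified, covariates

ir = 100 * df["modified"].sum() / df["time_yrs"].sum()
print(f"IR = {ir:.1f} per 100 person-years")

cph = CoxPHFitter().fit(df, duration_col="time_yrs", event_col="modified")
cph.print_summary()  # exp(coef) column gives aHRs with 95% CIs
```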

Results

Over a median follow-up period of 10.7 months, 178 (18.7%) patients modified regimens (incidence rate [IR]: 18.6 per 100 person-years [95% CI: 16.2–21.8]). Toxicity was the most commonly cited reason (66.3%). In the adjusted multivariate piecewise Cox regression model, WHO disease stage III/IV (aHR 1.82, 95% CI: 1.25–2.66), stavudine (d4T) use (aHR 2.21, 95% CI: 1.49–3.30), and increasing age (aHR 1.02, 95% CI: 1.00–1.04) were associated with an increased risk of treatment modification within the first year post-cART. Zidovudine (AZT) and tenofovir (TDF) use carried a reduced risk of modification (aHR 0.60, 95% CI: 0.38–0.96 and aHR 0.51, 95% CI: 0.29–0.91, respectively). Beyond one year of treatment, d4T use (aHR 2.75, 95% CI: 1.25–6.05), baseline CD4 counts ≤350 cells/mm3 (aHR 2.45, 95% CI: 1.14–5.26), increasing age (aHR 1.05, 95% CI: 1.02–1.07), and high baseline weight >60 kg (aHR 2.69, 95% CI: 1.58–4.59) were associated with risk of cART modification.

Conclusions

Early treatment initiation at higher CD4 counts and avoidance of d4T may reduce treatment modification and thereby improve the sustainability of patients on the limited available options.

12.

Objective

Current guidelines call for HIV-infected women to deliver via scheduled Caesarean when the maternal HIV viral load (VL) is >1,000 copies/ml. We describe the mode of delivery among HIV-infected women and evaluate adherence to relevant recommendations.

Study Design

We performed a population-based surveillance analysis of HIV-infected pregnant women in Philadelphia from 2005 to 2013, comparing mode of delivery (vaginal, scheduled Caesarean, or emergent Caesarean) by VL during pregnancy, closest to the time of delivery (≤1,000 copies/ml versus an unknown VL or VL >1,000 copies/ml) and associated factors in multivariable analysis.

Results

Our cohort included 824 deliveries from 648 HIV-infected women, of whom 69.4% had a VL ≤1,000 copies/ml and 30.6% lacked a VL or had a VL >1,000 copies/ml during pregnancy, closest to the time of delivery. Mode of delivery varied by VL: 56.6% of births were vaginal, 30.1% scheduled Caesarean, and 13.3% emergent Caesarean when the VL was ≤1,000 copies/ml; when the VL was unknown or >1,000 copies/ml, 32.9% of births were vaginal, 49.9% scheduled Caesarean, and 17.5% emergent Caesarean. In multivariable analyses, Hispanic women (adjusted odds ratio (AOR) 0.17, 95% confidence interval (CI) 0.04–0.76) and non-Hispanic black women (AOR 0.27, 95% CI 0.10–0.77) were less likely to deliver via scheduled Caesarean compared with non-Hispanic white women. Women who delivered prior to 38 weeks' gestation (AOR 0.37, 95% CI 0.18–0.76) were also less likely to deliver via scheduled Caesarean compared with women who delivered after 38 weeks' gestation. An interaction term for race and gestational age at delivery was significant in multivariable analysis: non-Hispanic black (AOR 0.06, 95% CI 0.01–0.36) and Hispanic women (AOR 0.03, 95% CI 0.00–0.59) were more likely to deliver prematurely and less likely to deliver via scheduled Caesarean compared with non-Hispanic white women. Having a previous Caesarean (AOR 27.77, 95% CI 8.94–86.18) increased the odds of scheduled Caesarean delivery.

Conclusions

Only half of deliveries for women with an unknown VL or VL >1,000 copies/ml occurred via scheduled Caesarean. Delivery prior to 38 weeks, particularly among minority women, resulted in a missed opportunity to receive a scheduled Caesarean. However, even when delivering at or after 38 weeks' gestation, a significant proportion of women did not get a scheduled Caesarean when indicated, suggesting a need for focused public health interventions to increase the proportion of women achieving viral suppression during pregnancy and delivering via scheduled Caesarean when indicated.

13.

Background

We conducted a population-based cross-sectional study to examine gender differences in severity, management, and outcome among patients with acute biliary pancreatitis (ABP) because available data are insufficient and conflicting.

Methods

We analyzed 13,110 patients (50.6% male) with first-attack ABP from Taiwan’s National Health Insurance Research Database between 2000 and 2009. The primary outcome was hospital mortality. Secondary outcomes included the development of severe ABP and the provision of treatment measures. Gender difference was assessed using multivariable analyses with generalized estimating equations models.

Results

The odds of gastrointestinal bleeding (adjusted odds ratio [aOR] 1.44, 95% confidence interval [CI] 1.18–1.76) and local complication (aOR 1.38, 95% CI 1.05–1.82) were 44% and 38% higher in men than in women, respectively. Compared with women, men had 24% higher odds of receiving total parenteral nutrition (aOR 1.24, 95% CI 1.00–1.52), but had 18% and 41% lower odds of receiving cholecystectomy (aOR 0.82, 95% CI 0.72–0.93) and hemodialysis (aOR 0.59, 95% CI 0.42–0.83), respectively. Hospital mortality was higher in men than in women (1.8% vs. 1.1%, p = 0.001). After adjustment for potential confounders, men had 81% higher odds of in-hospital death than women (aOR 1.81, 95% CI 1.15–2.86). Among patients with severe ABP, hospital mortality was 11.0% and 7.5% in men and women (p<0.001), respectively. The adjusted odds of death remained higher in men than in women with severe ABP (aOR 1.72, 95% CI 1.10–2.68).

Conclusions

Gender is an important determinant of outcome in patients with ABP and may affect their treatment measures.

14.

Purpose

Non-adherence to tuberculosis therapy can lead to drug resistance, prolonged infectiousness, and death; therefore, understanding what causes treatment default is important. Pakistan has one of the highest burdens of tuberculosis in the world, yet there have been no qualitative studies in Pakistan that have specifically examined why default occurs. We conducted a mixed methods study at a tuberculosis clinic in Karachi to understand why patients with drug-susceptible tuberculosis default from treatment, and to identify factors associated with default. Patients attending this clinic pick up medications weekly and undergo family-supported directly observed therapy.

Methods

In-depth interviews were administered to 21 patients who had defaulted. We also compared patients who defaulted with those who were cured, had completed, or had failed treatment in 2013.

Results

Qualitative analyses showed that the most common reasons for default were the financial burden of treatment and medication side effects and beliefs. The influence of finances on other causes of default was also prominent, as was concern about the effect of treatment on family members. In the quantitative analysis, 301 (14.2%) of 2,120 patients defaulted. Univariate analysis found that male gender (OR: 1.34, 95% CI: 1.04–1.71), being 35–59 years of age (OR: 1.54, 95% CI: 1.14–2.08), and being 60 years of age or older (OR: 1.84, 95% CI: 1.17–2.88) were associated with default. After adjusting for gender, disease site, and patient category, being 35–59 years of age (aOR: 1.49, 95% CI: 1.10–2.03) or 60 years of age or older (aOR: 1.76, 95% CI: 1.12–2.77) remained associated with default.

Conclusions

In multivariate analysis, age was the only variable associated with default. This lack of identifiable risk factors, together with our qualitative findings, implies that default is complex and often due to extrinsic and medication-related factors. More tolerable medications, improved side-effect management, and innovative cost-reduction measures are needed to reduce default from tuberculosis treatment.

15.

Background

Allergy documentation is frequently inconsistent and incomplete. The impact of this variability on subsequent treatment is not well described.

Objective

To determine how allergy documentation affects subsequent antibiotic choice.

Design

Retrospective cohort study.

Participants

232,616 adult patients seen by 199 primary care providers (PCPs) between January 1, 2009 and January 1, 2014 at an academic medical system.

Main Measures

Inter-physician variation in beta-lactam allergy documentation; antibiotic treatment following beta-lactam allergy documentation.

Key Results

15.6% of patients had a reported beta-lactam allergy. Of those patients, 39.8% had a specific allergen identified and 22.7% had allergic reaction characteristics documented. Variation between PCPs was greater than would be expected by chance (all p<0.001) in the percentage of their patients with a documented beta-lactam allergy (7.9% to 24.8%), identification of a specific allergen (e.g. amoxicillin as opposed to “penicillins”) (24.0% to 58.2%) and documentation of the reaction characteristics (5.4% to 51.9%). After beta-lactam allergy documentation, patients were less likely to receive penicillins (Relative Risk [RR] 0.16 [95% Confidence Interval: 0.15–0.17]) and cephalosporins (RR 0.28 [95% CI 0.27–0.30]) and more likely to receive fluoroquinolones (RR 1.5 [95% CI 1.5–1.6]), clindamycin (RR 3.8 [95% CI 3.6–4.0]) and vancomycin (RR 5.0 [95% CI 4.3–5.8]). Among patients with beta-lactam allergy, rechallenge was more likely when a specific allergen was identified (RR 1.6 [95% CI 1.5–1.8]) and when reaction characteristics were documented (RR 2.0 [95% CI 1.8–2.2]).

Conclusions

Provider documentation of beta-lactam allergy is highly variable, and details of the allergy are infrequently documented. Classification of a patient as beta-lactam allergic and incomplete documentation regarding the details of the allergy lead to beta-lactam avoidance and use of other antimicrobial agents, behaviors that may adversely impact care quality and cost.

16.

Background and Purpose

We studied whether anticoagulant use and outcomes differed between rural and urban Canadian non-valvular atrial fibrillation (NVAF) patients prior to the introduction of direct oral anticoagulants.

Methods

Retrospective cohort study of 25,284 adult Albertans with NVAF between April 1, 1999 and December 31, 2008.

Results

Compared with urban patients, rural patients were older (p = 0.0009) and had more comorbidities but lower bleeding risk at baseline. In the first year after NVAF diagnosis, urban patients were less likely to be hospitalized (aOR 0.82, 95% CI 0.77–0.89) or to have an emergency department visit for any reason (aOR 0.61, 95% CI 0.56–0.66), but warfarin dispensation rates (72.2% vs. 71.8% at 365 days, p = 0.98) and clinical outcomes were similar: 7.8% died in both groups, 3.2% rural vs. 2.8% urban had a stroke or systemic embolism (SSE) (aOR 0.92, 95% CI 0.77–1.11), and 6.6% vs. 5.7% (aOR 0.93, 95% CI 0.81–1.06) had a bleed. Baseline SSE risk did not affect warfarin dispensation (73.0% in those with high vs. 72.8% in those with low CHADS2 scores, p = 0.85), but patients at higher baseline bleeding risk were less likely to be using warfarin (69.2% with high vs. 73.6% with low HAS-BLED scores, p<0.0001) in the first 365 days after diagnosis. In warfarin users, bleeding was more frequent (7.5% vs. 6.2%, aHR 1.51 [95% CI 1.33–1.72]) but death or SSE was less frequent (7.0% vs. 18.1%, aHR 0.60 [0.54–0.66]).

Conclusion

Warfarin use and clinical event rates did not differ between rural and urban NVAF patients in a universal-access, publicly funded healthcare system.

17.

Background

Prediction of the neurological sequelae of carbon monoxide poisoning (COP) has not been well studied. We investigated independent predictors of neurological sequelae in patients with COP and combined these predictors to predict prognosis.

Methods

This study was conducted at four hospitals in Shandong Province, China. Data were retrospectively collected from 258 patients with COP between November 1990 and October 2011. Thirty-day neurological sequelae were the primary endpoints.

Results

A lack of pupil reflex and a loss of consciousness appear to be independent predictors of neurological sequelae in patients with COP. The presence of either one had a sensitivity of 77.0% (95% confidence interval [CI]: 69.3–83.2), a specificity of 47.1% (95% CI: 38.3–56.0), a positive predictive value (PPV) of 62.9% (95% CI: 55.2–70.1), and a negative predictive value (NPV) of 63.6% (95% CI: 52.6–73.4). With both predictors present, the sensitivity was 11.5% (95% CI: 6.9–18.3), the specificity was 99.2% (95% CI: 94.7–100.0), the PPV was 94.1% (95% CI: 69.2–99.7), and the NPV was 49.0% (95% CI: 42.5–55.5).

Conclusions

The risk of neurological sequelae apparently increased with the number of independent predictors. In patients with both predictors, the risk of neurological sequelae was 94.1%. Almost all (99.2%) patients with neither predictor had no neurological sequelae. This finding may help physicians make treatment and disposition decisions for patients with COP. For patients at higher risk, earlier treatment and more appropriate utilization of health care services, including hyperbaric oxygen, should be considered.

18.

Context

As life expectancy improves among human immunodeficiency virus (HIV)-infected patients, renal and cardiovascular diseases are increasingly prevalent in this population. Renal and cardiovascular disease are mutual risk factors, and both are characterized by albuminuria. Understanding the interactions between HIV, cardiovascular risk factors, and renal disease is the first step in tackling this new therapeutic frontier in HIV.

Methods

In a rural primary health care centre, 903 HIV-infected adult patients were randomly selected, and data on HIV infection and cardiovascular risk factors were collected. The glomerular filtration rate (eGFR) was estimated. Albuminuria was defined as an albumin-creatinine ratio above 30 mg/g. Multivariate logistic regression was used to analyse albuminuria against demographic, clinical, and HIV-associated variables.

Results

The study population consisted of 903 HIV-infected patients with a median age of 40 years (inter-quartile range (IQR) 34–48 years) and included 625 (69%) women. The median duration since HIV diagnosis was 26 months (IQR 12–58 months), and 787 (87%) received antiretroviral therapy. Thirty-six (4%) of the subjects were shown to have diabetes and 205 (23%) hypertension. In the cohort, 21% had albuminuria and 2% an eGFR <60 mL/min/1.73 m2. Albuminuria was associated with hypertension (adjusted odds ratio (aOR) 1.59; 95% confidence interval (CI) 1.05–2.41; p<0.05), total cholesterol (aOR 1.31; 95% CI 1.11–1.54; p<0.05), eGFR (aOR 0.98; 95% CI 0.97–0.99; p<0.001), and detectable viral load (aOR 2.74; 95% CI 1.56–4.79; p<0.001). Hypertension was undertreated: 78% were not receiving treatment, while another 11% were inadequately treated. No patients were receiving lipid-lowering medication.

Conclusion

The glomerular filtration rate was well conserved, while albuminuria was common among HIV-infected patients in rural South Africa. Both cardiovascular and HIV-specific variables were associated with albuminuria. Improved cardiovascular risk prevention as well as adequate virus suppression may be the key to escaping the vicious circle of renal failure and cardiovascular disease and to improving the long-term prognosis of HIV-infected patients.

19.

Background

Life expectancy has increased for newly diagnosed HIV patients since the inception of combination antiretroviral treatment (cART), but there remains a need to better understand the characteristics of long-term survival in HIV-positive patients. We examined long-term survival in HIV-positive patients receiving cART in the Australian HIV Observational Database (AHOD), to describe changes in mortality compared to the general population and to develop longer-term survival models.

Methods

Data were examined from 2,675 HIV-positive participants in AHOD who started cART. Standardised mortality ratios (SMRs) were calculated by age, sex, and calendar year across prognostic characteristics, using Australian Bureau of Statistics national data as the reference. SMRs were examined by duration of cART, stratified by CD4 count and, similarly, by viral load. Survival was analysed using Cox proportional hazards and parametric survival models.
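For reference, the SMR compares observed cohort deaths with the number expected if stratum-specific general-population rates applied to the cohort's person-time:

$$ \mathrm{SMR} = \frac{O}{E}, \qquad E = \sum_{i} n_i \lambda_i $$

where n_i is cohort person-time in age-sex-calendar-year stratum i and λ_i is the corresponding Australian Bureau of Statistics mortality rate; the overall SMR of 3.5 below therefore means cohort mortality was 3.5 times that expected in the general population.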

Results

The overall SMR for all-cause mortality was 3.5 (95% CI: 3.0–4.0). SMRs by CD4 count were 8.6 (95% CI: 7.2–10.2) for CD4 <350 cells/µl, 2.1 (95% CI: 1.5–2.9) for CD4 = 350–499 cells/µl, and 1.5 (95% CI: 1.1–2.0) for CD4 ≥500 cells/µl. SMRs for patients with CD4 counts <350 cells/µl were much higher than for patients with higher CD4 counts across all durations of cART, as were SMRs for patients with viral loads greater than 400 copies/ml. Multivariate models demonstrated improved survival associated with higher recent CD4 count, lower recent viral load, younger age, never having been HBsAg positive, year of HIV diagnosis, and incidence of ADI. Parametric models showed a fairly constant mortality risk by year of cART up to 15 years of treatment.

Conclusion

Observed mortality remained fairly constant by duration of cART and was modelled accurately by accepted prognostic factors; these rates did not vary much by duration of treatment. Changes in mortality with age were similar to those in the Australian general population.

20.

Background

There are very limited data on children with pneumonia in Mali. The objective was to assess the etiology and factors associated with community-acquired pneumonia in hospitalized children <5 years of age in Mali.

Methods

A prospective hospital-based case-control study was implemented in the pediatric department of Gabriel Touré University Hospital in Bamako, Mali, between July 2011 and December 2012. Cases were children with radiologically confirmed pneumonia; controls were hospitalized children without respiratory features, matched for age and period. Respiratory specimens were collected to identify 19 viruses and 5 bacteria. Whole blood was collected from cases only. Factors associated with pneumonia were assessed by multivariate logistic regression.

Results

Overall, 118 cases and 98 controls were analyzed; 44.1% were female, and the median age was 11 months. Among pneumonia cases, 30.5% were hypoxemic at admission, and mortality was 4.2%. Pneumonia cases differed from controls in clinical signs and symptoms but not in past medical history. Multivariate analysis of nasal swab findings disclosed that S. pneumoniae (adjusted odds ratio [aOR] = 3.4, 95% confidence interval [95% CI]: 1.6–7.0), human metapneumovirus (aOR = 17.2, 95% CI: 2.0–151.4), respiratory syncytial virus [RSV] (aOR = 7.4, 95% CI: 2.3–23.3), and influenza A virus (aOR = 10.7, 95% CI: 1.0–112.2) were associated with pneumonia, independently of patient age, gender, period, and other pathogens. The distribution of S. pneumoniae and RSV differed by season, with higher rates of S. pneumoniae in January–June and of RSV in July–September. Pneumococcal serotypes 1 and 5 were more frequent in pneumonia cases than in controls (P = 0.009 and P = 0.04, respectively).

Conclusions

In this non-PCV population from Mali, pneumonia in children was mainly attributed to S. pneumoniae, RSV, human metapneumovirus, and influenza A virus. Increased pneumococcal conjugate vaccine coverage in children could significantly reduce the burden of pneumonia in sub-Saharan African countries.
