Similar Articles

20 similar articles retrieved.
1.

Setting

Under India's Revised National Tuberculosis Control Programme (RNTCP), >15% of previously-treated patients in the reported 2006 patient cohort defaulted from anti-tuberculosis treatment.

Objective

To assess the timing, characteristics, and risk factors for default amongst re-treatment TB patients.

Methodology

For this case-control study, treatment records were abstracted in 90 randomly selected programme units from all 2006 defaulters from the RNTCP re-treatment regimen (cases), with one consecutively selected non-defaulter per case. Patients who interrupted anti-tuberculosis treatment for >2 months were classified as defaulters.

Results

1,141 defaulters and 1,189 non-defaulters were included. The median duration of treatment prior to default was 81 days (25%–75% interquartile range 44–117 days), and documented retrieval efforts after treatment interruption were inadequate. Defaulters were more likely to have been male (adjusted odds ratio [aOR] 1.4, 95% confidence interval [CI] 1.2–1.7), to have previously defaulted from anti-tuberculosis treatment (aOR 1.3, 95% CI 1.1–1.6), to have received previous treatment from non-RNTCP providers (aOR 1.3, 95% CI 1.0–1.6), or to have had public health facility-based treatment observation (aOR 1.3, 95% CI 1.1–1.6).

Conclusions

Amongst the large number of re-treatment patients in India, default occurs early and often. Improved pre-treatment counseling and community-based treatment provision may reduce default rates. Efforts to retrieve treatment interrupters prior to default require strengthening.

2.
3.

Background

The Médecins Sans Frontières project in Uzbekistan has provided multidrug-resistant tuberculosis treatment in the Karakalpakstan region since 2003. Rates of default from treatment have been high despite psychosocial support, and have increased particularly since programme scale-up in 2007. We aimed to determine factors associated with default in multi- and extensively drug-resistant tuberculosis patients who started treatment between 2003 and 2008 and thus had finished approximately 2 years of treatment by the end of 2010.

Methods

A retrospective cohort analysis of multi- and extensively drug-resistant tuberculosis patients enrolled in treatment between 2003 and 2008 compared baseline demographic characteristics and possible risk factors for default. Default was defined as missing ≥60 consecutive days of treatment (all drugs). Data were routinely collected during treatment and entered into a database. Potential risk factors for default were assessed in univariate analysis using the chi-square test and in multivariate analysis using logistic regression.
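As a rough illustration of the multivariate step described above (not the authors' code), the following Python sketch fits a logistic regression and reports adjusted odds ratios with 95% confidence intervals. The data frame and variable names (age_over_45, previous_default, xdr_or_pre_xdr) are invented for the example.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Toy patient-level data; 1 = defaulted (missed >=60 consecutive days), 0 = did not.
rng = np.random.default_rng(0)
n = 710
df = pd.DataFrame({
    "defaulted": rng.integers(0, 2, n),
    "age_over_45": rng.integers(0, 2, n),
    "previous_default": rng.integers(0, 2, n),
    "xdr_or_pre_xdr": rng.integers(0, 2, n),
})

# Multivariable logistic regression; exponentiated coefficients are adjusted odds ratios.
model = smf.logit("defaulted ~ age_over_45 + previous_default + xdr_or_pre_xdr", data=df).fit()
odds_ratios = np.exp(model.params).rename("aOR")
conf_int = np.exp(model.conf_int())
conf_int.columns = ["2.5%", "97.5%"]
print(pd.concat([odds_ratios, conf_int], axis=1))
```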

Results

20% (142/710) of patients defaulted, after a median of 6 months of treatment (IQR 2.6–9.9 months). Factors associated with default included severity of resistance patterns (pre-extensively drug-resistant/extensively drug-resistant tuberculosis: adjusted odds ratio 0.52, 95% CI 0.31–0.86), previous default (2.38, 1.09–5.24) and age >45 years (1.77, 1.10–2.87). The default rate was 14% (42/294) for patients enrolled in 2003–2006 and 24% (100/416) for 2007–2008 enrolments (p = 0.001).

Conclusions

Default from treatment was high and increased with programme scale-up. It is essential to ensure that scale-up of treatment is accompanied by scale-up of staff and patient support. A successful first course of tuberculosis treatment is important; patients who had previously defaulted were at increased risk of default and death. The protective effect of severe resistance profiles suggests that awareness of disease severity, or fear, may motivate patients to remain in treatment. Targeted health education and support for at-risk patients after 5 months of treatment, when many begin to feel better, may decrease default.

4.

Objective

To determine factors influencing the utilization of and access to bacteriologic-based tuberculosis (TB) diagnosis among sputum smear-positive (SS+) retreatment TB patients, and to develop strategies for improving the case detection rate of MDR-TB in rural China.

Study Design and Setting

A cross-sectional study of SS+ retreatment TB patients was conducted in eight counties of three provinces with different implementation periods and strategies for the MDR-TB programme in China. Demographic and socioeconomic data were collected by self-reported questionnaires. Sputum samples were collected and cultured by the laboratories of county-designated TB clinics and delivered to prefectural Centers for Disease Prevention and Control (CDC) laboratories for drug susceptibility testing (DST) with four first-line anti-TB drugs.

Results

Among the 196 SS+ retreatment patients, 61.22% received culture tests during their current treatment. Patients from more developed regions (OR = 24.0 and 3.6; 95% CI: 8.6–67.3 and 1.1–11.6), with better socio-economic status (OR = 3.8, 95% CI: 1.3–10.7), with multiple previous anti-TB treatments (OR = 5.0, 95% CI: 1.6–15.9), and who had failed the most recent anti-TB treatment (OR = 2.6, 95% CI: 1.0–6.4) were more likely to receive culture tests. The proportions of isolates resistant to any first-line anti-TB drug and of MDR-TB were 50.0% (95% CI: 39.8%–60.2%) and 30.4% (95% CI: 21.0%–39.8%), respectively.

Conclusions

Retreatment SS+ TB patients, a population at high risk of MDR-TB, had poor access to and utilization of bacteriologic-based TB diagnosis, which is far from optimal. The next steps of the anti-TB strategy should focus on making bacteriologic-based diagnosis cheaper, safer and easier to deliver, and on assuring DST-guided treatment for these high-risk TB patients.

5.

Background

The association between diabetes mellitus (DM) and tuberculosis (TB) is re-emerging worldwide. The prevalence of DM is increasing in resource-poor countries where the TB burden is high. The objective of the current study was to determine the prevalence of, and to analyze factors associated with, TB and DM comorbidity in the South-Eastern Amhara Region, Ethiopia.

Methods

This was a facility-based cross-sectional study. All newly diagnosed TB patients attending selected health facilities in the study area were consecutively screened for DM. DM was diagnosed based on World Health Organization diagnostic criteria. A pre-tested semi-structured questionnaire was used to collect socio-demographic, lifestyle and clinical data. Logistic regression analysis was performed to identify factors associated with TB and DM comorbidity.

Result

Among a total of 1314 patients who participated in the study, 109 (8.3%) had DM. Being female (odds ratio [OR] 1.70; 95% confidence interval [CI] 1.10–2.62), older age (41–64 years: OR 3.35, 95% CI 2.01–5.57; 65–89 years: OR 3.18, 95% CI 1.52–6.64), being a pulmonary TB case (OR 1.69; 95% CI 1.09–2.63) and having a family history of DM (OR 4.54; 95% CI 2.36–8.73) were the factors associated with TB and DM comorbidity.

Conclusion

The prevalence of DM among TB patients in the South-Eastern Amhara Region is high. Routine screening of TB patients for DM is recommended in the study area.

6.

Background

Interferon-γ release assays such as the QuantiFERON-TB Gold In-Tube test (QFT-GIT) are designed to detect Mycobacterium tuberculosis infection, whether latent or manifesting as disease. However, a substantial number of persons with culture-confirmed tuberculosis (TB) have negative QFT-GIT results. Information on host factors contributing to false-negative and indeterminate results is limited.

Methods

A multicenter retrospective cohort study was performed in South Korea with 1,264 culture-confirmed TB patients older than 18 years who underwent the QFT-GIT at one of six hospitals between May 2007 and February 2014. Patients with human immunodeficiency virus infection were excluded. Clinical and laboratory data were collected.

Results

Of all patients, 87.6% (1,107/1,264) were diagnosed with pulmonary TB and 12.4% (157/1,264) with extrapulmonary TB. The rate of negative results was 14.4% (182/1,264). The following factors were significantly associated with false-negative QFT-GIT results: advanced age (age ≥65 years, odds ratio [OR] 1.57, 95% confidence interval [CI] 1.03–2.39), bilateral disease on chest radiography (OR 1.75, 95% CI 1.13–2.72), malignancy (OR 2.42, 95% CI 1.30–4.49), and lymphocytopenia (total lymphocyte count <1.0 × 10⁹/L; OR 1.86, 95% CI 1.21–2.87).

Conclusions

Consequently, QFT-GIT results need to be interpreted with caution in patients with these host risk factors: advanced age, bilateral disease on chest radiography, malignancy, or lymphocytopenia.

7.

Rationale

High rates of recurrent tuberculosis after successful treatment have been reported from different high-burden settings in Sub-Saharan Africa. However, little is known about the rate of smear-positive tuberculosis after treatment default. In particular, it is not known whether treatment defaulters remain, or again become, smear-positive and thus pose a potential for transmission of infection to others.

Objective

To investigate, in a high tuberculosis burden setting, the rate of re-treatment for smear-positive tuberculosis among cases defaulting from standardized treatment compared to successfully treated cases.

Methods

Retrospective cohort study among smear-positive tuberculosis cases treated between 1996 and 2008 in two urban communities in Cape Town, South Africa. Episodes of re-treatment for smear-positive tuberculosis were ascertained via probabilistic record linkage. Survival analysis and Poisson regression were used to compare the rate of smear-positive tuberculosis after treatment default to that after successful treatment.
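To make the rate comparison concrete, here is a minimal sketch (in Python) of how incidence rates per 100 person-years and a crude rate ratio with a log-scale 95% CI can be computed under a Poisson assumption. The counts and person-time are chosen to roughly reproduce the rates reported below and are not the study data; the adjusted hazard ratio in the abstract comes from the regression models, not from this crude calculation.

```python
import math

# Aggregate counts chosen to roughly reproduce the reported rates; not the study data.
events_default, py_default = 100, 1458.0   # re-treatment episodes / person-years after default
events_cured, py_cured = 210, 10048.0      # re-treatment episodes / person-years after cure

rate_default = 100 * events_default / py_default   # per 100 person-years
rate_cured = 100 * events_cured / py_cured
rate_ratio = rate_default / rate_cured

# Approximate 95% CI for the crude rate ratio on the log scale, assuming Poisson counts.
se_log_rr = math.sqrt(1 / events_default + 1 / events_cured)
ci_low = rate_ratio * math.exp(-1.96 * se_log_rr)
ci_high = rate_ratio * math.exp(1.96 * se_log_rr)
print(f"{rate_default:.2f} vs {rate_cured:.2f} per 100 person-years; "
      f"crude rate ratio {rate_ratio:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```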

Results

A total of 2,136 smear-positive tuberculosis cases were included in the study. After treatment default, the rate of re-treatment for smear-positive tuberculosis was 6.86 (95% confidence interval [CI]: 5.59–8.41) per 100 person-years compared to 2.09 (95% CI: 1.81–2.41) after cure (adjusted hazard ratio [aHR]: 3.97; 95% CI: 3.00–5.26). Among defaulters, the rate was inversely associated with treatment duration and sputum conversion prior to defaulting. Smear grade at start of the index treatment episode (smear grade 3+: aHR 1.61; 95% CI 1.11–2.33) was independently associated with smear-positive tuberculosis re-treatment, regardless of treatment outcome.

Conclusions

In this high-burden setting, there is a high rate of subsequent smear-positive tuberculosis after treatment default. Treatment defaulters are therefore likely to contribute to the pool of infectious source cases in the community. Our findings underscore the importance of preventing treatment default as a means of successful tuberculosis control in high-burden settings.

8.

Background

Relatively little is known about the specific relationship between chronic obstructive pulmonary disease (COPD) and multidrug-resistant tuberculosis (MDR-TB), or about the impact of COPD on MDR-TB.

Methods

We conducted a retrospective study that included patients aged ≥40 years with confirmed pulmonary TB at three tertiary hospitals (Shandong, China) between January 2011 and October 2014. Univariable and multivariable analyses were performed to identify the relationship between MDR-TB and COPD.

Results

A total of 2164 patients aged ≥40 years with available drug susceptibility test (DST) results and medical records were screened for this study: 268 patients with a discharge diagnosis of COPD and 1896 patients without COPD. Overall, 14.2% of patients with COPD and 8.5% of patients without COPD had MDR-TB; the rate of MDR-TB was significantly higher in patients with COPD (P<0.05). Migrant status (odds ratio [OR] 1.32, 95% confidence interval [CI] 1.02–1.72), previous anti-TB treatment (OR 4.58, 95% CI 1.69–12.42), cavitary disease (OR 2.33, 95% CI 1.14–4.75), and GOLD stage (OR 1.86, 95% CI 1.01–2.93) were independent predictors of MDR-TB among patients with COPD.
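For illustration only, a short sketch of how the crude COPD-versus-no-COPD comparison could be run from a 2×2 table. The cell counts are reconstructed approximately from the percentages above; this is not the paper's analysis code, and the adjusted predictors reported above come from multivariable models instead.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Approximate 2x2 table reconstructed from the percentages above (rows: COPD / no COPD).
#                 MDR-TB  not MDR-TB
table = np.array([[38,    230],     # ~14.2% of 268 patients with COPD
                  [161,   1735]])   # ~8.5% of 1896 patients without COPD

chi2, p, dof, expected = chi2_contingency(table)
crude_or = (table[0, 0] * table[1, 1]) / (table[0, 1] * table[1, 0])
print(f"crude OR = {crude_or:.2f}, chi-square p = {p:.4f}")
```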

Conclusions

MDR-TB occurs more frequently in patients with underlying COPD, especially migrants and those with previous anti-TB therapy, cavitary disease or severe airflow obstruction.

9.

Background

Tuberculosis is a major public health problem, especially in developing countries, and the comparative efficacy and safety of fluoroquinolones (FQs) for adult patients with newly diagnosed, sputum-positive tuberculosis remain controversial. We aimed to investigate the benefits and risks of FQ-containing (addition/substitution) regimens in this population.

Methods

A network meta-analysis was performed to compare FQ addition/substitution regimens (C: ciprofloxacin; O: ofloxacin; Lo: levofloxacin; M: moxifloxacin; G: gatifloxacin) with the standard HRZE regimen (i.e., isoniazid, rifampicin, pyrazinamide and ethambutol) in newly diagnosed, sputum-positive tuberculosis. Medline, Embase and the Cochrane Central Register of Controlled Trials were systematically searched, and randomized trials with a duration longer than 8 weeks were included. The primary outcome was week-8 sputum negativity; secondary outcomes included treatment failure, serious adverse events and all-cause death.

Results

Twelve studies comprising 6465 participants were included in the network meta-analysis. By the Löwenstein-Jensen culture method, HRZEM (OR 4.96, 95% CI 2.83–8.67), MRZE (OR 1.48, 95% CI 1.19–1.84) and HRZM (OR 1.32, 95% CI 1.08–1.62) achieved more sputum conversion than HRZE by the eighth week, whereas HRC (OR 0.39, 95% CI 0.19–0.77) and HRZO (OR 0.47, 95% CI 0.24–0.92) were worse than HRZE. Moxifloxacin-containing regimens also showed more conversion than HRZE by the liquid-culture method at the end of two months. By the end of treatment, however, FQ-containing regimens did not show superiority over HRZE with respect to treatment failure. There were no significant differences between regimens in other outcomes such as serious adverse events and all-cause death.

Conclusion

This comprehensive network meta-analysis showed that, compared with HRZE, moxifloxacin-containing regimens significantly increased sputum conversion by the eighth week for patients with newly diagnosed pulmonary tuberculosis, while the HRC and HRZO regimens were inferior. However, FQ-containing regimens did not show superiority in other outcomes (such as treatment failure, serious adverse events and all-cause death). Thus, HRZE remains an effective regimen for this population. Although moxifloxacin-containing regimens have demonstrated their potential, FQ-containing regimens should be used with great caution to avoid widespread FQ resistance.

10.

Objectives

To examine the associations of intimate partner violence (IPV) with stress-related sleep disturbance (measured using the Ford Insomnia Response to Stress Test [FIRST]) and poor sleep quality (measured using the Pittsburgh Sleep Quality Index [PSQI]) during early pregnancy.

Methods

This cross-sectional study included 634 pregnant Peruvian women. In-person interviews were conducted in early pregnancy to collect information regarding IPV history and sleep traits. Adjusted odds ratios (aOR) and 95% confidence intervals (95% CIs) were calculated using logistic regression procedures.

Results

Lifetime IPV was associated with 1.54-fold increased odds of stress-related sleep disturbance (95% CI: 1.08–2.17) and 1.93-fold increased odds of poor sleep quality (95% CI: 1.33–2.81). Compared with women reporting no lifetime IPV, the aORs for stress-related sleep disturbance associated with each type of IPV were: physical abuse only, 1.24 (95% CI: 0.84–1.83); sexual abuse only, 3.44 (95% CI: 1.07–11.05); and physical and sexual abuse, 2.51 (95% CI: 1.27–4.96). The corresponding aORs for poor sleep quality were 1.72 (95% CI: 1.13–2.61), 2.82 (95% CI: 0.99–8.03), and 2.50 (95% CI: 1.30–4.81), respectively. Women reporting any IPV in the year prior to pregnancy had increased odds of stress-related sleep disturbance (aOR = 2.07; 95% CI: 1.17–3.67) and poor sleep quality (aOR = 2.27; 95% CI: 1.30–3.97) during pregnancy.

Conclusion

Lifetime and prevalent IPV exposures are associated with stress-related sleep disturbance and poor sleep quality during pregnancy. Our findings suggest that sleep disturbances may be important mechanisms that underlie the lasting adverse effects of IPV on maternal and perinatal health.

11.

Background

In human immunodeficiency virus (HIV) treatment, adequate virological suppression is warranted; nevertheless, for some patients it remains a challenge. We investigated factors associated with low-level viraemia (LLV) and virological failure (VF) under combined antiretroviral therapy (cART).

Materials and Methods

We analysed patients receiving standard regimens between 1 July 2012 and 1 July 2013 with at least one viral load (VL) measurement below the quantification limit (BLQ) in their treatment history. After a minimum of 6 months of unmodified cART, the next single VL measurement within 6 months was analysed. VF was defined as an HIV RNA level ≥200 copies/mL, and all other quantifiable measurements were classified as LLV. Factors associated with LLV and VF compared with BLQ were identified by logistic regression models.

Results

Of 2276 participants, 1972 (86.6%) were BLQ, 222 (9.8%) showed LLV and 82 (3.6%) had VF. A higher risk for LLV and VF was shown in patients with cART interruptions and in patients with boosted PI therapy. The risk for LLV and VF was lower in patients from centres using the Abbott compared to the Roche assay to measure VL. A higher risk for LLV but not for VF was found in patients with a higher VL before cART [for >99,999 copies/mL: aOR (95% CI): 4.19 (2.07–8.49); for 10,000–99,999 copies/mL: aOR (95% CI): 2.52 (1.23–5.19)] and shorter cART duration [for <9 months: aOR (95% CI): 2.59 (1.38–4.86)]. A higher risk for VF but not for LLV was found in younger patients [for <30 years: aOR (95% CI): 2.76 (1.03–7.35); for 30–50 years: aOR (95% CI): 2.70 (1.26–5.79)], people originating from high-prevalence countries [aOR (95% CI): 2.20 (1.09–4.42)] and in male injecting drug users [aOR (95% CI): 2.72 (1.38–5.34)].

Conclusions

For both VF and LLV, factors associated with adherence play a prominent role. Furthermore, performance characteristics of the diagnostic assay used for VL quantification should also be taken into consideration.

12.

Background

Tuberculosis is a major occupational hazard in low and middle-income countries. Limited data exist on serial testing of healthcare workers (HCWs) with interferon-γ release assays (IGRAs) for latent tuberculosis infection (LTBI), especially in low and middle-income countries. We sought to evaluate the rates of and risk factors for LTBI prevalence and LTBI test conversion among HCWs using the tuberculin skin test (TST) and QuantiFERON-TB Gold In-tube assay (QFT-GIT).

Methods

A prospective longitudinal study was conducted among HCWs in the country of Georgia. Subjects completed a questionnaire, and TST and QFT-GIT tests were performed. LTBI testing was repeated 6–26 months after baseline testing.

Results

Among 319 HCWs enrolled, 89% reported prior BCG vaccination, and 60% worked in TB healthcare facilities (HCFs). HCWs from TB HCFs had higher prevalence of positive QFT-GIT and TST than those from non-TB HCFs: 107/194 (55%) vs. 30/125 (31%) QFT-GIT positive (p<0.0001) and 128/189 (69%) vs. 64/119 (54%) TST positive (p = 0.01). There was fair agreement between TST and QFT-GIT (kappa = 0.42, 95% CI 0.31–0.52). In multivariate analysis, frequent contact with TB patients was associated with increased risk of positive QFT-GIT (aOR 3.04, 95% CI 1.79–5.14) but not positive TST. Increasing age was associated with increased risk of positive QFT-GIT (aOR 1.05, 95% CI 1.01–1.09) and TST (aOR 1.05, 95% CI 1.01–1.10). High rates of HCW conversion were seen: the QFT-GIT conversion rate was 22.8/100 person-years, and TST conversion rate was 17.1/100 person-years. In multivariate analysis, female HCWs had decreased risk of TST conversion (aOR 0.05, 95% CI 0.01–0.43), and older HCWs had increased risk of QFT-GIT conversion (aOR 1.07 per year, 95% CI 1.01–1.13).
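As a generic illustration of the agreement statistic reported above, the sketch below computes Cohen's kappa between two binary tests from a 2×2 agreement table. The counts are invented and this is not the study's analysis code.

```python
import numpy as np

# Invented 2x2 agreement table: rows = TST result (+/-), columns = QFT-GIT result (+/-).
table = np.array([[110, 70],
                  [25, 103]], dtype=float)

n = table.sum()
observed_agreement = np.trace(table) / n
# Agreement expected by chance, computed from the marginal totals of each test.
expected_agreement = (table.sum(axis=1) * table.sum(axis=0)).sum() / n**2
kappa = (observed_agreement - expected_agreement) / (1 - expected_agreement)
print(f"Cohen's kappa = {kappa:.2f}")
```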

Conclusion

LTBI prevalence and LTBI test conversion rates were high among Georgian HCWs, especially among those working at TB HCFs. These data highlight the need for increased implementation of TB infection control measures.

13.

Objective

To assess the predictive factors for subjective improvement with nonsurgical treatment in consecutive patients with lumbar spinal stenosis (LSS).

Materials and Methods

Patients with LSS were enrolled from 17 medical centres in Japan. We followed up 274 patients (151 men; mean age, 71 ± 7.4 years) for 3 years. A multivariable logistic regression model was used to assess the predictive factors for subjective symptom improvement with nonsurgical treatment.

Results

In 30% of patients, conservative treatment led to a subjective improvement in the symptoms; in 70% of patients, the symptoms remained unchanged, worsened, or required surgical treatment. The multivariable analysis of predictive factors for subjective improvement with nonsurgical treatment showed that the absence of cauda equina symptoms (only radicular symptoms) had an odds ratio (OR) of 3.31 (95% confidence interval [CI]: 1.50–7.31); absence of degenerative spondylolisthesis/scoliosis had an OR of 2.53 (95% CI: 1.13–5.65); <1-year duration of illness had an OR of 3.81 (95% CI: 1.46–9.98); and hypertension had an OR of 2.09 (95% CI: 0.92–4.78).

Conclusions

The predictive factors for subjective symptom improvement with nonsurgical treatment in LSS patients were the presence of only radicular symptoms, absence of degenerative spondylolisthesis/scoliosis, and an illness duration of <1 year.

14.

Background

Severe sepsis may be present on hospital arrival in approximately one-third of patients with community-acquired pneumonia (CAP).

Objective

To determine the host characteristics and micro-organisms associated with severe sepsis in patients hospitalized with CAP.

Results

We performed a prospective multicenter cohort study in 13 Spanish hospitals of 4070 hospitalized CAP patients, 1529 of whom (37.6%) presented with severe sepsis. Severe sepsis CAP was independently associated with older age (>65 years), alcohol abuse (OR, 1.31; 95% CI, 1.07–1.61), chronic obstructive pulmonary disease (COPD) (OR, 1.75; 95% CI, 1.50–2.04) and renal disease (OR, 1.57; 95% CI, 1.21–2.03), whereas prior antibiotic treatment was a protective factor (OR, 0.62; 95% CI, 0.52–0.73). Bacteremia (OR, 1.37; 95% CI, 1.05–1.79), S. pneumoniae (OR, 1.59; 95% CI, 1.31–1.95) and mixed microbial etiology (OR, 1.65; 95% CI, 1.10–2.49) were associated with severe sepsis CAP.

Conclusions

CAP patients with COPD, renal disease or alcohol abuse, as well as those with CAP due to S. pneumoniae or mixed micro-organisms, are more likely to present to the hospital with severe sepsis.

15.

Background

Tuberculosis is a major health concern in prisons, particularly where HIV prevalence is high. Our objective was to determine the prevalence of undiagnosed pulmonary tuberculosis (“undiagnosed tuberculosis”) in a representative sample of prisoners in a South African prison. In addition, we investigated risk factors for undiagnosed tuberculosis, to explore whether screening strategies could be targeted to high-risk groups, and evaluated the performance of screening tools for tuberculosis.

Methods and Findings

In this cross-sectional survey, male prisoners were screened for tuberculosis using symptoms, chest radiograph (CXR) and two spot sputum specimens for microscopy and culture. Anonymised HIV antibody testing was performed on urine specimens. The sensitivity, specificity and predictive values of symptoms and investigations were calculated, using Mycobacterium tuberculosis isolated on sputum culture as the gold standard. From September 2009 to October 2010, 1046 male prisoners were offered enrolment in the study. A total of 981 (93.8%) consented (median age 32 years; interquartile range [IQR] 27–37 years) and were screened for tuberculosis. Among 968 not taking tuberculosis treatment and with sputum culture results, 34 (3.5%; 95% confidence interval [CI] 2.4–4.9%) were culture positive for Mycobacterium tuberculosis. HIV prevalence was 25.3% (242/957; 95% CI 22.6–28.2%). Positive HIV status (adjusted odds ratio [aOR] 2.0; 95% CI 1.0–4.2) and being an ex-smoker (aOR 2.6; 95% CI 1.2–5.9) were independently associated with undiagnosed tuberculosis. Compared with the gold standard of positive sputum culture, cough of any duration had a sensitivity of 35.3% and a specificity of 79.6%. CXR was the most sensitive single screening modality (sensitivity 70.6%, specificity 92.2%). Adding CXR to cough of any duration gave a tool with a sensitivity of 79.4% and a specificity of 73.8%.
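The screening-tool metrics above can be reproduced approximately from a 2×2 table against the culture gold standard. The sketch below uses counts back-calculated from the reported CXR sensitivity, specificity and study size, so it is illustrative rather than the authors' analysis.

```python
# Counts back-calculated from the reported CXR sensitivity (70.6%), specificity (92.2%)
# and the 968 prisoners with culture results (34 culture positive); approximate only.
tp, fn = 24, 10    # culture-positive: screen-positive / screen-negative
fp, tn = 73, 861   # culture-negative: screen-positive / screen-negative

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)   # positive predictive value
npv = tn / (tn + fn)   # negative predictive value
print(f"sensitivity {sensitivity:.1%}, specificity {specificity:.1%}, "
      f"PPV {ppv:.1%}, NPV {npv:.1%}")
```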

Conclusions

The prevalence of undiagnosed tuberculosis and of HIV was high in this prison, justifying routine screening for tuberculosis at entry into the prison and intensified case finding among existing prisoners.

16.

Background

Community water supplies in underserved areas of the United States may be associated with increased microbiological contamination and risk of gastrointestinal disease. Microbial and health risks affecting such systems have not been systematically characterized outside outbreak investigations. The objective of the study was to evaluate associations between self-reported gastrointestinal illnesses (GII) and household-level water supply characteristics.

Methods

We conducted a cross-sectional study of water quality, water supply characteristics, and GII in 906 households served by 14 small and medium-sized community water supplies in Alabama’s underserved Black Belt region.

Results

We identified associations between respondent-reported water supply interruption and any symptoms of GII (adjusted odds ratio (aOR): 3.01, 95% confidence interval (CI) = 1.65–5.49), as well as low water pressure and any symptoms of GII (aOR: 4.51, 95% CI = 2.55–7.97). We also identified associations between measured water quality such as lack of total chlorine and any symptoms of GII (aOR: 5.73, 95% CI = 1.09–30.1), and detection of E. coli in water samples and increased reports of vomiting (aOR: 5.01, 95% CI = 1.62–15.52) or diarrhea (aOR: 7.75, 95% CI = 2.06–29.15).

Conclusions

Increased self-reported GII was associated with key water system characteristics measured at the point of sampling in this cross-sectional study of small and medium water systems in rural Alabama in 2012, suggesting that these water supplies can contribute to endemic gastrointestinal disease risks. Future studies should focus on further characterizing and managing microbial risks in systems facing similar challenges.

17.

Purpose

Cataract is a highly prevalent ocular disorder, and environmental risk factors for age-related cataracts have been widely investigated. We aimed to evaluate the association of dietary sodium intake and socioeconomic factors with the development of age-related cataracts.

Methods

This was a cross-sectional case-control study based on the 2008–2011 Korea National Health and Nutrition Examination Survey. Dietary sodium intake was estimated using the urinary sodium-to-creatinine ratio (U[Na+]/Cr).

Results

Among a total of 12,693 participants, 2,687 (21.1%) had cataracts and 10,006 participants without cataracts served as controls. The prevalence of cataracts increased with age and with quartiles of U[Na+]/Cr (p for trend < 0.001). Multivariate logistic regression analyses revealed that factors related to the development of cataracts were age ≥50 years (adjusted odds ratio [aOR] 15.34, 95% confidence interval [CI] 13.31–17.69), low income (aOR 1.85, 95% CI 1.64–2.09), low educational attainment (aOR 1.76, 95% CI 1.57–1.96), and high sodium intake (U[Na+]/Cr > 16.4 mmol/mmol; aOR 1.29, 95% CI 1.16–1.44). In a subgroup analysis, a robust effect on cataracts across U[Na+]/Cr quartiles was observed in participants ≥50 years of age (aOR 1.11, 95% CI 1.04–1.18), though not in younger participants (aOR 1.06, 95% CI 0.96–1.17).

Conclusions

Our results suggest that high sodium intake and low socioeconomic status may affect the development of cataracts, and that a low-salt diet could be helpful for the prevention of cataracts in an older population. Furthermore, efforts to close gaps in health services due to socioeconomic factors may contribute to a reduction in the prevalence of cataracts.

18.

Background

The increase in urban migrants is one of the major challenges for tuberculosis control in China. Differences in the characteristics of tuberculosis cases between urban migrants and local residents in China have not been investigated before.

Methodology/Principal Findings

We performed a retrospective study of all pulmonary tuberculosis patients reported in Songjiang district, Shanghai, to determine the demographic, clinical and microbiological characteristics of tuberculosis cases among urban migrants and local residents. We calculated odds ratios (OR) and performed multivariate logistic regression to identify the characteristics independently associated with tuberculosis among urban migrants. A total of 1,348 pulmonary tuberculosis cases were reported during 2006–2008, of whom 440 (32.6%) were local residents and 908 (67.4%) were urban migrants. Urban migrants (38.9/100,000 population) had higher tuberculosis rates than local residents (27.8/100,000 population), and the rates among persons younger than 35 years were 3 times higher among urban migrants than among local residents. Younger age (adjusted OR per additional year at risk = 0.92, 95% CI: 0.91–0.94, p<0.001), poor treatment outcome (adjusted OR = 4.12, 95% CI: 2.65–5.72, p<0.001), and a lower frequency of any comorbidity at diagnosis (adjusted OR = 0.20, 95% CI: 0.13–0.26, p = 0.013) were significantly associated with tuberculosis among urban migrants. Poor treatment outcomes among urban migrants were mainly due to transfer to another jurisdiction (19.3% of all tuberculosis patients among urban migrants).

Conclusions/Significance

A considerable proportion of tuberculosis cases in Songjiang district, China, during 2006–2008 occurred among urban migrants. Our findings highlight the need to develop and implement specific tuberculosis control strategies for urban migrants, such as more exhaustive case finding, improved case management and follow-up, and use of directly observed therapy (DOT).

19.

Introduction

Prison settings have often been identified as important but neglected reservoirs of TB. This study was designed to determine the prevalence of undiagnosed pulmonary TB and to assess potential risk factors for such cases in prisons of the Tigray region.

Method

A cross-sectional study was conducted between August 2013 and February 2014 in nine prisons. A standardized symptom-based questionnaire was initially used to identify presumptive TB cases. From each, three consecutive sputum samples were collected for acid-fast bacilli (AFB) microscopy and culture. Blood samples were collected from consenting participants for HIV testing.

Result

Out of 809 presumptive TB cases with culture results, 4.0% (95% CI: 2.65–5.35) were confirmed to have undiagnosed TB. The overall estimated point prevalence of undiagnosed TB was 505/100,000 prisoners (95% CI: 360–640). Together with the 27 patients who were already on treatment, the overall estimated point prevalence of TB would be 793/100,000 prisoners (95% CI: 610–970), about four times higher than in the general population. The ratio of active to passive case detection was 1.18:1. The prevalence of HIV was 4.4% (36/809) among presumptive TB cases and 6.3% (2/32) among undiagnosed TB cases. In a multivariate logistic regression analysis, chewing khat (adjusted OR = 2.81; 95% CI: 1.02–7.75) and having had close contact with a TB patient (adjusted OR = 2.18; 95% CI: 1.05–4.51) were predictors of undiagnosed TB among presumptive TB cases.
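As a quick worked check of the proportion quoted above, the sketch below computes the undiagnosed-TB proportion (32 of 809, as implied by the figures reported) with a normal-approximation 95% CI. The authors' exact interval method is not stated, so the result differs slightly from the reported 2.65–5.35%.

```python
import math

# 32 of 809 presumptive TB cases were culture-confirmed as undiagnosed TB (about 4.0%).
cases, n = 32, 809
p = cases / n
se = math.sqrt(p * (1 - p) / n)
ci_low, ci_high = p - 1.96 * se, p + 1.96 * se
print(f"proportion {p:.1%}, 95% CI {ci_low:.2%}-{ci_high:.2%}")
```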

Conclusions

This study revealed that at least half of symptomatic pulmonary TB cases in Northern Ethiopian prisons remain undiagnosed and hence untreated. The prevalence of undiagnosed TB in the study prisons was more than two-fold higher than in the general population of Tigray. This may indicate the need for more investment and commitment to improving TB case detection in the study prisons.

20.

Objective

The objective of the present study was to examine the associations between metabolic syndrome (MS) components, such as overweight (OW), hypertension (HT), dyslipidemia (DL), and impaired glucose tolerance (IGT), and intervertebral disc degeneration (DD).

Design

The present study included 928 participants (308 men, 620 women) of the 1,011 participants in the Wakayama Spine Study. DD on magnetic resonance imaging was classified according to the Pfirrmann system. OW, HT, DL, and IGT were assessed using the criteria of the Examination Committee of Criteria for MS in Japan.

Results

Multivariable logistic regression analysis revealed that OW was significantly associated with cervical, thoracic, and lumbar DD (cervical: odds ratio [OR], 1.28; 95% confidence interval [CI], 0.92–1.78; thoracic: OR, 1.75; 95% CI, 1.24–2.51; lumbar: OR, 1.87; 95% CI, 1.06–3.48). HT and IGT were significantly associated with thoracic DD (HT: OR, 1.54; 95% CI, 1.09–2.18; IGT: OR, 1.65; 95% CI, 1.12–2.48). Furthermore, subjects with 1 or more MS components had a higher OR for thoracic DD compared with those without MS components (vs. no component; 1 component: OR, 1.58; 95% CI, 1.03–2.42; 2 components: OR, 2.60; 95% CI, 1.62–4.20; ≥3 components: OR, 2.62; 95% CI, 1.42–5.00).

Conclusion

MS components were significantly associated with thoracic DD. Furthermore, accumulation of MS components significantly increased the OR for thoracic DD. These findings support the need for further studies of the effects of metabolic abnormality on DD.
