Similar Articles
20 similar articles retrieved (search took 15 ms)
1.

Background

Healthy lifestyle including sufficient physical activity may mitigate or prevent adverse long-term effects of childhood cancer. We described daily physical activities and sports in childhood cancer survivors and controls, and assessed determinants of both activity patterns.

Methodology/Principal Findings

The Swiss Childhood Cancer Survivor Study is a questionnaire survey including all children diagnosed with cancer in 1976–2003 at age 0–15 years, registered in the Swiss Childhood Cancer Registry, who survived ≥5 years and reached adulthood (≥20 years). Controls came from the population-based Swiss Health Survey. We compared the two populations and determined risk factors for both outcomes in separate multivariable logistic regression models. The sample included 1058 survivors and 5593 controls (response rates 78% and 66%). Sufficient daily physical activities were reported by 52% (n = 521) of survivors and 37% (n = 2069) of controls (p<0.001). In contrast, 62% (n = 640) of survivors and 65% (n = 3635) of controls reported engaging in sports (p = 0.067). Risk factors for insufficient daily activities in both populations were: older age (OR for ≥35 years: 1.5, 95% CI 1.2–2.0), female gender (OR 1.6, 95% CI 1.3–1.9), French/Italian speaking (OR 1.4, 95% CI 1.1–1.7), and higher education (OR for university education: 2.0, 95% CI 1.5–2.6). Risk factors for no sports were: being a survivor (OR 1.3, 95% CI 1.1–1.6), older age (OR for ≥35 years: 1.4, 95% CI 1.1–1.8), migration background (OR 1.5, 95% CI 1.3–1.8), French/Italian speaking (OR 1.4, 95% CI 1.2–1.7), lower education (OR for compulsory schooling only: 1.6, 95% CI 1.2–2.2), being married (OR 1.7, 95% CI 1.5–2.0), having children (OR 1.3, 95% CI 1.4–1.9), obesity (OR 2.4, 95% CI 1.7–3.3), and smoking (OR 1.7, 95% CI 1.5–2.1). Type of diagnosis was only associated with sports.

Conclusions/Significance

Physical activity levels in survivors were lower than recommended, but comparable to controls and mainly determined by socio-demographic and cultural factors. Strategies to improve physical activity levels could be similar to those for the general population.

2.

Background

A common weakness of patient satisfaction surveys is a suboptimal participation rate. Some patients may be unable to participate, because of language barriers, physical limitations, or mental problems. As the role of these barriers is poorly understood, we aimed to identify patient characteristics that are associated with non-participation in a patient satisfaction survey.

Methodology

At the University Hospitals of Geneva, Switzerland, a patient satisfaction survey is regularly conducted among all adult patients hospitalized for >24 hours over a one-month period in the departments of internal medicine, geriatrics, surgery, neurosciences, psychiatry, and gynaecology-obstetrics. To assess the factors associated with non-participation in the patient satisfaction survey, a case-control study was conducted among patients selected for the 2005 survey. Cases (non-respondents, n = 195) and controls (respondents, n = 205) were randomly selected from the satisfaction survey, and information about potential barriers to participation was abstracted in a blinded fashion from the patients' medical and nursing charts.

Principal Findings

Non-participation in the satisfaction survey was independently associated with the presence of a language barrier (odds ratio [OR] 4.53, 95% confidence interval [CI95%]: 2.14–9.59), substance abuse (OR 3.75, CI95%: 1.97–7.14), cognitive limitations (OR 3.72, CI95%: 1.64–8.42), a psychiatric diagnosis (OR 1.99, CI95%: 1.23–3.23) and a sight deficiency (OR 2.07, CI95%: 0.98–4.36). The odds ratio for non-participation increased gradually with the number of predictors.

Conclusions

Five predictors of non-participation in a mailed survey were identified. Gathering patient feedback through mailed surveys may lead to an under-representation of some patient subgroups.
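The odds ratios reported in the case-control analysis above come from logistic regression; for a single binary predictor, the crude odds ratio and its Wald 95% CI can be computed directly from the 2×2 table. A minimal sketch with hypothetical counts (not the study's actual data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: language barrier among 195 non-respondents
# (cases) vs. 205 respondents (controls)
or_, lo, hi = odds_ratio_ci(40, 155, 12, 193)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # OR ≈ 4.15 (95% CI 2.1–8.2)
```

The adjusted ORs in the abstract additionally control for the other predictors, which requires fitting the full regression model rather than this single-table calculation.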

3.

Background

Inflammatory bowel disease (IBD) is a chronic intestinal disorder that is associated with a limited number of clinical biomarkers. In order to facilitate the diagnosis of IBD and assess its disease activity, we investigated the potential of novel multivariate indexes using statistical modeling of plasma amino acid concentrations (aminogram).

Methodology and Principal Findings

We measured fasting plasma aminograms in 387 IBD patients (Crohn's disease (CD), n = 165; ulcerative colitis (UC), n = 222) and 210 healthy controls. Based on Fisher linear classifiers, multivariate indexes were developed from the aminogram in discovery samples (CD, n = 102; UC, n = 102; age- and sex-matched healthy controls, n = 102) and internally validated. The indexes were used to discriminate between CD or UC patients and healthy controls, as well as between patients with active disease and those in remission. We assessed index performance using the area under the receiver operating characteristic curve (ROC AUC). We observed significant alterations to the plasma aminogram, including histidine and tryptophan. The multivariate indexes established from plasma aminograms distinguished CD or UC patients from healthy controls with ROC AUCs of 0.940 (95% confidence interval (CI): 0.898–0.983) and 0.894 (95% CI: 0.853–0.935), respectively, in validation samples (CD, n = 63; UC, n = 120; healthy controls, n = 108). In addition, other indexes appeared to be measures of disease activity: they distinguished active CD or UC patients from those in remission with ROC AUCs of 0.894 (95% CI: 0.853–0.935) and 0.849 (95% CI: 0.770–0.928), and correlated with clinical disease activity indexes for CD (rs = 0.592, 95% CI: 0.385–0.742, p<0.001) and UC (rs = 0.598, 95% CI: 0.452–0.713, p<0.001), respectively.

Conclusions and Significance

In this study, we demonstrated that multivariate indexes composed of plasma amino acid profiles can serve as novel, non-invasive, objective biomarkers for the diagnosis and monitoring of IBD, providing new insights into the pathophysiology of the disease.
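The ROC AUC used above to evaluate the aminogram indexes equals the Mann–Whitney statistic: the probability that a randomly chosen patient scores higher on the index than a randomly chosen control, counting ties as one half. A minimal sketch with made-up index scores (the actual index weights are not given in the abstract):

```python
def roc_auc(pos, neg):
    """ROC AUC via the Mann-Whitney statistic: fraction of
    (positive, negative) pairs in which the positive scores higher,
    counting ties as 1/2."""
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical multivariate index scores (higher = more disease-like)
patients = [2.1, 1.7, 2.8, 0.9, 1.5]
controls = [0.4, 1.0, 0.2, 1.6, 0.8]
print(f"ROC AUC = {roc_auc(patients, controls):.2f}")  # → 0.88
```

An AUC of 0.5 corresponds to chance-level discrimination and 1.0 to perfect separation, which is why the reported values of 0.85–0.94 indicate strong diagnostic performance.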

4.

Background

There is limited empirical research on the underlying gender inequity norms shaping gender-based violence, power, and HIV risks in sub-Saharan Africa, or how risk pathways may differ for men and women. This study is among the first to directly evaluate the adherence to gender inequity norms and epidemiological relationships with violence and sexual risks for HIV infection.

Methods

Data were derived from population-based cross-sectional samples recruited through two-stage probability sampling from the 5 highest HIV prevalence districts in Botswana and all districts in Swaziland (2004–5). Based on evidence of established risk factors for HIV infection, we aimed 1) to estimate the mean adherence to gender inequity norms for both men and women; and 2) to model the independent effects of higher adherence to gender inequity norms on a) male sexual dominance (male-controlled sexual decision making and rape (forced sex)); b) sexual risk practices (multiple/concurrent sex partners, transactional sex, unprotected sex with non-primary partner, intergenerational sex).

Findings

A total of 2049 individuals were included, n = 1255 from Botswana and n = 796 from Swaziland. In separate multivariate logistic regression analyses, higher gender inequity norms scores remained independently associated with increased male-controlled sexual decision making power (AORmen = 1.90, 95%CI:1.09–2.35; AORwomen = 2.05, 95%CI:1.32–2.49), perpetration of rape (AORmen = 2.19 95%CI:1.22–3.51), unprotected sex with a non-primary partner (AORmen = 1.90, 95%CI:1.14–2.31), intergenerational sex (AORwomen = 1.36, 95%CI:1.08–1.79), and multiple/concurrent sex partners (AORmen = 1.42, 95%CI:1.10–1.93).

Interpretation

These findings support the critical, evidence-based need for gender-transformative HIV prevention efforts, including legislation of women's rights, in two of the most HIV-affected countries in the world.

5.

Background

Traditional methods of diagnosing mucosal leishmaniasis (ML), such as biopsy with histopathology, are insensitive and require collection of an invasive diagnostic specimen.

Methods

We compared standard invasive procedures, including biopsy histopathology, biopsy PCR, and the leishmanin skin test (LST), to a novel, non-invasive, cytology-brush-based PCR for the diagnosis of ML in Lima, Peru. The consensus reference standard was 2/4 tests positive, and the outcome measures were sensitivity and specificity. Leishmania species identification was performed by PCR-based assays of positive specimens.

Results

Twenty-eight patients were enrolled, 23 of whom fulfilled criteria for a diagnosis of ML. Sensitivity and specificity were 21.7% [95% CI 4.9–38.5%] and 100% for biopsy with histopathology; 69.6% [95% CI 50.8–88.4%] and 100% for LST; 95.7% [95% CI 87.4–100%] and 100% for biopsy PCR; and 95.7% [95% CI 87.4–100%] and 90% [95% CI 71.4–100%] for cytology brush PCR using both Cervisoft® and Histobrush® cervical cytology brushes. Species identified by PCR-RFLP included L. (V.) braziliensis (n = 4) and L. (V.) peruviana (n = 3).

Conclusions

Use of commercial-grade cytology brush PCR for diagnosis of ML is sensitive, rapid, well tolerated, and carries none of the risks of invasive diagnostic procedures such as biopsy. Further optimization is required for adequate species identification. Further evaluation of this method in field and other settings is warranted.

6.

Background

Starting HAART at a very advanced stage of disease is assumed to be the most prevalent form of initiation among HIV-infected subjects in developing countries. Data from Latin America and the Caribbean are still lacking. Our main objective was to determine the frequency, risk factors and time trends of being a late HAART initiator (LHI) in this region.

Methodology

Cross-sectional analysis of 9817 HIV-infected treatment-naïve patients initiating HAART at 6 sites (Argentina, Chile, Haiti, Honduras, Peru and Mexico) from October 1999 to July 2010. LHI had a CD4+ count ≤200 cells/mm3 prior to HAART. Late testers (LT) were those LHI who initiated HAART within 6 months of HIV diagnosis; late presenters (LP) initiated more than 6 months after diagnosis. Prevalence, risk factors and trends over time were analyzed.

Principal Findings

Among subjects starting HAART (n = 9817) with baseline CD4+ available (n = 8515), 76% were LHI: Argentina (56% [95% CI: 52–59]), Chile (80% [95% CI: 77–82]), Haiti (76% [95% CI: 74–77]), Honduras (91% [95% CI: 87–94]), Mexico (79% [95% CI: 75–83]), Peru (86% [95% CI: 84–88]). The proportion of LHI changed significantly over time in all countries except Honduras (p≤0.02; Honduras p = 0.7), with a tendency towards lower rates in recent years. Males had an increased risk of LHI in Chile, Haiti, Peru, and in the combined site analyses (CSA). Older patients were more likely to be LHI in Argentina and Peru (OR 1.21 per +10 years of age, 95% CI: 1.02–1.45; OR 1.20, 95% CI: 1.02–1.43; respectively), but not in CSA (OR 1.07, 95% CI: 0.94–1.21). Higher education was associated with decreased risk of LHI in Chile (OR 0.92 per +1 year of education, 95% CI: 0.87–0.98), with similar trends in Mexico, Peru, and CSA. Among LHI with date of HIV diagnosis available, 55% were LT and 45% LP.

Conclusion

LHI was highly prevalent in CCASAnet sites, mostly due to LT; the main associated risk factors were male sex and older age. Earlier HIV diagnosis and earlier treatment initiation are needed to maximize the benefits of HAART in the region.
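The country-level proportions with confidence intervals reported above (e.g. 56% [95% CI: 52–59]) are binomial point estimates with interval estimates. One standard way to compute such an interval is the Wilson score method; a minimal sketch with hypothetical counts (the per-country denominators are not given in the abstract):

```python
import math

def wilson_ci(k, n, z=1.96):
    """Wilson score 95% CI for a proportion of k successes out of n."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# Hypothetical: 560 late initiators among 1000 patients at one site
lo, hi = wilson_ci(560, 1000)
print(f"56.0% (95% CI {lo:.1%}-{hi:.1%})")  # ≈ 52.9%–59.0%
```

The Wilson interval behaves better than the simple Wald interval for proportions near 0 or 1, which matters for sites like Honduras (91%).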

7.

Introduction

Biomarker-based cross-sectional incidence estimation requires a Recent Infection Testing Algorithm (RITA) with an adequately large mean recency duration, to achieve reasonable survey counts, and a low false-recent rate, to minimise exposure to further bias and imprecision. Estimating these characteristics requires specimens from individuals with well-known seroconversion dates or confirmed long-standing infection. Specimens with well-known seroconversion dates are typically rare and precious, presenting a bottleneck in the development of RITAs.

Methods

The mean recency duration and a ‘false-recent rate’ are estimated from data on seroconverting blood donors. Within an idealised model for the dynamics of false-recent results, blood donor specimens were used to characterise RITAs by a new method that maximises the likelihood of cohort-level recency classifications, rather than modelling individual sojourn times in recency.

Results

For a range of assumptions about false-recent results (0% to 20% of biomarker response curves failing to reach the threshold distinguishing test-recent from test-non-recent infection), the mean recency duration of the Vironostika-LS ranged from 154 (95% CI: 96–231) to 274 (95% CI: 234–313) days in the South African donor population (n = 282), and from 145 (95% CI: 67–226) to 252 (95% CI: 194–308) days in the American donor population (n = 106). Effects of gender and clade on performance were not significant (p = 0.10), and utility in incidence estimation appeared comparable to that of a BED-like RITA. Assessment of the Vitros-LS (n = 108) suggested potentially high false-recent rates.

Discussion

The new method facilitates RITA characterisation using widely available specimens that were previously overlooked, at the cost of possible artefacts. While accuracy and precision are insufficient to provide estimates suitable for incidence surveillance, a low-cost approach for preliminary assessments of new RITAs has been demonstrated. The Vironostika-LS and Vitros-LS warrant further analysis to provide greater precision of estimates.

8.
Cao L, Silvestry S, Zhao N, Diehl J, Sun J. PLoS ONE 2012;7(2):e30094

Background and Objective

Postoperative cardiocerebral and renal complications are a major threat for patients undergoing cardiac surgery. This study aimed to examine the effect of preoperative aspirin use in patients undergoing cardiac surgery.

Methods

An observational cohort study was performed on consecutive patients (n = 1879) receiving cardiac surgery at this institution. Patients were excluded if they were on preoperative anticoagulants, had unknown aspirin use, or underwent emergent cardiac surgery. Outcome events were 30-day mortality, renal failure, readmission, and a composite outcome of major adverse cardiocerebral events (MACE) comprising permanent or transient stroke, coma, perioperative myocardial infarction (MI), heart block and cardiac arrest.

Results

Of all patients, 1145 met the inclusion criteria and were divided into two groups: those taking (n = 858) or not taking (n = 287) aspirin within 5 days preceding surgery. Patients taking aspirin presented significantly more often with a history of hypertension, diabetes, peripheral arterial disease, previous MI, angina, and older age. After propensity-score adjustment and multivariate logistic regression, however, preoperative aspirin therapy (vs. no aspirin) significantly reduced the risk of MACE (8.4% vs. 12.5%, odds ratio [OR] 0.585, 95% CI 0.355–0.964, P = 0.035), postoperative renal failure (2.6% vs. 5.2%, OR 0.438, CI 0.203–0.945, P = 0.035) and need for dialysis (0.8% vs. 3.1%, OR 0.230, CI 0.071–0.742, P = 0.014), but did not significantly reduce 30-day mortality (4.1% vs. 5.8%, OR 0.744, CI 0.376–1.472, P = 0.396), nor did it increase readmissions.

Conclusions

Preoperative aspirin therapy is associated with a significant decrease in the risk of MACE and renal failure, without an increase in readmissions, in patients undergoing non-emergent cardiac surgery.

9.

Background

Renalase is a soluble enzyme that metabolizes circulating catecholamines. A common missense polymorphism in the flavin-adenine dinucleotide-binding domain of human renalase (Glu37Asp) has recently been described. The association of this polymorphism with cardiac structure, function, and ischemia has not previously been reported.

Methods

We genotyped the rs2296545 single-nucleotide polymorphism (Glu37Asp) in 590 Caucasian individuals and performed resting and stress echocardiography. Logistic regression was used to examine the associations of the Glu37Asp polymorphism (C allele) with cardiac hypertrophy (LV mass >100 g/m2), systolic dysfunction (LVEF <50%), diastolic dysfunction, poor treadmill exercise capacity (METS <5) and inducible ischemia.

Results

Compared with the 406 participants who had GG or CG genotypes, the 184 participants with the CC genotype had increased odds of left ventricular hypertrophy (OR = 1.43; 95% CI 0.99–2.06), systolic dysfunction (OR = 1.72; 95% CI 1.01–2.94), diastolic dysfunction (OR = 1.75; 95% CI 1.05–2.93), poor exercise capacity (OR = 1.61; 95% CI 1.05–2.47), and inducible ischemia (OR = 1.49, 95% CI 0.99–2.24). The Glu37Asp (CC genotype) caused a 24-fold decrease in affinity for NADH and a 2.3-fold reduction in maximal renalase enzymatic activity.

Conclusions

A functional missense polymorphism in renalase (Glu37Asp) is associated with cardiac hypertrophy, ventricular dysfunction, poor exercise capacity, and inducible ischemia in persons with stable coronary artery disease. Further studies investigating the therapeutic implications of this polymorphism should be considered.

10.

Background

The role of the placenta in fetal programming has been recognized as a highly significant, yet often neglected area of study. We investigated placental size in relation to psychopathology, in particular attention deficit hyperactivity disorder (ADHD) symptoms, in children at 8 years of age, and later as adolescents at 16 years.

Methodology/Principal Findings

Prospective data were obtained from the Northern Finland Birth Cohort (NFBC) 1986. Placental weight, surface area and birth weight were measured according to standard procedures within 30 minutes after birth. ADHD symptoms, probable psychiatric disturbance, antisocial disorder and neurotic disorder were assessed at 8 years (n = 8101), and ADHD symptoms were assessed again at 16 years (n = 6607), by teachers and parents respectively. We used logistic regression analyses to investigate the association between placental size and mental health outcomes, controlling for gestational age, birth weight, socio-demographic factors and medical factors during gestation. There were significant positive associations between placental size (weight, surface area and placental-to-birth-weight ratio) and mental health problems in boys at 8 and 16 years of age. Increased placental weight was linked with overall probable psychiatric disturbance (at 8 y, OR = 1.14 [95% CI 1.04–1.25]), antisocial behavior (at 8 y, OR = 1.14 [95% CI 1.03–1.27]) and ADHD symptoms (inattention-hyperactivity at 16 y, OR = 1.19 [95% CI 1.02–1.38]). No significant associations were detected among girls.

Conclusions/Significance

Compensatory placental growth may occur in response to prenatal insults. Such overgrowth may affect fetal development, including brain development, and ultimately contribute to psychopathology.

11.

Background

In Norway, women with low-grade squamous intraepithelial lesions (LSIL) are followed up after six months in order to decide whether they should undergo further follow-up or be referred back to the screening interval of three years. A high specificity and positive predictive value (PPV) of the triage test is important to avoid unnecessary diagnostic and therapeutic procedures.

Materials and Methods

At the University Hospital of North Norway, repeat cytology and the HPV mRNA test PreTect HPV-Proofer, detecting E6/E7 mRNA from HPV types 16, 18, 31, 33 and 45, are used in triage of women with ASC-US and LSIL. In this study, women with LSIL cytology in the period 2005–2008 were included (n = 522). Two triage methods were evaluated in two separate groups: repeat cytology only (n = 225) and HPV mRNA testing in addition to repeat cytology (n = 297). Histologically confirmed cervical intraepithelial neoplasia of grade 2 or worse (CIN2+) was used as the study endpoint.

Results

Of 522 women with LSIL, 207 had biopsies, and 125 of them had CIN2+. The sensitivity and specificity of repeat cytology (ASC-US or worse) were 85.7% (95% confidence interval (CI): 72.1, 92.2) and 54.4% (95% CI: 46.9, 61.9), respectively. The sensitivity and specificity of the HPV mRNA test were 94.2% (95% CI: 88.7, 99.7) and 86.0% (95% CI: 81.5, 90.5), respectively. The PPV of repeat cytology was 38.4% (95% CI: 29.9, 46.9), compared to 67.0% (95% CI: 57.7, 76.4) for the HPV mRNA test.

Conclusion

HPV mRNA testing was more sensitive and specific than repeat cytology in triage of women with LSIL cytology. In addition, the HPV mRNA test showed higher PPV. These data indicate that the HPV mRNA test is a better triage test for women with LSIL than repeat cytology.
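Sensitivity, specificity and PPV, as compared between the two triage tests above, are simple ratios of the 2×2 classification counts against the histological endpoint (CIN2+). A minimal sketch with hypothetical counts (not the study's actual cross-tabulation, which the abstract does not report):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity and positive predictive value (PPV)
    from true/false positive and negative counts."""
    sensitivity = tp / (tp + fn)   # test-positive among diseased (CIN2+)
    specificity = tn / (tn + fp)   # test-negative among non-diseased
    ppv = tp / (tp + fp)           # diseased among test-positives
    return sensitivity, specificity, ppv

# Hypothetical triage counts: 100 women with CIN2+ and 200 without
sens, spec, ppv = diagnostic_metrics(tp=94, fp=28, fn=6, tn=172)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}, PPV {ppv:.1%}")
```

Note that PPV, unlike sensitivity and specificity, depends on the prevalence of CIN2+ in the triaged population, so the PPVs reported above apply to populations with a similar disease prevalence.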

12.

Objective

Turnover of the extracellular matrix in all solid organs is governed mainly by a balance between the degrading matrix metalloproteinases (MMPs) and their tissue inhibitors (TIMPs). An altered extracellular matrix metabolism has been implicated in a variety of diseases. We investigated relations of serum levels of MMP-9 and TIMP-1 to mortality risk from an etiological perspective.

Design

The prospective Uppsala Longitudinal Study of Adult Men (ULSAM) cohort, a random population-based sample of 1,082 71-year-old men with no loss to follow-up, was followed from 1991–1995 for up to 18.1 years. Endpoints were all-cause (n = 628), cardiovascular (n = 230), non-cardiovascular (n = 398) and cancer mortality (n = 178), and fatal or non-fatal myocardial infarction (n = 138) or stroke (n = 163).

Results

Serum MMP-9 and TIMP-1 levels were associated with risk of all-cause mortality (Cox proportional hazard ratio [HR] per standard deviation 1.10, 95% confidence interval [CI] 1.03–1.19; and 1.11, 1.02–1.20; respectively). TIMP-1 levels were mainly related to risks of cardiovascular mortality and stroke (HR per standard deviation 1.22, 95% CI 1.09–1.37; and 1.18, 1.04–1.35; respectively). All relations except those of TIMP-1 to stroke risk were attenuated by adjustment for cardiovascular disease risk factors. Relations in a subsample without cardiovascular disease or cancer were similar to those in the total sample.

Conclusion

In this community-based cohort of elderly men, serum MMP-9 and TIMP-1 levels were related to mortality risk. An altered extracellular matrix metabolism may be involved in several detrimental pathways, and circulating MMP-9 or TIMP-1 levels may be relevant markers thereof.

13.

Purpose

Residing in deprived areas may increase the risk of mortality beyond that explained by a person's own SES-related factors and lifestyle. The aim of this study was to examine the relation between neighborhood socioeconomic deprivation and all-cause, cancer- and cardiovascular disease (CVD)-specific mortality for men and women after accounting for education and other important person-level risk factors.

Methods

In the longitudinal NIH-AARP Study, we analyzed data from healthy participants, ages 50–71 years at study baseline (1995–1996). Deaths (n = 33831) were identified through December 2005. Information on census tracts was obtained from the 2000 US Census. Cox models estimated hazard ratios (HRs) and 95% confidence intervals (CIs) for quintiles of neighborhood deprivation.

Results

Participants in the highest quintile of deprivation had elevated risks for overall mortality (HRmen = 1.17, 95% CI: 1.10, 1.24; HRwomen = 1.13, 95% CI: 1.05, 1.22) and marginally increased risk for cancer deaths (HRmen = 1.09, 95% CI: 1.00, 1.20; HRwomen = 1.09, 95% CI: 0.99, 1.22). CVD mortality associations appeared stronger in men (HR = 1.33, 95% CI: 1.19, 1.49) than women (HR = 1.18, 95% CI: 1.01, 1.38). There was no evidence of an effect modification by education.

Conclusion

Higher neighborhood deprivation was associated with modest increases in all-cause, cancer and CVD mortality after accounting for many established risk factors.

14.
15.

Background

Tuberculosis (TB) and TB-human immunodeficiency virus (HIV) coinfection are major public health concerns in resource-limited settings. Although TB treatment is challenging in HIV-infected patients because of treatment interactions, immunopathological reactions, and concurrent infections, few prospective studies have addressed this in sub-Saharan Africa. In this study we aimed to determine the incidence, causes of, and risk factors for serious adverse events among patients on first-line antituberculous treatment, as well as their impact on antituberculous treatment outcome.

Methods and findings

Prospective observational cohort study of adults treated for TB in the Internal Medicine department of the Kigali University Hospital from May 2008 through August 2009. Of 263 patients enrolled, 253 were retained for analysis: median age 35 (interquartile range, IQR 28–40), 55% male, 66% HIV-positive with a median CD4 count of 104 cells/mm3 (IQR 44–248 cells/mm3). Forty percent had pulmonary TB, 43% extrapulmonary TB and 17% a mixed form. Sixty-four (26%) developed a serious adverse event: 58/167 (35%) HIV-infected vs. 6/86 (7%) HIV-uninfected individuals. The commonest events were concurrent infection (n = 32), drug-induced hepatitis (n = 24) and paradoxical reactions/TB-IRIS (n = 23). HIV infection (adjusted hazard ratio, aHR 3.4, 95% confidence interval, CI 1.4–8.7) and extrapulmonary TB (aHR 2.0, 95% CI 1.1–3.7) were associated with an increased risk of serious adverse events. For TB/HIV co-infected patients, extrapulmonary TB (aHR 2.0, 95% CI 1.1–3.9) and a CD4 count <100 cells/mm3 at TB diagnosis (aHR 1.7, 95% CI 1.0–2.9) were independent predictors. Adverse events were associated with an almost two-fold higher risk of unsuccessful treatment outcome at 6 months (HR 1.89, 95% CI 1.3–3.0).

Conclusion

Adverse events frequently complicate the course of antituberculous treatment and worsen treatment outcome, particularly in patients with extrapulmonary TB and advanced immunodeficiency. Concurrent infection accounts for most events. Our data suggest that deterioration in a patient already receiving antituberculous treatment should prompt an aggressive search for additional infections.

16.

Background

Bovine tuberculosis (BTB) is a widespread zoonosis in developing countries but has received little attention in sub-Saharan Africa, especially in Niger. Recent investigations confirmed a high incidence of the disease in cattle slaughtered at an abattoir in Niamey. The fact that most of the animals in which M. bovis was identified came from the rural area of Torodi implied a probable source of BTB in this region. This study aimed to determine the prevalence of BTB infection in cattle and to identify risk factors for infection in the human and cattle populations of Torodi.

Methods and Principal Findings

A survey was carried out among livestock-keeping households (n = 51). The questionnaire covered potential risk factors and the presence of clinical signs of TB in both animals and humans. The comparative intradermal tuberculin test was used to determine TB status in cattle (n = 393). The overall apparent individual animal prevalence of tuberculin reactors was 3.6% (95% CI: 1.9–5.9), whereas the individual true prevalence was estimated at 0.8% (95% CI: 0.0–5.0). Using multivariate logistic regression and classification tree analyses, the only household-level risk factor that significantly influenced the presence of BTB in cattle was the presence of coughing animals in the herd (OR = 4.7, 95% CI: 1.12–19.71, p = 0.034). The lack of quarantine practice was borderline significant (OR = 4.2, 95% CI: 0.96–18.40, p = 0.056).

Conclusion/Significance

The study confirmed that BTB is endemic in cattle in Torodi and that the risk of transmission of the disease to humans is potentially high. For control of the disease in livestock, slaughtering of infected animals and compensation of the owners are needed. Collaboration between the veterinary and medical sectors in the diagnosis, monitoring, prevention and control of BTB is strongly encouraged.

17.

Objective

Self-rated health is a generic health indicator predicting mortality, many diseases, and need for care. We examined self-rated health as a predictor of subsequent disability retirement, and ill-health and working conditions as potential explanations for the association.

Methods

Self-rated health and the covariates were obtained from the Helsinki Health Study baseline mail surveys in 2000–2002 conducted among municipal employees aged 40–60 years (n = 6525). Data for disability retirement events (n = 625) along with diagnoses were linked from the Finnish Centre for Pensions, with a follow-up by the end of 2010. Hazard ratios (HR) and their 95% confidence intervals (CI) were calculated using competing risks models.

Results

Less than good self-rated health predicted disability retirement due to all causes among both women (HR = 4.60, 95% CI = 3.84–5.51) and men (HR = 3.83, 95% CI = 2.64–5.56), as well as disability retirement due to musculoskeletal diseases (HR = 5.17, 95% CI = 4.02–6.66) and mental disorders (HR = 4.80, 95% CI = 3.50–6.59) among women and men pooled. Ill-health and physical working conditions partly explained the observed associations, which nevertheless remained after the adjustments. Among the measures of ill-health, limiting long-standing illness explained the association most for all-cause disability retirement and disability retirement due to musculoskeletal diseases, whereas common mental disorders explained the association most for disability retirement due to mental disorders. Among working conditions, physical workload and hazardous exposures at work explained the association most, although much less than ill-health.

Conclusions

Self-rated health is a strong predictor of disability retirement. This can be partly explained by ill-health and working conditions. Poor self-rated health provides a useful marker for increased risk of work disability and subsequent disability retirement.

18.

Objectives

To determine the incidence of and risk factors for HIV acquisition in a cohort of HIV-uninfected partners from HIV discordant couples in Masaka, Uganda, and to establish its suitability for HIV vaccine trials.

Methods

HIV-uninfected adults living in HIV discordant couple relationships were enrolled and followed for 2 years. Interviews, medical investigations, HIV counseling and testing, syphilis and urine pregnancy (women) tests were performed at quarterly visits. Sexual risk behaviour data were collected every 6 months.

Results

495 participants were enrolled, of whom 34 seroconverted during 786.6 person-years of observation (PYO). The overall HIV incidence rate [95% confidence interval (CI)] was 4.3 [3.1–6]; and 4.3 [2.8–6.4] and 4.4 [2.5–8] per 100 PYO in men and women respectively. Independent baseline predictors for HIV acquisition were young age [18–24 (aRR = 4.1, 95% CI 1.6–10.8) and 25–34 (aRR = 2.7, 95% CI 1.2–5.8) years]; alcohol use (aRR = 2.6, 95% CI 1.1–6); and reported genital discharge (aRR = 3.4, 95% CI 1.6–7.2) in the past year. Condom use frequency in the year preceding enrolment was predictive of a reduced risk of HIV acquisition [sometimes (aRR = 0.4, 95% CI 0.2–0.8); always (aRR = 0.1, 95% CI 0.02–0.9)]. In the follow-up risk analysis, young age [18–24 (aRR = 6.2, 95% CI 2.2–17.3) and 25–34 (aRR = 2.3, 95% CI 1.1–5.0) years], reported genital discharge (aRR = 2.5, 95% CI 1.1–5.5), serological syphilis (aRR 3.2, 95% CI 1.3–7.7) and the partner being ART naïve (aRR = 4.8, 95% CI 1.4–16.0) were independently associated with HIV acquisition. There were no seroconversions among participants who reported consistent condom use during the study.

Conclusions

The study has identified important risk factors for HIV acquisition among HIV discordant couples. HIV-uninfected partners in discordant couples may be a suitable population for HIV vaccine efficacy trials. However, recent confirmation that ART reduces heterosexual HIV transmission may make it unfeasible to conduct HIV prevention trials in this population.

19.

Background

The burden of breast cancer in Asia is escalating. We evaluated the impact of ethnicity on survival after breast cancer in the multi-ethnic region of South East Asia.

Methodology/Principal Findings

Using the Singapore-Malaysia hospital-based breast cancer registry, we analyzed the association between ethnicity and mortality following breast cancer in 5,264 patients diagnosed between 1990 and 2007 (Chinese: 71.6%, Malay: 18.4%, Indian: 10.0%). We compared survival rates between ethnic groups and calculated adjusted hazard ratios (HR) to estimate the independent effect of ethnicity on survival. Malays (n = 968) presented at a significantly younger age, with larger tumors, and at later stages than the Chinese and Indians. Malays were also more likely to have axillary lymph node metastasis at similar tumor sizes and to have hormone receptor negative and poorly differentiated tumors. Five-year overall survival was highest in Chinese women (75.8%; 95% CI: 74.4%–77.3%), followed by Indians (68.0%; 95% CI: 63.8%–72.2%) and Malays (58.5%; 95% CI: 55.2%–61.7%). Compared to the Chinese, Malay ethnicity was associated with a significantly higher risk of all-cause mortality (HR: 1.34; 95% CI: 1.19–1.51), independent of age, stage, tumor characteristics and treatment. Indian ethnicity was not significantly associated with risk of mortality after breast cancer compared to the Chinese (HR: 1.14; 95% CI: 0.98–1.34).
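Five-year overall survival figures like those above are typically Kaplan-Meier product-limit estimates. A minimal stdlib-Python sketch of the estimator on an invented toy cohort (not the registry data behind the figures above):

```python
# Minimal Kaplan-Meier product-limit estimator.
# Toy data only; illustrates the method, not the registry analysis.
def kaplan_meier(times, events):
    """Return [(t, S(t))] at each distinct event time.

    events: 1 = death observed, 0 = censored.
    """
    pairs = sorted(zip(times, events))
    n_at_risk = len(pairs)
    surv, curve = 1.0, []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        deaths = sum(e for tt, e in pairs if tt == t)
        leaving = sum(1 for tt, _ in pairs if tt == t)
        if deaths:
            surv *= (n_at_risk - deaths) / n_at_risk  # product-limit step
            curve.append((t, surv))
        n_at_risk -= leaving  # deaths and censored subjects leave the risk set
        i += leaving
    return curve

# Toy cohort: deaths at t=1, 2, 4; censoring at t=3 and t=5
print(kaplan_meier([1, 2, 3, 4, 5], [1, 1, 0, 1, 0]))
```

Censored subjects reduce the risk set without stepping the curve down, which is why the estimate differs from naive deaths-over-total arithmetic.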

Conclusion

In South East Asia, Malay ethnicity is independently associated with poorer survival after breast cancer. Research into underlying reasons, potentially including variations in tumor biology, psychosocial factors, treatment responsiveness and lifestyle after diagnosis, is warranted.

20.

Objective

To provide HIV seroincidence data among men who have sex with men (MSM) in the United States and to identify predictive factors for seroconversion.

Methods

From 1998 to 2002, 4684 high-risk MSM, aged 18–60 years, participated in a randomized, placebo-controlled HIV vaccine efficacy trial at 56 U.S. clinical trial sites. Demographics, behavioral data, and HIV status were assessed at baseline and at 6-month intervals. Since no overall vaccine efficacy was detected, data from both trial arms were combined to calculate HIV incidence based on person-years (py) of follow-up. Predictors of seroconversion, expressed as adjusted hazard ratios (aHR), were evaluated using a Cox proportional hazards model with time-varying covariates.
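The hazard ratios in such a Cox model are maximum partial-likelihood estimates. As a minimal illustration of the idea, this stdlib-Python sketch estimates the hazard ratio for a single binary covariate by finding the root of the score (derivative of the Breslow partial log-likelihood) via bisection. The data are invented; the trial's actual model was multivariable with time-varying covariates, which this sketch does not attempt:

```python
# Sketch: Cox partial-likelihood estimation for one binary covariate.
# Toy data and a deliberately simple solver; not the trial's actual model.
import math

def cox_score(beta, times, events, x):
    """Score U(beta): sum over events of (x_i - weighted mean of x in risk set)."""
    u = 0.0
    for i, (t, e) in enumerate(zip(times, events)):
        if not e:
            continue  # censored observations contribute only via risk sets
        risk = [j for j, tj in enumerate(times) if tj >= t]
        w = [math.exp(beta * x[j]) for j in risk]
        u += x[i] - sum(wj * x[j] for wj, j in zip(w, risk)) / sum(w)
    return u

def cox_binary_hr(times, events, x, lo=-5.0, hi=5.0, iters=60):
    """Bisect on the monotone-decreasing score; return the hazard ratio exp(beta)."""
    for _ in range(iters):
        mid = (lo + hi) / 2
        if cox_score(mid, times, events, x) > 0:
            lo = mid
        else:
            hi = mid
    return math.exp((lo + hi) / 2)

# Exposed subjects (x=1) tend to fail earlier, so the estimated HR exceeds 1
times  = [1, 2, 3, 4, 5, 6]
events = [1, 1, 1, 1, 1, 1]
x      = [1, 0, 1, 0, 1, 0]
print(cox_binary_hr(times, events, x))
```

Because the partial log-likelihood is concave, its derivative is monotone, so simple bisection suffices here; production analyses use Newton-type solvers and handle ties and time-varying covariates properly.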

Results

Overall, HIV incidence was 2.7/100 py and was relatively uniform across study sites and study years. HIV incidence was highest among young men and men reporting unprotected sex, recreational drug use, and a history of a sexually transmitted infection. Independent predictors of HIV seroconversion included: age 18–30 years (aHR = 2.4; 95% CI 1.4, 4.0), having >10 partners (aHR = 2.4; 95% CI 1.7, 3.3), having a known HIV-positive male sex partner (aHR = 1.6; 95% CI 1.2, 2.0), unprotected anal intercourse with HIV-positive or unknown-status male partners (aHR = 1.7; 95% CI 1.3, 2.3), and amphetamine (aHR = 1.6; 95% CI 1.1, 2.1) and popper (aHR = 1.7; 95% CI 1.3, 2.2) use.

Conclusions

HIV seroincidence was high among MSM despite repeated HIV counseling and reported declines in sexual risk behaviors. Continuing development of new HIV prevention strategies and intensification of existing efforts will be necessary to reduce the rate of new HIV infections, especially among young men.
