Similar Documents
20 similar documents found (search time: 406 ms)
1.

Background

The clinical and financial outcomes directly attributable to surgical site infections (SSIs) caused by methicillin-resistant Staphylococcus aureus (MRSA), and to methicillin resistance itself, are largely uncharacterized. Previously published data have yielded conflicting conclusions.

Methodology

We conducted a multi-center matched outcomes study of 659 surgical patients. Patients with SSI due to MRSA were compared with two groups: matched uninfected control patients and patients with SSI due to methicillin-susceptible S. aureus (MSSA). Four outcomes were analyzed for the 90-day period following diagnosis of the SSI: mortality, readmission, duration of hospitalization, and hospital charges. Attributable outcomes were determined by logistic and linear regression.
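
A minimal sketch of the regression step described above, assuming a patient-level table with hypothetical column names (mrsa_ssi, readmit_90d, los_days and an illustrative covariate pair); the study's actual covariate set is not reproduced here:

import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("ssi_outcomes.csv")  # hypothetical patient-level file

X = sm.add_constant(df[["mrsa_ssi", "age", "comorbidity_score"]])

# Logistic regression: odds of 90-day readmission for MRSA SSI vs. controls.
logit = sm.Logit(df["readmit_90d"], X).fit()
or_mrsa = np.exp(logit.params["mrsa_ssi"])        # adjusted odds ratio
or_ci = np.exp(logit.conf_int().loc["mrsa_ssi"])  # 95% CI for the OR

# Linear regression: excess length of stay (days) attributable to MRSA SSI.
ols = sm.OLS(df["los_days"], X).fit()
extra_days = ols.params["mrsa_ssi"]               # attributable days

print(or_mrsa, or_ci.values, extra_days)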

Principal Findings

In total, 150 patients with SSI due to MRSA were compared to 231 uninfected controls and 128 patients with SSI due to MSSA. SSI due to MRSA was independently predictive of readmission within 90 days (OR = 35.0, 95% CI 17.3–70.7) and death within 90 days (OR = 7.27, 95% CI 2.83–18.7), and led to 23 days (95% CI 19.7–26.3) of additional hospitalization and $61,681 (95% CI 23,352–100,011) of additional charges compared with uninfected controls. Methicillin resistance was not independently associated with increased mortality (OR = 1.72, 95% CI 0.70–4.20) or with likelihood of readmission (OR = 0.43, 95% CI 0.21–0.89), but was associated with 5.5 days (95% CI 1.97–9.11) of additional hospitalization and $24,113 (95% CI 4,521–43,704) of additional charges.

Conclusions/Significance

The attributable impact of S. aureus infection and methicillin resistance on outcomes of surgical patients is substantial. Preventing a single case of SSI due to MRSA can save hospitals as much as $60,000.

2.

Background

The use of tablet computers and other touch-screen technology within the healthcare system has expanded rapidly. These devices have been reported to harbor pathogens in hospitals; however, far less is known about the pathogens they carry when used outside the hospital environment.

Methods

Thirty iPads belonging to faculty with a variety of practice settings were sampled to determine the presence and quantity of clinically relevant organisms. Flocked nylon swabs and neutralizer solution were used to sample the surface of each iPad. Samples were then plated on a variety of selective agars to assess the presence and quantity of selected pathogens. In addition, faculty members were surveyed to classify the physical location of their practice settings and their usage patterns. Continuous variables were compared via an unpaired Student's t test with two-tailed distribution; categorical variables were compared with Fisher's exact test.

Results

Of the iPads sampled, 16 belonged to faculty practicing within a hospital and 14 to faculty practicing outside a hospital. More faculty in the hospital group used their iPads at their practice sites (78.6% vs. 31.3%; p = 0.014) and within patient care areas (71.4% vs. 18.8%; p = 0.009) than in the non-hospital group. There were no differences between groups in the presence or quantity of any of the selectively isolated pathogens. Problematic nosocomial pathogens such as methicillin-resistant Staphylococcus aureus (MRSA), vancomycin-resistant enterococci (VRE), and Pseudomonas aeruginosa were isolated from both hospital and non-hospital faculty iPads.
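
The reported group comparison can be reproduced approximately with Fisher's exact test; the counts below (11/14 vs. 5/16) are inferred from the published percentages rather than taken from the original dataset:

from scipy.stats import fisher_exact

# rows: used iPad at practice site (yes, no); one row per group
table = [[11, 3],
         [5, 11]]
odds_ratio, p = fisher_exact(table, alternative="two-sided")
print(f"OR = {odds_ratio:.2f}, p = {p:.3f}")  # p comes out near the reported 0.014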

Conclusions

Gram-positive and Gram-negative organisms were recovered from the surfaces of iPads regardless of practice setting; these included problematic multidrug-resistant pathogens such as MRSA, VRE, and P. aeruginosa. Healthcare personnel in all settings should be aware of the potential for tablet computers to serve as a nidus for microorganism transmission.

3.

Introduction

HIV prevalence among state prison inmates in the United States is more than five times that among nonincarcerated persons, but HIV transmission within U.S. prisons is sparsely documented. We investigated 88 HIV seroconversions reported during 1988–2005 among male Georgia prison inmates.

Methods

We analyzed medical and administrative data to describe seroconverters' HIV testing histories and performed a case-crossover analysis of their risks before and after HIV diagnosis. We sequenced the gag, env, and pol genes of seroconverters' HIV strains to identify genetically-related HIV transmission clusters and antiretroviral resistance. We combined risk, genetic, and administrative data to describe prison HIV transmission networks.

Results

Forty-one (47%) of the seroconverters were diagnosed with HIV between July 2003 and June 2005, when voluntary annual testing was offered. Seroconverters were less likely to report sex (OR [odds ratio] = 0.02, 95% CI [confidence interval]: 0–0.10) and tattooing (OR = 0.03, 95% CI: <0.01–0.20) in prison after their HIV diagnosis than before. Of 67 seroconverters' specimens tested, 33 (49%) fell into one of 10 genetically-related clusters; of these, 25 (76%) reported sex in prison before their HIV diagnosis. The HIV strains of 8 (61%) of 13 antiretroviral-naïve and 21 (40%) of 52 antiretroviral-treated seroconverters were antiretroviral-resistant.

Discussion

Half of all HIV seroconversions were identified once routine voluntary testing was offered, and seroconverters reduced their risk behaviors following diagnosis. Most genetically-related seroconverters reported sex in prison, suggesting HIV transmission through sexual networks. Resistance testing before initiating antiretroviral therapy is important for newly diagnosed inmates.

4.

Background

Community-acquired MRSA (CA-MRSA) is increasing rapidly, and it is unknown which reservoirs are involved. An exploratory hospital-based case-control study was performed in sixteen Dutch hospitals to identify risk factors for CA-MRSA carriage in patients not belonging to established risk groups.

Methods

Cases were in- or outpatients from sixteen Dutch hospitals, colonised or infected with MRSA, without healthcare- or livestock-associated risk factors for MRSA carriage. Control subjects were patients not carrying MRSA who were hospitalised on the same ward, or visited the same outpatient clinic, as the case. The presence of potential risk factors for CA-MRSA carriage was determined using a standardised questionnaire.

Results

Regular consumption of poultry (OR 2.40; 95% CI 1.08–5.33), cattle density per municipality (OR 1.30; 95% CI 1.00–1.70), and sharing of scuba diving equipment (OR 2.93; 95% CI 1.19–7.21) were independently associated with CA-MRSA carriage. CA-MRSA carriage was not related to being of foreign origin.

Conclusions

The observed association between consumption of poultry and CA-MRSA carriage suggests that MRSA in the food chain may be a source of MRSA carriage in humans. Although sharing of scuba diving equipment was associated with CA-MRSA carriage, the relative roles of skin abrasions in divers, the lack of decontamination of diving materials, and the favourable high salt content of sea water are currently unclear. The risk of MRSA CC398 carriage in areas with a high cattle density may be due to environmental contamination with MRSA CC398 or to human-to-human transmission. Further studies are warranted to confirm our findings and to determine the absolute risks of MRSA acquisition associated with the factors identified.

5.

Background

Recently, livestock-associated methicillin-resistant Staphylococcus aureus CC398 has been discovered in animals, livestock farmers and retail meat. This cross-sectional study aimed to determine the spread to persons not in direct contact with livestock in areas with a high density of pig farms.

Methodology/Principal Findings

Through a random mailing in 3 selected municipalities in the Netherlands, adult persons were asked to fill in a questionnaire and to take a nose swab. In total, complete information was obtained for 583 persons. Of the 534 persons without livestock contact, one was positive for MRSA (0.2%; 95% confidence interval, <0.01–1.2%). Of the 49 persons who reported working at or living on a livestock farm, 13 were positive for MRSA (26.5%; 95% confidence interval, 16.1–40.4%). All spa-types belonged to CC398.
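
As a check on the reported carriage estimate among the 49 livestock-exposed participants (13 positive), a Wilson score interval closely reproduces the published 16.1–40.4% CI; the paper's exact interval method is not stated:

from statsmodels.stats.proportion import proportion_confint

low, high = proportion_confint(count=13, nobs=49, alpha=0.05, method="wilson")
print(f"prevalence = {13/49:.1%}, 95% CI = {low:.1%}-{high:.1%}")
# -> prevalence = 26.5%, 95% CI = 16.2%-40.3%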

Conclusions/Significance

Livestock-associated MRSA is highly prevalent in people with direct contact with livestock. At present it does not appear to have spread from the farms into the community.

6.

Background and Objectives

Rhabdomyolysis is often associated with sepsis, and Gram-positive bacterial pathogens are reported to be the most frequent cause of sepsis-induced rhabdomyolysis. We report the pattern of infecting bacterial pathogens and associated causal factors in a South Indian cohort.

Design, Setting, Participants & Measurements

Retrospective cohort study of adult patients with community-acquired bacterial sepsis complicated by rhabdomyolysis from March 2003 to August 2008. Rhabdomyolysis was defined as serum creatine kinase >2000 IU/L. The study population was divided into group I (sepsis with Gram-positive pathogens), group II (sepsis with Gram-negative pathogens) and group III (culture-negative sepsis).

Results

103 patients (group I: 15, group II: 34, group III: 54) formed the study cohort. Mean age was 55 years and two-thirds had diabetes. Mean creatine kinase was 7114 IU/L and mean serum creatinine on admission was 2.4 mg/dl. A causative pathogen of sepsis was identified in 47.5%. Gram-negative pathogens were more frequently (33%) associated with rhabdomyolysis than Gram-positive pathogens (14.5%). The lung was the commonest focus of sepsis (38.8%). 78.6% of the study population had one or more additional causal factors for rhabdomyolysis, such as statin intake, chronic alcoholism, hypokalemia, hypernatremia and hypophosphatemia. Mortality was 59%.

Conclusions

Gram-negative bacterial pathogens were more frequently associated with rhabdomyolysis than Gram-positive pathogens. Rhabdomyolysis in patients with sepsis is multifactorial and is associated with high mortality.

7.

Background

Methicillin-resistant Staphylococcus aureus (MRSA) poses a threat to patient safety and public health, and understanding how MRSA is acquired is important for prevention efforts. This study investigates risk factors for MRSA nasal carriage among patients at an eastern North Carolina hospital in 2011.

Methods

Using a case-control design, hospitalized patients aged 18–65 years were enrolled between July 25, 2011 and December 15, 2011 at Vidant Medical Center, a tertiary care hospital that screens all admitted patients for nasal MRSA carriage. Cases, defined as MRSA nasal carriers, were age- and gender-matched to controls (non-carriers). In-hospital interviews were conducted, and medical records were reviewed to obtain information on medical and household exposures. Multivariable conditional logistic regression was used to derive odds ratio (OR) estimates of the association between MRSA carriage and medical and household exposures.
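
A minimal sketch of the matched-pair analysis using conditional logistic regression in statsmodels; the column names are hypothetical stand-ins for the study's exposure variables:

import numpy as np
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

df = pd.read_csv("mrsa_matched.csv")  # hypothetical file, one row per subject

# 'pair_id' links each case to its age- and gender-matched control.
model = ConditionalLogit(
    df["mrsa_carrier"],
    df[["household_abx_or_hosp", "prior_hosp_mrsa_pos"]],
    groups=df["pair_id"],
)
res = model.fit()
print(np.exp(res.params))      # matched odds ratios
print(np.exp(res.conf_int()))  # 95% CIs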

Results

In total, 117 cases and 119 controls were recruited. Risk factors for MRSA carriage included having household members who took antibiotics or were hospitalized (OR: 3.27; 95% confidence interval (CI): 1.24–8.57) and prior hospitalization with a positive MRSA screen (OR: 3.21; 95% CI: 1.12–9.23). A lower proportion of cases than controls had previously been hospitalized without a past positive MRSA screen (OR: 0.40; 95% CI: 0.19–0.87).

Conclusion

These findings suggest that household exposures are important determinants of MRSA nasal carriage in hospitalized patients screened at admission.

8.

Setting

Under India's Revised National Tuberculosis Control Programme (RNTCP), >15% of previously-treated patients in the reported 2006 patient cohort defaulted from anti-tuberculosis treatment.

Objective

To assess the timing, characteristics, and risk factors for default amongst re-treatment TB patients.

Methodology

For this case-control study, treatment records were abstracted in 90 randomly selected programme units for all 2006 defaulters from the RNTCP re-treatment regimen (cases), with one consecutively selected non-defaulter per case. Patients who interrupted anti-tuberculosis treatment for >2 months were classified as defaulters.

Results

1,141 defaulters and 1,189 non-defaulters were included. The median duration of treatment prior to default was 81 days (interquartile range 44–117 days), and documented retrieval efforts after treatment interruption were inadequate. Defaulters were more likely to be male (adjusted odds ratio [aOR] 1.4, 95% confidence interval [CI] 1.2–1.7), to have previously defaulted from anti-tuberculosis treatment (aOR 1.3, 95% CI 1.1–1.6), to have received previous treatment from non-RNTCP providers (aOR 1.3, 95% CI 1.0–1.6), or to have had public health facility-based treatment observation (aOR 1.3, 95% CI 1.1–1.6).

Conclusions

Amongst the large number of re-treatment patients in India, default occurs early and often. Improved pre-treatment counseling and community-based treatment provision may reduce default rates. Efforts to retrieve treatment interrupters before they default require strengthening.

9.

Background

The costs and benefits of controlling nosocomial spread of antibiotic-resistant bacteria are unknown.

Methods

We developed a mathematical algorithm to determine the cost-effectiveness of infection control programs and explored the dynamic interactions between different epidemiological variables and cost-effectiveness. The algorithm incorporates the occurrence of nosocomial infections, attributable mortality, the costs and efficacy of infection control, and how antibiotic-resistant bacteria affect the total number of infections: do infections with antibiotic-resistant bacteria replace infections caused by susceptible bacteria (replacement scenario) or occur in addition to them (addition scenario)? Methicillin-resistant Staphylococcus aureus (MRSA) bacteremia was used for illustration, using observational data on S. aureus bacteremia (SAB) in our hospital (n = 189 during 2001–2004, all methicillin-susceptible S. aureus [MSSA]).
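
An illustrative-only sketch of the algorithm's core logic; every parameter value below is a placeholder, not one of the paper's inputs:

def cost_per_life_year(control_cost, infections_prevented, cost_per_infection,
                       attributable_mortality, life_years_per_death,
                       scenario="replacement"):
    """Cost per life year gained by an infection-control programme.

    In the 'addition' scenario each prevented resistant infection is a net
    infection averted, so its treatment costs are saved; in the 'replacement'
    scenario a susceptible infection takes its place and those savings vanish.
    """
    savings = infections_prevented * cost_per_infection if scenario == "addition" else 0.0
    deaths_averted = infections_prevented * attributable_mortality
    net_cost = control_cost - savings
    return net_cost / (deaths_averted * life_years_per_death)

# Higher attributable mortality makes prevention look better, as in the paper:
print(cost_per_life_year(500_000, 50, 15_000, 0.10, 10, "replacement"))  # costly
print(cost_per_life_year(500_000, 50, 15_000, 0.50, 10, "replacement"))  # cheaper
print(cost_per_life_year(500_000, 50, 15_000, 0.10, 10, "addition"))     # negative = cost saving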

Results

In the replacement scenario, the costs per life year gained range from €45,912 to €6,590 for attributable mortality rates ranging from 10% to 50%. Using €20,000 per life year gained as a threshold, completely preventing MRSA would be cost-effective in the replacement scenario if the attributable mortality of MRSA is ≥21%. In the addition scenario, infection control would be cost saving along the entire range of estimates for attributable mortality.

Conclusions

The cost-effectiveness of controlling antibiotic-resistant bacteria is highly sensitive to the interaction between infections caused by resistant and susceptible bacteria (addition or replacement) and to attributable mortality. In our setting, controlling MRSA would be cost saving in the addition scenario but would not be cost-effective in the replacement scenario if attributable mortality were <21%.

10.

Background

Twin studies offer a ‘natural experiment’ that can estimate the magnitude of environmental and genetic effects on a target phenotype. We hypothesised that fidgetiness and enjoyment of activity would be heritable but that objectively-measured daily activity would show a strong shared environmental effect.

Methodology/Principal Findings

In a sample of 9–12 year-old same-sex twin pairs (234 individuals; 57 MZ, 60 DZ pairs) we assessed three dimensions of physical activity: i) objectively-measured physical activity using accelerometry, ii) 'fidgetiness' using a standard psychometric scale, and iii) enjoyment of physical activity from both parent ratings and children's self-reports. Shared environment effects explained the majority (73%) of the variance in objectively-measured total physical activity (95% confidence intervals (CI): 0.63–0.81), with a smaller unshared environmental effect (27%; CI: 0.19–0.37) and no significant genetic effect. In contrast, fidgetiness was primarily under genetic control, with additive genetic effects explaining 75% (CI: 62–84%) of the variance, as were parents' reports of children's enjoyment of low- (74%; CI: 61–82%), medium- (80%; CI: 71–86%), and high-impact activity (85%; CI: 78–90%), and children's expressed activity preferences (60%; CI: 42–72%).
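
For intuition, Falconer's formulas give a back-of-envelope version of this variance decomposition from twin correlations (the study used formal model fitting; the correlations below are invented):

def ace_from_twin_correlations(r_mz, r_dz):
    a2 = 2 * (r_mz - r_dz)  # additive genetic variance (A)
    c2 = 2 * r_dz - r_mz    # shared environment (C)
    e2 = 1 - r_mz           # unshared environment plus error (E)
    return a2, c2, e2

# Near-equal MZ and DZ correlations imply a dominant shared-environment effect,
# the pattern reported here for objectively-measured activity.
print(ace_from_twin_correlations(r_mz=0.75, r_dz=0.72))  # -> (0.06, 0.69, 0.25)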

Conclusions

Consistent with our hypothesis, the shared environment was the dominant influence on children's day-to-day activity levels. This finding gives a strong impetus to research into the specific environmental characteristics influencing children's activity, and supports the value of interventions focused on home or school environments.

11.

Background

Multidrug antiretroviral (ARV) regimens, including HAART and short-course dual antiretroviral (sc-dARV) regimens, were introduced in 2004 to improve Prevention of Mother-to-Child Transmission (PMTCT) in Cameroon. We assessed the effectiveness of these regimens at 6–10 weeks and at 12 months of age.

Methodology/Findings

We conducted a retrospective cohort study covering the period from October 2004 to March 2008 in a reference hospital in Cameroon. HIV-positive pregnant women with CD4 ≤350 cells/mm3 received first-line HAART [regimen 1], while the others received ARV prophylaxis comprising sc-dARV or single-dose nevirapine (sd-NVP). Sc-dARV included at least two drugs, according to gestational age: zidovudine (ZDV) from 28–32 weeks plus sd-NVP [regimen 2], or ZDV and lamivudine (3TC) from 33–36 weeks plus sd-NVP [regimen 3]. When gestational age was ≥37 weeks, women received sd-NVP during labour [regimen 4]. Infants received sd-NVP plus ZDV and 3TC for 7 or 30 days. Early diagnosis (6–10 weeks) was performed using bDNA and subsequently RT-PCR. We determined the early MTCT rate and associated risk factors using logistic regression; 12-month HIV-free survival was assessed using Cox regression. Among 418 mothers, 335 (80%) received multidrug ARV regimens (1, 2, and 3), and the MTCT rate with multidrug regimens was 6.6% [95% CI: 4.3–9.6] at 6 weeks, without any significant difference between regimens. Duration of the mother's ARV regimen <4 weeks [OR = 4.7, 95% CI: 1.3–17.6], mother's CD4 <350 cells/mm3 [OR = 6.4, 95% CI: 1.8–22.5] and low birth weight [OR = 4.0, 95% CI: 1.4–11.3] were associated with early MTCT. By 12 months, mixed feeding [HR = 8.7, 95% CI: 3.6–20.6], prematurity [HR = 2.3, 95% CI: 1.2–4.3] and low birth weight were associated with children's risk of progressing to infection or death.
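
A minimal sketch of the 12-month HIV-free-survival analysis as a Cox model in lifelines, with hypothetical column names; the study's dataset is not public:

import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("pmtct_cohort.csv")  # hypothetical infant-level file
# 'event' = 1 if the child became infected or died during follow-up.
cph = CoxPHFitter()
cph.fit(
    df[["followup_months", "event", "mixed_feeding", "premature", "low_birth_weight"]],
    duration_col="followup_months",
    event_col="event",
)
cph.print_summary()  # hazard ratios with 95% CIs, cf. the HR = 8.7 for mixed feeding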

Conclusions

Multidrug ARV regimens for PMTCT are feasible and effective in a routine reference-hospital setting. Early initiation of ARV during pregnancy and proper obstetrical care are essential to improving PMTCT.

12.

Background

In 1994, following years of tension, a horrific genocide in Rwanda resulted in the murder of at least 800,000 people. Although many more people were injured, no attempt has been made to assess the lasting burden of physical injuries related to these events. The aim of this study was to estimate the current burden of musculoskeletal impairment (MSI) attributable to the 1994 war and related violence.

Methodology/Principal Findings

A national cross-sectional survey of MSI was conducted in Rwanda. 105 clusters of 80 people were selected through probability-proportionate-to-size sampling, and households within clusters were selected through compact segment sampling. Enumerated people answered a seven-question screening test to assess whether they might have an MSI. Those classed as potential cases by the screening test were examined and interviewed by a physiotherapist, using a standard protocol that recorded the site, nature, cause, and severity of the MSI. People with MSI due to trauma were asked whether the trauma occurred during the 1990–1994 war or during the episodes that preceded or followed it. Of 8,368 people enumerated, 6,757 were available for screening and examination (80.8%). 352 people were diagnosed with an MSI (prevalence = 5.2%, 95% CI = 4.5–5.9%). 106 cases of MSI (30.6%) were classified as resulting from trauma, based on self-report and the physiotherapist's assessment. Of these, 14 people (13.2%) reported that their trauma-related MSI occurred during the 1990–1994 war, and a further 7 (6.6%) that it occurred during the violent episodes that preceded and followed the war, giving an overall prevalence of trauma-related MSI attributable to the 1990–1994 war of 0.3% (95% CI = 0.2–0.4%).
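
A sketch of systematic probability-proportionate-to-size selection, the usual way such surveys draw their clusters; the unit names and sizes below are invented:

import random

def pps_systematic(units, sizes, n_clusters):
    """Select n_clusters units with probability proportional to size."""
    total = sum(sizes)
    interval = total / n_clusters
    start = random.uniform(0, interval)
    picks, cum, j = [], 0.0, 0
    for k in range(n_clusters):
        target = start + k * interval
        while cum + sizes[j] <= target:  # walk the cumulative-size list
            cum += sizes[j]
            j += 1
        picks.append(units[j])
    return picks

villages = [f"village_{i}" for i in range(500)]
sizes = [random.randint(100, 2000) for _ in villages]
print(pps_systematic(villages, sizes, n_clusters=105)[:5])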

Conclusions/Significance

A decade on, the overall prevalence of MSI was relatively high in Rwanda but few cases appeared to be the result of the 1994 war or related violence.

13.

Background

Infectious diseases often demonstrate heterogeneity of transmission among host populations. This heterogeneity reduces the efficacy of control strategies, but also implies that focusing control strategies on “hotspots” of transmission could be highly effective.

Methods and Findings

To identify hotspots of malaria transmission, we analysed longitudinal data on febrile malaria episodes, asymptomatic parasitaemia, and antibody titres collected over 12 y from 256 homesteads in three study areas in Kilifi District on the Kenyan coast. We examined heterogeneity by homestead and identified groups of homesteads that formed hotspots using a spatial scan statistic. Two types of statistically significant hotspots were detected: stable hotspots of asymptomatic parasitaemia and unstable hotspots of febrile malaria. The stable hotspots were associated with higher average AMA-1 antibody titres than the unstable clusters (optical density [OD] = 1.24, 95% confidence interval [CI] 1.02–1.47 versus OD = 1.1, 95% CI 0.88–1.33) and lower mean ages at febrile malaria episodes (5.8 y, 95% CI 5.6–6.0 versus 5.91 y, 95% CI 5.7–6.1). A falling gradient of febrile malaria incidence was identified in the penumbrae of both hotspots. Hotspots were associated with AMA-1 titres but not with seroconversion rates. To target control measures, homesteads at risk of febrile malaria could be predicted by identifying the 20% of homesteads that experienced an episode of febrile malaria during one month in the dry season; that 20% subsequently experienced 65% of all febrile malaria episodes during the following year. A definition based on remote sensing data was 81% sensitive and 63% specific for the stable hotspots of asymptomatic malaria.
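
At the heart of the spatial scan statistic is the Poisson log-likelihood ratio below, maximised over candidate circles and assessed by Monte Carlo replication; a minimal sketch:

import math

def kulldorff_llr(c, E, C):
    """Log-likelihood ratio for a window with c observed and E expected cases,
    out of C total cases (elevated-risk windows only)."""
    if c <= E:
        return 0.0
    return c * math.log(c / E) + (C - c) * math.log((C - c) / (C - E))

# e.g. a window with 40 observed cases where 20 were expected, of 400 in total
print(kulldorff_llr(c=40, E=20, C=400))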

Conclusions

Hotspots of asymptomatic parasitaemia are stable over time, but hotspots of febrile malaria are unstable. This may be because immunity offsets the high rate of febrile malaria that might otherwise produce stable hotspots, whereas unstable hotspots necessarily affect a population with less prior exposure to malaria.

14.

Objectives

A national survey in 1997 demonstrated that trachoma was endemic in Mali. Interventions to control trachoma, including mass drug administration (MDA) with azithromycin, were launched in the regions of Kayes and Koulikoro in 2003. MDA was discontinued after three annual rounds in 2006, and an impact survey was conducted. We resurveyed all districts in Kayes and Koulikoro in 2009 to reassess trachoma prevalence and determine future intervention objectives. In this paper we present findings from both the 2006 and 2009 surveys.

Methods

Population-based cluster surveys were conducted in each of the nine districts of Koulikoro in 2006 and 2009; in Kayes, four of seven districts were surveyed in 2006 and all seven in 2009. Household members present were examined for clinical signs of trachoma.

Results

Overall, 29,179 persons from 2,528 compounds in 260 clusters were examined in 2006, and 32,918 persons from 7,533 households in 320 clusters in 2009. In 2006, the prevalence of trachomatous inflammation, follicular (TF) in children aged 1–9 years was 3.9% in Kayes (95% CI 2.9–5.0%, range by district 1.2–5.4%) and 2.7% in Koulikoro (95% CI 2.3–3.1%, range by district 0.1–5.0%). In 2009, TF prevalence among children of the same age group was 7.26% (95% CI 6.2–8.2%, range by district 2.5–15.4%) in Kayes and 8.19% (95% CI 7.3–9.1%, range by district 1.7–17.2%) in Koulikoro. Trachomatous trichiasis (TT) in adults aged 15 years and older in Kayes was 2.37% (95% CI 1.66–3.07%, range by district 0.30–3.54%) in 2006 and 1.37% (95% CI 1.02–1.72%, range by district 0.37–1.87%) in 2009; in Koulikoro it was 1.75% (95% CI 1.31–2.23%, range by district 1.06–2.49%) in 2006 and 1.08% (95% CI 0.86–1.30%, range by district 0.34–1.78%) in 2009.

Conclusions

Using WHO guidelines for decision making, four districts (Bafoulabe in Kayes Region; Banamba, Kolokani and Koulikoro in Koulikoro Region) still meet the criteria for district-wide implementation of the full SAFE strategy, as TF in children exceeds 10%. A community-by-community approach to trachoma control may now be required in the other twelve districts. Trichiasis surgery provision remains a need in all districts and should be enhanced in the six districts in Kayes and five in Koulikoro where adult prevalence exceeded 1.0%. Great progress has been made against blinding trachoma since 1997; however, greater effort is required to meet the elimination target of 2015.

15.

Background

Trachoma, one of the neglected tropical diseases, is suspected to be endemic in Malawi. Objectives: To determine the prevalence of trachoma and associated risk factors in central and southern Malawi.

Methodology/Principal Findings

A population-based survey was conducted in randomly selected clusters in Chikwawa district (population 438,895), southern Malawi, and Mchinji district (population 456,558), central Malawi. Children aged 1–9 years and adults aged 15 and above were assessed for clinical signs of trachoma. In total, 1,010 households in Chikwawa and 1,016 households in Mchinji were enumerated within 108 clusters (54 clusters in each district), and 6,792 persons were examined for ocular signs of trachoma. The prevalence of trachomatous inflammation, follicular (TF) among children aged 1–9 years was 13.6% (CI 11.6–15.6) in Chikwawa and 21.7% (CI 19.5–23.9) in Mchinji. The prevalence of trachomatous trichiasis (TT) in women and men aged 15 years and above was 0.6% (CI 0.2–0.9) in Chikwawa and 0.3% (CI 0.04–0.6) in Mchinji. The presence of a dirty face was significantly associated with TF in both districts (P<0.001).

Conclusion/Significance

The prevalence of TF in central and southern Malawi exceeds the WHO threshold for intervention with mass antibiotic distribution (TF >10%), warranting implementation of the SAFE trachoma control strategy in Chikwawa and Mchinji districts.

16.
17.

Background

Recent research has demonstrated that many swine and swine farmers in the Netherlands and Canada are colonized with MRSA. However, no studies to date have investigated carriage of MRSA among swine and swine farmers in the United States (U.S.).

Methods

We sampled the nares of 299 swine and 20 workers from two different production systems in Iowa and Illinois, comprising approximately 87,000 live animals. MRSA isolates were typed by pulsed-field gel electrophoresis (PFGE) using SmaI and EagI restriction enzymes, and by multilocus sequence typing (MLST). PCR was used to determine SCCmec type and the presence of the pvl gene.

Results

In this pilot study, overall MRSA prevalence was 49% (147/299) in swine and 45% (9/20) in workers. The prevalence of MRSA carriage among production system A's swine varied by age, ranging from 36% (11/30) in adult swine to 100% (60/60) in animals aged 9 and 12 weeks. The prevalence among production system A's workers was 64% (9/14). MRSA was not isolated from production system B's swine or workers. Isolates were not typeable by PFGE when SmaI was used, but digestion with EagI revealed that the isolates were clonal and were not related to common human types in Iowa (USA100, USA300, and USA400). MLST documented that the isolates were ST398.

Conclusions

These results show that colonization of swine by MRSA was very common in one swine production system in the midwestern U.S., suggesting that agricultural animals could become an important reservoir for this bacterium. MRSA strain ST398 was the only strain documented on this farm. Further studies are examining carriage rates on additional farms.

18.

Background

Visceral leishmaniasis (VL) is diagnosed by microscopic confirmation of the parasite in bone marrow, spleen or lymph node aspirates. These procedures are unsuitable for rapid diagnosis of VL in field settings. The development of rK39-based rapid diagnostic tests (RDTs) revolutionized the diagnosis of VL by offering high sensitivity and specificity in detecting disease on the Indian subcontinent; however, these tests have been less reliable on the African subcontinent (sensitivity range 75–85%, specificity 70–92%). We addressed the limitations of rK39 with a new synthetic polyprotein, rK28, followed by development and evaluation of two new rK28-based RDT prototype platforms.

Methodology/Principal Findings

Evaluation of 62 VL-confirmed sera from Sudan gave sensitivities of 96.8% and 93.6% (95% CI = rK28: 88.83–99.61%; rK39: 84.30–98.21%) and specificities of 96.2% and 92.4% (95% CI = rK28: 90.53–98.95%; rK39: 85.54–96.65%) for rK28 and rK39, respectively. Of greater interest was the observation that individual VL sera with low rK39 reactivity often had much higher rK28 reactivity. This characteristic of the fusion protein was exploited in the development of rK28 rapid tests, which may prove crucial in detecting VL among patients with low rK39 antibody levels. Evaluation of two prototype lateral flow-based rK28 rapid tests on 53 VL patients in Sudan and 73 VL patients in Bangladesh yielded promisingly high sensitivities (95.9% [95% CI = 88.46–99.1%] in Sudan and 98.1% [95% CI = 89.93–99.95%] in Bangladesh) compared to the rK39 RDT (86.3% [95% CI = 76.25–93.23%] in Sudan and 88.7% [95% CI = 76.97–95.73%] in Bangladesh).
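
As a check, a Clopper-Pearson exact interval for 60 of 62 sera detected (the count inferred from the reported 96.8%) closely reproduces the published 88.83–99.61% CI:

from statsmodels.stats.proportion import proportion_confint

low, high = proportion_confint(count=60, nobs=62, alpha=0.05, method="beta")
print(f"sensitivity = {60/62:.1%}, 95% CI = {low:.2%}-{high:.2%}")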

Conclusions/Significance

Our study compares the diagnostic accuracy of rK39 and rK28 in detecting active VL cases, and our findings indicate that the rK28 polyprotein has great potential as a serodiagnostic tool. A new rK28-based RDT will prove to be a valuable asset in simplifying VL disease confirmation at the point of care.

19.

Background

Environmental surfaces play an important role in the transmission of healthcare-associated pathogens. Because environmental cleaning is often suboptimal, there is a growing demand for safe, rapid, and automated disinfection technologies, which has led to a wealth of novel disinfection options on the market. In particular, automated ultraviolet-C (UV-C) devices have grown in number due to the documented efficacy of UV-C for reducing healthcare-acquired pathogens in hospital rooms. Here, we assessed and compared the impact of pathogen concentration, organic load, distance, and radiant dose on the killing efficacy of two analogous UV-C devices.

Principal Findings

The devices performed equivalently for each factor assessed. Irradiation delivered for 41 minutes at 4 feet from the devices consistently reduced C. difficile spores by ∼3 log10 CFU/cm2, MRSA by >4 log10 CFU/cm2, and VRE by >5 log10 CFU/cm2. Pathogen concentration did not significantly affect the killing efficacy of the devices; however, both light and heavy organic loads had a significant negative impact on it. Additionally, increasing the distance to 10 feet from the devices reduced the killing efficacy to ≤3 log10 CFU/cm2 for MRSA and VRE and <2 log10 CFU/cm2 for C. difficile spores. Delivery of reduced timed doses of irradiation particularly impaired the ability of the devices to kill C. difficile spores: MRSA and VRE were reduced by >3 log10 CFU/cm2 after only 10 minutes of irradiation, whereas C. difficile spores required 40 minutes of irradiation to achieve a similar reduction.
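
The quoted reductions are log10(N_before/N_after) computed on recovered surface counts; a small worked example with invented plate counts:

import math

def log_reduction(cfu_before, cfu_after):
    return math.log10(cfu_before / cfu_after)

# e.g. 5.0e6 CFU/cm2 before irradiation down to 4.0e3 CFU/cm2 after
print(f"{log_reduction(5.0e6, 4.0e3):.1f} log10 reduction")  # -> ~3.1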

Conclusions

The UV-C devices were equally effective at killing C. difficile spores, MRSA, and VRE. While neither device can be recommended as a stand-alone disinfection procedure, either would be a useful adjunct to routine cleaning in healthcare facilities.

20.

Background

Although syndromic surveillance is increasingly used to detect unusual illness, there is debate about whether it is useful for detecting local outbreaks. We evaluated whether syndromic surveillance can detect local outbreaks of lower-respiratory infections (LRIs) without swamping true signals with false alarms.

Methods and Findings

Using retrospective hospitalization data, we simulated prospective surveillance for LRI elevations. Between 1999 and 2006, a total of 290,762 LRIs were included by date of hospitalization and patients' place of residence (>80% coverage, 16 million population). Two large outbreaks of Legionnaires' disease in the Netherlands were used as positive controls to test whether these outbreaks could have been detected as local LRI elevations. We used a space-time permutation scan statistic to detect LRI clusters. We evaluated how many LRI clusters were detected in 1999–2006 and assessed likely causes of the cluster signals by looking for significantly higher proportions of specific hospital discharge diagnoses (e.g., Legionnaires' disease) and overlap with regional influenza elevations. We also evaluated whether the number of space-time signals could be reduced by restricting the scan statistic in space or time. In 1999–2006 the scan statistic detected 35 local LRI clusters, on average 5 per year. The known Legionnaires' disease outbreaks in 1999 and 2006 were detected as LRI clusters, since cluster signals were generated with an increased proportion of Legionnaires' disease patients (p < 0.0001). A further 21 clusters coincided with local influenza and/or respiratory syncytial virus activity, and 1 cluster appeared to be a data artifact. For the remaining 11 clusters no likely cause was identified, some possibly representing as yet undetected LRI outbreaks. With restrictions on the time and spatial windows, the scan statistic still detected the Legionnaires' disease outbreaks, without loss of timeliness and with fewer signals generated over time (up to a 42% decline).

Conclusions

To our knowledge this is the first study to systematically evaluate the performance of space-time syndromic surveillance with nationwide high-coverage data over a longer period. The results show that syndromic surveillance can detect local LRI outbreaks in a timely manner, independently of laboratory-based outbreak detection. Furthermore, since comparatively few new clusters per year were observed that would prompt investigation, syndromic hospital surveillance could be a valuable tool for detecting local LRI outbreaks.

