Similar Articles
20 similar articles found.
1.

Background:

Characterizing high-cost users of health care resources is essential for the development of appropriate interventions to improve the management of these patients. We sought to determine the concentration of health care spending, describe the demographic characteristics and clinical diagnoses of high-cost users, and examine the consistency of their health care consumption over time.

Methods:

We conducted a retrospective analysis of all residents of Ontario, Canada, who were eligible for publicly funded health care between 2009 and 2011. We estimated the total attributable government health care spending for every individual in all health care sectors.

Results:

More than $30 billion in annual health expenditures, representing 75% of total government health care spending, was attributed to individual costs. One-third of high-cost users (individuals with the highest 5% of costs) in 2009 remained in this category in the subsequent 2 years. Most spending among high-cost users was for institutional care, in contrast to lower-cost users, among whom spending was predominantly for ambulatory care services. Costs were far more concentrated among children than among older adults. The most common reasons for hospital admissions among high-cost users were chronic diseases, infections, acute events and palliative care.

Interpretation:

Although high health care costs were concentrated in a small minority of the population, these related to a diverse set of patient health care needs and were incurred in a wide array of health care settings. Improving the sustainability of the health care system through better management of high-cost users will require different tactics for different high-cost populations.

Health care spending per person in any given year is highly uneven. The concentration of health care utilization among small numbers of patients is well established. In the United States, the Agency for Healthcare Research and Quality reported that 1% of users in 2008 accounted for 20% of expenditures and that 5% of users accounted for nearly 50% of expenditures.1 Data from Canada in 1972 and 1996, and again in 2009, showed that high-cost users (individuals with the highest 5% of costs) consumed 65% of combined hospital and nursing home costs, 64% of acute care days and 84% of combined acute and post-acute home care resources, respectively.2–4

Each year, a few people have major health events that must be addressed, often with expensive treatments. The relative rarity and unpredictability of these events for any individual underlies the need for health care insurance. However, improved life expectancy, resulting in part from effective treatments of cardiovascular and respiratory disease, HIV infection and some cancers,5 as well as the chronic debilitating effects of conditions such as severe stroke or heart failure, are contributing to rising numbers of chronically high users of health care resources. Yet little is known about the sustained use of health care services among high-cost users.

Previous studies of high-cost users in the US have been limited by the use of survey respondents as a source population1 or included only patients in private insurance systems who were less than 65 years of age.6,7 Studies in Canada have examined spending on acute hospital, physician or nursing home care only, representing less than half of all health care expenditures.2,3 Age-related patterns have not been compared.

We conducted a study to make a system-wide assessment of the concentration and persistence of costs in a comprehensive health care system. We had several objectives: to measure total expenditures of publicly insured care for every individual, and by health care sector, in the province of Ontario between 2009 and 2011; to track expenditure patterns for individuals over a 3-year period; to describe the concentration of health care spending among different age groups; and to identify the main diagnoses among high-cost users.
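The concentration and persistence measures described above can be illustrated with a short calculation. The sketch below uses synthetic cost data and assumed distributions (not the Ontario data): it computes the share of spending attributable to the top 5% of users and the fraction of those users who remain high-cost in the following year.

```python
# Illustrative sketch (not the authors' code): estimating how concentrated
# annual health care spending is, and how persistent high-cost status is,
# from an individual-level cost table. Uses synthetic data.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
# Skewed synthetic cost distributions for two consecutive years (assumed).
costs_2009 = rng.lognormal(mean=6.0, sigma=1.8, size=n)
costs_2010 = 0.6 * costs_2009 + rng.lognormal(mean=6.0, sigma=1.8, size=n)

def top_share(costs, pct=5):
    """Share of total spending attributable to the top `pct` percent of users."""
    cutoff = np.percentile(costs, 100 - pct)
    return costs[costs >= cutoff].sum() / costs.sum()

def persistence(costs_y1, costs_y2, pct=5):
    """Fraction of year-1 high-cost users who remain high-cost in year 2."""
    c1 = np.percentile(costs_y1, 100 - pct)
    c2 = np.percentile(costs_y2, 100 - pct)
    high_y1 = costs_y1 >= c1
    return np.mean(costs_y2[high_y1] >= c2)

print(f"Top 5% share of spending: {top_share(costs_2009):.1%}")
print(f"High-cost persistence year 1 -> year 2: {persistence(costs_2009, costs_2010):.1%}")
```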

2.

Background

Cervical cancer screening is a critical health service that is often unavailable to women in under-resourced settings. In order to expand access to this and other reproductive and primary health care services, a South African non-governmental organization established a van-based mobile clinic in two rural districts in South Africa. To inform policy and budgeting, we conducted a cost evaluation of this service delivery model.

Methods

The evaluation was retrospective (October 2012–September 2013 for one district and April–September 2013 for the second district) and conducted from a provider cost perspective. Services evaluated included cervical cancer screening, HIV counselling and testing, syndromic management of sexually transmitted infections (STIs), breast exams, provision of condoms, contraceptives, and general health education. Fixed costs, including vehicle purchase and conversion, equipment, operating costs and mobile clinic staffing, were collected from program records and public sector pricing information. The number of women accessing different services was multiplied by ingredients-based variable costs, reflecting the consumables required. All costs are reported in 2013 USD.

Results

Fixed costs accounted for most of the total annual costs of the mobile clinics (85% and 94% for the two districts); the largest contributor to annual fixed costs was staff salaries. Average costs per patient were driven by the total number of patients seen, at $46.09 and $76.03 for the two districts. Variable costs for Pap smears were higher than for other services provided, and some services, such as breast exams and STI and tuberculosis symptoms screening, had no marginal cost.

Conclusions

Staffing costs are the largest component of providing mobile health services to rural communities. Yet in remote areas where patient volumes do not exceed nursing staff capacity, incorporating multiple services within a cervical cancer screening program is an approach that can expand access to health care without added costs.
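As a rough illustration of the ingredients-based costing described above, the sketch below combines assumed fixed and per-service variable costs (hypothetical figures, not the study's data) to show how the average cost per patient is driven mainly by patient volume when fixed costs dominate.

```python
# Illustrative sketch of ingredients-based costing for a mobile clinic.
# All numbers are assumptions for illustration, not the study's actual data.

annual_fixed_costs = 120_000.0   # vehicle, equipment, staffing, operations (USD, assumed)

# Per-service variable (consumable) cost and annual service volume (assumed).
services = {
    "pap_smear":      {"unit_cost": 8.50, "volume": 1200},
    "hiv_test":       {"unit_cost": 3.00, "volume": 1500},
    "sti_management": {"unit_cost": 2.20, "volume": 400},
    "breast_exam":    {"unit_cost": 0.00, "volume": 900},   # no marginal cost
}

patients_seen = 2600  # assumed number of unique patients per year

variable_costs = sum(s["unit_cost"] * s["volume"] for s in services.values())
total_cost = annual_fixed_costs + variable_costs

print(f"Fixed share of total cost: {annual_fixed_costs / total_cost:.0%}")
print(f"Average cost per patient:  ${total_cost / patients_seen:.2f}")
```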

3.

Background:

Sophisticated approaches are needed to improve the quality of care for elderly people living in residential care facilities. We determined the effects of multidisciplinary integrated care on the quality of care and quality of life for elderly people in residential care facilities.

Methods:

We performed a cluster randomized controlled trial involving 10 residential care facilities in the Netherlands that included 340 participating residents with physical or cognitive disabilities. Five of the facilities applied multidisciplinary integrated care, and five provided usual care. The intervention, inspired by the disease management model, consisted of a geriatric assessment of functional health every three months. The assessment included use of the Long-term Care Facility version of the Resident Assessment Instrument by trained nurse-assistants to guide the design of an individualized care plan; discussion of outcomes and care priorities with the family physician, the resident and his or her family; and monthly multidisciplinary meetings with the nurse-assistant, family physician, psychologist and geriatrician to discuss residents with complex needs. The primary outcome was the sum score of 32 risk-adjusted quality-of-care indicators.

Results:

Compared with the facilities that provided usual care, the intervention facilities had a significantly higher sum score of the 32 quality-of-care indicators (mean difference −6.7, p = 0.009; a medium effect size of 0.72). They also had significantly higher scores for 11 of the 32 indicators of good care in the areas of communication, delirium, behaviour, continence, pain and use of antipsychotic agents.

Interpretation:

Multidisciplinary integrated care resulted in improved quality of care for elderly people in residential care facilities compared with usual care.

Trial registration:

www.controlled-trials.com trial register no. ISRCTN11076857.

The quality of care provided in residential care facilities is under pressure worldwide.1 Facilities are frequently understaffed, and the complexity of care needed by residents increases while the expertise of staff does not necessarily keep pace.2,3 Although most care organizations want to innovate and improve quality of care, many lack the expertise or financial resources needed to do so.4,5 Family physicians are responsible for medical care in residential care facilities in the Netherlands. However, they do not regard themselves as suited for systematic management of chronic diseases and disabilities associated with frail health.6

About 10% of elderly people aged 75 or older in the Netherlands live in residential care facilities.7,8 These facilities were established to offer sheltered living for elderly people who are disabled but still relatively healthy. Because of the growing elderly population, the characteristics of elderly people living in residential care facilities have become more comparable to those of people in nursing homes, who need complex care. Residential care facilities in the Netherlands are comparable to residential care facilities in Canada, are publicly funded and are subject to government inspection and approval. Over 70% of the residents need professional care, such as assistance with activities of daily living, nursing care (e.g., medication, wound care) and housekeeping. They have multiple chronic diseases and associated disabilities.9–12

Effective interventions for chronic illnesses generally rely on a multidisciplinary team approach. The elements of this approach include structured geriatric assessment, protocol-based regulation of medications, support for self-reliance and intensive follow-up. The closely related disease management model comprises coordination of care, steering of the care process and patient empowerment.13 This model is strongly recommended by Bodenheimer and colleagues to improve the health and quality of life of chronically ill patients.14 However, no studies have yet evaluated the effects of disease management on functional health and quality of care for elderly people in residential care facilities who have physical or cognitive disabilities.

We developed an approach to multidisciplinary integrated care inspired by the disease management model. The objective of our study was to determine the effects of multidisciplinary integrated care on quality of care and quality of life for elderly people in residential care facilities.

4.

Background

The practice of sharing sanitation facilities does not meet the current World Health Organization/UNICEF definition for what is considered improved sanitation. Recommendations have been made to categorize shared sanitation as improved sanitation if security, user access, and other conditions can be assured, yet limited data exist on user preferences with respect to shared facilities.

Objective

This study analyzed user perceptions of shared sanitation facilities in rural households in East Java, Indonesia, and Bangladesh.

Methods

Cross-sectional studies of 2,087 households in East Java and 3,000 households in Bangladesh were conducted using questionnaires and observational methods. Relative risks were calculated to analyze associations between sanitation access and user perceptions of satisfaction, cleanliness, and safety.

Results

In East Java, 82.4% of households with private improved sanitation facilities reported feeling satisfied with their place of defecation compared to 68.3% of households with shared improved facilities [RR 1.19, 95% CI 1.09, 1.31]. In Bangladesh, 87.7% of households with private improved facilities reported feeling satisfied compared to 74.5% of households with shared improved facilities [RR 1.15, 95% CI 1.10, 1.20]. In East Java, 79.5% of households who reported a clean latrine also reported feeling satisfied with their place of defecation; only 38.9% of households who reported a dirty latrine also reported feeling satisfied [RR 1.74, 95% CI 1.45, 2.08].

Conclusion

Simple distinctions between improved and unimproved sanitation facilities tend to misrepresent the variability observed among households sharing sanitation facilities. Our results suggest that private improved sanitation is consistently preferred over any other sanitation option. An increased number of users appeared to negatively affect toilet cleanliness, and lower levels of cleanliness were associated with lower levels of satisfaction. However, when sanitation facilities were clean and shared by a limited number of households, users of shared facilities often reported feeling both satisfied and safe.
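The relative risks reported above can in principle be reproduced from 2x2 counts. The sketch below shows the standard calculation with a log-scale Wald 95% confidence interval; the counts are assumed for illustration, since the denominators are not given here.

```python
# Minimal sketch of a relative-risk calculation with an approximate 95% CI.
# The counts below are hypothetical, not the study's actual denominators.
import math

def relative_risk(a, n1, b, n2, z=1.96):
    """RR of the outcome in group 1 vs group 2, with a Wald-type 95% CI.
    a/n1 = events/total in group 1; b/n2 = events/total in group 2."""
    p1, p2 = a / n1, b / n2
    rr = p1 / p2
    se_log_rr = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)
    lo = math.exp(math.log(rr) - z * se_log_rr)
    hi = math.exp(math.log(rr) + z * se_log_rr)
    return rr, lo, hi

# Hypothetical counts: satisfied households with private vs shared facilities.
rr, lo, hi = relative_risk(a=412, n1=500, b=273, n2=400)
print(f"RR {rr:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```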

5.
6.

Background

Analyzing high throughput genomics data is a complex and compute intensive task, generally requiring numerous software tools and large reference data sets, tied together in successive stages of data transformation and visualisation. A computational platform enabling best practice genomics analysis ideally meets a number of requirements, including: a wide range of analysis and visualisation tools, closely linked to large user and reference data sets; workflow platform(s) enabling accessible, reproducible, portable analyses, through a flexible set of interfaces; highly available, scalable computational resources; and flexibility and versatility in the use of these resources to meet demands and expertise of a variety of users. Access to an appropriate computational platform can be a significant barrier to researchers, as establishing such a platform requires a large upfront investment in hardware, experience, and expertise.

Results

We designed and implemented the Genomics Virtual Laboratory (GVL) as a middleware layer of machine images, cloud management tools, and online services that enable researchers to build arbitrarily sized compute clusters on demand, pre-populated with fully configured bioinformatics tools, reference datasets and workflow and visualisation options. The platform is flexible in that users can conduct analyses through web-based (Galaxy, RStudio, IPython Notebook) or command-line interfaces, and add/remove compute nodes and data resources as required. Best-practice tutorials and protocols provide a path from introductory training to practice. The GVL is available on the OpenStack-based Australian Research Cloud (http://nectar.org.au) and the Amazon Web Services cloud. The principles, implementation and build process are designed to be cloud-agnostic.

Conclusions

This paper provides a blueprint for the design and implementation of a cloud-based Genomics Virtual Laboratory. We discuss scope, design considerations and technical and logistical constraints, and explore the value added to the research community through the suite of services and resources provided by our implementation.

7.
Ahmed M. Bayoumi, Gregory S. Zaric. CMAJ 2008;179(11):1143-1151

Background

The cost-effectiveness of Canada's only supervised injection facility has not been rigorously evaluated. We estimated the impact of the facility on survival, rates of HIV and hepatitis C virus infection, referral to methadone maintenance treatment and associated costs.

Methods

We simulated the population of Vancouver, British Columbia, including injection drug users and persons infected with HIV and hepatitis C virus. The model used a time horizon of 10 years and the perspective of the health care system. We compared the situation of the supervised injection facility with one that had no facility but that had other interventions, such as needle-exchange programs. The effects considered were decreased needle sharing, increased use of safe injection practices and increased referral to methadone maintenance treatment. Outcomes included life-years gained, costs, and incremental cost-effectiveness ratios discounted at 5% per year.

Results

Focusing on the base assumption of decreased needle sharing as the only effect of the supervised injection facility, we found that the facility was associated with an incremental net savings of almost $14 million and 920 life-years gained over 10 years. When we also considered the health effect of increased use of safe injection practices, the incremental net savings increased to more than $20 million and the number of life-years gained to 1070. Further increases were estimated when we considered all 3 health benefits: the incremental net savings was more than $18 million and the number of life-years gained was 1175. Results were sensitive to assumptions related to injection frequency, the risk of HIV transmission through needle sharing, the frequency of safe injection practices among users of the facility, the costs of HIV-related care and of operating the facility, and the proportion of users who inject in the facility.

Interpretation

Vancouver's supervised injection site is associated with improved health and cost savings, even with conservative estimates of efficacy.

Supervised injection sites offer a safe and hygienic environment for people to inject their previously obtained illicit drugs under supervision.1,2 Observational studies from the facility in Vancouver, British Columbia, have demonstrated positive effects: a decrease in needle sharing and reuse of syringes, fewer people injecting drugs in public, an increase in referrals to social services and addiction counselling, a decrease in the number of publicly discarded syringes, no apparent increase in police reports of drug dealing or crime, and no observed increase in new initiates into drug use.3–5 Although an expert advisory committee recently concluded that the Vancouver facility has beneficial effects, prominent law enforcement groups have argued that the resources allocated to the facility would be more effectively spent elsewhere.6,7

We used computer simulation to estimate the projected impact of Vancouver's supervised injection site on survival, rates of HIV and hepatitis C virus infection, referral to methadone maintenance treatment and associated costs. Our goal was to assess the cost-effectiveness of the facility and thus provide important insights into this policy debate.
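A minimal sketch of the cost-effectiveness arithmetic described above is given below, assuming hypothetical annual cost and life-year streams rather than the model's actual outputs: both streams are discounted at 5% per year over the 10-year horizon and then compared between the facility and no-facility scenarios.

```python
# Illustrative discounting and ICER sketch with assumed numbers
# (not the simulation model's inputs or outputs).

def discounted_total(annual_values, rate=0.05):
    """Present value of a stream of annual values discounted at `rate`."""
    return sum(v / (1 + rate) ** t for t, v in enumerate(annual_values))

years = 10
# Hypothetical annual costs (facility operation plus averted HIV/HCV care)
# and life-years lived by the cohort, for the two scenarios.
cost_facility    = [3.0e6] * years
cost_no_facility = [4.5e6] * years
ly_facility      = [10_100] * years
ly_no_facility   = [10_000] * years

d_cost = discounted_total(cost_facility) - discounted_total(cost_no_facility)
d_ly   = discounted_total(ly_facility) - discounted_total(ly_no_facility)

if d_cost < 0:
    print(f"Facility dominates: net savings ${-d_cost:,.0f}, {d_ly:,.0f} life-years gained")
else:
    print(f"ICER: ${d_cost / d_ly:,.0f} per life-year gained")
```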

8.
Core facilities (CFs) provide centralised access to costly equipment, scientific expertise, experimental design, day-to-day technical support and training of users. CFs have a tremendous impact on research outputs, skills and educational agendas, increasing the competencies of staff, researchers and students. However, the rapid development of new technologies and methodologies for the life sciences requires fast adaptation and development of existing core facilities and their technical and scientific staff. Given the scarcity of well-defined CF career paths, CF staff positions are typically filled by people who have followed either academic or technical tracks. Each academic institution follows different policies and often fails to adequately recognize the merits of CF personnel and to support their training efficiently. The Core Technologies for Life Science association (CTLS), through its Training working group, therefore conducted an anonymous online survey to assess the training needs of CF personnel and to identify common characteristics and challenges in this relatively new and dynamic career type. A total of 275 individuals, including core managers and directors, technicians, technologists and administrators, participated in the survey. The survey was divided into two sections: the first applied to all respondents, and the second specifically targeted core management issues. Training needs in technological areas, financial and soft skills, management and administrative issues were also surveyed. The second part of the survey made evident the lack of clarity and consistency regarding established career paths for CF professionals and highlighted geographical and cultural differences. Respondents were gender balanced, and gender distribution was taken into account in the analysis. The results of this survey highlight a need to develop better training resources for CF staff, to improve their recognition within academic institutions, and to establish a recognized career pathway.

9.

Background

Sexual violence is a major public health issue, affecting 35% of women worldwide. Major risk factors for sexual assault include inadequate indoor sanitation and the need to travel to outdoor toilet facilities. We estimated how increasing the number of toilets in an urban township (Khayelitsha, South Africa) might reduce both economic costs and the incidence and social burden of sexual assault.

Methods

We developed a mathematical model that links risk of sexual assault to the number of sanitation facilities and the time a woman must spend walking to a toilet. We defined a composite societal cost function, comprising both the burden of sexual assault and the costs of installing and maintaining public chemical toilets. By expressing total social costs as a function of the number of available toilets, we were able to identify an optimal (i.e., cost-minimizing) social investment in toilet facilities.

Findings

There are currently an estimated 5600 toilets in Khayelitsha. This results in 635 sexual assaults and US$40 million in combined social costs each year. Increasing the number of toilets to 11300 would minimize total costs ($35 million) and reduce sexual assaults to 446. Higher toilet installation and maintenance costs would be more than offset by lower sexual assault costs. Probabilistic sensitivity analysis shows that the optimal number of toilets exceeds the original allocation of toilets in the township in over 80% of the 5000 iterations of the model.

Interpretation

Improving access to sanitation facilities in urban settlements will simultaneously reduce the incidence of sexual assaults and the overall cost to society. Since our analysis ignores the many additional health benefits of improving sanitation in resource-constrained urban areas (e.g., potential reductions in waterborne infectious diseases), the optimal number of toilets identified here should be interpreted as conservative.
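The optimization described above can be sketched as minimizing a composite cost function over the number of toilets. The example below assumes an illustrative functional form (assault risk falling with the square root of toilet density) and hypothetical parameter values; it is not the paper's calibrated model.

```python
# Illustrative cost-minimization sketch; functional form and parameters are assumed.
import numpy as np

POP_WOMEN = 80_000                 # assumed number of women making nightly toilet trips
BASE_RISK = 1.2e-5                 # assumed per-trip assault risk at the reference toilet count
REF_TOILETS = 5600
COST_PER_ASSAULT = 63_000.0        # assumed societal cost per assault (USD)
ANNUAL_COST_PER_TOILET = 1_200.0   # assumed installation + maintenance, annualized (USD)
TRIPS_PER_YEAR = 365

def total_social_cost(n_toilets):
    # Assume walking distance (and thus risk) scales roughly with 1/sqrt(toilet density).
    risk = BASE_RISK * np.sqrt(REF_TOILETS / n_toilets)
    assaults = POP_WOMEN * TRIPS_PER_YEAR * risk
    return assaults * COST_PER_ASSAULT + n_toilets * ANNUAL_COST_PER_TOILET, assaults

candidates = np.arange(2000, 20001, 100)
costs = [total_social_cost(n)[0] for n in candidates]
best = candidates[int(np.argmin(costs))]
print(f"Cost-minimizing number of toilets (under these assumptions): {best}")
```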

10.
It is well established that cytochrome c is released from mitochondria when the permeability transition (PT) of this organelle is induced by Ca2+. Our previous study showed that valinomycin also caused the release of cytochrome c from mitochondria but without inducing this PT (Shinohara, Y., Almofti, M. R., Yamamoto, T., Ishida, T., Kita, F., Kanzaki, H., Ohnishi, M., Yamashita, K., Shimizu, S., and Terada, H. (2002) Permeability transition-independent release of mitochondrial cytochrome c induced by valinomycin. Eur. J. Biochem. 269, 5224–5230). These results indicate that cytochrome c may be released from mitochondria with or without the induction of PT. In the present study, we examined the protein species released from valinomycin- and Ca2+-treated mitochondria by LC-MS/MS analysis. Proteins located in the intermembrane space were found to be specifically released from valinomycin-treated mitochondria, whereas proteins in both the intermembrane space and the matrix were released from Ca2+-treated mitochondria. These results were confirmed by Western analysis. Furthermore, to examine how the protein release occurred, we examined the correlation between the species of released proteins and the abundant proteins in mitochondria. Most of the proteins released from mitochondria treated with either agent were highly expressed mitochondrial proteins, indicating that the release occurred not selectively but in a manner dependent on the concentration of the proteins. Based on these results, the permeabilization effects of Ca2+ and valinomycin on the inner and outer mitochondrial membranes are discussed.

Mitochondria are well known as the organelle for energy conversion in all eukaryotes. This energy conversion, i.e. ATP synthesis, is performed by using the electrochemical gradient of H+ across the inner mitochondrial membrane. To enable effective energy conversion, the mitochondrial inner membrane is highly resistant to the permeation of solutes and ions. However, under certain conditions, such as in the presence of Ca2+ and inorganic phosphate, the permeability of this inner membrane is known to be markedly increased. This phenomenon is referred to as the permeability transition (PT)1 and is believed to result from the formation of a proteinaceous pore, referred to as the PT pore, which makes the inner membrane permeable to various solutes and ions smaller than 1.5 kDa (1–3). The physiological importance of the PT has long been uncertain; however, recent studies have revealed that the changes in the permeability of the inner mitochondrial membrane due to the induction of PT cause the release of cytochrome c into the cytosol and that the released cytochrome c then triggers subsequent steps of programmed cell death, known as apoptosis (4–6). Thus, the PT is considered to be one of the major regulatory steps of apoptosis. However, the questions as to how the PT is induced and how cytochrome c is released upon induction of PT have remained unanswered.

To characterize the features of the mitochondrial PT and to understand the mechanism underlying the release of cytochrome c from mitochondria, investigators have studied the effects of various agents on this organelle. As a result, the PT and the release of cytochrome c were found to be induced not only by Ca2+ but also by other agents (7–9). We also found that copper-o-phenanthroline (10), metal ions (11), and cyanine dyes (12, 13) induced this PT and the release of cytochrome c from mitochondria. Furthermore, we reported that valinomycin, known as a potassium-selective ionophore, also induces the release of cytochrome c from mitochondria but without the induction of PT (14). This finding indicated that cytochrome c could be released from mitochondria in two different manners: one with the induction of PT and the other without it. To understand how cytochrome c is released from mitochondria, it is very important to know what protein species are released concomitant with cytochrome c. To address these questions, in the present study we used a mass spectrometry (LC-MS/MS)-based proteome analysis approach, which allowed us to identify the protein species present in a limited amount of protein sample. Using proteomics techniques, we examined the protein species released from mitochondria treated with valinomycin or with Ca2+, and we discuss our findings on the status of inner and outer mitochondrial membranes treated with these agents.

11.
Academic Core Facilities are optimally situated to improve the quality of preclinical research by implementing quality control measures and offering these to their users.
Subject Categories: Methods & Resources, Science Policy & Publishing

During the past decade, the scientific community and outside observers have noted a concerning lack of rigor and transparency in preclinical research that led to talk of a "reproducibility crisis" in the life sciences (Baker, 2016; Bespalov & Steckler, 2018; Heddleston et al, 2021). Various measures have been proposed to address the problem: from better training of scientists to more oversight to expanded publishing practices such as preregistration of studies. The recently published EQIPD (Enhancing Quality in Preclinical Data) System is, to date, the largest initiative that aims to establish a systematic approach for increasing the robustness and reliability of biomedical research (Bespalov et al, 2021). However, promoting a cultural change in research practices warrants broad adoption of the Quality System and its underlying philosophy. It is here that academic Core Facilities (CFs), research service providers at universities and research institutions, can make a difference.

It is fair to assume that a significant fraction of published data originated from experiments that were designed, run, or analyzed in CFs. These academic services play an important role in the research ecosystem by offering access to cutting-edge equipment and by developing and testing novel techniques and methods that impact research in the academic and private sectors alike (Bikovski et al, 2020). Equipment and infrastructure are not the only value: CFs employ competent personnel with profound knowledge and practical experience of the specific field of interest: animal behavior, imaging, crystallography, genomics, and so on. Thus, CFs are optimally positioned to address concerns about the quality and robustness of preclinical research.

12.
Self-assembly of melamine-cyanuric acid (MC) leads to urinary tract calculi and renal failure. The effects of hydration on molecular geometry, IR spectra, frontier molecular orbitals, the energy barrier of proton transfer (PT) and the stability of MC were explored by density functional theory (DFT) calculations. Intramolecular PT breaks the large π-conjugated ring of melamine or converts the p-π conjugation (:N-C'=O) to π-π conjugation (O=C-N=C') of cyanuric acid. Intermolecular PT changes the coupling between melamine and cyanuric acid from pure hydrogen bonds (Na…HNd and NH…O) to the cooperation of a cation…anion electrostatic interaction (NaH+…Nd-) and two NH…O hydrogen bonds. Distinct IR spectral shifts occur for the Na…HNd stretching mode upon PT, i.e., a blue shift upon intramolecular PT and a red shift upon intermolecular PT. The PT is therefore expected to inhibit the formation of rosette-like structures or one-dimensional tape conformers of the MC complexes. Hydration markedly affects the local geometric structure around the water binding site, as well as the IR spectra of the NH…O and N…HN hydrogen bonds. Hydration decreases the intramolecular PT barrier from ~45 kcal mol-1 in the anhydrous complex to ~11.5 kcal mol-1 in trihydrated clusters, whereas its effect on the intermolecular PT barrier is slight. The relative stability of MC changes only slightly upon hydration because of the strong hydrogen bond interaction between the melamine and cyanuric acid fragments.
Graphical Abstract: Hydration effect on proton transfer in the melamine–cyanuric acid complex

13.
14.

Purpose

Urbanization and industrial development intensify water utilization and wastewater generation. The efficiency of wastewater treatment systems varies and depends on system design and wastewater condition. The research aims to examine seven existing centralized municipal wastewater treatment plants (WWTPs) in Bangkok to discover which system configuration yields the best environmental and economic performance. The degree of environmental impact and operational costs from different system designs were investigated to help select future wastewater treatment systems.

Methods

Life cycle assessment (LCA) has been conducted to evaluate environmental impacts from centralized municipal wastewater treatment systems. Life cycle impact assessment method based on endpoint modeling (LIME) was applied, with three major potential environmental impact categories including eutrophication, global warming, and acidification. All seven centralized municipal WWTPs in Bangkok were investigated as case studies. The system configurations are classified into five types of activated sludge (AS) systems. The contribution of impacts from individual processes in each type of AS system was analyzed. The methodology covered major on-site and off-site operational processes excluding construction and maintenance phases. Average annual data were calculated to develop an inventory dataset. JEMAI-Pro software was utilized in this study to analyze the life cycle impact of the systems.

Results and discussion

The level of environmental impact from a WWTP depends on the configuration of the AS system. The highest potential environmental impact from a municipal WWTP is eutrophication, which is strongly affected by ammonium and phosphorus discharges into water bodies. The vertical loop reactor activated sludge (VLRAS) system yielded the best treatment performance among the five AS sub-systems. The consumption of electricity used to operate the system contributed significantly to global warming potential and correlated considerably with operating costs. Comparing the three system sizes, the large-scale WWTP showed inefficient electricity consumption, whereas the medium-scale plant performed better in terms of chemical use and operating costs.

Conclusions

Centralized municipal WWTPs with capacities ranging from 10 × 10³ to 350 × 10³ m³/day were evaluated with respect to environmental performance and costs during the operating phase. Among all case studies, a medium-scale WWTP with a VLRAS system offered the best operating performance in terms of low environmental impact, resource consumption, and cost. To enhance WWTP management, it is vital to improve the efficiency of electricity consumption in primary and secondary treatment processes and to increase wastewater collection efficiency, so as to maximize the plant operating capacity and minimize overall environmental impacts.
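As a simplified illustration of the impact-assessment step described above, the sketch below aggregates an assumed inventory of flows per cubic metre of treated wastewater with assumed characterization factors; the values are placeholders, not the LIME factors or the plants' inventory data.

```python
# Minimal LCA-style aggregation sketch: impact score = sum(inventory flow * factor).
# All inventory values and characterization factors below are assumed placeholders.

inventory = {                 # assumed flows per m3 of wastewater treated
    "electricity_kwh": 0.45,
    "NH4_to_water_kg": 0.004,
    "P_to_water_kg": 0.0008,
    "CO2_kg": 0.30,
}

characterization = {          # assumed factors: impact category -> {flow: factor}
    "eutrophication_kgPO4eq": {"NH4_to_water_kg": 0.33, "P_to_water_kg": 3.06},
    "global_warming_kgCO2eq": {"electricity_kwh": 0.55, "CO2_kg": 1.0},
}

for category, factors in characterization.items():
    score = sum(inventory.get(flow, 0.0) * f for flow, f in factors.items())
    print(f"{category}: {score:.4f} per m3 treated")
```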

15.
Fourier transform infrared (FT-IR) spectroscopy and chemometric techniques were used to discriminate five closely related Salmonella enterica serotype Enteritidis phage types: phage type 1 (PT1), PT1b, PT4b, PT6, and PT6a. Intact cells and outer membrane protein (OMP) extracts from bacterial cell membranes were subjected to FT-IR analysis in transmittance mode. Spectra were collected over a wavenumber range from 4,000 to 600 cm−1. Partial least-squares discriminant analysis (PLS-DA) was used to develop calibration models based on preprocessed FT-IR spectra. The analysis based on OMP extracts provided greater separation between the Salmonella Enteritidis PT1-PT1b, PT4b, and PT6-PT6a groups than the intact cell analysis. When these three phage type groups were considered, the method based on OMP extract FT-IR spectra was 100% accurate. Moreover, complementary local models that considered only the PT1-PT1b and PT6-PT6a groups were developed, and the level of discrimination increased. PT1 and PT1b isolates were differentiated successfully with the local model using the entire OMP extract spectrum (98.3% correct predictions), whereas the accuracy of discrimination between PT6 and PT6a isolates was 86.0%. Isolates belonging to different phage types (PT19, PT20, and PT21) were used with the model to test its robustness. For the first time it was demonstrated that FT-IR analysis of OMP extracts can be used for construction of robust models that allow fast and accurate discrimination of different Salmonella Enteritidis phage types.

Over the past 10 years there has been an increase in the incidence of gastrointestinal infections caused by Salmonella enterica serovar Enteritidis, which is now one of the leading S. enterica serotypes worldwide (21, 27). Poultry, poultry products, cattle, and dairy products are the predominant sources of Salmonella-contaminated food products that cause human salmonellosis (28). Large-scale infections continue to occur in developed countries (8). Unrestricted international movement of commercially prepared food and food ingredients and dissimilarities in government and industry food safety controls during the processing, distribution, and marketing of products have surely contributed to the increase in food-borne outbreaks. Salmonella is a tremendous challenge for the agricultural and food processing industries because of its ability to survive under adverse conditions, such as low levels of nutrients and suboptimal temperatures (4, 13).

Salmonella Enteritidis isolates can be categorized for epidemiological purposes by using a variety of typing tools (13). These include serological and phage typing (29) and antibiotic resistance patterns (25). These methods are now supplemented by molecular genetics techniques, such as DNA fingerprinting (23), plasmid profiling (16), and pulsed-field gel electrophoresis (26). Phage typing has been used to diagnose Salmonella outbreaks, including S. enterica serovar Typhi and S. enterica serovar Typhimurium outbreaks (29). It is useful to evaluate whether isolates obtained from different sources at different times are similar or distinct in terms of their reactions with a specific collection of bacteriophages used for typing. The correlation between phage type and the source of an epidemic is high (22). Although very effective, existing classification methods are time-consuming, laborious, and expensive, and they often require special training of personnel and expertise, which can prevent a rapid response to the presence of pathogenic bacterial species.

Fourier transform infrared (FT-IR) spectroscopy has been successfully used for differentiation and classification of microorganisms at the species and subspecies levels (7, 9, 12, 15, 18, 19, 20). This technique has been shown to have high discriminatory power and allows identification of bacteria at distinct taxonomic levels based on differences in the infrared absorption patterns of microbial cells. FT-IR spectroscopy has been used to differentiate and characterize intact microbial cells based on outer membrane cell components, including lipopolysaccharides (LPS), lipoproteins, and phospholipids (24). Several studies have discriminated S. enterica serotypes using multivariate data analysis and FT-IR spectroscopy (1, 2, 10, 11). Kim et al. (11) compared the FT-IR spectra of intact cells and of OMP extracts from S. enterica serotypes to discriminate serotypes; analysis of spectra of OMP extracts in the 1,800- to 1,500-cm−1 region resulted in 100% correct classification of the serotypes investigated.

Previously, there have been no reports of differentiation of Salmonella Enteritidis phage types by FT-IR spectroscopy and chemometric methods. To discriminate closely related phage types of Salmonella Enteritidis in this study, intact cells and OMP extracts of bacterial cell membranes were subjected to FT-IR analysis. The isolates analyzed belonged to five of the phage types of Salmonella Enteritidis found most frequently in Portuguese hospitals from 2004 to 2006: phage type 1 (PT1), PT1b, PT4b, PT6, and PT6a (5, 14). Chemometric models were used to discriminate between phage types based on infrared spectra.
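A minimal PLS-DA sketch is shown below, using synthetic "spectra" rather than the study's FT-IR data: partial least squares is fitted against one-hot class labels and each spectrum is assigned to the class with the highest predicted score, which is the usual way PLS-DA classification is operationalized.

```python
# Illustrative PLS-DA sketch on synthetic spectra (not the study's data or code).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_per_class, n_wavenumbers, classes = 40, 300, ["PT1", "PT4b", "PT6"]

X, y = [], []
for k, label in enumerate(classes):
    # Each class gets a slightly shifted baseline plus noise, mimicking
    # small absorbance differences between phage types.
    base = np.sin(np.linspace(0, 6, n_wavenumbers) + 0.3 * k)
    X.append(base + 0.2 * rng.standard_normal((n_per_class, n_wavenumbers)))
    y += [k] * n_per_class
X, y = np.vstack(X), np.array(y)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)
Y_tr = np.eye(len(classes))[y_tr]          # one-hot targets for PLS-DA

pls = PLSRegression(n_components=5).fit(X_tr, Y_tr)
pred = pls.predict(X_te).argmax(axis=1)    # assign the class with the highest score
print(f"Correct classification rate: {(pred == y_te).mean():.1%}")
```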

16.

Background

There is currently considerable debate about the advantages and disadvantages of for-profit health care delivery. We examined staffing ratios for direct-care and support staff in publicly funded not-for-profit and for-profit nursing homes in British Columbia.

Methods

We obtained staffing data for 167 long-term care facilities and linked these to the type of facility and ownership of the facility. All staff were members of the same bargaining association and received identical wages in both not-for-profit and for-profit facilities. Similar public funding is provided to both types of facilities, although the amounts vary by the level of functional dependence of the residents. We compared the mean number of hours per resident-day provided by direct-care staff (registered nurses, licensed practical nurses and resident care aides) and support staff (housekeeping, dietary and laundry staff) in not-for-profit versus for-profit facilities, after adjusting for facility size (number of beds) and level of care.

Results

The nursing homes included in our study comprised 76% of all such facilities in the province. Of the 167 nursing homes examined, 109 (65%) were not-for-profit and 58 (35%) were for-profit; 24% of the for-profit homes were part of a chain, and the remaining homes were owned by a single operator. The mean number of hours per resident-day was higher in the not-for-profit facilities than in the for-profit facilities for both direct-care and support staff and for all facility levels of care. Compared with for-profit ownership, not-for-profit status was associated with an estimated 0.34 more hours per resident-day (95% confidence interval [CI] 0.18–0.49, p < 0.001) provided by direct-care staff and 0.23 more hours per resident-day (95% CI 0.15–0.30, p < 0.001) provided by support staff.

Interpretation

Not-for-profit facility ownership is associated with higher staffing levels. This finding suggests that public money used to provide care to frail elderly people purchases significantly fewer direct-care and support staff hours per resident-day in for-profit long-term care facilities than in not-for-profit facilities.

Nursing homes provide long-term housing, support and direct care to members of the community who are unable to function independently because of medical, physical and cognitive disabilities. Although only a small proportion of older Canadians reside in nursing homes (18% of those ≥ 80 years), the majority (81%) of long-term care residents are frail elderly people over the age of 65.1

Government-funded long-term care in Canada has been provided for many years by a mix of not-for-profit (nonproprietary) and for-profit (proprietary) facilities. The ratio of this mix varies greatly by province. For example, in Ontario 52% of publicly funded nursing homes are for-profit, as compared with 15% in Manitoba.2

Previous studies from the United States have shown that having more direct-care personnel is associated with better care in nursing homes.3,4,5,6,7 Specifically, higher numbers of registered-nurse hours per resident-day have been associated with fewer violations of care standards4 and improved functional ability of residents.7 Schnelle and colleagues examined 21 nursing homes in California and found that the homes with the highest number of nurse aides performed significantly better in 13 of 16 quality-of-care measures than the homes with fewer nurse aides.6 Although there has been little research on staffing levels and nursing home care in other countries, health policy-makers in the United Kingdom8 and Australia9 have begun to call for greater accountability for public resources spent in this area.

The American literature has also shown that, compared with for-profit nursing homes, not-for-profit facilities have higher direct-care staffing levels4 and lower staff turnover rates.10,11 However, the majority of nursing home care in the United States is delivered by the for-profit sector, whereas in Canada the not-for-profit sector constitutes the majority. This may result in a difference in the informal benchmarks for staffing levels between the 2 countries. There may also be a wider variation in wages and working conditions among nursing homes in the United States, which potentially confounds the comparison between for-profit and not-for-profit facilities.

We compared staffing levels of nursing and support staff in publicly funded long-term care facilities by ownership type (not-for-profit v. for-profit) in British Columbia at a time when the majority of publicly funded not-for-profit and for-profit facilities employed a unionized labour force with standardized wages and benefits set by a master collective agreement.

In British Columbia, approximately 70% of publicly funded nursing homes are nonproprietary (not-for-profit) and 30% are proprietary (for-profit). Both not-for-profit and for-profit facilities receive global funding from the provincial government on the basis of (a) the level of functional dependence of facility residents and (b) the percentage of fees borne by residents according to their income levels. At the time of the study (2001), there was no regulation by government or the regional health authorities as to how individual facilities allocated funding between staffing, administration or property costs.

Not-for-profit long-term care in British Columbia is delivered by religious, cultural or other community-based societies, by regional health authorities or by publicly owned acute care hospitals. For-profit care is delivered by sole operators or by facilities that are part of larger business entities (chains). Nursing homes are grouped by levels of care according to the residents' case-mix: intermediate care only (IC), intermediate and extended care (IC & EC), multi-level care, or extended care only. IC facilities provide care for people with relatively more functional ability, whereas extended care facilities accommodate the most functionally dependent people. The other 2 facility types provide care for people with a mix of functional abilities.
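The adjusted comparison described above can be sketched as a regression of staffing hours on an ownership indicator with facility size and level of care as covariates. The example below uses synthetic data with an assumed effect size; it is not the study's dataset or code.

```python
# Sketch of an ownership comparison adjusted for facility size and level of care,
# on synthetic data (assumed effect of ~0.34 h per resident-day).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 167
df = pd.DataFrame({
    "not_for_profit": rng.integers(0, 2, n),
    "beds": rng.integers(25, 200, n),
    "level_of_care": rng.choice(["IC", "IC_EC", "multilevel", "EC"], n),
})
# Simulate direct-care hours per resident-day with an assumed ownership effect.
df["direct_care_hours"] = (
    2.0 + 0.34 * df["not_for_profit"] - 0.001 * df["beds"] + rng.normal(0, 0.3, n)
)

model = smf.ols("direct_care_hours ~ not_for_profit + beds + C(level_of_care)", data=df).fit()
lo, hi = model.conf_int().loc["not_for_profit"]
print(f"Adjusted not-for-profit effect: {model.params['not_for_profit']:.2f} "
      f"h per resident-day (95% CI {lo:.2f} to {hi:.2f})")
```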

17.
18.
19.

Introduction

Uganda scaled up Early HIV Infant Diagnosis (EID) when simplified methods for testing infants using dried blood spots (DBS) were adopted in 2006, which made sample transport and management feasible in rural settings. Before this time, only 35% of the facilities providing EID services were reached through the national postal courier system, Posta Uganda. During this scale-up, the transportation of samples quickly became a challenge and varied from facility to facility, as different methods were used to transport the samples. This study evaluates a novel specimen transport network system for EID testing.

Methods

A retrospective study was done in mid-2012 on 19 pilot hubs serving 616 health facilities in Uganda. The effect of the hub system on sample-result turnaround time (TAT) and on the cost of DBS sample transport was analyzed for 876 sample-results.

Results

The hub network system increased access to EID services from 36% to 51% of facilities, reduced transportation costs by 62%, and reduced turnaround times by 46.9%, with a further 46.2% reduction after the introduction of SMS printers.

Conclusions

The hub model provides a functional, reliable and efficient national referral network on which other health system strengthening initiatives can be built. It increases access to critical diagnostic and treatment-monitoring services and improves the quality of laboratory and diagnostic services, with reduced turnaround times and improved quality of prevention and treatment programs, thereby reducing long-term costs.

20.

Background

North America's first medically supervised safer injecting facility for illicit injection drug users was opened in Vancouver on Sept. 22, 2003. Although similar facilities exist in a number of European cities and in Sydney, Australia, no standardized evaluations of their impact have been presented in the scientific literature.

Methods

Using a standardized prospective data collection protocol, we measured injection-related public order problems during the 6 weeks before and the 12 weeks after the opening of the safer injecting facility in Vancouver. We measured changes in the number of drug users injecting in public, publicly discarded syringes and injection-related litter. We used Poisson log-linear regression models to evaluate changes in these public order indicators while considering potential confounding variables such as police presence and rainfall.

Results

In stratified linear regression models, the 12-week period after the facility's opening was independently associated with reductions in the number of drug users injecting in public (p < 0.001), publicly discarded syringes (p < 0.001) and injection-related litter (p < 0.001). The predicted mean daily number of drug users injecting in public was 4.3 (95% confidence interval [CI] 3.5–5.4) during the period before the facility's opening and 2.4 (95% CI 1.9–3.0) after the opening; the corresponding predicted mean daily numbers of publicly discarded syringes were 11.5 (95% CI 10.0–13.2) and 5.4 (95% CI 4.7–6.2). Externally compiled statistics from the city of Vancouver on the number of syringes discarded in outdoor safe disposal boxes were consistent with our findings.

Interpretation

The opening of the safer injecting facility was independently associated with improvements in several measures of public order, including reduced public injection drug use and public syringe disposal.

Many cities are experiencing epidemics of bloodborne diseases as a result of illicit injection drug use,1,2,3 and drug overdoses have become a leading cause of death in many urban areas.4,5,6 Public drug use also plagues many inner city neighbourhoods, and the unsafe disposal of syringes in these settings is a major community concern.7,8,9,10,11,12,13

In over 2 dozen European cities and, more recently, in Sydney, Australia, medically supervised safer injecting facilities, where injection drug users (IDUs) can inject previously obtained illicit drugs under the supervision of medical staff, have been established in an effort to reduce the community and public health impacts of illicit drug use.14 Inside these facilities IDUs are typically provided with sterile injecting equipment, emergency care in the event of overdose, as well as primary care services and referral to addiction treatment.13,15 Although anecdotal reports have suggested that such sites may improve public order,12 reduce the number of deaths from overdose16 and improve access to care,17 no standardized evaluations of their impact are available in the scientific literature.18

On Sept. 22, 2003, health officials in Vancouver opened a government-sanctioned safer injecting facility as a pilot project. The facility, the first in North America, is centrally located in Vancouver's Downtown Eastside, which is the most impoverished urban neighbourhood in Canada and home to well-documented overdose and HIV epidemics among the estimated 5000 IDUs who reside there.19,20 Federal approval for the 3-year project was granted on the condition that the health and social impacts of the facility be rigorously evaluated. Although evaluation of the facility's impact on certain outcomes (e.g., HIV incidence) is ongoing and will take several years, it is now possible to examine the impacts of the site on public order. Therefore, we conducted this study to test the hypothesis that changes in improperly discarded syringes and public drug use would be observed after the opening of the safer injecting facility.
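A minimal sketch of the Poisson log-linear modelling described above is given below, fitted to synthetic daily counts (not the study's observations), with an indicator for the post-opening period and confounder terms for police presence and rainfall.

```python
# Poisson log-linear regression sketch on synthetic daily counts of public injections.
# The simulated effect sizes are assumptions, not the study's estimates.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
days_before, days_after = 42, 84          # 6 weeks before, 12 weeks after the opening
df = pd.DataFrame({
    "after_opening": [0] * days_before + [1] * days_after,
    "police_presence": rng.integers(0, 5, days_before + days_after),
    "rainfall_mm": rng.gamma(2.0, 3.0, days_before + days_after),
})
# Simulate roughly a halving of daily public injections after the opening.
mu = np.exp(1.45 - 0.6 * df["after_opening"] - 0.05 * df["rainfall_mm"])
df["public_injections"] = rng.poisson(mu)

model = smf.glm("public_injections ~ after_opening + police_presence + rainfall_mm",
                data=df, family=sm.families.Poisson()).fit()
# exp(coefficient) is the adjusted rate ratio for the post-opening period.
print(f"Rate ratio after opening: {np.exp(model.params['after_opening']):.2f}")
```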
