Similar documents (20 results)
1.
A consequence assessment framework was developed to evaluate the economic and environmental consequences of an exotic disease in the context of supporting policy level decisions on mitigation strategies. The framework adopted a semi-qualitative analysis of impacts supported by expert judgement. The efficacy of the framework was illustrated via assessment of the notifiable fish disease, Gyrodactylus salaris. In this example, the economic cost of an illustrative outbreak of G. salaris ranged from £0.22 million to £90 million. The cost of the most likely scenario (regional spread) was estimated to be £7.5 million (minimum to maximum range of £2–22 million), reflecting the uncertainty in the extent of spread of the parasite before detection. The environmental impacts varied by a factor of 35 between incursion scenarios, reflecting the number of affected catchments.

2.
Pollination services provided by insects play a key role in English crop production and wider ecology. Despite growing evidence of the negative effects of habitat loss on pollinator populations, limited policy support is available to reverse this pressure. One measure that may provide beneficial habitat to pollinators is England's Entry Level Stewardship agri-environment scheme. This study uses a novel expert survey to develop weights for a range of models which adjust the balance of Entry Level Stewardship options within the current area of spending. The annual costs of establishing and maintaining these option compositions were estimated at £12.4–£59.3 M above current expenditure. Although this produced a substantial reduction in private cost:benefit ratios, the benefits of the scheme to pollinator habitat rose by 7–140%, significantly increasing the public cost:benefit ratio. This study demonstrates that the scheme has significant untapped potential to provide good quality habitat for pollinators across England, even within existing expenditure. The findings should open debate on the costs and benefits of specific Entry Level Stewardship management options and how these can be enhanced to benefit both participants and biodiversity more equitably.

3.
Bioenergy with Carbon Capture and Storage (BECCS) features heavily in the energy scenarios designed to meet the Paris Agreement targets, but the models used to generate these scenarios do not address environmental and social implications of BECCS at the regional scale. We integrate ecosystem service values into a land-use optimization tool to determine the favourability of six potential UK locations for a 500 MW BECCS power plant operating on local biomass resources. Annually, each BECCS plant requires 2.33 Mt of biomass and generates 2.99 Mt CO2 of negative emissions and 3.72 TWh of electricity. We make three important discoveries: (a) the impacts of BECCS on ecosystem services are spatially discrete, with the most favourable locations for UK BECCS identified at Drax and Easington, where net annual welfare values (from the basket of ecosystem services quantified) of £39 and £25 million were generated, respectively, with notably lower annual welfare values at Barrow (−£6 million) and Thames (£2 million); (b) larger BECCS deployment beyond 500 MW reduces net social welfare values, with a 1 GW BECCS plant at Drax generating a net annual welfare value of £19 million (a 50% decline compared with the 500 MW deployment), and a welfare loss at all other sites; (c) BECCS can be deployed to generate net welfare gains, but trade-offs and co-benefits between ecosystem services are highly site and context specific, and these landscape-scale, site-specific impacts should be central to future BECCS policy developments. For the United Kingdom, meeting the Paris Agreement targets through reliance on BECCS requires over 1 GW at each of the six locations considered here and is likely, therefore, to result in a significant welfare loss. This implies that an increased number of smaller BECCS deployments will be needed to ensure a win–win for energy, negative emissions and ecosystem services.

4.
Plots at each of eight widespread permanent pasture sites below 300 m, representative of large areas of long-established grassland in England and Wales, were treated with an insecticide plus molluscicide, a fungicide or a nematicide treatment. Populations of various invertebrates and the occurrence of foliar fungal diseases were assessed. Leaf blotch (Drechslera) was the most common disease, but neither this nor other foliar fungal diseases were prevalent until late in the growing season. The fungicide treatment did not control diseases satisfactorily. The fungicide and nematicide treatments had little effect on total annual herbage yield. Leatherjackets, crambids, slugs and frit fly larvae were present, usually in low numbers, at most sites. The insecticide and molluscicide treatment increased yield by 11% on average across all sites and years. Losses caused by pests to UK grasslands were estimated to be over £500 million per year.

5.
Cystic echinococcosis (CE) is a globally distributed parasitic infection of humans and livestock. The disease is of significant medical and economic importance in many developing countries, including Iran. However, the socioeconomic impact of the disease, in most endemic countries, is not fully understood. The purpose of the present study was to determine the monetary burden of CE in Iran. Epidemiological data, including prevalence and incidence of CE in humans and animals, were obtained from regional hospitals, the scientific literature, and official government reports. Economic data relating to human and animal disease, including cost of treatment, productivity losses, and livestock production losses were obtained from official national and international datasets. Monte Carlo simulation methods were used to represent uncertainty in input parameters. Mean number of surgical CE cases per year for 2000–2009 was estimated at 1,295. The number of asymptomatic individuals living in the country was estimated at 635,232 (95% Credible Interval, CI 149,466–1,120,998). The overall annual cost of CE in Iran was estimated at US$232.3 million (95% CI US$103.1–397.8 million), including both direct and indirect costs. The cost associated with human CE was estimated at US$93.39 million (95% CI US$6.1–222.7 million) and the annual cost associated with CE in livestock was estimated at US$132 million (95% CI US$61.8–246.5 million). The cost per surgical human case was estimated at US$1,539. CE has a considerable economic impact on Iran, with the cost of the disease approximated at 0.03% of the country's gross domestic product. Establishment of a CE surveillance system and implementation of a control program are necessary to reduce the economic burden of CE on the country. Cost-benefit analysis of different control programs is recommended, incorporating present knowledge of the economic losses due to CE in Iran.
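The Monte Carlo approach described above can be sketched in a few lines: draw each uncertain cost component from a distribution, sum them, and repeat many times. A minimal stdlib-only illustration in Python, using triangular distributions with placeholder parameters (US$ millions), not the distributions actually fitted in the Iranian study:

```python
import random

def simulate_annual_cost(n_draws=10_000, seed=42):
    """Monte Carlo sketch: propagate uncertain cost inputs to a
    distribution of total annual cost. The triangular distributions
    below are illustrative placeholders (US$ millions), not the
    parameters used in the CE study."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_draws):
        human = rng.triangular(6.1, 222.7, 93.4)        # low, high, mode
        livestock = rng.triangular(61.8, 246.5, 132.0)
        totals.append(human + livestock)
    totals.sort()
    mean = sum(totals) / n_draws
    ci95 = (totals[int(0.025 * n_draws)], totals[int(0.975 * n_draws)])
    return mean, ci95
```

Reporting the mean alongside the 2.5th–97.5th percentiles of the simulated totals mirrors how the study expresses uncertainty as a 95% credible interval.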

6.
The study aimed to determine costs to the state government of implementing different interventions for controlling rabies among the entire human and animal populations of Tamil Nadu. This built upon an earlier assessment of Tamil Nadu's efforts to control rabies. Anti-rabies vaccines were made available at all health facilities. Costs were estimated for five different combinations of animal and human interventions using an activity-based costing approach from the provider perspective. Disease and population data were sourced from the state surveillance data, human census and livestock census. Program costs were extrapolated from official documents. All capital costs were depreciated to estimate annualized costs. All costs were inflated to 2012 Rupees. Sensitivity analysis was conducted across all major cost centres to assess their relative impact on program costs. It was found that the annual costs of providing anti-rabies vaccine alone and in combination with immunoglobulins were $0.7 million (Rs 36 million) and $2.2 million (Rs 119 million), respectively. For animal sector interventions, the annualised costs of rolling out surgical sterilisation-immunization, injectable immunization and oral immunizations were estimated to be $44 million (Rs 2,350 million), $23 million (Rs 1,230 million) and $11 million (Rs 590 million), respectively. Dog bite incidence, health systems coverage and cost of rabies biologicals were found to be important drivers of costs for human interventions. For the animal sector interventions, the size of the dog catching team, dog population and vaccine costs were found to be driving the costs. Rabies control in Tamil Nadu seems a costly proposition the way it is currently structured. Policy makers in Tamil Nadu and other similar settings should consider the long-term financial sustainability before embarking upon a state- or nation-wide rabies control programme.

7.
Epidemic dynamics pose a great challenge to stochastic modelling because chance events are major determinants of the size and the timing of the outbreak. Reintroduction of the disease through contact with infected individuals from other areas is an important latent stochastic variable. In this study we model these stochastic processes to explain extinction and recurrence of epidemics observed in measles. We develop estimating functions for such a model and apply the methodology to temporal case counts of measles in 60 cities in England and Wales. In order to estimate the unobserved spatial contact process we suggest a method based on stochastic simulation and marginal densities. The estimation results show that it is possible to consider a unified model for the UK cities where the parameters depend on the city size. Stochastic realizations from the dynamic model realistically capture the transitions from an endemic cyclic pattern in large populations to irregular epidemic outbreaks in small human host populations.

8.
A survey in South Wales has been used to estimate the annual cost of individual anaesthetic drugs used in the National Health Service. Halothane, costing £900,000, is the most expensive item, followed by oxygen (£390,000) and nitrous oxide (£350,000). The total cost of drugs used for hospital anaesthetics is estimated to be £2,300,000, and for dental anaesthetics £156,000. The drug cost of the average hospital anaesthetic is £1 3s. and of the average dental anaesthetic 1s. 9d. The provision of liquid oxygen stores could reduce oxygen costs by up to 90%, and the greater use of closed-circuit anaesthesia, with low gas flows, could save £750,000 annually. The drug cost of an anaesthetic in Cardiff has risen from 8s. 4d. in 1959 to 19s. 10d. in 1968, but costs in Cardiff compare favourably with published figures from other similar centres.

9.
Leatherjackets can cause serious yield reductions in Northern Ireland grass. This paper considers the distribution of field population sizes derived from 19 years' data in conjunction with calculated damage functions to estimate average expected losses. Six management options were compared. These were ‘No action’, routine insecticide application in September or March, insecticide application in March in high risk years only and the use of monitoring and economic thresholds in September or March. The annual expected revenues for each of these options were calculated for herbage dry-matter values of £0.01–0.07 kg⁻¹. It was concluded that action against leatherjackets in September would give better returns than action in March and that the use of monitoring and spraying of only those field populations above an economic threshold gave rise to greatest revenue. An annual loss to leatherjackets in Northern Ireland of over £15 million was calculated from an assumed herbage value of £0.035 kg⁻¹ dry matter.

10.
Weed control is important and one of the more expensive inputs to sugar beet production. The introduction of genetically modified herbicide tolerant (GMHT) sugar beet would result in a major saving in weed control costs in the crop for growers, including control of problem weeds such as perennial weeds and weed beet. However, there would be other economic consequences of growing GMHT beet, some of which would manifest themselves in other parts of the rotation, such as the previous crop, the cereal stubbles that precede most beet crops, soil tillage and spray application. The average national saving for UK sugar beet growers if they could use the technology would be in excess of £150 ha⁻¹ yr⁻¹ or £23 million yr⁻¹, which includes reductions in agrochemical use of c. £80 ha⁻¹ yr⁻¹ or £12 million yr⁻¹. However, for some growers, the gains would be much larger and for a few, less than these figures. The possible cost savings are sufficiently large that they could ensure that sugar beet production, with its regionally important environmental benefits as a spring crop, remains economically viable in the UK post reform of the EU sugar regime.

11.
Tree ring data provide proxy records of historical hydroclimatic conditions that are widely used for reconstructing precipitation time series. Most previous applications are limited to annual time scales, though information about daily precipitation would enable a range of additional analyses of environmental processes to be investigated and modelled. We used statistical downscaling to simulate stochastic daily precipitation ensembles using dendrochronological data from the western Canadian boreal forest. The simulated precipitation series were generally consistent with observed precipitation data, though reconstructions were poorly constrained during short periods of forest pest outbreaks. The proposed multiple temporal scale precipitation reconstruction can generate annual daily maxima and persistent monthly wet and dry episodes, so that the observed and simulated ensembles have similar precipitation characteristics (i.e. magnitude, peak, and duration)—an improvement on previous modelling studies. We discuss how ecological disturbances may limit reconstructions by inducing non-linear responses in tree growth, and conclude with suggestions of possible applications and further development of downscaling methods for dendrochronological data.

12.

Background

By the end of 2011 Global Fund investments will be supporting 3.5 million people on antiretroviral therapy (ART) in 104 low- and middle-income countries. We estimated the cost and health impact of continuing treatment for these patients through 2020.

Methods and Findings

Survival on first-line and second-line ART regimens is estimated based on annual retention rates reported by national AIDS programs. Costs per patient-year were calculated from country-reported ARV procurement prices, and expenditures on laboratory tests, health care utilization and end-of-life care from in-depth costing studies. Of the 3.5 million ART patients in 2011, 2.3 million will still need treatment in 2020. The annual cost of maintaining ART falls from $1.9 billion in 2011 to $1.7 billion in 2020, as a result of a declining number of surviving patients partially offset by increasing costs as more patients migrate to second-line therapy. The Global Fund is expected to continue being a major contributor to meeting this financial need, alongside other international funders and domestic resources. Costs would be $150 million less in 2020 with an annual 5% decline in first-line ARV prices and $150–370 million less with a 5%–12% annual decline in second-line prices, but $200 million higher in 2020 with phase out of stavudine (d4T), or $200 million higher with increased migration to second-line regimens expected if all countries routinely adopted viral load monitoring. Deaths postponed by ART correspond to 830,000 life-years saved in 2011, increasing to around 2.3 million life-years every year between 2015 and 2020.
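The cohort projection underlying these figures can be approximated with a simple annual recursion: retain a fraction of each regimen's patients, move a fraction from first-line to second-line therapy, and multiply the surviving counts by per-line costs. A sketch with invented retention rates and unit costs, not the Global Fund's actual retention or procurement data:

```python
def project_cohort(patients0, retain1, switch, retain2,
                   cost1, cost2, years):
    """Yearly (patients, cost) projection for a treatment cohort.
    retain1/retain2: annual retention on first/second line;
    switch: annual fraction of first-line patients moving to second
    line; cost1/cost2: cost per patient-year on each line.
    All parameter values passed in are illustrative."""
    first, second = float(patients0), 0.0
    out = []
    for _ in range(years):
        out.append((first + second, first * cost1 + second * cost2))
        moved = first * switch
        first = first * retain1 - moved
        second = second * retain2 + moved
    return out
```

With retention below 100% the cohort shrinks each year while the second-line share (with its higher unit cost) grows, reproducing the qualitative pattern above: a slowly declining total cost driven by attrition, partially offset by migration to more expensive regimens.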

Conclusions

Annual patient-level direct costs of supporting a patient cohort remain fairly stable over 2011–2020, if current antiretroviral prices and delivery costs are maintained. Second-line antiretroviral prices are a major cost driver, underscoring the importance of investing in treatment quality to improve retention on first-line regimens.

13.
Rearing quality dairy heifers is essential to maintain herds by replacing culled cows. Information on the key factors influencing the cost of rearing under different management systems is, however, limited and many farmers are unaware of their true costs. This study determined the cost of rearing heifers from birth to first calving in Great Britain including the cost of mortality, investigated the main factors influencing these costs across differing farming systems and estimated how long it took heifers to repay the cost of rearing on individual farms. Primary data on heifer management from birth to calving were collected through a survey of 101 dairy farms during 2013. Univariate followed by multivariable linear regression was used to analyse the influence of farm factors and key rearing events on costs. An Excel spreadsheet model was developed to determine the time it took for heifers to repay the rearing cost. The mean±SD ages at weaning, conception and calving were 62±13, 509±60 and 784±60 days. The mean total cost of rearing was £1819±387/heifer with a mean daily cost of £2.31±0.41. This included the opportunity cost of the heifer and the mean cost of mortality, which ranged from £103.49 to £146.19/surviving heifer. The multivariable model predicted an increase in mean cost of rearing of £2.87 for each extra day of age at first calving and a decrease in mean cost of £6.06 for each percentile increase in time spent at grass. The model also predicted a decrease in the mean cost of rearing in autumn and spring calving herds of £273.20 and £288.56, respectively, compared with that in all-year-round calving herds. Farms with herd sizes ≥100 had lower mean costs of between £301.75 and £407.83 compared with farms with <100 milking cows. The mean gross margin per heifer was £441.66±304.56 (range −£367.63 to £1120.08), with 11 farms experiencing negative gross margins. Most farms repaid the cost of heifer rearing in the first two lactations (range 1 to 6 lactations) with a mean time from first calving until breaking even of 530±293 days. The results of the economic analysis suggest that management decisions on key reproduction events and grazing policy significantly influence the cost of rearing and the time it takes for heifers to start making a profit for the farm.
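The break-even calculation in a model like the one described can be illustrated by walking day by day through lactation and dry-period cycles until cumulative net margin repays the rearing cost. All margins below are invented for illustration; only the £1819 mean rearing cost comes from the study:

```python
def days_to_break_even(rearing_cost, lact_margin, dry_cost,
                       lact_days=305, dry_days=60, max_lactations=6):
    """Days from first calving until cumulative net margin repays
    the rearing cost: +lact_margin per milking day, -dry_cost per
    dry day. Returns None if not repaid within max_lactations."""
    balance = -rearing_cost
    day = 0
    for _ in range(max_lactations):
        for _ in range(lact_days):
            day += 1
            balance += lact_margin
            if balance >= 0:
                return day
        for _ in range(dry_days):
            day += 1
            balance -= dry_cost
            if balance >= 0:
                return day
    return None
```

With a hypothetical £4/day margin in milk and £2/day dry-period cost, the £1819 mean rearing cost is repaid 545 days after first calving, early in the second lactation — broadly consistent with the 530-day mean the survey reports.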

14.
The identification of meat and bone meal (MBM) as a significant factor in the spread of bovine spongiform encephalopathy (BSE) has resulted in the introduction of restrictions on the use and movement of MBM and tallow. This has led to a requirement for alternative uses for these products. This paper reports on a risk assessment performed on the use of tallow as a fuel oil extender in diesel engines. With up to 4000 tonnes of tallow being produced each year in Ireland, combustion with energy recovery represents a viable, cost-efficient utilization route. A stochastic (Latin Hypercube sampling) simulation model was developed to assess the infectivity risk to humans associated with potential airborne exposure to the combustion products when using tallow as a combustion fuel in diesel engines. The model simulates the potential infectivity pathways that tallow follows, including its production from animals with potentially subclinical BSE and processing the tallow with segregation and heat treatments. The model uses probability distributions for the most important input parameters. The assessment takes into account a number of epidemiological parameters that include tissue infectivity, species barrier, disease incidence, and heat inactivation. Two scenarios, reflecting the infectivity risk in different animal tissues defined by the European Commission's Scientific Steering Committee (SSC), were performed. The model results show that the risk of a human contracting variant Creutzfeldt-Jakob Disease (vCJD) from potential airborne exposure to BSE, resulting from the combustion of tallow, is extremely small even when model uncertainty is taken into account (mean individual risk values ranging from 10^-11.43 to 10^-7.23 per person per year). The risks are several orders of magnitude below the sporadic annual incidence of Creutzfeldt-Jakob Disease (CJD) in Europe (approximately 10^-6).
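Latin Hypercube sampling, mentioned above, stratifies each input's range so that even modest sample counts cover the whole distribution, which is why it is popular for risk models with expensive evaluations. A minimal stdlib-only sketch on the unit hypercube (mapping each coordinate through an input's inverse CDF is left to the caller):

```python
import random

def latin_hypercube(n_samples, dims, seed=0):
    """Return n_samples points in [0, 1)^dims such that, in every
    dimension, exactly one point falls in each of the n equal-width
    strata. Strata are shuffled independently per dimension."""
    rng = random.Random(seed)
    cols = []
    for _ in range(dims):
        strata = list(range(n_samples))
        rng.shuffle(strata)
        cols.append([(s + rng.random()) / n_samples for s in strata])
    return [tuple(col[i] for col in cols) for i in range(n_samples)]
```

Compared with plain Monte Carlo, this guarantees that no stratum of any input distribution is left unsampled, tightening estimates of tail quantities such as the mean individual risk values reported above.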

15.
Background: Eradicating Helicobacter pylori markedly reduces ulcer recurrence in patients with peptic ulcer disease (PUD). Many decision analysis studies have concluded that eradicating H. pylori in PUD patients is more cost-effective than maintaining antisecretory therapy. In 1995, we introduced an H. pylori eradication program into a large transportation company that had experienced increased incidence of PUD among its employees along with increased medical costs, and we performed trend analysis of the actual medical costs of PUD in this cohort. Methods: In this cohort, there were approximately 8500 employees. H. pylori-positive PUD patients were identified at the annual health check-up. The patients received eradication therapy with lansoprazole, amoxicillin, and clarithromycin. After eradication, the patients were followed up by a yearly health check-up. The annual number of patients who received eradication was recorded, and the annual direct medical costs of PUD therapy were analyzed. Results: A total of 440 H. pylori-positive PUD patients received eradication therapy over a 7-year period. Based on an intention-to-treat analysis, the eradication rate was 84.5% (372 of 440). The largest number of patients who received eradication therapy was found in 1995 (n = 115), and from 1995 to 2001 this number decreased yearly by 12.5 (95% confidence interval (CI): 5 to 20). Between 1989 and 1995, the annual medical costs arising from PUD therapy increased by ¥2.25 million (95% CI: 1.19 to 3.31) per year, being highest (¥22.75 million) in 1995. Between 1995 and 2001, the costs decreased by ¥3.88 million (95% CI: 3.16 to 4.59) per year. The cost in 2001 was 5.7% of the cost in 1995. The eradication program was terminated in 2001 because the prevalence of PUD diminished markedly, and the associated medical costs decreased as well. Conclusions: H. pylori eradication could reduce the number of PUD patients and associated medical costs in the workplace setting.

16.
The anthropogenic movement of live fish has been identified as the most important route for the transmission of disease between river catchments. To assist in contingency planning for exotic salmonid disease outbreaks, a stochastic model was developed to assess the potential geographic distribution of an introduced pathogen with time to first detection. The Live Fish Movement Database (a resource funded by the UK Department for Environment, Food and Rural Affairs [Defra] and managed by the Centre for Environment, Fisheries and Aquaculture Science [CEFAS] and the Environment Agency [EA]) was used to establish details of live fish movement in England and Wales. A contact network was created for farm to farm and farm to non-farm (fishery) movements of rainbow trout Oncorhynchus mykiss, brown trout Salmo trutta and Atlantic salmon Salmo salar, and probability functions were used to model the timing and destination of movements from farm sites, based on these trading activities. Monte Carlo simulations were run to track the progression of potential disease transmission from single index farm inputs through river catchments with time. Two hundred farms exported fish to 1653 destinations in 147 of the total 198 river catchments. The median numbers of catchments contacted after 3 and 12 mo were 3 and 6, respectively. In 5% of simulations 63 or more catchments were contacted, and in 1% of simulations 75 or more catchments were contacted after 12 mo. These results may be used to underpin the development of contingency plans for exotic disease outbreaks.
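The simulation logic described here — seed a single index farm, then propagate consignments month by month over the movement network — can be sketched with the standard library. The toy network and probabilities below are invented for illustration; the study's inputs came from the Live Fish Movement Database:

```python
import random

def median_sites_contacted(movements, index_site, months=12,
                           n_runs=1000, seed=1):
    """Monte Carlo sketch of spread over a live-fish movement network.
    movements maps a site to a list of (destination, monthly
    probability of a consignment) pairs; both are hypothetical here.
    Returns the median count of distinct sites contacted."""
    rng = random.Random(seed)
    outcomes = []
    for _ in range(n_runs):
        contacted = {index_site}
        for _ in range(months):
            for site in list(contacted):
                for dest, p in movements.get(site, []):
                    if rng.random() < p:
                        contacted.add(dest)
        outcomes.append(len(contacted))
    outcomes.sort()
    return outcomes[n_runs // 2]
```

Tracking catchments instead of sites only requires mapping each destination to its catchment before counting; the median and upper percentiles of the resulting distribution correspond to the summary statistics reported above.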

17.
Leptospirosis is a common zoonotic disease in China. From 1991 to 2010, its average annual incidence was 0.70 cases per 100,000 population. During these two decades, three major outbreaks of leptospirosis occurred due to flooding and heavy rainfall. Leptospira interrogans serogroup Icterohaemorrhagiae serovar Lai is the predominant leptospire responsible for at least 60% of Chinese cases, and Apodemus agrarius serves as the major animal host. Based on the differences in predominant leptospiral serovars, epidemic features and incidence, there are three leptospirosis-prevalent regions in China. However, the incidence has significantly decreased in the last ten years.

18.
Tracing the source of campylobacteriosis
Campylobacter jejuni is the leading cause of bacterial gastro-enteritis in the developed world. It is thought to infect 2–3 million people a year in the US alone, at a cost to the economy in excess of US $4 billion. C. jejuni is a widespread zoonotic pathogen that is carried by animals farmed for meat and poultry. A connection with contaminated food is recognized, but C. jejuni is also commonly found in wild animals and water sources. Phylogenetic studies have suggested that genotypes pathogenic to humans bear greatest resemblance to non-livestock isolates. Moreover, seasonal variation in campylobacteriosis bears the hallmarks of water-borne disease, and certain outbreaks have been attributed to contamination of drinking water. As a result, the relative importance of these reservoirs to human disease is controversial. We use multilocus sequence typing to genotype 1,231 cases of C. jejuni isolated from patients in Lancashire, England. By modeling the DNA sequence evolution and zoonotic transmission of C. jejuni between host species and the environment, we assign human cases probabilistically to source populations. Our novel population genetics approach reveals that the vast majority (97%) of sporadic disease can be attributed to animals farmed for meat and poultry. Chicken and cattle are the principal sources of C. jejuni pathogenic to humans, whereas wild animal and environmental sources are responsible for just 3% of disease. Our results imply that the primary transmission route is through the food chain, and suggest that incidence could be dramatically reduced by enhanced on-farm biosecurity or preventing food-borne transmission.

19.
Conservation biologists, as well as veterinary and public health officials, would benefit greatly from being able to forecast whether outbreaks of infectious disease will be major. For values of the basic reproductive number (R0) between one and two, infectious disease outbreaks have a reasonable chance of either fading out at an early stage or, in the absence of intervention, spreading widely within the population. If it were possible to predict when fadeout was likely to occur, the need for costly precautionary control strategies could be minimized. However, the predictability of even simple epidemic processes remains largely unexplored. Here we conduct an examination of simulated data from the early stages of a fatal disease outbreak and explore how observable information might be useful for predicting major outbreaks. Specifically, would knowing the time of deaths for the first few cases allow us to predict whether an outbreak will be major? Using two approaches, trajectory matching and discriminant function analysis, we find that even in our best-case scenario (with accurate knowledge of epidemiological parameters, and precise times of death), it was not possible to reliably predict the outcome of a stochastic Susceptible–Exposed–Infectious–Recovered (SEIR) process.
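The stochastic SEIR process examined here can be simulated event by event with a Gillespie-style algorithm; whether a given run fades out early or becomes a major outbreak is then simply the realized final size. A generic sketch (the parameters used are arbitrary illustrations, not fitted values from the study):

```python
import random

def seir_final_size(S, E, I, R, beta, sigma, gamma, seed=None):
    """Event-driven stochastic SEIR: repeatedly pick the next event
    (infection S->E, progression E->I, recovery I->R) with probability
    proportional to its rate, until no exposed or infectious
    individuals remain. Returns the total number ever infectious."""
    rng = random.Random(seed)
    N = S + E + I + R
    ever_infectious = I + R
    while E + I > 0:
        r_inf = beta * S * I / N
        r_prog = sigma * E
        r_rec = gamma * I
        u = rng.random() * (r_inf + r_prog + r_rec)
        if u < r_inf:
            S -= 1; E += 1
        elif u < r_inf + r_prog:
            E -= 1; I += 1; ever_infectious += 1
        else:
            I -= 1; R += 1
    return ever_infectious
```

Repeating this over many seeds with R0 = beta/gamma between one and two reproduces the bimodal behaviour the abstract describes: a substantial fraction of runs fade out after a handful of cases, while the remainder spread widely.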

20.
The significant advances made by the global scientific community during the COVID-19 pandemic, exemplified by the development of multiple SARS-CoV-2 vaccines in less than 1 y, were made possible in part because of animal research. Historically, animals have been used to study the characterization, treatment, and prevention of most of the major infectious disease outbreaks that humans have faced. From the advent of modern ‘germ theory’ prior to the 1918 Spanish Flu pandemic through the more recent Ebola and Zika virus outbreaks, research that uses animals has revealed or supported key discoveries in disease pathogenesis and therapy development, helping to save lives during crises. Here we summarize the role of animal research in past pandemic and epidemic response efforts, as well as current and future considerations for animal research in the context of infectious disease research.

From the moment it began in late 2019, the COVID-19 pandemic has been met with remarkable scientific effort. In less than 1 y, substantial progress has been made in understanding the behavior of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), the causative agent of COVID-19, characterizing the damage it inflicts on the body, and developing safe and effective vaccines. Research in animals has provided many breakthroughs, as it has for most significant outbreaks in the past. Animals were used to study infectious diseases long before disease-causing microorganisms were known to exist. Animal research in response to pandemics, past and present, provides a clear example of how such research can best serve the scientific community in the event of future outbreaks and other disease conditions. Response to a pandemic requires quick action to identify the emerging disease, characterize transmission and pathogenesis, and develop preventative measures and therapies. Ideally, through the surveillance of environments and animal populations that may harbor pathogens with pandemic potential, and through preclinical and basic science research in virology and vaccinology for diseases that are suspected to be potential threats, the pandemic response should begin before a disease gains the ability to spread easily through a population. In this article, we discuss the importance of animal research in all aspects of pandemic response and the vital role it continues to play today.
