Similar Articles
20 similar articles found (search time: 15 ms)
1.
Two popular models of absence of synergism in epidemiologic cohort studies are analyzed and compared. It is shown that the statistical concept of the union of independent events, which has traditionally given rise to the “additive” model of relative risk, can also generate the “multiplicative” model of relative risk. In fact, the same set of approximating conditions can be used to generate both models, which suggests a lack of identifiability under the traditional approach. An alternate approach is proposed in this paper. The new approach does not require the assumption that background risk factors are independent of the causal agents of interest. The concept of “dose additivity” is discussed.
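The two model forms can be illustrated numerically; the combination rules below are the standard epidemiologic definitions, and the relative-risk values are hypothetical, not taken from the paper:

```python
# Joint relative risk of two exposures under the "additive" model
# (additivity of excess relative risks) vs. the "multiplicative" model.
def rr_joint_additive(rr1, rr2):
    # RR12 - 1 = (RR1 - 1) + (RR2 - 1), so RR12 = RR1 + RR2 - 1
    return rr1 + rr2 - 1.0

def rr_joint_multiplicative(rr1, rr2):
    # RR12 = RR1 * RR2
    return rr1 * rr2

rr1, rr2 = 2.0, 3.0  # hypothetical single-exposure relative risks
print(rr_joint_additive(rr1, rr2))        # 4.0
print(rr_joint_multiplicative(rr1, rr2))  # 6.0
```

Observed joint risks falling between these two predictions are precisely what makes the identifiability question raised above difficult.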

2.
Microbial risk assessors often make simplifying assumptions that lead to the selection of simple concave functions with low-dose linearity, consistent with no-threshold and single-hit hypotheses, as default dose-response model forms. However, as the “microbiome revolution” progresses, accumulating evidence challenges these assumptions and thereby influences estimates of the nature and magnitude of uncertainties associated with microbial risks. Scientific advances in knowledge of the human “superorganism” (the hybrid consortium of human plus microbial communities that cooperatively regulates health and disease) enable the design of definitive studies to estimate the pathogen doses overcome by innate defenses, including the protective microbiota. The systematic investigation of the events of non-typhoid salmonellosis in humans undertaken nearly two decades ago was updated to incorporate recent scientific advances in understanding the impact of the healthy superorganism, which strengthens and extends the biological motivations for sublinear or convex dose-response curves in microbial risk assessment. Knowledge of colonization resistance (innate protection of the human superorganism from low doses of ingested pathogens) and of microbiota-mediated clearance is advancing mechanistically for many pathosystems. However, until more detailed mechanistic data become available for salmonellosis, consideration of a variety of empirical model forms is essential for depicting the uncertainty of the “true” dose-response model.
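The default low-dose-linear forms mentioned above are commonly written as the single-hit exponential model and the approximate beta-Poisson model; a minimal sketch, with hypothetical parameter values:

```python
import math

# Two standard microbial dose-response forms (parameters are illustrative).
def exponential_dr(dose, r):
    # Single-hit exponential model: P(infection) = 1 - exp(-r * dose)
    return 1.0 - math.exp(-r * dose)

def approx_beta_poisson(dose, alpha, n50):
    # Approximate beta-Poisson:
    # P = 1 - (1 + dose * (2**(1/alpha) - 1) / N50) ** (-alpha)
    return 1.0 - (1.0 + dose * (2.0 ** (1.0 / alpha) - 1.0) / n50) ** (-alpha)

print(round(exponential_dr(1.0, 0.5), 3))       # 0.393
print(approx_beta_poisson(10.0, 0.25, 10.0))    # 0.5 at dose = N50
```

Both forms are concave and near-linear at low dose, which is exactly the default behavior the abstract argues may need revisiting for sublinear or convex alternatives.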

3.
The symptom of impotence is, in the vast majority of cases, extremely well defined. However, therapeutic approaches incorporate various strategies, including andrological, analytical, and behavioural ones. A great advance in recent years was the development of “integrative models” such as the “American Model” of Helen Kaplan, the “Geneva Model” of Willy Pasini, and our proposed “Parisian Model”, which is founded on the principles of analytical sexotherapy and comprises several clearly defined stages. The management of impotence must not be restricted to pharmacological treatment.

4.
The long-term conservation of biodiversity and the related ecosystem goods and services of the Autonomous Region of Madrid is jeopardized by the intensive, resource-consuming development model followed by the region in the past few decades. This paper presents the aggregated results of the first integrated assessment of the protected areas of the Autonomous Region of Madrid (Spain) with the System for the Integrated Assessment of Protected Areas (SIAPA); detailed results for individual protected areas are provided as supplementary data. The assessment was carried out during 2009–2010 on ten protected areas differing in size (from 2.5 to 52,796 ha), protection category (seven categories), and types of ecosystems present. A comparison of results from both assessment models of the SIAPA (the Complete Model and the Simplified Model) is also presented. The results from the Complete Model show that eight of the ten protected areas of the Autonomous Region of Madrid are currently ineffective. The poorest partial indexes overall were “State of Conservation” and “Social and Economic Context”. The only indexes significantly correlated with the effectiveness of a protected area were “State of Conservation” (r = 0.851**) and “Social Perception and Valuation” (r = 0.786**). Although not as relevant as had been thought, “Management” and the other non-significant factors are likely to influence the effectiveness of protected areas as well. The results for the Simplified Model are slightly better than those for the Complete Model, although this is probably specific to this assessment. The two models of the SIAPA were very significantly correlated, although their aggregated results should not be compared directly.

5.
Summary: A general framework is proposed for Bayesian model-based designs of Phase I cancer trials, within which a general criterion for coherence (Cheung, 2005, Biometrika 92, 863–873) of a design is also developed. This framework can incorporate both “individual” and “collective” ethics into the design of the trial. We propose a new design that minimizes a risk function composed of two terms, one representing the individual risk of the current dose and the other the collective risk. The performance of this design, measured in terms of the accuracy of the estimated target dose at the end of the trial, the toxicity and overdose rates, and certain loss functions reflecting the individual and collective ethics, is studied and shown to compare favorably with existing Bayesian model-based designs.
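A two-term criterion of the kind described can be sketched as follows; the quadratic losses, target toxicity level, and weight `w` are hypothetical stand-ins for illustration, not the paper's actual risk function:

```python
# Sketch of a weighted individual-plus-collective risk criterion.
TARGET_TOX = 0.3  # illustrative target dose-limiting toxicity probability

def individual_risk(p_tox):
    # Loss for treating the current patient at a dose with toxicity p_tox.
    return (p_tox - TARGET_TOX) ** 2

def collective_risk(p_tox_future):
    # Average loss over the doses future patients would receive.
    return sum((p - TARGET_TOX) ** 2 for p in p_tox_future) / len(p_tox_future)

def total_risk(p_tox, p_tox_future, w=0.5):
    # w trades off the current patient's interest against future patients'.
    return w * individual_risk(p_tox) + (1 - w) * collective_risk(p_tox_future)

print(total_risk(0.3, [0.3, 0.3]))  # 0.0: on target everywhere, zero risk
```

The weight `w` is where the "individual" versus "collective" ethical balance enters the design.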

6.
Sea-level rise (SLR) induced by climate change has caused impacts (e.g., floods and saline intrusion) in estuaries. In this work, we used monitoring data (salinity, sediment, and taxa occurrence), simulated saline intrusion, and species distribution models to predict the spatial distribution of families in the estuary at two levels of SLR (0.5 m and 1 m) for two scenarios (moderate and extreme). For the simulation, we used the ensemble method applied to five models (MARS, GLM, GAM, RF, and BRT). High AUC and TSS values indicated “good” to “excellent” accuracy; RF and GLM obtained the best and worst values, respectively. The model predicted local extinctions and new colonizations in the upper estuarine zones. With the effects of climate change intensifying, it is extremely important that managers consider the use of predictive tools to anticipate the local-scale impacts of climate change on species migration.
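The TSS accuracy measure mentioned above is computed directly from a presence/absence confusion matrix; a sketch with hypothetical counts:

```python
# True skill statistic (TSS) = sensitivity + specificity - 1,
# computed from a 2x2 confusion matrix (counts are illustrative).
def tss(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)   # true positive rate
    specificity = tn / (tn + fp)   # true negative rate
    return sensitivity + specificity - 1.0

print(round(tss(40, 5, 10, 45), 3))  # 0.7
```

Unlike raw accuracy, TSS is insensitive to prevalence, which is why it is commonly reported alongside AUC for species distribution models.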

7.
The interaction of background disease processes with environmentally induced diseases has long been an issue of considerable interest and debate with respect to its impact on risk assessment. Whether, and to what extent, these processes should be considered independent of or additive to background has been the principal focus of debate. The concept of hormesis, a biphasic dose response characterized by low-dose stimulation and high-dose inhibition, as framed within the context of post-conditioning, reveals a third type of “background” possibility: “subtraction from background”. This novel application of the hormesis concept, framed within the biological context of post-conditioning adaptive processes, has considerable implications for the assessment of aging and for environmental risk assessment.
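A biphasic curve of the kind hormesis describes can be sketched as follows; the functional form and all parameters are purely illustrative, chosen only to reproduce the characteristic shape, and are not a model from the paper:

```python
import math

# Illustrative hormetic dose-response: low-dose stimulation above baseline,
# high-dose inhibition below it (all parameter values are hypothetical).
def hormetic_response(dose, baseline=1.0, stim=0.5, k_stim=1.0, k_inhib=0.1):
    return baseline + stim * dose * math.exp(-k_stim * dose) - k_inhib * dose

print(hormetic_response(1.0) > 1.0)   # True: low dose stimulates
print(hormetic_response(20.0) < 1.0)  # True: high dose inhibits
```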

8.
Some recommended protocols for in vitro chromosome-aberration assays call for two flasks per dose group. Use of replicate flasks allows for possible variation in the percentage of aberrant cells (ABR) between flasks. We studied the magnitude of variation between replicate flasks of Chinese hamster ovary (CHO) cells using data from 211 assays from three laboratories, in order to assess the effect on assay sensitivity. Based on all 403 pairs of replicate “control” flasks, there was almost no excess variability between flasks: the standard deviation (SD) was only 4% larger than the value expected purely from sampling cells (P < 0.05). Data from all 366 pairs of replicate “treated” flasks showed that between-flask variation increased with the average percentage of aberrant cells (P < 0.001); the SD for 60 pairs of flasks with 3.0–7.5% ABR cells was 32% larger than the expected value. However, computer simulations based on these data showed that use of replicate flasks has little effect on assay false-positive or true-positive rates. All assays with replicate treated flasks and at least three dose groups (including control) were re-analyzed as “single-flask” experiments, defined by taking both control flasks but only one treated flask per dose. For each assay, all possible single-flask experiments were re-analyzed and the percentage with positive results recorded. For most assays, the conclusions were the same regardless of which treated flasks were selected, even though these single-flask experiments had only half as many cells scored per active dose group. For a very few assays with marginal results, the conclusion could change depending on which set of flasks was chosen, but these results were so borderline that a repeat assay was required in any case. Repeating the assay is a better way to resolve marginal results than examining replicate flasks.
From our re-examination of the experimental data and from the computer simulations, we conclude that, while flask-to-flask variability exists, it has no practical effect on the test outcome, so use of replicate flasks is not necessary for this assay.
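The "value expected purely from sampling cells" referred to above is the binomial standard deviation of the observed percentage; a sketch with hypothetical numbers:

```python
import math

# Binomial SD of an observed percent-aberrant value, given the underlying
# rate and the number of cells scored per flask (numbers are illustrative).
def binomial_sd_percent(p_percent, n_cells):
    p = p_percent / 100.0
    return 100.0 * math.sqrt(p * (1.0 - p) / n_cells)

# e.g., a 5% aberrant rate scored over 100 cells:
print(round(binomial_sd_percent(5.0, 100), 2))  # 2.18
```

Excess between-flask variability is then the amount by which the observed SD exceeds this sampling-only expectation (4% for controls, up to 32% for treated flasks in the study above).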

9.
Ecological studies of health effects due to agent exposure are generally considered a blunt instrument of scientific investigation, unfit to determine the “true” exposure-effect relationship for an agent. Based on this widely accepted tenet, ecological studies of the correlation between local air concentrations of radon and local lung cancer mortality, as measured by Cohen, have been criticized as being subject to the “ecological fallacy” and thus producing invalid risk data. Here we discuss the minimum data that a risk assessment requires in order to make a valid risk estimate. Examination of these data and a “thought experiment” show that it is Cohen's raw ecological data, uncorrected for population-characteristic factors, that are the proper data for a risk assessment. Consequently, the “true” exposure-effect relationship becomes less and less important the more population-characteristic factors are identified and the larger they are. This reversal of the usual argument is due to our approach: here, the prediction of health effects in an exposed population is of primary importance, not the shape of the “true” exposure-effect relationship. The results derived in this paper hold for ecological studies of any agent causing any health or other effect.

10.
Accurate modeling of the geographic distributions of species is crucial to various applications in ecology and conservation. The best-performing techniques often require some parameter tuning, which may be prohibitively time-consuming to do separately for each species, or unreliable for small or biased datasets. Additionally, even with an abundance of good-quality data, users interested in applying species models may lack the statistical knowledge required for detailed tuning. In such cases, it is desirable to use “default settings” tuned and validated on diverse datasets. Maxent is a recently introduced modeling technique that achieves high predictive accuracy and enjoys several additional attractive properties. The performance of Maxent is influenced by a moderate number of parameters, and the first contribution of this paper is the empirical tuning of these parameters. Since many datasets lack information about species absence, we present a tuning method that uses presence-only data, and we evaluate it on independently collected, high-quality presence-absence data. In addition to tuning, we introduce several concepts that improve the predictive accuracy and running time of Maxent: “hinge features” that model more complex relationships in the training data; a new logistic output format that gives an estimate of the probability of presence; and “background sampling” strategies that cope with sample-selection bias and decrease model-building time.
Our evaluation, based on a diverse dataset of 226 species from 6 regions, shows that: 1) default settings tuned on presence-only data achieve performance almost as good as if they had been tuned on the evaluation data itself; 2) hinge features substantially improve model performance; 3) logistic output improves model calibration, so that large differences in output values correspond better to large differences in suitability; 4) “target-group” background sampling can give much better predictive performance than random background sampling; 5) random background sampling results in a dramatic decrease in running time, with no decrease in model performance.
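A hinge feature of the kind introduced above can be sketched as a piecewise-linear transform of an environmental variable; the knot placement and 0-to-1 scaling here are illustrative:

```python
# Sketch of a "hinge feature": zero up to a knot h, then rising linearly
# to 1 at the variable's observed maximum (values are hypothetical).
def hinge(x, h, x_max):
    if x <= h:
        return 0.0
    return min(1.0, (x - h) / (x_max - h))

print(hinge(5.0, 10.0, 20.0))   # 0.0 (below the knot)
print(hinge(15.0, 10.0, 20.0))  # 0.5 (halfway between knot and maximum)
```

A set of such features with knots at many values lets a linear model fit an arbitrary piecewise-linear response to each variable, which is the flexibility gain the abstract credits to hinge features.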

11.
Estimation of the absorbed dose in pediatric nuclear medicine is mandatory and necessary to assess the radiation-protection risk for each child. The medical physicist plays an essential role in this process, having the expertise required for this evaluation. This work aims to compare different dosimetric methods from the literature, all based on the approach most widely used in this domain, the “MIRD method”. The maximum relative deviations are 30% for the three cases studied, and none of the methods establishes a reference.
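At its core, the MIRD schema computes an absorbed dose as the product of cumulated activity and an S value; the sketch below uses hypothetical numbers purely for illustration:

```python
# MIRD schema in its simplest form: D(target <- source) = A_tilde * S,
# with A_tilde the cumulated activity (MBq*s) in the source organ and
# S the dose per unit cumulated activity (mGy per MBq*s). Values are
# illustrative, not from the study.
def mird_dose(cumulated_activity_mbq_s, s_value_mgy_per_mbq_s):
    return cumulated_activity_mbq_s * s_value_mgy_per_mbq_s

print(round(mird_dose(1.0e5, 2.0e-4), 6))  # 20.0 mGy
```

In practice, S values are tabulated per radionuclide and per source-target organ pair and scaled to pediatric phantoms, which is where the compared methods diverge.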

12.
A “safe” or sub-threshold dose is often estimated for the oral toxicity of substances in order to protect humans from adverse health effects. This dose is referred to by several terms: reference dose (RfD), tolerable daily intake (TDI), and acceptable daily intake (ADI); similarly, tolerable concentration (TC) and reference concentration (RfC) are commonly used terms for a “safe” concentration for inhalation. The process of deriving these doses generally involves identifying a no-observed- or lowest-observed-adverse-effect level (NOAEL or LOAEL) in animals or humans, and applying uncertainty factors to account for the extrapolation from laboratory animals to humans and/or from an average human to a sensitive human. Public health agencies have begun to consider using a data-derived approach, which uses available toxicokinetic and toxicodynamic data in the determination of uncertainty factors, rather than relying on the standard default values. Recently, two different tolerable daily intake risk values were derived by two different World Health Organization (WHO) working groups. The International Programme on Chemical Safety and the Working Group on Chemical Substances in Drinking Water both used the approach developed by Renwick (1993); however, the two groups interpreted and used the available data differently, resulting in a difference of more than twofold in the total uncertainty factor used. This review compares and contrasts the two approaches used by these WHO working groups.
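The derivation described above can be sketched numerically; the NOAEL value and the default 10x interspecies and 10x intraspecies uncertainty factors below are illustrative:

```python
# Standard "safe dose" derivation: RfD = NOAEL / (UF_inter * UF_intra).
# NOAEL and uncertainty factors are hypothetical example values.
def reference_dose(noael_mg_per_kg_day, uf_interspecies=10.0, uf_intraspecies=10.0):
    return noael_mg_per_kg_day / (uf_interspecies * uf_intraspecies)

print(reference_dose(50.0))  # 0.5 mg/kg-day from a hypothetical NOAEL of 50
```

The data-derived approach discussed above replaces one or both default 10x factors with chemical-specific toxicokinetic/toxicodynamic values, which is exactly how the two WHO working groups came to differ by more than twofold.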

13.
Risk assessment research rarely quells controversy. Mega-mouse and mega-rat experiments contradicted a threshold for carcinogenesis, yet thresholds are still argued. High-to-low-dose continuity of response, from cigarette smoking to environmental tobacco smoke and from occupational asbestos exposure to take-home asbestos, contradicts thresholds in people. Nevertheless, mechanistic hypotheses allege “Houdini risk assessments”, which make risks disappear or allow industries to escape from protecting workers. Despite concerns about animal-to-human extrapolation, priority occupational exposures with sufficient or substantial evidence of carcinogenicity in people that are not addressed by new exposure limits include silica, sulfuric acid mist, chromates, diesel particulate matter, particulate matter generally, metalworking fluids, welding fume, and formaldehyde. “Houdini risk assessments” are exercises in “anti-hypothesis generation”: ignore selected tumor sites and types; ignore data from people (as with formaldehyde and diesel); choose the most resistant species in laboratory tests; select biochemical parameters in which the most resistant species resembles people; assume a mechanism that gives a threshold or steep exposure response for the carcinogenic effect; and reduce the estimated human risk by the parameter ratio relative to the most resistant species. NORA research should focus on quantitative reconciliation of laboratory and epidemiology studies, and should develop a research agenda to counter “anti-hypothesis generation” for key exposure circumstances.

14.
Mice irradiated with lethal doses of total-body X-irradiation are very susceptible to Staphylococcus aureus. A difference in virulence between a “mouse-virulent” strain and a “mouse-non-virulent” strain lacking cell-wall aggressin could nevertheless be demonstrated. The virulence of the “mouse-non-virulent” strain could be increased by adding to the inoculum a cell-wall preparation with aggressin activity (DOCR) from a “mouse-virulent” strain. When injected together with a dose of bacteria lower than the minimum pus-forming dose, a lesion-enhancing effect of DOCR from a “mouse-virulent” strain was also demonstrated in man.

15.
16.
We compared the performance of four logistic regression models of different complexity, with environmental data of different quality, in predicting the occurrence of 49 terrestrial mollusc species in southern Sweden. The performance of models derived from an explanatory data set was evaluated on a confirmatory data set. The overall predictive success of our models (>80% for the three best model approaches) is as good as in other studies, despite the fact that we had to transform a text database into quantitative habitat variables. Simple models (no variable interactions), with forward selection and detailed habitat data (from field visits), showed the best overall predictive success (mean = 84.8%). From comparisons of model approaches, we conclude that data quality (map-derived data vs. habitat mapping) had a stronger impact than model complexity on model performance. However, most of these models showed relatively low values (mean = 0.29) of Kappa (a statistic for model evaluation), suggesting that the models need improvement before they can be applied. Predictive success was strongly associated with species incidence, and Kappa was also positively correlated with species incidence in univariate tests. Predictive success for true absences was negatively correlated with predictive success for true presences (R2 = 0.69), and most models failed to predict both categories well. Models for species with a high incidence in “Open dry sites” or “Mesic interior forests” performed better than expected, suggesting that occurrences of species with a preference for “narrow” habitats are the easiest to predict. Tree-layer variables (openness and species abundance) were included in 48 of the 49 final predictive models, suggesting that these variables were good “indicators” of habitat conditions for ground-living molluscs. Twenty-four species models included distance to coast and altitude; we interpret these associations as partly related to differences in climate.
In the final models, true presences (36.9% correctly classified) were much harder to predict than true absences (89.7% correct). A possible explanation is that important habitat variables (e.g., chemical variables and site history) were not included. On the other hand, not all suitable sites would be expected to be occupied, owing to the dynamics of local extinctions (metapopulation theory).
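The Kappa statistic used above for model evaluation can be computed from a 2x2 presence/absence confusion matrix; the counts below are hypothetical:

```python
# Cohen's kappa: agreement between predicted and observed presence/absence,
# corrected for chance agreement (counts are illustrative).
def cohens_kappa(tp, fp, fn, tn):
    n = tp + fp + fn + tn
    p_observed = (tp + tn) / n
    # Chance agreement from the marginal totals of the confusion matrix.
    p_expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    return (p_observed - p_expected) / (1.0 - p_expected)

print(round(cohens_kappa(20, 10, 15, 55), 2))  # 0.43
```

Because kappa corrects for chance agreement, a model can show high raw predictive success yet a low kappa for rare species, which is consistent with the incidence effects reported above.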

17.
18.
Coverage of fifth-generation (5G) networks has increased steadily since their introduction in 2019. However, public protests against the construction of 5G base stations have continued to occur around the globe, for fear that the electromagnetic (EM) waves emitted from the stations cause adverse health effects. To identify factors that have contributed to this increased risk perception, we conducted a cross-sectional study using data from a survey that assessed Korean adults' risk perception of EM wave-related objects. We found that female gender, a high level of perceived exposure to EM waves, evaluation of public policies as ineffective, and a high level of objective knowledge about EM waves were associated with increased risk perception. Furthermore, higher ratings on several risk characteristics, such as “personal knowledge,” “seriousness of the risk to future generations,” “dreadfulness,” and “severity of consequences,” were also associated with increased risk perception. Bioelectromagnetics. © 2020 The Authors. Bioelectromagnetics published by Wiley Periodicals LLC on behalf of the Bioelectromagnetics Society.

19.
Currently, no general measure of population health response to untestable doses of chemicals and microbes has been established that accounts for uncertainty quantitatively and indicates relative toxicity or virulence directly. Untestable doses include those corresponding to the 2.74 × 10−7 illnesses per exposure expressed in the goals of the U.S. Environmental Protection Agency's Surface Water Treatment Rule, as well as doses of human bioterror agents. For example, it is shown that relative benchmark dose (BMD) values depend upon the level of confidence assumed; because this level lacks a scientific basis, BMDs are not comparable among health stressors for untestable doses and stressors. In this paper, a new predictive Bayesian method is proposed for absolute and relative dose-response assessment based on available information, which may include toxicological judgment, epidemiological statistics, genetic information, related data, and numeric dose-response data. Results for rotavirus indicate a “safe dose” of 6.3 × 10−7 focus-forming units/exposure, approximately one-half log above the dose corresponding to the maximum risk for any pathogen assuming a 100% infection rate. The results further indicate the limited value of additional data in refining the assessment, owing to the inability of data to reduce variability. The method is suggested for assessing risks of new and existing chemicals and pathogens, as a basis for prioritizing expenditures for protection against environmental and terrorist threats.
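Assuming the Surface Water Treatment Rule goal of 10−4 infections per person-year and one exposure per day (a common interpretation, stated here as an assumption rather than taken from the paper), the per-exposure figure quoted above follows directly:

```python
# Converting an annual risk goal to a per-exposure risk, assuming
# one exposure per day (365 exposures per year).
annual_goal = 1.0e-4          # assumed SWTR goal: infections per person-year
exposures_per_year = 365.0
per_exposure = annual_goal / exposures_per_year

print(f"{per_exposure:.2e}")  # 2.74e-07
```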

20.
