Similar Articles
20 similar articles found.
1.
This paper is devoted to the statistical analysis of a stochastic model introduced in [P. Bertail, S. Clémençon, and J. Tressou, A storage model with random release rate for modelling exposure to food contaminants, Math. Biosci. Eng. 35 (1) (2008), pp. 35–60] for describing the phenomenon of exposure to a certain food contaminant. In this modelling, the temporal evolution of contamination exposure is entirely determined by the accumulation due to successive dietary intakes and the pharmacokinetics governing the elimination process in between intakes, so that the exposure dynamic through time is described as a piecewise deterministic Markov process. Paths of the contamination exposure process are scarcely observable in practice; therefore, intensive computer simulation methods are crucial for estimating the time-dependent or steady-state features of the process. Here we consider simulation estimators based on consumption and contamination data and investigate how to construct accurate bootstrap confidence intervals (CIs) for certain quantities of considerable importance from the epidemiological viewpoint. Special attention is also paid to the problem of computing the probability of certain rare events related to the exposure process path arising in dietary risk analysis, using multilevel splitting or importance sampling (IS) techniques. Applications of these statistical methods to a collection of data sets related to dietary methyl mercury contamination are discussed thoroughly.
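A plain percentile bootstrap over simulated steady-state exposures gives the flavour of the CI construction discussed above; this is a minimal sketch, assuming a lognormal exposure sample purely for illustration (the paper builds more refined bootstrap CIs from simulated paths of the process):

```python
import random

random.seed(0)

# Hypothetical steady-state exposure values, one per simulated path; in the
# paper these would come from simulating the piecewise deterministic Markov
# process with real consumption and contamination data.
exposures = [random.lognormvariate(0.0, 0.5) for _ in range(500)]

def sample_mean(xs):
    return sum(xs) / len(xs)

def percentile_bootstrap_ci(sample, stat, n_boot=2000, alpha=0.05):
    """Plain percentile bootstrap CI for a statistic of the sample."""
    n = len(sample)
    reps = sorted(stat([random.choice(sample) for _ in range(n)])
                  for _ in range(n_boot))
    return reps[int(alpha / 2 * n_boot)], reps[int((1 - alpha / 2) * n_boot) - 1]

ci_low, ci_high = percentile_bootstrap_ci(exposures, sample_mean)
```

The same resampling skeleton applies to any simulation estimator; only the `stat` function changes.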

2.
This review identifies several important challenges in null model testing in ecology: 1) developing randomization algorithms that generate appropriate patterns for a specified null hypothesis; these randomization algorithms stake out a middle ground between formal Pearson–Neyman tests (which require a fully‐specified null distribution) and specific process‐based models (which require parameter values that cannot be easily and independently estimated); 2) developing metrics that specify a particular pattern in a matrix, but ideally exclude other, related patterns; 3) avoiding classification schemes based on idealized matrix patterns that may prove to be inconsistent or contradictory when tested with empirical matrices that do not have the idealized pattern; 4) testing the performance of proposed null models and metrics with artificial test matrices that contain specified levels of pattern and randomness; 5) moving beyond simple presence–absence matrices to incorporate species‐level traits (such as abundance) and site‐level traits (such as habitat suitability) into null model analysis; 6) creating null models that perform well with many sites, many species pairs, and varying degrees of spatial autocorrelation in species occurrence data. In spite of these challenges, the development and application of null models has continued to provide valuable insights in ecology, evolution, and biogeography for over 80 years.
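Challenge 1, pairing a randomization algorithm with a co-occurrence metric, can be sketched as follows. This is one common choice (a C-score-style checkerboard metric with a fixed-row-totals, equiprobable-sites null); the toy presence–absence matrix is invented:

```python
import random

random.seed(1)

# Toy presence-absence matrix: rows = species, columns = sites.
matrix = [
    [1, 1, 1, 0, 0, 0],
    [0, 0, 0, 1, 1, 1],
    [1, 1, 0, 0, 0, 1],
]

def checkerboard_units(m):
    """C-score numerator: sum over species pairs of (r_i - S)(r_j - S),
    where r is a row total and S the number of shared sites."""
    total = 0
    for i in range(len(m)):
        for j in range(i + 1, len(m)):
            shared = sum(a and b for a, b in zip(m[i], m[j]))
            total += (sum(m[i]) - shared) * (sum(m[j]) - shared)
    return total

def randomize_fixed_rows(m):
    """Null algorithm: keep each species' occurrence total fixed and place
    its occurrences among sites equiprobably."""
    n_sites = len(m[0])
    out = []
    for row in m:
        sites = set(random.sample(range(n_sites), sum(row)))
        out.append([1 if s in sites else 0 for s in range(n_sites)])
    return out

observed = checkerboard_units(matrix)
null_dist = [checkerboard_units(randomize_fixed_rows(matrix)) for _ in range(1000)]
p_value = sum(x >= observed for x in null_dist) / len(null_dist)
```

Swapping the randomization (e.g., fixing column totals as well) or the metric changes which null hypothesis is actually being tested, which is exactly the review's point.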

3.
Extrapolation of health risks from high to low doses has received considerable attention in carcinogenic risk assessment for decades. Fitting statistical dose-response models to experimental data collected at high doses and using the fitted model to estimate effects at low doses can lead to quite different risk predictions. Dissatisfaction with this procedure has been voiced both by toxicologists, who saw a deficit of biological knowledge in the models, and by risk modelers, who saw the need for mechanistically based stochastic modeling. This contribution summarizes the present status of low dose modeling and the determination of the shape of dose-response curves. We address the controversial issues of the appropriateness of threshold models, the estimation of no observed adverse effect levels (NOAEL), and their relevance for low dose modeling. We distinguish between quantal dose-response models for tumor incidence and models of the more informative age/time-dependent tumor incidence. The multistage model and the two-stage model of clonal expansion are considered as dose-response models accounting for biological mechanisms. Problems of the identifiability of mechanisms are addressed, the relation between administered dose and effective target dose is illustrated by examples, and the recently proposed Benchmark Dose concept for risk assessment is presented with its consequences for mechanistic modeling and statistical estimation.
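The Benchmark Dose concept mentioned at the end can be illustrated with the multistage model: fix a benchmark response (extra risk over background) and solve for the dose that attains it. The coefficients below are hypothetical, not fitted to any bioassay:

```python
import math

# Hypothetical multistage coefficients for P(d) = 1 - exp(-(q0 + q1*d + q2*d^2));
# q0 is the background term and cancels out of the extra risk.
q0, q1, q2 = 0.02, 0.15, 0.05

def extra_risk(d):
    """Extra risk over background: (P(d) - P(0)) / (1 - P(0))."""
    return 1.0 - math.exp(-(q1 * d + q2 * d * d))

def benchmark_dose(bmr=0.10, lo=0.0, hi=10.0, tol=1e-9):
    """Dose with extra risk equal to bmr, found by bisection (extra_risk is
    monotone increasing in d for non-negative coefficients)."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if extra_risk(mid) < bmr:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

bmd10 = benchmark_dose(0.10)  # benchmark dose at 10% extra risk
```

In practice the coefficients are fitted to incidence data and a lower confidence limit on the benchmark dose (the BMDL) is reported, which this sketch omits.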

4.
Using a modified version of the substitutional process proposed by Neyman, we estimate the parameters of a phylogenetic tree made up of three species (or groups of species). The parameters estimated are the rate of substitution of amino acids along a protein and the ratio of the divergence times of the species (or groups of species). A method is given for determining the tree structure when it is not known. Both maximum likelihood and Bayes methods are used in the estimation. The basic model of the substitutional process within the proteins is validated by showing that the estimates of the ratio of the divergence times of three species computed from two different protein molecules (haemoglobin α and fibrinopeptides) are within one standard deviation of each other. Next we consider the construction of the correct phylogenetic tree made up of three or more taxonomic categories, such as phyla or classes, utilizing the structure of the various types of protein molecules of the species in these categories. The generalization of the procedure to the construction of the entire phylogenetic tree is also indicated. The main advantage of this method of tree construction over the traditional method is that the latter can use the information of only one type of protein (for example, cytochrome c), while the method of this paper can use all the available data from the different molecules. We also discuss the recent controversy over the constancy of the molecular clock.

7.
Biomechanical models are important tools in the study of human motion. This work proposes a computational model to analyse the dynamics of lower limb motion using a kinematic chain to represent the body segments, with rotational joints linked by viscoelastic elements. The model uses anthropometric parameters, ground reaction forces and joint Cardan angles from subjects to analyse lower limb motion during gait, and allows these data to be evaluated in each body plane. Six healthy subjects walked on a treadmill to record the kinematic and kinetic data, and anthropometric parameters were recorded to construct the model. The viscoelastic parameter values were fitted for the model joints (hip, knee and ankle). The proposed model demonstrated that manipulating the viscoelastic parameters between the body segments could fit the amplitudes and frequencies of motion. The viscoelastic parameter values fitted to the collected data follow a normal distribution, indicating that these values are directly related to the gait pattern. To validate the model, we compared the joint angles produced by the model with previously published data: the model results show the same pattern and range of values as found in the literature for human gait.
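A viscoelastic joint element of the kind described can be sketched as a Kelvin–Voigt (spring plus damper) torque evaluated along an angle trace. The stiffness and damping values and the sinusoidal "knee angle" below are illustrative stand-ins, not the paper's fitted parameters:

```python
import math

# Hypothetical Kelvin-Voigt joint parameters (stiffness k in N*m/rad, damping
# c in N*m*s/rad) and a sinusoid standing in for a measured knee-angle trace;
# the paper fits such parameters per joint and per subject.
k, c, theta_rest = 60.0, 1.5, 0.0
dt = 0.01  # sampling interval, s
theta = [0.3 * math.sin(2 * math.pi * 1.0 * i * dt) for i in range(200)]

def joint_torque(theta, dt, k, c, theta_rest=0.0):
    """Viscoelastic (spring + damper) torque at each interior sample, with
    angular velocity estimated by central finite differences."""
    omega = [(theta[i + 1] - theta[i - 1]) / (2 * dt)
             for i in range(1, len(theta) - 1)]
    return [-k * (th - theta_rest) - c * w
            for th, w in zip(theta[1:-1], omega)]

tau = joint_torque(theta, dt, k, c, theta_rest)
```

Fitting k and c per subject so that the modelled amplitudes and frequencies match the measured motion is the essence of the parameter-fitting step the abstract describes.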

8.
Bioprocess and Biosystems Engineering - Starting from a relatively detailed model of a bioprocess producing fructo-oligosaccharides, a set of experimental data collected in batch and fed-batch...

9.
This work proposes a model of visual bottom-up attention for dynamic scene analysis. Our work adds motion saliency calculations to a neural network model with realistic temporal dynamics (e.g., building motion salience on top of De Brecht and Saiki, Neural Networks 19:1467–1474, 2006). The resulting network elicits strong transient responses to moving objects and reaches stability within a biologically plausible time interval. The responses differ statistically between earlier and later motion neural activity, and between moving and non-moving objects. We demonstrate the network on a number of synthetic and real dynamic movie examples, and show that the model captures the motion saliency asymmetry phenomenon. In addition, the motion salience computation enables sudden-onset moving objects that are less salient in the static scene to rise above others. Finally, we give careful consideration to neural latencies, Lyapunov stability, and the neural properties reproduced by the model.

10.
Personal exposure meters (PEM) are routinely used for the assessment of exposure to radio-frequency electric or magnetic fields. However, their readings are subject to errors associated with perturbations of the fields caused by the presence of the human body. This paper presents a novel analysis method for the characterization of this effect. Using ray-tracing techniques, PEM measurements have been emulated, with and without an approximation of this shadowing effect. In particular, the Global System for Mobile Communication mobile phone frequency band was chosen for its ubiquity and, specifically, we considered the case where the subject is walking outdoors in a relatively open area. These simulations have been contrasted with real PEM measurements in a 35-min walk. Results show good agreement in terms of root mean square error and E-field cumulative distribution function (CDF), with a significant improvement when the shadowing effect is taken into account. In particular, the Kolmogorov–Smirnov (KS) test provides a P-value of 0.05 when considering the shadowing effect, versus a P-value of 10⁻¹⁴ when this effect is ignored. In addition, although the E-field levels in the absence of a human body have been found to follow a Nakagami distribution, a lognormal distribution fits the statistics of the PEM values better than the Nakagami distribution. In conclusion, although the mean could be adjusted by using correction factors, the shadowing effect also causes other changes in the CDF that require particular attention because they might lead to a systematic error. Bioelectromagnetics 32:209–217, 2011. © 2010 Wiley-Liss, Inc.

11.
In earlier papers a qualitative and quantitative model was developed for predicting the number of forest fires occurring per day. This model permits forecasting, at 00.00 hours Coordinated Universal Time (UTC) of any day (d), the number of forest fires per day over a range of several days (d to d+5) for a particular region. Input data are the number of forest fires in the region during the two preceding days (d−2 and d−1) and the type of day (real and evaluated from radiosonde data for d−2, d−1 and d, and predicted from meteorological medium-range forecasts, i.e. those of the European Centre, for d+1, d+2, d+3, d+4 and d+5). As this model requires data obtained by radiosonde, particularly temperatures and geopotentials at 850 and 700 hPa and dew points (or specific humidity) at 850 hPa, this study investigates the spatial validity of the model in relation to the distance from the radiosonde station (RS). The highest-quality forecast is obtained for the region immediately surrounding the RS and diminishes with increasing distance from it, because the data obtained from the RS become unrepresentative of the atmospheric column over more distant regions. Hence a critical distance can be derived for a given quality level; conversely, a fixed quality level implies a specific maximum separation between the RS and the region of prediction, with higher predictive quality implying a shorter distance.

13.
Analysis of variation in pheromone amounts and ratios between individuals is usually performed separately for the amounts and the ratios of the different components, with non-parametric tests regularly applied. This way of analysis is statistically correct, yet limited for several reasons. We propose a parametric linear mixed model approach to analyse both amounts and ratios of the different components at the same time. This method appears to be very flexible and may facilitate the analysis of pheromone data.

14.
Calculations of dietary exposure to acrylamide
In this paper we calculated the usual and acute exposure to acrylamide (AA) in the Dutch population and in young children (1–6 years). For this, AA levels of different food groups were used as collected by the Institute for Reference Materials and Measurements (IRMM) of the European Commission's Directorate General Joint Research Centre (JRC) from April 2003 up to May 2004. This database contained about 3500 AA levels, received mainly from Germany, The Netherlands, Ireland, Greece, Austria, the UK and from the food industry. Food consumption levels were derived from the Dutch National Food Consumption Survey of 1997/1998 (n = 6250, of which 530 were children aged 1–6 years). The exposure was estimated using a probabilistic approach. The results of the exposure calculations are discussed in relation to different methodological aspects of AA exposure calculations and the possible uncertainties related to them. The items discussed include the quality of the AA levels measured in food items, the allocation of AA levels to food categories, the quality of food consumption levels, and the relevant exposure model in relation to the reported toxicity of AA. Furthermore, we demonstrate that scenario studies and probabilistic modelling of exposure are potentially useful tools to evaluate the effect on AA exposure of processing techniques that reduce AA levels in food. The scenarios studied reduced total AA exposure by amounts ranging from <1% up to 17%.
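The probabilistic approach amounts to Monte Carlo sampling of concentration and consumption distributions per food group. The sketch below uses invented food groups and distributions, not the IRMM or survey data:

```python
import random

random.seed(2)

# Invented per-food-group AA concentration (ug/kg food) and daily consumption
# (g/day) distributions, given as (mean, sd); the real calculation draws these
# from the IRMM database and the Dutch National Food Consumption Survey.
foods = {
    "crisps": {"conc": (1000.0, 400.0), "cons": (10.0, 8.0)},
    "bread":  {"conc": (30.0, 15.0),    "cons": (120.0, 40.0)},
    "coffee": {"conc": (7.0, 3.0),      "cons": (400.0, 150.0)},
}
body_weight = 70.0  # kg

def simulate_daily_exposure(n=20000):
    """Monte Carlo exposure: sample a concentration and a consumption per food
    group, sum the intakes and divide by body weight (ug AA per kg bw per day)."""
    out = []
    for _ in range(n):
        intake = 0.0
        for f in foods.values():
            conc = max(0.0, random.gauss(*f["conc"]))  # ug/kg food
            cons = max(0.0, random.gauss(*f["cons"]))  # g food/day
            intake += conc * cons / 1000.0             # ug AA/day
        out.append(intake / body_weight)
    return sorted(out)

exposure = simulate_daily_exposure()
median = exposure[len(exposure) // 2]
p95 = exposure[int(0.95 * len(exposure))]
```

Reduction scenarios of the kind the abstract mentions would simply rescale the concentration distributions of selected food groups and rerun the simulation.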

A dynamic phosphate budget model for a eutrophic lake
The relations between the external nutrient loading of lakes, recycling through sediments and the resulting productivity are complicated by feedback mechanisms, seasonal variations and trends. Simulation is a useful tool, supplementary to experimental research, for identifying controlling factors and assessing the effects of management measures. The variables in our dynamic phosphate budget model include inorganic and organic particulate phosphate and dissolved o-phosphate, in both sediments and overlying water. Sediments may be aerobic or anaerobic, depending on topography, temperature and composition. The major processes described are primary production, mineralisation, sedimentation, adsorption and diffusion. Several model parameters have been estimated directly for Lake Brielle (Netherlands). The sediment dilution rate, the extent of anaerobic conditions and the number and character of adsorption sites are important controlling factors.
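A heavily reduced two-box version of such a phosphate budget (a water-column pool and a sediment pool, with loading, sedimentation, release and flushing) can be integrated with explicit Euler. The parameter values here are illustrative only, not those fitted for Lake Brielle:

```python
# Two-box phosphate budget, explicit Euler integration; illustrative values,
# not the Lake Brielle model.  Pools in mg P per m^3 of lake water, rates per day.
load = 0.5     # external loading
k_sed = 0.05   # sedimentation of water-column P
k_rel = 0.002  # release (diffusion) from sediment P
k_out = 0.01   # hydraulic flushing
dt, t_end = 0.1, 20000.0

p_water, p_sed = 10.0, 500.0  # initial pools
t = 0.0
while t < t_end:
    sed_flux = k_sed * p_water   # water -> sediment
    rel_flux = k_rel * p_sed     # sediment -> water
    p_water += dt * (load - sed_flux + rel_flux - k_out * p_water)
    p_sed += dt * (sed_flux - rel_flux)
    t += dt

# Steady state: p_water -> load / k_out and p_sed -> (k_sed / k_rel) * p_water;
# the slow sediment pool makes the approach take thousands of days, which is
# why lake recovery lags loading reductions.
```

The full model adds the organic/inorganic speciation, aerobic/anaerobic sediment states and seasonal forcing described in the abstract on top of this skeleton.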

19.
A new method to estimate the oxygen transfer coefficient (KLa) from experimental dynamic response data is presented. Employing a linear model which allows for gas-phase, diffusion-film and oxygen-electrode dynamics, the first moment of the response curve is simply related to the sum of the model parameters. Two separate experiments are used to characterize the measurement dynamics and to measure the unknown KLa parameter. The simple calculation procedure involves only measuring the area above the response curves.
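The first-moment idea can be sketched for two first-order lags in series (oxygen transfer plus electrode): the area above the normalized step response equals the sum of the time constants, so subtracting the separately characterized probe constant recovers 1/KLa. The values below are illustrative, not from the paper:

```python
import math

# Synthetic experiment: oxygen transfer (tau1 = 1/KLa) in series with a
# first-order electrode lag (tau2); tau2 would be known from a separate
# electrode-only experiment.  Values are illustrative.
kla_true = 0.05   # s^-1, so tau1 = 20 s
tau_probe = 5.0   # s

def step_response(t, tau1, tau2):
    """Normalized step response of two first-order lags in series."""
    return 1.0 - (tau1 * math.exp(-t / tau1)
                  - tau2 * math.exp(-t / tau2)) / (tau1 - tau2)

# First-moment property: the area above the normalized response curve equals
# the sum of the time constants, tau1 + tau2.
dt, t_end = 0.01, 400.0
area = sum((1.0 - step_response(i * dt, 1.0 / kla_true, tau_probe)) * dt
           for i in range(int(t_end / dt)))

kla_est = 1.0 / (area - tau_probe)  # recovers KLa from the measured area
```

With noisy data the same calculation applies; only the numerical integration of the measured curve replaces the analytic response here.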

20.
Assessing the whole-body absorption in a human in a realistic environment requires a statistical approach covering all possible exposure situations. This article describes the development of a statistical multi-path exposure method for heterogeneous realistic human body models. The method is applied to the 6-year-old Virtual Family boy (VFB) exposed to the GSM downlink at 950 MHz. It is shown that the whole-body SAR does not differ significantly over the different environments at an operating frequency of 950 MHz. Furthermore, the whole-body SAR in the VFB for multi-path exposure exceeds the whole-body SAR for worst-case single-incident plane-wave exposure by 3.6%. Moreover, the ICNIRP reference levels are not conservative with respect to the basic restrictions in 0.3% of the exposure samples for the VFB at the GSM downlink of 950 MHz. The homogeneous spheroid with the dielectric properties of the head suggested by the IEC underestimates the absorption compared to realistic human body models, and the variation in the whole-body SAR is larger for realistic human body models than for homogeneous spheroid models. This is mainly due to the heterogeneity of the tissues and the irregular shape of the realistic human body model compared to homogeneous spheroid human body models. Bioelectromagnetics 34:240–251, 2013. © 2012 Wiley Periodicals, Inc.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号