Similar Documents
20 similar documents found.
1.
The effect of a variable initial value is examined in Stein's stochastic neuronal model with synaptic reversal potentials under the conditions of a constant threshold and a constant input. The moments of the interspike interval distribution are presented as functions of the initial depolarization, which ranges from the inhibitory reversal potential to the threshold potential. Normal, exponential and transformed Gamma distributions are tested for the initial value of depolarization. The coefficient of variation is shown to be greater than one when the initial depolarization is sufficiently above the resting level. An interpretation of this result in terms of spatial facilitation is offered. The effect of a random initial value is found to be most pronounced for neurons depolarized to a near-threshold level.
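A direct Monte Carlo discretization makes the dependence of the interspike-interval (ISI) statistics on the initial depolarization easy to explore. The sketch below is a minimal illustration of a Stein-type model with reversal potentials, not the paper's analysis; all parameter values and the Euler time-stepping are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def isi_stein(v0, v_i=-10.0, v_e=100.0, theta=10.0, tau=5.0,
              rate_e=2.0, rate_i=1.0, a=0.02, b=0.02, dt=0.01, t_max=500.0):
    """One interspike interval of a Stein-type model with reversal potentials.

    Poisson excitatory/inhibitory inputs move V a fixed fraction of the
    distance to the corresponding reversal potential; V decays to rest (0)
    with time constant tau; a spike fires when V crosses theta.
    """
    v, t = v0, 0.0
    while t < t_max:
        v += (-v / tau * dt
              + rng.poisson(rate_e * dt) * a * (v_e - v)
              + rng.poisson(rate_i * dt) * b * (v_i - v))
        t += dt
        if v >= theta:
            return t
    return np.nan  # no spike within t_max

# Coefficient of variation of the ISI vs. (fixed) initial depolarization
for v0 in (0.0, 5.0, 9.0):
    isis = np.array([isi_stein(v0) for _ in range(300)])
    isis = isis[~np.isnan(isis)]
    print(f"v0={v0:4.1f}  mean ISI={isis.mean():6.2f}  CV={isis.std()/isis.mean():.2f}")
```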

2.
Epidemic thresholds in network models of heterogeneous populations characterized by highly right-skewed contact distributions can be very small. When the population is above the threshold, an epidemic is inevitable and conventional control measures to reduce the transmissibility of a pathogen will fail to eradicate it. We consider a two-sex network model for a sexually transmitted disease which assumes random mixing conditional on the degree distribution. We derive expressions for the basic reproductive number (R_0) for one-population and heterogeneous two-population models in terms of characteristics of the degree distributions and transmissibility. We calculate interval estimates for the epidemic thresholds for stochastic process models in three human populations based on representative surveys of sexual behavior (Uganda, Sweden, USA). For Uganda and Sweden, the epidemic threshold is greater than zero with high confidence. For the USA, the interval includes zero. We discuss the implications of these findings along with the limitations of epidemic models which assume random mixing.
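For a one-population configuration-model network the basic reproductive number has the standard form R_0 = T(⟨k²⟩ − ⟨k⟩)/⟨k⟩, where T is the transmissibility; the paper's two-sex expressions differ. The sketch below computes R_0 from a hypothetical right-skewed degree sample and bootstraps an interval estimate, mirroring the paper's idea of interval estimates for the threshold; the sample and T are made up.

```python
import numpy as np

rng = np.random.default_rng(1)

def r0_from_degrees(degrees, T):
    """R_0 = T * (<k^2> - <k>) / <k> for a configuration-model network."""
    k = np.asarray(degrees, dtype=float)
    return T * (np.mean(k**2) - np.mean(k)) / np.mean(k)

# Hypothetical right-skewed "number of partners" survey sample
sample = rng.pareto(2.5, size=1000).astype(int) + 1
T = 0.1  # assumed per-partnership transmissibility

# Percentile bootstrap for an interval estimate of R_0
boot = [r0_from_degrees(rng.choice(sample, sample.size), T)
        for _ in range(2000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"R0 = {r0_from_degrees(sample, T):.2f}  (95% interval {lo:.2f}-{hi:.2f})")
```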

3.
Most vertebrate animals produce optokinetic nystagmus in response to rotation of their visual surround. Nystagmus consists of an alternation of slow-phase eye rotations, which follow the surround, and fast-phase eye rotations, which quickly reset eye position. The time intervals between fast phases vary stochastically, even during optokinetic nystagmus produced by constant velocity rotation of a uniform surround. The inter-fast-phase interval distribution has a long tail, and intervals that are long relative to the mode become even more likely as constant surround velocity is decreased. This paper provides insight into fast-phase timing by showing that the process of fast-phase generation during constant velocity optokinetic nystagmus is analogous to a random walk with drift toward a threshold. Neurophysiologically, the output of vestibular nucleus neurons, which drive the slow phase, would approximate a random walk with drift because they integrate the noisy, constant surround velocity signal they receive from the visual system. Burst neurons, which fire a burst to drive the fast phase and reset the slow phase, are brought to threshold by the vestibular nucleus neurons. Such a nystagmic process produces stochastically varying inter-fast-phase intervals, and long intervals emerge naturally because, as drift rate (related to surround velocity) decreases, it becomes more likely that any random walk can meander for a long time before it crosses the threshold. The theoretical probability density function of the first threshold crossing times of random walks with drift is known to be that of an inverse Gaussian distribution. This probability density function describes well the distributions of the intervals between fast phases that were either determined experimentally, or simulated using a neurophysiologically plausible neural network model of fast-phase generation, during constant velocity optokinetic nystagmus. Received: 1 June 1995 / Accepted in revised form: 15 February 1996
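The analogy can be checked numerically: simulate a drifting random walk to a threshold and fit an inverse Gaussian to the crossing times. The sketch below is illustrative only; the drift, noise level and threshold are assumptions, and the paper's neural network model is not reproduced.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

def first_passage_times(drift, sigma=1.0, threshold=10.0, dt=0.01, n=1000):
    """First times an Euler-discretized random walk with drift crosses a
    threshold -- the paper's model of the inter-fast-phase interval."""
    times = np.empty(n)
    for i in range(n):
        x, t = 0.0, 0.0
        while x < threshold:
            x += drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()
            t += dt
        times[i] = t
    return times

t = first_passage_times(drift=1.0)
mu, loc, scale = stats.invgauss.fit(t, floc=0)   # inverse Gaussian (Wald) fit
print(f"empirical mean {t.mean():.2f} vs. IG-fit mean {mu * scale:.2f}")
# Exact result for Brownian motion: IG with mean threshold/drift = 10
```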

4.
We consider in this paper the statistical distribution of hydrophobic residues along the length of protein chains. For this purpose we used a binary hydrophobicity scale which assigns hydrophobic residues a value of one and non-hydrophobes a value of zero. The resulting binary sequences are tested for randomness using the standard run test. For the majority of the 5,247 proteins examined, the distribution of hydrophobic residues along a sequence cannot be distinguished from that expected for a random distribution. This suggests that (a) functional proteins may have originated from random sequences, (b) the folding of proteins into compact structures may be much more permissive with less sequence specificity than previously thought, and (c) the clusters of hydrophobic residues along chains which are revealed by hydrophobicity plots are a natural consequence of a random distribution and can be conveniently described by binomial statistics.
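The run test used here is the classical Wald-Wolfowitz runs test, which compares the observed number of runs in a binary sequence against its expectation under randomness. A minimal sketch, applied to a hypothetical 0/1 hydrophobicity string:

```python
import numpy as np
from scipy import stats

def runs_test(bits):
    """Wald-Wolfowitz runs test for randomness of a binary sequence
    (1 = hydrophobic residue, 0 = non-hydrophobe)."""
    bits = np.asarray(bits)
    n1, n0 = bits.sum(), (1 - bits).sum()
    n = n1 + n0
    runs = 1 + np.count_nonzero(bits[1:] != bits[:-1])
    mean = 1 + 2 * n1 * n0 / n
    var = 2 * n1 * n0 * (2 * n1 * n0 - n) / (n**2 * (n - 1))
    z = (runs - mean) / np.sqrt(var)
    return z, 2 * stats.norm.sf(abs(z))   # two-sided p-value

rng = np.random.default_rng(3)
z, p = runs_test(rng.integers(0, 2, size=100))  # hypothetical 100-residue chain
print(f"z = {z:.2f}, p = {p:.3f}")   # large p: indistinguishable from random
```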

5.
We use distribution theory and ordering of non-negative random variables to study the Susceptible-Exposed-Infectious-Removed (SEIR) model with two control measures, quarantine and isolation, to reduce the spread of an infectious disease. We identify that the probability distributions of the latent period and the infectious period are primary features of the SEIR model to formulate the epidemic threshold and to evaluate the effectiveness of the intervention measures. If these primary features are changed, the conclusions can be altered in important ways. For the latent and infectious periods with known mean values, it is the dilation, a generalization of variance, of their distributions that ranks the effectiveness of these control measures. We further propose ways to set quarantine and isolation targets to reduce the controlled reproduction number below the threshold using the observed initial growth rate from outbreak data. If both quarantine and isolation are 100% effective, one can directly use the observed growth rate for setting control targets. If they are not 100% effective, some further knowledge of the distributions is required.
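As a concrete special case (not the paper's general distribution-theoretic result): when the latent and infectious periods are exponentially distributed with means 1/sigma and 1/gamma, the observed initial growth rate r maps to a reproduction number via R = (1 + r/sigma)(1 + r/gamma), which is one way to turn outbreak growth data into control targets. A small sketch with hypothetical numbers:

```python
def growth_rate_to_R(r, sigma, gamma):
    """R for an SEIR model with exponentially distributed latent (mean
    1/sigma) and infectious (mean 1/gamma) periods, from growth rate r."""
    return (1 + r / sigma) * (1 + r / gamma)

r = 0.1                      # observed initial growth rate per day (hypothetical)
sigma, gamma = 1 / 5, 1 / 7  # 5-day latent and 7-day infectious periods
R = growth_rate_to_R(r, sigma, gamma)
print(f"R = {R:.2f}; control must cut transmission by {1 - 1/R:.0%} to reach threshold")
```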

6.
To explain the processes of community assembly and the formation of species abundance patterns in the temperate forests of Changbai Mountain, this study used monitoring-plot data from mixed conifer-broadleaf forests at different successional stages and fitted the species abundance distributions of the forest communities with a neutral theory model, a biostatistical model (the lognormal distribution model) and niche models (the Zipf model, the broken-stick model and the niche preemption model), selecting the best-fitting model by the χ² test, the Kolmogorov-Smirnov (K-S) test and the Akaike information criterion (AIC). The results show that the neutral model predicts the species abundance distributions of plant communities at different successional stages of the Changbai Mountain temperate forests well. At the 10 m × 10 m scale, all five models are accepted by the χ² and K-S tests, but the neutral model fits less well than the lognormal, Zipf, broken-stick and niche preemption models, indicating that at small scales both neutral and niche processes can explain species abundance distributions, with niche processes having relatively greater explanatory power. At medium to large scales (30 m × 30 m, 60 m × 60 m and 90 m × 90 m), the neutral model is the best-fitting model, and as the study scale increases the niche and biostatistical models are progressively rejected by the χ² test, indicating that the role of neutral processes in shaping species abundance distribution patterns in the Changbai Mountain mixed conifer-broadleaf forests grows with spatial scale. This study confirms that neutral processes play an important role in structuring these temperate forest communities, without denying the contribution of niche mechanisms to community assembly; in temperate forest community assembly, neutral theory and niche theory are therefore not contradictory but complementary. Studies of species abundance distributions in forest communities should consider the effects of sampling scale and successional stage, and should fit multiple models.
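A miniature version of the model-selection step: fit two candidate species-abundance models by maximum likelihood and compare AIC. The sketch below uses a made-up abundance vector, a continuous lognormal, and a discrete power law as a rough stand-in for the rank-based Zipf model; it illustrates the AIC comparison only, not the paper's fitting procedure.

```python
import numpy as np
from scipy import stats, optimize

# Hypothetical species-abundance vector for one plot (individuals per species)
abund = np.array([120, 65, 40, 28, 19, 12, 9, 7, 5, 4, 3, 2, 2, 1, 1, 1])

# Lognormal (the "biostatistical" model), continuous approximation
shape, loc, scale = stats.lognorm.fit(abund, floc=0)
aic_logn = 2 * 2 - 2 * stats.lognorm.logpdf(abund, shape, loc, scale).sum()

# Discrete power law as a stand-in for the Zipf model
nll = lambda a: -stats.zipf.logpmf(abund, a).sum()
res = optimize.minimize_scalar(nll, bounds=(1.01, 10.0), method="bounded")
aic_zipf = 2 * 1 + 2 * res.fun

print(f"AIC: lognormal {aic_logn:.1f}, Zipf-like {aic_zipf:.1f} (lower is better)")
```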

7.
In this paper, we study the SIS (susceptible–infected–susceptible) and SIR (susceptible–infected–removed) epidemic models on undirected, weighted networks by deriving pairwise-type approximate models coupled with individual-based network simulation. Two different types of theoretical/synthetic weighted network models are considered. Both start from non-weighted networks with fixed topology followed by the allocation of link weights in either (i) random or (ii) fixed/deterministic way. The pairwise models are formulated for a general discrete distribution of weights, and these models are then used in conjunction with stochastic network simulations to evaluate the impact of different weight distributions on epidemic thresholds and dynamics in general. For the SIR model, the basic reproductive ratio R_0 is computed, and we show that (i) for both network models R_0 is maximised if all weights are equal, and (ii) when the two models are ‘equally-matched’, the networks with a random weight distribution give rise to a higher R_0 value. The models with different weight distributions are also used to explore the agreement between the pairwise and simulation models for different parameter combinations.
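Finding (i) can be illustrated with a small Jensen's-inequality computation, assuming the standard Markovian SIR per-link transmissibility T(w) = βw/(βw + γ), which is concave in the weight w; the form of T and all parameter values below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

beta, gamma = 0.3, 1.0
T = lambda w: beta * w / (beta * w + gamma)   # per-link transmissibility
excess_degree = 5.0                           # <k^2>/<k> - 1, held fixed

w_mean = 1.0
w_random = rng.exponential(w_mean, size=100_000)  # random weights, same mean

print(f"R0, equal weights : {excess_degree * T(w_mean):.3f}")
print(f"R0, random weights: {excess_degree * T(w_random).mean():.3f}")
# The random-weight value is smaller because T is concave in w (Jensen)
```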

8.
Alternative molecular mechanisms can be envisaged for the cellular repair of UV-damaged DNA. In the "random collision" model, DNA damage distributed throughout the genome is recognised and repaired by a process of random collision between DNA damage and repair enzymes. The other model assumes a "processive" mechanism, whereby DNA is scanned for damage by a repair complex moving steadily along its length. These two models give different predictions concerning the time course of repair. Random collision should result in a declining rate of repair with time as the concentration of lesions in the DNA falls; but the processive model predicts a constant rate of repair until scanning is complete. We have examined the time course of DNA repair in human fibroblasts given low (generally sublethal) doses of UV light. Using 3 distinct assays, we find no sign of a constant repair rate after 4 J/m2 or less, even when the first few hours after irradiation are examined. Thus DNA repair is likely to depend on random collision. The implications of this finding for the structural organisation of repair are discussed.
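The distinguishing prediction is simple kinetics: random collision gives first-order (exponential) decay of lesions, whereas processive scanning gives a constant repair rate until scanning completes. A toy numeric contrast, with illustrative rate constants:

```python
import numpy as np

# random collision -> first-order decay; processive -> constant rate
L0, k, scan_rate = 1000.0, 0.5, 120.0   # illustrative: lesions, /h, lesions/h
for t in np.arange(0.0, 9.0, 2.0):
    collision = L0 * np.exp(-k * t)
    processive = max(L0 - scan_rate * t, 0.0)
    print(f"t = {t:3.0f} h   collision model: {collision:6.1f}   "
          f"processive model: {processive:6.1f} lesions left")
```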

9.
We use bootstrap simulation to characterize uncertainty in parametric distributions, including Normal, Lognormal, Gamma, Weibull, and Beta, commonly used to represent variability in probabilistic assessments. Bootstrap simulation enables one to estimate sampling distributions for sample statistics, such as distribution parameters, even when analytical solutions are not available. Using a two-dimensional framework for both uncertainty and variability, uncertainties in cumulative distribution functions were simulated. The mathematical properties of uncertain frequency distributions were evaluated in a series of case studies during which the parameters of each type of distribution were varied for sample sizes of 5, 10, and 20. For positively skewed distributions such as Lognormal, Weibull, and Gamma, the range of uncertainty is widest at the upper tail of the distribution. For symmetric unbounded distributions, such as Normal, the uncertainties are widest at both tails of the distribution. For bounded distributions, such as Beta, the uncertainties are typically widest in the central portions of the distribution. Bootstrap simulation enables complex dependencies between sampling distributions to be captured. The effects of uncertainty, variability, and parameter dependencies were studied for several generic functional forms of models, including models in which two-dimensional random variables are added, multiplied, and divided, to show the sensitivity of model results to different assumptions regarding model input distributions, ranges of variability, and ranges of uncertainty and to show the types of errors that may be obtained from mis-specification of parameter dependence. A total of 1,098 case studies were simulated. In some cases, counter-intuitive results were obtained. For example, the point value of the 95th percentile of uncertainty for the 95th percentile of variability of the product of four Gamma or Weibull distributions decreases as the coefficient of variation of each model input increases and, therefore, may not provide a conservative estimate. Failure to properly characterize parameter uncertainties and their dependencies can lead to orders-of-magnitude mis-estimates of both variability and uncertainty. In many cases, the numerical stability of two-dimensional simulation results was found to decrease as the coefficient of variation of the inputs increases. We discuss the strengths and limitations of bootstrap simulation as a method for quantifying uncertainty due to random sampling error.
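A minimal sketch of the bootstrap idea for one case, a Lognormal fitted to a sample of size 10: resample, refit, and read off the uncertainty in the 95th percentile of variability. The sample and the resampling scheme are assumptions, not the paper's 1,098 case studies.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
data = rng.lognormal(mean=1.0, sigma=0.8, size=10)   # small assumed sample

s0, _, scale0 = stats.lognorm.fit(data, floc=0)
point = stats.lognorm.ppf(0.95, s0, loc=0, scale=scale0)

p95 = []
for _ in range(2000):                 # bootstrap: resample, refit, re-evaluate
    s, _, sc = stats.lognorm.fit(rng.choice(data, data.size), floc=0)
    p95.append(stats.lognorm.ppf(0.95, s, loc=0, scale=sc))
lo, hi = np.percentile(p95, [2.5, 97.5])
print(f"95th percentile of variability: {point:.1f} "
      f"(95% uncertainty interval {lo:.1f}-{hi:.1f})")
```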

10.
The spatial distribution of propagules in soil is an important factor in determining the ability of mycoparasites to control soilborne plant pathogens. The assumptions of uniform, random and aggregated propagule distribution were used to evaluate the importance of spatial distribution patterns of propagules of a mycoparasite. For the random and uniform cases explicit expressions were obtained for the average distance between propagules. Average distances among propagules are 40-50% smaller for the random compared to the uniform distribution. For the aggregated case no explicit expression is possible and numerical simulations were used to generate spatial distributions. The consequences for host inactivation by the mycoparasite were evaluated using a simple model of omnidirectional and constant growth of the mycoparasite. A random distribution of propagules gave a considerably slower rate of inactivation than the uniform distribution. Numerical simulations were made to generate comparable patterns of host inactivation for aggregated distributions in which propagule clusters were located at random in three-dimensional space and the distances of propagules from their cluster centres followed a normal distribution. The number of propagule centres and propagules per centre varied for a given inoculum density. Parameters were estimated from published data for inactivation of sclerotia of Sclerotium minor at different densities of macroconidia of Sporidesmium sclerotivorum. Differences in host inactivation between the uniform and random distributions were small but both gave poor predictions of the field data at low and high densities. The aggregated distribution gave an improved fit for the higher propagule densities but no improvement at the lower. In studying the dynamics of mycoparasites it may be more significant epidemiologically to design treatments based on differences in mean distances between propagules rather than population densities. Density-dependent effects on growth rate need more attention in models and studies on mycoparasite ecology.
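The random-versus-uniform contrast in average inter-propagule distance is easy to reproduce by Monte Carlo, here using mean nearest-neighbour distance in a unit cube as the distance measure (an assumption about which distance the paper's expressions refer to):

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(6)

n = 4**3                                   # 64 propagules in a unit cube
g = (np.arange(4) + 0.5) / 4               # uniform: regular 4 x 4 x 4 grid
uniform = np.array(np.meshgrid(g, g, g)).reshape(3, -1).T
random_pts = rng.random((n, 3))            # random: Poisson at same density

def mean_nn_distance(pts):
    d, _ = cKDTree(pts).query(pts, k=2)    # column 1 = nearest other point
    return d[:, 1].mean()

print(f"uniform: {mean_nn_distance(uniform):.3f}")
print(f"random : {mean_nn_distance(random_pts):.3f}")   # ~40-50% smaller
```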

11.
Using the complete genome of Thermoplasma volcanium as an example, we have examined the distribution functions for the amount of C or G in consecutive, non-overlapping blocks of m bases in this system. We find that these distributions are very much broader (by many factors) than those expected for a random distribution of bases. If we plot the widths of the C-G distributions relative to the widths expected for random distributions, as a function of the block size used, we obtain a power law with a characteristic exponent. The broadening of the C-G distributions follows from the empirical finding that blocks containing a given C-G content tend to be followed by blocks of similar C-G content, thus indicating a statistical persistence of composition. The exponent associated with the power law thus measures the strength of persistence in a given DNA. This behavior can be understood using Mandelbrot's model of a fractional Brownian walk. In this model there is a hierarchy of persistence (correlation between blocks) between all parts of the system. The model gives us a way to scale the C-G distributions such that all these functions are collapsed onto a master curve. For a fractional Brownian walk, the fractal dimension of the C-G distribution is simply related to the persistence exponent for the power law. The persistence exponent for T. volcanium is found to be gamma = 0.29 while for a 10 million base segment of the human genome we obtain gamma = 0.39, similar to but not identical with the value found for the microbe.
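The core measurement is the width of the per-block C-G count distribution relative to the binomial width expected for independent bases. A sketch of that computation on a synthetic persistent sequence (a real analysis would load the T. volcanium genome instead; the persistence mechanism below is a toy assumption):

```python
import numpy as np

rng = np.random.default_rng(7)

def width_ratio(gc_bits, m):
    """Std of per-block G+C counts over the binomial (random-base) width
    sqrt(m p (1-p)), using non-overlapping blocks of m bases."""
    n_blocks = gc_bits.size // m
    counts = gc_bits[: n_blocks * m].reshape(n_blocks, m).sum(axis=1)
    p = gc_bits.mean()
    return counts.std() / np.sqrt(m * p * (1 - p))

# Synthetic persistent sequence: the local G+C probability drifts slowly
p_gc = np.clip(0.4 + np.cumsum(rng.normal(0, 0.002, 200_000)), 0.2, 0.8)
gc_bits = (rng.random(200_000) < p_gc).astype(float)

for m in (16, 64, 256, 1024):
    print(f"m = {m:5d}   width ratio = {width_ratio(gc_bits, m):.2f}")
# Ratios >> 1 that grow as a power of m indicate persistence, as in the paper
```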

12.
A mathematical model is formulated for the development of a population of cells in which the individual members may grow and divide or die. A given cell is characterized by its age and volume, and these parameters are assumed to determine the rate of volume growth and the probability per unit time of division or death. The initial value problem is formulated, and it is shown that if cell growth rate is proportional to cell volume, then the volume distribution will not converge to a time-invariant shape without an added dispersive mechanism. Mathematical simplifications which are possible for the special case of populations in the exponential phase or in the steady state are considered in some detail. Experimental volume distributions of mammalian cells in exponentially growing suspension cultures are analyzed, and growth rates and division probabilities are deduced. It is concluded that the cell volume growth rate is approximately proportional to cell volume and that the division probability increases with volume above a critical threshold. The effects on volume distribution of division into daughter cells of unequal volumes are examined in computer models.
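A stochastic toy version of the paper's conclusions, volume growth proportional to volume and a division probability that rises above a critical volume, can be simulated directly; all parameters below are illustrative, and the subsampling step is only there to keep the exponentially growing population tractable.

```python
import numpy as np

rng = np.random.default_rng(8)

r, v_crit, c, dt = 0.5, 2.0, 0.5, 0.1    # all values illustrative
cells = list(rng.uniform(1.0, 2.0, size=200))

for _ in range(300):
    new = []
    for v in cells:
        v *= 1 + r * dt                             # dv/dt = r * v
        if rng.random() < c * max(v - v_crit, 0.0) * dt:
            f = rng.normal(0.5, 0.05)               # unequal daughters
            new += [f * v, (1 - f) * v]
        else:
            new.append(v)
    # subsample to keep the exponentially growing population tractable
    cells = new if len(new) <= 2000 else list(rng.choice(new, 1000))

v = np.array(cells)
print(f"volume distribution: mean = {v.mean():.2f}, CV = {v.std()/v.mean():.2f}")
```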

13.
An effective degree approach to modeling the spread of infectious diseases on a network is introduced and applied to a disease that confers no immunity (a Susceptible-Infectious-Susceptible model, abbreviated as SIS) and to a disease that confers permanent immunity (a Susceptible-Infectious-Recovered model, abbreviated as SIR). Each model is formulated as a large system of ordinary differential equations that keeps track of the number of susceptible and infectious neighbors of an individual. From numerical simulations, these effective degree models are found to be in excellent agreement with the corresponding stochastic processes of the network on a random graph, in that they capture the initial exponential growth rates, the endemic equilibrium of an invading disease for the SIS model, and the epidemic peak for the SIR model. For each of these effective degree models, a formula for the disease threshold condition is derived. The threshold parameter for the SIS model is shown to be larger than that derived from percolation theory for a model with the same disease and network parameters, and consequently a disease may be able to invade with lower transmission than predicted by percolation theory. For the SIR model, the threshold condition is equal to that predicted by percolation theory. Thus unlike the classical homogeneous mixing disease models, the SIS and SIR effective degree models have different disease threshold conditions.

14.
Biophysicists use single particle tracking (SPT) methods to probe the dynamic behavior of individual proteins and lipids in cell membranes. The mean squared displacement (MSD) has proven to be a powerful tool for analyzing the data and drawing conclusions about membrane organization, including features like lipid rafts, protein islands, and confinement zones defined by cytoskeletal barriers. Here, we implement time series analysis as a new analytic tool to analyze further the motion of membrane proteins. The experimental data track the motion of 40 nm gold particles bound to Class I major histocompatibility complex (MHCI) molecules on the membranes of mouse hepatoma cells. Our first novel result is that the tracks are significantly autocorrelated. Because of this, we developed linear autoregressive models to elucidate the autocorrelations. Estimates of the signal to noise ratio for the models show that the autocorrelated part of the motion is significant. Next, we fit the probability distributions of jump sizes with four different models. The first model is a general Weibull distribution that shows that the motion is characterized by an excess of short jumps as compared to a normal random walk. We also fit the data with a chi distribution which provides a natural estimate of the dimension d of the space in which a random walk is occurring. For the biological data, the estimates satisfy 1 < d < 2, implying that particle motion is not confined to a line, but also does not occur freely in the plane. The dimension gives a quantitative estimate of the amount of nanometer scale obstruction met by a diffusing molecule. We introduce a new distribution and use the generalized extreme value distribution to show that the biological data also have an excess of long jumps as compared to normal diffusion. These fits provide novel estimates of the microscopic diffusion constant.
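Two of the analyses are easy to sketch: the lag-1 autocorrelation of jump sizes, and a chi-distribution fit whose degrees-of-freedom parameter estimates the effective dimension d. The synthetic track below uses free 2-D Gaussian steps, so the fitted d should come out near 2 rather than the 1 < d < 2 the paper reports for obstructed membrane diffusion.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)

# Hypothetical SPT track: 2-D Gaussian steps -> jump sizes are chi with d = 2
steps = rng.normal(0, 10.0, size=(1000, 2))        # nm per frame
jumps = np.linalg.norm(steps, axis=1)

# Fit a chi distribution; the fitted df estimates the effective dimension d
df, loc, scale = stats.chi.fit(jumps, floc=0)
print(f"estimated dimension d = {df:.2f}")          # ~2 for free 2-D diffusion

# Lag-1 autocorrelation of jump sizes, the paper's first observation
r1 = np.corrcoef(jumps[:-1], jumps[1:])[0, 1]
print(f"lag-1 autocorrelation of jump sizes: {r1:+.2f}")
```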

15.
A learning theory based on the lowering of thresholds of neurons under certain conditions is applied to two “random net” models. The first, a so-called “ganglion-brain”, is characterized by completely random connections of all afferent tracts except certain ones which form the pathways for unconditioned responses. Certain expressions are derived which measure the learning potentiality of the ganglion, in particular with respect to the number of responses which can be learned (conditioning potential) and the amount of interference between the learned responses (redundance potential). The second model concerns the progressive refinement of a response. The efficiency of learning in this case is reflected in the eventual specificity of the response which, in turn, depends on the modification of the distribution of thresholds associated with the neurons governing the responses. Expressions are derived relating the initial distribution of thresholds, the relative effectiveness of the various responses, and certain other parameters to the final distribution of thresholds. For a particular choice of the effectiveness distribution of responses the progressive sharpening of the threshold curve (i.e., progressive specificity of response) is demonstrated. Some implications of the model with respect to the evolution of nervous systems are discussed.

16.
Although species distributions can change in an unexpectedly short period of time, most species distribution models (SDMs) use only long-term averaged environmental conditions to explain species distributions. We aimed to demonstrate the importance of incorporating antecedent environmental conditions into SDMs in comparison to long-term averaged environmental conditions. We modeled the presence/absence of 18 fish species captured across 108 sampling events along a 50-km length of the Sagami River in Japan throughout the 1990s (one to four times per site at 45 sites). We constructed and compared the two types of SDMs: 1) a conventional model that uses only long-term averaged (10-yr) environmental conditions; and 2) a proposed model that incorporates environmental conditions 2 yr prior to a sampling event (antecedent conditions) together with long-term averages linked to life-history stages. These models both included geomorphological, hydrological, and sampling conditions as predictors. A random forest algorithm was applied for modeling and quantifying the relative importance of the predictors. For seven species, antecedent hydrological conditions were more important than the long-term averaged hydrological conditions. Furthermore, the distributions of two species with low prevalence could not be predicted using long-term averaged hydrological conditions but only using antecedent hydrological conditions. In conclusion, incorporating antecedent environmental factors linked with life-history stages at appropriate time scales can better explain changes in species distribution through time.
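A minimal sketch of the modelling idea, with synthetic data: give a random forest both a long-term averaged predictor and an antecedent (two-years-prior) predictor and compare their importances. Variable names and effect sizes are made up; only the workflow mirrors the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(10)

n = 400
flow_longterm = rng.normal(0, 1, n)     # 10-yr averaged condition
flow_antecedent = rng.normal(0, 1, n)   # condition 2 yr before sampling
logit = 0.3 * flow_longterm + 1.2 * flow_antecedent   # antecedent dominates
presence = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([flow_longterm, flow_antecedent])
rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, presence)
for name, imp in zip(["long-term average", "antecedent (t - 2 yr)"],
                     rf.feature_importances_):
    print(f"{name:22s} importance = {imp:.2f}")
```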

17.
Previous stochastic compartmental models have introduced the primary source of stochasticity through either a probabilistic transfer mechanism or a random rate coefficient. This paper combines these primary sources into a unified stochastic compartmental model. Twelve different stochastic models are produced by combining various sources of stochasticity, and the mean value and the covariance for each of the twelve models are derived. The covariance of each model has a different form whereby the individual sources of stochasticity are identifiable from data. The various stochastic models are illustrated for certain specified distributions of the rate coefficient and of the initial count. Several properties of the models are derived and discussed. Among these is the fact that the expected count of a model with a random rate coefficient will always exceed the expected count of a model with a fixed coefficient evaluated at the mean rate. A general modeling strategy for the one-compartment, time-invariant hazard rate is also proposed.
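The stated inequality is Jensen's inequality applied to the convex function exp(-kt): E[N0·exp(-Kt)] ≥ N0·exp(-E[K]·t) for any random rate K. A quick numeric check with a Gamma-distributed rate (an assumed example distribution):

```python
import numpy as np

rng = np.random.default_rng(11)

N0, t = 1000, 2.0
K = rng.gamma(shape=4.0, scale=0.25, size=1_000_000)   # random rate, E[K] = 1
print(f"random rate : E[N0 exp(-K t)] = {N0 * np.exp(-K * t).mean():6.1f}")
print(f"fixed rate  : N0 exp(-E[K] t) = {N0 * np.exp(-K.mean() * t):6.1f}")
```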

18.
Species distribution models combining environmental and spatial components are increasingly used to understand and forecast species invasions. However, modelling distributions of invasive species inhabiting stream networks requires due consideration of their dendritic spatial structure, which may strongly constrain dispersal and colonization pathways. Here we evaluate the application of novel geostatistical tools to species distribution modelling in dendritic networks, using as a case study two invasive crayfish (Procambarus clarkii and Pacifastacus leniusculus) in a Mediterranean watershed. Specifically, we used logistic mixed models to relate the probability of occurrence of each crayfish to environmental variables, while specifying three spatial autocorrelation components in random errors. These components described spatial dependencies between sites as a function of (1) straight-line distances (Euclidean model) between sites, (2) hydrologic (along the waterlines) distances between flow-connected sites (tail-up model), and (3) hydrologic distances irrespective of flow connection (tail-down model). We found a positive effect of stream order on P. clarkii, indicating an association with the lower and mid reaches of larger streams, while P. leniusculus was affected by an interaction between stream order and elevation, indicating an association with larger streams at higher altitude. For both species, models including environmental and spatial components far outperformed the pure environmental models, with the tail-up and the Euclidean components being the most important for P. clarkii and P. leniusculus, respectively. Overall, our study highlighted the value of geostatistical tools to model the distribution of riverine and aquatic invasive species, and stressed the need to specify spatial dependencies representing the dendritic network structure of stream ecosystems.
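A heavily simplified sketch of the three autocorrelation components as covariance matrices built from Euclidean and hydrologic distance matrices. Real tail-up and tail-down models (e.g., Ver Hoef and Peterson's moving-average construction) also require confluence spatial weights and different handling of flow-unconnected pairs; everything below, including the distances, is illustrative.

```python
import numpy as np

sigma2, alpha = 1.0, 5.0                       # partial sill and range (km)
D_euc = np.array([[0, 2, 6], [2, 0, 5], [6, 5, 0]], float)   # straight-line km
D_hyd = np.array([[0, 3, 9], [3, 0, 8], [9, 8, 0]], float)   # along waterlines
connected = np.array([[1, 1, 0], [1, 1, 0], [0, 0, 1]], bool)  # flow-connected

C_euclidean = sigma2 * np.exp(-D_euc / alpha)      # Euclidean component
C_taildown = sigma2 * np.exp(-D_hyd / alpha)       # hydrologic, any pair
C_tailup = np.where(connected, C_taildown, 0.0)    # hydrologic, connected only
print(C_tailup)
```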

19.
The iterated birth and death process is defined as an n-fold iteration of a stochastic process consisting of the combination of instantaneous random killing of individuals in a certain population with a given survival probability s with a Markov birth and death process describing subsequent population dynamics. A long-standing problem of computing the distribution of the number of clonogenic tumor cells surviving a fractionated radiation schedule consisting of n equal doses separated by equal time intervals tau is solved within the framework of iterated birth and death processes. For any initial tumor size i, an explicit formula for the distribution of the number M of surviving clonogens at time tau after the end of treatment is found. It is shown that if i → ∞ and s → 0 so that i·s^n tends to a finite positive limit, the distribution of the random variable M converges to a probability distribution, and a formula for the latter is obtained. This result generalizes the classical theorem about the Poisson limit of a sequence of binomial distributions. The exact and limiting distributions are also found for the number of surviving clonogens immediately after the nth exposure. In this case, the limiting distribution turns out to be a Poisson distribution.
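The classical special case being generalized is easy to verify numerically: with independent survival, the number of survivors is Binomial(i, s^n), which approaches Poisson(λ) as i → ∞ and s → 0 with i·s^n → λ. A sketch with hypothetical treatment numbers:

```python
import numpy as np
from scipy import stats

i, s, n = 10**7, 0.21, 10          # i * s**n ~ 1.67 surviving clonogens
lam = i * s**n
k = np.arange(6)
binom_pmf = stats.binom.pmf(k, i, s**n)
pois_pmf = stats.poisson.pmf(k, lam)
for kk, b, p in zip(k, binom_pmf, pois_pmf):
    print(f"P(M={kk}): binomial {b:.4f}  Poisson {p:.4f}")
```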

20.
According to the LATER model (linear approach to thresholds with ergodic rate), the latency of a single saccade in response to target appearance can be understood as a decision process, which is subject to (i) variations in the rate of (visual) information processing; and (ii) the threshold for the decision. We tested whether the LATER model can also be applied to the sequences of saccades in a multiple fixation search, during which latencies of second and subsequent saccades are typically shorter than that of the initial saccade. We found that the distributions of the reciprocal latencies for later saccades, unlike those of the first saccade, are highly asymmetrical, much like a gamma distribution. This suggests that the normal distribution of the rate r, which the LATER model assumes, is not appropriate to describe the rate distributions of subsequent saccades in a scanning sequence. By contrast, the gamma distribution is also appropriate to describe the distribution of reciprocal latencies for the first saccade. The change of the gamma distribution parameters as a function of the ordinal number of the saccade suggests a lowering of the threshold for second and later saccades, as well as a reduction in the number of target elements analysed.
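The distributional comparison can be sketched as follows: generate hypothetical rates (reciprocal latencies) that are normal for first saccades, as LATER assumes, or gamma for later saccades, then fit a gamma distribution and compare skewness. All numbers are made up.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(12)

# Hypothetical rates r (reciprocal latencies, 1/s): normal for the first
# saccade (as LATER assumes), gamma-like for later saccades in a scan
rate_first = np.clip(rng.normal(5.0, 1.0, 2000), 0.1, None)
rate_later = rng.gamma(3.0, 2.0, 2000)

for name, r in [("first", rate_first), ("later", rate_later)]:
    a, loc, scale = stats.gamma.fit(r, floc=0)
    print(f"{name}: gamma shape = {a:4.1f}, skewness = {stats.skew(r):+.2f}")
# Near-symmetric (large shape) for the first saccade; strongly skewed later
```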

