相似文献 (Similar literature)
 20 matching records found
1.
A family of trivariate binomial mixtures with respect to their exponent parameter is introduced and its structure is studied by the use of probability generating functions. Expressions for probabilities, factorial moments and factorial cumulants are given. Conditional distributions are also examined. Illustrative examples include the trivariate Poisson, binomial, negative binomial and modified logarithmic series distributions. In addition, properties of the compounded trivariate Poisson distribution are discussed. Finally, biological, medical and ecological applications are indicated.
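The machinery named in this abstract is the probability generating function, from which factorial moments follow by differentiation at unity. As a hedged illustration only, the sketch below uses the common-shock trivariate Poisson (one of the examples the abstract lists); the parameterisation and the code are mine, not the paper's.

```python
import sympy as sp

t1, t2, t3 = sp.symbols('t1 t2 t3')
l0, l1, l2, l3 = sp.symbols('lambda0 lambda1 lambda2 lambda3', positive=True)

# Common-shock trivariate Poisson: X_i = Y_i + Y_0 with independent Poisson Y's.
G = sp.exp(l1*(t1 - 1) + l2*(t2 - 1) + l3*(t3 - 1) + l0*(t1*t2*t3 - 1))

def factorial_moment(r, s, u):
    """mu_[r,s,u]: differentiate G r times in t1, s times in t2, u times in t3, set t = 1."""
    d = G
    for sym, order in ((t1, r), (t2, s), (t3, u)):
        if order:
            d = sp.diff(d, sym, order)
    return sp.expand(d.subs({t1: 1, t2: 1, t3: 1}))

print(factorial_moment(1, 0, 0))   # E[X1] = lambda0 + lambda1
print(factorial_moment(1, 1, 0))   # E[X1*X2] = (lambda0+lambda1)(lambda0+lambda2) + lambda0
```

The (1,1,0)-order factorial moment returned here is E[X1 X2], from which Cov(X1, X2) = lambda0 follows for this particular construction.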

2.
We prove that the generalized Poisson distribution GP(θ, η) (η ≥ 0) is a mixture of Poisson distributions; this is a new property for a distribution which is the topic of the book by Consul (1989). Because we find that the fits to count data of the generalized Poisson and negative binomial distributions are often similar, to understand their differences, we compare the probability mass functions and skewnesses of the generalized Poisson and negative binomial distributions with the first two moments fixed. They have slight differences in many situations, but their zero-inflated distributions, with masses at zero, means and variances fixed, can differ more. These probabilistic comparisons are helpful in selecting a better fitting distribution for modelling count data with long right tails. Through a real example of count data with a large zero fraction, we illustrate how the generalized Poisson and negative binomial distributions, as well as their zero-inflated distributions, can be discriminated.
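The moment-matched comparison described here is easy to reproduce numerically. The sketch below is mine, not the paper's: it matches the first two moments using Consul's generalized Poisson pmf P(X=k) = θ(θ + kη)^(k-1) e^(-θ-kη)/k! and scipy's negative binomial, with arbitrary target moments.

```python
import numpy as np
from scipy.stats import nbinom
from scipy.special import gammaln

def gp_pmf(k, theta, eta):
    """Consul's generalized Poisson pmf: theta*(theta+k*eta)**(k-1)*exp(-theta-k*eta)/k!."""
    k = np.asarray(k, dtype=float)
    logp = (np.log(theta) + (k - 1) * np.log(theta + k * eta)
            - (theta + k * eta) - gammaln(k + 1))
    return np.exp(logp)

mu, var = 4.0, 10.0                      # illustrative target moments (var > mu)
eta = 1 - np.sqrt(mu / var)              # GP: mean = theta/(1-eta), var = theta/(1-eta)**3
theta = mu * (1 - eta)
p = mu / var                             # scipy NB: mean = n*(1-p)/p, var = n*(1-p)/p**2
n = mu**2 / (var - mu)

k = np.arange(0, 21)
print(np.column_stack([k, gp_pmf(k, theta, eta), nbinom.pmf(k, n, p)]))
```

Printing the two pmfs side by side makes the "slight differences in many situations" directly visible for any chosen (mu, var) pair.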

3.
A simple extension of the Poisson process results in binomially distributed counts of events in a time interval. A further extension generalises this to probability distributions under- or over-dispersed relative to the binomial distribution. Substantial levels of under-dispersion are possible with this modelling, but only modest levels of over-dispersion – up to Poisson-like variation. Although simple analytical expressions for the moments of these probability distributions are not available, approximate expressions for the mean and variance are derived, and used to re-parameterise the models. The modelling is applied in the analysis of two published data sets, one showing under-dispersion and the other over-dispersion. More appropriate assessment of the precision of estimated parameters and reliable model checking diagnostics follow from this more general modelling of these data sets.

4.
A stochastic model for a general system of first-order reactions in which each reaction may be either a conversion reaction or a catalytic reaction is derived. The governing master equation is formulated in a manner that explicitly separates the effects of network topology from other aspects, and the evolution equations for the first two moments are derived. We find the surprising, and apparently unknown, result that the time evolution of the second moments can be represented explicitly in terms of the eigenvalues and projections of the matrix that governs the evolution of the means. The model is used to analyze the effects of network topology and the reaction type on the moments of the probability distribution. In particular, it is shown that for an open system of first-order conversion reactions, the distribution of all the system components is a Poisson distribution at steady state. Two different measures of the noise have been used previously, and it is shown that different qualitative and quantitative conclusions can result, depending on which measure is used. The effect of catalytic reactions on the variance of the system components is also analyzed, and the master equation for a coupled system of first-order reactions and diffusion is derived. All authors contributed equally to this work.
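The claim that an open system of first-order conversion reactions has Poisson-distributed components at steady state can be checked numerically. The sketch below is a generic Gillespie simulation of a minimal open network 0 → A → B → 0 with illustrative rate constants of my choosing; it is not the paper's model or code.

```python
import numpy as np

rng = np.random.default_rng(0)

def ssa_open_conversion(k0=5.0, k1=1.0, k2=0.5, n_samples=20_000, sample_dt=1.0):
    """Gillespie simulation of the open first-order network  0 --k0--> A --k1--> B --k2--> 0.
    The state (A, B) is recorded every sample_dt time units after a burn-in period."""
    A, B, t = 0, 0, 0.0
    next_sample, samples = 50.0, []          # 50 time units of burn-in
    while len(samples) < n_samples:
        rates = np.array([k0, k1 * A, k2 * B])
        total = rates.sum()
        dt = rng.exponential(1.0 / total)
        # the state is constant over the waiting interval, so record it at each sampling instant
        while next_sample < t + dt and len(samples) < n_samples:
            samples.append((A, B))
            next_sample += sample_dt
        t += dt
        r = rng.choice(3, p=rates / total)
        if r == 0:   A += 1            # influx of A
        elif r == 1: A -= 1; B += 1    # conversion A -> B
        else:        B -= 1            # degradation of B
    return np.array(samples)

s = ssa_open_conversion()
# Poisson prediction at steady state: A ~ Poisson(k0/k1) = Poisson(5), B ~ Poisson(k0/k2) = Poisson(10)
print(s.mean(axis=0), s.var(axis=0))   # means and variances should both be close to (5, 10)
```

Means equal to variances (Fano factor near one) for both components is the Poisson signature the abstract refers to.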

5.
6.
负二项分布与昆虫种群空间格局分析的研究现状 (The negative binomial distribution and spatial-pattern analysis of insect populations: current state of research)
Correct estimation of the population densities of agricultural pests and their natural enemies is a prerequisite for implementing an IPM (integrated pest management) programme; sampling methods have therefore long been regarded as among the most important fundamentals of entomology, ecology and plant protection science …

7.
Power investigations, for example, in statistical procedures for the assessment of agreement among multiple raters often require the simultaneous simulation of several dependent binomial or Poisson distributions to appropriately model the stochastic dependencies between the raters' results. Given the rather large dimensions of the random vectors to be generated, and the even larger number of interactions to be introduced into the simulation scenarios to determine all necessary information on their distributions' dependence structure, one needs efficient and fast algorithms for the simulation of multivariate Poisson and binomial distributions. Therefore, two equivalent models for the multivariate Poisson distribution are combined to obtain an algorithm for the quick implementation of its multivariate dependence structure. Simulation of the multivariate Poisson distribution then becomes feasible by first generating and then convoluting independent univariate Poisson variates with appropriate expectations. The latter can be computed via linear recursion formulae. Similar means of simulation are also considered for the binomial setting. In this scenario it turns out, however, that exact computation of the probability function is even easier to perform; therefore, corresponding linear recursion formulae for the point probabilities of multivariate binomial distributions are presented, which only require information about the index parameter and the (simultaneous) success probabilities, that is, the multivariate dependence structure among the binomial marginals.
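The "generate and then convolute independent univariate Poisson variates" step can be illustrated with the familiar common-shock construction. The bivariate sketch below is a minimal stand-in with made-up rates, not the authors' algorithm (which also obtains the required expectations by linear recursion).

```python
import numpy as np

rng = np.random.default_rng(1)

def bivariate_poisson(lam1, lam2, lam12, size):
    """Bivariate Poisson by convoluting independent Poisson variates:
    X1 = Y1 + Y12, X2 = Y2 + Y12 with Y1~Pois(lam1), Y2~Pois(lam2), Y12~Pois(lam12).
    Marginals are Poisson(lam1+lam12) and Poisson(lam2+lam12); Cov(X1, X2) = lam12."""
    y1 = rng.poisson(lam1, size)
    y2 = rng.poisson(lam2, size)
    y12 = rng.poisson(lam12, size)
    return y1 + y12, y2 + y12

x1, x2 = bivariate_poisson(2.0, 3.0, 1.5, 200_000)
print(x1.mean(), x2.mean())          # ~3.5 and ~4.5
print(np.cov(x1, x2)[0, 1])          # ~1.5, the common-shock rate
```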

8.
The concentration of a drug in the circulatory system is studied under two different elimination strategies. The first strategy—geometric elimination—is the classical one, which assumes a constant elimination rate per cycle. The second strategy—Poisson elimination—assumes that the elimination rate changes during the process of elimination. The problem studied here is to find a relationship between the residence-time distribution and the cycle-time distribution for a given rule of elimination. While the presented model gives this relationship in terms of the Laplace-Stieltjes transform, the aim here is to determine the shapes of the corresponding probability density functions. From experimental data, we expect positively skewed, gamma-like distributions for the residence time of the drug in the body. Also, as some elimination parameter in the model approaches a limit, the exponential distribution often arises. Therefore, we use Laguerre series expansions, which yield a parsimonious approximation of positively skewed probability densities that are close to a gamma distribution. The coefficients in the expansion are determined by the central moments, which can be obtained from experimental data or as a consequence of theoretical assumptions. The examples presented show that gamma-like densities arise for a diverse set of cycle-time distributions and under both elimination rules.

9.
The question of how to characterize the bacterial density in a body of water when data are available as counts from a number of small-volume samples was examined for cases where either the Poisson or negative binomial probability distributions could be used to describe the bacteriological data. The suitability of the Poisson distribution when replicate analyses were performed under carefully controlled conditions and of the negative binomial distribution for samples collected from different locations and over time were illustrated by two examples. In cases where the negative binomial distribution was appropriate, a procedure was given for characterizing the variability by dividing the bacterial counts into homogeneous groups. The usefulness of this procedure was illustrated for the second example based on survey data for Lake Erie. A further illustration of the difference between results based on the Poisson and negative binomial distributions was given by calculating the probability of obtaining all samples sterile, assuming various bacterial densities and sample sizes.
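The final calculation mentioned (the probability that all samples are sterile for assumed densities and sample sizes) is simple enough to sketch. The functions below use a generic negative binomial parameterised by mean and shape k; the numbers are illustrative and are not taken from the Lake Erie survey.

```python
import numpy as np

def p_all_sterile_poisson(density, volume, n_samples):
    """P(all n samples show zero counts) when counts per sample are Poisson(density*volume)."""
    return np.exp(-density * volume * n_samples)

def p_all_sterile_negbin(density, volume, k, n_samples):
    """Same probability when counts per sample are negative binomial with
    mean m = density*volume and shape k (variance m + m**2/k): P(0) = (k/(k+m))**k."""
    m = density * volume
    return ((k / (k + m)) ** k) ** n_samples

# e.g. 0.2 organisms/mL, 10 mL samples, 5 replicate samples
print(p_all_sterile_poisson(0.2, 10, 5))        # ~4.5e-5
print(p_all_sterile_negbin(0.2, 10, 0.5, 5))    # much larger when counts are clumped (small k)
```

The comparison makes the abstract's point concrete: with the same mean density, a clumped (negative binomial) population yields all-sterile results far more often than a Poisson one.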

10.
Mean firing rates (MFRs), with analogue values, have thus far been used as information carriers of neurons in most brain theories of learning. However, neurons transmit signals by spikes, which are discrete events. The climbing fibers (CFs), which are known to be essential for cerebellar motor learning, fire at ultra-low rates (around 1 Hz), and it is not yet understood theoretically how high-frequency information can be conveyed and how learning of smooth and fast movements can be achieved. Here we address whether cerebellar learning can be achieved by CF spikes instead of the conventional MFR in an eye movement task, such as the ocular following response (OFR), and in an arm movement task. There are two major afferents into cerebellar Purkinje cells: the parallel fiber (PF) and the CF, and the synaptic weights between PFs and Purkinje cells have been shown to be modulated by the stimulation of both types of fiber. The modulation of the synaptic weights is regulated by cerebellar synaptic plasticity. In this study we simulated cerebellar learning using CF signals as spikes instead of the conventional MFR. To generate the spikes we used the following four spike generation models: (1) a Poisson model in which the spike interval probability follows a Poisson distribution, (2) a gamma model in which the spike interval probability follows the gamma distribution, (3) a max model in which a spike is generated when a synaptic input reaches maximum, and (4) a threshold model in which a spike is generated when the input crosses a certain small threshold. We found that, in an OFR task with a constant visual velocity, learning was successful with the stochastic models, such as the Poisson and gamma models, but not with the deterministic models, such as the max and threshold models. In an OFR task with a stepwise velocity change and in an arm movement task, learning could be achieved only with the Poisson model. In addition, for efficient cerebellar learning, the distribution of CF spike-occurrence time after stimulus onset must capture at least the first, second and third moments of the temporal distribution of error signals.
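The two stochastic spike-generation models named here (Poisson and gamma interspike intervals) are easy to sketch. The constant-rate generators below are a hedged illustration with arbitrary parameters, not the paper's simulation code, which drives the generators with time-varying error signals.

```python
import numpy as np

rng = np.random.default_rng(2)

def spike_times_poisson(rate_hz, duration_s):
    """Homogeneous Poisson spike train: exponential interspike intervals with mean 1/rate."""
    isis = rng.exponential(1.0 / rate_hz, size=int(5 * rate_hz * duration_s) + 10)
    t = np.cumsum(isis)
    return t[t < duration_s]

def spike_times_gamma(rate_hz, duration_s, order=4):
    """Gamma-interval spike train of a given order; the mean ISI is kept at 1/rate,
    so the train is more regular than Poisson for order > 1."""
    isis = rng.gamma(shape=order, scale=1.0 / (order * rate_hz),
                     size=int(5 * rate_hz * duration_s) + 10)
    t = np.cumsum(isis)
    return t[t < duration_s]

# climbing-fibre-like rates of ~1 Hz over a 100 s "trial"
for gen in (spike_times_poisson, spike_times_gamma):
    spikes = gen(1.0, 100.0)
    print(gen.__name__, len(spikes), "spikes")
```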

11.
On the analysis of high order moments of fluorescence fluctuations.
A simple, straightforward analysis to characterize the distribution of aggregate sizes in a reversible aggregation system at equilibrium is presented. The method, an extension of fluorescence correlation spectroscopy (FCS), is based on measurements of higher order moments of spontaneous fluctuations of fluorescence intensity emitted from a defined open region of the sample. These fluctuations indicate fluctuations of the numbers of the fluorescent molecules in the observation region. Shot noise resulting from the random character of fluorescence emission and from the photoelectric detection system is modeled as a Poisson distribution and is subtracted from the measured photon count fluctuation moments to yield the desired fluorescence fluctuation moments. This analysis can also be used to estimate the fraction of immobile fluorophores in FCS measurements.
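The shot-noise subtraction rests on a standard identity: if photon counts are Poisson-distributed given the instantaneous fluorescence intensity, the factorial moments of the counts equal the ordinary moments of the intensity. The toy numbers below are mine; the sketch only checks that identity and is not the FCS analysis itself.

```python
import numpy as np

rng = np.random.default_rng(8)

# Photon counts n are Poisson given the fluorescence intensity W (shot noise on top of
# number fluctuations), so factorial moments of n equal ordinary moments of W -- the sense
# in which the Poisson shot-noise contribution is "subtracted" from the measured moments.
reps, mean_particles, brightness = 1_000_000, 3.0, 2.0
N = rng.poisson(mean_particles, reps)          # fluctuating number of fluorophores in the volume
W = brightness * N                             # fluorescence intensity in photon units
n = rng.poisson(W)                             # detected photon counts (shot noise added)

print(np.mean(n * (n - 1)), np.mean(W**2))     # second factorial moment of n ~ second moment of W
print(np.mean(n**2))                           # plain second moment is inflated by shot noise
```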

12.
An analytical approach is presented for determining the response of a neuron, or of the activity in a network of connected neurons, represented by systems of nonlinear ordinary stochastic differential equations—the FitzHugh-Nagumo system with Gaussian white noise current. For a single neuron, five equations hold for the first- and second-order central moments of the voltage and recovery variables. From this system we obtain, under certain assumptions, five differential equations for the means, variances, and covariance of the two components. One may use these quantities to estimate the probability that a neuron is emitting an action potential at any given time. The differential equations are solved by numerical methods. We also perform simulations on the stochastic FitzHugh-Nagumo system and compare the results with those obtained from the differential equations for both sustained and intermittent deterministic current inputs with superimposed noise. For intermittent currents, which mimic synaptic input, the agreement between the analytical and simulation results for the moments is excellent. For sustained input, the analytical approximations perform well for small noise, as there is excellent agreement for the moments. In addition, the probability that a neuron is spiking as obtained from the empirical distribution of the potential in the simulations gives a result almost identical to that obtained using the analytical approach. However, when there is sustained large-amplitude noise, the analytical method is only accurate for short time intervals. Using the simulation method, we study the distribution of the interspike interval directly from simulated sample paths. We confirm that noise extends the range of input currents over which (nonperiodic) spike trains may exist and investigate the dependence of such firing on the magnitude of the mean input current and the noise amplitude. For networks we find the differential equations for the means, variances, and covariances of the voltage and recovery variables and show how solving them leads to an expression for the probability that a given neuron, or given set of neurons, is firing at time t. Using such expressions one may implement dynamical rules for changing synaptic strengths directly without sampling. The present analytical method applies equally well to temporally nonhomogeneous input currents and is expected to be useful for computational studies of information processing in various nervous system centers.
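The simulation half of this comparison is straightforward to sketch. Below is a generic Euler-Maruyama integration of a stochastic FitzHugh-Nagumo system with sustained input; the equations dv = (v - v**3/3 - w + I)dt + sigma dW, dw = eps*(v + a - b*w)dt and all parameter values are common textbook choices of mine, not necessarily the paper's.

```python
import numpy as np

rng = np.random.default_rng(3)

def fhn_ensemble(I=0.5, a=0.7, b=0.8, eps=0.08, sigma=0.2,
                 dt=0.01, t_end=50.0, n_paths=2000):
    """Euler-Maruyama simulation of the stochastic FitzHugh-Nagumo system:
        dv = (v - v**3/3 - w + I) dt + sigma dW,   dw = eps*(v + a - b*w) dt.
    Returns the ensemble mean and variance of v and the v-w covariance at the final time,
    the kind of quantities the paper's moment ODEs approximate."""
    n_steps = int(t_end / dt)
    v = np.zeros(n_paths)
    w = np.zeros(n_paths)
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt), size=n_paths)
        dv = (v - v**3 / 3 - w + I) * dt + sigma * dW
        dw = eps * (v + a - b * w) * dt
        v, w = v + dv, w + dw
    return v.mean(), v.var(), np.cov(v, w)[0, 1]

print(fhn_ensemble())   # empirical first/second moments to compare against a moment-ODE solution
```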

13.
The number of common adjacencies of genetic markers, as a measure of the similarity of two genomes, has been widely used as an indicator of evolutionary relatedness and as the basis for inferring phylogenetic relationships. Its probability distribution enables statistical tests for detecting whether significant evolutionary signal remains in the marker order. In this article, we derive the probability distributions of the number of adjacencies for a number of genome types: signed or unsigned, circular or linear, single-chromosome or multichromosomal. Generating functions are found for the single-chromosome cases, from which exact counts can be calculated. Probability approaches are adopted for the multichromosomal cases, where we find the exact values for expectations and variances. In both cases, the limiting distributions are derived in terms of the number of adjacencies. For all unsigned cases, the limiting distribution is Poisson with parameter 2; for all signed cases, the limiting distribution is Poisson with parameter 1/2.
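The Poisson(2) limit for the unsigned, circular, single-chromosome case can be checked by simulation. The sketch below compares a random circular gene order against the identity order; it is a quick numerical check of my own, not the paper's generating-function derivation.

```python
import numpy as np

rng = np.random.default_rng(4)

def shared_adjacencies(n):
    """Number of (unsigned, unordered) adjacencies that a random circular genome on
    markers 0..n-1 shares with the identity circular order 0,1,...,n-1."""
    perm = rng.permutation(n)
    identity_adj = {frozenset((i, (i + 1) % n)) for i in range(n)}
    random_adj = {frozenset((perm[i], perm[(i + 1) % n])) for i in range(n)}
    return len(identity_adj & random_adj)

n, reps = 200, 5_000
counts = np.array([shared_adjacencies(n) for _ in range(reps)])
print(counts.mean(), counts.var())   # both close to 2 (the Poisson(2) limit, unsigned case)
```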

14.
T. Ichiye and M. Karplus, Biochemistry, 1988, 27(9):3487-3497
The effects of anisotropy and anharmonicity of the atomic fluctuations on the results of crystallographic refinement of proteins are examined. Atomic distribution functions from a molecular dynamics simulation for lysozyme are introduced into a real-space (electron density) refinement procedure for individual atoms. Several models for the atomic probability distributions are examined. When isotropic, harmonic motion is assumed, the largest discrepancies between the true first moments (means) and second moments (B factors) of the positions calculated from the dynamics and the fitted values occur for probability densities with multiple peaks. The refined mean is at the center of the largest peak, and the refined B factor is slightly larger than that of the largest peak, unless the distance between the peaks is small compared to the peak width. The resulting values are often significantly different from the true first and second moments of the distribution. To improve the results, alternate conformations, rather than anharmonic corrections, should be included.

15.
A method is given for studying realistic random fluctuations in the carrying capacity of the logistic population growth model. This method is then applied using environmental noise based on a Poisson process, and the time-dependent moments of the population probability density are calculated. These moments are expressed in terms of a parameter obtained by dividing the correlation time of the environmental fluctuations by the characteristic response time of the population. When this quotient is large (very slow fluctuations tracked by the population) or small (very rapid fluctuations which are averaged), exact solutions are obtained for the probability density itself. It is also shown that, at equilibrium, the average population sizes given by these two exact solutions bound all other cases. Numerical simulations confirm these developments and point to a trade-off between population stability and average population size. Additional simulations show that the probability of becoming extinct in a given time is greatest for populations intermediate between tracking and averaging the carrying-capacity fluctuations. In addition to specifying when environmental noise can be ignored, these results indicate the direction in which growth parameters evolve in a fluctuating environment.
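The tracking-versus-averaging dichotomy can be illustrated with a very simple stand-in for the noise model: a carrying capacity that switches between two levels at the events of a Poisson process. The model, the two K levels and all rates below are my own illustrative choices, not the paper's; only the ratio of the noise correlation time to the population response time 1/r plays the role described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(5)

def logistic_with_poisson_K(r=1.0, jump_rate=0.2, K_values=(50.0, 150.0),
                            t_end=400.0, dt=0.01, N0=100.0):
    """Logistic growth dN/dt = r*N*(1 - N/K(t)) where K(t) switches between two levels
    at the events of a Poisson process with the given jump rate.  The ratio
    (1/jump_rate) / (1/r) decides whether the population tracks or averages K."""
    t, N, K = 0.0, N0, K_values[0]
    next_jump = rng.exponential(1.0 / jump_rate)
    traj = []
    while t < t_end:
        while t >= next_jump:                    # carrying capacity jumps (possibly several times)
            K = K_values[1] if K == K_values[0] else K_values[0]
            next_jump += rng.exponential(1.0 / jump_rate)
        N += r * N * (1 - N / K) * dt            # Euler step of the logistic equation
        traj.append(N)
        t += dt
    traj = np.array(traj)
    return traj.mean(), traj.var()

print(logistic_with_poisson_K(jump_rate=0.05))  # slow fluctuations: the population tracks K
print(logistic_with_poisson_K(jump_rate=20.0))  # fast fluctuations: the population averages K
```

The slow-noise run shows a larger but more variable average population than the fast-noise run, which is the stability-versus-size trade-off the abstract points to.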

16.
Promotion time models have recently been adapted to the context of infectious diseases to take into account discrete and multiple exposures. However, the Poisson distribution of the number of pathogens transmitted at each exposure was a very strong assumption and did not allow for inter-individual heterogeneity. The Bernoulli, negative binomial, and compound Poisson distributions were proposed as alternatives to the Poisson distribution for the promotion time model with time-changing exposure. All were derived within the frailty model framework. All these distributions have a point mass at zero to take into account non-infected people. The Bernoulli distribution, the two-component cure rate model, was extended to multiple exposures. Unlike the negative binomial and compound Poisson distributions, the Bernoulli distribution did not allow the number of pathogens transmitted to be connected to the delay between transmission and infection detection. Moreover, the former two distributions allow for inter-individual heterogeneity. The delay to surgical site infection was an example of a single exposure. The probability of infection was very low; thus, estimates of the effect of selected risk factors on that probability obtained with the Bernoulli and Poisson distributions were very close. The delay to nosocomial urinary tract infection was a multiple-exposure example. The probabilities of pathogen transmission during catheter placement and catheter presence were estimated. Inter-individual heterogeneity was very high, and the fit was better with the compound Poisson and negative binomial distributions. The proposed models also proved to be mechanistic. The negative binomial and compound Poisson distributions were useful alternatives for accounting for inter-individual heterogeneity.

17.
The local exchange model developed by McNair et al. (1997) provides a stochastic diffusion approximation to the random-like motion of fine particles suspended in turbulent water. Based on this model, McNair (2000) derived equations governing the probability distribution and moments of the hitting time, which is the time until a particle hits the bottom for the first time from a given initial elevation. In the present paper, we derive the corresponding equations for the probability distribution and moments of the hitting distance, which is the longitudinal distance a particle has traveled when it hits the bottom for the first time. We study the dependence of the distribution and moments on a particle's initial elevation and on two dimensionless parameters: an inverse Reynolds number M (a measure of the importance of viscous mixing compared to turbulent mixing of water) and the Rouse number (a measure of the importance of deterministic gravitational sinking compared to stochastic turbulent mixing in governing the vertical motion of a particle). We also compute predicted hitting-distance distributions for two published data sets. The results show that for fine particles suspended in moderately to highly turbulent water, the hitting-distance distribution is strongly skewed to the right, with mode …

18.
Effective management of knee joint disorders demands appropriate rehabilitation programs to restore function while strengthening muscles. Excessive stresses in the cartilage/menisci and forces in the ligaments should be avoided so as not to exacerbate the joint condition after an injury or reconstruction. Using a validated 3D nonlinear finite element model, the detailed biomechanics of the entire joint in closed-kinetic-chain squat exercises are investigated at different flexion angles, weights in hands, femur-tibia orientations and levels of hamstrings coactivity. Predictions are in agreement with the results of earlier studies. The estimation of small forces in the cruciate ligaments advocates the use of squat exercises at all joint angles and external loads. In contrast, large contact stresses, especially at the patellofemoral joint, that approach the cartilage failure threshold in compression suggest avoiding squatting at greater flexion angles, joint moments and weights in hands. The current results are helpful in the comprehensive evaluation and design of effective exercise therapies and trainings with minimal risk to the various joint components.

19.
Frequencies of restriction sites.
Restriction sites or other sequence patterns are usually assumed to occur according to a Poisson distribution with mean equal to the reciprocal of the probability of the given site or pattern. For situations where non-overlapping occurrences of patterns, such as restriction sites, are the objects of interest, this note shows that the Poisson assumption is frequently misleading. Both the case of base composition (independent bases) and that of dinucleotide frequencies (Markov chains) are treated. Moreover, a new technique is presented which allows the treatment of collections of patterns, where the departure from the Poisson assumption is even more striking. This latter case includes double digests, and an example of a five-enzyme digest is included.
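The two dependency models named here fix the per-position site probability, and hence the Poisson mean that is "usually assumed". The sketch below only shows those two calculations for an EcoRI-like site; the base frequencies are illustrative and the transition matrix is a trivial placeholder of mine (which reproduces the independent-base value), and the note's actual point is that the resulting Poisson model can still be misleading for non-overlapping occurrence counts.

```python
import numpy as np

def site_prob_iid(pattern, base_freqs):
    """P(pattern at a fixed position) when bases are independent with the given frequencies."""
    return np.prod([base_freqs[b] for b in pattern])

def site_prob_markov(pattern, base_freqs, transition):
    """Same probability under a first-order Markov chain on bases:
    P(b1) * prod P(b_{i+1} | b_i), using the given frequencies for the first base."""
    p = base_freqs[pattern[0]]
    for a, b in zip(pattern, pattern[1:]):
        p *= transition[a][b]
    return p

freqs = {"A": 0.3, "C": 0.2, "G": 0.2, "T": 0.3}     # illustrative base composition
trans = {a: freqs for a in "ACGT"}                   # placeholder: an independent-base "chain"
L, pattern = 50_000, "GAATTC"                        # EcoRI recognition site

p = site_prob_iid(pattern, freqs)
print((L - len(pattern) + 1) * p)                    # expected site count, the usual Poisson mean
print(site_prob_markov(pattern, freqs, trans))       # equals the iid per-position value here
```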

20.
The iterated birth and death Markov process is defined as an n-fold iteration of a birth and death Markov process describing the kinetics of a certain population, combined with random killing of individuals in the population at moments τ1, …, τn with given survival probabilities s1, …, sn. A long-standing problem of computing the distribution of the number of clonogenic tumor cells surviving an arbitrary fractionated radiation schedule is solved within the framework of the iterated birth and death Markov process. It is shown that, for any initial population size i, the distribution of the size N of the population at a moment t ≥ τn is generalized negative binomial, and an explicit, computationally feasible formula for the latter is found. It is shown that if i → ∞ and sn → 0 so that the product i·s1⋯sn tends to a finite positive limit, the distribution of the random variable N converges to a probability distribution, which for t = τn turns out to be Poisson. In the latter case, an estimate of the rate of convergence in the total variation metric, similar to the classical Law of Rare Events, is obtained.
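The Poisson limit at t = τn can be illustrated in a stripped-down form. In the sketch below the births and deaths between fractions are switched off, so only the iterated-killing (law of rare events) part of the statement is shown; the initial size and survival probabilities are arbitrary choices of mine, not values from the paper.

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(7)

def surviving_cells(i0, survival_probs, reps=50_000):
    """Iterated killing only (no births or deaths between fractions): each fraction k keeps
    every cell independently with probability s_k, so the final count is Binomial(i0, s_1*...*s_n)."""
    counts = np.full(reps, i0)
    for s in survival_probs:
        counts = rng.binomial(counts, s)
    return counts

i0, s = 10_000, [0.3, 0.3, 0.3, 0.3, 0.3]      # i0 * prod(s) = 10_000 * 0.3**5 ~ 24.3
counts = surviving_cells(i0, s)
lam = i0 * np.prod(s)
k = np.arange(counts.max() + 1)
empirical = np.bincount(counts) / counts.size
print(np.abs(empirical - poisson.pmf(k, lam)).max())   # small: close to the Poisson(lam) limit
```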
