Similar Documents
1.
Accurate estimation of probability distribution parameters is vital because imprecise and biased estimates can be misleading. In this study, we investigate a flexible power function distribution and introduce two new estimation methods for its parameters: probability weighted moments and generalized probability weighted moments. We compare their results with those of L-moments and trimmed L-moments through a simulation study and a real data example, using performance measures such as mean square error and total deviation. We conclude that all the methods perform well for large sample sizes (n > 30), whereas the generalized probability weighted moment method performs better for small sample sizes.
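As a rough illustration of the probability weighted moment idea (a sketch, not the authors' estimator), the snippet below assumes the two-parameter power function distribution with CDF F(x) = (x/β)^α on (0, β); its PWMs are b_r = E[X·F(X)^r] = αβ/(α(r+1)+1), so matching the first two sample PWMs gives closed-form estimates of α and β.

```python
import numpy as np

def pwm_power_function(x):
    """Estimate (alpha, beta) of F(x) = (x/beta)**alpha on (0, beta)
    by matching the first two probability weighted moments.
    Assumes the population PWMs b_r = alpha*beta / (alpha*(r+1) + 1)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    b0 = x.mean()                                              # b_0 = E[X]
    b1 = np.sum((np.arange(1, n + 1) - 1) / (n - 1) * x) / n   # unbiased sample b_1 = E[X F(X)]
    ratio = b0 / b1                                            # equals (2*alpha + 1) / (alpha + 1)
    alpha = (ratio - 1.0) / (2.0 - ratio)
    beta = b0 * (alpha + 1.0) / alpha
    return alpha, beta

# quick check on simulated data: X = beta * U**(1/alpha) with U ~ Uniform(0, 1)
rng = np.random.default_rng(0)
u = rng.uniform(size=500)
sample = 4.0 * u ** (1.0 / 2.5)      # true alpha = 2.5, beta = 4.0
print(pwm_power_function(sample))
```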

2.
A heuristic approximation procedure devised by Bartlett has often been used to estimate the stationary first- and second-order moments of difference-equation population models perturbed by “small” noise. Here, the approximation is proved to be valid under quite general assumptions: the exact and approximate moments differ by an amount of order σ³ as σ → 0, where σ² is the mean-square norm of the noise process. The existence of stationary solutions to the perturbed difference equation is also considered. If the noise is Markovian, stationary solutions satisfying the assumptions of the error analysis are proved to exist if the noise is “small” with probability 1. The results are applied to a population model with two age classes and variable recruitment.

3.
Many distributions have been used in flood frequency analysis (FFA) for fitting flood extremes data. However, as shown in the paper, the scatter of Polish data plotted on the moment ratio diagram shows that there is still room for a new model. In the paper, we study the usefulness of the generalized exponential (GE) distribution in flood frequency analysis for Polish rivers. We investigate the fit of the GE distribution to the Polish data of maximum flows in comparison with the inverse Gaussian (IG) distribution, which in our previous studies showed the best fit among several models commonly used in FFA. Since the use of a discrimination procedure without knowledge of its performance for the considered probability density functions may lead to erroneous conclusions, we compare the probability of correct selection for the GE and IG distributions along with an analysis of the asymptotic model error with respect to the upper quantile values. As an application, both GE and IG distributions are alternatively assumed for describing the annual peak flows at several gauging stations on Polish rivers. To find the best fitting model, four discrimination procedures are used. In turn, they are based on the maximized logarithm of the likelihood function (K procedure), on the density function of the scale transformation maximal invariant (QK procedure), on the Kolmogorov-Smirnov statistic (KS procedure), and on the differences between the ML estimate of the 1% quantile and its value assessed by the method of moments and linear moments, in sequence (R procedure). Due to the uncertainty in choosing the best model, a method of aggregation is applied to estimate the maximum flow quantiles.
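A minimal sketch of a maximized log-likelihood selection rule in the spirit of the K procedure (not the paper's implementation): fit the GE distribution f(x) = αλ(1−e^(−λx))^(α−1)e^(−λx) and the inverse Gaussian distribution by maximum likelihood and keep the model with the larger log-likelihood. The simulated "flows" data are purely illustrative.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import invgauss

def ge_negloglik(params, x):
    """Negative log-likelihood of the generalized exponential (GE) distribution
    f(x) = alpha*lam * (1 - exp(-lam*x))**(alpha-1) * exp(-lam*x), x > 0."""
    alpha, lam = params
    if alpha <= 0 or lam <= 0:
        return np.inf
    z = np.exp(-lam * x)
    return -np.sum(np.log(alpha) + np.log(lam) + (alpha - 1) * np.log1p(-z) - lam * x)

def select_ge_vs_ig(x):
    """Pick the model (GE or IG) with the larger maximized log-likelihood."""
    x = np.asarray(x, dtype=float)
    start = np.array([1.0, 1.0 / x.mean()])
    ge_fit = minimize(ge_negloglik, start, args=(x,), method="Nelder-Mead")
    ll_ge = -ge_fit.fun
    mu, loc, scale = invgauss.fit(x, floc=0)          # inverse Gaussian MLE with loc fixed at 0
    ll_ig = np.sum(invgauss.logpdf(x, mu, loc=loc, scale=scale))
    return ("GE" if ll_ge > ll_ig else "IG"), ll_ge, ll_ig

rng = np.random.default_rng(1)
flows = rng.gamma(shape=3.0, scale=150.0, size=60)    # stand-in for annual peak flows
print(select_ge_vs_ig(flows))
```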

4.
The performance of diagnostic tests is often evaluated by estimating their sensitivity and specificity with respect to a traditionally accepted standard test regarded as a “gold standard” in making the diagnosis. Correlated samples of binary data arise in many fields of application. In site-specific studies, the fundamental unit for analysis is occasionally the site rather than the subject. Statistical methods that take into account the within-subject correlation should be employed to estimate the sensitivity and the specificity of diagnostic tests, since site-specific results within a subject can be highly correlated. I introduce several statistical methods for the estimation of the sensitivity and the specificity of site-specific diagnostic tests. I apply these techniques to data from a study involving an enzymatic diagnostic test to motivate and illustrate the estimation of the sensitivity and the specificity of periodontal diagnostic tests. I present results from a simulation study for the estimation of diagnostic sensitivity when the data are correlated within subjects. Through the simulation study, I compare the performance of the binomial estimator pCBE, the ratio estimator pCRE, the weighted estimator pCWE, the intracluster correlation estimator pCIC, and the generalized estimating equation (GEE) estimator pCGEE in terms of bias, observed variance, mean squared error (MSE), relative efficiency of their variances, and 95 per cent coverage proportions. I recommend using pCBE when σ = 0 and the weighted estimator pCWE when σ = 0.6. When σ = 0.2 or σ = 0.4 and the number of subjects is at least 30, pCGEE performs well.
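The following is an illustrative GEE-type estimate of a clustered proportion (e.g., sensitivity over correlated sites within subjects) with an exchangeable working correlation; the simulated data, the intercept-only logistic model, and the variable names are assumptions for illustration, not the paper's pCGEE implementation.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Toy clustered binary data: several sites per subject, test result 1/0 at
# truly diseased sites; the marginal mean is the diagnostic sensitivity.
rng = np.random.default_rng(2)
subjects = np.repeat(np.arange(40), 6)                     # 40 subjects x 6 sites
subject_effect = rng.normal(0.0, 1.0, size=40)[subjects]   # induces within-subject correlation
prob = 1.0 / (1.0 + np.exp(-(1.2 + subject_effect)))
y = rng.binomial(1, prob)

data = pd.DataFrame({"y": y, "subject": subjects, "intercept": 1.0})

# Intercept-only logistic GEE with exchangeable working correlation;
# the fitted marginal probability estimates the sensitivity.
model = sm.GEE(data["y"], data[["intercept"]], groups=data["subject"],
               family=sm.families.Binomial(),
               cov_struct=sm.cov_struct.Exchangeable())
result = model.fit()
sens_hat = 1.0 / (1.0 + np.exp(-result.params["intercept"]))
print("GEE estimate of sensitivity:", round(sens_hat, 3))
```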

5.
Wang CY. Biometrics, 2000, 56(1): 106–112.
Consider the problem of estimating the correlation between two nutrient measurements, such as the percent energy from fat obtained from a food frequency questionnaire (FFQ) and that from repeated food records or 24-hour recalls. Under a classical additive model for repeated food records, it is known that there is an attenuation effect on the correlation estimation if the sample average of repeated food records for each subject is used to estimate the underlying long-term average. This paper considers the case in which the selection probability of a subject for participation in the calibration study, in which repeated food records are measured, depends on the corresponding FFQ value, and the repeated longitudinal measurement errors have an autoregressive structure. This paper investigates a normality-based estimator and compares it with a simple method of moments. Both methods are consistent if the first two moments of the nutrient measurements exist. Furthermore, joint estimating equations are applied to estimate the correlation coefficient and related nuisance parameters simultaneously. This approach provides a simple sandwich formula for estimating the covariance of the estimator. Finite sample performance is examined via a simulation study, and the proposed weighted normality-based estimator performs well under various distributional assumptions. The methods are applied to real data from a dietary assessment study.

6.
In this paper, we develop a Gaussian estimation (GE) procedure to estimate the parameters of a regression model for correlated (longitudinal) binary response data using a working correlation matrix. A two-step iterative procedure is proposed for estimating the regression parameters by the GE method and the correlation parameters by the method of moments. Consistency properties of the estimators are discussed. A simulation study was conducted to compare 11 estimators of the regression parameters, namely, four versions of the GE, five versions of the generalized estimating equations (GEEs), and two versions of the weighted GEE. Simulations show that (i) the Gaussian estimates have the smallest mean square error and best coverage probability if the working correlation structure is correctly specified and (ii) the GE and the GEE with an exchangeable correlation structure perform best when the working correlation structure is correctly specified, as opposed to when it is misspecified.

7.
Question: The optimal use of the point intercept method (PIM) for efficient estimation of plant biomass has not been addressed, although PIM is a commonly used method in vegetation analysis. In this study we compare results achieved using PIM at a range of efforts, we assess a calculation method that is new to PIM, and we provide a formula for planning the optimal use of PIM. Location: Northern Norway. Methods: We collected intercept data at a range of efforts, i.e. from one to 100 pins per 0.25 m² plot, on three plant growth forms in a mountain meadow. After collection of intercept data we clipped and weighed the plant biomass. The relationship between intercept frequency and weighed biomass (b) was estimated using both a weighted linear regression model (WLR) and an ordinary linear regression model (OLR). The accuracy of the estimate of biomass achieved by PIM at different efforts was assessed by running computer simulations at different pin densities. Results: The relationship between intercept frequency and weighed biomass (b) was far better estimated using WLR than the normally used OLR. Efforts above 10 pins per 0.25 m² plot had a negligible effect on the accuracy of the estimate of biomass achieved by PIM, whereas the number of plots had a strong effect. Moreover, for a given level of accuracy, the required number of plots varied depending on plant growth form. We achieved results similar to those of the computer simulations when applying our WLR-based formula. Conclusion: This study shows that, for plant biomass estimation, PIM can be applied more efficiently than in previous studies: several plots should be analysed, but at considerably less effort per plot. Moreover, WLR rather than OLR should be applied when estimating biomass from intercept frequency. The formula we have deduced is a useful tool for planning plant biomass analysis with PIM.
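A hedged sketch of the WLR-versus-OLR comparison on toy data; the through-the-origin model biomass = b·hits and the 1/hits weights (variance proportional to the intercept count) are illustrative assumptions, not necessarily the weighting used in the study.

```python
import numpy as np
import statsmodels.api as sm

# Toy data: intercept frequency (pin hits per plot) and weighed biomass (g) per 0.25 m^2 plot.
rng = np.random.default_rng(3)
hits = rng.integers(1, 40, size=30).astype(float)
biomass = 1.8 * hits + rng.normal(0.0, 0.6 * np.sqrt(hits))   # error variance grows with hits

# Weighted linear regression through the origin, biomass = b * hits.
# The 1/hits weights are an illustrative assumption about the variance structure.
X = hits[:, None]
wls = sm.WLS(biomass, X, weights=1.0 / hits).fit()
ols = sm.OLS(biomass, X).fit()
print("WLS estimate of b:", wls.params[0], "OLS estimate of b:", ols.params[0])
```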

8.
9.
Evaluation of the loads on lumbar intervertebral discs (IVD) is critically important since it is closely related to spine biomechanics, pathology and prosthesis design. Non-invasive estimation of the loads in the discs remains a challenge. In this study, we proposed a new technique to estimate in vivo loads in the IVD using a subject-specific finite element (FE) model of the disc and the kinematics of the disc endplates as input boundary conditions. The technique was validated by comparing the forces and moments in the discs calculated from the FE analyses to the in vitro experimental measurements of three corresponding lumbar discs. The results showed that the forces and moments could be estimated within an average error of 20%. Therefore, this technique can be a promising tool for non-invasive estimation of the loads in the discs and may be extended for use on living subjects.

10.
Monitoring procedures for Alpine ibex Capra ibex are limited in habitats with reduced visibility and when physical capture and marking of the animals is not intended. Photographic sampling, using camera-trap data and identifying ibex from natural markings, was adopted with capture-recapture models to estimate the abundance of ibex in Austria. The model fitted in the software CAPTURE produced an average capture probability of 0.44, with an estimate of 34–51 ibex and a mean population size of 38 ibex. This first study showed the applicability of photographic capture-recapture techniques for estimating the abundance of ibex based on their natural markings.
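For illustration only, the simplest closed-population capture-recapture estimator (Chapman's bias-corrected Lincoln-Petersen) from two photographic occasions is sketched below; the CAPTURE models used in the study are more elaborate, and the counts shown are hypothetical.

```python
import math

def chapman_estimate(n1, n2, m2):
    """Chapman's bias-corrected Lincoln-Petersen estimator for a closed population:
    n1 = animals identified on occasion 1, n2 = on occasion 2, m2 = seen on both."""
    n_hat = (n1 + 1) * (n2 + 1) / (m2 + 1) - 1
    var = ((n1 + 1) * (n2 + 1) * (n1 - m2) * (n2 - m2)) / ((m2 + 1) ** 2 * (m2 + 2))
    se = math.sqrt(var)
    return n_hat, (n_hat - 1.96 * se, n_hat + 1.96 * se)

# e.g. 22 ibex photographed in period 1, 19 in period 2, 12 recognised in both (hypothetical)
print(chapman_estimate(22, 19, 12))
```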

11.
Tail moments in the single cell gel electrophoresis (comet) assay usually do not follow a normal distribution, making the statistical analysis complicated. Researchers have used a wide variety of statistical techniques in an attempt to overcome this problem. In many cases, the tail moments follow a bimodal distribution that can be modeled with a mixture of gamma distributions. This bimodality may be due to cells being in two different stages of the cell cycle at the time of treatment. Maximum likelihood, modified to accommodate censored data, can be used to estimate the five parameters of the gamma mixture distribution for each slide. A weighted analysis of variance on the parameter estimates for the gamma mixtures can then be performed to determine differences in DNA damage between treatments. These methods were applied to an experiment on the effect of thymidine kinase on DNA damage and repair. Analysis based on the mixture of gamma distributions was found to be more statistically valid, more powerful, and more informative than analysis based on log-transformed tail moments.
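A minimal sketch of fitting a two-component gamma mixture to tail moments by maximum likelihood; for simplicity it ignores the censoring adjustment described in the abstract, and the simulated data and starting values are illustrative.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import gamma

def neg_loglik(params, x):
    """Negative log-likelihood of a two-component gamma mixture with
    mixing weight p, shapes k1, k2 and scales s1, s2 (censoring ignored here)."""
    p, k1, s1, k2, s2 = params
    if not (0 < p < 1) or min(k1, s1, k2, s2) <= 0:
        return np.inf
    dens = p * gamma.pdf(x, k1, scale=s1) + (1 - p) * gamma.pdf(x, k2, scale=s2)
    return -np.sum(np.log(dens + 1e-300))

rng = np.random.default_rng(4)
tail_moments = np.concatenate([rng.gamma(1.2, 2.0, 150),    # lightly damaged cells
                               rng.gamma(8.0, 3.0, 100)])   # heavily damaged cells

start = np.array([0.5, 1.0, 1.0, 5.0, 2.0])
fit = minimize(neg_loglik, start, args=(tail_moments,), method="Nelder-Mead",
               options={"maxiter": 5000})
print("estimated (p, k1, s1, k2, s2):", np.round(fit.x, 3))
```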

12.
We are interested in the estimation of average treatment effects based on right-censored data from an observational study. We focus on causal inference of differences between t-year absolute event risks in a situation with competing risks. We derive doubly robust estimation equations and implement estimators for the nuisance parameters based on working regression models for the outcome, censoring, and treatment distribution conditional on auxiliary baseline covariates. We use the functional delta method to show that these estimators are regular asymptotically linear estimators and estimate their variances based on estimates of their influence functions. In empirical studies, we assess the robustness of the estimators and the coverage of confidence intervals. The methods are further illustrated using data from a Danish registry study.
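An illustrative doubly robust (AIPW) estimator of a risk difference for uncensored binary outcomes, combining a working outcome model and a working propensity model; this is a simplification that omits the censoring and competing-risk machinery of the paper.

```python
import numpy as np
import statsmodels.api as sm

def aipw_risk_difference(y, a, X):
    """Doubly robust (AIPW) estimate of P(Y=1 | do(A=1)) - P(Y=1 | do(A=0))
    from uncensored binary data, using logistic working models."""
    Xc = sm.add_constant(X)
    ps = sm.Logit(a, Xc).fit(disp=0).predict(Xc)                  # propensity model P(A=1|X)
    out = sm.Logit(y, np.column_stack([a, Xc])).fit(disp=0)       # outcome model P(Y=1|A,X)
    m1 = out.predict(np.column_stack([np.ones_like(a), Xc]))      # predicted risk if treated
    m0 = out.predict(np.column_stack([np.zeros_like(a), Xc]))     # predicted risk if untreated
    dr1 = m1 + a * (y - m1) / ps
    dr0 = m0 + (1 - a) * (y - m0) / (1 - ps)
    return np.mean(dr1 - dr0)

rng = np.random.default_rng(5)
X = rng.normal(size=(500, 2))
a = rng.binomial(1, 1 / (1 + np.exp(-(0.5 * X[:, 0]))))
y = rng.binomial(1, 1 / (1 + np.exp(-(-1.0 + 0.8 * a + 0.7 * X[:, 1]))))
print("AIPW risk difference:", round(aipw_risk_difference(y, a, X), 3))
```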

13.
Question: Species optima or indicator values are frequently used to predict environmental variables from species composition. The present study focuses on whether predictions can be improved by using species environmental amplitudes instead of single values representing species optima. Location: Semi-natural, deciduous hardwood forests of northwestern Germany. Methods: Based on a data set of 558 relevés, species responses (presence/absence) to pH were modelled with Huisman-Olff-Fresco (HOF) regression models. Species amplitudes were derived from the response curves using three different methods. To predict pH from vegetation, a maximum amplitude overlap method was applied. For comparison, predictions resulting from several established methods, i.e. maximum likelihood/present and absent species, maximum likelihood/present species only, mean weighted averages and mean Ellenberg indicator values, were calculated. The predictive success (squared Pearson's r and root mean square error of prediction) was evaluated using an independent data set of 151 relevés. Results: Predictions based on amplitudes defined by the maximum Cohen's κ probability threshold yield the best results of all amplitude definitions (R² = 0.75, RMSEP = 0.52). Provided there is an even distribution of the environmental variable, amplitudes defined by predicted probability exceeding prevalence are also suitable (R² = 0.76, RMSEP = 0.55). The prediction success is comparable to maximum likelihood (present species only) and, after rescaling, to mean weighted averages. Predicted values show good linearity with observed pH values, as opposed to the curvilinear relationship of mean Ellenberg indicator values. Transformation or rescaling of the predicted values is not required. Conclusions: Species amplitudes given by a minimum and maximum boundary for each species can be used to efficiently predict environmental variables from species composition. The predictive success is superior to mean Ellenberg indicator values and comparable to mean indicator values based on species weighted averages.
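A toy sketch of the maximum amplitude overlap idea: given species pH amplitudes (hypothetical values below, in practice derived from HOF response curves), the predicted pH is the value at which the largest number of the observed species' amplitudes overlap.

```python
import numpy as np

# Hypothetical pH amplitudes (min, max) per species, for illustration only.
amplitudes = {
    "Fagus sylvatica":      (3.5, 6.0),
    "Fraxinus excelsior":   (5.0, 7.5),
    "Oxalis acetosella":    (3.0, 5.5),
    "Mercurialis perennis": (5.5, 7.8),
}

def predict_ph(species_present, grid=np.arange(3.0, 8.01, 0.05)):
    """Maximum amplitude overlap: return the pH on a grid at which the largest
    number of the observed species' amplitudes overlap (midpoint if tied)."""
    counts = np.zeros_like(grid)
    for sp in species_present:
        lo, hi = amplitudes[sp]
        counts += (grid >= lo) & (grid <= hi)
    best = grid[counts == counts.max()]
    return best.mean()          # midpoint of the best-overlap interval

print(predict_ph(["Fagus sylvatica", "Fraxinus excelsior", "Mercurialis perennis"]))
```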

14.
In the present research, merwinite (M) scaffolds with and without nano-titanium dioxide (titania) were synthesized by a water-based freeze casting method. Two different amounts (7.5 and 10 wt%) of n-TiO2 were added to the M scaffolds. They were sintered at 1573.15 K and at a cooling rate of 4 K/min. The changes in physical and mechanical properties were investigated. The results showed that although the M and the M containing 7.5 wt% n-TiO2 (MT7.5) scaffolds had approximately the same microstructures in terms of pore size and wall thickness, these factors were different for sample MT10. Overall, the porosity and the volume and linear shrinkage decreased as different weight ratios of n-TiO2 were added to the M structure. According to the obtained mechanical results, the optimum mechanical performance was that of sample MT7.5 (E = 51 MPa and σ = 2 MPa), compared with the other samples, M (E = 47 MPa and σ = 1.8 MPa) and MT10 (E = 32 MPa and σ = 1.4 MPa). The acellular in vitro bioactivity experiment confirmed apatite formation on the surfaces of all samples for various soaking times. Based on the cell study, the sample with the favorable mechanical behavior (MT7.5) supported attachment and proliferation of osteoblastic cells. These results revealed that the MT7.5 scaffold, with improved mechanical and biological properties, has potential for use as a bone substitute. © 2015 American Institute of Chemical Engineers Biotechnol. Prog., 31:550–556, 2015

15.
The weights used in iterative weighted least squares (IWLS) regression are usually estimated parametrically using a working model for the error variance. When the variance function is misspecified, the IWLS estimates of the regression coefficients β are still asymptotically consistent, but there is some loss in efficiency. Since second moments can be quite hard to model, it makes sense to estimate the error variances nonparametrically and to employ weights inversely proportional to the estimated variances in computing the WLS estimate of β. Surprisingly, this approach has not received much attention in the literature. The aim of this note is to demonstrate that such a procedure can be implemented easily in S-plus using standard functions with default options, making it suitable for routine applications. The particular smoothing method that we use is local polynomial regression applied to the logarithm of the squared residuals, but other smoothers can be tried as well. The proposed procedure is applied to data on the use of two different assay methods for a hormone. Efficiency calculations based on the estimated model show that the nonparametric IWLS estimates are more efficient than the parametric IWLS estimates based on three different plausible working models for the variance function. The proposed estimators also perform well in a simulation study using both parametric and nonparametric variance functions as well as normal and gamma errors.
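A Python analogue of the proposed procedure (the paper works in S-plus): smooth the log squared residuals against the covariate with a local smoother, weight by the inverse of the estimated variance, and iterate; the lowess smoother, its span, and the number of iterations are illustrative choices.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.nonparametric.smoothers_lowess import lowess

def nonparametric_iwls(y, x, n_iter=3):
    """IWLS with nonparametrically estimated variances: smooth log(residual^2)
    against x and weight each observation by the inverse of the fitted variance."""
    X = sm.add_constant(x)
    fit = sm.OLS(y, X).fit()                            # start from unweighted LS
    for _ in range(n_iter):
        log_r2 = np.log(fit.resid ** 2 + 1e-12)
        smooth = lowess(log_r2, x, frac=0.6, return_sorted=False)
        weights = 1.0 / np.exp(smooth)                  # inverse of estimated variance
        fit = sm.WLS(y, X, weights=weights).fit()
    return fit

rng = np.random.default_rng(6)
x = np.sort(rng.uniform(0, 10, 200))
y = 2.0 + 0.5 * x + rng.normal(scale=0.2 + 0.3 * x)     # heteroscedastic errors
print(nonparametric_iwls(y, x).params)
```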

16.
17.
In observational studies of survival time featuring a binary time-dependent treatment, the hazard ratio (an instantaneous measure) is often used to represent the treatment effect. However, investigators are often more interested in the difference in survival functions. We propose semiparametric methods to estimate the causal effect of treatment among the treated with respect to survival probability. The objective is to compare post-treatment survival with the survival function that would have been observed in the absence of treatment. For each patient, we compute a prognostic score (based on the pre-treatment death hazard) and a propensity score (based on the treatment hazard). Each treated patient is then matched with an alive, uncensored and not-yet-treated patient with similar prognostic and/or propensity scores. The experience of each treated and matched patient is weighted using a variant of Inverse Probability of Censoring Weighting to account for the impact of censoring. We propose estimators of the treatment-specific survival functions (and their difference), computed through weighted Nelson–Aalen estimators. Closed-form variance estimators are proposed which take into consideration the potential replication of subjects across matched sets. The proposed methods are evaluated through simulation, then applied to estimate the effect of kidney transplantation on survival among end-stage renal disease patients using data from a national organ failure registry.
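A generic weighted Nelson-Aalen cumulative hazard sketch (weights supplied externally, ties handled row by row); the matching and IPCW weight construction of the paper are not reproduced here.

```python
import numpy as np

def weighted_nelson_aalen(time, event, weight):
    """Weighted Nelson-Aalen cumulative hazard: at each event time, add the weighted
    number of events divided by the weighted number still at risk."""
    time, event, weight = map(np.asarray, (time, event, weight))
    order = np.argsort(time)
    time, event, weight = time[order], event[order], weight[order]
    at_risk = np.cumsum(weight[::-1])[::-1]          # weighted risk set at each ordered time
    increments = np.where(event == 1, weight / at_risk, 0.0)
    return time[event == 1], np.cumsum(increments)[event == 1]

# toy input: follow-up times, event indicators and externally supplied weights
t = [2.0, 3.5, 3.5, 5.0, 7.2, 9.0]
d = [1, 1, 0, 1, 0, 1]
w = [1.0, 0.8, 1.2, 1.0, 0.9, 1.1]
times, cumhaz = weighted_nelson_aalen(t, d, w)
print(dict(zip(times, np.round(cumhaz, 3))))
```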

18.
Though there are many problems concerning the usefulness of the logistic curve, before discussing these problems it may be necessary to examine whether or not the actual data fit the theoretical values. It is clarified in this paper that the relation between the population density and its rate of increase per individual described by the differential equation (1) is represented by a straight line on a finite difference diagram on which (N_{i+1} − N_i)/N_i values are plotted against N_{i+1}. Utilizing this linear relation, we may examine the fitness of the logistic curve to the actual data and, when it is fitted, we may estimate the parameters of the logistic equation by (5) and (6). The result of applying this method to the experimental populations of the azuki bean weevil indicates that the relation between parent and progeny densities fits well to the logistic type, as was proved by Fujita and Utida (1953), who utilized the linear relation between 1/R+2σ and parent density, where R is the apparent rate of reproduction and σ is a constant dependent primarily upon the length of adult life (0 ≤ σ ≤ 1).
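The linear relation can be made explicit from the unit-time solution of the logistic equation dN/dt = rN(1 − N/K), which gives (N_{i+1} − N_i)/N_i = (e^r − 1)(1 − N_{i+1}/K); a straight-line fit on the finite difference diagram therefore recovers r and K. The sketch below uses this relation and is not the paper's equations (5) and (6).

```python
import numpy as np

def logistic_from_difference_diagram(N):
    """Fit the straight line (N[i+1]-N[i])/N[i] = a + b*N[i+1] and recover the
    logistic parameters via r = ln(1 + a), K = -a/b, using the exact discrete
    solution of dN/dt = r*N*(1 - N/K) over unit time steps."""
    N = np.asarray(N, dtype=float)
    y = (N[1:] - N[:-1]) / N[:-1]
    x = N[1:]
    b, a = np.polyfit(x, y, 1)          # slope b, intercept a
    return np.log(1.0 + a), -a / b      # (r, K)

# simulate a logistic series with r = 0.6, K = 400 and check the recovery
r_true, K_true, N = 0.6, 400.0, [10.0]
for _ in range(25):
    N.append(K_true / (1.0 + (K_true / N[-1] - 1.0) * np.exp(-r_true)))
print(logistic_from_difference_diagram(N))   # ~ (0.6, 400.0)
```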

19.
In this article we construct and study estimators of the causal effect of a time-dependent treatment on survival in longitudinal studies. We employ a particular marginal structural model (MSM), proposed by Robins (2000), and follow a general methodology for constructing estimating functions in censored data models. The inverse probability of treatment weighted (IPTW) estimator of Robins et al. (2000) is used as an initial estimator and forms the basis for an improved, one-step estimator that is consistent and asymptotically linear when the treatment mechanism is consistently estimated. We extend these methods to handle informative censoring. The proposed methodology is employed to estimate the causal effect of exercise on mortality in a longitudinal study of seniors in Sonoma County. A simulation study demonstrates the bias of naive estimators in the presence of time-dependent confounders and also shows the efficiency gain of the IPTW estimator, even in the absence of such confounding. The efficiency gain of the improved, one-step estimator is demonstrated through simulation.
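For illustration, stabilized inverse-probability-of-treatment weights for a single binary treatment are sketched below; the time-dependent treatment and censoring weights used with the MSM in the article follow the same principle but are built from sequential treatment models.

```python
import numpy as np
import statsmodels.api as sm

def stabilized_iptw(a, L):
    """Stabilized IPTW weights for a binary treatment: sw = P(A=a) / P(A=a | L),
    a point-treatment simplification of the weights used with the MSM."""
    Lc = sm.add_constant(L)
    p_cond = sm.Logit(a, Lc).fit(disp=0).predict(Lc)       # denominator model P(A=1|L)
    p_marg = a.mean()                                      # numerator model P(A=1)
    return np.where(a == 1, p_marg / p_cond, (1 - p_marg) / (1 - p_cond))

rng = np.random.default_rng(7)
L = rng.normal(size=(300, 1))                              # confounder (e.g. baseline health)
a = rng.binomial(1, 1 / (1 + np.exp(-0.8 * L[:, 0])))
w = stabilized_iptw(a, L)
print("mean stabilized weight (should be near 1):", round(w.mean(), 3))
```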

20.
Application of ordinary kriging in insect ecology
Geostatistics is a statistical approach that takes regionalized variables as its basis and the variogram as its main tool for analyzing the structure of spatially correlated variables. When fitting an experimental variogram with large fluctuations, an optimal fit cannot be obtained, but by selecting parameters flexibly through an interactive fitting procedure, reasonably good parameters for the variogram model can be obtained. In this paper, weighted polynomial regression combined with interactive fitting yielded satisfactory fits of first- and second-order spherical models, and a linear function was also fitted to the experimental variogram. Finally, ordinary kriging was used to obtain the optimal, linear, unbiased interpolation estimate at each point to be estimated under each theoretical model, yielding the kriging interpolation weights. The method was applied to rice planthopper observation data from experimental fields in Fuxi, Dasha Town, Sihui City, Guangdong Province: from the data of several observation points around a point to be estimated, the insect density at that point was estimated effectively, and the fitting performance and estimation errors of the different theoretical models were compared and discussed. The results show that the second-order spherical model fitted best, followed by the first-order spherical model, while the linear function fitted worst but was the simplest to compute.
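A minimal ordinary kriging sketch with a spherical variogram model: build the kriging system from the semivariogram, solve for the weights (plus a Lagrange multiplier enforcing unbiasedness), and interpolate the density at an unsampled point. The variogram parameters and the four observation points are illustrative, not fitted from the field data.

```python
import numpy as np

def spherical_gamma(h, nugget, sill, a):
    """Spherical variogram model gamma(h) with range a."""
    h = np.asarray(h, dtype=float)
    inside = nugget + sill * (1.5 * h / a - 0.5 * (h / a) ** 3)
    return np.where(h == 0, 0.0, np.where(h < a, inside, nugget + sill))

def ordinary_kriging(coords, values, target, nugget=0.0, sill=1.0, a=50.0):
    """Ordinary kriging of one target point from observed points,
    using illustrative (not fitted) spherical variogram parameters."""
    n = len(values)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = spherical_gamma(d, nugget, sill, a)   # semivariances between observations
    A[n, n] = 0.0                                     # Lagrange-multiplier row/column
    b = np.ones(n + 1)
    b[:n] = spherical_gamma(np.linalg.norm(coords - target, axis=1), nugget, sill, a)
    sol = np.linalg.solve(A, b)                       # kriging weights plus multiplier
    weights = sol[:n]
    return weights @ values, weights

coords = np.array([[0.0, 0.0], [30.0, 0.0], [0.0, 30.0], [25.0, 25.0]])
counts = np.array([12.0, 20.0, 8.0, 17.0])            # e.g. planthopper counts per point
estimate, w = ordinary_kriging(coords, counts, np.array([15.0, 15.0]))
print("kriged density estimate:", round(estimate, 2), "weights:", np.round(w, 3))
```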
