Similar Documents
A total of 20 similar documents were found.
1.
Louzada-Neto F 《Biometrics》1999,55(4):1281-1285
We propose a polyhazard model to deal with lifetime data associated with latent competing risks. The causes of failure are assumed to be unobserved and to act on individuals independently. The general framework allows a broad class of hazard models that includes the most common hazard-based models. The model accommodates bathtub and multimodal hazards, keeping enough flexibility for common lifetime data that cannot be accommodated by usual hazard-based models. Maximum likelihood estimation is discussed, and parametric simulation is used for hypothesis testing.
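A minimal sketch of the general form, assuming K independent latent causes of failure as described above (the number of components and their parametric families are illustrative choices, not the paper's specific ones): the overall hazard is the sum of the latent cause-specific hazards and the survival function is the corresponding product,

\[ h(t) = \sum_{k=1}^{K} h_k(t), \qquad S(t) = \prod_{k=1}^{K} S_k(t) = \exp\Big(-\sum_{k=1}^{K} \int_0^t h_k(u)\,du\Big). \]

Summing, say, two Weibull components with shape parameters below and above one is one way such a model can produce the bathtub-shaped or multimodal hazards mentioned above.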

2.
Generalised absolute risk models were fitted to the latest Japanese atomic bomb survivor cancer incidence data using Bayesian Markov chain Monte Carlo methods, taking account of random errors in the DS86 dose estimates. The resulting uncertainty distributions in the relative risk model parameters were used to derive uncertainties in population cancer risks for a current UK population. Because of evidence for irregularities in the low-dose dose response, flexible dose-response models were used, consisting of a linear-quadratic-exponential model for the high-dose part of the dose response together with piecewise-linear adjustments for the two lowest dose groups. Following an assumed administered dose of 0.001 Sv, lifetime leukaemia radiation-induced incidence risks were estimated to be 1.11 x 10^-2 Sv^-1 (95% Bayesian CI -0.61, 2.38) using this model, and lifetime solid cancer radiation-induced incidence risks were calculated to be 7.28 x 10^-2 Sv^-1 (95% Bayesian CI -10.63, 22.10). Overall, cancer incidence risks predicted by Bayesian Markov chain Monte Carlo methods are similar to those derived by classical likelihood-based methods, which form the basis of established estimates of radiation-induced cancer risk.
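For orientation only, a commonly used linear-quadratic-exponential form for the high-dose part of a dose response is sketched below; the paper's exact parameterization and its piecewise-linear adjustments for the two lowest dose groups are not reproduced here, so this should be read as an assumed illustration:

\[ \mathrm{ER}(D) = (\alpha D + \beta D^2)\,\exp(-\gamma D), \]

where D is dose, \alpha and \beta are the linear and quadratic coefficients, and the exponential term allows the excess risk to turn over at high doses.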

3.
This paper demonstrates the advantages of sharing information about unknown features of covariates across multiple model components in various nonparametric regression problems, including multivariate, heteroskedastic, and semicontinuous responses. We present a methodology that allows information to be shared nonparametrically across model components using Bayesian sum-of-trees models. Our simulation results demonstrate that sharing information across related model components is often very beneficial, particularly in sparse high-dimensional problems in which variable selection must be conducted. We illustrate the methodology by analyzing medical expenditure data from the Medical Expenditure Panel Survey (MEPS). To facilitate the Bayesian nonparametric regression analysis, we develop two novel models for analyzing the MEPS data using Bayesian additive regression trees: a heteroskedastic log-normal hurdle model with a "shrink-toward-homoskedasticity" prior and a gamma hurdle model.
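A hedged sketch of the kind of two-part (hurdle) specification described, with each functional piece modeled nonparametrically; the notation is illustrative rather than the authors' exact formulation:

\[ P(Y_i = 0 \mid x_i) = \Phi\big(g(x_i)\big), \qquad \log Y_i \mid Y_i > 0, x_i \sim N\big(f(x_i),\, \sigma^2(x_i)\big), \]

where g, f, and \log \sigma^2 would each receive sum-of-trees (BART-type) priors, and the prior on \sigma^2(x) shrinks toward a constant value, which is one way to express a "shrink-toward-homoskedasticity" prior.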

4.
Mallick BK  Denison DG  Smith AF 《Biometrics》1999,55(4):1071-1077
A Bayesian multivariate adaptive regression spline fitting approach is used to model univariate and multivariate survival data with censoring. The possible models contain the proportional hazards model as a subclass and automatically detect departures from this. A reversible jump Markov chain Monte Carlo algorithm is described to obtain the estimate of the hazard function as well as the survival curve.

5.
Shared random effects joint models are becoming increasingly popular for investigating the relationship between longitudinal and time-to-event data. Although appealing, such complex models are computationally intensive, and quick, approximate methods may provide a reasonable alternative. In this paper, we first compare the shared random effects model with two approximate approaches: a naïve proportional hazards model with time-dependent covariate and a two-stage joint model, which uses plug-in estimates of the fitted values from a longitudinal analysis as covariates in a survival model. We show that the approximate approaches should be avoided since they can severely underestimate any association between the current underlying longitudinal value and the event hazard. We present classical and Bayesian implementations of the shared random effects model and highlight the advantages of the latter for making predictions. We then apply the models described to a study of abdominal aortic aneurysms (AAA) to investigate the association between AAA diameter and the hazard of AAA rupture. Out-of-sample predictions of future AAA growth and hazard of rupture are derived from Bayesian posterior predictive distributions, which are easily calculated within an MCMC framework. Finally, using a multivariate survival sub-model we show that underlying diameter rather than the rate of growth is the most important predictor of AAA rupture.
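A minimal sketch of a shared random effects joint model of the kind compared here, under an assumed linear random-effects trajectory (the AAA application may use a different longitudinal specification):

\[ y_i(t_{ij}) = m_i(t_{ij}) + \varepsilon_{ij}, \qquad m_i(t) = (\beta_0 + b_{0i}) + (\beta_1 + b_{1i})\,t, \qquad h_i(t) = h_0(t)\exp\{\alpha\, m_i(t)\}, \]

with the random effects (b_{0i}, b_{1i}) shared between the two submodels. The naïve approach replaces m_i(t) by the observed, error-prone y_i(t), and the two-stage approach plugs in \hat m_i(t) from a separate longitudinal fit; both ignore the uncertainty in m_i(t), which is the source of the attenuation of \alpha that the abstract warns about.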

6.
In the current literature on latent variable models, much effort has been put into the development of dichotomous and polytomous cognitive diagnostic models (CDMs) for assessments. Recently, the possibility of using continuous responses in CDMs has been brought to discussion, but no Bayesian approach has yet been developed for analyzing CDMs with continuous responses. Our work is the first Bayesian framework for the continuous deterministic inputs, noisy 'and' gate (DINA) model. We also propose new interpretations for the item parameters in this DINA model, which makes the analysis more interpretable than before. In addition, we conduct several simulations to evaluate the performance of the continuous DINA model under our Bayesian approach. We then apply the proposed model to a real data example of risk perceptions for individuals over a range of health-related activities. The application results exemplify the high potential of the proposed continuous DINA model to classify individuals in the study.
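For background, the standard dichotomous DINA model on which the continuous version builds can be written as follows; the continuous-response extension replaces the Bernoulli response distribution and is not reproduced here:

\[ \eta_{ij} = \prod_{k} \alpha_{ik}^{\,q_{jk}}, \qquad P(X_{ij} = 1 \mid \boldsymbol{\alpha}_i) = (1 - s_j)^{\eta_{ij}}\, g_j^{\,1 - \eta_{ij}}, \]

where \alpha_{ik} indicates mastery of attribute k by respondent i, q_{jk} is the Q-matrix entry for item j, and s_j and g_j are the slip and guess parameters.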

7.
Bayesian clinical trial designs offer the possibility of a substantially reduced sample size, increased statistical power, and reductions in cost and ethical hazard. However, when prior and current information conflict, Bayesian methods can lead to higher than expected type I error, as well as the possibility of a costlier and lengthier trial. This motivates an investigation of the feasibility of hierarchical Bayesian methods for incorporating historical data that are adaptively robust to prior information that reveals itself to be inconsistent with the accumulating experimental data. In this article, we present several models that allow the commensurability of the information in the historical and current data to determine how much historical information is used. A primary tool is elaborating the traditional power prior approach based upon a measure of commensurability for Gaussian data. We compare the frequentist performance of several methods using simulations, and close with an example of a colon cancer trial that illustrates a linear models extension of our adaptive borrowing approach. Our proposed methods produce more precise estimates of the model parameters, in particular conferring statistical significance to the observed reduction in tumor size for the experimental regimen as compared to the control regimen.
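As a reference point, the traditional power prior that the commensurate approach elaborates can be written as

\[ \pi(\theta \mid D_0, a_0) \propto L(\theta \mid D_0)^{a_0}\, \pi_0(\theta), \qquad 0 \le a_0 \le 1, \]

where D_0 is the historical data, L is its likelihood, \pi_0 is an initial prior, and a_0 discounts the historical information. In the commensurate elaborations described above, the degree of borrowing is governed by a parameter measuring how compatible the historical and current Gaussian data appear rather than being fixed in advance; the exact hierarchical construction is given in the article.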

8.
Zeh J  Poole D  Miller G  Koski W  Baraff L  Rugh D 《Biometrics》2002,58(4):832-840
Annual survival probability of bowhead whales, Balaena mysticetus, was estimated using both Bayesian and maximum likelihood implementations of Cormack and Jolly-Seber (JS) models for capture-recapture estimation in open populations and reduced-parameter generalizations of these models. Aerial photographs of naturally marked bowheads collected between 1981 and 1998 provided the data. The marked whales first photographed in a particular year provided the initial 'capture' and 'release' of those marked whales and photographs in subsequent years the 'recaptures'. The Cormack model, often called the Cormack-Jolly-Seber (CJS) model, and the program MARK were used to identify the model with a single survival and time-varying capture probabilities as the most appropriate for these data. When survival was constrained to be one or less, the maximum likelihood estimate computed by MARK was one, invalidating confidence interval computations based on the asymptotic standard error or profile likelihood. A Bayesian Markov chain Monte Carlo (MCMC) implementation of the model was used to produce a posterior distribution for annual survival. The corresponding reduced-parameter JS model was also fit via MCMC because it is the more appropriate of the two models for these photoidentification data. Because the CJS model ignores much of the information on capture probabilities provided by the data, its results are less precise and more sensitive to the prior distributions used than results from the JS model. With priors for annual survival and capture probabilities uniform from 0 to 1, the posterior mean for bowhead survival rate from the JS model is 0.984, and 95% of the posterior probability lies between 0.948 and 1. This high estimated survival rate is consistent with other bowhead life history data.
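To make the capture-history likelihood concrete, the CJS probability of an observed photo-identification history, conditional on first release, is a product of survival (\phi) and capture (p) terms; for example, for a whale first photographed in year 1 of a four-year series and re-photographed in years 3 and 4,

\[ P(1\,0\,1\,1) = \phi_1 (1 - p_2)\, \phi_2\, p_3\, \phi_3\, p_4, \]

and the reduced-parameter models referred to above constrain, for example, \phi_t \equiv \phi across years (the single-survival model selected here). The full JS model additionally models first captures, which is why it extracts more of the information on capture probabilities than the CJS model.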

9.
Fully Bayesian methods for Cox models specify a model for the baseline hazard function. Parametric approaches generally provide monotone estimations. Semi-parametric choices allow for more flexible patterns but they can suffer from overfitting and instability. Regularization methods through prior distributions with correlated structures usually give reasonable answers to these types of situations. We discuss Bayesian regularization for Cox survival models defined via flexible baseline hazards specified by a mixture of piecewise constant functions and by a cubic B-spline function. For those "semi-parametric" proposals, different prior scenarios ranging from prior independence to particular correlated structures are discussed in a real study with microvirulence data and in an extensive simulation scenario that includes different data sample and time axis partition sizes in order to capture risk variations. The posterior distribution of the parameters was approximated using Markov chain Monte Carlo methods. Model selection was performed in accordance with the deviance information criteria and the log pseudo-marginal likelihood. The results obtained reveal that, in general, Cox models present great robustness in covariate effects and survival estimates independent of the baseline hazard specification. In relation to the "semi-parametric" baseline hazard specification, the B-splines hazard function is less dependent on the regularization process than the piecewise specification because it demands a smaller time axis partition to estimate a similar behavior of the risk.
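To make the two "semi-parametric" baseline specifications concrete, a hedged sketch (the partition, basis, and prior below are illustrative assumptions rather than the study's exact choices):

\[ h_0(t) = \sum_{j=1}^{J} \lambda_j\, \mathbf{1}\{t \in (s_{j-1}, s_j]\} \quad \text{(piecewise constant)}, \qquad \log h_0(t) = \sum_{k=1}^{K} \gamma_k B_k(t) \quad \text{(cubic B-spline)}, \]

with correlated regularizing priors such as a first-order random walk, e.g. \log\lambda_j \mid \log\lambda_{j-1} \sim N(\log\lambda_{j-1}, \tau^2), supplying the smoothing across adjacent intervals or basis coefficients that the abstract refers to as regularization.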

10.
Biophysical models are increasingly used for medical applications at the organ scale. However, model predictions are rarely associated with a confidence measure, although there are important sources of uncertainty in computational physiology methods, for instance the sparsity and noise of the clinical data used to adjust the model parameters (personalization) and the difficulty of modeling soft tissue physiology accurately. Recent theoretical progress in stochastic models makes their use computationally tractable, but there is still a challenge in estimating patient-specific parameters with such models. In this work we propose an efficient Bayesian inference method for model personalization using polynomial chaos and compressed sensing. This method makes Bayesian inference feasible in real 3D modeling problems. We demonstrate our method on cardiac electrophysiology: we first present validation results on synthetic data, then apply the proposed method to clinical data, and we show how this can help in quantifying the impact of the data characteristics on the personalization (and thus prediction) results. The described method can be beneficial for the clinical use of personalized models, as it explicitly takes into account the uncertainties in the data and the model parameters while still enabling simulations that can be used to optimize treatment. Such uncertainty handling can be pivotal for the proper use of modeling as a clinical tool, because there is a crucial requirement to know the confidence one can have in personalized models.
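In outline (an assumed generic form, not the paper's exact notation), the personalization step approximates the model output as a polynomial chaos expansion in the uncertain parameters \xi and uses compressed sensing to estimate a sparse coefficient vector from a small number of model runs:

\[ y(\xi) \approx \sum_{k=1}^{P} c_k \Psi_k(\xi), \qquad \hat{c} = \arg\min_{c} \|c\|_1 \ \text{subject to} \ \|\Psi c - y\|_2 \le \epsilon, \]

where the \Psi_k are orthogonal polynomial basis functions evaluated at the sampled parameter values; the resulting cheap surrogate for y(\xi) is what makes Bayesian exploration of patient-specific parameters tractable in 3D problems.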

11.
A vast amount of ecological knowledge generated over the past two decades has hinged upon the ability of model selection methods to discriminate among various ecological hypotheses. The last decade has seen the rise of Bayesian hierarchical models in ecology. Consequently, commonly used tools, such as the AIC, become largely inapplicable and there appears to be no consensus about a particular model selection tool that can be universally applied. We focus on a specific class of competing Bayesian spatial capture–recapture (SCR) models and apply and evaluate some of the recommended Bayesian model selection tools: (1) Bayes Factor—using (a) Gelfand-Dey and (b) harmonic mean methods, (2) Deviance Information Criterion (DIC), (3) Watanabe-Akaike's Information Criterion (WAIC) and (4) posterior predictive loss criterion. In all, we evaluate 25 variants of model selection tools in our study. We evaluate these model selection tools from the standpoint of selecting the "true" model and parameter estimation. In all, we generate 120 simulated data sets using the true model and assess the frequency with which the true model is selected and how well the tool estimates N (population size), a parameter of much importance to ecologists. We find that when information content is low in the data, no particular model selection tool can be recommended to help realize, simultaneously, both the goals of model selection and parameter estimation. But, in general (when we consider both the objectives together), we recommend the use of our application of the Bayes Factor (Gelfand-Dey with MAP approximation) for Bayesian SCR models. Our study highlights the point that although new model selection tools are emerging (e.g., WAIC) in the applied statistics literature, those tools based on sound theory even under approximation may still perform much better.
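For reference, the two Bayes Factor ingredients named above estimate the marginal likelihood m(y) from posterior draws \theta^{(1)}, \ldots, \theta^{(T)}:

\[ \hat m_{\mathrm{HM}}(y) = \left[\frac{1}{T}\sum_{t=1}^{T} \frac{1}{L(y \mid \theta^{(t)})}\right]^{-1}, \qquad \hat m_{\mathrm{GD}}(y) = \left[\frac{1}{T}\sum_{t=1}^{T} \frac{g(\theta^{(t)})}{L(y \mid \theta^{(t)})\,\pi(\theta^{(t)})}\right]^{-1}, \]

where g is a proper density with lighter tails than the posterior, for example a normal approximation centred at the MAP estimate, which is the "Gelfand-Dey with MAP approximation" variant recommended above. The Bayes Factor for two competing SCR models is then the ratio of their estimated marginal likelihoods.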

12.
Salway R  Wakefield J 《Biometrics》2008,64(2):620-626
This article considers the modeling of single-dose pharmacokinetic data. Traditionally, so-called compartmental models have been used to analyze such data. Unfortunately, the mean functions of such models are sums of exponentials, for which inference and computation may not be straightforward. We present an alternative to these models based on generalized linear models, for which desirable statistical properties exist, with a logarithmic link and gamma distribution. The latter has a constant coefficient of variation, which is often appropriate for pharmacokinetic data. Inference is convenient from either a likelihood or a Bayesian perspective. We consider models for both single and multiple individuals, the latter via generalized linear mixed models. For single individuals, Bayesian computation may be carried out with recourse to simulation. We describe a rejection algorithm that, unlike Markov chain Monte Carlo, produces independent samples from the posterior and allows straightforward calculation of Bayes factors for model comparison. We also illustrate how prior distributions may be specified in terms of model-free pharmacokinetic parameters of interest. The methods are applied to data from 12 individuals following administration of the antiasthmatic agent theophylline.
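A hedged sketch of the kind of gamma GLM with log link described above, using statsmodels and hypothetical single-subject concentration-time values; the covariate choice (log time plus time, mimicking absorption and elimination phases) is an illustrative assumption rather than necessarily the authors' parameterization, and the paper's rejection-sampling Bayesian machinery is not reproduced:

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    # Hypothetical single-subject concentration-time data (hours, mg/L);
    # replace with real theophylline measurements if available.
    df = pd.DataFrame({
        "time": [0.25, 0.5, 1.0, 2.0, 3.5, 5.0, 7.0, 9.0, 12.0, 24.0],
        "conc": [1.5, 3.1, 6.0, 8.3, 7.5, 6.6, 5.6, 4.8, 3.7, 1.1],
    })

    # Gamma GLM with log link: log E[conc] = b0 + b1*log(time) + b2*time.
    # The gamma family implies a constant coefficient of variation,
    # which is the property highlighted in the abstract.
    model = smf.glm(
        "conc ~ np.log(time) + time",
        data=df,
        family=sm.families.Gamma(link=sm.families.links.Log()),
    )
    fit = model.fit()
    print(fit.summary())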

13.
Yin G  Ibrahim JG 《Biometrics》2005,61(1):208-216
For multivariate failure time data, we propose a new class of shared gamma frailty models by imposing the Box-Cox transformation on the hazard function, and the product of the baseline hazard and the frailty. This novel class of models allows for a very broad range of shapes and relationships between the hazard and baseline hazard functions. It includes the well-known Cox gamma frailty model and a new additive gamma frailty model as two special cases. Due to the nonnegative hazard constraint, this shared gamma frailty model is computationally challenging in the Bayesian paradigm. The joint priors are constructed through a conditional-marginal specification, in which the conditional distribution is univariate, and it absorbs the nonlinear parameter constraints. The marginal part of the prior specification is free of constraints. The prior distributions allow us to easily compute the full conditionals needed for Gibbs sampling, while incorporating the constraints. This class of shared gamma frailty models is illustrated with a real dataset.

14.
We introduce a new statistical computing method, called data cloning, to calculate maximum likelihood estimates and their standard errors for complex ecological models. Although the method uses the Bayesian framework and exploits the computational simplicity of the Markov chain Monte Carlo (MCMC) algorithms, it provides valid frequentist inferences such as the maximum likelihood estimates and their standard errors. The inferences are completely invariant to the choice of the prior distributions and therefore avoid the inherent subjectivity of the Bayesian approach. The data cloning method is easily implemented using standard MCMC software. Data cloning is particularly useful for analysing ecological situations in which hierarchical statistical models, such as state-space models and mixed effects models, are appropriate. We illustrate the method by fitting two nonlinear population dynamics models to data in the presence of process and observation noise.
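A self-contained sketch of the data cloning recipe on a deliberately simple model (a Poisson rate, so the maximum likelihood answer is known in closed form); the model, prior, sampler settings, and clone number are illustrative assumptions, and real applications would use hierarchical models fitted with standard MCMC software as the abstract notes:

    import numpy as np

    rng = np.random.default_rng(1)
    y = rng.poisson(4.2, size=25)        # toy data with unknown rate lambda
    K = 50                               # number of clones

    def log_post(log_lam, data, k):
        """Log posterior of log(lambda) given k copies of the data and a vague N(0, 10^2) prior."""
        lam = np.exp(log_lam)
        loglik = np.sum(data) * log_lam - data.size * lam   # Poisson log-likelihood (up to a constant)
        return k * loglik - 0.5 * (log_lam / 10.0) ** 2

    def metropolis(data, k, n_iter=50_000, step=0.05):
        """Random-walk Metropolis on log(lambda); the step size should shrink as k grows."""
        chain = np.empty(n_iter)
        x, lp = 0.0, log_post(0.0, data, k)
        for i in range(n_iter):
            prop = x + step * rng.normal()
            lp_prop = log_post(prop, data, k)
            if np.log(rng.uniform()) < lp_prop - lp:
                x, lp = prop, lp_prop
            chain[i] = x
        return np.exp(chain[n_iter // 2:])   # lambda draws after burn-in

    draws = metropolis(y, K)
    mle_hat = draws.mean()                   # posterior mean under K clones -> approx. MLE
    se_hat = np.sqrt(K) * draws.std()        # data-cloning standard error: sqrt(K) * posterior SD
    print(f"data-cloning MLE {mle_hat:.3f} (SE {se_hat:.3f}); analytic MLE {y.mean():.3f}")

The point of the toy example is the mechanics: raising the likelihood to the power K (equivalently, cloning the data K times) concentrates the posterior around the MLE, and scaling the posterior standard deviation by sqrt(K) recovers the frequentist standard error regardless of the prior used.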

15.
Marginalized models (Heagerty, 1999, Biometrics 55, 688-698) permit likelihood-based inference when interest lies in marginal regression models for longitudinal binary response data. Two such models are the marginalized transition and marginalized latent variable models. The former captures within-subject serial dependence among repeated measurements with transition model terms while the latter assumes exchangeable or nondiminishing response dependence using random intercepts. In this article, we extend the class of marginalized models by proposing a single unifying model that describes both serial and long-range dependence. This model will be particularly useful in longitudinal analyses with a moderate to large number of repeated measurements per subject, where both serial and exchangeable forms of response correlation can be identified. We describe maximum likelihood and Bayesian approaches toward parameter estimation and inference, and we study the large sample operating characteristics under two types of dependence model misspecification. Data from the Madras Longitudinal Schizophrenia Study (Thara et al., 1994, Acta Psychiatrica Scandinavica 90, 329-336) are analyzed.

16.
As the practice of using population models for wildlife risk assessment has become more common, so has the practice of using surrogate data, typically taken from the published scientific literature, as inputs for demographic models. This practice clearly exposes the user to inferential errors. However, it is likely to continue because demographic data are expensive to gather. We review potential errors associated with the use of previously published demographic data and how those errors propagate into the endpoints of demographic projection models. We suggest methods for inferring bias in model endpoints when multiple and opposing biases are present in the demographic input data. We provide an example using Eastern Meadowlarks (Sturnella magna), a common songbird in Midwestern grasslands and agro-ecosystems. We conclude with a brief review of methods that could improve inference made using published demographic data, including methods from life-history theory, meta-analysis, and Bayesian statistics.

17.
Comparison of different methods for determining plant water sources based on stable oxygen isotopes
Using stable isotope techniques to determine plant water sources is essential for improving our understanding of ecohydrological processes and for the ecological management of arid and semi-arid regions. Many methods based on stable isotopes are currently available for determining plant water sources, but few studies have compared them. Based on in situ sampling and laboratory isotope measurements, this study used the direct comparison method, a multi-source linear mixing model (IsoSource), Bayesian mixing models (MixSIR, MixSIAR), and a water uptake depth model to analyze plant water sources, and compared the advantages and disadvantages of each method. The results showed that, compared with the linear mixing model (IsoSource), the Bayesian mixing models (MixSIR, MixSIAR) discriminated among water sources better but place higher demands on the data: the smaller the standard deviations of the isotopic compositions of plant xylem water and of the potential water sources, the more credible the model results. In this study the Bayesian mixing model MixSIR performed best. When stable hydrogen and oxygen isotopes are used to determine plant water sources, the direct comparison method can first be applied to identify qualitatively the potential sources a plant may use; the linear mixing model (IsoSource) or the Bayesian mixing models (MixSIR, MixSIAR) can then be used to calculate the contribution rate and contribution range of each potential source, and, where necessary, model performance can be evaluated to select the best model for quantitative analysis of plant water sources. If plants mainly take up soil water from different soil depths, the water uptake depth model can additionally be used to calculate the plant...
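The tools named above (IsoSource, MixSIR, MixSIAR) handle many sources and propagate measurement uncertainty; the core identity they build on is a simple isotope mass balance, shown below for a hypothetical two-source case with made-up δ18O values:

    # Two-source isotope mass balance: delta_plant = f1*delta_1 + (1 - f1)*delta_2.
    # Solving for the fraction of water drawn from source 1:
    def two_source_fraction(delta_plant, delta_source1, delta_source2):
        return (delta_plant - delta_source2) / (delta_source1 - delta_source2)

    # Hypothetical delta-18O values (per mil): xylem water, shallow soil water, deep soil water.
    f_shallow = two_source_fraction(-7.5, -5.0, -10.0)
    print(f"shallow soil water contribution: {f_shallow:.0%}")   # 50% in this made-up example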

18.
This paper presents a novel semiparametric joint model for multivariate longitudinal and survival data (SJMLS) by relaxing the normality assumption of the longitudinal outcomes, leaving the baseline hazard functions unspecified, and allowing the history of the longitudinal response to have an effect on the risk of dropout. Using Bayesian penalized splines to approximate the unspecified baseline hazard function and combining the Gibbs sampler and the Metropolis-Hastings algorithm, we propose a Bayesian Lasso (BLasso) method to simultaneously estimate unknown parameters and select important covariates in SJMLS. Simulation studies are conducted to investigate the finite sample performance of the proposed techniques. An example from the International Breast Cancer Study Group (IBCSG) is used to illustrate the proposed methodologies.
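For context, the Bayesian Lasso element referred to above places independent Laplace (double-exponential) priors on the regression coefficients,

\[ \pi(\beta_j \mid \lambda) = \frac{\lambda}{2}\, e^{-\lambda |\beta_j|}, \]

so the log-prior contributes an L1 penalty -\lambda \sum_j |\beta_j| and the posterior mode corresponds to a lasso-type estimate; in practice the Laplace prior is usually represented as a scale mixture of normals so that Gibbs and Metropolis-Hastings updates remain tractable, which is consistent with the sampling scheme described in the abstract.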

19.
Houseman EA  Marsit C  Karagas M  Ryan LM 《Biometrics》2007,63(4):1269-1277
Increasingly used in health-related applications, latent variable models provide an appealing framework for handling high-dimensional exposure and response data. Item response theory (IRT) models, which have gained widespread popularity, were originally developed for use in the context of educational testing, where extremely large sample sizes permitted the estimation of a moderate-to-large number of parameters. In the context of public health applications, smaller sample sizes preclude large parameter spaces. Therefore, we propose a penalized likelihood approach to reduce mean square error and improve numerical stability. We present a continuous family of models, indexed by a tuning parameter, that range between the Rasch model and the IRT model. The tuning parameter is selected by cross-validation or by approximations such as the Akaike Information Criterion. While our approach can be placed easily in a Bayesian context, we find that our frequentist approach is more computationally efficient. We demonstrate our methodology on a study of methylation silencing of gene expression in bladder tumors. We obtain similar results using both frequentist and Bayesian approaches, although the frequentist approach is less computationally demanding. In particular, we find high correlation of methylation silencing among 16 loci in bladder tumors, and that methylation is associated with smoking and also with patient survival.
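One hedged way to read the "continuous family between the Rasch model and the IRT model" is as a penalty that shrinks item discriminations toward one; the exact penalty used in the paper may differ, so the form below is an illustrative assumption:

\[ \ell_{\text{pen}}(a, b, \theta) = \sum_{i,j} \log P(Y_{ij} \mid \theta_i, a_j, b_j) - \frac{1}{2\tau^2} \sum_j (\log a_j)^2, \qquad \operatorname{logit} P(Y_{ij} = 1) = a_j(\theta_i - b_j), \]

where \tau \to 0 forces a_j = 1 (the Rasch model) and \tau \to \infty recovers the unpenalized two-parameter IRT model; \tau plays the role of the tuning parameter chosen by cross-validation or AIC-type approximations.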

20.
Predictions of metal consumption are vital for criticality assessments and sustainability analyses. Although demand for a material varies strongly by region and end-use sector, statistical models of demand typically predict demand using regression analyses at an aggregated global level ("fully pooled models"). "Un-pooled" regression models that predict demand at a disaggregated country or regional level face challenges due to limited data availability and large uncertainty. In this paper, we propose a Bayesian hierarchical model that can simultaneously identify heterogeneous demand parameters (like price and income elasticities) for individual regions and sectors, as well as global parameters. We demonstrate the model's value by estimating income and price elasticity of copper demand in five sectors (Transportation, Electrical, Construction, Manufacturing, and Other) and five regions (North America, Europe, Japan, China, and Rest of World). To validate the benefits of the Bayesian approach, we compare the model to both a "fully pooled" and an "un-pooled" model. The Bayesian model can predict global demand with similar uncertainty as a fully pooled regression model, while additionally capturing regional heterogeneity in income elasticity of demand. Compared to un-pooled models that predict demand for individual countries and sectors separately, our model reduces the uncertainty of parameter estimates by more than 50%. The hierarchical Bayesian modeling approach we propose can be used for various commodities, improving material demand projections used to study the impact of policies on mining sector emissions and informing investment in critical material production.
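An illustrative partial-pooling specification consistent with the description above (the functional form, covariates, and priors are assumptions for exposition, not the paper's exact model):

\[ \log q_{r,s,t} = \alpha_{r,s} + \beta_{r,s} \log(\text{income}_{r,t}) + \gamma_{r,s} \log(\text{price}_t) + \varepsilon_{r,s,t}, \]
\[ \beta_{r,s} \sim N(\mu_\beta, \sigma_\beta^2), \qquad \gamma_{r,s} \sim N(\mu_\gamma, \sigma_\gamma^2), \]

so each region-sector pair (r, s) has its own income and price elasticities \beta_{r,s} and \gamma_{r,s}, but these are shrunk toward global means \mu_\beta and \mu_\gamma; this partial pooling is what lets sparse regional series borrow strength from the global data.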
