Similar Literature
20 similar documents found.
1.
We study a hybrid model that combines Cox proportional hazards regression with tree-structured modeling. The main idea is to use step functions, provided by a tree structure, to 'augment' the Cox (1972) proportional hazards model. The proposed model not only provides a natural assessment of the adequacy of the Cox proportional hazards model but also improves its fit without loss of interpretability. Both simulations and an empirical example are provided to illustrate the use of the proposed method.
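A minimal sketch of the augmentation idea, in our own notation (the abstract does not give the formula): the tree contributes a step function over its terminal regions R_1, ..., R_M, which is added to the linear predictor,

\[ \lambda(t \mid x) = \lambda_0(t)\exp\Big\{\beta^{\top}x + \sum_{m=1}^{M} c_m \mathbf{1}\{x \in R_m\}\Big\}. \]

If every c_m is near zero, the ordinary Cox model is adequate; large c_m flag lack of fit.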

2.
3.
Summary. The Cox hazards model (Cox, 1972, Journal of the Royal Statistical Society, Series B 34, 187–220) for survival data is routinely used in many applied fields, sometimes, however, with too little emphasis on the fit of the model. A useful alternative to the Cox model is the Aalen additive hazards model (Aalen, 1980, in Lecture Notes in Statistics-2, 1–25), which can easily accommodate time-changing covariate effects. It is of interest to decide which of the two models is more appropriate in a given application. This is a nontrivial problem, as the two classes of models are nonnested except in special cases. In this article we explore the Mizon–Richard encompassing test for this particular problem. It turns out to correspond to fitting the Aalen model to the martingale residuals obtained from the Cox regression analysis. We also consider a variant of this method, which relates to the proportional excess model (Martinussen and Scheike, 2002, Biometrika 89, 283–298). Large-sample properties of the suggested methods under the two rival models are derived. The finite-sample properties of the proposed procedures are assessed through a simulation study. The methods are further applied to the well-known primary biliary cirrhosis data set.
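For reference, the two rival specifications in their standard forms (not spelled out in the abstract) are

\[ \text{Cox:}\quad \lambda(t \mid X) = \lambda_0(t)\exp(\beta^{\top}X), \qquad \text{Aalen:}\quad \lambda(t \mid X) = \beta_0(t) + \beta(t)^{\top}X, \]

multiplicative with a fixed coefficient vector in the first case, additive with freely time-varying coefficient functions in the second.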

4.
5.
The Cox proportional hazards model has become the standard in biomedical studies, particularly in settings where the estimation of covariate effects (as opposed to prediction) is the primary objective. In spite of the obvious flexibility of this approach and its wide applicability, the model is not usually chosen for its fit to the data, but by convention and for reasons of convenience. It is quite possible that the covariates add to, rather than multiply, the baseline hazard, making an additive hazards model a more suitable choice. Typically, proportionality is assumed, with the potential for additive covariate effects not evaluated or even seriously considered. Contributing to this phenomenon is the fact that many popular software packages (e.g., SAS, S-PLUS/R) have standard procedures to fit the Cox model (e.g., proc phreg, coxph), but as yet no analogous procedures to fit its additive analog, the Lin and Ying (1994) semiparametric additive hazards model. In this article, we establish the connections between the Lin and Ying (1994) model and both Cox and least squares regression. We demonstrate how SAS's phreg and reg procedures may be used to fit the additive hazards model, after some straightforward data manipulations. We then apply the additive hazards model to examine the relationship between Model for End-stage Liver Disease (MELD) score and mortality among patients wait-listed for liver transplantation.
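A hypothetical Python sketch of the same comparison, using the lifelines package rather than SAS; note that lifelines ships the time-varying Aalen additive model rather than the constant-coefficient Lin and Ying (1994) version, and the data frame below is entirely made up:

    # Fit multiplicative (Cox) and additive (Aalen) hazards models to the
    # same right-censored data. Column names and values are illustrative.
    import pandas as pd
    from lifelines import CoxPHFitter, AalenAdditiveFitter

    df = pd.DataFrame({
        "time":  [5.0, 8.2, 3.1, 12.4, 7.7, 2.9, 9.6, 6.3],
        "event": [1, 0, 1, 1, 0, 1, 1, 0],          # 1 = death, 0 = censored
        "meld":  [18, 12, 25, 10, 15, 30, 14, 20],  # hypothetical MELD scores
    })

    # Cox: lambda(t | x) = lambda0(t) * exp(beta * x)
    cox = CoxPHFitter().fit(df, duration_col="time", event_col="event")
    cox.print_summary()

    # Aalen: lambda(t | x) = beta0(t) + beta(t) * x
    aalen = AalenAdditiveFitter().fit(df, duration_col="time", event_col="event")
    print(aalen.cumulative_hazards_.head())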

6.
Yip PS, Zhou Y, Lin DY, Fang XZ. Biometrics 1999, 55(3): 904-908
We use the semiparametric additive hazards model to formulate the effects of individual covariates on the capture rates in a continuous-time capture-recapture experiment, and then construct a Horvitz-Thompson-type estimator for the unknown population size. The resulting estimator is consistent and asymptotically normal with an easily estimated variance. Simulation studies show that the asymptotic approximations are adequate for practical use when the average capture probabilities exceed 0.5. Ignoring the covariates would underestimate the population size, and the resulting coverage probability is poor. A wildlife example is provided.
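A minimal sketch of the Horvitz-Thompson idea: each captured individual i with estimated capture probability p_i contributes 1/p_i to the population-size estimate (the probabilities below are made up):

    # Horvitz-Thompson-type estimate of population size.
    p_hat = [0.62, 0.71, 0.55, 0.80, 0.67]   # estimated capture probabilities
    N_hat = sum(1.0 / p for p in p_hat)      # each animal stands in for 1/p_i animals
    print(f"Estimated population size: {N_hat:.1f}")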

7.
8.
The standard Cox model is perhaps the most commonly used model for regression analysis of failure time data, but it has limitations, such as the assumption of linear covariate effects. To relax this, the nonparametric additive Cox model, which allows for nonlinear covariate effects, is often employed, and this paper discusses variable selection and structure estimation for this general model. For the problem, we propose a penalized sieve maximum likelihood approach based on Bernstein polynomial approximation and group penalization. To implement the proposed method, an efficient group coordinate descent algorithm is developed that can easily be carried out in both low- and high-dimensional scenarios. Furthermore, a simulation study is performed to assess the performance of the presented approach and suggests that it works well in practice. The proposed method is applied to an Alzheimer's disease study for identifying important and relevant genetic factors.
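A small sketch of the Bernstein polynomial building block the abstract relies on (degree and evaluation grid chosen arbitrarily); in the sieve approach a smooth covariate effect g(x) is approximated by sum_k theta_k * B_{k,n}(x), and the theta_k of each covariate form one group for the penalty:

    # Bernstein basis of degree n on [0, 1]:
    #   B_{k,n}(x) = C(n, k) * x**k * (1 - x)**(n - k),  k = 0, ..., n.
    from math import comb
    import numpy as np

    def bernstein_basis(x, n):
        """Return the (n+1)-column Bernstein design matrix at points x in [0, 1]."""
        x = np.asarray(x)
        return np.column_stack([comb(n, k) * x**k * (1 - x)**(n - k)
                                for k in range(n + 1)])

    X = bernstein_basis(np.linspace(0, 1, 5), n=3)
    print(X.shape)  # (5, 4)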

9.
On using the Cox proportional hazards model with missing covariates

10.
11.
Liang KY, Self SG, Liu XH. Biometrics 1990, 46(3): 783-793
In this paper, we develop the Cox proportional hazards model with specially structured time-dependent covariates in the context of prospective epidemiologic studies. Our model possesses the following two features: (i) different relative risk parameters are allowed for early versus late onset of the disease of interest; (ii) an additional parameter is introduced so that the time (age) at which the magnitude of the relative risks changes, the so-called change point, need not be specified. Some difficulties with statistical inference for the proposed model are briefly discussed, and the large-sample distribution of a test for no change point is derived. As an illustration, we apply the model to a set of data gathered on a group of white male medical students of The Johns Hopkins Medical School enrolled between 1948 and 1964. We examine the hypothesis that the effect of reactivity to the cold pressor test may vary with early versus late onset of hypertension.
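One standard way to write such a change-point relative risk model (our notation; the paper's exact parameterization may differ) is

\[ \lambda(t \mid x) = \lambda_0(t)\exp\big\{\big(\beta_1 \mathbf{1}\{t \le \tau\} + \beta_2 \mathbf{1}\{t > \tau\}\big)^{\top} x\big\}, \]

with early-onset risk beta_1, late-onset risk beta_2, and unknown change point tau; "no change point" corresponds to beta_1 = beta_2.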

12.
The additive hazards model specifies the effect of covariates on the hazard in an additive way, in contrast to the popular Cox model, in which it is multiplicative. In its nonparametric form, the additive hazards model offers a very flexible way of modeling time-varying covariate effects. It is most commonly estimated by ordinary least squares. In this paper, we consider the case where covariates are bounded, and derive the maximum likelihood estimator under the constraint that the hazard is non-negative for all covariate values in their domain. We show that the maximum likelihood estimator may be obtained by separately maximizing the log-likelihood contribution of each event time point, and that the maximization problem is equivalent to fitting a series of Poisson regression models with an identity link under non-negativity constraints. We derive an analytic solution to the maximum likelihood estimator. We contrast the maximum likelihood estimator with the ordinary least-squares estimator in a simulation study and show that the maximum likelihood estimator has smaller mean squared error than the ordinary least-squares estimator. An illustration with data on patients with carcinoma of the oropharynx is provided.
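A rough sketch of the identity-link Poisson subproblem under non-negativity constraints, solved here by a generic constrained optimizer rather than the authors' analytic solution; the data and starting values are made up:

    # Poisson regression with identity link, lambda_i = x_i' beta,
    # subject to lambda_i >= 0 for every observation.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)
    X = np.column_stack([np.ones(50), rng.uniform(0, 1, 50)])  # bounded covariate
    y = rng.poisson(X @ np.array([0.5, 1.0]))                  # synthetic counts

    def neg_loglik(beta):
        lam = np.clip(X @ beta, 1e-10, None)   # identity link, kept positive
        return -np.sum(y * np.log(lam) - lam)

    res = minimize(neg_loglik, x0=np.array([1.0, 0.1]),
                   constraints=[{"type": "ineq", "fun": lambda b: X @ b}])
    print(res.x)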

13.
Yan J, Huang J. Biometrics 2012, 68(2): 419-428
Summary. Cox models with time-varying coefficients offer great flexibility in capturing the temporal dynamics of covariate effects on right-censored failure times. Because not all covariate coefficients are time varying, model selection for such models presents an additional challenge: distinguishing covariates with time-varying coefficients from those with time-independent coefficients. We propose an adaptive group lasso method that not only selects important variables but also selects between time-independent and time-varying specifications of their presence in the model. Each covariate effect is partitioned into a time-independent part and a time-varying part, the latter of which is characterized by a group of coefficients of basis splines without intercept. Model selection and estimation are carried out through a fast, iterative group shooting algorithm. Our approach is shown to have good properties in a simulation study that mimics realistic situations with up to 20 variables. A real example illustrates the utility of the method.
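The partition the abstract describes can be written as (our notation)

\[ \beta_j(t) = \gamma_j + \sum_{k=1}^{K} \theta_{jk} B_k(t), \]

where the B_k are basis splines without intercept; shrinking the whole group (theta_j1, ..., theta_jK) to zero recovers a time-independent coefficient, while setting gamma_j and the group both to zero drops covariate j entirely.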

14.
Parzen M, Lipsitz SR. Biometrics 1999, 55(2): 580-584
In this paper, a global goodness-of-fit test statistic for a Cox regression model, which has an approximate chi-squared distribution when the model has been correctly specified, is proposed. Our goodness-of-fit statistic is global and has power to detect whether interactions or higher-order powers of covariates in the model are needed. The proposed statistic is similar to the Hosmer and Lemeshow (1980, Communications in Statistics A10, 1043-1069) goodness-of-fit statistic for binary data as well as Schoenfeld's (1980, Biometrika 67, 145-153) statistic for the Cox model. The methods are illustrated using data from a Mayo Clinic trial in primary biliary cirrhosis of the liver (Fleming and Harrington, 1991, Counting Processes and Survival Analysis), in which the outcome is the time until liver transplantation or death. There are 17 possible covariates. Two Cox proportional hazards models are fit to the data, and the proposed goodness-of-fit statistic is applied to the fitted models.
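A generic Hosmer-Lemeshow-style computation of the kind alluded to (not the paper's exact statistic): group subjects by estimated risk score and compare observed with model-expected event counts:

    # Chi-squared-type statistic sum_g (O_g - E_g)^2 / E_g over risk-score groups.
    import numpy as np

    def hl_style_statistic(risk_score, observed, expected, n_groups=10):
        order = np.argsort(risk_score)               # sort subjects by risk
        stat = 0.0
        for g in np.array_split(order, n_groups):    # roughly equal-size groups
            O, E = observed[g].sum(), expected[g].sum()
            stat += (O - E) ** 2 / E
        return stat   # refer to a chi-squared distribution to assess fit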

15.
Parker CB, Delong ER. Biometrics 2000, 56(4): 996-1001
Changes in maximum likelihood parameter estimates due to deletion of individual observations are useful statistics, both for regression diagnostics and for computing robust estimates of covariance. For many likelihoods, including those in the exponential family, these delete-one statistics can be approximated analytically from a one-step Newton-Raphson iteration on the full maximum likelihood solution. But for general conditional likelihoods and the related Cox partial likelihood, the one-step method does not reduce to an analytic solution. For these likelihoods, an alternative analytic approximation that relies on an appropriately augmented design matrix has been proposed. In this paper, we extend the augmentation approach to explicitly deal with discrete failure-time models. In these models, an individual subject may contribute information at several time points, thereby appearing in multiple risk sets before eventually experiencing a failure or being censored. Our extension also allows the covariates to be time dependent. The new augmentation requires no additional computational resources while improving results.
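For likelihoods where the one-step method does apply, the familiar delete-one approximation takes the form (standard notation, not specific to this paper)

\[ \hat\beta_{(-i)} \approx \hat\beta - I(\hat\beta)^{-1} U_i(\hat\beta), \]

where U_i is subject i's score contribution and I the full-sample observed information; the augmented-design approach is meant to recover analogous quantities when this shortcut fails.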

16.
Outcome mismeasurement can lead to biased estimation in several contexts. Magder and Hughes (1997, American Journal of Epidemiology 146, 195-203) showed that failure to adjust for imperfect outcome measures in logistic regression analysis can conservatively bias estimation of covariate effects, even when the mismeasurement rate is the same across levels of the covariate. Other authors have addressed the need to account for mismeasurement in survival analysis in selected cases (Snapinn, 1998, Biometrics 54, 209-218; Gelfand and Wang, 2000, Statistics in Medicine 19, 1865-1879; Balasubramanian and Lagakos, 2001, Biometrics 57, 1048-1058, 2003, Biometrika 90, 171-182). We provide a general, more widely applicable, adjusted proportional hazards (APH) method for estimation of cumulative survival and hazard ratios in discrete time when the outcome is measured with error. We show that mismeasured failure status in a standard proportional hazards (PH) model can conservatively bias estimation of hazard ratios and that inference, in most practical situations, is more severely affected by poor specificity than by poor sensitivity. However, in simulations over a wide range of conditions, the APH method with correctly specified mismeasurement rates performs very well.

17.
Lin J, Zhang D, Davidian M. Biometrics 2006, 62(3): 803-812
We propose "score-type" tests for the proportional hazards assumption and for covariate effects in the Cox model using the natural smoothing spline representation of the corresponding nonparametric functions of time or covariate. The tests are based on the penalized partial likelihood and are derived by viewing the inverse of the smoothing parameter as a variance component and testing an equivalent null hypothesis that the variance component is zero. We show that the tests have a size close to the nominal level and good power against general alternatives, and we apply them to data from a cancer clinical trial.  相似文献   

18.
19.
For sample size calculation in clinical trials with survival endpoints, the logrank test, which is the optimal method under the proportional hazards (PH) assumption, is predominantly used. In reality, the PH assumption may not hold; in immuno-oncology trials, for example, delayed treatment effects are often expected. A sample size calculated without considering potential violation of the PH assumption may lead to an underpowered study. In recent years, combination tests such as the maximum weighted logrank test have received great attention because of their robust performance across various hazards scenarios. In this paper, we propose a flexible simulation-free procedure for calculating the sample size using combination tests. The procedure extends Lakatos' Markov model and allows for complex situations encountered in a clinical trial, such as staggered entry and dropout. We evaluate the procedure using two maximum weighted logrank tests, one projection-type test, and three other commonly used tests under various hazards scenarios. The simulation studies show that the proposed method achieves the target power for all compared tests in most scenarios. The combination tests exhibit robust performance under both correct specification and misspecification and are highly recommended when the hazard-changing patterns are unknown beforehand. Finally, we demonstrate our method using two clinical trial examples and provide suggestions about sample size calculation under nonproportional hazards.
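For context, the weighted logrank statistics being combined typically use Fleming-Harrington-type weights (a standard choice; the paper's exact test set may differ),

\[ w_{\rho,\gamma}(t) = \hat S(t^-)^{\rho}\,\{1-\hat S(t^-)\}^{\gamma}, \]

with (rho, gamma) = (0, 1) up-weighting late differences (delayed effects) and (1, 0) up-weighting early ones; a maximum weighted logrank test takes the largest standardized statistic over such a set.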

20.
A novel functional additive model is proposed, modified and constrained to capture nonlinear interactions between a treatment indicator and a potentially large number of functional and/or scalar pretreatment covariates. The primary motivation for this approach is to optimize individualized treatment rules based on data from a randomized clinical trial. We generalize functional additive regression models by incorporating treatment-specific components into the additive effect components. A structural constraint is imposed on the treatment-specific components in order to provide a class of additive models whose main effects and interaction effects are orthogonal to each other. If primary interest is in the interaction between treatment and the covariates, as is generally the case when optimizing individualized treatment rules, we can thereby circumvent the need to estimate the main effects of the covariates, obviating the need to specify their form and thus avoiding model misspecification. The methods are illustrated with data from a depression clinical trial with electroencephalogram functional data as patients' pretreatment covariates.
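One way to impose the orthogonality the abstract describes (our guess at the structure, not taken from the paper): with treatment A in {1, ..., L} randomized with probabilities pi_a, give each covariate a treatment-specific component g_{j,a} constrained to average out across arms,

\[ \sum_{a=1}^{L} \pi_a\, g_{j,a}(x_j) = 0 \quad \text{for all } x_j, \]

which makes the interaction part orthogonal to any unspecified main effect and lets the latter drop out of estimation.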
