Similar Literature (20 results)
1.
In longitudinal studies where time to a final event is the ultimate outcome, information is often available about intermediate events the individuals may experience during the observation period. Even though many extensions of the Cox proportional hazards model have been proposed to model such multivariate time-to-event data, these approaches are still very rarely applied to real datasets. The aim of this paper is to illustrate the application of extended Cox models for multiple time-to-event data and to show their implementation in popular statistical software packages. We demonstrate a systematic way of jointly modelling similar or repeated transitions in follow-up data by analysing an event-history dataset of 270 breast cancer patients who were followed up for different clinical events during treatment of metastatic disease. First, we show how this methodology can also be applied to non-Markovian stochastic processes by representing these processes as "conditional" Markov processes. Second, we compare the application of different Cox-related approaches to the breast cancer data by varying their key model components (i.e. analysis time scale, risk set and baseline hazard function). Our study showed that extended Cox models are a powerful tool for analysing complex event-history datasets, since the approach can address many dynamic data features such as multiple time scales, dynamic risk sets, time-varying covariates, transition-by-covariate interactions, autoregressive dependence or intra-subject correlation.
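A minimal sketch of the counting-process (start, stop] data layout on which such extended Cox models rest, shown here with Python's lifelines package rather than the packages the paper demonstrates; the toy records, covariate names and values are illustrative assumptions:

```python
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# one row per at-risk interval: repeated events, time-varying covariates
# and dynamic risk sets all fit into the single (start, stop] format
rows = [
    # id, start, stop, chemo, n_prior_events, event
    (1, 0.0,  3.2, 1, 0, 1),
    (1, 3.2,  7.5, 1, 1, 1),   # the same subject re-enters the risk set
    (1, 7.5,  9.0, 0, 2, 0),
    (2, 0.0,  4.1, 0, 0, 1),
    (2, 4.1, 11.3, 1, 1, 0),
    (3, 0.0,  6.0, 1, 0, 0),
    (4, 0.0,  2.0, 0, 0, 1),
    (4, 2.0,  8.0, 0, 1, 1),
]
df = pd.DataFrame(rows, columns=["id", "start", "stop",
                                 "chemo", "n_prior_events", "event"])

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="event",
        start_col="start", stop_col="stop")
ctv.print_summary()
```

Each subject contributes one row per at-risk interval, so autoregressive dependence can be expressed through derived covariates such as the cumulative event count above.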

2.
Background: In both observational and randomized studies, associations with overall survival are by and large assessed on a multiplicative scale using the Cox model. However, clinicians and clinical researchers have an ardent interest in assessing the absolute benefit associated with treatments. In older patients, some studies have reported a lower relative treatment effect, which might translate into a similar or even greater absolute treatment effect given their high baseline hazard for clinical events. Methods: The effect of treatment and the effect modification of treatment were respectively assessed using a multiplicative and an additive hazard model in an analysis adjusted for propensity score in the context of coronary surgery. Results: The multiplicative model yielded a lower relative hazard reduction with bilateral internal thoracic artery grafting in older patients (hazard ratio for interaction/year = 1.03, 95% CI: 1.00 to 1.06, p = 0.05), whereas the additive model reported a similar absolute hazard reduction with increasing age (delta for interaction/year = 0.10, 95% CI: -0.27 to 0.46, p = 0.61). The number needed to treat derived from the propensity score-adjusted multiplicative model was remarkably similar at the end of the follow-up in patients aged ≤60 and in patients aged >70. Conclusions: The present example demonstrates that a lower treatment effect in older patients on a relative scale can conversely translate into a similar treatment effect on an additive scale due to large baseline hazard differences. Importantly, absolute risk reduction, either crude or adjusted, can be calculated from multiplicative survival models. We advocate a wider use of the absolute scale, especially using additive hazard models, to assess treatment effect and treatment effect modification.
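As the conclusions emphasise, absolute risk reduction can be read off a multiplicative model. A hedged sketch of that computation with Python's lifelines on simulated data; the variable names, rates and the five-year horizon are assumptions for illustration, not values from the study:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 800
trt = rng.binomial(1, 0.5, n)
age = rng.normal(65, 9, n)
rate = 0.1 * np.exp(-0.5 * trt + 0.03 * (age - 65))   # treatment lowers hazard
t = rng.exponential(1 / rate)
c = rng.uniform(1, 10, n)
df = pd.DataFrame({"T": np.minimum(t, c), "E": (t <= c).astype(int),
                   "trt": trt, "age": age})

cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")

# absolute risk reduction for a hypothetical 75-year-old at a 5-year horizon
horizon = 5.0
profile = pd.DataFrame({"trt": [0, 1], "age": [75.0, 75.0]})
S = cph.predict_survival_function(profile, times=[horizon])
arr = float(S.loc[horizon, 1] - S.loc[horizon, 0])
print(f"ARR = {arr:.3f}, NNT = {1 / arr:.1f}")
```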

3.
Fully Bayesian methods for Cox models specify a model for the baseline hazard function. Parametric approaches generally provide monotone estimates. Semi-parametric choices allow for more flexible patterns, but they can suffer from overfitting and instability. Regularization methods through prior distributions with correlated structures usually give reasonable answers in these situations. We discuss Bayesian regularization for Cox survival models defined via flexible baseline hazards specified by a mixture of piecewise constant functions and by a cubic B-spline function. For these "semi-parametric" proposals, different prior scenarios, ranging from prior independence to particular correlated structures, are discussed in a real study with microvirulence data and in an extensive simulation scenario that includes different data sample and time-axis partition sizes in order to capture risk variations. The posterior distribution of the parameters was approximated using Markov chain Monte Carlo methods. Model selection was performed in accordance with the deviance information criterion and the log pseudo-marginal likelihood. The results reveal that, in general, Cox models present great robustness in covariate effects and survival estimates independent of the baseline hazard specification. In relation to the "semi-parametric" baseline hazard specification, the B-spline hazard function is less dependent on the regularization process than the piecewise specification because it demands a smaller time-axis partition to estimate similar behavior of the risk.
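For orientation, the piecewise-constant specification amounts to integrating a step-function hazard; a plain NumPy sketch in which the time-axis partition and hazard levels are made-up values:

```python
import numpy as np

breaks = np.array([0.0, 1.0, 2.5, 5.0])   # partition of the time axis
levels = np.array([0.10, 0.25, 0.15])     # constant hazard on each interval

def cum_hazard(t):
    """Integrate the piecewise-constant hazard up to t (t within the grid)."""
    widths = np.clip(t - breaks[:-1], 0.0, np.diff(breaks))
    return float(np.sum(levels * widths))

t = 3.0
print(f"H({t}) = {cum_hazard(t):.3f}, S({t}) = {np.exp(-cum_hazard(t)):.3f}")
```

A finer partition tracks risk variation more closely but, as the abstract notes, leaves more room for overfitting unless the levels are regularized through the prior.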

4.
When analyzing clinical trials with a stratified population, homogeneity of treatment effects is a common assumption in survival analysis. However, in the context of recent developments in clinical trial design, which aim to test multiple targeted therapies in corresponding subpopulations simultaneously, the assumption that there is no treatment-by-stratum interaction seems inappropriate. It becomes an issue if the expected sample size of the strata makes it infeasible to analyze the trial arms individually. Alternatively, one might choose as the primary aim to prove efficacy of the overall (targeted) treatment strategy. When testing for the overall treatment effect, a violation of the no-interaction assumption makes it necessary to deviate from standard methods that rely on this assumption. We investigate the performance of different methods for sample size calculation and data analysis under heterogeneous treatment effects. The commonly used sample size formula by Schoenfeld is compared to another formula by Lachin and Foulkes, and to an extension of Schoenfeld's formula allowing for stratification. Beyond the widely used (stratified) Cox model, we explore the lognormal shared frailty model and a two-step analysis approach as potential alternatives that attempt to adjust for interstrata heterogeneity. We carry out a simulation study for a trial with three strata and violations of the no-interaction assumption. The extension of Schoenfeld's formula to heterogeneous strata effects provides the most reliable sample size with respect to desired versus actual power. The two-step analysis and the frailty model prove to be more robust against loss of power caused by heterogeneous treatment effects than the stratified Cox model and should be preferred in such situations.
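For reference, Schoenfeld's formula converts a target hazard ratio into a required number of events; a sketch of the standard two-arm form (not the stratified extension the paper evaluates), with the event probability used to scale up to a total sample size:

```python
import numpy as np
from scipy.stats import norm

def schoenfeld_events(hr, alpha=0.05, power=0.80, p_alloc=0.5):
    """Required events for a two-arm Cox/log-rank comparison (Schoenfeld)."""
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return z**2 / (p_alloc * (1 - p_alloc) * np.log(hr) ** 2)

d = schoenfeld_events(hr=0.75)
print(round(d))          # events needed (about 380)
print(round(d / 0.6))    # total sample size if 60% of subjects have an event
```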

5.
Right-truncated data arise when observations are ascertained retrospectively, and only subjects who experience the event of interest by the time of sampling are selected. Such a selection scheme, without adjustment, leads to biased estimation of covariate effects in the Cox proportional hazards model. The existing methods for fitting the Cox model to right-truncated data, which are based on the maximization of the likelihood or solving estimating equations with respect to both the baseline hazard function and the covariate effects, are numerically challenging. We consider two alternative simple methods based on inverse probability weighting (IPW) estimating equations, which allow consistent estimation of covariate effects under a positivity assumption and avoid estimation of baseline hazards. We discuss problems of identifiability and consistency that arise when positivity does not hold and show that although the partial tests for null effects based on these IPW methods can be used in some settings even in the absence of positivity, they are not valid in general. We propose adjusted estimating equations that incorporate the probability of observation when it is known from external sources, which results in consistent estimation. We compare the methods in simulations and apply them to the analyses of human immunodeficiency virus latency.
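A rough sketch of the weighting mechanics only, not the authors' estimating equations: when the observation probability is known externally, it can enter a weighted Cox fit directly. Everything below (the data-generating step, the closed-form inclusion probability, variable names) is an illustrative assumption:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 500
x = rng.binomial(1, 0.5, n)
t = rng.exponential(1 / np.exp(0.5 * x))   # hazard rate exp(0.5 * x)
tau = 2.0
seen = t <= tau                            # retrospective ascertainment only
df = pd.DataFrame({"T": t[seen], "E": 1, "x": x[seen]})

# externally known observation probability P(T <= tau | x); here we use the
# data-generating truth, standing in for the paper's "external sources"
p_obs = 1 - np.exp(-np.exp(0.5 * df["x"]) * tau)
df["w"] = 1.0 / p_obs

cph = CoxPHFitter()
cph.fit(df, duration_col="T", event_col="E", weights_col="w", robust=True)
cph.print_summary()
```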

6.
Yin G, Ibrahim JG. Biometrics 2005; 61(1): 208-216.
For multivariate failure time data, we propose a new class of shared gamma frailty models by imposing the Box-Cox transformation on the hazard function, and the product of the baseline hazard and the frailty. This novel class of models allows for a very broad range of shapes and relationships between the hazard and baseline hazard functions. It includes the well-known Cox gamma frailty model and a new additive gamma frailty model as two special cases. Due to the nonnegative hazard constraint, this shared gamma frailty model is computationally challenging in the Bayesian paradigm. The joint priors are constructed through a conditional-marginal specification, in which the conditional distribution is univariate, and it absorbs the nonlinear parameter constraints. The marginal part of the prior specification is free of constraints. The prior distributions allow us to easily compute the full conditionals needed for Gibbs sampling, while incorporating the constraints. This class of shared gamma frailty models is illustrated with a real dataset.
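One standard way to write the Box-Cox family in question; the exact linkage and parameterisation in Yin and Ibrahim (2005) may differ, so read this as a sketch:

$$
g_\rho(u) = \begin{cases} \dfrac{u^{\rho}-1}{\rho}, & \rho > 0, \\ \log u, & \rho = 0, \end{cases}
\qquad
g_\rho\bigl(h(t \mid x_i, \omega_i)\bigr) = g_\rho\bigl(\omega_i\, h_0(t)\bigr) + x_i^{\top}\beta .
$$

Setting $\rho = 0$ recovers the multiplicative Cox gamma frailty model $h = \omega_i h_0(t)\, e^{x_i^{\top}\beta}$, while $\rho = 1$ gives the additive form $h = \omega_i h_0(t) + x_i^{\top}\beta$, matching the two special cases named above; intermediate $\rho$ must respect the nonnegativity constraint that motivates the conditional-marginal prior construction.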

7.
Cho H, Ibrahim JG, Sinha D, Zhu H. Biometrics 2009; 65(1): 116-124.
We propose Bayesian case influence diagnostics for complex survival models. We develop case deletion influence diagnostics for both the joint and marginal posterior distributions based on the Kullback-Leibler divergence (K-L divergence). We present a simplified expression for computing the K-L divergence between the posterior with the full data and the posterior based on single case deletion, as well as investigate its relationships to the conditional predictive ordinate. All the computations for the proposed diagnostic measures can be easily done using Markov chain Monte Carlo samples from the full data posterior distribution. We consider the Cox model with a gamma process prior on the cumulative baseline hazard. We also present a theoretical relationship between our case-deletion diagnostics and diagnostics based on Cox's partial likelihood. A simulated data example and two real data examples are given to demonstrate the methodology.
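The kind of identity that makes these diagnostics computable from full-data MCMC output alone, written here in a standard form that may differ from the paper's exact expressions. With $\pi$ the full-data posterior, $\pi_{(-i)}$ the posterior after deleting case $i$, and $f(y_i \mid \theta)$ the likelihood contribution of case $i$:

$$
\mathrm{CPO}_i = \Bigl( E_{\pi}\bigl[\, f(y_i \mid \theta)^{-1} \bigr] \Bigr)^{-1},
\qquad
\mathrm{KL}\bigl(\pi \,\Vert\, \pi_{(-i)}\bigr) = E_{\pi}\bigl[\log f(y_i \mid \theta)\bigr] - \log \mathrm{CPO}_i ,
$$

where both expectations are Monte Carlo averages over posterior draws of $\theta$, so the sampler never has to be re-run for each deleted case.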

8.
The Cox proportional hazards model has become the standard in biomedical studies, particularly for settings in which the estimation of covariate effects (as opposed to prediction) is the primary objective. In spite of the obvious flexibility of this approach and its wide applicability, the model is not usually chosen for its fit to the data, but by convention and for reasons of convenience. It is quite possible that the covariates add to, rather than multiply, the baseline hazard, making an additive hazards model a more suitable choice. Typically, proportionality is assumed, with the potential for additive covariate effects not evaluated or even seriously considered. Contributing to this phenomenon is the fact that many popular software packages (e.g., SAS, S-PLUS/R) have standard procedures to fit the Cox model (e.g., proc phreg, coxph), but as of yet no analogous procedures to fit its additive analog, the Lin and Ying (1994) semiparametric additive hazards model. In this article, we establish the connections between the Lin and Ying (1994) model and both Cox and least squares regression. We demonstrate how SAS's phreg and reg procedures may be used to fit the additive hazards model, after some straightforward data manipulations. We then apply the additive hazards model to examine the relationship between Model for End-stage Liver Disease (MELD) score and mortality among patients wait-listed for liver transplantation.
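Lin and Ying's semiparametric model has no dedicated routine in Python's lifelines either, but the related (fully nonparametric) Aalen additive hazards model is available and conveys the additive structure; a sketch on simulated data in which the MELD-like score and all rates are assumptions:

```python
import numpy as np
import pandas as pd
from lifelines import AalenAdditiveFitter

rng = np.random.default_rng(8)
n = 500
meld = rng.uniform(6, 40, n)                     # hypothetical MELD-like score
t = rng.exponential(1 / (0.02 + 0.004 * meld))   # score adds to the hazard
c = rng.uniform(5, 30, n)
df = pd.DataFrame({"T": np.minimum(t, c), "E": (t <= c).astype(int),
                   "meld": meld})

aaf = AalenAdditiveFitter(coef_penalizer=0.5)
aaf.fit(df, duration_col="T", event_col="E")
print(aaf.cumulative_hazards_.tail())   # cumulative regression functions B(t)
```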

9.
Identification of novel biomarkers for risk assessment is important for both effective disease prevention and optimal treatment recommendation. Discovery relies on the precious yet limited resource of stored biological samples from large prospective cohort studies. The case-cohort sampling design provides a cost-effective tool in the context of biomarker evaluation, especially when the clinical condition of interest is rare. Existing statistical methods focus on making efficient inference on relative hazard parameters from the Cox regression model. Drawing on recent theoretical development on the weighted likelihood for semiparametric models under two-phase studies (Breslow and Wellner, 2007), we propose statistical methods to evaluate the accuracy and predictiveness of a risk prediction biomarker with censored time-to-event outcome under stratified case-cohort sampling. We consider nonparametric methods and a semiparametric method. We derive large sample properties of the proposed estimators and evaluate their finite sample performance using numerical studies. We illustrate the new procedures using data from the Framingham Offspring Study to evaluate the accuracy of a recently developed risk score incorporating biomarker information for predicting cardiovascular disease.
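A crude sketch of the sampling design itself, with Barlow-type weights (cases weighted 1, non-case subcohort members by the inverse sampling fraction) feeding a weighted Cox fit; the paper's estimators are more refined, and all values below are illustrative assumptions:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(9)
N = 5000
biomarker = rng.normal(0, 1, N)
t = rng.exponential(1 / (0.01 * np.exp(0.4 * biomarker)))  # rare outcome
c = rng.uniform(5, 15, N)
T, E = np.minimum(t, c), (t <= c).astype(int)

alpha = 0.10                                # subcohort sampling fraction
in_sub = rng.binomial(1, alpha, N).astype(bool)
keep = in_sub | (E == 1)                    # subcohort plus all cases

w = np.where(E[keep] == 1, 1.0, 1.0 / alpha)
df = pd.DataFrame({"T": T[keep], "E": E[keep],
                   "biomarker": biomarker[keep], "w": w})

cph = CoxPHFitter()
cph.fit(df, duration_col="T", event_col="E", weights_col="w", robust=True)
cph.print_summary()
```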

10.
Dong B, Matthews DE. Biometrics 2012; 68(2): 408-418.
In medical studies, it is often of scientific interest to evaluate the treatment effect via the ratio of cumulative hazards, especially when those hazards may be nonproportional. To deal with nonproportionality in the Cox regression model, investigators usually assume that the treatment effect has some functional form. However, to do so may create a model misspecification problem because it is generally difficult to justify the specific parametric form chosen for the treatment effect. In this article, we employ empirical likelihood (EL) to develop a nonparametric estimator of the cumulative hazard ratio with covariate adjustment under two nonproportional hazard models, one that is stratified, as well as a less restrictive framework involving group-specific treatment adjustment. The asymptotic properties of the EL ratio statistic are derived in each situation and the finite-sample properties of EL-based estimators are assessed via simulation studies. Simultaneous confidence bands for all values of the adjusted cumulative hazard ratio in a fixed interval of interest are also developed. The proposed methods are illustrated using two different datasets concerning the survival experience of patients with non-Hodgkin's lymphoma or ovarian cancer.
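For orientation, an unadjusted plug-in version of the estimand, the ratio of two Nelson-Aalen cumulative hazard estimates, without the empirical-likelihood machinery, covariate adjustment or confidence bands developed in the paper; the data are simulated:

```python
import numpy as np
from lifelines import NelsonAalenFitter

rng = np.random.default_rng(7)
t0 = rng.exponential(1.0, 300)    # control group event times
t1 = rng.exponential(0.7, 300)    # treated group event times

naf0, naf1 = NelsonAalenFitter(), NelsonAalenFitter()
naf0.fit(t0, event_observed=np.ones_like(t0), label="control")
naf1.fit(t1, event_observed=np.ones_like(t1), label="treated")

grid = np.linspace(0.1, 2.0, 20)
ratio = (naf1.cumulative_hazard_at_times(grid).values
         / naf0.cumulative_hazard_at_times(grid).values)
print(ratio)    # cumulative hazard ratio over time, no PH assumption made
```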

11.
This work is motivated by clinical trials in chronic heart failure disease, where treatment has effects both on morbidity (assessed as recurrent non-fatal hospitalisations) and on mortality (assessed as cardiovascular death, CV death). Recently, a joint frailty proportional hazards model has been proposed for this kind of efficacy outcome to account for a potential association between the risk rates for hospital admissions and CV death. However, clinical trial results are more often presented by treatment effect estimates that have been derived from marginal proportional hazards models, that is, a Cox model for mortality and an Andersen-Gill model for recurrent hospitalisations. We show how these marginal hazard ratios and their estimates depend on the association between the risk processes when these are actually linked by shared or dependent frailty terms. First, we derive the marginal hazard ratios as a function of time. Then, applying least false parameter theory, we show that the marginal hazard ratio estimate for the hospitalisation rate depends on study duration and on parameters of the underlying joint frailty model. In particular, we identify parameters, for example the treatment effect on mortality, that determine whether the marginal hazard ratio estimate for hospitalisations is smaller than, equal to or larger than the conditional one. How this affects rejection probabilities is further investigated in simulation studies. Our findings can be used to interpret marginal hazard ratio estimates in heart failure trials and are illustrated by the results of the CHARM-Preserved trial (where CHARM is the 'Candesartan in Heart failure Assessment of Reduction in Mortality and morbidity' programme).
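The attenuation can be made explicit in the simplest shared-frailty case: for a gamma frailty with unit mean and variance $\theta$, a binary treatment $x$, and conditional hazard $\omega\, h_0(t)\, e^{\beta x}$, integrating out $\omega$ gives (a textbook calculation, not the paper's joint-frailty derivation):

$$
S(t \mid x) = \bigl(1 + \theta H_0(t)\, e^{\beta x}\bigr)^{-1/\theta},
\qquad
\mathrm{HR}_{\mathrm{marg}}(t) = e^{\beta}\, \frac{1 + \theta H_0(t)}{1 + \theta H_0(t)\, e^{\beta}} ,
$$

so the marginal hazard ratio equals the conditional one, $e^{\beta}$, at $t = 0$ and shrinks toward 1 as the cumulative baseline hazard $H_0(t)$ grows, which is why the marginal estimate depends on study duration.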

12.
The stratified Cox proportional hazards model is introduced to incorporate covariates and to accommodate a nonproportional treatment effect between two groups; confidence interval estimators for the difference in median survival times of the two treatments under the stratified Cox model are then proposed. One is based on the baseline survival functions of the two groups, the other on the average survival functions of the two groups. I illustrate the proposed methods with an example from a study conducted by the Radiation Therapy Oncology Group in cancer of the mouth and throat. Simulations are carried out to investigate the small-sample properties of the proposed methods in terms of coverage rates.
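A sketch of the first construction in Python's lifelines: stratify on group so each group keeps its own baseline survival function, then read off the medians (simulated data; a bootstrap over subjects would supply the confidence interval, and all names and values here are assumptions):

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.utils import median_survival_times

rng = np.random.default_rng(12)
n = 400
grp = rng.binomial(1, 0.5, n)
age = rng.normal(60, 10, n)
t = rng.weibull(np.where(grp == 1, 1.4, 0.9), n) * np.exp(-0.01 * (age - 60))
df = pd.DataFrame({"T": t, "E": 1, "grp": grp, "age": age})

cph = CoxPHFitter()
cph.fit(df, duration_col="T", event_col="E", strata=["grp"])

# one baseline survival curve per stratum; their medians and the difference
# of medians are the quantities the abstract builds intervals around
print(median_survival_times(cph.baseline_survival_))
```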

13.
Liang Li, Bo Hu, Tom Greene. Biometrics 2009; 65(3): 737-745.
In many longitudinal clinical studies, the level and progression rate of repeatedly measured biomarkers on each subject quantify the severity of the disease and that subject's susceptibility to progression of the disease. It is of scientific and clinical interest to relate such quantities to a later time-to-event clinical endpoint such as patient survival. This is usually done with a shared parameter model. In such models, the longitudinal biomarker data and the survival outcome of each subject are assumed to be conditionally independent given subject-level severity or susceptibility (also called frailty in statistical terms). In this article, we study the case where the conditional distribution of longitudinal data is modeled by a linear mixed-effect model, and the conditional distribution of the survival data is given by a Cox proportional hazard model. We allow unknown regression coefficients and time-dependent covariates in both models. The proposed estimators are maximizers of an exact correction to the joint log likelihood with the frailties eliminated as nuisance parameters, an idea that originated from correction of covariate measurement error in measurement error models. The corrected joint log likelihood is shown to be asymptotically concave and leads to consistent and asymptotically normal estimators. Unlike most published methods for joint modeling, the proposed estimation procedure does not rely on distributional assumptions of the frailties. The proposed method was studied in simulations and applied to a data set from the Hemodialysis Study.
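A generic shared-parameter specification of the kind studied here, with notation assumed for illustration: the linear mixed-effect and Cox submodels are tied together by the subject-level random effects (frailties) $b_i$, conditional on which the two outcomes are independent:

$$
Y_{ij} = x_{ij}^{\top}\beta + z_{ij}^{\top} b_i + \varepsilon_{ij},
\qquad
h_i(t) = h_0(t)\, \exp\bigl( w_i(t)^{\top}\gamma + \eta^{\top} b_i \bigr),
\qquad
Y_i \perp T_i \mid b_i .
$$

The paper's contribution is to eliminate $b_i$ through a corrected joint log likelihood rather than integrating it against an assumed frailty distribution.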

14.
The goal of relative survival methodology is to compare the survival experience of a cohort with that of the background population. Most often an additive excess hazard model is employed, which assumes that each person's hazard is a sum of two components: the population hazard obtained from life tables and an excess hazard attributable to the specific condition. Usually covariate effects on the excess hazard are assumed to have a proportional hazards structure with a parametrically modelled baseline. In this paper, we introduce a new fitting procedure using the expectation-maximization algorithm, treating the cause of death as missing data. The method requires no assumptions about the baseline excess hazard, thus reducing the risk of bias through misspecification. It accommodates the possibility of knowledge of cause of death for some patients, and as a side effect, the method yields an estimate of the ratio between the excess and the population hazard for each subject. More importantly, it estimates the baseline excess hazard flexibly with no additional degrees of freedom spent. Finally, it is a generalization of the Cox model, meaning that all the wealth of options in existing software for the Cox model can be used in relative survival. The method is applied to a data set on survival after myocardial infarction, where it shows how a particular form of the hazard function could be missed using the existing methods.
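The E-step implied by treating cause of death as missing has a simple closed form; a sketch consistent with the description above, though not necessarily the authors' notation. For a death at time $t_i$ with unknown cause, the expected indicator that the death is attributable to the condition is

$$
p_i = \frac{\hat h_{E}(t_i \mid x_i)}{\hat h_{P}(t_i) + \hat h_{E}(t_i \mid x_i)} ,
$$

the ratio of the current excess-hazard estimate to the total hazard; the M-step then refits the excess-hazard model with the $p_i$ as fractional event weights, while deaths of known cause enter with $p_i$ fixed at 0 or 1.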

15.
Realistic power calculations for large cohort studies and nested case-control studies are essential for successfully answering important and complex research questions in epidemiology and clinical medicine. To this end, we provide a methodical framework for general realistic power calculations via simulation, which we put into practice by means of an R-based template. We consider staggered recruitment and individual hazard rates, competing risks, interaction effects, and the misclassification of covariates. The study cohort is assembled with respect to given age, gender, and community distributions. Nested case-control analyses with a varying number of controls enable comparisons of power with a full cohort analysis. Time-to-event generation under competing risks, including delayed study-entry times, is realized on the basis of a six-state Markov model. Incidence rates, prevalence of risk factors and prefixed hazard ratios allow for the assignment of age-dependent transition rates given in the form of Cox models. These provide the basis for a central simulation algorithm, which is used to generate sample paths of the underlying time-inhomogeneous Markov processes. With the inclusion of frailty terms in the Cox models, the Markov property is deliberately violated: an "individual Markov process given frailty" creates unobserved heterogeneity between individuals. Different left-truncation and right-censoring patterns call for the use of Cox models for data analysis. p-values are recorded over repeated simulation runs to allow for the desired power calculations. For illustration, we consider scenarios with a "testing" character as well as realistic scenarios. This enables the validation of a correct implementation of theoretical concepts and concrete sample size recommendations against an actual epidemiological background, here given with possible substudy designs within the German National Cohort.
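The core of such a framework is a simulate-fit-test loop; a stripped-down sketch in Python without the staggered entry, competing risks and frailty layers described above, with all rates and sizes assumed:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

def power_by_simulation(n=400, hr=0.7, base_rate=0.5,
                        censor_rate=0.3, nsim=200, seed=3):
    """Empirical power of a two-arm Cox analysis under exponential times."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(nsim):
        arm = rng.binomial(1, 0.5, n)
        t = rng.exponential(1 / (base_rate * hr ** arm))
        c = rng.exponential(1 / censor_rate, n)
        df = pd.DataFrame({"T": np.minimum(t, c),
                           "E": (t <= c).astype(int), "arm": arm})
        cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
        hits += cph.summary.loc["arm", "p"] < 0.05   # record the p-value
    return hits / nsim

print(power_by_simulation())
```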

16.
A number of imprinted genes have been observed in plants, animals and humans. They not only control growth and developmental traits, but may also be responsible for survival traits. Based on the Cox proportional hazards (PH) model, we constructed a general parametric model for dissecting genomic imprinting, in which a baseline hazard function is selectable for fitting the effects of imprinted quantitative trait loci (iQTL) genotypes on the survival curve. The expectation-maximisation (EM) algorithm is derived for obtaining the maximum likelihood estimates of iQTL parameters. The imprinting patterns of the detected iQTL are statistically tested under a series of null hypotheses. The Bayesian information criterion (BIC) is employed to choose an optimal baseline hazard function with maximum likelihood and parsimonious parameterisation. We applied the proposed approach to analyse published data on an F2 population of mice and concluded that, among five commonly used survival distributions, the log-logistic distribution is the optimal baseline hazard function for the survival time of hyperoxic acute lung injury (HALI). Under this optimal model, five QTL were detected, among which four are imprinted in different imprinting patterns.
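The log-logistic baseline is notable because, unlike the Weibull or exponential, its hazard can rise and then fall; in the standard scale-shape parameterisation (the paper's may differ),

$$
h(t) = \frac{(\beta/\alpha)\,(t/\alpha)^{\beta-1}}{1 + (t/\alpha)^{\beta}} ,
$$

which is non-monotone whenever $\beta > 1$, a plausible shape for time to death from acute lung injury.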

17.
In clinical settings, the necessity of treatment is often measured in terms of the patient's prognosis in the absence of treatment. Along these lines, it is often of interest to compare subgroups of patients (e.g., based on underlying diagnosis) with respect to pre-treatment survival. Such comparisons may be complicated by at least two important issues. First, mortality contrasts by subgroup may differ over follow-up time, as opposed to being constant, and may follow a form that is difficult to model parametrically. Moreover, in settings where the proportional hazards assumption fails, investigators tend to be more interested in cumulative (as opposed to instantaneous) effects on mortality. Second, pre-treatment death is censored by the receipt of treatment, and in settings where treatment assignment depends on time-dependent factors that also affect mortality, such censoring is likely to be informative. We propose semiparametric methods for contrasting subgroup-specific cumulative mortality in the presence of dependent censoring. The proposed estimators are based on the cumulative hazard function, with pre-treatment mortality assumed to follow a stratified Cox model. No functional form is assumed for the nature of the non-proportionality. Asymptotic properties of the proposed estimators are derived, and simulation studies show that the proposed methods are applicable to practical sample sizes. The methods are then applied to contrast pre-transplant mortality for acute versus chronic End-Stage Liver Disease patients.

18.
Wei G, Schaubel DE. Biometrics 2008; 64(3): 724-732.
Often in medical studies of time to an event, the treatment effect is not constant over time. In the context of Cox regression modeling, the most frequent solution is to apply a model that assumes the treatment effect is either piecewise constant or varies smoothly over time, i.e., the Cox nonproportional hazards model. This approach has at least two major limitations. First, it is generally difficult to assess whether the parametric form chosen for the treatment effect is correct. Second, in the presence of nonproportional hazards, investigators are usually more interested in the cumulative than the instantaneous treatment effect (e.g., determining if and when the survival functions cross). Therefore, we propose an estimator for the aggregate treatment effect in the presence of nonproportional hazards. Our estimator is based on the treatment-specific baseline cumulative hazards estimated under a stratified Cox model. No functional form for the nonproportionality need be assumed. Asymptotic properties of the proposed estimators are derived, and the finite-sample properties are assessed in simulation studies. Pointwise and simultaneous confidence bands of the estimator can be computed. The proposed method is applied to data from a national organ failure registry.
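A sketch of the building block in Python's lifelines: stratifying on treatment leaves the (possibly nonproportional) treatment effect entirely in the stratum-specific baseline cumulative hazards, whose contrast can then be formed with no functional form imposed; the simulated data and all names are assumptions:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(11)
n = 600
trt = rng.binomial(1, 0.5, n)
age = rng.normal(50, 8, n)
# nonproportional effect: treatment changes the shape of the hazard,
# not just its level
t = rng.weibull(np.where(trt == 1, 1.5, 0.8), n) * 3
df = pd.DataFrame({"T": t, "E": 1, "trt": trt, "age": age})

cph = CoxPHFitter()
cph.fit(df, duration_col="T", event_col="E", strata=["trt"])
H = cph.baseline_cumulative_hazard_   # one column per treatment stratum
print(H.tail())                       # contrast the two columns over time
```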

19.
This paper deals with hazard regression models for survival data with time-dependent covariates consisting of updated quantitative measurements. The main emphasis is on the Cox proportional hazards model, but additive hazard models are also discussed. Attenuation of regression coefficients caused by infrequent updating of covariates is evaluated using simulated data mimicking our main example, the CSL1 liver cirrhosis trial. We conclude that the degree of attenuation depends on the type of stochastic process describing the time-dependent covariate and that attenuation may be substantial for an Ornstein-Uhlenbeck process. Trends in the covariate combined with non-synchronous updating may also cause attenuation. Simple methods to adjust for infrequent updating of covariates are proposed and compared to existing techniques using both simulations and the CSL1 data. The comparison shows that, while existing, more complicated methods may work well with frequent updating of covariates, the simpler techniques may have advantages in larger data sets with infrequent updating.
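The Ornstein-Uhlenbeck case is easy to probe directly: simulate a path, impose last-observation-carried-forward updating at sparse visit times, and check how far the stale covariate drifts from the true one; the weaker the correlation, the stronger the attenuation toward zero. All parameters below are assumptions:

```python
import numpy as np

rng = np.random.default_rng(5)
theta, sigma, dt, n_steps = 1.0, 0.5, 0.01, 1000

# Euler scheme for the OU process dX = -theta * X dt + sigma dW
x = np.zeros(n_steps)
for k in range(1, n_steps):
    x[k] = x[k - 1] - theta * x[k - 1] * dt + sigma * np.sqrt(dt) * rng.normal()

visit_every = 100                          # covariate updated only at visits
locf = x[(np.arange(n_steps) // visit_every) * visit_every]

# correlation between the true and the carried-forward covariate; values
# well below 1 signal attenuation of the estimated regression coefficient
print(np.corrcoef(x, locf)[0, 1])
```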

20.
Background: Comparative effectiveness studies of cancer therapeutics in observational data face confounding by patterns of clinical treatment over time. The validity of survival analysis in longitudinal health records depends on study design choices, including index date definition and model specification for covariate adjustment. Methods: Overall survival in cancer is a multi-state transition process with mortality and treatment switching as competing risks. Parametric Weibull regression quantifies the proportionality of hazards across lines of therapy in real-world cohorts of 12 solid tumor types. Study design assessments compare alternative analytic models in simulations with realistic disproportionality. The multi-state simulation framework is adaptable to alternative treatment effect profiles and exposure patterns. Results: Event-specific hazards of treatment switching and death are not proportional across lines of therapy in the 12 solid tumor types. Study designs that include all eligible lines of therapy per subject showed lower bias and variance than designs that select one line per subject. Confounding by line number was effectively mitigated across a range of simulation scenarios by Cox proportional hazards models with stratified baseline hazards and inverse probability of treatment weighting. Conclusion: Quantitative study design assessment can inform the planning of observational research in clinical oncology by demonstrating the potential impact of model misspecification. Use of empirical parameter estimates in simulation designs adapts analytic recommendations to the clinical population of interest.
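The Weibull diagnostic works because proportionality across lines of therapy holds exactly when the shape parameters agree; with shape $k_j$ and scale $\lambda_j$ for line $j$ (standard parameterisation, assumed here),

$$
h_j(t) = \frac{k_j}{\lambda_j}\Bigl(\frac{t}{\lambda_j}\Bigr)^{k_j - 1},
\qquad
\frac{h_1(t)}{h_2(t)} = \frac{k_1\, \lambda_2^{k_2}}{k_2\, \lambda_1^{k_1}}\; t^{\,k_1 - k_2} ,
$$

which is constant in $t$ if and only if $k_1 = k_2$; fitted shape parameters that differ across lines therefore quantify the disproportionality fed into the simulations.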
