Similar Documents
20 similar documents retrieved.
1.
Fully Bayesian methods for Cox models specify a model for the baseline hazard function. Parametric approaches generally provide monotone estimates. Semi-parametric choices allow for more flexible patterns but can suffer from overfitting and instability. Regularization through prior distributions with correlated structures usually gives reasonable answers in these situations. We discuss Bayesian regularization for Cox survival models defined via flexible baseline hazards specified by a mixture of piecewise constant functions and by a cubic B-spline function. For these "semi-parametric" proposals, different prior scenarios ranging from prior independence to particular correlated structures are discussed in a real study with microvirulence data and in an extensive simulation scenario that includes different sample sizes and time-axis partition sizes in order to capture risk variations. The posterior distribution of the parameters was approximated using Markov chain Monte Carlo methods. Model selection was performed according to the deviance information criterion and the log pseudo-marginal likelihood. The results reveal that, in general, Cox models are highly robust in covariate effects and survival estimates regardless of the baseline hazard specification. Regarding the "semi-parametric" baseline hazard specification, the B-spline hazard function is less dependent on the regularization process than the piecewise specification because it requires a smaller time-axis partition to estimate similar risk behavior.
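A minimal sketch of the piecewise-constant baseline hazard that the first specification builds on: simulating event times by inverting the cumulative hazard. The cut points and hazard values are invented for illustration; the paper's actual model places correlated priors on such interval-wise hazards.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical time-axis partition and constant hazard on each interval
# (illustrative values only, not from the paper).
cuts = np.array([0.0, 1.0, 3.0, 6.0])      # interval start points
lams = np.array([0.20, 0.50, 0.35, 0.10])  # hazard per interval (last is open-ended)

def draw_piecewise_exponential(n):
    """Invert the cumulative hazard H(t) of a piecewise-constant hazard."""
    u = rng.exponential(size=n)            # target cumulative-hazard values
    widths = np.diff(cuts)
    cum_h = np.concatenate([[0.0], np.cumsum(lams[:-1] * widths)])
    t = np.empty(n)
    for i, h in enumerate(u):
        k = np.searchsorted(cum_h, h, side="right") - 1
        t[i] = cuts[k] + (h - cum_h[k]) / lams[k]  # solve H(t) = h on interval k
    return t

times = draw_piecewise_exponential(1000)
print(times.mean())
```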

2.
This paper deals with hazard regression models for survival data with time-dependent covariates consisting of updated quantitative measurements. The main emphasis is on the Cox proportional hazards model, but additive hazard models are also discussed. Attenuation of regression coefficients caused by infrequent updating of covariates is evaluated using simulated data mimicking our main example, the CSL1 liver cirrhosis trial. We conclude that the degree of attenuation depends on the type of stochastic process describing the time-dependent covariate and that attenuation may be substantial for an Ornstein-Uhlenbeck process. Trends in the covariate combined with non-synchronous updating may also cause attenuation. Simple methods to adjust for infrequent updating of covariates are proposed and compared to existing techniques using both simulations and the CSL1 data. The comparison shows that while existing, more complicated methods may work well with frequent updating of covariates, the simpler techniques may have advantages in larger data sets with infrequent updating.
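A toy sketch of the attenuation mechanism discussed here, assuming an Ornstein-Uhlenbeck covariate observed only at sparse update times; all parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Euler simulation of an Ornstein-Uhlenbeck covariate path
# (theta, mu, sigma are invented illustration values).
theta, mu, sigma, dt = 0.5, 0.0, 1.0, 0.01
t = np.arange(0.0, 10.0, dt)
x = np.zeros_like(t)
for i in range(1, len(t)):
    x[i] = x[i - 1] + theta * (mu - x[i - 1]) * dt + sigma * np.sqrt(dt) * rng.normal()

# "Infrequent updating": the analysis only sees the last measured value
# (last observation carried forward), here one update per time unit.
update_times = np.arange(0.0, 10.0, 1.0)
measured = x[np.searchsorted(t, update_times)]
observed = measured[np.searchsorted(update_times, t, side="right") - 1]

# The gap between the true path and the carried-forward version is what
# attenuates the regression coefficient in the hazard model.
print("corr(true, observed):", np.corrcoef(x, observed)[0, 1].round(3))
```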

3.
Background: Flexible parametric survival models (FPMs) are commonly used in epidemiology. They are preferred because a wide range of hazard shapes can be captured using splines to model the log-cumulative hazard function, and time-dependent effects can be included for more flexibility. An important issue is the number of knots used for the splines. The reliability of estimates is assessed using English data for 10 cancer types, and the use of online interactive graphs to enable a more comprehensive sensitivity analysis under the control of the user is demonstrated. Methods: Sixty FPMs were fitted to each cancer type with varying degrees of freedom to model the baseline excess hazard and the main and time-dependent effects of age. For each model, we obtained age-specific, age-group, and internally age-standardised relative survival estimates. The Akaike information criterion and Bayesian information criterion were also calculated, and comparative estimates were obtained using the Ederer II and Pohar Perme methods. Web-based interactive graphs were developed to present the results. Results: Age-standardised estimates were very insensitive to the exact number of knots for the splines. Age-group survival was also stable, with negligible differences between models. Age-specific estimates are less stable, especially for the youngest and oldest patients, of whom there are very few, but they perform well in most scenarios. Conclusion: Although estimates do not depend heavily on the number of knots, too few knots should be avoided, as they can result in a poor fit. Interactive graphs engage researchers in assessing model sensitivity to a wide range of scenarios, and their use is highly encouraged.
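A hedged sketch of this knot-sensitivity exercise using lifelines' SplineFitter, a Royston-Parmar-style spline hazard model. This is not the software used in the paper, the knot_locations interface is assumed from lifelines' documentation, and a bundled demo dataset stands in for the English cancer data.

```python
import numpy as np
from lifelines import SplineFitter
from lifelines.datasets import load_rossi

df = load_rossi()                       # demo data standing in for registry data
T, E = df["week"].values, df["arrest"].values

# Refit with an increasing number of knots and compare by AIC, mirroring
# the paper's sensitivity analysis over spline degrees of freedom.
for n_knots in (2, 3, 4, 5):
    knots = np.percentile(T[E.astype(bool)], np.linspace(5, 95, n_knots))
    sf = SplineFitter(knot_locations=knots).fit(T, E)
    print(n_knots, "knots, AIC =", round(sf.AIC_, 1))
```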

4.
Case-cohort sampling is a commonly used and efficient method for studying large cohorts. Most existing methods of analysis for case-cohort data have concerned the analysis of univariate failure time data. However, clustered failure time data are commonly encountered in public health studies. For example, patients treated at the same center are unlikely to be independent. In this article, we consider methods based on estimating equations for case-cohort designs for clustered failure time data. We assume a marginal hazards model, with a common baseline hazard and a common regression coefficient across clusters. The proposed estimators of the regression parameter and cumulative baseline hazard are shown to be consistent and asymptotically normal, and consistent estimators of the asymptotic covariance matrices are derived. The regression parameter estimator is easily computed using any standard Cox regression software that allows for offset terms. The proposed estimators are investigated in simulation studies and demonstrated empirically to have increased efficiency relative to some existing methods. The proposed methods are applied to a study of mortality among Canadian dialysis patients.
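A sketch of the "standard Cox software" route the abstract alludes to, here with inverse-sampling weights on simulated, non-clustered data. This is a Barlow-type weighting rather than the authors' exact estimator, and all names and values are invented.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(7)

# Simulated full cohort with one covariate and true log-hazard ratio 0.5.
n = 2000
x = rng.normal(size=n)
T = rng.exponential(1.0 / np.exp(0.5 * x))
C = rng.exponential(2.0, size=n)
E = (T <= C).astype(int)
time = np.minimum(T, C)

# Case-cohort sample: a 15% random subcohort plus all cases,
# analysed with inverse-sampling weights.
subcohort = rng.random(n) < 0.15
in_sample = subcohort | (E == 1)
w = np.where(E == 1, 1.0, 1.0 / 0.15)

df = pd.DataFrame({"time": time, "event": E, "x": x, "w": w})[in_sample]
cph = CoxPHFitter().fit(df, "time", "event", weights_col="w", robust=True)
print(cph.params_)   # estimate should be near 0.5
```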

5.
Background: In both observational and randomized studies, associations with overall survival are by and large assessed on a multiplicative scale using the Cox model. However, clinicians and clinical researchers have a keen interest in assessing the absolute benefit associated with treatments. In older patients, some studies have reported lower relative treatment effects, which might translate into similar or even greater absolute treatment effects given their high baseline hazard for clinical events. Methods: The effect of treatment and the effect modification of treatment were assessed using a multiplicative and an additive hazard model, respectively, in an analysis adjusted for propensity score in the context of coronary surgery. Results: The multiplicative model yielded a lower relative hazard reduction with bilateral internal thoracic artery grafting in older patients (hazard ratio for interaction/year = 1.03, 95% CI: 1.00 to 1.06, p = 0.05), whereas the additive model showed a similar absolute hazard reduction with increasing age (delta for interaction/year = 0.10, 95% CI: -0.27 to 0.46, p = 0.61). The number needed to treat derived from the propensity score-adjusted multiplicative model was remarkably similar at the end of follow-up in patients aged ≤60 and in patients aged >70. Conclusions: This example demonstrates that a lower treatment effect in older patients on a relative scale can translate into a similar treatment effect on an additive scale because of large baseline hazard differences. Importantly, absolute risk reduction, either crude or adjusted, can be calculated from multiplicative survival models. We advocate a wider use of the absolute scale, especially using additive hazard models, to assess treatment effect and treatment effect modification.
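The abstract's point that absolute risk reduction can be calculated from multiplicative survival models can be sketched as follows, assuming lifelines and its bundled Rossi data, with the binary covariate "fin" standing in for treatment (a hypothetical substitution).

```python
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi

df = load_rossi()
cph = CoxPHFitter().fit(df, "week", "arrest")

# Standardised (g-computation style) survival under "everyone treated"
# vs "everyone untreated", averaged over the observed covariate mix.
horizon = 52
s1 = cph.predict_survival_function(df.assign(fin=1), times=[horizon]).mean(axis=1).iloc[0]
s0 = cph.predict_survival_function(df.assign(fin=0), times=[horizon]).mean(axis=1).iloc[0]

arr = s1 - s0                       # absolute risk reduction at 52 weeks
print(f"ARR = {arr:.3f}, NNT = {1.0 / arr:.1f}")
```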

6.
Andersen PK, Vaeth M. Biometrics, 1989, 45(2):523-535.
This paper studies two classes of hazard-rate-based models for the mortality in a group of individuals, taking normal life expectancy into account. In a multiplicative hazard model, the estimate of the relative mortality generalises the standardised mortality ratio, and the adequacy of a model with constant relative mortality can be tested using a type of total time on test statistic. In an additive hazard model, continuous-time generalisations of a "corrected" survival curve and a "normal" survival curve are obtained, and the adequacy of a model with constant excess mortality can again be tested using a type of total time on test statistic. A model including both the multiplicative hazard model and the additive hazard model is briefly considered. The use of the models is illustrated on a set of data concerning survival after operation for malignant melanoma.
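A toy numeric sketch of the two constant-effect summaries this paper generalises, with invented person-years and life-table hazards.

```python
import numpy as np

# Invented illustration: observed deaths in a patient group, plus
# expected hazards taken from a general-population life table.
observed_deaths = 18
person_years = np.array([4.2, 3.1, 5.0, 2.7, 4.4]) * 20
expected_hazard = np.array([0.010, 0.015, 0.020, 0.030, 0.012])

expected_deaths = np.sum(expected_hazard * person_years)

smr = observed_deaths / expected_deaths                            # constant relative mortality
excess = (observed_deaths - expected_deaths) / person_years.sum()  # constant excess mortality
print(f"SMR = {smr:.2f}, excess mortality = {excess:.4f} per person-year")
```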

7.
In this paper, we introduce a new model for recurrent event data characterized by a fully parametric baseline rate function based on the exponential-Poisson distribution. The model arises from a latent competing-risks scenario, in the sense that there is no information about which cause was responsible for the event occurrence; the time of each recurrence is then given by the minimum lifetime value among all latent causes. The new model has the classical homogeneous Poisson process as a particular case. The properties of the proposed model are discussed, including its hazard rate function, survival function, and ordinary moments. The inferential procedure is based on the maximum likelihood approach. We consider the important issue of model selection between the proposed model and its particular case via the likelihood ratio test and the score test. Goodness of fit of the recurrent event models is assessed using Cox-Snell residuals. A simulation study evaluates the performance of the estimation procedure for small and moderate sample sizes. Applications to two real data sets are provided to illustrate the proposed methodology. One of them, analyzed for the first time by our team, concerns the recurrence of malaria, an infectious disease caused by a protozoan parasite that infects red blood cells.
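The nested comparison against the homogeneous Poisson process can be sketched as a standard likelihood ratio test; the log-likelihood values below are placeholders, not fitted values from the paper.

```python
from scipy.stats import chi2

# Placeholder log-likelihoods standing in for fits of the restricted
# (homogeneous Poisson) and proposed recurrent-event models.
ll_restricted = -412.7
ll_proposed = -405.9
extra_params = 1          # parameters gained by the wider model

lrt = 2.0 * (ll_proposed - ll_restricted)
print(f"LRT = {lrt:.2f}, p = {chi2.sf(lrt, extra_params):.4f}")
```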

8.
A number of imprinted genes have been observed in plants, animals, and humans. They not only control growth and developmental traits but may also be responsible for survival traits. Based on the Cox proportional hazards (PH) model, we constructed a general parametric model for dissecting genomic imprinting, in which a baseline hazard function is selectable for fitting the effects of imprinted quantitative trait loci (iQTL) genotypes on the survival curve. The expectation-maximisation (EM) algorithm is derived for obtaining the maximum likelihood estimates of the iQTL parameters. The imprinting patterns of the detected iQTL are statistically tested under a series of null hypotheses. The Bayesian information criterion (BIC) is employed to choose an optimal baseline hazard function with maximum likelihood and parsimonious parameterisation. We applied the proposed approach to published data from an F2 population of mice and concluded that, among five commonly used survival distributions, the log-logistic distribution is the optimal baseline hazard function for the survival time of hyperoxic acute lung injury (HALI). Under this optimal model, five QTL were detected, four of which are imprinted, with different imprinting patterns.
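A sketch of selecting a parametric baseline by an information criterion, assuming lifelines and its demo data. lifelines exposes AIC_ directly (BIC differs only by a k(ln n - 2) term), and the candidate set below is what lifelines offers, not the paper's exact five distributions.

```python
from lifelines import (ExponentialFitter, LogLogisticFitter,
                       LogNormalFitter, WeibullFitter)
from lifelines.datasets import load_rossi

df = load_rossi()
T, E = df["week"], df["arrest"]

# Fit each candidate baseline distribution and compare by AIC
# (the paper ranks by BIC; the selection logic is the same).
for F in (ExponentialFitter, WeibullFitter, LogNormalFitter, LogLogisticFitter):
    f = F().fit(T, E)
    print(f"{F.__name__:20s} AIC = {f.AIC_:.1f}")
```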

9.
In the presence of competing causes of event occurrence (e.g., death), the interest might lie not only in the overall survival but also in the so-called net survival, that is, the hypothetical survival that would be observed if the disease under study were the only possible cause of death. Net survival estimation is commonly based on the excess hazard approach, in which the hazard rate of individuals is assumed to be the sum of a disease-specific hazard rate and an expected hazard rate, the latter supposed to be correctly approximated by the mortality rates obtained from general-population life tables. However, this assumption might not be realistic if the study participants are not comparable with the general population. Also, the hierarchical structure of the data can induce a correlation between the outcomes of individuals coming from the same clusters (e.g., hospital, registry). We propose an excess hazard model that corrects simultaneously for these two sources of bias, instead of dealing with them independently as done previously. We assessed the performance of this new model and compared it with three similar models using an extensive simulation study, as well as an application to breast cancer data from a multicenter clinical trial. The new model performed better than the others in terms of bias, root mean square error, and empirical coverage rate. The proposed approach might be useful for accounting simultaneously for the hierarchical structure of the data and the non-comparability bias in studies such as long-term multicenter clinical trials, when there is interest in the estimation of net survival.
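A toy version of the excess hazard decomposition, with a constant excess rate and invented life-table hazards; the paper's model adds covariates, a correction for non-comparability, and cluster random effects.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(3)

# Each subject's total hazard = known expected ("life-table") rate mu_i
# plus a constant disease-specific excess lam (all values invented).
n = 500
mu = rng.uniform(0.005, 0.05, size=n)
true_lam = 0.08
T = rng.exponential(1.0 / (mu + true_lam))
C = rng.uniform(0, 15, size=n)
t, d = np.minimum(T, C), (T <= C).astype(float)

def negloglik(lam):
    # Right-censored log-likelihood with total hazard mu_i + lam.
    return -np.sum(d * np.log(mu + lam) - (mu + lam) * t)

fit = minimize_scalar(negloglik, bounds=(1e-6, 1.0), method="bounded")
print("estimated excess hazard:", round(fit.x, 4))   # should be near 0.08
```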

10.
Carlin BP, Hodges JS. Biometrics, 1999, 55(4):1162-1170.
In clinical trials conducted over several data collection centers, the most common statistically defensible analytic method, a stratified Cox model analysis, suffers from two important defects. First, identification of units that are outlying with respect to the baseline hazard is awkward, since this hazard is implicit (rather than explicit) in the Cox partial likelihood. Second (and more seriously), identification of modest treatment effects is often difficult, since the model fails to acknowledge any similarity across the strata. We consider a number of hierarchical modeling approaches that preserve the integrity of the stratified design while offering a middle ground between traditional stratified and unstratified analyses. We investigate both fully parametric (Weibull) and semiparametric models, the latter based not on the Cox model but on an extension of an idea by Gelfand and Mallick (1995, Biometrics 51, 843-852), which models the integrated baseline hazard as a mixture of monotone functions. We illustrate the methods using data from a recent multicenter AIDS clinical trial, comparing their ease of use, interpretation, and degree of robustness with respect to estimates of both the unit-specific baseline hazards and the treatment effect.

11.
Most existing statistical methods for mapping quantitative trait loci (QTL) assume that the phenotype follows a normal distribution and is fully observed. However, some phenotypes have skewed distributions and may be censored. This note proposes a simple and efficient approach to QTL detection for censored traits with the Cox PH model, without estimating the baseline hazard function, which is a nuisance parameter.

12.
Objective: To fit an excess hazard regression model with a random effect associated with a geographical level, the Département in France, and to compare its parameter estimates with those obtained using a "fixed-effect" excess hazard regression model. Methods: An excess hazard regression model with a piecewise constant baseline hazard was used, and a normal distribution was assumed for the random effect. Likelihood maximization was performed using a numerical integration technique, Gauss-Hermite quadrature. Results were obtained with colorectal and thyroid cancer data from the French network of cancer registries. Results: The results were in agreement with what was theoretically expected. We showed a greater heterogeneity of the excess hazard in thyroid cancers than in colorectal cancers. The hazard ratios for the covariates as estimated with the mixed-effect model were close to those obtained with the fixed-effect model. However, unlike the fixed-effect model, the mixed-effect model allowed the analysis of data with a large number of clusters. The shrinkage estimator associated with the Département is an optimal measure of the Département-specific excess risk of death, and the variance of the random effect gives information on the within-cluster correlation. Conclusion: An excess hazard regression model with a random effect can be used for estimating variation in the risk of death due to cancer between many clusters of small sizes.
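The Gauss-Hermite step can be sketched directly with numpy; f below is a stand-in for a cluster-level likelihood contribution, and the random-effect SD is an invented value.

```python
import numpy as np

# Gauss-Hermite nodes/weights for integrating over b ~ N(0, s^2):
# E[f(b)] = (1/sqrt(pi)) * sum_j w_j * f(sqrt(2)*s*x_j).
nodes, weights = np.polynomial.hermite.hermgauss(20)
s = 0.4                                  # random-effect SD (assumption)

def f(b):
    # Stand-in for a cluster's likelihood contribution given effect b.
    return np.exp(-0.5 * b**2) * (1.0 + b) ** 2

approx = np.sum(weights * f(np.sqrt(2.0) * s * nodes)) / np.sqrt(np.pi)
print(approx)
```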

13.
14.
The natural survival, relative to properly chosen controls, of 26 beagle dogs injected once intravenously with an average of 0.58 ± 0.04 kBq 239Pu/kg, 23 dogs injected with 2.31 ± 0.43 kBq 226Ra/kg, 13 dogs injected with 1.84 ± 0.26 kBq 228Ra/kg, 12 dogs injected with 0.56 ± 0.030 kBq 228Th/kg, and 12 dogs injected with 21.13 ± 1.74 kBq 90Sr/kg was evaluated statistically. The amounts of these radionuclides are related directly to the estimated maximum permissible body burdens for humans suggested in ICRP II (1959). They constitute a level of exposure that initially was assumed to cause no deleterious effects in dogs. This study had two objectives: (1) identification of homogeneous control groups against which to evaluate the survival of the irradiated groups and (2) comparison of the survival characteristics and estimation of mortality or hazard rate ratios for control dogs vs dogs injected with the baseline dosages given above. It was shown, by goodness-of-fit plots, that the Cox proportional hazards model was an appropriate method of analysis. Therefore, covariates that could possibly influence survival were tested for significance. Only the effects of grand mal seizure, which is caused in epileptic dogs by an external stimulus and can be fatal if untreated, were significant (P < 0.0001). Consequently, in the final model, death from grand mal seizure was considered accidental. After censoring the dogs dying from grand mal seizure, it was established that the data for the control groups from previous and contemporary experiments could be pooled. The change in hazard rates relative to controls resulting from exposure to the baseline radionuclide level was modest: 1.6 times for 239Pu (P = 0.033), 1.0(4) for 226Ra (P = 0.86), 1.9 for 228Ra (P = 0.035), 2.5 for 228Th (P < 0.001), and 0.52 for 90Sr (P = 0.041). Bone tumor induction was clearly elevated in dogs injected with 239Pu and 228Th. When the effect of these bone tumors on survival was removed by censoring, the dogs injected with 239Pu were indistinguishable from the controls. In contrast, the effects of bone tumors on group survival of the 228Ra and 228Th dogs were not significant. Thus, no additional life-shortening effects beyond those attributable to bone tumor were suggested by these data for 239Pu, but other, as yet unspecified, confounders are suggested for 228Ra and 228Th.

15.
The semiparametric Cox proportional hazards model is routinely adopted to model time-to-event data. Proportionality is a strong assumption, especially when follow-up time, or study duration, is long. Zeng and Lin (J. R. Stat. Soc., Ser. B, 69:1-30, 2007) proposed a useful generalisation through a family of transformation models that allow hazard ratios to vary over time. In this paper we explore a variety of tests for the need for transformation, arguing that the Cox model is so ubiquitous that it should be considered the default model, to be discarded only if there is good evidence against its assumptions. Since fitting an alternative transformation model is more complicated than fitting the Cox model, especially as procedures are not yet incorporated in standard software, we focus mainly on tests that require a Cox fit only. A score test is derived, and we also consider the performance of omnibus goodness-of-fit tests based on Schoenfeld residuals. These tests can be extended to compare different transformation models. In addition, we explore the consequences of fitting a misspecified Cox model to data generated under a true transformation model. Data on the survival of 1043 leukaemia patients are used for illustration.
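One widely available omnibus check based on Schoenfeld residuals that, as the paper requires, needs only a Cox fit. The sketch assumes lifelines and its demo data; this is not the paper's score test.

```python
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi
from lifelines.statistics import proportional_hazard_test

df = load_rossi()
cph = CoxPHFitter().fit(df, "week", "arrest")

# Schoenfeld-residual-based test of proportional hazards, per covariate.
results = proportional_hazard_test(cph, df, time_transform="rank")
results.print_summary()
```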

16.
Pan W, Chappell R. Biometrics, 2002, 58(1):64-70.
We show that the nonparametric maximum likelihood estimate (NPMLE) of the regression coefficient from the joint likelihood (of the regression coefficient and the baseline survival) works well for the Cox proportional hazards model with left-truncated and interval-censored data, but the NPMLE may underestimate the baseline survival. Two alternatives are also considered: first, the marginal likelihood approach, extending Satten (1996, Biometrika 83, 355-370) to truncated data, in which the baseline distribution is eliminated as a nuisance parameter; and second, the monotone maximum likelihood estimate, which maximizes the joint likelihood under the assumption that the baseline distribution has a nondecreasing hazard function and was originally proposed to overcome the underestimation of the survival from the NPMLE for left-truncated data without covariates (Tsai, 1988, Biometrika 75, 319-324). The bootstrap is proposed for drawing inference. Simulations were conducted to assess their performance. The methods are applied to the Massachusetts Health Care Panel Study data set to compare the probabilities of losing functional independence for male and female seniors.
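The bootstrap proposed for inference can be sketched on ordinary right-censored demo data; the paper's left-truncated, interval-censored setting is not reproduced here.

```python
import numpy as np
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi

rng = np.random.default_rng(0)
df = load_rossi()

# Nonparametric bootstrap: resample subjects, refit, take percentile CI.
boot = []
for _ in range(200):
    sample = df.sample(n=len(df), replace=True,
                       random_state=int(rng.integers(10**9)))
    boot.append(CoxPHFitter().fit(sample, "week", "arrest").params_["fin"])

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"95% bootstrap CI for beta_fin: ({lo:.3f}, {hi:.3f})")
```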

17.

Background

Randomized controlled trials almost invariably use the hazard ratio (HR), calculated with a Cox proportional hazards model, as a treatment efficacy measure. Despite the widespread adoption of HRs, these provide a limited understanding of the treatment effect and may even provide a biased estimate when the assumption of proportional hazards in the Cox model is not verified by the trial data. Additional treatment effect measures on the survival probability or the time scale may be used to supplement HRs, but a framework for the simultaneous generation of these measures is lacking.

Methods

By splitting follow-up time at the nodes of a Gauss-Lobatto numerical quadrature rule, techniques for Poisson generalized additive models (PGAMs) can be adopted for flexible hazard modeling. Straightforward post-estimation simulation transforms PGAM estimates of the log hazard into estimates of the survival function. These in turn were used to calculate relative and absolute risks, or even differences in restricted mean survival time, between treatment arms. We illustrate our approach with extensive simulations and in two trials: IPASS (in which the proportionality of hazards was violated) and HEMO, a long-duration study conducted under evolving standards of care on a heterogeneous patient population.

Findings

PGAMs can generate estimates of the survival function and the hazard ratio that are essentially identical to those obtained by Kaplan-Meier curve analysis and the Cox model. PGAMs can simultaneously provide multiple measures of treatment efficacy after a single data pass. Furthermore, they supported not only unadjusted (overall treatment effect) but also subgroup and adjusted analyses, while incorporating multiple time scales and accounting for non-proportional hazards in the survival data.

Conclusions

By augmenting the conventionally reported HR, PGAMs have the potential to support the inferential goals of multiple stakeholders involved in the evaluation and appraisal of clinical trial results under both proportional and non-proportional hazards.
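The core device behind this approach, splitting follow-up and fitting a Poisson model with a log-exposure offset so the linear predictor models the log hazard, can be sketched as a piecewise-exponential fit. A full PGAM would use spline terms and Gauss-Lobatto nodes; the data here are a bundled demo set and the cut points are invented.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from lifelines.datasets import load_rossi

df = load_rossi()
cuts = [0, 13, 26, 39, 52]          # crude split; a PGAM would use quadrature nodes

# Expand each subject into one row per interval at risk.
rows = []
for _, r in df.iterrows():
    for a, b in zip(cuts[:-1], cuts[1:]):
        if r["week"] <= a:
            break
        rows.append({"d": int(r["arrest"] and r["week"] <= b),
                     "exposure": min(r["week"], b) - a,
                     "interval": f"[{a},{b})", "fin": r["fin"], "age": r["age"]})
long = pd.DataFrame(rows)

# Poisson model for the event indicator with log(exposure) offset:
# interval dummies give a piecewise-constant log baseline hazard.
X = pd.get_dummies(long[["interval", "fin", "age"]],
                   columns=["interval"]).astype(float)
fit = sm.GLM(long["d"], X, family=sm.families.Poisson(),
             offset=np.log(long["exposure"])).fit()
print(fit.params.round(3))
```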

18.
We propose a constrained maximum partial likelihood estimator for dimension reduction in integrative (e.g., pan-cancer) survival analysis with high-dimensional predictors. We assume that for each population in the study, the hazard function follows a distinct Cox proportional hazards model. To borrow information across populations, we assume that each of the hazard functions depends only on a small number of linear combinations of the predictors (i.e., "factors"). We estimate these linear combinations using an algorithm based on "distance-to-set" penalties, which allows us to impose both low-rankness and sparsity on the regression coefficient matrix estimator. We derive asymptotic results revealing that our estimator is more efficient than fitting a separate proportional hazards model for each population. Numerical experiments suggest that our method outperforms competitors under various data-generating models. We use our method to perform a pan-cancer survival analysis relating protein expression to survival across 18 distinct cancer types. Our approach identifies six linear combinations, depending on only 20 proteins, which explain survival across the cancer types. Finally, to validate our fitted model, we show that our estimated factors can lead to better prediction than competitors on four external datasets.
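The two set constraints involved here can be sketched as plain matrix projections; the actual algorithm couples distance-to-set penalties with partial-likelihood steps, which this sketch omits, and the matrix dimensions are invented.

```python
import numpy as np

rng = np.random.default_rng(5)
B = rng.normal(size=(30, 6))   # stand-in coefficient matrix (predictors x populations)

def project_rank(B, r):
    """Nearest matrix of rank <= r (truncated SVD)."""
    U, s, Vt = np.linalg.svd(B, full_matrices=False)
    s[r:] = 0.0
    return (U * s) @ Vt

def project_sparse(B, k):
    """Keep only the k largest-magnitude entries."""
    out = np.zeros_like(B)
    idx = np.argsort(np.abs(B).ravel())[-k:]
    out.ravel()[idx] = B.ravel()[idx]
    return out

B_lr = project_rank(B, r=2)
B_sp = project_sparse(B_lr, k=40)
print(np.linalg.matrix_rank(B_lr), np.count_nonzero(B_sp))
```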

19.
It is generally assumed that the daily probability of survival of mosquitoes is independent of age. To test this assumption, we conducted a three-year experimental fieldwork study (2005-2007) at Fortaleza-CE in Brazil, determining daily survival rates of the dengue vector Aedes aegypti (L.). Survival rates of adult Ae. aegypti may be age-dependent, and statistical analysis is a sensitive approach for comparing patterns of mosquito survival. The mosquito survival data were better fit by a Weibull survival function than by the more traditionally used Gompertz or logistic survival functions, although all three functions often fit the survival data, with the empirical tails usually falling between the values predicted by the three. We corroborate that the mortality of Ae. aegypti in semi-natural conditions can no longer be considered constant over the life of adult mosquitoes but varies according to age and environmental conditions under a tropical climate. This study estimates the variability in the survival rate of Ae. aegypti and the environmental factors related to such variability. The statistical analysis shows that the fitting ability with respect to the hazard function was, in decreasing order: the seasonal Cox model, the three-parameter Gompertz, and the three-parameter Weibull, which was similar to the three-parameter logistic. The advantage of the Cox model is that it is convenient for exploring the relationship between survival and several explanatory variables, preserving each variable in its original quantitative form and using a maximum of information. The survival analyses indicate that mosquito mortality is both age- and environment-dependent.

20.
Analysis of time-to-event data in clinical and epidemiological studies often encounters missing covariate values, and the missing-at-random assumption is commonly adopted, which assumes that missingness depends on the observed data, including the observed outcome, which is the minimum of the survival and censoring times. However, it is conceivable that in certain settings missingness of covariate values is related to the survival time but not to the censoring time. This is especially so when covariate missingness is related to an unmeasured variable affected by the patient's illness and prognostic factors at baseline. If this is the case, then the covariate missingness is not at random when the survival time is censored, and this creates a challenge for data analysis. In this article, we propose an approach to deal with such survival-time-dependent covariate missingness based on the well-known Cox proportional hazards model. Our method is based on inverse propensity weighting, with the propensity estimated by nonparametric kernel regression. Our estimators are consistent and asymptotically normal, and their finite-sample performance is examined through simulation. An application to a real-data example is included for illustration.
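A toy sketch of this recipe: a kernel-estimated propensity of observing the covariate, followed by an inverse-propensity-weighted Cox fit. The data-generating values and bandwidth are invented, and the propensity model is deliberately simplified relative to the paper's.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(11)

# Simulated data with true log-hazard ratio 0.7 for covariate x.
n = 1500
x = rng.normal(size=n)
T = rng.exponential(1.0 / np.exp(0.7 * x))
C = rng.exponential(2.0, size=n)
t, d = np.minimum(T, C), (T <= C).astype(int)

# Covariate missingness depends on the survival time T (invented mechanism).
p_obs = 1.0 / (1.0 + np.exp(-1.0 + 0.8 * np.log1p(T)))
observed = rng.random(n) < p_obs

def kernel_propensity(h=0.3):
    """Nadaraya-Watson estimate of P(observed | follow-up time)."""
    K = np.exp(-0.5 * ((t[:, None] - t[None, :]) / h) ** 2)
    return (K @ observed.astype(float)) / K.sum(axis=1)

pi = kernel_propensity()
cc = observed                                  # complete cases only
df = pd.DataFrame({"t": t[cc], "d": d[cc], "x": x[cc], "w": 1.0 / pi[cc]})
cph = CoxPHFitter().fit(df, "t", "d", weights_col="w", robust=True)
print(cph.params_)                             # should be near 0.7
```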
