Similar Literature
20 similar records found
1.
In survival analysis, when mortality reaches a peak after some finite period and then slowly declines, it is appropriate to use a model with a nonmonotonic failure rate. In this paper we study the log-logistic model, whose failure rate exhibits this behavior and whose mean residual life behaves in the reverse fashion. Maximum likelihood estimation of the parameters is examined, and it is proved analytically that unique maximum likelihood estimates of the parameters exist. A lung cancer data set is analyzed. Confidence intervals for the parameters, as well as for the critical points of the failure rate and mean residual life functions, are obtained for the high performance status (PS) and low PS subgroups, where performance status is a measure of general medical status.
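The nonmonotonic behavior this abstract describes is easy to illustrate numerically. Below is a minimal sketch (not the paper's code) of the log-logistic hazard, using the standard closed form t* = alpha * (beta - 1)^(1/beta) for the critical point when the shape parameter exceeds 1; the parameter values are hypothetical.

```python
import math

def loglogistic_hazard(t, alpha, beta):
    """Hazard of the log-logistic distribution with scale alpha and shape beta."""
    z = (t / alpha) ** beta
    return (beta / alpha) * (t / alpha) ** (beta - 1) / (1.0 + z)

def hazard_peak(alpha, beta):
    """Critical point of the hazard; the hazard is unimodal only when beta > 1."""
    if beta <= 1:
        return None  # hazard is monotone decreasing, no interior peak
    return alpha * (beta - 1) ** (1.0 / beta)

alpha, beta = 10.0, 2.5                     # hypothetical scale and shape
t_star = hazard_peak(alpha, beta)           # location of the failure-rate peak
peak = loglogistic_hazard(t_star, alpha, beta)
```

Confidence intervals for t_star, as computed in the paper, would then follow from the delta method applied to the maximum likelihood estimates of alpha and beta.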

2.
A number of imprinted genes have been observed in plants, animals and humans. They not only control growth and developmental traits, but may also be responsible for survival traits. Based on the Cox proportional hazards (PH) model, we constructed a general parametric model for dissecting genomic imprinting, in which a baseline hazard function is selectable for fitting the effects of imprinted quantitative trait loci (iQTL) genotypes on the survival curve. The expectation–maximisation (EM) algorithm is derived for obtaining the maximum likelihood estimates of the iQTL parameters. The imprinting patterns of the detected iQTL are statistically tested under a series of null hypotheses. The Bayesian information criterion (BIC) is employed to choose an optimal baseline hazard function with maximum likelihood and parsimonious parameterisation. We applied the proposed approach to analyse published data from an F2 population of mice and concluded that, among five commonly used survival distributions, the log-logistic distribution is the optimal baseline hazard function for the survival time of hyperoxic acute lung injury (HALI). Under this optimal model, five QTL were detected, four of which are imprinted in different imprinting patterns.

3.
Survival curves of Listeria innocua CDW47 under high hydrostatic pressure were obtained at four pressure levels (138, 207, 276, 345 MPa) and four temperatures (25, 35, 45, 50 °C) in peptone solution. Tailing was observed in the survival curves. Elevated temperatures and pressures substantially promoted the inactivation of L. innocua. A linear and two non-linear (Weibull and log-logistic) models were fitted to these data and their goodness of fit was compared. Regression coefficients (R2), root mean square error (RMSE) and accuracy factor (Af) values, together with residual plots, suggested that the linear model, although it produced good fits for some pressure-temperature combinations, was not as appropriate as the non-linear models for representing the data. The residual and correlation plots strongly suggested that, of the non-linear models studied, the log-logistic model fitted the data better than the Weibull model. Such pressure-temperature inactivation models form the engineering basis for the design, evaluation and optimization of high hydrostatic pressure processing as a new preservation technique.
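The model comparison described above can be sketched on synthetic data (not the study's measurements). The Weibull inactivation form log10(N/N0) = -(t/delta)^p is the common parameterisation, with p < 1 producing the tailing the abstract mentions; here it is fitted by a crude grid search rather than proper nonlinear regression, and compared with a log-linear fit by RMSE.

```python
import math

# Synthetic survival data showing tailing: (time in min, log10 N/N0).
data = [(0.0, 0.0), (2.0, -1.8), (4.0, -2.6), (6.0, -3.1), (8.0, -3.4)]

def weibull_log10_survival(t, delta, p):
    """Weibull inactivation model: log10(N/N0) = -(t/delta)**p (tailing when p < 1)."""
    return -((t / delta) ** p)

def rmse_weibull_fit(delta, p):
    return math.sqrt(sum((weibull_log10_survival(t, delta, p) - y) ** 2
                         for t, y in data) / len(data))

# Log-linear model fitted by least squares through the origin.
k = -sum(t * y for t, y in data) / sum(t * t for t, _ in data)
rmse_linear = math.sqrt(sum((-k * t - y) ** 2 for t, y in data) / len(data))

# Weibull model fitted by a crude grid search over (delta, p).
delta, p = min(((d / 10.0, q / 100.0) for d in range(1, 60) for q in range(10, 200)),
               key=lambda pr: rmse_weibull_fit(*pr))
rmse_weibull = rmse_weibull_fit(delta, p)
```

On concave-up ("tailing") data like this, the Weibull fit achieves a clearly lower RMSE than the straight line, mirroring the study's conclusion that the linear model is inadequate.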

4.
In this paper, six mathematical models were applied to model time trends of smoking cessation. Both statistical and non-statistical methods were used, comprising the exponential, ideodynamic, log-logistic, Pareto, sickle and Weibull models. All models allowed for both permanent abstinence and relapse to smoking. Time trends from all models were compared with data from the Multiple Risk Factor Intervention Trial (MRFIT) program. The Pareto, log-logistic, Weibull and ideodynamic models yielded satisfactory fits to the data, while the sickle and exponential models did not. Although the data used in this paper were not sufficient to distinguish among the four well-fitting models, the methodology will be useful for further narrowing the model choices as additional testing data become available.

5.
In this article, we consider the setting where the event of interest can occur repeatedly for the same subject (i.e., a recurrent event, e.g., hospitalization) and may be stopped permanently by a terminating event (e.g., death). Among the different ways to model recurrent/terminal event data, the marginal mean (i.e., averaging over the survival distribution) is of primary interest from a public health or health economics perspective. Often, the difference between treatment-specific recurrent event means will not be constant over time, particularly when treatment-specific differences in survival exist. In such cases, it makes more sense to quantify the treatment effect by the cumulative difference in the recurrent event means than by the instantaneous difference in the rates. We propose a method that compares treatments by separately estimating the survival probabilities and the recurrent event rates given survival, then integrating to obtain the mean number of events. The proposed method combines an additive model for the conditional recurrent event rate with a proportional hazards model for the terminating event hazard. The treatment effects on survival and on the recurrent event rate among survivors are estimated in constructing our measure and explain the mechanism generating the difference under study. The example that motivates this research is the repeated occurrence of hospitalization among kidney transplant recipients, where the effect of expanded criteria donor (ECD) compared with non-ECD kidney transplantation on the mean number of hospitalizations is of interest.
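The quantity of interest in this abstract, the marginal mean number of recurrent events, is the integral over time of the survival probability times the conditional event rate. A hedged numeric sketch with hypothetical constant rates (lam for the death hazard, r for the event rate), chosen so the integral has a closed form to check against:

```python
import math

def marginal_mean_events(surv, rate, t, n=2000):
    """mu(t) = integral_0^t S(u) * r(u) du, by the trapezoidal rule."""
    h = t / n
    total = 0.0
    for i in range(n + 1):
        u = i * h
        w = 0.5 if i in (0, n) else 1.0   # trapezoidal endpoint weights
        total += w * surv(u) * rate(u)
    return total * h

lam, r = 0.3, 1.2   # hypothetical death hazard and event rate (per year)
mu5 = marginal_mean_events(lambda u: math.exp(-lam * u), lambda u: r, 5.0)
# Closed form for constant rates: mu(t) = r * (1 - exp(-lam * t)) / lam.
closed_form = r * (1.0 - math.exp(-lam * 5.0)) / lam
```

The paper's estimator plugs model-based estimates of S(u) and r(u) into this same integral; the constant-rate case above only serves to make the construction concrete.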

6.
The applicability of the Williams, Landel, and Ferry (WLF) equation, modified to take into account the effect of melt-dilution, and of an empirical log-logistic equation was evaluated for modeling the kinetics of diffusion-controlled reactions in frozen systems. Kinetic data for the pectin methylesterase-catalyzed hydrolysis of pectin in four model systems with different glass transition temperatures (sucrose, maltodextrin with DE = 16.5-19.5, carboxymethylcellulose (CMC) and fructose) over a temperature range of -24 to 0 °C were used. The modified WLF equation was evaluated with a concentration-dependent glass transition temperature (T(g)) as well as with the glass transition temperature of the maximally freeze-concentrated matrix (T(g)') as reference temperatures. The equation with concentration-dependent T(g) described the reaction kinetics reasonably well in all the model systems studied. However, the kinetics were better described by a linear relationship between log(V(0)/V(0ref)) and (T - T(ref)) in all cases except CMC. The log-logistic equation also described the kinetics reasonably well. The effect of melt-dilution on reactant concentration was found to be minimal in all cases.
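For reference, the unmodified WLF equation gives the log10 shift factor relative to a reference temperature (commonly T(g)); C1 = 17.44 and C2 = 51.6 are the textbook "universal" constants. The melt-dilution correction studied in the paper is not included in this sketch.

```python
def wlf_log_shift(T, T_ref, C1=17.44, C2=51.6):
    """WLF equation: log10 of the shift factor a_T = tau(T) / tau(T_ref).
    Negative above T_ref (faster relaxation), positive below."""
    dT = T - T_ref
    return -C1 * dT / (C2 + dT)
```

For example, with a hypothetical T(g) of -10 °C, the shift factor at 0 °C is negative (dynamics speed up above the glass transition) and positive at -20 °C.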

7.
There are many situations where intermittent short-term exposures of a certain kind are thought to temporarily enhance the risk of onset of an adverse health event (illness). When the hazard rate of the illness is small, it is desirable to investigate this possible association using only data on cases occurring in a finite observation period. Here we extend a method for such an analysis by allowing the baseline hazard for the illness to depend on increasing age over the observation period and by using age at the times of exposure, a time-dependent variable, as a covariate in the effect of the transient exposure. The method is illustrated with a study of the possible association of long-haul air travel and hospitalization for venous thromboembolism over an observation period of 19 years. It is demonstrated that allowing for aging over the observation period can avoid bias in the estimated effect size when the baseline hazard for the illness increases with age and exposures occur irregularly over time.

8.
In this paper we present and discuss a novel, simple and easy-to-implement parametric modeling approach to assessing synergy. An extended three-parameter log-logistic model is used to analyse the data and to calculate confidence intervals for the interaction indices. In addition, the model corrects for bias due to plate-location effects. The analysis is performed with PROC NLMIXED, and SAS code is provided. The approach is illustrated using data from an oncology study in which the inhibitory effect of a combination of two compounds is studied using 96-well plates and a fixed-ratio design.

9.
The goal of relative survival methodology is to compare the survival experience of a cohort with that of the background population. Most often an additive excess hazard model is employed, which assumes that each person's hazard is a sum of 2 components: the population hazard obtained from life tables and an excess hazard attributable to the specific condition. Usually covariate effects on the excess hazard are assumed to have a proportional hazards structure with parametrically modelled baseline. In this paper, we introduce a new fitting procedure using the expectation–maximization algorithm, treating the cause of death as missing data. The method requires no assumptions about the baseline excess hazard, thus reducing the risk of bias through misspecification. It accommodates the possibility of knowledge of cause of death for some patients, and as a side effect, the method yields an estimate of the ratio between the excess and the population hazard for each subject. More importantly, it estimates the baseline excess hazard flexibly with no additional degrees of freedom spent. Finally, it is a generalization of the Cox model, meaning that all the wealth of options in existing software for the Cox model can be used in relative survival. The method is applied to a data set on survival after myocardial infarction, where it shows how a particular form of the hazard function could be missed using the existing methods.
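The idea of treating cause of death as missing data in an EM algorithm can be sketched in a toy setting: a constant known population hazard, a constant unknown excess hazard, and all subjects followed to death. This is a deliberately simplified illustration, not the paper's semiparametric procedure.

```python
import random

random.seed(1)
lam_pop = 0.10                    # population hazard, known from life tables
lam_exc_true = 0.25               # excess hazard we try to recover
times = [random.expovariate(lam_pop + lam_exc_true) for _ in range(5000)]

lam = 1.0                         # starting value for the excess hazard
for _ in range(100):
    # E-step: posterior probability that a death was due to the excess hazard
    w = lam / (lam + lam_pop)
    # M-step: MLE of a constant excess hazard given the expected excess deaths
    lam = w * len(times) / sum(times)
```

The fixed point satisfies lam + lam_pop = n / sum(times), i.e., the total-hazard MLE minus the known population hazard; the E-step weight w is exactly the per-subject excess-to-total hazard ratio the abstract mentions as a side product.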

10.
Genomic imprinting, a genetic phenomenon of non-equivalent allele expression that depends on parental origin, has been ubiquitously observed in nature. It not only controls growth and developmental traits but may also be responsible for survival traits. Based on the accelerated failure time model, we construct a general parametric model for mapping imprinted QTL (iQTL). Within the framework of interval mapping, maximum likelihood estimation of the iQTL parameters is implemented via the EM algorithm. The imprinting patterns of the detected iQTL are statistically tested according to a series of null hypotheses. The BIC model selection criterion is employed to choose an optimal baseline hazard function with maximum likelihood and parsimonious parameters. Simulations are used to validate the proposed mapping procedure. A published dataset from a mouse model system is used to illustrate the proposed framework. Results show that, among the five commonly used survival distributions, the log-logistic distribution is the optimal baseline hazard function for mapping QTL of hyperoxic acute lung injury (HALI) survival; under the log-logistic distribution, four QTL were identified, of which only one was inherited in Mendelian fashion, whereas the others were imprinted in different imprinting patterns.

11.
We present a parametric family of regression models for interval-censored event-time (survival) data that accommodates both fixed (e.g. baseline) and time-dependent covariates. The model employs a three-parameter family of survival distributions that includes the Weibull, negative binomial, and log-logistic distributions as special cases, and can be applied to data with left-, right-, interval-, or non-censored event times. Standard methods, such as Newton-Raphson, can be employed to estimate the model, and the resulting estimates have an asymptotically normal distribution about the true values with a covariance matrix that is consistently estimated by the information function. The deviance function is described to assess model fit, and a robust sandwich estimate of the covariance may also be employed to provide asymptotically robust inferences when the model assumptions do not apply. Spline functions may also be employed to allow for non-linear covariates. The model is applied to data from a long-term study of type 1 diabetes to describe the effects of longitudinal measures of glycemia (HbA1c) over time (the time-dependent covariate) on the risk of progression of diabetic retinopathy (eye disease), an interval-censored event-time outcome.

12.
The accelerated failure time regression model is most commonly used with right-censored survival data. This report studies the use of a Weibull-based accelerated failure time regression model when left- and interval-censored data are also observed. Two alternative methods of analysis are considered. First, the maximum likelihood estimates (MLEs) for the observed censoring pattern are computed. These are compared with estimates where midpoints are substituted for left- and interval-censored data (midpoint estimator, or MDE). Simulation studies indicate that for relatively large samples there are many instances when the MLE is superior to the MDE. For samples where the hazard rate is flat or nearly so, or where the percentage of interval-censored data is small, the MDE is adequate. An example using Framingham Heart Study data is discussed.
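The two estimators compared in this report can be sketched for the flat-hazard case (exponential, i.e. Weibull with shape 1), where the abstract says the midpoint estimator is adequate. The intervals are hypothetical, and the MLE is found by a crude grid search instead of Newton-Raphson.

```python
import math

# Hypothetical interval-censored failure times: each event lies in (L, R].
intervals = [(0.5, 1.5), (1.0, 2.0), (0.2, 0.8), (2.0, 3.5), (0.8, 1.6), (1.5, 2.5)]

def loglik(lam):
    """Interval-censored log-likelihood under a constant hazard lam:
    each interval contributes log(S(L) - S(R)) = log(exp(-lam*L) - exp(-lam*R))."""
    return sum(math.log(math.exp(-lam * L) - math.exp(-lam * R)) for L, R in intervals)

# Maximum likelihood estimate by grid search (a real analysis would use Newton-Raphson).
mle = max((k / 1000.0 for k in range(1, 3000)), key=loglik)

# Midpoint estimator: substitute midpoints, then apply the exact-data MLE n / sum(t).
mde = len(intervals) / sum((L + R) / 2.0 for L, R in intervals)
```

With a flat hazard and moderately narrow intervals the two estimates land close together, consistent with the simulation finding quoted above; the gap widens as the hazard departs from constancy.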

13.
A five-parameter competing hazard model of the age pattern of mortality is described, and methods of fitting it to survivorship, death rate, and age structure data are developed and presented. The methods are then applied to published life table and census data to construct life tables for a Late Woodland population, a Christian period Nubian population, and the Yanomama. The advantage of this approach over the use of model life tables is that the hazard model facilitates life-table construction without imposing a particular age pattern of mortality on the data. This development makes it possible to use anthropological data to extend the study of human variation in mortality patterns to small populations.

14.
Yin G, Ibrahim JG. Biometrics 2005;61(1):208-216
For multivariate failure time data, we propose a new class of shared gamma frailty models by imposing the Box-Cox transformation on the hazard function, and the product of the baseline hazard and the frailty. This novel class of models allows for a very broad range of shapes and relationships between the hazard and baseline hazard functions. It includes the well-known Cox gamma frailty model and a new additive gamma frailty model as two special cases. Due to the nonnegative hazard constraint, this shared gamma frailty model is computationally challenging in the Bayesian paradigm. The joint priors are constructed through a conditional-marginal specification, in which the conditional distribution is univariate, and it absorbs the nonlinear parameter constraints. The marginal part of the prior specification is free of constraints. The prior distributions allow us to easily compute the full conditionals needed for Gibbs sampling, while incorporating the constraints. This class of shared gamma frailty models is illustrated with a real dataset.

15.
This paper is concerned with the analysis of zero-inflated count data when time of exposure varies. It proposes a modified zero-inflated count data model where the probability of an extra zero is derived from an underlying duration model with Weibull hazard rate. The new model is compared to the standard Poisson model with logit zero inflation in an application to the effect of treatment with thiotepa on the number of new bladder tumors.
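A sketch of the kind of pmf such a model implies, under the assumption (mine, for illustration; the paper's exact link may differ) that the extra-zero probability equals the Weibull survival of the exposure time and that the Poisson mean scales linearly with exposure:

```python
import math

def zip_pmf(y, t, rate, shape, scale):
    """Zero-inflated Poisson pmf with exposure-dependent zero inflation:
    the extra-zero probability is the Weibull survival of exposure time t."""
    pi0 = math.exp(-((t / scale) ** shape))   # P(structural zero) after exposure t
    mu = rate * t                             # Poisson mean proportional to exposure
    pois = math.exp(-mu) * mu ** y / math.factorial(y)
    return pi0 + (1.0 - pi0) * pois if y == 0 else (1.0 - pi0) * pois

# Longer exposure shrinks the zero-inflation component (hypothetical parameters).
p0_short = zip_pmf(0, t=0.5, rate=0.8, shape=1.5, scale=3.0)
p0_long = zip_pmf(0, t=4.0, rate=0.8, shape=1.5, scale=3.0)
total = sum(zip_pmf(y, t=2.0, rate=0.8, shape=1.5, scale=3.0) for y in range(60))
```

The point of the construction is that zero inflation is no longer a free logit parameter but decays with exposure time according to the Weibull hazard, which is the modification the abstract describes.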

16.
A model is discussed for incorporating information from a time-dependent covariable (an intervening event) and covariables independent of time into the analysis of survival data. In the model, it is assumed that individuals are potentially subject to two paths to failure, one including the intervening event and the other not. Additional assumptions are that failure times associated with the two paths are independent and that the time to failure subsequent to the intervening event depends on the intervening event time. Allowing the underlying hazard rates for the model to follow a Weibull form, use of the model and methods for fitting and hypothesis testing are illustrated by application to a follow-up study involving industrial workers in which disability retirement was the intervening event. Extensions of the model to accommodate grouped survival data are presented.

17.
An estimator of the hazard rate function from discrete failure time data is obtained by semiparametric smoothing of the (nonsmooth) maximum likelihood estimator, which is achieved by repeated multiplication of a Markov chain transition-type matrix. This matrix is constructed so as to have a given standard discrete parametric hazard rate model, termed the vehicle model, as its stationary hazard rate. As with the discrete density estimation case, the proposed estimator gives improved performance when the vehicle model is a good one and otherwise provides a nonparametric method comparable to the only purely nonparametric smoother discussed in the literature. The proposed semiparametric smoothing approach is then extended to hazard models with covariates and is illustrated by applications to simulated and real data sets.

18.
In the presence of competing causes of event occurrence (e.g., death), the interest might not only be in the overall survival but also in the so-called net survival, that is, the hypothetical survival that would be observed if the disease under study were the only possible cause of death. Net survival estimation is commonly based on the excess hazard approach, in which the hazard rate of individuals is assumed to be the sum of a disease-specific hazard rate and an expected hazard rate, the latter assumed to be well approximated by the mortality rates obtained from general population life tables. However, this assumption might not be realistic if the study participants are not comparable with the general population. Also, the hierarchical structure of the data can induce a correlation between the outcomes of individuals coming from the same clusters (e.g., hospital, registry). We proposed an excess hazard model that corrects simultaneously for these two sources of bias, instead of dealing with them independently as before. We assessed the performance of this new model and compared it with three similar models, using an extensive simulation study as well as an application to breast cancer data from a multicenter clinical trial. The new model performed better than the others in terms of bias, root mean square error, and empirical coverage rate. The proposed approach might be useful to account simultaneously for the hierarchical structure of the data and the non-comparability bias in studies such as long-term multicenter clinical trials, when there is interest in the estimation of net survival.

19.
Skaug HJ, Schweder T. Biometrics 1999;55(1):29-36
The likelihood function for data from independent observer line transect surveys is derived, and a hazard model is proposed for the situation where animals are available for detection only at discrete time points. Under the assumption that the time points of availability follow a Poisson point process, we obtain an analytical expression for the detection function. We discuss different criteria for choosing the hazard function and consider in particular two different parametric families of hazard functions. Discrete and continuous hazard models are compared and the robustness of the discrete model is investigated. Finally, the methodology is applied to data from a survey for minke whales in the northeastern Atlantic.

20.
Wang L, Du P, Liang H. Biometrics 2012;68(3):726-735
In some survival analyses in medical studies, there are long-term survivors who can be considered permanently cured. The goals in these studies are to estimate the noncured probability of the whole population and the hazard rate of the susceptible subpopulation. When covariates are present, as often happens in practice, understanding covariate effects on the noncured probability and hazard rate is of equal importance. The existing methods are limited to parametric and semiparametric models. We propose a two-component mixture cure rate model with nonparametric forms for both the cure probability and the hazard rate function. Identifiability of the model is guaranteed by an additive assumption that allows no time-covariate interactions in the logarithm of the hazard rate. Estimation is carried out by an expectation-maximization algorithm maximizing a penalized likelihood. For inference, we apply the Louis formula to obtain point-wise confidence intervals for the noncured probability and hazard rate. Asymptotic convergence rates of our function estimates are established. We then evaluate the proposed method by extensive simulations. We analyze the survival data from a melanoma study and find interesting patterns for this study.
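The E-step of the EM algorithm in a mixture cure model needs the posterior probability that a still event-free subject is susceptible. A minimal parametric sketch (constant susceptible hazard, which the paper replaces with nonparametric estimates; pi denotes the noncured probability):

```python
import math

def posterior_noncured(t, pi, hazard):
    """P(susceptible | event-free at time t) in a two-component mixture cure model.
    pi is the noncured (susceptible) probability; hazard is a constant
    susceptible-subpopulation hazard, an illustrative simplification."""
    s_u = math.exp(-hazard * t)        # susceptible survival at t
    s_pop = (1.0 - pi) + pi * s_u      # population survival: cured + susceptible
    return pi * s_u / s_pop
```

The posterior equals pi at t = 0 and decays toward 0 as follow-up lengthens: the longer a subject survives event-free, the more likely they belong to the cured component, which is the mechanism the long-term-survivor plateau reflects.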
