Similar Articles
1.
2.
We present a parametric family of regression models for interval-censored event-time (survival) data that accommodates both fixed (e.g., baseline) and time-dependent covariates. The model employs a three-parameter family of survival distributions that includes the Weibull, negative binomial, and log-logistic distributions as special cases, and can be applied to data with left-, right-, interval-, or non-censored event times. Standard methods, such as Newton-Raphson, can be employed to estimate the model, and the resulting estimates have an asymptotically normal distribution about the true values with a covariance matrix that is consistently estimated by the information function. The deviance function is described to assess model fit, and a robust sandwich estimate of the covariance may also be employed to provide asymptotically robust inferences when the model assumptions do not apply. Spline functions may also be employed to allow for non-linear covariates. The model is applied to data from a long-term study of type 1 diabetes to describe the effects of longitudinal measures of glycemia (HbA1c) over time (the time-dependent covariate) on the risk of progression of diabetic retinopathy (eye disease), an interval-censored event-time outcome.
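For intuition, a minimal sketch of the kind of likelihood such parametric models maximize is given below, assuming a plain two-parameter Weibull rather than the paper's three-parameter family; the function names (`weibull_surv`, `interval_loglik`) are illustrative, not from the paper.

```python
import math

def weibull_surv(t, shape, scale):
    """Weibull survival function S(t) = exp(-(t/scale)**shape)."""
    return math.exp(-((t / scale) ** shape))

def interval_loglik(obs, shape, scale):
    """Log-likelihood for (possibly) interval-censored observations.

    Each observation is a pair (left, right):
      (0, r)            -> left-censored   (event before r)
      (l, inf)          -> right-censored  (event after l)
      (l, r) with l < r -> interval-censored
      (t, t)            -> exact event time
    """
    ll = 0.0
    for left, right in obs:
        if left == right:
            # exact event: contribute the density f(t) = h(t) * S(t)
            t = left
            f = (shape / scale) * (t / scale) ** (shape - 1) * weibull_surv(t, shape, scale)
            ll += math.log(f)
        else:
            # censored: contribute the probability mass S(left) - S(right)
            s_l = weibull_surv(left, shape, scale) if left > 0 else 1.0
            s_r = weibull_surv(right, shape, scale) if math.isfinite(right) else 0.0
            ll += math.log(s_l - s_r)
    return ll
```

In a full analysis this log-likelihood would carry regression covariates in the scale parameter and be maximized by Newton-Raphson, as the abstract describes.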

3.
This paper focuses on the methodology developed for analyzing a multivariate interval-censored data set from an AIDS observational study. A purpose of the study was to determine the natural history of the opportunistic infection cytomegalovirus (CMV) in an HIV-infected individual. For this observational study, laboratory tests were performed at scheduled clinic visits to test for the presence of the CMV virus in the blood and in the urine (called CMV shedding in the blood and urine). The study investigators were interested in determining whether the stage of HIV disease at study entry was predictive of an increased risk for CMV shedding in either the blood or the urine. If all patients had made each clinic visit, the data would be multivariate grouped failure time data and published methods could be used. However, many patients missed several visits, and when they returned, their lab tests indicated a change in their blood and/or urine CMV shedding status, resulting in interval-censored failure time data. This paper outlines a method for applying the proportional hazards model to the analysis of multivariate interval-censored failure time data from a study of CMV in HIV-infected patients.

4.
A new method for analyzing stage-frequency data is proposed which is based on the estimation of rates of transition between one stage and the next highest stage in one unit of time, and a unit time survival rate that is assumed to be constant. Once these estimates are calculated it becomes possible to also estimate the mean durations of stages, stage-specific survival rates, and numbers entering stages. An advantage of the method is that it can be applied with any distribution of entry times to stage 1, and any distribution of numbers in stages when sampling begins. Use of the method is illustrated on data from a copepod population in a Canadian lake.
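The estimator itself is specific to the paper, but the data structure it targets can be illustrated with a toy cohort simulation under the paper's key assumption of a constant unit-time survival rate; the parameter names and the per-step transition mechanism here are assumptions for illustration only.

```python
import random

def simulate_stage_counts(n0, survival, transition, stages, steps, seed=0):
    """Simulate a cohort through successive developmental stages.

    Each time step, every individual survives with probability
    `survival`; survivors in stage k advance to stage k+1 with
    probability `transition`. Returns the stage-frequency table:
    counts[t][k] = number of individuals in stage k at time t.
    """
    rng = random.Random(seed)
    stage_of = [0] * n0          # everyone enters in stage 0
    counts = []
    for _ in range(steps):
        tally = [0] * stages
        for k in stage_of:
            tally[k] += 1
        counts.append(tally)
        survivors = []
        for k in stage_of:
            if rng.random() < survival:
                if k < stages - 1 and rng.random() < transition:
                    k += 1
                survivors.append(k)
        stage_of = survivors
    return counts
```

Under this toy model the ratio of successive total counts estimates the constant unit-time survival rate, which is the quantity the proposed method exploits.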

5.
6.
We propose a joint analysis of recurrent and nonrecurrent event data subject to general types of interval censoring. The proposed analysis allows for general semiparametric models, including the Box–Cox transformation and inverse Box–Cox transformation models for the recurrent and nonrecurrent events, respectively. A frailty variable is used to account for the potential dependence between the recurrent and nonrecurrent event processes, while leaving the distribution of the frailty unspecified. We apply the pseudolikelihood for interval-censored recurrent event data, usually termed panel count data, and the sufficient likelihood for interval-censored nonrecurrent event data by conditioning on the sufficient statistic for the frailty and using the working assumption of independence over examination times. Large sample theory and a computation procedure for the proposed analysis are established. We illustrate the proposed methodology by a joint analysis of the numbers of occurrences of basal cell carcinoma over time and time to the first recurrence of squamous cell carcinoma based on a skin cancer dataset, as well as a joint analysis of the numbers of adverse events and time to premature withdrawal from study medication based on a scleroderma lung disease dataset.

7.
Marginalized models (Heagerty, 1999, Biometrics 55, 688-698) permit likelihood-based inference when interest lies in marginal regression models for longitudinal binary response data. Two such models are the marginalized transition and marginalized latent variable models. The former captures within-subject serial dependence among repeated measurements with transition model terms while the latter assumes exchangeable or nondiminishing response dependence using random intercepts. In this article, we extend the class of marginalized models by proposing a single unifying model that describes both serial and long-range dependence. This model will be particularly useful in longitudinal analyses with a moderate to large number of repeated measurements per subject, where both serial and exchangeable forms of response correlation can be identified. We describe maximum likelihood and Bayesian approaches toward parameter estimation and inference, and we study the large sample operating characteristics under two types of dependence model misspecification. Data from the Madras Longitudinal Schizophrenia Study (Thara et al., 1994, Acta Psychiatrica Scandinavica 90, 329-336) are analyzed.

8.
Tian L, Lagakos S. Biometrics 2006, 62(3): 821-828
We develop methods for assessing the association between a binary time-dependent covariate process and a failure time endpoint when the former is observed only at a single time point and the latter is right censored, and when the observations are subject to truncation and competing causes of failure. Using a proportional hazards model for the effect of the covariate process on the failure time of interest, we develop an approach utilizing the EM algorithm and profile likelihood for estimating the relative risk parameter and cause-specific hazards for failure. The methods are extended to account for other covariates that can influence the time-dependent covariate process and cause-specific risks of failure. We illustrate the methods with data from a recent study on the association between loss of hepatitis B e antigen and the development of hepatocellular carcinoma in a population of chronic carriers of hepatitis B.

9.
We study a white-noise driven integrate-and-fire (IF) neuron with a time-dependent threshold. We give analytical expressions for the mean and variance of the interspike interval, assuming that the modification of the threshold value is small. It is shown that the variability of the interval can become either smaller or larger than in the case of a constant threshold, depending on the decay rate of the threshold. We also show that the relative variability is minimal for a certain finite decay rate of the threshold. Furthermore, for slow threshold decay the leaky IF model shows a minimum in the coefficient of variation whenever the firing rate of the neuron matches the decay rate of the threshold. This novel effect can be seen if the firing rate is changed by varying the noise intensity or the mean input current.
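A hedged numerical sketch of the setup: a perfect (non-leaky) integrate-and-fire neuron simulated by Euler-Maruyama, with the threshold jumping after each spike and relaxing exponentially back to baseline. The paper's analytical results concern small threshold modifications and the leaky IF model; the parameter values below are arbitrary.

```python
import math
import random

def simulate_if_isi(mu, sigma, theta0, delta, decay, n_spikes, dt=1e-3, seed=0):
    """Perfect integrate-and-fire neuron with a dynamic threshold.

    dv = mu*dt + sigma*sqrt(dt)*xi; after each spike v resets to 0,
    the threshold jumps `delta` above its baseline `theta0`, then
    relaxes back at rate `decay`. Returns the interspike intervals.
    """
    rng = random.Random(seed)
    v, theta, t_last, t = 0.0, theta0, 0.0, 0.0
    isis = []
    while len(isis) < n_spikes:
        v += mu * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        theta += decay * (theta0 - theta) * dt   # threshold relaxes to baseline
        t += dt
        if v >= theta:
            isis.append(t - t_last)
            t_last = t
            v = 0.0
            theta = theta0 + delta               # threshold elevated after a spike
    return isis

def coeff_variation(xs):
    """Coefficient of variation: standard deviation over mean."""
    m = sum(xs) / len(xs)
    var = sum((x - m) ** 2 for x in xs) / len(xs)
    return math.sqrt(var) / m
```

Sweeping `decay` against the firing rate in such a simulation is one way to look numerically for the CV minimum described in the abstract.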

10.
11.
Guo Y, Manatunga AK. Biometrics 2007, 63(1): 164-172
Assessing agreement is often of interest in clinical studies to evaluate the similarity of measurements produced by different raters or methods on the same subjects. Lin's (1989, Biometrics 45, 255-268) concordance correlation coefficient (CCC) has become a popular measure of agreement for correlated continuous outcomes. However, commonly used estimation methods for the CCC do not accommodate censored observations and are, therefore, not applicable for survival outcomes. In this article, we estimate the CCC nonparametrically through the bivariate survival function. The proposed estimator of the CCC is proven to be strongly consistent and asymptotically normal, with a consistent bootstrap variance estimator. Furthermore, we propose a time-dependent agreement coefficient as an extension of Lin's (1989) CCC for measuring the agreement between survival times among subjects who survive beyond a specified time point. A nonparametric estimator is developed for the time-dependent agreement coefficient as well. It has the same asymptotic properties as the estimator of the CCC. Simulation studies are conducted to evaluate the performance of the proposed estimators. A real data example from a prostate cancer study is used to illustrate the method.
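For reference, Lin's CCC for fully observed (uncensored) paired measurements can be computed directly from sample moments; the paper's censored, bivariate-survival-function-based estimator targets this same population quantity.

```python
def lin_ccc(x, y):
    """Lin's concordance correlation coefficient for paired data.

    CCC = 2*s_xy / (s_x^2 + s_y^2 + (mean_x - mean_y)^2); it equals 1
    only when the points lie exactly on the 45-degree line y = x.
    """
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x) / n
    syy = sum((b - my) ** 2 for b in y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return 2 * sxy / (sxx + syy + (mx - my) ** 2)
```

Unlike the Pearson correlation, the CCC penalizes location and scale shifts: a constant offset between two raters lowers it even when their values are perfectly linearly related.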

12.
Many biomedical studies have identified important imaging biomarkers that are associated with both repeated clinical measures and a survival outcome. The functional joint model (FJM) framework, proposed by Li and Luo in 2017, investigates the association between repeated clinical measures and survival data, while adjusting for both high-dimensional images and low-dimensional covariates based on the functional principal component analysis (FPCA). In this paper, we propose a novel algorithm for the estimation of FJM based on the functional partial least squares (FPLS). Our numerical studies demonstrate that, compared to FPCA, the proposed FPLS algorithm can yield more accurate and robust estimation and prediction performance in many important scenarios. We apply the proposed FPLS algorithm to a neuroimaging study. Data used in preparation of this article were obtained from the Alzheimer's Disease Neuroimaging Initiative (ADNI) database.

13.
Ghosh D. Biometrics 2009, 65(2): 521-529
There has been a recent emphasis on the identification of biomarkers and other biologic measures that may be potentially used as surrogate endpoints in clinical trials. We focus on the setting of data from a single clinical trial. In this article, we consider a framework in which the surrogate must occur before the true endpoint. This suggests viewing the surrogate and true endpoints as semicompeting risks data; this approach is new to the literature on surrogate endpoints and leads to an asymmetrical treatment of the surrogate and true endpoints. However, such a data structure also conceptually complicates many of the previously considered measures of surrogacy in the literature. We propose novel estimation and inferential procedures for the relative effect and adjusted association quantities proposed by Buyse and Molenberghs (1998, Biometrics 54, 1014–1029). The proposed methodology is illustrated with application to simulated data, as well as to data from a leukemia study.

14.
The stratified Cox proportional hazards model is introduced to incorporate covariates and to allow a nonproportional treatment effect between two groups, and confidence interval estimators for the difference in median survival times of the two treatments under the stratified Cox model are then proposed. One estimator is based on the baseline survival functions of the two groups, the other on their average survival functions. I illustrate the proposed methods with an example from a study conducted by the Radiation Therapy Oncology Group in cancer of the mouth and throat. Simulations are carried out to investigate the small-sample properties of the proposed methods in terms of coverage rates.
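The paper's estimators are built on the stratified Cox model, but the basic ingredient, a median read off an estimated survival curve, can be sketched with a plain Kaplan-Meier estimator (a simplification, not the proposed method):

```python
def km_survival(times, events):
    """Kaplan-Meier estimate. times: observed times; events: 1=event, 0=censored.
    Returns the step function as a list of (event_time, S(t)) pairs."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    s = 1.0
    steps = []
    i = 0
    while i < len(data):
        t = data[i][0]
        j, d = i, 0
        while j < len(data) and data[j][0] == t:
            d += data[j][1]        # count events (not censorings) at time t
            j += 1
        if d > 0:
            s *= 1.0 - d / at_risk
            steps.append((t, s))
        at_risk -= j - i           # everyone observed at t leaves the risk set
        i = j
    return steps

def km_median(steps):
    """Smallest event time with S(t) <= 0.5, or None if never reached."""
    for t, s in steps:
        if s <= 0.5:
            return t
    return None
```

Given two groups, `km_median(km_survival(t1, e1)) - km_median(km_survival(t2, e2))` is the kind of median-difference point estimate for which the paper builds model-based confidence intervals.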

15.
16.
McKeague IW, Tighiouart M. Biometrics 2000, 56(4): 1007-1015
This article introduces a new Bayesian approach to the analysis of right-censored survival data. The hazard rate of interest is modeled as a product of conditionally independent stochastic processes corresponding to (1) a baseline hazard function and (2) a regression function representing the temporal influence of the covariates. These processes jump at times that form a time-homogeneous Poisson process and have a pairwise dependency structure for adjacent values. The two processes are assumed to be conditionally independent given their jump times. Features of the posterior distribution, such as the mean covariate effects and survival probabilities (conditional on the covariate), are evaluated using the Metropolis-Hastings-Green algorithm. We illustrate our methodology by an application to nasopharynx cancer survival data.

17.
Metric data are usually assessed on a continuous scale with good precision, but sometimes agricultural researchers cannot obtain precise measurements of a variable. Values of such a variable cannot then be expressed as real numbers (e.g., 1.51 or 2.56), but often can be represented by intervals into which the values fall (e.g., from 1 to 2 or from 2 to 3). In this situation, statisticians talk about censoring and censored data, as opposed to missing data, where no information is available at all. Traditionally, in agriculture and biology, three methods have been used to analyse such data: (a) when intervals are narrow, some form of imputation (e.g., mid‐point imputation) is used to replace the interval and traditional methods for continuous data are employed (such as analyses of variance [ANOVA] and regression); (b) for time‐to‐event data, the cumulative proportions of individuals that experienced the event of interest are analysed, instead of the individual observed times‐to‐event; (c) when intervals are wide and many individuals are collected, non‐parametric methods of data analysis are favoured, where counts are considered instead of the individual observed value for each sample element. In this paper, we show that these methods may be suboptimal: The first one does not respect the process of data collection, the second leads to unreliable standard errors (SEs), while the third does not make full use of all the available information. As an alternative, methods of survival analysis for censored data can be useful, leading to reliable inferences and sound hypotheses testing. These methods are illustrated using three examples from plant and crop sciences.
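The contrast between option (a) and a likelihood that respects the intervals can be sketched under a simple exponential assumption (an illustration, not one of the paper's examples); `exp_interval_mle` and the ternary search below are assumptions of this sketch, relying on the log-likelihood being concave in the rate:

```python
import math

def exp_interval_loglik(intervals, rate):
    """Interval-censored exponential log-likelihood: sum of log(S(l) - S(r))."""
    ll = 0.0
    for l, r in intervals:
        s_l = math.exp(-rate * l)
        s_r = math.exp(-rate * r) if math.isfinite(r) else 0.0
        ll += math.log(s_l - s_r)
    return ll

def exp_interval_mle(intervals, lo=1e-6, hi=100.0, iters=200):
    """Ternary search for the rate maximizing the interval likelihood."""
    for _ in range(iters):
        a = lo + (hi - lo) / 3
        b = hi - (hi - lo) / 3
        if exp_interval_loglik(intervals, a) < exp_interval_loglik(intervals, b):
            lo = a
        else:
            hi = b
    return (lo + hi) / 2

def midpoint_mean(intervals):
    """Naive mid-point imputation: treat each interval as its midpoint."""
    mids = [(l + r) / 2 for l, r in intervals]
    return sum(mids) / len(mids)
```

The interval likelihood uses exactly the information the data provide, S(l) − S(r), whereas mid-point imputation pretends the event time is known, which is what makes its standard errors unreliable.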

18.
Aim: To present a kinetic model‐based approach for using isothermal data to predict the survival of manure‐borne enteric bacteria under dynamic conditions in an agricultural environment. Methods and Results: A model to predict the survival of Salmonella enterica serovar Typhimurium under dynamic temperature conditions in soil in the field was developed. The working hypothesis was that the inactivation associated with the survival kinetics of an organism in an agricultural matrix under dynamic temperature conditions is in large part due to the cumulative effect of inactivation at the various temperatures within the continuum registered in the matrix in the field. The modelling approach included (i) recording the temperature profile that the organism experiences in the field matrix; (ii) modelling the survival kinetics under isothermal conditions at a range of temperatures registered in the matrix in the field; and (iii) using the isothermal‐based kinetic models to develop models for predicting survival under dynamic conditions. The time needed for 7 log CFU g−1 Salmonella Typhimurium in manure and manure‐amended soil to reach the detection limit of the enumeration method (2 log CFU g−1) under tropical conditions in the Central Agro‐Ecological Zone of Uganda was predicted to be 61–68 days and corresponded with observed counts of about 2.2–3.0 log CFU g−1, respectively. The bias and accuracy factors of the prediction were 0.71–0.84 and 1.2–1.4, respectively. Conclusions: Survival of Salm. Typhimurium under dynamic field conditions could be determined for 71–84% by the developed modelling approach, substantiating the working hypothesis. Significance and Impact of the Study: Survival kinetic models obtained under isothermal conditions can be used to develop models for predicting the persistence of manure‐borne enteric bacteria under dynamic field conditions in an agricultural environment.
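Step (iii) can be sketched with one standard first-order formulation, the Bigelow (D-z) model, in which the log10 kill accumulates as the integral of 1/D(T(t)) over the recorded temperature profile; the paper's fitted isothermal models may differ, and the parameter values below are arbitrary:

```python
def log_reduction(temp_profile, dt, d_ref, t_ref, z):
    """Cumulative log10 reduction under a time-varying temperature.

    Bigelow first-order model: at temperature T the decimal reduction
    time is D(T) = d_ref * 10**((t_ref - T)/z); the total log10 kill
    is the integral of 1/D(T(t)) dt over the temperature profile.
    """
    return sum(dt / (d_ref * 10 ** ((t_ref - T) / z)) for T in temp_profile)

def time_to_detection(n0_log, limit_log, temp_profile, dt, d_ref, t_ref, z):
    """First time at which counts fall from n0_log to limit_log
    (in log10 CFU/g), or None if not reached within the profile."""
    acc = 0.0
    for i, T in enumerate(temp_profile):
        acc += dt / (d_ref * 10 ** ((t_ref - T) / z))
        if n0_log - acc <= limit_log:
            return (i + 1) * dt
    return None
```

At a constant temperature this reduces to the familiar straight-line log-linear inactivation; the dynamic prediction is just the same calculation with D(T) re-evaluated at every recorded time step.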

19.
Accounting for time-varying confounding when assessing the causal effects of time-varying exposures on survival time is challenging. Standard survival methods that incorporate time-varying confounders as covariates generally yield biased effect estimates. Estimators using weighting by inverse probability of exposure can be unstable when confounders are highly predictive of exposure or the exposure is continuous. Structural nested accelerated failure time models (AFTMs) require artificial recensoring, which can cause estimation difficulties. Here, we introduce the structural nested cumulative survival time model (SNCSTM). This model assumes that intervening to set exposure at time t to zero has an additive effect on the subsequent conditional hazard given exposure and confounder histories when all subsequent exposures have already been set to zero. We show how to fit it using standard software for generalized linear models and describe two more efficient, double robust, closed-form estimators. All three estimators avoid the artificial recensoring of AFTMs and the instability of estimators that use weighting by the inverse probability of exposure. We examine the performance of our estimators using a simulation study and illustrate their use on data from the UK Cystic Fibrosis Registry. The SNCSTM is compared with a recently proposed structural nested cumulative failure time model, and several advantages of the former are identified.

20.
In longitudinal studies where time to a final event is the ultimate outcome, information is often available about intermediate events that individuals may experience during the observation period. Even though many extensions of the Cox proportional hazards model have been proposed to model such multivariate time-to-event data, these approaches are still very rarely applied to real datasets. The aim of this paper is to illustrate the application of extended Cox models for multiple time-to-event data and to show their implementation in popular statistical software packages. We demonstrate a systematic way of jointly modelling similar or repeated transitions in follow-up data by analysing an event-history dataset consisting of 270 breast cancer patients who were followed up for different clinical events during treatment of metastatic disease. First, we show how this methodology can also be applied to non-Markovian stochastic processes by representing these processes as "conditional" Markov processes. Secondly, we compare the application of different Cox-related approaches to the breast cancer data by varying their key model components (i.e., analysis time scale, risk set, and baseline hazard function). Our study showed that extended Cox models are a powerful tool for analysing complex event-history datasets, since the approach can address many dynamic data features such as multiple time scales, dynamic risk sets, time-varying covariates, transition-by-covariate interactions, autoregressive dependence, or intra-subject correlation.
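One concrete implementation detail behind such extended Cox analyses is restructuring each subject's follow-up into counting-process (start, stop] records at the times of intermediate events, so that the number of prior events can enter as a time-varying covariate; the record layout below is a common convention, not specific to this paper:

```python
def split_at_events(entry, exit_, status, change_times):
    """Split one subject's follow-up into (start, stop, status, n_prior)
    records at intermediate-event times, counting-process style.

    `change_times` are times of intermediate events (e.g. disease
    progressions) strictly inside (entry, exit_); `n_prior` counts how
    many have occurred by the start of each record, usable as a
    time-varying covariate in an extended Cox model.
    """
    cuts = sorted(t for t in change_times if entry < t < exit_)
    records = []
    start = entry
    for i, t in enumerate(cuts):
        records.append((start, t, 0, i))          # no terminal event yet
        start = t
    records.append((start, exit_, status, len(cuts)))
    return records
```

The resulting rows can be passed to standard Cox software that accepts (start, stop, status) intervals, which is how the dynamic risk sets and time-varying covariates mentioned in the abstract are realized in practice.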
