Similar articles
20 similar records retrieved.
1.
Cohort and nested case-control (NCC) designs are frequently used in pharmacoepidemiology to assess the associations of drug exposure that can vary over time with the risk of an adverse event. Although it is typically expected that estimates from NCC analyses are similar to those from the full cohort analysis, with moderate loss of precision, only a few studies have actually compared their respective performance for estimating the effects of time-varying exposures (TVE). We used simulations to compare the properties of the resulting estimators of these designs for both time-invariant exposure and TVE. We varied exposure prevalence, proportion of subjects experiencing the event, hazard ratio, and control-to-case ratio, and considered matching on confounders. Using both designs, we also estimated the real-world associations of time-invariant ever use of menopausal hormone therapy (MHT) at baseline and updated, time-varying MHT use with breast cancer incidence. In all simulated scenarios, the cohort-based estimates had small relative bias and greater precision than the NCC estimates. NCC estimates displayed bias toward the null that decreased with a greater number of controls per case. This bias markedly increased with a higher proportion of events. Bias was seen with Breslow's and Efron's approximations for handling tied event times but was greatly reduced with the exact method or when NCC analyses were matched on confounders. When analyzing the MHT-breast cancer association, differences between the two designs were consistent with the simulation results. Once ties were correctly taken into account, NCC estimates were very similar to those of the full cohort analysis.
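The tie-handling point lends itself to a quick illustration. Below is a minimal Python sketch (not code from the study; the simulation settings, effect size, and variable names are assumptions) that fits a Cox model to heavily tied simulated data under Breslow's and Efron's approximations with statsmodels, so the two estimates can be compared against the true log hazard ratio.

```python
# Sketch under assumed settings: binary exposure, true log hazard ratio 0.5,
# event times rounded onto a coarse grid to create many ties.
import numpy as np
from statsmodels.duration.hazard_regression import PHReg

rng = np.random.default_rng(1)
n = 2000
x = rng.binomial(1, 0.3, size=n)                   # binary exposure, prevalence 0.30
t = rng.exponential(scale=1.0 / np.exp(0.5 * x))   # exponential times, log HR = 0.5
t = np.ceil(t * 4) / 4                             # coarse grid -> heavily tied event times
event = (t < 2.0).astype(int)                      # administrative censoring at t = 2
t = np.minimum(t, 2.0)

for ties in ("breslow", "efron"):
    result = PHReg(t, x[:, None], status=event, ties=ties).fit()
    print(ties, "estimated log HR:", round(float(result.params[0]), 3))
```

Drawing nested case-control risk sets and applying the exact partial likelihood, as discussed in the abstract, would need additional tooling and is not sketched here.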

2.
Meta-analysis of binary data is challenging when the event under investigation is rare, and standard models for random-effects meta-analysis perform poorly in such settings. In this simulation study, we investigate the performance of different random-effects meta-analysis models in terms of point and interval estimation of the pooled log odds ratio in rare events meta-analysis. First and foremost, we evaluate the performance of a hypergeometric-normal model from the family of generalized linear mixed models (GLMMs), which has been recommended but not yet thoroughly investigated for rare events meta-analysis. Performance of this model is compared to that of the beta-binomial model, which yielded favorable results in previous simulation studies, and to that of models that are frequently used in rare events meta-analysis, such as the inverse variance model and the Mantel–Haenszel method. In addition to considering a large number of simulation parameters inspired by real-world data settings, we study the comparative performance of the meta-analytic models under two different data-generating models (DGMs) that have been used in past simulation studies. The results of this study show that the hypergeometric-normal GLMM is useful for meta-analysis of rare events when moderate to large heterogeneity is present. In addition, our study reveals important insights with regard to the performance of the beta-binomial model under different DGMs from the binomial-normal family. In particular, we demonstrate that although misalignment of the beta-binomial model with the DGM affects its performance, it shows more robustness to the DGM than its competitors.
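As a concrete point of reference for one of the comparator methods named above, here is a minimal Python sketch of the Mantel–Haenszel pooled odds ratio; the study counts are made up for illustration.

```python
# Mantel-Haenszel pooled odds ratio. Each row is one (hypothetical) study:
# (events_trt, n_trt, events_ctl, n_ctl).
import numpy as np

studies = np.array([
    [1, 120, 3, 118],
    [0, 250, 2, 247],   # a single-zero study contributes without continuity correction
    [2,  95, 5,  97],
])

a = studies[:, 0]                  # events, treatment arm
b = studies[:, 1] - a              # non-events, treatment arm
c = studies[:, 2]                  # events, control arm
d = studies[:, 3] - c              # non-events, control arm
n = studies[:, 1] + studies[:, 3]  # study size

or_mh = np.sum(a * d / n) / np.sum(b * c / n)
print("Mantel-Haenszel pooled OR:", round(float(or_mh), 3))
```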

3.
In case–control research where there are multiple case groups, standard analyses fail to make use of all available information. Multiple events case–control (MECC) studies provide a new approach to sampling from a cohort and are useful when it is desired to study multiple types of events in the cohort. In this design, subjects in the cohort who develop any event of interest are sampled, as well as a fraction of the remaining subjects. We show that a simple case–control analysis of data arising from MECC studies is biased and develop three general estimating-equation-based approaches to analyzing data from these studies. We conduct simulation studies to compare the efficiency of the various MECC analyses with each other and with the corresponding conventional analyses. It is shown that the gain in efficiency by using the new design is substantial in many situations. We demonstrate the application of our approach to a nested case–control study of the effect of oral sodium phosphate use on chronic kidney injury with multiple case definitions.

4.
Current climatic trends involve both increasing temperatures and climatic variability, with extreme events becoming more frequent. Increasing concern about extreme climatic events has triggered research on vegetation shifts. However, evidence of vegetation shifts resulting from these events is still relatively rare. Empirical evidence supports the existence of stabilizing processes that minimize and counteract the effects of these events, reinforcing community resilience. We propose a demographic framework to understand this inertia to change based on the balance between adult mortality induced by the event and enhanced recruitment or adult survival after the event. The stabilizing processes potentially contributing to this compensation include attenuation of the adult mortality caused by the event, due to site quality variability, to tolerance, phenotypic variability, and plasticity at the population level, and to facilitative interactions. Mortality compensation may also occur through increased future survival, due to a beneficial effect on growth and survival of the new conditions derived from global warming and increased climatic variability, to lowered competition resulting from reduced density in affected stands, or to antagonistic release when pathogens or predators are vulnerable to the event or the ongoing climatic conditions. Finally, mortality compensation may appear through enhanced recruitment due to release from competition with established vegetation, for instance as a consequence of gap openings after event-caused mortality, to new conditions that may be more favorable for seedling establishment, or to enhanced mutualistic interactions (pollination, dispersal). There are important challenges imposed by the need for long-term studies, but a research agenda focused on potentially stabilizing processes is well suited to understanding the variety of responses, including the lack of sudden changes and the community inertia frequently observed in vegetation under extreme events. This understanding is crucial for establishing sound management strategies and actions aimed at improving ecosystem resilience under climate change scenarios.

5.
The observation of repeated events for subjects in cohort studies can be terminated by loss to follow-up, end of study, or a major failure event such as death. In this context, the major failure event may be correlated with the recurrent events, and the usual assumption of noninformative censoring of the recurrent event process by death, required by most statistical analyses, can be violated. Recently, joint modeling of two survival processes has received considerable attention because it makes it possible to study the joint evolution of the two processes over time and yields unbiased and efficient parameter estimates. The most commonly used estimation procedure in joint models for survival events is the expectation-maximization algorithm. We show how maximum penalized likelihood estimation can be applied to nonparametric estimation of the continuous hazard functions in a general joint frailty model with right censoring and delayed entry. A simulation study demonstrates that this semiparametric approach yields satisfactory results in this complex setting. As an illustration, the approach is applied to a prospective cohort with recurrent events of follicular lymphomas, jointly modeled with death.

6.
An approach for determining the power of a case–cohort study for a single binary exposure variable and a low failure rate was recently proposed by Cai and Zeng (2004, Biometrics 60, 1015–1024). In this article, we show that computing power for a case–cohort study using a standard case–control method yields nearly identical levels of power. An advantage of the case–control approach is that existing sample size software can be used for the calculations. We also propose an additional formula for computing the power of a case–cohort study for the situation when the event is not rare.
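A minimal Python sketch of the case–control shortcut described above, using statsmodels' normal-approximation power routine for two proportions; the exposure prevalence, odds ratio, number of cases, and control-to-case ratio are illustrative assumptions, not values from Cai and Zeng (2004).

```python
# Approximate the power of a case-cohort study with an unmatched case-control
# calculation: compare exposure prevalence among cases vs. the subcohort.
from statsmodels.stats.proportion import proportion_effectsize
from statsmodels.stats.power import NormalIndPower

p0 = 0.20                              # assumed exposure prevalence in the subcohort
target_or = 2.0                        # assumed odds ratio to detect
odds1 = target_or * p0 / (1 - p0)
p1 = odds1 / (1 + odds1)               # implied exposure prevalence among cases

h = proportion_effectsize(p1, p0)      # Cohen's h effect size for two proportions
power = NormalIndPower().power(effect_size=h, nobs1=150, alpha=0.05, ratio=3.0)
print("approximate power, 150 cases vs. 450 subcohort 'controls':", round(power, 3))
```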

7.
Individuals may experience more than one type of recurrent event and a terminal event during the course of a disease. Follow-up may be interrupted for several reasons, including the end of the study or loss to follow-up, which are noninformative censoring events. Death can also stop the follow-up; hence, it is considered a dependent terminal event. We propose a multivariate frailty model that jointly analyzes two types of recurrent events with a dependent terminal event. Two estimation methods are proposed: a semiparametric approach using penalized likelihood estimation in which the baseline hazard functions are approximated by M-splines, and one with piecewise constant baseline hazard functions. Finally, we derive martingale residuals to check the goodness of fit. We illustrate our proposals with a real dataset on breast cancer. The main objective was to model the dependency between the two types of recurrent events (locoregional and metastatic) and the terminal event (death) after breast cancer.

8.
Case-cohort analysis with accelerated failure time model
Kong L, Cai J. Biometrics 2009, 65(1):135-142
In a case–cohort design, covariates are assembled only for a subcohort that is randomly selected from the entire cohort and for any additional cases outside the subcohort. This design is appealing for large cohort studies of rare disease, especially when the exposures of interest are expensive to ascertain for all subjects. We propose statistical methods for analyzing case–cohort data with a semiparametric accelerated failure time model, which interprets covariate effects as accelerating or decelerating the time to failure. Asymptotic properties of the proposed estimators are developed. The finite-sample properties of the case–cohort estimator and its efficiency relative to the full-cohort estimator are assessed via simulation studies. A real example from a study of cardiovascular disease is provided to illustrate the estimation procedure.

9.
Song R, Kosorok MR, Cai J. Biometrics 2008, 64(3):741-750
Recurrent events data are frequently encountered in clinical trials. This article develops robust covariate-adjusted log-rank statistics for recurrent events data with arbitrary numbers of events under independent censoring, along with the corresponding sample size formula. The proposed log-rank tests are robust with respect to different data-generating processes and are adjusted for predictive covariates; they reduce to the Kong and Slud (1997, Biometrika 84, 847–862) setting in the case of a single event. The sample size formula is derived from the asymptotic normality of the covariate-adjusted log-rank statistics under certain local alternatives and a working model for baseline covariates in the recurrent event context. When the effect size is small and the baseline covariates do not contain significant information about event times, it reduces to the same form as that of Schoenfeld (1983, Biometrics 39, 499–503) for the case of a single event or of independent event times within a subject. We carry out simulations to study the control of type I error and to compare the power of several methods in finite samples. The proposed sample size formula is illustrated using data from an rhDNase study.
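For orientation, the single-event special case mentioned above can be computed directly. The sketch below implements the standard Schoenfeld event-count formula in Python; this is the textbook formula, not the paper's covariate-adjusted recurrent-event version, and the hazard ratio and design values are assumptions.

```python
# Schoenfeld (1983) required number of events for a two-arm log-rank test.
import math
from scipy.stats import norm

def schoenfeld_events(hazard_ratio, alloc=0.5, alpha=0.05, power=0.8):
    """Total events needed: (z_a + z_b)^2 / (p(1-p) * log(HR)^2)."""
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    return (z_a + z_b) ** 2 / (alloc * (1 - alloc) * math.log(hazard_ratio) ** 2)

print(round(schoenfeld_events(hazard_ratio=0.75), 1))   # roughly 380 events
```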

10.
French B, Heagerty PJ. Biometrics 2009, 65(2):415-422
Longitudinal studies typically collect information on the timing of key clinical events and on specific characteristics that describe those events. Random variables that measure qualitative or quantitative aspects associated with the occurrence of an event are known as marks. Recurrent marked point process data consist of possibly recurrent events, with the mark (and possibly exposure) measured if and only if an event occurs. Analysis choices depend on which aspect of the data is of primary scientific interest. First, factors that influence the occurrence or timing of the event may be characterized using recurrent event analysis methods. Second, if there is more than one event per subject, then the association between exposure and the mark may be quantified using repeated measures regression methods. We detail assumptions required of any time-dependent exposure process and the event time process to ensure that linear or generalized linear mixed models and generalized estimating equations provide valid estimates. We provide theoretical and empirical evidence that if these conditions are not satisfied, then an independence estimating equation should be used for consistent estimation of association. We conclude with the recommendation that analysts carefully explore both the exposure and event time processes prior to implementing a repeated measures analysis of recurrent marked point process data.
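A minimal Python sketch of the recommended analysis, assuming statsmodels is available: the mark observed at each recurrent event is regressed on exposure with a GEE under an independence working correlation. The simulated data and variable names are illustrative, not taken from the paper.

```python
# GEE for marks clustered within subjects, independence working correlation.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n_events = rng.integers(1, 5, size=200)              # 1-4 recurrent events per subject
subject = np.repeat(np.arange(200), n_events)        # subject id for each event
exposure = rng.normal(size=subject.size)             # exposure measured at each event
subj_effect = rng.normal(size=200)[subject]          # shared subject-level variation
mark = 0.3 * exposure + subj_effect + rng.normal(size=subject.size)

X = sm.add_constant(exposure)
model = sm.GEE(mark, X, groups=subject,
               family=sm.families.Gaussian(),
               cov_struct=sm.cov_struct.Independence())
print(model.fit().params)                            # intercept and slope (true slope 0.3)
```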

11.
Kim YJ. Biometrics 2006, 62(2):458-464
In doubly censored failure time data, the survival time of interest is defined as the elapsed time between an initial event and a subsequent event, and the occurrences of both events cannot be observed exactly. Instead, only right- or interval-censored observations on the occurrence times are available. For the analysis of such data, a number of methods have been proposed under the assumption that the survival time of interest is independent of the occurrence time of the initial event. This article investigates a different situation where the independence may not be true, with the focus on regression analysis of doubly censored data. Cox frailty models are applied to describe the effects of covariates and an EM algorithm is developed for estimation. Simulation studies are performed to investigate finite sample properties of the proposed method and an illustrative example from an acquired immune deficiency syndrome (AIDS) cohort study is provided.

12.
Three main modes of extinction are responsible for reductions in morphological disparity: (1) random (caused by a nonselective extinction event); (2) marginal (a symmetric, selective extinction event trimming the margin of morphospace); and (3) lateral (an asymmetric, selective extinction event eliminating one side of the morphospace). These three types of extinction event can be distinguished from one another by comparing changes in three measures of morphospace occupation: (1) the sum of ranges along the main axes; (2) the sum of variances; and (3) the position of the centroid. Computer simulations of various extinction events demonstrate that the pre-extinction distribution of taxa (random or normal) in the morphospace has little influence on the quantification of disparity changes, whereas the mode of the extinction event plays the major role. Together, the three disparity metrics define an "extinction space" in which different extinction events can be directly compared with one another. Application of this method to selected extinction events (Frasnian-Famennian, Devonian-Carboniferous, and Permian-Triassic) of the Ammonoidea demonstrates the similarity of the Devonian events (selective extinctions) to one another but their striking difference from the end-Permian event (nonselective extinction). These events differ in their mode of extinction despite decreases in taxonomic diversity of similar magnitude.
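The three morphospace-occupation measures listed above are simple to compute. The Python sketch below (random data, illustrative number of axes) evaluates the sum of ranges, the sum of variances, and the centroid before and after a hypothetical lateral (asymmetric, selective) extinction.

```python
# Disparity metrics on a simulated morphospace before/after a selective extinction.
import numpy as np

rng = np.random.default_rng(0)
morphospace = rng.normal(size=(300, 4))      # 300 taxa scored on 4 morphological axes

def disparity(points):
    return {
        "sum of ranges": float(np.sum(points.max(axis=0) - points.min(axis=0))),
        "sum of variances": float(np.sum(points.var(axis=0, ddof=1))),
        "centroid (axis 1)": float(points.mean(axis=0)[0]),
    }

# hypothetical lateral extinction: selectively remove taxa on one side of axis 1
survivors = morphospace[morphospace[:, 0] < 1.0]

print("before:", disparity(morphospace))
print("after: ", disparity(survivors))
```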

13.
Many research questions involve time-to-event outcomes that can be prevented from occurring due to competing events. In these settings, we must be careful about the causal interpretation of classical statistical estimands. In particular, estimands on the hazard scale, such as ratios of cause-specific or subdistribution hazards, are fundamentally hard to interpret causally. Estimands on the risk scale, such as contrasts of cumulative incidence functions, do have a clear causal interpretation, but they only capture the total effect of the treatment on the event of interest; that is, effects both through and outside of the competing event. To disentangle causal treatment effects on the event of interest and competing events, the separable direct and indirect effects were recently introduced. Here we provide new results on the estimation of direct and indirect separable effects in continuous time. In particular, we derive the nonparametric influence function in continuous time and use it to construct an estimator that has certain robustness properties. We also propose a simple estimator based on semiparametric models for the two cause-specific hazard functions. We describe the asymptotic properties of these estimators and present results from simulation studies, suggesting that the estimators behave satisfactorily in finite samples. Finally, we reanalyze the prostate cancer trial from Stensrud et al. (2020).
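For context on the risk-scale estimands discussed above, the sketch below codes the nonparametric Aalen–Johansen estimate of a cause-specific cumulative incidence function directly from its definition (toy data; this is the standard estimator, not the paper's influence-function-based estimator of separable effects).

```python
# Aalen-Johansen cumulative incidence for cause 1 under competing risks.
# event coding: 0 = censored, 1 = event of interest, 2 = competing event.
import numpy as np

time  = np.array([2, 3, 3, 4, 6, 7, 8, 9, 11, 12])
event = np.array([1, 0, 2, 1, 1, 2, 0, 1,  0,  2])

order = np.argsort(time)
time, event = time[order], event[order]

n_at_risk = len(time)
surv = 1.0        # all-cause Kaplan-Meier survival just before the current time
cif = 0.0         # cumulative incidence of cause 1
for t in np.unique(time):
    at_t = time == t
    d_any = np.sum(at_t & (event > 0))        # events of any cause at t
    d_1 = np.sum(at_t & (event == 1))         # cause-1 events at t
    cif += surv * d_1 / n_at_risk
    surv *= 1 - d_any / n_at_risk
    n_at_risk -= np.sum(at_t)                 # drop everyone (events + censored) at t
    print(f"t={t}: CIF_1={cif:.3f}")
```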

14.
For the calculation of relative measures such as the risk ratio (RR) and odds ratio (OR) in a single study, additional approaches are required in the case of zero events. In the case of zero events in one treatment arm, the Peto odds ratio (POR) can be calculated without continuity correction and is currently the relative effect estimation method of choice for binary data with rare events. The aim of this simulation study is a multifaceted comparison of the estimated OR and estimated POR with the true OR in a single study with two parallel groups without confounders, in data situations where the POR is currently recommended. This comparison was performed by means of several performance measures, that is, the coverage, confidence interval (CI) width, mean squared error (MSE), and mean percentage error (MPE). We demonstrated that the estimator for the POR does not outperform the estimator for the OR on all the performance measures investigated. In the case of rare events, small treatment effects, and similar group sizes, the estimator for the POR performed better than the estimator for the OR only with regard to coverage and MPE, but not CI width and MSE. For larger effects and unbalanced group size ratios, the coverage and MPE of the estimator for the POR were inappropriate. As the true effect is unknown in practice, the POR method should be applied only with the utmost caution.
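The Peto one-step odds ratio referred to above can be written down in a few lines. The Python sketch below uses made-up counts with zero events in one arm to show that it needs no continuity correction.

```python
# Peto one-step odds ratio for a single 2x2 table: O is the observed event count
# in the treatment arm, E and V its hypergeometric mean and variance under the null.
import math

def peto_or(events_trt, n_trt, events_ctl, n_ctl):
    N = n_trt + n_ctl
    m = events_trt + events_ctl                      # total events across both arms
    O = events_trt
    E = n_trt * m / N
    V = n_trt * n_ctl * m * (N - m) / (N ** 2 * (N - 1))
    log_por = (O - E) / V
    return math.exp(log_por), 1 / math.sqrt(V)       # POR and SE of log(POR)

por, se = peto_or(0, 100, 4, 100)                    # zero events in the treatment arm
print(round(por, 3), round(se, 3))
```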

15.
A cause-specific cumulative incidence function (CIF) is the probability of failure from a specific cause as a function of time. In randomized trials, a difference of cause-specific CIFs (treatment minus control) represents a treatment effect. Cause-specific CIF in each intervention arm can be estimated based on the usual non-parametric Aalen–Johansen estimator which generalizes the Kaplan–Meier estimator of CIF in the presence of competing risks. Under random censoring, asymptotically valid Wald-type confidence intervals (CIs) for a difference of cause-specific CIFs at a specific time point can be constructed using one of the published variance estimators. Unfortunately, these intervals can suffer from substantial under-coverage when the outcome of interest is a rare event, as may be the case for example in the analysis of uncommon adverse events. We propose two new approximate interval estimators for a difference of cause-specific CIFs estimated in the presence of competing risks and random censoring. Theoretical analysis and simulations indicate that the new interval estimators are superior to the Wald CIs in the sense of avoiding substantial under-coverage with rare events, while being equivalent to the Wald CIs asymptotically. In the absence of censoring, one of the two proposed interval estimators reduces to the well-known Agresti–Caffo CI for a difference of two binomial parameters. The new methods can be easily implemented with any software package producing point and variance estimates for the Aalen–Johansen estimator, as illustrated in a real data example.
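The censoring-free special case mentioned at the end of the abstract is easy to illustrate: the Python sketch below computes the Agresti–Caffo interval for a difference of two binomial proportions (the counts are made up).

```python
# Agresti-Caffo interval: add one success and one failure to each arm,
# then apply the usual Wald interval for the difference of proportions.
import math
from scipy.stats import norm

def agresti_caffo(x1, n1, x2, n2, alpha=0.05):
    p1 = (x1 + 1) / (n1 + 2)
    p2 = (x2 + 1) / (n2 + 2)
    se = math.sqrt(p1 * (1 - p1) / (n1 + 2) + p2 * (1 - p2) / (n2 + 2))
    z = norm.ppf(1 - alpha / 2)
    diff = p1 - p2
    return diff - z * se, diff + z * se

print(agresti_caffo(x1=3, n1=120, x2=9, n2=118))     # rare adverse events in two arms
```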

16.
Recurrent events can be stopped by a terminal event, a situation that commonly occurs in biomedical and clinical studies. In this situation, dependent censoring is encountered because of potential dependence between the two event processes, leading to invalid inference if the recurrent events are analyzed alone. The joint frailty model is one of the most widely used approaches to jointly model the two processes by sharing the same frailty term. One important assumption is that the recurrent and terminal event processes are conditionally independent given the subject-level frailty; however, this can be violated when the dependency also depends on time-varying covariates across recurrences. Furthermore, the marginal correlation between the two event processes based on traditional frailty modeling has no closed-form solution for estimation and a vague interpretation. To fill these gaps, we propose a novel joint frailty-copula approach to model recurrent events and a terminal event under relaxed assumptions. A Metropolis–Hastings-within-Gibbs sampler is used for parameter estimation. Extensive simulation studies are conducted to evaluate the efficiency, robustness, and predictive performance of our proposal. The simulation results show that, compared with the joint frailty model, the bias and mean squared error of the proposal are smaller when the conditional independence assumption is violated. Finally, we apply our method to a real example extracted from the MarketScan database to study the association between recurrent strokes and mortality.

17.
The stratigraphical ranges of Mid-Devonian to end-Late Devonian trilobites of different taxonomic categories are updated with respect to the internationally defined stage boundaries. The main palaeogeographical and ecological occurrences are summarized. Numerical analyses emphasize the clear relationship between fluctuations in diversity and global eustatic events. Already declining in diversity from the early Mid-Devonian, shallow-water communities became most restricted during the mid-Givetian Taghanic transgression. After a phase of adaptive radiation, offshore trilobite communities were severely affected during the mid- and end-Late Devonian crises. Of an initial 5 orders, 3 were lost at the end-Frasnian Kellwasser crisis, while only 1 of the remaining 2 orders survived the Devonian-Carboniferous boundary Hangenberg event. In both cases extinction was preceded by a unidirectional evolutionary trend in eye reduction accompanied by impoverishment of lower-rank taxa. This phenomenon is evidently a result of selective adaptation under constant, long-lasting environmental influences. Specialization to an obligate epi- or even endobenthic life habit, however, led to extinction when stable conditions became substantially perturbed. Sudden sea-level changes with a subsequent break in the redox equilibrium took place at the Kellwasser and Hangenberg events, and were most probably responsible for the trilobite mass extinctions.

18.
Satten GA. Biometrics 1999, 55(4):1228-1231
This paper describes a method for determining whether the times between a chain of successive events (which all individuals experience in the same order) are correlated, for data in which the exact event times are not observed. Such data arise when individuals are only observed occasionally to determine which events have occurred. In such data, the (unknown) event times are interval censored. In addition, some individuals may have experienced some of the events before their first observation and may be lost to follow-up before experiencing the last event. Using a frailty model proposed by Aalen (1988, Mathematical Scientist 13, 90-103) that has never before been used to analyze real data, we examine whether individuals who develop early markers of HIV infection can also be expected to develop antibody and other indicators of HIV infection more rapidly.

19.
Climate change scenarios predict an increased frequency of extreme climatic events. In Arctic regions, among the most profound of these are extreme and sudden winter warming events in which temperatures increase rapidly to above freezing, often causing snow melt across whole landscapes and exposure of ecosystems to warm temperatures. Following the warming, vegetation and soils no longer insulated below snow are then exposed to rapidly returning extreme cold. Using a new experimental facility established in sub-Arctic dwarf shrub heathland in northern Sweden, we simulated an extreme winter warming event in the field and report findings on growth, phenology, and reproduction during the subsequent growing season. A one-week extreme winter warming event was simulated in early March using infrared heating lamps run with or without soil warming cables. Both single short events delayed bud development of Vaccinium myrtillus by up to 3 weeks in the following spring (June) and reduced flower production by more than 80%; this also led to a near-complete elimination of berry production in mid-summer. Empetrum hermaphroditum also showed delayed bud development. In contrast, Vaccinium vitis-idaea showed no delay in bud development, but instead appeared to produce a greater number of actively growing vegetative buds within plots warmed by heating lamps only. Again, there was evidence of reduced flowering and berry production in this species. While bud break was delayed, growing season measurements of growth and photosynthesis did not reveal a differential response in the warmed plants for any of the species. These results demonstrate that a single, short, extreme winter warming event can have a considerable impact on bud production, phenology, and reproductive effort of dominant plant species within sub-Arctic dwarf shrub heathland. Furthermore, large interspecific differences in sensitivity are seen. These findings are of considerable concern because they suggest that repeated events may impact the biodiversity and productivity of these systems should such extreme events increase in frequency as a result of global change. Although climate change may lengthen the growing season through earlier spring snow melt, there is a profound danger for these high-latitude ecosystems if extreme, short-lived warming in winter exposes plants to warm temperatures and then to extreme cold for the rest of the winter. Work is ongoing to determine the longer-term and wider impacts of these events.

20.
Recombination between dispersed yet related serine tRNA genes of Schizosaccharomyces pombe does occur during mitosis, but it is approximately three orders of magnitude less frequent than in meiosis. Two mitotic events have been studied in detail. In the first, a sequence of at least 18 nucleotides was transferred from the donor sup3 gene on the right arm of chromosome I to the related acceptor gene sup12 on the left arm of the same chromosome, thereby leading to the simultaneous change of 8 bp in the acceptor gene. This event must be explained in terms of recombination rather than mutation. It is assumed to represent mitotic gene conversion, although it was not possible to demonstrate that the donor gene had emerged unchanged from the event. The second case reflects an interaction between sup9 on chromosome III and sup3 on chromosome I. Genetic and physical analysis allows this event to be described as mitotic gene conversion associated with crossing over. The result of this event is a reciprocal translocation. No further chromosomal aberrations were found among an additional 700 potential intergenic convertants tested. Thus, intergenic conversion is much less frequently associated with crossing over than allelic conversion. However, the rare intergenic conversion events associated with crossing over provide a molecular mechanism for chromosomal rearrangements.
