Similar Documents (20 results)
1.
Targeted minimum loss-based estimation (TMLE) provides a template for the construction of semiparametric, locally efficient, double robust substitution estimators of the target parameter of the data-generating distribution in a semiparametric censored data or causal inference model (van der Laan and Rubin (2006), van der Laan (2008), van der Laan and Rose (2011)). In this article we demonstrate how to construct a TMLE that is also guaranteed to be at least as efficient as a user-supplied asymptotically linear estimator. In particular, it is shown that this type of TMLE can incorporate empirical efficiency maximization as in Rubin and van der Laan (2008), Tan (2008, 2010), and Rotnitzky et al. (2012), while retaining double robustness. For the sake of illustration, we focus on estimation of the additive average causal effect of a point treatment on an outcome, adjusting for baseline covariates.
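
Below is a minimal sketch of a standard TMLE for this additive effect with a binary outcome, illustrating only the targeting (fluctuation) step rather than the article's efficiency-constrained construction; the function name, the propensity-score clipping bounds, and the scikit-learn/statsmodels nuisance fits are illustrative assumptions.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.linear_model import LogisticRegression

def tmle_ate(Y, A, W):
    """Minimal TMLE sketch for the additive effect E[Y(1)-Y(0)] of a
    binary point treatment A on a binary outcome Y, adjusting for W."""
    Y, A, W = np.asarray(Y), np.asarray(A), np.asarray(W)
    expit = lambda x: 1.0 / (1.0 + np.exp(-x))

    # Initial outcome regression Qbar(A, W) and counterfactual predictions.
    Q_fit = LogisticRegression(max_iter=1000).fit(np.column_stack([A, W]), Y)
    def pred(a):
        Xa = np.column_stack([np.full_like(A, a), W])
        return np.clip(Q_fit.predict_proba(Xa)[:, 1], 1e-6, 1 - 1e-6)
    Q1, Q0 = pred(1), pred(0)
    QA = np.where(A == 1, Q1, Q0)

    # Propensity score g(W) = P(A=1 | W), bounded away from 0 and 1.
    g_fit = LogisticRegression(max_iter=1000).fit(W, A)
    g1 = np.clip(g_fit.predict_proba(W)[:, 1], 0.025, 0.975)
    H = A / g1 - (1 - A) / (1 - g1)          # "clever covariate"

    # Targeting step: one logistic fluctuation with the initial fit as offset.
    eps = sm.GLM(Y, H.reshape(-1, 1), family=sm.families.Binomial(),
                 offset=np.log(QA / (1 - QA))).fit().params[0]

    # Updated substitution (plug-in) estimate of the additive effect.
    Q1_star = expit(np.log(Q1 / (1 - Q1)) + eps / g1)
    Q0_star = expit(np.log(Q0 / (1 - Q0)) - eps / (1 - g1))
    return float(np.mean(Q1_star - Q0_star))
```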

2.
3.
The development of oncology drugs progresses through multiple phases, where after each phase a decision is made about whether to move a molecule forward. Early phase efficacy decisions are often made on the basis of single-arm studies, using a set of rules to define whether the tumor improves ("responds"), remains stable, or progresses (response evaluation criteria in solid tumors [RECIST]). These decision rules implicitly assume some form of surrogacy between tumor response and long-term endpoints like progression-free survival (PFS) or overall survival (OS). With the emergence of new therapies, for which the link between RECIST tumor response and long-term endpoints is either not accessible yet, or is weaker than with classical chemotherapies, tumor response-based rules may not be optimal. In this paper, we explore the use of a multistate model for decision-making based on single-arm early phase trials. The multistate model allows one to account for more information than the simple RECIST response status, namely, the time to response, the duration of response, the PFS time, and the time to death. We propose to base the decision on efficacy on the OS hazard ratio (HR) comparing historical control to data from the experimental treatment, with the latter predicted from a multistate model based on early phase data with limited survival follow-up. Using two case studies, we illustrate the feasibility of estimating such an OS HR. We argue that, in the presence of limited follow-up and small sample size, and making realistic assumptions within the multistate model, the OS prediction is acceptable and may lead to better early decisions within the development of a drug.
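
As a rough illustration of the prediction step, the sketch below simulates survival from a four-state model (stable, response, progression, death) with exponential transition hazards and compares the implied OS hazard with a historical control; every rate, the control hazard, and the exponential working model are made-up assumptions, not values from the paper's case studies.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical exponential transition hazards (per month): S = stable,
# R = response, P = progression, D = death.
rates = {("S", "R"): 0.10, ("S", "P"): 0.08, ("S", "D"): 0.01,
         ("R", "P"): 0.05, ("R", "D"): 0.005, ("P", "D"): 0.12}

def simulate_death_times(n=20000, horizon=60.0):
    """Simulate the death process implied by the multistate model."""
    times = np.empty(n)
    events = np.empty(n, dtype=bool)
    for i in range(n):
        state, t = "S", 0.0
        while state != "D" and t < horizon:
            moves = [(dst, r) for (src, dst), r in rates.items() if src == state]
            total = sum(r for _, r in moves)
            t += rng.exponential(1.0 / total)             # waiting time
            state = rng.choice([d for d, _ in moves],      # competing risks
                               p=[r / total for _, r in moves])
        times[i] = min(t, horizon)
        events[i] = (state == "D") and (t <= horizon)
    return times, events

# Exponential working model for the treatment arm's OS hazard, compared
# with an assumed historical-control hazard to give the decision OS HR.
t_trt, d_trt = simulate_death_times()
lam_trt = d_trt.sum() / t_trt.sum()   # events per person-month at risk
lam_ctrl = 0.035                      # hypothetical historical control
print("predicted OS HR (treatment vs historical control):", lam_trt / lam_ctrl)
```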

4.
Lok JJ, Degruttola V. Biometrics 2012;68(3):745-754
Summary: We estimate how the effect of antiretroviral treatment depends on the time from HIV infection to initiation of treatment, using observational data. A major challenge in making inferences from such observational data arises from biases associated with the nonrandom assignment of treatment, for example bias induced by dependence of time of initiation on disease status. To address this concern, we develop a new class of Structural Nested Mean Models (SNMMs) to estimate the impact of time of initiation of treatment after infection on an outcome measured a fixed duration after initiation, compared to the effect of not initiating treatment. This leads to an SNMM that models the effect of multiple dosages of treatment on a time-dependent outcome, in contrast to most existing SNMMs, which focus on the effect of one dosage of treatment on an outcome measured at the end of the study. Our identifying assumption is that there are no unmeasured confounders. We illustrate our methods using the observational Acute Infection and Early Disease Research Program (AIEDRP) Core01 database on HIV. The current standard of care in HIV-infected patients is Highly Active Anti-Retroviral Treatment (HAART); however, the optimal time to start HAART has not yet been identified. The new class of SNMMs allows estimation of the dependence of the effect of 1 year of HAART on the time between the estimated date of infection and treatment initiation, and on patient characteristics. Results of fitting this model imply that early use of HAART substantially improves immune reconstitution in the early and acute phase of HIV infection.
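
The new SNMMs handle time-varying initiation; as background for the identifying idea (under no unmeasured confounders, the blipped-down outcome must be mean-independent of treatment given covariates), here is a one-parameter point-treatment g-estimation sketch, with an illustrative logistic propensity model and function name.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def g_estimate(Y, A, W):
    """G-estimation sketch for the SNMM E[Y(a) - Y(0) | W] = psi * a with a
    single binary treatment A. Under no unmeasured confounders, the
    blipped-down outcome Y - psi*A is mean-independent of A given W, which
    the propensity residual turns into a closed-form estimating equation."""
    Y, A = np.asarray(Y, float), np.asarray(A, float)
    e = LogisticRegression(max_iter=1000).fit(W, A).predict_proba(W)[:, 1]
    resid = A - e                                   # propensity residual
    # Solve sum(resid * (Y - psi*A)) = 0 for psi.
    return float(np.sum(resid * Y) / np.sum(resid * A))
```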

5.
Thall PF, Wooten LH, Shpall EJ. Biometrics 2006;62(1):193-201
In therapy of rapidly fatal diseases, early treatment efficacy often is characterized by an event, "response," which is observed relatively quickly. Since the risk of death decreases at the time of response, it is desirable not only to achieve a response, but to do so as rapidly as possible. We propose a Bayesian method for comparing treatments in this setting based on a competing risks model for response and death without response. Treatment effect is characterized by a two-dimensional parameter consisting of the probability of response within a specified time and the mean time to response. Several target parameter pairs are elicited from the physician so that, for a reference covariate vector, all elicited pairs embody the same improvement in treatment efficacy compared to a fixed standard. A curve is fit to the elicited pairs and used to determine a two-dimensional parameter set in which a new treatment is considered superior to the standard. Posterior probabilities of this set are used to construct rules for the treatment comparison and safety monitoring. The method is illustrated by a randomized trial comparing two cord blood transplantation methods.
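
A hedged sketch of how such posterior monitoring probabilities might be computed by Monte Carlo: the boundary curve, the posterior stand-ins, and all numbers are invented for illustration and do not come from the paper's elicitation or its competing-risks model.

```python
import numpy as np

rng = np.random.default_rng(0)

def superior(p_resp, mean_time):
    """Made-up fitted boundary: (p_resp, mean_time) pairs below the curve
    embody at least the elicited improvement over the standard (a higher
    response probability permits a somewhat slower mean time to response)."""
    return mean_time < 2.0 + 4.0 * (p_resp - 0.4)

# Stand-in posterior draws for the experimental arm's two-dimensional
# parameter: 30 responses in 50 patients, mean time to response ~2.5 months.
p_draws = rng.beta(1 + 30, 1 + 20, size=10_000)
t_draws = rng.gamma(shape=30.0, scale=2.5 / 30.0, size=10_000)

post_prob = np.mean(superior(p_draws, t_draws))
print("posterior probability of superiority:", post_prob)
# A monitoring rule might declare superiority when this exceeds, e.g., 0.95.
```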

6.
Xinyang Huang, Jin Xu. Biometrics 2020;76(4):1310-1318
Individualized treatment rules (ITRs) recommend treatments based on patient-specific characteristics in order to maximize the expected clinical outcome. At the same time, the risks posed by various adverse events cannot be ignored. In this paper, we propose a method to estimate an optimal ITR that maximizes clinical benefit while keeping the overall risk controlled at a desired level. Our method works in a general multicategory treatment setting. The proposed procedure employs two shifted ramp losses to approximate the 0-1 loss in the objective function and the constraint, respectively, and transforms the estimation problem into a difference-of-convex-functions (DC) programming problem. A relaxed DC algorithm is used to solve the nonconvex constrained optimization problem. Simulations and a real data example demonstrate the finite-sample performance of the proposed method.
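
To show the building block, here is the shifted ramp loss and the difference-of-convex (hinge minus hinge) structure that such algorithms exploit; the function names and the illustrative check are assumptions, not the paper's code.

```python
import numpy as np

def shifted_ramp(u, s=0.0):
    """Shifted ramp loss: a bounded piecewise-linear surrogate for the
    0-1 loss, equal to min(1, max(0, 1 - (u - s)))."""
    return np.minimum(1.0, np.maximum(0.0, 1.0 - (u - s)))

def hinge(u, margin=1.0):
    return np.maximum(0.0, margin - u)

# The DC structure such algorithms exploit: the (unshifted) ramp is a
# difference of two convex hinge functions.
u = np.linspace(-3.0, 3.0, 61)
assert np.allclose(shifted_ramp(u), hinge(u, 1.0) - hinge(u, 0.0))
```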

7.
Animals living in groups make movement decisions that depend, among other factors, on social interactions with other group members. Our present understanding of social rules in animal collectives rests mainly on empirical fits to observations, with less emphasis on first-principles approaches that allow their derivation. Here we show that patterns of collective decisions can be derived from the basic ability of animals to make probabilistic estimations in the presence of uncertainty. We build a decision-making model with two stages: Bayesian estimation and probabilistic matching. In the first stage, each animal makes a Bayesian estimate of which behavior is best to perform, taking into account personal information about the environment and social information collected by observing the behaviors of other animals. In the probability-matching stage, each animal chooses a behavior with a probability equal to the Bayesian-estimated probability that this behavior is the most appropriate one. This model yields very simple rules of interaction in animal collectives that depend only on two types of reliability parameters: one that each animal assigns to the other animals, and another given by the quality of the non-social information. We test our model by deriving theoretically a rich set of observed collective patterns of decisions in three-spined sticklebacks, Gasterosteus aculeatus, a shoaling fish species. The quantitative link shown between probabilistic estimation and collective rules of behavior allows closer contact with other fields such as foraging, mate selection, neurobiology and psychology, and gives predictions for experiments directly testing the relationship between estimation and collective behavior.
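
A toy sketch of the two-stage rule for two options, with the posterior odds parameterized by a social reliability s and a non-social term a — a simplified reading of the model, with invented parameter values.

```python
import numpy as np

rng = np.random.default_rng(2)

def p_choose_A(n_a, n_b, s=2.5, a=1.0):
    """Stage 1 (Bayesian estimation): the posterior odds that option A is
    the better behavior scale as a * s**(n_a - n_b), where n_a and n_b are
    the numbers of animals observed performing A and B, s is the
    reliability assigned to each observed animal, and a summarizes the
    non-social information."""
    odds = a * s ** (n_a - n_b)
    return odds / (1.0 + odds)

# Stage 2 (probability matching): the choice is sampled with probability
# equal to the estimated probability that A is the better option.
p = p_choose_A(n_a=3, n_b=1)
choice = "A" if rng.random() < p else "B"
print(f"P(choose A) = {p:.2f}; sampled choice: {choice}")
```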

8.
9.
This paper introduces a class of data-dependent allocation rules for use in sequential clinical trials designed to choose the better of two competing treatments, or to decide that they are of equal efficacy. These readily understood and easily implemented rules are shown to substantially reduce the number of tests with the poorer treatment for a broad category of experimental situations. Allocation rules of this type are applied both to trials with an instantaneous binomial response and to delayed-response trials where interest centers on exponentially distributed survival time. In each case, a comparison of this design with alternative designs given in the literature shows that the proposed design is superior with respect to ease of application and is comparable to the alternatives regarding inferior-treatment number and average sample number. In addition, the proposed rules mitigate many of the difficulties generally associated with adaptive assignment rules, such as selection and systematic bias.
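
The paper's specific rules are not reproduced here; the sketch below only illustrates the flavor of a data-dependent allocation rule for a binomial response and how it steers patients away from the poorer treatment.

```python
import numpy as np

rng = np.random.default_rng(3)

def adaptive_trial(p_a=0.6, p_b=0.4, n=200):
    """Illustrative data-dependent allocation (not the paper's exact rule):
    assign each new patient to the arm with the higher observed success
    rate so far, breaking ties at random."""
    succ = {"A": 0, "B": 0}
    tot = {"A": 0, "B": 0}
    for _ in range(n):
        rate = {k: succ[k] / tot[k] if tot[k] else 0.5 for k in ("A", "B")}
        arm = max(("A", "B"), key=lambda k: (rate[k], rng.random()))
        succ[arm] += rng.random() < (p_a if arm == "A" else p_b)
        tot[arm] += 1
    return tot

print(adaptive_trial())  # the poorer arm receives fewer patients on average
```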

10.
We study the effect of delaying treatment in the presence of (unobserved) heterogeneity. In a homogeneous population and assuming a proportional treatment effect, a treatment delay period will result in notably lower cumulative recovery percentages. We show in theoretical scenarios using frailty models that if the population is heterogeneous, the effect of a delay period is much smaller. This can be explained by the selection process that is induced by the frailty: patient groups that start treatment later have already undergone more selection, and the marginal hazard ratio for the treatment will act differently in such a more homogeneous patient group. We further discuss modeling approaches for estimating the effect of treatment delay in the presence of heterogeneity, and compare their performance in a simulation study. The conventional Cox model that fails to account for heterogeneity overestimates the effect of treatment delay. Including interaction terms between treatment and starting time of treatment, or between treatment and follow-up time, gave no improvement. Estimating a frailty term can improve the estimation, but is sensitive to misspecification of the frailty distribution. Therefore, multiple frailty distributions should be used and the results compared using the Akaike information criterion. Non-parametric estimation of the cumulative recovery percentages can be considered if the dataset contains sufficient long-term follow-up for each of the delay strategies. The methods are demonstrated on a motivating application evaluating the effect of delaying the start of treatment with assisted reproductive techniques on time-to-pregnancy in couples with unexplained subfertility.
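
A small simulation sketch of the selection effect: with a gamma frailty and a conditional proportional hazards treatment effect, the gap in cumulative recovery between delay strategies shrinks as the frailty variance grows; all parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(4)

# Gamma frailty Z (mean 1, variance `var`): each couple's recovery hazard
# is Z*lam0 before treatment and Z*lam0*hr once treatment has started.
n, lam0, hr, var = 200_000, 0.05, 2.0, 1.5       # hypothetical values
Z = rng.gamma(shape=1.0 / var, scale=var, size=n)

def cum_recovery(delay, horizon=24.0):
    """Marginal cumulative recovery fraction by `horizon` months when
    treatment starts after `delay` months (piecewise-constant hazard)."""
    H = Z * lam0 * (np.minimum(horizon, delay)
                    + hr * np.maximum(0.0, horizon - delay))
    return np.mean(1.0 - np.exp(-H))

for d in (0, 6, 12):
    print(f"delay {d:2d} months: recovery by 24 months = {cum_recovery(d):.3f}")
# Rerunning with var close to 0 (a homogeneous population) widens the gaps
# between delay strategies, which is the selection effect discussed above.
```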

11.
12.
In randomized trials with imperfect compliance, it is sometimes recommended to supplement the intention-to-treat estimate with an instrumental variable (IV) estimate, which is consistent for the effect of treatment administration in those subjects who would get treated if randomized to treatment and would not get treated if randomized to control. IV estimation, however, has been criticized for its reliance on the simultaneous existence of complementary "fatalistic" compliance states. The objective of the present paper is to identify sufficient conditions for consistent estimation of treatment effects in randomized trials with stochastic compliance. It is shown that in the stochastic framework, the classical IV estimator is generally inconsistent for the population-averaged treatment effect. However, even under stochastic compliance, with certain common experimental designs the IV estimator and a simple alternative estimator can be used for consistent estimation of the effect of treatment administration in well-defined and identifiable subsets of the study population.
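
For reference, the classical IV estimator discussed here is the Wald ratio of the intention-to-treat contrast to the compliance contrast; a minimal sketch (the function name is illustrative):

```python
import numpy as np

def iv_estimate(R, T, Y):
    """Classical IV (Wald) estimator in a randomized trial with imperfect
    compliance: R = randomization arm, T = treatment received, Y = outcome.
    Under the usual deterministic ("fatalistic") compliance assumptions it
    is consistent for the effect in compliers; the point of the paper is
    that under stochastic compliance its estimand changes."""
    R, T, Y = (np.asarray(v, dtype=float) for v in (R, T, Y))
    itt = Y[R == 1].mean() - Y[R == 0].mean()      # intention-to-treat
    uptake = T[R == 1].mean() - T[R == 0].mean()   # compliance contrast
    return itt / uptake
```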

13.
An information theory of the genetic code is given, which deals with the process by which template codes (nucleotides or codons) choose substrate codes (nucleotides or anticodons) in accordance with the base-pairing rules in the chain-elongation phase of polynucleotide or polypeptide synthesis. A definite period of recognition time (τ) required for a template code to discriminate a substrate code is proposed, and an experimental method for determining this time is suggested. A substrate word is defined as the sequence of substrate codes that appear at a recognition site in turn before a substrate code complementary to the template code first appears, and the mean length of substrate words (F) is derived from the mole fractions of template codes and substrate codes. The chain-elongation rate is greatest when the mole fractions of template codes are proportional to the squares of those of substrate codes, minimizing the mean recognition time per word (Fτ). The uncertainty of a template (G) and the uncertainty of a medium (M) are derived from the minimum of the function F. The amount of genetic information contained in a template is measured by the function G; the unit of this amount is termed the "cit". The function M, the ratio of the number of all binary collisions to the number of homogeneous binary collisions in a mixture of different molecules, may be a new kind of "entropy", representing informational properties of the mixture not captured by the thermodynamic entropy of mixing. Both functions (G and M) have maxima when all random variables are equal, and both are multiplicative in nature, in contrast to entropy, which is additive. The multiplicativity of the function G may contribute to the enormous informational capacity of genes.
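
Reading the abstract's definition of F as a geometric waiting time at the recognition site (an interpretive assumption), a few lines verify the stated optimum, with template fractions proportional to the squares of substrate fractions:

```python
import numpy as np

def mean_word_length(p, q):
    """F = sum_i p_i / q_i: the expected number of substrate codes tried at
    the recognition site per template code, reading each trial as a
    geometric waiting time with success probability q_i, averaged over the
    template mole fractions p_i."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p / q))

p = np.array([0.4, 0.3, 0.2, 0.1])            # template code mole fractions
q_opt = np.sqrt(p) / np.sqrt(p).sum()          # minimizer: q_i ~ sqrt(p_i),
                                               # i.e. p_i ~ q_i**2 as stated
print(mean_word_length(p, q_opt))              # minimal F, about 3.78 here
print(mean_word_length(p, p))                  # e.g. q matched to p gives 4.0
```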

14.
Perceptual interferences in the estimation of quantities (time, space and numbers) have been interpreted as evidence for a common magnitude system. However, while duration estimation appears sensitive to spatial and numerical interferences, space and number estimation tend to be resilient to temporal manipulations. These observations question the relative contribution of each quantity to the elaboration of a representation in a common mental metric. Here, we designed a task in which perceptual evidence accumulated over time for all tested quantities (space, time and number), in order to match the natural requirement for building a duration percept. For this, we used a bisection task. Experimental trials consisted of dynamic dots of different sizes appearing progressively on the screen. Participants were asked to judge the duration, the cumulative surface, or the number of dots in the display while the two non-target dimensions varied independently. In a prospective experiment, participants were informed before the trial which dimension was the target; in a retrospective experiment, participants had to attend to all dimensions and were informed only after a given trial which dimension was the target. Surprisingly, we found that duration was resilient to spatial and numerical interferences, whereas space and number estimation were affected by time. Specifically, and counter-intuitively, results revealed that longer durations led to smaller number and space estimates, whether participants knew before (prospectively) or after (retrospectively) a given trial which quantity they had to estimate. Altogether, our results support a magnitude system in which perceptual evidence for time, space and numbers is integrated following Bayesian cue-combination rules.
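
For context, the standard Gaussian form of Bayesian cue combination referenced in the conclusion weights each cue by its precision; a sketch with made-up numbers, not the paper's fitted model:

```python
import numpy as np

def fuse(mu, sigma):
    """Gaussian cue combination: each cue's estimate mu_i is weighted by
    its precision 1/sigma_i**2; returns the fused mean and fused SD."""
    mu, sigma = np.asarray(mu, float), np.asarray(sigma, float)
    prec = 1.0 / sigma ** 2
    return float(np.sum(prec * mu) / prec.sum()), float(np.sqrt(1.0 / prec.sum()))

# Made-up example: a numerosity cue pulled toward a less reliable temporal cue.
est, sd = fuse(mu=[20.0, 16.0], sigma=[3.0, 6.0])
print(est, sd)   # the fused estimate lies nearer the more reliable cue
```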

15.
Single-stranded DNA binding protein is a key component in growth of bacteriophage T7. In addition, DNA synthesis by the purified in vitro replication system is markedly stimulated when the DNA template is coated with Escherichia coli single-stranded DNA binding protein (SSB). In an attempt to understand the mechanism for this stimulation, we have studied the effect of E. coli SSB on DNA synthesis by the T7 DNA polymerase using a primed single-stranded M13 DNA template, which serves as a model for T7 lagging-strand DNA synthesis. Polyacrylamide gel analysis of the DNA product synthesized on this template in the absence of SSB indicated that the T7 DNA polymerase pauses at many specific sites, some stronger than others. By comparing the position of pausing with the DNA sequence of this region, and by using a DNA template that contains an extremely stable hairpin structure, it was found that many, but not all, of these pause positions correspond to regions of potential secondary structure. The presence of SSB during synthesis resulted in a large reduction in the frequency of hesitations at many sites that correspond to these secondary structures. However, the facts that a large percentage of the pause sites remain unaffected even at saturating levels of SSB, and that SSB stimulates synthesis on a singly primed poly(dA) template, suggested that other mechanisms also contribute to the stimulation of DNA synthesis caused by SSB. Using a sucrose gradient analysis, we found that SSB increases the affinity of the polymerase for single-stranded DNA, and that this increased binding is noticeable only when the polymerase concentration is limiting. The effect of this difference in polymerase affinity was clearly observed by a polyacrylamide gel analysis of the product DNA synthesized during a limited DNA synthesis reaction using conditions where only two nucleotides are added to the primer. Under these circumstances, where the presence of hairpin structures should not contribute to the stimulatory effect of SSB, we found that the extension of the primer is stimulated 4-fold if the DNA template is coated with SSB. Furthermore, SSB had no effect on this synthesis at large polymerase-to-template ratios.

16.
Measurement of bone mineral density (BMD) by DXA (dual-energy X-ray absorptiometry) is generally considered the clinical gold-standard technique for diagnosing osteoporosis. However, BMD alone is only a moderate predictor of fracture risk. Finite element analyses of bone mechanics can contribute to a more accurate prediction of fracture risk. In this study, we applied a method to estimate the 3D geometrical shape of bone from a 2D BMD image and a femur shape template. Proximal femurs of eighteen human cadavers were imaged with computed tomography (CT) and divided into two groups. Image data from the first group (N = 9) were used to create a shape template by means of generalized Procrustes analysis and thin-plate splines. This template was then applied to estimate the shape of the femurs in the second group (N = 9), using the 2D BMD image projected from the CT image, and the geometrical errors of the shape-estimation method were evaluated. Finally, finite element analysis under a stance loading condition was conducted on both the original CT geometry and the estimated shape to evaluate the effect of the geometrical errors on the outcome of the simulations. The volumetric errors induced by the shape-estimation method itself were low (<0.6%). Increasing the number of bone specimens used for the template decreased the geometrical errors. When nine bones were used for the template, the mean (±SD) distance between the estimated and CT shape surfaces was 1.2 ± 0.3 mm, indicating that the method is feasible for estimating the shape of the proximal femur. Small errors in geometry led systematically to larger errors in the mechanical simulations. The method could provide more information on the mechanical characteristics of bone based on 2D BMD radiography and could ultimately lead to more sensitive diagnosis of osteoporosis.
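
A sketch of the ordinary Procrustes alignment step underlying template construction (the study also uses thin-plate splines, not shown here); the function name is illustrative:

```python
import numpy as np

def procrustes_align(X, Y):
    """Ordinary Procrustes step: translate, rotate and scale the landmark
    set Y (k x d) onto X (k x d). Iterating this alignment against a
    running mean shape over all specimens is the generalized Procrustes
    analysis used to build the shape template."""
    Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)
    U, S, Vt = np.linalg.svd(Yc.T @ Xc)
    R = U @ Vt                             # optimal rotation (may include a
                                           # reflection; check det(R) if not)
    scale = S.sum() / np.sum(Yc ** 2)      # optimal similarity scaling
    return scale * Yc @ R + X.mean(axis=0)
```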

17.
Loeys T, Goetghebeur E. Biometrics 2003;59(1):100-105
Survival data from randomized trials are most often analyzed in a proportional hazards (PH) framework that follows the intention-to-treat (ITT) principle. When not all patients on the experimental arm actually receive the assigned treatment, the ITT estimator mixes the treatment's effect on compliers with its absence of effect on noncompliers. The structural accelerated failure time (SAFT) models of Robins and Tsiatis are designed to consistently estimate causal effects on the treated, without direct assumptions about the compliance selection mechanism. The traditional PH model, however, has not yet led to such a causal interpretation. In this article, we examine a PH model of treatment effect on the treated subgroup. While potential treatment compliance is unobserved in the control arm, we derive an estimating equation for the Compliers PROPortional Hazards Effect of Treatment (C-PROPHET). The jackknife is used for bias correction and variance estimation. The method is applied to data from a recently completed clinical trial in cancer patients with liver metastases.
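
The jackknife used here for bias correction and variance estimation follows the standard leave-one-out formulas; a generic sketch (names illustrative):

```python
import numpy as np

def jackknife(estimator, data):
    """Leave-one-out jackknife: returns the bias-corrected estimate and the
    jackknife variance for a scalar-valued `estimator` of a data array."""
    data = np.asarray(data)
    n = len(data)
    theta = estimator(data)
    loo = np.array([estimator(np.delete(data, i, axis=0)) for i in range(n)])
    bias = (n - 1) * (loo.mean() - theta)
    var = (n - 1) / n * np.sum((loo - loo.mean()) ** 2)
    return theta - bias, var

# e.g. jackknife(np.mean, some_estimates) for a quick check on simple data.
```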

18.
This paper explores the extent to which the application of statistical stopping rules in clinical trials can create artificial heterogeneity of treatment effects in overviews (meta-analyses) of related trials. For illustration, we concentrate on overviews of identically designed group sequential trials, using either fixed nominal or O'Brien-Fleming two-sided boundaries. Some analytic results are obtained for two-group designs, and simulation studies are used otherwise, with the following overall findings. The use of stopping rules leads to biased estimates of treatment effect, so that the assessment of heterogeneity of results in an overview of trials, some of which have used stopping rules, is confounded by this bias. If the true treatment effect being studied is small, as is often the case, then artificial heterogeneity is introduced, increasing the Type I error rate of the test of homogeneity. This could lead to erroneous use of a random-effects model, producing exaggerated estimates and confidence intervals. However, if the true mean effect is large, then between-trial heterogeneity may be underestimated. When undertaking or interpreting overviews, one should ascertain whether stopping rules have been used (either formally or informally) and should consider whether their use might account for any heterogeneity found.
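
A small simulation in the spirit of this setup: a two-look group sequential trial with illustrative O'Brien-Fleming-type boundaries, showing the bias of the reported estimate when the true effect is small; all constants are assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

def sequential_estimate(delta=0.10, n_per_look=100, z_bounds=(2.80, 1.98)):
    """One two-look group sequential trial on N(delta, 1) data, with
    illustrative O'Brien-Fleming-type two-sided boundaries: stop early if
    the interim z-statistic crosses the first bound, and report the
    estimated effect at the moment of stopping."""
    x = rng.normal(delta, 1.0, size=len(z_bounds) * n_per_look)
    for k, zb in enumerate(z_bounds, start=1):
        m = k * n_per_look
        if abs(x[:m].mean() * np.sqrt(m)) >= zb or k == len(z_bounds):
            return x[:m].mean()

ests = np.array([sequential_estimate() for _ in range(20_000)])
print("true effect 0.100, mean reported estimate:", ests.mean().round(3))
# Early stops overshoot the boundary, biasing estimates away from the null;
# pooled in an overview, this bias shows up as artificial heterogeneity.
```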

19.
Zhao and Tsiatis (1997) consider the problem of estimating the distribution of the quality-adjusted lifetime when the chronological survival time is subject to right censoring. The quality-adjusted lifetime is typically defined as a weighted sum of the times spent in certain states up until death or some other failure time. They propose an estimator and establish the relevant asymptotics under the assumption of independent censoring. In this paper we extend the data structure with a covariate process observed until the end of follow-up and identify the optimal estimation problem. Because of the curse of dimensionality, no globally efficient nonparametric estimators with good practical performance at moderate sample sizes exist. Given a correctly specified model for the hazard of censoring conditional on the observed quality-of-life and covariate processes, we propose a closed-form one-step estimator of the distribution of the quality-adjusted lifetime whose asymptotic variance attains the efficiency bound if we can correctly specify a lower-dimensional working model for the conditional distribution of quality-adjusted lifetime given the observed quality-of-life and covariate processes. The estimator remains consistent and asymptotically normal even if this latter submodel is misspecified. The practical performance of the estimators is illustrated with a simulation study. We also extend the proposed one-step estimator to the case where treatment assignment is confounded by observed risk factors, so that the estimator can be used to test a treatment effect in an observational study.
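
For orientation, the simple weighted (IPCW) estimator in this data structure, which the proposed one-step estimator improves upon, can be sketched as follows — an interpretive reconstruction, with illustrative names:

```python
import numpy as np

def ipcw_qal_survival(qal, t, delta, censor_surv, x):
    """IPCW estimate of P(quality-adjusted lifetime > x). qal: observed
    quality-adjusted lifetime; t: follow-up time; delta: 1 if the failure
    time was observed (uncensored); censor_surv(t): estimated probability
    of remaining uncensored through t. Complete cases are reweighted by
    the inverse censoring survival probability."""
    qal, t, delta = (np.asarray(v, dtype=float) for v in (qal, t, delta))
    w = delta / np.maximum(censor_surv(t), 1e-8)
    return float(np.mean(w * (qal > x)))

# e.g. with a known exponential censoring rate c = 0.05 per month:
# ipcw_qal_survival(qal, t, delta, lambda u: np.exp(-0.05 * u), x=12.0)
```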

20.
K L Larson, B S Strauss. Biochemistry 1987;26(9):2471-2479
We analyzed the ability of DNA polymerases to bypass damage on single- and double-stranded templates. In vitro DNA synthesis was studied on UV-irradiated and polyaromatic hydrocarbon-reacted (benzo[a]pyrenediol epoxide and oxiranylpyrene) double-stranded templates by a protocol involving initiation on a uniquely nicked circular double-stranded template. The template was prepared by treating single-stranded (+)M13mp2 circular strands with mutagen and then hybridizing with restricted M13 RFmp2, followed by isolation of the nicked RFII forms. The protocol permits either the (+) strand, the (-) strand, or both strands to carry lesions. We found that the rules for termination and bypass of lesions previously observed with single-stranded DNA templates also hold for double-stranded templates. Termination of synthesis occurs primarily one nucleotide 3' to the lesion in the template strand. Bypass of UV-induced lesions can be followed in a series of three partial reactions in the presence of Mn2+ and dGMP, which relax the specificity of nucleotide insertion and the 3'→5' exonuclease activity, respectively. There is no evidence for greater permissivity of bypass in double- as opposed to single-stranded templates. As with single-stranded templates, purines, preferentially deoxyadenosine (dA), are inserted opposite lesions. Lesions in the nontemplate strand elicit neither termination nor pausing. The addition of RecA protein resulted in a measurable increase of bypass in this system.
