Similar Articles
20 similar articles found.
1.
Caged-layer hens were scored as infested or uninfested by visual examination of the vent region, and the number of northern fowl mites, Ornithonyssus sylviarum (Canestrini & Fanzago), per hen was estimated. The proportion infested and the average number of mites per hen were shown to have a highly significant, positive relationship (r = 0.936). Sampling among houses within a flock, and among rows and sections within houses, was analyzed to determine the reliability of sampling a representative portion of a flock. Low- and moderate-tolerance treatment thresholds, based on the percentage of hens infested with mites, were developed from sampling 1 wk before and 1 wk after acaricide treatments deemed necessary by the producer. These thresholds were used to compare a fixed (single) sampling plan, a curtailed version of the fixed sampling plan, and a sequential sampling plan based on a sequential probability ratio test, by sampling 174 hens (the maximum number needed for the single sampling plan). The sequential sampling plan required fewer hen examinations on average to reach a treatment decision than the other plans, depending on the infestation tolerance limits. Using a low-tolerance approach in which infestations below 15% are considered noneconomic (safe threshold) and infestations above 25% are considered economically important (action threshold), as few as 5 hens required examination to reach a treatment decision. Sequential sampling plan graphs are presented for two tolerance-threshold scenarios (a 15% safe threshold paired with a 25% action threshold, and a 35% safe threshold paired with a 45% action threshold). These sequential sampling plans using presence-absence assessments should greatly facilitate monitoring and treatment decisions for this important pest.
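A minimal sketch of a Wald sequential probability ratio test for presence-absence (binomial) data using the 15% safe and 25% action thresholds mentioned above. The function name and the nominal error rates alpha and beta are illustrative assumptions, not values taken from the paper.

```python
import math

def sprt_decision(n_infested, n_examined, p0=0.15, p1=0.25, alpha=0.10, beta=0.10):
    """Wald SPRT for a binomial proportion (presence-absence sampling).

    p0: safe threshold (infestation rates below this are noneconomic)
    p1: action threshold (infestation rates above this warrant treatment)
    alpha, beta: nominal error rates -- illustrative values, not from the paper.
    Returns 'treat', 'no_treat', or 'continue'.
    """
    a = math.log((1 - beta) / alpha)      # upper log-boundary
    b = math.log(beta / (1 - alpha))      # lower log-boundary
    # log-likelihood ratio after examining n_examined hens with n_infested positives
    llr = (n_infested * math.log(p1 / p0)
           + (n_examined - n_infested) * math.log((1 - p1) / (1 - p0)))
    if llr >= a:
        return "treat"        # infestation likely above the action threshold
    if llr <= b:
        return "no_treat"     # infestation likely below the safe threshold
    return "continue"         # keep examining hens

# Example: all of the first 5 hens infested -> the upper boundary is already crossed
print(sprt_decision(5, 5))
```

With these illustrative error rates, five consecutive infested hens are enough to cross the upper boundary, which is consistent with the "as few as 5 hens" figure quoted in the abstract.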

2.
Symmetric group sequential test designs
In Phase III clinical trials, ethical considerations often demand interim analyses in order that the better treatment be made available to all patients as soon as possible. Group sequential test designs that do not treat the hypotheses symmetrically may not fully address this concern since early termination of the study may be easier under one of the hypotheses. We present a one-parameter family of symmetric one-sided group sequential designs that are nearly fully efficient in terms of the average sample number. The symmetric tests are then extended to a two-sided hypothesis test. These symmetric two-sided group sequential tests are found to have improved overall efficiency when compared to the tests proposed by Pocock (1977, Biometrika 64, 191-199) and O'Brien and Fleming (1979, Biometrics 35, 549-556). Tables of critical values for both one-sided and two-sided symmetric designs are provided, thus allowing easy determination of sample sizes and stopping boundaries for a group sequential test. Approximate tests based on these designs are proposed for use when the number and timing of analyses are random.

3.
A pest management decision to initiate a control treatment depends upon an accurate estimate of mean pest density. Presence-absence sampling plans significantly reduce sampling efforts to make treatment decisions by using the proportion of infested leaves to estimate mean pest density in lieu of counting individual pests. The use of sequential hypothesis testing procedures can significantly reduce the number of samples required to make a treatment decision. Here we construct a mean-proportion relationship for Oligonychus perseae Tuttle, Baker, and Abatiello, a mite pest of avocados, from empirical data, and develop a sequential presence-absence sampling plan using Bartlett's sequential test procedure. Bartlett's test can accommodate pest population models that contain nuisance parameters that are not of primary interest. However, it requires that population measurements be independent, which may not be realistic because of spatial correlation of pest densities across trees within an orchard. We propose to mitigate the effect of spatial correlation in a sequential sampling procedure by using a tree-selection rule (i.e., maximin) that sequentially selects each newly sampled tree to be maximally spaced from all other previously sampled trees. Our proposed presence-absence sampling methodology applies Bartlett's test to a hypothesis test developed using an empirical mean-proportion relationship coupled with a spatial, statistical model of pest populations, with spatial correlation mitigated via the aforementioned tree-selection rule. We demonstrate the effectiveness of our proposed methodology over a range of parameter estimates appropriate for densities of O. perseae that would be observed in avocado orchards in California.
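A minimal sketch of the greedy "maximin" tree-selection idea described above: each newly sampled tree is chosen to maximize its minimum distance to all previously sampled trees. The coordinates, the brute-force search, and the function name are illustrative assumptions.

```python
import math

def maximin_order(trees, first=0):
    """Greedy maximin selection: order trees so that each newly chosen tree
    maximizes its minimum distance to all previously chosen trees.

    trees: list of (x, y) coordinates of candidate trees (illustrative input).
    Returns the indices of trees in sampling order.
    """
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    chosen = [first]
    remaining = set(range(len(trees))) - {first}
    while remaining:
        # pick the candidate whose nearest already-chosen tree is farthest away
        nxt = max(remaining,
                  key=lambda i: min(dist(trees[i], trees[j]) for j in chosen))
        chosen.append(nxt)
        remaining.remove(nxt)
    return chosen

# Example: a small 3 x 3 grid of trees, 10 m apart
grid = [(x * 10.0, y * 10.0) for x in range(3) for y in range(3)]
print(maximin_order(grid))
```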

4.
In observational cohort studies with complex sampling schemes, truncation arises when the time to event of interest is observed only when it falls below or exceeds another random time, that is, the truncation time. In more complex settings, observation may require a particular ordering of event times; we refer to this as sequential truncation. Estimators of the event time distribution have been developed for simple left-truncated or right-truncated data. However, these estimators may be inconsistent under sequential truncation. We propose nonparametric and semiparametric maximum likelihood estimators for the distribution of the event time of interest in the presence of sequential truncation, under two truncation models. We show the equivalence of an inverse probability weighted estimator and a product limit estimator under one of these models. We study the large sample properties of the proposed estimators and derive their asymptotic variance estimators. We evaluate the proposed methods through simulation studies and apply the methods to an Alzheimer's disease study. We have developed an R package, seqTrun, for implementation of our method.
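For background only: a minimal sketch of the classical risk-set (product-limit) estimator for simple left-truncated data, which the sequential-truncation estimators above generalize. This is not the paper's seqTrun method, and the sketch ignores censoring to stay short.

```python
def left_truncated_product_limit(entry, event):
    """Product-limit survival estimate for left-truncated data.

    entry[i]: truncation (delayed-entry) time of subject i
    event[i]: event time of subject i (observed only because event > entry)
    Returns a list of (t, S(t)) at the distinct event times.
    Censoring is ignored here to keep the sketch minimal.
    """
    times = sorted(set(event))
    surv, s = [], 1.0
    for t in times:
        d = sum(1 for ev in event if ev == t)                          # events at t
        r = sum(1 for en, ev in zip(entry, event) if en <= t <= ev)    # at risk at t
        if r > 0:
            s *= 1.0 - d / r
        surv.append((t, s))
    return surv

# Toy example with three left-truncated subjects
print(left_truncated_product_limit(entry=[0.0, 1.0, 2.0], event=[3.0, 4.0, 5.0]))
```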

5.
6.
Artificial data were simulated by using two-exponential functions and normally distributed pseudo-random numbers. The variation corresponded to either constant or relative variance for the data error. These data were used to test (i) three different weighting functions and (ii) the effect of data truncation on the precision of estimating the parameters of two-exponential functions.
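A minimal sketch of this kind of simulation, assuming illustrative parameter values and SciPy availability: two-exponential data are generated with either constant or relative (proportional) error and fitted with and without weighting. The specific weighting functions of the study are not reproduced here.

```python
import numpy as np
from scipy.optimize import curve_fit

def two_exp(t, a1, k1, a2, k2):
    """Sum of two exponentials: y = a1*exp(-k1*t) + a2*exp(-k2*t)."""
    return a1 * np.exp(-k1 * t) + a2 * np.exp(-k2 * t)

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 50)
true = (5.0, 1.5, 2.0, 0.2)                  # illustrative parameter values
clean = two_exp(t, *true)

# Constant-variance vs relative-variance (proportional) data error
y_const = clean + rng.normal(0.0, 0.1, t.size)
y_rel   = clean * (1.0 + rng.normal(0.0, 0.05, t.size))

# Unweighted fit vs a fit weighted by the assumed relative error (sigma ~ y)
p_unw, _ = curve_fit(two_exp, t, y_rel, p0=(4.0, 1.0, 1.0, 0.1))
p_wt,  _ = curve_fit(two_exp, t, y_rel, p0=(4.0, 1.0, 1.0, 0.1),
                     sigma=0.05 * clean, absolute_sigma=True)
print("unweighted:", np.round(p_unw, 3))
print("weighted:  ", np.round(p_wt, 3))
```

Truncating the tail of `t` before fitting and repeating the comparison would reproduce the second question studied here: how data truncation affects parameter precision.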

7.
This paper explores the extent to which application of statistical stopping rules in clinical trials can create an artificial heterogeneity of treatment effects in overviews (meta-analyses) of related trials. For illustration, we concentrate on overviews of identically designed group sequential trials, using either fixed nominal or O'Brien and Fleming two-sided boundaries. Some analytic results are obtained for two-group designs and simulation studies are otherwise used, with the following overall findings. The use of stopping rules leads to biased estimates of treatment effect so that the assessment of heterogeneity of results in an overview of trials, some of which have used stopping rules, is confounded by this bias. If the true treatment effect being studied is small, as is often the case, then artificial heterogeneity is introduced, thus increasing the Type I error rate in the test of homogeneity. This could lead to erroneous use of a random effects model, producing exaggerated estimates and confidence intervals. However, if the true mean effect is large, then between-trial heterogeneity may be underestimated. When undertaking or interpreting overviews, one should ascertain whether stopping rules have been used (either formally or informally) and should consider whether their use might account for any heterogeneity found.
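A minimal simulation sketch of the bias mechanism discussed above: trials that cross an interim boundary stop with an exaggerated effect estimate. The boundary value, effect size, and sample sizes are illustrative assumptions, not the paper's design.

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials, n_per_stage, true_delta, sd = 20000, 50, 0.1, 1.0
crossing = 2.8  # illustrative O'Brien-Fleming-like first-look boundary

early_estimates, final_estimates = [], []
for _ in range(n_trials):
    # Stage 1: treatment-minus-control differences for the first group of patients
    x1 = rng.normal(true_delta, sd, n_per_stage)
    z1 = x1.mean() / (sd / np.sqrt(n_per_stage))
    if abs(z1) >= crossing:
        early_estimates.append(x1.mean())          # trial stops at the interim look
    else:
        x2 = rng.normal(true_delta, sd, n_per_stage)
        final_estimates.append(np.concatenate([x1, x2]).mean())

print("true effect:", true_delta)
print("mean estimate | stopped early:", np.mean(early_estimates))
print("mean estimate over all trials:", np.mean(early_estimates + final_estimates))
```

The trials that stop early report an average effect far above the true 0.1, which is the source of the artificial between-trial heterogeneity described in the abstract.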

8.
In long-term clinical studies, recurrent event data are sometimes collected and used to contrast the efficacies of two different treatments. The event reoccurrence rates can be compared using the popular negative binomial model, which incorporates information related to patient heterogeneity into a data analysis. For treatment allocation, a balanced approach in which equal sample sizes are obtained for both treatments is predominantly adopted. However, if one treatment is superior, then it may be desirable to allocate fewer subjects to the less-effective treatment. To accommodate this objective, a sequential response-adaptive treatment allocation procedure is derived based on the doubly adaptive biased coin design. Our proposed treatment allocation schemes have been shown to be capable of reducing the number of subjects receiving the inferior treatment while simultaneously retaining a test power level that is comparable to that of a balanced design. The redesign of a clinical study illustrates the advantages of using our procedure.
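A minimal sketch of the Hu-Zhang allocation function commonly used with the doubly adaptive biased coin design. The tuning parameter gamma and the target allocation are illustrative, and the mapping from negative binomial rate estimates to a target allocation is an assumption rather than the paper's exact rule.

```python
def dbcd_allocation_prob(current_prop, target_prop, gamma=2.0):
    """Hu-Zhang allocation function g(x, rho) for the doubly adaptive
    biased coin design: probability that the next subject is assigned to
    treatment 1, given the current proportion x on treatment 1 and the
    current estimate rho of the desired (target) allocation proportion.
    gamma >= 0 controls how aggressively imbalance is corrected.
    """
    x, rho = current_prop, target_prop
    if x in (0.0, 1.0):                 # degenerate starts: force the other arm
        return 1.0 - x
    num = rho * (rho / x) ** gamma
    den = num + (1.0 - rho) * ((1.0 - rho) / (1.0 - x)) ** gamma
    return num / den

# Example: target 2:1 allocation in favour of the apparently better arm,
# but treatment 1 currently holds only 50% of subjects -> probability > 2/3
print(dbcd_allocation_prob(current_prop=0.5, target_prop=2/3, gamma=2.0))
```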

9.
Mutations in ABCA1 uniformly decrease plasma HDL-cholesterol (HDL-C) and reduce cholesterol efflux, yet different mutations in ABCA1 result in different phenotypic effects in heterozygotes. For example, truncation mutations result in significantly lower HDL-C and apolipoprotein A-I (apoA-I) levels in heterozygotes compared with nontruncation mutations, suggesting that truncation mutations may negatively affect the wild-type allele. To specifically test this hypothesis, we examined ABCA1 protein expression in response to 9-cis-retinoic acid (9-cis-RA) and 22-R-hydroxycholesterol (22-R-OH-Chol) in a collection of human fibroblasts representing eight different mutations and observed that truncation mutations blunted the response to oxysterol stimulation and dominantly suppressed induction of the remaining full-length allele to 5-10% of wild-type levels. mRNA levels between truncation and nontruncation mutations were comparable, suggesting that ABCA1 expression was suppressed at the protein level. Dominant negative activity of truncated ABCA1 was recapitulated in an in vitro model using transfected Cos-7 cells. Our results suggest that the severe reduction of HDL-C in patients with truncation mutations may be at least partly explained by dominant negative suppression of expression and activity of the remaining full-length ABCA1 allele. These data suggest that ABCA1 requires a physical association with itself or other molecules for normal function and has important pharmacogenetic implications for individuals with truncation mutations.

10.
Tseng GC, Wong WH. Biometrics 2005, 61(1):10-16
In this article, we propose a method for clustering that produces tight and stable clusters without forcing all points into clusters. The methodology is general but was initially motivated by cluster analysis of microarray experiments. Most current algorithms aim to assign all genes into clusters. For many biological studies, however, we are mainly interested in identifying the most informative, tight, and stable clusters of sizes, say, 20-60 genes for further investigation. We want to avoid the contamination of tightly regulated expression patterns of biologically relevant genes due to other genes whose expressions are only loosely compatible with these patterns. "Tight clustering" has been developed specifically to address this problem. It applies K-means clustering as an intermediate clustering engine. Early truncation of a hierarchical clustering tree is used to overcome the local minimum problem in K-means clustering. The tightest and most stable clusters are identified in a sequential manner through an analysis of the tendency of genes to be grouped together under repeated resampling. We validated this method in a simulated example and applied it to analyze a set of expression profiles in the study of embryonic stem cells.
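A minimal sketch of the resampling idea behind tight clustering: run K-means on repeated subsamples, accumulate a co-membership matrix, and keep only groups of points that are almost always clustered together. The thresholds, subsample fraction, and use of scikit-learn are illustrative assumptions, and the hierarchical-tree initialization step described in the abstract is omitted.

```python
import numpy as np
from sklearn.cluster import KMeans

def comembership_matrix(X, k=3, n_resamples=50, frac=0.7, seed=0):
    """Average co-membership of point pairs under K-means on random subsamples."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    together = np.zeros((n, n))
    counted = np.zeros((n, n))
    for _ in range(n_resamples):
        idx = rng.choice(n, size=int(frac * n), replace=False)
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X[idx])
        same = (labels[:, None] == labels[None, :]).astype(float)
        together[np.ix_(idx, idx)] += same
        counted[np.ix_(idx, idx)] += 1.0
    return np.divide(together, counted, out=np.zeros_like(together), where=counted > 0)

# Points whose pairwise co-membership is near 1 form candidate tight clusters
X = np.vstack([np.random.default_rng(1).normal(c, 0.2, (20, 2)) for c in (0.0, 3.0, 6.0)])
M = comembership_matrix(X)
print("matrix entries with co-membership > 0.95:", int((M > 0.95).sum()))
```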

11.
Recent null models that place species ranges randomly within a bounded domain have produced controversial results. Many such geometric constraint models predict a peak in species richness in the centre of domains in the absence of underlying environmental gradients or interspecific interactions. We used two-dimensional simulation models to explore different ways that species ranges could interact with the domain boundary. In the rejection model, a randomly generated range that overlaps a domain boundary is removed from the simulation. In the reshaping model, a range that overlaps the domain boundary is reshaped so that the entire range is placed within the domain. The truncation model allows potential ranges to extend across the boundary, but only that portion of the range within the domain is included in the realized range. Both rejection and reshaping models produced a drop in species richness near domain boundaries, though the effect was less pronounced in the reshaping model. Our truncation model did not produce any spatial pattern in species richness. Thus the random placement of species ranges within a bounded domain does not necessarily lead to a mid-domain effect.
Range truncation is consistent with bioclimate envelope models, which can successfully predict a species' range in response to the availability of appropriate climate conditions. We argue that such flexible range sizes are more realistic than the assumption that range size is an unvarying characteristic of a species. Other range characteristics, including size and shape, can change near domain boundaries in the null models, including the truncation model. A broader consideration of range characteristics near domain boundaries could be productive.

12.
An outline of a pest management plan for cotton insects has been developed that is essentially based on three sets of sequential sampling plans. These plans utilize the binomial sampling theory and treatment levels for pests during three phenological stages of cotton plant growth. The plans also provide a known level of accuracy in making management decisions and frequently will require less time than other sampling techniques.

13.
Aplenc R, Zhao H, Rebbeck TR, Propert KJ. Genetics 2003, 163(3):1215-1219
Molecular epidemiological association studies use valuable biosamples and incur costs. Statistical methods for early genotyping termination may conserve biosamples and costs. Group sequential methods (GSM) allow early termination of studies on the basis of interim comparisons. Simulation studies evaluated the application of GSM using data from a case-control study of GST genotypes and prostate cancer. Group sequential boundaries (GSB) were defined in the EAST-2000 software and were evaluated for study termination when early evidence suggested that the null hypothesis of no association between genotype and disease was unlikely to be rejected. Early termination of GSTM1 genotyping, which demonstrated no association with prostate cancer, occurred in >90% of the simulated studies. On average, 36.4% of biosamples were saved from unnecessary genotyping. In contrast, for GSTT1, which demonstrated a positive association, inappropriate termination occurred in only 6.6% of the simulated studies. GSM may provide significant cost and sample savings in molecular epidemiology studies.

14.
Group sequential stopping rules are often used during the conduct of clinical trials in order to attain more ethical treatment of patients and to better address efficiency concerns. Because the use of such stopping rules materially affects the frequentist operating characteristics of the hypothesis test, it is necessary to choose an appropriate stopping rule during the planning of the study. It is often the case, however, that the number and timing of interim analyses are not precisely known at the time of trial design, and thus the implementation of a particular stopping rule must allow for flexible determination of the schedule of interim analyses. In this article, we consider the use of constrained stopping boundaries in the implementation of stopping rules. We compare this approach when used on various scales for the test statistic. When implemented on the scale of boundary crossing probabilities, this approach is identical to the error spending function approach of Lan and DeMets (1983).

15.
Lan KK, Lachin JM. Biometrics 1990, 46(3):759-770
To control the Type I error probability in a group sequential procedure using the logrank test, it is important to know the information times (fractions) at the times of interim analyses conducted for purposes of data monitoring. For the logrank test, the information time at an interim analysis is the fraction of the total number of events to be accrued in the entire trial. In a maximum information trial design, the trial is concluded when a prespecified total number of events has been accrued. For such a design, therefore, the information time at each interim analysis is known. However, many trials are designed to accrue data over a fixed duration of follow-up on a specified number of patients. This is termed a maximum duration trial design. Under such a design, the total number of events to be accrued is unknown at the time of an interim analysis. For a maximum duration trial design, therefore, these information times need to be estimated. A common practice is to assume that a fixed fraction of information will be accrued between any two consecutive interim analyses, and then employ a Pocock or O'Brien-Fleming boundary. In this article, we describe an estimate of the information time based on the fraction of total patient exposure, which tends to be slightly negatively biased (i.e., conservative) if survival is exponentially distributed. We then present a numerical exploration of the robustness of this estimate when nonexponential survival applies. We also show that the Lan-DeMets (1983, Biometrika 70, 659-663) procedure for constructing group sequential boundaries with the desired level of Type I error control can be computed using the estimated information fraction, even though it may be biased. Finally, we discuss the implications of employing a biased estimate of study information for a group sequential procedure.
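A minimal sketch of the two quantities discussed above: an information-time estimate based on the fraction of total patient exposure, and the cumulative Type I error allotted by the O'Brien-Fleming-type Lan-DeMets spending function alpha*(t) = 2 - 2*Phi(z_{alpha/2}/sqrt(t)). The exposure figures are illustrative; turning the spent error into actual boundary critical values would additionally require recursive numerical integration, which is omitted here.

```python
from math import sqrt
from scipy.stats import norm

def information_fraction(exposure_so_far, planned_total_exposure):
    """Estimated information time: fraction of total patient exposure accrued."""
    return exposure_so_far / planned_total_exposure

def obf_spending(t, alpha=0.05):
    """O'Brien-Fleming-type Lan-DeMets error spending function (two-sided):
    cumulative Type I error spent by information time t in (0, 1]."""
    z = norm.ppf(1.0 - alpha / 2.0)
    return 2.0 - 2.0 * norm.cdf(z / sqrt(t))

# Illustrative maximum-duration trial: 400 of a planned 1000 patient-years accrued
t1 = information_fraction(400.0, 1000.0)
print("information fraction:", t1)
print("cumulative alpha spent at this look:", round(obf_spending(t1), 5))
print("alpha left for later looks:", round(0.05 - obf_spending(t1), 5))
```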

16.
Spatial distribution patterns of adult squash bugs were determined in watermelon, Citrullus lanatus (Thunberg) Matsumura and Nakai, during 2001 and 2002. Results of analysis using Taylor's power law regression model indicated that squash bugs were aggregated in watermelon. Taylor's power law provided a good fit with r² = 0.94. A fixed-precision sequential sampling plan was developed for estimating adult squash bug density at fixed precision levels in watermelon. The plan was tested using a resampling simulation method on 9 and 13 independent data sets ranging in density from 0.15 to 2.52 adult squash bugs per plant. Average estimated means obtained in 100 repeated simulation runs were within the 95% CI of the true means for all the data. Average estimated levels of precision were similar to the desired level of precision, particularly when the sampling plan was tested on data having an average mean density of 1.19 adult squash bugs per plant. Also, a sequential sampling plan for classifying adult squash bug density as below or above an economic threshold was developed to assist in the decision-making process. The classification sampling plan is advantageous in that it requires smaller sample sizes to estimate the population status when the population density differs greatly from the action threshold. However, the plan may require excessively large sample sizes when the density is close to the threshold. Therefore, an integrated sequential sampling plan was developed using a combination of fixed-precision and classification sequential sampling plans. The integration of sampling plans can help reduce sampling requirements.
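A minimal sketch of how Taylor's power law (s² = a·m^b) translates into the sample size needed for a fixed precision level D (standard error divided by the mean). The coefficients used below are illustrative assumptions, not the fitted values from this study.

```python
def taylor_sample_size(mean_density, a, b, precision=0.25):
    """Number of plants to sample so that SE(mean)/mean = precision,
    given Taylor's power law variance s^2 = a * m^b.

    Derivation: SE = sqrt(a * m**b / n), so D = SE/m  =>  n = a * m**(b-2) / D**2.
    """
    return a * mean_density ** (b - 2.0) / precision ** 2

# Illustrative Taylor coefficients (not the fitted values from the study)
a_coef, b_coef = 2.0, 1.5
for m in (0.15, 1.19, 2.52):   # densities spanning the range reported above
    n = taylor_sample_size(m, a_coef, b_coef, precision=0.25)
    print(f"mean = {m:.2f} bugs/plant -> sample about {n:.0f} plants")
```

Because b is typically below 2 for aggregated populations, the required sample size grows as mean density falls, which is why classification plans need fewer samples only when density is far from the action threshold.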

17.
The patterning of an internal organ, like the heart, is little understood. Central to this patterning is the formation, or the acquisition, of an anteroposterior (A-P) axis. We have approached the question of how the heart tube acquires polarity in the zebrafish, Brachydanio rerio, which offers numerous advantages for studying cardiac morphogenesis. During the early stages of organogenesis in the fish, the heart tube lies in an A-P orientation with the venous end lying anteriorly and the arterial end lying posteriorly. High doses (10⁻⁶-10⁻⁵ M) of retinoic acid (RA) cause truncation of the body axis, as they do in Xenopus. Low doses of retinoic acid (10⁻⁸-10⁻⁷ M), which do not appear to affect the rest of the embryo, have pronounced effects upon heart tube morphogenesis, causing it to shrink progressively along the A-P axis. To investigate this further, we identified monoclonal antibodies that distinguish between the zebrafish cardiac chambers and used them to show that the RA-induced cardiac truncation always begins at the arterial end of the heart tube. There is a continuous gradient of sensitivity from the arterial to the venous end, such that increasing RA exposure causes the progressive and sequential deletion first of the bulbus arteriosus and then, in order, of the ventricle, the atrium, and the sinus venosus. As exposure increases, parts of chambers are deleted before entire chambers; thus, the sensitivity to RA appears to be independent of chamber boundaries. The analysis of the heart tube's sensitivity to RA and its timing suggest that polarity is established during or shortly after initial commitment to the cardiac lineage.

18.
Buckiová D, Brown NA. Teratology 1999, 59(3):139-147
To study the mechanism by which hyperthermia affects the development of the rostral neural tube, we used a model in which closely staged presomite 9.5-day rat embryos were exposed in culture to 43°C for 13 min, and then cultured further for 12-48 hr. This treatment had little effect on the development of the rest of the embryo, but resulted in a spectrum of brain defects, the most severe being a lack of all forebrain and midbrain structures. Whole-mount in situ hybridisation was used to monitor the expression domains of Otx2, Emx2, Krox20, and hoxb1. These showed that there were no ectopic expression patterns for any gene at any stage examined. Even in those embryos which apparently lacked all forebrain and midbrain structures, there were expression domains of Otx2 and Emx2 in the most rostral neural tissue, and these retained their nested dorso-ventral boundaries, showing that cells fated to form rostral brain were not wholly eliminated. Thus, heat-induced rostral neural tube truncation arises by a quite different mechanism from the respecification proposed for retinoic acid, despite their very similar phenotypes. In the hindbrain region of treated embryos, we observed decreased intensity of Krox20 staining, and an abnormal relationship developed between the position of hoxb1 expression and the otocyst and pharyngeal arches. In the most extreme cases, this domain was shifted to be more caudal than the rostral edge of the otocyst, while the otocyst retained its normal position relative to the pharyngeal arches. We interpret this as a growth imbalance between neuroepithelium and overlying tissues, perhaps due to a disruption of signals from the midbrain/hindbrain boundary.

19.
Adaptive sample size calculations in group sequential trials
Lehmacher W, Wassmer G. Biometrics 1999, 55(4):1286-1290
A method for group sequential trials that is based on the inverse normal method for combining the results of the separate stages is proposed. Without inflating the Type I error rate, this method enables data-driven sample size reassessments during the course of the study. It uses the stopping boundaries of the classical group sequential tests. Furthermore, exact test procedures may be derived for a wide range of applications. The procedure is compared with the classical designs in terms of power and expected sample size.
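A minimal sketch of the inverse normal combination statistic underlying the method described above: stage-wise one-sided p-values are converted to z-scores and combined with prespecified weights. The weights and p-values shown are illustrative.

```python
from math import sqrt
from scipy.stats import norm

def inverse_normal_combination(p_values, weights=None):
    """Combine independent stage-wise one-sided p-values via the inverse
    normal method: Z = sum(w_i * Phi^{-1}(1 - p_i)) / sqrt(sum(w_i^2))."""
    if weights is None:
        weights = [1.0] * len(p_values)
    num = sum(w * norm.ppf(1.0 - p) for w, p in zip(weights, p_values))
    return num / sqrt(sum(w * w for w in weights))

# Two stages with equal prespecified weights (illustrative p-values)
z = inverse_normal_combination([0.08, 0.03], weights=[1.0, 1.0])
print("combined z:", round(z, 3), " one-sided p:", round(1 - norm.cdf(z), 4))
```

Because the weights are fixed in advance, the second-stage sample size can be re-estimated from interim data and the combined statistic still respects the classical group sequential boundaries.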

20.
Total shoulder arthroplasty (TSA) is an accepted and highly successful treatment for a range of shoulder pathologies. Several risk factors for failure of the prosthesis are known. A pathological scapular orientation, observed in elderly people or in patients suffering from neuromuscular diseases, could be a cause of failure that has not yet been investigated. To test this hypothesis, a numerical musculoskeletal model of the glenohumeral joint was used to compare two TSA cases: a reference normal case and a case with a pathological anterior tilt of the scapula. An active abduction of 150° was simulated. Joint force, contact pattern, and polyethylene and cement stress were evaluated for both cases. The pathological tilt slightly increased the joint force and the contact pressure, but also shifted the contact pattern. This eccentric contact increased the stress level within the polyethylene of the glenoid component and within the surrounding cement layer. This adverse effect occurred mainly during the first 60° of abduction. Therefore, a pathological orientation of the scapula may increase the risk of failure of the cement layer around the glenoid component. These preliminary numerical results should be confirmed by a clinical study.
