Similar Documents
20 similar documents found (search time: 62 ms)
1.
Significance testing for correlated binary outcome data (cited 1 time: 0 self-citations, 1 by others)
B Rosner  R C Milton 《Biometrics》1988,44(2):505-512
Multiple logistic regression is a commonly used multivariate technique for analyzing data with a binary outcome. One assumption needed for this method of analysis is the independence of outcome for all sample points in a data set. In ophthalmologic data and other types of correlated binary data, this assumption is often grossly violated and the validity of the technique becomes an issue. A technique has been developed (Rosner, 1984) that utilizes a polychotomous logistic regression model to allow one to look at multiple exposure variables in the context of a correlated binary data structure. This model is an extension of the beta-binomial model, which has been widely used to model correlated binary data when no covariates are present. In this paper, a relationship is developed between the two techniques, whereby it is shown that use of ordinary logistic regression in the presence of correlated binary data can result in true significance levels that are considerably larger than nominal levels in frequently encountered situations. This relationship is explored in detail in the case of a single dichotomous exposure variable. In this case, the appropriate test statistic can be expressed as an adjusted chi-square statistic based on the 2 × 2 contingency table relating exposure to outcome. The test statistic is easily computed as a function of the ordinary chi-square statistic and the correlation between eyes (or more generally between cluster members) for outcome and exposure, respectively. This generalizes some previous results obtained by Koval and Donner (1987, in Festschrift for V. M. Joshi, I. B. MacNeill (ed.), Vol. V, 199-224). (ABSTRACT TRUNCATED AT 250 WORDS)
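A rough sketch of this kind of correction (assuming, for clusters of size 2, a variance-inflation factor of 1 + ρ_outcome · ρ_exposure; the paper's exact adjustment may differ):

```python
def chi2_2x2(a, b, c, d):
    """Ordinary (unadjusted) Pearson chi-square for a 2 x 2 table
    [[a, b], [c, d]] relating exposure to outcome."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

def adjusted_chi2(x2, rho_outcome, rho_exposure):
    """Deflate the ordinary statistic by an assumed design-effect-style
    factor for paired cluster members (e.g. two eyes per person).
    The factor 1 + rho_y * rho_x is illustrative, not the paper's formula."""
    return x2 / (1.0 + rho_outcome * rho_exposure)

x2 = chi2_2x2(30, 70, 50, 50)          # unadjusted statistic
x2_adj = adjusted_chi2(x2, 0.6, 0.5)   # smaller, i.e. less anti-conservative
```

Ignoring the within-cluster correlation leaves `x2` too large, which is exactly the inflation of true significance levels described above.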

2.
Wang K 《Human heredity》2002,54(2):57-68
The method of variance components is the method of choice for mapping quantitative trait loci (QTLs) with general pedigrees. Being a likelihood-based method, it can be computationally intensive even for nuclear families, and has excessive false-positive rates in some situations. Here, two efficient score statistics to detect QTLs are derived: one assumes that the candidate locus has no dominance effect, and the other makes no such assumption. These two score statistics are asymptotically equivalent to the method of variance components, but they are easier to compute and more robust than the likelihood ratio statistic. The derivation of these score statistics is facilitated by separating the segregation parameters, the parameters that describe the distribution of the phenotypic value in the population, from the linkage parameters, the parameters that measure the effect of the candidate locus on the phenotypic value. Such a separation of the model parameters greatly reduces the number of parameters to be dealt with in the analysis of linkage. The asymptotic distributions of both score statistics are derived. Simulation studies indicate that, compared to the method of variance components, both score statistics have comparable or higher power, and their false-positive rates are closer to their respective nominal significance levels.

3.
A method is proposed for reconstructing the time and age dependence of incidence rates from successive age-prevalence cross sections taken from the sentinel surveys of irreversible diseases when there is an important difference in mortality between the infected and susceptible subpopulations. The prevalence information at different time-age points is used to generate a surface; the time-age variations along the life line profiles of this surface and the difference in mortality rates are used to reconstruct the time and age dependence of the incidence rate. Past attempts were based on specified parametric forms for the incidence or on the hypothesis of time-invariant forms for the age-prevalence cross sections. The proposed method makes no such assumptions and is thus capable of coping with rapidly evolving prevalence situations. In the simulations carried out, it is found to be resilient to important random noise components added to a prescribed incidence rate input. The method is also tested on a real data set of successive HIV age-prevalence cross sections from Burundi coupled to differential mortality data on HIV(+) and HIV(-) individuals. The often-made assumption that the incidence rate can be written as the product of a calendar time component and an age component is also examined. In this case, a pooling procedure is proposed to estimate the time and the age profiles of the incidence rate using the reconstructed incidence rates at all time-age points.
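The life-line relation underlying such a reconstruction can be sketched as follows (a standard derivation under the stated assumptions, with \(\mu_{+}\) and \(\mu_{-}\) the mortality rates of infected and susceptible individuals; the paper's exact equations may differ). Along a cohort's life line (age advancing with calendar time), the prevalence \(p\) satisfies

```latex
\frac{dp}{d\tau} = (1 - p)\, i(t, a) - p\,(1 - p)\,(\mu_{+} - \mu_{-}),
\qquad\text{so}\qquad
i(t, a) = \frac{dp/d\tau + p\,(1 - p)\,(\mu_{+} - \mu_{-})}{1 - p},
```

so the incidence rate can be read off from the slope of prevalence along the life-line profiles of the surface once the mortality difference is known.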

4.
Acceptance of public spaces is often guided by perceptual schemata. Such schemata also seem to play a role in thermal comfort and microclimate experience. For climate-responsive design with a focus on thermal comfort it is important to acquire knowledge about these schemata. For this purpose, perceived and “real” microclimate situations were compared for three Dutch urban squares. People were asked about their long-term microclimate perceptions, which resulted in “cognitive microclimate maps”. These were compared with mapped microclimate data from measurements representing the common microclimate when people stay outdoors. The comparison revealed some unexpectedly low matches; people clearly overestimated the influence of the wind. Therefore, a second assumption was developed: that it is the more salient wind situations that become engrained in people’s memory. A comparison using measurement data from windy days shows better matches. This suggests that these more salient situations play a role in the microclimate schemata that people develop about urban places. The consequences of this study for urban design are twofold. Firstly, urban design should address not only the “real” problems, but, more prominently, the “perceived” problems. Secondly, microclimate simulations addressing thermal comfort issues in urban spaces should focus on these perceived, salient situations.

5.
Power-frequency electric fields are strongly perturbed in the vicinity of human beings and experimental animals. As a consequence, the extrapolation of biological data from laboratory animals to human-exposure situations cannot use the unperturbed exposure field strength as a common exposure parameter. Rather, comparisons between species must be based on the actual electric fields at the outer surfaces of and inside the bodies of the subjects. Experimental data have been published on surface and internal fields for a few exposure situations, but it is not feasible to characterize experimentally more than a small fraction of the diverse types of exposures which occur in the laboratory and in the field. A predictive numerical model is needed, one whose predictions have been verified in situations where experimental data are available, and one whose results can be used with confidence in new exposure situations. This paper describes a numerical technique which can be used to develop such a model, and it carries out this development for a test case, that of a homogeneous right-circular cylinder resting upright on-end on a ground plane and exposed to a vertical, uniform, 60-Hz electric field. The accuracy of the model is tested by comparing short-circuit currents and induced current densities predicted by it to measured values; agreement is good.

6.
Fay MP  Tiwari RC  Feuer EJ  Zou Z 《Biometrics》2006,62(3):847-854
The annual percent change (APC) is often used to measure trends in disease and mortality rates, and a common estimator of this parameter uses a linear model on the log of the age-standardized rates. Under the assumption of linearity on the log scale, which is equivalent to a constant change assumption, APC can be equivalently defined in three ways as transformations of either (1) the slope of the line that runs through the log of each rate, (2) the ratio of the last rate to the first rate in the series, or (3) the geometric mean of the proportional changes in the rates over the series. When the constant change assumption fails then the first definition cannot be applied as is, while the second and third definitions unambiguously define the same parameter regardless of whether the assumption holds. We call this parameter the percent change annualized (PCA) and propose two new estimators of it. The first, the two-point estimator, uses only the first and last rates, assuming nothing about the rates in between. This estimator requires fewer assumptions and is asymptotically unbiased as the size of the population gets large, but has more variability since it uses no information from the middle rates. The second estimator is an adaptive one and equals the linear model estimator with a high probability when the rates are not significantly different from linear on the log scale, but includes fewer points if there are significant departures from that linearity. For the two-point estimator we can use confidence intervals previously developed for ratios of directly standardized rates. For the adaptive estimator, we show through simulation that the bootstrap confidence intervals give appropriate coverage.
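The first two definitions above can be sketched directly (a minimal illustration, not the authors' code; rates are assumed to be positive and at equally spaced yearly intervals):

```python
import math

def apc_log_linear(rates):
    """APC via the slope of an ordinary least-squares line through
    log(rate) against time (definition 1)."""
    y = [math.log(r) for r in rates]
    t = list(range(len(rates)))
    t_bar = sum(t) / len(t)
    y_bar = sum(y) / len(y)
    slope = sum((ti - t_bar) * (yi - y_bar) for ti, yi in zip(t, y)) \
        / sum((ti - t_bar) ** 2 for ti in t)
    return 100.0 * (math.exp(slope) - 1.0)

def pca_two_point(rates):
    """PCA two-point estimator: annualized ratio of last to first rate
    (definition 2); uses no information from the middle rates."""
    n = len(rates)
    return 100.0 * ((rates[-1] / rates[0]) ** (1.0 / (n - 1)) - 1.0)

rates = [50.0 * 1.03 ** k for k in range(6)]  # constant 3%/year change
```

When log-linearity holds exactly, as in the toy series above, both estimators recover the same 3% annual change; they diverge when the constant change assumption fails.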

7.
A new method is presented for estimating the parameters of two different models of a joint. The two models are: (1) A rotational joint with a fixed axis of rotation, also referred to as a hinge joint and (2) a ball and socket model, corresponding to a spherical joint. Given the motion of a set of markers, it is shown how the parameters can be estimated, utilizing the whole data set. The parameters are estimated from motion data by minimizing two objective functions. The method does not assume a rigid body motion, but only that each marker rotates around the same fixed axis of rotation or center of rotation. Simulation results indicate that in situations where the rigid body assumption is valid and when measurement noise is present, the proposed method is inferior to methods that utilize the rigid body assumption. However, when there are large skin movement artefacts, simulation results show the proposed method to be more robust.
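For the spherical-joint case, estimating a center of rotation from one marker's trajectory can be sketched as an algebraic least-squares sphere fit (a generic approach, not necessarily the objective function used in the paper):

```python
def solve3(M, v):
    """Solve a 3x3 linear system by Gaussian elimination with partial pivoting."""
    A = [row[:] + [vi] for row, vi in zip(M, v)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 4):
                A[r][c] -= f * A[col][c]
    x = [0.0] * 3
    for r in (2, 1, 0):
        x[r] = (A[r][3] - sum(A[r][c] * x[c] for c in range(r + 1, 3))) / A[r][r]
    return x

def fit_center(points):
    """Least-squares center of rotation: every position p lies on a sphere,
    |p - c|^2 = r^2; subtracting the mean equation gives the linear system
    2 (p - p_bar) . c = |p|^2 - mean(|p|^2), solved via normal equations."""
    n = len(points)
    mean = [sum(p[k] for p in points) / n for k in range(3)]
    msq = sum(sum(x * x for x in p) for p in points) / n
    M = [[0.0] * 3 for _ in range(3)]
    v = [0.0] * 3
    for p in points:
        a = [2.0 * (p[k] - mean[k]) for k in range(3)]
        b = sum(x * x for x in p) - msq
        for i in range(3):
            v[i] += a[i] * b
            for j in range(3):
                M[i][j] += a[i] * a[j]
    return solve3(M, v)
```

Because the fit constrains only the distances of one marker to an unknown center, it requires no rigid-body assumption, which mirrors the property claimed for the proposed method.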

8.
Most studies based on gene expression analysis rest on the assumption that the total abundance of RNA in a cell is roughly the same across cells. Experiments have shown, however, that changes in the expression of master regulators such as c-MYC can cause a global shift in the expression of almost all genes in some cell types, such as cancers. Such a shift violates this assumption and can lead to wrong or biased conclusions from standard analysis practices, such as the detection of differentially expressed (DE) genes and the molecular classification of tumors based on gene expression. Most existing gene expression data were generated without considering this possibility, and are therefore at risk of having produced unreliable results if such a global shift effect exists in the data. To evaluate this risk, we conducted a systematic study of the possible influence of the global gene expression shift on differential expression analysis and on molecular classification analysis. We collected data with known global shift effects, generated data simulating different forms of the effect based on a wide collection of real gene expression data, and conducted comparative studies of representative existing methods. We observed that some DE analysis methods are tolerant of the global shift while others are very sensitive to it. Classification accuracy is not sensitive to the shift and can actually benefit from it, but the genes selected for classification can be greatly affected.
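A toy illustration of why such a shift biases DE analysis (hypothetical numbers; per-sample total-count normalization stands in for the standard practice): if a regulator doubles the expression of most genes, equalizing totals makes a genuinely unchanged gene look down-regulated.

```python
def normalize(counts):
    """Scale a sample so its counts sum to 1 (the total-abundance assumption)."""
    total = sum(counts)
    return [c / total for c in counts]

baseline = [100, 200, 300, 400]  # four genes in condition A
shifted  = [200, 400, 600, 400]  # genes 1-3 doubled by a global shift;
                                 # gene 4 is truly unchanged
rel_a = normalize(baseline)
rel_b = normalize(shifted)
fold_change_gene4 = rel_b[3] / rel_a[3]
# gene 4 appears down-regulated (fold change 0.625) although it never changed
```

The spurious fold change arises entirely from the normalization step, not from any real change in gene 4, which is the kind of artifact the study set out to quantify.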

9.
Epidemiological games combine epidemic modelling with game theory to assess strategic choices in response to risks from infectious diseases. In most epidemiological games studied thus far, the strategies of an individual are represented by a single choice parameter. There are many natural situations where strategies cannot be represented by a single dimension, including situations where individuals can change their behavior as they age. To better understand how age-dependent variations in behavior can help individuals deal with infection risks, we study an epidemiological game in an SI model with two life-history stages where social distancing behaviors that reduce exposure rates are age-dependent. When considering a special case of the general model, we show that there is a unique Nash equilibrium when the infection pressure is a monotone function of aggregate exposure rates, but non-monotone effects can appear even in our special case. The non-monotone effects sometimes result in three Nash equilibria, two of which have local invasion potential simultaneously. Returning to a general case, we also describe a game with continuous age-structure using partial differential equations, numerically identify some Nash equilibria, and conjecture about uniqueness.

10.
Generalized causal mediation analysis (cited 1 time: 0 self-citations, 1 by others)
Albert JM  Nelson S 《Biometrics》2011,67(3):1028-1038
The goal of mediation analysis is to assess direct and indirect effects of a treatment or exposure on an outcome. More generally, we may be interested in the context of a causal model as characterized by a directed acyclic graph (DAG), where mediation via a specific path from exposure to outcome may involve an arbitrary number of links (or "stages"). Methods for estimating mediation (or pathway) effects are available for a continuous outcome and a continuous mediator related via a linear model, while for a categorical outcome or categorical mediator, methods are usually limited to two-stage mediation. We present a method applicable to multiple stages of mediation and mixed variable types using generalized linear models. We define pathway effects using a potential outcomes framework and present a general formula that provides the effect of exposure through any specified pathway. Some pathway effects are nonidentifiable and their estimation requires an assumption regarding the correlation between counterfactuals. We provide a sensitivity analysis to assess the impact of this assumption. Confidence intervals for pathway effect estimates are obtained via a bootstrap method. The method is applied to a cohort study of dental caries in very low birth weight adolescents. A simulation study demonstrates low bias of pathway effect estimators and close-to-nominal coverage rates of confidence intervals. We also find low sensitivity to the counterfactual correlation in most scenarios.

11.
Cheung YK 《Biometrics》2005,61(2):524-531
When comparing follow-up measurements from two independent populations, missing records may arise due to censoring by events whose occurrence is associated with baseline covariates. In these situations, inferences based only on the completely followed observations may be biased if the follow-up measurements and the covariates are correlated. This article describes exact inference for a class of modified U-statistics under covariate-dependent dropouts. The method involves weighting each permutation according to the retention probabilities, and thus requires estimation of the missing data mechanism. The proposed procedure is nonparametric in that no distributional assumption is necessary for the outcome variables and the missingness patterns. Monte Carlo approximation by the Gibbs sampler is proposed, and is shown to be fast and accurate via simulation. The method is illustrated in two small data sets for which asymptotic inferential procedures may not be appropriate.

12.
Mehdi Cherif  Michel Loreau 《Oikos》2010,119(6):897-907
Droop's model was originally designed to describe the growth of unicellular phytoplankton species in chemostats but it is now commonly used for a variety of organisms in models of trophic interactions, ecosystem functioning, and evolution. Despite its ubiquitous use, Droop's model is still limited by several simplifying assumptions. For example, the assumption of equal theoretical maximum growth rates for all nutrients is commonly used to describe growth limited by multiple nutrients. This assumption, however, is both biologically unrealistic and potentially misleading. We propose the alternative hypothesis of equal realized maximum growth rates for all nutrients. We support our hypothesis with empirical and theoretical arguments and discuss how it may improve our understanding of the biology of growth, while avoiding some of the pitfalls of the previous assumption.
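Droop's equation makes the distinction between the two hypotheses concrete (standard notation; the paper's symbols may differ). Growth rate depends on the internal nutrient quota \(Q\):

```latex
\mu(Q) = \mu_{\infty}\left(1 - \frac{Q_{\min}}{Q}\right),
```

where \(\mu_{\infty}\) is the theoretical maximum growth rate, approached only as \(Q \to \infty\), and \(Q_{\min}\) is the subsistence quota. The realized maximum growth rate is attained at the largest achievable quota \(Q_{\max}\), i.e. \(\mu_m = \mu_{\infty}\,(1 - Q_{\min}/Q_{\max})\). The common assumption equates \(\mu_{\infty}\) across nutrients; the proposed alternative equates \(\mu_m\).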

13.
The classical group sequential test procedures proposed by Pocock (1977) and O'Brien and Fleming (1979) rest on the assumption of equal sample sizes between the interim analyses. It is well known that in most situations unequal sample sizes between the stages add little Type I error. In some cases, however, problems can arise, resulting in unacceptably liberal behavior of the test procedure. In this article, worst-case scenarios of sample size imbalance between the inspection times are considered. Exact critical values for the Pocock and the O'Brien and Fleming group sequential designs are derived for arbitrary and for varying but bounded sample sizes. The approach represents a reasonable alternative to the flexible method based on the Type I error rate spending function. The SAS syntax for performing the calculations is provided. Using these procedures, the inspection times or the sample sizes in the consecutive stages need to be chosen independently of the data observed so far.
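A minimal sketch of how such critical values can be approximated for the equal-sample-size case (Monte Carlo under H0, not the exact numerical integration used to derive the tabulated constants):

```python
import math
import random

def pocock_constant(stages, alpha=0.05, nsim=200_000, seed=1):
    """Monte Carlo estimate of the Pocock constant c: rejecting when
    |Z_k| > c at any of the equally spaced interim analyses gives
    overall two-sided Type I error alpha under the null."""
    rng = random.Random(seed)
    maxima = []
    for _ in range(nsim):
        s, m = 0.0, 0.0
        for k in range(1, stages + 1):
            s += rng.gauss(0.0, 1.0)           # independent N(0,1) increments
            m = max(m, abs(s) / math.sqrt(k))  # standardized statistic Z_k
        maxima.append(m)
    maxima.sort()
    return maxima[int((1.0 - alpha) * nsim)]

c2 = pocock_constant(2)  # ≈ 2.18 for two stages (tabulated value 2.178)
```

The constant exceeds the fixed-sample 1.96 precisely because the maximum over stages is tested; unequal stage sizes change the joint distribution of the `Z_k`, which is what the exact critical values in the paper account for.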

14.
In a previous paper (Klotz et al., 1979) we described a method for determining evolutionary trees from sequence data when rates of evolution of the sequences might differ greatly. It was shown theoretically that the method always gave the correct topology and root when the exact number of mutation differences between sequences and from their common ancestor was known. However, the method is impractical to use in most situations because it requires some knowledge of the ancestor. In this present paper we describe another method, related to the previous one, in which a present-day sequence can serve temporarily as an ancestor for purposes of determining the evolutionary tree regardless of the rates of evolution of the sequences involved. This new method can be carried out with high precision without the aid of a computer, and it does not increase in difficulty rapidly as the number of sequences involved in the study increases, unlike other methods.

15.
In clinical trials, the comparison of two different populations is a common problem. Nonlinear (parametric) regression models are commonly used to describe the relationship between covariates, such as concentration or dose, and a response variable in the two groups. In some situations, it is reasonable to assume some model parameters to be the same, for instance, the placebo effect or the maximum treatment effect. In this paper, we develop a (parametric) bootstrap test to establish the similarity of two regression curves sharing some common parameters. We show by theoretical arguments and by means of a simulation study that the new test controls its significance level and achieves a reasonable power. Moreover, it is demonstrated that under the assumption of common parameters, a considerably more powerful test can be constructed compared with the test that does not use this assumption. Finally, we illustrate the potential applications of the new methodology by a clinical trial example.

16.
When modeling survival data, it is common to assume that the (log-transformed) survival time (T) is conditionally independent of the (log-transformed) censoring time (C) given a set of covariates. There are numerous situations in which this assumption is not realistic, and a number of correction procedures have been developed for different models. However, in most cases, either some prior knowledge about the association between T and C is required, or some auxiliary information or data are assumed to be available. When this is not the case, the application of many existing methods turns out to be limited. The goal of this paper is to overcome this problem by developing a flexible parametric model that is a type of transformed linear model. We show that the association between T and C is identifiable in this model. The performance of the proposed method is investigated both in an asymptotic way and through finite sample simulations. We also develop a formal goodness-of-fit test approach to assess the quality of the fitted model. Finally, the approach is applied to data coming from a study on liver transplants.

17.
In this work, the numerical dosimetry in human exposure to the electromagnetic fields from antennas of wireless devices, such as those of wireless local area networks (WLAN) access points or phone and computer peripherals with Bluetooth antennas, is analyzed with the objective of assessing guidelines compliance. Several geometrical configurations are considered to simulate possible exposure situations of a person to the fields from WLAN or Bluetooth antennas operating at 2400 MHz. The exposure to radiation from two sources of different frequencies when using a 1800 MHz GSM mobile phone connected via Bluetooth with a hands-free car kit is also considered. The finite-difference time-domain (FDTD) method is used to calculate electric and magnetic field values in the vicinity of the antennas and specific absorption rates (SAR) in a high-resolution model of the human head and torso, to be compared with the limits from the guidelines (reference levels and basic restrictions, respectively). Results show that the exposure levels in worst-case situations studied are lower than those obtained when analyzing the exposure to mobile phones, as could be expected because of the low power of the signals and the distance between the human and the antennas, with both field and SAR values being far below the limits established by the guidelines, even when considering the combined exposure to both a GSM and a Bluetooth antenna.

18.
The role of mycoplasmas in non-gonococcal urethritis: a review (cited 7 times: 0 self-citations, 7 by others)
The criteria that need to be fulfilled before regarding a mycoplasma as a cause of non-gonococcal urethritis (NGU) are outlined. Of the seven mycoplasmas that have been isolated from the human genitourinary tract, most cannot be considered as contenders for causing NGU. Although there is no evidence to support an etiological role for Mycoplasma hominis, it may be unwise to ignore this mycoplasma in view of its known pathogenicity in other situations. The cumulative weight of evidence indicates that strains of Ureaplasma urealyticum (ureaplasmas) cause NGU in some patients. The reason for their occurrence in the urethra of some men without disease needs to be established. Ureaplasmas do not seem to cause post-gonococcal urethritis. The role in NGU of M. genitalium, newly discovered in the male urethra, is unknown, but its biological features, morphological appearance, and ability to cause genital disease in marmosets suggest that it may be pathogenic for man.

19.
The central challenge from the Precautionary Principle to statistical methodology is to help delineate (preferably quantitatively) the possibility that some exposure is hazardous, even in cases where this is not established beyond reasonable doubt. The classical approach to hypothesis testing is unhelpful, because lack of significance can be due either to uninformative data or to genuine lack of effect (the Type II error problem). Its inversion, bioequivalence testing, might sometimes be a model for the Precautionary Principle in its ability to ‘prove the null hypothesis.’ Current procedures for setting safe exposure levels are essentially derived from these classical statistical ideas, and we outline how uncertainties in the exposure and response measurements affect the No Observed Adverse Effect Level (NOAEL), the Benchmark approach and the “Hockey Stick” model. A particular problem concerns model uncertainty: usually these procedures assume that the class of models describing dose/response is known with certainty; this assumption is however often violated, perhaps particularly often when epidemiological data form the source of the risk assessment, and regulatory authorities have occasionally resorted to some average based on competing models. The recent methodology of Bayesian model averaging might be a systematic version of this, but is this an arena for the Precautionary Principle to come into play?

20.
Various alpha1-acid glycoprotein (AGP) glycoforms are present in plasma, differing in the extent of branching and/or fucosylation of their 5 N-linked glycans, as well as in concentration. It is assumed that hepatic synthesis determines the relative occurrence of the AGP-glycoforms in plasma, but experimental evidence is lacking. In this study, we have investigated the contribution of fractional synthesis rates to the plasma concentration of AGP-glycoforms that differed in relative occurrence in healthy human plasma. During a [13C]valine infusion, AGP was isolated from the plasma of healthy volunteers. Four AGP-glycoforms, differing strongly in plasma concentration, were obtained by sequential affinity chromatography over concanavalin-A- and Aleuria aurantia-agarose columns. The incorporation of the [13C]valine tracer into the AGP-glycoforms was measured by gas chromatography combustion isotope ratio mass spectrometry. The mean fractional synthesis rates of the four AGP-glycoforms did not differ significantly from each other, nor between individuals. The results indicated a renewal of about 15%/day of the plasma pools of each of the AGP-glycoforms. This supports the assumption that the differences in plasma concentration of the AGP-glycoforms reflect the state of the hepatic glycosylation process.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司) — 京ICP备09084417号