Similar Literature (20 records found)
1.
The temporal properties of spontaneous and/or evoked discharges of 157 neurons in the dorsal cochlear nucleus of anaesthetized cats were studied. Tone bursts at each unit's best frequency were presented in a free field from the side of the ipsilateral ear. About half of the cells showed pauser or build-up discharge patterns. For all such units, a long-lasting post-spike decrease in excitability was evident from the hazard functions of spontaneous and evoked activity. As a result, the time course of the conditional probability of the first threshold crossing (conditional on the absence of earlier response spikes), or expected probability function (EPF), lay above the usual peristimulus histogram. Units with chopper discharges usually did not show alternating peaks in the EPF. We interpret this as evidence that the chopper discharge pattern results from a strong post-spike decrease in excitability; such a pattern does not demonstrate genuine intrinsic periodicity of the unit. In primary-like units the hazard functions showed only a minor post-spike decrease in excitability, and the EPFs resembled the initial part of the peristimulus histograms. Type II units (presumably inhibitory cells) were characterized by non-monotonic hazard functions and a tendency toward burst response patterns. In some cells we observed a tendency toward genuine intrinsic oscillations in both the EPFs and the hazard functions.
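As a hedged illustration (not the authors' code), the hazard function of a spike train can be estimated from interspike intervals as the conditional probability of firing in a time bin given survival up to that bin. A minimal numpy sketch, with all variable names and parameter values hypothetical:

```python
import numpy as np

def isi_hazard(spike_times, bin_width=1.0, t_max=50.0):
    """Empirical hazard of the interspike-interval distribution.

    hazard[k] ~ P(next spike in bin k | no spike before bin k).
    """
    isis = np.diff(np.sort(spike_times))          # interspike intervals (ms)
    edges = np.arange(0.0, t_max + bin_width, bin_width)
    counts, _ = np.histogram(isis, bins=edges)    # ISI histogram
    # number of intervals still "at risk" (not yet ended) at the start of each bin
    at_risk = len(isis) - np.concatenate(([0], np.cumsum(counts)[:-1]))
    with np.errstate(divide="ignore", invalid="ignore"):
        hazard = np.where(at_risk > 0, counts / at_risk, np.nan)
    return edges[:-1], hazard

# usage with synthetic, roughly Poisson-like spike times
rng = np.random.default_rng(0)
spikes = np.cumsum(rng.exponential(scale=10.0, size=500))  # spike times in ms
t, h = isi_hazard(spikes)
```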

2.
Person-time incidence rates are frequently used in medical research. However, standard estimation theory for this measure of event occurrence is based on the assumption of independent and identically distributed (iid) exponential event times, which implies that the hazard function remains constant over time. Under this assumption and assuming independent censoring, the observed person-time incidence rate is the maximum-likelihood estimator of the constant hazard, and the asymptotic variance of the log rate can be estimated consistently by the inverse of the number of events. However, in many practical applications the assumption of constant hazard is not very plausible. In the present paper, an average rate parameter is defined as the ratio of the expected event count to the expected total time at risk. This rate parameter equals the hazard function under constant hazard. For inference about the average rate parameter, an asymptotically robust variance estimator of the log rate is proposed. Under some very general conditions, the robust variance estimator is consistent for arbitrary iid event times and is also consistent or asymptotically conservative when event times are independent but nonidentically distributed. In contrast, the standard maximum-likelihood estimator may become anticonservative under nonconstant hazard, producing confidence intervals with less-than-nominal asymptotic coverage. These results are derived analytically and illustrated with simulations. The two estimators are also compared in five datasets from oncology studies.
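As an illustrative sketch (not taken from the paper), the observed person-time rate and the classical 1/D variance of the log rate can be computed directly; the "robust" variance shown is a generic sandwich-type form for iid subjects, which may differ from the estimator actually proposed in the article:

```python
import numpy as np

def person_time_rate(events, time_at_risk):
    """events[i], time_at_risk[i]: event count and follow-up time for subject i."""
    events = np.asarray(events, dtype=float)
    time_at_risk = np.asarray(time_at_risk, dtype=float)
    D, T = events.sum(), time_at_risk.sum()
    rate = D / T                                  # MLE of the hazard under constant hazard
    var_log_standard = 1.0 / D                    # classical variance of log(rate): 1/(number of events)
    # generic sandwich-type variance of log(rate) under iid subjects (illustrative form)
    resid = events - rate * time_at_risk
    var_log_robust = np.sum(resid ** 2) / D ** 2
    return rate, var_log_standard, var_log_robust

rate, v_std, v_rob = person_time_rate(events=[0, 1, 0, 2, 1],
                                      time_at_risk=[2.0, 1.5, 3.0, 2.2, 0.8])
```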

3.
A stochastic approximation algorithm is proposed for recursive estimation of the hyperparameters characterizing, in a population, the probability density function of the parameters of a statistical model. For a given population model, defined by a parametric model of a biological process, an error model, and a class of densities on the set of individual parameters, this algorithm provides a sequence of estimates from a sequence of individuals' observation vectors. Convergence conditions are verified for a class of population models including typical pharmacokinetic applications. The method is implemented for estimating population pharmacokinetic parameters from drug multiple-dosing data. Its estimation capabilities are evaluated on simulated data and compared with a classical method in population pharmacokinetics, the first-order method (NONMEM).
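The paper's specific algorithm is not reproduced here; the following is only a generic Robbins-Monro stochastic approximation sketch, updating a single population-mean hyperparameter one individual at a time, with all quantities hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)
true_pop_mean = 2.0          # hypothetical population mean of an individual parameter (e.g., log-clearance)
theta = 0.0                  # current hyperparameter estimate

for k in range(1, 2001):
    y_k = rng.normal(true_pop_mean, 0.5)     # one new individual's contribution
    gain = 1.0 / k                           # step sizes satisfying the Robbins-Monro conditions
    theta = theta + gain * (y_k - theta)     # recursive update toward the population mean

print(round(theta, 3))                       # converges close to 2.0
```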

4.
5.
An estimator of the hazard rate function from discrete failure time data is obtained by semiparametric smoothing of the (nonsmooth) maximum likelihood estimator, achieved by repeated multiplication by a Markov chain transition-type matrix. This matrix is constructed so as to have a given standard discrete parametric hazard rate model, termed the vehicle model, as its stationary hazard rate. As in the discrete density estimation case, the proposed estimator performs better when the vehicle model is a good one and otherwise provides a nonparametric method comparable to the only purely nonparametric smoother discussed in the literature. The proposed semiparametric smoothing approach is then extended to hazard models with covariates and is illustrated by applications to simulated and real data sets.
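The construction of the transition-type matrix in the article is not reproduced here; the sketch below only shows the generic ingredients, namely a raw discrete-hazard maximum likelihood estimate (events divided by numbers at risk) repeatedly multiplied by a row-stochastic smoothing matrix. The matrix used is purely illustrative and is not the paper's vehicle-model construction:

```python
import numpy as np

deaths  = np.array([5, 8, 6, 3, 2, 1])          # events at discrete times 1..6 (hypothetical)
at_risk = np.array([100, 90, 75, 60, 50, 40])
h_mle = deaths / at_risk                        # nonsmooth maximum-likelihood discrete hazard

# purely illustrative row-stochastic smoother (NOT the paper's vehicle-model matrix)
n = len(h_mle)
P = np.zeros((n, n))
for i in range(n):
    lo, hi = max(0, i - 1), min(n, i + 2)       # local averaging window
    P[i, lo:hi] = 1.0 / (hi - lo)

h_smooth = h_mle.copy()
for _ in range(3):                              # repeated multiplication = stronger smoothing
    h_smooth = P @ h_smooth
```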

6.
This work presents a simple and accurate method for estimating the noise autocorrelation function in auditory evoked potential applications. It consists of applying a conventional correlation-function estimator to the contaminated evoked potential signal after processing by a comb filter. The main feature of the proposed technique is that it provides information on large correlation lags without requiring extra time intervals, thereby minimizing the estimation time. A theoretical analysis shows that, under certain but achievable conditions, the correlation function of the processed signal approximates the true noise correlation function. Simulation results and an example with a single-trial evoked potential estimation technique illustrate the expected performance. The proposed method is of special interest for single-trial or small-trial-count evoked potential estimation techniques in anaesthesia monitoring applications.
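A minimal sketch of the general idea, assuming the evoked potential repeats exactly every N samples so that a comb filter of the form y[n] = x[n] - x[n-N] cancels it and leaves differenced noise whose autocorrelation can then be estimated. The specific filter, scaling, and conditions in the article may differ:

```python
import numpy as np

def comb_filtered_autocorr(x, period, max_lag):
    """Estimate the noise autocorrelation after removing an N-periodic evoked component."""
    y = x[period:] - x[:-period]                  # comb filter: cancels the periodic evoked potential
    y = y - y.mean()
    acf = np.array([np.mean(y[:len(y) - k] * y[k:]) for k in range(max_lag + 1)])
    # differencing roughly doubles white-noise variance; crude correction valid for lags << period
    return acf / 2.0

# usage: a repeated "evoked response" buried in white noise (all values hypothetical)
rng = np.random.default_rng(2)
N = 200
ep = np.tile(np.hanning(N), 50)                   # repeated evoked response
x = ep + rng.normal(0.0, 1.0, ep.size)            # contaminated recording
acf = comb_filtered_autocorr(x, period=N, max_lag=20)
```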

7.
Hazard rate models with covariates.
Many problems, particularly in medical research, concern the relationship between certain covariates and the time to occurrence of an event. The hazard or failure rate function provides a conceptually simple representation of time to occurrence data that readily adapts to include such generalizations as competing risks and covariates that vary with time. Two partially parametric models for the hazard function are considered. These are the proportional hazards model of Cox (1972) and the class of log-linear or accelerated failure time models. A synthesis of the literature on estimation from these models under prospective sampling indicates that, although important advances have occurred during the past decade, further effort is warranted on such topics as distribution theory, tests of fit, robustness, and the full utilization of a methodology that permits non-standard features. It is further argued that a good deal of fruitful research could be done on applying the same models under a variety of other sampling schemes. A discussion of estimation from case-control studies illustrates this point.
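The abstract concerns the Cox proportional hazards and accelerated failure time model classes; as a generic usage illustration (not tied to the article), a Cox model with covariates can be fit with the lifelines package. The dataset and column names below are hypothetical:

```python
import pandas as pd
from lifelines import CoxPHFitter

# hypothetical prospective-sampling data: follow-up time, event indicator, covariates
df = pd.DataFrame({
    "time":  [5.2, 3.1, 8.7, 2.4, 6.0, 7.3, 4.1, 9.5],
    "event": [1, 1, 0, 1, 0, 1, 1, 0],
    "age":   [61, 54, 47, 70, 58, 65, 50, 44],
    "trt":   [0, 1, 1, 0, 1, 0, 0, 1],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")   # h(t|x) = h0(t) * exp(b'x)
cph.print_summary()                                    # hazard ratios and confidence intervals
```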

8.
The analysis of failure times in the presence of competing risks.
Distinct problems in the analysis of failure times with competing causes of failure include the estimation of treatment or exposure effects on specific failure types, the study of interrelations among failure types, and the estimation of failure rates for some causes given the removal of certain other failure types. The usual formulation of these problems is in terms of conceptual or latent failure times for each failure type. This approach is criticized on the basis of unwarranted assumptions, lack of physical interpretation, and identifiability problems. An alternative approach utilizing cause-specific hazard functions for observable quantities, including time-dependent covariates, is proposed. Cause-specific hazard functions are shown to be the basic estimable quantities in the competing risks framework. A method, involving the estimation of parameters that relate time-dependent risk indicators for some causes to cause-specific hazard functions for other causes, is proposed for the study of interrelations among failure types. Further, it is argued that the problem of estimating failure rates under the removal of certain causes is not well posed until a mechanism for cause removal is specified. Following such a specification, one will sometimes be in a position to make sensible extrapolations from available data to situations involving cause removal. A clinical program in bone marrow transplantation for leukemia provides a setting for discussion and illustration of each of these ideas. Failure due to censoring in a survivorship study leads to further discussion.
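As a hedged illustration of the cause-specific hazard viewpoint (not the article's analysis), the cumulative cause-specific hazard for one failure type can be estimated by treating failures from other causes as censored. A sketch using lifelines, with data and cause codes hypothetical:

```python
import pandas as pd
from lifelines import NelsonAalenFitter

# hypothetical competing-risks data: cause 0 = censored, 1 = relapse, 2 = treatment-related death
df = pd.DataFrame({"time":  [3.0, 5.5, 2.1, 7.2, 4.4, 6.1, 1.8, 8.0],
                   "cause": [1,   2,   1,   0,   2,   1,   0,   2]})

cum_hazards = {}
for cause in (1, 2):
    fitter = NelsonAalenFitter()
    # failures from the other cause (and true censoring) are treated as censored observations
    fitter.fit(df["time"], event_observed=(df["cause"] == cause))
    cum_hazards[cause] = fitter.cumulative_hazard_   # estimated cumulative cause-specific hazard
```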

9.
Intensive care unit (ICU) patients are highly susceptible to hospital-acquired infections due to their poor health and many invasive therapeutic treatments. The effect on mortality of acquiring such infections is, however, poorly understood. Our goal is to quantify this using data from the National Surveillance Study of Nosocomial Infections in ICUs (Belgium). This is challenging because of the presence of time-dependent confounders, such as mechanical ventilation, which lie on the causal path from infection to mortality. Standard statistical analyses may be severely misleading in such settings and have shown contradictory results. Inverse probability weighting for marginal structural models may instead be used but is not directly applicable because these models parameterize the effect of acquiring infection on a given day in ICU, versus "never" acquiring infection in ICU, and this is ill-defined when ICU discharge precedes that day. Additional complications arise from the informative censoring of the survival time by hospital discharge and the instability of the inverse weighting estimation procedure. We accommodate this by introducing a new class of marginal structural models for so-called partial exposure regimes. These describe the effect on the hazard of death of acquiring infection on a given day s, versus not acquiring infection "up to that day," had patients stayed in the ICU for at least s days.
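As a generic sketch of inverse probability weighting for a point exposure, not the partial exposure regime machinery developed in the article: stabilized weights are the marginal probability of the exposure actually received divided by its probability conditional on confounders, here estimated with a logistic regression. All variable names and values are hypothetical:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

# hypothetical data: exposure A (infection yes/no) and baseline confounders L1, L2
rng = np.random.default_rng(3)
n = 500
L = pd.DataFrame({"L1": rng.normal(size=n), "L2": rng.binomial(1, 0.4, n)})
A = rng.binomial(1, 1 / (1 + np.exp(-(0.5 * L["L1"] + 0.8 * L["L2"] - 0.3))))

ps_model = LogisticRegression().fit(L, A)
p_A1 = ps_model.predict_proba(L)[:, 1]            # P(A = 1 | L)
p_obs = np.where(A == 1, p_A1, 1 - p_A1)          # probability of the exposure actually received
p_marg = np.where(A == 1, A.mean(), 1 - A.mean()) # marginal probability (numerator of stabilized weight)
sw = p_marg / p_obs                               # stabilized inverse probability weights
```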

10.
Du P, Jiang Y, Wang Y. Biometrics 2011, 67(4): 1330-1339.
Gap time hazard estimation is of particular interest in recurrent event data. This article proposes a fully nonparametric approach for estimating the gap time hazard. Smoothing spline analysis of variance (ANOVA) decompositions are used to model the log gap time hazard as a joint function of gap time and covariates, and a general frailty is introduced to account for between-subject heterogeneity and within-subject correlation. We estimate the nonparametric gap time hazard function and the parameters of the frailty distribution using a combination of the Newton-Raphson procedure, the stochastic approximation algorithm (SAA), and the Markov chain Monte Carlo (MCMC) method. Convergence of the algorithm is guaranteed by decreasing the step size of the parameter updates and/or increasing the MCMC sample size along iterations. A model selection procedure is also developed to identify negligible components in the functional ANOVA decomposition of the log gap time hazard. We evaluate the proposed methods with simulation studies and illustrate their use through the analysis of bladder tumor data.
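A minimal sketch, assuming hypothetical recurrent event data, of how gap times are formed from successive event times per subject; the smoothing-spline ANOVA and frailty estimation of the article are not reproduced:

```python
import pandas as pd

# hypothetical recurrent-event data: one row per observed event (or final censoring time)
events = pd.DataFrame({
    "id":    [1, 1, 1, 2, 2, 3],
    "time":  [2.0, 5.5, 9.0, 3.2, 7.1, 4.4],   # calendar times of successive events per subject
    "event": [1, 1, 0, 1, 0, 0],               # 0 marks the final censoring record
})

events = events.sort_values(["id", "time"])
# gap since the previous event (or since entry for the first record of each subject)
events["gap"] = events.groupby("id")["time"].diff().fillna(events["time"])
# each row now carries a gap time, the raw material for gap time hazard estimation
```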

11.
Huang Y. Biometrics 1999, 55(4): 1108-1113.
Induced dependent censorship is a general phenomenon in health service evaluation studies in which a measure such as quality-adjusted survival time or lifetime medical cost is of interest. We investigate the two-sample problem and propose two classes of nonparametric tests. Based on consistent estimation of the survival function for each sample, the two classes of test statistics examine the cumulative weighted difference in hazard functions and in survival functions. We derive a unified asymptotic null distribution theory and inference procedure. The tests are applied to Trial V of the International Breast Cancer Study Group and show that long-duration chemotherapy significantly improves time without symptoms of disease and toxicity of treatment compared with the short-duration treatment. Simulation studies demonstrate that the proposed tests, with a wide range of weight choices, perform well under moderate sample sizes.
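As a hedged sketch of the second class of statistics described (a cumulative weighted difference in survival functions), using Kaplan-Meier estimates and a unit weight; the actual estimators, weight functions, and variance theory in the article, which must handle induced dependent censoring, are not reproduced:

```python
import numpy as np
from lifelines import KaplanMeierFitter

def weighted_survival_difference(t1, e1, t2, e2, grid):
    """Integrated (unit-weighted) difference between two Kaplan-Meier curves over `grid`."""
    km1, km2 = KaplanMeierFitter(), KaplanMeierFitter()
    km1.fit(t1, event_observed=e1)
    km2.fit(t2, event_observed=e2)
    s1 = km1.survival_function_at_times(grid).to_numpy()
    s2 = km2.survival_function_at_times(grid).to_numpy()
    # simple left Riemann sum of the survival-curve difference (weight = 1)
    return np.sum((s1 - s2)[:-1] * np.diff(grid))

# hypothetical two-sample data (e.g., long- vs. short-duration treatment)
rng = np.random.default_rng(4)
t1, t2 = rng.exponential(12, 80), rng.exponential(8, 80)
e1, e2 = rng.binomial(1, 0.7, 80), rng.binomial(1, 0.7, 80)
stat = weighted_survival_difference(t1, e1, t2, e2, grid=np.linspace(0, 20, 201))
```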

12.
Right-truncated data arise when observations are ascertained retrospectively, and only subjects who experience the event of interest by the time of sampling are selected. Such a selection scheme, without adjustment, leads to biased estimation of covariate effects in the Cox proportional hazards model. The existing methods for fitting the Cox model to right-truncated data, which are based on maximizing the likelihood or solving estimating equations with respect to both the baseline hazard function and the covariate effects, are numerically challenging. We consider two alternative simple methods based on inverse probability weighting (IPW) estimating equations, which allow consistent estimation of covariate effects under a positivity assumption and avoid estimation of baseline hazards. We discuss problems of identifiability and consistency that arise when positivity does not hold and show that, although the partial tests for null effects based on these IPW methods can be used in some settings even in the absence of positivity, they are not valid in general. We propose adjusted estimating equations that incorporate the probability of observation when it is known from external sources, which results in consistent estimation. We compare the methods in simulations and apply them to analyses of human immunodeficiency virus latency.

13.
Genomic imprinting, a genetic phenomenon of non-equivalent allele expression depending on parental origin, has been observed ubiquitously in nature. It not only controls growth and developmental traits but may also be responsible for survival traits. Based on the accelerated failure time model, we construct a general parametric model for mapping imprinted QTLs (iQTLs). Within the framework of interval mapping, maximum likelihood estimation of iQTL parameters is implemented via the EM algorithm. The imprinting patterns of the detected iQTLs are statistically tested according to a series of null hypotheses. The BIC model selection criterion is employed to choose an optimal baseline hazard function that balances maximized likelihood against the number of parameters. Simulations are used to validate the proposed mapping procedure, and a published dataset from a mouse model system is used to illustrate the framework. Results show that, among the five commonly used survival distributions, the log-logistic distribution is the optimal baseline hazard function for mapping QTLs of hyperoxic acute lung injury (HALI) survival; under the log-logistic distribution, four QTLs were identified, of which only one was inherited in Mendelian fashion, whereas the others were imprinted in different imprinting patterns.
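As a hedged illustration of choosing a baseline survival distribution by BIC (outside the QTL-mapping machinery of the article), several parametric accelerated failure time models can be compared with lifelines. The data and column names are hypothetical:

```python
import numpy as np
import pandas as pd
from lifelines import WeibullAFTFitter, LogNormalAFTFitter, LogLogisticAFTFitter

rng = np.random.default_rng(5)
n = 200
df = pd.DataFrame({"marker": rng.binomial(1, 0.5, n)})            # hypothetical genotype indicator
df["time"] = rng.weibull(1.5, n) * np.exp(0.4 * df["marker"]) * 10
df["event"] = rng.binomial(1, 0.8, n)

bic = {}
for name, Fitter in [("weibull", WeibullAFTFitter),
                     ("lognormal", LogNormalAFTFitter),
                     ("loglogistic", LogLogisticAFTFitter)]:
    f = Fitter().fit(df, duration_col="time", event_col="event")
    k = len(f.params_)                                   # number of estimated parameters
    bic[name] = -2 * f.log_likelihood_ + k * np.log(n)   # BIC: smaller is better

best = min(bic, key=bic.get)                             # distribution chosen by BIC
```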

14.
Wang L, Du P, Liang H. Biometrics 2012, 68(3): 726-735.
In some survival analyses in medical studies, there are long-term survivors who can be considered permanently cured. The goals in these studies are to estimate the noncured probability of the whole population and the hazard rate of the susceptible subpopulation. When covariates are present, as often happens in practice, understanding covariate effects on the noncured probability and the hazard rate is of equal importance. Existing methods are limited to parametric and semiparametric models. We propose a two-component mixture cure rate model with nonparametric forms for both the cure probability and the hazard rate function. Identifiability of the model is guaranteed by an additive assumption that allows no time-covariate interactions in the logarithm of the hazard rate. Estimation is carried out by an expectation-maximization algorithm that maximizes a penalized likelihood. For inference, we apply the Louis formula to obtain pointwise confidence intervals for the noncured probability and the hazard rate. Asymptotic convergence rates of our function estimates are established. We evaluate the proposed method by extensive simulations and analyze survival data from a melanoma study, finding interesting patterns.
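A minimal sketch of a two-component mixture cure likelihood under fully parametric simplifications (logistic noncured probability, exponential susceptible hazard), maximized directly rather than by the penalized EM algorithm of the article; everything here is an assumption made for illustration only:

```python
import numpy as np
from scipy.optimize import minimize

def neg_loglik(par, t, d, x):
    """Mixture cure model: pi(x) = noncured probability, exponential hazard lam for susceptibles."""
    a, b, log_lam = par
    lam = np.exp(log_lam)
    pi = 1 / (1 + np.exp(-(a + b * x)))              # P(susceptible | x), logistic link
    f = lam * np.exp(-lam * t)                       # susceptible density
    S = np.exp(-lam * t)                             # susceptible survival
    lik = np.where(d == 1, pi * f, 1 - pi + pi * S)  # cured subjects can only be censored
    return -np.sum(np.log(lik))

# hypothetical data with a cured fraction
rng = np.random.default_rng(6)
n = 300
x = rng.binomial(1, 0.5, n)
cured = rng.binomial(1, 0.3, n)
t_event, t_cens = rng.exponential(5, n), rng.uniform(0, 15, n)
t = np.where(cured == 1, t_cens, np.minimum(t_event, t_cens))
d = np.where(cured == 1, 0, (t_event <= t_cens).astype(int))

res = minimize(neg_loglik, x0=[0.0, 0.0, 0.0], args=(t, d, x), method="Nelder-Mead")
```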

15.
A new approach to the study of the stability of delay systems is developed. The method is applicable to biological control systems and other systems for which little information about the time delays is available. The proposed view is that stability information can be deduced from the statistical properties of the probability distribution that encodes the structure of the time delay. The main statistical quantities used are the usual expectation parameter E and a modified variance, called the relative variance and denoted R, that is invariant under time-scale changes. In many cases, the stability of a model improves as R increases while E remains fixed. The statistical approach is shown to be closely related to a geometric method of Walther and Cushing that establishes stability in the case of a convex delay distribution function. In fact, it is shown that convex and concave distributions have R values respectively greater than and less than 1/2. A generalized version of the geometric theory is presented that relaxes the smoothness hypothesis on the density function; this brings it more into correspondence with statistical theory, which applies to general distributions irrespective of their smoothness.

16.
This article considers the asymptotic estimation theory for the log relative potency in a symmetric parallel bioassay when uncertain prior information suggests that the true log relative potency equals a known value. Three classes of point estimators are proposed: the unrestricted estimator, the shrinkage restricted estimator, and the shrinkage preliminary test estimator. Their asymptotic mean squared errors are derived and compared, and the relative dominance picture of the estimators is presented. Interestingly, the proposed shrinkage preliminary test estimator dominates the unrestricted estimator over a wider range than the usual preliminary test estimator does. Most importantly, the size of the preliminary test is more appropriate than that of the usual preliminary test estimator.
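A schematic sketch of the generic estimators referred to (unrestricted, shrinkage restricted, and preliminary-test type), not the asymptotic theory of the article; all numbers, including the shrinkage factor, are hypothetical:

```python
import numpy as np
from scipy.stats import norm

theta_hat, se = 0.42, 0.10        # hypothetical unrestricted estimate of the log relative potency
theta_0 = 0.50                    # prior "known" value of the log relative potency
alpha, c = 0.05, 0.3              # test size and shrinkage factor (illustrative)

z = (theta_hat - theta_0) / se
reject = abs(z) > norm.ppf(1 - alpha / 2)          # preliminary test of theta = theta_0

theta_unrestricted = theta_hat
theta_shrink = theta_hat - c * (theta_hat - theta_0)        # shrink toward the prior value
theta_pt = theta_hat if reject else theta_0                 # classical preliminary test estimator
theta_shrink_pt = theta_hat if reject else theta_shrink     # shrinkage preliminary test estimator
```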

17.
An algorithm is presented for parameter estimation in a neural system model. Because the model's firing mechanism is analogous to a first threshold-crossing problem, this problem is solved numerically for our model following the results of Kostyukov et al. (1981). The proposed algorithm estimates the model parameters by exploiting the equivalence between the probability density function of the first crossing time and that of the interspike interval, the latter being derived from the interspike interval histogram using a spline function technique. The ability of the algorithm is confirmed by application to simulated interspike interval data. Parameter estimation is also carried out for real neural data recorded from cat optic tract fibers in both the spontaneous and stimulated cases. These applications show the effectiveness of the algorithm in practical settings.
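As a hedged sketch of one ingredient described, deriving a smooth interspike-interval density from the ISI histogram with a spline; the threshold-crossing model fit of Kostyukov et al. is not reproduced, and all values are hypothetical:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(7)
isis = rng.gamma(shape=2.0, scale=5.0, size=2000)       # hypothetical interspike intervals (ms)

counts, edges = np.histogram(isis, bins=40, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
spline = UnivariateSpline(centers, counts, s=0.002)      # smoothing spline through the ISI histogram

grid = np.linspace(centers[0], centers[-1], 400)
isi_density = np.clip(spline(grid), 0, None)             # smoothed, non-negative ISI density estimate
```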

18.
This article investigates an augmented inverse selection probability weighted estimator for Cox regression parameter estimation when covariates are incomplete. The estimator extends the weighted estimator of Horvitz and Thompson (1952, Journal of the American Statistical Association 47, 663-685) and is doubly robust: it is consistent as long as either the selection probability model or the joint distribution of the covariates is correctly specified. The augmentation term of the estimating equation depends on the baseline cumulative hazard and on a conditional distribution that can be implemented using an EM-type algorithm. The method is compared with previously proposed estimators via simulation studies and is applied to a real example.
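A minimal sketch of only the inverse-selection-probability-weighted (Horvitz-Thompson) part, fitting a weighted Cox model on complete cases; the augmentation term, which provides the double robustness, is not reproduced here. The data, missingness model, and column names are all hypothetical:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

rng = np.random.default_rng(8)
n = 400
df = pd.DataFrame({"z": rng.normal(size=n)})                       # always-observed covariate
df["x"] = 0.5 * df["z"] + rng.normal(size=n)                       # covariate subject to missingness
df["time"] = rng.exponential(np.exp(-0.5 * df["x"]))
df["event"] = rng.binomial(1, 0.8, n)
df["observed"] = rng.binomial(1, 1 / (1 + np.exp(-(0.2 + 0.8 * df["z"]))))

# selection (missingness) model and inverse selection probability weights
sel = LogisticRegression().fit(df[["z"]], df["observed"])
df["w"] = 1.0 / sel.predict_proba(df[["z"]])[:, 1]

cc = df[df["observed"] == 1]                                       # complete cases only
cph = CoxPHFitter()
cph.fit(cc[["time", "event", "x", "z", "w"]],
        duration_col="time", event_col="event", weights_col="w", robust=True)
```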

19.
Transgenic technology is developing rapidly; however, consumers and environmentalists remain wary of its safety for use in agriculture. Research is needed to ensure the safe use of transgenic technology and thus increase consumer confidence. This goal is best accomplished through a thorough, unbiased examination of the risks associated with agricultural biotechnology. In this paper, we review the discussion of risk and extend our approach to predicting risk. We also distinguish between the risk and the hazard of transgenic organisms in natural environments. We define transgene risk as the probability that a transgene will spread into natural conspecific populations, and hazard as the probability of species extinction, displacement, or ecosystem disruption given that the transgene has spread. Our methods primarily address risk relative to two types of hazard: extinction, which is a high hazard, and invasion, whose level of hazard is unknown but similar to that of an introduced exotic species. Our method of risk assessment is unique in that we concentrate on the six major fitness components of an organism's life cycle to determine whether transgenic individuals differ from the wild type in survival or reproductive capacity. Our approach then combines estimates of the net fitness parameters in a mathematical model to determine the fate of the transgene and the affected wild population. We also review aspects of fish ecology and behavior that contribute to risk and examine combinations of net fitness parameters that can lead to invasion and extinction hazards. We describe three new ways in which a transgene could result in an extinction hazard: (1) when the transgene increases male mating success but reduces daily adult viability, (2) when the transgene increases adult viability but reduces male fertility, and (3) when the transgene increases both male mating success and adult viability but reduces male fertility. The last scenario is predicted to cause rapid extinction and thus poses an extreme risk. Although we limit our discussion to aquacultural applications, our methods can easily be adapted to other sexually reproducing organisms with suitable adjustments of terminology.

20.
We consider the estimation of a nonparametric smooth function of some event time in a semiparametric mixed effects model from repeatedly measured data when the event time is subject to right censoring. The within-subject correlation is captured by both cross-sectional and time-dependent random effects, where the latter are modeled by a nonhomogeneous Ornstein-Uhlenbeck stochastic process. When the censoring probability depends on other variables in the model, as often happens in practice, the event time data are not missing completely at random. Hence, a complete-case analysis that eliminates all censored observations may yield biased estimates of the regression parameters, including the smooth function of the event time, and is less efficient. As a remedy, we derive the likelihood function for the observed data by modeling the event time distribution given the other covariates. We propose a two-stage pseudo-likelihood approach for estimating the model parameters by first plugging an estimator of the conditional event time distribution into the likelihood and then maximizing the resulting pseudo-likelihood function. Empirical evaluation shows that the proposed method yields negligible biases while significantly reducing the estimation variability. This research is motivated by the estimation of hormone profiles around the age at the final menstrual period for the cohort of women in the Michigan Bone Health and Metabolism Study.
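As a hedged sketch of one component mentioned, the covariance of a stationary Ornstein-Uhlenbeck random effect evaluated at a subject's observation times (the article uses a nonhomogeneous version, which is not reproduced); names and parameter values are hypothetical:

```python
import numpy as np

def ou_covariance(times, sigma2=1.0, rho=0.5):
    """Stationary OU covariance: cov(U(s), U(t)) = sigma2 * exp(-rho * |s - t|)."""
    times = np.asarray(times, dtype=float)
    return sigma2 * np.exp(-rho * np.abs(times[:, None] - times[None, :]))

# covariance matrix of the time-dependent random effect at one subject's visit times
visit_times = [0.0, 0.5, 1.2, 2.0, 3.5]
Sigma = ou_covariance(visit_times, sigma2=0.8, rho=0.4)
```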

