Similar Articles
20 similar articles found (search time: 15 ms)
1.
The article provides a perspective on the challenges facing biostatistics, on the contributions that biostatisticians are making and can make to medical product development and regulation, and on what the future might hold in these areas. The current environment in the United States for pharmaceutical development and regulation is discussed, along with the public's expectations for how medical products should contribute to public health. The globalization of research and the use of study designs that incorporate multi-regional populations present new challenges for design and inference. The emerging interest in and development of the science of safety assessment and quantitative approaches to risk evaluation are considered. Guidance development, especially in the area of clinical trial design, continues to be one of the needs that FDA is asked to meet; guidance development is proceeding for non-inferiority study designs, adaptive designs, multiple endpoints in clinical trials, and missing outcome data in clinical trials. Biostatisticians will be asked and challenged to take on leadership roles in new areas such as personalized medicine, biomarkers and genomics, development of new tools for visual display of clinical data, and quality assurance and monitoring in clinical trials.

2.
The first efficacy trials--named STEP--of a T cell vaccine against HIV/AIDS began in 2004. The unprecedented structure of these trials raised new modeling and statistical challenges. Is it plausible that memory T cells, as opposed to antibodies, can actually prevent infection? If they fail at prevention, to what extent can they ameliorate disease? And how do we estimate efficacy in a vaccine trial with two primary endpoints, one traditional, one entirely novel (viral load after infection), and where the latter may be influenced by selection bias due to the former? In preparation for the STEP trials, biostatisticians developed novel techniques for estimating a causal effect of a vaccine on viral load while accounting for post-randomization selection bias. But these techniques had not been tested in biologically plausible scenarios. We introduce new stochastic models of T cell and HIV kinetics, making use of new estimates of the rate at which cytotoxic T lymphocytes (CTLs, the so-called killer T cells) can kill HIV-infected cells. Based on these models, we make the surprising discovery that it is not entirely implausible that HIV-specific CTLs might prevent infection--as the designers explicitly acknowledged when they chose the endpoints of the STEP trials. By simulating thousands of trials, we demonstrate that the new statistical methods can correctly identify an efficacious vaccine while protecting against a false conclusion that the vaccine exacerbates disease. In addition to uncovering a surprising immunological scenario, our results illustrate the utility of mechanistic modeling in biostatistics.
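The paper's mechanistic models are not reproduced here, but the qualitative point--that faster CTL killing can make early infection die out stochastically--can be illustrated with a toy birth-death simulation. All rates, the take-off threshold, and trial counts below are illustrative assumptions, not values from the study:

```python
import random

def infection_takes_hold(kill_rate, rng, burst_rate=1.0, threshold=100):
    """One stochastic trial of early infection: a birth-death chain on the
    number of infected cells (birth = a new cell infected, death = a cell
    killed by CTLs). Purely illustrative, not the paper's model."""
    n = 1
    while 0 < n < threshold:
        p_death = kill_rate / (kill_rate + burst_rate)
        n += -1 if rng.random() < p_death else 1
    return n >= threshold

def breakthrough_probability(kill_rate, trials=2000, seed=7):
    rng = random.Random(seed)
    hits = sum(infection_takes_hold(kill_rate, rng) for _ in range(trials))
    return hits / trials

# stronger CTL killing should lower the chance the infection takes hold
print(breakthrough_probability(0.5), breakthrough_probability(2.0))
# weak killing -> roughly a coin flip; strong killing -> near zero
```

When the killing rate exceeds the infection rate the chain is subcritical and almost every trial goes extinct, which is the sense in which CTL-mediated prevention is "not entirely implausible."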

3.
Good-quality medical research generally requires not only expertise in the chosen medical field of interest but also a sound knowledge of statistical methodology. The number of medical research articles published in Indian medical journals has increased quite substantially in the past decade. The aim of this study was to collate evidence on study design quality and the statistical analyses used in selected leading Indian medical journals. Ten leading Indian medical journals were selected based on impact factors, and all original research articles published in 2003 (N = 588) and 2013 (N = 774) were categorized and reviewed. A validated checklist on study design, statistical analyses, results presentation, and interpretation was used to review and evaluate the articles. The main outcomes considered were study design types and their frequencies, the proportion of errors/defects in study design and statistical analyses, and implementation of the CONSORT checklist in RCTs (randomized clinical trials). From 2003 to 2013, the proportion of erroneous statistical analyses did not decrease (χ²=0.592, Φ=0.027, p=0.4418): 25% (80/320) in 2003 compared with 22.6% (111/490) in 2013. Compared with 2003, significant improvement was seen in 2013: the proportion of papers using statistical tests increased significantly (χ²=26.96, Φ=0.16, p<0.0001), from 42.5% (250/588) to 56.7% (439/774), and the overall proportion of errors in study design decreased significantly (χ²=16.783, Φ=0.12, p<0.0001), from 41.3% (243/588) to 30.6% (237/774). In 2013, the proportion of randomized clinical trial designs remained very low (7.3%, 43/588), with the majority showing some errors (41 papers, 95.3%). The majority of the published studies were retrospective in nature, both in 2003 (79.1%, 465/588) and in 2013 (78.2%, 605/774).
Major decreases in error proportions were observed in both results presentation (χ²=24.477, Φ=0.17, p<0.0001), from 82.2% (263/320) to 66.3% (325/490), and interpretation (χ²=25.616, Φ=0.173, p<0.0001), from 32.5% (104/320) to 17.1% (84/490), though some serious errors were still present. Indian medical research seems to have made no major progress in the use of correct statistical analyses, but errors/defects in study designs have decreased significantly. Randomized clinical trials are published quite rarely and have a high proportion of methodological problems.
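The reported test statistics can be checked directly from the counts given. A minimal sketch, assuming an unadjusted Pearson chi-square on a 2×2 contingency table (no continuity correction), reproduces the χ²=26.96 figure for the increase in statistical-test use:

```python
def chi2_stat(table):
    """Pearson chi-square statistic for a contingency table
    (list of rows), without Yates continuity correction."""
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    n = sum(rows)
    return sum((table[i][j] - rows[i] * cols[j] / n) ** 2
               / (rows[i] * cols[j] / n)
               for i in range(len(table)) for j in range(len(table[0])))

# 2003: 250 of 588 papers used statistical tests; 2013: 439 of 774
stat = chi2_stat([[250, 588 - 250], [439, 774 - 439]])
print(round(stat, 2))  # 26.96, matching the reported value
```

The same function applied to the study-design error counts (243/588 vs. 237/774) gives 16.78, matching the reported χ²=16.783, which suggests the authors indeed used uncorrected Pearson tests.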

4.
In recent years, the number of studies using a cluster-randomized design has grown dramatically. In addition, the cluster-randomized crossover design has been touted as a methodological advance that can increase the efficiency of cluster-randomized studies in certain situations. While the cluster-randomized crossover trial has become a popular tool, standards of design, analysis, reporting and implementation have not been established for this emergent design. We address one particular aspect of cluster-randomized and cluster-randomized crossover trial design: estimating statistical power. We present a general framework for estimating power via simulation in cluster-randomized studies with or without one or more crossover periods. We have implemented this framework in the clusterPower software package for R, freely available online from the Comprehensive R Archive Network. Our simulation framework is easy to implement, and users may customize the methods used for data analysis. We give four examples of using the software in practice. The clusterPower package could play an important role in the design of future cluster-randomized and cluster-randomized crossover studies. This work is the first to establish a universal method for calculating power for both cluster-randomized and cluster-randomized crossover trials. More research is needed to develop standardized and recommended methodology for cluster-randomized crossover studies.
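clusterPower itself is an R package; as a rough illustration of the power-by-simulation approach it implements, here is a minimal Python sketch for a parallel two-arm cluster-randomized trial. The z-test on cluster means is a simplification of the package's actual analysis options, and all parameter values are illustrative:

```python
import random
import statistics

def simulated_power(n_clusters, cluster_size, effect,
                    sd_between, sd_within,
                    n_sims=500, z_crit=1.96, seed=1):
    """Estimate power of a two-arm cluster-randomized trial by simulation.
    Each arm gets n_clusters clusters; the analysis is a z-test on cluster
    means -- a stand-in for the mixed-model analyses a real package offers."""
    rng = random.Random(seed)
    rejections = 0
    for _ in range(n_sims):
        def arm_cluster_means(mu):
            means = []
            for _ in range(n_clusters):
                b = rng.gauss(0.0, sd_between)          # cluster random effect
                obs = [mu + b + rng.gauss(0.0, sd_within)
                       for _ in range(cluster_size)]
                means.append(statistics.fmean(obs))
            return means
        ctrl = arm_cluster_means(0.0)
        trt = arm_cluster_means(effect)
        diff = statistics.fmean(trt) - statistics.fmean(ctrl)
        se = ((statistics.variance(ctrl) + statistics.variance(trt))
              / n_clusters) ** 0.5
        if abs(diff / se) > z_crit:
            rejections += 1
    return rejections / n_sims

# e.g. 10 clusters of 20 per arm, moderate between-cluster variation
print(simulated_power(10, 20, effect=0.5, sd_between=0.3, sd_within=1.0))
```

The same loop structure extends naturally to crossover periods: simulate each cluster under both conditions and analyze within-cluster differences.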

5.
E-learning has been widely utilized in medical education and suggested by some proponents to represent a fundamental advance in educational methodology. We challenge this conclusion by examining e-learning in the context of broader learning theories, specifically as they relate to instructional design and methods. Core tenets of educational design are applied to e-learning in a unified model for instructional design, and examples of e-learning technologies are examined in the context of medical education, with reflections on research questions generated by these new modalities. Throughout, we argue that e-learning is a tool that, when designed appropriately, can be used to meet worthy educational goals.

6.
7.
Econometricians Daniel McFadden and James Heckman won the 2000 Nobel Prize in economics for their work on discrete choice models and selection bias. Statisticians and epidemiologists have made similar contributions to medicine with their work on case-control studies, analysis of incomplete data, and causal inference. In spite of repeated nominations of such eminent figures as Bradford Hill and Richard Doll, however, the Nobel Prize in physiology and medicine has never been awarded for work in biostatistics or epidemiology. (The "exception who proves the rule" is Ronald Ross, who, in 1902, won the second medical Nobel for his discovery that the mosquito was the vector for malaria. Ross then went on to develop the mathematics of epidemic theory--which he considered his most important scientific contribution--and applied his insights to malaria control programs.) The low esteem accorded epidemiology and biostatistics in some medical circles, and increasingly among the public, correlates highly with the contradictory results from observational studies that are displayed so prominently in the lay press. In spite of its demonstrated efficacy in saving lives, the "black box" approach of risk factor epidemiology is not well respected. To correct these unfortunate perceptions, statisticians would do well to follow more closely their own teachings: conduct larger and fewer studies designed to test specific hypotheses, follow strict protocols for study design and analysis, better integrate statistical findings with those from the laboratory, and exercise greater caution in promoting apparently positive results.

8.
The evolution of advanced manufacturing technologies and the new manufacturing paradigm has enriched the computer integrated manufacturing (CIM) methodology. These advances have placed more demands on CIM integration technology and its associated supporting tools. One such demand is to provide CIM systems with better software architecture, more flexible integration mechanisms, and powerful support platforms. In this paper, we present an integrating infrastructure for CIM implementation in manufacturing enterprises, forming an integrated automation system. A research prototype of this integrating infrastructure has been developed for the development, integration, and operation of integrated CIM systems. It is based on a client/server structure and employs object-oriented and agent technologies. System openness, scalability, and maintainability are ensured by conforming to international standards and by using effective system design software and management tools.

9.
Auxology has developed from merely describing child and adolescent growth into a vivid, interdisciplinary research area encompassing human biologists, physicians, social scientists, economists and biostatisticians. The meeting illustrated this diversity, with the various social, medical, biological and biostatistical aspects of studies on child growth and development.

10.
The paper examines major criticisms of AD/HD (Attention Deficit/Hyperactivity Disorder) neurofeedback research using T. R. Rossiter and T. J. La Vaque (1995) as an exemplar and discusses relevant aspects of research methodology. J. Lohr, S. Meunier, L. Parker, and J. P. Kline (2001), D. A. Waschbusch and G. P. Hill (2001), and J. P. Kline, C. N. Brann, and B. R. Loney (2002) criticized Rossiter and La Vaque for (1) using an active treatment control; (2) nonrandom assignment of patients; (3) provision of collateral treatments; (4) using nonstandardized and invalid assessment instruments; (5) providing artifact-contaminated EEG feedback; and (6) conducting multiple non-alpha-protected t tests. The criticisms, except those related to statistical analysis, are invalid or are not supported as presented by the authors: they are based on the critics' unsubstantiated opinions, require redefining Rossiter and La Vaque as an efficacy rather than an effectiveness study, or reflect a lack of familiarity with the research literature. However, there are broader issues to be considered: specifically, what research methodology is appropriate for studies evaluating the effectiveness of neurofeedback, and who should make that determination? The uncritical acceptance and implementation of models developed for psychotherapy, pharmacology, or medical research is premature and ill-advised. Neurofeedback researchers should develop models that are appropriate to the technology, treatment paradigms, and goals of neurofeedback outcome studies. They need to explain the rationale for their research methodology and defend their choices.

11.
12.
Sir Ronald Aylmer Fisher was the most famous and most productive statistician of the 20th century. Throughout his life, however, Fisher doubted the causal relationship between tobacco smoking and lung cancer. Instead, he invoked a genetic confounder to explain the statistical association between the two factors, i.e., he believed in the existence of a gene that plays a role in both cancer etiology and smoking behavior. There have been many attempts to explain Fisher's stubbornness in this matter. In addition to nonscientific reasons (Fisher was himself a keen smoker), worries about the future importance of valid statistical methodology in medical research may also have played an important role. Interestingly, recent genome-wide association studies (GWAS) of smoking behavior as well as lung cancer have revealed that there may have been a grain of truth in Fisher's idea, and that his confounder may coincide with the gene encoding nicotine receptor subunit α5 on chromosome 15q25.

13.
Success in experiments and/or technology mainly depends on a properly designed process or product. The traditional method of process optimization involves the study of one variable at a time, which requires a number of experimental combinations that are time-, cost- and labor-intensive. The Taguchi method of design of experiments is a simple statistical tool involving a system of tabulated designs (arrays) that allows a maximum number of main effects to be estimated in an unbiased (orthogonal) fashion with a minimum number of experimental runs. It has been applied to predict the significant contribution of the design variable(s) and the optimum combination of each variable by conducting experiments on a real-time basis. The modeling that is performed essentially relates the signal-to-noise ratio to the control variables in a 'main effects only' approach. This approach enables both multiple-response and dynamic problems to be studied by handling noise factors. Taguchi principles and concepts have made extensive contributions to industry by bringing focused awareness to robustness, noise and quality. The methodology has been widely applied in many industrial sectors; however, its application in the biological sciences has been limited. In the present review, the application of the Taguchi methodology is emphasized and compared via specific case studies in the field of biotechnology, particularly in areas as diverse as fermentation, food processing, molecular biology, wastewater treatment and bioremediation.
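As a concrete illustration of the tabulated-array idea, the sketch below uses the smallest Taguchi design, an L4 array for three two-level factors, together with the standard larger-the-better signal-to-noise ratio. The yields are hypothetical, not taken from any of the reviewed case studies:

```python
import math

# L4 orthogonal array: 4 runs suffice to estimate the main effects
# of 3 two-level factors (columns) in an orthogonal fashion
L4 = [(1, 1, 1), (1, 2, 2), (2, 1, 2), (2, 2, 1)]

def sn_larger_is_better(replicates):
    """Taguchi larger-the-better signal-to-noise ratio in dB:
    SN = -10 * log10(mean(1 / y^2))."""
    return -10 * math.log10(sum(1 / y**2 for y in replicates)
                            / len(replicates))

# hypothetical yields, two replicates per run
yields = [(72, 75), (78, 80), (81, 79), (90, 92)]
sn = [sn_larger_is_better(r) for r in yields]

# main effect of factor 1 = mean SN at level 2 minus mean SN at level 1
effect_f1 = (sum(s for row, s in zip(L4, sn) if row[0] == 2) / 2
             - sum(s for row, s in zip(L4, sn) if row[0] == 1) / 2)
print(round(effect_f1, 2))  # positive: level 2 of factor 1 looks better
```

With the full-factorial alternative, the same three factors would need eight runs; the orthogonal array halves that, which is exactly the economy the review highlights.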

14.
This paper argues for the inclusion of ethnography as a research methodology for understanding the effects of public health policy. To do this, the implementation of DOTS (Directly Observed Therapy, Short-course)--the World Health Organization (WHO) prescribed policy for the control of the infectious disease tuberculosis--is explored in the context of Nepal. A brief history of DOTS and its implementation in Nepal is outlined, and the way it has been represented by those within the Nepal Tuberculosis Programme (NTP) is described. This is followed by an outline of the related research and the ethnographic methods used. These ethnographic data are then interpreted and analysed in relation to two specific areas of concern. First, the effects of the epidemiological use of 'cases' are explored; it is argued that the tightening of definitional categories, necessary for the programme to be stabilized for comparative purposes, has profound material effects in marginalizing some people from treatment. Second, the paper examines some of the implications and effects of the way the 'directly observed' component was implemented. The discussion explores how the current debate on DOTS has played out in some medical journals. It argues for the importance of ethnography as a method for answering questions that cannot be addressed by particular, and increasingly dominant, research ideologies informed by randomized controlled trials. This raises important issues about the nature of 'evidence' in debates on the relationship of research to policy.

15.
In infectious disease epidemiology, statistical methods are an indispensable component of the automated detection of outbreaks in routinely collected surveillance data. So far, methodology in this area has been largely frequentist and has increasingly taken inspiration from statistical process control. The present work is concerned with strengthening Bayesian thinking in this field. We extend the widely used approaches of Farrington et al. and Heisterkamp et al. to a modern Bayesian framework within a time series decomposition context. This approach facilitates a direct calculation of the decision-making threshold while taking all sources of uncertainty in both prediction and estimation into account. More importantly, the methodology now also makes it possible to integrate covariate processes, e.g. weather influences, into the outbreak detection. Model inference is performed using fast and efficient integrated nested Laplace approximations, enabling the use of this method in routine surveillance at public health institutions. Performance of the algorithm was investigated by comparing simulations with existing methods, as well as by analysing the time series of notified campylobacteriosis cases in Germany for the years 2002–2011 with absolute humidity included as a covariate process. Altogether, a flexible and modern surveillance algorithm is presented, with an implementation available through the R package 'surveillance'.
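The Bayesian INLA-based algorithm itself lives in the R package 'surveillance'; the toy sketch below only conveys the underlying thresholding idea shared with the Farrington approach (an upper prediction bound computed from historical counts), using made-up weekly counts rather than the German campylobacteriosis data:

```python
import math
import statistics

def outbreak_threshold(baseline_counts, z=1.96):
    """Crude frequentist analogue of outbreak-detection thresholding:
    an upper prediction bound from historical counts for comparable
    past weeks. The variance floor at the mean allows for Poisson-like
    counts while tolerating overdispersion. Toy sketch only -- the
    paper's method is Bayesian, with covariates and INLA inference."""
    mu = statistics.fmean(baseline_counts)
    var = max(statistics.variance(baseline_counts), mu)
    return mu + z * math.sqrt(var)

history = [12, 15, 9, 14, 11, 13, 10]   # counts in comparable past weeks
current = 25
alarm = current > outbreak_threshold(history)
print(alarm)  # True -> flag this week for epidemiological review
```

The paper's contribution is precisely that this threshold is instead derived from a posterior predictive distribution, so estimation uncertainty and covariate effects (such as humidity) propagate into the alarm decision.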

16.
Magnetic resonance imaging (MRI) is an imaging technique with a rapidly expanding range of applications. This methodology, which relies on quantum physics and the magnetic properties of matter, is now routinely used in the clinic and in medical research. With the advent of measuring functional brain activity with MRI (functional MRI), the methodology has reached a larger section of the neuroscience community (e.g. psychologists, neurobiologists). In the past, the use of MRI as a biomarker or as an assay to probe tissue pathophysiological condition was limited. However, with the newer applications of MRI--molecular imaging, contrast-enhanced imaging and diffusion imaging--MRI is turning into a powerful tool for in vivo characterization of tissue pathophysiology. This review focuses on diffusion MRI. Although it measures only the averaged Brownian translational motion of water molecules, different analysis schemes allow a wide range of quantitative indices to be extracted that represent tissue morphology and compartmentalization. Statistical and visualization routines help to relate these indices to biologically relevant measures such as cell density, water content and size distribution. The aim of this review is to shed light on the potential of this methodology for biological research. To that end, the review is intended for non-MRI specialists who wish to pursue biological research with this methodology. We will overview the current applications of diffusion MRI and its relation to the cellular biology of brain tissue.
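Of the quantitative indices mentioned, the simplest is the apparent diffusion coefficient (ADC) from the mono-exponential model S = S0·exp(-b·D). A minimal sketch, with assumed, brain-tissue-like signal values (not figures from the review):

```python
import math

def apparent_diffusion_coefficient(s0, s, b):
    """ADC from the mono-exponential diffusion model S = S0 * exp(-b * D).

    s0: signal without diffusion weighting (b = 0)
    s:  signal at diffusion weighting b (in s/mm^2)
    Returns D in mm^2/s, obtained by inverting the model:
    D = ln(S0 / S) / b.
    """
    return math.log(s0 / s) / b

# assumed signal values at a typical diffusion weighting of b = 1000 s/mm^2
d = apparent_diffusion_coefficient(s0=1000.0, s=450.0, b=1000.0)
print(f"{d:.2e} mm^2/s")  # about 8e-4 mm^2/s, plausible for brain tissue
```

More elaborate schemes (diffusion tensor imaging, compartment models) fit the same kind of signal decay along many gradient directions instead of one.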

17.
Survival prediction from a large number of covariates is a current focus of statistical and medical research. In this paper, we study a methodology known as compound covariate prediction, performed under univariate Cox proportional hazards models. We demonstrate via simulations and real data analysis that the compound covariate method generally competes well with ridge regression and Lasso, both well-studied methods for predicting survival outcomes with a large number of covariates. Furthermore, we develop a refinement of the compound covariate method by incorporating likelihood information from multivariate Cox models. The new proposal is an adaptive method that borrows information contained in both the univariate and multivariate Cox regression estimators. We show that the new proposal has a theoretical justification from large-sample theory and is naturally interpreted as a shrinkage-type estimator, a popular class of estimators in the statistical literature. Two datasets, the primary biliary cirrhosis of the liver data and the non-small-cell lung cancer data, are used for illustration. The proposed method is implemented in the R package "compound.Cox", available on CRAN at http://cran.r-project.org/.
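The proposed methods are implemented in the authors' R package; the sketch below only illustrates the basic compound covariate idea, a risk score that sums covariates weighted by their coefficients from separately fitted univariate Cox models. The coefficients and expression values here are hypothetical stand-ins, not fitted values from the paper's datasets:

```python
# hypothetical univariate Cox log-hazard-ratio estimates, one per covariate
# (in practice, each comes from fitting a univariate Cox model to that
# covariate alone, which stays stable even when covariates outnumber events)
univariate_betas = {"gene_A": 0.83, "gene_B": -0.41, "gene_C": 1.10}

def compound_covariate(patient):
    """Compound covariate risk score: sum of beta_j * x_j over covariates.
    A higher score means a higher predicted hazard."""
    return sum(beta * patient[g] for g, beta in univariate_betas.items())

# hypothetical standardized expression values for two patients
high_risk = {"gene_A": 1.4, "gene_B": -0.5, "gene_C": 0.9}
low_risk = {"gene_A": -0.6, "gene_B": 0.8, "gene_C": -1.1}
print(compound_covariate(high_risk) > compound_covariate(low_risk))  # True
```

The paper's refinement then shrinks this score toward the multivariate Cox estimator, trading the stability of the univariate fits against the joint information the multivariate model captures.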

18.
Increased concentrations of Total Phosphorus (TP) in freshwater systems lead to eutrophication and can contribute to a wide range of environmental effects. In the modern era, water quality models have increasingly been used globally to develop management scenarios aimed at reducing eutrophication risk. However, the accuracy of these models is limited by the quality of the boundary-condition forcing data, namely TP concentration datasets. In this study, a novel methodology is proposed to improve machine learning prediction accuracy when modeling river TP concentration with small input training datasets. These models can then be used to increase the quality and consistency of the TP concentration datasets required to force water quality models. The new methodology relies on the generation of 100 new training datasets from the raw training datasets of input predictors through an over/undersampling technique. The modeling approach was supported by the application of ten machine learning algorithms to estimate TP concentration values in 22 rivers located in Portugal, and included an input feature importance evaluation as well as model hyperparameter optimization. In general terms, the Extreme Gradient Boosting (XGBoost) and Support Vector Regressor (SVR) models performed best overall, with the ensemble results for the two models increasing the mean Nash-Sutcliffe efficiency (NSE) across all study areas by 96% (0.01 ± 0.22 to 0.31 ± 0.32) and reducing the mean percentage bias (PBIAS) by 43% (18.47 ± 17.31 to 10.60 ± 17.40). The results of this study suggest that the proposed solution has the potential to significantly improve the modeling of TP concentration in rivers with machine learning methods, and offers scope for application to larger training datasets and to the prediction of other types of dependent variables.
It is hoped that these results will add to the body of information available in this area of research and aid the development of water management processes.
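The two headline metrics are standard and easy to state. A minimal sketch with hypothetical observed and simulated TP concentrations (illustrative values, not data from the study's 22 rivers):

```python
def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means the model
    is no better than predicting the mean of the observations."""
    mean_obs = sum(obs) / len(obs)
    sq_err = sum((o - s) ** 2 for o, s in zip(obs, sim))
    sq_dev = sum((o - mean_obs) ** 2 for o in obs)
    return 1 - sq_err / sq_dev

def pbias(obs, sim):
    """Percentage bias; with this common sign convention, positive values
    indicate the model underestimates on average."""
    return 100 * sum(o - s for o, s in zip(obs, sim)) / sum(obs)

obs = [0.10, 0.20, 0.15, 0.30, 0.25]   # hypothetical observed TP (mg/L)
sim = [0.12, 0.18, 0.16, 0.27, 0.24]   # hypothetical model output (mg/L)
print(round(nse(obs, sim), 3), round(pbias(obs, sim), 1))  # 0.924 3.0
```

Reading the study's numbers with these definitions, a mean NSE rising from 0.01 to 0.31 means the ensembled models moved from mean-level predictions toward genuinely explanatory ones.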

19.
For many years, the role of internal radiotherapy remained limited to certain historical indications, such as thyroid cancers, or to academic medical research. However, the recent recognition of theranostics and targeted therapies as cornerstones of the modern concept of personalized medicine has helped promote new developments in beta and alpha radiotherapy. In this paper, we review the emerging radionuclides, radiopharmaceutical developments and advances, as well as the clinical successes of the past few years. The results obtained, some of them very promising, could herald a new era for nuclear medicine. However, as presented in this review, in order to fully exploit its potential, and not to remain static as a promising or emerging therapy, the entire field of nuclear medicine must invest in the implementation of well-designed prospective and comparative studies of targeted radiotherapy.

20.

Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号