Similar Articles
20 similar articles found (search time: 31 ms)
1.
The factors which need to be taken into account in designing a 'good' experiment are reviewed. Such an experiment should be unbiased, have high precision and a wide range of applicability, be simple, and provide a means of quantifying uncertainty (Cox 1958). The relative precision due to the use of randomized block designs was found to range from 96% to 543% in 5 experiments involving 30 variables. However, a survey of 78 papers published in two toxicology journals showed that such designs were hardly used. Similarly, designs in which more than one factor was varied simultaneously ('factorial designs') were used in only 9% of studies, though interactions between variables such as dose and strain of animal may be common, so that single-factor experiments could be misleading. The consequences of increased within-group variability due to infection and genetic segregation were quantified using data published by Gärtner (1990). Both substantially reduced precision, yet toxicologists continue to use non-isogenic laboratory animals, leading to experiments with a lower level of precision than is necessary. It is concluded that there is scope for improving the design of animal experiments, which could lead to a reduction in animal use. People using animals should be required to take formal training courses which include sessions on experimental design, in order to minimize animal use and to increase experimental efficiency.

2.
Gap analyses are tools used to inform us about the shortcomings of a scientific area or necessities in socio-economic problems. In the last 20 years, environmental enrichment as an area of scientific investigation has come of age; this can be clearly seen in the number of publications produced in this area. For example, a search on the database The Web of Science©, using the keywords “environmental enrichment”, from 1985 to 2004 produced 744 articles. In this study we analysed these 744 articles and classified them by year into: type of environment (e.g., zoo, farm and laboratory); taxonomic classification (e.g., mammal, bird, etc.); type of enrichment (e.g., food, sensory, etc.); subject area (e.g., neurosciences and agriculture); and country of publication; we also gathered data on experimental design (e.g., sample sizes). Furthermore, we collected similar data on animal well-being and animal conservation for comparative purposes (keywords: “animal well-being” and “animal conservation”). The results from this study show that the number of environmental enrichment studies increased steadily from a low level in the 1980s until 1999, when there was a noticeable acceleration in the number of articles published. Largely, this acceleration was a response to the growing interest in environmental enrichment by neuroscientists. The data also show a relative lack of, and recent decline in, publications in the area of agriculture. Thus, the data suggest a need for more research on enriching the lives of farm animals. Environmental enrichment publications over the 20 years of the study corresponded to 27% of all animal well-being publications in the period. One interesting comparison between enrichment and animal well-being revealed the virtual absence of research in animal well-being by neuroscientists. The detailed results of this study will help in identifying gaps in our knowledge about environmental enrichment, and how experimental designs might be improved.

3.
During the course of an experiment using animals, many variables (e.g., age, body weight at several times, food and water consumption, hematology, and clinical biochemistry) and other characteristics are often recorded in addition to the primary response variable(s) specified by the experimenter. These additional variables have an important role in the design and interpretation of the experiment. They may be formally incorporated into the design and/or analysis and thus increase precision and power. However, even if these variables are not incorporated into the primary statistical design or into the formal analysis of the experiment, they may nevertheless be used in an ancillary or exploratory way to provide valuable information about the experiment, as shown by various examples. Used in this way, ancillary variables may improve analysis and interpretation by providing an assessment of the randomization process and an approach to the identification of outliers, lead to the generation of new hypotheses, and increase generality of results or account for differences in results when compared across different experiments. Thus, appropriate use of additional variables may lead to reduction in the number of animals required to achieve the aims of the experiment and may provide additional scientific information as an extra benefit. Unfortunately, this type of information is sometimes effectively discarded because its potential value is not recognized. Guidelines for use of animals include, in addition to the obligation to follow humane procedures, the obligation to use no more animals than necessary. Ethical experimental practice thus requires that all information be properly used and reported.

4.
Although there have been several papers recommending appropriate experimental designs for ancient-DNA studies, there have been few attempts at statistical analysis. We assume that we cannot decide whether a result is authentic simply by examining the sequence (e.g., when working with humans and domestic animals). We use a maximum-likelihood approach to estimate the probability that a positive result from a sample is (either partly or entirely) an amplification of DNA that was present in the sample before the experiment began. Our method is useful in two situations. First, we can decide in advance how many samples will be needed to achieve a given level of confidence. For example, to be almost certain (95% confidence interval 0.96-1.00, maximum-likelihood estimate 1.00) that a positive result comes, at least in part, from DNA present before the experiment began, we need to analyze at least five samples and controls, even if all samples and no negative controls yield positive results. Second, we can decide how much confidence to place in results that have been obtained already, whether or not there are positive results from some controls. For example, the risk that at least one negative control yields a positive result increases with the size of the experiment, but the effects of occasional contamination are less severe in large experiments.
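The sample-size logic in the abstract above can be illustrated with a deliberately simplified model (this is a toy bound, not the authors' full maximum-likelihood method): if each positive result arises from contamination with some per-sample probability c, independently, then the chance that all n positives are contamination alone is c^n, and we can solve for the smallest n that pushes this below a chosen alpha.

```python
import math

def min_samples(c, alpha=0.05):
    """Smallest n such that the probability that ALL n positive
    results arise from contamination alone (probability c each,
    independent) drops below alpha. A toy illustration, not the
    paper's likelihood model; c is an assumed input."""
    return math.ceil(math.log(alpha) / math.log(c))

# With a 50% per-sample contamination probability, five
# all-positive samples are needed for 95% confidence.
print(min_samples(0.5))  # -> 5
```

Note how the answer matches the paper's headline figure of five samples only under the assumed c = 0.5; the real analysis estimates the contamination rate from the controls rather than fixing it.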

5.
The numerical response, the change in specific growth rate with food concentration, is a fundamental component of many aquatic microbial studies. Accurately and precisely determining the parameters of this response is essential to obtain useful data for both aut- and synecological studies. In this work we emphasize four points that are often ignored in designing numerical response experiments: (1) the inclusion of subthreshold concentrations (i.e., where growth rate is negative) in the experimental design; (2) an appropriate allocation of effort, i.e., the superiority of choosing more individual prey concentrations rather than replicating fewer; (3) the potential superiority of replicating experiments rather than simply replicating treatment in a single experiment; and (4) the placement of most measurements near the lower end of the concentration gradient, well below the asymptote, possibly following a geometric progression. We illustrate the first point by examining a small subset of published data on planktonic oligotrich ciliates and then, using a Monte Carlo simulation, rigorously evaluate the experimental design, supporting the remaining points.
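Design points (1) and (4) above can be sketched with a short fitting exercise. The threshold-Monod functional form, the parameter values, and the concentration grid below are all assumptions for illustration, not the authors' model or data; the grid is geometric and includes a subthreshold point, as the abstract recommends.

```python
import numpy as np
from scipy.optimize import curve_fit

def numerical_response(C, mu_max, k, c_thresh):
    """Threshold Monod form: growth rate is negative below c_thresh.
    An assumed functional form for illustration only."""
    return mu_max * (C - c_thresh) / (k + C - c_thresh)

# Synthetic data on a geometric concentration grid, weighted toward
# low concentrations and including one subthreshold point (C = 5).
rng = np.random.default_rng(2)
C = np.array([5, 10, 20, 40, 80, 160, 320, 640, 1280.0])
mu = numerical_response(C, mu_max=1.2, k=150.0, c_thresh=30.0)
mu = mu + rng.normal(0, 0.02, C.size)

popt, _ = curve_fit(numerical_response, C, mu, p0=[1.0, 100.0, 10.0])
print(dict(zip(["mu_max", "k", "c_thresh"], popt)))
```

Dropping the subthreshold point from `C` makes `c_thresh` much harder to estimate, which is one way to see why the authors stress point (1).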

6.
Ecological Monographs, 2011, 81(4): 635–663
Ecology is inherently multivariate, but high-dimensional data are difficult to understand. Dimension reduction with ordination analysis helps with both data exploration and clarification of the meaning of inferences (e.g., randomization tests, variation partitioning) about a statistical population. Most such inferences are asymmetric, in that variables are classified as either response or explanatory (e.g., factors, predictors). But this asymmetric approach has limitations (e.g., abiotic variables may not entirely explain correlations between interacting species). We study symmetric population-level inferences by modeling correlations and co-occurrences, using these models for out-of-sample prediction. Such modeling requires a novel treatment of ordination axes as random effects, because fixed effects only allow within-sample predictions. We advocate an iterative methodology for random-effects ordination: (1) fit a set of candidate models differing in complexity (e.g., number of axes); (2) use information criteria to choose among models; (3) compare model predictions with data; (4) explore dimension-reduced graphs (e.g., biplots); (5) repeat 1–4 if model performance is poor. We describe and illustrate random-effects ordination models (with software) for two types of data: multivariate-normal (e.g., log morphometric data) and presence–absence community data. A large simulation experiment with multivariate-normal data demonstrates good performance of (1) a small-sample-corrected information criterion and (2) factor analysis relative to principal component analysis. Predictive comparison of multiple alternative models is a powerful form of scientific reasoning: we have shown that unconstrained ordination can be based on such reasoning.

7.
Simple ratios in which a measurement variable is divided by a size variable are commonly used but known to be inadequate for eliminating size correlations from morphometric data. Deficiencies in the simple ratio can be alleviated by incorporating regression coefficients describing the bivariate relationship between the measurement and size variables. Recommendations have included: 1) subtracting the regression intercept to force the bivariate relationship through the origin (intercept-adjusted ratios); 2) exponentiating either the measurement or the size variable using an allometry coefficient to achieve linearity (allometrically adjusted ratios); or 3) both subtracting the intercept and exponentiating (fully adjusted ratios). These three strategies for deriving size-adjusted ratios imply different data models for describing the bivariate relationship between the measurement and size variables (i.e., the linear, simple allometric, and full allometric models, respectively). Algebraic rearrangement of the equation associated with each data model leads to a correctly formulated adjusted ratio whose expected value is constant (i.e., size correlation is eliminated). Alternatively, simple algebra can be used to derive an expected value function for assessing whether any proposed ratio formula is effective in eliminating size correlations. Some published ratio adjustments were incorrectly formulated as indicated by expected values that remain a function of size after ratio transformation. Regression coefficients incorporated into adjusted ratios must be estimated using least-squares regression of the measurement variable on the size variable. Use of parameters estimated by any other regression technique (e.g., major axis or reduced major axis) results in residual correlations between size and the adjusted measurement variable. Correctly formulated adjusted ratios, whose parameters are estimated by least-squares methods, do control for size correlations. 
The size-adjusted results are similar to those based on analysis of least-squares residuals from the regression of the measurement on the size variable. However, adjusted ratios introduce size-related changes in distributional characteristics (variances) that differentially alter relationships among animals in different size classes. © 1993 Wiley-Liss, Inc.
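The intercept-adjusted case described above can be illustrated with a short simulation (the synthetic data and parameter values are assumptions, not the paper's): fit the measurement on size by ordinary least squares, then subtract the estimated intercept before dividing by size. The simple ratio stays correlated with size; the adjusted ratio does not.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a linear measurement-size relationship: M = a + b*S + noise
S = rng.uniform(10, 50, 200)            # size variable
a, b = 4.0, 0.8                         # assumed true intercept and slope
M = a + b * S + rng.normal(0, 0.5, 200)

# Least-squares regression of M on S, as the abstract requires
b_hat, a_hat = np.polyfit(S, M, 1)

simple_ratio = M / S                    # size correlation remains
adjusted_ratio = (M - a_hat) / S        # intercept-adjusted ratio

print(np.corrcoef(S, simple_ratio)[0, 1])    # strongly negative
print(np.corrcoef(S, adjusted_ratio)[0, 1])  # near zero
```

Swapping the least-squares fit for a major-axis or reduced-major-axis slope reintroduces a residual size correlation, which is the abstract's warning about estimation technique.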

8.
Optimal experiment design for parameter estimation (OED/PE) has become a popular tool for efficient and accurate estimation of kinetic model parameters. When the kinetic model under study encloses multiple parameters, different optimization strategies can be constructed. The most straightforward approach is to estimate all parameters simultaneously from one optimal experiment (single OED/PE strategy). However, due to the complexity of the optimization problem or the stringent limitations on the system's dynamics, the experimental information can be limited and parameter estimation convergence problems can arise. As an alternative, we propose to reduce the optimization problem to a series of two-parameter estimation problems, i.e., an optimal experiment is designed for a combination of two parameters while presuming the other parameters known. Two different approaches can be followed: (i) all two-parameter optimal experiments are designed based on identical initial parameter estimates and parameters are estimated simultaneously from all resulting experimental data (global OED/PE strategy), and (ii) optimal experiments are calculated and implemented sequentially whereby the parameter values are updated intermediately (sequential OED/PE strategy). This work exploits OED/PE for the identification of the Cardinal Temperature Model with Inflection (CTMI) (Rosso et al., 1993). This kinetic model describes the effect of temperature on the microbial growth rate and encloses four parameters. The three OED/PE strategies are considered and the impact of the OED/PE design strategy on the accuracy of the CTMI parameter estimation is evaluated. Based on a simulation study, it is observed that the parameter values derived from the sequential approach deviate more from the true parameters than the single and global strategy estimates. The single and global OED/PE strategies are further compared based on experimental data obtained from design implementation in a bioreactor.
Comparable estimates are obtained, but global OED/PE estimates are, in general, more accurate and reliable.
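The CTMI referred to above (Rosso et al., 1993) has a closed form in the four cardinal parameters; a minimal implementation is sketched below. The parameter values in the example call are illustrative, not estimates from the study.

```python
def ctmi(T, T_min, T_opt, T_max, mu_opt):
    """Cardinal Temperature Model with Inflection (Rosso et al., 1993).
    Returns the specific growth rate at temperature T; zero outside
    the (T_min, T_max) interval. The four parameters are the cardinal
    temperatures plus the optimal growth rate mu_opt."""
    if T <= T_min or T >= T_max:
        return 0.0
    num = (T - T_max) * (T - T_min) ** 2
    den = (T_opt - T_min) * (
        (T_opt - T_min) * (T - T_opt)
        - (T_opt - T_max) * (T_opt + T_min - 2 * T)
    )
    return mu_opt * num / den

# Illustrative parameter values (not taken from the study above):
# at T = T_opt the model returns mu_opt by construction.
print(ctmi(37.0, T_min=5.0, T_opt=37.0, T_max=45.0, mu_opt=2.0))  # -> 2.0
```

The denominator structure is what produces the inflection below T_opt, which in turn is why the model needs carefully placed design points for all four parameters to be well identified.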

9.
Consider an experiment where the response is based on an image; e.g., an image captured to a computer file by a digital camera mounted on a microscope. Suppose relevant quantitative measures are extracted from the images so that results can be analyzed by conventional statistical methods. The steps involved in extracting the measures may require that the technicians, who are processing the images, perform some subjective manipulations. In this case, it is important to determine the bias and variability, if any, attributable to the technicians' decisions. This paper describes the experimental design and statistical analyses that are useful for those determinations. The design and analysis are illustrated by application to two biofilm research projects that involved quantitative image analysis. In one investigation, the technician was required to choose a threshold level, then the image analysis program automatically extracted relevant measures from the resulting black and white image. In the other investigation, the technician was required to choose fiducial points in each of two images collected on different microscopes; then the image analysis program registered the images by stretching, rotating, and overlaying them, so that their quantitative features could be correlated. These investigations elucidated the effects of the technicians' decisions, thereby helping us to assess properly the statistical uncertainties in the conclusions for the primary experiments.

10.
Phenotype is often correlated with resource use, which suggests that as phenotypic variation in a population increases, intraspecific competition will decrease. However, few studies have experimentally tested the prediction that increased intraspecific phenotypic variation leads to reduced competitive effects (e.g., on growth rate, survival or reproductive rate). We investigated this prediction with two experiments on wood frogs (Rana sylvatica). In the first experiment, we found that a frog’s size was positively correlated with the size of its preferred prey, indicating that the feeding niche of the frogs changed with size. In the second experiment, we used an experimental design in which we held the initial mass of “focal” frogs constant, but varied the initial mass of their competitors. We found a significant quadratic effect of the average mass of competitors: focal frog growth was lowest when raised with similar-sized competitors, and highest when raised with competitors that were larger or smaller. Our results demonstrate that growth rates increase (i.e., competitive intensity decreases) when individuals are less similar to other members of the population and exhibit less overlap in resource use. Thus, changes in the amount of phenotypic variation in a population may ultimately affect population-level processes, such as population growth rate and extinction risk.

11.
Rosner B, Glynn RJ, Lee ML. Biometrics, 2006, 62(1): 185–192
The Wilcoxon signed rank test is a frequently used nonparametric test for paired data (e.g., consisting of pre- and posttreatment measurements) based on independent units of analysis. This test cannot be used for paired comparisons arising from clustered data (e.g., if paired comparisons are available for each of two eyes of an individual). To incorporate clustering, a generalization of the randomization test formulation for the signed rank test is proposed, where the unit of randomization is at the cluster level (e.g., person), while the individual paired units of analysis are at the subunit within cluster level (e.g., eye within person). An adjusted variance estimate of the signed rank test statistic is then derived, which can be used for either balanced (same number of subunits per cluster) or unbalanced (different number of subunits per cluster) data, with an exchangeable correlation structure, with or without tied values. The resulting test statistic is shown to be asymptotically normal as the number of clusters becomes large, if the cluster size is bounded. Simulation studies are performed based on simulating correlated ranked data from a signed log-normal distribution. These studies indicate appropriate type I error for data sets with ≥20 clusters and a superior power profile compared with either the ordinary signed rank test based on the average cluster difference score or the multivariate signed rank test of Puri and Sen. Finally, the methods are illustrated with two data sets, (i) an ophthalmologic data set involving a comparison of electroretinogram (ERG) data in retinitis pigmentosa (RP) patients before and after undergoing an experimental surgical procedure, and (ii) a nutritional data set based on a randomized prospective study of nutritional supplements in RP patients where vitamin E intake outside of study capsules is compared before and after randomization to monitor compliance with nutritional protocols.
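The cluster-level randomization idea above can be sketched as an exact sign-flip permutation test: ranks are computed over all subunit differences, but signs are flipped for whole clusters at once. This is an illustrative Monte-Carlo-free enumeration for small data, not the authors' adjusted-variance statistic.

```python
import itertools

def cluster_signed_rank_p(clusters):
    """Exact cluster-level sign-flip test for paired differences.
    `clusters` is a list of lists: the paired differences for the
    subunits (e.g., eyes) of each cluster (e.g., person). Signs are
    flipped per cluster, mirroring randomization at the cluster
    level. Illustrative sketch only."""
    diffs = [d for cl in clusters for d in cl]
    # Midranks of |differences| across all subunits (handles ties)
    order = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * len(diffs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        mid = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = mid
        i = j + 1
    # Signed ranks, regrouped by cluster
    signed, idx = [], 0
    for cl in clusters:
        signed.append([ranks[idx + k] * (1 if d > 0 else -1 if d < 0 else 0)
                       for k, d in enumerate(cl)])
        idx += len(cl)
    observed = abs(sum(sum(cl) for cl in signed))
    # Enumerate all 2^m per-cluster sign flips (exact for small m)
    count = total = 0
    for flips in itertools.product([1, -1], repeat=len(clusters)):
        stat = abs(sum(f * sum(cl) for f, cl in zip(flips, signed)))
        total += 1
        if stat >= observed:
            count += 1
    return count / total

# Five people, two eyes each, all differences positive:
print(cluster_signed_rank_p([[1.0, 2.0]] * 5))  # -> 0.0625
```

With five clusters there are only 2^5 = 32 flips, so the smallest attainable two-sided p is 2/32; the paper's asymptotic variance adjustment is what makes the approach practical for larger studies.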

12.
Size-related changes of form in animals with periodically patterned body axes and post-embryonic growth discontinuously obtained throughout a series of moulting episodes cannot be accounted for by allometry alone. We address here the relationships between body size and number and size of appropriately selected structural units (e.g., segments), which may more or less closely approximate independent developmental units, or unitary targets of selection, or both. Distinguishing between units fundamentally involving one cell only or a small and fixed number of cells (e.g., the ommatidia in a compound eye), and units made of an indeterminate number of cells (e.g., trunk segments), we analyze and discuss a selection of body features of either kind, both in ontogeny and in phylogeny, through a review of current literature and meta-analyses of published and unpublished data. While size/number relationships are too diverse to allow easy generalizations, they provide conspicuous examples of the complex interplay of selective forces and developmental constraints that characterizes the evolution of arthropod body patterning.

13.
Application of Landscape Allometry to Restoration of Tidal Channels
Oligohaline tidal channels (sloughs) in the Pacific Northwest were shown to have allometric form with respect to outlet width and depth, channel length, perimeter, and surface area. In contrast, an artificial slough, excavated to mitigate port improvements, did not conform to natural slough allometry, resulting in high retention of allochthonous inputs and sediment accumulation. Additionally, intertidal sedge habitat abundance was related to slough size for smaller sloughs, but larger sloughs did not fit this allometric pattern. This suggests that sedge habitat in large sloughs has been destroyed due to extensive log storage and transportation from the 1890s to the 1970s. Finally, the abundance of salmonid prey of terrestrial origin—aphids and adult flies—in slough surface waters was correlated with slough perimeter and, for aphids, with the amount of intertidal sedge habitat. An allometric perspective on landscape form and function has several implications for habitat restoration and mitigation: (1) Size‐related constraints on replication for landscape‐scale studies are loosened (e.g., rather than requiring reference sites that are similar in size to experimental sites, analysis of covariance can be used to control size effects); (2) physical processes, such as sedimentation and erosion, affect landscape form, whereas landscape form can affect ecological processes, so design of restoration or mitigation projects should conform to allometric patterns to maximize physical and ecological predictability; (3) landscape allometry may provide insight into undocumented human disturbances; and (4) allometric patterns suggest design goals and criteria for success.

14.
J.-P. Voigt, J.P. Huston, M. Voits, H. Fink. Peptides, 1996, 17(8): 1313–1315
The effects of CCK on food intake were investigated under fixed feeding conditions in comparison to a test meal taken after 16 h of food deprivation. The experiments were performed on young adult rats (8 weeks old) as well as on aged rats (23 months old). Intraperitoneal CCK-8 (8 and 40 μg/kg) significantly reduced the size of a test meal following 16-h food deprivation. This effect was independent of the age of the rats. However, under fixed feeding conditions neither of the doses used in this study reduced food intake in the young adult rats, whereas the highest dose of 40 μg/kg did so in the aged rats. These results suggest that the hypophagic effect of exogenous CCK-8 depends on experimental conditions, food intake being reduced after a period of food deprivation but not under a fixed feeding regimen in adult animals. Furthermore, the data suggest that age is a factor contributing to the complex behavioral actions of CCK, because only old animals were susceptible to an anorectic action of CCK under the fixed feeding schedule. An explanation may lie in an interaction of other known behavioral effects of CCK (e.g., anxiogenic, mnemonic action) with its effects under the different feeding schedules.

15.
Allan R. Brasier. BioTechniques, 2002, 32(1): 100-102, 104, 106, 108-109
High-density oligonucleotide arrays are widely employed for detecting global changes in gene expression profiles of cells or tissues exposed to specific stimuli. Presented with large amounts of data, investigators can spend significant amounts of time analyzing and interpreting this array data. In our application of GeneChip arrays to analyze changes in gene expression in viral-infected epithelium, we have needed to develop additional computational tools that may be of utility to other investigators using this methodology. Here, I describe two executable programs to facilitate data extraction and multiple data point analysis. These programs run in a virtual DOS environment on Microsoft Windows 95/98/2K operating systems on a desktop PC. Both programs can be freely downloaded from the BioTechniques Software Library (www.BioTechniques.com). The first program, Retriever, extracts primary data from an array experiment contained in an Affymetrix text file using user-inputted individual identification strings (e.g., the probe set identification numbers). With specific data retrieved for individual genes, hybridization profiles can be examined and data normalized. The second program, CompareTable, is used to facilitate comparison analysis of two experimental replicates. CompareTable compares two lists of genes, identifies common entries, extracts their data, and writes an output text file containing only those genes present in both of the experiments. The output files generated by these two programs can be opened and manipulated by any software application recognizing tab-delimited text files (e.g., Microsoft NotePad or Excel).
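The core of the CompareTable step described above, intersecting two replicate gene lists and keeping only shared entries, amounts to a set intersection. The sketch below is a minimal re-implementation of that idea in Python, not the original DOS executable; the probe-set identifiers are made-up examples in the Affymetrix style.

```python
def compare_tables(replicate1, replicate2):
    """Given two {probe_set_id: value} mappings from replicate
    experiments, keep only genes present in both replicates,
    pairing their values (the CompareTable idea, sketched)."""
    common = replicate1.keys() & replicate2.keys()
    return {gene: (replicate1[gene], replicate2[gene])
            for gene in sorted(common)}

rep1 = {"100_g_at": 512.3, "1001_at": 87.0, "1003_s_at": 15.2}
rep2 = {"1001_at": 90.5, "1003_s_at": 14.8, "1005_at": 220.1}
print(compare_tables(rep1, rep2))
# only '1001_at' and '1003_s_at' appear in both replicates
```

Writing the result back out as a tab-delimited file would reproduce the described output format that NotePad or Excel can open.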

16.
An analysis of 117 titration experiments in the murine scrapie model is presented. The experiments encompass 30 years' work and a wide range of experimental conditions. To check that the experimental designs were reasonably consistent over time, comparisons were made of size, duration, source of inoculum, etc., in each experiment. These comparisons revealed no systematic trends that would render invalid comparisons across experiments. For 114 of the experiments it was possible to calculate the dose at which half of the challenged animals were infected (the ID50). These 114 experiments were then combined on the basis of relative dose (i.e. tenfold dilution relative to the ID50). This created a data set in which over 4000 animals were challenged with doses of scrapie ranging from four orders of magnitude below to five orders of magnitude above the ID50. Analysis of this data reveals that mean incubation periods rise linearly with logarithmic decreases in dose. A one unit increase in relative dose (i.e. a tenfold increase in actual dose) will, on average, decrease the incubation period by 25 days. At ID50 the average incubation period in this data set is 300 days. Within a single dose, in a single experimental model, incubation periods have a distribution close to normal. Variability in incubation period also rises linearly as dose decreases. There is no age or sex effect upon the probability of infection, but female mice have incubation periods that are, on average, nine days shorter than their male counterparts and young mice have incubation periods that are longer by seven days. Although many of these patterns are apparent in the results of single titration curves, they can be more rigorously investigated by considering the outcome for thousands of mice.
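The reported linear dose-response relationship (300 days at ID50, minus 25 days per tenfold dose increase) can be written directly as a small predictive sketch; the coefficients are taken from the abstract and should be treated as rounded summary values, not an exact fitted model.

```python
def mean_incubation_days(log10_relative_dose):
    """Mean scrapie incubation period as a linear function of log
    relative dose, where relative dose 0 corresponds to the ID50.
    Coefficients are the summary figures quoted in the abstract:
    300 days at ID50, -25 days per tenfold dose increase."""
    return 300.0 - 25.0 * log10_relative_dose

print(mean_incubation_days(0))   # at the ID50 -> 300.0
print(mean_incubation_days(2))   # 100x the ID50 dose -> 250.0
```

The same linear form applied to negative `log10_relative_dose` values extrapolates to the subinfectious doses in the data set, where the abstract notes that variability also rises.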

17.
Within the last 5 years, protein microarrays have been developed and applied to multiple approaches: identification of protein–protein interactions or protein–small molecule interactions, cancer profiling, detection of microorganisms and toxins, and identification of antibodies due to allergens, autoantigens, and pathogens. Protein microarrays are small planar analytical devices (typically in microscope-slide format) with probes arranged in high density to provide the ability to screen several hundred to thousand known substrates (e.g., proteins, peptides, antibodies) simultaneously. Due to their small size, only minute amounts of spotted probes and analytes (e.g., serum) are needed; this is a particularly important feature when these are limited or expensive. This review covers different types of protein microarrays: protein microarrays (PMAs), with spotted proteins or peptides; antibody microarrays (AMAs), with spotted antibodies or antibody fragments (e.g., scFv); reverse phase protein microarrays (RPMAs), a special form of PMA where crude protein mixtures (e.g., cell lysates, fractions) are spotted; and nonprotein microarrays (NPMAs) where macromolecules other than proteins and nucleic acids (e.g., carbohydrates, monosaccharides, lipopolysaccharides) are spotted. Exemplary experiments for all types of protein arrays are discussed wherever applicable with regard to investigations of microorganisms.

18.
19.
Measurements of gene expression from microarray experiments are highly dependent on experimental design. Systematic noise can be introduced into the data at numerous steps. On Illumina BeadChips, multiple samples are assayed in an ordered series of arrays. Two experiments were performed using the same samples but different hybridization designs. An experiment confounding genotype with BeadChip and treatment with array position was compared to another experiment in which these factors were randomized to BeadChip and array position. An ordinal effect of array position on intensity values was observed in both experiments. We demonstrate that there is increased rate of false-positive results in the confounded design and that attempts to correct for confounded effects by statistical modeling reduce power of detection for true differential expression. Simple analysis models without post hoc corrections provide the best results possible for a given experimental design. Normalization improved differential expression testing in both experiments but randomization was the most important factor for establishing accurate results. We conclude that lack of randomization cannot be corrected by normalization or by analytical methods. Proper randomization is essential for successful microarray experiments.
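The randomization the authors found essential is cheap to do before hybridization: shuffle the chip/position slots and assign samples to them. The sketch below assumes a layout of chips with a fixed number of array positions each; the 12-positions-per-chip figure is an illustrative assumption, not taken from the study.

```python
import random

def randomize_layout(samples, n_chips, arrays_per_chip, seed=1):
    """Randomly assign samples to (chip, array-position) slots so that
    genotype or treatment is not confounded with BeadChip or position.
    The slot layout is an assumed example, not the study's design."""
    slots = [(chip, pos) for chip in range(n_chips)
             for pos in range(arrays_per_chip)]
    if len(samples) > len(slots):
        raise ValueError("more samples than available array positions")
    rng = random.Random(seed)       # fixed seed keeps the plan reproducible
    rng.shuffle(slots)
    return dict(zip(samples, slots))

layout = randomize_layout([f"sample{i}" for i in range(24)],
                          n_chips=2, arrays_per_chip=12)
```

Recording the seed alongside the layout lets the randomization itself be audited later, in the spirit of the paper's point that no post hoc correction can substitute for it.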

20.

Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)