Similar Documents
20 similar documents retrieved (search time: 31 ms)
1.
The analysis of sedimentary chironomid assemblages is an approach that has been widely adopted for inferring past environmental conditions. However, there is an ongoing discussion in the literature about whether this approach could become more sensitive at detecting past environmental changes if paleolimnologists conducted finer taxonomic analyses of their specimens. To advance this discussion, we conducted comparative analyses of larval chironomid assemblages resolved to two levels of taxonomic resolution. For this exercise, we chose to use live assemblages (as opposed to sub-fossil assemblages) because fine taxonomic resolution of live assemblages is more easily obtained. Our specific aims were to (i) determine if finely resolved taxa comprising a coarsely resolved group have similar ecological niches, (ii) determine if different environmental predictors of community composition are identified when two different levels of taxonomic resolution are applied and (iii) evaluate whether the variance explained by environmental variables differs substantially between levels of taxonomic resolution. We found that there was substantial dispersion among finely resolved taxa belonging to a single coarse group, which suggests that the merging of these taxa results in the loss of ecological information, and therefore warrants higher taxonomic precision. However, the identification of significant environmental predictors and the proportion of variance explained by these did not differ greatly between our two levels of taxonomic resolution. Overall, our results show that coarse-resolution analyses may be adequate for some applications, but if the aim is to infer subtle environmental changes (as is the case in most paleolimnological studies) we recommend the highest possible level of taxonomic resolution.

2.
Scales are widely used to determine both the growth rates and the age of fish. Their use in this method of scale reading is, however, complicated by the occurrence of false checks. It has been difficult to investigate the particular environmental factors that normally cause these interruptions in growth because it usually takes many weeks before a check can be detected and the growing scale is a complex system. A technique has therefore been devised in which a scale is isolated from the fish and used to measure its metabolism of radioactive glycine. This gives a measure of protein synthesis which correlates well with the normal growth of the scale and of the whole fish. It is, however, a measurement of the 'instantaneous growth rate' of the fish instead of one compounded over a long period of time and it thus opens up the possibility of investigating specific environmental factors that might cause changes in growth. The effects of handling, low oxygen levels, starvation, light and temperature changes have been studied by this technique. It has been shown that handling has the most marked effect but that oxygen levels and starvation also have direct effects that could induce false checks.

3.
Wu J  Zeng Y  Huang J  Hou W  Zhu J  Wu R 《Genetical research》2007,89(1):27-38
Whether there are different genes involved in response to different environmental signals and how these genes interact to determine the final expression of the trait are of fundamental importance in agricultural and biological research. We present a statistical framework for mapping environment-induced genes (or quantitative trait loci, QTLs) of major effects on the expression of a trait that respond to changing environments. This framework is constructed with a maximum-likelihood-based mixture model, in which the mean and covariance structure of environment-induced responses is modelled. The means for responses to continuous environmental states, referred to as reaction norms, are approximated for different QTL genotypes by mathematical equations that were derived from fundamental biological principles or based on statistical goodness-of-fit to observational data. The residual covariance between different environmental states was modelled by autoregressive processes. Such an approach to studying the genetic control of reaction norms can be expected to be advantageous over traditional mapping approaches in which no biological principles or statistical structures are considered. We demonstrate the analytical procedure and power of this approach by modelling the photosynthetic rate process as a function of temperature and light irradiance. Our approach allows for testing how a QTL affects the reaction norm of photosynthetic rate to a specific environment and whether there exist different QTLs to mediate photosynthetic responses to temperature and light irradiance, respectively.
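The reaction-norm idea above can be sketched as a genotype-specific response curve written as a parametric function of the environment. Everything below is an illustrative assumption, not the equations fitted in the paper: the Gaussian form, the genotype labels QQ/Qq/qq, and the parameter values are hypothetical.

```python
import math

# Hypothetical reaction norms: photosynthetic rate as a function of
# temperature, with one (peak rate, optimum temp, breadth) triple per
# assumed QTL genotype. A Gaussian shape is chosen only for illustration.
params = {"QQ": (12.0, 25.0, 8.0),
          "Qq": (10.0, 22.0, 9.0),
          "qq": (8.0, 20.0, 10.0)}

def reaction_norm(genotype, temp):
    """Mean photosynthetic rate for one genotype at one temperature."""
    peak, opt, breadth = params[genotype]
    return peak * math.exp(-((temp - opt) ** 2) / (2.0 * breadth ** 2))
```

A mapping framework like the one described would compare the likelihood of models in which the genotypes share one such curve against models in which each genotype gets its own parameters.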

4.
As the density of development increases, there is a growing need to address the cumulative effects of project developments on the environment. In Canada this need has been recognized in legislation whereby new project developments that require an environmental assessment under the Canadian Environmental Assessment Act are required to address the cumulative effects of proposed project activities relative to the existing environmental condition. Unfortunately, existing stressor-based and effects-based approaches to environmental assessment do not adequately address cumulative effects as defined under the Act when used in isolation. However, elements from each approach can be conceptually incorporated into a holistic cumulative effects assessment framework. Key framework components include: (1) an effects-based assessment to determine existing accumulated environmental state, (2) a stressor-based assessment to predict potential impacts of new development relative to the existing environmental state, (3) post-development monitoring to assess the accuracy of impact predictions and to provide an avenue for adaptive management, and (4) decision-making frameworks to link scientific information to public opinion and managerial action. The key advantage of this framework is that it provides a more holistic, systematic approach for incorporation of ecological information into a scientific and management framework for cumulative effects assessment.

5.
Central body fat distribution has been shown to be related to hyperinsulinemia, insulin resistance, hypertriglyceridemia, and atherosclerosis to a greater degree than general obesity. There are known to be both genetic and environmental effects on all components of this clustering. Whether these genetic effects are due to one set of genes in common to the components or whether genetic influences on insulin resistance and/or general/abdominal fatness 'turn on' other genes that affect other components of the syndrome is not clear. We analyzed data from the Swedish Adoption/Twin Study of Aging (60% female; monozygotic = 116, dizygotic = 202; average age 65 years) to determine whether there were genetic and/or environmental factors shared among general body fat distribution, abdominal body fat distribution, fasting insulin levels and cardiovascular disease. We found additive genetic effects in males to be significantly different from those in females, with genetic effects accounting for variance in waist-hip ratio (males = 28%; females = 49%), body mass index (males = 58%; females = 73%), fasting insulin levels (FI) (males = 27%; females = 49%), and cardiovascular disease (CVD) (males = 18%; females = 37%). There were also shared genetic and environmental effects among all the variables except CVD, but a majority of the genetic variance for these measures was trait specific.
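The twin-based heritability figures above can be illustrated with Falconer's classic approximation, h² = 2(r_MZ − r_DZ), which is a simpler stand-in for the full variance-component model fitting such studies actually use. The correlation values below are hypothetical, chosen only so the estimate reproduces the reported 49% figure for waist-hip ratio in females.

```python
# Falconer's approximation: heritability from monozygotic (MZ) and
# dizygotic (DZ) twin intraclass correlations. This is a simplification
# of full structural-equation model fitting, shown for illustration only.
def falconer_h2(r_mz, r_dz):
    """Additive genetic share of variance: h^2 = 2 * (r_MZ - r_DZ)."""
    return 2.0 * (r_mz - r_dz)

# Hypothetical correlations for waist-hip ratio in female twin pairs,
# chosen to reproduce the 49% heritability reported in the abstract.
h2_whr_female = falconer_h2(0.49, 0.245)
```

The intuition behind the formula: MZ twins share all additive genetic variance and DZ twins half of it, so doubling the difference in their correlations isolates the additive genetic component.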

6.
Offspring trait expression is determined by the combination of parental genes and parental environments. Although maternal environmental effects have been widely characterized, few studies have focused on paternal environmental effects. To determine whether light availability influences pollen and offspring traits in the woodland herb Campanula americana, we reared clones of 12 genotypes in two light levels. In the parental generation we measured pollen number and size. Plants grown under high light produced more pollen grains per flower than those grown under low light. However, the response was genotype specific; some individuals responded little to changes in light availability while others substantially reduced pollen production. As a consequence, paternity ratios may vary between light environments if more pollen is associated with greater siring success. We crossed a subset of these plants to produce the offspring generation. The paternal and maternal light environments influenced offspring seed mass, percentage germination, and days to germination, while only maternal light levels influenced later life traits, such as leaf number and size. Maternal and paternal environmental effects had opposite influences on seed mass, percentage germination and days to germination. Finally, there was no direct relationship between light effects on pollen production and offspring trait expression.  相似文献   

7.
Implicit and explicit use of expert knowledge to inform ecological analyses is becoming increasingly common because it often represents the sole source of information in many circumstances. Thus, there is a need to develop statistical methods that explicitly incorporate expert knowledge, and can successfully leverage this information while properly accounting for associated uncertainty during analysis. Studies of cause‐specific mortality provide an example of implicit use of expert knowledge when causes‐of‐death are uncertain and assigned based on the observer's knowledge of the most likely cause. To explicitly incorporate this use of expert knowledge and the associated uncertainty, we developed a statistical model for estimating cause‐specific mortality using a data augmentation approach within a Bayesian hierarchical framework. Specifically, for each mortality event, we elicited the observer's belief of cause‐of‐death by having them specify the probability that the death was due to each potential cause. These probabilities were then used as prior predictive values within our framework. This hierarchical framework permitted a simple and rigorous estimation method that was easily modified to include covariate effects and regularizing terms. Although applied to survival analysis, this method can be extended to any event‐time analysis with multiple event types, for which there is uncertainty regarding the true outcome. We conducted simulations to determine how our framework compared to traditional approaches that use expert knowledge implicitly and assume that cause‐of‐death is specified accurately. Simulation results supported the inclusion of observer uncertainty in cause‐of‐death assignment in modeling of cause‐specific mortality to improve model performance and inference. Finally, we applied the statistical model we developed and a traditional method to cause‐specific survival data for white‐tailed deer, and compared results. 
We demonstrate that model selection results changed between the two approaches, and incorporating observer knowledge in cause-of-death increased the variability associated with parameter estimates when compared to the traditional approach. These differences between the two approaches can impact reported results, and therefore, it is critical to explicitly incorporate expert knowledge in statistical methods to ensure rigorous inference.
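The data-augmentation step described above can be sketched with a minimal Monte Carlo version: each uncertain death is assigned a latent cause drawn from the observer's elicited probabilities, and cause-specific mortality fractions are averaged over the augmented datasets. The cause labels and probabilities below are hypothetical, and the full method embeds this draw inside a Bayesian hierarchical survival model rather than the bare averaging shown here.

```python
import random

random.seed(1)

# Observer-elicited probabilities that each death was due to
# (predation, disease, other); all values are hypothetical.
elicited = [(0.70, 0.20, 0.10),
            (0.10, 0.80, 0.10),
            (0.50, 0.40, 0.10),
            (0.90, 0.05, 0.05)]

def sample_cause(probs):
    """Draw a latent cause index from a categorical distribution."""
    u, cum = random.random(), 0.0
    for k, p in enumerate(probs):
        cum += p
        if u < cum:
            return k
    return len(probs) - 1

# Monte Carlo data augmentation: repeatedly impute a cause for every
# death, then average the resulting cause-specific fractions.
draws = 5000
counts = [0, 0, 0]
for _ in range(draws):
    for probs in elicited:
        counts[sample_cause(probs)] += 1
fractions = [c / (draws * len(elicited)) for c in counts]
```

The averaged fractions converge to the column means of the elicited table, while the spread across augmented datasets carries the observer uncertainty that the traditional fixed-assignment approach discards.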

8.
《Biomarkers》2013,18(8):560-571
To explain the underlying causes of apparently stochastic disease, current research is focusing on systems biology approaches wherein individual genetic makeup and specific 'gene–environment' interactions are considered. This is an extraordinarily complex task because both the environmental exposure profiles and the specific genetic susceptibilities presumably have large variance components. In this article, the focus is on the initial steps along the path to disease outcome, namely environmental uptake, biologically available dose, and preclinical effect. The general approach is to articulate a conceptual model and identify biomarker measurements that could populate the model with hard data. Between-subject variance components from different exposure studies are used to estimate the source and magnitude of the variability of biomarker measurements. The intent is to determine the relative effects of different biological media (breath or blood), environmental compounds and their metabolites, different concentration levels, and levels of environmental exposure control. Examples are drawn from three distinct exposure biomarker studies performed by the US Environmental Protection Agency that studied aliphatic and aromatic hydrocarbons, trichloroethylene and methyl tertiary butyl ether. All results are based on empirical biomarker measurements of breath and blood from human subjects; biological specimens were collected under appropriate Institutional Review Board protocols with informed consent of the subjects. The ultimate goal of this work is to develop a framework for eventually assessing the total susceptibility ranges along the toxicological pathway from exposure to effect. The investigation showed that exposures are a greater contributor to biomarker variance than are internal biological parameters.

9.
Although the role of morphology in evolutionary theory remains a subject of debate, assessing the contributions of morphological investigation to evolutionary developmental biology (Evo-devo) is a more circumscribed issue of direct relevance to ongoing research. Historical studies of morphologically oriented researchers and the formation of the Modern Synthesis in the Anglo-American context identify a recurring theme: the synthetic theory of evolution did not capture multiple levels of biological organization. When this feature is incorporated into a philosophical framework for explaining the origin of evolutionary innovations and novelties (a core domain of inquiry in Evo-devo), two specific roles for morphology can be described: (1) the conceptualization and operational identification of the targets of explanation; and (2) the elucidation of causal interactions at higher levels of organization during ontogeny and through evolutionary time. These roles are critical components of any adequate explanation of innovation and novelty, though not exhaustive of the parts played by morphology in evolutionary investigation. They also invite reflection on what counts as an evolutionary cause in contemporary evolutionary biology.

10.
Many years of uncontrolled discharge of sewage and industrial effluent have resulted in serious contamination of much of the sediments underlying inland and nearshore coastal waters in Hong Kong by potentially toxic heavy metal and trace organic pollutants. Much has been achieved to improve control of this pollution at the source and prevent further deterioration. Nevertheless, comprehensive environmental assessment and management measures are required to ensure that any unacceptably contaminated sediment that must be dredged to facilitate infrastructural development is safely handled and disposed of. It is estimated that some 50 Mm3 of sediment classified as unacceptably contaminated may require dredging and special management elsewhere over the coming 10-year period. To facilitate improved decision making about the most appropriate disposal options for dredged sediment, Hong Kong has recently implemented a new sediment quality assessment framework under which information on the biological activity of contaminated material is considered in addition to data on chemical composition. Dredged sediment classified as unacceptably contaminated has been disposed of at a contained disposal facility at East Sha Chau since 1992. To date, over 20 Mm3 of sediment has been placed into seabed pits that are subsequently capped with clay. The site is subject to a rigorous monitoring programme that has clearly demonstrated its environmental acceptability.

11.
Pei L  Hughes MD 《Biometrics》2008,64(4):1117-1125
SUMMARY: Bridging clinical trials are sometimes designed to evaluate whether a proposed dose for use in one population, for example, children, gives similar pharmacokinetic (PK) levels, or has similar effects on a surrogate marker as an established effective dose used in another population, for example, adults. For HIV bridging trials, because of the increased risk of viral resistance to drugs at low PK levels, the goal is often to determine whether the doses used in different populations result in similar percentages of patients with low PK levels. For example, it may be desired to evaluate that a proposed pediatric dose gives approximately 10% of children with PK levels below the 10th percentile of PK levels for the established adult dose. However, the 10th percentile for the adult dose is often imprecisely estimated in studies of relatively small size. Little attention has been given to the statistical framework for such bridging studies. In this article, a formal framework for the design and analysis of quantile-based bridging studies is proposed. The methodology is then developed for normally distributed outcome measures from both frequentist and Bayesian directions. Sample size and other design considerations are discussed.
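The quantile-based bridging comparison can be sketched numerically: estimate the adult 10th percentile from a small study, then compute the fraction of pediatric PK levels falling below it. All numbers below (sample sizes, means, standard deviations) are hypothetical, and the crude order-statistic estimate stands in for the formal frequentist or Bayesian estimators the paper develops.

```python
import random
import statistics

random.seed(2)

# Hypothetical adult PK levels from a small study (n = 40): the 10th
# percentile of the established adult dose is estimated imprecisely.
adult = sorted(random.gauss(100.0, 20.0) for _ in range(40))
p10_adult = adult[int(0.10 * len(adult))]  # crude order-statistic estimate

# Hypothetical pediatric PK levels under a proposed dose (n = 200).
# The bridging question: is roughly 10% of children below the adult p10?
child = [random.gauss(95.0, 20.0) for _ in range(200)]
frac_below = sum(x < p10_adult for x in child) / len(child)
```

Repeating this with resampled adult data would show how strongly the imprecision in p10_adult propagates into the estimated pediatric fraction, which is the design problem the article formalizes.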

12.
Rundle A 《Mutation research》2006,600(1-2):23-36
Carcinogen-DNA adducts are thought to be a useful biomarker in epidemiologic studies seeking to show that environmental exposures to xenobiotics cause cancer. This paper reviews the literature in this field from an epidemiologic perspective and identifies several common problems in the epidemiologic design and analysis of these studies. Carcinogen-DNA adducts have been used in studies attempting to link xenobiotic exposures to hepatocellular carcinoma, smoking related cancers and breast cancer. Adduct measurements have been useful in further implicating aflatoxin exposure in the etiology of hepatocellular carcinoma. For smoking related cancers, associations with carcinogen-DNA adducts are commonly seen in current smokers but less so in ex- or non-smokers. In breast cancer the associations have been inconsistent and weak and there is little evidence that carcinogen-DNA adducts implicate xenobiotic exposures in the etiology of breast cancer. Methodological issues common to these studies are the use of target versus surrogate tissues and how this choice impacts control selection, disease effects on adduct levels, the time period reflected by adduct levels, the use of inappropriate statistical analyses and small sample sizes. It is unclear whether the lack of association between carcinogen-DNA adducts and cancer reflects a lack of association between the xenobiotic exposure of interest and cancer or the effects of these methodological issues. A greater focus needs to be placed on designs that allow measurements of adduct levels in tissues collected years prior to cancer diagnosis; there is little need for further hospital-based case-control studies in which adducts are measured at the time of or after diagnosis. New designs that address these issues are suggested in the paper.

13.
14.
Neural aspects of cognitive motor control

15.
Quantitative imaging has become a vital technique in biological discovery and clinical diagnostics; a plethora of tools have recently been developed to enable new and accelerated forms of biological investigation. Increasingly, the capacity for high-throughput experimentation provided by new imaging modalities, contrast techniques, microscopy tools, microfluidics and computer controlled systems shifts the experimental bottleneck from the level of physical manipulation and raw data collection to automated recognition and data processing. Yet, despite their broad importance, image analysis solutions to address these needs have been narrowly tailored. Here, we present a generalizable formulation for autonomous identification of specific biological structures that is applicable for many problems. The process flow architecture we present here utilizes standard image processing techniques and the multi-tiered application of classification models such as support vector machines (SVM). These low-level functions are readily available in a large array of image processing software packages and programming languages. Our framework is thus both easy to implement at the modular level and provides specific high-level architecture to guide the solution of more complicated image-processing problems. We demonstrate the utility of the classification routine by developing two specific classifiers as a toolset for automation and cell identification in the model organism Caenorhabditis elegans. To serve a common need for automated high-resolution imaging and behavior applications in the C. elegans research community, we contribute a ready-to-use classifier for the identification of the head of the animal under bright field imaging. Furthermore, we extend our framework to address the pervasive problem of cell-specific identification under fluorescent imaging, which is critical for biological investigation in multicellular organisms or tissues. 
Using these examples as a guide, we envision the broad utility of the framework for diverse problems across different length scales and imaging methods.
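The multi-tiered classification step described above can be illustrated with a minimal linear SVM trained by Pegasos-style sub-gradient descent on the hinge loss. Everything here is a self-contained sketch on synthetic 2-D "features": the data, parameters, and plain-Python implementation are assumptions for illustration, whereas the actual pipeline uses real image features and standard SVM libraries.

```python
import random

random.seed(0)

# Synthetic 2-D features standing in for extracted image features;
# the third component is a constant 1.0 to absorb the bias term.
pos = [(random.gauss(2.0, 0.5), random.gauss(2.0, 0.5)) for _ in range(30)]
neg = [(random.gauss(-2.0, 0.5), random.gauss(-2.0, 0.5)) for _ in range(30)]
data = ([((x, y, 1.0), +1) for (x, y) in pos]
        + [((x, y, 1.0), -1) for (x, y) in neg])

def train_linear_svm(samples, lam=0.01, epochs=50):
    """Pegasos-style stochastic sub-gradient descent on the hinge loss."""
    w = [0.0, 0.0, 0.0]
    t = 0
    for _ in range(epochs):
        random.shuffle(samples)
        for x, y in samples:
            t += 1
            eta = 1.0 / (lam * t)
            margin = y * sum(wi * xi for wi, xi in zip(w, x))
            w = [(1.0 - eta * lam) * wi for wi in w]  # regularization step
            if margin < 1.0:  # hinge-loss sub-gradient step
                w = [wi + eta * y * xi for wi, xi in zip(w, x)]
    return w

def predict(w, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) >= 0 else -1

w = train_linear_svm(list(data))
```

In a tiered pipeline, several such classifiers would be applied in sequence, e.g. first locating the animal and then identifying specific cells within it.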

16.
Generalized norms of reaction for ecological developmental biology
A standard norm of reaction (NoR) is a graphical depiction of the phenotypic value of some trait of an individual genotype in a population as a function of an environmental parameter. NoRs thus depict the phenotypic plasticity of a trait. The topological properties of NoRs for sets of different genotypes can be used to infer the presence of (nonlinear) genotype-environment interactions. Although it is clear that many NoRs are adaptive, it is not yet settled whether their evolutionary etiology should be explained by selection on the mean phenotypic trait values in different environments or whether there are specific genes conferring plasticity. If the second alternative is true, the NoR is itself an object of selection. Generalized NoRs depict plasticity at the level of populations or subspecies within a species, species within a genus, or taxa at higher levels. Historically, generalized NoRs have routinely been drawn, though rarely explicitly recognized as such. Such generalized NoRs can be used to make evolutionary inferences at higher taxonomic levels in a way analogous to how standard NoRs are used for microevolutionary inferences.

17.
Eotaxin and eosinophil recruitment: implications for human disease
Eosinophils have been implicated in a broad range of diseases, notably allergic conditions (for example, asthma, rhinitis and atopic dermatitis) and other inflammatory disorders (for example, inflammatory bowel disease, eosinophilic gastroenteritis and pneumonia). These disease states are characterized by an accumulation of eosinophils in tissues. Severe tissue damage ensues as eosinophils release their highly cytotoxic granular proteins. Defining the mechanisms that control recruitment of eosinophils to tissues is fundamental to understanding these disease processes and provides targets for novel drug therapy. An important discovery in this context was the identification of an eosinophil-specific chemoattractant, eotaxin. Over the past six years there has been intensive investigation into the biological effects of eotaxin and its role in specific disease processes and this is the subject of this review.

18.

Purpose

In this two-part paper (Background and Initial Assumptions (part 1) and Results of Survey Research (part 2)), we present surveys whose main objective is to determine whether, and to what extent, the life cycle assessment (LCA) technique is used for the identification and assessment of environmental aspects in environmental management systems (EMS) and whether there are any differences in this respect between the companies and countries analysed.

Methods

The survey research was carried out using the computer assisted self-administered interviewing method among selected Polish, German and Swedish organisations which implement EMS in accordance with the requirements of ISO 14001 and/or the EMAS regulation.

Results

The organisations investigated, regardless of their country, are dominated by qualitative and semi-quantitative techniques of assessment and identification of environmental aspects. LCA was used sporadically, although some differences can be observed between the countries analysed.

Conclusions

The environmental managers accustomed to traditional qualitative and semi-quantitative solutions have not been given preparation to enable them to understand and adopt different approaches such as LCA. On the other hand, representatives of the organisations investigated declared that they were ready to accept an even longer timescale for the identification and assessment processes relating to environmental aspects, which represents a potential opportunity for LCA. The more precise understanding and definition of environmental problems required by LCA would represent a novelty for environmental managers. In practice, environmental problems are defined in a general sense and rather ambiguously, as this level of detail is sufficient in the context of qualitative and semi-quantitative techniques commonly used for the identification and assessment of environmental aspects.

19.
BACKGROUND: Toxicology studies utilizing animals and in vitro cellular or tissue preparations have been used to study the toxic effects and mechanism of action of drugs and chemicals and to determine the effective and safe dose of drugs in humans and the risk of toxicity from chemical exposures. Testing in animals could be improved if animal dosing using the mg/kg basis was abandoned and drugs and chemicals were administered to compare the effects of pharmacokinetically and toxicokinetically equivalent serum levels in the animal model and human. Because alert physicians or epidemiology studies, not animal studies, have discovered most human teratogens and toxicities in children, animal studies play a minor role in discovering teratogens and agents that are deleterious to infants and children. In vitro studies play an even less important role, although they are helpful in describing the cellular or tissue effects of the drugs or chemicals and their mechanism of action. One cannot determine the magnitude of human risks from in vitro studies when they are the only source of toxicology data. METHODS: Toxicology studies on adult animals are carried out by pharmaceutical companies, chemical companies, the Food and Drug Administration (FDA), many laboratories at the National Institutes of Health, and scientific investigators in laboratories throughout the world. Although there is a vast number of animal toxicology studies carried out on pregnant animals and adult animals, there is a paucity of animal studies utilizing newborn, infant, and juvenile animals. This deficiency is compounded by the fact that there are very few toxicology studies carried out in children. That is one reason why pregnant women and children are referred to as "therapeutic orphans." RESULTS: When animal studies are carried out with newborn and developing animals, the results demonstrate that generalizations are less applicable and less predictable than the toxicology studies in pregnant animals.
Although many studies show that infants and developing animals may have difficulty in metabolizing drugs and are more vulnerable to the toxic effects of environmental chemicals, there are exceptions that indicate that infants and developing animals may be less vulnerable and more resilient to some drugs and chemicals. In other words, the generalization indicating that developing animals are always more sensitive to environmental toxicants is not valid. For animal toxicology studies to be useful, animal studies have to utilize modern concepts of pharmacokinetics and toxicokinetics, as well as "mechanism of action" (MOA) studies to determine whether animal data can be utilized for determining human risk. One example is the inability to determine carcinogenic risks in humans for some drugs and chemicals that produce tumors in rodents when the oncogenesis is the result of peroxisome proliferation, a reaction that is of diminished importance in humans. CONCLUSIONS: Scientists can utilize animal studies to study the toxicokinetic and toxicodynamic aspects of drugs and environmental toxicants. But they have to be carried out with the most modern techniques and interpreted with the highest level of scholarship and objectivity. Threshold exposures, no-adverse-effect level (NOAEL) exposures, and toxic effects can be determined in animals, but have to be interpreted with caution when applying them to the human. Adult problems in growth, endocrine dysfunction, neurobehavioral abnormalities, and oncogenesis may be related to exposures to drugs, chemicals, and physical agents during development and may be fruitful areas for investigation. Maximum permissible exposures have to be based on data, not on generalizations that are applied to all drugs and chemicals. Epidemiology studies are still the best methodology for determining the human risk and the effects of environmental toxicants. Carrying out these focused studies in developing humans will be difficult.
Animal studies may be our only alternative for answering many questions with regard to specific postnatal developmental vulnerabilities.

20.
Inferring which protein species have been detected in bottom-up proteomics experiments has been a challenging problem for which solutions have been maturing over the past decade. While many inference approaches now function well in isolation, comparing and reconciling the results generated across different tools remains difficult. It presently stands as one of the greatest barriers in collaborative efforts such as the Human Proteome Project and public repositories such as the PRoteomics IDEntifications (PRIDE) database. Here we present a framework for reporting protein identifications that seeks to improve capabilities for comparing results generated by different inference tools. This framework standardizes the terminology for describing protein identification results, associated with the HUPO-Proteomics Standards Initiative (PSI) mzIdentML standard, while still allowing for differing methodologies to reach that final state. It is proposed that developers of software for reporting identification results will adopt this terminology in their outputs. While the new terminology does not require any changes to the core mzIdentML model, it represents a significant change in practice, and, as such, the rules will be released via a new version of the mzIdentML specification (version 1.2) so that consumers of files are able to determine whether the new guidelines have been adopted by export software.
