1.
A risk analysis of in utero caffeine exposure is presented utilizing epidemiological studies and animal studies dealing with congenital malformation, pregnancy loss, and weight reduction. These effects are of interest to teratologists, because animal studies are useful in their evaluation. Many of the epidemiology studies did not evaluate the impact of the "pregnancy signal," which identifies healthy pregnancies and permits investigators to identify subjects with low pregnancy risks. The spontaneous abortion epidemiology studies were inconsistent, and the majority did not consider the confounding introduced by ignoring the pregnancy signal. The animal studies do not support the concept that caffeine is an abortifacient for the wide range of human caffeine exposures. Almost all the congenital malformation epidemiology studies were negative. Animal pharmacokinetic studies indicate that the teratogenic plasma level of caffeine has to reach or exceed 60 μg/ml, which is not attainable from ingesting large amounts of caffeine in foods and beverages. No epidemiological study described the "caffeine teratogenic syndrome." Six of the 17 recent epidemiology studies dealing with the risk of caffeine and fetal weight reduction were negative. Seven of the positive studies had growth reductions that were clinically insignificant, and none of the studies cited the animal literature. Analysis of caffeine's reproductive toxicity considers reproducibility and plausibility of clinical, epidemiological, and animal data. Moderate or even high amounts of beverages and foods containing caffeine do not increase the risks of congenital malformations, miscarriage, or growth retardation. Pharmacokinetic studies markedly improve the ability to perform the risk analyses.
2.
Notions of mechanism, emergence, reduction and explanation are all tied to levels of analysis. I cover the relationship between lower and higher levels, suggest a level of mechanism approach for neuroscience in which the components of a mechanism can themselves be further decomposed and argue that scientists' goals are best realized by focusing on pragmatic concerns rather than on metaphysical claims about what is 'real'. Inexplicably, neuroscientists are enchanted by both reduction and emergence. A fascination with reduction is misplaced given that theory is neither sufficiently developed nor formal to allow it, whereas metaphysical claims of emergence bring physicalism into question. Moreover, neuroscience's existence as a discipline is owed to higher-level concepts that prove useful in practice. Claims of biological plausibility are shown to be incoherent from a level of mechanism view and more generally are vacuous. Instead, the relevant findings to address should be specified so that model selection procedures can adjudicate between competing accounts. Model selection can help reduce theoretical confusions and direct empirical investigations. Although measures themselves, such as behaviour, blood-oxygen-level-dependent (BOLD) and single-unit recordings, are not levels of analysis, like levels, no measure is fundamental and understanding how measures relate can hasten scientific progress. This article is part of the theme issue 'Key relationships between non-invasive functional neuroimaging and the underlying neuronal activity'.
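To make the role of model selection concrete, here is a minimal illustrative sketch (not from the article; all data and parameter counts are hypothetical) of how a criterion such as AIC adjudicates between two competing accounts by trading off goodness of fit against model complexity:

```python
def aic(log_likelihood, n_params):
    # Akaike information criterion: 2k - 2*lnL; lower values are preferred.
    return 2 * n_params - 2 * log_likelihood

# Hypothetical fits of two competing models to the same behavioural data:
# a simpler account with 3 free parameters and a richer one with 7.
simple_aic = aic(log_likelihood=-105.2, n_params=3)    # 216.4
complex_aic = aic(log_likelihood=-101.8, n_params=7)   # 217.6

preferred = "simple" if simple_aic < complex_aic else "complex"
print(preferred)  # here the extra parameters do not buy enough additional fit
```

In this toy comparison the richer model fits slightly better but is penalized for its extra parameters, so the simpler account is selected.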
3.
Challenges to low-dose linearity and other default assumptions in cancer risk assessment and the limitations associated with NOAELs, LOAELs, and constant uncertainty factor values in the evaluation of noncancer health effects have stimulated the continued evolution of risk assessment methodologies. The increasing need for more realistic estimates of the dose-response relationship, better uncertainty characterization, and greater utilization of cost-benefit analyses have also contributed to this evolution. "Comprehensive Realism" is an emerging quantitative weight-of-evidence based risk assessment methodology for both cancer and noncancer health effects which utilizes probability distributions and decision analysis techniques to reflect more of the relevant human exposure data, more of the available and pertinent human and animal dose-response data, and the current state of knowledge about the relative plausibility of alternative dose-response analyses. A tree (like a decision tree and a probability tree) is used to decompose the dose-response assessment into component factors, to provide a structure for explicitly considering multiple alternatives for each factor, and to explicitly incorporate the current state of knowledge about the relative plausibility of these alternatives. Groundbreaking work has demonstrated the feasibility of weight-of-evidence based distributional characterizations, and provided initial examples. Computer software implementations are also available.
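The probability-tree idea can be sketched in a few lines. This is an illustrative toy only, not the methodology's actual software: the branch weights, slopes, and dose are hypothetical, and each branch stands for one alternative dose-response analysis weighted by its judged plausibility:

```python
# Toy probability tree over alternative dose-response analyses.
# Each branch: (plausibility weight, risk-per-unit-dose slope under that alternative).
branches = [
    (0.5, 0.0),     # threshold model: no risk at low doses
    (0.3, 1e-4),    # sublinear low-dose extrapolation
    (0.2, 5e-4),    # linear no-threshold extrapolation
]

dose = 2.0  # hypothetical exposure, arbitrary units

weights = [w for w, _ in branches]
assert abs(sum(weights) - 1.0) < 1e-9  # weights must form a probability distribution

# Weight-of-evidence expected risk: probability-weighted mean across branches.
expected_risk = sum(w * slope * dose for w, slope in branches)
print(expected_risk)  # 0.00026
```

Rather than committing to a single default (e.g. low-dose linearity), the tree carries all considered alternatives forward, so the output is a plausibility-weighted characterization instead of a single point estimate.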
4.
In this paper, I address the question as to why participants tend to respond realistically to situations and events portrayed within an immersive virtual reality system. The idea is put forward, based on the experience of a large number of experimental studies, that there are two orthogonal components that contribute to this realistic response. The first is 'being there', often called 'presence', the qualia of having a sensation of being in a real place. We call this place illusion (PI). Second, plausibility illusion (Psi) refers to the illusion that the scenario being depicted is actually occurring. In the case of both PI and Psi, the participant knows for sure that they are not 'there' and that the events are not occurring. PI is constrained by the sensorimotor contingencies afforded by the virtual reality system. Psi is determined by the extent to which the system can produce events that directly relate to the participant, and by the overall credibility of the scenario being depicted in comparison with expectations. We argue that when both PI and Psi occur, participants will respond realistically to the virtual reality.
5.
Epidemiologic studies have shown significant and consistent associations between ambient particulate matter (PM) and a variety of adverse cardiopulmonary health consequences. These associations are strongest in elderly sub-populations identified with preexistent cardiopulmonary diseases. While the consistency of the findings appears to warrant serious public health concern, the lack of clear mechanistic underpinnings ("biological plausibility") has prompted skepticism of the epidemiology. Toxicologists have embraced the challenge to address the issue of biologic plausibility in the context of the PM epidemiology and are vigorously investigating both causality and mechanisms by which effects might be mediated at the molecular, cellular, and organismic levels. Among the several hypotheses being touted, that involving constituent bioavailable transition metals appears most coherent with the epidemiologic observations. It provides a conceptual pathway that could have clinical implications. To date, the toxicology has involved relatively high doses of PM-metal to the lung of animal models and certainly more remains to be determined as to metal-specific biological roles at low environmental doses, especially in the context of host susceptibility. Nevertheless, the cumulative evidence is sufficiently cohesive to support our contention that metals are likely one important causal factor in the pathophysiological cascade leading to PM health effects. However, it would be premature at this point to consider metals in any risk management strategy for PM.
6.
Zrzavý’s arguments against the critical analyses of data supporting the Ecdysozoa hypothesis (Wägele et al., J. Zool. Syst. Evol. Res. 37, 211–223, 1999) are discussed. Zrzavý does not understand that the same basic principle of a priori weighting can be applied to sequence data as well as to morphological characters. Quality of evidence is the same as probability of homology, which is estimated from the number of discernible identical details. In sequences it is the number of identical nucleotides. Spectral analyses, dismissed by Zrzavý, visualize patterns of putative homologies present in alignments and also the number of positions supporting splits by chance alone. In cases in which old phylogenetic signals for a given monophylum are eroded in a gene, plesiomorphies and chance patterns will have strong influence on tree topologies and spectra. If plesiomorphies are a cause of errors, the addition of taxa that shorten internal branches is a remedy, although in many cases such taxa may be extinct. The place of a priori estimations of data quality in a sequence of steps necessary for a phylogenetic analysis is shown. Morphological complexity is used as a proxy for a complex genetic basis and is used as a major criterion to compare characters of the Ecdysozoa and the Articulata. The details associated with the character ‘complex cuticle’ are discussed. Neither moulting nor the known components of the cuticle are novelties occurring only in Ecdysozoa. A published total evidence analysis is used to show that the number of coded characters does not necessarily reflect the quality of the data set. Zrzavý’s misunderstanding of the role of evolutionary scenarios is clarified, and the importance of the use of additional biological data for plausibility arguments is explained. Plausibility arguments in favour of the Articulata hypothesis rely on facts found in functional morphology and in the fossil record. Zrzavý’s critique follows the current mainstream but does not uncover logical mistakes or erroneous data analyses in the work of Wägele et al. (1999). It is concluded that the Articulata hypothesis is a well-founded alternative to the Ecdysozoa; it is based on much better morphological evidence and supported by plausibility arguments that currently do not exist for the Ecdysozoa.
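The idea of counting alignment positions that support a split can be illustrated with a toy example (the alignment, taxon names, and function below are hypothetical, not from the paper; real spectral analysis additionally corrects for chance support):

```python
# Toy alignment: four taxa, five nucleotide positions.
alignment = {
    "taxonA": "AAGTC",
    "taxonB": "AAGTT",
    "taxonC": "GCGTC",
    "taxonD": "GCGTT",
}
split = ({"taxonA", "taxonB"}, {"taxonC", "taxonD"})

def split_support(alignment, split):
    """Count positions whose states are uniform within each side of the
    split but differ between the two sides (putative homologies)."""
    left, right = split
    length = len(next(iter(alignment.values())))
    support = 0
    for i in range(length):
        left_states = {alignment[t][i] for t in left}
        right_states = {alignment[t][i] for t in right}
        if len(left_states) == 1 == len(right_states) and left_states != right_states:
            support += 1
    return support

print(split_support(alignment, split))  # 2: positions 0 and 1 support the split
```

Comparing such counts across all possible splits, and against the number expected by chance alone, is the intuition behind the spectral analyses defended in the abstract.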
7.
According to most current ideas, lipid vesicles were the first cell-like aggregates. However, the presence of long-chain fatty acids in the pre-enzymatic era is highly implausible, as simulation experiments and organic analyses of meteorites demonstrate. Moreover, the formation of a double-layer membrane in an aqueous environment requires quite homogeneous mixtures of cylindrical lipids. Modern plasma membranes contain both proteins and lipids, and it is more plausible that, in primordial aggregates, lipid-like molecules formed a membranous interface that was mainly peptidic in nature.