Similar Documents
20 similar documents found (search time: 31 ms)
1.
Research concerned with the psychology and physiology of interoceptive processes is reviewed with the purpose of evaluating theoretical formulations of learned visceral control. Basic animal research in interoception provides relevant information; however, much research dealing directly with interoception and learned control is inadequate due either to inappropriate measurement of interoceptive ability or to poor experimental design. The two primary theoretical orientations linking interoception and learned visceral control differ according to the role ascribed to external feedback; the first views feedback as an enhancement of interoceptive cues, the second as an enhancement of exteroceptive cues. These theories are discussed with regard to recent investigations of learned visceral control.

3.
The paper examines the links between physiological measurements and cognitive and emotional functioning. Because the operator is the key agent in charge of complex systems, defining metrics able to predict operator performance is a major challenge. Measuring physiological state is a promising approach, but it demands precise interpretation; in particular, few studies compare autonomic nervous system reactivity across specific cognitive processes during task performance, and task-related psychological stress is often ignored. We compared physiological parameters recorded from 24 healthy subjects performing two neuropsychological tasks: a dynamic task requiring problem solving in a world that continually evolves over time, and a logical task representative of the cognitive processes operators perform in everyday problem solving. Results showed that the mean pupil diameter change was higher during the dynamic task; conversely, heart rate was more elevated during the logical task. Finally, systolic blood pressure appeared strongly sensitive to psychological stress. Better accounting for the precise influence of a given cognitive activity, together with workload and task-induced psychological stress during task performance, is a promising way to monitor operators in complex working situations and to detect mental overload or stress factors conducive to error.

4.
Adequate design of control rooms is very important for increasing the reliability of the operators working in them. It is therefore necessary to choose the best solution for improving the fit between operators and the elements of the control room while addressing problems of illumination, thermal climate and noise. For this purpose we analyze the impact on operator reliability of the execution time of the operator's information-processing tasks, motor and logical tasks, the operator's functional state, workplace risk factors and indoor environment parameters. Methodological approaches for these five groups are formulated and used in experimental research to determine their quantitative levels. The results show the following reliability levels for specific indicators: reaction time, very good; coefficient of task complexity, optimal; operator error caused by stress, very good; workplace risk factor, critical; and indoor environment, good. Because the functional appropriateness and efficiency of the operator's workplace were inadequate, the following recommendation was made: signaling and control elements should be placed on the central part of the control desk, because the poor construction of the control desk causes an uneven burden on various parts of the operator's body.

5.
In this mini review, we capture the latest progress of applying artificial intelligence (AI) techniques based on deep learning architectures to molecular de novo design with a focus on integration with experimental validation. We will cover the progress and experimental validation of novel generative algorithms, the validation of QSAR models and how AI-based molecular de novo design is starting to become connected with chemistry automation. While progress has been made in the last few years, it is still early days. The experimental validations conducted thus far should be considered proof-of-principle, providing confidence that the field is moving in the right direction.

6.
Knudsen B, Miyamoto MM. Genetics. 2007;176(4):2335-2342.
Coalescent theory provides a powerful framework for estimating the evolutionary, demographic, and genetic parameters of a population from a small sample of individuals. Current coalescent models have largely focused on population genetic factors (e.g., mutation, population growth, and migration) rather than on the effects of experimental design and error. This study develops a new coalescent/mutation model that accounts for unobserved polymorphisms due to missing data, sequence errors, and multiple reads for diploid individuals. The importance of accommodating these effects of experimental design and error is illustrated with evolutionary simulations and a real data set from a population of the California sea hare. In particular, a failure to account for sequence errors can lead to overestimated mutation rates, inflated coalescent times, and inappropriate conclusions about the population. This current model can now serve as a starting point for the development of newer models with additional experimental and population genetic factors. It is currently implemented as a maximum-likelihood method, but this model may also serve as the basis for the development of Bayesian approaches that incorporate experimental design and error.
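The effect described here, unmodeled sequence errors inflating mutation-rate estimates, can be illustrated with a toy calculation (a sketch using Watterson's estimator and invented counts, not the paper's likelihood model):

```python
# Sketch (not the paper's model): how unmodeled sequencing errors
# inflate Watterson's estimator of the mutation rate theta.
# All counts below are illustrative assumptions.

def watterson_theta(segregating_sites, n_sequences):
    """Watterson's estimator: theta_W = S / a_n, with a_n = sum_{i=1}^{n-1} 1/i."""
    a_n = sum(1.0 / i for i in range(1, n_sequences))
    return segregating_sites / a_n

n = 10            # sampled sequences
true_sites = 25   # truly polymorphic sites
error_sites = 5   # singleton "polymorphisms" that are really sequencing errors

naive = watterson_theta(true_sites + error_sites, n)
corrected = watterson_theta(true_sites, n)
print(f"naive theta_W:     {naive:.2f}")
print(f"corrected theta_W: {corrected:.2f}")  # errors removed -> lower estimate
```

The gap between the two estimates grows with the error rate, which is the overestimation the abstract warns about.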

7.
The circulatory control system is driven partly by factors relating to the arterial side and partly by factors relating to the venous side. Students are generally provided with a conceptually clear account of the arterial side, based on sound homeostatic mechanisms of negative feedback from a well-defined error signal, arterial pressure. However, on the venous side, teaching is often based on the notion of venous return, a concept that, as normally presented, is imprecise and intangible, a frequent cause of confusion that may lead to errors of clinical practice. Although one can trace these misconceptions back to some of Guyton's publications, Guyton himself was well aware of the complexities of venous resistance and capacitance but has not always been well served by subsequent misinterpretation. The fundamental problem with venous return that makes it inappropriate for controlling the circulation is that it lacks the essential requirement of being an error signal. We propose instead a new variable, venous excess, which represents the accumulation of any mismatch between the rate of blood entering the great veins and the rate of leaving, the cardiac output. As well as being directly observable without intervention (in a patient's jugular vein), it meets all of the requirements of an error signal: via the Starling mechanism it stimulates cardiac output, regulates venous compliance, and in the longer term is an important determinant of fluid intake and excretion, and these effects act to reduce the original perturbation. Based on this concept, we suggest a simple and secure basis for teaching the control of the circulation that avoids undue reliance on entities that are difficult to specify or measure and emphasizes the role of feedback and the similarities between the arterial and venous mechanisms.
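The integrator behavior of the proposed error signal can be sketched numerically (all parameters below are illustrative assumptions, not physiological values):

```python
# Toy integrator model of "venous excess" as an error signal. Venous
# excess V accumulates any mismatch between venous inflow and cardiac
# output; a Starling-like gain makes cardiac output rise with V, which
# drives the mismatch back toward zero.

dt = 0.01           # time step (min)
k_starling = 2.0    # cardiac-output sensitivity to venous excess (assumed)
inflow = 5.0        # venous return into the great veins (L/min)
V = 0.5             # initial venous excess (L)

for _ in range(2000):
    cardiac_output = k_starling * V
    V += (inflow - cardiac_output) * dt   # mismatch accumulates in V

# At equilibrium cardiac output matches inflow, so V -> inflow / k_starling.
print(f"steady-state venous excess: {V:.3f} L")     # ~2.5 L here
print(f"cardiac output: {k_starling * V:.3f} L/min")
```

The feedback loop settles where output equals inflow, mirroring how the abstract argues venous excess acts to reduce the original perturbation.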

8.
In artificial intelligence, abstraction is commonly used to account for the use of various levels of detail in a given representation language, or for the ability to change from one level to another while preserving useful properties. Abstraction has mainly been studied in problem solving, theorem proving, knowledge representation (in particular for spatial and temporal reasoning) and machine learning. In such contexts, abstraction is defined as a mapping between formalisms that reduces the computational complexity of the task at hand. By analysing the notion of abstraction from an information-quantity point of view, we pinpoint the differences and the complementary roles of reformulation and abstraction in any representation change. We contribute to extending existing semantic theories of abstraction to be grounded in perception, where the notion of information quantity is easier to characterize formally. In the author's view, abstraction is best represented using abstraction operators, as they provide semantics for classifying different abstractions and support the automation of representation changes. The usefulness of a grounded theory of abstraction is illustrated in the cartography domain. Finally, the importance of explicitly representing abstraction for designing more autonomous and adaptive systems is discussed.

9.
In this paper, a recently developed model of cancer growth at the cell-population level under combined immuno- and chemotherapy is used to develop a reactive (feedback) mixed treatment strategy. The feedback design proposed here is based on nonlinear constrained model predictive control, together with an adaptation scheme that compensates for the effects of unavoidable modeling uncertainties. The effectiveness of the proposed strategy is demonstrated on realistic human data, showing the advantage of treatment in feedback form as well as the relevance of the adaptation scheme in handling uncertainties and modeling errors. A new treatment strategy defined by an original optimal control problem formulation is also proposed. This new formulation offers particularly interesting possibilities, since it may lead to tumor regression with a better health-indicator profile.
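The receding-horizon idea behind such a feedback treatment design can be sketched on a toy model (the logistic dynamics, cost weights and dose grid below are invented for illustration and are not the paper's model):

```python
# Minimal receding-horizon (model predictive) control sketch on a toy
# logistic tumor model. At each step the controller simulates candidate
# doses over a short horizon, applies the best first move, and re-plans.

def step(x, dose, r=0.1, cap=1.0, kill=0.3):
    """One day of toy tumor dynamics: logistic growth minus drug kill."""
    return max(0.0, x + r * x * (1 - x / cap) - kill * dose * x)

def mpc_dose(x, horizon=5, doses=(0.0, 0.5, 1.0), w_drug=0.02):
    """Pick the constant dose over the horizon with the lowest predicted cost."""
    best_dose, best_cost = 0.0, float("inf")
    for d in doses:
        xp, cost = x, 0.0
        for _ in range(horizon):
            xp = step(xp, d)
            cost += xp + w_drug * d      # tumor burden plus drug penalty
        if cost < best_cost:
            best_dose, best_cost = d, cost
    return best_dose

x = 0.6                       # initial (normalized) tumor burden
for day in range(60):
    x = step(x, mpc_dose(x))  # apply only the first move, then re-plan
print(f"final burden: {x:.4f}")
```

Even this crude controller backs off the drug once the burden is low, which hints at why the feedback form can trade tumor regression against a health-indicator penalty.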

10.
Making faultless complex objects from potentially faulty building blocks is a fundamental challenge in computer engineering, nanotechnology and synthetic biology. Here, we show for the first time how recursion can be used to address this challenge and demonstrate a recursive procedure that constructs error‐free DNA molecules and their libraries from error‐prone oligonucleotides. Divide and Conquer (D&C), the quintessential recursive problem‐solving technique, is applied in silico to divide the target DNA sequence into overlapping oligonucleotides short enough to be synthesized directly, albeit with errors; error‐prone oligonucleotides are recursively combined in vitro, forming error‐prone DNA molecules; error‐free fragments of these molecules are then identified, extracted and used as new, typically longer and more accurate, inputs to another iteration of the recursive construction procedure; the entire process repeats until an error‐free target molecule is formed. Our recursive construction procedure surpasses existing methods for de novo DNA synthesis in speed, precision, amenability to automation, ease of combining synthetic and natural DNA fragments, and ability to construct designer DNA libraries. It thus provides a novel and robust foundation for the design and construction of synthetic biological molecules and organisms.
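The recursive procedure can be sketched in silico. This is a simplified stand-in: the error-free selection step below cheats by comparing against the known target, whereas the actual protocol identifies error-free fragments experimentally, and the error rate and fragment length are invented:

```python
import random

# In-silico sketch of the recursive Divide-and-Conquer idea: split the
# target until fragments are short enough to "synthesize" (simulated here
# with a per-base error rate), keep only error-free fragments, recombine.

ERROR_RATE = 0.01   # per-base synthesis error probability (assumed)
MAX_OLIGO = 50      # longest directly synthesizable fragment (assumed)
BASES = "ACGT"

def synthesize(seq, rng):
    """Error-prone synthesis: each base may be replaced with a random one."""
    return "".join(rng.choice(BASES) if rng.random() < ERROR_RATE else b
                   for b in seq)

def construct(target, rng):
    """Recursively build an error-free copy of `target`."""
    if len(target) <= MAX_OLIGO:
        while True:                  # retry synthesis until error-free
            attempt = synthesize(target, rng)
            if attempt == target:    # stand-in for experimental selection
                return attempt
    mid = len(target) // 2           # divide ...
    return construct(target[:mid], rng) + construct(target[mid:], rng)  # ... and conquer

rng = random.Random(0)
target = "".join(rng.choice(BASES) for _ in range(400))
product = construct(target, rng)
print(product == target)  # True: recursion yields an error-free molecule
```

The key property the sketch captures is that per-fragment retry cost stays bounded even as the full target grows, because errors are filtered at every level of the recursion.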

11.
A bio-robot system is an animal equipped with a Brain-Computer Interface (BCI), through which outer stimulation is delivered directly into the animal's brain to control its behavior. The development of bio-robots has been limited by their dependency on real-time guidance from human operators, and because of the inherent difficulties, no feasible method for the automatic control of bio-robots exists yet. In this paper, we propose a new method to realize automatic navigation for bio-robots. A General Regression Neural Network (GRNN) is adopted to analyze and model the control procedure of human operations. Compared with traditional approaches based on explicit control rules, our algorithm learns the control process and imitates human decision-making to steer the rat-robot automatically. In real-time navigation experiments, our method successfully controls bio-robots to follow given paths automatically and precisely. This work is significant for future applications of bio-robots and provides a new way to realize hybrid intelligent systems that combine artificial intelligence with natural biological intelligence.
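A GRNN is essentially Gaussian-kernel regression over stored training examples; a minimal sketch on invented 1-D toy data (not the rat-robot data) shows the mechanism:

```python
import math

# Minimal General Regression Neural Network (Nadaraya-Watson kernel
# regression) in pure Python. The training pairs here are invented toy
# data standing in for (observed state -> stimulation command) examples.

def grnn_predict(x, samples, targets, sigma=0.5):
    """Gaussian-kernel weighted average of the stored training targets."""
    weights = [math.exp(-((x - xi) ** 2) / (2 * sigma ** 2)) for xi in samples]
    return sum(w * y for w, y in zip(weights, targets)) / sum(weights)

# Toy demonstration: learn y = x^2 from a few noiseless examples.
samples = [0.0, 0.5, 1.0, 1.5, 2.0]
targets = [x ** 2 for x in samples]

print(grnn_predict(1.0, samples, targets, sigma=0.1))   # close to 1.0
print(grnn_predict(1.75, samples, targets, sigma=0.25)) # between 2.25 and 4.0
```

Because the GRNN simply interpolates stored human decisions, it needs no explicit control rules, which matches the contrast the abstract draws with rule-based approaches.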

12.
Nicholas Humphrey's social intelligence hypothesis proposed that the major engine of primate cognitive evolution was social competition. Lev Vygotsky also emphasized the social dimension of intelligence, but he focused on human primates and cultural things such as collaboration, communication and teaching. A reasonable proposal is that primate cognition in general was driven mainly by social competition, but beyond that the unique aspects of human cognition were driven by, or even constituted by, social cooperation. In the present paper, we provide evidence for this Vygotskian intelligence hypothesis by comparing the social-cognitive skills of great apes with those of young human children in several domains of activity involving cooperation and communication with others. We argue, finally, that regular participation in cooperative, cultural interactions during ontogeny leads children to construct uniquely powerful forms of perspectival cognitive representation.

13.
Optimisation of the anaerobic digestion of agricultural resources (total citations: 7; self-citations: 1; citations by others: 7)
It is in the interest of operators of anaerobic digestion plants to maximise methane production while concomitantly reducing the chemical oxygen demand of the digested material. Although the production of biogas through anaerobic digestion is not a new idea, commercial anaerobic digestion processes often operate well below their optimal performance due to a variety of factors. This paper reviews current optimisation techniques associated with anaerobic digestion and suggests possible areas for improvement, including the basic design considerations of single- or multi-stage reactor configuration; the type, power and duration of the mixing regime; and the retention of active microbial biomass within the reactor. Optimisation of environmental conditions within the digester, such as temperature, pH, buffering capacity and fatty acid concentrations, is also discussed. The methane-producing potential of various agriculturally sourced feedstocks has been examined, as have the advantages of co-digestion to improve carbon-to-nitrogen ratios and of pre-treatments and additives to improve hydrolysis rates or supplement essential nutrients that may be limiting. However, perhaps the greatest shortfall in biogas production is the lack of reliable sensory equipment to monitor key parameters and of suitable, parallelised control systems to ensure that the process continually operates at optimal performance. Modern techniques such as software sensors and powerful, flexible controllers are capable of solving these problems. A direct comparison can be made here with, for instance, oil refineries, where a more mature technology uses continuous in situ monitoring and associated feedback procedures to routinely deliver continuous, optimal performance.

14.
A brief history of the development of mathematical models of the cardiovascular system is presented. Until the advent of computers, very little modeling of transient physiological phenomena was done, but this is now commonplace. The problem of stability in complex physiological models fortunately is averted by the fact that the physiological controls are themselves highly stable. The reason for this is that evolution has eliminated unstable feedback loops because they are lethal. Indeed, enough safety factor has been provided in the design of the body so that even poor mathematical models are often quite stable. An especially important use of complex cardiovascular models has been to derive new concepts of cardiovascular function. One such concept is the “principle of infinite gain” for long-term control of arterial pressure, which states that the long-term level of arterial pressure is controlled by a balance between the fluid intake and the output of fluid by the kidneys, not by the level of total peripheral resistance as has been a long-standing misconception based on acute rather than chronic animal experiments.
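The "principle of infinite gain" can be illustrated with a toy renal function curve (the curve shape and all numbers below are invented for illustration): at steady state, renal output must equal fluid intake, which pins arterial pressure regardless of peripheral resistance.

```python
# Toy illustration of the "principle of infinite gain": the steady-state
# arterial pressure is set only by where renal fluid output balances
# fluid intake, so peripheral resistance drops out of the equilibrium.

def renal_output(pressure, k=0.5, threshold=60.0):
    """Assumed linear renal function curve above a pressure threshold."""
    return max(0.0, k * (pressure - threshold))

def steady_state_pressure(intake, k=0.5, threshold=60.0):
    """Solve renal_output(P) == intake for P."""
    return threshold + intake / k

for intake in (10.0, 20.0):
    p = steady_state_pressure(intake)
    print(f"intake {intake} -> pressure {p} (renal output {renal_output(p)})")
# Doubling peripheral resistance changes nothing in this balance:
# pressure settles wherever renal output matches intake.
```

Any pressure above the equilibrium makes output exceed intake (volume falls, pressure falls), and vice versa; that one-sided balance is what gives the mechanism its effectively infinite gain.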

15.
Feedback has a powerful influence on learning, but it is also expensive to provide. In large classes it may even be impossible for instructors to provide individualized feedback. Peer assessment is one way to provide personalized feedback that scales to large classes. Besides these obvious logistical benefits, it has been conjectured that students also learn from the practice of peer assessment. However, this has never been conclusively demonstrated. Using an online educational platform that we developed, we conducted an in-class matched-set, randomized crossover experiment with high power to detect small effects. We establish that peer assessment causes a small but significant gain in student achievement. Our study also demonstrates the potential of web-based platforms to facilitate the design of high-quality experiments to identify small effects that were previously not detectable.
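A matched-set design of this kind supports paired analyses; a sketch of a sign-flip randomization test on fabricated per-pair score differences (not the study's data or its actual analysis) shows the idea:

```python
import random

# Sketch of a matched-pairs randomization test: under the null, each
# pair's (treatment - control) difference is equally likely to have
# either sign, so we compare the observed mean against sign-flipped
# resamples. The difference values below are fabricated.

def paired_permutation_p(diffs, n_perm=10000, seed=0):
    """Two-sided p-value from randomly flipping the sign of each pair."""
    rng = random.Random(seed)
    observed = abs(sum(diffs) / len(diffs))
    hits = 0
    for _ in range(n_perm):
        flipped = sum(d if rng.random() < 0.5 else -d for d in diffs)
        if abs(flipped / len(diffs)) >= observed:
            hits += 1
    return hits / n_perm

# Fabricated per-pair achievement differences (treatment - control):
diffs = [0.4, 0.1, 0.3, -0.2, 0.5, 0.2, 0.1, 0.3, -0.1, 0.4]
print(f"p = {paired_permutation_p(diffs):.4f}")
```

Matching pairs before randomizing removes between-student variance from the comparison, which is what gives such designs high power to detect small effects.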

16.
Biomonitoring has become a key concept in environmental management since it is the most ecologically-relevant means for assessing pollution impact. Its broad applicability, however, raises the need for harmonization, optimization and standardization. The main difficulty met in the development of a generalized methodological framework for standardizing biomonitoring surveys is to reconcile a theoretical approach with an operational approach: in any set-up the survey strategy should ensure that the measured values represent the status of the environment. This leads, inevitably, to the application of a variety of methods, techniques and strategies in order to accommodate the special ecogeomorphological characteristics of each area and handle adequately the knowledge gaps related to local species stress response mechanisms and tolerances. Thereby, comparability of the results, even between subsequent surveys in the same area or concurrent surveys at neighbouring areas, is unfeasible, yet indispensable in defining spatiotemporal pollution patterns in large ecosystems. This inevitably requires some kind of normalization/harmonization that would strengthen any observed correlations between exposure and health effects, which ultimately may point at potential causal relationships. The aim of this work is to design/develop a knowledge management tool, built on a cybernetic infrastructure for (i) localizing the variation source(s) in each project that prohibit inter-survey normalisation/comparability, (ii) determining the path of error propagation as a causal chain when a fault is identified, (iii) testing the ultimate causes suggested as mostly responsible for this fault, and (iv) proceeding to remedial proposals (including a feedback possibility in case that the suggested remedy is proved to be inadequate) with a view to improving quality and reliability of biosurveillance. 
The presented tool relies on Fuzzy Fault Tree Analysis (FFTA) to identify, categorise, sort and analyse all possible sources of variation and error in biomonitoring; thereby, an expert system is developed, where the tree (dendritic) structure serves as the Knowledge Base (KB) and the fuzzy rules based decision mechanism is the inference engine. This scheme, relying on a collaborative model building methodology and a systemic modeling formalism by using 2nd order cybernetics in order to include human judgement and reasoning, enables knowledge to be used not only for representation but also for reasoning at functional level.

17.
Control of our movements is apparently facilitated by an adaptive internal model in the cerebellum. It was long thought that this internal model implemented an adaptive inverse model and generated motor commands, but recently many have rejected that idea in favor of a forward-model hypothesis. In theory, the forward model predicts the upcoming state during reaching movements so the motor cortex can generate appropriate motor commands. Recent computational models of this process rely on the optimal feedback control (OFC) framework of control theory. While OFC is a powerful tool for describing motor control, it does not describe adaptation. Some assume that adaptation of the forward model alone could explain motor adaptation, but this is widely understood to be overly simplistic. However, an adaptive optimal controller is difficult to implement. A reasonable alternative is to allow forward-model adaptation to 're-tune' the controller. Our simulations show that, as expected, forward-model adaptation alone does not produce optimal trajectories during reaching movements perturbed by force fields. However, they also show that re-optimizing the controller from the forward model can be sub-optimal. This is because, in a system with state correlations or redundancies, accurate prediction requires different information than optimal control does. We find that adding noise to the movements that matches noise found in human data is enough to overcome this problem. However, since the state space for control of real movements is far more complex than in our simple simulations, the effects of correlations on re-adaptation of the controller from the forward model cannot be overlooked.
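Forward-model adaptation from prediction errors can be sketched with a scalar LMS learner (a toy stand-in for the reaching dynamics and the force-field perturbation, not the authors' simulation):

```python
# Sketch of forward-model adaptation by prediction error (LMS) on a toy
# scalar "reach" dynamic. The plant's gain is perturbed (a crude stand-in
# for a force field); the forward model adapts its gain estimate from the
# discrepancy between predicted and observed outcomes.

true_gain = 1.4      # perturbed plant: outcome = true_gain * command
model_gain = 1.0     # forward model's initial belief
lr = 0.1             # learning rate (assumed)

errors = []
for trial in range(100):
    u = 1.0                          # motor command for this "reach"
    predicted = model_gain * u       # forward model's state prediction
    actual = true_gain * u           # observed outcome
    err = actual - predicted         # sensory prediction error
    model_gain += lr * err * u       # LMS update of the forward model
    errors.append(abs(err))

print(f"first-trial |error|: {errors[0]:.3f}")   # 0.400
print(f"last-trial  |error|: {errors[-1]:.3e}")  # near zero
```

Note that this only makes the *prediction* accurate; as the abstract stresses, an accurate forward model does not by itself yield optimal motor commands, so re-tuning the controller is a separate step.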

18.
Shchurova L. Yu., Namiot V. A. Biophysics. 2022;67(6):1046-1054.
An attempt to mathematically formalize the problem of intelligence causes difficulties, if only because there is no single definition for intelligence adopted by all psychologists. In...

19.
From what we know at present with respect to the neural control of walking, it can be concluded that an optimal biologically inspired robot could have the following features. The limbs should include several joints in which position changes can be obtained by actuators across the joints. The control of mono- and biarticular actuators should occur at least at three levels: one at direct control of the actuators (equivalent to motoneuron level), the second at indirect control acting at a level which controls whole limb movement (flexion or extension) and the third at a still higher level controlling the interlimb coordination. The limb level circuits should be able to produce alternating flexion and extension movements in the limb by means of coupled oscillator flexor and extensor parts which are mutually inhibitory. The interlimb control level should be able to command the various limb control centers. All three control levels should have some basic feedback circuits but the most essential one is needed at the limb control level and concerns the decision to either flex or extend a given limb. The decision to activate the extensor part of the limb oscillator has to be based on feedback signalling the onset of loading of the limb involved. This should be signalled by means of load sensors in the limb. The decision to activate the flexor part of the limb oscillator has to depend on various types of feedback. The most important requirement is that flexion should only occur when the limb concerned is no longer loaded above a given threshold. The rule for the initiation of limb flexion can be made more robust by adding the requirement that position at the base of the limb ("hip") should be within a normal end of stance phase range. Hence, human locomotion is thought to use a number of principles which simplify control, just as in other species such as the cat. 
It is suggested that cat and human locomotion are good models to learn from when designing efficient walking robots.
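The flexion/extension rules above can be written as a minimal per-limb state machine (the thresholds and the sample load/hip trajectory below are invented for illustration):

```python
# The two feedback rules from the abstract as a per-limb controller:
#   STANCE -> SWING only when the limb is unloaded AND the hip is within
#   a normal end-of-stance range; SWING -> STANCE at loading onset.

def next_phase(phase, load, hip_angle,
               load_threshold=0.2, hip_stance_end=(-15.0, -5.0)):
    """Advance the limb's phase from load and hip-position feedback."""
    if phase == "STANCE":
        lo, hi = hip_stance_end
        if load < load_threshold and lo <= hip_angle <= hi:
            return "SWING"           # unloaded + hip extended: flex
    elif phase == "SWING" and load >= load_threshold:
        return "STANCE"              # loading onset: extend
    return phase

# Walk through one gait cycle's worth of (load, hip angle) samples:
phase = "STANCE"
for load, hip in [(0.8, 10.0), (0.5, -2.0), (0.1, -10.0),   # stance ends
                  (0.0, 20.0), (0.0, 25.0), (0.6, 15.0)]:   # touch-down
    phase = next_phase(phase, load, hip)
    print(load, hip, phase)
```

The double condition on the stance-to-swing transition is what makes the rule robust: neither unloading alone nor hip position alone triggers flexion.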

20.
Replicative senescence has the potential both to act as an anti-tumour mechanism, and to contribute to age-related changes in tissue function. Studies on human cells have revealed much, both about the nature of cell division counters, some of which utilize the gradual erosion of chromosomal telomeres, and the downstream signalling pathways that initiate and maintain growth arrest in senescence. A powerful test of the hypothesis that senescence is linked to either ageing or tumour prevention now requires a suitable animal model system. Here we overview the current understanding of replicative senescence in human cells, and address to what extent the senescence of murine cells in culture mirrors this phenomenon. We also discuss whether examples of telomere-independent senescence, such as those seen in mouse embryonic fibroblasts (MEFs) and several human cells types, should be viewed not as a consequence of "inadequate growth conditions", but rather as a powerful potential model system to dissect the selective pressures that occur in the early stages of tumour development, ones that we speculate lead to the observed high frequency of abrogation of p16INK4a function in human cancer.
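A telomere-erosion division counter amounts to simple bookkeeping; a back-of-envelope sketch with assumed, illustrative numbers (not measured values):

```python
# Back-of-envelope sketch of a telomere-based division counter: each
# division shortens the telomere by a fixed amount, and divisions stop
# once the next loss would drop it below a senescence threshold.
# All lengths and rates are illustrative assumptions.

def divisions_until_senescence(telomere_bp=10000, loss_per_division=75,
                               senescence_threshold=5000):
    """Count divisions until telomere length would fall below the threshold."""
    divisions = 0
    while telomere_bp - loss_per_division >= senescence_threshold:
        telomere_bp -= loss_per_division
        divisions += 1
    return divisions

print(divisions_until_senescence())  # a few dozen divisions with these numbers
```

The same bookkeeping makes the telomere-independent cases in the abstract interesting: those cells arrest without this counter having run out.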
