Similar Documents
20 similar documents found
1.
The virtual cell is a new approach to cell research that emerged abroad at the end of the 20th century, using modern information technology and computer simulation. Its main idea is to build an artificial cell model on a computer and to simulate the intracellular and extracellular environment for biological research and exploration. This article reviews the main virtual cell models developed abroad.

2.
This paper presents a model, based on biological research, that uses movable finite automata (MFA) to describe the self-assembly of the T4 phage, and exhibits the results of an artificial life simulation. In previous work, Thompson and Goel [Artificial Life, Addison-Wesley, 1989, pp. 317-340; Biosystems 18 (1985) 23; J. Theor. Biol. 131 (1988) 351] introduced movable finite automata (MFA), finite automata with the capability of moving, and simulated them on a computer. Proteins were represented as individual rectangular boxes, however, and the simulation results differed from the real T4 phage. We propose a sphere model of protein structure and simulate the self-assembly of the entire structure of the T4 phage on a computer.
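A toy sketch of the MFA idea, with all details invented for illustration: a single automaton random-walks on a one-dimensional lattice and switches state to "bound" when it lands next to a fixed seed site. This is far simpler than the paper's T4 phage model, which assembles many sphere-shaped protein units.

```python
import random

# Toy "movable finite automaton": a particle on a 1-D lattice with two
# states, FREE and BOUND. A FREE particle random-walks; once it becomes
# adjacent to the fixed seed site it switches to BOUND, mimicking binding.
def simulate_mfa(lattice_size=20, seed_site=10, start=0, max_steps=10000):
    rng = random.Random(0)  # fixed seed for reproducibility
    pos, state = start, "FREE"
    for step in range(max_steps):
        if state == "BOUND":
            return pos, step
        # reflecting random walk on [0, lattice_size - 1]
        pos = max(0, min(lattice_size - 1, pos + rng.choice((-1, 1))))
        if abs(pos - seed_site) == 1:  # adjacent to the seed: bind
            state = "BOUND"
    return pos, max_steps

pos, steps = simulate_mfa()
print(pos, steps)
```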

3.
4.
A Monte Carlo computer simulation program is designed in order to describe the spatial and time evolution of a population of living individuals under preassigned environmental conditions of energy. The simulation is inspired by previous techniques developed in physics, in particular in molecular dynamics and simulations of liquids, and it already provides some new insights regarding macroscopic deterministic models in ecology and concerning the eventual control of artificial biomass production plants. Received on July 15, 1986; accepted on October 9, 1986
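The birth/death logic of such a Monte Carlo population simulation can be sketched as follows; the per-step rates and the energy-limited carrying capacity K are placeholder values, not the paper's.

```python
import random

# Minimal Monte Carlo birth/death sketch: per time step each individual
# reproduces with probability p_birth and dies with probability p_death;
# births stop once the population reaches the energy-limited capacity K.
def simulate_population(n0=50, p_birth=0.10, p_death=0.05, K=500,
                        steps=100, seed=1):
    rng = random.Random(seed)
    n, history = n0, [n0]
    for _ in range(steps):
        births = sum(rng.random() < p_birth for _ in range(n)) if n < K else 0
        deaths = sum(rng.random() < p_death for _ in range(n))
        n = max(0, n + births - deaths)
        history.append(n)
    return history

history = simulate_population()
print(history[0], history[-1])
```

With a net 5% per-step growth rate the population climbs toward the capacity K and then fluctuates just below it.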

5.

Background  

General protein evolution models help determine the baseline expectations for the evolution of sequences, and they have been widely useful in sequence analysis and in the computer simulation of artificial sequence data sets.

6.
In this paper, an unsupervised learning algorithm is developed. Two versions of an artificial neural network, termed a differentiator, are described. It is shown that our algorithm is a dynamic variation of the competitive learning found in most unsupervised learning systems. These systems are frequently used for solving certain pattern recognition tasks such as pattern classification and k-means clustering. Using computer simulation, it is shown that dynamic competitive learning outperforms simple competitive learning methods in solving cluster detection and centroid estimation problems. The simulation results demonstrate that high-quality clusters are detected by our method in a short training time. Either a distortion function or the minimum-spanning-tree method of clustering is used to verify the clustering results. By taking full advantage of all the information presented to the differentiator in the course of training, we demonstrate a powerful adaptive system capable of learning continuously changing patterns.
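For contrast with the paper's dynamic variant, the baseline simple (winner-take-all) competitive learning rule can be sketched as follows: the unit whose weight vector is closest to the input wins and is moved toward that input. The data set and the initialization at the first and last data points are arbitrary choices for this illustration.

```python
# Simple (winner-take-all) competitive learning, the baseline scheme the
# paper's dynamic variant improves on: the unit whose weight vector is
# closest to the input is updated toward that input.
def competitive_learning(data, lr=0.1, epochs=20):
    # deterministic initialization at the first and last data points
    weights = [list(data[0]), list(data[-1])]
    for _ in range(epochs):
        for x in data:
            winner = min(range(len(weights)),
                         key=lambda i: sum((w - v) ** 2
                                           for w, v in zip(weights[i], x)))
            weights[winner] = [w + lr * (v - w)
                               for w, v in zip(weights[winner], x)]
    return weights

# two well-separated clusters around (0, 0.1) and (5, 5)
data = [(0.1, 0.0), (0.0, 0.2), (5.1, 4.9), (4.9, 5.1)]
print(sorted(competitive_learning(data)))
```

After training, each unit's weight vector sits near the centroid of one cluster.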

7.
Two common problems in computer simulations are the decisions to ignore or include a particular element of a system under study in a model and the choice of an appropriate integration algorithm. To examine aspects of these problems, a simple exponential system is considered in which a large simulation error is induced by a rather small truncation error. The effect of computational precision, step size and hardware selection on this error is examined at standard and extended precisions over a range of step sizes and on a variety of computers. For this model, simulation accuracy is an exponential function of the number of bits in the mantissa of the computer word. Optimal step size is a function of accuracy required and precision used; a trade-off between truncation and round-off errors becomes important as accuracy requirements increase. Machine selection is important primarily in economic terms if the required precision is available. We conclude that the effect on a simulation of small terms such as truncation errors can be unexpectedly large, that solutions should always be checked, and that high precision and wide dynamic range are important to the successful computer simulation of models such as that examined.
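The step-size side of this trade-off can be reproduced with a minimal sketch: forward-Euler integration of the exponential system dy/dt = -y, where the global truncation error shrinks roughly linearly with the step size (until, at much smaller step sizes, round-off error would dominate). The paper's exact model and hardware differ; this is only illustrative.

```python
import math

# Forward-Euler integration of dy/dt = -y, y(0) = 1, over [0, 1].
# Euler's global truncation error is O(h): halving the step size roughly
# halves the error, until round-off takes over at very small h.
def euler_error(h):
    y = 1.0
    for _ in range(round(1.0 / h)):
        y += h * (-y)
    return abs(y - math.exp(-1.0))  # exact solution: y(1) = e^-1

errors = {h: euler_error(h) for h in (0.1, 0.05, 0.025)}
for h, err in errors.items():
    print(h, err)
```

Each halving of h roughly halves the error, the signature of a first-order method.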

8.
To develop a more efficient and optimal artificial kidney, many experimental approaches have been used to study mass transfer inside, outside, and across hollow fiber membranes, with different kinds of membranes, solutes, and flow rates as parameters. However, these experimental approaches are expensive and time consuming. Numerical calculation and computer simulation are an effective way to study mass transfer in the artificial kidney, saving substantial time and reducing experimental cost. This paper presents a new model to simulate mass transfer in the artificial kidney by coupling together the shell-side, lumen-side, and transmembrane flows. Darcy's equations were employed to simulate shell-side flow, the Navier-Stokes equations were employed to simulate lumen-side flow, and the Kedem-Katchalsky equations were used to compute transmembrane flow. Numerical results agreed with experimental results to within 10% error. The numerical results showed a nonuniform distribution of flow and solute concentration in the shell-side flow due to the entry/exit effect and the Darcy permeability. On the shell side, the axial velocity at the periphery is higher than at the center. This numerical model provides a clear view of mass transfer in an artificial kidney and may be used to help design an optimal artificial kidney and its operating conditions to improve hemodialysis.
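The Kedem-Katchalsky transmembrane relations mentioned above can be sketched as follows; the filtration coefficient Lp, solute permeability Ps, reflection coefficient sigma, and the driving forces below are all placeholder values, not the paper's fitted parameters.

```python
# Kedem-Katchalsky membrane equations (illustrative units and values):
#   volume flux  Jv = Lp * (dP - sigma * dPi)
#   solute flux  Js = Ps * dC + (1 - sigma) * Jv * C_mean
def kedem_katchalsky(dP, dPi, dC, C_mean, Lp=1e-9, Ps=1e-6, sigma=0.9):
    Jv = Lp * (dP - sigma * dPi)           # osmosis opposes filtration
    Js = Ps * dC + (1.0 - sigma) * Jv * C_mean  # diffusion + solvent drag
    return Jv, Js

# hypothetical transmembrane conditions (placeholder numbers)
Jv, Js = kedem_katchalsky(dP=5000.0, dPi=2000.0, dC=10.0, C_mean=100.0)
print(Jv, Js)
```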

9.
In the field of active and passive transport of substances across epithelial membranes little progress has been made, mostly for technical reasons, towards a comprehensive view of a wealth of isolated laboratory data. The present study is an attempt to advance the use of computer simulation, applying the "Continuous System Modelling Program", in the field of membrane transport. High speed of operation and great versatility make this procedure uniquely suitable to transport studies on multicompartment biological systems, such as epithelia. Basic prerequisites are a detailed knowledge of the morphological parameters of the system and an abundance of often isolated laboratory data against which the function of a model membrane can be checked. The simulation process then becomes a study of finding the constraints on all rate constants involved (a few of which may be known) that lead to results compatible with experimental facts. Whereas computer modelling is no substitute for experimental studies, it is one way of arriving at a comprehensive view of the complex flow patterns in such complex structures as epithelia. The computer simulation technique can lead to new, testable predictions, and it gives the laboratory investigator a critical perspective on potential pitfalls in the experimental techniques used in studies of fluxes in structures as small as those encountered in epithelia. The usefulness of computer simulation in the field of membrane transport is exemplified by applying it to the problem of the initial rate of uptake of Na+ by frog skin epidermis. It is shown here that the computer data are in excellent agreement with experimental data on epidermis. Beyond this, the computer data permit calculations of kinetic parameters, e.g. Na+ pool sizes and rates of Na+ fluxes between compartments, which, for the present at least, cannot be directly measured.
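A minimal sketch of the kind of compartment model described, with invented first-order rate constants: Na+ enters a single cellular pool from a bath held at fixed concentration and is extruded at a rate proportional to the pool size.

```python
# Two-compartment sketch of a rate-constant model: Na+ moves from an
# outer bath (held constant) into an epithelial cell pool and is pumped
# out at first-order rates k_in and k_out (all values are placeholders).
def simulate_na_uptake(c_bath=100.0, k_in=0.05, k_out=0.02, dt=0.01,
                       t_end=200.0):
    pool, t = 0.0, 0.0
    while t < t_end:
        influx = k_in * c_bath   # uptake from the bath
        efflux = k_out * pool    # pump-mediated extrusion
        pool += dt * (influx - efflux)
        t += dt
    return pool

steady = simulate_na_uptake()
print(steady)  # approaches the steady state k_in * c_bath / k_out = 250
```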

10.
The application of computer simulation to molecular systems of biochemical interest is reviewed. It is shown that computer simulation is a tool complementary to experimental methods, which can be used to access atomic details inaccessible to experimental probes. Examples are given in which computer simulation augments the experimental information by providing an atomic picture of high resolution with respect to space, energy or time. The usefulness of a computer simulation largely depends on its quality. The most important factors that limit the accuracy of simulated results are discussed. The accuracy of different simulation studies can differ by orders of magnitude. The accuracy will depend on the type of biomolecular system and process studied. It will also depend on the choice of force field, the simulation set-up and the protocol that is used. A list of quality-determining factors is given, which may be useful when interpreting simulation studies appearing in the literature.

11.
This paper discusses a computer simulation of a pneumatic, portable, piston-type artificial heart drive system with a linear DC motor. The purpose of the design is to obtain an artificial heart drive system with high efficiency and small dimensions to enhance portability. The design addresses two factors contributing to the total efficiency of the drive system. First, the dimensions of the pneumatic actuator were optimized under a cost function of the total efficiency. Second, the motor performance was studied in terms of efficiency. In all brushed linear DC motors, more than 50 percent of the input energy of the actuator under practical loads is consumed in the armature circuit. An optimal design has a piston cross-sectional area of 10.5 cm2 and a cylinder length of 10 cm. The total efficiency could reach 25 percent by improving the gasket to reduce the frictional force.
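The idea of optimizing actuator dimensions under an efficiency cost function can be sketched as a simple grid search. The loss model below (friction loss growing with piston area, armature loss shrinking with it) is a made-up placeholder, not the paper's physics, so the resulting optimum is illustrative only.

```python
# Toy illustration of optimizing a piston dimension under an efficiency
# cost function. The loss terms are placeholders, not the paper's model.
def total_efficiency(area_cm2, friction_coeff=0.02, armature_coeff=2.0):
    loss = friction_coeff * area_cm2 + armature_coeff / area_cm2
    return max(0.0, 1.0 - loss)

candidates = [a / 2.0 for a in range(4, 41)]  # 2.0 .. 20.0 cm^2
best_area = max(candidates, key=total_efficiency)
print(best_area, round(total_efficiency(best_area), 3))
```

With these placeholder coefficients the two loss terms balance at an area of 10 cm^2, where the analytic optimum sqrt(armature_coeff / friction_coeff) lies.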

12.
During the last two decades, considerable progress has been made in the study of brain–computer interfaces (BCIs): devices in which motor signals from the brain are registered by multi-electrode arrays and transformed into commands for artificial actuators such as cursors and robotic devices. This review is focused on one problem. Voluntary motor control is based on neurophysiological processes, which strongly depend on the afferent innervation of skin, muscles, and joints. Thus, an invasive BCI has to be based on a bidirectional system in which motor control signals are registered by multichannel microelectrodes implanted in motor areas, whereas tactile, proprioceptive, and other useful signals are transported back to the brain through spatiotemporal patterns of intracortical microstimulation (ICMS) delivered to sensory areas. In general, the studies of invasive BCIs have advanced in several directions. The progress of BCIs with artificial sensory feedback will not only help patients, but will also expand basic knowledge in the field of human cortical functions.

13.
The aim of this paper is to give an overview of computer modelling and simulation in cellular biology, in particular as applied to complex biochemical processes within the cell. This is illustrated by the use of the techniques of object-oriented modelling, where the computer is used to construct abstractions of objects in the domain being modelled, and these objects then interact within the computer to simulate the system and allow emergent properties to be observed. The paper also discusses the role of computer simulation in understanding complexity in biological systems, and the kinds of information which can be obtained about biology via simulation.
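A minimal object-oriented sketch in this spirit, with a wholly invented enzyme-substrate example: objects representing cell components interact inside a container, and the product count emerges from repeated stochastic encounters.

```python
import random

# Minimal object-oriented cell model: an Enzyme object and a Cell
# container interact step by step; each step may convert one substrate
# molecule into product (a deliberately simplified illustration).
class Enzyme:
    def __init__(self, rate):
        self.rate = rate  # probability of catalysis per step

    def maybe_catalyze(self, rng):
        return rng.random() < self.rate

class Cell:
    def __init__(self, n_substrate, enzyme, seed=0):
        self.substrate = n_substrate
        self.product = 0
        self.enzyme = enzyme
        self.rng = random.Random(seed)

    def step(self):
        if self.substrate > 0 and self.enzyme.maybe_catalyze(self.rng):
            self.substrate -= 1
            self.product += 1

cell = Cell(n_substrate=100, enzyme=Enzyme(rate=0.3))
for _ in range(500):
    cell.step()
print(cell.substrate, cell.product)
```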

14.
15.
A torque-driven, subject-specific 3-D computer simulation model of the impact phase of one-handed tennis backhand strokes was evaluated by comparing performance and simulation results. Backhand strokes of an elite subject were recorded on an artificial tennis court. Over the 50-ms period after impact, good agreement was found with an overall RMS difference of 3.3° between matching simulation and performance in terms of joint and racket angles. Consistent with previous experimental research, the evaluation process showed that grip tightness and ball impact location are important factors that affect postimpact racket and arm kinematics. Associated with these factors, the model can be used for a better understanding of the eccentric contraction of the wrist extensors during one-handed backhand ground strokes, a hypothesized mechanism of tennis elbow.

16.
17.
Mechanocomputational techniques in conjunction with artificial intelligence (AI) are revolutionizing the interpretation of crucial information from medical data and converting it into optimized, organized information for diagnostics. This is made possible by advances in artificial intelligence, computer-aided diagnostics, virtual assistants, robotic surgery, augmented reality, and AI-based genome-editing technologies. Such techniques serve as products for diagnosing emerging microbial or non-microbial diseases. This article presents a combined approach, using these techniques and providing therapeutic solutions, towards utilizing them in disease diagnostics.

18.
W. Düchting, Blut, 1975, 31(6): 371-388
This paper deals with a block diagram describing the structure of the control of erythropoiesis by negative feedback loops. The model is transformed into a simulation program using a special block-oriented programming language called ASIM (Analoge SIMulation). For both normal and diseased states of the blood-forming process, the dynamic responses of the erythrocytes and reticulocytes are simulated by digital computer analysis. The computer simulation includes different forms of anemia caused by parameter variations as well as by structural alterations. Rising oscillations are also obtained in multiloop control systems containing complex paths with minor loops, which, for example, take into account the erythrocytic chalones. The described model shows that rising oscillations, that is, unstable control loops, can be produced by changing the control-loop structure as well as by parameter changes. In the case of malignant disorders, such failures, which manifest as oscillations, are discussed as part of a disturbed homoeostasis. The results of these simulation studies should stimulate scientists working in biology and medicine to design new test series to verify the proposed hypothesis.
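A minimal negative-feedback sketch, with invented parameters, illustrates the central point: adding a maturation delay to an otherwise stable feedback loop can destabilize it and produce sustained oscillations.

```python
# Minimal negative-feedback sketch of erythropoiesis: the erythrocyte
# count E is driven by a production signal responding to the deficit from
# a set point, with a maturation delay. A long enough delay destabilizes
# the loop and yields oscillations (all parameters are placeholders).
def simulate_feedback(set_point=100.0, gain=0.5, delay_steps=8,
                      decay=0.05, n_steps=400):
    E = [50.0]
    for t in range(n_steps):
        # production responds to the *delayed* error signal
        error = set_point - E[max(0, t - delay_steps)]
        production = max(0.0, gain * error)  # production cannot be negative
        E.append(E[-1] + production - decay * E[-1])
    return E

trace = simulate_feedback()
print(min(trace[200:]), max(trace[200:]))
```

With delay_steps set to 0 the same loop settles monotonically to its steady state; the delay alone creates the oscillation.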

19.
To efficiently simulate very large networks of interconnected neurons, particular consideration has to be given to the computer architecture being used. This article presents techniques for implementing simulators for large neural networks on a number of different computer architectures. The neuronal simulation task and the computer architectures of interest are first characterized, and the potential bottlenecks are highlighted. We then describe the experience gained from adapting an existing simulator, SWIM, to two very different architectures: vector computers and multiprocessor workstations. This work led to the implementation of a new simulation library, SPLIT, designed to allow efficient simulation of large networks on several architectures. Different computer architectures put different demands on the organization of both data structures and computations. Strict separation of such architecture considerations from the neuronal models and other simulation aspects makes it possible to construct both portable and extensible code.

20.
王宜成, 生态学报 (Acta Ecologica Sinica), 2013, 33(11): 3258-3268
The traditional methods of nature reserve design are scoring methods and Gap analysis, which are simple to apply but not very reliable; the use of geographic information systems (GIS) in reserve design is also well known. This review focuses on two methods that have developed rapidly in recent years but are little used domestically: mathematical modelling and computer simulation. Mathematical modelling is mainly used to select a subset of candidate land parcels to form a nature reserve; the models may be linear or nonlinear and are solved with heuristic or exact optimization algorithms. Heuristic algorithms are fast and flexible, but their solutions are usually not optimal and cannot guarantee the optimal use of scarce resources. Exact optimization algorithms are computationally inefficient and may run into difficulties with as few as several hundred variables, but their solutions are optimal. Both kinds of algorithm are expected to continue to develop. Computer simulation is mainly used for reserve evaluation, functional zoning, and predicting the effects of particular conditions, such as spatial features and climate change, on species; it mostly uses heuristic algorithms and is combined with other software to display the results graphically. Both methods, especially computer simulation, require reserve designers to have strong specialist knowledge. Problems facing the two methods and new research directions are discussed, including at least: 1) basic data still need to be improved; 2) how new factors such as dynamics and uncertainty can be incorporated into models and combined with other factors; 3) how simulation parameters should be assessed and adjusted under climate-change scenarios; 4) how to reconcile conservation and development; 5) practical application of the methods requires a communication mechanism between researchers and decision makers; and 6) experts from multiple fields and the relevant stakeholders should have the opportunity to participate in reserve design.
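The heuristic site-selection algorithms described above are typified by the greedy set-covering rule, sketched here with hypothetical sites and species (not any specific published model): repeatedly choose the site that covers the most still-unrepresented species.

```python
# Greedy heuristic for reserve-site selection: pick the candidate site
# covering the most still-unrepresented species until all species are
# covered. Fast and flexible, but (as noted above) not guaranteed optimal.
def greedy_reserve(sites):
    remaining = set().union(*sites.values())
    chosen = []
    while remaining:
        best = max(sites, key=lambda s: len(sites[s] & remaining))
        if not sites[best] & remaining:
            break  # no site covers anything new; coverage is infeasible
        chosen.append(best)
        remaining -= sites[best]
    return chosen

# hypothetical candidate sites and the species each one contains
sites = {
    "A": {"sp1", "sp2"},
    "B": {"sp2", "sp3", "sp4"},
    "C": {"sp4", "sp5"},
    "D": {"sp1", "sp5"},
}
print(greedy_reserve(sites))
```

Here the greedy rule picks B (three new species) and then D (two new), covering all five species with two sites.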
