Similar Articles
20 similar articles retrieved.
1.
The uptake of virtual simulation technologies in both military and civilian surgical contexts has been both slow and patchy. The failure of the virtual reality community in the 1990s and early 2000s to deliver affordable and accessible training systems stems not only from an obsessive quest to develop the 'ultimate' in so-called 'immersive' hardware solutions, from head-mounted displays to large-scale projection theatres, but also from a comprehensive lack of attention to the needs of the end users. While many still perceive the science of simulation to be defined by technological advances, such as computing power, specialized graphics hardware, advanced interactive controllers, displays and so on, the true science underpinning simulation--the science that helps to guarantee the transfer of skills from the simulated to the real--is that of human factors, a well-established discipline that focuses on the abilities and limitations of the end user when designing interactive systems, as opposed to the more commercially explicit components of technology. Based on three surgical simulation case studies, the importance of a human factors approach to the design of appropriate simulation content and interactive hardware for medical simulation is illustrated. The studies demonstrate that it is unnecessary to pursue real-world fidelity in all instances in order to achieve psychological fidelity--the degree to which the simulated tasks reproduce and foster knowledge, skills and behaviours that can be reliably transferred to real-world training applications.

2.
The process of connecting genetic parts—DNA assembly—is a foundational technology for synthetic biology. Microfluidics present an attractive solution for minimizing use of costly reagents, enabling multiplexed reactions, and automating protocols by integrating multiple protocol steps. However, microfluidics fabrication and operation can be expensive and require expertise, limiting access to the technology. With advances in commodity digital fabrication tools, it is now possible to directly print fluidic devices and supporting hardware. 3D printed micro- and millifluidic devices are inexpensive, easy to make and quick to produce. We demonstrate Golden Gate DNA assembly in 3D-printed fluidics with reaction volumes as small as 490 nL, channel widths as fine as 220 microns, and per unit part costs ranging from $0.61 to $5.71. A 3D-printed syringe pump with an accompanying programmable software interface was designed and fabricated to operate the devices. Quick turnaround and inexpensive materials allowed for rapid exploration of device parameters, demonstrating a manufacturing paradigm for designing and fabricating hardware for synthetic biology.
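As an illustration of the kind of calculation behind a programmable syringe-pump controller like the one described, the sketch below converts a target volumetric flow rate into a stepper-motor pulse rate. The syringe diameter, lead-screw pitch and microstepping values are hypothetical placeholders, not parameters taken from the paper.

```python
import math

def steps_per_second(flow_rate_ul_min, syringe_diameter_mm,
                     lead_mm_per_rev, steps_per_rev=200, microstepping=16):
    """Convert a target flow rate into a stepper step frequency.

    flow_rate_ul_min:    desired flow in microlitres per minute
    syringe_diameter_mm: inner diameter of the syringe barrel
    lead_mm_per_rev:     linear travel of the lead screw per motor revolution
    """
    area_mm2 = math.pi * (syringe_diameter_mm / 2) ** 2   # plunger cross-section
    flow_mm3_s = flow_rate_ul_min / 60.0                  # 1 uL == 1 mm^3
    plunger_speed_mm_s = flow_mm3_s / area_mm2            # linear plunger velocity
    revs_per_s = plunger_speed_mm_s / lead_mm_per_rev     # screw rotation rate
    return revs_per_s * steps_per_rev * microstepping     # step pulses per second

# Example with assumed hardware: 4.6 mm syringe, 0.8 mm lead screw, 10 uL/min target
print(round(steps_per_second(10, 4.6, 0.8), 1), "steps/s")
```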

3.
The complexity and diversity of manufacturing software and the need to adapt this software to the frequent changes in the production requirements necessitate the use of a systematic approach to developing this software. The software life-cycle model (Royce, 1970) that consists of specifying the requirements of a software system, designing, implementing, testing, and evolving this software can be followed when developing large portions of manufacturing software. However, the presence of hardware devices in these systems and the high costs of acquiring and operating hardware devices further complicate the manufacturing software development process and require that the functionality of this software be extended to incorporate simulation and prototyping. This paper reviews recent methods for planning, scheduling, simulating, and monitoring the operation of manufacturing systems. A synopsis of the approaches to designing and implementing the real-time control software of these systems is presented. It is concluded that current methodologies support, in a very restricted sense, these planning, scheduling, and monitoring activities, and that enhanced performance can be achieved via an integrated approach.

4.
The configuration and hardware components of a desk-top computer coupled system for bench-top bioreactors are described. Examples of on-line acquisition of several directly accessible environmental process parameters and computations of directly inaccessible state variables are presented. The system described offers great advantages in experiments for establishing sophisticated control algorithms and for studying the physiological behaviour of microbial populations.
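A minimal example of computing directly inaccessible state variables from accessible measurements is sketched below: a simplified off-gas balance estimating oxygen uptake rate (OUR), carbon dioxide evolution rate (CER) and respiratory quotient (RQ). The function and its simplifications (equal inlet/outlet gas flow, no inert-gas correction) are illustrative assumptions, not the paper's own computations.

```python
V_MOLAR = 22.4  # L/mol at standard conditions (simplifying assumption)

def gas_rates(q_gas_l_h, v_liquid_l, y_o2_in, y_o2_out, y_co2_in, y_co2_out):
    """Estimate OUR and CER (mol / L broth / h) from a simple off-gas balance.

    Assumes equal inlet and outlet gas flow and neglects the inert-gas
    correction -- adequate only for low-conversion bench-top runs.
    """
    our = q_gas_l_h * (y_o2_in - y_o2_out) / (V_MOLAR * v_liquid_l)
    cer = q_gas_l_h * (y_co2_out - y_co2_in) / (V_MOLAR * v_liquid_l)
    rq = cer / our if our > 0 else float("nan")   # respiratory quotient
    return our, cer, rq

# Example: 60 L/h aeration, 2 L broth, 20.9 % -> 19.8 % O2, 0.04 % -> 1.1 % CO2
print(gas_rates(60, 2.0, 0.209, 0.198, 0.0004, 0.011))
```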

5.
Recently much effort has been spent on providing a shared address space abstraction on clusters of small-scale symmetric multiprocessors. However, advances in technology will soon make it possible to construct these clusters with larger-scale cc-NUMA nodes, connected with non-coherent networks that offer latencies and bandwidth comparable to interconnection networks used in hardware cache-coherent systems. The shared memory abstraction can be provided on these systems in software across nodes and hardware within nodes. Recent simulation results have demonstrated that certain features of modern system area networks can be used to greatly reduce shared virtual memory (SVM) overheads [5,19]. In this work we leverage these results and we use detailed system emulation to investigate building future software shared memory clusters. We use an existing, large-scale hardware cache-coherent system with 64 processors to emulate a complete future cluster. We port our existing infrastructure (communication layer and shared memory protocol) to this system and study the behavior of a set of real applications. We present results for both 32- and 64-processor system configurations. We find that: (i) System emulation is invaluable in quantifying potential benefits from changes in the technology of commodity components. More importantly, it reveals potential problems in future systems that are easily overlooked in simulation studies. Thus, system emulation should be used along with other modeling techniques (e.g., simulation, implementation) to investigate future trends. (ii) Our work shows that current SVM protocols can only partially take advantage of faster interconnects and wider nodes due to operating system and architectural implications. We quantify the related issues and identify the areas where more research is required for future SVM clusters.

6.
CAMBIO, a software package devoted to bioprocess modelling, which runs on Apollo computers, is described. This software enables bioengineers to easily and interactively design appropriate mathematical models directly from their perception of the process. CAMBIO provides the user with a set of design symbols and mnemonic icons in order to interactively design a functional diagram. This diagram has to exhibit the most relevant components with their related interactions through biological and physico-chemical reactions. Then, CAMBIO automatically generates the dynamical material balance equations of the process in the form of an algebraic-differential system by taking advantage of the knowledge involved in the functional diagram. The model may be used for control design purposes or completed by kinetics expressions with a view to simulation. CAMBIO offers facilities to generate a simulation model (for coding of kinetics, introducing auxiliary variables, etc.). This model is automatically interfaced with specialized simulation software which allows an immediate visualization of the process dynamical behaviour under various operational conditions (possibly involving feedback control strategies). An example of an application dealing with yeast fermentation is given. Received on June 14, 1990; accepted on January 11, 1991
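For reference, the dynamical material balance equations generated by tools of this kind generally take the standard stirred-bioreactor form shown below; the notation is illustrative and not necessarily CAMBIO's own.

```latex
% General dynamical mass-balance model of a stirred bioreactor
% (standard form; symbols illustrative, not necessarily CAMBIO's notation)
\[
  \frac{d\xi}{dt} \;=\; K\,\varphi(\xi) \;-\; D\,\xi \;+\; F \;-\; Q
\]
% \xi     : vector of component concentrations (biomass, substrates, products)
% K       : stoichiometric (yield) matrix of the reaction network
% \varphi : vector of reaction (kinetic) rates
% D       : dilution rate;  F : feed rates;  Q : gaseous outflow rates
```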

7.
8.
Rho GTPases are conformational switches that control a wide variety of signaling pathways critical for eukaryotic cell development and proliferation. They represent attractive targets for drug design as their aberrant function and deregulated activity is associated with many human diseases including cancer. Extensive high-resolution structures (>100) and recent mutagenesis studies have laid the foundation for the design of new structure-based chemotherapeutic strategies. Although the inhibition of Rho signaling with drug-like compounds is an active area of current research, very little attention has been devoted to directly inhibiting Rho by targeting potential allosteric non-nucleotide binding sites. By avoiding the nucleotide binding site, compounds may minimize the potential for undesirable off-target interactions with other ubiquitous GTP and ATP binding proteins. Here we describe the application of molecular dynamics simulations, principal component analysis, sequence conservation analysis, and ensemble small-molecule fragment mapping to provide an extensive mapping of potential small-molecule binding pockets on Rho family members. Characterized sites include novel pockets in the vicinity of the conformationally responsive switch regions as well as distal sites that appear to be related to the conformations of the nucleotide binding region. Furthermore the use of accelerated molecular dynamics simulation, an advanced sampling method that extends the accessible time-scale of conventional simulations, is found to enhance the characterization of novel binding sites when conformational changes are important for the protein mechanism.
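A minimal sketch of the principal component analysis step applied to an aligned MD trajectory is given below, using plain NumPy and synthetic coordinates in place of real simulation data.

```python
import numpy as np

def trajectory_pca(coords):
    """Principal component analysis of an MD trajectory.

    coords: array of shape (n_frames, n_atoms, 3), already aligned
            (e.g. least-squares fit to a reference structure).
    Returns per-mode variances (eigenvalues) and the modes themselves.
    """
    n_frames = coords.shape[0]
    X = coords.reshape(n_frames, -1)          # flatten to (frames, 3N)
    X = X - X.mean(axis=0)                    # remove the mean structure
    # SVD of the centred data; singular values give the covariance eigenvalues
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    eigvals = (s ** 2) / (n_frames - 1)
    return eigvals, Vt                        # rows of Vt are principal modes

# Toy example with random coordinates standing in for a real trajectory
rng = np.random.default_rng(0)
vals, modes = trajectory_pca(rng.normal(size=(100, 50, 3)))
print("fraction of variance in first two modes:", vals[:2].sum() / vals.sum())
```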

9.
In today's biopharmaceutical industry, developing and producing a new monoclonal antibody takes years before it can be launched commercially. The reasons lie in the complexity of monoclonal antibodies and the need for high product quality to ensure clinical safety, both of which significantly lengthen process development. Frameworks such as quality by design are becoming widely used by the pharmaceutical industries as they introduce a systematic approach for building quality into the product. However, full implementation of quality by design has still not been achieved, mainly because risk assessment of product properties is limited and a large number of process factors affecting product quality must be investigated during process development. This has created a need for better methods and tools for early risk assessment and prediction of critical product properties and process factors, in order to speed process development and reduce costs. In this review, we investigate how the quantitative structure–activity relationships framework can be applied to an existing process development framework such as quality by design in order to increase product understanding based on the protein structure of monoclonal antibodies. Compared to quality by design, where the effects of process parameters on the drug product are explored, quantitative structure–activity relationships take the reverse perspective, investigating how the protein structure affects performance in different unit operations. This provides valuable information that can be used during the early process development of new drug products, where limited process understanding is available. Thus, the quantitative structure–activity relationships methodology is explained in detail, and we investigate means of directly linking the structural properties of monoclonal antibodies to process data. Used as a decision tool, the resulting information can strengthen risk assessment, better support process development, and thereby overcome some of the limitations and challenges present in QbD implementation today.
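A toy sketch of the kind of quantitative structure–activity relationship model implied here, linking structure-derived descriptors of a set of antibodies to a single unit-operation response with a regularised linear model, might look like the following; all data, descriptor names and responses are synthetic placeholders.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

# Hypothetical data: rows = antibodies, columns = structure-derived descriptors
# (e.g. surface charge, hydrophobic patch area); y = a unit-operation response
# such as retention time or recovery in a chromatography step.
rng = np.random.default_rng(1)
X = rng.normal(size=(40, 8))                  # 40 mAbs, 8 descriptors (synthetic)
y = X @ rng.normal(size=8) + rng.normal(scale=0.3, size=40)

model = Ridge(alpha=1.0)                      # regularised linear QSAR model
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print("cross-validated R^2: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```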

10.
11.
Traditional drug discovery starts by experimentally screening chemical libraries to find hit compounds that bind to protein targets, modulating their activity. Subsequent rounds of iterative chemical derivatization and rescreening are conducted to enhance the potency, selectivity, and pharmacological properties of hit compounds. Although computational docking of ligands to targets has been used to augment the empirical discovery process, its historical effectiveness has been limited because of the poor correlation of ligand dock scores and experimentally determined binding constants. Recent progress in super-computing, coupled to theoretical insights, allows the calculation of the Gibbs free energy, and therefore accurate binding constants, for large ligand-receptor systems. This advance extends the potential of virtual drug discovery. A specific embodiment of the technology, integrating de novo, abstract fragment-based drug design, sophisticated molecular simulation, and the ability to calculate thermodynamic binding constants with unprecedented accuracy, is discussed.
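The thermodynamic relation that makes computed binding free energies equivalent to binding constants is the standard one:

```latex
% Standard binding free energy and the association/dissociation constants
\[
  \Delta G^{\circ}_{\mathrm{bind}} \;=\; -RT \ln K_{a} \;=\; RT \ln K_{d},
  \qquad
  K_{d} \;=\; \exp\!\left(\frac{\Delta G^{\circ}_{\mathrm{bind}}}{RT}\right)
\]
% At T = 298 K, RT is about 0.59 kcal/mol, so roughly 1.4 kcal/mol of
% binding free energy corresponds to one order of magnitude in K_d.
```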

12.
13.
Our understanding of complex living systems is limited by our capacity to perform experiments in high throughput. While robotic systems have automated many traditional hand-pipetting protocols, software limitations have precluded more advanced maneuvers required to manipulate, maintain, and monitor hundreds of experiments in parallel. Here, we present Pyhamilton, an open-source Python platform that can execute complex pipetting patterns required for custom high-throughput experiments such as the simulation of metapopulation dynamics. With an integrated plate reader, we maintain nearly 500 remotely monitored bacterial cultures in log-phase growth for days without user intervention by taking regular density measurements to adjust the robotic method in real time. Using these capabilities, we systematically optimize bioreactor protein production by monitoring the fluorescent protein expression and growth rates of a hundred different continuous culture conditions in triplicate to comprehensively sample the carbon, nitrogen, and phosphorus fitness landscape. Our results demonstrate that flexible software can empower existing hardware to enable new types and scales of experiments, benefiting areas from biomanufacturing to fundamental biology.
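A minimal sketch of the density-triggered feedback logic described here is shown below; the plate-reader and pipetting functions are hypothetical stand-ins written for illustration, not the actual Pyhamilton API.

```python
import random
import time

OD_SETPOINT = 0.4   # keep cultures in log phase below this optical density
DILUTION_UL = 200   # volume replaced per adjustment

def read_plate_od(wells):
    """Stand-in for a plate-reader call (hypothetical; a real platform would
    return measured optical densities for each well)."""
    return {w: random.uniform(0.1, 0.6) for w in wells}

def dilute(well, volume_ul):
    """Stand-in for a robotic aspirate/dispense step (hypothetical)."""
    print(f"diluting {well} by {volume_ul} uL")

def feedback_loop(wells, cycles, interval_s=1):
    """Measure every well, dilute only those grown past the setpoint,
    then wait and repeat -- the core of a density-triggered method."""
    for _ in range(cycles):
        od = read_plate_od(wells)
        for well, value in od.items():
            if value > OD_SETPOINT:
                dilute(well, DILUTION_UL)
        time.sleep(interval_s)

feedback_loop(["A1", "A2", "B1"], cycles=2)
```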

14.
The ability to apply controlled forces to individual molecules has been revolutionary in shaping our understanding of biophysics in areas as diverse as dynamic bond strength, biological motor operation, and DNA replication. However, the methodology to perform single-molecule experiments remains relatively inaccessible because of cost and complexity. In 2010, we introduced the centrifuge force microscope (CFM) as a platform for accessible and high-throughput single-molecule experimentation. The CFM consists of a rotating microscope with which prescribed centrifugal forces can be applied to microsphere-tethered biomolecules. In this work, we develop and demonstrate a next-generation Wi-Fi CFM that offers unprecedented ease of use and flexibility in design. The modular CFM unit fits within a standard benchtop centrifuge and connects by Wi-Fi to an external computer for live control and streaming at near gigabit speeds. The use of commercial wireless hardware allows for flexibility in programming and provides a streamlined upgrade path as Wi-Fi technology advances. To facilitate ease of use, detailed build and setup instructions, as well as LabVIEW-based control software and MATLAB-based analysis software, are provided. We demonstrate the instrument’s performance by analysis of force-dependent dissociation of short DNA duplexes of 7, 8, and 9 bp. We showcase the sensitivity of the approach by resolving distinct dissociation kinetic rates for a 7 bp duplex in which one G-C basepair is mutated to an A-T basepair.
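For the kinetic analysis, a minimal sketch of extracting a dissociation rate from tether-survival data at one applied force, by fitting a single exponential, is given below; the data are synthetic stand-ins for real CFM measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def survival(t, k_off):
    """Fraction of tethers still bound after time t for first-order dissociation."""
    return np.exp(-k_off * t)

# Synthetic example standing in for tether-survival data at one applied force
t = np.linspace(0, 60, 30)                      # seconds
true_k = 0.08                                   # 1/s
rng = np.random.default_rng(2)
frac_bound = survival(t, true_k) + rng.normal(scale=0.02, size=t.size)

(k_fit,), cov = curve_fit(survival, t, frac_bound, p0=[0.1])
print(f"fitted k_off = {k_fit:.3f} 1/s (mean bond lifetime {1 / k_fit:.1f} s)")
```

Repeating such fits across forces gives the force dependence of the off-rate, from which distinct kinetics of the matched and mutated duplexes can be resolved.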

15.
In recent years, with continuous breakthroughs in computer hardware, software tools, and data availability, artificial intelligence techniques, represented by machine learning, have been increasingly applied to and integrated with biology, basic medicine, and pharmaceutical science, greatly advancing these fields and in particular transforming drug research and development. Among these applications, the identification of drug-target interactions (DTI) is a key challenge in drug development and a popular direction for the integration of artificial intelligence. Researchers have done a great deal of work on DTI prediction, building many important databases and developing or extending various machine learning algorithms and software tools. This review introduces the basic workflow of machine-learning-based DTI prediction, surveys studies that use machine learning to predict DTI, and briefly summarizes the strengths and weaknesses of different machine learning methods applied to DTI prediction, with the aim of supporting the development of more effective prediction algorithms and of DTI prediction in general.
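A minimal baseline for machine-learning DTI prediction, concatenating a drug fingerprint with a protein descriptor vector and scoring a random-forest classifier by cross-validation, might look like the sketch below; all data are synthetic placeholders for real fingerprints, descriptors, and interaction labels.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-ins: in practice the drug is encoded e.g. as a molecular
# fingerprint and the protein as a sequence- or structure-derived vector.
rng = np.random.default_rng(3)
drug_fp   = rng.integers(0, 2, size=(500, 128)).astype(float)   # 500 drug-target pairs
prot_desc = rng.normal(size=(500, 64))
X = np.hstack([drug_fp, prot_desc])          # pair representation = concatenation
y = rng.integers(0, 2, size=500)             # 1 = interacting, 0 = not (synthetic)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print("cross-validated AUC:", auc.mean().round(3))   # ~0.5 here, since labels are random
```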

16.
A wide variety of information or ‘metadata’ is required when undertaking dendrochronological sampling. Traditionally, researchers record observations and measurements on field notebooks and/or paper recording forms, and use digital cameras and hand-held GPS devices to capture images and record locations. In the lab, field notes are often manually entered into spreadsheets or personal databases, which are then sometimes linked to images and GPS waypoints. This process is both time consuming and prone to human and instrument error. Specialised hardware technology exists to marry these data sources, but costs can be prohibitive for small scale operations (>$2000 USD). Such systems often include proprietary software that is tailored to very specific needs and might require a high level of expertise to use. We report on the successful testing and deployment of a dendrochronological field data collection system utilising affordable off-the-shelf devices ($100–300 USD). The method builds upon established open source software that has been widely used in developing countries for public health projects as well as to assist in disaster recovery operations. It includes customisable forms for digital data entry in the field, and a marrying of accurate GPS location with geotagged photographs (with possible extensions to other measuring devices via Bluetooth) into structured data fields that are easy to learn and operate. Digital data collection is less prone to human error and efficiently captures a range of important metadata. In our experience, the hardware proved field worthy in terms of size, ruggedness, and dependability (e.g., battery life). The system integrates directly with the Tellervo software to both create forms and populate the database, providing end users with the ability to tailor the solution to their particular field data collection needs.

17.
Mechanism is a core chemical concept that has vital implications for reaction rate, efficiency and selectivity. The discovery of mechanism is not easy due to the great diversity of possible chemical rearrangements in even relatively simple systems. For this reason, mechanisms involving bond breaking and forming are usually proposed via chemical intuition – which limits the scope of considered possibilities – and these hypotheses are then tested using simulation or experiment. This article discusses an automated simulation strategy for investigating multiple elementary step reaction mechanisms in chemical systems. The method starts from a single input structure and seeks out nearby intermediates, optimises the proposed structures and then determines the kinetic viability of each elementary step. The kinetically accessible intermediates are catalogued and new searches are performed on each unique structure. This process is repeated for an arbitrary number of steps without human intervention, and massively parallel computation enables fast searches in chemical space. Importantly, this strategy can be empirically shown to lead to a finite number of accessible structures, not a combinatorial explosion of intermediates. Therefore, the method should be able to predict multi-step reaction pathways in many interesting chemical systems. Demonstrations on organic reactions and a hydrogen storage material, ammonia borane, show that the herein proposed strategy can uncover complex reactivity without relying on existing chemical intuition.
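The exploration loop described here can be illustrated with a toy breadth-first search over intermediates in which elementary steps above a barrier cutoff are discarded; the network, labels, and energies below are invented purely for illustration and stand in for electronic-structure calls that would propose rearrangements and compute barriers.

```python
from collections import deque

BARRIER_CUTOFF = 30.0   # kcal/mol; steps above this are treated as inaccessible

# Toy reaction network: structure -> list of (product, activation barrier).
# In the real method these edges come from automated structure proposals
# followed by quantum-chemical barrier estimates.
TOY_NETWORK = {
    "A": [("B", 18.0), ("C", 45.0)],
    "B": [("D", 22.0), ("A", 18.0)],
    "C": [("E", 10.0)],
    "D": [],
    "E": [],
}

def accessible_intermediates(start):
    """Breadth-first exploration that keeps only kinetically viable steps."""
    seen, queue = {start}, deque([start])
    while queue:
        current = queue.popleft()
        for product, barrier in TOY_NETWORK.get(current, []):
            if barrier <= BARRIER_CUTOFF and product not in seen:
                seen.add(product)        # catalogue the new unique structure
                queue.append(product)    # and launch a new search from it
    return seen

print(accessible_intermediates("A"))     # {'A', 'B', 'D'}; C is blocked by its barrier
```

Because high-barrier steps are pruned at every level, the catalogue of reachable structures stays finite rather than exploding combinatorially.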

18.
The identification of specific biomarkers obtained directly from human pathological lesions remains a major challenge, because the amount of tissue available is often very limited. We have developed a novel, comprehensive, and efficient method permitting the identification and absolute quantification of potentially accessible proteins in such precious samples. This protein subclass comprises cell membrane associated and extracellular proteins, which are reachable by systemically deliverable substances and hence especially suitable for diagnosis and targeted therapy applications. To isolate such proteins, we exploited the ability of chemically modified biotin to label ex vivo accessible proteins and the fact that most of these proteins are glycosylated. This approach consists of three successive steps involving first the linkage of potentially accessible proteins to biotin molecules followed by their purification. The remaining proteins are then subjected to glycopeptide isolation. Finally, the nonglycosylated peptides are analyzed and used in an in silico approach that increases the confidence of glycoprotein identification. The value of the technique was demonstrated on human breast cancer tissue samples originating from 5 individuals. Altogether, the method delivered quantitative data on more than 400 potentially accessible proteins (per sample and replicate). In comparison to biotinylation or glycoprotein analysis alone, the sequential method significantly increased the number (≥30% and ≥50% respectively) of potentially therapeutically and diagnostically valuable proteins. The sequential method led to the identification of 93 differentially modulated proteins, several of which had not previously been reported to be associated with breast cancer. One of these novel potential biomarkers was CD276, a cell membrane-associated glycoprotein. The immunohistochemistry analysis showed that CD276 is significantly differentially expressed in a series of breast cancer lesions. Because our technology is applicable to any type of tissue biopsy, it can accelerate the discovery of new relevant biomarkers in a broad spectrum of pathologies.

19.
Just as the power of the open-source design paradigm has driven down the cost of software to the point that it is accessible to most people, the rise of open-source hardware is poised to drive down the cost of doing experimental science to expand access to everyone. To assist in this aim, this paper introduces a library of open-source 3-D-printable optics components. This library operates as a flexible, low-cost public-domain tool set for developing both research and teaching optics hardware. First, the use of parametric open-source designs using an open-source computer aided design package is described to customize the optics hardware for any application. Second, details are provided on the use of open-source 3-D printers (additive layer manufacturing) to fabricate the primary mechanical components, which are then combined to construct complex optics-related devices. Third, the use of an open-source electronics prototyping platform to control optical experimental apparatus is illustrated. This study demonstrates an open-source optical library, which significantly reduces the costs associated with much optical equipment, while also enabling easily adapted, customizable designs. The cost reductions in general are over 97%, with some components representing only 1% of the current commercial investment for optical products of similar function. The results of this study make it clear that this method of scientific hardware development enables a much broader audience to participate in optical experimentation, both as research and teaching platforms, than previous proprietary methods.

20.
Malaria is a global disease infecting several million individuals annually. Malarial infection is particularly severe in the poorest parts of the world and is a major drain on their limited resources. Development of drug resistance and absence of a preventive vaccine have led to an immediate necessity for identifying new drug targets to combat malaria. Understanding the intricacies of parasite biology is essential to design novel intervention strategies that can prevent the growth of the parasite. The structural biology approach towards this goal involves the identification of key differences in the structures of the human and parasite enzymes and the determination of unique protein structures essential for parasite survival. This review covers the work on structural biology of plasmodial proteins carried out during the period of January 2006 to June 2007.

