Similar Literature
20 similar documents found (search time: 31 ms)
1.
Neuroimaging technologies such as Magnetic Resonance Imaging (MRI) and Computed Tomography (CT) collect three-dimensional (3D) data that is typically viewed on two-dimensional (2D) screens. Physical 3D models, however, allow interaction with real objects such as implantable electrode grids, potentially improving patient-specific neurosurgical planning and personalized clinical education. Desktop 3D printers can now produce relatively inexpensive, good-quality prints. We describe our process for reliably generating life-sized 3D brain prints from MRIs and 3D skull prints from CTs. We have integrated a standardized, primarily open-source process for 3D printing brains and skulls. We describe how to convert clinical neuroimaging Digital Imaging and Communications in Medicine (DICOM) images to stereolithography (STL) files, a common 3D object file format that can be sent to 3D printing services. We additionally share how to convert these STL files to machine-instruction gcode files for reliable in-house printing on desktop, open-source 3D printers. We have successfully printed over 19 patient brain hemispheres from 7 patients on two different open-source desktop 3D printers. Each brain hemisphere costs approximately $3–4 in consumable plastic filament as described, and the total process takes 14–17 hours, almost all of which is unsupervised (preprocessing = 4–6 hr; printing = 9–11 hr; post-processing = <30 min). Printing a matching portion of a skull costs $1–5 in consumable plastic filament and takes less than 14 hr in total. We have developed a streamlined, cost-effective process for 3D printing brain and skull models. We surveyed healthcare providers and patients, who confirmed that rapid-prototype, patient-specific 3D models may help interdisciplinary surgical planning and patient education. The methods we describe can be applied for other clinical, research, and educational purposes.
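As a rough illustration of the DICOM-to-STL step, the following Python sketch loads a DICOM series with pydicom, extracts an isosurface with scikit-image, and writes an STL with numpy-stl. The file paths and threshold value are placeholders, and a real pipeline would segment the brain during preprocessing rather than thresholding raw intensities; this is not the authors' exact toolchain.

import glob
import numpy as np
import pydicom
from skimage import measure
from stl import mesh   # pip install numpy-stl

# Load a DICOM series into a 3D volume, sorted by slice position.
slices = [pydicom.dcmread(f) for f in glob.glob("series/*.dcm")]
slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))
volume = np.stack([s.pixel_array for s in slices]).astype(np.float32)

# Extract a triangle surface at a placeholder intensity threshold; a real
# pipeline would first segment the brain rather than threshold raw MRI values.
verts, faces, _, _ = measure.marching_cubes(volume, level=300.0)

# Write the mesh to STL, ready for a slicer to turn into printer gcode.
surface = mesh.Mesh(np.zeros(faces.shape[0], dtype=mesh.Mesh.dtype))
for i, f in enumerate(faces):
    surface.vectors[i] = verts[f]
surface.save("brain.stl")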

2.
In this paper, we propose and evaluate a flexible architecture for desktop grids that supports multiple task allocation policies on top of a structured P2P overlay. In our proposal, a Bag-of-Tasks application is submitted to random nodes and placed in their local queues, which are processed in FIFO order. When a node becomes idle, it executes a task allocation policy that fetches tasks from remote nodes. The proposed architecture is flexible since it is decoupled from both the P2P middleware and the P2P overlay. A prototype of the proposed architecture was implemented on top of the JXTA middleware, using the Chord P2P search overlay. The results obtained on a 16-machine heterogeneous desktop grid show that very good performance gains are obtained with multiple task allocation policies. Also, a speedup of 9.85 was achieved for an application composed of 270 network flow balancing tasks, reducing its wall-clock execution time from 32.51 min to 3.3 min.
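A minimal, single-process sketch of the idle-node fetch idea described above: tasks queue FIFO at random nodes, and an idle node pulls work from a remote peer. The names (Node, steal_task) and the "longest queue first" policy are illustrative assumptions, not the paper's API or its specific policies.

from collections import deque
import random

class Node:
    def __init__(self, name):
        self.name = name
        self.queue = deque()            # local FIFO task queue

def steal_task(idle_node, peers):
    """One possible allocation policy: fetch from the longest remote queue."""
    donor = max(peers, key=lambda n: len(n.queue), default=None)
    if donor is not None and donor.queue:
        return donor.queue.popleft()    # preserve the donor's FIFO order
    return None

nodes = [Node(f"n{i}") for i in range(4)]
for t in range(10):                     # Bag-of-Tasks submitted to random nodes
    random.choice(nodes).queue.append(f"task-{t}")

idle = nodes[0]
while True:
    task = idle.queue.popleft() if idle.queue else steal_task(idle, nodes[1:])
    if task is None:
        break
    print(idle.name, "runs", task)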

3.

More than 1000 distributed ledger technology (DLT) systems, which raised $600 billion in investment in 2016, demonstrate the unprecedented and disruptive potential of blockchain technology. A systematic, data-driven analysis, comparison, and rigorous evaluation of the different design choices of distributed ledgers and their implications remains a challenge. The rapidly evolving nature of the blockchain landscape hinders a common understanding of the techno-socio-economic design space of distributed ledgers and the cryptoeconomies they support. To fill this gap, this paper makes the following contributions: (i) a conceptual architecture of DLT systems, with which (ii) a taxonomy is designed and (iii) a rigorous classification of DLT systems is made using real-world data and the wisdom of the crowd; (iv) a DLT design guideline is the end result of applying machine learning methodologies to the classification data. Compared to related work, and as defined in earlier taxonomy theory, the proposed taxonomy is highly comprehensive, robust, explanatory, and extensible. The findings of this paper can provide new insights into and a better understanding of the key design choices that drive the modeling complexity of DLT systems, while identifying opportunities for new research contributions and business innovation.
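As a hedged illustration of contribution (iv), the sketch below fits an interpretable model (a decision tree) to a toy matrix of classified design choices to surface rules of the kind a design guideline might contain; the features, labels, and data are invented, not the paper's dataset or its chosen method.

from sklearn import tree

# Toy classified design choices: permissionless? token-based? on-chain governance?
X = [[1, 1, 0], [1, 1, 1], [0, 0, 0], [0, 1, 0], [1, 0, 1], [0, 0, 1]]
y = ["PoW", "PoS", "BFT", "BFT", "PoS", "BFT"]   # consensus family per system

clf = tree.DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(tree.export_text(
    clf, feature_names=["permissionless", "token_based", "governance"]))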


4.
Focusing light on infection in four dimensions
The fusion of cell biology with microbiology has bred a new discipline, cellular microbiology, whose primary aim is to understand host-pathogen interactions at the tissue, cellular, and molecular levels. In this context, we require techniques that allow us to probe infection in situ and extract quantitative information on its spatiotemporal dynamics. To these ends, fluorescent light-based imaging techniques offer a powerful tool, and the state of the art is defined by paradigms using so-called multidimensional (multi-D) imaging microscopy. Multi-D imaging aims to visualize and quantify biological events through time and space and, more specifically, refers to combinations of three-dimensional (3D, volume), four-dimensional (4D, time), and five-dimensional (5D, multiwavelength) recordings. Successful multi-D imaging depends upon understanding the available technologies and their limitations. This is especially true in the field of microbiology, where visualization of infectious/pathogenic activities inside living host systems presents particular technical challenges. Thus, as multi-D imaging rapidly becomes a common bench tool for the cellular microbiologist, this review provides the new user with some of the technical insight required to get the best from these methods.

5.
Modeling has always been at the core of both organizational design and information systems (IS) development. Models enable decision makers to filter out the irrelevant complexities of the real world, so that efforts can be directed toward the most important parts of the system under study. However, both business analysts and IS professionals may find it difficult to navigate the maze of theoretical paradigms, methodological approaches, and representational formalisms that have been proposed for both business process modeling (BPM) and information systems modeling (ISM). This paper sets out to bring order to this chaos by proposing an evaluation framework and a novel taxonomy of BPM and ISM techniques. These findings, coupled with a detailed review of BPM and ISM techniques, can assist decision makers in comparatively evaluating and selecting suitable modeling techniques, depending on the characteristics and requirements of individual projects.

6.

Background

Desktop virtual environments (VEs) are increasingly deployed to study the effects of environmental qualities and interventions on human behavior and safety-related concerns in built environments. For these applications it is essential that users appraise the affective qualities of the VE similarly to those of its real-world counterpart. Previous studies have shown that factors like simulated lighting, sound, and dynamic elements all contribute to the affective appraisal of a desktop VE. Since ambient odor is known to affect the affective appraisal of real environments, and has been shown to increase the sense of presence in immersive VEs, it may also be an effective tool for tuning the affective appraisal of desktop VEs. This study investigated whether exposure to ambient odor can modulate the affective appraisal of a desktop VE with signs of public disorder.

Method

Participants explored a desktop VE representing a suburban neighborhood with signs of public disorder (neglect, vandalism, and crime), while being exposed to either room air or subliminal levels of an unpleasant (tar) or pleasant (cut grass) ambient odor. Whenever they encountered signs of disorder they reported their safety-related concerns and associated affective feelings.

Results

Signs of crime in the desktop VE were associated with negative affective feelings and with concerns for personal safety and personal property. However, there was no significant difference in reported safety-related concerns and affective connotations between the control (no-odor) condition and either of the two ambient odor conditions.

Conclusion

Ambient odor did not affect the safety-related concerns and affective connotations associated with signs of disorder in the desktop VE. Thus, semantic congruency between an ambient odor and a desktop VE may not be sufficient to influence its affective appraisal; a more realistic simulation, in which simulated objects appear to emit scents, may be required to achieve this goal.

7.
The realization of a sustainable bioeconomy requires our ability to understand and engineer complex design principles for the development of platform organisms capable of efficient conversion of cheap and sustainable feedstocks (e.g., sunlight, CO2, and nonfood biomass) into biofuels and bioproducts at sufficient titers and costs. For model microbes, such as Escherichia coli, advances in DNA reading and writing technologies are driving the adoption of new paradigms for engineering biological systems. Unfortunately, microbes with properties of interest for the utilization of cheap and renewable feedstocks, such as photosynthesis, autotrophic growth, and cellulose degradation, have very few, if any, genetic tools for metabolic engineering. Therefore, it is important to develop “design rules” for building a genetic toolbox for novel microbes. Here, we present an overview of our current understanding of these rules for the genetic manipulation of prokaryotic microbes and the available genetic tools to expand our ability to genetically engineer nonmodel systems.

8.
Optimization of power and energy consumption is an important concern in the design of modern and future computing and communication systems. Various techniques and high-performance technologies have been investigated and developed for the efficient management of such systems. All these technologies should provide good performance and cope with increased workload demands in dynamic environments such as Computational Grids (CGs), clusters, and clouds. In this paper we approach independent batch scheduling in CGs as a bi-objective minimization problem with makespan and energy consumption as the scheduling criteria. We use the Dynamic Voltage Scaling (DVS) methodology for scaling, and possibly reducing, the cumulative energy utilized by the system resources. We develop two implementations of a Hierarchical Genetic Strategy-based grid scheduler (Green-HGS-Sched), with elitist and struggle replacement mechanisms. The proposed algorithms were empirically evaluated against single-population Genetic Algorithms (GAs) and Island GA models for four CG size scenarios in static and dynamic modes. The simulation results show that the proposed scheduling methodologies considerably reduce energy usage and can easily adapt to dynamically changing grid states and various scheduling scenarios.
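The following toy sketch illustrates the bi-objective evaluation underlying DVS-aware scheduling: each task is assigned a machine and a voltage level, a lower level stretches runtime but cuts power, and a schedule is scored on both makespan and energy. The two DVS levels, the power figures, and the naive scalarized exhaustive search are illustrative assumptions, not the Green-HGS-Sched algorithm.

import itertools

SPEED = {"high": 1.0, "low": 0.6}     # relative CPU speed per DVS level
POWER = {"high": 1.0, "low": 0.36}    # relative power draw (~ V^2 * f)

def evaluate(schedule, work):
    """schedule: task -> (machine, level); returns (makespan, energy)."""
    finish, energy = {}, 0.0
    for task, (machine, level) in schedule.items():
        t = work[task] / SPEED[level]             # a lower level stretches the task
        finish[machine] = finish.get(machine, 0.0) + t
        energy += POWER[level] * t                # energy = power * runtime
    return max(finish.values()), energy

work = {"t1": 4.0, "t2": 2.0, "t3": 3.0}
slots = list(itertools.product("AB", SPEED))      # (machine, DVS level) choices
best = min((dict(zip(work, combo))
            for combo in itertools.product(slots, repeat=len(work))),
           key=lambda s: sum(evaluate(s, work)))  # naive scalarization of both criteria
print(best, evaluate(best, work))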

9.
10.
Synthetic biology is an interdisciplinary field that takes top-down approaches to understand and engineer biological systems through design-build-test cycles. A number of advances in this relatively young field have greatly accelerated such engineering cycles. Specifically, various innovative tools were developed for in silico biosystems design, DNA de novo synthesis and assembly, construct verification, as well as metabolite analysis, which have laid a solid foundation for building biological foundries for rapid prototyping of improved or novel biosystems. This review summarizes the state-of-the-art technologies for synthetic biology and discusses the challenges to establish such biological foundries.

11.
Cyanobacteria are photosynthetic bacteria that occupy various habitats across the globe, playing critical roles in many of Earth's biogeochemical cycles in both aquatic and terrestrial systems. Despite their well-known significance, their taxonomy remains problematic and is the subject of much research. Taxonomic issues in Cyanobacteria have consequently led to inaccurate curation within known reference databases, ultimately leading to problematic taxonomic assignment during diversity studies. Recent advances in sequencing technologies have increased our ability to characterize and understand microbial communities, leading to the generation of thousands of sequences that require taxonomic assignment. We herein propose CyanoSeq (https://zenodo.org/record/7569105), a database of cyanobacterial 16S rRNA gene sequences with curated taxonomy. The taxonomy of CyanoSeq is based on the current state of cyanobacterial taxonomy, with ranks from the domain to the genus level. Files are provided for use with common naive Bayes taxonomic classifiers, such as those included in DADA2 or the QIIME2 platform. Additionally, FASTA files are provided for the creation of de novo phylogenetic trees with (near) full-length 16S rRNA gene sequences, to determine the phylogenetic relationships of cyanobacterial strains and/or ASVs/OTUs. The database currently consists of 5410 cyanobacterial 16S rRNA gene sequences, along with 123 chloroplast, bacterial, and Vampirovibrionia (formerly Melainabacteria) sequences.
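As a hedged sketch of the kind of naive Bayes taxonomic classifier CyanoSeq's files are formatted for (e.g., in DADA2 or QIIME2), the following trains scikit-learn's MultinomialNB on k-mer counts; the toy sequences and genus labels stand in for real 16S rRNA training data, and this is not the DADA2/QIIME2 implementation itself.

from itertools import product
import numpy as np
from sklearn.naive_bayes import MultinomialNB

K = 4
KMERS = {"".join(p): i for i, p in enumerate(product("ACGT", repeat=K))}

def kmer_counts(seq):
    """Represent a sequence as a vector of overlapping k-mer counts."""
    v = np.zeros(len(KMERS))
    for i in range(len(seq) - K + 1):
        v[KMERS[seq[i:i + K]]] += 1
    return v

train_seqs = ["ACGTACGTACGTACGT", "TTTTGGGGCCCCAAAA", "ACGTACGGACGTACGA"]
train_taxa = ["g__Nostoc", "g__Synechococcus", "g__Nostoc"]  # illustrative labels

clf = MultinomialNB().fit([kmer_counts(s) for s in train_seqs], train_taxa)
print(clf.predict([kmer_counts("ACGTACGTACGAACGT")]))  # -> a genus-level call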

12.
Trends in Parasitology, 2023, 39(6):475–486
The study of tick evolution may be classified into disciplines such as taxonomy and systematics, biogeography, evolution and development (evo-devo), ecology, and hematophagy. These disciplines overlap and impact each other to varying extents. Advances in one field may lead to paradigm shifts in our understanding of tick evolution that are not apparent to other fields. The current study considers paradigm shifts that have occurred, are in progress, or may occur in the future for the disciplines that study tick evolution. Some disciplines have undergone significant changes, while others may still be developing their own paradigms. Integration of these various disciplines is essential to arrive at a holistic view of tick evolution; however, maturation of paradigms may be necessary before this vision can be attained.

13.

Background  

Microarray analysis allows the simultaneous measurement of thousands to millions of genes or sequences across tens to thousands of different samples. The analysis of the resulting data tests the limits of existing bioinformatics computing infrastructure. A solution to this issue is to use High Performance Computing (HPC) systems, which contain many processors and more memory than desktop computer systems. Many biostatisticians use R to process the data gleaned from microarray analysis, and there is even a dedicated group of packages, Bioconductor, for this purpose. However, to exploit HPC systems, R must be able to utilise the multiple processors available on these systems. There are existing modules that enable R to use multiple processors, but these are either difficult to use for the HPC novice or cannot be used to solve certain classes of problems. A method of exploiting HPC systems using R, but without recourse to mastering parallel programming paradigms, is therefore necessary to analyse genomic data to its fullest.
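The paper targets R/Bioconductor, but the data-parallel pattern it calls for is language-agnostic; here is the same idea sketched in Python: split a genes-by-samples matrix across worker processes and compute a per-gene statistic in parallel. The matrix, group split, and worker count are invented for illustration.

import numpy as np
from multiprocessing import Pool
from scipy import stats

expr = np.random.default_rng(0).normal(size=(10_000, 40))  # 10k genes, 20 vs 20 samples

def gene_t(row):
    # Welch t-statistic comparing the two sample groups for one gene.
    return stats.ttest_ind(row[:20], row[20:], equal_var=False).statistic

if __name__ == "__main__":
    with Pool(processes=4) as pool:        # one worker per available core
        t_stats = pool.map(gene_t, expr, chunksize=500)
    print(len(t_stats), t_stats[:3])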

14.
The taxonomic impediment to biodiversity studies may be influenced radically by the application of new technology, in particular, desktop image analysers and neural networks. The former offer an opportunity to automate objective feature measurement processes, and the latter provide powerful pattern recognition and data analysis tools which are able to 'learn' patterns in multivariate data. The coupling of these technologies may provide a realistic opportunity for the automation of routine species identifications. The potential benefits and limitations of these technologies, along with the development of automated identification systems, are reviewed.

15.
There is, or there should be, an interaction between the concepts of taxonomy and biodiversity. On the one hand, taxonomy develops general and particular classificatory paradigms, whose own diversity must be taken into account to understand the nature of the variety of natural kinds. On the other hand, analysis of the properties of biodiversity may pose nontrivial problems for taxonomy that cannot be deduced directly from its own statements. From the point of view of taxonomy, it is argued that the current concept of biodiversity, based entirely on the species concept, is deeply rooted in a reductionistic view of nature. It is outdated epistemologically and should be replaced by the modern taxonomic concept of the hierarchical phylogenetic pattern. Operationally, the latter presumes that each species can be assigned a certain "phylogenetic weight" according to its phylogenetic uniqueness. From the point of view of biodiversity, it is argued that global biodiversity is a three-component entity, as it includes, in addition to phylogenetic and ecological hierarchies, a biomorphic hierarchy. This calls for taxonomy to elaborate general principles for the classification of biomorphs.

16.
The recently introduced term ‘integrative taxonomy’ refers to taxonomy that integrates all available data sources to frame species limits. We survey current taxonomic methods available to delimit species that integrate a variety of data, including molecular and morphological characters. A literature review of empirical studies using the term ‘integrative taxonomy’ assessed the kinds of data being used to frame species limits, and methods of integration. Almost all studies are qualitative and comparative – we are a long way from a repeatable, quantitative method of truly ‘integrative taxonomy’. The usual methods for integrating data in phylogenetic and population genetic paradigms are not appropriate for integrative taxonomy, either because of the diverse range of data used or because of the special challenges that arise when working at the species/population boundary. We identify two challenges that, if met, will facilitate the development of a more complete toolkit and a more robust research programme in integrative taxonomy using species tree approaches. We propose the term ‘iterative taxonomy’ for current practice that treats species boundaries as hypotheses to be tested with new evidence. A search for biological or evolutionary explanations for discordant evidence can be used to distinguish between competing species boundary hypotheses. We identify two recent empirical examples that use the process of iterative taxonomy.

17.
Microbial communities exhibit exquisitely complex structure. Many aspects of this complexity, from the number of species to the total number of interactions, are currently very difficult to examine directly. However, extraordinary efforts are being made to make these systems accessible to scientific investigation. While recent advances in high-throughput sequencing technologies have improved access to the taxonomic and functional diversity of complex communities, monitoring the dynamics of these systems over time and space, using appropriate experimental design, is still expensive. Fortunately, modeling can be used as a lens to focus low-resolution observations of community dynamics, enabling mathematical abstractions of functional and taxonomic dynamics across space and time. Here, we review approaches for modeling bacterial diversity at both the very large and the very small scales at which microbial systems interact with their environments. We show that modeling can help to connect biogeochemical processes to specific microbial metabolic pathways.
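A minimal sketch of one modeling approach in this space: a generalized Lotka-Volterra system for a three-species community, integrated with SciPy. The growth rates and interaction matrix are invented for illustration and are not drawn from the review.

import numpy as np
from scipy.integrate import solve_ivp

r = np.array([1.0, 0.7, 0.5])        # intrinsic growth rates
A = np.array([[-1.0, -0.2, -0.1],    # A[i][j]: effect of species j on species i
              [-0.3, -1.0, -0.2],
              [-0.2, -0.4, -1.0]])

def glv(t, x):
    return x * (r + A @ x)           # dx_i/dt = x_i * (r_i + sum_j A_ij x_j)

sol = solve_ivp(glv, (0, 50), [0.1, 0.1, 0.1])
print(sol.y[:, -1])                  # abundances approach a coexistence equilibrium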

18.
We have designed a set of protocols that use peer-to-peer techniques to efficiently implement a distributed and decentralized desktop grid. Incoming jobs with different resource requirements are matched with system nodes through proximity in an N-dimensional Content-Addressable Network, where each resource type is represented as a distinct dimension. In this paper, we describe a comprehensive suite of techniques that cooperate to maximize throughput, and to ensure that load is balanced across all peers. We balance load induced by job executions through randomly generated virtual dimension values, which act to disaggregate clusters of nodes/jobs, and also by a job pushing mechanism based on an approximate global view of the system. We improve upon initial job assignments by using a job-stealing mechanism to overcome load imbalance caused by heterogeneity of nodes/jobs and stale load information. We also describe a set of optimizations that combine to reduce the system load created by the management of the underlying peer-to-peer system and the job-monitoring infrastructure. Unlike other systems, we can effectively support resource constraints of jobs during the course of load balancing since we simplify the problem of matchmaking through building a multi-dimensional resource space and mapping jobs and nodes to this space. We use extensive simulation results to show that the new techniques improve scalability, system throughput, and average response time.
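A toy sketch of the core matchmaking idea: represent each resource type as a dimension, map nodes and jobs into that space, and match a job to the nearest node satisfying its minimum requirements. The dimensions, capacities, and Euclidean metric are illustrative assumptions; the paper's CAN overlay, virtual dimensions, and job stealing are not modeled here.

import math

DIMS = ("cpu_ghz", "ram_gb", "disk_gb")   # one dimension per resource type

nodes = {
    "n1": (2.4, 8, 200),
    "n2": (3.2, 32, 500),
    "n3": (1.8, 4, 100),
}

def match(job_req):
    """Nearest node (Euclidean, in resource space) meeting every requirement."""
    feasible = [n for n, cap in nodes.items()
                if all(c >= r for c, r in zip(cap, job_req))]
    return min(feasible,
               key=lambda n: math.dist(nodes[n], job_req),
               default=None)

print(match((2.0, 16, 50)))   # -> 'n2', the closest node that satisfies the job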

19.
MetaSim: a sequencing simulator for genomics and metagenomics
Richter DC, Ott F, Auch AF, Schmid R, Huson DH. PLoS ONE, 2008, 3(10):e3373

Background

The new research field of metagenomics is providing exciting insights into various previously unclassified ecological systems. Next-generation sequencing technologies are producing a rapid increase in environmental data in public databases. There is a great need for specialized software solutions and statistical methods for dealing with complex metagenome data sets.

Methodology/Principal Findings

To facilitate the development and improvement of metagenomic tools and the planning of metagenomic projects, we introduce a sequencing simulator called MetaSim. Our software can be used to generate collections of synthetic reads that reflect the diverse taxonomical composition of typical metagenome data sets. Based on a database of given genomes, the program allows the user to design a metagenome by specifying the number of genomes present at different levels of the NCBI taxonomy, and then to collect reads from the metagenome using a simulation of a number of different sequencing technologies. A population sampler optionally produces evolved sequences based on source genomes and a given evolutionary tree.

Conclusions/Significance

MetaSim allows the user to simulate individual read datasets that can be used as standardized test scenarios for planning sequencing projects or for benchmarking metagenomic software.
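At its core, a read simulator like MetaSim samples reads from source genomes at chosen abundances and applies an error model. The following minimal sketch does only that, with placeholder genomes, a uniform substitution rate, and none of MetaSim's technology-specific error profiles or population sampling.

import random

random.seed(42)
genomes = {"genomeA": "ACGT" * 500, "genomeB": "TTGCA" * 400}
abundance = [0.7, 0.3]              # taxon proportions in the simulated design
READ_LEN, ERR = 50, 0.01            # read length and per-base substitution rate

def simulate_read():
    src = random.choices(list(genomes), weights=abundance)[0]
    start = random.randrange(len(genomes[src]) - READ_LEN)
    read = [b if random.random() > ERR else random.choice("ACGT".replace(b, ""))
            for b in genomes[src][start:start + READ_LEN]]
    return src, "".join(read)

for _ in range(3):
    print(simulate_read())       # (source genome, simulated read with errors)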

20.
Waterfowl biologists estimate seed production in moist-soil wetlands to calculate duck-energy days (DEDs) and evaluate management techniques. Previously developed models that predict plant seed yield from morphological measurements are tedious and time consuming. We developed simple linear regression models that indirectly and directly related seed-head area to seed production for 7 common moist-soil plants, using portable and desktop scanners and a dot grid, and we compared processing time and predictive ability among models. To construct the models, we randomly collected approximately 60 plants/species at the Tennessee National Wildlife Refuge, USA, during September 2005 and 2006, threshed and dried seed from seed heads, and related dry mass to seed-head area. All models explained substantial variation in seed mass (R² ≥ 0.87) and had high predictive ability (predicted R² ≥ 0.84). Processing time for seed heads averaged 22 and 3 times longer for the dot grid and portable scanner, respectively, than for the desktop scanner. We recommend the use of desktop scanners for accurate and rapid estimation of moist-soil plant seed production. Per-plant seed predictions from our models can be used to estimate total seed production and DEDs in moist-soil wetlands.
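A hedged sketch of the paper's modeling approach: regress dry seed mass on scanned seed-head area, then convert a prediction into duck-energy days. All numbers (areas, masses, energy constants) are invented for illustration and are not the paper's data or coefficients.

import numpy as np

area_cm2 = np.array([12.0, 18.5, 25.1, 31.0, 40.2])   # scanned seed-head areas
seed_g   = np.array([0.9,  1.5,  2.1,  2.6,  3.4])    # threshed dry seed mass

slope, intercept = np.polyfit(area_cm2, seed_g, 1)    # fit seed_g = slope*area + intercept
pred = slope * 28.0 + intercept                       # predicted mass for a new seed head
print(f"predicted seed mass: {pred:.2f} g")

# DEDs = available food energy / one duck's daily energy requirement.
kcal_per_g, daily_need_kcal = 2.5, 292.0              # illustrative constants only
print("duck-energy days per plant:", pred * kcal_per_g / daily_need_kcal)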

