Similar Articles
 20 similar articles found (search time: 15 ms)
1.
2.
STORM is a software package that allows users to test a variety of hypotheses regarding patterns of relatedness and patterns of mate choice and/or mate compatibility within a population. These functions are based on four main calculations that can be conducted either independently or in the hypothesis-testing framework: internal relatedness; homozygosity by loci; pairwise relatedness; and a new metric called allele inheritance, which calculates the proportion of loci at which an offspring inherits a paternal allele different from that inherited from its mother. STORM allows users to test four hypotheses based on these calculations and Monte Carlo simulations: (i) are individuals within observed associations or groupings more/less related than expected; (ii) do observed offspring have more/less genetic variability (based on internal relatedness or homozygosity by loci) than expected from the gene pool; (iii) are observed mating pairs more/less related than expected if mating is random with respect to relatedness; and (iv) do observed offspring inherit paternal alleles different from those inherited from the mother more/less often than expected under Mendelian inheritance.
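As a concrete illustration of the allele-inheritance calculation defined above, here is a minimal Python sketch; the genotype encoding and function name are illustrative, not STORM's actual interface:

```python
# Minimal sketch of the "allele inheritance" metric: the proportion of loci
# at which the offspring's paternal allele differs from its maternal allele.
# Genotypes are (allele, allele) tuples per locus; illustrative code only,
# not STORM's actual API.

def allele_inheritance(offspring, mother):
    """offspring, mother: lists of genotype tuples, one per locus."""
    informative = 0
    differs = 0
    for (o1, o2), mom in zip(offspring, mother):
        if o1 == o2:
            if o1 in mom:            # homozygous offspring: paternal == maternal
                informative += 1
            continue
        if o1 in mom and o2 not in mom:
            maternal, paternal = o1, o2
        elif o2 in mom and o1 not in mom:
            maternal, paternal = o2, o1
        else:
            continue                 # ambiguous locus: skip it
        informative += 1
        if paternal != maternal:
            differs += 1
    return differs / informative if informative else float("nan")

offspring = [(101, 103), (99, 99), (120, 124)]
mother    = [(101, 105), (97, 99), (120, 122)]
print(allele_inheritance(offspring, mother))   # 2 of 3 informative loci -> 0.67
```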

3.
Cardiac experimental electrophysiology is in need of a well-defined Minimum Information Standard for recording, annotating, and reporting experimental data. As a step towards establishing this, we present a draft standard, called Minimum Information about a Cardiac Electrophysiology Experiment (MICEE). The ultimate goal is to develop a useful tool for cardiac electrophysiologists that facilitates and improves dissemination of the minimum information necessary for reproduction of cardiac electrophysiology research, allowing for easier comparison and utilisation of findings by others. It is hoped that this will enhance the integration of individual results into experimental, computational, and conceptual models. In its present form, this draft is intended for assessment and development by the research community. We invite the reader to join this effort and, if deemed productive, to implement the Minimum Information about a Cardiac Electrophysiology Experiment standard in their own work.

4.
Background and Objective: The development, control, and optimisation of new x-ray breast imaging modalities could benefit from a quantitative assessment of the resulting image textures. The aim of this work was to develop a software tool for routine radiomics applications in breast imaging, which will also be made available upon request. Methods: The tool (developed in MATLAB) allows image reading, selection of regions of interest (ROIs), analysis, and comparison. Requirements for the tool also included convenient handling of common medical and simulated images, a library of commonly applied algorithms, and a friendly graphical user interface. An initial set of features and analyses was selected after a literature search. Being open, the tool can be extended if necessary. Results: The tool allows semi-automatic extraction of ROIs and calculates a total of 23 different metrics or features in 2D images and/or 3D image volumes. Computation of the features was verified against other software packages using test images. Two case studies illustrate the applicability of the tool: (i) features were computed and compared on a series of 2D 'left' and 'right' CC mammograms acquired on a Siemens Inspiration system, and (ii) the suitability of newly proposed breast phantoms for x-ray-based imaging was evaluated against reference values from clinical mammography images. The results obtained could steer the further development of the physical breast phantoms. Conclusions: A new image analysis toolbox was realised and can now be used in a multitude of radiomics applications, on both clinical and test images.
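For readers unfamiliar with such features, the following numpy sketch computes a representative handful of first-order texture metrics on a ROI. The actual toolbox is MATLAB-based and implements 23 features, so this is only an illustrative analogue:

```python
# Illustrative first-order texture features on a 2D ROI, of the kind a
# radiomics toolbox computes; only a representative handful is shown.
import numpy as np

def first_order_features(roi, n_bins=64):
    x = roi.astype(float).ravel()
    hist, _ = np.histogram(x, bins=n_bins)
    p = hist / hist.sum()
    p = p[p > 0]                              # drop empty bins before entropy
    mu, sigma = x.mean(), x.std()
    return {
        "mean": mu,
        "variance": sigma ** 2,
        "skewness": ((x - mu) ** 3).mean() / sigma ** 3,
        "kurtosis": ((x - mu) ** 4).mean() / sigma ** 4,
        "entropy": -(p * np.log2(p)).sum(),
    }

roi = np.random.default_rng(0).normal(128, 20, size=(64, 64))
for name, value in first_order_features(roi).items():
    print(f"{name:9s} {value:10.3f}")
```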

5.
Improvements in the determination of individual genotypes from samples with low DNA quantity and quality are of prime importance in molecular ecology and conservation for reliable genetic identification of individuals (molecular tagging using microsatellite loci). Errors (e.g. allelic dropout and false alleles) arising during sample genotyping must therefore be monitored and eliminated as far as possible. The multitube approach is a very effective, but costly and time-consuming, solution. In this paper, we present simulation software that allows evaluation of the effect of genotyping errors on the genetic identification of individuals and of the effectiveness of a multitube approach to correct these errors.
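A toy Monte Carlo along these lines, with illustrative (not empirically fitted) error rates, shows how a multitube consensus suppresses allelic dropout and false alleles:

```python
# Toy Monte Carlo of the two genotyping error types discussed above
# (allelic dropout and false alleles) and of a multitube consensus rule:
# each genotype is amplified in n independent tubes and an allele is
# accepted only if it appears in at least two tubes. All rates are
# illustrative assumptions, not estimates from real data.
import random

def observe(true_genotype, p_dropout=0.2, p_false=0.05,
            allele_pool=range(90, 130, 2)):
    a, b = true_genotype
    if a != b and random.random() < p_dropout:   # allelic dropout: one allele
        a = b = random.choice((a, b))            # of a heterozygote is missed
    if random.random() < p_false:                # a false allele replaces one copy
        a = random.choice(allele_pool)
    return tuple(sorted((a, b)))

def multitube_consensus(true_genotype, n_tubes=4):
    counts = {}
    for _ in range(n_tubes):
        for allele in set(observe(true_genotype)):
            counts[allele] = counts.get(allele, 0) + 1
    confirmed = tuple(sorted(a for a, c in counts.items() if c >= 2))
    return confirmed if 1 <= len(confirmed) <= 2 else None   # else: re-genotype

random.seed(1)
trials = 10_000
hits = sum(multitube_consensus((100, 104)) == (100, 104) for _ in range(trials))
print(f"correct multitube consensus rate: {hits / trials:.3f}")
```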

6.
With continued efforts towards a single MSI data format, data conversion routines must be made universally available. The benefits of a common imaging format, imzML, are slowly becoming more widely appreciated, but the format is still used by only a small proportion of imaging groups. Increased awareness amongst researchers and continued support from major MS vendors in providing tools for converting proprietary formats into imzML are likely to result in rapidly increasing uptake of the format. It is important that this does not lead to the exclusion of researchers using older or unsupported instruments. We describe an open-source converter, imzMLConverter, to guard against this. We propose that proprietary formats should first be converted to mzML using one of the widely available converters, such as msconvert, and then converted from mzML to imzML using imzMLConverter. This will allow a wider audience to benefit from the imzML format immediately.
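The proposed two-step workflow might look as follows. The msconvert flags are standard ProteoWizard usage, while the imzMLConverter command line is a placeholder; check the tool's documentation for its actual interface:

```python
# Sketch of the proposed two-step conversion: vendor format -> mzML via
# msconvert (ProteoWizard), then mzML -> imzML via imzMLConverter.
import subprocess

raw_file = "run01.raw"

# Step 1: proprietary format to mzML with ProteoWizard's msconvert.
subprocess.run(["msconvert", raw_file, "--mzML", "-o", "converted"], check=True)

# Step 2: mzML to imzML (hypothetical command line; imzMLConverter may
# instead be driven through its GUI or a different entry point).
subprocess.run(["java", "-jar", "imzMLConverter.jar",
                "converted/run01.mzML", "run01.imzML"], check=True)
```

Batch conversion would simply loop this pair of calls over a directory of vendor files.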

7.
Essential Biodiversity Variables (EBVs) are fundamental variables that can be used for assessing biodiversity change over time, for determining adherence to biodiversity policy, for monitoring progress towards sustainable development goals, and for tracking biodiversity responses to disturbances and management interventions. Data from observations or models that provide measured or estimated EBV values, which we refer to as EBV data products, can help to capture the above processes and trends and can serve as a coherent framework for documenting trends in biodiversity. Using primary biodiversity records and other raw data as sources to produce EBV data products depends on cooperation and interoperability among multiple stakeholders, including those collecting and mobilising data for EBVs and those producing, publishing, and preserving EBV data products. Here, we encapsulate ten principles for current best practice in EBV-focused biodiversity informatics as 'The Bari Manifesto', serving as implementation guidelines for data and research infrastructure providers to support the emerging EBV operational framework based on trans-national and cross-infrastructure scientific workflows. The principles provide guidance on how to contribute to the production of EBV data products that are globally oriented, while remaining appropriate to the producer's own mission, vision, and goals. The ten principles cover: data management planning; data structure; metadata; services; data quality; workflows; provenance; ontologies/vocabularies; data preservation; and accessibility. For each principle, desired outcomes and goals have been formulated. Specific actions related to fulfilling the Bari Manifesto principles are highlighted for each of four groups of organizations contributing to data interoperability: data standards bodies, research data infrastructures, the pertinent research communities, and funders. The Bari Manifesto provides a roadmap that enables support for routine generation of EBV data products and increases the likelihood of success of a global EBV framework.

8.
MOTIVATION: Reliable, automated communication of biological information requires methods to declare the information's semantics. In this paper I describe an approach to semantic declaration intended to permit independent, distributed databases, algorithms, and servers to exchange and process requests for information and computations without requiring coordination or agreement among them on universe of discourse, data model, schema, or implementation. RESULTS: This approach uses Glossa, a formal language defining the semantics of biological ideas, information, and algorithms, to executably define the semantics of complex ideas and computations through constructs of semiotes: terms which axiomatically define very simple notions. A database or algorithm wishing to exchange information or computations maintains a set of mappings between its particular notions and the semiotes, together with a parser to translate between its indigenous ideas and implementation and the semiotes. Requests from other databases or algorithms are issued as semiotic messages, interpreted and processed locally, and the results are returned as semiotes to the requesting entity. Thus, semiotes serve as a shared, abstract layer of definitions which can be computably combined by each database or algorithm according to its own needs and ideas. By combining the explicit declaration of semantics with the computation of the semantics of complex ideas, Glossa and its semiotes permit independent computational entities to loosely federate their capabilities as desired while maintaining their unique perspectives on both scientific and technical questions.
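The mapping-and-translation architecture described here can be caricatured in a few lines. Glossa itself is a formal language, so this dict-based toy mirrors only the overall design, with all names invented for illustration:

```python
# Illustrative sketch of the semiote idea: two resources keep their own local
# vocabularies plus a mapping to a shared set of primitive terms (semiotes),
# and exchange requests only in those shared terms.

SEMIOTES = {"gene", "sequence", "organism"}   # shared primitive terms

class Resource:
    def __init__(self, name, to_semiote):
        self.name = name
        self.to_semiote = to_semiote                       # local -> shared
        self.from_semiote = {v: k for k, v in to_semiote.items()}

    def ask(self, other, local_term):
        msg = self.to_semiote[local_term]                  # translate outgoing
        assert msg in SEMIOTES
        return other.answer(msg)

    def answer(self, semiote):
        local = self.from_semiote[semiote]                 # translate incoming
        return f"{self.name} resolves '{semiote}' as its own '{local}'"

a = Resource("GenBankish", {"locus": "gene", "seq": "sequence"})
b = Resource("FlyBaseish", {"gene_symbol": "gene", "dna": "sequence"})
print(a.ask(b, "locus"))
```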

9.
Modern document protection relies on the simultaneous combination of many optical features with micron and submicron structures, whose complexity is the main obstacle to unauthorized copying. In that sense, documents are best protected by diffractive optical elements generated lithographically and mass-produced by embossing. The problem is that the resulting security elements are identical, facilitating mass production of both original and counterfeited documents. Here, we prove that each butterfly wing-scale is structurally and optically unique, and can be used as an inimitable optical memory tag for document security. Wing-scales, which exhibit angular variability of their color, were laser-cut and bleached to imprint cryptographic information of an authorized issuer. The resulting optical memory tag is extremely durable, as verified by several century-old insect specimens still retaining their coloration. The described technique is simple, amenable to mass production, low-cost, and easy to integrate within the existing security infrastructure.

10.
Purpose: Spectral Computed Tomography (SCT) systems equipped with photon-counting detectors (PCDs) are clinically desirable, since such systems provide not only additional diagnostic information but also radiation dose reductions by a factor of two or more. The current unavailability of clinical PCDs makes simulation of such systems necessary. Methods: In this paper, we present a Monte Carlo-based simulation of an SCT system equipped with a PCD. The aim of this development is to facilitate research on potential clinical applications. Our MC simulator takes into account scattering interactions within the scanned object and can simulate scans with and without scatter under a wide variety of imaging parameters. To demonstrate the usefulness of such an MC simulator for the development of SCT applications, a phantom with contrast targets covering a wide range of clinically significant iodine concentrations is simulated. With these simulations, the impact of scatter and exposure on image quality and material decomposition results is investigated. Results: Our results illustrate that scatter radiation plays a significant role in visual as well as quantitative results. Scatter radiation can reduce the accuracy of contrast agent concentration estimates by up to 15%. Conclusions: We present a reliable and robust software bench for the simulation of SCT systems equipped with PCDs.
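The following minimal Monte Carlo sketch, with toy cross-sections and a 1D slab geometry rather than a validated SCT model, illustrates how primary and scattered photons can be tallied separately:

```python
# Minimal Monte Carlo sketch of photon transport through a slab, separating
# primary from scattered photons at the detector. The 1D geometry, the
# attenuation coefficient, and the scatter fraction are toy values.
import math
import random

MU_TOTAL = 0.2      # total attenuation coefficient, 1/cm (toy value)
P_SCATTER = 0.7     # fraction of interactions that are Compton scatter (toy)
THICKNESS = 10.0    # slab thickness, cm

def simulate(n_photons=100_000):
    primary = scattered_count = 0
    for _ in range(n_photons):
        x, direction, was_scattered = 0.0, 1.0, False
        while True:
            # Sample a free path length and advance along the current direction.
            x += direction * (-math.log(random.random()) / MU_TOTAL)
            if x >= THICKNESS:                  # reached the detector side
                primary += not was_scattered
                scattered_count += was_scattered
                break
            if x < 0:                           # left through the entrance face
                break
            if random.random() < P_SCATTER:     # Compton scatter: new direction
                was_scattered = True
                direction = math.cos(random.uniform(0.0, math.pi))
            else:                               # photoelectric absorption
                break
    return primary, scattered_count

random.seed(0)
p, s = simulate()
print(f"primaries: {p}, scattered: {s}, scatter fraction: {s / (p + s):.2%}")
```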

11.
Much of the debate about reciprocity in humans and other primates hinges on proximate mechanisms, or more precisely, the contingency of one service on another. While there is good evidence for long-term statistical contingencies of services given and received in primates, results for short-term behavioral contingencies are mixed. Indeed, as we show here, controlled experiments using artificial tasks and explicit turn-taking were unlikely to find short-term effects. We therefore used more naturalistic experiments to test for short-term contingencies of grooming on food sharing, and vice versa, in one group of chimpanzees and two groups of bonobos. Overall, we found significant effects of grooming on food sharing and vice versa; however, in the chimpanzees these effects disappeared when controlling for long-term characteristics of the dyad, including services exchanged over the whole study period. In the bonobos, short-term contingencies remained significant, likely as a consequence of considerable tension surrounding monopolizable food, which resulted in higher rates of grooming and other affiliative behaviors around sharing sessions. These results are consistent with the fact that previous evidence for short-term contingency often involved grooming, and that long-term contingency is more commonly observed in primates. We propose that long-term contingency is proximately regulated by a 'relationship score' computed through a tally of past interactions, which tends to outweigh recent single events. We therefore suggest that future research into the proximate mechanisms of reciprocity should trace the development of such a score by focusing on newly formed dyads with no history of interactions.
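A toy sketch of such a relationship score, under the assumed (not fitted) rule that each event is diluted by the length of the interaction history:

```python
# Toy sketch of the proposed 'relationship score': a running tally of all
# past services exchanged in a dyad, which a single recent event barely
# shifts. The incremental-mean update rule is an illustrative assumption.

class Dyad:
    def __init__(self):
        self.score = 0.0
        self.n = 0

    def record(self, service_value):
        # Incremental mean: each event is diluted by the interaction history,
        # so one recent grooming bout moves a long-standing score very little.
        self.n += 1
        self.score += (service_value - self.score) / self.n

new_pair, old_pair = Dyad(), Dyad()
for _ in range(200):            # long shared history of moderate exchanges
    old_pair.record(0.5)
for pair in (new_pair, old_pair):
    pair.record(1.0)            # the same single recent, generous event
print(f"new dyad: {new_pair.score:.2f}, established dyad: {old_pair.score:.2f}")
```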

12.
Double-digest RADseq (ddRADseq) is an NGS methodology that generates reads from thousands of loci targeted by restriction enzyme cut sites across multiple individuals. To be statistically sound and economically optimal, a ddRADseq experiment has a preliminary design stage that needs to consider the selection of enzymes, particular features of the genome of the focal species, possible modifications to the library construction protocol, the coverage needed to minimize missing data, and the potential sources of error that may impact coverage. We present ddradseqtools, a software package to aid ddRADseq experimental design by (i) generating in silico double-digested fragments; (ii) constructing modified ddRADseq libraries using adapters with either one or two indexes and degenerate base regions (DBRs) to quantify PCR duplicates; and (iii) performing the initial steps of the bioinformatic preprocessing of reads. ddradseqtools generates single-end (SE) or paired-end (PE) reads that may bear SNPs and/or indels. The effect of allele dropout and PCR duplicates on coverage is also simulated. The resulting output files can be submitted to alignment and variant-calling pipelines to allow fine-tuning of parameters. The software was validated with specific tests for correct operability. The correspondence between in silico settings and parameters from in vitro ddRADseq experiments was assessed to provide guidelines for reliable use of the software. ddradseqtools is cost-efficient in terms of execution time and can be run on computers with standard CPU and RAM configurations.
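Step (i), the in silico double digest, can be sketched as follows. The enzyme recognition sites, the size-selection window, and the simplification that fragments are cut exactly at site starts are all illustrative assumptions:

```python
# Minimal in silico double digest: locate the cut sites of two enzymes and
# keep fragments flanked by one site of each enzyme, within a size-selection
# window. Real digestion cuts at an offset within each recognition site;
# this sketch ignores that detail.
import random
import re

def double_digest(seq, site1="GAATTC", site2="TTAA", min_len=200, max_len=400):
    cuts = sorted(
        [(m.start(), 1) for m in re.finditer(site1, seq)] +
        [(m.start(), 2) for m in re.finditer(site2, seq)]
    )
    fragments = []
    for (pos_a, enz_a), (pos_b, enz_b) in zip(cuts, cuts[1:]):
        length = pos_b - pos_a
        if enz_a != enz_b and min_len <= length <= max_len:
            fragments.append(seq[pos_a:pos_b])
    return fragments

random.seed(2)
genome = "".join(random.choice("ACGT") for _ in range(100_000))
print(f"{len(double_digest(genome))} candidate ddRAD fragments")
```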

13.
14.
Databases have become integral parts of data management, dissemination, and mining in biology. At the Second Annual Conference on Electron Tomography, held in Amsterdam in 2001, we proposed that electron tomography data should be shared in a manner analogous to structural data at the protein and sequence scales. At that time, we outlined our progress in creating a database to bring together cell-level imaging data across scales, the Cell Centered Database (CCDB). The CCDB was formally launched in 2002 as an on-line repository of high-resolution 3D light and electron microscopic reconstructions of cells and subcellular structures. It contains 2D, 3D, and 4D structural and protein distribution information from confocal, multiphoton, and electron microscopy, including correlated light and electron microscopy. Many of the data sets are derived from electron tomography of cells and tissues. In the five years since its debut, we have moved the CCDB from a prototype to a stable resource and expanded the scope of the project to include data management and knowledge engineering. Here, we provide an update on the CCDB and how it is used by the scientific community. We also describe our work in developing additional knowledge tools, e.g., ontologies, for the annotation and querying of electron microscopic data.

15.
Low-cost, robust, and user-friendly diagnostic capabilities at the point of care (POC) are critical for treating infectious diseases and preventing their spread in developing countries. Recent advances in micro- and nanoscale technologies have enabled the merger of optical and fluidic technologies (optofluidics), paving the way for cost-effective lensless imaging and diagnosis for POC testing in resource-limited settings. Applications of the emerging lensless imaging technologies include detecting and counting cells of interest, which allows rapid and affordable diagnostic decisions. This review presents advances in lensless imaging and diagnostic systems and their potential clinical applications in developing countries. The emerging technologies are reviewed from a POC perspective, considering cost-effectiveness, portability, sensitivity, throughput, and ease of use in resource-limited settings.
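The final detect-and-count stage that such systems automate can be illustrated with a simple threshold-and-label sketch; a real lensless pipeline would first reconstruct the image from diffraction patterns:

```python
# Toy cell-counting step: threshold a grayscale frame and count connected
# components. The synthetic frame and thresholds are illustrative only.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(3)
frame = rng.normal(0.1, 0.02, size=(128, 128))       # synthetic background
for cy, cx in rng.integers(10, 118, size=(12, 2)):   # paint 12 bright "cells"
    yy, xx = np.ogrid[:128, :128]
    frame[(yy - cy) ** 2 + (xx - cx) ** 2 <= 9] = 0.8

mask = frame > 0.5                # bright objects only
labels, n_cells = ndimage.label(mask)
print(f"cells counted: {n_cells}")   # may be < 12 if painted cells overlap
```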

16.
17.
18.
Computer-Integrated Manufacturing (CIM) systems may be classified as real-time systems. Hence, this article investigates the applicability of methodologies developed for specifying, designing, implementing, testing, and evolving real-time software. The paper highlights the activities of the software development process. Among these activities, great emphasis is placed on automating the software requirements specification activity, and a set of formal models and languages for specifying these requirements is presented. Moreover, a synopsis of the real-time software methodologies that have been implemented by the academic and industrial communities is presented, together with a critique of the strengths and weaknesses of these methodologies. The possible use of real-time methodologies in developing the control software of efficient and dependable manufacturing systems is explored. In these systems, efficiency is achieved by increasing the level of concurrency of the operations of a plan and by scheduling the execution of these operations with the intent of maximizing the utilization of the system's devices. Dependability, on the other hand, requires monitoring the operations of these systems; this monitoring facilitates detecting faults that may occur while executing the scheduled operations of a plan, recovering from these faults, and, whenever feasible, resuming the original schedule of the system. The paper concludes that the surveyed methodologies may be used to develop the real-time control software of efficient and dependable manufacturing systems. However, an integrated approach to planning, scheduling, and monitoring the operations of these systems would significantly enhance their utility, and no such approach is supported by any of these methodologies.

19.
MOTIVATION: MethylCoder is a software program that generates per-base methylation data given a set of bisulfite-treated reads. It provides the option to use either of two existing short-read aligners, each with different strengths. It accounts for soft-masked alignments and overlapping paired-end reads. MethylCoder outputs data in text and binary formats in addition to the final alignment in SAM format, so that common high-throughput sequencing tools can be used on the resulting output. It is more flexible than existing software and competitive in terms of speed and memory use. AVAILABILITY: MethylCoder requires only a python interpreter and a C compiler to run. Extensive documentation and the full source code are available under the MIT license at: https://github.com/brentp/methylcode. CONTACT: bpederse@gmail.com.
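The per-base quantity reported by bisulfite pipelines like this one is conventionally the fraction of unconverted cytosines at each reference C. Here is a minimal sketch of that calculation, with an invented pileup structure standing in for the aligned reads:

```python
# Sketch of a per-base methylation calculation: at each reference cytosine,
# the methylation level is (methylated C reads) / (C + T reads). Alignment
# is assumed done; the pileup dict below is an illustrative stand-in.

def per_base_methylation(reference, pileups):
    """pileups: dict position -> list of read bases aligned to that position
    (already mapped to the C-containing strand)."""
    levels = {}
    for pos, base in enumerate(reference):
        if base != "C" or pos not in pileups:
            continue
        c = pileups[pos].count("C")    # unconverted -> methylated
        t = pileups[pos].count("T")    # bisulfite-converted -> unmethylated
        if c + t:
            levels[pos] = c / (c + t)
    return levels

reference = "ACGTCCGA"
pileups = {1: list("CCCT"), 4: list("TTTC"), 5: list("CCCC")}
print(per_base_methylation(reference, pileups))   # {1: 0.75, 4: 0.25, 5: 1.0}
```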

20.

Background  

Interpreting and controlling bioelectromagnetic phenomena require realistic physiological models and accurate numerical solvers. A semi-realistic model often used in practice is the piecewise constant conductivity model, for which only the interfaces have to be meshed. This simplified model makes it possible to use Boundary Element Methods. Unfortunately, most Boundary Element solutions are confronted with accuracy issues when the conductivity ratio between neighboring tissues is high, as is the case for the scalp/skull conductivity ratio in electroencephalography. To overcome this difficulty, we proposed a new method called the symmetric BEM, which is implemented in the OpenMEEG software. The aim of this paper is to present OpenMEEG from both the theoretical and the practical point of view, and to compare its performance with that of other competing software packages.
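The piecewise constant conductivity model referred to above can be stated compactly. The following standard formulation of the EEG forward problem (not quoted from the paper) shows the interface conditions that make high conductivity ratios numerically delicate:

```latex
% EEG forward problem for a piecewise constant conductivity \sigma:
% V is the electric potential, J^p the primary (source) current density,
% S_ij the interface between tissues i and j, and [.] the jump across it.
\begin{align}
  \nabla \cdot (\sigma \nabla V) &= \nabla \cdot \mathbf{J}^p
      && \text{in } \Omega, \\
  [V]_{S_{ij}} &= 0
      && \text{(potential is continuous),} \\
  [\sigma\, \partial_{\mathbf{n}} V]_{S_{ij}} &= 0
      && \text{(normal current is continuous),} \\
  \sigma\, \partial_{\mathbf{n}} V &= 0
      && \text{on the scalp surface } \partial\Omega.
\end{align}
```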

