Similar documents
 20 similar documents retrieved (search time: 46 ms)
1.
Ontologies have emerged as a fast-growing research topic in the semantic web area over the last decade. Currently, 204 ontologies are available through the OBO Foundry and BioPortal. Several excellent tools for navigating ontological structures exist; however, most are dedicated to specific annotation data or integrated with specific analysis applications, and do not offer the flexibility of a general-purpose ontology explorer. We developed OntoVisT, a web-based ontological visualization tool. This application is designed for interactive visualization of any ontological hierarchy around a specific node of interest, up to a chosen level of children and/or ancestors. It takes any ontology file in OBO format as input and generates a hierarchical directed acyclic graph (DAG) as output for the chosen query. To enhance navigation of complex networks, we have embedded several features such as search criteria, zoom in/out, center focus, nearest-neighbor highlighting and mouse-hover events. The application has been tested on all 72 data sets available in OBO format through the OBO Foundry. The results for a few of them can be accessed through OntoVisT-Gallery. AVAILABILITY: The database is freely available at http://ccbb.jnu.ac.in/OntoVisT.html.
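The kind of query described above (a node's neighborhood up to a chosen depth of children and ancestors) can be sketched in a few lines. This is a minimal illustration, not OntoVisT's implementation: the tiny embedded OBO snippet and the helper names `parse_obo`/`neighborhood` are invented for the example, and only `is_a` relations are handled.

```python
from collections import defaultdict, deque

# Invented three-term OBO fragment for illustration.
OBO = """\
[Term]
id: GO:0003674
name: molecular_function

[Term]
id: GO:0003824
name: catalytic activity
is_a: GO:0003674

[Term]
id: GO:0016740
name: transferase activity
is_a: GO:0003824
"""

def parse_obo(text):
    """Collect child -> {is_a parents} from [Term] stanzas."""
    parents = defaultdict(set)
    current = None
    for line in text.splitlines():
        line = line.strip()
        if line == "[Term]":
            current = None
        elif line.startswith("id: "):
            current = line[4:]
        elif line.startswith("is_a: ") and current:
            parents[current].add(line[6:].split(" !")[0])  # drop trailing comment
    return parents

def neighborhood(parents, node, depth):
    """All terms reachable from `node` within `depth` steps up or down."""
    children = defaultdict(set)
    for c, ps in parents.items():
        for p in ps:
            children[p].add(c)
    seen = {node}
    for rel in (parents, children):          # walk up to ancestors, then down to children
        frontier, d = {node}, 0
        while frontier and d < depth:
            frontier = {n for f in frontier for n in rel.get(f, ())} - seen
            seen |= frontier
            d += 1
    return seen

parents = parse_obo(OBO)
print(sorted(neighborhood(parents, "GO:0003824", 1)))
```

A real OBO parser would also handle `part_of` and other relationship tags, which this sketch ignores.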

2.
MOTIVATION: Simulation of dynamic biochemical systems is receiving considerable attention due to the increasing availability of experimental data on complex cellular functions. Numerous simulation tools have been developed for numerical simulation of the behavior of a system described in mathematical form. However, only a few evaluation studies of these tools exist. Knowledge of the properties and capabilities of the simulation tools would help bioscientists build models based on experimental data. RESULTS: We examine selected simulation tools intended for the simulation of biochemical systems. We choose four of them for more detailed study and perform time-series simulations using a specific pathway describing the concentration of the active form of protein kinase C. We conclude that the simulation results are consistent across the chosen tools. However, the tools differ in their usability, support for data transfer to other programs, and support for automatic parameter estimation. From the experimentalist's point of view, these are all properties that deserve emphasis in future development.
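The time-series simulations benchmarked above boil down to numerically integrating rate equations. The sketch below is purely illustrative: a generic reversible activation model (inactive ⇌ active kinase) with invented rate constants, integrated by fixed-step Euler. It is not the protein kinase C pathway used in the paper, and real tools use adaptive stiff solvers rather than Euler.

```python
# Generic two-state activation model, mass-action kinetics:
#   d(active)/dt = k_act * (total - active) - k_deact * active
# All parameter values are placeholders for illustration.
def simulate(k_act=0.5, k_deact=0.2, active0=0.0, total=1.0, dt=0.01, t_end=20.0):
    a = active0
    series = [(0.0, a)]
    t = 0.0
    while t < t_end:
        da = k_act * (total - a) - k_deact * a   # net activation rate
        a += dt * da                             # forward-Euler step
        t += dt
        series.append((t, a))
    return series

series = simulate()
t_final, a_final = series[-1]
# Analytic steady state is a* = k_act*total/(k_act + k_deact) = 5/7 ≈ 0.714
print(round(a_final, 3))
```

Comparing such a trajectory across tools (as the study does) mainly checks that each solver converges to the same curve; the usability differences the authors highlight are not visible at this level.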

3.
Especially in the last decade or so, there have been dramatic advances in fluorescence-based imaging methods designed to measure a multitude of functions in living cells. Despite this, many of the methods used to analyze the resulting images are limited. Perhaps the most common mode of analysis is the choice of regions of interest (ROIs), followed by quantification of the signal contained therein in comparison with another “control” ROI. While this method has several advantages, such as flexibility and capitalizing on the power of human visual recognition, it has the drawbacks of potential subjectivity and a lack of precisely defined criteria for ROI selection. This can lead to analyses that are less precise or accurate than the data might allow, and generally to a regrettable loss of information. Herein, we explore the possibility of abandoning the use of conventional ROIs, and instead propose treating individual pixels as ROIs, such that all information can be extracted systematically with the various statistical cutoffs we discuss. As a test case for this approach, we monitored intracellular pH in cells transfected with the chloride/bicarbonate transporter slc26a3 using the ratiometric dye SNARF-5F under various conditions. We performed a parallel analysis using two different levels of stringency in conventional ROI analysis as well as the pixels-as-ROIs (PAR) approach, and found that pH differences between control and transfected cells were accentuated by ~50-100% by the PAR approach. We therefore consider this approach worthy of adoption, especially in cases in which higher accuracy and precision are required.
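The core of the pixels-as-ROIs idea can be sketched as computing a ratiometric value for every pixel independently and applying a cutoff per pixel rather than per hand-drawn region. The two toy channel arrays and the `MIN_SIGNAL` cutoff below are invented; a real SNARF-5F analysis would additionally calibrate the ratio to pH.

```python
# Two emission channels of the same (tiny, invented) field of view.
ch1 = [[10, 200, 150],
       [ 5, 180, 160]]
ch2 = [[ 2, 100, 100],
       [ 1,  90,  80]]

MIN_SIGNAL = 50  # pixels dimmer than this in ch2 are treated as background

# Every pixel is its own ROI: keep a ratio only where the denominator
# channel passes the intensity cutoff.
ratios = []
for row1, row2 in zip(ch1, ch2):
    for v1, v2 in zip(row1, row2):
        if v2 >= MIN_SIGNAL:
            ratios.append(v1 / v2)

mean_ratio = sum(ratios) / len(ratios)
print(len(ratios), round(mean_ratio, 3))
```

The point of the approach is that the cutoff is an explicit, reproducible criterion, in contrast to subjective manual ROI placement.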

4.

Background  

In bioinformatics and genomics, there are many applications designed to investigate the common properties of a set of genes. Often, these multi-gene analysis tools attempt to reveal sequence-based, functional, and expression-level ties. However, while tremendous effort has been invested in developing tools that can analyze a set of genes, minimal effort has been invested in developing tools that help researchers compile, store, and annotate gene sets in the first place. As a result, the process of making or accessing a set often involves tedious and time-consuming steps such as finding identifiers for each individual gene. These steps are often repeated extensively to shift from one identifier type to another, or to recreate a published set. In this paper, we present a simple online tool that, with the help of the gene catalogs Ensembl and GeneLynx, can help researchers build and annotate sets of genes quickly and easily.
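The identifier-translation chore such a tool automates looks roughly like the following. This is a toy sketch: the lookup table is hard-coded for illustration (a real workflow would query Ensembl or GeneLynx), and `FAKEGENE` is a deliberately unmappable placeholder.

```python
# Illustrative symbol -> Ensembl gene ID table; real use queries a catalog.
SYMBOL_TO_ENSEMBL = {
    "TP53":  "ENSG00000141510",
    "BRCA1": "ENSG00000012048",
    "EGFR":  "ENSG00000146648",
}

# A "published" gene set given as symbols, to be recreated as Ensembl IDs.
published_set = ["TP53", "EGFR", "FAKEGENE"]

mapped = {s: SYMBOL_TO_ENSEMBL[s] for s in published_set if s in SYMBOL_TO_ENSEMBL}
unmapped = [s for s in published_set if s not in SYMBOL_TO_ENSEMBL]

print(mapped)
print("could not map:", unmapped)
```

Reporting the unmapped remainder explicitly is the useful part: silently dropping unmappable symbols is how published sets fail to be recreated.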

5.
Tetralogy of Fallot (TOF) is the most common form of cyanotic congenital heart disease. Infants diagnosed with TOF require surgical interventions to survive into adulthood. However, as a result of postoperative structural malformations and long-term ventricular remodeling, further interventions are often required later in life. To help identify those at risk of disease progression, serial cardiac magnetic resonance (CMR) imaging is used to monitor these patients. However, most of the detailed information on cardiac shape and biomechanics contained in these large four-dimensional (4D) data sets goes unused in clinical practice for lack of efficient and comprehensive quantitative analysis tools. While current global metrics of cardiac size and function, such as indexed ventricular mass and volumes, can identify patients at risk of further complications, they are not adequate to explain the underlying mechanisms causing postoperative malfunction, nor to help cardiologists plan optimal personalized treatments. We propose a novel approach that uses 4D ventricular shape models derived from CMR imaging exams to generate statistical atlases of ventricular shape and finite-element models of ventricular biomechanics, in order to identify specific features of cardiac shape and biomechanical properties that explain variations in ventricular function. This study has the potential to discover novel biomarkers that precede adverse ventricular remodeling and dysfunction.
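A statistical shape atlas of the kind proposed above typically represents each ventricle as a vector of aligned landmark coordinates and extracts dominant modes of variation (principal components). The sketch below is a deliberately tiny stand-in: each "shape" is just two invented numbers so the 2×2 eigenproblem can be solved in closed form, whereas real atlases use thousands of landmarks and full PCA.

```python
import math

# Four invented "shapes", each reduced to two measurements.
shapes = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2), (4.0, 8.0)]

n = len(shapes)
mx = sum(s[0] for s in shapes) / n
my = sum(s[1] for s in shapes) / n
sxx = sum((s[0] - mx) ** 2 for s in shapes) / n
syy = sum((s[1] - my) ** 2 for s in shapes) / n
sxy = sum((s[0] - mx) * (s[1] - my) for s in shapes) / n

# Largest eigenvalue of the covariance matrix [[sxx, sxy], [sxy, syy]]
# is the variance captured by the first (dominant) shape mode.
tr, det = sxx + syy, sxx * syy - sxy ** 2
lam1 = tr / 2 + math.sqrt(tr * tr / 4 - det)
explained = lam1 / tr   # fraction of total variance in the first mode
print(round(explained, 3))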

6.
7.

Background  

Systems biologists work with many kinds of data, from many different sources, using a variety of software tools. Each of these tools typically excels at one type of analysis, such as the analysis of microarrays, metabolic networks, or predicted protein structures. A crucial challenge is to combine the capabilities of these (and other forthcoming) data resources and tools to create a data exploration and analysis environment that does justice to the variety and complexity of systems biology data sets. A solution to this problem should recognize that data types, formats and software in this high-throughput age of biology are constantly changing.

8.
Absolute protein concentration determination is becoming increasingly important in a number of fields including diagnostics, biomarker discovery and systems biology modeling. The recently introduced quantification concatamer methodology provides a novel approach to performing such determinations, and it has been applied to both microbial and mammalian systems. While a number of software tools exist for performing analyses of quantitative data generated by related methodologies such as SILAC, there is currently no analysis package dedicated to the quantification concatamer approach. Furthermore, most tools that are currently available in the field of quantitative proteomics do not manage storage and dissemination of such data sets.
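The core arithmetic behind concatamer-style (QconCAT) absolute quantification is simple: a known amount of heavy-labeled standard peptide is spiked in, and the analyte amount follows from the observed light/heavy intensity ratio. The peptide names and intensities below are invented for illustration; a real pipeline would also handle multiple peptides per protein and error propagation.

```python
# Known spiked amount of each heavy-labeled standard peptide (per sample).
spiked_heavy_fmol = 50.0

# Invented light (endogenous) and heavy (standard) peak intensities.
peptides = {
    "PEPTIDEA": {"light": 1.2e6, "heavy": 0.6e6},
    "PEPTIDEB": {"light": 8.0e5, "heavy": 1.0e6},
}

# Absolute amount = spiked standard amount * (light / heavy) intensity ratio.
abundances = {
    name: spiked_heavy_fmol * v["light"] / v["heavy"]
    for name, v in peptides.items()
}
print(abundances)
```

The data-management gap the abstract points to is precisely that such per-peptide results need to be stored alongside the raw intensities and spike amounts so the determination remains auditable.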

9.
10.
11.
Areas of endemism have been recognized as important units in historical biogeography, and much attention has been given to methods of identifying these units operationally. Interestingly, little has been written about the philosophical nature of areas of endemism. The purpose of this essay is to make an ontological argument for areas of endemism as individuals and to discuss the consequences of such a conclusion. The recognition of species as individuals is crucial to the argument. Several criteria are identified for entities to be considered individuals, all of which are shown for areas of endemism. An ontological concept of an area of endemism is presented. Two of the consequences of regarding areas of endemism as individuals are that areas of endemism should be the preferred units of biogeography over the units used in event‐based methods and that parsimony analysis of endemism and similar methods may be operational tools for the discovery of areas of endemism.

12.
Recent advances in technology and associated methodology have made the current period one of the most exciting in molecular biology and medicine. Underlying these advances is an appreciation that modern research is driven by increasingly large amounts of data, interpreted by interdisciplinary collaborative teams that are often geographically dispersed. The availability of cheap computing power, high-speed informatics networks and high-quality analysis software has been essential to this, as has the application of modern quality-assurance methodologies. In this review, we discuss the application of modern 'high-throughput' molecular biological technologies such as microarrays and next-generation sequencing to scientific and biomedical research, as we have observed it. We also offer guidance to help the reader understand key features of these technologies and new strategies, and to apply these i-Gene tools successfully in their own endeavours. Collectively, we term this 'i-Gene Analysis'. We also offer predictions as to the developments anticipated in the near and more distant future.

13.
We present an analysis of some considerations involved in expressing the Gene Ontology (GO) as a machine-processible ontology, reflecting principles of formal ontology. GO is a controlled vocabulary that is intended to facilitate communication between biologists by standardizing usage of terms in database annotations. Making such controlled vocabularies maximally useful in support of bioinformatics applications requires explicating in machine-processible form the implicit background information that enables human users to interpret the meaning of the vocabulary terms. In the case of GO, this process would involve rendering the meanings of GO into a formal (logical) language with the help of domain experts, and adding additional information required to support the chosen formalization. A controlled vocabulary augmented in these ways is commonly called an ontology. In this paper, we make a modest exploration to determine the ontological requirements for this extended version of GO. Using the terms within the three GO hierarchies (molecular function, biological process and cellular component), we investigate the facility with which GO concepts can be ontologized, using available tools from the philosophical and ontological engineering literature.

14.
Our current understanding of the molecular mechanisms that regulate cellular processes such as vesicular trafficking has been enabled by conventional biochemical and microscopy techniques. However, these methods often obscure the heterogeneity of the cellular environment, thus precluding a quantitative assessment of the molecular interactions regulating these processes. Herein, we present Molecular Interactions in Super Resolution (MIiSR) software, which provides quantitative analysis tools for use with super-resolution images. MIiSR combines multiple tools for analyzing intermolecular interactions, molecular clustering and image segmentation. These tools enable quantification, in the native environment of the cell, of molecular interactions and the formation of higher-order molecular complexes. The capabilities and limitations of these analytical tools are demonstrated using both modeled data and examples derived from the vesicular trafficking system, thereby providing an established and validated experimental workflow capable of quantitatively assessing molecular interactions and molecular complex formation within the heterogeneous environment of the cell.
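One analysis such tools perform is grouping single-molecule localizations into clusters. The toy version below links any two points closer than a distance threshold (single-linkage grouping via breadth-first search); it is an illustration of the idea only, not MIiSR's actual algorithms, and the point coordinates and `EPS` value are invented.

```python
import math
from collections import deque

# Invented 2D localization coordinates: two tight clusters plus one outlier.
points = [(0.0, 0.0), (0.1, 0.1), (0.2, 0.0),   # cluster 1
          (5.0, 5.0), (5.1, 5.1),               # cluster 2
          (9.0, 0.0)]                           # isolated localization
EPS = 0.5  # linkage distance threshold

def clusters(points, eps):
    """Group point indices: any pair within `eps` ends up in the same group."""
    unvisited = set(range(len(points)))
    out = []
    while unvisited:
        seed = unvisited.pop()
        group, queue = {seed}, deque([seed])
        while queue:
            i = queue.popleft()
            near = {j for j in unvisited
                    if math.dist(points[i], points[j]) <= eps}
            unvisited -= near        # claim neighbors before expanding them
            group |= near
            queue.extend(near)
        out.append(sorted(group))
    return sorted(out)

print(clusters(points, EPS))
```

Density-based methods add a minimum-neighbor requirement so isolated localizations are labeled noise rather than singleton clusters; this sketch omits that refinement.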

15.
The universe of prion and prion-like phenomena has expanded significantly in the past several years. Here, we overview the challenges in classifying these data informatically, given that terms such as “prion-like”, “prion-related” or “prion-forming” do not have a stable meaning in the scientific literature. We examine the spectrum of proteins that have been described in the literature as forming prions, and discuss how “prion” can have a range of meanings, with the strictest definition requiring demonstration of infection with in vitro-derived recombinant prions. We suggest that although prion/prion-like phenomena can largely be apportioned into a small number of broad groups depending on the type of transmissibility evidence for them, as new phenomena are discovered in the coming years, a detailed ontological approach may become necessary, one that allows for subtle definition of different “flavors” of prion/prion-like phenomena.

16.
17.
"Metabonomics" has in the past decade demonstrated enormous potential in furthering the understanding of, for example, disease processes, toxicological mechanisms, and biomarker discovery. The same principles can also provide a systematic and comprehensive approach to studying the impact of food ingredients on consumer health. However, metabonomic methodology requires the development of rapid, advanced analytical tools to comprehensively profile biofluid metabolites in consumers. Until now, NMR spectroscopy has been used for this purpose almost exclusively. Chromatographic techniques, and in particular HPLC, have not been exploited to the same extent. The main drawbacks of chromatography are the long analysis time, instabilities in the sample fingerprint, and the rigorous sample preparation required. This contribution addresses these problems in the quest to develop generic methods for high-throughput profiling using HPLC. After a careful optimization process, stable fingerprints of biofluid samples can be obtained using standard HPLC equipment. A method using a short monolithic column and a rapid gradient at a high flow rate has been developed that allows rapid and detailed profiling of large numbers of urine samples. The method can easily be translated into a slow, shallow-gradient high-resolution method for identification of interesting peaks by LC-MS/NMR. A similar approach has been applied to cell culture media samples. Due to the much higher protein content of such samples, non-porous polymer-based small-particle columns yielded the best results. The study clearly shows that HPLC can be used in metabonomic fingerprinting studies.

18.
There is increasing worldwide concern about the problem of dealing with waste electrical and electronic equipment (WEEE), given the high volume of appliances disposed of every day. In this article, an environmental evaluation of WEEE is performed that combines life cycle assessment (LCA) methodology with multivariate statistical techniques. Because LCA handles large amounts of data across its different phases, multivariate statistical techniques can provide useful information when one is trying to uncover the structure of large multidimensional data sets. In particular, principal-component analysis and multidimensional scaling are two important dimension-reducing tools that have been shown to help in understanding this type of complex multivariate data set. In this article, we use a variable selection method that reduces the number of categories for which the environmental impacts have to be computed; this step is especially useful when the number of impact categories, or the number of products or processes to benchmark, increases. We provide a detailed illustration showing how we have used the proposed approach to analyze and interpret the environmental impacts of different domestic appliances.
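A variable-selection step of the general kind described above can be sketched as ranking impact categories by how much of the variation across products they carry and dropping the rest. This is not the paper's actual method: the impact matrix is invented, and in practice categories with different units would be standardized before any variance comparison.

```python
# Invented impact scores: category -> value per appliance (4 appliances).
impacts = {
    "GWP":           [120.0, 340.0, 95.0, 410.0],
    "acidification": [1.1, 1.2, 1.1, 1.2],
    "eutrophication": [0.4, 0.4, 0.4, 0.4],
}

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# Keep categories contributing at least 1% of total variance.
var = {cat: variance(xs) for cat, xs in impacts.items()}
total = sum(var.values())
keep = [cat for cat, v in var.items() if v / total >= 0.01]
print(keep)
```

On these toy numbers only the (large-magnitude) GWP category survives, which illustrates exactly why standardization matters before such a cutoff is applied to real mixed-unit LCA data.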

19.
For decades, biologists have relied on software to visualize and interpret imaging data. As techniques for acquiring images increase in complexity, resulting in larger multidimensional datasets, imaging software must adapt. ImageJ is an open‐source image analysis software platform that has aided researchers with a variety of image analysis applications, driven mainly by engaged and collaborative user and developer communities. The close collaboration between programmers and users has resulted in adaptations to accommodate new challenges in image analysis that address the needs of ImageJ's diverse user base. ImageJ consists of many components, some relevant primarily for developers and a vast collection of user‐centric plugins. It is available in many forms, including the widely used Fiji distribution. We refer to this entire ImageJ codebase and community as the ImageJ ecosystem. Here we review the core features of this ecosystem and highlight how ImageJ has responded to imaging technology advancements with new plugins and tools in recent years. These plugins and tools have been developed to address user needs in several areas such as visualization, segmentation, and tracking of biological entities in large, complex datasets. Moreover, new capabilities for deep learning are being added to ImageJ, reflecting a shift in the bioimage analysis community towards exploiting artificial intelligence. These new tools have been facilitated by profound architectural changes to the ImageJ core brought about by the ImageJ2 project. Therefore, we also discuss the contributions of ImageJ2 to enhancing multidimensional image processing and interoperability in the ImageJ ecosystem.

20.
Aims: Grasslands are the world's most extensive terrestrial ecosystem and a major feed source for livestock. Meeting the increasing demand for meat and dairy products in a sustainable manner is a big challenge. At the field scale, Global Positioning System and ground-based sensor technologies provide promising tools for grassland and herd management with high precision. With the growth in availability of spaceborne remote sensing data, it is important to revisit the relevant methods and applications that can exploit this imagery. In this article, we review (i) the current status of grassland monitoring/observation methods and applications based on satellite remote sensing data, (ii) the technological and methodological developments for retrieving different grassland biophysical parameters and management characteristics (e.g. degradation, grazing intensity) and (iii) the key remaining challenges and some new upcoming trends for future development.
Important findings: The retrieval of grassland biophysical parameters has evolved in recent years from classical regression analysis to more complex, efficient and robust modeling approaches driven by satellite data, which are likely to remain the most robust method for deriving grassland information; however, these approaches require more high-quality calibration and validation data. We found that hypertemporal satellite data are widely used for time-series generation, particularly to overcome cloud-contamination issues, but the current low spatial resolution of these instruments precludes their use for field-scale application in many countries. This trend may change with the current rise in launches of satellite constellations, such as RapidEye, Sentinel-2 and even microsatellites such as those operated by Skybox Imaging. Microwave imagery has not been widely used for grassland applications, and a better understanding of backscatter behaviour at different phenological stages is needed for more reliable products in cloudy regions. The development of hyperspectral satellite instrumentation and analytical methods will enable more detailed discrimination of habitat types and the development of tools better suited to end-user operation.
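The "classical regression analysis" retrieval the review starts from can be sketched in a few lines: compute a vegetation index (NDVI) from red and near-infrared reflectance, then fit biomass against it by ordinary least squares. All numbers below are invented calibration data, not from any real sensor or field campaign.

```python
# Invented per-plot reflectances and field-measured biomass (e.g. t/ha).
red     = [0.08, 0.10, 0.05, 0.12]
nir     = [0.40, 0.35, 0.50, 0.30]
biomass = [2.8, 2.2, 3.6, 1.7]

# NDVI = (NIR - red) / (NIR + red), the standard vegetation index.
ndvi = [(n - r) / (n + r) for r, n in zip(red, nir)]

# Ordinary least-squares fit: biomass = a * NDVI + b
n = len(ndvi)
mx = sum(ndvi) / n
my = sum(biomass) / n
a = sum((x - mx) * (y - my) for x, y in zip(ndvi, biomass)) / \
    sum((x - mx) ** 2 for x in ndvi)
b = my - a * mx

pred = [a * x + b for x in ndvi]
print(round(a, 2), round(b, 2))
```

The review's point is that such empirical fits transfer poorly between sites and seasons, which is what drives the move toward physically based and machine-learning retrieval models, at the cost of needing more calibration and validation data.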
