Similar Documents
A total of 20 similar documents were found.
1.

Background

With the advances in DNA sequencer-based technologies, it has become possible to automate several steps of the genotyping process leading to increased throughput. To efficiently handle the large amounts of genotypic data generated and help with quality control, there is a strong need for a software system that can help with the tracking of samples and capture and management of data at different steps of the process. Such systems, while serving to manage the workflow precisely, also encourage good laboratory practice by standardizing protocols, recording and annotating data from every step of the workflow.

Results

A laboratory information management system (LIMS) has been designed and implemented at the International Crops Research Institute for the Semi-Arid Tropics (ICRISAT) that meets the requirements of a moderately high throughput molecular genotyping facility. The application is modular in design and simple to learn and use. It leads the user through each step of the process, from starting an experiment to storing the output of the genotype detection step with auto-binning of alleles, thus ensuring that every DNA sample is handled in an identical manner and that all the necessary data are captured. The application keeps track of DNA samples and generated data; data entry into the system is through forms for file uploads. The LIMS provides functions to trace any genotypic data back to the electrophoresis gel files or sample source and to repeat experiments. The LIMS is presently used for the capture of high throughput SSR (simple-sequence repeat) genotyping data from the legume (chickpea, groundnut and pigeonpea) and cereal (sorghum and millets) crops of importance in the semi-arid tropics.
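As a rough illustration of the allele auto-binning step mentioned above, the sketch below groups raw SSR fragment sizes into allele bins with a simple tolerance rule; the function, tolerance value and example sizes are hypothetical and are not taken from the LIMS source code.

```python
# Illustrative auto-binning of SSR fragment sizes: readings within a small
# tolerance of each other are grouped into one allele bin (hypothetical helper).
def auto_bin(sizes, tolerance=0.5):
    bins = []
    for size in sorted(sizes):
        if bins and size - bins[-1][-1] <= tolerance:
            bins[-1].append(size)           # extends the current bin
        else:
            bins.append([size])             # starts a new allele bin
    # report each bin by its mean fragment size
    return [round(sum(b) / len(b), 2) for b in bins]

print(auto_bin([152.1, 152.4, 155.9, 156.2, 160.0]))  # -> [152.25, 156.05, 160.0]
```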

Conclusion

A laboratory information management system is available that has been found useful in the management of microsatellite genotype data in a moderately high throughput genotyping laboratory. The application with source code is freely available for academic users and can be downloaded from http://www.icrisat.org/gt-bt/lims/lims.asp.

2.
3.

Background

The coupling of pathways and processes through shared components is increasingly recognised as a common theme that occurs in many cell signalling contexts, where it plays highly non-trivial roles.

Results

In this paper we develop a basic modelling and systems framework in a general setting for understanding the coupling of processes and pathways through shared components. Our modelling framework starts with the interaction of two components with a common third component and includes production and degradation of all these components. We analyze the signal processing in our model to elucidate different aspects of the coupling. We show how different kinds of responses, including "ultrasensitive" and adaptive responses, may occur in this setting. We then build on the basic model structure and examine the effects of additional control regulation, switch-like signal processing, and spatial signalling. In the process, we identify a way in which allosteric regulation may contribute to signalling specificity, and how competitive effects may allow an enzyme to robustly coordinate and time the activation of parallel pathways.
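For concreteness, one minimal mass-action instantiation of this basic scheme (two components A and B binding a common component C, with synthesis s and degradation d of the free species only) could be written as below; the rate constants and the simplifying assumptions are ours, not the authors' exact equations.

```latex
\begin{aligned}
\frac{d[A]}{dt}  &= s_A - d_A[A] - k_1[A][C] + k_{-1}[AC],\\
\frac{d[B]}{dt}  &= s_B - d_B[B] - k_2[B][C] + k_{-2}[BC],\\
\frac{d[C]}{dt}  &= s_C - d_C[C] - k_1[A][C] + k_{-1}[AC] - k_2[B][C] + k_{-2}[BC],\\
\frac{d[AC]}{dt} &= k_1[A][C] - k_{-1}[AC],\\
\frac{d[BC]}{dt} &= k_2[B][C] - k_{-2}[BC].
\end{aligned}
```

Competition of A and B for the shared pool of C is what couples the two pathways in such a scheme, and it is from this kind of interaction that the ultrasensitive and adaptive responses described above can emerge.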

Conclusions

We have developed and analyzed a common systems platform for examining the effects of coupling processes through shared components. This can be the basis for subsequent expansion and for understanding the many biologically observed variations on this common theme.

4.

Background

Availability of allograft tympano-ossicular systems (ATOS) provides unique reconstructive capabilities, allowing more radical removal of middle ear pathology. To provide ATOS, the University of Antwerp Temporal Bone Bank (UATB) was established in 1988. ATOS use was stopped in many countries because of safety issues concerning human tissue transplantation. Our objective was to maintain an ATOS tissue bank complying with European Union (EU) directives on human tissues and cells.

Methods

The guidelines of the Belgian Superior Health Council, including EU directive requirements, were rigorously applied to UATB infrastructure, workflow protocols and activity. Workflow protocols were updated and an internal audit was performed to check and improve consistency with established quality systems and changing legislation. The Belgian Federal Agency of Medicines and Health Products performed an inspection to examine compliance with national legislation and EU directives on human tissues and cells. A sample of important procedures was meticulously examined in its workflow setting, alongside an assessment of the infrastructure and personnel.

Results

Results are reported on infrastructure, personnel, administrative workflow, procurement, preparation, processing, distribution, internal audit and inspection by the competent authority. Donors procured: 2006, 93 (45.1%); 2007, 64 (20.6%); 2008, 56 (13.1%); 2009, 79 (6.9%). The UATB was approved by the Minister of Health without critical or important shortcomings. The Ministry accords registration each time for 2 years.

Conclusions

An ATOS tissue bank complying with EU regulations on human allografts is feasible and critical to assure that the patient receives tissue that is safe, individually checked and prepared in a suitable environment.

5.
6.

Background

Based upon defining a common reference point, current real-time quantitative PCR technologies compare relative differences in amplification profile position. As such, absolute quantification requires construction of target-specific standard curves that are highly resource intensive and prone to introducing quantitative errors. Sigmoidal modeling using nonlinear regression has previously demonstrated that absolute quantification can be accomplished without standard curves; however, quantitative errors caused by distortions within the plateau phase have impeded effective implementation of this alternative approach.

Results

Recognition that amplification rate is linearly correlated to amplicon quantity led to the derivation of two sigmoid functions that allow target quantification via linear regression analysis. In addition to circumventing quantitative errors produced by plateau distortions, this approach allows the amplification efficiency within individual amplification reactions to be determined. Absolute quantification is accomplished by first converting individual fluorescence readings into target quantity expressed in fluorescence units, followed by conversion into the number of target molecules via optical calibration. Founded upon expressing reaction fluorescence in relation to amplicon DNA mass, a seminal element of this study was to implement optical calibration using lambda gDNA as a universal quantitative standard. Not only does this eliminate the need to prepare target-specific quantitative standards, it relegates establishment of quantitative scale to a single, highly defined entity. The quantitative competency of this approach was assessed by exploiting "limiting dilution assay" for absolute quantification, which provided an independent gold standard from which to verify quantitative accuracy. This yielded substantive corroborating evidence that absolute accuracies of ± 25% can be routinely achieved. Comparison with the LinReg and Miner automated qPCR data processing packages further demonstrated the superior performance of this kinetic-based methodology.
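The central kinetic idea, per-cycle efficiency falling linearly as amplicon accumulates, can be illustrated with a short sketch; the synthetic data, the pairing of efficiencies with fluorescence readings and the fitting window are assumptions of this toy example, not the authors' LRE implementation.

```python
import numpy as np

# Simulate an amplification curve whose efficiency declines linearly with
# fluorescence, the behaviour described in the abstract (illustrative only).
E_max, F_max, F0, cycles = 0.95, 10_000.0, 0.01, 40
F = [F0]
for _ in range(cycles):
    E = E_max * (1.0 - F[-1] / F_max)       # efficiency falls as amplicon accumulates
    F.append(F[-1] * (1.0 + E))
F = np.array(F)

# Observed per-cycle efficiency, paired here with the pre-cycle reading
Ec = (F[1:] - F[:-1]) / F[:-1]
Fc = F[:-1]

# Fit the linear decline over the informative (exponential-to-plateau) region
mask = (Fc > 0.05 * F_max) & (Fc < 0.95 * F_max)
slope, intercept = np.polyfit(Fc[mask], Ec[mask], 1)

print(f"estimated E_max ~ {intercept:.3f}")            # y-intercept approximates E_max
print(f"estimated F_max ~ {-intercept / slope:.0f}")   # x-intercept approximates F_max
```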

Conclusion

Called "linear regression of efficiency" or LRE, this novel kinetic approach confers the ability to conduct high-capacity absolute quantification with unprecedented quality control capabilities. The computational simplicity and recursive nature of LRE quantification also makes it amenable to software implementation, as demonstrated by a prototypic Java program that automates data analysis. This in turn introduces the prospect of conducting absolute quantification with little additional effort beyond that required for the preparation of the amplification reactions.  相似文献   

7.

Background

Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot easily be imported and executed on the other. This lack of interoperability stems from differences in the two systems' models of computation, workflow languages and architectures; it limits the sharing of workflows between the user communities and leads to duplication of development effort.

Results

In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: it allows the integration of existing Taverna and Galaxy workflows in a single environment, and it supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible: users can either instantiate the whole system on the cloud or delegate the execution of certain sub-workflows to the cloud infrastructure.
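A conceptual sketch of the hierarchical-workflow idea, in which a sub-workflow is reused as a single node inside a larger pipeline, is given below; the pattern name, tasks and data structures are hypothetical and do not reflect Tavaxy's actual workflow language or execution engine.

```python
from typing import Callable, Dict

# Conceptual sketch only: a 'sequence' workflow pattern whose steps may themselves
# be whole sub-workflows, mimicking hierarchical composition of imported workflows.
Task = Callable[[Dict], Dict]

def sequence(*tasks: Task) -> Task:
    """Run tasks one after another, threading the data dictionary through them."""
    def run(data: Dict) -> Dict:
        for task in tasks:
            data = task(data)
        return data
    return run

# Two "imported" sub-workflows, each itself a sequence of steps.
clean_reads = sequence(lambda d: {**d, "trimmed": True},
                       lambda d: {**d, "filtered": True})
align_reads = sequence(lambda d: {**d, "aligned": True})

# The hybrid top-level workflow treats each sub-workflow as a single node.
pipeline = sequence(clean_reads, align_reads)
print(pipeline({"sample": "S1"}))
```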

Conclusions

Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-) workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high performance cloud computing to cope with the increasing data size and complexity of analysis. The system can be accessed either through a cloud-enabled web-interface or downloaded and installed to run within the user's local environment. All resources related to Tavaxy are available at http://www.tavaxy.org.

8.

Background

Childhood asthma prevalence is widely measured by parental proxy report of physician-diagnosed asthma in questionnaires. Our objective was to validate this measure in a North American population.

Methods

The 2884 study participants were a subsample of 5619 school children aged 5 to 9 years from 231 schools participating in the Toronto Child Health Evaluation Questionnaire study in 2006. We compared agreement between "questionnaire diagnosis" and a previously validated "health claims data diagnosis". Sensitivity, specificity and kappa were calculated for the questionnaire diagnosis using the health claims diagnosis as the reference standard.
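The agreement statistics involved are standard; the sketch below computes sensitivity, specificity and Cohen's kappa from a 2x2 table whose counts are back-calculated approximately from the reported prevalence, sensitivity and specificity, so they are illustrative rather than the study's actual data.

```python
# 2x2 agreement table: rows = questionnaire diagnosis, columns = health-claims
# reference diagnosis. Counts below are approximate reconstructions, not study data.
tp, fn = 364, 253     # reference-positive children: questionnaire positive / negative
fp, tn = 93, 2174     # reference-negative children: questionnaire positive / negative
n = tp + fn + fp + tn

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)

# Cohen's kappa: chance-corrected agreement between the two classifications
po = (tp + tn) / n
pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
kappa = (po - pe) / (1 - pe)

print(f"sensitivity={sensitivity:.3f} specificity={specificity:.3f} kappa={kappa:.3f}")
```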

Results

Prevalence of asthma was 15.7% by questionnaire and 21.4% by health claims data. Questionnaire diagnosis was insensitive (59.0%) but specific (95.9%) for asthma. When children with asthma-related symptoms were excluded, the sensitivity increased (83.6%), and specificity remained high (93.6%).

Conclusions

Our results show that parental report of asthma by questionnaire has low sensitivity but high specificity as an asthma prevalence measure. In addition, children with "asthma-related symptoms" may represent a large fraction of under-diagnosed asthma and should be excluded from the inception cohort for risk factor studies.

9.

Background

An adequate and expressive ontological representation of biological organisms and their parts requires formal reasoning mechanisms for their relations of physical aggregation and containment.

Results

We demonstrate that the proposed formalism makes it possible to deal consistently with "role propagation along non-taxonomic hierarchies", a problem that has repeatedly been identified as an intricate reasoning problem in biomedical ontologies.
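A standard way to formalize such role propagation along a partonomic hierarchy is a property-chain axiom (in OWL 2 terms) of the kind shown below, stating that anything located in a part of X is also located in X; it is given as a typical example of the reasoning pattern, not necessarily the exact axiom set proposed by the authors.

```latex
\mathit{located\_in} \circ \mathit{part\_of} \sqsubseteq \mathit{located\_in}
```

With such an axiom, an abscess located in the mucosa of the stomach is inferred to be located in the stomach, because the mucosa is part of the stomach.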

Conclusion

The proposed approach seems suitable for the redesign of compositional hierarchies in (bio)medical terminology systems that are embedded in the framework of the OBO (Open Biological Ontologies) Relation Ontology and use knowledge representation languages developed by the Semantic Web community.

10.

Background

Fractures of the long bones, and femur fractures in particular, are common in multiple trauma patients, but the optimal management of femur fractures in these patients has not yet been resolved. Although there is a trend towards the concept of "Damage Control Orthopedics" (DCO) in the management of multiple trauma patients with long bone fractures, as reflected by a significant increase in primary external fixation of femur fractures, the current literature is insufficient. Thus, in the era of "evidence-based medicine", there is a need for a more specific, clarifying trial.

Methods/Design

The trial is designed as a randomized controlled open-label multicenter study. Multiple trauma patients with femur shaft fractures and a calculated probability of death between 20 and 60% will be randomized to either temporary fracture fixation with an external fixator followed by defined secondary definitive treatment (DCO) or primary reamed nailing (early total care). The primary objective is to reduce the extent of organ failure as measured by the maximum sepsis-related organ failure assessment (SOFA) score.

Discussion

The Damage Control Study is the first to evaluate the risk-adapted damage control orthopedic surgery concept for femur shaft fractures in multiple trauma patients in a randomized controlled design. The trial investigates the differences in clinical outcome between two currently accepted ways of treating multiple trauma patients with femoral shaft fractures. This study will help to answer the question of whether the "early total care" or the "damage control" concept is associated with better outcome.

Trial registration

Current Controlled Trials ISRCTN10321620

11.

Background

Genes and gene products are frequently annotated with Gene Ontology concepts based on the evidence provided in genomics articles. Manually locating and curating information about a genomic entity from the biomedical literature requires vast amounts of human effort. Hence, there is clearly a need for automated computational tools to annotate the genes and gene products with Gene Ontology concepts by computationally capturing the related knowledge embedded in textual data.

Results

In this article, we present an automated genomic entity annotation system, GEANN, which extracts information about the characteristics of genes and gene products from article abstracts in PubMed and translates the discovered knowledge into Gene Ontology (GO) concepts, a widely used standardized vocabulary of genomic traits. GEANN utilizes textual "extraction patterns" and a semantic matching framework to locate phrases that match a pattern and to produce Gene Ontology annotations for genes and gene products. In our experiments, GEANN reached a precision of 78% at a recall of 61%. On a select set of Gene Ontology concepts, GEANN either outperforms or is comparable to two other automated annotation studies. Use of WordNet for semantic pattern matching improves precision and recall by 24% and 15%, respectively, and the improvement due to semantic pattern matching becomes more apparent as the Gene Ontology terms become more general.
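The WordNet-based semantic matching can be illustrated with a minimal sketch that treats two pattern tokens as matching when any pair of their WordNet senses is sufficiently similar; the similarity measure, threshold and example words are assumptions of this sketch, not GEANN's actual scoring scheme.

```python
import nltk
from nltk.corpus import wordnet as wn

nltk.download("wordnet", quiet=True)   # one-time corpus download

def semantic_match(word_a: str, word_b: str, threshold: float = 0.5) -> bool:
    """Match two tokens if any pair of their WordNet senses is close enough
    (path similarity); purely illustrative of semantic pattern matching."""
    best = 0.0
    for sa in wn.synsets(word_a):
        for sb in wn.synsets(word_b):
            sim = sa.path_similarity(sb)
            if sim is not None:
                best = max(best, sim)
    return best >= threshold

print(semantic_match("gene", "cistron"))      # near-synonyms should match
print(semantic_match("gene", "membrane"))     # unrelated terms should not
```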

Conclusion

GEANN is useful for two distinct purposes: (i) automating the annotation of genomic entities with Gene Ontology concepts, and (ii) providing existing annotations with additional "evidence articles" from the literature. The use of textual extraction patterns constructed from the existing annotations achieves high precision. The semantic pattern matching framework provides a more flexible pattern matching scheme than "exact matching", with the advantage of locating approximate pattern occurrences with similar semantics. The relatively low recall of our pattern-based approach may be enhanced either by employing a probabilistic annotation framework based on annotation neighbourhoods in textual data, or by lowering the statistical enrichment threshold for applications that place more value on achieving higher recall.

12.


We examine the Tree of Life (TOL) as an evolutionary hypothesis and a heuristic. The original TOL hypothesis has failed but a new "statistical TOL hypothesis" is promising. The TOL heuristic usefully organizes data without positing fundamental evolutionary truth.

Reviewers

This article was reviewed by W. Ford Doolittle, Nicholas Galtier and Christophe Malaterre.

13.

Introduction

Data processing is one of the biggest problems in metabolomics, given the high number of samples analyzed and the need for multiple software packages for each step of the processing workflow.

Objectives

To merge the steps required for metabolomics data processing into a single platform.

Methods

KniMet is a workflow for the processing of mass spectrometry metabolomics data based on the KNIME Analytics Platform.

Results

The approach includes key steps to follow in metabolomics data processing: feature filtering, missing value imputation, normalization, batch correction and annotation.
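The first three of these steps can be illustrated with a short pandas sketch on a toy feature table; the filtering threshold, imputation rule and normalization choice are common conventions picked for illustration and are not the defaults of the KniMet workflow.

```python
import numpy as np
import pandas as pd

# Toy feature table: rows = samples, columns = metabolite features (synthetic data).
rng = np.random.default_rng(0)
X = pd.DataFrame(rng.lognormal(size=(6, 5)),
                 columns=[f"feat_{i}" for i in range(5)])
X.iloc[0, 2] = np.nan
X.iloc[3, 2] = np.nan
X.iloc[1, 4] = np.nan

# 1) Feature filtering: drop features detected in fewer than 80% of samples.
X = X.loc[:, X.notna().mean() >= 0.8]

# 2) Missing value imputation: fill remaining gaps with half the feature minimum,
#    a common stand-in for values below the detection limit.
X = X.fillna(X.min() / 2)

# 3) Normalization: scale each sample to its total signal (TIC-style).
X = X.div(X.sum(axis=1), axis=0)

print(X.round(3))
```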

Conclusion

KniMet provides the user with a local, modular and customizable workflow for the processing of both GC–MS and LC–MS open profiling data.

14.

Background

The Global Programme to Eliminate Lymphatic Filariasis (GPELF) depends upon Mass Drug Administration (MDA) to interrupt transmission. Delimitation of transmission risk areas is therefore an important step, and we attempted to define a geo-environmental risk model (GERM) for determining the areas of potential transmission of lymphatic filariasis.

Methods

A range of geo-environmental variables was selected and customized on a GIS platform to develop the GERM for classifying areas of filariasis transmission as "risk" or "non-risk". The model was validated through a 'ground truth study' following standard procedure, using GIS tools for sampling and the immunochromatographic test (ICT) for screening individuals.
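A heavily simplified sketch of how a composite geo-environmental risk index might be assembled and thresholded on a raster grid is given below; the layer names, weights and cut-off are purely hypothetical and do not correspond to the variables or FTRI weighting of the published model.

```python
import numpy as np

# Hypothetical raster layers on the same grid, already rescaled to 0-1 suitability
# scores; names and weights are illustrative only.
rng = np.random.default_rng(1)
temperature = rng.random((4, 4))
humidity    = rng.random((4, 4))
water_index = rng.random((4, 4))

weights = {"temperature": 0.4, "humidity": 0.3, "water_index": 0.3}
ftri = (weights["temperature"] * temperature
        + weights["humidity"] * humidity
        + weights["water_index"] * water_index)

# Stratify the composite index into "risk" / "non-risk" spatial entities.
risk_map = np.where(ftri >= 0.5, "risk", "non-risk")
print(risk_map)
```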

Results

A map of filariasis transmission was created and stratified into different spatial entities, "risk" and "non-risk", depending on the Filariasis Transmission Risk Index (FTRI). The model estimates corresponded well with the ground (observed) data.

Conclusion

The geo-environmental risk model developed on a GIS platform is useful for spatial delimitation purposes on a macro scale.

15.

Background

More than 60% of new strokes each year are "mild" in severity, and this proportion is expected to rise in the years to come. Within our current health care system, those with "mild" stroke are typically discharged home within days, without further referral to health or rehabilitation services other than advice to see their family physician. Those with mild stroke often have limited access to support from health professionals with stroke-specific knowledge who would typically provide critical information on topics such as secondary stroke prevention, community reintegration, medication counselling and problem solving with regard to specific concerns that arise. Isolation and lack of knowledge may lead to a worsening of health problems, including stroke recurrence, and to unnecessary and costly health care utilization. The purpose of this study is to assess the effectiveness, for individuals who experience a first "mild" stroke, of a sustainable, low-cost, multimodal support intervention (comprising information, education and telephone support) - "WE CALL" - compared to a passive intervention (providing the name and phone number of a resource person to contact if they feel the need) - "YOU CALL" - on two primary outcomes: unplanned use of health services for negative events and quality of life.

Method/Design

We will recruit 384 adults who meet the inclusion criteria for a first mild stroke across six Canadian sites. Baseline measures will be taken within the first month after stroke onset. Participants will be stratified according to comorbidity level and randomised to one of two groups: YOU CALL or WE CALL. Both interventions will be offered over a six-month period. Primary outcomes include unplanned use of health services for negative events (frequency calendar) and quality of life (EQ-5D and Quality of Life Index). Secondary outcomes include participation level (LIFE-H), depression (Beck Depression Inventory II) and use of health services for health promotion or prevention (frequency calendar). Blind assessors will gather data at mid-intervention, end of intervention and one-year follow-up.

Discussion

If effective, this multimodal intervention could be delivered in both urban and rural environments; existing infrastructure, such as regional stroke centers and secondary stroke prevention clinics, would make it deliverable and sustainable.

Trial Registration

ISRCTN95662526

16.
17.

Background

With the globalization of clinical trials, a growing emphasis has been placed on the standardization of the workflow in order to ensure the reproducibility and reliability of the overall trial. Despite the importance of workflow evaluation, to our knowledge no previous studies have attempted to adapt existing modeling languages to standardize the representation of clinical trials. Unified Modeling Language (UML) is a computational language that can be used to model operational workflow, and a UML profile can be developed to standardize UML models within a given domain. This paper's objective is to develop a UML profile to extend the UML Activity Diagram schema into the clinical trials domain, defining a standard representation for clinical trial workflow diagrams in UML.

Methods

Two Brazilian clinical trial sites in rheumatology and oncology were examined to model their workflow and collect time-motion data. UML modeling was conducted in Eclipse, and a UML profile was developed to incorporate information used in discrete event simulation software.

Results

Ethnographic observation revealed bottlenecks in workflow: these included tasks requiring full commitment of CRCs, transferring notes from paper to computers, deviations from standard operating procedures, and conflicts between different IT systems. Time-motion analysis revealed that nurses' activities took up the most time in the workflow and contained a high frequency of shorter duration activities. Administrative assistants performed more activities near the beginning and end of the workflow. Overall, clinical trial tasks had a greater frequency than clinic routines or other general activities.

Conclusions

This paper describes a method for modeling clinical trial workflow in UML and standardizing these workflow diagrams through a UML profile. In the increasingly global environment of clinical trials, the standardization of workflow modeling is a necessary precursor to conducting a comparative analysis of international clinical trial workflows.

18.

 

In children with Prader Willi syndrome (PWS), besides growth hormone (GH) therapy, control of the food environment and regular exercise, surgical treatment of scoliosis deformities seems to be the treatment of choice, even though the risks of spinal surgery in this specific population are very high. The question therefore arises as to whether the risks of spinal surgery outweigh the benefits in a condition that bears significant risks per se. The purpose of this systematic review of the PubMed literature was to find mid- or long-term results of spinal fusion surgery in patients with PWS, and to present conservative treatment in a case study of nine patients with this condition.

Methods

Types of studies included: all kinds of studies, retrospective and prospective, that reported on the outcome of scoliosis surgery in patients with PWS. Types of participants included: patients with scoliosis and PWS. Type of intervention: surgery. Search strategy for identification of the studies: PubMed, limited to English-language publications, plus the bibliographies of all reviewed articles. Nine conservatively treated patients with PWS, aged 19 years or over at the time this study was performed, were identified from our database. The results of conservative management are described and related to the natural history and treatment results found in the PubMed review.

Results

Of 2210 titles displayed in the PubMed database for the keyword "Prader Willi syndrome", 5 different papers contained some information on the outcome of surgery at the date of the search, and none appeared to contain mid- or long-term follow-up. The PWS patients treated conservatively in our series all stayed below 70°, and some of them improved.

Discussion

If the curve of scoliosis patients with PWS can be kept within certain limits (usually below 70 degrees) conservatively, this treatment seems to have fewer complications than surgical treatment. The results of our retrospective study of nine patients demonstrate that scoliosis in this entity plays only a minor role and that surgery is unnecessary when high-quality conservative management exists.

Conclusion

There is a lack of long-term follow-up studies of post-surgical cases in patients with PWS and scoliosis. The rate of complications of spinal fusion in patients with PWS and scoliosis is very high, and death rates have been found to be higher than in patients with Adolescent Idiopathic Scoliosis (AIS). The long-term side-effects of the intervention are detrimental, so the risk-benefit ratio favours the conservative approaches over spinal fusion surgery.

19.

Background

Too much or too little milk production is a common problem in a lactation consultant's practice. Whereas underproduction is widely discussed in the lactation literature, overabundant milk supply is not. In my practice I work with women who experience moderate to severe oversupply syndrome. In most cases the syndrome can be successfully treated with full removal of milk followed by unilateral breastfeeding ad lib, with the same breast offered at every breastfeed within a certain time block ("block feeding").

Case presentations

Four cases of oversupply of breast milk are presented. The management and outcome of each case are described.

Conclusion

Overabundant milk supply is an often under-diagnosed condition in otherwise healthy lactating women. Full drainage and "block feeding" offer an adequate and user-friendly way to normalize milk production and treat symptoms in both mother and child.

20.

Background

Given the complex mechanisms underlying biochemical processes, systems biology researchers tend to build ever larger computational models. However, dealing with complex systems entails a variety of problems, e.g. difficulty of intuitive understanding, a variety of time scales, or non-identifiable parameters. Therefore, methods are needed that, at least semi-automatically, help to elucidate how the complexity of a model can be reduced such that important behavior is maintained and the predictive capacity of the model is increased. The results should be easily accessible and interpretable. In the best case, such methods may also provide insight into fundamental biochemical mechanisms.

Results

We have developed a strategy based on the Computational Singular Perturbation (CSP) method which can be used to perform a "biochemically-driven" model reduction of even large and complex kinetic ODE systems. We provide an implementation of the original CSP algorithm in COPASI (a COmplex PAthway SImulator) and applied the strategy to two example models of different degrees of complexity - a simple one-enzyme system and a full-scale model of yeast glycolysis.
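As a rough illustration of the timescale analysis that underlies CSP-style reduction, the sketch below computes the Jacobian eigenvalues of a toy one-enzyme mass-action system and reports its characteristic timescales; the model, parameters and finite-difference Jacobian are assumptions of this example and do not reproduce the authors' COPASI implementation, which additionally refines the fast/slow basis vectors iteratively.

```python
import numpy as np

# Toy one-enzyme system E + S <-> C -> E + P with binding much faster than catalysis.
k1, km1, k2 = 100.0, 1.0, 1.0

def rhs(x):
    s, e, c = x
    v1 = k1 * s * e - km1 * c      # reversible binding
    v2 = k2 * c                    # catalytic step
    return np.array([-v1, -v1 + v2, v1 - v2])   # d[S]/dt, d[E]/dt, d[C]/dt

def jacobian(x, h=1e-6):
    """Finite-difference Jacobian of the ODE right-hand side."""
    f0, J = rhs(x), np.zeros((len(x), len(x)))
    for j in range(len(x)):
        xp = x.copy()
        xp[j] += h
        J[:, j] = (rhs(xp) - f0) / h
    return J

x = np.array([1.0, 0.1, 0.05])                  # state of interest
eig = np.linalg.eigvals(jacobian(x))
rates = np.abs(eig.real[np.abs(eig.real) > 1e-9])   # drop the conservation-law zero mode
print("characteristic timescales:", np.sort(1.0 / rates))
# A wide gap between the fastest and slowest timescale flags modes that can be
# relaxed and removed, which is the starting point of CSP-based model reduction.
```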

Conclusion

The results show the usefulness of the method for model simplification purposes as well as for analyzing fundamental biochemical mechanisms. COPASI is freely available at http://www.copasi.org.
