Similar Articles
Found 20 similar articles (search time: 15 ms)
1.
The Biologische Anstalt Helgoland (BAH) offers unique possibilities for research and education in marine sciences in the southern part of the North Sea. Besides its own research duties, the Institute provides research facilities and technical assistance for guest scientists, assists in the teaching and education of university student groups, and conducts its own courses. The Institute further supplies universities and research institutions on the mainland with marine organisms. The marine station on Helgoland has 14 laboratories, with a total of 32 working places available for guest scientists. The Wadden Sea Institute in List on the island of Sylt offers 6 laboratories with a total of 18 working places. Furthermore, laboratory classrooms are located on Helgoland and in List for 50 and 20 participants, respectively. For the convenience of the guest researchers staying at the BAH, guest-houses are run on Helgoland (Arthur-Hagmeier-Haus, Wilhelm-Mielk-Haus) and in List (Adolf-Bückmann-Haus). Guest researchers have been welcome since the founding of the Institute in 1892. Heincke gave a brief report on the activities of the first visitors from 1892 to 1897. Only sporadic reports are available for the first 60 years of this century. Guest scientists and their activities have only been recorded in detail in the annual reports of the BAH since 1962. The number of researchers and the length of their visits have increased continuously since 1962. The research facilities on Helgoland, in List and in Hamburg have been modernized during the last 20 years. In 1971, four modern laboratories for guest researchers were opened on Helgoland with the financial support of the German Research Foundation (DFG). The number of guest researchers in List and Hamburg increased after the completion of new buildings in 1979 and 1982.
The recent increase in research activities by guest scientists is due to the numerous students, from many different universities, who use the superb research facilities for their Master's or Ph.D. theses. Guest researchers and students either perform their own research or cooperate with scientists of the BAH.

2.
With the rapid development of uncertain artificial intelligence and the arrival of the big data era, conventional clustering analysis and granular computing fail to satisfy the requirements of intelligent information processing in this new setting. There is an essential relationship between granular computing and clustering analysis, so some researchers have tried to combine the two: working from the idea of granularity, they extend research in clustering analysis and search for the best clustering results with the help of the basic theories and methods of granular computing. The granularity clustering methods proposed and studied in this way have attracted more and more attention. This paper first summarizes the background of granularity clustering and the intrinsic connection between granular computing and clustering analysis, and then reviews the research status and the various methods of granularity clustering. Finally, we analyze existing problems and propose directions for further research.

3.
Neutron capture therapy (NCT) research encompasses a wide range of preclinical and clinical studies needed to develop this promising but complex cancer treatment. Many specialized facilities and capabilities including thermal and epithermal neutron irradiation facilities, boron analysis, specialized mixed-field dosimetry, animal care facilities and protocols, cell culture laboratories, and, for human clinical studies, licenses and review board approvals are required for NCT research. Such infrastructure is essential, but much of it is not readily available within the community. This is especially true for neutron irradiation facilities, which often require significant development and capital investment too expensive to duplicate at each site performing NCT research. To meet this need, the NCT group at the Massachusetts Institute of Technology (MIT) has established a User Center for NCT researchers that is already being accessed successfully by various groups. This paper describes the facilities, capabilities and other resources available at MIT and how the NCT research community can access them.

4.
Lilley KS  Deery MJ  Gatto L 《Proteomics》2011,11(6):1017-1025
Many analytical techniques are executed by core facilities established within academic, pharmaceutical and other industrial institutions. The centralization of such facilities ensures a level of expertise and hardware which often cannot be supported by individual laboratories. The establishment of a core facility thus makes the technology available to multiple researchers in the same institution. Often, the services within the core facility are also opened out to researchers from other institutions, frequently with a fee being levied for the service provided. In the 1990s, with the onset of the age of genomics, there was an abundance of DNA analysis facilities, many of which have since disappeared from institutions and are now available through commercial sources. Ten years on, as proteomics was beginning to be utilized by many researchers, this technology found itself an ideal candidate for being placed within a core facility. We discuss what in our view are the daily challenges of proteomics core facilities. We also examine the potential unmet needs of the proteomics core facility that may also be applicable to proteomics laboratories which do not function as core facilities.

5.
This paper presents an overview of computing and networking facilities developed by the Medical Research Council to provide online computing support to the Human Genome Mapping Project (HGMP) in the UK. The facility is connected to a number of other computing facilities in various centres of genetics and molecular biology research excellence, either directly via high-speed links or through national and international wide-area networks. The paper describes the design and implementation of the current system, a ‘client/server’ network of Sun, IBM, DEC and Apple servers, gateways and workstations. A short outline of online computing services currently delivered by this system to the UK human genetics research community is also provided. More information about the services and their availability can be obtained by a direct approach to the UK HGMP-RC.

6.
Bridging the gap     
《MABS-AUSTIN》2013,5(5):440-452
Therapeutic monoclonal antibodies (mAbs) currently dominate the biologics marketplace. Development of a new therapeutic mAb candidate is a complex, multistep process and early stages of development typically begin in an academic research environment. Recently, a number of facilities and initiatives have been launched to aid researchers along this difficult path and facilitate progression of the next mAb blockbuster. Complementing this, there has been a renewed interest from the pharmaceutical industry to reconnect with academia in order to boost dwindling pipelines and encourage innovation. In this review, we examine the steps required to take a therapeutic mAb from discovery through early stage preclinical development and toward becoming a feasible clinical candidate. Discussion of the technologies used for mAb discovery, production in mammalian cells and innovations in single-use bioprocessing is included. We also examine regulatory requirements for product quality and characterization that should be considered at the earliest stages of mAb development. We provide details on the facilities available to help researchers and small-biotech build value into early stage product development, and include examples from within our own facility of how technologies are utilized and an analysis of our client base.

7.
Hong CB  Kim YJ  Moon S  Shin YA  Go MJ  Kim DJ  Lee JY  Cho YS 《BMB reports》2012,45(1):44-46
Recent advances in high-throughput genotyping technologies have enabled us to conduct a genome-wide association study (GWAS) on a large cohort. However, analyzing millions of single nucleotide polymorphisms (SNPs) is still a difficult task for researchers conducting a GWAS. Several difficulties, such as compatibility and dependency issues with analytical tools, are often encountered during the installation of software. This is a huge obstacle for any research institute without computing facilities and specialists. Therefore, a proper research environment is an urgent need for researchers working on GWAS. We developed BioSMACK to provide a research environment for GWAS that requires no configuration and is easy to use. BioSMACK is based on the Ubuntu Live CD, which offers a complete Linux-based operating system environment without installation. Moreover, we provide users with a GWAS manual consisting of a series of guidelines for GWAS and useful examples. BioSMACK is freely available at http://ksnp.cdc.go.kr/biosmack.

8.
Computational techniques have long been adopted in medical and biological systems. There is no doubt that the development and application of computational methods will greatly help in better understanding biomedical and biological functions. Large amounts of data have been produced by biomedical and biological experiments and simulations. In order for researchers to gain knowledge from original data, nontrivial transformation is necessary, which is regarded as a critical link in the chain of knowledge acquisition, sharing, and reuse. Challenges that have been encountered include: how to efficiently and effectively represent human knowledge in formal computing models, how to take advantage of semantic text mining techniques rather than traditional syntactic text mining, and how to handle security issues during knowledge sharing and reuse. This paper summarizes the state of the art in these research directions. We aim to provide readers with an introduction to the major computing themes applicable to medical and biological research.

9.
Therapeutic monoclonal antibodies (mAbs) currently dominate the biologics marketplace. Development of a new therapeutic mAb candidate is a complex, multistep process and early stages of development typically begin in an academic research environment. Recently, a number of facilities and initiatives have been launched to aid researchers along this difficult path and facilitate progression of the next mAb blockbuster. Complementing this, there has been a renewed interest from the pharmaceutical industry to reconnect with academia in order to boost dwindling pipelines and encourage innovation. In this review, we examine the steps required to take a therapeutic mAb from discovery through early stage preclinical development and toward becoming a feasible clinical candidate. Discussion of the technologies used for mAb discovery, production in mammalian cells and innovations in single-use bioprocessing is included. We also examine regulatory requirements for product quality and characterization that should be considered at the earliest stages of mAb development. We provide details on the facilities available to help researchers and small-biotech build value into early stage product development, and include examples from within our own facility of how technologies are utilized and an analysis of our client base.
Key words: monoclonal antibody, preclinical development, biologics, CHO cells, cell culture

10.
In this commentary I aim to raise awareness among researchers and sanctuary directors of potential barriers to retiring Old and New World monkeys from research facilities. I define a barrier as an opinion or stereotype that prevents primate retirement from occurring on a regular basis. By discussing retirement barriers and recommending how to overcome them, I aim to increase the frequency of retiring monkeys from the laboratory into naturalistic sanctuaries. In this article I compile a final list of 10 barriers to retirement—and recommendations on how to overcome them—based on responses to forums, comments from primate sanctuary directors, information contained in scientific and sanctuary literature, and personal experiences. I conclude that researchers will increase the frequency of primate retirement by performing the following 5 actions: (a) increase communication by networking with sanctuaries, (b) prevent negative publicity by developing a confidentiality clause with the sanctuary, (c) increase understanding by reviewing the articles written on retiring monkeys into sanctuaries, (d) increase funding for primate retirement by including funding requests in grant proposals, or (e) raise private funds.

11.
Microarrays: handling the deluge of data and extracting reliable information
Application of powerful, high-throughput genomics technologies is becoming more common and these technologies are evolving at a rapid pace. Genomics facilities are being established in major research institutions to produce inexpensive, customized cDNA microarrays that are accessible to researchers in a broad range of fields. These high-throughput platforms have generated a massive onslaught of data, which threatens to overwhelm researchers. Although microarrays show great promise, the technology has not matured to the point of consistently generating robust and reliable data when used in the average laboratory. This article addresses several aspects related to the handling of the deluge of microarray data and extracting reliable information from these data. We review the essential elements of data acquisition, data processing and data analysis, and briefly discuss issues related to the quality, validation and storage of data. Our goal is to point out some of the problems that must be overcome before this promising technology can achieve its full potential.

12.
Combined analysis of multiple, large datasets is a common objective in the health- and biosciences. Existing methods tend to require researchers to physically bring data together in one place or follow an analysis plan and share results. Developed over the last 10 years, the DataSHIELD platform is a collection of R packages that reduce the challenges of these methods. These include ethico-legal constraints which limit researchers’ ability to physically bring data together and the analytical inflexibility associated with conventional approaches to sharing results. The key feature of DataSHIELD is that data from research studies stay on a server at each of the institutions that are responsible for the data. Each institution has control over who can access their data. The platform allows an analyst to pass commands to each server and the analyst receives results that do not disclose the individual-level data of any study participants. DataSHIELD uses Opal which is a data integration system used by epidemiological studies and developed by the OBiBa open source project in the domain of bioinformatics. However, until now the analysis of big data with DataSHIELD has been limited by the storage formats available in Opal and the analysis capabilities available in the DataSHIELD R packages. We present a new architecture (“resources”) for DataSHIELD and Opal to allow large, complex datasets to be used at their original location, in their original format and with external computing facilities. We provide some real big data analysis examples in genomics and geospatial projects. For genomic data analyses, we also illustrate how to extend the resources concept to address specific big data infrastructures such as GA4GH or EGA, and make use of shell commands. Our new infrastructure will help researchers to perform data analyses in a privacy-protected way from existing data sharing initiatives or projects. 
To help researchers use this framework, we describe selected packages and present an online book (https://isglobal-brge.github.io/resource_bookdown).
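The core principle the abstract describes — individual-level data never leave each institution's server, and the analyst only ever receives non-disclosive aggregates — can be illustrated with a toy sketch. This is a hypothetical illustration of the federated pattern only, not the real DataSHIELD API (which is a set of R packages with a much richer command/response protocol via Opal):

```python
# Toy sketch of the non-disclosive principle behind DataSHIELD-style
# federated analysis. Hypothetical code: class and function names are
# invented for illustration.

class StudyServer:
    """Holds one institution's individual-level data; it never leaves."""

    def __init__(self, values):
        self._values = list(values)   # private, per-institution records

    def summarise(self):
        """Return only non-disclosive aggregates (count and sum)."""
        return len(self._values), sum(self._values)

def pooled_mean(servers):
    """Analyst side: combine per-study aggregates into a pooled mean
    without ever seeing any participant's individual value."""
    n = total = 0
    for server in servers:
        count, subtotal = server.summarise()
        n += count
        total += subtotal
    return total / n

servers = [StudyServer([1.0, 2.0, 3.0]), StudyServer([4.0, 5.0])]
print(pooled_mean(servers))  # pooled mean of all five values: 3.0
```

The design point mirrored here is that each server exposes only aggregate-returning operations, so access control and disclosure checks can be enforced at the institution that is responsible for the data.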

13.
High-throughput genome research has long been associated with bioinformatics, as it assists genome sequencing and annotation projects. Along with databases to store, properly manage, and retrieve biological data, a large number of computational tools have been developed to decode biological information from these data. However, with the advent of next-generation sequencing (NGS) technology, sequence data are being generated at a pace never seen before. Consequently, researchers face a potential shortage of storage space and of tools to analyze the data. Moreover, the voluminous data increase traffic in the network through the uploading and downloading of large data sets, and thus consume much of the network's available bandwidth. All of these obstacles have led to a solution in the form of cloud computing.

14.
ABSTRACT: BACKGROUND: Accurate and efficient RNA secondary structure prediction remains an important open problem in computational molecular biology. Historically, advances in computing technology have enabled faster and more accurate RNA secondary structure predictions. Previous parallelized prediction programs achieved significant improvements in runtime, but their implementations were not portable from niche high-performance computers or easily accessible to most RNA researchers. With the increasing prevalence of multi-core desktop machines, a new parallel prediction program is needed to take full advantage of today's computing technology. FINDINGS: We present here the first implementation of RNA secondary structure prediction by thermodynamic optimization for modern multi-core computers. We show that GTfold predicts secondary structure in less time than UNAfold and RNAfold, without sacrificing accuracy, on machines with four or more cores. CONCLUSIONS: GTfold supports advances in RNA structural biology by reducing the timescales for secondary structure prediction. The difference will be particularly valuable to researchers working with lengthy RNA sequences, such as RNA viral genomes.
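The dynamic-programming style underlying secondary structure prediction can be sketched with the classic Nussinov base-pair maximization recursion. This is a simplified stand-in, not GTfold's algorithm: GTfold performs full thermodynamic free-energy minimization, which is far more involved, but the nested-recursion structure it parallelizes looks like this:

```python
# Minimal Nussinov base-pair maximization sketch -- a simplified
# illustration of the DP recursions used in RNA secondary structure
# prediction, NOT the thermodynamic model GTfold implements.

def pairs(a, b):
    """True if bases a and b can pair (Watson-Crick or G-U wobble)."""
    return {a, b} in ({"A", "U"}, {"G", "C"}, {"G", "U"})

def nussinov(seq, min_loop=3):
    """Maximum number of non-crossing base pairs in seq, with at
    least min_loop unpaired bases inside every hairpin loop."""
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for span in range(min_loop + 1, n):       # fill shorter spans first
        for i in range(n - span):
            j = i + span
            best = dp[i + 1][j]               # case: base i unpaired
            for k in range(i + min_loop + 1, j + 1):
                if pairs(seq[i], seq[k]):     # case: i pairs with k
                    inside = dp[i + 1][k - 1]
                    after = dp[k + 1][j] if k + 1 <= j else 0
                    best = max(best, 1 + inside + after)
            dp[i][j] = best
    return dp[0][n - 1] if n else 0

print(nussinov("GGGAAAUCC"))  # 3 pairs: G-C, G-C, G-U
```

Because every cell on a given anti-diagonal of `dp` depends only on shorter spans, cells within one span length are independent, which is the kind of structure multi-core implementations exploit.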

15.
Platform technologies (PT) are techniques or tools that enable a range of scientific investigations and are critical to today's advanced technology research environment. Once installed, they require specialized staff for their operations, who in turn, provide expertise to researchers in designing appropriate experiments. Through this pipeline, research outputs are raised to the benefit of the researcher and the host institution.1 Platform facilities provide access to instrumentation and expertise for a wide range of users beyond the host institution, including other academic and industry users. To maximize the return on these substantial public investments, this wider access needs to be supported. The question of support and the mechanisms through which this occurs need to be established based on a greater understanding of how PT facilities operate. This investigation was aimed at understanding if and how platform facilities across the Bio21 Cluster meet operating costs. Our investigation found: 74% of platforms surveyed do not recover 100% of direct operating costs and are heavily subsidized by their home institution, which has a vested interest in maintaining the technology platform; platform managers play a major role in establishing the costs and pricing of the facility, normally in a collaborative process with a management committee or institutional accountant; and most facilities have a three-tier pricing structure recognizing internal academic, external academic, and commercial clients.

16.
There are numerous ways to display a phylogenetic tree, which is reflected in the diversity of software tools available to phylogenetists. Displaying very large trees continues to be a challenge, made ever harder as increasing computing power enables researchers to construct ever-larger trees. At the same time, computing technology is enabling novel visualisations, ranging from geophylogenies embedded on digital globes to touch-screen interfaces that enable greater interaction with evolutionary trees. In this review, I survey recent developments in phylogenetic visualisation, highlighting successful (and less successful) approaches and sketching some future directions.
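One of the simplest of the "numerous ways to display a phylogenetic tree" is to parse the standard Newick serialization and print it as an indented outline. The sketch below is hypothetical helper code (it handles only label-and-topology Newick, no branch lengths) and is not drawn from any tool named in the review:

```python
# Minimal sketch: parse a branch-length-free Newick string and render
# the tree as an indented text outline. Hypothetical illustration code.

def parse_newick(s):
    """Parse a simple Newick string into (label, children) tuples."""
    pos = 0

    def node():
        nonlocal pos
        children = []
        if s[pos] == "(":
            pos += 1                              # consume "("
            children.append(node())
            while s[pos] == ",":
                pos += 1                          # consume ","
                children.append(node())
            pos += 1                              # consume ")"
        start = pos
        while pos < len(s) and s[pos] not in "(),;":
            pos += 1                              # read the node label
        return s[start:pos], children

    return node()

def show(tree, depth=0):
    """Return the tree as a list of indented lines ('*' if unlabeled)."""
    label, children = tree
    lines = ["  " * depth + (label or "*")]
    for child in children:
        lines.extend(show(child, depth + 1))
    return lines

print("\n".join(show(parse_newick("((A,B)ab,C);"))))
```

Even this trivial renderer makes the review's point concrete: the display problem is separate from the data format, and large trees quickly outgrow what a linear outline like this can communicate.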

17.
This paper examines the US Atomic Energy Commission’s radioisotope distribution program, established in 1946, which employed the uranium piles built for the wartime bomb project to produce specific radioisotopes for use in scientific investigation and medical therapy. As soon as the program was announced, requests from researchers began pouring into the Commission’s office. During the first year of the program alone over 1000 radioisotope shipments were sent out. The numerous requests that came from scientists outside the United States, however, sparked a political debate about whether the Commission should or even could export radioisotopes. This controversy manifested the tension between the aims of the Marshall Plan and growing US national security concerns after World War II. Proponents of international circulation of radioisotopes emphasized the political and scientific value of collaborating with European scientists, especially biomedical researchers. In the end, radioisotopes were shipped from the Commission’s Oak Ridge facility to many laboratories in England and continental Europe, where they were used in biochemical research on animals, plants, and microbes. However, the issue of radioisotope export continued to draw political fire in the United States, even after the establishment of national atomic energy facilities elsewhere.  相似文献   

18.
19.
One of the challenges of computation-centric research is to make the research undertaken reproducible in a form that others can repeat and re-use with minimal effort. In addition to the data and tools necessary to re-run analyses, execution environments play a crucial role because of the dependencies on the operating system and software versions used. However, some of the challenges of reproducible science can be addressed using appropriate computational tools and cloud computing to provide an execution environment.
Here, we demonstrate the use of a Kepler scientific workflow for reproducible science that is sharable, reusable, and re-executable. These workflows reduce barriers to sharing and will save researchers time when undertaking similar research in the future.
To provide infrastructure that enables reproducible science, we have developed the cloud-based Collaborative Environment for Ecosystem Science Research and Analysis (CoESRA) infrastructure to build, execute and share sophisticated computation-centric research. The CoESRA provides users with a storage and computational platform that is accessible from a web browser in the form of a virtual desktop. Any registered user can access the virtual desktop to build, execute and share the Kepler workflows. This approach will enable computational scientists to share complete workflows in a pre-configured environment so that others can reproduce the computational research with minimal effort.
As a case study, we developed and shared a complete IUCN Red List of Ecosystems Assessment workflow that reproduces the assessments undertaken by Burns et al. (2015) on Mountain Ash forests in the Central Highlands of Victoria, Australia. This workflow provides an opportunity for other researchers and stakeholders to run this assessment with minimal supervision. The workflow also enables researchers to re-evaluate the assessment when additional data become available. The assessment can be run in a CoESRA virtual desktop by opening a workflow in the Kepler user interface and pressing a “start” button. The workflow is pre-configured with all the open access datasets and writes results to a pre-configured folder.

20.
[First paragraph(s)...]“A freezing chamber offers an easy place for such [frost] experiments... and ... valuable data as to the cold-resisting powers of our plants might be arrived at” (Cockayne, 1897). The National Climate Laboratory was opened in 1970 and has been operating for the past 25 years (recently celebrating its anniversary) for both national and international scientists carrying out environmental research on plants and animals (Halligan, 1995). The facilities have been used by a wide range of plant and animal based researchers from the pastoral, horticultural, and forestry sectors involving a range of disciplines from agronomy, pathology, entomology, physiology, plant breeding and zoology. New Zealand ecologists have been conspicuously absent (in spite of the potential in controlled environment research seen by pioneering ecologists—see above quotation) and yet there has been a range of activities that are complementary to this discipline.
