Similar Articles
1.
We have used Leginon, a fully automated system for acquiring cryo-electron micrographs, to collect single-particle data, specifically of the AAA ATPase p97. The images were acquired under low-dose conditions and required no operator intervention other than the initial setup and periodic refilling of the cold-stage dewar. Each image was acquired at two different defocus values. Two-dimensional projection maps of p97 were calculated from these data and compared to results previously obtained by conventional manual data collection to film. The results demonstrate that Leginon performs as well as an experienced microscopist in acquiring single-particle data. The general advantages of automation are discussed.

2.
A rate-limiting step in determining a connectome, the set of all synaptic connections in a nervous system, is extraction of the relevant information from serial electron micrographs. Here we introduce a software application, Elegance, that speeds acquisition of the minimal dataset necessary, allowing the discovery of new connectomes. We have used Elegance to obtain new connectivity data in the nematode worm Caenorhabditis elegans. We analyze the accuracy that can be obtained, which is limited by unresolvable ambiguities at some locations in electron microscopic images. Elegance is useful for reconstructing connectivity in any region of neuropil of sufficiently small size.

3.
4.
We have developed a system to automatically acquire cryo-electron micrographs. The system is designed to emulate all of the decisions and actions of a highly trained microscopist in collecting data from a vitreous ice specimen. These include identifying suitable areas of vitreous ice at low magnification, determining the presence and location of specimen on the grid, automatically adjusting imaging parameters (focus, astigmatism) under low-dose conditions, and acquiring images at high magnification to either film or a digital camera. The system is responsible for every aspect of image acquisition and can run unattended for over 24 h, requiring only periodic refilling of the cryogens. It has been tested on a variety of specimens that represent typical challenges in the field of cryo-electron microscopy. The results show that the overall performance of the system is equivalent to that of an experienced microscopist.

5.
Three-dimensional electron microscopy of biological specimens has made enormous breakthroughs in the past few years: quasi-atomic-resolution structures have been obtained for some highly symmetric virus particles, and the resolution of structures of asymmetric biological macromolecules and their complexes has also improved rapidly. Acquiring a sufficient number of high-quality electron micrographs is a key factor in obtaining high-resolution structures. In recent years, automated data acquisition techniques have made great progress in both electron tomography and single-particle methods. Their widespread application will make structure determination faster and push structural resolution to even higher levels.

6.
Spectral libraries have emerged as a viable alternative to protein sequence databases for peptide identification. These libraries contain previously detected peptide sequences and their corresponding tandem mass spectra (MS/MS). Search engines can then identify peptides by comparing experimental MS/MS scans to those in the library. Many of these algorithms employ the dot product score for measuring the quality of a spectrum-spectrum match (SSM). This scoring system does not offer a clear statistical interpretation and ignores fragment ion m/z discrepancies in the scoring. We developed a new spectral library search engine, Pepitome, which employs statistical systems for scoring SSMs. Pepitome outperformed the leading library search tool, SpectraST, when analyzing data sets acquired on three different mass spectrometry platforms. We characterized the reliability of spectral library searches by confirming shotgun proteomics identifications through RNA-Seq data. Applying spectral library and database searches on the same sample revealed their complementary nature. Pepitome identifications enabled the automation of quality analysis and quality control (QA/QC) for shotgun proteomics data acquisition pipelines.
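As an illustration of the dot-product scoring the abstract refers to, the sketch below bins two spectra onto a fixed m/z grid and computes their normalized dot product. The bin width, square-root intensity scaling, and function names are assumptions for illustration, not Pepitome's or SpectraST's actual implementation.

```python
import numpy as np

def bin_spectrum(mz, intensity, bin_width=1.0, max_mz=2000.0):
    """Bin peaks onto a fixed m/z grid; intensities are sqrt-scaled,
    a common variance-stabilizing choice in library search tools."""
    bins = np.zeros(int(max_mz / bin_width))
    for m, i in zip(mz, intensity):
        idx = int(m / bin_width)
        if idx < len(bins):
            bins[idx] += i
    return np.sqrt(bins)

def dot_product_score(query, library):
    """Normalized dot product in [0, 1] between two binned spectra;
    query and library are (mz, intensity) pairs."""
    q, l = bin_spectrum(*query), bin_spectrum(*library)
    denom = np.linalg.norm(q) * np.linalg.norm(l)
    return float(q @ l / denom) if denom else 0.0
```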

7.
8.
The results of neuromuscular reconstructions of the paralyzed face are difficult to assess, and sophisticated methods are needed to measure the motor deficits of facial paralysis or the functional recovery of the face. The aim of this development was a relatively simple data acquisition system that is easy to handle and inexpensive enough to delegate data acquisition to centers all over the world; such centers need not analyze the data themselves but can send them to a center with specialized equipment. A mirror system was developed to capture three different views of the face simultaneously on the video screen. At each investigation, a digital video is taken of a calibration grid and of standardized facial movements of the patient. A computer program enables secondary analysis of the digital video at any later time, calculating distances and movements three-dimensionally from the frontal image and the right and left mirror images. Pathologies of the mimic movements, as well as improvements after surgical procedures, can be identified with this system. A significant advantage is the possibility to rewatch the recorded movement under study and to apply any kind of analysis later on. Taking the video needs only a few minutes, so fatigue of the patient's mimic system is prevented. Measurements, usually at the endpoints of the movements, give excellent information on the magnitude of the movement or the degree of the facial palsy, whereas the video itself is very informative regarding the quality of the smile. Specific computer software was developed for standardized three-dimensional analysis of the video-documented facial movements and for data presentation, with options such as two-dimensional graphs of single moving points in the face or three-dimensional graphs of the movements of all measured points during a standardized facial movement. By comparing the right- and left-sided alterations of specific distances between two points during the facial movements, the degree of normal symmetry or pathologic asymmetry is quantified, as sketched below. This system is more suitable for detailed scientific multicenter studies than any previously established system, and a very sensitive instrument for exact evaluation of mimic function is now available.
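A sketch of that right/left comparison: given 3-D trajectories of tracked landmark pairs, one can compute each side's distance excursion during a movement and form a symmetry ratio. The names and the exact summary measure are hypothetical, not the paper's software.

```python
import numpy as np

def excursion(points_a, points_b):
    """Range of the 3-D distance between two tracked landmarks over a
    standardized movement; inputs are (N, 3) arrays of positions."""
    d = np.linalg.norm(points_a - points_b, axis=1)
    return d.max() - d.min()

def symmetry_index(right_pair, left_pair):
    """Ratio of right- to left-sided distance excursion. 1.0 means
    symmetric movement; values far from 1.0 indicate asymmetry.
    A hypothetical summary measure, not the paper's exact metric."""
    r = excursion(*right_pair)
    l = excursion(*left_pair)
    return r / l if l else float("inf")
```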

9.
Several scientific instrument suppliers offer complete networking and automation packages for analytical laboratories. Nevertheless, considerable work remains in standardizing the file formats generated by the data acquisition systems of different instrument manufacturers. Recent work on the netCDF transfer protocol for mass spectrometry data suggests that good progress is being made in the area of data formats. Our laboratory operates a number of diverse instruments, including two high-resolution systems (ZAB 2F, 70 SEQ) and one quadrupole (QMD 1000) from Fisons Instruments, one ion trap system from Finnigan (ITS 40), and one pyrolysis mass spectrometer from Horizon Instruments (RAPyD-400), all equipped with autosamplers. These instruments are physically located in two distinct laboratories. The data systems are based on very different computers, including a DEC PDP-11/24, a VAX 4000/90, and several PCs. The large amount of data produced by the MS laboratory and the implementation of GLPs (Good Laboratory Practices) and GALPs (Good Automated Laboratory Practices) prompted us to examine the possibility of networking the instrumentation in a client/server computing environment. All instrument data systems have been connected to the institute network via Ethernet, using either DECnet or TCP/IP. A VAXCluster consisting of a VAXStation 4000/90 host and a VAXStation 3100 satellite has been configured as a server using DEC PATHWORKS V4.1 server software. This provides file, disk, application, and print services to all PC clients connected network-wide. Unattended distributed backup and restore services for PC hard disks are implemented. Mass spectrometry data files are permanently archived in their original format on 4-Gbyte tape cartridges and stored for later retrieval. Files can be transferred to any office PC running the appropriate mass spectrometry software. A centralized spectra and structure information management system based on the MassLib (Chemical Concepts) software allows for library searches using the SISCOM algorithm, after specific file conversion programs or using JCAMP-DX files. Furthermore, the mass spectrometer data systems are readied for their eventual incorporation into a LIMS.

10.
Digital imaging technology is gradually being incorporated into all areas of biological research, but there is a distinct lack of information resources targeted at scientists in their specialist areas. There is a wealth of potential applications for digital images in phycology, including morphometric or visual analysis of specimens, taxonomic databases and publication of digital micrographs in lieu of photomicrographs. Here, we provide a review of digital imaging in general and its potential for the field of microalgal research in particular. We also present a number of imaging techniques that are critical for image acquisition and optimization, which can enable beginners to build their own libraries of high quality digital images. Resolution requirements of digital cameras are explained and related to microscope resolution. The benefits of digital imaging technology are discussed and contrasted with those of traditional silver halide technology.
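To make the camera-to-microscope resolution matching concrete, here is a small sketch under standard optics assumptions: the Abbe limit d = λ/(2 NA) sets the resolved distance at the specimen, and Nyquist sampling requires at least two camera pixels per resolved distance at the sensor plane. The function and its example numbers are illustrative, not from the review.

```python
def min_camera_pixels(na, wavelength_nm, magnification, sensor_width_mm):
    """Estimate the pixel count needed so the camera does not limit
    resolution: Abbe limit d = lambda / (2 * NA), projected onto the
    sensor by the magnification, sampled at >= 2 pixels per d."""
    d_specimen_um = (wavelength_nm / 1000.0) / (2.0 * na)  # resolved distance, um
    d_sensor_um = d_specimen_um * magnification            # projected onto sensor
    max_pixel_um = d_sensor_um / 2.0                       # Nyquist criterion
    return int(sensor_width_mm * 1000.0 / max_pixel_um)

# e.g. a 40x / NA 0.75 objective at 550 nm on a 13 mm wide sensor:
print(min_camera_pixels(0.75, 550, 40, 13))   # ~1772 pixels across
```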

11.
The construction of a consistent protein chemical shift database is an important step toward making more extensive use of these data in structural studies. Unfortunately, progress in this direction has been hampered by the quality of the available data, particularly with respect to chemical shift referencing, which is often either inaccurate or inconsistently annotated. Preprocessing of the data is therefore required to detect and correct referencing errors. We have developed a program for performing this task, based on the comparison of reported and expected chemical shift distributions. This program, named CheckShift, does not require additional data and is therefore applicable to data sets for which structures are not available, making it possible to re-reference chemical shifts before they are used as structural constraints.
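A simplified sketch of the underlying idea: slide the reported shift distribution against the expected one and keep the offset that best aligns their histograms. CheckShift's actual optimization, search range, and binning may differ; everything below is an assumption for illustration.

```python
import numpy as np

def estimate_reference_offset(reported, expected):
    """Estimate a systematic referencing offset (ppm) by histogram
    alignment of reported vs. expected chemical shift distributions.
    A simplified sketch, not CheckShift's actual procedure."""
    offsets = np.arange(-5.0, 5.0, 0.01)
    hist_range, bins = (0.0, 80.0), 400
    exp_h, _ = np.histogram(expected, bins=bins, range=hist_range, density=True)
    best = min(
        offsets,
        key=lambda o: np.abs(
            np.histogram(np.asarray(reported) + o, bins=bins,
                         range=hist_range, density=True)[0] - exp_h
        ).sum(),
    )
    return float(best)
```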

12.
This paper describes basic software for digitization and processing of microscopic cell images used at the Department of Clinical Cytology at Uppsala University Hospital. Data collection uses a family of programs running on a PDP-8 minicomputer connected to a Leitz Orthoplan microscope with two image scanners: a diode-array scanner and a moving-stage photometer. The digitized image data are converted to IBM-compatible format by a conversion program. The data structures for image processing and statistical evaluation on the IBM system are also described. Finally, some experiences from the use of the software in cytology automation are discussed.

13.
We report here the automation of search procedures to rapidly screen a large number of reagents and incubation conditions that lead to the formation of protein crystals. The system consists of a Biomek 1000 Automated Laboratory Workstation from Beckman Instruments under the control of a custom user-interface program developed by Cryschem. A plate composed of twenty-four vapor diffusion chambers, each with its own reservoir well and protein drop holder, was designed by Cryschem to fit the Biomek table. The Cryschem software manages a large database of incubation conditions and generates instructions for the workstation to dispense the protein, buffers, detergents, cofactors, and other reagents used to promote the formation of protein crystals. The plate is manually positioned on the Biomek table, and then under program control additions are automatically made to each chamber as follows: precipitating solution is added to each reservoir well, and the protein solution along with the precipitating solution and various other reagents are added to each drop holder. The plate is removed from the table and a mylar tape is applied to simultaneously seal all the chambers. The plates are placed at a controlled temperature and periodically examined for crystal formation.
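The dispensing logic can be pictured as turning a conditions database into per-chamber steps: precipitant into each reservoir well, then protein plus precipitant into each drop holder. The sketch below uses placeholder volumes and field names, not the Cryschem program's actual instruction format.

```python
def dispensing_plan(conditions, protein_ul=2.0, drop_precipitant_ul=2.0,
                    reservoir_ul=500.0):
    """Turn a list of up to 24 incubation conditions (dicts with a
    'precipitant' entry) into per-chamber dispensing steps for a
    24-chamber vapor diffusion plate. Volumes are illustrative
    placeholders, not the study's protocol."""
    plan = []
    for chamber, cond in enumerate(conditions[:24], start=1):
        plan.append((chamber, "reservoir", cond["precipitant"], reservoir_ul))
        plan.append((chamber, "drop", "protein", protein_ul))
        plan.append((chamber, "drop", cond["precipitant"], drop_precipitant_ul))
    return plan
```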

14.
SIGNATURE is a particle selection system for molecular electron microscopy. It applies a hierarchical screening procedure to identify molecular particles in EM micrographs. The user interface of the program provides versatile functions to facilitate image data visualization, particle annotation and particle quality inspection. The system design emphasizes both functionality and usability. This software has been released to the EM community and has been successfully applied to macromolecular structural analyses.

15.
While gathering data on the visual pigments of numerous species, we have taken many light micrographs of the photoreceptor cells containing the visual pigment after spectral recordings were made from these cells. These micrographs are used to view the cells and to measure their size. Usually the morphology of the photoreceptor cells is retained well enough for such measurements to be taken. Occasionally it is necessary to partially fix the retinal tissue, which aids in maintaining photoreceptor morphology while still allowing visual pigment spectral recordings to be made. We have found that primate retinal tissue, as well as some other mammalian tissue, disintegrates rapidly. Although partial fixation allows spectral recordings to be made before the micrographs are taken, the treatment does not always preserve cell morphology well enough for quality micrographs to be obtained. In these cases, visual pigment recordings are made from pieces of unfixed and partially fixed retinal tissue, and an additional piece of the same retinal material is well fixed, embedded, and thick-sectioned for light microscopy. Preparing retinal material for sectioning is lengthy and time-consuming, so an alternative tissue preparation technique was sought. Material can be processed for the scanning electron microscope (SEM) more rapidly than for sectioning; however, severe tissue shrinkage occurs during this process. We found that although shrinkage does occur in retinal tissue prepared for SEM, the relative proportions of the photoreceptor cells are maintained extremely well. Using the critical point drying (CPD) method, pig retinal tissue was prepared for SEM, and scanning electron micrographs of the pig photoreceptors were taken for cell measurement. Since these micrographs could be made at higher magnification than is available by light microscopy, a more detailed view of the pig photoreceptor cells was obtained. Cell measurements made from the light and the scanning electron micrographs indicate that an approximate shrinkage of 50% occurs in the SEM-prepared material.
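Since the relative proportions are preserved, an SEM measurement can be scaled back to an estimate of the original dimension using the reported ~50% linear shrinkage. The helper below is a hypothetical illustration of that arithmetic, not a procedure from the paper.

```python
def correct_for_shrinkage(measured_um, linear_shrinkage=0.50):
    """Estimate the pre-shrinkage dimension from an SEM measurement,
    assuming measured = original * (1 - shrinkage). The ~50% figure
    comes from the light-vs-SEM comparison; exact values will vary."""
    return measured_um / (1.0 - linear_shrinkage)

print(correct_for_shrinkage(1.2))   # 1.2 um measured -> ~2.4 um original
```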

16.
Information management has been an integral part of the research process at the North Temperate Lakes Long-Term Ecological Research (NTL LTER) program for over 30 years. A combination of factors has made the information management system (IMS) at NTL very successful: significant resources have been invested in the IMS from the beginning; the Information Manager has been part of the leadership team at NTL and later served in various roles at the LTER network level; and the NTL IMS was a very early adopter of database systems, standardized metadata, and a data delivery system based on those metadata. This approach has made data easily accessible to NTL researchers and the broader scientific community. Data management workflows have become increasingly automated as modern technologies were adopted, making the system efficient enough to handle core data as well as all one-time research data generated within NTL and several related projects. More than three decades of core data from eleven lakes are reused extensively as critical background information and as the limnological go-to site for many synthesis projects within and beyond LTER. The NTL IMS continues to implement new technologies for improving data management efficiency, discovery, access, integration, and synthesis. Accordingly, the functionality of the original online data access system, programmed in Java and JavaServer Pages (JSP), was ported to the modern content management system Drupal and integrated into LTER's Drupal Ecological Information Management System (DEIMS). NTL has invested in sensor technology for studying lake conditions over the long term, which necessitated a sophisticated management system tailored to high-frequency data streams. Several technologies have been used at different times to automate the management, quality control, and archiving of these high-volume data. Near-real-time lake conditions can be accessed on the NTL website and via smartphone apps. Easy access to long-term and sensor data in the NTL IMS has led NTL researchers to develop new analytical methods and to publish several R statistical packages. Recent graduate students are now employed as data scientists, helping define a new career path inspired by the availability of data. The NTL project has amassed one of the world's most comprehensive long-term datasets on lakes and their surrounding landscapes. The NTL IMS facilitates the use of these data by multiple groups for research, education, and communication of science to the public.

17.
A microcomputer-controlled data acquisition system for spectrophotometric enzyme kinetics measurements has been assembled. The system uses an Apple IIe computer which is interfaced to the binary coded decimal output of a Gilford spectrophotometer. No analog-to-digital converter had to be purchased. A BASIC program which collects timed absorbance readings every 500 ms, plots the data in real time, performs a linear regression of the data to measure the reaction rate, and calculates the enzyme activity concentration is given in full. Details describing the interfacing of the computer to the spectrophotometer are presented which will permit other laboratories to readily assemble their own systems using this hardware. Kinetic data acquired by the system are highly reproducible and agree well with data processed much more slowly by manual techniques from strip chart recordings.
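The rate computation the BASIC program performs can be sketched as follows (in Python for brevity): fit a line to absorbance readings taken every 500 ms and convert the slope to an activity via the Beer-Lambert law. The molar absorptivity and dilution volumes below are assumed example values for an NADH-based assay at 340 nm, not necessarily those of the original system.

```python
import numpy as np

def enzyme_activity(absorbances, dt=0.5, epsilon=6.22e3,
                    path_cm=1.0, total_ml=1.0, sample_ml=0.1):
    """Fit timed absorbance readings (one every dt seconds) with a
    straight line and convert the slope to activity in U/L.
    epsilon is the molar absorptivity (NADH at 340 nm, 6220
    L mol^-1 cm^-1) -- an assumed assay, not the paper's."""
    t = np.arange(len(absorbances)) * dt
    slope_per_s, _ = np.polyfit(t, absorbances, 1)   # dA/dt
    d_a_per_min = slope_per_s * 60.0
    # U/L = (dA/min) / (epsilon * path) * 1e6 * dilution factor
    return d_a_per_min / (epsilon * path_cm) * 1e6 * (total_ml / sample_ml)
```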

18.
The extent and pattern of glycosylation on therapeutic antibodies can influence their circulatory half-life, engagement of effector functions, and immunogenicity, with direct consequences for efficacy and patient safety. Hence, controlling glycosylation patterns is central to any drug development program, yet poses a formidable challenge to the bio-manufacturing industry. Process changes that can affect glycosylation patterns range from manufacturing at different scales or sites, to switching production process mode, all the way to using alternative host cell lines. In the emerging space of biosimilar development, often all of these aspects apply. Gaining a deep understanding of the direction and extent to which glycosylation quality attributes can be modulated is key for efficient fine-tuning of glycan profiles in a stage-appropriate manner, but establishing such platform knowledge is time-consuming and resource-intensive. Here we report an inexpensive and highly adaptable screening system for comprehensive modulation of glycans on antibodies expressed in CHO cells. We characterize 10 media additives in univariable studies and in combination, using a design of experiments approach to map the design space for tuning glycosylation profile attributes. We introduce a robust workflow that does not require automation, yet enables rapid process optimization. We demonstrate scalability across deep wells, shake flasks, the AMBR-15 cell culture system, and 2 L single-use bioreactors. Further, we show that it is broadly applicable to different molecules and host cell lineages. This universal approach permits fine-tuned modulation of glycan product quality, reduces development costs, and enables agile implementation of process changes throughout the product lifecycle.
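The design-of-experiments screen can be pictured as generating factorial combinations of additive levels, one per culture. The additive names and levels below are placeholders, not the ten additives characterized in the study, and the study's actual design may have been fractional rather than full factorial.

```python
from itertools import product

def full_factorial(additives, levels=(0.0, 1.0)):
    """Two-level full-factorial design: one dict of additive
    concentrations (arbitrary units) per culture well."""
    names = list(additives)
    return [dict(zip(names, combo)) for combo in product(levels, repeat=len(names))]

# Placeholder additives, not the study's panel:
design = full_factorial(["galactose", "uridine", "MnCl2"])
print(len(design))   # 2**3 = 8 runs
```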

19.
Sorting on the basis of the complex features resolved by chromosome slit-scan analysis requires rapid and flexible pulse shape acquisition and processing for determining sort decisions before droplet breakoff. Fluorescence scans of chromosome morphology contain centromeric index and banding information suitable for chromosome classification, but these scans are often characterized by variability in length and height and require sophisticated data processing procedures for identification. Setting sort criteria on such complex morphological data requires digitization and subsequent computation by an algorithm tolerant of variations in overall pulse shape. We demonstrate here the capability to sort individual chromosomes based on their morphological features measured by slit-scan flow cytometry. To do this we have constructed a sort controller capable of acquiring a 128-byte chromosome waveform and executing a series of numerical computations resulting in an area-based centromeric index sort decision in less than 2 ms. The system is configured in a NOVIX microprocessor, programmed in FORTH, and interfaced to a slit-scan flow cytometer data acquisition system. An advantage of this configuration is direct control over the machine state during program execution, minimizing processing time. Examples of flow-sorted chromosomes are shown with their corresponding fluorescence pulse shapes.
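A simplified sketch of an area-based centromeric index from a digitized slit-scan pulse: locate the centromeric dip in the interior of the waveform, then take the smaller arm area over the total. The real controller does this on 128-byte waveforms in FORTH on a NOVIX in under 2 ms; the Python below, with its assumed dip-finding heuristic, only illustrates the arithmetic.

```python
import numpy as np

def centromeric_index(waveform):
    """Area-based centromeric index from a 1-D fluorescence scan
    (e.g. 128 samples): the centromere is taken as the minimum in
    the interior of the pulse, and the index is the smaller arm
    area over the total. A sketch, not the controller's algorithm."""
    w = np.asarray(waveform, dtype=float)
    interior = slice(len(w) // 4, 3 * len(w) // 4)   # ignore pulse edges
    dip = int(np.argmin(w[interior])) + interior.start
    left, right = w[:dip].sum(), w[dip:].sum()
    total = left + right
    return min(left, right) / total if total else 0.0
```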

20.
The ability to evaluate the validity of data is essential to any investigation, and manual “eyes on” assessments of data quality have dominated in the past. Yet, as the size of collected data continues to increase, so does the effort required to assess their quality. This challenge is of particular concern for networks that automate their data collection, and has resulted in the automation of many quality assurance and quality control analyses. Unfortunately, the interpretation of the resulting data quality flags can become quite challenging with large data sets. We have developed a framework to summarize data quality information and facilitate interpretation by the user. Our framework consists of first compiling data quality information and then presenting it through two separate mechanisms: a quality report and a quality summary. The quality report presents the results of specific quality analyses as they relate to individual observations, while the quality summary takes a spatial or temporal aggregate of each quality analysis and provides a summary of the results. Included in the quality summary is a final quality flag, which further condenses data quality information to assess whether a data product is valid or not. This framework has the added flexibility to allow “eyes on” information on data quality to be incorporated for many data types. Furthermore, this framework can aid problem tracking and resolution, should sensor or system malfunctions arise.
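The report/summary split might be implemented along these lines: keep per-observation flags as the quality report, then aggregate each test into a failure rate and condense the rates into a final flag. The threshold and naming are assumptions for illustration, not the framework's actual defaults.

```python
import numpy as np

def quality_summary(flags, fail_threshold=0.05):
    """Aggregate per-observation QA/QC flags (dict of test name ->
    boolean array, True = failed) into per-test failure rates plus a
    single final flag. The 5% policy is an assumed example."""
    summary = {test: float(np.mean(f)) for test, f in flags.items()}
    final_flag = "valid" if all(r <= fail_threshold for r in summary.values()) else "suspect"
    return summary, final_flag

flags = {"range": np.array([False, False, True, False]),
         "spike": np.array([False, False, False, False])}
print(quality_summary(flags))   # ({'range': 0.25, 'spike': 0.0}, 'suspect')
```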
