Similar Documents
20 similar documents found.
1.
A three-dimensional model for the complex between human serum retinol binding protein and transthyretin (formerly named prealbumin) is presented. The model was obtained by interactive rigid-body computer graphics docking and the characterization of the molecular surfaces in terms of fractal dimension. Available experimental data, as well as results from molecular dynamics calculations, support the proposed model.

2.
Two Macintosh programs written for multivariate data analysis and multivariate data graphical display are presented. MacMul includes principal component analysis (PCA), correspondence analysis (CA) and multiple correspondence analysis (MCA), with a complete, original and unified set of numerical aids to interpretation. GraphMu is designed for drawing collections of elementary graphics (curves, maps, graphical models), thus allowing comparisons between variables, individuals, and principal axis planes of multivariate methods. Both programs are self-documented applications and make full use of the user-oriented graphical interface of the Macintosh to simplify the process of analysing data sets. An example is described to show the results obtained on a small ecological data set. Received on January 24, 1989; accepted on July 17, 1989
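The PCA computation at the heart of a program like MacMul can be illustrated compactly. The sketch below is a hypothetical, minimal two-variable example in pure Python, not MacMul's actual code: it centers the data, builds the 2×2 covariance matrix, and returns the explained variances as that matrix's eigenvalues via the quadratic formula.

```python
import math

def pca_2d(data):
    """PCA for two variables: center the data, build the 2x2 covariance
    matrix, and return its eigenvalues (the variances explained by the
    two principal axes) in descending order."""
    n = len(data)
    mx = sum(x for x, _ in data) / n
    my = sum(y for _, y in data) / n
    centered = [(x - mx, y - my) for x, y in data]
    sxx = sum(x * x for x, _ in centered) / (n - 1)
    syy = sum(y * y for _, y in centered) / (n - 1)
    sxy = sum(x * y for x, y in centered) / (n - 1)
    # Eigenvalues of [[sxx, sxy], [sxy, syy]] from the characteristic polynomial
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    disc = math.sqrt(tr * tr / 4 - det)
    return [tr / 2 + disc, tr / 2 - disc]

# Two perfectly correlated variables: all variance lies on the first axis
evals = pca_2d([(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)])
print(evals)  # → [5.0, 0.0]
```

For perfectly correlated variables the second eigenvalue is zero, which is exactly the redundancy a PCA display exposes.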

3.
An editor system is presented for the flexible definition of colour maps for use in raster graphics systems with frame buffers. This system, MONICOL, allows easy display of 3D objects, particularly in molecular graphics. Some applications to the representation of shaded objects and energy surfaces are given, as well as a software solution for multiple-layer buffering.

4.
A computer graphics technique for computer-assisted stereotactic surgery is presented. The program is designed to aid the surgeon by presenting an on-line graphics display of stereotactic probes and electrodes superimposed on cross sections of the human brain stem. This technique simulates an otherwise blind surgical procedure on a graphics screen for use during surgery. An earlier system based around the DEC MINC-11 BA computer system has been used by the authors for the performance of stereotactic surgery with conventional ventriculography. This system has been upgraded and is now configured around an even more compact microprocessor-based hardware system with expanded graphics capabilities, which also allows its use with computerized tomography.

5.
High‐throughput systems and processes have typically been targeted for process development and optimization in the bioprocessing industry. For process characterization, bench scale bioreactors have been the system of choice. Due to the need for performing different process conditions for multiple process parameters, the process characterization studies typically span several months and are considered time and resource intensive. In this study, we have shown the application of a high‐throughput mini‐bioreactor system, viz. the Advanced Microscale Bioreactor (ambr15™), to perform process characterization in less than a month and develop an input control strategy. As a pre‐requisite to process characterization, a scale‐down model was first developed in the ambr system (15 mL) using statistical multivariate analysis techniques that showed comparability with both manufacturing scale (15,000 L) and bench scale (5 L). Volumetric sparge rates were matched between ambr and manufacturing scale, and the ambr process matched the pCO2 profiles as well as several other process and product quality parameters. The scale‐down model was used to perform the process characterization DoE study and product quality results were generated. Upon comparison with DoE data from the bench scale bioreactors, similar effects of process parameters on process yield and product quality were identified between the two systems. We used the ambr data for setting action limits for the critical controlled parameters (CCPs), which were comparable to those from bench scale bioreactor data. In other words, the current work shows that the ambr15™ system is capable of replacing the bench scale bioreactor system for routine process development and process characterization. © 2015 American Institute of Chemical Engineers Biotechnol. Prog., 31:1623–1632, 2015

6.
DNA/GUI (DNA Graphical User Interface) is an interactive software system for rapid and efficient analysis of images of the types used in genome mapping, such as autoradiograms and electrophoretic gels. Images are digitized using a commercially available charge-coupled-device (CCD) camera system and analyzed on a graphics workstation using a menu-driven user interface. DNA/GUI features automatic lane and band detection, simultaneous display of multiple images and a unique spatial-normalization algorithm. Images and their associated data are archived and easily available for later recall. Preliminary results indicate that DNA/GUI is a useful tool in the analysis and comparison of images used in a variety of applications such as genetic-linkage analysis and DNA restriction mapping. The interactive display software is based on the X Window System and is therefore readily portable to a variety of graphics workstations.

7.
Curie point pyrolysis-mass spectrometry is a powerful method for fast characterization of complex, nonvolatile materials. Fast, reproducible heating of the material results in a characteristic mixture of volatile fragment products, which is analyzed on-line by mass spectrometry. The method can be used for various purposes ranging from classification and identification to quality control and biochemical analysis, and has already proven to be a versatile tool in the fields of (micro)biology, biochemistry, soil science and geochemistry. Our fully automated Py-MS system for batch-wise analysis of series of samples is presented, together with computer methods for multivariate analysis of the spectral data. Some results obtained within the application fields mentioned above are also given.

8.
A multivariate Gaussian model for mammalian development is presented with the associated biological and mathematical assumptions. Many biological investigations use the female mammal X chromosome to test hypotheses and to estimate parameters of the developmental system. In particular, Lyon's (1961) hypotheses are used as a basis of the mathematical model. Experimental mouse data and three sets of human experimental data are analyzed using the hypothesized Gaussian model. The estimated biological parameters are consistent with some current biological theories.

9.
We present results of a 260 picosecond (ps) molecular dynamics (MD) simulation of substance P (SP) in a hydrated bilayer of dimyristoyl phosphatidyl choline (DMPC) (39 DMPC molecules with 776 water molecules). The 260 ps MD simulation was carried out with a 0.001 ps time step and a united-atom force field, using the AMBER 4.0 package. The nonbonded pair list was updated every 20 cycles with a 12.5 Å cutoff distance. Analysis of the MD data was done using our package ANALMD, and the resulting models are presented using the graphics package RASMOL. All simulations, analysis and graphics were done on an INDIGO-2 R-4400 Extreme graphics workstation. Our results show no systematic change in the order parameter, but a reduction in the trans fraction of the chain torsional angles compared to our earlier MD simulation of a hydrated DMPC bilayer without SP. The C-terminal and central peptide residues adopt a partial helical conformation; the helix type, as classified on the basis of H-bonds, is between α and 3₁₀. The peptide backbone shows flexibility during the heating runs; later it is stabilized, with little change in the spatial position of the backbone. The lipid matrix serves to immobilize the peptide backbone in a preferred conformation.

10.
Jaillais B, Perrin E, Mangavel C, Bertrand D. Planta, 2011, 233(6): 1147-1156
Variations in the quality of wheat kernels can be an important problem in the cereal industry. In particular, desiccation conditions play an essential role in both the technological characteristics of the kernel and its ability to sprout. In planta desiccation constitutes a key stage in the determinism of the functional properties of seeds. The impact of desiccation on the endosperm texture of seed is presented in this work. A simple imaging system had previously been developed to acquire multivariate images to characterize the heterogeneity of food materials. A special algorithm based on principal component analysis (PCA) was developed to process the acquired multivariate images. Wheat grains were collected at physiological maturity and subjected to two types of drying conditions that induced different kinetics of water loss. A data set containing 24 images (702 × 524 pixels) corresponding to the different desiccation stages of wheat kernels was acquired at different wavelengths and then analyzed. A comparison of the images of kernel sections highlighted changes in kernel texture as a function of their drying conditions. Slow drying led to a floury texture, whereas fast drying caused a glassy texture. The automated imaging system thus developed is sufficiently rapid and economical to enable the characterization of grain texture in large collections as a function of time and water content.
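The pixel-level PCA such an imaging system relies on can be sketched as follows. This is an illustrative reimplementation, not the authors' algorithm: each pixel is treated as a vector of intensities at several wavelengths, the leading eigenvector of the pixel covariance matrix is estimated by power iteration, and the projection of each pixel onto it yields a score image that highlights texture contrasts.

```python
def first_pc_scores(pixels, iters=200):
    """Each pixel is a vector of intensities at several wavelengths.
    Center the pixels, estimate the leading eigenvector of the covariance
    matrix by power iteration, and return one score per pixel."""
    n, p = len(pixels), len(pixels[0])
    means = [sum(px[j] for px in pixels) / n for j in range(p)]
    c = [[px[j] - means[j] for j in range(p)] for px in pixels]
    # Sample covariance matrix across wavelengths
    cov = [[sum(c[i][a] * c[i][b] for i in range(n)) / (n - 1)
            for b in range(p)] for a in range(p)]
    v = [1.0] * p
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(p)) for a in range(p)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    # Score = projection of each centered pixel onto the leading axis
    return [sum(c[i][j] * v[j] for j in range(p)) for i in range(n)]

# Four pixels at 3 wavelengths; the first two differ strongly from the last two
scores = first_pc_scores([[10, 10, 10], [11, 11, 11], [2, 2, 2], [1, 1, 1]])
```

Pixels from the two groups receive scores of opposite sign, so rendering the scores as a grayscale image separates the two textures.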

11.
A process fault identification and classification scheme for an automobile door assembly process is presented based on multivariate in-line dimensional measurements and principal component factor analysis. First, the door assembly process and the dimensional measurement system are briefly introduced. Second, the technique of principal component factor analysis is presented for process fault identification. Process faults are summarized based on off-line identified case studies. Finally, a machine classification scheme based on principal components and principal factors is presented and evaluated, using the pattern knowledge obtained off-line. This scheme is shown to be effective in classifying process faults using production data.
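A simple stand-in for the fault-identification idea is a control chart on a principal-component score: measurements whose score falls outside mean ± k standard deviations are flagged as potential faults. The function below is a hypothetical univariate sketch under that assumption, not the paper's actual classification scheme.

```python
def fault_flags(scores, k=3.0):
    """Flag samples whose principal-component score falls outside
    mean +/- k standard deviations -- a univariate control chart on
    one score. The threshold k is a tuning choice."""
    n = len(scores)
    m = sum(scores) / n
    s = (sum((x - m) ** 2 for x in scores) / (n - 1)) ** 0.5
    return [abs(x - m) > k * s for x in scores]

# Ten in-control scores and one gross deviation; a looser k=2 limit
# is used here because the outlier itself inflates the estimated spread
flags = fault_flags(
    [0.1, -0.2, 0.0, 0.15, -0.1, -0.05, 0.2, 0.05, -0.15, 0.1, 6.0],
    k=2.0,
)
```

In production use the mean and spread would be estimated from known-good reference data rather than from the batch being screened, precisely to avoid that inflation.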

12.
Stereotactic tumor biopsy and brachytherapy catheter implantation can be accomplished with targets derived from computed axial tomography and magnetic resonance scans. Computer manipulation of image data allows both diagnostic and therapeutic procedures to be carried out from a single set of scan slices. This eliminates the need for repeat scanning as part of the surgical procedure. Microcomputer technology is sufficiently advanced to handle the images and graphics necessary for stereotactic neurosurgery. A system based on the IBM PC/AT designed for this purpose uses readily available graphics software and custom-designed imaging programs. Direct loading of computed axial or magnetic resonance scan images from magnetic tape can be accomplished. Determination of points, contours and volumes in three-dimensional space allows intraoperative alignment of image data and patient landmarks within the stereotactic head frame using pattern recognition overlays. Three-axis scaling for magnification correction along with rotational and linear data transformations provide the basis for single-scan stereotaxis. Interactive computer graphics integrate image, patient and frame coordinates for target determination. This method eliminates the need to design and fabricate nonmagnetic or radiolucent scanner-compatible devices.

13.
14.
An in-house designed, computerised flow injection system comprising a fully integrated microdistillation flow injection (MDFI) system for low-level ammonia analysis is reported. In this system, the microdistillation separation step was incorporated into the flow injection manifold, and the sensing element of the ammonia gas sensing probe was replaced by a flow-through micro-pH electrode, which sensed the change in pH of a flowing collector solution caused by the dissolution of distilled ammonia gas, in a process analogous to that occurring in the internal solution of the gas sensing probe. A computerised control and data acquisition system was constructed using a commercially available data acquisition card, which offered advantages such as improved data acquisition rates and control over the system components, as well as good graphics display and data processing options. The system was optimised using a multivariable simplex optimisation technique.

15.
16.
The objective of proteomics is to get an overview of the proteins expressed at a given point in time in a given tissue and to identify the connection to the biochemical status of that tissue. Sample throughput and analysis time are therefore important issues in proteomics. The concept of proteomics is to encircle the identity of proteins of interest; however, the overall relation between proteins must also be explained. Classical proteomics consists of separation and characterization, based on two-dimensional electrophoresis, trypsin digestion, mass spectrometry and database searching. Characterization includes labor-intensive work to manage, handle and analyze data. The field of classical proteomics should therefore be extended to include handling of large data sets in an objective way. The separation obtained by two-dimensional electrophoresis and mass spectrometry gives rise to huge amounts of data. We present a multivariate approach to the handling of data in proteomics, with the advantage that protein patterns can be spotted at an early stage, so that the proteins for sequencing can be selected intelligently. These methods can also be applied to other data-generating protein analysis methods such as mass spectrometry and near-infrared spectroscopy, and examples of application to these techniques are also presented. Multivariate data analysis can unravel complicated data structures and may thereby relieve the characterization phase in classical proteomics. Traditional statistical methods are not suitable for analysis of such huge amounts of data, where the number of variables exceeds the number of objects; multivariate data analysis, on the other hand, may uncover the hidden structures present in these data. This study takes its starting point in classical proteomics and shows how multivariate data analysis can lead to faster ways of finding interesting proteins. Multivariate analysis has shown interesting results as a supplement to classical proteomics and has added a new dimension to the field.

17.
This paper examines the feasibility of using multivariate data analysis (MVDA) for supporting some of the key activities that are required for successful manufacturing of biopharmaceutical products. These activities include scale-up, process comparability, process characterization, and fault diagnosis. Multivariate data analysis and modeling were performed using representative data from small-scale (2 L) and large-scale (2000 L) batches of a cell-culture process. Several input parameters (pCO2, pO2, glucose, pH, lactate, ammonium ions) and output parameters (purity, viable cell density, viability, osmolality) were evaluated in this analysis. Score plots, loadings plots, and VIP plots were utilized for assessing scale-up and comparability of the cell-culture process. Batch control charts were found to be useful for fault diagnosis during routine manufacturing. Finally, observations made from reviewing VIP plots were found to be in agreement with conclusions from process characterization studies demonstrating the effectiveness of MVDA as a tool for extracting process knowledge.

18.
Wang T, Wu L. Biometrics, 2011, 67(4): 1452-1460
Multivariate one-sided hypothesis testing problems arise frequently in practice, and various tests have been developed. In practice, there are often missing values in multivariate data. In this case, standard testing procedures based on complete data may not be applicable or may perform poorly if the missing data are discarded. In this article, we propose several multiple imputation methods for multivariate one-sided testing problems with missing data. Some theoretical results are presented. The proposed methods are evaluated using simulations, and a real data example is presented to illustrate the methods.
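A toy version of the imputation idea, and emphatically not the authors' proposed methods, can be sketched as follows: each missing entry is filled by a random draw from the observed values, an estimate is computed on every completed data set, and the m estimates are pooled by averaging, which is the first step of Rubin's rules.

```python
import random

def multiply_impute_mean(values, m=20, seed=0):
    """Hypothetical multiple-imputation sketch: fill each missing entry
    (None) with a random draw from the observed values, compute the mean
    of every completed data set, and pool the m estimates by averaging."""
    rng = random.Random(seed)
    observed = [v for v in values if v is not None]
    estimates = []
    for _ in range(m):
        completed = [v if v is not None else rng.choice(observed)
                     for v in values]
        estimates.append(sum(completed) / len(completed))
    return sum(estimates) / m

# Two of six entries are missing; the pooled estimate stays near the
# observed mean of 2.0
est = multiply_impute_mean([1.0, 2.0, None, 3.0, None, 2.0], m=50)
```

A real multiple-imputation procedure would also combine the within- and between-imputation variances to get valid standard errors; only the point-estimate pooling is shown here.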

19.
Demers S, Kim J, Legendre P, Legendre L. Cytometry, 1992, 13(3): 291-298
Flow cytometry has recently been introduced in aquatic ecology. Its unique feature is to measure several optical characteristics simultaneously on a large number of cells. Until now, these data have generally been analyzed in simple ways, e.g., frequency histograms and bivariate scatter diagrams, so that the multivariate potential of the data has not been fully exploited. This paper presents a way of answering ecologically meaningful questions using the multivariate characteristics of the data. In order to do so, the multivariate data are reduced to a small number of classes by clustering, which reduces the data to a categorical variable. Multivariate pairwise comparisons can then be performed among samples using these new data vectors. The test case presented in the paper forms a time series of observations from which the new method enables us to study the temporal evolution of cell types.
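The reduction of cytometry measurements to a categorical variable by clustering can be illustrated with a one-dimensional k-means sketch. Real cytometry data are multiparameter and the data and starting centers below are made up, so this is only a minimal illustration of the clustering step.

```python
def kmeans_1d(xs, c0, c1, iters=20):
    """Two-class k-means on one measured parameter: alternately assign
    each value to the nearer center and recompute the centers as class
    means, then return a class label (0 or 1) per value."""
    for _ in range(iters):
        a = [x for x in xs if abs(x - c0) <= abs(x - c1)]
        b = [x for x in xs if abs(x - c0) > abs(x - c1)]
        if a:
            c0 = sum(a) / len(a)
        if b:
            c1 = sum(b) / len(b)
    return [0 if abs(x - c0) <= abs(x - c1) else 1 for x in xs]

# Six "cells" falling into two fluorescence populations
labels = kmeans_1d([1.0, 1.2, 0.9, 5.0, 5.3, 4.8], c0=0.0, c1=6.0)
```

Once every cell carries a class label, samples can be compared by their class-frequency vectors, which is the categorical reduction the abstract describes.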

20.
This paper discusses the application of the data handling-graphics-statistics program Stata (Computing Resources Center, Santa Monica, CA) to radioimmunoassay. We have found that this program is more powerful and easier to use than a spreadsheet for analyzing various kinds of laboratory data generated from chromatography, radiolabeling experiments, enzyme-linked immunosorbent assays, and radioimmunoassays, to name several examples. Data from a radioimmunoassay procedure, originally analyzed using a spreadsheet, Lotus 1-2-3, have been processed with Stata. Simple programs (batch files) have been devised for computations and graphics. The original data and a comparison of results are presented.
