20 similar documents found
2.
3.
Xiaojing Liu, Zheng Ser, Ahmad A. Cluntun, Samantha J. Mentch, Jason W. Locasale 《Journal of visualized experiments : JoVE》2014,(87)
Metabolite profiling has been a valuable asset in the study of metabolism in health and disease. However, current platforms have different limiting factors, such as labor-intensive sample preparation, low detection limits, slow scan speeds, intensive method optimization for each metabolite, and the inability to measure both positively and negatively charged ions in a single experiment. A novel metabolomics protocol could therefore advance metabolomics studies. Amide-based hydrophilic interaction chromatography enables polar metabolite analysis without any chemical derivatization. High-resolution MS using the Q-Exactive (QE-MS) has improved ion optics, increased scan speeds (256 msec at a resolution of 70,000), and the capability of positive/negative switching. Using a cold methanol extraction strategy and coupling an amide column with QE-MS enables robust detection of 168 targeted polar metabolites and thousands of additional features simultaneously. Data processing is carried out efficiently with commercially available software, and unknown features extracted from the mass spectra can be queried in databases.
4.
Background
The new generation of massively parallel DNA sequencers, combined with the challenge of whole human genome resequencing, results in the need for rapid and accurate alignment of billions of short DNA sequence reads to a large reference genome. Speed is obviously of great importance, but equally important is maintaining alignment accuracy of short reads, in the 25–100 base range, in the presence of errors and true biological variation.
Methodology
We introduce a new algorithm specifically optimized for this task, as well as a freely available implementation, BFAST, which can align data produced by any of the current sequencing platforms, allows for user-customizable levels of speed and accuracy, supports paired-end data, and provides for efficient parallel and multi-threaded computation on a computer cluster. The new method is based on creating flexible, efficient whole-genome indexes to rapidly map reads to candidate alignment locations, with arbitrary multiple independent indexes allowed to achieve robustness against read errors and sequence variants. The final local alignment uses a Smith-Waterman method, with gaps to support the detection of small indels.
Conclusions
We compare BFAST to a selection of large-scale alignment tools - BLAT, MAQ, SHRiMP, and SOAP - in terms of both speed and accuracy, using simulated and real-world datasets. We show that BFAST can achieve substantially greater alignment sensitivity in the presence of errors and true variants, especially insertions and deletions, and minimize false mappings, while maintaining adequate speed compared to other current methods. We show that BFAST can align the amount of data needed to fully resequence a human genome, one billion reads, with high sensitivity and accuracy, on a modest computer cluster in less than 24 hours. BFAST is available at http://bfast.sourceforge.net.
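The final local alignment step mentioned above can be illustrated with a minimal Smith-Waterman dynamic-programming sketch. This is a generic, simplified example with hypothetical scoring parameters, not the BFAST implementation itself, which is heavily optimized and supports full traceback and gap handling.

```python
# Minimal Smith-Waterman local alignment sketch (illustrative only, not BFAST's code).
# The scoring parameters are hypothetical placeholders.
def smith_waterman(read, ref, match=2, mismatch=-1, gap=-2):
    n, m = len(read), len(ref)
    # DP matrix of local alignment scores; row 0 and column 0 stay at 0.
    H = [[0] * (m + 1) for _ in range(n + 1)]
    best = 0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diag = H[i - 1][j - 1] + (match if read[i - 1] == ref[j - 1] else mismatch)
            up = H[i - 1][j] + gap    # gap in the reference (insertion in the read)
            left = H[i][j - 1] + gap  # gap in the read (deletion from the reference)
            H[i][j] = max(0, diag, up, left)
            best = max(best, H[i][j])
    return best  # highest local alignment score


# A read containing a one-base deletion still aligns well locally.
print(smith_waterman("ACGTTGCA", "ACGTTAGCA"))
```

In BFAST this local alignment is applied only at the candidate locations returned by the whole-genome indexes, which keeps the overall search fast.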
5.
One of the principal characteristics of large-scale wireless sensor networks is their distributed, multi-hop nature. Due to this characteristic, applications such as query propagation rely regularly on network-wide flooding for information dissemination. If the transmission radius is not set optimally, the flooded packet may hold the transmission medium for longer than necessary, reducing overall network throughput. We analyze the impact of the transmission radius on the average settling time—the time at which all nodes in the network finish transmitting the flooded packet. Our analytical model takes into account the behavior of the underlying contention-based MAC protocol, as well as edge effects and the size of the network. We show that for large wireless networks there exists an intermediate transmission radius which minimizes the settling time, corresponding to an optimal tradeoff between reception and contention times. We also explain how physical propagation models affect small wireless networks and why no intermediate optimal transmission radius is observed in these cases. The mathematical analysis is supported and validated through extensive simulations.
Marco Zuniga is currently a PhD student in the Department of Electrical Engineering at the University of Southern California. He received his Bachelors degree in Electrical Engineering from the Pontificia Universidad Catolica del Peru in 1998, and his Masters degree in Electrical Engineering from the University of Southern California in 2002. His interests are in the area of Wireless Sensor Networks in general, and more specifically in studying the interaction amongst different layers to improve the performance of these networks. He is a member of IEEE and the Phi Kappa Phi Honor Society.
Bhaskar Krishnamachari is an Assistant Professor in the Department of Electrical Engineering at the University of Southern California (USC), where he also holds a joint appointment in the Department of Computer Science. He received his Bachelors degree in Electrical Engineering with a four-year full-tuition scholarship from The Cooper Union for the Advancement of Science and Art in 1998. He received his Masters degree and his Ph.D. in Electrical Engineering from Cornell University in 1999 and 2002, under a four-year university graduate fellowship. Dr. Krishnamachari's previous research has included work on critical density thresholds in wireless networks, data-centric routing in sensor networks, mobility management in cellular telephone systems, multicast flow control, heuristic global optimization, and constraint satisfaction. His current research is focused on the discovery of fundamental principles and the analysis and design of protocols for next-generation wireless sensor networks. He is a member of IEEE, ACM, and the Tau Beta Pi and Eta Kappa Nu Engineering Honor Societies.
6.
A rapid large-scale procedure was devised for the purification of desmosine and isodesmosine from ligamentum nuchae elastin. The method makes use of the hydrophilic nature of the desmosines, which preferentially adsorb to cellulose fibers in mixtures of organic solvents. Resolution of the isomers was achieved on a polystyrene resin column.
7.
8.
Emmett W. Bassett 《Preparative biochemistry & biotechnology》2013,43(5-6):461-477
A highly active form of wheat germ agglutinin (WGA) was isolated by affinity chromatography on a partially acid-hydrolyzed chitin column after extraction of the wheat germ with 0.5 M formic acid and removal of the denatured or water-insoluble WGA by dialysis against distilled water before and after affinity chromatography. The purified preparation was found to be homogeneous by gel filtration, disc electrophoresis, and chemical analysis. It reacted readily with WGA receptors in human serum and urine, giving well-defined bands on agar gel double diffusion and electrophoresis. When chemically coupled to Sepharose, the WGA was very reactive with red blood cells and with WGA receptors in serum, urine, and other biological fluids. The Sepharose-WGA has proven to be stable over a long period of time.
9.
Large-scale analysis of gene function using RNAi technology
Introducing double-stranded RNA into cells interferes with the expression of homologous genes, causing the organism to display a corresponding loss-of-function phenotype; this effect is known as RNAi (RNA interference). In view of the challenge of functional annotation that follows the large-scale genome sequencing of humans, plants, microorganisms, and other species, this review focuses on the mechanism of RNAi and recent research progress. The discovery of the RISC complex and the Dicer enzyme has revealed how RNAi silences homologous genes. Using RNAi as a reverse-genetics tool, organisms can be made to display the corresponding loss-of-function phenotypes, allowing the function of unknown genes to be determined. RNAi therefore plays an important role in the large-scale analysis of gene function in animals and plants.
10.
A method for the large-scale separation of chlorophyll a and b
Building on the existing DEAE-Sepharose CL-6B and Sepharose CL-6B column chromatography method for separating chlorophyll a and b, a larger column was used and the flow rate was increased without sacrificing separation quality; the column was regenerated in situ so that separations could be run cyclically. This saves packing and equilibration time and allows rapid, large-scale isolation and purification of chlorophyll with repeated column reuse. From 100 g of spinach leaves, approximately 50 mg of chlorophyll a and 15 mg of chlorophyll b were obtained.
11.
Two uncoupleable distributions, assigning missions to robots and allocating robots to home stations, accompany the use of mobile service robots in hospitals. In the given problem, two workload-related objectives and five groups of constraints are proposed. A bio-mimicked Binary Bees Algorithm (BBA) is introduced to solve this multiobjective multiconstraint combinatorial optimisation problem, in which the constraint handling technique (Multiobjective Transformation, MOT), multiobjective evaluation method (nondominance selection), global search strategy (stochastic search in the variable space), local search strategy (Hamming neighbourhood exploitation), and post-processing means (feasibility selection) are the main issues. The BBA is then demonstrated with a case study, presenting the execution process of the algorithm and also explaining the change of elite number in the evolutionary process. Its optimisation result provides a group of feasible nondominated two-level distribution schemes.
12.
Previous studies have demonstrated the existence of optimization criteria in the design and development of mammalian cardiovascular systems. Similarities in mammalian arterial wave reflection suggest there are certain design criteria for the optimization of arterial wave dynamics. Inspired by these natural optimization criteria, we investigated the feasibility of optimizing the aortic waves by modifying wave reflection sites. A hydraulic model with physical and dynamical properties similar to those of a human aorta and left ventricle was used for a series of in-vitro experiments. The results indicate that placing an artificial reflection site (a ring) at a specific location along the aorta may create a constructive wave dynamic that could reduce LV pulsatile workload. This simple bio-inspired approach may have important implications for future treatment strategies for the diseased aorta.
13.
Noise-driven exploration of a brain network's dynamic repertoire has been hypothesized to be causally involved in cognitive function, aging, and neurodegeneration. The dynamic repertoire crucially depends on the network's capacity to store patterns, as well as on their stability. Here we systematically explore the capacity of networks derived from human connectomes to store attractor states, as well as various network mechanisms to control the brain's dynamic repertoire. Using a deterministic graded-response Hopfield model with connectome-based interactions, we reconstruct the system's attractor space through a uniform sampling of the initial conditions. Large fixed-point attractor sets are obtained in the low-temperature condition, with a larger number of attractors than previously reported. Different variants of the initial model, including (i) a uniform activation threshold or (ii) a global negative feedback, produce a similarly robust multistability in a limited parameter range. A numerical analysis of the distribution of the attractors identifies spatially segregated components, with a centro-medial core and several well-delineated regional patches. These different modes share similarity with the fMRI independent components observed in the "resting state" condition. We demonstrate non-stationary behavior in noise-driven generalizations of the models, with different meta-stable attractors visited along the same time course. Only the model with a global dynamic density control is found to display robust and long-lasting non-stationarity with no tendency toward either overactivity or extinction. The best fit with empirical signals is observed at the edge of multistability, a parameter region that also corresponds to the highest entropy of the attractors.
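To make the attractor-sampling procedure concrete, the sketch below iterates a toy graded-response (tanh) network from many random initial conditions and counts the distinct fixed points it settles into. The coupling matrix, gain, and integration settings are assumed placeholders standing in for the connectome-based interactions and parameters of the actual study.

```python
import numpy as np

# Toy graded-response Hopfield-style network: dx/dt = -x + tanh(beta * W x).
# W is a random symmetric matrix standing in for connectome-based couplings
# (an assumption for illustration); beta acts as an inverse "temperature".
rng = np.random.default_rng(0)
N = 40
W = rng.normal(size=(N, N))
W = (W + W.T) / 2.0          # symmetric couplings
np.fill_diagonal(W, 0.0)
beta = 2.0                   # high gain / low "temperature": many stable fixed points

def relax(x, dt=0.1, steps=5000, tol=1e-6):
    """Integrate the dynamics until the state stops changing (a fixed point)."""
    for _ in range(steps):
        dx = -x + np.tanh(beta * W @ x)
        x = x + dt * dx
        if np.max(np.abs(dx)) < tol:
            break
    return x

# Uniformly sample initial conditions and collect the distinct attractors reached.
attractors = []
for _ in range(200):
    x = relax(rng.uniform(-1.0, 1.0, size=N))
    if not any(np.allclose(x, a, atol=1e-3) for a in attractors):
        attractors.append(x)
print("distinct fixed-point attractors found:", len(attractors))
```

In the same spirit, the noise-driven generalizations described in the abstract would add a stochastic term to the update, allowing the state to hop between such fixed points over time.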
14.
Alvin W. Nienow 《Cytotechnology》2006,50(1-3):9-33
This article mainly addresses the issues associated with the engineering of large-scale free-suspension culture in agitated bioreactors >10,000 L, because these have become the system of choice industrially. It is particularly concerned with problems that become increasingly important as the scale increases. However, very few papers have been written that are actually based on such large-scale studies, and the few that do exist rarely address any of the issues quantitatively. Hence, it is very often necessary to extrapolate from small-scale work, and this review tries to pull the two types of study together. It is shown that ‘shear sensitivity’ due to agitation and bursting bubbles is no longer considered a major problem. Homogeneity becomes increasingly important with respect to pH and nutrients at the largest scale, and sub-surface feeding is recommended despite ‘cleaning in place’ concerns. There are still major questions with cell retention/recycle systems at these scales, whether because of fouling, capacity, or potentially different ‘shear sensitivity’ questions. Fed-batch operation gives rise to cell densities that have led to the use of oxygen and enriched air to meet oxygen demands. This strategy, in turn, gives rise to a CO2 evolution rate that impacts on pH control, pCO2, and osmolality. These interactions are difficult to resolve, but if higher sparge and agitation intensities could be used to achieve the necessary oxygen transfer, the problem would largely disappear. Thus, the perception of ‘shear sensitivity’ is still impacting the development of animal cell culture at the commercial scale. Microcarrier culture is also briefly addressed. Finally, some recommendations for bioreactor configuration and operating strategy are given.
15.
Danny M. Gee, Richard H. Palmieri, Gerald G. Porter, Ernst A. Noltmann 《Preparative biochemistry & biotechnology》2013,43(4):441-455
A large-scale purification procedure for phosphoglucose isomerase from pig skeletal muscle is described. It consists of two fractionations by selective precipitation and two ion-exchange chromatography steps, yielding an end product with a specific activity of approximately 900 units (micromoles of substrate converted to product per min per mg of protein, at 30°). The method separates three isoenzymic forms with an overall recovery of about 30% of the original total enzyme activity in the form of Isoenzyme III, the latter being the predominant enzyme species.
16.
17.
Based on a bionic concept and combining air-cushion techniques with track driving mechanisms, a novel semi-floating hybrid concept vehicle is proposed to meet transportation requirements on soft terrain. First, the vehicle scheme and its improved dual-spring flexible suspension design are described. Then, its fuel consumption model is formulated with respect to two vehicle operating parameters. Aiming to minimize fuel consumption, two Genetic Algorithms (GAs) are designed and implemented. The initial one (GA-1) produced an acceptable result, but some problems remained in its optimization process. Based on an analysis of the defects of GA-1, an improved algorithm, GA-2, was developed, whose effectiveness and stability were demonstrated in the optimization process and results. The proposed design scheme and optimization approaches can provide valuable references for this new kind of vehicle, which has promising applications in areas such as agriculture, the petroleum industry, the military, and scientific exploration.
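For illustration, the sketch below shows a bare-bones genetic algorithm minimizing a stand-in two-parameter fuel-consumption function. The objective surface, parameter bounds, and GA settings are all hypothetical placeholders; the paper's GA-1 and GA-2 encode the actual vehicle model and operating constraints.

```python
import random

# Hypothetical stand-in for the fuel-consumption model f(p1, p2);
# the real model depends on the vehicle's two operating parameters.
def fuel(p):
    p1, p2 = p
    return (p1 - 3.0) ** 2 + 0.5 * (p2 - 7.0) ** 2 + 1.0

BOUNDS = [(0.0, 10.0), (0.0, 10.0)]   # assumed parameter ranges

def random_individual():
    return [random.uniform(lo, hi) for lo, hi in BOUNDS]

def crossover(a, b):
    return [(x + y) / 2.0 for x, y in zip(a, b)]   # simple arithmetic crossover

def mutate(ind, rate=0.2, scale=0.5):
    # Perturb each parameter with some probability, clipped to its bounds.
    return [min(hi, max(lo, x + random.gauss(0, scale))) if random.random() < rate else x
            for x, (lo, hi) in zip(ind, BOUNDS)]

def ga(pop_size=30, generations=100):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fuel)                       # rank by fuel consumption
        elite = pop[: pop_size // 2]             # keep the better half
        children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return min(pop, key=fuel)

best = ga()
print("best parameters:", best, "fuel:", fuel(best))
```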
18.
Zhaoda Zhang, Alexandra Nichols, Jimmy X. Tang, Mazen Alsbeti, Jin Yan Tang 《Nucleosides, nucleotides & nucleic acids》2013,32(7-9):1585-1588
Several sulfur-transfer reagents have been evaluated for the large-scale synthesis of oligonucleotide phosphorothioate analogues, among which 3-ethoxy-1,2-dithiazoline-5-one (EDITH, 5) shows potential as an alternative to the Beaucage reagent.
19.
Jijun Hao, Charles H. Williams, Morgan E. Webb, Charles C. Hong 《Journal of visualized experiments : JoVE》2010,(46)
Given their small embryo size, rapid development, transparency, fecundity, and numerous molecular, morphological, and physiological similarities to mammals, zebrafish have emerged as a powerful in vivo platform for phenotype-based drug screens and chemical genetic analysis. Here, we demonstrate a simple, practical method for large-scale screening of small molecules using zebrafish embryos.
20.
Victor Olman, Fenglou Mao, Hongwei Wu, Ying Xu 《IEEE/ACM transactions on computational biology and bioinformatics》2009,6(2):344-352
Large bioinformatics data sets make the cluster identification problem computationally demanding, which is why a parallel algorithm is needed for identifying dense clusters in a noisy background. Our algorithm works on a graph representation of the data set to be analyzed. It identifies clusters through the identification of densely intraconnected subgraphs. We employ a minimum spanning tree (MST) representation of the graph and solve the cluster identification problem using this representation. The computational bottleneck of our algorithm is the construction of an MST of a graph, for which a parallel algorithm is employed. Our high-level strategy for the parallel MST construction is to first partition the graph, then construct MSTs for the partitioned subgraphs and for auxiliary bipartite graphs based on the subgraphs, and finally merge these MSTs to derive an MST of the original graph. The computational results indicate that when running on 150 CPUs, our algorithm can solve a cluster identification problem on a data set with 1,000,000 data points almost 100 times faster than on a single CPU, indicating that this program is capable of handling very large data clustering problems in an efficient manner. We have implemented the clustering algorithm as the software CLUMP.
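A minimal, single-threaded illustration of the MST-based clustering idea: build a minimum spanning tree over the data and cut its longest edges, so the remaining connected components form dense clusters. This toy sketch uses Euclidean points and Prim's algorithm; it is not the CLUMP software or its parallel partition-and-merge MST construction.

```python
import math

# Toy MST-based clustering: build an MST with Prim's algorithm over Euclidean
# distances, then remove the k-1 longest MST edges to obtain k clusters.
def mst_edges(points):
    n = len(points)
    in_tree = [False] * n
    dist = [math.inf] * n
    parent = [-1] * n
    dist[0] = 0.0
    edges = []
    for _ in range(n):
        u = min((i for i in range(n) if not in_tree[i]), key=lambda i: dist[i])
        in_tree[u] = True
        if parent[u] >= 0:
            edges.append((dist[u], parent[u], u))
        for v in range(n):
            if not in_tree[v]:
                d = math.dist(points[u], points[v])
                if d < dist[v]:
                    dist[v], parent[v] = d, u
    return edges

def mst_clusters(points, k):
    # Keep all but the k-1 heaviest MST edges; the components that remain are the clusters.
    edges = sorted(mst_edges(points))[: len(points) - k]
    label = list(range(len(points)))
    def find(i):
        while label[i] != i:
            label[i] = label[label[i]]
            i = label[i]
        return i
    for _, u, v in edges:
        label[find(u)] = find(v)
    return [find(i) for i in range(len(points))]

pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
print(mst_clusters(pts, 2))   # two well-separated groups get two distinct labels
```

The parallel version described in the abstract distributes the expensive MST construction across subgraphs and then merges the partial trees, which is where the reported 150-CPU speedup comes from.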