Similar Articles (20 results)
1.
SAMtools is a widely-used genomics application for post-processing high-throughput sequence alignment data. Such sequence alignment data are commonly sorted to make downstream analysis more efficient. However, this sorting process itself can be computationally- and I/O-intensive: high-throughput sequence alignment files in the de facto standard binary alignment/map (BAM) format can be many gigabytes in size, and may need to be decompressed before sorting and compressed afterwards. As a result, BAM-file sorting can be a bottleneck in genomics workflows. This paper describes a case study on the performance analysis and optimization of SAMtools for sorting large BAM files. OpenMP task parallelism and memory optimization techniques resulted in a speedup of 5.9X versus the upstream SAMtools 1.3.1 for an internal (in-memory) sort of 24.6 GiB of compressed BAM data (102.6 GiB uncompressed) with 32 processor cores, while a 1.98X speedup was achieved for an external (out-of-core) sort of a 271.4 GiB BAM file.
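For orientation, the sketch below shows how such a multi-threaded, memory-bounded BAM sort is typically driven from stock samtools; the speedups quoted above refer to the authors' optimized build, not to this command, and the file names are placeholders.

```python
# Minimal sketch: coordinate-sorting a BAM with stock samtools from Python.
# samtools spills sorted chunks to temporary files (an external sort) whenever
# the data exceed roughly threads * mem_per_thread of RAM.
import subprocess

def sort_bam(in_bam: str, out_bam: str, threads: int = 32, mem_per_thread: str = "3G") -> None:
    subprocess.run(
        [
            "samtools", "sort",
            "-@", str(threads),      # additional worker threads
            "-m", mem_per_thread,    # memory budget per thread before spilling to disk
            "-T", out_bam + ".tmp",  # prefix for temporary (external-sort) files
            "-o", out_bam,
            in_bam,
        ],
        check=True,
    )

if __name__ == "__main__":
    sort_bam("sample.bam", "sample.sorted.bam")  # placeholder file names
```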

2.
The application of information processing to the study of the evolution of the patient's state requires constructing a file which contains the values of different items as functions of time. There are many methods for doing this, and most of them rely on the construction of a complex database. The technique presented in this paper first constitutes, for each patient, a set of records, each of which corresponds to a given ‘operation’ (the term ‘operation’ is taken in the broad sense; examples: consultation, surgical intervention, …), and then links them in chronological order. It has the advantage of being simple, so that it may be used even on minicomputers. Furthermore, the exploitation of the resulting data file is very easy. This method is well suited to problems such as the examination of longitudinal investigations, the processing of records of patients who have undergone several operations, etc. The application presented here is the checking of the functioning of pacemakers. We wrote, on a MITRA 15/125 minicomputer, a portable general program (using FORTRAN IV exclusively) which allows one to carry out the linking and to exploit the resulting file. It is designed so that new procedures required by new applications may be added very easily.
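The original program is FORTRAN IV on a MITRA 15/125; purely as an illustration of the underlying idea (one record per ‘operation’, chained in chronological order per patient), here is a small sketch with hypothetical field names.

```python
# Illustrative only: group operation records by patient and chain them by date.
# Field names ('patient_id', 'date', 'operation') are hypothetical.
from collections import defaultdict
from operator import itemgetter

def link_records(records):
    """Return, for each patient, the head of a chronologically linked chain."""
    by_patient = defaultdict(list)
    for rec in records:
        by_patient[rec["patient_id"]].append(rec)
    heads = {}
    for pid, recs in by_patient.items():
        recs.sort(key=itemgetter("date"))
        for prev, nxt in zip(recs, recs[1:]):
            prev["next"] = nxt          # emulate the file's chronological linking
        recs[-1]["next"] = None
        heads[pid] = recs[0]
    return heads

example = [
    {"patient_id": 1, "date": "1978-03-02", "operation": "consultation"},
    {"patient_id": 1, "date": "1977-11-15", "operation": "pacemaker check"},
]
chains = link_records(example)  # chains[1] is the earliest record for patient 1
```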

3.
This paper presents a general program which can be used not only for data processing and statistical interrogation, but also for basic operations, such as validity checking and the preparation of data, which is often necessary for other programs. Its user language is simple, convenient and accessible to non-specialists. The program is sufficiently complete that it can treat some complex problems without requiring complementary programs. It can treat several problems simultaneously, which saves data-reading time and therefore makes the program economical. Its domain of use is broad: epidemiological studies, psychological and sociological investigations, biological studies, clinical research, chronological follow-up, examinations of faculties, etc. The program is written in FORTRAN IV and is thus portable. It contains more than 25 000 instructions but requires limited space in core memory (less than 64 000 words). Its structure allows further evolution and the addition of new procedures or new methods. A conversion to a conversational (interactive) form is under consideration.

4.
In the era of structural genomics, it is necessary to generate accurate structural alignments in order to build good templates for homology modeling. Although a great number of structural alignment algorithms have been developed, most of them ignore intermolecular interactions during the alignment procedure. Therefore, structures in different oligomeric states are barely distinguishable, and it is very challenging to find a correct alignment in coil regions. Here we present a novel approach to structural alignment using a clique finding algorithm and environmental information (SAUCE). In this approach, we build the alignment based on not only structural coordinate information but also realistic environmental information extracted from biological unit files provided by the Protein Data Bank (PDB). First, we eliminate all environmentally unfavorable pairings of residues. Then we identify alignments in core regions via a maximal clique finding algorithm. Two statistics of extreme value distribution (EVD) form have been developed to evaluate core region alignments. With an optional extension step, a global alignment can be derived based on environment-based dynamic programming linking. We show that our method is able to differentiate three-dimensional structures in different oligomeric states, and is able to find flexible alignments between multidomain structures without predetermined hinge regions. The overall performance is also evaluated on a large scale by comparisons to current structural classification databases as well as to other alignment methods.
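To make the clique-finding step concrete, here is a rough sketch (not the authors' implementation): candidate residue pairings become graph nodes, mutually consistent pairings are connected by edges, and a maximum clique yields a core alignment. The consistency test below, agreement of intramolecular distances within a tolerance, is a stand-in for SAUCE's combined structural and environmental criteria, and it is only practical for small inputs.

```python
# Sketch of clique-based core alignment on two intramolecular C-alpha distance
# matrices. Sequential-order constraints and environmental filtering (the parts
# specific to SAUCE) are deliberately omitted here.
import itertools
import networkx as nx
import numpy as np

def core_alignment(dist_a: np.ndarray, dist_b: np.ndarray, tol: float = 1.5):
    pairs = list(itertools.product(range(len(dist_a)), range(len(dist_b))))
    g = nx.Graph()
    g.add_nodes_from(pairs)
    for (i, j), (k, l) in itertools.combinations(pairs, 2):
        # two pairings are compatible if they map distinct residues and
        # preserve the intramolecular distance within the tolerance
        if i != k and j != l and abs(dist_a[i, k] - dist_b[j, l]) < tol:
            g.add_edge((i, j), (k, l))
    return max(nx.find_cliques(g), key=len)  # largest maximal clique = core alignment
```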

5.
Introduction: Deficits in memory performance in later life are frequent and well documented. There are several terms that refer to this phenomenon and the most commonly used is age associated memory impairment (AAMI). Currently, cognitive or memory training programmes are increasingly being used to treat this deficit. The Department of Health of the City of Madrid has developed a multifactorial memory training programme for older people which is carried out in 13 City Health Centres. Objectives: To study the effects of this programme in a sample of users aged more than 65 years with memory impairment, to determine the persistence of the results after 6 months, and to investigate predictors of results. Patients and method: The sample was composed of 1,083 subjects who underwent memory training. The subjects were assessed before and after training and after 6 months. Among other tests, the Mini Examen Cognoscitivo (MEC), the Rivermead Behavioural Memory Test (RBMT), the Geriatric Depression Scale (GDS), and the Memory Failures of Everyday (MFE) questionnaire were used. The training method used (UMAM method) was developed by the Memory Unit of the City of Madrid. Results: Objective memory improvement for the entire group was 40% (Cohen's d = 0.95) and 77% of the subjects improved. Seventy-five percent of the subjects improved in subjective memory functioning (Cohen's d = 0.64). Improvement in mood was also observed. These changes were maintained after 6 months. The predictive variables were age and the MEC, GHQ and GDS scores before training, but the percentage of explained variance was very low. Conclusions: The multifactorial memory training programme, UMAM, improves objective and subjective memory functioning in older people with memory impairment and the benefits persist after 6 months. The predictive value of the variables studied is low.
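For readers unfamiliar with the effect sizes quoted above, the snippet below shows one common way Cohen's d is computed for a pre/post design; the study does not state which variant it used, so treat this as a generic illustration with dummy data.

```python
# One common convention for Cohen's d in a pre/post (repeated-measures) design:
# mean change divided by the pooled standard deviation of the two time points.
import numpy as np

def cohens_d_prepost(pre: np.ndarray, post: np.ndarray) -> float:
    pooled_sd = np.sqrt((pre.std(ddof=1) ** 2 + post.std(ddof=1) ** 2) / 2.0)
    return (post.mean() - pre.mean()) / pooled_sd

rng = np.random.default_rng(0)                 # dummy data for illustration only
pre = rng.normal(10, 3, size=200)
post = pre + rng.normal(2.5, 2, size=200)      # simulated improvement
print(round(cohens_d_prepost(pre, post), 2))
```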

6.
IRBM, 2008, 29(1): 35–43
In this article, we present a case-based reasoning system for the retrieval of patient files similar to a case given as a query. We focus on patient files made up of several images with contextual information (such as the patient's age, sex and medical history). Indeed, medical experts generally need varied sources of information (which might be incomplete) to diagnose a pathology. Consequently, we derive a retrieval framework from decision trees, which are well suited to processing heterogeneous and incomplete information. To be integrated in the system, images are indexed by their digital content. The method is evaluated on a classified diabetic retinopathy database. On this database, results are promising: the retrieval sensitivity reaches 79.5% for a window of five cases, which is almost twice as good as the retrieval of single images alone.
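As a rough illustration of decision-tree-based case retrieval (not the authors' system), one can fit a tree on the indexed case base and return the stored cases that fall into the same leaf as the query; feature extraction from the images and the handling of missing contextual fields, which the paper addresses, are omitted here.

```python
# Toy retrieval-by-leaf sketch with scikit-learn; features and labels are
# assumed to be precomputed numeric descriptors of the case base.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def build_retriever(case_features: np.ndarray, case_labels: np.ndarray):
    tree = DecisionTreeClassifier(max_depth=6, random_state=0).fit(case_features, case_labels)
    case_leaves = tree.apply(case_features)            # leaf index of every stored case

    def retrieve(query: np.ndarray, window: int = 5):
        leaf = tree.apply(query.reshape(1, -1))[0]
        candidates = np.flatnonzero(case_leaves == leaf)
        return candidates[:window]                      # indices of the retrieved cases

    return retrieve
```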

7.

Background

Sequence alignment data is often ordered by coordinate (id of the reference sequence plus position on the sequence where the fragment was mapped) when stored in BAM files, as this simplifies the extraction of variants between the mapped data and the reference or of variants within the mapped data. In this order, paired reads are usually separated in the file, which complicates some other applications, like duplicate marking or conversion to the FastQ format, which require access to the full information of the read pairs.

Results

In this paper we introduce biobambam, a set of tools based on the efficient collation of alignments in BAM files by read name. The employed collation algorithm avoids time- and space-consuming sorting of alignments by read name where this is possible without using more than a specified amount of main memory. Using this algorithm, tasks like duplicate marking in BAM files and conversion of BAM files to the FastQ format can be performed very efficiently with limited resources. We also make the collation algorithm available in the form of an API for other projects. This API is part of the libmaus package.

Conclusions

In comparison with previous approaches to problems involving the collation of alignments by read name, such as BAM to FastQ conversion or duplicate marking utilities, our approach can often perform an equivalent task more efficiently in terms of the required main memory and run-time. Our BAM to FastQ conversion is faster than all widely known alternatives including Picard and bamUtil. Our duplicate marking is about as fast as the closest competitor bamUtil for small data sets and faster than all known alternatives on large and complex data sets.
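To illustrate what collation by read name means in practice, here is a toy pysam-based sketch that pairs mates as they stream by and yields FastQ-ready records; biobambam's bamtofastq does this with a bounded amount of main memory by spilling to temporary files, a mechanism this sketch deliberately omits.

```python
# Toy read-name collation: hold the first mate of each pair in a dictionary
# until its partner arrives (unbounded memory in the worst case, unlike
# biobambam). Secondary/supplementary alignments are skipped.
import pysam

def collate_pairs(bam_path: str):
    pending = {}
    with pysam.AlignmentFile(bam_path, "rb") as bam:
        for read in bam.fetch(until_eof=True):
            if not read.is_paired or read.is_secondary or read.is_supplementary:
                continue
            mate = pending.pop(read.query_name, None)
            if mate is None:
                pending[read.query_name] = read          # wait for the other mate
                continue
            first, second = (mate, read) if mate.is_read1 else (read, mate)
            yield (first.query_name,
                   first.get_forward_sequence(), first.get_forward_qualities(),
                   second.get_forward_sequence(), second.get_forward_qualities())
```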

8.
BACKGROUND: The jet-in-air cell sorters currently available are not very suitable for sorting potentially biohazardous material under optimal conditions because they do not protect operators and samples as recommended in the guidelines for safe biotechnology. To solve this problem we have adapted a cell sorting system to a special biosafety cabinet that satisfies the requirements for class II cabinets. With the aid of this unit, sorting can be performed in conformance with the recommendations for biosafety level 2. METHODS: After integrating a modified fluorescence-activated cell sorter (FACS) Vantage into a special biosafety cabinet, we investigated the influence of the laminar air flow (LAF) inside the cabinet on side stream stability and the analytical precision of the cell sorter. In addition to the routine electronic counting of microparticles, we carried out tests on the containment of aerosols, using T4 bacteriophage as indicators, to demonstrate the efficiency of the biosafety cabinet for sorting experiments performed under biosafety level 2 conditions. RESULTS: The experiments showed that LAF, which is necessary to build up sterile conditions in a biosafety cabinet, does not influence the conditions for side stream stability or the analytical precision of the FACS Vantage cell sorting system. In addition, tests performed to assess aerosol containment during operation of the special biosafety cabinet demonstrated that the cabinet-integrated FACS Vantage unit (CIF) satisfies the conditions for class II cabinets. In the context of gene transfer experiments, the CIF facility was used to sort hematopoietic progenitor cells under biosafety level 2 conditions. CONCLUSIONS: The newly designed biosafety cabinet offers a practical modality for improving biosafety for operators and samples during cell sorting procedures. It can thus also be used for sorting experiments with genetically modified organisms in conformance with current biosafety regulations and guidelines.

9.
R. Delattre, BioControl, 1982, 27(1): 57–70
Crop loss assessment due to pests is a fundamental element in the economic analysis which must take into consideration cotton prices and the cost of phytosanitary measures. These elements vary from one country to another, which explains the divergence of theoretical intervention thresholds. Because of the attributes of the cotton plant, particularly its remarkable potential for adaptation and recovery, the damage and losses suffered at harvest are generally substantially less than the sum of damages observed during the growing season. It is difficult to estimate the different types of compensation of which the plant is capable when, for example, it benefits from subsequent rainfall. Thus when establishing a tolerance threshold, it is necessary to take into account variations in sowing dates, general cultural care and soil conditions. It is necessary also to consider the fact that chemical treatments may be variable in their effect and duration on pests. Chemical interventions are sometimes associated with unintended effects which are unfavourable (phytotoxicity) or indirect (bioecological equilibrium), the biological and economic consequences of which are difficult to evaluate. The author attempts to show with the aid of diagrams the relationships between the different factors mentioned, emphasising the positive and negative consequences. He concludes that the system of traditional treatments, of regular calendar applications, remains the means of control best adapted to the conditions of cotton production in tropical Africa. More sophisticated systems will not be appropriate until the methods for evaluating pest populations are better developed and more rapid. A broader knowledge of the unintended effects of treatments and the compensatory potential of the damaged plant is also required.

10.
Twenty-seven taxa of appendicularians have been identified from the cruise CALCOFI n° 7202 and their distribution has been studied by various numerical methods, with the purpose of defining simultaneously the groups of associated species and the groups of stations possessing similar characteristics. Two large recurrent groups have been defined by the method of Fager and McGowan, corresponding approximately to the major water masses present: central Pacific waters and equatorial Pacific waters. The method of Williams and Lambert permits the separation of 11 southern stations and a northern group divided into 2 subgroups depending on the presence or absence of O. dioica. The rank correlations give results very similar to those obtained with point correlations. Finally, principal component analysis allows good separation of the northern and southern zones by the factorial plane 1–3, while representation of the stations by the factorial plane 2–3 separates the two northern subgroups depending on the presence or absence of O. dioica. The results obtained by the different methods are therefore very much alike, but it is difficult to conclude whether they are a result of the specific cruise, or whether they represent a general phenomenon.

11.
The thick Quaternary deposits of the Caune de l’Arago (Pyrénées-Orientales, France) are dated to between 690 000 and 90 000 years old. At least fifteen different archeostratigraphical units have been identified within these deposits, each corresponding to distinct prehistoric occupations. Numerous stone tools, made from several different rock types, have been discovered in each unit. The tools present specific characteristics concerning the choice of raw materials, the typology, and the technology used to produce them. Morpho-technological study of the different components of the assemblage contributes to a better understanding of the debitage methods used for their production. Each raw material is considered individually in order to ascertain its origin in the environment, its typological role and the technology applied during its exploitation. Defining production systems leads to the characterisation of the assemblages from each unit. When compared, they reveal common elements, as well as differences, suggesting evolutionary trends. Some observations are also made concerning the extent to which changing uses of the site may have influenced the general morphology of each assemblage, therefore taking into account exterior factors. Analysis of this rich stone tool assemblage helps to situate the Caune de l’Arago industry within the larger evolutionary context of the Lower Paleolithic in Mediterranean Europe.

12.
Statistically optimal methods for identifying single unit activity in multiple unit recordings are discussed. These methods take into account both the nerve impulse waveforms and the firing patterns of the units. A generalized least-squares fit procedure is shown to be the optimal recognition scheme under some reasonable statistical assumptions, but the amount of computation becomes prohibitively large when the method is applied to the problem of sorting superimposed waveforms. A linear filter technique which relies on simultaneous recording from several electrodes is shown to give good separation of superimposed waveforms. An iterative recognition procedure can be applied to improve the results and reduce the number of recording electrodes required.
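As a schematic of the least-squares idea only (the paper's optimal procedure also models firing patterns and superimposed waveforms, which are omitted here), a detected waveform can be assigned to the unit template with the smallest residual sum of squares:

```python
# Simplified least-squares template matching for spike classification.
import numpy as np

def classify_spike(waveform: np.ndarray, templates: np.ndarray):
    """waveform: (n_samples,); templates: (n_units, n_samples).
    Returns the best-matching unit index and its residual sum of squares."""
    residuals = ((templates - waveform) ** 2).sum(axis=1)
    unit = int(np.argmin(residuals))
    return unit, float(residuals[unit])
```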

13.
Excavation has enabled recovery of 854 artifacts within 30 archaeological levels in the south sector and 11 in the north (chapter 3). These levels are quite probably contemporaneous, or even the same. The quantitative disparity in the number of strata between the two sectors is simply due to the fact that only the lower half of the northern zone was completely investigated. Similarly, the number of artifacts recovered by level varies according to the surface area excavated, although in some cases the density of material is significant despite the small area excavated, for example stratum C IV 5 which contains 174 lithic artifacts in 2 m². Before undertaking the technological analysis of the artifacts, given the preceding polemics provoked by the great age of this site and its implications for the spread of the first populations out of Africa, it was deemed important to carry out a plurifactorial analysis combining all of the data related to: the stratigraphy, taphonomic processes (post-depositional disturbance), analysis of natural processes that may have produced eoliths, experimentation and techno-functional analysis of the material. The stratigraphy shows clear interstratifications of fine and coarse fluviatile levels with often very clear particle size sorting of the coarse fraction. The archaeological material is typically found at the interface of these strata, either at the base of a clayey matrix, overlying a preceding coarse level, or in the superficial part of a sandy-clay deposit underlying coarse deposits. Post-depositional disturbance revealed during the new excavations in 2003–2006 cannot alone be the cause of eoliths. Excavation of a 6 m² zone in the modern river bed, located below the site, has demonstrated that the technological traits of the eoliths recovered cannot in any way be confused with the technological traits of the artifacts recovered at the site itself. Similarly, viewed quantitatively, the 6 m² zone excavated in the river bed yielded around 20 eoliths while the 30 m² zone excavated at the site yielded 854 artifacts, one stratum alone yielding 184 artifacts in a 3 m² zone. During the experimental phase, adopting the same conditions of procuring raw material, from the same river bed, we very quickly realized the rarity of types of adequate volume that had been generally used at the site and the need to use certain operational processes to create such a form. In addition, the hardness and presence of several natural fracture planes in the Triassic limestone explain the choice of different operational processes and the very high number of knapping accidents, including those occurring during bipolar percussion. Although 90% of the raw material used was cobbles or broken blocks of local Triassic limestone, 10% of the tools were made on exogenous raw materials – siliceous or gravelly limestone, quartzitic sandstone, chert, volcanic rock – that are absent from the immediate environment of the site. These raw materials were brought to the site in the form of tools: worked cobbles, large retouched flakes, backed double-truncated flakes, a plaquette with a lateral bifacial edge, etc. The 854 artifacts have been classified into six object classes: worked cobbles with transversal edge (39%), worked cobbles with lateral edge (2%), unipolar flakes (27%), bipolar objects (half-blocks, half-cobbles including some flat “split” cobbles, “orange slices”, flakes and diverse fragments) (17%), fragments resulting from knapping of blocks or cobbles (13%), and hammerstones (2%).
When the frequencies of these classes are calculated for each of the sectors, percentages are similar, indicating a high degree of homogeneity in the archaeological assemblages at the site. The situation is somewhat different when assemblages are compared within a single sector. Slight differences appear in the percentages of bipolar pieces and unipolar flakes. These differences seem to be random, like the frequency rate of knapping accidents in bipolar reduction, or economic, such as the choice of operational schemes to create worked cobbles based on the availability of suitable raw materials. The technological affinity between each of the archaeological assemblages tends to demonstrate great stability in technological knowledge through time. The class of worked cobbles is by far the most important and, apart from a few flakes produced intentionally, it appears to group all of the tools. To avoid placing these tools in a restrictive, semantically meaningless, class, we prefer the concept of matrix to the term worked cobble. A matrix is a structured arrangement of a series of technological traits, in a form as close as possible to that of the future tool. The matrix phase leads to the tool production phase, which may be unnecessary if the matrix phase includes functionalization. In other words, the concept of matrix enables separating the phase of preparing a predetermined volume, such as a blade, Levallois flake or bifacial piece, from the tool production phase, consisting in creating the type of transformative edge intended, if necessary. The tool is thus an artifact of a specific form with an integrated edge and an operational scheme both specific to the function attributed to it and means of use associated with the form. Observed variability relates to: the size of the volume ranging from to 20, morphology, the form of the line formed by the edge which can in frontal view be curved, linear, sinuous or denticulated, and in transversal view curved, linear, sinuous or saw-toothed, and the length of the edge ranging from 1 to 10. Matrices with a simple bevel are distinguished from double bevels. In the framework of the technological analysis of production schemes to produce matrices with a simple bevel, a broad range of variability in production schemes can be observed, divided arbitrarily into two stages. The first stage consists in creating as closely as possible the technological traits of the future matrix by means of five general schemes. The first scheme (A) consists in selecting a cobble or block naturally possessing at least some of the technological traits needed. The missing traits are added by various preparations, including bipolar percussion 3 times out of 5. The second scheme (B) consists in knapping a flake from the block with some of the technological traits required for the matrix present on one of its surfaces. The third scheme (C) consists in the choice of a plaquette from which a bipolar shock creates the main traits of the matrix. The fourth scheme (D) consists in choosing a volume very similar to the intended matrix. The fifth and final scheme (E) consists in knapping a flake with technological traits very different from those intended. Depending on the distance between intention and realization, a second stage may be necessary. In general, this second stage perfects or creates the intended active edge, which is rarely obtained in the first stage.
To produce a matrix with a double bevel, it is sometimes necessary to add an intermediate stage in order to prepare the second bevel. The first stage remains the same, with the use of the five operational schemes. By contrast, a clear difference exists in the percentages for the use of these schemes. For a matrix with a simple bevel, scheme A is dominant, followed by scheme D, while the situation is reversed for a matrix with a double bevel, where scheme D is dominant. Unipolar flakes, representing 27% of the assemblage, are produced in three different ways. The most important is flakes resulting from matrix production. The two others are flakes produced during different knapping schemes, some flakes in relation to the few cores present, other flakes in exogenous raw materials produced elsewhere and generally much larger. The other classes are dominated by bipolar products resulting essentially from knapping accidents. To summarize, these assemblages are characterized by: the search for tools differentiated by form and active edges; more than 90% of the tools made on two kinds of supports: a matrix with a simple bevel or a double bevel; matrices obtained using different operational schemes successively associating, if necessary, a knapping stage and a shaping stage. While the Triassic limestone is hard and thus imposes a strong constraint on knapping, the range of operational schemes appears to have been a diversified “cultural” response to this constraint and the presence of tools on exogenous raw materials. At the scale of China, comparison of this industry is impossible since it is the only site of this age to contain so much material. The site of Majuangou, the only site of similar caliber, is younger by several hundred thousand years and is located several thousand kilometers to the north, making comparisons meaningless. We note only that most of the tool supports at Majuangou are knapped flakes. On an inter-continental scale, the comparison of sites of equal age is more promising. But lithic analyses are based on different methods, preventing comparison of similar data. However, if we make a simple summary of the data available, we can first say that in Africa, during these periods, different stages of technological development were present, and stages that are considered more evolved are manifestly less common according to our approach. While these stages are more or less contemporaneous, which counters the idea of uniqueness, they would more surely be evidence of populations that were not in direct contact and had separate lines of development. In Asia, the Longgupo industry evidences a different technological option than that of contemporaneous populations in Africa. By contrast, when we take into account its developmental stage, we realize that this is an “evolved” stage in which the form of the support of the future tool is predominant. If we compare Africa and Asia in terms of stages, we are a priori at the same stage with different options being selected.

14.
Mass spectrometry-based proteomics is increasingly being used in biomedical research. These experiments typically generate a large volume of highly complex data, and the volume and complexity are only increasing with time. There exist many software pipelines for analyzing these data (each typically with its own file formats), and as technology improves, these file formats change and new formats are developed. Files produced from these myriad software programs may accumulate on hard disks or tape drives over time, with older files being rendered progressively more obsolete and unusable with each successive technical advancement and data format change. Although initiatives exist to standardize the file formats used in proteomics, they do not address the core failings of a file-based data management system: (1) files are typically poorly annotated experimentally, (2) files are "organically" distributed across laboratory file systems in an ad hoc manner, (3) file formats become obsolete, and (4) searching the data and comparing and contrasting results across separate experiments is very inefficient (if possible at all). Here we present a relational database architecture and accompanying web application dubbed Mass Spectrometry Data Platform that is designed to address the failings of the file-based mass spectrometry data management approach. The database is designed such that the output of disparate software pipelines may be imported into a core set of unified tables, with these core tables being extended to support data generated by specific pipelines. Because the data are unified, they may be queried, viewed, and compared across multiple experiments using a common web interface. Mass Spectrometry Data Platform is open source and freely available at http://code.google.com/p/msdapl/.
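The schema below is a hypothetical miniature of the "unified core tables plus pipeline-specific extension tables" idea described above; it is not MSDaPl's actual schema (see the project site for that), and the table and column names are invented for illustration.

```python
# Hypothetical miniature of a unified proteomics results schema in SQLite:
# core tables shared by all pipelines, plus an extension table holding
# pipeline-specific scores keyed to the core peptide-match rows.
import sqlite3

DDL = """
CREATE TABLE experiment    (id INTEGER PRIMARY KEY, name TEXT, annotation TEXT);
CREATE TABLE search        (id INTEGER PRIMARY KEY,
                            experiment_id INTEGER REFERENCES experiment(id),
                            pipeline TEXT);              -- e.g. 'SEQUEST', 'Mascot'
CREATE TABLE peptide_match (id INTEGER PRIMARY KEY,
                            search_id INTEGER REFERENCES search(id),
                            scan INTEGER, peptide TEXT, charge INTEGER);
CREATE TABLE sequest_score (match_id INTEGER REFERENCES peptide_match(id),
                            xcorr REAL, deltacn REAL);   -- pipeline-specific extension
"""

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)
# A cross-experiment query now works over the shared core tables, regardless
# of which pipeline produced each search.
rows = conn.execute(
    "SELECT e.name, s.pipeline, COUNT(*) FROM peptide_match m "
    "JOIN search s ON m.search_id = s.id "
    "JOIN experiment e ON s.experiment_id = e.id "
    "GROUP BY e.name, s.pipeline"
).fetchall()
```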

15.
Oblong, a program with very low memory requirements, is presented. It is designed for parsimony analysis of data sets comprising many characters for moderate numbers of taxa (the order of up to a few hundred). The program can avoid using vast amounts of RAM by temporarily saving data to disk buffers, only parts of which are periodically read back in by the program. In this way, the entire data set is never held in RAM by the program—only small parts of it. While using disk files to store the data slows down searches, it does so only by a relatively small factor (4× to 5×), because the program minimizes the number of times the data must be accessed (i.e. read back in) during tree searches. Thus, even if the program is not designed primarily for speed, runtimes are within an order of magnitude of those of the fastest existing parsimony programs.
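The low-memory idea can be sketched generically (this is not Oblong's implementation): keep the character matrix on disk and stream fixed-size blocks of characters through RAM, accumulating a per-tree score as you go. The file layout and the scoring callback below are assumptions made for the sake of the example.

```python
# Stream a taxa-by-characters matrix from disk in blocks so that only one
# block is resident in RAM at a time (assumed raw uint8 file layout).
import numpy as np

def blockwise_score(matrix_path: str, n_taxa: int, n_chars: int,
                    score_block, block: int = 10_000) -> int:
    data = np.memmap(matrix_path, dtype=np.uint8, mode="r", shape=(n_taxa, n_chars))
    total = 0
    for start in range(0, n_chars, block):
        chunk = np.asarray(data[:, start:start + block])  # read this block only
        total += score_block(chunk)   # e.g. Fitch parsimony length of these characters
    return total
```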

16.

Background

Principal component analysis (PCA) has been widely employed for automatic neuronal spike sorting. Calculating principal components (PCs) is computationally expensive, and requires complex numerical operations and large memory resources. Substantial hardware resources are therefore needed for hardware implementations of PCA. The general Hebbian algorithm (GHA) was proposed for calculating PCs of neuronal spikes in our previous work, which eliminates the need for the computationally expensive covariance analysis and eigenvalue decomposition of conventional PCA algorithms. However, large memory resources are still inherently required for storing a large volume of aligned spikes for training PCs. Such a large memory consumes substantial hardware resources and contributes significant power dissipation, which makes GHA difficult to implement in portable or implantable multi-channel recording micro-systems.

Method

In this paper, we present a new algorithm for PCA-based spike sorting based on GHA, namely the stream-based Hebbian eigenfilter, which eliminates the inherent memory requirements of GHA while preserving the accuracy of spike sorting by utilizing the pseudo-stationarity of neuronal spikes. Because of the reduction of large hardware storage requirements, the proposed algorithm can lead to ultra-low hardware resources and power consumption in hardware implementations, which is critical for future multi-channel micro-systems. Both clinical and synthetic neural recording data sets were employed to evaluate the accuracy of the stream-based Hebbian eigenfilter. The performance of spike sorting using the stream-based eigenfilter and the computational complexity of the eigenfilter were rigorously evaluated and compared with conventional PCA algorithms. Field-programmable gate arrays (FPGAs) were employed to implement the proposed algorithm, evaluate the hardware implementations and demonstrate the reduction in both power consumption and hardware memory achieved by the streaming computation.
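For readers unfamiliar with GHA, the update below shows the standard streaming rule (Sanger's rule) for learning the leading principal components from one aligned spike at a time, with no stored spike buffer and no covariance matrix; it is a schematic of the idea, not the authors' fixed-point hardware design.

```python
# Streaming GHA (Sanger's rule) update: refine the eigenfilter estimate with a
# single aligned spike waveform at a time.
import numpy as np

def gha_update(w: np.ndarray, x: np.ndarray, lr: float = 1e-3) -> np.ndarray:
    """w: (n_components, n_samples) current PC estimates; x: one spike of length n_samples."""
    y = w @ x                                             # project the spike onto the current PCs
    # Hebbian term minus a Gram-Schmidt-like decorrelation term (lower-triangular)
    w += lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ w)
    return w
```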

Results and discussion

Results demonstrate that the stream-based eigenfilter can achieve the same accuracy and is 10 times more computationally efficient when compared with conventional PCA algorithms. Hardware evaluations show that 90.3% logic resources, 95.1% power consumption and 86.8% computing latency can be reduced by the stream-based eigenfilter when compared with PCA hardware. By utilizing the streaming method, 92% memory resources and 67% power consumption can be saved when compared with the direct implementation of GHA.

Conclusion

Stream-based Hebbian eigenfilter presents a novel approach to enable real-time spike sorting with reduced computational complexity and hardware costs. This new design can be further utilized for multi-channel neuro-physiological experiments or chronic implants.

17.
Precise morphological study of the chromosomes of Lolium perenne L. highlights the lack of a statistical method for assessing the distinction between the elements of a chromosomal stock when establishing a karyogram. The present method is based upon comparisons of variances, using as a reference value a comprehensive estimate of the purely random variation. This estimate must be made while avoiding two sources of error that have not been taken into consideration until now and that result solely from the fluctuation process: (1) arbitrarily separating, in each cell, a set of 4 chromosomes that are not really distinguishable into two pairs, one of which is assigned a relatively high value and the other a lower value. Both resulting distributions have very low variances which, when used to test the significance of the difference between the two means, wrongly confirm the validity of the initial distinction. (2) Confusing the two arms of symmetrical chromosomes when establishing the C/L ratio. This has two consequences: a decrease in the variance and an underestimation of the symmetry, all the more so as the actual symmetry is more complete. Application of this method leads to considering the f and g chromosomes of Lolium perenne as identical, and to separating, by means of total lengths (or the ratio T/Θ), the c and d pairs, whose morphological identity is otherwise remarkable. In addition, it is also interesting to note the clear manifestation of a satisfactory homogeneity in the variation of the chromosomes and of their constituent arms.

18.
Flow cytometric cell sorting is commonly used to obtain purified subpopulations of cells for use in in vitro and in vivo assays. This can be time-consuming if the subpopulations of interest represent very low percentages of the cell suspension under study. Often the desired subpopulations are identified by two-color immunofluorescence staining. Generally, cell sorting is performed with a flow cytometer configured to trigger on light scatter signals, then sort windows are set based upon the signals from both fluorescent markers. We demonstrate that triggering the cytometer with the fluorescence signal from antibody staining common to both of the desired subpopulations, then sorting the subpopulations based upon staining of a second marker, substantially increases the speed of cell sorting vis-à-vis traditional methods. This is because undesired events are not analysed, allowing an increase in the throughput rate. While desired subpopulations of cells can be obtained by this method, undesired (i.e., nonstaining) cell "contaminants" increase and may require a second sort. The combined time for the initial enrichment sort and a second sort can be less than sorting once using standard methodology. Alternatively, the degree of contamination may be controlled by adjusting the concentration of the cell suspension and by the sample flow rate.

19.
Decision-making in hospital, when an investment in biomedical instrumentation has to be decided, is not simple because of the large number of criteria which should be considered. To identify the most important criteria in decision-making, their respective weights and their interdependencies, we used multi-dimensional analysis methods. A quasi-exhaustive list of criteria was constructed and sent to all the heads of medical departments, the directors and the clinical engineers of the French university hospitals, in order to obtain their assessment of the influence of these criteria in their own decision-making experience. The application of factorial analysis methods to the answers gives two main results. First, there is no characteristic behaviour of professional groups, such as engineers, directors, surgeons, radiologists, etc. Second, a list of the 21 criteria that are the most influential in decision-making can be identified. This result will be built upon in order to develop a tool which allows one to measure the adequacy of a biomedical instrument to a given clinical need.
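As a generic illustration of the kind of factorial analysis mentioned above (the study used factorial methods on questionnaire answers; plain PCA on a dummy respondents-by-criteria matrix is used here as a stand-in), one can project the ratings onto a few axes to see whether professional groups separate and which criteria load most heavily.

```python
# Project a respondents-by-criteria rating matrix onto its first two factorial
# axes; the data below are random placeholders, not the survey responses.
import numpy as np
from sklearn.decomposition import PCA

ratings = np.random.default_rng(0).integers(1, 6, size=(120, 40)).astype(float)  # dummy data
pca = PCA(n_components=2).fit(ratings)
scores = pca.transform(ratings)    # respondent coordinates on the first two axes
loadings = pca.components_         # contribution of each criterion to each axis
```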

20.
