Similar Articles
20 similar articles found (search time: 31 ms)
1.
Routine preoperative tests such as the determination of bleeding time and coagulation time are unnecessary and are not recommended. Rulings which require routine preoperative tests result in the adoption of inferior and unreliable time-saving methods in the laboratory. If the clinical staff insists that laboratory procedures to predict hemorrhage be performed on every patient scheduled for operation, approved methods of performing the tests should be employed. Preoperative procedures should include a personal and a family history, a careful and complete physical examination and screening laboratory tests such as urinalysis, hematocrit, leukocyte count and smear examination, including estimation of the number of thrombocytes. Special hemorrhagic studies are indicated on selected patients. These selected patients include those who have a history of abnormal bleeding, those who consider themselves “easy bleeders” or who have apprehension concerning hemorrhage at the time of operation, and those who have physical signs of hemorrhage. Special hemorrhagic studies should also be performed on patients who have diseases that are known to be associated with vascular and coagulation abnormalities, infants who have not been subjected to tests of trauma and on patients from whom a reliable history cannot be obtained. Extra precaution should be taken if operation is to be performed in hospitals or clinics that do not have adequate blood banking facilities and if the operation to be performed is one in which difficulty in hemostasis is anticipated. The preoperative tests that are indicated on selected patients should include as a minimum: the thrombocyte count, determination of the bleeding time by the Ivy method, determination of the coagulation time by the multiple tube method and the observation of the clot. Where facilities are available, the hemorrhagic study should also include the plasma and serum prothrombin activity tests.

2.
Bacterial agents and cell components can be spread as bioaerosols, producing infections and asthmatic problems. This study compares four methods for the detection and enumeration of aerosolized bacteria collected in an AGI-30 impinger. Changes in the total and viable concentrations of Pseudomonas fluorescens in the collection fluid with respect to time of impingement were determined. Two direct microscopic methods (acridine orange and BacLight) and aerodynamic aerosol-size spectrometry (Aerosizer) were employed to measure the total bacterial cell concentrations in the impinger collection fluid and the air, respectively. These data were compared with plate counts on selective (MacConkey agar) and nonselective (Trypticase soy agar) media, and the percentages of culturable cells in the collection fluid and the bacterial injury response to the impingement process were determined. The bacterial collection rate was found to be relatively unchanged during 60 min of impingement. The aerosol measurements indicated an increased amount of cell fragments upstream of the impinger due to continuous bacterial nebulization. Some of the bacterial clusters, present in the air upstream of the impinger, deagglomerated during impingement, thus increasing the total bacterial count by both direct microscopic methods. The BacLight staining technique was also used to determine the changes in viable bacterial concentration during the impingement process. The percentage of viable bacteria, determined as a ratio of BacLight live to total counts, was only 20% after 60 min of sampling. High counts on Trypticase soy agar indicated that most of the injured cells could recover. On the other hand, the counts from the MacConkey agar were very low, indicating that most of the cells were structurally damaged in the impinger.
The comparison of data on the percentage of injured bacteria obtained by the traditional plate count with the data on percentage of nonviable bacteria obtained by the BacLight method showed good agreement.

3.
Different methods are used to study bacterial adhesion to intestinal epithelial cells, which is an important step in pathogenic infection as well as in probiotic colonization of the intestinal tract. The aim of this study was to compare the ELISA-based method with the more conventional plate count and radiolabeling methods for detecting bacterial adhesion. An ELISA-based assay was optimized for the detection of Bifidobacterium longum and Escherichia coli O157:H7, which are low- and highly adherent bacteria, respectively. In agreement with previous investigations, a percentage of adhesion below 1% was obtained for B. longum with ELISA. However, high nonspecific background and low positive signals were measured due to the use of polyclonal antibodies and the low adhesion capacity of this strain. In contrast, the ELISA-based method developed for E. coli adhesion detected a high adhesion percentage (15%). For this bacterium the three methods tested gave similar results at the highest bacterial concentrations (6.8 log CFU added bacteria/well). However, differences among the methods increased as lower bacterial concentrations were added, owing to their different detection thresholds (5.9, 5.6 and 2.9 log CFU adherent bacteria/well for the radioactivity, ELISA and plate count methods, respectively). The ELISA-based method was shown to be a good predictor of bacterial adhesion compared to the radiolabeling method when good-quality specific antibodies were used. This technique is convenient and allows handling of numerous samples.

4.
Routine preoperative tests such as the determination of bleeding time and coagulation time are unnecessary and are not recommended. Rulings which require routine preoperative tests result in the adoption of inferior and unreliable time-saving methods in the laboratory. If the clinical staff insists that laboratory procedures to predict hemorrhage be performed on every patient scheduled for operation, approved methods of performing the tests should be employed. Preoperative procedures should include a personal and a family history, a careful and complete physical examination and screening laboratory tests such as urinalysis, hematocrit, leukocyte count and smear examination, including estimation of the number of thrombocytes. Special hemorrhagic studies are indicated on selected patients. These selected patients include those who have a history of abnormal bleeding, those who consider themselves "easy bleeders" or who have apprehension concerning hemorrhage at the time of operation, and those who have physical signs of hemorrhage. Special hemorrhagic studies should also be performed on patients who have diseases that are known to be associated with vascular and coagulation abnormalities, infants who have not been subjected to tests of trauma and on patients from whom a reliable history cannot be obtained. Extra precaution should be taken if operation is to be performed in hospitals or clinics that do not have adequate blood banking facilities and if the operation to be performed is one in which difficulty in hemostasis is anticipated. The preoperative tests that are indicated on selected patients should include as a minimum: the thrombocyte count, determination of the bleeding time by the Ivy method, determination of the coagulation time by the multiple tube method and the observation of the clot. Where facilities are available, the hemorrhagic study should also include the plasma and serum prothrombin activity tests.

5.
A study was made on the efficacy of commonly used test methods to provide useful data concerning toxicity of construction materials to bacteria. Tests such as growth and storage of liquid cultures, which are based on plate counts, appeared to be less efficient in providing interpretable data than were tests which used solid media. The conjunctive use of zone inhibition and an adaptation of the replica-plate technique used in bacterial genetics supplied as much information as other tests, but these solid-media tests were easier to perform, accommodated more test samples per unit of bacterial culture, and the data were more easily understood.

6.
The generalized negative binomial distribution has been found useful in fitting over-dispersed as well as under-dispersed count data. We define and study the generalized negative binomial regression model, which is used to predict a count response variable affected by one or more explanatory variables. The methods of maximum likelihood and moments are given for estimating the model parameters. Approximate tests for the adequacy of the model are considered. The generalized negative binomial regression model has been applied to two observed data sets to which the binomial regression model was applied earlier.
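A quick way to see the over- or under-dispersion this model targets is the variance-to-mean ratio relative to the Poisson baseline (for which variance equals mean). This is an illustrative check only, not the generalized negative binomial model itself, and the data below are invented:

```python
# Dispersion index (variance-to-mean ratio) for count data.
# Values > 1 suggest over-dispersion, < 1 under-dispersion,
# relative to a Poisson baseline (variance == mean).

def dispersion_index(counts):
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / (n - 1)  # sample variance
    return var / mean

overdispersed = [0, 0, 1, 0, 7, 12, 0, 3, 0, 9]   # clumped counts
underdispersed = [3, 4, 3, 4, 3, 4, 4, 3, 3, 4]   # very regular counts

print(dispersion_index(overdispersed))   # well above 1
print(dispersion_index(underdispersed))  # well below 1
```

In practice this ratio is what motivates moving from a Poisson to a (generalized) negative binomial regression in the first place.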

7.
The published results on 60 chemicals and X-rays investigated in the mouse spot test were compared with data on the same chemicals tested in the bacterial mutation assay (Ames test) and lifetime rodent bioassays. The performance of the spot test as an in vivo complementary assay to the in vitro bacterial mutagenesis test reveals that of the 60 agents, 38 were positive in both systems, 6 were positive only in the spot test, 10 were positive only in the bacterial test and 6 were negative in both assays. The spot test was also considered as a predictor of carcinogenesis; 45 chemicals were carcinogenic, of which 35 were detected as positive by the spot test, and 3 out of 6 non-carcinogens were correctly identified as negative. If the results are regarded in sequence, i.e. that a positive result in a bacterial mutagenicity test reveals potential that may or may not be realized in vivo, then 48 chemicals were mutagenic in the bacterial mutation assay, of which 38 were active in the spot test and 31 were confirmed as carcinogens in bioassays. Twelve chemicals were non-mutagenic to bacteria, of which 6 gave positive responses in the spot test and 5 were confirmed as carcinogens. These results provide strong evidence that the mouse coat spot test is an effective complementary test to the bacterial mutagenesis assay for the detection of genotoxic chemicals and as a confirmatory test for the identification of carcinogens. The main deficiency at present is the paucity of data from the testing of non-carcinogens. With further development and improvement of the test, the predictive performance of the assay in identifying carcinogens should improve, since many of the false negative responses may be due to inadequate testing.

8.
MOTIVATION: Analysis of oligonucleotide array data, especially to select genes of interest, is a highly challenging task because of the large volume of information and various experimental factors. Moreover, interaction effects (i.e. expression changes that depend on probe effects) complicate the analysis because current methods often use an additive model to analyze the data. We propose an approach to address these issues with the aim of producing a more reliable selection of differentially expressed genes. The approach uses ranks for normalization, employs the percentile range to measure expression variation, and applies various filters to monitor expression changes. RESULTS: We compare our approach with the MAS and dChip models. A data set from an angiogenesis study is used for illustration. Results show that our approach performs better than the other methods both in identifying the positive control gene and in PCR confirmatory tests. In addition, the invariant set of genes in our approach provides an efficient way to perform normalization.
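As a rough sketch of the two ingredients named in this abstract, rank transformation and a percentile-range spread measure, here is a small pure-Python illustration. The paper's exact procedures may differ; the values are invented:

```python
def ranks(values):
    """Return 1-based ranks of the values; ties receive the average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # extend j over any run of tied values
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of the 1-based ranks i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def percentile_range(values, lo=10, hi=90):
    """Difference between the hi-th and lo-th percentiles
    (nearest-rank definition), a simple spread measure."""
    s = sorted(values)
    n = len(s)
    def pct(p):
        idx = max(0, min(n - 1, int(round(p / 100 * (n - 1)))))
        return s[idx]
    return pct(hi) - pct(lo)

expr = [5.1, 2.2, 9.9, 2.2, 7.3]  # invented probe intensities
print(ranks(expr))                # [3.0, 1.5, 5.0, 1.5, 4.0]
print(percentile_range(expr))     # 90th minus 10th percentile
```

Rank-transforming each array before comparison removes monotone array-to-array distortions, which is the basic appeal of rank-based normalization.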

9.
Aims: An increasing number of pathogens are becoming resistant to multiple classes of antibiotics, so studying how such mutations emerge is crucial to further understanding in this area. Conventional methods for such studies involve monitoring growth by standard plate count and biomolecular sequencing, which is tedious and not cost-effective. The aim of this paper is thus to introduce a novel system that enables real-time monitoring of bacterial ‘mutation-in-progress’. Methods and Results: This system provides real-time data, enabling confirmatory and further work to be performed at the important points when mutation is initiated. The system integrates spectroscopic techniques as the detection system with various supporting systems, such as a nutrient replenishing system, a pH control system and a waste system, to allow for extended monitoring. In this paper, the feasibility of monitoring the emergence of ciprofloxacin resistance in Staphylococcus aureus was demonstrated as an initial example. The integrated system was found to require significantly less material resources and manpower compared with conventional techniques. Conclusions: A novel system to monitor bacterial mutation-in-progress is presented. The work reported herein demonstrates such a system to be effective and efficient in performing real-time monitoring of mutation-in-progress, especially over extended time frames of weeks and months. Significance and Impact of the Study: With the successful optimization of this system, researchers can learn about the dynamics of antibiotic resistance and further understand how bacterial mutation occurs.

10.
Classification of microorganisms on the basis of traditional microbiological methods (morphological, physiological and biochemical) gives a blurred picture of their taxonomic status and thus needs further clarification. It should be based on a more pragmatic approach of deploying a number of methods for the complete characterization of microbes. Hence, the methods now employed for bacterial systematics include complete 16S rRNA gene sequencing and its comparative analysis by phylogenetic trees, DNA-DNA hybridization studies with related organisms, analyses of molecular markers and signature pattern(s), biochemical assays, and physiological and morphological tests. Collectively these genotypic, chemotaxonomic and phenotypic methods for determining the taxonomic position of microbes constitute what is known as the ‘polyphasic approach’ to bacterial systematics. This approach is currently the most popular choice for classifying bacteria, and several microbes which were previously placed under invalid taxa have now been resolved into new genera and species. This has been possible owing to rapid development in molecular biological techniques and automation of DNA sequencing, coupled with advances in bioinformatic tools and access to sequence databases. Several DNA-based typing methods are known; these provide information for delineating bacteria into different genera and species and have the potential to resolve differences among the strains of a species. Therefore, newly isolated strains must be classified on the basis of the polyphasic approach. Previously classified organisms can also be reclassified, as and when required, in order to obtain information about their accurate position in the microbial world. Thus, current techniques enable microbiologists to decipher the natural phylogenetic relationships between microbes.

11.
Medical device-related infections are frequently a consequence of Staphylococcus biofilm, a lifestyle that enhances bacterial resistance to antibiotics. Antibiotic susceptibility tests are usually performed on planktonic forms of clinical isolates. Some methods have been developed to perform antibiotic susceptibility tests on biofilm; however, none of them counts the bacterial inoculum. As antibiotic susceptibility is related to the bacterial inoculum, the test results could be misleading. Here, a new method, the BioTimer Assay (BTA), able to count bacteria in biofilm without any manipulation of samples, is presented. Moreover, the BTA method is applied to analyze the antibiotic susceptibility of six Staphylococcus strains in biofilm and to determine the number of viable bacteria in the presence of sub-inhibitory doses of four different antibiotics. To validate BTA, the new method was compared to reference methods for both counting and antibiotic susceptibility tests. High agreement between BTA and the reference methods was found on planktonic forms. Therefore, BTA was employed to count bacteria in biofilm and to analyze biofilm antibiotic susceptibility. The results confirm the high resistance of Staphylococcus biofilm to antibiotics. Moreover, BTA counts the number of viable bacteria in the presence of sub-inhibitory doses of antibiotics. The results show that the number of viable bacteria depends on the sub-inhibitory dose, the age of the biofilm and the type of antibiotic. In particular, unlike gentamicin and ampicillin, sub-inhibitory doses of ofloxacin and azithromycin reduce the number of viable bacteria to a lesser extent in young than in old biofilm. In conclusion, BTA is a reliable, rapid, easy-to-perform and versatile method, and it can be considered a useful tool for analyzing the antibiotic susceptibility of Staphylococcus spp. in biofilm.

12.
Traditional resampling-based tests for homogeneity in covariance matrices across multiple groups resample residuals, that is, data centered by group means. These residuals do not share the same second moments when the null hypothesis is false, which makes them difficult to use in the setting of multiple testing. An alternative approach is to resample standardized residuals, data centered by group sample means and standardized by group sample covariance matrices. This approach, however, has been observed to inflate type I error when sample size is small or data are generated from heavy-tailed distributions. We propose to improve this approach by using robust estimation for the first and second moments. We discuss two statistics: the Bartlett statistic and a statistic based on eigen-decomposition of sample covariance matrices. Both statistics can be expressed in terms of standardized errors under the null hypothesis. These methods are extended to test homogeneity in correlation matrices. Using simulation studies, we demonstrate that the robust resampling approach provides comparable or superior performance, relative to traditional approaches, for single testing and reasonable performance for multiple testing. The proposed methods are applied to data collected in an HIV vaccine trial to investigate possible determinants, including vaccine status, vaccine-induced immune response level and viral genotype, of unusual correlation pattern between HIV viral load and CD4 count in newly infected patients.
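For orientation, the classical univariate Bartlett statistic for homogeneity of variances can be computed directly; the abstract's version generalizes this to covariance matrices with robust moment estimates, which this sketch omits. Data below are invented:

```python
import math

# Classical Bartlett statistic for equality of variances across groups.
# Under the null it is compared against a chi-square with k-1 degrees
# of freedom (k = number of groups).

def bartlett_statistic(groups):
    k = len(groups)
    n = [len(g) for g in groups]
    N = sum(n)
    means = [sum(g) / len(g) for g in groups]
    s2 = [sum((x - m) ** 2 for x in g) / (len(g) - 1)  # sample variances
          for g, m in zip(groups, means)]
    sp2 = sum((ni - 1) * si for ni, si in zip(n, s2)) / (N - k)  # pooled
    num = (N - k) * math.log(sp2) - sum(
        (ni - 1) * math.log(si) for ni, si in zip(n, s2))
    corr = 1 + (sum(1 / (ni - 1) for ni in n) - 1 / (N - k)) / (3 * (k - 1))
    return num / corr

same = [[1.0, 2.0, 3.0, 4.0], [5.0, 6.0, 7.0, 8.0]]    # equal spreads
diff = [[1.0, 2.0, 3.0, 4.0], [0.0, 5.0, 10.0, 15.0]]  # unequal spreads

print(bartlett_statistic(same))  # ~ 0: identical variances
print(bartlett_statistic(diff))  # clearly positive
```

The statistic's sensitivity to heavy tails is exactly the motivation the abstract gives for replacing the plain sample moments with robust estimates.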

13.
Zero-truncated data arise in various disciplines where counts are observed but the zero count category cannot be observed during sampling. Maximum likelihood estimation can be used to model these data; however, due to its nonstandard form it cannot be easily implemented using well-known software packages, and additional programming is often required. Motivated by the Rao-Blackwell theorem, we develop a weighted partial likelihood approach to estimate model parameters for zero-truncated binomial and Poisson data. The resulting estimating function is equivalent to a weighted score function for standard count data models, and allows for applying readily available software. We evaluate the efficiency of this new approach and show that it performs almost as well as maximum likelihood estimation. The weighted partial likelihood approach is then extended to regression modelling and variable selection. We examine the performance of the proposed methods through simulation and present two case studies using real data.
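For context, the zero-truncated Poisson maximum likelihood fit that such approaches are benchmarked against reduces to solving a single equation in the rate parameter. A minimal fixed-point sketch (this is the standard MLE, not the authors' weighted partial likelihood estimator; the data are invented):

```python
import math

# MLE for a zero-truncated Poisson: the estimate lam solves
#   lam / (1 - exp(-lam)) = sample_mean,
# which can be found by simple fixed-point iteration.

def zt_poisson_mle(counts, tol=1e-10, max_iter=1000):
    assert all(c >= 1 for c in counts), "zero counts are unobservable"
    mean = sum(counts) / len(counts)
    lam = mean  # starting value
    for _ in range(max_iter):
        new = mean * (1 - math.exp(-lam))  # fixed-point update
        if abs(new - lam) < tol:
            break
        lam = new
    return lam

data = [1, 2, 1, 3, 2, 4, 1, 2, 5, 4]  # truncated counts, all >= 1
lam = zt_poisson_mle(data)
# The estimate satisfies the likelihood equation: the second value
# printed equals the sample mean of the data.
print(lam, lam / (1 - math.exp(-lam)))
```

Note the estimate is smaller than the raw sample mean, since the unobserved zeros pull the underlying Poisson rate down.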

14.
The usefulness of microbiological standards for frozen foods is currently controversial in the trade and scientific literature. Most reviewers have given arguments both for and against, and have concluded that such standards should be applied with great caution. They have the advantage of putting questions of safety on a convenient numerical basis. Canadian workers have reported that promulgation of standards has invariably raised the hygienic level of the products controlled.

Bacteriological standards have often been associated with the question of safety to the consumer. Everyone recognizes that food poisoning bacteria are a potential danger in any food. But many have argued that the history of food poisoning outbreaks from frozen foods is excellent and that there is no need for standards; on the other hand, proponents of standards have pointed to the incomplete investigation and reporting of outbreaks, and have argued that there may be more outbreaks than we realize. They have pointed to laboratory studies that have shown grossly mishandled precooked frozen foods to be truly dangerous. Some have proposed that pathogens should be absent from foods; but others have questioned that a microbiological standard can accomplish this end. Some pathogens, such as Salmonella or Staphylococcus have been shown to be so ubiquitous that their presence in some commercial foods is unavoidable. Also, sampling and analytical methods have been described as inadequate to guarantee that pathogens present will be detected. Some have argued that control at the source is a better way—through inspections of the plant operation, by enforcement of handling codes, or by processing procedures such as pasteurization, which would be more certain to result in a pathogen-free food.

A most important part of any of the proposed standards is a “total count” of viable aerobic bacteria. English workers have found that foods causing poisoning outbreaks usually had total viable counts above 10 million per gram. On the other hand, these same workers found Salmonella on meats with very low total viable count. The assumption by many that low total count indicates safety has been shown to be not always true. Furthermore, high counts of nonpathogenic organisms, such as psychrophilic saprophytes would have no public health significance.

The relation between bacterial level and quality is open to less controversy. Some authorities have pointed to bacterial level as a measure of sanitation, adequacy of refrigeration, or speed of handling. Others have indicated that to determine which of these factors caused a high count would be impossible with only a total count on the product as a guide. Some investigators have said a high count affects flavor adversely before actual spoilage is evident, and this may be a factor in competition on today's market. It is well established that initial bacterial level will affect the shelf-life of a chilled product. Methods of analysis are more nearly adequate for counts than for pathogens, but they need improvement, and should be clearly specified as part of any bacteriological standard. Foods with high count could sometimes be brought into compliance merely by storing them for a sufficient period frozen, or by heating them slightly. This has been cited by some authors as a disadvantage of bacteriological standards.

The enterococci and the coliform group (except Escherichia coli) have been shown to be ubiquitous and therefore should not be used alone to indicate fecal contamination. Although E. coli has greater significance, its source should be determined each time it is found.

Various reviewers have expressed the need for caution in the application of standards. The principal precautionary arguments we have found are as follows:

1) A single set of microbiological standards should not be applied to foods as a miscellaneous group, such as “frozen foods” or “precooked foods.”

2) Microbiological standards should be applied first to the more hazardous types of foods on an individual basis, after sufficient data are accumulated on expected bacterial levels, with consideration of variations in composition, processing procedures, and time of frozen storage.

3) When standards are chosen, there should be a definite relation between the standard and the hazard against which it is meant to protect the public.

4) Methods of sampling and analysis should be carefully studied for reliability and reproducibility among laboratories, and chosen methods should be specified in detail as part of the standard.

5) Tolerances should be included in the standard to account for inaccuracies of sampling and analysis.

6) At first, the standard should be applied on a tentative basis to allow for voluntary compliance before becoming a strictly enforced regulation.

7) Microbiological standards will be expensive to enforce.

8) If standards are unwisely chosen they will not stand in courts of law.


15.
Strains of Clostridium perfringens and culturally similar species which also may grow on selective isolation media for this organism were examined by conventional confirmatory tests, the API ZYM system and by individual tests for phosphatase and glutamic acid decarboxylase activity.
API ZYM tests, involving 19 different enzymes, confirmed the known similarity between Cl. perfringens, Cl. absonum, Cl. paraperfringens and Cl. sardiniensis but effectively distinguished this group from Cl. bifermentans, Cl. celatum, Cl. perenne and Cl. sordellii. A similar separation was achieved by a single test for acid phosphatase which could be applied to individual colonies on a plating medium.
Because the acid phosphatase test was found to be of greater value than nitrate reduction in distinguishing Cl. perfringens, it could replace the latter in the usual series of confirmatory tests. It is suggested that strains from Cl. perfringens isolation media should be screened for acid phosphatase activity at the purification stage and only positive strains subjected to further tests.
It was found that Cl. perfringens could not be distinguished from the other species on the basis of glutamate decarboxylase activity.

16.
Keith P. Lewis, Oikos (2004) 104(2): 305–315
Ecologists rely heavily upon statistics to make inferences concerning ecological phenomena and to make management recommendations. It is therefore important to use statistical tests that are most appropriate for a given data-set. However, inappropriate statistical tests are often used in the analysis of studies with categorical data (i.e. count data or binary data). Since many types of statistical tests have been used in artificial nests studies, a review and comparison of these tests provides an opportunity to demonstrate the importance of choosing the most appropriate statistical approach for conceptual reasons as well as type I and type II errors.
Artificial nests have routinely been used to study the influences of habitat fragmentation and habitat edges on nest predation. I review the variety of statistical tests used to analyze artificial nest data within the framework of the generalized linear model and argue that logistic regression is the most appropriate and flexible statistical test for analyzing binary data-sets. Using artificial nest data from my own studies and an independent data set from the medical literature as examples, I tested equivalent data using a variety of statistical methods. I then compared the p-values and the statistical power of these tests. Results vary greatly among statistical methods. Methods inappropriate for analyzing binary data often fail to yield significant results even when differences between study groups appear large, while logistic regression finds these differences statistically significant. Statistical power is 2–3 times higher for logistic regression than for other tests. I recommend that logistic regression be used to analyze artificial nest data and other data-sets with binary data.
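A minimal logistic regression fit for binary outcomes of the kind recommended above can be written in a few lines. With a single binary predictor (say, edge vs. interior nests; the predation numbers here are invented for illustration), the fitted intercept is the log-odds of the baseline group and the slope is the difference in group log-odds:

```python
import math

# Logistic regression by gradient ascent on the log-likelihood.
# y[i] in {0, 1} is the outcome (nest predated or not);
# x[i] in {0, 1} is the predictor (edge vs. interior).

def fit_logistic(x, y, lr=0.1, iters=20000):
    b0 = b1 = 0.0
    n = len(x)
    for _ in range(iters):
        g0 = g1 = 0.0
        for xi, yi in zip(x, y):
            p = 1 / (1 + math.exp(-(b0 + b1 * xi)))  # predicted probability
            g0 += yi - p                              # score for intercept
            g1 += (yi - p) * xi                       # score for slope
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

# Interior (x=0): 8 of 20 nests predated; edge (x=1): 14 of 20.
x = [0] * 20 + [1] * 20
y = [1] * 8 + [0] * 12 + [1] * 14 + [0] * 6

b0, b1 = fit_logistic(x, y)
logit = lambda p: math.log(p / (1 - p))
print(b0, b1)  # ~ logit(0.4) = -0.405 and logit(0.7) - logit(0.4) = 1.253
```

Because the model works on the log-odds scale, it handles probabilities near 0 or 1 gracefully, which is where tests that assume normal errors on raw proportions break down.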

17.
The evidence for amphibian population declines is based on count data that were not adjusted for detection probabilities. Such data are not reliable even when collected using standard methods. The formula C = Np (where C is a count, N the true parameter value, and p is a detection probability) relates count data to demography, population size, or distributions. With unadjusted count data, one assumes a linear relationship between C and N and that p is constant. These assumptions are unlikely to be met in studies of amphibian populations. Amphibian population data should be based on methods that account for detection probabilities.
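The relation C = Np can be illustrated with two hypothetical surveys: identical counts are compatible with very different true population sizes when the detection probability differs, which is exactly why unadjusted counts cannot separate a decline from a change in detectability. The numbers below are invented:

```python
# C = N * p: expected count as true abundance times detection probability.

def expected_count(N, p):
    return N * p

# Two hypothetical surveys with the same expected count:
survey_a = expected_count(100, 0.50)  # 100 animals, half detected
survey_b = expected_count(200, 0.25)  # twice as many, harder to detect

print(survey_a, survey_b)  # both 50.0: same count, very different N
```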

18.
The coliform group has been used extensively as an indicator of water quality and has historically led to the public health protection concept. The aim of this review is to examine methods currently in use, or which can be proposed, for the monitoring of coliforms in drinking water. Indeed, the water industry has a pressing need for more rapid, sensitive and specific tests. Routine and widely accepted techniques are discussed, as are methods which have emerged from recent research developments. Approved traditional methods for coliform detection include the multiple-tube fermentation (MTF) technique and the membrane filter (MF) technique using different specific media and incubation conditions. These methods have limitations, however, such as the duration of incubation, interference from antagonistic organisms, lack of specificity and poor detection of slow-growing or viable but non-culturable (VBNC) microorganisms. Nowadays, the simple and inexpensive membrane filter technique is the most widely used method for routine enumeration of coliforms in drinking water. The detection of coliforms based on specific enzymatic activity has improved the sensitivity of these methods. The enzymes beta-D-galactosidase and beta-D-glucuronidase are widely used for the detection and enumeration of total coliforms and Escherichia coli, respectively. Many chromogenic and fluorogenic substrates exist for the specific detection of these enzymatic activities, and various commercial tests based on these substrates are available. Numerous comparisons have shown that these tests may be a suitable alternative to the classical techniques. They are, however, more expensive, and the incubation time, even though reduced, remains too long for same-day results.
More sophisticated analytical tools such as solid-phase cytometry can be employed to decrease the time needed for the detection of bacterial enzymatic activities, with a low detection threshold. Detection of coliforms by molecular methods is also proposed, as these methods allow very specific and rapid detection without the need for a cultivation step. Three molecular-based methods are evaluated here: the immunological, polymerase chain reaction (PCR) and fluorescent in situ hybridization (FISH) techniques. In the immunological approach, various antibodies against coliform bacteria have been produced, but the application of this technique has often shown low antibody specificity. PCR can be used to detect coliform bacteria by means of signal amplification: DNA sequences coding for the lacZ gene (beta-galactosidase gene) and the uidA gene (beta-D-glucuronidase gene) have been used to detect total coliforms and E. coli, respectively. However, quantification with PCR is still lacking in precision and necessitates extensive laboratory work. The FISH technique involves the use of oligonucleotide probes to detect complementary sequences inside specific cells. Oligonucleotide probes designed specifically for regions of the 16S rRNA molecules of Enterobacteriaceae can be used for microbiological quality control of drinking water samples. FISH should be an interesting and viable alternative to conventional culture methods for the detection of coliforms in drinking water, as it provides quantitative data in a fairly short period of time (6 to 8 h), but it still requires research effort. This review shows that even though many innovative bacterial detection methods have been developed, few have the potential to become a standardized method for the detection of coliforms in drinking water samples.

19.
Peter Warner, Alison Glassco, CMAJ (1963) 88(26): 1280–1283
An investigation of the “normal” bacterial content of the air of a large general hospital is described. In many different places within five different areas, 70 to 200 settle-plate or slit-sampler bacterial counts were carried out. Average counts were most often of the same order as, or lower than, other published results and were proportional to human activity. The use of the logarithms of the counts showed no advantage, and conventional statistics should be applied with caution in evaluating such studies. Slit-sampler and settle-plate counts of all bacteria showed no correlation, whereas those of Staph. aureus were correlated. There is a lack of parallelism between hospital infection and air bacteria counted by current methods, which are therefore not suitable for routine use.

20.
The presence of Aeromonas spp. in water can represent a risk for human health. Therefore, it is important to know the physiological status of these bacteria and their survival in the environment. We studied the behavior of a strain of Aeromonas hydrophila in river water, spring water, brackish water, mineral water, and chlorinated drinking water, which had different physical and chemical characteristics. The bacterial content was evaluated by spectrophotometric and plate count techniques. Flow cytometric determination of viability was carried out using a dual-staining technique that enabled us to distinguish viable bacteria from damaged and membrane-compromised bacteria. The traditional methods showed that the bacterial content was variable and dependent on the type of water. The results obtained from the plate count analysis correlated with the absorbance data. In contrast, the flow cytometric analysis results did not correlate with the results obtained by traditional methods; in fact, this technique showed that there were viable cells even when the optical density was low or no longer detectable and there was no plate count value. According to our results, flow cytometry is a suitable method for assessing the viability of bacteria in water samples. Furthermore, it permits fast detection of bacteria that are in a viable but nonculturable state, which are not detectable by conventional methods.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)