Similar Literature (20 results)
1.
Tang J, Guo S, Sun Q, Deng Y, Zhou D. BMC Genomics 2010, 11(Suppl 2):S9

Background

Ultrasound imaging technology has wide applications in cattle reproduction and has been used to monitor individual follicles and determine the patterns of follicular development. However, speckle in ultrasound images degrades post-processing steps such as follicle segmentation and ultimately affects follicle measurement. To reduce the effect of speckle, a bilateral filter is developed in this paper.

Results

We develop a new bilateral filter for speckle reduction in ultrasound images for follicle segmentation and measurement. Unlike previous bilateral filters, the proposed filter uses a normalized difference when computing the Gaussian intensity-difference weight. We also present the results of follicle segmentation after speckle reduction. Experimental results on both synthetic images and real ultrasound images demonstrate the effectiveness of the proposed filter.
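The abstract does not give the filter's exact formula, but the idea of a normalized intensity difference in the range kernel can be sketched roughly as follows (a minimal NumPy illustration; the window size, sigmas, and normalization are assumptions, not the published parameters).

```python
import numpy as np

def bilateral_normalized(img, radius=3, sigma_s=2.0, sigma_r=0.1, eps=1e-6):
    """Bilateral filter whose range kernel uses a *normalized* intensity
    difference |I(p) - I(q)| / (I(p) + I(q)); a sketch of the idea described
    in the abstract, not the published filter."""
    img = img.astype(np.float64)
    h, w = img.shape
    # Spatial Gaussian weights for the (2r+1)^2 window.
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    w_spatial = np.exp(-(xs**2 + ys**2) / (2 * sigma_s**2))
    out = np.zeros_like(img)
    pad = np.pad(img, radius, mode='reflect')
    for i in range(h):
        for j in range(w):
            window = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            center = img[i, j]
            # Normalizing the difference keeps the range weight comparable in
            # bright and dark regions, so speckle is suppressed in both
            # high- and low-intensity areas.
            diff = np.abs(window - center) / (window + center + eps)
            w_range = np.exp(-diff**2 / (2 * sigma_r**2))
            weights = w_spatial * w_range
            out[i, j] = np.sum(weights * window) / np.sum(weights)
    return out
```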

Conclusions

Compared with previous bilateral filters, the proposed bilateral filter can reduce speckle in both high-intensity and low-intensity regions of ultrasound images. Segmentation of follicles in images speckle-reduced by the proposed method outperforms segmentation in the original ultrasound images, as well as in images filtered by a Gaussian filter or a conventional bilateral filter.

2.

Background

Cell segmentation is a critical step for quantification and monitoring of cell cycle progression, cell migration, and growth control to investigate cellular immune response, embryonic development, tumorigenesis, and drug effects on live cells in time-lapse microscopy images.

Methods

In this study, we propose a joint spatio-temporal diffusion and region-based level-set optimization approach for moving cell segmentation. Moving regions are initially detected in each set of three consecutive sequence images by numerically solving a system of coupled spatio-temporal partial differential equations. In order to standardize intensities of each frame, we apply a histogram transformation approach to match the pixel intensities of each processed frame with an intensity distribution model learned from all frames of the sequence during the training stage. After the spatio-temporal diffusion stage is completed, we compute the edge map by nonparametric density estimation using Parzen kernels. This process is followed by watershed-based segmentation and moving cell detection. We use this result as an initial level-set function to evolve the cell boundaries, refine the delineation, and optimize the final segmentation result.
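As an illustration of the intensity-standardization step, the sketch below matches each frame's histogram to a reference frame with scikit-image; the published pipeline learns an intensity distribution model from all training frames, which is only approximated here by a single representative frame.

```python
import numpy as np
from skimage import exposure

def standardize_frames(frames, reference):
    """Match each frame's intensity histogram to a reference distribution.

    `reference` stands in for the intensity model the paper learns from all
    training frames; here it is simply one representative frame (an
    assumption for illustration).
    """
    return [exposure.match_histograms(f, reference) for f in frames]

# Example: standardize a short synthetic sequence against its first frame.
rng = np.random.default_rng(0)
frames = [rng.normal(loc=100 + 20 * k, scale=10, size=(64, 64)) for k in range(3)]
standardized = standardize_frames(frames, frames[0])
```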

Results

We applied this method to several datasets of fluorescence microscopy images with varying levels of difficulty with respect to cell density, resolution, contrast, and signal-to-noise ratio. We compared the results with those produced by Chan and Vese segmentation, a temporally linked level-set technique, and nonlinear diffusion-based segmentation. We validated all segmentation techniques against reference masks provided by the international Cell Tracking Challenge consortium. The proposed approach delineated cells with an average Dice similarity coefficient of 89 % over a variety of simulated and real fluorescent image sequences. It yielded average improvements of 11 % in segmentation accuracy compared to both strictly spatial and temporally linked Chan-Vese techniques, and 4 % compared to the nonlinear spatio-temporal diffusion method.

Conclusions

Despite the wide variation in cell shape, density, mitotic events, and image quality among the datasets, our proposed method produced promising segmentation results. These results indicate the efficiency and robustness of this method especially for mitotic events and low SNR imaging, enabling the application of subsequent quantification tasks.

3.

Background

The finite element method (FEM) is a powerful mathematical tool to simulate and visualize the mechanical deformation of tissues and organs during medical examinations or interventions. It remains a challenge to build an FEM mesh directly from a volumetric image, partly because the regions (or structures) of interest (ROIs) may be irregular and fuzzy.

Methods

A software package, ImageParser, is developed to generate an FEM mesh from 3-D tomographic medical images. This software uses a semi-automatic method to detect ROIs from the context of the image, including neighboring tissues and organs, completes the segmentation of the different tissues, and meshes the organ into elements.
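ImageParser's own pipeline is not reproduced here, but the step from a segmented ROI to mesh geometry can be sketched: the fragment below extracts a surface mesh from a binary segmentation with scikit-image's marching cubes, which a downstream tool would still have to fill with volumetric finite elements (the function and its parameters are illustrative assumptions).

```python
import numpy as np
from skimage import measure

def roi_to_surface_mesh(segmentation, level=0.5):
    """Turn a binary ROI segmentation (e.g., from a CT volume) into a
    triangulated surface mesh; not ImageParser itself, only a sketch."""
    verts, faces, normals, _ = measure.marching_cubes(
        segmentation.astype(np.float32), level=level)
    return verts, faces, normals

# Toy example: a spherical "organ" inside a 64^3 volume.
z, y, x = np.mgrid[:64, :64, :64]
sphere = ((x - 32)**2 + (y - 32)**2 + (z - 32)**2 < 20**2).astype(np.uint8)
verts, faces, normals = roi_to_surface_mesh(sphere)
```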

Results

The ImageParser is shown to build up an FEM model for simulating the mechanical responses of the breast based on 3-D CT images. The breast is compressed by two plate paddles under an overall displacement as large as 20% of the initial distance between the paddles. The strain and tangential Young's modulus distributions are specified for the biomechanical analysis of breast tissues.

Conclusion

The ImageParser can successfully extract the geometry of ROIs from a complex medical image and generate the FEM mesh with user-defined segmentation information.

4.

Background

Compared to the design of a traditional multi-radius (MR) total knee arthroplasty (TKA), the single-radius (SR) implant investigated has a fixed flexion/extension center of rotation. The biomechanical effectiveness of an SR design for functional daily activities such as sit-to-stand is not well understood. The purpose of the study was to compare the biomechanics underlying functional performance of the sit-to-stand (STS) movement between the MR and SR limbs of participants with bilateral TKA.

Methods

Sagittal plane kinematics and kinetics, and EMG data for selected knee flexor and extensor muscles were analyzed for eight bilateral TKA patients, each with an SR and an MR TKA implant.

Results

Compared to the MR limb, the SR limb demonstrated a greater peak antero-posterior (AP) ground reaction force, a higher AP ground reaction impulse, and lower vastus lateralis and semitendinosus EMG activity during the forward-thrust phase of the STS movement. No significant difference in knee extensor moment was found between the two knees.

Conclusion

Some GRF and EMG differences were evident between the MR and SR limbs during STS movement. Compensatory adaptations may be used to perform the STS.

5.

Background

The reconstruction of ancestral genomes must deal with the problem of resolution, necessarily involving a trade-off between trying to identify genomic details and being overwhelmed by noise at higher resolutions.

Results

We use a median reconstruction, at the synteny-block level, of the ancestral genome of the order Gentianales (based on coffee, Rhazya stricta, and grape) to exemplify the effects of resolution (granularity) on comparative genomic analyses.

Conclusions

We show how decreased resolution blurs the differences between evolving genomes, with respect to rate, mutational process and other characteristics.

6.

Background

Steatosis is routinely assessed histologically in clinical practice and research. Automated image analysis can reduce the effort of quantifying steatosis. Since reproducibility is essential for practical use, we have evaluated different analysis methods in terms of their agreement with stereological point counting (SPC) performed by a hepatologist.

Methods

The evaluation was based on a large and representative data set of 970 histological images from human patients with different liver diseases. Three of the evaluated methods were built on previously published approaches. One method incorporated a new approach to improve the robustness to image variability.

Results

The new method showed the strongest agreement with the expert. At 20× resolution, it reproduced steatosis area fractions with a mean absolute error of 0.011 for absent or mild steatosis and 0.036 for moderate or severe steatosis. At 10× resolution, it was more accurate than, and twice as fast as, all other methods at 20× resolution. When compared with SPC performed by two additional human observers, its error was substantially lower than that of one observer and only slightly above that of the other.
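For readers unfamiliar with the agreement metric, the sketch below computes a steatosis area fraction from a binary mask and the mean absolute error against expert estimates; the values shown are hypothetical and the evaluated methods' segmentation steps are not reproduced.

```python
import numpy as np

def area_fraction(mask):
    """Fraction of image pixels classified as steatosis (binary mask)."""
    return mask.sum() / mask.size

def mean_absolute_error(method_fractions, expert_fractions):
    """Mean |method - expert| over a set of images."""
    method_fractions = np.asarray(method_fractions, dtype=float)
    expert_fractions = np.asarray(expert_fractions, dtype=float)
    return np.mean(np.abs(method_fractions - expert_fractions))

# Hypothetical fractions for illustration only (not the paper's data).
print(mean_absolute_error([0.05, 0.12, 0.30], [0.06, 0.10, 0.33]))
```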

Conclusions

The results suggest that the new method can be a suitable automated replacement for SPC. Before further improvements can be verified, it is necessary to thoroughly assess the variability of SPC between human observers.

7.

Introduction

Mass spectrometry imaging (MSI) experiments result in complex multi-dimensional datasets, which require specialist data analysis tools.

Objectives

We have developed massPix—an R package for analysing and interpreting data from MSI of lipids in tissue.

Methods

massPix produces single ion images, performs multivariate statistics and provides putative lipid annotations based on accurate mass matching against generated lipid libraries.

Results

Classification of tissue regions with high spectral similarity can be carried out by principal components analysis (PCA) or k-means clustering.
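massPix is an R package; purely as an illustration of this kind of pixel-spectrum classification, the following Python sketch applies PCA followed by k-means to a toy MSI intensity matrix (the component and cluster counts are arbitrary assumptions, not massPix defaults).

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

def cluster_pixel_spectra(intensity_matrix, image_shape, n_components=10, n_clusters=4):
    """Group MSI pixels with similar spectra: PCA for dimensionality
    reduction, then k-means on the scores. Returns a cluster label image."""
    scores = PCA(n_components=n_components).fit_transform(intensity_matrix)
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(scores)
    return labels.reshape(image_shape)

# Toy data: 32x32 pixels, 200 m/z bins.
rng = np.random.default_rng(1)
spectra = rng.random((32 * 32, 200))
label_image = cluster_pixel_spectra(spectra, (32, 32))
```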

Conclusion

massPix is an open-source tool for the analysis and statistical interpretation of MSI data, and is particularly useful for lipidomics applications.

8.

Introduction

Collecting feces is easy, and it offers direct access to endogenous and microbial metabolites.

Objectives

Given the lack of consensus on fecal sample preparation, especially for animal species, we developed a robust protocol enabling untargeted LC-HRMS fingerprinting.

Methods

The conditions of extraction (quantity, preparation, solvents, dilutions) were investigated in bovine feces.

Results

A rapid and simple protocol involving feces extraction with methanol (1/3, M/V) followed by centrifugation and a filtration step (10 kDa) was developed.

Conclusion

The workflow generated repeatable and informative fingerprints for robust metabolome characterization.

9.

Background

Efficient computational recognition and segmentation of target organs from medical images are foundational for diagnosis and treatment, especially for pancreatic cancer. In practice, the diverse appearance of the pancreas and of the other abdominal organs makes detailed texture information important for a segmentation algorithm. According to our observations, however, the structures of previous networks, such as the Richer Feature Convolutional Network (RCF), are too coarse to segment the object (pancreas) accurately, especially its edges.

Method

In this paper, we extend RCF, originally proposed for edge detection, to the challenging task of pancreas segmentation and put forward a novel pancreas segmentation network. By employing a multi-layer up-sampling structure in place of the simple up-sampling operation at every stage, the proposed network fully exploits multi-scale, detailed contextual information about the object (pancreas) to perform per-pixel segmentation. Additionally, we train the network on CT scans to obtain an effective pipeline.

Result

With our multi-layer up-sampling pipeline, we achieve better performance than RCF in the task of single-object (pancreas) segmentation. Furthermore, when combined with multi-scale input, we achieve a DSC (Dice similarity coefficient) of 76.36% on the test data.
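The reported DSC can be computed from a predicted and a reference mask as in the following minimal sketch (not the authors' evaluation code).

```python
import numpy as np

def dice_coefficient(pred_mask, ref_mask, eps=1e-8):
    """Dice similarity coefficient between two binary masks:
    2|A ∩ B| / (|A| + |B|)."""
    pred = pred_mask.astype(bool)
    ref = ref_mask.astype(bool)
    intersection = np.logical_and(pred, ref).sum()
    return 2.0 * intersection / (pred.sum() + ref.sum() + eps)
```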

Conclusion

Our experiments show that the proposed model outperforms previous networks on our dataset; in other words, it captures detailed contextual information more effectively. Our new single-object segmentation model is therefore of practical value for automated computational diagnosis.

10.

Background

Centrifugation is an indispensable procedure for plasma sample preparation, but the conditions applied can vary between labs.

Aim

Determine whether routinely used plasma centrifugation protocols (1500×g for 10 min; 3000×g for 5 min) influence non-targeted metabolomic analyses.

Methods

Nuclear magnetic resonance spectroscopy (NMR) and High Resolution Mass Spectrometry (HRMS) data were evaluated with sparse partial least squares discriminant analyses and compared with cell count measurements.

Results

Besides significant differences in platelet count, we identified substantial alterations in NMR and HRMS data related to the different centrifugation protocols.

Conclusion

Even minor differences in plasma centrifugation can significantly influence metabolomic patterns and potentially bias metabolomics studies.

11.

Introduction

Untargeted and targeted analyses are two classes of metabolic study. Both strategies have been advanced by high-resolution mass spectrometers coupled with chromatography, which have the advantages of high mass sensitivity and accuracy. State-of-the-art methods for mass spectrometric data sets do not always quantify metabolites of interest in a targeted assay efficiently and accurately.

Objectives

TarMet can quantify targeted metabolites as well as their isotopologues through a reactive and user-friendly graphical user interface.

Methods

TarMet accepts vendor-neutral data files (NetCDF, mzXML and mzML) as inputs. It then extracts ion chromatograms, detects peak positions and bounds, and confirms the metabolites via their isotope patterns. It can integrate peak areas for all isotopologues automatically.
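TarMet's own implementation is not reproduced here; the sketch below only illustrates two of the core operations the paragraph describes, extracting an ion chromatogram within an m/z tolerance and integrating a peak area, with illustrative data structures and tolerances.

```python
import numpy as np

def extracted_ion_chromatogram(scans, target_mz, ppm=10.0):
    """Sum intensities within +/- ppm of the target m/z for each scan.

    `scans` is a list of (mz_array, intensity_array) pairs, one per retention
    time point (an in-memory stand-in for a parsed mzML/mzXML file).
    """
    tol = target_mz * ppm * 1e-6
    eic = []
    for mz, inten in scans:
        sel = np.abs(mz - target_mz) <= tol
        eic.append(inten[sel].sum())
    return np.array(eic)

def integrate_peak(eic, rt, rt_lo, rt_hi):
    """Trapezoidal peak area between the detected retention-time bounds."""
    rt = np.asarray(rt, dtype=float)
    sel = (rt >= rt_lo) & (rt <= rt_hi)
    return np.trapz(eic[sel], rt[sel])
```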

Results

TarMet detects more isotopologues and quantifies them better than state-of-the-art methods, and it handles isotope tracer assays well.

Conclusion

TarMet is a better tool for targeted metabolic and stable isotope tracer analyses.

12.

Background

The ability to direct the cellular response by means of biomaterial surface topography is important for biomedical applications. Substrate surface topography has been shown to be an effective cue for the regulation of cellular response. Here, the response of human aortic endothelial cells to nanoporous anodic alumina and macroporous silicon with collagen and fibronectin functionalization has been studied.

Methods

Confocal microscopy and scanning electron microscopy were employed to analyse the effects of the material and the porosity on the adhesion, morphology, and proliferation of the cells. Cell spreading and filopodia formation on macro- and nanoporous materials were characterized by atomic force microscopy. We also studied the influence of the protein coating on adhesion.

Results

The best results regarding cell adhesion, morphology, and proliferation were obtained when the material was functionalized with fibronectin.

Conclusion

These results make it possible to obtain chemically modified 3D structures for several biotechnology applications, such as tissue engineering, organ-on-chip systems, and regenerative medicine.

13.

Introduction

Data sharing is being increasingly required by journals and has been heralded as a solution to the ‘replication crisis’.

Objectives

(i) Review data sharing policies of journals publishing the most metabolomics papers associated with open data and (ii) compare these journals’ policies to those that publish the most metabolomics papers.

Methods

A PubMed search was used to identify metabolomics papers. Metabolomics data repositories were manually searched for linked publications.

Results

Journals that support data sharing are not necessarily those with the most papers associated with open metabolomics data.

Conclusion

Further efforts are required to improve data sharing in metabolomics.

14.

Background

In recent years, the visualization of biomagnetic measurement data by so-called pseudo-current density maps, or Hosaka-Cohen (HC) transformations, has become popular.

Methods

The physical basis of these intuitive maps is clarified by means of analytically solvable problems.
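One common formulation of the pseudo-current density map takes the in-plane rotation of the gradient of the normal field component; the following is a minimal sketch under that assumption (sign and scaling conventions vary between implementations).

```python
import numpy as np

def pseudo_current_density(bz, dx=1.0, dy=1.0):
    """Hosaka-Cohen style pseudo-current map from the normal field component:
    c = grad(Bz) x e_z, i.e. c_x = dBz/dy, c_y = -dBz/dx."""
    # np.gradient returns derivatives along axis 0 (y) then axis 1 (x).
    dbz_dy, dbz_dx = np.gradient(bz, dy, dx)
    cx = dbz_dy
    cy = -dbz_dx
    magnitude = np.hypot(cx, cy)
    return cx, cy, magnitude
```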

Results

Examples in magnetocardiography, magnetoencephalography and magnetoneurography demonstrate the usefulness of this method.

Conclusion

Hardware realizations of the HC transformation and of some similar transformations are discussed, which could advantageously support cross-platform comparability of biomagnetic measurements.

15.

Introduction

Untargeted metabolomics is a powerful tool for biological discoveries. To analyze the complex raw data, significant advances in computational approaches have been made, yet it is not clear how exhaustive and reliable the data analysis results are.

Objectives

Assessment of the quality of raw data processing in untargeted metabolomics.

Methods

Five published untargeted metabolomics studies were reanalyzed.

Results

Omissions of at least 50 relevant compounds from the original results as well as examples of representative mistakes were reported for each study.

Conclusion

Incomplete raw data processing shows unexplored potential of current and legacy data.

16.

Introduction

Intrahepatic cholestasis of pregnancy (ICP) is a common maternal liver disease; development can result in devastating consequences, including sudden fetal death and stillbirth. Currently, recognition of ICP only occurs following onset of clinical symptoms.

Objective

Investigate the maternal hair metabolome for predictive biomarkers of ICP.

Methods

The maternal hair metabolome (gestational age of sampling between 17 and 41 weeks) of 38 Chinese women with ICP and 46 pregnant controls was analysed using gas chromatography–mass spectrometry.

Results

Of 105 metabolites detected in hair, none were significantly associated with ICP.

Conclusion

Hair samples represent accumulative environmental exposure over time. Samples collected at the onset of ICP did not reveal any metabolic shifts, suggesting rapid development of the disease.

17.

Introduction

Quantification of tetrahydrofolates (THFs), important metabolites in the Wood–Ljungdahl pathway (WLP) of acetogens, is challenging given their sensitivity to oxygen.

Objective

To develop a simple anaerobic protocol to enable reliable THFs quantification from bioreactors.

Methods

Anaerobic cultures were mixed with anaerobic acetonitrile for extraction. Targeted LC–MS/MS was used for quantification.

Results

Tetrahydrofolates can only be quantified if sampled anaerobically. THF levels showed a strong correlation to acetyl-CoA, the end product of the WLP.

Conclusion

Our method is useful for relative quantification of THFs across different growth conditions. Absolute quantification of THFs requires the use of labelled standards.

18.

Background

Automated image analysis, measurements of virtual slides, and open access electronic measurement user systems require standardized image quality assessment in tissue-based diagnosis.

Aims

To describe the theoretical background and the practical experiences in automated image quality estimation of colour images acquired from histological slides.

Theory, material and measurements

Digital images acquired from histological slides should present textures and objects that permit automated image information analysis. The quality of digitized images can be estimated by spatially independent and local filter operations that investigate homogeneous brightness, low peak-to-noise ratio (full range of available grey values), maximum gradients, equalized grey-value distribution, and the existence of grey-value thresholds. Transformation of the red-green-blue (RGB) space into the hue-saturation-intensity (HSI) space permits the detection of colour and intensity maxima/minima. The feature distance of the original image to its standardized counterpart is an appropriate measure to quantify the actual image quality. These measures have been applied to a series of H&E stained, fluorescent (DAPI, Texas Red, FITC), and immunohistochemically stained (PAP, DAB) slides. More than 5,000 slides have been measured and partly analyzed in a time series.
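As a minimal sketch of the colour-space step mentioned above, the following converts an RGB image to HSI using the standard geometric definition; the paper's exact implementation may differ.

```python
import numpy as np

def rgb_to_hsi(rgb):
    """Convert an RGB image (floats in [0, 1]) to HSI; H in radians,
    S and I in [0, 1], following the standard geometric definition."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    eps = 1e-8
    intensity = (r + g + b) / 3.0
    saturation = 1.0 - np.minimum(np.minimum(r, g), b) / (intensity + eps)
    num = 0.5 * ((r - g) + (r - b))
    den = np.sqrt((r - g) ** 2 + (r - b) * (g - b)) + eps
    theta = np.arccos(np.clip(num / den, -1.0, 1.0))
    hue = np.where(b > g, 2 * np.pi - theta, theta)
    return np.stack([hue, saturation, intensity], axis=-1)
```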

Results

Analysis of H&E stained slides revealed low shading corrections (10%) and moderate grey-value standardization (10 – 20%) in the majority of cases. Immunohistochemically stained slides displayed greater shading and grey-value correction. Fluorescence-stained slides often exhibited high brightness. Images requiring only low standardization corrections possess at least 5 different statistically significant thresholds, which are useful for object segmentation. Fluorescent images of good quality possess only a single intensity maximum, in contrast to good images obtained from H&E stained slides, which present with 2 – 3 intensity maxima.

Conclusion

Evaluation of image quality and creation of formally standardized images should be performed prior to automatic analysis of digital images acquired from histological slides. Spatially dependent and local filter operations, as well as analysis of the RGB and HSI spaces, are appropriate methods to reproduce the evaluated formal image quality.

19.

Introduction

It is difficult to elucidate the metabolic and regulatory factors causing lipidome perturbations.

Objectives

This work simplifies this process.

Methods

A method has been developed to query an online holistic lipid metabolic network (of 7923 metabolites) to extract the pathways that connect the input list of lipids.
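The authors' online tool is not shown here; as a rough sketch of the underlying idea, the following uses networkx to extract the shortest paths connecting each pair of input lipids in a metabolite graph (the node names and the path-length cutoff are hypothetical).

```python
import itertools
import networkx as nx

def connecting_pathways(network, input_lipids, cutoff=6):
    """For each pair of input lipids, collect the shortest path linking them
    in the metabolic network (edges = reactions), up to `cutoff` steps."""
    pathways = {}
    for a, b in itertools.combinations(input_lipids, 2):
        if nx.has_path(network, a, b):
            path = nx.shortest_path(network, a, b)
            if len(path) - 1 <= cutoff:
                pathways[(a, b)] = path
    return pathways

# Toy network for illustration (lipid names are hypothetical).
g = nx.Graph()
g.add_edges_from([("PC(34:1)", "LPC(16:0)"), ("LPC(16:0)", "FA(16:0)"),
                  ("FA(16:0)", "PA(34:1)"), ("PA(34:1)", "DG(34:1)")])
print(connecting_pathways(g, ["PC(34:1)", "DG(34:1)"]))
```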

Results

The output enables pathway visualisation and the querying of other databases to identify potential regulators. When the method was used to study a plasma lipidome dataset from polycystic ovary syndrome, 14 enzymes were identified, of which 3 are linked to ELAVL1, an mRNA stabiliser.

Conclusion

This method provides a simplified approach to identifying potential regulators causing lipid-profile perturbations.

20.

Introduction

Data processing is one of the biggest problems in metabolomics, given the high number of samples analyzed and the need for multiple software packages at each step of the processing workflow.

Objectives

To merge the steps required for metabolomics data processing into a single platform.

Methods

KniMet is a workflow for the processing of mass spectrometry-metabolomics data based on the KNIME Analytics platform.

Results

The approach includes key steps to follow in metabolomics data processing: feature filtering, missing value imputation, normalization, batch correction and annotation.
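KniMet itself is built on the KNIME Analytics Platform; purely as an illustration of two of the listed steps, the following pandas sketch filters features by missing-value rate, imputes the remainder, and normalizes each sample to its total intensity (the table layout and thresholds are assumptions, not KniMet defaults).

```python
import numpy as np
import pandas as pd

def filter_impute_normalize(features, max_missing=0.5):
    """Illustrative pipeline on a samples-by-features intensity table:
    drop features missing in more than `max_missing` of samples, impute the
    rest with half the feature minimum, then normalize each sample (row) to
    its total intensity."""
    keep = features.isna().mean() <= max_missing
    filtered = features.loc[:, keep]
    imputed = filtered.apply(lambda col: col.fillna(col.min() / 2.0))
    normalized = imputed.div(imputed.sum(axis=1), axis=0)
    return normalized

# Toy table: 4 samples x 3 features.
table = pd.DataFrame({"m1": [1.0, np.nan, 2.0, 1.5],
                      "m2": [np.nan, np.nan, np.nan, 0.5],
                      "m3": [3.0, 2.5, 4.0, 3.5]})
print(filter_impute_normalize(table))
```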

Conclusion

KniMet provides the user with a local, modular and customizable workflow for the processing of both GC–MS and LC–MS open profiling data.
