Similar Articles
20 similar articles found (search time: 31 ms)
1.
Layer‐by‐layer cell printing is useful in mimicking layered tissue structures inside the human body and holds great promise as a tool for tissue engineering, regenerative medicine, and drug discovery. However, imaging human cells cultured in multiple hydrogel layers of 3D‐printed tissue constructs is challenging because the cells do not lie in a single focal plane. Although confocal microscopy could potentially solve this issue, it compromises throughput, a key factor in rapidly screening drug efficacy and toxicity in the pharmaceutical industry. With epifluorescence microscopy, throughput can be maintained, but at the cost of blurred cell images from printed tissue constructs. To rapidly acquire in‐focus cell images from bioprinted tissues using an epifluorescence microscope, we created two layers of Hep3B human hepatoma cells by printing green and red fluorescently labeled Hep3B cells encapsulated in two alginate layers in a microwell chip. In‐focus fluorescent cell images were obtained in high throughput using automated epifluorescence microscopy coupled with image analysis algorithms, including three deconvolution methods combined with three kernel estimation methods, for a total of nine deconvolution paths. A combination of the Inter‐Level Intra‐Level Deconvolution (ILILD) algorithm and Richardson‐Lucy (RL) kernel estimation proved highly useful in bringing out‐of‐focus cell images into focus, rapidly yielding more sensitive and accurate fluorescence readings from cells in different layers. © 2017 American Institute of Chemical Engineers Biotechnol. Prog., 34:445–454, 2018
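The Richardson‐Lucy method named in the abstract can be sketched in one dimension as follows. This is an illustrative toy only, assuming a known, centred, noiseless point spread function; it does not reproduce the paper's ILILD pipeline or kernel estimation, and the signal values are invented.

```python
# Minimal 1-D Richardson-Lucy deconvolution sketch (illustrative only).

def convolve(signal, kernel):
    """'Same'-size linear convolution with a centred kernel."""
    n, k = len(signal), len(kernel)
    half = k // 2
    out = [0.0] * n
    for i in range(n):
        s = 0.0
        for j in range(k):
            idx = i + j - half
            if 0 <= idx < n:
                s += signal[idx] * kernel[j]
        out[i] = s
    return out

def richardson_lucy(observed, psf, iterations=200):
    """Iteratively sharpen `observed` given a known point spread function."""
    psf_mirror = psf[::-1]
    estimate = [1.0] * len(observed)          # flat initial guess
    for _ in range(iterations):
        blurred = convolve(estimate, psf)
        ratio = [o / max(b, 1e-12) for o, b in zip(observed, blurred)]
        correction = convolve(ratio, psf_mirror)
        estimate = [e * c for e, c in zip(estimate, correction)]
    return estimate

# A sharp two-peak "cell" signal blurred by a simple 3-tap PSF:
truth = [0, 0, 4, 0, 0, 0, 6, 0, 0]
psf = [0.25, 0.5, 0.25]
observed = convolve(truth, psf)
restored = richardson_lucy(observed, psf)
```

After enough iterations the re-blurred estimate reproduces the observed data and the peaks are sharpened back toward the original spikes.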

2.
A stochastic approximation algorithm is proposed for recursive estimation of the hyperparameters characterizing, in a population, the probability density function of the parameters of a statistical model. For a given population model defined by a parametric model of a biological process, an error model, and a class of densities on the set of individual parameters, this algorithm provides a sequence of estimates from a sequence of individuals' observation vectors. Convergence conditions are verified for a class of population models that includes typical pharmacokinetic applications. The method is implemented for estimating population pharmacokinetic parameters from multiple-dose data. Its estimation capabilities are evaluated on simulated data and compared with those of a classical method in population pharmacokinetics, the first-order method (NONMEM).

3.
The primary goal of “in vitro–in vivo correlation” (IVIVC) is the reliable prediction of the in vivo serum concentration‐time course based on in vitro drug dissolution or release profiles. IVIVC methods are particularly appropriate for formulations that are released over an extended period of time or with a lag in absorption, and they may support approving a change in the formulation of a drug without additional bioequivalence trials in human subjects. Most current IVIVC models are assessed using frequentist methods, such as linear regression, based on averaged data, and entail complex and potentially unstable mathematical deconvolution. The proposed IVIVC approach includes (a) a nonlinear mixed‐effects model for the in vitro release data; (b) a population pharmacokinetic (PK) compartment model for the in vivo immediate release (IR) data; and (c) a system of ordinary differential equations (ODEs), containing submodels (a) and (b), which approximates and predicts the in vivo controlled release (CR) data. The innovation in this paper consists of splitting the parameter space between submodels (a) and (b) versus (c). The uncertainty in these parameters is then accounted for in a Bayesian framework; that is, estimates from the first two submodels serve as priors for the Bayesian hierarchical third submodel. As such, the Bayesian method ensures a natural integration and transfer of knowledge between the various sources of information, balancing possible differences in sample size and parameter uncertainty between in vitro and in vivo studies. Consequently, it is a very flexible approach that yields results for a broad range of data situations. The application of the method is demonstrated for a transdermal patch (TD).
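A minimal sketch of the kind of compartmental ODE that a PK submodel like (b) might contain: a one-compartment model with first-order absorption, integrated by forward Euler. All parameter values (`dose`, `ka`, `ke`, `V`) are hypothetical and are not taken from the paper.

```python
# Hypothetical one-compartment PK model with first-order absorption,
# integrated by forward Euler; parameter values are invented.

def simulate_pk(dose, ka, ke, V, t_end=24.0, dt=0.01):
    """Return (times, concentrations) for a single oral dose."""
    a_gut, a_central = dose, 0.0      # drug amounts in gut / central compartment
    times, conc = [0.0], [0.0]
    t = 0.0
    while t < t_end:
        absorbed = ka * a_gut * dt     # first-order absorption
        eliminated = ke * a_central * dt
        a_gut -= absorbed
        a_central += absorbed - eliminated
        t += dt
        times.append(t)
        conc.append(a_central / V)
    return times, conc

times, conc = simulate_pk(dose=100.0, ka=1.2, ke=0.15, V=30.0)
cmax = max(conc)
tmax = times[conc.index(cmax)]
```

The analytic solution C(t) = D·ka/(V(ka−ke))·(e^(−ke·t) − e^(−ka·t)) gives tmax = ln(ka/ke)/(ka−ke) ≈ 2 h for these values, which the numerical curve reproduces.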

4.
The quantitative differentiation of liposome-encapsulated and non-encapsulated drug tissue concentrations is desirable, since efficacy and toxicity are related only to the level of non-encapsulated drug. However, such separate concentration profiles in tissues have not yet been reported, owing to a lack of analytical methodology. The encapsulation of prodrugs such as prednisolone phosphate (PP) in liposomes offers new analytical opportunities. Instantaneous dephosphorylation of PP into prednisolone (P) by phosphatases after its release from the liposome in vivo makes it possible to differentiate between the encapsulated and the non-encapsulated drug for such preparations of liposomal PP: PP represents the encapsulated drug, while P represents the non-encapsulated drug. In the study described here, the instantaneous dephosphorylation of PP by murine liver and kidney phosphatases was verified by incubating PP in liver and kidney homogenates, followed by estimation of the dephosphorylation rate constants k and the dephosphorylation times of the expected maximal in vivo non-encapsulated drug concentrations. In vitro, PP was rapidly converted into P in the presence of homogenate from the excretory organs. The calculated values of k showed that the liver contains more active sites per gram of tissue than the kidneys; however, dephosphorylation of PP by these active sites is slower than in the kidneys. Compared with other pharmacokinetic processes of P, the estimated dephosphorylation times of the expected maximal in vivo non-encapsulated drug concentrations in the liver and kidneys can be considered instantaneous.
This enables the separate determination of encapsulated and non-encapsulated drug concentrations in the excretory organs after administration of liposomal PP in mice, generating the first pharmacokinetic profile of a liposomal preparation in which the in vivo encapsulated and free drug tissue concentrations are measured separately. This can also yield important insights into the pharmacokinetics of liposomal formulations in general.
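A first-order rate constant k of the kind estimated here can be recovered from incubation data by a log-linear least-squares fit, since ln C(t) = ln C₀ − k·t. The concentration values below are synthetic, not the murine homogenate data.

```python
# Estimating a first-order dephosphorylation rate constant k from
# concentration-time data by a log-linear fit (synthetic data).
import math

times = [0, 5, 10, 20, 30]                   # minutes
k_true = 0.12                                # per minute (hypothetical)
conc = [10.0 * math.exp(-k_true * t) for t in times]

# slope of ln(C) versus t equals -k
logs = [math.log(c) for c in conc]
n = len(times)
mean_t = sum(times) / n
mean_y = sum(logs) / n
slope = sum((t - mean_t) * (y - mean_y) for t, y in zip(times, logs)) \
        / sum((t - mean_t) ** 2 for t in times)
k_est = -slope
half_life = math.log(2) / k_est              # t1/2 = ln 2 / k
```

With noiseless first-order data the fit recovers k exactly; with real homogenate data the same regression gives the least-squares estimate.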

5.
A method for the quantitative determination of azithromycin by HPLC with mass spectrometric detection was developed; the detection limit is 0.5 ng/ml. The method was used to study the pharmacokinetics and bioequivalence of Zi-Factor (250 mg azithromycin capsules made by Veropharm, Russia) versus a reference drug. The pharmacokinetic study was performed as an open, randomized, crossover trial in 18 volunteers. The pharmacokinetic parameters required for estimating bioequivalence were calculated, and their statistical analysis demonstrated the bioequivalence of Zi-Factor and the reference drug.

6.
Colocalization of dihydropyridine (DHPR) and ryanodine (RyR) receptors, a key determinant of Ca2+-induced Ca2+ release, was previously estimated in 3-, 6-, 10-, and 20-day-old rabbit ventricular myocytes by immunocytochemistry and confocal microscopy. We now report the effects of deconvolution (using a maximum-likelihood estimation algorithm) on the calculation of colocalization indexes. Clusters of DHPR and RyR can be accurately represented as point sources of fluorescence, which enables a model of their relative distributions to be constructed using images of point spread functions to simulate their fluorescence inside a cell. This model was used to investigate the effects of deconvolution on colocalization as a function of separation distance. Deconvolution resulted in significant improvements in both axial and transverse resolution, producing significant increases in clarity. Comparisons of intensity profiles (full-width half-maximum) pre- and post-deconvolution showed decreased dispersion of the fluorescent signal and a corresponding decrease in false colocalization as determined by fluorescence modeling. This hypothesis was extended to previously collected physiological data. The number of colocalized voxels was quantified after deconvolution, and the degree of colocalization of DHPR with RyR decreased significantly after deconvolution in all age groups: from 3 days old (62 +/- 2% before deconvolution, 43 +/- 3% after) to 20 days old (79 +/- 1% before, 63 +/- 2% after). The data demonstrate that confocal images should be deconvolved before any quantitative analysis, such as colocalization index determination, to minimize the detrimental effects of out-of-focus light in coincident voxels.
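A voxel-level colocalization index of the kind quantified here can be illustrated as a Manders-style overlap fraction: the share of voxels positive for one channel that are also positive for the other. The boolean masks below are toy 1-D stand-ins for thresholded image volumes, not the paper's data.

```python
# Manders-style voxel colocalization fraction on toy binary masks.

def colocalization_fraction(mask_a, mask_b):
    """Fraction of voxels positive in mask_a that are also positive in mask_b."""
    both = sum(1 for a, b in zip(mask_a, mask_b) if a and b)
    total_a = sum(1 for a in mask_a if a)
    return both / total_a if total_a else 0.0

# Flattened 1-D stand-ins for two thresholded voxel volumes:
dhpr = [1, 1, 0, 1, 0, 1, 0, 0]
ryr  = [1, 0, 0, 1, 1, 1, 0, 0]
frac = colocalization_fraction(dhpr, ryr)   # 3 of 4 DHPR voxels overlap RyR
```

Deconvolving before thresholding shrinks the blurred halos around point sources, which is why the measured fraction drops after deconvolution.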

7.
The informational content of RNA sequencing is currently far from completely explored. Most analyses focus on processing tables of counts or on isoform deconvolution via exon junctions. This article presents a comparison of several techniques that can be used to estimate differential expression of exons or small genomic regions of expression based on the shapes of their coverage functions. The problem is defined as finding the differentially expressed exons between two samples using local expression profile normalization and statistical measures to spot differences between two profile shapes. Initial experiments were performed on synthetic data and on real data modified with synthetically introduced differential patterns. Then 160 pipelines (5 types of generator × 4 normalizations × 8 difference measures) were compared, and the best analysis pipelines were selected based on the linearity of the differential expression estimation and the area under the ROC curve. These platform-independent techniques have been implemented in the Bioconductor package rnaSeqMap. They point out exons with differential expression or internal splicing even when read counts alone may not show it. Areas of application include searches for significant differences, splicing identification algorithms, and finding suitable regions for qPCR primers.

8.
Statistical methods used in the analysis of point patterns are discussed, with special attention to their application in ecological research. Some new procedures are presented that seem better suited to the needs of the ecologist. It is pointed out that patterns can usually be described in terms of an appropriate trend surface as well as in terms of mutual interactions; this circumstance restricts the value of point-pattern analysis in ecological research for tracing the mechanisms connected with the distribution of individuals. After a discussion of the current sampling designs for point patterns, the estimation of the local intensity is treated. Although the so-called distance method has received considerable attention in this respect, it is argued that this method is not very appropriate for the purpose. For two sampling designs, it is illustrated how to estimate functions that describe density variation in the field. Further, a procedure is proposed that estimates the covariance curve as well as the total amount of interaction in the pattern, and the relation of this statistic to the covariance curve is pointed out. An improvement of the well-known Greig-Smith method, i.e. the estimation of the variance curve, is also proposed. The estimation procedures are illustrated by three examples from the field (dispersal patterns of barnacles, anemones and glassworts, all belonging to low-structured communities), which are presented in the Appendix. Monte Carlo methods are used to study the properties of some of the statistical procedures. This paper formed part of a Ph.D. thesis, State Univ. Leiden, May 1977.

9.
The purpose of this study was to develop a once-daily sustained release tablet of aceclofenac using chitosan and an enteric coating polymer (hydroxypropyl methylcellulose phthalate or cellulose acetate phthalate). Overall sustained release for 24 h was achieved by preparing a double-layer tablet in which the immediate release layer was formulated for prompt release of the drug and the sustained release layer was designed to achieve prolonged drug release. Preformulation studies, such as IR spectroscopy and differential scanning calorimetry, showed the absence of drug–excipient interactions. The tablets were within the permissible limits for various physicochemical parameters. Scanning electron microscopy was used to visualize the surface morphology of the tablets and to confirm the drug release mechanisms. Good equivalence in the drug release profile was observed when the release pattern of the tablet containing chitosan and hydroxypropyl methylcellulose phthalate (M-7) was compared with that of a marketed tablet. The optimized tablets were stable under accelerated storage conditions for 6 months with respect to drug content and physical appearance. Pharmacokinetic studies in human volunteers showed that the optimized tablet (M-7) exhibited no difference in in vivo drug release compared with the marketed tablet, and no significant difference between the pharmacokinetic parameters of M-7 and the marketed tablet was observed (p > 0.05; 95% confidence intervals). However, large-scale clinical studies and long-term, extensive stability studies under different conditions are required to confirm these results.

10.
Linear mixed‐effects models are frequently used for estimating quantitative genetic parameters, including the heritability, as well as the repeatability, of traits. Heritability acts as a filter that determines how efficiently phenotypic selection translates into evolutionary change, whereas repeatability informs us about the individual consistency of phenotypic traits. Because these are quantities of biological interest, it is important that the denominator (the phenotypic variance, in both cases) reflects the amount of phenotypic variance in the relevant ecological setting. The current practice of quantifying heritabilities and repeatabilities from mixed‐effects models frequently deprives the denominator of variance explained by fixed effects (often leading to upward bias of heritabilities and repeatabilities), and it has even been suggested to omit fixed effects when estimating heritabilities. We advocate the alternative of fitting models incorporating all relevant effects while including the variance explained by fixed effects in the estimation of the phenotypic variance. The approach is easily implemented and allows the estimation of phenotypic variance to be optimized, for example by excluding variance arising from experimental design effects while still including all biologically relevant sources of variation. We address the estimation and interpretation of heritabilities in situations in which potential covariates are themselves heritable traits of the organism. Furthermore, we discuss complications that arise in generalized and nonlinear mixed models with fixed effects. In these cases, the variance parameters on the data scale depend on the location of the intercept and hence on the scaling of the fixed effects. Integration over the biologically relevant range of fixed effects offers a preferred solution in those situations.
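The denominator effect described above can be shown with a toy calculation: dropping the fixed-effect variance from the phenotypic variance inflates the heritability estimate. The variance components below are invented numbers for illustration only.

```python
# Toy illustration: heritability h2 = V_additive / V_phenotypic, with and
# without fixed-effect variance in the denominator (all values invented).

V_additive = 0.30   # additive genetic variance
V_fixed    = 0.20   # variance explained by fixed effects (e.g. sex, age)
V_residual = 0.50

# Recommended: phenotypic variance includes fixed-effect variance.
h2_full = V_additive / (V_additive + V_fixed + V_residual)       # 0.30

# Common practice: denominator deprived of fixed-effect variance.
h2_deprived = V_additive / (V_additive + V_residual)             # 0.375
```

With these numbers the deprived estimate is biased upward by a quarter of its value, exactly the pattern the abstract warns about.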

11.
Microarrays provide a valuable tool for the quantification of gene expression. Usually, however, there is only a limited number of replicates, leading to unsatisfactory variance estimates in a gene‐wise mixed model analysis. As thousands of genes are available, it is desirable to combine information across genes. When more than two tissue types or treatments are to be compared, it may be advisable to treat the array effect as random; information between arrays can then be recovered, which can increase accuracy in estimation. We propose a method of variance component estimation across genes for a linear mixed model with two random effects; the method may be extended to models with more than two random effects. We assume that the variance components follow a log‐normal distribution. Assuming that the sums of squares from the gene‐wise analysis, given the true variance components, follow a scaled χ2‐distribution, we adopt an empirical Bayes approach: the variance components are estimated by the expectation of their posterior distribution. The new method is evaluated in a simulation study. Differentially expressed genes are more likely to be detected by tests based on these variance estimates than by tests based on gene‐wise variance estimates; this effect is most visible in studies with small array numbers. Analyzing a real data set on maize endosperm, the method is shown to work well. (© 2008 WILEY‐VCH Verlag GmbH & Co. KGaA, Weinheim)

12.
In this paper the situation of extra population heterogeneity is discussed from an analysis-of-variance point of view. We first provide a non‐iterative way of estimating the variance of the heterogeneity distribution, without estimating the heterogeneity distribution itself, for Poisson and binomial counts. The consequences of the presence of heterogeneity for estimation of the mean are discussed. We show that if the homogeneity assumption holds, the pooled mean is optimal, while in the presence of strong heterogeneity the simple (arithmetic) mean is an optimal estimator of the mean SMR or mean proportion. These results lead to the problem of finding an optimal estimator for situations not represented by these two extreme cases, and we propose an iterative solution to this problem. Applications of these findings are illustrated with examples from various areas.
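For Poisson counts, a non-iterative moment estimator of the heterogeneity variance follows from the mixture identity Var(X) = E[λ] + Var(λ), so the heterogeneity variance can be estimated as the sample variance minus the sample mean. This sketch mirrors the idea, not the paper's exact formulae, and the counts are invented.

```python
# Non-iterative moment estimator of heterogeneity variance for Poisson
# counts: tau2 ≈ s^2 - xbar, truncated at zero (invented counts).

counts = [2, 9, 4, 11, 3, 8, 5, 10]
n = len(counts)
xbar = sum(counts) / n
s2 = sum((x - xbar) ** 2 for x in counts) / (n - 1)   # sample variance

# Under homogeneity s2 ≈ xbar; the excess is the heterogeneity variance.
tau2_hat = max(s2 - xbar, 0.0)
```

A value of `tau2_hat` well above zero signals extra-Poisson variation, the situation in which the arithmetic mean outperforms the pooled mean.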

13.
Deconvolution is an essential step of image processing that aims to compensate for the image blur caused by the microscope's point spread function. With many existing deconvolution methods, it is challenging to choose the method and its parameters most appropriate for particular image data at hand. To facilitate this task, we developed DeconvTest: an open‐source Python‐based framework for generating synthetic microscopy images, deconvolving them with different algorithms, and quantifying reconstruction errors. In contrast to existing software, DeconvTest combines all components required to analyze deconvolution performance in a systematic, high‐throughput and quantitative manner. We demonstrate the power of the framework by using it to identify the optimal deconvolution settings for synthetic and real image data. Based on this, we provide a guideline for (a) choosing optimal values of deconvolution parameters for image data at hand and (b) optimizing imaging conditions for best results in combination with subsequent image deconvolution.
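The benchmarking loop that DeconvTest automates can be miniaturized as: synthesize a known ground truth, blur it with a known PSF, then score candidate reconstructions with a quantitative error metric. This sketch uses RMSE in 1-D and does not call the actual DeconvTest API; all values are illustrative.

```python
# Miniature deconvolution benchmark: known truth, known blur, RMSE score.
import math

def blur(signal, kernel):
    """'Same'-size convolution with a centred kernel."""
    half = len(kernel) // 2
    return [sum(signal[i + j - half] * kernel[j]
                for j in range(len(kernel))
                if 0 <= i + j - half < len(signal))
            for i in range(len(signal))]

def rmse(a, b):
    """Root-mean-square reconstruction error against ground truth."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

truth = [0, 0, 5, 0, 0, 3, 0, 0]
psf = [0.25, 0.5, 0.25]
blurred = blur(truth, psf)

# Any candidate reconstruction can now be scored against the ground truth:
error_blurred = rmse(blurred, truth)     # baseline: no deconvolution at all
error_perfect = rmse(truth, truth)       # ideal deconvolution
```

Sweeping deconvolution algorithms and parameter values and recording the score for each is exactly the systematic comparison the framework performs at scale on synthetic 3-D images.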

14.
Repeatability (more precisely the common measure of repeatability, the intra‐class correlation coefficient, ICC) is an important index for quantifying the accuracy of measurements and the constancy of phenotypes. It is the proportion of phenotypic variation that can be attributed to between‐subject (or between‐group) variation. As a consequence, the non‐repeatable fraction of phenotypic variation is the sum of measurement error and phenotypic flexibility. There are several ways to estimate repeatability for Gaussian data, but there are no formal agreements on how repeatability should be calculated for non‐Gaussian data (e.g. binary, proportion and count data). In addition to point estimates, appropriate uncertainty estimates (standard errors and confidence intervals) and statistical significance for repeatability estimates are required regardless of the types of data. We review the methods for calculating repeatability and the associated statistics for Gaussian and non‐Gaussian data. For Gaussian data, we present three common approaches for estimating repeatability: correlation‐based, analysis of variance (ANOVA)‐based and linear mixed‐effects model (LMM)‐based methods, while for non‐Gaussian data, we focus on generalised linear mixed‐effects models (GLMM) that allow the estimation of repeatability on the original and on the underlying latent scale. We also address a number of methods for calculating standard errors, confidence intervals and statistical significance; the most accurate and recommended methods are parametric bootstrapping, randomisation tests and Bayesian approaches. We advocate the use of LMM‐ and GLMM‐based approaches mainly because of the ease with which confounding variables can be controlled for. Furthermore, we compare two types of repeatability (ordinary repeatability and extrapolated repeatability) in relation to narrow‐sense heritability. 
This review serves as a collection of guidelines and recommendations for biologists to calculate repeatability and heritability from both Gaussian and non‐Gaussian data.
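For Gaussian data, the ANOVA-based repeatability mentioned above is the intra-class correlation ICC = (MSB − MSW) / (MSB + (n − 1)·MSW), with n measurements per individual. A sketch on fabricated repeated measurements:

```python
# ICC from a balanced one-way ANOVA (fabricated repeated measurements).

groups = [                       # repeated measurements per individual
    [10.1, 10.4, 9.9],
    [12.0, 12.3, 11.8],
    [8.9, 9.2, 9.0],
    [11.1, 10.8, 11.3],
]
k = len(groups)                  # number of individuals
n = len(groups[0])               # measurements per individual
grand = sum(sum(g) for g in groups) / (k * n)

ssb = n * sum((sum(g) / n - grand) ** 2 for g in groups)        # between
ssw = sum((x - sum(g) / n) ** 2 for g in groups for x in g)     # within
msb = ssb / (k - 1)
msw = ssw / (k * (n - 1))

icc = (msb - msw) / (msb + (n - 1) * msw)
```

Because these toy individuals differ far more between than within, the ICC comes out close to 1; the standard errors and confidence intervals discussed in the review would be layered on top (e.g. by parametric bootstrapping).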

15.
Knapp SJ, Bridges WC Jr, Yang MH. Genetics 1989, 121(4):891-898
Statistical methods have not been described for comparing estimates of family-mean heritability (H) or expected selection response (R), nor have consistently valid methods been described for estimating R intervals. Nonparametric methods, e.g., delete-one jackknifing, may be used to estimate variances, intervals, and hypothesis test statistics in estimation problems where parametric methods are unsuitable, nonrobust, or undefinable. Our objective was to evaluate normal-approximation jackknife interval estimators for H and R using Monte Carlo simulation. Simulations were done using normally distributed within-family effects and normally, uniformly, and exponentially distributed between-family effects. Realized coverage probabilities for jackknife interval (2) and parametric interval (5) for H were not significantly different from stated probabilities when between-family effects were normally distributed. Coverages for jackknife intervals (3) and (4) for R were not significantly different from stated coverages when between-family effects were normally distributed. Coverages for interval (3) for R were occasionally significantly less than stated when between-family effects were uniformly or exponentially distributed. Coverages for interval (2) for H were occasionally significantly less than stated when between-family effects were exponentially distributed. Thus, intervals (3) and (4) for R and (2) for H were robust. Means of analysis of variance estimates of R were often significantly less than parametric values when the number of families evaluated was 60 or less. Means of analysis of variance estimates of H were consistently significantly less than parametric values. Means of jackknife estimates of H calculated from log transformed point estimates and R calculated from untransformed or log transformed point estimates were not significantly different from parametric values. Thus, jackknife estimators of H and R were unbiased. 
Delete-one jackknifing is a robust, versatile, and effective statistical method when applied to estimation problems involving variance functions. Jackknifing is especially valuable in hypothesis-testing problems where the objective is to compare estimates from different populations.
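Delete-one jackknifing as used here can be sketched for a toy variance-ratio statistic. The family means, within-family variance, and the statistic itself are invented stand-ins; the paper's actual H and R estimators are not reproduced.

```python
# Delete-one jackknife variance for a toy heritability-like variance ratio.
import math

def statistic(family_means, within_var):
    """Toy H-like ratio: between-family variance over total variance."""
    n = len(family_means)
    m = sum(family_means) / n
    between = sum((x - m) ** 2 for x in family_means) / (n - 1)
    return between / (between + within_var)

family_means = [4.2, 5.1, 3.8, 6.0, 5.5, 4.9, 4.4, 5.8]   # invented
within_var = 0.6                                           # invented

theta_full = statistic(family_means, within_var)
n = len(family_means)

# Leave-one-out replicates: recompute the statistic with each family deleted.
thetas = [statistic(family_means[:i] + family_means[i + 1:], within_var)
          for i in range(n)]
theta_bar = sum(thetas) / n
jack_var = (n - 1) / n * sum((t - theta_bar) ** 2 for t in thetas)
jack_se = math.sqrt(jack_var)
```

The jackknife standard error supports the normal-approximation intervals and two-population comparisons the abstract evaluates.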

16.
The present study is an extension of the investigations of Grieszbach and Schack (1993), where recursive estimators of the quantile were introduced. Attention is focused on the statistical properties and on the control of these estimators, in order to reduce their variance and improve their capability of adaptation. Using methods of stochastic approximation, several control algorithms have been developed, covering both consistent and adaptive estimation. Owing to the recursive computation formula, the estimators are suitable for the analysis of large data sets and of sets whose elements are obtained sequentially. Application examples from the analysis of EEG records are presented, in which quantiles are used as threshold values.
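A basic recursive quantile estimator of the stochastic-approximation type updates a running estimate from each new observation, so no data need be stored. The step-size constant `c` is a tuning choice and the "EEG-like" data are simulated Gaussians; this sketches the general recursion, not the paper's specific control algorithms.

```python
# Recursive (stochastic-approximation) quantile estimation:
# q <- q + (c / n) * (p - 1{x_n <= q}), one update per incoming sample.
import random

def recursive_quantile(samples, p, c=5.0):
    q = 0.0
    for n, x in enumerate(samples, start=1):
        step = c / n                                  # decreasing gain
        q += step * (p - (1.0 if x <= q else 0.0))    # nudge up or down
    return q

random.seed(42)
data = [random.gauss(0.0, 1.0) for _ in range(20000)]
q90 = recursive_quantile(data, 0.90)   # true N(0,1) 0.9-quantile is about 1.28
```

Replacing the 1/n gain with a small constant gain gives the adaptive variant, which can track a slowly drifting quantile, which is useful for threshold values in nonstationary EEG records.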

17.
Phenotypic plasticity is a central topic in ecology and evolution. Individuals may differ in their degree of plasticity (individual‐by‐environment interaction, I × E), which has implications for the capacity of populations to respond to selection. Random regression models (RRMs) are a popular tool for studying I × E in behavioural or life‐history traits, yet evidence for I × E is mixed, differing between species, populations, and even between studies of the same population. One important source of discrepancies between studies is the treatment of heterogeneity in residual variance (heteroscedasticity). To date, there seems to be no collective awareness among ecologists of its influence on the estimation of I × E, nor a consensus on how best to model it. We performed RRMs with differing residual variance structures on simulated data with varying degrees of heteroscedasticity and plasticity, sample size, and environmental variability to test how RRMs perform under each scenario. The residual structure in the RRMs affected the precision of estimates of simulated I × E as well as statistical power, with a substantial lack of precision and high false‐positive rates when sample size, environmental variability, and plasticity were small. We show that model comparison using information criteria can be used to choose among residual structures, and we reinforce this point by analysis of real data from two study populations of great tits (Parus major). We provide guidelines for biologists studying I × E that, ultimately, should lead to a reduction in bias in the literature concerning the statistical evidence and reported magnitude of variation in plasticity.

18.
The purpose of this study was to develop a once-daily sustained release tablet of aceclofenac using chitosan and an enteric coating polymer (hydroxypropyl methylcellulose phthalate or cellulose acetate phthalate). Overall sustained release for 24 h was achieved by preparing a double-layer tablet in which the immediate release layer was formulated for prompt release of the drug and the sustained release layer was designed to achieve prolonged drug release. Preformulation studies, such as IR spectroscopy and differential scanning calorimetry, showed the absence of drug–excipient interactions. The tablets were within the permissible limits for various physicochemical parameters. Scanning electron microscopy was used to visualize the surface morphology of the tablets and to confirm the drug release mechanisms. Good equivalence in the drug release profile was observed when the release pattern of the tablet containing chitosan and hydroxypropyl methylcellulose phthalate (M-7) was compared with that of a marketed tablet. The optimized tablets were stable under accelerated storage conditions for 6 months with respect to drug content and physical appearance. Pharmacokinetic studies in human volunteers showed that the optimized tablet (M-7) exhibited no difference in in vivo drug release compared with the marketed tablet, and no significant difference between the pharmacokinetic parameters of M-7 and the marketed tablet was observed (p > 0.05; 95% confidence intervals). However, large-scale clinical studies and long-term, extensive stability studies under different conditions are required to confirm these results. Key words: aceclofenac, chitosan, matrix tablet, pharmacokinetics, sustained release

19.
A time‐specific log‐linear regression method on quantile residual lifetime is proposed. Under the proposed regression model, any quantile of a time‐to‐event distribution among survivors beyond a certain time point is associated with selected covariates under right censoring. Consistency and asymptotic normality of the regression estimator are established. An asymptotic test statistic is proposed to evaluate the covariate effects on the quantile residual lifetimes at a specific time point. Evaluation of the test statistic does not require estimation of the variance–covariance matrix of the regression estimators, which involves the probability density function of the survival distribution under censoring. Simulation studies are performed to assess the finite-sample properties of the regression parameter estimator and test statistic. The new regression method is applied to a breast cancer data set with long‐term follow‐up to estimate patients' median residual lifetimes, adjusting for important prognostic factors.
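Ignoring censoring, the quantile residual lifetime at a time point t0 is simply a quantile of T − t0 among subjects surviving beyond t0. A sketch on invented event times (the paper's estimator additionally handles right censoring and covariates, which this toy does not):

```python
# Uncensored p-quantile residual lifetime at time t0 (invented event times).

def quantile_residual_life(times, t0, p=0.5):
    """p-quantile of (T - t0) among subjects with T > t0; None if no survivors."""
    residuals = sorted(t - t0 for t in times if t > t0)
    if not residuals:
        return None
    idx = min(int(p * len(residuals)), len(residuals) - 1)   # order statistic
    return residuals[idx]

event_times = [1.2, 3.4, 5.0, 6.1, 7.8, 9.5, 12.0, 15.3]
mrl_at_5 = quantile_residual_life(event_times, t0=5.0)   # median residual life
```

Five subjects survive beyond t0 = 5.0, with residual times 1.1, 2.8, 4.5, 7.0 and 10.3, so the median residual lifetime is 4.5.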

20.
Who wouldn't want a drug that is activated only in the target cell? Prodrugs that are metabolically triggered inside the pathogen but not in the host are an attractive concept in antimicrobial chemotherapy. Of particular interest are bioreductive prodrugs, such as nitro compounds or quinones, that can initiate cytotoxic redox cascades and release active metabolites. The critical questions for the selectivity of such molecules are: what is the source of the electrons that activate the prodrug, and which enzymes catalyze the reduction? Meredith et al. conceived an elegant approach to answer these questions, making use of reverse genetics in Trypanosoma brucei. By overexpressing key reductase genes, they engineered trypanosomal indicator lines that are hypersensitive to particular bioreductive prodrugs and allow one to discriminate between one‐electron and two‐electron transfer activation mechanisms. Indicator lines that are also defective in DNA repair further indicate whether the resultant metabolites interfere with the parasite's genome. This set of T. brucei indicator lines provides a tool for deconvoluting the mechanisms of prodrug activation and drug action, which will facilitate the rational development of bioreductive prodrugs for antiparasitic chemotherapy.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司)  京ICP备09084417号