Similar Articles
Found 20 similar articles (search time: 31 ms)
1.
Mixed-mode chromatography combines features of ion-exchange chromatography and hydrophobic interaction chromatography and is increasingly used in antibody purification. As a replacement for flow-through operations on traditional unmixed resins or as a pH-controlled bind-and-elute step, the use of both interaction modes promises better removal of product-specific impurities. However, the combination of functionalities makes industrial process development significantly more complex, in particular the identification of the often narrow elution window that delivers the desired selectivity. Mechanistic modeling has shown that even difficult separation problems can be solved in a computer-optimized manner once the process dynamics have been modeled. The adsorption models described in the literature, however, are themselves very complex, which makes model calibration difficult. In this work, we approach this problem with a newly constructed model that describes adsorber saturation using the surface-coverage function of the colloidal particle adsorption model for ion-exchange chromatography. In a case study, a model for a pH-controlled antibody polishing step was created from six experiments. The behavior of fragments, aggregates, and host cell proteins was described with the help of offline analysis. After in silico optimization, a validation experiment confirmed improved process performance compared to the historical process set point. Beyond these good results, the work also shows that the high dynamics of mixed-mode chromatography can produce unexpected results if process parameters deviate too far from tried and tested conditions.
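To make the two binding modes concrete, here is a minimal sketch of a hypothetical mixed-mode uptake function: a salt-dependent ion-exchange term plus a pH-modulated hydrophobic term feeding a Langmuir-style surface-coverage factor. All names and values are illustrative assumptions; the colloidal particle adsorption model referenced above has a different, more detailed form:

```python
import numpy as np

def mixed_mode_uptake(c, salt, pH, q_max=100.0, k_iex=5.0, k_hic=0.5, beta=1.5):
    """Hypothetical mixed-mode isotherm sketch (not the published model).

    Combines a salt-dependent ion-exchange term with a pH-modulated
    hydrophobic term and a Langmuir-style surface-coverage factor.
    c    -- protein concentration [mg/mL]
    salt -- counter-ion concentration [mM]
    pH   -- mobile-phase pH
    """
    k_eq = k_iex / salt + k_hic * np.exp(-beta * (pH - 7.0))  # lumped affinity
    theta = k_eq * c / (1.0 + k_eq * c)                       # surface coverage
    return q_max * theta                                      # bound protein [mg/mL resin]

# Example: uptake along a salt gradient at pH 5.5
print(mixed_mode_uptake(c=1.0, salt=np.linspace(20, 500, 5), pH=5.5))
```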

2.
With the quality by design (QbD) initiative, regulatory authorities demand a consistent drug quality originating from a well-understood manufacturing process. This study demonstrates the application of a previously published mechanistic chromatography model to the in silico process characterization (PCS) of a monoclonal antibody polishing step. The proposed modeling workflow covered the main tasks of traditional PCS studies following the QbD principles, including criticality assessment of 11 process parameters and establishment of their proven acceptable ranges of operation. Analyzing the effects of multivariate sampling of process parameters on the purification outcome allowed identification of the edge of failure. Experimental validation of the in silico results required approximately 75% fewer experiments than a purely wet-lab based PCS study. Stochastic simulation, considering the measured variances of process parameters and loading material composition, was used to estimate the capability of the process to meet the acceptance criteria for critical quality attributes and key performance indicators. The proposed workflow enables the implementation of digital process twins as a QbD tool for improved development of biopharmaceutical manufacturing processes.
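The stochastic-simulation step is easy to sketch: draw process parameters from their measured variances, push each draw through the process model, and count how often the acceptance criteria are met. Everything below (the surrogate step model, distributions, and limits) is an invented placeholder, not the mechanistic model of the study:

```python
import numpy as np

rng = np.random.default_rng(1)

def polish_step(load_density, pH, salt):
    """Hypothetical surrogate for the chromatography model: returns
    (yield %, aggregate %) as a smooth function of the parameters."""
    yield_pct = 95 - 0.2 * (load_density - 40) ** 2 / 40 - 2 * abs(pH - 5.5)
    agg_pct = 1.0 + 0.02 * load_density + 0.5 * max(0.0, 6.0 - pH) - 0.001 * salt
    return yield_pct, agg_pct

# Sample process parameters from their (assumed) measured variances
n = 10_000
load = rng.normal(40, 2, n)    # g/L resin
pH = rng.normal(5.5, 0.05, n)
salt = rng.normal(100, 5, n)   # mM

results = np.array([polish_step(*p) for p in zip(load, pH, salt)])
ok = (results[:, 0] >= 90) & (results[:, 1] <= 2.0)  # acceptance criteria
print(f"estimated process capability: {ok.mean():.1%} in spec")
```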

3.
4.
Cation exchange chromatography (CEX) is an essential part of most monoclonal antibody (mAb) purification platforms. Process characterization and root cause investigation of chromatographic unit operations are performed using scale-down models (SDM). SDM chromatography columns typically have the same bed height as the respective manufacturing-scale column but a significantly reduced inner diameter. While SDMs enable process development with less material and time, their comparability to manufacturing scale can be affected by variability in feed composition, mobile-phase and resin properties, or dispersion effects depending on the chromatography system at hand. Mechanistic models can help to close gaps between scales and reduce experimental effort compared to experimental SDM applications. In this study, a multicomponent steric mass-action (SMA) adsorption model was applied to the scale-up of a CEX polishing step. Based on chromatograms and elution pool data ranging from laboratory to manufacturing scale, the proposed modeling workflow enabled early identification of differences between scales, for example, system dispersion effects or ionic capacity variability. A multistage model qualification approach was introduced to measure model quality and to understand the model's limitations across scales. The experimental SDM and the in silico model were qualified against large-scale data using the same state-of-the-art equivalence-testing procedure. The mechanistic chromatography model avoided limitations of the SDM by capturing effects of bed height, loading density, feed composition, and mobile-phase properties. The results demonstrate the applicability of mechanistic chromatography models as a possible alternative to conventional SDM approaches.
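For readers unfamiliar with the isotherm, below is a minimal sketch of multicomponent steric mass-action (SMA) adsorption kinetics in a well-mixed batch setting. All parameter values are invented for illustration; a full column model would add convection, dispersion, and pore diffusion:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical SMA parameters for two protein species (illustrative only)
LAMBDA = 500.0                  # ionic capacity [mM]
nu     = np.array([3.0, 5.0])   # characteristic charges
sigma  = np.array([10.0, 20.0]) # steric shielding factors
k_ads  = np.array([1e-2, 1e-2]) # adsorption rate constants
k_des  = np.array([1e-2, 1e-2]) # desorption rate constants

def sma_kinetics(t, q, c, c_salt):
    """Multicomponent steric mass-action adsorption in a well-mixed batch."""
    q_avail = LAMBDA - np.sum((nu + sigma) * q)   # unshielded free ligands
    return k_ads * c * q_avail**nu - k_des * q * c_salt**nu

c, c_salt = np.array([0.1, 0.05]), 100.0          # protein and salt [mM]
sol = solve_ivp(sma_kinetics, (0.0, 1.0), y0=[0.0, 0.0],
                args=(c, c_salt), method="LSODA")  # stiff-friendly solver
print(sol.y[:, -1])   # bound concentrations near equilibrium [mM]
```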

5.
6.
A systematic evaluation of nonlinear mixed-effects taper models for volume prediction was performed. Of 21 taper equations with fewer than five parameters each, the best four-parameter fixed-effects model according to the fitting statistics was then refined by comparing its predictions, based on total height (H), diameter at breast height (DBH), and above-ground height (h), against the modeling data. Seven alternative prediction strategies were compared using the best new equation in the absence of calibration data, which is often unavailable in forestry practice. Because calibration may sometimes be a realistic option, even though it is rarely used in practical applications, the results of this study suggest that one of the best strategies for improving the accuracy of volume prediction is strategy 7, with calculated total heights of 3, 6, and 9 trees in the largest, smallest, and medium-size categories, respectively. Average or dominant trees cannot be used to calculate the random parameter for further predictions. The method described here allows the user to choose the best taper model type and random-effect calculation strategy for each practical application and situation at the tree level.
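For orientation, here is a minimal sketch of the kind of calculation involved: a simple Kozak-style polynomial taper curve with a tree-level random effect on the intercept, integrated numerically to stem volume. The equation form, coefficients, and random-effect value are illustrative assumptions, not the four-parameter model selected in the study:

```python
import numpy as np
from scipy.integrate import trapezoid

def taper_diameter(h, DBH, H, b=(1.0, -1.6, 0.6), u=0.0):
    """Kozak-style polynomial taper sketch: (d/DBH)^2 = b0 + b1*x + b2*x^2,
    x = h/H, with a tree-level random effect u added to the intercept."""
    x = np.asarray(h) / H
    d_squared = (b[0] + u + b[1] * x + b[2] * x**2) * DBH**2
    return np.sqrt(np.clip(d_squared, 0.0, None))   # diameter [cm]

def stem_volume(DBH, H, u=0.0, n=200):
    """Integrate the cross-sectional area along the stem to volume [m^3]."""
    h = np.linspace(0.0, H, n)
    d = taper_diameter(h, DBH, H, u=u) / 100.0      # cm -> m
    return trapezoid(np.pi / 4.0 * d**2, h)

print(stem_volume(DBH=25.0, H=22.0))                # fixed-effects prediction
print(stem_volume(DBH=25.0, H=22.0, u=0.05))        # calibrated prediction
```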

7.
High-throughput experimentation has revolutionized data-driven experimental sciences and opened the door to the application of machine learning techniques. Nevertheless, the quality of any data analysis strongly depends on the quality of the data and specifically on the degree to which random effects in the experimental data-generating process are quantified and accounted for. Accordingly, calibration, i.e., the quantitative association between observed quantities and measurement responses, is a core element of many workflows in experimental sciences.

Particularly in life sciences, univariate calibration, often involving non-linear saturation effects, must be performed to extract quantitative information from measured data. At the same time, the estimation of uncertainty is inseparably connected to quantitative experimentation. Adequate calibration models are required that describe not only the input/output relationship in a measurement system but also its inherent measurement noise. Due to its mathematical nature, statistically robust calibration modeling remains a challenge for many practitioners, while at the same time being extremely beneficial for machine learning applications.

In this work, we present a bottom-up conceptual and computational approach that solves many problems of understanding and implementing non-linear, empirical calibration modeling for quantification of analytes and process modeling. The methodology is first applied to the optical measurement of biomass concentrations in a high-throughput cultivation system, then to the quantification of glucose by an automated enzymatic assay. We implemented the conceptual framework in two Python packages, calibr8 and murefi, with which we demonstrate how to make uncertainty quantification for various calibration tasks more accessible. Our software packages enable more reproducible and automatable data analysis routines compared to commonly observed workflows in life sciences.

Subsequently, we combine the previously established calibration models with a hierarchical Monod-like ordinary differential equation model of microbial growth to describe multiple replicates of Corynebacterium glutamicum batch cultures. Key process model parameters are learned by both maximum likelihood estimation and Bayesian inference, highlighting the flexibility of the statistical and computational framework.
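As a stand-in for the approach (without using the calibr8/murefi APIs themselves), the sketch below fits a saturating logistic calibration curve with signal-dependent noise and inverts it to quantify an analyte; all data and parameters are synthetic:

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, L, k, x0, y0):
    """Saturating calibration curve: measurement response vs. analyte."""
    return y0 + L / (1.0 + np.exp(-k * (x - x0)))

# Synthetic calibration standards (stand-ins for real measurements)
rng = np.random.default_rng(0)
biomass = np.linspace(0.1, 30, 20)              # g/L standards
response = logistic(biomass, 400, 0.15, 12, 20)
response += rng.normal(0, 2 + 0.02 * response)  # noise grows with signal

popt, pcov = curve_fit(logistic, biomass, response, p0=[400, 0.1, 10, 10])

def invert(y, L, k, x0, y0):
    """Back-transform a measured response into an analyte estimate."""
    return x0 - np.log(L / (y - y0) - 1.0) / k

print(invert(250.0, *popt))   # estimated biomass for a new measurement
```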

8.
9.
Development of a chromatographic step in a time- and resource-efficient manner remains a serious bottleneck in protein purification. Chromatographic performance typically depends on raw material attributes, feed material attributes, process factors, and their interactions. Design of experiments (DOE) based process development is often chosen for this purpose. A challenge, however, is performing a DOE with such a large number of process factors. A split DOE approach based on process knowledge is proposed in order to reduce the number of experiments. The first DOE targets factors that are likely to significantly impact the process but whose effect on process performance is unknown. The second DOE fine-tunes another set of interacting process factors, whose impact on process performance is known from process understanding. Furthermore, modeling of a large set of output response variables has been achieved by fitting the output responses to an empirical equation and then using the parametric constants of that equation as output response variables for regression modeling. Two case studies, involving hydrophobic interaction chromatography for removal of aggregates and cation exchange chromatography for separation of charge variants and aggregates, illustrate the proposed approach. The proposed methodology reduced the total number of experiments by 25% and 72% compared to a single DOE based on a central composite design and a full factorial design, respectively. The proposed approach is likely to result in a significant reduction in the resources required as well as the time taken during process development. © 2018 American Institute of Chemical Engineers Biotechnol. Prog., 35: e2730, 2019
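The idea of fitting each output profile to an empirical equation and regressing its parametric constants on the DOE factors can be sketched as follows; the breakthrough-type equation, factor settings, and data are all hypothetical:

```python
import numpy as np
from scipy.optimize import curve_fit

def breakthrough(v, a, b):
    """Hypothetical empirical equation fitted to each output profile."""
    return a * (1.0 - np.exp(-b * v))

rng = np.random.default_rng(2)
volumes = np.linspace(0, 10, 30)

# For each DOE run (factor settings invented), fit the profile and keep
# the parametric constants (a, b) as the regression responses.
doe_factors, fitted = [], []
for salt in (50, 150):      # mM
    for load in (20, 40):   # g/L
        true_a, true_b = 0.9 + 0.001 * salt, 0.2 + 0.005 * load
        y = breakthrough(volumes, true_a, true_b) + rng.normal(0, 0.01, 30)
        (a, b), _ = curve_fit(breakthrough, volumes, y, p0=[1.0, 0.3])
        doe_factors.append([1.0, salt, load])
        fitted.append([a, b])

# Ordinary least squares of each fitted constant on the DOE factors
X, Y = np.array(doe_factors), np.array(fitted)
coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(coef)   # rows: intercept, salt, load; columns: a, b
```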

10.
Process modeling involves the use of a set of mathematical equations to represent key physical phenomena involved in the process. An appropriately validated model can be used to predict process behavior with limited experimental data, identify critical ranges for process variables, and guide further process development. Although process modeling is extensively used in the chemical process industries, it has not been widely used in purification unit operations in biotechnology. Recent FDA guidelines encourage the use of process modeling during process development, along with multivariate statistical methods, detailed risk assessment, and other quantifiers of uncertainty. This paper will review recent advances in the modeling of key downstream unit operations: chromatography, filtration, and centrifugation. The focus will be on the application of modeling for industrial applications. Relevant papers presented at a session on this topic at the recent American Chemical Society National Meeting in San Francisco will also be reviewed.

11.
12.
Mathematical modeling is a potent in silico tool that can help investigate, interpret, and predict the behavior of biological systems. The first step is to develop a working hypothesis of the biology. Then by “translating” the biological phenomena into equations, models can harness the power of mathematical analysis techniques to explore the dynamics and interactions of the biological components. Models can be used together with traditional experimental models to help design new experiments, test hypotheses, identify mechanisms, and predict outcomes. This article reviews the process of building, calibrating, and using mathematical models in the context of the kinetics of receptor and signal transduction biology. An example model related to the androgen receptor-mediated regulation of the prostate is presented to illustrate the steps in the modeling process and to highlight the potential for mathematical modeling in this area.
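As a small illustration of "translating" biology into equations, the sketch below integrates a minimal ligand-receptor binding model; the rate constants are generic placeholders, not values from the androgen receptor model discussed in the article:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative rate constants (hypothetical, not from the article)
k_on, k_off, k_deg = 1e6, 1e-3, 1e-4   # 1/(M*s), 1/s, 1/s

def receptor_kinetics(t, y, L):
    """Minimal receptor model: R + L <-> RL, with complex degradation."""
    R, RL = y
    bind = k_on * R * L - k_off * RL
    return [-bind, bind - k_deg * RL]

sol = solve_ivp(receptor_kinetics, (0, 3600), y0=[1e-9, 0.0],
                args=(1e-8,), dense_output=True)
print(sol.y[:, -1])   # free receptor and complex after 1 h [M]
```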

13.
Intensified and continuous processes require fast and robust methods and technologies to monitor product titer for faster analytical turnaround, process monitoring, and process control. Current titer measurements are mostly offline chromatography-based methods, and it may take hours or even days to get results back from the analytical labs. Thus, offline methods cannot meet the requirement of real-time titer measurement for continuous production and capture processes. FTIR spectroscopy and chemometrics-based multivariate modeling are promising tools for real-time titer monitoring in clarified bulk (CB) harvests and perfusate lines. However, empirical models are known to be vulnerable to unseen variability; specifically, an FTIR chemometric titer model trained on a given biological molecule and process conditions often fails to provide accurate predictions of titer for another molecule under different process conditions. In this study, we developed an adaptive modeling strategy: the model was initially built using a calibration set of available perfusate and CB samples and then updated by augmenting the calibration set with spiking samples of the new molecules, making the model robust against the perfusate or CB harvest of a new molecule. This strategy substantially improved model performance and significantly reduced the modeling effort for new molecules.
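A minimal sketch of such an adaptive update, using synthetic spectra and scikit-learn's PLS regression in place of the authors' chemometric toolchain (all data, band shifts, and component counts are assumptions):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(3)

def spectra(titer, shift=0.0):
    """Synthetic 'spectra': titer drives one broad band plus noise."""
    wn = np.linspace(0, 1, 200)
    band = np.exp(-((wn - 0.5 - shift) ** 2) / 0.005)
    return titer[:, None] * band + rng.normal(0, 0.02, (titer.size, 200))

titer_cal = rng.uniform(0.5, 5.0, 40)
model = PLSRegression(n_components=3).fit(spectra(titer_cal), titer_cal)

# New molecule: the band shifts, so the original model degrades
titer_new = rng.uniform(0.5, 5.0, 10)
X_new = spectra(titer_new, shift=0.05)
rmse = np.sqrt(np.mean((model.predict(X_new).ravel() - titer_new) ** 2))
print("RMSE before update:", rmse)

# Adaptive step: augment the calibration set with a few spiking samples
X_aug = np.vstack([spectra(titer_cal), X_new[:5]])
y_aug = np.concatenate([titer_cal, titer_new[:5]])
model.fit(X_aug, y_aug)
rmse = np.sqrt(np.mean((model.predict(X_new[5:]).ravel() - titer_new[5:]) ** 2))
print("RMSE after update:", rmse)
```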

14.

15.
On-line monitoring tools for downstream chromatographic processing (DSP) of biotherapeutics can enable fast corrective actions for disturbances in the upstream, build process understanding, and eventually lead to process optimization. While UV/Vis spectroscopy mostly assesses the protein's amino acid composition and the application of Fourier-transform infrared spectroscopy is limited by strong water interactions, Raman spectroscopy can assess secondary and tertiary protein structure without significant water interference. The aim of this work is to implement Raman technology in DSP by designing an in-line flow cell with a reduced dead volume of 80 μL and a reflector to increase signal intensity, and by developing a chemometric modeling path. In this context, measurement settings were adjusted and spectra were taken from different chromatographic breakthrough curves of IgG1 in harvest. The resulting models show a small average RMSEP of 0.12 mg/mL over a broad calibration range from 0 to 2.82 mg/mL IgG1. This work highlights the benefits of model-assisted Raman spectroscopy in chromatography with complex backgrounds, lays the foundation for in-line monitoring of IgG1, and enables advanced control strategies. Moreover, the approach might be extended to further critical quality attributes such as aggregates, or transferred to other process steps.

16.
Genome-scale metabolic models bridge the gap between genome-derived biochemical information and metabolic phenotypes in a principled manner, providing a solid interpretative framework for experimental data related to metabolic states, and enabling simple in silico experiments with whole-cell metabolism. Models have been reconstructed for almost 20 bacterial species, so far mainly through expert curation efforts integrating information from the literature with genome annotation. A wide variety of computational methods exploiting metabolic models have been developed and applied to bacteria, yielding valuable insights into bacterial metabolism and evolution, and providing a sound basis for computer-assisted design in metabolic engineering. Recent advances in computational systems biology and high-throughput experimental technologies pave the way for the systematic reconstruction of metabolic models from the genomes of new species, and a corresponding expansion of the scope of their applications. In this review, we provide an introduction to the key ideas of metabolic modeling, survey the methods and resources that enable model reconstruction and refinement, and chart applications to the investigation of global properties of metabolic systems, the interpretation of experimental results, and the re-engineering of their biochemical capabilities.
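The workhorse computation behind many of these methods, flux balance analysis, fits in a few lines; the toy network and bounds below are invented for illustration, and real reconstructions would use dedicated COBRA-style toolboxes:

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: A_ext -> A -> B -> biomass, with a side drain A -> C
# Columns: v1 uptake, v2 A->B, v3 B->biomass, v4 A->C
S = np.array([
    [1, -1,  0, -1],   # A balance
    [0,  1, -1,  0],   # B balance
])
bounds = [(0, 10), (0, None), (0, None), (0, 2)]   # flux bounds

# Maximize biomass flux v3 (linprog minimizes, so negate the objective)
res = linprog(c=[0, 0, -1, 0], A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print("optimal biomass flux:", -res.fun, "fluxes:", res.x)
```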

17.
Cancer invasion is one of the hallmarks of cancer and a prerequisite for cancer metastasis. However, the invasive process is very complex, depending on multiple correlated intrinsic and environmental factors, and thus is difficult to study experimentally in a fully controlled way. Therefore, there is an increased demand for interdisciplinary integrated approaches combining laboratory experiments with multiscale in silico modeling. In this review, we will summarize current computational techniques applicable to model cancer invasion in silico, with a special focus on a class of individual-cell-based models developed in our laboratories. We also discuss their integration with traditional and novel in vitro experimentation, including new invasion assays whose design was inspired by computational modeling.

18.
Deciphering the biological networks underlying complex phenotypic traits, e.g., human disease, is undoubtedly crucial to understand the underlying molecular mechanisms and to develop effective therapeutics. Due to network complexity and the relatively small number of available experiments, data-driven modeling is a great challenge for deducing the functions of genes/proteins in the network and in phenotype formation. We propose a novel knowledge-driven systems biology method that utilizes qualitative knowledge to construct a Dynamic Bayesian network (DBN) representing the biological network underlying a specific phenotype. Edges in this network depict physical interactions between genes and/or proteins. A qualitative knowledge model first translates typical molecular interactions into constraints when resolving the DBN structure and parameters. The uncertainty of the network is thereby restricted to the subset of models consistent with the qualitative knowledge. All models satisfying the constraints are considered candidates for the underlying network, and these consistent models are used to perform quantitative inference. By in silico inference, we can predict phenotypic traits upon genetic interventions and perturbations of the network. We applied our method to analyze the puzzling mechanism of the breast cancer cell proliferation network and accurately predicted cancer cell growth rate upon manipulating (anti)cancerous marker genes/proteins.
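The following is a deliberately simplified, deterministic stand-in for this idea (not a dynamic Bayesian network): qualitative knowledge fixes the signs of interactions, all magnitude combinations consistent with those signs form the candidate ensemble, and in silico intervention predictions are averaged over the ensemble. The network, signs, and dynamics are all hypothetical:

```python
import numpy as np
from itertools import product

# Toy 3-node network; qualitative knowledge fixes interaction signs:
# gene0 activates gene1 (+), gene1 inhibits gene2 (-), gene2 activates gene0 (+)
signs = {(0, 1): +1, (1, 2): -1, (2, 0): +1}

def simulate(weights, knockdown=None, steps=50):
    """Discrete-time linear dynamics; a knockdown clamps one node to 0."""
    x = np.full(3, 0.5)
    W = np.zeros((3, 3))
    for (src, dst), w in weights.items():
        W[dst, src] = w
    for _ in range(steps):
        x = np.clip(0.5 + W @ (x - 0.5), 0, 1)
        if knockdown is not None:
            x[knockdown] = 0.0
    return x

# Candidate models: magnitudes are unknown, signs are constrained
magnitudes = (0.2, 0.5, 0.8)
candidates = [dict(zip(signs, (s * m for s, m in zip(signs.values(), mags))))
              for mags in product(magnitudes, repeat=3)]

# In silico intervention: average the prediction over all consistent models
preds = np.array([simulate(w, knockdown=1) for w in candidates])
print("predicted steady state after knocking down gene1:", preds.mean(axis=0))
```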

19.
Phenotype-centric modeling enables a paradigm shift in the analysis of mechanistic models. It brings the focus to a network's biochemical phenotypes and their relationship with measurable traits (e.g., product yields, system dynamics, signal amplification factors, etc.) and away from computationally intensive simulation-centric modeling. Here, we explore applications of this new modeling strategy in the field of rational metabolic engineering using the amorphadiene biosynthetic network as a case study. This network has previously been studied using a mechanistic model and the simulation-centric strategy, and thus provides an excellent means to compare and contrast results obtained from these two very different strategies. We show that the phenotype-centric strategy, without values for the parameters, not only identifies beneficial intervention strategies obtained with the simulation-centric strategy, but also provides an understanding of the mechanistic context for the validity of these predictions. Additionally, we propose a set of hypothetical strains with the potential to outperform reported production strains and to enhance the mechanistic understanding of the amorphadiene biosynthetic network. Further, we identify the landscape of possible intervention strategies for the given model. We believe that phenotype-centric modeling can advance the field of rational metabolic engineering by enabling the development of next-generation kinetics-based algorithms and methods that do not rely on a priori knowledge of kinetic parameters but allow a structured, global analysis of system design in the parameter space.

20.

Background

The dynamics of biochemical networks can be modelled by systems of ordinary differential equations. However, these networks are typically large and contain many parameters. Therefore, model reduction procedures, such as lumping, sensitivity analysis, and time-scale separation, are used to simplify models. Although there are many different model reduction procedures, the evaluation of reduced models is difficult and depends on the parameter values of the full model. There is a lack of criteria for evaluating reduced models when the model parameters are uncertain.

Results

We developed a method to compare reduced models and select the model that results in dynamics and uncertainty similar to the original model. We simulated different parameter sets from the assumed parameter distributions. Then, we compared all reduced models for all parameter sets using cluster analysis. The clusters revealed which of the reduced models were similar to the original model in dynamics and variability. This allowed us to select the smallest reduced model that best approximated the full model. Through examples, we showed that when parameter uncertainty was large, the model should be reduced further, and when parameter uncertainty was small, models should not be reduced much.
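A minimal sketch of this workflow on a toy model: sample parameters, simulate the full model and a lumped reduction, and cluster the output trajectories to see whether the two models co-cluster. The models, distributions, and cluster count are illustrative assumptions:

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(4)
t_eval = np.linspace(0, 10, 50)

def full_model(t, y, k1, k2, k3):
    """Toy 3-state cascade (stand-in for the original network)."""
    a, b, c = y
    return [-k1 * a, k1 * a - k2 * b, k2 * b - k3 * c]

def reduced_model(t, y, k1, k3):
    """Quasi-steady-state lumping of the fast intermediate b."""
    a, c = y
    return [-k1 * a, k1 * a - k3 * c]

trajectories = []
for _ in range(20):   # parameter sets from the assumed uncertainty
    k1, k2, k3 = rng.lognormal([0.0, 2.0, -0.5], 0.3)
    full = solve_ivp(full_model, (0, 10), [1, 0, 0], args=(k1, k2, k3), t_eval=t_eval)
    red = solve_ivp(reduced_model, (0, 10), [1, 0], args=(k1, k3), t_eval=t_eval)
    trajectories += [full.y[2], red.y[1]]   # output c(t) of each model

# Hierarchical clustering of all output trajectories; the reduction is
# acceptable if its trajectories co-cluster with the full model's
clusters = fcluster(linkage(np.array(trajectories)), t=2, criterion="maxclust")
agree = np.mean(clusters[0::2] == clusters[1::2])
print(f"full and reduced models co-cluster for {agree:.0%} of parameter sets")
```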

Conclusions

A method to compare different models under parameter uncertainty was developed. It can be applied to any model reduction method. We also showed that the amount of parameter uncertainty influences the choice of reduced model.

