Similar literature (20 results)
1.
In cell culture processes, cell growth and metabolism drive changes in the chemical environment of the culture. These environmental changes elicit reactor control actions and cell growth responses, and are sensed by cell signaling pathways that influence metabolism. The interplay of these forces shapes the culture dynamics through the different stages of cell cultivation, and the outcome greatly affects process productivity, product quality, and robustness. Developing a systems model that describes the interactions of these major players in the cell culture system can lead to better process understanding and enhance process robustness. Here we report the construction of a hybrid mechanistic-empirical bioprocess model which integrates a mechanistic metabolic model with subcomponent models for cell growth, signaling regulation, and the bioreactor environment for in silico exploration of process scenarios. Model parameters were optimized by fitting to a dataset from a cell culture manufacturing process that exhibits variability in metabolism and productivity. The model fitting was broken into multiple steps to mitigate the substantial numerical challenges posed by the first-principles model components. The optimized model captured the dynamics of metabolism and the variability of process runs with different kinetic profiles and productivity. The variability of the process was attributed in part to the metabolic state of the cell inoculum. The model was then used to identify potential mitigation strategies that reduce process variability by altering initial process conditions, and to explore the effect of changing CO2 removal capacity at different bioreactor scales on process performance. By incorporating a mechanistic model of cell metabolism and appropriately fitting it to a large dataset, the hybrid model can describe the different metabolic phases in culture and the variability in manufacturing runs. This approach of employing a hybrid model has the potential to greatly facilitate process development and reactor scaling.
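
The abstract above does not include code; the following is a minimal sketch of the stepwise fitting idea it describes, under simplifying assumptions: a toy Monod-type mechanistic model is fit to growth and substrate data first, and a production term is then fit with the growth parameters frozen. The kinetics, parameter values, and data are hypothetical, not taken from the article.

```python
# Minimal sketch of stepwise fitting for a hybrid mechanistic bioprocess model.
# All names, data, and parameter values are hypothetical.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def bioreactor_rhs(t, y, mu_max, Ks, Yxs, qp):
    X, S, P = y                      # biomass, substrate, product
    mu = mu_max * S / (Ks + S)       # Monod growth kinetics (mechanistic part)
    dX = mu * X
    dS = -dX / Yxs
    dP = qp * X                      # simple growth-associated production
    return [dX, dS, dP]

def simulate(params, t_eval, y0):
    sol = solve_ivp(bioreactor_rhs, (t_eval[0], t_eval[-1]), y0,
                    t_eval=t_eval, args=tuple(params), rtol=1e-8)
    return sol.y

# Synthetic "manufacturing" data from known parameters plus noise.
rng = np.random.default_rng(0)
t_obs = np.linspace(0, 96, 13)
true = (0.05, 0.8, 0.6, 0.002)
y0 = [0.3, 10.0, 0.0]
X_obs, S_obs, P_obs = simulate(true, t_obs, y0) + rng.normal(0, 0.05, (3, t_obs.size))

# Step 1: fit growth/substrate parameters only (production switched off).
def growth_residuals(p):
    X, S, _ = simulate([p[0], p[1], p[2], 0.0], t_obs, y0)
    return np.concatenate([X - X_obs, S - S_obs])

fit1 = least_squares(growth_residuals, x0=[0.03, 1.0, 0.5], bounds=(1e-6, 5))

# Step 2: freeze growth parameters, fit the production rate alone.
def product_residuals(p):
    _, _, P = simulate([*fit1.x, p[0]], t_obs, y0)
    return P - P_obs

fit2 = least_squares(product_residuals, x0=[0.001], bounds=(1e-6, 1))
print("growth params:", fit1.x, "qp:", fit2.x)
```

Splitting the fit this way keeps each nonlinear regression small, which mirrors the motivation the authors give for breaking their fitting process into multiple steps.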

2.
In recent years, hybrid neural network approaches, which combine mechanistic and neural network models, have received considerable attention. These approaches can yield more accurate predictions of process dynamics by combining the two model types in such a way that the neural network accounts for the unknown and nonlinear parts of the mechanistic model. In this work, a full-scale coke-plant wastewater treatment process was chosen as a model system. Initially, a process data analysis was performed on the actual operational data using principal component analysis. Next, a simplified mechanistic model and a neural network model were developed based on specific process knowledge and on the operational data of the coke-plant wastewater treatment process, respectively. Finally, the neural network was incorporated into the mechanistic model in both parallel and serial configurations. Simulation results showed that the parallel hybrid modeling approach achieved much more accurate predictions, with good extrapolation properties, than the other modeling approaches, even in the case of a process upset caused by, for example, shock loading of toxic compounds. These results indicate that the parallel hybrid neural modeling approach is a useful tool for accurate and cost-effective modeling of biochemical processes in the absence of other reasonably accurate process models.
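
As an illustration of the parallel configuration described above (not the authors' wastewater model), the sketch below corrects a simplified first-order "mechanistic" prediction with a neural network trained on its residuals; the inputs, rate law, and data are invented.

```python
# Parallel hybrid configuration sketch: mechanistic prediction + NN residual model.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

# Hypothetical inputs: influent concentration and hydraulic retention time (HRT).
X = rng.uniform([100.0, 4.0], [400.0, 12.0], size=(200, 2))

def mechanistic_effluent(inf_conc, hrt, k=0.35):
    # First-order removal model: effluent = influent * exp(-k * HRT)
    return inf_conc * np.exp(-k * hrt)

# The "true" process has an extra nonlinear effect the mechanistic model misses.
y_true = mechanistic_effluent(X[:, 0], X[:, 1]) + 5.0 * np.sin(X[:, 1]) + rng.normal(0, 1, 200)
y_mech = mechanistic_effluent(X[:, 0], X[:, 1])

# Parallel configuration: the NN learns the residual (unknown part of the model).
nn = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
nn.fit(X, y_true - y_mech)

y_hybrid = y_mech + nn.predict(X)
print("mechanistic RMSE:", np.sqrt(np.mean((y_mech - y_true) ** 2)))
print("hybrid RMSE:     ", np.sqrt(np.mean((y_hybrid - y_true) ** 2)))
```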

3.
Process understanding and characterization form the foundation for ensuring a consistent and robust biologics manufacturing process. Using appropriate modeling tools and machine learning approaches, process data can be monitored in real time to avoid manufacturing risks. In this article, we outline an approach toward implementation of chemometrics and machine learning tools (neural network analysis) to model and predict the behavior of a mixed-mode chromatography step for a biosimilar (Teriparatide) as a case study. The process development data and process knowledge were assimilated into a prior process knowledge assessment using chemometrics tools to derive the parameters critical to performance indicators (i.e., potential quality and process attributes) and to establish the severity ranking for the FMEA analysis. The characterization data of the chromatographic operation are presented along with the determination of the critical, key, and non-key process parameters, set points, and the operating, process acceptance, and characterized ranges. The scale-down model establishment was assessed using traditional approaches as well as novel approaches such as the batch evolution model and neural network analysis. The batch evolution model was further used to demonstrate batch monitoring through direct chromatographic data, thus demonstrating its application for continuous process verification. Assimilation of process knowledge through a structured data acquisition approach, built in from process development to continuous process verification, was demonstrated to result in a data-analytics-driven model that can be coupled with machine learning tools for real-time process monitoring. We recommend applying these approaches together with the FDA guidance on stage-wise process development and validation to reduce manufacturing risks.
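
A minimal sketch of the kind of multivariate batch monitoring a batch evolution model enables is shown below; it uses PCA scores and a Hotelling T² statistic on synthetic chromatographic traces and is not the authors' actual model or acceptance limit.

```python
# PCA-based batch monitoring sketch on synthetic chromatographic traces.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)

# 30 historical "good" batches, each a 50-point chromatographic trace.
t = np.linspace(0, 1, 50)
good = np.exp(-((t - 0.5) / 0.1) ** 2) + rng.normal(0, 0.02, (30, 50))

pca = PCA(n_components=2).fit(good)
scores = pca.transform(good)

# Simplified Hotelling T^2 for each batch, with an empirical control limit.
t2 = np.sum(scores ** 2 / scores.var(axis=0, ddof=1), axis=1)
limit = np.percentile(t2, 95)

# A new batch with a shifted peak should exceed the limit.
new = np.exp(-((t - 0.55) / 0.1) ** 2) + rng.normal(0, 0.02, 50)
t2_new = np.sum(pca.transform(new[None, :]) ** 2 / scores.var(axis=0, ddof=1))
print(f"T2 of new batch = {t2_new:.1f}, 95% limit = {limit:.1f}")
```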

4.
A mathematical model for the growth process of the bacterium Bacillus subtilis is described. The model is highly structured. The driving motivation for developing the model, and for explicitly accounting for the major interactions of metabolic networks in it, is our eventual goal of using the model in the analysis of complex biological patterns. Bacillus subtilis was chosen for our study because of the interesting sporulation process that these cells undergo in response to adverse environmental conditions, including nutrient limitation. The sporulation process in B. subtilis represents a primordial prototype of cellular differentiation in higher cellular systems; thus a model of the B. subtilis growth process should prove extremely useful for understanding questions of developmental biology. The model is capable of simulating the transition between the exponential and stationary phases of growth in a batch culture. Since the growth process and the metabolism become decoupled during the transition period and many transient processes take place, such predictions are a severe test of the validity of any model. A strategy to examine the leading hypothesis on B. subtilis sporulation, which implicates GTP as a component that signals sporulation initiation, is described.
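
A heavily simplified, hypothetical sketch of the GTP-signaling idea mentioned above follows: a toy batch-growth ODE in which the GTP pool tracks the growth rate, so that nutrient exhaustion lets GTP fall below a sporulation-initiation threshold. It is not the published structured model; all kinetics and thresholds are invented.

```python
# Toy illustration: falling GTP pool as a sporulation-initiation signal.
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y, mu_max=1.0, Ks=0.2, Yxs=0.5, k_gtp=2.0, d_gtp=1.5):
    X, S, G = y                        # biomass, nutrient, relative GTP pool
    mu = mu_max * S / (Ks + S)
    dX = mu * X
    dS = -dX / Yxs
    dG = k_gtp * mu - d_gtp * G        # GTP synthesis tracks growth rate
    return [dX, dS, dG]

sol = solve_ivp(rhs, (0, 12), [0.05, 5.0, 1.0], dense_output=True, rtol=1e-8)
t = np.linspace(0, 12, 200)
X, S, G = sol.sol(t)

# Sporulation initiation flagged when the GTP pool drops below a threshold.
threshold = 0.3
idx = np.argmax(G < threshold)
print(f"GTP falls below {threshold} at t = {t[idx]:.1f} h (nutrient = {S[idx]:.2f})")
```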

5.
The development of an organism represents a complex dynamic process, which is controlled by a network of genes and multiple environmental factors. Programmed cell death (PCD), a physiological cell suicide process, occurs during the development of most organisms and is, typically, a complex dynamic trait. Understanding how genes control this complex developmental process has been a long-standing topic in PCD studies. In this article, we propose a nonparametric model, based on orthogonal Legendre polynomials, to map genes or quantitative trait loci (QTLs) that govern the dynamic features of the PCD process. The model is built under the maximum likelihood-based functional mapping framework and is implemented with the EM algorithm. A general information criterion is proposed for selecting the optimal Legendre order that best fits the dynamic pattern of the PCD process. The consistency of the order selection criterion is established. A nonstationary structured antedependence model (SAD) is applied to model the covariance structure among the phenotypes measured at different time points. The developed model generates a number of hypothesis tests regarding the genetic control mechanism of the PCD process. Extensive simulation studies are conducted to investigate the statistical behavior of the model. Finally, we apply the model to a rice tiller number data set in which several QTLs are identified. The developed model provides a quantitative and testable framework for assessing the interplay between genes and the developmental PCD process, and will have great implications for elucidating the genetic architecture of the PCD process.
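
The order-selection step can be illustrated with a small sketch (not the full EM-based functional mapping machinery): fit a synthetic dynamic trait with Legendre polynomials of increasing order and choose the order by an information criterion, here BIC as a stand-in for the criterion proposed in the article.

```python
# Legendre-basis fit of a dynamic trait with order selection by BIC.
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(3)
t = np.linspace(0, 1, 25)                       # rescaled measurement times
y = np.sin(2 * np.pi * t) * np.exp(-t) + rng.normal(0, 0.05, t.size)

# Map t onto [-1, 1], the natural domain of Legendre polynomials.
x = 2 * t - 1

def bic_for_order(order):
    coef = legendre.legfit(x, y, order)
    resid = y - legendre.legval(x, coef)
    n, k = y.size, order + 1
    sigma2 = np.mean(resid ** 2)
    return n * np.log(sigma2) + k * np.log(n)

orders = range(1, 8)
bics = [bic_for_order(r) for r in orders]
best = orders[int(np.argmin(bics))]
print("selected Legendre order:", best)
```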

6.
Cytotherapy, 2021, 23(10): 953-959
Background aims: This article describes the development of a small-scale model for Ficoll-based cell separation as part of process development of an advanced therapy medicinal product, and its qualification. Because of the complexity of biological products, their manufacturing process as well as their characterization and control need to be accurately understood. Likewise, scale-down models serve as an indispensable tool for process development, characterization, optimization, and validation. This scale-down model represents a cell processor device widely used in advanced therapies. The approach is intended to optimize resources and to focus their use on process characterization studies under the quality by design paradigm. A scale-down model should reflect the large manufacturing scale; consequently, this simplified system should offer a high degree of control over the process parameters to provide a robust model, even considering the process limitations. For this reason, a model should be developed and qualified for the intended purpose. Methods: Process operating parameters were studied, and their resulting performance at full scale was used as a baseline to guide scale-down model development. Once the model was established, comparability runs were performed under standard operating conditions with bone marrow samples. These analyses showed consistency between the bench and the large scale. Additionally, statistical analyses were employed to demonstrate equivalence. Results: The process performance indicators and assessed quality attributes were equivalent and fell within the acceptance ranges defined for the large-scale process. Conclusions: This scale-down model is suitable for use in process characterization studies.
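
The equivalence analysis mentioned under Methods can be illustrated with a two one-sided tests (TOST) sketch on synthetic data; the attribute, sample sizes, and margin below are hypothetical and not the study's acceptance criteria.

```python
# TOST equivalence sketch comparing a bench-scale model with full-scale runs.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
full_scale = rng.normal(85.0, 2.0, 12)   # e.g., % cell recovery at large scale
bench      = rng.normal(84.5, 2.0, 10)   # same attribute in the bench model
margin = 5.0                             # pre-defined equivalence margin

# TOST: reject both one-sided nulls that the mean difference lies outside +/- margin.
p_lower = stats.ttest_ind(bench + margin, full_scale, alternative='greater').pvalue
p_upper = stats.ttest_ind(bench - margin, full_scale, alternative='less').pvalue
p_tost = max(p_lower, p_upper)
print(f"TOST p-value = {p_tost:.4f} -> "
      f"{'equivalent' if p_tost < 0.05 else 'not shown equivalent'}")
```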

7.
8.
9.
In this work we propose a model that simultaneously optimizes the process variables and the structure of a multiproduct batch plant for the production of recombinant proteins. The complete model includes process performance models for the unit stages and a posynomial representation for the multiproduct batch plant. Although the constant time and size factor models are the most commonly used to model multiproduct batch processes, process performance models describe these time and size factors as functions of the process variables selected for optimization. These process performance models are expressed as algebraic equations obtained from the analytical integration of simplified mass balances and kinetic expressions that describe each unit operation. They are kept as simple as possible while retaining the influence of the process variables selected to optimize the plant. The resulting mixed-integer nonlinear program simultaneously calculates the plant structure (parallel units in or out of phase, and allocation of intermediate storage tanks), the batch plant decision variables (equipment sizes, batch sizes, and operating times of semicontinuous items), and the process decision variables (e.g., final concentration at selected stages, volumetric ratio of phases in the liquid-liquid extraction). A noteworthy feature of the proposed approach is that the mathematical model for the plant is the same as that used in the constant factor model. The process performance models are handled as extra constraints. A plant consisting of eight stages operating in the single product campaign mode (one fermentation, two microfiltrations, two ultrafiltrations, one homogenization, one liquid-liquid extraction, and one chromatography) for producing four different recombinant proteins by the genetically engineered yeast Saccharomyces cerevisiae was modeled and optimized. Using this example, it is shown that the presence of additional degrees of freedom introduced by the process performance models, with respect to a fixed size and time factor model, represents an important development in improving plant design.
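
A minimal sketch of the underlying constant size/time factor design problem is shown below, with continuous variables only; the integer decisions on parallel units and intermediate storage, and the process performance constraints, are omitted, and all numbers are invented.

```python
# Simplified multiproduct batch plant sizing (constant size/time factors, NLP only).
import numpy as np
from scipy.optimize import minimize

S = np.array([[2.0, 3.0, 1.5],      # size factors S[i, j] (L per kg of product i at stage j)
              [2.5, 1.8, 2.2]])
t = np.array([[6.0, 8.0, 4.0],      # processing times t[i, j] (h)
              [5.0, 7.0, 6.0]])
Q = np.array([4.0e5, 3.0e5])        # annual demand per product (kg)
H = 6000.0                          # available production time (h)
a, b = 500.0, 0.6                   # unit cost = a * V**b

n_prod, n_stage = S.shape

def unpack(x):
    return x[:n_stage], x[n_stage:]          # vessel volumes V, batch sizes B

def cost(x):
    V, _ = unpack(x)
    return np.sum(a * V ** b)

def constraints(x):
    V, B = unpack(x)
    size_ok = (V[None, :] - S * B[:, None]).ravel()      # V_j >= S_ij * B_i
    T_L = t.max(axis=1)                                  # limiting cycle times
    time_ok = H - np.sum(Q / B * T_L)                    # horizon constraint
    return np.concatenate([size_ok, [time_ok]])

x0 = np.concatenate([np.full(n_stage, 5000.0), np.full(n_prod, 1500.0)])
res = minimize(cost, x0, method='SLSQP',
               constraints={'type': 'ineq', 'fun': constraints},
               bounds=[(100.0, 20000.0)] * n_stage + [(100.0, 10000.0)] * n_prod)
V_opt, B_opt = unpack(res.x)
print("vessel volumes:", V_opt.round(0), "batch sizes:", B_opt.round(0))
```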

10.
Thanks to their different senses, human observers acquire multiple streams of information from their environment. Complex cross-modal interactions occur during this perceptual process. This article proposes a framework to analyze and model these interactions through a rigorous and systematic data-driven process. This requires considering the general relationships between the physical events or factors involved in the process, not only in quantitative terms but also in terms of the influence of one factor on another. We use tools from information theory and probabilistic reasoning to derive relationships between the random variables of interest, where the central notion is that of conditional independence. Using mutual information analysis to guide the model elicitation process, a probabilistic causal model encoded as a Bayesian network is obtained. We exemplify the method using data collected in an audio-visual localization task for human subjects, and we show that it yields a well-motivated model with good predictive ability. The model elicitation process offers new prospects for the investigation of the cognitive mechanisms of multisensory perception.
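
The role of conditional independence in model elicitation can be illustrated with a small sketch: on a synthetic causal chain, the mutual information between the end variables is positive, while their conditional mutual information given the intermediate variable is near zero. This is not the authors' audio-visual data or their full Bayesian-network pipeline.

```python
# Mutual information and conditional independence on a toy causal chain.
import numpy as np
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(5)
n = 5000

# Toy chain: stimulus cue A -> perceived location L -> response R.
A = rng.integers(0, 3, n)
L = (A + rng.integers(0, 2, n)) % 3
R = (L + rng.integers(0, 2, n)) % 3

def conditional_mi(x, y, z):
    # I(X; Y | Z) estimated by averaging MI within each stratum of Z.
    return sum(mutual_info_score(x[z == v], y[z == v]) * np.mean(z == v)
               for v in np.unique(z))

print("I(A; R)     =", round(mutual_info_score(A, R), 3))
print("I(A; R | L) =", round(conditional_mi(A, R, L), 3))   # near 0: A independent of R given L
```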

11.
Background: Cost-effective production of lignocellulosic biofuels remains a major financial and technical challenge at the industrial scale. A critical tool in biofuels process development is the techno-economic (TE) model, which calculates biofuel production costs using a process model and an economic model. The process model solves mass and energy balances for each unit, and the economic model estimates capital and operating costs from the process model based on economic assumptions. The process model inputs include experimental data on the feedstock composition and intermediate product yields for each unit. These experimental yield data are calculated from primary measurements. Uncertainty in these primary measurements is propagated to the calculated yields, to the process model, and ultimately to the economic model. Thus, outputs of the TE model have a minimum uncertainty associated with the uncertainty in the primary measurements. Results: We calculate the uncertainty in the Minimum Ethanol Selling Price (MESP) estimate for lignocellulosic ethanol production via a biochemical conversion process: dilute sulfuric acid pretreatment of corn stover followed by enzymatic hydrolysis and co-fermentation of the resulting sugars to ethanol. We perform a sensitivity analysis on the TE model and identify the feedstock composition and conversion yields from three unit operations (xylose from pretreatment, glucose from enzymatic hydrolysis, and ethanol from fermentation) as the most important variables. The uncertainty in the pretreatment xylose yield arises from multiple measurements, whereas the glucose and ethanol yields from enzymatic hydrolysis and fermentation, respectively, are dominated by a single measurement: the fraction of insoluble solids (fIS) in the biomass slurries. Conclusions: We calculate a $0.15/gal uncertainty in MESP from the TE model due to uncertainties in primary measurements. This result sets a lower bound on the error bars of the TE model predictions. This analysis highlights the primary measurements that merit further development to reduce the uncertainty associated with their use in TE models. While we develop and apply this mathematical framework to a specific biorefinery scenario here, this analysis can be readily adapted to other types of biorefining processes and provides a general framework for propagating uncertainty due to analytical measurements through a TE model.
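
A minimal sketch of propagating primary-measurement uncertainty by Monte Carlo is shown below; the surrogate "MESP" equation and the distributions are invented and far simpler than the actual TE model.

```python
# Monte Carlo propagation of measurement uncertainty through a toy cost calculation.
import numpy as np

rng = np.random.default_rng(6)
n = 100_000

# Uncertain primary measurements (means and standard deviations are hypothetical).
xylose_yield  = rng.normal(0.75, 0.02, n)   # pretreatment xylose yield
glucose_yield = rng.normal(0.85, 0.015, n)  # enzymatic-hydrolysis glucose yield
ethanol_yield = rng.normal(0.90, 0.01, n)   # fermentation ethanol yield

# Toy "MESP" surrogate: fixed costs divided by total ethanol produced.
feed_sugar = 1.0                                             # normalized sugar input
ethanol = feed_sugar * (0.4 * xylose_yield + 0.6 * glucose_yield) * ethanol_yield
mesp = 2.0 / ethanol                                         # $/gal surrogate

print(f"MESP spread due to measurement uncertainty: +/- {mesp.std():.3f} $/gal")
```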

12.
Summary: We estimate the parameters of a stochastic process model for a macroparasite population within a host using approximate Bayesian computation (ABC). The immunity of the host is an unobserved model variable, and only mature macroparasites at sacrifice of the host are counted. With very limited data, process rates are inferred reasonably precisely. Modeling involves a three-variable Markov process for which the observed data likelihood is computationally intractable. ABC methods are particularly useful when the likelihood is analytically or computationally intractable. The ABC algorithm we present is based on sequential Monte Carlo, is adaptive in nature, and overcomes some drawbacks of previous approaches to ABC. The algorithm is validated on a test example involving simulated data from an autologistic model before being used to infer parameters of the Markov process model for experimental data. The fitted model explains the observed extra-binomial variation in terms of a zero-one immunity variable, which has a short-lived presence in the host.
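
A minimal sketch of the ABC idea follows, using plain rejection sampling rather than the adaptive sequential Monte Carlo algorithm of the article, and a toy one-parameter death process instead of the three-variable Markov model; the data are simulated.

```python
# ABC rejection sketch: infer a death rate from a single count at host sacrifice.
import numpy as np

rng = np.random.default_rng(7)

n0, t_obs = 200, 5.0                       # initial parasites, observation time
true_rate = 0.3
observed = rng.binomial(n0, np.exp(-true_rate * t_obs))   # survivors at sacrifice

def simulate(rate):
    # Each parasite independently survives to t_obs with probability exp(-rate * t).
    return rng.binomial(n0, np.exp(-rate * t_obs))

# ABC rejection: keep prior draws whose simulated count is close to the data.
prior = rng.uniform(0.0, 1.0, 50_000)
accepted = np.array([r for r in prior if abs(simulate(r) - observed) <= 3])

print(f"posterior mean rate = {accepted.mean():.3f} (true {true_rate})")
```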

13.
We have previously shown the usefulness of historical data for fermentation process optimization. The methodology developed includes identification of important process inputs, training of an artificial neural network (ANN) process model, and ultimately use of the ANN model with a genetic algorithm to find the optimal values of each critical process input. However, this approach ignores the time-dependent nature of the system and therefore does not fully utilize the information available within a database. In this work, we propose a method for incorporating time-dependent optimization into our previously developed three-step optimization routine. This is achieved by an additional step that uses a fermentation model (consisting of coupled ordinary differential equations (ODEs)) to interpret important time-course features of the collected data through adjustments in model parameters. Important process variables not explicitly included in the model were then identified for each model parameter using automatic relevance determination (ARD) with Gaussian process (GP) models. The developed GP models were then combined with the fermentation model to form a hybrid neural network model that predicted the time courses of cell activity and protein concentration for novel fermentation conditions. A hybrid genetic algorithm was then used in conjunction with the hybrid model to suggest optimal time-dependent control strategies. The presented method was applied to an E. coli fermentation database generated in our laboratory. Optimization of two different criteria (final protein yield and a simplified economic criterion) was attempted. While the overall protein yield was not increased using this methodology, we were successful in increasing the simplified economic criterion by 15% compared with what had been previously observed. The corresponding process conditions used 35% less arabinose (the inducer) and 33% less tryptone in the medium, and reduced the time required to reach the maximum protein concentration by 10%, while producing approximately the same level of protein as the previous optimum.
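
The ARD step can be illustrated with a short sketch: a Gaussian process with per-input length scales is fit to synthetic data, and the fitted length scales indicate which inputs are relevant. It is not the authors' fermentation data or their full hybrid-model and genetic-algorithm workflow.

```python
# Automatic relevance determination via per-input GP length scales.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(8)
X = rng.uniform(-1, 1, (120, 3))           # three candidate process variables
# Response depends on the first two inputs only; the third is irrelevant.
y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1] + rng.normal(0, 0.05, 120)

kernel = RBF(length_scale=[1.0, 1.0, 1.0]) + WhiteKernel(noise_level=0.01)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

# A large fitted length scale means the corresponding input is (nearly) irrelevant.
print("fitted ARD length scales:", gp.kernel_.k1.length_scale.round(2))
```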

14.
In many longitudinal studies, the individual characteristics associated with the repeated measures may be potential covariates of the time to an event of interest, and it is thus desirable to model the time-to-event process and the longitudinal process jointly. Statistical analyses may be further complicated in such studies by missing data, such as informative dropouts. This article considers a nonlinear mixed-effects model for the longitudinal process and the Cox proportional hazards model for the time-to-event process. We provide a method for simultaneous likelihood inference on the two models and allow for nonignorable missing data. The approach is illustrated with a recent AIDS study by jointly modeling HIV viral dynamics and time to viral rebound.
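
A minimal sketch of the simpler two-stage approximation to joint modeling is given below (the article performs simultaneous likelihood inference, which this does not reproduce): subject-specific longitudinal slopes are estimated first and then used as a covariate in a parametric exponential proportional-hazards fit on synthetic data.

```python
# Two-stage approximation to joint longitudinal / time-to-event modeling.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(9)
n = 150
times = np.linspace(0, 2, 6)                      # longitudinal visit times

# Simulate subject-specific viral-decay slopes and event/censoring times.
slopes = rng.normal(-1.0, 0.4, n)
viral = 5.0 + slopes[:, None] * times + rng.normal(0, 0.1, (n, times.size))
hazard = np.exp(0.5 + 1.5 * slopes)               # flatter decay -> higher hazard
T = rng.exponential(1.0 / hazard)
C = rng.exponential(3.0, n)                       # censoring times
obs_time, event = np.minimum(T, C), (T <= C).astype(float)

# Stage 1: estimate each subject's slope from the longitudinal measurements.
est_slopes = np.polyfit(times, viral.T, 1)[0]

# Stage 2: exponential proportional hazards, hazard_i = exp(b0 + b1 * slope_i).
def neg_loglik(beta):
    lam = np.exp(beta[0] + beta[1] * est_slopes)
    return -np.sum(event * np.log(lam) - lam * obs_time)

fit = minimize(neg_loglik, x0=[0.0, 0.0])
print("estimated (b0, b1):", fit.x.round(2), " true (0.5, 1.5)")
```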

15.
16.
The objective of process characterization is to demonstrate the robustness of manufacturing processes by understanding the relationship between key operating parameters and final performance. Technical information from the characterization study is important for subsequent process validation, and this has become a regulatory expectation in recent years. Since performing the study at the manufacturing scale is not practically feasible, development of scale-down models that represent the performance of the commercial process is essential to achieve reliable process characterization. In this study, we describe a systematic approach to develop a bioreactor scale-down model and to characterize a cell culture process for recombinant protein production in CHO cells. First, a scale-down model using 2-L bioreactors was developed on the basis of the 2000-L commercial-scale process. Profiles of cell growth, productivity, product quality, culture environment (pH, DO, pCO2), and metabolite levels (glucose, glutamine, lactate, ammonia) were compared between the two scales to qualify the scale-down model. The key operating parameters were then characterized in single-parameter ranging studies and an interaction study using this scale-down model. Appropriate operating ranges and acceptance criteria for certain key parameters were determined to ensure the success of process validation and the consistency of process performance. The process worst-case condition was also identified through the interaction study.
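
A minimal sketch of analyzing a replicated two-factor interaction study of the kind described above is shown below; the factors, response, and effect sizes are synthetic, not the article's data.

```python
# Two-factor interaction analysis for a scale-down characterization study.
import numpy as np

rng = np.random.default_rng(10)

# Coded levels (-1/+1) for, e.g., pH and temperature in a replicated 2^2 design.
design = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]])
X = np.repeat(design, 3, axis=0)                     # three replicates per corner
ph, temp = X[:, 0], X[:, 1]

# Hypothetical titer response with a mild pH x temperature interaction.
titer = 2.0 + 0.30 * ph + 0.15 * temp - 0.10 * ph * temp + rng.normal(0, 0.05, len(X))

# Ordinary least squares with main effects and the interaction term.
A = np.column_stack([np.ones(len(X)), ph, temp, ph * temp])
coef, *_ = np.linalg.lstsq(A, titer, rcond=None)
for name, c in zip(["intercept", "pH", "temperature", "pH x temperature"], coef):
    print(f"{name:18s} {c:+.3f}")
```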

17.
In manufacturing monoclonal antibodies (mAbs), it is crucial to be able to predict how process conditions and supplements affect productivity and quality attributes, especially glycosylation. Supplemental inputs, such as amino acids and trace metals in the media, are reported to affect cell metabolism and glycosylation; quantifying their effects is essential for effective process development. We aim to present and validate, through a commercially relevant cell culture process, a technique for modeling such effects efficiently. While existing models can predict mAb production or glycosylation dynamics under specific process configurations, adapting them to new processes remains challenging because it involves modifying the model structure and often requires some mechanistic understanding. Here, a modular modeling technique is presented for adapting an existing model to a fed-batch Chinese hamster ovary (CHO) cell culture process without structural modifications or mechanistic insight. Instead, data obtained from designed experimental perturbations in media supplementation are used to train and validate a supplemental input effect model, which is used to "patch" the existing model. The combined model can be used for model-based process development to improve productivity and to meet product quality targets more efficiently. The methodology and analysis are generally applicable to other CHO cell lines and cell types.
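
A minimal sketch of the "patching" idea is shown below: a hypothetical base model is corrected by a multiplicative supplemental-effect term fitted to designed perturbation data. The base model, supplement, effect shape, and data are all invented.

```python
# Patching an existing process model with a data-driven supplemental-effect term.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(12)

def base_model_titer(feed_rate):
    # Pre-existing model: predicted titer (g/L) as a function of feed rate only.
    return 2.0 + 0.8 * feed_rate

# Designed perturbations: feed rate and a supplement (e.g., a trace metal, mM).
feed = rng.uniform(0.5, 2.0, 24)
supp = rng.uniform(0.0, 1.0, 24)
observed = base_model_titer(feed) * (1.0 + 0.3 * supp / (0.2 + supp)) + rng.normal(0, 0.05, 24)

# Supplemental effect model: a multiplicative saturating correction factor.
def effect(supp, a, k):
    return 1.0 + a * supp / (k + supp)

ratio = observed / base_model_titer(feed)          # what the base model misses
(a_hat, k_hat), _ = curve_fit(effect, supp, ratio, p0=[0.1, 0.5])

def patched_model(feed_rate, supplement):
    return base_model_titer(feed_rate) * effect(supplement, a_hat, k_hat)

print(f"fitted effect parameters: a = {a_hat:.2f}, k = {k_hat:.2f}")
print("patched prediction at feed 1.5, supplement 0.5:", round(patched_model(1.5, 0.5), 2))
```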

18.
19.
Process understanding is emphasized in the process analytical technology initiative and the quality by design paradigm as essential for manufacturing biopharmaceutical products with consistently high quality. A typical approach to developing process understanding is to apply design of experiments combined with statistical data analysis. Hybrid semi-parametric modeling is investigated here as an alternative to purely statistical data analysis. The hybrid model framework provides the flexibility to select model complexity based on the available data and knowledge. Here, a parametric dynamic bioreactor model is integrated with a nonparametric artificial neural network that describes biomass and product formation rates as functions of varied fed-batch fermentation conditions for high-cell-density heterologous protein production with E. coli. Our model can accurately describe biomass growth and product formation across variations in induction temperature, pH, and feed rates. The model indicates that while the product expression rate is a function of early induction-phase conditions, it is negatively impacted as productivity increases, which could correspond to physiological changes due to cytoplasmic product accumulation. Because of the dynamic nature of the model, rational process timing decisions can be made and the impact of temporal variations in process parameters on product formation and process performance can be assessed, which is central to process understanding.
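
A minimal sketch of the hybrid semi-parametric structure described above follows: the parametric mass balance dX/dt = mu * X is kept, while mu is learned nonparametrically from data as a function of an induction condition and time. The data, conditions, and network are synthetic and hypothetical.

```python
# Hybrid semi-parametric sketch: parametric mass balance + learned specific rate.
import numpy as np
from scipy.integrate import solve_ivp
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(11)
t = np.linspace(0, 10, 21)

def true_mu(temp, time):
    # Hidden "true" rate law used only to generate training data.
    return 0.4 * np.exp(-0.5 * (temp - 32.0) ** 2 / 4.0) * np.exp(-0.1 * time)

# Simulated biomass curves at three induction temperatures.
rows, targets = [], []
for T in [28.0, 32.0, 36.0]:
    mu = true_mu(T, t)
    X = 0.5 * np.exp(np.cumsum(np.insert(0.5 * (mu[1:] + mu[:-1]) * np.diff(t), 0, 0.0)))
    X *= np.exp(rng.normal(0, 0.01, t.size))
    rows.append(np.column_stack([np.full(t.size, T), t]))
    targets.append(np.gradient(np.log(X), t))    # nonparametric rate estimate

nn = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=8000, random_state=0)
nn.fit(np.vstack(rows), np.concatenate(targets))

# Plug the learned rate into the parametric balance and predict a new condition.
def rhs(time, y, temp):
    return nn.predict([[temp, time]])[0] * y

sol = solve_ivp(rhs, (0, 10), [0.5], args=(30.0,), t_eval=t)
print("predicted biomass at 30 C, t = 10 h:", round(sol.y[0, -1], 2))
```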

20.