Similar documents
20 similar documents found (search time: 31 ms)
1.
Bioprocess development and optimization is a challenging, costly, and time-consuming effort. In this multidisciplinary task, upstream processing (USP) and downstream processing (DSP) are conventionally considered distinct disciplines. This separation fosters “one-way” optimization that disregards interdependencies between unit operations; thus, the full potential of the process chain cannot be realized. It is therefore necessary to fully integrate USP and DSP process development to arrive at balanced biotechnological production processes. The aim of the present study was to investigate how different host/secretory signal/antigen binding fragment (Fab) combinations in E. coli expression systems influence USP, primary recovery performance, and final product quality. We ran identical fed-batch cultivations with 16 different expression clones to study growth and product formation kinetics, as well as centrifugation efficiency, viscosity, extracellular DNA, and endotoxin content, all of which are important parameters in DSP. We observed a severe influence on cell growth, product titer, extracellular product, and cell lysis, accompanied by a significant impact on the analyzed parameters of DSP performance. Our results provide the basis for future research on integrated process development that considers interdependencies between USP and DSP; however, individual products need to be considered specifically. These interdependencies must be understood for rational decision-making and efficient process development in research and industry.

2.
Quality by Design (QbD) is gaining industry acceptance as an approach to the development and commercialization of biotechnology therapeutic products expressed via microbial or mammalian cell lines. In QbD, the process is designed and controlled to deliver specified quality attributes consistently. Acquiring the enhanced understanding necessary to achieve this, however, requires more extensive experimentation to establish the design space for the process and the product. With biotechnology companies operating under ever-increasing pressure to lower the cost of manufacturing, the use of high-throughput tools has emerged as a necessary enabler of QbD in a time- and resource-constrained environment. We review this topic for those in academia and industry who are engaged in drug substance process development.

3.
The anticipated increase in demand for inactivated polio vaccines resulting from the success of the polio eradication program requires an increase in production capacity and a reduction in the cost price of the current inactivated polio vaccine production processes. Improvement of existing production processes is necessary, as the initial process development was done decades ago. An up-to-date lab-scale version of the legacy inactivated polio vaccine production process was set up. This lab-scale version should be representative of the large scale, that is, a scale-down model, to allow process optimization experiments whose results can be readily applied. Initially, the separate unit operations were scaled down at setpoint. Subsequently, the unit operations were applied successively and compared to large-scale manufacturing. This allows assessment, at small scale, of the effects of changes in one unit operation on the consecutive units. Challenges in translating large-scale operations to lab scale are discussed, and the concessions that needed to be made are described. The current scale-down model for cell and virus culture (2.3-L) is a feasible model of its production-scale counterpart (750-L) when operated at setpoint. Likewise, the current scale-down models for the DSP unit operations clarification, concentration, size exclusion chromatography, ion exchange chromatography, and inactivation are in agreement with the manufacturing scale. The small-scale units can be used separately, as well as sequentially, to study variations and critical product quality attributes in the production process. Finally, it is shown that the scale-down unit operations can be used consecutively to prepare trivalent vaccine at lab scale with characteristics comparable to the product produced at manufacturing scale. Biotechnol. Bioeng. 2013; 110: 1354–1365. © 2012 Wiley Periodicals, Inc.

4.
A typical biotech process starts with the vial of the cell bank, ends with the final product, and has anywhere from 15 to 30 unit operations in series. The total number of process variables (input and output parameters) and other variables (raw materials) can add up to several hundred. As the manufacturing process is widely accepted to have a significant impact on the quality of the product, the regulatory agencies require an assessment of process comparability across different phases of manufacturing (Phase I vs. Phase II vs. Phase III vs. Commercial) as well as during other key activities in product commercialization (process scale-up, technology transfer, and process improvement). However, assessing comparability for a process with such a large number of variables is nontrivial, and companies often resort to qualitative comparisons. In this article, we present a quantitative approach for assessing process comparability via the use of chemometrics. To our knowledge this is the first time that such an approach has been published for biotech processing. The approach has been applied to an industrial case study involving evaluation of two processes that are being used for commercial manufacturing of a major biosimilar product. It has been demonstrated that the proposed approach is able to successfully identify the unit operations in the two processes that are operating differently. We expect this approach, which can also be applied toward assessing product comparability, to be of great use to both regulators and industry, which otherwise struggle to assess comparability.
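As a minimal sketch of the kind of chemometric comparison described above (synthetic data; the variable set, shift, and out-of-cloud criterion are illustrative, not from the study), batches from two processes can be projected into a principal-component space fitted on the reference process, and batches falling outside the reference score cloud flag a step operating differently:

```python
import numpy as np

rng = np.random.default_rng(0)
# Rows = batches, columns = process outputs (e.g., titer, purity, yield).
process_a = rng.normal(loc=[7.0, 5.0, 0.9], scale=0.05, size=(20, 3))  # reference
process_b = rng.normal(loc=[7.0, 5.4, 0.9], scale=0.05, size=(20, 3))  # variable 2 shifted

# Fit PCA on the reference process: mean-center, scale to unit variance, SVD.
mu, sd = process_a.mean(axis=0), process_a.std(axis=0)
za = (process_a - mu) / sd
_, _, vt = np.linalg.svd(za, full_matrices=False)

# Project both processes onto the reference loadings.
scores_a = za @ vt.T
scores_b = ((process_b - mu) / sd) @ vt.T

# Batches of B far outside A's score cloud point to a difference between processes.
d_b = np.linalg.norm(scores_b, axis=1)
limit = np.linalg.norm(scores_a, axis=1).max()
print(f"{(d_b > limit).mean():.0%} of process-B batches fall outside the reference cloud")
```

In practice a Hotelling T² or DModX limit would replace the crude maximum-distance threshold used here.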

5.
Fermentanomics is an emerging field of research that involves understanding the underlying controlled process variables and their effect on process yield and product quality. Although major advancements have occurred in process analytics over the past two decades, accurate real-time measurement of significant quality attributes for a biotech product during production culture is still not feasible. Researchers have used an amalgam of process models and analytical measurements for monitoring and process control during production. This article focuses on using multivariate data analysis as a tool for monitoring the internal bioreactor dynamics, the metabolic state of the cell, and the interactions among them during culture. Quality attributes of the monoclonal antibody product that were monitored include the glycosylation profile of the final product, along with process attributes such as viable cell density and level of antibody expression. These were related to process variables, raw material components of the chemically defined hybridoma media, concentrations of metabolites formed during the course of the culture, aeration-related parameters, and supplemented raw materials such as glucose, methionine, threonine, tryptophan, and tyrosine. This article demonstrates the utility of multivariate data analysis for correlating the product quality attributes (especially glycosylation) to process variables and raw materials (especially amino acid supplements in cell culture media). The proposed approach can be applied for process optimization to increase product expression, improve consistency of product quality, and target the desired quality attribute profile. © 2015 American Institute of Chemical Engineers Biotechnol. Prog., 31:1586–1599, 2015

6.
Biotech unit operations are often characterized by a large number of inputs (operational parameters) and outputs (performance parameters) along with complex correlations among them. A typical biotech process starts with the vial of the cell bank, ends with the final product, and has anywhere from 15 to 30 such unit operations in series. Besides the above-mentioned operational parameters, raw material attributes can also impact process performance and product quality, as well as interact with each other. Multivariate data analysis (MVDA) offers an effective approach to gathering process understanding from such complex datasets. A review of the literature suggests that the use of MVDA is rapidly increasing, fuelled by the gradual acceptance of quality by design (QbD) and process analytical technology (PAT) among the regulators and the biotech industry. Implementation of QbD and PAT requires enhanced process and product understanding. In this article, we first discuss the most critical issues that a practitioner needs to be aware of while performing MVDA of bioprocessing data. Next, we present a step-by-step procedure for performing such analysis. Industrial case studies are used to elucidate the various underlying concepts. With the increasing usage of MVDA, we hope that this article will be a useful resource for present and future practitioners of MVDA. © 2014 American Institute of Chemical Engineers Biotechnol. Prog., 30:967–973, 2014
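To make the flavor of such an MVDA workflow concrete, the following is a minimal one-component PLS (NIPALS-style) sketch on synthetic data. It illustrates the generic steps (scale, fit, inspect weights), not the article's own procedure; the data and the notion that only two parameters matter are made up:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic "process data": 100 batches x 5 operational parameters, where
# only parameters 0 and 2 actually drive the performance output y.
X = rng.normal(size=(100, 5))
y = 2.0 * X[:, 0] - 1.0 * X[:, 2] + rng.normal(scale=0.1, size=100)

# Step 1: mean-center and scale to unit variance (essential in MVDA).
Xc = (X - X.mean(axis=0)) / X.std(axis=0)
yc = (y - y.mean()) / y.std()

# Step 2: first PLS component: the weight vector maximizes covariance with y.
w = Xc.T @ yc
w /= np.linalg.norm(w)
t_scores = Xc @ w

# Step 3: inspect the weights to rank operational parameters by influence.
influential = np.argsort(-np.abs(w))[:2]
print("most influential parameters:", sorted(influential.tolist()))
```

Real analyses would extract further components by deflation and validate by cross-validation, but the weight-inspection step shown is where process understanding is usually read off.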

7.
The development of a biopharmaceutical production process usually occurs sequentially, and tedious optimization of each individual unit operation is very time-consuming. Here, the conditions established as optimal for one step serve as input for the following step. Yet this strategy does not consider potential interactions between a priori distant process steps and therefore cannot guarantee optimal overall process performance. To overcome these limitations, we established a smart approach to develop and utilize integrated process models using machine learning techniques and genetic algorithms. We evaluated the application of the data-driven models to explore potential efficiency increases and compared them to a conventional development approach for one of our development products. First, we developed a data-driven integrated process model using gradient boosting machines and Gaussian processes as machine learning techniques and a genetic algorithm as a recommendation engine for two downstream unit operations, namely solubilization and refolding. Through projection of the results into our large-scale facility, we predicted a twofold increase in productivity. Second, we extended the model to a three-step model by including the capture chromatography. Here, depending on the baseline process chosen for comparison, we obtained between a 50% and 100% increase in productivity. These data show the successful application of machine learning techniques and optimization algorithms to downstream process development. Finally, our results highlight the importance of considering integrated process models for the whole process chain, including all unit operations.
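A toy version of the surrogate-plus-genetic-algorithm idea can be sketched as follows. The quadratic "surrogate" stands in for a trained model such as a gradient boosting machine, and the two decision variables loosely represent settings of two linked unit operations; all names, numbers, and the interaction structure are illustrative, not the authors':

```python
import numpy as np

rng = np.random.default_rng(2)

def surrogate(x):
    # Stand-in for a trained data-driven model (e.g., gradient boosting):
    # the cross term means the two steps must be tuned JOINTLY, which is
    # exactly what one-step-at-a-time optimization misses.
    a = x[..., 0] - 0.3
    b = x[..., 1] - 0.7
    return -(a ** 2) - (b ** 2) - a * b

pop = rng.uniform(0.0, 1.0, size=(40, 2))         # candidate condition pairs
for _ in range(60):                               # generations
    fitness = surrogate(pop)
    parents = pop[np.argsort(fitness)[-20:]]      # keep the fitter half
    pop = parents[rng.integers(0, 20, size=40)]   # resample parents
    pop = pop + rng.normal(scale=0.05, size=(40, 2))  # mutate
    pop = pop.clip(0.0, 1.0)

best = pop[np.argmax(surrogate(pop))]
print("jointly optimal settings ~", best.round(2))
```

The population converges near the joint optimum of the surrogate; in the integrated-model setting, the genetic algorithm plays this same recommendation-engine role over the chained unit-operation models.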

8.
The concept of "design space" has been proposed in the ICH Q8 guideline and is gaining momentum in its application in the biotech industry. It has been defined as "the multidimensional combination and interaction of input variables (e.g., material attributes) and process parameters that have been demonstrated to provide assurance of quality." This paper presents a stepwise approach for defining the process design space for a biologic product. A case study, involving P. pastoris fermentation, is presented to facilitate this. First, risk analysis via Failure Modes and Effects Analysis (FMEA) is performed to identify parameters for process characterization. Second, small-scale models are created and qualified prior to their use in these experimental studies. Third, studies are designed using Design of Experiments (DOE) so that the data are amenable to use in defining the process design space. Fourth, the studies are executed and the results analyzed to decide on the criticality of the parameters and to establish the process design space. For the application under consideration, it is shown that the fermentation unit operation is very robust, with a wide design space and no critical operating parameters. The approach presented here is not specific to the illustrated case study. It can be extended to other biotech unit operations and processes that can be scaled down and characterized at small scale.
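The characterization studies in the third step can be laid out programmatically. Below is a minimal sketch of a two-level full-factorial design with replicated center points; the factor names and ranges are hypothetical, not those of the P. pastoris case study:

```python
from itertools import product

# Hypothetical characterization factors and (low, high) ranges.
factors = {
    "pH": (5.5, 6.5),
    "temp_C": (28.0, 32.0),
    "DO_percent": (20.0, 40.0),
}

# Two-level full factorial: every corner of the (low, high) cube.
runs = [dict(zip(factors, combo)) for combo in product(*factors.values())]

# Replicated center points to estimate pure error and check for curvature.
center = {name: (lo + hi) / 2 for name, (lo, hi) in factors.items()}
runs += [dict(center) for _ in range(3)]

print(f"{len(runs)} runs: 8 factorial corners + 3 center points")
```

Dedicated DOE software would add fractional, central composite, or optimal designs, but the run-list structure is the same.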

9.
Multi-factorial experimentation is essential in understanding the link between mammalian cell culture conditions and the glycoprotein product of any biomanufacturing process. This understanding is increasingly demanded as bioprocess development is influenced by the Quality by Design paradigm. We have developed a system that allows hundreds of micro-bioreactors to be run in parallel under controlled conditions, enabling factorial experiments of much larger scope than is possible with traditional systems. A high-throughput analytics workflow was also developed using commercially available instruments to obtain product quality information for each cell culture condition. The micro-bioreactor system was tested by executing a factorial experiment varying four process parameters: pH, dissolved oxygen, feed supplement rate, and reduced glutathione level. A total of 180 micro-bioreactors were run for 2 weeks during this DOE experiment to assess this scaled-down micro-bioreactor system as a high-throughput tool for process development. Online measurements of pH, dissolved oxygen, and optical density were complemented by offline measurements of glucose, viability, titer, and product quality. Model accuracy was assessed by regressing the micro-bioreactor results against those obtained in conventional 3 L bioreactors. Excellent agreement was observed between the micro-bioreactor and the bench-top bioreactor. The micro-bioreactor results were further analyzed to link parameter manipulations to process outcomes via leverage plots, and to examine the interactions between process parameters. The results show that feed supplement rate has a significant effect (P < 0.05) on all performance metrics, with higher feed rates resulting in greater cell mass and product titer. Culture pH impacted terminal integrated viable cell concentration, titer, and intact immunoglobulin G titer, with better results obtained at the lower pH set point. The results demonstrate that a micro-scale system can be an excellent model of larger scale systems, while providing data sets broader and deeper than are available by traditional methods. Biotechnol. Bioeng. 2009; 104: 1107–1120. © 2009 Wiley Periodicals, Inc.

10.
Most biotechnology unit operations are complex in nature, with numerous process variables, feed material attributes, and raw material attributes that can have a significant impact on the performance of the process. A design of experiments (DOE)-based approach offers a solution to this conundrum and allows for efficient estimation of the main effects and the interactions with a minimal number of experiments. Numerous publications illustrate the application of DOE to the development of different bioprocessing unit operations. However, a systematic approach for evaluating the different DOE designs and choosing the optimal design for a given application has not yet been published. Through this work we have compared the I-optimal and D-optimal designs to the commonly used central composite and Box–Behnken designs for bioprocess applications. A systematic methodology is proposed for constructing the model and for precise prediction of the responses in three case studies involving some of the commonly used unit operations in downstream processing. Use of the Akaike information criterion for model selection has been examined and found to be suitable for the applications under consideration. © 2013 American Institute of Chemical Engineers Biotechnol. Prog., 30:86–99, 2014
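The Akaike information criterion (AIC) mentioned above trades goodness of fit against model size; for least-squares fits it can be computed as n·ln(RSS/n) + 2k. A self-contained sketch on synthetic DOE-style data (the response surface and noise level are made up for illustration) shows it preferring a model that includes a real interaction term:

```python
import numpy as np

rng = np.random.default_rng(3)
x1, x2 = rng.uniform(-1.0, 1.0, size=(2, 40))   # coded factor levels
# Synthetic response with a genuine two-factor interaction plus noise.
y = 1.0 + 2.0 * x1 - 1.5 * x2 + 3.0 * x1 * x2 + rng.normal(scale=0.1, size=40)

def aic(X, y):
    """AIC for an ordinary least-squares fit: n*ln(RSS/n) + 2k."""
    _, rss, _, _ = np.linalg.lstsq(X, y, rcond=None)
    n, k = X.shape
    return n * np.log(rss[0] / n) + 2 * k

X_main = np.column_stack([np.ones(40), x1, x2])           # main effects only
X_int = np.column_stack([np.ones(40), x1, x2, x1 * x2])   # + interaction
print("AIC, main effects:     ", round(aic(X_main, y), 1))
print("AIC, with interaction: ", round(aic(X_int, y), 1))
```

The model with the lower AIC is preferred; the 2k penalty guards against selecting interaction terms that the design cannot actually support.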

11.
In water treatment processes, the individual unit operations are complex, highly non-linear, and poorly understood. Whilst many models have been developed to improve process understanding, these are rarely in a form easily exploited by the control engineer. Attempts to improve the performance of water treatment works through the application of improved control and measurement have had variable success. This paper discusses investigations into the application of feedback control to the clarification process of a large-scale pilot plant using a streaming current detector (SCD). The application is aimed at maximising the efficiency of the chemical coagulation process. To achieve this, a simple model of the effects of process operating conditions on the SCD measurements must be made.

12.
Multivariate statistical process monitoring (MSPM) is becoming increasingly utilized to further enhance process monitoring in the biopharmaceutical industry. MSPM can play a critical role when there are many measurements and these measurements are highly correlated, as is typical for many biopharmaceutical operations. Specifically, for processes such as cleaning-in-place (CIP) and steaming-in-place (SIP, also known as sterilization-in-place), control systems typically oversee the execution of the cycles, and verification of the outcome is based on offline assays. These offline assays add delays, and corrective actions may require additional setup time. Moreover, this conventional approach does not take interactive effects of process variables into account, and cycle optimization opportunities as well as salient trends in the process may be missed. Therefore, more proactive and holistic online continued verification approaches are desirable. This article demonstrates the application of real-time MSPM to processes such as CIP and SIP with industrial examples. The proposed approach has significant potential for facilitating enhanced continuous verification, improved process understanding, abnormal situation detection, and predictive monitoring, as applied to CIP and SIP operations. © 2014 American Institute of Chemical Engineers Biotechnol. Prog., 30:505–515, 2014

13.
Biotech unit operations are often characterized by a large number of inputs (operational parameters) and outputs (performance parameters) along with complex correlations amongst them. A typical biotech process starts with the vial of the cell bank, ends with the final product, and has anywhere from 15 to 30 such unit operations in series. The aforementioned parameters can impact process performance and product quality and also interact amongst each other. Chemometrics presents one effective approach to gather process understanding from such complex data sets. The increasing use of chemometrics is fuelled by the gradual acceptance of quality by design and process analytical technology amongst the regulators and the biotech industry, which require enhanced process and product understanding. In this article, we review the topic of chemometrics applications in biotech processes with a special focus on recent major developments. Case studies have been used to highlight some of the significant applications.

14.
Downstream sample purification for quality attribute analysis is a significant bottleneck in process development for non-antibody biologics. Multi-step chromatography process train purifications are typically required prior to many critical analytical tests. This prerequisite leads to limited throughput, long lead times to obtain purified product, and significant resource requirements. In this work, immunoaffinity purification technology has been leveraged to achieve single-step affinity purification of two different enzyme biotherapeutics (Fabrazyme® [agalsidase beta] and Enzyme 2) with polyclonal and monoclonal antibodies, respectively, as ligands. Target molecules were rapidly isolated from cell culture harvest in sufficient purity to enable analysis of critical quality attributes (CQAs). Most importantly, this is the first study that demonstrates the application of predictive analytics techniques to predict critical quality attributes of a commercial biologic. The data obtained using the affinity columns were used to generate appropriate models to predict quality attributes that would be obtained after traditional multi-step purification trains. These models empower process development decision-making with drug substance-equivalent product quality information without generation of actual drug substance. Optimization was performed to ensure maximum target recovery and minimal target protein degradation. The methodologies developed for Fabrazyme were successfully reapplied for Enzyme 2, indicating platform opportunities. The impact of the technology is significant, including reductions in time and personnel requirements, rapid product purification, and substantially increased throughput. Applications are discussed, including upstream and downstream process development support to achieve the principles of Quality by Design (QbD) as well as integration with bioprocesses as a process analytical technology (PAT). © 2014 American Institute of Chemical Engineers Biotechnol. Prog., 30:708–717, 2014

15.
In the process analytical technology (PAT) initiative, the application of sensor technology and modeling methods is promoted. The emphasis is on Quality by Design, online monitoring, and closed-loop control, with the general aim of building product quality into manufacturing operations. As a result, online high-throughput process analyzers find increasing application, and therewith large amounts of highly correlated data become available online. In this study, a hybrid chemometric/mathematical modeling method is adopted for data analysis, which is shown to be advantageous over the chemometric techniques commonly used in PAT applications. This methodology was applied to the analysis of process data of Bordetella pertussis cultivations, namely online data of near-infrared (NIR) spectroscopy, pH, temperature, and dissolved oxygen, and off-line data of biomass, glutamate, and lactate concentrations. The hybrid model structure consisted of macroscopic material balance equations in which the specific reaction rates are modeled by nonlinear partial least squares (PLS). This methodology yielded significantly higher statistical confidence than PLS alone, translating into a reduction of mean squared prediction errors (e.g., root mean squared prediction errors for calibration/validation obtained with the hybrid model for the concentrations of lactate: 0.8699/0.7190 mmol/L; glutamate: 0.6057/0.2917 mmol/L; and biomass: 0.0520/0.0283 OD; and with the PLS model for lactate: 1.3549/1.0087 mmol/L; glutamate: 0.7628/0.3504 mmol/L; and biomass: 0.0949/0.0412 OD). Moreover, analysis of loadings and scores revealed that, as with PLS, process features can be extracted by the hybrid method.
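The hybrid structure, macroscopic balances with a data-driven rate sub-model, can be sketched in a few lines. Here the trained PLS sub-model is replaced by a stand-in Monod-type function, and all constants (rates, yield, initial conditions) are illustrative rather than taken from the study:

```python
# Stand-in for the trained data-driven sub-model (PLS in the cited work):
# a Monod-type specific growth rate as a function of substrate level.
def mu_datadriven(substrate):
    return 0.4 * substrate / (0.5 + substrate)   # 1/h

def simulate(x0, s0, hours, dt=0.1, yield_xs=0.5):
    """Euler integration of the macroscopic balances dX/dt = mu*X and
    dS/dt = -(mu/Y_xs)*X, with mu supplied by the data-driven sub-model."""
    x, s = x0, s0
    for _ in range(int(round(hours / dt))):
        mu = mu_datadriven(s)
        dx = mu * x * dt
        ds = -(mu / yield_xs) * x * dt
        x, s = x + dx, max(s + ds, 0.0)
    return x, s

x_end, s_end = simulate(x0=0.1, s0=10.0, hours=24.0)
print(f"final biomass ~ {x_end:.2f} OD, residual substrate ~ {s_end:.3f} mmol/L")
```

The balances enforce mass consistency (here, roughly yield_xs times the consumed substrate ends up as biomass) while the data-driven part captures the kinetics that are hard to model mechanistically; this division of labor is the essence of the hybrid approach.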

16.
In response to the biopharmaceutical industry advancing from traditional batch operation to continuous operation, the Food and Drug Administration (FDA) has published a draft for continuous integrated biomanufacturing. This draft outlines the most important rules for establishing continuous integration. One of these rules is a thorough understanding of mass flows in the process. A computer simulation framework is developed for modeling the residence time distribution (RTD) of integrated continuous downstream processes based on a unit-by-unit modeling approach in which unit operations are simulated one-by-one across the entire processing time, and then combined into an integrated RTD model. The framework allows for easy addition or replacement of new unit operations, as well as quick adjustment of process parameters during evaluation of the RTD model. With this RTD model, the start-up phase to reach steady state can be accelerated, the effects of process disturbances at any stage of the process can be calculated, and virtual tracking of a section of the inlet material throughout the process is possible. A hypothetical biomanufacturing process for an antibody was chosen for showcasing the RTD modeling approach.
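The unit-by-unit idea can be illustrated with ideal building blocks: each unit contributes an RTD, and for units in series the integrated RTD is the convolution of the individual RTDs. The two vessel types and their time constants below are illustrative, not the paper's flowsheet:

```python
import numpy as np

dt = 0.1                                  # time grid spacing, hours
t = np.arange(0.0, 20.0, dt)

def cstr_rtd(tau):
    """RTD of an ideal stirred tank, normalized on the time grid."""
    e = np.exp(-t / tau) / tau
    return e / (e.sum() * dt)

def pfr_rtd(delay):
    """RTD of ideal plug flow: a pure time delay."""
    e = np.zeros_like(t)
    e[round(delay / dt)] = 1.0 / dt
    return e

# Illustrative train: surge vessel -> near-plug-flow column -> small vessel.
units = [cstr_rtd(2.0), pfr_rtd(1.5), cstr_rtd(0.5)]
rtd = units[0]
for e in units[1:]:
    rtd = np.convolve(rtd, e)[: len(t)] * dt   # units in series = convolution

mean_rt = (t * rtd).sum() * dt
print(f"integrated mean residence time ~ {mean_rt:.2f} h")
```

Swapping a unit's RTD or its parameters only changes one entry in `units`, which mirrors the framework's easy add/replace property; disturbance propagation then amounts to convolving an inlet perturbation with the integrated RTD.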

17.
Chinese hamster ovary (CHO) cells are often used to produce therapeutic monoclonal antibodies (mAbs). CHO cells express many host cell proteins (HCPs) required for their growth. Interactions of HCPs with mAbs can sometimes result in co-purification of trace levels of ‘hitchhiker’ HCPs during the manufacturing process. Purified mAb-1 product produced in early stages of process optimization had high HCP levels. In addition, these lots formed delayed-onset particles containing mAb-1 and its heavy chain C-terminal fragments. Studies were performed to determine the cause of the observed particle formation and to optimize the purification for improved HCP clearance. Protease activity and inhibitor stability studies confirmed that an aspartyl protease was responsible for fragmentation of mAb-1 resulting in particle formation. An affinity resin was used to selectively capture aspartyl proteases from the mAb-1 product. Mass spectrometry identified the captured aspartyl protease as CHO cathepsin D. A wash step at high pH with salt and caprylate was implemented during the protein A affinity step to disrupt the HCP–mAb interactions and improve HCP clearance. The product at the end of purification using the optimized process had very low HCP levels, did not contain detectable protease activity, and did not form particles. Spiking of CHO cathepsin D back into mAb-1 product from the optimized process confirmed that it was the cause of the particle formation. This work demonstrated that process optimization focused on removal of HCPs was successful in eliminating particle formation in the final mAb-1 product. © 2015 American Institute of Chemical Engineers Biotechnol. Prog., 31:1360–1369, 2015

18.
Historical manufacturing data can potentially harbor a wealth of information for process optimization and for enhancing efficiency and robustness. To extract useful information, multivariate data analysis (MVDA) using projection methods is often applied. In this contribution, the results obtained from applying MVDA to data from inactivated polio vaccine (IPV) production runs are described. Data from over 50 batches at two different production scales (700-L and 1,500-L) were available. The explorative analysis performed on single unit operations indicated consistent manufacturing. Known outliers (e.g., rejected batches) were identified using principal component analysis (PCA). The source of operational variation was pinpointed to variation of input such as media. Other relevant process parameters were in control and, using this manufacturing data, could not be correlated to product quality attributes. The knowledge gained about the IPV production process, not only from the MVDA but also from digitizing the available historical data, has proven useful for troubleshooting, understanding the limitations of the available data, and identifying opportunities for improvement. Biotechnol. Bioeng. 2010;107: 96–104. © 2010 Wiley Periodicals, Inc.

19.
Process analytical technology (PAT) is an initiative from the US FDA combining analytical and statistical tools to improve manufacturing operations and ensure regulatory compliance. This work describes the use of a continuous monitoring system for a protein refolding reaction to provide consistency in product quality and process performance across batches. A small-scale bioreactor (3 L) is used to understand the impact of aeration on refolding recombinant human vascular endothelial growth factor (rhVEGF) in a reducing environment. A reverse-phase HPLC assay is used to assess product quality. The goal in understanding the oxygen needs of the reaction and its impact on quality is to make a product that is efficiently refolded to its native and active form with minimal oxidative degradation from batch to batch. Because this refolding process is heavily dependent on oxygen, the % dissolved oxygen (DO) profile is explored as a PAT tool to regulate process performance at commercial manufacturing scale. A dynamic gassing-out approach using constant mass transfer (kLa) is used for scale-up of the aeration parameters to manufacturing-scale tanks (2,000 L, 15,000 L). The resulting DO profiles of the refolding reaction show similar trends across scales, and these are analyzed using rpHPLC. The desired product quality attributes are then achieved through alternating air and nitrogen sparging triggered by changes in the monitored DO profile. This approach mitigates the impact of differences in equipment or feedstock components between runs, and is directly in line with the key goal of PAT to “actively manage process variability using a knowledge-based approach.” Biotechnol. Bioeng. 2009; 104: 340–351 © 2009 Wiley Periodicals, Inc.
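The dynamic gassing-out analysis rests on the first-order oxygen transfer model dC/dt = kLa(C_sat - C): after stripping oxygen with nitrogen, the DO recovery under air is recorded, and kLa is the negative slope of ln(C_sat - C) versus time. A sketch on simulated data (the "true" kLa of 12 per hour and the noise level are made up):

```python
import numpy as np

rng = np.random.default_rng(4)
kla_true, c_sat = 12.0, 100.0              # 1/h, % DO at saturation
t = np.linspace(0.0, 0.5, 50)              # hours after switching N2 -> air
do = c_sat * (1.0 - np.exp(-kla_true * t)) + rng.normal(scale=0.5, size=50)

# Fit the log-deficit; points too close to saturation are dropped because
# measurement noise there overwhelms ln(C_sat - C).
mask = do < 0.95 * c_sat
slope, _ = np.polyfit(t[mask], np.log(c_sat - do[mask]), 1)
kla_est = -slope
print(f"estimated kLa ~ {kla_est:.1f} 1/h")
```

Matching the fitted kLa across vessels is the basis of the constant-kLa scale-up of aeration parameters described above.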

20.
Intracellular antibody Fab' fragments periplasmically expressed in Escherichia coli require the release of Fab' from the cells before initial product recovery. This work demonstrates the utility of microscale bioprocessing techniques to evaluate the influence of different cell disruption operations on subsequent solid–liquid separation and product recovery. Initially, the industrial method of Fab' release by thermochemical extraction was established experimentally at the microwell scale and was observed to yield Fab' release consistent with the larger scale process. The influence of two further cell disruption operations, homogenization and sonication, on subsequent Fab' recovery by microfiltration was also examined. The results showed that the heat-extracted cells give better dead-end microfiltration performance in terms of permeate flux and specific cake resistance. In contrast, the cell suspensions prepared by homogenization and sonication showed more efficient product release but lower product purity and poorer microfiltration performance. Once the various microscale methods had been established, the linked sequence was automated on the deck of a laboratory robotic platform and used to show how different conditions during thermochemical extraction impacted the optimal performance of the linked unit operations. The results illustrate the power of microscale techniques to evaluate crucial unit operation interactions in a bioprocess sequence using only microliter volumes of feed. © 2010 American Institute of Chemical Engineers Biotechnol. Prog., 2010
