Similar Literature
20 similar documents found (search time: 31 ms)
1.
This work investigates the insights and understanding that can be deduced from predictive process models for the product quality of a monoclonal antibody, based on designed high-throughput cell culture experiments performed at milliliter (ambr-15®) scale. The investigated process conditions include various media supplements as well as pH and temperature shifts applied during the process. First, principal component analysis (PCA) is used to show the strong correlation among the product quality attributes, including aggregates, fragments, charge variants, and glycans. Then, partial least squares regression (PLS1 and PLS2) is applied to predict the product quality variables from process information, either one by one or simultaneously. The comparison of these two modeling techniques shows that a single (PLS2) model is capable of revealing the interrelationship of the process characteristics to the large set of product quality variables. To show the dynamic evolution of process predictability, separate models are defined at different time points, showing that several product quality attributes are mainly driven by the media composition and can therefore be predicted reasonably well from early in the process, while others are strongly affected by process parameter changes during the process. Finally, coupling the PLS2 models with a genetic algorithm further improves model performance and, most importantly, significantly simplifies the interpretation of the high-dimensional process–product interrelationship. The generally applicable toolset presented in this case study provides a solid basis for decision making and process optimization throughout process development. © 2017 American Institute of Chemical Engineers Biotechnol. Prog., 33:1368–1380, 2017

2.
Abstract

Proteolytic degradation is a serious problem that complicates downstream processing during production of recombinant therapeutic proteins. It can lead to decreased product yield, diminished biological activity, and suboptimal product quality. Proteolytic degradation, or protein truncation, is observed in various expression hosts and is mostly attributed to the activity of proteases released by host cells. Since these clipped proteins can impact pharmacokinetics and immunogenicity in addition to potency, they need to be appropriately controlled to ensure consistency of product quality and patient safety. In this study, a chromatography step was developed for the selective removal of clipped proteins from an intact protein. Poly(ethylenimine)-grafted anion-exchange resins (PolyQUAT and PolyPEI) were evaluated and compared to traditional macroporous anion-exchange and tentacled anion-exchange resins. Isocratic retention experiments were conducted to determine the retention factors (k′), and charge factors (Z) were determined through the classical stoichiometric displacement model. High selectivity in the separation of closely related clipped proteins was obtained with the PolyQUAT resin. A robust design space was established for the PolyQUAT chromatography step through design-of-experiments (DoE)-based process optimization. Results showed a product recovery of up to 63% with purity levels >99.0%. Approximately one-log clearance of host cell protein and two-log clearance of host cell DNA were also obtained. The newly developed PolyQUAT process was compared with an existing process and shown to be superior with respect to the number of process steps, process time, process yield, and product quality.
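The stoichiometric displacement model behind the reported k′ and Z determination states that log k′ falls linearly with the log of the counter-ion concentration, the slope giving the charge factor Z. A minimal fitting sketch, with invented retention data:

```python
import numpy as np

# Stoichiometric displacement model for ion-exchange retention:
#   log10(k') = log10(K) - Z * log10(c_salt)
# Z (charge factor) is minus the slope of log k' vs. log salt concentration.

# Invented isocratic retention data (salt in mM, k' dimensionless).
c_salt = np.array([100.0, 150.0, 200.0, 300.0])
k_prime = np.array([40.0, 12.0, 5.0, 1.5])

slope, intercept = np.polyfit(np.log10(c_salt), np.log10(k_prime), 1)
Z = -slope
print(f"charge factor Z = {Z:.2f}, log10 K = {intercept:.2f}")
```

A larger Z difference between the intact and clipped species is what makes a selective separation like the one reported feasible.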

3.
ABSTRACT

Scientific modeling along with hands-on inquiry can lead to a deeper understanding of scientific concepts among students in upper elementary grades. Even though scientific modeling involves abstract-thinking processes, can students in younger elementary grades successfully participate in scientific modeling? Scientific modeling, like all other aspects of scientific inquiry, has to be developed. This article clearly outlines how students in a first-grade classroom can develop and use scientific models to explain the properties and behaviors of solids, liquids, and gases in a unit on the states of matter.

4.
The publication of the International Conference on Harmonisation (ICH) Q8, Q9, and Q10 guidelines paved the way for the standardization of quality after the Food and Drug Administration issued current Good Manufacturing Practices guidelines in 2003. "Quality by Design", introduced in the ICH Q8 guideline, offers a better scientific understanding of critical process and product qualities using knowledge obtained during the life cycle of a product. In this scope, the "knowledge space" is a summary of all process knowledge obtained during product development, and the "design space" is the region in which a product can be manufactured within acceptable limits. To create these spaces, artificial neural networks (ANNs) can be used to capture the multidimensional interactions of input variables and to closely bind these variables to a design space. This helps guide the experimental design process to include interactions among the input variables, along with modeling and optimization of pharmaceutical formulations. The objective of this study was to develop an integrated multivariate approach to obtain a quality product based on an understanding of the cause–effect relationships between formulation ingredients and product properties, using ANNs and genetic programming, for ramipril tablets prepared by the direct compression method. In this study, the data were generated through systematic application of design of experiments (DoE) principles, and optimization studies were performed using artificial neural network and neurofuzzy logic programs.

KEY WORDS: artificial neural networks (ANNs), gene expression programming (GEP), optimization, quality by design (QbD)

5.
《Cytotherapy》2021,23(10):953-959
Background aims

This article describes the development and qualification of a small-scale model for Ficoll-based cell separation as part of process development of an advanced therapy medicinal product. Because of the complexity of biological products, their manufacturing process as well as their characterization and control need to be accurately understood. Scale-down models therefore serve as an indispensable tool for process development, characterization, optimization and validation. This scale-down model represents a cell processor device widely used in advanced therapies. The approach is intended to optimize resources and to focus their use on process characterization studies under the quality by design paradigm. A scale-down model should reflect the large manufacturing scale. Consequently, this simplified system should offer a high degree of control over the process parameters to provide a robust model, even considering the process limitations. For this reason, the model should be developed and qualified for its intended purpose.

Methods

Process operating parameters were studied, and their resulting performance at full scale was used as a baseline to guide scale-down model development. Once the model was established, comparability runs were performed under standard operating conditions with bone marrow samples. These analyses showed consistency between the bench and the large scale. Additionally, statistical analyses were employed to demonstrate equivalence.

Results

The process performance indicators and assessed quality attributes were equivalent and fell within the acceptance ranges defined for the large-scale process.

Conclusions

This scale-down model is suitable for use in process characterization studies.
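A common way to "demonstrate equivalence" statistically, as mentioned above, is the two one-sided tests (TOST) procedure. A sketch with invented recovery data and an assumed ±10-point acceptance margin; the abstract does not specify which equivalence test or margin was used.

```python
import numpy as np
from scipy import stats

# Equivalence testing (TOST) sketch: invented cell-recovery data (%) for
# full-scale and bench-scale runs, with an assumed +/-10-point margin.
full_scale = np.array([71.2, 69.8, 72.5, 70.1, 71.9, 70.6])
bench = np.array([70.4, 72.1, 69.5, 71.3, 70.8, 71.6])
margin = 10.0

def tost_unpaired(a, b, margin):
    """Two one-sided t-tests: is the mean difference within +/- margin?"""
    d = a.mean() - b.mean()
    se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
    df = len(a) + len(b) - 2
    p_lower = 1.0 - stats.t.cdf((d + margin) / se, df)  # H0: d <= -margin
    p_upper = stats.t.cdf((d - margin) / se, df)        # H0: d >= +margin
    return max(p_lower, p_upper)

p = tost_unpaired(bench, full_scale, margin)
verdict = "equivalent" if p < 0.05 else "not shown equivalent"
print(f"TOST p = {p:.2e} -> {verdict} within +/-{margin} points")
```

Unlike a standard t-test, a small TOST p-value is positive evidence that the bench and full scales agree within the predefined margin.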

6.
Industrial fermentations typically use media that are balanced with multiple substitutable substrates, including complex carbon and nitrogen sources. Yet much of the modeling effort to date has focused on defined media. Here, we present a structured model that accounts for the growth and product formation kinetics of rifamycin B fermentation in a multi-substrate complex medium. The phenomenological model considers the organism to be an optimal strategist with an in-built mechanism that regulates the sequential and simultaneous uptake of substrate combinations. This regulatory process is modeled by assuming that the uptake of a substrate depends on the level of a key enzyme or set of enzymes, which may be inducible. Further, the fraction of flux through a given metabolic branch is estimated using a simple multi-variable constrained optimization. The model has the typical form of the Monod equation, with terms incorporating multiple limiting substrates and substrate inhibition. Several batch runs were set up with varying initial substrate concentrations to estimate the kinetic parameters for the rifamycin overproducer strain Amycolatopsis mediterranei S699. Glucose and ammonium sulfate (AMS) showed significant substrate inhibition of both growth and product formation. The model correctly predicts the experimentally observed regulated simultaneous uptake of substitutable substrate combinations under different fermentation conditions. The modeling results may find application in the optimization and control of rifamycin B fermentation, while the modeling strategy presented here should be applicable to other industrially important fermentations.
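The substrate-inhibited Monod kinetics mentioned above are commonly written in the Andrews/Haldane form. A single-substrate batch simulation illustrates the shape of such kinetics; this is a drastic simplification of the paper's structured multi-substrate model, and all parameter values are illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Andrews/Haldane-type kinetics: Monod growth with substrate inhibition,
#   mu(S) = mu_max * S / (Ks + S + S^2/Ki)
# A single-substrate simplification of the structured multi-substrate
# model above; all parameter values are illustrative.
mu_max, Ks, Ki, Yxs = 0.12, 0.5, 40.0, 0.45   # 1/h, g/L, g/L, gX/gS

def batch(t, y):
    X, S = y                                   # biomass, substrate (g/L)
    mu = mu_max * S / (Ks + S + S**2 / Ki)
    return [mu * X, -mu * X / Yxs]

sol = solve_ivp(batch, (0.0, 72.0), [0.1, 30.0], max_step=0.5)
X_end, S_end = sol.y[:, -1]
print(f"after 72 h: biomass {X_end:.2f} g/L, residual substrate {S_end:.2f} g/L")
```

The S²/Ki term is what makes high initial glucose or AMS concentrations depress the specific growth rate, the inhibition behavior the batch runs were designed to quantify.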

7.
Purpose

This paper provides an integrated method to evaluate the environmental impact and life cycle cost (LCC) of alternative design schemes in the early design and development stages of complex mechanical products. An optimization method for product design schemes based on life cycle assessment (LCA) and LCC is proposed as a supporting design tool to achieve optimal integration of the environmental impact and cost of a design.

Methods

The applied research methods include a product-level deconstruction model, an LCA/LCC integrated analysis model, and a product design scheme optimization method. In the life cycle environmental assessment, GaBi software and the CML2001 evaluation method are used to evaluate the product's environmental impact. For optimization of the product design configuration scheme, the TOPSIS method is used to rank the generated design schemes. Taking the interior and exterior trim of an automobile as an example, the specific implementation process of the method is illustrated.
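The TOPSIS ranking step can be sketched in a few lines of NumPy. The weights and the third scheme below are invented; the first two rows only loosely echo the cost and eco-point scale of the case study.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives by relative closeness to the ideal solution (TOPSIS).

    matrix:  alternatives x criteria
    weights: criterion weights summing to 1
    benefit: True where larger is better, False where smaller is better
    """
    M = np.asarray(matrix, dtype=float)
    norm = M / np.linalg.norm(M, axis=0)          # vector normalization
    V = norm * weights                            # weighted normalized matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_best = np.linalg.norm(V - ideal, axis=1)
    d_worst = np.linalg.norm(V - anti, axis=1)
    return d_worst / (d_best + d_worst)           # composite closeness index

# Illustrative scheme screening: columns = [cost (yuan), eco points],
# both "smaller is better". Weights are an assumed 50/50 split.
schemes = [[164.87, 14.74], [179.68, 39.78], [170.00, 25.00]]
scores = topsis(schemes, weights=np.array([0.5, 0.5]),
                benefit=np.array([False, False]))
print("composite indices:", np.round(scores, 4))
```

The closeness index plays the role of the "composite index" quoted in the results: the scheme nearest the ideal (low cost, low eco points) scores highest.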

Results and discussion

The case study indicates that, when environmental impact and cost are considered together, the composite indices of the best and worst schemes are 0.8667 and 0.3001, respectively; their costs are ¥164.87 and ¥179.68, respectively; and their environmental-impact eco points are 14.74 and 39.78, respectively. The costs of the two schemes are not much different, but the environmental impact of the best scheme is only 37.1% of that of the worst. When cost is the only factor considered, the cost of the lowest-cost design scheme is about 36.7% of that of the highest-cost scheme, while its environmental impact is about 1.6 times that of the highest-cost scheme. When environmental impact is the only factor considered, the least environmental impact among the design schemes is about 31.7% of the largest, and the cost of the scheme with the least environmental impact is about 58.1% of that of the largest. By integrating LCA and LCC, scientifically grounded suggestions can be provided from several perspectives.

Conclusions

By considering both environmental impact and LCC, this paper proposes a product design scheme optimization method as a supporting design tool that can evaluate the design options of a product and identify the preferred option in the early stage of product design. This helps realize product sustainability. To improve the applicability of the method, the weighting factors for environmental impact and cost can be adjusted according to the energy-saving and emission-reduction requirements of different enterprises.


8.
Process understanding is emphasized in the process analytical technology initiative and the quality by design paradigm as essential for manufacturing biopharmaceutical products with consistently high quality. A typical approach to developing process understanding is to combine design of experiments with statistical data analysis. Hybrid semi-parametric modeling is investigated here as an alternative to purely statistical data analysis. The hybrid model framework provides the flexibility to select model complexity based on available data and knowledge. Here, a parametric dynamic bioreactor model is integrated with a nonparametric artificial neural network that describes biomass and product formation rates as a function of varied fed-batch fermentation conditions for high-cell-density heterologous protein production with E. coli. Our model can accurately describe biomass growth and product formation across variations in induction temperature, pH and feed rates. The model indicates that while the product expression rate is a function of early induction-phase conditions, it is negatively impacted as productivity increases. This could correspond with physiological changes due to cytoplasmic product accumulation. Due to the dynamic nature of the model, rational process timing decisions can be made and the impact of temporal variations in process parameters on product formation and process performance can be assessed, which is central to process understanding.
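The hybrid serial structure, parametric mass balances wrapped around a nonparametric rate model, can be wired up as below. Everything here is synthetic: the rate surface, conditions and parameters are invented to show the structure, not the paper's E. coli model.

```python
import numpy as np
from scipy.integrate import solve_ivp
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

# Hybrid serial structure: parametric mass balances wrap a nonparametric
# rate model. The "training data" below are synthetic, generated from an
# invented rate surface purely to show the wiring; in practice the
# network is fitted against measured fed-batch runs.
rng = np.random.default_rng(2)
cond = rng.uniform([25.0, 6.5], [37.0, 7.4], size=(200, 2))   # [T degC, pH]
mu_obs = (0.4 * np.exp(-((cond[:, 0] - 30.0) / 5.0) ** 2)
              * np.exp(-((cond[:, 1] - 7.0) / 0.3) ** 2))
scaler = StandardScaler().fit(cond)
rate_net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000,
                        random_state=0).fit(scaler.transform(cond), mu_obs)

def hybrid_rhs(t, y, T, pH):
    X, P = y
    # Nonparametric part: the network supplies the specific growth rate.
    mu = max(float(rate_net.predict(scaler.transform([[T, pH]]))[0]), 0.0)
    # Parametric part: mass balances dX/dt and dP/dt.
    return [mu * X, 0.05 * mu * X]

sol = solve_ivp(hybrid_rhs, (0.0, 10.0), [0.5, 0.0], args=(30.0, 7.0))
print(f"predicted biomass at 10 h: {sol.y[0, -1]:.2f} g/L")
```

Because the balances remain mechanistic, mass is conserved by construction and only the poorly understood rate expressions are left to the data-driven component, which is the appeal of the approach.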

9.
Abstract

Using problems from real-life contexts related to learners' environment or culture plays an important role in their learning of a concept. In this regard, science educators, and physics educators in particular, search for real-life domains of theoretical concepts for effective science teaching, and they consider analogical and physical models an opportunity in their instruction. In the presented activity, we worked with 66 senior pre-service science teachers from our science teaching methods course. We used crowd movements as a real-life domain for our analogical models to scientifically explain a stampede case, then utilized a physical model to explore the continuity equation. Real-life problem-based scenarios can be used while taking advantage of 3-D modeling in the teaching of scientific principles. As a result, we found that pre-service teachers were able to construct scientific explanations for the causes of stampedes by using the modeling activity. High school teachers and upper-level instructors could benefit from including the modeling activity introduced in this study to help their students understand concepts related to the continuity equation by designing a physical model based on an analogical model. Via the physical model, students are able to make predictions, observations, interpretations and explanations of a complex and abstract scientific phenomenon.

10.
Background

Site-specific coupling of toxin entities to antibodies has become a popular method of synthesis of antibody-drug conjugates (ADCs), as it leads to a homogenous product and allows a free choice of a convenient site for conjugation.

Methods

We introduced a short motif, containing a single cysteine surrounded by aromatic residues, into the N-terminal FG-loop of the CH2 domain of two model antibodies, cetuximab and trastuzumab. The extent of conjugation with toxic payload was examined with hydrophobic interaction chromatography and mass spectrometry and the activity of resulting conjugates was tested on antigen-overexpressing cell lines.

Results

Antibody mutants were amenable for rapid coupling with maleimide-based linker endowed toxin payload and the modifications did not impair their reactivity with target cell lines or negatively impact their biophysical properties. Without any previous reduction, up to 50% of the antibody preparation was found to be coupled with two toxins per molecule. After the isolation of this fraction with preparative hydrophobic interaction chromatography, the ADC could elicit a potent cytotoxic effect on the target cell lines.

Conclusion

By fine-tuning the microenvironment of the reactive cysteine residue, this strategy offers a simplified protocol for production of site-selectively coupled ADCs.

General significance

Our unique approach allows the generation of therapeutic ADCs with controlled chemical composition, which facilitates the optimization of their pharmacological activity. This strategy for directional coupling could in the future simplify the construction of ADCs with double payloads ("dual warheads") introduced with orthogonal techniques.

11.
ABSTRACT

Poly(β-hydroxybutyrate) or PHB is an important member of the family of polyhydroxyalkanoates with properties that make it potentially competitive with synthetic polymers. In addition, PHB is biodegradable. While the biochemistry of PHB synthesis by microorganisms is well known, improvement of large-scale productivity requires good fermentation modeling and optimization. The latter aspect is reviewed here.

Current models are of two types: (i) mechanistic and (ii) cybernetic. The models may be unstructured or structured, and they have been applied to single cultures and co-cultures. However, neither class of models expresses adequately all the important features of large-scale non-ideal fermentations. Model-independent neural networks provide faithful representations of observations, but they can be difficult to design. So hybrid models, combining mechanistic, cybernetic and neural models, offer a useful compromise. All three kinds of basic models are discussed with applications and directions toward hybrid model development.

12.
There are many points of contact between optimization problems and modeling. On the one hand, the model adjustment process itself, as a process of estimation, is closely connected with optimization, in that it is intended to produce what is in one sense the best possible model. The basic structure of the optimization problem, as a problem of decision making with the necessary input of an objective function, is thus evident. On the other hand, a model is never an end in itself but, on the basis of its simulation capacity, a means to an end, for example in biotechnological optimization. From this point of view the model is a product of scientific work and thus an economic value. Equally, through its intended purpose the model exhibits a utility value. A complete evaluation of the model, as a condition of rational modeling, must take both these aspects into account. That is possible in principle by adding the modeling expenditure to the expenditure for the realization of biotechnological processes, expressing the economic consequences of model quality as an objective function, and minimizing the specific total expenditure for the product to be produced. Biotechnological practice requires that the "optimum" model be approached by means of iterative processes. Some practical examples make the process clear, taking into account qualitative (semantic) and quantitative (accuracy) aspects of the utility value.

13.
Gentleman  Wendy 《Hydrobiologia》2002,480(1-3):69-85
Research on plankton ecology in the oceans has traditionally been conducted via two scientific approaches: in situ (in the field) and in vitro (in the laboratory). There is, however, a third approach: exploring plankton dynamics in silico, using computer models as tools to study marine ecosystems. Models have been used for this purpose for over 60 years, and the innovations and implementations of historical studies provide a context for how future model applications can continue to advance our understanding. To that end, this paper presents a chronology of the in silico approach to plankton dynamics, beginning with modeling pioneers who worked in the days before computers. During the first 30 years of automated computation, plankton modeling focused on formulations for biological processes and investigations of community structure. The changing technological context and conceptual paradigms of the late 1970s and 1980s resulted in simulations becoming more widespread research tools for biological oceanographers. This period saw rising use of models as hypothesis-testing tools and as means of exploring the effects of circulation on spatial distributions of organisms. Continued computer advances and increased availability of data in the 1990s allowed old approaches to be applied to old and new problems, and led to the development of new approaches. Much of the modeling in the new millennium so far has incorporated these sophistications, and many cutting-edge applications have come from a new generation of plankton scientists who were trained by modeling gurus of previous eras. The future directions for modeling plankton dynamics are rooted in the historical studies.

14.
《Cytotherapy》2019,21(10):1081-1093
Background aims

Autologous cell therapy (AuCT) is an emerging therapeutic treatment that is undergoing transformation from laboratory- to industry-scale manufacturing with recent regulatory approvals. Various challenges facing the complex AuCT manufacturing and supply chain process hinder the scale out and broader application of this highly potent treatment.

Methods

We present a multiscale logistics simulation framework, AuCT-Sim, that integrates novel supply chain system modeling algorithms, methods, and tools. AuCT-Sim includes a single facility model and a system-wide network model. Unique challenges of the AuCT industry are analyzed and addressed in AuCT-Sim. Decision-supporting tools can be developed based on this framework to explore "what-if" manufacturing and supply chain scenarios of importance to various cell therapy stakeholder groups.

Results

Two case studies demonstrate the decision-supporting capability of AuCT-Sim: one investigates the optimal reagent base stocking level, and the other simulates a reagent supply disruption event. These case studies serve as guidelines for designing computational experiments with AuCT-Sim to solve specific problems in AuCT manufacturing and supply chain.

Discussion

This simulation framework will be useful in understanding the impact of possible manufacturing and supply chain strategies, policies, regulations, and standards, informing strategies to increase patient access to AuCT.

15.
Due to the lack of a complete understanding of metabolic networks and reaction pathways, establishing a universal mechanistic model for mammalian cell culture processes remains a challenge. Conversely, data-driven approaches for modeling these processes lack extrapolation capabilities. Hybrid modeling is a technique that exploits the synergy between the two modeling methods. Although mammalian cell cultures are among the most relevant processes in biotechnology and indeed look ideal for hybrid modeling, their application has only been proposed, never developed, in the literature. This study provides a quantitative assessment of the improvement brought by hybrid models with respect to state-of-the-art statistical predictive models in the context of therapeutic protein production. This is illustrated using a dataset obtained from a 3.5 L fed-batch experiment. With the goal of robustly defining the process design space, hybrid models reveal a superior capability to predict the time evolution of different process variables using only the initial and process conditions, in comparison to the statistical models. Hybrid models not only feature more accurate prediction results but also demonstrate better robustness and extrapolation capabilities. Looking ahead, this study highlights the added value of hybrid modeling for model-based process optimization and design of experiments.

16.
《Cytotherapy》2022,24(11):1136-1147
Background aims

Cell therapies have emerged as a potentially transformative therapeutic modality in many chronic and incurable diseases. However, inherent donor and patient variabilities, complex manufacturing processes, lack of well-defined critical quality attributes and unavailability of in-line or at-line process or product analytical technologies result in significant variance in cell product quality and clinical trial outcomes. New approaches for overcoming these challenges are needed to realize the potential of cell therapies.

Methods

Here the authors developed an untargeted two-dimensional gas chromatography mass spectrometry (GC×GC-MS)-based method for non-destructive longitudinal at-line monitoring of cells during manufacturing to discover correlative volatile biomarkers of cell proliferation and end product potency.

Results

Specifically, using mesenchymal stromal cell cultures as a model, the authors demonstrated that GC×GC-MS of the culture medium headspace can effectively discriminate between media types and tissue sources. Headspace GC×GC-MS identified specific volatile compounds that showed a strong correlation with cell expansion and product functionality quantified by indoleamine-2,3-dioxygenase and T-cell proliferation/suppression assays. Additionally, the authors discovered increases in specific volatile metabolites when cells were treated with inflammatory stimulation.

Conclusions

This work establishes GC×GC-MS as an at-line process analytical technology for cell manufacturing that could improve culture robustness and may be used to non-destructively monitor culture state and correlate with end product function.

17.
As stipulated by ICH Q8 R2 (1), prediction of critical process parameters based on process modeling is part of the enhanced, quality by design approach to product development. In this work, we discuss a Bayesian model for predicting the duration of the primary drying phase. The model is based on the premise that resistance to dry-layer mass transfer is product specific and is a function of nucleation temperature. The predicted duration of primary drying was experimentally verified on a lab-scale lyophilizer. It is suggested that the model be used during scale-up activities to minimize trial and error and reduce the costs associated with expensive large-scale experiments. The proposed approach extends the work of Searles et al. (2) by adding a Bayesian treatment to primary drying modeling.
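The core idea, treating the product-specific dry-layer resistance as uncertain and propagating that uncertainty into a drying-time prediction, can be sketched by Monte Carlo sampling. This is a simplification of a full Bayesian treatment (the distribution here stands in for a posterior), and all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy prediction of primary drying duration: treat the dry-layer
# resistance R_p as uncertain (it depends on nucleation temperature)
# and propagate samples through a simplified steady-state mass-transfer
# model. All numbers are illustrative, not from the paper.
n = 10_000
R_p = rng.lognormal(mean=np.log(1.5), sigma=0.25, size=n)  # cm^2 Torr h / g
m_ice = 1.0        # g of ice per vial
A_p = 3.8          # cm^2, vial inner cross-section
dP = 0.9           # Torr, ice vapor pressure minus chamber pressure

t_dry = m_ice * R_p / (A_p * dP)   # hours, assuming constant R_p
lo, hi = np.percentile(t_dry, [5, 95])
print(f"predicted primary drying: median {np.median(t_dry):.2f} h, "
      f"90% interval [{lo:.2f}, {hi:.2f}] h")
```

A predictive interval rather than a point estimate is exactly what makes such a model useful for de-risking scale-up runs.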

18.
《Cytotherapy》2022,24(6):590-596
Background aims

Cell therapies are costlier to manufacture than small molecules and protein therapeutics because they require multiple manipulations and are often produced in an autologous manner. Strategies to lower the cost of goods to produce a cell therapy could make a significant impact on its total cost.

Methods

Borrowing from the field of bioprocess development, the authors took a design of experiments (DoE)-based approach to understanding the manufacture of a cell therapy product in pre-clinical development, analyzing the main cost factors in the production process. The cells used for these studies were autologous CD4+ T lymphocytes gene-edited using CRISPR/Cas9 and recombinant adeno-associated virus (AAV) to restore normal FOXP3 gene expression as a prospective investigational product for patients with immune dysregulation, polyendocrinopathy, enteropathy, X-linked (IPEX) syndrome.

Results

Using gene editing efficiency as the response variable, an initial screen was conducted for other variables that could influence the editing frequency. The multiplicity of infection (MOI) of AAV and the amount of single guide RNA (sgRNA) were the significant factors used for the optimization step to generate a response contour plot. Cost analysis was done for multiple points in the design space to find cost drivers that could be reduced. For the range of values tested (50 000–750 000 vg/cell AAV and 0.8–4 μg sgRNA), editing with the highest MOI and sgRNA yielded the best gene editing frequency. However, cost analysis showed the optimal solution was gene editing at 193 000 vg/cell AAV and 1.78 μg sgRNA.

Conclusions

The authors used DoE to define key factors affecting the gene editing process for a potential investigational therapeutic, providing a novel and faster data-based approach to understanding factors driving complex biological processes. This approach could be applied in process development and aid in achieving more robust strategies for the manufacture of cellular therapeutics.
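The cost-constrained search over the response surface described above can be sketched as a grid evaluation over the reported factor ranges. The response model, unit costs and editing threshold below are invented stand-ins for the study's fitted DoE model and real cost analysis.

```python
import numpy as np

# Grid search over the reported factor ranges (MOI 50 000-750 000 vg/cell,
# sgRNA 0.8-4 ug). The response surface and unit costs are invented
# stand-ins for the fitted DoE model and the real cost analysis.
moi = np.linspace(50_000, 750_000, 141)
sg = np.linspace(0.8, 4.0, 81)
M, G = np.meshgrid(moi, sg)

editing = 90.0 * (M / 750_000) ** 0.3 * (G / 4.0) ** 0.2   # % edited (toy)
cost = 100.0 * M / 750_000 + 25.0 * G                      # $ per batch (toy)

feasible = editing >= 60.0           # assumed minimum acceptable editing
best = np.argmin(np.where(feasible, cost, np.inf))
i, j = np.unravel_index(best, cost.shape)
print(f"cheapest feasible point: {M[i, j]:,.0f} vg/cell, "
      f"{G[i, j]:.2f} ug sgRNA -> editing {editing[i, j]:.1f}%, "
      f"cost ${cost[i, j]:.2f}")
```

The pattern mirrors the study's finding: the maximum-editing corner is not the cost-optimal operating point once reagent costs enter the objective.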

19.
A. Basset  W. Los 《Plant biosystems》2013,147(4):780-782
Abstract

LifeWatch is the European research infrastructure for biodiversity. It is building virtual, rather than physical, laboratories equipped with the most advanced facilities to capture, standardise, integrate, analyse and model biodiversity, and to consider scenarios of change. LifeWatch aims to support a deeper understanding of biodiversity for societal benefit.

20.
Purpose

Composites consist of at least two merged materials. Separating these components for recycling is typically an energy-intensive process with potentially significant impacts on the components' quality. The purpose of this article is to suggest how allocation for recycling of products manufactured from composites can be handled in life cycle assessment, so as to account for the recycling process and the associated quality degradation of the different composite components, and to describe the challenges involved.

Method

Three prominent recycling allocation approaches were selected from the literature: the cut-off approach, the end-of-life recycling approach with quality-adjusted substitution, and the circular footprint formula. The allocation approaches were adapted by conceptualizing composite material recycling as a separation process with subsequent recycling of the recovered components, allowing the quality changes in each individual component to be modeled separately. The adapted allocation approaches were then applied in a case study assessing the cradle-to-grave climate impact and energy use of a fictitious product made from a composite material that at end of life is recycled through grinding, pyrolysis, or supercritical water treatment. Finally, the experiences and results from applying the allocation approaches were analyzed with regard to the incentives they provide and the challenges they come with.
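The per-component treatment described above can be illustrated with a simplified version of the circular footprint formula's material term (disposal terms omitted), applied separately to each composite component with its own quality ratio. All parameter values are invented for illustration.

```python
def cff_material(Ev, Erecycled, ErecyclingEoL, EvSub, R1, R2, A, Qin, Qout):
    """Material term of the circular footprint formula, simplified:
    disposal terms omitted. Qin and Qout are quality ratios (Qs/Qp)
    adjusting for degradation of recycled input and recovered output."""
    virgin = (1 - R1) * Ev
    rec_in = R1 * (A * Erecycled + (1 - A) * Ev * Qin)
    rec_out = (1 - A) * R2 * (ErecyclingEoL - EvSub * Qout)
    return virgin + rec_in + rec_out

# Invented numbers (kg CO2e per kg) for the two components of a
# fibre/matrix composite recycled via pyrolysis: the fibre retains
# most of its quality, the matrix very little.
fibre = cff_material(Ev=30.0, Erecycled=8.0, ErecyclingEoL=10.0,
                     EvSub=30.0, R1=0.0, R2=0.6, A=0.5, Qin=1.0, Qout=0.8)
matrix = cff_material(Ev=5.0, Erecycled=2.0, ErecyclingEoL=2.5,
                      EvSub=5.0, R1=0.0, R2=0.3, A=0.5, Qin=1.0, Qout=0.2)
print(f"fibre burden: {fibre:.2f} kg CO2e, matrix burden: {matrix:.2f} kg CO2e")
```

Modeling each component separately makes the trade-off visible: with these toy numbers the high-quality fibre earns a net recycling credit, while the degraded matrix does not.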

Results and discussion

Modeling the composite as at least two separate materials rather than one helped to clarify the incentives provided by each allocation approach. When the product is produced from primary materials, the cut-off approach gives no incentive to recycle, whereas the end-of-life recycling approach and the circular footprint formula give incentives to recycle and to recover materials of high quality. Each of the allocation approaches comes with inherent challenges, especially when knowledge of future systems is limited, as in prospective studies. This challenge is most evident for the circular footprint formula, for example with regard to the supply and demand balance.

Conclusions

We recommend modeling the composite materials in products as separate, individual materials. This proved useful for capturing changes in quality, trade-offs between recovering high-quality materials and the environmental impact of the recycling system, and the incentives the different approaches provide. The cut-off and end-of-life recycling approaches can both be used in prospective studies, whereas the circular footprint formula should be avoided when no market for secondary material is established.



Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号