Similar Documents
20 similar documents retrieved
1.
In manufacturing monoclonal antibodies (mAbs), it is crucial to be able to predict how process conditions and supplements affect productivity and quality attributes, especially glycosylation. Supplemental inputs, such as amino acids and trace metals in the media, are reported to affect cell metabolism and glycosylation; quantifying their effects is essential for effective process development. We aim to present and validate, through a commercially relevant cell culture process, a technique for modeling such effects efficiently. While existing models can predict mAb production or glycosylation dynamics under specific process configurations, adapting them to new processes remains challenging because it involves modifying the model structure and often requires some mechanistic understanding. Here, a modular modeling technique for adapting an existing model of a fed-batch Chinese hamster ovary (CHO) cell culture process without structural modifications or mechanistic insight is presented. Instead, data obtained from designed experimental perturbations in media supplementation are used to train and validate a supplemental input effect model, which is used to “patch” the existing model. The combined model can be used for model-based process development to improve productivity and to meet product quality targets more efficiently. The methodology and analysis are generally applicable to other CHO cell lines and cell types.
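As an illustration of the modular "patch" idea (not the authors' implementation), the sketch below combines a fixed base titer model with a data-driven supplemental-input effect model trained on residuals from simulated media-supplementation perturbations; the base model, feature set, and ridge-regression form are assumptions made for the example.

```python
# Toy sketch of a modular "patch" model: a fixed base process model plus
# a data-driven supplemental-input effect model trained on its residuals.
# The base model, feature set, and ridge form are illustrative assumptions,
# not the published model structure.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

def base_model(time_h):
    """Existing process model: predicted titer (g/L) versus culture time."""
    return 5.0 * (1.0 - np.exp(-time_h / 120.0))

# Designed perturbations: columns = supplemental inputs (e.g. an amino acid
# and a trace metal, in scaled units); rows = experimental runs.
supplements = rng.uniform(-1.0, 1.0, size=(12, 2))
t_end = 336.0  # 14-day fed-batch endpoint, h

# Simulated "measured" endpoint titers: base behaviour plus an unknown
# supplement effect and noise (stand-in for the designed-experiment data).
true_effect = supplements @ np.array([0.8, -0.3])
measured = base_model(t_end) + true_effect + rng.normal(0.0, 0.05, 12)

# Train the supplemental-input effect model on the base-model residuals.
residuals = measured - base_model(t_end)
patch = Ridge(alpha=0.1).fit(supplements, residuals)

# Combined model = base model + patch; screen a new supplementation level.
new_condition = np.array([[0.5, 0.2]])
prediction = base_model(t_end) + patch.predict(new_condition)[0]
print(f"predicted endpoint titer for new supplementation: {prediction:.2f} g/L")
```

In a real workflow the correction model would be trained on measured trajectories and validated against held-out perturbation runs before being used for process development.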

2.
Manufacturing systems design involves the solution of a complex series of interrelated problems. This complexity will increase in the future as manufacturing practices change to meet increased global competition. Research within manufacturing systems design has mainly been focused on finding improved models for solving particular problems, or extending existing modeling techniques. This has resulted in numerous modeling tools being available to support manufacturing systems design. However, little research work has been carried out into consolidating the existing theories and models. As a result, a large body of this work has not been applied in industry. Model management has evolved as a research area which investigates methods for storing, modifying, and manipulating models. This article describes a prototype model management system for manufacturing systems design. The objective here is not to develop “another” decision support system for manufacturing design, but to illustrate, through the development of a prototype system, a number of key ideas of how concepts from the area of model management systems can be used to support manufacturing systems design. The prototype model management system utilizes the structured modeling framework and uses an extended version of the structured modeling language. An important aspect of the prototype model management system is the incorporation of the model development task, thus allowing the system to be easily updated and adapted. The prototype system was evaluated using a range of queueing network models for manufacturing systems design.  相似文献   
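For a flavor of the kind of model such a system would store and evaluate, here is a minimal single-station M/M/1 queueing model of a workstation, one of the simplest elements of the queueing network models mentioned above; the class and parameter values are illustrative, not part of the prototype.

```python
# Minimal M/M/1 workstation model of the kind a model management system
# might store and evaluate; parameter values are illustrative only.
from dataclasses import dataclass

@dataclass
class Workstation:
    arrival_rate: float  # parts per hour
    service_rate: float  # parts per hour

    def utilization(self) -> float:
        return self.arrival_rate / self.service_rate

    def wip(self) -> float:
        """Expected number of parts at the station (queue + in service)."""
        rho = self.utilization()
        return rho / (1.0 - rho)

    def flow_time(self) -> float:
        """Expected time in system (hours), by Little's law."""
        return self.wip() / self.arrival_rate

ws = Workstation(arrival_rate=8.0, service_rate=10.0)
print(f"utilization={ws.utilization():.2f}, WIP={ws.wip():.2f}, "
      f"flow time={ws.flow_time():.2f} h")
```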

3.
4.
The development of a prototype tool for modeling manufacturing in a biopharmaceutical plant is discussed. A hierarchical approach to modeling a manufacturing process has been adopted to confer maximum user flexibility. The use of this framework for assessing the impact of manufacturing decisions on strategic technical and business indicators is demonstrated via a case study. In the case study, which takes the example of a mammalian cell culture process delivering a therapeutic for clinical trials, the dynamic modeling tool indicates how manufacturing options affect the demands on resources and the associated manufacturing costs. The example illustrates how the decision-support software can be used by biopharmaceutical companies to investigate the effects of working toward different strategic goals on the cost-effectiveness of the process, prior to committing to a particular option.  相似文献   

5.
An increasing number of industrial bioprocesses capitalize on living cells by using them as cell factories that convert sugars into chemicals. These processes range from the production of bulk chemicals in yeasts and bacteria to the synthesis of therapeutic proteins in mammalian cell lines. One of the tools in the continuous search for improved performance of such production systems is the development and application of mathematical models. To be of value for industrial biotechnology, mathematical models should be able to assist in the rational design of cell factory properties or in the production processes in which they are utilized. Kinetic models are particularly suitable for this purpose because they are capable of representing the complex biochemistry of cells in a more complete way compared to most other types of models. They can, at least in principle, be used to understand, predict, and evaluate in detail the effects of adding, removing, or modifying molecular components of a cell factory, and to support the design of the bioreactor or fermentation process. However, several challenges still remain before kinetic modeling will reach the degree of maturity required for routine application in industry. Here we review the current status of kinetic cell factory modeling. Emphasis is on modeling methodology concepts, including model network structure, kinetic rate expressions, parameter estimation, optimization methods, identifiability analysis, model reduction, and model validation, but several applications of kinetic models for the improvement of cell factories are also discussed.
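A minimal kinetic cell factory model of the kind reviewed can be written as a small ODE system; the sketch below simulates Monod-type growth, substrate uptake, and growth-coupled product formation with SciPy. The rate expressions and parameter values are generic textbook choices, not taken from any particular organism or paper.

```python
# Minimal kinetic cell factory model: Monod growth, substrate uptake,
# and growth-coupled product formation. Rate laws and parameter values
# are generic textbook choices, not a validated model of any organism.
from scipy.integrate import solve_ivp

mu_max, Ks = 0.4, 0.5    # 1/h, g/L
Yxs, Yps = 0.5, 0.3      # g biomass / g substrate, g product / g substrate

def rhs(t, y):
    X, S, P = y
    S = max(S, 0.0)                 # guard against small negative overshoot
    mu = mu_max * S / (Ks + S)      # Monod growth kinetics
    qs = mu / Yxs                   # specific substrate uptake rate
    qp = Yps * qs                   # growth-coupled specific production rate
    return [mu * X, -qs * X, qp * X]

sol = solve_ivp(rhs, [0.0, 24.0], [0.1, 20.0, 0.0], max_step=0.1)
X, S, P = sol.y[:, -1]
print(f"after 24 h: biomass = {X:.1f} g/L, substrate = {S:.1f} g/L, product = {P:.1f} g/L")
```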

6.
The process of product design is driven toward achieving design specifications while meeting cost targets. Designers typically have models and tools to aid in functional and performance analysis of the design but few tools and little quantitative information to aid in cost analysis. Estimates of the cost of manufacture often are made through a cost multiplier based on material cost. Manufacturing supplies guidelines to aid in design, but these guidelines often lack the detail needed to make sound design decisions. A need was identified at Motorola for a quantitative way of modeling manufacturing costs. After benchmarking cost modeling efforts around the company, an activity-based costing method was developed to model manufacturing cycle time and cost. Models for 12 key manufacturing steps were developed. The factory operating costs are broken down by time, and cost is allocated to each product according to the processing it requires. The process models were combined into a system-level model, capturing subtle yet realistic operational detail. The framework was implemented in a software program to aid designers in calculating manufacturing costs from limited design information. Since the information tool provides an estimate of manufacturing costs at the design prototype stage, the development engineer can identify and eliminate expensive components and reduce the need for costly manufacturing processing. Using this methodology to make quantitative trade-offs between material and manufacturing costs yields significant savings in overall product costs.
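The activity-based allocation step can be pictured in a few lines: each manufacturing step carries an hourly operating cost, and a product is charged for the processing time it requires at each step. The step names, rates, and times below are invented for illustration and are not Motorola's models.

```python
# Toy activity-based cost estimate: each manufacturing step has an hourly
# operating cost, and a product is charged for the time it spends in that
# step. All numbers are invented for illustration.
cost_per_hour = {          # factory operating cost broken down by step ($/h)
    "smt_placement": 240.0,
    "reflow": 90.0,
    "test": 150.0,
}
minutes_required = {       # processing time the design requires (min/unit)
    "smt_placement": 1.5,
    "reflow": 4.0,
    "test": 2.0,
}

unit_cost = sum(cost_per_hour[s] * minutes_required[s] / 60.0
                for s in minutes_required)
for step in minutes_required:
    share = cost_per_hour[step] * minutes_required[step] / 60.0
    print(f"{step:15s} ${share:5.2f}")
print(f"{'total':15s} ${unit_cost:5.2f} per unit")
```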

7.
A novel method for the qualification of reduced scale models (RSMs) was illustrated using data from both a 250-ml advanced microscale bioreactor (ambr) and a 5-L bioreactor RSM for a 2,000-L manufacturing scale process using a CHO cell line to produce a recombinant monoclonal antibody. The example study showed how the method was used to identify process performance attributes and product quality attributes that capture important aspects of the RSM qualification process. The method uses two novel statistical approaches: multivariate dimension reduction and data visualization techniques, via partial least squares discriminant analysis (PLS-DA), and Bayesian multivariate linear modeling for inferential analysis. Bayesian multivariate linear modeling allows for individual probability distributions of the differences of the mean of each attribute for each scale, as well as joint probability statements on the differences of the means for multiple attributes. Depending on the results of this inferential procedure, PLS-DA is used to identify the process performance outputs at the different scales which have the greatest negative impact on the multivariate Bayesian joint probabilities. Experience with that particular process can then be leveraged to adjust operating conditions to minimize these differences, and then equivalence can be reassessed using the multivariate linear model.  相似文献   
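The PLS-DA screening step can be sketched with scikit-learn: fit a PLS model against indicator-coded scale labels and inspect the loadings to see which attributes most separate the scales. The data below are simulated, and the Bayesian multivariate linear model is not reproduced here.

```python
# Sketch of the PLS-DA screening step: which process/quality attributes
# separate small-scale from manufacturing-scale runs? Data are simulated;
# this is not the qualification procedure itself.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
attributes = ["titer", "VCD", "lactate", "galactosylation", "aggregate"]

# 10 small-scale and 10 manufacturing-scale batches; one attribute
# (lactate) is deliberately shifted between scales in this toy data.
small = rng.normal(0, 1, size=(10, 5))
large = rng.normal(0, 1, size=(10, 5))
large[:, 2] += 1.5
X = StandardScaler().fit_transform(np.vstack([small, large]))
y = np.array([[1, 0]] * 10 + [[0, 1]] * 10)   # indicator-coded scale labels

plsda = PLSRegression(n_components=2).fit(X, y)
influence = np.abs(plsda.x_loadings_[:, 0])   # loadings on first latent variable
for name, w in sorted(zip(attributes, influence), key=lambda t: -t[1]):
    print(f"{name:16s} {w:.2f}")
```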

8.
The task of process modeling in a manufacturing environment centers around controlling and improving the flow of materials. This flow comprises a complicated web of control and physical systems. Despite a variety of manufacturing system modeling approaches, more rigorous process modeling is required. This paper presents an integrated modeling framework for manufacturing systems (IMF-M). Conceptual modeling of physical materials flow supported by a graphical representation facilitates improvement of operations in manufacturing environments. A declarative and executable representation of control information systems helps to improve information management by managing a variety of information models with improved readability and reusability. A unified representation of the physical process and information system provides a common modeling milieu in which efforts can be coordinated among several groups working in the different domains of scheduling, shop floor and logistics control, and information system. Since the framework helps adapt to the changes of the physical process and information system affecting each other in a consistent manner, the modeling output enhances integration of the manufacturing system.  相似文献   

9.
This paper presents a new approach for modeling DNA sequences for the purpose of exon detection. The proposed model adopts the sum-of-sinusoids concept for the representation of DNA sequences. The objective of the modeling process is to represent the DNA sequence with few coefficients. The modeling process can be performed on the DNA signal as a whole or on a segment-by-segment basis. The created models can be used instead of the original sequences in a further spectral estimation process for exon detection. The accuracy of modeling is evaluated using the Root Mean Square Error (RMSE) and R-square metrics. In addition, non-parametric spectral estimation methods are used for estimating the spectra of both original and modeled DNA sequences. The results of exon detection based on original and modeled DNA sequences coincide to a great extent, which confirms the success of the proposed sum-of-sinusoids method for modeling DNA sequences.
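Once the frequencies are fixed, a sum-of-sinusoids fit reduces to linear least squares: map the sequence to a numeric signal, build a design matrix of sines and cosines, solve for the coefficients, and report RMSE and R-square. The EIIP mapping and frequency grid in the sketch below are illustrative choices and not necessarily those of the paper.

```python
# Sketch: represent a numerically mapped DNA segment with a small sum of
# sinusoids fitted by linear least squares, then report RMSE and R-square.
# The EIIP mapping and frequency grid are illustrative choices.
import numpy as np

eiip = {"A": 0.1260, "C": 0.1340, "G": 0.0806, "T": 0.1335}
rng = np.random.default_rng(2)
seq = "".join(rng.choice(list("ACGT"), size=300))
x = np.array([eiip[b] for b in seq])          # numeric DNA signal
n = np.arange(len(x))

# A few sinusoids: 1/3 (the period-3 exon signature) plus low frequencies.
freqs = np.array([1/3, 1/10, 1/25, 1/50])
A = np.column_stack([np.ones_like(n, dtype=float)]
                    + [f(2 * np.pi * fr * n) for fr in freqs
                       for f in (np.sin, np.cos)])
coef, *_ = np.linalg.lstsq(A, x, rcond=None)  # model coefficients
model = A @ coef

rmse = np.sqrt(np.mean((x - model) ** 2))
r2 = 1 - np.sum((x - model) ** 2) / np.sum((x - x.mean()) ** 2)
print(f"{A.shape[1]} coefficients, RMSE={rmse:.4f}, R^2={r2:.3f}")
```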

10.
Purpose

A scalable life cycle inventory (LCI) model, which provides mass composition and gate-to-gate manufacturing data for a power electronic inverter unit intended for controlling electric vehicle propulsion motors, was developed. The purpose is to fill existing data gaps for life cycle assessment (LCA) of electric vehicles. The model comprises new and easy-to-use data with a sufficient level of detail to enable proper component scaling and in-depth analysis of inverter units. The aim of this article (part II) is to describe the modeling of all production steps and present new datasets. Another objective is to explain the strategies for data collection, the system boundaries, and how unit process datasets were made to interact properly with the scalable design model (part I).

Methods

Data for the manufacturing of the inverter unit was collected from a variety of literature, technical specifications, factory data, site visits, and expert interviews. The model represents current levels of technology and modern industrial scale production. Industry data dates back to 2012. Some older literature is referred to, but only if it was found to remain relevant. Upstream, new data has been gathered to the point where the Ecoinvent database can be used to model a full cradle-to-gate inventory. To make the LCI model easy to use, each flow crossing the system boundary is reported with a recommended linked flow to this database.

Results and discussion

The screening and modeling of manufacturing inverter units resulted in a substantial compilation of new inventory data. In close integration with the design model, which is scalable in size over a range of 20–200 kW in nominal power and 250–700 V in DC system voltage (part I), it forms a comprehensive scalable LCI model of a typical automotive power electronic inverter unit intended for traction motor control. New production data covers electroplating of gold, electro-galvanization, machining and anodizing of aluminum, ceramic substrate fabrication, direct copper bonding, photoimaging and regenerative etching, power module assembly with a two-step soldering process, and the assembly of automotive printed circuit boards.

Conclusions

Interviews with experts were found to be vital for effective data collection, and the reporting of details proved key to maintaining data usability over time for reuse, rework, and criticism by other LCA practitioners.


11.
Multi-component, multi-scale Raman spectroscopy modeling results from a monoclonal antibody-producing CHO cell culture process, including data from two development scales (3 L, 200 L) and a clinical manufacturing scale environment (2,000 L), are presented. Multivariate analysis principles are a critical component of partial least squares (PLS) modeling but can quickly turn into an overly iterative process; thus, a simplified protocol is proposed for addressing necessary steps including spectral preprocessing, spectral region selection, and outlier removal to create models exclusively from cell culture process data without the inclusion of spectral data from chemically defined nutrient solutions or targeted component spiking studies. An array of single-scale and combination-scale modeling iterations was generated to evaluate technology capabilities and model scalability. Analysis of prediction errors across models suggests that glucose, lactate, and osmolality are well modeled. Model strength was confirmed via predictive validation and by examining performance similarity across single-scale and combination-scale models. Additionally, accurate predictive models were attained in most cases for viable cell density and total cell density; however, these components exhibited some scale-dependencies that hindered model quality in cross-scale predictions where only development data was used in calibration. Glutamate and ammonium models were also able to achieve accurate predictions in most cases. However, there are differences in the absolute concentration ranges of these components across the datasets of individual bioreactor scales. Thus, glutamate and ammonium PLS models were forced to extrapolate in cases where models were derived from small scale data only but used in cross-scale applications predicting against manufacturing scale batches. © 2014 American Institute of Chemical Engineers Biotechnol. Prog., 31:566–577, 2015
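A stripped-down version of such a PLS workflow (preprocessing, region selection, calibration, error check) might look like the sketch below on simulated spectra; the Savitzky-Golay first derivative and the 800-1500 cm^-1 region are common defaults standing in for the simplified protocol, not the published one.

```python
# Sketch of a Raman-to-glucose PLS calibration: simulate spectra, apply a
# Savitzky-Golay first-derivative preprocessing step, select a spectral
# region, fit PLS, and report prediction error. Simulated data; the
# preprocessing choices are common defaults, not the published protocol.
import numpy as np
from scipy.signal import savgol_filter
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
wavenumbers = np.linspace(400, 1800, 700)
glucose = rng.uniform(0, 12, size=80)                      # g/L reference values

# Toy spectra: a glucose-correlated band near 1125 cm^-1 plus baseline and noise.
band = np.exp(-((wavenumbers - 1125) / 15) ** 2)
spectra = (glucose[:, None] * band
           + rng.normal(0, 0.05, (80, 700))
           + np.linspace(0, 1, 700))                       # sloping baseline

X = savgol_filter(spectra, 15, polyorder=2, deriv=1, axis=1)  # preprocessing
region = (wavenumbers > 800) & (wavenumbers < 1500)           # region selection
X = X[:, region]

Xtr, Xte, ytr, yte = train_test_split(X, glucose, random_state=0)
pls = PLSRegression(n_components=3).fit(Xtr, ytr)
rmsep = np.sqrt(np.mean((pls.predict(Xte).ravel() - yte) ** 2))
print(f"RMSEP = {rmsep:.2f} g/L")
```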

12.
The understanding of the molecular mechanism of cell-to-cell communication is fundamental for system biology. Up to now, the main objectives of bioinformatics have been reconstruction, modeling and analysis of metabolic, regulatory and signaling processes, based on data generated from high-throughput technologies. Cell-to-cell communication or quorum sensing (QS), the use of small molecule signals to coordinate complex patterns of behavior in bacteria, has been the focus of many reports over the past decade. Based on the quorum sensing process of the organism Aliivibrio salmonicida, we aim at developing a functional Petri net, which will allow modeling and simulating cell-to-cell communication processes. Using a new editor-controlled information system called VANESA (http://vanesa.sf.net), we present how to combine different fields of studies such as life-science, database consulting, modeling, visualization and simulation for a semi-automatic reconstruction of the complex signaling quorum sensing network. We show how cell-to-cell communication processes and information-flow within a cell and across cell colonies can be modeled using VANESA and how those models can be simulated with Petri net network structures in a sophisticated way.  相似文献   
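At its simplest, a functional Petri net is a set of places holding tokens plus transitions that fire when their input places are sufficiently marked; the toy simulator below runs a generic three-step signaling chain and is not the reconstructed A. salmonicida quorum sensing network from VANESA.

```python
# Minimal Petri net simulator: places hold token counts, and a transition
# fires when every input place holds enough tokens, consuming and producing
# tokens. The toy net is a generic signaling chain, not the reconstructed
# quorum sensing network.
marking = {"signal_outside": 3, "receptor": 1, "complex": 0, "response": 0}

# (inputs, outputs) for each transition
transitions = {
    "bind":     ({"signal_outside": 1, "receptor": 1}, {"complex": 1}),
    "activate": ({"complex": 1}, {"response": 1, "receptor": 1}),
}

def enabled(name):
    return all(marking[p] >= k for p, k in transitions[name][0].items())

def fire(name):
    inputs, outputs = transitions[name]
    for p, k in inputs.items():
        marking[p] -= k
    for p, k in outputs.items():
        marking[p] += k

step = 0
while any(enabled(t) for t in transitions) and step < 20:
    name = next(t for t in transitions if enabled(t))
    fire(name)
    step += 1
    print(f"step {step}: fired {name:9s} -> {marking}")
```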

13.
This article presents an approach toward product design for environment (DfE) at a level that integrates environmental hazard analysis with models of transformation processes. As a complementary analysis tool to life-cycle assessment (LCA), this method would support detailed design decisions through modeling of a “process chain” for a subset of the product's life cycle. The building blocks for this approach are a set of unit process models that can convert process and design parameters into estimates for energy utilization, production scrap, and ancillary waste flows. These values for the quantity of environmental releases can be integrated using a multicriteria environmental hazard evaluation methodology that can estimate the “quality” of environmental releases. Finally, the waste information can be used to support a design model that can link design parameters to material, process, and operational parameter selection. A case study illustrating printed circuit board (PCB) assembly is presented to show process chain implementation in manufacturing applications.
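The unit-process building blocks can be thought of as small functions that turn process and design parameters into estimates of energy, scrap, and ancillary waste, which a process chain then sums; the two steps and all coefficients below are invented placeholders.

```python
# Sketch of a two-step process chain: each unit process model converts
# design/process parameters into energy, scrap, and ancillary waste, and
# the chain sums them. All coefficients are invented placeholders.
from dataclasses import dataclass

@dataclass
class UnitOutput:
    energy_kwh: float
    scrap_kg: float
    waste_kg: float

def solder_paste_print(board_area_cm2: float) -> UnitOutput:
    return UnitOutput(energy_kwh=0.002 * board_area_cm2,
                      scrap_kg=0.0001 * board_area_cm2,
                      waste_kg=0.0005 * board_area_cm2)  # e.g. spent paste, wipes

def reflow(board_area_cm2: float) -> UnitOutput:
    return UnitOutput(energy_kwh=0.01 * board_area_cm2,
                      scrap_kg=0.00005 * board_area_cm2,
                      waste_kg=0.0)

def process_chain(board_area_cm2: float) -> UnitOutput:
    steps = [solder_paste_print(board_area_cm2), reflow(board_area_cm2)]
    return UnitOutput(sum(s.energy_kwh for s in steps),
                      sum(s.scrap_kg for s in steps),
                      sum(s.waste_kg for s in steps))

print(process_chain(board_area_cm2=150.0))
```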

14.
Establishing reliable surface mount assemblies requires robust design and assembly practices, including stringent process control schemes for achieving high yield processes and high quality solder interconnects. Conventional Shewhart-based process control charts prevalent in today's complex surface mount manufacturing processes are found to be inadequate as a result of autocorrelation, high false alarm probability, and inability to detect process deterioration. Hence, new strategies are needed to circumvent the shortcomings of traditional process control techniques. In this article, the adequacy of Shewhart models in a surface mount manufacturing environment is examined and some alternative solutions and strategies for process monitoring are discussed. For modeling solder paste deposition process data, a time series analysis based on neural network models is highly desirable for both controllability and predictability. In particular, neural networks can be trained to model the autocorrelated time series, learn historical process behavior, and forecast future process performance with low prediction errors. This forecasting ability is especially useful for early detection of solder paste deterioration, so that timely remedial actions can be taken, minimizing the impact on subsequent yields of downstream processes. As for the automated component placement process, where a very low fraction nonconforming frequently occurs, control-charting schemes based on cumulative counts of conforming items produced prior to detection of nonconforming items are more sensitive in flagging process deterioration. For the reflow soldering and wave-soldering processes, the use of demerit control charts is appealing as it provides not only better control when various defects with different degrees of severity are encountered, but also leads to improved average run length (ARL) performance. Illustrative examples of actual process data are presented to demonstrate these approaches.
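For the low-defect placement process, a cumulative count of conforming (CCC) chart sets limits on the run length between defects from the geometric distribution; the sketch below uses the standard limit formulas with an assumed 200 ppm in-control defect rate.

```python
# Sketch of a cumulative-count-of-conforming (CCC) chart: with a very low
# in-control fraction nonconforming p0, the count of items between defects
# is monitored against geometric-distribution limits. p0 and alpha are
# illustrative values.
import math

p0 = 200e-6      # in-control fraction nonconforming (200 ppm, assumed)
alpha = 0.0027   # false-alarm risk comparable to a 3-sigma chart

ucl = math.log(alpha / 2) / math.log(1 - p0)      # very long runs: process improved
lcl = math.log(1 - alpha / 2) / math.log(1 - p0)  # very short runs: deterioration
print(f"LCL = {lcl:.0f} items, UCL = {ucl:.0f} items")

def check(run_length_between_defects: int) -> str:
    if run_length_between_defects < lcl:
        return "signal: possible process deterioration"
    if run_length_between_defects > ucl:
        return "signal: possible process improvement"
    return "in control"

print(check(5))   # a defect after only 5 conforming placements
```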

15.
Zhu B, Song PX, Taylor JM. Biometrics 2011, 67(4):1295–1304
This article presents a new modeling strategy in functional data analysis. We consider the problem of estimating an unknown smooth function given functional data with noise. The unknown function is treated as the realization of a stochastic process, which is incorporated into a diffusion model. The method of smoothing spline estimation is connected to a special case of this approach. The resulting models offer great flexibility to capture the dynamic features of functional data, and allow straightforward and meaningful interpretation. The likelihood of the models is derived with Euler approximation and data augmentation. A unified Bayesian inference method is carried out via a Markov chain Monte Carlo algorithm including a simulation smoother. The proposed models and methods are illustrated on some prostate-specific antigen data, where we also show how the models can be used for forecasting.  相似文献   

16.
Flexibility in part process representation and highly adaptive routing algorithms are two major sources of improvement in the control of flexible manufacturing systems (FMSs). This article reports an investigation of the impact of these two kinds of flexibility on system performance. We argue that, when feasible, the choices of operations and the sequencing of part process plans should be deferred until detailed knowledge about the real-time factory state is available. To test our ideas, a flexible routing control simulation system (FRCS) was constructed, and a programming language for modeling FMS part process plans, control strategies, and environments was designed and implemented. In addition, a scheme for implementing flexible process routing, called the data flow dispatching rule (DFDR), was derived. The simulation results indicate that flexible processing can reduce mean flow time while increasing system throughput and machine utilization. We observed that this form of flexibility makes automatic load balancing of the machines possible. On the other hand, it also makes the control and scheduling process more complicated and calls for new control algorithms.

17.
Although invasion risk is expected to increase with propagule pressure (PP), it is unclear whether PP-invasibility relationships follow an asymptotic or some other non-linear form and whether such relationships vary with underlying environmental conditions. Using manipulations of PP, soil fertility, and disturbance, we tested how each influences PP-invasibility relationships for Lespedeza cuneata in a Kansas grassland and used recruitment curve models to determine how safe sites may contribute to plant invasions. After three growing seasons, we found that the PP-invasibility relationships best fit an asymptotic model of invasion, reflecting a combination of density-independent and density-dependent processes, and that seeds were aggregated within the plant community despite efforts to sow seeds uniformly. Consistent with some models, community invasibility decreased with enhanced soil fertility or reduced levels of disturbance in response to changes in the fraction of safe sites. Our results illustrate that disturbance and soil fertility can be useful organizing principles for predicting community invasibility, that asymptotic models are a reasonable starting point for modeling invasion, and that new modeling techniques, coupled with classic experimental approaches, can enhance our understanding of the invasion process.
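An asymptotic PP-invasibility relationship can be fitted as a saturating recruitment curve, N(P) = aP/(1 + aP/K), whose initial slope a reflects density-independent establishment and whose asymptote K can be read as the number of safe sites; the sketch below fits this form to simulated sowing data. The functional form and numbers are illustrative, not the study's fitted model.

```python
# Sketch: fit an asymptotic recruitment curve to (propagule pressure,
# recruits) data. N(P) = a*P / (1 + a*P/K): 'a' is the per-seed
# establishment probability at low density and K the asymptote, readable
# as the number of safe sites. Data are simulated.
import numpy as np
from scipy.optimize import curve_fit

def recruitment(P, a, K):
    return a * P / (1.0 + a * P / K)

rng = np.random.default_rng(4)
pressure = np.repeat([50, 100, 200, 400, 800, 1600], 4)         # seeds sown per plot
recruits = rng.poisson(recruitment(pressure, a=0.08, K=40.0))    # simulated counts

(a_hat, K_hat), _ = curve_fit(recruitment, pressure, recruits, p0=[0.05, 30.0])
print(f"per-seed establishment a = {a_hat:.3f}, safe-site asymptote K = {K_hat:.1f}")
```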

18.
Mathematical models of cellular metabolism are of special interest in biotechnology. Many different kinds of commercially important products are derived from the cell factory, and metabolic engineering can be applied to improve existing production processes, as well as to make new processes available. Both stoichiometric and kinetic models have been used to investigate the metabolism, which has resulted in defining optimal fermentation conditions, as well as in directing the genetic changes to be introduced in order to obtain a good producer strain or cell line. With the increasing availability of genomic information and powerful analytical techniques, mathematical models also serve as a tool for understanding cellular metabolism and physiology.

19.
Hall DB, Clutter M. Biometrics 2004, 60(1):16–24
Nonlinear mixed effects models have become important tools for growth and yield modeling in forestry. To date, applications have concentrated on modeling single growth variables such as tree height or bole volume. Here, we propose multivariate multilevel nonlinear mixed effects models for describing several plot-level timber quantity characteristics simultaneously. We describe how such models can be used to produce future predictions of timber volume (yield). The class of models and methods of estimation and prediction are developed and then illustrated on data from a University of Georgia study of the effects of various site preparation methods on the growth of slash pine (Pinus elliottii Engelm.).  相似文献   

20.
Background, Aims and Scope

Allocation is required when quantifying environmental impacts of individual products from multi-product manufacturing plants. The International Organization for Standardization (ISO) recommends in ISO 14041 that allocation should reflect underlying physical relationships between inputs and outputs or, in the absence of such knowledge, other relationships (e.g. economic value). Economic allocation is generally recommended if process-specific information on the manufacturing process is lacking. In this paper, a physico-chemical allocation matrix, based on industry-specific data from the dairy industry, is developed and discussed as an alternative allocation method.

Methods

Operational data from 17 dairy manufacturing plants was used to develop an industry-specific physico-chemical allocation matrix. Through an extensive process of subtraction/substitution, it is possible to determine average resource use (e.g. electricity, thermal energy, water) and wastewater emissions for individual dairy products within multi-product manufacturing plants. The average operational data for individual products were normalised to maintain industry confidentiality and then used as an industry-specific allocation matrix. The quantity of raw milk required per product is calculated on a milk solids basis to account for dairy by-products that would otherwise be neglected.

Results and Discussion

Applying fixed-type allocation methods (e.g. economic) to all inputs and outputs based on the quantity of product introduces order-of-magnitude deviations from physico-chemical allocation in some cases. The error associated with the quality of the whole-of-factory plant data, or the truncation error associated with setting system boundaries, is insignificant in comparison. The profound effects of the results on systems analysis are discussed. The results raise concerns about using economic allocation as a default when allocating intra-industry sectoral flows (i.e. mass and process energy) in the absence of detailed technical information. It is suggested that economic allocation is better suited as a default for reflecting inter-industry sectoral flows.

Conclusion

The study highlights the importance of accurate causal allocation procedures that reflect industry-specific production methods. Generation of industry-specific allocation matrices is possible through a process of substitution/subtraction and optimisation. Allocation using such matrices overcomes the inherent bias of mass, process energy, or price allocations for a multi-product manufacturing plant and gives a more realistic indication of resource use or emissions per product. The approach appears to be advantageous for allocating resource use or emissions when data are only available on a whole-of-factory basis for several plants with a similar level of technology.

Recommendation and Perspective

The industry-specific allocation matrix approach will assist with allocation in multi-product LCAs where the level of technology in an industry is similar. The matrix will also benefit dairy manufacturing companies and help them more accurately allocate resources and impacts (i.e. costs) to different products within a single plant. It is recommended that similar physico-chemical allocation matrices be developed for other industry sectors, with a view to ultimately coupling them with input-output analysis.
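The physico-chemical allocation step itself is simple matrix arithmetic: plant-level totals are split over products in proportion to their modelled intensity-times-production shares rather than by mass or price. The sketch below contrasts the two approaches; all intensities and production figures are invented.

```python
# Sketch of allocation with a physico-chemical matrix: plant-level
# electricity use is split over products in proportion to their modelled
# (intensity x production) shares rather than by mass or price.
# All intensities and production figures are invented.
import numpy as np

products = ["milk", "cheese", "milk powder"]
production_t = np.array([60_000, 8_000, 5_000])    # t/yr per product
intensity = np.array([0.10, 0.9, 2.0])             # modelled MWh electricity per t

plant_total_mwh = 22_000.0                         # measured whole-of-factory use

modelled = intensity * production_t                # MWh implied by the matrix
shares = modelled / modelled.sum()                 # physico-chemical allocation keys
allocated = shares * plant_total_mwh               # reconcile to the measured total

mass_shares = production_t / production_t.sum()    # mass allocation, for contrast
for i, name in enumerate(products):
    print(f"{name:12s} physico-chemical {allocated[i]:8.0f} MWh | "
          f"mass-based {mass_shares[i] * plant_total_mwh:8.0f} MWh")
```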
