Similar Literature
20 similar records found (search time: 0 ms)
1.
《Current biology : CB》2022,32(13):2921-2934.e3

2.
A cost analysis of a real facility for the production of high-value microalgae biomass is presented. The facility is based on ten 3 m3 tubular photobioreactors operated in continuous mode for 2 years; data on Scenedesmus almeriensis productivity as well as nutrient and power consumption from this facility were used. The yield of the facility was close to the maximum expected for the location of Almería, with an annual production capacity of 3.8 t/year (90 t/ha·year) and a photosynthetic efficiency of 3.6%. The production cost was 69 €/kg. Economic analysis shows that labor and depreciation are the major contributors to this cost. Simplifying the technology and scaling up to a production capacity of 200 t/year would reduce the production cost to 12.6 €/kg. Moreover, to bring the microalgae production cost close to that of the energy or commodity markets, it is necessary to reduce the photobioreactor cost (by simplifying its design or the materials used), use wastewater and flue gases, and reduce the power consumption and labor required for the production step. It can be concluded that although production of biofuels from microalgae has been reported to be relatively close to economic feasibility, the data reported here demonstrate that achieving it with current production technologies requires substantially reducing their costs and operating them near their optimum values.
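The cost-per-kilogram arithmetic behind such an economic analysis can be sketched as below. The annual output (3.8 t/year) and the resulting 69 €/kg are taken from the abstract; the individual line-item amounts are hypothetical placeholders chosen only so that labor and depreciation dominate, as the study concludes.

```python
# Illustrative production-cost breakdown for a microalgae facility.
# Line-item figures are hypothetical, not the study's actual cost data.

annual_production_kg = 3800  # reported annual capacity: 3.8 t/year

# Hypothetical annual cost items (EUR/year); labor and depreciation dominate.
costs = {
    "labor": 110_000,
    "depreciation": 90_000,
    "power": 25_000,
    "nutrients": 20_000,
    "other": 17_200,
}

total_cost = sum(costs.values())
cost_per_kg = total_cost / annual_production_kg
print(f"total cost: {total_cost} EUR/year -> {cost_per_kg:.1f} EUR/kg")

# Share of each item in the total, to see which factors dominate.
for item, cost in costs.items():
    print(f"{item}: {100 * cost / total_cost:.1f}%")
```

With these placeholder numbers the total works out to the reported 69 €/kg, and the per-item shares make the dominance of labor and depreciation explicit.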

3.
We calculate and analyze the information capacity-achieving conditions and their approximations in a simple neuronal system. The input–output properties of individual neurons are described by an empirical stimulus–response relationship, and the metabolic cost of neuronal activity is taken into account. The exact (numerical) results are compared with a popular “low-noise” approximation method which employs the concepts of parameter estimation theory. We show that the approximate method gives reliable results only when response variability is sufficiently low. By employing specialized numerical procedures, we demonstrate that near-optimal information transfer can be achieved by a number of different input distributions. This implies that the precise structure of the capacity-achieving input is of lesser importance than the value of the capacity. Finally, we illustrate with an example that an innocuous-looking stimulus–response relationship may lead to a problematic interpretation of the obtained Fisher information values.
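The low-noise machinery the abstract refers to can be sketched numerically. For a response with stimulus-dependent mean f(x) and additive Gaussian noise of standard deviation sigma, the Fisher information is J(x) = f'(x)^2 / sigma^2, and the standard asymptotic capacity bound involves log of the integral of sqrt(J/(2*pi*e)). The sigmoidal curve and noise level below are illustrative assumptions, not the empirical relationship used in the study.

```python
import numpy as np

# Hypothetical sigmoidal stimulus-response curve with Gaussian noise.
def f(x):
    return 1.0 / (1.0 + np.exp(-x))

sigma = 0.05                     # assumed (low) response variability
x = np.linspace(-5, 5, 1001)
dx = x[1] - x[0]

dfdx = np.gradient(f(x), x)      # numerical derivative of the response curve
J = dfdx**2 / sigma**2           # pointwise Fisher information J(x)

# Low-noise capacity approximation: C ~ log( integral sqrt(J/(2*pi*e)) dx ).
integral = np.sum(np.sqrt(J)) * dx
capacity_bound = np.log(integral / np.sqrt(2 * np.pi * np.e))
print(f"low-noise capacity bound: {capacity_bound:.3f} nats")
```

As the abstract cautions, this approximation is only trustworthy when sigma is genuinely small relative to the response range; the exact numerical capacity can differ otherwise.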

4.
《Cytotherapy》2014,16(5):619-630
Background aimsCytotoxic T lymphocytes modified with chimeric antigen receptors (CARs) for adoptive immunotherapy of hematologic malignancies are effective in pre-clinical models, and this efficacy has translated to success in several clinical trials. Many early trials were disappointing in large part because of the lack of proliferation and subsequent persistence of transferred cells. Recent investigations have pointed to the importance of delivering highly proliferative cells, whether of naive or early memory phenotypes.MethodsWe investigated the influence of two common cell culturing methods used in early trials and their relationship to T-cell phenotype and pre-clinical efficacy.ResultsWe observed that stimulation with soluble anti-CD3 antibody OKT-3 and high-dose interleukin-2 produces more effector memory-type T cells with shorter average telomeres when compared with cells generated with the use of CD3/CD28 beads. When used in xenograft models of leukemia, bead-stimulated cells proliferated earlier and to a higher degree than those generated with the use of OKT-3/IL2 and resulted in better disease control despite no difference in distribution or migration throughout the mouse. Inclusion of the known successful clinical 4-1BB endodomain in the CAR could not rescue the function of OKT-3/IL-2–cultured cells. T cells isolated from animals that survived long-term (>120 days) retained a central memory–like phenotype and demonstrated a memory response to a large re-challenge of CD19-positive leukemia.ConclusionsIn summary, we confirm that cells with a younger phenotype or higher proliferative capacity perform better in pre-clinical models and that cell culturing influences cell phenotype seemingly independent of the 4-1BB endodomain in the CAR structure.  相似文献   

5.
Escherichia coli is one of the most commonly used host organisms for the production of recombinant biopharmaceuticals. E. coli is usually characterized by fast growth on cheap media and high productivity, but one drawback is its intracellular product formation. Product recovery from E. coli bioprocesses requires tedious downstream processing (DSP). A typical E. coli DSP for an intracellular product starts with a cell disruption step to access the product. Different methods exist, but a scalable process is usually achieved by high pressure homogenization (HPH). The protocols for HPH are often applied universally without adapting them to the recombinant product, even though HPH can affect product quantity and quality. Based on our previous study on cell disruption efficiency, we aimed at screening operational conditions to maximize not only product quantity, but also product quality of a soluble therapeutic protein expressed in E. coli. We screened for critical process parameters (CPPs) using a multivariate approach (design of experiments; DoE) during HPH to maximize product titer and achieve sufficient product quality, based on predefined critical quality attributes (CQAs). In this case study, we were able to gain valuable knowledge on the efficiency of HPH on E. coli cell disruption, product release and its impact on CQAs. Our results show that HPH is a key unit operation that has to be optimized for each product.
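The DoE screening step can be illustrated with a two-level full-factorial design and main-effect estimation. The factor names, their levels, and the linear response function below are hypothetical placeholders; in the study the responses would be measured titer and quality attributes, not a formula.

```python
from itertools import product

# Hypothetical HPH operating parameters and two-level settings.
factors = {
    "pressure_bar": (600, 1200),
    "passes": (1, 3),
    "inlet_temp_C": (4, 20),
}

# Coded design matrix: every combination of low (-1) and high (+1) levels.
design = list(product((-1, +1), repeat=len(factors)))

def response(run):
    # Hypothetical linear response, used only to demonstrate effect estimation.
    pressure, passes, temp = run
    return 50 + 8 * pressure + 5 * passes - 2 * temp

# Main effect of each factor: mean(high runs) - mean(low runs).
effects = {}
for i, name in enumerate(factors):
    hi = [response(r) for r in design if r[i] == +1]
    lo = [response(r) for r in design if r[i] == -1]
    effects[name] = sum(hi) / len(hi) - sum(lo) / len(lo)

print(effects)  # the factor with the largest absolute effect is the CPP candidate
```

Ranking factors by absolute effect size is the usual screening output; a follow-up design would then map the proven acceptable ranges for the identified CPPs.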

6.
We combined economic and life-cycle analyses in an integrated framework to ascertain greenhouse gas (GHG) intensities, production costs, and abatement costs of GHG emissions for ethanol and electricity derived from three woody feedstocks (logging residues only, pulpwood only, and pulpwood and logging residues combined) across two forest management choices (intensive and nonintensive) and 31 harvest ages (year 10 to year 40 in steps of 1 year) on reforested and afforested lands at the production level for slash pine (Pinus elliottii) in the Southern United States. We assumed that wood chips and wood pellets would be used to produce ethanol and generate electricity, respectively. Production costs and GHG intensities of ethanol and electricity were lowest for logging residues at the optimal rotation age for both forest management choices. Opportunity cost associated with the change in rotation age was a significant determinant of the variability in the overall production cost. GHG intensity of feedstocks obtained from afforested land was lower than from reforested land. Relative savings in GHG emissions were higher for ethanol than for electricity. Abatement cost of GHG emissions for ethanol was lower than for electricity, especially when feedstocks were obtained from a plantation whose rotation age was close to the optimal rotation age. A carbon tax of at least $25 and $38 per Mg CO2e will be needed to promote production of ethanol from wood chips and electricity from wood pellets in the US, respectively.
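The break-even carbon tax behind an abatement-cost figure is simple arithmetic: the tax per Mg CO2e at which the bioenergy product's cost equals its fossil alternative. The function below sketches this; the example input numbers are hypothetical, not the study's data (the study reports break-even values of $25 for ethanol and $38 for electricity).

```python
def breakeven_tax(cost_bio, cost_fossil, ghg_fossil, ghg_bio):
    """Carbon tax ($/Mg CO2e) at which bio and fossil product costs are equal.

    cost_* in $ per unit product; ghg_* in Mg CO2e per unit product.
    """
    return (cost_bio - cost_fossil) / (ghg_fossil - ghg_bio)

# Hypothetical example: the bio product costs $1.00/unit more but saves
# 0.04 Mg CO2e per unit, giving a break-even tax of about $25/Mg CO2e.
print(breakeven_tax(3.00, 2.00, 0.05, 0.01))
```

A lower abatement cost thus means either a smaller cost premium or a larger per-unit GHG saving, which is why residues at the optimal rotation age fare best.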

7.
This study describes the application of quality by design (QbD) principles to the development and implementation of a major manufacturing process improvement for a commercially distributed therapeutic protein produced in Chinese hamster ovary cell culture. The intent of this article is to focus on QbD concepts, and provide guidance and understanding on how the various components combine together to deliver a robust process in keeping with the principles of QbD. A fed-batch production culture and a virus inactivation step are described as representative examples of upstream and downstream unit operations that were characterized. A systematic approach incorporating QbD principles was applied to both unit operations, involving risk assessment of potential process failure points, small-scale model qualification, design and execution of experiments, definition of operating parameter ranges and process validation acceptance criteria followed by manufacturing-scale implementation and process validation. Statistical experimental designs were applied to the execution of process characterization studies evaluating the impact of operating parameters on product quality attributes and process performance parameters. Data from process characterization experiments were used to define the proven acceptable range and classification of operating parameters for each unit operation. Analysis of variance and Monte Carlo simulation methods were used to assess the appropriateness of process design spaces. Successful implementation and validation of the process in the manufacturing facility and the subsequent manufacture of hundreds of batches of this therapeutic protein verifies the approaches taken as a suitable model for the development, scale-up and operation of any biopharmaceutical manufacturing process.
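The Monte Carlo assessment of a design space can be sketched as follows: sample operating parameters uniformly within their proven acceptable ranges and check how often a fitted quality-attribute model stays within its acceptance criterion. The parameter ranges, the linear CQA model, and the specification limit below are all hypothetical assumptions for illustration.

```python
import random

random.seed(0)

def cqa_model(temp, ph):
    # Hypothetical fitted model of a quality attribute from
    # process characterization studies (illustrative coefficients).
    return 2.0 + 0.30 * (temp - 36.5) + 0.50 * (ph - 7.0)

n = 10_000
within = 0
for _ in range(n):
    temp = random.uniform(36.0, 37.0)  # assumed proven acceptable range, deg C
    ph = random.uniform(6.9, 7.1)      # assumed proven acceptable range
    if cqa_model(temp, ph) <= 2.5:     # hypothetical upper CQA limit
        within += 1

print(f"fraction of simulated batches within spec: {within / n:.3f}")
```

Here the worst-case model value (2.2) stays below the limit, so every simulated batch passes, which is the kind of evidence used to argue that a design space is appropriate.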

8.
The strength of adhesion and dynamics of detachment of murine 3T3 fibroblasts from self-assembled monolayers were measured in a radial-flow chamber (RFC) by applying models for fluid mechanics, adhesion strength probability distributions, and detachment kinetics. Four models for predicting fluid mechanics in a RFC were compared to evaluate the accuracy of each model and the significance of inlet effects. Analysis of these models indicated an outer region at large radial positions consistent with creeping flow, an intermediate region influenced by inertial dampening, and an inner region dominated by entrance effects from the axially-oriented inlet. In accompanying experiments patterns of the fraction of cells resisting detachment were constructed for individual surfaces as a function of the applied shear stress and evaluated by comparison with integrals of both a normal and a log-normal distribution function. The two functions were equally appropriate, yielding similar estimates of the mean strength of adhesion. Further, varying the Reynolds number in the inlet, Re(d), between 630 and 1480 (corresponding to volumetric flow rates between 0.9 and 2.1 mL/s) did not affect the mean strength of adhesion. For these same experiments, analysis of the dynamics of detachment revealed three temporal phases: 1) rapid detachment of cells at the onset of flow, consistent with a first-order homogeneous kinetic model; 2) time-dependent rate of detachment during the first 30 sec. of exposure to hydrodynamic shear, consistent with the first-order heterogeneous kinetic model proposed by Dickinson and Cooper (1995); and 3) negligible detachment, indicative of pseudo-steady state after 60 sec. of flow. Our results provide rigorous guidelines for the measurement of adhesive interactions between mammalian cells and prospective biomaterial surfaces using a RFC. (c) 1997 John Wiley & Sons, Inc. Biotechnol Bioeng 55: 616-629, 1997.
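The "fraction of cells resisting detachment" fitted to a normal or log-normal adhesion-strength distribution is just the survival function (one minus the CDF) evaluated at the applied shear stress. A minimal sketch, with illustrative mean and spread values rather than the paper's fitted parameters:

```python
import math

def frac_resisting_normal(tau, mu, sigma):
    # 1 - CDF of a normal adhesion-strength distribution at shear stress tau.
    return 0.5 * (1.0 - math.erf((tau - mu) / (sigma * math.sqrt(2))))

def frac_resisting_lognormal(tau, mu_log, sigma_log):
    # 1 - CDF of a log-normal adhesion-strength distribution (tau > 0).
    return 0.5 * (1.0 - math.erf((math.log(tau) - mu_log) / (sigma_log * math.sqrt(2))))

# At the distribution's central value, half the cells resist detachment.
print(frac_resisting_normal(10.0, 10.0, 2.0))           # 0.5
print(frac_resisting_lognormal(math.e**2.0, 2.0, 0.5))  # ~0.5
```

Fitting either function to measured detachment fractions versus shear stress yields the mean adhesion strength; the abstract's finding is that both parameterizations give similar means.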

9.
Vaccination is one of the most successful public health interventions and a cost-effective tool in preventing deaths among young children. The earliest vaccines were developed following empirical methods, creating vaccines by trial and error. New process development tools, for example mathematical modeling, as well as new regulatory initiatives requiring better understanding of both the product and the process, are being applied to well-characterized biopharmaceuticals (for example, recombinant proteins). The vaccine industry still lags behind in this respect. A production process for a new Haemophilus influenzae type b (Hib) conjugate vaccine, including related quality control (QC) tests, was developed and transferred to a number of emerging vaccine manufacturers. This contributed to a sustainable global supply of affordable Hib conjugate vaccines, as illustrated by the market launch of the first Hib vaccine based on this technology in 2007 and the concomitant price reduction of Hib vaccines. This paper describes the development approach followed for this Hib conjugate vaccine as well as the mathematical modeling tool applied recently to identify options for further improvement of the initial Hib process. The strategy followed during the process development of this Hib conjugate vaccine was a targeted and integrated approach based on prior knowledge and experience with similar products, using multi-disciplinary expertise. Mathematical modeling was used to develop a predictive model for the initial Hib process (the 'baseline' model) as well as an 'optimized' model, proposing a number of process changes that could lead to further price reduction. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:568–580, 2016

10.
A model-discriminating experimental design approach for fed-batch processes has been developed and applied, as an example, to the fermentative production of L-valine by a genetically modified Corynebacterium glutamicum strain possessing multiple auxotrophies. Faced with the typical situation of uncertain model information based on preliminary experiments, model-discriminating design was successfully applied to improve discrimination between five competing models. Within the same modeling and experimental design framework, the planning of a production process optimized for total volumetric productivity is also shown. Simulation results were experimentally confirmed, yielding an increased total volumetric productivity of 6.2 mM L-valine per hour. However, previously unknown metabolic mechanisms were also observed in the optimized process, underlining the importance of process optimization during modeling to avoid extreme extrapolation of model predictions during the final process optimization.

11.
Derived from any somatic cell type and possessing unlimited self-renewal and differentiation potential, induced pluripotent stem cells (iPSCs) are poised to revolutionize stem cell biology and regenerative medicine research, bringing unprecedented opportunities for treating debilitating human diseases. To overcome the limitations associated with safety, efficiency, and scalability of traditional iPSC derivation, expansion, and differentiation protocols, biomaterials have recently been considered. Beyond addressing these limitations, the integration of biomaterials with existing iPSC culture platforms could offer additional opportunities to better probe the biology and control the behavior of iPSCs or their progeny in vitro and in vivo. Herein, we discuss the impact of biomaterials on the iPSC field, from derivation to tissue regeneration and modeling. Although still exploratory, we envision the emerging combination of biomaterials and iPSCs will be critical in the successful application of iPSCs and their progeny for research and clinical translation.

12.
This paper discusses regression analysis of longitudinal data in which the observation process may be related to the longitudinal process of interest. Such data have recently attracted a great deal of attention and some methods have been developed. However, most of those methods treat the observation process as a recurrent event process, which assumes that one observation can immediately follow another. Sometimes this is not the case, as there may be some delay or observation duration. Such a process is often referred to as a recurrent episode process. One example is the medical cost related to hospitalization, where each hospitalization serves as a single observation. For this problem, we present a joint approach for regression analysis of both the longitudinal and observation processes, and a simulation study is conducted to assess the finite-sample performance of the approach. The asymptotic properties of the proposed estimates are also given, and the method is applied to the medical cost data that motivated this study.

13.
14.
In recent years there has been much interest in the genetic enhancement of plant metabolism; however, attempts at genetic modification are often unsuccessful due to an incomplete understanding of network dynamics and their regulatory properties. Kinetic modeling of plant metabolic networks can provide predictive information on network control and response to genetic perturbations, which allows estimation of flux at any concentration of intermediate or enzyme in the system. In this research, a kinetic model of the benzenoid network was developed to simulate whole-network responses to different concentrations of supplied phenylalanine (Phe) in petunia flowers and capture flux redistributions caused by genetic manipulations. Kinetic parameters were obtained by network decomposition and non-linear least squares optimization of data from petunia flowers supplied with either 75 or 150 mM 2H5-Phe. A single set of kinetic parameters simultaneously accommodated labeling and pool size data obtained for all endogenous and emitted volatiles at the two concentrations of supplied 2H5-Phe. The generated kinetic model was validated using flowers from transgenic petunia plants in which benzyl CoA:benzyl alcohol/phenylethanol benzoyltransferase (BPBT) was down-regulated via RNAi. The determined in vivo kinetic parameters were used for metabolic control analysis, in which flux control coefficients were calculated for fluxes around the key branch point at Phe and revealed that phenylacetaldehyde synthase activity is the primary controlling factor for the phenylacetaldehyde branch of the benzenoid network. In contrast, control of flux through the β-oxidative and non-β-oxidative pathways is highly distributed.
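The parameter-estimation step can be illustrated on a deliberately tiny case: fit a first-order rate constant k for a single pool, y(t) = y0 * exp(-k t), to time-course data by least squares. The real study fits a full benzenoid network model by non-linear least squares; this one-pool example, its grid search, and its numbers are illustrative assumptions only.

```python
import math

def simulate(y0, k, times):
    # One-pool first-order kinetics: y(t) = y0 * exp(-k t).
    return [y0 * math.exp(-k * t) for t in times]

times = [0, 1, 2, 4, 8]
data = simulate(100.0, 0.35, times)  # synthetic noise-free "measurements", true k = 0.35

def sse(k):
    # Sum of squared errors between model prediction and the data.
    model = simulate(100.0, k, times)
    return sum((m - d) ** 2 for m, d in zip(model, data))

# Coarse grid search over k in (0, 1] for the least-squares estimate.
k_hat = min((k / 1000 for k in range(1, 1001)), key=sse)
print(f"estimated k = {k_hat}")  # recovers 0.35
```

A network-scale fit works the same way in principle, but with many pools, shared parameters across the 75 and 150 mM datasets, and a proper optimizer instead of a grid.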

15.
We present a model-based method, designated Inverse Metabolic Control Analysis (IMCA), which can be used in conjunction with classical Metabolic Control Analysis for the analysis and design of cellular metabolism. We demonstrate the capabilities of the method by first developing a comprehensively curated kinetic model of sphingolipid biosynthesis in the yeast Saccharomyces cerevisiae. Next we apply IMCA using the model and integrating lipidomics data. The combinatorial complexity of the synthesis of sphingolipid molecules, along with the operational complexity of the participating enzymes of the pathway, presents an excellent case study for testing the capabilities of the IMCA. The exceptional agreement of the predictions of the method with genome-wide data highlights the importance and value of a comprehensive and consistent engineering approach for the development of such methods and models. Based on the analysis, we identified the class of enzymes regulating the distribution of sphingolipids among species and hydroxylation states, with the D-phospholipase SPO14 being one of the most prominent. The method and the applications presented here can be used for a broader, model-based inverse metabolic engineering approach.

16.
In macroscopic dynamic models of fermentation processes, elementary modes (EM) derived from metabolic networks are often used to describe the reaction stoichiometry in a simplified manner and to build predictive models by parameterizing kinetic rate equations for the EM. In this procedure, the selection of a set of EM is a key step which is followed by an estimation of their reaction rates and of the associated confidence bounds. In this paper, we present a method for the computation of reaction rates of cellular reactions and EM as well as an algorithm for the selection of EM for process modeling. The method is based on the dynamic metabolic flux analysis (DMFA) proposed by Leighty and Antoniewicz (2011, Metab Eng, 13(6), 745–755) with additional constraints, regularization and analysis of uncertainty. Instead of using estimated uptake or secretion rates, concentration measurements are used directly to avoid an amplification of measurement errors by numerical differentiation. It is shown that the regularized DMFA for EM method is significantly more robust against measurement noise than methods using estimated rates. The confidence intervals for the estimated reaction rates are obtained by bootstrapping. For the selection of a set of EM for a given stoichiometric model, the DMFA for EM method is combined with a multiobjective genetic algorithm. The method is applied to real data from a CHO fed-batch process. From measurements of six fed-batch experiments, 10 EM were identified as the smallest subset of EM based upon which the data can be described sufficiently accurately by a dynamic model. The estimated EM reaction rates and their confidence intervals at different process conditions provide useful information for the kinetic modeling and subsequent process optimization.
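The bootstrapping step used to put confidence bounds on estimated rates can be sketched with the percentile bootstrap: resample the available estimates with replacement and take empirical percentiles of the resampled means. The replicate rate values below are hypothetical; in the study the resampling happens inside a regularized DMFA fit rather than over raw replicates.

```python
import random
import statistics

random.seed(1)
replicate_rates = [0.92, 1.05, 0.98, 1.10, 0.95, 1.02]  # hypothetical replicate estimates

# Percentile bootstrap: resample with replacement, record the mean each time.
boot_means = []
for _ in range(5000):
    sample = [random.choice(replicate_rates) for _ in replicate_rates]
    boot_means.append(statistics.mean(sample))

boot_means.sort()
lo = boot_means[int(0.025 * len(boot_means))]
hi = boot_means[int(0.975 * len(boot_means))]
print(f"rate ~ {statistics.mean(replicate_rates):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```

The width of such intervals at different process conditions is exactly the kind of uncertainty information the abstract says feeds into kinetic modeling and process optimization.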

17.
Multivariate resolution methods make up a set of mathematical tools that may be applied to the analysis and interpretation of spectroscopic data recorded when monitoring a physical or chemical process with multichannel detectors. The goal of resolution methods is the recovery of chemical and/or physical information from the experimental data. Such data include, for example, the number of intermediates present in a reaction, the rate or equilibrium constants, and the spectra for each one of those intermediates. Multivariate resolution methods have been shown to be useful for the study of biophysical and biochemical processes such as folding/unfolding of proteins or nucleic acids. The present article reviews the most frequently used resolution methods, the limitations on their use, and their latest applications in protein and nucleic acid research.  
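The core of many resolution methods is a bilinear factorization of the data matrix, D = C S, into concentration profiles C and component spectra S, refined by alternating least squares. The sketch below resolves noise-free synthetic two-component data; it omits the non-negativity and other constraints that practical curve-resolution methods apply, and the A-to-B kinetic profiles and spectra are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 50)

# Synthetic two-component process A -> B: concentration profiles and spectra.
C_true = np.column_stack([np.exp(-3 * t), 1 - np.exp(-3 * t)])
S_true = np.array([[1.0, 0.2, 0.0],
                   [0.1, 0.8, 0.5]])       # 2 components x 3 channels
D = C_true @ S_true                         # measured data matrix (50 x 3)

# Alternating least squares from a random start (unconstrained for brevity).
C = np.abs(rng.standard_normal(C_true.shape))
for _ in range(10):
    S = np.linalg.lstsq(C, D, rcond=None)[0]          # fix C, solve for spectra
    C = np.linalg.lstsq(S.T, D.T, rcond=None)[0].T    # fix S, solve for profiles

residual = np.linalg.norm(D - C @ S)
print(f"reconstruction residual: {residual:.2e}")
```

Without constraints the factors are only recovered up to a linear mixing; the constraints reviewed in the article (non-negativity, unimodality, closure) are what make the resolved profiles chemically interpretable.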

18.
19.
20.
Different cure fraction models have been used in the analysis of lifetime data in the presence of cured patients. This paper considers mixture and nonmixture models based on the discrete Weibull distribution to model recurrent event data in the presence of a cure fraction. The novelty of this study is the use of a discrete lifetime distribution in place of the usual continuous lifetime distributions for lifetime data in the presence of a cured fraction, censored data, and covariates. Randomized quantile residuals are proposed to verify the fit of the model. An extensive simulation study evaluates the properties of the parameter estimates of the proposed model. As an illustration of the methodology, an application is considered using a medical dataset of lifetimes from a retrospective cohort study conducted by Puchner et al. (2017), consisting of 147 consecutive cases with surgical treatment of a sarcoma of the pelvis between 1980 and 2012.
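The mixture cure model the abstract describes can be sketched directly: population survival is S(t) = pi + (1 - pi) * S0(t), where pi is the cured fraction and S0 is the survival of the uncured, here taken as the discrete Weibull of Nakagawa and Osaki with P(T > t) = q^((t+1)^beta). The parameter values below are illustrative, not fitted to the sarcoma data.

```python
def dweibull_surv(t, q, beta):
    """P(T > t) for a discrete Weibull(q, beta), t = 0, 1, 2, ..."""
    return q ** ((t + 1) ** beta)

def mixture_surv(t, pi, q, beta):
    """Population survival: a cured fraction pi never experiences the event."""
    return pi + (1.0 - pi) * dweibull_surv(t, q, beta)

# The population survival decreases with t but plateaus at the cure fraction.
print(mixture_surv(0, 0.3, 0.9, 1.2))
print(mixture_surv(100, 0.3, 0.9, 1.2))  # approaches pi = 0.3
```

The plateau is the defining signature of cure-fraction data: a Kaplan-Meier curve that levels off above zero motivates estimating pi alongside the lifetime parameters.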
