Similar Literature
 20 similar documents retrieved (search time: 93 ms)
1.
The possibility of modeling the information field of an entity (a molecule), which can be represented as a discrete parametric system of elements (atoms), is demonstrated. The idea of the approach is to apply Shannon's method of quantitative information estimation not only to the molecular structure but also to the surrounding space. The information field of a molecule can then be used to solve "structure-property" problems.

2.
The concept of "design space" has been proposed in the ICH Q8 guideline and is gaining momentum in its application in the biotech industry. It has been defined as "the multidimensional combination and interaction of input variables (e.g., material attributes) and process parameters that have been demonstrated to provide assurance of quality." This paper presents a stepwise approach for defining process design space for a biologic product. A case study, involving P. pastoris fermentation, is presented to facilitate this. First, risk analysis via Failure Modes and Effects Analysis (FMEA) is performed to identify parameters for process characterization. Second, small-scale models are created and qualified prior to their use in these experimental studies. Third, studies are designed using Design of Experiments (DOE) in order for the data to be amenable for use in defining the process design space. Fourth, the studies are executed and the results analyzed for decisions on the criticality of the parameters as well as on establishing process design space. For the application under consideration, it is shown that the fermentation unit operation is very robust with a wide design space and no critical operating parameters. The approach presented here is not specific to the illustrated case study. It can be extended to other biotech unit operations and processes that can be scaled down and characterized at small scale.  相似文献   

3.
Shannon's definition of uncertainty, or surprisal, has been applied extensively to measure the information content of aligned DNA sequences and to characterize DNA binding sites. In contrast to Shannon's uncertainty, this study investigates the applicability and suitability of the parametric uncertainty measure due to Rényi. This measure provides results in agreement with Shannon's measure, pointing to its utility in analysing DNA binding-site regions. To facilitate the comparison between these uncertainty measures, a dimensionless quantity called "redundancy" is employed. Rényi's measure at low parameter values delineates binding sites (binding regions) better than Shannon's measure. The critical value of the parameter is chosen with an outlier criterion.
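
A minimal sketch of the comparison described above: per-column Shannon and Rényi uncertainties, expressed as the dimensionless redundancy, for a toy set of aligned sites. The alignment and the Rényi order alpha are made-up illustrations, not data from the study.

```python
# Sketch: per-position Shannon and Rényi uncertainty (and redundancy) for aligned sites.
import math
from collections import Counter

sites = ["TATAAT", "TATGAT", "TACAAT", "TATACT"]  # made-up aligned binding sites
alpha = 0.5                                        # Rényi order (alpha != 1), illustrative
H_MAX = 2.0                                        # log2(4) bits for DNA

def column_probs(col):
    counts = Counter(col)
    total = sum(counts.values())
    return [c / total for c in counts.values()]

def shannon(p):
    return -sum(pi * math.log2(pi) for pi in p)

def renyi(p, a):
    # H_a = log2(sum p_i^a) / (1 - a)
    return math.log2(sum(pi ** a for pi in p)) / (1.0 - a)

for j in range(len(sites[0])):
    p = column_probs([s[j] for s in sites])
    Hs, Hr = shannon(p), renyi(p, alpha)
    # Redundancy = 1 - H/Hmax, a dimensionless basis for comparing the two measures.
    print(f"pos {j}: Shannon redundancy={1 - Hs / H_MAX:.3f}, "
          f"Rényi(alpha={alpha}) redundancy={1 - Hr / H_MAX:.3f}")
```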

4.
Process analytical technology (PAT) has been gaining a lot of momentum in the biopharmaceutical community due to the potential for continuous real-time quality assurance resulting in improved operational control and compliance. Two of the key goals that have been outlined for PAT are "variability is managed by the process" and "product quality attributes can be accurately and reliably predicted over the design space established for materials used, process parameters, manufacturing, environmental, and other conditions". Recently, we have been examining the feasibility of applying different analytical tools for designing PAT applications for bioprocessing. We have previously shown that a commercially available online high performance liquid chromatography (HPLC) system can be used for analysis that can facilitate real-time decisions for column pooling based on product quality attributes (Rathore et al., 2008). In this article we test the feasibility of using a commercially available ultra-performance liquid chromatography (UPLC) system for real-time pooling of process chromatography columns. It is demonstrated that the UPLC system offers a feasible approach and meets the requirements of a PAT application. While the application presented here is of a reversed phase assay, the approach and the hardware can be easily applied to other modes of liquid chromatography.

5.
We present a bio-inspired strategy for designing embedded strain sensors in space structures. In insects, the campaniform sensillum is a hole extending through the cuticle arranged such that its shape changes in response to loads. The shape change is rotated through 90° by the suspension of a bell-shaped cap whose deflection is detected by a cell beneath the cuticle. It can be sensitive to displacements of the order of 1 nm. The essential morphology, a hole formed in a plate of fibrous composite material, was modelled by Skordos et al., who showed that global deformation of the plate (which can be flat, curved or a tube) induces higher local deformation of the hole due to its locally higher compliance. Further developments reported here show that this approach can be applied to groups of holes relative to their orientation. The morphology of the sensillum in insects suggests that greater sensitivity can be achieved by arranging several holes in a regular pattern; that if the hole is oval it can be "aimed" to sense specific strain directions; and that either by controlling the shape of the hole or its relationship with other holes it can have a tuned response to dynamic strains. We investigate space applications in which novel bio-inspired strain sensors could successfully be used.

6.
We present a bio-inspired strategy for designing embedded strain sensors in space structures. In insects, the campaniform sensillum is a hole extending through the cuticle arranged such that its shape changes in response to loads. The shape change is rotated through 90° by the suspension of a bell-shaped cap whose deflection is detected by a cell beneath the cuticle. It can be sensitive to displacements of the order of 1 nm. The essential morphology, a hole formed in a plate of fibrous composite material, was modelled by Skordos et al., who showed that global deformation of the plate (which can be flat, curved or a tube) induces higher local deformation of the hole due to its locally higher compliance. Further developments reported here show that this approach can be applied to groups of holes relative to their orientation. The morphology of the sensillum in insects suggests that greater sensitivity can be achieved by arranging several holes in a regular pattern; that if the hole is oval it can be "aimed" to sense specific strain directions; and that either by controlling the shape of the hole or its relationship with other holes it can have a tuned response to dynamic strains. We investigate space applications in which novel bio-inspired strain sensors could successfully be used.

7.
Fractal analysis of community evenness   Total citations: 7 (self-citations: 1, citations by others: 6)
王永繁, 余世孝, 刘蔚秋 《生态学报》(Acta Ecologica Sinica), 2003, 23(6): 1031-1036
The definition given by Frontier, Ricotta and others of the power-law relationship between the effective species richness index A and the species richness index S is revised, and the ecological significance of the fractal relationship between A and S is explored. The fractal dimension D is interpreted as a theoretical value that the community evenness measure approaches as the number of species S keeps increasing. A method is proposed that uses the equation of a straight line fitted to A versus S on log-log coordinates to describe four possible trends in community evenness. Taking the forest succession series of the Heishiding Nature Reserve in Guangdong as an example, changes in community evenness with increasing observed transect length were studied on coniferous-broadleaved mixed forest and evergreen broadleaved forest transects. The results show that the 230 m mixed-forest transect has only one linear scale-free interval, over which community evenness decreases gradually as transect length increases, approaching the fractal dimension D = 0.810. The 170 m evergreen broadleaved forest transect has two linear scale-free intervals: within the 0-25 m scale domain, evenness decreases with increasing transect length, approaching D = 0.525; within the 30-170 m scale domain, community evenness increases with increasing observed transect length, approaching D = 0.920.
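
Assuming the power-law relation A ∝ S^D discussed above, the slope of a straight-line fit to log A versus log S estimates the fractal dimension D. The sketch below illustrates the fit on fabricated (S, A) pairs, not the transect data from the study.

```python
# Sketch: estimate the fractal dimension D from the slope of a log-log fit of the
# effective species richness A against species richness S, assuming A ~ S**D.
# The (S, A) pairs below are made-up illustrations, not data from the study.
import numpy as np

S = np.array([5, 10, 20, 40, 80, 160])            # species richness along the transect
A = np.array([3.6, 6.2, 10.8, 18.5, 32.0, 55.0])  # effective species richness (hypothetical)

slope, intercept = np.polyfit(np.log(S), np.log(A), 1)
print(f"estimated fractal dimension D = {slope:.3f}")

# Evenness at the largest scale (ln A / ln S) can then be compared against D,
# the theoretical value it approaches as S keeps increasing.
print(f"evenness at largest scale = {np.log(A[-1]) / np.log(S[-1]):.3f}")
```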

8.
Salinas E  Bentley NM 《Bio Systems》2007,89(1-3):16-23
We derive a simple measure for quantifying the average accuracy with which a neuronal population can represent a stimulus. This quantity, the basis set error, has three key properties: (1) it makes no assumptions about the form of the neuronal responses; (2) it depends only on their second order statistics, so although it is easy to compute, it does take noise correlations into account; (3) its magnitude has an intuitive interpretation in terms of the accuracy with which information can be extracted from the population using a simple method-"simple" meaning linear. We use the basis set error to characterize the efficacy of several types of population codes generated synthetically in a computer. In general, the basis set error typically ranks different encoding schemes in a way that is qualitatively similar to Shannon's mutual information, except when nonlinear readout methods are necessary. Because this measure is concerned with signals that can be read out easily (i.e., through linear operations), it provides a lower bound on coding accuracy relative to the computational capabilities that are accessible to a neuronal population.

9.
Laughton CA  Orozco M  Vranken W 《Proteins》2009,75(1):206-216
NMR structures are typically deposited in databases such as the PDB in the form of an ensemble of structures. Generally, each of the models in such an ensemble satisfies the experimental data and is equally valid. No unique solution can be calculated because the experimental NMR data is insufficient, in part because it reflects the conformational variability and dynamical behavior of the molecule in solution. Even for relatively rigid molecules, the limited number of structures that are typically deposited cannot completely encompass the structural diversity allowed by the observed NMR data, but they can be chosen to try to maximize its representation. We describe here the adaptation and application of techniques more commonly used to examine large ensembles from molecular dynamics simulations to the analysis of NMR ensembles. We call the approach, which is based on principal component analysis, COCO ("Complementary Coordinates"). The COCO approach analyses the distribution of an NMR ensemble in conformational space and generates a new ensemble that fills "gaps" in the distribution. The method is very rapid, and analysis of a 25-member ensemble and generation of a new 25-member ensemble typically takes 1-2 min on a conventional workstation. Applied to the 545 structures in the RECOORD database, we find that COCO generates new ensembles that are as structurally diverse-both from each other and from the original ensemble-as are the structures within the original ensemble. The COCO approach does not explicitly take into account the NMR restraint data, yet in tests on selected structures from the RECOORD database, the COCO ensembles are frequently good matches to this data, and certainly are structures that can be rapidly refined against the restraints to yield high-quality, novel solutions. COCO should therefore be a useful aid in NMR structure refinement and in other situations where a richer representation of conformational variability is desired-for example in docking studies. COCO is freely accessible via the website www.ccpb.ac.uk/COCO.

10.
Process analytical technology (PAT) has been gaining a lot of momentum in the biopharmaceutical community due to the potential for continuous real-time quality assurance resulting in improved operational control and compliance. This paper presents a PAT application for one of the most commonly used unit operations in bioprocessing, namely liquid chromatography. The feasibility of using a commercially available online high performance liquid chromatography (HPLC) system for real-time pooling of process chromatography columns is examined. Further, experimental data from the feasibility studies are modeled and predictions of the model are compared to actual experimental data. It is found that, for the application under consideration, the online HPLC indeed offers a feasible approach for analysis that can facilitate real-time decisions for column pooling based on product quality attributes. It is shown that implementing this analytical scheme allows us to meet two of the key goals that have been outlined for PAT, that is, "variability is managed by the process" and "product quality attributes can be accurately and reliably predicted over the design space established for materials used, process parameters, manufacturing, environmental, and other conditions." Finally, the implications of implementing such a PAT application in a manufacturing environment are discussed. The application presented here can be extended to other modes of process chromatography and/or HPLC analysis.
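
A minimal sketch of how an online HPLC readout could drive a real-time pooling decision: keep adding fractions to the pool while the cumulative purity stays above a target. The fraction data and the 95% purity criterion are hypothetical, not values from the paper.

```python
# Sketch of a real-time pooling decision from online HPLC results: keep collecting
# eluate while the cumulative product purity stays above a target.
fractions = [
    # (volume_mL, product_mg, total_protein_mg) reported per online-HPLC injection
    (10, 4.0, 4.1),
    (10, 9.5, 9.8),
    (10, 12.0, 12.6),
    (10, 8.0, 8.9),
    (10, 3.0, 4.4),
]
PURITY_TARGET = 0.95  # hypothetical acceptance criterion

pooled_product = pooled_total = 0.0
pooled_fractions = []
for i, (vol, product, total) in enumerate(fractions, 1):
    # Tentatively add the next fraction and check the cumulative purity.
    purity = (pooled_product + product) / (pooled_total + total)
    if purity < PURITY_TARGET:
        print(f"stop pooling before fraction {i} (cumulative purity {purity:.3f})")
        break
    pooled_product += product
    pooled_total += total
    pooled_fractions.append(i)

print(f"pool = fractions {pooled_fractions}, purity = {pooled_product / pooled_total:.3f}")
```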

11.
Adaptation of asexual populations is driven by beneficial mutations and therefore the dynamics of this process, besides other factors, depends on the distribution of beneficial fitness effects. It is known that on uncorrelated fitness landscapes, this distribution can only be of three types: truncated, exponential and power law. We performed extensive stochastic simulations to study the adaptation dynamics on rugged fitness landscapes, and identified two quantities that can be used to distinguish the underlying distribution of beneficial fitness effects. The first quantity studied here is the fitness difference between successive mutations that spread in the population, which is found to decrease in the case of truncated distributions, remains nearly a constant for exponentially decaying distributions and increases when the fitness distribution decays as a power law. The second quantity of interest, namely, the rate of change of fitness with time also shows quantitatively different behaviour for different beneficial fitness distributions. The patterns displayed by the two aforementioned quantities are found to hold good for both low and high mutation rates. We discuss how these patterns can be exploited to determine the distribution of beneficial fitness effects in microbial experiments.

12.
Constraint-based modeling has proven to be a useful tool in the analysis of biochemical networks. To date, most studies in this field have focused on the use of linear constraints, resulting from mass balance and capacity constraints, which lead to the definition of convex solution spaces. One additional constraint arising out of thermodynamics is known as the "loop law" for reaction fluxes, which states that the net flux around a closed biochemical loop must be zero because no net thermodynamic driving force exists. The imposition of the loop law can lead to nonconvex solution spaces, making the analysis of the consequences of its imposition challenging. A four-step approach is developed here to apply the loop law to study metabolic network properties: (1) determine linear equality constraints that are necessary (but not necessarily sufficient) for thermodynamic feasibility; (2) tighten Vmax and Vmin constraints to enclose the remaining nonconvex space; (3) uniformly sample the convex space that encloses the nonconvex space using standard Monte Carlo techniques; and (4) eliminate from the resulting set all solutions that violate the loop law, leaving a subset of steady-state solutions. This subset of solutions represents a uniform random sample of the space that is defined by the additional imposition of the loop law. This approach is used to evaluate the effect of imposing the loop law on predicted candidate states of the genome-scale metabolic network of Helicobacter pylori.
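
The sketch below illustrates steps 3 and 4 in a deliberately simplified form: candidate flux vectors are drawn from box constraints, and any sample that drives flux all the way around a known internal cycle is discarded. The toy network, bounds, and sign-consistency test are assumptions for illustration, not the formulation used for the H. pylori model.

```python
# Simplified sketch of steps 3-4: draw candidate flux vectors from box constraints,
# then discard any sample that circulates net flux around a known internal loop
# (all loop reactions active and directionally consistent with the cycle).
import numpy as np

rng = np.random.default_rng(0)
n_reactions, n_samples = 4, 10_000
v_min, v_max = -10.0, 10.0

# Reactions 0, 1, 2 form an internal cycle; entry 0 means "not in the loop".
loop = np.array([1, 1, 1, 0])

samples = rng.uniform(v_min, v_max, size=(n_samples, n_reactions))

def violates_loop_law(v, loop, tol=1e-9):
    in_loop = loop != 0
    directed = np.sign(v[in_loop]) * np.sign(loop[in_loop])
    # Flux circulates if every loop reaction is active and oriented the same way
    # around the cycle (either all forward or all backward).
    active = np.all(np.abs(v[in_loop]) > tol)
    return active and (np.all(directed > 0) or np.all(directed < 0))

kept = np.array([v for v in samples if not violates_loop_law(v, loop)])
print(f"kept {len(kept)} of {n_samples} samples after applying the loop-law filter")
```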

13.
Using a measure of how differentially expressed a gene is in two biochemically/phenotypically different conditions, we can rank all genes in a microarray dataset. We have shown that the fall-off of this measure (normalized maximum likelihood in a classification model such as logistic regression) as a function of rank is typically a power-law function. Such power-law behaviour in ranked plots is known as Zipf's law, observed in many natural and social phenomena. The presence of this power law prevents an intrinsic cutoff point between the "important" genes and the "irrelevant" genes. We have shown that similar power-law functions are also present in permuted datasets, and we provide an explanation based on the well-known chi-squared distribution of likelihood ratios. We discuss the implications of this Zipf's law for gene selection in microarray data analysis, as well as other characterizations of ranked likelihood plots such as the rate of fall-off of the likelihood.
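
A small sketch of the ranked-plot analysis: generate chi-squared-distributed scores as a stand-in for likelihood ratios from a permuted dataset, rank them, and fit the fall-off on log-log axes. The sample size and fitting window are arbitrary choices for illustration.

```python
# Sketch: rank genes by a differential-expression statistic and check the Zipf-like
# fall-off by fitting a line to log(statistic) versus log(rank). The "statistic"
# here is a chi-squared-distributed surrogate (1 df), mimicking a permuted dataset.
import numpy as np

rng = np.random.default_rng(1)
stat = rng.chisquare(df=1, size=5000)   # stand-in for likelihood-ratio scores
ranked = np.sort(stat)[::-1]            # rank 1 = most "differentially expressed"

ranks = np.arange(1, len(ranked) + 1)
# Fit over the mid-range of ranks so the extreme tails do not dominate the fit.
window = slice(10, 2000)
slope, _ = np.polyfit(np.log(ranks[window]), np.log(ranked[window]), 1)
print(f"power-law exponent of the fall-off = {slope:.2f}")
```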

14.
Division of labor is one of the primary adaptations of sociality and the focus of much theoretical work on self-organization. This work has been hampered by the lack of a quantitative measure of division of labor that can be applied across systems. We divide Shannon's mutual entropy by marginal entropy to quantify division of labor, rendering it robust over changes in number of individuals or tasks. Reinterpreting individuals and tasks makes this methodology applicable to a wide range of other contexts, such as breeding systems and predator-prey interactions.
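
The measure lends itself to a direct calculation; the sketch below computes the mutual information between individuals and tasks from a made-up count matrix and normalizes it by the task marginal entropy (the paper's exact normalization choice may differ).

```python
# Sketch: division of labor as Shannon mutual information between individuals and
# tasks, normalized by the marginal task entropy. The count matrix (rows =
# individuals, columns = tasks) is a made-up example.
import numpy as np

counts = np.array([
    [30,  2,  1],   # individual 1 mostly does task 1
    [ 3, 25,  2],   # individual 2 mostly does task 2
    [ 1,  4, 28],   # individual 3 mostly does task 3
], dtype=float)

p = counts / counts.sum()   # joint distribution p(individual, task)
p_ind = p.sum(axis=1)       # marginal over individuals
p_task = p.sum(axis=0)      # marginal over tasks

def entropy(q):
    q = q[q > 0]
    return -np.sum(q * np.log2(q))

mutual_info = entropy(p_ind) + entropy(p_task) - entropy(p.ravel())
division_of_labor = mutual_info / entropy(p_task)      # normalize by a marginal entropy
print(f"division of labor = {division_of_labor:.3f}")  # 1 = complete specialization
```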

15.
There have been many different approaches employed to define the "consensus" sequence of various DNA binding sites and to use the definition obtained to locate and rank members of a given sequence family. The analysis presented here enlists two of these approaches, each in modified form, to develop a highly efficient search protocol for Escherichia coli promoters and to provide a relative ranking of these sites showing good agreement with in vitro measurements of promoter strength. Schneider et al. have applied Shannon's index of information content to evaluate the significance of each position within the consensus of a family of aligned sequences. In a formal sense, this index is only applicable to a group of sequences, providing at each position a negative entropy value between zero (random) and two bits (total conservation of a single base) for sequences in which all bases are equally represented. A method for evaluating how well an individual sequence conforms to the information content pattern of the consensus is described. A function is derived, by analogy to the information content of the sequence family, for application to individual sequences. Since this function is a measure of conformity, it can be used in a search protocol to identify new members of the family represented by the consensus. A protocol for locating E. coli promoters is presented. The Berg-von Hippel statistical-mechanical function is also tested in a similar application. While the information content function provides a superior search protocol, the Berg-von Hippel function, when scaled at each position by the information content, does well at ranking promoters according to their strength as measured in vitro.
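
A rough sketch of the ingredients: per-position information content of a toy alignment and a simple information-weighted conformity score for candidate sequences. The alignment is a -10-box-like example and the scoring function is only an illustration, not the modified function derived in the paper.

```python
# Sketch: per-position information content of aligned sites (Schneider-style) and a
# simple conformity score for an individual candidate sequence.
import math
from collections import Counter

sites = ["TATAAT", "TATACT", "TACAAT", "TATGAT", "CATAAT"]  # toy aligned sites
bases = "ACGT"
n_pos = len(sites[0])

# Position frequency matrix with a small pseudocount to avoid log(0).
freqs = []
for j in range(n_pos):
    counts = Counter(s[j] for s in sites)
    total = len(sites) + 4 * 0.5
    freqs.append({b: (counts.get(b, 0) + 0.5) / total for b in bases})

# Information content per position: 2 - H_j bits, assuming equal base composition.
info = [2.0 + sum(f[b] * math.log2(f[b]) for b in bases) for f in freqs]
print("information content per position:", [round(x, 2) for x in info])

def conformity(seq):
    # Weight each position's log-odds by its information content so the most
    # conserved positions dominate the score (an illustrative choice).
    return sum(info[j] * math.log2(freqs[j][seq[j]] * 4) for j in range(n_pos))

for candidate in ["TATAAT", "TGCAGT"]:
    print(candidate, round(conformity(candidate), 2))
```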

16.
Lugo JE  Doti R  Faubert J 《PloS one》2011,6(4):e17188

Background

Photonic crystals are artificial structures with periodic dielectric components of different refractive indices. Under certain conditions they refract light anomalously, a phenomenon called negative refraction. Here we experimentally characterize negative refraction in a one-dimensional photonic crystal structure near the low-frequency edge of the fourth photonic bandgap. We compare the experimental results with current theory and a theory based on the group velocity developed here. We also analytically derive the negative refraction correctness condition that gives the angular region where negative refraction occurs.

Methodology/Principal Findings

By using standard photonic techniques we experimentally determined the relationship between incidence and negative refraction angles and found the negative refraction range by applying the correctness condition. In order to compare both theories with experimental results, an output refraction correction was utilized. The correction uses Snell's law and an effective refractive index based on two effective dielectric constants. We found good agreement between experiment and both theories in the negative refraction zone.

Conclusions/Significance

Since both theories and the experimental observations agreed well in the negative refraction region, we can use both negative refraction theories plus the output correction to predict negative refraction angles. This can be very useful from a practical point of view for space filtering applications such as a photonic demultiplexer or for sensing applications.
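
The output correction can be pictured as a short calculation: form an effective refractive index from two effective dielectric constants and apply Snell's law at the exit face. The layer parameters and the thickness-weighted averaging below are assumptions for illustration, not the values or the exact effective-index definition used in the paper.

```python
# Sketch of the output-refraction correction idea: take an internal propagation
# angle and apply Snell's law at the exit face using an effective refractive index
# built from two effective dielectric constants (all values are hypothetical).
import math

eps1, d1 = 2.25, 1.2e-3   # layer 1: dielectric constant and thickness (m)
eps2, d2 = 1.00, 0.8e-3   # layer 2
n_eff = math.sqrt((eps1 * d1 + eps2 * d2) / (d1 + d2))  # assumed averaging scheme

theta_internal_deg = -18.0            # negative angle = negatively refracted beam
theta_internal = math.radians(theta_internal_deg)

# Exit into air (n = 1): n_eff * sin(theta_internal) = 1 * sin(theta_out)
s = n_eff * math.sin(theta_internal)
if abs(s) <= 1.0:
    theta_out = math.degrees(math.asin(s))
    print(f"n_eff = {n_eff:.3f}, corrected output angle = {theta_out:.1f} deg")
else:
    print("total internal reflection: no propagating output beam")
```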

17.
This paper considers a model of the human cardiovascular-respiratory control system with one and two transport delays in the state equations describing the respiratory system. The effectiveness of the control of the ventilation rate is influenced by such transport delays because blood gases must be transported a physical distance from the lungs to the sensory sites where these gases are measured. The short-term cardiovascular control system does not involve such transport delays, although delays do arise in other contexts such as the baroreflex loop (see [46]), for example. This baroreflex delay is not considered here. The interaction between heart rate, blood pressure, cardiac output, and blood vessel resistance is quite complex and, given the limited knowledge available of this interaction, we will model the cardiovascular control mechanism via an optimal control derived from control theory. This control will be stabilizing and is a reasonable approach based on mathematical considerations, further motivated by the observation that many physiologists cite optimization as a potential influence in the evolution of biological systems (see, e.g., Kenner [29] or Swan [62]). In this paper we adapt a model previously considered (Timischl [63] and Timischl et al. [64]) to include the effects of one and two transport delays. We first implement an optimal control for the combined cardiovascular-respiratory model with one state space delay. We then consider the effects of a second delay in the state space by modeling the respiratory control via an empirical formula with delay, while the complex relationships in the cardiovascular control are still modeled by optimal control. This second transport delay, associated with the sensory system of the respiratory control, plays an important role in respiratory stability. As an application of this model we consider congestive heart failure, where this transport delay is larger than normal, and the transition from the quiet awake state to stage 4 (NREM) sleep. The model can be used to study the interaction between cardiovascular and respiratory function in various situations as well as to consider the influence of optimal function in physiological control system performance. Supported by FWF (Austria) under grant F310 as a subproject of the Special Research Center F003 Optimization and Control. Mathematics Subject Classification (2000): 92C30, 49J15.

18.
The pharmacodynamic potency of a therapeutic cytokine interacting with a cell-surface receptor can be attributed primarily to three central properties: [1] cytokine/receptor binding affinity, [2] cytokine/receptor endocytic trafficking dynamics, and [3] cytokine/receptor signaling. Thus, engineering novel or second-generation cytokines requires an understanding of the contribution of each of these to the overall cell response. We describe here an efficient method toward this goal in demonstrated application to the clinically important cytokine granulocyte colony-stimulating factor (GCSF) with a chemical analogue and a number of genetic mutants. Using a combination of simple receptor-binding and dose-response proliferation assays we construct an appropriately scaled plot of relative mitogenic potency versus ligand concentration normalized by binding affinity. Analysis of binding and proliferation data in this manner conveniently indicates which of the cytokine properties-binding, trafficking, and/or signaling-are contributing substantially to altered potency effects. For the GCSF analogues studied here, two point mutations as well as a poly(ethylene glycol) chemical conjugate were found to have increased potencies despite comparable or slightly lower affinities, and trafficking was predicted to be the responsible mechanism. A third point mutant exhibiting comparable binding affinity but reduced potency was predicted to have largely unchanged trafficking properties. Surprisingly, another mutant possessing an order-of-magnitude weaker binding affinity displayed enhanced potency, and increased ligand half-life was predicted to be responsible for this net beneficial effect. Each of these predictions was successfully demonstrated by subsequent measurements of depletion of these five analogues from cell culture medium. Thus, for the GCSF system we find that ligand trafficking dynamics can play a major role in regulating mitogenic potency. Our results demonstrate that cytokine analogues can exhibit pharmacodynamic behaviors across a diverse spectrum of "binding-potency space" and that our analysis through normalization can efficiently elucidate hypotheses for the underlying mechanisms for further dedicated testing. We have also extended the Black-Leff model of pharmacological agonism to include trafficking effects along with binding and signaling, and this model provides a framework for parsing the effects of these factors on pharmacodynamic potency.

19.
High throughput confocal imaging poses challenges in the computational image analysis of complex subcellular structures such as the microtubule cytoskeleton. Here, we developed CellArchitect, an automated image analysis tool that quantifies changes to subcellular patterns illustrated by microtubule markers in plants. We screened microtubule-targeted herbicides and demonstrate that high throughput confocal imaging with integrated image analysis by CellArchitect can distinguish effects induced by the known herbicides indaziflam and trifluralin. The same platform was used to examine 6 other compounds with herbicidal activity, and at least 3 different effects induced by these compounds were profiled. We further show that CellArchitect can detect subcellular patterns tagged by actin and endoplasmic reticulum markers. Thus, the platform developed here can be used to automate image analysis of complex subcellular patterns for purposes such as herbicide discovery and mode of action characterisation. The capacity to use this tool to quantitatively characterize cellular responses lends itself to application across many areas of biology.

20.
In the analysis of arterial branching the classical "cube law" has provided a working model for the relation between the diameter of a blood vessel and the flow which the vessel carries on a long-term basis. The law has shown good agreement with biological data, but questions remain regarding its applicability to all levels of the arterial tree. The present study tests the hypothesis that the cube law may not be valid in the first few generations of the arterial tree, where vessel capacitance and gross anatomy may play important roles. Biological data have shown some support for this hypothesis in the past, but the heterogeneity characteristic of past data has not allowed a conclusive test so far. We present new data which have been obtained from the same location on the arterial tree and in sufficient number to make this test possible for the first time. Also, while past tests have been based primarily on correlation of the measured data with an assumed power law, we show here that this can be misleading. The present data allow a simpler test which does not involve correlation and which leads to more direct conclusions. For the vessels surveyed, the results show unequivocally that the relation between diameter and flow is governed by a "square law" rather than the classical cube law. Coupled with past findings, this suggests that the square law may apply at the first few levels of the arterial tree, while the cube law continues from there to perhaps the precapillary levels.
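
The test reduces to estimating the exponent k in flow ∝ diameter^k; a log-log fit such as the sketch below distinguishes a square law (k near 2) from a cube law (k near 3). The diameter-flow pairs are fabricated for illustration and are not the measurements reported in the study.

```python
# Sketch: test the square law against the cube law by fitting the exponent k in
# flow ~ diameter**k on log-log axes. The (diameter, flow) pairs are made up to
# illustrate the procedure only.
import numpy as np

diameter_mm = np.array([2.0, 2.5, 3.1, 3.8, 4.6, 5.5])
flow_ml_min = np.array([55.0, 88.0, 130.0, 200.0, 290.0, 420.0])

k, log_c = np.polyfit(np.log(diameter_mm), np.log(flow_ml_min), 1)
print(f"fitted exponent k = {k:.2f} (square law predicts 2, cube law predicts 3)")
```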
