Similar Documents (20 results)
1.
This study is motivated by a real problem encountered in the manufacturing and distribution process at a local electronics manufacturer of security devices. We investigate the impact of operations redesign (i.e., operations merging) on the cost of safety stock in a supply chain. A simple safety stock method is used to derive a model for estimating safety stock levels. Our results show that operations redesign can have a significant impact on safety stock investment. We extend and complement the existing literature in the following aspects: (i) we address the issue of safety stock deployment, i.e., we not only investigate how many operations should be delayed, but also determine which operations need to be delayed; (ii) we provide an efficient heuristic algorithm to determine which operations need to be merged; and (iii) we find the optimal operations redesign strategies for some special cases. Our analysis also reveals some important conditions and insights for better operations redesign, which enable us not only to decide when an operations redesign is appropriate, but also to suggest the scale and format of the redesign.
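To make the safety-stock mechanics concrete, here is a minimal sketch using the textbook formula SS = z·σ·√L and a two-variant pooling example; the formula, correlation value, and all numbers are illustrative assumptions, not the paper's model.

```python
import math

def safety_stock(z: float, sigma_d: float, lead_time: float) -> float:
    """Textbook safety-stock formula: z * sigma_d * sqrt(L).
    z: service-level factor, sigma_d: per-period demand std dev,
    lead_time: replenishment lead time in periods."""
    return z * sigma_d * math.sqrt(lead_time)

# Risk pooling: merging two end-item stages into one generic stage
# holds one pooled stock instead of two dedicated ones.
z = 1.65                       # ~95% cycle service level
sigma_a, sigma_b = 40.0, 30.0  # demand std devs of two variants
L = 4.0                        # lead time (weeks)

dedicated = safety_stock(z, sigma_a, L) + safety_stock(z, sigma_b, L)
rho = 0.2                      # demand correlation between variants
sigma_pooled = math.sqrt(sigma_a**2 + sigma_b**2 + 2*rho*sigma_a*sigma_b)
pooled = safety_stock(z, sigma_pooled, L)

print(f"dedicated: {dedicated:.1f}, pooled: {pooled:.1f}")
# Pooled stock is lower whenever the two demands are not perfectly correlated.
```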

2.
The computational task of protein structure prediction is believed to require exponential time, but previous arguments as to its intractability have taken into account only the size of a protein's conformational space. Such arguments do not rule out the possible existence of an algorithm, more selective than exhaustive search, that is efficient and exact. (An efficient algorithm is one that is guaranteed, for all possible inputs, to run in time bounded by a function polynomial in the problem size. An intractable problem is one for which no efficient algorithm exists.) Questions regarding the possible intractability of problems are often best answered using the theory of NP-completeness. In this treatment we show the NP-hardness of two typical mathematical statements of empirical potential energy function minimization of macromolecules. Unless all NP-complete problems can be solved efficiently, these results imply that a function minimization algorithm can be efficient for protein structure prediction only if it exploits protein-specific properties that prohibit the simple geometric constructions that we use in our proofs. Analysis of further mathematical statements of molecular structure prediction could constitute a systematic methodology for identifying sources of complexity in protein folding, and for guiding development of predictive algorithms.

3.
MOTIVATION: In a wide range of experimental techniques in biology, there is a need for an efficient method to calculate the melting temperature of pairings of two single DNA strands. Avoiding cross-hybridization when choosing primers for the polymerase chain reaction or selecting probes for large-scale DNA assays are examples where the exact determination of melting temperatures is important. Beyond being exact, the method has to be efficient, as these techniques often require the simultaneous calculation of melting temperatures of up to millions of possible pairings. The problem is to simultaneously determine the most stable alignment of two sequences, including potential loops and bulges, and calculate the corresponding melting temperature. RESULTS: As the melting temperature can be expressed as a fraction in terms of enthalpy and entropy differences of the corresponding annealing reaction, we propose to use a fractional programming algorithm, the Dinkelbach algorithm, to solve the problem. To calculate the required differences of enthalpy and entropy, the Nearest Neighbor model is applied. Using this model, the substeps of the Dinkelbach algorithm in our problem setting turn out to be calculations of alignments which optimize an additive score function. Thus, the usual dynamic programming techniques can be applied. The result is an efficient algorithm to determine melting temperatures of two DNA strands, suitable for large-scale applications such as primer or probe design. AVAILABILITY: The software is available for academic purposes from the authors. A web interface is provided at http://www.zaik.uni-koeln.de/bioinformatik/fptm.html
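The Dinkelbach iteration for maximizing the ratio ΔH/ΔS is compact. The sketch below assumes a finite list of candidate alignments, each summarized by illustrative (|ΔH|, |ΔS|) magnitudes with positive denominators; in the actual method the inner maximization is a dynamic-programming alignment under the Nearest Neighbor model, and concentration-dependent entropy corrections are folded into ΔS.

```python
def dinkelbach(alignments, tol=1e-9, max_iter=100):
    """Dinkelbach iteration for maximizing a ratio dH(x)/dS(x) over a
    finite candidate set.  Each candidate is a (dH, dS) pair with
    dS > 0; in the real method the inner maximization is an alignment
    computed by dynamic programming with the additive score
    dH - lam*dS under the Nearest Neighbor model."""
    lam = alignments[0][0] / alignments[0][1]   # initial ratio guess
    best = alignments[0]
    for _ in range(max_iter):
        # Inner step: maximize the *additive* objective dH - lam*dS.
        best = max(alignments, key=lambda hs: hs[0] - lam * hs[1])
        gap = best[0] - lam * best[1]
        if abs(gap) < tol:
            return lam, best        # lam is the optimal ratio
        lam = best[0] / best[1]     # Dinkelbach update
    return lam, best

# Illustrative |dH| (kcal/mol) and |dS| (kcal/(mol*K)) magnitudes.
cands = [(85.0, 0.25), (92.0, 0.26), (70.0, 0.19)]
tm, best = dinkelbach(cands)
print(tm, best)   # highest dH/dS ratio ~ melting temperature in K
```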

4.
Selecting a control group that is perfectly matched for ethnic ancestry with a group of affected individuals is a major problem in studying the association of a candidate gene with a disease. This problem can be avoided by a design that uses parental data in place of nonrelated controls. Schaid and Sommer presented two new methods for the statistical analysis using this approach: (1) a likelihood method (Hardy-Weinberg equilibrium [HWE] method), which rests on the assumption that HWE holds, and (2) a conditional likelihood method (conditional on parental genotype [CPG] method) appropriate when HWE is absent. Schaid and Sommer claimed that the CPG method can be more efficient than the HWE method, even when equilibrium holds. It can be shown, however, that in the equilibrium situation the HWE method is always more efficient than the CPG method. For a dominant disease, the differences are slim. But for a recessive disease, the CPG method requires a much larger sample size to achieve a prescribed power than the HWE method. Additionally, we show how the relative risks for the various candidate-gene genotypes can be estimated without relying on iterative methods. For the CPG method, we present an asymptotic power approximation that is sufficiently precise for planning the sample size of an association study.
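For intuition on how an asymptotic power approximation supports sample-size planning, here is a generic normal-approximation sketch; it illustrates the principle only and is not Schaid and Sommer's (or this paper's) specific CPG formula.

```python
from statistics import NormalDist

def required_n(effect, alpha=0.05, power=0.8):
    """Generic normal-approximation sample size for a one-sided test
    with standardized effect size `effect` per sqrt(n):
        n = ((z_{1-alpha} + z_{power}) / effect) ** 2.
    This shows how an asymptotic power approximation lets one plan a
    study without simulation; the paper's CPG-specific derivation is
    not reproduced here."""
    z_a = NormalDist().inv_cdf(1 - alpha)
    z_b = NormalDist().inv_cdf(power)
    return ((z_a + z_b) / effect) ** 2

# A less efficient analysis has a smaller standardized effect per
# subject, hence requires a larger sample for the same power.
print(required_n(0.20))   # more efficient method
print(required_n(0.12))   # less efficient method: larger n
```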

5.
Total generalized variation (TGV)-based computed tomography (CT) image reconstruction, which utilizes high-order image derivatives, is superior to total variation-based methods in terms of preserving edge information and suppressing unfavorable staircase effects. However, conventional TGV regularization employs an l1-based form, which is not the most direct way to promote sparsity. In this study, we propose a total generalized p-variation (TGpV) regularization model to improve the sparsity exploitation of TGV and offer efficient solutions to few-view CT image reconstruction problems. To solve the nonconvex optimization problem of the TGpV minimization model, we present an efficient iterative algorithm based on alternating minimization of the augmented Lagrangian function. All of the resulting subproblems decoupled by variable splitting admit explicit solutions obtained by applying the alternating minimization method and a generalized p-shrinkage mapping. In addition, approximate solutions that can be easily implemented and quickly calculated through the fast Fourier transform are derived using the proximal point method to reduce the cost of the inner subproblems. Simulated and real data are qualitatively and quantitatively evaluated to validate the accuracy, efficiency, and feasibility of the proposed method. Overall, the proposed method performs well and outperforms the original TGV-based method when applied to few-view problems.
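The generalized p-shrinkage mapping at the heart of the subproblem solutions fits in a few lines. The sketch below uses the common Chartrand-style form of p-shrinkage; the exact variant used in the paper may differ.

```python
import numpy as np

def p_shrink(x, lam, p):
    """Generalized p-shrinkage mapping (Chartrand-style):
    shrink_p(x) = sign(x) * max(|x| - lam**(2-p) * |x|**(p-1), 0).
    For p = 1 this reduces to ordinary soft thresholding; for p < 1
    large coefficients are penalized less, promoting stronger sparsity."""
    mag = np.abs(x)
    # suppress divide-by-zero warnings from |x|**(p-1) at x == 0
    with np.errstate(divide="ignore", invalid="ignore"):
        shrunk = np.maximum(mag - lam**(2 - p) * np.power(mag, p - 1), 0)
    return np.sign(x) * np.nan_to_num(shrunk)

grad = np.array([-2.0, -0.3, 0.0, 0.3, 2.0])
print(p_shrink(grad, lam=0.5, p=1.0))   # ordinary soft thresholding
print(p_shrink(grad, lam=0.5, p=0.5))   # p-shrinkage: keeps big entries
```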

6.
7.
R N Lewis  N Mak  R N McElhaney 《Biochemistry》1987,26(19):6118-6126
The thermotropic phase behavior of a series of 1,2-diacylphosphatidylcholines containing linear saturated acyl chains of 10-22 carbons was studied by differential scanning calorimetry. When fully hydrated and thoroughly equilibrated by prolonged incubation at appropriate low temperatures, all of the compounds studied form an apparently stable subgel phase (the Lc phase). The formation of the stable Lc phase is a complex process which apparently proceeds via a number of metastable intermediates after being nucleated by incubation at appropriate low temperatures. The process of Lc phase formation is subject to considerable hysteresis, and our observations indicate that the kinetic limitations become more severe as the length of the acyl chain increases. The kinetics of Lc phase formation also depend upon whether the acyl chains contain an odd or an even number of carbon atoms. The Lc phase is unstable at higher temperatures and upon heating converts to the so-called liquid-crystalline state (the L alpha phase). The conversion from the stable Lc to the L alpha phase can be a direct, albeit multistage, process, as observed with very short chain phosphatidylcholines, or one or more stable gel states may exist between the Lc and L alpha states. For the longer chain compounds, conversions from one stable gel phase to another become separated on the temperature scale, so that discrete subtransition, pretransition, and gel/liquid-crystalline phase transition events are observed. (ABSTRACT TRUNCATED AT 250 WORDS)

8.
MOTIVATION: Structure-based protein redesign can help engineer proteins with desired novel function. Improving computational efficiency while still maintaining the accuracy of the design predictions has been a major goal for protein design algorithms. The combinatorial nature of protein design results both from allowing residue mutations and from the incorporation of protein side-chain flexibility. Under the assumption that a single conformation can model protein folding and binding, the goal of many algorithms is the identification of the Global Minimum Energy Conformation (GMEC). A dominant theorem for the identification of the GMEC is Dead-End Elimination (DEE). DEE-based algorithms have proven capable of eliminating the majority of candidate conformations, while guaranteeing that only rotamers not belonging to the GMEC are pruned. However, when the protein design process incorporates rotameric energy minimization, DEE is no longer provably accurate. Hence, with energy minimization, the minimized-DEE (MinDEE) criterion must be used instead. RESULTS: In this paper, we present provably-accurate improvements to both the DEE and MinDEE criteria. We show that our novel enhancements result in a speedup of up to a factor of more than 1000 when applied in redesign for three different proteins: Gramicidin Synthetase A, plastocyanin, and protein G. AVAILABILITY: Contact authors for source code.
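For reference, the original (non-minimized) DEE pruning condition can be sketched directly: a rotamer is dead-ending if its best-case energy exceeds some competitor's worst-case energy. The nested-dict data layout below is purely illustrative.

```python
def dee_prune(E_self, E_pair):
    """One pass of the original Dead-End Elimination criterion: prune
    rotamer r at position i if some competitor t satisfies
      E(i_r) + sum_j min_s E(i_r, j_s) > E(i_t) + sum_j max_s E(i_t, j_s),
    i.e. r's best case is worse than t's worst case, so r cannot be part
    of the GMEC.  E_self[i][r] is a self energy; E_pair[i][j][r][s] a
    pairwise energy."""
    pruned = set()
    positions = list(E_self)
    for i in positions:
        for r in E_self[i]:
            best_r = E_self[i][r] + sum(
                min(E_pair[i][j][r][s] for s in E_self[j])
                for j in positions if j != i)
            for t in E_self[i]:
                if t == r:
                    continue
                worst_t = E_self[i][t] + sum(
                    max(E_pair[i][j][t][s] for s in E_self[j])
                    for j in positions if j != i)
                if best_r > worst_t:
                    pruned.add((i, r))
                    break
    return pruned

# Toy two-position example with two rotamers each.
E_self = {0: {"a": 1.0, "b": 5.0}, 1: {"a": 0.0, "b": 0.5}}
E_pair = {0: {1: {"a": {"a": -1.0, "b": -0.5},
                  "b": {"a":  2.0, "b":  3.0}}},
          1: {0: {"a": {"a": -1.0, "b": 2.0},
                  "b": {"a": -0.5, "b": 3.0}}}}
print(dee_prune(E_self, E_pair))   # rotamer "b" at position 0 is dead-ending
```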

9.
A simple and very efficient protein design strategy is proposed by developing some recently introduced theoretical tools which have been successfully applied to exactly solvable protein models. The design approach is implemented by using three amino acid classes and it is based on the minimization of an appropriate energy function. For a given native state the results of the design procedure are compared, through a statistical analysis, with the properties of an ensemble of sequences folding in the same conformation. If the success rate is computed on those sites designed with high confidence, it can be as high as 80%. The method is also able to identify key sites for the folding process: results for 2ci2 and barnase are in very good agreement with experimental results.
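A generic stand-in for design by energy minimization over a three-class alphabet is a Metropolis Monte Carlo search in sequence space for a fixed native contact map; the contact-energy table and parameters below are invented for illustration and are not the paper's energy function.

```python
import random, math

ALPHABET = "HPN"                      # three illustrative residue classes
# Hypothetical contact energies between classes (more negative = better).
E_CONTACT = {("H", "H"): -2.0, ("H", "P"): -0.5, ("P", "P"): -1.0,
             ("H", "N"):  0.0, ("P", "N"): -0.5, ("N", "N"):  0.5}

def contact_energy(a, b):
    return E_CONTACT.get((a, b), E_CONTACT.get((b, a), 0.0))

def energy(seq, contacts):
    """Energy of a sequence threaded onto a fixed native contact map."""
    return sum(contact_energy(seq[i], seq[j]) for i, j in contacts)

def design(contacts, length, beta=2.0, steps=20000):
    """Metropolis Monte Carlo minimization over sequence space for a
    fixed target conformation -- a generic stand-in for the paper's
    design-by-energy-minimization strategy."""
    seq = [random.choice(ALPHABET) for _ in range(length)]
    e = energy(seq, contacts)
    for _ in range(steps):
        i = random.randrange(length)
        old = seq[i]
        seq[i] = random.choice(ALPHABET)
        e_new = energy(seq, contacts)
        if e_new <= e or random.random() < math.exp(-beta * (e_new - e)):
            e = e_new                 # accept the mutation
        else:
            seq[i] = old              # reject: restore previous residue
    return "".join(seq), e

native_contacts = [(0, 5), (1, 4), (2, 7), (3, 6)]  # toy contact map
print(design(native_contacts, length=8))
```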

10.
Chen LH, Lee WC. PLoS ONE 2011, 6(12):e28604
Randomization is a hallmark of clinical trials. If a trial entails very few subjects and has many prognostic factors (or many factor levels) to be balanced, minimization is a more efficient method to achieve balance than simple randomization. We propose a novel minimization method, the 'two-way minimization'. The method separately calculates the 'imbalance in the total numbers of subjects' and the 'imbalance in the distributions of prognostic factors'. To allocate a subject, it then chooses probabilistically to minimize one of these two aspects of imbalance. As such, it is a method that is both treatment-adaptive and covariate-adaptive. We perform Monte Carlo simulations to examine its statistical properties. The two-way minimization (with proper regression adjustment of the force-balanced prognostic factors) has the correct type I error rates. It also produces point estimates that are unbiased and variance estimates that are accurate. When there are important prognostic factors to be balanced in the study, the method achieves the highest power and the smallest variance among randomization methods that are resistant to selection bias. The allocation can be done in real time and the subsequent data analysis is straightforward. The two-way minimization is recommended for balancing prognostic factors in small trials.
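A minimal sketch of the two-way idea, choosing probabilistically between a treatment-adaptive move and a covariate-adaptive move; the imbalance measures, the probability q, and the data layout are illustrative simplifications, not the authors' exact formulas.

```python
import random

def two_way_minimize(new_factors, arms, q=0.7):
    """Sketch of two-way minimization: compute the imbalance in total
    subject counts and the imbalance in prognostic-factor distributions
    separately, then probabilistically pick which one to minimize for
    this allocation (probability q favors the factor-balance move).
    `arms` maps arm name -> list of factor dicts already allocated."""
    def factor_match_count(arm):
        # how many already-allocated subjects in `arm` share the new
        # subject's factor levels (a crude marginal imbalance proxy)
        return sum(sum(1 for s in arms[arm] if s[f] == lvl)
                   for f, lvl in new_factors.items())

    if random.random() < q:
        # covariate-adaptive: join the arm with fewer matching levels
        pick = min(arms, key=factor_match_count)
    else:
        # treatment-adaptive: join the smaller arm
        pick = min(arms, key=lambda a: len(arms[a]))
    arms[pick].append(dict(new_factors))
    return pick

arms = {"control": [], "experimental": []}
for sex, age in [("F", "old"), ("M", "young"), ("F", "young"), ("F", "old")]:
    print(two_way_minimize({"sex": sex, "age": age}, arms))
```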

11.
For the intensively studied vehicle routing problem (VRP), two real-life restrictions have received only minor attention in the VRP literature: traffic congestion and driving hours regulations. Traffic congestion causes late arrivals at customers and long travel times, resulting in high transport costs. To account for traffic congestion, time-dependent travel times should be considered when constructing vehicle routes. In addition, driving hours regulations, which restrict the available driving and working times for truck drivers, must be respected. Since violations are severely fined, driving hours regulations should also be considered when constructing vehicle routes, all the more so in combination with congestion. The objective of this paper is to develop a solution method for the VRP with time windows (VRPTW), time-dependent travel times, and driving hours regulations. The major difficulty of this VRPTW extension is optimizing each vehicle's departure times to minimize the duty time of each driver. Compact duty times lead to cost savings. However, obtaining compact duty times is much harder when time-dependent travel times and driving hours regulations are considered. We propose a restricted dynamic programming (DP) heuristic for constructing the vehicle routes, and an efficient heuristic for optimizing each vehicle's departure times for each (partial) vehicle route, such that the complete solution algorithm runs in polynomial time. Computational experiments demonstrate the trade-off between travel distance minimization and duty time minimization, and illustrate the cost savings of extending the depot opening hours so that traveling before the morning peak and after the evening peak becomes possible.
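The restricted DP idea, expanding partial routes stage by stage while keeping only a bounded number of cheapest states, can be sketched on a static distance matrix; time-dependent travel times, time windows, and driving-hour checks would enter the state expansion in the full method.

```python
def restricted_dp(dist, depot, customers, beam_width=50):
    """Restricted dynamic programming (a beam search over DP states)
    for a single-vehicle routing sketch: expand partial routes one
    customer at a time but keep only the `beam_width` cheapest states
    per stage, so run time stays polynomial.  `dist` is a static
    matrix here; the full method would evaluate time-dependent travel
    times and regulation feasibility inside the expansion."""
    # state: (cost, current_node, visited_set, route)
    states = [(0.0, depot, frozenset(), [depot])]
    for _ in customers:
        nxt = []
        for cost, node, seen, route in states:
            for c in customers:
                if c in seen:
                    continue
                nxt.append((cost + dist[node][c], c,
                            seen | {c}, route + [c]))
        nxt.sort(key=lambda s: s[0])
        states = nxt[:beam_width]          # the "restriction"
    # close the tour back at the depot
    best = min(states, key=lambda s: s[0] + dist[s[1]][depot])
    return best[0] + dist[best[1]][depot], best[3] + [depot]

dist = [[0, 4, 7, 3], [4, 0, 2, 5], [7, 2, 0, 6], [3, 5, 6, 0]]
print(restricted_dp(dist, depot=0, customers=[1, 2, 3]))
```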

12.
We apply Constraint Network Analysis (CNA) to investigate the relationship between structural rigidity and thermostability of five citrate synthase (CS) structures over a temperature range from 37 °C to 100 °C. For the first time, we introduce an ensemble-based variant of CNA and model the temperature dependence of hydrophobic interactions in the constraint network. A very good correlation between the predicted thermostabilities of CS and optimal growth temperatures of their source organisms (R2=0.88, p=0.017) is obtained, which validates that CNA is able to quantitatively discriminate between less and more thermostable proteins even within a series of orthologs. Structural weak spots on a less thermostable CS, predicted by CNA to be in the top 5% with respect to the frequency of occurrence over an ensemble, have a higher mutation ratio in a more thermostable CS than other sequence positions. Furthermore, highly ranked weak spots that are also highly conserved with respect to the amino acid type found at that sequence position are nevertheless found to be mutated in the more stable CS. As for mechanisms at an atomic level that lead to a reinforcement of weak spots in more stable CS, we observe that the thermophilic CS achieve a higher thermostability by better hydrogen bonding networks whereas hyperthermophilic CS incorporate more hydrophobic contacts to reach the same goal. Overall, these findings suggest that CNA can be applied as a pre-filter in data-driven protein engineering to focus on residues that are highly likely to improve thermostability upon mutation.

13.
Metallo-beta-lactamases can hydrolyze a broad spectrum of beta-lactam antibiotics and thus confer resistance to bacteria. For the Pseudomonas aeruginosa enzyme IMP-1, several variants have been reported. IMP-6 and IMP-1 differ by a single residue (glycine and serine at position 196, respectively), but have significantly different substrate spectra; while the catalytic efficiency toward the two cephalosporins cephalothin and cefotaxime is similar for both variants, IMP-1 is up to 10-fold more efficient than IMP-6 toward cephaloridine and ceftazidime. Interestingly, this biochemical effect is caused by a residue remote from the active site. The substrate-specific impact of residue 196 was studied by molecular dynamics simulations using a cationic dummy atom approach for the zinc ions. Substrates were docked in an intermediate structure near the transition state to the binding site of IMP-1 and IMP-6. At a simulation temperature of 100 K, most complexes were stable during 1 ns of simulation time. However, at higher temperatures, some complexes became unstable and the substrate changed to a nonactive conformation. To model stability, six molecular dynamics simulations at 100 K were carried out for all enzyme-substrate complexes. Stable structures were further heated to 200 and 300 K. By counting stable structures, we derived a stability ranking score which correlated with experimentally determined catalytic efficiency. The use of a stability score as an indicator of catalytic efficiency of metalloenzymes is novel, and the study of substrates in a near-transition state intermediate structure is superior to the modeling of Michaelis complexes. The remote effect of residue 196 can be described by a domino effect: upon replacement of serine with glycine, a hole is created and a stabilizing interaction between Ser196 and Lys33 disappears, rendering the neighboring residues more flexible; this increased flexibility is then transferred to the active site.

14.
Molecular dynamics simulated annealing (SA-MD) simulations are frequently used for refinement and optimization of peptide and protein structures. Depending on the simulation conditions and simulation length, SA-MD simulations can be trapped in locally stable conformations far from the global optimum. As an alternative, replica exchange molecular dynamics (RexMD) simulations can be used, which allow exchanges between high and low simulation temperatures at all stages of the simulation. A significant drawback of RexMD simulations, however, is the rapid increase of the replica number with increasing system size needed to cover a desired temperature range. A combined SA-MD and RexMD approach, termed SA-RexMD, is suggested that employs a small number of replicas (4) and starts with a set of high simulation temperatures, followed by gradual cooling of the set of temperatures until a target temperature has been reached. The protocol has been applied to the folding of several peptide systems and to the refinement of protein model structures. In all cases, the SA-RexMD method turned out to be significantly more efficient at reaching low-energy structures, as well as structures close to experiment, than both continuous MD simulations at the target temperature and SA-MD simulations at the same computational demand. The approach is well suited for applications in structure refinement and for systematic force field improvement.
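The two ingredients, the replica-exchange acceptance test and a cooled temperature ladder, can be sketched as follows; the ladder spacing, cooling factor, and stage count are illustrative choices, and the MD propagation itself is omitted.

```python
import math, random

def metropolis_exchange(E_i, E_j, T_i, T_j, kB=0.0019872):  # kcal/(mol*K)
    """Standard replica-exchange acceptance test between neighboring
    replicas: accept with probability min(1, exp[(b_i - b_j)(E_i - E_j)])."""
    delta = (1.0 / (kB * T_i) - 1.0 / (kB * T_j)) * (E_j - E_i)
    return delta <= 0 or random.random() < math.exp(-delta)

def sa_rexmd_ladder(T_top=600.0, T_target=300.0, n_replicas=4,
                    n_stages=10, cool=0.9):
    """Temperature schedule in the spirit of SA-RexMD: a small ladder of
    replica temperatures, cooled stage by stage until all replicas reach
    the target temperature.  Only the schedule logic is shown."""
    spacing = (T_top / T_target) ** (1.0 / (n_replicas - 1))
    temps = [T_target * spacing**k for k in range(n_replicas)]
    for _ in range(n_stages):
        yield [max(T_target, t) for t in temps]
        temps = [max(T_target, t * cool) for t in temps]

print(metropolis_exchange(-120.0, -118.5, 300.0, 378.0))
for temps in sa_rexmd_ladder():
    print([f"{t:.0f}" for t in temps])
```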

15.
Realization of novel molecular function requires the ability to alter molecular complex formation. Enzymatic function can be altered by changing enzyme-substrate interactions via modification of an enzyme's active site. A redesigned enzyme may either perform a novel reaction on its native substrates or its native reaction on novel substrates. A number of computational approaches have been developed to address the combinatorial nature of the protein redesign problem. These approaches typically search for the global minimum energy conformation among an exponential number of protein conformations. We present a novel algorithm for protein redesign, which combines a statistical mechanics-derived ensemble-based approach to computing the binding constant with the speed and completeness of a branch-and-bound pruning algorithm. In addition, we developed an efficient deterministic approximation algorithm, capable of approximating our scoring function to arbitrary precision. In practice, the approximation algorithm decreases the execution time of the mutation search by a factor of ten. To test our method, we examined the Phe-specific adenylation domain of the nonribosomal peptide synthetase gramicidin synthetase A (GrsA-PheA). Ensemble scoring, using a rotameric approximation to the partition functions of the bound and unbound states of GrsA-PheA, is used first to predict binding of the wild-type protein and a previously described leucine-selective mutant, and second to switch the enzyme specificity toward leucine, using two novel active site sequences computationally predicted by searching through the space of possible active site mutations. The top-scoring in silico mutants were created in the wet lab, and dissociation/binding constants were determined by fluorescence quenching. These tested mutations exhibit the desired change in specificity from Phe to Leu. Our ensemble-based algorithm, which flexibly models both protein and ligand using rotamer-based partition functions, has applications in enzyme redesign, the prediction of protein-ligand binding, and computer-aided drug design.
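The ensemble score is conceptually a ratio of rotamer-ensemble partition functions for the bound and unbound states. The sketch below uses explicit toy energy lists; the real algorithm enumerates rotamers under branch-and-bound pruning with a provable approximation guarantee rather than taking energies as input.

```python
import math

R = 0.0019872      # gas constant, kcal/(mol*K)
T = 298.0          # temperature, K

def partition_fn(energies):
    """Rotamer-ensemble partition function q = sum exp(-E/RT)."""
    return sum(math.exp(-e / (R * T)) for e in energies)

def ensemble_score(E_complex, E_protein, E_ligand):
    """Ensemble binding score: ratio of the bound-state partition
    function to the product of the unbound ones.  The energy lists are
    illustrative rotamer-conformation energies (kcal/mol)."""
    return partition_fn(E_complex) / (partition_fn(E_protein)
                                      * partition_fn(E_ligand))

# Toy ensembles: a mutant whose complex ensemble is lower in energy
# scores higher, predicting tighter binding.
wt  = ensemble_score([-12.0, -11.5], [-5.0, -4.8], [-3.0])
mut = ensemble_score([-13.1, -12.6], [-5.0, -4.8], [-3.0])
print(wt, mut, mut > wt)
```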

16.
During leather manufacture, large amounts of chromium shavings, wet by-products of the leather industry, are produced worldwide. They are stable at temperatures of up to 110°C and resistant to enzymatic degradation, which prevents anaerobic digestion in a biogas plant. Hitherto, chromium shavings have not been utilized industrially to produce biogas. To enable the enzymatic degradation necessary for biogas production, the native structure must first be denatured. In our projects, chromium shavings were pre-treated thermally and mechanically by extrusion and hydrothermal methods. In previous work, we intensively studied the use of these shavings to produce biogas at batch scale, and significant improvement was reached when using pre-treated shavings. In this work, a scale-up of the process was performed in a continuous reactor using pre-treated and untreated chromium shavings to examine the feasibility of the method. By measuring different parameters along the anaerobic digestion, namely organic matter, collagen content, and volatile fatty acid content, it was possible to show that a higher methane production can be reached and a higher loading rate can be used when feeding the reactor with pre-treated rather than untreated chromium shavings, which means a more economical and efficient process in an industrial scenario.

17.
Rapid detection of salmonella in foods—a convenient two-day procedure
A new method for detecting salmonellas in foods within 42 h is described. This highly specific and sensitive selective motility procedure gave an efficiency of 96.8% when tested against traditional methods using more than 800 food samples. It is stable at ambient temperatures and can be prepared for use in 5 min.

18.
MOTIVATION: Conformational searches in molecular docking are a time-consuming process with a wide range of applications. Favorable conformations of the ligands that successfully bind with receptors are sought to form stable ligand-receptor complexes. Usually a large number of conformations are generated and their binding energies are examined. We propose adding a geometric screening phase before an energy minimization procedure so that only conformations that geometrically fit in the binding site will be prompted for energy calculation. RESULTS: Geometric screening can drastically reduce the number of conformations to be examined from millions (or higher) to thousands (or lower). The method can also handle cases when there are more variables than geometric constraints. An early-stage implementation is able to finish the geometric filtering of conformations for molecules with up to nine variables in 1 min. To the best of our knowledge, this is the first time such results are reported deterministically. CONTACT: mzhang@mdanderson.org.
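A geometric screen of this kind reduces, in essence, to cheap distance tests applied before any energy call. The sketch below uses a spherical site envelope and a clash cutoff as stand-ins for the paper's constraint-based filter.

```python
import numpy as np

def fits_binding_site(coords, site_center, site_radius,
                      min_clash=1.5, receptor_atoms=None):
    """Geometric pre-screen for a ligand conformation: accept only if
    every ligand atom lies inside a spherical binding-site envelope and
    no atom comes closer than `min_clash` angstroms to a receptor atom.
    Conformations failing this cheap test are discarded before any
    energy evaluation.  The sphere model is an illustrative
    simplification of the paper's constraint-based filter."""
    coords = np.asarray(coords)
    if np.any(np.linalg.norm(coords - site_center, axis=1) > site_radius):
        return False
    if receptor_atoms is not None:
        d = np.linalg.norm(coords[:, None, :] - receptor_atoms[None, :, :],
                           axis=2)
        if d.min() < min_clash:
            return False
    return True

ligand = np.array([[0.5, 0.0, 0.0], [1.2, 0.8, 0.0]])
pocket = np.zeros(3)
print(fits_binding_site(ligand, pocket, site_radius=3.0,
                        receptor_atoms=np.array([[4.0, 0.0, 0.0]])))
```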

19.
A reliable and accurate identification of the type of tumor is crucial to the proper treatment of cancer. In recent years, it has been shown that sparse representation (SR) by l1-norm minimization is robust to noise, outliers and even incomplete measurements, and SR has been successfully used for classification. This paper presents a new SR-based method for tumor classification using gene expression data. A set of metasamples is extracted from the training samples, and an input testing sample is then represented as a linear combination of these metasamples by the l1-regularized least squares method. Classification is achieved using a discriminating function defined on the representation coefficients. Since l1-norm minimization leads to a sparse solution, the proposed method is called metasample-based SR classification (MSRC). Extensive experiments on publicly available gene expression data sets show that MSRC is efficient for tumor classification, achieving higher accuracy than many existing representative schemes.
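The classification step of MSRC-style sparse representation can be sketched with an off-the-shelf l1 solver: fit sparse coefficients over metasample columns, then pick the class with the smallest class-restricted reconstruction residual. The metasample extraction step is assumed done upstream, and the residual rule is a common SRC-style stand-in for the paper's exact discriminating function.

```python
import numpy as np
from sklearn.linear_model import Lasso

def msrc_predict(metasamples, meta_labels, x, alpha=0.01):
    """Represent test sample `x` (genes,) as a sparse linear combination
    of metasample columns via l1-regularized least squares, then assign
    the class whose metasamples give the smallest reconstruction
    residual."""
    lasso = Lasso(alpha=alpha, fit_intercept=False, max_iter=10000)
    lasso.fit(metasamples, x)              # columns = metasamples
    coef = lasso.coef_
    residuals = {}
    for cls in set(meta_labels):
        mask = np.array([l == cls for l in meta_labels])
        c_cls = np.where(mask, coef, 0.0)  # keep only this class's coefs
        residuals[cls] = np.linalg.norm(x - metasamples @ c_cls)
    return min(residuals, key=residuals.get)

rng = np.random.default_rng(0)
D = rng.standard_normal((50, 6))           # 50 genes, 6 metasamples
labels = ["A", "A", "A", "B", "B", "B"]
x = D[:, 1] + 0.05 * rng.standard_normal(50)   # noisy class-A sample
print(msrc_predict(D, labels, x))          # expected: "A"
```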

20.
Minimization is a valuable method for allocating participants between the control and experimental arms of clinical studies. The use of minimization reduces differences that might arise by chance between the study arms in the distribution of patient characteristics such as gender, ethnicity and age. However, unlike randomization, minimization requires real-time assessment of each new participant with respect to the preceding distribution of relevant participant characteristics within the different arms of the study. For multi-site studies, this necessitates centralized computational analysis that is shared between all study locations. Unfortunately, there is no suitable freely available open source or free software that can be used for this purpose. OxMaR was developed to enable researchers in any location to use minimization for patient allocation and to access the minimization algorithm using any device that can connect to the internet, such as a desktop computer, tablet or mobile phone. The software is complete in itself and requires no special packages or libraries to be installed. It is simple to set up and run over the internet using online facilities which are very low cost or even free to the user. Importantly, it provides real-time information on allocation to the study lead or administrator and generates real-time distributed backups with each allocation. OxMaR can readily be modified and customised and can also be used for standard randomization. It has been extensively tested and has been used successfully in a low-budget multi-centre study. Hitherto, the logistical difficulties involved in minimization have precluded its use in many small studies, and this software should allow more widespread use of minimization, which should lead to studies with better matched control and experimental arms. OxMaR should be particularly valuable in low-resource settings.
