Similar Articles
20 similar articles found (search time: 15 ms)
1.
Many studies have been carried out, and many commercial software applications have been developed, to improve the performance of surface mining operations, especially the loader-truck cycle. However, relatively few studies have aimed to improve the mining process of underground mines. In underground mines, mobile mining equipment is mostly scheduled by intuition, without theoretical support for these decisions. Furthermore, in the case of unexpected events, it is hard for miners to rapidly find solutions to reschedule and to adapt to the changes. This investigation first introduces the motivation and technical background, and then the objective of the study. A decision support instrument (i.e., a schedule optimizer for mobile mining equipment) is proposed and described to address this issue. The method and related algorithms used in this instrument are presented and discussed. The proposed method was tested on a real case from the Kittilä mine in Finland. The results suggest that the proposed method can considerably improve working efficiency and reduce the working time of the underground mine.
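The optimizer itself is not specified in the abstract; as a purely illustrative sketch of the kind of dispatching decision such a tool automates, the following applies a generic longest-processing-time rule to assign tasks to interchangeable mobile machines. The task durations, machine count, and the rule are assumptions for illustration, not the paper's algorithm.

```python
import heapq

def greedy_schedule(task_durations, n_machines):
    """Assign each task to the machine that frees up earliest (LPT heuristic).

    Returns the makespan and a per-machine task list. This is a generic
    longest-processing-time dispatch rule, not the paper's optimizer.
    """
    # Sort tasks longest-first: the classic LPT rule for makespan minimization.
    tasks = sorted(enumerate(task_durations), key=lambda t: -t[1])
    heap = [(0.0, m) for m in range(n_machines)]  # (available_time, machine)
    heapq.heapify(heap)
    assignment = {m: [] for m in range(n_machines)}
    for task_id, dur in tasks:
        available, m = heapq.heappop(heap)
        assignment[m].append(task_id)
        heapq.heappush(heap, (available + dur, m))
    makespan = max(t for t, _ in heap)
    return makespan, assignment

makespan, plan = greedy_schedule([3.0, 2.5, 4.0, 1.0, 2.0], n_machines=2)
print(makespan, plan)
```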

2.
In response to the biopharmaceutical industry advancing from traditional batch operation to continuous operation, the Food and Drug Administration (FDA) has published a draft for continuous integrated biomanufacturing. This draft outlines the most important rules for establishing continuous integration. One of these rules is a thorough understanding of mass flows in the process. A computer simulation framework is developed for modeling the residence time distribution (RTD) of integrated continuous downstream processes based on a unit-by-unit modeling approach, in which unit operations are simulated one by one across the entire processing time and then combined into an integrated RTD model. The framework allows for easy addition or replacement of new unit operations, as well as quick adjustment of process parameters during evaluation of the RTD model. With this RTD model, the start-up phase to reach steady state can be accelerated, the effects of process disturbances at any stage of the process can be calculated, and virtual tracking of a section of the inlet material throughout the process is possible. A hypothetical biomanufacturing process for an antibody was chosen to showcase the RTD modeling approach.
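A minimal sketch of the unit-by-unit RTD idea: represent each unit operation by its residence time distribution and combine units in series by numerical convolution. The tanks-in-series form and the parameter values below are assumptions for illustration, not the paper's unit models.

```python
import numpy as np
from math import gamma

# RTD of n ideal stirred tanks in series with total mean residence time tau.
def tanks_in_series_rtd(t, tau, n):
    theta = n / tau
    return (theta ** n) * t ** (n - 1) * np.exp(-theta * t) / gamma(n)

dt = 0.01
t = np.arange(0, 60, dt)
units = [(5.0, 3), (8.0, 1), (4.0, 10)]   # (mean residence time, tank count) per unit
rtd = None
for tau, n in units:
    e = tanks_in_series_rtd(t, tau, n)
    # Units in series: the integrated RTD is the convolution of the unit RTDs.
    rtd = e if rtd is None else np.convolve(rtd, e)[: len(t)] * dt

print("integrated mean residence time:", np.sum(t * rtd) * dt)  # ~ 5 + 8 + 4
```

Swapping a unit then amounts to replacing one distribution in the list, which is the "easy addition or replacement" property the abstract describes.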

3.
Metal mining, or more generally mineral mining, is the base industry for the economic wealth and development of numerous countries. However, mining has a negative reputation due to complex environmental contamination problems such as SO2 and CO2 emissions and the formation of acid mine drainage (AMD), which endanger vital limited resources such as air, water, and soil. This viewpoint paper highlights the environmental problems of today's metal mining operations and explores possibilities for more sustainable future mining operations, with a focus on enhanced and optimized metal recovery systems combined with a minimization of the environmental impact. These changes depend on a change in mentality and in the mining operation process, which can already be observed in some modern mining operations. The goal for the future will be to implement these changes as the standard for all mining operations.

4.
In this article, we describe a new approach that allows the prediction of the performance of a large-scale integrated process for the primary recovery of a therapeutic antibody from an analysis of the individual unit operations and their interactions in an ultra scale-down mimic of the process. The recovery process consisted of four distinct unit operations. Using the new approach, we defined the important engineering parameters in each operation that impacted the overall recovery process, and in each case verified their effect by a combination of modelling and experimentation. Immunoglobulins were precipitated from large volumes of dilute blood plasma, and the precipitated flocs were recovered by centrifugal separation from the liquor containing contaminating proteins, including albumin. The fluid mechanical forces acting on the precipitate and the time of exposure to these forces were used to define a time-integrated fluid stress. This was used as a scaling factor to predict the properties of the precipitated flocs at large scale. In the case of centrifugation, the performance of a full-scale disc stack centrifuge was predicted. This was achieved from a computational fluid dynamics (CFD) analysis of the flow field in the centrifuge, coupled with experimental data obtained from the precipitated immunoglobulin flocs using the scale-down precipitation tank, a rotating shear device, and a standard swing-out rotor centrifuge operating under defined conditions. In this way, the performance of the individual unit operations, and their linkage, was successfully analysed from a combination of modelling and experiments. These experiments required only millilitre quantities of the process material. The overall performance of the large-scale process was predicted by tracking the changes in physical and biological properties of the key components in the system, including the size distribution of the antibody precipitates and antibody activity, through the individual unit operations in the ultra scale-down process flowsheet.
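A minimal sketch of the time-integrated fluid stress concept: integrate a stress history over the exposure time and use the result as the scaling variable in a hypothetical power-law relation between integrated stress and floc size. The correlation constants and exposure values are invented for illustration; the paper fits such relations to scale-down shear-device data.

```python
import numpy as np

def time_integrated_stress(stress_trace, dt):
    """Integrate a stress history [Pa] over the exposure time -> Pa*s."""
    return np.sum(stress_trace) * dt

# Scale-down device: constant 50 Pa for 0.2 s (values invented).
sd = time_integrated_stress(np.full(200, 50.0), dt=0.001)

# Hypothetical fitted law: mean floc size d = d0 * (integrated stress)^(-b).
d0, b = 120.0, 0.3             # um and exponent, illustrative fit constants

def predict_floc_size(s):
    return d0 * s ** (-b)

# Full-scale feed-zone exposure estimated (e.g., from CFD): 400 Pa for 0.05 s.
fs = time_integrated_stress(np.full(50, 400.0), dt=0.001)
print(f"scale-down: {predict_floc_size(sd):.1f} um, "
      f"full scale: {predict_floc_size(fs):.1f} um")
```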

5.
The bounded complexity of DNA computing   Cited by: 5 (self-citations: 0, cited by others: 5)
Garzon MH, Jonoska N, Karl SA. Bio Systems, 1999, 52(1-3): 63-72
This paper proposes a new approach to analyzing DNA-based algorithms in molecular computation. Such protocols are characterized abstractly by three elements: encoding, tube operations, and extraction. Implementing them involves encoding in a multiset of molecules assembled in a tube with a number of physical attributes. The physico-chemical state of a tube can be changed by a prescribed number of elementary operations. Based on realistic definitions of these elementary operations, we define the complexity of a DNA-based algorithm using the physico-chemical properties of each operation. We show that new algorithms for the Hamiltonian path problem are about twice as efficient as Adleman's original one, and that a recent algorithm for Max-Clique provides a similar increase in efficiency. Consequences of this approach for tube complexity and DNA computing are discussed.

6.
Information Quality (IQ) is a critical factor for the success of many activities in the information age, including the development of data warehouses and the implementation of data mining. The issue of IQ risk is recognized during the process of data mining; however, there has been no formal methodological approach to dealing with it.

Consequently, it is essential to measure the risk of IQ in a data warehouse to ensure success in implementing data mining. This article presents a methodology to determine three IQ risk characteristics: accuracy, comprehensiveness, and non-membership. The methodology provides a set of quantitative models to examine how the quality risks of source information affect the quality of information outputs produced using the relational algebra operations Restriction, Projection, and Cubic product. It can be used to determine how quality risks associated with diverse data sources affect the derived data. The study also develops a data cube model and an associated algebra to support IQ risk operations.
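A minimal sketch of how quality-risk propagation through relational operations might look, assuming an independent per-tuple accuracy probability and uniform, independent attribute errors. The formulas are illustrative simplifications, not the article's exact models.

```python
# Propagate an accuracy measure through relational operations, assuming each
# source relation has an independent per-tuple accuracy probability.

def restriction_accuracy(a_in):
    # Selecting a subset of tuples does not change per-tuple accuracy.
    return a_in

def projection_accuracy(a_in, attrs_kept, attrs_total):
    # If attribute errors are independent and uniform, dropping columns
    # raises the chance that a surviving tuple is fully accurate.
    per_attr = a_in ** (1.0 / attrs_total)
    return per_attr ** attrs_kept

def product_accuracy(a1, a2):
    # A combined tuple is accurate only if both contributing tuples are.
    return a1 * a2

a_orders, a_customers = 0.95, 0.90
a_out = product_accuracy(restriction_accuracy(a_orders),
                         projection_accuracy(a_customers, 3, 5))
print(round(a_out, 4))
```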


7.
This paper presents a new approach to speed up the operation of time delay neural networks. The entire data set is collected into one long vector and then tested as a single input pattern. The proposed fast time delay neural networks (FTDNNs) use cross correlation in the frequency domain between the tested data and the input weights of the neural networks. It is proved mathematically and practically that the number of computation steps required by the presented time delay neural networks is less than that needed by conventional time delay neural networks (CTDNNs). Simulation results using MATLAB confirm the theoretical computations.
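The core speed-up is classical: a sliding dot product over all shifts equals a cross-correlation, which the FFT computes in O(N log N) instead of O(N·K). A minimal NumPy sketch (the paper's simulations used MATLAB) comparing the direct and frequency-domain computations:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(4096)   # entire input collected as one long vector
w = rng.standard_normal(64)     # input weights of one hidden neuron

# Conventional TDNN-style computation: one dot product per time shift.
direct = np.array([x[i:i + len(w)] @ w for i in range(len(x) - len(w) + 1)])

# FTDNN-style computation: cross correlation via the FFT.
n = len(x)
fast = np.fft.irfft(np.fft.rfft(x, n) * np.conj(np.fft.rfft(w, n)), n)
fast = fast[: len(x) - len(w) + 1]

print(np.allclose(direct, fast))  # True: same activations, fewer steps
```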

8.
Many real-world situations exist in which job scheduling is required. This is the case for entities, machines, or workers that have to execute certain jobs as soon as possible. Frequently, several workers or machines are not available to perform their activities during certain time periods, due to different circumstances. This paper deals with these situations and considers stochastic scheduling models to study these problems. When scheduling models are used in practice, they have to take into account that some machines may not be working. This temporary lack of machine availability is known as a breakdown, and breakdowns happen randomly at any time. The times required to repair those machines are also random variables. The jobs have operations with stochastic processing times and their own release times, and there is no precedence between them. Each job is divided into operations, and each operation is performed on the corresponding specialized machine. In addition, in the problems considered, the order in which the operations of each job are done is irrelevant. We develop a heuristic approach to solve these stochastic open-shop scheduling problems in which random machine breakdowns can happen. The proposed approach is general and does not depend on the distribution types of the random input data. It provides solutions that minimize the expected makespan. Computational experience is also reported. The results show that the proposed approach gives solid performance, finding suitable solutions with short CPU times.
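The heuristic itself is not described in the abstract; the sketch below only illustrates the problem setting, estimating by Monte Carlo simulation the expected makespan of a naive random dispatch under assumed exponential processing, breakdown, and repair distributions (the paper's approach is distribution-free).

```python
import random

def simulate_once(n_jobs, n_machines, rng):
    """One realization: naive random dispatch of all (job, machine) operations."""
    machine_free = [0.0] * n_machines
    job_free = [rng.uniform(0.0, 2.0) for _ in range(n_jobs)]   # release times
    # Open shop: each job has one operation on each machine, in any order.
    ops = [(j, m) for j in range(n_jobs) for m in range(n_machines)]
    rng.shuffle(ops)
    for j, m in ops:
        start = max(machine_free[m], job_free[j])
        work = rng.expovariate(1.0 / 3.0)        # stochastic processing time
        if rng.random() < 0.1:                   # random machine breakdown
            work += rng.expovariate(1.0 / 5.0)   # random repair time
        machine_free[m] = job_free[j] = start + work
    return max(machine_free)

rng = random.Random(42)
runs = [simulate_once(n_jobs=5, n_machines=3, rng=rng) for _ in range(2000)]
print("estimated expected makespan:", round(sum(runs) / len(runs), 2))
```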

9.
10.
Many biochemical processes consist of a sequence of operations for which optimal operating conditions (setpoints) have to be determined. If such optimization is performed for each operation separately, with respect to objectives defined for each operation individually, overall process performance is likely to be suboptimal. Interactions between unit operations have to be considered, and a single objective has to be defined for the whole process. This paper shows how a suitable optimization problem can be formulated and solved to obtain the best overall set of operating conditions for a process. A typical enzyme production process has been chosen as an example. In order to arrive at a demonstrative model for the entire sequence of unit operations, it is shown how interaction effects may be accommodated in the models. Optimal operating conditions are then determined subject to a global process objective and are shown to differ from those resulting from optimization of each separate operation. As this strategy may yield an economic benefit, it merits further research into interaction modeling and performance optimization.
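A minimal sketch of why a single global objective matters, using a made-up two-operation model in which the fermentation setpoint also affects downstream recovery; the equations and numbers are toy assumptions, not the paper's process model.

```python
from scipy.optimize import minimize, minimize_scalar

def ferment_yield(x):          # g/L leaving fermentation; peaks at x = 5
    return 10.0 * x - x ** 2

def recovery_eff(x, y):        # downstream recovery also depends on x
    return 0.9 - 0.05 * (x - 3.0) ** 2 + 0.01 * y

def overall(params):           # negative overall product output (to minimize)
    x, y = params
    return -(ferment_yield(x) * recovery_eff(x, y))

# Unit-wise: tune fermentation for its own yield, then recovery given that x.
x_seq = minimize_scalar(lambda x: -ferment_yield(x),
                        bounds=(0, 10), method="bounded").x   # x = 5
print("sequential:", round(x_seq, 2), "-> output", round(-overall([x_seq, 5.0]), 2))

# Global: tune both setpoints against the single process-wide objective.
res = minimize(overall, x0=[4.0, 1.0], bounds=[(0, 10), (0, 5)])
print("joint     :", res.x.round(2), "-> output", round(-res.fun, 2))
```

The joint optimum backs the fermentation setpoint off its stand-alone peak because the downstream step recovers more of a slightly smaller yield, which is exactly the interaction effect the abstract argues must be modeled.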

11.
A cyclic shop is a production system that repeatedly produces identical sets of parts of multiple types, called minimal part sets (MPSs), in the same loading and processing sequence. Different part types may have different machine visit sequences. We consider a version of the cyclic job shop in which some operations of an MPS instance are processed prior to some operations of the previous MPS instances. We call such a shop an overtaking cyclic job shop (OCJS). The overtaking degree specifies how many MPS instances the operations of an MPS instance can overtake. More overtaking results in more work-in-progress but, in general, reduces the cycle time. We prove that, for a given processing sequence of the operations at each machine, under some conditions, an OCJS has a stable earliest-starting schedule such that each operation starts as soon as its preceding operations are completed, the schedule repeats an identical timing pattern for each MPS instance, and the cycle time is kept minimal. To do this, we propose a specialized approach to analyzing steady states for an event graph model of an OCJS that has a cyclic structure, which preserves the MPS-based scheduling concept. Based on the steady-state results, we develop a mixed integer programming model for finding a processing sequence of the operations at each machine, and the overtaking degrees if necessary, that minimize the cycle time.
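For the steady-state step, a standard fact about timed event graphs is that the minimum cycle time of the earliest-start schedule equals the maximum cycle mean of the graph. The sketch below computes it with Karp's algorithm for the special case of one token per arc, so each arc contributes its holding time to a per-arc cycle mean; the graph and weights are invented for illustration, not the paper's model.

```python
# Karp's algorithm for the maximum cycle mean of a weighted digraph.
def max_cycle_mean(n, arcs):
    """arcs: list of (u, v, weight); returns max over cycles of mean weight."""
    NEG = float("-inf")
    d = [[NEG] * n for _ in range(n + 1)]  # d[k][v]: max k-edge walk into v
    d[0] = [0.0] * n
    for k in range(1, n + 1):
        for u, v, w in arcs:
            if d[k - 1][u] > NEG and d[k - 1][u] + w > d[k][v]:
                d[k][v] = d[k - 1][u] + w
    best = NEG
    for v in range(n):
        if d[n][v] == NEG:
            continue
        # Karp: max cycle mean = max over v of min over k of this ratio.
        best = max(best, min((d[n][v] - d[k][v]) / (n - k)
                             for k in range(n) if d[k][v] > NEG))
    return best

# Three operations in a cycle plus a tighter two-operation loop
# (weights are processing/holding times; one token per arc assumed).
arcs = [(0, 1, 4.0), (1, 2, 3.0), (2, 0, 2.0), (1, 0, 5.0)]
print("minimal cycle time:", max_cycle_mean(3, arcs))  # 4.5, from the 0-1-0 loop
```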

12.
Pressures for cost-effective new therapies and an increased emphasis on emerging markets require technological advancements and a flexible future manufacturing network for the production of biologic medicines. The safety and efficacy of a product are crucial, and consistent product quality is an essential feature of any therapeutic manufacturing process. The active control of product quality in a typical biologic process is challenging because of measurement lags and nonlinearities present in the system. The current study uses nonlinear model predictive control to maintain a critical product quality attribute at a predetermined value during pilot-scale manufacturing operations. This approach to product quality control ensures a more consistent product for patients, enables greater manufacturing efficiency, and eliminates the need for extensive process characterization by providing direct measures of critical product quality attributes for real-time release of drug product. © 2015 American Institute of Chemical Engineers Biotechnol. Prog., 31:1433–1441, 2015
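A minimal receding-horizon sketch of nonlinear model predictive control on a made-up scalar quality model: at each step, the next H moves are optimized against the quality setpoint and only the first move is applied. The dynamics, horizon, and weights are assumptions for illustration, not the study's process model.

```python
import numpy as np
from scipy.optimize import minimize

dt, H, setpoint = 1.0, 5, 0.8          # step size, horizon, quality target

def step(q, u):
    """Toy nonlinear quality dynamics: control u drives quality q upward."""
    return q + dt * (u * (1.0 - q) - 0.1 * q ** 2)

def cost(u_seq, q0):
    """Tracking error over the horizon plus a small control-effort penalty."""
    q, c = q0, 0.0
    for u in u_seq:
        q = step(q, u)
        c += (q - setpoint) ** 2 + 0.01 * u ** 2
    return c

q, u_guess = 0.2, np.zeros(H)
for t in range(15):
    res = minimize(cost, u_guess, args=(q,), bounds=[(0.0, 1.0)] * H)
    u = res.x[0]                        # receding horizon: apply first move only
    u_guess = np.roll(res.x, -1)        # warm-start the next optimization
    q = step(q, u)                      # plant response (model-matched here)
    print(f"t={t:2d}  u={u:.3f}  q={q:.3f}")
```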

13.
14.
Environmental impacts of coal mining in the arid regions of western China   Cited by: 24 (self-citations: 4, cited by others: 20)
Lei Shaogang, Bian Zhengfu. Acta Ecologica Sinica, 2014, 34(11): 2837-2843
The spatial coupling of large-scale coal mining with fragile arid ecological environments will induce increasingly severe social and environmental problems. This paper analyzes the progress and shortcomings of research on the impacts of underground coal mining in the arid regions of western China on key environmental elements such as vegetation, soil water, groundwater, and soil. It points out that existing studies focus mainly on single environmental elements at local scales and lack basic multi-scale research on the coupled damage patterns of environmental elements and on the linked response mechanisms between underground workings and the surface. Consequently, existing results cannot yet meet the need to guide improvements in coal mining technology and to control environmental damage. It is recommended to implement integrated ground-air synchronous monitoring from underground to the surface across the working face, mine, mining area, and watershed scales; to strengthen research on the coupled damage patterns of the key environmental elements in the arid mining areas of western China; to strengthen research on the response mechanisms of surface and underground environmental elements to mining geological conditions; and to establish models for evaluating and predicting mining-induced environmental damage under different mining conditions and at different assessment scales.

15.
In today's biopharmaceutical industry, developing and producing a new monoclonal antibody takes years before it can be launched commercially. The reasons lie in the complexity of monoclonal antibodies and the need for high product quality to ensure clinical safety, which has a significant impact on process development time. Frameworks such as quality by design (QbD) are becoming widely used by the pharmaceutical industry as they introduce a systematic approach for building quality into the product. However, full implementation of quality by design has still not been achieved, mainly because of limited risk assessment of product properties and the large number of process factors affecting product quality that need to be investigated during process development. This has introduced a need for better methods and tools for early risk assessment and prediction of critical product properties and process factors, to enhance process development and reduce costs. In this review, we investigate how the quantitative structure–activity relationships framework can be applied to an existing process development framework such as quality by design in order to increase product understanding based on the protein structure of monoclonal antibodies. Compared to quality by design, where the effects of process parameters on the drug product are explored, quantitative structure–activity relationships gives the reverse perspective, investigating how the protein structure can affect performance in different unit operations. This provides valuable information that can be used during the early process development of new drug products, where limited process understanding is available. Thus, the quantitative structure–activity relationships methodology is explored and explained in detail, and we investigate the means of directly linking the structural properties of monoclonal antibodies to process data. The resulting information, used as a decision tool, can enhance risk assessment, better aid process development, and thereby overcome some of the limitations and challenges present in QbD implementation today.
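A minimal sketch of this reversed, structure-first perspective: relate hypothetical structure-derived descriptors of a small antibody set to a unit-operation response with a PLS regression, then predict for a new candidate before any experiment. All descriptor names and numbers are invented for illustration.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Rows: candidate mAbs; columns: net charge, charge asymmetry,
# hydrophobic patch area (hypothetical structure-derived descriptors).
X = np.array([[4.1, 0.8, 310.0],
              [6.3, 1.9, 280.0],
              [2.2, 0.3, 450.0],
              [5.0, 1.1, 330.0],
              [3.4, 0.6, 390.0]])
y = np.array([180.0, 260.0, 120.0, 210.0, 150.0])  # elution salt conc. [mM]

model = PLSRegression(n_components=2).fit(X, y)
new_mab = np.array([[4.8, 1.0, 300.0]])            # candidate, pre-experiment
print("predicted elution [mM NaCl]:", model.predict(new_mab).ravel().round(1))
```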

16.
The biopharmaceutical industry is moving toward a more quality by design (QbD) approach that seeks to increase product and process understanding and process control. Miniature bioreactor systems offer a high-throughput method enabling the assessment of numerous process variables in a controlled environment. However, the number of off-line/at-line samples that can be taken is restricted by the small working volume of each vessel. This limitation may be resolved through the use of Raman spectroscopy, owing to its ability to rapidly obtain multianalyte data from small sample volumes. It can, however, be challenging to implement this technique for this application because of the complexity of the sample matrix and the low concentrations at which analytes are often present. Here, we present a design of experiments (DOE) approach to generate samples for calibrating robust multivariate predictive models measuring glucose, lactate, ammonium, viable cell concentration (VCC), and product concentration for unclarified cell culture, improving the daily monitoring of each miniature bioreactor vessel. Furthermore, we demonstrate how the output of the glucose and VCC models can be used to control the glucose and main nutrient feed rates within miniature bioreactor cultures to within qualified critical limits set for larger-scale vessels. The DOE approach used to generate the calibration sample set is shown to result in models more robust to process changes than simply using samples taken from the "typical" process. © 2018 American Institute of Chemical Engineers Biotechnol. Prog., 35: e2740, 2019.
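A minimal sketch of the DOE calibration idea: span the analyte levels with a full-factorial design rather than sampling only the typical process trajectory, then fit a multivariate PLS model from spectrum to concentrations. The synthetic spectra, levels, and band positions are assumptions for illustration, not the study's data.

```python
import numpy as np
from itertools import product
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
levels = [0.5, 3.0, 6.0]                             # g/L, three levels per analyte
design = np.array(list(product(levels, repeat=3)))   # 3^3 = 27 calibration samples

wavenumbers = np.linspace(400, 1800, 200)
def band(center, width):
    return np.exp(-((wavenumbers - center) / width) ** 2)

# One synthetic "pure spectrum" per analyte (glucose, lactate, ammonium).
pure = np.stack([band(1125, 30), band(855, 25), band(1450, 40)])

spectra = design @ pure + 0.02 * rng.standard_normal((len(design), 200))
pls = PLSRegression(n_components=3).fit(spectra, design)

truth = np.array([[2.0, 1.0, 4.0]])                  # an off-design test point
probe = truth @ pure + 0.02 * rng.standard_normal((1, 200))
print("predicted [glc, lac, amm]:", pls.predict(probe).round(2))
```

Because the design spans the analyte space rather than tracking one trajectory, the fitted model sees concentration combinations a "typical" run never produces, which is the robustness argument the abstract makes.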

17.
Multivariate statistical process monitoring (MSPM) is increasingly utilized to further enhance process monitoring in the biopharmaceutical industry. MSPM can play a critical role when there are many measurements and these measurements are highly correlated, as is typical for many biopharmaceutical operations. Specifically, for processes such as cleaning-in-place (CIP) and steaming-in-place (SIP, also known as sterilization-in-place), control systems typically oversee the execution of the cycles, and verification of the outcome is based on offline assays. These offline assays add delays, and corrective actions may require additional setup time. Moreover, this conventional approach does not take interactive effects of process variables into account, and cycle optimization opportunities as well as salient trends in the process may be missed. Therefore, more proactive and holistic online continued-verification approaches are desirable. This article demonstrates the application of real-time MSPM to processes such as CIP and SIP with industrial examples. The proposed approach has significant potential for facilitating enhanced continuous verification, improved process understanding, abnormal situation detection, and predictive monitoring, as applied to CIP and SIP operations. © 2014 American Institute of Chemical Engineers Biotechnol. Prog., 30:505–515, 2014
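A minimal sketch of PCA-based MSPM, a common formulation of this kind of monitoring: fit a PCA model on in-control cycle data, then score new observations with Hotelling's T² (variation within the model plane) and SPE/Q (distance from the plane). The data and the drift fault are simulated for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)
# 100 in-control cycles x 8 correlated measurements (temperatures, flows, ...).
scores_true = rng.standard_normal((100, 2))
X = scores_true @ rng.standard_normal((2, 8)) + 0.1 * rng.standard_normal((100, 8))

mu, sd = X.mean(axis=0), X.std(axis=0)
Xs = (X - mu) / sd
U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
k = 2
P = Vt[:k].T                           # loadings of the 2-component model
lam = s[:k] ** 2 / (len(X) - 1)        # variance of each score

def monitor(x_new):
    xs = (x_new - mu) / sd
    t = xs @ P
    t2 = np.sum(t ** 2 / lam)          # Hotelling's T^2: within-model variation
    spe = np.sum((xs - t @ P.T) ** 2)  # SPE/Q: residual off the model plane
    return t2, spe

fault = X[0].copy()
fault[2] += 3 * sd[2]                  # simulate a drifted sensor on variable 3
for name, x in [("in-control", X[0]), ("drifted", fault)]:
    t2, spe = monitor(x)
    print(f"{name:10s}  T2={t2:6.2f}  SPE={spe:6.2f}")
```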

18.
The concept of "design space" has been proposed in the ICH Q8 guideline and is gaining momentum in its application in the biotech industry. It has been defined as "the multidimensional combination and interaction of input variables (e.g., material attributes) and process parameters that have been demonstrated to provide assurance of quality." This paper presents a stepwise approach for defining the process design space for a biologic product. A case study involving P. pastoris fermentation is presented to illustrate this. First, risk analysis via Failure Modes and Effects Analysis (FMEA) is performed to identify parameters for process characterization. Second, small-scale models are created and qualified prior to their use in the experimental studies. Third, studies are designed using Design of Experiments (DOE) so that the data are amenable to use in defining the process design space. Fourth, the studies are executed and the results analyzed for decisions on the criticality of the parameters as well as on establishing the process design space. For the application under consideration, it is shown that the fermentation unit operation is very robust, with a wide design space and no critical operating parameters. The approach presented here is not specific to the illustrated case study. It can be extended to other biotech unit operations and processes that can be scaled down and characterized at small scale.
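A minimal sketch of the first step, FMEA-style risk ranking: score each parameter for severity, occurrence, and detectability, rank by the risk priority number RPN = S x O x D, and carry high scorers into DOE characterization. The parameters, scores, and threshold are invented for illustration.

```python
# Hypothetical FMEA table: parameter -> (severity, occurrence, detectability),
# each scored 1-10. Scores and the threshold are invented for illustration.
fmea = {
    "induction temperature": (8, 6, 4),
    "methanol feed rate":    (9, 5, 5),
    "dissolved oxygen":      (6, 4, 3),
    "pH":                    (7, 3, 2),
    "inoculum density":      (4, 3, 6),
}

threshold = 100   # RPN cut-off for inclusion in characterization studies
ranked = sorted(fmea.items(),
                key=lambda kv: kv[1][0] * kv[1][1] * kv[1][2], reverse=True)
for name, (s, o, d) in ranked:
    rpn = s * o * d           # risk priority number: S x O x D
    decision = "characterize (DOE)" if rpn >= threshold else "monitor only"
    print(f"{name:22s} RPN={rpn:4d} -> {decision}")
```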

19.
Protection of the environment and people from the potential impacts of uranium mining and milling is a global issue as the world's demand for power generation derived from uranium increases. We present a framework for deriving multiple stressor-pathway causal models for an operational uranium mine that can be used to identify research and monitoring needs for environmental protection. Additionally, the framework enabled us to categorize the importance of pathways in the system. An interdisciplinary approach to causal model development was undertaken in order to ensure that the potential impacts of mining on the natural environment and human health were identified and assessed by researchers with the appropriate knowledge. An example of a causal model and supporting narrative is provided for the most important stressor pathway: transport of inorganic toxicants via the surface-water-to-surface-water pathway. This risk-based screening approach can be applied to mining operations where environmental protection (including human health) is underpinned by quantitative interdisciplinary research and monitoring.

20.
A new methodology to quantify mineral criticality is proposed: the criticality systems of minerals. In this methodology, four types of agents (mineral suppliers, consumers, regulators of the market, and others, such as the communities near mining operations) interact with each other through three types of indicators: constraints, such as the political stability of the mining regions, the mineral's substitutability, and its economic importance; agents' interactions, such as buyer-seller bargaining; and interactive variables, such as demand, supply, and price. When the criticality systems of two mineral groups are constructed, analyses comparing the indicators of these criticality systems can determine which group is more critical. This methodology allows criticality to be evaluated in a dynamic and systemic manner.
