Similar literature
Found 20 similar articles (search time: 15 ms)
1.
Hankins and Rovito (1984) examined the impact of different tool policies on cutting tool inventory levels and spindle utilization for a flexible manufacturing system (FMS). This study provides a broader perspective on the impact of tool allocation approaches on flow times, tardiness, percent of orders tardy, machine utilization, and robot utilization. Part type selection procedures have been suggested for the FMS prerelease planning problem. However, very little research has specifically evaluated the part type selection procedures across different tool allocation approaches. Also, with the exception of Stecke and Kim (1988, 1991), no other known study has provided any insights on what tool allocation approaches are appropriate when processing different mixes of part types. This research is devoted to addressing those issues. Three tool allocation approaches, three production scheduling rules, and three levels of part mix are evaluated in this study through a simulation model of a flexible manufacturing system. The specific impacts of the tool approaches, their interaction effects with the part type selection rules, and their effectiveness at different part type mix levels are provided through the use of a regression metamodel.
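The regression-metamodel idea above can be sketched in a few lines: simulate responses at the design points of a factorial experiment, then fit an ordinary-least-squares model whose coefficients summarize the factor effects. All numbers below are hypothetical stand-ins for simulation output (main effects only, no interactions), not data from the study.

```python
import numpy as np

# Hypothetical 3x3x3 factorial design: tool approach, scheduling rule, part mix,
# dummy-coded; mean flow time is the (simulated) response.
rng = np.random.default_rng(0)
levels = [(t, s, m) for t in range(3) for s in range(3) for m in range(3)]
X = np.array([[1, t == 1, t == 2, s == 1, s == 2, m == 1, m == 2]
              for t, s, m in levels], dtype=float)
beta_true = np.array([50.0, 5.0, -3.0, 2.0, 4.0, 1.0, 6.0])  # invented effects
y = X @ beta_true + rng.normal(0, 0.5, len(levels))          # noisy responses

# Fit the regression metamodel by ordinary least squares.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(beta_hat, 1))
```

The fitted coefficients approximate the assumed factor effects, which is exactly how a metamodel condenses many simulation runs into a few interpretable numbers.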

2.
Fixtures dedicated to a given part type can become bottlenecks in some FMSs as demand variability increases. Past research indicates that the increased operating flexibility associated with general-purpose fixtures may be a key to the efficient scheduling of even those FMSs dedicated to producing a small number of part types over a known planning horizon. However, the increased time required to reconfigure general-purpose fixtures to meet current demand may also create a bottleneck in the loading area. Increases in the defect rate associated with improper fixture assembly and part alignment are also possible. One solution may be to reconfigure general-purpose fixtures off line according to specific demand schedules for the period, and then to treat them as “pseudo-dedicated” fixtures until the next period. This would retain some of the flexibility associated with general-purpose fixtures, while reducing the drawbacks associated with incremental loading time and alignment errors. The research reported in this article simulates an existing FMS, using fixtures dedicated to individual part types, and compares the results to those collected using a group of general-purpose fixtures that are reconfigured each week, based on current demand for each part type, and used in a pseudo-dedicated fashion. Three simulation experiments are run with increasing coefficients of variation in the input distributions used to generate demand. Performance is measured by system throughput, proportion of parts tardy, and average tardiness. The simulation results show that while overall system performance decreases as the level of demand mix variability increases, this negative impact is significantly less severe when using pseudo-dedicated, general-purpose fixtures.

3.
The allocation of tools to machines determines potential part routes in flexible manufacturing systems. Given production requirements and a minimum feasible set of tools, the decision of how to fill vacant slots in tool magazines to maximize routing flexibility is shown to be a minimum cost network flow problem for the cases when routing flexibility is a function of the average workload per tool aggregated over tool types, or of the number of possible routes through the system. A linear programming model is then used to plan a set of routes for each part type so as to minimize either the material handling requirement or the maximum workload on any machine. The impact of these tool addition strategies on material handling and workload equalization is investigated, and computational results are presented. The advantage of the overall approach is computational simplicity at each step and the ability to react to dynamic changes.
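The slot-filling decision can be posed as a min-cost flow on a tiny invented instance: spare tool copies flow from a source through tool-type and machine nodes to a sink, with each tool-to-machine edge carrying the negated flexibility gain of placing that copy there. This sketch assumes the `networkx` library; the tool names, slot counts, and gain values are all hypothetical.

```python
import networkx as nx

# Hypothetical instance: route spare tool copies into vacant magazine slots so
# that total routing-flexibility gain is maximized (negated gains -> min cost).
G = nx.DiGraph()
tools = {"T1": 2, "T2": 1}             # spare copies available per tool type
machines = {"M1": 2, "M2": 1}          # vacant magazine slots per machine
gain = {("T1", "M1"): 3, ("T1", "M2"): 2,
        ("T2", "M1"): 1, ("T2", "M2"): 4}  # invented flexibility gains

total = min(sum(tools.values()), sum(machines.values()))
G.add_node("s", demand=-total)
G.add_node("t", demand=total)
for tool, copies in tools.items():
    G.add_edge("s", tool, capacity=copies, weight=0)
for m, slots in machines.items():
    G.add_edge(m, "t", capacity=slots, weight=0)
for (tool, m), g in gain.items():
    G.add_edge(tool, m, capacity=1, weight=-g)  # maximize gain as min cost

flow = nx.min_cost_flow(G)
assignment = {(tool, m): flow[tool][m] for (tool, m) in gain if flow[tool][m]}
print(assignment)
```

Here the capacities force both T1 copies and the single T2 copy into slots, and the solver picks the gain-maximizing placement.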

4.
Most typical job shop scheduling approaches deal with the processing sequence of parts under a fixed routing condition. In this paper, we suggest a genetic algorithm (GA) to solve the job-sequencing problem for a production shop that is characterized by flexible routing and flexible machines. This means that all parts, of all part types, can be processed through alternative routings. Also, there can be several machines for each machine type. To solve these general scheduling problems, a genetic algorithm approach is proposed and the concepts of virtual and real operations are introduced. Chromosome coding and genetic operators of GAs are defined during problem solving. A minimum weighted tardiness objective function is used to define code fitness, which is used for selecting individuals and producing a new generation of codes. Finally, several experimental results are given.
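A hedged sketch of the GA idea, reduced to a single machine for brevity (the paper handles flexible routings and machine alternatives; the job data, population size, and operator choices below are all illustrative assumptions): a chromosome is a permutation of job indices, and fitness is total weighted tardiness.

```python
import random

# Hypothetical jobs: (processing time, due date, tardiness weight).
jobs = [(4, 6, 2), (2, 4, 1), (6, 14, 3), (3, 7, 2), (5, 20, 1), (1, 3, 2)]

def weighted_tardiness(seq):
    t = total = 0
    for j in seq:
        p, d, w = jobs[j]
        t += p
        total += w * max(0, t - d)
    return total

def order_crossover(a, b, rng):
    # Copy a segment from parent a, fill the rest in parent b's order.
    i, j = sorted(rng.sample(range(len(a)), 2))
    child = [None] * len(a)
    child[i:j] = a[i:j]
    fill = [g for g in b if g not in child]
    for k in range(len(a)):
        if child[k] is None:
            child[k] = fill.pop(0)
    return child

rng = random.Random(42)
n = len(jobs)
pop = [list(range(n))] + [rng.sample(range(n), n) for _ in range(29)]
for _ in range(100):
    pop.sort(key=weighted_tardiness)
    parents = pop[:10]                      # elitist truncation selection
    children = []
    while len(children) < 20:
        a, b = rng.sample(parents, 2)
        c = order_crossover(a, b, rng)
        if rng.random() < 0.2:              # swap mutation
            i, j = rng.sample(range(n), 2)
            c[i], c[j] = c[j], c[i]
        children.append(c)
    pop = parents + children
best = min(pop, key=weighted_tardiness)
print(best, weighted_tardiness(best))
```

Because the top parents carry over each generation, the best-found schedule never worsens, and the permutation encoding with order crossover keeps every chromosome a valid sequence.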

5.
This research involves the development and evaluation of a part flow control model for a type of flexible manufacturing system (FMS) called a dedicated flexible flow line (FFL). In the FFL, all part types flow along the same path between successive machine groups. The specific objective of the part flow control model for the FFL is to minimize makespan for a given set of parts produced in an FFL near-term schedule, given fixed available buffer constraints. The control model developed in this research involves the repeated, real-time execution of a mathematical programming algorithm. The algorithm attempts to release the right mix of parts at the right time to keep the FFL operating smoothly. The focus of the approach is directed toward managing WIP buffers for each machine group queue. The algorithm specifically incorporates stochastic disturbance factors such as machine failures. Through a limited number of simulation experiments, performance of the control model is shown to be superior to other part-release and control methods reported in the literature.

6.
7.
System setup problems in flexible manufacturing systems deal with short-term planning problems such as part type selection, machine grouping, operation assignment, tooling, fixture and pallet allocation, and routing. In this article, we consider three of the subproblems: part type selection, machine grouping, and loading. We suggest a heuristic approach to solve the subproblems consistently with the objective of maximizing the expected production rate. The proposed procedure includes routines to generate all possible machine grouping alternatives for a given set of machines, to obtain optimal target workloads for each grouping alternative, and to allocate operations and tools to machine groups. These routines are executed iteratively until a good solution to the system setup problem is obtained. Computational experience is reported.

8.
Non-negative matrix factorization is a useful tool for reducing the dimension of large datasets. This work considers simultaneous non-negative matrix factorization of multiple sources of data. In particular, we perform the first study that involves more than two datasets. We discuss the algorithmic issues required to convert the approach into a practical computational tool and apply the technique to new gene expression data quantifying the molecular changes in four tissue types due to different dosages of an experimental panPPAR agonist in mouse. This study is of interest in toxicology because, whilst PPARs form potential therapeutic targets for diabetes, it is known that they can induce serious side-effects. Our results show that the practical simultaneous non-negative matrix factorization developed here can add value to the data analysis. In particular, we find that factorizing the data as a single object allows us to distinguish between the four tissue types, but does not correctly reproduce the known dosage level groups. Applying our new approach, which treats the four tissue types as providing distinct, but related, datasets, we find that the dosage level groups are respected. The new algorithm then provides separate gene list orderings that can be studied for each tissue type, and compared with the ordering arising from the single factorization. We find that many of our conclusions can be corroborated with known biological behaviour, and others offer new insights into the toxicological effects. Overall, the algorithm shows promise for early detection of toxicity in the drug discovery process.
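One common way to factorize several related datasets simultaneously is to share a single basis matrix W across them, X_k ≈ W H_k, with multiplicative updates in which W is driven by all datasets jointly. This is a generic sketch of that idea, not the paper's specific algorithm; the two random matrices stand in for tissue-specific expression data.

```python
import numpy as np

# Two hypothetical non-negative datasets sharing the same 20 "genes" (rows).
rng = np.random.default_rng(1)
Xs = [rng.random((20, 15)), rng.random((20, 12))]
r = 4                                   # shared factorization rank (assumed)
W = rng.random((20, r)) + 0.1
Hs = [rng.random((r, X.shape[1])) + 0.1 for X in Xs]

def loss():
    return sum(np.linalg.norm(X - W @ H) ** 2 for X, H in zip(Xs, Hs))

err0 = loss()
for _ in range(200):
    # Standard Frobenius multiplicative updates; the W update sums
    # contributions from every dataset, coupling the factorizations.
    Hs = [H * (W.T @ X) / (W.T @ W @ H + 1e-9) for X, H in zip(Xs, Hs)]
    num = sum(X @ H.T for X, H in zip(Xs, Hs))
    den = sum(W @ H @ H.T for H in Hs) + 1e-9
    W = W * num / den
print(err0, loss())
```

Because every update is multiplicative, W and each H_k stay non-negative, and the summed reconstruction error is non-increasing, which is what makes the shared-basis coupling practical.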

9.
We describe a new approach for using suitable STS and SSR markers as a powerful molecular tool for screening segregating populations involved in backcross schemes for marker-assisted selection, as a preselection step. Since it can be applied to very large populations, this preselection strategy allows the selection pressure to be increased substantially at each backcross generation. The technique is fast and reproducible, and can be made even more efficient and cost-effective by simultaneous DNA amplification from different primer pairs. In the example illustrated here, three suitable PCR-based markers were used to complete the selection of 300 individuals out of 2300 in less than one month with two people working on the project.

10.
In this work, the development and application of published models for describing the behavior of plant cell cultures is reviewed. The structure of each type of model is analyzed, and new tendencies for the modeling of biotechnological processes that can be applied to plant cell cultures are presented. This review clarifies the main features that characterize each type of model in the field of plant cell cultures and can support the selection of the most suitable model type, taking into account the purpose of specific research.

11.
Analyzing the production capacity of a flexible manufacturing system consisting of a number of alternative, nonidentical, flexible machines, where each machine is capable of producing several different part types simultaneously (by flexibly allocating its production capacity among these part types), is not a trivial task. The production capacity set of such a system is naturally expressed in terms of the machine-specific production rates of all part types. In this paper we also express it in terms of the total production rates of all part types over all machines. More specifically, we express the capacity set as the convex hull of a set of points corresponding to all possible assignments of machines to part types, where in each assignment each machine allocates all its capacity to only one part type. First, we show that within each subset of assignments having a given number of machines assigned to each part type, there is a unique assignment that corresponds to an extreme point of the capacity set. Then, we propose a procedure for generating all the extreme points and facets of the capacity set. Numerical experience shows that when the number of part types is less than four, the size of the capacity set (measured in terms of the number of variables times the number of constraints) is smaller, if the capacity set is expressed in terms of the total production rates of all part types over all machines than if it is expressed in terms of the machine-specific production rates of all part types. When the number of part types is four or more, however, the opposite is true.
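The enumeration behind this construction is easy to illustrate on a toy instance (the rates below are invented, and the dominance filter assumes free disposal, i.e. the system can always produce less): with three nonidentical machines and two part types, every dedication of machines to part types yields a total-production-rate point, and within each "count profile" only one point survives as an extreme-point candidate.

```python
from itertools import product

# Hypothetical machine-specific rates: rates[m] = (type-A rate, type-B rate).
rates = [(10, 6), (8, 8), (5, 12)]

by_profile = {}
for assign in product((0, 1), repeat=3):       # part type chosen per machine
    total = [0, 0]
    for m, p in enumerate(assign):
        total[p] += rates[m][p]
    by_profile.setdefault(sum(assign), []).append(tuple(total))

# Within each profile (number of machines on type B), keep the points not
# dominated in both coordinates; under free disposal only these can be
# extreme points of the capacity set.
candidates = []
for profile, pts in sorted(by_profile.items()):
    nd = [p for p in pts
          if not any(q != p and q[0] >= p[0] and q[1] >= p[1] for q in pts)]
    candidates.extend(nd)
print(candidates)
```

On this instance each of the four profiles contributes exactly one undominated point, matching the paper's claim that each profile holds a unique extreme-point assignment.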

12.
Accurate prediction of species distributions based on sampling and environmental data is essential for further scientific analysis, such as stock assessment, detection of abundance fluctuation due to climate change or overexploitation, and to underpin management and legislation processes. The evolution of computer science and statistics has allowed the development of sophisticated and well-established modelling techniques as well as a variety of promising innovative approaches for modelling species distribution. The appropriate selection of modelling approach is crucial to the quality of predictions about species distribution. In this study, modelling techniques based on different approaches are compared and evaluated in relation to their predictive performance, utilizing fish density acoustic data. Generalized additive models and mixed models amongst the regression models, associative neural networks (ANNs) and an artificial neural network ensemble amongst the artificial neural networks, and ordinary kriging amongst the geostatistical techniques are applied and evaluated. A verification dataset is used for estimating the predictive performance of these models. A combination of outputs from the different models is applied for prediction optimization to exploit the ability of each model to explain certain aspects of variation in species acoustic density. Neural networks and especially ANNs appear to provide more accurate results in fitting the training dataset while generalized additive models appear more flexible in predicting the verification dataset. The efficiency of each technique in relation to certain sampling and output strategies is also discussed.
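One simple way to combine outputs from several models, sketched here under the assumption of inverse-MSE weighting on a verification set (the abstract does not specify the combination rule, and all predictions below are invented stand-ins for the GAM, neural-network, and kriging outputs):

```python
import numpy as np

# Hypothetical verification densities and per-model predictions.
y_ver = np.array([3.0, 5.0, 4.0, 6.0])
preds = {
    "gam":     np.array([3.2, 4.8, 4.1, 5.7]),
    "ann":     np.array([2.5, 5.5, 3.0, 6.8]),
    "kriging": np.array([3.5, 4.5, 4.5, 5.5]),
}

# Weight each model by the inverse of its verification-set MSE, so models
# that explain more variation contribute more to the combined prediction.
mse = {m: float(np.mean((p - y_ver) ** 2)) for m, p in preds.items()}
w = {m: 1.0 / e for m, e in mse.items()}
s = sum(w.values())
combined = sum(w[m] / s * preds[m] for m in preds)
print(np.round(combined, 2))
```

In practice the weights would be estimated on held-out data distinct from the final test set; this sketch reuses the verification set only to keep the example short.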

13.
The selection of an appropriate control sample for use in association mapping requires serious deliberation. Unrelated controls are generally easy to collect, but the resulting analyses are susceptible to spurious association arising from population stratification. Parental controls are popular, since triads comprising a case and two parents can be used in analyses that are robust to this stratification. However, parental controls are often expensive and difficult to collect. In some situations, studies may have both parental and unrelated controls available for analysis. For example, a candidate-gene study may analyze triads but may have an additional sample of unrelated controls for examination of background linkage disequilibrium in genomic regions. Also, studies may collect a sample of triads to confirm results initially found using a traditional case-control study. Initial association studies also may collect each type of control, to provide insurance against the weaknesses of the other type. In these situations, resulting samples will consist of some triads, some unrelated controls, and, possibly, some unrelated cases. Rather than analyze the triads and unrelated subjects separately, we present a likelihood-based approach for combining their information in a single combined association analysis. Our approach allows for joint analysis of data from both triad and case-control study designs. Simulations indicate that our proposed approach is more powerful than association tests that are based on each separate sample. Our approach also allows for flexible modeling and estimation of allele effects, as well as for missing parental data. We illustrate the usefulness of our approach using SNP data from a candidate-gene study of psoriasis.

14.
Sexual selection is traditionally measured at the population level, assuming that populations lack structure. However, increasing evidence undermines this approach, indicating that intrasexual competition in natural populations often displays complex patterns of spatial and temporal structure. This complexity is due in part to the degree and mechanisms of polyandry within a population, which can influence the intensity and scale of both pre- and post-copulatory sexual competition. Attempts to measure selection at the local and global scale have been made through multi-level selection approaches. However, definitions of local scale are often based on physical proximity, providing a rather coarse measure of local competition, particularly in polyandrous populations where the local scale of pre- and post-copulatory competition may differ drastically from each other. These limitations can be solved by social network analysis, which allows us to define a unique sexual environment for each member of a population: ‘local scale’ competition, therefore, becomes an emergent property of a sexual network. Here, we first propose a novel quantitative approach to measure pre- and post-copulatory sexual selection, which integrates multi-level selection with information on local scale competition derived as an emergent property of networks of sexual interactions. We then use simple simulations to illustrate the ways in which polyandry can impact estimates of sexual selection. We show that for intermediate levels of polyandry, the proposed network-based approach provides substantially more accurate measures of sexual selection than the more traditional population-level approach. We argue that the increasing availability of fine-grained behavioural datasets provides exciting new opportunities to develop network approaches to study sexual selection in complex societies.
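The "local scale as an emergent network property" idea can be made concrete with a minimal sketch (the mating records are invented, and this captures only the post-copulatory side of the paper's framework): in a polyandrous mating network, a male's rivals are exactly the other males sharing at least one of his mates, rather than whoever happens to be physically nearby.

```python
# Hypothetical mating records: (male, female) edges of a bipartite network.
matings = [("m1", "f1"), ("m2", "f1"), ("m2", "f2"),
           ("m3", "f2"), ("m4", "f3")]

# Males that mated with each female.
mates = {}
for male, female in matings:
    mates.setdefault(female, set()).add(male)

# Each male's post-copulatory rivals: co-mates of any of his partners.
rivals = {}
for male, female in matings:
    rivals.setdefault(male, set()).update(mates[female] - {male})
print(rivals)
```

Note that m4 has no rivals despite possibly living near the others, while m2's rival set spans two females, which is the kind of individual-specific "sexual environment" the network approach defines.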

15.
The current drug development pathway in oncology research has led to a large attrition rate for new drugs, in part due to a general lack of appropriate preclinical studies that are capable of accurately predicting efficacy and/or toxicity in the target population. Because of an obvious need for novel therapeutics in many types of cancer, new compounds are being investigated in human Phase I and Phase II clinical trials before a complete understanding of their toxicity and efficacy profiles is obtained. In fact, for newer targeted molecular agents that are often cytostatic in nature, the conventional preclinical evaluation used for traditional cytotoxic chemotherapies utilizing primary tumor shrinkage as an endpoint may not be appropriate. By utilizing an integrated pharmacokinetic/pharmacodynamic approach, along with proper selection of a model system, the drug development process in oncology research may be improved leading to a better understanding of the determinants of efficacy and toxicity, and ultimately fewer drugs that fail once they reach human clinical trials.

16.
The objective of root cause analysis (RCA) is to make dimensional-error troubleshooting efforts in an assembly plant more efficient and successful by pinpointing the underlying reasons for variation. The result of eliminating or limiting these sources of variation is a real and long-term process improvement. Complex products are manufactured in multileveled hierarchical assembly processes using positioning fixtures. A general approach for diagnosing fixture-related errors using routine measurements on products, rather than special measurements on fixtures, is presented. The assembly variation is effectively tracked down to variation in the fixture tooling elements, referred to as locators. In this way, the process engineers can focus on adjusting the locators affected by most variation. However, depending on the assembly process configuration, inspection strategy, and the type of locator error, it can be impossible to completely sort out the variation caused by an individual locator. The reason for this is that faults in different locators can cause identical dimensional deviation in the inspection station. Conditions guaranteeing diagnosability are derived by considering multiple uncoupled locator faults, in contrast to previous research focusing on single or multiple coupled locator faults. Furthermore, even if an assembly is not diagnosable, it is still possible to gain information for diagnosis by using a novel approach to find an interval for each locator containing the true underlying locator variation. In this way, some locators can be excluded from further analysis, some can be picked out for adjustment, and others remain as potential reasons for assembly variation. Another way around the problem of diagnosability is to make a higher-level diagnosis by calculating the amount of variation originating from different assembly stations. Also, a design-for-diagnosis approach is discussed, where assembly and inspection concepts allowing for root cause analysis are the objective.
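The core diagnosability condition has a compact linear-algebra form. In the usual linearized fixture-error model (the matrix below is an invented toy, not from the article), measured deviations are y = A u, where u holds the individual locator errors; distinct locator faults produce distinct deviation patterns only when A has full column rank.

```python
import numpy as np

# Hypothetical sensitivity matrix: rows are inspection-point deviations,
# columns are locators. Here column 3 equals column 1 + column 2, so a fault
# in locator 3 mimics simultaneous faults in locators 1 and 2.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.5, 0.5, 1.0]])
diagnosable = np.linalg.matrix_rank(A) == A.shape[1]
print(diagnosable)

# Adding an inspection direction that separates locator 3 restores full
# column rank, making the individual faults distinguishable.
A2 = A.copy()
A2[2, 2] = 2.0
print(np.linalg.matrix_rank(A2) == A2.shape[1])
```

This also illustrates the design-for-diagnosis point: choosing inspection points that make the columns of A independent is what makes root cause analysis possible.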

17.
The controversy over the use of null hypothesis statistical testing (NHST) has persisted for decades, yet NHST remains the most widely used statistical approach in wildlife sciences and ecology. A disconnect exists between those opposing NHST and many wildlife scientists and ecologists who conduct and publish research. This disconnect causes confusion and frustration on the part of students. We, as students, offer our perspective on how this issue may be addressed. Our objective is to encourage academic institutions and advisors of undergraduate and graduate students to introduce students to various statistical approaches so we can make well-informed decisions on the appropriate use of statistical tools in wildlife and ecological research projects. We propose an academic course that introduces students to various statistical approaches (e.g., Bayesian, frequentist, Fisherian, information theory) to build a foundation for critical thinking in applying statistics. We encourage academic advisors to become familiar with the statistical approaches available to wildlife scientists and ecologists and thus decrease bias towards one approach. Null hypothesis statistical testing is likely to persist as the most common statistical analysis tool in wildlife science until academic institutions and student advisors change their approach and emphasize a wider range of statistical methods.

18.
The study of complex hereditary diseases is a very challenging area of research. The expanding set of in silico approaches offers a flourishing ground for the acceleration of meaningful findings in this area by exploitation of rich and diverse sources of omic data. These approaches are cheap, flexible, extensible, often complementary and can continuously integrate new information and tests to improve the selection of genes responsible for hereditary diseases. Following this principle, we improved and extended our web-service TOM for the identification of candidate genes in the study of complex hereditary diseases. AVAILABILITY: Our tool is freely available online at http://www.micrel.deis.unibo.it/~tom/.

19.
This paper discusses the general methodological controversy between individual and group research approaches by comparing the main characteristics of these two methods as applied to the specific context of basic research on voluntary heart rate control. A review of the literature published over the past 19 years in this area of study shows an imbalance in the frequency of utilization of these two methods that strongly favors short-term group designs. Implications of this research tendency are discussed. The relevance and the advantages of applying the individual approach to voluntary autonomic control research are outlined. This area is particularly amenable to the individual approach because the phenomena under study seem to be characterized by, among other things, a smaller intrasubject than intersubject variability. It is suggested that the present imbalanced tendency in the choice of a research method be corrected and that researchers adopt a more flexible attitude in the choice of the best method for studying each specific problem.

20.
Protein carbonylation is a major form of protein oxidation and is widely used as an indicator of oxidative stress. Carbonyl groups do not have distinguishing UV or visible, spectrophotometric absorbance/fluorescence characteristics and thus their detection and quantification can only be achieved using specific chemical probes. In this paper, we review the advantages and disadvantages of several chemical probes that have been and are still being used for protein carbonyl analysis. These probes include 2,4-dinitrophenylhydrazine (DNPH), tritiated sodium borohydride ([3H]NaBH4), biotin-containing probes, and fluorescence probes. As our discussions lean toward gel-based approaches, utilizations of these probes in 2D gel-based proteomic analysis of carbonylated proteins are illustrated where applicable. Analysis of carbonylated proteins by ELISA, immunofluorescent imaging, near infrared fluorescence detection, and gel-free proteomic approaches are also discussed where appropriate. Additionally, potential applications of blue native gel electrophoresis as a tool for first dimensional separation in 2D gel-based analysis of carbonylated proteins are discussed as well.


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号