Similar Documents — 20 matching records found.
1.
In the second part of this publication [1], the application of an adaptive algorithm for the adaptation of the static optimum was oriented towards a specific problem. This third part supplies the indirect determination of productivity from the nitrogen balance used in the adaptive algorithm, and presents control algorithms based on the nitrogen balance to complete the scheme.

2.
Ground stone tools are lithic tools made on coarse blanks that are not covered by chipped- or polished-stone studies. During the study of a ground stone tool assemblage from the Campaniform site of Beg ar Loued (Molène Island, Finistère, France), the techno-functional unit (TFU) analysis was adapted to these objects. A TFU is a part of a tool that is technologically independent and plays its own role in the tool's overall functioning. Applying this method to ground stone tools required some adaptations to their characteristics. The aim is to obtain a detailed technological analysis of these tools from which functional hypotheses can be expressed. Two concrete examples from the Beg ar Loued assemblage illustrate the application of TFU analysis to ground stone tools. The method has the advantage of organizing the analysis of each piece. TFU analysis helps to understand multifunctional tools by considering each function as a tool in its own right, which allows us to discuss the hierarchisation of functions and their adaptation to the blanks; it also yields information about blank selection. These varied results make TFU analysis essential for the technological and functional study of ground stone tools.

3.
4.
Machine vision has the potential to significantly impact both quality and productivity in automated manufacturing, owing to its versatility, flexibility, and relative speed. Unfortunately, algorithmic development has not kept pace with advances in vision hardware technology, particularly in the areas of analysis and decision making. In this article, a tutorial is presented that explains how a genetic algorithm can be applied to vision systems for shape analysis and quality assessment. The control parameters for the algorithm are optimized through experiments based on Taguchi's approach to parameter design. The main objective is to explain an application of the vision system that uses upstream design data of machined parts of different types for downstream metrology and quality decision making in a flexible manufacturing environment. The part types used for demonstration are restricted to planar polygonal profiles generated by projecting 3D objects onto a 2D inspection plane. The input to the system is a set of boundary features of the part being analyzed, and the outputs include estimates of the size, orientation, position, and out-of-profile error of the part. The system can analyze machined parts of different types without modifying software programs or parameter settings, which makes it generic, flexible, and inherently suitable for on-line implementation in FMS environments.
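To make the idea concrete, the sketch below shows one way a genetic algorithm could estimate the position, orientation, and scale of a planar polygonal part from measured boundary points. It is an illustrative sketch, not the authors' implementation: the parameter encoding, fitness function, and GA settings are assumptions.

```python
# Hypothetical sketch: a genetic algorithm fits (dx, dy, theta, scale) so that a nominal
# polygon template matches measured boundary points; not the authors' implementation.
import numpy as np

rng = np.random.default_rng(0)

def transform(points, params):
    dx, dy, theta, s = params
    c, si = np.cos(theta), np.sin(theta)
    R = np.array([[c, -si], [si, c]])
    return s * points @ R.T + np.array([dx, dy])

def fitness(params, template, measured):
    # negative mean distance from each measured point to the nearest transformed template vertex
    t = transform(template, params)
    d = np.linalg.norm(measured[:, None, :] - t[None, :, :], axis=2).min(axis=1)
    return -d.mean()

def ga(template, measured, pop=60, gens=100, sigma=(1.0, 1.0, 0.1, 0.05)):
    sigma = np.array(sigma)
    pop_params = rng.normal(0, [5, 5, 0.5, 0.2], size=(pop, 4)) + [0, 0, 0, 1]
    for _ in range(gens):
        scores = np.array([fitness(p, template, measured) for p in pop_params])
        elite = pop_params[np.argsort(scores)[-pop // 4:]]           # selection
        parents = elite[rng.integers(0, len(elite), size=(pop, 2))]  # pairing
        children = parents.mean(axis=1)                              # arithmetic crossover
        children += rng.normal(0, sigma, size=children.shape)        # mutation
        pop_params = children
    return max(pop_params, key=lambda p: fitness(p, template, measured))  # (dx, dy, theta, scale)

# toy usage: estimate the pose of a unit square that was shifted, rotated and scaled
square = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], float)
measured = transform(square, (2.0, -1.0, 0.3, 1.2)) + rng.normal(0, 0.01, (4, 2))
print(np.round(ga(square, measured), 2))
```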

5.
The performance of a biological Fe2+-oxidizing fluidized bed reactor (FBR) was modeled with a standard back-propagation neural network over a period of 220 days at 37 °C under different operational conditions. A method is proposed for modeling Fe3+ production in the FBR, and thereby managing the regeneration of Fe3+ for heap leaching applications, based on an artificial neural network trained by back-propagation. Fe3+ production in the FBR was treated as the critical output parameter, and depending on its predicted value, the relevant control strategies and actions are activated. The modeling of effluent Fe3+ concentration was very successful, with an excellent match between the measured and predicted concentrations.
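As a rough illustration of the modeling approach (not the authors' network), the sketch below trains a small feed-forward network by back-propagation to predict an effluent concentration from operating variables; the input features, network size, and toy data are assumptions.

```python
# Minimal back-propagation sketch predicting effluent Fe3+ from operating conditions;
# the input features and network size are assumptions, not the authors' setup.
import numpy as np

rng = np.random.default_rng(1)

# X: rows of (influent Fe2+, hydraulic retention time, pH) -- hypothetical features
X = rng.uniform([2.0, 2.0, 1.2], [8.0, 10.0, 2.0], size=(200, 3))
y = (0.9 * X[:, 0] - 0.05 * X[:, 1] + rng.normal(0, 0.1, 200)).reshape(-1, 1)  # toy target

# standardize inputs and target
Xs = (X - X.mean(0)) / X.std(0)
ys = (y - y.mean()) / y.std()

W1 = rng.normal(0, 0.5, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.05

for epoch in range(2000):
    h = np.tanh(Xs @ W1 + b1)          # hidden layer
    pred = h @ W2 + b2                 # linear output
    err = pred - ys
    # back-propagate the squared-error gradient
    gW2 = h.T @ err / len(Xs); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)
    gW1 = Xs.T @ dh / len(Xs); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

print("training RMSE (standardized units):", float(np.sqrt(np.mean(err ** 2))))
```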

6.
In increasingly competitive global markets, enterprises face the challenge of responding to customer orders quickly while producing customized products cost-effectively. This paper proposes a dynamic heuristic-based algorithm for the part input sequencing problem of flexible manufacturing systems (FMSs) in a mass customization (MC) environment. The FMS manufactures a variety of parts, and customer orders arrive dynamically with order sizes as small as one. Segmental set functions are established in the proposed algorithm to apply a dynamic workload balancing strategy together with the shortest processing time (SPT) scheduling rule. Theoretical analysis is performed, and the effectiveness of the algorithm in balancing workload dynamically under this complex and dynamic environment is proven. The application of the algorithm is illustrated by an example, and its practical potential for FMSs in make-to-order (MTO) supply chains is discussed. Directions for further research are also outlined.
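A minimal sketch of how an SPT rule can be combined with dynamic workload balancing when choosing the next part to release is given below; the scoring function and weights are assumptions for illustration, not the segmental set functions of the paper.

```python
# Hypothetical sketch of a dynamic part-input rule mixing the shortest-processing-time
# (SPT) rule with workload balancing; the scoring weights are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Part:
    part_id: str
    routing: list  # list of (machine, processing_time)

def next_part(queue, machine_load, alpha=0.5):
    """Pick the queued part that minimizes a mix of total processing time (SPT)
    and the workload imbalance it would create on the machines it visits."""
    def score(p):
        spt = sum(t for _, t in p.routing)
        loads = dict(machine_load)
        for m, t in p.routing:
            loads[m] = loads.get(m, 0.0) + t
        imbalance = max(loads.values()) - min(loads.values())
        return alpha * spt + (1 - alpha) * imbalance
    return min(queue, key=score)

# Example: three waiting orders, two machines with current workloads
queue = [Part("A", [("M1", 4), ("M2", 3)]),
         Part("B", [("M1", 2)]),
         Part("C", [("M2", 6)])]
machine_load = {"M1": 10.0, "M2": 4.0}
print(next_part(queue, machine_load).part_id)
```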

7.
A fast Monte Carlo integration algorithm with varying time step is described for cooperative binding of ligands of arbitrary length to a one-dimensional lattice. This algorithm is particularly suitable for strongly cooperative or anticooperative systems, i.e., when the time scales for different kinetic events are very different. As an application, the kinetics of a bimodal two-ligand system are briefly discussed.
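For readers unfamiliar with variable-time-step Monte Carlo, the following is a generic Gillespie-style sketch for ligands of length L binding to a one-dimensional lattice; the rate constants and the simple nearest-neighbour cooperativity factor are assumptions, not the cited model.

```python
# Generic kinetic Monte Carlo sketch for ligands of length L on a 1-D lattice, using a
# variable time step drawn from the total event rate (Gillespie-style). Rates are assumed.
import numpy as np

rng = np.random.default_rng(2)

N, L = 200, 3                          # lattice sites, ligand length
k_on, k_off, omega = 1.0, 0.5, 10.0    # binding, dissociation, nearest-neighbour cooperativity
occupied = np.zeros(N, dtype=bool)     # True where a lattice site is covered
starts = set()                         # left-most site of each bound ligand

def event_rates():
    rates = []
    for i in range(N - L + 1):         # possible binding positions
        if not occupied[i:i + L].any():
            boost = omega if (i > 0 and occupied[i - 1]) or (i + L < N and occupied[i + L]) else 1.0
            rates.append(("bind", i, k_on * boost))
    for i in starts:                   # possible dissociations
        rates.append(("unbind", i, k_off))
    return rates

t, t_end = 0.0, 50.0
while t < t_end:
    rates = event_rates()
    total = sum(r for _, _, r in rates)
    if total == 0:
        break
    t += rng.exponential(1.0 / total)          # varying time step
    pick = rng.uniform(0, total)
    acc = 0.0
    for kind, i, r in rates:                   # choose one event proportionally to its rate
        acc += r
        if pick <= acc:
            if kind == "bind":
                occupied[i:i + L] = True
                starts.add(i)
            else:
                occupied[i:i + L] = False
                starts.discard(i)
            break

print("lattice coverage:", occupied.mean())
```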

8.
The purpose of this study is to apply the scale-invariant feature transform (SIFT) algorithm to stitch cervical–thoracic–lumbar (C-T-L) spine magnetic resonance (MR) images into a single view of the entire spine. All MR images were acquired with a fast spin echo (FSE) pulse sequence on two MR scanners (1.5 T and 3.0 T). The stitching procedures for each part of the spine MR image were implemented within a graphical user interface (GUI), and stitching was performed in two modes: manual point-to-point (mPTP) selection, in which the user specifies corresponding matching points, and automated point-to-point (aPTP) selection performed by the SIFT algorithm. The images stitched with the SIFT algorithm were well registered, and quantitative measurements showed small errors compared with the stitching algorithm commercially provided in MRI systems. This study presents a preliminary validation of applying the SIFT algorithm to spine MR images, and the results indicate that the proposed approach performs well enough to support improved diagnosis. We believe this approach can be useful in clinical practice and can be extended to image stitching in other medical imaging modalities.
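The sketch below illustrates an automated (aPTP-style) workflow with OpenCV: SIFT keypoints are matched between two overlapping images, a homography is estimated with RANSAC, and the images are composited. File names and thresholds are placeholders, and this is not the authors' pipeline.

```python
# Illustrative sketch of SIFT-based stitching of two overlapping images with OpenCV;
# file names and the RANSAC threshold are placeholders, not the authors' pipeline.
import cv2
import numpy as np

img1 = cv2.imread("spine_upper.png", cv2.IMREAD_GRAYSCALE)   # hypothetical file names
img2 = cv2.imread("spine_lower.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# match descriptors and keep matches passing Lowe's ratio test
matcher = cv2.BFMatcher(cv2.NORM_L2)
good = [m for m, n in matcher.knnMatch(des1, des2, k=2) if m.distance < 0.75 * n.distance]

src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
H, mask = cv2.findHomography(dst, src, cv2.RANSAC, 5.0)

# warp the second image into the first image's frame and composite the two
h1, w1 = img1.shape
h2, w2 = img2.shape
canvas = cv2.warpPerspective(img2, H, (max(w1, w2), h1 + h2))
canvas[:h1, :w1] = np.maximum(canvas[:h1, :w1], img1)
cv2.imwrite("spine_stitched.png", canvas)
```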

9.
Capture–mark–recapture (CMR) approaches are the backbone of many studies in population ecology, providing insight into the life cycle, migration, habitat use, and demography of target species. The reliable and repeatable recognition of an individual throughout its lifetime is the basic requirement of a CMR study. Although invasive techniques are available to mark individuals permanently, noninvasive methods for individual recognition mainly rest on photographic identification of external body markings that are unique at the individual level. Re‐identification of individuals by comparing the shape patterns of photographs by eye is commonly used. Automated processes for photographic re‐identification have recently been established, but their performance on large datasets (i.e., >1000 individuals) has rarely been tested thoroughly. Here, we evaluated the performance of the program AMPHIDENT, an automatic algorithm that identifies individuals on the basis of ventral spot patterns in the great crested newt (Triturus cristatus), against the genotypic fingerprint of individuals based on highly polymorphic microsatellite loci using GENECAP. Between 2008 and 2010, we captured, sampled, and photographed adult newts and calculated recapture rates for 1648 samples/photographs with both approaches. Recapture rates differed slightly: 8.34% for GENECAP and 9.83% for AMPHIDENT. With an estimated 2% false rejection rate (FRR) and 0.00% false acceptance rate (FAR), AMPHIDENT proved to be a highly reliable algorithm for CMR studies of large datasets. We conclude that automatic recognition software applied to individual photographs can be a powerful and reliable tool in noninvasive CMR studies with large numbers of individuals. Because cross‐correlation of standardized shape patterns is applicable to any pattern that provides enough information, this algorithm has the potential to become a single application with broad use in CMR studies for many species.

10.
Industrial continuous fermentation for the production of microbial protein is, like any real process, subject to disturbances. The considerable social expenditures involved in production make it necessary to limit the negative consequences of these disturbances by suitable measures. One such measure is computer-aided adaptation of the static optimum. For reacting to nonmeasurable disturbance inputs, an algorithm is given comprising the steps of data filtering, adaptation of the process model, and optimization; the solution of the data-filtering problem in the widest sense by spline functions is discussed. The application of this algorithm to a specific process-control problem is demonstrated in [1].
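As an illustration of the data-filtering step, the sketch below smooths a noisy measurement series with a smoothing spline before it would be passed on to model adaptation; the measurement series and smoothing factor are assumptions.

```python
# Sketch of the data-filtering step using a smoothing spline before model adaptation;
# the measurement series and smoothing factor are illustrative assumptions.
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(3)
t = np.linspace(0, 24, 97)                         # hours, 15-min samples (hypothetical)
true_x = 12 + 2 * np.sin(2 * np.pi * t / 24)       # underlying process signal
measured = true_x + rng.normal(0, 0.4, t.size)     # noisy on-line measurements

# smoothing factor s controls the trade-off between fidelity and smoothness
spline = UnivariateSpline(t, measured, s=t.size * 0.4 ** 2)
filtered = spline(t)
rate = spline.derivative()(t)                      # filtered rate, usable in a balance model

print("residual std after filtering:", float(np.std(filtered - true_x)))
```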

11.
12.
In liquid chromatography-mass spectrometry (LC-MS), parts of LC peaks are often corrupted by co-eluting peptides, which increases quantification variance. In this paper, we propose accurate LC peak boundary detection to remove the corrupted part of LC peaks. Accurate boundary detection is achieved by checking the consistency of intensity patterns within peptide elution time ranges. In addition, we remove peptides with erroneous mass assignments through a model-fitness check, which compares observed intensity patterns to theoretically constructed ones. The proposed algorithm can significantly improve the accuracy and precision of peptide ratio measurements.
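A simplified sketch of boundary detection by intensity-pattern consistency is shown below: scans whose isotope pattern no longer correlates with the apex pattern are trimmed from the peak. The correlation threshold and toy data are assumptions, not the paper's exact criterion.

```python
# Illustrative sketch of trimming LC peak boundaries by checking the consistency of
# the isotopic intensity pattern across scans; threshold and data are assumptions.
import numpy as np

def detect_boundaries(xic, min_corr=0.9):
    """xic: scans x isotopes intensity matrix for one peptide's elution window.
    Keep the contiguous block of scans whose isotope pattern correlates with the
    apex pattern; scans corrupted by co-eluting peptides fall below the threshold."""
    apex = np.argmax(xic.sum(axis=1))
    ref = xic[apex] / xic[apex].sum()
    def consistent(i):
        row = xic[i]
        if row.sum() == 0:
            return False
        return np.corrcoef(row / row.sum(), ref)[0, 1] >= min_corr
    left = apex
    while left > 0 and consistent(left - 1):
        left -= 1
    right = apex
    while right < len(xic) - 1 and consistent(right + 1):
        right += 1
    return left, right

# toy example: 10 scans x 3 isotopes, last scans corrupted by a co-eluting species
xic = np.array([[0, 0, 0], [5, 3, 1], [10, 6, 2], [20, 12, 4], [40, 24, 8],
                [30, 18, 6], [15, 9, 3], [8, 20, 2], [4, 15, 1], [0, 0, 0]], float)
print(detect_boundaries(xic))   # the detected boundaries exclude the corrupted tail
```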

13.
In this work, we introduce in the first part new developments in Principal Component Analysis (PCA) and in the second part a new method to select variables (genes, in our application). Our focus is on problems where the values taken by each variable do not all have the same importance and where the data may be contaminated with noise and contain outliers, as is the case with microarray data. The usual PCA is not appropriate for this kind of problem. In this context, we propose the use of a new correlation coefficient as an alternative to Pearson's, which leads to a so-called weighted PCA (WPCA). To illustrate the features of our WPCA and compare it with the usual PCA, we consider the problem of analyzing gene expression data sets. In the second part of this work, we propose a new PCA-based algorithm to iteratively select the most important genes in a microarray data set. We show that this algorithm produces better results when our WPCA is used instead of the usual PCA. Furthermore, using Support Vector Machines, we show that it can compete with the Significance Analysis of Microarrays algorithm.
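The sketch below shows the general shape of a weighted PCA: a weighted correlation matrix is computed with observation weights that down-weight outlying samples, and its leading eigenvectors are taken as components. The specific weighting scheme is an assumption, not the correlation coefficient proposed in the paper.

```python
# Sketch of a weighted PCA: weighted correlation matrix plus eigendecomposition.
# The weighting scheme here is an illustrative assumption, not the paper's coefficient.
import numpy as np

def weighted_corr(X, w):
    """Correlation matrix of the columns of X under observation weights w (summing to 1)."""
    mu = w @ X
    Xc = X - mu
    cov = (Xc * w[:, None]).T @ Xc
    d = np.sqrt(np.diag(cov))
    return cov / np.outer(d, d)

def wpca(X, w, n_components=2):
    R = weighted_corr(X, w)
    vals, vecs = np.linalg.eigh(R)                 # eigh: R is symmetric
    order = np.argsort(vals)[::-1][:n_components]
    return vals[order], vecs[:, order]

rng = np.random.default_rng(4)
X = rng.normal(size=(100, 6))
X[:5] += 8                                         # a few contaminated samples (outliers)
# down-weight observations far from the coordinate-wise median (illustrative choice)
dist = np.abs(X - np.median(X, axis=0)).sum(axis=1)
w = 1.0 / (1.0 + dist)
w /= w.sum()
eigvals, components = wpca(X, w)
print(eigvals)
```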

14.
A digital computer was programmed to detect impulses in the presence of noise, rather than identify or classify impulse activity from microelectrodes. The analog signal was abstracted into a sequential series of voltage–time vectors that measured peak-to-peak activity. The amplitude and time difference between a peak-positive potential and the next peak-negative potential defined one vector. The amplitude and time difference between that negative peak and the next positive peak defined the next vector, and so on. An algorithm determined if each successive vector was part of a signal pattern by comparing the properties of the vector to those in a stored list. The algorithm was designed for future application with minimum computer systems and multiple-tip microelectrodes.
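A minimal sketch of the peak-to-peak vector abstraction and template comparison is given below; the sampling rate, tolerances, and stored template values are illustrative assumptions.

```python
# Sketch of the peak-to-peak abstraction: the signal is reduced to (amplitude difference,
# time difference) vectors between successive extrema, and each vector is compared against
# a stored template list. Sampling rate, tolerances and templates are assumptions.
import numpy as np

def peak_vectors(signal, dt):
    """Return (delta_v, delta_t) vectors between successive local extrema."""
    d = np.diff(signal)
    extrema = np.where(np.diff(np.sign(d)) != 0)[0] + 1   # indices of local extrema
    return [(signal[b] - signal[a], (b - a) * dt)
            for a, b in zip(extrema[:-1], extrema[1:])]

def matches_template(vec, templates, amp_tol=0.2, time_tol=0.2e-3):
    dv, dtime = vec
    return any(abs(dv - tv) <= amp_tol and abs(dtime - tt) <= time_tol
               for tv, tt in templates)

# toy usage: detect vectors resembling a stored spike-like template in a noisy trace
fs = 20_000.0
t = np.arange(0, 0.05, 1 / fs)
trace = 0.05 * np.random.default_rng(5).normal(size=t.size)
trace[200:204] += [0.4, 1.0, -0.8, 0.1]                   # injected spike-like event
templates = [(-1.8, 0.1e-3), (1.2, 0.15e-3)]              # stored (amplitude, time) pairs
hits = [v for v in peak_vectors(trace, 1 / fs) if matches_template(v, templates)]
print(len(hits), "candidate impulse segments")
```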

15.
A molecular sequence alignment algorithm based on dynamic programming has been extended to allow the computation of all pairs of residues that can be part of optimal and suboptimal sequence alignments. The uncertainties inherent in sequence alignment can be displayed using a new form of dot plot. The method allows the qualitative assessment of whether or not two sequences are related, and can reveal what parts of the alignment are better determined than others. It also permits the computation of representative optimal and suboptimal alignments. The relation between alignment reliability and alignment parameters is discussed. Other applications are to cyclical permutations of sequences and the detection of self-similarity. An application to multiple sequence alignment is noted.
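The forward/backward dynamic-programming idea behind such a dot plot can be sketched as follows: for each residue pair, the best global alignment score passing through that pair is the forward score up to it plus the backward score beyond it, and pairs within a margin of the optimum are marked. Simple match/mismatch scoring with a linear gap penalty is assumed here, not the paper's scoring scheme.

```python
# Sketch: mark residue pairs (i, j) lying on alignments within delta of the optimum,
# using forward and backward Needleman-Wunsch matrices. Scoring is an assumption.
import numpy as np

def nw_matrix(a, b, match=2, mismatch=-1, gap=-2):
    F = np.zeros((len(a) + 1, len(b) + 1))
    F[:, 0] = gap * np.arange(len(a) + 1)
    F[0, :] = gap * np.arange(len(b) + 1)
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            s = match if a[i - 1] == b[j - 1] else mismatch
            F[i, j] = max(F[i - 1, j - 1] + s, F[i - 1, j] + gap, F[i, j - 1] + gap)
    return F

def alignment_dot_plot(a, b, delta=2, match=2, mismatch=-1, gap=-2):
    F = nw_matrix(a, b, match, mismatch, gap)                   # forward scores
    Brev = nw_matrix(a[::-1], b[::-1], match, mismatch, gap)    # backward scores
    best = F[len(a), len(b)]
    plot = np.zeros((len(a), len(b)), dtype=bool)
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            s = match if a[i - 1] == b[j - 1] else mismatch
            # best global score constrained to align residue i of a with residue j of b
            through = F[i - 1, j - 1] + s + Brev[len(a) - i, len(b) - j]
            plot[i - 1, j - 1] = through >= best - delta
    return plot

print(alignment_dot_plot("GATTACA", "GACTATA").astype(int))
```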

16.
Virtual worlds and environments are becoming an increasingly central part of our lives, yet they are still far from accessible to the blind. This is especially unfortunate, as such environments hold great potential for uses such as social interaction, online education and, in particular, familiarizing a visually impaired user with a real environment virtually, from the comfort and safety of home, before visiting it in the real world. We have implemented a simple algorithm to improve this situation using single-point depth information, enabling the blind to use a virtual cane, modeled on the "EyeCane" electronic travel aid, within any virtual environment with minimal pre-processing. Using the Virtual-EyeCane, this experience can potentially be transferred later to real-world environments, with stimuli identical to those encountered in the virtual environment. We show that practical use of this algorithm for navigation in simple environments is learned quickly.
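A minimal sketch of the virtual-cane idea, under assumptions about the environment representation and the depth-to-feedback mapping, is shown below: a single ray is cast from the user's position, and the distance to the first obstacle is mapped to a beep interval.

```python
# Minimal sketch of a virtual cane: cast a single ray from the user's position in the
# pointed direction, take the distance to the first obstacle, and map it to a feedback
# rate (closer obstacle -> faster beeps). The grid world and the mapping are assumptions.
import math

WORLD = ["##########",
         "#........#",
         "#..##....#",
         "#........#",
         "##########"]   # '#' = wall, '.' = free space (toy environment)

def single_point_depth(x, y, angle, step=0.05, max_range=8.0):
    """March a ray through the grid and return the distance to the first wall."""
    d = 0.0
    while d < max_range:
        d += step
        cx, cy = x + d * math.cos(angle), y + d * math.sin(angle)
        if WORLD[int(cy)][int(cx)] == "#":
            return d
    return max_range

def beep_interval(depth, min_interval=0.05, max_interval=1.0, max_range=8.0):
    """Map depth to the pause between beeps: near obstacles beep rapidly."""
    frac = min(depth / max_range, 1.0)
    return min_interval + frac * (max_interval - min_interval)

d = single_point_depth(1.5, 1.5, angle=0.0)     # user at (1.5, 1.5) pointing along +x
print(f"depth {d:.2f} -> beep every {beep_interval(d):.2f} s")
```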

17.
18.
Li Chunlin, Cai Qianqian & Luo Youlong. Cluster Computing, 2022, 25(2): 1421–1439.

Improper data replacement and inappropriate selection of the job scheduling policy are important causes of degraded Spark operation speed, which directly degrades the performance of Spark parallel computing. In this paper, we analyze Spark's existing caching mechanism and find that there is still room to optimize the existing caching policy. Through task-structure analysis, the key information of Spark tasks is extracted to obtain the data and memory usage during the task runtime; on this basis, an RDD weight calculation method is proposed that integrates the various factors affecting RDD usage and establishes an RDD weight model. Building on this model, a minimum-weight replacement algorithm based on RDD structure analysis is proposed. The algorithm ensures that the relatively more valuable data can be kept in memory during data replacement. In addition, the default job scheduling algorithm of the Spark framework considers only a single factor, which cannot schedule jobs effectively and wastes cluster resources. An adaptive job scheduling policy based on job classification is proposed to solve this problem; the policy classifies job types and schedules resources more effectively for different types of jobs. The experimental results show that the proposed dynamic data replacement algorithm effectively improves Spark's memory utilization, and that the proposed job-classification-based adaptive job scheduling algorithm effectively improves system resource utilization and shortens job completion time.
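The sketch below illustrates a minimum-weight replacement policy of the kind described: each cached block carries a weight combining recomputation cost, size, reference count, and recency, and the lowest-weight block is evicted first. The weight formula and block metadata are assumptions, not the RDD weight model proposed in the paper.

```python
# Sketch of a minimum-weight cache replacement policy for RDD-like blocks; the weight
# formula is an illustrative assumption, not the paper's RDD weight model.
import time
from dataclasses import dataclass, field

@dataclass
class Block:
    block_id: str
    size_mb: float
    compute_cost_s: float      # time to recompute the partition if evicted
    ref_count: int             # how many downstream stages still reference it
    last_used: float = field(default_factory=time.monotonic)

    def weight(self, now):
        age = max(now - self.last_used, 1e-9)
        return (self.compute_cost_s * self.ref_count) / (self.size_mb * age)

class WeightedCache:
    def __init__(self, capacity_mb):
        self.capacity_mb = capacity_mb
        self.blocks = {}

    def put(self, block):
        while sum(b.size_mb for b in self.blocks.values()) + block.size_mb > self.capacity_mb:
            now = time.monotonic()
            victim = min(self.blocks.values(), key=lambda b: b.weight(now))
            del self.blocks[victim.block_id]          # evict the least valuable block
        self.blocks[block.block_id] = block

cache = WeightedCache(capacity_mb=100)
cache.put(Block("rdd_1_p0", 60, compute_cost_s=30, ref_count=3))
cache.put(Block("rdd_2_p0", 50, compute_cost_s=2, ref_count=1))
cache.put(Block("rdd_3_p0", 40, compute_cost_s=20, ref_count=2))
print(sorted(cache.blocks))
```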


19.
In this paper we use marginal probabilities to derive expressions for the means, variances and covariances of m-compartment systems. We also present an efficient algorithm for the estimation of the parameters of the system using time series data when measurements are available from k of the m compartments. An application of the analysis and parameter estimation procedure for a model representing the results of a cancer treatment follow-up study is given. Supported in part by NSF Grant Number DCR74-17282.
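As a worked illustration of compartmental parameter estimation from partial observations, the sketch below fits the rate constants of a two-compartment system by least squares when only the first compartment is measured; the model structure, true rates, and noise level are assumptions.

```python
# Sketch: estimate rate constants of a two-compartment system from time-series
# measurements of one compartment by least squares on the ODE solution. The model,
# rates and noise level are illustrative assumptions.
import numpy as np
from scipy.integrate import odeint
from scipy.optimize import least_squares

def two_compartment(y, t, k12, k21, k10):
    x1, x2 = y
    dx1 = -(k12 + k10) * x1 + k21 * x2
    dx2 = k12 * x1 - k21 * x2
    return [dx1, dx2]

t = np.linspace(0, 10, 40)
true_k = (0.8, 0.3, 0.2)
y0 = [1.0, 0.0]
x1_obs = odeint(two_compartment, y0, t, args=true_k)[:, 0]
x1_obs = x1_obs + np.random.default_rng(6).normal(0, 0.01, t.size)   # measured compartment 1

def residuals(k):
    pred = odeint(two_compartment, y0, t, args=tuple(k))[:, 0]
    return pred - x1_obs

fit = least_squares(residuals, x0=[0.5, 0.5, 0.5], bounds=(0, 5))
print("estimated (k12, k21, k10):", np.round(fit.x, 3))
```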

20.
In parts 1–3 of this publication, only stationary processes were used as information sources. In this fourth part, the application of transition states for the adaptation of the static model is investigated. A solution based on a simple disturbance model is proposed and an evaluation of this procedure is given.
