Similar Documents
Found 20 similar documents (search time: 31 ms).
1.
BACKGROUND: Laboratories engaged in computational biology or bioinformatics frequently need to run lengthy, multistep, and user-driven computational jobs. Each job can tie up a computer for a few minutes to several days, and many laboratories lack the expertise or resources to build and maintain a dedicated computer cluster. RESULTS: JobCenter is a client-server application and framework for job management and distributed job execution. The client and server components are both written in Java and are cross-platform and relatively easy to install. All communication with the server is client-driven, which allows worker nodes to run anywhere (even behind external firewalls) and provides inherent load balancing. Adding a worker node to the worker pool is as simple as dropping the JobCenter client files onto any computer and performing basic configuration, providing tremendous ease-of-use, flexibility, and limitless horizontal scalability. Each worker installation may be independently configured, including the types of jobs it is able to run. Executed jobs may be written in any language and may include multiple execution steps. CONCLUSIONS: JobCenter is a versatile and scalable distributed job management system that allows laboratories to very efficiently distribute all computational work among all available resources. JobCenter is freely available at http://code.google.com/p/jobcenter/.
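JobCenter's client-driven design is what lets workers run behind firewalls: each node polls the server with outbound requests only. A minimal sketch of that polling pattern in Python (the endpoint paths and job fields here are hypothetical illustrations, not JobCenter's actual API):

```python
import time

import requests  # third-party HTTP client: pip install requests

SERVER = "https://jobcenter.example.org/api"  # hypothetical server URL

def worker_loop(worker_id, supported_types):
    """Poll the server for work; all traffic is outbound, so this runs
    behind firewalls without opening any inbound ports."""
    while True:
        # Client-driven: the worker asks for a job of a type it can run.
        resp = requests.post(f"{SERVER}/request-job",
                             json={"worker": worker_id, "types": supported_types})
        job = resp.json()
        if job.get("id") is None:
            time.sleep(30)  # nothing to do; back off, then poll again
            continue
        result = run_job(job)  # the job itself may have multiple steps
        requests.post(f"{SERVER}/submit-result",
                      json={"worker": worker_id, "job": job["id"],
                            "result": result})

def run_job(job):
    # Placeholder: real jobs may be programs written in any language.
    return {"status": "done"}
```

Because idle workers ask for work more often than busy ones, the pull model yields the inherent load balancing the abstract mentions.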

2.
A BASIC computer program for performing weighted nonlinear regression is described and a listing of the program is given. The program, which is small and simple to use, has been designed to be run by users with little knowledge of mathematics or computers. Robust methods of analysis are described which may be applied to data in which experimental errors are not normally distributed, and the program incorporates one such method. It is shown that the program is useful for the analysis of data conforming to the Michaelis-Menten equation, a single exponential, and to binding equations, and other applications are discussed.
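For a modern reader, the same weighted fit of the Michaelis-Menten equation v = Vmax·S/(Km + S) takes a few lines of Python with SciPy; a sketch (the data values are invented for illustration):

```python
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(s, vmax, km):
    return vmax * s / (km + s)

# Illustrative substrate concentrations, rates, and per-point errors (invented).
s = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])
v = np.array([0.9, 1.5, 2.2, 3.0, 3.4, 3.7])
sigma = np.array([0.1, 0.1, 0.15, 0.2, 0.2, 0.25])

# Weighted least squares: sigma supplies the weights, so the fit
# minimizes sum(((v - f) / sigma) ** 2).
popt, pcov = curve_fit(michaelis_menten, s, v, p0=[4.0, 2.0],
                       sigma=sigma, absolute_sigma=True)
vmax_hat, km_hat = popt
print(f"Vmax = {vmax_hat:.2f}, Km = {km_hat:.2f}")
```

For errors that are not normally distributed, scipy.optimize.least_squares with a robust loss such as loss='soft_l1' plays the role of the robust method the program incorporates.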

3.
A "kinemage" (kinetic image) is a scientific illustration presented as an interactive computer display. Operations on the displayed kinemage respond within a fraction of a second: the entire image can be rotated in real time, parts of the display can be turned on or off, points can be identified by selecting them, and the change between different forms can be animated. A kinemage is prepared and specified by the author(s) of a journal article, in order to better communicate ideas that depend on three-dimensional information. The kinemages are distributed as plain text files of commented display lists and accompanying explanations. They are viewed and explored in an open-ended way by the reader using a simple graphics program, such as the one described here (called MAGE), which presently runs on Macintosh computers. A utility (called PREKIN) helps authors prepare the kinemages. Kinemages are being implemented under the auspices of the Innovative Technology Fund.  相似文献   

4.
Following an evaluation of the various methods available for non-destructive biomass estimation in short rotation forestry, a standardised procedure was defined and incorporated into a computer programme (BioEst). Special efforts were made to ensure that the system can be used by people who are unfamiliar with computers and mathematics. BioEst provides an interface between a calliper and a spreadsheet programme which was written in Microsoft Excel macro language. Therefore, it is simple to modify the programme and create personal protocols. BioEst can be run on a portable PC with Microsoft Excel for Windows. The computer continuously recalculates an estimate of the amount of biomass per hectare, as well as some summary statistics, when fed data on shoot diameter obtained by making row-section-wise measurements with a standard digital calliper. BioEst is available without cost from the author.
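The running estimate BioEst maintains can be sketched as an allometric sum over calliper readings. The model form and coefficients below are placeholders (real coefficients are species- and site-specific, and the paper does not state them):

```python
# Hypothetical allometric model: shoot dry weight (g) = A * diameter(mm) ** B.
A, B = 0.04, 2.6           # placeholder coefficients, not BioEst's values
ROW_SECTION_AREA_M2 = 2.0  # ground area represented by one measured section

def biomass_per_hectare(diameters_mm):
    """Estimate stand biomass (kg/ha) from calliper diameters of one section."""
    grams = sum(A * d ** B for d in diameters_mm)
    g_per_m2 = grams / ROW_SECTION_AREA_M2
    return g_per_m2 * 10_000 / 1_000  # m2 -> ha, then g -> kg

readings = [8.5, 11.2, 9.7, 14.0, 10.3]  # shoot diameters in mm (invented)
print(f"estimate: {biomass_per_hectare(readings):.0f} kg/ha")
```

Recomputing this sum after every new reading gives the continuous recalculation described above.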

5.
A flexible new computer program for handling DNA sequence data.
A compact new computer program for handling nucleic acid sequence data is presented. It consists of a number of different subsets, which may be used according to a given code system. The program is designed for the determination of restriction enzyme and other recognition sites in correlation with translation patterns, and allows tabulation of codon frequencies and protein molecular weights within specified gene boundaries. The program is especially designed for detection of overlapping genes. The language is FORTRAN, and thus the program may be used on small computers; it may also be used without any prior computer experience. Copies are available on request.
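Codon-frequency tabulation within specified gene boundaries, one of the program's core functions, reduces to a few lines in a modern language; a sketch in Python:

```python
from collections import Counter

def codon_frequencies(seq, start, end):
    """Tabulate codon usage in seq[start:end] (0-based, end exclusive;
    the region length should be a multiple of three)."""
    region = seq[start:end].upper()
    codons = [region[i:i + 3] for i in range(0, len(region) - 2, 3)]
    return Counter(codons)

dna = "ATGGCTGCTAAATTCGGCTAA"  # toy sequence (invented)
for codon, n in codon_frequencies(dna, 0, 21).most_common():
    print(codon, n)
```

Repeating the tabulation in shifted reading frames, or on the complementary strand, is one simple way to screen for the overlapping genes the program targets.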

6.
A postal survey sent to 350 patients from two rural practices confirmed that an appreciable minority of patients (17%) were opposed to doctors using computers. The questionnaire distributed had been carefully designed to identify their opposition more specifically. Most of the general concern was accounted for by the 91 patients (31%) who feared that confidentiality of information would be reduced. The sensitive nature of medical information alerts patients to the possibility of diminished security of records and obliges practices considering acquiring a computer to ensure that these fears are not realised. Smaller proportions of patients were found to oppose computers on other grounds, namely impersonality, economy, and general anxiety.

7.
In this paper, we discuss how to realize fault-tolerant applications on distributed objects. Servers supporting objects can be made fault-tolerant by taking advantage of replication and checkpointing technologies, but there has been little discussion of how application programs running on clients can tolerate client faults. For example, servers might block in the two-phase commitment protocol because of a client fault. We newly discuss how to make application programs fault-tolerant by taking advantage of mobile agent technologies, in which a program can move from one computer to another in the network: an application program running on a faulty computer can be resumed on another operational computer by moving the program in the mobile agent model. We present a transactional agent model, in which a reliable and efficient application that manipulates objects on multiple computers is realized as a mobile agent. In this model, only a small part of the application program, named the routing subagent, moves around the computers; it autonomously decides which computer to visit next, using a hierarchical navigation map that records which computer should be visited prior to which other. The programs that manipulate the objects on a computer are loaded onto it on arrival of the routing subagent, in order to reduce communication overhead; this part of the transactional agent is the manipulation subagent. A manipulation subagent remains on its computer even after the routing subagent leaves, holding the objects until commitment. We assume that any computer may stop by fault while the network is reliable. From the viewpoint of a transactional agent, a faulty computer may be the current, destination, or sibling computer: the computer where the agent now exists, will move to next, or has already visited, respectively. These fault types are detected by neighbouring manipulation subagents communicating with each other. If some manipulation subagents are faulty, the routing subagent has to be aborted; since it may still be moving, we discuss how to efficiently deliver the abort message to the moving routing subagent. We evaluate the transactional agent model in terms of how long it takes to abort the routing subagent when a computer is faulty.
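A toy sketch of the behaviour described above: a routing subagent follows a navigation map, leaves manipulation subagents behind, and an abort is delivered to them when a fault is detected. The map structure and messages are simplifications for illustration, not the paper's actual protocol:

```python
# Hypothetical navigation map: which computers may be visited after each one.
NAV_MAP = {"A": ["B", "C"], "B": ["C"], "C": []}

def route(start, is_up):
    """Move the routing subagent along the map, leaving a manipulation
    subagent on every visited computer until commit or abort."""
    visited, current = [], start
    while current is not None:
        if not is_up(current):
            # A fault is detected: abort, notifying the manipulation
            # subagents left on the computers visited so far.
            for host in visited:
                print(f"abort delivered to manipulation subagent on {host}")
            return "aborted"
        visited.append(current)
        candidates = [h for h in NAV_MAP[current] if h not in visited]
        current = candidates[0] if candidates else None
    print(f"commit across {visited}")
    return "committed"

route("A", is_up=lambda host: host != "C")  # computer C is faulty in this run
```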

8.

Background  

BLAST is a widely used genetic research tool for analysis of similarity between nucleotide and protein sequences. This paper presents a software application entitled "Squid" that makes use of grid technology. The current version, as an example, is configured for BLAST applications, but adaptation for other computing-intensive repetitive tasks can be easily accomplished in the open source version. This enables the allocation of remote resources to perform distributed computing, making large BLAST queries viable without the need for high-end computers.
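The core step in a system like Squid is splitting one large query set into independent sub-queries for the grid nodes. A sketch of that step (a generic helper written for illustration, not Squid's actual code):

```python
def split_fasta(text, n_chunks):
    """Split a multi-sequence FASTA string into balanced chunks, each a
    self-contained query file for one grid node."""
    # Each record starts at a '>' header line; records are kept intact.
    records, current = [], []
    for line in text.splitlines():
        if line.startswith(">") and current:
            records.append("\n".join(current))
            current = []
        current.append(line)
    if current:
        records.append("\n".join(current))
    # Deal records round-robin so chunks are balanced by record count.
    chunks = [[] for _ in range(n_chunks)]
    for i, rec in enumerate(records):
        chunks[i % n_chunks].append(rec)
    return ["\n".join(c) for c in chunks if c]

fasta = ">seq1\nATGC\n>seq2\nGGAA\n>seq3\nTTCA\n"
for i, chunk in enumerate(split_fasta(fasta, 2)):
    print(f"--- node {i} ---\n{chunk}")
```

Each chunk can then be shipped to a remote node, run through BLAST locally, and the per-chunk reports merged.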

9.
Accurate parameter estimation of allometric equations is a question of considerable interest. Various techniques that address this problem exist. In this paper it is assumed that the measured values are normally distributed and a maximum likelihood estimation approach is used. The computations involved in this procedure are reducible to relatively simple forms, and an efficient numerical algorithm is used. A listing of the computer program is included as an appendix.
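Under the normality assumption, the maximum likelihood fit of the allometric equation y = a·x^b amounts to minimizing a negative log-likelihood; a sketch with SciPy (the data are invented for illustration):

```python
import numpy as np
from scipy.optimize import minimize

x = np.array([1.0, 2.0, 4.0, 8.0, 16.0])    # invented predictor values
y = np.array([2.1, 4.8, 11.5, 27.0, 63.0])  # invented responses

def neg_log_likelihood(params):
    a, b, sigma = params
    mu = a * x ** b
    # Gaussian log-likelihood with constant error variance sigma**2
    # (the additive constant n/2 * log(2*pi) is dropped).
    return len(x) * np.log(sigma) + np.sum((y - mu) ** 2) / (2.0 * sigma ** 2)

res = minimize(neg_log_likelihood, x0=[1.0, 1.0, 1.0],
               bounds=[(1e-9, None), (None, None), (1e-9, None)])
a_hat, b_hat, sigma_hat = res.x
print(f"a = {a_hat:.3f}, b = {b_hat:.3f}, sigma = {sigma_hat:.3f}")
```

With constant error variance, the maximum likelihood estimates of a and b coincide with nonlinear least squares; the likelihood formulation becomes genuinely different once the error variance itself is modelled.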

10.
A cluster, consisting of a group of computers, acts as a single system that provides users with computing resources; each computer is a node of the cluster. With the rapid development of computer technology, cluster computing, with its high performance-cost ratio, has been widely applied in distributed parallel computing. For the large-scale data of a group enterprise, a heterogeneous data integration model was built in a cluster environment based on cluster computing, XML technology, and ontology theory. The model provides users with unified and transparent access interfaces. Building on cluster computing, this work solves the heterogeneous data integration problem by means of ontology and XML technology, and a better application effect was achieved than with a traditional data integration model. It was also shown that the model improves the computing capacity of the system at a high performance-cost ratio, and it is hoped that it will support the decision-making of enterprise managers.
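The unified, transparent access interface can be pictured as per-source mappings onto one shared XML schema; a minimal illustration with Python's standard library (the sources, field names, and mapping below are invented):

```python
import xml.etree.ElementTree as ET

# Two heterogeneous sources describing the same kind of entity (invented).
erp_row = {"emp_no": "1007", "full_name": "Li Wei", "dept": "Finance"}
crm_doc = {"id": "1007", "name": "Li Wei", "unit": "Finance"}

# Per-source mapping onto a shared ontology of common field names.
MAPPINGS = {
    "erp": {"emp_no": "employee_id", "full_name": "name", "dept": "department"},
    "crm": {"id": "employee_id", "name": "name", "unit": "department"},
}

def to_common_xml(source, record):
    """Rewrite one source record as an element of the common schema."""
    elem = ET.Element("employee", source=source)
    for src_field, common_field in MAPPINGS[source].items():
        ET.SubElement(elem, common_field).text = record[src_field]
    return elem

root = ET.Element("employees")
root.append(to_common_xml("erp", erp_row))
root.append(to_common_xml("crm", crm_doc))
print(ET.tostring(root, encoding="unicode"))
```

Queries can then be written once against the common schema, regardless of which node or source system holds the underlying data.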

11.
Many research institutions are deploying computing clusters based on a shared/buy-in paradigm. Such clusters combine shared computers, which are free to be used by all users, and buy-in computers, which are computers purchased by users for semi-exclusive use. The purpose of this paper is to characterize the typical behavior and performance of a shared/buy-in computing cluster, using data traces from the Shared Computing Cluster (SCC) at Boston University that runs under this paradigm as a case study. Among our main findings, we show that the semi-exclusive policy, which allows any SCC user to use idle buy-in resources for a limited time, increases the utilization of buy-in resources by 17.4%, thus significantly improving the performance of the system as a whole. We find that jobs allowed to run on idle buy-in resources arrive more frequently and run for a shorter time than other jobs. Finally, we identify the run time limit (i.e., the maximum time during which a job is allowed to use resources) and the type of parallel environment as two factors that have a significant impact on the different performance experienced by shared and buy-in jobs.
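The semi-exclusive policy at the heart of the study can be sketched as a simple admission rule; the limit and field names below are illustrative guesses, not the SCC's actual configuration:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    kind: str            # "shared" or "buy-in"
    owner: Optional[str] # purchasing group, or None for shared nodes
    idle: bool

# Illustrative limit: non-owners may borrow idle buy-in nodes only briefly.
BORROW_LIMIT_HOURS = 12

def may_run(job_user, requested_hours, node):
    """Admission rule for the semi-exclusive shared/buy-in policy."""
    if node.kind == "shared":
        return True
    if job_user == node.owner:
        return True  # owners keep semi-exclusive access to their nodes
    # Any user may borrow an idle buy-in node, but only for a bounded run.
    return node.idle and requested_hours <= BORROW_LIMIT_HOURS

buyin = Node(kind="buy-in", owner="labA", idle=True)
print(may_run("labB", 6, buyin))   # True: short job on an idle buy-in node
print(may_run("labB", 48, buyin))  # False: exceeds the borrow limit
```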

12.
It ought to be easy to exchange digital micrographs and other computer data files with a colleague even on another continent. In practice, this often is not the case. The advantages and disadvantages of various methods that are available for exchanging data files between computers are discussed. When possible, data should be transferred through computer networking. When data are to be exchanged locally between computers with similar operating systems, the use of a local area network is recommended. For computers in commercial or academic environments that have dissimilar operating systems or are more widely spaced, the use of FTP (file transfer protocol) is recommended. Failing this, posting the data on a website and transferring by hypertext transfer protocol is suggested. If peer-to-peer exchange between computers in domestic environments is needed, the use of Messenger services such as Microsoft Messenger or Yahoo Messenger is the method of choice. When it is not possible to transfer the data files over the internet, single-use writable CD-ROMs are the best media for transferring data. If for some reason this is not possible, DVD-R/RW, DVD+R/RW, 100 MB ZIP disks and USB flash media are potentially useful media for exchanging data files.
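For the FTP route recommended above, the transfer itself is only a few lines with Python's standard ftplib; a sketch (host, account, and file name are placeholders):

```python
from ftplib import FTP

def send_micrograph(path, host, user, password):
    """Upload one data file to a colleague's FTP server in binary mode."""
    with FTP(host) as ftp:  # placeholders: supply a real host and account
        ftp.login(user=user, passwd=password)
        with open(path, "rb") as fh:
            # Binary mode matters: text mode would corrupt image data.
            ftp.storbinary(f"STOR {path}", fh)

# send_micrograph("cell_01.tif", "ftp.example.edu", "colleague", "secret")
```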

15.
16.
M Gulotta, Biophysical Journal, 1995, 69(5):2168-2173
LabVIEW is a graphical, object-oriented computer language developed to facilitate hardware/software communication. LabVIEW is a complete computer language that can be used like BASIC, FORTRAN, or C. In LabVIEW one creates virtual instruments (VIs) that aesthetically look like real instruments but are controlled by sophisticated computer programs. There are several levels of data acquisition VIs that make it easy to control data flow, and many signal processing and analysis algorithms come with the software as premade VIs. In the classroom, the similarity between virtual and real instruments helps students understand how information is passed between the computer and attached instruments. The software may be used in the absence of hardware so that students can work at home as well as in the classroom. This article demonstrates how LabVIEW can be used to control data flow between computers and instruments, points out important features for signal processing and analysis, and shows how virtual instruments may be used in place of physical instrumentation. Applications of LabVIEW to the teaching laboratory are also discussed, and a plausible course outline is given.

17.
Biochemical Education, 1999, 27(2):93-96
We have successfully implemented a simple computerised data acquisition system which has been used extensively in our biochemistry teaching laboratory classes. Our experience suggests that it is important to consider carefully what the computers are required to do in the laboratory well before their introduction. Finally, we believe that it is desirable to keep the system simple and sufficiently versatile that it can be adapted to several different uses without too much difficulty.

18.
19.
Developments in computer hardware and software are making significant improvements in the availability of simulation for biomedical researchers. This paper reviews past and present techniques for digital computer simulation and looks at improvements likely in the near future. In the area of hardware, personal computers are making computing and simulation more widely available and at the same time, supercomputers and special-purpose numerical processors are making it possible to solve larger problems. Software developments for simulation are reducing the time, effort and special skills required to produce a simulation program. A new hierarchical linker is proposed to make it easy to synthesize a global model by combining existing submodels. In the more distant future, computer models may be constructed graphically and with the assistance of intelligent programs capable of analysis and information retrieval.

20.
The use of computers in clinical electrocardiography is increasing rapidly; however, the role of computers with respect to the electrocardiographer has not been established. At present all electrocardiograms (ECGs) processed by computer are also interpreted by electrocardiographers; hence effort is duplicated. In an investigation of whether conditions can be defined under which the electrocardiographer can use the computer more profitably by eliminating some of the duplication, ECGs recorded in a university teaching hospital were processed by a computer program and subsequently reviewed by 1 of 10 electrocardiographers. For ECGs interpreted as showing normal sinus rhythm the rate of agreement between computer and human reviewer was 99%. For those showing a normal ECG pattern (contour) the rate of direct agreement was only 88%. However, the rate of occurrence of clinically significant differences was only 1.64%; hence the rate of essential agreement for this classification was 98.36%. Other classifications with good agreement were myocardial infarction, sinus bradycardia and sinus tachycardia. Therefore, in circumstances comparable to those of this investigation it is feasible for electrocardiographers to use computers to reduce greatly their workload without compromising the quality of the service provided.
