Similar Articles
20 similar articles found (search time: 15 ms)
1.
A general 'coherent signal averager' software package which can be run on a small laboratory computer is presented as an application of a new approach to medical instrumentation. The combination of the minicomputer, preprocessing hardware and the above-mentioned software yields a flexible multipurpose averaging system for electrophysiological signals. The possibilities of the system are discussed with reference to visual evoked potential measurements in a clinical function laboratory.
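The core of such a system is coherent (stimulus-locked) averaging: epochs time-locked to each stimulus trigger are averaged so that the evoked response grows relative to uncorrelated background noise. A minimal sketch with simulated data (the function name and parameters are illustrative, not the package described):

```python
import numpy as np

def coherent_average(signal, trigger_indices, window):
    """Average epochs of `signal` time-locked to stimulus triggers.
    Epochs that would run past the end of the recording are skipped."""
    epochs = [signal[t:t + window]
              for t in trigger_indices
              if t + window <= len(signal)]
    return np.mean(epochs, axis=0)

# Simulated example: a small evoked response buried in noise.
rng = np.random.default_rng(0)
window = 200                                  # samples per epoch
response = np.exp(-((np.arange(window) - 80) / 15.0) ** 2)  # evoked waveform
triggers = np.arange(0, 100 * window, window)               # 100 stimuli
signal = rng.normal(0.0, 2.0, triggers[-1] + window)
for t in triggers:
    signal[t:t + window] += response

avg = coherent_average(signal, triggers, window)
```

Averaging N epochs reduces uncorrelated noise by a factor of sqrt(N), which is why the single-trial response (amplitude 1, noise standard deviation 2) becomes clearly visible after 100 trials.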

2.
A very low cost microprocessor system has been designed to ease data handling problems in a large workload immunoassay laboratory. The microprocessor collects and stores data from many immunoassay detection devices simultaneously, and transfers the data to a minicomputer for analysis as each measurement batch is completed. Stored data are protected against a mains power failure during collection and against non-availability of the minicomputer at transfer time. The system provides fast and reliable transfer of very large amounts of raw data from measurement devices to computer, and therefore facilitates the use of a statistically sound data reduction software package.

3.
A multielectrode EMG analysis program is being developed. Its purpose is to extract, as quickly as possible, the main EMG parameters (amplitude, duration, frequency) of most motor units, and to estimate the anatomical extent of single units. Depending on the extent of the muscle, a variable number of electrodes is inserted crosswise to the fibers. EMG signals are recorded simultaneously on a multichannel AMPEX FR1300 tape recorder and then processed off-line by a 21MX HP minicomputer connected to a 5-Mbyte disc drive. Several technical problems had to be solved: adjusting channel amplification to eliminate differences in preamplifier calibration and filtering; severe filtering of mains hum, which is especially strong in multielectrode recording systems; and the need to sample the signals of the different channels at the same Nyquist rate. The computer is instructed to identify "synchronous" units, i.e. motor units recorded on more than one channel. These motor units are detected, counted and deleted from all channels except the one on which they show the maximum amplitude. The percentage of synchronous units depends on the interelectrode distance and on the units' anatomical area, so it can support an evaluation of motor unit anatomical spread.
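The deduplication step described above (keep a synchronous unit only on the channel where it is largest) can be sketched as follows; the data format and tolerance are illustrative assumptions, not the paper's implementation:

```python
def deduplicate_synchronous(spikes, tolerance):
    """spikes: (channel, time, amplitude) triples pooled from all channels.
    Spikes on different channels closer in time than `tolerance` samples
    are assumed to be the same motor unit; only the copy with the largest
    amplitude is kept."""
    spikes = sorted(spikes, key=lambda s: s[1])        # order by time
    kept, group = [], []
    for s in spikes:
        if group and s[1] - group[-1][1] > tolerance:  # time gap: close group
            kept.append(max(group, key=lambda g: g[2]))
            group = []
        group.append(s)
    if group:
        kept.append(max(group, key=lambda g: g[2]))
    return kept

events = [(0, 100, 0.4), (1, 101, 0.9), (2, 102, 0.3),  # one unit on 3 channels
          (1, 500, 0.6)]                                  # a second, separate unit
print(deduplicate_synchronous(events, tolerance=3))
# -> [(1, 101, 0.9), (1, 500, 0.6)]
```

The fraction of events removed by this step is the "percentage of synchronous units" that the abstract uses to estimate motor unit spread.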

4.
Optimum coordinate sets have been obtained for ferrocytochrome c and the two symmetry-independent molecules of ferricytochrome c from tuna at 2.0 A resolution by making the best fit of models with standard bond lengths and angles to the experimental electron density maps (J. Biol. Chem. (1977) 252, 759-785), as a preliminary to full refinement with 1.5 A data. Both the Diamond model-building programs and locally developed minicomputer routines were tried; both gave satisfactory results, with the latter preferred for economy and ease of operation. Atomic coordinates are available on microfiche or from the Brookhaven Protein Data Bank. Using the two ferricytochrome molecules as a control, no differences between oxidized and reduced cytochrome molecules can be seen that are outside the probable limits of accuracy of the 2.0 A analysis. Rotation and subtractive difference map comparisons also show no conformation changes. If believable differences do appear in the course of the 1.5 A refinement now underway, these should be no more than minor breathing of the main chain or adjustment of side chains.

5.
A computer package written in Fortran-IV for the PDP-11 minicomputer is described. The package's novel features are: software for voice-entry of sequence data; a less memory intensive algorithm for optimal sequence alignment; and programs that fit statistical models to nucleic acid and protein sequences.
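The abstract does not specify its memory-saving alignment algorithm; a standard way to reduce memory (an assumption here, in the spirit of Hirschberg's linear-space method) is to compute the optimal Needleman-Wunsch score keeping only two rows of the dynamic-programming matrix instead of the full table:

```python
def alignment_score(a, b, match=1, mismatch=-1, gap=-2):
    """Optimal global alignment score in O(len(b)) memory: only the
    previous and current DP rows are retained. Scoring values are
    illustrative defaults, not the package's parameters."""
    prev = [j * gap for j in range(len(b) + 1)]   # row for empty prefix of a
    for i, ca in enumerate(a, 1):
        curr = [i * gap]                          # aligning a[:i] with empty b
        for j, cb in enumerate(b, 1):
            diag = prev[j - 1] + (match if ca == cb else mismatch)
            curr.append(max(diag, prev[j] + gap, curr[j - 1] + gap))
        prev = curr
    return prev[-1]

print(alignment_score("ACGT", "AGT"))  # -> 1 (three matches, one gap)
```

Recovering the alignment itself in linear space additionally requires Hirschberg's divide-and-conquer step; the sketch above shows only the score computation.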

6.
In 1979, a minicomputer system was developed for Hoffmann-La Roche by ABEC, Inc. for the purpose of achieving on-line analysis and reporting of data from sixteen 70-L pilot-plant fermentors (New Brunswick Scientific Co.). The system consists of a PDP 11/60 computer with 96K core capacity, two RL01 disk drives, two RX01 floppy-disk drives, an LA-36 DECwriter terminal, a Tektronix CRT, and a Versatec printer/plotter. DEC, PDP, RSX, RL01, RX01, LA-36, and DECwriter are trademarks of Digital Equipment Corporation. The computer software comprises three distinct groups of programs. RSX-11M is a disk-based operating system that allows quick response to real-time events, such as process monitoring and data acquisition, while carrying out less time-dependent activities, such as program development and graphical output. The AIM (Biles, Inc.) system is used to acquire and convert the voltage signals produced by pilot-plant instrumentation into engineering units. Analysis and graphical output are executed by ABEC- and Versatec-supplied programs. The most beneficial task performed by the computer is the production of graphical output of a variety of measured and analyzed data. This has led to an increase in personnel productivity and the design of more meaningful experiments. An ancillary function of the system is to pick up data logged by a PDP 11/03 computer at a remote fermentation production plant by means of a modem-interfaced communication link. Production data are analyzed and presented in a form identical with pilot-plant data. The experience with the system is discussed in this article.

7.

Motivation

Mass spectrometry is a high-throughput, fast, and accurate method of protein analysis. Using the peaks detected in spectra, we can compare a normal group with a disease group. However, the spectra are complicated by scale shifting and are full of noise. Such shifting makes the spectra non-stationary, so they must be aligned before comparison. Consequently, preprocessing of the mass data plays an important role in the analysis process. Noise in mass spectrometry data comes from many different sources and spans many frequencies. A powerful data preprocessing method is therefore needed to remove the large amount of noise in mass spectrometry data.

Results

The Hilbert-Huang Transformation (HHT) is a transformation for non-stationary signals used in signal processing. We provide a novel preprocessing algorithm that can deal with MALDI and SELDI spectra. We use the HHT to decompose the spectrum and filter out the very high-frequency and very low-frequency components. We believe the noise in mass spectrometry comes from many sources, and some of it can be removed by analysis in the frequency domain. Since a protein in the spectrum is expected to appear as a single distinct peak, its frequency content lies in the middle of the frequency domain and is not removed. The results show that HHT, when used for preprocessing, is generally better than other preprocessing methods. The approach not only detects peaks successfully, but also denoises spectra efficiently, especially when the data are complex. The drawback of HHT is that processing takes much longer than wavelet and traditional methods. However, the processing time is still manageable and is worth the wait to obtain high-quality data.
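The abstract's denoising principle — discard the very low-frequency baseline and the very high-frequency noise, keep the mid band where isolated peaks live — can be illustrated without a full HHT/EMD implementation. The sketch below uses a plain FFT band-pass as a simplified stand-in (this is not the authors' HHT algorithm; the cutoff values and test signal are invented for illustration):

```python
import numpy as np

def midband_filter(spectrum, low_cut=0.005, high_cut=0.2):
    """Crude stand-in for the HHT-based denoising described above: zero
    out very low frequencies (baseline drift) and very high frequencies
    (noise), keeping the mid band. Cutoffs are in cycles/sample and are
    illustrative, not taken from the paper."""
    f = np.fft.rfftfreq(len(spectrum))
    coeffs = np.fft.rfft(spectrum)
    coeffs[(f < low_cut) | (f > high_cut)] = 0
    return np.fft.irfft(coeffs, n=len(spectrum))

# Synthetic "spectrum": a sharp peak + slow baseline drift + fast noise.
rng = np.random.default_rng(1)
n = np.arange(2000)
peak = np.exp(-((n - 1000) / 5.0) ** 2)
baseline = 0.5 * np.sin(2 * np.pi * n / 2000.0)
noise = 0.3 * rng.normal(size=n.size)
raw = peak + baseline + noise
cleaned = midband_filter(raw)
```

The sharp peak survives because its frequency content is concentrated in the mid band, while the single-cycle baseline falls entirely below `low_cut` — the same selectivity argument the abstract makes for protein peaks.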

8.
9.
An introductory review of hardware aspects of on-line experimental data processing reveals that the combination of a specialized (hard-wired) preprocessing unit coupled with a programmable laboratory computer is an optimal setup for an electrophysiological laboratory. The paper deals with a proposed modular system, which makes the assembly of a large number of different preprocessing units possible. Some practical applications of the preprocessing units coupled with a LINC (D.E.C.) computer are presented in conclusion.

10.
A computerized system for occlusion pressure measurement during a rebreathing test is described. The system is implemented on an Apple II microcomputer. A set of programs allows calibration, data acquisition during the experiment, and fast automatic processing of the various parameters of ventilation and occlusion pressure versus end tidal PCO2. The use of a limited memory system is made possible by an electronic interface which allows preprocessing of the mouth pressure. In addition, that device drives a new simple electromagnetic valve with low flow resistance and dead space.

11.
Automation of the collection and processing of chronic toxicity test data in a pharmaceutical research center environment is presented. The data are either analog signals or numerical values transmitted to the computer from terminals located in various laboratories. The minicomputer runs a time-sharing system that allows many users to access the programs at the same time. This system handles about 3000 results a day and prints complete reports in less than two days. Manual rewriting is thus avoided, which makes the reports more reliable.

12.
This paper describes basic software for digitization and processing of microscopic cell images used at the Department of Clinical Cytology at Uppsala University Hospital. A family of programs running on a PDP-8 minicomputer, which is connected to a Leitz Orthoplan microscope with two image scanners (a diode-array scanner and a moving-stage photometer), is used for data collection. The digitized image data are converted by a conversion program to IBM-compatible format. The data structures for image processing and statistical evaluation on the IBM system are also described. Finally, some experiences from the use of the software in cytology automation are discussed.

13.
Physiological and developmental implications of motor unit anatomy
There is increasing evidence that the architectural design and arrangement of the fibers within a motor unit have important physiological and developmental ramifications. Limited data, however, are available to directly address this issue. In the present study the physiological properties of one motor unit in each of seven cat tibialis anterior (TA) muscles were determined. Each of these units then was repetitively stimulated to deplete the glycogen in all muscle fibers within the unit. Subsequently, the length, type of ending, and spatial distribution of fibers sampled from these physiologically and histochemically typed motor units were determined. Four fast fatigable (FF), one fast, fatigue resistant (FR), and two slow (S) motor units (MU) were studied. The samples consisted of all those glycogen-depleted fibers (9-27) contained within a single fascicle or a circumscribed area of each of the motor unit territories. The mean fiber lengths for the two slow motor units were 35.9 and 45.5 mm. The mean fiber lengths for the fast motor unit samples ranged from 8.8 to 48.5 mm. Some fibers of both the fast and slow units reached lengths of 58 mm. Most of the fibers in the slow units extended the entire distance between the proximal and distal musculotendinous planes, had relatively constant cross-sectional areas, and terminated at the tendon as blunt endings. In contrast, the majority of the fibers in the fast units terminated intrafascicularly at one end, and the cross-sectional area decreased progressively along their lengths, that is, showed a tapering pattern for a significant proportion of their lengths. Therefore, the force generated by units that end midfascicularly would appear to be transmitted to connective tissue elements and/or adjacent fibers. All fibers of a fast unit within a fascicle were located at approximately the same proximo-distal location. 
Thus, developmentally the selection of muscle fibers by a motoneuron would seem to be influenced by their spatial distribution. The architectural complexities of motor units also have clear implications for the mechanical interactions of active and inactive motor units. For example, the tension capabilities of a motor unit may be influenced not only by the spatial arrangement of its own fibers, but also by the level of activation of neighboring motor units.

14.
In this paper we examine the theory and method for obtaining rotational diffusion coefficients for peptides in dilute solution from 13C-nmr spin-lattice relaxation data. We show that, even in the case of nearly equal observed relaxation times of chemically and magnetically nonequivalent alpha-carbons, marked rotational anisotropy will be the usual case. We describe two interactive minicomputer programs which are of general use in this type of work. The implications of this study for spectral density-based conformational determinations of peptides are discussed.

15.
The purpose of this paper is to present an automatic method of signal analysis. To help physicians in their diagnostics, this method is implemented on a minicomputer in order to detect non-stationary points in electroencephalograms. The signal is modelled with an autoregressive filter whose parameters are adapted at each step. Identification gives the best model in the sense of a cost function representing the mean square error of the noise, which is estimated over the optimisation time-window. The cost function is expressed by a quadratic formula, which allows the use of a fast algorithm, the 'conjugate gradient method'. An original statistical test is developed to detect non-stationary points in the signal. The performance of the method is tested with artificial data to determine the sensitivity of its parameters. Detection using real data is also presented.
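The idea — fit an autoregressive model window by window and flag points where the model's residual statistics jump — can be sketched as follows. This is a simplified stand-in: it fits AR coefficients by least squares rather than the paper's adaptive conjugate-gradient scheme, and uses a plain variance-ratio test rather than their statistical test:

```python
import numpy as np

def ar_fit(x, p):
    """Least-squares fit of an AR(p) model to window `x`; returns
    (coefficients, residual variance)."""
    X = np.column_stack([x[p - k - 1:len(x) - k - 1] for k in range(p)])
    y = x[p:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    return a, (y - X @ a).var()

def detect_nonstationarity(x, p=4, win=200, ratio=3.0):
    """Flag window indices whose AR residual variance jumps by more than
    `ratio` relative to the previous window (a crude change detector)."""
    variances = [ar_fit(x[s:s + win], p)[1]
                 for s in range(0, len(x) - win + 1, win)]
    return [i for i in range(1, len(variances))
            if variances[i] > ratio * variances[i - 1]]

# Artificial data: the noise level changes abruptly at sample 600.
rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(0, 0.1, 600), rng.normal(0, 2.0, 600)])
flags = detect_nonstationarity(x)   # the jump falls in window index 3
```

A step change in residual variance is exactly what a fixed AR model cannot absorb, which is why residual-based tests are a natural fit for EEG non-stationarity detection.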

16.
The analysis of electrophysiological recordings often involves visual inspection of time series data to locate specific experiment epochs, mask artifacts, and verify the results of signal processing steps, such as filtering or spike detection. Long-term experiments with continuous data acquisition generate large amounts of data. Rapid browsing through these massive datasets poses a challenge to conventional data plotting software because the plotting time increases proportionately to the increase in the volume of data. This paper presents FTSPlot, which is a visualization concept for large-scale time series datasets using techniques from the field of high performance computer graphics, such as hierarchic level of detail and out-of-core data handling. In a preprocessing step, time series data, event, and interval annotations are converted into an optimized data format, which then permits fast, interactive visualization. The preprocessing step has a computational complexity of ; the visualization itself can be done with a complexity of and is therefore independent of the amount of data. A demonstration prototype has been implemented and benchmarks show that the technology is capable of displaying large amounts of time series data, event, and interval annotations lag-free with ms. The current 64-bit implementation theoretically supports datasets with up to bytes, on the x86_64 architecture currently up to bytes are supported, and benchmarks have been conducted with bytes/1 TiB or double precision samples. The presented software is freely available and can be included as a Qt GUI component in future software projects, providing a standard visualization method for long-term electrophysiological experiments.
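The "hierarchic level of detail" idea behind such browsers is usually a precomputed min/max pyramid: each coarser level stores the per-block minimum and maximum of the level below, so any zoom level can be drawn from a level of roughly screen-width size instead of the raw samples. A minimal illustrative sketch (not FTSPlot's actual format or code):

```python
import numpy as np

def build_minmax_pyramid(data, factor=16):
    """Build coarser and coarser (min, max) levels of a time series.
    Each level is `factor` times shorter than the one below; a trailing
    partial block is dropped for simplicity."""
    levels = []
    current = np.column_stack([data, data]).astype(float)  # level 0: (min, max) per sample
    while len(current) > factor:
        n = len(current) // factor * factor
        blocks = current[:n].reshape(-1, factor, 2)
        current = np.column_stack([blocks[:, :, 0].min(axis=1),
                                   blocks[:, :, 1].max(axis=1)])
        levels.append(current)
    return levels

levels = build_minmax_pyramid(np.arange(256, dtype=float))
```

Drawing a min/max envelope from the appropriate level touches a bounded number of entries per screen column, which is what makes the rendering cost independent of the dataset size.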

17.
We have developed a collection of programs for manipulation and analysis of nucleotide and protein sequences. The package was written in Fortran 77 on a Sirius 1/Victor microcomputer and can be easily implemented on a large variety of other computers. Some of the programs have already been adapted for use on a VAX 11. Our aim was to develop programs consisting of small, comprehensible and well-documented units that have very fast execution times and are comfortably interactive. The package is therefore well suited to individual modifications, even with little understanding of computer languages.

18.
Simulation software programs continue to evolve and to meet the needs of risk analysts. In the past several years, two spreadsheet add-in programs added the capability of fitting distributions to data to their tool kits using classical statistical (i.e., non-Bayesian) methods. Crystal Ball version 4.0 now contains this capability in its standard program (and in Crystal Ball Pro version 4.0), while the BestFit software program is a component of the @RISK Decision Tools Suite that can also be purchased as a stand-alone program. Both programs will automatically fit distributions using maximum likelihood estimators to continuous data and provide goodness-of-fit statistics based on chi-squared, Kolmogorov-Smirnov, and Anderson-Darling tests. BestFit will also fit discrete distributions, and for all distributions it offers the option of optimizing the fit based on the goodness-of-fit parameters. Analysts should be wary of placing too much emphasis on the goodness-of-fit statistics given their limitations, and the fact that only some of the statistics are appropriately corrected to account for the fact that the distribution parameters are also fit using the data. These programs dramatically simplify efforts to use maximum likelihood estimation to fit distributions. However, the fact that a program is used to fit distributions should not be viewed as validation that the data have been fitted and interpreted correctly. Both programs rely heavily on the analyst's judgment and will allow analysts to fit inappropriate distributions. Currently, both programs could be improved by adding the ability to perform extensive basic exploratory data analysis and to give regression diagnostics that are needed to satisfy critical analysts or reviewers. Given that Bayesian methods are central to risk analysis, adding the capability of fitting distributions by combining data with prior information would greatly increase the utility of these programs.
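The workflow these add-ins automate — maximum-likelihood fitting of candidate distributions followed by goodness-of-fit statistics — can be reproduced with scipy. The sketch below fits normal and lognormal candidates to synthetic skewed data (the data and candidate choices are illustrative):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
data = rng.lognormal(mean=1.0, sigma=0.5, size=500)   # skewed "observed" data

# Maximum-likelihood fits of two candidate distributions.
ln_shape, ln_loc, ln_scale = stats.lognorm.fit(data, floc=0)
n_loc, n_scale = stats.norm.fit(data)

# Kolmogorov-Smirnov goodness of fit. As the text warns, these statistics
# are optimistic here because the parameters were estimated from the same
# data being tested.
ks_ln = stats.kstest(data, 'lognorm', args=(ln_shape, ln_loc, ln_scale))
ks_n = stats.kstest(data, 'norm', args=(n_loc, n_scale))
print(f"lognormal D={ks_ln.statistic:.3f}  normal D={ks_n.statistic:.3f}")
```

The lognormal candidate yields the smaller KS statistic on these data, but — echoing the abstract's caution — a small statistic alone does not validate the choice of distribution; exploratory analysis and judgment are still required.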

19.
An increasing number of countries are committing to meet the global target to eliminate human deaths from dog-mediated rabies by 2030. Mass dog vaccination is central to this strategy. To interrupt rabies transmission from dogs to humans, the World Health Organization recommends that vaccination campaigns should be carried out every year in all dog-owning communities vaccinating 70% of their susceptible dogs. Monitoring and evaluation of dog vaccination campaigns are needed to measure progress towards elimination. In this study, we measured the delivery performance of large-scale vaccination campaigns implemented in 25 districts in south-east Tanzania from 2010 until 2017. We used regression modelling to infer the factors associated with, and potentially influencing the successful delivery of vaccination campaigns. During 2010–2017, five rounds of vaccination campaigns were carried out, vaccinating in total 349,513 dogs in 2,066 administrative vaccination units (rural villages or urban wards). Progressively more dogs were vaccinated over the successive campaigns. The campaigns did not reach all vaccination units each year, with only 16–28% of districts achieving 100% campaign completeness (where all units were vaccinated). During 2013–2017 when vaccination coverage was monitored, approximately 20% of vaccination units achieved the recommended 70% coverage, with average coverage around 50%. Campaigns were also not completed at annual intervals, with the longest interval between campaigns being 27 months. Our analysis revealed that districts with higher budgets generally achieved higher completeness, with a twofold difference in district budget increasing the odds of a vaccination unit being reached by a campaign by slightly more than twofold (OR: 2.29; 95% CI: 1.69–3.09). However, higher budgets did not necessarily result in higher coverage within vaccination units that were reached. 
We recommend national programs regularly monitor and evaluate the performance of their vaccination campaigns, so as to identify factors hindering their effective delivery and to guide remedial action.
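The reported OR of 2.29 for a twofold budget difference is consistent with a logistic model in which budget enters on a log2 scale, so that doubling the budget shifts the linear predictor by one coefficient. A small sketch of that interpretation together with the study's two monitoring metrics (the model specification is an assumption, not stated in the abstract):

```python
import math

# If log2(budget) is a predictor in a logistic model for "unit reached",
# then the odds ratio for a twofold budget difference is exp(beta).
beta = math.log(2.29)                      # coefficient implied by OR = 2.29
odds_ratio_for_doubling = math.exp(beta)

def completeness(units_vaccinated, units_total):
    """Fraction of a district's vaccination units reached by a campaign."""
    return units_vaccinated / units_total

def coverage(dogs_vaccinated, dogs_susceptible):
    """Fraction of susceptible dogs vaccinated within a reached unit
    (WHO target: 0.7)."""
    return dogs_vaccinated / dogs_susceptible
```

Keeping completeness and coverage as separate metrics matters here: the study found budget improved completeness (reaching units) without necessarily raising coverage within reached units.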

20.
Microelectronic technology has made possible a rapid expansion in the exploitation of the digital computer in medicine, but software development remains an increasingly significant problem. This paper describes a distributed computer network which combines the low cost computing power of the microcomputer with the highly developed features of a minicomputer. Powerful data processing facilities are provided through centralization of major resources on the minicomputer. Parallel real-time computing is assigned to microcomputers which have access to the minicomputer filestore, and to its comprehensive software development aids.
