Similar Literature
20 similar documents found.
1.
Human error analysis is a major challenge today for everyone involved in safety and environmental risk assessment. The risk assessment process should not ignore the role of humans in accidental events or the consequences that may arise from human error. This article presents a case study of the Success Likelihood Index Method (SLIM) applied to the Electric Power Company of Serbia (EPCS), with the aim of demonstrating the importance of human error analysis in risk assessment. A database of work-related injuries, accidents, and critical interventions recorded over a 10-year period at the EPCS provided the basis for the study. The research covered 1074 workplaces with a total of 3997 employees. A detailed analysis identified 10 typical human errors and their performance shaping factors (PSFs), and estimated the corresponding human error probabilities (HEPs). The results indicate that PSF control remains crucial for reducing human error and thus preventing occupational injuries and fatalities (the number of injuries decreased from 58 in 2012 to 44 in 2013, with no fatalities recorded). Furthermore, the EPCS case study confirmed that SLIM is highly applicable to the quantification of human error, comprehensive, and easy to perform.
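As a rough illustration of how SLIM turns performance shaping factors into a human error probability, the sketch below computes a Success Likelihood Index (SLI) as a weighted sum of PSF ratings and maps it to an HEP through the usual log-linear calibration log10(HEP) = a·SLI + b. The PSF names, weights, ratings, and calibration anchors are hypothetical and are not taken from the EPCS study.

```python
# Minimal SLIM sketch (hypothetical numbers, not from the EPCS study).
# SLI = sum of (PSF weight * PSF rating); log10(HEP) = a * SLI + b,
# with a, b fixed by two anchor tasks whose HEPs are known from other sources.
import math

def sli(weights, ratings):
    """Success Likelihood Index: weighted sum of PSF ratings (ratings on 0-1)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[p] * ratings[p] for p in weights)

def calibrate(sli1, hep1, sli2, hep2):
    """Solve log10(HEP) = a*SLI + b from two anchor tasks with known HEPs."""
    a = (math.log10(hep1) - math.log10(hep2)) / (sli1 - sli2)
    b = math.log10(hep1) - a * sli1
    return a, b

weights = {"training": 0.3, "procedures": 0.25, "time_pressure": 0.25, "fatigue": 0.2}
ratings = {"training": 0.6, "procedures": 0.7, "time_pressure": 0.4, "fatigue": 0.5}

a, b = calibrate(sli1=0.9, hep1=1e-4, sli2=0.2, hep2=1e-1)  # hypothetical anchors
task_sli = sli(weights, ratings)
hep = 10 ** (a * task_sli + b)
print(f"SLI = {task_sli:.2f}, estimated HEP = {hep:.2e}")
```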

2.
Shale gas fracturing is a complex, continuously operating system. A human error can trigger a chain reaction, from abnormal events to fracturing accidents, and can even lead to abandonment of the well. The fracturing process involves many complex production stages, the consequences of any error are serious, and the human error modes vary across the process. Human error should therefore be studied systematically within a hybrid framework that integrates identification, prioritization, reasoning, and control. This article presents a new structured method for identifying human error in shale gas fracturing operations within such a framework. First, human errors are identified in a structured way using a human HAZOP analysis. Second, the fuzzy VIKOR method is applied for comprehensive prioritization. Finally, 4M element theory is used to analyze the errors and control their evolution. The standardized identification steps and criteria improve the consistency of the results. A study of the feed-flow process shows that 34 kinds of human error can be identified, with high-probability errors concentrated in implementation and observation behavior.
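The abstract names fuzzy VIKOR as the prioritization step; the sketch below implements the plain (crisp) VIKOR ranking to show the mechanics, with hypothetical error candidates, criteria, and weights rather than the fuzzy ratings used in the article.

```python
# Crisp VIKOR ranking sketch (the article uses a fuzzy variant; data are hypothetical).
# For each alternative i: S_i = sum_j w_j*(f*_j - f_ij)/(f*_j - f-_j),
# R_i = max_j of the same term,
# Q_i = v*(S_i - S*)/(S- - S*) + (1-v)*(R_i - R*)/(R- - R*); lower Q = higher priority.

def vikor(scores, weights, v=0.5):
    n_alt, n_crit = len(scores), len(weights)
    f_best = [max(scores[i][j] for i in range(n_alt)) for j in range(n_crit)]
    f_worst = [min(scores[i][j] for i in range(n_alt)) for j in range(n_crit)]
    S, R = [], []
    for row in scores:
        terms = [weights[j] * (f_best[j] - row[j]) / (f_best[j] - f_worst[j])
                 for j in range(n_crit)]
        S.append(sum(terms))
        R.append(max(terms))
    s_star, s_minus = min(S), max(S)
    r_star, r_minus = min(R), max(R)
    return [v * (S[i] - s_star) / (s_minus - s_star)
            + (1 - v) * (R[i] - r_star) / (r_minus - r_star) for i in range(n_alt)]

# Hypothetical human errors scored on severity, likelihood, detectability
# (higher score = more critical on every criterion).
errors = ["wrong valve lineup", "missed pressure reading", "late pump shutdown"]
scores = [[7, 6, 4], [5, 8, 6], [9, 5, 3]]
weights = [0.5, 0.3, 0.2]
for q, name in sorted(zip(vikor(scores, weights), errors)):
    print(f"Q = {q:.3f}  {name}")
```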

3.
Human error on the flight deck (total citations: 1; self-citations: 0; citations by others: 1)
Despite terrorist bombs and structural failures, human error on the flight deck continues to account for the majority of aircraft accidents. The Royal Air Force (RAF) Institute of Aviation Medicine (IAM) has investigated the psychology of such error since the early 1970s, and to this end has used two principal techniques. The first has involved assisting in the official inquiries into both RAF and civil flying accidents, and the second has involved setting up a reporting system that permits any commercial pilot to report his own everyday errors, in complete confidence, to the RAF IAM. The latter system possesses the clear benefit of gathering error data untainted by considerations of culpability, and sometimes permits system rectification before the occurrence of accidents. This paper examines selected examples of errors associated with the design of equipment and with the social psychology of crews, and suggests that some consideration of the psychology of organizations may be necessary to ensure that the problems of human error are given the degree of consideration they require.

4.
Although it is clear that errors in genotyping data can lead to severe errors in linkage analysis, there is as yet no consensus strategy for identification of genotyping errors. Strategies include comparison of duplicate samples, independent calling of alleles, and Mendelian-inheritance-error checking. This study aimed to develop a better understanding of error types associated with microsatellite genotyping, as a first step toward development of a rational error-detection strategy. Two microsatellite marker sets (a commercial genomewide set and a custom-designed fine-resolution mapping set) were used to generate 118,420 and 22,500 initial genotypes and 10,088 and 8,328 duplicates, respectively. Mendelian-inheritance errors were identified by PedManager software, and concordance was determined for the duplicate samples. Concordance checking identifies only human errors, whereas Mendelian-inheritance-error checking is capable of detecting additional errors, such as mutations and null alleles. Neither strategy is able to detect all errors. Inheritance checking of the commercial marker data identified that the results contained 0.13% human errors and 0.12% other errors (0.25% total error), whereas concordance checking found 0.16% human errors. Similarly, Mendelian-inheritance-error checking of the custom-set data identified 1.37% errors, compared with 2.38% human errors identified by concordance checking. A greater variety of error types were detected by Mendelian-inheritance-error checking than by duplication of samples or by independent reanalysis of gels. These data suggest that Mendelian-inheritance-error checking is a worthwhile strategy for both types of genotyping data, whereas fine-mapping studies benefit more from concordance checking than do studies using commercial marker data. Maximizing error identification increases the likelihood of detecting linkage when complex diseases are analyzed.
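A minimal sketch of the Mendelian-inheritance check the abstract refers to, for one autosomal microsatellite locus: a child's genotype is flagged if its two alleles cannot be drawn one from each parent. This is a generic illustration, not the PedManager implementation, and the allele sizes are invented.

```python
# Sketch of a Mendelian-inheritance check at one autosomal microsatellite locus
# (not the PedManager implementation; genotypes are hypothetical allele-size pairs).

def mendel_consistent(child, mother, father):
    """True if the child's two alleles can be explained by one allele from each parent."""
    a, b = child
    return ((a in mother and b in father) or
            (b in mother and a in father))

trio = {"mother": (152, 156), "father": (148, 152), "child": (156, 148)}
print(mendel_consistent(trio["child"], trio["mother"], trio["father"]))  # True

# A dropout or mistyping that turns the child into an apparent (156, 156) homozygote
# is flagged, because 156 is not present in the father.
print(mendel_consistent((156, 156), trio["mother"], trio["father"]))     # False
```

A check like this cannot catch an error that happens to remain consistent with the parental genotypes, which is one reason the study finds that neither strategy detects all errors.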

5.
N. Williams. CMAJ 1964;90(19):1099-1104
Injuries and deaths from traffic accidents are a public health problem of epidemic proportions and justify intensive epidemiological research. The human factor is responsible for the majority of traffic accidents. The literature concerning the human factor is reviewed, and it is concluded that psychosocial influences are most important, though medical conditions may be responsible for 3 to 4% of accidents. Problems concerning the medical examination of drivers are discussed and the need is emphasized to find some means of removing from the road those drivers who continue to drive in spite of repeated medical advice not to do so. Some of the medical conditions influencing driver safety are discussed. It is recommended that each Division of The Canadian Medical Association should publish a guide for physicians who examine drivers. The advantages of a uniform guide in Canada are stressed.

6.
7.
Background: Prescribing errors are a major source of morbidity and mortality and represent a significant patient safety concern. Evidence suggests that trainee doctors are responsible for most prescribing errors. Understanding the factors that influence prescribing behavior may lead to effective interventions to reduce errors. Existing investigations of prescribing errors have been based on Human Error Theory but not on other relevant behavioral theories. The aim of this study was to apply a broad theory-based approach using the Theoretical Domains Framework (TDF) to investigate prescribing in the hospital context among a sample of trainee doctors. Methods: Semistructured interviews, based on 12 theoretical domains, were conducted with 22 trainee doctors to explore views, opinions, and experiences of prescribing and prescribing errors. Content analysis was conducted, followed by applying relevance criteria and a novel stage of critical appraisal, to identify which theoretical domains could be targeted in interventions to improve prescribing. Results: Seven theoretical domains met the criteria of relevance: "social professional role and identity," "environmental context and resources," "social influences," "knowledge," "skills," "memory, attention, and decision making," and "behavioral regulation." From critical appraisal of the interview data, "beliefs about consequences" and "beliefs about capabilities" were also identified as potentially important domains. Interrelationships between domains were evident. Additionally, the data supported theoretical elaboration of the domain "behavioral regulation." Conclusions: In this investigation of hospital-based prescribing, participants' attributions about causes of errors were used to identify domains that could be targeted in interventions to improve prescribing. In a departure from previous TDF practice, critical appraisal was used to identify additional domains that should also be targeted, despite participants' perceptions that they were not relevant to prescribing errors. These were beliefs about consequences and beliefs about capabilities. Specifically, in the light of the documented high error rate, beliefs that prescribing errors were not likely to have consequences for patients and that trainee doctors are capable of prescribing without error should also be targeted in an intervention. This study is the first to suggest critical appraisal for domain identification and to use interview data to propose theoretical elaborations and interrelationships between domains.

8.
An ortho-oblique-type binary data decomposition based on data matrices providing relevant conditions of accidents is proposed as a means of classifying patterns of human error, and the computing procedures are described. The usefulness of the technique is shown by a numerical example of accidents in freight-car classification yard work. Forty-two severely injured victims were interviewed by psychologists and the contents of case reports were rearranged into a reliable binary data matrix which indicated the presence or absence of an interrelationship between each sample and each of 42 items of operational, environmental, and psychological conditions. Three patterns of error leading to an accident were identified by interpreting the orthogonally rotated results. They were 1) failure of strenuous performance in relation to fatigue and poor communication, 2) veterans' mistakes in teamwork, possibly due to hasty operation or distractions, and 3) errors caused by certain defects of machines or inappropriate work space. Recommendations were thus made on training plans for beginners. The importance of using non-scaled accident data in binary form and allowing rotation for data decomposition is discussed.

9.
The European Water Framework Directive requires chemical and biological assessment of waterbodies. While uncertainties in water chemistry have been studied for a long time, few such studies exist in hydrobiology. Our aim was to study the role of uncertainties, defined here as any action that may introduce a data error, in the French macrophyte-based index "Indice Biologique des Macrophytes de Rivières" (IBMR), which gives the trophic status of a river. The uncertainties selected relate to the surveyor effect, both in the field and in the laboratory: taxa omission, species identification error, and cover class error. We propose an innovative approach, close to sensitivity analysis, using controlled virtual changes in taxa identification and cover classes based on two confusion matrices. Creating new experimental floristic lists and recalculating the metric under randomly specified errors allowed us to measure the effect of these errors on the IBMR and on the assigned trophic status. Taxa identification errors and combined errors (taxa identification plus cover class) always had a stronger impact than cover class errors alone. To limit their impact, surveyor training, cross-checking between surveyors, and a quality control approach could be applied.
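A sketch of the perturbation idea described above: each recorded taxon is randomly re-identified according to a confusion matrix and the index is recomputed to see how far it moves. The taxa, confusion probabilities, and species coefficients are illustrative, and the index function is a simplified cover- and stenoecy-weighted mean rather than the standardized IBMR calculation.

```python
# Confusion-matrix perturbation sketch: taxa in a survey list are randomly replaced
# according to assumed misidentification probabilities and the trophic index is
# recomputed. All numbers below are illustrative, not standardized IBMR values.
import random

# P(recorded taxon | true taxon); each row sums to 1.
confusion = {
    "Ranunculus penicillatus": {"Ranunculus penicillatus": 0.85, "Ranunculus fluitans": 0.15},
    "Ranunculus fluitans":     {"Ranunculus fluitans": 0.80, "Ranunculus penicillatus": 0.20},
    "Lemna minor":             {"Lemna minor": 1.00},
}
# Hypothetical (trophic score, stenoecy weight) per taxon.
traits = {"Ranunculus penicillatus": (12, 2), "Ranunculus fluitans": (10, 2), "Lemna minor": (6, 1)}
survey = {"Ranunculus penicillatus": 3, "Lemna minor": 2}  # taxon -> cover class

def index(sample):
    """Cover- and stenoecy-weighted mean of trophic scores (IBMR-like, simplified)."""
    num = sum(traits[t][0] * traits[t][1] * k for t, k in sample.items())
    den = sum(traits[t][1] * k for t, k in sample.items())
    return num / den

def perturb(sample):
    """Re-identify each taxon by drawing from its confusion-matrix row."""
    out = {}
    for taxon, cover in sample.items():
        row = confusion[taxon]
        recorded = random.choices(list(row), weights=list(row.values()))[0]
        out[recorded] = out.get(recorded, 0) + cover
    return out

random.seed(1)
baseline = index(survey)
simulated = [index(perturb(survey)) for _ in range(1000)]
print(f"baseline {baseline:.2f}, perturbed range {min(simulated):.2f}-{max(simulated):.2f}")
```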

10.
This paper presents a virtual model for isokinetic and isometric assessment of a human muscle group at common joints (knee, ankle, hip, shoulder, cervical spine, etc.). The virtual model, combining an analytical analysis with a numerical simulation, can predict measurement errors in joint torque caused by an offset between the rotation centers of the body segment and of the ergometer arm. As soon as an offset is present, errors increase through the influence of inertial effects, gravity effects, stiffness due to the limb strapping on the ergometer arm, and Coulomb friction between limb and ergometer. The analytical model is written in the Lagrange formalism, and the numerical model uses ADAMS software for multibody dynamics simulation. The models show a maximal relative torque error of 11% for a 10% relative offset between the rotation centers. Inertial contributions are found to be negligible, but gravity effects must be weighed against the measured torque. Stiffness and friction effects may also increase the torque error; in particular, when an offset occurs, friction-related errors have to be considered at all torque levels, whereas stiffness effects need to be considered only for torques below 25 N·m. The study also emphasizes the influence of the angular range of motion at a given angular position.

11.
12.
Concussion, or mild traumatic brain injury, occurs in many activities, mostly as a result of the head being accelerated. A comprehensive study has been conducted to understand better the mechanics of the impacts associated with concussion in American football. This study involves a sequence of techniques to analyse and reconstruct many different head impact scenarios. It is important to understand the validity and accuracy of these techniques in order to be able to use the results of the study to improve helmets and helmet standards. Two major categories of potential errors have been investigated. The first category concerns error sources specific to the use of crash test dummy instrumentation (accelerometers) and associated data processing techniques. These are relied upon to establish both linear and angular head acceleration responses. The second category concerns the use of broadcast video data and crash test dummy head-neck-torso systems. These are used to replicate the complex head impact scenarios of whole body collisions that occur on the football field between two living human beings. All acceleration measurement and processing techniques were based on well-established practices and standards. These proved to be reliable and reproducible. Potential errors in the linear accelerations due to electrical or mechanical noise did not exceed 2% for the three different noise sources investigated. Potential errors in the angular accelerations due to noise could be as high as 6.7%, due to error accumulation of multiple linear acceleration measurements. The potential error in the relative impact velocity between colliding heads could be as high as 11%, and was found to be the largest error source in the sequence of techniques to reconstruct the game impacts. Full-scale experiments with complete crash test dummies in staged head impacts showed maximum errors of 17% for resultant linear accelerations and 25% for resultant angular accelerations.

13.
Frameshift mutagenesis by eucaryotic DNA polymerases in vitro (total citations: 23; self-citations: 0; citations by others: 0)
The frequency and specificity of frameshift errors produced during a single round of in vitro DNA synthesis by DNA polymerases-alpha, -beta, and -gamma (pol-alpha, -beta, and -gamma, respectively) have been determined. DNA polymerase-beta is the least accurate enzyme, producing frameshift errors at an average frequency of one error for each 1,000-3,000 nucleotides polymerized, a frequency similar to its average base substitution error rate. DNA polymerase-alpha is approximately 10-fold more accurate, producing frameshifts at an average frequency of one error for every 10,000-30,000 nucleotides polymerized, a frequency about 2- to 6-fold lower than the average pol-alpha base substitution error rate. DNA polymerase-gamma is highly accurate, producing on average less than one frameshift error for every 200,000-400,000 nucleotides polymerized. This represents a more than 10-fold higher fidelity than for base substitutions. Among the collection of sequenced frameshifts produced by DNA polymerases-alpha and -beta, both common features and distinct specificities are apparent. These specificities suggest a major role for eucaryotic DNA polymerases in modulating frameshift fidelity. Possible mechanisms for the production of frameshifts are discussed in relation to the observed biases. One of these models has been experimentally supported using site-directed mutagenesis to change the primary DNA sequence of the template. Alteration of a pol-beta frameshift hotspot sequence TTTT to CTCT reduced the frequency of pol-beta-dependent minus-one-base errors at this site by more than 30-fold, suggesting that more than 97% of the errors at the TTTT run involve a slippage mechanism.

14.
Sampling DNA noninvasively has advantages for identifying animals for uses such as mark–recapture modeling that require unique identification of animals in samples. Although it is possible to generate large amounts of data from noninvasive sources of DNA, a challenge is overcoming genotyping errors that can lead to incorrect identification of individuals. A major source of error is allelic dropout, which is failure of DNA amplification at one or more loci. This has the effect of heterozygous individuals being scored as homozygotes at those loci as only one allele is detected. If errors go undetected and the genotypes are naively used in mark–recapture models, significant overestimates of population size can occur. To avoid this it is common to reject low-quality samples but this may lead to the elimination of large amounts of data. It is preferable to retain these low-quality samples as they still contain usable information in the form of partial genotypes. Rather than trying to minimize error or discarding error-prone samples we model dropout in our analysis. We describe a method based on data augmentation that allows us to model data from samples that include uncertain genotypes. Application is illustrated using data from the European badger (Meles meles).
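A minimal sketch of the allelic-dropout mechanism itself (not the data-augmentation model of the paper): if each allele of a true heterozygote independently fails to amplify with probability p, the sample is scored as a false homozygote with probability 2p(1-p), which is what the simulation below verifies.

```python
# Minimal sketch of allelic dropout at one locus (not the paper's data-augmentation model).
# Each allele of a true heterozygote independently fails to amplify with probability p;
# if exactly one allele drops, the sample is scored as a (false) homozygote.
import random

def observe(true_genotype, p_dropout, rng):
    a, b = true_genotype
    seen = [x for x in (a, b) if rng.random() > p_dropout]
    if not seen:
        return None                   # total amplification failure -> missing genotype
    if len(seen) == 1:
        return (seen[0], seen[0])     # scored as homozygote
    return (a, b)

rng = random.Random(0)
p = 0.1
n = 100_000
obs = [observe(("A", "B"), p, rng) for _ in range(n)]
false_hom = sum(1 for g in obs if g is not None and g[0] == g[1]) / n
print(f"simulated false-homozygote rate {false_hom:.3f}, expected 2p(1-p) = {2*p*(1-p):.3f}")
```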

15.
For surface fluxes of carbon dioxide, the net daily flux is the sum of daytime and nighttime fluxes of approximately the same magnitude and opposite direction. The net flux is therefore significantly smaller than the individual flux measurements and error assessment is critical in determining whether a surface is a net source or sink of carbon dioxide. For carbon dioxide flux measurements, it is an occasional misconception that the net flux is measured as the difference between the net upward and downward fluxes (i.e. a small difference between large terms). This is not the case. The net flux is the sum of individual (half-hourly or hourly) flux measurements, each with an associated error term. The question of errors and uncertainties in long-term flux measurements of carbon and water is addressed by first considering the potential for errors in flux measuring systems in general and thus errors which are relevant to a wide range of timescales of measurement. We also focus exclusively on flux measurements made by the micrometeorological method of eddy covariance. Errors can loosely be divided into random errors and systematic errors, although in reality any particular error may be a combination of both types. Systematic errors can be fully systematic errors (errors that apply on all of the daily cycle) or selectively systematic errors (errors that apply to only part of the daily cycle), which have very different effects. Random errors may also be full or selective, but these do not differ substantially in their properties. We describe an error analysis in which these three different types of error are applied to a long-term dataset to discover how errors may propagate through long-term data and which can be used to estimate the range of uncertainty in the reported sink strength of the particular ecosystem studied.
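The sketch below illustrates numerically how the error types distinguished above propagate into an annual net flux built from half-hourly values: fully random errors largely cancel (growing only as the square root of the number of samples), fully systematic errors scale with the net sum, and selectively systematic errors (here applied to nighttime fluxes only) can bias the annual total far more than the same relative error applied across the whole day. The flux magnitudes and error sizes are invented for illustration and the comparison is relative, not in proper annual-sum units.

```python
# Sketch of how random, fully systematic and selectively systematic errors propagate
# into an annual net flux summed from half-hourly values (all magnitudes invented).
import math
import random

rng = random.Random(42)
half_hours = []
for d in range(365):
    for h in range(48):
        daytime = 14 <= h < 34           # crude 10-hour "day"
        # Net ecosystem exchange: uptake (negative) by day, respiration (positive) at night.
        half_hours.append(-8.0 if daytime else 3.0)

true_sum = sum(half_hours)

# 1) Fully random error: zero-mean noise on every half hour -> grows only as sqrt(N).
random_err = sum(rng.gauss(0, 2.0) for _ in half_hours)

# 2) Fully systematic error: e.g. a 5% calibration bias on every half hour.
systematic_err = sum(0.05 * f for f in half_hours)

# 3) Selectively systematic error: e.g. 20% underestimation of nighttime fluxes only.
selective_err = sum(-0.20 * f for f in half_hours if f > 0)

print(f"true annual sum              {true_sum:12.0f}")
print(f"random error (one draw)      {random_err:12.0f}  (sd ~ {2.0*math.sqrt(len(half_hours)):.0f})")
print(f"fully systematic error       {systematic_err:12.0f}")
print(f"selectively systematic error {selective_err:12.0f}")
```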

16.
This paper examines the consequences of observation errors for the "random walk with drift", a model that incorporates density independence and is frequently used in population viability analysis. Exact expressions are given for biases in estimates of the mean, variance and growth parameters under very general models for the observation errors. For other quantities, such as the finite rate of increase, and probabilities about population size in the future we provide and evaluate approximate expressions. These expressions explain the biases induced by observation error without relying exclusively on simulations, and also suggest ways to correct for observation error. A secondary contribution is a careful discussion of observation error models, presented in terms of either log-abundance or abundance. This discussion recognizes that the bias and variance in observation errors may change over time, the result of changing sampling effort or dependence on the underlying population being sampled.
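One such bias can be shown with a short simulation: when log-abundance is observed with error, year-to-year differences still estimate the drift approximately without bias, but the naive variance estimate is inflated by roughly twice the observation-error variance. The parameter values below are invented and this is only an illustration of the effect, not the paper's exact expressions.

```python
# Observation-error bias sketch for the random walk with drift (parameters invented).
# True log-abundance: x[t+1] = x[t] + mu + e_t, e ~ N(0, sigma^2).
# Observed:           y[t]   = x[t] + d_t,      d ~ N(0, tau^2).
# Differences of y have mean mu (approx. unbiased) but variance sigma^2 + 2*tau^2.
import random
import statistics

rng = random.Random(7)
mu, sigma, tau = 0.02, 0.10, 0.15
n_years, n_reps = 30, 2000

mu_hats, var_hats = [], []
for _ in range(n_reps):
    x, y = 5.0, []
    for _ in range(n_years):
        y.append(x + rng.gauss(0, tau))      # observation with error
        x += mu + rng.gauss(0, sigma)        # true process step
    diffs = [y[t + 1] - y[t] for t in range(n_years - 1)]
    mu_hats.append(statistics.mean(diffs))
    var_hats.append(statistics.variance(diffs))

print(f"mean of mu-hat  {statistics.mean(mu_hats):.4f}   (true mu = {mu})")
print(f"mean of var-hat {statistics.mean(var_hats):.4f}   (sigma^2 = {sigma**2:.4f}, "
      f"sigma^2 + 2*tau^2 = {sigma**2 + 2*tau**2:.4f})")
```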

17.
In systems and computational biology, much effort is devoted to functional identification of systems and networks at the molecular or cellular scale. However, similarly important networks exist at anatomical scales such as the tendon network of human fingers: the complex array of collagen fibers that transmits and distributes muscle forces to finger joints. This network is critical to the versatility of the human hand, and its function has been debated since at least the 16th century. Here, we experimentally infer the structure (both topology and parameter values) of this network through sparse interrogation with force inputs. A population of models representing this structure co-evolves in simulation with a population of informative future force inputs via the predator-prey estimation-exploration algorithm. Model fitness depends on their ability to explain experimental data, while the fitness of future force inputs depends on causing maximal functional discrepancy among current models. We validate our approach by inferring two known synthetic Latex networks, and one anatomical tendon network harvested from a cadaver's middle finger. We find that functionally similar but structurally diverse models can exist within a narrow range of the training set and cross-validation errors. For the Latex networks, models with low training set error [<4%] and resembling the known network have the smallest cross-validation errors [∼5%]. The low training set [<4%] and cross validation [<7.2%] errors for models for the cadaveric specimen demonstrate what, to our knowledge, is the first experimental inference of the functional structure of complex anatomical networks. This work expands current bioinformatics inference approaches by demonstrating that sparse, yet informative interrogation of biological specimens holds significant computational advantages in accurate and efficient inference over random testing, or assuming model topology and only inferring parameter values. These findings also hold clues to both our evolutionary history and the development of versatile machines.
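A toy sketch of the estimation-exploration loop described above, using an invented one-dimensional "system" (a hidden quadratic) in place of the physical tendon network, and simple random search in place of the paper's evolutionary optimization: candidate models are scored by how well they explain the data gathered so far, and the next test input is chosen to maximize disagreement among the current best candidates.

```python
# Toy estimation-exploration loop (not the tendon-network code): candidate models
# compete to explain the data; the next test input is the one on which the current
# best candidates disagree most. The "true" system here is a hidden quadratic; in
# the paper it is a physical specimen interrogated with force inputs.
import random

rng = random.Random(3)
true_params = (1.5, -2.0, 0.7)                # hidden system: a*x^2 + b*x + c

def respond(x, params=true_params):
    a, b, c = params
    return a * x * x + b * x + c + rng.gauss(0, 0.05)   # noisy "measurement"

def predict(params, x):
    a, b, c = params
    return a * x * x + b * x + c

def fit_error(params, data):
    return sum((predict(params, x) - y) ** 2 for x, y in data)

# Estimation: random-search over candidate models (stand-in for evolutionary search).
# Exploration: query the input with maximal prediction disagreement among top models.
candidates = [tuple(rng.uniform(-3, 3) for _ in range(3)) for _ in range(500)]
data = [(0.0, respond(0.0))]                  # one initial measurement
test_inputs = [i / 10 for i in range(-20, 21)]

for step in range(6):
    best = sorted(candidates, key=lambda p: fit_error(p, data))[:10]
    def disagreement(x):
        preds = [predict(p, x) for p in best]
        return max(preds) - min(preds)
    next_x = max(test_inputs, key=disagreement)
    data.append((next_x, respond(next_x)))    # "experiment" on the true system
    print(f"step {step}: queried x = {next_x:+.1f}, best model = "
          f"({best[0][0]:+.2f}, {best[0][1]:+.2f}, {best[0][2]:+.2f})")
```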

18.
19.
Several recent accidents in complex high-risk technologies had their primary origins in a variety of delayed-action human failures committed long before an emergency state could be recognized. These disasters were due to the adverse conjunction of a large number of causal factors, each one necessary but singly insufficient to achieve the catastrophic outcome. Although the errors and violations of those at the immediate human-system interface often feature large in the post-accident investigations, it is evident that these 'front-line' operators are rarely the principal instigators of system breakdown. Their part is often to provide just those local triggering conditions necessary to manifest systemic weaknesses created by fallible decisions made earlier in the organizational and managerial spheres. The challenge facing the human reliability community is to find ways of identifying and neutralizing these latent failures before they combine with local triggering events to breach the system's defences. New methods of risk assessment and risk management are needed if we are to achieve any significant improvements in the safety of complex, well-defended, socio-technical systems. This paper distinguishes between active and latent human failures and proposes a general framework for understanding the dynamics of accident causation. It also suggests ways in which current methods of protection may be enhanced, and concludes by discussing the unusual structural features of 'high-reliability' organizations.

20.
The sources of errors which may occur when cytophotometric analysis is performed with video microscopy using a charge-coupled device (CCD) camera and image analysis are reviewed. The importance of these errors in practice has been tested, and ways of minimizing or avoiding them are described. Many of these sources of error are known from scanning and integrating cytophotometry; they include the use of white instead of monochromatic light, the distribution error, glare, diffraction, shading distortion, and inadequate depth of field. Sources of errors specifically linked with video microscopy or image analysis are highlighted as well; these errors include blooming, limited dynamic range of grey levels, non-linear responses of the camera, contrast transfer, photon noise, dark current, read-out noise, fixed scene noise and spatial calibration. Glare, contrast transfer, fixed scene noise, depth of field and spatial calibration seem to be the most serious sources of errors when measurements are not carried out correctly. We include a table summarizing all the errors discussed in this review and procedures for avoiding them. It can be concluded that if accurate calibration steps are performed and proper guidelines followed, image cytometry can be applied safely for quantifying amounts of chromophore per cell or per unit volume of tissue in sections, even when relatively simple and inexpensive instrumentation is being used.
