Similar Documents
20 similar documents retrieved (search time: 15 ms)
1.
Computational models of the human body coupled with optimization can be used to predict the influence of variables that cannot be experimentally manipulated. Here, we present a study that predicts the motion of the human body while lifting a box, as a function of the flexibility of the hip and lumbar joints in the sagittal plane. We modeled the human body in the sagittal plane with joints actuated by pairs of agonist-antagonist muscle torque generators, and a passive hamstring muscle. The characteristics of a stiff, average and flexible person were represented by co-varying the lumbar range of motion, the lumbar passive extensor torque and the hamstring passive muscle force. We used optimal control to solve for motions that simulated lifting a 10 kg box from a 0.3 m height. The solution minimized the total sum of the normalized squared active and passive muscle torques and the normalized passive hamstring muscle forces over the duration of the motion. The predicted motion of the average lifter agreed well with experimental data in the literature. The change in model flexibility affected the predicted joint angles, with the stiffer models flexing more at the hip and knee, and less at the lumbar joint, to complete the lift. Stiffer models produced similar passive lumbar torque and higher hamstring muscle force components than the more flexible models. The variation between the motion characteristics of the models suggests that flexibility may play an important role in determining lifting technique.

2.
Kinematic and center of mass (CoM) mechanical variables used to define terrestrial gaits are compared for various tetrapod species. Kinematic variables (limb phase, duty factor) provide important timing information regarding the neural control and limb coordination of various gaits, whereas mechanical variables (potential and kinetic energy relative phase, %Recovery, %Congruity) provide insight into the underlying mechanisms that minimize muscle work and the metabolic cost of locomotion, and also influence neural control strategies. Two basic mechanisms identified by Cavagna et al. (1977. Am J Physiol 233:R243-R261) are used broadly by various bipedal and quadrupedal species. During walking, animals exchange CoM potential energy (PE) with kinetic energy (KE) via an inverted-pendulum mechanism to reduce muscle work. During the stance period of running (including trotting, hopping and galloping) gaits, animals convert PE and KE into elastic strain energy in spring elements of the limbs and trunk and regain this energy later during limb support. The bouncing motion of the body on the support limb(s) is well represented by a simple mass-spring system. Limb spring compliance allows the storage and return of elastic energy to reduce muscle work. These two distinct patterns of CoM mechanical energy exchange are fairly well correlated with kinematic distinctions of limb movement patterns associated with gait change. However, in some cases such correlations can be misleading. When running (or trotting) at low speeds, many animals lack an aerial period and have limb duty factors that exceed 0.5. Rather than interpreting this as a change of gait, the underlying mechanics of the body's CoM motion indicate that no fundamental change in limb movement pattern or CoM dynamics has occurred.
Nevertheless, the idealized, distinctive patterns of CoM energy fluctuation predicted by an inverted pendulum for walking and a bouncing mass-spring for running are often not clear cut, especially for less cursorial species. When the kinematic and mechanical patterns of a broader diversity of quadrupeds and bipeds are compared, more complex patterns emerge, indicating that some animals may combine walking and running mechanics at intermediate speeds or at very large size. These models also ignore energy costs that are likely associated with the opposing action of limbs that have overlapping support times during walking. A recent model of terrestrial gait (Ruina et al., 2005. J Theor Biol, in press) that treats limb contact with the ground in terms of collisional energy loss indicates that considerable CoM energy can be conserved simply by matching the path of CoM motion perpendicular to the limb ground force. This model, coupled with the earlier ones of pendular exchange during walking and mass-spring elastic energy savings during running, provides a compelling argument for the view that the legged locomotion of quadrupeds and other terrestrial animals has generally evolved to minimize muscle work during steady level movement.
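The %Recovery variable above quantifies pendular PE–KE exchange as the fraction of positive external work saved by out-of-phase energy fluctuations. A minimal pure-Python sketch of the calculation on toy sinusoidal data (the function name and test signals are illustrative, not taken from the paper):

```python
import math

def percent_recovery(pe, ke):
    """Cavagna-style %Recovery from CoM potential- and kinetic-energy
    time series: 100 * (W_pe + W_ke - W_tot) / (W_pe + W_ke), where each
    W is the sum of positive increments of the corresponding series.
    High values mean PE and KE fluctuate out of phase (pendular walking);
    values near zero mean they fluctuate in phase (bouncing gaits)."""
    tot = [p + k for p, k in zip(pe, ke)]

    def pos_work(series):
        # Sum only the positive (energy-raising) increments
        return sum(max(b - a, 0.0) for a, b in zip(series, series[1:]))

    w_pe, w_ke, w_tot = pos_work(pe), pos_work(ke), pos_work(tot)
    return 100.0 * (w_pe + w_ke - w_tot) / (w_pe + w_ke)

# Perfectly out-of-phase toy fluctuations: full pendular exchange
t = [i / 100.0 for i in range(101)]
pe = [math.sin(2 * math.pi * x) for x in t]
ke = [-math.sin(2 * math.pi * x) for x in t]
print(round(percent_recovery(pe, ke)))  # → 100 (in-phase signals give 0)
```

With in-phase PE and KE, as during trotting or running, the same function returns values near zero, reproducing the walking/running dichotomy described above.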

3.
Time histories of neuromuscular and mechanical variables of human motion are often compared by using discrete timing events (onset, offset, time to peak, zero crossing, etc.). The determination of these discrete timing points is often subjective and their interpretation can cause confusion when attempting to compare patterns. In this technical note, cross-correlation and the 95% confidence interval of its maximum value are proposed as an objective means of pattern recognition and comparison. EMG patterns of cycling at different cadences were used as an example to demonstrate the effectiveness of this cross-correlation method in identifying changes between conditions. Using a standard method of threshold identification, different onset and offset values can be found by using different thresholds, and the sequence of the offset timings between conditions can change. This is a clear indication of the inherent subjectivity of these discrete timing methods. In contrast, calculating the cross-correlation for incremental phase shifts permits the identification of a maximal value that is an objective measure of the actual phase shift between the two time series. Further, calculation of the 95% confidence interval allows one to determine whether the phase shift is statistically significant. The application of this method is not limited to EMG pattern comparison, and can also be applied to other time histories such as kinematic and kinetic parameters of human motion.
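A sketch of the proposed approach: compute the normalized cross-correlation over all phase shifts and take its maximum as the objective measure of shift between two patterns. This toy version assumes cyclical signals (reasonable for cycling EMG envelopes) and omits the paper's 95% confidence-interval step:

```python
import math

def xcorr_max(a, b):
    """Normalized cross-correlation of two equal-length cyclical patterns
    over all circular phase shifts; returns (max correlation, lag in samples)."""
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    da = [x - mean_a for x in a]
    db = [x - mean_b for x in b]
    denom = math.sqrt(sum(x * x for x in da) * sum(x * x for x in db))
    best_r, best_lag = -2.0, 0
    for lag in range(n):
        r = sum(da[i] * db[(i + lag) % n] for i in range(n)) / denom
        if r > best_r:
            best_r, best_lag = r, lag
    return best_r, best_lag

# A pattern and a copy shifted by a quarter cycle: the correlation is
# maximal (≈1) at the lag that undoes the shift, with no threshold choice.
sig = [math.sin(2 * math.pi * i / 100) for i in range(100)]
shifted = sig[25:] + sig[:25]
r, lag = xcorr_max(sig, shifted)
print(round(r, 3), lag)  # → 1.0 75
```

Unlike onset/offset thresholding, the result does not depend on an arbitrary amplitude cutoff, which is the note's central argument.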

4.
Cortical motion analysis continuously encodes image velocity but might also be used to predict future patterns of sensory input along the motion path. We asked whether this predictive aspect of motion is exploited by the human visual system. Targets can be more easily detected at the leading as compared to the trailing edge of motion [1], but this effect has been attributed to a nonspecific boost in contrast gain at the leading edge, linked to motion-induced shifts in spatial position [1-4]. Here we show that the detectability of a local sinusoidal target presented at the ends of a region containing motion is phase dependent at the leading edge, but not at the trailing edge. These two observations rule out a simple gain control mechanism that modulates contrast energy and passive filtering explanations, respectively. By manipulating the relative orientation of the moving pattern and target, we demonstrate that the resulting spatial variation in detection threshold along the edge closely resembles the superposition of sensory input and an internally generated predicted signal. These findings show that motion induces a forward prediction of spatial pattern that combines with the cortical representation of the future stimulus.

5.
ProteoCat is a computer program designed to help researchers plan large-scale proteomic experiments. The central part of the program is a hydrolysis-simulation unit that supports four proteases (trypsin, lysine C, and endoproteinases Asp-N and GluC). For peptides obtained after virtual hydrolysis or loaded from data files, a number of properties important in mass-spectrometric experiments can be calculated and predicted; the resulting data can be analyzed or filtered (to reduce a set of peptides). The program uses new, improved modifications of the authors' earlier methods for pI prediction; pI can also be predicted using popular pKa scales proposed by other researchers. The algorithm for predicting peptide retention time is implemented similarly to the algorithm used in the SSRCalc program. Using ProteoCat, it is possible to estimate the coverage of the amino acid sequences of analyzed proteins under defined limitations on peptide detection, as well as the possibility of assembling peptide fragments with user-defined minimal sizes of “sticky” ends. The program has a graphical user interface, is written in Java, and is available at http://www.ibmc.msk.ru/LPCIT/ProteoCat.
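The hydrolysis-simulation step can be illustrated with a toy in-silico tryptic digest. This sketch assumes the classical trypsin rule (cleave C-terminal to K or R, except when the next residue is P); ProteoCat's actual rules for its four proteases may differ in detail, and the sequence below is an invented example:

```python
def tryptic_digest(seq, missed=0):
    """Virtual tryptic digest of a protein sequence: cut after K or R
    unless followed by P, optionally keeping peptides that span up to
    `missed` uncleaved sites (missed cleavages)."""
    # Indices immediately after each cleavable K/R
    sites = [i + 1 for i, aa in enumerate(seq[:-1])
             if aa in "KR" and seq[i + 1] != "P"]
    bounds = [0] + sites + [len(seq)]
    peptides = []
    for m in range(missed + 1):
        for i in range(len(bounds) - 1 - m):
            peptides.append(seq[bounds[i]:bounds[i + 1 + m]])
    return peptides

print(tryptic_digest("MKWVTFISLLFLFSSAYSRGVFRR"))
# → ['MK', 'WVTFISLLFLFSSAYSR', 'GVFR', 'R']
```

Downstream, each peptide's mass, pI, and predicted retention time would be computed and filtered, as the abstract describes.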

6.
Climate change poses a serious threat to biodiversity. Predicting the effects of climate change on the distribution of a species' habitat can help address potential threats that may change the extent and distribution of that species. Pterocarya stenoptera is a common fast‐growing tree species often used in the ecological restoration of riverbanks and alpine forests in central and eastern China. Until now, the characteristics of the distribution of this species' habitat have been poorly known, as have the environmental factors that influence its preferred habitat. In the present study, the Maximum Entropy Modeling (Maxent) algorithm and the Genetic Algorithm for Ruleset Production (GARP) were used to establish models of the potential distribution of this species, using 236 sites with known occurrences and 14 environmental variables. The results indicate that both models have good predictive power. Minimum temperature of the coldest month (Bio6), mean temperature of the warmest quarter (Bio10), annual precipitation (Bio12), and precipitation of the driest month (Bio14) were important environmental variables influencing the prediction of the Maxent model. According to the models, the temperate and subtropical regions of eastern China, where the species had been recorded, had high environmental suitability for this species. Under each climate change scenario, the climatic suitability of the existing range of this species increased, and its climatic niche expanded geographically to the north and to higher elevations. GARP predicted a more conservative expansion. The projected spatial and temporal patterns of P. stenoptera can provide a reference for the development of forest management and protection strategies.

7.
8.
Despite ‘abnormal’ motion being considered a risk factor for low back injury, the current understanding of ‘normal’ spine motion is limited. Identifying normal motion within an individual is complicated by the considerable variation in movement patterns amongst healthy individuals. Therefore, the purpose of this study was to characterize sources of variation in spine motion among a sample of healthy participants. The second objective of this study was to develop a multivariate model capable of predicting an expected movement pattern for an individual. The kinematic shape of the lower thoracic and lumbar spine was recorded during a constrained dynamic trunk flexion movement; as this is not a normal everyday movement task, movements are considered ‘typical’ and ‘atypical’ for this task rather than ‘normal’ and ‘abnormal’. Variations in neutral standing posture accounted for 85% of the variation in spine motion throughout the task. Differences in total spine range of flexion and a regional re-weighting of range of motion between lower thoracic and lumbar regions explained a further 9% of the variance among individuals. The analysis also highlighted a difference in the temporal sequencing of motion between lower thoracic and lumbar regions, which explained 2% of the total movement variation. These identified sources of variation were used to select independent variables for a multivariate linear model capable of predicting an individual's expected movement pattern. This was done as a proof of concept to demonstrate how the error between predicted and observed motion patterns could be used to differentiate between ‘typical’ and ‘atypical’ movement strategies.
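The decomposition of movement variation into ranked sources is the kind of result a principal component analysis of motion waveforms yields. A hedged numpy sketch on synthetic data (the planted "posture offset" and "flexion depth" modes and their magnitudes are invented, chosen only so that two modes dominate, loosely mirroring the sources described above):

```python
import numpy as np

# Toy "motion matrix": one row per participant, columns are a spine-angle
# waveform over the flexion task. Built from two planted modes plus noise.
rng = np.random.default_rng(42)
n_subj, n_time = 30, 50
phase = np.linspace(0.0, np.pi, n_time)
offset = rng.normal(0.0, 8.0, n_subj)   # stand-in for neutral-posture differences
depth = rng.normal(0.0, 2.0, n_subj)    # stand-in for range-of-flexion differences
motions = (offset[:, None]
           + depth[:, None] * np.sin(phase)
           + rng.normal(0.0, 0.2, (n_subj, n_time)))

# PCA via SVD of the centered matrix; squared singular values give the
# fraction of total movement variation explained by each mode.
centered = motions - motions.mean(axis=0)
_, s, _ = np.linalg.svd(centered, full_matrices=False)
explained = s ** 2 / (s ** 2).sum()
print(explained[:2].sum() > 0.95)  # → True: two modes capture almost everything
```

The per-mode fractions play the role of the 85%/9%/2% breakdown reported in the abstract, and the leading modes supply the independent variables for a predictive linear model.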

9.
RNA secondary structure is often predicted from sequence by free energy minimization. Over the past two years, advances have been made in the estimation of folding free energy change, the mapping of secondary structure and the implementation of computer programs for structure prediction. The trends in computer program development are: efficient use of experimental mapping of structures to constrain structure prediction; use of statistical mechanics to improve the fidelity of structure prediction; inclusion of pseudoknots in secondary structure prediction; and use of two or more homologous sequences to find a common structure.

10.
Differences in motion patterns subserving the same movement goal can be identified qualitatively. These alternatives, which may characterize 'movement techniques' (e.g., the stoop and the squat lifting technique), may be associated with significantly different biomechanical constraints and physiological responses. Despite the widely shared understanding of the significance of alternative movement techniques, quantitative representation and identification of movement techniques have received little attention, especially for three-dimensional whole-body motions. In an attempt to systematically differentiate movement techniques, this study introduces a quantitative index termed joint contribution vector (JCV) representing a motion in terms of contributions of individual joint degrees-of-freedom to the achievement of the task goal. Given a set of uncharacterized (unlabeled) motions represented by joint angle trajectories (motion capture data), the JCV and statistical clustering methods enable automated motion classification to uncover a taxonomy of alternative movement techniques. The results of our motion data analyses show that the JCV was able to characterize and discern stoop and squat lifting motions, and also to identify movement techniques for a three-dimensional, whole-body, one-handed load-transfer task. The JCV index would facilitate consideration of alternative movement techniques in a variety of applications, including work method comparison and selection, and human motion modeling and simulation.
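A toy version of the idea: apportion each joint's share of the motion and compare shares across techniques. This sketch scores contribution by raw angular excursion, a simplification of the paper's JCV (which apportions contributions to the task goal); the joint names and angle values are invented:

```python
def joint_contribution_vector(trajectories):
    """Share of total joint excursion contributed by each joint DoF,
    given a dict mapping joint name -> list of joint angles (degrees)."""
    excursion = {j: max(a) - min(a) for j, a in trajectories.items()}
    total = sum(excursion.values())
    return {j: e / total for j, e in excursion.items()}

# Toy joint-angle trajectories: stoop bends the spine, squat the knees
stoop = {"lumbar": [0, 30, 60], "knee": [0, 5, 10], "hip": [0, 30, 60]}
squat = {"lumbar": [0, 5, 10], "knee": [0, 45, 90], "hip": [0, 30, 60]}
jcv_stoop = joint_contribution_vector(stoop)
jcv_squat = joint_contribution_vector(squat)

# The vectors separate the two techniques along the lumbar/knee axes
print(jcv_stoop["lumbar"] > jcv_stoop["knee"])  # → True
print(jcv_squat["knee"] > jcv_squat["lumbar"])  # → True
```

Feeding such vectors to a standard clustering algorithm (k-means, hierarchical) is what lets unlabeled motions be sorted into a taxonomy of techniques, as the abstract describes.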

11.
There is currently no validated full-body lifting model publicly available on the OpenSim modelling platform to estimate spinal loads during lifting. In this study, the existing full-body-lumbar-spine model was adapted and validated for lifting motions to produce the lifting full-body model. Back muscle activations predicted by the model closely matched the measured erector spinae activation patterns. Model estimates of intradiscal pressures and in vivo measurements were strongly correlated. The same spine loading trends were observed for model estimates and reported vertebral body implant measurements. These results demonstrate the suitability of this model to evaluate changes in lumbar loading during lifting.

12.
Accurate measurement of knee-joint kinematics is critical for understanding the biomechanical function of the knee in vivo. Measurements of the relative movements of the bones at the knee are often used in inverse dynamics analyses to estimate the net muscle torques exerted about the joint, and as inputs to finite-element models to accurately assess joint contact. The fine joint translations that contribute to patterns of joint stress are impossible to measure accurately using traditional video-based motion capture techniques. Sub-millimetre changes in joint translation can mean the difference between contact and no contact of the cartilage tissue, leading to incorrect predictions of joint loading. This paper describes the use of low-dose X-ray fluoroscopy, an in vivo dynamic imaging modality that is finding increasing application in human joint motion measurement. Specifically, we describe a framework that integrates traditional motion capture, X-ray fluoroscopy and anatomically-based finite-element modelling for the purpose of assessing joint function during dynamic activity. We illustrate our methodology by applying it to study patellofemoral joint function, wherein the relative movements of the patella are predicted and the corresponding joint-contact stresses are calculated for a step-up task.

13.
Assessment of survival prediction models based on microarray data
MOTIVATION: In the process of developing risk prediction models, various steps of model building and model selection are involved. If this process is not adequately controlled, overfitting may result in serious overoptimism leading to potentially erroneous conclusions. METHODS: For right censored time-to-event data, we estimate the prediction error for assessing the performance of a risk prediction model (Gerds and Schumacher, 2006; Graf et al., 1999). Furthermore, resampling methods are used to detect overfitting and resulting overoptimism and to adjust the estimates of prediction error (Gerds and Schumacher, 2007). RESULTS: We show how and to what extent the methodology can be used in situations characterized by a large number of potential predictor variables where overfitting may be expected to be overwhelming. This is illustrated by estimating the prediction error of some recently proposed techniques for fitting a multivariate Cox regression model applied to the data of a prognostic study in patients with diffuse large-B-cell lymphoma (DLBCL). AVAILABILITY: Resampling-based estimation of prediction error curves is implemented in an R package called pec available from the authors.

14.
Research is needed to create early warnings of dengue outbreaks to inform stakeholders and control the disease. This analysis compares a set of prediction models: models including only meteorological variables, models including only lagged disease-surveillance variables, and models combining meteorological and lagged surveillance variables. Generalized linear regression models were used to fit relationships between the predictor variables and the dengue surveillance data as the outcome variable on the basis of data from 2001 to 2010. Data from 2011 to 2013 were used for external validation of the model's prediction accuracy. Model fit was evaluated based on prediction performance in detecting epidemics and on the number of predicted cases, according to RMSE and SRMSE, as well as AIC. An optimal combination of meteorology and autoregressive lag terms of past dengue counts was identified as best for predicting dengue incidence and the occurrence of dengue epidemics. Past disease surveillance data, as a predictor alone, gave visually reasonably accurate results for outbreak periods, but not for non-outbreak periods. A combination of surveillance and meteorological data, including lag patterns up to a few years in the past, was most predictive of dengue incidence and occurrence in Yogyakarta, Indonesia. The external validation showed poorer results than the internal validation, but still showed skill in detecting outbreaks up to two months ahead. Prior studies support the fact that past meteorology and surveillance data can be predictive of dengue. However, prior research has shown to a lesser extent how longer-term past disease incidence data, up to years back, can play a role in predicting outbreaks in coming years, possibly indicating the cross-immunity status of the population.
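The model comparison above hinges on building design matrices from lagged predictors. A minimal sketch of the lag-matrix construction (the lag choices, function name, and toy monthly data are illustrative, not those of the study; the fitted GLM itself is omitted):

```python
def lagged_design(cases, meteo, case_lags=(1, 2, 12), meteo_lags=(1, 2)):
    """Pair each month's case count (target) with autoregressive lags of
    past cases and lagged meteorology (predictors). Returns (X, y)."""
    max_lag = max(case_lags + meteo_lags)
    X, y = [], []
    for t in range(max_lag, len(cases)):
        row = [cases[t - lag] for lag in case_lags]     # surveillance lags
        row += [meteo[t - lag] for lag in meteo_lags]   # meteorological lags
        X.append(row)
        y.append(cases[t])
    return X, y

cases = list(range(1, 25))                 # 24 months of toy counts
meteo = [20 + m % 12 for m in range(24)]   # toy monthly temperature
X, y = lagged_design(cases, meteo)
print(len(X), len(X[0]))  # → 12 5
```

Dropping the meteorological columns, or the case-lag columns, produces the surveillance-only and meteorology-only variants whose predictive skill the study compares.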

15.
16.
Correlative species distribution models are frequently used to predict species’ range shifts under climate change. However, climate variables often show high collinearity, and most statistical approaches require the selection of one among strongly correlated variables. When causal relationships between species presence and climate parameters are unknown, variable selection is often arbitrary, or based on predictive performance under current conditions. While this should only marginally affect current range predictions, future distributions may vary considerably when climate parameters do not change in concert. We investigated this source of uncertainty using four highly correlated climate variables together with a constant set of landscape variables in order to predict current (2010) and future (2050) distributions of four mountain bird species in central Europe. Simulating different parameterization decisions, we generated a) four models including each of the climate variables singly, b) a model taking advantage of all variables simultaneously and c) an un‐weighted average of the predictions of a). We compared model accuracy under current conditions, predicted distributions under four scenarios of climate change, and – for one species – evaluated back‐projections using historical occurrence data. Although current and future variable‐correlations remained constant, and the models’ accuracy under contemporary conditions did not differ, future range predictions varied considerably in all climate change scenarios. Averaged models and models containing all climate variables simultaneously produced intermediate predictions; the latter, however, performed best in back‐projections. This pattern, consistent across different modelling methods, indicates a benefit from including multiple climate predictors in ambiguous situations.
Variable selection proved to be an important source of uncertainty for future range predictions, and one that is difficult to control using contemporary information. Small but diverging changes of climate variables, masked by constant overall correlation patterns, can cause substantial differences between future range predictions which need to be accounted for, particularly when outcomes are intended for conservation decisions.

17.
Species distribution models (SDMs) that rely on regional‐scale environmental variables will play a key role in forecasting species occurrence in the face of climate change. However, in the Anthropocene, a number of local‐scale anthropogenic variables, including wildfire history, land‐use change, invasive species, and ecological restoration practices can override regional‐scale variables to drive patterns of species distribution. Incorporating these human‐induced factors into SDMs remains a major research challenge, in part because spatial variability in these factors occurs at fine scales, rendering prediction over regional extents problematic. Here, we used big sagebrush (Artemisia tridentata Nutt.) as a model species to explore whether including human‐induced factors improves the fit of the SDM. We applied a Bayesian hurdle spatial approach using 21,753 data points of field‐sampled vegetation obtained from the LANDFIRE program to model sagebrush occurrence and cover by incorporating fire history metrics and restoration treatments from 1980 to 2015 throughout the Great Basin of North America. Models including fire attributes and restoration treatments performed better than those including only climate and topographic variables. Number of fires and fire occurrence had the strongest relative effects on big sagebrush occurrence and cover, respectively. The models predicted that the probability of big sagebrush occurrence decreases by 1.2% (95% CI: −6.9%, 0.6%) when one fire occurs and cover decreases by 44.7% (95% CI: −47.9%, −41.3%) if at least one fire occurred over the 36-year period of record. Restoration practices increased the probability of big sagebrush occurrence but had minimal effect on cover. Our results demonstrate the potential value of including disturbance and land management along with climate in models to predict species distributions.
As an increasing number of datasets representing land‐use history become available, we anticipate that our modeling framework will have broad relevance across a range of biomes and species.

18.
Multiple linear regression analyses (also often referred to as generalized linear models – GLMs, or generalized linear mixed models – GLMMs) are widely used in the analysis of data in molecular ecology, often to assess the relative effects of genetic characteristics on individual fitness or traits, or how environmental characteristics influence patterns of genetic differentiation. However, the coefficients resulting from multiple regression analyses are sometimes misinterpreted, which can lead to incorrect interpretations and conclusions within individual studies, and can propagate into more widespread errors in the general understanding of a topic. The primary issue revolves around the interpretation of coefficients for independent variables when interaction terms are also included in the analyses. In this scenario, the coefficients associated with each independent variable are often interpreted as the independent effect of each predictor variable on the predicted variable. However, this interpretation is incorrect. The correct interpretation is that these coefficients represent the effect of each predictor variable on the predicted variable when all other predictor variables are zero. This difference may sound subtle, but the ramifications cannot be overstated. Here, my goals are to raise awareness of this issue, to demonstrate and emphasize the problems that can result, and to provide alternative approaches for obtaining the desired information.
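The point can be made concrete with a small numpy example: data generated from a known model with an interaction show that the raw x1 coefficient recovers the slope of x1 at x2 = 0, while centering x2 shifts it to the slope at the average x2. The variable names and true coefficients below are invented for illustration:

```python
import numpy as np

# Deterministic toy data from y = 1 + 2*x1 + 3*x2 + 4*x1*x2,
# so the "true" coefficients are known exactly.
rng = np.random.default_rng(0)
x1 = rng.uniform(0.0, 1.0, 200)
x2 = rng.uniform(0.0, 2.0, 200)
y = 1 + 2 * x1 + 3 * x2 + 4 * x1 * x2

def fit(a, b):
    """OLS fit of y on [intercept, a, b, a*b]; returns coefficients."""
    X = np.column_stack([np.ones_like(a), a, b, a * b])
    return np.linalg.lstsq(X, y, rcond=None)[0]

# With raw predictors, the x1 coefficient is the slope of x1 when
# x2 == 0 -- NOT an "independent effect" of x1:
b_raw = fit(x1, x2)
print(round(b_raw[1], 3))  # → 2.0

# After mean-centering x2, the x1 coefficient becomes the slope of x1
# at the *average* x2 (here 2 + 4*mean(x2)), which is usually closer
# to what authors intend to report:
b_cen = fit(x1, x2 - x2.mean())
print(round(float(b_cen[1]), 3))
```

Centering is one of the "alternative approaches" in the spirit of the abstract: it changes which conditional slope the main-effect coefficient represents without changing the fitted model.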

19.
Fermentation database mining by pattern recognition
A large volume of data is routinely collected during the course of typical fermentation and other processes. Such data provide the required basis for process documentation and occasionally are also used for process analysis and improvement. The information density of these data is often low, and automatic condensing, analysis, and interpretation ("database mining") are highly desirable. In this article we present a methodology whereby process variables are processed to create a database of derivative process quantities representative of the global patterns, intermediate trends, and local characteristics of the process. A powerful search algorithm subsequently attempts to extract the specific process variables and their particular attributes that uniquely characterize a class of process outcomes such as high- or low-yield fermentations. The basic components of our pattern recognition methodology are described along with applications to the analysis of two sets of data from industrial fermentations. Results indicate that truly discriminating variables do exist in typical fermentation data and they can be useful in identifying the causes or symptoms of different process outcomes. The methodology has been implemented in user-friendly software, named db-miner, which facilitates the application of the methodology for efficient and speedy analysis of fermentation process data. © 1997 John Wiley & Sons, Inc. Biotechnol Bioeng 53: 443-452, 1997.

20.
Molecular portraits, such as mRNA expression or DNA methylation patterns, have been shown to be strongly correlated with phenotypical parameters. These molecular patterns can be revealed routinely on a genomic scale. However, class prediction based on these patterns is an under-determined problem, due to the extreme high dimensionality of the data compared to the usually small number of available samples. This makes a reduction of the data dimensionality necessary. Here we demonstrate how phenotypic classes can be predicted by combining feature selection and discriminant analysis. By comparing several feature selection methods we show that the right dimension reduction strategy is of crucial importance for the classification performance. The techniques are demonstrated by methylation pattern based discrimination between acute lymphoblastic leukemia and acute myeloid leukemia.
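The pipeline described — rank features, keep a few, then apply a discriminant rule — can be sketched in a few lines. This toy uses absolute centroid separation for feature selection and a nearest-centroid classifier as a minimal stand-in for discriminant analysis; the data and the 'ALL'/'AML' labels are invented, not the study's methylation data:

```python
def nearest_centroid_with_selection(train, labels, test, k=2):
    """Select the k features with the largest between-class centroid
    separation, then classify test rows by the nearest class centroid
    in that reduced feature space. Assumes exactly two classes."""
    classes = sorted(set(labels))

    def centroid(cls, feats):
        rows = [x for x, lab in zip(train, labels) if lab == cls]
        return [sum(r[f] for r in rows) / len(rows) for f in feats]

    n_feat = len(train[0])
    c0 = centroid(classes[0], range(n_feat))
    c1 = centroid(classes[1], range(n_feat))
    # Dimension reduction: keep the k most separated features
    ranked = sorted(range(n_feat), key=lambda f: -abs(c0[f] - c1[f]))
    feats = sorted(ranked[:k])
    cents = {cls: centroid(cls, feats) for cls in classes}
    preds = [min(classes,
                 key=lambda cls: sum((x[f] - c) ** 2
                                     for f, c in zip(feats, cents[cls])))
             for x in test]
    return feats, preds

# Feature 1 is pure noise; features 0 and 2 separate the classes
train = [[0, 5, 0], [0, 6, 1], [10, 5, 10], [10, 6, 9]]
labels = ["ALL", "ALL", "AML", "AML"]
test = [[1, 9, 1], [9, 0, 9]]
feats, preds = nearest_centroid_with_selection(train, labels, test)
print(feats, preds)  # → [0, 2] ['ALL', 'AML']
```

The abstract's point is visible even here: with thousands of features and few samples, the selection step (which features survive) drives classification performance at least as much as the discriminant rule itself.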


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.) · 京ICP备09084417号