Similar Articles
 20 similar articles found
1.

Background

Glaucoma is a progressive eye disease and a leading cause of visual disability. Automated assessment of the visual field determines the different stages in the disease process: it would be desirable to link these measurements taken in the clinic with patients' actual function, or to establish whether patients compensate for their restricted field of view when performing everyday tasks. Hence, this study investigated eye movements in glaucomatous patients viewing driving scenes in a hazard perception test (HPT).

Methodology/Principal Findings

The HPT is a component of the UK driving licence test consisting of a series of short film clips of various traffic scenes viewed from the driver's perspective, each containing hazardous situations that require the camera car to change direction or slow down. Data from nine glaucomatous patients with binocular visual field defects and ten age-matched control subjects were considered (all experienced drivers). Each subject viewed 26 different films with eye movements simultaneously monitored by an eye tracker. Computer software was purpose-written to pre-process the data, co-register it to the film clips, and quantify eye movements and point-of-regard (using a dynamic bivariate contour ellipse analysis). On average, and across all HPT films, patients exhibited different eye movement characteristics from controls, making, for example, significantly more saccades (P<0.001; 95% confidence interval for mean increase: 9.2 to 22.4%). Whilst the average region of ‘point-of-regard’ of the patients did not differ significantly from that of the controls, there were revealing cases where patients failed to see a hazard in relation to their binocular visual field defect.
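
As a rough illustration of how a bivariate contour ellipse area (BCEA) summarising point-of-regard could be computed from eye-tracking samples, consider the sketch below. This is a minimal, assumed implementation of the standard static BCEA formula (BCEA = 2kπσHσV√(1−ρ²), with k = −ln(1−P)); the study's "dynamic" variant and its window parameters are not described here.

```python
import numpy as np

def bcea(x, y, p=0.68):
    """Bivariate contour ellipse area enclosing proportion `p` of gaze samples.

    Standard formulation: BCEA = 2*k*pi*sx*sy*sqrt(1 - rho^2), with k = -ln(1 - p).
    `x`, `y` are horizontal/vertical point-of-regard coordinates (e.g. degrees).
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    sx, sy = x.std(ddof=1), y.std(ddof=1)          # sample standard deviations
    rho = np.corrcoef(x, y)[0, 1]                  # horizontal/vertical correlation
    k = -np.log(1.0 - p)
    return 2.0 * k * np.pi * sx * sy * np.sqrt(1.0 - rho ** 2)

# Example with simulated fixation samples (in degrees of visual angle)
rng = np.random.default_rng(0)
gx = rng.normal(0.0, 0.5, 300)
gy = rng.normal(0.0, 0.4, 300)
print(f"BCEA ~= {bcea(gx, gy):.2f} deg^2")
```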

Conclusions/Significance

Characteristics of eye movement patterns in patients with bilateral glaucoma can differ significantly from those of age-matched controls when viewing a traffic scene. Further studies of eye movements made by glaucomatous patients could provide useful information about the definition of the visual field component required for fitness to drive.

2.

Introduction

The FRAX calculator combines a set of clinical risk factors with country-specific incidence rates to determine the ten-year absolute risk of major osteoporotic fracture. However, regional or country-specific databases from Central American countries are not available. We compared the use of various FRAX databases and the Pluijm algorithm in determining risk of fracture.

Methods

We collected the clinical risk factor data needed for the FRAX calculator and the Pluijm algorithm from Hispanic women in Guatemala and calculated the FRAX absolute risk of major osteoporotic fracture and hip fracture. Subjects were postmenopausal women older than 40 years with no history of using medications that affect bone. A random sample of 204 women from 34 different regions of Guatemala City was visited at home to complete the surveys. The Pluijm risk score and the FRAX risk score using the US Hispanic, Spain, and Mexico databases were calculated.

Results

We used the US NOF guidelines for treatment, which suggest a treatment threshold for patients with a 10-year hip fracture probability ≥3% or a 10-year major osteoporotic fracture risk ≥20%. The number of patients meeting the suggested threshold limits for treatment using the Spain and Mexico calculators was identical; there was 100% conformity in threshold limits for both hip and major osteoporotic fracture risk. The mean conformity for any fracture risk between US Hispanic and the other two databases was 97.5%. Conformity was 99.0% based on major osteoporotic fracture and 97.5% based on risk of hip fracture. The Pluijm evaluation showed conformity of 87.2% and 83.3%, respectively, when compared with the US Hispanic and Spain/Mexico FRAX thresholds for risk of fracture.
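
The NOF rule used here reduces to a simple check on the two FRAX outputs. The sketch below is illustrative only; the function name and inputs are not from the study.

```python
def meets_nof_treatment_threshold(hip_fracture_risk_pct, major_osteoporotic_risk_pct):
    """US NOF rule applied in this study: treat if the 10-year hip fracture
    probability is >= 3% or the 10-year major osteoporotic fracture risk is >= 20%."""
    return hip_fracture_risk_pct >= 3.0 or major_osteoporotic_risk_pct >= 20.0

# Example: 2.1% hip risk but 22% major osteoporotic fracture risk meets the threshold
print(meets_nof_treatment_threshold(2.1, 22.0))  # True
```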

Discussion

Although the different FRAX databases yield variations in the absolute risk of fracture, the overall conformity to treatment thresholds amongst the US Hispanic, Spain, and Mexico databases shows that the choice of database would have little effect on the decision to treat. The Pluijm tool conforms to the FRAX thresholds and can be used as well. It does not matter which country-specific calculator or assessment tool is used, as a similar number of patients would meet the intervention threshold.

3.
A peripherally presented target embedded in dynamic texture perceptually disappears (or 'fills-in') after around 10 s of steady fixation. This phenomenon was investigated for a target containing moving dots. The effects of manipulating the coherence of the motion within the target and the density of dots across the whole screen were explored. Coherence thresholds for the detection of a target at different dot densities were recorded for comparison. Fading occurred faster as either motion coherence or dot density was reduced. Coherence thresholds for target detection were unaffected by manipulations of dot density. There appeared to be no relationship between the stimulus exposure time needed for fading and the coherence threshold for detection of a target. The results suggest that the time taken for a target to fade is not a simple function of its motion detection threshold.

4.

Background

Face processing, amongst many basic visual skills, is thought to be invariant across all humans. From as early as 1965, studies of eye movements have consistently revealed a systematic triangular sequence of fixations over the eyes and the mouth, suggesting that faces elicit a universal, biologically-determined information extraction pattern.

Methodology/Principal Findings

Here we monitored the eye movements of Western Caucasian and East Asian observers while they learned, recognized, and categorized by race Western Caucasian and East Asian faces. Western Caucasian observers reproduced a scattered triangular pattern of fixations for faces of both races and across tasks. Contrary to intuition, East Asian observers focused more on the central region of the face.

Conclusions/Significance

These results demonstrate that face processing can no longer be considered as arising from a universal series of perceptual events. The strategy employed to extract visual information from faces differs across cultures.

5.
Li RW, MacKeben M, Chat SW, Kumar M, Ngo C, Levi DM. PLoS ONE. 2010;5(10):e13434.

Background

Much previous work on how normal aging affects visual enumeration has been focused on the response time required to enumerate, with unlimited stimulus duration. There is a fundamental question, not yet addressed, of how many visual items the aging visual system can enumerate in a “single glance”, without the confounding influence of eye movements.

Methodology/Principal Findings

We recruited 104 observers with normal vision across the age span (age 21–85). They were briefly (200 ms) presented with a number of well-separated black dots against a gray background on a monitor screen and were asked to judge the number of dots. By limiting the stimulus presentation time, we could determine the maximum number of visual items an observer can correctly enumerate at a criterion level of performance (the counting threshold, defined as the number of visual items at which performance reaches ≈63% correct on a psychometric curve), without confounding by eye movements. Our findings reveal a 30% decrease in the mean counting threshold of the oldest group (age 61–85: ∼5 dots) when compared with the youngest group (age 21–40: 7 dots). Surprisingly, despite the decreased counting threshold, the average counting accuracy function (defined as the mean number of dots reported for each number tested) is largely unaffected by age, indicating that the threshold loss can be primarily attributed to increased random errors. We further expanded this finding to show that both young and old adults tend to over-count small numbers, but older observers over-count more.
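
The counting threshold described above is the set size at which percent correct crosses ≈63% on the psychometric curve. A minimal sketch of that idea, using simple linear interpolation over illustrative data rather than the study's actual fitting procedure:

```python
import numpy as np

def counting_threshold(dot_counts, prop_correct, criterion=0.63):
    """Interpolate the number of dots at which proportion correct falls to `criterion`.

    Assumes `dot_counts` is increasing and `prop_correct` decreases with set size.
    """
    dot_counts = np.asarray(dot_counts, float)
    prop_correct = np.asarray(prop_correct, float)
    # np.interp needs increasing x, so interpolate on the reversed arrays
    return float(np.interp(criterion, prop_correct[::-1], dot_counts[::-1]))

# Illustrative data: accuracy drops as more dots are shown for 200 ms
dots = [3, 4, 5, 6, 7, 8, 9]
pc   = [1.00, 0.98, 0.92, 0.80, 0.63, 0.45, 0.30]
print(f"Counting threshold ~= {counting_threshold(dots, pc):.1f} dots")
```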

Conclusion/Significance

Here we show that age reduces the ability to correctly enumerate in a glance, but the accuracy (veridicality), on average, remains unchanged with advancing age. Control experiments indicate that the degraded performance cannot be explained by optical, retinal or other perceptual factors, but is cortical in origin.

6.

Background

Low haemoglobin concentration has been associated with adverse prognosis in patients with angina and myocardial infarction (MI), but the strength and shape of the association and the presence of any threshold have not been precisely evaluated.

Methods and findings

A retrospective cohort study was carried out using the UK General Practice Research Database. 20,131 people with a new diagnosis of stable angina and no previous acute coronary syndrome, and 14,171 people with first MI who survived for at least 7 days, were followed up for a mean of 3.2 years. Using semi-parametric Cox regression and multiple adjustment, there was evidence of threshold haemoglobin values below which mortality increased in a graded, continuous fashion. For men with MI, the threshold value was 13.5 g/dl (95% confidence interval [CI] 13.2–13.9); the 29.5% of patients with haemoglobin below this threshold had an associated hazard ratio for mortality of 2.00 (95% CI 1.76–2.29) compared to those with haemoglobin values in the lowest risk range. Women tended to have lower threshold haemoglobin values (e.g., for MI, 12.8 g/dl; 95% CI 12.1–13.5), but the shape and strength of association did not differ between the genders, nor between patients with angina and MI. We did a systematic review and meta-analysis that identified ten previously published studies, reporting a total of only 1,127 endpoints, but none evaluated thresholds of risk.

Conclusions

There is an association between low haemoglobin concentration and increased mortality. A large proportion of patients with coronary disease have haemoglobin concentrations below the thresholds of risk defined here. Intervention trials would clarify whether increasing the haemoglobin concentration reduces mortality.

7.

Background

There are few clinical tools that assess decision-making under risk. Tests that characterize sensitivity and bias in decisions between prospects varying in the magnitude and probability of gain may provide insights into conditions with anomalous reward-related behaviour.

Objective

We designed a simple test of how subjects integrate information about the magnitude and the probability of reward, which can determine discriminative thresholds and choice bias in decisions under risk.

Design/Methods

Twenty subjects were required to choose between two explicitly described prospects, one with higher probability but lower magnitude of reward than the other, with the difference in expected value between the two prospects varying from 3 to 23%.

Results

Subjects showed a mean threshold sensitivity of a 43% difference in expected value. Regarding choice bias, there was a ‘risk premium’ of 38%, indicating a tendency to choose the higher-probability prospect over the higher-reward one. An analysis using prospect theory showed that this risk premium is the predicted outcome of hypothesized non-linearities in the subjective perception of reward value and probability.
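
To illustrate how such non-linearities can produce a risk premium, the sketch below applies the standard Tversky–Kahneman (1992) value and probability-weighting functions to two prospects of equal expected value. The parameter values (α = 0.88, γ = 0.61) are the commonly cited 1992 estimates, not values fitted to this study, and the prospects are invented for illustration.

```python
def pt_value(magnitude, probability, alpha=0.88, gamma=0.61):
    """Prospect-theory subjective value of a single gain prospect:
    v(x) = x**alpha (concave value), weighted by
    w(p) = p**g / (p**g + (1 - p)**g)**(1 / g)."""
    w = probability**gamma / (probability**gamma + (1 - probability)**gamma) ** (1 / gamma)
    return w * magnitude**alpha

# Two prospects with identical expected value (36 points)
safe  = pt_value(magnitude=40, probability=0.9)   # higher probability, lower magnitude
risky = pt_value(magnitude=60, probability=0.6)   # lower probability, higher magnitude
print(f"{safe:.1f} vs {risky:.1f} -> prefer higher probability: {safe > risky}")
```

With these parameters the higher-probability prospect has the larger subjective value despite equal expected value, i.e. a positive risk premium.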

Conclusions

This simple test provides a robust measure of discriminative value thresholds and biases in decisions under risk. Prospect theory can also make predictions about decisions when subjective perception of reward or probability is anomalous, as may occur in populations with dopaminergic or striatal dysfunction, such as Parkinson's disease and schizophrenia.

8.
9.

Objectives

Eye movements are the physical expression of upper fetal brainstem function. Our aim was to identify and differentiate specific types of fetal eye movement patterns using dynamic MRI sequences. Their occurrence, as well as the presence of conjugate eyeball motion and consistently parallel eyeball position, was systematically analyzed.

Methods

Dynamic SSFP sequences were acquired in 72 singleton fetuses (17–40 GW, three age groups [17–23 GW, 24–32 GW, 33–40 GW]). Fetal eye movements were evaluated according to a modified classification originally published by Birnholz (1981): Type 0: no eye movements; Type I: single transient deviations; Type Ia: fast deviation, slower reposition; Type Ib: fast deviation, fast reposition; Type II: single prolonged eye movements; Type III: complex sequences; and Type IV: nystagmoid.

Results

In 95.8% of fetuses, the evaluation of eye movements was possible using MRI, with a mean acquisition time of 70 seconds. Due to head motion, 4.2% of the fetuses and 20.1% of all dynamic SSFP sequences were excluded. Eye movements were observed in 45 fetuses (65.2%). Significant differences between the age groups were found for Type I (p = 0.03), Type Ia (p = 0.031), and Type IV eye movements (p = 0.033). Consistently parallel eyeballs were found in 27.3–45% of cases.

Conclusions

In human fetuses, different eye movement patterns can be identified and described by MRI in utero. In addition to the originally classified eye movement patterns, a novel subtype has been observed, which apparently characterizes an important step in fetal brainstem development. We evaluated, for the first time, eyeball position in fetuses. Ultimately, the assessment of fetal eye movements by MRI has the potential to identify early signs of brainstem dysfunction, as encountered in brain malformations such as Chiari II or molar tooth malformations.

10.

Purpose

Musculoskeletal disorders increase the risk of absenteeism and work disability. However, the pain intensity at which musculoskeletal pain significantly increases the risk of sickness absence in different occupations is unknown. This study estimates the risk of long-term sickness absence (LTSA) associated with different pain intensities in the low back, neck/shoulders and knees among female healthcare workers in eldercare.

Methods

Prospective cohort study among 8,732 Danish female healthcare workers responding to a questionnaire in 2004–2005, and subsequently followed for one year in a national register of social transfer payments (DREAM). Using Cox regression hazard ratio (HR) analysis we modeled risk estimates of pain intensities on a scale from 0–9 (reference 0, where 0 is no pain and 9 is worst imaginable pain) in the low back, neck/shoulders and knees during the last three months for onset of LTSA (receiving sickness absence compensation for at least eight consecutive weeks) during one-year follow-up.

Results

During follow-up, the 12-month prevalence of LTSA was 6.3%. With adjustment for age, BMI, smoking and leisure physical activity, the thresholds of pain intensity significantly increasing the risk of LTSA were 5 for the low back (HR 1.44 [95% CI 1.07–1.93]), 4 for the neck/shoulders (HR 1.47 [95% CI 1.10–1.96]) and 3 for the knees (HR 1.43 [95% CI 1.06–1.93]) on the 0–9 scale, relative to a pain intensity of 0.

Conclusion

The threshold of pain intensity significantly increasing the risk of LTSA among female healthcare workers varies across body regions, with knee pain having the lowest threshold. This knowledge may be used in the prevention of LTSA among healthcare workers.

11.

Purpose

To investigate the utility of uncorrected visual acuity measures in screening for refractive error in white school children aged 6–7 years and 12–13 years.

Methods

The Northern Ireland Childhood Errors of Refraction (NICER) study used a stratified random cluster design to recruit children from schools in Northern Ireland. Detailed eye examinations included assessment of logMAR visual acuity and cycloplegic autorefraction. Spherical equivalent refractive data from the right eye were used to classify significant refractive error as myopia of at least 1DS, hyperopia as greater than +3.50DS and astigmatism as greater than 1.50DC, whether it occurred in isolation or in association with myopia or hyperopia.

Results

Results are presented for 661 white 12–13-year-old and 392 white 6–7-year-old schoolchildren. Using a cut-off of uncorrected visual acuity poorer than 0.20 logMAR to detect significant refractive error gave a sensitivity of 50% and a specificity of 92% in 6–7-year-olds, and 73% and 93% respectively in 12–13-year-olds. In 12–13-year-old children, a cut-off of poorer than 0.20 logMAR had a sensitivity of 92% and a specificity of 91% in detecting myopia, and a sensitivity of 41% and a specificity of 84% in detecting hyperopia.
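
For readers unfamiliar with these screening metrics, the sketch below shows how sensitivity and specificity of a 0.20 logMAR cut-off are computed against a cycloplegic-refraction "gold standard". The arrays are invented illustrative data, not the NICER data.

```python
import numpy as np

def screening_performance(uncorrected_logmar, has_refractive_error, cutoff=0.20):
    """Sensitivity/specificity when children whose uncorrected acuity is poorer
    (numerically greater) than `cutoff` logMAR are flagged as screen-positive."""
    acuity = np.asarray(uncorrected_logmar, float)
    truth = np.asarray(has_refractive_error, bool)
    positive = acuity > cutoff                                 # failed screening
    sensitivity = (positive & truth).sum() / truth.sum()
    specificity = (~positive & ~truth).sum() / (~truth).sum()
    return sensitivity, specificity

# Illustrative data: logMAR acuities and gold-standard refractive-error labels
acuities = [0.00, 0.10, 0.30, 0.40, 0.05, 0.25, 0.00, 0.50]
labels   = [0,    0,    1,    1,    0,    0,    0,    1]
print(screening_performance(acuities, labels))
```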

Conclusions

Vision screening using logMAR acuity can reliably detect myopia, but not hyperopia or astigmatism, in school-age children. Providers of vision screening programs should be cognisant that, where detection of uncorrected hyperopic and/or astigmatic refractive error is an aspiration, current UK protocols will not effectively deliver it.

12.

Background

A key aspect of representations for object recognition and scene analysis in the ventral visual stream is the spatial frame of reference, be it a viewer-centered, object-centered, or scene-based coordinate system. Coordinate transforms from retinocentric space to other reference frames involve combining neural visual responses with extraretinal postural information.

Methodology/Principal Findings

We examined whether such spatial information is available to anterior inferotemporal (AIT) neurons in the macaque monkey by measuring the effect of eye position on responses to a set of simple 2D shapes. We report, for the first time, a significant eye position effect in over 40% of recorded neurons with small gaze angle shifts from central fixation. Although eye position modulates responses, it does not change shape selectivity.

Conclusions/Significance

These data demonstrate that spatial information is available in AIT for the representation of objects and scenes within a non-retinocentric frame of reference. More generally, the availability of spatial information in AIT calls into question the classic dichotomy in visual processing that associates object shape processing with ventral structures such as AIT but places spatial processing in a separate anatomical stream projecting to dorsal structures.

13.

Background

Accurate objective assessment of sedentary and physical activity behaviours during childhood is integral to the understanding of their relation to later health outcomes, as well as to documenting the frequency and distribution of physical activity within a population.

Purpose

To calibrate the Actigraph GT1M accelerometer, using energy expenditure (EE) as the criterion measure, to define thresholds for sedentary behaviour and physical activity categories suitable for use in a large scale epidemiological study in young children.

Methods

Accelerometer-based assessments of physical activity (counts per minute) were calibrated against EE measures (kcal·kg⁻¹·hr⁻¹) obtained over a range of exercise intensities using a COSMED K4b² portable metabolic unit in 53 seven-year-old children. Children performed seven activities: lying down viewing television, sitting upright playing a computer game, slow walking, brisk walking, jogging, hopscotch and basketball. Threshold count values were established to identify sedentary behaviour and light, moderate and vigorous physical activity using linear discriminant analysis (LDA) and evaluated using receiver operating characteristic (ROC) curve analysis.

Results

EE was significantly associated with counts for all non-sedentary activities, with the exception of jogging. Threshold values for accelerometer counts (counts·min⁻¹) were <100 for sedentary behaviour and ≤2240, ≤3840 and ≥3841 for light, moderate and vigorous physical activity, respectively. The area under the ROC curve for the discrimination of both sedentary behaviour and vigorous activity was 0.98. Boundaries for light and moderate physical activity were less well defined (0.61 and 0.60, respectively). Sensitivity and specificity were higher for the sedentary (99% and 97%) and vigorous (95% and 91%) thresholds than for the light (60% and 83%) and moderate (61% and 76%) thresholds.
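
Expressed as a classification rule, the reported cut points map each minute of accelerometer counts to an intensity category. This is a minimal sketch of that rule; the function name is illustrative.

```python
def classify_activity(counts_per_minute):
    """Map Actigraph GT1M counts per minute to an intensity category using the
    cut points reported in this study:
    <100 sedentary, 100-2240 light, 2241-3840 moderate, >=3841 vigorous."""
    if counts_per_minute < 100:
        return "sedentary"
    if counts_per_minute <= 2240:
        return "light"
    if counts_per_minute <= 3840:
        return "moderate"
    return "vigorous"

print([classify_activity(c) for c in (50, 800, 3000, 5200)])
# ['sedentary', 'light', 'moderate', 'vigorous']
```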

Conclusion

The accelerometer cut points established in this study can be used to classify sedentary behaviour and to distinguish between light, moderate and vigorous physical activity in children of this age.

14.

Background

The effect of acupuncture on sensory perception has never been systematically reviewed, although studies on acupuncture mechanisms are frequently based on the idea that changes in sensory thresholds reflect its effect on the nervous system.

Methods

PubMed, EMBASE and Scopus were screened for studies investigating the effect of acupuncture on thermal or mechanical detection or pain thresholds in humans, published in English or German. A meta-analysis of high-quality studies was performed.

Results

Out of 3,007 identified articles, 85 were included. Sixty-five studies showed that acupuncture affects at least one sensory threshold. Most studies assessed the pressure pain threshold, of which 80% reported an increase after acupuncture. Significant short- and long-term effects on the pressure pain threshold in pain patients were revealed by two meta-analyses including four and two high-quality studies, respectively. In over 60% of studies, acupuncture reduced sensitivity to noxious thermal stimuli, but measurement methods might influence results. Few but consistent data indicate that acupuncture reduces pin-prick-like pain but not mechanical detection. Results on thermal detection are heterogeneous. Sensory threshold changes were reported equally frequently after manual acupuncture as after electroacupuncture. Among 48 sham-controlled studies, 25 showed stronger effects on sensory thresholds with verum than with sham acupuncture, but in 9 studies significant threshold changes were also observed after sham acupuncture. Overall, there is a lack of high-quality acupuncture studies applying comprehensive assessments of sensory perception.

Conclusions

Our findings indicate that acupuncture affects sensory perception. Results are most compelling for the pressure pain threshold, especially in pain conditions associated with tenderness. Sham acupuncture can also cause such effects. Future studies should incorporate comprehensive, standardized assessments of sensory profiles in order to fully characterize its effect on sensory perception and to explore the predictive value of sensory profiles for the effectiveness of acupuncture.

15.

Background

Leprosy is the most frequent treatable neuromuscular disease. Yet, every year, thousands of patients develop permanent peripheral nerve damage as a result of leprosy. Since early detection and treatment of neuropathy in leprosy have strong preventive potential, we conducted a cohort study to determine which test detects this neuropathy earliest.

Methods and Findings

One hundred and eighty-eight multibacillary (MB) leprosy patients were selected from a cohort of 303 and followed for 2 years after diagnosis. Nerve function was evaluated at each visit using nerve conduction (NC) testing, quantitative thermal sensory testing and vibrometry, dynamometry, monofilament testing (MFT), and voluntary muscle testing (VMT). Study outcomes were sensory and motor impairment detected by MFT or VMT. Seventy-four of 188 patients (39%) had a reaction, neuritis, or new nerve function impairment (NFI) event during the 2-year follow-up. Sub-clinical neuropathy was extensive (20%–50%), even in patients who did not develop an outcome event. Sensory nerve action potential (SNAP) amplitudes, compound motor action potential (CMAP) velocities, and warm detection thresholds (WDT) were most frequently affected, with SNAP impairment frequencies ranging from 30% (median) to 69% (sural). Velocity was impaired in up to 43% of motor nerves. WDTs were more frequently affected than cold detection thresholds (29% versus 13%, ulnar nerve). Impairment of sensory nerve conduction and warm perception often preceded deterioration in MFT or VMT scores by 12 weeks or more.

Conclusions

A large proportion of leprosy patients have subclinical neuropathy that was not evident when only MFT and VMT were used. Sensory nerve conduction was the most frequently and earliest affected test, closely followed by WDT. These are promising tests for improving early detection of neuropathy, as they often became abnormal 12 weeks or more before an abnormal monofilament test. Changes in MFT and VMT scores mirrored changes in neurophysiology, confirming their validity as screening tests.

16.

Background

Despite the consistent information available on the physiological changes induced by head-down bed rest, a condition which simulates space microgravity, our knowledge of possible perceptual-cortical alterations is still poor. The present study investigated the effects of 2 h of head-down bed rest on subjective and cortical responses elicited by electrical, pain-related somatosensory stimulation.

Methodology/Principal Findings

Twenty male subjects were randomly assigned to two groups, head-down bed rest (BR) or a sitting control condition. Starting from individual electrical thresholds, somatosensory evoked potentials were elicited by electrical stimuli administered randomly to the left wrist and divided into four conditions: control painless condition, electrical pain threshold, 30% above pain threshold, and 30% below pain threshold. Subjective pain ratings collected during the EEG session showed significantly reduced pain perception in the BR group compared to the Control group. Statistical analysis on four electrode clusters and sLORETA source analysis revealed, in sitting controls, a P1 component (40–50 ms) in the right somatosensory cortex, whereas it was bilateral and differently located in the BR group. Controls' N1 (80–90 ms) showed widespread right-hemisphere activation, also involving the anterior cingulate, whereas the BR group showed primary somatosensory cortex activation. The P2 (190–220 ms) was larger at left-central locations in Controls than in the BR group.

Conclusions/Significance

Head-down bed rest was associated with an overall decrease in pain sensitivity and an altered pain network extending beyond the primary somatosensory cortex. The results have implications not only for astronauts' health and spaceflight risks, but also for the clinical aspects of pain detection in bedridden patients at risk of fatal undetected complications.

17.

Background

Repeated mass azithromycin distributions are effective in controlling the ocular strains of chlamydia that cause trachoma. However, it is unclear when treatments can be discontinued. Investigators have proposed graduating communities when the prevalence of infection identified in children decreases below a threshold. While this can be tested empirically, results will not be available for years. Here we use a mathematical model to predict results with different graduation strategies in three African countries.

Methods

A stochastic model of trachoma transmission was constructed, using the parameters with the maximum likelihood of obtaining results observed from studies in Tanzania (with 16% infection in children pre-treatment), The Gambia (9%), and Ethiopia (64%). The expected prevalence of infection at 3 years was obtained, given different thresholds for graduation and varying the characteristics of the diagnostic test.
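
The fitted model itself is not reproduced here, but the graduation idea (annual mass treatment that stops once sampled infection prevalence in a community falls below a threshold) can be illustrated with a toy stochastic simulation. All parameters below (community size, transmission and recovery rates, coverage, efficacy) are placeholders, not the study's estimates.

```python
import random

def simulate_community(n=100, prev0=0.16, beta=0.05, recovery=0.02,
                       coverage=0.8, efficacy=0.95, graduation_threshold=0.05,
                       years=3, seed=1):
    """Toy SIS-style simulation: weekly transmission/recovery, with annual mass
    azithromycin treatment that stops once observed prevalence < threshold."""
    rng = random.Random(seed)
    infected = [rng.random() < prev0 for _ in range(n)]
    graduated = False
    for week in range(years * 52):
        if week % 52 == 0 and not graduated:          # annual survey and treatment
            prevalence = sum(infected) / n
            if prevalence < graduation_threshold:
                graduated = True                       # community graduates from program
            else:
                for i in range(n):                     # mass treatment round
                    if rng.random() < coverage and rng.random() < efficacy:
                        infected[i] = False
        force = beta * sum(infected) / n               # frequency-dependent transmission
        infected = [True if (not s and rng.random() < force)
                    else (False if (s and rng.random() < recovery) else s)
                    for s in infected]
    return sum(infected) / n

print(f"Prevalence after 3 years: {simulate_community():.2%}")
```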

Results

The model projects that three annual treatments at 80% coverage would reduce the mean prevalence of infection to 0.03% in the Tanzanian, 2.4% in the Gambian, and 12.9% in the Ethiopian communities. If communities graduate when the prevalence of infection falls below 5%, then the mean prevalence at 3 years with the new strategy would be 0.3%, 3.9%, and 14.4%, respectively. Graduation reduced antibiotic usage by 63% in Tanzania, 56% in The Gambia, and 11% in Ethiopia.

Conclusion

Models suggest that graduating communities from a program when the infection is reduced to 5% is a reasonable strategy and could reduce the amount of antibiotic distributed in some areas by more than 2-fold.

18.

Background

Fetal facial development is essential not only for postnatal bonding between parents and child, but also theoretically for the study of the origins of affect. However, how fetal facial movements become coordinated is poorly understood. 4-D ultrasound visualisation allows objective coding of fetal facial movements.

Methodology/Findings

Based on research using facial muscle movements to code recognisable facial expressions in adults, and adapted for infants, we defined two distinct fetal facial gestalts, namely “cry-face-gestalt” and “laughter-gestalt,” each made up of up to 7 distinct facial movements. In this conceptual study, two healthy fetuses were then scanned at different gestational ages in the second and third trimesters. We observed that the number and complexity of simultaneous movements increased with gestational age. Thus, between 24 and 35 weeks the mean number of co-occurrences of 3 or more facial movements increased from 7% to 69%. Recognisable facial expressions were also observed to develop. Between 24 and 35 weeks the number of co-occurrences of 3 or more movements making up a “cry-face gestalt” increased from 0% to 42%. Similarly, the number of co-occurrences of 3 or more facial movements combining into a “laughter-face gestalt” increased from 0% to 35%. These changes with age were all highly significant.

Significance

This research provides the first evidence of a developmental progression from individual, unrelated facial movements toward fetal facial gestalts. We propose that this method has considerable potential for assessing fetal development: subsequent discrimination of normal and abnormal fetal facial development might identify health problems in utero.

19.

Background

Although CD4 cell count monitoring is used to decide when to start antiretroviral therapy in patients with HIV-1 infection, there are no evidence-based recommendations regarding its optimal frequency. It is common practice to monitor every 3 to 6 months, often coupled with viral load monitoring. We developed rules to guide frequency of CD4 cell count monitoring in HIV infection before starting antiretroviral therapy, which we validated retrospectively in patients from the Swiss HIV Cohort Study.

Methodology/Principal Findings

We built two prediction rules (a “Snap-shot rule” for a single sample and a “Track-shot rule” for multiple determinations) based on a systematic review of published longitudinal analyses of CD4 cell count trajectories. We applied the rules to 2,608 untreated patients to classify their 18,061 CD4 counts as either justifiable or superfluous, according to their prior ≥5% or <5% chance of meeting predetermined thresholds for starting treatment. The percentage of measurements that both rules falsely deemed superfluous never exceeded 5%. Superfluous CD4 determinations represented 4%, 11%, and 39% of all actual determinations for treatment thresholds of 500, 350, and 200×10⁶/L, respectively. The Track-shot rule was only marginally superior to the Snap-shot rule. Both rules lose usefulness as CD4 counts come near the treatment threshold.

Conclusions/Significance

Frequent CD4 count monitoring of patients with CD4 counts well above the threshold for initiating therapy is unlikely to identify patients who require therapy. It appears sufficient to measure the CD4 cell count one year after a count >650 for a treatment threshold of 200, >900 for 350, or >1150 for 500×10⁶/L, respectively. When CD4 counts fall below these limits, increased monitoring frequency becomes advisable. These rules offer guidance for efficient CD4 monitoring, particularly in resource-limited settings.
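
The monitoring guidance in the last paragraph amounts to a small look-up. The sketch below restates the published limits; the function name and interface are illustrative, not from the study.

```python
def cd4_recheck_in_one_year(current_cd4, treatment_threshold):
    """Snap-shot-style rule from this study: a repeat CD4 count one year later
    is sufficient if the current count exceeds the limit for the chosen
    treatment threshold (>650 for 200, >900 for 350, >1150 for 500,
    all in x10^6 cells/L); otherwise monitor more frequently."""
    limits = {200: 650, 350: 900, 500: 1150}
    return current_cd4 > limits[treatment_threshold]

print(cd4_recheck_in_one_year(980, 350))   # True: annual monitoring is enough
print(cd4_recheck_in_one_year(700, 350))   # False: increase monitoring frequency
```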

20.

Background

Attention is used to enhance neural processing of selected parts of a visual scene. It increases neural responses to stimuli near target locations and is usually coupled to eye movements. Covert attention shifts, however, decouple the attentional focus from gaze, allowing attention to be directed to a peripheral location without moving the eyes. We tested whether covert attention shifts modulate ongoing neuronal activity in cortical area V6A, an area that provides a bridge between visual signals and arm-motor control.

Methodology/Principal Findings

We performed single-cell recordings from 3 Macaca fascicularis trained to fixate straight ahead while shifting attention outward to a peripheral cue and inward again to the fixation point. We found that neurons in V6A are influenced by spatial attention. The attentional modulation occurs without gaze shifts and cannot be explained by visual stimulation. Visual, motor, and attentional responses can occur in combination in single neurons.

Conclusions/Significance

This modulation in an area primarily involved in visuo-motor transformation for reaching may form a neural basis for coupling attention to the preparation of reaching movements. Our results show that cortical processes of attention are related not only to eye movements, as many studies have shown, but also to arm movements, a link suggested by some previous behavioral findings. Therefore, the widely held view that spatial attention is tightly intertwined with, and perhaps directly derived from, motor preparatory processes should be extended to a broader spectrum of motor processes than just eye movements.

