Similar Articles
20 similar articles retrieved.
1.
2.
Background

Quality of life (QL) assessments of children with incapacitating diseases, such as cerebral palsy (CP), have often been conducted through a child's proxy respondents, which makes the assessment more subjective. The Autoquestionnaire Qualité de Vie Enfant Imagé (AUQEI) is a QL instrument designed for self-report by children; it uses images to facilitate the reporting process.

Objective

To evaluate the psychometric properties of the AUQEI when responses are given by children with CP.

Findings

Children aged 4 to 12 years (45 with CP and 45 healthy children) responded to the questionnaire. Data quality, reliability and validity were assessed. The missing-data rate ranged from 8.8% to 46.7% and was highest for the “autonomy” factor. No floor or ceiling effect was detected. The success rate for internal-consistency reliability of the items was below 80% for the “autonomy” factor. Cronbach's alpha was 0.71 for the instrument as a whole and below 0.5 for the individual factors. All factors had a success rate above 80% for discriminant validity of the items. The factors were not correlated with one another, indicating adequate discriminant validity. Convergent validity was tested, and a significant correlation was found only between the AUQEI “functioning” factor and the Child Health Questionnaire 50-Item (CHQ-PF50) physical summary score (r = 0.31, p = 0.042). As expected for divergent validity, the AUQEI scores were not correlated with gross motor function scores (p > 0.05). Regarding construct validity, the total AUQEI score of the CP group was lower (median: 47.3) than that of the healthy group (median: 51.0) (p < 0.01).

Conclusion

The AUQEI proved to be a reliable and valid instrument for assessing children with CP when the total score is used. Convergent validity should continue to be tested in future studies.
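For context on the internal-consistency figure reported above (Cronbach's alpha of 0.71 for the whole instrument), here is a minimal sketch of how Cronbach's alpha is computed from an item-response matrix. The respondent count, item count and rating range below are invented for illustration and are not taken from the AUQEI.

```python
# Hedged illustration of Cronbach's alpha for a k-item questionnaire,
# computed from a respondents-by-items score matrix. The data are made up.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: 2-D array, rows = respondents, columns = items."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)      # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of the total score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
fake = rng.integers(0, 4, size=(45, 26))  # hypothetical 45 children x 26 items, 0-3 ratings
# Random responses share no common construct, so alpha comes out near zero here.
print(round(cronbach_alpha(fake), 2))
```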

3.

Background

The Center for Epidemiologic Studies Depression Scale (CES-D) is a commonly used instrument for measuring depressive symptomatology. Despite this, its psychometric properties remain poorly established in Chinese populations. The aim of this study was to validate the use of the CES-D in Chinese primary care patients by examining factor structure, construct validity, reliability, sensitivity and responsiveness.

Methods and Results

The psychometric properties were assessed in a sample of 3686 Chinese adult primary care patients in Hong Kong. Three competing factor structure models were examined using confirmatory factor analysis. The original four-factor CES-D model had adequate fit; however, the data were better fit by a bi-factor model. For internal construct validity, corrected item-total correlations were 0.4 for most items. Convergent validity was assessed by examining the correlations between the CES-D, the Patient Health Questionnaire 9 (PHQ-9) and the Short Form-12 Health Survey (version 2) Mental Component Summary (SF-12 v2 MCS). The CES-D correlated strongly with the PHQ-9 (coefficient: 0.78) and the SF-12 v2 MCS (coefficient: -0.75). Internal consistency was assessed with McDonald's omega hierarchical (ωH; a computational sketch follows this entry). The ωH value for the general depression factor was 0.855. The ωH values for “somatic”, “depressed affect”, “positive affect” and “interpersonal problems” were 0.434, 0.038, 0.738 and 0.730, respectively. For the two-week test-retest reliability, the intraclass correlation coefficient was 0.91. The CES-D was sensitive in detecting differences between known groups (AUC > 0.7). Internal responsiveness of the CES-D to positive and negative change was satisfactory (p value < 0.01 and all effect size statistics > 0.2). The CES-D was also externally responsive (AUC > 0.7).

Conclusions

The CES-D appears to be a valid, reliable, sensitive and responsive instrument for screening and monitoring depressive symptoms in adult Chinese primary care patients. Both its original four-factor structure and the bi-factor structure support its use for cross-cultural comparisons of depression in multi-center studies.
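As referenced in the Methods and Results above, McDonald's omega hierarchical (ωH) summarizes how much of the reliable variance in total scores is attributable to the general factor of a bi-factor model. The sketch below computes ωH from standardized bi-factor loadings using the usual ratio of squared summed loadings to total score variance; the loadings are invented, not the study's estimates.

```python
# Hedged sketch of McDonald's omega hierarchical (omega_H) from bi-factor loadings.
# Loadings and unique variances below are illustrative placeholders.
import numpy as np

def omega_hierarchical(general, group, unique):
    """general: loadings on the general factor; group: list of arrays of group-factor
    loadings (one array per specific factor); unique: item unique variances."""
    gen_var = np.sum(general) ** 2                    # variance due to the general factor
    grp_var = sum(np.sum(g) ** 2 for g in group)      # variance due to the group factors
    total_var = gen_var + grp_var + np.sum(unique)    # model-implied total score variance
    return gen_var / total_var

general = np.array([0.6, 0.7, 0.5, 0.6, 0.65, 0.55])
group = [np.array([0.3, 0.35, 0.25]), np.array([0.4, 0.3, 0.35])]
unique = 1 - general**2 - np.concatenate(group)**2    # standardized items
print(round(omega_hierarchical(general, group, unique), 3))
```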

4.

Background

How leadership is experienced in different socioeconomic strata is of interest in its own right, as is how it relates to employee mental health.

Methods

Three waves of SLOSH (the Swedish Longitudinal Occupational Survey of Health, a questionnaire survey of a sample of the Swedish working population) were used: 2006, 2008 and 2010 (n = 5141). The leadership variables were “non-listening leadership” (a single question, “Does your manager listen to you?”, with four response categories) and “self-centered leadership” (the sum of three five-point questions: “non-participating”, “asocial” and “loner”). The socioeconomic factors were education and income. Emotional exhaustion and depressive symptoms were used as indicators of mental health.

Results

Non-listening leadership was associated with low income and low education, whereas self-centered leadership showed a weaker relationship with education and no association with income. Both leadership variables were significantly associated with emotional exhaustion and depressive symptoms. “Self-centered” as well as “non-listening” leadership in 2006 significantly predicted employee depressive symptoms in 2008 after adjustment for demographic variables. These predictions became non-significant when adjustment was made for job conditions (demands and decision latitude) in the “non-listening” leadership analyses, whereas they remained significant after these adjustments in the “self-centered” leadership analyses.

Conclusions

Our results show that the leadership variables are associated with socioeconomic status and employee mental health. “Non-listening” scores were more sensitive to societal change and more strongly related to socioeconomic factors and job conditions than “self-centered” scores.

5.
Ethylene Production and Respiratory Behavior of the rin Tomato Mutant
Little or no change in ethylene or CO2 production occurred in rin tomato mutant fruits monitored for up to 120 days after harvest. Of the abnormally ripening tomatoes investigated, including “Never ripe” (Nr Y a h, Nr c l2 r), “Evergreen” (gf r) and “Green Flesh” (gf), only rin failed to show a typical climacteric and rise in ethylene.

6.

Objectives

Center for Epidemiologic Studies Depression (CES-D) Scale scores in English- and French-speaking Canadian systemic sclerosis (SSc) patients are commonly pooled in analyses, but no studies have evaluated the metric equivalence of the English and French CES-D. The study objective was to examine the metric equivalence of the CES-D in English- and French-speaking SSc patients.

Methods

The CES-D was completed by 1007 English-speaking and 248 French-speaking patients from the Canadian Scleroderma Research Group Registry. Confirmatory factor analysis (CFA) was used to assess the factor structure in both samples. The Multiple-Indicator Multiple-Cause (MIMIC) model was utilized to assess differential item functioning (DIF).

Results

A two-factor model (Positive and Negative affect) showed excellent fit in both samples. Statistically significant, but small-magnitude, DIF was found for 3 of 20 CES-D items: items 3 (Blues), 10 (Fearful), and 11 (Sleep). Before accounting for DIF, French-speaking patients had latent scores 0.08 of a standard deviation (SD) lower on the Positive factor (95% confidence interval [CI] −0.25 to 0.08) and 0.09 SD higher on the Negative factor (95% CI −0.07 to 0.24) than English-speaking patients. After DIF correction, there was no change on the Positive factor and a non-significant increase of 0.04 SD on the Negative factor for French-speaking patients (difference = 0.13 SD, 95% CI −0.03 to 0.28).

Conclusions

The English and French versions of the CES-D, despite minor DIF on three items, are substantively equivalent and can be used in studies that combine data from English- and French-speaking Canadian SSc patients.

7.
Sequence database searches require accurate estimation of the statistical significance of scores. Optimal local sequence alignment scores follow Gumbel distributions, but determining an important parameter of the distribution (λ) requires time-consuming computational simulation. Moreover, optimal alignment scores are less powerful than probabilistic scores that integrate over alignment uncertainty (“Forward” scores), but the expected distribution of Forward scores remains unknown. Here, I conjecture that both expected score distributions have simple, predictable forms when full probabilistic modeling methods are used. For a probabilistic model of local sequence alignment, optimal alignment bit scores (“Viterbi” scores) are Gumbel-distributed with constant λ = log 2, and the high-scoring tail of Forward scores is exponential with the same constant λ. Simulation studies support these conjectures over a wide range of profile/sequence comparisons, using 9,318 profile hidden Markov models from the Pfam database. This enables efficient and accurate determination of expectation values (E-values) for both Viterbi and Forward scores for probabilistic local alignments.
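As a hedged illustration of the E-value calculation this abstract describes, the sketch below evaluates a Gumbel tail for Viterbi bit scores and an exponential tail for Forward bit scores, both with λ fixed at ln 2. The location parameters mu and tau, the query score, and the number of comparisons are placeholders, not values from the paper; in practice the location parameters would be fitted per profile.

```python
# Hedged sketch: P-values and E-values from a Gumbel tail (Viterbi bit scores) and
# an exponential high-scoring tail (Forward bit scores), both with lambda = ln 2.
import math

LAMBDA = math.log(2)  # "log 2" in natural-log units, matching bit-scaled scores

def gumbel_pvalue(score: float, mu: float) -> float:
    """P(S >= score) for a Gumbel distribution with location mu and slope LAMBDA."""
    return 1.0 - math.exp(-math.exp(-LAMBDA * (score - mu)))

def exponential_tail_pvalue(score: float, tau: float) -> float:
    """P(S >= score) for an exponential high-scoring tail starting at offset tau."""
    return min(1.0, math.exp(-LAMBDA * (score - tau)))

def e_value(pvalue: float, n_comparisons: int) -> float:
    """Expected number of hits this good or better over n_comparisons searches."""
    return pvalue * n_comparisons

# Hypothetical bit score of 25 searched against a 20,000-sequence database.
print(e_value(gumbel_pvalue(25.0, mu=-3.0), 20_000))
print(e_value(exponential_tail_pvalue(25.0, tau=-3.0), 20_000))
```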

8.
9.
Evidence from cohort studies on the effect of dietary patterns on blood cholesterol is scarce. This study aimed to identify the association of dietary patterns with the lipid profile, especially cholesterol, in a cohort in northern China. Using a 1-year food frequency questionnaire, we assessed the dietary intake of 4515 adults aged 20-74 years from the Harbin People's Health Study in 2008. Principal component analysis was used to identify dietary patterns. Follow-up was completed in 2012. Fasting blood samples were collected for the determination of blood lipid concentrations. Logistic regression models were used to evaluate the association of dietary patterns with the incidence of hypercholesterolemia, hypertriglyceridemia, and low-HDL cholesterolemia. Five dietary patterns were identified (“staple food”, “vegetable, fruit and milk”, “potato, soybean and egg”, “snack”, and “meat”). The relative risk (RR) between the extreme tertiles of the snack dietary pattern scores was 1.72 (95% CI = 1.14, 2.59, P = 0.004) for hypercholesterolemia and 1.39 (1.13, 1.75, P = 0.036) for hypertriglyceridemia, after adjustment for age, sex, education, body mass index, smoking, alcohol consumption, energy intake, exercise and baseline lipid concentrations. There was a significant positive association between the snack dietary pattern scores and fasting serum total cholesterol (SRC (standardized regression coefficient) = 0.262, P = 0.025), LDL-c (SRC = 0.324, P = 0.002) and triglycerides (SRC = 0.253, P = 0.035), after adjustment for the variables listed above. Moreover, the adjusted RR of hypertriglyceridemia between the extreme tertiles was 0.73 (0.56, 0.94, P = 0.025) for the vegetable, fruit and milk dietary pattern and 1.86 (1.33, 2.41, P = 0.005) for the meat dietary pattern. The snack dietary pattern is a newly emerging dietary pattern among northern Chinese adults. It appears conceivable that the risk of hypercholesterolemia could be reduced by changing the snack dietary pattern.
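As a hedged sketch of the dietary-pattern step described above (principal component analysis on food-group intakes, followed by tertiles of a pattern score), the example below uses invented intake data; the number of food groups, the number of retained components and the choice of which component plays the "snack"-like role are assumptions for illustration only.

```python
# Hedged sketch: derive dietary-pattern scores with PCA and split one pattern into
# tertiles. All data here are simulated, not taken from the study.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
intakes = rng.gamma(shape=2.0, scale=50.0, size=(4515, 12))  # 4515 adults x 12 food groups

z = StandardScaler().fit_transform(intakes)   # standardize each food group
pca = PCA(n_components=5).fit(z)              # retain five patterns, as in the study
scores = pca.transform(z)                     # per-person pattern scores

snack_like = scores[:, 3]                     # hypothetical "snack"-like component
tertile = np.digitize(snack_like, np.quantile(snack_like, [1/3, 2/3]))  # 0, 1, 2
print(pca.explained_variance_ratio_.round(3), np.bincount(tertile))
```

The tertile indicator would then enter a logistic regression with the covariates the abstract lists to estimate the risk between extreme tertiles.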

10.
Visual search can be accelerated when properties of the target are known. Such knowledge allows the searcher to direct attention to items sharing these properties. Recent work indicates that information about properties of non-targets (i.e., negative cues) can also guide search. In the present study, we examine whether negative cues lead to different search behavior compared to positive cues. We asked observers to search for a target defined by a certain shape singleton (broken line among solid lines). Each line was embedded in a colored disk. In “positive cue” blocks, participants were informed about possible colors of the target item. In “negative cue” blocks, the participants were informed about colors that could not contain the target. Search displays were designed such that with both the positive and negative cues, the same number of items could potentially contain the broken line (“relevant items”). Thus, both cues were equally informative. We measured response times and eye movements. Participants exhibited longer response times when provided with negative cues compared to positive cues. Although negative cues did guide the eyes to relevant items, there were marked differences in eye movements. Negative cues resulted in smaller proportions of fixations on relevant items, longer duration of fixations and in higher rates of fixations per item as compared to positive cues. The effectiveness of both cue types, as measured by fixations on relevant items, increased over the course of each search. In sum, a negative color cue can guide attention to relevant items, but it is less efficient than a positive cue of the same informational value.

11.

Background

The Center for Epidemiologic Studies Depression Scale (CES-D; Radloff, 1977) is a commonly used, freely available self-report measure of depressive symptoms. Despite its popularity, several recent investigations have called into question the robustness and suitability of the commonly used 4-factor, 20-item CES-D model. The goal of the current study was to address these concerns by confirming the factorial validity of the CES-D.

Methods and Findings

Differential item functioning estimates were used to examine sex biases in item responses, and confirmatory factor analyses were used to assess prior CES-D factor structures and new models heeding current theoretical and empirical considerations. Data used for the analyses included undergraduate (n = 948; 74% women), community (n = 254; 71% women), rehabilitation (n = 522; 53% women), clinical (n = 84; 77% women), and National Health and Nutrition Examination Survey (NHANES; n = 2814; 56% women) samples. Differential item functioning identified an item as inflating CES-D scores in women. Comprehensive comparison of the several models supported a novel, psychometrically robust, and unbiased 3-factor 14-item solution, with factors (i.e., negative affect, anhedonia, and somatic symptoms) that are more in line with current diagnostic criteria for depression.

Conclusions

Researchers and practitioners may benefit from using the novel factor structure of the CES-D and from being cautious in interpreting results from the originally proposed scale. Comprehensive results, implications, and future research directions are discussed.

12.
Background

Many U.S.-bound refugees travel from countries where intestinal parasites (hookworm, Trichuris trichiura, Ascaris lumbricoides, and Strongyloides stercoralis) are endemic. These infections are rare in the United States and may be underdiagnosed or misdiagnosed, leading to potentially serious consequences. This evaluation examined the costs and benefits of combinations of overseas presumptive treatment of parasitic diseases vs. domestic screening and treatment vs. no program.

Methods

An economic decision tree model terminating in Markov processes was developed to estimate the cost and health impacts of four interventions on an annual cohort of 27,700 U.S.-bound Asian refugees: 1) “No Program”, 2) U.S. “Domestic Screening and Treatment”, 3) “Overseas Albendazole and Ivermectin” presumptive treatment, and 4) “Overseas Albendazole and Domestic Screening for Strongyloides”. Markov transition-state models were used to estimate the long-term effects of parasitic infections. Health outcome measures (for the four parasites) included outpatient cases, hospitalizations, deaths, life years, and quality-adjusted life years (QALYs).

Results

The “No Program” option is the least expensive ($165,923 per cohort) and least effective option (145 outpatient cases, 4.0 hospitalizations, and 0.67 deaths discounted over a 60-year period for a one-year cohort). The “Overseas Albendazole and Ivermectin” option ($418,824) is less expensive than “Domestic Screening and Treatment” ($3,832,572) or “Overseas Albendazole and Domestic Screening for Strongyloides” ($2,182,483). According to the model outcomes, the most effective option is “Overseas Albendazole and Ivermectin”, which reduces outpatient cases, deaths and hospitalizations by around 80% at an estimated net cost of $458,718 per death averted, or $2,219 per QALY and $24,036 per life year gained, relative to “No Program”.

Discussion

Overseas presumptive treatment for U.S.-bound refugees is a cost-effective intervention that is less expensive and at least as effective as domestic screening and treatment programs. The addition of ivermectin to albendazole reduces the prevalence of chronic strongyloidiasis and the probability of rare, but potentially fatal, disseminated strongyloidiasis.
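The figures above come from a decision tree terminating in Markov processes. As a hedged illustration of the Markov-cohort part of such a calculation, the sketch below advances a cohort through made-up health states and accumulates discounted QALYs; the states, transition probabilities, utilities and discount rate are placeholders, not the study's parameters.

```python
# Hedged sketch of a discounted Markov cohort calculation. All parameters invented.
import numpy as np

states = ["infected", "chronic", "dead"]
P = np.array([[0.80, 0.15, 0.05],      # yearly transition probabilities (rows sum to 1)
              [0.00, 0.93, 0.07],
              [0.00, 0.00, 1.00]])
utility = np.array([0.85, 0.70, 0.00])   # quality weight of one year in each state
cohort = np.array([27_700.0, 0.0, 0.0])  # everyone starts in "infected"
discount = 0.03

qalys = 0.0
for year in range(60):                   # 60-year horizon, as in the abstract
    qalys += cohort @ utility / (1 + discount) ** year
    cohort = cohort @ P                  # advance the cohort one year
print(round(qalys))
```

Costs per state and per intervention would be accumulated the same way, and comparing discounted costs and QALYs across the four strategies yields the cost-per-QALY figures reported above.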

13.

Introduction

Dietary patterns are culturally specific, and there are limited data on the association of dietary patterns with late-life depression in Chinese populations. This study examined the associations between dietary patterns and baseline and subsequent depressive symptoms in community-dwelling Chinese older people in Hong Kong.

Methods

Participants aged ≥65 years who were taking part in a cohort study examining risk factors for osteoporosis completed a validated food frequency questionnaire at baseline between 2001 and 2003. Factor analysis was used to identify three dietary patterns: a “vegetables-fruits” pattern, a “snacks-drinks-milk products” pattern, and a “meat-fish” pattern. Depressive symptoms were measured at baseline and at 4 years using the validated Geriatric Depression Scale. Multiple logistic regression was used for the cross-sectional analysis (n = 2,902) to assess the associations between dietary patterns and the presence of depressive symptoms, and for the longitudinal analysis (n = 2,211) of their associations with depressive symptoms at 4 years, with adjustment for socio-demographic and lifestyle factors.

Results

The highest quartile of the “vegetables-fruits” pattern score was associated with a reduced likelihood of depressive symptoms at baseline compared with the lowest quartile [adjusted OR = 0.55 (95% CI: 0.36–0.83), p for trend = 0.017]. A similar inverse trend was observed for the highest quartile of the “snacks-drinks-milk products” pattern score [adjusted OR = 0.41 (95% CI: 0.26–0.65), p for trend < 0.001] compared with the lowest quartile. There was no association of the “meat-fish” pattern with the presence of depressive symptoms at baseline. None of the dietary patterns was associated with subsequent depressive symptoms at 4 years.

Conclusion

Higher “vegetables-fruits” and “snacks-drinks-milk products” pattern scores were associated with a reduced likelihood of baseline depressive symptoms in Chinese older people in Hong Kong. The longitudinal analyses did not show any relationship between dietary patterns and subsequent depressive symptoms in this population.

14.
Background

Diet is an important factor in the prevention of chronic diseases. Analysis of secular trends in dietary patterns can be biased by energy under-reporting. The objective of the present study was therefore to analyse the impact of energy under-reporting on dietary patterns, and on secular trends in dietary patterns, defined by cluster analysis.

Results

Three clusters, “healthy”, “mixed” and “western”, were identified in both surveys. The “mixed” cluster was the predominant cluster in both surveys. Excluding EUR reduced the proportion of the “mixed” cluster by up to 6.40% in the 2000 survey, which caused a secular-trend increase in the prevalence of the “mixed” pattern. Cross-classification analysis of data from all participants and from PER showed substantial agreement in cluster assignments: 68.7% in 2000 and 84.4% in 2005. Excluding EUR did not cause meaningful (≥15%) changes in the “healthy” pattern. It provoked changes in the consumption of some food groups in the “mixed” and “western” patterns: mainly decreases of unhealthy foods within the 2000 survey and increases of unhealthy foods within the 2005 survey. The effects of EUR on secular trends were similar to those within the 2005 survey. Excluding EUR reversed the direction of secular trends in the consumption of several food groups among PER in the “mixed” and “western” patterns.

Conclusions

EUR affected the distribution of participants between dietary patterns within and between surveys, secular trends in food-group consumption, and the amount of food consumed in all patterns except the “healthy” pattern. Our findings emphasize the threat that energy under-reporting poses to dietary data analysis.

15.
Introduction

Principal component analysis is used to determine the dietary behaviours of a population, whereas reduced rank regression is used to construct disease-related dietary patterns. This study aimed to compare both types of dietary pattern and their associations with cardiovascular risk factors (CVRF).

Results

We found that the CVRF-related patterns also reflect the eating behaviours of the population. Comparing the food groups that were concordant between the two dietary-pattern methods, a diet high in fruits, oleaginous and dried fruits, vegetables, olive oil, fats rich in omega 6 and tea, and low in fried foods, lean and fatty meat, processed meat, ready meals, soft drinks and beer, was associated with a lower prevalence of CVRF. Conversely, a pattern characterized by high intakes of fried foods, meat, offal, beer, wine, aperitifs and spirits, and low intakes of cereals, sugar and sweets and soft drinks, was associated with a higher prevalence of CVRF.

Conclusion

In sum, we found that the “Prudent” and “Animal protein and alcohol” patterns were both associated with CVRF and were behaviourally meaningful. Moreover, the relationships of these dietary patterns with lifestyle characteristics support the theory that food choices are part of a larger pattern of healthy lifestyle.

16.
Background

This study evaluates an active search strategy for leprosy diagnosis based on responses to a Leprosy Suspicion Questionnaire (LSQ), and analyses the clinical, immunoepidemiological and follow-up aspects of individuals living in a prison population.

Methods

This was a cross-sectional study based on a questionnaire posing 14 questions about leprosy signs and symptoms, distributed to 1,400 prisoners and followed by dermatoneurological examination, anti-PGL-I serology and RLEP-PCR. Those without leprosy were placed in the Non-leprosy Group (NLG, n = 1,216) and those diagnosed with clinical symptoms of leprosy were placed in the Leprosy Group (LG, n = 34).

Findings

In total, 896 LSQs were returned (64%), and 187 (20.9%) of the responses were deemed positive for signs/symptoms, with 2.7 questions marked on average. Clinically, 1,250 (89.3%) of the prisoners were evaluated, resulting in the diagnosis of 34 new cases (LG) based on well-accepted clinical signs and symptoms, a new-case detection rate of 2.7% within this population; the NLG comprised 1,216 individuals. Median confinement time was 39 months in the LG and 36 months in the NLG (p>0.05). The 31 leprosy cases who responded to the questionnaire (LSQ+) marked an average of 1.5 questions. The symptoms “anesthetized skin area” and “pain in nerves” were the most commonly mentioned in the LG, while “tingling, numbness in the hands/feet”, “sensation of pricks and needles”, “pain in nerves” and “spots on the skin” responses were found in more than 30% of questionnaires in the NLG. Clinically, 88.2% had dysesthetic macular skin lesions and 97.1% presented some peripheral nerve impairment, with 71.9% showing some degree of disability. All cases were multibacillary, confirming late diagnosis. Anti-PGL-I results were higher in the LG than in the NLG (p<0.0001), while RLEP-PCR was positive in 11.8% of the patients.

Interpretation

Our findings within the penitentiary demonstrate a hidden prevalence of leprosy, although the individuals diagnosed were likely infected while living in their former communities rather than through exposure in the prison. The LSQ proved to be an important screening tool for identifying leprosy cases in prisons.

17.
18.
Aim

Angolan Miombo woodlands, rich in timber species of the Leguminosae family, are undergoing one of the highest rates of deforestation in sub-Saharan Africa. Based on updated information on the distribution of Leguminosae timber species native to Angola, this study presents an integrated index framing the main threats to these trees, with the aim of supporting new conservation measures.

Location

Sub-Saharan Africa, Republic of Angola.

Methods

The current distribution areas of six Leguminosae timber species (i.e., Afzelia quanzensis, Brachystegia spiciformis, Guibourtia coleosperma, Isoberlinia angolensis, Julbernardia paniculata, and Pterocarpus angolensis) were predicted through ensemble modeling techniques. The level of threat to each species was analyzed by comparing the species' potential distribution with a threat-index map and with the protected areas. The threat index of anthropogenic and climatic factors encompasses the effects of population density, agriculture, proximity to roads, loss of tree cover, overexploitation, trends in wildfires, and predicted changes in temperature and precipitation.

Results

Our results revealed that about 0.5% of Angola's area is classified as under “Very high” threat, 23.9% as under “High” threat, and 66.5% as under “Moderate” threat. Three of the studied species require special conservation efforts: B. spiciformis and I. angolensis, which have a large fraction of their predicted distribution in areas of high threat, and G. coleosperma, which has a restricted distribution area and is one of the most valuable species in international markets. The priority areas for the conservation of Leguminosae timber species were found in Benguela and Huíla.

Main conclusions

This study provides updated data that should be used to inform policymakers, contributing to national conservation planning and the protection of native flora in Angola. Moreover, it presents a methodological approach for predicting species distributions and for creating a threat-index map that can be applied in other poorly surveyed tropical regions.

19.
Background

During 2017, twenty health districts (locations) in Mexico implemented a dengue outbreak Early Warning and Response System (EWARS), which processes epidemiological, meteorological and entomological alarm indicators to predict dengue outbreaks and to trigger early response activities. Out of the 20 priority districts where more than one fifth of all national disease transmission in Mexico occurs, eleven districts were purposively selected and analyzed. Nine districts presented outbreak alarms by EWARS without subsequent outbreaks (“non-outbreak districts”) and two presented alarms with subsequent dengue outbreaks (“outbreak districts”). This evaluation study assesses and compares the impact of alarm-informed response activities, and the consequences of failing to respond in a timely and adequate manner, across the two groups.

Methods

Five indicators of dengue outbreak response were examined across the two groups (“outbreak districts” and “non-outbreak districts”): four (larval control, entomological studies with water-container interventions, focal spraying and indoor residual spraying) were analyzed quantitatively, while, for quality-control purposes, only qualitative concluding remarks were derived from the fifth response indicator (fogging).

Results

The average coverage of vector-control responses was significantly higher in the non-outbreak districts across all four indicators. In the “outbreak districts” the response activities started late and were of much lower intensity than in the “non-outbreak districts”. Vector-control teams at the district level demonstrated diverse levels of compliance with local guidelines for ‘initial’, ‘early’ and ‘late’ responses to outbreak alarms, which could potentially explain the different outcomes observed following the outbreak alarms.

Conclusion

Failure to respond in a timely and adequate way to the alarm signals generated by EWARS was shown to negatively affect outbreak control. On the other hand, districts with an adequate and timely response guided by alarm signals demonstrated successful records of outbreak prevention. This study presents important operational scenarios of failed and successful responses to EWARS, but the effectiveness and cost-effectiveness of EWARS warrant investigation using more robust designs.

20.

Background

Computerized adaptive testing (CAT) utilizes latent variable measurement model parameters that are typically assumed to apply equivalently to all people. Biased latent variable scores may be obtained in samples that are heterogeneous with respect to a specified measurement model. We examined the implications of sample heterogeneity for CAT-predicted patient-reported outcome (PRO) scores in the measurement of pain.

Methods

A latent variable mixture modeling (LVMM) analysis was conducted using data collected from a heterogeneous sample of people in British Columbia, Canada, who were administered the 36 pain domain items of the CAT-5D-QOL. The fitted LVMM was then used to produce data for a simulation analysis. We evaluated bias by comparing the referent PRO scores of the LVMM with PRO scores predicted by a “conventional” CAT (ignoring heterogeneity) and a LVMM-based “mixture” CAT (accommodating heterogeneity).

Results

The LVMM analysis indicated support for three latent classes with class proportions of 0.25, 0.30 and 0.45, which suggests that the sample was heterogeneous. The simulation analyses revealed differences between the referent PRO scores and the PRO scores produced by the “conventional” CAT. The “mixture” CAT produced PRO scores that were nearly equivalent to the referent scores.

Conclusion

Bias in PRO scores based on latent variable models may result when population heterogeneity is ignored. Improved accuracy could be obtained by using CATs that are parameterized using LVMM.
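For readers unfamiliar with how a “conventional” CAT produces the predicted scores discussed above, the sketch below illustrates one step under a two-parameter logistic model: estimate the latent trait from the responses given so far, then administer the unasked item with the highest Fisher information. The item parameters are invented, the items are treated as dichotomous for simplicity, and the mixture structure the study argues for is deliberately ignored here.

```python
# Hedged sketch of one "conventional" CAT step (2PL model, invented parameters).
import numpy as np

a = np.array([1.2, 0.8, 1.5, 1.0, 1.7])    # discriminations (hypothetical)
b = np.array([-1.0, 0.0, 0.5, 1.2, -0.5])  # difficulties (hypothetical)

def p_endorse(theta, a, b):
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def eap_theta(asked, resp, grid=np.linspace(-4, 4, 81)):
    """Expected-a-posteriori latent trait with a standard-normal prior."""
    prior = np.exp(-0.5 * grid**2)
    like = np.ones_like(grid)
    for i, r in zip(asked, resp):
        p = p_endorse(grid, a[i], b[i])
        like *= p if r else (1 - p)
    post = prior * like
    return float(np.sum(grid * post) / np.sum(post))

def next_item(theta, asked):
    """Pick the unasked item with maximum Fisher information at theta."""
    p = p_endorse(theta, a, b)
    info = a**2 * p * (1 - p)
    info[list(asked)] = -np.inf              # never repeat an item
    return int(np.argmax(info))

asked, resp = [0], [1]                        # item 0 was endorsed
theta = eap_theta(asked, resp)
print(round(theta, 2), next_item(theta, asked))
```

A mixture-based CAT, as the study advocates, would additionally weight item parameters by the respondent's posterior class membership before selecting the next item.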

