Similar Articles

20 similar articles found.
1.

Objective

To evaluate the accuracy of glycosylated hemoglobin A1c (HbA1c) for the diagnosis of postpartum abnormal glucose tolerance among women with gestational diabetes mellitus (GDM).

Methods

After a systematic review of related studies, the sensitivity, specificity, positive likelihood ratio (PLR), negative likelihood ratio (NLR), diagnostic odds ratio (DOR), and other measures of the accuracy of HbA1c for diagnosing postpartum abnormal glucose tolerance were pooled using random-effects models. The summary receiver operating characteristic (SROC) curve was used to summarize overall test performance.

Results

Six studies met our inclusion criteria. The pooled sensitivity, specificity, PLR, NLR, and DOR were 0.36 (95% CI 0.23–0.52), 0.85 (95% CI 0.73–0.92), 2.4 (95% CI 1.6–3.6), 0.75 (95% CI 0.63–0.88), and 3 (95% CI 2–5), respectively. The area under the SROC curve was 0.67, with a Q value of 0.63.
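
The likelihood ratios, DOR, and Q value above follow directly from the pooled sensitivity and specificity. A minimal Python sketch of those relationships is shown below; the closed-form Q* assumes a symmetric SROC curve with constant DOR, which is an assumption rather than something stated in the abstract.

```python
# Derived diagnostic-accuracy measures from pooled sensitivity and
# specificity (values taken from the abstract above).
import math

sens, spec = 0.36, 0.85

plr = sens / (1 - spec)          # positive likelihood ratio
nlr = (1 - sens) / spec          # negative likelihood ratio
dor = plr / nlr                  # diagnostic odds ratio

# Q* index: the point on a *symmetric* SROC curve where sensitivity equals
# specificity; valid only under a constant-DOR assumption.
q_star = math.sqrt(dor) / (1 + math.sqrt(dor))

print(f"PLR={plr:.2f}  NLR={nlr:.2f}  DOR={dor:.1f}  Q*={q_star:.2f}")
```

With the pooled values 0.36 and 0.85 this gives PLR ≈ 2.4, NLR ≈ 0.75, DOR ≈ 3.2, and Q* ≈ 0.64, close to the figures reported above.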

Conclusions

Measurement of HbA1c alone is not a sensitive test to detect abnormal glucose tolerance in women with prior GDM.

2.

Objective

The aim of this study was to evaluate whether the distribution pattern of early ischemic changes on the initial MRI provides a practical means of estimating leptomeningeal collateralization in acute ischemic stroke (AIS).

Methods

Seventy-four patients with AIS underwent MRI followed by conventional angiography and mechanical thrombectomy. Diffusion restriction on diffusion-weighted imaging (DWI) and the corresponding T2 hyperintensity of the infarct were retrospectively analyzed and scored according to the Alberta Stroke Program Early CT Score (ASPECTS). Patients were graded angiographically into collateralization groups according to the method of Higashida and dichotomized into two groups: 29 subjects with collateralization grade 3 or 4 (well-collateralized group) and 45 subjects with grade 1 or 2 (poorly collateralized group). Individual ASPECTS regions were compared between the groups.

Results

Mean overall DWI-ASPECTS was 6.34 vs. 4.51 (well- vs. poorly collateralized group, respectively), and mean T2-ASPECTS was 9.34 vs. 8.96. A significant difference between groups was found for DWI-ASPECTS (p<0.001) but not for T2-ASPECTS (p = 0.088). Among individual regions, only the insula, M1–M4, and M6 showed significantly fewer infarctions in the well-collateralized group (p-values <0.001 to 0.015). In these six regions, 89% of patients in the well-collateralized group had 0–2 infarctions (44.8% had none), whereas 59.9% of patients in the poorly collateralized group had 3–6 infarctions.
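
The abstract does not name the statistical test used for this group comparison. The sketch below shows one plausible approach, a Mann-Whitney U test on per-group DWI-ASPECTS values; the test choice and the score vectors are assumptions, not study data.

```python
# Illustrative group comparison of DWI-ASPECTS between well- and poorly
# collateralized patients; values are hypothetical.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
well = rng.integers(4, 11, size=29)   # hypothetical DWI-ASPECTS, Higashida grade 3-4
poor = rng.integers(2, 9, size=45)    # hypothetical DWI-ASPECTS, Higashida grade 1-2

stat, p = mannwhitneyu(well, poor, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p:.4f}")
```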

Conclusion

Patients with poor leptomeningeal collateralization show more infarcts on the initial MRI, particularly in ASPECTS regions M1–M4, M6, and the insula. DWI abnormalities in these regions may therefore serve as a surrogate marker for poor leptomeningeal collaterals and may be useful for estimating collateral status in routine clinical evaluation.

3.

Introduction

The capability of CT perfusion (CTP) Alberta Stroke Program Early CT Score (ASPECTS) to predict outcome and identify ischemia severity in acute ischemic stroke (AIS) patients is still questioned.

Methods

Sixty-two patients with AIS were imaged within 8 hours of symptom onset with non-contrast CT, CT angiography, and CTP at admission and at 24 hours. CTP ASPECTS was calculated on the affected hemisphere using cerebral blood flow (CBF), cerebral blood volume (CBV), and mean transit time (MTT) maps by subtracting 1 point for any abnormality visually detected or measured within multiple cortical circular regions of interest according to previously established thresholds. The MTT–CBV ASPECTS difference was taken as the CTP ASPECTS mismatch. Hemorrhagic transformation (HT), recanalization status and reperfusion grade at 24 hours, final infarct volume at 7 days, and modified Rankin scale (mRS) score at 3 months after onset were recorded.

Results

Semi-quantitative and quantitative CTP ASPECTS were highly correlated (p<0.00001). CBF, CBV, and MTT ASPECTS were higher in patients with no HT and mRS ≤2 and were inversely associated with final infarct volume and mRS (p values ranging from <0.05 to <0.00001). CTP ASPECTS mismatch was only weakly associated with radiological and clinical outcomes (p values from <0.05 to <0.02), and only when evaluated quantitatively. A CBV ASPECTS of 9 was the optimal semi-quantitative value for predicting outcome.

Conclusions

Our findings suggest that visual inspection of CTP ASPECTS maps identifies infarct and ischemia in agreement with absolute quantitative values. Semi-quantitative CBV ASPECTS, but not CTP ASPECTS mismatch, is a strong prognostic indicator, implying that core extent is the main determinant of outcome, irrespective of penumbra size.

4.

Background

Various studies have assessed the diagnostic accuracy of EGFR mutation-specific antibodies in non-small cell lung cancer (NSCLC). We performed a meta-analysis of existing data to investigate the diagnostic value of mutation-specific antibodies for detection of EGFR mutations in NSCLC.

Methods

We systematically retrieved relevant studies from PubMed, Web of Knowledge, and Google Scholar. Data from studies that met the inclusion criteria were extracted for further exploration of heterogeneity, including calculation of the average sensitivity, specificity, positive likelihood ratio (PLR), negative likelihood ratio (NLR), and diagnostic odds ratio (DOR), and analysis of summary receiver operating characteristic (SROC) curves.

Results

Fifteen studies met our inclusion criteria. A summary of the meta-analysis of the efficacy of the anti-E746-A750 antibody was as follows: sensitivity, 0.60 (95% CI, 0.55–0.64); specificity, 0.98 (95% CI, 0.97–0.98); PLR, 33.50 (95% CI, 13.96–80.39); NLR, 0.39 (95% CI, 0.30–0.51) and DOR, 111.17 (95% CI, 62.22–198.63). A similar meta-analysis was performed for the anti-L858R antibody with results as follows: sensitivity, 0.76 (95% CI, 0.71–0.79); specificity, 0.96 (95% CI, 0.95–0.97); PLR, 24.42 (95% CI, 11.66–51.17); NLR, 0.22 (95% CI, 0.12–0.39) and DOR, 126.66 (95% CI, 54.60–293.82).

Conclusion

Immunohistochemistry alone is sufficient for the detection of EGFR mutations if the result is positive; molecular analyses are necessary only if the anti-E746-A750 antibody result is negative. Immunohistochemistry therefore appears well suited for clinical screening for EGFR mutations prior to molecular analysis.

5.

Purpose

To evaluate whether neutrophil-to-lymphocyte ratio (NLR) and platelet-to-lymphocyte ratio (PLR) predict survival and metastasis in patients after transarterial chemoembolization (TACE) for recurrent hepatocellular carcinoma (RHCC).

Materials and Methods

Clinical and laboratory data from 132 RHCC patients treated with TACE from January 2003 to December 2012 were retrospectively reviewed. Prognostic factors were assessed by multivariate analysis, and the predictive values of NLR and PLR for overall survival (OS) and extrahepatic metastases were compared.

Results

Pretreatment mean NLR and PLR were 3.1 and 137, respectively. The 0.5-, 1-, and 2-year OS rates were 93.7%, 67.1%, and 10.1% in the low NLR group and 81.1%, 18.9%, and 3.8% in the high NLR group, respectively (P = 0.017). The corresponding OS rates in the low and high PLR groups were 92.5%, 58.1%, and 9.7% and 84.6%, 23.1%, and 2.6%, respectively (P = 0.030). Discriminatory performance for predicting 1-year survival was significantly poorer for NLR (area under the curve [AUC] = 0.685, 95% confidence interval [CI] 0.598–0.763) than for PLR (AUC = 0.792, 95% CI 0.712–0.857; P = 0.0295), but both ratios performed well in predicting post-TACE extrahepatic metastasis. Multivariate analysis indicated that high PLR (hazard ratio [HR] = 0.373, 95% CI = 0.216–0.644, P < 0.001), vascular invasion (HR = 0.507, 95% CI = 0.310–0.832, P = 0.007), and multiple tumors (HR = 0.553, 95% CI = 0.333–0.919, P = 0.022) were independent prognostic factors for OS.
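
NLR and PLR are simple ratios of routine blood counts, and the AUC comparison above is a standard discrimination analysis. A minimal sketch of both steps follows; the data frame, column names, and values are hypothetical, not study data.

```python
# Sketch: deriving NLR and PLR from a differential blood count and
# estimating their discrimination (AUC) for 1-year mortality.
import pandas as pd
from sklearn.metrics import roc_auc_score

df = pd.DataFrame({
    "neutrophils": [4.2, 6.8, 3.1, 7.5, 5.0, 2.9],   # 10^9/L
    "lymphocytes": [1.8, 1.1, 2.0, 0.9, 1.4, 2.2],
    "platelets":   [210, 320, 180, 350, 260, 190],
    "died_1yr":    [0,   1,   0,   1,   1,   0],
})

df["NLR"] = df["neutrophils"] / df["lymphocytes"]
df["PLR"] = df["platelets"] / df["lymphocytes"]

print("AUC(NLR):", roc_auc_score(df["died_1yr"], df["NLR"]))
print("AUC(PLR):", roc_auc_score(df["died_1yr"], df["PLR"]))
```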

Conclusions

High NLR and PLR were both associated with poor prognosis and metastasis in RHCC patients treated with TACE, but high PLR was a better predictor of 1-year OS. High PLR, vascular invasion, and multiple tumors were independent, unfavorable prognostic factors.

6.

Background

While the prevalence of mental illness or cognitive disability is higher among homeless people than in the general population in Western countries, few studies have investigated its prevalence in Japan or other Asian countries. The present study comprehensively assessed the prevalence of mental illness, cognitive disability, and their overlap among homeless individuals living in Nagoya, Japan.

Methods

Participants were 114 homeless individuals. Mental illness was diagnosed based on semi-structured interviews conducted by psychiatrists. The Wechsler Adult Intelligence Scale-III (WAIS-III, simplified version) was used to diagnose intellectual/cognitive disability.

Results

Among all participants, 42.1% (95% CI 33.4–51.3%) were diagnosed with a mental illness: 4.4% (95% CI 1.9–9.9%) with schizophrenia or other psychotic disorder, 17.5% (95% CI 11.6–25.6%) with a mood disorder, 2.6% (95% CI 0.9–7.5%) with an anxiety disorder, 14.0% (95% CI 8.8–21.6%) with a substance-related disorder, and 3.5% (95% CI 1.4–8.8%) with a personality disorder. Additionally, 34.2% (95% CI 26.1–43.3%) demonstrated cognitive disability: 20.2% (95% CI 13.8–28.5%) had mild and 14.0% (95% CI 8.8–21.6%) had moderate or severe disability. The percent overlap between mental illness and cognitive disability was 15.8% (95% CI 10.2–23.6%). Only 39.5% (95% CI 26.1–43.3%) of the participants were considered to have no psychological or cognitive dysfunction. Participants were divided into four groups based on the presence or absence of mental illness and/or cognitive disability. Only individuals with a cognitive disability reported a significant tendency toward not wanting to leave their homeless life.
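
The prevalence estimates above are proportions with 95% confidence intervals. As one illustration, assuming the overall mental illness figure corresponds to roughly 48 of 114 participants (inferred from 42.1%), a Wilson score interval gives approximately 33.4%–51.3%, matching the reported CI; the authors' exact CI method is not stated, so this is an assumption.

```python
# 95% CI for a prevalence estimate using a Wilson score interval.
# 48/114 is inferred from the reported 42.1%; the CI method is assumed.
from statsmodels.stats.proportion import proportion_confint

count, n = 48, 114
low, high = proportion_confint(count, n, alpha=0.05, method="wilson")
print(f"prevalence = {count / n:.1%}, 95% CI {low:.1%}-{high:.1%}")
```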

Conclusion

This is the first report showing that the prevalence of mental illness and/or cognitive disability among homeless individuals is much higher than in the general Japanese population. Appropriate support strategies should be devised and implemented based on each individual's psychological and cognitive condition.

7.
8.

Background

A new lateral flow immunoassay (LFA) for the detection of cryptococcal antigen was developed.

Objective

We aimed to systematically review all relevant studies to evaluate the diagnostic accuracy of the cryptococcal antigen LFA on serum, CSF, and urine specimens.

Methods

We searched public databases including PubMed, Web of Science, Elsevier ScienceDirect, and the Cochrane Library for English-language literature published up to September 2014. We conducted meta-analyses of the sensitivity, specificity, positive likelihood ratio (PLR), negative likelihood ratio (NLR), diagnostic odds ratio (DOR), and SROC of the LFA in serum and CSF, respectively. The sensitivity of the LFA in urine was also analyzed. Subgroup analyses were carried out to explore potential heterogeneity.

Results

Twelve studies were included. The pooled sensitivity and specificity of the LFA in serum were 97.6% (95% CI, 95.6% to 98.9%) and 98.1% (95% CI, 97.4% to 98.6%), respectively; the average PLR was 43.787 (95% CI, 22.60–84.81), the NLR was 0.03 (95% CI, 0.01–0.09), the pooled DOR was 2180.30 (95% CI, 868.92–5471.00), and the AUC was 0.9968. The pooled sensitivity and specificity of the LFA in CSF were 98.9% (95% CI, 97.9% to 99.5%) and 98.9% (95% CI, 98.0% to 99.5%), respectively; the average PLR was 48.83 (95% CI, 21.59–110.40), the NLR was 0.02 (95% CI, 0.01–0.04), the pooled DOR was 2931.10 (95% CI, 1149.20–7475.90), and the AUC was 0.9974. The pooled sensitivity of the LFA in urine was 85.0% (95% CI, 78.7% to 90.1%).

Conclusions

The study demonstrates very high accuracy of the LFA in serum and CSF for the diagnosis of cryptococcosis in patients at risk. The LFA in urine may be a promising screening tool for early diagnosis of cryptococcosis.

9.

Background

Red cell distribution width (RDW), neutrophil-to-lymphocyte ratio (NLR), and platelet-to-lymphocyte ratio (PLR) are known inflammatory indices. Elevated values are found in many cancers and may be associated with a poor prognosis. This study aimed to assess the impact of RDW, NLR, and PLR on overall survival (OS) of patients with oropharyngeal cancer treated with radiotherapy (RT).

Materials and methods

This retrospective study included 208 patients treated for oropharyngeal cancer with definitive RT or RT combined with neoadjuvant or concurrent systemic therapy at one institution between 2004 and 2014. The receiver operating characteristic (ROC) method, log-rank testing, and the Cox proportional hazards regression model were used for the analysis.

Results

OS was significantly higher in the RDW ≤ 13.8% (p = 0.001) and NLR ≤ 2.099 (p = 0.016) groups. The RDW index had the highest discriminatory ability [area under the curve (AUC) = 0.59, 95% confidence interval (CI): 0.51–0.67], closely followed by NLR (AUC = 0.58, 95% CI: 0.50–0.65). In univariate Cox regression, RDW [hazard ratio (HR): 1.28, 95% CI: 1.12–1.47, p < 0.001] and NLR (HR: 1.11, 95% CI: 1.06–1.18, p < 0.001) were associated with an increased risk of death. In the multivariate analysis, among the analyzed indices, only NLR was significantly associated with survival (HR: 1.16, 95% CI: 1.03–1.29, p = 0.012).

Conclusions

Only NLR proved to be an independent predictor of OS. However, its clinical value is limited by relatively low sensitivity and specificity.
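
The RDW and NLR cut-offs above (13.8% and 2.099) come from an ROC analysis. The sketch below shows one common way to pick such a cut-off, by maximizing Youden's J; the values are hypothetical, not study data, and the exact criterion used by the authors is an assumption.

```python
# Sketch: choosing a prognostic cut-off from an ROC curve by maximizing
# Youden's J (sensitivity + specificity - 1). Values are hypothetical.
import numpy as np
from sklearn.metrics import roc_curve

nlr  = np.array([1.2, 1.8, 2.3, 2.9, 3.4, 4.1, 1.5, 2.6])
died = np.array([0,   0,   1,   1,   1,   1,   0,   0  ])

fpr, tpr, thresholds = roc_curve(died, nlr)
youden_j = tpr - fpr
best_cutoff = thresholds[np.argmax(youden_j)]
print(f"optimal cut-off: NLR >= {best_cutoff:.3f}")
```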

10.
The objective of this study was to identify hygiene behaviors associated with the incidence of influenza-like illness (ILI) among adults in Beijing. In January 2011, we conducted a multi-stage sampling, cross-sectional survey of adults living in Beijing using self-administered anonymous questionnaires. The main outcome variable was self-reported ILI within the past year. Multivariate logistic regression was used to identify factors associated with self-reported ILI. A total of 13,003 participants completed the questionnaires, of whom 6,068 (46.7%) reported ILI during the past year. After adjusting for demographic characteristics, the variables significantly associated with a lower likelihood of reporting ILI were regular physical exercise (OR 0.80; 95% CI 0.74–0.87), optimal hand hygiene (OR 0.87; 95% CI 0.80–0.94), face mask use when going to hospitals (OR 0.87; 95% CI 0.80–0.95), and not sharing towels and handkerchiefs (OR 0.68; 95% CI 0.63–0.73). These results highlight that personal hygiene behaviors are potential protective factors against ILI among adults in Beijing, and future interventions to improve personal hygiene behaviors are needed.
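
The adjusted odds ratios above come from a multivariate logistic regression. A minimal sketch of that kind of model with statsmodels is shown below; the variable names and simulated data are hypothetical stand-ins for the survey variables.

```python
# Sketch: adjusted odds ratios with 95% CIs from a multivariate logistic
# regression. Variable names and simulated data are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "ili":           rng.integers(0, 2, n),
    "exercise":      rng.integers(0, 2, n),
    "hand_hygiene":  rng.integers(0, 2, n),
    "mask_hospital": rng.integers(0, 2, n),
    "age_group":     rng.integers(0, 4, n),
})

model = smf.logit("ili ~ exercise + hand_hygiene + mask_hospital + C(age_group)",
                  data=df).fit(disp=0)
print(pd.concat([np.exp(model.params).rename("OR"),
                 np.exp(model.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})],
                axis=1))
```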

11.
Higher levels of LINE1 methylation in blood DNA have been associated with increased kidney cancer risk using post-diagnostically collected samples; however, this association has never been examined using pre-diagnostic samples. We examined the association between LINE1 %5mC and renal cell carcinoma (RCC) risk using pre-diagnostic blood DNA from the United States-based Prostate, Lung, Colorectal, and Ovarian Cancer Screening Trial (PLCO) (215 cases/436 controls) and the Alpha-Tocopherol, Beta-Carotene Cancer Prevention Study (ATBC) of Finnish male smokers (191 cases/575 controls). Logistic regression adjusted for age at blood draw, study center, pack-years of smoking, body mass index, hypertension, dietary alcohol intake, family history of cancer, and sex was used to calculate odds ratios (ORs) and 95% confidence intervals (CIs) using cohort- and sex-specific methylation categories. In PLCO, higher, although non-significant, RCC risk was observed for participants at or above the median methylation level (M2) compared with those below the median (M1) (OR: 1.37, 95% CI: 0.96–1.95). The association was stronger in males (M2 vs. M1, OR: 1.54, 95% CI: 1.00–2.39) and statistically significant among male smokers (M2 vs. M1, OR: 2.60, 95% CI: 1.46–4.63). A significant interaction with smoking was also detected (P-interaction: 0.01). No association was found among females or female smokers. Findings for male smokers were replicated in ATBC (M2 vs. M1, OR: 1.31, 95% CI: 1.07–1.60). In a pooled analysis of PLCO and ATBC male smokers (281 cases/755 controls), the OR for subjects at or above the median methylation level (M2) compared with those below the median (M1) was 1.89 (95% CI: 1.34–2.67, P = 3 × 10^−4); a trend was also observed by methylation quartile (P-trend: 0.002). These findings suggest that higher LINE1 methylation levels measured prior to cancer diagnosis may be a biomarker of future RCC risk among male smokers.

12.

Purpose

To improve ischemic stroke outcome prediction using imaging information from a prospective cohort that underwent CT angiography (CTA) at admission.

Methods

In a prospectively designed study, 649 patients diagnosed with acute ischemic stroke had admission NIH Stroke Scale (NIHSS) scores, noncontrast CT (NCCT), CTA, and 6-month outcome assessed using modified Rankin scale (mRS) scores. Poor outcome was defined as mRS >2. Strokes were classified as "major" by (1) the Alberta Stroke Program Early CT Score (ASPECTS+) if NCCT ASPECTS was ≤7; (2) the Boston Acute Stroke Imaging Scale (BASIS+) if they were ASPECTS+ or CTA showed occlusion of the distal internal carotid, proximal middle cerebral, or basilar arteries; and (3) the NIHSS if the score was >10.

Results

Of 649 patients, 253 (39.0%) had poor outcomes. NIHSS, BASIS, and age, but not ASPECTS, were independent predictors of outcome. BASIS and NIHSS had similar sensitivities, both superior to ASPECTS (p<0.0001). Combining NIHSS with BASIS was highly predictive: 77.6% (114/147) of patients classified as NIHSS>10/BASIS+ had poor outcomes, versus 21.5% (77/358) of those classified as NIHSS≤10/BASIS− (p<0.0001), regardless of treatment. The odds ratio for poor outcome was 12.6 (95% CI: 7.9 to 20.0) for patients who were NIHSS>10/BASIS+ compared with patients who were NIHSS≤10/BASIS−; the odds ratio was 5.4 (95% CI: 3.5 to 8.5) compared with patients who were only NIHSS>10 or only BASIS+.
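
The first odds ratio can be reproduced directly from the counts in this paragraph. A small sketch follows; the Wald confidence interval is an assumption about how the CI was computed.

```python
# Odds ratio and Wald 95% CI from the 2x2 counts given above
# (poor outcome in 114/147 NIHSS>10/BASIS+ patients vs. 77/358
# NIHSS<=10/BASIS- patients).
import math

a, b = 114, 147 - 114     # poor / good outcome, NIHSS>10 & BASIS+
c, d = 77, 358 - 77       # poor / good outcome, NIHSS<=10 & BASIS-

or_  = (a * d) / (b * c)
se   = math.sqrt(1/a + 1/b + 1/c + 1/d)
low  = math.exp(math.log(or_) - 1.96 * se)
high = math.exp(math.log(or_) + 1.96 * se)
print(f"OR = {or_:.1f} (95% CI {low:.1f}-{high:.1f})")
```

This reproduces the reported OR of 12.6 (95% CI 7.9–20.0).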

Conclusions

BASIS and NIHSS are independent outcome predictors. Their combination is stronger than either instrument alone in predicting outcomes. The findings suggest that CTA is a significant clinical tool in routine acute stroke assessment.

13.

Purpose

The role of the spot sign on computed tomography angiography (CTA) in predicting hematoma expansion (HE) after primary intracerebral hemorrhage (ICH) has been the focus of many studies. Our study sought to evaluate the predictive accuracy of the spot sign for HE using a meta-analytic approach.

Materials and Methods

The PubMed, Embase, and Cochrane Library databases were searched for eligible studies. Studies were included if they reported data on HE in primary ICH patients assessed by the spot sign on first-pass CTA. Studies with additional data from second-pass CTA, post-contrast CT (PCCT), or CT perfusion (CTP) were also included.

Results

Eighteen studies were pooled in the meta-analysis, including 14 studies of first-pass CTA and 7 of combined CT modalities. For predicting HE, studies of first-pass CTA showed a sensitivity of 53% (95% CI, 49%–57%) and a specificity of 88% (95% CI, 86%–89%). The pooled positive likelihood ratio (PLR) was 4.70 (95% CI, 3.28–6.74) and the negative likelihood ratio (NLR) was 0.44 (95% CI, 0.34–0.58). For studies of combined CT modalities, the sensitivity was 73% (95% CI, 67%–79%) and the specificity was 88% (95% CI, 86%–90%). The aggregated PLR was 6.76 (95% CI, 3.70–12.34) and the overall NLR was 0.17 (95% CI, 0.06–0.48).
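
Pooled sensitivities of this kind are typically obtained with a random-effects model. Below is a generic DerSimonian-Laird sketch on the logit scale; the per-study true-positive/false-negative counts are hypothetical, since the abstract does not provide study-level data, and the exact pooling method used by the authors is an assumption.

```python
# DerSimonian-Laird random-effects pooling of study sensitivities on the
# logit scale, back-transformed to a proportion. Counts are hypothetical.
import numpy as np

tp = np.array([40, 25, 60, 18, 33])        # true positives per study
fn = np.array([35, 22, 50, 20, 28])        # false negatives per study

y = np.log(tp / fn)                        # logit(sensitivity)
v = 1 / tp + 1 / fn                        # within-study variance of the logit

w = 1 / v                                  # fixed-effect weights
y_fixed = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - y_fixed) ** 2)
tau2 = max(0.0, (Q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_re = 1 / (v + tau2)                      # random-effects weights
y_re = np.sum(w_re * y) / np.sum(w_re)
pooled_sens = 1 / (1 + np.exp(-y_re))      # back-transform from the logit scale
print(f"pooled sensitivity = {pooled_sens:.2f}, tau^2 = {tau2:.3f}")
```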

Conclusions

The spot sign appears to be a reliable imaging biomarker for HE. Additional detection of a delayed spot sign helped improve the predictive accuracy of the early spot sign. Our results may inform primary ICH care by providing supporting evidence for the use of combined CT modalities to detect spot signs.

14.

Background

Certain population groups in Chad have been rendered vulnerable by the displacement of more than 200,000 people over the last three years as a result of mass violence against civilians in the east of the country. The objective of the study was to assess mortality and nutritional patterns among displaced and non-displaced populations living in camps, villages, and a town in the Ouaddaï and Salamat regions of Chad.

Methodology

Between May and October 2007, two-stage, 30-cluster household surveys were conducted among 43,900 internally displaced persons (IDPs) living in camps in the Ouaddaï region (n = 898 households), among 19,400 non-displaced persons (NDPs) living in 42 villages in the Ouaddaï region (n = 900 households), and among 17,000 NDPs living in a small town in the Salamat region (n = 901 households). Data collection included anthropometric measurements, measles vaccination rates, and retrospective mortality. The main outcome measures were the crude mortality rate (CMR), the mortality rate among children younger than 5 years (U5MR), causes of death, and the prevalence of wasting (weight-for-height z score <−2) among children aged 6 to 59 months.

Conclusions

The CMR among the 4,902 IDPs surveyed in the Gozbeida camps, the 4,477 NDPs surveyed in villages, and the 4,073 NDPs surveyed in a town was 1.8 (95% CI, 1.2–2.8), 0.3 (95% CI, 0.2–0.4), and 0.3 (95% CI, 0.2–0.5) per 10,000 per day, respectively. The U5MR in the camps (n = 904), the villages (n = 956), and the town (n = 901) was 4.1 (95% CI, 2.1–7.7), 0.5 (95% CI, 0.3–0.9), and 0.7 (95% CI, 0.4–1.4) per 10,000 per day, respectively. Diarrhoea was reported to be the main cause of death. Acute malnutrition rates (according to the WHO definition) among the 904 IDP children, the 956 NDP children living in villages, and the 901 NDP children living in a town, all aged 6 to 59 months, were 20.6% (95% CI, 17.9%–23.3%), 16.4% (95% CI, 14.0%–18.8%), and 10.1% (95% CI, 8.1%–12.2%), respectively. The study found a high mortality rate among IDPs and an elevated prevalence of wasting not only in IDP camps but also in villages located in the same region. The town-dwelling population remains at risk of malnutrition. Appropriate contingency plans need to be made to ensure acceptable living standards for these populations.
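
Mortality rates of this form are deaths divided by person-time, scaled to 10,000 person-days. The sketch below shows the calculation with an exact (Garwood) Poisson confidence interval; the death count and recall period are hypothetical, chosen only to illustrate the arithmetic, and the surveyed population size is taken from the abstract.

```python
# Crude mortality rate per 10,000 person-days with an exact (Garwood)
# Poisson CI. Death count and recall period are hypothetical.
from scipy.stats import chi2

deaths      = 80        # hypothetical number of deaths recalled
persons     = 4902      # IDPs surveyed in the camps
recall_days = 90        # hypothetical recall period

person_days = persons * recall_days
scale = 10_000 / person_days

rate = deaths * scale
low  = chi2.ppf(0.025, 2 * deaths) / 2 * scale
high = chi2.ppf(0.975, 2 * (deaths + 1)) / 2 * scale
print(f"CMR = {rate:.1f} (95% CI {low:.1f}-{high:.1f}) per 10,000 per day")
```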

15.

Introduction

There are few reports in the literature estimating the epidemiologic characteristics of pediatric chronic dialysis. These patients have impaired physical growth, a high number of comorbidities, and a great need for continuous attention from specialized services, with high demand for complex and costly procedures.

Objective

The aim of this study was to estimate the incidence and prevalence rates and describe the characteristics of children and adolescents undergoing chronic dialysis treatment in a Brazilian demographic health survey.

Materials and Methods

A cross-sectional study was performed in a representative sample of dialysis centers (nc = 239) drawn from the 2011 Brazilian Nephrology Society Census (Nc = 708). We collected data encompassing the five Brazilian macro-regions and analyzed the data from all patients under 19 years of age. The sample population consisted of 643 children and adolescents who were on a chronic dialysis program at any time in 2012. Data collection was carried out at the dialysis services by means of reviews of patients' records and personal interviews with the centers' leaders.

Results

We estimated that there were a total of 1,283 pediatric patients on chronic dialysis treatment in Brazil, resulting in a prevalence of 20.0 cases per million age-related population (pmarp) (95% CI: 14.8–25.3) and an incidence of 6.6 cases pmarp in 2012 (95% CI: 4.8–8.4). The South region had the highest prevalence and incidence rates of patients under dialysis therapy, 27.7 (95% CI: 7.3–48.1) and 11.0 (95% CI: 2.8–19.3) cases pmarp, respectively; the lowest prevalence and incidence rates were found in the North-Midwest region, 13.8 (95% CI: 6.2–21.4), and in the Northeast region, 3.8 (95% CI: 1.4–6.3) cases pmarp, respectively.

Conclusion

Brazil has an overall low prevalence of children on chronic dialysis treatment, close to the rates of other countries with a similar socioeconomic profile. There are substantial regional differences in pediatric chronic dialysis treatment. Joint strategies to reduce inequities and improve access to treatment and adequacy of services across the Brazilian regions are necessary to provide an appropriate care setting for this population group.

16.

Objective

Depression is a major cause of disability in working populations, and reducing socioeconomic inequalities in disability is an important public health challenge. We examined work disability due to depression using four indicators of socioeconomic status.

Methods

A prospective cohort study of 125,355 Finnish public sector employees was linked to national register data on work disability (>9 days) due to depressive disorders (International Classification of Diseases codes F32–F34) from January 2005 to December 2011. Primary outcomes were the onset of work disability due to depressive disorders and, among those with such disability, return to work and recurrent episodes of work disability due to depression.

Results

We found a consistent inverse socioeconomic gradient in work disability due to depression. Lower occupational position, lower educational level, smaller residence size, and rented (vs. owner-occupied) residence were all associated with an increased risk of work disability. Return to work was slower for employees with basic education (cumulative odds ratio = 1.21, 95% CI: 1.05–1.39) compared to those with higher education. Recurrent work disability episodes due to depression were less common among upper-grade non-manual workers (the highest occupational group) than among lower-grade non-manual (hazard ratio = 1.16, 95% CI: 1.07–1.25) and manual (hazard ratio = 1.14, 95% CI: 1.02–1.26) workers.
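
The recurrence estimates above come from a proportional hazards model. A brief sketch of that kind of analysis follows, assuming the lifelines package is available; the data frame, column names, and values are hypothetical, not the study's registry data.

```python
# Cox proportional hazards sketch for recurrent work-disability risk by
# occupational group; data and column names are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "followup_days": [400, 900, 250, 1200, 700, 300, 1100, 500],
    "recurrence":    [1,   0,   1,   0,    0,   1,   0,    1],
    "manual_worker": [1,   0,   1,   1,    0,   0,   0,    1],
    "age":           [45,  38,  52,  41,   50,  36,  44,   48],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_days", event_col="recurrence")
cph.print_summary()   # hazard ratios with 95% CIs
```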

Conclusions

These data from Finnish public sector employees show persistent socioeconomic inequalities in work disability due to depression from 2005 to 2011 in terms of onset, recovery and recurrence.

17.

Objective

We aimed to evaluate the accuracy of the 16S ribosomal ribonucleic acid (rRNA) gene polymerase chain reaction (PCR) test in the diagnosis of bloodstream infections through a systematic review and meta-analysis.

Methods

A computerized literature search was conducted to identify studies that assessed the diagnostic value of the 16S rRNA gene PCR test for bloodstream infections. Study quality was assessed using the revised Quality Assessment of Diagnostic Accuracy Studies (QUADAS-2) tool. We calculated the sensitivity, specificity, positive likelihood ratio (PLR), negative likelihood ratio (NLR), diagnostic odds ratio (DOR), and their 95% confidence intervals (95% CI) for each study. A summary receiver operating characteristic (SROC) curve was used to summarize overall test performance. Statistical analysis was performed with Meta-DiSc 1.4 and Stata/SE 12.0 software.

Results

Twenty-eight studies were included in our meta-analysis. Using a random-effects model, the pooled sensitivity, specificity, PLR, NLR, and DOR were 0.87 (95% CI, 0.85–0.89), 0.94 (95% CI, 0.93–0.95), 12.65 (95% CI, 8.04–19.90), 0.14 (95% CI, 0.08–0.24), and 116.76 (95% CI, 52.02–262.05), respectively. The SROC curve indicated an area under the curve (AUC) of 0.9690 and a maximum joint sensitivity and specificity (Q*) of 0.9183. In addition, heterogeneity was statistically significant but was not caused by the threshold effect.

Conclusion

Existing data suggest that the 16S rRNA gene PCR test is a practical tool for the rapid screening of sepsis. Further prospective studies are needed to assess the diagnostic value of PCR amplification and DNA microarray hybridization of the 16S rRNA gene.

18.
The aims of the present study were to identify risk factors associated with latent tuberculosis (TB) infection, examine the development of active disease among contacts, and assess the effectiveness of treating latent infection in indigenous Brazilians from January 2006 to December 2011. This was a retrospective study of 1,371 tuberculosis contacts, 392 of whom underwent treatment for latent infection. TB morbidity data were obtained from the Information System for Disease Notification (SINAN) database, and the contacts' data were collected from clinical records using forms employed by the Special Department of Indigenous Health (SESAI) multidisciplinary teams, according to SESAI's instructions. The variables associated with latent infection among the contacts were age (odds ratio [OR]: 1.03; 95% confidence interval [CI]: 1.02–1.04) and close contact with a smear-positive index case (OR: 2.26, 95% CI: 1.59–3.22). The variables associated with the development of active TB among the contacts were a tuberculin skin test (TST) ≥10 mm (relative risk [RR]: 1.12, 95% CI: 1.07–1.17), age (RR: 1.01, 95% CI: 1.00–1.03), and treatment of latent infection (RR: 0.03, 95% CI: 0.01–0.27). The estimated number of latent infection treatments needed to prevent one case of active TB among the contacts was 51 (95% CI: 33–182). Among contacts with TST ≥10 mm, 10 (95% CI: 6–19) latent infection treatments were needed to prevent one case of active TB. Age and close contact with a smear-positive index case were associated with latent TB. Screening with TST is a high priority for contacts of smear-positive index cases. Age and TST are associated with the development of active TB among contacts, and treatment of latent infection is an effective measure to control TB in indigenous communities.
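
The "number needed to treat" figures above are the reciprocal of the absolute risk reduction. A tiny sketch of that calculation follows; the baseline risk is a hypothetical value chosen so that, together with the reported relative risk of 0.03, it reproduces an NNT of about 51.

```python
# Number needed to treat (NNT) as the reciprocal of the absolute risk
# reduction. The baseline risk is hypothetical.
risk_untreated = 0.0203                   # hypothetical baseline risk of active TB
risk_treated   = risk_untreated * 0.03    # applying the reported RR of 0.03

arr = risk_untreated - risk_treated       # absolute risk reduction
nnt = 1 / arr
print(f"ARR = {arr:.4f}, NNT = {nnt:.0f}")
```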

19.

Background

Acquisition of a disability in adulthood has been associated with a reduction in mental health. We tested the hypothesis that low wealth prior to disability acquisition is associated with a greater deterioration in mental health than high wealth.

Methods

We assessed whether the level of wealth prior to disability acquisition modifies this association using 12 waves of data (2001–2012) from the Household, Income and Labour Dynamics in Australia survey, a population-based cohort study of working-age Australians. Eligible participants reported at least two consecutive waves of disability preceded by at least two consecutive waves without disability (1,977 participants, 13,518 observations). Fixed-effects linear regression was conducted with a product term between wealth prior to disability (in tertiles) and disability acquisition, with the mental health component score of the SF-36 as the outcome.

Results

In models adjusted for time-varying confounders, there was evidence of negative effect measure modification by prior wealth of the association between disability acquisition and mental health (interaction term for the lowest wealth tertile: −2.2 points, 95% CI −3.1 to −1.2, p<0.001); low wealth was associated with a greater decline in mental health following disability acquisition (−3.3 points, 95% CI −4.0 to −2.5) than high wealth (−1.1 points, 95% CI −1.7 to −0.5).

Conclusion

The findings suggest that low wealth prior to disability acquisition in adulthood results in a greater deterioration in mental health than among those with high wealth.
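
The model described above is a within-person (fixed-effects) linear regression with a disability-by-wealth product term. A rough sketch of one way to specify such a model, using person dummies in statsmodels, is shown below; the variable names and simulated data are hypothetical, and the time-invariant wealth main effect is absorbed by the person dummies, so only the interaction is estimated.

```python
# Within-person (fixed-effects) sketch: disability x prior-wealth
# interaction on an SF-36 mental health score. Data are simulated;
# C(pid) dummies absorb all time-invariant person characteristics.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_people, n_waves = 50, 6
df = pd.DataFrame({
    "pid":        np.repeat(np.arange(n_people), n_waves),
    "disability": rng.integers(0, 2, n_people * n_waves),
    "wealth_low": np.repeat(rng.integers(0, 2, n_people), n_waves),
})
df["mcs"] = (50 - 2 * df["disability"]
             - 2 * df["disability"] * df["wealth_low"]
             + rng.normal(0, 5, len(df)))

model = smf.ols("mcs ~ disability + disability:wealth_low + C(pid)", data=df).fit()
print(model.params[["disability", "disability:wealth_low"]])
```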

20.