Similar Documents
20 similar documents found.
1.

Background

Centenarians are a rapidly growing demographic group worldwide, yet their health and social care needs are seldom considered. This study aims to examine trends in place of death, and the factors associated with it, among centenarians in England over 10 years, and to consider the policy implications of extreme longevity.

Methods and Findings

This is a population-based observational study using death registration data linked with an area-level index of multiple deprivation for people aged ≥100 years who died between 2001 and 2010 in England, compared with those dying at ages 80–99. We used linear regression to examine the time trends in number of deaths and place of death, and Poisson regression to evaluate factors associated with centenarians’ place of death. The cohort totalled 35,867 people with a median age at death of 101 years (range: 100–115 years). Centenarian deaths increased 56% (95% CI 53.8%–57.4%) in 10 years. Most died in a care home with nursing (26.7%, 95% CI 26.3%–27.2%) or without nursing (34.5%, 95% CI 34.0%–35.0%) or in hospital (27.2%, 95% CI 26.7%–27.6%). The proportion of deaths in nursing homes decreased over 10 years (−0.36% annually, 95% CI −0.63% to −0.09%, p = 0.014), while hospital deaths changed little (0.25% annually, 95% CI −0.06% to 0.57%, p = 0.09). Dying with frailty was common, with “old age” stated in 75.6% of death certifications. Centenarians were more likely to die of pneumonia (e.g., 17.7% [95% CI 17.3%–18.1%] versus 6.0% [5.9%–6.0%] for those aged 80–84 years) and old age/frailty (28.1% [27.6%–28.5%] versus 0.9% [0.9%–0.9%] for those aged 80–84 years) and less likely to die of cancer (4.4% [4.2%–4.6%] versus 24.5% [24.6%–25.4%] for those aged 80–84 years) and ischemic heart disease (8.6% [8.3%–8.9%] versus 19.0% [18.9%–19.0%] for those aged 80–84 years) than were younger elderly patients. More care home beds available per 1,000 population were associated with fewer deaths in hospital (prevalence ratio [PR] 0.98, 95% CI 0.98–0.99, p<0.001).

Conclusions

Centenarians are more likely to have pneumonia and frailty certified as causes of death, and less likely to have cancer or ischemic heart disease, compared with younger elderly patients. Reducing reliance on hospital care at the end of life requires recognition of centenarians’ increased likelihood of “acute” decline, notably from pneumonia, wider provision of anticipatory care to enable people to remain in their usual residence, and increased care home bed capacity. Please see later in the article for the Editors’ Summary.
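The annual-change estimates in the abstract above (e.g., −0.36% per year for nursing home deaths) come from linear regression of yearly proportions on calendar year. A minimal stand-alone sketch, using illustrative figures rather than the study's actual data:

```python
# Least-squares slope of yearly death proportions on calendar year,
# mirroring the abstract's trend analysis. The data below are
# hypothetical, not the study's figures.
def ols_slope(years, props):
    n = len(years)
    mx = sum(years) / n
    my = sum(props) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(years, props))
    sxx = sum((x - mx) ** 2 for x in years)
    return sxy / sxx  # change in percentage points per year

years = list(range(2001, 2011))
# hypothetical % of centenarian deaths occurring in nursing homes each year
props = [28.0, 27.7, 27.3, 27.0, 26.6, 26.3, 25.9, 25.6, 25.2, 24.9]
slope = ols_slope(years, props)
print(round(slope, 3))  # -0.348 percentage points per year
```

The study additionally reports a confidence interval and p-value for the slope, which require the residual standard error on top of this point estimate.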

2.
BackgroundCervical cancer screening strategies using visual inspection or cytology may have suboptimal diagnostic accuracy for detection of precancer in women living with HIV (WLHIV). The optimal screen and screen–triage strategy, age to initiate, and frequency of screening for WLHIV remain unclear. This study evaluated the sensitivity, specificity, and positive predictive value of different cervical cancer strategies in WLHIV in Africa.Methods and findingsWLHIV aged 25–50 years attending HIV treatment centres in Burkina Faso (BF) and South Africa (SA) from 5 December 2011 to 30 October 2012 were enrolled in a prospective evaluation study of visual inspection using acetic acid (VIA) or visual inspection using Lugol’s iodine (VILI), high-risk human papillomavirus DNA test (Hybrid Capture 2 [HC2] or careHPV), and cytology for histology-verified high-grade cervical intraepithelial neoplasia (CIN2+/CIN3+) at baseline and endline, a median 16 months later. Among 1,238 women (BF: 615; SA: 623), median age was 36 and 34 years (p < 0.001), 28.6% and 49.6% ever had prior cervical cancer screening (p < 0.001), and 69.9% and 64.2% were taking ART at enrolment (p = 0.045) in BF and SA, respectively. CIN2+ prevalence was 5.8% and 22.4% in BF and SA (p < 0.001), respectively. VIA had low sensitivity for CIN2+ (44.7%, 95% confidence interval [CI] 36.9%–52.7%) and CIN3+ (56.1%, 95% CI 43.3%–68.3%) in both countries, with specificity for ≤CIN1 of 78.7% (95% CI 76.0%–81.3%). HC2 had sensitivity of 88.8% (95% CI 82.9%–93.2%) for CIN2+ and 86.4% (95% CI 75.7%–93.6%) for CIN3+. Specificity for ≤CIN1 was 55.4% (95% CI 52.2%–58.6%), and screen positivity was 51.3%. Specificity was higher with a restricted genotype (HPV16/18/31/33/35/45/52/58) approach (73.5%, 95% CI 70.6%–76.2%), with lower screen positivity (33.7%), although there was lower sensitivity for CIN3+ (77.3%, 95% CI 65.3%–86.7%). 
In BF, HC2 was more sensitive for CIN2+/CIN3+ compared to VIA/VILI (relative sensitivity for CIN2+ = 1.72, 95% CI 1.28–2.32; CIN3+: 1.18, 95% CI 0.94–1.49). Triage of HC2-positive women with VIA/VILI reduced the number of colposcopy referrals, at a loss in sensitivity for CIN2+ (58.1%) though not for CIN3+ (84.6%). In SA, cytology high-grade squamous intraepithelial lesion or greater (HSIL+) had the best combination of sensitivity (CIN2+: 70.1%, 95% CI 61.3%–77.9%; CIN3+: 80.8%, 95% CI 67.5%–90.4%) and specificity (81.6%, 95% CI 77.6%–85.1%). HC2 had similar sensitivity for CIN3+ (83.0%, 95% CI 70.2%–91.9%) but lower specificity compared to HSIL+ (42.7%, 95% CI 38.4%–47.1%; relative specificity = 0.57, 95% CI 0.52–0.63), resulting in almost twice as many referrals. Compared to HC2, triage of HC2-positive women with HSIL+ resulted in a 40% reduction in colposcopy referrals but was associated with some loss in sensitivity. CIN2+ incidence over a median 16 months was highest among VIA baseline screen-negative women (2.2%, 95% CI 1.3%–3.7%) and women who were baseline double-negative with HC2 and VIA (2.1%, 95% CI 1.3%–3.5%) and lowest among HC2 baseline screen-negative women (0.5%, 95% CI 0.1%–1.8%). Limitations of our study are that the WLHIV included may not reflect a contemporary cohort of WLHIV initiating ART in the universal ART era and that we did not evaluate HPV tests available in study settings today.ConclusionsIn this cohort study among WLHIV in Africa, a human papillomavirus (HPV) test targeting 14 high-risk (HR) types had higher sensitivity to detect CIN2+ compared to visual inspection but had low specificity, although a restricted genotype approach targeting 8 HR types decreased the number of unnecessary colposcopy referrals. Cytology HSIL+ had optimal performance for CIN2+/CIN3+ detection in SA. Triage of HPV-positive women with HSIL+ maintained high specificity but with some loss in sensitivity compared to HC2 alone.

In this cohort study, Helen Kelly and colleagues explore cervical cancer screening strategies for women living with HIV.
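The sensitivity, specificity, and PPV figures above are standard 2×2-table quantities, each reported with a binomial confidence interval. A minimal sketch with hypothetical counts (not the study's data), using the Wilson score interval:

```python
import math

def wilson_ci(k, n, z=1.96):
    """Wilson score 95% CI for the proportion k/n."""
    p = k / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return centre - half, centre + half

# Hypothetical screen-vs-histology 2x2 counts (not the study's data):
tp, fn, fp, tn = 89, 11, 446, 554
sensitivity = tp / (tp + fn)   # 0.89
specificity = tn / (tn + fp)   # 0.554
ppv = tp / (tp + fp)           # positive predictive value
lo, hi = wilson_ci(tp, tp + fn)
print(f"sensitivity {sensitivity:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```

The Wilson interval behaves better than the simple Wald interval near 0% and 100%, which matters for the near-perfect sensitivities reported in abstracts like this one.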

3.
ObjectiveCurrent practice guidelines recommend the routine use of several cardiac medications early in the course of acute myocardial infarction (AMI). Our objective was to analyze temporal trends in medication use and in-hospital mortality of AMI patients in a Chinese population.MethodsThis is a retrospective observational study using electronic medical records from the hospital information system (HIS) of 14 Chinese hospitals. We identified 5,599 patients with AMI between 2005 and 2011. Factors associated with medication use and in-hospital mortality were explored using hierarchical logistic regression.ResultsUse of the guideline-recommended medications increased during the study period: statins (57.7% to 90.1%), clopidogrel (61.8% to 92.3%), β-blockers (45.4% to 65.1%), ACEI/ARB (46.7% to 58.7%), and aspirin (81.9% to 92.9%), and use of the combination thereof increased from 24.9% to 42.8% (P<0.001 for all). Multivariate analyses showed statistically significant increases for all these medications. In-hospital mortality decreased from 15.9% to 5.7% between 2005 and 2011 (P<0.001). After multivariate adjustment, admission year remained a significant factor (OR = 0.87, 95% CI 0.79–0.96, P = 0.007), and use of aspirin (OR = 0.64, 95% CI 0.46–0.87), clopidogrel (OR = 0.44, 95% CI 0.31–0.61), ACEI/ARB (OR = 0.73, 95% CI 0.56–0.94), and statins (OR = 0.54, 95% CI 0.40–0.73) was associated with lower in-hospital mortality. Patients with older age, cancer, and renal insufficiency had higher in-hospital mortality, yet were generally less likely to receive these medications.ConclusionUse of guideline-recommended medications early in the course of AMI increased between 2005 and 2011 in a Chinese population. Over the same period, in-hospital mortality decreased.
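The odds ratios above come from logistic regression; the unadjusted version of such an OR can be computed directly from a 2×2 table with a Woolf (log-scale) confidence interval. A sketch with hypothetical counts, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio from a 2x2 table (a=exposed cases, b=exposed non-cases,
    c=unexposed cases, d=unexposed non-cases) with a Woolf 95% CI."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: in-hospital death by aspirin use (not study data).
# 60/1000 deaths among aspirin users vs 90/1000 among non-users.
or_, lo, hi = odds_ratio_ci(60, 940, 90, 910)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # 0.65 0.46 0.91
```

An OR below 1 with an upper CI bound below 1, as here, is the pattern the abstract reports for aspirin, clopidogrel, ACEI/ARB, and statins; the study's actual estimates are adjusted for covariates, which a raw 2×2 OR is not.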

4.
BackgroundThe COVID-19 epidemic in the United States is widespread, with more than 200,000 deaths reported as of September 23, 2020. While ecological studies show higher burdens of COVID-19 mortality in areas with higher rates of poverty, little is known about social determinants of COVID-19 mortality at the individual level.Methods and findingsWe estimated the proportions of COVID-19 deaths by age, sex, race/ethnicity, and comorbid conditions using their reported univariate proportions among COVID-19 deaths and correlations among these variables in the general population from the 2017–2018 National Health and Nutrition Examination Survey (NHANES). We used these proportions to randomly sample individuals from NHANES. We analyzed the distributions of COVID-19 deaths by race/ethnicity, income, education level, and veteran status. We analyzed the association of these characteristics with mortality by logistic regression. Summary demographics of deaths include mean age 71.6 years, 45.9% female, and 45.1% non-Hispanic white. We found that disproportionate deaths occurred among individuals with nonwhite race/ethnicity (54.8% of deaths, 95% CI 49.0%–59.6%, p < 0.001), individuals with income below the median (67.5%, 95% CI 63.4%–71.5%, p < 0.001), individuals with less than a high school level of education (25.6%, 95% CI 23.4%–27.9%, p < 0.001), and veterans (19.5%, 95% CI 15.8%–23.4%, p < 0.001). Except for veteran status, these characteristics are significantly associated with COVID-19 mortality in multiple logistic regression. 
Limitations include the lack of institutionalized people in the sample (e.g., nursing home residents and incarcerated persons), the need to use comorbidity data collected from outside the US, and the assumption of the same correlations among variables for the noninstitutionalized population and COVID-19 decedents.ConclusionsSubstantial inequalities in COVID-19 mortality are likely, with disproportionate burdens falling on those who are of racial/ethnic minorities, are poor, have less education, and are veterans. Healthcare systems must ensure adequate access to these groups. Public health measures should specifically reach these groups, and data on social determinants should be systematically collected from people with COVID-19.

In this simulation study, Benjamin Seligman and colleagues explore socio-demographic factors associated with COVID-19 deaths in the US.

5.

Background

Financial ties between health professionals and industry may unduly influence professional judgments, and some researchers have suggested that widening disease definitions may be one driver of over-diagnosis, bringing potentially unnecessary labeling and harm. We aimed to identify guidelines in which disease definitions were changed, to assess whether any proposed changes would increase the numbers of individuals considered to have the disease, whether potential harms of expanding disease definitions were investigated, and the extent of members’ industry ties.

Methods and Findings

We undertook a cross-sectional study of the most recent publication between 2000 and 2013 from national and international guideline panels making decisions about definitions or diagnostic criteria for common conditions in the United States. We assessed whether proposed changes widened or narrowed disease definitions, the rationales offered, mention of potential harms of those changes, and the nature and extent of disclosed ties between members and pharmaceutical or device companies. Of 16 publications on 14 common conditions, ten proposed changes that widened definitions and one that narrowed them. For five, the impact was unclear. Widening fell into three categories: creating “pre-disease”; lowering diagnostic thresholds; and proposing earlier or different diagnostic methods. Rationales included standardising diagnostic criteria and new evidence about risks for people previously considered not to have the disease. No publication included rigorous assessment of potential harms of proposed changes. Among 14 panels with disclosures, the average proportion of members with industry ties was 75%. Twelve were chaired by people with ties. For members with ties, the median number of companies to which they had ties was seven. Companies with ties to the highest proportions of members were active in the relevant therapeutic area. Limitations arise from reliance on only disclosed ties, and exclusion of conditions too broad to enable analysis of single panel publications.

Conclusions

For the common conditions studied, a majority of panels proposed changes to disease definitions that increased the number of individuals considered to have the disease, none reported rigorous assessment of the potential harms of that widening, and most had a majority of members disclosing financial ties to pharmaceutical companies. Please see later in the article for the Editors’ Summary.

6.
BackgroundGlucose-6-phosphate dehydrogenase (G6PD) deficiency is a common enzyme deficiency, prevalent in many malaria-endemic countries. G6PD-deficient individuals are susceptible to hemolysis during oxidative stress, which can occur from exposure to certain medications, including 8-aminoquinolines used to treat Plasmodium vivax malaria. Accordingly, access to point-of-care (POC) G6PD testing in Brazil is critical for safe treatment of P. vivax malaria.Methodology/Principal findingsThis study evaluated the performance of the semi-quantitative, POC STANDARD G6PD Test (SD Biosensor, Republic of Korea). Participants were recruited at clinics and through an enriched sample in Manaus and Porto Velho, Brazil. G6PD and hemoglobin measurements were obtained from capillary samples at the POC using the STANDARD and HemoCue 201+ (HemoCue AB, Sweden) tests. A thick blood slide was prepared for malaria microscopy. At the laboratories, the STANDARD and HemoCue tests were repeated on venous samples and a quantitative spectrophotometric G6PD reference assay was performed (Pointe Scientific, Canton, MI). G6PD was also assessed by fluorescent spot test. In Manaus, a complete blood count was performed.Samples were analyzed from 1,736 participants. In comparison to spectrophotometry, the STANDARD G6PD Test performed equivalently in determining G6PD status in venous and capillary specimens under varied operating temperatures. Using the manufacturer-recommended reference value thresholds, the test’s sensitivity at the <30% threshold on both specimen types was 100% (95% confidence interval [CI] venous 93.6%–100.0%; capillary 93.8%–100.0%). Specificity was 98.6% on venous specimens (95% CI 97.9%–99.1%) and 97.8% on capillary (95% CI 97.0%–98.5%). At the 70% threshold, the test’s sensitivity was 96.9% on venous specimens (95% CI 83.8%–99.9%) and 94.3% on capillary (95% CI 80.8%–99.3%). 
Specificity was 96.5% (95% CI 95.0%–97.6%) and 92.3% (95% CI 90.3%–94.0%) on venous and capillary specimens, respectively.Conclusion/SignificanceThe STANDARD G6PD Test is a promising tool to aid in POC detection of G6PD deficiency in Brazil.Trial registrationThis study was registered with ClinicalTrials.gov (identifier: NCT04033640).
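The 30% and 70% thresholds above are fractions of an adjusted male median (AMM) G6PD activity. A sketch of the resulting three-way classification; the AMM value is hypothetical, and the intermediate band is chiefly relevant for heterozygous females:

```python
# Classify measured G6PD activity (U/g Hb) against an adjusted male
# median (AMM), using the 30% and 70% thresholds named in the abstract.
def g6pd_category(activity, amm):
    pct = activity / amm * 100
    if pct < 30:
        return "deficient"
    elif pct < 70:
        return "intermediate"
    return "normal"

AMM = 8.0  # hypothetical adjusted male median, U/g Hb (site-specific)
print(g6pd_category(1.6, AMM))  # deficient (20% of AMM)
print(g6pd_category(4.0, AMM))  # intermediate (50% of AMM)
print(g6pd_category(7.2, AMM))  # normal (90% of AMM)
```

Clinically, the deficient band contraindicates standard 8-aminoquinoline dosing, which is why sensitivity at the <30% threshold is the headline figure for such tests.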

7.
BackgroundSuicide is the leading cause of death among 15–34-year-olds in China, but no national data are available on the suicide and suicide attempt rates of college students, a subgroup of 23 million young people. Several studies have reported the prevalence of suicide attempts among college students; however, no meta-analysis pooling these prevalence estimates has been published.ResultsA total of 29 eligible studies, covering 88,225 college students, were included. The minimum and maximum reported prevalences of suicide attempts among college students in China were 0.4% and 10.5%, respectively. The pooled prevalence of suicide attempts was 2.8% (95% CI: 2.3%–3.3%). Subgroup analyses showed that the pooled prevalence of lifetime suicide attempts was 2.7% (95% CI: 2.1%–3.3%), and of 12-month suicide attempts 2.9% (95% CI: 2.0%–3.8%). The prevalence for males was 2.4% (95% CI: 1.8%–3.0%), and for females 2.7% (95% CI: 1.9%–3.7%). The prevalences among college students in grades 1 through 4 were 2.8% (95% CI: 1.7%–3.8%), 1.8% (95% CI: 1.2%–2.3%), 2.0% (95% CI: 0.8%–3.1%), and 2.9% (95% CI: 0.1%–6.7%), respectively. The prevalences among college students from rural and urban areas were 5.1% (95% CI: 2.8%–7.5%) and 3.7% (95% CI: 1.4%–5.9%), respectively.ConclusionsA pooled prevalence of 2.8% and an estimated more than 600,000 suicide attempters among college students indicate that suicide attempts among college students are an important public health problem in China. More attention should be paid to this situation.
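Pooling prevalences across heterogeneous studies, as above, is typically done with a random-effects model. A minimal DerSimonian–Laird sketch on four hypothetical studies (not the 29 from the abstract):

```python
import math

def dersimonian_laird(estimates, variances):
    """Random-effects pooled estimate and 95% CI (DerSimonian-Laird)."""
    w = [1 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, estimates)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, estimates))
    df = len(estimates) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)          # between-study variance
    w_star = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_star, estimates)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se

# Hypothetical (prevalence, sample size) pairs; variance = p(1-p)/n.
studies = [(0.021, 1500), (0.034, 900), (0.025, 2500), (0.030, 1200)]
ests = [p for p, n in studies]
vars_ = [p * (1 - p) / n for p, n in studies]
pooled, lo, hi = dersimonian_laird(ests, vars_)
print(f"pooled prevalence {pooled:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```

Published meta-analyses of proportions usually pool on a transformed scale (logit or double-arcsine) to stabilize variances; pooling raw proportions, as this sketch does, is a simplification.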

8.
BackgroundFoot complications are a serious consequence of diabetes mellitus, posing a major medical and economic threat. Identifying the extent of this problem and its risk factors will enable health providers to set up better prevention programs. The Saudi National Diabetes Registry (SNDR), being a large database source, is well suited to evaluating this problem.MethodsThis is a cross-sectional study of a cohort of 62,681 patients aged ≥25 years from the SNDR database, selected for studying foot complications associated with diabetes and related risk factors.ResultsThe overall prevalence of diabetic foot complications was 3.3% (95% confidence interval [95% CI] 3.16%–3.44%), whilst the prevalences of foot ulcer, gangrene, and amputation were 2.05% (1.94%–2.16%), 0.19% (0.16%–0.22%), and 1.06% (0.98%–1.14%), respectively. The prevalence of foot complications increased with age and diabetes duration, predominantly amongst male patients. Diabetic foot complications were more numerous among type 2 patients, although their prevalence was higher among type 1 patients. Univariate analysis showed Charcot joints, peripheral vascular disease (PVD), neuropathy, diabetes duration ≥10 years, insulin use, retinopathy, nephropathy, age ≥45 years, cerebral vascular disease (CVD), poor glycemic control, coronary artery disease (CAD), male gender, smoking, and hypertension to be significant risk factors, with odds ratios (95% CIs) of 42.53 (18.16–99.62), 14.47 (8.99–23.31), 12.06 (10.54–13.80), 7.22 (6.10–8.55), 4.69 (4.28–5.14), 4.45 (4.05–4.89), 2.88 (2.43–3.40), 2.81 (2.31–3.43), 2.24 (1.98–2.45), 2.02 (1.84–2.22), 1.54 (1.29–1.83), and 1.51 (1.38–1.65), respectively.ConclusionsRisk factors for diabetic foot complications are highly prevalent and drive a high rate of these complications; primary and secondary prevention programs are warranted to minimize morbidity and mortality and the economic impact of these complications. 
Other measures, such as decompression of lower-extremity nerves, should also be considered for diabetic patients.

9.
BackgroundScrub typhus (ST) is a life-threatening infectious disease if appropriate treatment is unavailable. Large discrepancies in clinical severity among age groups have been reported for ST patients, and the underlying risk factors for severe disease are unclear.MethodsClinical and epidemiological data of ST patients were collected in 55 surveillance hospitals in Guangzhou City, China, from 2012 to 2018. Severe prognosis and related factors were determined and compared between pediatric and elderly patients.ResultsA total of 2,074 ST patients, including 209 pediatric patients and 1,865 elderly patients, were included, with comparable disease severity rates of 11.0% (95% CI 7.1%–16.1%) and 10.3% (95% CI 9.0%–11.8%), respectively. Clinical characteristics, including lymphadenopathy, skin rash, and enlarged tonsils, differed in frequency between pediatric and elderly patients. Peripheral edema and decreased hemoglobin were the most important predictors of severe illness in pediatric patients, with adjusted ORs of 38.99 (9.96–152.67, p<0.001) and 13.22 (1.54–113.50, p = 0.019), respectively, while dyspnea and increased total bilirubin were potential determinants of severe disease in elderly patients, with adjusted ORs of 11.69 (7.33–18.64, p<0.001) and 3.17 (1.97–5.11, p<0.001), respectively. Compared with pediatric patients, elderly patients were more likely to receive doxycycline (64.8% vs. 9.9%, p<0.001) and less likely to receive azithromycin (5.0% vs. 41.1%, p<0.001).ConclusionThe disease severity rate is comparable between pediatric and elderly ST patients, but different clinical features and laboratory indicators were associated with the development of severe complications in each group, which may aid diagnosis and assessment of disease progression in ST patients.

10.
BackgroundDengue is the world’s most common mosquito-borne virus but remains diagnostically challenging due to its nonspecific presentation. Access to laboratory confirmation is limited, and thus most reported figures are based on clinical diagnosis alone, the accuracy of which is uncertain. This systematic review assesses the diagnostic accuracy of the traditional (1997) and revised (2009) WHO clinical case definitions for dengue fever, the basis for most national guidelines.Methodology/Principal findingsPubMed, EMBASE, Scopus, OpenGrey, and the annual Dengue Bulletin were searched for studies assessing the diagnostic accuracy of the unmodified clinical criteria. Two reviewers (NR/SL) independently assessed eligibility, extracted data, and evaluated risk of bias using a modified QUADAS-2. Additional records were found by citation network analysis. A meta-analysis was done using a bivariate mixed-effects regression model. Studies that modified criteria were analysed separately. This systematic review protocol was registered on PROSPERO (CRD42020165998). We identified 11 and 12 datasets assessing the 1997 and 2009 definitions, respectively, and 6 using modified criteria. Sensitivity was 93% (95% CI: 77–98) and 93% (95% CI: 86–96) for the 1997 and 2009 definitions, respectively. Specificity was 29% (95% CI: 8–65) and 31% (95% CI: 18–48) for the 1997 and 2009 definitions, respectively. Diagnostic performance suffered at the extremes of age. No modification significantly improved accuracy.Conclusions/SignificanceDiagnostic accuracy of clinical criteria is poor, with significant implications for surveillance and public health responses for dengue control. As the basis for most reported figures, this has relevance to policymakers planning resource allocation and researchers modelling transmission, particularly during COVID-19.
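The practical cost of high sensitivity with ~30% specificity is visible when the summary estimates are converted to a positive predictive value at an assumed pre-test prevalence (the prevalence figure below is an assumption for illustration, not from the review):

```python
# PPV of a clinical case definition from sensitivity, specificity,
# and the prevalence of true dengue among patients assessed.
def ppv(sens, spec, prev):
    tp = sens * prev                    # true positives per person tested
    fp = (1 - spec) * (1 - prev)        # false positives per person tested
    return tp / (tp + fp)

# Review's summary estimates for the 2009 definition (sens 93%, spec 31%)
# at an assumed 20% dengue prevalence among those assessed:
print(round(ppv(0.93, 0.31, 0.20), 3))  # 0.252
```

That is, roughly three of every four clinically diagnosed cases would be false positives under these assumptions, which is the surveillance concern the conclusions raise.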

11.
The Mpumalanga Men's Study (MPMS) is the assessment of the Project Boithato HIV prevention intervention for South African MSM. Boithato aims to increase consistent condom use, regular testing for HIV-negative MSM, and linkage to care for HIV-positive MSM. The MPMS baseline examined HIV prevalence and associated risk behaviors, and testing, care, and treatment behaviors among MSM in Gert Sibande and Ehlanzeni districts in Mpumalanga province, South Africa, in order to effectively target intervention activities. We recruited 307 MSM in Gert Sibande and 298 in Ehlanzeni through respondent-driven sampling (RDS) between September 2012 and March 2013. RDS-adjusted HIV prevalence estimates are 28.3% (95% CI 21.1%–35.3%) in Gert Sibande and 13.7% (95% CI 9.1%–19.6%) in Ehlanzeni. Prevalence is significantly higher among MSM over age 25 [57.8% (95% CI 43.1%–72.9%) vs. 17.9% (95% CI 10.6%–23.9%), P<0.001 in Gert Sibande; 34.5% (95% CI 20.5%–56.0%) vs. 9.1% (95% CI 4.6%–13.9%), P<0.001 in Ehlanzeni]. In Gert Sibande, prevalence is higher among self-identified gay and transgender MSM vs. other MSM [39.3% (95% CI 28.3%–47.9%), P<0.01], inconsistent condom users [38.1% (18.1%–64.2%), P<0.05], those with a current regular male partner [35.0% (27.1%–46.4%), P<0.05], and those with lifetime experience of intimate partner violence with men [40.4% (95% CI 28.9%–50.9%), P<0.05]. Prevalence of previous HIV testing was 65.8% (95% CI 58.8%–74.0%) in Gert Sibande and 69.3% (95% CI 61.9%–76.8%) in Ehlanzeni. Regular HIV testing was uncommon [34.6% (95% CI 27.9%–41.4%) in Gert Sibande; 31.0% (95% CI 24.9%–37.8%) in Ehlanzeni]. Among HIV-positive participants, few knew their status (28.1% in Gert Sibande and 14.5% in Ehlanzeni), were appropriately linked to care (18.2% and 11.3%, respectively), or were taking antiretroviral therapy (13.6% and 9.6%, respectively).
MPMS results demonstrate the importance of implementing interventions for MSM to increase consistent condom use, regular HIV testing, and linkage and engagement in care for HIV-infected MSM.

12.

Background

Unnecessary diagnostic imaging leads to higher costs, longer emergency department stays, and increased patient exposure to ionizing radiation. We sought to prospectively derive and validate two decision instruments (DIs) for selective chest computed tomography (CT) in adult blunt trauma patients.

Methods and Findings

From September 2011 to May 2014, we prospectively enrolled blunt trauma patients over 14 years of age presenting to eight urban US level 1 trauma centers in this observational study. During the derivation phase, physicians recorded the presence or absence of 14 clinical criteria before viewing chest imaging results. We determined injury outcomes by CT radiology readings and categorized injuries as major or minor according to an expert-panel-derived clinical classification scheme. We then employed recursive partitioning to derive two DIs: Chest CT-All maximized sensitivity for all injuries, and Chest CT-Major maximized sensitivity for only major thoracic injuries (while increasing specificity). In the validation phase, we employed similar methodology to prospectively test the performance of both DIs.We enrolled 11,477 patients—6,002 patients in the derivation phase and 5,475 patients in the validation phase. The derived Chest CT-All DI consisted of (1) abnormal chest X-ray, (2) rapid deceleration mechanism, (3) distracting injury, (4) chest wall tenderness, (5) sternal tenderness, (6) thoracic spine tenderness, and (7) scapular tenderness. The Chest CT-Major DI had the same criteria without rapid deceleration mechanism. In the validation phase, Chest CT-All had a sensitivity of 99.2% (95% CI 95.4%–100%), a specificity of 20.8% (95% CI 19.2%–22.4%), and a negative predictive value (NPV) of 99.8% (95% CI 98.9%–100%) for major injury, and a sensitivity of 95.4% (95% CI 93.6%–96.9%), a specificity of 25.5% (95% CI 23.5%–27.5%), and an NPV of 93.9% (95% CI 91.5%–95.8%) for either major or minor injury. Chest CT-Major had a sensitivity of 99.2% (95% CI 95.4%–100%), a specificity of 31.7% (95% CI 29.9%–33.5%), and an NPV of 99.9% (95% CI 99.3%–100%) for major injury and a sensitivity of 90.7% (95% CI 88.3%–92.8%), a specificity of 37.9% (95% CI 35.8%–40.1%), and an NPV of 91.8% (95% CI 89.7%–93.6%) for either major or minor injury. 
Regarding the limitations of our work, some clinicians may disagree with our injury classification and sensitivity thresholds for injury detection.

Conclusions

We prospectively derived and validated two DIs (Chest CT-All and Chest CT-Major) that identify blunt trauma patients with clinically significant thoracic injuries with high sensitivity, allowing for a safe reduction of approximately 25%–37% of unnecessary chest CTs. Trauma evaluation protocols that incorporate these DIs may decrease unnecessary costs and radiation exposure in the disproportionately young trauma population.
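Instruments of this kind are applied as a simple rule: image if any listed criterion is present, defer if all are absent. A sketch of Chest CT-All under that assumed any-positive rule (the criterion names are paraphrased from the abstract; the derivation itself used recursive partitioning, not hand-written rules):

```python
# Rule-based application of the Chest CT-All decision instrument:
# a patient is flagged for CT if ANY of the seven criteria is present.
CHEST_CT_ALL = [
    "abnormal_cxr", "rapid_deceleration", "distracting_injury",
    "chest_wall_tenderness", "sternal_tenderness",
    "thoracic_spine_tenderness", "scapular_tenderness",
]

def needs_chest_ct(findings, criteria=CHEST_CT_ALL):
    """findings: dict of criterion -> bool, recorded before imaging."""
    return any(findings.get(c, False) for c in criteria)

print(needs_chest_ct({"chest_wall_tenderness": True}))  # True -> image
print(needs_chest_ct({}))  # False -> CT can potentially be deferred
```

The reported NPV (99.8% for major injury) is the key safety figure for the all-criteria-negative branch that this function returns `False` for.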

13.
BackgroundA number of prior studies have demonstrated that research participants with limited English proficiency in the United States are routinely excluded from clinical trial participation. Systematic exclusion through study eligibility criteria that require trial participants to be able to speak, read, and/or understand English affects access to clinical trials and scientific generalizability. We sought to establish the frequency with which English language proficiency is required and, conversely, when non-English languages are affirmatively accommodated in US interventional clinical trials for adult populations.Methods and findingsWe used the advanced search function on ClinicalTrials.gov specifying interventional studies for adults with at least 1 site in the US. In addition, we used these search criteria to find studies with an available posted protocol. A computer program was written to search for evidence of English or Spanish language requirements, or the posted protocol, when available, was manually read for these language requirements. Of the 14,367 clinical trials registered on ClinicalTrials.gov between 1 January 2019 and 1 December 2020 that met baseline search criteria, 18.98% (95% CI 18.34%–19.62%; n = 2,727) required the ability to read, speak, and/or understand English, and 2.71% (95% CI 2.45%–2.98%; n = 390) specifically mentioned accommodation of translation to another language. The remaining trials in this analysis and the following sub-analyses did not mention English language requirements or accommodation of languages other than English. Of 2,585 federally funded clinical trials, 28.86% (95% CI 27.11%–30.61%; n = 746) required English language proficiency and 4.68% (95% CI 3.87%–5.50%; n = 121) specified accommodation of other languages; of the 5,286 industry-funded trials, 5.30% (95% CI 4.69%–5.90%; n = 280) required English and 0.49% (95% CI 0.30%–0.69%; n = 26) accommodated other languages. 
Trials related to infectious disease were less likely to specify an English requirement than all registered trials (10.07% versus 18.98%; relative risk [RR] = 0.53; 95% CI 0.44–0.64; p < 0.001). Trials related to COVID-19 were also less likely to specify an English requirement than all registered trials (8.18% versus 18.98%; RR = 0.43; 95% CI 0.33–0.56; p < 0.001). Trials with a posted protocol (n = 366) were more likely than all registered clinical trials to specify an English requirement (36.89% versus 18.98%; RR = 1.94, 95% CI 1.69–2.23; p < 0.001). A separate analysis of studies with posted protocols in 4 therapeutic areas (depression, diabetes, breast cancer, and prostate cancer) demonstrated that clinical trials related to depression were the most likely to require English (52.24%; 95% CI 40.28%–64.20%). One limitation of this study is that the computer program only searched for the terms “English” and “Spanish” and may have missed evidence of other language accommodations. Another limitation is that we did not differentiate between requirements to read English, speak English, understand English, and be a native English speaker; we grouped these requirements together in the category of English language requirements.ConclusionsA meaningful percentage of US interventional clinical trials for adults exclude individuals who cannot read, speak, and/or understand English, or are not native English speakers. To advance more inclusive and generalizable research, funders, sponsors, institutions, investigators, institutional review boards, and others should prioritize translating study materials and eliminate language requirements unless justified either scientifically or ethically.

Akila Muthukumar and coauthors systematically analyze ClinicalTrials.gov to evaluate the frequency of English language requirements in clinical trial eligibility criteria.

14.
Background

Hypertension is the most important cardiovascular risk factor in India, and representative studies of middle-aged and older Indian adults have been lacking. Our objectives were to estimate the proportions of hypertensive adults who had been diagnosed, took antihypertensive medication, and achieved control in the middle-aged and older Indian population and to investigate the association between access to healthcare and hypertension management.

Methods and Findings

We designed a nationally representative cohort study of the middle-aged and older Indian population, the Longitudinal Aging Study in India (LASI), and analyzed data from the 2017–2019 baseline wave (N = 72,262) and the 2010 pilot wave (N = 1,683). Hypertension was defined as self-reported physician diagnosis or elevated blood pressure (BP) on measurement, defined as systolic BP ≥ 140 mm Hg or diastolic BP ≥ 90 mm Hg. Among hypertensive individuals, awareness, treatment, and control were defined based on self-reports of having been diagnosed, taking antihypertensive medication, and not having elevated BP, respectively. The estimated prevalence of hypertension for the Indian population aged 45 years and older was 45.9% (95% CI 45.4%–46.5%). Among hypertensive individuals, 55.7% (95% CI 54.9%–56.5%) had been diagnosed, 38.9% (95% CI 38.1%–39.6%) took antihypertensive medication, and 31.7% (95% CI 31.0%–32.4%) achieved BP control. In multivariable logistic regression models, access to public healthcare was a key predictor of hypertension treatment (odds ratio [OR] = 1.35, 95% CI 1.14–1.60, p = 0.001), especially in the most economically disadvantaged group (OR of the interaction for middle economic status = 0.76, 95% CI 0.61–0.94, p = 0.013; OR of the interaction for high economic status = 0.84, 95% CI 0.68–1.05, p = 0.124). Having health insurance was not associated with improved hypertension awareness among those with low economic status (OR = 0.96, 95% CI 0.86–1.07, p = 0.437) or middle economic status (OR of the interaction = 1.15, 95% CI 1.00–1.33, p = 0.051), but it was among those with high economic status (OR of the interaction = 1.28, 95% CI 1.10–1.48, p = 0.001). Comparing hypertension awareness, treatment, and control rates in the 4 pilot states, we found statistically significant (p < 0.001) improvement in hypertension management from 2010 to 2017–2019. The limitations of this study include the relatively small pilot sample and its recruitment from only 4 states.

Conclusions

Although considerable variations in hypertension diagnosis, treatment, and control exist across different sociodemographic groups and geographic areas, reducing uncontrolled hypertension remains a public health priority in India. Access to healthcare is closely tied to both hypertension diagnosis and treatment.

Jinkook Lee and colleagues investigate hypertension management and its association with healthcare access in middle-aged and older adults in India.
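The hypertension and cascade definitions in this abstract are simple rules, so they can be summarised in a short sketch. All variable and function names below are illustrative, not taken from the study's actual code:

```python
# Hedged sketch of the LASI-style hypertension cascade definitions.
# A person is hypertensive if they self-report a physician diagnosis
# OR have measured systolic BP >= 140 mm Hg or diastolic BP >= 90 mm Hg.

def is_hypertensive(diagnosed: bool, sbp: float, dbp: float) -> bool:
    """Self-reported diagnosis OR elevated blood pressure on measurement."""
    return diagnosed or sbp >= 140 or dbp >= 90

def cascade(diagnosed: bool, on_medication: bool, sbp: float, dbp: float) -> dict:
    """Awareness/treatment/control indicators, defined among hypertensive individuals only."""
    if not is_hypertensive(diagnosed, sbp, dbp):
        return {}  # cascade indicators are undefined for non-hypertensive people
    return {
        "aware": diagnosed,                      # has been diagnosed
        "treated": on_medication,                # takes antihypertensive medication
        "controlled": sbp < 140 and dbp < 90,    # no elevated BP on measurement
    }

# Example: diagnosed, medicated, measured BP 136/84 mm Hg
print(cascade(True, True, 136, 84))
```

Defining "controlled" as the absence of elevated measured BP (rather than relative to a treatment target) mirrors the abstract's wording.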

15.

Background

Regular breakfast consumption may protect against type 2 diabetes risk in adults, but little is known about its influence on type 2 diabetes risk markers in children. We investigated the associations between breakfast consumption (frequency and content) and risk markers for type 2 diabetes (particularly insulin resistance and glycaemia) and cardiovascular disease in children.

Methods and Findings

We conducted a cross-sectional study of 4,116 UK primary school children aged 9–10 years. Participants provided information on breakfast frequency, had measurements of body composition, and gave fasting blood samples for measurements of blood lipids, insulin, glucose, and glycated haemoglobin (HbA1c). A subgroup of 2,004 children also completed a 24-hour dietary recall. Among 4,116 children studied, 3,056 (74%) ate breakfast daily, 450 (11%) most days, 372 (9%) some days, and 238 (6%) not usually. Graded associations between breakfast frequency and risk markers were observed; children who reported not usually having breakfast had higher fasting insulin (percent difference 26.4%, 95% CI 16.6%–37.0%), insulin resistance (percent difference 26.7%, 95% CI 17.0%–37.2%), HbA1c (percent difference 1.2%, 95% CI 0.4%–2.0%), glucose (percent difference 1.0%, 95% CI 0.0%–2.0%), and urate (percent difference 6%, 95% CI 3%–10%) than those who reported having breakfast daily; these differences were little affected by adjustment for adiposity, socioeconomic status, and physical activity levels. When the higher levels of triglyceride, systolic blood pressure, and C-reactive protein for those who usually did not eat breakfast relative to those who ate breakfast daily were adjusted for adiposity, the differences were no longer significant. Children eating a high fibre cereal breakfast had lower insulin resistance than those eating other breakfast types (p for heterogeneity <0.01). Differences in nutrient intakes between breakfast frequency groups did not account for the differences in type 2 diabetes markers.

Conclusions

Children who ate breakfast daily, particularly a high fibre cereal breakfast, had a more favourable type 2 diabetes risk profile. Trials are needed to quantify the protective effect of breakfast on emerging type 2 diabetes risk.

16.
Background

Tuberculosis (TB) rates among Tibetan refugee children and adolescents attending boarding schools in India are extremely high. We undertook a comprehensive case finding and TB preventive treatment (TPT) program in 7 schools in the Zero TB Kids project. We aimed to measure the TB infection and disease burden and investigate the risk of TB disease in children and adults who did and did not receive TPT in the schools.

Methods and Findings

A mobile team annually screened children and staff for TB at the 7 boarding schools in Himachal Pradesh, India, using symptom criteria, radiography, molecular diagnostics, and tuberculin skin tests. TB infection (TBI) was treated with short-course regimens of isoniazid plus rifampin, or rifampin alone. TB disease was treated according to Tibetan and Indian guidelines. Between April 2017 and December 2019, 6,582 schoolchildren (median age 14 [IQR 11–16] years) and 807 staff (median age 40 [IQR 33–48] years) were enrolled. Fifty-one percent of the students and 58% of the staff were female. Over 13,161 person-years of follow-up in schoolchildren (median follow-up 2.3 years) and 1,800 person-years of follow-up in staff (median follow-up 2.5 years), 69 TB episodes occurred in schoolchildren and 4 TB episodes occurred in staff, yielding annual incidence rates of 524/100,000 (95% CI 414–663/100,000) person-years and 256/100,000 (95% CI 96–683/100,000) person-years, respectively. Of 1,412 schoolchildren diagnosed with TBI, 1,192 received TPT. Schoolchildren who received TPT had a 79% lower risk of TB disease (adjusted hazard ratio [aHR] 0.21; 95% CI 0.07–0.69; p = 0.010) compared to non-recipients, the primary study outcome. Protection was greater in recent contacts (aHR 0.07; 95% CI 0.01–0.42; p = 0.004), the secondary study outcome. The prevalence of recent contacts was 28% (1,843/6,582). Two different TPT regimens were used (3HR and 4R), and both were apparently effective. No staff receiving TPT developed TB. Overall, between 2017 and 2019, TB disease incidence decreased by 87%, from 837/100,000 (95% CI 604–1,129/100,000) person-years to 110/100,000 (95% CI 36–255/100,000) person-years (p < 0.001), and TBI prevalence decreased by 42%, from 19% (95% CI 18%–20%) to 11% (95% CI 10%–12%) (p < 0.001). A limitation of our study is that TB incidence could be influenced by secular trends during the study period.

Conclusions

In this study, following implementation of a school-wide TB screening and preventive treatment program, we observed a significant reduction in the burden of TB disease and TBI in children and adolescents. The benefit of TPT was particularly marked for recent TB contacts. This initiative may serve as a model for TB detection and prevention in children and adolescents in other communities affected by TB.

Kunchok Dorjee and colleagues investigate infection and disease burden following mass tuberculosis preventive treatment for Tibetan refugee children at schools in India.
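The incidence rates and the headline risk reduction in this abstract follow directly from the reported counts: 69 episodes over 13,161 person-years in schoolchildren, and an adjusted hazard ratio of 0.21 for TPT recipients. A minimal sketch of the arithmetic (function name is illustrative):

```python
# Crude incidence rate from event counts and person-years of follow-up,
# expressed per 100,000 person-years as in the abstract.

def incidence_per_100k(events: int, person_years: float) -> float:
    return events / person_years * 100_000

# Schoolchildren: 69 TB episodes over 13,161 person-years
rate = incidence_per_100k(69, 13_161)
print(round(rate))  # 524, matching the reported 524/100,000 person-years

# The "79% lower risk" is 1 minus the adjusted hazard ratio, as a percentage
protection_pct = (1 - 0.21) * 100
print(round(protection_pct))  # 79
```

Note that the 95% confidence intervals quoted in the abstract come from the study's models and cannot be reproduced from the point counts alone.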

17.
Background

Acquisition of a disability in adulthood has been associated with a reduction in mental health. We tested the hypothesis that low wealth prior to disability acquisition is associated with a greater deterioration in mental health than high wealth.

Methods

We assessed whether the level of wealth prior to disability acquisition modifies this association using 12 waves of data (2001–2012) from the Household, Income and Labour Dynamics in Australia survey, a population-based cohort study of working-age Australians. Eligible participants reported at least two consecutive waves of disability preceded by at least two consecutive waves without disability (1,977 participants, 13,518 observations). Fixed-effects linear regression was conducted with a product term between wealth prior to disability (in tertiles) and disability acquisition, with the mental health component score of the SF-36 as the outcome.

Results

In models adjusted for time-varying confounders, there was evidence of negative effect measure modification by prior wealth of the association between disability acquisition and mental health (interaction term for the lowest wealth tertile: -2.2 points, 95% CI -3.1 to -1.2, p < 0.001); low wealth was associated with a greater decline in mental health following disability acquisition (-3.3 points, 95% CI -4.0 to -2.5) than high wealth (-1.1 points, 95% CI -1.7 to -0.5).

Conclusion

The findings suggest that low wealth prior to disability acquisition in adulthood results in a greater deterioration in mental health than among those with high wealth.
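The interaction term and the group-specific declines reported above are arithmetically linked: in a model with a wealth-by-disability product term, the effect in the lowest wealth tertile equals the effect in the high-wealth reference group plus the interaction coefficient. A quick check against the abstract's numbers:

```python
# Effect measure modification: the group-specific effect is the reference
# effect plus the interaction coefficient for that group.

effect_high_wealth = -1.1        # mental health change on disability acquisition (reference)
interaction_low_vs_high = -2.2   # interaction term for the lowest wealth tertile
effect_low_wealth = effect_high_wealth + interaction_low_vs_high
print(round(effect_low_wealth, 1))  # -3.3, the reported low-wealth decline
```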

18.
Background

Two weeks' isolation is widely recommended for people commencing treatment for pulmonary tuberculosis (TB). The evidence that this interval corresponds to clearance of potentially infectious tuberculous mycobacteria from sputum is not well established. This World Health Organization–commissioned review investigated sputum sterilisation dynamics during TB treatment.

Methods and Findings

For the main analysis, 2 systematic literature searches of OvidSP MEDLINE, Embase, and Global Health, and EBSCO CINAHL Plus were conducted to identify studies with data on TB infectiousness (all studies to search date, 1 December 2017) and all randomised controlled trials (RCTs) for drug-susceptible TB (from 1 January 1990 to search date, 20 February 2018). Included articles reported on patients receiving effective treatment for culture-confirmed drug-susceptible pulmonary TB. The outcome of interest was sputum bacteriological conversion: the proportion of patients having converted by a defined time point, or a summary measure of time to conversion, assessed by smear or culture. Any study design with 10 or more participants was considered. Record sifting and data extraction were performed in duplicate. Random-effects meta-analyses were performed. A narrative summary additionally describes the results of a systematic search for data evaluating infectiousness from humans to experimental animals (PubMed, all studies to 27 March 2018). Other evidence on duration of infectiousness, including studies reporting on cough dynamics, human tuberculin skin test conversion, or early bactericidal activity of TB treatments, was outside the scope of this review. The literature search was repeated on 22 November 2020, at the request of the editors, to identify studies published after the previous censor date. Four small studies reporting 3 different outcome measures were identified; they included no data that would alter the findings of the review and are not included in the meta-analyses.

Of 5,290 identified records, 44 were included. Twenty-seven (61%) were RCTs and 17 (39%) were cohort studies. Thirteen studies (30%) reported data from Africa, 12 (27%) from Asia, 6 (14%) from South America, 5 (11%) from North America, and 4 (9%) from Europe. Four studies reported data from multiple continents. Summary estimates suggested smear conversion in 9% of patients at 2 weeks (95% CI 3%–24%, N = 1) and in 82% of patients at 2 months of treatment (95% CI 78%–86%, N = 10). Among baseline smear-positive patients, solid culture conversion occurred by 2 weeks in 5% (95% CI 0%–14%, N = 2), increasing to 88% at 2 months (95% CI 84%–92%, N = 20). At equivalent time points, liquid culture conversion was achieved in 3% (95% CI 1%–16%, N = 1) and 59% (95% CI 47%–70%, N = 8). Significant heterogeneity was observed. Further interrogation of the data to explain this heterogeneity was limited by the lack of disaggregation of results by factors such as HIV status, baseline smear status, and the presence or absence of lung cavitation.

Conclusions

This systematic review found that most patients remained culture positive at 2 weeks of TB treatment, challenging the view that individuals are not infectious after this interval. Culture positivity is, however, only 1 component of infectiousness, with reduced cough frequency and aerosol generation after TB treatment initiation likely to also be important. Studies that integrate our findings with data on cough dynamics could provide a more complete perspective on potential transmission of Mycobacterium tuberculosis by individuals on treatment.

Trial Registration

Systematic review registration: PROSPERO 85226.

19.

Objectives

To characterize hepatitis C virus (HCV) epidemiology and assess country-specific population-level HCV prevalence in four countries in the Middle East and North Africa (MENA) region: Djibouti, Somalia, Sudan, and Yemen.

Methods

Reports of HCV prevalence were systematically reviewed as per PRISMA guidelines. Pooled HCV prevalence estimates for different risk populations were calculated when at least five measures were available per risk category.

Results

We identified 101 prevalence estimates. Pooled HCV antibody prevalence in the general population in Somalia, Sudan, and Yemen was 0.9% (95% confidence interval [95% CI]: 0.3%–1.9%), 1.0% (95% CI: 0.3%–1.9%), and 1.9% (95% CI: 1.4%–2.6%), respectively. The only general population study from Djibouti reported a prevalence of 0.3% (CI: 0.2%–0.4%) in blood donors. In high-risk populations (e.g., haemodialysis and haemophilia patients), pooled HCV prevalence was 17.3% (95% CI: 8.6%–28.2%) in Sudan. In Yemen, three studies of haemodialysis patients reported HCV prevalence between 40.0% and 62.7%. In intermediate-risk populations (e.g., healthcare workers, inpatients, and men who have sex with men), pooled HCV prevalence was 1.7% (95% CI: 0.0%–4.9%) in Somalia and 0.6% (95% CI: 0.4%–0.8%) in Sudan.

Conclusion

National HCV prevalence in Yemen appears to be higher than in Djibouti, Somalia, and Sudan, as well as in most other MENA countries; otherwise, prevalence levels in this subregion are comparable to global levels. The high HCV prevalence in patients who have undergone clinical care appears to reflect ongoing transmission in clinical settings. HCV prevalence in people who inject drugs remains unknown.

20.

Purpose

The role of the spot sign on computed tomography angiography (CTA) in predicting hematoma expansion (HE) after primary intracerebral hemorrhage (ICH) has been the focus of many studies. Our study sought to evaluate the predictive accuracy of the spot sign for HE using a meta-analytic approach.

Materials and Methods

The PubMed, Embase, and Cochrane Library databases were searched for eligible studies. Studies were included if they reported data on HE in primary ICH patients, assessed by the spot sign on first-pass CTA. Studies with additional data from second-pass CTA, post-contrast CT (PCCT), and CT perfusion (CTP) were also included.

Results

Eighteen studies were pooled into the meta-analysis, including 14 studies of first-pass CTA and 7 studies of combined CT modalities. In evaluating the accuracy of the spot sign for predicting HE, studies of first-pass CTA showed a sensitivity of 53% (95% CI, 49%–57%) and a specificity of 88% (95% CI, 86%–89%). The pooled positive likelihood ratio (PLR) was 4.70 (95% CI, 3.28–6.74) and the negative likelihood ratio (NLR) was 0.44 (95% CI, 0.34–0.58). For studies of combined CT modalities, the sensitivity was 73% (95% CI, 67%–79%) with a specificity of 88% (95% CI, 86%–90%). The aggregated PLR was 6.76 (95% CI, 3.70–12.34) and the overall NLR was 0.17 (95% CI, 0.06–0.48).

Conclusions

The spot sign appeared to be a reliable imaging biomarker for HE. The additional detection of a delayed spot sign helped improve the predictive accuracy of the early spot sign. Awareness of our results may influence primary ICH care by providing supportive evidence for the use of combined CT modalities in detecting spot signs.
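The likelihood ratios in this abstract relate to sensitivity and specificity through the standard definitions PLR = sensitivity / (1 − specificity) and NLR = (1 − sensitivity) / specificity. Note that pooled PLR/NLR in a meta-analysis are estimated jointly across studies, so they need not equal the ratios computed from the pooled summary sensitivity and specificity, as the sketch below illustrates:

```python
# Standard likelihood-ratio definitions from sensitivity and specificity.
# Pooled meta-analytic PLR/NLR are estimated across studies, so they can
# differ slightly from ratios of the pooled summary estimates shown here.

def likelihood_ratios(sensitivity, specificity):
    plr = sensitivity / (1 - specificity)   # positive likelihood ratio
    nlr = (1 - sensitivity) / specificity   # negative likelihood ratio
    return plr, nlr

# First-pass CTA summary estimates from the abstract: sens 53%, spec 88%
plr, nlr = likelihood_ratios(0.53, 0.88)
print(round(plr, 2), round(nlr, 2))  # 4.42 0.53 (vs pooled 4.70 and 0.44)
```

A PLR near 4.7 means a positive spot sign raises the odds of hematoma expansion roughly fivefold, which is why the abstract describes it as a useful imaging biomarker despite the modest sensitivity.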

