Similar Articles
20 similar articles were found.
1.
Background
The Bangladeshi national treatment guidelines for uncomplicated malaria follow WHO recommendations but without G6PD testing prior to primaquine administration. A prospective observational study was conducted to assess the efficacy of the current antimalarial policy.

Methods
Patients with uncomplicated malaria, confirmed by microscopy, attending a health care facility in the Chittagong Hill Tracts, Bangladesh, were treated with artemether-lumefantrine (days 0–2) plus single-dose primaquine (0.75 mg/kg on day 2) for P. falciparum infections, or with chloroquine (days 0–2) plus primaquine (3.5 mg/kg total over 14 days) for P. vivax infections. Hb was measured on days 0, 2 and 9 in all patients, and also on days 16 and 30 in patients with P. vivax infection. Participants were followed for 30 days. The study was registered at ClinicalTrials.gov (NCT02389374).

Results
Between September 2014 and February 2015, a total of 181 patients were enrolled (64% P. falciparum, 30% P. vivax and 6% mixed infections). Median parasite clearance times were 22.0 hours (interquartile range, IQR: 15.2–27.3) for P. falciparum, 20.0 hours (IQR: 9.5–22.7) for P. vivax and 16.6 hours (IQR: 10.0–46.0) for mixed infections. All participants were afebrile within 48 hours, although two patients with P. falciparum infection remained parasitaemic at 48 hours. No patient had recurrent parasitaemia within 30 days. Adjusted male median G6PD activity was 7.82 U/g Hb. One male participant (1/174) had severe G6PD deficiency (<10% activity); five participants (5/174) had mild G6PD deficiency (10–60% activity). The Hb nadir occurred on day 2, prior to primaquine treatment, in both P. falciparum- and P. vivax-infected patients; the mean fractional fall in Hb was -8.8% (95% CI -6.7% to -11.0%) and -7.4% (95% CI -4.5% to -10.4%), respectively.

Conclusion
The current antimalarial policy remains effective. The prevalence of G6PD deficiency was low. The main contribution to haemolysis in G6PD-normal individuals was attributable to acute malaria rather than to primaquine administration.

Trial Registration

ClinicalTrials.gov NCT02389374
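The fractional fall in Hb reported above is simple arithmetic on paired haemoglobin measurements. A minimal sketch follows; the function name and the patient values are hypothetical, chosen only to land near the reported -8.8% mean:

```python
def fractional_fall_hb(hb_day0: float, hb_nadir: float) -> float:
    """Percent change in haemoglobin from day 0 to the nadir (negative = fall)."""
    return (hb_nadir - hb_day0) / hb_day0 * 100

# Hypothetical patient: 11.4 g/dL on day 0 falling to 10.4 g/dL on day 2
print(f"{fractional_fall_hb(11.4, 10.4):.1f}%")  # -8.8%
```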

2.
Background
Preventive chemotherapy is the cornerstone of soil-transmitted helminth (STH) control. Long-term outcomes and the adequate treatment frequency of the recently recommended albendazole-ivermectin combination have not been studied to date.

Methodology/Principal findings
Double-blind randomized controlled trials were conducted in Lao PDR, Pemba Island, Tanzania and Côte d’Ivoire between 2018 and 2020 to evaluate the efficacy and safety of ivermectin-albendazole versus albendazole-placebo in Trichuris trichiura-infected individuals aged 6 to 60 years. In the framework of this study, 466 and 413 participants in Lao PDR and 558 and 515 participants on Pemba Island were followed up six and 12 months post-treatment, respectively. From each participant at least one stool sample was processed for Kato-Katz diagnosis, and cure rates (CRs), egg reduction rates (ERRs) and apparent reinfection rates were calculated. Participants found helminth-positive at six months were re-treated according to their allocated treatment.

Long-term outcomes against T. trichiura, based on CRs and ERRs, were significantly higher for ivermectin-albendazole than for albendazole at six months in Lao PDR (CR: 65.8% vs 13.4%, difference 52.4, 95% CI 45.0–60.0; ERR: 99.0 vs 79.6, difference 19.4, 95% CI 14.4–24.4) and on Pemba Island (CR: 17.8% vs 1.4%, difference 16.4, 95% CI 11.6–21.0; ERR: 84.9 vs 21.2, difference 63.8, 95% CI 50.6–76.9), and also at 12 months in Lao PDR (CR: 74.0% vs 23.4%, difference 50.6, 95% CI 42.6–61.0; ERR: 99.6 vs 91.3, difference 8.3, 95% CI 5.7–10.8) and on Pemba Island (CR: 19.5% vs 3.4%, difference 16.1, 95% CI 10.7–21.5; ERR: 92.9 vs 53.6, difference 39.3, 95% CI 31.2–47.4).

Apparent reinfection rates with T. trichiura were considerably higher on Pemba Island (100.0%, 95% CI 29.2–100.0) than in Lao PDR (10.0%, 95% CI 0.2–44.5) at 12 months post-treatment for participants treated with albendazole alone.

Conclusions/Significance
The long-term outcomes against T. trichiura of ivermectin-albendazole are superior to those of albendazole in terms of CRs and ERRs and in reducing infection intensities. Our results will help guide decisions on how best to use ivermectin-albendazole in the context of large-scale preventive chemotherapy (PC) programs tailored to the local context to sustainably control STH infections.

Trial registration
ClinicalTrials.gov NCT03527732, date assigned: 17 May 2018.
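For orientation, the two headline metrics are conventionally derived from Kato-Katz counts as below; this sketch assumes the standard group-mean definition of the egg reduction rate, and the function names and sample numbers are illustrative, not taken from the trial:

```python
def cure_rate(positive_pre: int, positive_post: int) -> float:
    """Share of baseline egg-positive participants who are egg-negative at follow-up (%)."""
    return (positive_pre - positive_post) / positive_pre * 100

def egg_reduction_rate(mean_epg_pre: float, mean_epg_post: float) -> float:
    """Relative reduction in group mean eggs per gram of stool (%)."""
    return (1 - mean_epg_post / mean_epg_pre) * 100

# Hypothetical arm: 200 infected at baseline, 68 still positive at 6 months,
# group mean EPG falling from 850 to 8.5
print(cure_rate(200, 68))            # 66.0
print(egg_reduction_rate(850, 8.5))  # 99.0
```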

3.

Background

Glucose-6-phosphate dehydrogenase (G6PD) deficiency is the most common inherited human enzyme defect. This deficiency provides some protection from clinical malaria, but it can also cause haemolysis after administration of drugs with oxidant properties.

Methods

The safety of chlorproguanil-dapsone+artesunate (CD+A) and amodiaquine+sulphadoxine-pyrimethamine (AQ+SP) for the treatment of uncomplicated P. falciparum malaria was evaluated according to G6PD deficiency status in a secondary analysis of an open-label, randomized clinical trial [1]. In total, 702 children, treated with CD+A or AQ+SP and followed for 28 days after treatment, were genotyped for G6PD A- deficiency.

Findings

In the first 4 days following CD+A treatment, mean haematocrit declined on average by 1.94% per day (95% CI 1.54 to 2.33) in patients with G6PD deficiency and by 1.05% per day (95% CI 0.95 to 1.15) in G6PD-normal patients; a mean reduction of 1.3% per day was observed among patients who received AQ+SP regardless of G6PD status (95% CI 1.25 to 1.45). G6PD-deficient recipients of CD+A had significantly lower haematocrit than the other groups until day 7 (p = 0.04). In total, 10 patients had severe post-treatment haemolysis requiring blood transfusion. Patients with G6PD deficiency had a higher risk of severe anaemia following treatment with CD+A (RR = 10.2; 95% CI 1.8 to 59.3) or AQ+SP (RR = 5.6; 95% CI 1.0 to 32.7).

Conclusions

CD+A showed a poor safety profile in individuals with G6PD deficiency, most likely as a result of dapsone-induced haemolysis. Screening for G6PD deficiency before administration of potentially pro-oxidant drugs, such as dapsone-containing combinations, is necessary, although such screening is seldom available.

Trial Registration

ClinicalTrials.gov NCT00461578
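The relative risks quoted above come from standard 2×2 comparisons; a minimal sketch of the point estimate and the log-scale 95% CI follows. The counts are hypothetical, chosen only to illustrate the shape of such an estimate:

```python
from math import exp, log, sqrt

def relative_risk(a: int, n1: int, c: int, n0: int) -> tuple[float, float, float]:
    """RR of an event in the exposed (a/n1) vs unexposed (c/n0), with 95% CI."""
    rr = (a / n1) / (c / n0)
    se = sqrt(1/a - 1/n1 + 1/c - 1/n0)  # standard error of log(RR)
    return rr, exp(log(rr) - 1.96 * se), exp(log(rr) + 1.96 * se)

# Hypothetical: 4/40 severe anaemia among G6PD-deficient vs 6/600 among normal patients
print(relative_risk(4, 40, 6, 600))  # RR = 10.0 with a wide CI, as in small subgroups
```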

4.

Background

School-based mass treatment with praziquantel is the cornerstone of schistosomiasis control in school-aged children. However, uptake of treatment among school-age children in Uganda is low in some areas. The objective of this study was to examine the effectiveness of a pre-treatment snack on uptake of mass treatment.

Methods and Findings

In a cluster randomized trial carried out in Jinja district, Uganda, 12 primary schools were randomized into two groups: one received education messages for schistosomiasis prevention for two months prior to mass treatment, while the other, in addition to the education messages, received a pre-treatment snack shortly before mass treatment. Four weeks after mass treatment, uptake of praziquantel was assessed among a random sample of 595 children in the snack schools and 689 children in the non-snack schools as the primary outcome. The occurrence of side effects and the prevalence and mean intensity of Schistosoma mansoni infection were determined as secondary outcomes. Uptake of praziquantel was higher in the snack schools, 93.9% (95% CI 91.7%–95.7%), compared to the non-snack schools, 78.7% (95% CI 75.4%–81.7%) (p = 0.002). The occurrence of side effects was lower in the snack schools, 34.4% (95% CI 31.5%–39.8%), compared to the non-snack schools, 46.9% (95% CI 42.2%–50.7%) (p = 0.041). Prevalence and mean intensity of S. mansoni infection were lower in the snack schools, 1.3% (95% CI 0.6%–2.6%) and 38.3 eggs per gram of stool (epg) (95% CI 21.8–67.2), compared to the non-snack schools, 14.1% (95% CI 11.6%–16.9%) (p = 0.001) and 78.4 epg (95% CI 60.6–101.5) (p = 0.001), respectively.

Conclusions

Our results suggest that provision of a pre-treatment snack combined with education messages achieves higher uptake than education messages alone. The use of a pre-treatment snack was associated with reduced side effects as well as decreased prevalence and intensity of S. mansoni infection.

Trial registration

ClinicalTrials.gov NCT01869465

5.

Background

Many epidemiological studies have been conducted to explore the association between a single CYP2D6 gene polymorphism and Parkinson’s disease (PD) susceptibility. However, the results remain controversial.

Objectives

To clarify the effects of a single CYP2D6 gene polymorphism on the risk of PD, a meta-analysis of all available studies relating to CYP2D6*4 polymorphism and the risk of PD was conducted.

Methods

A comprehensive literature search of PubMed, EMBASE, and the China National Knowledge Infrastructure (CNKI) up to September 1, 2013 was conducted. Data were extracted by two independent authors and pooled odds ratio (OR) with 95% confidence interval (CI) were calculated. Meta-regression, Galbraith plots, subgroup analysis, sensitivity analysis, and publication bias analysis were also performed.

Results

Twenty-two separate comparisons consisting of 2,629 patients and 3,601 controls were included in our meta-analysis. The pooled analyses showed a significant association between CYP2D6*4G/A polymorphism and PD risk in all of the comparisons (A vs. G allele: OR = 1.28, 95% CI = 1.14–1.43, P = 0.001; AA vs. GG: OR = 1.43, 95% CI = 1.06–1.93, P = 0.018; AG vs. GG: OR = 1.22, 95% CI = 1.06–1.40, P = 0.006; AG+AA vs. GG: OR = 1.26, 95% CI = 1.10–1.44, P = 0.001; AA vs. AG+GG: OR = 1.37, 95% CI = 1.02–1.83, P = 0.036). In subgroup analysis stratified by ethnicity, significant associations were also demonstrated in Caucasians but not in Asians. No significant association was found in subgroup analysis stratified by age of onset or disease form.

Conclusions

We concluded that the CYP2D6*4G/A polymorphism denotes an increased genetic susceptibility to PD in the overall population, especially in Caucasians. Further large and well-designed studies are needed to confirm this association.
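Pooled ORs of this kind are produced by inverse-variance weighting of the per-study log odds ratios. The sketch below shows a fixed-effect version for brevity; the authors' actual model choice is not stated here, and the per-study numbers are invented for illustration:

```python
from math import exp, log, sqrt

def pool_fixed_effect(ors, cis):
    """Inverse-variance fixed-effect pooling of per-study odds ratios.

    ors: list of study ORs; cis: list of (lower, upper) 95% CI tuples.
    """
    logs = [log(o) for o in ors]
    # Back out the SE of log(OR) from each reported CI width
    ses = [(log(hi) - log(lo)) / (2 * 1.96) for lo, hi in cis]
    weights = [1 / se**2 for se in ses]
    pooled = sum(w * l for w, l in zip(weights, logs)) / sum(weights)
    se_pooled = sqrt(1 / sum(weights))
    return exp(pooled), exp(pooled - 1.96 * se_pooled), exp(pooled + 1.96 * se_pooled)

# Three hypothetical studies of an allele-level association
print(pool_fixed_effect([1.5, 1.2, 1.3], [(1.1, 2.0), (0.9, 1.6), (1.0, 1.7)]))
```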

6.
Triple-negative breast cancer (TNBC) is a highly heterogeneous disease, and molecular subtyping may result in improved diagnostic precision and targeted therapies. Our previous study classified TNBCs into four subtypes with putative therapeutic targets. Here, we conducted the FUTURE trial (ClinicalTrials.gov identifier: NCT03805399), a phase Ib/II subtyping-based and genomic biomarker-guided umbrella trial, to evaluate the efficacy of these targets. Patients with refractory metastatic TNBC were enrolled, stratified by TNBC subtype and genomic biomarkers, and assigned to one of seven arms: (A) pyrotinib with capecitabine; (B) androgen receptor inhibitor with CDK4/6 inhibitor; (C) anti-PD-1 with nab-paclitaxel; (D) PARP inhibitor included; (E) and (F) anti-VEGFR included; or (G) mTOR inhibitor with nab-paclitaxel. The primary end point was the objective response rate (ORR). We enrolled 69 refractory metastatic TNBC patients with a median of three previous lines of therapy (range, 1–8). Objective response was achieved in 20 (29.0%, 95% confidence interval (CI): 18.7%–41.2%) of the 69 intention-to-treat (ITT) patients. Our results showed that immunotherapy (arm C), in particular, achieved the highest ORR (52.6%, 95% CI: 28.9%–75.6%) in the ITT population. Arm E demonstrated a favorable ORR (26.1%, 95% CI: 10.2%–48.4% in the ITT population) but with more high-grade (≥3) adverse events. Somatic mutations of TOP2A and the CD8 immunohistochemical score may have the potential to predict immunotherapy response in the immunomodulatory subtype of TNBC. In conclusion, the phase Ib/II FUTURE trial suggested a new concept for TNBC treatment, demonstrating the clinical benefit of subtyping-based targeted therapy for refractory metastatic TNBC.

Subject terms: Breast cancer, Targeted therapies
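An ORR with an exact CI of this shape can be checked from the reported counts with the Clopper-Pearson (beta-quantile) method; the trial's actual CI method is not stated, so treat this sketch as an assumption:

```python
from scipy.stats import beta

def clopper_pearson(k: int, n: int, alpha: float = 0.05) -> tuple[float, float]:
    """Exact (Clopper-Pearson) two-sided CI for a binomial proportion k/n."""
    lo = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lo, hi

# 20 objective responses among 69 ITT patients
lo, hi = clopper_pearson(20, 69)
print(f"ORR = {20/69:.1%}, 95% CI {lo:.1%}-{hi:.1%}")  # ~29.0%, roughly 18.7%-41.2%
```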

7.

Background:

Small studies have yielded divergent results for administration of granulocyte colony-stimulating factor (G-CSF) after acute myocardial infarction. Adequately powered studies involving patients with at least moderate left ventricular dysfunction are lacking.

Methods:

Patients with left ventricular ejection fraction less than 45% after anterior-wall myocardial infarction were treated with G-CSF (10 μg/kg daily for 4 days) or placebo. After initial randomization of 86 patients, 41 in the placebo group and 39 in the G-CSF group completed 6-month follow-up and underwent measurement of left ventricular ejection fraction by radionuclide angiography.

Results:

Baseline and 6-week mean ejection fractions were similar for the G-CSF and placebo groups: 34.8% (95% confidence interval [CI] 32.6%–37.0%) v. 36.4% (95% CI 33.5%–39.2%) at baseline and 39.8% (95% CI 36.2%–43.4%) v. 43.1% (95% CI 39.2%–47.0%) at 6 weeks. However, G-CSF therapy was associated with a lower ejection fraction at 6 months relative to placebo (40.8% [95% CI 37.4%–44.2%] v. 46.0% [95% CI 42.7%–44.3%]). Both groups had improved left ventricular function, but the change in left ventricular ejection fraction was smaller in patients treated with G-CSF than in those who received placebo (5.7 [95% CI 3.4–8.1] percentage points v. 9.2 [95% CI 6.3–12.1] percentage points). One or more events from a composite of major adverse cardiac events occurred in 8 patients (19%) in each group, with similar rates of target-vessel revascularization.

Interpretation:

In patients with moderate left ventricular dysfunction following anterior-wall infarction, G-CSF therapy was associated with a lower 6-month left ventricular ejection fraction but no increased risk of major adverse cardiac events. Future studies of G-CSF in patients with left ventricular dysfunction should be monitored closely for safety. Trial registration: ClinicalTrials.gov, no. NCT00394498

Rapid reperfusion therapy has become the standard treatment for ST-segment elevation myocardial infarction (STEMI), with congestive heart failure and left ventricular dysfunction continuing as the strongest predictors of higher long-term risk.1 To date, no definitive therapies exist to regenerate myocardium following myocardial necrosis, and myocardial preservation is therefore the goal of STEMI care. Contemporary studies have suggested the possibility of myocardial regeneration by endogenous stem and progenitor cell populations, and preliminary clinical studies have hinted at potential benefit.2,3 Studies investigating whether postinfarction myocardial function can be improved by enhancing stem cell–mediated repair are in progress (NCT00936819 and NCT00984178).

Granulocyte colony-stimulating factor (G-CSF), an endogenously produced glycoprotein growth factor, when given in pharmacologic doses, stimulates mobilization of hematopoietic stem cells into the peripheral blood. Therapeutically, recombinant synthetic forms have been used to enhance recovery from neutropenia following chemotherapy and for mobilization of stem cells before hematopoietic stem cell transplant.4 Numerous small clinical studies have investigated the potential of G-CSF–induced mobilization of stem cells in the peri-infarction period to enhance left ventricular recovery, but they have yielded discordant results. However, meta-analyses have suggested benefit for left ventricular ejection fraction in subgroups who received G-CSF early after infarction or in patients whose left ventricular dysfunction was mild to moderate.5,6 Larger trials are necessary because, in addition to mobilizing stem cells, G-CSF modulates intracellular signalling cascades within cardiomyocytes and can activate neutrophils, and several trials have been stopped early as a result of excessive in-stent restenosis and acute coronary syndromes in patients with coronary artery disease.7–11 Animal data have similarly yielded discordant results, depending on the dose and timing of G-CSF.12

To clarify the role of G-CSF in promoting left ventricular recovery after acute myocardial infarction, we performed an adequately powered randomized clinical trial in patients with moderate left ventricular dysfunction following anterior-wall STEMI.

8.
Background
Hypertension is the most important cardiovascular risk factor in India, and representative studies of middle-aged and older Indian adults have been lacking. Our objectives were to estimate the proportions of hypertensive adults who had been diagnosed, took antihypertensive medication, and achieved control in the middle-aged and older Indian population and to investigate the association between access to healthcare and hypertension management.

Methods and findings
We designed a nationally representative cohort study of the middle-aged and older Indian population, the Longitudinal Aging Study in India (LASI), and analyzed data from the 2017–2019 baseline wave (N = 72,262) and the 2010 pilot wave (N = 1,683). Hypertension was defined as self-reported physician diagnosis or elevated blood pressure (BP) on measurement, defined as systolic BP ≥ 140 mm Hg or diastolic BP ≥ 90 mm Hg. Among hypertensive individuals, awareness, treatment, and control were defined based on self-reports of having been diagnosed, taking antihypertensive medication, and not having elevated BP, respectively. The estimated prevalence of hypertension for the Indian population aged 45 years and older was 45.9% (95% CI 45.4%–46.5%). Among hypertensive individuals, 55.7% (95% CI 54.9%–56.5%) had been diagnosed, 38.9% (95% CI 38.1%–39.6%) took antihypertensive medication, and 31.7% (95% CI 31.0%–32.4%) achieved BP control. In multivariable logistic regression models, access to public healthcare was a key predictor of hypertension treatment (odds ratio [OR] = 1.35, 95% CI 1.14–1.60, p = 0.001), especially in the most economically disadvantaged group (OR of the interaction for middle economic status = 0.76, 95% CI 0.61–0.94, p = 0.013; OR of the interaction for high economic status = 0.84, 95% CI 0.68–1.05, p = 0.124). Having health insurance was not associated with improved hypertension awareness among those with low economic status (OR = 0.96, 95% CI 0.86–1.07, p = 0.437) or those with middle economic status (OR of the interaction = 1.15, 95% CI 1.00–1.33, p = 0.051), but it was among those with high economic status (OR of the interaction = 1.28, 95% CI 1.10–1.48, p = 0.001). Comparing hypertension awareness, treatment, and control rates in the 4 pilot states, we found statistically significant (p < 0.001) improvement in hypertension management from 2010 to 2017–2019. The limitations of this study include the relatively small pilot sample, which recruited from only 4 states.

Conclusions
Although considerable variations in hypertension diagnosis, treatment, and control exist across different sociodemographic groups and geographic areas, reducing uncontrolled hypertension remains a public health priority in India. Access to healthcare is closely tied to both hypertension diagnosis and treatment.

Jinkook Lee and colleagues investigate hypertension management and its association with healthcare access in middle-aged and older adults in India.
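The interaction ORs above come from multivariable logistic models in which the effect of healthcare access is allowed to vary by economic status. A compact sketch of that model form with statsmodels follows; the variable names and the simulated data are assumptions, not LASI fields:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
econ = rng.choice(["low", "mid", "high"], size=n)
public_care = rng.integers(0, 2, size=n)

# Simulate a larger access effect in the low-economic-status group
beta_access = np.where(econ == "low", 0.9, 0.4)
p = 1 / (1 + np.exp(-(-0.5 + beta_access * public_care)))
treated = rng.binomial(1, p)

df = pd.DataFrame({"treated": treated, "public_care": public_care, "econ": econ})

# The interaction terms capture how the access OR differs across economic strata
model = smf.logit("treated ~ public_care * C(econ)", data=df).fit(disp=0)
print(np.exp(model.params))  # exponentiated coefficients are odds ratios
```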

9.

Background

The Trypanosoma cruzi satellite DNA (satDNA) OligoC-TesT is a standardised PCR format for diagnosis of Chagas disease. The sensitivity of the test is lower for discrete typing unit (DTU) TcI than for TcII-VI and the test has not been evaluated in chronic Chagas disease patients.

Methodology/Principal Findings

We developed a new prototype of the OligoC-TesT based on kinetoplast DNA (kDNA) detection. We evaluated the satDNA and kDNA OligoC-TesTs in a multi-cohort study with 187 chronic Chagas patients and 88 healthy endemic controls recruited in Argentina, Chile and Spain, and 26 diseased non-endemic controls from D.R. Congo and Sudan. All specimens were tested in duplicate. The overall specificity in the controls was 99.1% (95% CI 95.2%–99.8%) for the satDNA OligoC-TesT and 97.4% (95% CI 92.6%–99.1%) for the kDNA OligoC-TesT. The overall sensitivity in the patients was 67.9% (95% CI 60.9%–74.2%) for the satDNA OligoC-TesT and 79.1% (95% CI 72.8%–84.4%) for the kDNA OligoC-TesT.

Conclusions/Significance

Specificities of the two T. cruzi OligoC-TesT prototypes are high on non-endemic and endemic controls. Sensitivities are moderate but significantly (p = 0.0004) higher for the kDNA OligoC-TesT compared to the satDNA OligoC-TesT.
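The definitions behind these figures are straightforward; in the sketch below the counts are back-calculated from the reported percentages (148/187 patients positive, 111/114 controls negative for the kDNA test), so they are approximate, and the duplicate-testing logic is omitted:

```python
def sensitivity(tp: int, fn: int) -> float:
    """True-positive rate among confirmed patients."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """True-negative rate among controls."""
    return tn / (tn + fp)

# Approximate kDNA OligoC-TesT counts reconstructed from the reported rates
print(f"sensitivity {sensitivity(148, 39):.1%}, specificity {specificity(111, 3):.1%}")
```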

10.
Background:
ABO blood type locus has been reported to be an important genetic determinant of venous and arterial thrombosis in genome-wide association studies. We tested the hypothesis that ABO blood type, alone and in combination with mutations in factor V Leiden R506Q and prothrombin G20210A, is associated with the risk of venous thromboembolism and myocardial infarction in the general population.

Methods:
We used data from 2 Danish studies that followed members of the general public from 1977 through 2010. We obtained the genotype of 66,001 white participants for ABO blood type, factor V Leiden R506Q and prothrombin G20210A. We calculated hazard ratios (HRs) and population attributable risk. Our main outcome measures were venous thromboembolism and myocardial infarction.

Results:
The multivariable adjusted HR for venous thromboembolism was 1.4 (95% confidence interval [CI] 1.3–1.5) for non-O blood type (v. O blood type). For the factor V Leiden R506Q mutation, the adjusted HR was 2.2 (95% CI 2.0–2.5) for heterozygous participants and 7.0 (95% CI 4.8–10) for homozygous participants (v. participants without the mutation). For prothrombin G20210A, the adjusted HR was 1.5 (95% CI 1.2–1.9) for heterozygous participants and 11 (95% CI 2.8–44) for homozygous participants (v. participants without the mutation). When we combined ABO blood type and factor V Leiden R506Q or prothrombin G20210A genotype, there was a stepwise increase in the risk of venous thromboembolism (trend, p < 0.001). The population attributable risk of venous thromboembolism was 20% for ABO blood type, 10% for factor V Leiden R506Q and 1% for prothrombin G20210A. Multivariable adjusted HRs for myocardial infarction by genotype did not differ from 1.0.

Interpretation:
ABO blood type had an additive effect on the risk of venous thromboembolism when combined with factor V Leiden R506Q and prothrombin G20210A mutations; blood type was the most important risk factor for venous thromboembolism in the general population.

Genome-wide association studies have reported that the ABO blood type locus is an important genetic determinant of venous and arterial thrombosis,1,2 leading to renewed interest in the association between ABO blood type and venous and arterial thrombosis. This challenges conventional thoughts on genetic screening for thrombophilia, which presently does not include ABO blood type.

Individuals with an A or B blood type have an increased risk of venous thromboembolism and myocardial infarction compared with individuals with O blood type.3–6 Earlier studies concluded that ABO antigen expression determines von Willebrand factor levels;7–11 however, recent findings from genome-wide association studies suggest that ABO antigens may also exert their effect through other pathways.12–16 Both factor V Leiden R506Q and prothrombin G20210A mutations have been consistently associated with increased risk of venous thrombosis but not consistently associated with the risk of arterial thrombosis.17–19

In this study, we tested the hypothesis that ABO blood type, alone and in combination with the factor V Leiden R506Q and prothrombin G20210A mutations, is associated with the risk of venous thromboembolism and myocardial infarction in the general population.
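A population attributable risk like the 20% for ABO blood type can be approximated with Levin's formula from exposure prevalence and relative risk. In the sketch below, the ~60% non-O prevalence is an assumption about the Danish cohort, and the reported HR of 1.4 stands in for the RR:

```python
def levin_par(prevalence_exposed: float, rr: float) -> float:
    """Levin's population attributable risk for a binary exposure."""
    excess = prevalence_exposed * (rr - 1)
    return excess / (1 + excess)

# Assuming ~60% non-O blood type prevalence and RR ~ 1.4
print(f"{levin_par(0.60, 1.4):.0%}")  # ~19%, close to the reported 20%
```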

11.

Background

The impact of dialysis modality on survival is still somewhat controversial. Given possible differences in patients’ characteristics and the cause and rate of death in different countries, the issue needs to be evaluated in Korean cohorts.

Methods

A nationwide prospective observational cohort study (NCT00931970) was performed to compare survival between peritoneal dialysis (PD) and hemodialysis (HD). A total of 1,060 end-stage renal disease patients in Korea who began dialysis between September 1, 2008 and June 30, 2011 were followed through December 31, 2011.

Results

The patients (PD, 30.6%; HD, 69.4%) were followed up for 16.3±7.9 months. PD patients were significantly younger and less likely to be diabetic, and had lower body mass index and larger urinary volume than HD patients. Infection was the most common cause of death. Multivariate Cox regression with the entire cohort revealed that PD tended to be associated with a lower risk of death compared to HD [hazard ratio (HR) 0.63, 95% confidence interval (CI) 0.36–1.08]. In propensity score matched pairs (n = 278 in each modality), cumulative survival probabilities for PD and HD patients were 96.9% and 94.1% at 12 months (P = 0.152) and 94.3% and 87.6% at 24 months (P = 0.022), respectively. Patients on PD had a 51% lower risk of death compared to those on HD (HR 0.49, 95% CI 0.25–0.97).

Conclusions

PD exhibits superior survival to HD in the early period of dialysis, even after adjusting for differences in the patients’ characteristics between the two modalities. Notably, the most common cause of death was infection in this Korean cohort.
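A minimal sketch of 1:1 nearest-neighbour propensity-score matching of the kind used to form the 278 matched pairs; the covariates, the toy data, and the greedy matching-with-replacement shortcut are assumptions, since the study's exact matching specification is not given here:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def match_ps(X: np.ndarray, treated: np.ndarray) -> list[tuple[int, int]]:
    """Greedy 1:1 matching on the logit of the propensity score (with replacement)."""
    ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
    logit = np.log(ps / (1 - ps)).reshape(-1, 1)
    t_idx, c_idx = np.where(treated == 1)[0], np.where(treated == 0)[0]
    nn = NearestNeighbors(n_neighbors=1).fit(logit[c_idx])
    _, pos = nn.kneighbors(logit[t_idx])
    return [(int(t), int(c_idx[p[0]])) for t, p in zip(t_idx, pos)]

# Toy cohort: three baseline covariates (e.g. age, diabetes, BMI); 1 = PD, 0 = HD
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 3))
treated = rng.integers(0, 2, size=300)
pairs = match_ps(X, treated)
print(len(pairs), pairs[:3])
```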

12.

Background

Centenarians are a rapidly growing demographic group worldwide, yet their health and social care needs are seldom considered. This study aims to examine trends in place of death and associations for centenarians in England over 10 years to consider policy implications of extreme longevity.

Methods and Findings

This is a population-based observational study using death registration data linked with area-level indices of multiple deprivations for people aged ≥100 years who died 2001 to 2010 in England, compared with those dying at ages 80-99. We used linear regression to examine the time trends in number of deaths and place of death, and Poisson regression to evaluate factors associated with centenarians’ place of death. The cohort totalled 35,867 people with a median age at death of 101 years (range: 100–115 years). Centenarian deaths increased 56% (95% CI 53.8%–57.4%) in 10 years. Most died in a care home with (26.7%, 95% CI 26.3%–27.2%) or without nursing (34.5%, 95% CI 34.0%–35.0%) or in hospital (27.2%, 95% CI 26.7%–27.6%). The proportion of deaths in nursing homes decreased over 10 years (−0.36% annually, 95% CI −0.63% to −0.09%, p = 0.014), while hospital deaths changed little (0.25% annually, 95% CI −0.06% to 0.57%, p = 0.09). Dying with frailty was common with “old age” stated in 75.6% of death certifications. Centenarians were more likely to die of pneumonia (e.g., 17.7% [95% CI 17.3%–18.1%] versus 6.0% [5.9%–6.0%] for those aged 80–84 years) and old age/frailty (28.1% [27.6%–28.5%] versus 0.9% [0.9%–0.9%] for those aged 80–84 years) and less likely to die of cancer (4.4% [4.2%–4.6%] versus 24.5% [24.6%–25.4%] for those aged 80–84 years) and ischemic heart disease (8.6% [8.3%–8.9%] versus 19.0% [18.9%–19.0%] for those aged 80–84 years) than were younger elderly patients. More care home beds available per 1,000 population were associated with fewer deaths in hospital (PR 0.98, 95% CI 0.98–0.99, p<0.001).

Conclusions

Centenarians are more likely to have causes of death certified as pneumonia and frailty and less likely to have causes of death certified as cancer or ischemic heart disease, compared with younger elderly patients. Reducing reliance on hospital care at the end of life requires recognition of centenarians’ increased likelihood of “acute” decline, notably from pneumonia, wider provision of anticipatory care to enable people to remain in their usual residence, and increased care home bed capacity.

13.
14.
Background
Cervical cancer screening strategies using visual inspection or cytology may have suboptimal diagnostic accuracy for detection of precancer in women living with HIV (WLHIV). The optimal screen and screen–triage strategy, age to initiate, and frequency of screening for WLHIV remain unclear. This study evaluated the sensitivity, specificity, and positive predictive value of different cervical cancer screening strategies in WLHIV in Africa.

Methods and findings
WLHIV aged 25–50 years attending HIV treatment centres in Burkina Faso (BF) and South Africa (SA) from 5 December 2011 to 30 October 2012 were enrolled in a prospective evaluation study of visual inspection using acetic acid (VIA) or visual inspection using Lugol’s iodine (VILI), high-risk human papillomavirus DNA testing (Hybrid Capture 2 [HC2] or careHPV), and cytology for histology-verified high-grade cervical intraepithelial neoplasia (CIN2+/CIN3+) at baseline and endline, a median 16 months later. Among 1,238 women (BF: 615; SA: 623), median age was 36 and 34 years (p < 0.001), 28.6% and 49.6% had ever had prior cervical cancer screening (p < 0.001), and 69.9% and 64.2% were taking ART at enrolment (p = 0.045) in BF and SA, respectively. CIN2+ prevalence was 5.8% and 22.4% in BF and SA (p < 0.001), respectively. VIA had low sensitivity for CIN2+ (44.7%, 95% confidence interval [CI] 36.9%–52.7%) and CIN3+ (56.1%, 95% CI 43.3%–68.3%) in both countries, with specificity for ≤CIN1 of 78.7% (95% CI 76.0%–81.3%). HC2 had sensitivity of 88.8% (95% CI 82.9%–93.2%) for CIN2+ and 86.4% (95% CI 75.7%–93.6%) for CIN3+. Specificity for ≤CIN1 was 55.4% (95% CI 52.2%–58.6%), and screen positivity was 51.3%. Specificity was higher with a restricted genotype (HPV16/18/31/33/35/45/52/58) approach (73.5%, 95% CI 70.6%–76.2%), with lower screen positivity (33.7%), although there was lower sensitivity for CIN3+ (77.3%, 95% CI 65.3%–86.7%). In BF, HC2 was more sensitive for CIN2+/CIN3+ compared to VIA/VILI (relative sensitivity for CIN2+ = 1.72, 95% CI 1.28–2.32; CIN3+: 1.18, 95% CI 0.94–1.49). Triage of HC2-positive women with VIA/VILI reduced the number of colposcopy referrals, but with loss in sensitivity for CIN2+ (58.1%) but not for CIN3+ (84.6%). In SA, cytology high-grade squamous intraepithelial lesion or greater (HSIL+) had the best combination of sensitivity (CIN2+: 70.1%, 95% CI 61.3%–77.9%; CIN3+: 80.8%, 95% CI 67.5%–90.4%) and specificity (81.6%, 95% CI 77.6%–85.1%). HC2 had similar sensitivity for CIN3+ (83.0%, 95% CI 70.2%–91.9%) but lower specificity compared to HSIL+ (42.7%, 95% CI 38.4%–47.1%; relative specificity = 0.57, 95% CI 0.52–0.63), resulting in almost twice as many referrals. Compared to HC2 alone, triage of HC2-positive women with HSIL+ resulted in a 40% reduction in colposcopy referrals but was associated with some loss in sensitivity. CIN2+ incidence over a median 16 months was highest among VIA baseline screen-negative women (2.2%, 95% CI 1.3%–3.7%) and women who were baseline double-negative with HC2 and VIA (2.1%, 95% CI 1.3%–3.5%) and lowest among HC2 baseline screen-negative women (0.5%, 95% CI 0.1%–1.8%). Limitations of our study are that the WLHIV included may not reflect a contemporary cohort of WLHIV initiating ART in the universal ART era and that we did not evaluate HPV tests available in study settings today.

Conclusions
In this cohort study among WLHIV in Africa, a human papillomavirus (HPV) test targeting 14 high-risk (HR) types had higher sensitivity to detect CIN2+ compared to visual inspection but had low specificity, although a restricted genotype approach targeting 8 HR types decreased the number of unnecessary colposcopy referrals. Cytology HSIL+ had optimal performance for CIN2+/CIN3+ detection in SA. Triage of HPV-positive women with HSIL+ maintained high specificity but with some loss in sensitivity compared to HC2 alone.

In this cohort study, Helen Kelly and colleagues explore cervical cancer screening strategies for women living with HIV.

15.
Background
A number of prior studies have demonstrated that research participants with limited English proficiency in the United States are routinely excluded from clinical trial participation. Systematic exclusion through study eligibility criteria that require trial participants to be able to speak, read, and/or understand English affects access to clinical trials and scientific generalizability. We sought to establish the frequency with which English language proficiency is required and, conversely, when non-English languages are affirmatively accommodated in US interventional clinical trials for adult populations.

Methods and findings
We used the advanced search function on ClinicalTrials.gov, specifying interventional studies for adults with at least 1 site in the US. In addition, we used these search criteria to find studies with an available posted protocol. A computer program was written to search for evidence of English or Spanish language requirements, or the posted protocol, when available, was manually read for these language requirements. Of the 14,367 clinical trials registered on ClinicalTrials.gov between 1 January 2019 and 1 December 2020 that met baseline search criteria, 18.98% (95% CI 18.34%–19.62%; n = 2,727) required the ability to read, speak, and/or understand English, and 2.71% (95% CI 2.45%–2.98%; n = 390) specifically mentioned accommodation of translation to another language. The remaining trials in this analysis and the following sub-analyses did not mention English language requirements or accommodation of languages other than English. Of 2,585 federally funded clinical trials, 28.86% (95% CI 27.11%–30.61%; n = 746) required English language proficiency and 4.68% (95% CI 3.87%–5.50%; n = 121) specified accommodation of other languages; of the 5,286 industry-funded trials, 5.30% (95% CI 4.69%–5.90%; n = 280) required English and 0.49% (95% CI 0.30%–0.69%; n = 26) accommodated other languages. Trials related to infectious disease were less likely to specify an English requirement than all registered trials (10.07% versus 18.98%; relative risk [RR] = 0.53; 95% CI 0.44–0.64; p < 0.001). Trials related to COVID-19 were also less likely to specify an English requirement than all registered trials (8.18% versus 18.98%; RR = 0.43; 95% CI 0.33–0.56; p < 0.001). Trials with a posted protocol (n = 366) were more likely than all registered clinical trials to specify an English requirement (36.89% versus 18.98%; RR = 1.94, 95% CI 1.69–2.23; p < 0.001). A separate analysis of studies with posted protocols in 4 therapeutic areas (depression, diabetes, breast cancer, and prostate cancer) demonstrated that clinical trials related to depression were the most likely to require English (52.24%; 95% CI 40.28%–64.20%). One limitation of this study is that the computer program only searched for the terms “English” and “Spanish” and may have missed evidence of other language accommodations. Another limitation is that we did not differentiate between requirements to read English, speak English, understand English, and be a native English speaker; we grouped these requirements together in the category of English language requirements.

Conclusions
A meaningful percentage of US interventional clinical trials for adults exclude individuals who cannot read, speak, and/or understand English, or are not native English speakers. To advance more inclusive and generalizable research, funders, sponsors, institutions, investigators, institutional review boards, and others should prioritize translating study materials and eliminate language requirements unless justified either scientifically or ethically.

Akila Muthukumar and coauthors systematically analyze ClinicalTrials.gov to evaluate the frequency of English language requirements in clinical trial eligibility criteria.
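The authors describe a computer program that searched eligibility text for the terms "English" and "Spanish". The exact program is not public, so the pattern below is only a simplified sketch of that kind of matching, with made-up regular expressions and field names:

```python
import re

ENGLISH_REQ = re.compile(
    r"(speak|read|understand|fluent in|proficien\w* in)[^.]{0,40}\bEnglish\b",
    re.IGNORECASE,
)
SPANISH_MENTION = re.compile(r"\bSpanish\b", re.IGNORECASE)

def classify(eligibility_text: str) -> dict:
    """Flag English-language requirements and Spanish mentions in criteria text."""
    return {
        "english_required": bool(ENGLISH_REQ.search(eligibility_text)),
        "spanish_mentioned": bool(SPANISH_MENTION.search(eligibility_text)),
    }

print(classify("Inclusion: ability to read and understand English; consent available in Spanish."))
```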

16.

Background

Several point-of-care (POC) tests are available for the evaluation of febrile patients, but data about their performance in the acute care setting are sparse. We investigated the analytical accuracy and feasibility of POC tests for white blood cell (WBC) count and C-reactive protein (CRP) at the pediatric emergency department (ED).

Methods

In the first part of the study, the HemoCue WBC and Afinion AS100 CRP POC analyzers were compared with the laboratory’s routine WBC (Sysmex XE-2100) and CRP (Modular P) analyzers in the hospital central laboratory, using 77 and 48 clinical blood samples, respectively. The POC tests were then adopted for use at the pediatric ED. In the second part of the study, we compared WBC and CRP levels measured by POC and routine methods during 171 ED patient visits by 168 febrile children and adolescents. Attending physicians performed the POC tests on capillary fingerprick samples.

Results

In parallel measurements in the laboratory, both the WBC and CRP POC analyzers showed good agreement with the reference methods. In febrile children at the emergency department (median age 2.4 years), physician-performed POC determinations in capillary blood gave results comparable to those in venous blood analyzed in the laboratory. The mean difference between POC and reference test results was 1.1 E9/L (95% limits of agreement from -6.5 to 8.8 E9/L) for WBC and -1.2 mg/L (95% limits of agreement from -29.6 to 27.2 mg/L) for CRP.

Conclusions

POC tests are feasible and relatively accurate methods to assess CRP level and WBC count among febrile children at the ED.
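The "95% limits of agreement" quoted above are the Bland-Altman mean difference ± 1.96 standard deviations of the paired differences. A sketch follows; the paired counts are hypothetical:

```python
import numpy as np

def limits_of_agreement(poc: np.ndarray, lab: np.ndarray) -> tuple[float, float, float]:
    """Bland-Altman mean difference and 95% limits of agreement (POC minus lab)."""
    d = poc - lab
    mean_d, sd_d = d.mean(), d.std(ddof=1)
    return mean_d, mean_d - 1.96 * sd_d, mean_d + 1.96 * sd_d

# Hypothetical paired WBC counts (E9/L): fingerprick POC vs venous laboratory analysis
poc = np.array([8.2, 12.5, 6.1, 15.3, 9.8, 11.0])
lab = np.array([7.9, 11.8, 6.5, 14.1, 9.2, 10.6])
print(limits_of_agreement(poc, lab))  # (mean difference, lower LoA, upper LoA)
```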

17.
Objective
Current practice guidelines recommend the routine use of several cardiac medications early in the course of acute myocardial infarction (AMI). Our objective was to analyze temporal trends in medication use and in-hospital mortality of AMI patients in a Chinese population.

Methods
This is a retrospective observational study using electronic medical records from the hospital information system (HIS) of 14 Chinese hospitals. We identified 5,599 patients with AMI between 2005 and 2011. Factors associated with medication use and in-hospital mortality were explored using hierarchical logistic regression.

Results
The use of several guideline-recommended medications all increased during the study period: statins (57.7%–90.1%), clopidogrel (61.8%–92.3%), β-blockers (45.4%–65.1%), ACEI/ARB (46.7%–58.7%) and aspirin (81.9%–92.9%), and the combinations thereof increased from 24.9% to 42.8% (P<0.001 for all). Multivariate analyses showed statistically significant increases in all these medications. In-hospital mortality decreased from 15.9% to 5.7% from 2005 to 2011 (P<0.001). After multivariate adjustment, admission year remained a significant factor (OR = 0.87, 95% CI 0.79–0.96, P = 0.007), and the use of aspirin (OR = 0.64, 95% CI 0.46–0.87), clopidogrel (OR = 0.44, 95% CI 0.31–0.61), ACEI/ARB (OR = 0.73, 95% CI 0.56–0.94) and statins (OR = 0.54, 95% CI 0.40–0.73) was associated with a decrease in in-hospital mortality. Patients with older age, cancer and renal insufficiency had higher in-hospital mortality, while they were generally less likely to receive all these medications.

Conclusion
Use of guideline-recommended medications early in the course of AMI increased between 2005 and 2011 in a Chinese population. During this same time, there was a decrease in in-hospital mortality.

18.

Background

This study (NCT01682005) aims to assess clinical and cost impacts of complete and incomplete rotavirus (RV) vaccination.

Methods

Beneficiaries who had continuously received medical and pharmacy benefits since birth were identified separately in Truven Commercial Claims and Encounters (2000–2011) and Truven Medicaid Claims (2002–2010) and observed until the end of insurance eligibility or five years of age, whichever came first. Infants with ≥1 RV vaccine dose within the vaccination window (6 weeks–8 months) were divided into completely and incompletely vaccinated cohorts. Historically unvaccinated (before 2007) and contemporarily unvaccinated (2007 and after) cohorts included children without RV vaccine. Claims with International Classification of Diseases, 9th edition (ICD-9) codes for diarrhea and RV were identified. First RV episode incidence and RV-related and diarrhea-related healthcare resource utilization after 8 months of age were calculated and compared across groups. Poisson regressions were used to generate incidence rates with 95% confidence intervals (CIs). Mean total, inpatient, outpatient and emergency room costs for first RV and diarrhea episodes were calculated; bootstrapping was used to construct 95% CIs to evaluate cost differences.

Results

In total, 1,069,485 Commercial and 515,557 Medicaid patients met the inclusion criteria. Among the commercially insured, RV incidence per 10,000 person-years was 3.3 (95% CI 2.8–3.9) for the completely vaccinated, 4.0 (95% CI 3.3–5.0) for the incompletely vaccinated, 20.9 (95% CI 19.5–22.4) for the contemporarily unvaccinated and 40.3 (95% CI 38.6–42.1) for the historically unvaccinated. Rates in Medicaid were 7.5 (95% CI 4.8–11.8) for the completely vaccinated, 9.0 (95% CI 6.5–12.3) for the incompletely vaccinated, 14.6 (95% CI 12.8–16.7) for the contemporarily unvaccinated and 52.0 (95% CI 50.2–53.8) for the historically unvaccinated. Mean cost for the first RV episode per cohort member was $15.33 (95% CI $12.99–$18.03) and $4.26 (95% CI $2.34–$6.35) lower for the completely vaccinated versus the contemporarily unvaccinated in Commercial and Medicaid, respectively.

Conclusions

RV vaccination results in a significant reduction in RV infection. There is evidence of indirect benefit to unvaccinated individuals.
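The cost comparisons above used bootstrapping to construct 95% CIs. A minimal percentile-bootstrap sketch for a difference in mean cost follows; the simulated cost data are entirely hypothetical:

```python
import numpy as np

def bootstrap_diff_ci(a: np.ndarray, b: np.ndarray, n_boot: int = 10_000, seed: int = 0):
    """Percentile bootstrap 95% CI for mean(a) - mean(b)."""
    rng = np.random.default_rng(seed)
    diffs = [
        rng.choice(a, size=a.size).mean() - rng.choice(b, size=b.size).mean()
        for _ in range(n_boot)
    ]
    return np.percentile(diffs, [2.5, 97.5])

# Hypothetical per-member first-episode RV costs: vaccinated vs contemporarily unvaccinated
vaccinated = np.random.default_rng(1).exponential(5.0, size=5000)
unvaccinated = np.random.default_rng(2).exponential(20.0, size=5000)
print(bootstrap_diff_ci(vaccinated, unvaccinated))  # negative CI -> lower cost if vaccinated
```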

19.
Background
Acute Plasmodium vivax malaria is associated with haemolysis, bone marrow suppression, reticulocytopenia, and post-treatment reticulocytosis leading to haemoglobin recovery. Little is known about how malaria affects glucose-6-phosphate dehydrogenase (G6PD) activity and whether changes in activity when patients present may lead qualitative tests, like the fluorescent spot test (FST), to misdiagnose G6PD-deficient (G6PDd) patients as G6PD-normal (G6PDn). Giving primaquine or tafenoquine to such patients could result in severe haemolysis.

Methods
We investigated the G6PD genotype, G6PD enzyme activity over time and the baseline FST phenotype in Cambodians with acute P. vivax malaria treated with 3-day dihydroartemisinin-piperaquine and weekly primaquine, 0.75 mg/kg × 8 doses.

Results
Of 75 recruited patients (63 male), aged 5–63 years (median 24), 15 were G6PDd males (14 Viangchan, 1 Canton), 3 were G6PD Viangchan heterozygous females, and 57 were G6PDn; 6 patients had α/β-thalassaemia and 26 had HbE. Median (range) day 0 G6PD activities were 0.85 U/g Hb (0.10–1.36) and 11.4 U/g Hb (6.67–16.78) in G6PDd and G6PDn patients, respectively, rising significantly to 1.45 (0.36–5.54, p<0.01) and 12.0 (8.1–17.4, p = 0.04) U/g Hb on day 7, then falling back to approximately day 0 values by day 56. Day 0 G6PD activity did not correlate (p = 0.28) with the day 0 reticulocyte counts, but both correlated over time. The FST correctly diagnosed 17/18 G6PDd patients, misclassifying one heterozygous female as G6PDn.

Conclusions
In Cambodia, acute P. vivax malaria did not elevate G6PD activities in our small sample of G6PDd patients to levels that would result in a false normal qualitative test. Low G6PDd enzyme activity at disease presentation increases upon parasite clearance, in parallel with reticulocytosis. More work is needed in G6PDd heterozygous females to ascertain the effect of P. vivax on their G6PD activities.

Trial registration
The trial was registered (ACTRN12613000003774) with the Australia New Zealand Clinical Trials Registry (https://www.anzctr.org.au/Trial/Registration/TrialReview.aspx?id=363399&isReview=true).

20.

Background

Whether to continue oral anticoagulant therapy beyond 6 months after an “unprovoked” venous thromboembolism is controversial. We sought to determine clinical predictors that identify patients at low risk of recurrent venous thromboembolism who could safely discontinue oral anticoagulants.

Methods

In a multicentre prospective cohort study, 646 participants with a first, unprovoked major venous thromboembolism were enrolled over a 4-year period. Of these, 600 participants completed a mean 18-month follow-up in September 2006. We collected data for 69 potential predictors of recurrent venous thromboembolism while patients were taking oral anticoagulation therapy (5–7 months after initiation). During follow-up after discontinuing oral anticoagulation therapy, all episodes of suspected recurrent venous thromboembolism were independently adjudicated. We performed a multivariable analysis of predictor variables (p < 0.10) with high interobserver reliability to derive a clinical decision rule.

Results

We identified 91 confirmed episodes of recurrent venous thromboembolism during follow-up after discontinuing oral anticoagulation therapy (annual risk 9.3%, 95% CI 7.7%–11.3%). Men had a 13.7% (95% CI 10.8%–17.0%) annual risk. There was no combination of clinical predictors that satisfied our criteria for identifying a low-risk subgroup of men. Fifty-two percent of women had 0 or 1 of the following characteristics: hyperpigmentation, edema or redness of either leg; D-dimer ≥ 250 μg/L while taking warfarin; body mass index ≥ 30 kg/m2; or age ≥ 65 years. These women had an annual risk of 1.6% (95% CI 0.3%–4.6%). Women who had 2 or more of these findings had an annual risk of 14.1% (95% CI 10.9%–17.3%).

Interpretation

Women with 0 or 1 risk factor may safely discontinue oral anticoagulant therapy after 6 months of therapy following a first unprovoked venous thromboembolism. This criterion does not apply to men. (http://Clinicaltrials.gov trial register number NCT00261014)

Venous thromboembolism is a common, potentially fatal, yet treatable, condition. The risk of a recurrent venous thromboembolic event after 3–6 months of oral anticoagulant therapy varies. Some groups of patients (e.g., those who had a venous thromboembolism after surgery) have a very low annual risk of recurrence (< 1%),1 and they can safely discontinue anticoagulant therapy.2 However, among patients with an unprovoked thromboembolism who discontinue anticoagulation therapy after 3–6 months, the risk of a recurrence in the first year is 5%–27%.3–6 In the second year, the risk is estimated to be 5%,3 and it is estimated to be 2%–3.8% for each subsequent year.5,7 The case-fatality rate for recurrent venous thromboembolism is between 5% and 13%.8,9 Oral anticoagulation therapy is very effective for reducing the risk of recurrence during therapy (> 90% relative risk [RR] reduction);3,4,10,11 however, this benefit is lost after therapy is discontinued.3,10,11 The risk of major bleeding with ongoing oral anticoagulation therapy among venous thromboembolism patients is 0.9%–3.0% per year,3,4,6,12 with an estimated case-fatality rate of 13%.13

Given that the long-term risk of fatal hemorrhage appears to balance the risk of fatal recurrent pulmonary embolism among patients with an unprovoked venous thromboembolism, clinicians are unsure if continuing oral anticoagulation therapy beyond 6 months is necessary.2,14 Identifying subgroups of patients with an annual risk of less than 3% will help clinicians decide which patients can safely discontinue anticoagulant therapy.

We sought to determine the clinical predictors or combinations of predictors that identify patients with an annual risk of venous thromboembolism of less than 3% after taking an oral anticoagulant for 5–7 months after a first unprovoked event.
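The women-only rule described above is a simple count of four dichotomous findings; a sketch of how it would be applied follows. The function and field names are illustrative, while the thresholds follow the abstract:

```python
def low_risk_to_stop_anticoagulation(
    sex: str,
    leg_signs: bool,          # hyperpigmentation, edema or redness of either leg
    d_dimer_ug_per_l: float,  # D-dimer measured while taking warfarin
    bmi: float,
    age_years: int,
) -> bool:
    """True if 0-1 of the 4 criteria are present in a woman (no rule exists for men)."""
    if sex != "female":
        return False  # the study found no low-risk subgroup among men
    score = sum([
        leg_signs,
        d_dimer_ug_per_l >= 250,
        bmi >= 30,
        age_years >= 65,
    ])
    return score <= 1

print(low_risk_to_stop_anticoagulation("female", False, 180.0, 27.5, 58))  # True
```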
