Similar Articles
20 similar articles found.
1.

Background

Improving the equitable distribution of government healthcare subsidies (GHS), particularly among low-income citizens, is a major goal of China’s healthcare sector reform.

Objectives

This study investigates the distribution of GHS in China between socioeconomic populations at two different points in time, examines the comparative distribution of healthcare benefits before and after healthcare reforms in Northwest China, compares the parity of distribution between urban and rural areas, and explores factors that influence equitable GHS distribution.

Methods

Benefit incidence analysis of GHS progressivity was performed, and concentration and Kakwani indices for outpatient, inpatient, and total healthcare were calculated. Two rounds of household surveys that used multistage stratified samples were conducted in 2003 (13,564 respondents) and 2008 (12,973 respondents). Data on socioeconomics, healthcare payments, and healthcare utilization were collected using household interviews.
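For readers unfamiliar with benefit incidence analysis, the sketch below shows how a concentration index and a Kakwani index can be computed from individual-level survey data using the standard covariance formula. The variable names and the use of Python are illustrative; the study's actual estimation procedure may differ.

```python
import numpy as np

def concentration_index(benefit, income):
    """Concentration index via the covariance formula:
    CI = 2 * cov(benefit, fractional income rank) / mean(benefit).
    Positive values mean the benefit is concentrated among richer individuals."""
    order = np.argsort(income)
    benefit = np.asarray(benefit, dtype=float)[order]
    n = len(benefit)
    rank = (np.arange(1, n + 1) - 0.5) / n          # fractional rank by income
    return 2.0 * np.cov(benefit, rank, bias=True)[0, 1] / benefit.mean()

def kakwani_index(benefit, income):
    """Kakwani index = concentration index of the benefit minus the Gini
    coefficient of income (income's concentration index against itself)."""
    return concentration_index(benefit, income) - concentration_index(income, income)
```

Under this convention, a positive concentration index for a subsidy means higher-income groups capture a larger share of it, which is how the regressive pattern in the Results below is read.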

Results

High-income individuals generally reap larger benefits from GHS, as reflected by positive concentration indices, which indicates a regressive system. Concentration indices for inpatient care were 0.2199 (95% confidence interval [CI], 0.0829 to 0.3568) and 0.4445 (95% CI, 0.3000 to 0.5890) in 2002 (urban vs. rural, respectively), and 0.3925 (95% CI, 0.2528 to 0.5322) and 0.4084 (95% CI, 0.2977 to 0.5190) in 2007. Outpatient healthcare subsidies showed different distribution patterns in urban and rural areas following the redesign of rural healthcare insurance programs (urban vs. rural: 0.1433 [95% CI, 0.0263 to 0.2603] and 0.3662 [95% CI, 0.2703 to 0.4622] in 2002, respectively; 0.3063 [95% CI, 0.1657 to 0.4469] and −0.0273 [95% CI, −0.1702 to 0.1156] in 2007).

Conclusions

Our study demonstrates an inequitable distribution of GHS in China from 2002 to 2007; however, the inequity was reduced, especially in rural outpatient services. Future healthcare reforms in China should focus not only on expanding coverage but also on improving the equitable distribution of healthcare benefits.

2.

Background

The prevalence of hepatitis C virus (HCV) infection in Malaysia has been estimated at 2.5% of the adult population. Our objective, satisfying one of the directives of the WHO Framework for Global Action on Viral Hepatitis, was to forecast the HCV disease burden in Malaysia using modelling methods.

Methods

An age-structured multi-state Markov model was developed to simulate the natural history of HCV infection. We tested three historical incidence scenarios that would give rise to the estimated prevalence in 2009, and calculated the incidence of cirrhosis, end-stage liver disease, and death, and disability-adjusted life-years (DALYs) under each scenario, to the year 2039. In the baseline scenario, current antiviral treatment levels were extended from 2014 to the end of the simulation period. To estimate the disease burden averted under current sustained virological response rates and treatment levels, the baseline scenario was compared to a counterfactual scenario in which no past or future treatment is assumed.
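The sketch below illustrates the mechanics of a multi-state Markov cohort projection of the kind described, with a handful of liver-disease states. All state names and transition probabilities are placeholders for illustration; they are not the calibrated values, age structure, or treatment effects of the Malaysian model.

```python
import numpy as np

# Illustrative states; the published model is age-structured and more detailed.
states = ["chronic_HCV", "cirrhosis", "decompensated", "HCC", "dead"]
P = np.array([                      # annual transition probabilities (assumed)
    [0.960, 0.030, 0.000, 0.005, 0.005],
    [0.000, 0.920, 0.040, 0.020, 0.020],
    [0.000, 0.000, 0.800, 0.050, 0.150],
    [0.000, 0.000, 0.000, 0.570, 0.430],
    [0.000, 0.000, 0.000, 0.000, 1.000],
])
assert np.allclose(P.sum(axis=1), 1.0)            # each row must sum to 1

cohort = np.array([1000.0, 0.0, 0.0, 0.0, 0.0])   # 1,000 people with chronic HCV
for year in range(25):                            # project 25 annual cycles
    cohort = cohort @ P
print(dict(zip(states, cohort.round(1))))
```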

Results

In the baseline scenario, the projected disease burden for the year 2039 is 94,900 DALYs/year (95% credible interval (CrI): 77,100 to 124,500), with 2,002 (95% CrI: 1,340 to 3,040) and 540 (95% CrI: 251 to 1,030) individuals predicted to develop decompensated cirrhosis (DC) and hepatocellular carcinoma (HCC), respectively, in that year. Although current treatment practice is estimated to avert a cumulative total of 2,200 deaths from DC or HCC, a cumulative total of 63,900 HCV-related deaths is projected by 2039.

Conclusions

The HCV-related disease burden is already high and is forecast to rise steeply over the coming decades under current levels of antiviral treatment. Increased governmental resources to improve HCV screening and treatment rates and to reduce transmission are essential to address the high projected HCV disease burden in Malaysia.

3.

Objectives

This study aimed to assess the relation between stent edge restenosis (SER) and the distance from the stent edge to the residual plaque using quantitative intravascular ultrasound.

Background

Although percutaneous coronary intervention with drug-eluting stents has improved SER rates, determining an appropriate stent edge landing zone can be challenging in cases of diffuse plaque lesions. It is known that edge vascular response can occur within 2 mm from the edge of a bare metal stent, but the distance to the adjacent plaque has not been evaluated for drug-eluting stents.

Methods

A total of 97 proximal residual plaque lesions (plaque burden [PB] >40%) treated with everolimus-eluting stents were retrospectively evaluated to determine the distance from the stent edge to the residual plaque.

Results

The SER group had significantly higher PB than the non-SER group (59.1 ± 6.1% vs. 51.9 ± 9.1%; P = 0.04). Higher PB was associated with SER, and a cutoff value of 54.74% was determined using receiver operating characteristic (ROC) curve analysis. At this PB cutoff, the distance from the stent edge to the lesion was significantly associated with SER (odds ratio = 2.05, P = 0.035). The corresponding area under the ROC curve was 0.725, and the cutoff distance for predicting SER was 1.0 mm.
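As an aside on how a threshold such as the 54.74% plaque-burden cutoff is typically derived from an ROC analysis, the sketch below picks the cutoff maximizing Youden's J (sensitivity + specificity − 1) on simulated data. The data, group sizes, and the choice of Youden's J are assumptions for illustration, not the study's actual analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated plaque burden (%) for SER and non-SER lesions (illustrative only)
pb  = np.concatenate([rng.normal(59, 6, 30), rng.normal(52, 9, 67)])
ser = np.concatenate([np.ones(30), np.zeros(67)])        # 1 = stent edge restenosis

thresholds = np.unique(pb)
sens = np.array([(pb[ser == 1] >= t).mean() for t in thresholds])
spec = np.array([(pb[ser == 0] <  t).mean() for t in thresholds])
best = thresholds[(sens + spec - 1).argmax()]            # maximum of Youden's J
print(f"ROC-derived plaque-burden cutoff: {best:.2f}%")
```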

Conclusion

An interval of less than 1 mm from the proximal stent edge to the nearest point meeting the 54.74% PB cutoff was significantly associated with SER in patients with residual plaque lesions.

4.

Objectives

To estimate the effects of achieving China’s national goals for dietary salt (NaCl) reduction or implementing culturally-tailored dietary salt restriction strategies on cardiovascular disease (CVD) prevention.

Methods

The CVD Policy Model was used to project blood pressure lowering and subsequent downstream prevented CVD that could be achieved by population-wide salt restriction in China. Outcomes were annual CVD events prevented, relative reductions in rates of CVD incidence and mortality, quality-adjusted life-years (QALYs) gained, and CVD treatment costs saved.

Results

Reducing mean dietary salt intake to 9.0 g/day gradually over 10 years could prevent approximately 197,000 incident CVD events annually [95% uncertainty interval (UI): 173,000–219,000], reduce annual CVD mortality by approximately 2.5% (2.2–2.8%), gain 303,000 annual QALYs (278,000–329,000), and save approximately 1.4 billion international dollars (Int$) in annual CVD costs (1.2–1.6 billion). Reducing mean salt intake to 6.0 g/day could approximately double these benefits. Implementing cooking salt-restriction spoons could prevent somewhat fewer incident CVD cases (183,000; 153,000–215,000) and avoid Int$1.4 billion in CVD treatment costs annually (1.2–1.7 billion). Implementing a cooking salt substitute strategy could lead to approximately three times the health benefits of the salt-restriction spoon program. More than three-quarters of the benefits from any dietary salt reduction strategy would be realized in hypertensive adults.

Conclusion

China could derive substantial health gains from implementation of population-wide dietary salt reduction policies. Most health benefits from any dietary salt reduction program would be realized in adults with hypertension.

5.

Objectives

To determine the prevalence, determinants, and potential clinical relevance of adherence with the Dutch dosing guideline in patients with impaired renal function at hospital discharge.

Design

Retrospective cohort study between January 2007 and July 2011.

Setting

Academic teaching hospital in the Netherlands.

Subjects

Patients with an estimated glomerular filtration rate (eGFR) of 10–50 ml/min/1.73 m² at discharge who were prescribed one or more medicines whose dose depends on renal function.

Main Outcome Measures

The prevalence of adherence with the Dutch renal dosing guideline was investigated, as was the influence of possible determinants such as reporting of the eGFR and severity of renal impairment (severe: eGFR <30; moderate: eGFR 30–50 ml/min/1.73 m²). Furthermore, the potential clinical relevance of non-adherence was assessed.

Results

A total of 1,327 patients were included (mean age 67 years; mean eGFR 38 ml/min/1.73 m²). Adherence with the guideline was present in 53.9% (n=715) of patients. Reporting of the eGFR, introduced in April 2009, was associated with greater adherence with the guideline (from 50.7% to 57.0%; RR 1.12, 95% CI 1.02-1.25). Adherence was lower in patients with severe renal impairment (46.0%) than in patients with moderate renal impairment (58.1%; RR 0.79, 95% CI 0.70-0.89). Of the cases of non-adherence, 71.4% had the potential to cause moderate to severe harm.

Conclusion

Required dosage adjustments for impaired renal function are often not made at hospital discharge, and the majority of these cases of non-adherence have the potential to cause harm. Reporting of the eGFR can be a small and simple first step toward improving adherence with dosing guidelines.

6.

Background

Shugan Jianpi Zhixie therapy (SJZT) has been widely used to treat diarrhea-predominant irritable bowel syndrome (IBS-D), but the results are still controversial. A meta-analysis of randomized, double-blind, placebo-controlled trials was performed to assess the efficacy and tolerability of SJZT for IBS-D.

Methods

The MEDLINE, EMBASE, Cochrane Library, the China National Knowledge Infrastructure database, the Chinese Biomedical Literature database and the Wanfang database were searched up to June 2014 with no language restrictions. Summary estimates, including 95% confidence intervals (CI), were calculated for global symptom improvement, abdominal pain improvement, and Symptom Severity Scale (BSS) score.

Results

Seven trials (N=954) were included. The overall risk of bias was low. SJZT showed significant improvement in global symptoms compared to placebo (RR 1.61; 95% CI 1.24, 2.10; P = 0.0004; therapeutic gain = 33.0%; number needed to treat (NNT) = 3.0). SJZT was significantly more likely than placebo to reduce the overall BSS score (SMD –0.67; 95% CI –0.94, –0.40; P < 0.00001) and to improve abdominal pain (RR 4.34; 95% CI 2.64, 7.14; P < 0.00001). Adverse events with SJZT did not differ from those with placebo.
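The reported therapeutic gain and NNT are linked by simple arithmetic: the NNT is the reciprocal of the absolute difference in response rates. The sketch below reconstructs that relationship; the placebo response rate is an assumed value chosen so the numbers roughly match the abstract, not a figure taken from the meta-analysis.

```python
# Illustrative reconstruction of the therapeutic gain / NNT arithmetic.
placebo_rate = 0.54                      # assumed placebo global-improvement rate
risk_ratio   = 1.61                      # pooled RR reported above
sjzt_rate    = placebo_rate * risk_ratio
therapeutic_gain = sjzt_rate - placebo_rate        # absolute risk difference
nnt = 1.0 / therapeutic_gain
print(f"gain ≈ {therapeutic_gain:.0%}, NNT ≈ {nnt:.1f}")   # ~33%, ~3
```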

Conclusions

This meta-analysis suggests that SJZT is an effective and safe therapy option for patients with IBS-D. However, given the high clinical heterogeneity and small sample sizes of the included trials, further large-scale, rigorously designed trials using standardized preparations are needed.

7.

Objective

To model the cost-effectiveness impact of routine use of an antimicrobial chlorhexidine gluconate-containing securement dressing compared to non-antimicrobial transparent dressings for the protection of central vascular lines in intensive care unit patients.

Design

This study uses a novel health economic model to estimate the cost-effectiveness of using the chlorhexidine gluconate dressing versus transparent dressings in a French intensive care unit scenario. The 30-day, time-non-homogeneous Markov model comprises eight health states. The probabilities of events derive from a multicentre (12 French intensive care units) randomized controlled trial. 1,000 Monte Carlo simulations of 1,000 patients per dressing strategy are used for probabilistic sensitivity analysis and 95% confidence interval calculations. The outcome is the number of catheter-related bloodstream infections avoided. Costs of intensive care unit stay are based on a recent French multicentre study, and the cost-effectiveness criterion is the cost per catheter-related bloodstream infection avoided. The incremental net monetary benefit per patient is also estimated.

Patients

1,000 patients per group were simulated based on the source randomized controlled trial involving 1,879 adults expected to require intravascular catheterization for 48 hours.

Intervention

Chlorhexidine gluconate-containing securement dressing compared to non-antimicrobial transparent dressings.

Results

The chlorhexidine gluconate dressing prevents 11.8 infections per 1,000 patients (95% confidence interval: [3.85; 19.64]), corresponding to a number needed to treat of 85 patients. The mean cost difference per patient of €141 is not statistically significant (95% confidence interval: [€-975; €1,258]). The incremental cost-effectiveness ratio is €12,046 per catheter-related bloodstream infection prevented, and the incremental net monetary benefit per patient is €344.88.
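The summary measures above follow directly from the per-patient cost and effect differences. The sketch below approximately reproduces them from the figures in the abstract; the willingness-to-pay value used for the net monetary benefit is an assumption, since it is not stated here.

```python
delta_cost   = 141.0          # mean extra cost per patient (EUR), from the abstract
delta_effect = 11.8 / 1000    # infections avoided per patient, from the abstract

nnt  = 1 / delta_effect                         # number needed to treat, ~85
icer = delta_cost / delta_effect                # EUR per infection avoided, ~12,000
wtp  = 42_000                                   # assumed willingness to pay per infection avoided (EUR)
inmb = wtp * delta_effect - delta_cost          # incremental net monetary benefit per patient
print(round(nnt), round(icer), round(inmb, 2))  # values close to those reported above
```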

Conclusions

According to the base case scenario, the chlorhexidine gluconate dressing is more cost-effective than the reference dressing.

Trial Registration

This model is based on the data from the RCT registered with www.clinicaltrials.gov (NCT01189682).

8.

Background

Behavior change communication (BCC) plays a decisive role in modifying socio-cultural norms affecting perceptions and nutritional practices during pregnancy.

Objective

To examine the effectiveness of ‘Trials of Improved Practices’ (TIPs) on dietary and iron-folate intake during pregnancy.

Design

Community-based quasi-experimental study with a control group.

Setting

Four villages of Chiraigaon Community Development Block, Varanasi, India. The study began in May 2010, with recruitment from August 2010; the endline assessment, after 12 weeks of intervention, was completed in April 2011.

Participants

Pregnant women at 13–28 weeks of gestation.

Intervention

TIPs was implemented in addition to ongoing essential obstetric care services in two villages through three home visits (assessment, negotiation, and evaluation); the other two control villages received only the assessment and evaluation visits. The TIPs-based strategies were interpersonal communication, active participation of family members, and home-based reminder materials. The effect of TIPs was assessed by comparing key outcome variables at baseline and after 12 weeks of intervention.

Outcome Measures

Hemoglobin level, anemia prevalence, weight gain, compliance with iron-folate supplementation, and dietary intake of calories, protein, calcium, and iron.

Results

A total of 86 participants completed the study. At endline, mean hemoglobin levels were 11.5±1.24 g/dl and 10.37±1.38 g/dl in the TIPs and control groups, respectively. The prevalence of anemia was halved in the TIPs group and increased by 2.4% in the control group. Weight gain (grams/week) was significantly (p<0.01) higher in the TIPs group (326.9±91.8 vs. 244.6±97.4). More than 85% of the pregnant women in the TIPs group were compliant with iron-folate supplementation, compared with only 38% of controls. Mean protein intake increased by 1.78 g in the intervention group and decreased by 1.81 g in controls (p<0.05). More than two-thirds of the women in the TIPs group were taking one extra meal, compared with only one-third of controls.

Conclusion

TIPs was found to be an effective approach to improving the nutritional status of pregnant women in the study area. The TIPs strategy could be further explored in larger samples representing different socio-cultural and geographical areas.

Trial Registration

Clinical Trial Registry of India CTRI/2015/02/005517

9.
10.

Introduction

Patients receiving antiretroviral therapy (ART) require routine monitoring to track response to treatment and assess for treatment failure. This study aims to identify gaps in monitoring practices in Kenya and Uganda.

Methods

We conducted a systematic retrospective chart review of adults who initiated ART between 2007 and 2012. We assessed the availability of baseline measurements (CD4 count, weight, and WHO stage) and ongoing CD4 and weight monitoring according to national guidelines in place at the time. Mixed-effects logistic regression models were used to analyze facility and patient factors associated with meeting monitoring guidelines.

Results

From 2007 to 2012, at least 88% of patients per year in Uganda had a recorded weight at initiation, while in Kenya there was a notable increase from 69% to 90%. Patients with a documented baseline CD4 count increased from 69% to about 80% in both countries. In 2012, 83% and 86% of established patients received the recommended quarterly weight monitoring in Kenya and Uganda, respectively, while semiannual CD4 monitoring was less common (49% in Kenya and 38% in Uganda). Initiating ART at a more advanced WHO stage was associated with lower odds of baseline CD4 testing. On-site CD4 analysis capacity was associated with increased odds of CD4 testing both at baseline and thereafter.

Discussion

Substantial gaps were noted in ongoing CD4 monitoring of patients on ART. Although guidelines have since changed, limited laboratory capacity is likely to remain a significant issue in monitoring patients on ART, with important implications for ensuring quality care.

11.

Background/Objectives

The prevalence of hypertension increases significantly as the population ages, and hypertension is becoming a major health care concern in China. The aim of the study was to explore the epidemiological characteristics of hypertension in the elderly and to provide a basis for its prevention.

Design

Three cross-sectional studies, conducted in 2000, 2004, and 2007.

Setting

Beijing, China.

Participants

Groups of 2,832, 1,828, and 2,277 elderly residents aged ≥60 years were included in this study in 2000, 2004, and 2007, respectively.

Intervention

None.

Measurements

Sampling techniques included cluster sampling, stratification, and random selection. Trained staff used a comprehensive geriatric assessment questionnaire and a standard survey instrument to complete the assessments. During the person-to-person interviews, the participants’ demographic characteristics, living conditions, and health status were recorded, and their blood pressure was measured.

Results

The prevalence of hypertension (69.2%, 61.9%, and 56.0%) and the control rates (22.6%, 16.7%, and 21.5%) declined over the survey years, while awareness of treatment (43.7%, 55.8%, and 57.6%) rose, in 2000, 2004, and 2007, respectively. There was no increase in the control rates for males (26.2%, 16.7%, and 20.8%), younger participants (28.0%, 18.4%, and 21.0%), or rural residents (19.5%, 9.6%, and 13.4%) in 2000, 2004, and 2007, respectively.

Conclusions

Our findings indicate that the prevalence of hypertension was high among rural elderly participants, while the rates of awareness, treatment, and control were low. This suggests that effective public health measures need to be developed to improve the prevention and control of hypertension.

12.

Background

The use of liquid medium (MGIT960) for tuberculosis (TB) diagnosis was recommended by WHO in 2007. However, there has been no evaluation of its effectiveness on clinically important outcomes.

Methods and Findings

A pragmatic trial was carried out in a tertiary hospital and a secondary health care unit in Rio de Janeiro City, Brazil. Participants were 16 years or older and suspected of having TB. They were excluded if only cerebrospinal fluid or blood specimens were available for analysis. The MGIT960 technique was compared with the Lowenstein-Jensen (LJ) method for laboratory diagnosis of active TB. The primary outcome was the proportion of patients who had their initial medical management changed within 2 months after randomisation. Secondary outcomes were mean time to a change in procedure, patient satisfaction with the overall treatment, and adverse events. Data were analysed by intention-to-treat. Between April 2008 and September 2011, 693 patients were enrolled (348 to MGIT, 345 to LJ). Smear and culture results were positive for 10% and 15.7% of participants, respectively. Patients in the MGIT arm had their initial medical management changed more frequently than those in the LJ group (10.1% MGIT vs 3.8% LJ; RR 2.67, 95% CI 1.44–.96; p = 0.002; NNT 16, 95% CI 10–39). Mean time to a change in the initial procedure was longer in the LJ group at both sites: 20.0 and 29.6 days in the MGIT group vs. 52.2 and 64.3 days in the LJ group (MD 33.5, 95% CI 30.6–36.4, p = 0.0001). No other important differences were observed.

Conclusions

This study suggests that opting for the MGIT960 system for TB diagnosis provides a promising case management model for improving the quality of care and control of TB.

Trial Registration

Controlled-Trials.com ISRCTN79888843

13.

Background

Anxiety and depression in children and adolescents are undertreated. Computer- and Internet-based cognitive behavioral treatments (cCBT) may be an attractive treatment alternative to regular face-to-face treatment. This meta-analysis aims to evaluate whether cCBT is effective for treating symptoms of anxiety and depression in youth.

Methods and Findings

We conducted systematic searches in bibliographic databases (PubMed, the Cochrane controlled trials register, and PsycINFO) up to December 4, 2013. Only randomized controlled trials in which a computer-, Internet-, or mobile-based cognitive behavioral intervention targeting depression, anxiety, or both in children or adolescents up to the age of 25 was compared to a control condition were selected. We employed a random-effects pooling model in overall effect analyses and a mixed-effects model for subgroup analyses. The searches identified 13 randomized trials, including 796 children and adolescents, that met the inclusion criteria. Seven studies were directed at treating anxiety, four at depression, and two were of a transdiagnostic nature, targeting both anxiety and depression. The overall mean effect size (Hedges’ g) of cCBT on symptoms of anxiety or depression at post-test was g=0.72 (95% CI: 0.55-0.90; number needed to treat (NNT)=2.56). Heterogeneity was low (I²=20.14%, 95% CI: 0-58%). The superiority of cCBT over controls was evident for interventions targeting anxiety (g=0.68; 95% CI: 0.45-0.92; p < .001; NNT=2.70), for interventions targeting depression (g=0.76; 95% CI: 0.41-0.12; p < .001; NNT=2.44), and for transdiagnostic interventions (g=0.94; 95% CI: 0.23-2.66; p < .001; NNT=2.60).
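For context on the random-effects pooling mentioned above, the sketch below applies the DerSimonian–Laird estimator to a set of made-up Hedges' g values. The effect sizes, variances, and the choice of the DerSimonian–Laird method are assumptions for illustration; the review's 13 trials and exact estimator may differ.

```python
import numpy as np

g = np.array([0.55, 0.80, 0.62, 0.95, 0.70])   # per-study Hedges' g (illustrative)
v = np.array([0.04, 0.06, 0.05, 0.09, 0.05])   # per-study variances (illustrative)

w_fe = 1 / v                                   # fixed-effect (inverse-variance) weights
q    = np.sum(w_fe * (g - np.average(g, weights=w_fe)) ** 2)   # Cochran's Q
df   = len(g) - 1
c    = w_fe.sum() - (w_fe ** 2).sum() / w_fe.sum()
tau2 = max(0.0, (q - df) / c)                  # between-study variance
i2   = max(0.0, (q - df) / q) * 100            # I^2 heterogeneity (%)

w  = 1 / (v + tau2)                            # random-effects weights
gp = np.sum(w * g) / w.sum()                   # pooled Hedges' g
se = np.sqrt(1 / w.sum())
print(f"g = {gp:.2f} (95% CI {gp - 1.96*se:.2f} to {gp + 1.96*se:.2f}), I2 = {i2:.0f}%")
```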

Conclusions

Results provide evidence for the efficacy of cCBT in the treatment of anxiety and depressive symptoms in youth. Hence, such interventions may be a promising treatment alternative when evidence-based face-to-face treatment is not feasible. Future studies should examine long-term effects of treatments and should focus on obtaining patient-level data from existing studies, to perform an individual patient data meta-analysis.

14.

Background

Oral pre-exposure prophylaxis (PrEP) can be clinically effective and cost-effective for HIV prevention in high-risk men who have sex with men (MSM). However, individual patients have different risk profiles, real-world populations vary, and no practical tools exist to guide clinical decisions or public health strategies. We introduce a practical model of HIV acquisition, including both a personalized risk calculator for clinical management and a cost-effectiveness calculator for population-level decisions.

Methods

We developed a decision-analytic model of PrEP for MSM. The primary clinical effectiveness and cost-effectiveness outcomes were the number needed to treat (NNT) to prevent one HIV infection, and the cost per quality-adjusted life-year (QALY) gained. We characterized patients according to risk factors including PrEP adherence, condom use, sexual frequency, background HIV prevalence and antiretroviral therapy use.

Results

With standard PrEP adherence and national epidemiologic parameters, the estimated NNT was 64 (95% uncertainty range: 26, 176) at a cost of $160,000 (cost saving, $740,000) per QALY – comparable to other published models. With high (35%) HIV prevalence, the NNT was 35 (21, 57), and cost per QALY was $27,000 (cost saving, $160,000), and with high PrEP adherence, the NNT was 30 (14, 69), and cost per QALY was $3,000 (cost saving, $200,000). In contrast, for monogamous, serodiscordant relationships with partner antiretroviral therapy use, the NNT was 90 (39, 157) and cost per QALY was $280,000 ($14,000, $670,000).
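The per-patient NNT figures above come from straightforward risk arithmetic: the number needed to treat is the reciprocal of the absolute reduction in HIV acquisition risk over the treatment period. The sketch below shows that calculation with assumed inputs; the incidence and efficacy values are placeholders, not the model's published parameters.

```python
# Illustrative NNT arithmetic for a PrEP risk calculator (assumed inputs).
annual_incidence = 0.026      # assumed background HIV incidence for a given risk profile
prep_efficacy    = 0.60       # assumed relative risk reduction at typical adherence
years_on_prep    = 1.0

absolute_risk_reduction = annual_incidence * prep_efficacy * years_on_prep
nnt = 1 / absolute_risk_reduction
print(f"NNT to prevent one infection: {nnt:.0f}")   # ~64 with these inputs
```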

Conclusions

PrEP results vary widely across individuals and populations. Risk calculators may aid in patient education, clinical decision-making, and cost-effectiveness evaluation.

15.

Background

Recent publications have emphasized the importance of a multidisciplinary strategy for maximum conservation and utilization of lung biopsy material for advanced testing, which may determine therapy. This paper quantifies the effect of a multidisciplinary strategy implemented to optimize and increase tissue volume in CT-guided transthoracic needle core lung biopsies. The strategy was three-pronged: (1) once there was confidence that diagnostic tissue had been obtained, and if safe for the patient, additional biopsy passes were performed to further increase the volume of biopsy material; (2) biopsy material was placed in multiple cassettes for processing; and (3) all tissue ribbons were conserved when cutting blocks in the histology laboratory. This study quantifies the effects of strategies #1 and #2.

Design

This retrospective analysis comparing CT-guided lung biopsies from 2007 and 2012 (before and after implementation of the multidisciplinary approach) was performed at a single institution. Patient medical records were reviewed, and the main variables analyzed included biopsy sample size, radiologist, number of blocks submitted, diagnosis, and complications. The biopsy sample size measured was considered to be directly proportional to the tissue volume in the block.

Results

Biopsy sample size increased 2.5-fold, with the average total biopsy sample size increasing from 1.0 cm (0.9–1.1 cm) in 2007 to 2.5 cm (2.3–2.8 cm) in 2012 (P<0.0001). The improvement was statistically significant for each individual radiologist. During the same time, the rate of pneumothorax requiring chest tube placement decreased from 15% to 7% (P = 0.065). No other major complications were identified. The proportion of tumor within the biopsy material was similar at 28% (23%–33%) and 35% (30%–40%) for 2007 and 2012, respectively. The number of cases with at least two blocks available for testing increased from 10.7% to 96.4% (P<0.0001).

Conclusions

This multidisciplinary strategy for CT-guided lung biopsies was effective in significantly increasing the tissue volume and the number of blocks available for advanced diagnostic testing.

16.

Background

Currently there is controversy surrounding the optimal way to treat patients with prostate cancer in the post-prostatectomy setting. Adjuvant therapies carry possible benefits of improved curative results, but there is uncertainty about which patients should receive adjuvant therapy, and there are concerns about exposing a whole population to toxicity for the benefit of only a subset. We hypothesized that making post-prostatectomy treatment decisions using genomics-based risk prediction estimates would improve cancer and quality-of-life outcomes.

Methods

We developed a state-transition model to simulate outcomes over a 10 year horizon for a cohort of post-prostatectomy patients. Outcomes included cancer progression rates at 5 and 10 years, overall survival, and quality-adjusted survival with reductions for treatment, side effects, and cancer stage. We compared outcomes using population-level versus individual-level risk of cancer progression, and for genomics-based care versus usual care treatment recommendations.

Results

Cancer progression outcomes, expected life-years (LYs), and expected quality-adjusted life-years (QALYs) were significantly different when individual genomics-based cancer progression risk estimates were used in place of population-level risk estimates. Use of the genomic classifier to guide treatment decisions provided small, but statistically significant, improvements in model outcomes. We observed an additional 0.03 LYs and 0.07 QALYs, a 12% relative increase in the 5-year recurrence-free survival probability, and a 4% relative reduction in the 5-year probability of metastatic disease or death.

Conclusions

The use of genomics-based risk prediction to guide treatment decisions may improve outcomes for prostate cancer patients. This study offers a framework for individualized decision analysis, and can be extended to incorporate a wide range of personal attributes to enable delivery of patient-centered tools for informed decision-making.

17.

Background

Vitamin K antagonists (VKAs) are an effective anticoagulant treatment for deep venous thrombosis (DVT). However, their use is limited by the risk of bleeding and the need for frequent, long-term laboratory monitoring. Therefore, new oral anticoagulant drugs (NOACs) such as dabigatran, which have lower rates of (major) intracranial bleeding than VKAs and do not require monitoring, may be considered.

Objectives

To estimate resource utilization and costs of patients treated with the VKAs acenocoumarol and phenprocoumon, for the indication DVT. Furthermore, a formal cost-effectiveness analysis of dabigatran compared to VKAs for DVT treatment was performed, using these estimates.

Methods

A retrospective observational study design in the thrombotic service of a teaching hospital (Deventer, The Netherlands) was applied to estimate real-world resource utilization and costs of VKA monitoring. A pooled analysis of data from RE-COVER and RE-COVER II on DVT was used to derive the probabilities of events in the cost-effectiveness model. Dutch costs, utilities, and specific data on coagulation monitoring levels were incorporated in the model. In addition to the base case analysis, univariate probabilistic sensitivity and scenario analyses were performed.

Results

Real-world resource utilization in the thrombotic service for patients treated with VKAs for the indication of DVT consisted of 12.3 measurements of the international normalized ratio (INR), with corresponding INR monitoring costs of €138 for a standardized treatment period of 180 days. In the base case, dabigatran treatment compared to VKAs in a cohort of 1,000 DVT patients resulted in savings of €18,900 (95% uncertainty interval [UI]: -95,832 to 151,162) and 41 (95% UI: -18 to 97) quality-adjusted life-years (QALYs) gained, calculated from a societal perspective. The probability that dabigatran is cost-effective at a conservative willingness-to-pay threshold of €20,000 per QALY was 99%. Sensitivity and scenario analyses also indicated cost savings or cost-effectiveness below this same threshold.

Conclusions

Total INR monitoring costs per patient were estimated at a minimum of €138. Incorporating these real-world data into a cost-effectiveness analysis for patients diagnosed with DVT, dabigatran appeared to be a cost-saving alternative to VKAs in the Netherlands in the base case. Cost savings or favorable cost-effectiveness were robust in sensitivity and scenario analyses. Our results warrant confirmation in other settings and locations.

18.

Background

Atrial fibrillation (AF) can be managed with rhythm- or rate-control strategies. There are few data from routine clinical practice on the frequency with which each strategy is used and their correlates in terms of patients’ clinical characteristics, AF control, and symptom burden.

Methods

RealiseAF was an international, cross-sectional, observational survey of 11,198 patients with AF. The aim of this analysis was to describe patient profiles and symptoms according to the AF management strategy used. A multivariate logistic regression identified factors associated with AF management strategy at the end of the visit.

Results

Among 10,497 eligible patients, 53.7% used a rate-control strategy, compared with 34.5% who used a rhythm-control strategy. In 11.8% of patients, no clear strategy was stated. The proportion of patients with AF-related symptoms (European Heart Rhythm Association [EHRA] class ≥ II) was 78.1% (n = 4,396/5,630) for those using a rate-control strategy vs. 67.8% for those using a rhythm-control strategy (p<0.001). Multivariate logistic regression analysis revealed that age <75 years or the paroxysmal or persistent form of AF favored the choice of a rhythm-control strategy. A change in strategy was infrequent, even in patients with EHRA class ≥ II.

Conclusions

In the RealiseAF routine clinical practice survey, rate control was more commonly used than rhythm control, and a change in strategy was uncommon, even in symptomatic patients. In almost 12% of patients, no clear strategy was stated. Physician awareness regarding optimal management strategies for AF may be improved.

19.

Background

The usefulness of the 2013 ACC/AHA guidelines for the management of blood cholesterol in the Asian population remains controversial. In this study, we investigated whether eligibility for statin therapy determined by the 2013 ACC/AHA guidelines is better aligned with the presence of subclinical coronary atherosclerosis detected by coronary computed tomography angiography (CCTA) than eligibility determined by the previously recommended 2004 NCEP ATP III guidelines.

Methods

We collected data from 5,837 asymptomatic subjects who underwent CCTA using multidetector CT (MDCT) during routine health examinations. Based on risk factor assessment and lipid data, we determined guideline-based eligibility for statin therapy according to the 2013 ACC/AHA and 2004 NCEP ATP III guidelines. We defined the presence and severity of subclinical coronary atherosclerosis detected on CCTA according to the presence of significant coronary artery stenosis (defined as >50% stenosis), plaques, and the degree of coronary calcification.

Results

Compared to the 2004 ATP III guidelines, a significantly higher proportion of subjects with significant coronary stenosis (61.8% vs. 33.8%), plaques (52.3% vs. 24.7%), and higher coronary artery calcium scores (CACS >100; 63.6% vs. 26.5%) was assigned to statin therapy by the 2013 ACC/AHA guidelines (P < .001 for all variables). The areas under the curve of the new guidelines’ pooled cohort equation for detecting significant stenosis, plaques, and higher CACS were significantly greater than those of the Framingham risk calculator.

Conclusions

Compared to the previous ATP III guidelines, the 2013 ACC/AHA guidelines were more sensitive in identifying subjects with subclinical coronary atherosclerosis detected by CCTA in an Asian population.

20.

Objectives

To identify and understand, through data from multiple sources, some of the factors that affect authors’ and editors’ decisions to use reporting guidelines in the publication of health research.

Design

Mixed methods study comprising an online survey and semi-structured interviews with a sample of authors (online survey: n = 56; response rate = 32%; semi-structured interviews: n = 5) and journal editors (online survey: n = 43; response rate = 27%; semi-structured interviews: n = 6) involved in publishing health and medical research. Participants were recruited from an earlier study examining the effectiveness of the TREND reporting guideline.

Results

Four types of factors interacted to affect authors’ and editors’ likelihood of using reporting guidelines: individual (e.g. having multiple reasons for using reporting guidelines); the professional culture in which people work; environmental (e.g. journal policies); and practical (e.g. having time to use reporting guidelines). Having multiple reasons for using reporting guidelines was a particularly salient facilitating factor for both groups of participants.

Conclusions

Improving the completeness and consistency of reporting of research studies is critical to the integrity and synthesis of health research. The use of reporting guidelines offers one potentially efficient and effective means for achieving this, but decisions to use (or not use) reporting guidelines take many factors into account. These findings could be used to inform future studies that might, for example, test the factors that we have identified within a wider theoretical framework for understanding changes in professional practices. The use of reporting guidelines by senior professionals appears to shape the expectations of what constitutes best practice and can be assimilated into the culture of a field or discipline. Without evidence of effectiveness of reporting guidelines, and sustained, multifaceted efforts to improve reporting, little progress seems likely to be made.
