Similar Literature

20 similar records found
1.

Background

Patients with non-dialysis-dependent chronic kidney disease (ND-CKD) often receive an erythropoiesis-stimulating agent (ESA) and oral iron treatment. This study evaluated whether a switch from oral iron to intravenous ferric carboxymaltose can reduce ESA requirements and improve iron status and hemoglobin in patients with ND-CKD.

Methods

This prospective, single-arm, single-center study included adult patients with ND-CKD (creatinine clearance ≤40 mL/min), hemoglobin 11–12 g/dL and iron deficiency (ferritin <100 μg/L or transferrin saturation <20%) who had been regularly treated with oral iron and ESA during the 6 months prior to inclusion. Study patients received an intravenous ferric carboxymaltose dose of 1,000 mg iron, followed by a 6-month ESA/ferric carboxymaltose maintenance regimen (target: hemoglobin 12 g/dL, transferrin saturation >20%). Outcome measures were ESA dose requirements during the observation period after initial ferric carboxymaltose treatment (primary endpoint) and number of hospitalizations and transfusions, renal function before and after ferric carboxymaltose administration, and number of adverse reactions (secondary endpoints). Hemoglobin, mean corpuscular volume, ferritin and transferrin saturation were measured monthly from baseline until end of study; creatinine clearance, proteinuria, C-reactive protein, aspartate aminotransferase, alanine aminotransferase and alkaline phosphatase were measured bimonthly from baseline until end of study.

Results

Thirty patients were enrolled (age 70.1±11.4 years; mean±SD). Mean ESA consumption was significantly reduced by 83.2±10.9% (from 41,839±3,668 IU/patient to 6,879±4,271 IU/patient; p<0.01). Hemoglobin increased by 0.7±0.3 g/dL, ferritin by 196.0±38.7 μg/L and transferrin saturation by 5.3±2.9% (month 6 vs. baseline; all p<0.01). No ferric carboxymaltose-related adverse events were reported and no patient withdrew or required transfusions during the study.

Conclusion

Among patients with ND-CKD and stable normal or borderline hemoglobin, switching from oral iron to intravenous ferric carboxymaltose was associated with significant improvements in hematological and iron parameters and a significant reduction in ESA dose requirements in this single-center pilot study.

Trial Registration

ClinicalTrials.gov NCT02232906

2.
Chronic liver disease and liver cancer associated with chronic hepatitis B (CHB) are leading causes of death among adults in China. Although newborn hepatitis B immunization has successfully reduced the prevalence of CHB in children, about 100 million Chinese adults remain chronically infected. If left unmanaged, 15–25% will die from liver cancer or liver cirrhosis. Antiviral treatment is not necessary for all patients with CHB, but when it is indicated, a good response to treatment prevents disease progression and reduces mortality, morbidity and costly complications. The aim of this study was to analyze the cost-effectiveness of generic and brand antiviral drugs for CHB treatment in China and to assess the thresholds at which a highly potent, low-resistance antiviral drug would be cost-saving and/or cost-effective to introduce in a national treatment program. We developed a Markov simulation model of disease progression using effectiveness and cost data from the medical literature. We measured lifetime costs, quality-adjusted life years (QALYs), incremental cost-effectiveness ratios (ICERs), and clinical outcomes. The no-treatment strategy incurred the highest health care costs ($12,932–$25,293) per patient and the worst health outcomes compared to the antiviral treatment strategies. Monotherapy with either entecavir or tenofovir yielded the most QALYs (14.10–19.02) for both HBeAg-positive and HBeAg-negative patients, with or without cirrhosis. Threshold analysis showed that entecavir or tenofovir treatment would be cost-saving if the drug price were $32–75 (195–460 RMB) per month, highly cost-effective at $62–110 (379–670 RMB) per month and cost-effective at $63–120 (384–734 RMB) per month. This study can support policy decisions regarding the implementation of a national health program for chronic hepatitis B treatment in China at the population level.
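For readers unfamiliar with the mechanics of a Markov cost-effectiveness analysis like the one described above, the following minimal Python sketch shows how lifetime costs, QALYs and an ICER are typically derived from a cohort transition model. It is not the authors' model; the states, transition probabilities, costs and utilities are hypothetical placeholders chosen only to illustrate the calculation.

```python
import numpy as np

def run_cohort(p_progress, annual_cost, utility, years=40, discount=0.03):
    """Return discounted lifetime cost and QALYs per patient for a 3-state cohort."""
    # Row i -> column j gives the annual transition probability from state i to j
    # (states: CHB, cirrhosis, dead). Values are illustrative only.
    P = np.array([
        [1 - p_progress - 0.01, p_progress, 0.01],   # CHB
        [0.0,                   0.90,       0.10],   # cirrhosis
        [0.0,                   0.00,       1.00],   # dead (absorbing)
    ])
    dist = np.array([1.0, 0.0, 0.0])                 # everyone starts in CHB
    cost = qaly = 0.0
    for t in range(years):
        d = 1.0 / (1 + discount) ** t
        cost += d * np.dot(dist, annual_cost)
        qaly += d * np.dot(dist, utility)
        dist = dist @ P
    return cost, qaly

# "No treatment" vs. a hypothetical antiviral that halves progression but adds drug cost.
c0, q0 = run_cohort(0.050, np.array([300.0, 2500.0, 0.0]), np.array([0.85, 0.60, 0.0]))
c1, q1 = run_cohort(0.025, np.array([300.0 + 12 * 60, 2500.0, 0.0]), np.array([0.85, 0.60, 0.0]))
icer = (c1 - c0) / (q1 - q0)
# A negative ICER here would mean the treatment is dominant (cheaper and more effective).
print(f"ICER = ${icer:,.0f} per QALY gained")
```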

3.

Objective

Despite substantial investment in Electronic Medical Record (EMR) systems, there has been little research evaluating them. Our aim was to evaluate changes in efficiency and quality of services after the introduction of a purpose-built EMR system, and to assess its acceptability to the doctors, nurses and patients using it.

Methods

We compared a nine month period before and after the introduction of an EMR system in a large sexual health service, audited a sample of records in both periods and undertook anonymous surveys of both staff and patients.

Results

There were 9,752 doctor consultations (in 5,512 consulting hours) in the Paper Medical Record (PMR) period and 9,145 doctor consultations (in 5,176 consulting hours) in the EMR period eligible for inclusion in the analysis. There were 5% more consultations per hour seen by doctors in the EMR period compared to the PMR period (rate ratio = 1.05; 95% confidence interval, 1.02, 1.08) after adjusting for type of consultation. The qualitative evaluation of 300 records for each period showed no difference in quality (P>0.17). A survey of clinicians demonstrated that doctors and nurses preferred the EMR system (P<0.01), and a patient survey in each period showed no difference in satisfaction with their care (97% for PMR, 95% for EMR, P = 0.61).

Conclusion

The introduction of an integrated EMR improved efficiency while maintaining the quality of the patient record. The EMR was popular with staff and was not associated with a decline in patient satisfaction with the clinical care provided.
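As a quick illustration of the efficiency comparison above, the Python snippet below computes a crude consultations-per-hour rate ratio with a Poisson/Wald confidence interval from the counts reported in the abstract. The published rate ratio (1.05) was adjusted for consultation type, so this unadjusted figure will not match it exactly.

```python
import math

# Consultation counts and consulting hours as reported in the abstract.
pmr_consults, pmr_hours = 9752, 5512
emr_consults, emr_hours = 9145, 5176

rr = (emr_consults / emr_hours) / (pmr_consults / pmr_hours)
se_log_rr = math.sqrt(1 / emr_consults + 1 / pmr_consults)   # Poisson variance of the log rate ratio
lo = rr * math.exp(-1.96 * se_log_rr)
hi = rr * math.exp(+1.96 * se_log_rr)
print(f"crude rate ratio = {rr:.3f} (95% CI {lo:.3f}, {hi:.3f})")
```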

4.

Purpose

To investigate the influence of daily oral iron supplementation on changes in hemoglobin mass (Hbmass) and iron parameters after 2–4 weeks of moderate altitude exposure.

Methods

Hematological data collected from 178 athletes (98 males, 80 females) exposed to moderate altitude (1,350–3,000 m) were analysed using linear regression to determine how altitude exposure combined with oral iron supplementation influenced Hbmass, total iron incorporation (TII) and blood iron parameters [ferritin and transferrin saturation (TSAT)].

Results

Altitude exposure (mean ± s: 21 ± 3 days) increased Hbmass by 1.1% [-0.4, 2.6], 3.3% [1.7, 4.8], and 4.0% [2.0, 6.1] from pre-altitude levels in athletes who ingested nil, 105 mg, and 210 mg of oral iron supplement daily, respectively. Serum ferritin levels decreased by 33.2% [-46.9, -15.9] and 13.8% [-32.2, 9.7] from pre-altitude levels in athletes who supplemented with nil and 105 mg of oral iron daily, but increased by 36.8% [1.3, 84.8] in athletes supplemented with 210 mg of oral iron daily. Finally, athletes who ingested either 105 mg or 210 mg of oral iron supplement daily had a greater TII compared with non-supplemented athletes (0 versus 105 mg: effect size (d) = -1.88 [-2.56, -1.17]; 0 versus 210 mg: effect size (d) = -2.87 [-3.88, -1.66]).

Conclusion

Oral iron supplementation during 2–4 weeks of moderate altitude exposure may enhance Hbmass production and assist the maintenance of iron balance in some athletes with low pre-altitude iron stores.

5.

Introduction

Starting in June 2010, the Infectious Diseases Institute (IDI) clinic (a large urban HIV outpatient facility) switched from paper records, which data-entry clerks entered into the database, to provider-based Electronic Medical Records (EMR). Standardized clinic forms were eliminated, but providers still record free-text clinical notes in patients' physical files. The objective of this study was to compare the rate of errors in the database before and after the introduction of the provider-based EMR.

Methods and Findings

Data in the database before and after the introduction of the provider-based EMR were compared with the information in the patients' files and classified as correct, incorrect, or missing. We calculated the proportions of incorrect, missing, and total errors for key variables (toxicities, opportunistic infections, reasons for treatment change and interruption). Proportions of total errors were compared using the chi-square test. A survey of the users of the EMR was also conducted. We compared data from 2,382 visits (from 100 individuals) of a retrospective validation conducted in 2007 with 34,957 visits (from 10,920 individuals) of a prospective validation conducted in April–August 2011. The total proportion of errors decreased from 66.5% in 2007 to 2.1% in 2011 for opportunistic infections, from 51.9% to 3.5% for ART toxicity, from 82.8% to 12.5% for reasons for ART interruption, and from 94.1% to 0.9% for reasons for ART switch (all P<0.0001). The survey showed that 83% of the providers agreed that the provider-based EMR led to improvement of clinical care, 80% reported improved access to patients' records, and 80% appreciated the automation of providers' tasks.

Conclusions

The introduction of provider-based EMR improved the quality of data collected, with a significant reduction in missing and incorrect information. The majority of providers and clients expressed satisfaction with the new system. We recommend the use of provider-based EMR in large HIV programs in Sub-Saharan Africa.
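The chi-square comparison of error proportions described above can be reproduced in outline as follows. The error counts here are back-calculated approximations from the abstract's opportunistic-infection figures (66.5% of 2,382 visits and 2.1% of 34,957 visits), not the authors' raw data.

```python
from scipy.stats import chi2_contingency

visits_2007, visits_2011 = 2382, 34957
errors_2007 = round(0.665 * visits_2007)   # approximate, reconstructed from the abstract
errors_2011 = round(0.021 * visits_2011)

table = [
    [errors_2007, visits_2007 - errors_2007],
    [errors_2011, visits_2011 - errors_2011],
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p:.3g}")
```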

6.

Background

There are few published estimates of the cost of pediatric antiretroviral therapy (ART) in Africa. Our objective was to estimate the outpatient cost of providing ART to children remaining in care at six public sector clinics in Zambia during the first three years after ART initiation, stratified by service delivery site and time on treatment.

Methods

Data on resource utilization (drugs, diagnostics, outpatient visits, fixed costs) and treatment outcomes (in care, died, lost to follow up) were extracted from medical records for 1,334 children at six sites who initiated ART at <15 years of age between 2006 and 2011. Fixed and variable unit costs (reported in 2011 USD) were estimated from the provider’s perspective using site level data.

Results

Median age at ART initiation was 4.0 years; median CD4 percentage was 14%. One year after ART initiation, 73% of patients remained in care, ranging from 60% to 91% depending on site. The average annual outpatient cost per patient remaining in care was $209 (95% CI, $199–$219), ranging from $116 (95% CI, $107–$126) to $516 (95% CI, $499–$533) depending on site. Average annual costs decreased as time on treatment increased. Antiretroviral drugs were the largest component of all outpatient costs (>50%) at four sites. At the two remaining sites, outpatient visits and fixed costs together accounted for >50% of outpatient costs. The distribution of costs is slightly skewed, with median costs 3% to 13% lower than average costs during the first year after ART initiation depending on site.

Conclusions

Outpatient costs for children initiating ART in Zambia are low and comparable to reported outpatient costs for adults. Outpatient costs and retention in care vary widely by site, suggesting opportunities for efficiency gains. Taking advantage of such opportunities will help ensure that targets for pediatric treatment coverage can be met.

7.

Background

Some retrospective studies have found that HIV-infected women have a higher mortality risk than men after adjusting for baseline characteristics, while others have not. Anemia is a known predictor of HIV-related mortality. We assessed whether anemia contributed to the sex difference in mortality in our cohort.

Methods

We conducted a retrospective cohort study among HIV-infected persons in care at the Comprehensive Care Center (Nashville, TN) between 1998 and 2009. Cox proportional hazards models compared time from first clinic visit to death and AIDS-defining events (ADE), adjusted for baseline characteristics with and without baseline hemoglobin.

Results

Of 3,633 persons, 879 (24%) were women. Women had lower median baseline hemoglobin compared to men: 12.4 g/dL (inter-quartile range (IQR) 11.3–13.4) vs. 14.4 (IQR 13.1–15.5), respectively (P<0.001). In multivariable models without hemoglobin, the risk of death was higher among women: hazard ratio (HR) 1.46 (95% confidence interval (CI) 1.17, 1.82; P = 0.001). In multivariable models with hemoglobin, the risk of death in women was diminished and no longer statistically significant: HR 1.2 (95% CI 0.93, 1.55; P = 0.17). The risk of ADE was higher among women in both models, but not statistically significant: HR 1.1 (95% CI 0.85–1.42; P = 0.46) in the model without hemoglobin and 1.11 (95% CI 0.82–1.48; P = 0.50) in the model with hemoglobin. Hemoglobin was a strong predictor of death: HR 0.88 per 1 g/dL increase (95% CI 0.83, 0.93; P<0.001).

Conclusion

In our study population of HIV-infected persons in care, women had lower baseline hemoglobin, and lower hemoglobin contributed to their higher risk of ADE and death.
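A minimal sketch of the modelling strategy described above, fitting a Cox proportional hazards model with and without baseline hemoglobin and comparing the sex hazard ratio, might look like this in Python with lifelines. The synthetic data and column names are assumptions for demonstration, not the study cohort.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
female = rng.integers(0, 2, n)
# Women are given lower hemoglobin on average, mirroring the abstract.
hgb = np.where(female == 1, rng.normal(12.4, 1.5, n), rng.normal(14.4, 1.7, n))
# Simulate survival where lower hemoglobin (not sex itself) raises the hazard.
hazard = 0.05 * np.exp(-0.15 * (hgb - 13.0))
time = rng.exponential(1.0 / hazard)
death = (time < 8.0).astype(int)          # administrative censoring at 8 years
time = np.minimum(time, 8.0)

df = pd.DataFrame({"time": time, "death": death, "female": female, "hgb": hgb})

# The 'female' hazard ratio shrinks toward 1 once hemoglobin enters the model.
for covs in (["female"], ["female", "hgb"]):
    cph = CoxPHFitter().fit(df[["time", "death"] + covs],
                            duration_col="time", event_col="death")
    print(covs, "HR(female) =", round(cph.hazard_ratios_["female"], 2))
```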

8.

Background

Hemoglobin (Hb) levels are regarded as an important determinant of outcome in a number of cancers treated with radiotherapy. However, for patients treated with intensity modulated radiotherapy (IMRT), information regarding the prognostic value of hemoglobin level is scarce.

Patients and Methods

A total of 650 patients with nasopharyngeal carcinoma (NPC), enrolled between May 2005 and November 2012, were included in this study. The prognostic significance of hemoglobin level (anemic vs. non-anemic) was investigated at three time points: before treatment, during treatment, and in the last week of treatment. Univariate and multivariate analyses were conducted using the log-rank test and the Cox proportional hazards model, respectively.

Results

The 5-year OS (overall survival) rates of patients who were anemic and non-anemic before treatment were 89.1% and 80.7%, respectively (P = 0.01). The 5-year DMFS (distant metastasis-free survival) rates of patients who were anemic and non-anemic before treatment were 88.9% and 78.2%, respectively (P = 0.01). The 5-year OS rates of patients who were anemic and non-anemic during treatment were 91.7% and 83.3% (P = 0.004). According to multivariate analysis, the pre-treatment Hb level predicted a decreased DMFS (P = 0.007, HR = 2.555, 95% CI 1.294–5.046). In addition, the mid-treatment Hb level predicted a decreased OS (P = 0.013, HR = 2.333, 95% CI 1.199–4.541).

Conclusions

Hemoglobin level is a useful prognostic factor in NPC patients receiving IMRT. It is important to control the level of hemoglobin both before and during chemoradiotherapy.

9.
Background

Shorter, safer, and cheaper tuberculosis (TB) preventive treatment (TPT) regimens will enhance uptake and effectiveness. WHO developed target product profiles describing minimum requirements and optimal targets for key attributes of novel TPT regimens. We performed a cost-effectiveness analysis addressing the scale-up of regimens meeting these criteria in Brazil, a setting with relatively low transmission and low HIV and rifampicin-resistant TB (RR-TB) prevalence, and South Africa, a setting with higher transmission and higher HIV and RR-TB prevalence.

Methods and findings

We used outputs from a model simulating scale-up of TPT regimens meeting minimal and optimal criteria. We assumed that drug costs for minimal and optimal regimens were identical to 6 months of daily isoniazid (6H). The minimal regimen lasted 3 months, with 70% completion and 80% efficacy; the optimal regimen lasted 1 month, with 90% completion and 100% efficacy. Target groups were people living with HIV (PLHIV) on antiretroviral treatment and household contacts (HHCs) of identified TB patients. The status quo was 6H at 2019 coverage levels for PLHIV and HHCs. We projected TB cases and deaths, TB-associated disability-adjusted life years (DALYs), and costs (in 2020 US dollars) associated with TB from a TB services perspective from 2020 to 2035, with 3% annual discounting. We estimated the expected costs and outcomes of scaling up 6H, the minimal TPT regimen, or the optimal TPT regimen to reach all eligible PLHIV and HHCs by 2023, compared to the status quo. Maintaining current 6H coverage in Brazil (0% of HHCs and 30% of PLHIV treated) would be associated with 1.1 (95% uncertainty range [UR] 1.1–1.2) million TB cases, 123,000 (115,000–132,000) deaths, and 2.5 (2.1–3.1) million DALYs and would cost $1.1 ($1.0–$1.3) billion during 2020–2035. Expanding the 6H, minimal, or optimal regimen to 100% coverage among eligible groups would reduce DALYs by 0.5% (95% UR 1.2% reduction, 0.4% increase), 2.5% (1.8%–3.0%), and 9.0% (6.5%–11.0%), respectively, with additional costs of $107 ($95–$117) million and $51 ($41–$60) million and savings of $36 ($14–$58) million, respectively. Compared to the status quo, costs per DALY averted were $7,608 and $808 for scaling up the 6H and minimal regimens, respectively, while the optimal regimen was dominant (cost savings, reduced DALYs). In South Africa, maintaining current 6H coverage (0% of HHCs and 69% of PLHIV treated) would be associated with 3.6 (95% UR 3.0–4.3) million TB cases, 843,000 (598,000–1,201,000) deaths, and 36.7 (19.5–58.0) million DALYs and would cost $2.5 ($1.8–$3.6) billion. Expanding coverage with the 6H, minimal, or optimal regimen would reduce DALYs by 6.9% (95% UR 4.3%–95%), 15.5% (11.8%–18.9%), and 38.0% (32.7%–43.0%), respectively, with additional costs of $79 (−$7, $151) million and $40 (−$52, $140) million and savings of $608 ($443–$832) million, respectively. Compared to the status quo, estimated costs per DALY averted were $31 and $7 for scaling up the 6H and minimal regimens, while the optimal regimen was dominant. Study limitations included the focus on 2 countries, and no explicit consideration of costs incurred before the decision to prescribe TPT.

Conclusions

Our findings suggest that scale-up of TPT regimens meeting minimum or optimal requirements would likely have important impacts on TB-associated outcomes and would likely be cost-effective or cost saving.

Placide Nsengiyumva and colleagues analyze costs and cost-effectiveness of scaling up target regimens for Tuberculosis Preventive Treatment among persons living with HIV and household contacts of TB patients in Brazil and South Africa.
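The summary measures used in this analysis (3% annual discounting, cost per DALY averted, and dominance) can be illustrated with the short Python sketch below. The yearly cost and DALY streams are hypothetical placeholders, not outputs of the authors' transmission model.

```python
# Discount annual cost and DALY streams at 3%, then compute the incremental cost
# per DALY averted versus the status quo, flagging dominance (cost saving plus fewer DALYs).
def present_value(stream, rate=0.03):
    return sum(x / (1 + rate) ** t for t, x in enumerate(stream))

def cost_per_daly_averted(cost_new, daly_new, cost_base, daly_base):
    d_cost = present_value(cost_new) - present_value(cost_base)
    d_daly = present_value(daly_base) - present_value(daly_new)   # DALYs averted
    if d_cost <= 0 and d_daly > 0:
        return "dominant (cost saving, fewer DALYs)"
    return d_cost / d_daly

# Hypothetical 16-year (2020-2035) annual streams for a status quo and a new regimen.
base_cost, base_daly = [70e6] * 16, [160e3] * 16
new_cost,  new_daly  = [73e6] * 16, [150e3] * 16
print(cost_per_daly_averted(new_cost, new_daly, base_cost, base_daly))
```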

10.

Introduction

The effect of hemoglobin levels on the weaning outcomes of mechanically ventilated patients remains under debate, particularly for patients who are difficult to wean. This study aims to evaluate the effect of hemoglobin levels on weaning outcomes in difficult-to-wean patients.

Methods

This retrospective cohort study was conducted in a university-affiliated teaching hospital in Taiwan. Patients who fulfilled the criteria of difficult weaning were enrolled. Medical records were reviewed to obtain data on hemograms, biochemistry tests, transfusion records, comorbidities and weaning outcome. The association between hemoglobin levels and 30-day weaning outcomes was evaluated using a logistic regression model.

Results

A total of 751 patients received mechanical ventilation during the study period, 138 of whom fulfilled the criteria of difficult weaning. Compared with the patients whose hemoglobin was <8 g/dL, those with higher hemoglobin levels were more likely to be successfully weaned (odds ratio [OR], 3.69; 95% CI, 1.22–11.15 for hemoglobin 8–10 g/dL and OR, 4.16, 95% CI, 1.30–13.29 for hemoglobin >10 g/dL). Multivariate analysis showed that the odds ratio for weaning success remained significant for hemoglobin levels of 8–10 g/dL (adjusted OR, 3.3; 95% CI, 1.07–10.15) with borderline significance for hemoglobin level > 10 g/dL (adjusted OR, 2.95, 95% CI, 0.88–9.96).

Conclusions

Hemoglobin level is independently associated with weaning outcome in difficult-to-wean patients. Further studies are needed to evaluate whether a restrictive transfusion trigger for acute critical illness is also appropriate for such patients.
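A minimal sketch of the type of logistic-regression analysis described above, regressing weaning success on hemoglobin category with hemoglobin <8 g/dL as the reference group, is shown below. The data are synthetic and the effect sizes are placeholders, not the study's results.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 300
hgb = rng.uniform(6.5, 12.5, n)
cat_8_10 = ((hgb >= 8) & (hgb <= 10)).astype(int)   # hemoglobin 8-10 g/dL
cat_gt10 = (hgb > 10).astype(int)                   # hemoglobin > 10 g/dL
# Simulate a success probability that rises with hemoglobin category (illustrative only).
logit = -1.0 + 1.2 * cat_8_10 + 1.4 * cat_gt10
success = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(pd.DataFrame({"hgb_8_10": cat_8_10, "hgb_gt10": cat_gt10}))
fit = sm.Logit(success, X).fit(disp=0)
odds_ratios = np.exp(fit.params)
conf = np.exp(fit.conf_int())                       # 95% CI on the odds-ratio scale
print(pd.concat([odds_ratios.rename("OR"), conf], axis=1))
```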

11.

Background

The healthcare costs of cancer care are highest in the last month of life. The effect of hospice care on end-of-life (EOL) healthcare costs is not clearly understood.

Purpose

The purpose of this study was to evaluate the effect of hospice care on survival and healthcare costs for lung cancer patients in their final month of life.

Methods

We used Taiwan's National Health Insurance Research Claims Database to analyze data for 3,399 adult lung cancer patients who died in 1997–2011. A logistic regression analysis was performed to determine the predictors of high healthcare cost, defined as costs falling above the 90th percentile. Patients who received hospice care were assigned to a hospice (H) group and those who did not were assigned to a non-hospice (non-H) group.

Results

The patients in the H group had a longer mean (median) survival time than those in the non-H group (1.40 ± 1.61 y (0.86) vs. 1.10 ± 1.47 y (0.61), p<0.001). The non-H group had a lower mean healthcare cost than the H group (US $1,821 ± 2,441 vs. US $1,839 ± 1,638, p<0.001). A total of 340 patients (10%) had healthcare costs exceeding the 90th percentile (US $4,721), which was used as the cutoff value for high cost. The non-H group had a higher risk of high cost than the H group because many more cases in the non-H group had lower costs. Moreover, a higher risk of high healthcare costs was predicted for patients who did not receive hospice care (odds ratio [OR]: 3.68, 95% confidence interval [CI]: 2.44–5.79), received chemotherapy (OR: 1.51, 95% CI: 1.18–1.96) or intubation (OR: 2.63, 95% CI: 1.64–4.16), had more emergency department visits (OR: 1.78, 95% CI: 1.24–2.52), had longer hospital admissions in days (OR: 1.08, 95% CI: 1.07–1.09), or received radiotherapy (OR: 1.33, 95% CI: 1.00–1.78). Lower risks of high healthcare costs were observed in patients with low socioeconomic status (OR: 0.58, 95% CI: 0.40–0.83) or previous employment (OR: 0.66, 95% CI: 0.47–0.92). After propensity-score matching, the patients in the non-H group had a higher mean cost and a higher risk of high cost. Similar results were obtained from logistic regression analysis in propensity score-matched patients.

Conclusions

Survival in the hospice group was longer than in the non-H group, and patients in the non-H group were 3.74 times more likely to have high healthcare costs at EOL. The positive predictors of high healthcare costs were not receiving hospice care, receiving chemotherapy and intubation, having more emergency department visits and longer hospital admissions, and receiving radiotherapy. Negative predictors were low socioeconomic status and previous employment. How to reduce the high healthcare costs of patients with lung cancer in the last month of life is a challenge for policy makers and health care providers.
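The abstract mentions propensity-score matching without detailing the procedure; the sketch below shows a generic 1:1 nearest-neighbour matching on the logit of the propensity score, which is one common way such an analysis is done. The covariates, data, and matching choices are assumptions, not the authors' specification.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(2)
n = 1000
age = rng.normal(70, 10, n)
ses_low = rng.integers(0, 2, n)
# Simulate hospice use depending on the covariates (illustrative only).
hospice = rng.binomial(1, 1 / (1 + np.exp(-(-3.5 + 0.04 * age - 0.3 * ses_low))))

X = np.column_stack([age, ses_low])
ps = LogisticRegression(max_iter=1000).fit(X, hospice).predict_proba(X)[:, 1]
logit_ps = np.log(ps / (1 - ps)).reshape(-1, 1)

treated = np.where(hospice == 1)[0]
control = np.where(hospice == 0)[0]
# Match each hospice patient to the nearest non-hospice patient (with replacement).
nn = NearestNeighbors(n_neighbors=1).fit(logit_ps[control])
_, idx = nn.kneighbors(logit_ps[treated])
matched_controls = control[idx.ravel()]
print(f"{len(treated)} hospice patients matched to {len(set(matched_controls))} unique controls")
```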

12.
The metabolism of hepcidin is profoundly modified in chronic kidney disease (CKD). We investigated its relation to iron disorders, inflammation and hemoglobin (Hb) level in 199 non-dialyzed, non-transplanted patients with CKD stages 1–5. All had their glomerular filtration rate measured by 51Cr-EDTA renal clearance (mGFR), as well as measurements of iron markers including hepcidin and of erythropoietin (EPO). Hepcidin varied from 0.2 to 193 ng/mL. The median increased from 23.3 ng/mL [8.8–28.7] to 36.1 ng/mL [14.1–92.3] when mGFR decreased from ≥60 to <15 mL/min/1.73 m2 (p = 0.02). Patients with absolute iron deficiency (transferrin saturation (TSAT) <20% and ferritin <40 ng/mL) had the lowest hepcidin levels (5.0 ng/mL [0.7–11.7]), and those with a normal iron profile (TSAT ≥20% and ferritin ≥40), the highest (34.5 ng/mL [23.7–51.6]). In multivariate analysis, absolute iron deficiency was associated with lower hepcidin values, and inflammation combined with a normal or functional iron profile with higher values, independent of other determinants of hepcidin concentration, including EPO, mGFR, and albuminemia. The hepcidin level, although it rose overall when mGFR declined, collapsed in patients with absolute iron deficiency. There was a significant interaction with iron status in the association between Hb and hepcidin. Except in absolute iron deficiency, hepcidin’s negative association with Hb level indicates that it is not down-regulated in CKD anemia.

13.
J L Reynolds. CMAJ. 1995;153(3):275–282.
OBJECTIVE: To determine the effectiveness of a continuous quality improvement (CQI) program in reducing episiotomy rates. DESIGN: Before-and-after study; CQI methods were used to examine the process of care during labour and birth. INTERVENTIONS: Identification of care factors that would increase the probability of episiotomy. Implementation of initiatives that would change the process of care to minimize the probability of episiotomy. Educational strategies included promotion of better understanding of what constitutes an appropriate episiotomy rate and ways to reduce maternal exhaustion and true fetal distress as well as manoeuvres to protect the perineum during birth. SETTING: Low-risk family practice obstetrics service in a tertiary care hospital in southwestern Ontario. PARTICIPANTS: All 102 family physicians at the study hospital who provided intrapartum care in the year before and the year during which the CQI program was implemented and the women for whom the care was provided (approximately 1,400 per year). OUTCOME MEASURES: Episiotomy rates (overall, among primiparous and multiparous women, and among individual family physicians) and rates of perineal tear, perineal infection and postpartum readmission. RESULTS: Although the planned reduction in the episiotomy rate was not achieved during the study period, the overall rate decreased significantly from 44.5% to 33.3% (p < 0.001). Among the primiparous women the rate decreased from 57.6% to 46.2% (p < 0.001) and among the multiparous women from 34.3% to 23.6% (p < 0.001). The reduced episiotomy rate among the primiparous women was associated with a significant decrease in the rate of third- and fourth-degree perineal tears and a significant increase in the number of women giving birth with an intact perineum or a minor (first-degree) tear. These benefits were not seen among the multiparous women, whose decreased episiotomy rate was associated with a significant increase in the number of women experiencing a second-degree perineal tear. During the intervention period, there was no increase in the rates of vaginal trauma or postpartum bleeding, infection or readmission because of complications related to perineal trauma. The episiotomy rates for most physicians decreased significantly during the intervention period. CONCLUSIONS: The CQI model may be useful in modifying clinical practices such as episiotomy because it focuses on understanding the process of care and the environment in which care is provided, both of which may have a major impact on physician behaviour. Further study is needed to ascertain the sustainability of the effects of this approach and which components of the model had the greatest effect.

14.

Objectives

We examined the charges, their variability, and the respective payer groups for diagnosis and treatment of the ten most common outpatient conditions presenting to the emergency department (ED).

Methods

We conducted a cross-sectional study of the 2006–2008 Medical Expenditure Panel Survey. Analysis was limited to outpatient visits by non-elderly adult patients (ages 18–64) with a single discharge diagnosis.

Results

We studied 8,303 ED encounters, representing 76.6 million visits. Median charges ranged from $740 (95% CI $651–$817) for an upper respiratory infection to $3,437 (95% CI $2,917–$3,877) for a kidney stone. The median charge for all ten outpatient conditions in the ED was $1,233 (95% CI $1,199–$1,268), with a high degree of charge variability. All diagnoses had an interquartile range (IQR) greater than $800, with 60% of IQRs greater than $1,550.

Conclusion

Emergency department charges for common conditions are high and highly variable. Greater acute care charge transparency will at least allow patients and providers to be aware of the emergency department charges patients may face in the current health care system.

15.
Currently there are no satisfactory methods for estimating dietary iron absorption (bioavailability) at a population level, but this is essential for deriving dietary reference values using the factorial approach. The aim of this work was to develop a novel approach for estimating dietary iron absorption using a population sample from a sub-section of the UK National Diet and Nutrition Survey (NDNS). Data were analyzed in 873 subjects from the 2000–2001 adult cohort of the NDNS, for whom both dietary intake data and hematological measures (hemoglobin and serum ferritin (SF) concentrations) were available. There were 495 men aged 19–64 y (mean age 42.7±12.1 y) and 378 pre-menopausal women (mean age 35.7±8.2 y). Individual dietary iron requirements were estimated using the Institute of Medicine calculations. A full probability approach was then applied to estimate the prevalence of dietary intakes that were insufficient to meet the needs of the men and women separately, based on their estimated daily iron intake and a series of absorption values ranging from 1–40%. The prevalence of SF concentrations below selected cut-off values (indicating that absorption was not high enough to maintain iron stores) was derived from individual SF concentrations. An estimate of dietary iron absorption required to maintain specified SF values was then calculated by matching the observed prevalence of insufficiency with the prevalence predicted for the series of absorption estimates. Mean daily dietary iron intakes were 13.5 mg for men and 9.8 mg for women. Mean calculated dietary absorption was 8% in men (50th percentile for SF 85 µg/L) and 17% in women (50th percentile for SF 38 µg/L). At a ferritin level of 45 µg/L estimated absorption was similar in men (14%) and women (13%). This new method can be used to calculate dietary iron absorption at a population level using data describing total iron intake and SF concentration.
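The matching step at the heart of this approach can be sketched in a few lines of Python: for a grid of candidate absorption values, compute the proportion of people whose absorbed iron (intake × absorption) falls below their individual requirement, then pick the absorption value whose predicted prevalence of insufficiency equals the observed prevalence of low serum ferritin. The intakes, requirements, and observed prevalence below are synthetic placeholders, not NDNS data.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500
intake_mg = rng.normal(9.8, 2.5, n)            # daily dietary iron intake per person
requirement_mg = rng.normal(1.3, 0.4, n)       # daily absorbed-iron requirement per person
observed_prev_low_ferritin = 0.17              # observed share below the SF cut-off (assumed)

absorptions = np.arange(0.01, 0.41, 0.01)      # candidate absorption values, 1% to 40%
predicted_prev = np.array([
    np.mean(intake_mg * a < requirement_mg) for a in absorptions
])
# Choose the absorption value whose predicted prevalence best matches the observed one.
best = absorptions[np.argmin(np.abs(predicted_prev - observed_prev_low_ferritin))]
print(f"estimated population absorption ≈ {best:.0%}")
```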

16.

Objective

To describe severity of anemia and explore its determinants among children under 36 months old in rural western China.

Study Design

The family information of 6711 children was collected and their hemoglobin was measured in 2005. A generalized estimated equation (GEE) linear model was used to identify the determinants of severity of childhood anemia.

Results

The prevalence of mild, moderate and severe anemia among these children was 27.4%, 21.9% and 3.2%, respectively. GEE model analysis showed that province-level region and severity of maternal anemia affected the severity of childhood anemia both at 0–5 months and beyond 5 months of age. In addition, children aged 0–5 months in families using an iron pot (coefficient = −0.26, 95% CI −0.41 to −0.12) had less severe anemia, and children aged 6–36 months in families with more than 4 members (coefficient = −0.03, 95% CI −0.06 to −0.01) or of Han ethnicity (coefficient = −0.08, 95% CI −0.13 to −0.04) had less severe anemia, whereas boys (coefficient = 0.03, 95% CI 0.01 to 0.06) and younger children (6–11 months vs. 30–36 months: coefficient = 0.23, 95% CI 0.17 to 0.28; 12–17 months vs. 30–36 months: coefficient = 0.19, 95% CI 0.15 to 0.24; 18–23 months vs. 30–36 months: coefficient = 0.09, 95% CI 0.04 to 0.13) had more severe anemia.

Conclusion

The prevalence of moderate-to-severe anemia in these children was about 25%. Province-level region, iron pot use, family size, ethnicity, age and gender of children and severity of maternal anemia were important determinants of the severity of childhood anemia. These findings have some important implications for health policy decisions on childhood anemia in rural western China.
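A minimal sketch of a GEE linear model like the one named in the abstract is shown below, using statsmodels with an exchangeable working correlation. The clustering unit (village), the variable names, and the data are assumptions for demonstration only.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 400
df = pd.DataFrame({
    "village": rng.integers(0, 40, n),       # hypothetical clustering unit
    "iron_pot": rng.integers(0, 2, n),
    "boy": rng.integers(0, 2, n),
    "age_months": rng.integers(0, 37, n),
})
# Synthetic continuous anemia-severity score depending on the covariates.
df["severity"] = (0.5 - 0.2 * df["iron_pot"] + 0.05 * df["boy"]
                  - 0.005 * df["age_months"] + rng.normal(0, 0.3, n))

model = smf.gee("severity ~ iron_pot + boy + age_months",
                groups="village", data=df,
                family=sm.families.Gaussian(),
                cov_struct=sm.cov_struct.Exchangeable())
print(model.fit().summary().tables[1])
```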

17.
18.

Background

Governments and international donors have partnered to provide free HIV treatment to over 6 million individuals in low and middle-income countries. Understanding the determinants of HIV treatment costs will help improve efficiency and provide greater certainty about future resource needs.

Methods and Findings

We collected data on HIV treatment costs from 54 clinical sites in Botswana, Ethiopia, Mozambique, Nigeria, Uganda, and Vietnam. Sites provided free HIV treatment funded by the U.S. President’s Emergency Plan for AIDS Relief (PEPFAR), national governments, and other partners. Service delivery costs were categorized into successive six-month periods from the date when each site began HIV treatment scale-up. A generalized linear mixed model was used to investigate relationships between site characteristics and per-patient costs, excluding ARV expenses. With predictors at their mean values, average annual per-patient costs were $177 (95% CI: 127–235) for pre-ART patients, $353 (255–468) for adult patients in the first 6 months of ART, and $222 (161–296) for adult patients on ART for >6 months (excludes ARV costs). Patient volume (no. patients receiving treatment) and site maturity (months since clinic began providing treatment services) were both strong independent predictors of per-patient costs. Controlling for other factors, costs declined by 43% (18–63) as patient volume increased from 500 to 5,000 patients, and by 28% (6–47) from 5,000 to 10,000 patients. For site maturity, costs dropped 41% (28–52) between months 0–12 and 25% (15–35) between months 12–24. Price levels (proxied by per-capita GDP) were also influential, with costs increasing by 22% (4–41) for each doubling in per-capita GDP. Additionally, the frequency of clinical follow-up, frequency of laboratory monitoring, and clinician-patient ratio were significant independent predictors of per-patient costs.

Conclusions

Substantial reductions in per-patient service delivery costs occur as sites mature and patient cohorts increase in size. Other predictors suggest possible strategies to reduce per-patient costs.

19.
20.

Background

Most adults dying from falciparum malaria will die within 48 hours of their hospitalisation. An essential component of early supportive care is the rapid identification of patients at greatest risk. In resource-poor settings, where most patients with falciparum malaria are managed, decisions regarding patient care must frequently be made using clinical evaluation alone.

Methods

We retrospectively analysed 4 studies of 1801 adults with severe falciparum malaria to determine whether the presence of simple clinical findings might assist patient triage.

Results

If present on admission, shock, oligo-anuria, hypo- or hyperglycaemia, an increased respiratory rate, a decreased Glasgow Coma Score and an absence of fever were independently predictive of death. The variables were used to construct a simple clinical algorithm. When applied to the 1801 patients, this algorithm’s positive predictive value for survival to 48 hours was 99.4% (95% confidence interval (CI) 97.8–99.9%) and for survival to discharge 96.9% (95% CI 94.3–98.5%). In the 712 patients receiving artesunate, the algorithm’s positive predictive value for survival to 48 hours was 100% (95% CI 97.3–100%) and to discharge was 98.5% (95% CI 94.8–99.8%).

Conclusions

Simple clinical findings are closely linked to the pathophysiology of severe falciparum malaria in adults. A basic algorithm employing these indices can facilitate the triage of patients in settings where intensive care services are limited. Patients classified as low-risk by this algorithm can be safely managed initially on a general ward whilst awaiting senior clinical review and laboratory data.
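A rule-based triage check in the spirit of the algorithm described above, together with the positive-predictive-value calculation used to evaluate it, might look like the Python sketch below. The specific thresholds and the way the findings are combined are assumptions for illustration, not the published algorithm, and the PPV counts are made up.

```python
def high_risk(obs):
    """Flag a patient as high risk if any of the admission findings named in the abstract is present."""
    return (obs.get("shock") or obs.get("oligo_anuria")
            or obs.get("glucose_mmol_l", 5.0) < 2.2 or obs.get("glucose_mmol_l", 5.0) > 10.0
            or obs.get("resp_rate", 16) > 30
            or obs.get("gcs", 15) < 11
            or obs.get("temp_c", 38.0) < 37.5)       # absence of fever

def positive_predictive_value(true_pos, false_pos):
    return true_pos / (true_pos + false_pos)

patient = {"shock": False, "oligo_anuria": False, "glucose_mmol_l": 6.1,
           "resp_rate": 22, "gcs": 15, "temp_c": 38.6}
print("low risk" if not high_risk(patient) else "high risk")
# PPV for survival to 48 h among patients classified low-risk, with illustrative counts
# (e.g. 497 survivors out of 500 classified low-risk).
print(f"PPV = {positive_predictive_value(497, 3):.1%}")
```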
