Similar Articles (20 results)
1.

Background

The United States Public Health Service (USPHS) Guideline for Treating Tobacco Use and Dependence includes ten key recommendations regarding the identification and the treatment of tobacco users seen in all health care settings. To our knowledge, the impact of system-wide brief interventions with cigarette smokers on smoking prevalence and health care utilization has not been examined using patient population-based data.

Methods and Findings

Data on clinical interventions with cigarette smokers were examined for primary care office visits of 104,639 patients at 17 Harvard Vanguard Medical Associates (HVMA) sites. An operational definition of “systems change” was developed. It included thresholds for intervention frequency and sustainability. Twelve sites met the criteria. Five did not. Decreases in self-reported smoking prevalence were 40% greater at sites that achieved systems change (13.6% vs. 9.7%, p<0.01). On average, the likelihood of quitting increased by 2.6% (p<0.05, 95% CI: 0.1%–4.6%) per occurrence of brief intervention. For patients with a recent history of current smoking whose home site experienced systems change, the likelihood of an office visit for smoking-related diagnoses decreased by 4.3% on an annualized basis after systems change occurred (p<0.05, 95% CI: 0.5%–8.1%). There was no change in the likelihood of an office visit for smoking-related diagnoses following systems change among non-smokers.

Conclusions

The clinical practice data from HVMA suggest that a systems approach can lead to significant reductions in smoking prevalence and the rate of office visits for smoking-related diseases. Most comprehensive tobacco intervention strategies focus on the provider or the tobacco user, but these results argue that health systems should be included as an integral component of a comprehensive tobacco intervention strategy. The HVMA results also give us an indication of the potential health impacts when meaningful use core tobacco measures are widely adopted.

2.

Objective

To measure the prices and availability of selected medicines in Shaanxi Province after the implementation of new healthcare reform in 2009.

Methods

Data on the prices and availability of 47 medicines were collected from 50 public and 36 private sector medicine outlets in six regions of Shaanxi Province, Western China using a standardized methodology developed by the World Health Organization and Health Action International from September to October 2010. Medicine prices were compared with international reference prices to obtain a median price ratio. Affordability was measured as the number of days’ wages required for the lowest-paid unskilled government worker to purchase standard treatments for common conditions.

Findings

The mean availabilities of originator brands and lowest-priced generics were 8.9% and 26.5% in the public sector, and 18.1% and 43.6% in the private sector, respectively. The public sector procured generics and originator brands at median price ratios of 0.75 and 8.49, respectively, while patients paid 0.97 and 10.16. Final patient prices for lowest-priced generics and originator brands in the private sector were about 1.53 and 8.36 times their international retail prices, respectively. Public sector vendors applied high markups of 30.4% to generics and 19.6% to originator brands. In the private sector, originator brands cost 390.7% more, on average, than their generic equivalents. Generic medicines were priced 17.3% higher in the private sector than in the public sector. Purchasing lowest-priced generics in the private sector would cost the lowest-paid government worker 0.1 day's wages for captopril but 6.6 days' wages for losartan; for originator brands, the costs rise to 1.2 days' wages for a salbutamol inhaler and 15.6 days' wages for omeprazole.
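The two price metrics used above reduce to simple arithmetic; a minimal sketch, where every number is purely illustrative (none of the figures below are from the survey):

```python
# WHO/HAI-style medicine price metrics (illustrative values, not survey data).

def median_price_ratio(local_unit_price, intl_reference_price):
    """Local price expressed as a multiple of the international
    reference price for the same medicine unit."""
    return local_unit_price / intl_reference_price

def affordability_days(unit_price, units_per_course, daily_wage):
    """Days' wages the lowest-paid unskilled government worker needs
    to buy one standard treatment course."""
    return unit_price * units_per_course / daily_wage

# Hypothetical example: 0.10 per tablet against a 0.05 international
# reference price; a 30-tablet course on a 20.0 daily wage.
mpr = median_price_ratio(0.10, 0.05)       # -> 2.0
days = affordability_days(0.10, 30, 20.0)  # ~0.15 days' wages
print(mpr, days)
```

A ratio above 1 means patients pay more than the international reference; the affordability figure is what the abstract reports as "days' wages" per treatment.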

Conclusions

The prices, availability and affordability of medicines in China should be improved to ensure equitable access to basic medical treatments, especially for the poor. This requires multi-faceted interventions, as well as the review and refocusing of policies, regulations and educational interventions.

3.

Purpose

Graft failure remains an obstacle to experimental subretinal cell transplantation. A key step is preparing a viable graft, as high levels of necrosis and apoptosis increase the risk of graft failure. Retinal grafts are commonly harvested from cell cultures. We termed the graft preparation procedure “transplant conditions” (TC). We hypothesized that culture conditions influenced graft viability, and investigated whether viability decreased following TC using a mouse retinal pigment epithelial (RPE) cell line, DH01.

Methods

Cell viability was assessed by trypan blue exclusion. Levels of apoptosis and necrosis in vitro were determined by flow cytometry for annexin V and propidium iodide and Western blot analysis for the pro- and cleaved forms of caspases 3 and 7. Graft viability in vivo was established by terminal deoxynucleotidyl transferase dUTP nick end labeling (TUNEL) and cleaved caspase 3 immunolabeling of subretinal allografts.

Results

Pre-confluent cultures had significantly fewer nonviable cells than post-confluent cultures (6.6%±0.8% vs. 13.1%±0.9%, p<0.01). Cell viability in either group was not altered significantly following TC. Caspases 3 and 7 were not altered by levels of confluence or following TC. Pre-confluent cultures had low levels of apoptosis/necrosis (5.6%±1.1%) that did not increase following TC (4.8%±0.5%). However, culturing beyond confluence led to progressively increasing levels of apoptosis and necrosis (up to 16.5%±0.9%). Allografts prepared from post-confluent cultures had significantly more TUNEL-positive cells 3 hours post-operatively than grafts of pre-confluent cells (12.7%±3.1% vs. 4.5%±1.4%, p<0.001). Subretinal grafts of post-confluent cells also had significantly higher rates of cleaved caspase 3 than pre-confluent grafts (20.2%±4.3% vs. 7.8%±1.8%, p<0.001).

Conclusion

Pre-confluent cells should be used to maximize graft cell viability.

4.

Background

Studies from the UK and North America have reported vitamin C deficiency in around 1 in 5 men and 1 in 9 women in low income groups. There are few data on vitamin C deficiency in resource poor countries.

Objectives

To investigate the prevalence of vitamin C deficiency in India.

Design

We carried out a population-based cross-sectional survey in two areas of north and south India. Randomly sampled clusters were enumerated to identify people aged 60 and over. Participants (75% response rate) were interviewed about tobacco, alcohol and cooking fuel use, completed a 24-hour diet recall, and underwent anthropometry and blood collection. Vitamin C was measured using an enzyme-based assay in plasma stabilized with metaphosphoric acid. We categorised vitamin C status as deficient (<11 µmol/L), sub-optimal (11–28 µmol/L) and adequate (>28 µmol/L). We investigated factors associated with vitamin C deficiency using multivariable Poisson regression.
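The three status categories map directly onto a threshold rule; a minimal sketch (the function name is ours, for illustration only):

```python
# Categorize plasma vitamin C using the study's cut-offs (µmol/L):
# deficient <11, sub-optimal 11-28, adequate >28.

def vitamin_c_status(plasma_umol_l: float) -> str:
    if plasma_umol_l < 11:
        return "deficient"
    elif plasma_umol_l <= 28:
        return "sub-optimal"
    else:
        return "adequate"

print(vitamin_c_status(8.0))    # deficient
print(vitamin_c_status(20.0))   # sub-optimal
print(vitamin_c_status(35.0))   # adequate
```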

Results

The age-, sex- and season-standardized prevalence of vitamin C deficiency was 73.9% (95% confidence interval, CI: 70.4–77.5) in 2668 people in north India and 45.7% (95% CI: 42.5–48.9) in 2970 from south India. Only 10.8% in the north and 25.9% in the south met the criteria for adequate levels. Vitamin C deficiency varied by season and was more prevalent in men, at older ages, among users of tobacco and biomass fuels, in those with anthropometric indicators of poor nutrition, and in those with lower intakes of dietary vitamin C.

Conclusions

In poor communities, such as in our study, consideration needs to be given to measures to improve the consumption of vitamin C-rich foods and to discourage the use of tobacco.

5.
6.

Background

Understanding how kill rates vary among seasons is necessary for characterizing predation by vertebrate species living in temperate climates. Unfortunately, kill rates are only rarely estimated during summer.

Methodology/Principal Findings

For several wolf packs in Yellowstone National Park, we used pairs of collared wolves living in the same pack and the double-count method to estimate the probability of attendance (PA) for an individual wolf at a carcass. PA quantifies an important aspect of social foraging behavior (i.e., the cohesiveness of foraging). We used PA to estimate summer kill rates for packs containing GPS-collared wolves between 2004 and 2009. Estimated rates of daily prey acquisition (edible biomass per wolf) decreased from 8.4±0.9 kg (mean ± SE) in May to 4.1±0.4 kg in July. Failure to account for PA would have resulted in underestimating kill rate by 32%. PA was 0.72±0.05 for large ungulate prey and 0.46±0.04 for small ungulate prey. To assess seasonal differences in social foraging behavior, we also evaluated PA during winter for VHF-collared wolves between 1997 and 2009. During winter, PA was 0.95±0.01. PA was not influenced by prey size but was influenced by wolf age and pack size.

Conclusions/Significance

Our results demonstrate that seasonal patterns in the foraging behavior of social carnivores have important implications for understanding their social behavior and estimating kill rates. Synthesizing our findings with previous insights suggests that there is important seasonal variation in how and why social carnivores live in groups. Our findings are also important for applications of GPS collars to estimate kill rates. Specifically, because the factors affecting the PA of social carnivores likely differ between seasons, kill rates estimated through GPS collars should account for seasonal differences in social foraging behavior.

7.
Wan X  Shin SS  Wang Q  Raymond HF  Liu H  Ding D  Yang G  Novotny TE 《PloS one》2011,6(8):e23028

Background

Rural-to-urban migrant women may be vulnerable to smoking initiation as they are newly exposed to risk factors in the urban environment. We sought to identify correlates of smoking among rural-to-urban migrant women in China.

Methods/Principal Findings

A cross-sectional survey of rural-to-urban migrant women working in restaurants and hotels (RHW) and those working as commercial sex workers (CSW) was conducted in ten provincial capital cities in China. Multiple logistic regression was conducted to identify correlates of smoking. We enrolled 2229 rural-to-urban migrant women (1697 RHWs aged 18–24 years and 532 CSWs aged 18–30 years). Of these, 18.4% of RHWs and 58.3% of CSWs reported having ever tried smoking, and 3.2% of RHWs and 41.9% of CSWs reported current smoking. Participants who first tried smoking after moving to the city were more likely to be current smokers than those who first tried smoking before moving (25.3% vs. 13.8% among RHWs, p = 0.02; 83.6% vs. 58.6% among CSWs, p<0.01). Adjusting for other factors, having “tried female cigarette brands” had the strongest association with current smoking (OR 5.69, 95% CI: 3.44 to 9.41) among participants who had ever tried smoking.

Conclusions/Significance

Exposure to female cigarette brands may increase the susceptibility to smoking among rural-to-urban migrant women. Smoke-free policies and increased taxes may be effective in preventing smoking initiation among rural-to-urban migrant women.

8.
9.

Background

Urinary tract infections (UTI) are frequent in outpatients. Fast pathogen identification is mandatory to shorten the duration of discomfort and prevent serious complications. Urine culture requires up to 48 hours for pathogen identification. Consequently, the initial antibiotic regimen is empirical.

Aim

To evaluate the feasibility of qualitative urine pathogen identification by a commercially available real-time PCR blood pathogen test (SeptiFast®) and to compare the results with dipslide and microbiological culture.

Design of study

Pilot study with prospectively collected urine samples.

Setting

University hospital.

Methods

82 prospectively collected urine samples from 81 patients with suspected UTI were included. Dipslide urine culture was followed by microbiological pathogen identification in dipslide positive samples. In parallel, qualitative DNA based pathogen identification (SeptiFast®) was performed in all samples.

Results

61 samples were SeptiFast® positive, whereas 67 samples were dipslide culture positive. The inter-methodological concordance of positive and negative findings in the Gram-positive, Gram-negative and fungal panels was 371/410 (90%), 477/492 (97%) and 238/246 (97%), respectively. Sensitivity and specificity of the SeptiFast® test for the detection of an infection were 0.82 and 0.60, respectively. SeptiFast® pathogen identifications were available at least 43 hours before culture results.
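These accuracy figures follow from a standard 2×2 contingency table; the counts below are our reconstruction from the reported totals (82 samples, 67 culture-positive, 61 PCR-positive), not values stated in the abstract:

```python
# Sensitivity/specificity from a 2x2 table. Counts are our reconstruction
# consistent with 82 samples, 67 culture+, 61 PCR+, sens 0.82, spec 0.60 --
# they are not stated explicitly in the abstract.
tp, fn = 55, 12   # culture-positive samples: PCR-detected vs. missed
fp, tn = 6, 9     # culture-negative samples: PCR false alarms vs. agreed

sensitivity = tp / (tp + fn)   # fraction of true infections detected
specificity = tn / (tn + fp)   # fraction of non-infections called negative
print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")
# sensitivity=0.82, specificity=0.60
```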

Conclusion

The SeptiFast® platform identified bacterial DNA in urine specimens considerably faster than conventional culture. For UTI diagnosis, its sensitivity and specificity are limited by the present qualitative setup, which does not allow pathogen quantification. Future quantitative assays may hold promise for PCR-based UTI pathogen identification as a supplement to conventional culture methods.

10.

Objective

The use of pictorial warning labels on cigarette packages is one of the provisions included in the first ever global health treaty by the World Health Organization against the tobacco epidemic. There is substantial evidence demonstrating the effectiveness of graphic health warning labels on intention to quit, thoughts about health risks and engaging in cessation behaviors. However, studies that address the implicit emotional drives evoked by such warnings are still underexplored. Here, we provide experimental data for the use of pictorial health warnings as a reliable strategy for tobacco control.

Methods

Experiment 1 pre-tested nineteen prototypes of pictorial warnings to screen for their emotional impact. Participants (n = 338) were young adults balanced in gender, smoking status and education. Experiment 2 (n = 63) tested ten pictorial warnings stamped on packs. We employed an innovative set-up to investigate the impact of the warnings on the ordinary handling of packs, and quantified judgments of the warnings’ emotional strength and efficacy against smoking.

Findings

Experiment 1 revealed that women judged the warning prototypes as more aversive than men did, and smokers judged them more aversive than non-smokers did. Participants with lower education judged the prototypes more aversive than participants with higher education. Experiment 2 showed that stamped warnings antagonized the appeal of the brands by imposing a cost on handling the cigarette packs, especially for smokers. Additionally, participants’ judgments revealed that the more aversive a warning, the more it is perceived as effective against smoking.

Conclusions

Health warning labels are one of the key components of the integrated approach to control the global tobacco epidemic. The evidence presented in this study adds to the understanding of how implicit responses to pictorial warnings may contribute to behavioral change.

11.

Background

Chronic non-communicable diseases (NCDs) are becoming significant causes of morbidity and mortality, particularly in sub-Saharan African countries, although local, high-quality data to inform evidence-based policies are lacking.

Objectives

To determine the magnitude of NCDs and their risk factors in Malawi.

Methods

Using the WHO STEPwise approach to chronic disease risk factor surveillance, a population-based, nationwide cross-sectional survey was conducted between July and September 2009 on participants aged 25–64 years. Socio-demographic and behaviour risk factors were collected in Step 1. Physical anthropometric measurements and blood pressure were documented in Step 2. Blood cholesterol and fasting blood glucose were measured in Step 3.

Results and Conclusion

A total of 5,206 adults (67% female) were surveyed. Tobacco smoking, alcohol drinking and raised blood pressure (BP) were more frequent in males than females (25% vs 3%, 30% vs 4% and 37% vs 29%, respectively). Overweight, physical inactivity and raised cholesterol were more common in females than males (28% vs 16%, 13% vs 6% and 11% vs 6%). Tobacco smoking was more common in rural than urban areas (11% vs 7%), and overweight and physical inactivity were more common in urban than rural areas (39% vs 22% and 24% vs 9%), all with p<0.05. Overall (both sexes), the prevalence of tobacco smoking, alcohol consumption, overweight and physical inactivity was 14%, 17%, 22% and 10%, and the prevalence of raised BP, raised fasting blood sugar and raised cholesterol was 33%, 6% and 9%, respectively. These data could be useful in the formulation and advocacy of an NCD policy and action plan in Malawi.

12.

Background

Serological tests for IgM and IgG are routinely used in clinical laboratories for the rapid diagnosis of dengue and can differentiate between primary and secondary infections. Dengue virus non-structural protein 1 (NS1) has been identified as an early marker for acute dengue and is typically present on days 1–9 post-onset of illness, but following seroconversion it can be difficult to detect in serum.

Aims

To evaluate the performance of a newly developed Panbio® Dengue Early Rapid test for NS1 and determine if it can improve diagnostic sensitivity when used in combination with a commercial IgM/IgG rapid test.

Methodology

The clinical performance of the Dengue Early Rapid was evaluated in a retrospective study in Vietnam with 198 acute laboratory-confirmed positive and 100 negative samples. The performance of the Dengue Early Rapid in combination with the IgM/IgG Rapid test was also evaluated in Malaysia with 263 laboratory-confirmed positive and 30 negative samples.

Key Results

In Vietnam the sensitivity and specificity of the test were 69.2% (95% CI: 62.8% to 75.6%) and 96% (95% CI: 92.2% to 99.8%), respectively. In Malaysia the performance was similar, with 68.9% sensitivity (95% CI: 61.8% to 76.1%) and 96.7% specificity (95% CI: 82.8% to 99.9%) compared to RT-PCR. Importantly, when the Dengue Early Rapid test was used in combination with the IgM/IgG test the sensitivity increased to 93.0%. When the two tests were compared at each day post-onset of illness there was clear differentiation between the antigen and antibody markers.

Conclusions

This study highlights that using dengue NS1 antigen detection in combination with anti-glycoprotein E IgM and IgG serology can significantly increase the sensitivity of acute dengue diagnosis, extend the possible window of detection to include very early acute samples, and enhance the clinical utility of rapid immunochromatographic testing for dengue.

13.
Lo YL  Lin TY  Fang YF  Wang TY  Chen HC  Chou CL  Chung FT  Kuo CH  Feng PH  Liu CY  Kuo HP 《PloS one》2011,6(11):e27769

Objectives

There are safety issues associated with propofol use for flexible bronchoscopy (FB). The bispectral index (BIS) correlates well with the level of consciousness. The aim of this study was to show that BIS-guided propofol infusion is safe and may provide better sedation, benefiting the patients and bronchoscopists.

Methods

After administering alfentanil bolus, 500 patients were randomized to either propofol infusion titrated to a BIS level of 65-75 (study group) or incremental midazolam bolus based on clinical judgment to achieve moderate sedation. The primary endpoint was safety, while the secondary endpoints were recovery time, patient tolerance, and cooperation.

Results

The proportions of patients with hypoxemia or hypotensive events were not different between the 2 groups (study vs. control group: 39.9% vs. 35.7%, p = 0.340; 7.4% vs. 4.4%, p = 0.159, respectively). The mean lowest blood pressure was lower in the study group. Logistic regression revealed that male gender, higher American Society of Anesthesiologists physical status, and electrocautery were associated with hypoxemia, whereas a lower propofol dose for induction was associated with hypotension in the study group. The study group had better global tolerance (p<0.001), less procedural interference by movement or cough (13.6% vs. 36.1%, p<0.001; 30.0% vs. 44.2%, p = 0.001, respectively), and shorter times to orientation and ambulation (11.7±10.2 min vs. 29.7±26.8 min, p<0.001; 30.0±18.2 min vs. 55.7±40.6 min, p<0.001, respectively) compared to the control group.

Conclusions

BIS-guided propofol infusion combined with alfentanil for FB sedation provides excellent patient tolerance, with fast recovery and less procedure interference.

Trial Registration

ClinicalTrials.gov NCT00789815

14.

Background and Aims

Although the advent of ultra-deep sequencing technology allows for the analysis of heretofore-undetectable minor viral mutants, a limited amount of information is currently available regarding the clinical implications of hepatitis B virus (HBV) genomic heterogeneity.

Methods

To characterize the HBV genetic heterogeneity in association with anti-viral therapy, we performed ultra-deep sequencing of full-genome HBV in the liver and serum of 19 patients with chronic viral infection, including 14 therapy-naïve and 5 nucleos(t)ide analogue (NA)-treated cases.

Results

Most genomic changes observed in viral variants were single base substitutions and were widely distributed throughout the HBV genome. Four of eight (50%) chronic therapy-naïve HBeAg-negative patients showed a relatively low prevalence of the G1896A pre-core (pre-C) mutant in the liver tissues, suggesting that other mutations were involved in their HBeAg seroconversion. Interestingly, liver tissues in 4 of 5 (80%) of the chronic NA-treated anti-HBe-positive cases had extremely low levels of the G1896A pre-C mutant (0.0%, 0.0%, 0.1%, and 1.1%), suggesting the high sensitivity of the G1896A pre-C mutant to NA. Moreover, NA-resistant clones of various abundances were common in both the liver and serum of treatment-naïve patients, and the proportion of M204V/I mutants, which are resistant to lamivudine and entecavir, expanded in response to entecavir treatment in the serum of 35.7% (5/14) of patients, suggesting a putative risk of developing drug resistance to NA.

Conclusion

Our findings illustrate the strong advantage of deep sequencing of the viral genome as a tool for dissecting the pathophysiology of HBV infection.

15.

Background

Cardiometabolic disease risk in US military recruits and the effects of military training have not been determined. This study examined lifestyle factors and biomarkers associated with cardiometabolic risk in US Army recruits (209; 118 male, 91 female, 23±5 yr) before, during, and after basic combat training (BCT).

Methodology/Principal Findings

Anthropometrics; fasting total (TC), high-density lipoprotein (HDL) and low-density lipoprotein (LDL) cholesterol; triglycerides (TG); glucose; and insulin were measured at baseline and every 3 wks during the 10 wk BCT course. At baseline, 14% of recruits were obese (BMI>30 kg/m2), 27% were cigarette smokers, 37% were sedentary, and 34% reported a family history of cardiometabolic disease. TC was above recommended levels in 8%, LDL in 39%, TG in 5%, and glucose in 8% of recruits, and HDL was below recommended levels in 33% of recruits at baseline. By week 9, TC decreased 8%, LDL 10%, TG 13%, glucose 6% and homeostasis model assessment of insulin resistance (HOMA-IR) 40% in men (P<0.05). In women, TC, LDL, glucose and HOMA-IR were decreased from baseline at weeks 3 and 6 (P<0.05), but were not different from baseline levels at week 9. During BCT, body weight declined in men but not women, while body fat percentage declined in both men and women (P<0.05).

Conclusions/Significance

At the start of military service, the prevalence of cardiometabolic risk in US military recruits is comparable to that reported in similar, college-aged populations. Military training appears to be an effective strategy that may mitigate risk in young people through improvements in lipid profiles and glycemic control.

16.

Background

The clinical outcomes of short interruptions of PI-based ART regimens remain undefined.

Methods

A 2-arm non-inferiority trial was conducted on 53 HIV-1 infected South African participants with viral load <50 copies/ml and CD4 T cell count >450 cells/µl on stavudine (or zidovudine), lamivudine and lopinavir/ritonavir. Subjects were randomized to a) sequential 2, 4 and 8-week ART interruptions or b) continuous ART (cART). Primary analysis was based on the proportion of CD4 counts >350 cells/µl over 72 weeks. Adherence, HIV-1 drug resistance, and CD4 count rise over time were analyzed as secondary endpoints.

Results

The proportions of CD4 counts >350 cells/µl were 82.12% for the intermittent arm and 93.73% for the cART arm; the difference of 11.95% was above the defined 10% threshold for non-inferiority (upper limit of 97.5% CI, 24.1%; 2-sided CI: −0.16, 23.1). No clinically significant differences in opportunistic infections, adverse events, adherence or viral resistance were noted; after randomization, long-term CD4 rise was observed only in the cART arm.
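The non-inferiority logic here reduces to comparing the upper confidence bound of the between-arm difference against the pre-specified margin; a sketch using the figures reported above:

```python
# Non-inferiority check: the intermittent arm is non-inferior only if the
# entire confidence interval for the difference lies below the margin.
# Figures are those reported in the trial results above.
margin = 10.0     # pre-specified non-inferiority margin, percentage points
diff = 11.95      # reported difference between arms (cART minus intermittent)
ci_upper = 24.1   # upper limit of the 97.5% CI for the difference

# Here both the point estimate and the upper bound exceed the margin,
# so non-inferiority cannot be concluded.
non_inferior = ci_upper < margin
print(non_inferior)
```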

Conclusion

We are unable to conclude that short PI-based ART interruptions are non-inferior to cART in retention of immune reconstitution; however, short interruptions did not lead to a greater rate of resistance mutations or adverse events than cART, suggesting that this regimen may be more forgiving than NNRTIs if interruptions in therapy occur.

Trial Registration

ClinicalTrials.gov NCT00100646

17.
18.

Background

Cryptococcal infection is a frequent cause of mortality in Cambodian HIV-infected patients with CD4+ count ≤100 cells/µl. This study assessed the cost-effectiveness of three strategies for cryptococcosis prevention in HIV-infected patients.

Methods

A Markov decision tree was used to compare the following strategies at the time of HIV diagnosis: no intervention, one time systematic serum cryptococcal antigen (CRAG) screening and treatment of positive patients, and systematic primary prophylaxis with fluconazole. The trajectory of a hypothetical cohort of HIV-infected patients with CD4+ count ≤100 cells/µl initiating care was simulated over a 1-year period (cotrimoxazole initiation at enrollment; antiretroviral therapy within 3 months). Natural history and cost data (US$ 2009) were from Cambodia. Efficacy data were from international literature.

Results

In a population in which 81% of patients had a CD4+ count ≤50 cells/µl and 19% a CD4+ count between 51–100 cells/µl, the proportion alive 1 year after enrollment was 61% (cost $472) with no intervention, 70% (cost $483) with screening, and 72% (cost $492) with prophylaxis. After one year of follow-up, the cost-effectiveness of screening vs. no intervention was US$180 per life-year gained (LYG). The cost-effectiveness of prophylaxis vs. screening was $511/LYG, rising to an estimated $1538/LYG if the proportion of patients with CD4+ count ≤50 cells/µl decreased by 75%.

Conclusion

In an area where cryptococcosis and HIV infection are highly endemic, serum CRAG screening and primary prophylaxis are two cost-effective strategies to prevent AIDS-associated cryptococcosis in patients with CD4+ counts ≤100 cells/µl at a short-term horizon, screening being more cost-effective but less effective than prophylaxis. Systematic primary prophylaxis may be preferred in patients with CD4+ counts below 50 cells/µl, while systematic serum CRAG screening for early targeted treatment may be preferred in patients with CD4+ counts between 51–100 cells/µl.

19.

Aims

Obesity causes a high disease burden in Australia and across the world. We aimed to analyse the cost-effectiveness of weight reduction with pharmacotherapy in Australia, and to assess its potential to reduce the disease burden due to excess body weight.

Methods

We constructed a multi-state life-table based Markov model in Excel in which body weight influences the incidence of stroke, ischemic heart disease, hypertensive heart disease, diabetes mellitus, osteoarthritis, post-menopausal breast cancer, colon cancer, endometrial cancer and kidney cancer. We use data on effectiveness identified from PubMed searches, on mortality from the Australian Bureau of Statistics, on disease costs from the Australian Institute of Health and Welfare, and on drug costs from the Department of Health and Ageing. We evaluate 1-year pharmacological interventions with sibutramine and orlistat targeting obese Australian adults free of obesity-related disease. We use a lifetime horizon for costs and health outcomes and a health sector perspective for costs. Incremental Cost-Effectiveness Ratios (ICERs) below A$50 000 per Disability Adjusted Life Year (DALY) averted are considered good value for money.

Results

The ICERs are A$130 000/DALY (95% uncertainty interval [UI] 93 000–180 000) for sibutramine and A$230 000/DALY (170 000–340 000) for orlistat. The interventions reduce the body weight-related disease burden at the population level by 0.2% and 0.1%, respectively. Modest weight loss during the interventions, rapid post-intervention weight regain and low adherence limit the health benefits.

Conclusions

Treatment with sibutramine or orlistat is not cost-effective from an Australian health sector perspective and has a negligible impact on the total body weight-related disease burden.

20.

Background

Nucleic acid amplification tests are sensitive for identifying Mycobacterium tuberculosis in populations with positive sputum smears for acid-fast bacilli, but less sensitive in sputum-smear-negative populations. Few studies have evaluated the clinical impact of these tests in low-income countries with high burdens of TB and HIV.

Methods

We prospectively enrolled 211 consecutive adults with cough ≥2 weeks and negative sputum smears at Mulago Hospital in Kampala, Uganda. We tested a single early-morning sputum specimen for Mycobacterium tuberculosis DNA using two nucleic acid amplification tests: a novel in-house polymerase chain reaction targeting the mycobacterial secA1 gene, and the commercial Amplified® Mycobacterium tuberculosis Direct (MTD) test (Gen-Probe Inc, San Diego, CA). We calculated the diagnostic accuracy of these index tests in reference to a primary microbiologic gold standard (positive mycobacterial culture of sputum or bronchoalveolar lavage fluid), and measured their likely clinical impact on additional tuberculosis cases detected among those not prescribed initial TB treatment.

Results

Of 211 patients enrolled, 170 (81%) were HIV-seropositive, with median CD4+ T-cell count 78 cells/µL (interquartile range 29–203). Among HIV-seropositive patients, 94 (55%) reported taking co-trimoxazole prophylaxis and 29 (17%) reported taking antiretroviral therapy. Seventy-five patients (36%) had culture-confirmed TB. Sensitivity of MTD was 39% (95% CI 28–51) and that of secA1 was 24% (95% CI 15–35). Both tests had specificities of 95% (95% CI 90–98). The MTD test correctly identified 18 (24%) TB patients not treated at discharge and led to a 72% relative increase in the smear-negative case detection rate.

Conclusions

The secA1 and MTD nucleic acid amplification tests had moderate sensitivity and high specificity for TB in a predominantly HIV-seropositive population with negative sputum smears. Although newer, more sensitive nucleic acid assays may enhance detection of Mycobacterium tuberculosis in sputum, even currently available tests can provide substantial clinical impact in smear-negative populations.
