Similar Articles
1.

Objective

To compare persistence with tumor necrosis factor alpha (TNF) antagonists among rheumatoid arthritis patients in British Columbia. Treatment persistence has been suggested as a proxy for real-world therapeutic benefit and harm of treatments for chronic non-curable diseases, including rheumatoid arthritis. We hypothesized that the different pharmacological characteristics of infliximab, adalimumab and etanercept cause statistically and clinically significant differences in persistence.

Methods

We conducted a population-based cohort study using administrative health data from the Canadian province of British Columbia. The study cohort included rheumatoid arthritis patients who initiated a first course of a TNF antagonist between 2001 and 2008. Persistence was measured as the time from first dispensing to discontinuation. Drug discontinuation was defined as a drug-free interval of 180 days or a switch to another TNF antagonist, anakinra, rituximab or abatacept. Persistence was estimated and compared using survival analysis.

Results

The study cohort included 2,923 patients, 63% treated with etanercept. Median persistence in years (95% confidence interval) with infliximab was 3.7 (2.9–4.9), with adalimumab 3.3 (2.6–4.1) and with etanercept 3.8 (3.3–4.3). Similar risk of discontinuation was observed for the three drugs: the hazard ratio (95% confidence interval) was 0.98 (0.85–1.13) comparing infliximab with etanercept, 0.95 (0.78–1.15) comparing infliximab with adalimumab and 1.04 (0.88–1.22) comparing adalimumab with etanercept.

Conclusions

Similar persistence was observed with infliximab, adalimumab and etanercept in rheumatoid arthritis patients during the first 9 years of use. If treatment persistence is a good proxy for the therapeutic benefit and harm of these drugs, then this finding suggests that the three drugs share an overall similar benefit-harm profile in rheumatoid arthritis patients.
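Median persistence of this kind is read off a Kaplan-Meier survival curve: the smallest time at which the estimated probability of still being on the drug falls to 0.5. A minimal pure-Python sketch follows; the (time, event) pairs are synthetic illustration data, not the study cohort.

```python
# Minimal Kaplan-Meier sketch for right-censored persistence data.
# event = 1 marks discontinuation; event = 0 marks censoring
# (still on the drug at end of follow-up). Synthetic data only.

def km_survival(times, events):
    """Return [(t, S(t))] at each distinct discontinuation time."""
    n = len(times)
    order = sorted(range(n), key=lambda i: times[i])
    at_risk = n
    surv = 1.0
    curve = []
    i = 0
    while i < n:
        t = times[order[i]]
        deaths = 0
        ties = 0
        while i < n and times[order[i]] == t:
            deaths += events[order[i]]
            ties += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / at_risk   # KM product-limit step
            curve.append((t, surv))
        at_risk -= ties                      # drop both events and censorings
    return curve

def km_median(times, events):
    """Smallest event time at which S(t) drops to 0.5 or below."""
    for t, s in km_survival(times, events):
        if s <= 0.5:
            return t
    return None  # median not reached within follow-up

# Persistence in years (synthetic)
times  = [0.5, 1.2, 2.0, 2.5, 3.0, 3.8, 4.1, 5.0, 6.0, 7.5]
events = [1,   1,   0,   1,   1,   1,   0,   1,   0,   1]
print(km_median(times, events))  # → 3.8
```

With a real cohort of this size one would of course use a survival library rather than a hand-rolled estimator; the point is only how censoring enters the median calculation.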

2.

Background

The pathogenesis of appendicitis is unclear. We evaluated whether exposure to air pollution was associated with an increased incidence of appendicitis.

Methods

We identified 5191 adults who had been admitted to hospital with appendicitis between Apr. 1, 1999, and Dec. 31, 2006. The air pollutants studied were ozone, nitrogen dioxide, sulfur dioxide, carbon monoxide, and suspended particulate matter of less than 10 μm and less than 2.5 μm in diameter. We estimated the odds of appendicitis relative to short-term increases in concentrations of selected pollutants, alone and in combination, after controlling for temperature and relative humidity as well as the effects of age, sex and season.

Results

An interquartile-range increase in the 5-day average ozone concentration was associated with appendicitis (odds ratio [OR] 1.14, 95% confidence interval [CI] 1.03–1.25). In summer (July–August), the effects were most pronounced for ozone (OR 1.32, 95% CI 1.10–1.57), sulfur dioxide (OR 1.30, 95% CI 1.03–1.63), nitrogen dioxide (OR 1.76, 95% CI 1.20–2.58), carbon monoxide (OR 1.35, 95% CI 1.01–1.80) and particulate matter less than 10 μm in diameter (OR 1.20, 95% CI 1.05–1.38). We observed a significant effect of the air pollutants in the summer months among men but not among women (e.g., OR for an increase in the 5-day average of nitrogen dioxide 2.05, 95% CI 1.21–3.47, among men and 1.48, 95% CI 0.85–2.59, among women). The double-pollutant model of exposure to ozone and nitrogen dioxide in the summer months was associated with attenuation of the effects of ozone (OR 1.22, 95% CI 1.01–1.48) and nitrogen dioxide (OR 1.48, 95% CI 0.97–2.24).

Interpretation

Our findings suggest that some cases of appendicitis may be triggered by short-term exposure to air pollution. If these findings are confirmed, measures to improve air quality may help to decrease rates of appendicitis.

Appendicitis was introduced into the medical vernacular in 1886.1 Since then, the prevailing theory of its pathogenesis has implicated an obstruction of the appendiceal orifice by a fecalith or lymphoid hyperplasia.2 However, this notion does not completely account for variations in incidence observed by age,3,4 sex,3,4 ethnic background,3,4 family history,5 temporal–spatial clustering6 and seasonality,3,4 nor does it completely explain the trends in incidence of appendicitis in developed and developing nations.3,7,8

The incidence of appendicitis increased dramatically in industrialized nations in the 19th century and in the early part of the 20th century.1 Without explanation, it decreased in the middle and latter part of the 20th century.3 The decrease coincided with legislation to improve air quality. For example, after the United States Clean Air Act was passed in 1970,9 the incidence of appendicitis decreased by 14.6% from 1970 to 1984.3 Likewise, a 36% drop in incidence was reported in the United Kingdom between 1975 and 1994,10 after legislation was passed in 1956 and 1968 to improve air quality and in the 1970s to control industrial sources of air pollution. Furthermore, appendicitis is less common in developing nations; however, as these countries become more industrialized, the incidence of appendicitis has been increasing.7

Air pollution is known to be a risk factor for multiple conditions, to exacerbate disease states and to increase all-cause mortality.11 It has a direct effect on pulmonary diseases such as asthma11 and on nonpulmonary diseases including myocardial infarction, stroke and cancer.11–13 Inflammation induced by exposure to air pollution contributes to some adverse health effects.14–17 Similar to the effects of air pollution, a proinflammatory response has been associated with appendicitis.18–20

We conducted a case–crossover study involving a population-based cohort of patients admitted to hospital with appendicitis to determine whether short-term increases in concentrations of selected air pollutants were associated with hospital admission because of appendicitis.
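Pollutant effects like these are usually estimated per unit of concentration and then rescaled to a "per interquartile-range increase" odds ratio via OR = exp(β · IQR). A sketch of that rescaling, with a made-up coefficient and IQR (the study's actual OR for ozone was 1.14 per IQR):

```python
import math

# Rescale a per-unit log-odds coefficient from a logistic-type model to
# an odds ratio per interquartile-range (IQR) increase in exposure.
# beta and iqr below are hypothetical, chosen only for illustration.

def or_per_iqr(beta_per_unit, iqr, se_per_unit=None, z=1.96):
    """OR for an IQR-sized increase; with se, also a 95% CI."""
    or_ = math.exp(beta_per_unit * iqr)
    if se_per_unit is None:
        return or_
    lo = math.exp((beta_per_unit - z * se_per_unit) * iqr)
    hi = math.exp((beta_per_unit + z * se_per_unit) * iqr)
    return or_, lo, hi

# Hypothetical: beta = 0.008 per ppb ozone, IQR = 16 ppb
print(or_per_iqr(0.008, 16.0))  # exp(0.128) ≈ 1.14
```

The CI simply rescales the per-unit standard error by the same IQR on the log-odds scale before exponentiating.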

3.
Sonja A. Swanson, Ian Colman. CMAJ 2013;185(10):870–877.

Background:

Ecological studies support the hypothesis that suicide may be “contagious” (i.e., exposure to suicide may increase the risk of suicide and related outcomes). However, this association has not been adequately assessed in prospective studies. We sought to determine the association between exposure to suicide and suicidality outcomes in Canadian youth.

Methods:

We used baseline information from the Canadian National Longitudinal Survey of Children and Youth between 1998/99 and 2006/07 with follow-up assessments 2 years later. We included all respondents aged 12–17 years in cycles 3–7 with reported measures of exposure to suicide.

Results:

We included 8766 youth aged 12–13 years, 7802 aged 14–15 years and 5496 aged 16–17 years. Exposure to a schoolmate’s suicide was associated with ideation at baseline among respondents aged 12–13 years (odds ratio [OR] 5.06, 95% confidence interval [CI] 3.04–8.40), 14–15 years (OR 2.93, 95% CI 2.02–4.24) and 16–17 years (OR 2.23, 95% CI 1.43–3.48). Such exposure was associated with attempts among respondents aged 12–13 years (OR 4.57, 95% CI 2.39–8.71), 14–15 years (OR 3.99, 95% CI 2.46–6.45) and 16–17 years (OR 3.22, 95% CI 1.62–6.41). Personally knowing someone who died by suicide was associated with suicidality outcomes for all age groups. We also assessed 2-year outcomes among respondents aged 12–15 years: a schoolmate’s suicide predicted suicide attempts among participants aged 12–13 years (OR 3.07, 95% CI 1.05–8.96) and 14–15 years (OR 2.72, 95% CI 1.47–5.04). Among those who reported a schoolmate’s suicide, personally knowing the decedent did not alter the risk of suicidality.

Interpretation:

We found that exposure to suicide predicts suicide ideation and attempts. Our results support school-wide interventions over current targeted interventions, particularly over strategies that target interventions toward children closest to the decedent.

Suicidal thoughts and behaviours are prevalent1–3 and severe4–7 among adolescents. One hypothesized cause of suicidality is “suicide contagion” (i.e., exposure to suicide or related behaviours influences others to contemplate, attempt or die by suicide).8 Ecological studies support this theory: suicide and suspected suicide rates increase following a highly publicized suicide.9–11 However, such studies are prone to ecological fallacy and do not allow for detailed understanding of who may be most vulnerable.

Adolescents may be particularly susceptible to this contagion effect. More than 13% of adolescent suicides are potentially explained by clustering;12–14 clustering may explain an even larger proportion of suicide attempts.15,16 Many local,17,18 national8,19 and international20 institutions recommend school- or community-level postvention strategies in the aftermath of a suicide to help prevent further suicides and suicidality. These postvention strategies typically focus on a short interval following the death (e.g., months) with services targeted toward the most at-risk individuals (e.g., those with depression).19

In this study, we assessed the association between exposure to suicide and suicidal thoughts and attempts among youth, using both cross-sectional and prospective (2-yr follow-up) analyses in a population-based cohort of Canadian youth.
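The odds ratios reported above come from logistic models, but the basic quantity is the cross-product of a 2×2 exposure-by-outcome table, with a Woolf (log-scale) confidence interval. A small sketch with hypothetical counts (the study's actual ORs were survey-weighted model estimates, not raw cross-products):

```python
import math

# Odds ratio with a Woolf (log-scale) 95% CI from 2x2 counts.
# a, b = outcome yes/no among the exposed; c, d = among the unexposed.
# Counts are hypothetical illustration data.

def odds_ratio_ci(a, b, c, d, z=1.96):
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)   # Woolf SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical: 40/200 exposed youths report ideation vs. 200/4200 unexposed
or_, lo, hi = odds_ratio_ci(40, 160, 200, 4000)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # → 5.0 3.44 7.27
```

The Woolf SE (sum of reciprocal cell counts) is why ORs from small exposed groups, like the youngest age band here, carry such wide intervals.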

4.
5.
Background:

Otitis media with effusion is a common problem that lacks an evidence-based nonsurgical treatment option. We assessed the clinical effectiveness of treatment with a nasal balloon device in a primary care setting.

Methods:

We conducted an open, pragmatic randomized controlled trial set in 43 family practices in the United Kingdom. Children aged 4–11 years with a recent history of ear symptoms and otitis media with effusion in 1 or both ears, confirmed by tympanometry, were allocated to receive either autoinflation 3 times daily for 1–3 months plus usual care or usual care alone. Clearance of middle-ear fluid at 1 and 3 months was assessed by experts masked to allocation.

Results:

Of 320 children enrolled, those receiving autoinflation were more likely than controls to have normal tympanograms at 1 month (47.3% [62/131] v. 35.6% [47/132]; adjusted relative risk [RR] 1.36, 95% confidence interval [CI] 0.99 to 1.88) and at 3 months (49.6% [62/125] v. 38.3% [46/120]; adjusted RR 1.37, 95% CI 1.03 to 1.83; number needed to treat = 9). Autoinflation produced greater improvements in ear-related quality of life (adjusted between-group difference in change from baseline in OMQ-14 [an ear-related measure of quality of life] score −0.42, 95% CI −0.63 to −0.22). Compliance was 89% at 1 month and 80% at 3 months. Adverse events were mild, infrequent and comparable between groups.

Interpretation:

Autoinflation in children aged 4–11 years with otitis media with effusion is feasible in primary care and effective both in clearing effusions and in improving symptoms and ear-related child and parent quality of life. Trial registration: ISRCTN, No. 55208702.

Otitis media with effusion, also known as glue ear, is an accumulation of fluid in the middle ear, without symptoms or signs of an acute ear infection. It is often associated with viral infection.1–3 The prevalence rises to 46% in children aged 4–5 years,4 when hearing difficulty, other ear-related symptoms and broader developmental concerns often bring the condition to medical attention.3,5,6 Middle-ear fluid is associated with conductive hearing losses of about 15–45 dB HL.7 Resolution is clinically unpredictable,8–10 with about a third of cases showing recurrence.11 In the United Kingdom, about 200 000 children with the condition are seen annually in primary care.12,13 Research suggests some children seen in primary care are as badly affected as those seen in hospital.7,9,14,15 In the United States, there were 2.2 million diagnosed episodes in 2004, costing an estimated $4.0 billion.16 Rates of ventilation tube surgery vary between countries,17–19 with a declining trend in the UK.20

Initial clinical management consists of reasonable temporizing or delay before considering surgery.13 Unfortunately, all available medical treatments for otitis media with effusion, such as antibiotics, antihistamines, decongestants and intranasal steroids, are ineffective and have unwanted effects, and therefore cannot be recommended.21–23 Not only are antibiotics ineffective, but resistance to them poses a major threat to public health.24,25 Although surgery is effective for a carefully selected minority,13,26,27 a simple low-cost, nonsurgical treatment option could benefit a much larger group of symptomatic children, with the purpose of addressing legitimate clinical concerns without incurring excessive delays.

Autoinflation using a nasal balloon device is a low-cost intervention with the potential to be used more widely in primary care, but current evidence of its effectiveness is limited to several small hospital-based trials28 that found a higher rate of tympanometric resolution of ear fluid at 1 month.29–31 Evidence of the feasibility and effectiveness of autoinflation to inform wider clinical use is lacking.13,28 Thus we report here the findings of a large pragmatic trial of the clinical effectiveness of nasal balloon autoinflation in a spectrum of children with clinically confirmed otitis media with effusion identified from primary care.
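The reported number needed to treat (NNT = 9) can be reproduced from the trial's own 3-month proportions, 62/125 with autoinflation versus 46/120 with usual care, as the reciprocal of the absolute risk difference. The paper's estimate was adjusted, so this crude calculation is only an approximation:

```python
import math

# NNT = 1 / absolute risk difference, rounded up to the next whole
# patient. Counts are the trial's crude 3-month tympanometric results.

def nnt(events_tx, n_tx, events_ctrl, n_ctrl):
    """Number needed to treat from two event proportions."""
    risk_diff = events_tx / n_tx - events_ctrl / n_ctrl
    return math.ceil(1.0 / risk_diff)

print(nnt(62, 125, 46, 120))  # → 9
```

Here the risk difference is 49.6% − 38.3% ≈ 11.3%, so about nine children must use the balloon for one extra child to clear the effusion.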

6.

Background

Cryotherapy is widely used for the treatment of cutaneous warts in primary care. However, evidence favours salicylic acid application. We compared the effectiveness of these treatments as well as a wait-and-see approach.

Methods

Consecutive patients with new cutaneous warts were recruited in 30 primary care practices in the Netherlands between May 1, 2006, and Jan. 26, 2007. We randomly allocated eligible patients to one of three groups: cryotherapy with liquid nitrogen every two weeks, self-application of salicylic acid daily or a wait-and-see approach. The primary outcome was the proportion of participants whose warts were all cured at 13 weeks. Analysis was on an intention-to-treat basis. Secondary outcomes included treatment adherence, side effects and treatment satisfaction. Research nurses assessed outcomes during home visits at 4, 13 and 26 weeks.

Results

Of the 250 participants (age 4 to 79 years), 240 were included in the analysis at 13 weeks (loss to follow-up 4%). Cure rates were 39% (95% confidence interval [CI] 29%–51%) in the cryotherapy group, 24% (95% CI 16%–35%) in the salicylic acid group and 16% (95% CI 9.5%–25%) in the wait-and-see group. Differences in effectiveness were most pronounced among participants with common warts (n = 116): cure rates were 49% (95% CI 34%–64%) in the cryotherapy group, 15% (95% CI 7%–30%) in the salicylic acid group and 8% (95% CI 3%–21%) in the wait-and-see group. Cure rates among the participants with plantar warts (n = 124) did not differ significantly between treatment groups.

Interpretation

For common warts, cryotherapy was the most effective therapy in primary care. For plantar warts, we found no clinically relevant difference in effectiveness between cryotherapy, topical application of salicylic acid or a wait-and-see approach after 13 weeks. (ClinicalTrials.gov registration no. ISRCTN42730629)

Cutaneous warts are common.1–3 Up to one-third of primary school children have warts, of which two-thirds resolve within two years.4,5 Because warts frequently result in discomfort,6 2% of the general population and 6% of school-aged children each year present with warts to their family physician.7,8 The usual treatment is cryotherapy with liquid nitrogen or, less frequently, topical application of salicylic acid.9–12 Some physicians choose a wait-and-see approach because of the benign natural course of warts and the risk of side effects of treatment.10,11

A recent Cochrane review on treatments of cutaneous warts concluded that available studies were small, poorly designed or limited to dermatology outpatients.10,11 Evidence on cryotherapy was contradictory,13–18 whereas the evidence on salicylic acid was more convincing.19–23 However, studies that compared cryotherapy and salicylic acid directly showed no differences in effectiveness.24,25 The Cochrane review called for high-quality trials in primary care to compare the effects of cryotherapy, salicylic acid and placebo.

We conducted a three-arm randomized controlled trial to compare the effectiveness of cryotherapy with liquid nitrogen, topical application of salicylic acid and a wait-and-see approach for the treatment of common and plantar warts in primary care.

7.

Background

Fractures have largely been assessed by their impact on quality of life or health care costs. We conducted this study to evaluate the relation between fractures and mortality.

Methods

A total of 7753 randomly selected people (2187 men and 5566 women) aged 50 years and older from across Canada participated in a 5-year observational cohort study. Incident fractures were identified on the basis of validated self-report and were classified by type (vertebral, pelvic, forearm or wrist, rib, hip and “other”). We subdivided fracture groups by the year in which the fracture occurred during follow-up; those occurring in the fourth and fifth years were grouped together. We examined the relation between the time of the incident fracture and death.

Results

Compared with participants who had no fracture during follow-up, those who had a vertebral fracture in the second year were at increased risk of death (adjusted hazard ratio [HR] 2.7, 95% confidence interval [CI] 1.1–6.6); also at risk were those who had a hip fracture during the first year (adjusted HR 3.2, 95% CI 1.4–7.4). Among women, the risk of death was increased for those with a vertebral fracture during the first year (adjusted HR 3.7, 95% CI 1.1–12.8) or the second year of follow-up (adjusted HR 3.2, 95% CI 1.2–8.1). The risk of death was also increased among women with hip fracture during the first year of follow-up (adjusted HR 3.0, 95% CI 1.0–8.7).

Interpretation

Vertebral and hip fractures are associated with an increased risk of death. Interventions that reduce the incidence of these fractures need to be implemented to improve survival.

Osteoporosis-related fractures are a major health concern, affecting a growing number of individuals worldwide. The burden of fracture has largely been assessed by the impact on health-related quality of life and health care costs.1,2 Fractures can also be associated with death. However, trials that have examined the relation between fractures and mortality have had limitations that may influence their results and the generalizability of the studies, including small samples,3,4 the examination of only 1 type of fracture,4–10 the inclusion of only women,8,11 the enrolment of participants from specific areas (i.e., hospitals or certain geographic regions),3,4,7,8,10,12 the nonrandom selection of participants3–11 and the lack of statistical adjustment for confounding factors that may influence mortality.3,5–7,12

We evaluated the relation between incident fractures and mortality over a 5-year period in a cohort of men and women 50 years of age and older. In addition, we examined whether other characteristics of participants were risk factors for death.

8.

Background:

Morbidity due to cardiovascular disease is high among First Nations people. The extent to which this may be related to the likelihood of coronary angiography is unclear. We examined the likelihood of coronary angiography after acute myocardial infarction (MI) among First Nations and non–First Nations patients.

Methods:

Our study included adults with incident acute MI between 1997 and 2008 in Alberta. We determined the likelihood of angiography among First Nations and non–First Nations patients, adjusted for important confounders, using the Alberta Provincial Project for Outcome Assessment in Coronary Heart Disease (APPROACH) database.

Results:

Of the 46 764 people with acute MI, 1043 (2.2%) were First Nations. First Nations patients were less likely to receive angiography within 1 day after acute MI (adjusted odds ratio [OR] 0.73, 95% confidence interval [CI] 0.62–0.87). Among First Nations and non–First Nations patients who underwent angiography (64.9%), there was no difference in the likelihood of percutaneous coronary intervention (PCI) (adjusted hazard ratio [HR] 0.92, 95% CI 0.83–1.02) or coronary artery bypass grafting (CABG) (adjusted HR 1.03, 95% CI 0.85–1.25). First Nations people had worse survival if they received medical management alone (adjusted HR 1.38, 95% CI 1.07–1.77) or if they underwent PCI (adjusted HR 1.38, 95% CI 1.06–1.80), whereas survival was similar among First Nations and non–First Nations patients who received CABG.

Interpretation:

First Nations people were less likely to undergo angiography after acute MI and experienced worse long-term survival compared with non–First Nations people. Efforts to improve access to angiography for First Nations people may improve outcomes.

Although cardiovascular disease has been decreasing in Canada,1 First Nations people bear a disproportionate burden of the disease. First Nations people in Canada have a 2.5-fold higher prevalence of cardiovascular disease than non–First Nations people,2 with hospital admissions for cardiovascular-related events also increasing.3

The prevalence of cardiovascular disease in First Nations populations is presumed to reflect the prevalence of cardiovascular risk factors.4–7 However, the disproportionate increase in rates of hospital admission suggests that suboptimal management of cardiovascular disease or its risk factors may also influence patient outcomes.2,3 Racial disparities in the quality of cardiovascular care resulting in adverse outcomes have been documented, although most studies have focused on African-American, Hispanic and Asian populations.8,9 As a result, it is unclear whether suboptimal delivery of guideline-recommended treatment contributes to increased cardiovascular morbidity and mortality among First Nations people.10–12

We undertook a population-based study involving adults with incident acute myocardial infarction (MI) to examine the receipt of guideline-recommended coronary angiography among First Nations and non–First Nations patients.10–12 Among patients who underwent angiography, we sought to determine whether there were differences between First Nations and non–First Nations patients in the likelihood of revascularization and long-term survival.

9.

Background

High prevalence of infant macrosomia (up to 36%, the highest in the world) has been reported in some First Nations communities in the Canadian province of Quebec and the eastern area of the province of Ontario. We aimed to assess whether infant macrosomia was associated with elevated risks of perinatal and postneonatal mortality among First Nations people in Quebec.

Methods

We calculated risk ratios (RRs) of perinatal and postneonatal mortality by birthweight for gestational age, comparing births to First Nations women (n = 5193) versus women whose mother tongue is French (n = 653 424, the majority reference group) in Quebec 1991–2000.

Results

The prevalence of infant macrosomia (birthweight for gestational age > 90th percentile) was 27.5% among births to First Nations women, 3.3 times (95% confidence interval [CI] 3.2–3.5) the prevalence (8.3%) among births to women whose mother tongue is French. Risk ratios for perinatal mortality among births to First Nations women were 1.8 (95% CI 1.3–2.5) for births with weight appropriate for gestational age, 4.1 (95% CI 2.4–7.0) for small-for-gestational-age (< 10th percentile) births and < 1 (not significant) for macrosomic births, compared with births among women whose mother tongue is French. The RRs for postneonatal mortality were 4.3 (95% CI 2.7–6.7) for infants with appropriate-for-gestational-age birthweight and 8.3 (95% CI 4.0–17.0) for infants with macrosomia.

Interpretation

Macrosomia was associated with a generally protective effect against perinatal death, but with substantially greater risks of postneonatal death, among births to First Nations women in Quebec versus women whose mother tongue is French.

A trend toward higher birthweights has emerged in recent decades.1–3 Reflected in this trend is a rise in the prevalence of infant macrosomia, commonly defined as either a birthweight greater than 4000 g or a birthweight for gestational age greater than the 90th percentile relative to a fetal growth standard.4–8 Maternal obesity, impaired glucose tolerance and gestational diabetes mellitus are important risk factors for infant macrosomia9,10 and are known to afflict a much higher proportion of people in Aboriginal populations than in the general population.11–14 This is especially true for Aboriginal populations in which a traditional lifestyle has given way to a less physically active, modern lifestyle in recent decades. A high prevalence of infant macrosomia (up to 36%, which, to the best of our knowledge, is the highest in the world) has been reported in some First Nations communities of Quebec and eastern Ontario in Canada.15–17 However, little is known about the implications of this high prevalence for the perinatal and infant health of First Nations people in these regions. We examined whether infant macrosomia was associated with increased risk of perinatal and postneonatal death among First Nations infants in Quebec.
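The headline prevalence ratio of 3.3 follows directly from the two reported proportions. A sketch with the log-scale (Katz) confidence interval; the cell counts below are back-calculated from the abstract's percentages and denominators, so they are approximate:

```python
import math

# Risk (prevalence) ratio with a log-scale 95% CI from cohort counts.
# a/n1 = macrosomic proportion among First Nations births,
# c/n2 = among the French-mother-tongue reference group.

def risk_ratio_ci(a, n1, c, n2, z=1.96):
    rr = (a / n1) / (c / n2)
    se = math.sqrt(1/a - 1/n1 + 1/c - 1/n2)     # Katz SE of log(RR)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

a = round(0.275 * 5193)       # ≈ 1428 macrosomic First Nations births
c = round(0.083 * 653424)     # ≈ 54234 macrosomic reference births
rr, lo, hi = risk_ratio_ci(a, 5193, c, 653424)
print(round(rr, 1))  # → 3.3
```

Because the reference denominator is huge, nearly all the CI width comes from the First Nations cell, which is why the reported interval (3.2–3.5) is so narrow.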

10.
11.
12.

Background

The prevention of head injuries in alpine activities has focused on helmets. However, no systematic review has examined the effect of helmets on head and neck injuries among skiers and snowboarders.

Methods

We searched electronic databases, conference proceedings and reference lists using a combination of the key words “head injury or head trauma,” “helmet” and “skiing or snowboarding.” We included studies that used a control group; compared skiers or snowboarders with and without helmets; and measured at least one objectively quantified outcome (e.g., head injury, and neck or cervical injury).

Results

We included 10 case–control, 1 case–control/case-crossover and 1 cohort study in our analysis. The pooled odds ratio (OR) indicated that skiers and snowboarders with a helmet were significantly less likely than those without a helmet to have a head injury (OR 0.65, 95% confidence interval [CI] 0.55–0.79). The result was similar for studies that used controls without an injury (OR 0.61, 95% CI 0.36–0.92), those that used controls with an injury other than a head or neck injury (OR 0.63, 95% CI 0.52–0.80) and studies that included children under the age of 13 years (OR 0.41, 95% CI 0.27–0.59). Helmets were not associated with an increased risk of neck injury (OR 0.89, 95% CI 0.72–1.09).

Interpretation

Our findings show that helmets reduce the risk of head injury among skiers and snowboarders, with no evidence of an increased risk of neck injury.

Skiing and snowboarding are popular winter activities.1 Estimates from numerous countries indicate that head injuries account for 9% to 19%, and neck injuries for 1% to 4%, of all injuries reported by ski patrols and emergency departments.2–11 Rates of head and neck injuries have been reported between 0.09 and 0.46 per 1000 outings.12 Head and neck injuries are disproportionately represented in cases of severe trauma, and traumatic brain injury is the leading cause of death and serious injury among skiers and snowboarders.13 As far back as 1983, Oh and Schmid recommended mandatory helmet use for children while skiing.14

Many studies of the relation between helmet use and head injuries among skiers and snowboarders have found a protective effect.15–24 It has been suggested that the use of helmets may increase the risk of neck injury in a crash or fall.25 This may be more evident among children because they have a greater head:body ratio than adults, and the additional size and weight of the helmet may increase the risk of neck injury in an otherwise routine fall.26 We conducted a systematic review of the effect of helmets on head and neck injuries among skiers and snowboarders.
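A pooled odds ratio like the 0.65 reported here is typically obtained by inverse-variance weighting of the study-level log odds ratios (the fixed-effect model). A minimal sketch; the three (OR, CI) triples below are hypothetical inputs, not the review's actual studies:

```python
import math

# Fixed-effect (inverse-variance) pooling of study-level odds ratios.
# Each study is given as (OR, ci_low, ci_high); the SE of log(OR) is
# recovered from the CI width. Hypothetical study data for illustration.

def pooled_or(studies, z=1.96):
    num = den = 0.0
    for or_, lo, hi in studies:
        log_or = math.log(or_)
        se = (math.log(hi) - math.log(lo)) / (2 * z)  # back out SE
        w = 1.0 / se**2                               # inverse-variance weight
        num += w * log_or
        den += w
    mean = num / den
    half = z / math.sqrt(den)
    return math.exp(mean), math.exp(mean - half), math.exp(mean + half)

studies = [(0.60, 0.40, 0.90), (0.70, 0.50, 0.98), (0.62, 0.38, 1.01)]
print(tuple(round(x, 2) for x in pooled_or(studies)))
```

Tighter-CI studies get larger weights, so the pooled estimate gravitates toward the most precise studies; a random-effects model would widen the interval if the studies disagreed.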

13.

Background:

Screening for methicillin-resistant Staphylococcus aureus (MRSA) is intended to reduce nosocomial spread by identifying patients colonized by MRSA. Given the widespread use of this screening, we evaluated its potential clinical utility in predicting the resistance of clinical isolates of S. aureus.

Methods:

We conducted a 2-year retrospective cohort study that included patients with documented clinical infection with S. aureus and prior screening for MRSA. We determined test characteristics, including sensitivity and specificity, of screening for predicting the resistance of subsequent S. aureus isolates.

Results:

Of 510 patients included in the study, 53 (10%) had positive results from MRSA screening, and 79 (15%) of infecting isolates were resistant to methicillin. Screening for MRSA predicted methicillin resistance of the infecting isolate with 99% (95% confidence interval [CI] 98%–100%) specificity and 63% (95% CI 52%–74%) sensitivity. When screening swabs were obtained within 48 hours before isolate collection, sensitivity increased to 91% (95% CI 71%–99%) and specificity was 100% (95% CI 97%–100%), yielding a negative likelihood ratio of 0.09 (95% CI 0.01–0.3) and a negative predictive value of 98% (95% CI 95%–100%). The time between swab and isolate collection was a significant predictor of concordance of methicillin resistance in swabs and isolates (odds ratio 6.6, 95% CI 1.6–28.2).

Interpretation:

A positive result from MRSA screening predicted methicillin resistance in a culture-positive clinical infection with S. aureus. Negative results on MRSA screening were most useful for excluding methicillin resistance of a subsequent infection with S. aureus when the screening swab was obtained within 48 hours before collection of the clinical isolate.

Antimicrobial resistance is a global problem. The prevalence of resistant bacteria, including methicillin-resistant Staphylococcus aureus (MRSA), has reached high levels in many countries.1–3 Methicillin resistance in S. aureus is associated with excess mortality, hospital stays and health care costs,3,4 possibly owing to increased virulence or less effective treatments for MRSA compared with methicillin-sensitive S. aureus (MSSA).5

The initial selection of appropriate empirical antibiotic treatment affects mortality, morbidity and potential health care expenditures.6–8 The optimal choice of antibiotics in S. aureus infections is important for 3 major reasons: β-lactam antibiotics have shown improved efficacy over vancomycin and are the ideal treatment for susceptible strains of S. aureus;6 β-lactam antibiotics are ineffective against MRSA, so vancomycin or other newer agents must be used empirically when MRSA is suspected; and unnecessary use of broad-spectrum antibiotics (e.g., vancomycin) can lead to the development of further antimicrobial resistance.9 It is therefore necessary to make informed decisions regarding the selection of empirical antibiotics.10–13 Consideration of a patient’s previous colonization status is important, because colonization predates most hospital- and community-acquired infections.10,14

Universal or targeted surveillance for MRSA has been implemented widely as a means of limiting transmission of this antibiotic-resistant pathogen.15,16 Although results of MRSA screening are not intended to guide empirical treatment, they may offer an additional benefit among patients in whom clinical infection with S. aureus develops. Studies that examined the effects of MRSA carriage on the subsequent likelihood of infection allude to the potential diagnostic benefit of prior screening for MRSA.17,18 Colonization by MRSA at the time of hospital admission is associated with a 13-fold increased risk of subsequent MRSA infection.17,18 Moreover, studies that examined nasal carriage of S. aureus after documented S. aureus bacteremia have shown remarkable concordance between the genotypes of paired colonizing and invasive strains (82%–94%).19,20 The purpose of our study was to identify the usefulness of prior screening for MRSA for predicting methicillin resistance in culture-positive S. aureus infections.

14.

Background

Systemic inflammation and dysregulated immune function in chronic obstructive pulmonary disease (COPD) are hypothesized to predispose patients to the development of herpes zoster. However, the risk of herpes zoster among patients with COPD has not been documented. We therefore aimed to investigate the risk of herpes zoster among patients with COPD.

Methods

We conducted a cohort study using data from the Taiwan Longitudinal Health Insurance Database. We performed Cox regressions to compare the hazard ratio (HR) of herpes zoster in the COPD cohort and in an age- and sex-matched comparison cohort. We divided the patients with COPD into three groups according to use of steroid medications and performed a further analysis to examine the risk of herpes zoster.

Results

The study included 8486 patients with COPD and 33 944 matched control patients. After adjustment for potential confounding factors, patients with COPD were more likely to have an incident episode of herpes zoster (adjusted HR 1.68, 95% confidence interval [CI] 1.45–1.95). Compared with the comparison cohort, the adjusted HR of herpes zoster was 1.67 (95% CI 1.43–1.96) for patients with COPD not taking steroid medications. The adjusted HR was slightly greater for patients with COPD using inhaled corticosteroids only (2.09, 95% CI 1.38–3.16) and was greatest for patients with COPD using oral steroids (3.00, 95% CI 2.40–3.75).

Interpretation

Patients with COPD were at increased risk of herpes zoster relative to the general population. The relative risk of herpes zoster was greatest for patients with COPD using oral steroids.

Herpes zoster is caused by reactivation of latent varicella-zoster virus residing in sensory ganglia after an earlier episode of varicella.1 It is characterized by a painful vesicular dermatomal rash and is commonly complicated by chronic pain (postherpetic neuralgia), resulting in reduced quality of life and functional disability to a degree comparable to that experienced by patients with congestive heart failure, diabetes mellitus and major depression.1,2 Patients with herpes zoster experience more substantial role limitations resulting from emotional and physical problems than do patients with congestive heart failure or diabetes.3 Pain scores for postherpetic neuralgia have been shown to be as high as those for chronic pain from osteoarthritis and rheumatoid arthritis.3 Although aging is the best-known risk factor for herpes zoster, people with diseases associated with impaired immunity, such as malignancy, HIV infection, diabetes and rheumatic diseases, are also at higher risk.4,5

Chronic obstructive pulmonary disease (COPD) is characterized by progressive airflow limitation associated with an abnormal inflammatory response by the small airways and alveoli to inhaled particles and pollutants.6 Disruption of local defence systems (e.g., damage to the innate immune system, impaired mucociliary clearance) predisposes patients with COPD to respiratory tract infections. Each infection can cause exacerbation of COPD and further deterioration of lung function, which in turn increases predisposition to infection.7,8

There is increasing evidence that COPD is an autoimmune disease, with chronic systemic inflammation involving more than just the airways and lungs.6 Given that various immune-mediated diseases (e.g., rheumatoid arthritis, inflammatory bowel disease) have been reported to be associated with an increased risk of herpes zoster,4,9,10 it is reasonable to hypothesize that the immune dysregulation found in COPD may put patients at higher risk of developing herpes zoster. In addition, inhaled or systemic corticosteroids used for the management of COPD can increase susceptibility to herpes zoster by suppressing normal immune function.11 However, data are limited regarding the risk of herpes zoster among patients with COPD.

The goal of our study was to investigate whether patients with COPD have a higher incidence of herpes zoster than the general population. In addition, we aimed to examine the risk of herpes zoster with and without steroid therapy among patients with COPD relative to the general population.
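The hazard ratios reported in this abstract come from a Cox model, where the estimate and its confidence limits live on the log scale. As a sketch of that relationship, the reported adjusted HR of 1.68 (95% CI 1.45–1.95) can be decomposed into an implied coefficient and standard error and then round-tripped back to the interval; this is a consistency check on the published numbers, not a reproduction of the model.

```python
import math

# Back-calculate the log-scale quantities implied by a reported hazard
# ratio and its 95% CI (here the paper's adjusted HR for herpes zoster
# in COPD: 1.68, 95% CI 1.45-1.95).
hr, ci_low, ci_high = 1.68, 1.45, 1.95

beta = math.log(hr)                                        # implied Cox coefficient
se = (math.log(ci_high) - math.log(ci_low)) / (2 * 1.96)   # implied standard error

# Re-deriving the interval from beta and se should round-trip:
lo = math.exp(beta - 1.96 * se)
hi = math.exp(beta + 1.96 * se)
```

Because a symmetric Wald interval on the log scale becomes asymmetric around the HR itself, the reported interval (1.45–1.95) is not centred on 1.68, which is expected rather than an error.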

15.

Background:

Little evidence exists on the effect of an energy-unrestricted healthy diet on metabolic syndrome. We evaluated the long-term effect of ad libitum Mediterranean diets on the incidence and reversion of metabolic syndrome.

Methods:

We performed a secondary analysis of the PREDIMED trial — a multicentre, randomized trial done between October 2003 and December 2010 that involved men and women (age 55–80 yr) at high risk for cardiovascular disease. Participants were randomly assigned to 1 of 3 dietary interventions: a Mediterranean diet supplemented with extra-virgin olive oil, a Mediterranean diet supplemented with nuts or advice on following a low-fat diet (the control group). The interventions did not include increased physical activity or weight loss as a goal. We analyzed available data from 5801 participants. We determined the effect of diet on incidence and reversion of metabolic syndrome using Cox regression analysis to calculate hazard ratios (HRs) and 95% confidence intervals (CIs).

Results:

Over 4.8 years of follow-up, metabolic syndrome developed in 960 (50.0%) of the 1919 participants who did not have the condition at baseline. The risk of developing metabolic syndrome did not differ between participants assigned to the control diet and those assigned to either of the Mediterranean diets (control v. olive oil HR 1.10, 95% CI 0.94–1.30, p = 0.231; control v. nuts HR 1.08, 95% CI 0.92–1.27, p = 0.3). Reversion occurred in 958 (28.2%) of the 3392 participants who had metabolic syndrome at baseline. Compared with the control group, participants on either Mediterranean diet were more likely to undergo reversion (control v. olive oil HR 1.35, 95% CI 1.15–1.58, p < 0.001; control v. nuts HR 1.28, 95% CI 1.08–1.51, p < 0.001). Participants in the group receiving olive oil supplementation showed significant decreases in both central obesity and high fasting glucose (p = 0.02); participants in the group supplemented with nuts showed a significant decrease in central obesity.

Interpretation:

A Mediterranean diet supplemented with either extra-virgin olive oil or nuts is not associated with the onset of metabolic syndrome, but such diets are more likely to cause reversion of the condition. An energy-unrestricted Mediterranean diet may be useful in reducing the risks of central obesity and hyperglycemia in people at high risk of cardiovascular disease. Trial registration: ClinicalTrials.gov, no. ISRCTN35739639.

Metabolic syndrome is a cluster of any 3 or more of the following related cardiometabolic risk factors: central obesity (determined by waist circumference), hypertension, hypertriglyceridemia, low plasma high-density lipoprotein (HDL) cholesterol levels and hyperglycemia. Having the syndrome increases a person's risk for type 2 diabetes and cardiovascular disease.1,2 In addition, the condition is associated with increased morbidity and all-cause mortality.1,3–5 The worldwide prevalence of metabolic syndrome in adults approaches 25%6–8 and increases with age,7 especially among women,8,9 making it an important public health issue.

Several studies have shown that lifestyle modifications,10 such as increased physical activity,11 adherence to a healthy diet12,13 or weight loss,14–16 are associated with reversion of the metabolic syndrome and its components. However, little information exists as to whether changes in the overall dietary pattern without weight loss might also be effective in preventing and managing the condition.

The Mediterranean diet is recognized as one of the healthiest dietary patterns. It has shown benefits in patients with cardiovascular disease17,18 and in the prevention and treatment of related conditions, such as diabetes,19–21 hypertension22,23 and metabolic syndrome.24 Several cross-sectional25–29 and prospective30–32 epidemiologic studies have suggested an inverse association between adherence to the Mediterranean diet and the prevalence or incidence of metabolic syndrome. Evidence from clinical trials has shown that an energy-restricted Mediterranean diet33 or adopting a Mediterranean diet after weight loss34 has a beneficial effect on metabolic syndrome. However, these studies did not determine whether the effect could be attributed to the weight loss or to the diets themselves.

Seminal data from the PREDIMED (PREvención con DIeta MEDiterránea) study suggested that adherence to a Mediterranean diet supplemented with nuts reversed metabolic syndrome more so than advice to follow a low-fat diet.35 However, that report was based on data from only 1224 participants followed for 1 year. We have analyzed the data from the final PREDIMED cohort after a median follow-up of 4.8 years to determine the long-term effects of a Mediterranean diet on metabolic syndrome.
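The headline proportions in the Results section follow directly from the counts given in the abstract; a quick arithmetic check (using only numbers stated in the text) confirms they are internally consistent.

```python
# Arithmetic check of the reported PREDIMED proportions; the counts are
# taken directly from the abstract.

incidence = 960 / 1919   # new metabolic syndrome among participants free of it at baseline
reversion = 958 / 3392   # reversion among participants with the syndrome at baseline

print(f"incidence: {incidence:.1%}")  # matches the reported 50.0%
print(f"reversion: {reversion:.1%}")  # matches the reported 28.2%
```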

17.

Background:

Brief interventions delivered by family physicians to address excessive alcohol use among adult patients are effective. We conducted a study to determine whether such an intervention would be similarly effective in reducing binge drinking and excessive cannabis use among young people.

Methods:

We conducted a cluster randomized controlled trial involving 33 family physicians in Switzerland. Physicians in the intervention group received training in delivering a brief intervention to young people during the consultation in addition to usual care. Physicians in the control group delivered usual care only. Consecutive patients aged 15–24 years were recruited from each practice and, before the consultation, completed a confidential questionnaire about their general health and substance use. Patients were followed up at 3, 6 and 12 months after the consultation. The primary outcome measure was self-reported excessive substance use (≥ 1 episode of binge drinking, or ≥ 1 joint of cannabis per week, or both) in the past 30 days.

Results:

Of the 33 participating physicians, 17 were randomly allocated to the intervention group and 16 to the control group. Of the 594 participating patients, 279 (47.0%) identified themselves as binge drinkers or excessive cannabis users, or both, at baseline. Excessive substance use did not differ significantly between patients whose physicians were in the intervention group and those whose physicians were in the control group at any of the follow-up points (odds ratio [OR] and 95% confidence interval [CI] at 3 months: 0.9 [0.6–1.4]; at 6 mo: 1.0 [0.6–1.6]; and at 12 mo: 1.1 [0.7–1.8]). The differences between groups were also nonsignificant after we restricted the analysis to patients who reported excessive substance use at baseline (OR 1.6, 95% CI 0.9–2.8, at 3 mo; OR 1.7, 95% CI 0.9–3.2, at 6 mo; and OR 1.9, 95% CI 0.9–4.0, at 12 mo).

Interpretation:

Training family physicians to use a brief intervention to address excessive substance use among young people was not effective in reducing binge drinking and excessive cannabis use in this patient population. Trial registration: Australian New Zealand Clinical Trials Registry, no. ACTRN12608000432314.

Most health-compromising behaviours begin in adolescence.1 Interventions to address these behaviours early are likely to bring long-lasting benefits.2 Harmful use of alcohol is a leading factor associated with premature death and disability worldwide, with a disproportionately high impact on young people (aged 10–24 yr).3,4 Similarly, early cannabis use can have adverse consequences that extend into adulthood.5–8

In adolescence and early adulthood, binge drinking on at least a monthly basis is associated with an increased risk of adverse outcomes later in life.9–12 Although any cannabis use is potentially harmful, weekly use represents a threshold in adolescence related to an increased risk of cannabis (and tobacco) dependence in adulthood.13 Binge drinking affects 30%–50%, and excessive cannabis use about 10%, of the adolescent and young adult population in Europe and the United States.10,14,15

Reducing substance-related harm involves multisectoral approaches, including promotion of healthy child and adolescent development, regulatory policies and early treatment interventions.16 Family physicians can add to the public health messages by personalizing their content within brief interventions.17,18 There is evidence that brief interventions can encourage young people to reduce substance use, yet most studies have been conducted in community settings (mainly educational), emergency services or specialized addiction clinics.1,16 Studies aimed at adult populations have shown favourable effects of brief alcohol interventions, and to some extent brief cannabis interventions, in primary care.19–22 These interventions have been recommended for adolescent populations.4,5,16 Yet young people have different modes of substance use and communication styles that may limit the extent to which evidence from adult studies can apply to them.

Recently, a systematic review of brief interventions to reduce alcohol use in adolescents identified only 1 randomized controlled trial in primary care.23 The tested intervention, which involved audio self-assessment and was not provided by family physicians, was ineffective in reducing alcohol use in exposed adolescents.24 Sanci and colleagues showed that training family physicians to address health-risk behaviours among adolescents was effective in improving provider performance, but the extent to which this translates into improved patient outcomes remains unknown.25,26 Two nonrandomized studies suggested that screening for substance use and brief advice by family physicians could favour reduced alcohol and cannabis use among adolescents,27,28 but evidence from randomized trials is lacking.29

We conducted the PRISM-Ado (Primary care Intervention Addressing Substance Misuse in Adolescents) trial, a cluster randomized controlled trial of the effectiveness of training family physicians to deliver a brief intervention to address binge drinking and excessive cannabis use among young people.
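The trial's effect estimates are odds ratios with Wald confidence intervals. As a sketch of how such an OR and its 95% CI are derived from a 2×2 table: the cell counts below are hypothetical placeholders (the abstract reports only the ORs, not the underlying cells).

```python
import math

# Hypothetical 2x2 table (NOT the trial's data): rows are study arms,
# columns are excessive substance use at follow-up (yes / no).
a, b = 30, 70   # intervention arm: using / not using
c, d = 20, 80   # control arm:      using / not using

or_ = (a * d) / (b * c)                           # odds ratio
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)      # SE of log(OR), Woolf's formula
lo = math.exp(math.log(or_) - 1.96 * se_log_or)   # lower 95% limit
hi = math.exp(math.log(or_) + 1.96 * se_log_or)   # upper 95% limit
```

An interval that spans 1.0, as here and at every follow-up point in the trial, is what "nonsignificant" means in the Results. (In a cluster randomized design the actual analysis must also account for clustering by physician, which this single-table sketch ignores.)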

18.

Background

Little is known about the incidence and causes of heparin-induced skin lesions. The 2 most commonly reported causes of heparin-induced skin lesions are immune-mediated heparin-induced thrombocytopenia and delayed-type hypersensitivity reactions.

Methods

We prospectively examined consecutive patients who received subcutaneous heparin (most often enoxaparin or nadroparin) for the presence of heparin-induced skin lesions. If such lesions were identified, we performed a skin biopsy, platelet count measurements, and antiplatelet-factor 4 antibody and allergy testing.

Results

We enrolled 320 patients. In total, 24 patients (7.5%, 95% confidence interval [CI] 4.7%–10.6%) had heparin-induced skin lesions. Delayed-type hypersensitivity reactions were identified as the cause in all 24 patients. One patient with histopathologic evidence of delayed-type hypersensitivity tested positive for antiplatelet-factor 4 antibodies. We identified the following risk factors for heparin-induced skin lesions: a body mass index greater than 25 (odds ratio [OR] 4.6, 95% CI 1.7–15.3), duration of heparin therapy longer than 9 days (OR 5.9, 95% CI 1.9–26.3) and female sex (OR 3.0, 95% CI 1.1–8.8).

Interpretation

Heparin-induced skin lesions are relatively common, have identifiable risk factors and are most often caused by a delayed-type hypersensitivity reaction (type IV allergic response). (ClinicalTrials.gov trial register no. NCT00510432.)

Heparin has been used as an anticoagulant for over 60 years.1 Well-known adverse effects of heparin therapy are bleeding, osteoporosis, hair loss, and immune and nonimmune heparin-induced thrombocytopenia. Heparin-induced skin lesions are increasingly reported, but their incidence is unknown.2–4 Heparin-induced skin lesions may be caused by at least 5 mechanisms: delayed-type (type IV) hypersensitivity responses,2,4–6 immune-mediated thrombocytopenia,3 type I allergic reactions,7,8 skin necrosis9 and pustulosis.10

Heparin-induced skin lesions may indicate the presence of life-threatening heparin-induced thrombocytopenia,11 even in the absence of thrombocytopenia.3 Given the rising number of reports of heparin-induced skin lesions, the absence of data on their incidence and causes, and the importance of correctly diagnosing this condition, we sought to determine the incidence of heparin-induced skin lesions.
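The incidence estimate in this abstract, 24/320 = 7.5% (95% CI 4.7%–10.6%), is consistent with an exact binomial interval. As a sketch, the simple Wald approximation below gives a similar, slightly narrower interval; it is shown for illustration, not as the method the authors used.

```python
import math

# Reported data: 24 of 320 patients had heparin-induced skin lesions.
events, n = 24, 320

p = events / n                         # point estimate: 0.075
se = math.sqrt(p * (1 - p) / n)        # binomial standard error
wald_lo = p - 1.96 * se                # Wald 95% lower limit
wald_hi = p + 1.96 * se                # Wald 95% upper limit
```

The Wald interval works out to roughly 4.6%–10.4%, close to the published 4.7%–10.6%; the small discrepancy is expected because exact (Clopper-Pearson) intervals are wider and asymmetric at proportions this far from 50%.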

20.

Background

Recent studies have reported a high prevalence of relative adrenal insufficiency in patients with liver cirrhosis. However, the effect of corticosteroid replacement on mortality in this high-risk group remains unclear. We examined the effect of low-dose hydrocortisone in patients with cirrhosis who presented with septic shock.

Methods

We enrolled patients with cirrhosis and septic shock aged 18 years or older in a randomized, double-blind, placebo-controlled trial. Relative adrenal insufficiency was defined as a serum cortisol increase of less than 250 nmol/L (9 μg/dL) from baseline after stimulation with 250 μg of intravenous corticotropin. Patients were assigned to receive 50 mg of intravenous hydrocortisone or placebo every 6 hours until hemodynamic stability was achieved, followed by steroid tapering over 8 days. The primary outcome was 28-day all-cause mortality.

Results

The trial was stopped for futility at interim analysis after 75 patients were enrolled. Relative adrenal insufficiency was diagnosed in 76% of patients. Compared with the placebo group (n = 36), patients in the hydrocortisone group (n = 39) had a significant reduction in vasopressor doses and higher rates of shock reversal (relative risk [RR] 1.58, 95% confidence interval [CI] 0.98–2.55, p = 0.05). Hydrocortisone use was not associated with a reduction in 28-day mortality (RR 1.17, 95% CI 0.92–1.49, p = 0.19) but was associated with an increase in shock relapse (RR 2.58, 95% CI 1.04–6.45, p = 0.03) and gastrointestinal bleeding (RR 3.00, 95% CI 1.08–8.36, p = 0.02).

Interpretation

Relative adrenal insufficiency was very common in patients with cirrhosis presenting with septic shock. Despite initial favourable effects on hemodynamic parameters, hydrocortisone therapy did not reduce mortality and was associated with an increase in adverse effects. (Current Controlled Trials registry no. ISRCTN99675218.)

Cirrhosis is a leading cause of death worldwide,1 often with septic shock as the terminal event.2–9 Relative adrenal insufficiency shares features of distributive hyperdynamic shock with both cirrhosis and sepsis10,11 and has increasingly been reported to coexist with both conditions.11,12 The effect of low-dose hydrocortisone therapy on survival of critically ill patients with septic shock in general remains controversial, with conflicting results from randomized controlled trials13–17 and meta-analyses.18,19 The effect of hydrocortisone therapy on mortality among patients with cirrhosis, a group known to be at high risk for relative adrenal insufficiency, has not been studied and hence was the objective of our study.
