Similar articles (20 results)
1.

Background:

Morbidity due to cardiovascular disease is high among First Nations people. The extent to which this may be related to the likelihood of coronary angiography is unclear. We examined the likelihood of coronary angiography after acute myocardial infarction (MI) among First Nations and non–First Nations patients.

Methods:

Our study included adults with incident acute MI between 1997 and 2008 in Alberta. We determined the likelihood of angiography among First Nations and non–First Nations patients, adjusted for important confounders, using the Alberta Provincial Project for Outcome Assessment in Coronary Heart Disease (APPROACH) database.

Results:

Of the 46 764 people with acute MI, 1043 (2.2%) were First Nations. First Nations patients were less likely to receive angiography within 1 day after acute MI (adjusted odds ratio [OR] 0.73, 95% confidence interval [CI] 0.62–0.87). Among First Nations and non–First Nations patients who underwent angiography (64.9%), there was no difference in the likelihood of percutaneous coronary intervention (PCI) (adjusted hazard ratio [HR] 0.92, 95% CI 0.83–1.02) or coronary artery bypass grafting (CABG) (adjusted HR 1.03, 95% CI 0.85–1.25). First Nations people had worse survival if they received medical management alone (adjusted HR 1.38, 95% CI 1.07–1.77) or if they underwent PCI (adjusted HR 1.38, 95% CI 1.06–1.80), whereas survival was similar among First Nations and non–First Nations patients who received CABG.

Interpretation:

First Nations people were less likely to undergo angiography after acute MI and experienced worse long-term survival compared with non–First Nations people. Efforts to improve access to angiography for First Nations people may improve outcomes.

Although cardiovascular disease has been decreasing in Canada,1 First Nations people have a disproportionate burden of the disease. First Nations people in Canada have a 2.5-fold higher prevalence of cardiovascular disease than non–First Nations people,2 with hospital admissions for cardiovascular-related events also increasing.3 The prevalence of cardiovascular disease in First Nations populations is presumed to reflect the prevalence of cardiovascular risk factors.4–7 However, the disproportionate increase in rates of hospital admission suggests that suboptimal management of cardiovascular disease or its risk factors may also influence patient outcomes.2,3 Racial disparities in the quality of cardiovascular care resulting in adverse outcomes have been documented, although most studies have focused on African-American, Hispanic and Asian populations.8,9 As a result, it is unclear whether suboptimal delivery of guideline-recommended treatment contributes to increased cardiovascular morbidity and mortality among First Nations people.10–12

We undertook a population-based study involving adults with incident acute myocardial infarction (MI) to examine the receipt of guideline-recommended coronary angiography among First Nations and non–First Nations patients.10–12 Among patients who underwent angiography, we sought to determine whether there were differences between First Nations and non–First Nations patients in the likelihood of revascularization and long-term survival.
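The adjusted odds ratios above are conventionally computed on the log-odds scale and exponentiated, with a Wald confidence interval. A minimal sketch of that arithmetic, using a hypothetical coefficient and standard error chosen only so the output matches the reported OR of 0.73 (95% CI 0.62–0.87); these are not the study's actual model estimates:

```python
import math

def or_with_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient and its standard error
    into an odds ratio with a Wald 95% confidence interval."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical coefficient and SE (chosen to reproduce OR 0.73, CI 0.62-0.87):
or_, lo, hi = or_with_ci(beta=-0.315, se=0.087)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```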

2.
Background:

Otitis media with effusion is a common problem that lacks an evidence-based nonsurgical treatment option. We assessed the clinical effectiveness of treatment with a nasal balloon device in a primary care setting.

Methods:

We conducted an open, pragmatic randomized controlled trial set in 43 family practices in the United Kingdom. Children aged 4–11 years with a recent history of ear symptoms and otitis media with effusion in 1 or both ears, confirmed by tympanometry, were allocated to receive either autoinflation 3 times daily for 1–3 months plus usual care or usual care alone. Clearance of middle-ear fluid at 1 and 3 months was assessed by experts masked to allocation.

Results:

Of 320 children enrolled, those receiving autoinflation were more likely than controls to have normal tympanograms at 1 month (47.3% [62/131] v. 35.6% [47/132]; adjusted relative risk [RR] 1.36, 95% confidence interval [CI] 0.99 to 1.88) and at 3 months (49.6% [62/125] v. 38.3% [46/120]; adjusted RR 1.37, 95% CI 1.03 to 1.83; number needed to treat = 9). Autoinflation produced greater improvements in ear-related quality of life (adjusted between-group difference in change from baseline in OMQ-14 [an ear-related measure of quality of life] score −0.42, 95% CI −0.63 to −0.22). Compliance was 89% at 1 month and 80% at 3 months. Adverse events were mild, infrequent and comparable between groups.

Interpretation:

Autoinflation in children aged 4–11 years with otitis media with effusion is feasible in primary care and effective both in clearing effusions and in improving symptoms and ear-related child and parent quality of life. Trial registration: ISRCTN, No. 55208702.

Otitis media with effusion, also known as glue ear, is an accumulation of fluid in the middle ear without symptoms or signs of an acute ear infection. It is often associated with viral infection.1–3 The prevalence rises to 46% in children aged 4–5 years,4 when hearing difficulty, other ear-related symptoms and broader developmental concerns often bring the condition to medical attention.3,5,6 Middle-ear fluid is associated with conductive hearing losses of about 15–45 dB HL.7 Resolution is clinically unpredictable,8–10 with about a third of cases showing recurrence.11 In the United Kingdom, about 200 000 children with the condition are seen annually in primary care.12,13 Research suggests some children seen in primary care are as badly affected as those seen in hospital.7,9,14,15 In the United States, there were 2.2 million diagnosed episodes in 2004, costing an estimated $4.0 billion.16 Rates of ventilation tube surgery vary between countries,17–19 with a declining trend in the UK.20

Initial clinical management consists of reasonable temporizing or delay before considering surgery.13 Unfortunately, all available medical treatments for otitis media with effusion, such as antibiotics, antihistamines, decongestants and intranasal steroids, are ineffective, have unwanted effects and therefore cannot be recommended.21–23 Not only are antibiotics ineffective, but resistance to them poses a major threat to public health.24,25 Although surgery is effective for a carefully selected minority,13,26,27 a simple, low-cost, nonsurgical treatment option could benefit a much larger group of symptomatic children by addressing legitimate clinical concerns without incurring excessive delays. Autoinflation using a nasal balloon device is a low-cost intervention with the potential to be used more widely in primary care, but current evidence of its effectiveness is limited to several small hospital-based trials28 that found a higher rate of tympanometric resolution of ear fluid at 1 month.29–31 Evidence of the feasibility and effectiveness of autoinflation to inform wider clinical use is lacking.13,28 We therefore report the findings of a large pragmatic trial of the clinical effectiveness of nasal balloon autoinflation in a spectrum of children with clinically confirmed otitis media with effusion identified from primary care.
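The unadjusted version of the trial's 3-month result can be recomputed directly from the counts reported above (62/125 v. 46/120). The published RR of 1.37 is covariate-adjusted, so the raw ratio comes out slightly lower, while the number needed to treat rounds to the reported 9:

```python
def risk_ratio_and_nnt(events_tx, n_tx, events_ctl, n_ctl):
    """Unadjusted risk ratio and number needed to treat from 2x2 trial counts."""
    p_tx, p_ctl = events_tx / n_tx, events_ctl / n_ctl
    rr = p_tx / p_ctl
    nnt = 1 / (p_tx - p_ctl)  # assumes the treatment-group proportion is higher
    return rr, nnt

# 3-month tympanometric outcomes as reported in the abstract:
rr, nnt = risk_ratio_and_nnt(62, 125, 46, 120)
# rr ~ 1.29 unadjusted (published adjusted RR 1.37); nnt ~ 8.9, i.e. 9
```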

3.

Background

The pathogenesis of appendicitis is unclear. We evaluated whether exposure to air pollution was associated with an increased incidence of appendicitis.

Methods

We identified 5191 adults who had been admitted to hospital with appendicitis between Apr. 1, 1999, and Dec. 31, 2006. The air pollutants studied were ozone, nitrogen dioxide, sulfur dioxide, carbon monoxide, and suspended particulate matter of less than 10 μm and less than 2.5 μm in diameter. We estimated the odds of appendicitis relative to short-term increases in concentrations of selected pollutants, alone and in combination, after controlling for temperature and relative humidity as well as the effects of age, sex and season.

Results

An increase in the interquartile range of the 5-day average of ozone was associated with appendicitis (odds ratio [OR] 1.14, 95% confidence interval [CI] 1.03–1.25). In summer (July–August), the effects were most pronounced for ozone (OR 1.32, 95% CI 1.10–1.57), sulfur dioxide (OR 1.30, 95% CI 1.03–1.63), nitrogen dioxide (OR 1.76, 95% CI 1.20–2.58), carbon monoxide (OR 1.35, 95% CI 1.01–1.80) and particulate matter less than 10 μm in diameter (OR 1.20, 95% CI 1.05–1.38). We observed a significant effect of the air pollutants in the summer months among men but not among women (e.g., OR for increase in the 5-day average of nitrogen dioxide 2.05, 95% CI 1.21–3.47, among men and 1.48, 95% CI 0.85–2.59, among women). The double-pollutant model of exposure to ozone and nitrogen dioxide in the summer months was associated with attenuation of the effects of ozone (OR 1.22, 95% CI 1.01–1.48) and nitrogen dioxide (OR 1.48, 95% CI 0.97–2.24).

Interpretation

Our findings suggest that some cases of appendicitis may be triggered by short-term exposure to air pollution. If these findings are confirmed, measures to improve air quality may help to decrease rates of appendicitis.

Appendicitis entered the medical vernacular in 1886.1 Since then, the prevailing theory of its pathogenesis has implicated obstruction of the appendiceal orifice by a fecalith or lymphoid hyperplasia.2 However, this notion does not completely account for variations in incidence observed by age,3,4 sex,3,4 ethnic background,3,4 family history,5 temporal–spatial clustering6 and seasonality,3,4 nor does it completely explain the trends in incidence of appendicitis in developed and developing nations.3,7,8

The incidence of appendicitis increased dramatically in industrialized nations in the 19th century and the early part of the 20th century.1 Without explanation, it decreased in the middle and latter part of the 20th century.3 The decrease coincided with legislation to improve air quality. For example, after the United States Clean Air Act was passed in 1970,9 the incidence of appendicitis decreased by 14.6% from 1970 to 1984.3 Likewise, a 36% drop in incidence was reported in the United Kingdom between 1975 and 199410 after legislation was passed in 1956 and 1968 to improve air quality and in the 1970s to control industrial sources of air pollution. Furthermore, appendicitis is less common in developing nations; however, as these countries become more industrialized, the incidence of appendicitis has been increasing.7

Air pollution is known to be a risk factor for multiple conditions, to exacerbate disease states and to increase all-cause mortality.11 It has a direct effect on pulmonary diseases such as asthma11 and on nonpulmonary diseases including myocardial infarction, stroke and cancer.11–13 Inflammation induced by exposure to air pollution contributes to some adverse health effects.14–17 Similar to the effects of air pollution, a proinflammatory response has been associated with appendicitis.18–20

We conducted a case–crossover study involving a population-based cohort of patients admitted to hospital with appendicitis to determine whether short-term increases in concentrations of selected air pollutants were associated with hospital admission because of appendicitis.
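Odds ratios in studies like this one are usually expressed per interquartile-range (IQR) increase in pollutant concentration: the per-unit log-odds coefficient from the model is multiplied by the IQR before exponentiating. A sketch with hypothetical values (the abstract does not report the underlying coefficients or IQRs):

```python
import math

def or_per_increment(beta_per_unit, increment):
    """Rescale a per-unit log-odds coefficient to a larger increment
    (e.g., the interquartile range) before exponentiating to an OR."""
    return math.exp(beta_per_unit * increment)

# Hypothetical: beta = 0.0066 per ppb of ozone, IQR = 20 ppb
or_per_iqr = or_per_increment(0.0066, 20)  # ~1.14 per IQR increase
```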

4.

Background

Cryotherapy is widely used for the treatment of cutaneous warts in primary care. However, evidence favours salicylic acid application. We compared the effectiveness of these treatments as well as a wait-and-see approach.

Methods

Consecutive patients with new cutaneous warts were recruited in 30 primary care practices in the Netherlands between May 1, 2006, and Jan. 26, 2007. We randomly allocated eligible patients to one of three groups: cryotherapy with liquid nitrogen every two weeks, self-application of salicylic acid daily or a wait-and-see approach. The primary outcome was the proportion of participants whose warts were all cured at 13 weeks. Analysis was on an intention-to-treat basis. Secondary outcomes included treatment adherence, side effects and treatment satisfaction. Research nurses assessed outcomes during home visits at 4, 13 and 26 weeks.

Results

Of the 250 participants (age 4 to 79 years), 240 were included in the analysis at 13 weeks (loss to follow-up 4%). Cure rates were 39% (95% confidence interval [CI] 29%–51%) in the cryotherapy group, 24% (95% CI 16%–35%) in the salicylic acid group and 16% (95% CI 9.5%–25%) in the wait-and-see group. Differences in effectiveness were most pronounced among participants with common warts (n = 116): cure rates were 49% (95% CI 34%–64%) in the cryotherapy group, 15% (95% CI 7%–30%) in the salicylic acid group and 8% (95% CI 3%–21%) in the wait-and-see group. Cure rates among the participants with plantar warts (n = 124) did not differ significantly between treatment groups.

Interpretation

For common warts, cryotherapy was the most effective therapy in primary care. For plantar warts, we found no clinically relevant difference in effectiveness between cryotherapy, topical application of salicylic acid or a wait-and-see approach after 13 weeks. (Trial registration no. ISRCTN42730629)

Cutaneous warts are common.1–3 Up to one-third of primary school children have warts, of which two-thirds resolve within two years.4,5 Because warts frequently cause discomfort,6 2% of the general population and 6% of school-aged children present with warts to their family physician each year.7,8 The usual treatment is cryotherapy with liquid nitrogen or, less frequently, topical application of salicylic acid.9–12 Some physicians choose a wait-and-see approach because of the benign natural course of warts and the risk of side effects of treatment.10,11

A recent Cochrane review of treatments for cutaneous warts concluded that the available studies were small, poorly designed or limited to dermatology outpatients.10,11 Evidence on cryotherapy was contradictory,13–18 whereas the evidence on salicylic acid was more convincing.19–23 However, studies that compared cryotherapy and salicylic acid directly showed no differences in effectiveness.24,25 The Cochrane review called for high-quality trials in primary care to compare the effects of cryotherapy, salicylic acid and placebo.

We conducted a three-arm randomized controlled trial to compare the effectiveness of cryotherapy with liquid nitrogen, topical application of salicylic acid and a wait-and-see approach for the treatment of common and plantar warts in primary care.
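Cure-rate confidence intervals in small treatment arms, like those reported above, are often computed with the Wilson score method rather than the normal approximation. A sketch with hypothetical counts (the abstract does not give exact per-arm denominators for the 13-week cure rates):

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# Hypothetical arm of 100 participants with 39 cured:
lo, hi = wilson_ci(39, 100)  # roughly (0.30, 0.49)
```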

5.
6.

Background

Fractures have largely been assessed by their impact on quality of life or health care costs. We conducted this study to evaluate the relation between fractures and mortality.

Methods

A total of 7753 randomly selected people (2187 men and 5566 women) aged 50 years and older from across Canada participated in a 5-year observational cohort study. Incident fractures were identified on the basis of validated self-report and were classified by type (vertebral, pelvic, forearm or wrist, rib, hip and “other”). We subdivided fracture groups by the year in which the fracture occurred during follow-up; those occurring in the fourth and fifth years were grouped together. We examined the relation between the time of the incident fracture and death.

Results

Compared with participants who had no fracture during follow-up, those who had a vertebral fracture in the second year were at increased risk of death (adjusted hazard ratio [HR] 2.7, 95% confidence interval [CI] 1.1–6.6); also at risk were those who had a hip fracture during the first year (adjusted HR 3.2, 95% CI 1.4–7.4). Among women, the risk of death was increased for those with a vertebral fracture during the first year (adjusted HR 3.7, 95% CI 1.1–12.8) or the second year of follow-up (adjusted HR 3.2, 95% CI 1.2–8.1). The risk of death was also increased among women with hip fracture during the first year of follow-up (adjusted HR 3.0, 95% CI 1.0–8.7).

Interpretation

Vertebral and hip fractures are associated with an increased risk of death. Interventions that reduce the incidence of these fractures need to be implemented to improve survival.

Osteoporosis-related fractures are a major health concern, affecting a growing number of individuals worldwide. The burden of fracture has largely been assessed by the impact on health-related quality of life and health care costs.1,2 Fractures can also be associated with death. However, studies that have examined the relation between fractures and mortality have had limitations that may influence their results and generalizability, including small samples,3,4 the examination of only 1 type of fracture,4–10 the inclusion of only women,8,11 the enrolment of participants from specific settings (i.e., hospitals or certain geographic regions),3,4,7,8,10,12 the nonrandom selection of participants3–11 and the lack of statistical adjustment for confounding factors that may influence mortality.3,5–7,12

We evaluated the relation between incident fractures and mortality over a 5-year period in a cohort of men and women 50 years of age and older. In addition, we examined whether other characteristics of participants were risk factors for death.
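When only a hazard ratio and its 95% CI are published, as above, the standard error of the log hazard ratio can be recovered from the interval width, a step commonly needed for meta-analysis. A sketch using the reported vertebral-fracture estimate (HR 2.7, 95% CI 1.1–6.6):

```python
import math

def se_from_ci(lo, hi, z=1.96):
    """Recover the standard error of a log hazard ratio from a reported 95% CI,
    assuming a symmetric Wald interval on the log scale."""
    return (math.log(hi) - math.log(lo)) / (2 * z)

# Vertebral fracture in year 2: HR 2.7 (95% CI 1.1-6.6)
se = se_from_ci(1.1, 6.6)  # ~0.46 on the log-HR scale
```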

7.

Background:

Vitamin D fortification of non–cow’s milk beverages is voluntary in North America. The effect of consuming non–cow’s milk beverages on serum 25-hydroxyvitamin D levels in children is unclear. We studied the association between non–cow’s milk consumption and 25-hydroxyvitamin D levels in healthy preschool-aged children. We also explored whether cow’s milk consumption modified this association and analyzed the association between daily non–cow’s milk and cow’s milk consumption.

Methods:

In this cross-sectional study, we recruited children 1–6 years of age attending routinely scheduled well-child visits. Survey responses, and anthropometric and laboratory measurements were collected. The association between non–cow’s milk consumption and 25-hydroxyvitamin D levels was tested using multiple linear regression and logistic regression. Cow’s milk consumption was explored as an effect modifier using an interaction term. The association between daily intake of non–cow’s milk and cow’s milk was explored using multiple linear regression.

Results:

A total of 2831 children were included. The interaction between non–cow’s milk and cow’s milk consumption was statistically significant (p = 0.03). Drinking non–cow’s milk beverages was associated with a 4.2-nmol/L decrease in 25-hydroxyvitamin D level per 250-mL cup consumed among children who also drank cow’s milk (p = 0.008). Children who drank only non–cow’s milk were at higher risk of having a 25-hydroxyvitamin D level below 50 nmol/L than children who drank only cow’s milk (odds ratio 2.7, 95% confidence interval 1.6 to 4.7).

Interpretation:

Consumption of non–cow’s milk beverages was associated with decreased serum 25-hydroxyvitamin D levels in early childhood. This association was modified by cow’s milk consumption, which suggests a trade-off between consumption of cow’s milk fortified with higher levels of vitamin D and non–cow’s milk with lower vitamin D content.

Goat’s milk and plant-based milk alternatives made from soy, rice, almonds, coconut, hemp, flax or oats (herein called “non–cow’s milk”) are increasingly available on supermarket shelves. Many consumers may be switching from cow’s milk to these beverages.1–3 Parents may choose non–cow’s milk beverages for their children because of perceived health benefits. However, it is unclear whether these beverages offer health advantages over cow’s milk or, alternatively, whether they increase the risk of nutritional inadequacy.

In the United States and Canada, cow’s milk products are required to contain about 40 IU of vitamin D per 100 mL, making cow’s milk the major dietary source of vitamin D for children.4–8 The only other food source with mandatory vitamin D fortification in Canada is margarine, which is required to contain 53 IU per 10 mL (10 g).5 Fortification of non–cow’s milk beverages with vitamin D is also possible, but it is voluntary in both countries. Furthermore, there is little regulation of the vitamin D content even when such beverages are fortified.5,6,9

We conducted a study to test the association between total daily consumption of non–cow’s milk and serum 25-hydroxyvitamin D levels in a population of healthy urban preschool-aged children attending routinely scheduled well-child visits. We hypothesized that vitamin D stores would be lower in children who consume non–cow’s milk. The secondary objectives were to explore how consumption of cow’s milk might modify this association and to study the association between daily intake of non–cow’s milk and cow’s milk.
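Effect modification of the kind described, where the per-cup effect of non-cow's milk depends on cow's-milk intake, corresponds to an interaction term in the regression model. A sketch with hypothetical coefficients, chosen only so that the slope among cow's-milk drinkers matches the reported 4.2 nmol/L decrease per 250-mL cup; none of these numbers come from the study's fitted model:

```python
def predicted_25ohd(noncow_cups, cow_cups,
                    intercept=80.0, b_noncow=-1.0, b_cow=2.0, b_inter=-3.2):
    """Linear model with an interaction term (all coefficients hypothetical):
    the effect of each cup of non-cow's milk varies with cow's-milk intake."""
    return (intercept + b_noncow * noncow_cups + b_cow * cow_cups
            + b_inter * noncow_cups * cow_cups)

# Marginal effect of one extra cup of non-cow's milk:
effect_with_cow = predicted_25ohd(1, 1) - predicted_25ohd(0, 1)      # -4.2 nmol/L
effect_without_cow = predicted_25ohd(1, 0) - predicted_25ohd(0, 0)   # -1.0 nmol/L
```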

8.

Background:

Little evidence exists on the effect of an energy-unrestricted healthy diet on metabolic syndrome. We evaluated the long-term effect of Mediterranean diets ad libitum on the incidence or reversion of metabolic syndrome.

Methods:

We performed a secondary analysis of the PREDIMED trial — a multicentre, randomized trial done between October 2003 and December 2010 that involved men and women (age 55–80 yr) at high risk for cardiovascular disease. Participants were randomly assigned to 1 of 3 dietary interventions: a Mediterranean diet supplemented with extra-virgin olive oil, a Mediterranean diet supplemented with nuts or advice on following a low-fat diet (the control group). The interventions did not include increased physical activity or weight loss as a goal. We analyzed available data from 5801 participants. We determined the effect of diet on incidence and reversion of metabolic syndrome using Cox regression analysis to calculate hazard ratios (HRs) and 95% confidence intervals (CIs).

Results:

Over 4.8 years of follow-up, metabolic syndrome developed in 960 (50.0%) of the 1919 participants who did not have the condition at baseline. The risk of developing metabolic syndrome did not differ between participants assigned to the control diet and those assigned to either of the Mediterranean diets (control v. olive oil HR 1.10, 95% CI 0.94–1.30, p = 0.231; control v. nuts HR 1.08, 95% CI 0.92–1.27, p = 0.3). Reversion occurred in 958 (28.2%) of the 3392 participants who had metabolic syndrome at baseline. Compared with the control group, participants on either Mediterranean diet were more likely to undergo reversion (control v. olive oil HR 1.35, 95% CI 1.15–1.58, p < 0.001; control v. nuts HR 1.28, 95% CI 1.08–1.51, p < 0.001). Participants in the group receiving olive oil supplementation showed significant decreases in both central obesity and high fasting glucose (p = 0.02); participants in the group supplemented with nuts showed a significant decrease in central obesity.

Interpretation:

A Mediterranean diet supplemented with either extra-virgin olive oil or nuts was not associated with the onset of metabolic syndrome, but such diets did make reversion of the condition more likely. An energy-unrestricted Mediterranean diet may be useful in reducing the risks of central obesity and hyperglycemia in people at high risk of cardiovascular disease. Trial registration no. ISRCTN35739639.

Metabolic syndrome is a cluster of 3 or more related cardiometabolic risk factors: central obesity (determined by waist circumference), hypertension, hypertriglyceridemia, low plasma high-density lipoprotein (HDL) cholesterol levels and hyperglycemia. Having the syndrome increases a person’s risk of type 2 diabetes and cardiovascular disease.1,2 In addition, the condition is associated with increased morbidity and all-cause mortality.1,3–5 The worldwide prevalence of metabolic syndrome in adults approaches 25%6–8 and increases with age,7 especially among women,8,9 making it an important public health issue.

Several studies have shown that lifestyle modifications,10 such as increased physical activity,11 adherence to a healthy diet12,13 or weight loss,14–16 are associated with reversion of metabolic syndrome and its components. However, little information exists as to whether changes in the overall dietary pattern without weight loss might also be effective in preventing and managing the condition.

The Mediterranean diet is recognized as one of the healthiest dietary patterns. It has shown benefits in patients with cardiovascular disease17,18 and in the prevention and treatment of related conditions, such as diabetes,19–21 hypertension22,23 and metabolic syndrome.24 Several cross-sectional25–29 and prospective30–32 epidemiologic studies have suggested an inverse association between adherence to the Mediterranean diet and the prevalence or incidence of metabolic syndrome. Evidence from clinical trials has shown that an energy-restricted Mediterranean diet33 or adoption of a Mediterranean diet after weight loss34 has a beneficial effect on metabolic syndrome. However, these studies did not determine whether the effect could be attributed to the weight loss or to the diets themselves.

Seminal data from the PREDIMED (PREvención con DIeta MEDiterránea) study suggested that adherence to a Mediterranean diet supplemented with nuts reversed metabolic syndrome more often than advice to follow a low-fat diet.35 However, that report was based on data from only 1224 participants followed for 1 year. We have analyzed data from the final PREDIMED cohort after a median follow-up of 4.8 years to determine the long-term effects of a Mediterranean diet on metabolic syndrome.

9.

Background:

Despite a low prevalence of chronic kidney disease (estimated glomerular filtration rate [GFR] < 60 mL/min per 1.73 m2), First Nations people have high rates of kidney failure requiring chronic dialysis or kidney transplantation. We sought to examine whether the presence and severity of albuminuria contributes to the progression of chronic kidney disease to kidney failure among First Nations people.

Methods:

We identified all adult residents of Alberta (age ≥ 18 yr) for whom an outpatient serum creatinine measurement was available from May 1, 2002, to Mar. 31, 2008. We determined albuminuria using urine dipsticks and categorized results as normal (i.e., no albuminuria), mild, heavy or unmeasured. Our primary outcome was progression to kidney failure (defined as the need for chronic dialysis or kidney transplantation, or a sustained doubling of serum creatinine levels). We calculated rates of progression to kidney failure by First Nations status, by estimated GFR and by albuminuria category. We determined the relative hazard of progression to kidney failure for First Nations compared with non–First Nations participants by level of albuminuria and estimated GFR.

Results:

Of the 1 816 824 participants we identified, 48 669 (2.7%) were First Nations. First Nations people were less likely to have normal albuminuria compared with non–First Nations people (38.7% v. 56.4%). Rates of progression to kidney failure were consistently 2- to 3-fold higher among First Nations people than among non–First Nations people, across all levels of albuminuria and estimated GFRs. Compared with non–First Nations people, First Nations people with an estimated GFR of 15.0–29.9 mL/min per 1.73 m2 had the highest risk of progression to kidney failure, with similar hazard ratios for those with normal and heavy albuminuria.

Interpretation:

Albuminuria confers a similar risk of progression to kidney failure for First Nations and non–First Nations people.

The prevalence of severe chronic kidney disease (estimated glomerular filtration rate [GFR] < 30 mL/min per 1.73 m2) is almost 2-fold higher, and rates of end-stage kidney disease (defined as the need for chronic dialysis or kidney transplantation) are 4-fold higher, among First Nations people than among non–First Nations people in Canada.1,2 The reasons for the higher rate of end-stage kidney disease despite a lower prevalence of earlier stages of chronic kidney disease among First Nations people (estimated GFR 30–60 mL/min per 1.73 m2) are unclear. The rising incidence of diabetes is seen as the major cause of kidney failure among First Nations people;3 however, First Nations people without diabetes are also 2–3 times more likely to eventually have kidney failure.4 These observations suggest that diabetes is not the sole determinant of risk for kidney failure and that as-yet-undefined factors may accelerate the progression of chronic kidney disease in the First Nations population.5

Recent studies have highlighted the prognostic importance of albuminuria as a risk factor for kidney failure.6 Although ethnic variations in the prevalence and severity of albuminuria and their association with renal outcomes have been reported, these studies are primarily limited to non–First Nations populations.7 A limited number of studies have reported an increased prevalence of albuminuria among First Nations people, suggesting a potential association between albuminuria and risk of kidney failure.8,9 We sought to measure the presence and severity of albuminuria and estimate the risk of progression to kidney failure for First Nations people compared with non–First Nations people using a community-based cohort.

10.

Background:

Brief interventions delivered by family physicians to address excessive alcohol use among adult patients are effective. We conducted a study to determine whether such an intervention would be similarly effective in reducing binge drinking and excessive cannabis use among young people.

Methods:

We conducted a cluster randomized controlled trial involving 33 family physicians in Switzerland. Physicians in the intervention group received training in delivering a brief intervention to young people during the consultation in addition to usual care. Physicians in the control group delivered usual care only. Consecutive patients aged 15–24 years were recruited from each practice and, before the consultation, completed a confidential questionnaire about their general health and substance use. Patients were followed up at 3, 6 and 12 months after the consultation. The primary outcome measure was self-reported excessive substance use (≥ 1 episode of binge drinking, or ≥ 1 joint of cannabis per week, or both) in the past 30 days.

Results:

Of the 33 participating physicians, 17 were randomly allocated to the intervention group and 16 to the control group. Of the 594 participating patients, 279 (47.0%) identified themselves as binge drinkers or excessive cannabis users, or both, at baseline. Excessive substance use did not differ significantly between patients whose physicians were in the intervention group and those whose physicians were in the control group at any of the follow-up points (odds ratio [OR] and 95% confidence interval [CI] at 3 months: 0.9 [0.6–1.4]; at 6 mo: 1.0 [0.6–1.6]; and at 12 mo: 1.1 [0.7–1.8]). The differences between groups were also nonsignificant after we restricted the analysis to patients who reported excessive substance use at baseline (OR 1.6, 95% CI 0.9–2.8, at 3 mo; OR 1.7, 95% CI 0.9–3.2, at 6 mo; and OR 1.9, 95% CI 0.9–4.0, at 12 mo).

Interpretation:

Training family physicians to use a brief intervention to address excessive substance use among young people was not effective in reducing binge drinking and excessive cannabis use in this patient population. Trial registration: Australian New Zealand Clinical Trials Registry, no. ACTRN12608000432314.

Most health-compromising behaviours begin in adolescence.1 Interventions that address these behaviours early are likely to bring long-lasting benefits.2 Harmful use of alcohol is a leading factor associated with premature death and disability worldwide, with a disproportionately high impact on young people (aged 10–24 yr).3,4 Similarly, early cannabis use can have adverse consequences that extend into adulthood.5–8

In adolescence and early adulthood, binge drinking on at least a monthly basis is associated with an increased risk of adverse outcomes later in life.9–12 Although any cannabis use is potentially harmful, weekly use represents a threshold in adolescence related to an increased risk of cannabis (and tobacco) dependence in adulthood.13 Binge drinking affects 30%–50% and excessive cannabis use about 10% of the adolescent and young adult population in Europe and the United States.10,14,15

Reducing substance-related harm involves multisectoral approaches, including promotion of healthy child and adolescent development, regulatory policies and early treatment interventions.16 Family physicians can add to these public health messages by personalizing their content within brief interventions.17,18 There is evidence that brief interventions can encourage young people to reduce substance use, yet most studies have been conducted in community settings (mainly educational), emergency services or specialized addiction clinics.1,16 Studies in adult populations have shown favourable effects of brief alcohol interventions, and to some extent brief cannabis interventions, in primary care.19–22 These interventions have been recommended for adolescent populations.4,5,16 Yet young people have different modes of substance use and communication styles that may limit the extent to which evidence from adult studies applies to them.

Recently, a systematic review of brief interventions to reduce alcohol use in adolescents identified only 1 randomized controlled trial in primary care.23 The tested intervention, which was not provided by family physicians but involved audio self-assessment, was ineffective in reducing alcohol use in exposed adolescents.24 Sanci and colleagues showed that training family physicians to address health-risk behaviours among adolescents improved provider performance, but the extent to which this translates into improved patient outcomes remains unknown.25,26 Two nonrandomized studies suggested that screening for substance use and brief advice by family physicians could favour reduced alcohol and cannabis use among adolescents,27,28 but evidence from randomized trials is lacking.29

We conducted the PRISM-Ado (Primary care Intervention Addressing Substance Misuse in Adolescents) trial, a cluster randomized controlled trial of the effectiveness of training family physicians to deliver a brief intervention to address binge drinking and excessive cannabis use among young people.

11.

Background:

Diabetes-related end-stage renal disease disproportionately affects indigenous peoples. We explored the role of differential mortality in this disparity.

Methods:

In this retrospective cohort study, we examined the competing risks of end-stage renal disease and death without end-stage renal disease among Saskatchewan adults with diabetes mellitus, both First Nations and non–First Nations, from 1980 to 2005. Using administrative databases of the Saskatchewan Ministry of Health, we developed Fine and Gray subdistribution hazards models and cumulative incidence functions.

Results:

Of the 90 429 incident cases of diabetes, 8254 (8.9%) occurred among First Nations adults and 82 175 (90.9%) among non–First Nations adults. Mean age at the time that diabetes was diagnosed was 47.2 and 61.6 years, respectively (p < 0.001). After adjustment for sex and age at the time of diabetes diagnosis, the risk of end-stage renal disease was 2.66 times higher for First Nations than non–First Nations adults (95% confidence interval [CI] 2.24–3.16). Multivariable analysis with adjustment for sex showed a higher risk of death among First Nations adults, which declined with increasing age at the time of diabetes diagnosis. Cumulative incidence function curves stratified by age at the time of diabetes diagnosis showed greatest risk for end-stage renal disease among those with onset of diabetes at younger ages and greatest risk of death among those with onset of diabetes at older ages.

Interpretation:

Because they are typically younger when diabetes is diagnosed, First Nations adults with this condition are more likely than their non–First Nations counterparts to survive long enough for end-stage renal disease to develop. Differential mortality contributes substantially to ethnicity-based disparities in diabetes-related end-stage renal disease and possibly to chronic diabetes complications. Understanding the mechanisms underlying these disparities is vital in developing more effective prevention and management initiatives.Indigenous peoples experience an excess burden of diabetes-related end-stage renal disease,14 but the reasons for this disparity are incompletely understood. Although the increase in end-stage renal disease among indigenous peoples has paralleled the global emergence of type 2 diabetes mellitus,5 disparities in end-stage renal disease among Canada’s First Nations adults persist2 after adjustment for elevated prevalence of diabetes.6 In an earlier study, we suggested that First Nations adults might be more prone to diabetic nephropathy and might experience more rapid progression to end-stage renal disease.7 However, although albuminuria is more prevalent in this population,8 affected individuals unexpectedly have a longer average time from diagnosis of diabetes to end-stage renal disease than people from non–First Nations populations.2 These findings could be explained by a younger age at the time of diabetes diagnosis6 and lower mortality among those with chronic kidney disease.8 An age-related survival benefit among First Nations adults with diabetes could lead to longer exposure to the metabolic consequences of diabetes and greater likelihood of end-stage renal disease.Our objective was to examine the contribution of differential mortality to disparities in diabetes-related end-stage renal disease within large populations of indigenous and non-indigenous North Americans. 
Accordingly, we used competing-risks survival analysis to compare the simultaneous risks of diabetes-related end-stage renal disease and death without end-stage renal disease among First Nations and non–First Nations adults.9
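The competing-risks machinery named above (cumulative incidence functions estimated alongside subdistribution hazards) can be illustrated with a short sketch. This is a toy, nonparametric Aalen–Johansen estimator on invented follow-up records, not the study's Fine and Gray model or its Saskatchewan data:

```python
# Toy Aalen-Johansen sketch: cumulative incidence of end-stage renal
# disease (event 1) when death without it (event 2) is a competing risk.
# Event codes: 0 = censored, 1 = ESRD, 2 = death. Data are invented.

def cumulative_incidence(times, events, cause=1):
    """Return [(time, CIF)] for `cause` at each time with any event."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv = 1.0    # all-cause survival just before the current time
    cif = 0.0
    out = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d_cause = d_any = censored = 0
        while i < len(data) and data[i][0] == t:
            if data[i][1] == 0:
                censored += 1
            else:
                d_any += 1
                if data[i][1] == cause:
                    d_cause += 1
            i += 1
        if at_risk > 0 and d_any > 0:
            cif += surv * d_cause / at_risk   # uses survival at t-
            surv *= 1 - d_any / at_risk
            out.append((t, cif))
        at_risk -= d_any + censored
    return out

times = [2, 3, 3, 5, 7, 8, 10, 12]
events = [1, 2, 0, 1, 2, 1, 0, 0]
curve = cumulative_incidence(times, events)
```

With these toy data the estimated ESRD incidence rises stepwise at each ESRD event time, and deaths reduce the population still at risk without themselves adding to the ESRD curve, which is exactly the distinction the abstract exploits.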

12.
Background:Rates of imaging for low-back pain are high and are associated with increased health care costs and radiation exposure as well as potentially poorer patient outcomes. We conducted a systematic review to investigate the effectiveness of interventions aimed at reducing the use of imaging for low-back pain.Methods:We searched MEDLINE, Embase, CINAHL and the Cochrane Central Register of Controlled Trials from the earliest records to June 23, 2014. We included randomized controlled trials, controlled clinical trials and interrupted time series studies that assessed interventions designed to reduce the use of imaging in any clinical setting, including primary, emergency and specialist care. Two independent reviewers extracted data and assessed risk of bias. We used raw data on imaging rates to calculate summary statistics. Study heterogeneity prevented meta-analysis.Results:A total of 8500 records were identified through the literature search. Of the 54 potentially eligible studies reviewed in full, 7 were included in our review. Clinical decision support involving a modified referral form in a hospital setting reduced imaging by 36.8% (95% confidence interval [CI] 33.2% to 40.5%). Targeted reminders to primary care physicians of appropriate indications for imaging reduced referrals for imaging by 22.5% (95% CI 8.4% to 36.8%). Interventions that used practitioner audits and feedback, practitioner education or guideline dissemination did not significantly reduce imaging rates. Lack of power within some of the included studies resulted in lack of statistical significance despite potentially clinically important effects.Interpretation:Clinical decision support in a hospital setting and targeted reminders to primary care doctors were effective interventions in reducing the use of imaging for low-back pain. 
These are potentially low-cost interventions that would substantially decrease medical expenditures associated with the management of low-back pain.Current evidence-based clinical practice guidelines recommend against the routine use of imaging in patients presenting with low-back pain.13 Despite this, imaging rates remain high,4,5 which indicates poor concordance with these guidelines.6,7Unnecessary imaging for low-back pain has been associated with poorer patient outcomes, increased radiation exposure and higher health care costs.8 No short- or long-term clinical benefits have been shown with routine imaging of the low back, and the diagnostic value of incidental imaging findings remains uncertain.912 A 2008 systematic review found that imaging accounted for 7% of direct costs associated with low-back pain, which in 1998 translated to more than US$6 billion in the United States and £114 million in the United Kingdom.13 Current costs are likely to be substantially higher, with an estimated 65% increase in spine-related expenditures between 1997 and 2005.14Various interventions have been tried for reducing imaging rates among people with low-back pain. These include strategies targeted at the practitioner such as guideline dissemination,1517 education workshops,18,19 audit and feedback of imaging use,7,20,21 ongoing reminders7 and clinical decision support.2224 It is unclear which, if any, of these strategies are effective.25 We conducted a systematic review to investigate the effectiveness of interventions designed to reduce imaging rates for the management of low-back pain.  相似文献   

13.
Sonja A. Swanson, Ian Colman. CMAJ 2013;185(10):870–877

Background:

Ecological studies support the hypothesis that suicide may be “contagious” (i.e., exposure to suicide may increase the risk of suicide and related outcomes). However, this association has not been adequately assessed in prospective studies. We sought to determine the association between exposure to suicide and suicidality outcomes in Canadian youth.

Methods:

We used baseline information from the Canadian National Longitudinal Survey of Children and Youth between 1998/99 and 2006/07 with follow-up assessments 2 years later. We included all respondents aged 12–17 years in cycles 3–7 with reported measures of exposure to suicide.

Results:

We included 8766 youth aged 12–13 years, 7802 aged 14–15 years and 5496 aged 16–17 years. Exposure to a schoolmate’s suicide was associated with ideation at baseline among respondents aged 12–13 years (odds ratio [OR] 5.06, 95% confidence interval [CI] 3.04–8.40), 14–15 years (OR 2.93, 95% CI 2.02–4.24) and 16–17 years (OR 2.23, 95% CI 1.43–3.48). Such exposure was associated with attempts among respondents aged 12–13 years (OR 4.57, 95% CI 2.39–8.71), 14–15 years (OR 3.99, 95% CI 2.46–6.45) and 16–17 years (OR 3.22, 95% CI 1.62–6.41). Personally knowing someone who died by suicide was associated with suicidality outcomes for all age groups. We also assessed 2-year outcomes among respondents aged 12–15 years: a schoolmate’s suicide predicted suicide attempts among participants aged 12–13 years (OR 3.07, 95% CI 1.05–8.96) and 14–15 years (OR 2.72, 95% CI 1.47–5.04). Among those who reported a schoolmate’s suicide, personally knowing the decedent did not alter the risk of suicidality.

Interpretation:

We found that exposure to suicide predicts suicide ideation and attempts. Our results support school-wide interventions over current targeted interventions, particularly over strategies that target interventions toward children closest to the decedent.Suicidal thoughts and behaviours are prevalent13 and severe47 among adolescents. One hypothesized cause of suicidality is “suicide contagion” (i.e., exposure to suicide or related behaviours influences others to contemplate, attempt or die by suicide).8 Ecological studies support this theory: suicide and suspected suicide rates increase following a highly publicized suicide.911 However, such studies are prone to ecological fallacy and do not allow for detailed understanding of who may be most vulnerable.Adolescents may be particularly susceptible to this contagion effect. More than 13% of adolescent suicides are potentially explained by clustering;1214 clustering may explain an even larger proportion of suicide attempts.15,16 Many local,17,18 national8,19 and international20 institutions recommend school- or community-level postvention strategies in the aftermath of a suicide to help prevent further suicides and suicidality. These postvention strategies typically focus on a short interval following the death (e.g., months) with services targeted toward the most at-risk individuals (e.g., those with depression).19In this study, we assessed the association between exposure to suicide and suicidal thoughts and attempts among youth, using both cross-sectional and prospective (2-yr follow-up) analyses in a population-based cohort of Canadian youth.  相似文献   

14.
15.

Background

There is controversy about which children with minor head injury need to undergo computed tomography (CT). We aimed to develop a highly sensitive clinical decision rule for the use of CT in children with minor head injury.

Methods

For this multicentre cohort study, we enrolled consecutive children with blunt head trauma presenting with a score of 13–15 on the Glasgow Coma Scale and loss of consciousness, amnesia, disorientation, persistent vomiting or irritability. For each child, staff in the emergency department completed a standardized assessment form before any CT. The main outcomes were need for neurologic intervention and presence of brain injury as determined by CT. We developed a decision rule by using recursive partitioning to combine variables that were both reliable and strongly associated with the outcome measures and thus to find the best combinations of predictor variables that were highly sensitive for detecting the outcome measures with maximal specificity.

Results

Among the 3866 patients enrolled (mean age 9.2 years), 95 (2.5%) had a score of 13 on the Glasgow Coma Scale, 282 (7.3%) had a score of 14, and 3489 (90.2%) had a score of 15. CT revealed that 159 (4.1%) had a brain injury, and 24 (0.6%) underwent neurologic intervention. We derived a decision rule for CT of the head consisting of four high-risk factors (failure to reach score of 15 on the Glasgow coma scale within two hours, suspicion of open skull fracture, worsening headache and irritability) and three additional medium-risk factors (large, boggy hematoma of the scalp; signs of basal skull fracture; dangerous mechanism of injury). The high-risk factors were 100.0% sensitive (95% CI 86.2%–100.0%) for predicting the need for neurologic intervention and would require that 30.2% of patients undergo CT. The medium-risk factors resulted in 98.1% sensitivity (95% CI 94.6%–99.4%) for the prediction of brain injury by CT and would require that 52.0% of patients undergo CT.

Interpretation

The decision rule developed in this study identifies children at two levels of risk. Once the decision rule has been prospectively validated, it has the potential to standardize and improve the use of CT for children with minor head injury.Each year more than 650 000 children are seen in hospital emergency departments in North America with “minor head injury,” i.e., history of loss of consciousness, amnesia or disorientation in a patient who is conscious and responsive in the emergency department (Glasgow Coma Scale score1 13–15). Although most patients with minor head injury can be discharged after a period of observation, a small proportion experience deterioration of their condition and need to undergo neurosurgical intervention for intracranial hematoma.24 The use of computed tomography (CT) in the emergency department is important in the early diagnosis of these intracranial hematomas.Over the past decade the use of CT for minor head injury has become increasingly common, while its diagnostic yield has remained low. In Canadian pediatric emergency departments the use of CT for minor head injury increased from 15% in 1995 to 53% in 2005.5,6 Despite this increase, a small but important number of pediatric intracranial hematomas are missed in Canadian emergency departments at the first visit.3 Few children with minor head injury have a visible brain injury on CT (4%–7%), and only 0.5% have an intracranial lesion requiring urgent neurosurgical intervention.5,7 The increased use of CT adds substantially to health care costs and exposes a large number of children each year to the potentially harmful effects of ionizing radiation.8,9 Currently, there are no widely accepted, evidence-based guidelines on the use of CT for children with minor head injury.A clinical decision rule incorporates three or more variables from the history, physical examination or simple tests10.11 into a tool that helps clinicians to make diagnostic or therapeutic decisions at the bedside. 
Members of our group have developed decision rules to allow physicians to be more selective in the use of radiography for children with injuries of the ankle12 and knee,13 as well as for adults with injuries of the ankle,1417 knee,1820 head21,22 and cervical spine.23,24 The aim of this study was to prospectively derive an accurate and reliable clinical decision rule for the use of CT for children with minor head injury.
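As a rough illustration of how the sensitivity, specificity and implied CT rate of such a rule are tallied, here is a minimal sketch on made-up patient records (the flags and outcomes below are invented, not the study's data):

```python
# Toy evaluation of a clinical decision rule. `flags` marks patients in
# whom the rule fires (i.e., would get CT); `outcomes` marks patients who
# truly had the outcome (brain injury / need for intervention).

def evaluate_rule(flags, outcomes):
    """Return (sensitivity, specificity, fraction sent to CT)."""
    tp = sum(f and o for f, o in zip(flags, outcomes))
    fn = sum((not f) and o for f, o in zip(flags, outcomes))
    tn = sum((not f) and (not o) for f, o in zip(flags, outcomes))
    fp = sum(f and (not o) for f, o in zip(flags, outcomes))
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ct_rate = sum(flags) / len(flags)
    return sensitivity, specificity, ct_rate

# Invented cohort of 8 children:
flags = [True, True, False, True, False, False, True, False]
outcomes = [True, True, False, False, False, False, True, False]
sens, spec, ct_rate = evaluate_rule(flags, outcomes)
```

The abstract's trade-off is visible in miniature here: pushing sensitivity toward 100% (missing no injuries) is bought by a higher `ct_rate`, the 30.2% and 52.0% figures reported for the high- and medium-risk factor sets.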

16.

Background

Chest pain can be caused by various conditions, with life-threatening cardiac disease being of greatest concern. Prediction scores to rule out coronary artery disease have been developed for use in emergency settings. We developed and validated a simple prediction rule for use in primary care.

Methods

We conducted a cross-sectional diagnostic study in 74 primary care practices in Germany. Primary care physicians recruited all consecutive patients who presented with chest pain (n = 1249) and recorded symptoms and findings for each patient (derivation cohort). An independent expert panel reviewed follow-up data obtained at six weeks and six months on symptoms, investigations, hospital admissions and medications to determine the presence or absence of coronary artery disease. Adjusted odds ratios of relevant variables were used to develop a prediction rule. We calculated measures of diagnostic accuracy for different cut-off values for the prediction scores using data derived from another prospective primary care study (validation cohort).

Results

The prediction rule contained five determinants (age/sex, known vascular disease, patient assumes pain is of cardiac origin, pain is worse during exercise, and pain is not reproducible by palpation), with the score ranging from 0 to 5 points. The area under the curve (receiver operating characteristic curve) was 0.87 (95% confidence interval [CI] 0.83–0.91) for the derivation cohort and 0.90 (95% CI 0.87–0.93) for the validation cohort. The best overall discrimination was with a cut-off value of 3 (positive result 3–5 points; negative result ≤ 2 points), which had a sensitivity of 87.1% (95% CI 79.9%–94.2%) and a specificity of 80.8% (77.6%–83.9%).

Interpretation

The prediction rule for coronary artery disease in primary care proved to be robust in the validation cohort. It can help to rule out coronary artery disease in patients presenting with chest pain in primary care.Chest pain is common. Studies have shown a lifetime prevalence of 20% to 40% in the general population.1 Its prevalence in primary care ranges from 0.7% to 2.7% depending on inclusion criteria and country,24 with coronary artery disease being the underlying cause in about 12% of primary care patients.1,5 General practitioners are challenged to identify serious cardiac disease reliably and also protect patients from unnecessary investigations and hospital admissions. Because electrocardiography and the cardiac troponin test are of limited value in primary care,6,7 history taking and physical examination remain the main diagnostic tools.Most published studies on the diagnostic accuracy of signs and symptoms for acute coronary events have been conducted in high-prevalence settings such as hospital emergency departments.810 Predictive scores have also been developed for use in emergency departments, mainly for the diagnosis of acute coronary syndromes.1113 To what degree these apply in primary care is unknown.1416A clinical prediction score to rule out coronary artery disease in general practice has been developed.17 However, it did not perform well when validated externally. The aim of our study was to develop a simple, valid and usable prediction score based on signs and symptoms to help primary care physicians rule out coronary artery disease in patients presenting with chest pain.  相似文献   
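A minimal sketch of the five-item score described above, scoring one point per determinant and calling the rule positive at 3 or more points. The specific age/sex thresholds used below (men ≥ 55 years, women ≥ 65 years) are an assumption for illustration; the abstract does not state them:

```python
# Hedged sketch of the five-determinant chest-pain score from the
# abstract: age/sex, known vascular disease, patient assumes cardiac
# origin, pain worse on exercise, pain NOT reproducible by palpation.
# ASSUMPTION: the age/sex cut-offs (m >= 55, f >= 65) are illustrative,
# not taken from the abstract.

def chest_pain_score(age, sex, known_vascular_disease,
                     patient_assumes_cardiac, worse_on_exercise,
                     reproducible_by_palpation):
    """Return (points 0-5, rule-positive at >= 3 points)."""
    score = 0
    if (sex == "m" and age >= 55) or (sex == "f" and age >= 65):
        score += 1   # age/sex criterion (assumed thresholds)
    if known_vascular_disease:
        score += 1
    if patient_assumes_cardiac:
        score += 1
    if worse_on_exercise:
        score += 1
    if not reproducible_by_palpation:
        score += 1   # non-reproducible pain points toward coronary origin
    return score, score >= 3
```

A score of 2 or less is the rule-negative result that, per the abstract, helps rule out coronary artery disease in primary care.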

17.

Background:

Hypoglycemia remains a common life-threatening event associated with diabetes treatment. We compared the risk of first or recurrent hypoglycemia event among metformin initiators who intensified treatment with insulin versus sulfonylurea.

Methods:

We assembled a retrospective cohort using databases of the Veterans Health Administration, Medicare and the National Death Index. Metformin initiators who intensified treatment with insulin or sulfonylurea were followed to either their first or recurrent hypoglycemia event using Cox proportional hazard models. Hypoglycemia was defined as hospital admission or an emergency department visit for hypoglycemia, or an outpatient blood glucose value of less than 3.3 mmol/L. We conducted additional analyses for risk of first hypoglycemia event, with death as the competing risk.

Results:

Among 178 341 metformin initiators, 2948 added insulin and 39 990 added sulfonylurea. Propensity score matching yielded 2436 patients taking metformin plus insulin and 12 180 taking metformin plus sulfonylurea. Patients took metformin for a median of 14 (interquartile range [IQR] 5–30) months, and the median glycated hemoglobin level was 8.1% (IQR 7.2%–9.9%) at intensification. In the group who added insulin, 121 first hypoglycemia events occurred, and 466 first events occurred in the group who added sulfonylurea (30.9 v. 24.6 events per 1000 person-years; adjusted hazard ratio [HR] 1.30, 95% confidence interval [CI] 1.06–1.59). For recurrent hypoglycemia, there were 159 events in the insulin group and 585 events in the sulfonylurea group (39.1 v. 30.0 per 1000 person-years; adjusted HR 1.39, 95% CI 1.12–1.72). In separate competing risk analyses, the adjusted HR for hypoglycemia was 1.28 (95% CI 1.04–1.56).

Interpretation:

Among patients using metformin who could use either insulin or sulfonylurea, the addition of insulin was associated with a higher risk of hypoglycemia than the addition of sulfonylurea. This finding should be considered by patients and clinicians when discussing the risks and benefits of adding insulin versus a sulfonylurea.Hypoglycemia remains one of the most common medication-related adverse events among patients with diabetes and a leading cause of hospital admissions and emergency department visits.1,2 It is a concern to patients and clinicians and a strong determinant of treatment choices.3 Hypoglycemic medications account for 25% of emergency hospital admissions for adverse drug events among patients aged 65 years and older.2,4 Multiple factors predispose patients with diabetes to hypoglycemia, including older age, polypharmacy, poor nutrition, underlying illness, alcohol use and declining renal function.5,6 Intensive glucose-control treatment for patients with these factors is strongly associated with hypoglycemia.6,7Consensus statements by major diabetes associations, including the Canadian Diabetes Association, recommend lifestyle modification and metformin as first-line therapies for type 2 diabetes, with the goal of treatment being a glycated hemoglobin (HbA1C) level of 7% or less for many patients.8,9 Multiple options are listed as acceptable add-on treatments. Sulfonylurea is easier to initiate, but insulin dose can be modified in response to daily variation in food intake, exercise or other variables that cause fluctuations in glucose values. 
Within the Veterans Health Administration clinical practice guideline, both the combination of metformin plus sulfonylurea or the use of bedtime insulin combined with metformin are considered acceptable based on level I evidence.10 To make well-informed decisions about treatment regimens, patients and providers need to understand clinical benefits, such as improvement in microvascular outcomes,11 and harms, such as hypoglycemia.We recently reported that intensification of metformin with insulin compared with sulfonylurea was associated with an increased risk of all-cause mortality among veterans with diabetes.12 Evidence for a causal relation between hypoglycemia and cardiovascular disease or death is limited, because patients at risk for hypoglycemia also have factors that increase their risk for those outcomes.7,1315 Both sulfonylurea and insulin are associated with an elevated risk of hypoglycemia compared with metformin.5,7,1618 We sought to test the hypothesis that using the combination of metformin plus insulin was associated with a greater risk of serious hypoglycemia than using metformin plus sulfonylurea.  相似文献   
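The reported event rates reduce to simple person-year arithmetic. The person-year denominators below are back-calculated from the published rates, so treat them as approximations for illustration rather than study data:

```python
# Events per 1000 person-years, plus a crude (unadjusted) rate ratio.
# ASSUMPTION: person-year totals are back-calculated from the abstract's
# reported rates (30.9 and 24.6 per 1000 person-years), not taken from
# the source data.

def rate_per_1000_py(events, person_years):
    return 1000 * events / person_years

insulin_rate = rate_per_1000_py(121, 3916)        # ~30.9 per 1000 py
sulfonylurea_rate = rate_per_1000_py(466, 18943)  # ~24.6 per 1000 py
crude_ratio = insulin_rate / sulfonylurea_rate    # ~1.26 before adjustment
```

Note the crude ratio (~1.26) is close to, but not the same as, the adjusted hazard ratio of 1.30 in the abstract, which additionally accounts for confounding and censoring.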

18.

Background

Recent studies have reported a high prevalence of relative adrenal insufficiency in patients with liver cirrhosis. However, the effect of corticosteroid replacement on mortality in this high-risk group remains unclear. We examined the effect of low-dose hydrocortisone in patients with cirrhosis who presented with septic shock.

Methods

We enrolled patients with cirrhosis and septic shock aged 18 years or older in a randomized double-blind placebo-controlled trial. Relative adrenal insufficiency was defined as a serum cortisol increase of less than 250 nmol/L or 9 μg/dL from baseline after stimulation with 250 μg of intravenous corticotropin. Patients were assigned to receive 50 mg of intravenous hydrocortisone or placebo every six hours until hemodynamic stability was achieved, followed by steroid tapering over eight days. The primary outcome was 28-day all-cause mortality.

Results

The trial was stopped for futility at interim analysis after 75 patients were enrolled. Relative adrenal insufficiency was diagnosed in 76% of patients. Compared with the placebo group (n = 36), patients in the hydrocortisone group (n = 39) had a significant reduction in vasopressor doses and higher rates of shock reversal (relative risk [RR] 1.58, 95% confidence interval [CI] 0.98–2.55, p = 0.05). Hydrocortisone use was not associated with a reduction in 28-day mortality (RR 1.17, 95% CI 0.92–1.49, p = 0.19) but was associated with an increase in shock relapse (RR 2.58, 95% CI 1.04–6.45, p = 0.03) and gastrointestinal bleeding (RR 3.00, 95% CI 1.08–8.36, p = 0.02).

Interpretation

Relative adrenal insufficiency was very common in patients with cirrhosis presenting with septic shock. Despite initial favourable effects on hemodynamic parameters, hydrocortisone therapy did not reduce mortality and was associated with an increase in adverse effects. (Current Controlled Trials registry no. ISRCTN99675218.)Cirrhosis is a leading cause of death worldwide,1 often with septic shock as the terminal event.29 Relative adrenal insufficiency shares similar features of distributive hyperdynamic shock with both cirrhosis and sepsis10,11 and increasingly has been reported to coexist with both conditions.11,12 The effect of low-dose hydrocortisone therapy on survival of critically ill patients in general with septic shock remains controversial, with conflicting results from randomized controlled trials1317 and meta-analyses.18,19 The effect of hydrocortisone therapy on mortality among patients with cirrhosis, who are known to be a group at high risk for relative adrenal insufficiency, has not been studied and hence was the objective of our study.  相似文献   
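For readers checking figures of this kind, a relative risk and its 95% confidence interval follow directly from the 2×2 counts; the counts below are invented for illustration and are not the trial's data:

```python
# Relative risk with a log-scale 95% CI from 2x2 counts (toy numbers).
import math

def relative_risk(a, n1, c, n0):
    """a events among n1 treated; c events among n0 controls."""
    rr = (a / n1) / (c / n0)
    # standard error of log(RR) for cumulative-incidence data
    se_log = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n0)
    lo = math.exp(math.log(rr) - 1.96 * se_log)
    hi = math.exp(math.log(rr) + 1.96 * se_log)
    return rr, lo, hi

# Invented example: 20/39 vs 12/36 with an event of interest.
rr, lo, hi = relative_risk(20, 39, 12, 36)
```

A CI that straddles 1 (as in this toy example) corresponds to the non-significant mortality result in the abstract, while intervals entirely above 1 correspond to the significant harms (shock relapse, gastrointestinal bleeding).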

19.
Background:Lifetime risk is a relatively straightforward measure used to communicate disease burden, representing the cumulative risk of an outcome during the remainder of an individual’s life starting from a disease-free index age. We estimated the lifetime risk of diabetes among men and women in both First Nations and non–First Nations populations using a cohort of adults in a single Canadian province.Methods:We used a population-based cohort consisting of Alberta residents from 1997 to 2008 who were free of diabetes at cohort entry to estimate the lifetime risk of diabetes among First Nations and non–First Nations people. We calculated age-specific incidence rates with the person-year method in 5-year bands. We estimated the sex- and index-age–specific lifetime risk of incident diabetes, after adjusting for the competing risk of death.Results:The cohort included 70 631 First Nations and 2 732 214 non–First Nations people aged 18 years or older. The lifetime risk of diabetes at 20 years of age was 75.6% among men and 87.3% among women in the First Nations group, as compared with 55.6% among men and 46.5% among women in the non–First Nations group. The risk was higher among First Nations people than among non–First Nations people for all index ages and for both sexes. Among non–First Nations people, men had a higher lifetime risk of diabetes than women across all index ages. In contrast, among First Nations people, women had a higher lifetime risk than men across all index ages.Interpretation:About 8 in 10 First Nations people and about 5 in 10 non–First Nations people of young age will develop diabetes in their remaining lifetime. 
These population-based estimates may help health care planners and decision-makers set priorities and increase public awareness and interest in the prevention of diabetes. Diabetes mellitus is a major health problem worldwide and is associated with increased morbidity and mortality, reduced life expectancy and higher health care costs.14 The prevalence of diabetes in Canada has increased more than twofold over the past decade.5 Currently, the disease affects almost 2.4 million Canadians,6 and its management, along with that of associated complications, costs more than $9 billion annually.7 The burden of diabetes is particularly high among First Nations people in Canada, with prevalence rates 3–5 times higher than those among non–First Nations people.8 Reducing the risk of type 2 diabetes will require a broad set of population-based and individual-level interventions that target diabetogenic aspects of lifestyle, as well as social determinants of health. The changes required to achieve these objectives will need buy-in from a wide range of stakeholders. Thus, it will be important to communicate risk in a way that is understood by the general population and by health authorities. Although estimates of incidence and prevalence provide important information about the burden of a disease in the community, they do not provide adequate information regarding the perspective of risk at the individual level. Lifetime risk (the probability of a disease-free individual developing the disease during his or her remaining lifespan) may be more informative for the general population and for decision-makers. Life-table modelling techniques use incidence and mortality data to estimate the lifetime risk of diabetes. This important assessment of the disease burden of diabetes has been undertaken in a few studies,911 but it has not been done in Canada.
The need for such estimates is particularly relevant given the higher prevalence of diabetes among First Nations people in Canada. We estimated the lifetime risk of diabetes among men and women in both First Nations and non–First Nations populations using a cohort of adults residing in a single Canadian province.
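The life-table logic described above (apply each age band's diabetes incidence to those still alive and disease-free, while removing deaths as a competing risk) can be sketched in a few lines; the rates below are invented for illustration, not the Alberta estimates:

```python
# Minimal life-table sketch of lifetime risk adjusted for the competing
# risk of death. Each band is (years, annual incidence, annual mortality);
# the rates here are toy values, not the study's.

def lifetime_risk(bands):
    alive_free = 1.0   # probability of being alive and diabetes-free
    risk = 0.0
    for years, incidence, mortality in bands:
        for _ in range(years):
            new_cases = alive_free * incidence
            risk += new_cases
            # those who develop diabetes or die leave the at-risk pool
            alive_free -= new_cases + alive_free * mortality
    return risk

bands = [(5, 0.002, 0.001), (5, 0.005, 0.002),
         (5, 0.010, 0.005), (5, 0.015, 0.015)]
risk = lifetime_risk(bands)   # roughly 0.14 with these toy rates
```

The competing-risk subtraction is what keeps the estimate honest: raising mortality in the later bands lowers the lifetime diabetes risk, because fewer people survive long enough to develop the disease, which is the mirror image of the survival effect invoked in the Saskatchewan abstract above.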

20.

Background

Little is known about the incidence and causes of heparin-induced skin lesions. The 2 most commonly reported causes of heparin-induced skin lesions are immune-mediated heparin-induced thrombocytopenia and delayed-type hypersensitivity reactions.

Methods

We prospectively examined consecutive patients who received subcutaneous heparin (most often enoxaparin or nadroparin) for the presence of heparin-induced skin lesions. If such lesions were identified, we performed a skin biopsy, platelet count measurements, and antiplatelet-factor 4 antibody and allergy testing.

Results

We enrolled 320 patients. In total, 24 patients (7.5%, 95% confidence interval [CI] 4.7%–10.6%) had heparin-induced skin lesions. Delayed-type hypersensitivity reactions were identified as the cause in all 24 patients. One patient with histopathologic evidence of delayed-type hypersensitivity tested positive for antiplatelet-factor 4 antibodies. We identified the following risk factors for heparin-induced skin lesions: a body mass index greater than 25 (odds ratio [OR] 4.6, 95% CI 1.7–15.3), duration of heparin therapy longer than 9 days (OR 5.9, 95% CI 1.9–26.3) and female sex (OR 3.0, 95% CI 1.1–8.8).

Interpretation

Heparin-induced skin lesions are relatively common, have identifiable risk factors and are commonly caused by a delayed-type hypersensitivity reaction (type IV allergic response). (ClinicalTrials.gov trial register no. NCT00510432.) Heparin has been used as an anticoagulant for over 60 years.1 Well-known adverse effects of heparin therapy are bleeding, osteoporosis, hair loss, and immune and nonimmune heparin-induced thrombocytopenia. The incidence of heparin-induced skin lesions is unknown, despite being increasingly reported.24 Heparin-induced skin lesions may be caused by at least 5 mechanisms: delayed-type (type IV) hypersensitivity responses,2,46 immune-mediated thrombocytopenia,3 type I allergic reactions,7,8 skin necrosis9 and pustulosis.10 Heparin-induced skin lesions may indicate the presence of life-threatening heparin-induced thrombocytopenia11 — even in the absence of thrombocytopenia.3 There are no data available on the incidence of heparin-induced skin lesions or their causes. Given the rising number of reports of heparin-induced skin lesions and the importance of correctly diagnosing this condition, we sought to determine the incidence of heparin-induced skin lesions.
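The headline incidence (24 of 320, i.e., 7.5%) and an interval around it can be recomputed from the counts. The sketch below uses a Wilson score interval, so it differs slightly from the interval reported in the abstract (4.7%–10.6%), which likely used another method such as the exact binomial:

```python
# Wilson score interval for a binomial proportion. NOTE: this is an
# illustrative method choice; the abstract's 4.7%-10.6% CI was likely
# computed differently (e.g., exact Clopper-Pearson), so values here
# are close but not identical.
import math

def wilson_ci(events, n, z=1.96):
    p = events / n
    denom = 1 + z**2 / n
    centre = p + z**2 / (2 * n)
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return (centre - half) / denom, (centre + half) / denom

lo, hi = wilson_ci(24, 320)   # roughly 5.1% to 10.9%
```

The Wilson interval is a common default for proportions because, unlike the naive Wald interval, it behaves well when the proportion is small, as it is here.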
