Similar articles
1.

Background:

Many people with depression experience repeated episodes. Previous research into the predictors of chronic depression has focused primarily on the clinical features of the disease; however, little is known about the broader spectrum of sociodemographic and health factors inherent in its development. Our aim was to identify factors associated with a long-term negative prognosis of depression.

Methods:

We included 585 people aged 16 years and older who participated in the 2000/01 cycle of the National Population Health Survey and who reported experiencing a major depressive episode in 2000/01. The primary outcome was the course of depression until 2006/07. We grouped individuals into trajectories of depression using growth trajectory models. We included demographic, mental and physical health factors as predictors in the multivariable regression model to compare people with different trajectories.

Results:

Participants fell into two main depression trajectories: those whose depression resolved and did not recur (44.7%) and those who experienced repeated episodes (55.3%). In the multivariable model, daily smoking (OR 2.68, 95% CI 1.54–4.67), low mastery (i.e., feeling that life circumstances are beyond one’s control) (OR 1.10, 95% CI 1.03–1.18) and history of depression (OR 3.5, 95% CI 1.95–6.27) were significant predictors (p < 0.05) of repeated episodes of depression.
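The odds ratios above are the kind of output produced by a multivariable logistic regression. As a rough, self-contained sketch of that general workflow — synthetic data and hypothetical variable names, not the survey data or the authors' trajectory analysis — the following fits a logistic model with statsmodels and exponentiates the coefficients to obtain odds ratios with 95% confidence intervals. The trajectory grouping itself would come from a separate growth (latent-class) trajectory model; this sketch covers only the odds-ratio step.

```python
# Illustrative sketch only: logistic regression on synthetic data, reporting
# odds ratios (exp(coefficient)) and 95% CIs, as in a multivariable model
# of repeated-episode status. Variable names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 585  # cohort size reported in the Methods

df = pd.DataFrame({
    "daily_smoker": rng.integers(0, 2, n),
    "low_mastery_score": rng.normal(20, 5, n),
    "prior_depression": rng.integers(0, 2, n),
})
# Simulate the outcome from an assumed logistic model
logit = -2.5 + 0.9 * df["daily_smoker"] + 0.08 * df["low_mastery_score"] \
        + 1.2 * df["prior_depression"]
df["repeated_episodes"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(df[["daily_smoker", "low_mastery_score", "prior_depression"]])
fit = sm.Logit(df["repeated_episodes"], X).fit(disp=False)

# Odds ratios and 95% confidence intervals from the fitted coefficients
or_table = pd.DataFrame({
    "OR": np.exp(fit.params),
    "CI_low": np.exp(fit.conf_int()[0]),
    "CI_high": np.exp(fit.conf_int()[1]),
}).drop(index="const")
print(or_table.round(2))
```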

Interpretation:

People with major depression who were current smokers or had low levels of mastery were at an increased risk of repeated episodes of depression. Future studies are needed to confirm the predictive value of these variables and to evaluate their accuracy for diagnosis and as a guide to treatment.

Depression is a common and often recurrent disorder that compromises daily functioning and is associated with a decrease in quality of life.1–3 Guidelines for the treatment of depression, such as those published by the Canadian Network for Mood and Anxiety Treatments (CANMAT)5 and the National Institute for Health and Clinical Excellence (NICE) in the United Kingdom,4 often recommend antidepressant treatment in patients with severe symptoms and outline specific risk factors supporting long-term treatment maintenance.4,5 However, for patients who do not meet the criteria for treatment of depression, the damaging sequelae of depression are frequently compounded without treatment.5 In such cases, early treatment for depression may result in an improved long-term prognosis.6–8

A small but growing number of studies have begun to characterize the long-term course of depression in terms of severity,9 life-time prevalence10 and patterns of recurrence.11 However, a recent systematic review of the risk factors of chronic depression highlighted a need for longitudinal studies to better identify prognostic factors.12 The capacity to distinguish long-term patterns of recurrence of depression in relation to the wide range of established clinical and nonclinical factors for depression could be highly beneficial. Our objective was to use a population-based cohort to identify and understand the baseline factors associated with a long-term negative prognosis of depression.

2.

Background:

The gut microbiota is essential to human health throughout life, yet the acquisition and development of this microbial community during infancy remains poorly understood. Meanwhile, there is increasing concern over rising rates of cesarean delivery and insufficient exclusive breastfeeding of infants in developed countries. In this article, we characterize the gut microbiota of healthy Canadian infants and describe the influence of cesarean delivery and formula feeding.

Methods:

We included a subset of 24 term infants from the Canadian Healthy Infant Longitudinal Development (CHILD) birth cohort. Mode of delivery was obtained from medical records, and mothers were asked to report on infant diet and medication use. Fecal samples were collected at 4 months of age, and we characterized the microbiota composition using high-throughput DNA sequencing.

Results:

We observed high variability in the profiles of fecal microbiota among the infants. The profiles were generally dominated by Actinobacteria (mainly the genus Bifidobacterium) and Firmicutes (with diverse representation from numerous genera). Compared with breastfed infants, formula-fed infants had increased richness of species, with overrepresentation of Clostridium difficile. Escherichia–Shigella and Bacteroides species were underrepresented in infants born by cesarean delivery. Infants born by elective cesarean delivery had particularly low bacterial richness and diversity.
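Statements about "richness" and "diversity" above are computed from per-sample taxon count tables produced by the sequencing pipeline. Below is a minimal sketch, using made-up counts rather than CHILD data, of how species richness and Shannon diversity are typically calculated.

```python
# Minimal sketch: species richness and Shannon diversity from a hypothetical
# table of sequence counts per taxon (columns) per infant sample (rows).
import numpy as np
import pandas as pd

counts = pd.DataFrame(
    {"Bifidobacterium": [1200, 300, 0],
     "Clostridium_difficile": [0, 150, 40],
     "Bacteroides": [80, 0, 500],
     "Escherichia_Shigella": [10, 0, 220]},
    index=["sample_A", "sample_B", "sample_C"],
)

def richness(row):
    # Number of taxa observed at least once in the sample
    return int((row > 0).sum())

def shannon(row):
    # Shannon diversity index H' = -sum(p * ln p) over observed taxa
    p = row[row > 0] / row.sum()
    return float(-(p * np.log(p)).sum())

summary = pd.DataFrame({
    "richness": counts.apply(richness, axis=1),
    "shannon_diversity": counts.apply(shannon, axis=1),
})
print(summary.round(3))
```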

Interpretation:

These findings advance our understanding of the gut microbiota in healthy infants. They also provide new evidence for the effects of delivery mode and infant diet as determinants of this essential microbial community in early life.The human body harbours trillions of microbes, known collectively as the “human microbiome.” By far the highest density of commensal bacteria is found in the digestive tract, where resident microbes outnumber host cells by at least 10 to 1. Gut bacteria play a fundamental role in human health by promoting intestinal homeostasis, stimulating development of the immune system, providing protection against pathogens, and contributing to the processing of nutrients and harvesting of energy.1,2 The disruption of the gut microbiota has been linked to an increasing number of diseases, including inflammatory bowel disease, necrotizing enterocolitis, diabetes, obesity, cancer, allergies and asthma.1 Despite this evidence and a growing appreciation for the integral role of the gut microbiota in lifelong health, relatively little is known about the acquisition and development of this complex microbial community during infancy.3Two of the best-studied determinants of the gut microbiota during infancy are mode of delivery and exposure to breast milk.4,5 Cesarean delivery perturbs normal colonization of the infant gut by preventing exposure to maternal microbes, whereas breastfeeding promotes a “healthy” gut microbiota by providing selective metabolic substrates for beneficial bacteria.3,5 Despite recommendations from the World Health Organization,6 the rate of cesarean delivery has continued to rise in developed countries and rates of breastfeeding decrease substantially within the first few months of life.7,8 In Canada, more than 1 in 4 newborns are born by cesarean delivery, and less than 15% of infants are exclusively breastfed for the recommended duration of 6 months.9,10 In some parts of the world, elective cesarean deliveries are performed by maternal request, often because of apprehension about pain during childbirth, and sometimes for patient–physician convenience.11The potential long-term consequences of decisions regarding mode of delivery and infant diet are not to be underestimated. Infants born by cesarean delivery are at increased risk of asthma, obesity and type 1 diabetes,12 whereas breastfeeding is variably protective against these and other disorders.13 These long-term health consequences may be partially attributable to disruption of the gut microbiota.12,14Historically, the gut microbiota has been studied with the use of culture-based methodologies to examine individual organisms. However, up to 80% of intestinal microbes cannot be grown in culture.3,15 New technology using culture-independent DNA sequencing enables comprehensive detection of intestinal microbes and permits simultaneous characterization of entire microbial communities. Multinational consortia have been established to characterize the “normal” adult microbiome using these exciting new methods;16 however, these methods have been underused in infant studies. 
Because early colonization may have long-lasting effects on health, infant studies are vital.3,4 Among the few studies of infant gut microbiota using DNA sequencing, most were conducted in restricted populations, such as infants delivered vaginally,17 infants born by cesarean delivery who were formula-fed18 or preterm infants with necrotizing enterocolitis.19Thus, the gut microbiota is essential to human health, yet the acquisition and development of this microbial community during infancy remains poorly understood.3 In the current study, we address this gap in knowledge using new sequencing technology and detailed exposure assessments20 of healthy Canadian infants selected from a national birth cohort to provide representative, comprehensive profiles of gut microbiota according to mode of delivery and infant diet.  相似文献   

3.

Introduction:

Pregnancy causes diverse physiologic and lifestyle changes that may contribute to increased driving and driving error. We compared the risk of a serious motor vehicle crash during the second trimester to the baseline risk before pregnancy.

Methods:

We conducted a population-based self-matched longitudinal cohort analysis of women who gave birth in Ontario between April 1, 2006, and March 31, 2011. We excluded women less than age 18 years, those living outside Ontario, those who lacked a valid health card identifier under universal insurance, and those under the care of a midwife. The primary outcome was a motor vehicle crash resulting in a visit to an emergency department.

Results:

A total of 507 262 women gave birth during the study period. These women accounted for 6922 motor vehicle crashes as drivers during the 3-year baseline interval (177 per mo) and 757 motor vehicle crashes as drivers during the second trimester (252 per mo), equivalent to a 42% relative increase (95% confidence interval 32%–53%; p < 0.001). The increased risk extended to diverse populations, varied obstetrical cases and different crash characteristics. The increased risk was largest in the early second trimester and had abated by the third trimester. No similar increase was observed in crashes as passengers or pedestrians, cases of intentional injury or inadvertent falls, or self-reported risky behaviours.
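The 42% figure is a rate ratio of crashes per month in the second trimester versus baseline. As a back-of-the-envelope illustration — not the study's self-matched analysis — the sketch below recovers a similar ratio and an approximate Poisson-based confidence interval from the aggregate counts and monthly rates quoted above.

```python
# Rough illustration: rate ratio and 95% CI for crash counts in two intervals,
# using a normal approximation on the log rate ratio (SE = sqrt(1/n1 + 1/n2)).
# Inputs are the aggregate figures quoted in the abstract.
import math

def rate_ratio(events_exposed, rate_exposed, events_baseline, rate_baseline):
    rr = rate_exposed / rate_baseline
    se_log_rr = math.sqrt(1 / events_exposed + 1 / events_baseline)
    lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
    hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
    return rr, lo, hi

# 757 crashes at ~252/month in the second trimester v. 6922 at ~177/month at baseline
rr, lo, hi = rate_ratio(757, 252, 6922, 177)
print(f"rate ratio {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
# -> roughly 1.42 (1.32-1.53), i.e., about a 42% relative increase
```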

Interpretation:

Pregnancy is associated with a substantial risk of a serious motor vehicle crash during the second trimester. This risk merits attention for prenatal care.Motor vehicle crashes are the leading cause of fetal death related to maternal trauma.14 The outcomes for survivors are also concerning, given that brain injury in early life can contribute to neurologic deficits in later life.5 Emergency care of an injured pregnant woman is further problematic because the physiologic changes of pregnancy can mask the usual signs of acute blood loss (e.g., tachycardia, hypotension), resuscitation science is incomplete (e.g., clinical trials usually exclude pregnant women) and trauma protocols need adjustment (e.g., iodine contrast radiography can potentially harm a fetus).4,5 Even rudimentary care such as analgesia can be complicated when a pregnant woman is involved.6 Every crash creates worry and potential future litigation that might have been avoided if the crash had been prevented.7,8Motor vehicle crashes occur when human error aligns with system failures.9,10 In the United States, the net effect is about 15 million crashes annually, resulting in about 2.5 million individuals sent to hospital with fractures, concussions, ruptured vessels, organ lacerations, soft tissue damage or other injuries.11 The specific details of common human errors are not well understood; in contrast, life-threatening defects in the vehicle or roadway are relatively blatant and infrequent.12 One pattern of human error is that people are overly confident, misjudge their abilities and fail to take protective actions.13 The shared nature of many motor vehicle crashes also makes it easy to blame the other person involved and fail to learn from past experience.14We questioned whether pregnancy might interact with human error and increase the risk of a serious motor vehicle crash. Intermittent nausea, general fatigue, unintended distraction and sleep disruption are common features of a normal pregnancy that sometimes underlie human error.1517 Important physiologic changes related to pregnancy can occur before overt changes in anatomy are apparent.18 Hence, the intermediate stages of pregnancy provide a potential interval of overconfidence when a person could be compromised yet still active.19 The aim of our study was to examine the risk of a serious motor vehicle crash during pregnancy with special attention to the first, second and third trimesters separately.  相似文献   

4.
Schultz AS, Finegan B, Nykiforuk CI, Kvern MA. CMAJ. 2011;183(18):E1334–E1344.

Background:

Many hospitals have adopted smoke-free policies on their property. We examined the consequences of such policies at two Canadian tertiary acute-care hospitals.

Methods:

We conducted a qualitative study using ethnographic techniques over a six-month period. Participants (n = 186) shared their perspectives on and experiences with tobacco dependence and managing the use of tobacco, as well as their impressions of the smoke-free policy. We interviewed inpatients individually from eight wards (n = 82), key policy-makers (n = 9) and support staff (n = 14) and held 16 focus groups with health care providers and ward staff (n = 81). We also reviewed ward documents relating to tobacco dependence and looked at smoking-related activities on hospital property.

Results:

Noncompliance with the policy and exposure to secondhand smoke were ongoing concerns. People’s impressions of the use of tobacco varied, including divergent opinions as to whether such use was a bad habit or an addiction. Treatment for tobacco dependence and the management of symptoms of withdrawal were offered inconsistently. Participants voiced concerns over patient safety and leaving the ward to smoke.

Interpretation:

Policies mandating smoke-free hospital property have important consequences beyond noncompliance, including concerns over patient safety and disruptions to care. Without adequately available and accessible support for withdrawal from tobacco, patients will continue to face personal risk when they leave hospital property to smoke.

Canadian cities and provinces have passed smoking bans with the goal of reducing people’s exposure to secondhand smoke in workplaces, public spaces and on the property adjacent to public buildings.1,2 In response, Canadian health authorities and hospitals began implementing policies mandating smoke-free hospital property, with the goals of reducing the exposure of workers, patients and visitors to tobacco smoke while delivering a public health message about the dangers of smoking.2–5 An additional anticipated outcome was the reduced use of tobacco among patients and staff. The impetuses for adopting smoke-free policies include public support for such legislation and the potential for litigation for exposure to secondhand smoke.2,4

Tobacco use is a modifiable risk factor associated with a variety of cancers, cardiovascular diseases and respiratory conditions.6–11 Patients in hospital who use tobacco tend to have more surgical complications and exacerbations of acute and chronic health conditions than patients who do not use tobacco.6–11 Any policy aimed at reducing exposure to tobacco in hospitals is well supported by evidence, as is the integration of interventions targeting tobacco dependence.12 Unfortunately, most of the nearly five million Canadians who smoke will receive suboptimal treatment,13 as the routine provision of interventions for tobacco dependence in hospital settings is not a practice norm.14–16 In smoke-free hospitals, two studies suggest minimal support is offered for withdrawal,17,18 and one reports an increased use of nicotine-replacement therapy after the implementation of the smoke-free policy.19

Assessments of the effectiveness of smoke-free policies for hospital property tend to focus on noncompliance and related issues of enforcement.17,20,21 Although evidence of noncompliance and litter on hospital property2,17,20 implies ongoing exposure to tobacco smoke, half of the participating hospital sites in one study reported less exposure to tobacco smoke within hospital buildings and on the property.18 In addition, there is evidence to suggest some decline in smoking among staff.18,19,21,22

We sought to determine the consequences of policies mandating smoke-free hospital property in two Canadian acute-care hospitals by eliciting lived experiences of the people faced with enacting the policies: patients and health care providers. In addition, we elicited stories from hospital support staff and administrators regarding the policies.

5.
Gronich N, Lavi I, Rennert G. CMAJ. 2011;183(18):E1319–E1325.

Background:

Combined oral contraceptives are a common method of contraception, but they carry a risk of venous and arterial thrombosis. We assessed whether use of drospirenone was associated with an increase in thrombotic risk relative to third-generation combined oral contraceptives.

Methods:

Using computerized records of the largest health care provider in Israel, we identified all women aged 12 to 50 years for whom combined oral contraceptives had been dispensed between Jan. 1, 2002, and Dec. 31, 2008. We followed the cohort until 2009. We used Poisson regression models to estimate the crude and adjusted rate ratios for risk factors for venous thrombotic events (specifically deep vein thrombosis and pulmonary embolism) and arterial thrombotic events (specifically transient ischemic attack and cerebrovascular accident). We performed multivariable analyses to compare types of contraceptives, with adjustment for the various risk factors.

Results:

We identified a total of 1017 (0.24%) venous and arterial thrombotic events among 431 223 use episodes during 819 749 woman-years of follow-up (6.33 venous events and 6.10 arterial events per 10 000 woman-years). In a multivariable model, use of drospirenone carried an increased risk of venous thrombotic events, relative to both third-generation combined oral contraceptives (rate ratio [RR] 1.43, 95% confidence interval [CI] 1.15–1.78) and second-generation combined oral contraceptives (RR 1.65, 95% CI 1.02–2.65). There was no increase in the risk of arterial thrombosis with drospirenone.
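Rate ratios per woman-years of follow-up, as described in the Methods, are commonly estimated with a Poisson regression that includes a log person-time offset. The sketch below illustrates that modelling pattern on synthetic data with hypothetical covariate names; it is not the study's claims-database analysis, and the event rate is deliberately inflated so the toy dataset contains enough events to fit.

```python
# Sketch: Poisson regression of thrombotic events with a log(woman-years)
# offset, so exponentiated coefficients are rate ratios. Synthetic data only.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 5000
df = pd.DataFrame({
    "drospirenone": rng.integers(0, 2, n),   # 1 = drospirenone pill, 0 = comparator
    "age_over_40": rng.integers(0, 2, n),
    "woman_years": rng.uniform(0.5, 4.0, n), # follow-up per use episode
})
base_rate = 0.01  # events per woman-year; inflated vs. reality for a stable toy fit
mu = base_rate * df["woman_years"] * np.exp(0.36 * df["drospirenone"]
                                            + 0.5 * df["age_over_40"])
df["events"] = rng.poisson(mu)

X = sm.add_constant(df[["drospirenone", "age_over_40"]])
fit = sm.GLM(df["events"], X, family=sm.families.Poisson(),
             offset=np.log(df["woman_years"])).fit()

rate_ratios = np.exp(fit.params).drop("const")   # exp(coef) = rate ratio
ci = np.exp(fit.conf_int()).drop("const")        # 95% CI on the rate-ratio scale
print(pd.concat([rate_ratios.rename("RR"), ci], axis=1).round(2))
```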

Interpretation:

Use of drospirenone-containing oral contraceptives was associated with an increased risk of deep vein thrombosis and pulmonary embolism, but not transient ischemic attack or cerebrovascular accident, relative to second- and third-generation combined oral contraceptives.

Oral hormonal therapy is the preferred method of contraception, especially among young women. In the United States in 2002, 12 million women were using “the pill.”1 In a survey of households in Great Britain conducted in 2005 and 2006, one-quarter of women 16 to 49 years of age were using this form of contraception.2 A large variety of combined oral contraceptive preparations are available, differing in terms of estrogen dose and in terms of the dose and type of the progestin component. Among preparations currently in use, the estrogen dose ranges from 15 to 35 μg, and the progestins are second-generation, third-generation or newer. The second-generation progestins (levonorgestrel and norgestrel), which are derivatives of testosterone, have differing degrees of androgenic and estrogenic activities. The structure of these agents was modified to reduce the androgenic activity, thus producing the third-generation progestins (desogestrel, gestodene and norgestimate). Newer progestins are chlormadinone acetate, a derivative of progesterone, and drospirenone, an analogue of the aldosterone antagonist spironolactone having antimineralocorticoid and antiandrogenic activities. Drospirenone is promoted as causing less weight gain and edema than other forms of oral contraceptives, but few well-designed studies have compared the minor adverse effects of these drugs.3

The use of oral contraceptives has been reported to confer an increased risk of venous and arterial thrombotic events,4–7 specifically an absolute risk of venous thrombosis of 6.29 per 10 000 woman-years, compared with 3.01 per 10 000 woman-years among nonusers.8 It has long been accepted that there is a dose–response relationship between estrogen and the risk of venous thrombotic events. Reducing the estrogen dose from 50 μg to 20–30 μg has reduced the risk.9 Studies published since the mid-1990s have suggested a greater risk of venous thrombotic events with third-generation oral contraceptives than with second-generation formulations,10–13 indicating that the risk is also progestin-dependent. The pathophysiological mechanism of the risk with different progestins is unknown. A twofold increase in the risk of arterial events (specifically ischemic stroke6,14 and myocardial infarction7) has been observed in case–control studies for users of second-generation pills and possibly also third-generation preparations.7,14

Conflicting information is available regarding the risk of venous and arterial thrombotic events associated with drospirenone. An increased risk of venous thromboembolism, relative to second-generation pills, has been reported recently,8,15,16 whereas two manufacturer-sponsored studies claimed no increase in risk.17,18 In the study reported here, we investigated the risk of venous and arterial thrombotic events among users of various oral contraceptives in a large population-based cohort.

6.

Background

Fractures have largely been assessed by their impact on quality of life or health care costs. We conducted this study to evaluate the relation between fractures and mortality.

Methods

A total of 7753 randomly selected people (2187 men and 5566 women) aged 50 years and older from across Canada participated in a 5-year observational cohort study. Incident fractures were identified on the basis of validated self-report and were classified by type (vertebral, pelvic, forearm or wrist, rib, hip and “other”). We subdivided fracture groups by the year in which the fracture occurred during follow-up; those occurring in the fourth and fifth years were grouped together. We examined the relation between the time of the incident fracture and death.

Results

Compared with participants who had no fracture during follow-up, those who had a vertebral fracture in the second year were at increased risk of death (adjusted hazard ratio [HR] 2.7, 95% confidence interval [CI] 1.1–6.6); also at risk were those who had a hip fracture during the first year (adjusted HR 3.2, 95% CI 1.4–7.4). Among women, the risk of death was increased for those with a vertebral fracture during the first year (adjusted HR 3.7, 95% CI 1.1–12.8) or the second year of follow-up (adjusted HR 3.2, 95% CI 1.2–8.1). The risk of death was also increased among women with hip fracture during the first year of follow-up (adjusted HR 3.0, 95% CI 1.0–8.7).
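Adjusted hazard ratios of this kind are usually estimated with a Cox proportional hazards model. Below is a minimal, hedged sketch using the lifelines package on simulated data; the covariates are placeholders, not the cohort's variables.

```python
# Minimal sketch: Cox proportional hazards model relating incident fracture
# to time until death, adjusted for age and sex. Synthetic data; hazard
# ratios are read off as exp(coefficient).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({
    "hip_fracture": rng.integers(0, 2, n),
    "age": rng.uniform(50, 90, n),
    "male": rng.integers(0, 2, n),
})
# Simulate survival times with higher hazard for fracture and older age
hazard = 0.02 * np.exp(1.0 * df["hip_fracture"] + 0.03 * (df["age"] - 50))
time_to_death = rng.exponential(1 / hazard)
df["years_followed"] = np.minimum(time_to_death, 5.0)  # censor at 5 years of follow-up
df["died"] = (time_to_death <= 5.0).astype(int)

cph = CoxPHFitter()
cph.fit(df, duration_col="years_followed", event_col="died")
print(cph.hazard_ratios_)          # exp(coef) for each covariate
print(cph.confidence_intervals_)   # 95% CIs on the coefficient (log-hazard) scale
```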

Interpretation

Vertebral and hip fractures are associated with an increased risk of death. Interventions that reduce the incidence of these fractures need to be implemented to improve survival.

Osteoporosis-related fractures are a major health concern, affecting a growing number of individuals worldwide. The burden of fracture has largely been assessed by the impact on health-related quality of life and health care costs.1,2 Fractures can also be associated with death. However, trials that have examined the relation between fractures and mortality have had limitations that may influence their results and the generalizability of the studies, including small samples,3,4 the examination of only 1 type of fracture,4–10 the inclusion of only women,8,11 the enrolment of participants from specific areas (i.e., hospitals or certain geographic regions),3,4,7,8,10,12 the nonrandom selection of participants3–11 and the lack of statistical adjustment for confounding factors that may influence mortality.3,5–7,12

We evaluated the relation between incident fractures and mortality over a 5-year period in a cohort of men and women 50 years of age and older. In addition, we examined whether other characteristics of participants were risk factors for death.

7.
8.
Shah PS, Ohlsson A. CMAJ. 2009;180(12):E99–E108.

Background

Reduced intake of micronutrients during pregnancy exposes women to nutritional deficiencies and may affect fetal growth. We conducted a systematic review to examine the efficacy of prenatal supplementation with multimicronutrients on pregnancy outcomes.

Methods

We searched MEDLINE, EMBASE, CINAHL and the Cochrane Library for relevant articles published in English up to December 2008. We also searched the bibliographies of selected articles as well as clinical trial registries. The primary outcome was low birth weight; secondary outcomes were preterm birth, small-for-gestational-age infants, birth weight and gestational age.

Results

We observed a significant reduction in the risk of low birth weight among infants born to women who received multimicronutrients during pregnancy compared with placebo (relative risk [RR] 0.81, 95% confidence interval [CI] 0.73–0.91) or iron–folic acid supplementation (RR 0.83, 95% CI 0.74–0.93). Birth weight was significantly higher among infants whose mothers were in the multimicronutrient group than among those whose mothers received iron–folic acid supplementation (weighted mean difference 54 g, 95% CI 36 g–72 g). There were no significant differences in the risk of preterm birth or small-for-gestational-age infants between the 3 study groups.
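Pooled estimates such as RR 0.81 (95% CI 0.73–0.91) come from inverse-variance weighting of the individual trial results on the log scale. The sketch below shows a fixed-effect pooling of relative risks for hypothetical trials, not the trials included in this review.

```python
# Sketch of a fixed-effect (inverse-variance) meta-analysis of relative risks.
# Each tuple is a hypothetical trial: (RR, lower 95% CI, upper 95% CI).
import math

trials = [(0.85, 0.70, 1.03), (0.78, 0.62, 0.98), (0.83, 0.71, 0.97)]

weights, weighted_logs = [], []
for rr, lo, hi in trials:
    log_rr = math.log(rr)
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE recovered from the CI
    w = 1 / se**2                                    # inverse-variance weight
    weights.append(w)
    weighted_logs.append(w * log_rr)

pooled_log = sum(weighted_logs) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))
pooled_rr = math.exp(pooled_log)
ci = (math.exp(pooled_log - 1.96 * pooled_se),
      math.exp(pooled_log + 1.96 * pooled_se))
print(f"pooled RR {pooled_rr:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```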

Interpretation

Prenatal multimicronutrient supplementation was associated with a significantly reduced risk of low birth weight and with improved birth weight when compared with iron–folic acid supplementation. There was no significant effect of multimicronutrient supplementation on the risk of preterm birth or small-for-gestational-age infants.Nutrition plays an important role in the growth and development of the fetus. Studies of the nutritional status of pregnant women during the Dutch famine revealed increased risks of infertility, abortion, fetal intrauterine growth restriction and perinatal mortality among malnourished women.1 In many parts of the world, a similar situation exists for many pregnant women with respect to nutrition. Overall, the diet of pregnant women has been reported to be deficient in calories and micronutrients.2 Both macro- and micronutrients are important for a woman to sustain pregnancy and for appropriate growth of the fetus.The exact mechanisms of how supplementation with micronutrients can affect pregnancy outcomes are not completely understood. Possible mechanisms for beneficial effects include a generalized improvement in the immune function of women, with a reduced incidence of infections and subsequent reduced incidence of preterm birth;3 improved energy metabolism and anabolic processes in the mother, with a reduced incidence of fetal intrauterine growth restriction;3 improved ability to respond to stress;3 expansion of plasma volume secondary to fluid retention, with subsequent improvements in fetal growth;4 improved hemoglobin levels;5 and increased absorption of iron related to intake of vitamin C and riboflavin, with subsequent improvement in hemoglobin levels.5Potential disadvantages include adverse interactions of micronutrients when supplied in combination;6 enhanced or reduced absorption of one nutrient by other nutrients (e.g., interaction between iron and vitamin C, and iron and zinc);7 deleterious effects on the fetus and the mother from overdose of any one component (e.g., vitamin A overdose);6 and cost.6Potential barriers include the lack of well-defined government policies on maternal health and nutrition.6 A multicomponent approach has been criticized from the standpoint that some micronutrients may be necessary, some may not be needed and some may even be harmful.2 Generalized or mass supplementation with multimicronutrients may have different effects on pregnancy outcomes depending on the underlying nutritional status of the women.On the basis of a systematic review performed in 2005, the World Health Organization currently recommends iron–folic acid supplementation for all pregnant women.8,9 The review reported that multimicronutrient supplementation during pregnancy were more efficacious than 2 or fewer micronutrients in reducing the rates of low birth weight and small-for-gestational-age births. However, when multimicronutrients were compared with iron–folic acid supplementation, no evidence of a difference was noted.7 Further research in this area was encouraged because information was derived from a few reports. Since then, several randomized controlled trials have evaluated the efficacy of multimicronutrients and have reported varied results. 
With advancement in our knowledge from recently reported trials,10 we conducted a systematic review and meta-analysis of the efficacy of supplementation with multimicronutrients during pregnancy in reducing the rates of low birth weight, preterm birth and small-for-gestational-age births compared with placebo or iron–folic acid supplementation.  相似文献   

9.

Background

The pathogenesis of appendicitis is unclear. We evaluated whether exposure to air pollution was associated with an increased incidence of appendicitis.

Methods

We identified 5191 adults who had been admitted to hospital with appendicitis between Apr. 1, 1999, and Dec. 31, 2006. The air pollutants studied were ozone, nitrogen dioxide, sulfur dioxide, carbon monoxide, and suspended particulate matter of less than 10 μm and less than 2.5 μm in diameter. We estimated the odds of appendicitis relative to short-term increases in concentrations of selected pollutants, alone and in combination, after controlling for temperature and relative humidity as well as the effects of age, sex and season.

Results

An increase in the interquartile range of the 5-day average of ozone was associated with appendicitis (odds ratio [OR] 1.14, 95% confidence interval [CI] 1.03–1.25). In summer (July–August), the effects were most pronounced for ozone (OR 1.32, 95% CI 1.10–1.57), sulfur dioxide (OR 1.30, 95% CI 1.03–1.63), nitrogen dioxide (OR 1.76, 95% CI 1.20–2.58), carbon monoxide (OR 1.35, 95% CI 1.01–1.80) and particulate matter less than 10 μm in diameter (OR 1.20, 95% CI 1.05–1.38). We observed a significant effect of the air pollutants in the summer months among men but not among women (e.g., OR for increase in the 5-day average of nitrogen dioxide 2.05, 95% CI 1.21–3.47, among men and 1.48, 95% CI 0.85–2.59, among women). The double-pollutant model of exposure to ozone and nitrogen dioxide in the summer months was associated with attenuation of the effects of ozone (OR 1.22, 95% CI 1.01–1.48) and nitrogen dioxide (OR 1.48, 95% CI 0.97–2.24).
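An odds ratio "per interquartile-range increase" is obtained by scaling the fitted log-odds coefficient for the pollutant by its IQR before exponentiating. A schematic of that conversion follows; the coefficient, standard error and IQR values are hypothetical, chosen only for illustration.

```python
# Schematic only: converting a logistic-regression coefficient for a pollutant
# (per unit of concentration) into an odds ratio per interquartile-range (IQR)
# increase, as quoted above. All numbers are hypothetical.
import math

beta_per_ppb = 0.0066   # hypothetical log-odds change per 1 ppb of ozone
se_per_ppb = 0.0025     # hypothetical standard error of that coefficient
iqr_ppb = 20.0          # hypothetical IQR of the 5-day average ozone concentration

or_per_iqr = math.exp(beta_per_ppb * iqr_ppb)
ci = (math.exp((beta_per_ppb - 1.96 * se_per_ppb) * iqr_ppb),
      math.exp((beta_per_ppb + 1.96 * se_per_ppb) * iqr_ppb))
print(f"OR per IQR increase: {or_per_iqr:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```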

Interpretation

Our findings suggest that some cases of appendicitis may be triggered by short-term exposure to air pollution. If these findings are confirmed, measures to improve air quality may help to decrease rates of appendicitis.Appendicitis was introduced into the medical vernacular in 1886.1 Since then, the prevailing theory of its pathogenesis implicated an obstruction of the appendiceal orifice by a fecalith or lymphoid hyperplasia.2 However, this notion does not completely account for variations in incidence observed by age,3,4 sex,3,4 ethnic background,3,4 family history,5 temporal–spatial clustering6 and seasonality,3,4 nor does it completely explain the trends in incidence of appendicitis in developed and developing nations.3,7,8The incidence of appendicitis increased dramatically in industrialized nations in the 19th century and in the early part of the 20th century.1 Without explanation, it decreased in the middle and latter part of the 20th century.3 The decrease coincided with legislation to improve air quality. For example, after the United States Clean Air Act was passed in 1970,9 the incidence of appendicitis decreased by 14.6% from 1970 to 1984.3 Likewise, a 36% drop in incidence was reported in the United Kingdom between 1975 and 199410 after legislation was passed in 1956 and 1968 to improve air quality and in the 1970s to control industrial sources of air pollution. Furthermore, appendicitis is less common in developing nations; however, as these countries become more industrialized, the incidence of appendicitis has been increasing.7Air pollution is known to be a risk factor for multiple conditions, to exacerbate disease states and to increase all-cause mortality.11 It has a direct effect on pulmonary diseases such as asthma11 and on nonpulmonary diseases including myocardial infarction, stroke and cancer.1113 Inflammation induced by exposure to air pollution contributes to some adverse health effects.1417 Similar to the effects of air pollution, a proinflammatory response has been associated with appendicitis.1820We conducted a case–crossover study involving a population-based cohort of patients admitted to hospital with appendicitis to determine whether short-term increases in concentrations of selected air pollutants were associated with hospital admission because of appendicitis.  相似文献   

10.

Background:

Polymyalgia rheumatica is one of the most common inflammatory rheumatologic conditions in older adults. Other inflammatory rheumatologic disorders are associated with an excess risk of vascular disease. We investigated whether polymyalgia rheumatica is associated with an increased risk of vascular events.

Methods:

We used the General Practice Research Database to identify patients with a diagnosis of incident polymyalgia rheumatica between Jan. 1, 1987, and Dec. 31, 1999. Patients were matched by age, sex and practice with up to 5 patients without polymyalgia rheumatica. Patients were followed until their first vascular event (cardiovascular, cerebrovascular, peripheral vascular) or the end of available records (May 2011). All participants were free of vascular disease before the diagnosis of polymyalgia rheumatica (or matched date). We used Cox regression models to compare time to first vascular event in patients with and without polymyalgia rheumatica.

Results:

A total of 3249 patients with polymyalgia rheumatica and 12 735 patients without were included in the final sample. Over a median follow-up period of 7.8 (interquartile range 3.3–12.4) years, the rate of vascular events was higher among patients with polymyalgia rheumatica than among those without (36.1 v. 12.2 per 1000 person-years; adjusted hazard ratio 2.6, 95% confidence interval 2.4–2.9). The increased risk of a vascular event was similar for each vascular disease end point. The magnitude of risk was higher in early disease and in patients younger than 60 years at diagnosis.

Interpretation:

Patients with polymyalgia rheumatica have an increased risk of vascular events. This risk is greatest in the youngest age groups. As with other forms of inflammatory arthritis, patients with polymyalgia rheumatica should have their vascular risk factors identified and actively managed to reduce this excess risk.Inflammatory rheumatologic disorders such as rheumatoid arthritis,1,2 systemic lupus erythematosus,2,3 gout,4 psoriatic arthritis2,5 and ankylosing spondylitis2,6 are associated with an increased risk of vascular disease, especially cardiovascular disease, leading to substantial morbidity and premature death.26 Recognition of this excess vascular risk has led to management guidelines advocating screening for and management of vascular risk factors.79Polymyalgia rheumatica is one of the most common inflammatory rheumatologic conditions in older adults,10 with a lifetime risk of 2.4% for women and 1.7% for men.11 To date, evidence regarding the risk of vascular disease in patients with polymyalgia rheumatica is unclear. There are a number of biologically plausible mechanisms between polymyalgia rheumatica and vascular disease. These include the inflammatory burden of the disease,12,13 the association of the disease with giant cell arteritis (causing an inflammatory vasculopathy, which may lead to subclinical arteritis, stenosis or aneurysms),14 and the adverse effects of long-term corticosteroid treatment (e.g., diabetes, hypertension and dyslipidemia).15,16 Paradoxically, however, use of corticosteroids in patients with polymyalgia rheumatica may actually decrease vascular risk by controlling inflammation.17 A recent systematic review concluded that although some evidence exists to support an association between vascular disease and polymyalgia rheumatica,18 the existing literature presents conflicting results, with some studies reporting an excess risk of vascular disease19,20 and vascular death,21,22 and others reporting no association.2326 Most current studies are limited by poor methodologic quality and small samples, and are based on secondary care cohorts, who may have more severe disease, yet most patients with polymyalgia rheumatica receive treatment exclusively in primary care.27The General Practice Research Database (GPRD), based in the United Kingdom, is a large electronic system for primary care records. It has been used as a data source for previous studies,28 including studies on the association of inflammatory conditions with vascular disease29 and on the epidemiology of polymyalgia rheumatica in the UK.30 The aim of the current study was to examine the association between polymyalgia rheumatica and vascular disease in a primary care population.  相似文献   

11.
Background:Head injuries have been associated with subsequent suicide among military personnel, but outcomes after a concussion in the community are uncertain. We assessed the long-term risk of suicide after concussions occurring on weekends or weekdays in the community.Methods:We performed a longitudinal cohort analysis of adults with diagnosis of a concussion in Ontario, Canada, from Apr. 1, 1992, to Mar. 31, 2012 (a 20-yr period), excluding severe cases that resulted in hospital admission. The primary outcome was the long-term risk of suicide after a weekend or weekday concussion.Results:We identified 235 110 patients with a concussion. Their mean age was 41 years, 52% were men, and most (86%) lived in an urban location. A total of 667 subsequent suicides occurred over a median follow-up of 9.3 years, equivalent to 31 deaths per 100 000 patients annually or 3 times the population norm. Weekend concussions were associated with a one-third further increased risk of suicide compared with weekday concussions (relative risk 1.36, 95% confidence interval 1.14–1.64). The increased risk applied regardless of patients’ demographic characteristics, was independent of past psychiatric conditions, became accentuated with time and exceeded the risk among military personnel. Half of these patients had visited a physician in the last week of life.Interpretation:Adults with a diagnosis of concussion had an increased long-term risk of suicide, particularly after concussions on weekends. Greater attention to the long-term care of patients after a concussion in the community might save lives because deaths from suicide can be prevented.Suicide is a leading cause of death in both military and community settings.1 During 2010, 3951 suicide deaths occurred in Canada2 and 38 364 in the United States.3 The frequency of attempted suicide is about 25 times higher, and the financial costs in the US equate to about US$40 billion annually.4 The losses from suicide in Canada are comparable to those in other countries when adjusted for population size.5 Suicide deaths can be devastating to surviving family and friends.6 Suicide in the community is almost always related to a psychiatric illness (e.g., depression, substance abuse), whereas suicide in the military is sometimes linked to a concussion from combat injury.710Concussion is the most common brain injury in young adults and is defined as a transient disturbance of mental function caused by acute trauma.11 About 4 million concussion cases occur in the US each year, equivalent to a rate of about 1 per 1000 adults annually;12 direct Canadian data are not available. The majority lead to self-limited symptoms, and only a small proportion have a protracted course.13 However, the frequency of depression after concussion can be high,14,15 and traumatic brain injury in the military has been associated with subsequent suicide.8,16 Severe head trauma resulting in admission to hospital has also been associated with an increased risk of suicide, whereas mild concussion in ambulatory adults is an uncertain risk factor.1720The aim of this study was to determine whether concussion was associated with an increased long-term risk of suicide and, if so, whether the day of the concussion (weekend v. weekday) could be used to identify patients at further increased risk. 
The severity and mechanism of injury may differ by day of the week because recreational injuries are more common on weekends and occupational injuries are more common on weekdays.2127 The risk of a second concussion, use of protective safeguards, propensity to seek care, subsequent oversight, sense of responsibility and other nuances may also differ for concussions acquired from weekend recreation rather than weekday work.2831 Medical care on weekends may also be limited because of shortfalls in staffing.32  相似文献   

12.
13.

Background

Chronic neuropathic pain affects 1%–2% of the adult population and is often refractory to standard pharmacologic treatment. Patients with chronic pain have reported using smoked cannabis to relieve pain, improve sleep and improve mood.

Methods

Adults with post-traumatic or postsurgical neuropathic pain were randomly assigned to receive cannabis at four potencies (0%, 2.5%, 6% and 9.4% tetrahydrocannabinol) over four 14-day periods in a crossover trial. Participants inhaled a single 25-mg dose through a pipe three times daily for the first five days in each cycle, followed by a nine-day washout period. Daily average pain intensity was measured using an 11-point numeric rating scale. We recorded effects on mood, sleep and quality of life, as well as adverse events.

Results

We recruited 23 participants (mean age 45.4 [standard deviation 12.3] years, 12 women [52%]), of whom 21 completed the trial. The average daily pain intensity, measured on the 11-point numeric rating scale, was lower on the prespecified primary contrast of 9.4% v. 0% tetrahydrocannabinol (5.4 v. 6.1, respectively; difference = 0.7, 95% confidence interval [CI] 0.02–1.4). Preparations with intermediate potency yielded intermediate but nonsignificant degrees of relief. Participants receiving 9.4% tetrahydrocannabinol reported improved ability to fall asleep (easier, p = 0.001; faster, p < 0.001; more drowsy, p = 0.003) and improved quality of sleep (less wakefulness, p = 0.01) relative to 0% tetrahydrocannabinol. We found no differences in mood or quality of life. The most common drug-related adverse events during the period when participants received 9.4% tetrahydrocannabinol were headache, dry eyes, burning sensation in areas of neuropathic pain, dizziness, numbness and cough.
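The primary result is a within-participant difference in mean daily pain between two crossover periods, reported with a 95% confidence interval. The sketch below shows a simple paired, t-based version of that comparison on synthetic scores; the actual trial analysis accounted for the full four-period crossover design.

```python
# Sketch: paired comparison of mean daily pain scores (0-10 numeric rating
# scale) under two crossover conditions, with a t-based 95% CI for the
# within-participant difference. Synthetic scores, not trial data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 21  # participants completing the trial
pain_placebo = np.clip(rng.normal(6.1, 1.5, n), 0, 10)
pain_thc_94 = np.clip(pain_placebo - rng.normal(0.7, 1.2, n), 0, 10)

diff = pain_placebo - pain_thc_94
mean_diff = diff.mean()
sem = stats.sem(diff)
half_width = stats.t.ppf(0.975, df=n - 1) * sem
print(f"mean difference {mean_diff:.2f} "
      f"(95% CI {mean_diff - half_width:.2f} to {mean_diff + half_width:.2f})")
```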

Conclusion

A single inhalation of 25 mg of 9.4% tetrahydrocannabinol herbal cannabis three times daily for five days reduced the intensity of pain, improved sleep and was well tolerated. Further long-term safety and efficacy studies are indicated. (International Standard Randomised Controlled Trial Register no. ISRCTN68314063)Chronic neuropathic pain has a prevalence of 1%–2%,1 and treatment options are limited.2 Pharmacotherapy includes anticonvulsants, antidepressants, opioids and local anesthetics,3,4 but responses vary and side effects limit compliance.Cannabis sativa has been used to treat pain since the third millennium BC.5 An endogenous pain-processing system has been identified, mediated by endogenous cannabinoid ligands acting on specific cannabinoid receptors.6 These findings, coupled with anecdotal evidence of the analgesic effects of smoked cannabis,7 support a reconsideration of cannabinoid agents as analgesics.Oral cannabinoids such as tetrahydrocannabinol, cannabidiol and nabilone have, alone and in combination, shown efficacy in central8,9 and peripheral10 neuropathic pain, rheumatoid arthritis11 and fibromyalgia.12The analgesic effects of smoked cannabis remain controversial, although it is used by 10%–15% of patients with chronic noncancer pain13 and multiple sclerosis.14 Clinical trials are needed to evaluate these effects, given that the risks and benefits of inhaled cannabinoids may differ from oral agents. To date, three small clinical trials of the analgesic efficacy of smoked cannabis have been reported.1517 All studies were conducted in residential laboratories, and participants smoked multiple doses of the drug at each time point. No study adequately reported data related to adverse events.We conducted a clinical trial using a standardized single-dose delivery system to explore further the safety and efficacy of smoked cannabis in outpatients with chronic neuropathic pain.  相似文献   

14.

Background:

Previous studies have suggested that the immunochemical fecal occult blood test has superior specificity for detecting bleeding in the lower gastrointestinal tract even if bleeding occurs in the upper tract. We conducted a large population-based study involving asymptomatic adults in Taiwan, a population with prevalent upper gastrointestinal lesions, to confirm this claim.

Methods:

We conducted a prospective cohort study involving asymptomatic people aged 18 years or more in Taiwan recruited to undergo an immunochemical fecal occult blood test, colonoscopy and esophagogastroduodenoscopy between August 2007 and July 2009. We compared the prevalence of lesions in the lower and upper gastrointestinal tracts between patients with positive and negative fecal test results. We also identified risk factors associated with a false-positive fecal test result.

Results:

Of the 2796 participants, 397 (14.2%) had a positive fecal test result. The sensitivity of the test for predicting lesions in the lower gastrointestinal tract was 24.3%, the specificity 89.0%, the positive predictive value 41.3%, the negative predictive value 78.7%, the positive likelihood ratio 2.22, the negative likelihood ratio 0.85 and the accuracy 73.4%. The prevalence of lesions in the lower gastrointestinal tract was higher among those with a positive fecal test result than among those with a negative result (41.3% v. 21.3%, p < 0.001). The prevalence of lesions in the upper gastrointestinal tract did not differ significantly between the two groups (20.7% v. 17.5%, p = 0.12). Almost all of the participants found to have colon cancer (27/28, 96.4%) had a positive fecal test result; in contrast, none of the three found to have esophageal or gastric cancer had a positive fecal test result (p < 0.001). Among those with a negative finding on colonoscopy, the risk factors associated with a false-positive fecal test result were use of antiplatelet drugs (adjusted odds ratio [OR] 2.46, 95% confidence interval [CI] 1.21–4.98) and a low hemoglobin concentration (adjusted OR 2.65, 95% CI 1.62–4.33).
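Every statistic in this paragraph derives from a single 2 × 2 table of fecal-test result against colonoscopy finding. The function below applies the standard definitions; the example counts are approximate reconstructions from the percentages quoted above and should be treated as illustrative rather than the study's exact table.

```python
# Standard diagnostic-accuracy metrics from a 2x2 table.
# tp: positive test & lesion present, fp: positive test & no lesion,
# fn: negative test & lesion present, tn: negative test & no lesion.
def diagnostic_metrics(tp, fp, fn, tn):
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "PPV": tp / (tp + fp),
        "NPV": tn / (tn + fn),
        "LR+": sens / (1 - spec),
        "LR-": (1 - sens) / spec,
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# Counts back-calculated approximately from the reported percentages (illustrative)
for name, value in diagnostic_metrics(tp=164, fp=233, fn=511, tn=1888).items():
    print(f"{name}: {value:.3f}")
```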

Interpretation:

The immunochemical fecal occult blood test was specific for predicting lesions in the lower gastrointestinal tract. However, the test did not adequately predict lesions in the upper gastrointestinal tract.The fecal occult blood test is a convenient tool to screen for asymptomatic gastrointestinal bleeding.1 When the test result is positive, colonoscopy is the strategy of choice to investigate the source of bleeding.2,3 However, 13%–42% of patients can have a positive test result but a negative colonoscopy,4 and it has not yet been determined whether asymptomatic patients should then undergo evaluation of the upper gastrointestinal tract.Previous studies showed that the frequency of lesions in the upper gastrointestinal tract was comparable or even higher than that of colonic lesions59 and that the use of esophagogastroduodenoscopy may change clinical management.10,11 Some studies showed that evaluation of the upper gastrointestinal tract helped to identify important lesions in symptomatic patients and those with iron deficiency anemia;12,13 however, others concluded that esophagogastroduodenoscopy was unjustified because important findings in the upper gastrointestinal tract were rare1417 and sometimes irrelevant to the results of fecal occult blood testing.1821 This controversy is related to the heterogeneity of study populations and to the limitations of the formerly used guaiac-based fecal occult blood test,520 which was not able to distinguish bleeding in the lower gastrointestinal tract from that originating in the upper tract.The guaiac-based fecal occult blood test is increasingly being replaced by the immunochemical-based test. The latter is recommended for detecting bleeding in the lower gastrointestinal tract because it reacts with human globin, a protein that is digested by enzymes in the upper gastrointestinal tract.22 With this advantage, the occurrence of a positive fecal test result and a negative finding on colonoscopy is expected to decrease.We conducted a population-based study in Taiwan to verify the performance of the immunochemical fecal occult blood test in predicting lesions in the lower gastrointestinal tract and to confirm that results are not confounded by the presence of lesions in the upper tract. In Taiwan, the incidence of colorectal cancer is rapidly increasing, and Helicobacter pylori-related lesions in the upper gastrointestinal tract remain highly prevalent.23 Same-day bidirectional endoscopies are therefore commonly used for cancer screening.24 This screening strategy provides an opportunity to evaluate the performance of the immunochemical fecal occult blood test.  相似文献   

15.

Background:

Acute kidney injury is a serious complication of elective major surgery. Acute dialysis is used to support life in the most severe cases. We examined whether rates and outcomes of acute dialysis after elective major surgery have changed over time.

Methods:

We used data from Ontario’s universal health care databases to study all consecutive patients who had elective major surgery at 118 hospitals between 1995 and 2009. Our primary outcomes were acute dialysis within 14 days of surgery, death within 90 days of surgery and chronic dialysis for patients who did not recover kidney function.

Results:

A total of 552 672 patients underwent elective major surgery during the study period, 2231 of whom received acute dialysis. The incidence of acute dialysis increased steadily from 0.2% in 1995 (95% confidence interval [CI] 0.15–0.2) to 0.6% in 2009 (95% CI 0.6–0.7). This increase was primarily in cardiac and vascular surgeries. Among patients who received acute dialysis, 937 died within 90 days of surgery (42.0%, 95% CI 40.0–44.1), with no change in 90-day survival over time. Among the 1294 patients who received acute dialysis and survived beyond 90 days, 352 required chronic dialysis (27.2%, 95% CI 24.8–29.7), with no change over time.
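The incidence figures above are simple proportions of surgical patients, reported with 95% confidence intervals. A minimal sketch of that calculation is shown below using the overall counts quoted here; the per-year figures in the text are computed the same way within each calendar year.

```python
# Sketch: proportion with an approximate (normal) 95% confidence interval,
# e.g., patients receiving acute dialysis out of all elective major surgeries.
import math

def proportion_ci(events, total):
    p = events / total
    se = math.sqrt(p * (1 - p) / total)
    return p, max(p - 1.96 * se, 0.0), p + 1.96 * se

p, lo, hi = proportion_ci(2231, 552_672)
print(f"{100*p:.2f}% (95% CI {100*lo:.2f}%-{100*hi:.2f}%)")
```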

Interpretation:

The use of acute dialysis after cardiac and vascular surgery has increased substantially since 1995. Studies focusing on interventions to better prevent and treat perioperative acute kidney injury are needed.More than 230 million elective major surgeries are done annually worldwide.1 Acute kidney injury is a serious complication of major surgery. It represents a sudden loss of kidney function that affects morbidity, mortality and health care costs.2 Dialysis is used for the most severe forms of acute kidney injury. In the nonsurgical setting, the incidence of acute dialysis has steadily increased over the last 15 years, and patients are now more likely to survive to discharge from hospital.35 Similarly, in the surgical setting, the incidence of acute dialysis appears to be increasing over time,610 with declining inhospital mortality.8,10,11Although previous studies have improved our understanding of the epidemiology of acute dialysis in the surgical setting, several questions remain. Many previous studies were conducted at a single centre, thereby limiting their generalizability.6,1214 Most multicentre studies were conducted in the nonsurgical setting and used diagnostic codes for acute kidney injury not requiring dialysis; however, these codes can be inaccurate.15,16 In contrast, a procedure such as dialysis is easily determined. The incidence of acute dialysis after elective surgery is of particular interest given the need for surgical consent, the severe nature of the event and the potential for mitigation. The need for chronic dialysis among patients who do not recover renal function after surgery has been poorly studied, yet this condition has a major affect on patient survival and quality of life.17 For these reasons, we studied secular trends in acute dialysis after elective major surgery, focusing on incidence, 90-day mortality and need for chronic dialysis.  相似文献   

16.
Yang CY. CMAJ. 2010;182(6):569–572.

Background

There are limited empirical data to support the theory of a protective effect of parenthood against suicide, as proposed by Durkheim in 1897. I conducted this study to examine whether there is an association between parity and risk of death from suicide among women.

Methods

The study cohort consisted of 1 292 462 women in Taiwan who had a first live birth between Jan. 1, 1978, and Dec. 31, 1987. The women were followed up from the date of their first birth to Dec. 31, 2007. Their vital status was ascertained by means of linking records with data from a computerized mortality database. Cox proportional hazard regression models were used to estimate hazard ratios of death from suicide associated with parity.

Results

There were 2252 deaths from suicide during 32 464 187 person-years of follow-up. Suicide-related mortality was 6.94 per 100 000 person-years. After adjustment for age at first birth, marital status, years of schooling and place of delivery, the adjusted hazard ratio was 0.61 (95% confidence interval [CI] 0.54–0.68) among women with two live births and 0.40 (95% CI 0.35–0.45) among those with three or more live births, compared with women who had one live birth. I observed a significantly decreasing trend in adjusted hazard ratios of suicide with increasing parity.

Interpretation

This study provides evidence to support Durkheim’s hypothesis that parenthood confers a protective effect against suicide.Childbearing is considered to have long-term effects on women’s health.1 However, little is known about the relation between parity and mortality among women except for cancers of the reproductive organs.2In his book on suicide published in 1897, Durkheim concluded that the rate of death from suicide was lower among married women than among unmarried women because of the effect of parenthood and not marriage per se.3 Three studies since then have explored Durkheim’s hypothesis. In the first, published almost 100 years later, Hoyer and Lund conducted a prospective study in Norway involving 989 949 married women aged 25 years or older who were followed up for 15 years.4 They reported a negative association between suicide-related mortality and number of children. In a nested case–control study in Denmark involving 6500 women who committed suicide between Jan. 1, 1981, and Dec. 31, 1997, and 130 000 matched control subjects, Qin and Mortensen found a significantly decreased risk of suicide with increasing number of children.5 In the third study, 12 055 pregnant women in Finland were followed up from delivery in 1966 until 2001; the authors found a decreasing trend in suicide-related mortality with increasing parity.1One reason for the limited empirical evidence exploring Durkheim’s hypothesis may have to do with sample size and study design.4 Only studies involving representative suicides from the general population could make it possible to achieve sufficient power to detect the effect of parity on rare events such as suicide.1,4,5 Even in the prospective study involving 989 949 women followed for 15 years, only 11 deaths from suicide occurred among women with six or more children.4In Taiwan, suicide is the eighth leading cause of death among men and the ninth among women. The age-adjusted rate of death from suicide was 19.7 per 100 000 among men and 9.7 among women in 2007.6 Suicide rates in Western countries have been generally lower than those in Asian countries.7 A consistent increase in the suicide rate since 1999 has been found in Taiwan.6 However, most Western countries have had stable or slightly decreasing rates during the 1990s.8,9 The male:female ratio of suicide is frequently greater than 3:1 in Western countries,7 whereas it is 2:1 in Taiwan.10 High suicide rates among Chinese women have been well documented.11 One explanation is that Chinese women do not benefit from marriage as much as their male counterparts.12 The sex difference in suicide rates is largely driven by a high rate of suicide among women in Chinese societies.11 In many Western countries, the trend over the past several years has been in the opposite direction: rates among women have been stable or decreasing, whereas rates among men have been increasing.12 Furthermore, in an epidemiologic study of suicides in Chinese communities, the prevalence of mental illness among people committing suicide was much lower in those communities than in Western societies.13Because the previous studies that related parity and suicide-related mortality were carried out in economically developed countries and because different cultural settings might influence suicide patterns,3 I undertook the present study in Taiwan, using a cohort of women who had a first and singleton live birth between Jan. 1, 1978, and Dec. 31, 1987, to explore further Durkheim’s hypothesis.  相似文献   

17.

Background:

Telehealthcare has the potential to provide care for long-term conditions that are increasingly prevalent, such as asthma. We conducted a systematic review of studies of telehealthcare interventions used for the treatment of asthma to determine whether such approaches to care are effective.

Methods:

We searched the Cochrane Airways Group Specialised Register of Trials, which is derived from systematic searches of bibliographic databases including CENTRAL (the Cochrane Central Register of Controlled Trials), MEDLINE, Embase, CINAHL (Cumulative Index to Nursing and Allied Health Literature) and PsycINFO, as well as other electronic resources. We also searched registers of ongoing and unpublished trials. We were interested in studies that measured the following outcomes: quality of life, number of visits to the emergency department and number of admissions to hospital. Two reviewers identified studies for inclusion in our meta-analysis. We extracted data and used fixed-effect modelling for the meta-analyses.

Results:

We identified 21 randomized controlled trials for inclusion in our analysis. The telehealthcare interventions investigated in these studies were telephone-based, video-based and Internet-based models of care. Meta-analysis did not show a clinically important improvement in patients’ quality of life, and there was no significant change in the number of visits to the emergency department over 12 months. There was a significant reduction in the number of patients admitted to hospital once or more over 12 months (risk ratio 0.25 [95% confidence interval 0.09 to 0.66]).
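The pooled admission estimate comes from fixed-effect modelling. As a rough illustration of how such a pooled risk ratio and confidence interval are derived, the sketch below applies standard inverse-variance pooling on the log risk-ratio scale to made-up study results; the numbers are placeholders, not the trials included in this review.

```python
import math

# Hypothetical study-level risk ratios with 95% CIs -- placeholders only,
# NOT the trials included in this review.
studies = [
    (0.20, 0.05, 0.80),
    (0.35, 0.10, 1.20),
    (0.22, 0.06, 0.85),
]

weighted_sum = 0.0
weight_total = 0.0
for rr, lo, hi in studies:
    log_rr = math.log(rr)
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)   # SE recovered from the CI width
    w = 1.0 / se ** 2                                 # inverse-variance weight
    weighted_sum += w * log_rr
    weight_total += w

pooled_log_rr = weighted_sum / weight_total
pooled_se = math.sqrt(1.0 / weight_total)
ci_lo = math.exp(pooled_log_rr - 1.96 * pooled_se)
ci_hi = math.exp(pooled_log_rr + 1.96 * pooled_se)
print(f"Pooled risk ratio {math.exp(pooled_log_rr):.2f} "
      f"(95% CI {ci_lo:.2f} to {ci_hi:.2f})")
```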

Interpretation:

We found no evidence of a clinically important impact on patients’ quality of life, but telehealthcare interventions do appear to have the potential to reduce the risk of admission to hospital, particularly for patients with severe asthma. Further research is required to clarify the cost-effectiveness of models of care based on telehealthcare.

There has been an increase in the prevalence of asthma in recent decades,13 and the Global Initiative for Asthma estimates that 300 million people worldwide now have the disease.4 The highest prevalence rates (30%) are seen in economically developed countries.58 There has also been an increase in the prevalence of asthma affecting both children and adults in many economically developing and transition countries.911

Asthma’s high burden of disease requires improvements in access to treatments.7,12,13 Patterns of help-seeking behaviour are also relevant: delayed reporting is associated with morbidity and the need for emergency care.

It is widely believed that telehealthcare interventions may help address some of the challenges posed by asthma by enabling remote delivery of care, facilitating timely access to health advice, supporting self-monitoring and medication concordance, and educating patients on avoiding triggers.1416 The precise role of these technologies in the management of care for people with long-term respiratory conditions needs to be established.17

The objective of this study was to systematically review the effectiveness of telehealthcare interventions among people with asthma in terms of quality of life, number of visits to the emergency department and admissions to hospital for exacerbations of asthma.

18.

Background:

A higher risk of preterm birth among black women than among white women is well established in the United States. We compared differences in preterm birth between non-Hispanic black and white women in Canada and the US, hypothesizing that disparities would be less extreme in Canada given the different historical experiences of black populations and Canada’s universal health care system.

Methods:

Using data on singleton live births in Canada and the US for 2004–2006, we estimated crude and adjusted risk ratios and risk differences in preterm birth (< 37 wk) and very preterm birth (< 32 wk) among non-Hispanic black versus non-Hispanic white women in each country. Adjusted models for the US were standardized to the covariate distribution of the Canadian cohort.

Results:

In Canada, 8.9% and 5.9% of infants born to black and white mothers, respectively, were preterm; the corresponding figures in the US were 12.7% and 8.0%. Crude risk ratios for preterm birth among black women relative to white women were 1.49 (95% confidence interval [CI] 1.32 to 1.66) in Canada and 1.57 (95% CI 1.56 to 1.58) in the US (p value for heterogeneity [pH] = 0.3). The crude risk differences for preterm birth were 2.94 percentage points (95% CI 1.91 to 3.96) in Canada and 4.63 percentage points (95% CI 4.56 to 4.70) in the US (pH = 0.003). Adjusted risk ratios for preterm birth (pH = 0.1) were slightly higher in Canada than in the US, whereas adjusted risk differences were similar in both countries. Similar patterns were observed for racial disparities in very preterm birth.
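To make the form of these estimates concrete, the sketch below computes a crude risk ratio and risk difference with Wald 95% confidence intervals from 2×2 counts. The cell counts are illustrative placeholders chosen to roughly match the Canadian proportions, since the abstract reports only percentages, not the underlying counts.

```python
import math

def risk_ratio_and_difference(events_exposed, n_exposed, events_unexposed, n_unexposed):
    """Crude risk ratio and risk difference with Wald 95% CIs from 2x2 counts."""
    p1 = events_exposed / n_exposed
    p0 = events_unexposed / n_unexposed

    # Risk ratio: confidence interval computed on the log scale
    rr = p1 / p0
    se_log_rr = math.sqrt((1 - p1) / events_exposed + (1 - p0) / events_unexposed)
    rr_ci = (math.exp(math.log(rr) - 1.96 * se_log_rr),
             math.exp(math.log(rr) + 1.96 * se_log_rr))

    # Risk difference (reported in percentage points): normal-approximation CI
    rd = p1 - p0
    se_rd = math.sqrt(p1 * (1 - p1) / n_exposed + p0 * (1 - p0) / n_unexposed)
    rd_ci = (rd - 1.96 * se_rd, rd + 1.96 * se_rd)
    return rr, rr_ci, 100 * rd, (100 * rd_ci[0], 100 * rd_ci[1])

# Illustrative counts only: roughly 8.9% preterm among 10 000 exposed births
# and 5.9% among 100 000 unexposed births (placeholders, not study data).
rr, rr_ci, rd_pp, rd_ci_pp = risk_ratio_and_difference(890, 10_000, 5_900, 100_000)
print(f"RR {rr:.2f} (95% CI {rr_ci[0]:.2f} to {rr_ci[1]:.2f}); "
      f"RD {rd_pp:.2f} percentage points "
      f"(95% CI {rd_ci_pp[0]:.2f} to {rd_ci_pp[1]:.2f})")
```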

Interpretation:

Relative disparities in preterm birth and very preterm birth between non-Hispanic black and white women were similar in magnitude in Canada and the US. Absolute disparities were smaller in Canada, which reflects a lower overall risk of preterm birth in Canada than in the US in both black and white populations.

In the United States, a higher risk of preterm birth among black women than among white women is well established.13 This racial disparity is of great concern because preterm birth is a leading cause of perinatal mortality and is predictive of developmental problems and adverse health outcomes later in life.4 The underlying causes of the racial disparity in preterm birth in the US are not well understood, although research has suggested contributing roles for a wide range of factors, including socioeconomic disadvantage,5 poor neighbourhood conditions (e.g., poverty, crime),5,6 lack of access to health care,7 psychosocial stress,8 racial discrimination9 and adverse health behaviours.10

Rates of preterm birth have consistently been lower in Canada than in the US.11,12 However, in contrast to the US, little is known about differences in rates by race or ethnicity in Canada. There is evidence that African-born and Caribbean-born women in the provinces of Quebec and Ontario have higher rates of preterm birth than Canadian-born women.1315 Although the magnitude of these differences is smaller than the disparity in preterm births between black and white women in the US,16 foreign-born black women in the US have been found to be at lower risk of preterm birth than US-born black women.17,18

In both Canada and the US, socioeconomic conditions at both individual and neighbourhood levels are important predictors of preterm birth.1921 Although the income gap between black and white people is markedly smaller in Canada than in the US,22 black populations in both countries have lower education levels, higher unemployment rates and a greater likelihood of living in low-quality neighbourhoods compared with white populations.23 Canada and the US share similar social and economic influences, yet the historical experiences of black populations and the social welfare systems (e.g., universal health care) are quite different in the 2 countries. Black people constitute about 13% of the total US population, but only about 3% of the Canadian population.24,25 The overwhelming majority of Canada’s black population are immigrants who entered the country after 1960 and their descendants, whereas more than 85% of black Americans can trace their ancestry 3 or more generations in the US, with most being descendants of slaves.22

The objectives of our study are twofold. First, using data from a new cohort linking birth registrations with information from the 2006 Canadian long-form census, we present Canada-wide estimates of differences in preterm birth rates between black and white populations. Second, we use comparable methodology to compare racial differences in preterm birth rates between Canada and the US. Given different historical experiences of black populations in the 2 countries, as well as Canada’s commitment to universal health care and its general perception as a more egalitarian society than the US,22 we hypothesized that we would observe smaller racial disparities in the rates in Canada than in the US.

19.
The erythropoietin receptor (EpoR) was discovered and described in red blood cells (RBCs), stimulating their proliferation and survival. The target in humans for EpoR agonist drugs appears clear: to treat anemia. However, there is evidence of the pleiotropic actions of erythropoietin (Epo). For that reason, rhEpo therapy was suggested as a reliable approach for treating a broad range of pathologies, including heart and cardiovascular diseases, neurodegenerative disorders (Parkinson’s and Alzheimer’s disease), spinal cord injury, stroke, diabetic retinopathy and rare diseases (Friedreich ataxia). Unfortunately, the side effects of rhEpo are also evident. A new generation of nonhematopoietic EpoR agonist drugs (asialoEpo, Cepo and ARA 290) has been investigated and further developed. These EpoR agonists, without the erythropoietic activity of Epo, while preserving its tissue-protective properties, will provide better outcomes in ongoing clinical trials. Nonhematopoietic EpoR agonists represent safer and more effective surrogates for the treatment of several diseases such as brain and peripheral nerve injury, diabetic complications, renal ischemia, rare diseases, myocardial infarction, chronic heart disease and others.

In principle, the erythropoietin receptor (EpoR) was discovered and described in red blood cell (RBC) progenitors, stimulating their proliferation and survival. Erythropoietin (Epo) is mainly synthesized in the fetal liver and adult kidneys (13). Therefore, it was hypothesized that Epo acts exclusively on erythroid progenitor cells. Accordingly, the target in humans for EpoR agonist drugs (such as recombinant erythropoietin [rhEpo], in general called erythropoiesis-stimulating agents) appears clear (that is, to treat anemia). However, evidence of a kaleidoscope of pleiotropic actions of Epo has been provided (4,5). Research on the Epo/EpoR axis involved an initial journey from basic laboratory research to clinical therapeutics. However, as a consequence of clinical observations, basic research on Epo/EpoR has come back to expand its clinical therapeutic applicability.

Although the kidney and liver have long been considered the major sources of synthesis, Epo mRNA expression has also been detected in the brain (neurons and glial cells), lung, heart, bone marrow, spleen, hair follicles, reproductive tract and osteoblasts (617). Accordingly, EpoR was detected in other cells, such as neurons, astrocytes, microglia, immune cells, cancer cell lines, endothelial cells, bone marrow stromal cells and cells of the heart, reproductive system, gastrointestinal tract, kidney, pancreas and skeletal muscle (1827). Conversely, Sinclair et al. (28) reported data questioning the presence or function of EpoR on nonhematopoietic cells (endothelial, neuronal and cardiac cells), suggesting that further studies are needed to confirm the diversity of EpoR. Elliott et al. (29) also showed that EpoR is virtually undetectable in human renal cells and other tissues, with no detectable EpoR on cell surfaces. These results have raised doubts about the preclinical basis for studies exploring the pleiotropic actions of rhEpo (30).

For the above-mentioned reasons, a return to basic research studies has become necessary, and many studies in animal models have been initiated or have already been performed.
The effect of rhEpo administration on angiogenesis, myogenesis, shift in muscle fiber types and oxidative enzyme activities in skeletal muscle (4,31), cardiac muscle mitochondrial biogenesis (32), cognitive effects (31), antiapoptotic and antiinflammatory actions (3337) and plasma glucose concentrations (38) has been extensively studied. Neuro- and cardioprotective properties have mainly been described. Accordingly, rhEpo therapy was suggested as a reliable approach for treating a broad range of pathologies, including heart and cardiovascular diseases, neurodegenerative disorders (Parkinson’s and Alzheimer’s disease), spinal cord injury, stroke, diabetic retinopathy and rare diseases (Friedreich ataxia).

Unfortunately, the side effects of rhEpo are also evident. Epo is involved in regulating tumor angiogenesis (39) and probably in the survival and growth of tumor cells (25,40,41). rhEpo administration also induces serious side effects such as hypertension, polycythemia, myocardial infarction, stroke and seizures, platelet activation and increased thromboembolic risk, and immunogenicity (4246), with the most common being hypertension (47,48). A new generation of nonhematopoietic EpoR agonist drugs has hence been investigated and further developed in animal models. These compounds, namely asialoerythropoietin (asialoEpo) and carbamylated Epo (Cepo), were developed to preserve the tissue-protective properties of native Epo while reducing its erythropoietic activity (49,50). These drugs will provide better outcomes in ongoing clinical trials. The advantage of using nonhematopoietic Epo analogs is that they avoid stimulating hematopoiesis and thereby prevent an increased hematocrit with a subsequent procoagulant state or increased blood pressure. In this regard, a new study by van Rijt et al. has shed new light on this topic (51). A new nonhematopoietic EpoR agonist analog named ARA 290 has been developed, with promising cytoprotective capacity to prevent renal ischemia/reperfusion injury (51). ARA 290 is a short peptide that has shown no safety concerns in preclinical and human studies. In addition, ARA 290 has proven efficacious in cardiac disorders (52,53), neuropathic pain (54) and sarcoidosis-induced chronic neuropathic pain (55). Thus, ARA 290 is a novel nonhematopoietic EpoR agonist with promising therapeutic options for treating a wide range of pathologies without increased risk of cardiovascular events.

Overall, this new generation of EpoR agonists without the erythropoietic activity of Epo, while preserving the tissue-protective properties of Epo, will provide better outcomes in ongoing clinical trials (49,50). Nonhematopoietic EpoR agonists represent safer and more effective surrogates for the treatment of several diseases, such as brain and peripheral nerve injury, diabetic complications, renal ischemia, rare diseases, myocardial infarction, chronic heart disease and others.

20.

Background:

Persistent postoperative pain continues to be an underrecognized complication. We examined the prevalence of and risk factors for this type of pain after cardiac surgery.

Methods:

We enrolled patients scheduled for coronary artery bypass grafting or valve replacement, or both, from Feb. 8, 2005, to Sept. 1, 2009. Validated measures were used to assess (a) preoperative anxiety and depression, tendency to catastrophize in the face of pain, health-related quality of life and presence of persistent pain; (b) pain intensity and interference in the first postoperative week; and (c) presence and intensity of persistent postoperative pain at 3, 6, 12 and 24 months after surgery. The primary outcome was the presence of persistent postoperative pain during 24 months of follow-up.

Results:

A total of 1247 patients completed the preoperative assessment. Follow-up retention rates at 3 and 24 months were 84% and 78%, respectively. The prevalence of persistent postoperative pain decreased significantly over time, from 40.1% at 3 months to 22.1% at 6 months, 16.5% at 12 months and 9.5% at 24 months; the pain was rated as moderate to severe in 3.6% of patients at 24 months. Acute postoperative pain predicted both the presence and severity of persistent postoperative pain. The more intense the pain during the first week after surgery and the more it interfered with functioning, the more likely the patients were to report persistent postoperative pain. Pre-existing persistent pain and increased preoperative anxiety also predicted the presence of persistent postoperative pain.

Interpretation:

Persistent postoperative pain of nonanginal origin after cardiac surgery affected a substantial proportion of the study population. Future research is needed to determine whether interventions to modify certain risk factors, such as preoperative anxiety and the severity of pain before and immediately after surgery, may help to minimize or prevent persistent postoperative pain.

Postoperative pain that persists beyond the normal time for tissue healing (> 3 mo) is increasingly recognized as an important complication after various types of surgery and can have serious consequences on patients’ daily living.13 Cardiac surgeries, such as coronary artery bypass grafting (CABG) and valve replacement, rank among the most frequently performed interventions worldwide.4 They aim to improve survival and quality of life by reducing symptoms, including anginal pain. However, persistent postoperative pain of nonanginal origin has been reported in 7% to 60% of patients following these surgeries.523 Such variability is common in other types of major surgery and is due mainly to differences in the definition of persistent postoperative pain, study design, data collection methods and duration of follow-up.13,24

Few prospective cohort studies have examined the exact time course of persistent postoperative pain after cardiac surgery, and follow-up has always been limited to a year or less.9,14,25 Factors that put patients at risk of this type of problem are poorly understood.26 Studies have reported inconsistent results regarding the contribution of age, sex, body mass index, preoperative angina, surgical technique, grafting site, postoperative complications or level of opioid consumption after surgery.57,9,13,14,1619,2123,25,27 Only 1 study investigated the role of chronic nonanginal pain before surgery as a contributing factor;21 5 others prospectively assessed the association between persistent postoperative pain and acute pain intensity in the first postoperative week but reported conflicting results.13,14,21,22,25 All of the above studies were carried out in a single hospital and included relatively small samples. None of the studies examined the contribution of psychological factors such as levels of anxiety and depression before cardiac surgery, although these factors have been shown to influence acute or persistent postoperative pain in other types of surgery.1,24,28,29

We conducted a prospective multicentre cohort study (the CARD-PAIN study) to determine the prevalence of persistent postoperative pain of nonanginal origin up to 24 months after cardiac surgery and to identify risk factors for the presence and severity of the condition.
