Similar articles
2.

Background:

Screening for methicillin-resistant Staphylococcus aureus (MRSA) is intended to reduce nosocomial spread by identifying patients colonized by MRSA. Given the widespread use of this screening, we evaluated its potential clinical utility in predicting the resistance of clinical isolates of S. aureus.

Methods:

We conducted a 2-year retrospective cohort study that included patients with documented clinical infection with S. aureus and prior screening for MRSA. We determined test characteristics, including sensitivity and specificity, of screening for predicting the resistance of subsequent S. aureus isolates.

Results:

Of 510 patients included in the study, 53 (10%) had positive results from MRSA screening, and 79 (15%) of infecting isolates were resistant to methicillin. Screening for MRSA predicted methicillin resistance of the infecting isolate with 99% (95% confidence interval [CI] 98%–100%) specificity and 63% (95% CI 52%–74%) sensitivity. When screening swabs were obtained within 48 hours before isolate collection, sensitivity increased to 91% (95% CI 71%–99%) and specificity was 100% (95% CI 97%–100%), yielding a negative likelihood ratio of 0.09 (95% CI 0.01–0.3) and a negative predictive value of 98% (95% CI 95%–100%). The time between swab and isolate collection was a significant predictor of concordance of methicillin resistance in swabs and isolates (odds ratio 6.6, 95% CI 1.6–28.2).

Interpretation:

A positive result from MRSA screening predicted methicillin resistance in a culture-positive clinical infection with S. aureus. Negative results on MRSA screening were most useful for excluding methicillin resistance of a subsequent infection with S. aureus when the screening swab was obtained within 48 hours before collection of the clinical isolate.

Antimicrobial resistance is a global problem. The prevalence of resistant bacteria, including methicillin-resistant Staphylococcus aureus (MRSA), has reached high levels in many countries.1–3 Methicillin resistance in S. aureus is associated with excess mortality, hospital stays and health care costs,3,4 possibly owing to increased virulence or less effective treatments for MRSA compared with methicillin-sensitive S. aureus (MSSA).5

The initial selection of appropriate empirical antibiotic treatment affects mortality, morbidity and potential health care expenditures.6–8 The optimal choice of antibiotics in S. aureus infections is important for 3 major reasons: β-lactam antibiotics have shown improved efficacy over vancomycin and are the ideal treatment for susceptible strains of S. aureus;6 β-lactam antibiotics are ineffective against MRSA, and so vancomycin or other newer agents must be used empirically when MRSA is suspected; and unnecessary use of broad-spectrum antibiotics (e.g., vancomycin) can lead to the development of further antimicrobial resistance.9 It is therefore necessary to make informed decisions regarding selection of empirical antibiotics.10–13 Consideration of a patient’s previous colonization status is important, because colonization predates most hospital- and community-acquired infections.10,14

Universal or targeted surveillance for MRSA has been implemented widely as a means of limiting transmission of this antibiotic-resistant pathogen.15,16 Although results of MRSA screening are not intended to guide empirical treatment, they may offer an additional benefit among patients in whom clinical infection with S. aureus develops.

Studies that examined the effects of MRSA carriage on the subsequent likelihood of infection allude to the potential diagnostic benefit of prior screening for MRSA.17,18 Colonization by MRSA at the time of hospital admission is associated with a 13-fold increased risk of subsequent MRSA infection.17,18 Moreover, studies that examined nasal carriage of S. aureus after documented S. aureus bacteremia have shown remarkable concordance between the genotypes of paired colonizing and invasive strains (82%–94%).19,20 The purpose of our study was to identify the usefulness of prior screening for MRSA for predicting methicillin resistance in culture-positive S. aureus infections.

3.

Background:

Falls cause more than 60% of head injuries in older adults. Lack of objective evidence on the circumstances of these events is a barrier to prevention. We analyzed video footage to determine the frequency of and risk factors for head impact during falls in older adults in 2 long-term care facilities.

Methods:

Over 39 months, we captured on video 227 falls involving 133 residents. We used a validated questionnaire to analyze the mechanisms of each fall. We then examined whether the probability for head impact was associated with upper-limb protective responses (hand impact) and fall direction.

Results:

Head impact occurred in 37% of falls, usually onto a vinyl or linoleum floor. Hand impact occurred in 74% of falls but had no significant effect on the probability of head impact (p = 0.3). An increased probability of head impact was associated with a forward initial fall direction, compared with backward falls (odds ratio [OR] 2.7, 95% confidence interval [CI] 1.3–5.9) or sideways falls (OR 2.8, 95% CI 1.2–6.3). In 36% of sideways falls, residents rotated to land backwards, which reduced the probability of head impact (OR 0.2, 95% CI 0.04–0.8).
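The odds ratios above compare the odds of head impact across fall directions. A sketch of how such a point estimate and its Wald confidence interval are computed; the cell counts are hypothetical, not taken from the study:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table [[a, b], [c, d]] with a Wald 95% CI
    computed on the log-odds scale."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: head impact / no impact in forward vs. backward falls.
or_, lo, hi = odds_ratio_ci(a=40, b=50, c=15, d=50)
print(round(or_, 2))  # 2.67
```

A CI excluding 1 (as for the forward-fall ORs reported above) indicates a statistically significant association.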

Interpretation:

Head impact was common in observed falls in older adults living in long-term care facilities, particularly in forward falls. Backward rotation during descent appeared to be protective, but hand impact was not. Attention to upper-limb strength and teaching rotational falling techniques (as in martial arts training) may reduce fall-related head injuries in older adults.

Falls from standing height or lower are the cause of more than 60% of hospital admissions for traumatic brain injury in adults older than 65 years.1–5 Traumatic brain injury accounts for 32% of hospital admissions and more than 50% of deaths from falls in older adults.1,6–8 Furthermore, the incidence and age-adjusted rate of fall-related traumatic brain injury is increasing,1,9 especially among people older than 80 years, among whom rates have increased threefold over the past 30 years.10 One-quarter of fall-related traumatic brain injuries in older adults occur in long-term care facilities.1

The development of improved strategies to prevent fall-related traumatic brain injuries is an important but challenging task. About 60% of residents in long-term care facilities fall at least once per year,11 and falls result from complex interactions of physiologic, environmental and situational factors.12–16 Any fall from standing height has sufficient energy to cause brain injury if direct impact occurs between the head and a rigid floor surface.17–19 Improved understanding is needed of the factors that separate falls that result in head impact and injury from those that do not.1,10 Falls in young adults rarely result in head impact, owing to protective responses such as use of the upper limbs to stop the fall, trunk flexion and rotation during descent.20–23 We have limited evidence of the efficacy of protective responses to falls among older adults.

In the current study, we analyzed video footage of real-life falls among older adults to estimate the prevalence of head impact from falls, and to examine the association between head impact and biomechanical and situational factors.

4.

Background:

Anemia is an important public health and clinical problem. Observational studies have linked iron deficiency and anemia in children with many poor outcomes, including impaired cognitive development; however, iron supplementation, a widely used preventive and therapeutic strategy, is associated with adverse effects. Primary-school–aged children are at a critical stage in intellectual development, and optimization of their cognitive performance could have long-lasting individual and population benefits. In this study, we summarize the evidence for the benefits and safety of daily iron supplementation in primary-school–aged children.

Methods:

We searched electronic databases (including MEDLINE and Embase) and other sources (July 2013) for randomized and quasi-randomized controlled trials involving daily iron supplementation in children aged 5–12 years. We combined the data using random effects meta-analysis.

Results:

Our search identified 16 501 records; of these, we evaluated 76 full-text papers and included 32 studies involving 7089 children. Of the included studies, 31 were conducted in low- or middle-income settings. Iron supplementation improved global cognitive scores (standardized mean difference 0.50, 95% confidence interval [CI] 0.11 to 0.90, p = 0.01), intelligence quotient among anemic children (mean difference 4.55, 95% CI 0.16 to 8.94, p = 0.04) and measures of attention and concentration. Iron supplementation also improved age-adjusted height among all children and age-adjusted weight among anemic children. Iron supplementation reduced the risk of anemia by 50% and the risk of iron deficiency by 79%. Adherence in the trial settings was generally high. Safety data were limited.
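Random-effects meta-analysis of the kind used here pools study-level effect sizes with weights that account for between-study heterogeneity. A sketch of the DerSimonian–Laird estimator, one common random-effects method (the summary above does not state which estimator was used, and the inputs below are hypothetical):

```python
import math

def dersimonian_laird(effects, variances):
    """Pool per-study effect sizes (e.g., standardized mean differences)
    using DerSimonian-Laird random-effects weights."""
    w = [1 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    # Cochran's Q and the method-of-moments between-study variance tau^2:
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)
    # Re-weight each study by the total (within + between) variance:
    w_star = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se

# Hypothetical SMDs and variances from three trials:
smd, lo, hi = dersimonian_laird([0.2, 0.5, 0.9], [0.04, 0.09, 0.06])
```

When heterogeneity is high (large tau²), the random-effects CI is wider than a fixed-effect CI, reflecting genuine between-study variation.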

Interpretation:

Our analysis suggests that iron supplementation safely improves hematologic and nonhematologic outcomes among primary-school–aged children in low- or middle-income settings and is well tolerated.

An estimated 25% of school-aged children worldwide are anemic.1 Iron deficiency is thought to account for about half of the global cases of anemia2 and is associated with inadequate dietary iron and, in developing settings, hookworm and schistosomiasis.3 In developed settings, anemia is prevalent among disadvantaged populations, including newly arrived refugees, indigenous people4 and some ethnic groups (e.g., Hispanic people in the United States).5,6 About 3% of primary-school–aged children in Canada are anemic.7 Programs to address anemia are constrained by concerns that iron supplements cause adverse effects, including an increased risk of infections such as malaria in endemic areas.8

In observational studies, iron deficiency has been associated with impaired cognitive and physical development. It has been estimated that each 10 g/L decrement in hemoglobin reduces future intelligence quotient (IQ) by 1.73 points.9 However, observational data are susceptible to confounding,10 and a causal relation between iron deficiency and cognitive impairment has not been confirmed.11 Randomized controlled trials should overcome confounding, but results of trials examining this question have not agreed.

Optimizing cognitive and physical development in primary-school–aged children could have life-long benefits.12 However, anemia-control recommendations must balance safety and efficacy. We performed a systematic review of the effects of daily iron supplementation, a commonly used strategy to combat anemia,2 in primary-school–aged children. We examined cognitive, growth and hematologic outcomes and adverse effects across all settings.

5.

Background:

Chronic kidney disease is an important risk factor for death and cardiovascular-related morbidity, but estimates to date of its prevalence in Canada have generally been extrapolated from the prevalence of end-stage renal disease. We used direct measures of kidney function collected from a nationally representative survey population to estimate the prevalence of chronic kidney disease among Canadian adults.

Methods:

We examined data for 3689 adult participants of cycle 1 of the Canadian Health Measures Survey (2007–2009) for the presence of chronic kidney disease. We also calculated the age-standardized prevalence of cardiovascular risk factors by chronic kidney disease group. We cross-tabulated the estimated glomerular filtration rate (eGFR) with albuminuria status.

Results:

The prevalence of chronic kidney disease during the period 2007–2009 was 12.5%, representing about 3 million Canadian adults. The estimated prevalence of stage 3–5 disease was 3.1% (0.73 million adults) and albuminuria 10.3% (2.4 million adults). The prevalence of diabetes, hypertension and hypertriglyceridemia were all significantly higher among adults with chronic kidney disease than among those without it. The prevalence of albuminuria was high, even among those whose eGFR was 90 mL/min per 1.73 m2 or greater (10.1%) and those without diabetes or hypertension (9.3%). Awareness of kidney dysfunction among adults with stage 3–5 chronic kidney disease was low (12.0%).

Interpretation:

The prevalence of kidney dysfunction was substantial in the survey population, including individuals without hypertension or diabetes, conditions most likely to prompt screening for kidney dysfunction. These findings highlight the potential for missed opportunities for early intervention and secondary prevention of chronic kidney disease.

Chronic kidney disease is defined as the presence of kidney damage or reduced kidney function for more than 3 months and requires either a measured or estimated glomerular filtration rate (eGFR) of less than 60 mL/min per 1.73 m2, or the presence of abnormalities in urine sediment, renal imaging or biopsy results.1 Between 1.3 million and 2.9 million Canadians are estimated to have chronic kidney disease, based on an extrapolation of the prevalence of end-stage renal disease.2 In the United States, the 1999–2004 National Health and Nutrition Examination Survey reported a prevalence of 5.0% for stage 1 and 2 disease and 8.1% for stage 3 and 4 disease.3,4

Chronic kidney disease has been identified as a risk factor for death and cardiovascular-related morbidity and is a substantial burden on the health care system.1,5 Hemodialysis costs the Canadian health care system about $60 000 per patient per year of treatment.1 The increasing prevalence of chronic kidney disease can be attributed in part to the growing elderly population and to increasing rates of diabetes and hypertension.1,6,7

Albuminuria, which can result from abnormal vascular permeability, atherosclerosis or renal disease, has gained recognition as an independent risk factor for progressive renal dysfunction and adverse cardiovascular outcomes.8–10 In earlier stages of chronic kidney disease, albuminuria has been shown to be more predictive of renal and cardiovascular events than eGFR.4,9 This has prompted the call for a new risk stratification for cardiovascular outcomes based on both eGFR and albuminuria.11

A recent review advocated screening people for chronic kidney disease if they have hypertension, diabetes, clinically evident cardiovascular disease or a family history of kidney failure, or are more than 60 years old.4 The Canadian Society of Nephrology published guidelines on the management of chronic kidney disease but did not offer guidance on screening.1 The Canadian Diabetes Association recommends annual screening with the use of an albumin:creatinine ratio,12 and the Canadian Hypertension Education Program guideline recommends urinalysis as part of the initial assessment of hypertension.13 Screening for chronic kidney disease on the basis of eGFR and albuminuria is not considered to be cost-effective in the general population, among older people or among people with hypertension.14

The objective of our study was to use direct measures (biomarkers) of kidney function to generate nationally representative, population-based prevalence estimates of chronic kidney disease among Canadian adults overall and in clinically relevant groups.
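In surveys like this one, eGFR is calculated from serum creatinine with a prediction equation. The CKD-EPI 2009 creatinine equation is one widely used choice, shown here as an illustration (the summary above does not state which equation the survey used; the race coefficient is omitted in this sketch):

```python
def egfr_ckd_epi(scr_mg_dl, age, female):
    """CKD-EPI 2009 creatinine equation (race coefficient omitted).
    scr_mg_dl: serum creatinine in mg/dL; returns mL/min per 1.73 m^2."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    ratio = scr_mg_dl / kappa
    egfr = (141
            * min(ratio, 1) ** alpha      # applies below the sex-specific cutoff
            * max(ratio, 1) ** -1.209     # applies above the cutoff
            * 0.993 ** age)               # age decline factor
    if female:
        egfr *= 1.018
    return egfr

# A 40-year-old man with normal creatinine vs. an 80-year-old woman
# with markedly elevated creatinine (hypothetical values):
print(egfr_ckd_epi(0.9, 40, female=False) > 90)   # normal kidney function
print(egfr_ckd_epi(2.5, 80, female=True) < 60)    # stage 3-5 range
```

An eGFR below 60 mL/min per 1.73 m2 sustained for more than 3 months corresponds to stage 3–5 disease as defined above.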

6.

Background:

Hospital mortality has decreased over time for critically ill patients with various forms of brain injury. We hypothesized that the proportion of patients who progress to neurologic death may have also decreased.

Methods:

We performed a prospective cohort study involving consecutive adult patients with traumatic brain injury, subarachnoid hemorrhage, intracerebral hemorrhage or anoxic brain injury admitted to regional intensive care units in southern Alberta over a 10.5-year period. We used multivariable logistic regression to adjust for patient age and score on the Glasgow Coma Scale at admission, and to assess whether the proportion of patients who progress to neurologic death has changed over time.

Results:

The cohort consisted of 2788 patients. The proportion of patients who progressed to neurologic death was 8.1% at the start of the study period, and the adjusted odds of progressing to neurologic death decreased over the study period (odds ratio [OR] per yr 0.92, 95% confidence interval [CI] 0.87–0.98, p = 0.006). This change was most pronounced among patients with traumatic brain injury (OR per yr 0.87, 95% CI 0.78–0.96, p = 0.005); there was no change among patients with anoxic injury (OR per yr 0.96, 95% CI 0.85–1.09, p = 0.6). A review of the medical records suggests that missed cases of neurologic death were rare (≤ 0.5% of deaths).
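An odds ratio expressed per year compounds multiplicatively, so the fitted value of 0.92 per year implies a substantially larger cumulative change over the full 10.5-year study period:

```python
# OR per year from the logistic regression, compounded over the study period.
or_per_year = 0.92
study_years = 10.5
cumulative_or = or_per_year ** study_years
print(round(cumulative_or, 2))  # 0.42, i.e. roughly a 58% reduction in odds
```

This compounding is an interpretation aid, not an additional result reported by the study.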

Interpretation:

The proportion of patients with brain injury who progress to neurologic death has decreased over time, especially among those with head trauma. This finding may reflect positive developments in the prevention and care of brain injury. However, organ donation after neurologic death represents the major source of organs for transplantation. Thus, these findings may help explain the relatively stagnant rates of deceased organ donation in some regions of Canada, which in turn has important implications for the care of patients with end-stage organ failure.

Mortality has decreased among critically ill patients with various forms of brain injury in Canada and around the world.1–10 There have also been changes in the incidence of stroke and the rate of admission to hospital for traumatic brain injury, especially among younger people and those whose injuries are related to motor vehicle or bicycle crashes.5,6,10–13

Some countries have noted a possible decline in the total number of patients with neurologic death.14,15 Neurologic death (“brain death”) may occur when patients with brain injury experience progressive cerebral edema, complicated by transtentorial herniation. It is defined by the irreversible cessation of cerebral and brainstem functions, including respiration.16 Circulation and gas exchange persist only because of the use of mechanical ventilation. National guidelines exist for the diagnosis of neurologic death.17,18 We hypothesized that the proportion of patients with acute brain injury who progress to neurologic death may have decreased over time.

7.

Background:

Understanding the health care experience of people with dementia and their caregivers is becoming increasingly important given the growing number of affected individuals. We conducted a systematic review of qualitative studies that examined aspects of the health care experience of people with dementia and their caregivers to better understand ways to improve care for this population.

Methods:

We searched the electronic databases MEDLINE, Embase, PsycINFO and CINAHL to identify relevant articles. We extracted key study characteristics and methods from the included studies. We also extracted direct quotes from the primary studies, along with the interpretations provided by authors of the studies. We used meta-ethnography to synthesize the extracted information into an overall framework. We evaluated the quality of the primary studies using the Consolidated Criteria for Reporting Qualitative Research (COREQ) checklist.

Results:

In total, 46 studies met our inclusion criteria; these involved 1866 people with dementia and their caregivers. We identified 5 major themes: seeking a diagnosis; accessing supports and services; addressing information needs; disease management; and communication and attitudes of health care providers. We conceptualized the health care experience as progressing through phases of seeking understanding and information, identifying the problem, role transitions following diagnosis and living with change.

Interpretation:

The health care experience of people with dementia and their caregivers is a complex and dynamic process, which could be improved for many people. Understanding these experiences provides insight into potential gaps in existing health services. Modifying existing services or implementing new models of care to address these gaps may lead to improved outcomes for people with dementia and their caregivers.

The global prevalence of Alzheimer disease and related dementias is estimated to be 36 million people and is expected to double in the next 20 years.1 Several recent strategies for providing care to patients with dementia have highlighted the importance of coordinated health care services for this growing population.2–5 Gaps in the quality of care for people with dementia have been identified,6–8 and improving their quality of care and health care experience has been identified as a priority area.2–5

Incorporating the health care experience of patients and caregivers in health service planning is important to ensure that their needs are met and that person-centred care is provided.9 The health care experience of people with dementia and their caregivers provides valuable information about preferences for services and service delivery.10 Matching available services to patient treatment preferences leads to improved patient outcomes11,12 and satisfaction without increasing costs.13 Qualitative research is ideally suited to exploring the experiences and perspectives of patients and caregivers and has been used to examine these experiences for other conditions.14 We performed a systematic review and meta-ethnographic synthesis of qualitative studies exploring the health care experience of people with dementia and their caregivers in primary care settings, and we propose a conceptual framework for understanding and improving these health care experiences.

8.

Background:

Some children feel pain during wound closures using tissue adhesives. We sought to determine whether a topically applied analgesic solution of lidocaine–epinephrine–tetracaine would decrease pain during tissue adhesive repair.

Methods:

We conducted a randomized, placebo-controlled, blinded trial involving 221 children between the ages of 3 months and 17 years. Patients were enrolled between March 2011 and January 2012 when presenting to a tertiary-care pediatric emergency department with lacerations requiring closure with tissue adhesive. Patients received either lidocaine–epinephrine–tetracaine or placebo before undergoing wound closure. Our primary outcome was the pain rating of adhesive application according to the colour Visual Analogue Scale and the Faces Pain Scale — Revised. Our secondary outcomes were physician ratings of difficulty of wound closure and wound hemostasis, in addition to their prediction as to which treatment the patient had received.

Results:

Children who received the analgesic before wound closure reported less pain (median 0.5, interquartile range [IQR] 0.25–1.50) than those who received placebo (median 1.00, IQR 0.38–2.50) as rated using the colour Visual Analogue Scale (p = 0.01) and Faces Pain Scale – Revised (median 0.00, IQR 0.00–2.00, for analgesic v. median 2.00, IQR 0.00–4.00, for placebo, p < 0.01). Patients who received the analgesic were significantly more likely to report having or to appear to have a pain-free procedure (relative risk [RR] of pain 0.54, 95% confidence interval [CI] 0.37–0.80). Complete hemostasis of the wound was also more common among patients who received lidocaine–epinephrine–tetracaine than among those who received placebo (78.2% v. 59.3%, p = 0.008).
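The relative risk of a painful procedure (RR 0.54) is a ratio of event proportions between the two arms. A sketch of the computation with hypothetical arm sizes and event counts chosen only to give a similar point estimate:

```python
import math

def relative_risk_ci(events_tx, n_tx, events_ctl, n_ctl, z=1.96):
    """Relative risk with a Wald 95% CI computed on the log scale."""
    rr = (events_tx / n_tx) / (events_ctl / n_ctl)
    # Standard error of log(RR): sqrt(1/a - 1/n1 + 1/c - 1/n2)
    se_log = math.sqrt(1/events_tx - 1/n_tx + 1/events_ctl - 1/n_ctl)
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, lo, hi

# Hypothetical: 30/110 painful procedures with analgesic vs. 55/109 with placebo.
rr, lo, hi = relative_risk_ci(30, 110, 55, 109)
print(round(rr, 2))  # 0.54
```

An upper CI bound below 1 (as in the reported 0.37–0.80 interval) indicates a significant reduction in the risk of pain.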

Interpretation:

Treating minor lacerations with lidocaine–epinephrine–tetracaine before wound closure with tissue adhesive reduced ratings of pain and increased the proportion of pain-free repairs among children aged 3 months to 17 years. This low-risk intervention may benefit children with lacerations requiring tissue adhesives instead of sutures. Trial registration: ClinicalTrials.gov, no. PR 6138378804.

Minor laceration repair with tissue adhesive, or “skin glue,” is common in pediatrics. Although less painful than cutaneous sutures,1 tissue adhesives polymerize through an exothermic reaction that may cause a burning, painful sensation. Pain is dependent on the specific formulation of the adhesive used and the method of application. One study of different tissue adhesives reported 23.8%–40.5% of participants feeling a “burning sensation”,2 whereas another study reported “pain” in 17.6%–44.1% of children.3 The amount of adhesive applied, method of application and individual patient characteristics can also influence the feeling of pain.3,4 Because tissue adhesives polymerize on contact with moisture,4,5 poor wound hemostasis has the potential to cause premature setting of the adhesive, leading to less efficient and more painful repairs.6

Preventing procedural pain is a high priority in pediatric care.7 Inadequate analgesia for pediatric procedures may result in more complicated procedures, increased pain sensitivity with future procedures8 and increased fear and anxiety of medical experiences persisting into adulthood.9 A practical method to prevent pain during laceration repairs with tissue adhesive would have a substantial benefit for children.

A topically applied analgesic solution containing lidocaine–epinephrine–tetracaine with vasoconstrictive properties provides safe and effective pain control during wound repair using sutures.10 A survey of pediatric emergency fellowship directors in the United States reported that 76% of respondents use this solution or a similar solution when suturing 3-cm chin lacerations in toddlers.11 However, in a hospital chart review, this solution was used in less than half of tissue adhesive repairs, with the remainder receiving either local injection of anesthetic or no pain control.12 Reluctance to use lidocaine–epinephrine–tetracaine with tissue adhesive may be due to the perception that it is not worth the minimum 20-minute wait required for the analgesic to take effect13 or to a lack of awareness that tissue adhesives can cause pain.

We sought to investigate whether preapplying lidocaine–epinephrine–tetracaine would decrease pain in children during minor laceration repair using tissue adhesive.

9.

Background:

Hemorrhage coupled with coagulopathy remains the leading cause of preventable in-hospital deaths among trauma patients. Use of a transfusion protocol with a predefined ratio of 1:1:1 (1 each of red blood cells [RBC], frozen plasma [FP] and platelets) has been associated with improved survival in retrospective studies in military and civilian settings, but such a protocol has its challenges and may increase the risk of respiratory complications. We conducted a randomized controlled trial to assess the feasibility of a 1:1:1 transfusion protocol and its effect on mortality and complications among patients with severe trauma.

Methods:

We included 78 patients seen in a tertiary trauma centre between July 2009 and October 2011 who had hypotension and bleeding and were expected to need massive transfusion (≥ 10 RBC units in 24 h). We randomly assigned them to either the fixed-ratio (1:1:1) transfusion protocol (n = 40) or to a laboratory-results–guided transfusion protocol (control; n = 38). The primary outcome, feasibility, was assessed in terms of blood product ratios and plasma wastage. Safety was measured based on 28-day mortality and survival free of acute respiratory distress syndrome.

Results:

Overall, a transfusion ratio of 1:1:1 was achieved in 57% (21/37) of patients in the fixed-ratio group, as compared with 6% (2/32) in the control group. A ratio of 1:1 (RBC:FP) was achieved in 73% (27/37) in the fixed-ratio group and 22% (7/32) in the control group. Plasma wastage was higher with the intervention protocol (22% [86/390] of FP units v. 10% [30/289] in the control group). The 28-day mortality and number of days free of acute respiratory distress syndrome were statistically similar between the groups.

Interpretation:

The fixed-ratio transfusion protocol was feasible in our study, but it was associated with increased plasma wastage. Larger randomized trials are needed to evaluate the efficacy of such a protocol in trauma care. Trial registration: ClinicalTrials.gov, no. NCT00945542.

A fixed-ratio (1:1:1) transfusion strategy is a resuscitation strategy for trauma patients that promotes the transfusion of red blood cells (RBC), plasma and platelets (PLT) at a 1:1:1 ratio while minimizing crystalloid infusion.1 This balanced transfusion strategy aims to correct both the early coagulopathy of trauma and the volume status of patients in hemorrhagic shock, thus targeting preventable hemorrhage-related deaths.2,3 Retrospective studies of the 1:1:1 transfusion protocol reported marked reductions in mortality based on retrospectively calculated ratios of plasma:PLT:RBC.4–6 Methodologic limitations, particularly survivorship bias (where higher mortality was associated with low ratios of plasma and PLT to RBC in unsalvageable patients who died before 1:1:1 transfusion could be achieved), preclude any definitive conclusion on the potential benefit of a 1:1:1 transfusion strategy in terms of efficacy and safety.7–10

The 1:1:1 transfusion strategy has been widely adopted by trauma centres worldwide11,12 and is being increasingly used in prehospital care and in the care of patients without traumatic injuries.13–15 Widespread adoption of the strategy has significant resource and safety implications. Its full implementation requires access to thawed type AB plasma, which is chronically in short supply.16 In addition, because of the difficulty in predicting the need for massive transfusion (commonly defined as ≥ 10 RBC units in 24 h), the 1:1:1 transfusion protocol may lead to unnecessary exposure to blood components and an increased risk of acute respiratory distress syndrome, sepsis and multiple organ dysfunction.17

We conducted a pilot randomized controlled trial comparing a 1:1:1 transfusion strategy with the standard of care at our institution (laboratory-results–guided transfusion, in which laboratory results are available for transfusion decisions throughout resuscitation) in trauma patients predicted to need massive transfusion. Our primary objective was to assess the feasibility and safety of the fixed-ratio protocol in patients with severe trauma.

10.

Background:

Although warfarin has been extensively studied in clinical trials, little is known about rates of hemorrhage attributable to its use in routine clinical practice. Our objective was to examine incident hemorrhagic events in a large population-based cohort of patients with atrial fibrillation who were starting treatment with warfarin.

Methods:

We conducted a population-based cohort study involving residents of Ontario (age ≥ 66 yr) with atrial fibrillation who started taking warfarin between Apr. 1, 1997, and Mar. 31, 2008. We defined a major hemorrhage as any visit to hospital for hemorrhage. We determined crude rates of hemorrhage during warfarin treatment, overall and stratified by CHADS2 score (congestive heart failure, hypertension, age ≥ 75 yr, diabetes mellitus and prior stroke, transient ischemic attack or thromboembolism).

Results:

We included 125 195 patients with atrial fibrillation who started treatment with warfarin during the study period. Overall, the rate of hemorrhage was 3.8% (95% confidence interval [CI] 3.8%–3.9%) per person-year. The risk of major hemorrhage was highest during the first 30 days of treatment. During this period, rates of hemorrhage were 11.8% (95% CI 11.1%–12.5%) per person-year in all patients and 16.7% (95% CI 14.3%–19.4%) per person-year among patients with a CHADS2 score of 4 or greater. Over the 5-year follow-up, 10 840 patients (8.7%) visited the hospital for hemorrhage; of these patients, 1963 (18.1%) died in hospital or within 7 days of being discharged.
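Rates expressed per person-year divide event counts by total follow-up time rather than by the number of patients. A sketch with hypothetical numbers (the cohort's exact person-time is not reported in this summary):

```python
import math

def rate_per_person_year(events, person_years, z=1.96):
    """Crude incidence rate with a normal-approximation 95% CI,
    treating the event count as Poisson-distributed."""
    rate = events / person_years
    se = math.sqrt(events) / person_years
    return rate, rate - z * se, rate + z * se

# Hypothetical: 380 hemorrhages over 10 000 person-years of warfarin exposure.
rate, lo, hi = rate_per_person_year(380, 10_000)
print(round(rate, 3))  # 0.038, i.e. 3.8% per person-year
```

Because early follow-up is riskier (the first 30 days), the crude overall rate understates the hazard a patient faces immediately after starting warfarin.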

Interpretation:

In this large cohort of older patients with atrial fibrillation, we found that rates of hemorrhage are highest within the first 30 days of warfarin therapy. These rates are considerably higher than the rates of 1%–3% reported in randomized controlled trials of warfarin therapy. Our study provides timely estimates of warfarin-related adverse events that may be useful to clinicians, patients and policy-makers as new options for treatment become available.

Atrial fibrillation is a major risk factor for stroke and systemic embolism, and strong evidence supports the use of the anticoagulant warfarin to reduce this risk.13 However, warfarin has a narrow therapeutic range and requires regular monitoring of the international normalized ratio to optimize its effectiveness and minimize the risk of hemorrhage.4,5 Although rates of major hemorrhage reported in trials of warfarin therapy typically range between 1% and 3% per person-year,611 observational studies suggest that rates may be considerably higher when warfarin is prescribed outside of a clinical trial setting,1215 approaching 7% per person-year in some studies.1315 The different safety profiles derived from clinical trials and observational data may reflect the careful selection of patients, precise definitions of bleeding and close monitoring in the trial setting.
Furthermore, although a few observational studies suggest that hemorrhage rates are higher than generally appreciated, these studies involve small numbers of patients who received care in specialized settings.1416 Consequently, the generalizability of their results to general practice may be limited. More information regarding hemorrhage rates during warfarin therapy is particularly important in light of the recent introduction of new oral anticoagulant agents such as dabigatran, rivaroxaban and apixaban, which may be associated with different outcome profiles.1719 There are currently no large studies offering real-world, population-based estimates of hemorrhage rates among patients taking warfarin, which are needed for future comparisons with new anticoagulant agents once they are widely used in routine clinical practice.20 We sought to describe the risk of incident hemorrhage in a large population-based cohort of patients with atrial fibrillation who had recently started warfarin therapy.

11.

Background:

There is an increased risk of venous thromboembolism among women taking oral contraceptives. However, whether there is an additional risk among women with polycystic ovary syndrome (PCOS) is unknown.

Methods:

We developed a population-based cohort from the IMS LifeLink Health Plan Claims Database, which includes managed care organizations in the United States. Women aged 18–46 years taking combined oral contraceptives and who had a claim for PCOS (n = 43 506) were matched, based on a propensity score, to control women (n = 43 506) taking oral contraceptives. Venous thromboembolism was defined using administrative coding and use of anticoagulation. We used Cox proportional hazards models to assess the relative risk (RR) of venous thromboembolism among users of combined oral contraceptives with and without PCOS.

Results:

The incidence of venous thromboembolism among women with PCOS was 23.7/10 000 person-years, while that for matched controls was 10.9/10 000 person-years. Women with PCOS taking combined oral contraceptives had an RR for venous thromboembolism of 2.14 (95% confidence interval [CI] 1.41–3.24) compared with other contraceptive users. The incidence of venous thromboembolism was 6.3/10 000 person-years among women with PCOS not taking oral contraceptives; the incidence was 4.1/10 000 person-years among matched controls. The RR of venous thromboembolism among women with PCOS not taking oral contraceptives was 1.55 (95% CI 1.10–2.19).
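The crude rate ratios implied by the incidence rates above line up closely with the published relative risks; the small differences reflect that the published RRs come from propensity-matched Cox models rather than a direct division of rates. A sketch of the arithmetic:

```python
# Incidence rates per 10 000 person-years, from the Results above.
rate_pcos_coc, rate_ctrl_coc = 23.7, 10.9   # on combined oral contraceptives
rate_pcos_none, rate_ctrl_none = 6.3, 4.1   # not on oral contraceptives

crude_rr_coc = rate_pcos_coc / rate_ctrl_coc     # compare published Cox RR 2.14
crude_rr_none = rate_pcos_none / rate_ctrl_none  # compare published Cox RR 1.55
print(round(crude_rr_coc, 2), round(crude_rr_none, 2))  # → 2.17 1.54
```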

Interpretation:

We found a 2-fold increased risk of venous thromboembolism among women with PCOS who were taking combined oral contraceptives and a 1.5-fold increased risk among women with PCOS not taking oral contraceptives. Physicians should consider the increased risk of venous thromboembolism when prescribing contraceptive therapy to women with PCOS.

Polycystic ovary syndrome (PCOS) is the most common endocrine disorder among women of reproductive age. The National Institutes of Health criteria estimate its prevalence in the United States to be between 6% and 10%, while the Rotterdam criteria estimate the prevalence to be as high as 15%.1 Although its cause is not entirely known, the diagnostic criteria include oligo- or anovulation, clinical and/or biochemical signs of hyperandrogenism, and polycystic ovaries.2 Women often present with clinical manifestations of high androgen levels, including facial hair growth (hirsutism), acne vulgaris and hair loss on the scalp. Previous studies reported the prevalence of impaired glucose tolerance to be 31.1%–35.2% and the prevalence of type 2 diabetes to be 7.5%–9.8% among women with PCOS.3,4 A recent consensus workshop reported that the prevalence of several known risk factors for cardiovascular disease (hypertension, diabetes, abdominal obesity, psychological factors, smoking, altered apoA1/ApoB ratios) is doubled among women with PCOS compared with matched controls.1,5 Combined oral contraceptives are the mainstay treatment for PCOS.
However, they are also known to elevate the risk of venous thromboembolism and cardiovascular disease.6 To date, contraceptive studies involving women with PCOS have focused mainly on efficacy, evaluating the effect of combined oral contraceptives on the reduction of hirsutism and hyperandrogenism.7,8 Two studies assessed the metabolic effects of combined oral contraceptives in PCOS, but these studies had small sample sizes and could not evaluate for cardiovascular events.9,10 Although women with PCOS have an increase in both cardiovascular risk factors and subclinical cardiovascular disease,11 recent guidelines have concluded there are no data in the literature assessing the association between the use of oral contraceptives and cardiovascular disease among women with PCOS.2 Because combined oral contraceptives are the mainstay treatment, our objective was to determine whether women with PCOS taking combined oral contraceptives have a greater risk of venous thromboembolism compared with other contraceptive users. We also examined whether women with PCOS not taking oral contraceptives had an increased risk of venous thromboembolism compared with the general population.

12.
13.

Background:

Routine eye examinations for healthy adults aged 20–64 years were delisted from the Ontario Health Insurance Plan in 2004, but they continue to be insured for people with diabetes regardless of age. We sought to assess whether the delisting of routine eye examinations for healthy adults had the unintended consequence of decreasing retinopathy screening for adults with diabetes.

Methods:

We used administrative data to calculate rates of eye examinations for people with diabetes aged 40–64 years and 65 years and older in each 2-year period from 1998 to 2010. We examined differences by sex, income, rurality and type of health care provider. We used segmented linear regression to assess the change in trend before and after 2004.

Results:

For people with diabetes aged 65 years and older, eye examinations rose gradually from 1998 to 2010, with no substantial change between 2004 and 2006. For people with diabetes aged 40–65 years, there was an 8.7% (95% confidence interval [CI] 6.3%–11.1%) decrease in eye examinations between 2004 and 2006. Results were similar for all population subgroups. Ophthalmologic examinations decreased steadily for both age groups during the study period, and there was a decline in optometry examinations for people aged 40–65 years after 2004.

Interpretation:

The delisting of routine eye examinations for healthy adults in Ontario had the unintended consequence of reducing publicly funded retinopathy screening for people with diabetes. More research is needed to understand whether patients are being charged for an insured service or to what degree misunderstanding has prevented patients from seeking care.

Diabetic retinopathy is the leading cause of new cases of blindness in people of working age.1 In the United States, about 40% of adults with diabetes aged 40 years and older have retinopathy, and 8% have vision-threatening retinopathy.2 Studies suggest that, if untreated, 50% of patients with proliferative diabetic retinopathy become legally blind within 5 years, compared with only 5% of patients who receive early treatment.3 Regular dilated eye examinations are effective for early detection and monitoring of asymptomatic retinopathy in people with diabetes4 and are recommended by clinical practice guidelines.5,6 In Ontario, Canada’s most populous province, medically necessary services are covered by the Ontario Health Insurance Plan (OHIP) for all permanent residents and Canadian citizens living in the province.7 Under OHIP, routine eye examinations were fully insured for all children and adults until November 1, 2004. At that time, routine eye examinations ceased being insured for healthy adults aged 20–64 years, but continued to be insured for children aged 19 years and younger and for adults aged 65 years and older.8 Regardless of age, adults with diabetes and some other medical conditions affecting the eye, as well as adults receiving social assistance, continued to have an annual eye examination covered by OHIP. Insured examinations are at no cost to the patient and are reimbursed to the provider at about Can$40.
In contrast, healthy adults aged 20–64 years are required to pay out-of-pocket or through private insurance for a routine eye examination, with fees set at the discretion of the optometrist9 or physician.10 Health policy experts suggest that delisting services from insurance schemes can have unpredictable effects.11 Understanding the effect of delisting on care is particularly important as governments face fiscal pressures and contemplate further reductions in what is publicly insured.12 We sought to assess whether delisting routine eye examinations for healthy middle-aged adults in Ontario had the unintended consequence of decreasing retinopathy screening for middle-aged adults with diabetes, even though eye examinations continued to be insured for this population.
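The segmented (piecewise) linear regression named in the Methods fits one slope before the 2004 delisting plus a slope change after it. A minimal sketch with synthetic, hypothetical screening rates (not the study's data), placing the knot at 2004:

```python
import numpy as np

# Biennial periods, 1998-2010, as in the study design.
years = np.arange(1998, 2011, 2).astype(float)
# Hypothetical rates: +0.5/yr before 2004, -2.0/yr after (illustration only).
rates = np.where(years <= 2004,
                 60 + 0.5 * (years - 1998),
                 63 - 2.0 * (years - 2004))

knot = 2004.0
# Design matrix: intercept, overall time trend, and a hinge term that is
# zero before the knot, so its coefficient is the post-2004 slope change.
X = np.column_stack([np.ones_like(years),
                     years - 1998,
                     np.maximum(0.0, years - knot)])
(intercept, slope_before, slope_change), *_ = np.linalg.lstsq(X, rates, rcond=None)
print(float(slope_before), float(slope_before + slope_change))  # pre- and post-knot slopes
```

In the study itself the model would be fit to the observed examination rates, with inference focused on whether the slope-change coefficient differs from zero after 2004.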

14.
15.

Background:

Little evidence exists on the effect of an energy-unrestricted healthy diet on metabolic syndrome. We evaluated the long-term effect of Mediterranean diets ad libitum on the incidence or reversion of metabolic syndrome.

Methods:

We performed a secondary analysis of the PREDIMED trial — a multicentre, randomized trial done between October 2003 and December 2010 that involved men and women (age 55–80 yr) at high risk for cardiovascular disease. Participants were randomly assigned to 1 of 3 dietary interventions: a Mediterranean diet supplemented with extra-virgin olive oil, a Mediterranean diet supplemented with nuts or advice on following a low-fat diet (the control group). The interventions did not include increased physical activity or weight loss as a goal. We analyzed available data from 5801 participants. We determined the effect of diet on incidence and reversion of metabolic syndrome using Cox regression analysis to calculate hazard ratios (HRs) and 95% confidence intervals (CIs).

Results:

Over 4.8 years of follow-up, metabolic syndrome developed in 960 (50.0%) of the 1919 participants who did not have the condition at baseline. The risk of developing metabolic syndrome did not differ between participants assigned to the control diet and those assigned to either of the Mediterranean diets (control v. olive oil HR 1.10, 95% CI 0.94–1.30, p = 0.231; control v. nuts HR 1.08, 95% CI 0.92–1.27, p = 0.3). Reversion occurred in 958 (28.2%) of the 3392 participants who had metabolic syndrome at baseline. Compared with the control group, participants on either Mediterranean diet were more likely to undergo reversion (control v. olive oil HR 1.35, 95% CI 1.15–1.58, p < 0.001; control v. nuts HR 1.28, 95% CI 1.08–1.51, p < 0.001). Participants in the group receiving olive oil supplementation showed significant decreases in both central obesity and high fasting glucose (p = 0.02); participants in the group supplemented with nuts showed a significant decrease in central obesity.
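The incidence and reversion proportions quoted above follow directly from the stated counts:

```python
# Counts from the Results above.
developed, at_risk = 960, 1919   # incident metabolic syndrome over follow-up
reverted, with_ms = 958, 3392    # reversion among those with the syndrome at baseline

print(round(100 * developed / at_risk, 1))  # → 50.0
print(round(100 * reverted / with_ms, 1))   # → 28.2
```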

Interpretation:

A Mediterranean diet supplemented with either extra virgin olive oil or nuts is not associated with the onset of metabolic syndrome, but such diets are more likely to cause reversion of the condition. An energy-unrestricted Mediterranean diet may be useful in reducing the risks of central obesity and hyperglycemia in people at high risk of cardiovascular disease. Trial registration: ClinicalTrials.gov, no. ISRCTN35739639.

Metabolic syndrome is a cluster of 3 or more of 5 related cardiometabolic risk factors: central obesity (determined by waist circumference), hypertension, hypertriglyceridemia, low plasma high-density lipoprotein (HDL) cholesterol levels and hyperglycemia. Having the syndrome increases a person’s risk for type 2 diabetes and cardiovascular disease.1,2 In addition, the condition is associated with increased morbidity and all-cause mortality.1,35 The worldwide prevalence of metabolic syndrome in adults approaches 25%68 and increases with age,7 especially among women,8,9 making it an important public health issue. Several studies have shown that lifestyle modifications,10 such as increased physical activity,11 adherence to a healthy diet12,13 or weight loss,1416 are associated with reversion of the metabolic syndrome and its components. However, little information exists as to whether changes in the overall dietary pattern without weight loss might also be effective in preventing and managing the condition. The Mediterranean diet is recognized as one of the healthiest dietary patterns. It has shown benefits in patients with cardiovascular disease17,18 and in the prevention and treatment of related conditions, such as diabetes,1921 hypertension22,23 and metabolic syndrome.24 Several cross-sectional2529 and prospective3032 epidemiologic studies have suggested an inverse association between adherence to the Mediterranean diet and the prevalence or incidence of metabolic syndrome.
Evidence from clinical trials has shown that an energy-restricted Mediterranean diet33 or adopting a Mediterranean diet after weight loss34 has a beneficial effect on metabolic syndrome. However, these studies did not determine whether the effect could be attributed to the weight loss or to the diets themselves. Seminal data from the PREDIMED (PREvención con DIeta MEDiterránea) study suggested that adherence to a Mediterranean diet supplemented with nuts reversed metabolic syndrome more often than advice to follow a low-fat diet.35 However, the report was based on data from only 1224 participants followed for 1 year. We have analyzed the data from the final PREDIMED cohort after a median follow-up of 4.8 years to determine the long-term effects of a Mediterranean diet on metabolic syndrome.

16.

Background:

Modifiable behaviours during early childhood may provide opportunities to prevent disease processes before adverse outcomes occur. Our objective was to determine whether young children’s eating behaviours were associated with increased risk of cardiovascular disease in later life.

Methods:

In this cross-sectional study involving children aged 3–5 years recruited from 7 primary care practices in Toronto, Ontario, we assessed the relation between eating behaviours as assessed by the NutriSTEP (Nutritional Screening Tool for Every Preschooler) questionnaire (completed by parents) and serum levels of non–high-density lipoprotein (HDL) cholesterol, a surrogate marker of cardiovascular risk. We also assessed the relation between dietary intake and serum non-HDL cholesterol, and between eating behaviours and other laboratory indices of cardiovascular risk (low-density lipoprotein [LDL] cholesterol, apolipoprotein B, HDL cholesterol and apolipoprotein A1).

Results:

A total of 1856 children were recruited from primary care practices in Toronto. Of these children, we included 1076 in our study for whom complete data and blood samples were available for analysis. The eating behaviours subscore of the NutriSTEP tool was significantly associated with serum non-HDL cholesterol (p = 0.03); for each unit increase in the eating behaviours subscore suggesting greater nutritional risk, we saw an increase of 0.02 mmol/L (95% confidence interval [CI] 0.002 to 0.05) in serum non-HDL cholesterol. The eating behaviours subscore was also associated with LDL cholesterol and apolipoprotein B, but not with HDL cholesterol or apolipoprotein A1. The dietary intake subscore was not associated with non-HDL cholesterol.

Interpretation:

Eating behaviours in preschool-aged children are important potentially modifiable determinants of cardiovascular risk and should be a focus for future studies of screening and behavioural interventions.

Modifiable behaviours during early childhood may provide opportunities to prevent later chronic diseases, in addition to the behavioural patterns that contribute to them, before adverse outcomes occur. There is evidence that behavioural interventions during early childhood (e.g., ages 3–5 yr) can promote healthy eating.1 For example, repeated exposure to vegetables increases vegetable preference and intake,2 entertaining presentations of fruits (e.g., in the shape of a boat) increase their consumption,3 discussing internal satiety cues with young children reduces snacking,4 serving carrots before the main course (as opposed to with the main course) increases carrot consumption,5 and positive modelling of the consumption of healthy foods increases their intake by young children.6,7 Responsive eating behavioural styles in which children are given access to healthy foods and allowed to determine the timing and pace of eating in response to internal cues with limited distractions, such as those from television, have been recommended by the Institute of Medicine.8 Early childhood is a critical period for assessing the origins of cardiometabolic disease and implementing preventive interventions.8 However, identifying behavioural risk factors for cardiovascular disease during early childhood is challenging, because signs of disease can take decades to appear.
One emerging surrogate marker for later cardiovascular risk is the serum concentration of non–high-density lipoprotein (HDL) cholesterol (or total cholesterol minus HDL cholesterol).912 The Young Finns longitudinal study found an association between non-HDL cholesterol levels during childhood (ages 3–18 yr) and an adult measure of atherosclerosis (carotid artery intima–media thickness), although this relation was not significant for the subgroup of younger female children (ages 3–9 yr).10,11 The Bogalusa Heart Study, which included a subgroup of children aged 2–15 years, found an association between low-density lipoprotein (LDL) cholesterol concentration (which is highly correlated with non-HDL cholesterol) and asymptomatic atherosclerosis at autopsy.12 The American Academy of Pediatrics recommends serum non-HDL cholesterol concentration as the key measure for cardiovascular risk screening in children and as the dyslipidemia screening test for children aged 9–11 years.9 Cardiovascular risk stratification tools such as the Reynolds Risk Score (www.reynoldsriskscore.org) and the Framingham Heart Study coronary artery disease 10-year risk calculator (www.framinghamheartstudy.org/risk) for adults do not enable directed interventions when cardiovascular disease processes begin — during childhood.

The primary objective of our study was to determine whether eating behaviours at 3–5 years of age, as assessed by the NutriSTEP (Nutritional Screening Tool for Every Preschooler) questionnaire,13,14 are associated with non-HDL cholesterol levels, a surrogate marker of cardiovascular risk.
Our secondary objectives were to determine whether other measures of nutritional risk, such as dietary intake, were associated with non-HDL cholesterol levels and whether eating behaviours were associated with other cardiovascular risk factors, such as LDL cholesterol, apolipoprotein B, HDL cholesterol and apolipoprotein A1.

17.

Background

Preventive guidelines on cardiovascular risk management recommend lifestyle changes. Support for lifestyle changes may be a useful task for practice nurses, but the effect of such interventions in primary prevention is not clear. We examined the effect of involving patients in nurse-led cardiovascular risk management on lifestyle adherence and cardiovascular risk.

Methods

We performed a cluster randomized controlled trial in 25 practices that included 615 patients. The intervention consisted of nurse-led cardiovascular risk management, including risk assessment, risk communication, a decision aid and adapted motivational interviewing. The control group received a minimal nurse-led intervention. The self-reported outcome measures at one year were smoking, alcohol use, diet and physical activity. Nurses assessed 10-year cardiovascular mortality risk after one year.

Results

There were no significant differences between the intervention and control groups. The effect of the intervention on vegetable consumption and physical activity was small, and some differences were significant only for subgroups. The effects of the intervention on intake of fat, fruit and alcohol, and on smoking, were not significant. We found no difference between the groups in 10-year cardiovascular risk.

Interpretation

Nurse-led risk communication, use of a decision aid and adapted motivational interviewing did not lead to relevant differences between the groups in terms of lifestyle changes or cardiovascular risk, despite significant within-group differences.

It is not clear if programs for lifestyle change are effective in the primary prevention of cardiovascular diseases. Some studies have shown lifestyle improvements with cardiovascular rehabilitation programs,13 and studies in primary prevention have suggested small, but potentially important, reductions in the risk of cardiovascular disease. However, these studies have had limitations and have recommended further research.4,5 According to national and international guidelines for cardiovascular risk management, measures to prevent cardiovascular disease, such as patient education and support for lifestyle change, can be delegated to practice nurses in primary care.68 However, we do not know whether the delivery of primary prevention programs by practice nurses is effective. We also do not know the effect of nurse-led prevention, including shared decision-making and risk communication, on cardiovascular risk. Because an unhealthy lifestyle plays an important role in the development of cardiovascular disease,9,10 preventive guidelines on cardiovascular disease and diabetes recommend education and counselling about smoking, diet, physical exercise and alcohol consumption for patients with moderately and highly increased risk.6,11 These patients are usually monitored in primary care practices. The adherence to lifestyle advice ranges from 20% to 90%,1215 and improving adherence requires effective interventions, comprising cognitive, behavioural and affective components (strategies to influence adherence to lifestyle advice via feelings and emotions or social relationships and social supports).16 Shared treatment decisions are highly preferred.
Informed and shared decision-making requires that all information about the cardiovascular risk and the pros and cons of the risk-reduction options be shared with the patient, and that the patients’ individual values, personal resources and capacity for self-determination be respected.1719 In our cardiovascular risk reduction study,20 we developed an innovative implementation strategy that included a central role for practice nurses. Key elements of our intervention included risk assessment, risk communication, use of a decision aid and adapted motivational interviewing (Box 1).19,21,22

Box 1. Key features of the nurse-led intervention

  • Risk assessment (intervention and control): The absolute 10-year mortality risk from cardiovascular diseases was assessed with use of a risk table from the Dutch guidelines (for patients without diabetes) or the UK Prospective Diabetes Study risk engine (for patients with diabetes).6,23 Nurses in the control group continued to provide usual care after this step.
  • Risk communication (intervention only): Nurses informed the patients of their absolute 10-year cardiovascular mortality risk using a risk communication tool developed for this study.2437
  • Decision support (intervention only): Nurses provided support to the patients using an updated decision aid.28 This tool facilitated the nurses’ interaction with the patients to arrive at informed, value-based choices for risk reduction. The tool provided information about the options and their associated relevant outcomes.
  • Adapted motivational interviewing (intervention only): Nurses discussed the options for risk reduction. The patient’s personal values were elicited using adapted motivational interviewing.
In the present study, we investigated whether a nurse-led intervention in primary care had a positive effect on lifestyle and 10-year cardiovascular risk. We hypothesized that involving patients in decision-making would increase adherence to lifestyle changes and decrease the absolute risk of 10-year cardiovascular mortality.

18.

Background:

Systems of stroke care delivery have been promoted as a means of improving the quality of stroke care, but little is known about their effectiveness. We assessed the effect of the Ontario Stroke System, a province-wide strategy of regionalized stroke care delivery, on stroke care and outcomes in Ontario, Canada.

Methods:

We used population-based provincial administrative databases to identify all emergency department visits and hospital admissions for acute stroke and transient ischemic attack from Jan. 1, 2001, to Dec. 31, 2010. Using piecewise regression analyses, we assessed the effect of the full implementation of the Ontario Stroke System in 2005 on the proportion of patients who received care at stroke centres, and on rates of discharge to long-term care facilities and 30-day mortality after stroke.

Results:

We included 243 287 visits by patients with acute stroke or transient ischemic attack. The full implementation of the Ontario Stroke System in 2005 was associated with an increase in rates of care at stroke centres (before implementation: 40.0%; after implementation: 46.5%), decreased rates of discharge to long-term care facilities (before implementation: 16.9%; after implementation: 14.8%) and decreased 30-day mortality for hemorrhagic (before implementation: 38.3%; after implementation: 34.4%) and ischemic stroke (before implementation: 16.3%; after implementation: 15.7%). The system’s implementation was also associated with marked increases in the proportion of patients who received neuroimaging, thrombolytic therapy, care in a stroke unit and antithrombotic therapy.

Interpretation:

The implementation of an organized system of stroke care delivery was associated with improved processes of care and outcomes after stroke.

Stroke is a leading cause of death and disability worldwide.1,2 Guidelines recommend that eligible patients receive care in a stroke unit, undergo neuroimaging and receive thrombolytic therapy, antithrombotic agents and screening for carotid stenosis.36 Many of these interventions require specialized resources, including clinicians with expertise in stroke care and rapid access to brain and vascular imaging; however, wide interfacility variations exist in the availability of such resources.710 To address regional disparities in resources and care, organizations such as the Canadian Stroke Network and the American Stroke Association have recommended the implementation of organized systems of stroke care delivery.11,12 Such systems are designed to facilitate access to optimal stroke care across an entire region and to promote the use of evidence-based therapies.11 However, little is known about the effect of stroke systems of care on outcomes in patients with stroke. The province of Ontario was the first large jurisdiction in Canada, and in North America, to implement an integrated regional system of stroke care delivery. A system of coordinated stroke care, known as the Ontario Stroke System, was launched in 2000 and fully implemented in 2005, resulting in a major transformation in the delivery of stroke care across the province.13 We used population-based administrative and clinical data to evaluate the effect of the system’s implementation on stroke care and outcomes.

19.

Background:

Compression ultrasonography performed serially over a 7-day period is recommended for the diagnosis of deep vein thrombosis in symptomatic pregnant women, but whether this approach is safe is unknown. We evaluated the safety of withholding anticoagulation from pregnant women with suspected deep vein thrombosis following negative serial compression ultrasonography and iliac vein imaging.

Methods:

Consecutive pregnant women who presented with suspected deep vein thrombosis underwent compression ultrasonography and Doppler imaging of the iliac vein of the symptomatic leg(s). Women whose initial test results were negative underwent serial testing on 2 occasions over the next 7 days. Women not diagnosed with deep vein thrombosis were followed for a minimum of 3 months for the development of symptomatic deep vein thrombosis or pulmonary embolism.

Results:

In total, 221 pregnant women presented with suspected deep vein thrombosis. Deep vein thrombosis was diagnosed in 16 (7.2%) women by initial compression ultrasonography and Doppler studies; no additional cases were identified on serial testing. One patient with normal serial testing had a pulmonary embolism diagnosed 7 weeks later. The overall prevalence of deep vein thrombosis was 7.7% (17/221); of these, 65% (11/17) of cases were isolated to the iliofemoral veins and 12% (2/17) were isolated iliac deep vein thromboses. The incidence of venous thromboembolism during follow-up was 0.49% (95% confidence interval [CI] 0.09%–2.71%). The sensitivity of serial compression ultrasonography with Doppler imaging was 94.1% (95% CI 69.2%–99.7%), the negative predictive value was 99.5% (95% CI 96.9%–100%), and the negative likelihood ratio was 0.068 (95% CI 0.01–0.39).
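The sensitivity and negative predictive value quoted above can be reconstructed from the counts, assuming (as the abstract implies) that all 16 initially positive studies were true positives and the single missed case was the patient later diagnosed with pulmonary embolism; the published negative likelihood ratio and confidence intervals come from the authors' full calculation and are not re-derived here:

```python
# 221 women; 17 with venous thromboembolism overall; 16 detected at initial
# imaging; 1 missed case (assumption: no false positives are described).
tp, fn = 16, 1
tn = 221 - 17   # 204 women correctly ruled out

sensitivity = tp / (tp + fn)   # 16/17
npv = tn / (tn + fn)           # 204/205
print(round(100 * sensitivity, 1), round(100 * npv, 1))  # → 94.1 99.5
```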

Interpretation:

Serial compression ultrasonography with Doppler imaging of the iliac vein performed over a 7-day period excludes deep vein thrombosis in symptomatic pregnant women.

Over the last 2 decades, venous compression ultrasonography has become the imaging test of choice for diagnosing deep vein thrombosis in the lower extremities of men and nonpregnant women.14 Although this test is highly sensitive (about 97%) for deep vein thrombosis involving the femoral and popliteal veins, compression ultrasonography is less sensitive for the detection of isolated deep vein thrombosis in the calf.5 Because proximal propagation of isolated calf deep vein thrombosis occurs in about 20% of cases, serial compression ultrasonography performed over a 7-day period is recommended to definitively exclude such thromboses if the results of the initial compression ultrasound are negative.6 The use of serial compression ultrasonography in symptomatic men and nonpregnant women has been validated in prospective studies,1,7 suggesting that withholding anticoagulation from symptomatic patients whose serial compression ultrasound results are negative is safe, with less than 2% of patients subsequently being diagnosed with deep vein thrombosis.1,7,8 Although the use of serial compression ultrasonography has not been validated in pregnant women, this strategy is also advocated for symptomatic pregnant women.9 The appeal of using compression ultrasonography for diagnosing deep vein thrombosis in pregnant women is obvious: it is noninvasive, widely available and does not expose the fetus to ionizing radiation.
However, generalizing results from studies involving men and nonpregnant women to pregnant women is problematic because of differences in clinical presentation and anatomic distribution of deep vein thromboses.10 Compared with men and nonpregnant women, pregnant women more often present with very proximal deep vein thrombosis (including isolated iliac vein deep vein thrombosis); isolated distal calf deep vein thromboses are infrequent.10 In a recent review of the literature, we found that 62% of all deep vein thromboses in symptomatic pregnant women were in the iliofemoral veins, 17% were in the iliac vein alone, and 6% were in the calf veins.10 In contrast, in the general population, more than 80% of deep vein thromboses involve the calf veins, and iliofemoral or isolated iliac deep vein thromboses are uncommon (< 5%).1–4

Physiologic changes associated with pregnancy might affect blood flow patterns and the normal compressibility of the proximal veins, thereby affecting the diagnostic accuracy of compression ultrasonography. This technique cannot be used to detect isolated deep vein thromboses in the iliac vein, because the intrapelvic location of these veins makes them incompressible. Whether Doppler studies are sensitive for detecting deep vein thromboses in these high proximal (i.e., iliac) veins has not been well studied, but data suggest that this method of detection compares favourably to compression ultrasonography for proximal deep vein thromboses in men and nonpregnant women.11 The use of Doppler imaging in pregnant women for the purpose of detecting iliac vein deep vein thromboses has been reported in the literature,12,13 but it has not been adequately evaluated.

Currently, the standard practice for diagnosing deep vein thrombosis in symptomatic pregnant women is compression ultrasonography.
If the results of the compression ultrasound are negative, Doppler imaging of the iliac vein (with or without vagal manoeuvres) is recommended, particularly for women with a high clinical probability of deep vein thrombosis in the iliac vein.9,12,13 This diagnostic approach is advocated despite the absence of any prospective studies validating its use. In this study, we evaluated the diagnostic accuracy of serial compression ultrasonography and Doppler imaging of the iliac veins over a 7-day period among symptomatic pregnant women.
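The test characteristics reported in the Results above follow directly from the underlying 2×2 counts. A minimal sketch, using counts inferred from the abstract (16 thromboses detected initially, 1 missed case diagnosed later as pulmonary embolism, and the remaining 204 women negative on testing with no venous thromboembolism; these counts are reconstructed, not taken from the paper's tables):

```python
# 2x2 counts inferred from the abstract (illustrative reconstruction):
tp, fn, tn, fp = 16, 1, 204, 0  # true pos., false neg., true neg., false pos.

sensitivity = tp / (tp + fn)                   # 16/17  ~ 94.1%
npv = tn / (tn + fn)                           # 204/205 ~ 99.5% (neg. predictive value)
prevalence = (tp + fn) / (tp + fn + tn + fp)   # 17/221 ~ 7.7%

print(f"sensitivity = {sensitivity:.1%}")
print(f"negative predictive value = {npv:.1%}")
print(f"prevalence = {prevalence:.1%}")
```

The published negative likelihood ratio (0.068) and the confidence intervals depend on the exact small-sample method used, so they are not recomputed here.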

20.

Background:

Multimorbidity, the presence of more than 1 long-term disorder, is associated with increased use of health services, but unplanned admissions to hospital may often be undesirable. Furthermore, socioeconomic deprivation and mental health comorbidity may lead to additional unplanned admissions. We examined the association between unplanned admission to hospital and physical multimorbidity, mental health and socioeconomic deprivation.

Methods:

We conducted a retrospective cohort study using data from 180 815 patients aged 20 years and older who were registered with 40 general practices in Scotland. Details of 32 physical and 8 mental health morbidities were extracted from the patients’ electronic health records (as of Apr. 1, 2006) and linked to hospital admission data. We then recorded the occurrence of unplanned or potentially preventable unplanned acute (nonpsychiatric) admissions to hospital in the subsequent 12 months. We used logistic regression models, adjusting for age and sex, to determine associations between unplanned or potentially preventable unplanned admissions to hospital and physical multimorbidity, mental health and socioeconomic deprivation.

Results:

We identified 10 828 (6.0%) patients who had at least 1 unplanned admission to hospital and 2037 (1.1%) patients who had at least 1 potentially preventable unplanned admission to hospital. Both unplanned and potentially preventable unplanned admissions were independently associated with increasing physical multimorbidity (for ≥ 4 v. 0 conditions, odds ratio [OR] 5.87 [95% confidence interval (CI) 5.45–6.32] for unplanned admissions, OR 14.38 [95% CI 11.87–17.43] for potentially preventable unplanned admissions), mental health conditions (for ≥ 1 v. 0 conditions, OR 2.01 [95% CI 1.92–2.09] for unplanned admissions, OR 1.80 [95% CI 1.64–1.97] for potentially preventable unplanned admissions) and socioeconomic deprivation (for most v. least deprived quintile, OR 1.56 [95% CI 1.43–1.70] for unplanned admissions, OR 1.98 [95% CI 1.63–2.41] for potentially preventable unplanned admissions).

Interpretation:

Physical multimorbidity was strongly associated with unplanned admission to hospital, including admissions that were potentially preventable. The risk of admission to hospital was exacerbated by the coexistence of mental health conditions and socioeconomic deprivation.

Multimorbidity — usually defined as the presence of more than 1 long-term disorder — is becoming the norm rather than the exception as populations age.1,2 A recent study found that most people older than 65 years of age had multimorbidity, and that the mean number of comorbidities per person increased with age;1 however, multimorbidity is not confined to older adults.3

Multimorbidity is associated with a range of adverse outcomes. People with multimorbidity have worse physical, social and psychological quality of life4 and increased mortality.5 Mental health conditions often accompany and exacerbate long-term physical conditions, leading to poor health outcomes, reduced quality of life and increased costs.1,6,7 Furthermore, health services are largely organized to provide care for single diseases, particularly in hospitals or under specialist care.
Indeed, many aspects of care are poor for patients with multimorbidity.8–10 This situation may be further aggravated among patients who are socioeconomically disadvantaged, because they often have poorer health and higher health care needs, while also experiencing poorer provision of services, than their more advantaged counterparts.11 A lack of social and personal resources, coupled with multiple stresses, makes coping difficult for these patients,12 and the multiplicity of physical, psychological and social problems means that family physicians sometimes struggle to support patients with multimorbidity in deprived settings.13

Multimorbidity is associated with increased use of health services; however, whereas high use of primary and specialist ambulatory care may be seen as an appropriate response to multimorbidity, frequent unplanned admissions to hospital are often undesirable.14 Unfortunately, relatively few large studies have examined the association between multimorbidity and unplanned hospital admissions.15–17 Moreover, such studies did not separately examine physical and mental health morbidity and did not account for the additional effect of socioeconomic deprivation — shortcomings we aimed to address. Using linked routine clinical primary care and hospital data, we sought to determine the association between unplanned admissions to hospital and physical multimorbidity, as well as any additional effect of mental health morbidity and socioeconomic deprivation.
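The odds ratios in the Results above came from logistic regression models adjusted for age and sex; for a single unadjusted 2×2 comparison, an odds ratio and its Wald 95% confidence interval can be computed directly. A small sketch with hypothetical counts (not taken from the study data):

```python
import math

def odds_ratio_ci(a, b, c, d):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a/b = admitted/not admitted among exposed (e.g., multimorbid),
    c/d = admitted/not admitted among unexposed."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)        # standard error of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts chosen only to illustrate the calculation:
or_, lo, hi = odds_ratio_ci(120, 880, 90, 3910)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The study's adjusted estimates (e.g., OR 5.87 for ≥ 4 vs. 0 physical conditions) cannot be reproduced this way without the individual-level data, since the regression adjusts for covariates.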
