Similar Articles
20 similar articles found (search time: 31 ms)
1.

Purpose

Favourable small cell lung carcinoma (SCLC) survival outcomes have been reported in patients with paraneoplastic neurological disorders (PNDs) associated with neuronal antibodies (Neur-Abs), but the presence of a PND might have expedited diagnosis. Our aim was to establish whether neuronal antibodies, independent of clinical neurological features, correlate with SCLC survival.

Experimental Design

262 consecutive SCLC patients were examined: of these, 24 with neurological disease were excluded from this study. The remaining 238 were tested for a broad array of Neur-Abs at the time of cancer diagnosis; survival time was established from follow-up clinical data.

Results

Median survival of the non-PND cohort (n = 238) was 9.5 months. 103 patients (43%) had one or more antigen-defined Neur-Abs. We found significantly longer median survival in the 23 patients (10%) with HuD/anti-neuronal nuclear antibody type 1 (ANNA-1; 13.0 months, P = 0.037), but not with any of the other antigen-defined antibodies, including the PND-related SOX2 (n = 56, 24%). An additional 28 patients (12%) had uncharacterised anti-neuronal nuclear antibodies (ANNA-U); their median survival time was longer still (15.0 months, P = 0.0048), contrasting with the survival time in patients with non-neuronal anti-nuclear antibodies (detected using HEp-2 cells; n = 23, 10%; 9.25 months). In multivariate analyses, ANNA-1 and ANNA-U independently reduced the mortality hazard, with hazard ratios of 0.532 (P = 0.01) and 0.430 (P<0.001), respectively.

Conclusions

ANNAs, including the newly described ANNA-U, may be key components of the SCLC immunome and have a potential role in predicting SCLC survival; screening for them could add prognostic value similar in magnitude to that of limited staging at diagnosis.

2.

Background

Low molecular weight heparins (LMWHs) are used to prevent and treat thrombosis. Tests for monitoring LMWHs include anti-factor Xa (anti-FXa), activated partial thromboplastin time (aPTT) and thrombin generation. Anti-FXa is the current gold standard despite LMWHs' varying affinities for FXa and thrombin.

Aim

To examine the effects of two different LMWHs on the results of four different aPTT tests, anti-FXa activity and thrombin generation, and to assess the tests' concordance.

Method

Enoxaparin and tinzaparin were added ex vivo, in concentrations of 0.0, 0.5, 1.0 and 1.5 anti-FXa international units (IU)/mL, to blood from 10 volunteers. aPTT was measured using two whole blood methods (free oscillation rheometry (FOR) and Hemochron Jr (HCJ)) and an optical plasma method with two different reagents (ActinFSL and PTT-Automat). Anti-FXa activity was quantified using a chromogenic assay. Thrombin generation (endogenous thrombin potential, ETP) was measured on a Ceveron Alpha instrument using the TGA RB reagent and the more tissue-factor-rich TGA RC reagent.

Results

Mean aPTT at 1.0 IU/mL LMWH varied between methods: 54 s (SD 11) to 69 s (SD 14) for enoxaparin and 101 s (SD 21) to 140 s (SD 28) for tinzaparin. ActinFSL gave significantly shorter aPTT results. aPTT and anti-FXa generally correlated well. ETP measured with the TGA RC reagent, but not the TGA RB reagent, showed an inverse exponential relationship to the LMWH concentration. The HCJ-aPTT results had the weakest correlation to anti-FXa and thrombin generation (Rs 0.62–0.87), whereas the other aPTT methods had similar correlation coefficients (Rs 0.80–0.92).

Conclusions

aPTT displays a linear dose-response to LMWH, but results vary between aPTT assays. Tinzaparin increases aPTT and decreases thrombin generation more than enoxaparin at any given level of anti-FXa activity, casting doubt on anti-FXa's status as the gold standard. Thrombin generation with a tissue factor-rich activator is a promising method for monitoring LMWHs.

3.

Background

A substantial fraction of all American healthcare expenditure is potentially wasted, and practices that are not evidence-based could contribute to such waste. We sought to characterize whether prothrombin time (PT) and activated partial thromboplastin time (aPTT) tests of preoperative patients are used in a way unsupported by evidence and potentially wasteful.

Methods and Findings

We evaluated prospectively-collected patient data from 19 major teaching hospitals and 8 hospital-affiliated surgical centers in 7 states (Delaware, Florida, Maryland, Massachusetts, New Jersey, New York, Pennsylvania) and the District of Columbia. A total of 1,053,472 consecutive patients represented every patient admitted for elective surgery from 2009 to 2012 at all 27 settings. A subset of 682,049 patients (64.7%) had one or both tests done and history and physical (H&P) records available for analysis. Unnecessary tests for bleeding risk were defined as: PT tests done on patients with no history of abnormal bleeding, warfarin therapy, vitamin K-dependent clotting factor deficiency, or liver disease; or aPTT tests done on patients with no history of heparin treatment, hemophilia, lupus anticoagulant antibodies, or von Willebrand disease. We assessed the proportion of patients who received PT or aPTT tests who lacked evidence-based reasons for testing.

Conclusions

This study sought to bring the availability of big data together with applied comparative effectiveness research. Among preoperative patients, 26.2% received PT tests, and 94.3% of those tests were unnecessary given the absence of findings on H&P. Similarly, 23.3% of preoperative patients received aPTT tests, of which 99.9% were unnecessary. Among patients with no H&P findings suggestive of bleeding risk, 6.6% of PT tests and 7.1% of aPTT tests were either a false positive or a true positive (i.e., indicative of a previously undiagnosed potential bleeding risk). Both PT and aPTT, designed as diagnostic tests, are apparently used as screening tests. Use of unnecessary screening tests raises concerns about the costs of such testing and the consequences of false positive results.

4.

Background

Fortification of staple foods is considered an effective and safe strategy to combat micronutrient deficiencies, thereby improving health. While improving micronutrient status might be expected to have positive effects on immunity, some studies have reported increases in infections or inflammation after iron supplementation.

Objective

To study effects of micronutrient-fortified rice on hookworm infection in Cambodian schoolchildren.

Methods

A double-blinded, cluster-randomized trial was conducted in 16 Cambodian primary schools partaking in the World Food Program school meal program. Three types of multi-micronutrient fortified rice were tested against placebo rice within the school meal program: UltraRice_original, UltraRice_improved and NutriRice. Four schools were randomly assigned to each study group (placebo n = 492, UltraRice_original n = 479, UltraRice_improved n = 500, NutriRice n = 506). Intestinal parasite infection was measured in fecal samples by Kato-Katz method at baseline and after three and seven months. In a subgroup (N = 330), fecal calprotectin was measured by ELISA as a marker for intestinal inflammation.

Results

Baseline prevalence of hookworm infection was 18.6%, but differed considerably among schools (range 0–48.1%). Micronutrient-fortified rice significantly increased the risk of new hookworm infection. This effect was modified by baseline hookworm prevalence at the school: hookworm infection risk was increased by all three types of fortified rice in schools where baseline prevalence was high (>15%), and only by UltraRice_original in schools with low baseline prevalence. Neither hookworm infection nor fortified rice was related to fecal calprotectin.

Conclusions

Consumption of rice fortified with micronutrients can increase hookworm prevalence, especially in environments with high infection pressure. When considering fortification of staple foods, a careful risk-benefit analysis is warranted, taking into account severity of micronutrient deficiencies and local prevalence of parasitic infections.

Trial Registration

ClinicalTrials.gov NCT01706419

5.

Background

Early detection of cancer is an effective and efficient cancer management strategy. In South Korea, the National Health Insurance administers the National Cancer Screening Program to its beneficiaries. We examined the impact of the National Cancer Screening Program on socioeconomic disparities in cancer stage at diagnosis.

Methods

Cancer patients registered in the Korean Central Cancer Registry from January 1, 2010 to December 31, 2010 with a diagnosis of gastric cancer (n = 22,470), colon cancer (n = 16,323), breast cancer (n = 10,076), or uterine cervical cancer (n = 2,447) were included. Income level was divided into three groups according to monthly National Health Insurance contributions. We employed absolute (age-standardized prevalence rate, slope index of inequality) and relative (relative index of inequality) measures to separately examine social disparities in the early-stage diagnosis rate among participants and non-participants of the National Cancer Screening Program.

Results

Age-standardized prevalence rates of early-stage diagnosis were higher in participants than in non-participants across all income groups. Furthermore, the age-standardized prevalence rate of early-stage diagnosis in the low-income group of participants was higher than that in the high-income group of non-participants. The sizes of the disparities (both slope index of inequality and relative index of inequality) were smaller in participants than in non-participants.

Conclusion

National Cancer Screening Program participation reduced income disparity in cancer stage at diagnosis. Population-based cancer screening programs can be an effective measure to reduce income disparity in cancer care.
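The slope index of inequality used in this study can be sketched in a few lines. This is a minimal illustration with hypothetical numbers (three income groups and made-up early-stage rates, not the study's data): the SII is the slope of a weighted regression of group rates on the midpoint of each group's cumulative population rank.

```python
# Hedged sketch of a slope index of inequality (SII) computation.
# All numbers below are hypothetical, not taken from the study.

def slope_index_of_inequality(shares, rates):
    """SII: slope of a weighted regression of group rates on the
    midpoint of each group's cumulative population rank (0..1).
    Groups must be ordered from lowest to highest income."""
    midpoints, cum = [], 0.0
    for s in shares:
        midpoints.append(cum + s / 2)  # midpoint of cumulative rank
        cum += s
    # weighted least squares slope, weights = population shares
    w = shares
    xbar = sum(wi * xi for wi, xi in zip(w, midpoints)) / sum(w)
    ybar = sum(wi * yi for wi, yi in zip(w, rates)) / sum(w)
    num = sum(wi * (xi - xbar) * (yi - ybar)
              for wi, xi, yi in zip(w, midpoints, rates))
    den = sum(wi * (xi - xbar) ** 2 for wi, xi in zip(w, midpoints))
    return num / den

# three income groups (low, middle, high), equal population shares,
# hypothetical early-stage diagnosis rates
sii = slope_index_of_inequality([1/3, 1/3, 1/3], [0.45, 0.52, 0.60])
print(round(sii, 3))  # → 0.225
```

The SII here reads as the absolute gap in the early-stage rate between the hypothetical bottom and top of the income distribution (22.5 percentage points); dividing it by the mean rate gives the relative index of inequality.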

7.

Background

Myelin oligodendrocyte glycoprotein antibody (MOG Ab) associated demyelination represents a subgroup of autoimmune demyelination that is separate from multiple sclerosis (MS) and aquaporin-4 IgG-positive neuromyelitis optica (NMO), and can have a relapsing course. Unlike for NMO and MS, there is a paucity of literature on immunopathology and CSF cytokines/chemokines in MOG Ab associated demyelination.

Aim

To study the differences in immunopathogenesis based on cytokine/chemokine profile in MOG Ab-positive (POS) and -negative (NEG) groups.

Methods

We measured 34 cytokines/chemokines using multiplex immunoassay in CSF collected from paediatric patients with serum MOG Ab POS demyelination [acute disseminated encephalomyelitis (ADEM), n = 8; transverse myelitis (TM), n = 2; total n = 10] and serum MOG Ab NEG demyelination (ADEM, n = 5; TM, n = 4; total n = 9). We generated normative data using CSF from 20 non-inflammatory neurological controls.

Results

The CSF cytokine and chemokine levels were higher in both the MOG Ab POS and MOG Ab NEG demyelination groups than in controls. The CSF in MOG Ab POS patients showed predominant elevation of B cell-related cytokines/chemokines (CXCL13, APRIL, BAFF and CCL19) as well as some Th17-related cytokines (IL-6 and G-CSF) compared to the MOG Ab NEG group (all p<0.01). In addition, patients with elevated CSF MOG antibodies had higher CSF CXCL13, CXCL12, CCL19, IL-17A and G-CSF than patients without CSF MOG antibodies.

Conclusion

Our findings suggest that MOG Ab POS patients have a more pronounced CNS inflammatory response, with elevation of predominantly humoral-associated cytokines/chemokines as well as some Th17- and neutrophil-related cytokines/chemokines, suggesting a differential inflammatory pathogenesis associated with MOG antibody seropositivity. This cytokine/chemokine profiling provides new insight into disease pathogenesis and improves our ability to monitor inflammation and response to treatment. In addition, some of these molecules may represent potential immunomodulatory targets.

8.

Objective

To summarize efficacy and safety data on a new progesterone formulation available for subcutaneous administration, compared with vaginally administered progesterone, for luteal phase support in patients undergoing IVF treatment.

Design

Data from two randomized phase III trials (07EU/Prg06 and 07USA/Prg05) performed according to GCP standards with a total sample size of 1435 per-protocol patients were meta-analyzed on an individual patient data level.

Setting

University affiliated reproductive medicine unit.

Patients

Subcutaneous progesterone was administered to a total of 714 subjects and vaginal progesterone to a total of 721 subjects who underwent fresh embryo transfer after ovarian stimulation followed by IVF or ICSI. The subjects were between 18 and 42 years old and had a BMI < 30 kg/m2.

Interventions

Subcutaneous progesterone 25 mg daily vs. either progesterone vaginal gel 90 mg daily (07EU/Prg06) or 100 mg intravaginal twice a day (07USA/Prg05) for luteal phase support in IVF patients.

Main outcome measures

Ongoing pregnancy rate beyond 10 gestational weeks, live birth rate and OHSS risk.

Results

The administration of subcutaneous progesterone versus intra-vaginal progesterone had no impact on ongoing pregnancy likelihood (OR = 0.865, 95% CI 0.694 to 1.077; P = n.s.), live birth likelihood (OR = 0.889, 95% CI 0.714 to 1.106; P = n.s.) or OHSS risk (OR = 0.995, 95% CI 0.565 to 1.754; P = n.s.) in regression analyses accounting for clustering of patients within trials, while adjusting for important confounders. Only female age and number of oocytes retrieved were significant predictors of live birth likelihood and OHSS risk.

Conclusion

No statistically or clinically significant differences exist between subcutaneous and vaginal progesterone for luteal phase support.

9.

Objectives

To determine the prevalence, determinants, and potential clinical relevance of adherence with the Dutch dosing guideline in patients with impaired renal function at hospital discharge.

Design

Retrospective cohort study between January 2007 and July 2011.

Setting

Academic teaching hospital in the Netherlands.

Subjects

Patients with an estimated glomerular filtration rate (eGFR) between 10-50 ml/min/1.73m2 at discharge and prescribed one or more medicines of which the dose is renal function dependent.

Main Outcome Measures

The prevalence of adherence with the Dutch renal dosing guideline was investigated, along with the influence of possible determinants such as reporting of the eGFR and severity of renal impairment (severe: eGFR < 30; moderate: eGFR 30-50 ml/min/1.73m2). Furthermore, the potential clinical relevance of non-adherence was assessed.

Results

1327 patients were included (mean age 67 years, mean eGFR 38 ml/min/1.73m2). Adherence with the guideline was present in 53.9% (n = 715) of patients. Reporting of the eGFR, introduced in April 2009, resulted in more adherence with the guideline: 50.7% vs. 57.0%, RR 1.12 (95% CI 1.02-1.25). Adherence was lower in patients with severe renal impairment (46.0%) than in patients with moderate renal impairment (58.1%; RR 0.79, 95% CI 0.70-0.89). 71.4% of the cases of non-adherence had the potential to cause moderate to severe harm.

Conclusion

Required dosage adjustments in cases of impaired renal function are often not performed at hospital discharge, and the majority of these omissions had the potential to cause moderate to severe harm. Reporting the eGFR can be a small and simple first step to improve adherence with dosing guidelines.

10.

Background

Patients with pituitary stalk interruption syndrome (PSIS) are initially referred for hypoglycemia during the neonatal period or growth retardation during childhood. PSIS is either isolated (nonsyndromic) or associated with extra-pituitary malformations (syndromic).

Objective

To compare baseline characteristics and long-term evolution in patients with PSIS according to the initial presentation.

Study Design

Sixty-seven patients with PSIS were included. Data from subgroups were compared: neonates (n = 10) versus growth retardation patients (n = 47), and syndromic (n = 32) versus nonsyndromic patients (n = 35).

Results

Neonates displayed a more severe hormonal and radiological phenotype than children referred for growth retardation, with a higher incidence of multiple hormonal deficiencies (100% versus 34%; P = 0.0005) and a nonvisible anterior pituitary lobe (33% versus 2%; P = 0.0017). Regular follow-up of growth might have allowed earlier diagnosis in the children with growth retardation, as decreased growth velocity and growth retardation were present respectively 3 and 2 years before referral. We documented a progressive worsening of endocrine impairment throughout childhood in these patients. Presence of extra-pituitary malformations (found in 48%) was not associated with more severe hormonal and radiological characteristics. Growth under GH treatment was similar in the patient groups and did not vary according to the pituitary MRI findings.

Conclusions

PSIS diagnosed in the neonatal period has a particularly severe hormonal and radiological phenotype. The progressive worsening of endocrine impairment throughout childhood justifies periodic follow-up to check for additional hormonal deficiencies.

11.

Background

Controversy exists as to whether obesity constitutes a risk factor or a protective factor for the development of nosocomial infection (NI). According to the obesity paradox, there is evidence that moderate obesity is protective. In Mexico, few studies have focused on the distribution of nutritional status (NS) in the hospital setting.

Objectives

The aim of this study was to estimate the distribution of NS and the prevalence of NI among adult elective surgery (ES) patients, and to compare clinical and anthropometric characteristics and length of stay (LOS) between obese and non-obese patients and between patients with and without NI.

Methods

We conducted a cross-sectional study with a sample (n = 82) of adult ES patients (21–59 years old) recruited from a tertiary-care hospital. The prevalence of each NS category and of NI was estimated, assessments were compared between groups (Mann-Whitney, Chi-squared or Fisher's exact test), and the association between preoperative risk factors and NI was evaluated using odds ratios.

Results

The distribution of subjects by NS category was: underweight (3.66%), normal-weight (28.05%), overweight (35.36%), and obese (32.93%). The prevalence of NI was 14.63%. The LOS was longer (p<0.001) for the patients who developed NI. The percentages of NI were: 33.3% in underweight, 18.52% in obese, 17.39% in normal-weight, and 6.90% in overweight patients.

Conclusion

The prevalence of overweight and obesity in adult ES patients is high. The highest prevalence of NI occurred in the underweight and obese patients. The presence of NI considerably increased the LOS, resulting in higher medical care costs.

12.

Background

Human adenoviruses (HAdVs) have been recognised as pathogens that cause a broad spectrum of diseases. The studies on HAdV infection among children with severe acute respiratory infection (SARI) are limited.

Objective

To investigate the prevalence, epidemiology, and genotype of HAdV among children with SARI in China.

Study Design

Nasopharyngeal aspirates (NPAs) or induced sputum (IS) was collected from hospitalised children with SARIs in Beijing (representing Northern China; n = 259) and Zhejiang Province (representing Eastern China; n = 293) from 2007 to 2010. The prevalence of HAdV was screened by polymerase chain reaction (PCR), followed by sequence typing of PCR fragments that targeted the second half of the hexon gene. In addition, co-infection with other human respiratory viruses, related epidemiological profiles and clinical presentations were investigated.

Results and Conclusions

In total, 76 (13.8%) of 552 SARI patients were positive for HAdV; the infection rates in Northern and Eastern China were 20.1% (n = 52) and 8.2% (n = 24), respectively. HAdV co-infection with other respiratory viruses was frequent (Northern China, 90.4%; Eastern China, 70.8%). The peak seasons for HAdV-B infection were winter and spring. Additionally, members of multiple species (Human mastadenovirus B, C, D and E) were circulating among paediatric patients with SARI, of which HAdV-B (34/52, 65.4%) and HAdV-C (20/24, 83.3%) were the most predominant in Northern and Eastern China, respectively. These findings provide a benchmark for future epidemiology and prevention strategies for HAdV.

13.

Aim

To describe polio patients visiting a polio clinic in Sweden, a country where polio vaccination was introduced in 1957.

Design

A consecutive cohort study.

Patients

Prior polio patients.

Methods

All patients (n = 865) visiting the polio clinic at Sahlgrenska University Hospital, Gothenburg, Sweden, between 1994 and 2012 were included in this study. Data at the first visit regarding patient characteristics, polio classification, electromyography findings, origin, assistive devices, gait speed and muscle strength were collected. Twenty-three patients were excluded because no polio diagnosis could be established, leaving 842 patients with confirmed polio in the study.

Results

More than twenty percent of the patients came from countries outside the Nordic region and were considerably younger than those from the Nordic region; most came from Asia and Africa, followed by Europe (outside the Nordic region). According to the clinical classification of polio, ninety-seven percent (n = 817) of all included patients had polio in the lower extremity, almost 53% (n = 444) in the upper extremity and 28% (n = 238) in the trunk. Compared with a sample of the normal population, the polio patients walked 61–71% slower and were 53–77% weaker in knee and foot muscle strength as well as grip strength.

Conclusion

The influx of younger polio patients from countries with different cultures may pose a challenge for the multi-professional teams working with post-polio rehabilitation and is important to consider when planning the care of polio patients in the coming years.

14.

Importance

The fellow eye of patients with unilateral neovascular age-related macular degeneration (nAMD) is at increased risk of developing late AMD. Several cohort studies have evaluated the prevalence of pseudodrusen and the association between pseudodrusen and late AMD in the fellow eye of patients with unilateral nAMD. However, these studies have limited sample sizes and their results are inconsistent.

Objective

To evaluate the prevalence rate of pseudodrusen, and the association between pseudodrusen and incidence of late AMD (nAMD and geographic atrophy (GA)) in the fellow eye of patients with unilateral nAMD.

Data Sources

The PubMed, EMBASE, Web of Science, and Cochrane Library databases were searched up to July 2015, as well as other systematic reviews.

Study Selection

All cohort studies for pseudodrusen with late AMD in the fellow eye of patients with unilateral nAMD.

Data Extraction and Synthesis

The numbers of patients with and without pseudodrusen at baseline and the numbers of incident nAMD and GA during follow up among patients with and without pseudodrusen were independently extracted by 2 authors. The results were pooled using random-effects meta-analysis. Heterogeneity was assessed using the I2 test.

Main Outcome Measures

Prevalence rate of pseudodrusen, risk ratios (RRs) and their 95% confidence intervals (95% CIs) for associations between pseudodrusen and the incidence of nAMD and GA in the fellow eye.

Results

Five cohort studies (N = 677 patients) from 8 countries across 4 continents were included. The pooled prevalence rate of pseudodrusen in the fellow eye was 48.1% (95% CI: 36.7–59.5%, I2 = 87%). Pseudodrusen were associated with an increased risk of nAMD (RR = 1.54, 95% CI: 1.10–2.16, I2 = 42%), GA (RR = 4.70, 95% CI: 1.22–18.1, I2 = 64%), and late AMD (RR = 2.03, 95% CI: 1.35–3.06, I2 = 60%).

Conclusions

For patients with unilateral nAMD, pseudodrusen were present in about half of the fellow eyes. The presence of pseudodrusen was associated with a 1.5 times higher risk of developing nAMD, a 4.7 times higher risk of developing GA, and a 2 times higher risk of developing late AMD. Pseudodrusen should be considered in evaluating the risk of late AMD development; however, due to considerable heterogeneity across these studies, a larger study is needed to validate these findings.
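The random-effects pooling described under Data Extraction and Synthesis can be sketched as follows. This is a generic DerSimonian-Laird implementation with hypothetical study-level risk ratios and variances, not the five studies analysed above:

```python
import math

# Hedged sketch of DerSimonian-Laird random-effects pooling of log risk
# ratios, with an I^2 heterogeneity estimate. Inputs are hypothetical.

def dersimonian_laird(log_effects, variances):
    """Pool study-level log effect sizes; return (pooled, SE, I^2 %)."""
    w = [1 / v for v in variances]                       # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, log_effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, log_effects))
    df = len(log_effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                        # between-study variance
    w_re = [1 / (v + tau2) for v in variances]           # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, log_effects)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0  # I^2 in percent
    return pooled, se, i2

# hypothetical per-study risk ratios and variances of log(RR)
rrs = [1.3, 1.8, 1.2, 2.0, 1.5]
var_log = [0.05, 0.10, 0.08, 0.12, 0.06]
pooled, se, i2 = dersimonian_laird([math.log(r) for r in rrs], var_log)
lo, hi = math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se)
print(f"pooled RR = {math.exp(pooled):.2f} (95% CI {lo:.2f}-{hi:.2f}), I2 = {i2:.0f}%")
```

When the between-study variance estimate is zero (as with these hypothetical inputs), the random-effects result collapses to the fixed-effect pooled estimate.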

15.

Purpose

To evaluate postoperative metamorphopsia in macula-off rhegmatogenous retinal detachment (RRD) and its association with visual function, vision related quality of life, and optical coherence tomography (OCT) findings.

Methods

45 patients with primary macula-off RRD were included. At 12 months postoperatively, data were obtained on metamorphopsia using sine Amsler charts (SAC), best corrected visual acuity (BCVA), letter contrast sensitivity, color vision (saturated and desaturated color confusion indexes), critical print size, reading acuity, the 25-item National Eye Institute Visual Functioning Questionnaire (NEI-VFQ-25), and OCT.

Results

Metamorphopsia was present in 39 patients (88.6%), most of whom (n = 35, 77.8%) showed only mild metamorphopsia (SAC score = 1). Patients with metamorphopsia had significantly worse postoperative BCVA (p = 0.02), critical print size (p<0.0005), and reading acuity (p = 0.001) than patients without metamorphopsia. Other visual function outcomes and the NEI-VFQ-25 overall composite score were also somewhat lower in patients with metamorphopsia, but these differences did not reach statistical significance. No association with OCT findings was found.

Conclusion

The prevalence of postoperative metamorphopsia in macula-off RRD patients is high; however, the degree of metamorphopsia is relatively low. When metamorphopsia is present, visual functions seem to be compromised, while vision-related quality of life is only mildly affected.

17.

Background

State-level estimates from the Centers for Disease Control and Prevention (CDC) underestimate the obesity epidemic because they use self-reported height and weight. We describe a novel bias-correction method and produce corrected state-level estimates of obesity and severe obesity.

Methods

Using non-parametric statistical matching, we adjusted self-reported data from the Behavioral Risk Factor Surveillance System (BRFSS) 2013 (n = 386,795) using measured data from the National Health and Nutrition Examination Survey (NHANES) (n = 16,924). We validated our national estimates against NHANES and estimated bias-corrected state-specific prevalence of obesity (BMI≥30) and severe obesity (BMI≥35). We compared these results with previous adjustment methods.

Results

Compared to NHANES, self-reported BRFSS data underestimated the national prevalence of obesity by 16% (28.67% vs 34.01%) and severe obesity by 23% (11.03% vs 14.26%). Our method was not significantly different from NHANES for obesity or severe obesity, while previous methods underestimated both. Only four states had a corrected obesity prevalence below 30%, while four exceeded 40%; in contrast, most states were below 30% in CDC maps.

Conclusions

Twelve million adults with obesity (including 6.7 million with severe obesity) were misclassified by CDC state-level estimates. Previous bias-correction methods also resulted in underestimates. Accurate state-level estimates are necessary to plan for resources to address the obesity epidemic.

18.

Objective

To examine change in county-level adult obesity prevalence between 2004 and 2009 and identify associated community characteristics.

Methods

Change in county-level adult (≥20 years) obesity prevalence was calculated for a 5-year period (2004–2009). Community measures of economic, healthcare, recreational, food environment, population structure, and education contexts were also calculated. Regression analysis was used to assess community characteristics associated (p<0.01) with change in adult obesity prevalence.

Results

Mean ± SD change in obesity prevalence was 5.1 ± 2.4%. Obesity prevalence decreased in 1.4% (n = 44) and increased in 98% (n = 3,060) of counties from 2004 to 2009. Both baseline levels of and increases in physically inactive adults were associated with greater increases in obesity prevalence, while baseline levels of and increases in physician density and grocery store/supercenter density were related to smaller increases in obesity rates. Baseline Hispanic population share was negatively linked to changing obesity levels, while places with greater Hispanic population growth saw greater increases in obesity.

Conclusions

Most counties in the U.S. experienced increases in adult obesity prevalence from 2004 to 2009. Findings suggest that community-based interventions targeting adult obesity need to incorporate a range of community factors, such as levels of physical inactivity, access to physicians, availability of food outlets, and ethnic/racial population composition.

19.

Introduction

Postnatal depression (PND) is one of the most common psychopathologies and is considered a serious public health issue because of its devastating effects on the mother, the family, and the infant or child.

Objective

To identify socio-demographic, obstetric and pregnancy-outcome predictors of postnatal depression (PND) among rural postnatal women in Karnataka state, India.

Design

Hospital-based analytical cross-sectional study.

Setting

A rural tertiary care hospital of Mandya District, Karnataka state, India.

Sample

A sample of 102 women, sized on the basis of the estimated PND prevalence, who attended postnatal follow-up between the 4th and 10th week of lactation.

Method

Study participants were interviewed using the validated Kannada version of the Edinburgh Postnatal Depression Scale (EPDS). A cut-off score of ≥13 was used to define high risk of PND. The percentage of women at risk of PND was estimated, and differences according to socio-demographic, obstetric and pregnancy-outcome factors were described. Logistic regression was applied to identify independent predictors of PND risk.

Main Outcome Measures

Prevalence, odds ratio (OR) and adjusted OR (adjOR) of PND.

Results

Prevalence of PND was 31.4% (95% CI 22.7–41.4%). PND showed significant (P<0.05) associations with joint family, working women, non-farmer husbands, poverty, a female baby, and pregnancy complications or known medical illness. In binomial logistic regression, poverty (adjOR 11.95, 95% CI 1.36–105), birth of a female baby (adjOR 3.6, 95% CI 1.26–10.23) and pregnancy complications or known medical illness (adjOR 17.4, 95% CI 2.5–121.2) remained independent predictors of PND.

Conclusion

Risk of PND among rural postnatal women was high (31.4%). Birth of a female baby, poverty, and pregnancy complications or known medical illness predicted high PND risk. PND screening should be an integral part of postnatal care. Capacity building of grass-roots-level workers and feasibility trials of PND screening by them are needed.
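The unadjusted odds ratios behind results like these come from a 2×2 table. A minimal sketch with a Woolf (log-based) 95% CI and hypothetical counts (not the study's data):

```python
import math

# Hedged sketch: unadjusted odds ratio from a 2x2 table with a Woolf
# (log-scale) confidence interval. All counts below are hypothetical.

def odds_ratio_ci(a, b, c, d, z=1.96):
    """2x2 table: a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)        # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# hypothetical counts: PND among mothers of female vs male babies
or_, lo, hi = odds_ratio_ci(a=20, b=30, c=12, d=40)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

Adjusted ORs like those in the abstract would instead come from fitting a multivariable logistic regression; the table-based OR is the unadjusted starting point.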

20.

Objectives

The aim of this study was to compare conventional versus steerable catheter guided coronary sinus (CS) cannulation in patients with advanced heart failure undergoing cardiac resynchronization therapy (CRT).

Background

Steerable catheter guided coronary sinus cannulation could reduce fluoroscopy time and contrast medium use during CRT implantation.

Methods

176 consecutive patients with ischemic and non-ischemic heart failure undergoing CRT implantation from January 2008 to December 2012 at the University Hospital of Cologne were identified. During the study period two concurrent CS cannulation techniques were used: the standard CS cannulation technique (standard-group, n = 113) and CS cannulation using a steerable electrophysiology (EP) catheter (EPCath-group, n = 63). Propensity-score matched pairs of conventional and EP-catheter guided CS cannulation made up the study population (n = 59 pairs). Primary endpoints were total fluoroscopy time and the amount of contrast medium used during the procedure.

Results

The total fluoroscopy time was 30.9 min (interquartile range (IQR), 19.9–44.0 min) in the standard-group and 23.4 min (IQR, 14.2–34.2 min) in the EPCath-group (P = 0.011). More contrast medium was used in the standard-group (60.0 ml; IQR, 30.0–100 ml) than in the EPCath-group (25.0 ml; IQR, 20.0–50.0 ml) (P<0.001).

Conclusions

Use of a steerable EP catheter was associated with a significant reduction in fluoroscopy time and contrast medium use in patients undergoing CRT implantation.
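The propensity-score matching used to build the 59 study pairs can be illustrated by its pairing step. This sketch shows greedy 1:1 nearest-neighbour matching on precomputed scores within a caliper; the scores are hypothetical, and a real analysis would first estimate them with logistic regression on baseline covariates:

```python
# Hedged sketch: greedy 1:1 nearest-neighbour matching on a propensity
# score within a caliper. Scores below are hypothetical.

def greedy_match(treated, controls, caliper=0.05):
    """Pair each treated score with the closest unused control score,
    skipping pairs whose score difference exceeds the caliper."""
    pairs = []
    available = list(controls)
    for t in sorted(treated):
        if not available:
            break
        best = min(available, key=lambda c: abs(c - t))
        if abs(best - t) <= caliper:
            pairs.append((t, best))
            available.remove(best)   # each control is used at most once
    return pairs

treated_scores = [0.30, 0.42, 0.55, 0.61]           # e.g. EPCath patients
control_scores = [0.28, 0.33, 0.44, 0.58, 0.70, 0.63]  # e.g. standard patients
pairs = greedy_match(treated_scores, control_scores)
print(pairs)
```

Matching on the score rather than on all covariates at once is what makes the two groups comparable on baseline characteristics before the endpoints are compared.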
