20 similar documents found (search time: 203 ms)
1.
Marlene Perignon Marion Fiorentino Khov Kuong Kurt Burja Megan Parker Sek Sisokhom Chhoun Chamnan Jacques Berger Frank T. Wieringa. PLoS ONE, 2014, 9(11)
Background
Nutrition is one of many factors affecting the cognitive development of children. In Cambodia in 2010, 55% of children under 5 y were anemic and 40% were stunted. Currently, no data exist on the nutritional status of Cambodian school-aged children, or on how malnutrition potentially affects their cognitive development.
Objective
To assess the anthropometric and micronutrient status (iron, vitamin A, zinc, iodine) of Cambodian schoolchildren and their associations with cognitive performance.
Methods
Schoolchildren aged 6–16 y (n = 2443) from 20 primary schools in Cambodia were recruited. Anthropometry, hemoglobin, serum ferritin, transferrin receptors, retinol-binding protein and zinc concentrations, inflammation status, urinary iodine concentration and parasite infection were measured. Socio-economic data were collected in a sub-group of children (n = 616). Cognitive performance was assessed using Raven's Colored Progressive Matrices (RCPM) and block design and picture completion, two standardized tests from the Wechsler Intelligence Scale for Children (WISC-III).
Results
The prevalence of anemia, iron, zinc, iodine and vitamin A deficiency was 15.7%, 51.2%, 92.8%, 17.3% and 0.7%, respectively. The prevalence of stunting was 40.0%, including 10.9% severe stunting. Stunted children scored significantly lower than non-stunted children on all tests. In the RCPM test, boys with iron-deficiency anemia had lower scores than boys with normal iron status (−1.46, p<0.05). In the picture completion test, children with normal iron status tended to score higher than iron-deficient children with anemia (−0.81; p = 0.067) or without anemia (−0.49; p = 0.064). Parasite infection was associated with an increased risk of scoring below the median value in the block design test (OR = 1.62; p<0.05), and with lower scores in the other tests, for girls only (both p<0.05).
Conclusion
The poor cognitive performance of Cambodian schoolchildren was multifactorial and significantly associated with long-term (stunting) and current (iron status) nutritional status indicators, as well as with parasite infection. A life-cycle approach with programs to improve nutrition in early life and at school age could contribute to optimal cognitive performance.
2.
Nisreen A. Alwan Janet E. Cade Darren C. Greenwood John Deanfield Debbie A. Lawlor. PLoS ONE, 2014, 9(1)
Background
Iron deficiency is common during pregnancy. Experimental animal studies suggest that it increases cardiovascular risk in the offspring.
Objective
To examine the relationships of maternal dietary and supplemental iron intake and hemoglobin during pregnancy with offspring arterial stiffness (measured by carotid-radial pulse wave velocity), endothelial function (measured by brachial artery flow-mediated dilatation), blood pressure, and adiposity (measured by body mass index); to test for mediation by cord ferritin, birth weight, gestational age, and child dietary iron intake; and to test for effect modification by maternal vitamin C intake and offspring sex.
Design
Prospective data from 2958 mother–child pairs, with children aged 10 years, enrolled in an English birth cohort, the Avon Longitudinal Study of Parents and Children (ALSPAC), were analysed.
Results
2639 (89.2%) mothers reported dietary iron intake in pregnancy below the UK reference nutrient intake of 14.8 mg/day. 1328 (44.9%) reported taking iron supplements, and 129 (4.4%) were anemic by 18 weeks of gestation. No associations were observed, apart from an association between maternal supplemental iron intake and offspring systolic blood pressure (−0.8 mmHg, 99% CI −1.7 to 0, P = 0.01 in the sample with all relevant data observed, and −0.7 mmHg, 99% CI −1.3 to 0, P = 0.008 in the sample with missing data imputed).
Conclusion
There was no evidence of an association between maternal dietary iron intake in pregnancy, or maternal hemoglobin concentration (which is less likely to be biased by subjective reporting), and offspring outcomes. There was a modest inverse association between maternal iron supplement intake during pregnancy and offspring systolic blood pressure at 10 years.
3.
Background
Iron deficiency anemia (IDA) is a global public health problem among school-age children that retards psychomotor development and impairs cognitive performance. There are limited data on its prevalence and risk factors.
Objective
The aim of this study was to determine the prevalence, severity, and predictors of nutritional IDA in school-age children in Southwest Ethiopia.
Methodology
A community-based cross-sectional study was conducted in Jimma Town, Southwest Ethiopia, from April to July 2013. A total of 616 school children aged 6 to 12 years were included in the study using a multistage sampling technique. A structured questionnaire was used to collect sociodemographic data. Five milliliters of venous blood were collected from each child for hematological examinations. Anemia was defined as a hemoglobin level lower than 11.5 g/dl for ages 5–11 years and lower than 12 g/dl for ages 12–15 years. Iron deficiency anemia was defined as serum iron below 10 µmol/L and ferritin below 15 µg/L. Moreover, a fresh stool specimen was collected for diagnosis of intestinal parasitic infection. Stained thick and thin blood films were examined for detection of Plasmodium infection and study of red blood cell morphology. Dietary patterns of the study subjects were assessed using a food frequency questionnaire, and anthropometric measurements were taken. Data were analyzed using SPSS V-20.0 for Windows.
Results
Overall, the prevalence of anemia was 43.7%, and that of IDA was 37.4%. Not consuming protein-source foods [AOR = 2.30, 95% CI (1.04, 5.14)], not consuming dairy products [AOR = 1.83, 95% CI (1.14, 5.14)], not consuming discretionary calories [AOR = 2.77, 95% CI (1.42, 5.40)], low family income [AOR = 6.14, 95% CI (2.90, 12.9)] and intestinal parasitic infections [AOR = 1.45, 95% CI (1.23, 5.27)] were predictors of IDA.
Conclusion
Iron deficiency anemia is a moderate public health problem in the study site. Dietary deficiencies and intestinal parasitic infections were predictors of IDA. Therefore, emphasis should be given to strategies that address these risk factors for IDA.
4.
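The anemia and IDA definitions in the study's Methods amount to simple threshold logic. A minimal sketch of that classification (the function name and return labels are ours, not the study's; cutoffs are those stated above, with ferritin expressed in µg/L):

```python
def classify_child(age_years, hb_g_dl, serum_iron_umol_l, ferritin_ug_l):
    """Classify a child using the study's definitions:
    anemia = Hb < 11.5 g/dL (ages 5-11) or < 12 g/dL (ages 12-15);
    iron deficiency = serum iron < 10 umol/L and ferritin < 15 ug/L."""
    hb_cutoff = 11.5 if age_years < 12 else 12.0  # g/dL, by WHO age band
    anemic = hb_g_dl < hb_cutoff
    iron_deficient = serum_iron_umol_l < 10 and ferritin_ug_l < 15
    if anemic and iron_deficient:
        return "iron deficiency anemia"
    if anemic:
        return "anemia without iron deficiency"
    return "no anemia"
```

For example, an 8-year-old with Hb 10.0 g/dL, serum iron 8 µmol/L and ferritin 12 µg/L would be classified as having iron deficiency anemia under these rules.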
Dulciene Maria Magalhaes Queiroz Paul R. Harris Ian R. Sanderson Henry J. Windle Marjorie M. Walker Andreia Maria Camargos Rocha Gifone Aguiar Rocha Simone Diniz Carvalho Paulo Fernando Souto Bittencourt Lucia Porto Fonseca de Castro Andrea Villagrán Carolina Serrano Dermot Kelleher Jean E. Crabtree. PLoS ONE, 2013, 8(7)
Objective
Iron deficiency (ID) and iron deficiency anaemia (IDA) are major global public health problems, particularly in developing countries. Whilst an association between H. pylori infection and ID/IDA has been proposed in the literature, there is currently no consensus. We studied the effects of H. pylori infection on ID/IDA in a cohort of children undergoing upper gastrointestinal endoscopy for upper abdominal pain in two developing countries and one developed country.
Methods
In total, 311 children (mean age 10.7±3.2 years) from Latin America (Belo Horizonte, Brazil, n = 125; Santiago, Chile, n = 105) and London, UK (n = 81) were studied. Gastric and duodenal biopsies were obtained for evaluation of histology and H. pylori status, and blood samples for parameters of ID/IDA.
Results
The prevalence of H. pylori infection was 27.7%, being significantly higher (p<0.001) in Latin America (35%) than in the UK (7%). Multiple linear regression models revealed H. pylori infection as a significant predictor of low ferritin and haemoglobin concentrations in children from Latin America. A negative correlation was observed between MCV (r = −0.26; p = 0.01) and MCH (r = −0.27; p = 0.01) values and the degree of antral chronic inflammation, and between MCH and the degree of corpus chronic (r = −0.29, p = 0.008) and active (r = −0.27, p = 0.002) inflammation.
Conclusions
This study demonstrates that H. pylori infection in children influences serum ferritin and haemoglobin concentrations, markers of early depletion of iron stores and of anaemia, respectively.
5.
Alessio Aghemo Eleonora Grassi Maria Grazia Rumi Roberta D'Ambrosio Enrico Galmozzi Elisabetta Degasperi Davide Castaldi Roberta Soffredini Massimo Colombo. PLoS ONE, 2014, 9(4)
Background
Severe anemia is a common side effect of pegylated interferon plus ribavirin (PR) and telaprevir (TVR) in hepatitis C virus (HCV) genotype 1 patients with advanced fibrosis or cirrhosis (F3–F4). Inosine triphosphatase (ITPA) genetic variants are associated with RBV-induced anemia and dose reduction.
Aim
To test the association of ITPA polymorphisms rs1127354 and rs7270101 with hemoglobin (Hb) decline, need for RBV dose reduction (RBV DR), erythropoietin (EPO) support and blood transfusions during the first 12 weeks of TVR triple therapy.
Materials and Methods
69 consecutive HCV-1 patients (mean age 57 years) with F3–F4 who received PR and TVR were genotyped for the ITPA polymorphisms rs1127354 and rs7270101. Estimated ITPA deficiency was graded by severity (0–3: no deficiency, mild, moderate, severe).
Results
ITPA deficiency was absent in 48 patients (70%), mild in 12 (17%) and moderate in 9 (13%). Mean week-4 Hb decline was higher in non-ITPA-deficient patients (3.85 g/dL) than in mildly or moderately ITPA-deficient patients (3.07 g/dL and 1.67 g/dL, p<0.0001). Grade 3–4 anemia developed in 81% of non-ITPA-deficient patients versus 67% of mildly deficient and 55% of moderately deficient patients (p = ns). Grade of ITPA deficiency was not associated with RBV DR (no deficiency: 60%, mild: 58%, moderate: 67%; p = ns), EPO use (no deficiency: 65%, mild: 58%, moderate: 56%; p = ns) or need for blood transfusion (no deficiency: 27%, mild: 17%, moderate: 33%; p = ns).
Conclusions
In patients with F3–F4 chronic hepatitis C receiving TVR-based therapy, ITPA genotype does not affect the management of early anemia.
6.
Amy Woods Laura A. Garvican-Lewis Philo U. Saunders Greg Lovell David Hughes Ruth Fazakerley Bev Anderson Christopher J. Gore Kevin G. Thompson. PLoS ONE, 2014, 9(9)
Purpose
To determine the effect of intravenous iron supplementation on performance, fatigue and overall mood in runners without clinical iron deficiency.
Methods
Fourteen distance runners with serum ferritin 30–100 µg·L−1 were randomly assigned to receive three blinded injections of intravenous ferric carboxymaltose (2 mL, 100 mg; IRON) or normal saline (PLACEBO) over four weeks (weeks 0, 2, 4). Athletes performed a 3,000 m time trial and a 10×400 m monitored training session on consecutive days at week 0 and again following each injection. Hemoglobin mass (Hbmass) was assessed via carbon monoxide rebreathing at weeks 0 and 6. Fatigue and mood were determined bi-weekly until week 6 via the Total Fatigue Score (TFS) and Total Mood Disturbance (TMD), using the Brief Fatigue Inventory and the Brunel Mood Scale. Data were analyzed using magnitude-based inferences, based on the unequal-variances t-statistic and Cohen's effect sizes (ES).
Results
Serum ferritin increased in IRON only (week 0: 62.8±21.9, week 4: 128.1±46.6 µg·L−1; p = 0.002) and remained elevated two weeks after the final injection (127.0±66.3 µg·L−1, p = 0.01), without significant changes in Hbmass. Supplementation had a moderate effect on TMD in IRON (ES −0.77), with scores at week 6 lower than PLACEBO (ES −1.58, p = 0.02). Similarly, at week 6, TFS was significantly improved in IRON vs. PLACEBO (ES −1.54, p = 0.05). There were no significant improvements in 3,000 m time in either group (week 0 vs. week 4; IRON: 625.6±55.5 s vs. 625.4±52.7 s; PLACEBO: 624.8±47.2 s vs. 639.1±59.7 s), but IRON reduced their average time for the 10×400 m training session at week 2 (week 0: 78.0±6.6 s, week 2: 77.2±6.3 s; ES −0.20, p = 0.004).
Conclusion
During 6 weeks of training, intravenous iron supplementation improved the perceived fatigue and mood of trained athletes with no clinical iron deficiency, without concurrent improvements in oxygen-transport capacity or performance.
7.
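The effect sizes (ES) quoted in the study above are Cohen's d values, i.e. standardized mean differences. A minimal sketch of the pooled-standard-deviation form (the study's exact computation may differ, e.g. in its variance pooling):

```python
import math

def cohens_d(x, y):
    """Cohen's d between two samples, using a pooled standard
    deviation weighted by each group's degrees of freedom."""
    nx, ny = len(x), len(y)
    mx = sum(x) / nx
    my = sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)  # sample variance of x
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)  # sample variance of y
    pooled_sd = math.sqrt(((nx - 1) * vx + (ny - 1) * vy) / (nx + ny - 2))
    return (mx - my) / pooled_sd
```

A negative d, as in the TMD and TFS results above, indicates the first group's mean is below the second's, here in units of the pooled standard deviation.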
Emanuel Zitt Gisela Sturm Florian Kronenberg Ulrich Neyer Florian Knoll Karl Lhotta Günter Weiss. PLoS ONE, 2014, 9(12)
Background
Studies on the association between iron supplementation and mortality in dialysis patients are rare and conflicting.
Methods
In our observational single-center cohort study (INVOR study), we prospectively studied 235 incident dialysis patients. Time-dependent Cox proportional hazards models using all measured laboratory values over up to 7.6 years were applied to study the association between iron supplementation and all-cause, cardiovascular and sepsis-related mortality. Furthermore, the time-dependent association of ferritin levels with mortality in patients with normal C-reactive protein (CRP) levels (<0.5 mg/dL) and elevated CRP levels (≥0.5 mg/dL) was evaluated using non-linear P-splines to allow flexible modeling of the association.
Results
One hundred and ninety-one (81.3%) patients received intravenous iron and 13 (5.5%) oral iron, whereas 31 (13.2%) patients were never supplemented with iron throughout the observation period. Eighty-two (35%) patients died during a median follow-up of 34 months, 38 due to cardiovascular events and 21 from sepsis. Baseline CRP levels did not differ between patients with and without iron supplementation. However, baseline serum ferritin levels were lower in patients receiving iron during follow-up (median 93 vs 251 ng/mL, p<0.001). Iron supplementation was associated with significantly reduced all-cause mortality [HR (95% CI): 0.22 (0.08–0.58); p = 0.002] and with a nonsignificant reduction in cardiovascular and sepsis-related mortality [HR (95% CI): 0.31 (0.09–1.04); p = 0.06]. Increasing ferritin concentrations in patients with normal CRP were associated with decreasing mortality, whereas in patients with elevated CRP, ferritin levels >800 ng/mL were linked with increased mortality.
Conclusions
Iron supplementation is associated with reduced all-cause mortality in incident dialysis patients. While serum ferritin levels up to 800 ng/mL appear to be safe, higher ferritin levels are associated with increased mortality in the setting of concomitant inflammation.
8.
Background
Iron supplementation is employed to treat post-malarial anaemia in environments where iron deficiency is common. Malaria induces an intense inflammatory reaction that stalls reticuloendothelial macrophage iron recycling from haemolysed red blood cells and inhibits oral iron absorption, but the magnitude and duration of these effects are unclear.
Methodology/Principal Findings
We examined the red blood cell incorporation of orally administered stable isotopes of iron, comparing incorporation between age-matched children aged 18 to 36 months with either post-malarial anaemia (n = 37) or presumed iron deficiency anaemia alone (n = 36). All children were supplemented for 30 days with 2 mg/kg elemental iron as liquid iron sulphate and were administered 57Fe and 58Fe on days 1 and 15 of supplementation, respectively. 57Fe and 58Fe incorporation were significantly reduced (8% vs. 28%: p<0.001 and 14% vs. 26%: p = 0.045) in the malaria vs. non-malaria groups. There was a significantly greater haemoglobin response in the malaria group at both day 15 (p = 0.001) and day 30 (p<0.001), with regression-estimated greater changes in haemoglobin of 7.2 g/L (s.e. 2.0) and 10.1 g/L (s.e. 2.5), respectively.
Conclusion/Significance
Post-malarial anaemia is associated with a better haemoglobin recovery despite a significant depressant effect on oral iron incorporation, which may indicate that early erythropoietic iron need is met by iron recycling rather than by oral iron. Supplemental iron administration is of questionable utility within 2 weeks of clinical malaria in children with mild or moderate anaemia.
9.
Sunil Sazawal Usha Dhingra Pratibha Dhingra Girish Hiremath Archana Sarkar Arup Dutta Venugopal P. Menon Robert E. Black. PLoS ONE, 2010, 5(8)
Background
Multiple micronutrient deficiencies are highly prevalent among preschool children and often lead to anemia and growth faltering. Given the limited success of supplementation and health education programs, fortification of foods could be a viable and sustainable option. We report results from a community-based, double-masked, randomized trial among children aged 1–4 years evaluating the effects of micronutrients (especially zinc and iron) delivered through fortified milk on growth, anemia and iron status markers, as part of a four-group study design running two studies simultaneously.
Methods and Findings
Enrolled children (n = 633) were randomly allocated to receive either micronutrient-fortified milk (MN = 316) or control milk (Co = 317). The MN milk provided an additional 7.8 mg zinc, 9.6 mg iron, 4.2 µg selenium, 0.27 mg copper, 156 µg vitamin A, 40.2 mg vitamin C, and 7.5 mg vitamin E per day (three servings) for one year. Anthropometry was recorded at baseline, mid-study and end-study. Hematological parameters were estimated at baseline and end-study. Both groups were comparable at baseline. Compliance was over 85% and did not vary between groups. Compared to children consuming Co milk, children consuming MN milk showed significant improvement in weight gain (difference of means: 0.21 kg/year; 95% confidence interval [CI] 0.12 to 0.31, p<0.001) and height gain (difference of means: 0.51 cm/year; 95% CI 0.27 to 0.75, p<0.001). Mean hemoglobin (Hb) (difference of 13.6 g/L; 95% CI 11.1 to 16.0, p<0.001) and serum ferritin levels (difference of 7.9 µg/L; 95% CI 5.4 to 10.5, p<0.001) also improved. Children in the MN group had an 88% (odds ratio = 0.12, 95% CI 0.08 to 0.20, p<0.001) lower risk of iron deficiency anemia.
Conclusions/Significance
Milk provides an acceptable and effective vehicle for delivery of specific micronutrients, especially zinc and iron. The micronutrient bundle improved growth and iron status and reduced anemia in children 1–4 years old.
Trial Registration
ClinicalTrials.gov NCT00255385
10.
Purpose
We sought to estimate the risks of adverse obstetric and disease outcomes associated with severe thrombocytopenia in pregnant women with aplastic anemia (AA).
Methods
In a retrospective study, we compared demographics, clinical characteristics, laboratory results, and outcomes between severe thrombocytopenia (ST) and non-severe thrombocytopenia (non-ST) groups comprising pregnant women with AA.
Results
Of 61 AA patients, 43 (70%) were diagnosed with AA before pregnancy and 18 (30%) during pregnancy. The ST group exhibited a lower gestational age at the nadir of platelet count (26.0 versus 37.0 weeks, p<0.001) and at delivery (37.3 versus 39.1 weeks, p = 0.008), and a higher rate of gum bleeding (33.8 versus 7.7%, p = 0.015) than the non-ST group. In addition, the ST group received more transfusions during pregnancy (72.7 versus 15.4%, p<0.001) and the postpartum period (45.0 versus 2.7%, p<0.001), and more bone marrow transplants after delivery (25.0 versus 0.0%, p<0.001) than the non-ST group. The ST group had higher odds of composite disease complications (OR, 9.63; 95% CI, 2.82–32.9; p<0.001) and composite obstetric complications (OR, 6.78; 95% CI, 2.11–21.8; p = 0.001) than the non-ST group.
Conclusions
Severe thrombocytopenia is more strongly associated with obstetric and disease complications than is non-severe thrombocytopenia in pregnant women with AA.
11.
Chun-An Chen Meng-Yao Lu Shinn-Forng Peng Kai-Hsin Lin Hsiu-Hao Chang Yung-Li Yang Shiann-Tarng Jou Dong-Tsamn Lin Yen-Bin Liu Herng-Er Horng Hong-Chang Yang Jou-Kou Wang Mei-Hwan Wu Chau-Chung Wu. PLoS ONE, 2014, 9(1)
Background
Patients with transfusion-dependent beta-thalassemia major (TM) are at risk for myocardial iron overload and cardiac complications. Spatial repolarization heterogeneity is known to be elevated in patients with certain cardiac diseases, but little is known about it in TM patients. The purpose of this study was to evaluate spatial repolarization heterogeneity in patients with TM, and to investigate the relationships between spatial repolarization heterogeneity, cardiac iron load, and adverse cardiac events.
Methods and Results
Fifty patients with TM and 55 control subjects underwent 64-channel magnetocardiography (MCG) to determine spatial repolarization heterogeneity, which was evaluated by a smoothness index of QTc (SI-QTc), a standard deviation of QTc (SD-QTc), and QTc dispersion. Left ventricular function and myocardial T2* values were assessed by cardiac magnetic resonance. Patients with TM had significantly greater SI-QTc, SD-QTc, and QTc dispersion than the control subjects (all p values <0.001). Spatial repolarization heterogeneity was even more pronounced in patients with significant iron overload (T2*<20 ms, n = 20) than in those with normal T2* (all p values <0.001). Log-transformed cardiac T2* correlated with SI-QTc (r = −0.609, p<0.001), SD-QTc (r = −0.572, p<0.001), and QTc dispersion (r = −0.622, p<0.001), while none of these indices was related to measurements of left ventricular geometry or function. At the time of study, 10 patients had either heart failure or arrhythmia. All three indices of repolarization heterogeneity were related to the presence of adverse cardiac events, with areas under the receiver operating characteristic curves ranging between 0.79 and 0.86, similar to that of cardiac T2*.
Conclusions
Multichannel MCG demonstrated that patients with TM had increased spatial repolarization heterogeneity, which is related to myocardial iron load and adverse cardiac events.
12.
Objective
To explore the feasibility of dual-source dual-energy computed tomography (DSDECT) for hepatic iron and fat separation in vivo.
Materials and Methods
All of the procedures in this study were approved by the Research Animal Resource Center of Shanghai Ruijin Hospital. Sixty rats that underwent DECT scanning were divided into a normal group, fatty liver group, liver iron group, and coexisting liver iron and fat group, according to Prussian blue and HE staining. The data for each group were reconstructed and post-processed with an iron-specific, three-material decomposition algorithm. The iron enhancement value and the virtual non-iron contrast value, which indicate overloaded liver iron and residual liver tissue, respectively, were measured. Spearman's correlation and one-way analysis of variance (ANOVA) were used to analyze correlations with the histopathological results and differences among groups, respectively.
Results
The iron enhancement values were positively correlated with the iron pathology grading (r = 0.729, p<0.001). Virtual non-iron contrast (VNC) values were negatively correlated with the fat pathology grading (r = −0.642, p<0.0001). The groups showed significantly different iron enhancement values and VNC values (F = 25.308, p<0.001 and F = 10.911, p<0.001, respectively). Among the groups, significant differences in iron enhancement values were observed only between the iron-present and iron-absent groups, and differences in VNC values only between the fat-present and fat-absent groups.
Conclusion
Separation of hepatic iron and fat by dual-energy material decomposition in vivo was feasible, even when they coexisted.
13.
Idro R Gwer S Williams TN Otieno T Uyoga S Fegan G Kager PA Maitland K Kirkham F Neville BG Newton CR. PLoS ONE, 2010, 5(11): e14001
Background
There are conflicting reports on whether iron deficiency changes susceptibility to seizures. We examined the hypothesis that iron deficiency is associated with an increased risk of acute seizures in children in a malaria-endemic area.
Methods
We recruited 133 children, aged 3–156 months, who presented to a district hospital on the Kenyan coast with acute seizures, and frequency-matched these to children of similar ages but without seizures. We defined iron deficiency according to the presence of malarial infection and evidence of inflammation. In patients with malaria, we defined iron deficiency as plasma ferritin <30 µg/L if plasma C-reactive protein (CRP) was <50 mg/L, or ferritin <273 µg/L if CRP ≥50 mg/L; in those without malaria, as ferritin <12 µg/L if CRP <10 mg/L, or ferritin <30 µg/L if CRP ≥10 mg/L. In addition, we performed a meta-analysis of case-control studies published in English between January 1966 and December 2009 and available through PubMed that examined the relationship between iron deficiency and febrile seizures in children.
Results
In our Kenyan case-control study, cases and controls were similar, except that more cases reported past seizures. Malaria was associated with two-thirds of all seizures. Eighty-one (30.5%) children had iron deficiency. Iron deficiency was neither associated with an increased risk of acute seizures (45/133 [33.8%] cases were iron deficient compared to 36/133 [27.1%] controls, p = 0.230) nor with status epilepticus, and it did not affect seizure semiology. Similar results were obtained when children with malaria, which is known to cause acute symptomatic seizures in addition to febrile seizures, were excluded. However, in a meta-analysis combining all eight case-control studies that have examined the association between iron deficiency and acute/febrile seizures to date, iron deficiency, described in 310/1,018 (30.5%) cases and in 230/1,049 (21.9%) controls, was associated with a significantly increased risk of seizures (weighted OR 1.79, 95% CI 1.03–3.09).
Conclusions
Iron deficiency is not associated with an increased risk of all acute seizures in children, but it is associated with febrile seizures. Further studies should examine the mechanisms involved and the implications for public health.
14.
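From the pooled counts quoted in the meta-analysis above (310/1,018 iron-deficient cases vs. 230/1,049 controls), a crude odds ratio can be computed directly; it comes out near 1.56, lower than the reported weighted OR of 1.79, because the meta-analysis weights the eight studies individually rather than pooling raw counts. A sketch using the standard log-OR normal approximation (Woolf method) for the confidence interval:

```python
import math

def odds_ratio_ci(exposed_cases, total_cases, exposed_controls, total_controls):
    """Crude odds ratio from pooled 2x2 counts, with a 95% CI from the
    normal approximation on the log-odds scale (Woolf method)."""
    a = exposed_cases                      # iron-deficient cases
    b = total_cases - exposed_cases       # non-deficient cases
    c = exposed_controls                  # iron-deficient controls
    d = total_controls - exposed_controls # non-deficient controls
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Pooled counts from the meta-analysis described above:
crude_or, lo, hi = odds_ratio_ci(310, 1018, 230, 1049)
```

The gap between this crude pooled estimate and the study-weighted 1.79 is a reminder that simply collapsing counts across heterogeneous studies can distort the effect (Simpson's paradox).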
15.
Ching-Hui Huang Chia-Chu Chang Chen-Ling Kuo Ching-Shan Huang Tzai-Wen Chiu Chih-Sheng Lin Chin-San Liu. PLoS ONE, 2014, 9(8)
Objective
Anemia is associated with high mortality and poor prognosis after acute coronary syndrome (ACS). Increased red cell distribution width (RDW) is a strong independent predictor of adverse outcomes in ACS. The common underlying mechanism for anemia and increased RDW is iron deficiency. It is not clear whether serum iron deficiency without anemia affects left ventricular (LV) performance after primary angioplasty for acute myocardial infarction (AMI). We investigated the prognostic value of serum iron concentration for LV ejection fraction (EF) at 6 months and its relationship to the thrombolysis in myocardial infarction (TIMI) risk score in post-MI patients.
Methods
We recruited 55 patients who were scheduled to undergo primary coronary balloon angioplasty after AMI and 54 age- and sex-matched volunteers. Serum iron concentration and interleukin-6 levels were measured before primary angioplasty. LVEF was measured by echocardiography at baseline and after 6 months. The TIMI risk score was calculated for risk stratification.
Results
Serum iron concentration was significantly lower in those in whom LVEF had not improved by ≥10% from baseline (52.7±24.1 versus 80.8±50.8 µg/dl, P = 0.016), regardless of hemoglobin level, and was significantly lower in the AMI group than in the control group (62.5±37.7 versus 103.0±38.1 µg/dl, P<0.001). Trend analysis revealed that serum iron concentration decreased as TIMI risk score increased (P = 0.002). In addition, lower serum iron concentrations were associated with higher levels of inflammatory markers. Multiple linear regression showed that baseline serum iron concentration predicted LV systolic function 6 months after primary angioplasty for AMI, even after adjusting for traditional prognostic factors.
Conclusion
Hypoferremia is not only a marker of inflammation but also a potential prognostic factor for LV systolic function after revascularization therapy for AMI, and may be a novel biomarker for therapeutic intervention.
16.
Fabiana Oliveira Bastos Bonato Marcelo Montebello Lemos José Luiz Cassiolato Maria Eugênia Fernandes Canziani. PLoS ONE, 2013, 8(6)
Background and Objectives
Sudden cardiac death is the most common cause of mortality in chronic kidney disease patients, and it occurs mostly due to ventricular arrhythmias. In this study, we aimed to investigate the prevalence of ventricular arrhythmia and the factors associated with its occurrence in nondialyzed chronic kidney disease patients.
Design, Setting, Participants and Measurements
This cross-sectional study evaluated 111 chronic kidney disease patients (estimated glomerular filtration rate 34.7±16.1 mL/min/1.73 m2, age 57±11.4 years, 60% male, 24% diabetic). Ventricular arrhythmia was assessed by 24-hour electrocardiogram. Left ventricular hypertrophy (echocardiogram), 24-hour ambulatory blood pressure monitoring, coronary artery calcification (multi-slice computed tomography), and laboratory parameters were also evaluated.
Results
Ventricular arrhythmia was found in 35% of the patients. Uncontrolled hypertension was observed in 21%, absence of nocturnal systolic dipping in 29%, left ventricular hypertrophy in 27%, systolic dysfunction in 10%, and coronary artery calcification in 49%. Patients with ventricular arrhythmia were older (p<0.001), predominantly men (p = 0.009), and had higher estimated glomerular filtration rates (p = 0.03) and hemoglobin (p = 0.005), and lower intact parathyroid hormone (p = 0.024) and triglycerides (p = 0.011), compared to patients without ventricular arrhythmia. In addition, a higher left ventricular mass index (p = 0.002) and coronary calcium score (p = 0.002), and a lower ejection fraction (p = 0.001), were observed among patients with ventricular arrhythmia. In multiple logistic regression analysis, older age, higher hemoglobin levels and reduced ejection fraction were independently related to the presence of ventricular arrhythmia.
Conclusions
Ventricular arrhythmia is prevalent in nondialyzed chronic kidney disease patients. Age, hemoglobin levels and ejection fraction were the factors associated with ventricular arrhythmia in these patients.
17.
Lauren Hund Christine A. Northrop-Clewes Ronald Nazario Dilora Suleymanova Lusine Mirzoyan Munira Irisova Marcello Pagano Joseph J. Valadez. PLoS ONE, 2013, 8(11)
Background
The Uzbekistan 1996 Demographic and Health Survey reported that 60.4% of women of reproductive age (WRA) had low hemoglobin concentrations (<120 g/L), making anemia an important public health problem. Fortification of wheat flour was identified as an appropriate intervention to address anemia, owing to the ubiquitous consumption of wheat flour. A National Flour Fortification Program (NFFP) was implemented in 2005.
Methodology/Principal Findings
After 3 years of the NFFP, a national survey using large-country lot quality assurance sampling was carried out to assess the iron, folate, hemoglobin and inflammation status of WRA; the coverage and knowledge of the fortified first-grade UzDonMakhsulot (UDM) flour/grey loaf program; and the consumption habits of women, to investigate the dietary factors associated with anemia. Estimated anemia prevalence was 34.4% (95% CI: 32.0, 36.7), iron depletion 47.5% (95% CI: 45.1, 49.9) and folate deficiency 28.8% (95% CI: 26.8, 30.8); the effect of inflammation was minimal (4% with CRP >5 mg/L). Severe anemia was more prevalent among folate-deficient than iron-depleted WRA. Presence of UDM first-grade flour or the grey loaf was reported in 71.3% of households. Among WRA, 32.1% were aware of UDM fortification; only 3.7% mentioned the benefits of fortification and 12.5% understood the causes of anemia. Consumption of heme iron-containing food (91%) and iron absorption enhancers (97%) was high, as was consumption of iron absorption inhibitors (95%).
Conclusions/Significance
The NFFP coincided with a substantial decline in the prevalence of anemia. Folate deficiency was a stronger predictor of severe anemia than iron depletion. However, the prevalence of iron depletion was high, suggesting that women are not consuming enough iron or that iron absorption is inhibited. Fortified products were prevalent throughout Uzbekistan, though UDM flour must be adequately fortified and monitored in the future. Knowledge of fortification and anemia was low, suggesting consumer education should be prioritized.
18.
Introduction
Few studies have examined the determinants of adverse outcomes in patients presenting with ascending cholangitis. The objective of this study was to examine factors associated with in-hospital mortality, prolonged length of stay (LOS) and increased hospital charges (HC) in patients presenting with acute cholangitis.
Methods
Within the Healthcare Cost and Utilization Project Nationwide Inpatient Sample (NIS), we focused on patients 18 years and older admitted to the emergency department with cholangitis as the primary diagnosis (1998–2009). Models were fitted to predict the likelihood of in-hospital mortality, prolonged LOS and increased HC. Covariates included race, day of admission, insurance status, socioeconomic status and other patient and hospital characteristics.
Results
Overall, a weighted estimate of 248,942 patients was admitted with acute cholangitis between 1998 and 2009, of whom 13,534 (5.4%) died during the admission. Multivariable analyses revealed that, relative to Caucasian patients, African American, Hispanic, and Asian and Pacific Islander patients were more likely to die (OR = 1.61, p<0.001; OR = 1.20, p = 0.01; and OR = 1.26, p = 0.008), to experience a prolonged LOS (OR = 1.77, p<0.001; OR = 1.30, p<0.001; and OR = 1.34, p<0.001), and to incur high HC (OR = 1.83, p<0.001; OR = 1.51, p<0.001; and OR = 1.56, p<0.001). Moreover, Medicaid and Medicare patients were more likely to die (OR = 1.64, p<0.001; OR = 1.24, p<0.001), to experience a prolonged LOS (OR = 1.74, p<0.001; OR = 1.25, p<0.001) and to incur high HC (OR = 1.23, p = 0.002; OR = 1.12, p = 0.002) compared with privately insured patients. In subgroup analysis, there were no differences for Medicare patients aged 65 years and over; however, those under 65, most of whom qualify through disability or end-stage renal disease, were more likely to experience the negative outcomes.
Conclusion
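For intuition about odds ratios like those reported above, it can help to push a baseline probability through an odds ratio. The snippet below is a back-of-the-envelope illustration only: it applies an OR to the overall 5.4% in-hospital mortality, which ignores the covariate adjustment in the study's multivariable models, so the resulting number is not a group-specific estimate from the NIS data.

```python
def apply_odds_ratio(p_baseline, odds_ratio):
    """Shift a baseline probability by an odds ratio:
    convert to odds, scale by the OR, convert back to a probability."""
    odds = p_baseline / (1 - p_baseline)
    new_odds = odds * odds_ratio
    return new_odds / (1 + new_odds)

# Illustration only: applying OR = 1.61 to the overall 5.4% mortality
# implies a probability of roughly 8.4%.
p = apply_odds_ratio(0.054, 1.61)
```

Note that for small baseline probabilities an OR approximates a risk ratio, but at 5.4% the two already diverge slightly (1.61 times 5.4% would be 8.7%, not 8.4%).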
Race and insurance status are independent predictors of in-hospital mortality and adverse outcomes in patients presenting with cholangitis. Whether these disparities are due to biological predisposition or to unequal quality of care requires further investigation. Regardless, efforts should be made to reduce these outcome disparities.
19.
Purpose
The occurrence of brushite stones has increased in recent years; however, the pathogenic factors driving the development of brushite stones remain unclear.
Methods
Twenty-eight brushite stone formers and 28 age-, sex- and BMI-matched healthy individuals were enrolled in this case-control study. Anthropometric, clinical and 24-h urinary parameters, as well as dietary intake from 7-day weighed food records, were assessed.
Results
Pure brushite stones were present in 46% of patients, while calcium oxalate was the major secondary stone component. Urinary pH and oxalate excretion were significantly higher, whereas urinary citrate was lower, in patients compared with healthy controls. Despite lower dietary intake, urinary calcium excretion was significantly higher in brushite stone patients. Binary logistic regression analysis revealed urinary pH >6.50 (OR 7.296; p = 0.035), calcium >6.40 mmol/24 h (OR 25.213; p = 0.001) and citrate excretion <2.600 mmol/24 h (OR 15.352; p = 0.005) as urinary risk factors for brushite stone formation. A total of 56% of patients exhibited distal renal tubular acidosis (dRTA). Urinary pH, calcium and citrate excretion did not differ significantly between patients with and without dRTA.
Conclusions
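As an aside on the three risk factors above: in a logistic model, effects add on the log-odds scale, so odds ratios combine multiplicatively. The sketch below combines the reported ORs under the assumption of independent effects, which is purely illustrative; the abstract reports no joint estimate, and the true combined effect would depend on correlations between the urinary parameters.

```python
import math

# Odds ratios for the three urinary risk factors, as reported in the abstract.
odds_ratios = {
    "urinary pH > 6.50": 7.296,
    "calcium > 6.40 mmol/24 h": 25.213,
    "citrate < 2.600 mmol/24 h": 15.352,
}

# Assuming independent effects (an assumption, not a result from the study),
# the implied odds ratio for carrying all three risk factors is the product,
# computed here as a sum on the log-odds scale.
combined_or = math.exp(sum(math.log(r) for r in odds_ratios.values()))
```

The resulting number (on the order of a few thousand) mainly illustrates why these three parameters dominate the model, not a clinically usable risk estimate.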
Hypercalciuria, diminished citrate excretion and elevated urinary pH turned out to be the major urinary determinants of brushite stone formation. Interestingly, urinary phosphate was not associated with urolithiasis. Increased urinary oxalate excretion, possibly due to decreased calcium intake, promotes the risk of mixed stone formation with calcium oxalate. Neither dietary factors nor dRTA can account for the hypercalciuria, higher urinary pH and diminished citrate excretion. Further research is needed to define the role of dRTA in brushite stone formation and to evaluate the hypothesis of an acquired acidification defect.
20.
Anna A. Wawer Linda J. Harvey Jack R. Dainty Natalia Perez-Moral Paul Sharp Susan J. Fairweather-Tait 《PloS one》2014,9(11)
Previous in vitro results indicated that alginate beads might be a useful vehicle for food iron fortification. A human study was undertaken to test the hypothesis that alginate enhances iron absorption. A randomised, single-blinded, cross-over trial was carried out in which iron absorption was measured from serum iron appearance after a test meal. Overnight-fasted volunteers (n = 15) were given a test meal of 200 g cola-flavoured jelly plus 21 mg iron as ferrous gluconate, either in alginate beads mixed into the jelly or in a capsule. Iron absorption was lower from the alginate beads than from ferrous gluconate (8.5% and 12.6% respectively, p = 0.003). Sub-group B (n = 9) consumed the test meals together with 600 mg calcium to determine whether alginate modified the inhibitory effect of calcium. Calcium reduced iron absorption from ferrous gluconate by 51%, from 11.5% to 5.6% (p = 0.014), and from alginate beads by 37%, from 8.3% to 5.2% (p = 0.009). In vitro studies using Caco-2 cells were designed to explore the reasons for the difference between the previous in vitro findings and the human study; these confirmed the inhibitory effect of alginate. Beads similar to those used in the human study were subjected to simulated gastrointestinal digestion, with and without cola jelly, and the digestate was applied to Caco-2 cells. Both alginate and cola jelly significantly reduced iron uptake into the cells, by 34% (p = 0.009) and 35% (p = 0.003) respectively. The combination of cola jelly and calcium produced a very low ferritin response, 16.5% (p<0.001) of that observed with ferrous gluconate alone. The results of these studies demonstrate that alginate beads are not a useful delivery system for soluble iron salts for the purpose of food fortification.
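The relative reductions quoted above (51% and 37%) follow directly from the fractional absorption figures; a minimal check of the arithmetic:

```python
def percent_reduction(before, after):
    """Relative reduction, expressed as a percentage of the starting value."""
    return 100 * (before - after) / before

# Calcium's effect on fractional iron absorption, using the figures above:
gluconate = percent_reduction(11.5, 5.6)      # ~51%
alginate_beads = percent_reduction(8.3, 5.2)  # ~37%
```

This is simple relative-change arithmetic on the reported means, not a reanalysis of the trial data.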