Similar Articles (20 results)
1.
The late health effects of low-dose-rate radiation exposure remain a serious public concern in the Fukushima area even four years after the accident at the Fukushima Daiichi Nuclear Power Plant (FNPP). To clarify the factors associated with residents’ risk perception of radiation exposure and consequent health effects, we conducted a survey in May and June 2014 among residents of Kawauchi village, which is located within 30 km of the FNPP. Of 285 residents, 85 (29.8%) answered that acute radiation syndrome might develop in residents after the accident, 154 (54.0%) responded that they had anxieties about the health effects of radiation on children, and 140 (49.1%) indicated that they had anxieties about the health effects of radiation on offspring. Furthermore, 107 (37.5%) residents answered that they had concerns about health effects in the general population from simply living for one year in an environment with an ambient dose rate of 0.23 μSv per hour, 149 (52.2%) reported that they were reluctant to eat locally produced foods, and 164 (57.5%) believed that adverse health effects would occur in the general population from eating 100 Bq per kg of mushrooms every day for one year. The present study shows that a marked bipolarization of risk perception about the health effects of radiation among residents could have a major impact on social well-being after the accident at the FNPP.

2.
The distribution of radiocesium was examined in bamboo shoots, Phyllostachys pubescens, collected from 10 sites located 41 to 1,140 km from the Fukushima Daiichi nuclear power plant, Japan, in the spring of 2012, one year after the Fukushima nuclear accident. Maximum activity concentrations of the radiocesium isotopes 134Cs and 137Cs in the edible bamboo shoot parts, 41 km from the Fukushima Daiichi plant, exceeded 15.3 and 21.8 kBq/kg (dry weight basis; 1.34 and 1.92 kBq/kg, fresh weight), respectively. In the radiocesium-contaminated samples, radiocesium activities were higher in the inner tip parts, including the upper edible parts and the apical culm sheath, than in the hardened culm sheath and underground basal parts. The radiocesium/potassium ratios also tended to be higher in the inner tip parts. Radiocesium activities increased with bamboo shoot length in another bamboo species, Phyllostachys bambusoides, suggesting that radiocesium accumulated in the inner tip parts during growth of the shoots.

3.
The aim of this large population-based cohort study was to explore the risk factors of age-related mortality in liver transplant recipients in Taiwan. Basic information and data on medical comorbidities for 2938 patients who received liver transplants between July 1, 1998, and December 31, 2012, were extracted from the National Health Insurance Research Database on the basis of ICD-9 codes. Mortality risks were analyzed after adjusting for preoperative comorbidities and compared among age cohorts. All patients were followed up until the study endpoint or death. The study ultimately included 2588 adults and 350 children [2068 (70.4%) male and 870 (29.6%) female patients]. The median age at transplantation was 52 (interquartile range, 43–58) years. Recipients were categorized into the following age cohorts: <20 (n = 350, 11.9%), 20–39 (n = 254, 8.6%), 40–59 (n = 1860, 63.3%), and ≥60 (n = 474, 16.1%) years. In the total population, 428 deaths occurred after liver transplantation, and the median follow-up period was 2.85 years (interquartile range, 1.2–5.5 years). Dialysis patients showed the highest risk of mortality irrespective of age. Further, the risk of death increased with age at transplantation. Older liver transplant recipients (≥60 years), especially dialysis patients, have a higher mortality rate, possibly because they have more medical comorbidities. Our findings should make clinicians aware of the need for better risk stratification among elderly liver transplantation candidates.

4.
In the wake of the 2011 Fukushima Daiichi Nuclear Power Station accident, evidence-based risk communication requires an understanding of radiation risk perception and of the effectiveness of risk-comparison information. We measured and characterized perceptions of dread risks and unknown risks regarding dietary radionuclides in residents of Fukushima, Tokyo, and Osaka to identify the primary factors among location, evacuation experience, gender, age, employment status, absence/presence of spouse, children and grandchildren, educational background, humanities/science courses, smoking habits, and various types of trustworthy information sources. We then evaluated the effects of these factors and of risk-comparison information on multiple outcomes, including subjective and objective understanding, perceived magnitude of risk, perceived accuracy of information, backlash against information, and risk acceptance. We also assessed how risk-comparison information affected these outcomes for people with high risk perception. Online questionnaires were completed by people (n = 9249) aged from 20 to 69 years in the three prefectures approximately 5 years after the accident. We gave each participant one of 15 combinations of numerical risk data and risk-comparison information, including information on standards, smoking-associated risk, and cancer risk, in accordance with Covello’s guidelines. Dread-risk perception among Fukushima residents with no experience of evacuation was much lower than that among Osaka residents, whereas evacuees had strikingly higher dread-risk perception, irrespective of whether their evacuation had been compulsory or voluntary. We identified location (distance from the nuclear power station), evacuation experience, and trust in the central government as the primary factors. Location (including evacuation experience) and trust in the central government were significantly associated with the multiple outcomes above.
Only information on “cancer risk from radiation and smoking risk” enhanced both subjective and objective understanding without diminishing trust in all participants and in the high dread-risk perception group; use of other risk-comparison information could lead the public to overestimate risk.

5.

Introduction

Pesticide poisoning is an important public health problem worldwide. This study aimed to determine the risk of all-cause and cause-specific inpatient mortality and to identify prognostic factors for inpatient mortality associated with unintentional insecticide and herbicide poisonings.

Methods

We performed a retrospective cohort study of 3,986 inpatients recruited at hospitalization between 1999 and 2008 in Taiwan. We used the International Classification of Diseases, 9th Revision, Clinical Modification external causes of injury codes to classify poisoning agents as accidental poisoning by insecticides or herbicides. Mortality rates were compared between insecticide-poisoned and herbicide-poisoned patients using Cox proportional hazards models to estimate multivariable-adjusted hazard ratios (HRs) and their 95% confidence intervals (CIs).

Results

There were 168 deaths during 21,583 person-days of follow-up evaluation (7.8 per 1,000 person-days). The major causes of mortality for insecticide poisonings were the toxic effect of organophosphate and coma, and the major causes of mortality for herbicide poisonings were the toxic effect of other pesticides and the toxic effect of organophosphate. The mortality for herbicide exposure was fourfold higher than that for insecticide exposure. The factors associated with inpatient mortality were herbicide poisonings (HR = 4.58, 95% CI 3.29 to 6.37) and receiving mechanical ventilation treatment (HR = 3.85, 95% CI 2.73 to 5.42).

Conclusions

We demonstrated that herbicides stand out as the dominant agents in poisoning-related fatalities. Controlling and limiting access to herbicides and developing appropriate therapeutic regimens, including emergency care, should be priorities.
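The person-time mortality rate reported in the Results above can be reproduced with a few lines of arithmetic (a minimal sketch; the variable names are ours, the figures are from the abstract):

```python
# Crude mortality rate per 1,000 person-days of follow-up.
deaths = 168
person_days = 21_583          # total follow-up time across all inpatients
rate_per_1000 = deaths / person_days * 1000
print(round(rate_per_1000, 1))  # 7.8, matching the reported rate
```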

6.

Background

Sleep disorders, especially chronic insomnia, have become a major health problem worldwide and, as a result, the use of hypnotics is steadily increasing. However, few studies with a large sample size and long-term observation have investigated the relationship between specific hypnotics and mortality.

Methods

We conducted this retrospective cohort study using data from the National Health Insurance Research Database in Taiwan. Claims data, including basic characteristics, the use of hypnotics, and survival from 2000 to 2009, were included for 1,320,322 individuals. The use of hypnotics was divided into groups by defined daily dose and cumulative length of use. Hazard ratios (HRs) were calculated from a Cox proportional hazards model, with two different matching techniques used to examine the associations.

Results

Compared to non-users, both users of benzodiazepines (HR = 1.81; 95% confidence interval [CI] = 1.78–1.85) and mixed users (HR = 1.44; 95% CI = 1.42–1.47) had a higher risk of death, whereas users of other non-benzodiazepines showed no difference. Zolpidem users (HR = 0.73; 95% CI = 0.71–0.75) exhibited a lower risk of mortality in the adjusted models. This pattern remained similar with both matching techniques. Secondary analysis indicated that zolpidem users had a reduced risk of major cause-specific mortality except cancer, and that this protective effect was dose-responsive, with those using it for more than 1 year having the lowest risk.

Conclusions

The effects of different types of hypnotics on mortality were diverse in this large cohort with long-term follow-up based on representative claims data in Taiwan. The use of zolpidem was associated with a reduced risk of mortality.
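A reported hazard ratio and its 95% CI, such as the benzodiazepine estimate above (HR = 1.81; 95% CI 1.78–1.85), implicitly carry a log-scale standard error. The sketch below shows the standard back-calculation; this is illustrative arithmetic, not a method described in the paper:

```python
import math

# Recover the log-scale standard error of a hazard ratio from its 95% CI:
# se = (ln(upper) - ln(lower)) / (2 * 1.96), a standard epidemiology identity.
def se_from_ci(lower, upper, z=1.96):
    return (math.log(upper) - math.log(lower)) / (2 * z)

# Benzodiazepine users: HR = 1.81 (95% CI 1.78-1.85) -> very narrow CI,
# reflecting the very large sample size.
se = se_from_ci(1.78, 1.85)
print(round(se, 4))  # ≈ 0.0098
```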

7.
As a result of the accident at the Fukushima Daiichi Nuclear Power Plant (FNPP) on 11 March 2011, a huge amount of radionuclides, including radiocesium, was released and spread over a wide area of eastern Japan. Although three years have passed since the accident, residents around the FNPP remain anxious about internal radiation exposure due to radiocesium. In this study, we screened internal radiation exposure doses in Iwaki city, Fukushima prefecture, using a whole-body counter. The first screening was conducted from October 2012 to February 2013, and the second from May to November 2013. Study participants were employees of ALPINE and their families. A total of 2,839 participants (1,366 men and 1,473 women, 1–86 years old) underwent the first screening, and 2,092 (1,022 men and 1,070 women, 1–86 years old) underwent the second. The results showed that 99% of subjects registered below 300 Bq per body in the first screening, and all subjects registered below 300 Bq per body in the second. The committed effective dose ranged from 0.01–0.06 mSv in the first screening and 0.01–0.02 mSv in the second. Long-term follow-up studies are needed to avoid unnecessary chronic internal exposure and to reduce anxiety among residents by communicating radiation health risks.

8.

Background

Psychiatric manifestations after the occurrence of epilepsy have often been noted. However, the association between newly diagnosed epilepsy and subsequent psychiatric disorders is not completely understood. We followed two longitudinal cohorts, of patients with and without epilepsy, to investigate the risk factors and hazard ratios for developing psychiatric disorders after a new diagnosis of epilepsy.

Methods

We identified 938 patients with a new diagnosis of epilepsy and 518,748 participants without epilepsy from the National Health Insurance Research Database in 2000–2002 and tracked them until 2008. We compared the incidence of developing psychiatric disorders between the two cohorts, evaluated risk factors and measured the associated hazard ratios (HRs) and 95% confidence intervals (CIs) of developing psychiatric disorders.

Findings

The incidences of psychiatric disorders for people with and without epilepsy were 94.1 and 22.6 per 1000 person-years, respectively. After adjusting for covariates, the epilepsy cohort showed the highest risks for mental retardation (HR 31.5, 95% CI 18.9 to 52.4), bipolar disorder (HR 23.5, 95% CI 11.4 to 48.3) and alcohol or drug psychosis (HR 18.8, 95% CI 11.1 to 31.8) among the psychiatric complications that developed after newly diagnosed epilepsy. The risk increased with generalized seizures, with the frequency of outpatient visits, emergency room visits and hospitalizations for epilepsy, and with older age. Chronologically, the highest risk occurred in the first year after epilepsy diagnosis (HR 11.4, 95% CI 9.88 to 13.2).

Conclusion

Various psychiatric disorders developed after newly diagnosed epilepsy and were closely related to generalized seizures and the use of medical services for epilepsy. This shows a need for integrated psychiatric care for patients newly diagnosed with epilepsy, especially in the first year.

9.

Background

Meningitis after neurosurgery can result in severe morbidity and high mortality. Incidence varies among regions, and limited data are available on meningitis after major craniotomy.

Aim

This retrospective cohort study aimed to determine the incidence, risk factors and microbiological spectrum of postcraniotomy meningitis in a large clinical center of Neurosurgery in China.

Methods

Patients who underwent neurosurgery at the Department of Neurosurgery of Huashan Hospital, the largest neurosurgery center in Asia and the Pacific, between 1 January and 31 December 2008 were selected. Individuals who underwent only shunt placement, burr holes, stereotactic surgery, or transsphenoidal or spinal surgery were excluded. The complete medical records of each case were reviewed, and data on risk factors for meningitis were extracted and evaluated.

Results

A total of 65 cases of meningitis were identified among the 755 cases in the study, an incidence of 8.60%. The risk of meningitis was increased by the presence of diabetes mellitus (odds ratio [OR], 6.27; P = 0.009), the use of external ventricular drainage (OR, 4.30; P = 0.003), and the use of lumbar drainage (OR, 17.23; P < 0.001). The isolated microorganisms included Acinetobacter baumannii, Enterococcus sp., Streptococcus intermedius and Klebsiella pneumoniae.

Conclusions

Meningitis remains an important source of morbidity and mortality after major craniotomy. Diabetic patients and those with cerebrospinal fluid drainage carry a significantly higher risk of infection. Early identification of these risk factors will help physicians improve patient care.

10.
Numerous radionuclides were released from the Fukushima Daiichi Nuclear Power Station (F1-NPS) in Japan following the magnitude 9.0 earthquake and tsunami on March 11, 2011. Local residents have been eager to estimate their individual radiation exposure. Thus, absorbed dose rates in the indoor and outdoor air at evacuation sites in Fukushima Prefecture were measured using gamma-ray measuring devices, and individual radiation exposure was calculated by assessing the radiation dose reduction efficiency (defined as the ratio of the absorbed dose rate in the indoor air to the absorbed dose rate in the outdoor air) of wood, aluminum, and reinforced concrete buildings. Between March 2011 and July 2011, the dose reduction efficiencies of wood, aluminum, and reinforced concrete buildings were 0.55±0.04, 0.15±0.02, and 0.19±0.04, respectively. The reduction efficiency of wood structures was 1.4 times higher than that reported by the International Atomic Energy Agency. The efficiency of reinforced concrete was similar to previously reported values, whereas that of aluminum structures had not been previously reported. Dose reduction efficiency increased in proportion to the distance from the F1-NPS at 8 of the 18 evacuation sites. Dose reduction efficiencies at evacuation sites did not vary over time, even though absorbed dose rates in the outdoor air decreased. These data suggest that dose reduction efficiency depends on structure type, level of contamination, and evacuee behavior at evacuation sites.
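The dose reduction efficiency defined in this abstract is a simple ratio; the sketch below computes it from a pair of hypothetical readings (the values are illustrative, chosen to reproduce the mean reported for wood structures):

```python
# Dose reduction efficiency = indoor absorbed dose rate / outdoor absorbed
# dose rate. Readings are hypothetical; units cancel as long as they match.
indoor_dose_rate = 0.055   # hypothetical indoor reading
outdoor_dose_rate = 0.100  # hypothetical outdoor reading
efficiency = indoor_dose_rate / outdoor_dose_rate
print(round(efficiency, 2))  # 0.55, the mean reported for wood structures
```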

11.
12.

Background

During 2007 and 2008, millions of patients in the US likely received contaminated heparin (CH), heparin adulterated with oversulfated chondroitin sulfate, which was associated with anaphylactoid reactions. We tested the hypothesis that CH was associated with serious morbidity, mortality, intensive care unit (ICU) stay and heparin-induced thrombocytopenia following adult cardiac surgery.

Methods and Findings

We conducted a single center, retrospective, propensity-matched cohort study during the period of CH and the equivalent time frame in the three preceding or the two following years. Perioperative data were obtained from the institutional record of the Society of Thoracic Surgeons National Database, for which the data collection is prospective, standardized and performed by independent investigators. After matching, logistic regression was performed to evaluate the independent effect of CH on the composite adverse outcome (myocardial infarction, stroke, pneumonia, dialysis, cardiac arrest) and on mortality. Cox regression was used to determine the association between CH and ICU length of stay. The 1:5 matched groups included 220 patients potentially exposed to CH and 918 controls. There were more adverse outcomes in the exposed cohort (20.9% versus 12.0%; difference = 8.9%; 95% CI 3.6% to 15.1%, P<0.001) with an odds ratio for CH of 2.0 (95% CI, 1.4 to 3.0, P<0.001). In the exposed group there was a non-significant increase in mortality (5.9% versus 3.5%, difference = 2.4%; 95% CI, −0.4 to 3.5%, P = 0.1), the median ICU stay was longer by 14.1 hours (interquartile range −26.6 to 79.8, S = 3299, P = 0.0004) with an estimated hazard ratio for CH of 1.2 (95% CI, 1.0 to 1.4, P = 0.04). There was no difference in nadir platelet counts between cohorts.

Conclusions

The results from this single center study suggest the possibility that contaminated heparin might have contributed to serious morbidity following cardiac surgery.

13.

Objective

Patients with late-onset depression (LOD) have been reported to run a higher risk of subsequent dementia. The present study was conducted to assess whether statins can reduce the risk of dementia in these patients.

Methods

We used data from the National Health Insurance database of Taiwan during 1996–2009. Standardized incidence ratios (SIRs) were calculated for LOD and subsequent dementia. The criteria for an LOD diagnosis included age ≥65 years, diagnosis of depression after 65 years of age, at least three service claims, and treatment with antidepressants. A time-dependent Cox proportional hazards model was applied for multivariate analyses. Propensity scores with a one-to-one nearest-neighbor matching model were used to select matched patients for validation studies. Kaplan-Meier curves were used to estimate dementia-free survival after the diagnosis of LOD.

Results

In total, 45,973 patients aged ≥65 years were enrolled. The prevalence of LOD was 12.9% (5,952/45,973). Patients with LOD had a higher incidence of subsequent dementia than those without LOD (odds ratio: 2.785; 95% CI 2.619–2.958). Among patients with LOD, users of lipid-lowering agents (LLAs) for at least 3 months had a lower incidence of subsequent dementia than non-users (hazard ratio = 0.781, 95% CI 0.685–0.891). Nevertheless, only statin users showed a reduced risk of dementia (hazard ratio = 0.674, 95% CI 0.547–0.832), while users of other LLAs did not; this was further validated by Kaplan-Meier estimates after we used propensity scores with a one-to-one nearest-neighbor matching model to control for confounding factors.

Conclusions

Statins may reduce the risk of subsequent dementia in patients with LOD.

14.

Background

Appropriate empiric therapy, antibiotic therapy with in vitro activity to the infecting organism given prior to confirmed culture results, may improve Staphylococcus aureus outcomes. We aimed to measure the clinical impact of appropriate empiric antibiotic therapy on mortality, while statistically adjusting for comorbidities, severity of illness and presence of virulence factors in the infecting strain.

Methodology

We conducted a retrospective cohort study of adult patients admitted to a tertiary-care facility from January 1, 2003 to June 30, 2007, who had S. aureus bacteremia. Time to appropriate therapy was measured from blood culture collection to the receipt of antibiotics with in vitro activity to the infecting organism. Cox proportional hazard models were used to measure the association between receipt of appropriate empiric therapy and in-hospital mortality, statistically adjusting for patient and pathogen characteristics.

Principal Findings

Among 814 admissions, 537 (66%) received appropriate empiric therapy. Those who received appropriate empiric therapy had a higher hazard of 30-day in-hospital mortality (Hazard Ratio (HR): 1.52; 95% confidence interval (CI): 0.99, 2.34). A longer time to appropriate therapy was protective against mortality (HR: 0.79; 95% CI: 0.60, 1.03) except among the healthiest quartile of patients (HR: 1.44; 95% CI: 0.66, 3.15).

Conclusions/Significance

Appropriate empiric therapy was not associated with decreased mortality in patients with S. aureus bacteremia except in the least ill patients. Initial broad antibiotic selection may not be widely beneficial.

15.
On March 11, 2011, Japan’s northeast Pacific coast was hit by a gigantic earthquake and subsequent tsunami. Soma City in Fukushima Prefecture is situated approximately 44 km north of the Fukushima Daiichi Nuclear Power Plant, and Soma General Hospital is the only hospital in Soma City that provides full-time otolaryngological care. We investigated the changes in new patients from one year before to three years after the disaster, covering 18,167 new patients treated at our department during the four years from April 1, 2010 to March 31, 2014. Among the new patients, we categorized diagnoses of Meniere’s disease, acute low-tone sensorineural hearing loss, vertigo, sudden deafness, tinnitus, and facial palsy as neuro-otologic symptoms. We also investigated changes in the numbers of patients examined for other otolaryngological disorders, including epistaxis, infectious diseases of the laryngopharynx, and allergic rhinitis. The total number of new patients did not change remarkably from year to year. Conversely, cases of vertigo, Meniere’s disease, and acute low-tone sensorineural hearing loss increased in number immediately after the disaster, reaching a plateau in the second year and decreasing slightly in the third year. Notably, 4.8% of patients suffering from these neuro-otologic diseases had complications from depression and other mental diseases. There was no apparent increase in the number of new patients suffering from diseases other than neuro-otologic diseases, including epistaxis and allergic rhinitis. Patients suffering from vertigo and/or dizziness increased during the first few years after the disaster, which we attribute to the continuing stress and tension among the inhabitants. This investigation of those living in the disaster area highlights the need for long-term support.

16.

Background

Sleep-disordered breathing is a common condition associated with adverse health outcomes including hypertension and cardiovascular disease. The overall objective of this study was to determine whether sleep-disordered breathing and its sequelae of intermittent hypoxemia and recurrent arousals are associated with mortality in a community sample of adults aged 40 years or older.

Methods and Findings

We prospectively examined whether sleep-disordered breathing was associated with an increased risk of death from any cause in 6,441 men and women participating in the Sleep Heart Health Study. Sleep-disordered breathing was assessed with the apnea–hypopnea index (AHI) based on an in-home polysomnogram. Survival analysis and proportional hazards regression models were used to calculate hazard ratios for mortality after adjusting for age, sex, race, smoking status, body mass index, and prevalent medical conditions. The average follow-up period for the cohort was 8.2 y during which 1,047 participants (587 men and 460 women) died. Compared to those without sleep-disordered breathing (AHI: <5 events/h), the fully adjusted hazard ratios for all-cause mortality in those with mild (AHI: 5.0–14.9 events/h), moderate (AHI: 15.0–29.9 events/h), and severe (AHI: ≥30.0 events/h) sleep-disordered breathing were 0.93 (95% CI: 0.80–1.08), 1.17 (95% CI: 0.97–1.42), and 1.46 (95% CI: 1.14–1.86), respectively. Stratified analyses by sex and age showed that the increased risk of death associated with severe sleep-disordered breathing was statistically significant in men aged 40–70 y (hazard ratio: 2.09; 95% CI: 1.31–3.33). Measures of sleep-related intermittent hypoxemia, but not sleep fragmentation, were independently associated with all-cause mortality. Coronary artery disease–related mortality associated with sleep-disordered breathing showed a pattern of association similar to all-cause mortality.

Conclusions

Sleep-disordered breathing is associated with all-cause mortality and specifically that due to coronary artery disease, particularly in men aged 40–70 y with severe sleep-disordered breathing.
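The AHI severity cutoffs used in this study can be expressed as a small classifier (a hedged sketch: the function name is ours, but the cutoffs in events/h come straight from the abstract):

```python
# Classify an apnea-hypopnea index (events/h) using the study's categories:
# none (<5), mild (5.0-14.9), moderate (15.0-29.9), severe (>=30.0).
def ahi_category(ahi):
    if ahi < 5.0:
        return "none"
    elif ahi < 15.0:
        return "mild"
    elif ahi < 30.0:
        return "moderate"
    return "severe"

print(ahi_category(32.0))  # severe
```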

17.
18.
Purpose

This study investigated whether alcoholic intoxication (AI) increases the risk of inflammatory bowel disease (IBD) by using a population-based database in Taiwan.

Methods

This retrospective matched-cohort study included 57,611 inpatients with new-onset AI (AI cohort) and 230,444 randomly selected controls (non-AI cohort). Each patient was monitored for 10 years to identify those subsequently diagnosed with Crohn disease (CD) and ulcerative colitis (UC) during the follow-up period. Cox proportional hazard regression analysis was conducted to determine the risk of IBD in patients with AI compared with controls without AI.

Results

The incidence rate of IBD during the 10-year follow-up period was 2.69 per 1,000 person-years and 0.49 per 1,000 person-years in the AI and non-AI cohorts, respectively. After adjustment for age, sex, and comorbidity, the AI cohort exhibited a 3.17-fold increased risk of IBD compared with the non-AI cohort (hazard ratio [HR] = 3.17, 95% confidence interval [CI] = 2.19–4.58). Compared with the non-AI cohort, the HRs of CD and UC for the AI cohort were 4.40 and 2.33, respectively. After stratification by the severity of AI according to the duration of hospital stay, the adjusted HRs correlated significantly with severity; the HRs of IBD were 1.76, 6.83, and 19.9 for patients with mild, moderate, and severe AI, respectively (p for trend < .0001).

Conclusion

The risk of IBD was higher in patients with AI and increased with the length of hospital stay.
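The incidence rates reported above imply a crude (unadjusted) incidence-rate ratio that can be compared with the covariate-adjusted HR of 3.17; the sketch below is illustrative arithmetic only:

```python
# Crude incidence-rate ratio from the reported rates (per 1,000 person-years).
# The adjusted HR (3.17) is smaller, reflecting adjustment for age, sex,
# and comorbidity.
rate_ai = 2.69       # IBD incidence, AI cohort
rate_non_ai = 0.49   # IBD incidence, non-AI cohort
crude_irr = rate_ai / rate_non_ai
print(round(crude_irr, 2))  # 5.49
```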

19.

Background

In vitro studies have shown inhibitory effects of magnesium (Mg) on phosphate-induced calcification of vascular smooth muscle cells, raising the possibility that maintaining a high Mg level may be useful for reducing cardiovascular risks of patients with hyperphosphatemia. We examined how serum Mg levels affect the association between serum phosphate levels and the risk of cardiovascular mortality in patients undergoing hemodialysis.

Methods

A nationwide register-based cohort study was conducted using database of the Renal Data Registry of the Japanese Society for Dialysis Therapy in 2009. We identified 142,069 patients receiving in-center hemodialysis whose baseline serum Mg and phosphate levels were available. Study outcomes were one-year cardiovascular and all-cause mortality. Serum Mg levels were categorized into three groups (lower, <2.7 mg/dL; intermediate, ≥2.7, <3.1 mg/dL; and higher, ≥3.1 mg/dL).

Results

During follow-up, 11,401 deaths occurred, of which 4,751 (41.7%) were ascribed to cardiovascular disease. In multivariable analyses, an increase in serum phosphate levels elevated the risk of cardiovascular mortality in the lower- and intermediate-Mg groups, whereas no significant risk increment was observed in the higher-Mg group. Moreover, among patients with serum phosphate levels of ≥6.0 mg/dL, the cardiovascular mortality risk significantly decreased with increasing serum Mg levels (adjusted odds ratios [95% confidence intervals] of the lower-, intermediate-, and higher-Mg groups were 1.00 (reference), 0.81 [0.66–0.99], and 0.74 [0.56–0.97], respectively). An interaction between Mg and phosphate on the risk of cardiovascular mortality was statistically significant (P = 0.03).

Conclusion

Serum Mg levels significantly modified the mortality risk associated with hyperphosphatemia in patients undergoing hemodialysis.

20.

Purpose

To investigate risk factors associated with progressive visual field (VF) loss in primary angle closure glaucoma (PACG).

Methods

We retrospectively reviewed the medical records of PACG patients who had ≥5 reliable VF examinations (central 24-2 threshold test, Humphrey Field Analyzer) and ≥2 years of follow-up. Each VF was scored using the Collaborative Initial Glaucoma Treatment Study system. Progression was defined as 3 consecutive follow-up VF tests each with a score increased by ≥3 above the mean of the first 2 VF scores. Factors associated with VF progression were evaluated by Cox proportional hazards models.

Results

A total of 89 eyes from 89 patients (mean age, 69.8 ± 7.9 years), who received a mean of 6.9 ± 2.3 VF tests (mean deviation at initial, -8.1 ± 4.4 dB) with a mean follow-up of 63.9 ± 23.9 months were included. VF progression was detected in 9 eyes (10%). The axial length (AL), anterior chamber depth, and intraocular pressure (IOP) in patients with and without progression were 22.5 ± 0.6 and 23.1 ± 0.9 mm, 2.5 ± 0.3 and 2.5 ± 0.3 mm, 14.8 ± 2.4 and 14.3 ± 2.3 mm Hg, respectively. AL was the only factor associated with progression in both Cox proportional hazards univariate (p = 0.031) and multivariate models (p = 0.023).

Conclusion

When taking into account age, IOP, follow-up period, and number of VF tests, a shorter AL was the only factor associated with VF progression in this cohort of Chinese patients with PACG. Further studies are warranted to verify the role of AL in progressive VF loss in PACG.
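The progression rule described in the Methods is effectively a small algorithm, sketched below under stated assumptions (the function name, defaults, and input format are ours; the rule itself — 3 consecutive follow-up scores each ≥3 points above the mean of the first 2 scores — is from the abstract):

```python
# Flag visual-field progression: True when 3 consecutive follow-up scores
# are each >= (mean of the first 2 scores) + 3.
def vf_progressed(scores, baseline_tests=2, run_length=3, threshold=3):
    if len(scores) < baseline_tests + run_length:
        return False  # not enough tests to apply the rule
    baseline = sum(scores[:baseline_tests]) / baseline_tests
    run = 0
    for s in scores[baseline_tests:]:
        run = run + 1 if s >= baseline + threshold else 0
        if run >= run_length:
            return True
    return False

print(vf_progressed([2, 2, 5, 6, 5]))  # True: three consecutive elevated scores
print(vf_progressed([2, 2, 5, 1, 6]))  # False: the run is interrupted
```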
