Similar Articles
1.

Background

Birth size, perhaps a proxy for prenatal environment, might be a correlate of subsequent breast cancer risk, but findings from epidemiological studies have been inconsistent. We re-analysed individual participant data from published and unpublished studies to obtain more precise estimates of the magnitude and shape of the birth size–breast cancer association.

Methods and Findings

Studies were identified through computer-assisted and manual searches, and personal communication with investigators. Individual participant data from 32 studies, comprising 22,058 breast cancer cases, were obtained. Random-effects models were used, where appropriate, to combine study-specific estimates of effect. Birth weight was positively associated with breast cancer risk in studies based on birth records (pooled relative risk [RR] per one standard deviation [SD] [= 0.5 kg] increment in birth weight: 1.06; 95% confidence interval [CI] 1.02–1.09) and parental recall when the participants were children (1.02; 95% CI 0.99–1.05), but not in those based on adult self-reports or maternal recall during the woman's adulthood (0.98; 95% CI 0.95–1.01) (p for heterogeneity between data sources = 0.003). Relative to women who weighed 3.000–3.499 kg, the risk was 0.96 (95% CI 0.80–1.16) in those who weighed < 2.500 kg and 1.12 (95% CI 1.00–1.25) in those who weighed ≥ 4.000 kg (p for linear trend = 0.001) in birth record data. Birth length and head circumference from birth records were also positively associated with breast cancer risk (pooled RR per one SD increment: 1.06 [95% CI 1.03–1.10] and 1.09 [95% CI 1.03–1.15], respectively). Simultaneous adjustment for these three birth size variables showed that length was the strongest independent predictor of risk. The birth size effects did not appear to be confounded or mediated by established breast cancer risk factors and were not modified by age or menopausal status. The cumulative incidence of breast cancer per 100 women by age 80 y in the study populations was estimated to be 10.0, 10.0, 10.4, and 11.5 in those who were, respectively, in the bottom, second, third, and top fourths of the birth length distribution.
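The kind of random-effects pooling of study-specific estimates described above can be illustrated with a standard DerSimonian–Laird calculation. The sketch below runs on hypothetical study-level relative risks and standard errors (not this study's actual data), pooling on the log scale:

```python
import math

def pool_random_effects(log_rr, se):
    """DerSimonian-Laird random-effects pooling of study-level log relative risks."""
    w = [1 / s**2 for s in se]                                   # inverse-variance weights
    sw = sum(w)
    mu_fe = sum(wi * y for wi, y in zip(w, log_rr)) / sw         # fixed-effect mean
    q = sum(wi * (y - mu_fe)**2 for wi, y in zip(w, log_rr))     # Cochran's Q
    c = sw - sum(wi**2 for wi in w) / sw
    tau2 = max(0.0, (q - (len(log_rr) - 1)) / c)                 # between-study variance
    w_re = [1 / (s**2 + tau2) for s in se]                       # random-effects weights
    mu = sum(wi * y for wi, y in zip(w_re, log_rr)) / sum(w_re)
    se_mu = math.sqrt(1 / sum(w_re))
    # back-transform the pooled log-RR and its 95% CI to the RR scale
    return math.exp(mu), (math.exp(mu - 1.96 * se_mu), math.exp(mu + 1.96 * se_mu))

# hypothetical per-study RRs per 1-SD birth weight increment, for illustration only
rrs = [1.04, 1.08, 1.02, 1.10]
ses = [0.03, 0.04, 0.05, 0.06]
pooled, (lo, hi) = pool_random_effects([math.log(r) for r in rrs], ses)
print(f"pooled RR {pooled:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

When the heterogeneity statistic Q is below its degrees of freedom, the between-study variance is truncated at zero and the result coincides with the fixed-effect estimate.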

Conclusions

This pooled analysis of individual participant data is consistent with birth size, and in particular birth length, being an independent correlate of breast cancer risk in adulthood.

2.

Background

Neonatal hypothyroidism has been associated in animal models with maternal exposure to several environmental contaminants; however, evidence for such an association in humans is inconsistent. We evaluated whether maternal exposure to 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD), a persistent and widespread toxic environmental contaminant, is associated with modified neonatal thyroid function in a large, highly exposed population in Seveso, Italy.

Methods and Findings

Between 1994 and 2005, among individuals exposed to TCDD after the 1976 Seveso accident, we conducted: (i) a residence-based population study on 1,014 children born to the 1,772 women of reproductive age in the most contaminated zones (A, very high contamination; B, high contamination) and to 1,772 age-matched women from the surrounding noncontaminated (reference) area; (ii) a biomarker study on 51 mother–child pairs for whom recent maternal plasma dioxin measurements were available. Neonatal blood thyroid-stimulating hormone (b-TSH) was measured in all children. We performed crude and multivariate analyses adjusting for gender, birth weight, birth order, maternal age, hospital, and type of delivery. Mean neonatal b-TSH was 0.98 μU/ml (95% confidence interval [CI] 0.90–1.08) in the reference area (n = 533), 1.35 μU/ml (95% CI 1.22–1.49) in zone B (n = 425), and 1.66 μU/ml (95% CI 1.19–2.31) in zone A (n = 56) (p < 0.001). The proportion of children with b-TSH > 5 μU/ml was 2.8% in the reference area, 4.9% in zone B, and 16.1% in zone A (p < 0.001). Neonatal b-TSH was correlated with current maternal plasma TCDD (n = 51, β = 0.47, p < 0.001) and with plasma toxic equivalents of coplanar dioxin-like compounds (n = 51, β = 0.45, p = 0.005).

Conclusions

Our data indicate that environmental contaminants such as dioxins can modify neonatal thyroid function long after the initial exposure.

3.

Background

Immune-specific genes, as well as genes regulating the formation of the skin barrier, are major determinants of eczema manifestation. There is debate as to whether allergic sensitization and filaggrin gene (FLG) variants lead to eczema, or whether FLG variants and eczema increase the risk of allergic sensitization. To investigate the time order between eczema and allergic sensitization with respect to FLG variants, we analyzed data from a large prospective study covering infancy to late adolescence.

Methodology/Principal Findings

Repeated measurements of eczema and allergic sensitization (documented by skin prick tests) at ages 1, 2, 4, 10, and 18 years were ascertained in the Isle of Wight birth cohort (n = 1,456). Three transition periods were analyzed: age 1-or-2 to 4, 4 to 10, and 10 to 18 years. FLG variants were genotyped in 1,150 participants. Over the three transition periods, in temporal sequence analyses of initially eczema-free participants, the combined effect of FLG variants and allergic sensitization was associated with a 2.92-fold (95% CI: 1.47–5.77) increased risk ratio (RR) of eczema at subsequent examinations. This overall risk was more pronounced at a younger age (transition period 1-or-2 to 4, RR = 6.47, 95% CI: 1.96–21.33). In contrast, FLG variants in combination with eczema showed a weaker, but significant, risk ratio for subsequent allergic sensitization only up to 10 years of age.

Conclusions/Significance

Taking time order into account, this prospective study demonstrates for the first time that the combination of FLG variants and allergic sensitization increased the risk of eczema in subsequent years. FLG variants also interacted with eczema to increase the risk of subsequent allergic sensitization, although this effect was limited to younger ages. Hence, early restoration of a defective skin barrier could prevent allergic sensitization and thereby reduce the risk of eczema development.

4.

Background

Multidrug-resistant Plasmodium vivax (Pv) is widespread in eastern Indonesia, and emerging elsewhere in Asia-Pacific and South America, but is generally regarded as a benign disease. The aim of the study was to review the spectrum of disease associated with malaria due to Pv and P. falciparum (Pf) in patients presenting to a hospital in Timika, southern Papua, Indonesia.

Methods and Findings

Data were prospectively collected from all patients attending the outpatient and inpatient departments of the only hospital in the region using systematic data forms and hospital computerised records. Between January 2004 and December 2007, clinical malaria was present in 16% (60,226/373,450) of hospital outpatients and 32% (12,171/37,800) of inpatients. Among patients admitted with slide-confirmed malaria, 64% had Pf, 24% Pv, and 10.5% mixed infections. The proportion of malarial admissions attributable to Pv rose to 47% (415/887) in children under 1 y of age. Severe disease was present in 2,634 (22%) inpatients with malaria, with the risk greater among Pv infections (23% [675/2,937]) than Pf infections (20% [1,570/7,817]; odds ratio [OR] = 1.19 [95% confidence interval (CI) 1.08–1.32], p = 0.001), and greatest in patients with mixed infections (31% [389/1,273]); overall p < 0.0001. Severe anaemia (haemoglobin < 5 g/dl) was the major complication associated with Pv, accounting for 87% (589/675) of severe disease, compared to 73% (1,144/1,570) of severe manifestations with Pf (p < 0.001). Pure Pv infection was also present in 78 patients with respiratory distress and 42 patients with coma. In total, 242 (2.0%) patients with malaria died during admission: 2.2% (167/7,722) with Pf, 1.6% (46/2,916) with Pv, and 2.3% (29/1,260) with mixed infections (p = 0.126).

Conclusions

In this region, with established high-grade chloroquine resistance in both Pv and Pf, Pv is associated with severe and fatal malaria, particularly in young children. The epidemiology of P. vivax needs to be re-examined in other regions where chloroquine resistance is increasing.

5.

Background

Hip and knee replacement are among the most frequently performed surgical procedures in the world. Resurfacing of the hip and unicondylar knee replacement are increasingly being used, yet there is relatively little evidence on their performance. To study the performance of joint replacement in England, we investigated revision rates in the first 3 y after hip or knee replacement according to prosthesis type.

Methods and Findings

We linked records of the National Joint Registry for England and Wales and the Hospital Episode Statistics for patients with a primary hip or knee replacement in the National Health Service in England between April 2003 and September 2006. Hospital Episode Statistics records of succeeding admissions were used to identify revisions for any reason. 76,576 patients with a primary hip replacement and 80,697 with a primary knee replacement were included (51% of all primary hip and knee replacements done in the English National Health Service). In hip patients, 3-y revision rates were 0.9% (95% confidence interval [CI] 0.8%–1.1%) with cemented, 2.0% (1.7%–2.3%) with cementless, 1.5% (1.1%–2.0%) with “hybrid” prostheses, and 2.6% (2.1%–3.1%) with hip resurfacing (p < 0.0001). Revision rates after hip resurfacing were increased especially in women. In knee patients, 3-y revision rates were 1.4% (1.2%–1.5%) with cemented, 1.5% (1.1%–2.1%) with cementless, and 2.8% (1.8%–4.5%) with unicondylar prostheses (p < 0.0001). Revision rates after knee replacement decreased strongly with age.

Interpretation

Overall, about one in 75 patients needed a revision of their prosthesis within 3 y. On the basis of our data, consideration should be given to using hip resurfacing only in male patients and unicondylar knee replacement only in elderly patients.

6.

Background

Despite guidelines establishing the need for comprehensive paediatric drug development programmes, pivotal trials in children with epilepsy have mostly been completed in Phase IV, as a post-approval replication of adult data. However, the treatment response in children can differ from that in adults. Whether such differences in drug effect between adults and children occur in the treatment of drug-resistant partial epilepsy has not been investigated, although they could have a substantial impact on the design and results of paediatric randomised controlled trials (RCTs).

Methods and Findings

Three electronic databases were searched for RCTs investigating any antiepileptic drug (AED) in the add-on treatment of drug-resistant partial epilepsy in both children and adults. The treatment effect was compared between the two age groups using the ratio of the relative risks (RRs) of the 50% responder rate between active AED and placebo groups, as well as meta-regression. Differences in the response to placebo and to active treatment were explored using logistic regression. A comparable approach was used for analysing secondary endpoints, including seizure-free rate, total and adverse event–related withdrawal rates, and withdrawal rate for seizure aggravation. Five AEDs were evaluated in both adults and children with drug-resistant partial epilepsy in 32 RCTs. The treatment effect was significantly lower in children than in adults (RR ratio: 0.67 [95% confidence interval (CI) 0.51–0.89]; p = 0.02 by meta-regression). This difference was related to an age-dependent variation in the response to placebo, with a higher rate in children than in adults (19% versus 9.9%, p < 0.001), whereas no significant difference was observed in the response to active treatment (37.2% versus 30.4%, p = 0.364). The relative risk of the total withdrawal rate was also significantly lower in children than in adults (RR ratio: 0.65 [95% CI 0.43–0.98], p = 0.004 by meta-regression), due to a higher withdrawal rate for seizure aggravation in children (5.6%) than in adults (0.7%) receiving placebo (p < 0.001). Finally, there was no significant difference in the seizure-free rate between adult and paediatric studies.

Conclusions

Children with drug-resistant partial epilepsy receiving placebo in double-blind RCTs demonstrated a significantly greater 50% responder rate than adults, probably reflecting larger placebo and regression-to-the-mean effects. Paediatric clinical trial designs should account for these age-dependent variations in the response to placebo to reduce the risk of an underestimated sample size, which could result in falsely negative trials.

7.
Shen L, Xie X, Su Y, Luo C, Zhang C, Zeng B. PLoS ONE 2011, 6(10): e26267

Background

Bisphosphonates and parathyroid hormone (PTH) represent the antiresorptive and anabolic classes of drugs for osteoporosis treatment. Bone mineral density (BMD) is an essential parameter for the evaluation of anti-osteoporotic drugs. The aim of this study was to evaluate the effects of PTH versus bisphosphonates on BMD for the treatment of osteoporosis.

Methods/Principal Findings

We performed a literature search to identify studies that investigated the effects of PTH versus bisphosphonate treatment on BMD. A total of 7 articles were included in this study, representing data on 944 subjects. The pooled data showed that the percent increase in spine BMD is higher with PTH than with bisphosphonates (WMD = 5.90, 95% CI: 3.69–8.10, p<0.01). In the hip, high-dose (40 µg) PTH (1–34) showed significantly higher increments of BMD compared to alendronate (femoral neck: WMD = 5.67, 95% CI: 3.47–7.87, p<0.01; total hip: WMD = 2.40, 95% CI: 0.49–4.31, p<0.05). PTH treatment yielded significantly higher increments than bisphosphonates over treatment durations longer than 12 months (femoral neck: WMD = 5.67, 95% CI: 3.47–7.86, p<0.01; total hip: WMD = 2.40, 95% CI: 0.49–4.31, p<0.05) and significantly lower increments at 12 months (femoral neck: WMD = −1.05, 95% CI: −2.26 to 0.16, p<0.01; total hip: WMD = −1.69, 95% CI: −3.05 to −0.34, p<0.05). In the distal radius, there was a significant reduction in BMD with PTH relative to alendronate treatment (WMD = −3.68, 95% CI: −5.57 to −1.79, p<0.01).

Discussion

Our results demonstrate that PTH significantly increased lumbar spine BMD compared with bisphosphonate treatment, and that PTH induced duration- and dose-dependent increases in hip BMD compared with bisphosphonates. This study also showed that, at the distal radius, BMD was significantly lower with PTH treatment than with alendronate treatment.

8.

Background

Several sub-Saharan African countries have rapidly scaled up the number of households that own insecticide-treated mosquito nets (ITNs). Although ITNs have proven efficacious in trials, evidence on their impact under routine conditions is limited to a few countries, and the extent to which the scale-up of ITNs has improved population health remains uncertain.

Methods and Findings

We used matched logistic regression to assess the individual-level association between household ITN ownership or use in children under 5 years of age and the prevalence of parasitemia among children, using six malaria indicator surveys (MIS) and one demographic and health survey. We used Cox proportional hazards models to assess the relationship between ITN household ownership and child mortality using 29 demographic and health surveys. The pooled relative reduction in parasitemia prevalence from random-effects meta-analysis associated with household ownership of at least one ITN was 20% (95% confidence interval [CI] 3%–35%; I² = 73.5%, p<0.01 for I² value). Sleeping under an ITN was associated with a pooled relative reduction in parasitemia prevalence in children of 24% (95% CI 1%–42%; I² = 79.5%, p<0.001 for I² value). Ownership of at least one ITN was associated with a pooled relative reduction in mortality between 1 month and 5 years of age of 23% (95% CI 13%–31%; I² = 25.6%, p>0.05 for I² value).

Conclusions

Our findings across a number of sub-Saharan African countries were highly consistent with results from previous clinical trials. These findings suggest that the recent scale-up in ITN coverage has likely been accompanied by significant reductions in child mortality and that additional health gains could be achieved with further increases in ITN coverage in populations at risk of malaria.

9.

Background

Pain in infancy is poorly understood, and medical staff often have difficulty assessing whether an infant is in pain. Current pain assessment tools rely on behavioural and physiological measures, such as change in facial expression, which may not accurately reflect pain experience. Our ability to measure cortical pain responses in young infants gives us the first opportunity to evaluate pain assessment tools with respect to the sensory input and establish whether the resultant pain scores reflect cortical pain processing.

Methods and Findings

Cortical haemodynamic activity was measured in infants, aged 25–43 wk postmenstrual, using near-infrared spectroscopy following a clinically required heel lance and compared to the magnitude of the premature infant pain profile (PIPP) score in the same infant to the same stimulus (n = 12, 33 test occasions). Overall, there was good correlation between the PIPP score and the level of cortical activity (regression coefficient = 0.72, 95% confidence interval [CI] limits 0.32–1.11, p = 0.001; correlation coefficient = 0.57). Of the different PIPP components, facial expression correlated best with cortical activity (regression coefficient = 1.26, 95% CI limits 0.84–1.67, p < 0.0001; correlation coefficient = 0.74) (n = 12, 33 test occasions). Cortical pain responses were still recorded in some infants who did not display a change in facial expression.

Conclusions

While painful stimulation generally evokes parallel cortical and behavioural responses in infants, pain may be processed at the cortical level without producing detectable behavioural changes. As a result, an infant with a low pain score based on behavioural assessment tools alone may not be pain free.

10.

Background

Severe malaria (SM) is classically associated with Plasmodium falciparum infection, and little information is available on the contribution of P. vivax to severe disease. Some epidemiological observations suggest that P. vivax or mixed infections protect against complications and death. A large morbidity surveillance study conducted in an area where the four human Plasmodium species coexist allowed us to estimate rates of SM among patients infected with one or several species.

Methods and Findings

This was a prospective cohort study conducted within the framework of the Malaria Vaccine Epidemiology and Evaluation Project. All presumptive malaria cases presenting at two rural health facilities over an 8-y period were investigated with history taking, clinical examination, and laboratory assessment. The case definition of SM was based on the World Health Organization (WHO) criteria adapted for the setting (i.e., clinical diagnosis of malaria associated with asexual blood-stage parasitaemia and a recent history of fits, coma, respiratory distress, or anaemia [haemoglobin < 5 g/dl]). Out of 17,201 presumptive malaria cases, 9,537 (55%) had a confirmed Plasmodium parasitaemia. Among those, 6.2% (95% confidence interval [CI] 5.7%–6.8%) fulfilled the case definition of SM, most of them children <5 y. In this age group, the proportion of SM was 11.7% (10.4%–13.2%) for P. falciparum, 8.8% (7.1%–10.7%) for P. vivax, and 17.3% (11.7%–24.2%) for mixed P. falciparum and P. vivax infections. P. vivax SM presented more often with respiratory distress than did P. falciparum SM (60% versus 41%, p = 0.002), but less often with anaemia (19% versus 41%, p = 0.0001).

Conclusion

P. vivax monoinfections, as well as mixed Plasmodium infections, are associated with SM, and there is no indication that mixed infections protect against SM. Interventions targeting P. falciparum alone might be insufficient to eliminate the overall malaria burden, and especially severe disease, in areas where P. falciparum and P. vivax coexist.

11.
Wang J, Chen J, Chen X, Wang B, Li K, Bi J. PLoS ONE 2011, 6(12): e28844

Background and Objective

Blood vessel invasion plays an important role in the progression and metastasis of cancer; however, its value as a prognostic factor for survival in non-small cell lung cancer (NSCLC) remains controversial. The aim of this study was to explore the relationship between blood vessel invasion and outcome in patients with NSCLC using meta-analysis.

Methods

A meta-analysis of published studies was conducted to investigate the effects of blood vessel invasion on both relapse-free survival (RFS) and overall survival (OS) for patients with NSCLC. Hazard ratios (HRs) with 95% confidence intervals (95% CIs) were used to assess the strength of this association.

Results

A total of 16,535 patients from 52 eligible studies were included in the systematic review and meta-analysis. Overall, blood vessel invasion was detected in 29.8% (median; range 6.2% to 77.0%) of patients with NSCLC. The pooled univariate and multivariate HR estimates for RFS were 3.28 (95% CI: 2.14–5.05; P<0.0001) and 3.98 (95% CI: 2.24–7.06; P<0.0001), respectively. For the analyses of blood vessel invasion and OS, the pooled HR estimate was 2.22 (95% CI: 1.93–2.56; P<0.0001) by univariate analysis and 1.90 (95% CI: 1.65–2.19; P<0.0001) by multivariate analysis. Furthermore, in stage I NSCLC patients, the meta-analytic risks of recurrence (HR = 6.93, 95% CI: 4.23–11.37, P<0.0001) and death (HR = 2.15, 95% CI: 1.68–2.75; P<0.0001) remained highly significant by multivariate analysis.

Conclusions

This study shows that blood vessel invasion appears to be an independent negative prognostic factor in surgically managed NSCLC. However, large, adequately designed prospective studies are warranted to confirm the present findings.

12.

Background

There are no published data on the national lifetime prevalence and treatment of mental disorders in the Arab region. Furthermore, the effect of war on the first onset of disorders has not previously been addressed at a national level, especially in the Arab region. The current study therefore investigates the lifetime prevalence, treatment, and age of onset of mental disorders, and their relationship to war, in Lebanon.

Methods and Findings

The Lebanese Evaluation of the Burden of Ailments and Needs Of the Nation study was carried out on a nationally representative sample of the Lebanese population (n = 2,857 adults). Respondents were interviewed using the fully structured WHO Composite International Diagnostic Interview 3.0. Lifetime prevalence of any Diagnostic and Statistical Manual of Mental Disorders, fourth edition (DSM-IV) disorder was 25.8%. Anxiety (16.7%) and mood (12.6%) were more common than impulse control (4.4%) and substance (2.2%) disorders. Only a minority of people with any mental disorder ever received professional treatment, with substantial delays (6 to 28 y) between the onset of disorders and onset of treatment. War exposure increased the risk of first onset of anxiety (odds ratio [OR] 5.92, 95% confidence interval [CI] 2.5–14.1), mood (OR 3.32, 95% CI 2.0–5.6), and impulse control disorders (OR 12.72, 95% CI 4.5–35.7).

Conclusions

About one-fourth of the sample (25.8%) met criteria for at least one of the DSM-IV disorders at some point in their lives. There is a substantial unmet need for early identification and treatment. Exposure to war events increases the odds of first onset of mental disorders.

13.

Background

The cardiopulmonary exercise test (CPX) is an affordable tool for risk prediction in patients with chronic heart failure (CHF). We aimed to determine the role of CPX parameters in predicting the risk of incidence of sustained ventricular arrhythmias (SVA) in CHF.

Methods

Sixty-one consecutive patients with CHF enrolled in the Daunia Heart Failure Registry underwent CPX and were followed for 327 ± 247 days. Clinical follow-up was performed every month and brought forward in case of re-hospitalisation for cardiac disease. The incidence of SVA was evaluated by direct clinical examination (ECG, ambulatory ECG).

Results

Patients with episodes of SVA (n = 14) showed lower values of peak VO2 and PetCO2, and higher values of VE/VCO2, VE/VCO2 slope, and VE%. After correction for age, gender, diabetes, ischaemic heart disease, and left ventricular ejection fraction, peak VO2 (hazard ratio (HR) 0.68, 95% confidence interval (CI) 0.51–0.91, p < 0.05), VE% (HR 1.38, 95% CI 1.04–1.84, p < 0.05), VE/VCO2 (HR 1.38, 95% CI 1.04–1.82, p < 0.05), VE/VCO2 slope (HR 1.77, 95% CI 1.31–2.39, p < 0.01), and PetCO2 (HR 0.66, 95% CI 0.50–0.88, p < 0.01) were found to be predictors of SVA. On Kaplan–Meier analysis, lower event-free rates were found in subjects with peak VO2 values below the median (log-rank p < 0.05), VE/VCO2 values above the mean (p < 0.05), higher VE/VCO2 slope tertiles (p < 0.05), and PetCO2 values below the median (p < 0.05).

Conclusions

CPX provides independent prognostic information on the risk of SVA in subjects with CHF.

14.
Xie X, Ma YT, Yang YN, Li XM, Liu F, Huang D, Fu ZY, Ma X, Chen BD, Huang Y. PLoS ONE 2010, 5(12): e15181

Background and Methodology

A low ankle-to-brachial index (ABI) is a strong correlate of cardiovascular disease and subsequent mortality, but the relationship between ABI and alcohol consumption remains unclear. Data are from the Cardiovascular Risk Survey (CRS), a multi-ethnic, community-based, cross-sectional study of 14,618 Chinese people (5,757 Han, 4,767 Uygur, and 4,094 Kazakh) aged 35 years and over, conducted from October 2007 to March 2010. The relationship between alcohol intake and ABI was determined using analysis of covariance and multivariable regression.

Principal Findings

In men, alcohol consumption was significantly associated with ABI (P<0.001). After adjustment for confounding factors such as age, sex, ethnicity, body mass index, smoking, work stress, diabetes, and fasting blood glucose, the difference remained significant (P<0.001). Both the unadjusted and the multivariate-adjusted odds ratios (ORs) for peripheral artery disease (PAD) were significantly higher in men who consumed >60.0 g/d (OR = 3.857, 95% CI: 2.555–5.824; OR = 2.797, 95% CI: 1.106–3.129; OR = 2.878, 95% CI: 1.215–4.018) and significantly lower in men who consumed 20.1–40.0 g/d (OR = 0.330, 95% CI: 0.181–0.599; OR = 0.484, 95% CI: 0.065–0.894; OR = 0.478, 95% CI: 0.243–1.534) or 40.1–60.0 g/d (OR = 0.306, 95% CI: 0.096–0.969; OR = 0.267, 95% CI: 0.087–0.886; OR = 0.203, 95% CI: 0.113–0.754), compared with never drinking (all P<0.01). In women, the association between ABI and alcohol consumption was significant in neither the unadjusted nor the multivariate-adjusted model (all P>0.05); similarly, PAD was not correlated with alcohol intake in women (all P>0.05).

Conclusions/Significance

Our results indicate that in Chinese men alcohol consumption is associated with peripheral artery disease: consumption of less than 60 g/d had an inverse association with peripheral atherosclerosis, whereas consumption of 60 g/d or more had a positive association.

15.

Background

The diagnosis of tuberculosis (TB) in resource-limited settings relies on Ziehl-Neelsen (ZN) smear microscopy. LED fluorescence microscopy (LED-FM) has many potential advantages over ZN smear microscopy, but requires evaluation in the field. The aim of this study was to assess the sensitivity/specificity of LED-FM for the diagnosis of pulmonary TB and whether its performance varies with the timing of specimen collection.

Methods and Findings

Adults with cough ≥2 wk were enrolled consecutively in Ethiopia, Nepal, Nigeria, and Yemen. Sputum specimens were examined by ZN smear microscopy and LED-FM and compared with culture as the reference standard. Specimens were collected using a spot-morning-spot (SMS) scheme (one on-the-spot smear at the health care facility on the first day of consultation, followed by a morning specimen collected at home and a second on-the-spot sample on the second day) or a spot-spot-morning (SSM) scheme (the first two smears collected on the spot on the first day, followed by a morning sample the next day), to explore whether SSM would identify similar numbers of smear-positive patients as SMS. In total, 529 (21.6%) culture-positive and 1,826 (74.6%) culture-negative patients were enrolled, of whom 1,156 (49%) submitted SSM specimens and 1,199 (51%) submitted SMS specimens. Single LED-FM smears had higher sensitivity but lower specificity than single ZN smears. Using two smears per patient, LED-FM and ZN were 72.8% (385/529, 95% CI 68.8%–76.5%) and 65.8% (348/529, 95% CI 61.6%–69.8%) sensitive (p<0.001) and 90.9% (1,660/1,826, 95% CI 89.5%–92.2%) and 98% (1,790/1,826, 95% CI 97.3%–98.6%) specific (p<0.001), respectively. Using three smears per patient, LED-FM and ZN were 77% (408/529, 95% CI 73.3%–80.6%) and 70.5% (373/529, 95% CI 66.4%–74.4%, p<0.001) sensitive and 88.1% (95% CI 86.5%–89.6%) and 96.5% (95% CI 96.8%–98.2%, p<0.001) specific, respectively. The sensitivity and specificity of ZN smear microscopy and LED-FM did not vary between SMS and SSM.
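The headline sensitivity and specificity figures can be reproduced directly from the reported counts against the culture reference standard. A minimal sketch follows; the normal-approximation (Wald) confidence interval is an assumption for illustration, since the abstract does not state which interval method the study used:

```python
import math

def prop_ci(k, n, z=1.96):
    """Proportion k/n with a Wald 95% CI (an assumed interval method)."""
    p = k / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, (p - half, p + half)

# counts reported in the abstract: two LED-FM smears per patient vs. culture
sens, sens_ci = prop_ci(385, 529)    # true positives / culture-positive patients
spec, spec_ci = prop_ci(1660, 1826)  # true negatives / culture-negative patients
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}")
# reproduces the reported 72.8% sensitivity and 90.9% specificity
```

The point estimates match the abstract exactly; the Wald intervals come close to, but need not coincide with, the published CIs.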

Conclusions

LED-FM had higher sensitivity but, in this study, lower specificity than ZN smear microscopy for diagnosis of pulmonary TB. Performance was independent of the scheme used for collecting specimens. The introduction of LED-FM needs to be accompanied by appropriate training, quality management, and monitoring of performance in the field.

Trial Registration

Current Controlled Trials ISRCTN53339491

16.

Background

Pesticide ingestion is a common method of self-harm in the rural developing world. In an attempt to reduce the high case fatality seen with the herbicide paraquat, a novel formulation (INTEON) has been developed containing an increased emetic concentration, a purgative, and an alginate that forms a gel under the acid conditions of the stomach, potentially slowing the absorption of paraquat and giving the emetic more time to be effective. We compared the outcome of paraquat self-poisoning with the standard formulation against the new INTEON formulation following its introduction into Sri Lanka.

Methods and Findings

Clinical data were prospectively collected on 586 patients with paraquat ingestion presenting to nine large hospitals across Sri Lanka, with survival to 3 mo as the primary outcome. The identity of the formulation ingested after October 2004 was confirmed by assay of blood or urine samples for a marker compound present in INTEON. The proportion of known survivors increased from 76/297 with the standard formulation to 103/289 with INTEON ingestion, and estimated 3-mo survival improved from 27.1% to 36.7% (difference 9.5%; 95% confidence interval [CI] 2.0%–17.1%; p = 0.002, log-rank test). Cox proportional hazards regression analyses showed an approximately 2-fold reduction in toxicity for INTEON compared to the standard formulation. A higher proportion of patients ingesting INTEON vomited within 15 min (55%, versus 38% with the standard formulation; p < 0.001). Median survival time increased from 2.3 d (95% CI 1.2–3.4 d) with the standard formulation to 6.9 d (95% CI 3.3–10.7 d) with INTEON ingestion (p = 0.002, log-rank test); however, in patients who did not survive there was a comparatively smaller increase in median time to death, from 0.9 d (interquartile range [IQR] 0.5–3.4) to 1.5 d (IQR 0.5–5.5) (p = 0.02).

Conclusions

This study showed that the INTEON formulation significantly reduced mortality in patients following paraquat ingestion and increased survival time, most likely by reducing absorption.

17.

Background

Prospective cohort studies have shown that high fruit and vegetable consumption is inversely associated with coronary heart disease (CHD). Whether food processing affects this association is unknown. Therefore, we quantified the association of fruit and vegetable consumption with 10-year CHD incidence in a population-based study in the Netherlands and the effect of processing on these associations.

Methods

This prospective population-based cohort study included 20,069 men and women aged 20 to 65 years, enrolled between 1993 and 1997 and free of cardiovascular disease at baseline. Diet was assessed using a validated 178-item food frequency questionnaire. Hazard ratios (HRs) for CHD incidence were calculated using multivariable Cox proportional hazards models.

Results

During a mean follow-up time of 10.5 y, 245 incident cases of CHD were documented, comprising 211 non-fatal acute myocardial infarctions and 34 fatal CHD events. The risk of incident CHD was 34% lower for participants with a high intake of total fruit and vegetables (>475 g/d; HR: 0.66; 95% CI: 0.45–0.99) compared to participants with a low total fruit and vegetable consumption (≤241 g/d). Intakes of raw fruit and vegetables (>262 g/d vs ≤92 g/d; HR: 0.70; 95% CI: 0.47–1.04) and of processed fruit and vegetables (>234 g/d vs ≤113 g/d; HR: 0.79; 95% CI: 0.54–1.16) were both inversely associated with CHD incidence.
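The "34% lower" figure is simply the hazard ratio re-expressed as a relative risk reduction; a one-line sketch of that conversion (the helper name is mine):

```python
def pct_lower_risk(hazard_ratio):
    """Express a hazard ratio below 1 as a percent reduction in risk."""
    return (1 - hazard_ratio) * 100

# HR 0.66 for high vs low total fruit and vegetable intake
print(round(pct_lower_risk(0.66)))  # 34
# HR 0.70 for raw fruit and vegetables
print(round(pct_lower_risk(0.70)))  # 30
```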

Conclusion

Higher consumption of fruit and vegetables, whether consumed raw or processed, may protect against CHD incidence.

18.

Introduction

Pre-temozolomide studies demonstrated that loss of the tumor suppressor gene PTEN held independent prognostic significance in patients with glioblastoma multiforme (GBM). We investigated whether loss of PTEN predicted shorter survival in the temozolomide era. The role of PTEN in the PI3K/Akt pathway is also reviewed.

Methods

Patients with histologically proven newly diagnosed GBM were identified from a retrospective database between 2007 and 2010. Cox proportional hazards analysis was used to calculate the independent effects of PTEN expression, age, extent of resection, Karnofsky performance scale (KPS), and treatment on overall survival.

Results

Sixty-five percent of patients were men, the median age was 63 years, and 70% had KPS ≥ 80. Most patients (81%) received standard treatment (temozolomide with concurrent radiation). A total of 72 (47%) patients had retained PTEN expression. Median overall survival (OS) was 19.1 months (95% CI: 15.0–22.5). Median survival was 20.0 months (95% CI: 15.0–25.5) in PTEN-retained patients and 18.2 months (95% CI: 13.0–25.7) in PTEN-loss patients (p = .71). PTEN-loss patients also had amplification of the EGFR gene more frequently than patients with retained PTEN (70.8% vs. 47.8%, p = .01). Multivariate analysis showed that older age (HR 1.64, CI: 1.02–2.63, p = .04), low KPS (HR 3.57, CI: 2.20–5.79, p < .0001), and lack of standard treatment (HR 3.98, CI: 2.38–6.65, p < .0001) were associated with worse survival. PTEN loss was not prognostic of overall survival (HR 1.31, CI: 0.85–2.03, p = .22).

Conclusions

Loss of PTEN expression does not confer poor overall survival in the temozolomide era. These findings imply a complex and non-linear molecular relationship between PTEN, its regulators, and its effectors in the tumorigenesis of glioblastoma. Additionally, there is evidence that temozolomide may be more effective in eradicating GBM cancer cells with PTEN loss and hence may level the outcomes between the PTEN-retained and PTEN-loss groups.

19.
Chang ML, Lin SM, Yeh CT. PLoS ONE 2011, 6(10): e26323

Background

Hepatoma up-regulated protein (HURP) is a component of the chromatin-dependent pathway for spindle assembly. We examined the prognostic predictive value of HURP in human hepatocellular carcinoma (HCC).

Methods

HURP expression was evaluated by immunocytochemistry of fine needle aspirated hepatoma cells in 97 HCC patients with Barcelona Clinic Liver Cancer (BCLC) stage A. Subsequently, these patients underwent partial hepatectomy (n = 18) or radiofrequency ablation (n = 79) and were followed for 2 to 35 months. The clinicopathological parameters were submitted for survival analysis.
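The survival analyses that follow rest on the Kaplan-Meier estimator, which multiplies conditional survival probabilities at each observed event time; a minimal sketch in plain Python, using hypothetical follow-up data rather than the study's:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimates.

    times  -- follow-up time for each patient
    events -- 1 if the event (e.g. recurrence) was observed, 0 if censored
    Returns a list of (time, survival probability) pairs at each event time.
    """
    at_risk = len(times)
    survival = 1.0
    curve = []
    for t in sorted(set(times)):
        # events at time t are counted before censoring at the same time
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        if d:
            survival *= 1 - d / at_risk
            curve.append((t, survival))
        at_risk -= sum(1 for ti in times if ti == t)
    return curve

# Five hypothetical patients followed 2-35 months (1 = relapse, 0 = censored)
print(kaplan_meier([2, 3, 3, 5, 7], [1, 1, 0, 1, 0]))
```

Censored patients drop out of the risk set without forcing the curve down, which is why the estimate differs from a crude proportion.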

Results

HURP expression in aspirated HCC cells was detected in 19.6% of patients. Kaplan-Meier survival analysis showed that positive HURP expression (P = 0.023), cytological grade ≥ 3 (P = 0.008), AFP ≥ 35 ng/mL (P = 0.039), bilirubin ≥ 1.3 mg/dL (P = 0.010), AST ≥ 50 U/L (P = 0.003), and ALT ≥ 35 U/L (P = 0.005) were all associated with shorter disease-free survival. A stepwise multivariate Cox proportional hazards model revealed that positive HURP expression (HR, 2.334; 95% CI, 1.165–4.679; P = 0.017), AST ≥ 50 U/L (HR, 3.697; 95% CI, 1.868–7.319; P < 0.001), cytological grade ≥ 3 (HR, 4.249; 95% CI, 2.061–8.759; P < 0.001), and tumor number > 1 (HR, 2.633; 95% CI, 1.212–5.722; P = 0.014) were independent predictors of disease-free survival. By combining the 4 independent predictors, patients with different risk scores (RS) showed distinguishable disease-free survival (RS ≤ 1 vs. RS = 2, P = 0.001; RS = 2 vs. RS = 3, P < 0.001). In contrast, patients could not be separated into prognostically distinguishable subgroups using the AJCC/UICC TNM staging system.
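The abstract does not spell out the scoring rule, but the RS groupings (≤1, 2, 3) are consistent with a simple count of the four dichotomized adverse predictors; a sketch under that assumption, with variable names and cutoffs taken from the abstract but the function itself hypothetical:

```python
def risk_score(hurp_positive, ast_u_per_l, cytologic_grade, tumor_number):
    """Count of the four independent adverse predictors (0-4),
    assuming each dichotomized factor contributes one point."""
    return sum([
        hurp_positive,          # positive HURP expression in aspirated cells
        ast_u_per_l >= 50,      # AST >= 50 U/L
        cytologic_grade >= 3,   # cytological grade >= 3
        tumor_number > 1,       # more than one tumor
    ])

# A patient with positive HURP, AST 62 U/L, grade 2, and a single tumor
print(risk_score(True, 62, 2, 1))  # 2
```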

Conclusion

HCC patients with BCLC stage A can be separated into three prognostically distinguishable groups by use of a risk score based upon HURP expression in aspirated HCC cells, AST, cytological grade, and tumor number.

20.
Common household chemicals and the allergy risks in pre-school age children

Background

The effect of indoor exposure to volatile organic compounds (VOCs) on the risk of allergic airway diseases in children remains unknown.

Objective

We examined residential concentrations of VOCs emitted from building materials, paints, furniture, and lifestyle practices, and their associations with the risks of multiple allergic diseases and IgE sensitization in pre-school-age children in Sweden.

Methods

In a case-control investigation (198 case children with asthma and allergy and 202 healthy controls), air samples were collected in the room where the child slept. The air samples were analyzed for the levels of eight classes of VOCs.

Results

A natural-log unit of summed propylene glycol and glycol ethers (PGEs) in bedroom air (equal to the interquartile range, 3.43–15.65 µg/m³) was associated with a 1.5-fold greater likelihood of being a case (95% CI, 1.1–2.1), a 1.5-fold greater likelihood of asthma (95% CI, 1.0–2.3), a 2.8-fold greater likelihood of rhinitis (95% CI, 1.6–4.7), and a 1.6-fold greater likelihood of eczema (95% CI, 1.1–2.3), accounting for gender, secondhand smoke, allergies in both parents, wet cleaning with chemical agents, construction period of the building, limonene, cat and dog allergens, butyl benzyl phthalate (BBzP), and di(2-ethylhexyl) phthalate (DEHP). When the analysis was restricted to the cases, the same unit concentration was associated with a 1.8-fold greater likelihood of IgE sensitization (95% CI, 1.1–2.8) compared to the non-IgE-sensitized cases. No similar associations were found for the other classes of VOCs.
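Because the exposure is modeled on a natural-log scale, the odds ratio compounds with concentration rather than adding; a sketch of scaling the reported OR of 1.5 per ln-unit across the quoted concentration range (the helper is mine, and note that 3.43–15.65 µg/m³ actually spans about 1.5 ln-units, so the per-IQR statement in the abstract is approximate):

```python
import math

def scale_or(or_per_ln_unit, conc_low, conc_high):
    """Odds ratio implied by moving from conc_low to conc_high when the
    model coefficient is expressed per natural-log unit of exposure."""
    ln_units = math.log(conc_high / conc_low)
    return or_per_ln_unit ** ln_units

# OR 1.5 per ln-unit of summed PGEs, scaled across 3.43-15.65 ug/m3
print(round(scale_or(1.5, 3.43, 15.65), 2))
```

Across that range the implied odds ratio is roughly 1.85, illustrating why the scale on which an exposure coefficient is reported matters for interpretation.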

Conclusion

We propose the novel hypothesis that PGEs in indoor air exacerbate and/or induce multiple allergic conditions, namely asthma, rhinitis, and eczema, as well as IgE sensitization.
