Similar Articles: 20 results found.
1.

Background

Birth size, perhaps a proxy for prenatal environment, might be a correlate of subsequent breast cancer risk, but findings from epidemiological studies have been inconsistent. We re-analysed individual participant data from published and unpublished studies to obtain more precise estimates of the magnitude and shape of the birth size–breast cancer association.

Methods and Findings

Studies were identified through computer-assisted and manual searches, and personal communication with investigators. Individual participant data from 32 studies, comprising 22,058 breast cancer cases, were obtained. Random-effects models were used, where appropriate, to combine study-specific estimates of effect. Birth weight was positively associated with breast cancer risk in studies based on birth records (pooled relative risk [RR] per one standard deviation [SD] [= 0.5 kg] increment in birth weight: 1.06; 95% confidence interval [CI] 1.02–1.09) and parental recall when the participants were children (1.02; 95% CI 0.99–1.05), but not in those based on adult self-reports or maternal recall during the woman's adulthood (0.98; 95% CI 0.95–1.01) (p for heterogeneity between data sources = 0.003). Relative to women who weighed 3.000–3.499 kg, the risk was 0.96 (95% CI 0.80–1.16) in those who weighed < 2.500 kg, and 1.12 (95% CI 1.00–1.25) in those who weighed ≥ 4.000 kg (p for linear trend = 0.001) in birth record data. Birth length and head circumference from birth records were also positively associated with breast cancer risk (pooled RR per one SD increment: 1.06 [95% CI 1.03–1.10] and 1.09 [95% CI 1.03–1.15], respectively). Simultaneous adjustment for these three birth size variables showed that length was the strongest independent predictor of risk. The birth size effects did not appear to be confounded or mediated by established breast cancer risk factors and were not modified by age or menopausal status. The cumulative incidence of breast cancer per 100 women by age 80 y in the study populations was estimated to be 10.0, 10.0, 10.4, and 11.5 in those who were, respectively, in the bottom, second, third, and top fourths of the birth length distribution.
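The random-effects pooling of study-specific estimates described above can be sketched with the DerSimonian–Laird method; the study-level log relative risks and standard errors below are invented for illustration, not taken from the paper.

```python
import math

def pool_random_effects(estimates, ses):
    """DerSimonian-Laird random-effects pooling of log relative risks.
    `estimates` are study-level log(RR) values, `ses` their standard errors."""
    w = [1 / se**2 for se in ses]  # fixed-effect (inverse-variance) weights
    sum_w = sum(w)
    fixed = sum(wi * e for wi, e in zip(w, estimates)) / sum_w
    # Cochran's Q and the method-of-moments between-study variance tau^2
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, estimates))
    df = len(estimates) - 1
    c = sum_w - sum(wi**2 for wi in w) / sum_w
    tau2 = max(0.0, (q - df) / c)
    # random-effects weights incorporate tau^2
    w_re = [1 / (se**2 + tau2) for se in ses]
    pooled = sum(wi * e for wi, e in zip(w_re, estimates)) / sum(w_re)
    se_pooled = math.sqrt(1 / sum(w_re))
    return pooled, se_pooled

# hypothetical study-level log(RR) per 1-SD birth weight increment
log_rrs = [0.08, 0.03, 0.06, 0.10]
ses = [0.03, 0.02, 0.04, 0.05]
est, se = pool_random_effects(log_rrs, ses)
lo, hi = est - 1.96 * se, est + 1.96 * se
print(f"pooled RR = {math.exp(est):.2f} "
      f"(95% CI {math.exp(lo):.2f}-{math.exp(hi):.2f})")
```

When the between-study heterogeneity estimate tau^2 is zero, this reduces to an ordinary fixed-effect inverse-variance pool.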

Conclusions

This pooled analysis of individual participant data is consistent with birth size, and in particular birth length, being an independent correlate of breast cancer risk in adulthood.

2.

Background

Hip and knee replacements are among the most frequently performed surgical procedures in the world. Resurfacing of the hip and unicondylar knee replacement are increasingly being used. There is relatively little evidence on their performance. To study the performance of joint replacement in England, we investigated revision rates in the first 3 y after hip or knee replacement according to prosthesis type.

Methods and Findings

We linked records of the National Joint Registry for England and Wales and the Hospital Episode Statistics for patients with a primary hip or knee replacement in the National Health Service in England between April 2003 and September 2006. Hospital Episode Statistics records of succeeding admissions were used to identify revisions for any reason. 76,576 patients with a primary hip replacement and 80,697 with a primary knee replacement were included (51% of all primary hip and knee replacements done in the English National Health Service). In hip patients, 3-y revision rates were 0.9% (95% confidence interval [CI] 0.8%–1.1%) with cemented, 2.0% (95% CI 1.7%–2.3%) with cementless, 1.5% (95% CI 1.1%–2.0%) with “hybrid” prostheses, and 2.6% (95% CI 2.1%–3.1%) with hip resurfacing (p < 0.0001). Revision rates after hip resurfacing were increased especially in women. In knee patients, 3-y revision rates were 1.4% (95% CI 1.2%–1.5%) with cemented, 1.5% (95% CI 1.1%–2.1%) with cementless, and 2.8% (95% CI 1.8%–4.5%) with unicondylar prostheses (p < 0.0001). Revision rates after knee replacement strongly decreased with age.

Interpretation

Overall, about one in 75 patients needed a revision of their prosthesis within 3 y. On the basis of our data, consideration should be given to using hip resurfacing only in male patients and unicondylar knee replacement only in elderly patients.

3.

Background

Multidrug-resistant Plasmodium vivax (Pv) is widespread in eastern Indonesia, and emerging elsewhere in Asia-Pacific and South America, but is generally regarded as a benign disease. The aim of the study was to review the spectrum of disease associated with malaria due to Pv and P. falciparum (Pf) in patients presenting to a hospital in Timika, southern Papua, Indonesia.

Methods and Findings

Data were prospectively collected from all patients attending the outpatient and inpatient departments of the only hospital in the region using systematic data forms and hospital computerised records. Between January 2004 and December 2007, clinical malaria was present in 16% (60,226/373,450) of hospital outpatients and 32% (12,171/37,800) of inpatients. Among patients admitted with slide-confirmed malaria, 64% had Pf, 24% Pv, and 10.5% mixed infections. The proportion of malarial admissions attributable to Pv rose to 47% (415/887) in children under 1 y of age. Severe disease was present in 2,634 (22%) inpatients with malaria, with the risk greater among Pv infections (23% [675/2,937]) than Pf infections (20% [1,570/7,817]; odds ratio [OR] = 1.19 [95% confidence interval (CI) 1.08–1.32], p = 0.001), and greatest in patients with mixed infections (31% [389/1,273]); overall p < 0.0001. Severe anaemia (haemoglobin < 5 g/dl) was the major complication associated with Pv, accounting for 87% (589/675) of severe disease, compared to 73% (1,144/1,570) of severe manifestations with Pf (p < 0.001). Pure Pv infection was also present in 78 patients with respiratory distress and 42 patients with coma. In total, 242 (2.0%) patients with malaria died during admission: 2.2% (167/7,722) with Pf, 1.6% (46/2,916) with Pv, and 2.3% (29/1,260) with mixed infections (p = 0.126).

Conclusions

In this region with established high-grade chloroquine resistance to both Pv and Pf, Pv is associated with severe and fatal malaria, particularly in young children. The epidemiology of P. vivax needs to be re-examined in other areas where chloroquine resistance is increasing.

4.

Background

Severe malaria (SM) is classically associated with Plasmodium falciparum infection. Little information is available on the contribution of P. vivax to severe disease. There are some epidemiological indications that P. vivax or mixed infections protect against complications and deaths. A large morbidity surveillance study conducted in an area where the four species coexist allowed us to estimate rates of SM among patients infected with one or several species.

Methods and Findings

This was a prospective cohort study conducted within the framework of the Malaria Vaccine Epidemiology and Evaluation Project. All presumptive malaria cases presenting at two rural health facilities over an 8-y period were investigated with history taking, clinical examination, and laboratory assessment. Case definition of SM was based on the World Health Organization (WHO) criteria adapted for the setting (i.e., clinical diagnosis of malaria associated with asexual blood stage parasitaemia and recent history of fits, or coma, or respiratory distress, or anaemia [haemoglobin < 5 g/dl]). Out of 17,201 presumptive malaria cases, 9,537 (55%) had a confirmed Plasmodium parasitaemia. Among those, 6.2% (95% confidence interval [CI] 5.7%–6.8%) fulfilled the case definition of SM, most of them in children <5 y. In this age group, the proportion of SM was 11.7% (10.4%–13.2%) for P. falciparum, 8.8% (7.1%–10.7%) for P. vivax, and 17.3% (11.7%–24.2%) for mixed P. falciparum and P. vivax infections. P. vivax SM presented more often with respiratory distress than did P. falciparum (60% versus 41%, p = 0.002), but less often with anaemia (19% versus 41%, p = 0.0001).
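As a rough arithmetic check, the reported severe-malaria proportion of 6.2% (95% CI 5.7%–6.8%) among 9,537 confirmed cases is close to a simple normal-approximation binomial interval; the count of 591 severe cases is back-calculated from the percentages, and the paper's exact interval method may differ.

```python
import math

def prop_ci(k, n, z=1.96):
    """Normal-approximation 95% CI for a proportion k/n."""
    p = k / n
    se = math.sqrt(p * (1 - p) / n)  # binomial standard error
    return p, p - z * se, p + z * se

p, lo, hi = prop_ci(591, 9537)  # 591/9537 is approximately 6.2%
print(f"{p:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```

For proportions this far from 0 and 1 with n in the thousands, the normal approximation is essentially indistinguishable from exact methods.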

Conclusion

P. vivax monoinfections as well as mixed Plasmodium infections are associated with SM. There is no indication that mixed infections protect against SM. Interventions targeted toward P. falciparum only might be insufficient to eliminate the overall malaria burden, and especially severe disease, in areas where P. falciparum and P. vivax coexist.

5.
Lo WT, Wang CC, Lin WJ, Wang SR, Teng CS, Huang CF, Chen SJ. PLoS ONE 2010;5(12):e15791

Background

Staphylococcus aureus is an important cause of infection, particularly in persons colonized with this organism. This study compared the annual prevalence and microbiological characteristics of methicillin-resistant S. aureus (MRSA) nasal colonization in Taiwanese children from 2004 through 2009. Risk factors for MRSA were determined for the overall study period.

Methods

Children from birth to ≤14 years of age presenting for health maintenance visits or attending 1 of 57 kindergartens were recruited. Nasal swabs were obtained, and a questionnaire was administered. The prevalence and microbiological characteristics of MRSA colonization were also calculated for two 3-year periods: 2004–2006 and 2007–2009.

Results

Cultures of the anterior nares were positive for S. aureus in 824 (25.8%) of the 3,200 children, and MRSA colonization was found in 371 (11.6%) children. The prevalence of S. aureus colonization decreased from 28.1% in 2004–2006 to 23.3% in 2007–2009 (p<0.01), whereas the prevalence of MRSA colonization increased from 8.1% to 15.1% during this period (p<0.0001). Multivariate analysis revealed that the independent risk factors for MRSA carriage were different for male and female children, and also among age groups. Most MRSA isolates belonged to sequence type 59 (ST59) (86.3%); however, a multiresistant MRSA clone with ST338 background emerged in 2007–2009. Ten (62.5%) of the 16 MRSA isolates expressed the genotypic profile ST338/staphylococcal cassette chromosome mec VT/Panton-Valentine leukocidin-positive/staphylococcal enterotoxin B-positive, and differed only in their antimicrobial susceptibility patterns.

Conclusion

The prevalence of nasal colonization by MRSA increased among healthy Taiwanese children from 2004–2006 to 2007–2009, despite an overall decrease in the prevalence of nasal colonization by S. aureus. A multiresistant MRSA clone characterized as ST338 was identified from these children.

6.

Background

Despite guidelines establishing the need to perform comprehensive paediatric drug development programs, pivotal trials in children with epilepsy have been completed mostly in Phase IV as a postapproval replication of adult data. However, it has been shown that the treatment response in children can differ from that in adults. It has not been investigated whether differences in drug effect between adults and children might occur in the treatment of drug-resistant partial epilepsy, although such differences may have a substantial impact on the design and results of paediatric randomised controlled trials (RCTs).

Methods and Findings

Three electronic databases were searched for RCTs investigating any antiepileptic drug (AED) in the add-on treatment of drug-resistant partial epilepsy in both children and adults. The treatment effect was compared between the two age groups using the ratio of the relative risk (RR) of the 50% responder rate between active AED treatment and placebo groups, as well as meta-regression. Differences in the response to placebo and to active treatment were investigated using logistic regression. A comparable approach was used for analysing secondary endpoints, including seizure-free rate, total and adverse event-related withdrawal rates, and withdrawal rate for seizure aggravation. Five AEDs were evaluated in both adults and children with drug-resistant partial epilepsy in 32 RCTs. The treatment effect was significantly lower in children than in adults (RR ratio: 0.67 [95% confidence interval (CI) 0.51–0.89]; p = 0.02 by meta-regression). This difference was related to an age-dependent variation in the response to placebo, with a higher rate in children than in adults (19% versus 9.9%, p < 0.001), whereas no significant difference was observed in the response to active treatment (37.2% versus 30.4%, p = 0.364). The relative risk of the total withdrawal rate was also significantly lower in children than in adults (RR ratio: 0.65 [95% CI 0.43–0.98], p = 0.004 by meta-regression), due to a higher withdrawal rate for seizure aggravation in children (5.6%) than in adults (0.7%) receiving placebo (p < 0.001). Finally, there was no significant difference in the seizure-free rate between adult and paediatric studies.

Conclusions

Children with drug-resistant partial epilepsy receiving placebo in double-blind RCTs demonstrated a significantly greater 50% responder rate than adults, probably reflecting increased placebo and regression-to-the-mean effects. Paediatric clinical trial designs should account for these age-dependent variations in the response to placebo to reduce the risk of an underestimated sample size that could result in falsely negative trials.

7.

Background

Loss-of-function variants in the gene encoding filaggrin (FLG) are major determinants of eczema. We hypothesized that weakening of the physical barrier in FLG-deficient individuals may potentiate the effect of environmental exposures. Therefore, we investigated whether there is an interaction between FLG loss-of-function mutations with environmental exposures (pets and dust mites) in relation to the development of eczema.

Methods and Findings

We used data obtained in early life in a high-risk birth cohort in Denmark and replicated the findings in an unselected birth cohort in the United Kingdom. The primary outcome was age of onset of eczema; environmental exposures included pet ownership and mite and pet allergen levels. In Copenhagen (n = 379), FLG mutation increased the risk of eczema during the first year of life (hazard ratio [HR] 2.26, 95% confidence interval [CI] 1.27–4.00, p = 0.005), with a further increase in risk related to cat exposure at birth amongst children with FLG mutation (HR 11.11, 95% CI 3.79–32.60, p < 0.0001); dog exposure was moderately protective (HR 0.49, 95% CI 0.24–1.01, p = 0.05), but not related to FLG genotype. In Manchester (n = 503), an independent and significant association between FLG genotype and the development of eczema by age 12 mo was confirmed (HR 1.95, 95% CI 1.13–3.36, p = 0.02). In addition, the risk increased because of the interaction of cat ownership at birth and FLG genotype (HR 3.82, 95% CI 1.35–10.81, p = 0.01), with no significant effect of the interaction with dog ownership (HR 0.59, 95% CI 0.16–2.20, p = 0.43). Mite allergen had no effect in either cohort. The observed effects were independent of sensitisation.

Conclusions

We have demonstrated a significant interaction between the two most common FLG loss-of-function mutations (R501X and 2282del4) and cat ownership at birth on the development of early-life eczema in two independent birth cohorts. Our data suggest that cat but not dog ownership substantially increases the risk of eczema within the first year of life in children with FLG loss-of-function variants, but not amongst those without. FLG-deficient individuals may need to avoid cats but not dogs in early life.

8.

Background

Little is known about the long-term impact of the killing of a parent in childhood or adolescence during war on distress and disability in young adulthood. This study assessed current prevalence rates of mental disorders and levels of dysfunction among young adults who had lost their father due to war-related violence in childhood or adolescence.

Methods

179 bereaved young adults and 175 non-bereaved young adults were interviewed a decade after experiencing the war in Kosovo. Prevalence rates of Major Depressive Episode (MDE), anxiety, and substance use disorders, and current suicide risk were assessed using the Mini–International Neuropsychiatric Interview. The syndrome of Prolonged Grief Disorder (PGD) was assessed with the Prolonged Grief Disorder Interview (PG-13). Somatic symptoms were measured with the Patient Health Questionnaire. General health distress was assessed with the General Health Questionnaire.

Findings

Bereaved participants were significantly more likely to suffer from either MDE or any anxiety disorder than non-bereaved participants (58.7% vs. 40%). Among bereaved participants, 39.7% met criteria for Post-Traumatic Stress Disorder, 34.6% for PGD, and 22.3% for MDE. Bereaved participants with PGD were more likely to suffer from MDE, any anxiety disorder, or current suicide risk than bereaved participants without PGD. Furthermore, these participants reported significantly greater physical distress than bereaved participants without PGD.

Conclusion

War-related loss during middle childhood and adolescence presents a significant risk for adverse mental health and dysfunction in young adulthood, over and above exposure to other war-related traumatic events. Furthermore, the syndrome of PGD can help to identify those with the greatest degree of distress and dysfunction.

9.

Objectives

To use electronic health records to assess the comorbidity burden of Autism Spectrum Disorder (ASD) in children and young adults.

Study Design

A retrospective prevalence study was performed using a distributed query system across three general hospitals and one pediatric hospital. Over 14,000 individuals under age 35 with ASD were characterized by their comorbidities, and conversely, the prevalence of ASD within these comorbidities was measured. The comorbidity prevalence of the younger (age <18 years) and older (age 18–34 years) individuals with ASD was compared.

Results

19.44% of ASD patients had epilepsy, compared to 2.19% in the overall hospital population (95% confidence interval for difference in percentages 13.58%–14.69%); 2.43% of ASD patients had schizophrenia vs. 0.24% in the hospital population (95% CI 1.89%–2.39%); inflammatory bowel disease (IBD) 0.83% vs. 0.54% (95% CI 0.13%–0.43%); bowel disorders (without IBD) 11.74% vs. 4.5% (95% CI 5.72%–6.68%); CNS/cranial anomalies 12.45% vs. 1.19% (95% CI 9.41%–10.38%); diabetes mellitus type I (DM1) 0.79% vs. 0.34% (95% CI 0.3%–0.6%); muscular dystrophy 0.47% vs. 0.05% (95% CI 0.26%–0.49%); sleep disorders 1.12% vs. 0.14% (95% CI 0.79%–1.14%). Autoimmune disorders (excluding DM1 and IBD) were not significantly different at 0.67% vs. 0.68% (95% CI −0.14% to 0.13%). Three of the studied comorbidities increased significantly when comparing ages 0–17 vs. 18–34 (p < 0.001): schizophrenia (1.43% vs. 8.76%), diabetes mellitus type I (0.67% vs. 2.08%), and IBD (0.68% vs. 1.99%), whereas sleep disorders, bowel disorders (without IBD), and epilepsy did not change significantly.

Conclusions

The comorbidities of ASD encompass disease states that are significantly overrepresented in ASD even with respect to the patient populations of tertiary health centers. This burden of comorbidities goes well beyond those routinely managed in developmental medicine centers and requires broad multidisciplinary management that payors and providers will have to plan for.

10.

Background

The provision of highly active antiretroviral therapy (HAART) in resource-limited settings follows a public health approach, which is characterised by a limited number of regimens and the standardisation of clinical and laboratory monitoring. In industrialised countries, doctors prescribe from the full range of available antiretroviral drugs, supported by resistance testing and frequent laboratory monitoring. We compared virologic response, changes to first-line regimens, and mortality in HIV-infected patients starting HAART in South Africa and Switzerland.

Methods and Findings

We analysed data from the Swiss HIV Cohort Study and two HAART programmes in townships of Cape Town, South Africa. We included treatment-naïve patients aged 16 y or older who had started treatment with at least three drugs since 2001, and excluded intravenous drug users. Data from a total of 2,348 patients from South Africa and 1,016 patients from the Swiss HIV Cohort Study were analysed. Median baseline CD4+ T cell counts were 80 cells/μl in South Africa and 204 cells/μl in Switzerland. In South Africa, patients started with one of four first-line regimens, which was subsequently changed in 514 patients (22%). In Switzerland, 36 first-line regimens were used initially, and these were changed in 539 patients (53%). In most patients HIV-1 RNA was suppressed to 500 copies/ml or less within one year: 96% (95% confidence interval [CI] 95%–97%) in South Africa and 96% (94%–97%) in Switzerland, and 26% (22%–29%) and 27% (24%–31%), respectively, developed viral rebound within two years. Mortality was higher in South Africa than in Switzerland during the first months of HAART: adjusted hazard ratios were 5.90 (95% CI 1.81–19.2) during months 1–3 and 1.77 (0.90–3.50) during months 4–24.

Conclusions

Compared to the highly individualised approach in Switzerland, programmatic HAART in South Africa resulted in similar virologic outcomes, with relatively few changes to initial regimens. Further innovation and resources are required in South Africa to both achieve more timely access to HAART and improve the prognosis of patients who start HAART with advanced disease.

11.

Background

The relevance to coronary heart disease (CHD) of cytokines that govern inflammatory cascades, such as interleukin-6 (IL-6), may be underestimated because such mediators are short-acting and prone to fluctuations. We evaluated associations of long-term circulating IL-6 levels with CHD risk (defined as nonfatal myocardial infarction [MI] or fatal CHD) in two population-based cohorts, involving serial measurements to enable correction for within-person variability. We updated a systematic review to put the new findings in context.

Methods and Findings

Measurements were made in samples obtained at baseline from 2,138 patients who had a first-ever nonfatal MI or died of CHD during follow-up, and from 4,267 controls in two cohorts comprising 24,230 participants. Correction for within-person variability was made using data from repeat measurements taken several years apart in several hundred participants. The year-to-year variability of IL-6 values within individuals was relatively high (regression dilution ratios of 0.41, 95% confidence interval [CI] 0.28–0.53, over 4 y, and 0.35, 95% CI 0.23–0.48, over 12 y). Ignoring this variability, we found an odds ratio for CHD, adjusted for several established risk factors, of 1.46 (95% CI 1.29–1.65) per 2 standard deviation (SD) increase of baseline IL-6 values, similar to that for baseline C-reactive protein. After correction for within-person variability, the odds ratio for CHD was 2.14 (95% CI 1.45–3.15) with long-term average (“usual”) IL-6, similar to those for some established risk factors. Increasing IL-6 levels were associated with progressively increasing CHD risk. An updated systematic review of electronic databases and other sources identified 15 relevant previous population-based prospective studies of IL-6 and clinical coronary outcomes (i.e., MI or coronary death). Including the two current studies, the 17 available prospective studies gave a combined odds ratio of 1.61 (95% CI 1.42–1.83) per 2 SD increase in baseline IL-6 (corresponding to an odds ratio of 3.34 [95% CI 2.45–4.56] per 2 SD increase in usual [long-term average] IL-6 levels).
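The correction for within-person variability described above amounts to dividing the log odds ratio by the regression dilution ratio (RDR). A minimal sketch follows, with a combined RDR of 0.50 assumed purely for illustration (the paper reports separate RDRs of 0.41 over 4 y and 0.35 over 12 y, and its actual correction procedure may differ):

```python
import math

# Regression dilution correction: the "usual-level" (long-term average)
# log-OR is the baseline log-OR divided by the regression dilution ratio
# estimated from repeat measurements taken years apart.
or_baseline = 1.46  # reported OR per 2-SD increase in baseline IL-6
rdr = 0.50          # assumed combined RDR, for illustration only
or_usual = math.exp(math.log(or_baseline) / rdr)
print(f"usual-level OR ~ {or_usual:.2f}")  # close to the reported 2.14
```

Because the RDR is below 1, ignoring within-person variability always biases the association toward the null, which is why the corrected odds ratio is substantially larger than the baseline one.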

Conclusions

Long-term IL-6 levels are associated with CHD risk about as strongly as are some major established risk factors, but causality remains uncertain. These findings highlight the potential relevance of IL-6–mediated pathways to CHD.

12.

Background

Soil-transmitted helminth (STH) infections (i.e., Ascaris lumbricoides, hookworm, and Trichuris trichiura) affect more than a billion people. Preventive chemotherapy (i.e., repeated administration of anthelmintic drugs to at-risk populations), is the mainstay of control. This strategy, however, does not prevent reinfection. We performed a systematic review and meta-analysis to assess patterns and dynamics of STH reinfection after drug treatment.

Methodology

We systematically searched PubMed, ISI Web of Science, EMBASE, Cochrane Database of Systematic Reviews, China National Knowledge Infrastructure, WanFang Database, Chinese Scientific Journal Database, and Google Scholar. Information on study year, country, sample size, age of participants, diagnostic method, drug administration strategy, prevalence and intensity of infection pre- and posttreatment, cure and egg reduction rate, evaluation period posttreatment, and adherence was extracted. Pooled risk ratios from random-effects models were used to assess the risk of STH reinfection after treatment. Our protocol is available on PROSPERO, registration number: CRD42011001678.

Principal Findings

From 154 studies identified, 51 were included; 24 provided STH infection rates pre- and posttreatment, whereas 42 reported determinants of predisposition to reinfection. At 3, 6, and 12 months posttreatment, A. lumbricoides prevalence reached 26% (95% confidence interval [CI] 16–43%), 68% (95% CI 60–76%), and 94% (95% CI 88–100%) of pretreatment levels, respectively. For T. trichiura, the respective reinfection prevalences were 36% (95% CI 28–47%), 67% (95% CI 42–100%), and 82% (95% CI 62–100%), and for hookworm, 30% (95% CI 26–34%), 55% (95% CI 34–87%), and 57% (95% CI 49–67%). Prevalence and intensity of reinfection were positively correlated with pretreatment infection status.

Conclusion

STH reinfections occur rapidly after treatment, particularly for A. lumbricoides and T. trichiura. Hence, there is a need for frequent anthelmintic drug administrations to maximize the benefit of preventive chemotherapy. Integrated control approaches emphasizing health education and environmental sanitation are needed to interrupt transmission of STH.

13.

Background

Individuals' expectations on returning to work after an injury have been shown to predict the duration of time that a person with work-related low back pain will remain on benefits; individuals with lower recovery expectations received benefits for a longer time than those with higher expectations. The role of expectations in recovery from traumatic neck pain, in particular whiplash-associated disorders (WAD), has not been assessed to date to our knowledge. The aim of this study was to investigate whether expectations for recovery are a prognostic factor after experiencing a WAD.

Methods and Findings

We used a prospective cohort study composed of insurance claimants in Sweden. The participants were car occupants who filed a neck injury claim (i.e., for WAD) to one of two insurance companies between 15 January 2004 and 12 January 2005 (n = 1,032). Postal questionnaires were completed shortly (average 23 d) after the collision and then again 6 mo later. Expectations for recovery were measured with a numerical rating scale (NRS) at baseline, where 0 corresponds to “unlikely to make a full recovery” and 10 to “very likely to make a full recovery.” The scale was reverse coded and trichotomised into NRS 0, 1–4, and 5–10. The main outcome measure was self-perceived disability at 6 mo postinjury, measured with the Pain Disability Index, and categorised into no/low, moderate, and high disability. Multivariable polytomous logistic regression was used for the analysis. There was a dose–response relationship between recovery expectations and disability. After controlling for severity of physical and mental symptoms, individuals who stated that they were less likely to make a full recovery (NRS 5–10) were more likely to have a high disability compared to individuals who stated that they were very likely to make a full recovery (odds ratio [OR] 4.2 [95% confidence interval (CI) 2.1 to 8.5]). For the intermediate category (NRS 1–4), the OR was 2.1 (95% CI 1.2 to 3.2). Associations between expectations and disability were also found among individuals with moderate disability.

Conclusions

Individuals' expectations for recovery are important in prognosis, even after controlling for symptom severity. Interventions designed to increase patients' expectations may be beneficial and should be examined further in controlled studies.

14.

Background

Neonatal hypothyroidism has been associated in animal models with maternal exposure to several environmental contaminants; however, evidence for such an association in humans is inconsistent. We evaluated whether maternal exposure to 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD), a persistent and widespread toxic environmental contaminant, is associated with modified neonatal thyroid function in a large, highly exposed population in Seveso, Italy.

Methods and Findings

Between 1994 and 2005, we conducted two studies in individuals exposed to TCDD after the 1976 Seveso accident: (i) a residence-based population study on 1,014 children born to the 1,772 women of reproductive age in the most contaminated zones (A, very high contamination; B, high contamination) and to 1,772 age-matched women from the surrounding noncontaminated (reference) area; (ii) a biomarker study on 51 mother–child pairs for whom recent maternal plasma dioxin measurements were available. Neonatal blood thyroid-stimulating hormone (b-TSH) was measured in all children. We performed crude and multivariate analyses adjusting for gender, birth weight, birth order, maternal age, hospital, and type of delivery. Mean neonatal b-TSH was 0.98 μU/ml (95% confidence interval [CI] 0.90–1.08) in the reference area (n = 533), 1.35 μU/ml (95% CI 1.22–1.49) in zone B (n = 425), and 1.66 μU/ml (95% CI 1.19–2.31) in zone A (n = 56) (p < 0.001). The proportion of children with b-TSH > 5 μU/ml was 2.8% in the reference area, 4.9% in zone B, and 16.1% in zone A (p < 0.001). Neonatal b-TSH was correlated with current maternal plasma TCDD (n = 51, β = 0.47, p < 0.001) and with plasma toxic equivalents of coplanar dioxin-like compounds (n = 51, β = 0.45, p = 0.005).

Conclusions

Our data indicate that environmental contaminants such as dioxins can modify neonatal thyroid function long after the initial exposure.

15.

Background

Idiopathic pulmonary fibrosis (IPF) is a chronic progressive fibrotic lung disease associated with substantial morbidity and mortality. The objective of this study was to determine whether there is a peripheral blood protein signature in IPF and whether components of this signature may serve as biomarkers for disease presence and progression.

Methods and Findings

We analyzed the concentrations of 49 proteins in the plasma of 74 patients with IPF and in the plasma of 53 control individuals. We identified a combinatorial signature of five proteins—MMP7, MMP1, MMP8, IGFBP1, and TNFRSF1A—that was sufficient to distinguish patients from controls with a sensitivity of 98.6% (95% confidence interval [CI] 92.7%–100%) and specificity of 98.1% (95% CI 89.9%–100%). Increases in MMP1 and MMP7 were also observed in lung tissue and bronchoalveolar lavage fluid obtained from IPF patients. MMP7 and MMP1 plasma concentrations were not increased in patients with chronic obstructive pulmonary disease or sarcoidosis and distinguished IPF compared to subacute/chronic hypersensitivity pneumonitis, a disease that may mimic IPF, with a sensitivity of 96.3% (95% CI 81.0%–100%) and specificity of 87.2% (95% CI 72.6%–95.7%). We verified our results in an independent validation cohort composed of patients with IPF, familial pulmonary fibrosis, subclinical interstitial lung disease (ILD), as well as with control individuals. MMP7 and MMP1 concentrations were significantly higher in IPF patients compared to controls in this cohort. Furthermore, MMP7 concentrations were elevated in patients with subclinical ILD and negatively correlated with percent predicted forced vital capacity (FVC%) and percent predicted carbon monoxide diffusing capacity (DLCO%).

Conclusions

Our experiments provide, to our knowledge, the first evidence of a peripheral blood protein signature in IPF. The two main components of this signature, MMP7 and MMP1, are overexpressed in the lung microenvironment and distinguish IPF from other chronic lung diseases. Additionally, increased MMP7 concentration may indicate asymptomatic ILD and reflect disease progression.

16.

Background

Large state tobacco control programs have been shown to reduce smoking and would be expected to affect health care costs. We investigate the effect of California''s large-scale tobacco control program on aggregate personal health care expenditures in the state.

Methods and Findings

Cointegrating regressions were used to model (1) the difference in per capita cigarette consumption between California and 38 control states as a function of the difference in cumulative expenditures of the California and control state tobacco control programs, and (2) the relationship between the difference in cigarette consumption and the difference in per capita personal health expenditures between the control states and California, from 1980 to 2004. Between 1989 (when the program started) and 2004, the California program was associated with $86 billion (2004 US dollars; 95% confidence interval [CI] $28 billion to $151 billion) lower health care expenditures than would have been expected without the program. This reduction grew over time, reaching 7.3% (95% CI 2.7%–12.1%) of total health care expenditures in 2004.
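The two-equation design can be illustrated on synthetic data. This is a toy ordinary-least-squares stand-in for the study's cointegrating regressions: the series, coefficients, and the "savings" figure below are all made up for illustration and do not reproduce the study's estimates:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic yearly series, 1980-2004 (25 points), all expressed as
# California-minus-control differences (illustrative values only).
cum_program = np.concatenate([np.zeros(9), np.linspace(0.1, 3.0, 16)])  # program starts in year 9

# Equation 1: consumption difference as a function of cumulative program spending
consumption = -2.0 * cum_program + rng.normal(0, 0.1, 25)
X1 = np.column_stack([np.ones(25), cum_program])
b1, *_ = np.linalg.lstsq(X1, consumption, rcond=None)

# Equation 2: expenditure difference as a function of the consumption difference
expenditure = 0.5 * consumption + rng.normal(0, 0.05, 25)
X2 = np.column_stack([np.ones(25), consumption])
b2, *_ = np.linalg.lstsq(X2, expenditure, rcond=None)

# Program-associated expenditure change: chain the two fitted equations
# and sum the predicted differences over the program years.
predicted = b2[0] + b2[1] * (b1[0] + b1[1] * cum_program)
savings = predicted[9:].sum()   # negative = lower expenditures with the program
```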

Conclusions

A strong tobacco control program is associated not only with reduced smoking, but also with reduced health care expenditures.

18.

Background

Virus-specific CD8+ T lymphocytes play a key role in the initial reduction of peak viremia during acute viral infections, but display signs of increasing dysfunction and exhaustion under conditions of chronic antigen persistence. It has been suggested that virus-specific CD8+ T cells with a “polyfunctional” profile, defined by the capacity to secrete multiple cytokines or chemokines, are most competent in controlling viral replication in chronic HIV-1 infection. We used HIV-1 infection as a model of chronic persistent viral infection to investigate the process of exhaustion and dysfunction of virus-specific CD8+ T cell responses at the single-epitope level over time, starting in primary HIV-1 infection.

Methods and Findings

We longitudinally analyzed the polyfunctional epitope-specific CD8+ T cell responses of 18 patients during primary HIV-1 infection, before and after therapy initiation or sequence variation in the targeted epitope. Epitope-specific CD8+ T cells responded with multiple effector functions to antigenic stimulation during primary HIV-1 infection, but lost their polyfunctional capacity in response to antigen and up-regulated programmed death 1 (PD-1) expression with persistent viremic infection. This exhausted phenotype became significantly less prevalent upon removal of antigenic stimulation, either after antiretroviral therapy or after a reduction in epitope-specific antigen load in the presence of ongoing viral replication, the latter a consequence of in vivo selection of cytotoxic T lymphocyte escape mutations in the respective epitopes. Monofunctionality increased in CD8+ T cell responses directed against conserved epitopes, from 49% (95% confidence interval 27%–72%) to 76% (56%–95%) (standard deviation [SD] of the effect size 0.71), while it remained stable or decreased slightly for responses directed against escaped epitopes, from 61% (47%–75%) to 56% (42%–70%) (SD of the effect size 0.18) (p < 0.05).

Conclusion

These data suggest that persistence of antigen can be the cause, rather than the consequence, of the functional impairment of virus-specific T cell responses observed during chronic HIV-1 infection, and underscore the importance of evaluating autologous viral sequences in studies aimed at investigating the relationship between virus-specific immunity and associated pathogenesis.

19.

Background

Epidemiologic data on malaria are scant in many high-burden countries, including the Democratic Republic of the Congo (DRC), which bears the world's second-highest malaria burden. Malaria control efforts in regions with challenging infrastructure require reproducible and efficient surveillance. We employed new high-throughput molecular testing to characterize the state of malaria control in the DRC and to estimate childhood mortality attributable to excess malaria transmission.

Methods and Findings

The 2007 Demographic and Health Survey was a cross-sectional, population-based cluster household survey of adults aged 15–59 years that employed structured questionnaires and dried blood spot collection. Parasitemia was detected by real-time PCR, and survey responses measured adoption of malaria control measures and under-5 health indices. The response rate was 99% at the household level; 8,886 households were surveyed in 300 clusters, and molecular results were available from 8,838 respondents. The overall prevalence of parasitemia was 33.5% (95% confidence interval [CI] 32–34.9); P. falciparum was the most prevalent species, either as monoinfection (90.4%; 95% CI 88.8–92.1) or combined with P. malariae (4.9%; 95% CI 3.7–5.9) or P. ovale (0.6%; 95% CI 0.1–0.9). Only 7.7% (95% CI 6.8–8.6) of households with children under 5 owned an insecticide-treated bednet (ITN), and only 6.8% (95% CI 6.1–7.5) of under-fives slept under an ITN the preceding night. The overall under-5 mortality rate was 147 deaths per 1,000 live births (95% CI 141–153) and, across clusters, was associated with increased P. falciparum prevalence; based on the population attributable fraction, 26,488 yearly under-5 deaths were attributable to excess malaria transmission.
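The attributable-deaths estimate rests on a population attributable fraction (PAF). A sketch using Levin's formula; the relative risk and the death total below are placeholders, since the abstract does not report the inputs it used:

```python
def attributable_fraction(prevalence, rr):
    # Levin's formula: PAF = p*(RR - 1) / (1 + p*(RR - 1))
    x = prevalence * (rr - 1.0)
    return x / (1.0 + x)

# Illustrative inputs only (not the study's): the survey's parasitemia
# prevalence, and a hypothetical mortality relative risk for exposure.
paf = attributable_fraction(prevalence=0.335, rr=1.5)

# Attributable deaths = PAF x total deaths; 100,000 is a placeholder total.
attributable_deaths = paf * 100_000
```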

Conclusions

Adult P. falciparum prevalence is substantial in the DRC and is associated with under-5 mortality. Molecular testing offers a new, generalizable, and efficient approach to characterizing malaria endemicity in underserved countries.

20.

Background

An outbreak of chikungunya virus affected over one-third of the population of La Réunion Island between March 2005 and December 2006. In June 2005, we identified the first case of mother-to-child chikungunya virus transmission at the Groupe Hospitalier Sud-Réunion level-3 maternity department. The goal of this prospective study was to characterize the epidemiological, clinical, biological, and radiological features and outcomes of all the cases of vertically transmitted chikungunya infections recorded at our institution during this outbreak.

Methods and Findings

Over 22 mo, 7,504 women delivered 7,629 viable neonates; 678 (9.0%) of these parturient women were infected (positive RT-PCR or IgM serology) in the antepartum period, and 61 (0.8%) in the pre- or intrapartum period. With the exception of three early fetal deaths, vertical transmission was observed exclusively in near-term deliveries (median duration of gestation: 38 wk, range 35–40 wk) in the context of intrapartum viremia (19 cases of vertical transmission among 39 women with intrapartum viremia; prevalence rate 0.25%, vertical transmission rate 48.7%). Cesarean section had no protective effect on transmission. All infected neonates were asymptomatic at birth, and the median onset of neonatal disease was 4 d (range 3–7 d). Pain, prostration, and fever were present in 100% of cases and thrombocytopenia in 89%. Severe illness was observed in ten cases (52.6%) and mainly consisted of encephalopathy (n = 9; 90%). These nine children had pathologic MRI findings (brain swelling, n = 9; cerebral hemorrhages, n = 2), and four evolved towards persistent disabilities.
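The rates reported above follow directly from the abstract's own counts:

```python
# All counts are taken from the abstract.
transmissions   = 19    # vertically infected neonates
viremic_mothers = 39    # women with intrapartum viremia
neonates        = 7629  # viable neonates delivered over 22 mo
severe          = 10    # severely ill infected neonates

vertical_rate = 100 * transmissions / viremic_mothers  # 48.7%
prevalence    = 100 * transmissions / neonates         # 0.25% of all births
severe_rate   = 100 * severe / transmissions           # 52.6% of infected
```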

Conclusions

Mother-to-child chikungunya virus transmission is frequent in the context of intrapartum maternal viremia, and often leads to severe neonatal infection. Chikungunya represents a substantial risk for neonates born to viremic parturients that should be taken into account by clinicians and public health authorities in the event of a chikungunya outbreak.
