Similar Articles
20 similar articles found.
1.

Background

Perinatal hypoxia-ischemia is a major cause of mortality and cerebral morbidity, and using oxygen during newborn resuscitation may further harm the brain. The aim was to examine how supplementary oxygen used for newborn resuscitation influences early brain tissue injury, cell death and repair processes, and the regulation of genes related to apoptosis, neurodegeneration and neuroprotection.

Methods and Findings

Anesthetized newborn piglets were subjected to global hypoxia and then randomly assigned to resuscitation with 21%, 40% or 100% O2 for 30 min and followed for 9 h. An additional group received 100% O2 for 30 min without preceding hypoxia. The left hemisphere was used for histopathology and immunohistochemistry, and the right hemisphere for in situ zymography in the corpus striatum; gene expression and the activity of various relevant biofactors were measured in the frontal cortex. Net matrix metalloproteinase gelatinolytic activity in the corpus striatum was increased in piglets resuscitated with 100% oxygen vs. 21%. Hematoxylin-eosin (HE) staining revealed no significant changes. Nine hours after oxygen-assisted resuscitation, caspase-3 expression and activity were increased by 30–40% in the 100% O2 group (n = 9/10) vs. the 21% O2 group (n = 10; p<0.04), whereas brain-derived neurotrophic factor (BDNF) activity was decreased by 65% (p<0.03).

Conclusions

The use of 100% oxygen for resuscitation resulted in increased potentially harmful proteolytic activity and attenuated BDNF activity compared with 21% oxygen. Although there was no significant short-term cell loss, hyperoxia appears to cause an early imbalance between neuroprotective and neurotoxic mechanisms that might compromise the final pathological outcome.

2.

Background

Understanding the role of different classes of T cells during HIV infection is critical to determining which responses correlate with protective immunity. To date, it is unclear whether alterations in regulatory T cell (Treg) function contribute to the progression of HIV infection.

Methodology

FOXP3 expression was measured by both qRT-PCR and by flow cytometry in HIV-infected individuals and uninfected controls together with expression of CD25, GITR and CTLA-4. Cultured peripheral blood mononuclear cells were stimulated with anti-CD3 and cell proliferation was assessed by CFSE dilution.

Principal Findings

HIV-infected individuals had significantly higher frequencies of CD4+FOXP3+ T cells (median 8.11%; range 1.33%–26.27%) than healthy controls (median 3.72%; range 1.3%–7.5%; P = 0.002), despite having lower absolute counts of CD4+FOXP3+ T cells. There was a significant positive correlation between the frequency of CD4+FOXP3+ T cells and viral load (rho = 0.593, P = 0.003) and a significant negative correlation with CD4 count (rho = −0.423, P = 0.044). Forty-eight percent of our patients had CD4 counts below 200 cells/µl, and these patients showed a marked elevation of FOXP3 percentage (median 10%; range 4.07%–26.27%). Assessing the mechanism of the increased FOXP3 frequency, we found that the high FOXP3 levels noted in HIV-infected individuals dropped rapidly under unstimulated culture conditions but could be restored by T cell receptor stimulation. This suggests that the high FOXP3 expression in HIV-infected patients is likely due to FOXP3 upregulation by individual CD4+ T cells following antigenic or other stimulation.
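
For illustration only, a minimal Python sketch of the rank-correlation analysis reported above (Spearman's rho between CD4+FOXP3+ frequency, viral load, and CD4 count); the arrays are invented placeholders, not the study data:

    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(0)
    n = 23
    log_viral_load = rng.uniform(2, 6, n)                      # log10 copies/ml (simulated)
    foxp3_pct = 2 + 3 * log_viral_load + rng.normal(0, 3, n)   # % CD4+FOXP3+ cells (simulated)
    cd4_count = 800 - 120 * log_viral_load + rng.normal(0, 100, n)

    rho_vl, p_vl = spearmanr(foxp3_pct, log_viral_load)        # expect positive rho
    rho_cd4, p_cd4 = spearmanr(foxp3_pct, cd4_count)           # expect negative rho
    print(f"FOXP3 vs viral load: rho = {rho_vl:.3f}, p = {p_vl:.3g}")
    print(f"FOXP3 vs CD4 count:  rho = {rho_cd4:.3f}, p = {p_cd4:.3g}")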

Conclusions/Significance

FOXP3 expression in the CD4+ T cell population is a marker of severity of HIV infection and a potential prognostic marker of disease progression.

3.

Background

Sepsis likely contributes to the high burden of infectious disease morbidity and mortality in low-income countries, but data regarding sepsis management in sub-Saharan Africa are limited. We conducted a prospective observational study of severely septic patients in two Ugandan hospitals, describing their epidemiology, management, and clinical correlates of mortality.

Methodology/Results

Three hundred eighty-two patients fulfilled enrollment criteria for a severe sepsis syndrome. Vital signs, management and laboratory results were recorded. Outcomes measured included in-hospital and post-discharge mortality. Most patients were HIV-infected (320/377, 84.9%), with a median CD4+ T cell (CD4) count of 52 cells/mm3 (IQR, 16–131 cells/mm3). Overall mortality was 43.0%, with 23.7% in-hospital mortality (90/380) and 22.3% post-discharge mortality (55/247). Significant predictors of in-hospital mortality included admission Glasgow Coma Scale and Karnofsky Performance Scale (KPS) scores, tachypnea, leukocytosis and thrombocytopenia. Discharge KPS and early fluid resuscitation were significant predictors of post-discharge mortality. Among HIV-infected patients, CD4 count was a significant predictor of post-discharge mortality. Median volume of fluid resuscitation within the first 6 hours of presentation was 500 mL (IQR 250–1000 mL). Fifty-two different empiric antibacterial regimens were used during the study. Bacteremic patients were more likely to die in hospital than non-bacteremic patients (OR 1.83, 95% CI = 1.01–3.33). Patients with Mycobacterium tuberculosis (MTB) bacteremia (25/249) had higher in-hospital mortality (OR 1.97, 95% CI = 1.19–3.27) and lower median CD4 counts (p = 0.001) than patients without MTB bacteremia.
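
As a sketch of how unadjusted odds ratios like those above can be derived, the snippet below computes the OR for in-hospital death by bacteremia status from a 2×2 table; the cell counts are hypothetical, chosen only to give an OR of similar magnitude:

    import numpy as np
    from statsmodels.stats.contingency_tables import Table2x2

    #                  died  survived
    table = np.array([[ 40,   80],    # bacteremic (hypothetical counts)
                      [ 50,  210]])   # non-bacteremic (hypothetical counts)

    t = Table2x2(table)
    lcb, ucb = t.oddsratio_confint()  # 95% CI by default
    print(f"OR = {t.oddsratio:.2f} (95% CI {lcb:.2f}-{ucb:.2f})")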

Conclusion

Patients presenting with sepsis syndromes to two Ugandan hospitals had late-stage HIV infection and high mortality. Bacteremia, especially with MTB, was associated with increased in-hospital mortality. Most clinical predictors of in-hospital mortality were easily measurable and can be used to triage patients in resource-constrained settings. Procurement of low-cost, high-impact treatments such as intravenous fluids and empiric antibiotics may help decrease sepsis-associated mortality in resource-constrained settings.

4.

Background

We aimed to determine the effect of resistance exercise intensity (% of one-repetition maximum; 1RM) and volume on muscle protein synthesis, anabolic signaling, and myogenic gene expression.

Methodology/Principal Findings

Fifteen men (21±1 years; BMI = 24.1±0.8 kg/m2) performed 4 sets of unilateral leg extension exercise at different exercise loads and/or volumes: 90% of 1RM until volitional failure (90FAIL), 30% of 1RM work-matched to 90FAIL (30WM), or 30% of 1RM performed until volitional failure (30FAIL). Infusion of [ring-13C6]phenylalanine with biopsies was used to measure rates of mixed (MIX), myofibrillar (MYO), and sarcoplasmic (SARC) protein synthesis at rest and at 4 h and 24 h after exercise. Exercise at 30WM induced a significant increase above rest in MIX (121%) and MYO (87%) protein synthesis at 4 h post-exercise, but at 24 h in MIX only. The increase in the rate of protein synthesis in MIX and MYO at 4 h post-exercise with 90FAIL and 30FAIL was greater than with 30WM, with no difference between these two conditions; however, MYO remained elevated (199%) above rest at 24 h only in 30FAIL. There was a significant increase in Akt (Ser473) phosphorylation at 24 h in all conditions (P = 0.023) and in mTOR (Ser2448) phosphorylation at 4 h post-exercise (P = 0.025). Phosphorylation of Erk1/2 (Tyr202/204), p70S6K (Thr389), and 4E-BP1 (Thr37/46) increased significantly (P<0.05) only in the 30FAIL condition at 4 h post-exercise, whereas 4E-BP1 (Thr37/46) phosphorylation was greater 24 h after exercise than at rest in both the 90FAIL (237%) and 30FAIL (312%) conditions. Pax7 mRNA expression increased at 24 h post-exercise (P = 0.02) regardless of condition. MyoD and myogenin mRNA expression was consistently elevated in the 30FAIL condition.

Conclusions/Significance

These results suggest that low-load, high-volume resistance exercise is more effective at inducing acute muscle anabolism than either high-load, low-volume or work-matched resistance exercise.

5.

Introduction

Although oxygen is essential for the wound-healing process, tissue hypoxia is known to stimulate angiogenesis. To explore these seemingly inconsistent findings, we evaluated the influence of the oxygen environment on wound healing using a model of our own design.

Methods

Experiment 1 (Establishment of the model): To modify topical oxygen tension, oxygen-impermeable (polyvinylidene chloride) and oxygen-permeable (polymethylpentene) membranes were applied to symmetrical excisional wounds in ddY mice (n = 6). Oxygen tension under the membrane was quantified with a device based on a photo-quenching technique. Experiment 2 (Influence of the oxygen environment on wound healing): Wound area, granulation thickness and vascular density were analyzed under the different oxygen environments (n = 24).

Results

Experiment 1: The permeable group maintained an oxygen level equivalent to that of the atmosphere (114.1±29.8 mmHg on day 7), while the impermeable group showed extremely low oxygen tension (5.72±2.99 mmHg on day 7). Accordingly, the groups were defined as the normoxia group and the hypoxia group, respectively. Experiment 2: Wound closure was significantly enhanced in the normoxia group (remaining wound area 11.1±1.66% on day 7) compared with the hypoxia group (27.6±3.47% on day 7). The normoxia group showed significantly thicker granulation tissue than the hypoxia group (491.8±243.2 vs. 295.3±180.9 µm). Conversely, vascular density in the hypoxia group was significantly increased on day 7 (0.046±0.025 vs. 0.011±0.008 mm2/mm2).

Conclusions

Our model successfully controlled the local oxygen concentration around the wound; the hypoxic wounds showed increased angiogenesis but less granulation tissue and delayed wound closure. The enhanced neovascularization in the hypoxia group likely reflects a compensatory response to an insufficient ambient oxygen supply.

6.

Background

We have previously shown that angiopoietin-like 4 (angptl4) mRNA, the product of a hypoxia-inducible gene, is highly expressed in clear cell renal cell carcinoma (ccRCC), the most common subtype of RCC, for which no specific marker is available. Here, in a large and comprehensive retrospective series, we investigated whether (1) angptl4 mRNA could be a useful diagnostic and/or prognostic marker of ccRCC and (2) its induction is dependent on the VHL status of tumors.

Methodology/Principal Findings

Using in situ hybridization, we report that angptl4 mRNA was expressed in 100% of both sporadic (n = 102) and inherited (n = 6) primary ccRCCs, with no statistical association with nuclear grade (p = 0.39), tumor size (p = 0.09), stage grouping (p = 0.17), progression-free survival (p = 0.94), or overall survival (p = 0.80). Angptl4 mRNA was also expressed in 26 (87%) of 30 secondary ccRCCs, but in none of the other secondary RCCs (n = 7). In contrast, angptl4 mRNA was not expressed in 94% of non-ccRCC renal tumors (papillary RCCs (n = 46), chromophobe RCCs (n = 28), and oncocytomas (n = 9)), nor in any non-renal clear cell carcinoma (n = 39). Angptl4 expression was also examined in tumors associated (n = 23) or not associated (n = 66) with VHL disease: 40 (98%) of the hemangioblastomas expressed angptl4, whereas all pheochromocytomas (n = 23) and pancreatic tumors (n = 25) were angptl4-negative, whatever their VHL status.

Conclusions/Significance

Angptl4 mRNA expression was highly associated with ccRCC (p = 1.5×10−49, chi-square test), allowing its expression to be defined as a diagnostic marker for primary ccRCC. Moreover, angptl4 mRNA makes it possible to discriminate the renal origin of metastases among clear cell carcinomas arising from various organs. Finally, inactivation of the VHL gene is neither necessary nor sufficient for angptl4 mRNA induction.
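
A minimal sketch of a chi-square test of association between angptl4 expression and ccRCC status; the contingency counts are rough approximations reconstructed from the percentages above and are illustrative only:

    import numpy as np
    from scipy.stats import chi2_contingency

    #                  angptl4+  angptl4-
    table = np.array([[108,   0],    # primary ccRCC (sporadic + inherited)
                      [  5, 117]])   # non-ccRCC renal and non-renal clear cell tumors (approx.)

    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2e}")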

7.

Background

Reduced saturated fat (SFA) consumption is recommended to reduce coronary heart disease (CHD) risk, but strong supporting evidence from randomized controlled trials (RCTs) of clinical CHD events is lacking, and few guidelines focus on any specific replacement nutrient. Additionally, some public health groups recommend lowering or limiting polyunsaturated fat (PUFA) consumption, a major potential replacement for SFA.

Methods and Findings

We systematically investigated and quantified the effects of increased PUFA consumption, as a replacement for SFA, on CHD endpoints in RCTs. RCTs were identified by systematic searches of multiple online databases through June 2009, searches of grey literature sources, hand-searching of related articles and citations, and direct contacts with experts to identify potentially unpublished trials. Studies were included if they randomized participants to increased PUFA for at least 1 year without major concomitant interventions, had an appropriate control group, and reported the incidence of CHD (myocardial infarction and/or cardiac death). Inclusions/exclusions were adjudicated and data were extracted independently and in duplicate by two investigators; extracted data included population characteristics, control and intervention diets, follow-up duration, types of events, risk ratios, and SEs. Pooled effects were calculated using inverse-variance-weighted random-effects meta-analysis. From 346 identified abstracts, eight trials met the inclusion criteria, totaling 13,614 participants with 1,042 CHD events. Average weighted PUFA consumption was 14.9% of energy (range 8.0%–20.7%) in the intervention groups versus 5.0% of energy (range 4.0%–6.4%) in the controls. The overall pooled risk reduction was 19% (RR = 0.81, 95% confidence interval [CI] 0.70–0.95, p = 0.008), corresponding to a 10% reduction in CHD risk (RR = 0.90, 95% CI 0.83–0.97) for each 5% of energy from increased PUFA, without evidence of statistical heterogeneity (Q-statistic p = 0.13; I2 = 37%). Meta-regression identified study duration as an independent determinant of risk reduction (p = 0.017), with studies of longer duration showing greater benefits.
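
For readers who want the mechanics, here is a minimal sketch of inverse-variance-weighted random-effects pooling (the DerSimonian-Laird estimator); the per-trial log risk ratios and standard errors are invented placeholders, not the actual trial data:

    import numpy as np

    # Hypothetical per-trial log risk ratios and standard errors (8 trials)
    log_rr = np.array([-0.35, -0.10, -0.25, 0.05, -0.30, -0.15, -0.20, -0.05])
    se = np.array([0.20, 0.15, 0.25, 0.18, 0.22, 0.16, 0.19, 0.21])

    w = 1.0 / se**2                                # fixed-effect (inverse-variance) weights
    mu_fe = np.average(log_rr, weights=w)
    q = np.sum(w * (log_rr - mu_fe)**2)            # Cochran's Q
    df = len(log_rr) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                  # between-study variance (DL estimator)

    w_re = 1.0 / (se**2 + tau2)                    # random-effects weights
    pooled = np.average(log_rr, weights=w_re)
    pooled_se = np.sqrt(1.0 / np.sum(w_re))
    i2 = max(0.0, (q - df) / q) * 100              # I^2 heterogeneity statistic

    lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
    print(f"pooled RR = {np.exp(pooled):.2f} "
          f"(95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f}), I2 = {i2:.0f}%")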

Conclusions

These findings provide evidence from RCTs that consuming PUFA in place of SFA reduces CHD events. This suggests that, rather than trying to lower PUFA consumption, a shift toward greater population PUFA consumption in place of SFA would significantly reduce rates of CHD.

8.

Background

Global programs of anti-HIV treatment depend on sustained laboratory capacity to assess treatment initiation thresholds and treatment response over time. Currently, there is no valid alternative to CD4 count testing for monitoring immunologic responses to treatment, but laboratory cost and capacity limit access to CD4 testing in resource-constrained settings. Thus, methods to prioritize patients for CD4 count testing could improve treatment monitoring by optimizing resource allocation.

Methods and Findings

Using a prospective cohort of HIV-infected patients (n = 1,956) monitored from antiretroviral therapy initiation in seven clinical sites with distinct geographical and socio-economic settings, we retrospectively applied a novel prediction-based classification (PBC) modeling method. The model uses repeatedly measured biomarkers (white blood cell count and lymphocyte percent) to predict the CD4+ T cell outcome through first-stage modeling and subsequent classification based on clinically relevant thresholds (CD4+ T cell counts of 200 or 350 cells/µl). The algorithm correctly classified 90% (cross-validation estimate = 91.5%, standard deviation [SD] = 4.5%) of CD4 count measurements <200 cells/µl in the first year of follow-up; if laboratory testing is applied only to patients predicted to be below the 200-cells/µl threshold, we estimate a potential savings of 54.3% (SD = 4.2%) in CD4 testing capacity. A capacity savings of 34% (SD = 3.9%) is predicted using a CD4 threshold of 350 cells/µl. Similar results were obtained over the 3 y of follow-up available (n = 619). Limitations include the need for a future economic analysis of healthcare outcomes, assessment of extensibility beyond the 3-y observation time, and the need to assign a false-positive threshold.
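
The triage logic can be illustrated with a simplified sketch: a first-stage model predicts CD4 count from inexpensive markers, and only patients predicted below the threshold are sent for laboratory CD4 testing. The data and the single-visit linear model below are stand-ins; the published PBC method uses repeated measurements and a more elaborate first stage:

    import numpy as np

    rng = np.random.default_rng(1)
    n = 1000
    wbc = rng.normal(5.5, 1.5, n)                       # white blood cell count (simulated)
    lymph_pct = rng.uniform(10, 50, n)                  # lymphocyte percent (simulated)
    cd4 = 2.0 * wbc * lymph_pct + rng.normal(0, 60, n)  # toy ground-truth relationship

    # First-stage prediction model (ordinary least squares)
    X = np.column_stack([np.ones(n), wbc, lymph_pct, wbc * lymph_pct])
    beta, *_ = np.linalg.lstsq(X, cd4, rcond=None)
    cd4_hat = X @ beta

    threshold = 200
    flagged = cd4_hat < threshold                       # triaged to confirmatory lab testing
    truly_low = cd4 < threshold
    sensitivity = (flagged & truly_low).sum() / truly_low.sum()
    savings = 1.0 - flagged.mean()                      # fraction of CD4 tests avoided
    print(f"sensitivity for CD4<{threshold}: {sensitivity:.1%}, "
          f"testing capacity saved: {savings:.1%}")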

Conclusions

Our results support the use of PBC modeling as a triage point at the laboratory, lessening the need for laboratory-based CD4+ T cell count testing; implementation of this tool could help optimize the use of laboratory resources, directing CD4 testing toward higher-risk patients. However, further prospective studies and economic analyses are needed to demonstrate that the PBC model can be effectively applied in clinical settings.

9.

Background

Alcoholism is associated with susceptibility to infectious disease, particularly bacterial pneumonia. In the present study we describe the characteristics of alcoholic patients with bacterial meningitis and delineate the differences from findings in non-alcoholic adults with bacterial meningitis.

Methods/Principal Findings

This was a prospective nationwide observational cohort study including patients aged >16 years with bacterial meningitis confirmed by culture of cerebrospinal fluid (696 episodes of bacterial meningitis occurring in 671 patients). Alcoholism was present in 27 of 686 recorded episodes (4%), and alcoholics were more often male than non-alcoholics (82% vs 48%, P = 0.001). A higher proportion of alcoholics had underlying pneumonia (41% vs 11%, P<0.001). Alcoholics were more likely to have meningitis due to infection with Streptococcus pneumoniae (70% vs 50%, P = 0.01) or Listeria monocytogenes (19% vs 4%, P = 0.005), whereas Neisseria meningitidis was more common in non-alcoholic patients (39% vs 4%, P = 0.01). A larger proportion of alcoholics than non-alcoholics developed complications during the clinical course (82% vs 62%, P = 0.04), often cardiorespiratory failure (52% vs 28%, P = 0.01). Alcoholic patients were also at higher risk of an unfavourable outcome (67% vs 33%, P<0.001).

Conclusions/Significance

Alcoholic patients with bacterial meningitis are at high risk of complications, resulting in high morbidity and mortality. They are especially at risk of cardiorespiratory failure due to underlying pneumonia, and aggressive supportive care may therefore be crucial in the treatment of these patients.

10.

Background

Nevirapine (NVP) is widely used in antiretroviral treatment (ART) of HIV-1 globally. The primary objective of the A5208/OCTANE trial was to compare the efficacy of NVP-based versus lopinavir/ritonavir (LPV/r)-based initial ART.

Methods and Findings

In seven African countries (Botswana, Kenya, Malawi, South Africa, Uganda, Zambia, and Zimbabwe), 500 antiretroviral-naïve HIV-infected women with CD4 <200 cells/mm3 were enrolled into a two-arm randomized trial to initiate open-label ART with tenofovir (TDF)/emtricitabine (FTC) once daily plus either NVP (n = 249) or LPV/r (n = 251) twice daily, and followed for ≥48 weeks. The primary endpoint was time from randomization to death or confirmed virologic failure (VF; plasma HIV RNA <1 log10 below baseline 12 weeks after treatment initiation, or ≥400 copies/ml at or after 24 weeks), with comparison between treatments based on hazard ratios (HRs) in an intention-to-treat analysis. Equivalence of the randomized treatments was defined as a 95% CI for the HR for virologic failure or death lying within the range 0.5 to 2.0. Median baseline characteristics were: age 34 years, CD4 count 121 cells/mm3, and HIV RNA 5.2 log10 copies/ml. Median follow-up was 118 weeks; 29 (6%) women were lost to follow-up. Forty-two women (37 VFs, five deaths; 17%) in the NVP arm and 50 (43 VFs, seven deaths; 20%) in the LPV/r arm reached the primary endpoint (HR 0.85, 95% CI 0.56–1.29). During initial assigned treatment, 14% and 16% of women receiving NVP and LPV/r, respectively, experienced grade 3/4 signs/symptoms, and 26% and 22% experienced grade 3/4 laboratory abnormalities. However, 35 (14%) women discontinued NVP because of adverse events, most within the first 8 weeks, versus none for LPV/r (p<0.001). VF, death, or permanent treatment discontinuation occurred in 80 (32%) women in the NVP arm and 54 (22%) in the LPV/r arm (HR = 1.7, 95% CI 1.2–2.4), with the difference primarily due to more treatment discontinuation in the NVP arm. Thirteen (45%) of 29 women tested in the NVP arm versus six (15%) of 40 in the LPV/r arm had a drug resistance mutation at the time of VF.

Conclusions

Initial ART with NVP+TDF/FTC demonstrated virologic efficacy equivalent to that of LPV/r+TDF/FTC, but higher rates of treatment discontinuation and new drug resistance, in antiretroviral-naïve women with CD4 <200 cells/mm3.

Trial registration

ClinicalTrials.gov NCT00089505

11.

Background

We have previously shown that multiple genetic loci identified by genome-wide association studies (GWAS) increase the susceptibility to obesity in a cumulative manner. It is, however, not known whether and to what extent this genetic susceptibility may be attenuated by a physically active lifestyle. We aimed to assess the influence of a physically active lifestyle on the genetic predisposition to obesity in a large population-based study.

Methods and Findings

We genotyped 12 SNPs in obesity-susceptibility loci in a population-based sample of 20,430 individuals (aged 39–79 y) from the European Prospective Investigation of Cancer (EPIC)-Norfolk cohort with an average follow-up period of 3.6 y. A genetic predisposition score was calculated for each individual by adding the body mass index (BMI)-increasing alleles across the 12 SNPs. Physical activity was assessed using a self-administered questionnaire. Linear and logistic regression models were used to examine main effects of the genetic predisposition score and its interaction with physical activity on BMI/obesity risk and BMI change over time, assuming an additive effect for each additional BMI-increasing allele carried. Each additional BMI-increasing allele was associated with a 0.154 (standard error [SE] 0.012) kg/m2 (p = 6.73×10−37) increase in BMI (equivalent to 445 g in body weight for a person 1.70 m tall). This association was significantly (p interaction = 0.005) more pronounced in inactive people (0.205 [SE 0.024] kg/m2 [p = 3.62×10−18; 592 g in weight]) than in active people (0.131 [SE 0.014] kg/m2 [p = 7.97×10−21; 379 g in weight]). Similarly, each additional BMI-increasing allele increased the risk of obesity 1.116-fold (95% confidence interval [CI] 1.093–1.139, p = 3.37×10−26) in the whole population, but significantly (p interaction = 0.015) more so in inactive individuals (odds ratio [OR] = 1.158 [95% CI 1.118–1.199; p = 1.93×10−16]) than in active individuals (OR = 1.095 [95% CI 1.068–1.123; p = 1.15×10−12]). Consistent with the cross-sectional observations, physical activity modified the association between the genetic predisposition score and change in BMI during follow-up (p interaction = 0.028).
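
A condensed sketch of the genetic predisposition score and the score-by-activity interaction test on BMI, using simulated data in place of the EPIC-Norfolk cohort; the effect sizes are assumptions chosen only to mirror the abstract:

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    n = 20000
    score = rng.binomial(2, 0.3, (n, 12)).sum(axis=1)   # BMI-increasing alleles across 12 SNPs
    active = rng.binomial(1, 0.7, n)                    # 1 = physically active (simulated)
    # Simulated BMI: per-allele effect 0.205 kg/m2 in inactive, 0.131 in active
    bmi = 24 + 0.205 * score - 0.074 * score * active + rng.normal(0, 3.5, n)

    X = sm.add_constant(np.column_stack([score, active, score * active]))
    fit = sm.OLS(bmi, X).fit()
    print(f"per-allele effect, inactive: {fit.params[1]:.3f} kg/m2")
    print(f"per-allele effect, active:   {fit.params[1] + fit.params[3]:.3f} kg/m2")
    print(f"interaction p-value: {fit.pvalues[3]:.3g}")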

Conclusions

Our study shows that living a physically active lifestyle is associated with a 40% reduction in the genetic predisposition to common obesity, as estimated by the number of risk alleles carried for any of the 12 recently GWAS-identified loci.

12.

Background

The pathogenesis of diabetes mellitus (DM) is variable, comprising different inflammatory and immune responses. Proteome analysis holds the promise of delivering insight into the pathophysiological changes associated with diabetes. Recently, we identified and validated urinary proteomics biomarkers for diabetes. Based on these initial findings, we aimed to further validate urinary proteomics biomarkers specific for diabetes in general, and in particular those associated with either type 1 (T1D) or type 2 diabetes (T2D).

Methodology/Principal Findings

Therefore, the low-molecular-weight urinary proteome of 902 subjects from 10 different centers, comprising 315 controls and 587 patients with T1D (n = 299) or T2D (n = 288), was analyzed using capillary electrophoresis-mass spectrometry. The 261 urinary biomarkers (100 of them sequenced) previously discovered in 205 subjects were validated in an additional 697 subjects, distinguishing DM subjects (n = 382) from control subjects (n = 315) with 94% (95% CI: 92–95%) accuracy in this study. To identify biomarkers that differentiate T1D from T2D, a subset of normoalbuminuric patients with T1D (n = 68) and T2D (n = 42) was employed, enabling identification of 131 biomarker candidates (40 of them sequenced) differentially regulated between T1D and T2D. These biomarkers distinguished T1D from T2D in an independent validation set of normoalbuminuric patients (n = 108) with 88% (95% CI: 81–94%) accuracy, and in patients with impaired renal function (n = 369) with 85% (95% CI: 81–88%) accuracy. Specific collagen fragments were associated with diabetes and with the type of diabetes, indicating changes in collagen turnover and the extracellular matrix as one hallmark of the molecular pathophysiology of diabetes. Additional biomarkers, including markers of inflammatory processes and pro-thrombotic alterations, were observed.

Conclusions/Significance

These findings, based on the largest proteomic study performed to date on subjects with DM, validate the previously described biomarkers for DM and pinpoint differences in the urinary proteomes of T1D and T2D, indicating significant differences in extracellular matrix remodeling.

13.

Background

Cardiac arrest in patients with pulmonary embolism (PE) is associated with high morbidity and mortality. Thrombolysis is expected to improve the outcome in these patients. However, studies evaluating rescue thrombolysis in patients with PE are lacking, mainly due to the difficulty of the clinical diagnosis of PE. We aimed to determine the factors influencing the success of thrombolysis during resuscitation in patients with PE.

Methodology/Principal Findings

We retrospectively analyzed the outcome of 104 consecutive patients with confirmed (n = 63) or highly suspected (n = 41) PE and monitored cardiac arrest. In all patients, rtPA was administered for thrombolysis during cardiopulmonary resuscitation (CPR). Return of spontaneous circulation (ROSC) was achieved in 40 of the 104 patients (38.5%). Patients with ROSC received thrombolysis significantly earlier after CPR onset than patients without ROSC (13.6±1.2 min versus 24.6±0.8 min; p<0.001). Nineteen (47.5%) of the 40 patients with initially successful resuscitation survived to hospital discharge. In patients surviving to hospital discharge, thrombolysis was begun with a significantly shorter delay after cardiac arrest than in all other patients (11.0±1.3 vs. 22.5±0.9 min; p<0.001).

Conclusion

Rescue thrombolysis should be considered in patients with PE and cardiac arrest, and started as soon as possible after cardiac arrest onset.

14.

Background

There is a commonly held assumption that early August is an unsafe period to be admitted to hospital in England, as newly qualified doctors start work in NHS hospitals on the first Wednesday of August. We investigate whether in-hospital mortality is higher in the week following the first Wednesday in August than in the previous week.

Methodology

A retrospective study in England using administrative hospital admissions data. We constructed two retrospective cohorts of all emergency patients admitted on the last Wednesday in July and on the first Wednesday in August in each year from 2000 to 2008, each followed up for one week.

Principal Findings

The odds of death for patients admitted on the first Wednesday in August were 6% higher (OR 1.06, 95% CI 1.00 to 1.15, p = 0.05) after controlling for year, gender, age, socio-economic deprivation and co-morbidity. When admissions were subdivided into medical, surgical and neoplasm categories, medical patients admitted on the first Wednesday in August had 8% higher odds of death (OR 1.08, 95% CI 1.01 to 1.16, p = 0.03). In 2007 and 2008, when the system for junior doctors' job applications changed, patients admitted on Wednesday August 1st had 8% higher adjusted odds of death than those admitted the previous Wednesday, but this was not statistically significant (OR 1.08, 95% CI 0.95 to 1.23, p = 0.24).

Conclusions

We found evidence that patients admitted to English hospitals on the first Wednesday in August have a higher early death rate than patients admitted on the previous Wednesday. The effect was larger for patients admitted with a medical primary diagnosis.

15.

Background

The neglected tropical diseases (NTDs) cause significant morbidity and mortality worldwide. Due to the growth in international travel and immigration, NTDs may be diagnosed in countries of the western world, but there has been no specific focus in the literature on imported NTDs.

Methods

A retrospective study of a cohort of immigrants and travelers diagnosed with one of the 13 core NTDs at a Tropical Medicine Referral Unit in Spain between April 1989 and December 2007. Area of origin or travel was recorded and analyzed.

Results

The cohort comprised 6168 patients: 2634 immigrants, 3277 travelers and 257 travelers visiting friends and relatives (VFR travelers). NTDs occurred most frequently in immigrants, followed by VFR travelers and then by other travelers (p<0.001 for trend). The main NTDs diagnosed in immigrants were onchocerciasis (n = 240, 9.1%), acquired mainly in sub-Saharan Africa; Chagas disease (n = 95, 3.6%), in immigrants from South America; and ascariasis (n = 86, 3.3%), found mainly in immigrants from sub-Saharan Africa. The most frequent NTDs in travelers were schistosomiasis (n = 43, 1.3%), onchocerciasis (n = 17, 0.5%) and ascariasis (n = 16, 0.5%), all mainly acquired in sub-Saharan Africa. The main NTDs diagnosed in VFR travelers were onchocerciasis (n = 14, 5.4%) and schistosomiasis (n = 2, 0.8%).

Conclusions

The concept of imported NTDs is emerging as these infections acquire a more public profile. Specific issues are addressed, such as the possibility of non-vectorial transmission outside endemic areas and the ways in which eradication programmes in endemic countries may have an impact even in non-tropical western countries. Recognising NTDs even outside tropical settings would allow specific prevention and control measures to be implemented and may create unique opportunities for future research.

16.

Introduction

Kaposi sarcoma (KS) is the leading cause of cancer in Uganda and occurs in people with and without HIV. Human herpesvirus-8 (HHV-8) replication is important both in transmission of HHV-8 and progression to KS. We characterized the sites and frequency of HHV-8 detection in Ugandans with and without HIV and KS.

Methods

Participants were enrolled into one of four groups on the basis of HIV and KS status (HIV negative/KS negative, HIV positive/KS negative, HIV negative/KS positive, and HIV positive/KS positive). Participants collected oral swabs daily and clinicians collected oral swabs, anogenital swabs, and plasma samples weekly over 4 weeks. HHV-8 DNA at each site was quantified by polymerase chain reaction (PCR).

Results

Seventy-eight participants collected a total of 2063 oral swabs and 358 plasma samples. Of these, 428 (21%) oral swabs and 96 (27%) plasma samples had detectable HHV-8 DNA. HHV-8 was detected more frequently in persons with KS, both in the oropharynx (24 (57%) of 42 persons with KS vs. 8 (22%) of 36 persons without, p = 0.002) and in the peripheral blood (30 (71%) of 42 vs. 8 (22%) of 36, p<0.001). In a multivariate model, HHV-8 viremia was more frequent among men (IRR = 3.3, 95% CI = 1.7–6.2, p<0.001), persons with KS (IRR = 3.9, 95% CI = 1.7–9.0, p = 0.001) and persons with HIV infection (IRR = 1.7, 95% CI = 1.0–2.7, p = 0.03). Importantly, oral HHV-8 detection predicted subsequent HHV-8 viremia: viremia was significantly more common when HHV-8 DNA had been detected in the oropharynx during the prior week than when it had not (RR = 3.3, 95% CI = 1.8–5.9, p<0.001). Genital HHV-8 detection was rare (9 (3%) of 272 swabs).

Conclusions

HHV-8 detection is frequent in the oropharynx and peripheral blood of Ugandans with endemic and epidemic KS. Replication at these sites is highly correlated, and viremia is increased in men and in those with HIV. The high incidence of HHV-8 replication at multiple anatomic sites may be an important factor leading to and sustaining the high prevalence of KS in Uganda.

17.

Background

Access to essential maternal and reproductive health care is poor throughout Burma, but is particularly lacking among internally displaced communities in the eastern border regions. In such settings, innovative strategies for accessing vulnerable populations and delivering basic public health interventions are urgently needed.

Methods

Four ethnic health organizations from the Shan, Mon, Karen, and Karenni regions collaborated on a pilot project between 2005 and 2008 to examine the feasibility of an innovative three-tiered network of community-based providers for delivery of maternal health interventions in the complex emergency setting of eastern Burma. Two-stage cluster-sampling surveys among ever-married women of reproductive age (15–45 y) conducted before and after program implementation enabled evaluation of changes in coverage of essential antenatal care interventions, attendance at birth by those trained to manage complications, postnatal care, and family planning services.

Results

Among 2,889 and 2,442 women of reproductive age surveyed in 2006 and 2008, respectively, population characteristics (age, marital status, ethnic distribution, literacy) were similar. Compared to baseline, women whose most recent pregnancy occurred during the implementation period were substantially more likely to receive antenatal care (71.8% versus 39.3%, prevalence rate ratio [PRR] = 1.83 [95% confidence interval (CI) 1.64–2.04]) and specific interventions such as urine testing (42.4% versus 15.7%, PRR = 2.69 [95% CI 2.69–3.54]), malaria screening (55.9% versus 21.9%, PRR = 2.88 [95% CI 2.15–3.85]), and deworming (58.2% versus 4.1%, PRR = 14.18 [95% CI 10.76–18.71]). Postnatal care visits within 7 d doubled. Use of modern methods to avoid pregnancy increased from 23.9% to 45.0% (PRR = 1.88 [95% CI 1.63–2.17]), and unmet need for contraception was reduced from 61.7% to 40.5%, a relative reduction of 35% (95% CI 28%–40%). Attendance at birth by those trained to deliver elements of emergency obstetric care increased almost 10-fold, from 5.1% to 48.7% (PRR = 9.55 [95% CI 7.21–12.64]).

Conclusions

Coverage of maternal health interventions and higher-level care at birth was substantially higher during the project period. The MOM Project's focus on task-shifting, capacity building, and empowerment at the community level might serve as a model approach for similarly constrained settings.

18.

Background

Monitoring cerebral saturation is increasingly seen as an aid to management of patients in the operating room and in neurocritical care. How best to manipulate cerebral saturation is not fully known. We examined cerebral saturation with graded changes in carbon dioxide tension while isoxic and with graded changes in oxygen tension while isocapnic.

Methodology/Principal Findings

The study was approved by the Research Ethics Board of the University Health Network at the University of Toronto. Thirteen studies were undertaken in healthy adults using cerebral oximetry by near-infrared spectroscopy. End-tidal gas concentrations were manipulated using a model-based prospective end-tidal targeting device. End-tidal carbon dioxide was altered ±15 mmHg from baseline in 5 mmHg increments under isoxia (clamped at 110±4 mmHg). End-tidal oxygen was changed to 300, 400, 500, 80, 60 and 50 mmHg under isocapnia (37±2 mmHg). Twelve studies were completed. End-tidal carbon dioxide versus cerebral saturation fit a linear relationship (R2 = 0.92±0.06). End-tidal oxygen versus cerebral saturation followed log-linear behaviour and was best fit by a hyperbolic relationship (R2 = 0.85±0.10). Cerebral saturation was maximal in isoxia at an end-tidal carbon dioxide of baseline +15 mmHg (77±3 percent) and minimal in isocapnia at an end-tidal oxygen tension of 50 mmHg (61±3 percent). Cerebral saturation during normoxic hypocapnia was equivalent to that during normocapnic hypoxia at 60 mmHg.
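
A small sketch of the two fits described: a linear regression of cerebral saturation on end-tidal CO2 and a hyperbolic fit against end-tidal O2. The data points are invented for illustration, and the hyperbolic form is one plausible parameterization, not necessarily the authors':

    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import linregress

    etco2 = np.array([22, 27, 32, 37, 42, 47, 52])     # mmHg, isoxic (invented)
    sat_co2 = np.array([63, 66, 68, 70, 73, 75, 77])   # percent (invented)
    fit = linregress(etco2, sat_co2)
    print(f"linear: sat = {fit.intercept:.1f} + {fit.slope:.2f}*EtCO2, "
          f"R2 = {fit.rvalue**2:.2f}")

    eto2 = np.array([50, 60, 80, 110, 300, 400, 500])  # mmHg, isocapnic (invented)
    sat_o2 = np.array([61, 64, 67, 69, 71, 71.5, 72])  # percent (invented)

    def hyperbolic(p, s_max, k):                       # saturation rises toward s_max
        return s_max * p / (k + p)

    popt, _ = curve_fit(hyperbolic, eto2, sat_o2, p0=[75, 20])
    print(f"hyperbolic: s_max = {popt[0]:.1f}%, k = {popt[1]:.1f} mmHg")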

Conclusions/Significance

Hypocapnia reduces cerebral saturation to an extent equivalent to moderate hypoxia.

19.

Background

Several risk factors for depression during pregnancy have already been established. However, very few studies have conducted a multivariate analysis incorporating both the major predictors of depression in women, in accordance with comprehensive developmental models of depression, and specific stressors associated with the biological and psychosocial state of the mother-to-be.

Methodology/Principal Findings

We used a cross-sectional cohort design to analyze the associations between prenatal depression and potential risk factors. A total of 693 French-speaking women with singleton pregnancies at 20–28 weeks' gestation were consecutively recruited at Caen University Hospital; fifty women with missing values were subsequently excluded from the analysis. Depressive symptoms were assessed on the Edinburgh Postnatal Depression Scale. Risk factors were either extracted from the computerized obstetric records or assessed by means of self-administered questionnaires. The associations between prenatal depression and the potential risk factors were assessed using log-binomial regression models to obtain a direct estimate of relative risk (RR). The following factors were significant in the multivariate analysis: level of education (p<0.001), past psychiatric history (adjusted RR = 1.8, 95% confidence interval (CI): 1.1–2.8, p = 0.014), stress related to the health and viability of the fetus (adjusted RR = 2.6, 95% CI: 1.6–4.1, p<0.001), and stress related to severe marital conflicts (adjusted RR = 2.4, 95% CI: 1.5–3.9, p<0.001) or to serious difficulties at work (adjusted RR = 1.6, 95% CI: 1.04–2.4, p = 0.031). An association was also found with the previous delivery of a child with a major or minor birth defect (adjusted RR = 2.0, 95% CI: 1.04–4.0, p = 0.038). Univariate analyses revealed a strong association with childhood adversity (parental rejection: RR = 1.8, 95% CI: 1.2–2.8, p = 0.0055; family secrets: RR = 2.0, 95% CI: 1.2–3.1, p = 0.0046) and with lack of partner support (RR = 0.50, 95% CI: 0.30–0.84, p = 0.0086).
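
For illustration, a minimal log-binomial model in Python: with a log link, the exponentiated coefficients are relative risks, which is why the abstract reports RRs rather than odds ratios. The data and effect sizes below are simulated assumptions, not the study data:

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(3)
    n = 643
    psych_history = rng.binomial(1, 0.2, n)            # past psychiatric history (simulated)
    marital_conflict = rng.binomial(1, 0.15, n)        # severe marital conflicts (simulated)
    risk = 0.10 * 1.8**psych_history * 2.4**marital_conflict  # assumed baseline risk and RRs
    depressed = rng.binomial(1, np.clip(risk, 0, 1))

    X = sm.add_constant(np.column_stack([psych_history, marital_conflict]))
    model = sm.GLM(depressed, X,
                   family=sm.families.Binomial(link=sm.families.links.Log()))
    res = model.fit()
    rr = np.exp(res.params[1:])                        # exponentiated log-RRs
    ci = np.exp(res.conf_int()[1:])
    print(f"RR psychiatric history: {rr[0]:.2f} (95% CI {ci[0, 0]:.2f}-{ci[0, 1]:.2f})")
    print(f"RR marital conflict:    {rr[1]:.2f} (95% CI {ci[1, 0]:.2f}-{ci[1, 1]:.2f})")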

Conclusions/Significance

Our study identifies several risk factors that could easily be assessed in clinical practice. It draws attention to the impact of previously delivering a child with a birth defect. The association with childhood adversity warrants further study.

20.

Background

Factors associated with the survival of the truth of clinical conclusions in the medical literature are unknown. We hypothesized that publications whose first author has a higher Hirsch index value (h-I), a measure that quantifies and predicts an individual's scientific research output, should have a longer half-life.

Methods and Results

A total of 474 original articles concerning cirrhosis or hepatitis published from 1945 to 1999 were selected. The survival of the main conclusions was updated in 2009, and truth survival was assessed by time-dependent methods (Kaplan-Meier and Cox). A conclusion was considered true, obsolete or false when three or more of the six observers stated it to be so. Of the 474 conclusions, 284 (60%) were still considered true, 90 (19%) obsolete and 100 (21%) false. The median h-I was 24 (range 1–85). Authors with true conclusions had a significantly higher h-I (median = 28) than those with obsolete (h-I = 19; P = 0.002) or false conclusions (h-I = 19; P = 0.01). Factors associated (P<0.0001) with h-I were: scientific life (h-I = 33 for >30 years vs. 16 for <30 years), methodological quality score (h-I = 36 for high vs. 20 for low scores), and positive predictive value combining power, the ratio of true to not-true relationships, and bias (h-I = 33 for high vs. 20 for low values). In multivariate analysis, the risk ratio of h-I was 1.003 (95% CI 0.994–1.011) and was not significant (P = 0.56). In a subgroup restricted to the 111 articles with a negative conclusion, we observed a significant independent prognostic value of h-I (risk ratio = 1.033; 95% CI 1.008–1.059; P = 0.009). Using an extrapolation of h-I to the time of article publication, there was a significant and independent prognostic value of baseline h-I (risk ratio = 0.027; P = 0.0001).
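
A compact sketch of this kind of "truth survival" analysis using the lifelines package: Kaplan-Meier estimation of conclusion survival and a Cox model with the author's h-index as covariate. The dataset below is fabricated for illustration only:

    import numpy as np
    import pandas as pd
    from lifelines import KaplanMeierFitter, CoxPHFitter

    rng = np.random.default_rng(4)
    n = 474
    h_index = rng.integers(1, 86, n)                     # first-author h-index (simulated)
    years = rng.exponential(20 + 0.2 * h_index)          # time until conclusion overturned
    event = rng.binomial(1, 0.4, n)                      # 1 = conclusion became obsolete/false
    df = pd.DataFrame({"years": years, "event": event, "h_index": h_index})

    km = KaplanMeierFitter()
    km.fit(df["years"], event_observed=df["event"])      # survival curve of "truth"
    print(f"median truth survival: {km.median_survival_time_:.1f} years")

    cox = CoxPHFitter()
    cox.fit(df, duration_col="years", event_col="event") # h-index as prognostic covariate
    cox.print_summary()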

Conclusions

The present study failed to clearly demonstrate that the h-index of authors is a prognostic factor for truth survival. However, the h-index was associated with true conclusions, the methodological quality of trials, and positive predictive values.

