Similar Articles
1.

Abstract/Background

Dengue is the most important arthropod-borne viral disease worldwide in terms of morbidity and mortality and is caused by any of the four serotypes of dengue virus (DENV-1 to 4). Brazil is responsible for approximately 80% of dengue cases in the Americas, and since the introduction of dengue in 1986, a total of 5,944,270 cases have been reported, including 21,596 cases of dengue hemorrhagic fever and 874 fatal cases. DENV can infect many cell types and cause diverse clinical and pathological effects. The goal of this study was to investigate the usefulness of NS1 capture tests as an alternative tool to detect DENV in tissue specimens from previously confirmed dengue fatal cases (n = 23) that occurred in 2002 in Brazil.

Methodology/Principal Findings

A total of 74 tissue specimens were available: liver (n = 23), lung (n = 14), kidney (n = 4), brain (n = 10), heart (n = 2), skin (n = 1), spleen (n = 15), thymus (n = 3) and lymph nodes (n = 2). We evaluated three tests for NS1 antigen capture: the first-generation Dengue Early ELISA (PanBio Diagnostics), the Platelia NS1 (BioRad Laboratories) and the rapid test NS1 Ag Strip (BioRad Laboratories). Overall dengue fatal case diagnosis, based on the tissues analyzed by Dengue Early ELISA, Platelia NS1 and the NS1 Ag Strip, was 34.7% (8/23), 60.8% (14/23) and 91.3% (21/23), respectively. The Dengue Early ELISA detected NS1 in 22.9% (17/74) of the specimens analyzed and the Platelia NS1 in 45.9% (34/74). The highest sensitivity (78.3%; 58/74) was achieved by the NS1 Ag Strip, and the differences in sensitivity were statistically significant (p<0.05). The NS1 Ag Strip was the most sensitive test in liver (91.3%; 21/23), lung (71.4%; 10/14), kidney (100%; 4/4), brain (80%; 8/10), spleen (66.6%; 10/15) and thymus (100%; 3/3) compared with the two ELISA assays.
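As an illustration only (the abstract does not specify the authors' exact statistical test), the per-specimen detection rates above can be recomputed and the three assays compared with a chi-square test in Python; the counts are taken directly from the abstract.

# Illustrative sketch, not the authors' analysis: per-specimen detection rates
# and a chi-square comparison of the three NS1 assays. Because all three assays
# were run on the same 74 specimens, a paired test (e.g. Cochran's Q on the
# per-specimen results) would be stricter; those raw results are not available here.
from scipy.stats import chi2_contingency

positives = {"Dengue Early ELISA": 17, "Platelia NS1": 34, "NS1 Ag Strip": 58}
n_specimens = 74

for assay, pos in positives.items():
    print(f"{assay}: {pos}/{n_specimens} = {pos / n_specimens:.1%}")

# 3 x 2 table: rows = assays, columns = (positive, negative) specimen counts
table = [[pos, n_specimens - pos] for pos in positives.values()]
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2({dof}) = {chi2:.1f}, p = {p:.2g}")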

Conclusions/Significance

This study shows that the DENV NS1 capture assay is a rapid and valuable approach to postmortem dengue confirmation. With an increasing number of DHF and fatal cases, new approaches to case confirmation are important tools for disease surveillance.

2.

Context

Randomized controlled trials have identified online cognitive behavioral therapy (CBT) as an efficacious intervention in the management of common mental health disorders.

Objective

To assess the effectiveness of online CBT for different mental disorders in routine clinical practice.

Design

An uncontrolled before-after study, with measurements at baseline, posttest, 6-week follow-up, and 1-year follow-up.

Participants & Setting

1500 adult patients (female: 67%; mean age: 40 years) with a GP referral for psychotherapy were treated at a Dutch online mental health clinic for symptoms of depression (n = 413), panic disorder (n = 139), posttraumatic stress (n = 478), or burnout (n = 470).

Interventions

Manualized, web-based, therapist-assisted CBT whose efficacy was previously demonstrated in a series of controlled trials. The standardized duration of treatment varied from 5 weeks (online CBT for posttraumatic stress) to 16 weeks (online CBT for depression).

Main Outcome Measures

Validated self-report questionnaires of specific and general psychopathology, including the Beck Depression Inventory, the Impact of Event Scale, the Panic Disorder Severity Scale-Self Report, the Oldenburg Burnout Inventory, and the Depression Anxiety Stress Scales.

Results

Treatment adherence was 71% (n = 1071). Study attrition was 21% at posttest, 33% at 6-week follow-up and 65% at 1-year follow-up. Mixed-model repeated-measures regression identified large short-term reductions in all measures of primary symptoms (d = 1.9±0.2 to d = 1.2±0.2; P<.001), which were sustained up to one year after treatment. At posttest, rates of reliable improvement and recovery were 71% and 52% in the completer sample (full sample: 55% and 40%). Patient satisfaction was high.
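A minimal sketch of how a pre-post standardized effect size like the d values above might be computed, on hypothetical data; it assumes a baseline-standardized mean change (the study's exact effect-size definition is not given in the abstract) and uses a bootstrap for the standard error.

# Hypothetical data only; assumes d = mean change / baseline SD.
import numpy as np

rng = np.random.default_rng(0)
pre = rng.normal(30, 10, size=200)         # hypothetical baseline symptom scores
post = pre - rng.normal(15, 8, size=200)   # hypothetical post-treatment scores

def cohens_d(pre_scores, post_scores):
    """Mean pre-post change divided by the baseline standard deviation."""
    return (pre_scores.mean() - post_scores.mean()) / pre_scores.std(ddof=1)

d = cohens_d(pre, post)
boot = []
for _ in range(2000):
    idx = rng.integers(0, len(pre), len(pre))   # resample participants with replacement
    boot.append(cohens_d(pre[idx], post[idx]))
print(f"d = {d:.2f} ± {np.std(boot, ddof=1):.2f} (bootstrap SE)")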

Conclusions

Results suggest that online therapist-assisted CBT may be as effective in routine practice as in clinical trials. Although pre-treatment withdrawal and long-term outcomes require further study, the results warrant continued implementation of online CBT.

3.

Introduction

Patients with hematologic malignancies are at increased risk of primary bloodstream infections (BSI).

Methods

From 2004 to 2009, we analyzed bacteremia caused by extended-spectrum beta-lactamase-producing Escherichia coli (ESBL-EC) (n = 100) and compared it with bacteremia caused by cephalosporin-susceptible E. coli (n = 100) in patients with hematologic malignancies.

Objective

To assess the clinical features, risk factors, and outcome of ESBL-EC BSI in patients with hematologic malignancies, and to study the molecular epidemiology of ESBL-EC isolates.

Results

The main diagnosis was acute leukemia, in 115 patients (57.5%). Death related to E. coli infection was significantly more frequent with ESBL-EC (34% vs. 19% in the control group; p = 0.03). Treatment for BSI was considered appropriate in 64 patients with ESBL-EC (mean survival, 245±345 days) and in 45 control patients (mean survival, 443±613 days) (p = 0.03). In patients not receiving appropriate antimicrobial treatment, survival was significantly shorter in cases than in controls (26±122 vs. 276±442 days; p = 0.001). Fifty-six of the ESBL-EC isolates were characterized by molecular analysis: 47 (84%) expressed CTX-M-15, two (3.6%) expressed SHV, and seven (12.5%) did not correspond to either of these two ESBL enzymes. No TLA-1 enzyme was detected.

Conclusions

Patients who had been previously hospitalized and who received cephalosporins during the previous month had an increased risk of ESBL-EC bacteremia. Mortality was significantly increased in patients with ESBL-EC BSI. A polyclonal trend was detected, which reflects non-cross-transmission of multiresistant E. coli isolates.

4.
Chavada R, Kok J, van Hal S, Chen SC. PLoS One. 2011;6(12):e28247

Background

Fungal peritonitis is a serious complication of peritoneal dialysis (PD) therapy, with the majority of patients ceasing PD permanently. The aims of this study were to identify risk factors and clinical associations that may discriminate fungal from bacterial peritonitis.

Methods

We retrospectively identified episodes of fungal peritonitis from 2001–2010 in PD patients at Liverpool and Westmead Hospitals (Australia). Fungal peritonitis cases were matched in a 1:2 ratio with patients with bacterial peritonitis from each institution's dialysis registry, occurring closest in time to the fungal episode. Patient demographic, clinical and outcome data were obtained from the medical records.

Results

Thirty-nine episodes of fungal peritonitis (a rate of 0.02 episodes per patient-year of dialysis) were matched with 78 episodes of bacterial peritonitis. Candida species were the commonest pathogens (35/39; 90% of episodes), with Candida albicans (37%), Candida parapsilosis (32%) and Candida glabrata (13%) the most frequently isolated species. Compared to bacterial peritonitis, fungal peritonitis patients had received PD for significantly longer (1133 vs. 775 catheter-days; p = 0.016), were more likely to have had previous episodes of bacterial peritonitis (51% vs. 10%; p = 0.01), and were more likely to have received prior antibacterial therapy (51% vs. 10%; p = 0.01). Patients with fungal peritonitis were less likely to have fever and abdominal pain on presentation, but had higher rates of PD catheter removal (79% vs. 22%; p<0.005) and permanent transfer to haemodialysis (87% vs. 24%; p<0.005). Hospital length of stay was significantly longer in patients with fungal peritonitis (26.1 days vs. 12.6 days; p = 0.017), but the all-cause 30-day mortality rate was similar in both groups. Fluconazole was a suitable empiric antifungal agent, with no Candida resistance detected.
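For illustration only, the arithmetic behind an episode rate expressed per patient-year of dialysis, like the 0.02 quoted above; the total time at risk used here is hypothetical, since the abstract does not report the cohort's denominator.

# Hypothetical denominator; shows only how a per-patient-year rate is derived.
fungal_episodes = 39
total_catheter_days = 712_000                  # assumed cohort-wide time at risk
patient_years = total_catheter_days / 365.25
rate = fungal_episodes / patient_years
print(f"{rate:.3f} episodes per patient-year")  # ~0.02 with this assumed denominator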

Conclusion

Prompt recognition of clinical risk factors, initiation of antifungal therapy and removal of PD catheters are key considerations in optimising outcomes.

5.

Background

Antibodies that impair Plasmodium falciparum merozoite invasion and intraerythrocytic development are one of several mechanisms that mediate naturally acquired immunity to malaria. Attempts to correlate anti-malaria antibodies with risk of infection and morbidity have yielded inconsistent results. Growth inhibition assays (GIA) offer a convenient method to quantify functional antibody activity against blood stage malaria.

Methods

A treatment-time-to-infection study was conducted over 12 weeks in a malaria-holoendemic area of Kenya. Plasma collected from healthy individuals (98 children and 99 adults) before artemether-lumefantrine treatment was tested by GIA in three separate laboratories.

Results

Median GIA levels varied with P. falciparum line (D10, 8.8%; 3D7, 34.9%; FVO, 51.4% inhibition). The magnitude of growth inhibition decreased with age in all P. falciparum lines tested, with the highest median levels among children <4 years compared with adults (e.g., 3D7: 45.4% vs. 30.0%, respectively; p = 0.0003). Time to infection, measured by weekly blood smears, was significantly associated with the level of GIA after controlling for age. Upper-quartile inhibition activity was associated with a lower risk of infection compared with lower levels (e.g., 3D7: hazard ratio = 1.535, 95% CI = 1.012–2.329; p = 0.0438). Varying the GIA methodology had little effect on measured parasite growth inhibition.
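A sketch of the kind of age-adjusted time-to-infection model described above, written with the lifelines library on hypothetical data; the column names, covariate coding and values are illustrative assumptions, not the study's dataset.

# Hypothetical data frame; column names and coding are assumptions.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 197
df = pd.DataFrame({
    "weeks_to_infection": rng.integers(1, 13, n),   # weekly blood smears over 12 weeks
    "infected": rng.integers(0, 2, n),              # 1 = infection observed, 0 = censored
    "age_years": rng.uniform(1, 60, n),
    "low_gia": rng.integers(0, 2, n),               # 1 = below upper quartile of 3D7 inhibition
})

cph = CoxPHFitter()
cph.fit(df, duration_col="weeks_to_infection", event_col="infected")
cph.print_summary()   # hazard ratio for low_gia, adjusted for age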

Conclusion

Plasma antibody-mediated growth inhibition of blood-stage P. falciparum decreases with age in residents of a malaria-holoendemic area. The growth inhibition assay may be a useful surrogate of protection against infection when the outcome is controlled for age.

6.

Background

Traditional methods of diagnosing mucosal leishmaniasis (ML), such as biopsy with histopathology, are insensitive and require collection of an invasive diagnostic specimen.

Methods

We compared standard invasive procedures, including biopsy histopathology, biopsy PCR, and the leishmanin skin test (LST), with a novel, non-invasive, cytology-brush-based PCR for the diagnosis of ML in Lima, Peru. The consensus reference standard was 2 of 4 tests positive, and the outcome measures were sensitivity and specificity. Leishmania species identification was performed by PCR-based assays of positive specimens.

Results

Twenty-eight patients were enrolled, 23 of whom fulfilled criteria for a diagnosis of ML. Sensitivity and specificity were 21.7% [95% CI 4.9–38.5%] and 100% for biopsy with histopathology; 69.6% [95% CI 50.8–88.4%] and 100% for LST; 95.7% [95% CI 87.4–100%] and 100% for biopsy PCR; and 95.7% [95% CI 87.4–100%] and 90% [95% CI 71.4–100%] for cytology brush PCR using both Cervisoft® and Histobrush® cervical cytology brushes. Species identified by PCR-RFLP included L. (V.) braziliensis (n = 4) and L. (V.) peruviana (n = 3).
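A hedged sketch of how sensitivities and confidence intervals of this kind can be computed with statsmodels; the positive counts (5, 16 and 22 of the 23 reference-positive patients) are inferred from the reported percentages, and a normal-approximation interval is assumed, which reproduces the intervals quoted above but may not be the authors' exact method.

# Counts inferred from the reported percentages; interval method is an assumption.
from statsmodels.stats.proportion import proportion_confint

positives_by_test = {"histopathology": 5, "LST": 16, "biopsy PCR": 22, "cytology brush PCR": 22}
n_cases = 23   # patients fulfilling the reference standard for ML

for name, positives in positives_by_test.items():
    sens = positives / n_cases
    low, high = proportion_confint(positives, n_cases, alpha=0.05, method="normal")
    print(f"{name}: sensitivity {sens:.1%} (95% CI {low:.1%}-{min(high, 1.0):.1%})")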

Conclusions

Use of commercial-grade cytology brush PCR for the diagnosis of ML is sensitive, rapid and well tolerated, and carries none of the risks of invasive diagnostic procedures such as biopsy. Further optimization is required for adequate species identification. Further evaluation of this method in field and other settings is warranted.

7.

Background

Multiple sclerosis (MS) patients with breakthrough disease on immunomodulatory drugs are frequently offered a switch to natalizumab or immunosuppressants. The effect of natalizumab monotherapy in patients with breakthrough disease is unknown.

Methods

This is an open-label retrospective cohort study of 993 patients seen at least four times at the University of California San Francisco MS Center; 95 had breakthrough disease on first-line therapy (60 patients switched to natalizumab, 22 to immunosuppressants, and 13 declined the switch [non-switchers]). We used Poisson regression adjusted for potential confounders to compare the relapse rate within and across groups before and after the switch.
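A minimal sketch of the kind of Poisson model described here, on hypothetical data: relapse counts are modeled with follow-up time as the exposure, so exp(coefficient) is a relapse-rate ratio. The variable names, the single illustrative confounder and all values are assumptions, not the cohort's data.

# Hypothetical data; one illustrative confounder stands in for the full adjustment set.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 120
df = pd.DataFrame({
    "relapses": rng.poisson(0.6, n),
    "years_followup": rng.uniform(0.5, 3.0, n),
    "post_switch": rng.integers(0, 2, n),    # 1 = observation period after switching therapy
    "age": rng.uniform(18, 60, n),
})

model = smf.glm("relapses ~ post_switch + age", data=df,
                family=sm.families.Poisson(),
                exposure=df["years_followup"]).fit()
rate_ratio = np.exp(model.params["post_switch"])
print(f"relapse-rate ratio after vs. before switch: {rate_ratio:.2f}")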

Results

In the within-group analyses, the relapse rate decreased by 70% (95% CI 50–82%; p<0.001) in switchers to natalizumab and by 77% (95% CI 59–87%; p<0.001) in switchers to immunosuppressants; the relapse rate in non-switchers did not decrease (6%, p = 0.87). Relative to the reduction among non-switchers, the relapse rate was reduced by 68% among natalizumab switchers (95% CI 19–87%; p = 0.017) and by 76% among the immunosuppressant switchers (95% CI 36–91%; p = 0.004).

Conclusions

Switching to natalizumab or immunosuppressants in patients with breakthrough disease is effective in reducing clinical activity of relapsing MS. The magnitude of the effect and the risk-benefit ratio should be evaluated in randomized clinical trials and prospective cohort studies.

8.

Background

Oncogenic BRAF mutations have been found in diverse malignancies and activate RAF/MEK/ERK signaling, a critical pathway of tumorigenesis. We examined the clinical characteristics and outcomes of patients with mutant (mut) BRAF advanced cancer referred to a phase 1 clinic.

Methods

We reviewed the records of 80 consecutive patients with mutBRAF advanced malignancies and 149 with wild-type (wt) BRAF (matched by tumor type) referred to the Clinical Center for Targeted Therapy and analyzed their outcome.

Results

Of 80 patients with mutBRAF advanced cancer, 56 had melanoma, 10 colorectal, 11 papillary thyroid, 2 ovarian and 1 esophageal cancer. Mutations in codon 600 were found in 77 patients (62, V600E; 13, V600K; 1, V600R; 1, unreported). Multivariate analysis showed fewer soft tissue (odds ratio (OR) = 0.39, 95% CI: 0.20–0.77, p = 0.007), lung (OR = 0.38, 95% CI: 0.19–0.73, p = 0.004) and retroperitoneal metastases (OR = 0.34, 95% CI: 0.13–0.86, p = 0.024) and more brain metastases (OR = 2.05, 95% CI: 1.02–4.11, p = 0.043) in patients with mutBRAF versus wtBRAF. Compared with their wtBRAF counterparts, mutBRAF melanoma patients had a non-significant trend toward longer median survival from diagnosis (131 vs. 78 months, p = 0.14), while mutBRAF colorectal cancer patients had a non-significant trend toward shorter median survival from diagnosis (48 vs. 53 months, p = 0.22). In melanoma, V600K mutations, compared with other BRAF mutations, were associated with more frequent brain (75% vs. 36.3%, p = 0.02) and lung metastases (91.6% vs. 47.7%, p = 0.007) and with shorter times from diagnosis to metastasis and to death (19 vs. 53 months, p = 0.046, and 78 vs. 322 months, p = 0.024, respectively). Treatment with RAF/MEK-targeting agents (hazard ratio (HR) = 0.16, 95% CI: 0.03–0.89, p = 0.037) and any decrease in tumor size after referral (HR = 0.07, 95% CI: 0.015–0.35, p = 0.001) correlated with longer survival in mutBRAF patients.

Conclusions

BRAF appears to be a druggable mutation that also defines subgroups of patients with phenotypic overlap, albeit with differences that correlate with histology or site of mutation.

9.

Background

The aim of this study was to analyse the prevalence of transmitted drug resistance (TDR) and the impact of TDR on treatment success in the German HIV-1 Seroconverter Cohort.

Methods

Genotypic resistance analysis was performed in treatment-naïve study patients for whom a sample was available (1,312/1,564; 83.9% as of October 2008). A genotypic resistance result was obtained for 1,276/1,312 (97.3%). Resistance-associated mutations were identified according to the surveillance drug resistance mutations list recommended for drug-naïve patients. Treatment success was defined as viral suppression below 500 copies/ml.

Results

Prevalence of TDR was stable at a high level between 1996 and 2007 in the German HIV-1 Seroconverter Cohort (N = 158/1,276; 12.4%; CI(Wilson) 10.7–14.3; p for trend = 0.25). NRTI resistance was predominant (7.5%) but decreased significantly over time (CI(Wilson): 6.2–9.1; p for trend = 0.02). NNRTI resistance tended to increase over time (NNRTI: 3.5%; CI(Wilson): 2.6–4.6; p for trend = 0.07), whereas PI resistance remained stable (PI: 3.0%; CI(Wilson): 2.1–4.0; p for trend = 0.24). Resistance to all drug classes was frequently caused by singleton resistance mutations (NRTI 55.6%, PI 68.4%, NNRTI 99.1%). The majority of NRTI-resistant strains (79.8%) carried resistance-associated mutations selected by the thymidine analogues zidovudine and stavudine. 2NRTI/1PIr combinations were preferentially prescribed as the first-line regimen both in patients with resistant HIV and in patients with susceptible strains (susceptible: 45.3%, 173/382 vs. resistant: 65.5%, 40/61). The majority of patients in both groups were treated successfully within the first year after ART initiation (susceptible: 89.9%, 62/69; resistant: 77.8%, 7/9).
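As a check of the interval notation used above, the Wilson confidence interval for the overall TDR prevalence (158 resistant of 1,276 genotyped) can be recomputed with statsmodels; this reproduces the reported 12.4% (10.7–14.3) and is shown for illustration only.

# Recomputation of the quoted Wilson interval from the reported counts.
from statsmodels.stats.proportion import proportion_confint

resistant, genotyped = 158, 1276
low, high = proportion_confint(resistant, genotyped, alpha=0.05, method="wilson")
print(f"TDR prevalence: {resistant / genotyped:.1%} (95% Wilson CI {low:.1%}-{high:.1%})")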

Conclusion

Overall prevalence of TDR remained stable at a high level, but trends in resistance against the different drug classes differed over time. The significant decrease in NRTI resistance in patients newly infected with HIV might be related to the introduction of novel antiretroviral drugs and a wider use of genotypic resistance analysis prior to treatment initiation.

10.

Background

The catechol-O-methyltransferase (COMT) enzyme has a key function in the degradation of catecholamines, and val158met is a functional polymorphism of its gene. The val/val genotype results in three- to fourfold higher enzymatic activity than the met/met genotype, with the val/met genotype exhibiting intermediate activity. Since pain syndromes are associated with low COMT activity, anxiety and depression are associated with high COMT activity, and these conditions are all associated with irritable bowel syndrome (IBS), we wanted, for the first time, to explore the relationship between this polymorphism and IBS.

Methodology/Principal Findings

A total of 867 subjects (445 women) representative of the general population and 70 consecutively sampled patients with IBS (61 women) were genotyped for the val158met polymorphism; the IBS patients filled out the Hospital Anxiety and Depression Scale (HADS) questionnaire and an IBS symptom diary.

Results

There was a significantly higher occurrence of the val/val genotype in patients compared with controls (30% vs. 20%; χ²(1) = 3.98; p = 0.046) and a trend toward a lower occurrence of the val/met genotype in IBS patients compared with controls (39% vs. 49%; χ²(1) = 2.89; p = 0.089). Within the IBS patients, the val/val carriers exhibited significantly increased bowel frequency (2.6 vs. 1.8 stools per day; χ²(1) = 5.3; p = 0.03) and a smaller proportion of stools with incomplete defecation (41% vs. 68%; χ²(1) = 4.3; p = 0.04) compared with the rest (val/met and met/met carriers). The val/val carriers also showed a trend toward a smaller proportion of hard stools (0% vs. 15%; χ²(1) = 3.2; p = 0.08) and a higher frequency of postprandial defecation (26% vs. 21%; χ²(1) = 3.0; p = 0.08).
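An illustrative recomputation of the patient-control genotype comparison above: the absolute counts (21 of 70 val/val patients and 173 of 867 val/val controls) are inferred from the reported percentages, and omitting Yates' continuity correction reproduces χ²(1) ≈ 3.98, p ≈ 0.046. This is a sketch, not the authors' code.

# Counts inferred from reported percentages; correction=False matches the quoted χ².
from scipy.stats import chi2_contingency

table = [[21, 70 - 21],      # IBS patients: val/val, other genotypes
         [173, 867 - 173]]   # population controls: val/val, other genotypes
chi2, p, dof, _ = chi2_contingency(table, correction=False)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.3f}")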

Conclusions/Significance

In this study we found an association between the val/val genotype of the COMT val158met polymorphism and IBS, as well as with a specific IBS-related bowel pattern in IBS patients.

11.

Background

In areas endemic for visceral leishmaniasis (VL), a large number of infected individuals mount a protective cellular immune response and remain asymptomatic carriers. We propose an interferon-gamma release assay (IFN-γRA) as a novel marker for latent L. donovani infection.

Methods and Findings

We modified a commercial kit (QuantiFERON) to evaluate five different leishmania-specific antigens: H2B, H2B-PSA2, H2B-Lepp12, crude soluble antigen (CSA) and soluble leishmania antigen (SLA) from L. donovani, with the aim of detecting the cell-mediated immune response in VL. We evaluated the assay on venous blood samples from active VL patients (n = 13), cured VL patients (n = 15), non-endemic healthy controls (n = 11) and healthy endemic controls (n = 19). The assay based on SLA had a sensitivity of 80% (95% CI = 54.81–92.95) and a specificity of 100% (95% CI = 74.12–100).

Conclusion

Our findings suggest that a whole-blood SLA-based QuantiFERON assay can be used to measure the cell-mediated immune response in L. donovani infection. The positive IFN-γ response to stimulation with leishmania antigen in patients with active VL contradicts the conventional finding of a non-proliferative antigen-specific response of peripheral blood mononuclear cells and needs further research.

12.

Background

Empyema is an increasingly frequent clinical problem worldwide, and has substantial morbidity and mortality. Our objectives were to identify the clinical, surgical and microbiological features, and management outcomes, of empyema.

Methods

A retrospective observational study over 12 years (1999–2010) was carried out at The Heart Hospital, London, United Kingdom. Patients with empyema were identified by screening the hospital electronic ‘Clinical Data Repository’. Demographics, clinical and microbiological characteristics, underlying risk factors, peri-operative blood tests, treatment and outcomes were identified. Univariable and multivariable statistical analyses were performed.

Results

Patients (n = 406) were predominantly male (74.1%); the median age was 53 years (IQR = 37–69). Most empyemas were community-acquired (87.4%) and right-sided (57.4%). A microbiological diagnosis was obtained in 229 (56.4%) patients and included streptococci (16.3%), staphylococci (15.5%), Gram-negative organisms (8.9%), anaerobes (5.7%), pseudomonads (4.4%) and mycobacteria (9.1%); 8.4% were polymicrobial. Most (68%) cases were managed by open thoracotomy and decortication. Video-assisted thoracoscopic surgery (VATS) reduced hospitalisation from 10 to 7 days (P = 0.0005). The all-cause complication rate was 25.1%, and 28-day mortality was 5.7%. Predictors of early mortality included older age (P = 0.006), major co-morbidity (P = 0.01), malnutrition (P = 0.001), elevated red cell distribution width (RDW, P<0.001) and serum alkaline phosphatase (P = 0.004), and reduced serum albumin (P = 0.01) and haemoglobin (P = 0.04).

Conclusions

Empyema remains an important cause of morbidity and hospital admissions. A microbiological diagnosis was achieved in only just over 50% of cases, and tuberculosis was a notable cause. Treatment of empyema with VATS may reduce the duration of hospital stay. Raised RDW appears to be associated with early mortality.

13.

Background

Morbidity due to schistosomiasis is currently controlled by treatment of schistosome-infected people with the anthelminthic drug praziquantel (PZQ). Children aged up to 5 years are currently excluded from schistosome control programmes, largely due to the lack of PZQ safety data in this age group. This study investigated the safety and efficacy of PZQ treatment in such children.

Methods

Zimbabwean children aged 1–5 years (n = 104) were treated with PZQ tablets, and side effects were assessed by a questionnaire administered to their caregivers within 24 hours of taking PZQ. Treatment efficacy was determined 6 weeks after PZQ administration through schistosome egg counts in urine. The change in infection levels in the children aged 1–5 years (n = 100) was compared with that in children aged 6–10 years (n = 435).

Principal Findings

Pre-treatment S. haematobium infection intensity in 1–5-year-olds was 14.6 eggs/10 ml urine, and prevalence was 21%. Of the 104 children, 3.8% reported side effects within 24 hours of taking PZQ treatment; these were stomach ache, loss of appetite, lethargy and inflammation of the face and body. PZQ treatment significantly reduced schistosome infection levels in 1–5-year-olds, with an egg reduction rate (ERR) of 99% and a cure rate (CR) of 92%. This was comparable to the efficacy of praziquantel in 6–10-year-olds, where the ERR was 96% and the CR was 67%.
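A sketch of the ERR and CR arithmetic used above, on hypothetical egg-count data (eggs/10 ml urine before and six weeks after treatment, among children egg-positive at baseline); the numbers are illustrative, not the trial's data.

# Hypothetical counts; shows the definitions of ERR and CR only.
import numpy as np

pre = np.array([12, 40, 8, 66, 25, 3, 90, 14])   # hypothetical baseline egg counts
post = np.array([0, 0, 0, 2, 0, 0, 1, 0])        # hypothetical follow-up egg counts

err = 1 - post.mean() / pre.mean()                # egg reduction rate (group means)
cr = np.mean(post == 0)                           # proportion cured (egg-negative at follow-up)
print(f"ERR = {err:.0%}, CR = {cr:.0%}")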

Interpretation/Significance

PZQ treatment is as safe and efficacious in children aged 1–5 years as it is in older children aged 6–10 years, in whom PZQ is the drug of choice for the control of schistosome infections.

14.

Background and Aim

Non-alcoholic fatty liver disease (NAFLD) is a common condition associated with hepatic insulin resistance and the metabolic syndrome, including hyperglycaemia and dyslipidemia. We aimed to study the potential impact of the NAFLD-associated PNPLA3 rs738409 G-allele on NAFLD-related metabolic traits in hyperglycaemic individuals.

Methods

The rs738409 variant was genotyped in the population-based Inter99 cohort, examined by an oral glucose tolerance test, and in a combined study sample consisting of 192 twins (96 twin pairs) and a subset of the Inter99 population (n = 63) examined by a hyperinsulinemic-euglycemic clamp (n total = 255). In Inter99, we analyzed associations of rs738409 with components of the WHO-defined metabolic syndrome (n = 5,847) and traits related to metabolic disease (n = 5,663). In the combined study sample, we examined whether the rs738409 G-allele altered hepatic or peripheral insulin sensitivity. Study populations were divided into individuals with normal glucose tolerance (NGT) and individuals with impaired glucose regulation (IGR).

Results

The case-control study showed no associations with the metabolic syndrome or its components. Among 1,357 IGR individuals, the rs738409 G-allele was associated with decreased fasting serum triglyceride levels (per-allele effect (β) = −9.9% [95% CI −14.4% to −4.0%], p = 5.1 × 10^−5) and decreased fasting total cholesterol (β = −0.2 mmol/l [95% CI −0.3 to −0.01 mmol/l], p = 1.5 × 10^−4). Meta-analyses showed no impact on hepatic or peripheral insulin resistance in carriers of the rs738409 G-allele.

Conclusion

Our findings suggest that the G-allele of PNPLA3 rs738409 is associated with reduced fasting levels of cholesterol and triglycerides in individuals with IGR.

15.

Background

Different clonal types of Toxoplasma gondii are thought to be associated with distinct clinical manifestations of infection. Serotyping is a novel technique that may make it possible to determine the clonal type of T. gondii with which humans are infected and to extend typing studies to larger populations that include infected but non-diseased individuals.

Methodology

A peptide-microarray test for T. gondii serotyping was established with 54 previously published synthetic peptides, which mimic clonal type-specific epitopes. The test was applied to human sera (n = 174) collected from individuals with an acute T. gondii infection (n = 21), a latent T. gondii infection (n = 53) and from T. gondii-seropositive forest workers (n = 100).

Findings

The majority (n = 124; 71%) of all T. gondii seropositive human sera showed reactions against synthetic peptides with sequences specific for clonal type II (type II peptides). Type I and type III peptides were recognized by 42% (n = 73) or 16% (n = 28) of the human sera, respectively, while type II–III, type I–III or type I–II peptides were recognized by 49% (n = 85), 36% (n = 62) or 14% (n = 25) of the sera, respectively. Highest reaction intensities were observed with synthetic peptides mimicking type II-specific epitopes. A proportion of the sera (n = 22; 13%) showed no reaction with type-specific peptides. Individuals with acute toxoplasmosis reacted with a statistically significantly higher number of peptides as compared to individuals with latent T. gondii infection or seropositive forest workers.

Conclusions

Type II-specific reactions were overrepresented and higher in intensity in the study population, which is in accord with genotyping studies on T. gondii oocysts previously conducted in the same area. There were also individuals with type I- or type III-specific reactions. Well-characterized reference sera and further specific peptide markers are needed to establish and perform future serotyping approaches with higher resolution.

16.

Background

Several sub-Saharan African countries have rapidly scaled up the number of households that own insecticide-treated mosquito nets (ITNs). Although the efficacy of ITNs has been shown in trials, evidence on their impact under routine conditions is limited to a few countries, and the extent to which the scale-up of ITNs has improved population health remains uncertain.

Methods and Findings

We used matched logistic regression to assess the individual-level association between household ITN ownership or use in children under 5 years of age and the prevalence of parasitemia among children, using six malaria indicator surveys (MIS) and one demographic and health survey. We used Cox proportional hazards models to assess the relationship between household ITN ownership and child mortality using 29 demographic and health surveys. The pooled relative reduction in parasitemia prevalence from random-effects meta-analysis associated with household ownership of at least one ITN was 20% (95% confidence interval [CI] 3%–35%; I2 = 73.5%, p<0.01 for the I2 value). Sleeping under an ITN was associated with a pooled relative reduction in parasitemia prevalence in children of 24% (95% CI 1%–42%; I2 = 79.5%, p<0.001 for the I2 value). Ownership of at least one ITN was associated with a pooled relative reduction in mortality between 1 month and 5 years of age of 23% (95% CI 13%–31%; I2 = 25.6%, p>0.05 for the I2 value).
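A minimal DerSimonian-Laird random-effects pooling sketch of the type of meta-analysis described above, run on hypothetical per-survey log relative risks of parasitemia for ITN owners versus non-owners; the study estimates, variances and effect scale are assumptions, not the surveys' actual values.

# Hypothetical per-survey estimates; illustrates DerSimonian-Laird pooling and I².
import numpy as np

log_rr = np.array([-0.35, -0.10, -0.30, -0.05, -0.45, -0.20, -0.15])  # assumed log relative risks
var = np.array([0.010, 0.020, 0.015, 0.030, 0.025, 0.012, 0.018])     # assumed variances

w_fixed = 1 / var
fixed_mean = np.sum(w_fixed * log_rr) / w_fixed.sum()
q = np.sum(w_fixed * (log_rr - fixed_mean) ** 2)          # Cochran's Q
dof = len(log_rr) - 1
i2 = max(0.0, (q - dof) / q)                              # I² heterogeneity
tau2 = max(0.0, (q - dof) / (w_fixed.sum() - np.sum(w_fixed**2) / w_fixed.sum()))

w_rand = 1 / (var + tau2)                                 # random-effects weights
pooled = np.sum(w_rand * log_rr) / w_rand.sum()
se = np.sqrt(1 / w_rand.sum())
rr_lo, rr, rr_hi = np.exp([pooled - 1.96 * se, pooled, pooled + 1.96 * se])
print(f"pooled relative reduction: {1 - rr:.0%} "
      f"(95% CI {1 - rr_hi:.0%} to {1 - rr_lo:.0%}), I2 = {i2:.0%}")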

Conclusions

Our findings across a number of sub-Saharan African countries were highly consistent with results from previous clinical trials. These findings suggest that the recent scale-up in ITN coverage has likely been accompanied by significant reductions in child mortality and that additional health gains could be achieved with further increases in ITN coverage in populations at risk of malaria.

17.
Kong Y, Wang D, Shang Y, Liang W, Ling X, Guo Z, He X. PLoS One. 2011;6(9):e24387

Background

The introduction of calcineurin inhibitors (CNIs) made transplantation one of the medical miracles of the past century. However, the side effects of long-term CNI use have become one of the major challenges of the current century; among these, renal dysfunction attracts increasing attention. Herein, we undertook a meta-analysis to evaluate the efficacy and safety of CNI minimization protocols in liver transplant recipients with CNI-related renal dysfunction.

Methods

We included randomized trials with no restriction on year or language. All data were analyzed using a random-effects model in Review Manager 5.0. The primary endpoints were glomerular filtration rate (GFR), serum creatinine level (sCr) and creatinine clearance rate (CrCl), and the secondary endpoints were acute rejection episodes, incidence of infection and patient survival at the end of follow-up.

Results

GFR was significantly higher in the CNI minimization group than in the routine CNI regimen group (Z = 5.45, P<0.00001; I2 = 0%). Likewise, the sCr level was significantly lower in the CNI minimization group (Z = 2.84, P = 0.005; I2 = 39%). However, CrCl was not significantly higher in the CNI minimization group (Z = 1.59, P = 0.11; I2 = 0%). Acute rejection episodes and patient survival were both comparable between the two groups (rejection: Z = 0.01, P = 0.99; I2 = 0%; survival: Z = 0.28, P = 0.78; I2 = 0%). However, current CNI minimization protocols may be related to a higher incidence of infections (Z = 3.06, P = 0.002; I2 = 0%).
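A hedged sketch of the inverse-variance pooling behind summary statistics like "Z = 5.45, P<0.00001": trial-level mean GFR differences (CNI minimization minus routine CNI) are weighted by the inverse of their variance, pooled, and tested against zero. All trial-level numbers here are hypothetical.

# Hypothetical trial-level data; illustrates fixed-effect inverse-variance pooling.
import numpy as np
from scipy.stats import norm

mean_diff = np.array([8.0, 6.5, 10.0, 7.2])   # assumed GFR differences (ml/min)
se = np.array([2.5, 3.0, 4.0, 2.8])           # assumed standard errors

w = 1 / se**2
pooled = np.sum(w * mean_diff) / w.sum()
pooled_se = np.sqrt(1 / w.sum())
z = pooled / pooled_se
p = 2 * norm.sf(abs(z))
print(f"pooled difference = {pooled:.1f} ml/min, Z = {z:.2f}, p = {p:.2g}")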

Conclusion

CNI minimization can preserve or even improve renal function in liver transplant patients with renal impairment, while sharing a similar short-term acute rejection rate and patient survival with the routine CNI regimen.

18.

Background

Recent studies have suggested that higher serum cholesterol may be associated with a lower occurrence of Parkinson's disease (PD). This study tests the hypothesis that higher serum cholesterol correlates with slower PD progression.

Methods

Baseline non-fasting serum total cholesterol was measured in 774 of the 800 subjects with early PD enrolled between 1987 and 1988 in the Deprenyl and Tocopherol Antioxidative Therapy of Parkinsonism (DATATOP) trial. Participants were followed for up to two years, with clinical disability requiring levodopa therapy as the primary endpoint. Hazard ratios (HRs) and 95% confidence intervals (CI) were determined for increasing serum cholesterol concentration (in quintiles) for clinical disability requiring levodopa therapy, after adjusting for confounders. At baseline, only nine subjects reported use of cholesterol-lowering agents (two with statins).

Results

The overall mean cholesterol level was 216 mg/dL (range 100–355). The HR of progressing to the primary endpoint decreased with increasing serum cholesterol concentrations. Compared with the lowest quintile, the HRs (95% CI) for each higher quintile (in ascending order) were 0.83 (0.59–1.16), 0.86 (0.61–1.20), 0.84 (0.60–1.18), and 0.75 (0.52–1.09). The HR for a one standard deviation (SD) increase was 0.90 (0.80–1.01; p for trend = 0.09). This trend was found in males (HR per SD = 0.88 [0.77–1.00]; p for trend = 0.05) but not in females (HR = 1.03 [0.81–1.32]).

Conclusions

This secondary analysis of the DATATOP trial provides preliminary evidence that higher total serum cholesterol concentrations may be associated with modestly slower clinical progression of PD; this preliminary finding needs confirmation in larger prospective studies.

19.

Background

We evaluated kDNA PCR/hybridization and quantitative real-time PCR (qPCR) targeting the DNA polymerase gene of Leishmania infantum for the diagnosis of canine visceral leishmaniasis (CVL) and the assessment of parasite load in clinical samples obtained invasively and non-invasively.

Methodology/Principal Findings

Eighty naturally infected dogs from an endemic urban area in Brazil were used. Animals were divided into two groups based on the presence or absence of CVL clinical signs. Skin biopsy, bone marrow, blood and conjunctival swab samples were collected and tested for L. infantum DNA. In addition, anti-Leishmania antibody titers were measured by the immunofluorescence antibody test. The symptomatic dogs had higher titers than the asymptomatic dogs (P = 0.025). The frequencies of positive results obtained by kDNA PCR/hybridization for asymptomatic and symptomatic dogs, respectively, were as follows: right conjunctiva, 77.5% and 95.0%; left conjunctiva, 75.0% and 87.5%; skin, 45.0% and 75.0%; bone marrow, 50.0% and 77.5%; and blood, 27.5% and 22.5%. In both groups, the parasite load in the skin samples was the highest (P<0.0001). The parasite loads in the conjunctival swab and bone marrow samples were statistically equivalent within each group. The parasite burden in conjunctival swabs was higher in the dogs with clinical signs than in asymptomatic dogs (P = 0.028). The same relationship was observed in the bone marrow samples (P = 0.002). No difference in amastigote load in the skin was detected between the groups.

Conclusions

The conjunctival swab is a suitable clinical sample for qualitative molecular diagnosis of CVL. The highest parasite burdens were detected in the skin regardless of the presence of VL-associated clinical signs. The qPCR results emphasize the role of dogs, particularly asymptomatic dogs, as reservoirs for CVL because of their high cutaneous parasite loads. These results may help to explain the maintenance of high transmission rates and numbers of CVL cases in endemic urban regions.

20.

Background

We evaluated the Ziehl-Neelsen staining (ZNS) technique for the diagnosis of paragonimiasis in Laos and compared different modifications of the ZNS technique.

Methodology

We applied the following approach: we (1) examined a paragonimiasis index case's sputum with wet film direct examination (WF) and ZNS; (2) re-examined stored ZNS slides from two provinces; (3) prospectively compared WF, ZNS and the formalin-ether concentration technique (FECT) for sputum examination of patients with chronic cough; and (4) compared different ZNS procedures. Finally, we assessed the excess direct costs associated with the use of the different diagnostic techniques.

Principal Findings

Paragonimus eggs were clearly visible in the WF and ZNS sputum samples of the index case. They appeared brownish-reddish in ZNS and were detected in 6 of 263 archived ZNS slides, corresponding to 5 patients. One hundred sputum samples from 43 patients were examined with the three techniques, which revealed that 6 patients had paragonimiasis (13 positive samples). Sensitivity per slide was 84.6% (p = 0.48) for FECT, 76.9% (p = 0.25) for ZNS and 61.5% (p = 0.07) for the WF technique. The percentage of fragmented eggs was below 19% and did not differ between techniques (p = 0.13). Additional operational costs per slide were US$ 0 (ZNS), US$ 0.10 (WF) and US$ 0.79 (FECT). ZNS slides heated for five minutes contained fewer eggs than briefly heated slides (29 eggs per slide [eps] vs. 42 eps, p = 0.01). Bloodstained sputum portions contained more eggs than unstained parts (3.3 eps vs. 0.7 eps, p = 0.016).

Conclusions/Significance

Paragonimus eggs can easily be detected in today's widely used ZNS sputum slides. The ZNS technique appears superior to standard WF sputum examination for paragonimiasis and eliminates the risk of tuberculosis transmission. Our findings suggest that ZNS sputum slides should also be examined routinely for Paragonimus eggs. The ZNS technique has potential in epidemiological research on paragonimiasis.
