Similar articles
1.

Introduction

Antimalarial drug resistance has led to a global policy of artemisinin-based combination therapy. Despite growing resistance, chloroquine (CQ) remained until recently the official first-line treatment for falciparum malaria in Pakistan, with sulfadoxine-pyrimethamine (SP) as second-line. Co-treatment with the gametocytocidal primaquine (PQ) is recommended for transmission control in South Asia. The relative effects of artesunate (AS) or primaquine, as partner drugs, on clinical outcomes and gametocyte carriage in this setting were unknown.

Methods

A single-blinded, randomized trial among Afghan refugees in Pakistan compared six treatment arms: CQ; CQ+(single-dose)PQ; CQ+(3 d)AS; SP; SP+(single-dose)PQ; and SP+(3 d)AS. The objectives were to compare treatment failure rates and effects on gametocyte carriage of CQ or SP monotherapy against the respective combinations (with PQ or AS). Outcomes included trophozoite and gametocyte clearance (read by light microscopy), and clinical and parasitological failure.

Findings

A total of 308 (87%) patients completed the trial. Failure rates by day 28 were: CQ 55/68 (81%), CQ+AS 19/67 (28%), SP 4/41 (9.8%), and SP+AS 1/41 (2.4%). The addition of PQ to CQ or SP did not affect failure rates (CQ+PQ 49/67 (73%) failed; SP+PQ 5/33 (16%) failed). AS was superior to PQ at clearing gametocytes; gametocytes were seen on day 7 in 85% of CQ, 40% of CQ+PQ, 21% of CQ+AS, 91% of SP, 76% of SP+PQ, and 23% of SP+AS treated patients. PQ was more effective at clearing older gametocyte infections, whereas AS was more effective at preventing emergence of mature gametocytes, except in cases that recrudesced.
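As a check on the arithmetic, the day-28 failure rates and the absolute risk reduction from adding AS to CQ can be reproduced from the counts reported above (a minimal sketch; arm labels mirror the abstract):

```python
# Sketch: day-28 failure proportions and absolute risk reduction (ARR),
# computed from the counts reported in the abstract.

def proportion(failures, n):
    """Failure proportion as a percentage."""
    return 100.0 * failures / n

arms = {                     # (failures, total), from the abstract
    "CQ":    (55, 68),
    "CQ+AS": (19, 67),
    "SP":    (4, 41),
    "SP+AS": (1, 41),
}

rates = {arm: proportion(f, n) for arm, (f, n) in arms.items()}
arr_cq = rates["CQ"] - rates["CQ+AS"]   # absolute risk reduction from adding AS

print(round(rates["CQ"]))     # 81
print(round(rates["CQ+AS"]))  # 28
print(round(arr_cq))          # 53
```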

Conclusions

CQ is no longer appropriate, by itself or in combination. These findings influenced the replacement of CQ with SP+AS as first-line treatment of uncomplicated falciparum malaria in the WHO Eastern Mediterranean Region. The threat of SP resistance remains, as SP monotherapy is still common. Three-day AS was superior to single-dose PQ for reducing gametocyte carriage.

Trial Registration

ClinicalTrials.gov

2.

Background

Pesticide ingestion is a common method of self-harm in the rural developing world. In an attempt to reduce the high case fatality seen with the herbicide paraquat, a novel formulation (INTEON) has been developed containing an increased emetic concentration, a purgative, and an alginate that forms a gel under the acid conditions of the stomach, potentially slowing the absorption of paraquat and giving the emetic more time to be effective. We compared the outcome of paraquat self-poisoning with the standard formulation against the new INTEON formulation following its introduction into Sri Lanka.

Methods and Findings

Clinical data were prospectively collected on 586 patients with paraquat ingestion presenting to nine large hospitals across Sri Lanka, with survival to 3 mo as the primary outcome. The identity of the formulation ingested after October 2004 was confirmed by assay of blood or urine samples for a marker compound present in INTEON. The proportion of known survivors increased from 76/297 with the standard formulation to 103/289 with INTEON ingestion, and estimated 3-mo survival improved from 27.1% to 36.7% (difference 9.5%; 95% confidence interval [CI] 2.0%–17.1%; p = 0.002, log rank test). Cox proportional hazards regression analyses showed an approximately 2-fold reduction in toxicity for INTEON compared to the standard formulation. A higher proportion of patients ingesting INTEON vomited within 15 min (38% with the standard formulation vs. 55% with INTEON, p < 0.001). Median survival time increased from 2.3 d (95% CI 1.2–3.4 d) with the standard formulation to 6.9 d (95% CI 3.3–10.7 d) with INTEON ingestion (p = 0.002, log rank test); however, in patients who did not survive there was a comparatively smaller increase in median time to death, from 0.9 d (interquartile range [IQR] 0.5–3.4) to 1.5 d (IQR 0.5–5.5); p = 0.02.
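For illustration, a Wald-type 95% CI for the difference in crude survival proportions can be sketched from the reported counts. Note that the abstract's own figures (9.5%; CI 2.0%–17.1%) come from Kaplan-Meier/log-rank analysis, so the crude numbers below differ slightly:

```python
# Sketch: Wald 95% CI for a difference in two crude proportions,
# applied to the reported survivor counts (76/297 standard vs.
# 103/289 INTEON). This is NOT the paper's Kaplan-Meier estimate.
import math

def wald_ci_diff(x1, n1, x2, n2, z=1.96):
    """Difference p2 - p1 with a Wald (normal-approximation) 95% CI."""
    p1, p2 = x1 / n1, x2 / n2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    d = p2 - p1
    return d, (d - z * se, d + z * se)

d, (lo, hi) = wald_ci_diff(76, 297, 103, 289)
print(round(100 * d, 1))                       # 10.1 (crude difference)
print(round(100 * lo, 1), round(100 * hi, 1))  # 2.6 17.5
```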

Conclusions

This study showed that INTEON technology significantly reduces the mortality of patients following paraquat ingestion and increases survival time, most likely by reducing absorption.

3.

Background

Paraquat (PQ) is a potent, highly toxic and widely used herbicide. The major medical problems associated with PQ are accidental or suicidal ingestion. There are several prognostic markers of PQ poisoning, with the serum PQ concentration considered to be the best indicator of outcome. However, the measurement of such markers is limited in many hospitals.

Objective

The present study was conducted to investigate the association between absolute lymphocyte count (ALC) and the 30-day mortality rate in patients with PQ poisoning.

Methods

We performed a retrospective analysis of patients admitted to the emergency department after paraquat poisoning between January 2010 and April 2013. Independent risk factors for 30-day mortality, including ALC, were determined. The ALC was categorized in quartiles as ≤1700, 1700 to 3200, 3200 to 5000, and >5000. Univariate and multivariate Cox proportional hazards analyses were performed to determine the independent risk factors for mortality.
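The quartile categorization described above can be sketched as a simple binning function (cut-points from the abstract; units as reported there):

```python
# Sketch: assigning patients to the ALC quartile categories used in the
# abstract (<=1700, 1700-3200, 3200-5000, >5000).

def alc_quartile(alc):
    """Return the quartile category (1-4) for an absolute lymphocyte count."""
    if alc <= 1700:
        return 1
    elif alc <= 3200:
        return 2
    elif alc <= 5000:
        return 3
    return 4

print([alc_quartile(v) for v in (900, 2500, 4100, 7200)])  # [1, 2, 3, 4]
```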

Results

A total of 136 patients were included in the study, and the 30-day mortality was 73.5%. ALC was significantly higher in nonsurvivors than in survivors. The highest ALC quartile (ALC >5000; hazard ratio, 2.58; 95% CI, 1.08–6.21) was associated with increased mortality in multivariate analysis. In addition, older age, lower arterial PaCO2, increased peripheral neutrophil count, and high serum creatinine were associated with mortality.

Conclusion

The absolute lymphocyte count is associated with the 30-day mortality rate in patients with paraquat poisoning.

4.

Background

There are limited data on the clinical outcomes of patients with pandemic H1N1 (pH1N1) pneumonia who received oseltamivir treatment, especially when treatment was administered more than 48 hours after symptom onset.

Methods

During the 2009 pandemic, a cohort of patients with pH1N1 influenza pneumonia was assembled in China; their clinical information was collected systematically and analyzed with Cox models.

Results

A total of 920 adults and 541 children with pneumonia who did not receive corticosteroids were analyzed. In-hospital mortality was higher in adults who did not receive antiviral therapy (18.2%) than in those who received oseltamivir ≤2 days (2.9%), between 2–5 days (4.6%), or >5 days after illness onset (4.9%), p<0.01. A similar trend was observed in pediatric patients. Cox regression showed that at 60 days after symptom onset, 11 patients (10.8%) who did not receive antivirals had died, versus 4 (1.8%), 18 (3.3%), and 23 (3.7%) patients whose oseltamivir treatment was started ≤2 days, between 2–5 days, and >5 days, respectively. For male patients, patients aged ≥14 years, and patients with baseline PaO2/FiO2 <200, oseltamivir administration reduced the mortality risk by 92.1%, 88%, and 83.5%, respectively. Higher doses of oseltamivir (>3.8 mg/kg/d) did not improve clinical outcome (mortality, higher dose 2.5% vs standard dose 2.8%, p>0.05).

Conclusions

Antiviral therapy might reduce mortality in patients with pH1N1 pneumonia, even when initiated more than 48 hours after onset of illness. Protective effects might be greater in males, patients aged 14–60 years, and patients with PaO2/FiO2 <200.

5.
Bao W  Rong S  Zhang M  Yu X  Zhao Y  Xiao X  Yang W  Wang D  Yao P  Hu FB  Liu L 《PloS one》2012,7(3):e32223

Background

Our previous study showed that plasma heme oxygenase-1 (HO-1), a stress-responsive protein, is elevated in individuals with type 2 diabetes. The current study aimed to examine the association between plasma HO-1 concentration and impaired glucose regulation (IGR) in non-diabetic individuals.

Methods

We conducted a case-control study including a total of 865 subjects (262 IGR individuals and 603 healthy controls) in a Chinese population. Basic characteristics were collected by questionnaire and standardized anthropometric measurements. Plasma HO-1 concentration was determined by ELISA.

Results

Plasma HO-1 concentration was significantly increased in IGR individuals compared with healthy controls (1.34 (0.81–2.29) ng/ml vs 0.98 (0.56–1.55) ng/ml, P<0.001). After adjustment for age, sex, and BMI, the OR for IGR in the highest quartile of plasma HO-1 concentration, compared with the lowest, was 3.42 (95% CI 2.11–5.54; P for trend <0.001). The trend remained significant even after additional adjustment for smoking, alcohol drinking, hypertension, family history of diabetes, lipid profiles, and C-reactive protein. In receiver-operating characteristic curve analysis, addition of plasma HO-1 concentration to a model with known risk factors yielded significantly improved discriminative value for IGR (area under the curve 0.75 (95% CI 0.71–0.78) vs. 0.72 (95% CI 0.69–0.76); P for difference = 0.026).
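The discriminative value reported above is a ROC AUC, which equals the Mann-Whitney probability that a randomly chosen case has a higher marker value than a randomly chosen control. A minimal sketch with hypothetical HO-1 values (not the study's data):

```python
# Sketch: rank-based ROC AUC (Mann-Whitney form). Ties count as half.
# The plasma HO-1 values below are illustrative only, not study data.

def roc_auc(cases, controls):
    """Probability a random case outranks a random control (ties = 0.5)."""
    wins = sum((c > k) + 0.5 * (c == k) for c in cases for k in controls)
    return wins / (len(cases) * len(controls))

cases = [1.3, 2.3, 0.8, 1.9]      # hypothetical IGR plasma HO-1 (ng/ml)
controls = [0.9, 0.6, 1.5, 1.0]   # hypothetical healthy controls
print(roc_auc(cases, controls))   # 0.75
```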

Conclusions

Elevated plasma HO-1 concentration is significantly associated with increased ORs for IGR. However, its clinical utility should be validated in further studies, especially in prospective cohort studies.

6.

Background

In Uganda, Rhodesian sleeping sickness, caused by Trypanosoma brucei rhodesiense, and animal trypanosomiasis caused by T. vivax and T. congolense, are being controlled by treating cattle with trypanocides and/or insecticides. We used a mathematical model to identify treatment coverages required to break transmission when host populations consisted of various proportions of wild and domestic mammals, and reptiles.

Methodology/Principal Findings

An R0 model for trypanosomiasis was generalized to allow tsetse to feed on multiple host species. Assuming populations of cattle and humans only, pre-intervention R0 values for T. vivax, T. congolense, and T. brucei were 388, 64, and 3, respectively. Treating cattle with trypanocides reduced R0 for T. brucei to <1 if >65% of cattle were treated, vs the 100% coverage necessary for T. vivax and T. congolense. The presence of wild mammalian hosts increased the coverage required and made control of T. vivax and T. congolense impossible. When tsetse fed only on cattle or humans, R0 for T. brucei was <1 if 20% of cattle were treated with insecticide, compared to 55% for T. congolense. If wild mammalian hosts were also present, control of the two species was impossible if the proportions of non-human bloodmeals from cattle were <40% or <70%, respectively. R0 was <1 for T. vivax only when insecticide treatment led to reductions in the tsetse population. Under such circumstances, R0 was <1 for T. brucei and T. congolense if cattle made up 30% and 55%, respectively, of the non-human tsetse bloodmeals, as long as all cattle were treated with insecticide.
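The coverage thresholds can be illustrated with the textbook simplification that treating a fraction c of the host population scales R0 by (1 − c), so transmission breaks when c > 1 − 1/R0. This ignores the multi-host structure of the paper's model but reproduces the headline figures:

```python
# Sketch: critical treatment coverage under the simplifying assumption
# that the effective reproduction number is R0 * (1 - c). This is a
# single-host approximation, not the paper's generalized model.

def critical_coverage(r0):
    """Minimum coverage fraction needed to push R0 * (1 - c) below 1."""
    return max(0.0, 1.0 - 1.0 / r0)

for species, r0 in [("T. brucei", 3), ("T. congolense", 64), ("T. vivax", 388)]:
    print(species, round(100 * critical_coverage(r0), 1))
# T. brucei 66.7 (cf. ">65%" in the abstract); T. congolense 98.4 and
# T. vivax 99.7 -- effectively complete coverage, as the abstract states
```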

Conclusions/Significance

In settled areas of Uganda with few wild hosts, Rhodesian sleeping sickness is likely to be much more effectively controlled by treating cattle with insecticide than with trypanocides.

7.

Background

Thousands of paraquat (PQ)-poisoned patients continue to die, particularly in developing countries. Although animal studies indicate that hemoperfusion (HP) within 2–4 h after intoxication effectively reduces mortality, the effect of early HP in humans remains unknown.

Methods

We analyzed the records of all PQ-poisoned patients admitted to 2 hospitals between 2000 and 2009. Patients were grouped according to early or late HP and high-dose (oral cyclophosphamide [CP] and intravenous dexamethasone [DX]) or repeated pulse (intravenous methylprednisolone [MP] and CP, followed by DX and repeated MP and/or CP) PQ therapy. Early HP was defined as HP <4 h, and late HP, as HP ≥4 h after PQ ingestion. We evaluated the associations between HP <4 h, <5 h, <6 h, and <7 h after PQ ingestion and the outcomes. Demographic, clinical, laboratory, and mortality data were analyzed.

Results

The study included 207 severely PQ-poisoned patients. Forward stepwise multivariate Cox hazard regression analysis showed that early HP <4 h (hazard ratio [HR] = 0.38, 95% confidence interval [CI]: 0.16–0.86; P = 0.020) or HP <5 h (HR = 0.60, 95% CI: 0.39–0.92; P = 0.019) significantly decreased the mortality risk. Further analysis showed that early HP reduced the mortality risk only in patients treated with repeated pulse therapy (n = 136), but not high-dose therapy (n = 71). Forward stepwise multivariate Cox hazard regression analysis showed that HP <4.0 h (HR = 0.19, 95% CI: 0.05–0.79; P = 0.022) or <5.0 h (HR = 0.49, 95% CI: 0.24–0.98; P = 0.043) after PQ ingestion significantly decreased the mortality risk in repeated pulse therapy patients, after adjustment for relevant variables.

Conclusion

The results showed that early HP after PQ exposure might be effective in reducing mortality in severely poisoned patients, particularly in those treated with repeated pulse therapy.

8.

Background

Dengue is a major public health problem in tropical and subtropical countries. Exploring the relationships between virological features of infection with patient immune status and outcome may help to identify predictors of disease severity and enable rational therapeutic strategies.

Methods

Clinical features, antibody responses and virological markers were characterized in Vietnamese adults participating in a randomised controlled treatment trial of chloroquine.

Results

Of the 248 patients with laboratory-confirmed dengue and defined serological and clinical classifications, 29 (11.7%) had primary DF, 150 (60.5%) had secondary DF, 4 (1.6%) had primary DHF, and 65 (26.2%) had secondary DHF. DENV-1 was the commonest serotype (57.3%), followed by DENV-2 (20.6%), DENV-3 (15.7%), and DENV-4 (2.8%). DHF was associated with secondary infection (odds ratio = 3.13, 95% CI 1.04–12.75). DENV-1 infections resulted in significantly higher viremia levels than DENV-2 infections. Early viremia levels were higher in DENV-1 patients with DHF than in those with DF, even though the peak viremia level was often not observed because it occurred prior to enrolment. Peak viremias were significantly less often observed during secondary infections than during primary infections, for all disease severity grades (P = 0.001). The clearance of DENV viremia and NS1 antigenemia occurred earlier and faster in patients with secondary dengue (P<0.0001). The maximum daily rate of viremia clearance was significantly higher in patients with secondary infections than in those with primary infections (P<0.00001).

Conclusions

Collectively, our findings suggest that the early magnitude of viremia is positively associated with disease severity. The clearance of DENV is associated with immune status, and there are serotype dependent differences in infection kinetics. These findings are relevant for the rational design of randomized controlled trials of therapeutic interventions, especially antivirals.

9.

Objectives

We prospectively examined whether socioeconomic status (SES) predicts incident type II diabetes (diabetes), a cardiovascular risk equivalent and burgeoning public health epidemic among women.

Methods

Participants included 23,992 women with HbA1c levels <6% and no CVD or diabetes at baseline, followed from February 1993 to March 2007. SES was measured by education and income, while diabetes was self-reported.

Results

Over 12.3 years of follow-up, 1,262 women developed diabetes. In age- and race-adjusted models, the relative risk of diabetes decreased with increasing education (<2 years of nursing, 2 to <4 years of nursing, bachelor's degree, master's degree, and doctorate: 1.0, 0.7 (95% confidence interval [CI], 0.6–0.8), 0.6 (95% CI, 0.5–0.7), 0.5 (95% CI, 0.4–0.6), 0.4 (95% CI, 0.3–0.5); ptrend<0.001). Adjustment for traditional and non-traditional cardiovascular risk factors attenuated this relationship (education: ptrend = 0.96). Similar associations were observed between income categories and diabetes.

Conclusion

Advanced education and higher income were both inversely associated with incident diabetes, even in this relatively well-educated cohort. This relationship was largely explained by behavioral factors, particularly body mass index.

10.

Background

The relationships between the infecting dengue serotype, primary and secondary infection, viremia and dengue severity remain unclear. This cross-sectional study examined these interactions in adult patients hospitalized with dengue in Ha Noi.

Methods and Findings

A total of 158 patients were enrolled between September 16 and November 11, 2008. Quantitative RT-PCR, serology, and NS1 detection were used to confirm dengue infection, determine the serotype and plasma viral RNA concentration, and categorize infections as primary or secondary. 130 (82%) were laboratory-confirmed. Serology was consistent with primary and secondary infection in 34% and 61%, respectively. The infecting serotype was DENV-1 in 42 (32%), DENV-2 in 39 (30%), and unknown in 49 (38%). Secondary infection was more common in DENV-2 infections (79%) compared to DENV-1 (36%, p<0.001). The proportion that developed dengue haemorrhagic fever (DHF) was 32% for secondary infection compared to 18% for primary infection (p = 0.14), and 26% for DENV-1 compared to 28% for DENV-2. The time until NS1 and plasma viral RNA were undetectable was shorter for DENV-2 compared to DENV-1 (p≤0.001), and plasma viral RNA concentration on day 5 was higher for DENV-1 (p = 0.03). Plasma viral RNA concentration was higher in secondary infection on day 5 of illness (p = 0.046). We did not find an association between plasma viral RNA concentration and clinical severity.

Conclusion

Dengue is emerging as a major public health problem in Ha Noi. DENV-1 and DENV-2 were the prevalent serotypes, with similar numbers and clinical presentations. Secondary infection may be more common among DENV-2 than DENV-1 infections; consistent with this, DENV-2 infections resulted in lower plasma viral RNA concentrations while viral RNA concentrations were higher in secondary infection. The drivers of dengue emergence in northern Viet Nam need to be elucidated and public health measures instituted.

11.

Background

Vivax malaria remains a major cause of morbidity in the subtropics. To undermine the stability of the disease, drugs are required that prevent relapse and provide reservoir reduction. A 14-day course of primaquine (PQ) is effective but cannot safely be used in routine practice because of the hemolytic risk it poses in glucose-6-phosphate dehydrogenase (G6PD) deficiency, for which testing is seldom available. Safe and effective use of PQ without the need for G6PD testing would be ideal. The efficacy and safety of an 8-week, once-weekly PQ regimen were compared with current standard treatment (chloroquine alone) and a 14-day PQ regimen.

Methods and Principal Findings

A total of 200 microscopically confirmed Plasmodium vivax patients were randomly assigned to once-weekly 8-week PQ (0.75 mg/kg/week), once-weekly 8-week placebo, or 14-day PQ (0.5 mg/kg/day) in North West Frontier Province, Pakistan. All patients were treated with a standard chloroquine dose and tested for G6PD deficiency; deficient patients were assigned to the 8-week PQ group. Failure was defined as any subsequent episode of vivax malaria over 11 months of observation. There were 22/71 (31.0%) failures in the placebo group and 1/55 (1.8%) and 4/75 (5.1%) failures in the 14-day and 8-week PQ groups, respectively. Adjusted odds ratios were 0.05 (95% CI: 0.01–0.2, p<0.001) for 8-week PQ vs. placebo and 0.01 (95% CI: 0.002–0.1, p<0.001) for 14-day PQ vs. placebo. Restricted analysis allowing for a post-treatment prophylactic effect confirmed that the 8-week regimen was superior to current treatment. Only one G6PD-deficient patient presented. There were no serious adverse events.

Conclusions

A practical radical treatment for vivax malaria is essential for control and elimination of the disease. The 8-week PQ course is more effective at preventing relapse than current treatment with chloroquine alone. Widespread use of the 8-week regimen could make an important contribution to reservoir reduction or regional elimination where G6PD testing is not available.

Trial Registration

ClinicalTrials.gov NCT00158587

12.

Introduction

Recent studies have shown that apoptosis plays a critical role in the pathogenesis of sepsis. High plasma cell-free DNA (cf-DNA) concentrations have been shown to be associated with sepsis outcome. The origin of cf-DNA is unclear.

Methods

Total plasma cf-DNA was quantified directly in plasma, and amplifiable cf-DNA was assessed using quantitative PCR, in 132 patients with bacteremia caused by Staphylococcus aureus, Streptococcus pneumoniae, β-hemolytic streptococci, or Escherichia coli. The quality of cf-DNA was analyzed with a DNA chip assay performed on 8 survivors and 8 nonsurvivors. Values were measured on days 1–4 after positive blood culture, on days 5–17, and on recovery.

Results

The maximum cf-DNA values on days 1–4 (n = 132) were markedly higher in nonsurvivors than in survivors (2.03 vs 1.26 µg/ml, p<0.001), and the area under the ROC curve in the prediction of case fatality was 0.81 (95% CI 0.69–0.94). cf-DNA at a cut-off level of 1.52 µg/ml showed 83% sensitivity and 79% specificity for fatal disease. High cf-DNA (>1.52 µg/ml) remained an independent risk factor for case fatality in a logistic regression model. Qualitative analysis showed that cf-DNA displayed a predominant low-molecular-weight band (150–200 bp) in nonsurvivors, corresponding to the size of apoptotic nucleosomal DNA. cf-DNA concentration showed a significant positive correlation with visually graded apoptotic band intensity (R = 0.822, p<0.001).
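The cut-off statistics above follow the usual definitions of sensitivity and specificity. A minimal sketch using the abstract's 1.52 µg/ml threshold with hypothetical cf-DNA values (not the study's data):

```python
# Sketch: sensitivity and specificity at a fixed biomarker cut-off.
# The cf-DNA values below are illustrative only, not study data.

def sens_spec(values_fatal, values_survived, cutoff):
    """Sensitivity = fraction of fatal cases above cutoff;
    specificity = fraction of survivors at or below cutoff."""
    tp = sum(v > cutoff for v in values_fatal)
    tn = sum(v <= cutoff for v in values_survived)
    return tp / len(values_fatal), tn / len(values_survived)

fatal = [2.1, 1.8, 1.4, 2.6]      # hypothetical cf-DNA, nonsurvivors (ug/ml)
survived = [1.1, 1.3, 1.6, 0.9]   # hypothetical cf-DNA, survivors
sens, spec = sens_spec(fatal, survived, 1.52)
print(sens, spec)                 # 0.75 0.75
```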

Conclusions

Plasma cf-DNA concentration proved to be a specific, independent prognostic biomarker in bacteremia. In nonsurvivors, cf-DNA displayed a predominant low-molecular-weight band corresponding in size to apoptotic nucleosomal DNA.

13.
Kelso JK  Halder N  Milne GJ 《PloS one》2010,5(11):e13797

Background

Neuraminidase inhibitors were used to reduce the transmission of pandemic influenza A/H1N1 2009 at the early stages of the 2009/2010 pandemic. Policies for diagnosis of influenza for the purposes of antiviral intervention differed markedly between and within countries, leading to differences in the timing and scale of antiviral usage.

Methodology/Principal Findings

The impact of the percentage of symptomatic infected individuals who were diagnosed, and of delays to diagnosis, on three antiviral intervention strategies (each with and without school closure) was determined using a simulation model of an Australian community. Epidemic characteristics were based on actual data from the A/H1N1 2009 pandemic, including reproduction number, serial interval, and age-specific infection rate profile. In the absence of intervention, an illness attack rate (AR) of 24.5% was determined from an estimated R0 of 1.5; this was reduced to 21%, 16.5%, or 13% by treatment-only, treatment plus household prophylaxis, or treatment plus household plus extended prophylaxis antiviral interventions, respectively, assuming that diagnosis occurred 24 hours after symptoms arose and that 50% of symptomatic cases were diagnosed. If diagnosis occurred without delay, ARs decreased to 17%, 12.2%, or 8.8%, respectively. If 90% of symptomatic cases were diagnosed (with a 24-hour delay), ARs decreased to 17.8%, 11.1%, and 7.6%, respectively.

Conclusion

The ability to rapidly diagnose symptomatic cases, and to diagnose a high proportion of cases, was shown to improve the effectiveness of all three antiviral strategies. For epidemics with R0 ≤ 1.5, our results suggest that when case diagnosis coverage exceeds ∼70%, the size of the antiviral stockpile required to implement the extended prophylactic strategy decreases. The addition of at least four weeks of school closure was found to further reduce cumulative and peak attack rates and the size of the required antiviral stockpile.

14.

Objective

To determine clinically related characteristics in patients with pure lower motor neuron (LMN) syndromes, not fulfilling accepted diagnostic criteria, who were likely to respond to intravenous immunoglobulin (IVIg) treatment.

Methods

Demographic, clinical, laboratory and neurophysiological characteristics were prospectively collected from patients with undifferentiated isolated LMN syndromes who were then treated with IVIg. Patients were classified as either responders or non-responders to therapy with IVIg based on clinical data and the two groups were compared.

Results

From a total cohort of 42 patients (30 males, 12 females, aged 18–83 years), 31 patients responded to IVIg and 11 did not. Compared to patients who developed progressive neurological decline, responders were typically younger (45.8 compared to 56.0 years, P<0.05) and more often had upper-limb (83.9% compared to 63.6%, NS), unilateral (80.6% compared to 45.5%, P<0.05), and isolated distal (54.1% compared to 9.1%, P<0.05) weakness. Patients with predominantly upper-limb, asymmetrical, and distal weakness were more likely to respond to IVIg therapy. Of the patients who responded to treatment, only 12.9% had detectable GM1 antibodies, and conduction block (not fulfilling diagnostic criteria) was identified in only 22.6%.

Conclusions

More than 70% of patients with pure LMN syndromes from the present series responded to treatment with IVIg therapy, despite a low prevalence of detectable GM1 antibodies and conduction block. Patients with isolated LMN presentations, not fulfilling accepted diagnostic criteria, may respond to IVIg therapy, irrespective of the presence of conduction block or GM1 antibodies, and should be given an empirical trial of IVIg to determine treatment responsiveness.

15.

Background

In areas of seasonal malaria transmission, treatment of asymptomatic carriers of malaria parasites, whose parasitaemia persists at low densities throughout the dry season, could be a useful strategy for malaria control. We carried out a randomized trial to compare two drug regimens for clearance of parasitaemia in order to identify the optimum regimen for use in mass drug administration in the dry season.

Methodology and Principal Findings

A two-arm open-label randomized controlled trial was conducted during the dry season in an area of distinctly seasonal malaria in two villages in Gedarif State in eastern Sudan. Participants were asymptomatic adults and children aged over 6 months with low-density P. falciparum infection detected by PCR. Participants were randomized to receive an artesunate/sulfadoxine-pyrimethamine (AS+SP) combination for three days, with or without a dose of primaquine (PQ) on the fourth day. Parasitaemia detected by PCR on days 3, 7, and 14 after the start of treatment, and gametocytes detected by RT-PCR on days 7 and 14, were recorded. A total of 104 individuals who had low-density parasitaemia at screening were randomized and treated during the dry season. On day 7, 8.3% were positive by PCR in the AS+SP+PQ group and 6.5% in the AS+SP group (risk difference 1.8%, 95% CI −10.3% to +13.8%). At enrolment, 12% (12/100) were carrying gametocytes. This was reduced to 6.4% and 4.4% by day 14 (risk difference 1.9%, 95% CI −9.3% to +13.2%) in the AS+SP+PQ and AS+SP groups, respectively.

Conclusion

Addition of primaquine to artemisinin combination treatment did not improve elimination of parasitaemia and prevention of gametocyte carriage in carriers with low-density parasitaemia in the dry season.

Trial Registration

ClinicalTrials.gov NCT00330902

16.

Background

Aortic arch anomalies (AAA) are rare cardiovascular anomalies. Right-sided and double-sided aortic arch anomalies (RAAA, DAAA) are distinguished; both may cause airway obstruction. We studied the degree of airway obstruction in infants with AAA by neonatal lung function testing (LFT).

Patients and Methods

A total of 17 patients (10 RAAA and 7 DAAA) with a prenatal diagnosis of AAA were investigated. The median (range) post-conception age at LFT was 40.3 (36.6–44.1) weeks, and the median body weight was 3400 (2320–4665) g. Measurements included tidal breathing flow-volume loops (TBFVL), airway resistance (Raw) by bodyplethysmography, and the maximal expiratory flow at functional residual capacity (V′maxFRC) by the rapid thoracic-abdominal compression (RTC) technique. V′maxFRC was also expressed in Z-scores, based on published gender-, age-, and height-specific reference values.

Results

Abnormal lung function tests were seen in both RAAA and DAAA infants. Compared to RAAA infants, infants with DAAA had significantly more expiratory flow limitations in the TBFVL (86% vs. 30%, p<0.05) and a significantly increased Raw (p = 0.015). Despite a significant correlation between Raw and the Z-score of V′maxFRC (r = 0.740, p<0.001), there were no statistically significant differences in V′maxFRC and its Z-scores between RAAA and DAAA infants. Four (24%) infants (2 RAAA, 2 DAAA) were near or below the 10th percentile of V′maxFRC, indicating a high risk of airway obstruction.
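The Z-score and 10th-percentile criteria above can be sketched as follows; the reference mean and SD used here are placeholders, not the published reference values:

```python
# Sketch: converting a measured V'maxFRC into a Z-score against reference
# values, and the Z that marks the 10th percentile of a normal reference
# distribution. Reference mean/SD below are hypothetical placeholders.
from statistics import NormalDist

def z_score(value, ref_mean, ref_sd):
    """Standardized deviation of a measurement from its reference mean."""
    return (value - ref_mean) / ref_sd

z10 = NormalDist().inv_cdf(0.10)           # Z at the 10th percentile
print(round(z10, 2))                        # -1.28
print(z_score(95.0, 140.0, 35.0) < z10)     # True: below the 10th percentile
```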

Conclusion

Infants with RAAA and with DAAA are both at risk of airway obstruction, and early LFT helps to identify and monitor these infants. This may support the decision for therapeutic interventions before clinical symptoms arise.

17.

Background

Infection by the pandemic influenza A (H1N1/09) virus resulted in significant pathology among specific ethnic groups worldwide. Natural killer (NK) cells are important in early innate immune responses to viral infections. Activation of NK cells depends, in part, on killer-cell immunoglobulin-like receptor (KIR) and HLA class I ligand interactions. To study factors involved in NK cell dysfunction in overactive immune responses to H1N1 infection, KIR3DL1/S1 and KIR2DL2/L3 allotypes and cognate HLA ligands of H1N1/09 intensive-care unit (ICU) patients were determined.

Methodology and Findings

KIR3DL1/S1, KIR2DL2/L3, and HLA-B and -C of 51 H1N1/09 ICU patients and 105 H1N1-negative subjects (St. Theresa Point, Manitoba) were characterized. We detected an increase of 3DL1 ligand-negative pairs (3DL1/S1+ Bw6+ Bw4), and a lack of 2DL1 HLA-C2 ligands, among ICU patients. They were also significantly enriched for 2DL2/L3 ligand-positive pairs (P<0.001, Pc<0.001; odds ratio: 6.3158, 95% CI: 2.481–16.078). Relative to St. Theresa aboriginals (STh) and Venezuelan Amerindians (VA), allotypes enriched among aboriginal ICU patients (Ab) were: 2DL3 (Ab>VA, P = 0.024, Pc = 0.047; odds ratio: 2.563, 95% CI: 1.109–5.923), 3DL1*00101 (Ab>VA, P<0.001, Pc<0.001), 3DL1*01502 (Ab>STh, P = 0.034, Pc = 0.268), and 3DL1*029 (Ab>STh, P = 0.039, Pc = 0.301). Aboriginal patients ligand-positive for 3DL1/S1 and 2DL1 had the lowest probability of death (Rd = 28%), compared to patients who were 3DL1/S1 ligand-negative (Rd = 52%) or carried 3DL1*029 (Rd = 52%). Relative to Caucasoids (CA), two allotypes were enriched among non-aboriginal ICU patients (NAb): 3DL1*00401 (NAb>CA, P<0.001, Pc<0.001) and 3DL1*01502 (CA; P = 0.012, Pc = 0.156). Non-aboriginal patients with ligands for all three KIRs (3DL1/S1, 2DL2/L3, and 2DL1) had the lowest probability of death (Rd = 36%), compared to subjects with 3DL1*01502 (Rd = 48%) and/or 3DL1*00401 (Rd = 58%).

Conclusions

Specific KIR3DL1/S1 allotypes, 3DL1/S1 and 2DL1 ligand-negative pairs, and 2DL2/L3 ligand-positive pairs were enriched among ICU patients. This suggests a possible association with NK cell dysfunction in patients with overactive immune responses to H1N1/09, leading to severe disease.

18.
Chang ZY  Lu DW  Yeh MK  Chiang CH 《PloS one》2012,7(3):e33983

Purpose

The aim of the study was to develop a high-content flow cytometric method for assessing the viability and damage of small, medium, and large retinal ganglion cells (RGCs) in an N-methyl-D-aspartic acid (NMDA) injury model.

Methods/Results

Retinal toxicity was induced in rats by intravitreal injection of NMDA, and RGCs were retrogradely labeled with Fluoro-Gold (FG). Seven days post-NMDA injection, flatmount and flow cytometric methods were used to evaluate RGCs. In addition, the RGC area diameter (D(a)) obtained from retinal flatmount imaging was plotted against the apparent volume diameter (D(v)) obtained from flow cytometry at the same cumulative cell-number percentile (Q, counted sequentially from small to large RGCs) to establish their relationship for accurately determining RGC sizes. Good correlation (r = 0.9718) was found between D(a) and apparent D(v). Both flatmount and flow cytometric analyses showed that 40 mM NMDA significantly reduced the numbers of small and medium RGCs but not large RGCs. Additionally, flow cytometry showed that the geometric means of FG and Thy-1 intensities decreased to 90.96±2.24% (P<0.05) and 91.78±1.89% (P>0.05) of normal values for small RGCs, 69.62±2.11% (P<0.01) and 69.07±2.98% (P<0.01) for medium RGCs, and 69.68±6.48% (P<0.05) and 69.91±6.23% (P<0.05) for large RGCs.
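The D(a)–D(v) calibration above pairs two independently measured size distributions at matching cumulative percentiles (a quantile–quantile matching), then checks linearity by correlation. A minimal sketch under assumed synthetic diameter distributions (the values below are illustrative, not the study's measurements):

```python
import numpy as np

def match_quantiles(d_area, d_vol, qs):
    """Pair area-diameter and volume-diameter distributions at the
    same cumulative percentiles Q (quantile-quantile matching) and
    report their Pearson correlation."""
    da = np.quantile(d_area, qs)
    dv = np.quantile(d_vol, qs)
    r = np.corrcoef(da, dv)[0, 1]  # Pearson r of the paired quantiles
    return da, dv, r

# Illustrative synthetic diameters (micrometres); not study data
rng = np.random.default_rng(0)
d_area = rng.normal(12, 3.0, 500).clip(5)   # flatmount area diameters
d_vol = rng.normal(14, 3.5, 500).clip(5)    # flow-cytometry volume diameters
qs = np.linspace(0.05, 0.95, 19)
da, dv, r = match_quantiles(d_area, d_vol, qs)
```

Because quantiles of both samples increase monotonically with Q, near-normal distributions give an almost linear relationship, mirroring the high correlation (r = 0.9718) reported above.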

Conclusion

The established flow cytometric method provides high-content analysis for differential evaluation of RGC number and status and should be useful for the evaluation of various models of optic nerve injury and the effects of potential neuroprotective agents.

19.

Background

Methamphetamine is one of the most toxic drugs of abuse, which may reflect its distribution and accumulation in the body. However, no studies have measured methamphetamine's organ distribution in the human body.

Methods

Positron emission tomography (PET) was used in conjunction with [11C]d-methamphetamine to measure its whole-body distribution and bioavailability, assessed by peak uptake (% dose/cc), rate of clearance (time for activity to fall to 50% of peak), and accumulation (area under the time-activity curve) in healthy participants (9 Caucasians and 10 African Americans).

Results

Methamphetamine distributed through most organs. Whole-organ uptake was highest in the liver (23% of dose; weight ∼1677 g) and lungs (22%; ∼1246 g), and intermediate in the brain (10%; ∼1600 g). The kidneys also showed high uptake on a per-cc basis (7%; weight 305 g). Methamphetamine's clearance was fastest in the heart and lungs (7–16 minutes), slowest in the brain, liver, and stomach (>75 minutes), and intermediate in the kidneys, spleen, and pancreas (22–50 minutes). Lung accumulation of [11C]d-methamphetamine was 30% higher in African Americans than in Caucasians (p<0.05) but did not differ in other organs.
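The three outcome measures above (peak uptake, time to 50% peak clearance, and accumulation as area under the curve) can all be read off a single organ time-activity curve. A minimal sketch using an assumed, illustrative curve, not the study's data:

```python
import numpy as np

def tac_metrics(t, activity):
    """Summarise a PET time-activity curve: peak uptake, time at which
    activity first falls to 50% of peak after the peak (clearance),
    and total accumulation (trapezoidal area under the curve)."""
    peak_idx = int(np.argmax(activity))
    peak = float(activity[peak_idx])
    # Trapezoidal AUC over the whole sampled interval
    auc = float(np.sum((activity[1:] + activity[:-1]) / 2.0 * np.diff(t)))
    # First post-peak sample at or below 50% of peak
    post = np.where(activity[peak_idx:] <= peak / 2.0)[0]
    t_half = float(t[peak_idx + post[0]]) if post.size else None
    return peak, t_half, auc

# Illustrative curve (% dose/cc vs minutes); not study data
t = np.array([0, 2, 5, 10, 20, 40, 60, 90], dtype=float)
a = np.array([0, 8, 10, 9, 6, 4, 3, 2], dtype=float)
peak, t_half, auc = tac_metrics(t, a)
```

For this synthetic curve the peak is 10 %dose/cc at 5 minutes, and activity first drops to half-peak at the 40-minute sample.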

Conclusions

The high accumulation of methamphetamine, a potent stimulant drug, in most body organs is likely to contribute to the medical complications associated with methamphetamine abuse. In particular, we speculate that methamphetamine's high pulmonary uptake could render the lungs vulnerable to infections (tuberculosis) and pathology (pulmonary hypertension). Our preliminary finding of higher lung accumulation of methamphetamine in African Americans than in Caucasians merits further investigation and raises the question of whether it could contribute to the infrequent use of methamphetamine among African Americans.

20.

Purpose

To assess whether playing tennis at a prepubertal age elicits hypertrophy of the dominant-arm muscles.

Methods

The volume of the muscles of both arms was determined using magnetic resonance imaging (MRI) in 7 male prepubertal tennis players (TP) and 7 non-active control subjects (CG) (mean age 11.0±0.8 years, Tanner stages 1–2).

Results

TP had 13% greater total muscle volume in the dominant than in the contralateral arm. The magnitude of inter-arm asymmetry was greater in TP than in CG (13% vs 3%, P<0.001). The dominant arm of TP had 16% greater muscle volume than the dominant arm of CG (P<0.01), whilst the non-dominant arms had similar total muscle volumes in both groups (P = 0.25), after accounting for height as a covariate. In TP, the dominant deltoid (11%), forearm supinator (55%), and forearm flexors (21%) and extensors (25%) were hypertrophied compared with the contralateral arm (P<0.05). In CG, only the dominant supinator muscle was bigger than its contralateral counterpart (63%, P<0.05).
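The inter-arm asymmetry percentages reported above express the dominant-arm excess relative to the non-dominant arm. A minimal sketch of that arithmetic, using assumed illustrative volumes rather than the study's measurements:

```python
def asymmetry_pct(dominant_vol, nondominant_vol):
    """Inter-arm asymmetry: dominant-arm muscle-volume excess as a
    percentage of the non-dominant arm's volume."""
    return (dominant_vol - nondominant_vol) / nondominant_vol * 100.0

# Illustrative volumes (cm^3); not the study's measurements.
# A 1130 vs 1000 cm^3 pair corresponds to the +13% asymmetry
# magnitude reported for the tennis players.
tp_asym = asymmetry_pct(1130.0, 1000.0)
cg_asym = asymmetry_pct(1030.0, 1000.0)
```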

Conclusions

Tennis at a prepubertal age is associated with marked hypertrophy of the dominant arm, leading to a level of asymmetry (+13%) much greater than that observed in non-active controls (+3%). Therefore, tennis participation at a prepubertal age is associated with increased muscle volume in the dominant compared with the non-dominant arm, likely due to selective hypertrophy of the loaded muscles.
