Similar Documents
1.

Background

In malaria endemic countries, children who have experienced an episode of severe anaemia are at increased risk of a recurrence of anaemia. There is a need to find ways of protecting these at risk children from malaria and chemoprevention offers a potential way of achieving this objective.

Methods

During the 2003 and 2004 malaria transmission seasons, 1200 Gambian children with moderate or severe anaemia (Hb concentration <7 g/dL) were randomised, in a double-blind trial, to receive either monthly sulfadoxine-pyrimethamine (SP) or placebo until the end of the malaria transmission season in which they were enrolled. All study subjects were treated with oral iron for 28 days, and morbidity was monitored through surveillance at health centres. The primary endpoint was the proportion of children with moderate or severe anaemia at the end of the transmission season. Secondary endpoints included the incidence of clinical episodes of malaria during the surveillance period, outpatient attendances, the prevalence of parasitaemia and splenomegaly, nutritional status at the end of the malaria transmission season, and compliance with the treatment regimen.

Results

The proportions of children with a Hb concentration of <7 g/dL at the end of the malaria transmission season were similar in the two study groups: 14/464 (3.0%) in children who received at least one dose of SP and 16/471 (3.4%) in those who received placebo (prevalence ratio 0.89, 95% CI 0.44–1.8, P = 0.742). The protective efficacy of SP against episodes of clinical malaria was 53% (95% CI 37%, 65%). Treatment with SP was safe and well tolerated; no serious adverse events related to SP administration were observed. Mortality following discharge from hospital was low in both arms (6 deaths in the SP group and 9 in the placebo group).
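The quoted prevalence ratio and its interval can be reproduced from the raw counts with a standard log-scale Wald calculation (a sketch for illustration only; the abstract does not state which method the trial statisticians actually used):

```python
import math

def prevalence_ratio(events1, n1, events2, n2, z=1.96):
    """Prevalence ratio of two proportions with a log-scale Wald 95% CI."""
    p1, p2 = events1 / n1, events2 / n2
    pr = p1 / p2
    # Standard error of log(PR) for two independent binomial proportions
    se_log = math.sqrt((1 - p1) / events1 + (1 - p2) / events2)
    return pr, pr * math.exp(-z * se_log), pr * math.exp(z * se_log)

# Anaemia at end of season: 14/464 (SP arm) vs 16/471 (placebo arm)
pr, lo, hi = prevalence_ratio(14, 464, 16, 471)
print(round(pr, 2), round(lo, 2), round(hi, 1))  # → 0.89 0.44 1.8
```

The recomputed values match the reported prevalence ratio 0.89 (0.44, 1.8).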

Conclusions

Intermittent treatment with SP did not reduce the proportion of previously anaemic children with moderate or severe anaemia at the end of the malaria season, although it prevented malaria. The combination of appropriate antimalarial treatment plus one month of iron supplementation and good access to healthcare during follow-up proved effective in restoring haemoglobin to an acceptable level in the Gambian setting.

Trial Registration

ClinicalTrials.gov NCT00131716

2.

Background

Iron supplementation is employed to treat post-malarial anaemia in settings where iron deficiency is common. Malaria induces an intense inflammatory reaction that stalls the recycling of iron from haemolysed red blood cells by reticulo-endothelial macrophages and inhibits oral iron absorption, but the magnitude and duration of these effects are unclear.

Methodology/Principal Findings

We examined the red blood cell incorporation of orally administered stable isotopes of iron and compared incorporation between age-matched children aged 18 to 36 months with either post-malaria anaemia (n = 37) or presumed iron deficiency anaemia alone (n = 36). All children were supplemented for 30 days with 2 mg/kg elemental iron as liquid iron sulphate and were administered 57Fe and 58Fe on days 1 and 15 of supplementation, respectively. 57Fe and 58Fe incorporation were significantly reduced in the malaria group compared with the non-malaria group (8% vs. 28%, p<0.001, and 14% vs. 26%, p = 0.045). There was a significantly greater haemoglobin response in the malaria group at both day 15 (p = 0.001) and day 30 (p<0.001), with regression-estimated greater increases in haemoglobin of 7.2 g/l (s.e. 2.0) and 10.1 g/l (s.e. 2.5), respectively.

Conclusion/Significance

Post-malaria anaemia is associated with better haemoglobin recovery despite a significant depressant effect on oral iron incorporation, which may indicate that early erythropoietic iron need is met by iron recycling rather than oral iron. Supplemental iron administration is of questionable utility within 2 weeks of clinical malaria in children with mild or moderate anaemia.

3.

Background

Chlorproguanil-dapsone (CD, Lapdap), developed as a low-cost antimalarial, was withdrawn in 2008 after concerns about safety in G6PD-deficient patients. This trial was conducted in 2004 to evaluate the safety and effectiveness of CD compared with artemether-lumefantrine (AL) under conditions of routine use in G6PD-normal and G6PD-deficient patients with uncomplicated malaria in The Gambia. We also examined the effects of a common genetic variant that affects chlorproguanil metabolism on the risk of treatment failure.

Methods

1238 children aged 6 months to 10 years with uncomplicated malaria were randomized to receive CD or AL and followed for 28 days. The first dose was supervised; subsequent doses were given unsupervised at home. G6PD genotype was determined to assess the interaction between treatment and G6PD status in their effects on anaemia. The main endpoints were clinical treatment failure by day 28, incidence of severe anaemia (Hb <5 g/dL), and haemoglobin concentration on day 3.

Findings

One third of patients treated with AL, and 6% of patients treated with CD, did not complete their course of medication. 18% (109/595) of children treated with CD and 6.1% (36/587) of those treated with AL required rescue medication within 4 weeks (risk difference 12%, 95% CI 8.9%–16%). 23 children developed severe anaemia: 17 (2.9%) treated with CD and 6 (1.0%) with AL (risk difference 1.8%, 95% CI 0.3%–3.4%, P = 0.02). Haemoglobin concentration on day 3 was lower among children treated with CD than with AL (difference 0.43 g/dL, 95% CI 0.24 to 0.62), and within the CD group it was lower among children with higher parasite density at enrolment. Only 17 of the 1069 children who were typed were G6PD A- deficient; of these, 2/9 treated with CD and 1/8 treated with AL developed severe anaemia, and 5/9 treated with CD had a fall of 2 g/dL or more in haemoglobin concentration by day 3.
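The rescue-medication risk difference can be checked against the raw counts with a simple Wald interval (a sketch under that assumption; the published 8.9%–16% limits were presumably computed with a slightly different interval method, so they differ marginally from the Wald limits below):

```python
import math

def risk_difference(events1, n1, events2, n2, z=1.96):
    """Risk difference between two proportions with a Wald 95% CI."""
    p1, p2 = events1 / n1, events2 / n2
    rd = p1 - p2
    # Pooled standard error of the difference of two independent proportions
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return rd, rd - z * se, rd + z * se

# Rescue medication within 4 weeks: 109/595 (CD) vs 36/587 (AL)
rd, lo, hi = risk_difference(109, 595, 36, 587)
print(f"{rd:.0%} ({lo:.1%} to {hi:.1%})")  # → 12% (8.5% to 15.9%)
```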

Interpretation

AL was well tolerated and highly effective when given under operational conditions, despite poor adherence to the six-dose regimen. There were more cases of severe malaria and anaemia after CD treatment, although G6PD deficiency was uncommon.

Trial Registration

Clinicaltrials.gov NCT00118794

4.

Background

The high prevalence of anaemia, and the increased morbidity and mortality associated with anaemia during AIDS, has been well described, yet there is little information about anaemia and changes in haemoglobin levels during acute and early HIV-1 infection.

Methods

HIV-negative women (n = 245) were enrolled into an observational cohort as part of the Centre for the AIDS Programme of Research in South Africa (CAPRISA) Acute Infection Study. Acute infection was diagnosed following a positive HIV RNA PCR in the absence of antibodies, or detection of HIV-1 antibodies within 3 months of a previously negative antibody test. Haematologic parameters were assessed before infection and at regular intervals during the first twelve months of HIV infection.

Results

Fifty-seven participants with acute HIV infection were identified at a median of 14.5 days post-infection (range 10–81) and were enrolled in the CAPRISA Acute Infection cohort at a median of 41 days post-infection (range 15–104). Mean haemoglobin prior to HIV-1 infection was 12.7 g/dL, with a mean decline of 0.46 g/dL following infection. The prevalence of anaemia increased from 25.0% prior to HIV-1 infection to 52.6% at 3 months post-infection, 61.1% at 6 months post-infection, and 51.4% at 12 months post-infection.

Conclusions

Haematologic derangements and anaemia with a trend towards iron deficiency are common with acute HIV-1 subtype C infection in this small cohort. The negative impact of anaemia concurrent with established HIV infection upon morbidity and mortality has been well documented, but the prognostic potential and long-term effects of anaemia during acute HIV-1 infection remain unknown.

5.

Background

Anaemia is a common clinical finding in HIV-infected women and has been associated with advanced disease. Antiretroviral drugs such as zidovudine (ZDV), used either for prevention of mother-to-child transmission (MTCT) of HIV or in combination with other antiretrovirals, have been implicated in the development or increased severity of anaemia. We report the prevalence, type, severity and incidence of anaemia in a cohort of HIV-infected women who initiated antiretroviral prophylaxis or treatment during pregnancy.

Methods and Materials

This is a retrospective cohort analysis of data from 408 HIV-infected pregnant women who participated in a breastfeeding intervention study (HPTN 046 Study, ClinicalTrials.gov NCT00074412) in South Africa. Women initiated zidovudine prophylaxis for PMTCT or triple antiretroviral treatment in pregnancy according to the standard of care. Laboratory and clinical data from pregnancy, <72 hours post-delivery and 2 weeks post-delivery were extracted from the main database and analysed.

Results

The mean Hb concentration was 10.6 g/dL at baseline; 262/408 (64.2%) women were diagnosed with anaemia (Hb <11 g/dL) in pregnancy, 48/146 (32.9%) subsequently developed anaemia intrapartum or postpartum, and 89/310 (28.7%) of all cases of anaemia remained unresolved by 2 weeks post-delivery. In univariate analysis, CD4 count and gravidity were significant risk factors for anaemia in pregnancy: RR 1.41 (95% CI 1.23–1.61, p<0.001) and RR 1.10 (95% CI 1.01–1.18, p = 0.02), respectively. After adjusting for antiretroviral regimen, age and gravidity in a multivariable analysis, only the CD4 count remained a significant risk factor for anaemia in pregnancy and post-delivery.

Conclusion

Anaemia was most common among women in the advanced stage of HIV infection (CD4 <200 cells/mm3). There was no evidence of an association between ZDV or triple ARVs and anaemia.

6.

Objective

C-reactive protein (CRP) levels >3 mg/L and >10 mg/L are associated with high and very high cardiovascular risk, respectively, in the general population. Because rheumatoid arthritis (RA) confers excess cardiovascular mortality, we determined the prevalence of these CRP levels among RA patients stratified by RA disease activity.

Methods

We evaluated physician and patient global assessments of disease activity, tender and swollen 28 joint counts, erythrocyte sedimentation rate (ESR), and CRP measured in a single clinic visit for 151 RA patients. Disease activity was calculated using the Clinical Disease Activity Index (CDAI) and the Disease Activity Score 28 Joints (DAS28-ESR and DAS28-CRP).

Results

Median CRP level was 5.3 mg/L; 68% of patients had CRP >3 mg/L, and 25% had CRP >10 mg/L. Of those with 0–1 swollen joints (n = 56) or 0–1 tender joints (n = 81), 64% and 67%, respectively, had CRP >3 mg/L, and 23% and 20%, respectively, had CRP >10 mg/L. Of those in remission or with mildly active disease by CDAI (n = 58), DAS28-ESR (n = 39), or DAS28-CRP (n = 70), 49–66% had CRP >3 mg/L, and 10–14% had CRP >10 mg/L. Of patients with moderate disease activity by CDAI (n = 51), DAS28-ESR (n = 78), or DAS28-CRP (n = 66), 67–73% had CRP >3 mg/L, and 25–33% had CRP >10 mg/L.

Conclusion

Even among RA patients whose disease is judged to be controlled by joint counts or standardized disease activity scores, a substantial proportion have CRP levels that are associated with high or very high risk for future cardiovascular events in the general population.

7.

Background

Previous studies have demonstrated gaps in achievement of low-density lipoprotein-cholesterol (LDL-C) goals among U.S. individuals at high cardiovascular disease risk; however, recent studies in selected populations indicate improvements.

Objective

We sought to define the longitudinal trends in achieving LDL-C goals among high-risk United States adults from 1999–2008.

Methods

We analyzed five sequential population-based cross-sectional National Health and Nutrition Examination Surveys 1999–2008, which included 18,656 participants aged 20–79 years. We calculated rates of LDL-C goal achievement and treatment in the high-risk population.

Results

The prevalence of high-risk individuals increased from 13% to 15.5% (p = 0.046). Achievement of LDL-C <100 mg/dL increased from 24% to 50.4% (p<0.0001) in the high-risk population, with similar findings in subgroups with (27% to 64.8%, p<0.0001) and without (21.8% to 43.7%, p<0.0001) coronary heart disease (CHD). Achievement of LDL-C <70 mg/dL improved from 2.4% to 17% (p<0.0001) in high-risk individuals and in subgroups with (3.4% to 21.4%, p<0.0001) and without (1.7% to 14.9%, p<0.0001) CHD. The proportion with LDL-C ≥130 mg/dL and not on lipid medications decreased from 29.4% to 18% (p = 0.0002), with similar findings in the CHD (25% to 11.9%, p = 0.0013) and non-CHD (35.8% to 20.8%, p<0.0001) subgroups.

Conclusion

The proportions of the U.S. high-risk population achieving LDL-C <100 mg/dL and <70 mg/dL increased over the last decade. With 65% of the CHD subpopulation achieving an LDL-C <100 mg/dL in the most recent survey, U.S. LDL-C goal achievement exceeds previous reports and approximates rates achieved in highly selected patient cohorts.

8.

Background

Intermittent preventive treatment of malaria in children (IPTc) is a promising strategy for malaria control. A study conducted in Mali in 2008 showed that administration of three courses of IPTc with sulphadoxine-pyrimethamine (SP) and amodiaquine (AQ) at monthly intervals reduced clinical malaria, severe malaria and malaria infection by >80% in children under 5 years of age. Here we report the results of a follow-on study undertaken to establish whether children who had received IPTc would be at increased risk of malaria during the subsequent malaria transmission season.

Methods

Morbidity from malaria and the prevalence of malaria parasitaemia and anaemia were measured in children who had previously received IPTc with SP and AQ using similar surveillance methods to those employed during the previous intervention period.

Results

1396 of 1508 children (93%) who had previously received IPTc and 1406 of 1508 children (93%) who had previously received placebo were followed up during the high malaria transmission season of the year following the intervention. Incidence rates of clinical malaria during the post-intervention transmission season (July–November 2009) were 1.87 (95% CI 1.76–1.99) and 1.73 (95% CI 1.62–1.85) episodes per child-year in the previous intervention and placebo groups, respectively; incidence rate ratio (IRR) 1.09 (95% CI 0.99–1.21), P = 0.08. The prevalence of malaria infection was similar in the two groups (7.4% versus 7.5%; prevalence ratio (PR) 0.99, 95% CI 0.73–1.33, P = 0.95). At the end of the post-intervention malaria transmission season, the prevalence of anaemia, defined as a haemoglobin concentration <11 g/dL, was similar in the two groups (56.2% versus 55.6%; PR 1.01, 95% CI 0.91–1.12, P = 0.84).

Conclusion

IPTc with SP+AQ was not associated with an increase in incidence of malaria episodes, prevalence of malaria infection or anaemia in the subsequent malaria transmission season.

Trial Registration

ClinicalTrials.gov NCT00738946

9.

Objective

Patients' chances of cure and palliation for a variety of malignancies may be greatly affected by the care provided by the treating hospital. We sought to determine the effect of volume and teaching status on patient outcomes for five gynecologic malignancies: endometrial, cervical, ovarian and vulvar carcinoma and uterine sarcoma.

Methods

The Florida Cancer Data System dataset was queried for all patients undergoing treatment for gynecologic cancers from 1990–2000.

Results

Overall, 48,981 patients with gynecologic malignancies were identified. Endometrial tumors were the most common, representing 43.2% of the entire cohort, followed by ovarian cancer (30.9%), cervical cancer (20.8%), vulvar cancer (4.6%), and uterine sarcoma (0.5%). By univariate analysis, although patients treated at high volume centers (HVC) were significantly younger, they benefited from an improved short-term (30-day and/or 90-day) survival for cervical, ovarian and endometrial cancers. Multivariate analysis (MVA), however, failed to demonstrate significant survival benefit for gynecologic cancer patients treated at teaching facilities (TF) or HVC. Significant prognostic factors at presentation by MVA were age over 65 (HR = 2.6, p<0.01), African-American race (HR = 1.36, p<0.01), and advanced stage (regional HR = 2.08, p<0.01; advanced HR = 3.82, p<0.01, respectively). Surgery and use of chemotherapy were each significantly associated with improved survival.

Conclusion

No difference in patient survival was observed for any gynecologic malignancy based upon treating hospital teaching or volume status. Although instances of improved outcomes may occur, overall, further regionalization would not appear to significantly improve patient survival.

10.

Background

The accurate diagnosis of TB in HIV-infected patients, particularly with advanced immunosuppression, is difficult. Recent studies indicate that a lipoarabinomannan (LAM) assay (Clearview-TB®-ELISA) may have some utility for the diagnosis of TB in HIV-infected patients; however, the precise subgroup that may benefit from this technology requires clarification. The utility of LAM in sputum samples has, hitherto, not been evaluated.

Methods

LAM was measured in sputum and urine samples obtained from 500 consecutively recruited ambulant patients, with suspected TB, from 2 primary care clinics in South Africa. Culture positivity for M. tuberculosis was used as the reference standard for TB diagnosis.

Results

Of 440 evaluable patients, 120/387 (31%) were HIV-infected. Urine-LAM positivity was associated with HIV positivity (p = 0.007), and test sensitivity, although low, was significantly higher in HIV-infected than in HIV-uninfected patients (21% versus 6%; p<0.001), and also in HIV-infected participants with a CD4 count <200 versus >200 cells/mm3 (37% versus 0%; p = 0.003). Urine-LAM remained highly specific in all 3 subgroups (95%–100%). 25% of smear-negative but culture-positive HIV-infected patients with a CD4 count <200 cells/mm3 were positive for urine-LAM. Sputum-LAM had good sensitivity (86%) but poor specificity (15%), likely due to test cross-reactivity with several mouth-residing organisms including actinomycetes and Nocardia species.

Conclusions

These preliminary data indicate that in a high-burden primary care setting the diagnostic usefulness of urine-LAM, as a rule-in test, is limited to a specific patient subgroup, i.e. smear-negative HIV-infected TB patients with a CD4 count <200 cells/mm3, who would otherwise have required further investigation. However, even in this group sensitivity was modest. Future, adequately powered studies in a primary care setting should now specifically target patients with suspected TB who have advanced HIV infection.

11.

Background

Alcoholism is associated with susceptibility to infectious disease, particularly bacterial pneumonia. In the present study we describe the characteristics of alcoholic patients with bacterial meningitis and delineate the differences from findings in non-alcoholic adults with bacterial meningitis.

Methods/Principal Findings

This was a prospective nationwide observational cohort study including patients aged >16 years who had bacterial meningitis confirmed by culture of cerebrospinal fluid (696 episodes of bacterial meningitis occurring in 671 patients). Alcoholism was present in 27 of 686 recorded episodes of bacterial meningitis (4%), and alcoholics were more often male than non-alcoholics (82% vs 48%, P = 0.001). A higher proportion of alcoholics had underlying pneumonia (41% vs 11%, P<0.001). Alcoholics were more likely to have meningitis due to infection with Streptococcus pneumoniae (70% vs 50%, P = 0.01) and Listeria monocytogenes (19% vs 4%, P = 0.005), whereas Neisseria meningitidis was more common in non-alcoholic patients (39% vs 4%, P = 0.01). A larger proportion of alcoholics developed complications during the clinical course (82% vs 62% in non-alcoholics; P = 0.04), often cardiorespiratory failure (52% vs 28%; P = 0.01). Alcoholic patients were at higher risk of unfavourable outcome (67% vs 33%; P<0.001).

Conclusions/Significance

Alcoholic patients are at high risk of complications resulting in high morbidity and mortality. They are especially at risk of cardiorespiratory failure due to underlying pneumonia; therefore, aggressive supportive care may be crucial in the treatment of these patients.

12.

Background

Cryptococcal meningitis (CM) remains a leading cause of death for HIV-infected individuals in sub-Saharan Africa. Improved treatment strategies are needed if individuals are to benefit from the increasing availability of antiretroviral therapy. We investigated the factors associated with mortality in routine care in KwaZulu-Natal, South Africa.

Methodology/Principal Findings

This was a prospective, year-long, single-center, consecutive case series of individuals diagnosed with cryptococcal meningitis. 190 patients were diagnosed with culture-positive cryptococcal meningitis, of whom 186 were included in the study. 52/186 (28.0%) patients died within 14 days of diagnosis and 60/186 (32.3%) had died by day 28. In multivariable Cox regression analysis, focal neurology (aHR 11, 95% CI 3.08–39.3, P<0.001), diastolic blood pressure <60 mmHg (aHR 2.37, 95% CI 1.11–5.04, P = 0.025), concurrent treatment for tuberculosis (aHR 2.11, 95% CI 1.02–4.35, P = 0.044) and use of fluconazole monotherapy (aHR 3.69, 95% CI 1.74–7.85, P<0.001) were associated with increased mortality at 14 and 28 days.

Conclusions

Even in a setting where amphotericin B is available, mortality from cryptococcal meningitis is high, particularly in the immediate period after diagnosis. This highlights the still unmet need not only for earlier diagnosis of HIV and timely access to treatment of opportunistic infections, but also for better treatment strategies for cryptococcal meningitis.

13.

Background

Multiple micronutrient deficiencies are highly prevalent among preschool children and often lead to anemia and growth faltering. Given the limited success of supplementation and health education programs, fortification of foods could be a viable and sustainable option. We report results from a community-based, double-masked, randomized trial among children aged 1–4 years evaluating the effects of micronutrients (especially zinc and iron) delivered through fortified milk on growth, anemia and iron status markers, as part of a four-group study design running two studies simultaneously.

Methods and Findings

Enrolled children (n = 633) were randomly allocated to receive either micronutrient-fortified milk (MN, n = 316) or control milk (Co, n = 317). The MN milk provided an additional 7.8 mg zinc, 9.6 mg iron, 4.2 µg selenium, 0.27 mg copper, 156 µg vitamin A, 40.2 mg vitamin C, and 7.5 mg vitamin E per day (three servings) for one year. Anthropometry was recorded at baseline, mid-study and end-study. Hematological parameters were estimated at baseline and end-study. The two groups were comparable at baseline. Compliance was over 85% and did not vary between groups. Compared to children consuming Co milk, children consuming MN milk showed significant improvements in weight gain (difference of means: 0.21 kg/year; 95% confidence interval [CI] 0.12 to 0.31, p<0.001) and height gain (difference of means: 0.51 cm/year; 95% CI 0.27 to 0.75, p<0.001). Mean hemoglobin (Hb) (difference of 13.6 g/L; 95% CI 11.1 to 16.0, p<0.001) and serum ferritin levels (difference of 7.9 µg/L; 95% CI 5.4 to 10.5, p<0.001) also improved. Children in the MN group had an 88% lower risk of iron deficiency anemia (odds ratio = 0.12, 95% CI 0.08 to 0.20, p<0.001).

Conclusions/Significance

Milk provides an acceptable and effective vehicle for delivery of specific micronutrients, especially zinc and iron. The micronutrient bundle improved growth and iron status and reduced anemia in children aged 1–4 years.

Trial Registration

ClinicalTrials.gov NCT00255385

14.

Background

Epidemiologic studies suggest that LDL particle concentration (LDL-P) may remain elevated at guideline-recommended LDL cholesterol goals, representing a source of residual risk. We examined seven separate lipid parameters for achieving the LDL-P goal of <1000 nmol/L for very-high-risk secondary prevention: total cholesterol to HDL cholesterol ratio (TC/HDL) <3; a composite of ATP-III very-high-risk targets (LDL-C <70 mg/dL, non-HDL-C <100 mg/dL and TG <150 mg/dL); a composite of standard secondary risk targets (LDL-C <100, non-HDL-C <130, TG <150); LDL phenotype; HDL-C ≥40; TG <150; and TG/HDL-C <3.

Methods

We measured ApoB, ApoAI, ultracentrifugation lipoprotein cholesterol and NMR lipoprotein particle concentration in 148 unselected primary and secondary prevention patients.

Results

TC/HDL-C <3 effectively discriminated subjects by LDL-P goal (F = 84.1, p<10^−6). The ATP-III very-high-risk composite target (LDL-C <70, non-HDL-C <100, TG <150) was also effective (F = 42.8, p<10^−5). The standard secondary prevention composite (LDL-C <100, non-HDL-C <130, TG <150) was likewise effective (F = 42.0, p<10^−5), but yielded higher LDL-P than the very-high-risk composite, with the upper 95% confidence interval of LDL-P below 1000 nmol/L. The TG <150 and TG/HDL-C <3 cutpoints both significantly discriminated subjects, but the upper 95% confidence intervals of LDL-P fell above the 1000 nmol/L goal (F = 15.8, p = 0.0001 and F = 9.7, p = 0.002, respectively). LDL density phenotype neared significance (F = 2.85, p = 0.094), and the HDL-C cutpoint of 40 mg/dL did not discriminate (F = 0.53, p = 0.47) alone or add discriminatory power to the ATP-III targets.

Conclusions

A simple composite of ATP-III very high risk lipoprotein cholesterol based treatment targets or TC/HDL-C ratio <3 most effectively identified subjects meeting the secondary prevention target level of LDL-P<1000 nmol/L, providing a potential alternative to advanced lipid testing in many clinical circumstances.

15.

Background

Early recognition and prompt and appropriate antibiotic treatment can significantly reduce mortality from serious bacterial infections (SBI). The aim of this study was to evaluate the utility of five markers of infection: C-reactive protein (CRP), procalcitonin (PCT), soluble triggering receptor expressed on myeloid cells-1 (sTREM-1), CD163 and high mobility group box-1 (HMGB1), as markers of SBI in severely ill Malawian children.

Methodology and Principal Findings

Children presenting with signs of meningitis (n = 282) or pneumonia (n = 95) were prospectively recruited. Plasma samples were taken on admission for CRP, PCT, sTREM-1, CD163 and HMGB1, and the performance characteristics of each test for diagnosing SBI and predicting mortality were determined. Of 377 children, 279 (74%) had SBI and 83 (22%) died. Plasma CRP, PCT, CD163 and HMGB1 were higher in HIV-infected children than in HIV-uninfected children (p<0.01). In HIV-infected children, CRP and PCT were higher in children with SBI than in those with no detectable bacterial infection (p<0.0005), and PCT and CD163 were higher in non-survivors (p = 0.001 and p = 0.05, respectively). In HIV-uninfected children, CRP and PCT were also higher in children with SBI than in those with no detectable bacterial infection (p<0.0005), and CD163 was higher in non-survivors (p = 0.05). The best predictors of SBI were CRP and PCT, with areas under the curve (AUCs) of 0.81 (95% CI 0.73–0.89) and 0.86 (95% CI 0.79–0.92), respectively. The best marker for predicting death was PCT (AUC 0.61, 95% CI 0.50–0.71).

Conclusions

Admission PCT and CRP are useful markers of invasive bacterial infection in severely ill African children. The study of these markers using rapid tests in a less selected cohort would be important in this setting.

16.

Objectives

Low levels of high-density lipoprotein (HDL) cholesterol are associated with an increased risk of acute myocardial infarction, possibly through impaired endothelial atheroprotection and decreased nitric oxide (NO) bioavailability. Asymmetric dimethylarginine (ADMA) mediates endothelial function by inhibiting nitric oxide synthase activity. In patients with acute myocardial infarction, we investigated the relationship between serum levels of HDL and ADMA.

Approach and Results

Blood samples from 612 consecutive patients hospitalized for acute MI <24 hours after symptom onset were taken on admission. Serum levels of ADMA, its stereoisomer symmetric dimethylarginine (SDMA), and L-arginine were determined using high-performance liquid chromatography. Patients with low HDL (<40 mg/dL for men and <50 mg/dL for women) were compared with patients with higher HDL. Most patients (59%) had low HDL levels. Median ADMA levels were markedly higher in the low-HDL group (0.69 vs. 0.50 µmol/L, p<0.001). In contrast, SDMA and L-arginine levels were similar in the two groups (p = 0.120 and p = 0.064). Notably, ADMA, but not SDMA or L-arginine, was inversely correlated with HDL (r = −0.311, p<0.001). In stratified analysis, this relationship was found only for low HDL levels (r = −0.265, p<0.001), not for higher HDL levels (r = −0.077, p = 0.225). By multivariate logistic regression analysis, ADMA level was strongly associated with low HDL levels (OR 6.06, 95% CI 3.48–10.53, p<0.001), beyond traditional confounding factors.

Conclusions

Our large population-based study showed for the first time a strong inverse relationship between HDL and ADMA in myocardial infarction patients, suggesting a functional interaction between HDL and endothelium, beyond metabolic conditions associated with low HDL levels.

17.

Background

In the framework of the monitoring and evaluation of the Nigerien schistosomiasis and soil-transmitted helminth control programme, a follow-up of children took place in eight sentinel sites. The objective of the study was to assess the evolution of Schistosoma haematobium infection and anaemia in schoolchildren after a single administration of praziquantel (PZQ) and albendazole.

Methods/Principal Findings

Schoolchildren aged 7, 8, and 11 years were examined before treatment and followed up at one year post-treatment; assessments included an interview, urine examination, ultrasound examination of the urinary tract, and measurement of haemoglobin. Before treatment, the overall prevalence of S. haematobium infection was 75.4% among the 1,642 enrolled children, and 21.8% of children excreted more than 50 eggs/10 ml of urine. Prevalence increased with age. The overall prevalence of anaemia (haemoglobin <11.5 g/dl) was 61.6%, decreasing significantly with increasing age. The mean haemoglobin concentration was 11 g/dl. In bivariate analysis, anaemia was significantly more frequent in children infected with S. haematobium, although it was not correlated with the intensity of infection. Anaemia was also associated with micro-haematuria and with kidney distension. In a sub-sample of 636 children tested for P. falciparum infection, anaemia was significantly more frequent in malaria-infected children. In multivariate analysis, significant predictors of anaemia were P. falciparum infection, kidney distension, and the village. One year after a single-dose praziquantel treatment (administered using the WHO PZQ dose pole) co-administered with albendazole (400 mg single dose) for de-worming, the prevalence of S. haematobium infection was 38%, while the prevalence of anaemia fell to 50.4%. The mean haemoglobin concentration showed a statistically significant increase of 0.39 g/dl, reaching 11.4 g/dl. Anaemia was no longer associated with S. haematobium or P. falciparum infection, haematuria, or ultrasound abnormalities of the urinary tract.

Conclusions

The high prevalence of anaemia in Nigerien children is clearly a result of many factors and not of schistosomiasis alone. Nevertheless, treatment of schistosomiasis and de-worming were followed by a partial, but significant, reduction of anaemia in schoolchildren, not explainable by any other obvious intervention.

18.

Objective

To assess the prognostic and diagnostic value of whole blood impedance aggregometry in patients with sepsis and SIRS and to compare with whole blood parameters (platelet count, haemoglobin, haematocrit and white cell count).

Methods

We performed an observational, prospective study in the acute setting. Platelet function was determined using whole blood impedance aggregometry (Multiplate) on admission to the Emergency Department or Intensive Care Unit and at 6 and 24 hours post admission. Platelet count, haemoglobin, haematocrit and white cell count were also determined.

Results

106 adult patients who met SIRS and sepsis criteria were included. Platelet aggregation was significantly reduced in patients with severe sepsis/septic shock compared with SIRS/uncomplicated sepsis (ADP: 90.7±37.6 vs 61.4±40.6, p<0.001; arachidonic acid: 99.9±48.3 vs 66.3±50.2, p = 0.001; collagen: 102.6±33.0 vs 79.1±38.8, p = 0.001; mean ± SD). Furthermore, platelet aggregation was significantly reduced in the 28-day mortality group compared with the survival group (arachidonic acid: 58.8±47.7 vs 91.1±50.9, p<0.05; collagen: 36.6±36.6 vs 98.0±35.1, p = 0.001; mean ± SD). However, haemoglobin, haematocrit and platelet count were more effective at distinguishing between subgroups and were equally effective indicators of prognosis. Significant positive correlations were observed between whole blood impedance aggregometry and platelet count (Pearson correlation: ADP 0.588, p<0.0001; arachidonic acid 0.611, p<0.0001; collagen 0.599, p<0.0001).
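The Pearson correlations reported above relate aggregometry readings to platelet count. As a hedged sketch of that calculation (the numbers below are invented for illustration and are not the study's measurements):

```python
# Minimal Pearson correlation, as used to relate whole blood impedance
# aggregometry readings to platelet count. Illustrative data only.
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

agg = [30, 45, 60, 75, 90]        # hypothetical aggregation readings (AU)
plt_count = [120, 180, 240, 310, 350]  # hypothetical platelet counts (x10^9/L)
print(pearson_r(agg, plt_count))  # strongly positive, close to 1
```

A coefficient near the study's 0.59–0.61 range would indicate a moderate positive association, consistent with lower aggregation partly reflecting lower platelet counts.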

Conclusions

Reduced platelet aggregometry responses were not only significantly associated with morbidity and mortality in sepsis and SIRS patients, but also differed between the pathological groups. Whole blood aggregometry correlated significantly with platelet count; however, after adjusting for the different groups investigated, the effect of platelet count appeared to be non-significant.

19.

Background

C-reactive protein (CRP) and erythrocyte sedimentation rate (ESR) have been shown to be useful for diagnosis of prosthetic hip and knee infection. Little information is available on CRP and ESR in patients undergoing revision or resection of shoulder arthroplasties or spine implants.

Methods/Results

We analyzed preoperative CRP and ESR in 636 subjects who underwent knee (n = 297), hip (n = 221) or shoulder (n = 64) arthroplasty, or spine implant (n = 54) removal. A standardized definition of orthopedic implant-associated infection was applied. Receiver operating characteristic (ROC) curve analysis was used to determine ideal cutoff values for differentiating infected from non-infected cases. ESR was significantly different between subjects with aseptic failure and those with infection of knee (median 11 and 53.5 mm/h, respectively, p<0.0001) and hip (median 11 and 30 mm/h, respectively, p<0.0001) arthroplasties and spine implants (median 10 and 48.5 mm/h, respectively, p = 0.0033), but not shoulder arthroplasties (median 10 and 9 mm/h, respectively, p = 0.9883). Optimized ESR cutoffs for knee, hip and shoulder arthroplasties and spine implants were 19, 13, 26, and 45 mm/h, respectively. Using these cutoffs, sensitivity and specificity to detect infection were 89 and 74% for knee, 82 and 60% for hip, and 32 and 93% for shoulder arthroplasties, and 57 and 90% for spine implants. CRP was significantly different between subjects with aseptic failure and those with infection of knee (median 4 and 51 mg/l, respectively, p<0.0001), hip (median 3 and 18 mg/l, respectively, p<0.0001), and shoulder (median 3 and 10 mg/l, respectively, p = 0.01) arthroplasties, and spine implants (median 3 and 20 mg/l, respectively, p = 0.0011). Optimized CRP cutoffs for knee, hip, and shoulder arthroplasties, and spine implants were 14.5, 10.3, 7, and 4.6 mg/l, respectively. Using these cutoffs, sensitivity and specificity to detect infection were 79 and 88% for knee, 74 and 79% for hip, and 63 and 73% for shoulder arthroplasties, and 79 and 68% for spine implants.
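One common way to pick an "optimized" cutoff from a ROC analysis, as the abstract describes, is to scan candidate thresholds and maximise Youden's J (sensitivity + specificity − 1). The sketch below illustrates that idea only; the CRP values and the exact optimisation criterion the authors used are assumptions, not taken from the study.

```python
# Hedged sketch of ROC-style cutoff selection via Youden's J.
# All marker values below are hypothetical, not study data.
def best_cutoff(infected, aseptic):
    """Scan observed values as thresholds; return (cutoff, J, sens, spec)."""
    best = None
    for c in sorted(set(infected + aseptic)):
        sens = sum(v >= c for v in infected) / len(infected)   # true positive rate
        spec = sum(v < c for v in aseptic) / len(aseptic)      # true negative rate
        j = sens + spec - 1
        if best is None or j > best[1]:
            best = (c, j, sens, spec)
    return best

infected = [12, 20, 35, 51, 60]  # hypothetical CRP (mg/l), infected revisions
aseptic = [2, 3, 4, 6, 14]       # hypothetical CRP (mg/l), aseptic failures
cutoff, j, sens, spec = best_cutoff(infected, aseptic)
print(cutoff, sens, spec)
```

With well-separated groups (as for knee arthroplasties) this yields a cutoff with both high sensitivity and specificity; with overlapping distributions (as for shoulder arthroplasties) no threshold performs well, which is what the low shoulder sensitivity reflects.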

Conclusion

CRP and ESR have poor sensitivity for the diagnosis of shoulder implant infection. A CRP of 4.6 mg/l had a sensitivity of 79% and a specificity of 68% for detecting infection of spine implants.

20.