Similar Documents
20 similar documents were retrieved.
1.

Introduction

Palawan, where health care facilities are still limited, is one of the most malaria-endemic provinces in the Philippines. Since 1999, microscopists (community health workers) have been trained in malaria diagnosis, improving the feasibility of early diagnosis and treatment throughout the province. To accelerate universal access of malaria patients to diagnostic testing in Palawan, positive health-seeking behavior should be encouraged when malaria infection is suspected.

Methods

In this cross-sectional study, structured interviews were carried out from January to February 2012 with residents (N = 218) of 20 remote malaria-endemic villages throughout Palawan who had a history of suspected malaria. Structural equation modeling (SEM) was conducted to determine factors associated with appropriate treatment, including: (1) socio-demographic characteristics; (2) proximity to a health facility; (3) health-seeking behavior; (4) knowledge of malaria; and (5) participation in community awareness-raising activities.

Results

Three factors independently associated with appropriate treatment were identified by SEM (CMIN = 10.5, df = 11, CFI = 1.000, RMSEA = .000): “living near a microscopist” (p < 0.001), “not living near a private pharmacy” (p < 0.01), and “having severe symptoms” (p < 0.01). “Severe symptoms” were positively correlated with greater “knowledge of malaria symptoms” (p < 0.001). This knowledge was significantly increased by attending “community awareness-raising activities by microscopists” (p < 0.001).
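
A path model of this kind can be sketched with general-purpose SEM software. The snippet below is a minimal illustration in Python assuming the third-party semopy package and hypothetical variable names (near_microscopist, severe_symptoms, knowledge, attended_activities); it only mirrors the structure described above and is not the authors' code.

```python
# Hedged SEM sketch (assumes the semopy package; variable names are hypothetical placeholders).
import pandas as pd
import semopy

model_desc = """
knowledge ~ attended_activities
severe_symptoms ~ knowledge
appropriate_treatment ~ near_microscopist + near_private_pharmacy + severe_symptoms
"""

data = pd.read_csv("survey_responses.csv")  # hypothetical input file

model = semopy.Model(model_desc)
model.fit(data)                      # maximum-likelihood estimation by default
print(model.inspect())               # path coefficients and p-values
print(semopy.calc_stats(model))      # fit indices such as chi-square, df, CFI, RMSEA
```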

Conclusions

In resource-limited settings, microscopists played a significant role in providing appropriate treatment to all participants with severe malaria symptoms. Knowledge of malaria symptoms appeared to make participants more aware of their own symptoms and to support self-triage. Strengthening this recognition and making residents aware of nearby microscopists may be key to accelerating universal access to effective malaria treatment in Palawan.

2.

Background

Malaria and schistosomiasis often overlap in tropical and subtropical countries and impose tremendous disease burdens; however, the extent to which schistosomiasis modifies the risk of febrile malaria remains unclear.

Methods

We evaluated the effect of baseline S. haematobium mono-infection, baseline P. falciparum mono-infection, and co-infection with both parasites on the risk of febrile malaria in a prospective cohort study of 616 children and adults living in Kalifabougou, Mali. Individuals with S. haematobium were treated with praziquantel within 6 weeks of enrollment. Malaria episodes were detected by weekly physical examination and self-referral for 7 months. The primary outcome was time to first or only malaria episode defined as fever (≥37.5°C) and parasitemia (≥2500 asexual parasites/µl). Secondary definitions of malaria using different parasite densities were also explored.

Results

After adjusting for age, anemia status, sickle cell trait, distance from home to the river, residence within a cluster of high S. haematobium transmission, and housing type, baseline P. falciparum mono-infection (n = 254) and co-infection (n = 39) were significantly associated with protection from febrile malaria by Cox regression (hazard ratios 0.71 and 0.44; P = 0.01 and 0.02; reference group: uninfected at baseline). Baseline S. haematobium mono-infection (n = 23) was not associated with malaria protection in the adjusted analysis, possibly owing to a lack of statistical power. Anemia significantly interacted with co-infection (P = 0.009), and the malaria-protective effect of co-infection was strongest in non-anemic individuals. Co-infection also independently predicted lower parasite density at the first febrile malaria episode.
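
A covariate-adjusted time-to-event analysis of this kind can be sketched with the lifelines package. The example below is a hedged illustration with hypothetical column names (time_to_malaria, febrile_malaria, coinfected, and so on), not the study's actual code; covariates are assumed to be numeric or already dummy-coded.

```python
# Hedged sketch of a covariate-adjusted Cox proportional hazards model (lifelines).
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("kalifabougou_cohort.csv")  # hypothetical input

cph = CoxPHFitter()
cph.fit(
    df[["time_to_malaria", "febrile_malaria",       # duration and event indicator
        "pf_mono", "coinfected", "sh_mono",          # baseline infection status (ref: uninfected)
        "age", "anemia", "sickle_trait",
        "dist_to_river", "high_sh_cluster", "housing_type"]],
    duration_col="time_to_malaria",
    event_col="febrile_malaria",
)
cph.print_summary()   # hazard ratios (exp(coef)) with 95% CIs and p-values
```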

Conclusions

Co-infection with S. haematobium and P. falciparum is significantly associated with reduced risk of febrile malaria in long-term asymptomatic carriers of P. falciparum. Future studies are needed to determine whether co-infection induces immunomodulatory mechanisms that protect against febrile malaria or whether genetic, behavioral, or environmental factors not accounted for here explain these findings.

3.

Introduction

Plasmodium vivax is the most prevalent malaria species in the Americas. Brazil accounts for the highest number of malaria cases reported in pregnant women in the region. This study aims to describe the characteristics of pregnant women with malaria in an endemic area of the Brazilian Amazon and the risk factors associated with prematurity and low birth weight (LBW).

Methods/Principal Findings

Between December 2005 and March 2008, 503 pregnant women with malaria who attended a tertiary health centre were enrolled and followed up until delivery, reporting a total of 1,016 malaria episodes. More than half of the study women (54%) were between 20 and 29 years old, and almost a third were adolescents. The prevalence of anaemia at enrolment was 59%. Most women (286/503) reported more than one malaria episode, and most malaria episodes (84.5%, 846/1,001) were due to P. vivax infection. Among women with only P. vivax malaria, the risk of preterm birth and low birth weight decreased in multigravidae (OR 0.36 [95% CI, 0.16–0.82]; p = 0.015 and OR 0.24 [95% CI, 0.10–0.58]; p = 0.001, respectively). The risk of preterm birth also decreased with higher maternal age (OR 0.43 [95% CI, 0.19–0.95]; p = 0.037) and among women who reported higher antenatal care (ANC) attendance (OR 0.32 [95% CI, 0.15–0.70]; p = 0.005).
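
Odds ratios of this form are typically obtained from a logistic regression. The sketch below shows one way to do that with statsmodels, using hypothetical column names (preterm, multigravida, age_20plus, high_anc); it illustrates the kind of model reported, not the authors' code.

```python
# Hedged sketch: logistic regression for preterm birth, reporting odds ratios with 95% CIs.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("vivax_pregnancy_cohort.csv")  # hypothetical input

model = smf.logit("preterm ~ multigravida + age_20plus + high_anc", data=df).fit()

odds_ratios = pd.DataFrame({
    "OR": np.exp(model.params),
    "CI_lower": np.exp(model.conf_int()[0]),
    "CI_upper": np.exp(model.conf_int()[1]),
    "p_value": model.pvalues,
})
print(odds_ratios)
```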

Conclusion

This study shows that P. vivax is the prevailing species among pregnant women with malaria in the region and that clinical vivax malaria may have harmful consequences for the health of mothers and their offspring, particularly in specific groups such as adolescents, primigravidae, and women with lower ANC attendance.

4.

Background

Multiple micronutrient powders (MNP) are recommended by WHO to prevent anemia in young children. However, evidence for their effectiveness in different populations and for improvements in other outcomes (e.g., linear growth and vitamin A deficiency) is scarce.

Methods

A multicentre pragmatic controlled trial was carried out in primary health centres. At study baseline, a control group (CG) of children aged 10 to 14 months (n = 521) was recruited during routine healthcare to assess anemia, anthropometric status, and micronutrient status. At the same time, an intervention group (IG) of infants aged 6 to 8 months (n = 462) was recruited to receive MNP daily with complementary feeding over a period of 60 days. The two groups were compared when the IG infants reached the age of the CG children at enrolment.

Results

In the CG, the prevalence of anemia [hemoglobin (Hb) < 110 g/L], iron deficiency (ID; plasma ferritin < 12 μg/L or TfR > 8.3 mg/L), and vitamin A deficiency (VAD; serum retinol < 0.70 μmol/L) was 23.1%, 37.4%, and 17.4%, respectively. Four to six months after enrolment, when the IG participants had reached the same age as the controls at the time of testing, the prevalence of anemia, ID, and VAD in the IG was 14.3%, 30.1%, and 7.9%, respectively. Adjusting for city, health centre, maternal education, and age, IG children had a lower likelihood of anemia and VAD [prevalence ratio (95% CI) = 0.63 (0.45, 0.88) and 0.45 (0.29, 0.69), respectively] than CG children. The adjusted mean distributions of Hb and length-for-age Z-scores improved by 2 SE in IG compared with CG children.
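
The abstract does not state how the adjusted prevalence ratios were estimated; a common approach for binary outcomes is a modified Poisson regression with robust standard errors, sketched below with hypothetical column names. Treat it as one plausible way to reproduce this kind of estimate, not as the study's method.

```python
# Hedged sketch: adjusted prevalence ratios via modified Poisson regression
# (Poisson GLM with robust standard errors), a common alternative to log-binomial models.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("mnp_trial.csv")  # hypothetical input; 'group' = 1 for MNP, 0 for control

model = smf.glm(
    "anemia ~ group + C(city) + C(health_centre) + maternal_education + age_months",
    data=df,
    family=sm.families.Poisson(),
).fit(cov_type="HC1")              # robust (sandwich) standard errors

pr = np.exp(model.params["group"])
ci = np.exp(model.conf_int().loc["group"])
print(f"Prevalence ratio for anemia (MNP vs control): {pr:.2f} ({ci[0]:.2f}, {ci[1]:.2f})")
```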

Conclusions

MNP effectively reduced anemia and improved growth and micronutrient status among young Brazilian children.

Trial Registration

Registro Brasileiro de Ensaios Clinicos RBR-5ktv6b

5.
6.

Background

In this study we aimed to assess site heterogeneity of early, intermediate, and late mortality prediction in children with severe Plasmodium falciparum malaria in sub-Saharan Africa.

Methods

Medical records of 26,036 children admitted with severe Plasmodium falciparum malaria to six hospital research centers between December 2000 and May 2005 were analyzed. Demographic, clinical, and laboratory data of children who died within 24 hours (early mortality), between 24 and 47 hours (intermediate mortality), or thereafter (48 hours or later, late mortality) were compared between these groups and survivors.

Results

Overall mortality was 4.3% (N = 1,129). Median time to death varied across sites (P < 0.001), ranging from 8 h (3 h–52 h) in Lambaréné to 40 h (10 h–100 h) in Kilifi. Fifty-eight percent of deaths occurred within 24 hours, and the intermediate and late mortality rates were 19% and 23%, respectively. Combining all sites, deep breathing, prostration, and hypoglycemia were independent predictors of early, intermediate, and late mortality (P < 0.01). Site-specific independent predictors of early death included prostration, coma, and deep breathing at all sites (P < 0.001). Site-specific independent predictors of intermediate and late death varied largely between sites (P < 0.001) and comprised between 1 and 7 different clinical and laboratory variables.

Conclusion

Site heterogeneity in mortality prediction is evident in African children with severe malaria. Prediction of early mortality has the highest consistency between sites.

7.

Background

Good house construction may reduce the risk of malaria by limiting the entry of mosquito vectors. We assessed how house design may affect mosquito house entry and malaria risk in Uganda.

Methods

100 households were enrolled in each of three sub-counties: Walukuba, Jinja district; Kihihi, Kanungu district; and Nagongera, Tororo district. CDC light trap collections of mosquitoes were done monthly in all homes. All children aged six months to ten years (n = 878) were followed prospectively for a total of 24 months to measure parasite prevalence every three months and malaria incidence. Homes were classified as modern (cement, wood, or metal walls; tiled or metal roof; and closed eaves) or traditional (all other homes).

Results

A total of 113,618 female Anopheles were collected over 6,765 collection nights. A total of 6,816 routine blood smears were taken, of which 1,061 (15.6%) were positive for malaria parasites. Over 1,569 person-years of follow-up, 2,582 episodes of uncomplicated malaria were diagnosed, giving an overall incidence of 1.6 episodes per person-year at risk. The human biting rate was lower in modern homes than in traditional homes (adjusted incidence rate ratio (IRR) 0.48, 95% confidence interval (CI) 0.37–0.64, p<0.001). The odds of malaria infection were lower in modern homes across all the sub-counties (adjusted odds ratio 0.44, 95% CI 0.30–0.65, p<0.001), while malaria incidence was lower in modern homes in Kihihi (adjusted IRR 0.61, 95% CI 0.40–0.91, p = 0.02) but not in Walukuba or Nagongera.
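
Incidence rate ratios for mosquito counts such as the human biting rate are commonly estimated with a count model that accounts for the number of collection nights. The sketch below, with hypothetical column names, illustrates that kind of negative binomial model rather than the study's exact specification.

```python
# Hedged sketch: IRR for female Anopheles caught per collection night,
# modern vs traditional homes, using a negative binomial GLM with an exposure offset.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("light_trap_collections.csv")  # hypothetical input, one row per house-month

model = smf.glm(
    "anopheles_count ~ modern_house + C(sub_county)",
    data=df,
    family=sm.families.NegativeBinomial(),
    offset=np.log(df["collection_nights"]),     # exposure: nights of trapping
).fit()

irr = np.exp(model.params["modern_house"])
ci = np.exp(model.conf_int().loc["modern_house"])
print(f"Adjusted IRR (modern vs traditional): {irr:.2f} ({ci[0]:.2f}, {ci[1]:.2f})")
```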

Conclusions

House design is likely to explain some of the heterogeneity of malaria transmission in Uganda and represents a promising target for future interventions, even in highly endemic areas.

8.

Background

Red cell distribution width (RDW) is a routine laboratory measure associated with poor outcomes in adult critical illness.

Objective

We determined the utility of RDW as an early pragmatic biomarker for outcome in pediatric critical illness.

Methods

We used multivariable logistic regression to test the association of RDW on the first day of pediatric intensive care unit (PICU) admission with prolonged PICU length of stay (LOS) >48 hours and mortality. The area under the receiver operating characteristic curve (AUROC) for RDW was compared to the Pediatric Index of Mortality (PIM)-2 score.
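
A minimal version of this analysis, a logistic regression for the outcome plus a receiver operating characteristic comparison, can be sketched as below, assuming hypothetical column names (rdw, died, pim2); it illustrates the approach, not the study's code.

```python
# Hedged sketch: association of day-1 RDW with PICU mortality and its discrimination (AUROC),
# compared descriptively with the PIM-2 score.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.metrics import roc_auc_score

df = pd.read_csv("picu_admissions.csv")  # hypothetical input

# Multivariable logistic regression: odds of mortality per 1% increase in RDW
model = smf.logit("died ~ rdw + age + C(admission_category)", data=df).fit()
print(np.exp(model.params["rdw"]), np.exp(model.conf_int().loc["rdw"]).values)

# Discrimination of each predictor on its own
auroc_rdw = roc_auc_score(df["died"], df["rdw"])
auroc_pim2 = roc_auc_score(df["died"], df["pim2"])
print(f"AUROC RDW: {auroc_rdw:.2f}, AUROC PIM-2: {auroc_pim2:.2f}")
```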

Results

Over a 13-month period, 596 unique patients had RDW measured on the first day of PICU admission. Sepsis was an effect modifier for LOS >48 hours but not for mortality. In sepsis, RDW was not associated with LOS >48 hours. For patients without sepsis, each 1% increase in RDW was associated with 1.17-fold (95% CI 1.06, 1.30) increased odds of LOS >48 hours. In all patients, RDW was independently associated with PICU mortality (OR 1.25, 95% CI 1.09, 1.43). The AUROC for RDW to predict LOS >48 hours and mortality was 0.61 (95% CI 0.56, 0.66) and 0.65 (95% CI 0.55, 0.75), respectively. Although the AUROC for mortality was comparable to that of PIM-2 (0.75, 95% CI 0.66, 0.83; p = 0.18), RDW did not increase the discriminative utility when added to PIM-2. Despite the moderate AUROC, patients with RDW <13.4% (upper limit of the lower quartile) had a 53% risk of LOS >48 hours and a 3.3% risk of mortality, compared with patients with RDW >15.7% (lower limit of the upper quartile), who had a 78% risk of LOS >48 hours and a 12.9% risk of mortality (p<0.001 for both outcomes).

Conclusions

Elevated RDW was associated with outcome in pediatric critical illness and provided prognostic information similar to the more complex PIM-2 severity-of-illness score. Distinct RDW thresholds best discriminated low- versus high-risk patients.

9.

Background

Early HIV diagnosis and enrolment in care are needed to achieve early antiretroviral treatment (ART) initiation. Studies from Asian countries on HIV disease stage at enrolment in care are limited. We evaluated trends in, and factors associated with, late HIV disease presentation over a ten-year period at the largest ART center in Cambodia.

Methods

We conducted a retrospective analysis of program data including all ARV-naïve adults (>18 years old) enrolling in HIV care from March 2003 to December 2013 at a non-governmental hospital in Phnom Penh, Cambodia. We calculated the proportion presenting with advanced-stage HIV disease (WHO clinical stage IV or CD4 cell count <100 cells/μL) and the probability of ART initiation within six months of enrolment. Factors associated with late presentation were determined using multivariate logistic regression.
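
The probability of ART initiation by six months can be read off a Kaplan-Meier-type curve for time from enrolment to ART start. The sketch below uses the lifelines package and hypothetical column names, and ignores competing risks (e.g., death before ART) for simplicity; it is an illustration, not the study's analysis code.

```python
# Hedged sketch: cumulative probability of ART initiation within 6 months of enrolment,
# estimated with a Kaplan-Meier curve (lifelines). Competing risks are ignored here.
import pandas as pd
from lifelines import KaplanMeierFitter

df = pd.read_csv("hiv_enrolment.csv")  # hypothetical input
# 'days_to_art_or_censor': follow-up time; 'started_art': 1 if ART was initiated

kmf = KaplanMeierFitter()
kmf.fit(df["days_to_art_or_censor"], event_observed=df["started_art"])

prob_by_6_months = 1 - kmf.predict(180)   # 1 minus the "survival" (still ART-free) probability
print(f"Probability of ART initiation by 6 months: {prob_by_6_months:.1%}")
```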

Results

From 2003 to 2013, a total of 5,642 HIV-infected patients enrolled in HIV care. The proportion of late presenters decreased from 67% in 2003 to 44% in 2009 and 41% in 2013; a temporary increase to 52% occurred in 2011, coinciding with logistical and budgetary constraints at the national program level. Median CD4 counts increased from 32 cells/μL (IQR 11–127) in 2003 to 239 cells/μL (IQR 63–291) in 2013. Older age and male sex were associated with late presentation across the ten-year period. The probability of ART initiation within six months of enrolment increased from 22.6% in 2003–2006 to 79.9% in 2011–2013.

Conclusion

Although a gradual improvement was observed over time, a large proportion of patients still enroll late, particularly older and male patients. Interventions to achieve early HIV testing and efficient linkage to care are warranted.

10.

Background

Although the weaning classification based on the difficulty and duration of the weaning process has been evaluated in different types of intensive care units (ICUs), little is known about clinical outcomes and the validity of the three groups in the medical ICU. The objectives of this study were to evaluate the clinical relevance of the weaning classification and its association with hospital mortality in a medical ICU with a protocol-based weaning program.

Methods

All consecutive patients admitted to the medical ICU and requiring mechanical ventilation (MV) for more than 24 hours between July 2010 and June 2013 were prospectively registered and screened for weaning readiness using a standardized weaning program. Baseline characteristics and outcomes were compared across weaning classifications.

Results

During the study period, a total of 680 patients were weaned according to the standardized weaning protocol. Of these, 457 (67%) were classified as simple weaning, 136 (20%) as difficult weaning, and 87 (13%) as prolonged weaning. Ventilator-free days within 28 days decreased significantly from the simple to the difficult to the prolonged weaning group (P < 0.001, test for trend). In addition, reintubation within 48 hours after extubation (P < 0.001) and the need for tracheostomy during the weaning process (P < 0.001) increased significantly across weaning groups. Finally, ICU (P < 0.001), post-ICU (P = 0.001), and hospital (P < 0.001) mortality increased significantly across weaning groups. In a multiple logistic regression model, prolonged weaning, but not difficult weaning, remained independently associated with ICU (adjusted OR 8.265, 95% CI 3.484–19.605, P < 0.001), post-ICU (adjusted OR 3.180, 95% CI 1.349–7.497, P = 0.005), and hospital (adjusted OR 5.528, 95% CI 2.801–10.910, P < 0.001) mortality.

Conclusions

Weaning classification based on the difficulty and duration of the weaning process may provide prognostic information for mechanically ventilated patients who undergo the weaning process.

11.

Background

Hepatitis B virus (HBV) and hepatitis C virus (HCV) co-infection contributes to a substantial proportion of liver disease worldwide. The aim of this study was to assess the clinical and virological features of HBV-HCV co-infection.

Methods

Demographic data were collected for 3238 high-risk people from an HCV-endemic region in China. Laboratory tests included HCV antibody and HBV serological markers, liver function tests, and routine blood analysis. Anti-HCV positive samples were analyzed for HCV RNA levels and subgenotypes. HBsAg-positive samples were tested for HBV DNA.

Results

A total of 1,468 patients had chronic HCV and/or HBV infections. Among them, 1,200 individuals were classified as HCV mono-infected, 161 as HBV mono-infected, and 107 as co-infected. HBV-HCV co-infected patients had a lower HBV DNA positivity rate than HBV mono-infected patients (84.1% versus 94.4%; P<0.001). The median HCV RNA level in HBV-HCV co-infected patients was significantly lower than that in HCV mono-infected patients (1.18 [interquartile range (IQR) 0–5.57] versus 5.87 [IQR 3.54–6.71] log10 IU/mL; P<0.001). Furthermore, co-infected patients were less likely to have detectable HCV RNA than HCV mono-infected patients (23.4% versus 56.5%; P<0.001). HBV-HCV co-infected patients also had significantly lower median HBV DNA levels than those mono-infected with HBV (1.97 [IQR 1.3–3.43] versus 3.06 [IQR 2–4.28] log10 IU/mL; P<0.001). The HBV-HCV co-infection group had higher ALT, AST, ALP, GGT, APRI, and FIB-4 levels, but lower albumin and platelet counts, than the HBV mono-infection group, and values similar to those of the HCV mono-infected group.

Conclusion

These results suggest that co-infection with HCV and HBV inhibits the replication of both viruses. The serologic results in HBV-HCV co-infected patients suggest more liver injury than in HBV mono-infected patients, similar to that seen in HCV mono-infection.

12.

Objective

The English-language Pregnancy-Unique Quantification of Emesis and nausea (PUQE) questionnaire identifies women with severe hyperemesis gravidarum. Our aim was to investigate whether scores from the translated Norwegian version, SUKK (SvangerskapsUtløst Kvalme Kvantifisering), were associated with severity of hyperemesis and nutritional intake.

Design

A prospective cohort validation study.

Setting

A hospital cohort of hyperemesis gravidarum (HG) patients from western Norway and healthy pregnant women from Bergen, Norway.

Sample

Thirty-eight women hospitalized due to HG and 31 healthy pregnant controls attending routine antenatal check-ups at health centers.

Methods

Data were collected from May 2013 to January 2014. The study participants answered the Norwegian PUQE questionnaire (scores ranging from 3 to 15) and prospectively registered 24-hour nutritional intake using a food list form.

Main outcome measures

Differences in PUQE scores, QOL scores, and nutritional intake between hyperemesis patients and controls.

Results

Hyperemesis patients had a shorter gestational age than controls (median 9.7 weeks, 95% CI 8.6–10.6, versus 11.9 weeks, 95% CI 10.1–12.9; p = 0.004) and a larger weight change from pre-pregnancy weight (median loss of 3 kg, 95% CI 3–4, versus gain of 2 kg, 95% CI 0.5–2; p < 0.001); otherwise, the groups were similar regarding pre-pregnancy BMI, age, gravidity, and weight at inclusion. Compared with controls, hyperemesis patients had significantly higher PUQE scores (median 13, 95% CI 11–14, vs. 7, 95% CI 4–8), lower QOL (median score 3, 95% CI 2–4, vs. 6, 95% CI 4.5–8), and lower nutritional intake (median energy intake 990 kcal/24 hours, 95% CI 709–1233, vs. 1652 kcal/24 hours, 95% CI 1558–1880; all p < 0.001). PUQE score was inversely correlated with nutritional intake (r = -0.5, p < 0.001). At discharge, the PUQE score had fallen to a median of 6 (95% CI 5–8) and the QOL score had risen to 7 (95% CI 6–8) in the HG group (both p < 0.001 compared with admission values).

Conclusion

PUQE-scoring has been validated as a robust indicator of severe hyperemesis gravidarum and insufficient nutritional intake in a Norwegian setting.

13.

Introduction

Burkina Faso started nationwide community case management of malaria (CCMm) in 2010. In 2011, health center user fees for children under five were abolished in some districts.

Objective

To assess the effects of concurrent implementation of CCMm and user fees abolition on treatment-seeking practices for febrile children.

Methods

This was a natural experiment conducted in the districts of Kaya (CCMm plus user-fee abolition) and Zorgho (CCMm only). Registry data from 2005 to 2014 on visits for malaria were collected from all eight rural health centers in the study area. Annual household surveys were administered during the malaria transmission season in 2011 and 2012 in 1,035 randomly selected rural households. Interrupted time series models were fitted to the registry data and Fine and Gray's competing risks models to the survey data.
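
The interrupted time series part of this analysis can be sketched as a segmented regression on monthly visit counts: a level-change term at the user-fee abolition date and a post-intervention trend term, here fitted with a Poisson GLM and hypothetical column names. The sketch is indicative only; the authors' exact specification (and the Fine and Gray survival models) is not reproduced here.

```python
# Hedged sketch: segmented (interrupted time series) regression for monthly malaria visits
# before and after user-fee abolition, using a Poisson GLM with robust standard errors.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("monthly_visits_kaya.csv")   # hypothetical input, one row per month
df["time"] = np.arange(len(df))               # months since start of series
df["post"] = (pd.to_datetime(df["date"]) >= "2011-01-01").astype(int)   # after abolition
df["time_since_post"] = np.maximum(0, df["time"] - df.loc[df["post"] == 1, "time"].min())

model = smf.glm(
    "visits ~ time + post + time_since_post",
    data=df,
    family=sm.families.Poisson(),
).fit(cov_type="HC1")

print(np.exp(model.params["post"]))   # IRR for the immediate level change at abolition
```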

Results

User-fee abolition in Kaya significantly increased health center use by eligible children with malaria (incidence rate ratio for the intercept change = 2.1, p < 0.001). In 2011 in Kaya, the likelihood of health center use for febrile children was three times higher, and CHW use three times lower, when caregivers knew services were free. Among the 421 children with fever in 2012, the delay before visiting a health center was significantly shorter in Kaya than in Zorgho (1.46 versus 1.79 days, p < 0.05). The likelihood of visiting a health center on the first day of fever among households <2.5 km or <5 km from a health center was two and three times higher, respectively, in Kaya than in Zorgho (p < 0.001).

Conclusions

User-fee abolition reduced the delay before visits for febrile children living close to health centers. It also increased demand for and use of health centers for children with malaria. Concurrently, demand for CHWs' services diminished. User-fee abolition and CCMm should be coordinated to maximize prompt access to treatment in rural areas.

14.

Background

This study examines the relative contributions of living in an urban versus rural setting and of malaria to the public health problems of malarial anaemia (MA) and anaemia, respectively, in apparently healthy primary school children.

Methods

A cross-sectional study was conducted among 727 school children aged four to 15 years living in urban (302) and rural (425) settings in the Mount Cameroon area. A blood sample collected from each child was used to prepare blood films for the detection of malaria parasites and assessment of malaria parasite density, as well as for a full blood count determined with an automated haematology analyzer. Based on haemoglobin (Hb) measurements, children with malaria parasitaemia were stratified into MA (Hb < 11 g/dL): mild MA (Hb 8–10.9 g/dL), moderate MA (Hb 6.1–7.9 g/dL), and severe MA (Hb ≤ 6 g/dL). Potential determinants of MA and anaemia were evaluated by multinomial logistic regression analysis, and odds ratios were used to evaluate risk factors.
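
Multinomial logistic regression for a multi-category outcome of this kind (e.g., no MA / mild MA / moderate-severe MA) can be sketched with statsmodels. The code below uses hypothetical column names and category coding, as an illustration of the analysis type named above rather than the authors' model.

```python
# Hedged sketch: multinomial logistic regression for anaemia/MA category (statsmodels MNLogit).
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("mount_cameroon_survey.csv")  # hypothetical input
# outcome coding (hypothetical): 0 = no MA, 1 = mild MA, 2 = moderate/severe MA

X = sm.add_constant(df[["urban", "age_under7", "log_parasite_density", "microcytosis"]])
y = df["ma_category"]

model = sm.MNLogit(y, X).fit()
print(model.summary())             # coefficients for each outcome category vs the reference
print(np.exp(model.params))        # relative-risk ratios (odds ratios vs the reference category)
```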

Results

Of the 727 children examined, 72 (9.9%) had MA. The prevalence of MA and anaemia was significantly higher (χ2 = 36.5, P < 0.001; χ2 = 16.19, P < 0.001, respectively) among children in the urban area (17.9% and 26.8%, respectively) than in the rural area (4.2% and 14.8%, respectively). The majority of MA cases were mild (88.9%), with moderate (5.6%) and severe MA (5.6%) occurring in the urban area only. The age group ≤6 years was significantly (P < 0.05) associated with both MA and anaemia. In addition, low parasite density was associated with MA, while malaria parasite negativity and microcytosis were associated with anaemia.

Conclusions

Malarial anaemia and anaemia display heterogeneity and complexity that differ with the type of settlement. The presence of severe MA and the contributions of the age group ≤6 years, low parasite density and microcytosis to the public health problem of MA and anaemia are noteworthy.

15.

Objectives

To evaluate the humanistic and economic burden of a restless legs syndrome (RLS) diagnosis with regard to health-related quality of life, work productivity loss, healthcare resource use, and direct and indirect costs.

Study Design

Self-reported data came from the 2012 National Health and Wellness Survey (NHWS), a large, annual, nationally representative cross-sectional general health survey of US adults.

Methods

RLS patients (n = 2,392) were matched on demographic and health characteristics to non-RLS respondents via propensity score matching; differences between groups were tested with bivariate and multivariable analyses.
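
A basic propensity-score matching workflow of the kind described — estimate propensity scores from demographic and health covariates, then 1:1 nearest-neighbour match RLS to non-RLS respondents — is sketched below with scikit-learn, using hypothetical column names. Real analyses typically add a caliper and balance diagnostics; this is an illustration, not the study's procedure.

```python
# Hedged sketch: 1:1 nearest-neighbour propensity score matching (scikit-learn), no caliper.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

df = pd.read_csv("nhws_2012.csv")  # hypothetical input
covariates = ["age", "female", "bmi", "charlson_index", "smoker", "income_bracket"]

# 1. Propensity score: probability of being an RLS patient given covariates
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["rls"])
df["pscore"] = ps_model.predict_proba(df[covariates])[:, 1]

# 2. Match each RLS patient to the nearest non-RLS respondent on the propensity score
cases = df[df["rls"] == 1]
controls = df[df["rls"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(controls[["pscore"]])
_, idx = nn.kneighbors(cases[["pscore"]])
matched_controls = controls.iloc[idx.ravel()]

matched = pd.concat([cases, matched_controls])      # matched sample for outcome comparisons
print(matched.groupby("rls")["mcs_score"].mean())   # e.g., Mental Component Summary by group
```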

Results

RLS patients had significantly lower health-related quality of life scores: Mental Component Summary (44.60 vs. 48.92, p<.001), Physical Component Summary (40.57 vs. 46.78, p<.001), and health utilities (.63 vs. .71, p<.001); and higher levels of work productivity loss in the past seven days, including absenteeism (8.1% vs. 9.3%, p<.001), presenteeism (26.5% vs. 15.8%, p<.001), and overall productivity loss (30.1% vs. 18.1%, p<.001), as well as general activity impairment (46.1% vs. 29.7%, p<.001). RLS patients had significantly higher healthcare resource use in the past six months than non-RLS patients: healthcare provider visits (7.46 vs. 4.42, p<.001), ER visits (0.45 vs. 0.24, p<.001), and hospitalizations (0.24 vs. 0.15, p<.001). RLS patients also had higher estimated direct and indirect costs than non-RLS patients. Finally, across outcomes, increasing severity was associated with greater economic and humanistic burden for RLS patients.

Conclusions

RLS patients suffer a greater humanistic and economic burden than those without RLS. Moreover, as severity increases, so does the burden of RLS.

16.

Objectives

Peritonitis is one of the most important causes of treatment failure in peritoneal dialysis (PD) patients. This study describes changes in characteristics of causative organisms in PD-related peritonitis and antimicrobial susceptibility.

Methods

In this single-center study, we retrospectively analyzed 487 susceptibility profiles from peritoneal fluid cultures of 351 adult patients with peritonitis between 1979 and 2014 (divided into three time periods, P1–P3).

Results

Staphylococcus aureus decreased from P1 to P2 and P3 (P<0.05 and P<0.01, respectively). Methicillin-resistant S. aureus (MRSA) occurred only in P3. Methicillin-resistant Staphylococcus epidermidis (MRSE) increased in P3 compared with P1 and P2 (P<0.0001 for both). In P2 and P3, vancomycin-resistant enterococci were detected. The percentage of gram-negative organisms remained unchanged. Third-generation cephalosporin-resistant gram-negative rods (3GCR-GN) were found exclusively in P3. Cefazolin-susceptible gram-positive organisms decreased over the three decades (93% in P1, 75% in P2, and 58% in P3; P<0.01, P<0.05, and P<0.0001, respectively). Vancomycin susceptibility decreased, and gentamicin susceptibility among gram-negatives was 94% in P1, 82% in P2, and 90% in P3. Ceftazidime susceptibility was 84% in P2 and 93% in P3.

Conclusions

Peritonitis caused by methicillin-susceptible S. aureus (MSSA) decreased, but peritonitis caused by MRSE increased. MRSA peritonitis is still rare. Peritonitis caused by 3GCR-GN is increasing. Initial antibiotic treatment protocols for PD patients should be informed by continuous surveillance.

17.

Objective

To determine six-year spherical refractive error change among white children and young adults in the UK and evaluate differences in refractive profiles between contemporary Australian children and historical UK data.

Design

Population-based prospective study.

Participants

The Northern Ireland Childhood Errors of Refraction (NICER) study Phase 1 examined 1,068 children in two cohorts aged 6–7 years and 12–13 years. Prospective six-year follow-up data (Phase 3) are available for 212 participants aged 12–13 years and 226 aged 18–20 years in the younger and older cohorts, respectively.

Methods

Cycloplegic refractive error was determined using binocular open-field autorefraction (Shin-Nippon NVision-K 5001, cyclopentolate 1%). Participants were defined by spherical equivalent refraction (SER) as myopic (SER ≤ -0.50 D), emmetropic (-0.50 D < SER < +2.00 D), or hyperopic (SER ≥ +2.00 D).
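
The refractive classification used above is a simple thresholding of the spherical equivalent refraction; a small helper like the one below (hypothetical, for illustration only) makes the cut-points explicit.

```python
# Hedged sketch: classify spherical equivalent refraction (SER, in dioptres)
# using the cut-points stated above.
def classify_ser(ser_dioptres: float) -> str:
    """Return 'myopic', 'emmetropic', or 'hyperopic' for a given SER."""
    if ser_dioptres <= -0.50:
        return "myopic"
    if ser_dioptres >= +2.00:
        return "hyperopic"
    return "emmetropic"

print(classify_ser(-0.75))   # myopic
print(classify_ser(+0.25))   # emmetropic
```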

Main Outcome Measures

Proportion and incidence of myopia.

Results

The proportion of myopes increased significantly between 6–7 years (1.9%) and 12–13 years (14.6%) (p<0.001) but not between 12–13 and 18–20 years (16.4% to 18.6%, p = 0.51). The estimated annual incidence of myopia was 2.2% and 0.7% for the younger and older cohorts, respectively. There were significantly more myopic children in the UK at age 12–13 years in the NICER study (16.4%) than reported in Australia (4.4%) (p<0.001). However, by 17 years the proportion with myopia neared equivalence in the two populations (NICER 18.6%, Australia 17.7%, p = 0.75). The proportion of myopic children aged 12–13 years in the present study (2006–2008) was 16.4%, significantly greater than that reported for children aged 10–16 years in the 1960s (7.2%, p = 0.01). The proportion of hyperopes in the younger NICER cohort decreased significantly over the six-year period (from 21.7% to 14.2%, p = 0.04). Hyperopes with SER ≥ +3.50 D in both NICER age cohorts demonstrated persistent hyperopia.

Conclusions

The incidence and proportion of myopia are relatively low in this contemporary white UK population compared with other worldwide studies. The proportion of myopes in the UK has more than doubled over the last 50 years among children aged 10–16 years, and children are becoming myopic at a younger age. Differences in the proportion of myopes between the UK and Australia apparent at 12–13 years were eliminated by 17 years of age.

18.

Background

The analysis of heart rate variability (HRV) has been shown to be a promising non-invasive technique for assessing cardiac autonomic modulation in trauma. The aim of this study was to evaluate HRV during hemorrhagic shock and fluid resuscitation, in comparison with traditional hemodynamic and metabolic parameters.

Methods

Twenty anesthetized and mechanically ventilated pigs were submitted to hemorrhagic shock (withdrawal of 60% of the estimated blood volume) and evaluated for 60 minutes without fluid replacement. Surviving animals were treated with Ringer's solution and evaluated for an additional 180 minutes. HRV metrics (time and frequency domain) as well as hemodynamic and metabolic parameters were evaluated in surviving and non-surviving animals.
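
Time- and frequency-domain HRV metrics of the kind analysed here are typically computed from RR (inter-beat) intervals. The sketch below shows a minimal version with NumPy/SciPy: SDNN and RMSSD in the time domain, and LF/HF power from a Welch periodogram of the resampled RR series. Exact band limits and preprocessing vary between studies; this is an illustration, not the authors' pipeline.

```python
# Hedged sketch: basic HRV metrics from RR intervals (in seconds).
import numpy as np
from scipy.signal import welch
from scipy.interpolate import interp1d

def hrv_metrics(rr_s, fs_resample=4.0):
    rr_ms = np.asarray(rr_s) * 1000.0
    sdnn = np.std(rr_ms, ddof=1)                       # time domain: SDNN (ms)
    rmssd = np.sqrt(np.mean(np.diff(rr_ms) ** 2))      # time domain: RMSSD (ms)

    # Frequency domain: resample the irregular RR series to a uniform grid, then Welch PSD
    t = np.cumsum(rr_s)
    grid = np.arange(t[0], t[-1], 1.0 / fs_resample)
    rr_interp = interp1d(t, rr_ms, kind="cubic")(grid)
    f, pxx = welch(rr_interp - rr_interp.mean(), fs=fs_resample, nperseg=min(256, len(grid)))

    lf = np.trapz(pxx[(f >= 0.04) & (f < 0.15)], f[(f >= 0.04) & (f < 0.15)])   # LF band
    hf = np.trapz(pxx[(f >= 0.15) & (f < 0.40)], f[(f >= 0.15) & (f < 0.40)])   # HF band
    return {"SDNN": sdnn, "RMSSD": rmssd, "LF/HF": lf / hf}

# Example with synthetic RR intervals around 0.8 s (75 bpm)
rng = np.random.default_rng(0)
rr = 0.8 + 0.05 * np.sin(2 * np.pi * 0.25 * np.arange(300)) + rng.normal(0, 0.01, 300)
print(hrv_metrics(rr))
```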

Results

Seven of the 20 animals died during hemorrhage and initial fluid resuscitation. All animals presented an increase in time-domain HRV measures during haemorrhage, and fluid resuscitation restored baseline values. Although not significantly, normalized low-frequency power and the LF/HF ratio decreased during the early stages of haemorrhage, recovered baseline values later during hemorrhagic shock, and increased after fluid resuscitation. Non-surviving animals presented significantly lower mean arterial pressure (43±7 vs 57±9 mmHg, P<0.05) and cardiac index (1.7±0.2 vs 2.6±0.5 L/min/m2, P<0.05), and higher levels of plasma lactate (7.2±2.4 vs 3.7±1.4 mmol/L, P<0.05), base excess (-6.8±3.3 vs -2.3±2.8 mmol/L, P<0.05), and potassium (5.3±0.6 vs 4.2±0.3 mmol/L, P<0.05) at 30 minutes after hemorrhagic shock compared with surviving animals.

Conclusions

HRV increased early during hemorrhage, but none of the evaluated HRV metrics was able to discriminate survivors from non-survivors during hemorrhagic shock. Moreover, metabolic and hemodynamic variables were more reliable in reflecting the severity of hemorrhagic shock than HRV metrics.

19.

Background

Sensitive and specific detection of malaria parasites is crucial for controlling the significant malaria burden in the developing world. It is also important to identify life-threatening Plasmodium falciparum malaria quickly and accurately to reduce malaria-related mortality. Existing methods such as microscopy and rapid diagnostic tests (RDTs) have major shortcomings. Here, we describe a new real-time PCR-based diagnostic test device for point-of-care use in resource-limited settings.

Methods

Truenat® Malaria, a chip-based microPCR test, was developed by bigtec Labs, Bangalore, India, for the differential identification of Plasmodium falciparum and Plasmodium vivax parasites. The Truenat Malaria test runs on bigtec's Truelab Uno® microPCR device, a handheld, battery-operated, easy-to-use real-time microPCR device. The performance of Truenat® Malaria was evaluated against the WHO nested PCR protocol. Truenat® Malaria was further evaluated in a triple-blinded study using a panel of 281 specimens created from clinical samples characterized by expert microscopy and a rapid diagnostic test kit by the National Institute of Malaria Research (NIMR). A comparative evaluation was done on the Truelab Uno® and a commercial real-time PCR system.

Results

The limit of detection of the Truenat Malaria assay was found to be <5 parasites/μl for both P. falciparum and P. vivax. The Truenat® Malaria test had a sensitivity and specificity of 100% each compared with the WHO nested PCR protocol, based on the evaluation of 100 samples. Sensitivity at the species level, using expert microscopy as the reference standard, was approximately 99.3% (95% CI: 95.5–99.9). Mixed infections were identified more accurately by Truenat Malaria (32 samples identified as mixed) than by expert microscopy and RDTs, which detected 4 and 5 mixed samples, respectively.
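
Sensitivity and specificity against a reference standard, with binomial confidence intervals, can be computed as in the sketch below; the 2x2 counts used are placeholders (not the study's data), and the Wilson interval from statsmodels is one common choice of CI method.

```python
# Hedged sketch: sensitivity and specificity of an index test versus a reference standard,
# with 95% Wilson confidence intervals (statsmodels). Counts below are placeholders.
from statsmodels.stats.proportion import proportion_confint

def sens_spec(tp, fn, tn, fp):
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    sens_ci = proportion_confint(tp, tp + fn, alpha=0.05, method="wilson")
    spec_ci = proportion_confint(tn, tn + fp, alpha=0.05, method="wilson")
    return sens, sens_ci, spec, spec_ci

sens, sens_ci, spec, spec_ci = sens_spec(tp=148, fn=1, tn=130, fp=2)  # placeholder counts
print(f"Sensitivity {sens:.1%} (95% CI {sens_ci[0]:.1%}-{sens_ci[1]:.1%})")
print(f"Specificity {spec:.1%} (95% CI {spec_ci[0]:.1%}-{spec_ci[1]:.1%})")
```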

Conclusion

The Truenat® Malaria microPCR test is a valuable diagnostic tool, and its implementation should be considered not only for malaria diagnosis but also for active surveillance and epidemiological intervention.

20.

Background and Objective

Conflicting data have been reported on the association between tumor necrosis factor (TNF) –308G>A and nitric oxide synthase 3 (NOS3) +894G>T polymorphisms and migraine. We performed a meta-analysis of case-control studies to evaluate whether the TNF –308G>A and NOS3 +894G>T polymorphisms confer genetic susceptibility to migraine.

Method

We performed an updated meta-analysis for TNF –308G>A and a meta-analysis for NOS3 +894G>T based on studies published up to July 2014. We calculated study-specific odds ratios (OR) and 95% confidence intervals (95% CI), and pooled effect estimates under allele contrast, dominant, recessive, and co-dominant models.
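
Pooled odds ratios under, say, a dominant model can be obtained by inverse-variance weighting of per-study log odds ratios. The sketch below shows a fixed-effect version in plain NumPy with made-up example counts; a random-effects estimate, as typically used when heterogeneity is present, would add a DerSimonian-Laird tau-squared step.

```python
# Hedged sketch: fixed-effect inverse-variance pooled odds ratio for a dominant genetic
# model (minor-allele carriers vs non-carriers). Counts are illustrative only.
import numpy as np

# Per-study 2x2 counts: (cases_exposed, cases_unexposed, controls_exposed, controls_unexposed)
studies = [
    (30, 120, 25, 200),
    (45, 180, 40, 260),
    (18,  70, 22, 150),
]

log_ors, weights = [], []
for a, b, c, d in studies:
    log_or = np.log((a * d) / (b * c))           # log odds ratio for one study
    var = 1 / a + 1 / b + 1 / c + 1 / d          # Woolf variance of the log OR
    log_ors.append(log_or)
    weights.append(1 / var)                      # inverse-variance weight

log_ors, weights = np.array(log_ors), np.array(weights)
pooled_log_or = np.sum(weights * log_ors) / np.sum(weights)
se = np.sqrt(1 / np.sum(weights))
ci = np.exp(pooled_log_or + np.array([-1.96, 1.96]) * se)

print(f"Pooled OR = {np.exp(pooled_log_or):.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```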

Results

Eleven studies with 6,682 migraineurs and 22,591 controls for TNF –308G>A and six studies with 1,055 migraineurs and 877 controls for NOS3 +894G>T were included in the analysis. Neither analysis indicated an overall association between the polymorphisms and migraine risk. Subgroup analyses suggested that the “A” allele of the TNF –308G>A variant increases the risk of migraine among non-Caucasians (dominant model: pooled OR = 1.82; 95% CI 1.15–2.87). The risk of migraine with aura (MA) was increased among both Caucasians and non-Caucasians. Subgroup analyses also suggested that the “T” allele of the NOS3 +894G>T variant increases the risk of migraine among non-Caucasians (co-dominant model: pooled OR = 2.10; 95% CI 1.14–3.88).

Conclusions

Our findings appear to support the hypothesis that the TNF –308G>A polymorphism may act as a genetic susceptibility factor for migraine among non-Caucasians and that the NOS3 +894G>T polymorphism may modulate the risk of migraine among non-Caucasians.
