Similar Articles
 Found 20 similar articles.
1.

Background

Approximately 14 million persons living in areas endemic for lymphatic filariasis have lymphedema of the leg. Clinical studies indicate that repeated episodes of bacterial acute dermatolymphangioadenitis (ADLA) lead to progression of lymphedema and that basic lymphedema management, which emphasizes hygiene, skin care, exercise, and leg elevation, can reduce ADLA frequency. However, few studies have prospectively evaluated the effectiveness of basic lymphedema management or assessed the role of compressive bandaging for lymphedema in resource-poor settings.

Methodology/Principal Findings

Between 1995 and 1998, we prospectively monitored ADLA incidence and leg volume in 175 persons with lymphedema of the leg who enrolled in a lymphedema clinic in Leogane, Haiti, an area endemic for Wuchereria bancrofti. During the first phase of the study, when a major focus of the program was to reduce leg volume using compression bandages, ADLA incidence was 1.56 episodes per person-year. After March 1997, when hygiene and skin care were systematically emphasized and bandaging discouraged, ADLA incidence decreased to 0.48 episodes per person-year (P<0.0001). ADLA incidence was significantly associated with leg volume, stage of lymphedema, illiteracy, and use of compression bandages. Leg volume decreased in 78% of patients; over the entire study period, this reduction was statistically significant only for legs with stage 2 lymphedema (P = 0.01).

Conclusions/Significance

Basic lymphedema management, which emphasized hygiene and self-care, was associated with a 69% reduction in ADLA incidence. Use of compression bandages in this setting was associated with an increased risk of ADLA. Basic lymphedema management is feasible and effective in resource-limited areas that are endemic for lymphatic filariasis.
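
For illustration, here is a minimal Python sketch of how the person-year rates and the 69% reduction quoted above are computed. The episode counts and follow-up totals are hypothetical; only the resulting rates (1.56 and 0.48 episodes per person-year) match the abstract.

```python
# Illustrative check of the incidence rates reported in the abstract.
# Episode counts and person-years below are hypothetical; only the
# resulting rates (1.56 and 0.48 episodes/person-year) are from the text.

def incidence_rate(episodes: int, person_years: float) -> float:
    """ADLA episodes per person-year of follow-up."""
    return episodes / person_years

rate_bandaging = incidence_rate(episodes=312, person_years=200.0)  # 1.56
rate_hygiene = incidence_rate(episodes=96, person_years=200.0)     # 0.48

reduction = 1 - rate_hygiene / rate_bandaging
print(f"Before: {rate_bandaging:.2f}, after: {rate_hygiene:.2f} episodes/person-year")
print(f"Relative reduction: {reduction:.0%}")  # ~69%, matching the abstract
```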

2.

Background

Lymphedema management programs have been shown to decrease episodes of adenolymphangitis (ADLA), but their impact on lymphedema progression and the role of program compliance have not been thoroughly explored. Our objectives were to determine the rate of ADLA episodes and the course of lymphedema over time for patients enrolled in a community-based lymphedema management program, and to explore the association of program compliance with both ADLA episodes and lymphedema progression.

Methodology/Principal Findings

A lymphedema management program was implemented in Odisha State, India from 2007–2010 by the non-governmental organization Church's Auxiliary for Social Action, in consultation with the Centers for Disease Control and Prevention. A cohort of patients was followed over 24 months. The crude 30-day rate of ADLA episodes decreased from 0.35 episodes per person-month at baseline to 0.23 at 24 months. Over the study period, the percentage of patients who progressed to more severe lymphedema decreased (P = 0.0004), while the percentage whose lymphedema regressed increased over time (P<0.0001). Overall compliance with lymphedema management, lagged one time point, appeared to have little to no association with the frequency of ADLA episodes among those without entry lesions (RR = 0.87; 95% CI: 0.69, 1.10) and was associated with an increased rate (RR = 1.44; 95% CI: 1.11, 1.86) among those with entry lesions. Lagged two time points, compliance was associated with a decreased rate of ADLA episodes among those with entry lesions (RR = 0.77; 95% CI: 0.59, 0.99) and somewhat associated among those without entry lesions (RR = 0.83; 95% CI: 0.64, 1.06). Compliance with soap use was associated with a decreased rate of ADLA episodes among those without inter-digital entry lesions.

Conclusions/Significance

These results indicate that a community-based lymphedema management program benefits lymphedema patients with respect to both ADLA episodes and lymphedema progression. This is one of the first studies to demonstrate an association between program compliance and the rate of ADLA episodes.
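
The "lagged" analyses above pair each interval's ADLA outcome with compliance recorded one or two time points earlier. Below is a minimal pandas sketch of that lagging step, using hypothetical column names and toy data; fitting the rate ratios would follow separately (e.g., Poisson regression on the lagged indicator).

```python
import pandas as pd

# Hypothetical long-format follow-up data: one row per patient per visit.
df = pd.DataFrame({
    "patient_id":    [1, 1, 1, 1, 2, 2, 2, 2],
    "visit":         [0, 1, 2, 3, 0, 1, 2, 3],
    "compliant":     [1, 1, 0, 1, 0, 1, 1, 1],
    "adla_episodes": [2, 1, 1, 0, 3, 2, 1, 1],
})

df = df.sort_values(["patient_id", "visit"])
# Compliance lagged one and two time points within each patient, so that
# episodes at visit t are modeled against compliance at t-1 or t-2.
df["compliant_lag1"] = df.groupby("patient_id")["compliant"].shift(1)
df["compliant_lag2"] = df.groupby("patient_id")["compliant"].shift(2)
print(df)
```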

3.
4.
Objective

To evaluate the effect of iron supplementation on the incidence of infections in children.

Design

Systematic review of randomised controlled trials.

Interventions

Oral or parenteral iron supplementation or fortified formula milk or cereals.

Outcomes

Incidence of all recorded infectious illnesses and of individual illnesses, including respiratory tract infection, diarrhoea, malaria and other infections, and prevalence of positive smear results for malaria.

Results

The pooled estimate (random effects model) of the incidence rate ratio (iron v placebo) was 1.02 (95% confidence interval 0.96 to 1.08, P=0.54; P<0.0001 for heterogeneity). The incidence rate difference (iron minus placebo) for all recorded illnesses was 0.06 episodes/child year (−0.06 to 0.18, P=0.34; P<0.0001 for heterogeneity). However, there was an increase in the risk of developing diarrhoea (incidence rate ratio 1.11, 1.01 to 1.23, P=0.04), although this would not have an important overall effect on public health (incidence rate difference 0.05 episodes/child year, −0.03 to 0.13; P=0.21). The occurrence of other illnesses and positive results on malaria smears (adjusted for positive smears at baseline) were not significantly affected by iron administration. On meta-regression, the statistical heterogeneity could not be explained by the variables studied.

Conclusion

Iron supplementation has no apparent harmful effect on the overall incidence of infectious illnesses in children, though it slightly increases the risk of developing diarrhoea.
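
The pooled estimate above comes from a random effects model. As a sketch of how such pooling works, here is a compact DerSimonian-Laird random-effects implementation on the log scale; the per-trial incidence rate ratios and standard errors are hypothetical, not the trial data used in the review.

```python
import numpy as np

# DerSimonian-Laird random-effects pooling of log incidence rate ratios.
irr = np.array([0.95, 1.10, 1.25, 0.90, 1.02])  # per-trial IRRs (assumed)
se = np.array([0.05, 0.08, 0.10, 0.06, 0.04])   # SEs of log(IRR) (assumed)

y, w = np.log(irr), 1 / se**2                        # fixed-effect weights
q = np.sum(w * (y - np.sum(w * y) / np.sum(w))**2)   # Cochran's Q
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (q - (len(y) - 1)) / c)              # between-trial variance

w_re = 1 / (se**2 + tau2)                            # random-effects weights
pooled = np.sum(w_re * y) / np.sum(w_re)
se_pooled = np.sqrt(1 / np.sum(w_re))
lo, hi = np.exp(pooled - 1.96 * se_pooled), np.exp(pooled + 1.96 * se_pooled)
print(f"Pooled IRR {np.exp(pooled):.2f} (95% CI {lo:.2f} to {hi:.2f})")
```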

What is already known on this topic

Iron supplementation is recommended to prevent iron deficiency, which is a major health problem, especially in developing countries.
Conflicting data exist regarding a possible increase in the incidence of infections with iron supplementation, resulting in concern about the safety of this intervention.

What this study adds

Iron supplementation has no apparent harmful effect on the overall incidence of infectious illnesses in children.
Iron administration increases the risk of developing diarrhoea.
Fortification of foods may be the safest and most beneficial mode of supplementation in relation to infectious illnesses.

5.
Objective

To assess the effectiveness of nurse led follow up in the management of patients with lung cancer.

Design

Randomised controlled trial.

Setting

Specialist cancer hospital and three cancer units in southeastern England.

Participants

203 patients with lung cancer who had completed their initial treatment and were expected to survive for at least 3 months.

Intervention

Nurse led follow up of outpatients compared with conventional medical follow up.

Results

Patient acceptability of nurse led follow up was high: 75% (203/271) of eligible patients consented to participate. Patients who received the intervention had less severe dyspnoea at 3 months (P=0.03) and had better scores for emotional functioning (P=0.03) and less peripheral neuropathy (P=0.05) at 12 months. Intervention group patients scored significantly better in most satisfaction subscales at 3, 6, and 12 months (P<0.01 for all subscales at 3 months). No significant differences in general practitioners' overall satisfaction were seen between the two groups. No differences were seen in survival or rates of objective progression, although nurses recorded progression of symptoms sooner than doctors (P=0.01). Intervention patients were more likely to die at home rather than in a hospital or hospice (P=0.04), attended fewer consultations with a hospital doctor during the first 3 months (P=0.004), had fewer radiographs during the first 6 months (P=0.04), and had more radiotherapy within the first 3 months (P=0.01). No other differences were seen between the two groups in terms of the use of resources.

Conclusion

Nurse led follow up was acceptable to lung cancer patients and general practitioners and led to positive outcomes.

What is already known on this topic

Most patients with cancer are routinely seen in outpatient clinics for many years despite lack of evidence of effectiveness.
Doctors and nurses often fail to detect patients' emotional distress, and patients have little time to raise concerns.

What this study adds

Follow up of patients with lung cancer by clinical nurse specialists is safe, acceptable, and cost effective.
Both patients and general practitioners were highly satisfied with the nurse led model of follow up.

6.
7.

Objective

To determine the vision-related quality of life (VR-QOL) after surgery for macula-off rhegmatogenous retinal detachment (RRD) in relation to visual acuity, contrast acuity, and color vision.

Methods

In a prospective observational study, we included 55 patients with a macula-off RRD. Best corrected visual acuity (BCVA), color vision (saturated and desaturated color confusion indices (CCI)) and contrast acuity were measured at 12 months postoperatively in both the RRD eye and the fellow control eye, and the 25-item National Eye Institute Visual Function Questionnaire (NEI VFQ-25) was filled out.

Results

Operated and fellow control eyes differed significantly in mean LogMAR BCVA (P<0.0001), median Log contrast acuity (P<0.0001), saturated CCI (P = 0.009), and desaturated CCI (P = 0.016). Significant correlations were observed between the NEI VFQ-25 overall composite score and postoperative LogMAR BCVA (R = −0.551, P<0.0001), contrast acuity (R = 0.472, P<0.0001), saturated CCI (R = −0.315, P = 0.023), and desaturated CCI (R = −0.283, P = 0.044).

Conclusions

Lower VR-QOL was strongly correlated with worse postoperative BCVA and contrast acuity, and to a lesser extent with color vision disturbances.

8.
Objectives

To evaluate the effect of high-quality care on limb function recovery and quality of life (QOL) after osteoporotic hip fracture (OHF) surgery in the elderly.

Methods

116 elderly patients with OHF enrolled in our hospital from January 2017 to December 2019 were assigned to an observation group (high-quality care, n=58) or a control group (routine care, n=58). After one month of intervention, the Harris Hip Score (HHS) and Barthel Index (BI) were used to evaluate limb function and self-care ability, the pain intensity numerical rating scale (PINRS) to assess pain, and the self-rating anxiety scale (SAS) and self-rating depression scale (SDS) to assess emotional state. In addition, postsurgical complications, QOL and patient satisfaction were examined.

Results

HHS and BI were higher in the observation group (P<0.05); PINRS, SAS and SDS scores were lower in the observation group (P<0.05); the incidence of postsurgical complications in the observation group was significantly lower than in the control group (P<0.05); and QOL and patient satisfaction in the observation group were higher than in the control group (P<0.05).

Conclusion

High-quality care promotes the recovery of limb function, QOL and satisfaction in elderly patients after OHF surgery.

9.
Background

Recent studies have shown a significant decline in the final cure rate after miltefosine treatment of visceral leishmaniasis. This study evaluates the efficacy of miltefosine in the treatment of post kala-azar dermal leishmaniasis (PKDL) patients recruited over a period of 5 years, with 18 months of follow-up.

Methodology

In this study 86 confirmed cases of PKDL were treated with two different dosage regimens of miltefosine (Regimen I: 50 mg twice daily for 90 days; Regimen II: 50 mg thrice daily for 60 days) and the clinical outcome was assessed monthly. Cure/relapse was ascertained by clinical and histopathological examination and by measuring parasite burden with quantitative real-time PCR. In vitro susceptibility of parasites to miltefosine was estimated at both promastigote and amastigote stages.

Results

Seventy-three of eighty-six patients completed the treatment and achieved clinical cure. Approximately 4% (3/73) of patients had relapsed by the end of 12 months of follow-up, and a total of 15% (11/73) by the end of 18 months. The relapse rate was significantly higher with Regimen II (31%) than with Regimen I (10.5%) (P<0.005). Parasite load at the pre-treatment stage was significantly higher (P<0.005) in cases that relapsed than in cases that remained cured. In vitro susceptibility to miltefosine of parasites isolated after relapse was significantly lower (>2 fold) than that of the pre-treatment isolates (P<0.005).

Conclusion

The relapse rate in PKDL following miltefosine treatment has increased substantially, indicating the need to introduce alternative drugs or combination therapy with miltefosine.

10.
Objective

To review the evidence from clinical trials of follow up of patients after curative resection for colorectal cancer.

Design

Systematic review and meta-analysis of randomised controlled trials of intensive compared with control follow up.

Results

Five trials, which included 1342 patients, met the inclusion criteria. Intensive follow up was associated with a reduction in all cause mortality (combined risk ratio 0.81, 95% confidence interval 0.70 to 0.94, P=0.007). The effect was most pronounced in the four extramural detection trials that used computed tomography and frequent measurements of serum carcinoembryonic antigen (risk ratio 0.73, 0.60 to 0.89, P=0.002). Intensive follow up was associated with significantly earlier detection of all recurrences (difference in means 8.5 months, 7.6 to 9.4 months, P<0.001) and an increased detection rate for isolated local recurrences (risk ratio 1.61, 1.12 to 2.32, P=0.011).

Conclusions

Intensive follow up after curative resection for colorectal cancer improves survival. Large trials are required to identify which components of intensive follow up are most beneficial.

What is already known on this topic

There is a lack of direct evidence that intensive follow up after initial curative treatment for colorectal cancer leads to increased survival.
Guidelines are inconclusive and clinical practice varies widely.

What this study adds

The cumulative analysis of available data supports the view that intensive follow up after curative resection for colorectal cancer improves survival.
If computed tomography and frequent measurements of serum carcinoembryonic antigen are used during follow up, mortality related to cancer is reduced by 9-13%.
This survival benefit is partly attributable to the earlier detection of all recurrences, particularly the increased detection of isolated recurrent disease.

11.

Background

Podoconiosis is a non-filarial form of elephantiasis resulting in lymphedema of the lower legs. Previous studies have suggested that podoconiosis arises from the interplay of individual and environmental factors. Here, our aim was to understand the individual-level correlates of podoconiosis by comparing 460 podoconiosis-affected individuals and 707 unaffected controls.

Methods/Principal Findings

This was a case-control study carried out in six kebeles (the lowest governmental administrative unit) in northern Ethiopia. Each kebele was classified into one of three endemicity levels: ‘low’ (prevalence <1%), ‘medium’ (1–5%) and ‘high’ (>5%). A total of 142 (30.7%) households had two or more cases of podoconiosis. Compared to controls, cases, especially women, were less educated (OR = 1.7, 95% CI = 1.3–2.2), more likely to be unmarried (OR = 3.4, 95% CI = 2.6–4.6), and had lower income (t = −4.4, p<0.0001). On average, cases started wearing shoes ten years later than controls. Among cases, age at first wearing shoes was positively correlated with age of onset of podoconiosis (r = 0.6, t = 12.5, p<0.0001). Among all study participants, the average duration of shoe wearing was less than 30 years. Among both cases and controls, people in ‘high’ and ‘medium’ endemicity kebeles were less likely than people in ‘low’ endemicity areas to have ever owned shoes (OR = 0.5, 95% CI = 0.4–0.7).

Conclusions

Late use of shoes, usually after the onset of podoconiosis, and inequalities in education, income and marriage were found among cases, particularly among females. There was clustering of cases within households; interventions against podoconiosis will therefore benefit from household-targeted case tracing. Most importantly, we identified a secular increase in shoe-wearing over recent years, which may give opportunities to promote shoe-wearing without increasing stigma among those at high risk of podoconiosis.
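
For readers unfamiliar with the odds ratios quoted above, here is a small Python sketch of a Wald odds ratio with 95% confidence interval from a 2×2 case-control table; the cell counts are hypothetical, not the study's data.

```python
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """OR and Wald 95% CI. a,b = exposed/unexposed cases; c,d = controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for 'ever owned shoes' among cases vs controls.
print(odds_ratio_ci(a=180, b=280, c=330, d=377))
```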

12.

Objective

To evaluate changes in serum neuron specific enolase (NSE) and protein S-100B after carotid endarterectomy performed using either the conventional technique with routine shunting and patch closure or the eversion technique without a shunt.

Materials and Methods

This prospective non-randomized study included 43 patients with severe (>80%) carotid stenosis undergoing carotid endarterectomy under regional anesthesia. Patients were divided into two groups: conventional endarterectomy with routine use of a shunt and Dacron patch (csCEA group) and eversion endarterectomy without a shunt (eCEA group). Protein S-100B and NSE concentrations were measured in peripheral blood before carotid clamping, after declamping, and 24 hours after surgery.

Results

Neurologic examination and brain CT findings on the first postoperative day did not differ from preoperative findings in any patient. In the csCEA group, NSE concentrations decreased after declamping (P<0.01) and 24 hours after surgery (P<0.01), while in the eCEA group NSE values slightly increased (P=ns), accounting for a significant difference between groups on the first postoperative day (P=0.006). In both groups, S-100B concentrations increased significantly after declamping (P<0.05), returning to near pre-clamp values 24 hours after surgery (P=ns). Sub-group analysis revealed a significant decline in serum NSE concentrations after declamping (P<0.05) and 24 hours after surgery (P<0.01) in asymptomatic patients shunted during surgery, while no significant changes were noted in non-shunted patients (P=ns). A decrease in serum NSE levels was also found on the first postoperative day in symptomatic patients operated on with a shunt (P<0.05). A significant increase in serum NSE levels was recorded in non-shunted symptomatic patients 24 hours after surgery (P<0.05).

Conclusion

Variations in NSE concentrations appeared to be influenced by alterations in cerebral perfusion, while protein S-100B values were unaffected by shunting strategy. Routine shunting during surgery for symptomatic carotid stenosis may prevent a postoperative increase in serum NSE, a potential marker of brain injury.

13.

Background

Circulating miRNAs are emerging as promising blood-based biomarkers for colorectal and other human cancers; however, technical factors that confound the development of these assays remain poorly understood and present a clinical challenge. The aim of this study was to systematically evaluate the effects of factors that may interfere with the accurate measurement of circulating miRNAs for clinical purposes.

Methods

Blood samples were collected from 53 subjects, including routinely drawn serum samples, matched plasma from 30 subjects, and matched serum samples drawn before and after bowel preparation for colonoscopy from 29 subjects. Additionally, 38 serum specimens stored in the clinical laboratory for seven days were used to test the stability of miRNAs. Hemolysis controls with serial dilutions of hemoglobin were prepared. RNA was extracted from serum, plasma or hemolyzed controls with spiked-in cel-miR-39, and levels of miR-21, miR-29a, miR-125b and miR-16 were examined by real-time RT-PCR. Hemolysis was measured by spectrophotometry.

Results

The expression levels of miR-16 and the degree of hemolysis were significantly higher in plasma than in serum (P<0.0001). Measured miR-21, miR-29a, miR-125b and miR-16 expression increased with hemoglobin levels in hemolyzed controls. The degree of hemolysis in serum samples correlated significantly with the levels of miR-21 (P<0.0001), miR-29a (P = 0.0002), miR-125b (P<0.0001) and miR-16 (P<0.0001). All four miRNAs showed significantly lower levels in sera that had been stored at 4°C for seven days (P<0.0001). Levels of miR-21 (P<0.0001), miR-29a (P<0.0001) and miR-16 (P = 0.0003), and the degree of hemolysis (P = 0.0002) were significantly higher in sera drawn after vs. before bowel preparation.

Conclusions

Measured miRNA levels in serum and plasma from the same patients varied in the presence of hemolysis. Because hemolysis, storage conditions, and bowel preparation all affected measured miRNA levels, it is important to account for these confounders when developing miRNA-based diagnostic assays.
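
As a concrete illustration of the spike-in design described in the Methods, here is a minimal sketch of delta-Ct normalization against cel-miR-39; the Ct values are hypothetical.

```python
# Relative quantification of a serum miRNA against the spiked-in
# cel-miR-39 control (delta-Ct method). Ct values are hypothetical.

def relative_quantity(ct_target: float, ct_spike: float) -> float:
    """2^-(Ct_target - Ct_spike): expression relative to cel-miR-39."""
    return 2.0 ** -(ct_target - ct_spike)

ct_mir21, ct_celmir39 = 28.4, 22.1  # assumed Ct values for one sample
print(f"miR-21 relative to cel-miR-39: {relative_quantity(ct_mir21, ct_celmir39):.4f}")
```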

14.

Purpose

This study aimed to evaluate the Hangzhou criteria (HC) for patients with hepatocellular carcinoma (HCC) undergoing surgical resection and to determine whether this staging system is superior to other staging systems in predicting the survival of resectable HCC.

Method

774 HCC patients who underwent surgical resection between 2007 and 2009 at West China Hospital were enrolled retrospectively. Predictors of survival were identified using the Kaplan–Meier method and the Cox model. The disease state was staged by the HC, as well as by the TNM and BCLC staging systems. Prognostic power was quantified using a linear trend χ2 test, the c-index, and the likelihood ratio (LHR) χ2 test, with Cox regression models compared using the Akaike information criterion (AIC).

Results

Serum AFP level (P = 0.02), tumor size (P<0.001), tumor number (P<0.001), portal vein invasion (P<0.001), hepatic vein invasion (P<0.001), tumor differentiation (P<0.001), distant organ metastasis (P = 0.016) and lymph node metastasis (P<0.001) were identified by multivariate analysis as independent risk factors for survival after resection. Comparison of the staging systems showed that BCLC had the best homogeneity (likelihood ratio χ2 test 151.119, P<0.001) and the TNM system the best monotonicity of gradients (linear trend χ2 test 137.523, P<0.001); discriminatory ability was highest for the BCLC (AUC for 1-year mortality, 0.759) and TNM staging systems (AUCs for 3- and 5-year mortality, 0.738 and 0.731, respectively). However, based on the c-index and AIC, the HC was the most informative staging system in predicting survival (c-index 0.6866, AIC 5924.4729).

Conclusions

The HC provide important prognostic information after surgery and were shown to be a promising survival predictor in a Chinese cohort of patients with resectable HCC.
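
The c-index used above to compare staging systems measures how often, among usable patient pairs, the patient judged higher risk by the staging system dies earlier. A minimal, unoptimized Python sketch on toy data follows (ties and censoring are handled in the simplest way; the HC value of 0.6866 is from the study itself).

```python
import itertools

def c_index(times, events, risk_scores):
    """Fraction of usable pairs where the higher-risk patient fails first."""
    concordant, usable = 0.0, 0
    for (t1, e1, r1), (t2, e2, r2) in itertools.combinations(
            zip(times, events, risk_scores), 2):
        # A pair is usable only if the shorter follow-up ended in an event.
        if t1 == t2 or (t1 < t2 and not e1) or (t2 < t1 and not e2):
            continue
        usable += 1
        earlier_risk, later_risk = (r1, r2) if t1 < t2 else (r2, r1)
        if earlier_risk > later_risk:
            concordant += 1.0
        elif earlier_risk == later_risk:
            concordant += 0.5
    return concordant / usable

# Toy data: months of follow-up, death indicator, staging-derived risk score.
print(c_index(times=[5, 8, 12, 20], events=[1, 1, 0, 1],
              risk_scores=[3.1, 2.0, 1.5, 0.7]))  # -> 1.0 (fully concordant)
```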

15.
Maternal diabetes in pregnancy affects offspring health, but the impact of parental diabetes more broadly is unclear. We investigated the impact of parental diabetes on the metabolic health of adult offspring who did not themselves have diabetes. Data from the Generation Scotland: Scottish Family Health Study, a population-based family cohort, were record-linked to subjects' own diabetes medical records. From F0-parents, we identified F1-offspring of: mothers with diabetes (OMD, n = 409), fathers with diabetes (OFD, n = 468), and no parent with diabetes (ONoPD, n = 2489). Metabolic syndrome, body and biochemical measurements, and blood pressures were compared between F1-offspring groups by sex. A higher proportion of female OMD had metabolic syndrome than female OFD or ONoPD (P<0.0001). In female offspring, predictors of metabolic syndrome were: having a mother with diabetes (OR = 1.78, CI 1.03–3.07 [reference ONoPD]), body mass index (BMI, OR = 1.21, CI 1.13–1.30) and age (OR = 1.03, CI 1.01–1.06). In male offspring, predictors of metabolic syndrome were: BMI (OR = 1.18, CI 1.09–1.29) and percent body fat (OR = 1.12, CI 1.05–1.19). In both sexes, OMD had higher blood pressures than OFD (P<0.0001). In females, OMD had higher glucose (P<0.0001) and percent body fat (P<0.0001) than OFD or ONoPD. OMD and OFD both had increased waist measurements (P<0.0001), BMI (P<0.0001) and percent body fat (P<0.0001) compared with ONoPD. Female OMD and OFD had lower HDL-cholesterol levels (P<0.0001) than female ONoPD. Parental diabetes is associated with higher offspring BMI and body fat. In female offspring, maternal diabetes increased the odds of metabolic syndrome, even after adjusting for BMI. Further investigations are required to determine the mechanisms involved.

16.

Introduction and Aim

The association between thyroid dysfunction and mortality is controversial, and the impact of the duration of thyroid dysfunction remains unclarified. Our aim was to investigate the correlation between biochemically assessed thyroid function, as well as the duration of dysfunction, and mortality.

Methods

This register-based follow-up study comprised 239,768 individuals with a serum TSH measurement from hospitals and/or general practice in Funen, Denmark. Measurements were performed at a single laboratory from January 1st 1995 to January 1st 2011. Cox regression was used for mortality analyses, and the Charlson Comorbidity Index (CCI) was used as the comorbidity score.

Results

Hazard ratios (HR) with 95% confidence intervals (CI) for mortality with decreased (<0.3 mIU/L) or elevated (>4.0 mIU/L) levels of TSH were 2.22 (2.14–2.30; P<0.0001) and 1.28 (1.22–1.35; P<0.0001), respectively. Adjusting for age, gender, CCI and diagnostic setting attenuated the risk estimates for decreased and elevated TSH, respectively (HR 1.23; 95% CI: 1.19–1.28; P<0.0001, mean follow-up 7.7 years; and HR 1.07; 95% CI: 1.02–1.13; P = 0.004, mean follow-up 7.2 years). Mortality risk increased by a factor of 1.09 (95% CI: 1.08–1.10; P<0.0001) or 1.03 (95% CI: 1.02–1.04; P<0.0001) for each six months a patient suffered from decreased or elevated TSH, respectively. Subdividing according to the degree of thyroid dysfunction, overt hyperthyroidism (HR 1.12; 95% CI: 1.06–1.19; P<0.0001), subclinical hyperthyroidism (HR 1.09; 95% CI: 1.02–1.17; P = 0.02) and overt hypothyroidism (HR 1.57; 95% CI: 1.34–1.83; P<0.0001), but not subclinical hypothyroidism (HR 1.03; 95% CI: 0.97–1.09; P = 0.4), were associated with increased mortality.

Conclusions and Relevance

In a large-scale, population-based cohort with long-term follow-up (median 7.4 years), overt and subclinical hyperthyroidism and overt but not subclinical hypothyroidism were associated with increased mortality. Excess mortality with increasing duration of decreased or elevated serum TSH suggests the importance of timely intervention in individuals with thyroid dysfunction.
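
The duration effect above (a factor of 1.09 per six months of decreased TSH) compounds multiplicatively under the usual log-linear Cox assumption. A short sketch of how that per-unit hazard ratio scales with duration:

```python
# Compounding a per-six-month hazard ratio over longer durations,
# assuming (as the Cox model does) a log-linear duration effect.

def cumulative_hr(hr_per_unit: float, months: float, unit_months: float = 6.0) -> float:
    """Hazard ratio accumulated over `months`, given a HR per `unit_months`."""
    return hr_per_unit ** (months / unit_months)

for m in (6, 12, 24, 36):
    print(f"{m:>2} months of decreased TSH: HR = {cumulative_hr(1.09, m):.2f}")
```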

17.

Purpose

To evaluate corneal reinnervation and the corresponding corneal sensitivity and keratocyte density after small incision lenticule extraction (SMILE) and femtosecond laser in situ keratomileusis (FS-LASIK).

Methods

In this prospective, non-randomized observational study, 18 patients (32 eyes) received SMILE surgery, and 22 patients (42 eyes) received FS-LASIK surgery to correct myopia. The corneal subbasal nerve density and microscopic morphological changes in corneal architecture were evaluated by confocal microscopy prior to surgery and at 1 week, 1 month, 3 months, and 6 months after surgery. A correlation analysis was performed between subbasal corneal nerve density and the corresponding keratocyte density and corneal sensitivity.

Results

The decrease in subbasal nerve density was less severe in SMILE-treated eyes than in FS-LASIK-treated eyes at 1 week (P = 0.0147), 1 month (P = 0.0243), and 3 months (P = 0.0498), but no difference was detected at the 6-month visit (P = 0.5277). The subbasal nerve density correlated positively with central corneal sensitivity in both groups (r = 0.416, P<0.0001 and r = 0.2567, P = 0.0038 for the SMILE and FS-LASIK groups, respectively). SMILE-treated eyes had a lower risk of developing peripheral empty spaces filled in by epithelial cells (P = 0.0005).

Conclusions

The decrease in subbasal nerve fiber density was less severe in the SMILE group than in the FS-LASIK group during the first 3 months after surgery. Subbasal nerve density correlated with central corneal sensitivity.

18.

Background

An up-to-date and reliable map of podoconiosis is needed to design geographically targeted and cost-effective interventions in Ethiopia. Identifying the ecological correlates of the distribution of podoconiosis is the first step towards producing such distribution and risk maps. The objective of this study was to investigate the spatial distribution and ecological correlates of podoconiosis using historical and contemporary survey data.

Methods

Data on the observed prevalence of podoconiosis were abstracted from published and unpublished literature into a standardized database, according to strict inclusion and exclusion criteria. In total, 10 studies conducted between 1969 and 2012 were included, and data were available for 401,674 individuals older than 15 years of age from 229 locations. A range of high resolution environmental factors were investigated to determine their association with podoconiosis prevalence, using logistic regression.

Results

The prevalence of podoconiosis in Ethiopia was estimated at 3.4% (95% CI 3.3%–3.4%), with marked regional variation. We identified significant associations between high prevalence of podoconiosis and mean annual Land Surface Temperature (LST), mean annual precipitation, topography, and fine soil texture. The derived maps indicate both widespread occurrence of podoconiosis and marked variability in its prevalence, with prevalence typically highest at altitudes >1500 m above sea level (masl), with >1500 mm annual rainfall and mean annual LST of 19–21°C. No (or very little) podoconiosis occurred at altitudes <1225 masl, with annual rainfall <900 mm, and mean annual LST >24°C.

Conclusion

Podoconiosis remains a public health problem in Ethiopia over considerable areas of the country, but exhibits marked geographical variation associated in part with key environmental factors. This is work in progress and the results presented here will be refined in future work.
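
Below is a sketch of the kind of site-level logistic regression (binomial GLM) analysis described in the Methods, using simulated data; all covariate names, effect sizes and counts are assumptions for illustration only.

```python
import numpy as np
import statsmodels.api as sm

# Simulated site-level data: people examined and podoconiosis cases per
# survey location, with two illustrative environmental covariates.
rng = np.random.default_rng(0)
n_sites = 229
lst = rng.uniform(17, 26, n_sites)           # mean annual LST, deg C (assumed)
rain = rng.uniform(600, 2000, n_sites)       # annual precipitation, mm (assumed)
examined = rng.integers(200, 3000, n_sites)  # people examined per site
true_logit = -4 + 0.002 * rain - 0.15 * lst  # assumed true relationship
cases = rng.binomial(examined, 1 / (1 + np.exp(-true_logit)))

# Binomial GLM with a (successes, failures) outcome, as in aggregated surveys.
X = sm.add_constant(np.column_stack([lst, rain]))
y = np.column_stack([cases, examined - cases])
fit = sm.GLM(y, X, family=sm.families.Binomial()).fit()
print(fit.params)  # intercept, LST and rainfall coefficients (log-odds scale)
```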

19.

Background

Several markers have been proposed to predict the outcome of chronic lymphocytic leukemia (CLL) patients. However, discordances exist between the current prognostic factors, indicating that none of them is perfect.

Methodology/Principal Findings

Here, we compared the prognostic power of new RNA-based markers in order to construct a quantitative PCR (qPCR) score composed of the most powerful factors. ZAP70, LPL, CLLU1, microRNA-29c and microRNA-223 were measured by real-time PCR in a cohort of 170 patients with a median follow-up of 64 months (range 3-330). For each patient, cells were obtained at diagnosis and RNA was extracted from purified CD19 cells. The best markers were included in a qPCR score, which was then compared to each individual factor. Statistical analysis showed that all five RNA-based markers could predict treatment-free survival (TFS), but only ZAP70, LPL and microRNA-29c could significantly predict overall survival (OS). These three markers were thus included in a simple qPCR score that was able to significantly predict TFS and OS by dividing patients into three groups (0/3, 1-2/3 and 3/3). Median TFS was >210, 61 and 24 months (P<0.0001) and median OS was >330, 242 and 137 months (P<0.0001), respectively. Interestingly, the TFS results were also confirmed in Binet stage A patients (P<0.0001). Compared to other classical factors, this score displays the highest univariate Cox hazard ratio (TFS: HR = 9.45; OS: HR = 13.88) and also provides additional prognostic information.

Conclusions

In our hands, this score is the most powerful tool for CLL risk stratification at the time of diagnosis.
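
A minimal sketch of the score construction described above: one point per unfavourable marker (ZAP70, LPL, microRNA-29c), stratifying patients into the 0/3, 1-2/3 and 3/3 groups. The binary marker calls stand in for the study's qPCR-derived cut-offs, which are not given in the abstract.

```python
# Hypothetical stand-in for the three-marker qPCR score: the study's
# actual expression cut-offs are not reported in the abstract.

def qpcr_score_group(zap70_pos: bool, lpl_pos: bool, mir29c_unfav: bool) -> str:
    """Risk group from three binary marker calls (0/3, 1-2/3, 3/3)."""
    score = sum([zap70_pos, lpl_pos, mir29c_unfav])
    if score == 0:
        return "0/3 (best prognosis)"
    if score <= 2:
        return "1-2/3 (intermediate)"
    return "3/3 (worst prognosis)"

print(qpcr_score_group(True, False, True))  # -> "1-2/3 (intermediate)"
```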

20.

Background

FoxM1 has been reported to be important in the initiation and progression of various tumors. However, whether FoxM1 has prognostic value in non-small cell lung cancer (NSCLC) patients remains unclear.

Methodology/Principal Findings

In this study, FoxM1 expression in tumor cells was first examined by immunohistochemistry in 175 NSCLC specimens. FoxM1 overexpression was significantly associated with positive smoking status (P = 0.001), poorer tissue differentiation (P = 0.0052), higher TNM stage (P<0.0001), lymph node metastasis (P<0.0001), advanced tumor stage (P<0.0001), and poorer prognosis (P<0.0001). Multivariable analysis showed that FoxM1 expression increased the hazard of death (hazard ratio, 1.899; 95% CI, 1.016–3.551). Furthermore, in various in vitro and in vivo experiments, we showed that targeted knockdown of FoxM1 expression could inhibit the migratory and invasive abilities of NSCLC cells, whereas enforced expression of FoxM1 could increase the invasion and migration of NSCLC cells. Finally, we found that one cellular mechanism by which FoxM1 promotes tumor metastasis is induction of the epithelial-mesenchymal transition (EMT) program.

Conclusions

These results suggest that FoxM1 overexpression in tumor tissue is significantly associated with poor prognosis in NSCLC patients through the promotion of tumor metastasis.
