Similar Articles
 20 similar articles were retrieved (search time: 31 ms)
1.
Shimmering is a collective defence behaviour in Giant honeybees (Apis dorsata) whereby individual bees flip their abdomen upwards, producing Mexican wave-like patterns on the nest surface. Bucket bridging has been used to explain the spread of information in a chain of members including three testable concepts: first, linearity assumes that individual "agent bees" that participate in the wave will be affected preferentially from the side of wave origin. The directed-trigger hypothesis addresses the coincidence of the individual property of trigger direction with the collective property of wave direction. Second, continuity describes the transfer of information without being stopped, delayed or re-routed. The active-neighbours hypothesis assumes coincidence between the direction of the majority of shimmering-active neighbours and the trigger direction of the agents. Third, the graduality hypothesis refers to the interaction between an agent and her active neighbours, assuming a proportional relationship in the strength of abdomen flipping of the agent and her previously active neighbours. Shimmering waves provoked by dummy wasps were recorded with high-resolution video cameras. Individual bees were identified by 3D-image analysis, and their strength of abdominal flipping was assessed by pixel-based luminance changes in sequential frames. For each agent, the directedness of wave propagation was based on wave direction, trigger direction, and the direction of the majority of shimmering-active neighbours. The data supported the bucket bridging hypothesis, but only for a small proportion of agents: linearity was confirmed for 2.5%, continuity for 11.3% and graduality for 0.4% of surface bees (but in 2.6% of those agents with high wave-strength levels). The complementary part of 90% of surface bees did not conform to bucket bridging. This fuzziness is discussed in terms of self-organisation and evolutionary adaptedness in Giant honeybee colonies to respond to rapidly changing threats such as predatory wasps scanning in front of the nest.
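The strength measure described above (pixel-based luminance change between sequential frames within each bee's image region) can be sketched in a few lines of Python. This is only an illustrative reconstruction of that kind of measure, not the study's actual pipeline; the frame arrays, the region-of-interest coordinates and the simulated brightening are invented for the example.

    import numpy as np

    def flipping_strength(frame_prev, frame_next, roi):
        """Mean absolute luminance change inside one bee's region of interest (ROI).

        frame_prev, frame_next: 2-D arrays of grayscale pixel values (sequential frames).
        roi: (row_start, row_stop, col_start, col_stop) of one identified bee.
        """
        r0, r1, c0, c1 = roi
        diff = np.abs(frame_next[r0:r1, c0:c1].astype(float) -
                      frame_prev[r0:r1, c0:c1].astype(float))
        return diff.mean()

    # Illustrative use: two synthetic 480x640 frames and one hypothetical ROI in which
    # a simulated abdomen flip brightens the pixels.
    rng = np.random.default_rng(0)
    frame1 = rng.integers(0, 216, (480, 640))
    frame2 = frame1.copy()
    frame2[100:140, 200:240] += 40
    print(flipping_strength(frame1, frame2, (100, 140, 200, 240)))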

2.
3.

Background

IPT with or without concomitant administration of ART is a proven intervention to prevent tuberculosis among PLHIV. However, there are few data on the routine implementation of this intervention and its effectiveness in settings with limited resources.

Objectives

To measure the level of uptake and effectiveness of IPT in reducing tuberculosis incidence in a cohort of PLHIV enrolled into HIV care between 2007 and 2010 in five hospitals in southern Ethiopia.

Methods

A retrospective cohort analysis of electronic patient database was done. The independent effects of no intervention, “IPT-only,” “IPT-before-ART,” “IPT-and-ART started simultaneously,” “ART-only,” and “IPT-after-ART” on TB incidence were measured. Cox-proportional hazards regression was used to assess association of treatment categories with TB incidence.
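As a rough sketch of the kind of Cox proportional-hazards model named here, the snippet below fits lifelines' CoxPHFitter to its bundled example dataset as a stand-in; in the study each row would instead carry the patient's follow-up time, a TB-event indicator and dummy variables for the treatment categories (the column names here are not the study's).

    from lifelines import CoxPHFitter
    from lifelines.datasets import load_rossi  # bundled example data used as a stand-in

    # In the study's setting: one row per patient, with follow-up time, a TB-event flag,
    # and dummy-coded treatment categories ("IPT-only", "ART-only", "IPT-before-ART", ...)
    # with "no intervention" as the reference level.
    df = load_rossi()  # columns: 'week' (time), 'arrest' (event), plus covariates

    cph = CoxPHFitter()
    cph.fit(df, duration_col="week", event_col="arrest")
    cph.print_summary()  # the exp(coef) column corresponds to adjusted hazard ratios (aHR)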

Results

Of 7,097 patients, 867 were excluded because they were transferred-in; a further 823 (12%) were excluded from the study because they were either identified to have TB through screening (292 patients) or were already on TB treatment (531). Among the remaining 5,407 patients observed, IPT had been initiated for 39% of eligible patients. Children, males, patients with advanced disease, and those in Pre-ART were less likely to be initiated on IPT. The overall TB incidence was 2.6 per 100 person-years. Compared to those with no intervention, use of “IPT-only” (aHR = 0.36, 95% CI = 0.19–0.66) and “ART-only” (aHR = 0.32, 95% CI = 0.24–0.43) was associated with a significant reduction in the TB incidence rate. Combining ART and IPT had a more profound effect: starting IPT-before-ART (aHR = 0.18, 95% CI = 0.08–0.42) or simultaneously with ART (aHR = 0.20, 95% CI = 0.10–0.42) provided a further reduction in TB of ∼80%.

Conclusions

IPT was found to be effective in reducing TB incidence, independently and with concomitant ART, under programme conditions in resource-limited settings. The level of IPT provision and effectiveness in reducing TB was encouraging in the study setting. Scaling up and strengthening IPT services in addition to ART can have a beneficial effect in reducing the TB burden among PLHIV in settings with a high TB/HIV burden.

4.

Objective

To examine whether psychosocial factors mediate (explain) the association between socioeconomic position and takeaway food consumption.

Design

A cross-sectional postal survey conducted in 2009.

Setting

Participants reported their usual consumption of 22 takeaway food items, and these were grouped into a “healthy” and “less healthy” index based on each item's nutritional properties. Principal Components Analysis was used to derive three psychosocial scales that measured beliefs about the relationship between diet and health (α = 0.73), and perceptions about the value (α = 0.79) and pleasure (α = 0.61) of takeaway food. A nutrition knowledge index was also used. Socioeconomic position was measured by highest attained education level.
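A brief sketch of the two scale-construction steps mentioned here, principal components analysis and a Cronbach's alpha check of internal consistency, is given below; the item matrix is random placeholder data and the helper function is ours, not part of the survey instrument.

    import numpy as np
    from sklearn.decomposition import PCA

    def cronbach_alpha(items):
        """Cronbach's alpha for one scale; rows = respondents, columns = scale items."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1).sum()
        total_variance = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_variances / total_variance)

    rng = np.random.default_rng(1)
    responses = rng.integers(1, 6, size=(903, 12)).astype(float)  # 903 respondents, 12 Likert items

    pca = PCA(n_components=3)                # three psychosocial components, as in the study
    component_scores = pca.fit_transform(responses)
    print(pca.explained_variance_ratio_)
    print(cronbach_alpha(responses[:, :4]))  # alpha for the items loading on one scale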

Subjects

Randomly selected adults (n = 1,500) aged 25–64 years in Brisbane, Australia (response rate = 63.7%, N = 903).

Results

Compared with those with a bachelor's degree or higher, participants with a diploma level of education were more likely to consume “healthy” takeaway food (p = 0.023) whereas the least educated (high school only) were more likely to consume “less healthy” choices (p = 0.002). The least educated were less likely to believe in a relationship between diet and health (p<0.001), and more likely to have lower nutritional knowledge compared with their highly educated counterparts (p<0.001). Education differences in beliefs about the relationship between diet and health partly and significantly mediated the association between education and “healthy” takeaway food consumption. Diet- and health-related beliefs and nutritional knowledge partly and significantly mediated the education differences in “less healthy” takeaway food consumption.

Conclusions

Interventions that target beliefs about the relationship between diet and health, and nutritional knowledge may reduce socioeconomic differences in takeaway food consumption, particularly for “less healthy” options.

5.

Objective

To evaluate the quality of clinical practice guidelines (CPGs) for otorhinolaryngology in China.

Materials and Methods

A systematic search of relevant literature databases (CBM, WANFANG, VIP, CNKI, China Guideline Clearinghouse) was undertaken to identify and select CPGs related to otorhinolaryngology published between 1978 and March 2012. Four independent reviewers assessed the eligible guidelines using the Appraisal of Guidelines for Research and Evaluation (AGREE II) instrument. Their degree of agreement was evaluated using the intraclass correlation coefficient (ICC).
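AGREE II domain results are conventionally reported as scaled percentages of the maximum possible score; a small sketch of that standard calculation follows (the appraiser ratings are invented; each item is scored 1-7 by each reviewer).

    def agree_domain_score(ratings):
        """Scaled AGREE II domain score (0-100).

        ratings: one list per appraiser, each containing that appraiser's 1-7 item scores
        for the domain.
        """
        n_appraisers = len(ratings)
        n_items = len(ratings[0])
        obtained = sum(sum(r) for r in ratings)
        max_possible = 7 * n_items * n_appraisers
        min_possible = 1 * n_items * n_appraisers
        return 100 * (obtained - min_possible) / (max_possible - min_possible)

    # Four appraisers rating a three-item domain (hypothetical scores).
    print(agree_domain_score([[5, 4, 6], [4, 4, 5], [6, 5, 5], [3, 4, 4]]))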

Result

From 170 citations, 21 relevant guidelines were included. The overall agreement among reviewers was moderate (ICC = 0.87; 95% confidence interval [CI], 0.78–0.91). The scores for each of the AGREE domains were the following: “scope and purpose” (mean ± standard error [SE] = 45.4±4.4; ICC = 0.92), “stakeholder involvement” (mean ± SE = 30.4±3.1; ICC = 0.81), “rigor of development” (mean ± SE = 20.9±2.8; ICC = 0.87), “clarity of presentation” (mean ± SE = 48.8±3.7; ICC = 0.80), “applicability” (mean ± SE = 12.6±1.7; ICC = 0.72), and “editorial independence” (mean ± SE = 6.2±0.8; ICC = 0.76). Three guidelines (14%) mentioned updates, and the average update frequency was 7 years. None used the GRADE system.

Conclusion

The quality of otorhinolaryngology guidelines in China is low. Greater efforts are needed to provide high-quality guidelines that serve as a useful and reliable tool for clinical decision-making in this field.

6.

Introduction

Dietary patterns are culturally specific and there is limited data on the association of dietary patterns with late-life depression in Chinese. This study examined the associations between dietary patterns and baseline and subsequent depressive symptoms in community-dwelling Chinese older people in Hong Kong.

Methods

Participants aged ≥65 years participating in a cohort study examining the risk factors for osteoporosis completed a validated food frequency questionnaire at baseline between 2001 and 2003. Factor analysis was used to identify three dietary patterns: “vegetables-fruits” pattern, “snacks-drinks-milk products” pattern, and “meat-fish” pattern. Depressive symptoms were measured at baseline and at 4 years using the validated Geriatric Depression Scale. Multiple logistic regression was used for cross-sectional analysis (n = 2,902) to assess the associations between dietary patterns and the presence of depressive symptoms, and for longitudinal analysis (n = 2,211) of their associations with depressive symptoms at 4 years, with adjustment for socio-demographic and lifestyle factors.
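A compact sketch of this kind of quartile-based logistic model, using pandas and statsmodels on simulated data, is shown below; the variable names and the single age covariate are illustrative only, whereas the real analysis adjusted for a fuller set of socio-demographic and lifestyle factors.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    df = pd.DataFrame({
        "pattern_score": rng.normal(size=2902),      # e.g. "vegetables-fruits" pattern score
        "depressed": rng.integers(0, 2, size=2902),  # GDS-defined depressive symptoms (0/1)
        "age": rng.integers(65, 95, size=2902),
    })
    df["quartile"] = pd.qcut(df["pattern_score"], 4, labels=["Q1", "Q2", "Q3", "Q4"])

    # Odds of depressive symptoms in Q2-Q4 relative to the lowest quartile, adjusted for age.
    model = smf.logit("depressed ~ C(quartile, Treatment('Q1')) + age", data=df).fit()
    print(np.exp(model.params))  # exponentiated coefficients = adjusted odds ratios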

Results

The highest quartile of “vegetables-fruits” pattern score was associated with reduced likelihood of depressive symptoms [Adjusted OR = 0.55 (95% CI: 0.36–0.83), p for trend = 0.017] compared to the lowest quartile at baseline. A similar inverse trend was observed for the highest quartile of “snacks-drinks-milk products” pattern score [Adjusted OR = 0.41 (95% CI: 0.26–0.65), p for trend <0.001] compared to the lowest quartile. There was no association of the “meat-fish” pattern with the presence of depressive symptoms at baseline. None of the dietary patterns were associated with subsequent depressive symptoms at 4 years.

Conclusion

Higher “vegetables-fruits” and “snacks-drinks-milk products” pattern scores were associated with reduced likelihood of baseline depressive symptoms in Chinese older people in Hong Kong. The longitudinal analyses failed to show any causal relationship between dietary patterns and depressive symptoms in this population.

7.
This study investigated the relationship between level of stress in middle and high school students aged 12–18 and risk of atopic dermatitis. Data from the Sixth Korea Youth Risk Behavior Web-based Survey (KYRBWS-VI), a cross-sectional study among 74,980 students in 800 middle schools and high schools with a response rate of 97.7%, were analyzed. Ordinal logistic regression analyses were conducted to determine the relationship between stress and atopic dermatitis severity. A total of 5,550 boys and 6,964 girls reported having been diagnosed with atopic dermatitis. Younger students were more likely to have atopic dermatitis. Interestingly, the educational level of parents was found to be associated with having atopic dermatitis and with having a more severe condition. In particular, girls whose mothers had at least a college education had a 41% higher risk of having atopic dermatitis and a severe atopic condition (odds ratio (OR) = 1.41, 95% CI, 1.22–1.63; P<0.0001) compared with those whose mothers had attended middle school at most. A similar trend was observed among both boys and girls for their father’s education level. The stress level was found to be significantly associated with the risk of atopic dermatitis. Compared to boys who reported “no stress”, boys with “very high” stress had a 46% higher risk of having more severe atopic dermatitis (OR = 1.46, 95% CI, 1.20–1.78; P<0.0001), a 44% higher risk (OR = 1.44, 95% CI, 1.19–1.73; P<0.0001) with “high” stress, and a 21% higher risk (OR = 1.21, 95% CI, 1.00–1.45; P = 0.05) with “moderate” stress. In contrast, we found no statistically significant relationship between stress and atopic dermatitis in girls. This study suggests that stress and parents’ education level were associated with atopic dermatitis. Specifically, the degree of stress is positively correlated with the likelihood of being diagnosed with this condition and with its severity.
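The analysis described above is an ordinal (proportional-odds) logistic regression; a minimal sketch with statsmodels' OrderedModel on simulated data follows. The coding of stress and severity below is assumed for illustration and is not taken from the KYRBWS-VI questionnaire.

    import numpy as np
    import pandas as pd
    from statsmodels.miscmodels.ordinal_model import OrderedModel

    rng = np.random.default_rng(3)
    n = 1000
    stress = rng.integers(0, 5, size=n)                              # 0 = "no stress" ... 4 = "very high"
    severity = np.clip(stress + rng.integers(-1, 2, size=n), 0, 3)   # 0 = none ... 3 = severe dermatitis

    y = pd.Categorical(severity, categories=[0, 1, 2, 3], ordered=True)
    X = pd.DataFrame({"stress": stress})

    result = OrderedModel(y, X, distr="logit").fit(method="bfgs", disp=False)
    print(np.exp(result.params["stress"]))  # odds ratio per one-step increase in stress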

8.

Background

Identifying individuals at increased risk for melanoma could potentially improve public health through targeted surveillance and early detection. Studies have separately demonstrated significant associations between melanoma risk, melanocortin receptor (MC1R) polymorphisms, and indoor ultraviolet light (UV) exposure. Existing melanoma risk prediction models do not include these factors; therefore, we investigated their potential to improve the performance of a risk model.

Methods

Using 875 melanoma cases and 765 controls from the population-based Minnesota Skin Health Study we compared the predictive ability of a clinical melanoma risk model (Model A) to an enhanced model (Model F) using receiver operating characteristic (ROC) curves. Model A used self-reported conventional risk factors including mole phenotype categorized as “none”, “few”, “some” or “many” moles. Model F added MC1R genotype and measures of indoor and outdoor UV exposure to Model A. We also assessed the predictive ability of these models in subgroups stratified by mole phenotype (e.g. nevus-resistant (“none” and “few” moles) and nevus-prone (“some” and “many” moles)).
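A small sketch of how the AUCs of a baseline and an enhanced risk model can be compared with scikit-learn is given below; the random feature matrices merely stand in for the conventional risk factors (Model A) and the added MC1R genotype and UV-exposure measures (Model F).

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(4)
    n = 1640                            # roughly 875 cases + 765 controls
    clinical = rng.normal(size=(n, 4))  # conventional risk factors (Model A)
    added = rng.normal(size=(n, 3))     # MC1R genotype + UV exposure (added in Model F)
    y = rng.integers(0, 2, size=n)      # case/control status

    Xa_tr, Xa_te, Xf_tr, Xf_te, y_tr, y_te = train_test_split(
        clinical, np.hstack([clinical, added]), y, test_size=0.3, random_state=0)

    model_a = LogisticRegression(max_iter=1000).fit(Xa_tr, y_tr)
    model_f = LogisticRegression(max_iter=1000).fit(Xf_tr, y_tr)
    print("Model A AUC:", roc_auc_score(y_te, model_a.predict_proba(Xa_te)[:, 1]))
    print("Model F AUC:", roc_auc_score(y_te, model_f.predict_proba(Xf_te)[:, 1]))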

Results

Model A (the reference model) yielded an area under the ROC curve (AUC) of 0.72 (95% CI = 0.69–0.74). Model F improved on this, with an AUC of 0.74 (95% CI = 0.71–0.76, p<0.01). We also observed substantial variations in the AUCs of Models A and F when examined in the nevus-prone and nevus-resistant subgroups.

Conclusions

These results demonstrate that adding genotypic information and environmental exposure data can increase the predictive ability of a clinical melanoma risk model, especially among nevus-prone individuals.

9.

Background

Dying at home and dying at the preferred place of death are advocated as desirable outcomes of palliative care. More insight is needed into their usefulness as quality indicators. Our objective is to describe whether “the percentage of patients dying at home” and “the percentage of patients who died in their place of preference” are feasible and informative quality indicators.

Methods and Findings

A mortality follow-back study was conducted, based on data recorded by representative GP networks regarding home-dwelling patients who died non-suddenly in Belgium (n = 1036), the Netherlands (n = 512), Italy (n = 1639) or Spain (n = 565). “The percentage of patients dying at home” ranged between 35.3% (Belgium) and 50.6% (the Netherlands) in the four countries, while “the percentage of patients dying at their preferred place of death” ranged between 67.8% (Italy) and 86.0% (Spain). Both indicators were strongly associated with palliative care provision by the GP (odds ratios of 1.55–13.23 and 2.30–6.63, respectively). The quality indicator concerning the preferred place of death offers a broader view than the indicator concerning home deaths, as it takes into account all preferences met in all locations. However, GPs did not know the preference for place of death in 39.6% (the Netherlands) to 70.3% (Italy) of cases, whereas the actual place of death was known in almost all cases.

Conclusion

GPs know their patients’ actual place of death, making the percentage of home deaths a feasible indicator for collection by GPs. However, patients’ preferred place of death was often unknown to the GP. We therefore recommend using information from relatives as long as information from GPs on the preferred place of death is lacking. Timely communication about the place where patients want to be cared for at the end of life remains a challenge for GPs.

10.

Background

Learning followed by a period of sleep, even as little as a nap, promotes memory consolidation. It is now generally recognized that sleep facilitates the stabilization of information acquired prior to sleep. However, the temporal nature of the effect of sleep on retention of declarative memory is yet to be understood. We examined the impact of a delayed nap onset on the recognition of neutral pictorial stimuli with an added spatial component.

Methodology/Principal Findings

Participants completed an initial study session involving 150 neutral pictures of people, places, and objects. Immediately following the picture presentation, participants were asked to make recognition judgments on a subset of “old”, previously seen, pictures versus intermixed “new” pictures. Participants were then divided into one of four groups who either took a 90-minute nap immediately, 2 hours, or 4 hours after learning, or remained awake for the duration of the experiment. Six hours after initial learning, participants were again tested on the remaining “old” pictures, with “new” pictures intermixed.

Conclusions/Significance

Interestingly, we found a stabilizing benefit of sleep on the memory trace reflected as a significant negative correlation between the average time elapsed before napping and decline in performance from test to retest (p = .001). We found a significant interaction between the groups and their performance from test to retest (p = .010), with the 4-hour delay group performing significantly better than both those who slept immediately and those who remained awake (p = .044, p = .010, respectively). Analysis of sleep data revealed a significant positive correlation between amount of slow wave sleep (SWS) achieved and length of the delay before sleep onset (p = .048). The findings add to the understanding of memory processing in humans, suggesting that factors such as waking processing and homeostatic increases in need for sleep over time modulate the importance of sleep to consolidation of neutral declarative memories.

11.
Laboratory-based CD4 monitoring of HIV patients presents challenges in resource-limited settings (RLS), including frequent machine breakdown, poor engineering support, and limited cold chain and specimen transport logistics. This study assessed the performance of two CD4 tests designed for use in RLS: the Dynal assay and the Alere PIMA test (PIMA). Accuracy of Dynal and PIMA using venous blood was assessed in a centralised laboratory by comparison to BD FACSCount (BD FACS). Dynal had a mean bias of −50.35 cells/µl (r² = 0.973, p<0.0001, n = 101) and PIMA −22.43 cells/µl (r² = 0.964, p<0.0001, n = 139) compared to BD FACS. Similar results were observed for PIMA operated by clinicians in one urban (n = 117) and two rural clinics (n = 98). Using internal control beads, PIMA precision was 10.34% CV (low bead mean 214.24 cells/µl) and 8.29% (high bead mean 920.73 cells/µl), and similar %CV results were observed with external quality assurance (EQA) and replicate patient samples. Dynal did not perform using EQA, and no internal controls are supplied by the manufacturer; however, duplicate testing of samples resulted in r² = 0.961, p<0.0001, mean bias = −1.44 cells/µl. Using the cut-off of 350 cells/µl compared to BD FACS, PIMA had a sensitivity of 88.85% and specificity of 98.71%, and Dynal 88.61% and 100%. With PIMA, 0.44% (2/452) of patient samples were misclassified as “no treat” and 7.30% (33/452) as “treat”, whereas with Dynal 8.91% (9/101) were misclassified as “treat” and 0% as “no treat”. In our setting PIMA was found to be accurate, precise and user-friendly in both laboratory and clinic settings. Dynal performed well in the initial centralised laboratory evaluation, but lacks the requisite quality control measures and was technically more difficult to use, making it less suitable for use at lower-tiered laboratories.
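The accuracy and precision metrics used above, namely mean bias against the reference counter, sensitivity and specificity at the 350 cells/µl treatment threshold, and the %CV of replicates, are straightforward to compute; the paired counts in the sketch below are invented.

    import numpy as np

    reference = np.array([180, 320, 410, 150, 600, 275, 920, 340])  # reference counter (cells/ul)
    poc_test = np.array([165, 300, 405, 160, 570, 250, 895, 310])   # point-of-care device (cells/ul)

    # Accuracy: mean bias of the point-of-care device versus the reference.
    print("mean bias:", (poc_test - reference).mean())

    # Classification at the treatment-eligibility cut-off ("treat" if < 350 cells/ul).
    ref_treat = reference < 350
    poc_treat = poc_test < 350
    sensitivity = (poc_treat & ref_treat).sum() / ref_treat.sum()
    specificity = (~poc_treat & ~ref_treat).sum() / (~ref_treat).sum()
    print("sensitivity:", sensitivity, "specificity:", specificity)

    # Precision: coefficient of variation (%CV) of replicate measurements of one sample.
    replicates = np.array([214, 230, 205, 222, 218])
    print("%CV:", 100 * replicates.std(ddof=1) / replicates.mean())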

12.
Some studies have reported that the angiotensin converting enzyme (ACE) and angiotensinogen (AGT) genes are associated with hypertrophic cardiomyopathy (HCM). However, results have been inconsistent among studies. To clarify the influence of ACE and AGT on HCM, a systematic review and meta-analysis of case-control studies was performed. The following databases were searched to identify related studies: the PubMed database, the Embase database, the Cochrane Central Register of Controlled Trials database, the China National Knowledge Information database, and the Chinese Scientific and Technological Journal database. Search terms included “hypertrophic cardiomyopathy”, “angiotensin converting enzyme” (ACE) or “ACE” and “polymorphism or mutation”. For the association of the AGT M235T polymorphism with HCM, “angiotensin converting enzyme” or “ACE” was replaced with “angiotensinogen”. A total of seventeen studies were included in our review. For the association of the ACE I/D polymorphism with HCM, eleven studies were included in the meta-analysis of the association between penetrance and genotype. Similarly, six case-control studies were included in the meta-analysis for AGT M235T. For the ACE I/D polymorphism, the comparison of the DI/II genotype vs the DD genotype was performed in the present meta-analysis. The OR was 0.73 (95% CI: 0.527, 0.998, P = 0.049, power = 94%, alpha = 0.05) after the study which deviated from Hardy-Weinberg Equilibrium was excluded, indicating that the ACE I/D gene polymorphism might be associated with HCM. The AGT M235T polymorphism did not significantly affect the risk of HCM. In addition, the ACE I/D gene polymorphism did not significantly influence interventricular septal thickness in HCM patients. In conclusion, the ACE I/D polymorphism might be associated with the risk of HCM.
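For any single case-control study contributing to such a meta-analysis, the genotype odds ratio and its Woolf 95% confidence interval come directly from the 2x2 table of genotype counts; the counts in the sketch below are hypothetical.

    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """OR with Woolf 95% CI for a 2x2 table:
        a = exposed cases, b = unexposed cases, c = exposed controls, d = unexposed controls."""
        odds_ratio = (a * d) / (b * c)
        se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        lower = math.exp(math.log(odds_ratio) - z * se_log_or)
        upper = math.exp(math.log(odds_ratio) + z * se_log_or)
        return odds_ratio, lower, upper

    # Hypothetical counts of DI/II versus DD genotypes among HCM cases and controls.
    print(odds_ratio_ci(a=120, b=60, c=210, d=90))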

13.

Background

Although many case reports have described patients with proton pump inhibitor (PPI)-induced hypomagnesemia, the impact of PPI use on hypomagnesemia has not been fully clarified through comparative studies. We aimed to evaluate the association between the use of PPI and the risk of developing hypomagnesemia by conducting a systematic review with meta-analysis.

Methods

We conducted a systematic search of MEDLINE, EMBASE, and the Cochrane Library using the primary keywords “proton pump,” “dexlansoprazole,” “esomeprazole,” “ilaprazole,” “lansoprazole,” “omeprazole,” “pantoprazole,” “rabeprazole,” “hypomagnesemia,” “hypomagnesaemia,” and “magnesium.” Studies were included if they evaluated the association between PPI use and hypomagnesemia and reported relative risks or odds ratios or provided data for their estimation. Pooled odds ratios with 95% confidence intervals were calculated using the random effects model. Statistical heterogeneity was assessed with Cochran’s Q test and the I² statistic.
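A compact sketch of the random-effects pooling and heterogeneity statistics named here (a DerSimonian-Laird estimator with Cochran's Q and I²), applied to invented per-study odds ratios, follows; it is an illustration of the method, not the authors' code.

    import numpy as np

    def random_effects_pool(or_values, ci_low, ci_high):
        """DerSimonian-Laird pooled OR with Cochran's Q and I^2, from per-study ORs and 95% CIs."""
        y = np.log(or_values)                                  # per-study log odds ratios
        se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)   # SE recovered from the 95% CI
        w = 1 / se**2                                          # fixed-effect (inverse-variance) weights
        q = np.sum(w * (y - np.sum(w * y) / np.sum(w))**2)     # Cochran's Q
        df = len(y) - 1
        tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
        w_star = 1 / (se**2 + tau2)                            # random-effects weights
        pooled = np.sum(w_star * y) / np.sum(w_star)
        se_pooled = np.sqrt(1 / np.sum(w_star))
        i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
        return (np.exp(pooled),
                np.exp(pooled - 1.96 * se_pooled),
                np.exp(pooled + 1.96 * se_pooled),
                q, i2)

    # Hypothetical study-level estimates (OR, lower 95% CI, upper 95% CI).
    ors = np.array([1.5, 2.1, 1.2, 3.0])
    low = np.array([1.0, 1.3, 0.8, 1.6])
    high = np.array([2.3, 3.4, 1.8, 5.6])
    print(random_effects_pool(ors, low, high))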

Results

Nine studies including 115,455 patients were analyzed. The median Newcastle-Ottawa quality score for the included studies was seven (range, 6–9). Among patients taking PPIs, the median proportion of patients with hypomagnesemia was 27.1% (range, 11.3–55.2%) across all included studies. Among patients not taking PPIs, the median proportion of patients with hypomagnesemia was 18.4% (range, 4.3–52.7%). On meta-analysis, pooled odds ratio for PPI use was found to be 1.775 (95% confidence interval 1.077–2.924). Significant heterogeneity was identified using Cochran’s Q test (df = 7, P<0.001, I² = 98.0%).

Conclusions

PPI use may increase the risk of hypomagnesemia. However, significant heterogeneity among the included studies prevented us from reaching a definitive conclusion.

14.
Aminoacyl-tRNA synthetases (ARSs) are responsible for cellular protein synthesis and have additional domains that function in a versatile manner beyond translation. Eight core ARSs (EPRS, MRS, QRS, RRS, IRS, LRS, KRS, DRS) combined with three nonenzymatic components form a complex known as the multisynthetase complex (MSC). We hypothesized that single-nucleotide polymorphisms (SNPs) of the eight core ARS coding genes might influence susceptibility to sporadic congenital heart disease (CHD). Thus, we conducted a case-control study of 984 CHD cases and 2953 non-CHD controls in the Chinese Han population to evaluate the associations of 16 potentially functional SNPs within the eight ARS coding genes with the risk of CHD. We observed significant associations with the risk of CHD for rs1061248 [G/A; odds ratio (OR) = 0.90, 95% confidence interval (CI) = 0.81–0.99; P = 3.81×10−2], rs2230301 [A/C; OR = 0.73, 95%CI = 0.60–0.90, P = 3.81×10−2], rs1061160 [G/A; OR = 1.18, 95%CI = 1.06–1.31; P = 3.53×10−3] and rs5030754 [G/A; OR = 1.39, 95%CI = 1.11–1.75; P = 4.47×10−3] of the EPRS gene. After correction for multiple comparisons, rs1061248 conferred no predisposition to CHD. Additionally, a combined analysis showed a significant dosage-response effect on CHD risk among individuals carrying different numbers of risk alleles (P trend = 5.00×10−4). Compared with individuals with “0–2” risk alleles, those carrying “3”, “4” or “5 or more” risk alleles had a 0.97-, 1.25- or 1.38-fold increased risk of CHD, respectively. These findings indicate that genetic variants of the EPRS gene may influence individual susceptibility to CHD in the Chinese Han population.

15.

Purpose

To examine whether interpersonal violence perpetration and violence toward objects are associated with body mass index (BMI), body weight perception (BWP), and repeated weight-loss dieting in female adolescents.

Methods

A cross-sectional survey using a self-report questionnaire was performed evaluating interpersonal violence perpetration, violence toward objects, the number of diets, BMI, BWP, the 12-item General Health Questionnaire (GHQ-12), victimization, substance use, and other psychosocial variables among 9,112 Japanese females aged 12–18 years. Logistic regression analysis was conducted to analyze the contribution of BMI, BWP, and weight-control behavior to the incidence of violent behavior, while controlling for potential confounding factors.

Results

The number of diets was associated with both interpersonal violence perpetration (OR = 1.18, 95% CI 1.08–1.29, p<0.001) and violence toward objects (OR = 1.34, 95% CI 1.24–1.45, p<0.001), after adjusting for age, BMI, BWP, the GHQ-12 total score, victimization, and substance use. In terms of BMI and BWP, the “overweight” BWP was associated with violence toward objects (OR = 1.29, 95% CI 1.07–1.54, p<0.05). On the other hand, the “underweight” and “slightly underweight” BMI categories were related to violence toward objects [(OR = 1.28, 95% CI 1.01–1.62, p<0.05) and (OR = 1.27, 95% CI 1.07–1.51, p<0.05), respectively]. The “underweight” BWP was related to interpersonal violence perpetration (OR = 2.30, 95% CI 1.38–3.84, p<0.05).

Conclusions

The cumulative number of diets is associated with violent behavior in female adolescents. In addition, underweight BMI and extreme BWP are associated with violent behavior.

16.
The most common lethal accidents in General Aviation are caused by improperly executed landing approaches in which a pilot descends below the minimum safe altitude without proper visual references. To understand how expertise might reduce such erroneous decision-making, we examined relevant neural processes in pilots performing a simulated landing approach inside a functional MRI scanner. Pilots (aged 20–66) were asked to “fly” a series of simulated “cockpit view” instrument landing scenarios in an MRI scanner. The scenarios were either high risk (heavy fog: legally unsafe to land) or low risk (medium fog: legally safe to land). Pilots with one of two levels of expertise participated: Moderate Expertise (Instrument Flight Rules pilots, n = 8) or High Expertise (Certified Instrument Flight Instructors or Air-Transport Pilots, n = 12). High Expertise pilots were more accurate than Moderate Expertise pilots in making a “land” versus “do not land” decision (CFII: d′ = 3.62±2.52; IFR: d′ = 0.98±1.04; p<.01). Brain activity in the bilateral caudate nucleus was examined for main effects of expertise during a “land” versus “do not land” decision, with the no-decision control condition modeled as baseline. In making landing decisions, High Expertise pilots showed lower activation in the bilateral caudate nucleus (0.97±0.80) compared to Moderate Expertise pilots (1.91±1.16) (p<.05). These findings provide evidence for increased “neural efficiency” in High Expertise pilots relative to Moderate Expertise pilots. During an instrument approach the pilot is engaged in detailed examination of flight instruments while monitoring certain visual references for making landing decisions. The caudate nucleus, the brain area where the “expertise” effect was observed, regulates saccadic control of gaze. These data provide evidence that performing “real world” aviation tasks in an fMRI scanner provides objective data regarding the relative expertise of pilots and the brain regions involved.
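The sensitivity index d′ reported here is computed from the hit rate and the false-alarm rate via the inverse normal CDF; a short sketch with illustrative rates is below.

    from scipy.stats import norm

    def d_prime(hit_rate, false_alarm_rate):
        """Signal-detection sensitivity: d' = z(hit rate) - z(false-alarm rate).
        Rates of exactly 0 or 1 should be nudged (e.g. by 1/(2N)) before use."""
        return norm.ppf(hit_rate) - norm.ppf(false_alarm_rate)

    # e.g. a pilot who correctly rejects 95% of unsafe approaches ("do not land")
    # but also rejects 10% of safe ones:
    print(d_prime(0.95, 0.10))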

17.
The present work intends to evaluate the use of immediately loaded orthodontic screws in a growing model, and to study the specific bone response. Thirty-two screws (half of stainless steel and half of titanium) were inserted in the alveolar bone of 8 growing pigs. The devices were immediately loaded with a 100 g orthodontic force. Two loading periods were assessed: 4 and 12 weeks. Both systems of screws were clinically assessed. Histological observations and histomorphometric analysis evaluated the percentage of “bone-to-implant contact” and static and dynamic bone parameters in the vicinity of the devices (test zone) and in a bone area located 1.5 cm posterior to the devices (control zone). Both systems exhibited similar survival rates: 87.5% and 81.3% for stainless steel and titanium respectively (p = 0.64; 4-week period), and 62.5% and 50.0% for stainless steel and titanium respectively (p = 0.09; 12-week period). No significant differences between the devices were found regarding the percentage of “bone-to-implant contact” (p = 0.1) or the static and dynamic bone parameters. However, the 5% threshold of “bone-to-implant contact” was obtained after 4 weeks with the stainless steel devices, leading to increased survival rate values. Bone in the vicinity of the miniscrew implants showed evidence of a significant increase in bone trabecular thickness when compared to bone in the control zone (p = 0.05). In our study, it is likely that increased trabecular thickness is a way for low-density bone to respond to the stress induced by loading.

18.

Objective

To formulate a definition and describe the clinical characteristics of PD patients with a “brittle response” (BR) to medications versus a “non-brittle response” (NBR), and to characterize the use of DBS in this population.

Methods

Under a UF IRB-approved protocol, a retrospective chart review was performed of 400 consecutive PD patients presenting to the UF Center for Movement Disorders and Neurorestoration. Patient records were anonymized and de-identified prior to analysis. SPSS Statistics was used to analyze the data.

Results

Of 345 included patients, 19 (5.5%) met criteria for BR PD. The BR group comprised 58% females, compared to 29% in the NBR group (P = .008). The former had a mean age of 63.4 compared to 68.1 in the latter. BR patients had lower mean weight (63.5 vs. 79.6, P<.001), longer mean disease duration (12.6 vs. 8.9 years, P = .003), and had been on LD for more years compared to NBR patients (9.8 vs. 5.9, P = .001). UPDRS motor scores were higher (40.4 vs. 30.0, P = .001) in BR patients. No differences were observed regarding the Schwab and England scale, PDQ-39, and BDI-II. Sixty-three percent of the BR group had undergone DBS surgery compared to 18% of the NBR group (P = .001). Dyskinesias were more common, more severe, and more often painful (P<.001) in the BR group. There was an overall positive benefit from DBS.

Conclusion

BR PD occurred more commonly in female patients with a low body weight. Patients with longer disease duration and longer duration of LD therapy were also at risk. The BR group responded well to DBS.

19.

Objectives

To estimate the relationship between exposure to extremely low-frequency electromagnetic fields (ELF-EMF) and the risk of amyotrophic lateral sclerosis (ALS) by a meta-analysis.

Methods

By searching the PubMed database (supplemented by manual searching) up to April 2012 using the following keywords: “occupational exposure”, “electromagnetic fields” and “amyotrophic lateral sclerosis” or “motor neuron disease”, seventeen studies were identified as eligible for this meta-analysis. The associations between ELF-EMF exposure and ALS risk were estimated according to study design (case-control or cohort study) and ELF-EMF exposure level assessment (job title or job-exposure matrix). Heterogeneity across the studies was tested, as was publication bias.

Results

Occupational exposure to ELF-EMF was significantly associated with increased risk of ALS in pooled studies (RR = 1.29, 95%CI = 1.02–1.62), and case-control studies (OR = 1.39, 95%CI = 1.05–1.84), but not cohort studies (RR = 1.16, 95% CI = 0.80–1.69). In sub-analyses, similar significant associations were found when the exposure level was defined by the job title, but not the job-exposure matrix. In addition, significant associations between occupational exposure to ELF-EMF and increased risk of ALS were found in studies of subjects who were clinically diagnosed but not those based on the death certificate. Moderate heterogeneity was observed in all analyses.

Conclusions

Our data suggest a slight but significant ALS risk increase among those with job titles related to relatively high levels of ELF-EMF exposure. Since the magnitude of the estimated RR was relatively small, we cannot rule out the possibility of potential biases at work. Electrical shocks or other unidentified variables associated with electrical occupations, rather than magnetic-field exposure, may be responsible for the observed associations with ALS.

20.

Purpose

To investigate the current status of diabetic self-management behavior and the factors influencing this behavior in Chengdu, a typical city in western China.

Methods

We performed stratified sampling in 6 urban districts of Chengdu. We used questionnaires concerning self-management knowledge, self-management beliefs, self-management efficacy, social support, and self-management behavior to investigate patients with T2DM from August to November 2011. All of the data were analyzed using the SPSS 17.0 statistical package.

Results

We enrolled a total of 364 patients in the present study. The median score of self-management behavior was 111.00, the interquartile range was 100.00–119.00, and the index score was 77.77. Self-management was described as “good” in 46%, “fair” in 45%, and “poor” in 6% of patients. A multiple-factor analysis identified age (OR, 0.43; 95% CI, 0.20–0.91; P = 0.026), education in “foot care” (OR, 0.42; 95% CI, 0.18–0.99; P = 0.048), self-management knowledge (OR, 0.86; 95% CI, 0.80–0.92; P<0.001), self-management belief (OR, 0.92; 95% CI, 0.87–0.97; P = 0.002), self-efficacy (OR, 0.93; 95% CI, 0.90–0.96; P<0.001), and social support (OR, 0.62; 95% CI, 0.41–0.94; P = 0.023) as positive factors. Negative factors included diabetes duration (5–9 years: OR, 14.82; 95% CI, 1.64–133.73; P = 0.016; and ≥10 years: OR, 10.28; 95% CI, 1.06–99.79; P = 0.045) and hospitalization experience (OR, 2.96; 95% CI, 1.64–5.36; P<0.001).

Conclusion

We observed good self-management behavior in patients with T2DM in Chengdu. When self-management education is provided, age, education, knowledge, belief, self-efficacy, and social support should be considered to offer more appropriate intervention and to improve patients' behavior.

